US20100228804A1 - Constructing image captchas utilizing private information of the images - Google Patents


Info

Publication number
US20100228804A1
Authority
US
United States
Prior art keywords
images
image
captcha
recited
public information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US12/397,561
Inventor
Anirban Dasgupta
Shanmugasundaram Ravikumar
Kunal Punera
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Yahoo Inc
Original Assignee
Yahoo Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Yahoo! Inc.
Priority to US12/397,561
Assigned to YAHOO! INC. Assignors: DASGUPTA, ANIRBAN; PUNERA, KUNAL; RAVIKUMAR, SHANMUGASUNDARAM
Publication of US20100228804A1
Priority to US13/487,504 (US9037595B2)
Assigned to YAHOO HOLDINGS, INC. Assignor: YAHOO! INC.
Assigned to OATH INC. Assignor: YAHOO HOLDINGS, INC.


Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 - Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/90 - Details of database functions independent of the retrieved data types
    • G06F16/95 - Retrieval from the web
    • G06F16/951 - Indexing; Web crawling techniques
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00 - Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/30 - Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F21/31 - User authentication
    • G06F21/36 - User authentication by graphic or iconic representation
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F2221/00 - Indexing scheme relating to security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F2221/21 - Indexing scheme relating to G06F21/00 and subgroups addressing additional information or applications relating to security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F2221/2103 - Challenge-response

Definitions

  • the present disclosure generally relates to constructing image CAPTCHAs and more specifically relates to constructing image CAPTCHAs utilizing private information of the candidate images that is only known to the entity responsible for constructing the CAPTCHAs.
  • a CAPTCHA is a type of challenge-response test used to determine whether the response is generated by a machine, e.g., a computer. The test is based on the assumption that a human's ability in pattern recognition is far superior to a machine's, at least for the present.
  • a CAPTCHA test involves presenting one or more images to a testee, i.e., the person being tested, together with a challenge, i.e., a question.
  • the challenge is related to the one or more images presented to the testee and generally requires the testee to recognize some form of pattern in the image(s).
  • the testee needs to provide a correct response to the challenge in order to pass the test.
  • FIGS. 1A and 1B illustrate two sample CAPTCHA tests 110 and 120 .
  • the CAPTCHA test 110 includes an image 111 of a distorted text string. Note that texts may be considered a special form of image.
  • the challenge 112 asks the testee to recognize the distorted text string and enter it in the response field 113 . In order to pass the test, the testee must enter the correct text string shown in the image 111 .
  • the CAPTCHA test 120 includes an image 121 of an animal.
  • the challenge 122 asks the testee to recognize the animal and describe it in the response field 123 . In order to pass the test, the testee must correctly identify the animal shown in the image 121 .
  • CAPTCHAs are often used to prevent automated computer software from performing actions that degrade the quality of service of a given system.
  • when constructing CAPTCHA tests, several points often need to be considered.
  • the challenges should be constructed such that current computer software is unable to determine the responses accurately while most humans can.
  • due to the great amount of information publicly available and easily accessible, e.g., via the Internet, it is possible that some of the publicly available information may be used by computer software to help solve CAPTCHA challenges.
  • the present disclosure generally relates to constructing image CAPTCHAs and more specifically relates to constructing image CAPTCHAs utilizing private information of the candidate images that is only known to the entity responsible for constructing the CAPTCHAs.
  • a set of candidate images is obtained.
  • Each candidate image has public information and private information.
  • the candidate images may be obtained from various sources, including but not limited to images publicly available on the Internet.
  • One or more images are selected from the set of candidate images based on each image's public information and private information for the construction of an image CAPTCHA.
  • the private information of an image is only accessible by an entity responsible for constructing the CAPTCHA, i.e., the entity that causes the CAPTCHA to be constructed.
  • the image CAPTCHA includes the one or more selected images, a challenge, and a correct response, such that it is difficult, even nearly impossible, for a computer to automatically determine the correct response of the CAPTCHA using only the public information of each of the selected image(s).
  • the selection of the image(s) may be further optimized based on the specific type of the CAPTCHA to be constructed.
  • FIGS. 1A and 1B illustrate two sample CAPTCHA tests.
  • FIG. 2 illustrates a method of constructing an image CAPTCHA according to particular embodiments of the present disclosure.
  • FIG. 3 illustrates an image-description type CAPTCHA.
  • FIG. 4 illustrates an image-similarity type CAPTCHA.
  • FIG. 5 illustrates a general computer system suitable for implementing embodiments of the present disclosure.
  • an image CAPTCHA having one or more images, a challenge, and a correct response is constructed by selecting the one or more images from a set of candidate images based on each image's public information and private information.
  • An image's private information is accessible only to an entity responsible for constructing the CAPTCHA.
  • the candidate images may be obtained from a variety of sources, including but not limited to publicly accessible images on the Internet.
  • the automated computer systems may use such information to help guess the correct responses to the CAPTCHAs constructed using these images.
  • the CAPTCHA challenges may be crafted such that it is very difficult, even nearly impossible, for automated computer systems to guess the correct CAPTCHA responses using only the public information of each of the CAPTCHA images.
  • FIG. 2 illustrates a method of constructing an image CAPTCHA according to particular embodiments of the present disclosure.
  • a set of candidate images is obtained (step 210 ).
  • Each of the candidate images has public information and private information.
  • Each candidate image's public information may be accessible by any entity, e.g., any person, any computer software, etc.
  • each candidate image's private information is accessible only to a specific entity responsible for causing the CAPTCHA to be constructed. Consequently, each candidate image's private information is accessible by computer software, e.g., the computer software that constructs the image CAPTCHA, associated with the entity responsible for causing the CAPTCHA to be constructed.
  • the candidate images may be obtained from a variety of sources, including but not limited to images publicly available on the Internet, images from private collections, and any image that is accessible by the entity responsible for causing the CAPTCHA to be constructed.
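The public/private split described above could be represented by a simple record type. This is a hypothetical sketch; the patent does not prescribe any data structure, and all field names here are illustrative:

```python
from dataclasses import dataclass, field

@dataclass
class CandidateImage:
    """One candidate image and its two information channels (illustrative)."""
    url: str
    # Tags, captions, alt text, etc. that any entity can crawl from the web.
    public_info: set = field(default_factory=set)
    # Evidence held only by the CAPTCHA-constructing entity, e.g. query logs.
    private_info: set = field(default_factory=set)
```

Keeping the two channels as separate fields makes the later selection criteria (compare, contrast, or measure overlap between the two sets) straightforward to express.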
  • the entity responsible for causing the CAPTCHA to be constructed is associated with an Internet search engine.
  • the content of the search queries is only known to the search engine and consequently to the entity associated with the search engine.
  • These search queries may be collected and processed to obtain private information of candidate images obtained from the Internet.
  • each of the result images is an image of a white orchid.
  • each of the result images may have private information indicating that the image is related to white orchids.
  • the assumption may not be completely accurate for all of the result images.
  • the person selects a particular one of the result images for further viewing by clicking on a link associated with that result image. Since it is known that the person is searching for images of white orchids, the fact that the person has selected a particular one of the result images may further indicate that at least this particular image is most likely an image of a white orchid. Thus, the particular image selected by the person may have private information indicating that the image is related to white orchids.
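The click-log inference in this example could be sketched as follows. This is purely an illustration under assumed inputs (a log of query/click pairs and a minimum-click threshold); the patent does not specify an implementation:

```python
from collections import defaultdict

def mine_private_labels(click_log, min_clicks=2):
    """Derive private labels for images from search-click records.

    click_log: iterable of (query, clicked_image_url) pairs, visible only
    to the search-engine operator. A click on a result returned for query
    q is weak evidence that the clicked image depicts q.
    """
    votes = defaultdict(lambda: defaultdict(int))
    for query, url in click_log:
        votes[url][query.lower().strip()] += 1
    labels = {}
    for url, counts in votes.items():
        # Keep the most-clicked query per image as its private label,
        # but only if it has enough independent supporting clicks.
        query, n = max(counts.items(), key=lambda kv: kv[1])
        if n >= min_clicks:
            labels[url] = query
    return labels
```

The threshold reflects the caveat above: a single click is not reliable evidence, so images with too little supporting activity receive no private label.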
  • the entity responsible for causing the CAPTCHA to be constructed is associated with a server, such as a web application server.
  • Internet activities conducted on the server are only known to the server and consequently to the entity associated with the server. These activities may be monitored and information of the activities may be collected and stored, such as in a database or in one or more log files. The information may be processed to obtain private information of candidate images obtained from the Internet.
  • the person may search for and view web pages relating to the Golden Gate Bridge. If a web page viewed by the person contains an image, it is likely that the image is an image of the Golden Gate Bridge or at least relates to the Golden Gate Bridge. Thus, the image contained in the web page may have private information indicating that the image is related to the Golden Gate Bridge.
  • Internet search and activity logs accessible only to the entity associated with the search engine or server are one source for obtaining private information of the candidate images obtained on the Internet.
  • Private information of the candidate images may be obtained from any source that is accessible only to the entity responsible for causing the CAPTCHA to be constructed. For example, if the candidate images come from a private collection, these candidate images may have information inaccessible to the general public, which may be used to obtain private information for the candidate images.
  • One or more images are selected from the set of candidate images based on each image's private information and public information for the construction of an image CAPTCHA (step 220 ).
  • the image(s) is/are selected such that when they are used to construct the image CAPTCHA that includes the selected image(s), a challenge, and a correct response, it is difficult, even nearly impossible, for a computer to automatically determine the correct response of the image CAPTCHA using only the public information of each of the selected image(s) (step 230 ).
  • the process for selecting images for a CAPTCHA starts with a set of candidate images from which to select the CAPTCHA images. For each of these candidate images, as much publicly and privately available information as possible is obtained. The pertinent information is then given as input to a CAPTCHA type-specific procedure, which returns a judgment on whether the candidate image is suitable for use in the CAPTCHA under consideration. For some types of CAPTCHAs, the selection procedure may make a decision on several candidate images at a time.
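The type-specific selection loop just described can be sketched generically. The function and parameter names below are illustrative, and the type-specific judgment is deliberately left as a pluggable callable:

```python
def select_captcha_images(candidates, get_public, get_private, is_suitable):
    """Select CAPTCHA images from candidates (a sketch of the loop above).

    get_public / get_private: callables returning the information gathered
    for an image from public and private sources, respectively.
    is_suitable: CAPTCHA type-specific procedure judging, from that
    information, whether the image may be used in the CAPTCHA at hand.
    """
    selected = []
    for image in candidates:
        public = get_public(image)
        private = get_private(image)
        if is_suitable(image, public, private):
            selected.append(image)
    return selected
```

A batch variant that judges several candidates at once (as the text notes some CAPTCHA types require) would pass the whole candidate list to the judgment procedure instead of one image at a time.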
  • There are different types of CAPTCHA challenges.
  • the two sample CAPTCHA tests illustrated in FIGS. 1A and 1B request the testee to recognize and describe the subject matter of the image presented.
  • Other types of CAPTCHA include requesting the testee to select, from multiple images, the one that best matches or is most similar to a target image, or to find the boundary or geometric center of an image that is positioned among multiple connected and optionally distorted images.
  • the selection of the image(s) used for the CAPTCHA may be further refined based on the type of the CAPTCHA to be constructed.
  • FIG. 3 illustrates an image-description type CAPTCHA 300 .
  • An image 310 is presented to the testee.
  • the challenge is for the testee to describe the subject matter of the image 310 .
  • the challenge may be presented in a variety of ways. For example, in one option, the testee may be asked to provide a word or a phrase 320 that describes the object in the image 310 . In an alternative option, the testee may be asked to select one of the multiple words provided 330 that describes the object in the image 310 . For this type of CAPTCHA, it is preferable that the image selected has little or no public information and/or incorrect public information.
  • the computer program may use publicly available information associated with the presented image to help determine the subject matter of the presented image.
  • the public information may include tags, descriptions, and other publicly available data associated with the image.
  • the less public information is available for the presented image, the more difficult it is for the computer program to guess the correct subject matter of the presented image.
  • if the presented image has incorrect public information, it may mislead the computer program into guessing the wrong response. Whether an image's public information is correct or incorrect may be determined by comparing the image's public information against the image's private information. If the two sets of information mostly disagree, it is likely that the image's public information is largely incorrect.
  • for an image-description type CAPTCHA, it is desirable to select an image whose public information is orthogonal to the correct response of the CAPTCHA.
  • the selection process may be further refined by analyzing the amount and/or the correctness of the public information of each candidate image and selecting an image that has relatively little or no public information and/or mostly incorrect public information.
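For the image-description case, that refinement might be scored as follows. This is a hedged sketch: the agreement measure and the representation of information as tag sets are assumptions, not details from the patent:

```python
def public_private_agreement(public_tags, private_tags):
    """Fraction of an image's public tags confirmed by its private tags."""
    public_tags, private_tags = set(public_tags), set(private_tags)
    if not public_tags:
        return 0.0
    return len(public_tags & private_tags) / len(public_tags)

def pick_description_image(candidates):
    """candidates: list of (image, public_tags, private_tags) triples.

    Prefer images with little public information and, among those, images
    whose public information mostly disagrees with the private information,
    since either property frustrates an attacker using public data alone.
    """
    def score(c):
        _, public, private = c
        return (len(set(public)), public_private_agreement(public, private))
    return min(candidates, key=score)[0]
```

The private tags would still supply the correct response to grade the testee against; only the attacker is limited to the public channel.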
  • FIG. 4 illustrates an image-similarity type CAPTCHA 400 .
  • a target image 410 and a set of additional choice images 421 , 422 , 423 , 424 are presented to the testee.
  • the challenge is for the testee to select from the additional choice images 421 , 422 , 423 , 424 one image that matches or is similar to the target image 410 .
  • the testee is asked to select one of the additional choice images 421 , 422 , 423 , 424 that shows the same person as that in the target image 410 .
  • the computer program may use public information associated with the presented images to help determine which of the additional choice images shows the same subject matter as that in the target image. If the correct response image and the target image have little or no similar public information, then it may be difficult for the computer program to guess the correct response image using the public information of each of the presented images.
  • the image 422 shows the same person as that in the target image 410 . If the image 422 and the target image 410 share little or no similar public information, then it may be difficult for the computer program to guess that the image 422 is the correct response image.
  • if the incorrect response images, i.e., the images 421 , 423 , 424 , share similar public information with the target image 410 , the computer program may be misled into guessing the wrong response image.
  • for an image-similarity type CAPTCHA, it is desirable to select a set of images where the target image and the correct response image share little or no similar public information, i.e., the difference between the public information of the target image and the public information of the correct response image is relatively large. Generally, the less similar public information is shared between the target image and the correct response image, the more difficult it is for the computer program to guess the correct response image using each image's public information.
  • the selection process may be further refined by analyzing the degrees of similarity between the public information of the selected target image and the public information of each of the selected additional choice images and selecting a target image and a correct response image that share little or no similar public information.
  • those additional choice images other than the correct response image may be selected based on their sharing similar public information with the target image.
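The two similarity criteria above (answer dissimilar to the target, distractors similar to it, judged on public information only) could be combined as below. Jaccard similarity over tag sets is an assumed measure; the patent does not name one:

```python
def jaccard(a, b):
    """Jaccard similarity of two tag sets (0.0 when both are empty)."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if (a or b) else 0.0

def assemble_similarity_captcha(target, answer_pool, distractor_pool,
                                public, n_distractors=3):
    """Pick images for an image-similarity CAPTCHA (illustrative sketch).

    public: mapping image -> set of public tags (an assumed representation).
    The correct answer shares the LEAST public information with the target;
    the distractors share the MOST, to mislead attackers using public data.
    """
    sim = lambda img: jaccard(public[target], public[img])
    answer = min(answer_pool, key=sim)
    distractors = sorted(distractor_pool, key=sim, reverse=True)[:n_distractors]
    return answer, distractors
```

Note the private information would still be needed upstream, to establish that every image in `answer_pool` actually depicts the same subject as the target.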
  • the method described above may be implemented as computer software using computer-readable instructions and physically stored on a computer-readable medium.
  • a “computer-readable medium” as used herein may be any medium that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device.
  • the computer-readable medium may be, by way of example only and not by limitation, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, device, propagation medium, or computer memory.
  • the computer software may be encoded using any suitable computer languages, including future programming languages. Different programming techniques can be employed, such as, for example, procedural or object oriented.
  • the software instructions may be executed on various types of computers, including single or multiple processor devices.
  • Embodiments of the present disclosure may be implemented by using a programmed general-purpose digital computer; alternatively, application-specific integrated circuits, programmable logic devices, field-programmable gate arrays, or optical, chemical, biological, quantum, or nano-engineered systems, components, and mechanisms may be used.
  • the functions of the present disclosure can be achieved by any means as is known in the art.
  • Distributed, or networked systems, components and circuits can be used.
  • Communication, or transfer, of data may be wired, wireless, or by any other means.
  • FIG. 5 illustrates a computer system 500 suitable for implementing embodiments of the present disclosure.
  • the components shown in FIG. 5 for computer system 500 are exemplary in nature and are not intended to suggest any limitation as to the scope of use or functionality of the computer software implementing embodiments of the present disclosure. Neither should the configuration of components be interpreted as having any dependency or requirement relating to any one or combination of components illustrated in the exemplary embodiment of a computer system.
  • Computer system 500 may have many physical forms including an integrated circuit, a printed circuit board, a small handheld device (such as a mobile telephone or PDA), a personal computer or a super computer.
  • Computer system 500 includes a display 532 , one or more input devices 533 (e.g., keypad, keyboard, mouse, stylus, etc.), one or more output devices 534 (e.g., speaker), one or more storage devices 535 , and various types of storage media 536 .
  • the system bus 540 links a wide variety of subsystems.
  • a “bus” refers to a plurality of digital signal lines serving a common function.
  • the system bus 540 may be any of several types of bus structures including a memory bus, a peripheral bus, and a local bus using any of a variety of bus architectures.
  • bus architectures include the Industry Standard Architecture (ISA) bus, Enhanced ISA (EISA) bus, the Micro Channel Architecture (MCA) bus, the Video Electronics Standards Association local (VLB) bus, the Peripheral Component Interconnect (PCI) bus, the PCI-Express bus (PCI-X), and the Accelerated Graphics Port (AGP) bus.
  • Processor(s) 501 , also referred to as central processing units or CPUs, optionally contain a cache memory unit 502 for temporary local storage of instructions, data, or computer addresses.
  • Processor(s) 501 are coupled to storage devices including memory 503 .
  • Memory 503 includes random access memory (RAM) 504 and read-only memory (ROM) 505 .
  • ROM 505 acts to transfer data and instructions uni-directionally to the processor(s) 501 .
  • RAM 504 is typically used to transfer data and instructions in a bi-directional manner. Both of these types of memory may include any suitable computer-readable media described below.
  • a fixed storage 508 is also coupled bi-directionally to the processor(s) 501 , optionally via a storage control unit 507 . It provides additional data storage capacity and may also include any of the computer-readable media described below. Storage 508 may be used to store operating system 509 , EXECs 510 , application programs 512 , data 511 and the like and is typically a secondary storage medium (such as a hard disk) that is slower than primary storage. It should be appreciated that the information retained within storage 508 , may, in appropriate cases, be incorporated in standard fashion as virtual memory in memory 503 .
  • Processor(s) 501 is also coupled to a variety of interfaces such as graphics control 521 , video interface 522 , input interface 523 , output interface, storage interface 525 , and these interfaces in turn are coupled to the appropriate devices.
  • an input/output device may be any of: video displays, track balls, mice, keyboards, microphones, touch-sensitive displays, transducer card readers, magnetic or paper tape readers, tablets, styluses, voice or handwriting recognizers, biometrics readers, or other computers.
  • Processor(s) 501 may be coupled to another computer or telecommunications network 530 using network interface 520 .
  • the CPU 501 might receive information from the network 530 , or might output information to the network in the course of performing the above-described method steps.
  • method embodiments of the present disclosure may execute solely upon CPU 501 or may execute over a network 530 such as the Internet in conjunction with a remote CPU 501 that shares a portion of the processing.
  • computer system 500 when in a network environment, i.e., when computer system 500 is connected to network 530 , computer system 500 may communicate with other devices that are also connected to network 530 . Communications may be sent to and from computer system 500 via network interface 520 . For example, incoming communications, such as a request or a response from another device, in the form of one or more packets, may be received from network 530 at network interface 520 and stored in selected sections in memory 503 for processing. Outgoing communications, such as a request or a response to another device, again in the form of one or more packets, may also be stored in selected sections in memory 503 and sent out to network 530 at network interface 520 . Processor(s) 501 may access these communication packets stored in memory 503 for processing.
  • embodiments of the present disclosure further relate to computer storage products with a computer-readable medium that have computer code thereon for performing various computer-implemented operations.
  • the media and computer code may be those specially designed and constructed for the purposes of the present disclosure, or they may be of the kind well known and available to those having skill in the computer software arts.
  • Examples of computer-readable media include, but are not limited to: magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD-ROMs and holographic devices; magneto-optical media such as floptical disks; and hardware devices that are specially configured to store and execute program code, such as application-specific integrated circuits (ASICs), programmable logic devices (PLDs) and ROM and RAM devices.
  • Examples of computer code include machine code, such as produced by a compiler, and files containing higher-level code that are executed by a computer using an interpreter.
  • the computer system having architecture 500 may provide functionality as a result of processor(s) 501 executing software embodied in one or more tangible, computer-readable media, such as memory 503 .
  • the software implementing various embodiments of the present disclosure may be stored in memory 503 and executed by processor(s) 501 .
  • a computer-readable medium may include one or more memory devices, according to particular needs.
  • Memory 503 may read the software from one or more other computer-readable media, such as mass storage device(s) 535 or from one or more other sources via communication interface.
  • the software may cause processor(s) 501 to execute particular processes or particular steps of particular processes described herein, including defining data structures stored in memory 503 and modifying such data structures according to the processes defined by the software.
  • the computer system may provide functionality as a result of logic hardwired or otherwise embodied in a circuit, which may operate in place of or together with software to execute particular processes or particular steps of particular processes described herein.
  • Reference to software may encompass logic, and vice versa, where appropriate.
  • Reference to a computer-readable media may encompass a circuit (such as an integrated circuit (IC)) storing software for execution, a circuit embodying logic for execution, or both, where appropriate.
  • a “processor,” “process,” or “act” includes any human, hardware and/or software system, mechanism or component that processes data, signals or other information.
  • a processor can include a system with a general-purpose central processing unit, multiple processing units, dedicated circuitry for achieving functionality, or other systems. Processing need not be limited to a geographic location, or have temporal limitations. For example, a processor can perform its functions in “real time,” “offline,” in a “batch mode,” etc. Portions of processing can be performed at different times and at different locations, by different (or the same) processing systems.
  • the elements illustrated in FIGS. 1 through 5 can also be implemented in a more separated or integrated manner, or even removed or rendered inoperable in certain cases, as is useful in accordance with a particular application.

Abstract

An image CAPTCHA having one or more images, a challenge, and a correct answer to the challenge is constructed by selecting the one or more images from a plurality of candidate images based at least in part on each image's public information and private information. The private information of each of the images is accessible only to an entity responsible for constructing the CAPTCHA. Optionally, the one or more images are selected further based on the specific type of the CAPTCHA to be constructed.

Description

    TECHNICAL FIELD
  • The present disclosure generally relates to constructing image CAPTCHAs and more specifically relates to constructing image CAPTCHAs utilizing private information of the candidate images that is only known to the entity responsible for constructing the CAPTCHAs.
  • BACKGROUND
  • A CAPTCHA, or Captcha, is a type of challenge-response test used to determine whether the response is generated by a machine, e.g., a computer. The test is based on the assumption that a human's ability in pattern recognition is far superior to a machine's, at least for the present. In a typical scenario, a CAPTCHA test involves presenting one or more images to a testee, i.e., the person being tested, together with a challenge, i.e., a question. The challenge is related to the one or more images presented to the testee and generally requires the testee to recognize some form of pattern in the image(s). The testee needs to provide a correct response to the challenge in order to pass the test.
  • FIGS. 1A and 1B illustrate two sample CAPTCHA tests 110 and 120. In FIG. 1A, the CAPTCHA test 110 includes an image 111 of a distorted text string. Note that texts may be considered a special form of image. The challenge 112 asks the testee to recognize the distorted text string and enter it in the response field 113. In order to pass the test, the testee must enter the correct text string shown in the image 111. In FIG. 1B, the CAPTCHA test 120 includes an image 121 of an animal. The challenge 122 asks the testee to recognize the animal and describe it in the response field 123. In order to pass the test, the testee must correctly identify the animal shown in the image 121.
  • CAPTCHAs are often used to prevent automated computer software from performing actions that degrade the quality of service of a given system. When constructing CAPTCHA tests, several points often need to be considered. First, the challenges should be constructed such that current computer software is unable to determine the responses accurately while most humans can. Second, there need to be enough instances of CAPTCHA tests such that human CAPTCHA solvers employed by spammers are unable to enumerate them all. In addition, due to the great amount of information publicly available and easily accessible, e.g., via the Internet, it is possible that some of the publicly available information may be used by computer software to help solve CAPTCHA challenges.
  • SUMMARY
  • The present disclosure generally relates to constructing image CAPTCHAs and more specifically relates to constructing image CAPTCHAs utilizing private information of the candidate images that is only known to the entity responsible for constructing the CAPTCHAs.
  • According to various embodiments, a set of candidate images is obtained. Each candidate image has public information and private information. The candidate images may be obtained from various sources, including but not limited to images publicly available on the Internet.
  • One or more images are selected from the set of candidate images based on each image's public information and private information for the construction of an image CAPTCHA. The private information of an image is only accessible by an entity responsible for constructing the CAPTCHA, i.e., the entity that causes the CAPTCHA to be constructed. The image CAPTCHA includes the one or more selected images, a challenge, and a correct response, such that it is difficult, even nearly impossible, for a computer to automatically determine the correct response of the CAPTCHA using only the public information of each of the selected image(s). In addition, the selection of the image(s) may be further optimized based on the specific type of the CAPTCHA to be constructed.
  • These and other features, aspects, and advantages of the disclosure are described in more detail below in the detailed description and in conjunction with the following figures.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The present disclosure is illustrated by way of example, and not by way of limitation, in the figures of the accompanying drawings and in which like reference numerals refer to similar elements and in which:
  • FIGS. 1A and 1B illustrate two sample CAPTCHA tests.
  • FIG. 2 illustrates a method of constructing an image CAPTCHA according to particular embodiments of the present disclosure.
  • FIG. 3 illustrates an image-description type CAPTCHA.
  • FIG. 4 illustrates an image-similarity type CAPTCHA.
  • FIG. 5 illustrates a general computer system suitable for implementing embodiments of the present disclosure.
  • DETAILED DESCRIPTION
  • The present disclosure is now described in detail with reference to a few preferred embodiments thereof as illustrated in the accompanying drawings. In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present disclosure. It is apparent, however, to one skilled in the art, that the present disclosure may be practiced without some or all of these specific details. In other instances, well known process steps and/or structures have not been described in detail in order to not unnecessarily obscure the present disclosure. In addition, while the disclosure is described in conjunction with the particular embodiments, it should be understood that this description is not intended to limit the disclosure to the described embodiments. To the contrary, the description is intended to cover alternatives, modifications, and equivalents as may be included within the spirit and scope of the disclosure as defined by the appended claims.
  • According to various embodiments of the present disclosure, an image CAPTCHA having one or more images, a challenge, and a correct response is constructed by selecting the one or more images from a set of candidate images based on each image's public information and private information. An image's private information is accessible only to an entity responsible for constructing the CAPTCHA. The candidate images may be obtained from a variety of sources, including but not limited to publicly accessible images on the Internet.
  • Although text may be considered a special form of image, image recognition, i.e., analyzing and recognizing patterns in true images, presents a more difficult challenge to computer systems than character recognition. Thus, true image-based CAPTCHAs are more difficult for spammers utilizing automated computer programs to defeat. Since the candidate images from which the images used for constructing the CAPTCHAs are selected may include any publicly accessible image on the Internet, the base for selecting the CAPTCHA images is extremely large, in fact, too large for automated computer systems to enumerate responses. On the other hand, many images on the Internet enjoy copyright protections. When these images are used, it is often necessary to provide information, e.g., the origins, of the images. The automated computer systems may use such information to help guess the correct responses to the CAPTCHAs constructed using these images. However, by utilizing private information of the candidate images that is accessible only to the entity responsible for constructing the CAPTCHAs when selecting the CAPTCHA images, the CAPTCHA challenges may be crafted such that it is very difficult, even nearly impossible, for automated computer systems to guess the correct CAPTCHA responses using only the public information of each of the CAPTCHA images.
  • Constructing Image CAPTCHAs
  • FIG. 2 illustrates a method of constructing an image CAPTCHA according to particular embodiments of the present disclosure. A set of candidate images is obtained (step 210). Each of the candidate images has public information and private information. Each candidate image's public information may be accessible by any entity, e.g., any person, any computer software, etc. On the other hand, each candidate image's private information is accessible only to a specific entity responsible for causing the CAPTCHA to be constructed. Consequently, each candidate image's private information is accessible by computer software, e.g., the computer software that constructs the image CAPTCHA, associated with the entity responsible for causing the CAPTCHA to be constructed.
  • The candidate images may be obtained from a variety of sources, including but not limited to images publicly available on the Internet, images from private collections, and any image that is accessible by the entity responsible for causing the CAPTCHA to be constructed.
  • According to particular embodiments, the entity responsible for causing the CAPTCHA to be constructed is associated with an Internet search engine. When Internet users communicate search queries to the search engine, the content of the search queries are only known to the search engine and consequently to the entity associated with the search engine. These search queries may be collected and processed to obtain private information of candidate images obtained from the Internet.
  • For example, suppose a person wishes to locate images of white orchids on the Internet. The person communicates the search query “white orchid” to the search engine associated with the entity responsible for causing the CAPTCHA to be constructed. The search engine conducts a search on the Internet and returns a set of result images. It may be assumed that each of the result images is an image of a white orchid. Thus, each of the result images may have private information indicating that the image is related to white orchids. However, the assumption may not be completely accurate for all of the result images.
  • To further refine the accuracy of the images' private information, suppose the person selects a particular one of the result images for further viewing by clicking on a link associated with that result image. Since it is known that the person is searching for images of white orchids, the fact that the person has selected a particular one of the result images further indicates that at least this particular image is most likely an image of a white orchid. Thus, the particular image selected by the person may have private information indicating that the image is related to white orchids.
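The query-and-click refinement described above can be sketched in code. This is an illustrative sketch only: the function name `derive_private_labels`, the `(query, clicked_image_url)` log format, and the `min_clicks` threshold are all hypothetical choices, not part of the disclosed method.

```python
from collections import Counter, defaultdict

def derive_private_labels(query_log, min_clicks=2):
    """Aggregate (query, clicked_image_url) pairs from a private search
    log into per-image label sets.  A click is a stronger signal than
    mere presence in a result set, so only clicked results are counted,
    and a label must be supported by several independent clicks."""
    clicks = defaultdict(Counter)
    for query, clicked_url in query_log:
        clicks[clicked_url][query.lower()] += 1
    return {
        url: {q for q, n in counts.items() if n >= min_clicks}
        for url, counts in clicks.items()
    }

# A toy log: two searchers clicked img1 after querying "white orchid",
# so "white orchid" becomes private information for img1 only.
log = [
    ("white orchid", "img1.jpg"),
    ("White Orchid", "img1.jpg"),
    ("orchid care", "img1.jpg"),
    ("white orchid", "img2.jpg"),
]
labels = derive_private_labels(log)
```

Because the log is visible only to the search-engine operator, the resulting labels are private in the sense used by this disclosure.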
  • According to particular embodiments, the entity responsible for causing the CAPTCHA to be constructed is associated with a server, such as a web application server. Internet activities conducted on the server are only known to the server and consequently to the entity associated with the server. These activities may be monitored and information of the activities may be collected and stored, such as in a database or in one or more log files. The information may be processed to obtain private information of candidate images obtained from the Internet.
  • For example, suppose a person wishes to locate information about the Golden Gate Bridge in San Francisco Bay on the Internet. The person may search for and view web pages relating to the Golden Gate Bridge. If a web page viewed by the person contains an image, it is likely that the image is an image of the Golden Gate Bridge or at least relates to the Golden Gate Bridge. Thus, the image contained in the web page may have private information indicating that the image is related to the Golden Gate Bridge.
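The page-viewing signal above can be sketched similarly: terms the person searched for are propagated to images embedded in the pages the person subsequently viewed. The function name, the log format, and the page-to-image map are hypothetical, for illustration only.

```python
def label_images_from_page_views(view_log, page_images):
    """From a private activity log of (search_terms, viewed_page_url)
    events, propagate the searcher's terms to every image embedded in
    the viewed page.  page_images maps page URL -> list of image URLs."""
    labels = {}
    for terms, page in view_log:
        for img in page_images.get(page, []):
            labels.setdefault(img, set()).update(t.lower() for t in terms)
    return labels

# One visitor searched for the Golden Gate Bridge, then viewed page1,
# which embeds bridge.jpg; the image inherits the search terms.
views = [(["Golden", "Gate", "Bridge"], "page1.html")]
pages = {"page1.html": ["bridge.jpg"]}
labels = label_images_from_page_views(views, pages)
```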
  • Internet search and activity logs accessible only to the entity associated with the search engine or server are one source for obtaining private information of the candidate images obtained on the Internet. Private information of the candidate images may be obtained from any source that is accessible only to the entity responsible for causing the CAPTCHA to be constructed. For example, if the candidate images come from a private collection, these candidate images may have information inaccessible to the general public, which may be used to obtain private information for the candidate images.
  • One or more images are selected from the set of candidate images based on each image's private information and public information for the construction of an image CAPTCHA (step 220). The image(s) is/are selected such that when they are used to construct the image CAPTCHA that includes the selected image(s), a challenge, and a correct response, it is difficult, even nearly impossible, for a computer to automatically determine the correct response of the image CAPTCHA using only the public information of each of the selected image(s) (step 230).
  • According to various embodiments, in general, the process for selecting images for a CAPTCHA starts with a set of candidate images from which to select the CAPTCHA images. For each of these candidate images, as much publicly and privately available information is obtained as possible. Upon obtaining the pertinent information, the information is given as input to a CAPTCHA type-specific procedure. The procedure returns a judgment on whether the candidate image is suitable for use in the CAPTCHA under consideration based on the pertinent information about the candidate image. The procedure for image selection may make a decision on several candidate images at a time for some types of CAPTCHAs.
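The general selection loop described above can be sketched as follows. The dictionary schema with `public` and `private` keys and the predicate `little_public_overlap` are assumptions made for illustration; the disclosure only requires that a type-specific procedure judge each candidate from its public and private information.

```python
def select_captcha_images(candidates, is_suitable, k=1):
    """Generic selection loop: gather each candidate's public and
    private information and ask a CAPTCHA-type-specific predicate
    whether the candidate qualifies; stop after k selections."""
    selected = []
    for image in candidates:
        if is_suitable(image["public"], image["private"]):
            selected.append(image)
            if len(selected) == k:
                break
    return selected

# Example predicate for an image-description CAPTCHA: prefer images
# whose public metadata reveals nothing about the private label.
def little_public_overlap(public_tags, private_tags):
    return len(set(public_tags) & set(private_tags)) == 0

candidates = [
    {"public": ["flower"], "private": ["white orchid"], "url": "a.jpg"},
    {"public": ["white orchid"], "private": ["white orchid"], "url": "b.jpg"},
]
picked = select_captcha_images(candidates, little_public_overlap)
```

For CAPTCHA types that judge several images jointly (e.g., image similarity), the predicate would instead receive a group of candidates at a time.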
  • Although the steps of the method illustrated in FIG. 2 are described as occurring in a particular order, the present disclosure contemplates the steps of the method illustrated in FIG. 2 occurring in any suitable order.
  • Refining the Image Selection Process
  • There are different types of CAPTCHA challenges. The two sample CAPTCHA tests illustrated in FIGS. 1A and 1B request the testee to recognize and describe the subject matter of the image presented. Other types of CAPTCHA include requesting the testee to select one image from multiple images that best matches or is most similar to a target image, to find the boundary or geometric center of an image that is positioned among multiple connected and optionally distorted images, etc. According to particular embodiments, if the specific type of the CAPTCHA to be constructed is known, then the selection of the image(s) used for the CAPTCHA may be further refined based on the type of the CAPTCHA to be constructed.
  • FIG. 3 illustrates an image-description type CAPTCHA 300. An image 310 is presented to the testee. The challenge is for the testee to describe the subject matter of the image 310. The challenge may be presented in a variety of ways. For example, in one option, the testee may be asked to provide a word or a phrase 320 that describes the object in the image 310. In an alternative option, the testee may be asked to select one of the multiple words provided 330 that describes the object in the image 310. For this type of CAPTCHA, it is preferable that the image selected has little or no public information and/or incorrect public information.
  • For a computer program to solve image-description type CAPTCHAs, in addition to image recognition or pattern recognition algorithms, which often do not provide satisfactory results, the computer program may use publicly available information associated with the presented image to help determine the subject matter of the presented image. The public information may include tags, descriptions, and other publicly available data associated with the image. Thus, the less public information of the presented image is available, the more difficult it is for the computer program to guess the correct subject matter of the presented image. Alternatively or in addition, if the presented image has incorrect public information, it may mislead the computer program into guessing the wrong response. Whether an image's public information is correct or incorrect may be determined by comparing the image's public information against the image's private information. If the two sets of information largely disagree, it is likely that the image's public information is largely incorrect.
  • Thus, to construct an image-description type CAPTCHA, it is desirable to select an image that has public information that is orthogonal to the correct response of the CAPTCHA. Generally, the less correct public information and/or the more incorrect public information the presented image has, the more difficult it is for the computer program to guess the correct response. In step 220, when selecting an image from the candidate images to construct an image-description type CAPTCHA, the selection process may be further refined by analyzing the amount and/or the correctness of the public information of each candidate image and selecting an image that has relatively little or no public information and/or mostly incorrect public information.
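The refinement for image-description CAPTCHAs can be sketched as a scoring function: penalize public tags that agree with the trusted private tags (they leak the answer) and, more lightly, any public tags at all. The tag-set representation and the weights are assumptions, for illustration only.

```python
def public_info_score(public_tags, private_tags):
    """Lower is better for an image-description CAPTCHA candidate:
    few public tags overall, and few of them agreeing with the
    (trusted) private tags."""
    public = {t.lower() for t in public_tags}
    private = {t.lower() for t in private_tags}
    correct = len(public & private)      # public info leaking the answer
    return correct + 0.1 * len(public)   # light penalty for any public info

def pick_description_image(candidates):
    """Select the candidate whose public information is least useful
    to an automated solver."""
    return min(candidates,
               key=lambda c: public_info_score(c["public"], c["private"]))

candidates = [
    {"public": ["flower", "garden"], "private": ["white orchid"]},
    {"public": ["white orchid"], "private": ["white orchid"]},
]
best = pick_description_image(candidates)
```

The first candidate wins: its public tags are plausible yet orthogonal to the correct response.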
  • FIG. 4 illustrates an image-similarity type CAPTCHA 400. A target image 410 and a set of additional choice images 421, 422, 423, 424 are presented to the testee. The challenge is for the testee to select from the additional choice images 421, 422, 423, 424 one image that matches or is similar to the target image 410. In this example, the testee is asked to select one of the additional choice images 421, 422, 423, 424 that shows the same person as that in the target image 410.
  • For a computer program to solve image-similarity type CAPTCHAs, the computer program may use public information associated with the presented images to help determine which of the additional choice images shows the same subject matter as that in the target image. If the correct response image and the target image have little or no similar public information, then it may be difficult for the computer program to guess the correct response image using the public information of each of the presented images. In this example, the image 422 shows the same person as that in the target image 410. If the image 422 and the target image 410 share little or no similar public information, then it may be difficult for the computer program to guess that the image 422 is the correct response image. Optionally in addition, if the incorrect response images, i.e., the images 421, 423, 424, share similar public information with the target image 410, then it may mislead the computer program into guessing the wrong response image.
  • Thus, to construct an image-similarity type CAPTCHA, it is desirable to select a set of images where the target image and the correct response image share little or no similar public information, i.e., the difference between the public information of the target image and the public information of the correct response image is relatively large. Generally, the less similar public information shared between the target image and the correct response image, the more difficult it is for the computer program to guess the correct response image using each image's public information. In step 220, when selecting images from the candidate images to construct an image-similarity type CAPTCHA, the selection process may be further refined by analyzing the degrees of similarity between the public information of the selected target image and the public information of each of the selected additional choice images and selecting a target image and a correct response image that share little or no similar public information. Optionally in addition, those additional choice images other than the correct response image may be selected based on their sharing similar public information with the target image.
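The image-similarity refinement can be sketched with a Jaccard similarity over public tag sets: pick a correct response whose public tags barely overlap with the target's, and distractors whose public tags overlap heavily. The `same_subject` oracle (standing in for the private information), the `max_sim` threshold, and the tag-set representation are assumptions for illustration.

```python
def jaccard(a, b):
    """Jaccard similarity between two tag collections."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

def assemble_similarity_captcha(target, pool, same_subject,
                                n_choices=4, max_sim=0.1):
    """Pick a correct response image whose public tags barely overlap
    the target's, plus distractors whose public tags do overlap."""
    response = min(
        (img for img in pool if same_subject(img, target)),
        key=lambda img: jaccard(img["public"], target["public"]),
    )
    if jaccard(response["public"], target["public"]) > max_sim:
        return None  # no sufficiently "orthogonal" correct answer exists
    distractors = sorted(
        (img for img in pool if not same_subject(img, target)),
        key=lambda img: -jaccard(img["public"], target["public"]),
    )[: n_choices - 1]
    return {"target": target, "response": response,
            "distractors": distractors}

target = {"public": ["bridge", "sf"], "subject": "p1"}
pool = [
    {"public": ["sunset"], "subject": "p1"},   # same person, disjoint tags
    {"public": ["bridge"], "subject": "p2"},   # misleading distractor
    {"public": ["cat"], "subject": "p3"},
]
captcha = assemble_similarity_captcha(
    target, pool, lambda a, b: a["subject"] == b["subject"])
```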
  • Computer System
  • The method described above may be implemented as computer software using computer-readable instructions and physically stored in a computer-readable medium. A “computer-readable medium” as used herein may be any medium that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. The computer-readable medium may be, by way of example only and not by limitation, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, device, propagation medium, or computer memory.
  • The computer software may be encoded using any suitable computer languages, including future programming languages. Different programming techniques can be employed, such as, for example, procedural or object oriented. The software instructions may be executed on various types of computers, including single or multiple processor devices.
  • Embodiments of the present disclosure may be implemented by using a programmed general-purpose digital computer, application-specific integrated circuits, programmable logic devices, field-programmable gate arrays, or optical, chemical, biological, quantum, or nano-engineered systems, components, and mechanisms. In general, the functions of the present disclosure can be achieved by any means as is known in the art. Distributed or networked systems, components, and circuits can be used. Communication, or transfer, of data may be wired, wireless, or by any other means.
  • For example, FIG. 5 illustrates a computer system 500 suitable for implementing embodiments of the present disclosure. The components shown in FIG. 5 for computer system 500 are exemplary in nature and are not intended to suggest any limitation as to the scope of use or functionality of the computer software implementing embodiments of the present disclosure. Neither should the configuration of components be interpreted as having any dependency or requirement relating to any one or combination of components illustrated in the exemplary embodiment of a computer system. Computer system 500 may have many physical forms including an integrated circuit, a printed circuit board, a small handheld device (such as a mobile telephone or PDA), a personal computer or a super computer.
  • Computer system 500 includes a display 532, one or more input devices 533 (e.g., keypad, keyboard, mouse, stylus, etc.), one or more output devices 534 (e.g., speaker), one or more storage devices 535, and various types of storage media 536.
  • The system bus 540 links a wide variety of subsystems. As understood by those skilled in the art, a “bus” refers to a plurality of digital signal lines serving a common function. The system bus 540 may be any of several types of bus structures including a memory bus, a peripheral bus, and a local bus using any of a variety of bus architectures. By way of example and not limitation, such architectures include the Industry Standard Architecture (ISA) bus, Enhanced ISA (EISA) bus, the Micro Channel Architecture (MCA) bus, the Video Electronics Standards Association local (VLB) bus, the Peripheral Component Interconnect (PCI) bus, the PCI-Express bus (PCI-X), and the Accelerated Graphics Port (AGP) bus.
  • Processor(s) 501 (also referred to as central processing units, or CPUs) optionally contain a cache memory unit 502 for temporary local storage of instructions, data, or computer addresses. Processor(s) 501 are coupled to storage devices including memory 503. Memory 503 includes random access memory (RAM) 504 and read-only memory (ROM) 505. As is well known in the art, ROM 505 acts to transfer data and instructions uni-directionally to the processor(s) 501, and RAM 504 is used typically to transfer data and instructions in a bi-directional manner. Both of these types of memories may include any suitable computer-readable media described below.
  • A fixed storage 508 is also coupled bi-directionally to the processor(s) 501, optionally via a storage control unit 507. It provides additional data storage capacity and may also include any of the computer-readable media described below. Storage 508 may be used to store operating system 509, EXECs 510, application programs 512, data 511, and the like and is typically a secondary storage medium (such as a hard disk) that is slower than primary storage. It should be appreciated that the information retained within storage 508 may, in appropriate cases, be incorporated in standard fashion as virtual memory in memory 503.
  • Processor(s) 501 are also coupled to a variety of interfaces such as graphics control 521, video interface 522, input interface 523, output interface, storage interface 525, and these interfaces in turn are coupled to the appropriate devices. In general, an input/output device may be any of: video displays, track balls, mice, keyboards, microphones, touch-sensitive displays, transducer card readers, magnetic or paper tape readers, tablets, styluses, voice or handwriting recognizers, biometrics readers, or other computers. Processor(s) 501 may be coupled to another computer or telecommunications network 530 using network interface 520. With such a network interface 520, it is contemplated that the CPU 501 might receive information from the network 530, or might output information to the network in the course of performing the above-described method steps. Furthermore, method embodiments of the present disclosure may execute solely upon CPU 501 or may execute over a network 530 such as the Internet in conjunction with a remote CPU 501 that shares a portion of the processing.
  • According to various embodiments, when in a network environment, i.e., when computer system 500 is connected to network 530, computer system 500 may communicate with other devices that are also connected to network 530. Communications may be sent to and from computer system 500 via network interface 520. For example, incoming communications, such as a request or a response from another device, in the form of one or more packets, may be received from network 530 at network interface 520 and stored in selected sections in memory 503 for processing. Outgoing communications, such as a request or a response to another device, again in the form of one or more packets, may also be stored in selected sections in memory 503 and sent out to network 530 at network interface 520. Processor(s) 501 may access these communication packets stored in memory 503 for processing.
  • In addition, embodiments of the present disclosure further relate to computer storage products with a computer-readable medium that have computer code thereon for performing various computer-implemented operations. The media and computer code may be those specially designed and constructed for the purposes of the present disclosure, or they may be of the kind well known and available to those having skill in the computer software arts. Examples of computer-readable media include, but are not limited to: magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD-ROMs and holographic devices; magneto-optical media such as floptical disks; and hardware devices that are specially configured to store and execute program code, such as application-specific integrated circuits (ASICs), programmable logic devices (PLDs) and ROM and RAM devices. Examples of computer code include machine code, such as produced by a compiler, and files containing higher-level code that are executed by a computer using an interpreter.
  • As an example and not by way of limitation, the computer system having architecture 500 may provide functionality as a result of processor(s) 501 executing software embodied in one or more tangible, computer-readable media, such as memory 503. The software implementing various embodiments of the present disclosure may be stored in memory 503 and executed by processor(s) 501. A computer-readable medium may include one or more memory devices, according to particular needs. Memory 503 may read the software from one or more other computer-readable media, such as mass storage device(s) 535 or from one or more other sources via communication interface. The software may cause processor(s) 501 to execute particular processes or particular steps of particular processes described herein, including defining data structures stored in memory 503 and modifying such data structures according to the processes defined by the software. In addition or as an alternative, the computer system may provide functionality as a result of logic hardwired or otherwise embodied in a circuit, which may operate in place of or together with software to execute particular processes or particular steps of particular processes described herein. Reference to software may encompass logic, and vice versa, where appropriate. Reference to a computer-readable media may encompass a circuit (such as an integrated circuit (IC)) storing software for execution, a circuit embodying logic for execution, or both, where appropriate. The present disclosure encompasses any suitable combination of hardware and software.
  • A “processor,” “process,” or “act” includes any human, hardware and/or software system, mechanism or component that processes data, signals or other information. A processor can include a system with a general-purpose central processing unit, multiple processing units, dedicated circuitry for achieving functionality, or other systems. Processing need not be limited to a geographic location, or have temporal limitations. For example, a processor can perform its functions in “real time,” “offline,” in a “batch mode,” etc. Portions of processing can be performed at different times and at different locations, by different (or the same) processing systems.
  • Although the acts, operations or computations disclosed herein may be presented in a specific order, this order may be changed in different embodiments. In addition, the various acts disclosed herein may be repeated one or more times using any suitable order. In some embodiments, multiple acts described as sequential in this disclosure can be performed at the same time. The sequence of operations described herein can be interrupted, suspended, or otherwise controlled by another process, such as an operating system, kernel, etc. The acts can operate in an operating system environment or as stand-alone routines occupying all, or a substantial part, of the system processing.
  • Reference throughout the present disclosure to “particular embodiment,” “example embodiment,” “illustrated embodiment,” “some embodiments,” “various embodiments,” “one embodiment,” or “an embodiment” means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the present disclosure and not necessarily in all embodiments. Thus, respective appearances of the phrases “in a particular embodiment,” “in one embodiment,” “in some embodiments,” or “in various embodiments” in various places throughout this specification are not necessarily referring to the same embodiment. Furthermore, the particular features, structures, or characteristics of any specific embodiment of the present disclosure may be combined in any suitable manner with one or more other embodiments. It is to be understood that other variations and modifications of the embodiments of the present disclosure described and illustrated herein are possible in light of the teachings herein and are to be considered as part of the spirit and scope of the present disclosure.
  • It will also be appreciated that one or more of the elements depicted in FIGS. 1 through 5 can also be implemented in a more separated or integrated manner, or even removed or rendered as inoperable in certain cases, as is useful in accordance with a particular application.
  • As used in the description herein and throughout the claims that follow, “a”, “an”, and “the” includes plural references unless the context clearly dictates otherwise. Also, as used in the description herein and throughout the claims that follow, the meaning of “in” includes “in” and “on” unless the context clearly dictates otherwise. Additionally, the term “or” as used herein is generally intended to mean “and/or” unless otherwise indicated. Combinations of components or steps will also be considered as being noted, where terminology renders the ability to separate or combine unclear.
  • While this disclosure has described several preferred embodiments, there are alterations, permutations, and various substitute equivalents, which fall within the scope of this disclosure. It should also be noted that there are many alternative ways of implementing the methods and apparatuses of the present disclosure. It is therefore intended that the following appended claims be interpreted as including all such alterations, permutations, and various substitute equivalents as fall within the true spirit and scope of the present disclosure.

Claims (20)

1. A method, comprising:
accessing a plurality of candidate images, each candidate image comprising public information and private information;
selecting one or more images from the plurality of candidate images based at least in part on each image's public information and private information; and
constructing a CAPTCHA comprising the one or more images, a challenge, and a correct response,
wherein it is difficult for a computer to automatically determine the correct response using only the public information of each of the one or more images, and
wherein the private information of each of the one or more images is accessible only to an entity responsible for constructing the CAPTCHA.
2. The method as recited in claim 1, wherein it is nearly impossible for the computer to automatically determine the correct response of the CAPTCHA using only the public information of each of the one or more images.
3. The method as recited in claim 1, wherein it is impossible for the computer to automatically determine the correct response of the CAPTCHA using only the public information of each of the one or more images.
4. The method as recited in claim 1, wherein selecting the one or more images from the plurality of candidate images is further based on a type of the CAPTCHA.
5. The method as recited in claim 4, wherein:
the one or more images comprises one image,
the challenge comprises describing a subject matter of the image, and
the image is selected from the plurality of candidate images based on the image lacking public information or having incorrect public information.
6. The method as recited in claim 4, wherein:
the one or more images comprises one target image and a plurality of choice images,
the challenge comprises selecting a response image from the plurality of choice images that has a subject matter that is most similar to a subject matter of the target image among the plurality of choice images, and
the target image and the plurality of choice images are selected from the plurality of candidate images based on the target image and the response image having different public information.
7. The method as recited in claim 6, wherein the target image and the plurality of choice images are selected from the plurality of candidate images further based on the target image and each of the plurality of choice images other than the response image having similar public information.
8. The method as recited in claim 1, further comprising obtaining the plurality of candidate images from the Internet.
9. The method as recited in claim 8, further comprising obtaining the private information of each of the plurality of candidate images from search queries communicated to a search engine associated with the entity.
10. The method as recited in claim 8, further comprising obtaining the private information of each of the plurality of candidate images from logs generated by a server associated with the entity, wherein the logs comprise data relating to Internet activities conducted via the server.
11. A computer program product comprising a plurality of computer program instructions physically stored in a computer-readable medium, wherein the plurality of computer program instructions are operable to cause at least one computing device to:
access a plurality of candidate images, each candidate image comprising public information and private information;
select one or more images from the plurality of candidate images based at least in part on each image's public information and private information; and
construct a CAPTCHA comprising the one or more images, a challenge, and a correct response,
wherein it is difficult for a computer to automatically determine the correct response using only the public information of each of the one or more images, and
wherein the private information of each of the one or more images is accessible only to an entity responsible for constructing the CAPTCHA.
12. The computer program product as recited in claim 11, wherein it is nearly impossible for the computer to automatically determine the correct response of the CAPTCHA using only the public information of each of the one or more images.
13. The computer program product as recited in claim 11, wherein it is impossible for the computer to automatically determine the correct response of the CAPTCHA using only the public information of each of the one or more images.
14. The computer program product as recited in claim 11, wherein to select the one or more images from the plurality of candidate images is further based on a type of the CAPTCHA.
15. The computer program product as recited in claim 14, wherein:
the one or more images comprises one image,
the challenge comprises describing a subject matter of the image, and
the image is selected from the plurality of candidate images based on the image lacking public information or having incorrect public information.
16. The computer program product as recited in claim 14, wherein:
the one or more images comprises one target image and a plurality of choice images,
the challenge comprises selecting a response image from the plurality of choice images that has a subject matter that is most similar to a subject matter of the target image among the plurality of choice images, and
the target image and the plurality of choice images are selected from the plurality of candidate images based on the target image and the response image having different public information.
17. The computer program product as recited in claim 16, wherein the target image and the plurality of choice images are selected from the plurality of candidate images further based on the target image and each of the plurality of choice images other than the response image having similar public information.
18. The computer program product as recited in claim 11, wherein the plurality of computer program instructions are further operable to cause the at least one computing device to obtain the plurality of candidate images from the Internet.
19. The computer program product as recited in claim 18, wherein the plurality of computer program instructions are further operable to cause the at least one computing device to obtain the private information of each of the plurality of candidate images from search queries communicated to a search engine associated with the entity.
20. The computer program product as recited in claim 18, wherein the plurality of computer program instructions are further operable to cause the at least one computing device to obtain the private information of each of the plurality of candidate images from logs generated by a server associated with the entity, wherein the logs comprise data relating to Internet activities conducted via the server.
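Although the claims above are the authoritative definition of the invention, the image-selection logic they recite can be illustrated with a short, non-normative sketch. Everything below (the `CandidateImage` fields, the tag-set representation of public and private information, and both selection functions) is hypothetical and chosen only for illustration; the claims do not prescribe any particular data structure or matching rule.

```python
from dataclasses import dataclass, field

@dataclass
class CandidateImage:
    url: str
    # Public information: e.g., tags or captions crawlable from the Web.
    public_tags: set = field(default_factory=set)
    # Private information: e.g., derived from the entity's search-query
    # logs, accessible only to the CAPTCHA constructor (claims 1, 9, 10).
    private_tags: set = field(default_factory=set)

def select_describe_image(candidates):
    """Claim 5 variant: pick an image whose public information is absent
    or inconsistent with its private information, so a machine cannot
    answer from public data alone; the private tags supply the correct
    response."""
    for img in candidates:
        public_is_wrong = bool(img.public_tags) and not (
            img.public_tags & img.private_tags)
        if (not img.public_tags or public_is_wrong) and img.private_tags:
            return img
    return None

def select_similarity_captcha(candidates, n_choices=4):
    """Claims 6-7 variant: pick a target and a response image that share
    private subject matter but have different public information, plus
    distractors whose public information resembles the target's."""
    for target in candidates:
        others = [c for c in candidates if c is not target]
        response = next(
            (c for c in others
             if c.private_tags & target.private_tags       # same subject (private)
             and not c.public_tags & target.public_tags),  # different public info
            None)
        if response is None:
            continue
        distractors = [c for c in others
                       if c is not response
                       and c.public_tags & target.public_tags        # similar public info
                       and not c.private_tags & target.private_tags]
        if len(distractors) >= n_choices - 1:
            return target, [response] + distractors[:n_choices - 1]
    return None
```

In this sketch, the describe-the-image variant returns an image a solver must label from its subject matter alone, while the similarity variant pairs a target with a response image that shares private subject matter but no public information, padding the choice set with publicly similar distractors.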
US12/397,561 2008-03-21 2009-03-04 Constructing image captchas utilizing private information of the images Abandoned US20100228804A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US12/397,561 US20100228804A1 (en) 2009-03-04 2009-03-04 Constructing image captchas utilizing private information of the images
US13/487,504 US9037595B2 (en) 2008-03-21 2012-06-04 Creating graphical models representing control flow of a program manipulating data resources

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US12/397,561 US20100228804A1 (en) 2009-03-04 2009-03-04 Constructing image captchas utilizing private information of the images

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US13/487,504 Continuation US9037595B2 (en) 2008-03-21 2012-06-04 Creating graphical models representing control flow of a program manipulating data resources

Publications (1)

Publication Number Publication Date
US20100228804A1 true US20100228804A1 (en) 2010-09-09

Family

ID=42679184

Family Applications (2)

Application Number Title Priority Date Filing Date
US12/397,561 Abandoned US20100228804A1 (en) 2008-03-21 2009-03-04 Constructing image captchas utilizing private information of the images
US13/487,504 Expired - Fee Related US9037595B2 (en) 2008-03-21 2012-06-04 Creating graphical models representing control flow of a program manipulating data resources

Family Applications After (1)

Application Number Title Priority Date Filing Date
US13/487,504 Expired - Fee Related US9037595B2 (en) 2008-03-21 2012-06-04 Creating graphical models representing control flow of a program manipulating data resources

Country Status (1)

Country Link
US (2) US20100228804A1 (en)

Cited By (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100306213A1 (en) * 2009-05-27 2010-12-02 Microsoft Corporation Merging Search Results
US20100325706A1 (en) * 2009-06-18 2010-12-23 John Hachey Automated test to tell computers and humans apart
US20110029781A1 (en) * 2009-07-31 2011-02-03 International Business Machines Corporation System, method, and apparatus for graduated difficulty of human response tests
US20110081640A1 (en) * 2009-10-07 2011-04-07 Hsia-Yen Tseng Systems and Methods for Protecting Websites from Automated Processes Using Visually-Based Children's Cognitive Tests
US20110113378A1 (en) * 2009-11-09 2011-05-12 International Business Machines Corporation Contextual abnormality captchas
US8613098B1 (en) * 2009-06-30 2013-12-17 Intuit Inc. Method and system for providing a dynamic image verification system to confirm human input
US20140068756A1 (en) * 2009-03-24 2014-03-06 Aol Inc. Systems and methods for challenge-response animation and randomization testing
CN103927465A (en) * 2014-01-05 2014-07-16 艾文卫 Verification code generating and verifying method based on graphs
US8793761B2 (en) 2011-08-10 2014-07-29 International Business Machines Corporation Cognitive pattern recognition for computer-based security access
US8875239B2 (en) 2011-08-10 2014-10-28 International Business Machines Corporation Cognitive pattern recognition for security access in a flow of tasks
US20150180856A1 (en) * 2013-01-04 2015-06-25 Gary Stephen Shuster Captcha systems and methods
US20150215299A1 (en) * 2014-01-30 2015-07-30 Novell, Inc. Proximity-based authentication
US9471767B2 (en) 2014-08-22 2016-10-18 Oracle International Corporation CAPTCHA techniques utilizing traceable images
CN106034029A (en) * 2015-03-20 2016-10-19 阿里巴巴集团控股有限公司 Verification method and apparatus based on image verification codes
US9794264B2 (en) 2015-01-26 2017-10-17 CodePix Inc. Privacy controlled network media sharing
CN107808079A (en) * 2014-07-04 2018-03-16 王纪清 Server, user device, and terminal device
US11204987B2 (en) * 2019-11-07 2021-12-21 Nxp B.V. Method for generating a test for distinguishing humans from computers
US11258810B2 (en) * 2015-11-16 2022-02-22 Tencent Technology (Shenzhen) Company Limited Identity authentication method, apparatus, and system

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9274762B2 (en) * 2011-11-16 2016-03-01 Raytheon Company System and method for developing an object-oriented system
US9170783B1 (en) * 2011-12-15 2015-10-27 The Mathworks, Inc. Class creation assistant for textual programming languages
US9304746B2 (en) * 2012-06-07 2016-04-05 Carmel-Haifa University Economic Corporation Ltd. Creating a user model using component based approach
CN106557413A (en) * 2015-09-25 2017-04-05 伊姆西公司 Based on the method and apparatus that code coverage obtains test case

Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6195698B1 (en) * 1998-04-13 2001-02-27 Compaq Computer Corporation Method for selectively restricting access to computer systems
US20060047766A1 (en) * 2004-08-30 2006-03-02 Squareanswer, Inc. Controlling transmission of email
US20090077628A1 (en) * 2007-09-17 2009-03-19 Microsoft Corporation Human performance in human interactive proofs using partial credit
US20090138723A1 (en) * 2007-11-27 2009-05-28 Inha-Industry Partnership Institute Method of providing completely automated public turing test to tell computer and human apart based on image
US20090232351A1 (en) * 2008-03-12 2009-09-17 Ricoh Company, Ltd. Authentication method, authentication device, and recording medium
US20090259588A1 (en) * 2006-04-24 2009-10-15 Jeffrey Dean Lindsay Security systems for protecting an asset
US20090328150A1 (en) * 2008-06-27 2009-12-31 John Nicholas Gross Progressive Pictorial & Motion Based CAPTCHAs
US20100031330A1 (en) * 2007-01-23 2010-02-04 Carnegie Mellon University Methods and apparatuses for controlling access to computer systems and for annotating media files
US20100037319A1 (en) * 2008-08-08 2010-02-11 Microsoft Corporation Two stage access control for intelligent storage device
US20100095350A1 (en) * 2008-10-15 2010-04-15 Towson University Universally usable human-interaction proof
US20100232719A1 (en) * 2006-09-08 2010-09-16 Google Inc. Shape Clustering in Post Optical Character Recognition Processing
US20110055585A1 (en) * 2008-07-25 2011-03-03 Kok-Wah Lee Methods and Systems to Create Big Memorizable Secrets and Their Applications in Information Engineering

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6550054B1 (en) * 1999-11-17 2003-04-15 Unisys Corporation Method for representing terminal-based applications in the unified modeling language
AU1948201A (en) * 1999-12-06 2001-06-12 Axiomatic Design Software, Inc. Method and apparatus for producing software
US7010779B2 (en) * 2001-08-16 2006-03-07 Knowledge Dynamics, Inc. Parser, code generator, and data calculation and transformation engine for spreadsheet calculations
US7412455B2 (en) * 2003-04-30 2008-08-12 Dillon David M Software framework that facilitates design and implementation of database applications
EP1788497A1 (en) * 2005-11-18 2007-05-23 Alcatel Lucent Design pattern and procedure for processing an object model
US8291372B2 (en) * 2008-03-21 2012-10-16 International Business Machines Corporation Creating graphical models representing control flow of a program manipulating data resources

Cited By (31)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140068756A1 (en) * 2009-03-24 2014-03-06 Aol Inc. Systems and methods for challenge-response animation and randomization testing
US8910308B2 (en) * 2009-03-24 2014-12-09 Aol Inc. Systems and methods for challenge-response animation and randomization testing
US9495460B2 (en) * 2009-05-27 2016-11-15 Microsoft Technology Licensing, Llc Merging search results
US20100306213A1 (en) * 2009-05-27 2010-12-02 Microsoft Corporation Merging Search Results
US20100325706A1 (en) * 2009-06-18 2010-12-23 John Hachey Automated test to tell computers and humans apart
US9225531B2 (en) * 2009-06-18 2015-12-29 Visa International Service Association Automated test to tell computers and humans apart
US10097360B2 (en) 2009-06-18 2018-10-09 Visa International Service Association Automated test to tell computers and humans apart
US8613098B1 (en) * 2009-06-30 2013-12-17 Intuit Inc. Method and system for providing a dynamic image verification system to confirm human input
US8589694B2 (en) * 2009-07-31 2013-11-19 International Business Machines Corporation System, method, and apparatus for graduated difficulty of human response tests
US20110029781A1 (en) * 2009-07-31 2011-02-03 International Business Machines Corporation System, method, and apparatus for graduated difficulty of human response tests
US20110081640A1 (en) * 2009-10-07 2011-04-07 Hsia-Yen Tseng Systems and Methods for Protecting Websites from Automated Processes Using Visually-Based Children's Cognitive Tests
US8495518B2 (en) * 2009-11-09 2013-07-23 International Business Machines Corporation Contextual abnormality CAPTCHAs
US20110113378A1 (en) * 2009-11-09 2011-05-12 International Business Machines Corporation Contextual abnormality captchas
US8793761B2 (en) 2011-08-10 2014-07-29 International Business Machines Corporation Cognitive pattern recognition for computer-based security access
US8875239B2 (en) 2011-08-10 2014-10-28 International Business Machines Corporation Cognitive pattern recognition for security access in a flow of tasks
US20150180856A1 (en) * 2013-01-04 2015-06-25 Gary Stephen Shuster Captcha systems and methods
US9166974B2 (en) * 2013-01-04 2015-10-20 Gary Shuster Captcha systems and methods
US9467436B2 (en) * 2013-01-04 2016-10-11 Gary Stephen Shuster Captcha systems and methods
US10298569B2 (en) * 2013-01-04 2019-05-21 Gary Stephen Shuster CAPTCHA systems and methods
US9860247B2 (en) * 2013-01-04 2018-01-02 Gary Stephen Shuster CAPTCHA systems and methods
CN103927465A (en) * 2014-01-05 2014-07-16 艾文卫 Verification code generating and verifying method based on graphs
US20150215299A1 (en) * 2014-01-30 2015-07-30 Novell, Inc. Proximity-based authentication
US9722984B2 (en) * 2014-01-30 2017-08-01 Netiq Corporation Proximity-based authentication
CN107808079A (en) * 2014-07-04 2018-03-16 王纪清 Server, user device, and terminal device
US9870461B2 (en) 2014-08-22 2018-01-16 Oracle International Corporation CAPTCHA techniques utilizing traceable images
US9471767B2 (en) 2014-08-22 2016-10-18 Oracle International Corporation CAPTCHA techniques utilizing traceable images
US9794264B2 (en) 2015-01-26 2017-10-17 CodePix Inc. Privacy controlled network media sharing
CN106034029A (en) * 2015-03-20 2016-10-19 阿里巴巴集团控股有限公司 Verification method and apparatus based on image verification codes
US10817615B2 (en) 2015-03-20 2020-10-27 Alibaba Group Holding Limited Method and apparatus for verifying images based on image verification codes
US11258810B2 (en) * 2015-11-16 2022-02-22 Tencent Technology (Shenzhen) Company Limited Identity authentication method, apparatus, and system
US11204987B2 (en) * 2019-11-07 2021-12-21 Nxp B.V. Method for generating a test for distinguishing humans from computers

Also Published As

Publication number Publication date
US20120240099A1 (en) 2012-09-20
US9037595B2 (en) 2015-05-19

Similar Documents

Publication Publication Date Title
US20100228804A1 (en) Constructing image captchas utilizing private information of the images
US9165129B2 (en) Keyboard as biometric authentication device
US11516210B1 (en) Image-based authentication systems and methods
WO2015074496A1 (en) Identity authentication method and device and storage medium
US8873814B2 (en) System and method for using fingerprint sequences for secured identity verification
US10789078B2 (en) Method and system for inputting information
US9805120B2 (en) Query selection and results merging
US11361068B2 (en) Securing passwords by using dummy characters
US20170371866A1 (en) Language model using reverse translations
US20180004844A1 (en) Method and system for presenting content summary of search results
CN109783631B (en) Community question-answer data verification method and device, computer equipment and storage medium
US8639679B1 (en) Generating query suggestions
US20140095308A1 (en) Advertisement distribution apparatus and advertisement distribution method
JP2007522551A (en) Multi-select challenge-response user authentication system and method
US10133859B2 (en) Managing registration of user identity using handwriting
CN102214034A (en) Display apparatus, authentication method, and program
US20180300466A1 (en) Method and appapratus for controlling electronic device, and electrode device
CN103703461A (en) Detecting source languages of search queries
US20220100839A1 (en) Open data biometric identity validation
CN106796608A (en) Contextual search character string synonym is automatically generated
CN108053545A (en) Certificate verification method and apparatus, server, storage medium
US11481648B2 (en) Software categorization based on knowledge graph and machine learning techniques
Mohamed et al. On the security and usability of dynamic cognitive game CAPTCHAs
US20110047447A1 (en) Hyperlinking Web Content
CN110858244B (en) Verification method, data processing method, computer equipment and storage medium

Legal Events

Date Code Title Description
AS Assignment

Owner name: YAHOO! INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:DASGUPTA, ANIRBAN;RAVIKUMAR, SHANMUGASUNDARAM;PUNERA, KUNAL;REEL/FRAME:022342/0725

Effective date: 20090225

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION

AS Assignment

Owner name: YAHOO HOLDINGS, INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:YAHOO! INC.;REEL/FRAME:042963/0211

Effective date: 20170613

AS Assignment

Owner name: OATH INC., NEW YORK

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:YAHOO HOLDINGS, INC.;REEL/FRAME:045240/0310

Effective date: 20171231