US20030153299A1 - Event manager for use in fraud detection - Google Patents
- Publication number
- US20030153299A1 (application Ser. No. 10/335,499)
- Authority
- US
- United States
- Prior art keywords
- events
- score
- event
- fraud
- executable code
- Prior art date
- Legal status: Abandoned (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M15/00—Arrangements for metering, time-control or time indication ; Metering, charging or billing arrangements for voice wireline or wireless communications, e.g. VoIP
- H04M15/41—Billing record details, i.e. parameters, identifiers, structure of call data record [CDR]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q20/00—Payment architectures, schemes or protocols
- G06Q20/38—Payment protocols; Details thereof
- G06Q20/40—Authorisation, e.g. identification of payer or payee, verification of customer or shop credentials; Review and approval of payers, e.g. check credit lines or negative lists
- G06Q20/401—Transaction verification
- G06Q20/4016—Transaction verification involving fraud or risk level assessment in transaction processing
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M15/00—Arrangements for metering, time-control or time indication ; Metering, charging or billing arrangements for voice wireline or wireless communications, e.g. VoIP
- H04M15/43—Billing software details
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M15/00—Arrangements for metering, time-control or time indication ; Metering, charging or billing arrangements for voice wireline or wireless communications, e.g. VoIP
- H04M15/44—Augmented, consolidated or itemized billing statement or bill presentation
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M15/00—Arrangements for metering, time-control or time indication ; Metering, charging or billing arrangements for voice wireline or wireless communications, e.g. VoIP
- H04M15/47—Fraud detection or prevention means
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W12/00—Security arrangements; Authentication; Protecting privacy or anonymity
- H04W12/12—Detection or prevention of fraud
- H04W12/126—Anti-theft arrangements, e.g. protection against subscriber identity module [SIM] cloning
Definitions
- This application relates to the field of telecommunications and more particularly to the field of fraud detection in telecommunications systems.
- Fraud may be as simple as the physical theft of a wireless handset, or an application for a wireless subscription made with no intention of paying.
- Other fraud is more sophisticated.
- Tumbling-clone fraud entails the interception of a number of valid identification numbers from over-the-air wireless traffic, and the use of each identifier in sequence to render detection of the fraudulent activity more difficult.
- Some fraudulent activity focuses on fraudulently obtaining subscriptions. For example, a thief might falsify application information or steal valid personal information from another individual.
- A fraud detection system receives data relating to telecommunications activity.
- Event generators generate events from the received data, with each event having a weight corresponding to an increased or decreased likelihood of fraud.
- The aggregated events for a subject determine a score for the subject, which is used to prioritize the subject in an investigation queue.
- Human analysts are assigned to open investigations on the investigation queue according to the priority of subjects. In this manner, investigation resources can be applied more effectively to high-risk subscribers and events.
- A method for detecting fraud in a telecommunications system includes: receiving one or more events relating to a subscriber; combining the one or more events to provide a score; and storing the subscriber and the score in an investigation queue if the score exceeds a predetermined threshold.
- The method may further include repeating the above for a plurality of subscribers; and storing a plurality of suspect subscribers in the investigation queue, each one of the plurality of suspect subscribers having a score that exceeds the predetermined threshold.
- The method may further include prioritizing the investigation queue according to the plurality of scores.
- The method may include updating the score of one of the plurality of suspect subscribers to provide an updated score, and removing the one of the plurality of suspect subscribers from the investigation queue if the updated score does not exceed the predetermined threshold.
- The method may also include assigning a human analyst to investigate one of the plurality of suspect subscribers.
- The method may include determining a region for each one of the plurality of suspect subscribers; and assigning a regional human analyst to investigate those ones of the plurality of suspect subscribers having a particular region.
- Assigning a human analyst may further include: receiving a request to investigate from the human analyst; assigning to the human analyst the one of the plurality of suspect subscribers having the highest priority; and removing that one of the plurality of suspect subscribers from the investigation queue.
- Combining the one or more events to provide a score may further include: weighting the one or more events according to one or more event weights, thereby providing one or more weighted events; and summing the one or more weighted events to provide the score.
- This method may further include aging each of the one or more weighted events using a half-life.
- The one or more event weights may be discounted according to a match quality.
- The one or more event weights may be determined using logistic regression.
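As an illustrative sketch (not the patented implementation; the function names, sample weights, half-lives, and threshold below are assumptions), weighting, match-quality discounting, half-life aging, and summing might combine as follows:

```python
from datetime import datetime, timedelta

def aged_weight(weight: float, match_quality: float, event_time: datetime,
                now: datetime, half_life_days: float) -> float:
    """Discount the weight by match quality, then halve it once per half-life."""
    elapsed_days = (now - event_time).total_seconds() / 86400.0
    return weight * match_quality * 0.5 ** (elapsed_days / half_life_days)

def score_events(events, now: datetime) -> float:
    """Sum the aged, discounted weights of a subscriber's events into a score."""
    return sum(aged_weight(w, mq, t, now, hl) for (w, mq, t, hl) in events)

now = datetime(2003, 1, 1)
events = [
    (10.0, 1.0, now - timedelta(days=30), 30.0),  # one half-life old: contributes 5.0
    (4.0, 0.5, now, 30.0),                        # fresh, partial match: contributes 2.0
]
score = score_events(events, now)    # 7.0
queue_subscriber = score > 6.0       # exceeds an assumed threshold: queue for investigation
```

A neural network, as described above, would simply replace `score_events` with a trained model over the same event inputs.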
- Combining the one or more events to provide a score may further include feeding the one or more events to a neural network, the neural network being trained to generate a score indicative of possible fraud from the one or more events.
- This method may further include prioritizing the investigation queue according to the plurality of scores.
- A system for detecting telecommunications fraud includes: means for receiving one or more events relating to a subscriber; means for combining the one or more events to provide a score; and means for storing the subscriber and the score in an investigation queue if the score exceeds a predetermined threshold.
- The system may further include: means for applying the receiving means, the combining means, and the storing means to a plurality of subscribers; and means for storing a plurality of suspect subscribers in the investigation queue, each one of the plurality of suspect subscribers having a score that exceeds the predetermined threshold.
- The system may further include means for prioritizing the investigation queue according to the plurality of scores.
- The system may further include means for removing one of the plurality of suspect subscribers from the investigation queue if the one of the plurality of suspect subscribers has not been investigated within a predetermined time.
- The system may further include means for assigning a human analyst to investigate one of the plurality of suspect subscribers.
- The system may further include: means for determining a region for each one of the plurality of suspect subscribers; and means for assigning a regional human analyst to investigate those ones of the plurality of suspect subscribers having a particular region.
- The assigning means may include: means for receiving a request to investigate from the human analyst; and means for assigning to the human analyst the one of the plurality of suspect subscribers having the highest priority.
- The combining means may further include: means for weighting the one or more events according to one or more event weights, thereby providing one or more weighted events; and means for summing the one or more weighted events to provide the score.
- The system may further include means for aging each of the one or more weighted events using a half-life.
- The one or more event weights may be discounted according to a match quality.
- The one or more event weights may be determined using logistic regression.
- The combining means may further include means for feeding the one or more events to a neural network, the neural network being trained to generate a score indicative of possible fraud from the one or more events.
- The system may further include means for prioritizing the investigation queue according to the plurality of scores.
- A computer program for detecting telecommunications fraud may be embodied in machine executable code including: machine executable code to receive one or more events relating to a subscriber; machine executable code to combine the one or more events to provide a score; and machine executable code to store the subscriber and the score in an investigation queue if the score exceeds a predetermined threshold.
- The computer program may further include: machine executable code to repeat the machine executable code to receive, the machine executable code to combine, and the machine executable code to store for a plurality of subscribers; and machine executable code to store a plurality of suspect subscribers in the investigation queue, each one of the plurality of suspect subscribers having a score that exceeds the predetermined threshold.
- The computer program may further include machine executable code to prioritize the investigation queue according to the plurality of scores.
- The computer program may further include machine executable code to remove one of the plurality of suspect subscribers from the investigation queue if the one of the plurality of suspect subscribers has not been investigated within a predetermined time.
- The computer program may include machine executable code to assign a human analyst to investigate one of the plurality of suspect subscribers.
- The computer program may further include machine executable code to determine a region for each one of the plurality of suspect subscribers; and machine executable code to assign a regional human analyst to investigate those ones of the plurality of suspect subscribers having a particular region.
- The machine executable code to assign a human analyst may further include machine executable code to receive a request to investigate from the human analyst; and machine executable code to assign to the human analyst the one of the plurality of suspect subscribers having the highest priority.
- The machine executable code to combine the one or more events to provide a score may further include machine executable code to weight the one or more events according to one or more event weights, thereby providing one or more weighted events; and machine executable code to sum the one or more weighted events to provide the score.
- The computer program may further include machine executable code to age each of the one or more weighted events using a half-life. One or more of the event weights may be discounted according to a match quality. The one or more event weights may be determined using logistic regression.
- The machine executable code to combine the one or more events to provide a score may further include machine executable code to feed the one or more events to a neural network, the neural network being trained to generate a score indicative of possible fraud from the one or more events.
- The computer program may further include machine executable code to prioritize the investigation queue according to the plurality of scores.
- FIG. 1 is a block diagram of a telecommunications fraud detection system according to the principles of the invention.
- FIG. 2 is a block diagram of the software components used in a telecommunications fraud detection system according to an embodiment of the present invention.
- FIG. 3 shows the data records used by an embodiment of an event manager according to the principles of the invention.
- FIGS. 4A-4B are a flow chart showing event management according to the principles of the invention.
- FIG. 5 shows system parameters for a fraud detection system according to the principles of the invention.
- FIG. 6 shows the events generated by one embodiment of the provisioning loader process, using monthly billing information.
- FIG. 7 is a flow chart of a fuzzy matching process used in a preferred embodiment of the fraud detection system.
- FIG. 8 shows a graphical user interface screen presented by the client.
- FIG. 1 is a block diagram of a telecommunications fraud detection system according to one embodiment of the present invention.
- The fraud detection system 100 comprises a switch 102 connected in a communicating relationship to a Call Detail Record (“CDR”) loader 104, an event manager 106, a fraud database 108, and a billing system 110 connected in a communicating relationship with a provisioning loader 112.
- The CDR loader 104, the event manager 106, the fraud database 108, and the provisioning loader 112 are further connected in a communicating relationship with a web server 114.
- The web server 114 is connected in a communicating relationship with one or more analyst devices 116.
- The switch 102 may be any wired or wireless telecommunications switch, or may be a mediation device for aggregating a number of switches.
- The switch 102 may also include a roamer tape or roamer network connection to receive call information relating to local subscribers from other switches or geographic regions.
- The switch 102 forwards CDRs to the CDR loader 104, which receives one or more records for each call from the switch 102 and converts each received record into a common CDR format.
- The CDR loader 104 generates events based upon the received CDRs.
- The CDR loader 104 is preferably a server, such as one manufactured by Sun Microsystems or Hewlett-Packard, with a mass storage device suited to the volume of CDRs received from the switch 102, and including a suitable interface to a data network 118.
- The data network 118 is preferably a 100Base-T network compatible with the Institute of Electrical and Electronics Engineers (“IEEE”) 802.3 standard, commonly referred to as “Ethernet.”
- The data network 118 may also include wireless communication systems or internetwork components such as the Internet or the Public Switched Telephone Network.
- The CDR loader 104 also includes suitable processing power to execute programs that examine received CDRs and generate events based upon them.
- The CDR loader 104 transmits the generated events to the event manager 106 over the data network 118.
- The fraud database 108 is preferably a server, such as one manufactured by Sun Microsystems or Hewlett-Packard.
- The fraud database 108 includes a suitable interface to the data network 118, and suitable processing power to execute programs that access and maintain a database of subscribers who have committed fraud or are otherwise associated with a heightened risk of fraudulent activity.
- The fraud database 108 also executes programs to examine new provisioning data, and in particular changes to account information received from the provisioning loader 112 over the data network 118, for fraud risks. Based upon this examination, the fraud database 108 generates events that are transmitted to the event manager 106 over the data network 118.
- The provisioning loader 112 is preferably a server, such as one manufactured by Sun Microsystems or Hewlett-Packard.
- The provisioning loader 112 includes suitable interfaces to the billing system 110, the fraud database 108, and the data network 118.
- The provisioning loader 112 receives account and billing data from the billing system 110 and examines the data to detect possible fraud. Based upon this examination, the provisioning loader 112 generates events that are transmitted to the event manager 106 over the data network 118.
- The provisioning loader 112 also forwards new and changed account data to the fraud database 108.
- The billing system 110 may be any billing system, typically a proprietary billing system operated by a telecommunications carrier or service provider seeking to detect possible fraud.
- The billing system 110 may format provisioning and billing data in a format adapted to the provisioning loader 112, or the provisioning loader 112 may perform any formatting or transformation required for further operations.
- The event manager 106 receives events from the CDR loader 104, the fraud database 108, and the provisioning loader 112. Other events may be received from other event generators connected to the data network 118, which may include, for example, credit bureaus or other remote fraud detection systems.
- The event manager 106 is a server, such as one manufactured by Sun Microsystems or Hewlett-Packard, and includes a suitable interface to the data network 118.
- The event manager 106 also includes a processor with sufficient processing power to handle event traffic from the event generators connected to the data network 118, and a mass storage device suited to storing received events.
- The event manager 106 maintains cumulative scores for subscribers, and an investigation queue of subscribers presenting a heightened likelihood of fraud.
- The web server 114 is preferably an Intel-based server, such as one manufactured by Compaq or Gateway.
- The web server 114 provides a graphical user interface for the analyst devices 116, and a functional interface through which the analyst devices 116 may access the event manager 106 and data stored in the event generators, i.e., the CDR loader 104, the fraud database 108, and the provisioning loader 112.
- The analyst devices 116 are computers, preferably thin-client stations, including suitable interfaces for connection with the web server 114.
- An analyst network 120 between the analyst devices 116 and the web server 114 may include any data network, such as a 10Base-T Ethernet network.
- The web server 114 communicates with the analyst devices 116 using World Wide Web protocols.
- The analyst network 120 may also include wireless communication systems or internetwork components, such as leased lines, frame relay components, the Internet, or the Public Switched Telephone Network, as indicated generally by an alternative network 122. It will be appreciated that the analyst network 120 and the data network 118 may be the same network, or may share some internetwork components.
- Each server may be a separate physical server, or some of the servers may be logical servers on a single physical server. Conversely, a single server may consist of several physically separate servers configured to operate as a single logical server.
- The analyst devices 116 may be at a single location sharing a local area network, or may be distributed over a wide geographic area.
- The data network 118 is preferably a dedicated physical network, but may also be a virtual private network, a wide-area or local-area network, or some combination of these. Any computer and/or network topology may be used with the present invention, provided it offers sufficient communication and processing capacity to manage data from the switch 102 and the billing system 110.
- FIG. 2 is a block diagram of the software components used in a telecommunications fraud detection system according to an embodiment of the present invention.
- The software components operate on, for example, the CDR loader 104, the event manager 106, the fraud database 108, the provisioning loader 112, and the web server 114 of FIG. 1.
- A hardware abstraction layer 200 provides a hardware-independent foundation for the software components, and typically includes an operating system such as Windows NT, UNIX, or a UNIX derivative.
- A middleware layer 210 provides for communication among the software components.
- The middleware layer 210 includes C++ class libraries for encapsulating processes in a message-oriented environment.
- The classes define components and hierarchies for client/server control, tokens (basic data packets and structures), communications, and shared memory, and provide functionality for messaging, mailboxes, queues, and process control.
- The middleware 210 provides a message-oriented environment that establishes a communication path 220 between a user interface process 222, an event manager process 224, a CDR loader process 226, a provisioning loader process 228, and a fraud database process 230, such that the processes may communicate independently of the network and computer topology of the fraud detection system 100.
- The software includes one or more clients 232, executing on the analyst devices 116 to present a user interface to human analysts.
- The event manager process 224 receives events from the CDR loader process 226, the provisioning loader process 228, and the fraud database process 230.
- The event manager process 224 uses events to maintain scores for current subscribers.
- The event manager process 224 also maintains an investigation queue of subscribers whose scores suggest a heightened likelihood of fraud.
- The events and scores are maintained in an event database 234, which may be embodied, for example, on a mass storage device under the control of a database management system such as that sold by Oracle.
- The CDR loader process 226 includes a switch interface (not shown) to receive CDRs from the switch 102, and uses a call database 238 to store CDRs.
- The provisioning loader process 228 includes an interface to a provider billing system (not shown) and uses a subscriber database 240 to store provisioning data from the billing system.
- The fraud database process 230 includes a fraud database 242 that stores data concerning identities associated with fraud.
- The user interface process 222 operates as a front-end server to the clients 232.
- The user interface process 222 may use any programming language suitable to client-server processes, preferably HTML, Java, Visual Basic, XML, or some other graphically oriented language (or combination of languages).
- The user interface process 222 also provides a functional interface through which a client 232 may inspect information in the event database 234, the call database 238, and the subscriber database 240 during the course of an investigation.
- FIG. 3 shows the data records used by an embodiment of an event manager according to the principles of the invention.
- The elements of each event 250, described below, are fields within a database record.
- Each event 250 received from one of the event generators includes a subscriber type 252, which is either primary (account level) or secondary (subscriber level).
- A subscriber identifier 254 denotes the specific subscriber/account. By storing subscriber identification information in this two-tiered manner, the event manager can meaningfully distinguish between the behavior of a subscriber and the behavior of a specific account of a subscriber.
- Each event 250 includes a family 256, which is generally “fraud”, but allows for other event types.
- A category 258 identifies the event generator source, and a type 260 identifies a particular type of event from the event generator specified by the category 258.
- The time 262 of the event 250 is also included.
- Each event 250 includes a weight 264 that represents a numerical value associated with the event 250, a half-life 266 that represents the rate at which the weight diminishes, a match quality 268 that represents a degree of correspondence between the subscriber and a subscriber having a known history of fraud, as well as an event identifier 270 and an expiration date 272 that are assigned to each event 250 as it is received by the event manager process 224.
- Each summary 280 includes a subscriber type 282 and a subscriber identifier 284 that correspond to the subscriber type 252 and the subscriber identifier 254 used in each event 250.
- Each summary 280 also includes an age time 286 that indicates the last time that the alert score 288 was aged using the half-life 266.
- The alert score 288 is a composite score used to prioritize investigation.
- The alert score 288 is preferably the sum of a primary score 290, the score for a particular subscriber, and a critical score 292, the highest score for any account of the subscriber.
- A critical identifier 294 indicates the account to which the critical score 292 corresponds.
- An array of partial scores 296 is also included in the summary 280, and includes partial scores (weights) for the subscriber, along with associated half-lives.
- Each summary 280 is stored in a “bulletin board,” a non-permanent, shared memory area used by the event manager process 224 and the other processes of the fraud detection system. It will be appreciated that each event 250 and each summary 280 may include other fields with additional subscriber information, event information, or system administration information for use in the fraud detection system 100.
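The event 250 and summary 280 records described above might be modeled as in the following Python sketch (an illustration only; the field names paraphrase the reference numerals, and the types are assumptions). The alert score is the sum of the primary and critical scores, per the description above:

```python
from dataclasses import dataclass, field

@dataclass
class Event:
    subscriber_type: str      # "primary" (account level) or "secondary" (subscriber level)
    subscriber_id: str
    family: str               # generally "fraud"
    category: str             # event generator source
    type: str                 # event type within that category
    time: float
    weight: float             # numerical value associated with the event
    half_life: float          # rate at which the weight diminishes
    match_quality: float      # correspondence to a known fraud identity
    event_id: int = 0         # assigned by the event manager on receipt
    expiration: float = 0.0   # assigned by the event manager on receipt

@dataclass
class Summary:
    subscriber_type: str
    subscriber_id: str
    age_time: float = 0.0          # last time the alert score was aged
    primary_score: float = 0.0     # score for this particular subscriber
    critical_score: float = 0.0    # highest score for any account of the subscriber
    critical_id: str = ""          # account the critical score corresponds to
    partial_scores: list = field(default_factory=list)  # (weight, half-life) pairs

    @property
    def alert_score(self) -> float:
        # Composite score used to prioritize investigation
        return self.primary_score + self.critical_score

summary = Summary("primary", "555-0100", primary_score=3.0,
                  critical_score=2.5, critical_id="acct-7")
summary.alert_score   # 5.5
```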
- FIGS. 4A-4B are a flow chart showing event management according to the principles of the invention.
- Each step is a task operating within the event manager process 224.
- Event management conceptually begins with the event manager process 224 receiving an event message, as shown in step 300.
- The event message may include one or more events 250 generated, for example, by the CDR loader 226, the provisioning loader 228, or the fraud database 230.
- An event receiver task checks for valid data and assigns an event identifier 270 and an expiration date 272 to each event 250.
- The events 250 contained in the event message are then stored in the event database 234, as shown in step 302.
- The event manager process 224 is preferably multi-threaded, such that the process 224 may return to step 300 to receive a new event message at the same time that control is passed to a setup task, as shown in step 304.
- In the setup task, each event 250 is prepared for further analysis.
- Existing summaries 280 and other data for a subscriber type 252 and subscriber identifier 254 are retrieved from databases as necessary and placed onto the bulletin board for common use by the event manager process 224 and the other processes and tasks.
- In step 306, events are aged.
- Each event 250 has an associated half-life 266 that describes the manner in which the weight 264 decreases in value over time.
- The half-life is the amount of time for the weight 264 to decrease by fifty percent, thus denoting an exponential decay. It will be appreciated that linear decay, or some other time-sensitive de-weighting, may be used to age events.
- Events may be aged on an as-needed basis. That is, instead of aging every event 250 daily, events 250 for a subscriber may be aged when an event 250 for that subscriber is received.
- The summary 280 for a subscriber includes the most recent aging as an age time 286.
- A daily ager task is provided, which operates once each day to age weights for any subscriber having an “open investigation.” An open investigation is a subscriber/account on the investigation queue, which will be described in more detail with reference to step 314.
- In step 308, the event manager process 224 checks for any meta-events. Meta-events are combinations of events, possibly in conjunction with other information, that indicate a likelihood of fraud beyond that suggested by pre-assigned event weights.
- The event manager process 224 may be configured to perform any test on any combination of events and/or other data available in the fraud detection system 100. For example, in a “persistent fraud” meta-event, the subscriber type 252 and subscriber identifier 254 are checked against known occurrences of fraud recorded in the event database 234 by the event manager process 224.
- Step 308 may include several such tests, which may be performed sequentially or in parallel, and each of which may generate its own meta-event. After all such tests have been completed, the event manager process 224 may proceed.
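One such test, of the “persistent fraud” kind described above, might be sketched as follows (the weight, half-life, and set-based fraud history are illustrative assumptions, not values from the patent):

```python
def persistent_fraud_meta_event(subscriber_id: str, new_events: list,
                                fraud_history: set):
    """Emit an extra, heavily weighted event when new activity arrives for a
    subscriber with a recorded occurrence of fraud; otherwise return None."""
    if new_events and subscriber_id in fraud_history:
        return {"family": "fraud", "category": "event-manager",
                "type": "persistent-fraud", "weight": 25.0, "half_life": 60.0}
    return None

meta = persistent_fraud_meta_event("sub-42", [{"type": "clone-call"}], {"sub-42"})
```

Several such tests could run sequentially or in parallel, each contributing its own meta-event to the aggregation step that follows.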
- Next, an aggregator task updates the summaries 280 to reflect received events and meta-events.
- The partial scores 296 and the primary score 290 are updated.
- A new critical score 292 may be determined, and a new alert score 288 calculated therefrom.
- Because the critical score identifier 294 may change due to the events, it is preferred to defer calculation of a new alert score 288 until a critical subscriber (and associated critical identifier 294) has been determined.
- Next, a critical subscriber is identified. Of all of the accounts of a subscriber, i.e., the subscriber identifiers 254 of a subscriber type 252, only the account with the highest qualifying score is used to calculate the alert score 288. In a preferred embodiment, a score is not qualifying if there has been an investigation for the subscriber identifier 284 within a predetermined time that has resulted in a positive (i.e., fraud) outcome.
- Alerts are created. Once the critical subscriber has been identified, the alert score 288 may be calculated for a subscriber/account, as identified by the subscriber type 282 and subscriber identifier 284. A new alert for the summary 280 is generated if the alert score 288 changes, and will be one of “add,” “remove,” or “changed.” An investigation queue of “open investigations” is maintained, which includes each subscriber/account having an alert score 288 meeting or exceeding the alert threshold. The alert provides instructions to a separate task that is responsible for maintaining the investigation queue, i.e., prioritizing the queue according to alert scores 288 and handling analyst requests for an open investigation.
- An add alert is generated when the alert score 288 first meets or exceeds the alert threshold.
- A remove alert is generated to automatically close an open investigation when the alert score 288 falls below the alert threshold, with the caveat that the open investigation will not be closed if it is currently associated with an analyst.
- A changed alert is generated when the alert score 288 changes while remaining at or above the alert threshold.
- No alert will be generated within a predetermined period (an “alert delay”) after a finding of fraud relative to a subscriber/account.
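Taken together, the add, remove, and changed alerts, with the analyst and alert-delay caveats, reduce to a small decision function; the following Python sketch is an illustration, and its signature and flag names are assumptions:

```python
def make_alert(old_score: float, new_score: float, threshold: float,
               assigned_to_analyst: bool = False,
               in_alert_delay: bool = False):
    """Return "add", "remove", "changed", or None for an updated alert score."""
    if in_alert_delay:
        return None                 # no alerts shortly after a finding of fraud
    was_open = old_score >= threshold
    now_open = new_score >= threshold
    if not was_open and now_open:
        return "add"                # score first meets or exceeds the threshold
    if was_open and not now_open:
        # An open investigation claimed by an analyst is not auto-closed.
        return None if assigned_to_analyst else "remove"
    if was_open and new_score != old_score:
        return "changed"            # still open, but the score moved
    return None
```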
- In step 316, databases are updated.
- The preceding steps operate on the bulletin board, which is a non-permanent memory.
- Any changes with respect to event logging, scores, summaries, and the like should therefore be permanently recorded in the databases.
- There is no locking on the databases, so there is a chance that a change by one task or process may collide with a change from another task or process.
- Failed changes are detected using a version counter for the databases; a failed change causes each step from setup 304 onward to be repeated. After a maximum number of tries, events 250 may be stored in a separate file for future recovery.
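The detect-and-retry scheme amounts to optimistic concurrency control. A minimal sketch, assuming a toy in-memory database with a version counter (names and structure are hypothetical):

```python
import json

MAX_TRIES = 3  # assumed maximum number of retries

class VersionConflict(Exception):
    pass

class Database:
    """Toy database with an optimistic version counter (assumed design)."""
    def __init__(self):
        self.version = 0
        self.data = {}

    def commit(self, expected_version, updates):
        # A concurrent writer bumped the version, so this write fails.
        if expected_version != self.version:
            raise VersionConflict()
        self.data.update(updates)
        self.version += 1

def process_events(db, events, recovery_file="failed_events.json"):
    """Repeat the setup/scoring steps on collision; park events after MAX_TRIES."""
    for _ in range(MAX_TRIES):
        snapshot_version = db.version            # setup: read current state
        updates = {e["id"]: e for e in events}   # recompute scores, summaries, ...
        try:
            db.commit(snapshot_version, updates)
            return True
        except VersionConflict:
            continue                             # retry from setup
    # store the events in a separate file for future recovery
    with open(recovery_file, "w") as f:
        json.dump(events, f)
    return False
```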
- the investigation queue is updated according to any alerts.
- the investigation queue is then prioritized such that when a client 232 (operated by an analyst) requests an open investigation, the client 232 will receive the open investigation having the highest score.
- an automatic hotline is provided.
- This task can generate a message to a service provider for immediate termination of an account when the alert score 288 exceeds a predetermined value.
- the message may be logged so that a carrier can observe the effect on customers prior to implementing the automatic hotline.
- FIG. 5 shows system parameters for a fraud detection system according to the principles of the invention.
- the system parameters 400 may be stored on the event manager 106 , or in some other memory within the fraud detection system 100 .
- the system parameters 400 include a corporation 402 , along with any corporate hierarchical information such as a company, division, or market.
- the system parameters 400 include an alert threshold 404 that determines the alert score 288 at which an investigation is opened.
- the alert threshold 404 may be customized for a particular provider.
- An alert delay 406 determines the amount of time to wait, after detecting fraud, before generating additional alerts for a subscriber/account.
- An investigation expiration 408 determines the amount of time after an investigation is closed that it should be purged.
- Events are also purged when their aged weights fall below an event weight minimum 410 . Events may also be purged after a predetermined time, as established by an event age maximum 412 . These user-configurable parameters permit system resources to be tailored to event traffic.
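The two purge criteria can be sketched together; the decay formula, default thresholds, and event dictionary layout are assumptions for illustration.

```python
def aged_weight(weight, age_seconds, half_life_seconds):
    """Exponentially decay an event's weight according to its half-life."""
    return weight * 0.5 ** (age_seconds / half_life_seconds)

def purge_events(events, now, weight_min=1.0, age_max=90 * 24 * 3600):
    """Keep only events younger than the age maximum whose aged weight
    still meets the event weight minimum (both are user-configurable)."""
    kept = []
    for event in events:
        age = now - event["timestamp"]
        if age > age_max:
            continue  # purged by event age maximum 412
        if abs(aged_weight(event["weight"], age, event["half_life"])) < weight_min:
            continue  # purged by event weight minimum 410
        kept.append(event)
    return kept
```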
- each event generator 226 , 228 , 230 , 232 (and the event manager 224 for meta-events) is responsible for providing a weight for each event that it generates. These weights are determined (estimated) using a scoring model based upon known techniques of conditional probability and logistic regression. The weights are preferably scaled and shifted to provide a scoring model in which any score above zero indicates a significant likelihood of fraud.
- the scoring model is a logit(x) function scaled from −200 to 1000, with a score of 400 representing a 0.5 probability of fraud.
- the scores are presented to a client 232 in textual form, i.e., “force alert,” “high,” “medium,” “low,” “zero,” etc. It will be appreciated that other mathematical techniques may be used, provided they can discriminate among events to determine events carrying an increased likelihood of fraud. For example, a neural network may be trained to evaluate events based upon known instances of fraud. A neural network may also be used to generate the alert score 288 in step 314 above.
- the event manager process 224 may internally generate meta-events. However, other events handled by the event manager process 224 are received from other processes within the fraud detection system 100 . For example, an investigation outcome event is generated by a client 232 when an analyst closes an investigation. If the analyst has determined that there is no fraud, then this information is entered into the user interface presented to the analyst by the client 232 and user interface process 222 . The client 232 then generates an event having a negative score, which indicates a reduced likelihood of fraud. Another event that may be generated by the client is a very important person (“VIP”) event. The VIP event carries either a positive or a negative score indicative of the analyst's estimate of a necessary bias required to accurately assess risk. Additional event generators are discussed below.
- VIP: very important person
- the provisioning loader process 228 is an event generator.
- the provisioning loader 228 receives a stream of provisioning data from a billing system 110 .
- Provisioning data is stored in the subscriber database 240 .
- the provisioning data includes information such as new accounts, account changes, rate plan changes, and billing information such as payment, late payment, non-payment, payment method, and the like. This provisioning data is examined for potentially fraudulent behavior.
- the provisioning loader process 228 may generate events based upon this stream of provisioning data.
- the provisioning stream is provided to the provisioning loader process 228 several times each day. Other configurations are possible instead of, or in addition to, this stream. For example, if available from the carrier, a real-time provisioning feed may be used to generate events in near-real-time as new accounts are added or existing account information is changed.
- FIG. 6 shows the events 420 generated by one embodiment of the provisioning loader process 228 , using monthly billing information.
- the events 420 may be categorized by a category 422 , a type 424 , and a sub-type 426 .
- a first category 422 is “change,” which has two types 424 : info 430 and rate plan 432 .
- the info 430 type has sub-types "name," "address," and "phone," and may include any other billing information recorded for a subscriber.
- a change to “info” may not, by itself, suggest fraud. However, repeated changes over a short period of time may indicate a heightened likelihood of fraud, and these events are preferably weighted with long half-lives and weights such that two or three occurrences will exceed the alert threshold 404 .
- another type 424 of the change category 422 is rate plan 432 .
- the sub-type for rate plan 432 denotes the amount of time since the account was created, one of thirty, sixty, ninety, or ninety-plus days.
- a weight and a half-life may be accorded to a rate plan change according to how long the account has existed. For example, a low weight may be assigned to rate plan changes in the first thirty days of a new account if the carrier permits free changes in that time period. In subsequent periods, changes are expected to be less frequent, and will be accorded greater weight and longer half-lives.
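The age-dependent weighting might be expressed as a configuration table. Every number below is hypothetical; only the bucket boundaries (thirty, sixty, ninety, ninety-plus days) come from the text.

```python
# Hypothetical weight/half-life table for rate plan change events,
# keyed by the age bucket of the account in days since creation.
RATE_PLAN_WEIGHTS = {
    "30":  {"weight": 10,  "half_life_days": 7},   # free-change window: low weight
    "60":  {"weight": 60,  "half_life_days": 30},
    "90":  {"weight": 80,  "half_life_days": 45},
    "90+": {"weight": 100, "half_life_days": 60},
}

def rate_plan_event(account_age_days):
    """Build a rate plan change event with the age-appropriate weight."""
    if account_age_days <= 30:
        bucket = "30"
    elif account_age_days <= 60:
        bucket = "60"
    elif account_age_days <= 90:
        bucket = "90"
    else:
        bucket = "90+"
    params = RATE_PLAN_WEIGHTS[bucket]
    return {"category": "change", "type": "rate plan", "sub_type": bucket, **params}
```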
- a category 422 of event for the provisioning loader process 228 is billing.
- a first type 424 of this category 422 is pay 434 .
- a pay 434 event indicates generally that a bill has been paid, and has associated therewith a negative weight to indicate a reduced likelihood of fraud. The weight accorded to a particular pay 434 event will depend on the manner in which payment was made.
- the pay 434 type 424 has several sub-types 426 , including "cash," "check," "verified funds," "credit," "pre-paid card," and "debit." Each sub-type has associated therewith a weight and a half-life. In a preferred embodiment, payments made in "cash" or "verified funds" such as a verified check or money order receive stronger negative weights.
- An event of the billing category 422 may also have a type 424 of non-payment (“NPAY”) 436 .
- NPAY: non-payment
- This may be a generic non-payment (“GEN”), a pre-paid card with a positive balance (“PPCPB”), or a partial payment (“PARTP”).
- the match quality 268 for a partial payment billing event 436 is proportional to the amount of the bill that has been paid.
- a credit card link 438 category generates events based upon a link to the fraud database 242 .
- Accounts are generally secured by a credit card.
- ACC: credit card changes for an account
- BILL: bill
- a query is transmitted to the fraud database process 230 , which will search for exact matches in the fraud database 242 .
- This information is used by the provisioning loader process 228 to generate events indicative of fraud.
- the subscriber database 240 may be examined for any other subscribers or accounts using the same credit card number.
- the fraud database process 230 also operates as an event generator.
- the fraud database process 230 includes a fuzzy matching process.
- the fuzzy matching process receives provisioning information from the provisioning loader process 228 each time that the provisioning loader process 228 detects a change in subscriber information (or a new account).
- the fuzzy matching process generates events based upon matches or near matches with records in the fraud database 242 .
- These records may be subscribers or accounts associated with instances of known fraud, subscribers or accounts with instances of suspected fraud, or identities otherwise associated with fraud.
- the fuzzy matching process is described below.
- FIG. 7 is a flow chart of a fuzzy matching process used in a preferred embodiment of the fraud detection system 100 .
- each record received from the provisioning loader process is tested against the fraud database 242 by the fuzzy matching process, and the output is a stream of matching events.
- the terms “account change data” or “account change information” are intended to refer to the subset of provisioning data that is forwarded from the provisioning loader process 228 to the fraud database process 230 . This is preferably a subset of provisioning data that corresponds to account changes and new accounts.
- a received record of account change data is formatted for processing by the fuzzy matching process. This may include, as necessary, parsing of the record and suppression of any user-specified values.
- particular fraud records may also be suppressed in the fraud database 242 , so that subsequent searching by the fuzzy matching process will pass over those records.
- a search key is generated for a particular field or fields of the record.
- search keys may be limited to those corresponding to the changed fields.
- search keys are preferably generated for every category that is not suppressed.
- Each search key defines those fields of an incoming record that should be used for a particular search.
- Search keys preferably include an individual name, a business name, an address, a telecommunications number, an identification number, and an equipment serial number. It is noted that credit card numbers use an exact match, as described above.
- the search key preferably includes a street name, city, two-character state code, and five or nine character postal code.
- at step 520, possible matches are collected.
- the matches are collected by applying a set of comparison rules to the search key fields in the account change record and the search key fields in each record in the fraud database.
- the comparison rules are established separately for each search key.
- possible matches are preferably returned for an exact match, a match of last name and a first letter of one or more first and middle names, a one-off character match with a name, an eighty percent character match, an eighty percent character match with two transposed letters, a short word exactly matching the first letters of a long word, and out of order names.
- a normalized match may also be used, in which a given name is converted using a name synonym table (e.g., "Bob"->"Robert").
- matches are generally collected for 80% character matches of words with more than four characters, and for words with two transposed characters, without reference to the order in which the words appear. Substring matches are also collected by matching shorter words against the left-most characters of a longer word. Matches are also collected for one-off character matches of street numbers. Address matches will only be collected for an exact match of the state, city, and postal code.
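The word-level comparison rules above can be sketched as predicates. This is an interpretive sketch: the exact definition of an "eighty percent character match" is not spelled out in the text, so the membership-based heuristic below is an assumption.

```python
def one_off(a, b):
    """Equal-length words differing in exactly one character."""
    return len(a) == len(b) and sum(x != y for x, y in zip(a, b)) == 1

def transposed(a, b):
    """Word b equals word a with one pair of adjacent characters swapped."""
    if len(a) != len(b) or a == b:
        return False
    diffs = [i for i, (x, y) in enumerate(zip(a, b)) if x != y]
    return (len(diffs) == 2 and diffs[1] == diffs[0] + 1
            and a[diffs[0]] == b[diffs[1]] and a[diffs[1]] == b[diffs[0]])

def char_match_pct(a, b):
    """Rough percentage of the shorter word's characters found in the longer
    (an assumed reading of the '80% character match' rule)."""
    short, longer = sorted((a, b), key=len)
    return 100 * sum(c in longer for c in short) / len(short) if short else 0

def substring_match(short, longer):
    """Shorter word exactly matching the left-most characters of a longer word."""
    return len(short) < len(longer) and longer.startswith(short)

def words_match(a, b):
    """Collect a possible match per the comparison rules sketched above."""
    a, b = a.lower(), b.lower()
    return (a == b
            or one_off(a, b)
            or transposed(a, b)
            or substring_match(*sorted((a, b), key=len))
            or (min(len(a), len(b)) > 4 and char_match_pct(a, b) >= 80))
```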
- matches are collected for exact matches of the country number, area code, prefix, and subscriber number.
- a "wild card" match may also be used, such that a match is collected if every digit except for the wild card digit(s) is an exact match.
- a wild card may be a three-digit area code or a three-digit prefix.
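A wild-card telecommunications-number match might be sketched as a digit-by-digit comparison, with a placeholder character (assumed here to be `?`) standing in for the wild-carded area code or prefix:

```python
def wildcard_phone_match(candidate, pattern, wildcard="?"):
    """Exact digit-by-digit match, except at positions where the pattern
    holds the wild card character (e.g. a wild-carded area code or prefix)."""
    if len(candidate) != len(pattern):
        return False
    return all(p == wildcard or p == c for c, p in zip(candidate, pattern))
```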
- Identification numbers may be driver's licenses, Social Security numbers, Federal Tax ID numbers, or other numbers uniquely identifying individuals or entities. For identification numbers, matches are collected for exact matches, one-off matches, all-but-one character matches, and two transposed character matches. For equipment serial numbers, only exact matches are collected.
- a match quality is calculated for each match collected, according to a set of predetermined rules.
- the predetermined rules are established separately for each search key. For an individual name, match quality is calculated by taking the percentage of characters of a shorter string that match a longer string. Match quality is calculated for both a first and last name in this inquiry. An exact match on two words of the same length receives a match quality of 100%.
- the match quality is otherwise scaled according to the nature of the match (exact match, substring match, one-off match, two-off match) and the length of the shorter word (1 character, 2-4 characters, 5-7 characters, 8+ characters).
- an exact match with a shorter word length of one character is a 100% match, while a one-off match with a word of 5-7 characters is an 80% match.
- the match quality for each nature-of-match and length-of-shorter-word may be user configured.
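The user-configurable quality table can be modeled as a lookup keyed by nature of match and length bucket. Only two entries below are taken from the text (exact/1-character = 100%, one-off/5-7 characters = 80%); the rest are illustrative placeholders.

```python
# Hypothetical match-quality table (percent), indexed by nature of match
# and the length bucket of the shorter word.
MATCH_QUALITY = {
    "exact":     {"1": 100, "2-4": 100, "5-7": 100, "8+": 100},
    "substring": {"1":  80, "2-4":  80, "5-7":  90, "8+":  95},
    "one-off":   {"1":   0, "2-4":   0, "5-7":  80, "8+":  90},
    "two-off":   {"1":   0, "2-4":   0, "5-7":  60, "8+":  80},
}

def length_bucket(word):
    """Bucket a word by length: 1, 2-4, 5-7, or 8+ characters."""
    n = len(word)
    if n == 1:
        return "1"
    if n <= 4:
        return "2-4"
    if n <= 7:
        return "5-7"
    return "8+"

def match_quality(nature, shorter_word):
    """Look up the configured quality for this nature of match and word length."""
    return MATCH_QUALITY[nature][length_bucket(shorter_word)]
```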
- match quality is determined by calculating the percentage of characters of shorter strings that match longer strings. Match quality is calculated for each word in the business name. An exact match on two words of the same length receives a match quality of 100% regardless of the length of the two words. The match quality is otherwise scaled according to the nature of the match and the length of the shorter word. For example, an exact match of only one character will receive a match quality of 80%. Similarly, while a one-off match of a 2-4 character word will receive a match quality of 0%, a one-off match for a 5-7 character word will receive an 80% match quality, and a one-off match for an 8+ character word will receive a 90% match quality. These match qualities are user configurable.
- a match quality is determined by finding the percentage of characters of the shorter street address (and unit number, if available) that match the longer street address.
- the match quality is weighted equally for each separate word of the address. Any exact match of two words of the same length receives a match quality of 100% regardless of the length. Match quality is otherwise scaled according to the nature of the match and the length of the shorter word. In a preferred embodiment, these match qualities are the same as those for individual names.
- the match qualities are user configurable. The city, state, and postal code have a match quality of either 100% (exact) or 0%.
- a discounted score is calculated for each matching record.
- the discounted score is a sum of the first name match and the last name match, multiplied by the weight assigned to the individual name element.
- the sum is preferably weighted so that a last name match is accorded greater significance than a first name match.
- the first name is weighted as 0.35, and the last name as 0.65, of the total score.
- the weight may be distributed among multiple first names and initials. For example, two matching first names may be weighted at 100% while two matching first initials will only be weighted at 80%.
- Out-of-sequence matches are preferably assigned 80% of the match quality for a corresponding in-sequence match.
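The individual-name discounted score described above (0.35 first name, 0.65 last name, 80% for out-of-sequence matches) can be sketched directly; the function signature is an assumption.

```python
FIRST_NAME_WEIGHT = 0.35
LAST_NAME_WEIGHT = 0.65

def name_discounted_score(first_quality, last_quality, element_weight,
                          in_sequence=True):
    """Weighted sum of first- and last-name match qualities, multiplied by
    the weight assigned to the individual-name search key; out-of-sequence
    matches receive 80% of the corresponding in-sequence score."""
    score = (FIRST_NAME_WEIGHT * first_quality
             + LAST_NAME_WEIGHT * last_quality) * element_weight
    return score if in_sequence else 0.8 * score
```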
- the discounted score is a weighted sum of the match quality for each word in the business name, multiplied by the weight assigned to business names. All words of the business name are treated equally; however, out-of-sequence matches are weighted at 80%.
- the discounted score is a weighted sum of the match quality for each word. Out of sequence matches are adjusted to 80% of the match quality for the corresponding in-sequence match. In a preferred embodiment, the total score is weighted as 0.50 city/state and postal code match, and 0.50 street address match. Alternatively, if a unit number is available, the score is weighted 0.20 unit number, 0.50 city/state and postal code, and 0.40 street address match.
- the discounted score is the match quality (0% or 100%) multiplied by the weight assigned to a telecommunications number.
- the discounted score is the match quality multiplied by the weight assigned to identification numbers.
- the discounted score is the weight assigned to serial numbers.
- at step 550, the fuzzy matching process checks whether all search keys have been applied to an account change record, and returns to step 510 when one or more search keys for a record have not been tested. When each defined search key has been tested, the fuzzy matching process proceeds to step 560.
- an event may be generated.
- an event is only generated when the fuzzy matching process generates a non-zero score for a record.
- an event may only be generated when the discounted match score exceeds a predetermined threshold.
- a different event may be generated for each search key, such that a single account change record may generate more than one event if more than one search key for that account change record results in a match.
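The per-search-key event generation might look like the following; the threshold value and the event dictionary shape are assumptions.

```python
EVENT_THRESHOLD = 50.0   # hypothetical minimum discounted score

def matching_events(record_id, key_scores, threshold=EVENT_THRESHOLD):
    """Emit one fraud-match event per search key whose discounted score
    clears the threshold, so one account change record can yield
    more than one event."""
    return [
        {"record": record_id, "search_key": key, "score": score}
        for key, score in key_scores.items()
        if score > threshold
    ]
```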
- the fuzzy matching process is intended generally to locate account information similar to records in the fraud database 242 .
- Other search keys and scoring techniques may be used, consistent with the processing power of the fraud database 108 , provided they identify similarity between records and generate corresponding events in a timely fashion.
- a human analyst accesses the fraud detection system 100 by using the client 232 .
- the client provides a user interface, preferably a graphical user interface, to the analyst, by which the analyst may request and receive subjects for investigation from the investigation queue.
- an analyst requesting an investigation receives the open investigation on the investigation queue having the highest score.
- where analysts are geographically distributed over a large area, it may be desirable to assign to an analyst the open investigation having the highest score in that analyst's region.
- separate investigation queues may be maintained for each geographic region. When the analyst receives the open investigation, or summary information therefor, the open investigation is locked so that no other analyst may work on the same investigation.
- the analyst may conduct an investigation by examining any data in the system relating to the subscriber/account, including data in the event database 234 , the call database 238 , the subscriber database 240 , and the fraud database 242 .
- An investigation concludes with a finding of fraud, a finding of no fraud, or an indeterminate finding.
- Each of these findings, when entered by the analyst, is an additional fraud detection event.
- FIG. 8 shows a graphical user interface screen presented by the client 232 .
- the depicted screen is a “close investigation” screen in which an analyst enters a resolution of an investigation.
- a title bar 580 describes the page being viewed. Instructions 585 relevant to the page may be displayed below the title bar 580 . General tools are provided in a tool bar 590 along the left side of the screen.
- a fraud outcome is specified in a drop-down list 600 .
- a fraud type may also be selected from a drop-down list 602 of recognized fraud types.
- a text box 604 is provided for an analyst to enter additional notes concerning the investigation. When the fraud outcome and the fraud type have been selected, the analyst may close the investigation by selecting an “OK” box 606 .
- the analyst may instead cancel the close investigation operation by clicking a “Cancel” box 608 , and proceed with additional investigation.
- the interface screen depicted in FIG. 8 is not intended to be limiting. It will be appreciated that numerous graphical user interface tools and objects are known in the art and may be used with a graphical user interface according to the invention. It will further be appreciated that other screens are preferably also used, for example, to handle requests for new investigations, provide summary information to an analyst, and to investigate records and call histories of a subject under investigation.
- the user interface process 222 may, in addition to providing a functional interface between the clients 232 and the rest of the fraud detection system 100 , also be used to track analyst productivity. Analyst actions may be logged, with time stamps, for future analysis of time spent in individual and aggregate analysis.
- a graphical user interface is provided for administration of the system, including control of weights and half-lives for each event, and for controlling the system parameters such as those described in reference to FIG. 5.
- a number of user interfaces and graphical user interface tools are known in the art, and may be adapted to administration of a fraud detection system operating according to the principles of the invention. It will further be appreciated that such a system will preferably include security to prevent unauthorized modifications thereto.
Abstract
According to the principles of the invention, a fraud detection system receives data relating to telecommunications activity. Event generators generate events from the received data, with each event having a weight corresponding to an increased or decreased likelihood of fraud. The aggregated events for a subject (a subscriber or an account) determine a score for the subject, which is used to prioritize the subject in an investigation queue. Human analysts are assigned to open investigations on the investigation queue according to the priority of subjects. In this manner, investigation resources can be applied more effectively to high-risk subscribers and events.
Description
- This application claims the benefit of U.S. Provisional Application No. 60/108,952 and U.S. Provisional Application No. 60/108,971, both filed on Nov. 18, 1998. The disclosure of those applications is incorporated herein by reference.
- 1. Field of the Invention
- This application relates to the field of telecommunications and more particularly to the field of fraud detection in telecommunications systems.
- 2. Description of Related Art
- Along with the growth in wireless telecommunications, there has been a growth in telecommunications fraud. The current techniques for committing fraud are generally known and understood. Fraud may be as simple as the physical theft of a wireless handset, or applying for a wireless subscription with no intention of paying. Other fraud is more sophisticated. For example, tumbling-clone fraud entails the interception of a number of valid identification numbers from airborne wireless traffic, and the use of each identifier in sequence to render detection of the fraudulent activity more difficult. Also, some of the fraudulent activity focuses on fraudulently obtaining subscriptions. For example, a thief might falsify application information or steal valid personal information from another individual.
- However, understanding the modalities for wireless fraud does not provide any specific strategy for addressing the fraud. A wireless carrier may have millions of subscribers, who may collectively make millions of calls each day. Even if some of the characteristics of fraudulent activity are known, it may be impractical to allocate human resources to examine each call individually. If a typical wireless telecommunications system handles two million calls each day, perhaps only a few hundred of these calls should be examined closely. One approach to “filtering” this mass of information is disclosed in U.S. Pat. No. 5,615,408, which describes a system for credit-based management of telecommunications activity. According to the '408 patent, each call within the system is examined for possible credit problems among subscribers, and a credit alert is generated when a credit risk is present.
- While the system disclosed in the '408 patent presents a significant advance in telecommunications monitoring, it may fail to detect certain fraudulent activity. For example, an identical identification number may occur simultaneously in two disjoint cells, which may not present any credit issues, but does indicate that a handset has been cloned. Additionally, the information available for a call may only suggest a heightened probability of fraud rather than a definite instance of fraud. As the search criteria for a fraud-detection system broaden, more and more calls must be examined. Furthermore, automated responses, such as immediate termination of service, may be undesirable, particularly for legitimate subscribers that cross a statistical line into ostensibly fraudulent activity.
- There remains a need for a telecommunications fraud detection system that can handle large call volume while permitting individualized attention to possibly fraudulent activity. The system should prioritize possibly fraudulent activity so that a human analyst can be assigned to investigate instances with a high likelihood of fraud.
- According to the principles of the invention, a fraud detection system receives data relating to telecommunications activity. Event generators generate events from the received data, with each event having a weight corresponding to an increased or decreased likelihood of fraud. The aggregated events for a subject (a subscriber or an account) determine a score for the subject, which is used to prioritize the subject in an investigation queue. Human analysts are assigned to open investigations on the investigation queue according to the priority of subjects. In this manner, investigation resources can be applied more effectively to high-risk subscribers and events.
- In one embodiment, a method for detecting fraud in a telecommunications system according to the principles of the invention includes: receiving one or more events relating to a subscriber; combining the one or more events to provide a score; and storing the subscriber and the score in an investigation queue if the score exceeds a predetermined threshold.
- In this aspect, the method may further include repeating the above for a plurality of subscribers; and storing a plurality of suspect subscribers in the investigation queue, each one of the plurality of suspect subscribers having a score that exceeds the predetermined threshold. The method may further include prioritizing the investigation queue according to the plurality of scores. The method may include updating the score of one of the plurality of suspect subscribers to provide an updated score, and removing the one of the plurality of suspect subscribers from the investigation queue if the updated score does not exceed the predetermined threshold. The method may also include assigning a human analyst to investigate one of the plurality of suspect subscribers. The method may include determining a region for each one of the plurality of suspect subscribers; and assigning a regional human analyst to investigate those ones of the plurality of suspect subscribers having a particular region. In this method assigning a human analyst may further include: receiving a request to investigate from the human analyst; assigning to the human analyst a one of the plurality of suspect subscribers having a highest priority; and removing the one of the plurality of suspect subscribers from the investigation queue.
- In the method, combining the one or more events to provide a score may further include: weighting the one or more events according to one or more event weights, thereby providing one or more weighted events; and summing the one or more weighted events to provide a score. This method may further include aging each of the one or more weighted events using a half-life. The one or more event weights may be discounted according to a match quality. The one or more event weights may be determined using logistic regression. Combining the one or more events to provide a score may further include feeding the one or more events to a neural network, the neural network being trained to generate a score indicative of possible fraud from the one or more events. This method may further include prioritizing the investigation queue according to the plurality of scores.
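The claimed combination step (weight, discount by match quality, age by half-life, then sum) can be sketched minimally, assuming epoch-second timestamps and per-event half-lives:

```python
def subject_score(events, now):
    """Combine a subject's events into a score: each event weight is
    discounted by its match quality (defaulting to 1.0) and aged
    exponentially according to its half-life, then all are summed."""
    total = 0.0
    for event in events:
        age = now - event["timestamp"]
        decay = 0.5 ** (age / event["half_life"])
        total += event["weight"] * event.get("match_quality", 1.0) * decay
    return total
```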
- In another aspect, a system for detecting telecommunications fraud according to the principles of the invention includes: means for receiving one or more events relating to a subscriber; means for combining the one or more events to provide a score; and means for storing the subscriber and the score in an investigation queue if the score exceeds a predetermined threshold.
- In this aspect, the system may further include: means for applying the receiving means, the combining means, and the storing means to a plurality of subscribers; and means for storing a plurality of suspect subscribers in the investigation queue, each one of the plurality of suspect subscribers having a score that exceeds the predetermined threshold. The system may further include means for prioritizing the investigation queue according to the plurality of scores. The system may further include means for removing one of the plurality of suspect subscribers from the investigation queue if the one of the plurality of suspect subscribers has not been investigated within a predetermined time. The system may further include means for assigning a human analyst to investigate one of the plurality of suspect subscribers.
- A system according to the principles of the invention may further include: means for determining a region for each one of the plurality of suspect subscribers; and means for assigning a regional human analyst to investigate those ones of the plurality of suspect subscribers having a particular region. The assigning means may include: means for receiving a request to investigate from the human analyst; and means for assigning to the human analyst a one of the plurality of suspect subscribers having a highest priority. The combining means may further include: means for weighting the one or more events according to one or more event weights, thereby providing one or more weighted events; and means for summing the one or more weighted events to provide a score.
- The system may further include means for aging each of the one or more weighted events using a half-life. The one or more event weights may be discounted according to a match quality. The one or more event weights may be determined using logistic regression. The combining means may further include means for feeding the one or more events to a neural network, the neural network being trained to generate a score indicative of possible fraud from the one or more events. The system may further include means for prioritizing the investigation queue according to the plurality of scores.
- In another aspect, a computer program for detecting telecommunications fraud according to the principles of the invention may be embodied in machine executable code including: machine executable code to receive one or more events relating to a subscriber; machine executable code to combine the one or more events to provide a score; and machine executable code to store the subscriber and the score in an investigation queue if the score exceeds a predetermined threshold.
- In this aspect, the computer program may further include: machine executable code to repeat the machine executable code to receive, the machine executable code to combine, and the machine executable code to store for a plurality of subscribers; and machine executable code to store a plurality of suspect subscribers in the investigation queue, each one of the plurality of suspect subscribers having a score that exceeds the predetermined threshold. The computer program may further include machine executable code to prioritize the investigation queue according to the plurality of scores. The computer program may further include machine executable code to remove one of the plurality of suspect subscribers from the investigation queue if the one of the plurality of suspect subscribers has not been investigated within a predetermined time.
- Further in this aspect, the computer program may include machine executable code to assign a human analyst to investigate one of the plurality of suspect subscribers. The computer program may further include machine executable code to determine a region for each one of the plurality of suspect subscribers; and machine executable code to assign a regional human analyst to investigate those ones of the plurality of suspect subscribers having a particular region. The machine executable code to assign a human analyst may further include machine executable code to receive a request to investigate from the human analyst; and machine executable code to assign to the human analyst a one of the plurality of suspect subscribers having a highest priority.
- The machine executable code to combine the one or more events to provide a score may further include machine executable code to weight the one or more events according to one or more event weights, thereby providing one or more weighted events; and machine executable code to sum the one or more weighted events to provide a score. The computer program may further include machine executable code to age each of the one or more weighted events using a half-life. One or more event weights may be discounted according to a match quality. The one or more event weights may be determined using logistic regression. The machine executable code to combine the one or more events to provide a score may further include machine executable code to feed the one or more events to a neural network, the neural network being trained to generate a score indicative of possible fraud from the one or more events. The computer program may further include machine executable code to prioritize the investigation queue according to the plurality of scores.
- The foregoing and other objects and advantages of the invention will be appreciated more fully from the following further description thereof, with reference to the accompanying drawings, wherein:
- FIG. 1 is a block diagram of a telecommunications fraud detection system according to the principles of the invention;
- FIG. 2 is a block diagram of the software components used in a telecommunications fraud detection system according to an embodiment of the present invention;
- FIG. 3 shows the data records used by an embodiment of an event manager according to the principles of the invention;
- FIGS. 4A-4B are a flow chart showing event management according to the principles of the invention;
- FIG. 5 shows system parameters for a fraud detection system according to the principles of the invention;
- FIG. 6 shows the events generated by one embodiment of the provisioning loader process, using monthly billing information;
- FIG. 7 is a flow chart of a fuzzy matching process used in a preferred embodiment of the fraud detection system; and
- FIG. 8 shows a graphical user interface screen presented by the client.
- To provide an overall understanding of the invention, certain illustrative embodiments will now be described, including an event manager for fraud detection in a telecommunications system. However, it will be understood by those of ordinary skill in the art that the methods and systems described herein can be suitably adapted to any environment where human analysts monitor a high volume of discrete events in real time, such as a financial transaction system or a supervised data network.
- FIG. 1 is a block diagram of a telecommunications fraud detection system according to one embodiment of the present invention. The
fraud detection system 100 comprises a switch 102 connected in a communicating relationship to a Call Detail Record (“CDR”) loader 104, an event manager 106, a fraud database 108, and a billing system 110 connected in a communicating relationship with a provisioning loader 112. The CDR loader 104, the event manager 106, the fraud database 108, and the provisioning loader are further connected in a communicating relationship with a web server 114. The web server 114 is connected in a communicating relationship with one or more analyst devices 116. - The
switch 102 may be any wired or wireless telecommunications switch, or may be a mediation device for aggregating a number of switches. The switch 102 may also include a roamer tape or roamer network connection to receive call information relating to local subscribers from other switches or geographic regions. The switch 102 forwards CDR's to the CDR loader 104, which receives one or more records for each call from the switch 102, and converts each received record into a common CDR format. - The
CDR loader 104 generates events based upon the received CDR's. The CDR loader 104 is preferably a server, such as one manufactured by Sun Microsystems or Hewlett Packard, with a mass storage device suitable to the volume of CDR's received from the switch 102, and including a suitable interface to a data network 118. The data network 118 is preferably a 100Base-T network compatible with the Institute of Electrical and Electronics Engineers (“IEEE”) 802.3 standard, commonly referred to as “Ethernet.” The data network 118 may also include wireless communication systems or internetwork components such as the Internet or the Public Switched Telephone Network. The CDR loader 104 also includes suitable processing power to execute programs that examine received CDR's and generate events based upon the received CDR's. The CDR loader 104 generates events that are transmitted to the event manager 106 over the data network 118. - The
fraud database 108 is preferably a server, such as one manufactured by Sun Microsystems or Hewlett Packard. The fraud database 108 includes a suitable interface to the data network 118, and suitable processing power to execute programs that access and maintain a database of subscribers that have committed fraud, or are otherwise associated with a heightened risk of fraudulent activity. The fraud database 108 also executes programs to examine new provisioning data, and in particular, changes to account information, received from the provisioning loader over the data network 118, for fraud risks. Based upon this examination, the fraud database 108 generates events that are transmitted to the event manager 106 over the data network 118. - The
provisioning loader 112 is preferably a server, such as one manufactured by Sun Microsystems or Hewlett Packard. The provisioning loader 112 includes suitable interfaces to the billing system 110, the fraud database 108, and the data network 118. The provisioning loader 112 receives account and billing data from the billing system 110, and examines the data to detect possible fraud. Based upon this examination, the provisioning loader 112 generates events that are transmitted to the event manager 106 over the data network 118. The provisioning loader also forwards new and changed account data to the fraud database 108. The billing system 110 may be any billing system, typically a proprietary billing system operated by a telecommunications carrier or service provider seeking to detect possible fraud. The billing system 110 may format provisioning and billing data in a format adapted to the provisioning loader 112, or the provisioning loader may perform any formatting or transformation required for further operations. The terms “provisioning data,” “provisioning information,” and the like, as used herein, are intended to refer to billing information, account information, and any other information provided by a carrier that relates to subscribers or accounts. - The
event manager 106 receives events from the CDR loader 104, the fraud database 108, and the provisioning loader 112. Other events may be received from other event generators connected to the data network 118, which may include, for example, credit bureaus or other remote fraud detection systems. The event manager 106 is a server, such as one manufactured by Sun Microsystems or Hewlett Packard, and includes a suitable interface to the data network 118. The event manager 106 also includes a processor with sufficient processing power to handle event traffic from the event generators connected to the data network 118, and a mass storage device suitable for storing received events. The event manager maintains cumulative scores for subscribers and an investigation queue that includes subscribers posing a likelihood of fraud. - The
web server 114 is preferably an Intel-based server, such as one manufactured by Compaq or Gateway. The web server 114 provides a graphical user interface for the analyst devices 116, and a functional interface through which analyst devices 116 may access the event manager 106 and data stored in the event generators, i.e., the CDR loader 104, the fraud database 108, and the provisioning loader 112. - The
analyst devices 116 are computers, preferably thin client stations, including suitable interfaces for connection with the web server 114. An analyst network 120 between the analyst devices 116 and the web server 114 may include any data network, such as a 10Base-T Ethernet network. In a preferred embodiment, the web server 114 communicates with the analyst devices 116 using the World Wide Web. The analyst network 120 may also include wireless communication systems or internetwork components, such as leased lines, frame relay components, the Internet, or the Public Switched Telephone Network, as indicated generally by an alternative network 122. It will be appreciated that the analyst network 120 and the data network 118 may also be the same network, or may share some internetwork components. - It will be appreciated by those skilled in the art that a number of topologies are possible for the
fraud detection system 100. Each server may be a separate physical server, or some of the servers may be logical servers on a single physical server. Conversely, a single server may consist of several physically separate servers configured to operate as a single logical server. The analyst devices 116 may be at a single location sharing a local area network, or may be distributed over a wide geographical area. The data network 118 is preferably a dedicated physical network, but may also be a virtual private network, a wide-area or local area network, or may include some combination of these. Any computer and/or network topology may be used with the present invention, provided it offers sufficient communication and processing capacity to manage data from the switch 102 and the billing system 110. - FIG. 2 is a block diagram of the software components used in a telecommunications fraud detection system according to an embodiment of the present invention. The software components operate on, for example, the
CDR loader 104, the event manager 106, the fraud database 108, the provisioning loader 112, and the web server 114 of FIG. 1. A hardware abstraction layer 200 provides a hardware-independent foundation for software components, and typically includes an operating system such as Windows NT, UNIX, or a UNIX derivative. A middleware layer 210 provides for communication among software components. In one embodiment, the middleware layer 210 includes C++ class libraries for encapsulating processes in a message-oriented environment. The classes define components and hierarchies for client/server control, tokens (basic data packets and structures), communications, and shared memory, and provide functionality for messaging, mailboxes, queues, and process control. In particular, the middleware 210 provides a message-oriented environment that establishes a communication path 220 between a user interface process 222, an event manager process 224, a CDR loader process 226, a provisioning loader process 228, and a fraud database process 230, such that the processes may communicate independent of the network and computer topology of the fraud detection system 100. The software includes one or more clients 232, executing on analyst devices 116 to present a user interface to human analysts. - The
event manager process 224 receives events from the CDR loader process 226, the provisioning loader process 228, and the fraud database process 230. The event manager process 224 uses events to maintain scores for current subscribers. The event manager also maintains an investigation queue which includes subscribers whose scores suggest a heightened likelihood of fraud. The events and scores are maintained in an event database 234, which may be embodied, for example, on a mass storage device under control of a database management system such as that sold by Oracle. - The
CDR loader process 226 includes a switch interface (not shown) to receive CDR's from a switch 102, and uses a call database 238 to store CDR's. The provisioning loader process 228 includes an interface to a provider billing system (not shown) and uses a subscriber database 240 to store provisioning data from the billing system. The fraud database process 230 includes a fraud database 242 that stores data concerning identities associated with fraud. The user interface process 222 operates as a front-end server to the clients 232. The user interface process 222 may use any programming language suitable to client-server processes, preferably HTML, Java, Visual Basic, XML or some other graphically oriented language (or combination of languages). The user interface process 222 also provides a functional interface through which a client 232 may inspect information in the event database 234, the call database 238, and the subscriber database 240 during the course of an investigation. - FIG. 3 shows the data records used by an embodiment of an event manager according to the principles of the invention. The following elements of each
event 250 are fields within a database record. Each event 250 received from one of the event generators includes a subscriber type 252, which is either primary (account level) or secondary (subscriber level). A subscriber identifier 254 denotes the specific subscriber/account. By storing subscriber identification information in this two-tiered manner, the event manager can meaningfully distinguish between behavior of a subscriber, and behavior of a specific account of a subscriber. Each event 250 includes a family 256, which is generally “fraud”, but allows for other event types. A category 258 identifies the event generator source, and a type 260 identifies a particular type of event from the event generator specified by the category 258. The time 262 of the event 250 is included. - Each
event 250 includes a weight 264 that represents a numerical value associated with each event 250, a half-life 266 that represents the rate at which the weight diminishes, a match quality 268 that represents a degree of correspondence between the subscriber and a subscriber having a known history of fraud, as well as an event identifier 270 and an expiration date 272 that are assigned to each event 250 as it is received by the event manager process 224. The use of these fields will be explained in further detail below. - Each
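of these event fields can be captured in a plain record. A minimal sketch in Python (field names follow the description above; the types, defaults, and the data class itself are assumptions rather than the patent's implementation):

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass
class Event:
    subscriber_type: str            # "primary" (account level) or "secondary" (subscriber level)
    subscriber_id: str              # subscriber identifier 254
    family: str                     # family 256, generally "fraud"
    category: str                   # category 258: the event generator source
    event_type: str                 # type 260 within the category
    time: datetime                  # time 262 of the event
    weight: float                   # weight 264: numerical value of the event
    half_life_days: float           # half-life 266: rate at which the weight diminishes
    match_quality: float            # match quality 268: correspondence with a known-fraud record
    event_id: Optional[int] = None  # event identifier 270, assigned on receipt
    expiration: Optional[datetime] = None  # expiration date 272, assigned on receipt
```

- Each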
summary 280 includes a subscriber type 282 and a subscriber identifier 284 that correspond to the subscriber type 252 and the subscriber identifier 254 used in each event 250. Each summary 280 also includes an age time 286 that indicates the last time that an alert score 288 was aged using the half-life 266. The alert score 288 is a composite score used to prioritize investigation. The alert score 288 is preferably a sum of a primary score 290, the score for a particular subscriber, and a critical score 292, the highest score for any account of the subscriber. A critical identifier 294 indicates the account to which the critical score 292 corresponds. An array of partial scores 296 is also included in the summary 280, and includes partial scores (weights) for the subscriber, along with associated half-lives. Each summary 280 is stored in a “bulletin board,” a non-permanent, shared memory area used by the event manager process 224 and the other processes of the fraud detection system. It will be appreciated that each event 250 and each summary 280 may include other fields with additional subscriber information, event information, or system administration information for use in the fraud detection system 100. - FIGS. 4A-4B are a flow chart showing event management according to the principles of the invention. In one embodiment, each step is a task operating within the
event manager process 224. Event management conceptually begins with the event manager process 224 receiving an event message, as shown in step 300. The event message may include one or more events 250 generated, for example, by the CDR loader 226, the provisioning loader 228, or the fraud database 230. In step 300, an event receiver task checks for valid data and assigns an event identifier 270 and an expiration date 272 to each event 250. The events 250 contained in the event message are then stored in the event database 234, as shown in step 302. The event manager process 224 is preferably multi-threaded, such that the process 224 may return to step 300 to receive a new event message at the same time that control is passed to a setup task, as shown in step 304. - In
step 304, each event 250 is prepared for further analysis. In particular, existing summaries 280 and other data for a subscriber type 252 and subscriber identifier 254 are retrieved from databases as necessary and placed onto the bulletin board for common use by the event manager process 224 and the other processes and tasks. - In
step 306, events are aged. Each event 250 has, associated therewith, a half-life 266 which describes the manner in which the weight 264 decreases in value over time. In a preferred embodiment, the half-life is the amount of time for the weight 264 to decrease by fifty percent, thus denoting an exponential decay. It will be appreciated that linear decay, or some other time-sensitive de-weighting, may be used to age events. Events may be aged on an as-needed basis. That is, instead of aging every event 250 daily, events 250 for a subscriber may be aged when an event 250 for that subscriber is received. To facilitate this calculation, the summary 280 for a subscriber includes the most recent aging as an age time 286. In addition, a daily ager task is provided, which operates once each day to age weights for any subscriber having an “open investigation.” The open investigation is a subscriber/account on an investigation queue, which will be described in more detail with reference to step 314. - In
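code form, the exponential aging described above can be sketched as follows (a minimal illustration; the function name is not from the patent):

```python
def age_weight(weight: float, half_life_days: float, elapsed_days: float) -> float:
    """Decay an event weight exponentially: after one half-life the
    weight has decreased by fifty percent."""
    return weight * 0.5 ** (elapsed_days / half_life_days)
```

For example, a weight of 100 with a thirty-day half-life ages to 50 after thirty days and to 25 after sixty; linear decay or another time-sensitive de-weighting could be substituted by replacing the decay expression. - In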
step 308, the event manager process 224 checks for any meta-events. Meta-events are combinations of events, possibly in conjunction with other information, which indicate a likelihood of fraud beyond that suggested by pre-assigned event weights. In step 308, the event manager process 224 may be configured to perform any test on any combination of events and/or other data available in the fraud detection system 100. For example, in a “persistent fraud” meta-event, the subscriber type 252 and subscriber identifier 254 are checked against known occurrences of fraud recorded in the event database 234 by the event manager process 224. If the subscriber type 252 and the subscriber identifier 254 for an event 250 indicate a subscriber with a status of fraud found, then a meta-event with a high score is generated for that subscriber. Step 308 may include several such tests, which may be performed sequentially or in parallel, and each of which may generate its own meta-event. After all such tests have been completed, the event manager process 224 may proceed. - In
step 310, an aggregator task updates the summaries 280 to reflect received events and meta-events. The partial scores 296 and primary score 290 are updated. A new critical score 292 may be determined, and a new alert score 288 calculated therefrom. However, as the critical identifier 294 may change due to the events, it is preferred to defer calculation of a new alert score 288 until a critical subscriber (and associated critical identifier 294) has been determined. - In
step 312, a critical subscriber is identified. Of all of the accounts of a subscriber, i.e., the subscriber identifiers 254 of a subscriber type 252, only the account with the highest qualifying score is used to calculate the alert score 288. In a preferred embodiment, a score is not qualifying if there has been an investigation for the subscriber identifier 284 within a pre-determined time that has resulted in a positive (i.e., fraud) outcome. - In
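code, steps 310 and 312 might be sketched together as follows (the helper name and data shapes are assumptions, not the patent's interfaces):

```python
def compute_alert_score(primary_score, account_scores, disqualified=frozenset()):
    """Alert score = primary score plus the critical score, where the
    critical score is the highest qualifying score among the subscriber's
    accounts; accounts with a recent fraud-positive investigation do not
    qualify. Returns (alert_score, critical_identifier)."""
    qualifying = {acct: s for acct, s in account_scores.items()
                  if acct not in disqualified}
    if not qualifying:
        return primary_score, None
    critical_id = max(qualifying, key=qualifying.get)
    return primary_score + qualifying[critical_id], critical_id
```

- In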
step 314, alerts are created. Once the critical subscriber has been identified, the alert score 288 may be calculated for a subscriber/account, as identified by the subscriber type 282 and subscriber identifier 284. A new alert for the summary 280 is generated if the alert score 288 changes, and will be one of “add,” “remove,” or “changed.” An investigation queue of “open investigations” is maintained, which includes each subscriber/account having an alert score 288 meeting or exceeding the alert threshold. The alert provides instructions to a separate task that is responsible for maintaining the investigation queue, i.e., prioritizing the queue according to alert scores 288 and handling analyst requests for an open investigation. An add alert is generated when the alert score 288 first meets or exceeds the alert threshold. A remove alert is generated to automatically close an open investigation when the alert score 288 falls below the alert threshold, with the caveat that the open investigation will not be closed if it is currently associated with an analyst. The changed alert is generated when the alert score 288 changes. In addition, no alert will be generated within a predetermined period (an “alert delay”) after a finding of fraud relative to a subscriber/account. - In
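a simple sketch, the add/remove/changed decision might be written as follows (the alert-delay suppression after a fraud finding is omitted for brevity, and the names are illustrative):

```python
def alert_kind(old_score, new_score, threshold, analyst_assigned=False):
    """Return "add", "remove", "changed", or None for the task that
    maintains the investigation queue."""
    was_open = old_score >= threshold
    now_open = new_score >= threshold
    if not was_open and now_open:
        return "add"          # score first meets or exceeds the alert threshold
    if was_open and not now_open:
        # an open investigation currently associated with an analyst stays open
        return None if analyst_assigned else "remove"
    if was_open and new_score != old_score:
        return "changed"
    return None
```

- In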
step 316, databases are updated. The preceding steps operate on the bulletin board, which is a non-permanent memory. However, any changes with respect to event logging, scores, summaries, and the like, should be permanently recorded in the databases. In one embodiment, there is no locking on the databases, so there is a chance that a change by one task or process may collide with a change from another task or process. Failed changes are detected using a version counter for the databases, and a failed change causes each step including and after setup 304 to be repeated. After a maximum number of tries, events 250 may be stored in a separate file for future recovery. - In
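outline, the version-counter retry might look like this (the load/store callables are illustrative stand-ins for database access, not the patent's interfaces):

```python
def commit_with_retry(load, apply_events, store, max_tries=3):
    """Optimistic update without database locks: re-read the record,
    re-apply the events, and re-store until the version counter shows no
    concurrent change collided, up to a maximum number of tries."""
    for _ in range(max_tries):
        record, version = load()
        if store(apply_events(record), expected_version=version):
            return True
    return False  # caller spools the events to a separate file for recovery
```

- In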
step 318, the investigation queue is updated according to any alerts. The investigation queue is then prioritized such that when a client 232 (operated by an analyst) requests an open investigation, the client 232 will receive the open investigation having the highest score. - In
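a minimal sketch, the prioritized queue might be kept as a heap (Python's heapq is a min-heap, so scores are negated on insertion; the class and method names are illustrative):

```python
import heapq

class InvestigationQueue:
    def __init__(self):
        self._heap = []

    def add(self, alert_score, subscriber):
        heapq.heappush(self._heap, (-alert_score, subscriber))

    def next_for_analyst(self):
        """Hand the requesting analyst the open investigation with the
        highest alert score."""
        return heapq.heappop(self._heap)[1]
```

- In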
step 320, an automatic hotline is provided. This task can generate a message to a service provider for immediate termination of an account when the alert score 288 exceeds a predetermined value. Alternatively, in a trial phase for the automatic hotline, the message may be logged so that a carrier can observe the effect on customers prior to implementing the automatic hotline. - FIG. 5 shows system parameters for a fraud detection system according to the principles of the invention. The
system parameters 400 may be stored on the event manager 106, or in some other memory within the fraud detection system 100. The system parameters 400 include a corporation 402, along with any corporate hierarchical information such as a company, division, or market. The system parameters 400 include an alert threshold 404 that determines the alert score 288 at which an investigation is opened. The alert threshold 404 may be customized for a particular provider. An alert delay 406 determines the amount of time to wait, after detecting fraud, before generating additional alerts for a subscriber/account. An investigation expiration 408 determines the amount of time after an investigation is closed that it should be purged. Individual events are also purged when their aged weights fall below an event weight minimum 410. Events may also be purged after a predetermined time, as established by an event age maximum 412. These user-configurable parameters permit system resources to be tailored to event traffic. - Generally, each
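of these parameters might be grouped into a single configuration record; the defaults below are illustrative placeholders only, not values from the patent:

```python
from dataclasses import dataclass

@dataclass
class SystemParameters:
    corporation: str = "ExampleCo"           # corporation 402 (placeholder name)
    alert_threshold: float = 500.0           # alert threshold 404
    alert_delay_days: int = 30               # alert delay 406
    investigation_expiration_days: int = 90  # investigation expiration 408
    event_weight_minimum: float = 1.0        # event weight minimum 410
    event_age_maximum_days: int = 180        # event age maximum 412
```

- Generally, each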
event generator (or the event manager 224, for meta-events) is responsible for providing a weight for each event that it generates. These weights are determined (estimated) using a scoring model based upon known techniques of conditional probability and logistic regression. The weights are preferably scaled and shifted to provide a scoring model in which any score above zero indicates a significant likelihood of fraud. In one embodiment, the scoring model is a logit(x) function scaled from −200 to 1000, with a score of 400 representing a 0.5 probability of fraud. In a preferred embodiment, the scores are presented to a client 232 in textual form, i.e., “force alert,” “high,” “medium,” “low,” “zero,” etc. It will be appreciated that other mathematical techniques may be used, provided they can discriminate among events to determine events carrying an increased likelihood of fraud. For example, a neural network may be trained to evaluate events based upon known instances of fraud. A neural network may also be used to generate the alert score 288 in step 314 above. - The
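scaled logit model can be sketched as follows. Only the endpoints (−200 and 1000) and the midpoint (400 for a 0.5 probability) are stated above, so the gain factor below is an assumption:

```python
import math

def score_from_probability(p: float, gain: float = 120.0) -> float:
    """Map an estimated fraud probability to the scoring range: a logit
    curve shifted so that p = 0.5 yields 400, clipped to [-200, 1000].
    The gain factor is an assumed scale, not a value from the patent."""
    score = 400.0 + gain * math.log(p / (1.0 - p))
    return max(-200.0, min(1000.0, score))
```

- The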
event manager process 224 may internally generate meta-events. However, other events handled by the event manager process 224 are received from other processes within the fraud detection system 100. For example, an investigation outcome event is generated by a client 232 when an analyst closes an investigation. If the analyst has determined that there is no fraud, then this information is entered into the user interface presented to the analyst by the client 232 and user interface process 222. The client 232 then generates an event having a negative score, which indicates a reduced likelihood of fraud. Another event that may be generated by the client is a very important person (“VIP”) event. The VIP event carries either a positive or a negative score indicative of the analyst's estimate of a necessary bias required to accurately assess risk. Additional event generators are discussed below. - The
provisioning loader process 228 is an event generator. The provisioning loader 228 receives a stream of provisioning data from a billing system 110. Provisioning data is stored in the subscriber database 240. The provisioning data includes information such as new accounts, account changes, rate plan changes, and billing information such as payment, late payment, non-payment, payment method, and the like. This provisioning data is examined for potentially fraudulent behavior. The provisioning loader process 228 may generate events based upon this stream of provisioning data. In one embodiment, the provisioning stream is provided to the provisioning loader process 228 several times each day. Other configurations are possible instead of, or in addition to, this stream. For example, if available from the carrier, a real-time provisioning feed may be used to generate events in near-real-time as new accounts are added or existing account information is changed. - FIG. 6 shows the
events 420 generated by one embodiment of the provisioning loader process 228, using monthly billing information. The events 420 may be categorized by a category 422, a type 424, and a sub-type 426. A first category 422 is “change,” which has two types 424: info 430 and rate plan 432. The info 430 type has sub-types “name,” “address,” “phone,” and may include any other billing information recorded for a subscriber. A change to “info” may not, by itself, suggest fraud. However, repeated changes over a short period of time may indicate a heightened likelihood of fraud, and these events are preferably weighted with long half-lives and weights such that two or three occurrences will exceed the alert threshold 404. - Another
type 424 of the category 422 change is rate plan 432. The sub-type for rate plan 432 denotes the amount of time since the account was created, one of thirty, sixty, ninety, or ninety-plus days. By creating sub-types for this type, a weight and a half-life may be accorded to a rate plan change according to how long the account has existed. For example, a low weight may be assigned to rate plan changes in the first thirty days of a new account if the carrier permits free changes in that time period. In subsequent periods, changes are expected to be less frequent, and will be accorded greater weight and longer half-lives. - Another
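way to read this bucketing is as a small lookup; the weights and half-lives below are illustrative placeholders, not values from the patent:

```python
def rate_plan_change_event(account_age_days: int):
    """Return (sub-type, weight, half-life in days) for a rate plan
    change, bucketed by the age of the account."""
    if account_age_days <= 30:
        return "30", 10.0, 15.0    # free-change window: low weight
    if account_age_days <= 60:
        return "60", 40.0, 30.0
    if account_age_days <= 90:
        return "90", 60.0, 45.0
    return "90+", 80.0, 60.0       # changes on mature accounts weigh more, decay slower
```

- Another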
category 422 of event for the provisioning loader process 228 is billing. A first type 424 of this category 422 is pay 434. A pay 434 event indicates generally that a bill has been paid, and has associated therewith a negative weight to indicate a reduced likelihood of fraud. The weight accorded to a particular pay 434 event will depend on the manner in which payment was made. Thus the pay 434 type 424 has several sub-types 426, including “cash,” “check,” “verified funds,” “credit,” “pre-paid card,” and “debit.” Each sub-type has associated therewith a weight and a half-life. In a preferred embodiment, payments made in “cash” or “verified funds” such as a verified check or money order receive stronger negative weights. Those forms of payment which are subject to fraudulent use, such as credit cards, will receive negative weights of lesser magnitude. Where a “credit” or “debit” card is used, it may further be desirable to query the fraud database 242 for an exact match with a number associated with fraud. - An event of the
billing category 422 may also have a type 424 of non-payment (“NPAY”) 436. This may be a generic non-payment (“GEN”), a pre-paid card with a positive balance (“PPCPB”), or a partial payment (“PARTP”). In a preferred embodiment, the match quality 268 for a partial payment billing event 436 is proportional to the amount of the bill that has been paid. - A
credit card link 438 category generates events based upon a link to the fraud database 242. Accounts are generally secured by a credit card. Each time a credit card changes for an account (“ACC”), or is used to pay a bill (“BILL”), a query is transmitted to the fraud database process 230, which will search for exact matches in the fraud database 242. This information is used by the provisioning loader process 228 to generate events indicative of fraud. In addition, the subscriber database 240 may be examined for any other subscribers or accounts using the same credit card number. - The
fraud database process 230 also operates as an event generator. In addition to the exact matching process used for credit card (or debit card) numbers, the fraud database process 230 includes a fuzzy matching process. The fuzzy matching process receives provisioning information from the provisioning loader process 228 each time that the provisioning loader process 228 detects a change in subscriber information (or a new account). The fuzzy matching process generates events based upon matches or near matches with records in the fraud database 242. These records may be subscribers or accounts associated with instances of known fraud, subscribers or accounts with instances of suspected fraud, or identities otherwise associated with fraud. The fuzzy matching process is described below. - FIG. 7 is a flow chart of a fuzzy matching process used in a preferred embodiment of the
fraud detection system 100. Generally, each record received from the provisioning loader process is tested against the fraud database 242 by the fuzzy matching process, and the output is a stream of matching events. As used herein, the terms “account change data” or “account change information” are intended to refer to the subset of provisioning data that is forwarded from the provisioning loader process 228 to the fraud database process 230. This is preferably a subset of provisioning data that corresponds to account changes and new accounts. - In
step 500, a received record of account change data is formatted for processing by the fuzzy matching process. This may include, as necessary, parsing of the record and suppression of any user-specified values. In a preferred embodiment, particular fraud records may also be suppressed in the fraud database 242, so that subsequent searching by the fuzzy matching process will pass over those records. - In
step 510, a search key is generated for a particular field or fields of the record. Where the fuzzy matching is performed on an account change, search keys may be limited to those corresponding to the changed fields. For new accounts, search keys are preferably generated for every category that is not suppressed. Each search key defines those fields of an incoming record that should be used for a particular search. Search keys preferably include an individual name, a business name, an address, a telecommunications number, an identification number, and an equipment serial number. It is noted that credit card numbers use an exact match, as described above. For the address, the search key preferably includes a street name, city, two-character state code, and five or nine character postal code. - In
step 520, possible matches are collected. The matches are collected by applying a set of comparison rules to the search key fields in the account change record and the search key fields in each record in the fraud database. The comparison rules are established separately for each search key. For an individual name, possible matches are preferably returned for an exact match, a match of last name and a first letter of one or more first and middle names, a one-off character match with a name, an eighty percent character match, an eighty percent character match with two transposed letters, a short word exactly matching the first letters of a long word, and out-of-order names. A normalized match may also be used, in which a given name is converted using a name synonym table (e.g., “Bob”->“Robert”). - For a business name, different matching rules are applied. Naturally, exact matches are returned. Matches will also be collected for business names with the same words in a different sequence. For business names with more than four characters, a match will also be collected for 80% character matching. Two transposed characters are also collected if at least 80% of the characters match for each word. In addition, exact matches for shorter strings will be collected. Normalized business names are preferably used. In a normalized business name, common abbreviations are expanded (e.g., “IBM”->“International Business Machines”) using a table of known abbreviations, and common extensions such as Corp. or Corporation are removed.
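The individual-name collection rules above (one-off characters, 80% character overlap, adjacent transpositions, short words matching the start of long words) can be sketched roughly as follows. This is a minimal illustration, not the patented implementation, and the function names are our own:

```python
def one_off(a: str, b: str) -> bool:
    """Same length, exactly one differing character."""
    return len(a) == len(b) and sum(x != y for x, y in zip(a, b)) == 1

def transposed_pair(a: str, b: str) -> bool:
    """Same length, equal after swapping one adjacent pair of characters."""
    if len(a) != len(b):
        return False
    diffs = [i for i, (x, y) in enumerate(zip(a, b)) if x != y]
    return (len(diffs) == 2 and diffs[1] == diffs[0] + 1
            and a[diffs[0]] == b[diffs[1]] and a[diffs[1]] == b[diffs[0]])

def char_match_pct(a: str, b: str) -> float:
    """Fraction of positions in the shorter string that match the longer."""
    short, long_ = sorted((a, b), key=len)
    return sum(x == y for x, y in zip(short, long_)) / len(short) if short else 0.0

def is_possible_name_match(candidate: str, fraud_name: str) -> bool:
    """Collect a possible match under a simplified subset of the rules above."""
    c, f = candidate.lower(), fraud_name.lower()
    return (c == f
            or one_off(c, f)
            or transposed_pair(c, f)
            or char_match_pct(c, f) >= 0.80
            or (len(c) < len(f) and f.startswith(c))   # short word vs. start of long word
            or (len(f) < len(c) and c.startswith(f)))
```

A normalized pass (e.g., mapping “Bob” to “Robert” through a synonym table) would simply run the same checks on the normalized form.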
- For an address, matches are generally collected for 80% character matches of words with more than four characters, and for words with two transposed characters, without reference to the order in which the words appear. Substring matches are also collected by matching shorter words against the left-most characters of a longer word. Matches are also collected for one-off character matches of street numbers. Address matches will only be collected for an exact match of the state, city, and postal code.
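The address gate above can be sketched as a hypothetical helper that collects a candidate only when city, state, and postal code match exactly, then compares street words order-free with prefix-style substring matching (the one-off street-number rule is omitted for brevity):

```python
def address_candidate(street_words_in, street_words_fraud, csz_in, csz_fraud):
    """csz_* are (city, state, postal_code) tuples; street words are compared
    order-free, with a shorter word matching the left-most characters of a
    longer word.  No candidate is collected unless city/state/zip are exact."""
    if csz_in != csz_fraud:
        return False
    def word_hit(w, words):
        return any(w == x or x.startswith(w) or w.startswith(x) for x in words)
    return all(word_hit(w, street_words_fraud) for w in street_words_in)
```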
- For telecommunications numbers, matches are collected for exact matches of the country number, area code, prefix, and subscriber number. A “wild card” match may also be used, such that a match is collected if every digit except the wild card digit(s) is an exact match. In a preferred embodiment, a wild card may be a three-digit area code or a three-digit prefix.
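A minimal sketch of the wild-card comparison, assuming the fraud record stores wild-card digits as “?” (the storage format is our assumption):

```python
def wildcard_match(candidate: str, fraud_number: str, wild: str = "?") -> bool:
    """Digit-by-digit exact match, except positions holding the wild-card
    character in the fraud record, which match any digit."""
    if len(candidate) != len(fraud_number):
        return False
    return all(f == wild or c == f for c, f in zip(candidate, fraud_number))
```

With a wild-carded area code, `wildcard_match("12025551234", "1???5551234")` collects a match.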
- Identification numbers may be driver's licenses, Social Security numbers, Federal Tax ID numbers, or other numbers uniquely identifying individuals or entities. For identification numbers, matches are collected for exact matches, one-off matches, all-but-one character matches, and two transposed character matches. For equipment serial numbers, only exact matches are collected.
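The identification-number rules can be sketched as follows; we read “all-but-one character match” as the one-off case, which is an interpretive assumption:

```python
def id_match(candidate: str, fraud_id: str) -> bool:
    """Exact, one-off, or two-adjacent-transposed-character match."""
    if candidate == fraud_id:
        return True
    if len(candidate) != len(fraud_id):
        return False
    diffs = [i for i, (c, f) in enumerate(zip(candidate, fraud_id)) if c != f]
    if len(diffs) == 1:                                  # one-off character
        return True
    if len(diffs) == 2 and diffs[1] == diffs[0] + 1:     # adjacent transposition
        i, j = diffs
        return candidate[i] == fraud_id[j] and candidate[j] == fraud_id[i]
    return False
```

Equipment serial numbers would use only the exact-match branch.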
- In
step 530, a match quality is calculated for each match collected, according to a set of predetermined rules. The predetermined rules are established separately for each search key. For an individual name, match quality is calculated by taking the percentage of characters of a shorter string that match a longer string. Match quality is calculated for both a first and last name in this inquiry. An exact match on two words of the same length receives a match quality of 100%. The match quality is otherwise scaled according to the nature of the match (exact match, substring match, one-off match, two-off match) and the length of the shorter word (1 character, 2-4 characters, 5-7 characters, 8+ characters). For example, in a preferred embodiment, an exact match with a shorter word length of one character is a 100% match, while a one-off match with a word of 5-7 characters is an 80% match. The match quality for each nature-of-match and length-of-shorter-word may be user-configured. - For business names, match quality is determined by calculating the percentage of characters of shorter strings that match longer strings. Match quality is calculated for each word in the business name. An exact match on two words of the same length receives a match quality of 100% regardless of the length of the two words. The match quality is scaled according to the nature of the match and the length of the shorter word. For example, an exact match of only one character will receive a weight of 80%. Similarly, while a one-off match of a 2-4 character word will receive a match quality of 0%, a one-off match for a 5-7 character word will receive an 80% match quality, and a one-off match for an 8+ character word will receive a 90% match quality. These match qualities are user configurable.
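The nature-of-match × shorter-word-length scaling can be represented as a small user-configurable lookup table per search key. In this sketch the 100%, 80%, and 90% entries come from the text; the remaining values are illustrative placeholders:

```python
# Match quality (percent) by nature of match and shorter-word length bucket.
MATCH_QUALITY = {
    "exact":   {"1": 100, "2-4": 100, "5-7": 100, "8+": 100},
    "one_off": {"1": 0,   "2-4": 0,   "5-7": 80,  "8+": 90},
    "two_off": {"1": 0,   "2-4": 0,   "5-7": 60,  "8+": 80},
}

def length_bucket(word: str) -> str:
    n = len(word)
    if n <= 1:
        return "1"
    if n <= 4:
        return "2-4"
    return "5-7" if n <= 7 else "8+"

def match_quality(nature: str, shorter_word: str) -> float:
    """Look up the configured quality and scale it to [0, 1]."""
    return MATCH_QUALITY[nature][length_bucket(shorter_word)] / 100.0
```

Each search key (individual name, business name, address) would carry its own table, since, for example, a one-character exact match is 100% for individual names but 80% for business names.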
- For addresses, a match quality is determined by finding the percentage of characters of the shorter street address (and unit number, if available) that match the longer street address. The match quality is weighted equally for each separate word of the address. Any exact match of two words of the same length receives a match quality of 100% regardless of the length. Match quality is otherwise scaled according to the nature of the match and the length of the shorter word. In a preferred embodiment, these match qualities are the same as those for individual names. The match qualities are user configurable. The city, state, and postal code have a match quality of either 100% (exact) or 0%.
- For telecommunications numbers, an exact match or an exact wild card match is assigned a match quality of 100%. Any other match is assigned a match quality of 0%.
- For identification numbers, an exact match is assigned a match quality of 100%, a one-off match is assigned a quality of 90%, and a two-off match is assigned 80%. For equipment serial numbers, each exact match is assigned a match quality of 100%.
- In
step 540, a discounted score is calculated for each matching record. For an individual name, the discounted score is a sum of the first name match and the last name match, multiplied by the weight assigned to the individual name element. The sum is preferably weighted so that a last name match is accorded greater significance than a first name match. In one embodiment, the first name is weighted as 0.35 of the total, and the last name is weighted as 0.65 of the total score. As a further modification, the weight may be distributed among multiple first names and initials. For example, two matching first names may be weighted at 100%, while two matching first initials will only be weighted at 80%. Out-of-sequence matches are preferably assigned 80% of the match quality for a corresponding in-sequence match. - For a business name, the discounted score is a weighted sum of the match quality for each word in the business name, multiplied by the weight assigned to business names. All words of the business name are treated equally; however, out-of-sequence matches are weighted at 80%.
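The 0.35/0.65 first/last split can be sketched as follows; the function name and signature are our own:

```python
def name_discounted_score(first_q: float, last_q: float, name_weight: float,
                          first_w: float = 0.35, last_w: float = 0.65) -> float:
    """Weighted sum of first- and last-name match qualities (each in [0, 1]),
    multiplied by the weight assigned to the individual-name element."""
    return name_weight * (first_w * first_q + last_w * last_q)
```

An 80% out-of-sequence adjustment would multiply the relevant match quality by 0.8 before it enters the sum.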
- For an address, the discounted score is a weighted sum of the match quality for each word. Out of sequence matches are adjusted to 80% of the match quality for the corresponding in-sequence match. In a preferred embodiment, the total score is weighted as 0.50 city/state and postal code match, and 0.50 street address match. Alternatively, if a unit number is available, the score is weighted 0.20 unit number, 0.50 city/state and postal code, and 0.40 street address match.
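The two-way split above (0.50 street address, 0.50 city/state and postal code) can be sketched as a simple weighted sum; per the earlier rules, the city/state/postal quality is all-or-nothing:

```python
def address_discounted_score(street_q: float, citystate_q: float,
                             address_weight: float) -> float:
    """0.50 street-address + 0.50 city/state/postal-code weighting; the
    city/state/postal quality is either 1.0 (exact) or 0.0."""
    return address_weight * (0.50 * street_q + 0.50 * citystate_q)
```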
- For telecommunications numbers, the discounted score is the match quality (0% or 100%) multiplied by the weight assigned to a telecommunications number.
- For identification numbers, the discounted score is the match quality multiplied by the weight assigned to identification numbers. For equipment serial numbers, the discounted score is the weight assigned to serial numbers.
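Taken together, the per-key rules can be driven by a small loop that scores one account change record against one fraud record under every defined search key and emits one event per key that clears a threshold. The scorer callables here are stand-ins for the rules above, not the patented implementation:

```python
def score_record(record: dict, fraud_record: dict, scorers: dict,
                 threshold: float = 0.0) -> list:
    """scorers maps search-key name -> callable(record, fraud_record) -> score.
    Emits one event per search key whose discounted score exceeds threshold."""
    events = []
    for key, scorer in scorers.items():
        score = scorer(record, fraud_record)
        if score > threshold:
            events.append({"search_key": key, "weight": score})
    return events
```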
- It will be appreciated that steps 510-540 are repeated for each search key defined for the fuzzy matching process. In
step 550, the fuzzy matching process checks if all search keys have been applied to an account change record, and returns to step 510 when one or more search keys for a record have not been tested. When each defined search key has been tested, the fuzzy matching process proceeds to step 560. - In
step 560, an event may be generated. In one embodiment, an event is only generated when the fuzzy matching process generates a non-zero score for a record. Alternatively, an event may only be generated when the discounted match score exceeds a predetermined threshold. In a preferred embodiment, a different event may be generated for each search key, such that a single account change record may generate more than one event if more than one search key for that account change record results in a match. - It will be appreciated that the fuzzy matching process is intended generally to locate account information similar to records in the fraud database 242. Other search keys and scoring techniques may be used, consistent with the processing power of the
fraud database 108, provided they identify similarity between records and generate corresponding events in a timely fashion. - A human analyst accesses the
fraud detection system 100 by using the client 232. The client provides a user interface, preferably a graphical user interface, to the analyst, by which the analyst may request and receive subjects for investigation from the investigation queue. In one embodiment, an analyst requesting an investigation receives the open investigation on the investigation queue having the highest score. Where analysts are geographically distributed over a large area, it may be desirable to assign to an analyst the open investigation having the highest score in that analyst's region. Alternatively, separate investigation queues may be maintained for each geographic region. When the analyst receives the open investigation, or summary information therefor, the open investigation is locked so that no other analyst may work on the same investigation. Through the user interface 222, the analyst may conduct an investigation by examining any data in the system relating to the subscriber/account, including data in the event database 234, the call database 238, the subscriber database 240, and the fraud database 242. An investigation concludes with a finding of fraud, a finding of no fraud, or an indeterminate finding. Each of these findings, when entered by the analyst, is an additional fraud detection event. - FIG. 8 shows a graphical user interface screen presented by the
client 232. The depicted screen is a “close investigation” screen in which an analyst enters a resolution of an investigation. A title bar 580 describes the page being viewed. Instructions 585 relevant to the page may be displayed below the title bar 580. General tools are provided in a tool bar 590 along the left side of the screen. A fraud outcome is specified in a drop-down list 600. A fraud type may also be selected from a drop-down list 602 of recognized fraud types. A text box 604 is provided for an analyst to enter additional notes concerning the investigation. When the fraud outcome and the fraud type have been selected, the analyst may close the investigation by selecting an “OK” box 606. The analyst may instead cancel the close investigation operation by clicking a “Cancel” box 608, and proceed with additional investigation. The interface screen depicted in FIG. 8 is not intended to be limiting. It will be appreciated that numerous graphical user interface tools and objects are known in the art and may be used with a graphical user interface according to the invention. It will further be appreciated that other screens are preferably also used, for example, to handle requests for new investigations, provide summary information to an analyst, and to investigate records and call histories of a subject under investigation. - The
user interface process 222 may, in addition to providing a functional interface between the clients 232 and the rest of the fraud detection system 100, also be used to track analyst productivity. Analyst actions may be logged, with time stamps, for future analysis of time spent in individual and aggregate analysis. - In a preferred embodiment, a graphical user interface is provided for administration of the system, including control of weights and half-lives for each event, and for controlling the system parameters such as those described in reference to FIG. 5. A number of user interfaces and graphical user interface tools are known in the art, and may be adapted to administration of a fraud detection system operating according to the principles of the invention. It will further be appreciated that such a system will preferably include security to prevent unauthorized modifications thereto.
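The highest-score-first assignment with locking described above can be sketched as a small priority queue; the class and method names are our assumptions, not part of the patent:

```python
import heapq

class InvestigationQueue:
    """Max-priority queue of (subscriber, score); an investigation handed to
    an analyst is locked so that no other analyst receives it."""
    def __init__(self):
        self._heap = []      # min-heap of (-score, subscriber)
        self._locked = {}    # subscriber -> analyst holding the lock

    def add(self, subscriber: str, score: float) -> None:
        heapq.heappush(self._heap, (-score, subscriber))

    def request(self, analyst: str):
        """Assign the open investigation with the highest score, or None."""
        while self._heap:
            neg_score, subscriber = heapq.heappop(self._heap)
            if subscriber not in self._locked:
                self._locked[subscriber] = analyst
                return subscriber, -neg_score
        return None
```

Per-region queues would simply instantiate one such queue per geographic region.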
- While the invention has been disclosed in connection with the preferred embodiments shown and described in detail, various modifications and improvements thereon will become readily apparent to those skilled in the art. Accordingly, the spirit and scope of the present invention is to be limited only by the following claims.
Claims (55)
1. A method for detecting fraud in a telecommunications system comprising:
receiving one or more events relating to a subscriber;
combining the one or more events to provide a score; and
storing the subscriber and the score in an investigation queue if the score exceeds a predetermined threshold.
2. The method of claim 1 further comprising:
repeating the method for a plurality of subscribers; and
storing a plurality of suspect subscribers in the investigation queue, each one of the plurality of suspect subscribers having a score that exceeds the predetermined threshold.
3. The method of claim 2 further comprising prioritizing the investigation queue according to the plurality of scores.
4. The method of claim 2 further comprising updating the score of one of the plurality of suspect subscribers to provide an updated score, and removing the one of the plurality of suspect subscribers from the investigation queue if the updated score does not exceed the predetermined threshold.
5. The method of claim 2 further comprising assigning a human analyst to investigate one of the plurality of suspect subscribers.
6. The method of claim 2 further comprising:
determining a region for each one of the plurality of suspect subscribers; and
assigning a regional human analyst to investigate those ones of the plurality of suspect subscribers having a particular region.
7. The method of claim 5 wherein assigning a human analyst further comprises:
receiving a request to investigate from the human analyst;
assigning to the human analyst a one of the plurality of suspect subscribers having a highest priority; and
removing the one of the plurality of suspect subscribers from the investigation queue.
8. The method of claim 1 wherein combining the one or more events to provide a score further comprises:
weighting the one or more events according to one or more event weights, thereby providing one or more weighted events; and
summing the one or more weighted events to provide a score.
9. The method of claim 8 further comprising aging each of the one or more weighted events using a half-life.
10. The method of claim 8 wherein the one or more event weights are discounted according to a match quality.
11. The method of claim 8 wherein the one or more event weights are determined using logistic regression.
12. The method of claim 2 wherein combining the one or more events to provide a score comprises feeding the one or more events to a neural network, the neural network being trained to generate a score indicative of possible fraud from the one or more events.
13. The method of claim 12 further comprising prioritizing the investigation queue according to the plurality of scores.
14. A system for detecting telecommunications fraud comprising:
means for receiving one or more events relating to a subscriber;
means for combining the one or more events to provide a score; and
means for storing the subscriber and the score in an investigation queue if the score exceeds a predetermined threshold.
15. The system of claim 14 further comprising:
means for applying the receiving means, the combining means, and the storing means to a plurality of subscribers; and
means for storing a plurality of suspect subscribers in the investigation queue, each one of the plurality of suspect subscribers having a score that exceeds the predetermined threshold.
16. The system of claim 15 further comprising means for prioritizing the investigation queue according to the plurality of scores.
17. The system of claim 15 further comprising means for removing one of the plurality of suspect subscribers from the investigation queue if the one of the plurality of suspect subscribers has not been investigated within a predetermined time.
18. The system of claim 15 further comprising means for assigning a human analyst to investigate one of the plurality of suspect subscribers.
19. The system of claim 15 further comprising:
means for determining a region for each one of the plurality of suspect subscribers; and
means for assigning a regional human analyst to investigate those ones of the plurality of suspect subscribers having a particular region.
20. The system of claim 18 wherein the assigning means further comprises:
means for receiving a request to investigate from the human analyst; and
means for assigning to the human analyst a one of the plurality of suspect subscribers having a highest priority.
21. The system of claim 14 wherein the combining means further comprises:
means for weighting the one or more events according to one or more event weights, thereby providing one or more weighted events; and
means for summing the one or more weighted events to provide a score.
22. The system of claim 21 further comprising means for aging each of the one or more weighted events using a half-life.
23. The system of claim 21 wherein the one or more event weights are discounted according to a match quality.
24. The system of claim 21 wherein the one or more event weights are determined using logistic regression.
25. The system of claim 15 wherein the combining means further comprises means for feeding the one or more events to a neural network, the neural network being trained to generate a score indicative of possible fraud from the one or more events.
26. The system of claim 25 further comprising means for prioritizing the investigation queue according to the plurality of scores.
27. A computer program for detecting telecommunications fraud embodied in machine executable code comprising:
machine executable code to receive one or more events relating to a subscriber;
machine executable code to combine the one or more events to provide a score; and
machine executable code to store the subscriber and the score in an investigation queue if the score exceeds a predetermined threshold.
28. The computer program of claim 27 further comprising:
machine executable code to repeat the machine executable code to receive, the machine executable code to combine, and the machine executable code to store for a plurality of subscribers; and
machine executable code to store a plurality of suspect subscribers in the investigation queue, each one of the plurality of suspect subscribers having a score that exceeds the predetermined threshold.
29. The computer program of claim 28 further comprising machine executable code to prioritize the investigation queue according to the plurality of scores.
30. The computer program of claim 28 further comprising machine executable code to remove one of the plurality of suspect subscribers from the investigation queue if the one of the plurality of suspect subscribers has not been investigated within a predetermined time.
31. The computer program of claim 28 further comprising machine executable code to assign a human analyst to investigate one of the plurality of suspect subscribers.
32. The computer program of claim 28 further comprising:
machine executable code to determine a region for each one of the plurality of suspect subscribers; and
machine executable code to assign a regional human analyst to investigate those ones of the plurality of suspect subscribers having a particular region.
33. The computer program of claim 32 wherein the machine executable code to assign a human analyst further comprises:
machine executable code to receive a request to investigate from the human analyst; and
machine executable code to assign to the human analyst a one of the plurality of suspect subscribers having a highest priority.
34. The computer program of claim 27 wherein the machine executable code to combine the one or more events to provide a score further comprises:
machine executable code to weight the one or more events according to one or more event weights, thereby providing one or more weighted events; and
machine executable code to sum the one or more weighted events to provide a score.
35. The computer program of claim 34 further comprising machine executable code to age each of the one or more weighted events using a half-life.
36. The computer program of claim 34 wherein the one or more event weights are discounted according to a match quality.
37. The computer program of claim 34 wherein the one or more event weights are determined using logistic regression.
38. The computer program of claim 28 wherein the machine executable code to combine the one or more events to provide a score comprises machine executable code to feed the one or more events to a neural network, the neural network being trained to generate a score indicative of possible fraud from the one or more events.
39. The computer program of claim 38 further comprising machine executable code to prioritize the investigation queue according to the plurality of scores.
40. A method for identifying possibly fraudulent activity in a telecommunications system comprising:
providing a fraud record, the fraud record including a first plurality of fields;
providing an account change record, the account change record including a second plurality of fields;
providing a search key, the search key indicating one or more search key fields corresponding to fields of the account change record and the fraud record;
applying the search key and a first set of rules to the account change record and the fraud record, thereby determining whether there is a possible match;
calculating a match quality for the one or more search key fields if there is a possible match; and
generating an event if there is a possible match, the event having a weight indicative of the quality of a match between the account change record and the fraud record.
41. The method of claim 40 further comprising providing a plurality of fraud records and collecting a plurality of matches.
42. The method of claim 41 further comprising providing a plurality of account change records, and for each one of the plurality of account change records, repeating each of providing a fraud record, providing a search key, applying the search key and a first set of rules, calculating a match quality, and generating an event.
43. The method of claim 40 wherein calculating a match quality further comprises calculating one or more field match terms for each field of the search key, weighting each field match term, and calculating a weighted sum of the field match terms.
44. The method of claim 40 wherein the fraud record is a record of an account with known fraudulent activity.
45. The method of claim 40 wherein the fraud record is a record of an account with suspected fraudulent activity.
46. The method of claim 40 , further comprising providing a plurality of search keys.
47. The method of claim 40 , further comprising providing each generated event to an event manager.
48. A method for identifying possibly fraudulent activity in a telecommunications system comprising:
defining one or more events, each event corresponding to a category of account activity;
assigning to each of the one or more events an event weight and an event half-life;
receiving a provisioning record, the provisioning record corresponding to an activity in an account;
determining which category of account activity corresponds to the provisioning record; and
generating an event for the account activity, the event having the event weight and the event half-life of the category to which the account activity corresponds.
49. The method of claim 48 further comprising providing a plurality of provisioning records, thereby generating a plurality of events.
50. The method of claim 49 wherein the plurality of provisioning records is a daily billing information stream from a carrier.
51. The method of claim 49 wherein the plurality of provisioning records is a real-time payment information stream from a carrier.
52. The method of claim 49 wherein the plurality of provisioning records is a real-time account change information stream from a carrier.
53. The method of claim 48 wherein one of the account activity categories is a change in account information.
54. The method of claim 48 wherein one of the account activity categories is a change in bill payment information.
55. The method of claim 48 , further comprising defining one or more types and one or more sub-types for each account activity category, and defining an event for each sub-type.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US10/335,499 US20030153299A1 (en) | 1998-11-18 | 2002-12-31 | Event manager for use in fraud detection |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US10897198P | 1998-11-18 | 1998-11-18 | |
US10895298P | 1998-11-18 | 1998-11-18 | |
US09/443,065 US6535728B1 (en) | 1998-11-18 | 1999-11-18 | Event manager for use in fraud detection |
US10/335,499 US20030153299A1 (en) | 1998-11-18 | 2002-12-31 | Event manager for use in fraud detection |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US09/443,065 Continuation US6535728B1 (en) | 1998-11-18 | 1999-11-18 | Event manager for use in fraud detection |
Publications (1)
Publication Number | Publication Date |
---|---|
US20030153299A1 true US20030153299A1 (en) | 2003-08-14 |
Family
ID=26806466
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US09/443,065 Expired - Lifetime US6535728B1 (en) | 1998-11-18 | 1999-11-18 | Event manager for use in fraud detection |
US10/335,499 Abandoned US20030153299A1 (en) | 1998-11-18 | 2002-12-31 | Event manager for use in fraud detection |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US09/443,065 Expired - Lifetime US6535728B1 (en) | 1998-11-18 | 1999-11-18 | Event manager for use in fraud detection |
Country Status (6)
Country | Link |
---|---|
US (2) | US6535728B1 (en) |
EP (1) | EP1131976A1 (en) |
AU (1) | AU768096B2 (en) |
CA (1) | CA2351478A1 (en) |
HK (1) | HK1040156A1 (en) |
WO (1) | WO2000030398A1 (en) |
Cited By (47)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20050125338A1 (en) * | 2003-12-09 | 2005-06-09 | Tidwell Lisa C. | Systems and methods for assessing the risk of a financial transaction using reconciliation information |
US20050125296A1 (en) * | 2003-12-09 | 2005-06-09 | Tidwell Lisa C. | Systems and methods for obtaining biometric information at a point of sale |
US20050125337A1 (en) * | 2003-12-09 | 2005-06-09 | Tidwell Lisa C. | Systems and methods for identifying payor location based on transaction data |
US20050125360A1 (en) * | 2003-12-09 | 2005-06-09 | Tidwell Lisa C. | Systems and methods for obtaining authentication marks at a point of sale |
US20050125339A1 (en) * | 2003-12-09 | 2005-06-09 | Tidwell Lisa C. | Systems and methods for assessing the risk of a financial transaction using biometric information |
US20050125351A1 (en) * | 2003-12-09 | 2005-06-09 | Tidwell Lisa C. | Systems and methods for assessing the risk of a financial transaction using authenticating marks |
US20050125295A1 (en) * | 2003-12-09 | 2005-06-09 | Tidwell Lisa C. | Systems and methods for obtaining payor information at a point of sale |
US20070094281A1 (en) * | 2005-10-26 | 2007-04-26 | Malloy Michael G | Application portfolio assessment tool |
US20080162202A1 (en) * | 2006-12-29 | 2008-07-03 | Richendra Khanna | Detecting inappropriate activity by analysis of user interactions |
WO2008138029A1 (en) * | 2007-05-11 | 2008-11-20 | Fmt Worldwide Pty Ltd | A detection filter |
US20090025084A1 (en) * | 2007-05-11 | 2009-01-22 | Fraud Management Technologies Pty Ltd | Fraud detection filter |
US20090271404A1 (en) * | 2008-04-24 | 2009-10-29 | Lexisnexis Risk & Information Analytics Group, Inc. | Statistical record linkage calibration for interdependent fields without the need for human interaction |
US20100005078A1 (en) * | 2008-07-02 | 2010-01-07 | Lexisnexis Risk & Information Analytics Group Inc. | System and method for identifying entity representations based on a search query using field match templates |
US20100174688A1 (en) * | 2008-12-09 | 2010-07-08 | Ingenix, Inc. | Apparatus, System and Method for Member Matching |
US20110138007A1 (en) * | 2008-08-15 | 2011-06-09 | Fujitsu Limited | Business flow distributed processing system and method |
US8195654B1 (en) * | 2005-07-13 | 2012-06-05 | Google Inc. | Prediction of human ratings or rankings of information retrieval quality |
US20120278868A1 (en) * | 2011-04-29 | 2012-11-01 | Boding B Scott | Fraud detection system audit capability |
US8688072B1 (en) * | 2010-12-30 | 2014-04-01 | Sprint Communications Company L.P. | Agent notification triggered by network access failure |
US9015171B2 (en) | 2003-02-04 | 2015-04-21 | Lexisnexis Risk Management Inc. | Method and system for linking and delinking data records |
US9189505B2 (en) | 2010-08-09 | 2015-11-17 | Lexisnexis Risk Data Management, Inc. | System of and method for entity representation splitting without the need for human interaction |
US9342783B1 (en) | 2007-03-30 | 2016-05-17 | Consumerinfo.Com, Inc. | Systems and methods for data verification |
US9411859B2 (en) | 2009-12-14 | 2016-08-09 | Lexisnexis Risk Solutions Fl Inc | External linking based on hierarchical level weightings |
US9529851B1 (en) | 2013-12-02 | 2016-12-27 | Experian Information Solutions, Inc. | Server architecture for electronic data quality processing |
US9684905B1 (en) | 2010-11-22 | 2017-06-20 | Experian Information Solutions, Inc. | Systems and methods for data verification |
US9697263B1 (en) | 2013-03-04 | 2017-07-04 | Experian Information Solutions, Inc. | Consumer data request fulfillment system |
US9760861B2 (en) | 2011-04-29 | 2017-09-12 | Visa International Service Association | Fraud detection system automatic rule population engine |
CN107566358A (en) * | 2017-08-25 | 2018-01-09 | 腾讯科技(深圳)有限公司 | A kind of Risk-warning reminding method, device, medium and equipment |
US10075446B2 (en) | 2008-06-26 | 2018-09-11 | Experian Marketing Solutions, Inc. | Systems and methods for providing an integrated identifier |
US10102536B1 (en) | 2013-11-15 | 2018-10-16 | Experian Information Solutions, Inc. | Micro-geographic aggregation system |
US10262362B1 (en) | 2014-02-14 | 2019-04-16 | Experian Information Solutions, Inc. | Automatic generation of code for attributes |
US20190132336A1 (en) * | 2017-10-30 | 2019-05-02 | Bank Of America Corporation | System for across rail silo system integration and logic repository |
US10339527B1 (en) | 2014-10-31 | 2019-07-02 | Experian Information Solutions, Inc. | System and architecture for electronic fraud detection |
US10375078B2 (en) | 2016-10-10 | 2019-08-06 | Visa International Service Association | Rule management user interface |
US10592982B2 (en) | 2013-03-14 | 2020-03-17 | Csidentity Corporation | System and method for identifying related credit inquiries |
US10593004B2 (en) | 2011-02-18 | 2020-03-17 | Csidentity Corporation | System and methods for identifying compromised personally identifiable information on the internet |
US10699028B1 (en) | 2017-09-28 | 2020-06-30 | Csidentity Corporation | Identity security architecture systems and methods |
US10733293B2 (en) | 2017-10-30 | 2020-08-04 | Bank Of America Corporation | Cross platform user event record aggregation system |
US10896472B1 (en) | 2017-11-14 | 2021-01-19 | Csidentity Corporation | Security and identity verification system and architecture |
US10909617B2 (en) | 2010-03-24 | 2021-02-02 | Consumerinfo.Com, Inc. | Indirect monitoring and reporting of a user's credit data |
US10963434B1 (en) | 2018-09-07 | 2021-03-30 | Experian Information Solutions, Inc. | Data architecture for supporting multiple search models |
US11030562B1 (en) | 2011-10-31 | 2021-06-08 | Consumerinfo.Com, Inc. | Pre-data breach monitoring |
US11151468B1 (en) | 2015-07-02 | 2021-10-19 | Experian Information Solutions, Inc. | Behavior analysis using distributed representations of event data |
US11157914B2 (en) * | 2019-12-13 | 2021-10-26 | Visa International Service Association | Method, system, and computer program product for processing a potentially fraudulent transaction |
US11227001B2 (en) | 2017-01-31 | 2022-01-18 | Experian Information Solutions, Inc. | Massive scale heterogeneous data ingestion and user resolution |
US11880377B1 (en) | 2021-03-26 | 2024-01-23 | Experian Information Solutions, Inc. | Systems and methods for entity resolution |
US11882139B2 (en) | 2012-07-24 | 2024-01-23 | Twilio Inc. | Method and system for preventing illicit use of a telephony platform |
US11941065B1 (en) | 2019-09-13 | 2024-03-26 | Experian Information Solutions, Inc. | Single identifier platform for storing entity data |
Families Citing this family (64)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6526389B1 (en) * | 1999-04-20 | 2003-02-25 | Amdocs Software Systems Limited | Telecommunications system for generating a three-level customer behavior profile and for detecting deviation from the profile to identify fraud |
US7272855B1 (en) * | 1999-06-08 | 2007-09-18 | The Trustees Of Columbia University In The City Of New York | Unified monitoring and detection of intrusion attacks in an electronic system |
US7140039B1 (en) | 1999-06-08 | 2006-11-21 | The Trustees Of Columbia University In The City Of New York | Identification of an attacker in an electronic system |
US7013296B1 (en) | 1999-06-08 | 2006-03-14 | The Trustees Of Columbia University In The City Of New York | Using electronic security value units to control access to a resource |
US6601014B1 (en) * | 1999-11-30 | 2003-07-29 | Cerebrus Solutions Ltd. | Dynamic deviation |
FI110651B (en) * | 2000-02-22 | 2003-02-28 | Nokia Corp | A method for checking the amount of data transferred |
US7089197B1 (en) * | 2000-05-08 | 2006-08-08 | Mci, Llc | System and method for verification of customer information entered via an Internet based order entry system |
JP3872263B2 (en) * | 2000-08-07 | 2007-01-24 | 富士通株式会社 | CTI server and program recording medium |
US6597775B2 (en) * | 2000-09-29 | 2003-07-22 | Fair Isaac Corporation | Self-learning real-time prioritization of telecommunication fraud control actions |
US6850606B2 (en) * | 2001-09-25 | 2005-02-01 | Fair Isaac Corporation | Self-learning real-time prioritization of telecommunication fraud control actions |
GB0103053D0 (en) * | 2001-02-07 | 2001-03-21 | Nokia Mobile Phones Ltd | A communication terminal having a predictive text editor application |
US7113932B2 (en) * | 2001-02-07 | 2006-09-26 | Mci, Llc | Artificial intelligence trending system |
US7089592B2 (en) * | 2001-03-15 | 2006-08-08 | Brighterion, Inc. | Systems and methods for dynamic detection and prevention of electronic fraud |
CA2447717A1 (en) * | 2001-05-17 | 2002-11-21 | Arthur Lance Springer | Domestic to international collect call blocking |
US7206806B2 (en) * | 2001-05-30 | 2007-04-17 | Pineau Richard A | Method and system for remote utilizing a mobile device to share data objects |
WO2003017153A1 (en) * | 2001-08-20 | 2003-02-27 | Patent One Inc. | Identification information issuing system |
US20050190905A1 (en) * | 2002-04-16 | 2005-09-01 | George Bolt | Hierarchical system and method for analyzing data streams |
GB0210241D0 (en) * | 2002-05-03 | 2002-06-12 | Cerebrus | Local usage monitoring and fraud detection for radio communication networks |
US7020254B2 (en) * | 2002-10-28 | 2006-03-28 | Bellsouth Intellectual Property Corporation | Escalation tracking system |
US7899901B1 (en) * | 2002-12-02 | 2011-03-01 | Arcsight, Inc. | Method and apparatus for exercising and debugging correlations for network security system |
US8176527B1 (en) * | 2002-12-02 | 2012-05-08 | Hewlett-Packard Development Company, L. P. | Correlation engine with support for time-based rules |
US20040138975A1 (en) * | 2002-12-17 | 2004-07-15 | Lisa Engel | System and method for preventing fraud in check orders |
US20110202565A1 (en) * | 2002-12-31 | 2011-08-18 | American Express Travel Related Services Company, Inc. | Method and system for implementing and managing an enterprise identity management for distributed security in a computer system |
US7143095B2 (en) * | 2002-12-31 | 2006-11-28 | American Express Travel Related Services Company, Inc. | Method and system for implementing and managing an enterprise identity management for distributed security |
US20040230448A1 (en) * | 2003-02-14 | 2004-11-18 | William Schaich | System for managing and reporting financial account activity |
US7555540B2 (en) * | 2003-06-25 | 2009-06-30 | Microsoft Corporation | Media foundation media processor |
US8914309B2 (en) * | 2004-08-20 | 2014-12-16 | Ebay Inc. | Method and system for tracking fraudulent activity |
US20060085690A1 (en) * | 2004-10-15 | 2006-04-20 | Dell Products L.P. | Method to chain events in a system event log |
FR2877176B1 (en) * | 2004-10-22 | 2007-04-20 | Agence Spatiale Europeenne | METHOD AND DEVICE FOR ORDERING AND TRANSMITTING DATA PACKETS FROM A COMMON TRANSMITTER TO A PLURALITY OF USERS SHARING A COMMON TRANSMIT CHANNEL. |
US7802722B1 (en) | 2004-12-31 | 2010-09-28 | Teradata Us, Inc. | Techniques for managing fraud information |
US20070016522A1 (en) * | 2005-07-15 | 2007-01-18 | Zhiping Wang | Data processing system for a billing address-based credit watch |
US7760861B1 (en) * | 2005-10-31 | 2010-07-20 | At&T Intellectual Property Ii, L.P. | Method and apparatus for monitoring service usage in a communications network |
US20070204033A1 (en) * | 2006-02-24 | 2007-08-30 | James Bookbinder | Methods and systems to detect abuse of network services |
US20080228646A1 (en) * | 2006-10-04 | 2008-09-18 | Myers James R | Method and system for managing a non-changing payment card account number |
US9185123B2 (en) | 2008-02-12 | 2015-11-10 | Finsphere Corporation | System and method for mobile identity protection for online user authentication |
US8116731B2 (en) * | 2007-11-01 | 2012-02-14 | Finsphere, Inc. | System and method for mobile identity protection of a user of multiple computer applications, networks or devices |
US8374634B2 (en) * | 2007-03-16 | 2013-02-12 | Finsphere Corporation | System and method for automated analysis comparing a wireless device location with another geographic location |
US9922323B2 (en) | 2007-03-16 | 2018-03-20 | Visa International Service Association | System and method for automated analysis comparing a wireless device location with another geographic location |
US9432845B2 (en) | 2007-03-16 | 2016-08-30 | Visa International Service Association | System and method for automated analysis comparing a wireless device location with another geographic location |
US9420448B2 (en) | 2007-03-16 | 2016-08-16 | Visa International Service Association | System and method for automated analysis comparing a wireless device location with another geographic location |
US8280348B2 (en) | 2007-03-16 | 2012-10-02 | Finsphere Corporation | System and method for identity protection using mobile device signaling network derived location pattern recognition |
US7440915B1 (en) | 2007-11-16 | 2008-10-21 | U.S. Bancorp Licensing, Inc. | Method, system, and computer-readable medium for reducing payee fraud |
CA2727831C (en) | 2008-06-12 | 2019-02-05 | Guardian Analytics, Inc. | Modeling users for fraud detection and analysis |
US10115153B2 (en) * | 2008-12-31 | 2018-10-30 | Fair Isaac Corporation | Detection of compromise of merchants, ATMS, and networks |
US20100235908A1 (en) * | 2009-03-13 | 2010-09-16 | Silver Tail Systems | System and Method for Detection of a Change in Behavior in the Use of a Website Through Vector Analysis |
US20100235909A1 (en) * | 2009-03-13 | 2010-09-16 | Silver Tail Systems | System and Method for Detection of a Change in Behavior in the Use of a Website Through Vector Velocity Analysis |
US10290053B2 (en) | 2009-06-12 | 2019-05-14 | Guardian Analytics, Inc. | Fraud detection and analysis |
US20110197166A1 (en) * | 2010-02-05 | 2011-08-11 | Fuji Xerox Co., Ltd. | Method for recommending enterprise documents and directories based on access logs |
US9392003B2 (en) | 2012-08-23 | 2016-07-12 | Raytheon Foreground Security, Inc. | Internet security cyber threat reporting system and method |
US10896421B2 (en) | 2014-04-02 | 2021-01-19 | Brighterion, Inc. | Smart retail analytics and commercial messaging |
US20180053114A1 (en) | 2014-10-23 | 2018-02-22 | Brighterion, Inc. | Artificial intelligence for context classifier |
US9509705B2 (en) | 2014-08-07 | 2016-11-29 | Wells Fargo Bank, N.A. | Automated secondary linking for fraud detection systems |
US20150339673A1 (en) | 2014-10-28 | 2015-11-26 | Brighterion, Inc. | Method for detecting merchant data breaches with a computer network server |
US20150032589A1 (en) | 2014-08-08 | 2015-01-29 | Brighterion, Inc. | Artificial intelligence fraud management solution |
US20160055427A1 (en) | 2014-10-15 | 2016-02-25 | Brighterion, Inc. | Method for providing data science, artificial intelligence and machine learning as-a-service |
US20150066771A1 (en) | 2014-08-08 | 2015-03-05 | Brighterion, Inc. | Fast access vectors in real-time behavioral profiling |
US20160071017A1 (en) | 2014-10-15 | 2016-03-10 | Brighterion, Inc. | Method of operating artificial intelligence machines to improve predictive model training and performance |
US20160063502A1 (en) | 2014-10-15 | 2016-03-03 | Brighterion, Inc. | Method for improving operating profits with better automated decision making with artificial intelligence |
US11080709B2 (en) | 2014-10-15 | 2021-08-03 | Brighterion, Inc. | Method of reducing financial losses in multiple payment channels upon a recognition of fraud first appearing in any one payment channel |
US10546099B2 (en) | 2014-10-15 | 2020-01-28 | Brighterion, Inc. | Method of personalizing, individualizing, and automating the management of healthcare fraud-waste-abuse to unique individual healthcare providers |
US20160078367A1 (en) | 2014-10-15 | 2016-03-17 | Brighterion, Inc. | Data clean-up method for improving predictive model training |
US10290001B2 (en) | 2014-10-28 | 2019-05-14 | Brighterion, Inc. | Data breach detection |
US10671915B2 (en) | 2015-07-31 | 2020-06-02 | Brighterion, Inc. | Method for calling for preemptive maintenance and for equipment failure prevention |
US20190342297A1 (en) | 2018-05-01 | 2019-11-07 | Brighterion, Inc. | Securing internet-of-things with smart-agent technology |
Citations (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5345595A (en) * | 1992-11-12 | 1994-09-06 | Coral Systems, Inc. | Apparatus and method for detecting fraudulent telecommunication activity |
US5495521A (en) * | 1993-11-12 | 1996-02-27 | At&T Corp. | Method and means for preventing fraudulent use of telephone network |
US5627886A (en) * | 1994-09-22 | 1997-05-06 | Electronic Data Systems Corporation | System and method for detecting fraudulent network usage patterns using real-time network monitoring |
US5790645A (en) * | 1996-08-01 | 1998-08-04 | Nynex Science & Technology, Inc. | Automatic design of fraud detection systems |
US5819226A (en) * | 1992-09-08 | 1998-10-06 | Hnc Software Inc. | Fraud detection using predictive modeling |
US5905949A (en) * | 1995-12-21 | 1999-05-18 | Corsair Communications, Inc. | Cellular telephone fraud prevention system using RF signature analysis |
US5907602A (en) * | 1995-03-30 | 1999-05-25 | British Telecommunications Public Limited Company | Detecting possible fraudulent communication usage |
US5950121A (en) * | 1993-06-29 | 1999-09-07 | Airtouch Communications, Inc. | Method and apparatus for fraud control in cellular telephone systems |
US5966650A (en) * | 1995-07-13 | 1999-10-12 | Northern Telecom Limited | Detecting mobile telephone misuse |
US5978669A (en) * | 1994-11-10 | 1999-11-02 | Telefonaktiebolaget Lm Ericsson | Method of detecting fraud in a radio communications network by analyzing activity, identification of RF channel data for mobile stations in the network |
US6163604A (en) * | 1998-04-03 | 2000-12-19 | Lucent Technologies | Automated fraud management in transaction-based networks |
US6327352B1 (en) * | 1997-02-24 | 2001-12-04 | Ameritech Corporation | System and method for real-time fraud detection within a telecommunications system |
US6370373B1 (en) * | 1994-11-23 | 2002-04-09 | Lucent Technologies Inc. | System and method for detecting cloning fraud in cellular/PCS communications |
- 1999
  - 1999-11-18 US US09/443,065 patent/US6535728B1/en not_active Expired - Lifetime
  - 1999-11-18 WO PCT/US1999/027611 patent/WO2000030398A1/en not_active Application Discontinuation
  - 1999-11-18 CA CA002351478A patent/CA2351478A1/en not_active Abandoned
  - 1999-11-18 EP EP99960537A patent/EP1131976A1/en not_active Withdrawn
  - 1999-11-18 AU AU17408/00A patent/AU768096B2/en not_active Ceased
- 2002
  - 2002-03-11 HK HK02101839.1A patent/HK1040156A1/en unknown
  - 2002-12-31 US US10/335,499 patent/US20030153299A1/en not_active Abandoned
Cited By (125)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9384262B2 (en) | 2003-02-04 | 2016-07-05 | Lexisnexis Risk Solutions Fl Inc. | Internal linking co-convergence using clustering with hierarchy |
US9020971B2 (en) | 2003-02-04 | 2015-04-28 | Lexisnexis Risk Solutions Fl Inc. | Populating entity fields based on hierarchy partial resolution |
US9015171B2 (en) | 2003-02-04 | 2015-04-21 | Lexisnexis Risk Management Inc. | Method and system for linking and delinking data records |
US9043359B2 (en) | 2003-02-04 | 2015-05-26 | Lexisnexis Risk Solutions Fl Inc. | Internal linking co-convergence using clustering with no hierarchy |
US9037606B2 (en) | 2003-02-04 | 2015-05-19 | Lexisnexis Risk Solutions Fl Inc. | Internal linking co-convergence using clustering with hierarchy |
US7287689B2 (en) * | 2003-12-09 | 2007-10-30 | First Data Corporation | Systems and methods for assessing the risk of a financial transaction using authenticating marks |
US7783563B2 (en) | 2003-12-09 | 2010-08-24 | First Data Corporation | Systems and methods for identifying payor location based on transaction data |
US20050125351A1 (en) * | 2003-12-09 | 2005-06-09 | Tidwell Lisa C. | Systems and methods for assessing the risk of a financial transaction using authenticating marks |
US20050125339A1 (en) * | 2003-12-09 | 2005-06-09 | Tidwell Lisa C. | Systems and methods for assessing the risk of a financial transaction using biometric information |
US20080046368A1 (en) * | 2003-12-09 | 2008-02-21 | First Data Corporation | Systems and methods for assessing the risk of a financial transaction using authenticating marks |
US20050125296A1 (en) * | 2003-12-09 | 2005-06-09 | Tidwell Lisa C. | Systems and methods for obtaining biometric information at a point of sale |
US7398925B2 (en) | 2003-12-09 | 2008-07-15 | First Data Corporation | Systems and methods for assessing the risk of a financial transaction using biometric information |
US20050125295A1 (en) * | 2003-12-09 | 2005-06-09 | Tidwell Lisa C. | Systems and methods for obtaining payor information at a point of sale |
US7905396B2 (en) | 2003-12-09 | 2011-03-15 | First Data Corporation | Systems and methods for assessing the risk of a financial transaction using authenticating marks |
US20050125360A1 (en) * | 2003-12-09 | 2005-06-09 | Tidwell Lisa C. | Systems and methods for obtaining authentication marks at a point of sale |
US20050125338A1 (en) * | 2003-12-09 | 2005-06-09 | Tidwell Lisa C. | Systems and methods for assessing the risk of a financial transaction using reconciliation information |
US20050125337A1 (en) * | 2003-12-09 | 2005-06-09 | Tidwell Lisa C. | Systems and methods for identifying payor location based on transaction data |
US9116945B1 (en) | 2005-07-13 | 2015-08-25 | Google Inc. | Prediction of human ratings or rankings of information retrieval quality |
US8195654B1 (en) * | 2005-07-13 | 2012-06-05 | Google Inc. | Prediction of human ratings or rankings of information retrieval quality |
US20070094281A1 (en) * | 2005-10-26 | 2007-04-26 | Malloy Michael G | Application portfolio assessment tool |
WO2008083320A3 (en) * | 2006-12-29 | 2008-09-12 | Amazon Tech Inc | Detecting inappropriate activity by analysis of user interactions |
US20080162202A1 (en) * | 2006-12-29 | 2008-07-03 | Richendra Khanna | Detecting inappropriate activity by analysis of user interactions |
US11308170B2 (en) | 2007-03-30 | 2022-04-19 | Consumerinfo.Com, Inc. | Systems and methods for data verification |
US9342783B1 (en) | 2007-03-30 | 2016-05-17 | Consumerinfo.Com, Inc. | Systems and methods for data verification |
US10437895B2 (en) | 2007-03-30 | 2019-10-08 | Consumerinfo.Com, Inc. | Systems and methods for data verification |
US20090025084A1 (en) * | 2007-05-11 | 2009-01-22 | Fraud Management Technologies Pty Ltd | Fraud detection filter |
WO2008138029A1 (en) * | 2007-05-11 | 2008-11-20 | Fmt Worldwide Pty Ltd | A detection filter |
US20100146638A1 (en) * | 2007-05-11 | 2010-06-10 | Fmt Worldwide Pty Ltd | Detection filter |
US9836524B2 (en) | 2008-04-24 | 2017-12-05 | Lexisnexis Risk Solutions Fl Inc. | Internal linking co-convergence using clustering with hierarchy |
US8135681B2 (en) | 2008-04-24 | 2012-03-13 | Lexisnexis Risk Solutions Fl Inc. | Automated calibration of negative field weighting without the need for human interaction |
US8489617B2 (en) | 2008-04-24 | 2013-07-16 | Lexisnexis Risk Solutions Fl Inc. | Automated detection of null field values and effectively null field values |
US8572052B2 (en) | 2008-04-24 | 2013-10-29 | LexisNexis Risk Solution FL Inc. | Automated calibration of negative field weighting without the need for human interaction |
US8484168B2 (en) | 2008-04-24 | 2013-07-09 | Lexisnexis Risk & Information Analytics Group, Inc. | Statistical record linkage calibration for multi token fields without the need for human interaction |
US20090292694A1 (en) * | 2008-04-24 | 2009-11-26 | Lexisnexis Risk & Information Analytics Group Inc. | Statistical record linkage calibration for multi token fields without the need for human interaction |
US20090292695A1 (en) * | 2008-04-24 | 2009-11-26 | Lexisnexis Risk & Information Analytics Group Inc. | Automated selection of generic blocking criteria |
US20090271363A1 (en) * | 2008-04-24 | 2009-10-29 | Lexisnexis Risk & Information Analytics Group Inc. | Adaptive clustering of records and entity representations |
US20090271424A1 (en) * | 2008-04-24 | 2009-10-29 | Lexisnexis Group | Database systems and methods for linking records and entity representations with sufficiently high confidence |
US8046362B2 (en) | 2008-04-24 | 2011-10-25 | Lexisnexis Risk & Information Analytics Group, Inc. | Statistical record linkage calibration for reflexive and symmetric distance measures at the field and field value levels without the need for human interaction |
US20090271397A1 (en) * | 2008-04-24 | 2009-10-29 | Lexisnexis Risk & Information Analytics Group Inc. | Statistical record linkage calibration at the field and field value levels without the need for human interaction |
US8135680B2 (en) | 2008-04-24 | 2012-03-13 | Lexisnexis Risk Solutions Fl Inc. | Statistical record linkage calibration for reflexive, symmetric and transitive distance measures at the field and field value levels without the need for human interaction |
US8135679B2 (en) | 2008-04-24 | 2012-03-13 | Lexisnexis Risk Solutions Fl Inc. | Statistical record linkage calibration for multi token fields without the need for human interaction |
US8495077B2 (en) | 2008-04-24 | 2013-07-23 | Lexisnexis Risk Solutions Fl Inc. | Database systems and methods for linking records and entity representations with sufficiently high confidence |
US8135719B2 (en) | 2008-04-24 | 2012-03-13 | Lexisnexis Risk Solutions Fl Inc. | Statistical record linkage calibration at the field and field value levels without the need for human interaction |
US9031979B2 (en) | 2008-04-24 | 2015-05-12 | Lexisnexis Risk Solutions Fl Inc. | External linking based on hierarchical level weightings |
US20090271694A1 (en) * | 2008-04-24 | 2009-10-29 | Lexisnexis Risk & Information Analytics Group Inc. | Automated detection of null field values and effectively null field values |
US8195670B2 (en) | 2008-04-24 | 2012-06-05 | Lexisnexis Risk & Information Analytics Group Inc. | Automated detection of null field values and effectively null field values |
US8250078B2 (en) | 2008-04-24 | 2012-08-21 | Lexisnexis Risk & Information Analytics Group Inc. | Statistical record linkage calibration for interdependent fields without the need for human interaction |
US8266168B2 (en) * | 2008-04-24 | 2012-09-11 | Lexisnexis Risk & Information Analytics Group Inc. | Database systems and methods for linking records and entity representations with sufficiently high confidence |
US8275770B2 (en) | 2008-04-24 | 2012-09-25 | Lexisnexis Risk & Information Analytics Group Inc. | Automated selection of generic blocking criteria |
US20090271405A1 (en) * | 2008-04-24 | 2009-10-29 | Lexisnexis Risk & Information Analytics Group Inc. | Statistical record linkage calibration for reflexive, symmetric and transitive distance measures at the field and field value levels without the need for human interaction |
US20090271359A1 (en) * | 2008-04-24 | 2009-10-29 | Lexisnexis Risk & Information Analytics Group Inc. | Statistical record linkage calibration for reflexive and symmetric distance measures at the field and field value levels without the need for human interaction |
US8316047B2 (en) | 2008-04-24 | 2012-11-20 | Lexisnexis Risk Solutions Fl Inc. | Adaptive clustering of records and entity representations |
US20090271404A1 (en) * | 2008-04-24 | 2009-10-29 | Lexisnexis Risk & Information Analytics Group, Inc. | Statistical record linkage calibration for interdependent fields without the need for human interaction |
US11769112B2 (en) | 2008-06-26 | 2023-09-26 | Experian Marketing Solutions, Llc | Systems and methods for providing an integrated identifier |
US11157872B2 (en) | 2008-06-26 | 2021-10-26 | Experian Marketing Solutions, Llc | Systems and methods for providing an integrated identifier |
US10075446B2 (en) | 2008-06-26 | 2018-09-11 | Experian Marketing Solutions, Inc. | Systems and methods for providing an integrated identifier |
US8639691B2 (en) | 2008-07-02 | 2014-01-28 | Lexisnexis Risk Solutions Fl Inc. | System for and method of partitioning match templates |
US8495076B2 (en) | 2008-07-02 | 2013-07-23 | Lexisnexis Risk Solutions Fl Inc. | Statistical measure and calibration of search criteria where one or both of the search criteria and database is incomplete |
US8572070B2 (en) | 2008-07-02 | 2013-10-29 | LexisNexis Risk Solution FL Inc. | Statistical measure and calibration of internally inconsistent search criteria where one or both of the search criteria and database is incomplete |
US8484211B2 (en) | 2008-07-02 | 2013-07-09 | Lexisnexis Risk Solutions Fl Inc. | Batch entity representation identification using field match templates |
US20100005079A1 (en) * | 2008-07-02 | 2010-01-07 | Lexisnexis Risk & Information Analytics Group Inc. | System for and method of partitioning match templates |
US8639705B2 (en) | 2008-07-02 | 2014-01-28 | Lexisnexis Risk Solutions Fl Inc. | Technique for recycling match weight calculations |
US20100005090A1 (en) * | 2008-07-02 | 2010-01-07 | Lexisnexis Risk & Information Analytics Group Inc. | Statistical measure and calibration of search criteria where one or both of the search criteria and database is incomplete |
US8661026B2 (en) | 2008-07-02 | 2014-02-25 | Lexisnexis Risk Solutions Fl Inc. | Entity representation identification using entity representation level information |
US20100005057A1 (en) * | 2008-07-02 | 2010-01-07 | Lexisnexis Risk & Information Analytics Group Inc. | Statistical measure and calibration of internally inconsistent search criteria where one or both of the search criteria and database is incomplete |
US20100005078A1 (en) * | 2008-07-02 | 2010-01-07 | Lexisnexis Risk & Information Analytics Group Inc. | System and method for identifying entity representations based on a search query using field match templates |
US8285725B2 (en) | 2008-07-02 | 2012-10-09 | Lexisnexis Risk & Information Analytics Group Inc. | System and method for identifying entity representations based on a search query using field match templates |
US8190616B2 (en) | 2008-07-02 | 2012-05-29 | Lexisnexis Risk & Information Analytics Group Inc. | Statistical measure and calibration of reflexive, symmetric and transitive fuzzy search criteria where one or both of the search criteria and database is incomplete |
US8090733B2 (en) | 2008-07-02 | 2012-01-03 | Lexisnexis Risk & Information Analytics Group, Inc. | Statistical measure and calibration of search criteria where one or both of the search criteria and database is incomplete |
US20100005056A1 (en) * | 2008-07-02 | 2010-01-07 | Lexisnexis Risk & Information Analytics Group Inc. | Batch entity representation identification using field match templates |
US20100005091A1 (en) * | 2008-07-02 | 2010-01-07 | Lexisnexis Risk & Information Analytics Group Inc. | Statistical measure and calibration of reflexive, symmetric and transitive fuzzy search criteria where one or both of the search criteria and database is incomplete |
US20100010988A1 (en) * | 2008-07-02 | 2010-01-14 | Lexisnexis Risk & Information Analytics Group Inc. | Entity representation identification using entity representation level information |
US20100017399A1 (en) * | 2008-07-02 | 2010-01-21 | Lexisnexis Risk & Information Analytics Group Inc. | Technique for recycling match weight calculations |
US8583754B2 (en) * | 2008-08-15 | 2013-11-12 | Fujitsu Limited | Business flow distributed processing system and method |
US20110138007A1 (en) * | 2008-08-15 | 2011-06-09 | Fujitsu Limited | Business flow distributed processing system and method |
US20100174688A1 (en) * | 2008-12-09 | 2010-07-08 | Ingenix, Inc. | Apparatus, System and Method for Member Matching |
US8359337B2 (en) * | 2008-12-09 | 2013-01-22 | Ingenix, Inc. | Apparatus, system and method for member matching |
US9122723B2 (en) | 2008-12-09 | 2015-09-01 | Optuminsight, Inc. | Apparatus, system, and method for member matching |
US9836508B2 (en) | 2009-12-14 | 2017-12-05 | Lexisnexis Risk Solutions Fl Inc. | External linking based on hierarchical level weightings |
US9411859B2 (en) | 2009-12-14 | 2016-08-09 | Lexisnexis Risk Solutions Fl Inc | External linking based on hierarchical level weightings |
US10909617B2 (en) | 2010-03-24 | 2021-02-02 | Consumerinfo.Com, Inc. | Indirect monitoring and reporting of a user's credit data |
US9501505B2 (en) | 2010-08-09 | 2016-11-22 | Lexisnexis Risk Data Management, Inc. | System of and method for entity representation splitting without the need for human interaction |
US9189505B2 (en) | 2010-08-09 | 2015-11-17 | Lexisnexis Risk Data Management, Inc. | System of and method for entity representation splitting without the need for human interaction |
US9684905B1 (en) | 2010-11-22 | 2017-06-20 | Experian Information Solutions, Inc. | Systems and methods for data verification |
US8688072B1 (en) * | 2010-12-30 | 2014-04-01 | Sprint Communications Company L.P. | Agent notification triggered by network access failure |
US10593004B2 (en) | 2011-02-18 | 2020-03-17 | Csidentity Corporation | System and methods for identifying compromised personally identifiable information on the internet |
US9971992B2 (en) | 2011-04-29 | 2018-05-15 | Visa International Service Association | Fraud detection system automatic rule population engine |
US9129321B2 (en) * | 2011-04-29 | 2015-09-08 | Visa International Service Association | Fraud detection system audit capability |
US20120278868A1 (en) * | 2011-04-29 | 2012-11-01 | Boding B Scott | Fraud detection system audit capability |
US9760861B2 (en) | 2011-04-29 | 2017-09-12 | Visa International Service Association | Fraud detection system automatic rule population engine |
US10643180B2 (en) | 2011-04-29 | 2020-05-05 | Visa International Service Association | Fraud detection system automatic rule population engine |
US11568348B1 (en) | 2011-10-31 | 2023-01-31 | Consumerinfo.Com, Inc. | Pre-data breach monitoring |
US11030562B1 (en) | 2011-10-31 | 2021-06-08 | Consumerinfo.Com, Inc. | Pre-data breach monitoring |
US11882139B2 (en) | 2012-07-24 | 2024-01-23 | Twilio Inc. | Method and system for preventing illicit use of a telephony platform |
US9697263B1 (en) | 2013-03-04 | 2017-07-04 | Experian Information Solutions, Inc. | Consumer data request fulfillment system |
US10592982B2 (en) | 2013-03-14 | 2020-03-17 | Csidentity Corporation | System and method for identifying related credit inquiries |
US10580025B2 (en) | 2013-11-15 | 2020-03-03 | Experian Information Solutions, Inc. | Micro-geographic aggregation system |
US10102536B1 (en) | 2013-11-15 | 2018-10-16 | Experian Information Solutions, Inc. | Micro-geographic aggregation system |
US9529851B1 (en) | 2013-12-02 | 2016-12-27 | Experian Information Solutions, Inc. | Server architecture for electronic data quality processing |
US11847693B1 (en) | 2014-02-14 | 2023-12-19 | Experian Information Solutions, Inc. | Automatic generation of code for attributes |
US10262362B1 (en) | 2014-02-14 | 2019-04-16 | Experian Information Solutions, Inc. | Automatic generation of code for attributes |
US11107158B1 (en) | 2014-02-14 | 2021-08-31 | Experian Information Solutions, Inc. | Automatic generation of code for attributes |
US10339527B1 (en) | 2014-10-31 | 2019-07-02 | Experian Information Solutions, Inc. | System and architecture for electronic fraud detection |
US10990979B1 (en) | 2014-10-31 | 2021-04-27 | Experian Information Solutions, Inc. | System and architecture for electronic fraud detection |
US11941635B1 (en) | 2014-10-31 | 2024-03-26 | Experian Information Solutions, Inc. | System and architecture for electronic fraud detection |
US11436606B1 (en) | 2014-10-31 | 2022-09-06 | Experian Information Solutions, Inc. | System and architecture for electronic fraud detection |
US11151468B1 (en) | 2015-07-02 | 2021-10-19 | Experian Information Solutions, Inc. | Behavior analysis using distributed representations of event data |
US10375078B2 (en) | 2016-10-10 | 2019-08-06 | Visa International Service Association | Rule management user interface |
US10841311B2 (en) | 2016-10-10 | 2020-11-17 | Visa International Service Association | Rule management user interface |
US11227001B2 (en) | 2017-01-31 | 2022-01-18 | Experian Information Solutions, Inc. | Massive scale heterogeneous data ingestion and user resolution |
US11681733B2 (en) | 2017-01-31 | 2023-06-20 | Experian Information Solutions, Inc. | Massive scale heterogeneous data ingestion and user resolution |
CN107566358A (en) * | 2017-08-25 | 2018-01-09 | 腾讯科技(深圳)有限公司 | A kind of Risk-warning reminding method, device, medium and equipment |
US11580259B1 (en) | 2017-09-28 | 2023-02-14 | Csidentity Corporation | Identity security architecture systems and methods |
US11157650B1 (en) | 2017-09-28 | 2021-10-26 | Csidentity Corporation | Identity security architecture systems and methods |
US10699028B1 (en) | 2017-09-28 | 2020-06-30 | Csidentity Corporation | Identity security architecture systems and methods |
US10721246B2 (en) * | 2017-10-30 | 2020-07-21 | Bank Of America Corporation | System for across rail silo system integration and logic repository |
US10733293B2 (en) | 2017-10-30 | 2020-08-04 | Bank Of America Corporation | Cross platform user event record aggregation system |
US20190132336A1 (en) * | 2017-10-30 | 2019-05-02 | Bank Of America Corporation | System for across rail silo system integration and logic repository |
US10896472B1 (en) | 2017-11-14 | 2021-01-19 | Csidentity Corporation | Security and identity verification system and architecture |
US11734234B1 (en) | 2018-09-07 | 2023-08-22 | Experian Information Solutions, Inc. | Data architecture for supporting multiple search models |
US10963434B1 (en) | 2018-09-07 | 2021-03-30 | Experian Information Solutions, Inc. | Data architecture for supporting multiple search models |
US11941065B1 (en) | 2019-09-13 | 2024-03-26 | Experian Information Solutions, Inc. | Single identifier platform for storing entity data |
US11157914B2 (en) * | 2019-12-13 | 2021-10-26 | Visa International Service Association | Method, system, and computer program product for processing a potentially fraudulent transaction |
US11893586B2 (en) | 2019-12-13 | 2024-02-06 | Visa International Service Association | Method, system, and computer program product for processing a potentially fraudulent transaction |
US11880377B1 (en) | 2021-03-26 | 2024-01-23 | Experian Information Solutions, Inc. | Systems and methods for entity resolution |
Also Published As
Publication number | Publication date |
---|---|
EP1131976A1 (en) | 2001-09-12 |
CA2351478A1 (en) | 2000-05-25 |
AU1740800A (en) | 2000-06-05 |
HK1040156A1 (en) | 2002-05-24 |
WO2000030398A1 (en) | 2000-05-25 |
US6535728B1 (en) | 2003-03-18 |
AU768096B2 (en) | 2003-12-04 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US6535728B1 (en) | Event manager for use in fraud detection | |
US8412633B2 (en) | Money transfer evaluation systems and methods | |
US8082349B1 (en) | Fraud protection using business process-based customer intent analysis | |
US5627886A (en) | System and method for detecting fraudulent network usage patterns using real-time network monitoring | |
US7958032B2 (en) | Generating event messages corresponding to event indicators | |
US7562814B1 (en) | System and method for identity-based fraud detection through graph anomaly detection | |
US7620599B2 (en) | System and method for detecting fraudulent calls | |
CN106384273A (en) | Malicious order scalping detection system and method | |
EP1629617A2 (en) | Method and system for providing fraud detection for remote access services | |
AU2015339448A1 (en) | System and method for real time detection and prevention of segregation of duties violations in business-critical applications | |
US20070265946A1 (en) | Aggregating event indicators | |
US20070265945A1 (en) | Communicating event messages corresponding to event indicators | |
CN108563706A (en) | A kind of collection big data intelligent service system and its operation method | |
US20020152146A1 (en) | Method and apparatus for identifying patent licensing targets | |
EP1427244A2 (en) | Event manager for use in fraud detection | |
US20090234827A1 (en) | Citizenship fraud targeting system | |
CN115564449A (en) | Risk control method and device for transaction account and electronic equipment | |
AU2003261471A1 (en) | Event Manager for Use in Fraud Detection | |
US11657348B2 (en) | System for dynamic exception prioritization | |
US10152712B2 (en) | Inspecting event indicators | |
US11257334B2 (en) | Automatic exception reconciliation | |
Vikram et al. | A solution architecture for financial institutions to handle illegal activities: a neural networks approach | |
US20210398094A1 (en) | System for correspondence matching | |
CN109685662A (en) | Investment data processing method, device, computer equipment and its storage medium | |
Liang et al. | A taxonomy of electronic funds transfer domain intrusions and its feasibility converting into ontology |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
 | STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |