US20070190495A1 - Sensing device for firearm laser training system and method of simulating firearm operation with various training scenarios - Google Patents
- Publication number
- US20070190495A1 (U.S. application Ser. No. 11/642,589)
- Authority
- US
- United States
- Prior art keywords
- target
- sensing device
- image
- user
- impact
- Prior art date
- Legal status
- Abandoned
Classifications
-
- F—MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
- F41—WEAPONS
- F41G—WEAPON SIGHTS; AIMING
- F41G3/00—Aiming or laying means
- F41G3/26—Teaching or practice apparatus for gun-aiming or gun-laying
- F41G3/2616—Teaching or practice apparatus for gun-aiming or gun-laying using a light emitting device
- F41G3/2622—Teaching or practice apparatus for gun-aiming or gun-laying using a light emitting device for simulating the firing of a gun or the trajectory of a projectile
- F41G3/2655—Teaching or practice apparatus for gun-aiming or gun-laying using a light emitting device for simulating the firing of a gun or the trajectory of a projectile in which the light beam is sent from the weapon to the target
-
- F—MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
- F41—WEAPONS
- F41A—FUNCTIONAL FEATURES OR DETAILS COMMON TO BOTH SMALLARMS AND ORDNANCE, e.g. CANNONS; MOUNTINGS FOR SMALLARMS OR ORDNANCE
- F41A33/00—Adaptations for training; Gun simulators
- F41A33/02—Light- or radiation-emitting guns ; Light- or radiation-sensitive guns; Cartridges carrying light emitting sources, e.g. laser
-
- F—MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
- F41—WEAPONS
- F41J—TARGETS; TARGET RANGES; BULLET CATCHERS
- F41J5/00—Target indicating systems; Target-hit or score detecting systems
- F41J5/02—Photo-electric hit-detector systems
-
- F—MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
- F41—WEAPONS
- F41J—TARGETS; TARGET RANGES; BULLET CATCHERS
- F41J5/00—Target indicating systems; Target-hit or score detecting systems
- F41J5/08—Infra-red hit-indicating systems
Definitions
- the present invention pertains to firearm training systems, such as those disclosed in U.S. Pat. No. 6,322,365 (Shechter et al.), U.S. Pat. No. 6,616,452 (Clark et al.) and U.S. Pat. No. 6,966,775 (Kendir et al.), and U.S. Patent Application Publication Nos. 2002/0197584 (Kendir et al.) and 2005/0153262 (Kendir), the disclosures of which are incorporated herein by reference in their entireties.
- the present invention pertains to a firearm laser training system that accommodates various types of targets for facilitating a variety of firearm training activities.
- Firearms are utilized for a variety of purposes, such as hunting, sporting competition, law enforcement and military operations.
- the inherent danger associated with firearms necessitates training and practice in order to minimize the risk of injury.
- special facilities are required to facilitate practice of handling and shooting the firearm. These special facilities tend to provide a sufficiently sized area for firearm training and/or confine projectiles propelled from the firearm within a prescribed space, thereby preventing harm to the surrounding environment. Accordingly, firearm trainees are required to travel to the special facilities in order to participate in a training session, while the training sessions themselves may become quite expensive since each session requires new ammunition for practicing handling and shooting of the firearm.
- firearm training is generally conducted by several organizations (e.g., military, law enforcement, firing ranges or clubs, etc.). Each of these organizations may have specific techniques or manners in which to conduct firearm training and/or qualify trainees. Accordingly, these organizations tend to utilize different types of targets, or may utilize a common target, but with different scoring criteria. Further, different targets may be employed by users for firearm training or qualification to simulate particular conditions or provide a specific type of training (e.g., grouping shots, hunting, clay pigeons, etc.).
- U.S. Pat. No. 4,164,081 (Berke) discloses a marksman training system including a translucent diffuser target screen adapted for producing a bright spot on the rear surface of the target screen in response to receiving a laser light beam from a laser rifle on the target screen front surface.
- a television camera scans the rear side of the target screen and provides a composite signal representing the position of the light spot on the target screen rear surface.
- the composite signal is decomposed into X and Y Cartesian component signals and a video signal by a conventional television signal processor.
- the X and Y signals are processed and converted to a pair of proportional analog voltage signals.
- a target recorder reads out the pair of analog voltage signals as a point, the location of which is comparable to the location on the target screen that was hit by the laser beam.
- U.S. Pat. No. 5,281,142 (Zaenglein, Jr.) discloses a shooting simulation training device including a target projector for projecting a target image in motion across a screen, a weapon having a light projector for projecting a spot of light on the screen, a television camera and a microprocessor.
- An internal device lens projects the spot onto a small internal device screen that is scanned by the camera.
- the microprocessor receives various information to determine the location of the spot of light with respect to the target image.
- U.S. Pat. No. 5,366,229 (Suzuki) discloses a shooting game machine including a projector for projecting a video image that includes a target onto a screen.
- a player may fire a laser gun to emit a light beam toward the target on the screen.
- a video camera photographs the screen and provides a picture signal to coordinate computing means for computing the X and Y coordinates of the beam point on the screen.
- WO 92/08093 (Kunnecke et al.) discloses a small arms target practice monitoring system including a weapon, a target, a light-beam projector mounted on the weapon and sighted to point at the target, a camera directed at the target and a processor.
- An evaluating unit is connected to the camera to determine the coordinates of the spot of light on the target.
- a processor is connected to the evaluating unit and receives the coordinate information. The processor further displays the spot on a target image on a display screen.
- the Berke, Zaenglein, Jr. and Suzuki systems employ particular targets or target scenarios, thereby limiting the types of firearm training activities and simulated conditions provided by those systems.
- the Berke system utilizes both front and rear target surfaces during operation. Thus, placement of the target is restricted to areas having sufficient space for exposure of those surfaces to a user and the system.
- the Berke and Kunnecke et al. systems merely display impact locations to a user, thereby requiring a user to interpret the display to assess user performance during an activity. The user assessment is typically limited to the information (impact locations) provided on the display, thereby restricting feedback of valuable training information to the user and limiting the training potential of the system.
- Yet another object of the present invention is to employ user-specified targets within a firearm laser training system to conduct desired training procedures.
- a further object of the present invention is to assess user performance within a firearm laser training system by determining scoring and/or other performance information based on detected impact locations of simulated projectiles on a target.
- a firearm laser training system accommodates various types of targets for facilitating a variety of firearm training activities.
- the system employs an image sensing device to detect laser beam impacts on a target, where the laser beam is projected from an actual or simulated user firearm equipped with a laser transmitter.
- the image sensing device compensates for image distortions and the sensing device viewing angle with respect to the intended target, and enables detection of laser beam impacts on various types of targets (e.g., paper targets, projected targets, videos, still or moving images, etc.) to provide firearm training with varying scenarios.
- the image sensing device provides impact location information to a computer system to graphically display the impact locations and provide information pertaining to user performance of the training activity.
- FIG. 1 is a view in perspective of a firearm laser training system with a laser beam directed from a firearm onto an intended target according to the present invention.
- FIG. 2 is an exploded view in perspective and partial section of a laser transmitter assembly of the system of FIG. 1 fastened to a firearm barrel.
- FIG. 3 is a schematic block diagram of an image sensing device according to the present invention.
- FIG. 4 is a procedural flow chart illustrating the manner in which the image sensing device is calibrated according to the present invention.
- FIGS. 5-6 are schematic illustrations of exemplary graphical user screens displayed by the system of FIG. 1 for calibrating the image sensing device.
- FIG. 7A is a side view in elevation and partial section of a target including an image sensing device to detect laser beam impact locations according to the present invention.
- FIG. 7B is a front view in elevation of a target including a plurality of image sensing devices each associated with a target section to detect laser beam impact locations according to the present invention.
- FIG. 8 is a procedural flow chart illustrating the manner in which the system of FIG. 1 processes and displays laser beam impact locations according to the present invention.
- FIG. 9 is a schematic illustration of an exemplary graphical user screen displayed by the system of FIG. 1 during system operation.
- FIG. 10 is a schematic illustration of an alternative exemplary graphical user screen displayed by the system of FIG. 1 during system operation.
- FIG. 11 is a schematic illustration of the exemplary graphical user screen of FIG. 10 during operation of the system in a trace mode.
- FIG. 12 is a schematic illustration of the exemplary graphical user screen of FIG. 10 during operation of the system in a plot mode.
- FIG. 13 is a schematic illustration of an exemplary graphical user screen displayed by the system of FIG. 1 during operation of the system for a shotgun course of fire.
- FIG. 14 is a schematic illustration of an exemplary graphical user screen displayed by the system of FIG. 1 during operation of the system for a course of fire.
- FIGS. 15-17 are schematic illustrations of exemplary graphical user screens displayed by the system of FIG. 1 during operation of the system with various targets.
- FIG. 18 is a schematic illustration of an exemplary graphical user screen displayed by the system of FIG. 1 during operation of the system with a target in the form of a video segment.
- FIG. 19 is a schematic illustration of an exemplary graphical user screen displayed by the system of FIG. 1 during operation of the system with a simulation of an indoor firing range.
- FIG. 20 is a schematic illustration of an exemplary graphical user screen displayed by the system of FIG. 1 during operation of the system with a simulation of a live firing range.
- FIG. 21 is a schematic illustration of an exemplary graphical user screen displayed by the system of FIG. 1 during operation of the system with a simulation of a skeet shoot session.
- A firearm laser training system that accommodates various types of targets according to the present invention is illustrated in FIG. 1 .
- the firearm laser training system includes a laser transmitter assembly 200 , a target 10 , an image sensing device 16 and a computer system 18 .
- the laser assembly is attached to an unloaded user firearm 76 to adapt the firearm for compatibility with the training system.
- firearm 76 is implemented by a conventional hand-gun and includes a trigger 77 , a barrel 78 , a hammer 79 and a grip 85 .
- a user aims unloaded firearm 76 at target 10 and actuates trigger 77 to project laser beam 11 from laser assembly 200 toward the target.
- Sensing device 16 detects the laser beam impact location on the target and provides location information to computer system 18 .
- Target 10 may be in the form of a paper target, a television or computer monitor, or an LCD or other panel or display displaying images serving as a target.
- the computer system processes the location information and displays simulated projectile impact locations on the target via a graphical user screen (e.g., FIGS. 9-21 ) as described below.
- the system may further include a projector 17 to provide targets in the form of projected images 14 .
- Sensing device 16 detects the laser beam impact location on the projected image and provides location information to computer system 18 as described above.
- the computer system processes the location information and displays simulated projectile impact locations on the projected image via a graphical user screen (e.g., FIGS. 15-21 ) as described below.
- the computer system may determine scoring and other information based upon the performance of a user.
- target 10 may be implemented by a two-dimensional target, preferably constructed of paper or other material, and attached to or suspended from a supporting structure, such as a wall.
- the target preferably includes indicia forming a transitional type target having a silhouette of a person with several sections or zones (e.g., typically between five and seven) defined therein.
- the target sections may each be assigned a value in order to determine a score for a user.
- the sections and values typically vary based on the system application and/or particular organization (e.g., military, law enforcement, firearm club, etc.) utilizing the system. Further, plural target sections (e.g., contiguous or non-contiguous) may be associated with a common value, while each section may be of any shape or size.
- the score is determined by accumulating the values of the target sections impacted by the laser beam during the firearm activity.
- the values of the target sections may further be multiplied by a scoring factor set by the system and/or the user to accommodate various scoring schemes utilized by different organizations.
- the target may include an image produced by a projector, a television or computer screen and/or an LCD or other display panel, where these targets may similarly include sections or zones each assigned a corresponding value as described above.
- the computer system receives the beam impact locations from the sensing device and retrieves the section values corresponding to the impact locations as described below. Section values for each beam impact are accumulated to produce a score for a user.
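The score accumulation described above can be sketched as follows. This is a minimal illustration, not the patent's actual implementation; the zone names and point values are assumptions.

```python
# Hypothetical zone values; real systems assign values per organization.
ZONE_VALUES = {"center": 10, "inner": 8, "outer": 5}

def score_session(impact_zones, scoring_factor=1.0):
    """Accumulate the value of each target zone impacted by the beam,
    scaled by an organization-specific scoring factor. Zones not in the
    table (e.g., misses) score zero."""
    return sum(ZONE_VALUES.get(zone, 0) for zone in impact_zones) * scoring_factor

# A session with two scored hits and one miss, doubled by a factor of 2.
total = score_session(["center", "outer", "miss"], scoring_factor=2.0)
```

Adjusting `scoring_factor` (or the table itself) models the different scoring schemes used by different organizations without changing the detection logic.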
- the target may be of any shape or size, may be constructed of any suitable materials and may include any indicia to provide any type of target for facilitating any type of training, qualification, gaming, entertainment or other activity.
- the system may utilize any conventional, simulated or “dry fire” type firearms (e.g., hand-gun, rifle, shotgun, firearms powered by air/carbon dioxide, etc.), or firearms utilizing blank cartridges such as those disclosed in the above-mentioned patents and patent application publications, for projecting a laser beam to provide full realism in a safe environment.
- laser transmitter assembly 200 includes a barrel sleeve 202 , a power source 210 , typically in the form of batteries, a laser module 211 and a sleeve cap 216 .
- the barrel sleeve includes a generally cylindrical barrel member 204 and a threaded stop 206 disposed at the barrel member distal end.
- the transverse cross-sectional dimensions of the barrel member are slightly less than those of barrel 78 to enable the barrel member to be inserted within the barrel.
- the barrel member includes an adjustment member 203 disposed at the barrel member proximal end. The adjustment member is typically in the form of a screw and adjusts the barrel member dimensions in response to manipulation by a user.
- the adjustment member alters the barrel member dimensions to enable the barrel member to frictionally engage the barrel and provide a snug fit.
- Stop 206 is in the form of a substantially annular ring and has dimensions slightly greater than those of the barrel member and barrel to limit insertion of the sleeve within the barrel.
- the stop outer surface includes threads facilitating engagement with sleeve cap 216 as described below.
- Power source 210 has dimensions sufficient for insertion within the barrel sleeve to provide power to laser module 211 .
- Laser module 211 includes a substantially cylindrical housing 212 including therein a mechanical wave sensor (not shown) and an optics package (not shown) including a laser and a lens. These components may be arranged within the housing in any suitable fashion.
- the optics package emits a laser beam through the lens toward an intended target in response to detection of hammer fall by the mechanical wave sensor.
- when trigger 77 is actuated, hammer 79 impacts the firearm frame. The impact generates a mechanical wave which travels distally along barrel 78 toward barrel sleeve 202 .
- the term “mechanical wave” or “shock wave” refers to an impulse (e.g., acoustic wave, shock wave, vibration along the barrel, etc.) traveling through the barrel.
- the mechanical wave sensor within the laser module senses the mechanical wave and generates a trigger signal.
- the mechanical wave sensor may include a piezoelectric element, an accelerometer or a solid state sensor, such as a strain gauge.
- the optics package within the laser module generates and projects a laser beam from firearm 76 in response to the trigger signal.
- the optics package laser is generally enabled for a predetermined time interval sufficient for image sensing device 16 to detect a beam impact.
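The trigger-to-laser timing described above can be sketched as a simple threshold-and-hold scheme. This is an illustrative model only; the patent does not disclose the sensor electronics, and the sample-based framing here is an assumption.

```python
def laser_enable_window(sensor_samples, threshold, pulse_samples):
    """Given digitized mechanical-wave sensor samples, return a per-sample
    laser-enable flag: once a sample reaches the trigger threshold, hold
    the laser on for a fixed number of samples (the predetermined interval
    sufficient for the image sensing device to detect the beam)."""
    enable = []
    fire_until = -1  # index of the last sample during which the laser is on
    for i, sample in enumerate(sensor_samples):
        if sample >= threshold:
            fire_until = max(fire_until, i + pulse_samples - 1)
        enable.append(i <= fire_until)
    return enable
```

For example, a single sensor spike at sample 2 with a 3-sample pulse width keeps the laser enabled through sample 4.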
- the beam may be modulated, coded or pulsed in any desired fashion.
- the laser module may include an acoustic sensor to sense actuation of the trigger and enable the optics package.
- the laser module is similar in function to the laser devices disclosed in the aforementioned patents and patent application publications.
- the laser assembly may be constructed of any suitable materials and may be fastened to firearm 76 at any suitable locations by any conventional or other fastening techniques.
- the optics package of laser module 211 is generally disposed toward the housing distal end, while the mechanical wave sensor is typically disposed toward the housing proximal end to detect the mechanical wave.
- Sleeve cap 216 is substantially cylindrical having an open proximal end and a closed distal end. The cap includes threads disposed on its interior surface to engage threads of stop 206 . The closed distal end of the cap includes a substantially central opening 220 defined therein to enable a laser beam emitted by the laser module to pass through the cap.
- power source 210 and laser module 211 are inserted into barrel sleeve 202 .
- Sleeve cap 216 is subsequently attached to stop 206 via their respective threaded portions.
- the barrel sleeve is inserted into the barrel, preferably until stop 206 contacts the barrel distal end.
- the barrel sleeve dimensions may be selectively adjusted by manipulation of the adjustment member as described above to provide a secure fit.
- Laser transmitter assembly 200 basically emits a laser beam from laser module 211 through opening 220 of cap 216 in response to firearm actuation as described above.
- computer system 18 is coupled to and receives and processes information from sensing device 16 to provide various feedback to a user.
- the computer system is typically implemented by a conventional IBM-compatible laptop or other type of personal computer (e.g., notebook, desktop, mini-tower, Apple Macintosh, palm pilot, etc.) preferably equipped with display or monitor 34 , a base 32 (e.g., including the processor, memories, and internal or external communication devices or modems) and a keyboard 36 (e.g., in combination with a mouse or other input devices).
- Computer system 18 includes software to enable the computer system to communicate with sensing device 16 and provide feedback to the user.
- the computer system may utilize any of the major platforms (e.g., Linux, Macintosh, Unix, OS2, etc.), but preferably includes a Windows environment (e.g., Windows 95, 98, NT, 2000, XP, etc.). Further, the computer system includes components (e.g., processor, disk storage or hard drive, etc.) having sufficient processing and storage capabilities to effectively execute the system software.
- Computer system 18 is connected to sensing device 16 via an Ethernet type connection.
- the sensing device may be mounted on a tripod and positioned at a suitable location from the target. However, any type of mounting or other structure may be utilized to support the sensing device.
- the sensing device detects the location of beam impacts on the target (e.g., by capturing an image of the target and detecting the location of the beam impact from the captured image) and provides impact location information in the form of X and Y coordinates to computer system 18 .
- the sensing device performs a calibration to correlate the captured image space with the target space and/or display space as described below.
- a printer (not shown) may further be connected to the computer system to print reports containing user feedback information (e.g., score, hit/miss information, etc.).
- Target characteristics are contained in several files that are stored by computer system 18 .
- a desired target may be photographed and/or scanned prior to system utilization to produce several target files and target information.
- images of user generated targets may be captured via sensing device 16 and optionally manipulated to form a target image, while computer system 18 or other computer system (e.g., via the training system or conventional software) may be utilized to produce the target files and target information for use by the system.
- a target file includes a parameter file, a display image file, a scoring image file and a print image file.
- the parameter file includes information to enable the computer system to control system operation.
- the parameter file includes the filenames of the display, scoring and print image files, a scoring factor and cursor information (e.g., grouping criteria, such as circular shot group size).
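A parameter file of the kind described above could be represented as follows. The field names and file format are illustrative assumptions; the patent does not specify a concrete layout.

```python
from dataclasses import dataclass

@dataclass
class TargetParameters:
    """Hypothetical sketch of the parameter file contents: filenames of
    the associated image files, a scoring factor and cursor information."""
    display_image: str    # filename of the target image scaled for the monitor
    scoring_image: str    # filename of the color-coded scoring image
    print_image: str      # filename of the target image scaled for printed reports
    scoring_factor: float # multiplier applied to zone values for a given organization
    group_size: float     # grouping criterion, e.g., circular shot-group diameter

# Example target definition (all names hypothetical).
params = TargetParameters(
    display_image="silhouette_display.png",
    scoring_image="silhouette_score.png",
    print_image="silhouette_print.png",
    scoring_factor=1.0,
    group_size=4.0,
)
```

Keeping the three image filenames in one record lets the system load a complete target definition from a single parameter file.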
- the display and print image files include an image of the target scaled to particular sections of the monitor and report containing that image, respectively.
- Indicia, preferably in the form of substantially circular icons, are overlaid on these images to indicate beam impact locations (e.g., FIGS. 9-10 and 14 ), and typically include an identifier to indicate the particular shot (e.g., the position number of the shot within a shot sequence).
- the dimensions of the indicia may be adjusted to simulate different ammunition or firearm calibers entered by a user.
- the scoring image file includes a scaled scoring image of the target having scoring sections or zones shaded with different colors. Any variation of colors may be utilized, and the colors are each associated with corresponding information associated with that zone.
- the zone information typically includes scoring values, but may include any other types of activity information (e.g., target number, desirable/undesirable hit location, priority of hit location, friend/foe, etc.).
- when impact location information is received from the sensing device, computer system 18 utilizes that information to access a corresponding location within the scoring image.
- the sensing device may correlate the captured image space directly to the display, scoring and/or print image files. In this case, the received coordinates may be used to directly access the corresponding locations in these files.
- the computer system may perform a translation on the received coordinates to access the locations within the display, scoring and/or print image files corresponding to the detected beam impact locations.
- the color associated with the image location identified by the location information indicates a corresponding zone and/or scoring value.
- the colored scoring image functions as a look-up table to provide a zone value based on the location within the scoring image pertaining to a particular beam impact location.
- the scoring value of an impact location may be multiplied by a scoring factor within the parameter file to provide scores compatible with various organizations and/or scoring schemes.
- the scoring of the system may be adjusted by modifying the scoring factor within the parameter file and/or the scoring zones on the scoring image within the scoring image file.
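The color look-up scheme described above can be sketched as follows: the pixel color at the impact location indexes a table of zone values. The specific colors and values are illustrative assumptions, not the patent's actual encoding.

```python
# Hypothetical color-to-value mapping for the scoring image zones.
COLOR_TO_VALUE = {(255, 0, 0): 10, (0, 255, 0): 5, (0, 0, 255): 2}

def zone_value(scoring_image, x, y, scoring_factor=1.0):
    """Look up the zone value at an impact location. `scoring_image` is a
    2-D grid of (R, G, B) tuples; colors not in the table (e.g., the
    background) score zero."""
    color = scoring_image[y][x]
    return COLOR_TO_VALUE.get(color, 0) * scoring_factor

# A tiny 2x2 scoring image: red and green zones on the top row,
# background and a blue zone on the bottom row.
scoring_img = [[(255, 0, 0), (0, 255, 0)],
               [(0, 0, 0),   (0, 0, 255)]]
```

Because the image itself is the look-up table, re-shading zones in the scoring image file re-scores the system without any code changes, as the passage above notes.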
- the scoring image file may indicate occurrence of various events (e.g., hit/miss of target locations, target sections impacted based on priority, hit friend or foe, etc.) in substantially the same manner described above.
- the target files typically include a second display file containing a scaled image of the target.
- the dimensions of this image are substantially greater than those of the image contained in the initial display image file, and the second display file is preferably utilized to display a target having plural independent target sites.
- the target files along with scaling and other information (e.g., target range information input by user) are stored on computer system 18 for use during system operation.
- the system may readily accommodate any type of target without interchanging system components.
- target files may be downloaded from a network, such as the Internet, and loaded into the computer system to enable the system to access and be utilized with additional targets.
- Image sensing device 16 detects laser beam impact locations on the target and provides the X/Y or Cartesian coordinates of the impact location to computer system 18 for processing.
- the image sensing device is illustrated in FIG. 3 .
- image sensing device 16 includes an image sensor 102 , an image buffer 104 , memories 106 , 108 and 110 , a Field Programmable Gate Array (FPGA) 112 controlling sensing device functions, an Ethernet controller 118 providing Ethernet connections and a power supply 126 providing power for the sensing device.
- Image sensor 102 captures images of an intended target and employs charge-coupled device (CCD) or CMOS sensor technology.
- the image sensor repeatedly captures an image of the target, preferably through a lens 131 , and provides target image information to FPGA 112 for processing.
- the FPGA may store image information in image buffer 104 .
- Image sensor 102 preferably provides image data in monochromatic form (e.g., gray scale or black and white, where red (R), blue (B) and green (G) pixel values are substantially the same); however, the image sensor may alternatively provide full color images (e.g., including red (R), blue (B) and green (G) pixel values).
- the sensing device may further employ a band pass type filter 129 to enable light from lens 131 having the wavelength of the laser transmitter (e.g., approximately 650 nanometers) to pass to the image sensor, while suppressing the ambient light.
- the lens initially focuses and provides ambient light in parallel rays for filtering by filter 129 .
- the image characteristics of the image sensor enable the device to capture images of the target including any changes to the target (e.g., beam impacts) occurring between successive frame transmissions.
- the sensing device facilitates detection of beam impacts from laser transmitters with pulse durations shorter than the frame interval of the image sensor (e.g., pulse durations as low as approximately 1.5 milliseconds). Since the sensing device can accommodate short pulse durations, the sensing device may be universally employed with various laser transmitters.
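One plausible way to detect a brief beam impact between successive frames is to difference consecutive gray-scale frames and take the location of the largest brightness increase. This is a sketch of an assumed scheme; the patent does not disclose the detection algorithm.

```python
def detect_impact(prev_frame, frame, threshold):
    """Return the (x, y) location of the largest pixel-brightness increase
    between two successive gray-scale frames, or None when no increase
    reaches the threshold (i.e., no beam impact occurred)."""
    best, best_delta = None, threshold
    for y, (prev_row, cur_row) in enumerate(zip(prev_frame, frame)):
        for x, (p, c) in enumerate(zip(prev_row, cur_row)):
            if c - p >= best_delta:
                best, best_delta = (x, y), c - p
    return best
```

Thresholding the frame-to-frame difference, rather than absolute brightness, helps reject static bright regions of the scene; a production system would add centroiding and noise filtering.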
- FPGA 112 controls operation of sensing device 16 and is coupled to image sensor 102 , image buffer 104 , memories 106 , 108 and 110 , a system clock 114 , a reset switch 116 , Ethernet controller 118 and I/O ports 124 , 128 .
- the FPGA includes a combination of hardware and software to perform image processing and control of sensing device operation.
- the FPGA performs sensor timing control, exposure control, impact detection, memory control and system control functions.
- the sensor timing control relates to control of the image sensor for capturing images and transferring image information to FPGA 112 for processing.
- System clock 114 provides a clock signal for the control functions, while reset switch 116 enables a user to reset or reboot the sensing device.
- I/O ports 124 , 128 enable the sensing device to control external devices (e.g., laser transmitter, audio and visual indicators, etc.) to simulate return fire and/or provide audio and/or visual indications to a user.
- the FPGA may perform image decimation and hit synchronization functions.
- Image decimation relates to generation of lower resolution images (e.g., selection of a portion of pixels from an image) to enhance transfer rates over a network.
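Decimation by selecting a subset of pixels, as described above, reduces in each dimension by a stride factor. A minimal sketch (the decimation ratio and method are assumptions):

```python
def decimate(frame, factor):
    """Produce a lower-resolution frame by keeping every `factor`-th pixel
    in each dimension, reducing the data transferred over the network by
    roughly factor**2."""
    return [row[::factor] for row in frame[::factor]]
```

For example, decimating a 4x4 frame by a factor of 2 yields a 2x2 frame containing every other pixel of every other row.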
- Hit synchronization is employed when sensing device 16 is utilized with or includes a signature detector 127 to detect signatures within an emitted laser beam.
- Each laser beam may include a signature to identify the particular user transmitting that beam.
- the signatures may be embedded within the laser beam in any desired fashion (e.g., modulation, pulse widths, etc.), while the signature detector may be implemented by any conventional or other devices to detect patterns or signatures within the transmitted laser signal.
- the FPGA determines the location of beam impacts as described below, while signature detector 127 determines the user providing the beam impact, thereby validating the beam impact as being from an authorized user (e.g., as opposed to registering a beam impact detection from ambient light).
- the signature detector further enables detection of beam impacts in the presence of significant levels of ambient light (e.g., sunlight) since an impact is registered in response to detection of the signature.
- the signature detector serves as a filter for detected beam impacts to eliminate or minimize false detections due to ambient light.
- the FPGA coordinates detection of impact locations with verification of laser beam signatures by the signature detector to associate impacts with particular or verified users.
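The signature validation described above amounts to matching a detected pulse pattern against registered user signatures. The pulse-code representation below is an illustrative assumption; the patent leaves the embedding scheme open (modulation, pulse widths, etc.).

```python
# Hypothetical per-user pulse signatures (1 = laser on, 0 = laser off).
USER_SIGNATURES = {"user_a": (1, 0, 1, 1), "user_b": (1, 1, 0, 1)}

def validate_impact(detected_pulses, signatures=USER_SIGNATURES):
    """Return the user whose registered signature matches the detected
    pulse pattern, or None to reject the detection as unauthorized
    (e.g., ambient light rather than a laser transmitter)."""
    detected = tuple(detected_pulses)
    for user, signature in signatures.items():
        if detected == signature:
            return user
    return None
```

Rejecting non-matching patterns is what lets the system register impacts reliably even under significant ambient light: only detections carrying a known signature are counted, and each counted impact is attributed to a particular user.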
- Memory 106 is preferably implemented by a flash type memory and stores software for FPGA 112 to perform image processing and/or control functions.
- Memory 108 is preferably implemented by a synchronous dynamic random access memory (SDRAM) and stores various settings for calibration and other functions, while memory 110 is preferably implemented by a flash type memory and stores calibration data.
- Since memories 106 , 108 and 110 are each non-volatile, the sensing device may be continually re-used without a re-calibration (e.g., unless the position of the sensing device or target has changed).
- Sensing device 16 communicates with computer system 18 via an Ethernet or other network type connection.
- Ethernet controller 118 is coupled to FPGA 112 and to an Ethernet interface 120 .
- the Ethernet controller controls communications with the computer system and is coupled to an Ethernet connector 122 via Ethernet interface 120 .
- These components may be implemented by any conventional or other network components to provide communications with the computer system via any suitable protocols (e.g., Ethernet, etc.).
- Sensing device 16 may process images from various viewing angles and orientations and compensate for deformations in the captured image due to the device position and/or lenses employed over the image sensor (e.g., fisheye, etc.). In order to compensate for the deformations, a calibration is performed as illustrated in FIGS. 4-6 .
- the sensing device is coupled to the computer system with the computer system being initialized for communications (e.g., assigning a network address to the sensing device, projector, etc.) at step 150 .
- a target (e.g., paper, projected image, television or computer screen, LCD panel, etc.) is placed in view of the sensing device.
- a calibration icon on the computer system display is actuated.
- the projector is initialized to provide the image at a desired location and with a suitable alignment, size and/or form.
- the computer system verifies connection of the sensing device and subsequently displays a calibration graphical user screen 130 ( FIGS. 5-6 ) at step 154 .
- Screen 130 includes a viewing area 132 , instruction areas 133 , a gain slide 136 , and buttons 137 , 138 that respectively provide help information and enable continuation of the calibration.
- Instruction areas 133 provide instructions to a user, while viewing area 132 displays an image of the target from sensing device 16 and a grid 134 to identify image boundaries for the calibration.
- the grid is substantially rectangular and includes a series of labeled points along the grid perimeter.
- the grid includes eight labeled points 1 - 8 identifying locations of the target image to the sensing device. Four points are located at respective corners of the grid, while the remaining four points are located at the approximate midpoints between the grid corners along each of the longer and shorter edges of the grid. An additional point for the calibration is located at the center of the grid.
- These grid points correspond to particular locations (e.g., top left corner, top middle, top right corner, etc.) or coordinates within the target or an ideal target image.
- point 1 may correspond to pixel coordinates (0, 0) within an ideal image.
- gain slide 136 is manipulated to enable the target image captured by the sensing device to be visible in viewing area 132 at step 156 .
- the sensing device position and/or lens may further be adjusted to enable the target to fill the viewing area and/or to adjust the focus.
- the user manipulates each perimeter point on the grid in sequence to coincide with the corresponding location on the target within the captured image in viewing area 132 at step 158 ( FIG. 6 ).
- the grid center point is subsequently manipulated to coincide with the target center in the captured image.
- the user indication of the grid points in the captured image in combination with the corresponding locations of the grid points in an ideal or undistorted image enable the system to detect and compensate for the displacement as described below.
- the sensing device receives the target image boundaries from the computer system and the FPGA processes the target image to determine a translation matrix at step 160 that compensates for the angle and/or orientation of the sensing device and distortion produced by lens 131 within the captured image.
- the translation matrix maps each image pixel to a corresponding coordinate in the target and/or display spaces (e.g., computer system display, etc.).
- the primary causes of errors with respect to detection of beam impacts are radial distortion, caused by the sensing device lens providing a magnification at the center of the image slightly greater than the magnification at the edges, and perspective distortion, caused by the sensing device position and angle relative to the target.
- the calibration technique produces the translation matrix to compensate for the position and angle of the sensing device and to correct the distortions caused by the lens.
- the user manipulates the grid to identify each of the points on the distorted image as described above.
- the real or actual coordinates of the calibration points (e.g., the coordinates of the grid points in an ideal image) are compared with the coordinates on the distorted image (e.g., those indicated by the user).
- the radial and perspective distortions of the image may be modeled by conventional techniques, such as those disclosed in R. Y. Tsai, “A Versatile Camera Calibration Technique for High-Accuracy 3D Machine Vision Metrology Using Off-The-Shelf TV Cameras and Lenses”, IEEE Journal of Robotics and Automation, Vol. RA-3, No. 4, August 1987, the disclosure of which is incorporated herein by reference in its entirety.
- the position and orientation of the sensing device relative to the target and radial distortion parameters are estimated using the known locations (e.g., the points indicated by the user on the distorted image and the locations of the grid points in an ideal image).
- This estimation may be determined according to conventional techniques, such as those disclosed in D. G. Bailey, “A New Approach to Lens Distortion Correction”, Proceedings of Image and Vision Computing, New Zealand, pp. 59-64, November 2002, the disclosure of which is incorporated herein by reference in its entirety.
- the perspective distortion is corrected by using a projective transformation. Subsequently, the lens distortion is corrected using the determined radial distortion parameters.
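The two-step correction above can be sketched as follows, assuming a 3x3 homography for the projective transformation and a single-coefficient radial model; the function names and the one-parameter model are illustrative simplifications of the cited techniques, not the system's actual implementation:

```python
def apply_homography(h, x, y):
    """Map a point through a 3x3 projective transformation matrix `h`
    (nested lists), correcting perspective distortion."""
    w = h[2][0] * x + h[2][1] * y + h[2][2]
    return ((h[0][0] * x + h[0][1] * y + h[0][2]) / w,
            (h[1][0] * x + h[1][1] * y + h[1][2]) / w)

def undistort_radial(x, y, k1, cx, cy):
    """Correct one-parameter radial lens distortion about the image
    center (cx, cy): each point is scaled by (1 + k1 * r^2), where r
    is its distance from the center."""
    dx, dy = x - cx, y - cy
    r2 = dx * dx + dy * dy
    scale = 1.0 + k1 * r2
    return cx + dx * scale, cy + dy * scale
```

With the identity homography and k1 = 0 both steps leave points unchanged; the calibration's job is to estimate a non-trivial `h` and `k1` from the grid points indicated by the user.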
- the coordinates obtained from this modeling are compared with corresponding coordinates measured from the image to provide an error term.
- the distortion parameters are adjusted to minimize this error. This may be accomplished in accordance with conventional techniques, such as those disclosed in G. Vass et al., “Applying and Removing Lens Distortion in Post Production”, The Second Hungarian Conference on Computer Graphics and Geometry, 2003, the disclosure of which is incorporated herein by reference in its entirety.
- the resulting coordinates with minimal error are placed within the translation matrix.
- the corresponding coordinates from the translation matrix translate the impact to target and/or display spaces and are provided to the computer system for processing.
- the translation matrix is stored in the sensing device for future use (without a re-calibration).
- the sensing device (or translation matrix) may correlate the captured image space directly to the display, scoring and/or print image files as described above. In this case, the coordinates provided by the sensing device may be used to directly access the corresponding locations in these files.
- the computer system may perform a translation on the coordinates received from the sensing device to access the locations within the display, scoring and/or print image files corresponding to the detected beam impact locations.
- the FPGA may perform the calibration with respect to actual pixel coordinates, or with respect to distances from reference points (e.g., where the quantity of pixels per millimeter or other length unit may be utilized).
- the sensing device further performs a light sensitivity calibration to adjust for ambient light conditions at step 162 .
- the user may select high, medium or low sensitivity.
- the FPGA processes the target image pixel intensities (e.g., red (R), green (G) and blue (B) pixel values) to measure the ambient light and determines the thresholds described below for determining beam impact detections.
- the FPGA determines a threshold value for each of a plurality of regions within the target image.
- the present invention utilizes forty-eight regions; however, any quantity of regions at any locations within the image may be utilized.
- the threshold value for each region is determined from the sum of a maximum luminance or pixel intensity value of pixels within that region and an offset.
- the offset is based on a desired light sensitivity indicated by a user as described above. Since target images are being repeatedly captured and transmitted to the FPGA, certain captured target images may not contain any beam impact detections. Accordingly, the thresholds for each region basically control the system sensitivity to the emitted beam in relation to the ambient light, and enable the system to determine the presence of a beam impact within a corresponding region of the captured target image.
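A minimal sketch of the per-region threshold computation described above, assuming regions given as pixel rectangles over a 2-D luminance array (the function name and data layout are hypothetical):

```python
def region_thresholds(luma, regions, offset):
    """For each region (x0, y0, x1, y1) within a 2-D luminance array,
    the threshold is the maximum pixel intensity in that region plus a
    sensitivity offset (larger offset = lower sensitivity)."""
    thresholds = []
    for x0, y0, x1, y1 in regions:
        peak = max(luma[y][x] for y in range(y0, y1)
                              for x in range(x0, x1))
        thresholds.append(peak + offset)
    return thresholds
```

In practice the image would be divided into a fixed grid (the source mentions forty-eight regions) and the offset chosen from the user's high/medium/low sensitivity selection.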
- the resulting information is stored in sensing device memory 106 at step 164 to enable the sensing device to be re-used without performing the calibration (e.g., the calibration is performed in response to a change in position of the target and/or sensing device).
- Sensing device 16 may be utilized within various targets to detect laser beam impacts on those targets as illustrated in FIG. 7A .
- a target 15 may include a housing 23 , sensing device 16 and a diffuser 19 .
- the housing may be of any shape or size, and may be in the form of any suitable target to simulate a scenario (e.g., silhouette, bull's eye, standard targets for military or law enforcement applications, etc.).
- the target may be of the types disclosed in the aforementioned U.S. Pat. No. 6,322,365 (Shechter et al.) and U.S. Patent Application Publication No. 2005/0153262 (Kendir).
- the sensing device is calibrated as described above with diffuser 19 serving as a target surface and disposed in housing 23 at any suitable location.
- the housing includes an opening 25 with diffuser 19 disposed over or within that opening to serve as an impact location for target 15 .
- Sensing device 16 captures target images and determines beam impact locations in substantially the same manner described above to provide beam impact coordinates to a computer system via a wired or wireless network connection.
- target 15 may include a plurality of sensing devices 16 each associated with a corresponding target section and coupled to a network switch 27 , preferably employing an Ethernet protocol.
- the network switch may be implemented by any conventional or other network device and may utilize any suitable communications protocol.
- Target housing 23 may include a single large diffuser 19 serving as a housing side to receive beam impacts. Alternatively, a housing side may include a plurality of openings 25 with a corresponding diffuser 19 disposed over or within each opening.
- Each sensing device is calibrated and placed in the target housing to detect beam impacts within a corresponding section or diffuser as described above. The sensing devices transfer coordinates from detected beam impacts to network switch 27 for transference to a computer system for processing as described above.
- Computer system 18 includes software to control system operation and provide graphical user interfaces for displaying user performance. The manner in which the computer system monitors beam impact locations and provides information to a user is illustrated in FIGS. 8-9 . Initially, computer system 18 ( FIG. 1 ) facilitates performance of calibrations for the sensing device and ambient light conditions as described above at step 40 .
- a user may commence projecting the laser beam from the firearm toward a target (e.g., paper, projected image, television or computer screen, LCD panel, etc.).
- Sensing device 16 captures target images at step 42 and processes the captured target images, via FPGA 112 , to determine a beam impact location at step 44 .
- each captured target image received by the sensing device includes a plurality of pixels each associated with red (R), green (G) and blue (B) values to indicate the color and luminance of that pixel.
- the red, green and blue values for each pixel are multiplied by a respective weighting factor and summed to produce a pixel density.
- the respective weights may have the same or different values and may be any types of values (e.g., integer, real, etc.).
- the beam impact location is considered to occur within a group of pixels within a captured image where each group member has a density value exceeding a threshold.
- This threshold is determined from the light sensitivity calibration described above and corresponds to the region containing the beam impact.
- the group of pixels containing or representing the beam impact form an area or shape.
- the pixel at the center of the area or shape formed by the pixel group is considered by the system to contain, or represent the location of, a beam impact. If the density value of each captured image pixel is less than the threshold, the captured target image is not considered to include a beam impact.
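The detection steps above can be sketched as follows; the dictionary-based pixel layout is an assumption, and the centroid stands in for the pixel at the center of the area formed by the group:

```python
def detect_impact(pixels, weights, threshold):
    """Weight each pixel's (R, G, B) values and sum them to a density,
    collect the pixels whose density exceeds the threshold, and return
    the center of that group as the impact location.
    `pixels` maps (x, y) -> (r, g, b); returns None when no pixel
    exceeds the threshold (no beam impact in this frame)."""
    wr, wg, wb = weights
    hot = [(x, y) for (x, y), (r, g, b) in pixels.items()
           if wr * r + wg * g + wb * b > threshold]
    if not hot:
        return None
    n = len(hot)
    return (sum(x for x, _ in hot) / n, sum(y for _, y in hot) / n)
```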
- the threshold basically controls the system sensitivity to the emitted beam in relation to the ambient light, and enables the system to determine the presence of a beam impact within a captured target image.
- the coordinates (e.g., X and Y coordinates) of the beam impact are translated by the translation matrix to coordinates within the target space and/or display space (e.g., computer display, etc.) and provided to the computer system for processing at step 46 .
- the computer system includes several target files having target information and scaled images as described above.
- the coordinates received from the sensing device enable display or overlay of the impact location on the target files.
- the sensing device may determine the pulse width of the laser beam (e.g., from detection of beam impacts within successive frames), and provide information to the computer system to enable display of messages in response to a user utilizing a laser having an unsuitable pulse width with respect to the system configuration.
- the detection of laser pulse widths may be utilized to identify and associate beam impacts with particular users employing laser transmitters with different pulse widths.
- the system preferably is configured for laser transmitters emitting a pulse having a duration of 1.5 milliseconds, but may be utilized and/or configured for operation with laser transmitters having any desired pulse width.
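A sketch of the pulse-width estimation from successive frames described above, assuming a fixed frame interval; the tolerance value is an assumption not given in the source:

```python
def pulse_width_ms(frames_hit, frame_interval_ms):
    """Estimate the laser pulse width from the number of successive
    captured frames in which the same impact was detected."""
    return frames_hit * frame_interval_ms

def pulse_acceptable(width_ms, expected_ms=1.5, tolerance_ms=0.5):
    """Flag pulses whose width falls outside the configured tolerance,
    so the system can warn about an unsuitable laser transmitter.
    The 1.5 ms default matches the source; the tolerance is assumed."""
    return abs(width_ms - expected_ms) <= tolerance_ms
```

Distinct pulse widths could likewise key a lookup table associating impacts with particular users' laser transmitters.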
- the received coordinates are utilized to access a corresponding location in the scoring image and determine the score or other activity information for the beam impact at step 50 .
- the received coordinates are utilized to identify a corresponding location within the scoring image.
- Various sections of the scoring image are color coded to indicate a value or other activity information associated with that section as described above.
- the color of the identified location within the scoring image is ascertained to indicate the value or other activity information for the beam impact.
- the scoring factor within the parameter file is applied to (e.g., multiplied by) the score value to determine a score for the beam impact.
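The color-coded lookup and scoring-factor multiplication might look like the following sketch; the specific colors and score values are invented for illustration:

```python
# Hypothetical color-to-value table for a color-coded scoring image.
SCORE_COLORS = {
    (255, 0, 0): 10,   # red: bull's eye
    (0, 0, 255): 5,    # blue: inner ring
    (0, 255, 0): 1,    # green: outer ring
}

def score_impact(scoring_image, x, y, scoring_factor=1.0):
    """Look up the color of the scoring image at the translated impact
    coordinates and apply the parameter-file scoring factor to the
    value associated with that color (0 for an unscored area)."""
    color = scoring_image[y][x]
    return SCORE_COLORS.get(color, 0) * scoring_factor
```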
- the score and other impact information is determined and stored in a database or other storage structure, while a computer system display showing the target is updated to illustrate the beam impact location and other information (e.g., natural dispersion, center of mass, score, score percentage, elapsed time, qualification, etc.) at step 52 .
- the display image is displayed, while the beam impact location is identified by indicia that are overlaid with the display image and placed in an area encompassing the received coordinates.
- the indicia may be scaled to reflect the caliber of the firearm.
- the computer system may provide audio (e.g., resembling firearm shots and/or hits) to indicate beam impact.
- Exemplary graphical user screens indicating the target, beam impact locations, impact time, score and other information are illustrated in FIGS. 9-10 and 14 .
- the system may be configured to detect, process and display beam impacts: at any desired shooting rate (e.g., machine gun rates of approximately one thousand rounds per minute, etc.); originating from blank fire (e.g., as disclosed in the aforementioned U.S. Pat. No. 6,322,365); and/or for targets at maximum distances of approximately twenty-five meters.
- If a round or session of firearm activity is not complete as determined at step 54 , the user continues actuation of the firearm and the system detects beam impact locations and determines information as described above.
- the computer system retrieves information from the database and determines information pertaining to the round at step 56 .
- the computer system may further determine grouping circles. These are generally utilized on shooting ranges where projectile impacts through a target must all be within a circle of a particular diameter (e.g., four centimeters).
- the computer system may analyze the beam impact information and provide groupings and other information on the display that is typically obtained during activities performed on firing ranges (e.g., dispersion, etc.).
- the grouping circle and beam impact location indicia are typically overlaid with the display image and placed in areas encompassing the appropriate coordinates of the display image space in substantially the same manner described above.
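A grouping-circle check can be sketched as follows; centering the circle on the group centroid is a simplifying assumption (the true minimum enclosing circle may be tighter), and coordinates are assumed already translated to target-space units:

```python
import math

def within_grouping_circle(impacts, diameter_cm):
    """Check whether all impact points fit inside a circle of the given
    diameter (e.g., four centimeters) centered on the group centroid."""
    n = len(impacts)
    cx = sum(x for x, _ in impacts) / n
    cy = sum(y for _, y in impacts) / n
    radius = diameter_cm / 2.0
    return all(math.hypot(x - cx, y - cy) <= radius for x, y in impacts)
```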
- the computer system retrieves the appropriate information from the database and generates a report for printing at step 60 .
- the report includes the print image, while beam impact location coordinates are retrieved from the database for the report.
- the beam impact locations are identified by indicia that are overlaid with the print image and placed in an area encompassing the corresponding location of the beam impact as described above for the display.
- the report may further include various information pertaining to user performance (e.g., score, dispersion, center of mass, impact score, cumulative score, score percentage, elapsed time, time between shots, etc.), and may alternatively be generated in electronic form in any desired format (e.g., .doc, .pdf, .wpd, etc.).
- When another round is desired and a calibration is requested at step 64 , the computer system facilitates calibration of the sensing device at step 40 and the above process of system operation is repeated. Similarly, the above process of system operation is repeated from step 42 when another round is desired without performing a calibration. System operation terminates upon completion of the training or qualification activity as determined at step 62 .
- An exemplary user screen employed by the system for a silhouette type target (e.g., a military M9 type target) is illustrated in FIG. 9 .
- screen 170 includes a target area 172 , an action bar 174 , an information area 176 , a shot table area 178 and a mode selection area 179 .
- Target area 172 includes an image of the target with beam impact locations indicated thereon as described above.
- Action bar 174 includes a series of icons to: indicate on-line/off-line status; assign a shooter name and course of fire to the target; add or change overlays; print a copy of the session; access options (e.g., target timeout, target auto start, caliber size, shot limit, enable/disable sound, etc.); start/stop a session (e.g., toggle between on-line and off-line modes); remove a shot from the shot table; open a previously saved session; save a session; and perform network and/or instructor functions.
- Information area 176 provides various information to a user (e.g., date, time, user name, user identification, course, session elapsed time, total number of hits, score, score percentage, dispersion, center of mass, qualification, etc.).
- Shot table area 178 includes a shot table providing the hit number, time and score for each detected target hit.
- Mode selection area 179 includes radio buttons to enable a user to select the mode of operation. The modes include shot mode to record shot locations on the screen, trace mode to track a laser beam on the target as described below and plot mode to plot a path of laser beam impacts as described below.
- Screen 170 may further enable a user to display menus for selecting options, reports and targets or overlays. These menus are typically displayed in response to actuation of corresponding icons in the action bar.
- screen 180 is similar to screen 170 described above and includes target area 172 , action bar 174 , information area 176 , shot table area 178 and mode selection area 179 .
- Target area 172 includes a target image in the form of a bull's eye.
- Action bar 174 includes a series of icons to: assign a shooter name and course of fire to the target; add or change overlays; print a copy of the session; access options (e.g., target timeout, target auto start, caliber size, shot limit, enable/disable sound, etc.); remove a shot from the shot table; open a previously saved session; save a session; and stop a session.
- Information area 176 provides various information to a user (e.g., date, time, user name, user identification, course, session elapsed time, total number of hits, score, score percentage, dispersion, center of mass, etc.).
- Shot table area 178 includes a shot table providing the hit number, time and score for each detected target hit.
- Mode selection area 179 includes radio buttons to enable a user to select the mode of operation. The modes include shot mode to record shot locations on the screen, trace mode to track a laser beam on the target as described below and plot mode to plot a path of laser beam impacts as described below.
- the trace mode is enabled in response to a user selecting the trace mode in mode selection area 179 and the laser transmitter assembly operating in a “constant on” mode.
- the computer system displays a flashing block 171 on the graphical user screen ( FIG. 11 ).
- the block follows movement of the firearm or laser beam projected on the target.
- the computer system receives coordinates of laser beam impact locations from the sensing device and utilizes those coordinates to display the block.
- the position of the block is adjusted on the display in accordance with the received coordinates. As the firearm or laser beam alters position, the block is similarly adjusted on the display to visually indicate movement of the firearm.
- the system traces the aiming position of the firearm or laser transmitter assembly and reports graphically the horizontal and vertical deviations of the firearm ( FIG. 12 ).
- the laser transmitter assembly is configured to continuously project a laser beam from the firearm (e.g., “constant on” mode) as described above.
- the continuous laser beam projection allows the sensing device to trace any movement of the firearm, which in turn, allows the computer system to provide feedback to the user relating to fluctuation in firearm aim.
- the computer system continuously receives detection information (e.g., target image coordinates indicating beam impact locations) from the sensing device. Since the laser transmitter assembly is in a continuous mode (e.g., continuously projecting a laser beam onto the target), the sensing device traces the aim of the firearm on the target and continuously relays detection information to the computer system.
- the computer system determines the target impact locations as described above and compiles and displays a trace report to provide an indication to the user of the horizontal and vertical fluctuations of the firearm with respect to an actual and/or desired hit location on the target.
- FIG. 12 illustrates an exemplary graphical user screen displaying plot mode information and includes plots of horizontal and vertical fluctuations in firearm aim. The vertical and horizontal plots are typically color coded to identify a particular plot.
- In operation, a target (e.g., paper, projected image, television or computer screen, LCD or other display panel, etc.) is provided and sensing device 16 is connected to the computer system.
- Laser transmitter assembly 200 is inserted into barrel 78 of firearm 76 as described above.
- the laser module is actuated in response to depression of firearm trigger 77 .
- the computer system is commanded to commence a firearm activity, and may initially control calibrations for sensing device 16 , as necessary, in the manner described above.
- the firearm may be actuated by a user, while the sensing device captures images of the target and provides coordinates of beam impact locations to the computer system as described above.
- the computer system may determine a score value corresponding to the impacted target section and other information for storage in a database as described above.
- the impact location and other information are displayed on a graphical user screen (e.g., FIGS. 9-12 ) as described above.
- the computer system retrieves the stored information and determines information pertaining to the round for display on the graphical user screen.
- a report may be printed providing information relating to user performance as described above.
- the system may provide indicia on the display to indicate and trace firearm movement as described above.
- the system may be employed with various targets and corresponding graphical user screens in substantially the same manner described above to detect and display beam impact locations on the various targets.
- the targets may include zones or sections for scoring or to indicate a particular firearm activity as described above.
- the targets may be for display purposes to indicate beam impact locations (without scoring).
- An exemplary target to simulate a shotgun emission is illustrated in FIG. 13 .
- the system displays standard shotgun dispersion patterns (e.g., with nine pellets) on a target image from the center mass of a detected beam impact.
- screen 190 is substantially similar to screen 170 described above and includes target area 172 , action bar 174 , information area 176 , shot table area 178 and mode selection area 179 .
- Target area 172 includes a target image in the form of a silhouette.
- Action bar 174 includes a series of icons to: indicate on-line/off-line status; assign a shooter name and course of fire to the target; add or change overlays; print a copy of the session; access options (e.g., target timeout, target auto start, caliber size, shot limit, enable/disable sound, etc.); start/stop a session (e.g., toggle between on-line and off-line modes); end a task (as described below); remove a shot from the shot table; open a previously saved session; save a session; and perform network functions.
- Information area 176 provides various information to a user (e.g., date, time, user name, user identification, time remaining, total number of hits, score, dispersion, center of mass, qualification, etc.).
- the computer system receives beam impact coordinates from the sensing device and determines the dispersion pattern (e.g., additional beam impact locations from the beam impact coordinates) for shotgun pellets to display on the target in target area 172 . This is preferably accomplished by applying a probability distribution to the received beam impact coordinates. The probability distribution is biased toward the center of the received beam impact location and produces pixel coordinates for the dispersion pattern.
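The center-biased dispersion generation described above can be sketched with a normal distribution around the received impact coordinates; the spread value is an assumption, and the pellet count default follows the nine-pellet pattern mentioned in the source:

```python
import random

def shotgun_pattern(cx, cy, pellets=9, spread=5.0, seed=None):
    """Generate pellet impact coordinates from a center-biased normal
    distribution around the detected beam impact (cx, cy); `spread`
    is the standard deviation in display-space pixels (assumed)."""
    rng = random.Random(seed)
    return [(rng.gauss(cx, spread), rng.gauss(cy, spread))
            for _ in range(pellets)]
```

Seeding the generator makes a session's pattern reproducible for playback; leaving `seed` as None gives a fresh pattern per shot.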
- the options for this type of screen include a shot limit, auto start and a session time. These options may be selected in response to actuation of an options icon in action bar 174 as described above.
- the system may provide qualification of users for various tasks.
- a series of tasks (or a course of fire) are performed by the user to determine a user qualification.
- the tasks are stored in a file that indicates the task order and the shot and time limit for each task. Shots are not registered by the system after expiration of the shot or time limit. Successive tasks may be automatically performed, or an instructor may need to indicate the start of a succeeding task after completion of a prior task.
- screen 230 is similar to screen 170 described above and includes target area 172 , action bar 174 , information area 176 , shot table area 178 and mode selection area 177 .
- Target area 172 includes a target image in the form of a silhouette.
- Action bar 174 is similar to the action bar described above, and includes a series of icons to: indicate on-line/off-line status; assign a shooter name and course of fire to the target; add or change overlays; print a copy of the session; access options (e.g., target timeout, target auto start, caliber size, shot limit, enable/disable sound, etc.); start/stop a task; remove a shot from the shot table; open a previously saved session; save a session; and load a course of fire.
- Information area 176 provides various information to a user (e.g., user name, user identification, task, task hits, total hits, task score, total score, dispersion, center of mass, qualification, time left, etc.).
- Shot table area 178 includes a shot table providing the hit number, time, score and task for each detected target hit.
- Mode selection area 177 includes radio buttons to enable a user to select the mode of operation and inputs for mode options (e.g., time between tasks, end task on shot limit, etc.).
- the modes include: instructor controlled qualification to enable an instructor to control the start of tasks; automatic qualification where an instructor or shooter starts a session and succeeding tasks start after a preset delay until completion of the tasks; free practice to enable a user to practice; and task selection to enable a user to select and perform a particular task.
- the mode options enable a user or instructor to enter the time between tasks and whether termination of a task occurs at the expiration of the shot limit or time limit.
- the computer system may determine a user qualification based on the performance of the tasks by the user and criteria for specific qualifications.
- the system may alternatively be employed with various targets in the form of images or videos produced from a projector, displayed on a television screen or monitor, or displayed on an LCD or other panel in substantially the same manner described above to detect and display beam impact locations on those various targets (e.g., FIGS. 15-21 ).
- the targets may include zones or sections for scoring or to indicate a particular firearm activity.
- the targets may be for display purposes to indicate beam impact locations (without scoring).
- Exemplary targets in the form of projected or displayed images include: balloons or other objects that change in both size and shape while moving to improve skills with moving targets and judgment ( FIG. 15 ); and bowling pins or other stationary objects at varying distances, sizes and exposure times (e.g., for each pin) ( FIG. 16 ).
- the projected or displayed images may simulate various shooting activities or ranges including: an indoor shooting range with an exemplary silhouette type target at various distances ( FIG. 19 ); live fire courses with silhouette targets and recording hits and misses, where the target scenario may be edited by a user ( FIG. 20 ); and skeet shooting with adjustable speed and difficulty levels ( FIG. 21 ).
- the system may be utilized with videos of scenarios ( FIG. 18 ).
- the system records shots during the session for playback of the beam impacts (e.g., indicated by the icon as viewed in FIG. 18 ) with the video for analysis of user performance.
- the system may pause the playback at each shot for analysis.
- a user may load a custom video of a scenario for use with the system.
- the system may include any quantity or type of target of any shape or size, constructed of any suitable materials and placed in any desired location.
- the computer system may be implemented by any conventional or other computer or processing system.
- the components of the system may be connected by any communications devices (e.g., cables, wireless, network, etc.) in any desired fashion, and may utilize any type of conventional or other interface scheme or protocol.
- the computer system may be in communication with other training systems via any type of communications medium (e.g., direct line, telephone line/modem, network, etc.) to facilitate group training or competitions.
- the system may be configured for any types of training, qualification, competition, gaming and/or entertainment applications.
- the printer may be implemented by any conventional or other type of printer.
- the firearm laser training system may be utilized with any type of firearm (e.g., hand-gun, rifle, shotgun, machine gun, etc.), while the laser module may be fastened to the firearm at any suitable locations via any conventional or other fastening techniques (e.g., frictional engagement with the barrel, brackets attaching the device to the firearm, etc.).
- the system may include a dummy firearm projecting a laser beam, or replaceable firearm components (e.g., a barrel) having a laser device disposed therein for firearm training.
- the laser assembly may include the laser module and barrel member or any other fastening device.
- the laser module may emit any type of laser beam.
- the optics package may include any suitable lens for projecting the beam.
- the laser beam may be enabled for any desired duration sufficient to enable the sensing device to detect the beam.
- the laser module may be fastened to a firearm or other similar structure (e.g., a dummy, toy or simulated firearm) at any suitable locations (e.g., external or internal of a barrel) and be actuated by a trigger or any other device (e.g., power switch, firing pin, relay, etc.).
- the laser module may be configured in the form of ammunition for insertion into a firearm firing or similar chamber and project a laser beam in response to trigger actuation.
- the laser module may be configured for direct insertion into the barrel.
- the laser module may include any type of sensor or detector (e.g., acoustic sensor, piezoelectric element, accelerometer, solid state sensors, strain gauge, etc.) to detect mechanical or acoustical waves or other conditions signifying trigger actuation.
- the laser module components may be arranged in any fashion, while the module power source may be implemented by any type of batteries.
- the module may include an adapter for receiving power from a common wall outlet jack or other power source.
- the laser beam may be visible or invisible (e.g., infrared), may be of any color or power level, may have a pulse of any desired duration and may be modulated in any fashion (e.g., at any desired frequency or unmodulated) or encoded in any manner to provide any desired information, while the transmitter may project the beam continuously or include a “constant on” mode.
- the system may be utilized with transmitters and detectors emitting any type of energy (e.g., light, infrared, etc.).
- the target may be implemented by any type of target having any desired configuration and indicia forming any desired target site and be disposed at any suitable location.
- the target may include any desired still or moving images (e.g., still image, video, etc.) displayed on paper or other material, on a surface by a projector, on a television, computer, LCD panel or other form of display.
- the target may be of any shape or size, and may be constructed of any suitable materials.
- the target may include any conventional or other fastening devices to attach to any supporting structure.
- the supporting structure may include any conventional or other fastening devices to secure a target to that structure.
- any type of adhesive may be utilized to secure a target to the structure.
- the support structure may be implemented by any structure suitable to support or suspend a target.
- the target may include any quantity of sections or zones of any shape or size and associated with any desired values.
- the target may include any quantity of individual targets or target sites.
- the system may utilize any type of coding, color or other scheme to associate values with target sections (e.g., table look-up, target location identifiers as keys into a database or other storage structure, etc.).
- the sections or zones may be identified by any type of codes, such as alphanumeric characters, numerals, etc., that indicate a score value or any other information.
- the score values may be set to any desired values.
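By way of illustration only, the table look-up scheme mentioned above may be sketched as follows; the zone codes, score values and scoring factor are hypothetical, and in practice would be loaded from the target files:

```python
# Hypothetical zone-to-value table; a real system would load this from
# the target file associated with the target in use.
ZONE_VALUES = {"A": 5, "B": 4, "C": 3, "D": 2, "E": 1}

def score_hits(zone_codes, scoring_factor=1.0):
    """Accumulate the section value of each detected hit, scaled by an
    organization-specific scoring factor; unknown codes (misses) score 0."""
    return sum(ZONE_VALUES.get(code, 0) for code in zone_codes) * scoring_factor
```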
- the target characteristics and images may be contained in any quantity of any types of files.
- the target images may be scaled in any desired fashion.
- the coordinate translations may be accomplished via any conventional or other techniques, and may be performed by the sensing device and/or computer system.
- the target files may contain any information pertaining to the target (e.g., filenames, images, scaling information, indicia size, etc.).
- the target files may be produced by the computer system or other processing system via any conventional or other software and placed on the computer system for operation. Alternatively, the target files may reside on another processing system accessible to the computer system via any conventional or other communications medium (e.g., network, modem/telephone line, etc.), or be available on any type of storage medium.
- the system may be disposed in a case or other storage unit for transport, where the case may be of any size or shape and may be constructed of any suitable materials.
- the sensing device may be of any shape or size, and may be constructed of any suitable materials.
- the sensing device components (e.g., image sensor, memories, FPGA, system clock, reset switch, buffer, I/O ports, power supply, etc.) may be arranged in any suitable fashion.
- the image sensor may be implemented by any conventional or other image sensor (e.g., camera, CCD, matrix or array of light sensing elements, etc.) suitable for detecting the laser beam and/or capturing a target image.
- the sensor may provide gray scale or full color images and include any desired frame rate suitable to detect beam impacts.
- the sensing device may employ any type of light sensing elements, and may utilize a grid or array of any suitable dimension.
- the filter may be implemented by any conventional or other filter having filtering properties for any particular frequency or range of frequencies.
- the lens may be implemented by any suitable lens to view the target.
- the FPGA may be implemented by any suitable hardware and/or software modules to perform the functions described herein (e.g., processor, circuitry, logic, etc.).
- the memories and buffer may be implemented by any type of conventional or other memories or storage units (e.g., DRAM, Flash, buffers, volatile, non-volatile, etc.) and may store any desired information. Since the memories are preferably non-volatile, the sensing device may be continually re-used without a re-calibration even after losing power or a power down (e.g., unless the position of the sensing device or target has changed). This further enables the sensing device to be available with target calibration data pre-loaded into the device. For example, targets may be designed and/or calibrated for the sensing device during device manufacture and immediately used in the field.
- the memories may include sufficient storage capacity to store at least one translation matrix for one or more corresponding targets.
- the sensing device may employ any suitable controllers, connectors and interfaces for communications and may utilize any desired communication protocols (e.g., Ethernet, etc.).
- the communications may be conducted via any communications medium (e.g., LAN, WAN, wired or cables, wireless, etc.).
- the I/O ports may be of any quantity, may be implemented by any type of conventional or other ports (e.g., IR, terminals/pins, etc.) and may provide any suitable I/O to transfer (e.g., transmit and/or receive) information with external devices (e.g., audio/visual indicators, laser devices, etc.).
- the system clock may provide a clock signal of any suitable frequency.
- the signature detector may be included within or coupled to the sensing device and may detect any suitable signature or pattern within any desired energy wave (e.g., laser, light, IR, etc.).
- the signature detector may be implemented by any conventional or other device detecting patterns within transmitted energy signals (e.g., circuitry, processor, etc.).
- the sensing device may utilize (or the power supply may provide) any suitable power signals, preferably in the range of 7V DC to 20V DC.
- the sensing device may receive power signals via unused pins of the network or Ethernet connector.
- the sensing device may be supported by any mounting device (e.g., a tripod, a mounting post, etc.) and positioned at any suitable locations providing access to the target.
- the calibration may utilize any quantity of points on the grid to define the target area, and may map the area to any sized array.
- the grid locations may correspond to any suitable locations within the target confines.
- the sensing device may be positioned at any suitable location and at any desired viewing angle relative to a target.
- the sensing device may be coupled to any port of the computer system via any conventional or other device (e.g., cable, wireless, etc.). Alternatively, the sensing device may provide images to the computer system to determine beam impact locations.
- the sensing device may be configured to detect any energy medium having any modulation, pulse or frequency.
- the laser may be implemented by a transmitter emitting any suitable energy wave.
- the sensing device may transmit any type of information to the computer system to indicate beam impact locations, while the computer system may process any type of information (e.g., X and Y coordinates, image information, etc.) from the sensing device to display and provide feedback information to the user.
- the sensing device may be utilized in any type of target of any shape or size to detect beam impacts thereon.
- the target may include any quantity of sensing devices arranged in any fashion with each associated with a section of any shape or size to detect beam impacts on that section.
- the sensing devices may be coupled in any desired fashion to provide beam impact and other information (e.g., network, daisy chain, individual connections, wired connections, wireless connections, etc.).
- the targets may include any quantity of any suitable diffuser to enable detection of beam impacts by the sensing device.
- the software for the computer system and sensing device may be implemented in any desired computer language and could be developed by one of ordinary skill in the computer arts based on the functional descriptions contained in the specification and flow charts illustrated in the drawings.
- the computer system may alternatively be implemented by any type of hardware and/or other processing circuitry.
- the various functions of the computer system and sensing device may be distributed in any manner between these items and/or among any quantity of software modules, processing systems and/or circuitry.
- the software and/or algorithms described above and illustrated in the flow charts may be modified in any manner that accomplishes the functions described herein.
- the database may be implemented by any conventional or other database or storage structure (e.g., file, data structure, etc.).
- the display screens and reports may be arranged in any fashion and may contain any type of information.
- the screens may include any quantity of any types of input mechanisms (e.g., fields, radio or other buttons, icons, etc.).
- the various parameter or other values may be displayed in the report and/or on the screens in any manner (e.g., charts, bars, etc.) and in any desired form (e.g., actual values, percentages, etc.), while any of the values displayed on the screens may be adjusted by the user via any desired input mechanisms.
- the calibration screen may include a grid of any shape, color or size to facilitate alignment of the sensing device with the target.
- the grid may include any quantity of points for a user to specify and may be associated with any locations of the target.
- the target may be defined within the captured target image in any desired manner via any suitable input mechanisms.
- the target may be defined at any suitable locations within the captured target image, while the selected locations may be indicated by any quantity of any types of indicia of any shape, color or size.
- the translation matrix may be determined by any conventional or other algorithms to compensate for viewing angle and/or correct deformations in the image.
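By way of illustration only, one conventional algorithm for such a translation matrix is the direct linear transform: a 3x3 perspective transform (homography) solved from four point correspondences, such as the user-specified calibration points. This pure-Python sketch assumes four sensor-image points paired with four target-plane points in general position:

```python
def solve_homography(src_pts, dst_pts):
    """Compute the 3x3 perspective transform (homography) mapping four
    sensor-image points to four target-plane points. Assumes the four
    points are in general position (no three collinear)."""
    A, b = [], []
    for (x, y), (u, v) in zip(src_pts, dst_pts):
        # Each correspondence contributes two linear equations in the
        # eight unknown homography entries (h33 is fixed to 1).
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y]); b.append(v)
    n = 8
    # Solve A h = b by Gaussian elimination with partial pivoting.
    M = [row + [rhs] for row, rhs in zip(A, b)]
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    h = [0.0] * n
    for r in range(n - 1, -1, -1):
        h[r] = (M[r][n] - sum(M[r][c] * h[c] for c in range(r + 1, n))) / M[r][r]
    h.append(1.0)  # h33 = 1
    return [h[0:3], h[3:6], h[6:9]]

def apply_homography(H, x, y):
    """Map a sensor pixel (x, y) into target coordinates."""
    w = H[2][0] * x + H[2][1] * y + H[2][2]
    return ((H[0][0] * x + H[0][1] * y + H[0][2]) / w,
            (H[1][0] * x + H[1][1] * y + H[1][2]) / w)
```

Mapping a skewed quadrilateral seen by the sensor onto a rectangular target space in this way compensates for both the viewing angle and perspective deformation in the image.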
- the calibration data may be stored in the computer system and/or sensing device.
- the density value may be determined with any weights having any desired value or types of values (e.g., integer, real, etc.).
- the weights and pixel component values may be utilized in any desired combination to produce a pixel density.
- any quantity of pixel values within any quantity of images may be manipulated in any desired fashion (e.g., accumulated, averaged, multiplied by each other or weight values, etc.) to determine the presence and location of a beam impact within an image.
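By way of illustration only, a minimal sketch of a weighted pixel-density computation and impact search; the luminance-style weights are an assumption for illustration, not values from this specification:

```python
def pixel_density(rgb, weights=(0.30, 0.59, 0.11)):
    """Weighted combination of a pixel's color components. The weights
    shown approximate luminance and are illustrative only."""
    r, g, b = rgb
    return weights[0] * r + weights[1] * g + weights[2] * b

def find_impact(image, threshold):
    """Return (row, col) of the highest-density pixel exceeding the
    threshold, or None when no beam impact is present in the frame.
    image is a 2-D list of (r, g, b) tuples."""
    best, best_density = None, threshold
    for row, line in enumerate(image):
        for col, px in enumerate(line):
            d = pixel_density(px)
            if d > best_density:
                best, best_density = (row, col), d
    return best
```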
- any quantity of density and/or pixel values within any quantity of images may be manipulated in any desired fashion (e.g., accumulated, averaged, multiplied by each other or weight values, etc.) to determine the threshold and light conditions.
- the threshold may be determined periodically, in response to any desired light or other conditions (e.g., light conditions are outside any desired range or have any desired change in value, etc.) or in response to a user, and may be set by the computer system and/or user to any desired value.
- the system may alternatively utilize gray scale or any type of color images (e.g., pixels having gray scale, RGB or other values) and manipulate any quantity of pixel values within any quantity of images in any desired fashion to determine the threshold, light conditions and presence and location of a beam impact.
- the system may utilize any quantity of thresholds each associated with a region of any size or shape.
- the threshold offset may be of any desired values based on the user desired sensitivity.
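By way of illustration only, the threshold derivation described above may be sketched as an average ambient density from beam-free frames plus a user-sensitivity offset (the offset value is a hypothetical placeholder):

```python
def ambient_threshold(densities, offset=25.0):
    """Derive a detection threshold from pixel densities sampled from one
    or more beam-free frames: average ambient density plus an offset.
    A larger offset lowers sensitivity (fewer false detections)."""
    ambient = sum(densities) / len(densities)
    return ambient + offset
```

Recomputing this value periodically, or whenever the ambient average drifts outside a desired range, lets the detector track changing light conditions.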
- the indicia indicating beam impact locations and other information may be of any quantity, shape, size or color and may include any type of information.
- the indicia may be placed at any locations and be incorporated into or overlaid with the target images or video.
- the system may produce any desired type of display or report having any desired information.
- the computer system may determine scores or other activity information based on any desired criteria.
- the computer system may poll the sensing device or the sensing device may transmit images and/or coordinates at any desired intervals for the tracing and/or plot modes or sensing functions.
- the sensing device may detect the laser beam continuously for any desired interval to initiate the tracing and/or plot modes.
- the indicia for the tracing and/or plot modes may be of any quantity, shape, size or color and may include any type of information.
- the tracing indicia may be placed at any locations and be incorporated into or overlaid with the target images.
- the tracing indicia may be flashing or continuously appearing on the display.
- the trace and/or plot modes may be implemented with any of the screens described above and may display any quantity of previous impact locations to show movement of the firearm.
- the system may be configured for use with a transmitter emitting a laser beam having any desired pulse width, and may provide any type of message or other indication when the pulse width of a laser beam detected by the system is not compatible with the system configuration.
- the system may be configured to detect and process beam impact locations at any desired shot rate.
- the systems may utilize any conventional or other techniques to convert between the various image spaces, and may compensate for any desired sensing device position and/or viewing angle.
- the system may be utilized with targets scaled in any fashion to simulate conditions at any desired ranges, and may utilize lasers having sufficient power to be detected at any desired scaled range.
- the calibrations for the sensing device (e.g., target alignment, light sensitivity, etc.) may be initiated by a user as described above, or may be performed by the sensing device and/or computer system periodically or in response to detection of conditions (e.g., light conditions, detection of target position or orientation, etc.).
- the target alignment may be performed in conjunction with user specified points as described above. Alternatively, the calibration may be performed automatically (without the user specifying the calibration points).
- the user enables the image sensing device to view the entire target, where the sensing device and/or computer system employs conventional or other image recognition techniques to determine (e.g., by shape, color or other criteria) any suitable calibration points (e.g., the corners, midsections and/or center of the image).
- the calibration points may be utilized in the calibration to compensate for perspective and radial distortions in the manner described above.
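By way of illustration only, a minimal stand-in for the image recognition step: locating four extreme corner pixels of the target region in a binary mask. A real system could use more robust shape- or color-based recognition techniques:

```python
def find_corner_points(mask):
    """Locate four extreme target pixels, usable as automatic calibration
    corners, in a binary mask where True marks target pixels. Picks the
    pixels minimizing/maximizing x+y and x-y; coordinates are (x, y)."""
    pts = [(x, y) for y, row in enumerate(mask) for x, v in enumerate(row) if v]
    top_left = min(pts, key=lambda p: p[0] + p[1])
    bottom_right = max(pts, key=lambda p: p[0] + p[1])
    top_right = max(pts, key=lambda p: p[0] - p[1])
    bottom_left = min(pts, key=lambda p: p[0] - p[1])
    return top_left, top_right, bottom_right, bottom_left
```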
- The terms “top”, “bottom”, “side”, “upper”, “lower”, “front”, “rear”, “horizontal”, “vertical”, “right”, “left” and the like are used herein merely to describe points of reference and do not limit the present invention to any specific configuration or orientation.
- the present invention is not limited to the applications disclosed herein, but may be utilized for any type of firearm training, qualification, competition, gaming or entertainment applications.
- the invention makes available a novel sensing device for a firearm laser training system and method of simulating firearm operation with various training scenarios, wherein a firearm laser training system accommodates various types of targets for facilitating a variety of firearm training activities.
Description
- This application claims priority from U.S. Provisional Patent Application Ser. No. 60/752,586, entitled “Sensing Device For Firearm Laser Training System and Method of Simulating Firearm Operation With Various Training Scenarios” and filed Dec. 22, 2005, the disclosure of which is incorporated herein by reference in its entirety.
- 1. Technical Field
- The present invention pertains to firearm training systems, such as those disclosed in U.S. Pat. No. 6,322,365 (Shechter et al.), U.S. Pat. No. 6,616,452 (Clark et al.) and U.S. Pat. No. 6,966,775 (Kendir et al.), and U.S. Patent Application Publication Nos. 2002/0197584 (Kendir et al.) and 2005/0153262 (Kendir), the disclosures of which are incorporated herein by reference in their entireties. In particular, the present invention pertains to a firearm laser training system that accommodates various types of targets for facilitating a variety of firearm training activities.
- 2. Discussion of Related Art
- Firearms are utilized for a variety of purposes, such as hunting, sporting competition, law enforcement and military operations. The inherent danger associated with firearms necessitates training and practice in order to minimize the risk of injury. However, special facilities are required to facilitate practice of handling and shooting the firearm. These special facilities tend to provide a sufficiently sized area for firearm training and/or confine projectiles propelled from the firearm within a prescribed space, thereby preventing harm to the surrounding environment. Accordingly, firearm trainees are required to travel to the special facilities in order to participate in a training session, while the training sessions themselves may become quite expensive since each session requires new ammunition for practicing handling and shooting of the firearm.
- In addition, firearm training is generally conducted by several organizations (e.g., military, law enforcement, firing ranges or clubs, etc.). Each of these organizations may have specific techniques or manners in which to conduct firearm training and/or qualify trainees. Accordingly, these organizations tend to utilize different types of targets, or may utilize a common target, but with different scoring criteria. Further, different targets may be employed by users for firearm training or qualification to simulate particular conditions or provide a specific type of training (e.g., grouping shots, hunting, clay pigeons, etc.).
- The related art has attempted to overcome the above-mentioned problems by utilizing laser or light energy with firearms to simulate firearm operation and indicate simulated projectile impact locations on targets. For example, U.S. Pat. No. 4,164,081 (Berke) discloses a marksman training system including a translucent diffuser target screen adapted for producing a bright spot on the rear surface of the target screen in response to receiving a laser light beam from a laser rifle on the target screen front surface. A television camera scans the rear side of the target screen and provides a composite signal representing the position of the light spot on the target screen rear surface. The composite signal is decomposed into X and Y Cartesian component signals and a video signal by a conventional television signal processor. The X and Y signals are processed and converted to a pair of proportional analog voltage signals. A target recorder reads out the pair of analog voltage signals as a point, the location of which is comparable to the location on the target screen that was hit by the laser beam.
- U.S. Pat. No. 5,281,142 (Zaenglein, Jr.) discloses a shooting simulation training device including a target projector for projecting a target image in motion across a screen, a weapon having a light projector for projecting a spot of light on the screen, a television camera and a microprocessor. An internal device lens projects the spot onto a small internal device screen that is scanned by the camera. The microprocessor receives various information to determine the location of the spot of light with respect to the target image.
- U.S. Pat. No. 5,366,229 (Suzuki) discloses a shooting game machine including a projector for projecting a video image that includes a target onto a screen. A player may fire a laser gun to emit a light beam toward the target on the screen. A video camera photographs the screen and provides a picture signal to coordinate computing means for computing the X and Y coordinates of the beam point on the screen.
- International Publication No. WO 92/08093 (Kunnecke et al.) discloses a small arms target practice monitoring system including a weapon, a target, a light-beam projector mounted on the weapon and sighted to point at the target and a processor. An evaluating unit is connected to a camera to determine the coordinates of the spot of light on the target. A processor is connected to the evaluating unit and receives the coordinate information. The processor further displays the spot on a target image on a display screen.
- The systems described above suffer from several disadvantages. In particular, the Berke, Zaenglein, Jr. and Suzuki systems employ particular targets or target scenarios, thereby limiting the types of firearm training activities and simulated conditions provided by those systems. Further, the Berke system utilizes both front and rear target surfaces during operation. Thus, placement of the target is restricted to areas having sufficient space for exposure of those surfaces to a user and the system. In addition, the Berke and Kunnecke et al. systems merely display impact locations to a user, thereby requiring a user to interpret the display to assess user performance during an activity. The user assessment is typically limited to the information (impact locations) provided on the display, thereby restricting feedback of valuable training information to the user and limiting the training potential of the system.
- Accordingly, it is an object of the present invention to accommodate various types of targets within a firearm laser training system to conduct varying types of training, qualification and/or entertainment activities.
- It is another object of the present invention to employ an image sensing device with a firearm laser training system that detects beam impact locations on a target and compensates for various orientations and viewing angles of the device relative to the target.
- Yet another object of the present invention is to employ user-specified targets within a firearm laser training system to conduct desired training procedures.
- A further object of the present invention is to assess user performance within a firearm laser training system by determining scoring and/or other performance information based on detected impact locations of simulated projectiles on a target.
- The aforesaid objects may be achieved individually and/or in combination, and it is not intended that the present invention be construed as requiring two or more of the objects to be combined unless expressly required by the claims attached hereto.
- According to the present invention, a firearm laser training system accommodates various types of targets for facilitating a variety of firearm training activities. The system employs an image sensing device to detect laser beam impacts on a target, where the laser beam is projected from an actual or simulated user firearm equipped with a laser transmitter. The image sensing device compensates for image distortions and the sensing device viewing angle with respect to the intended target, and enables detection of laser beam impacts on various types of targets (e.g., paper targets, projected targets, videos, still or moving images, etc.) to provide firearm training with varying scenarios. The image sensing device provides impact location information to a computer system to graphically display the impact locations and provide information pertaining to user performance of the training activity.
- The above and still further objects, features and advantages of the present invention will become apparent upon consideration of the following detailed description of specific embodiments thereof, particularly when taken in conjunction with the accompanying drawings wherein like reference numerals in the various figures are utilized to designate like components.
- FIG. 1 is a view in perspective of a firearm laser training system with a laser beam directed from a firearm onto an intended target according to the present invention.
- FIG. 2 is an exploded view in perspective and partial section of a laser transmitter assembly of the system of FIG. 1 fastened to a firearm barrel.
- FIG. 3 is a schematic block diagram of an image sensing device according to the present invention.
- FIG. 4 is a procedural flow chart illustrating the manner in which the image sensing device is calibrated according to the present invention.
- FIGS. 5-6 are schematic illustrations of exemplary graphical user screens displayed by the system of FIG. 1 for calibrating the image sensing device.
- FIG. 7A is a side view in elevation and partial section of a target including an image sensing device to detect laser beam impact locations according to the present invention.
- FIG. 7B is a front view in elevation of a target including a plurality of image sensing devices each associated with a target section to detect laser beam impact locations according to the present invention.
- FIG. 8 is a procedural flow chart illustrating the manner in which the system of FIG. 1 processes and displays laser beam impact locations according to the present invention.
- FIG. 9 is a schematic illustration of an exemplary graphical user screen displayed by the system of FIG. 1 during system operation.
- FIG. 10 is a schematic illustration of an alternative exemplary graphical user screen displayed by the system of FIG. 1 during system operation.
- FIG. 11 is a schematic illustration of the exemplary graphical user screen of FIG. 10 during operation of the system in a trace mode.
- FIG. 12 is a schematic illustration of the exemplary graphical user screen of FIG. 10 during operation of the system in a plot mode.
- FIG. 13 is a schematic illustration of an exemplary graphical user screen displayed by the system of FIG. 1 during operation of the system for a shotgun course of fire.
- FIG. 14 is a schematic illustration of an exemplary graphical user screen displayed by the system of FIG. 1 during operation of the system for a course of fire.
- FIGS. 15-17 are schematic illustrations of exemplary graphical user screens displayed by the system of FIG. 1 during operation of the system with various targets.
- FIG. 18 is a schematic illustration of an exemplary graphical user screen displayed by the system of FIG. 1 during operation of the system with a target in the form of a video segment.
- FIG. 19 is a schematic illustration of an exemplary graphical user screen displayed by the system of FIG. 1 during operation of the system with a simulation of an indoor firing range.
- FIG. 20 is a schematic illustration of an exemplary graphical user screen displayed by the system of FIG. 1 during operation of the system with a simulation of a live firing range.
- FIG. 21 is a schematic illustration of an exemplary graphical user screen displayed by the system of FIG. 1 during operation of the system with a simulation of a skeet shoot session.
- A firearm laser training system that accommodates various types of targets according to the present invention is illustrated in
FIG. 1. Specifically, the firearm laser training system includes a laser transmitter assembly 200, a target 10, an image sensing device 16 and a computer system 18. The laser assembly is attached to an unloaded user firearm 76 to adapt the firearm for compatibility with the training system. By way of example only, firearm 76 is implemented by a conventional hand-gun and includes a trigger 77, a barrel 78, a hammer 79 and a grip 85. However, the firearm may be implemented by any conventional firearms (e.g., hand-gun, rifle, shotgun, etc.), while the laser and firearm combination may be implemented by any of the simulated firearms disclosed in the above-mentioned patents and patent application publications. Laser assembly 200 emits a beam 11 of visible laser light in response to actuation of trigger 77. The laser assembly is configured for insertion within barrel 78 to fasten the laser assembly to the barrel as described below.
- A user aims unloaded
firearm 76 at target 10 and actuates trigger 77 to project laser beam 11 from laser assembly 200 toward the target. Sensing device 16 detects the laser beam impact location on the target and provides location information to computer system 18. Target 10 may be in the form of a paper target, a television or computer monitor, or an LCD or other panel or display displaying images serving as a target. The computer system processes the location information and displays simulated projectile impact locations on the target via a graphical user screen (e.g., FIGS. 9-21) as described below. The system may further include a projector 17 to provide targets in the form of projected images 14. Sensing device 16 detects the laser beam impact location on the projected image and provides location information to computer system 18 as described above. The computer system processes the location information and displays simulated projectile impact locations on the projected image via a graphical user screen (e.g., FIGS. 15-21) as described below. In addition, the computer system may determine scoring and other information based upon the performance of a user. - The system may be utilized with various types of targets to facilitate firearm training and/or qualifications (e.g., certification to a particular level or to use a particular firearm). The system may additionally be utilized for entertainment purposes (e.g., in target shooting games or sporting competitions). By way of example only,
target 10 may be implemented by a two-dimensional target, preferably constructed of paper or other material, and attached to or suspended from a supporting structure, such as a wall. The target preferably includes indicia forming a transitional type target having a silhouette of a person with several sections or zones (e.g., typically between five and seven) defined therein. The target sections may each be assigned a value in order to determine a score for a user. The sections and values typically vary based on the system application and/or particular organization (e.g., military, law enforcement, firearm club, etc.) utilizing the system. Further, plural target sections (e.g., contiguous or non-contiguous) may be associated with a common value, while each section may be of any shape or size. The score is determined by accumulating the values of the target sections impacted by the laser beam during the firearm activity. The values of the target sections may further be multiplied by a scoring factor set by the system and/or the user to accommodate various scoring schemes utilized by different organizations. Alternatively, the target may include an image produced by a projector, a television or computer screen and/or an LCD or other display panel, where these targets may similarly include sections or zones each assigned a corresponding value as described above. - The computer system receives the beam impact locations from the sensing device and retrieves the section values corresponding to the impact locations as described below. Section values for each beam impact are accumulated to produce a score for a user. The target may be of any shape or size, may be constructed of any suitable materials and may include any indicia to provide any type of target for facilitating any type of training, qualification, gaming, entertainment or other activity.
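By way of illustration only, the accumulation of section values and application of a scoring factor described above may be sketched as follows; the zone names and values are hypothetical and not part of the disclosed system:

```python
# Sketch of the scoring scheme described above: each target section (zone)
# carries a value, the values of impacted zones are accumulated, and a
# configurable scoring factor scales the total. The zone names and values
# below are hypothetical examples only.

ZONE_VALUES = {"center_mass": 5, "head": 5, "shoulder": 3, "limb": 2, "miss": 0}

def score_session(hit_zones, scoring_factor=1.0):
    """Accumulate the zone values for each beam impact, scaled by a factor."""
    return sum(ZONE_VALUES[zone] for zone in hit_zones) * scoring_factor

# e.g. score_session(["center_mass", "limb", "miss"], scoring_factor=2.0) -> 14.0
```

Different organizations' scoring schemes would then be accommodated by changing only the zone values and the scoring factor, not the detection logic.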
Moreover, the system may utilize any conventional, simulated or “dry fire” type firearms (e.g., hand-gun, rifle, shotgun, firearms powered by air/carbon dioxide, etc.), or firearms utilizing blank cartridges such as those disclosed in the above-mentioned patents and patent application publications, for projecting a laser beam to provide full realism in a safe environment.
- An exemplary laser transmitter assembly employed by the training system is illustrated in
FIG. 2. Specifically, laser transmitter assembly 200 includes a barrel sleeve 202, a power source 210, typically in the form of batteries, a laser module 211 and a sleeve cap 216. The barrel sleeve includes a generally cylindrical barrel member 204 and a threaded stop 206 disposed at the barrel member distal end. The transverse cross-sectional dimensions of the barrel member are slightly less than those of barrel 78 to enable the barrel member to be inserted within the barrel. The barrel member includes an adjustment member 203 disposed at the barrel member proximal end. The adjustment member is typically in the form of a screw and adjusts the barrel member dimensions in response to manipulation by a user. The adjustment member alters the barrel member dimensions to enable the barrel member to frictionally engage the barrel and provide a snug fit. Stop 206 is in the form of a substantially annular ring and has dimensions slightly greater than those of the barrel member and barrel to limit insertion of the sleeve within the barrel. The stop outer surface includes threads facilitating engagement with sleeve cap 216 as described below. Power source 210 has dimensions sufficient for insertion within the barrel sleeve to provide power to laser module 211. -
Laser module 211 includes a substantially cylindrical housing 212 including therein a mechanical wave sensor (not shown) and an optics package (not shown) including a laser and a lens. These components may be arranged within the housing in any suitable fashion. The optics package emits a laser beam through the lens toward an intended target in response to detection of hammer fall by the mechanical wave sensor. Specifically, when trigger 77 is actuated, hammer 79 impacts the firearm frame. The impact generates a mechanical wave which travels distally along barrel 78 toward barrel sleeve 202. As used herein, the term "mechanical wave" or "shock wave" refers to an impulse (e.g., acoustic wave, shock wave, vibration along the barrel, etc.) traveling through the barrel. The mechanical wave sensor within the laser module senses the mechanical wave and generates a trigger signal. The mechanical wave sensor may include a piezoelectric element, an accelerometer or a solid state sensor, such as a strain gauge. The optics package within the laser module generates and projects a laser beam from firearm 76 in response to the trigger signal. The optics package laser is generally enabled for a predetermined time interval sufficient for image sensing device 16 to detect a beam impact. The beam may be modulated, coded or pulsed in any desired fashion. Alternatively, the laser module may include an acoustic sensor to sense actuation of the trigger and enable the optics package. The laser module is similar in function to the laser devices disclosed in the aforementioned patents and patent application publications. The laser assembly may be constructed of any suitable materials and may be fastened to firearm 76 at any suitable locations by any conventional or other fastening techniques. - The optics package of
laser module 211 is generally disposed toward the housing distal end, while the mechanical wave sensor is typically disposed toward the housing proximal end to detect the mechanical wave. Sleeve cap 216 is substantially cylindrical, having an open proximal end and a closed distal end. The cap includes threads disposed on its interior surface to engage the threads of stop 206. The closed distal end of the cap includes a substantially central opening 220 defined therein to enable a laser beam emitted by the laser module to pass through the cap. - In operation,
power source 210 and laser module 211 are inserted into barrel sleeve 202. Sleeve cap 216 is subsequently attached to stop 206 via their respective threaded portions. The barrel sleeve is inserted into the barrel, preferably until stop 206 contacts the barrel distal end. The barrel sleeve dimensions may be selectively adjusted by manipulation of the adjustment member as described above to provide a secure fit. Laser transmitter assembly 200 basically emits a laser beam from laser module 211 through opening 220 of cap 216 in response to firearm actuation as described above. - Referring back to
FIG. 1, computer system 18 is coupled to and receives and processes information from sensing device 16 to provide various feedback to a user. The computer system is typically implemented by a conventional IBM-compatible laptop or other type of personal computer (e.g., notebook, desktop, mini-tower, Apple Macintosh, palm pilot, etc.) preferably equipped with a display or monitor 34, a base 32 (e.g., including the processor, memories, and internal or external communication devices or modems) and a keyboard 36 (e.g., in combination with a mouse or other input devices). Computer system 18 includes software to enable the computer system to communicate with sensing device 16 and provide feedback to the user. The computer system may utilize any of the major platforms (e.g., Linux, Macintosh, Unix, OS2, etc.), but preferably includes a Windows environment (e.g., Windows 95, 98, NT, 2000, XP, etc.). Further, the computer system includes components (e.g., processor, disk storage or hard drive, etc.) having sufficient processing and storage capabilities to effectively execute the system software. -
Computer system 18 is connected to sensing device 16 via an Ethernet type connection. The sensing device may be mounted on a tripod and positioned at a suitable distance from the target. However, any type of mounting or other structure may be utilized to support the sensing device. The sensing device detects the location of beam impacts on the target (e.g., by capturing an image of the target and detecting the location of the beam impact from the captured image) and provides impact location information in the form of X and Y coordinates to computer system 18. The sensing device performs a calibration to correlate the captured image space with the target space and/or display space as described below. A printer (not shown) may further be connected to the computer system to print reports containing user feedback information (e.g., score, hit/miss information, etc.). - The system may be utilized with various types of targets. Target characteristics are contained in several files that are stored by
computer system 18. In particular, a desired target may be photographed and/or scanned prior to system utilization to produce several target files and target information. Alternatively, images of user-generated targets may be captured via sensing device 16 and optionally manipulated to form a target image, while computer system 18 or another computer system (e.g., via the training system or conventional software) may be utilized to produce the target files and target information for use by the system. A target file includes a parameter file, a display image file, a scoring image file and a print image file. The parameter file includes information to enable the computer system to control system operation. By way of example only, the parameter file includes the filenames of the display, scoring and print image files, a scoring factor and cursor information (e.g., grouping criteria, such as circular shot group size). The display and print image files include an image of the target scaled to particular sections of the monitor and report containing that image, respectively. Indicia, preferably in the form of substantially circular icons, are overlaid on these images to indicate beam impact locations (e.g., FIGS. 9-10 and 14), and typically include an identifier to indicate the particular shot (e.g., the position number of the shot within a shot sequence). The dimensions of the indicia may be adjusted to simulate different ammunition or firearm calibers entered by a user. - The scoring image file includes a scaled scoring image of the target having scoring sections or zones shaded with different colors. Any variation of colors may be utilized, and the colors are each associated with corresponding information for that zone. The zone information typically includes scoring values, but may include any other types of activity information (e.g., target number, desirable/undesirable hit location, priority of hit location, friend/foe, etc.).
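By way of illustration only, a parameter file of the kind described above might be represented as follows; the key names and filenames are hypothetical assumptions, as the disclosure does not specify a file format:

```python
# Hypothetical representation of a target parameter file, following the
# fields enumerated above: the filenames of the display, scoring and print
# image files, a scoring factor, and cursor/grouping criteria. The key
# names and values are assumptions for illustration only.
import json

parameter_file = {
    "display_image": "silhouette_display.png",
    "scoring_image": "silhouette_scoring.png",
    "print_image": "silhouette_print.png",
    "scoring_factor": 1.0,
    "cursor": {"shot_group_diameter_mm": 50},
}

def load_parameters(text):
    """Parse a parameter file and return the image filenames it references."""
    params = json.loads(text)
    return [params[k] for k in ("display_image", "scoring_image", "print_image")]

# load_parameters(json.dumps(parameter_file))
# -> ['silhouette_display.png', 'silhouette_scoring.png', 'silhouette_print.png']
```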
When impact location information is received from the sensing device,
computer system 18 utilizes that information to access a corresponding location within the scoring image. The sensing device may correlate the captured image space directly to the display, scoring and/or print image files. In this case, the received coordinates may be used to directly access the corresponding locations in these files. Alternatively, the computer system may perform a translation on the received coordinates to access the locations within the display, scoring and/or print image files corresponding to the detected beam impact locations. - The color associated with the image location identified by the location information indicates a corresponding zone and/or scoring value. In effect, the colored scoring image functions as a look-up table to provide a zone value based on the location within the scoring image pertaining to a particular beam impact location. The scoring value of an impact location may be multiplied by a scoring factor within the parameter file to provide scores compatible with various organizations and/or scoring schemes. Thus, the scoring of the system may be adjusted by modifying the scoring factor within the parameter file and/or the scoring zones on the scoring image within the scoring image file. Alternatively, when other activity information is associated with the zones, the scoring image file may indicate occurrence of various events (e.g., hit/miss of target locations, target sections impacted based on priority, hit friend or foe, etc.) in substantially the same manner described above.
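By way of illustration only, the use of the colored scoring image as a look-up table may be sketched as follows; the particular color-to-value mapping is a hypothetical assumption:

```python
# Sketch of the look-up described above: the pixel color of the scoring
# image at a beam-impact coordinate identifies the zone, and the zone's
# value (scaled by the parameter-file scoring factor) is the score for
# that impact. The color-to-value mapping below is hypothetical.

COLOR_TO_VALUE = {
    (255, 0, 0): 10,   # e.g. a red-shaded zone
    (0, 255, 0): 5,    # e.g. a green-shaded zone
    (0, 0, 255): 1,    # e.g. a blue-shaded zone
}

def zone_value(scoring_image, x, y, scoring_factor=1.0):
    """Return the scaled zone value at impact coordinates (x, y).

    scoring_image is a 2-D list of (R, G, B) tuples indexed [y][x];
    a color absent from the table scores zero (a miss).
    """
    color = scoring_image[y][x]
    return COLOR_TO_VALUE.get(color, 0) * scoring_factor

# Two-pixel scoring image: red zone at (0, 0), green zone at (1, 0).
image = [[(255, 0, 0), (0, 255, 0)]]
# zone_value(image, 0, 0) -> 10.0
```

Adjusting the scoring then amounts to repainting zones in the scoring image or changing the scoring factor, exactly as the passage above describes.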
- In addition, the target files typically include a second display file containing a scaled image of the target. The dimensions of this image are substantially greater than those of the image contained in the initial display image file, and the second display file is preferably utilized to display a target having plural independent target sites. The target files, along with scaling and other information (e.g., target range information input by a user), are stored on
computer system 18 for use during system operation. Thus, the system may readily accommodate any type of target without interchanging system components. Moreover, target files may be downloaded from a network, such as the Internet, and loaded into the computer system to enable the system to access and be utilized with additional targets. -
Image sensing device 16 detects laser beam impact locations on the target and provides the X/Y or Cartesian coordinates of the impact location to computer system 18 for processing. The image sensing device is illustrated in FIG. 3. Specifically, image sensing device 16 includes an image sensor 102, an image buffer 104, memories 106, 108 and 110, an FPGA 112, an Ethernet controller 118 providing Ethernet connections and a power supply 126 providing power for the sensing device. Image sensor 102 captures images of an intended target and employs charge-coupled device (CCD) or CMOS technology. The image sensor repeatedly captures an image of the target, preferably through a lens 131, and provides target image information to FPGA 112 for processing. The FPGA may store image information in image buffer 104. -
Image sensor 102 preferably provides image data in monochromatic form (e.g., gray scale or black and white, where red (R), blue (B) and green (G) pixel values are substantially the same); however, the image sensor may alternatively provide full color images (e.g., including red (R), blue (B) and green (G) pixel values). The sensing device may further employ a bandpass type filter 129 to enable light from lens 131 having the wavelength of the laser transmitter (e.g., approximately 650 nanometers) to pass to the image sensor, while suppressing the ambient light. The lens initially focuses and provides ambient light in parallel rays for filtering by filter 129. - The image characteristics of the image sensor enable the device to capture images of the target including any changes to the target (e.g., beam impacts) occurring between successive frame transmissions. Thus, the sensing device facilitates detection of beam impacts from laser transmitters with pulse durations less than the frame period of the image sensor (e.g., pulse durations as low as approximately 1.5 milliseconds). Since the sensing device can accommodate low pulse durations, the sensing device may be universally employed with various laser transmitters.
-
FPGA 112 controls operation of sensing device 16 and is coupled to image sensor 102, image buffer 104, memories 106, 108 and 110, a system clock 114, a reset switch 116, Ethernet controller 118 and I/O ports. -
System clock 114 provides a clock signal for the control functions, while reset switch 116 enables a user to reset or reboot the sensing device. - In addition, the FPGA may perform image decimation and hit synchronization functions. Image decimation relates to generation of lower resolution images (e.g., selection of a portion of pixels from an image) to enhance transfer rates over a network. Hit synchronization is employed when sensing
device 16 is utilized with or includes a signature detector 127 to detect signatures within an emitted laser beam. Each laser beam may include a signature to identify the particular user transmitting that beam. The signatures may be embedded within the laser beam in any desired fashion (e.g., modulation, pulse widths, etc.), while the signature detector may be implemented by any conventional or other devices to detect patterns or signatures within the transmitted laser signal. The FPGA determines the location of beam impacts as described below, while signature detector 127 determines the user providing the beam impact, thereby validating the beam impact as being from an authorized user (e.g., as opposed to registering a beam impact detection from ambient light). The signature detector further enables detection of beam impacts in the presence of significant levels of ambient light (e.g., sunlight) since an impact is registered only in response to detection of the signature. In effect, the signature detector serves as a filter for detected beam impacts to eliminate or minimize false detections due to ambient light. In the case where signature detector 127 is employed, the FPGA coordinates detection of impact locations with verification of laser beam signatures by the signature detector to associate impacts with particular or verified users. -
Memory 106 is preferably implemented by a flash type memory and stores software for FPGA 112 to perform image processing and/or control functions. Memory 108 is preferably implemented by a synchronous dynamic random access memory (SDRAM) and stores various settings for calibration and other functions, while memory 110 is preferably implemented by a flash type memory and stores calibration data. Thus, when the sensing device is calibrated as described below, the resulting data is stored in sensing device 16. -
Sensing device 16 communicates with computer system 18 via an Ethernet or other network type connection. Ethernet controller 118 is coupled to FPGA 112 and to an Ethernet interface 120. The Ethernet controller controls communications with the computer system and is coupled to an Ethernet connector 122 via Ethernet interface 120. These components may be implemented by any conventional or other network components to provide communications with the computer system via any suitable protocols (e.g., Ethernet, etc.). -
Sensing device 16 may process images from various viewing angles and orientations and compensates for deformations in the captured image due to the device position and/or lenses employed over the image sensor (e.g., fisheye, etc.). In order to compensate for the deformations, a calibration is performed as illustrated in FIGS. 4-6. Initially, the sensing device is coupled to the computer system, with the computer system being initialized for communications (e.g., assigning a network address to the sensing device, projector, etc.) at step 150. A target (e.g., paper, projected image, television or computer screen, LCD panel, etc.) is provided at step 152, and a calibration icon on the computer system display is actuated. In the case of a projected image, the projector is initialized to provide the image at a desired location and with a suitable alignment, size and/or form. The computer system verifies connection of the sensing device and subsequently displays a calibration graphical user screen 130 (FIGS. 5-6) at step 154. -
Screen 130 includes a viewing area 132, instruction areas 133, a gain slide 136 and buttons. Instruction areas 133 provide instructions to a user, while viewing area 132 displays an image of the target from sensing device 16 and a grid 134 to identify image boundaries for the calibration. The grid is substantially rectangular and includes a series of labeled points along the grid perimeter. By way of example only, the grid includes eight labeled points 1-8 identifying locations of the target image to the sensing device. Four points are located at respective corners of the grid, while the remaining four points are located at the approximate midpoint between the grid corners along each of the grid longer and shorter dimensioned edges. An additional point for the calibration is located at the center of the grid. These grid points correspond to particular locations (e.g., top left corner, top middle, top right corner, etc.) or coordinates within the target or an ideal target image. For example, point 1 may correspond to pixel coordinates 0,0 within an ideal image. These ideal image points are used in the calibration as described below. - Initially, gain
slide 136 is manipulated to enable the target image captured by the sensing device to be visible in viewing area 132 at step 156. The sensing device position and/or lens may further be adjusted to enable the target to fill the viewing area and/or to adjust the focus. Once the target is viewable in viewing area 132, the user manipulates each perimeter point on the grid in sequence to coincide with the corresponding location on the target within the captured image in viewing area 132 at step 158 (FIG. 6). The grid center point is subsequently manipulated to coincide with the target center in the captured image. These points indicated by the user identify points on the distorted image corresponding to the grid points and are utilized in the calibration as described below. If the target in the captured image is rotated or otherwise displaced by the viewing angle and/or position of the sensing device, the user indication of the grid points in the captured image, in combination with the corresponding locations of the grid points in an ideal or undistorted image, enables the system to detect and compensate for the displacement as described below. - Once the grid is shifted to the image, the sensing device receives the target image boundaries from the computer system and the FPGA processes the target image to determine a translation matrix at
step 160 that compensates for the angle and/or orientation of the sensing device and the distortion produced by lens 131 within the captured image. The translation matrix maps each image pixel to a corresponding coordinate in the target and/or display spaces (e.g., computer system display, etc.). The primary causes of errors with respect to detection of beam impacts are radial distortion caused by the sensing device lens, whose magnification at the center of the image is slightly greater than the magnification at the edges, and perspective distortion caused by the sensing device position and angle relative to the target. The calibration technique produces the translation matrix to compensate for the position and angle of the sensing device and to correct the distortions caused by the lens. - The user manipulates the grid to identify each of the points on the distorted image as described above. The real or actual coordinates of the calibration points (e.g., the coordinates of the grid points in an ideal image) and the coordinates on the distorted image (e.g., indicated by the user) are used to calculate distortion parameters. The radial and perspective distortions of the image may be modeled by conventional techniques, such as those disclosed in R. Y. Tsai, "A Versatile Camera Calibration Technique for High-Accuracy 3D Machine Vision Metrology Using Off-The-Shelf TV Cameras and Lenses", IEEE Journal of Robotics and Automation, Vol. RA-3, No. 4, August 1987, the disclosure of which is incorporated herein by reference in its entirety.
- The position and orientation of the sensing device relative to the target and radial distortion parameters are estimated using the known locations (e.g., the points indicated by the user on the distorted image and the locations of the grid points in an ideal image). This estimation may be determined according to conventional techniques, such as those disclosed in D. G. Bailey, “A New Approach to Lens Distortion Correction”, Proceedings of Image and Vision Computing, New Zealand, pp. 59-64, November 2002, the disclosure of which is incorporated herein by reference in its entirety.
- The perspective distortion is corrected by using a projective transformation. Subsequently, the lens distortion is corrected using the determined radial distortion parameters. The coordinates obtained from this modeling are compared with corresponding coordinates measured from the image to provide an error term. The distortion parameters are adjusted to minimize this error. This may be accomplished in accordance with conventional techniques, such as those disclosed in G. Vass et al., “Applying and Removing Lens Distortion in Post Production”, The Second Hungarian Conference on Computer Graphics and Geometry, 2003, the disclosure of which is incorporated herein by reference in its entirety. The resulting coordinates with minimal error are placed within the translation matrix.
- Once the FPGA detects a beam impact within a captured image, the corresponding coordinates from the translation matrix translate the impact to target and/or display spaces and are provided to the computer system for processing. The translation matrix is stored in the sensing device for future use (without a re-calibration). The sensing device (or translation matrix) may correlate the captured image space directly to the display, scoring and/or print image files as described above. In this case, the coordinates provided by the sensing device may be used to directly access the corresponding locations in these files. Alternatively, the computer system may perform a translation on the coordinates received from the sensing device to access the locations within the display, scoring and/or print image files corresponding to the detected beam impact locations. Further, the FPGA may perform the calibration with respect to actual pixel coordinates, or with respect to distances from reference points (e.g., where the quantity of pixels per millimeter or other length unit may be utilized).
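By way of illustration only, the two-stage correction described above (a projective transformation for perspective, followed by a radial model for lens distortion) may be sketched as follows; the matrix entries and distortion coefficient would ordinarily come from the calibration fit, and the single-coefficient radial model used here is an assumption for illustration:

```python
# Sketch of the correction pipeline described above. A 3x3 projective
# (homography) matrix, fitted from the user-indicated grid points and the
# ideal grid points, maps distorted image coordinates toward target/display
# coordinates; a radial term then compensates lens distortion. Both the
# matrix entries and the k1 coefficient are hypothetical placeholders for
# values produced by the calibration fit.

def apply_homography(H, x, y):
    """Map image point (x, y) through the 3x3 homography H (row-major lists)."""
    w = H[2][0] * x + H[2][1] * y + H[2][2]
    return ((H[0][0] * x + H[0][1] * y + H[0][2]) / w,
            (H[1][0] * x + H[1][1] * y + H[1][2]) / w)

def correct_radial(x, y, k1, cx, cy):
    """Apply a one-parameter radial correction about image center (cx, cy)."""
    dx, dy = x - cx, y - cy
    r2 = dx * dx + dy * dy
    scale = 1.0 + k1 * r2  # k1 = 0 means no lens distortion
    return cx + dx * scale, cy + dy * scale

# Identity homography leaves points unchanged.
H_identity = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
# apply_homography(H_identity, 3.0, 4.0) -> (3.0, 4.0)
```

In the disclosed system the per-pixel results of such a mapping are precomputed and stored as the translation matrix, so each detected impact is translated by a simple look-up rather than recomputation.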
- In addition, the sensing device further performs a light sensitivity calibration to adjust for ambient light conditions at
step 162. The user may select high, medium or low sensitivity. The FPGA processes the target image pixel intensities (e.g., red (R), green (G) and blue (B) pixel values) to measure the ambient light and determines the thresholds described below for determining beam impact detections. Basically, the FPGA determines a threshold value for each of a plurality of regions within the target image. By way of example, the present invention utilizes forty-eight regions; however, any quantity of regions at any locations within the image may be utilized. The threshold value for each region is determined from the sum of a maximum luminance or pixel intensity value of pixels within that region and an offset. The offset is based on a desired light sensitivity indicated by a user as described above. Since target images are being repeatedly captured and transmitted to the FPGA, certain captured target images may not contain any beam impact detections. Accordingly, the thresholds for each region basically control the system sensitivity to the emitted beam in relation to the ambient light, and enable the system to determine the presence of a beam impact within a corresponding region of the captured target image. - Once the calibrations are complete, the resulting information is stored in
sensing device memory 106 at step 164 to enable the sensing device to be re-used without performing the calibration (e.g., the calibration is performed in response to a change in position of the target and/or sensing device). -
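By way of illustration only, the per-region threshold computation described above may be sketched as follows; the offset values assigned to the high, medium and low sensitivity settings are hypothetical:

```python
# Sketch of the light-sensitivity calibration described above: the target
# image is divided into regions, and each region's detection threshold is
# the maximum pixel luminance observed in that region plus an offset chosen
# by the user's sensitivity setting. The offsets below are hypothetical;
# the disclosure specifies only that the offset reflects the setting.

SENSITIVITY_OFFSET = {"high": 10, "medium": 30, "low": 60}

def region_thresholds(regions, sensitivity="medium"):
    """Given regions (each a list of pixel luminance values), return one
    beam-impact detection threshold per region."""
    offset = SENSITIVITY_OFFSET[sensitivity]
    return [max(region) + offset for region in regions]

# region_thresholds([[10, 80, 40], [5, 5, 20]], "high") -> [90, 30]
```

A smaller offset makes the system more sensitive to the emitted beam relative to ambient light, which is the trade-off the sensitivity setting controls.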
Sensing device 16 may be utilized within various targets to detect laser beam impacts on those targets, as illustrated in FIG. 7A. By way of example, a target 15 may include a housing 23, sensing device 16 and a diffuser 19. The housing may be of any shape or size, and may be in the form of any suitable target to simulate a scenario (e.g., silhouette, bull's eye, standard targets for military or law enforcement applications, etc.). For example, the target may be of the types disclosed in the aforementioned U.S. Pat. No. 6,322,365 (Shechter et al.) and U.S. Patent Application Publication No. 2005/0153262 (Kendir). The sensing device is calibrated as described above with diffuser 19 serving as a target surface and disposed in housing 23 at any suitable location. The housing includes an opening 25 with diffuser 19 disposed over or within that opening to serve as an impact location for target 15. Sensing device 16 captures target images and determines beam impact locations in substantially the same manner described above to provide beam impact coordinates to a computer system via a wired or wireless network connection. - Since the amount of information provided by the sensing device is relatively small, several targets or sensing devices may be coupled to the same computer system without significantly affecting network traffic or bandwidth. Accordingly, a large target may include several sensing devices as illustrated in
FIG. 7B. By way of example, target 15 may include a plurality of sensing devices 16, each associated with a corresponding target section and coupled to a network switch 27, preferably employing an Ethernet protocol. The network switch may be implemented by any conventional or other network device and may utilize any suitable communications protocol. Target housing 23 may include a single large diffuser 19 serving as a housing side to receive beam impacts. Alternatively, a housing side may include a plurality of openings 25 with a corresponding diffuser 19 disposed over or within each opening. Each sensing device is calibrated and placed in the target housing to detect beam impacts within a corresponding section or diffuser as described above. The sensing devices transfer coordinates from detected beam impacts to network switch 27 for transference to a computer system for processing as described above. -
Computer system 18 includes software to control system operation and provide graphical user interfaces for displaying user performance. The manner in which the computer system monitors beam impact locations and provides information to a user is illustrated in FIGS. 8-9. Initially, computer system 18 (FIG. 1) facilitates performance of calibrations for the sensing device and ambient light conditions as described above at step 40. - Once the calibrations are completed, a user may commence projecting the laser beam from the firearm toward a target (e.g., paper, projected image, television or computer screen, LCD panel, etc.).
Sensing device 16 captures target images at step 42 and processes the captured target images, via FPGA 112, to determine a beam impact location at step 44. Specifically, each captured target image received by the sensing device includes a plurality of pixels, each associated with red (R), green (G) and blue (B) values to indicate the color and luminance of that pixel. In the case of gray scale (e.g., black and white) images, the values for red (R), green (G) and blue (B) are substantially the same. The red, green and blue values for each pixel are multiplied by a respective weighting factor and summed to produce a pixel density. In other words, the pixel density may be expressed as follows:
Pixel Density=(R×Weight1)+(G×Weight2)+(B×Weight3)
where Weight1, Weight2 and Weight3 are weighting values that may be selected in any fashion to enable the system to identify beam impact locations within the captured target images. The respective weights may have the same or different values and may be any types of values (e.g., integer, real, etc.). - The beam impact location is considered to occur within a group of pixels within a captured image where each group member has a density value exceeding a threshold. This threshold is determined from the light sensitivity calibration described above and corresponds to the region containing the beam impact. Typically, the group of pixels containing or representing the beam impact form an area or shape. The pixel at the center of the area or shape formed by the pixel group is considered by the system to contain, or represent the location of, a beam impact. If the density value of each captured image pixel is less than the threshold, the captured target image is not considered to include a beam impact. The threshold basically controls the system sensitivity to the emitted beam in relation to the ambient light, and enables the system to determine the presence of a beam impact within a captured target image. When the computer system identifies a pixel containing a beam impact, the coordinates (e.g., X and Y coordinates) of that pixel within the captured target image are translated by the translation matrix to coordinates within the target space and/or display space (e.g., computer display, etc.) and provided to the computer system for processing at
step 46. - The computer system includes several target files having target information and scaled images as described above. The coordinates received from the sensing device enable display or overlay of the impact location on the target files. In addition, the sensing device may determine the pulse width of the laser beam (e.g., from detection of beam impacts within successive frames), and provide information to the computer system to enable display of messages in response to a user utilizing a laser having an unsuitable pulse width with respect to the system configuration. Further, the detection of laser pulse widths may be utilized to identify and associate beam impacts with particular users employing laser transmitters with different pulse widths. The system preferably is configured for laser transmitters emitting a pulse having a duration of 1.5 milliseconds, but may be utilized and/or configured for operation with laser transmitters having any desired pulse width.
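The density computation and impact detection described above might be sketched as follows. The weight values, and the use of a coordinate average to approximate the "center of the area" formed by the pixel group, are illustrative assumptions; the specification leaves both open.

```python
def pixel_density(r, g, b, w1=0.5, w2=0.3, w3=0.2):
    # Weighted sum of the color components, per the formula
    # Pixel Density = (R x Weight1) + (G x Weight2) + (B x Weight3).
    # The weight values here are illustrative; the specification
    # allows them to be selected in any fashion.
    return r * w1 + g * w2 + b * w3

def find_impact(image, threshold):
    """image: 2D list of (r, g, b) pixels. Returns the (x, y) center
    of the pixel group whose densities exceed the threshold, or None
    when no pixel exceeds it (no beam impact in this frame)."""
    hot = [(x, y)
           for y, row in enumerate(image)
           for x, (r, g, b) in enumerate(row)
           if pixel_density(r, g, b) > threshold]
    if not hot:
        return None
    # Approximate the center of the area formed by the pixel group
    # by averaging the member coordinates.
    cx = round(sum(x for x, _ in hot) / len(hot))
    cy = round(sum(y for _, y in hot) / len(hot))
    return (cx, cy)
```

A dark frame returns None; a frame with a bright spot returns the spot's center pixel coordinates, which would then be translated to target or display space.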
- The received coordinates are utilized to access a corresponding location in the scoring image and determine the score or other activity information for the beam impact at
step 50. Specifically, the received coordinates are utilized to identify a corresponding location within the scoring image. Various sections of the scoring image are color coded to indicate a value or other activity information associated with that section as described above. The color of the identified location within the scoring image is ascertained to indicate the value or other activity information for the beam impact. The scoring factor within the parameter file is applied to (e.g., multiplied by) the score value to determine a score for the beam impact. The score and other impact information are determined and stored in a database or other storage structure, while a computer system display showing the target is updated to illustrate the beam impact location and other information (e.g., natural dispersion, center of mass, score, score percentage, elapsed time, qualification, etc.) at step 52. - The display image is displayed, while the beam impact location is identified by indicia that are overlaid with the display image and placed in an area encompassing the received coordinates. The indicia may be scaled to reflect the caliber of the firearm. In addition, the computer system may provide audio (e.g., resembling firearm shots and/or hits) to indicate beam impact. Exemplary graphical user screens indicating the target, beam impact locations, impact time, score and other information are illustrated in
FIGS. 9-10 and 14. The system may be configured to detect, process and display beam impacts: at any desired shooting rate (e.g., machine gun rates of approximately one thousand rounds per minute, etc.); originating from blank fire (e.g., as disclosed in the aforementioned U.S. Pat. No. 6,322,365); and/or for targets at maximum distances of approximately twenty-five meters. - If a round or session of firearm activity is not complete as determined at
step 54, the user continues actuation of the firearm and the system detects beam impact locations and determines information as described above. However, when a round or session is determined to be complete at step 54, the computer system retrieves information from the database and determines information pertaining to the round at step 56. The computer system may further determine grouping circles. These are generally utilized on shooting ranges where projectile impacts through a target must all be within a circle of a particular diameter (e.g., four centimeters). The computer system may analyze the beam impact information and provide groupings and other information on the display that is typically obtained during activities performed on firing ranges (e.g., dispersion, etc.). The grouping circle and beam impact location indicia are typically overlaid with the display image and placed in areas encompassing the appropriate coordinates of the display image space in substantially the same manner described above. - When a report is desired as determined at
step 58, the computer system retrieves the appropriate information from the database and generates a report for printing at step 60. The report includes the print image, while beam impact location coordinates are retrieved from the database for the report. The beam impact locations are identified by indicia that are overlaid with the print image and placed in an area encompassing the corresponding location of the beam impact as described above for the display. The report may further include various information pertaining to user performance (e.g., score, dispersion, center of mass, impact score, cumulative score, score percentage, elapsed time, time between shots, etc.), and may alternatively be generated in electronic form in any desired format (e.g., .doc, .pdf, .wpd, etc.). When another round is desired, and a calibration is requested at step 64, the computer system facilitates calibration of the sensing device at step 40 and the above process of system operation is repeated. Similarly, the above process of system operation is repeated from step 42 when another round is desired without performing a calibration. System operation terminates upon completion of the training or qualification activity as determined at step 62. - An exemplary user screen employed by the system for a silhouette type target (e.g., a military M9 type target) is illustrated in
FIG. 9 . Specifically, screen 170 includes a target area 172, an action bar 174, an information area 176, a shot table area 178 and a mode selection area 179. Target area 172 includes an image of the target with beam impact locations indicated thereon as described above. Action bar 174 includes a series of icons to: indicate on-line/off-line status; assign a shooter name and course of fire to the target; add or change overlays; print a copy of the session; access options (e.g., target timeout, target auto start, caliber size, shot limit, enable/disable sound, etc.); start/stop a session (e.g., toggle between on-line and off-line modes); remove a shot from the shot table; open a previously saved session; save a session; and perform network and/or instructor functions. -
Information area 176 provides various information to a user (e.g., date, time, user name, user identification, course, session elapsed time, total number of hits, score, score percentage, dispersion, center of mass, qualification, etc.). Shot table area 178 includes a shot table providing the hit number, time and score for each detected target hit. Mode selection area 179 includes radio buttons to enable a user to select the mode of operation. The modes include shot mode to record shot locations on the screen, trace mode to track a laser beam on the target as described below and plot mode to plot a path of laser beam impacts as described below. Screen 170 may further enable a user to display menus for selecting options, reports and targets or overlays. These menus are typically displayed in response to actuation of corresponding icons in the action bar. - The system may additionally provide trace and plot features as described above. Referring to
FIGS. 10-12 , screen 180 is similar to screen 170 described above and includes target area 172, action bar 174, information area 176, shot table area 178 and mode selection area 179. Target area 172 includes a target image in the form of a bull's eye. Action bar 174 includes a series of icons to: assign a shooter name and course of fire to the target; add or change overlays; print a copy of the session; access options (e.g., target timeout, target auto start, caliber size, shot limit, enable/disable sound, etc.); remove a shot from the shot table; open a previously saved session; save a session; and stop a session. -
Information area 176 provides various information to a user (e.g., date, time, user name, user identification, course, session elapsed time, total number of hits, score, score percentage, dispersion, center of mass, etc.). Shot table area 178 includes a shot table providing the hit number, time and score for each detected target hit. Mode selection area 179 includes radio buttons to enable a user to select the mode of operation. The modes include shot mode to record shot locations on the screen, trace mode to track a laser beam on the target as described below and plot mode to plot a path of laser beam impacts as described below. - The trace mode is enabled in response to a user selecting the trace mode in
mode selection area 179 and the laser transmitter assembly operating in a &#8220;constant on&#8221; mode. The computer system displays a flashing block 171 on the graphical user screen (FIG. 11 ). The block follows movement of the firearm or laser beam projected on the target. Basically, the computer system receives coordinates of laser beam impact locations from the sensing device and utilizes those coordinates to display the block. The position of the block is adjusted on the display in accordance with the received coordinates. As the firearm or laser beam alters position, the block is similarly adjusted on the display to visually indicate movement of the firearm. - With respect to plot mode, the system traces the aiming position of the firearm or laser transmitter assembly and reports graphically the horizontal and vertical deviations of the firearm (
FIG. 12 ). In this mode, the laser transmitter assembly is configured to continuously project a laser beam from the firearm (e.g., “constant on” mode) as described above. The continuous laser beam projection allows the sensing device to trace any movement of the firearm, which in turn, allows the computer system to provide feedback to the user relating to fluctuation in firearm aim. The computer system continuously receives detection information (e.g., target image coordinates indicating beam impact locations) from the sensing device. Since the laser transmitter assembly is in a continuous mode (e.g., continuously projecting a laser beam onto the target), the sensing device traces the aim of the firearm on the target and continuously relays detection information to the computer system. - The computer system determines the target impact locations as described above and compiles and displays a trace report to provide an indication to the user of the horizontal and vertical fluctuations of the firearm with respect to an actual and/or desired hit location on the target.
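A minimal sketch of the deviation calculation behind such a trace report, assuming a desired hit location is known; the function and variable names are illustrative, not taken from the specification:

```python
def aim_deviations(trace, desired):
    """trace: sequence of (x, y) beam positions streamed from the
    sensing device; desired: (x, y) aim point on the target.
    Returns parallel lists of horizontal and vertical deviations,
    suitable for the two color-coded plots of the plot-mode screen."""
    dx_ref, dy_ref = desired
    horizontal = [x - dx_ref for x, _ in trace]
    vertical = [y - dy_ref for _, y in trace]
    return horizontal, vertical
```

Each pair of output values describes how far the aim wandered from the desired hit location at one sample, which is what the horizontal and vertical fluctuation plots visualize.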
FIG. 12 illustrates an exemplary graphical user screen displaying plot mode information and includes plots of horizontal and vertical fluctuations in firearm aim. The vertical and horizontal plots are typically color coded to identify a particular plot. - Operation of the system is described with reference to
FIG. 1 . Initially, a target (e.g., paper, projected image, television or computer screen, LCD or other display panel, etc.) is provided and sensing device 16 is connected to the computer system. Laser transmitter assembly 200 is inserted into barrel 78 of firearm 76 as described above. The laser module is actuated in response to depression of firearm trigger 77. The computer system is commanded to commence a firearm activity, and may initially control calibrations for sensing device 16, as necessary, in the manner described above. The firearm may be actuated by a user, while the sensing device captures images of the target and provides coordinates of beam impact locations to the computer system as described above. The computer system may determine a score value corresponding to the impacted target section and other information for storage in a database as described above. The impact location and other information are displayed on a graphical user screen (e.g., FIGS. 9-12 ) as described above. When a round is complete, the computer system retrieves the stored information and determines information pertaining to the round for display on the graphical user screen. Moreover, a report may be printed providing information relating to user performance as described above. In addition, the system may provide indicia on the display to indicate and trace firearm movement as described above. - The system may be employed with various targets and corresponding graphical user screens in substantially the same manner described above to detect and display beam impact locations on the various targets. The targets may include zones or sections for scoring or to indicate a particular firearm activity as described above. Alternatively, the targets may be for display purposes to indicate beam impact locations (without scoring). An exemplary target to simulate a shotgun emission is illustrated in
FIG. 13 . In this case, the system displays standard shotgun dispersion patterns (e.g., with nine pellets) on a target image from the center mass of a detected beam impact. Specifically, screen 190 is substantially similar to screen 170 described above and includes target area 172, action bar 174, information area 176, shot table area 178 and mode selection area 179. Target area 172 includes a target image in the form of a silhouette. Action bar 174 includes a series of icons to: indicate on-line/off-line status; assign a shooter name and course of fire to the target; add or change overlays; print a copy of the session; access options (e.g., target timeout, target auto start, caliber size, shot limit, enable/disable sound, etc.); start/stop a session (e.g., toggle between on-line and off-line modes); end a task (as described below); remove a shot from the shot table; open a previously saved session; save a session; and perform network functions. Information area 176 provides various information to a user (e.g., date, time, user name, user identification, time remaining, total number of hits, score, dispersion, center of mass, qualification, etc.). - The computer system receives beam impact coordinates from the sensing device and determines the dispersion pattern (e.g., additional beam impact locations from the beam impact coordinates) for shotgun pellets to display on the target in
target area 172. This is preferably accomplished by applying a probability distribution to the received beam impact coordinates. The probability distribution is biased toward the center of the received beam impact location and produces pixel coordinates for the dispersion pattern. The options for this type of screen include a shot limit, auto start and a session time. These options may be selected in response to actuation of an options icon in action bar 174 as described above. - The system may provide qualification of users for various tasks. In this case, a series of tasks (or a course of fire) are performed by the user to determine a user qualification. The tasks are stored in a file that indicates the task order and the shot and time limit for each task. Shots are not registered by the system after expiration of the shot or time limit. Successive tasks may be automatically performed, or an instructor may need to indicate the start of a succeeding task after completion of a prior task. Referring to
FIG. 14 , screen 230 is similar to screen 170 described above and includes target area 172, action bar 174, information area 176, shot table area 178 and mode selection area 177. Target area 172 includes a target image in the form of a silhouette. Action bar 174 is similar to the action bar described above, and includes a series of icons to: indicate on-line/off-line status; assign a shooter name and course of fire to the target; add or change overlays; print a copy of the session; access options (e.g., target timeout, target auto start, caliber size, shot limit, enable/disable sound, etc.); start/stop a task; remove a shot from the shot table; open a previously saved session; save a session; and load a course of fire. -
Information area 176 provides various information to a user (e.g., user name, user identification, task, task hits, total hits, task score, total score, dispersion, center of mass, qualification, time left, etc.). Shot table area 178 includes a shot table providing the hit number, time, score and task for each detected target hit. Mode selection area 177 includes radio buttons to enable a user to select the mode of operation and inputs for mode options (e.g., time between tasks, end task on shot limit, etc.). The modes include: instructor controlled qualification to enable an instructor to control the start of tasks; automatic qualification where an instructor or shooter starts a session and succeeding tasks start after a preset delay until completion of the tasks; free practice to enable a user to practice; and task selection to enable a user to select and perform a particular task. The mode options enable a user or instructor to enter the time between tasks and whether termination of a task occurs at the expiration of the shot limit or time limit. The computer system may determine a user qualification based on the performance of the tasks by the user and criteria for specific qualifications. - The system may alternatively be employed with various targets in the form of images or videos produced from a projector, displayed on a television screen or monitor, or displayed on an LCD or other panel in substantially the same manner described above to detect and display beam impact locations on those various targets (e.g.,
FIGS. 15-21 ). The targets may include zones or sections for scoring or to indicate a particular firearm activity. Alternatively, the targets may be for display purposes to indicate beam impact locations (without scoring). Exemplary targets in the form of projected or displayed images include: balloons or other objects that change in both size and shape while moving to improve skills with moving targets and judgment (FIG. 15 ); bowling pins or other stationary objects at varying distances (e.g., for each pin), size and exposure time (FIG. 16 ); and multiple stationary objects in a timed exercise (e.g., Epyx style plates) to increase speed and accuracy (FIG. 17 ). In addition, the projected or displayed images may simulate various shooting activities or ranges including: an indoor shooting range with an exemplary silhouette type target at various distances (FIG. 19 ); live fire courses with silhouette targets and recording hits and misses, where the target scenario may be edited by a user (FIG. 20 ); and skeet shooting with adjustable speed and difficulty levels (FIG. 21 ). - In addition, the system may be utilized with videos of scenarios (
FIG. 18 ). The system records shots during the session for playback of the beam impacts (e.g., indicated by the icon as viewed in FIG. 18 ) with the video for analysis of user performance. The system may pause the playback at each shot for analysis. Further, a user may load a custom video of a scenario for use with the system. - It will be appreciated that the embodiments described above and illustrated in the drawings represent only a few of the many ways of implementing a sensing device for a firearm laser training system and method of simulating firearm operation with various training scenarios.
- The system may include any quantity or type of target of any shape or size, constructed of any suitable materials and placed in any desired location. The computer system may be implemented by any conventional or other computer or processing system. The components of the system may be connected by any communications devices (e.g., cables, wireless, network, etc.) in any desired fashion, and may utilize any type of conventional or other interface scheme or protocol. The computer system may be in communication with other training systems via any type of communications medium (e.g., direct line, telephone line/modem, network, etc.) to facilitate group training or competitions. The system may be configured for any types of training, qualification, competition, gaming and/or entertainment applications. The printer may be implemented by any conventional or other type of printer.
- The firearm laser training system may be utilized with any type of firearm (e.g., hand-gun, rifle, shotgun, machine gun, etc.), while the laser module may be fastened to the firearm at any suitable locations via any conventional or other fastening techniques (e.g., frictional engagement with the barrel, brackets attaching the device to the firearm, etc.). Further, the system may include a dummy firearm projecting a laser beam, or replaceable firearm components (e.g., a barrel) having a laser device disposed therein for firearm training. The replaceable components (e.g., barrel) may further enable the laser module to be operative with a firearm utilizing any type of blank cartridges. The laser assembly may include the laser module and barrel member or any other fastening device. The laser module may emit any type of laser beam. The optics package may include any suitable lens for projecting the beam. The laser beam may be enabled for any desired duration sufficient to enable the sensing device to detect the beam. The laser module may be fastened to a firearm or other similar structure (e.g., a dummy, toy or simulated firearm) at any suitable locations (e.g., external or internal of a barrel) and be actuated by a trigger or any other device (e.g., power switch, firing pin, relay, etc.). Moreover, the laser module may be configured in the form of ammunition for insertion into a firearm firing or similar chamber and project a laser beam in response to trigger actuation. Alternatively, the laser module may be configured for direct insertion into the barrel. The laser module may include any type of sensor or detector (e.g., acoustic sensor, piezoelectric element, accelerometer, solid state sensors, strain gauge, etc.) to detect mechanical or acoustical waves or other conditions signifying trigger actuation. The laser module components may be arranged in any fashion, while the module power source may be implemented by any type of batteries. 
Alternatively, the module may include an adapter for receiving power from a common wall outlet jack or other power source. The laser beam may be visible or invisible (e.g., infrared), may be of any color or power level, may have a pulse of any desired duration and may be modulated in any fashion (e.g., at any desired frequency or unmodulated) or encoded in any manner to provide any desired information, while the transmitter may project the beam continuously or include a “constant on” mode. The system may be utilized with transmitters and detectors emitting any type of energy (e.g., light, infrared, etc.).
- The target may be implemented by any type of target having any desired configuration and indicia forming any desired target site and be disposed at any suitable location. The target may include any desired still or moving images (e.g., still image, video, etc.) displayed on paper or other material, on a surface by a projector, on a television, computer, LCD panel or other form of display. The target may be of any shape or size, and may be constructed of any suitable materials. The target may include any conventional or other fastening devices to attach to any supporting structure. Similarly, the supporting structure may include any conventional or other fastening devices to secure a target to that structure. Alternatively, any type of adhesive may be utilized to secure a target to the structure. The support structure may be implemented by any structure suitable to support or suspend a target. The target may include any quantity of sections or zones of any shape or size and associated with any desired values. The target may include any quantity of individual targets or target sites. The system may utilize any type of coding, color or other scheme to associate values with target sections (e.g., table look-up, target location identifiers as keys into a database or other storage structure, etc.). Further, the sections or zones may be identified by any type of codes, such as alphanumeric characters, numerals, etc., that indicate a score value or any other information. The score values may be set to any desired values.
- The target characteristics and images may be contained in any quantity of any types of files. The target images may be scaled in any desired fashion. The coordinate translations may be accomplished via any conventional or other techniques, and may be performed by the sensing device and/or computer system. The target files may contain any information pertaining to the target (e.g., filenames, images, scaling information, indicia size, etc.). The target files may be produced by the computer system or other processing system via any conventional or other software and placed on the computer system for operation. Alternatively, the target files may reside on another processing system accessible to the computer system via any conventional or other communications medium (e.g., network, modem/telephone line, etc.), or be available on any type of storage medium.
- The system may be disposed in a case or other storage unit for transport, where the case may be of any size or shape and may be constructed of any suitable materials.
- The sensing device may be of any shape or size, and may be constructed of any suitable materials. The sensing device components (e.g., image sensor, memories, FPGA, system clock, reset switch, buffer, I/O ports, power supply, etc.) may be implemented by any conventional or other components performing the functions described herein. The image sensor may be implemented by any conventional or other image sensor (e.g., camera, CCD, matrix or array of light sensing elements, etc.) suitable for detecting the laser beam and/or capturing a target image. The sensor may provide gray scale or full color images and include any desired frame rate suitable to detect beam impacts. The sensing device may employ any type of light sensing elements, and may utilize a grid or array of any suitable dimension. The filter may be implemented by any conventional or other filter having filtering properties for any particular frequency or range of frequencies. The lens may be implemented by any suitable lens to view the target.
- The FPGA may be implemented by any suitable hardware and/or software modules to perform the functions described herein (e.g., processor, circuitry, logic, etc.). The memories and buffer may be implemented by any type of conventional or other memories or storage units (e.g., DRAM, Flash, buffers, volatile, non-volatile, etc.) and may store any desired information. Since the memories are preferably non-volatile, the sensing device may be continually re-used without a re-calibration even after losing power or a power down (e.g., unless the position of the sensing device or target has changed). This further enables the sensing device to be available with target calibration data pre-loaded into the device. For example, targets may be designed and/or calibrated for the sensing device during device manufacture and immediately used in the field. The memories may include sufficient storage capacity to store at least one translation matrix for one or more corresponding targets.
The sensing device may employ any suitable controllers, connectors and interfaces for communications and may utilize any desired communication protocols (e.g., Ethernet, etc.). The communications may be conducted via any communications medium (e.g., LAN, WAN, wired or cables, wireless, etc.). The I/O ports may be of any quantity, may be implemented by any type of conventional or other ports (e.g., IR, terminals/pins, etc.) and may provide any suitable I/O to transfer (e.g., transmit and/or receive) information with external devices (e.g., audio/visual indicators, laser devices, etc.). The system clock may provide a clock signal of any suitable frequency. The signature detector may be included within or coupled to the sensing device and may detect any suitable signature or pattern within any desired energy wave (e.g., laser, light, IR, etc.). The signature detector may be implemented by any conventional or other device detecting patterns within transmitted energy signals (e.g., circuitry, processor, etc.). The sensing device may utilize (or the power supply may provide) any suitable power signals, but preferably in the range of 7V DC to 20V DC. The sensing device may receive power signals via unused pins of the network or Ethernet connector.
- The sensing device may be supported by any mounting device (e.g., a tripod, a mounting post, etc.) and positioned at any suitable locations providing access to the target. The calibration may utilize any quantity of points on the grid to define the target area, and may map the area to any sized array. The grid locations may correspond to any suitable locations within the target confines. The sensing device may be positioned at any suitable location and at any desired viewing angle relative to a target. The sensing device may be coupled to any port of the computer system via any conventional or other device (e.g., cable, wireless, etc.). Alternatively, the sensing device may provide images to the computer system to determine beam impact locations. The sensing device may be configured to detect any energy medium having any modulation, pulse or frequency. Similarly, the laser may be implemented by a transmitter emitting any suitable energy wave. The sensing device may transmit any type of information to the computer system to indicate beam impact locations, while the computer system may process any type of information (e.g., X and Y coordinates, image information, etc.) from the sensing device to display and provide feedback information to the user.
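As an illustration of transferring impact coordinates from the sensing device to the computer system, the X and Y values might be packed as below; the two-unsigned-16-bit big-endian wire format is purely an assumption for the sketch, not the protocol used by the device:

```python
import struct

def pack_impact(x, y):
    """Encode one beam impact coordinate pair for transmission.
    Format (assumed): two unsigned 16-bit integers, big-endian."""
    return struct.pack(">HH", x, y)

def unpack_impact(payload):
    """Decode a coordinate pair received from the sensing device."""
    return struct.unpack(">HH", payload)
```

The computer system would translate the unpacked coordinates into display space and update the graphical user screen as described above.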
- The sensing device may be utilized in any type of target of any shape or size to detect beam impacts thereon. The target may include any quantity of sensing devices arranged in any fashion with each associated with a section of any shape or size to detect beam impacts on that section. The sensing devices may be coupled in any desired fashion to provide beam impact and other information (e.g., network, daisy chain, individual connections, wired connections, wireless connections, etc.). The targets may include any quantity of any suitable diffuser to enable detection of beam impacts by the sensing device.
- It is to be understood that the software for the computer system and sensing device may be implemented in any desired computer language and could be developed by one of ordinary skill in the computer arts based on the functional descriptions contained in the specification and flow charts illustrated in the drawings. The computer system may alternatively be implemented by any type of hardware and/or other processing circuitry. The various functions of the computer system and sensing device may be distributed in any manner between these items and/or among any quantity of software modules, processing systems and/or circuitry. The software and/or algorithms described above and illustrated in the flow charts may be modified in any manner that accomplishes the functions described herein. The database may be implemented by any conventional or other database or storage structure (e.g., file, data structure, etc.).
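As one example of software developed from these functional descriptions, the shotgun dispersion described earlier (additional pellet coordinates derived from a probability distribution biased toward the detected impact center) might be sketched as follows; the Gaussian form and the spread value are assumptions, as the specification only requires a center-biased distribution:

```python
import random

def shotgun_pattern(center, pellets=9, spread=8.0, seed=None):
    """Derive additional impact coordinates from the detected beam
    impact center using a center-biased probability distribution.
    A Gaussian with an illustrative spread (in pixels) is assumed;
    nine pellets matches the standard pattern described above."""
    rng = random.Random(seed)
    cx, cy = center
    return [
        (round(rng.gauss(cx, spread)), round(rng.gauss(cy, spread)))
        for _ in range(pellets)
    ]
```

The resulting pixel coordinates would be overlaid on the target image in target area 172 in the same manner as ordinary beam impact indicia.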
- The display screens and reports may be arranged in any fashion and may contain any type of information. The screens may include any quantity of any types of input mechanisms (e.g., fields, radio or other buttons, icons, etc.). The various parameter or other values may be displayed in the report and/or on the screens in any manner (e.g., charts, bars, etc.) and in any desired form (e.g., actual values, percentages, etc.), while any of the values displayed on the screens may be adjusted by the user via any desired input mechanisms. The calibration screen may include a grid of any shape, color or size to facilitate alignment of the sensing device with the target. The grid may include any quantity of points for a user to specify and may be associated with any locations of the target. The target may be defined within the captured target image in any desired manner via any suitable input mechanisms. The target may be defined at any suitable locations within the captured target image, while the selected locations may be indicated by any quantity of any types of indicia of any shape, color or size. The translation matrix may be determined by any conventional or other algorithms to compensate for viewing angle and/or correct deformations in the image. The calibration data may be stored in the computer system and/or sensing device.
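A translation matrix compensating for viewing angle and perspective deformation is commonly computed as a planar homography from four calibration point pairs; the following Python sketch shows that standard technique (it is one conventional algorithm the patent's language permits, not its mandated method), solving the eight homography coefficients with Gaussian elimination and warping a sensed coordinate into the rectified target space:

```python
def _solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def homography(src, dst):
    """Fit h0..h7 so that each src corner maps onto its dst corner:
    u = (h0 x + h1 y + h2) / (h6 x + h7 y + 1), similarly for v."""
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y]); b.append(v)
    return _solve(A, b)

def warp(h, x, y):
    """Map a sensed image coordinate into the rectified target space."""
    w = h[6] * x + h[7] * y + 1.0
    return ((h[0] * x + h[1] * y + h[2]) / w,
            (h[3] * x + h[4] * y + h[5]) / w)

# four points the user (or auto-calibration) picked in the skewed image,
# and the rectangle they should become after correction
src = [(10.0, 10.0), (90.0, 12.0), (88.0, 95.0), (12.0, 92.0)]
dst = [(0.0, 0.0), (100.0, 0.0), (100.0, 100.0), (0.0, 100.0)]
H = homography(src, dst)
```

With the matrix fitted once at calibration time, every subsequent impact coordinate from the sensing device is passed through `warp` before scoring or display.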
- The density value may be determined with any weights having any desired value or types of values (e.g., integer, real, etc.). The weights and pixel component values may be utilized in any desired combination to produce a pixel density. Alternatively, any quantity of pixel values within any quantity of images may be manipulated in any desired fashion (e.g., accumulated, averaged, multiplied by each other or weight values, etc.) to determine the presence and location of a beam impact within an image. Further, any quantity of density and/or pixel values within any quantity of images may be manipulated in any desired fashion (e.g., accumulated, averaged, multiplied by each other or weight values, etc.) to determine the threshold and light conditions. The threshold may be determined periodically, in response to any desired light or other conditions (e.g., light conditions are outside any desired range or have any desired change in value, etc.) or in response to a user, and may be set by the computer system and/or user to any desired value. The system may alternatively utilize gray scale or any type of color images (e.g., pixels having gray scale, RGB or other values) and manipulate any quantity of pixel values within any quantity of images in any desired fashion to determine the threshold, light conditions and presence and location of a beam impact. The system may utilize any quantity of thresholds each associated with a region of any size or shape. The threshold offset may be of any desired value based on the user-desired sensitivity.
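One way to realize the weighted density, ambient-light threshold, and impact detection described above is sketched below in Python; the component weights, the averaging scheme, and the offset value are illustrative assumptions, since the patent deliberately leaves all of them open:

```python
# Illustrative weights favoring the red channel of a red laser spot;
# the patent allows any weights and any combination.
RED_W, GREEN_W, BLUE_W = 1.0, 0.5, 0.25

def pixel_density(r, g, b):
    """Weighted combination of the pixel's color components."""
    return RED_W * r + GREEN_W * g + BLUE_W * b

def ambient_threshold(image, offset):
    """One possible threshold: mean density over the frame plus a
    user-sensitivity offset (image is rows of (r, g, b) tuples)."""
    total = count = 0
    for row in image:
        for r, g, b in row:
            total += pixel_density(r, g, b)
            count += 1
    return total / count + offset

def find_impact(image, threshold):
    """Return the (x, y) of the densest pixel above threshold, or None
    when no beam impact is present in the frame."""
    best, loc = threshold, None
    for y, row in enumerate(image):
        for x, (r, g, b) in enumerate(row):
            d = pixel_density(r, g, b)
            if d > best:
                best, loc = d, (x, y)
    return loc

# a dim 4 x 3 frame with a bright laser spot at column 2, row 1
frame = [[(10, 10, 10) for _ in range(4)] for _ in range(3)]
frame[1][2] = (255, 40, 30)
th = ambient_threshold(frame, offset=50.0)
```

Recomputing `ambient_threshold` periodically, or when the mean density drifts outside a chosen range, gives the light-condition adaptation the paragraph describes.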
- The indicia indicating beam impact locations and other information may be of any quantity, shape, size or color and may include any type of information. The indicia may be placed at any locations and be incorporated into or overlaid with the target images or video. The system may produce any desired type of display or report having any desired information. The computer system may determine scores or other activity information based on any desired criteria. The computer system may poll the sensing device or the sensing device may transmit images and/or coordinates at any desired intervals for the tracing and/or plot modes or sensing functions. The sensing device may detect the laser beam continuously for any desired interval to initiate the tracing and/or plot modes. The indicia for the tracing and/or plot modes may be of any quantity, shape, size or color and may include any type of information. The tracing indicia may be placed at any locations and be incorporated into or overlaid with the target images. The tracing indicia may be flashing or continuously appearing on the display. The trace and/or plot modes may be implemented with any of the screens described above and may display any quantity of previous impact locations to show movement of the firearm.
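Displaying "any quantity of previous impact locations to show movement of the firearm" amounts to keeping a bounded history of points for the overlay; a minimal sketch, assuming a fixed-length buffer (the capacity and ordering policy are my choices, not the patent's):

```python
from collections import deque

class TraceBuffer:
    """Retain the most recent impact locations for the trace/plot
    overlay; older points fall off as new ones arrive."""

    def __init__(self, max_points=20):
        self.points = deque(maxlen=max_points)

    def add(self, x, y):
        self.points.append((x, y))

    def overlay(self):
        # oldest first, newest last, so the display can fade or
        # flash the most recent indicia differently
        return list(self.points)

trace = TraceBuffer(max_points=3)
for i in range(5):
    trace.add(i, i)
```

The computer system would call `add` each time it polls the sensing device (or receives a pushed coordinate) and redraw the overlay from `overlay()` at the display rate.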
- The system may be configured for use with a transmitter emitting a laser beam having any desired pulse width, and may provide any type of message or other indication when the pulse width of a laser beam detected by the system is not compatible with the system configuration. The system may be configured to detect and process beam impact locations at any desired shot rate. The systems may utilize any conventional or other techniques to convert between the various image spaces, and may compensate for any desired sensing device position and/or viewing angle. The system may be utilized with targets scaled in any fashion to simulate conditions at any desired ranges, and may utilize lasers having sufficient power to be detected at any desired scaled range.
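The pulse-width compatibility message could be produced by a check along these lines; the expected width, tolerance, and message wording are all hypothetical, since the patent specifies only that some indication be provided:

```python
def pulse_compatible(measured_ms, expected_ms, tolerance_ms):
    """True when a detected pulse width falls within the window the
    system is configured for."""
    return abs(measured_ms - expected_ms) <= tolerance_ms

def check_pulse(measured_ms, expected_ms=5.0, tolerance_ms=1.0):
    """Return 'ok' or a user-facing incompatibility message.
    The 5 ms / 1 ms defaults are illustrative assumptions."""
    if pulse_compatible(measured_ms, expected_ms, tolerance_ms):
        return "ok"
    return ("detected pulse width %.1f ms is not compatible "
            "with the system configuration" % measured_ms)
```

The same gate can be reused to enforce a maximum shot rate by measuring the interval between accepted impacts instead of the pulse width.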
- The calibrations for the sensing device (e.g., target alignment, light sensitivity, etc.) may be initiated by a user as described above, or may be performed by the sensing device and/or computer system periodically or in response to detection of conditions (e.g., light conditions, detection of target position or orientation, etc.). The calibrations (e.g., target alignment, light sensitivity, etc.) may be performed individually or in any combination or order. The target alignment may be performed in conjunction with user specified points as described above. Alternatively, the calibration may be performed automatically (without the user specifying the calibration points). In this case, the user enables the image sensing device to view the entire target, where the sensing device and/or computer system employs conventional or other image recognition techniques to determine (e.g., by shape, color or other criteria) any suitable calibration points (e.g., the corners, midsections and/or center of the image). The calibration points may be utilized in the calibration to compensate for perspective and radial distortions in the manner described above.
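For the automatic variant, where the system finds its own calibration points, one simple stand-in for the "conventional or other image recognition techniques" the paragraph invokes is to threshold a grayscale frame and take the bounding extremes of the bright target region as the four corners; this is a deliberately crude sketch of the idea, not the patent's algorithm:

```python
def auto_calibration_points(image, bright_thresh):
    """Estimate four corner calibration points of a bright target
    region in a grayscale frame (rows of intensity values). Returns
    corners in top-left, top-right, bottom-right, bottom-left order,
    or None when no target pixels are found."""
    pts = [(x, y)
           for y, row in enumerate(image)
           for x, v in enumerate(row)
           if v >= bright_thresh]
    if not pts:
        return None
    xs = [p[0] for p in pts]
    ys = [p[1] for p in pts]
    left, right = min(xs), max(xs)
    top, bottom = min(ys), max(ys)
    return [(left, top), (right, top), (right, bottom), (left, bottom)]

# a 6 x 6 frame with a bright target occupying columns 1-4, rows 2-5
frame = [[0] * 6 for _ in range(6)]
for y in range(2, 6):
    for x in range(1, 5):
        frame[y][x] = 255
```

The four recovered points can then feed the same perspective-correction step used with user-specified points.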
- It is to be understood that the terms “top”, “bottom”, “side”, “upper”, “lower”, “front”, “rear”, “horizontal”, “vertical”, “right”, “left” and the like are used herein merely to describe points of reference and do not limit the present invention to any specific configuration or orientation.
- The present invention is not limited to the applications disclosed herein, but may be utilized for any type of firearm training, qualification, competition, gaming or entertainment applications.
- From the foregoing description, it will be appreciated that the invention makes available a novel sensing device for a firearm laser training system and method of simulating firearm operation with various training scenarios, wherein a firearm laser training system accommodates various types of targets for facilitating a variety of firearm training activities.
- Having described preferred embodiments of a new and improved sensing device for a firearm laser training system and method of simulating firearm operation with various training scenarios, it is believed that other modifications, variations and changes will be suggested to those skilled in the art in view of the teachings set forth herein. It is therefore to be understood that all such variations, modifications and changes are believed to fall within the scope of the present invention as defined by the appended claims.
Claims (53)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US11/642,589 US20070190495A1 (en) | 2005-12-22 | 2006-12-21 | Sensing device for firearm laser training system and method of simulating firearm operation with various training scenarios |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US75258605P | 2005-12-22 | 2005-12-22 | |
US11/642,589 US20070190495A1 (en) | 2005-12-22 | 2006-12-21 | Sensing device for firearm laser training system and method of simulating firearm operation with various training scenarios |
Publications (1)
Publication Number | Publication Date |
---|---|
US20070190495A1 true US20070190495A1 (en) | 2007-08-16 |
Family
ID=38368999
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/642,589 Abandoned US20070190495A1 (en) | 2005-12-22 | 2006-12-21 | Sensing device for firearm laser training system and method of simulating firearm operation with various training scenarios |
Country Status (1)
Country | Link |
---|---|
US (1) | US20070190495A1 (en) |
Cited By (67)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20080119261A1 (en) * | 2006-10-23 | 2008-05-22 | Jorge Heymann | Slot machine bonus game providing awards for manual dexterity |
US20090059094A1 (en) * | 2007-09-04 | 2009-03-05 | Samsung Techwin Co., Ltd. | Apparatus and method for overlaying image in video presentation system having embedded operating system |
US20090124352A1 (en) * | 2007-11-13 | 2009-05-14 | Ignacio Gerson | Slot machine game with side pot |
US20090253103A1 (en) * | 2008-03-25 | 2009-10-08 | Hogan Jr Richard Russell | Devices, systems and methods for firearms training, simulation and operations |
US20100092925A1 (en) * | 2008-10-15 | 2010-04-15 | Matvey Lvovskiy | Training simulator for sharp shooting |
US20100273131A1 (en) * | 2007-12-11 | 2010-10-28 | Korea Elecom Co., Ltd. | Laser transmitter for simulating a fire weapon and manufacturing method thereof |
US20100273130A1 (en) * | 2009-04-22 | 2010-10-28 | Integrated Digital Technologies, Inc. | Shooting training systems using an embedded photo sensing panel |
US7926408B1 (en) * | 2005-11-28 | 2011-04-19 | Metadigm Llc | Velocity, internal ballistics and external ballistics detection and control for projectile devices and a reduction in device related pollution |
WO2011056196A1 (en) * | 2009-11-09 | 2011-05-12 | Projectionworks, Inc. | Systems and methods for optically projecting three-dimensional text, images and/or symbols onto three-dimensional objects |
US20110306019A1 (en) * | 2010-06-09 | 2011-12-15 | Ohio Ordnance Works, Inc. | M249 non-firearm simulating a functional m249 firearm |
US20120295229A1 (en) * | 2011-05-19 | 2012-11-22 | Fortitude North, Inc. | Systems and Methods for Analyzing a Marksman Training Exercise |
US20130017515A1 (en) * | 2011-01-18 | 2013-01-17 | Moore Larry E | Laser trainer target |
US8607495B2 (en) | 2008-10-10 | 2013-12-17 | Larry E. Moore | Light-assisted sighting devices |
US8621774B1 (en) | 2004-03-29 | 2014-01-07 | Metadigm Llc | Firearm with multiple targeting laser diodes |
US8627591B2 (en) | 2008-09-05 | 2014-01-14 | Larry Moore | Slot-mounted sighting device |
US8690664B2 (en) | 2006-09-25 | 2014-04-08 | Etasse Limited | Slot machine game with additional award indicator |
US8696150B2 (en) | 2011-01-18 | 2014-04-15 | Larry E. Moore | Low-profile side mounted laser sighting device |
US8695266B2 (en) | 2005-12-22 | 2014-04-15 | Larry Moore | Reference beam generating apparatus |
US8702493B2 (en) | 2007-11-09 | 2014-04-22 | Etasse Limited | Slot machine game with award based on another machine |
US20140168447A1 (en) * | 2012-12-18 | 2014-06-19 | Trackingpoint, Inc. | Optical Device Including a Mode for Grouping Shots for Use with Precision Guided Firearms |
US20140193779A1 (en) * | 2010-01-19 | 2014-07-10 | Oren Louis Ohr | Dry fire training device |
US8777620B1 (en) * | 2006-08-15 | 2014-07-15 | Triggermaster, Inc. | Firearm trigger pull training system and methods |
US8813411B2 (en) | 2008-10-10 | 2014-08-26 | P&L Industries, Inc. | Gun with side mounting plate |
US8844189B2 (en) | 2012-12-06 | 2014-09-30 | P&L Industries, Inc. | Sighting device replicating shotgun pattern spread |
US8911235B1 (en) | 2006-08-15 | 2014-12-16 | Triggermaster, Inc. | Shooting training device |
US20150023591A1 (en) * | 2013-07-18 | 2015-01-22 | Ramakrishna Potluri, Intellecttech Corp Pvt Ltd | Optical analysis of a point of aim of a projectile discharge device |
US20150091252A1 (en) * | 2013-10-02 | 2015-04-02 | Neil Chadwick | Shooting Target Management Systems and Related Methods |
US20150141100A1 (en) * | 2009-02-27 | 2015-05-21 | George Carter | Error Correction System and Method for a Simulation Shooting System |
US9151564B1 (en) | 2006-08-15 | 2015-10-06 | Triggermaster, Inc. | Firearm trigger pull training system and methods |
US9182194B2 (en) | 2014-02-17 | 2015-11-10 | Larry E. Moore | Front-grip lighting device |
US9297614B2 (en) | 2013-08-13 | 2016-03-29 | Larry E. Moore | Master module light source, retainer and kits |
US20160153755A1 (en) * | 2014-10-25 | 2016-06-02 | Benjamin J. Morgan | System and Method for Timing Firearm Practice Drills |
ES2578806A1 (en) * | 2016-02-19 | 2016-08-01 | Eduardo PÉREZ CALLE | Internal reactive impact pointing device (d.i.s.i.r.) (Machine-translation by Google Translate, not legally binding) |
US20160282076A1 (en) * | 2015-03-23 | 2016-09-29 | Ronnie VALDEZ | Simulated hunting devices and methods |
US9470485B1 (en) | 2004-03-29 | 2016-10-18 | Victor B. Kley | Molded plastic cartridge with extended flash tube, sub-sonic cartridges, and user identification for firearms and site sensing fire control |
US9520031B2 (en) | 2008-07-07 | 2016-12-13 | Etasse Limited | Slot machine game with symbol lock-in |
US9533221B1 (en) * | 2012-09-24 | 2017-01-03 | Ryan Welch | Video game controller device for a firearm |
JP2017067439A (en) * | 2016-12-28 | 2017-04-06 | 株式会社エイテック | Target system |
US9644826B2 (en) | 2014-04-25 | 2017-05-09 | Larry E. Moore | Weapon with redirected lighting beam |
US20170261283A1 (en) * | 2016-03-10 | 2017-09-14 | Joe Jenius Inc. | Portable dry fire practice shooting system |
US9782667B1 (en) | 2009-02-27 | 2017-10-10 | George Carter | System and method of assigning a target profile for a simulation shooting system |
US20170292813A1 (en) * | 2016-04-07 | 2017-10-12 | Jab Company Llc | Target shooting |
US9829280B1 (en) | 2016-05-26 | 2017-11-28 | Larry E. Moore | Laser activated moving target |
US9830408B1 (en) * | 2012-11-29 | 2017-11-28 | The United States Of America As Represented By The Secretary Of The Army | System and method for evaluating the performance of a weapon system |
IL256122A (en) * | 2017-12-05 | 2018-01-31 | Engelstein Tal | A smart safety contraption and methods related thereto for use with a firearm |
US9921017B1 (en) | 2013-03-15 | 2018-03-20 | Victor B. Kley | User identification for weapons and site sensing fire control |
US20180245881A1 (en) * | 2017-02-27 | 2018-08-30 | Kurt S. SCHULZ | Firearm simulator targets and firearm simulation systems |
US10132595B2 (en) | 2015-03-20 | 2018-11-20 | Larry E. Moore | Cross-bow alignment sighter |
DE102017006254A1 (en) | 2017-06-30 | 2019-01-03 | Simon Fröhlich | Apparatus for evaluating laser shots on targets |
US10209033B1 (en) | 2018-01-30 | 2019-02-19 | Larry E. Moore | Light sighting and training device |
US10209030B2 (en) | 2016-08-31 | 2019-02-19 | Larry E. Moore | Gun grip |
US10213679B1 (en) | 2009-02-27 | 2019-02-26 | George Carter | Simulated indirect fire system and method |
US10436538B2 (en) | 2017-05-19 | 2019-10-08 | Crimson Trace Corporation | Automatic pistol slide with laser |
US10436553B2 (en) | 2014-08-13 | 2019-10-08 | Crimson Trace Corporation | Master module light source and trainer |
US10451376B2 (en) | 2014-12-16 | 2019-10-22 | Kurt S. SCHULZ | Firearm simulators |
US10527390B1 (en) | 2009-02-27 | 2020-01-07 | George Carter | System and method of marksmanship training utilizing an optical system |
US10532275B2 (en) | 2012-01-18 | 2020-01-14 | Crimson Trace Corporation | Laser activated moving target |
JP2020041743A (en) * | 2018-09-11 | 2020-03-19 | 株式会社日立国際電気 | Firing training system |
US10712116B1 (en) * | 2014-07-14 | 2020-07-14 | Triggermaster, Llc | Firearm body motion detection training system |
CN111433553A (en) * | 2017-12-07 | 2020-07-17 | 李国镇 | Integrated shooting simulation system using fisheye lens camera |
CN112513556A (en) * | 2018-03-21 | 2021-03-16 | 因韦里斯培训解决方案公司 | Device and method for detecting a firing event |
CN112665454A (en) * | 2021-01-18 | 2021-04-16 | 河北砺兵科技有限责任公司 | Target distribution method in man-machine confrontation training |
US11278818B2 (en) * | 2017-07-05 | 2022-03-22 | Irisity AB | Method for apply gamification techniques to a security system |
US11383151B2 (en) * | 2016-08-17 | 2022-07-12 | Fowling Enterprises, Llc | Automated game scoring and pin tracking system |
US11662178B1 (en) | 2009-02-27 | 2023-05-30 | George Carter | System and method of marksmanship training utilizing a drone and an optical system |
US20230280133A1 (en) * | 2022-03-04 | 2023-09-07 | Naki U. Soyturk | Laser-based firearm and target assembly and method of use |
US11882813B2 (en) | 2020-10-15 | 2024-01-30 | Ronnie A Valdez | Wildlife tracking system |
Citations (98)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US2934634A (en) * | 1957-07-09 | 1960-04-26 | William M Hellberg | Game and practice attachment for a gun |
US3452453A (en) * | 1966-05-05 | 1969-07-01 | Saab Ab | Gunnery practice apparatus employing laser beams |
US3510965A (en) * | 1967-04-14 | 1970-05-12 | Don E Rhea | Training aid for sighting small arms |
US3590225A (en) * | 1969-02-14 | 1971-06-29 | Brunswick Corp | Digital arrow location computer |
US3633285A (en) * | 1970-03-09 | 1972-01-11 | Litton Systems Inc | Laser markmanship trainer |
US3782832A (en) * | 1973-04-12 | 1974-01-01 | Us Army | Method of boresight alignment of a weapon |
US3792535A (en) * | 1972-12-11 | 1974-02-19 | Us Navy | Laser rifle simulator system |
US3888022A (en) * | 1974-06-04 | 1975-06-10 | Us Army | Moving target screen |
US3938262A (en) * | 1974-10-17 | 1976-02-17 | Hughes Aircraft Company | Laser weapon simulator |
US4068393A (en) * | 1972-06-27 | 1978-01-17 | Vsevolod Tararine | Projectile firing training method and device |
US4102059A (en) * | 1975-04-03 | 1978-07-25 | Cerheronics Inc. | Small arms laser training device |
US4164081A (en) * | 1977-11-10 | 1979-08-14 | The United States Of America As Represented By The Secretary Of The Navy | Remote target hit monitoring system |
US4195422A (en) * | 1976-12-20 | 1980-04-01 | Laspo Ag | System for simulating weapon firing |
US4218834A (en) * | 1978-03-02 | 1980-08-26 | Saab-Scania Ab | Scoring of simulated weapons fire with sweeping fan-shaped beams |
US4256013A (en) * | 1979-03-30 | 1981-03-17 | Quitadama Dominick J | Multiple target weapons system |
US4269415A (en) * | 1979-04-13 | 1981-05-26 | Thorne Booth George M | Scoring system for shooting gallery |
US4281993A (en) * | 1980-05-19 | 1981-08-04 | The United States Of America As Represented By The Secretary Of The Navy | Semiconductor laser alignment device |
US4313272A (en) * | 1979-04-25 | 1982-02-02 | Laser Products Corporation | Laser beam firearm aim assisting methods and apparatus |
US4313273A (en) * | 1979-04-25 | 1982-02-02 | Laser Products Corporation | Firearms and laser beam aim assisting methods and apparatus |
US4336018A (en) * | 1979-12-19 | 1982-06-22 | The United States Of America As Represented By The Secretary Of The Navy | Electro-optic infantry weapons trainer |
US4340370A (en) * | 1980-09-08 | 1982-07-20 | Marshall Albert H | Linear motion and pop-up target training system |
US4367516A (en) * | 1980-11-03 | 1983-01-04 | Jacob Lionel C | Marksmanship training device and method |
US4439156A (en) * | 1982-01-11 | 1984-03-27 | The United States Of America As Represented By The Secretary Of The Navy | Anti-armor weapons trainer |
US4452458A (en) * | 1981-09-18 | 1984-06-05 | C. Carl Timander | Device to determine, indicate and record aim of object |
US4518360A (en) * | 1981-06-22 | 1985-05-21 | The Singer Company | Device to compensate for distortion in target location in a visual system |
US4572509A (en) * | 1982-09-30 | 1986-02-25 | Sitrick David H | Video game network |
US4583950A (en) * | 1984-08-31 | 1986-04-22 | Schroeder James E | Light pen marksmanship trainer |
US4592554A (en) * | 1983-04-05 | 1986-06-03 | Peter Gilbertson | Equipment for simulated shooting |
US4640514A (en) * | 1984-02-24 | 1987-02-03 | Noptel Ky | Optoelectronic target practice apparatus |
US4657511A (en) * | 1983-12-15 | 1987-04-14 | Giravions Dorand | Indoor training device for weapon firing |
US4662845A (en) * | 1985-09-27 | 1987-05-05 | Loral Electro-Optical Systems, Inc. | Target system for laser marksmanship training devices |
US4678437A (en) * | 1985-09-27 | 1987-07-07 | Technology Network International, Inc. | Cartridge and target device for markmanship training |
US4680012A (en) * | 1984-07-07 | 1987-07-14 | Ferranti, Plc | Projected imaged weapon training apparatus |
US4737106A (en) * | 1985-03-23 | 1988-04-12 | Schlumberger Electronics (U.K.) Limited | Weapon training systems |
US4804325A (en) * | 1986-05-15 | 1989-02-14 | Spartanics, Ltd. | Weapon training simulator system |
US4811955A (en) * | 1986-09-29 | 1989-03-14 | Carlo De Bernardini | Hand fire-arm for shooting without ammunition |
US4830617A (en) * | 1986-01-18 | 1989-05-16 | Accles And Shelvoke Limited | Apparatus for simulated shooting |
US4898391A (en) * | 1988-11-14 | 1990-02-06 | Lazer-Tron Company | Target shooting game |
US4922401A (en) * | 1989-05-22 | 1990-05-01 | International Fuel Cells | Inverter circuit utilizing the reverse voltage capabilities of symmetrical gate turn off thyristors |
US4923402A (en) * | 1988-11-25 | 1990-05-08 | The United States Of America As Represented By The Secretary Of The Navy | Marksmanship expert trainer |
US4934937A (en) * | 1988-12-14 | 1990-06-19 | Tommy Judd | Combat training system and apparatus |
US4983123A (en) * | 1989-11-06 | 1991-01-08 | Phase Dynamics, Inc. | Marksmanship training apparatus |
US4988111A (en) * | 1988-12-12 | 1991-01-29 | Yonatan Gerlizt | Non hand-held toy |
US4995812A (en) * | 1989-11-02 | 1991-02-26 | Reynolds Fred W | Method of obtaining usable hydrocolloid impression material |
US5004423A (en) * | 1988-06-30 | 1991-04-02 | Bertrams Kurt U | Training aid for such side arms as revolvers and pistols |
US5026158A (en) * | 1988-07-15 | 1991-06-25 | Golubic Victor G | Apparatus and method for displaying and storing impact points of firearm projectiles on a sight field of view |
US5035622A (en) * | 1989-11-29 | 1991-07-30 | The United States Of America As Represented By The Secretary Of The Navy | Machine gun and minor caliber weapons trainer |
US5090708A (en) * | 1990-12-12 | 1992-02-25 | Yonatan Gerlitz | Non hand-held toy |
US5092071A (en) * | 1991-03-13 | 1992-03-03 | Larry Moore | Weapon accessory mount |
US5095433A (en) * | 1990-08-01 | 1992-03-10 | Coyote Manufacturing, Inc. | Target reporting system |
US5119576A (en) * | 1989-06-06 | 1992-06-09 | Torsten Erning | Firearm with separable radiation emitting attachment |
US5179235A (en) * | 1991-09-10 | 1993-01-12 | Toole Ronald L | Pistol sighting device |
US5181015A (en) * | 1989-11-07 | 1993-01-19 | Proxima Corporation | Method and apparatus for calibrating an optical computer input system |
US5194007A (en) * | 1991-05-20 | 1993-03-16 | The United States Of America As Represented By The Secretary Of The Navy | Semiconductor laser weapon trainer and target designator for live fire |
US5194006A (en) * | 1991-05-15 | 1993-03-16 | Zaenglein Jr William | Shooting simulating process and training device |
US5194008A (en) * | 1992-03-26 | 1993-03-16 | Spartanics, Ltd. | Subliminal image modulation projection and detection system and method |
US5208418A (en) * | 1987-05-15 | 1993-05-04 | Oerlikon-Contraves Ag | Aligning method for a fire control device and apparatus for carrying out the alignment method |
US5213503A (en) * | 1991-11-05 | 1993-05-25 | The United States Of America As Represented By The Secretary Of The Navy | Team trainer |
US5215465A (en) * | 1991-11-05 | 1993-06-01 | The United States Of America As Represented By The Secretary Of The Navy | Infrared spot tracker |
US5328190A (en) * | 1992-08-04 | 1994-07-12 | Dart International, Inc. | Method and apparatus enabling archery practice |
US5400095A (en) * | 1993-05-11 | 1995-03-21 | Proxima Corporation | Display projection method and apparatus an optical input device therefor |
US5413357A (en) * | 1992-07-06 | 1995-05-09 | Nsm Aktiengesellschaft | Program controlled entertainment and game apparatus |
US5433134A (en) * | 1993-10-05 | 1995-07-18 | Leiter; Edward J. | Blank firing conversions for semiautomatic pistols |
US5486001A (en) * | 1991-05-30 | 1996-01-23 | Baker; Rick | Personalized instructional aid |
US5488795A (en) * | 1994-02-28 | 1996-02-06 | American Laser Technology, Inc. | Multi-caliber laser firing cartridge |
US5502459A (en) * | 1989-11-07 | 1996-03-26 | Proxima Corporation | Optical auxiliary input arrangement and method of using same |
US5504501A (en) * | 1989-11-07 | 1996-04-02 | Proxima Corporation | Optical input arrangement and method of using same |
US5515079A (en) * | 1989-11-07 | 1996-05-07 | Proxima Corporation | Computer input system and method of using same |
US5529310A (en) * | 1994-10-19 | 1996-06-25 | Interactive Innovations, Inc. | Hand-held multi-function wireless target control system |
US5591032A (en) * | 1995-03-23 | 1997-01-07 | Richard L. Powell | Laser weapon simulator apparatus with firing detection system |
US5594468A (en) * | 1989-11-07 | 1997-01-14 | Proxima Corporation | Optical system auxiliary input calibration arrangement and method of using same |
US5605461A (en) * | 1994-10-27 | 1997-02-25 | Seeton; Gary E. | Acoustic triggered laser device for simulating firearms |
US5613913A (en) * | 1994-04-06 | 1997-03-25 | Sega Enterprises, Ltd. | Method for developing attractions in a shooting game system |
US5641288A (en) * | 1996-01-11 | 1997-06-24 | Zaenglein, Jr.; William G. | Shooting simulating process and training device using a virtual reality display screen |
US5649706A (en) * | 1994-09-21 | 1997-07-22 | Treat, Jr.; Erwin C. | Simulator and practice method |
US5716216A (en) * | 1996-11-26 | 1998-02-10 | Lightshot Systems, Inc. | System for simulating shooting sports |
US5738522A (en) * | 1995-05-08 | 1998-04-14 | N.C.C. Network Communications And Computer Systems | Apparatus and methods for accurately sensing locations on a surface |
US5740626A (en) * | 1997-04-03 | 1998-04-21 | Olympic Arms, Inc. | Modified firearms for firing simulated ammunition |
US5870136A (en) * | 1997-12-05 | 1999-02-09 | The University Of North Carolina At Chapel Hill | Dynamic generation of imperceptible structured light for tracking and acquisition of three dimensional scene geometry and surface characteristics in interactive three dimensional computer graphics applications |
US5890906A (en) * | 1995-01-20 | 1999-04-06 | Vincent J. Macri | Method and apparatus for tutorial, self and assisted instruction directed to simulated preparation, training and competitive play and entertainment |
US6012980A (en) * | 1995-12-01 | 2000-01-11 | Kabushiki Kaisha Sega Enterprises | Coordinates detecting device, method for same and game device |
US6028593A (en) * | 1995-12-01 | 2000-02-22 | Immersion Corporation | Method and apparatus for providing simulated physical interactions within computer generated environments |
US20020009694A1 (en) * | 2000-01-13 | 2002-01-24 | Rosa Stephen P. | Firearm laser training system and kit including a target structure having sections of varying reflectivity for visually indicating simulated projectile impact locations |
US20020012898A1 (en) * | 2000-01-13 | 2002-01-31 | Motti Shechter | Firearm simulation and gaming system and method for operatively interconnecting a firearm peripheral to a computer system |
US6385855B1 (en) * | 1998-07-10 | 2002-05-14 | Nanoptics, Inc. | Sighting device for projectile type weapons for operation in day and night |
US6390642B1 (en) * | 2000-02-16 | 2002-05-21 | Robert Wayne Simonton | Tracer light for archer's arrow |
US6551189B1 (en) * | 1999-12-03 | 2003-04-22 | Beijing Kangti Recreation Equipment Center | Simulated laser shooting system |
US6572375B2 (en) * | 2000-01-13 | 2003-06-03 | Beamhit, Llc | Firearm laser training system and method employing modified blank cartridges for simulating operation of a firearm |
US6575753B2 (en) * | 2000-05-19 | 2003-06-10 | Beamhit, Llc | Firearm laser training system and method employing an actuable target assembly |
US6579098B2 (en) * | 2000-01-13 | 2003-06-17 | Beamhit, Llc | Laser transmitter assembly configured for placement within a firing chamber and method of simulating firearm operation |
US20030136900A1 (en) * | 1997-08-25 | 2003-07-24 | Motti Shechter | Network-linked laser target firearm training system |
US20040014010A1 (en) * | 1997-08-25 | 2004-01-22 | Swensen Frederick B. | Archery laser training system and method of simulating weapon operation |
US6709272B2 (en) * | 2001-08-07 | 2004-03-23 | Bruce K. Siddle | Method for facilitating firearms training via the internet |
US6739873B1 (en) * | 1996-09-18 | 2004-05-25 | Bristlecone Corporation | Method and apparatus for training a shooter of a firearm |
US20040136567A1 (en) * | 2002-10-22 | 2004-07-15 | Billinghurst Mark N. | Tracking a surface in a 3-dimensional scene using natural visual features of the surface |
US20050017456A1 (en) * | 2002-10-29 | 2005-01-27 | Motti Shechter | Target system and method for ascertaining target impact locations of a projectile propelled from a soft air type firearm |
US20070082322A1 (en) * | 2005-10-12 | 2007-04-12 | Matvey Lvovskiy | Training simulator for sharp shooting |
US20070166669A1 (en) * | 2005-12-19 | 2007-07-19 | Raydon Corporation | Perspective tracking system |
- 2006-12-21 US US11/642,589 patent/US20070190495A1/en not_active Abandoned
US4811955A (en) * | 1986-09-29 | 1989-03-14 | Carlo De Bernardini | Hand fire-arm for shooting without ammunition |
US5208418A (en) * | 1987-05-15 | 1993-05-04 | Oerlikon-Contraves Ag | Aligning method for a fire control device and apparatus for carrying out the alignment method |
US5004423A (en) * | 1988-06-30 | 1991-04-02 | Bertrams Kurt U | Training aid for such side arms as revolvers and pistols |
US5026158A (en) * | 1988-07-15 | 1991-06-25 | Golubic Victor G | Apparatus and method for displaying and storing impact points of firearm projectiles on a sight field of view |
US4898391A (en) * | 1988-11-14 | 1990-02-06 | Lazer-Tron Company | Target shooting game |
US4923402A (en) * | 1988-11-25 | 1990-05-08 | The United States Of America As Represented By The Secretary Of The Navy | Marksmanship expert trainer |
US4988111A (en) * | 1988-12-12 | 1991-01-29 | Yonatan Gerlizt | Non hand-held toy |
US4934937A (en) * | 1988-12-14 | 1990-06-19 | Tommy Judd | Combat training system and apparatus |
US4922401A (en) * | 1989-05-22 | 1990-05-01 | International Fuel Cells | Inverter circuit utilizing the reverse voltage capabilities of symmetrical gate turn off thyristors |
US5119576A (en) * | 1989-06-06 | 1992-06-09 | Torsten Erning | Firearm with separable radiation emitting attachment |
US4995812A (en) * | 1989-11-02 | 1991-02-26 | Reynolds Fred W | Method of obtaining usable hydrocolloid impression material |
US4983123A (en) * | 1989-11-06 | 1991-01-08 | Phase Dynamics, Inc. | Marksmanship training apparatus |
US5594468A (en) * | 1989-11-07 | 1997-01-14 | Proxima Corporation | Optical system auxiliary input calibration arrangement and method of using same |
US5515079A (en) * | 1989-11-07 | 1996-05-07 | Proxima Corporation | Computer input system and method of using same |
US5181015A (en) * | 1989-11-07 | 1993-01-19 | Proxima Corporation | Method and apparatus for calibrating an optical computer input system |
US5504501A (en) * | 1989-11-07 | 1996-04-02 | Proxima Corporation | Optical input arrangement and method of using same |
US5502459A (en) * | 1989-11-07 | 1996-03-26 | Proxima Corporation | Optical auxiliary input arrangement and method of using same |
US5035622A (en) * | 1989-11-29 | 1991-07-30 | The United States Of America As Represented By The Secretary Of The Navy | Machine gun and minor caliber weapons trainer |
US5095433A (en) * | 1990-08-01 | 1992-03-10 | Coyote Manufacturing, Inc. | Target reporting system |
US5090708A (en) * | 1990-12-12 | 1992-02-25 | Yonatan Gerlitz | Non hand-held toy |
US5092071A (en) * | 1991-03-13 | 1992-03-03 | Larry Moore | Weapon accessory mount |
US5194006A (en) * | 1991-05-15 | 1993-03-16 | Zaenglein Jr William | Shooting simulating process and training device |
US5281142A (en) * | 1991-05-15 | 1994-01-25 | Zaenglein Jr William | Shooting simulating process and training device |
US5194007A (en) * | 1991-05-20 | 1993-03-16 | The United States Of America As Represented By The Secretary Of The Navy | Semiconductor laser weapon trainer and target designator for live fire |
US5486001A (en) * | 1991-05-30 | 1996-01-23 | Baker; Rick | Personalized instructional aid |
US5179235A (en) * | 1991-09-10 | 1993-01-12 | Toole Ronald L | Pistol sighting device |
US5213503A (en) * | 1991-11-05 | 1993-05-25 | The United States Of America As Represented By The Secretary Of The Navy | Team trainer |
US5215465A (en) * | 1991-11-05 | 1993-06-01 | The United States Of America As Represented By The Secretary Of The Navy | Infrared spot tracker |
US5194008A (en) * | 1992-03-26 | 1993-03-16 | Spartanics, Ltd. | Subliminal image modulation projection and detection system and method |
US5413357A (en) * | 1992-07-06 | 1995-05-09 | Nsm Aktiengesellschaft | Program controlled entertainment and game apparatus |
US5328190A (en) * | 1992-08-04 | 1994-07-12 | Dart International, Inc. | Method and apparatus enabling archery practice |
US5400095A (en) * | 1993-05-11 | 1995-03-21 | Proxima Corporation | Display projection method and apparatus an optical input device therefor |
US5433134A (en) * | 1993-10-05 | 1995-07-18 | Leiter; Edward J. | Blank firing conversions for semiautomatic pistols |
US5488795A (en) * | 1994-02-28 | 1996-02-06 | American Laser Technology, Inc. | Multi-caliber laser firing cartridge |
US5613913A (en) * | 1994-04-06 | 1997-03-25 | Sega Enterprises, Ltd. | Method for developing attractions in a shooting game system |
US5649706A (en) * | 1994-09-21 | 1997-07-22 | Treat, Jr.; Erwin C. | Simulator and practice method |
US5529310A (en) * | 1994-10-19 | 1996-06-25 | Interactive Innovations, Inc. | Hand-held multi-function wireless target control system |
US5605461A (en) * | 1994-10-27 | 1997-02-25 | Seeton; Gary E. | Acoustic triggered laser device for simulating firearms |
US5890906A (en) * | 1995-01-20 | 1999-04-06 | Vincent J. Macri | Method and apparatus for tutorial, self and assisted instruction directed to simulated preparation, training and competitive play and entertainment |
US5591032A (en) * | 1995-03-23 | 1997-01-07 | Richard L. Powell | Laser weapon simulator apparatus with firing detection system |
US5738522A (en) * | 1995-05-08 | 1998-04-14 | N.C.C. Network Communications And Computer Systems | Apparatus and methods for accurately sensing locations on a surface |
US6012980A (en) * | 1995-12-01 | 2000-01-11 | Kabushiki Kaisha Sega Enterprises | Coordinates detecting device, method for same and game device |
US6028593A (en) * | 1995-12-01 | 2000-02-22 | Immersion Corporation | Method and apparatus for providing simulated physical interactions within computer generated environments |
US5641288A (en) * | 1996-01-11 | 1997-06-24 | Zaenglein, Jr.; William G. | Shooting simulating process and training device using a virtual reality display screen |
US6739873B1 (en) * | 1996-09-18 | 2004-05-25 | Bristlecone Corporation | Method and apparatus for training a shooter of a firearm |
US5716216A (en) * | 1996-11-26 | 1998-02-10 | Lightshot Systems, Inc. | System for simulating shooting sports |
US5740626A (en) * | 1997-04-03 | 1998-04-21 | Olympic Arms, Inc. | Modified firearms for firing simulated ammunition |
US20040014010A1 (en) * | 1997-08-25 | 2004-01-22 | Swensen Frederick B. | Archery laser training system and method of simulating weapon operation |
US20030136900A1 (en) * | 1997-08-25 | 2003-07-24 | Motti Shechter | Network-linked laser target firearm training system |
US5870136A (en) * | 1997-12-05 | 1999-02-09 | The University Of North Carolina At Chapel Hill | Dynamic generation of imperceptible structured light for tracking and acquisition of three dimensional scene geometry and surface characteristics in interactive three dimensional computer graphics applications |
US6385855B1 (en) * | 1998-07-10 | 2002-05-14 | Nanoptics, Inc. | Sighting device for projectile type weapons for operation in day and night |
US6551189B1 (en) * | 1999-12-03 | 2003-04-22 | Beijing Kangti Recreation Equipment Center | Simulated laser shooting system |
US6579098B2 (en) * | 2000-01-13 | 2003-06-17 | Beamhit, Llc | Laser transmitter assembly configured for placement within a firing chamber and method of simulating firearm operation |
US6572375B2 (en) * | 2000-01-13 | 2003-06-03 | Beamhit, Llc | Firearm laser training system and method employing modified blank cartridges for simulating operation of a firearm |
US20020012898A1 (en) * | 2000-01-13 | 2002-01-31 | Motti Shechter | Firearm simulation and gaming system and method for operatively interconnecting a firearm peripheral to a computer system |
US20020009694A1 (en) * | 2000-01-13 | 2002-01-24 | Rosa Stephen P. | Firearm laser training system and kit including a target structure having sections of varying reflectivity for visually indicating simulated projectile impact locations |
US6390642B1 (en) * | 2000-02-16 | 2002-05-21 | Robert Wayne Simonton | Tracer light for archer's arrow |
US6575753B2 (en) * | 2000-05-19 | 2003-06-10 | Beamhit, Llc | Firearm laser training system and method employing an actuable target assembly |
US6709272B2 (en) * | 2001-08-07 | 2004-03-23 | Bruce K. Siddle | Method for facilitating firearms training via the internet |
US20040136567A1 (en) * | 2002-10-22 | 2004-07-15 | Billinghurst Mark N. | Tracking a surface in a 3-dimensional scene using natural visual features of the surface |
US20050017456A1 (en) * | 2002-10-29 | 2005-01-27 | Motti Shechter | Target system and method for ascertaining target impact locations of a projectile propelled from a soft air type firearm |
US20070082322A1 (en) * | 2005-10-12 | 2007-04-12 | Matvey Lvovskiy | Training simulator for sharp shooting |
US20070166669A1 (en) * | 2005-12-19 | 2007-07-19 | Raydon Corporation | Perspective tracking system |
Cited By (104)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8621774B1 (en) | 2004-03-29 | 2014-01-07 | Metadigm Llc | Firearm with multiple targeting laser diodes |
US9470485B1 (en) | 2004-03-29 | 2016-10-18 | Victor B. Kley | Molded plastic cartridge with extended flash tube, sub-sonic cartridges, and user identification for firearms and site sensing fire control |
US9891030B1 (en) | 2004-03-29 | 2018-02-13 | Victor B. Kley | Molded plastic cartridge with extended flash tube, sub-sonic cartridges, and user identification for firearms and site sensing fire control |
US7926408B1 (en) * | 2005-11-28 | 2011-04-19 | Metadigm Llc | Velocity, internal ballistics and external ballistics detection and control for projectile devices and a reduction in device related pollution |
US8695266B2 (en) | 2005-12-22 | 2014-04-15 | Larry Moore | Reference beam generating apparatus |
US20190226791A1 (en) * | 2006-08-15 | 2019-07-25 | Triggermaster, Inc. | Trigger pull training device |
US8777620B1 (en) * | 2006-08-15 | 2014-07-15 | Triggermaster, Inc. | Firearm trigger pull training system and methods |
US8911235B1 (en) | 2006-08-15 | 2014-12-16 | Triggermaster, Inc. | Shooting training device |
US9151564B1 (en) | 2006-08-15 | 2015-10-06 | Triggermaster, Inc. | Firearm trigger pull training system and methods |
US9728095B1 (en) | 2006-08-15 | 2017-08-08 | Triggermaster, Llc | Firearm trigger pull training system and methods |
US10247505B1 (en) * | 2006-08-15 | 2019-04-02 | Triggermaster, Llc | Trigger pull training device |
US11788813B2 (en) * | 2006-08-15 | 2023-10-17 | Triggermaster, Llc | Trigger pull training device |
US8690664B2 (en) | 2006-09-25 | 2014-04-08 | Etasse Limited | Slot machine game with additional award indicator |
US9165419B2 (en) * | 2006-10-23 | 2015-10-20 | Etasse Limited | Slot machine bonus game providing awards for manual dexterity |
US20080119261A1 (en) * | 2006-10-23 | 2008-05-22 | Jorge Heymann | Slot machine bonus game providing awards for manual dexterity |
US20090059094A1 (en) * | 2007-09-04 | 2009-03-05 | Samsung Techwin Co., Ltd. | Apparatus and method for overlaying image in video presentation system having embedded operating system |
US8702493B2 (en) | 2007-11-09 | 2014-04-22 | Etasse Limited | Slot machine game with award based on another machine |
US20090124352A1 (en) * | 2007-11-13 | 2009-05-14 | Ignacio Gerson | Slot machine game with side pot |
US20100273131A1 (en) * | 2007-12-11 | 2010-10-28 | Korea Elecom Co., Ltd. | Laser transmitter for simulating a fire weapon and manufacturing method thereof |
US20090253103A1 (en) * | 2008-03-25 | 2009-10-08 | Hogan Jr Richard Russell | Devices, systems and methods for firearms training, simulation and operations |
US8827706B2 (en) | 2008-03-25 | 2014-09-09 | Practical Air Rifle Training Systems, LLC | Devices, systems and methods for firearms training, simulation and operations |
US9520031B2 (en) | 2008-07-07 | 2016-12-13 | Etasse Limited | Slot machine game with symbol lock-in |
US8627591B2 (en) | 2008-09-05 | 2014-01-14 | Larry Moore | Slot-mounted sighting device |
US9188407B2 (en) | 2008-10-10 | 2015-11-17 | Larry E. Moore | Gun with side mounting plate |
US8607495B2 (en) | 2008-10-10 | 2013-12-17 | Larry E. Moore | Light-assisted sighting devices |
US8813411B2 (en) | 2008-10-10 | 2014-08-26 | P&L Industries, Inc. | Gun with side mounting plate |
US20100092925A1 (en) * | 2008-10-15 | 2010-04-15 | Matvey Lvovskiy | Training simulator for sharp shooting |
US10625147B1 (en) | 2009-02-27 | 2020-04-21 | George Carter | System and method of marksmanship training utilizing an optical system |
US9308437B2 (en) * | 2009-02-27 | 2016-04-12 | Tactical Entertainment, Llc | Error correction system and method for a simulation shooting system |
US11662178B1 (en) | 2009-02-27 | 2023-05-30 | George Carter | System and method of marksmanship training utilizing a drone and an optical system |
US20150141100A1 (en) * | 2009-02-27 | 2015-05-21 | George Carter | Error Correction System and Method for a Simulation Shooting System |
US10213679B1 (en) | 2009-02-27 | 2019-02-26 | George Carter | Simulated indirect fire system and method |
US10527390B1 (en) | 2009-02-27 | 2020-01-07 | George Carter | System and method of marksmanship training utilizing an optical system |
US11359887B1 (en) | 2009-02-27 | 2022-06-14 | George Carter | System and method of marksmanship training utilizing an optical system |
US9782667B1 (en) | 2009-02-27 | 2017-10-10 | George Carter | System and method of assigning a target profile for a simulation shooting system |
US20100273130A1 (en) * | 2009-04-22 | 2010-10-28 | Integrated Digital Technologies, Inc. | Shooting training systems using an embedded photo sensing panel |
WO2011056196A1 (en) * | 2009-11-09 | 2011-05-12 | Projectionworks, Inc. | Systems and methods for optically projecting three-dimensional text, images and/or symbols onto three-dimensional objects |
US20110169924A1 (en) * | 2009-11-09 | 2011-07-14 | Brett Stanton Haisty | Systems and methods for optically projecting three-dimensional text, images and/or symbols onto three-dimensional objects |
US8610761B2 (en) * | 2009-11-09 | 2013-12-17 | Projectionworks, Inc. | Systems and methods for optically projecting three-dimensional text, images and/or symbols onto three-dimensional objects |
US9332251B2 (en) | 2009-11-09 | 2016-05-03 | Delta Sigma Company | Systems and methods for optically projecting three-dimensional text, images and/or symbols onto three-dimensional objects |
US9163904B2 (en) * | 2010-01-19 | 2015-10-20 | Oren Louis Uhr | Dry fire training device |
US11906270B2 (en) | 2010-01-19 | 2024-02-20 | Oren Louis Uhr | Dry fire training device |
US20140193779A1 (en) * | 2010-01-19 | 2014-07-10 | Oren Louis Uhr | Dry fire training device |
US11415394B2 (en) * | 2010-01-19 | 2022-08-16 | Oren Louis Uhr | Dry fire training device |
US11680775B2 (en) | 2010-01-19 | 2023-06-20 | Oren Louis Uhr | Dry fire training device |
US20110306019A1 (en) * | 2010-06-09 | 2011-12-15 | Ohio Ordnance Works, Inc. | M249 non-firearm simulating a functional m249 firearm |
US8764446B2 (en) * | 2010-06-09 | 2014-07-01 | Robert I. Landies | M249 non-firearm simulating a functional M249 firearm |
US8696150B2 (en) | 2011-01-18 | 2014-04-15 | Larry E. Moore | Low-profile side mounted laser sighting device |
US9170079B2 (en) | 2011-01-18 | 2015-10-27 | Larry E. Moore | Laser trainer cartridge |
US9429404B2 (en) * | 2011-01-18 | 2016-08-30 | Larry E. Moore | Laser trainer target |
US20130017515A1 (en) * | 2011-01-18 | 2013-01-17 | Moore Larry E | Laser trainer target |
US9915508B2 (en) | 2011-01-18 | 2018-03-13 | Larry Moore | Laser trainer target |
US20120295229A1 (en) * | 2011-05-19 | 2012-11-22 | Fortitude North, Inc. | Systems and Methods for Analyzing a Marksman Training Exercise |
US10532275B2 (en) | 2012-01-18 | 2020-01-14 | Crimson Trace Corporation | Laser activated moving target |
US9533221B1 (en) * | 2012-09-24 | 2017-01-03 | Ryan Welch | Video game controller device for a firearm |
US9830408B1 (en) * | 2012-11-29 | 2017-11-28 | The United States Of America As Represented By The Secretary Of The Army | System and method for evaluating the performance of a weapon system |
US8844189B2 (en) | 2012-12-06 | 2014-09-30 | P&L Industries, Inc. | Sighting device replicating shotgun pattern spread |
US9146077B2 (en) | 2012-12-06 | 2015-09-29 | Larry E. Moore | Shotgun with sighting device |
US20140168447A1 (en) * | 2012-12-18 | 2014-06-19 | Trackingpoint, Inc. | Optical Device Including a Mode for Grouping Shots for Use with Precision Guided Firearms |
US9921017B1 (en) | 2013-03-15 | 2018-03-20 | Victor B. Kley | User identification for weapons and site sensing fire control |
US20150023591A1 (en) * | 2013-07-18 | 2015-01-22 | Ramakrishna Potluri, Intellecttech Corp Pvt Ltd | Optical analysis of a point of aim of a projectile discharge device |
US9297614B2 (en) | 2013-08-13 | 2016-03-29 | Larry E. Moore | Master module light source, retainer and kits |
US20150091252A1 (en) * | 2013-10-02 | 2015-04-02 | Neil Chadwick | Shooting Target Management Systems and Related Methods |
US9441923B2 (en) * | 2013-10-02 | 2016-09-13 | Neil Chadwick | Shooting target management systems and related methods |
US9841254B2 (en) | 2014-02-17 | 2017-12-12 | Larry E. Moore | Front-grip lighting device |
US9182194B2 (en) | 2014-02-17 | 2015-11-10 | Larry E. Moore | Front-grip lighting device |
US10371365B2 (en) | 2014-04-25 | 2019-08-06 | Crimson Trace Corporation | Redirected light beam for weapons |
US9644826B2 (en) | 2014-04-25 | 2017-05-09 | Larry E. Moore | Weapon with redirected lighting beam |
US10712116B1 (en) * | 2014-07-14 | 2020-07-14 | Triggermaster, Llc | Firearm body motion detection training system |
US10436553B2 (en) | 2014-08-13 | 2019-10-08 | Crimson Trace Corporation | Master module light source and trainer |
US20160153755A1 (en) * | 2014-10-25 | 2016-06-02 | Benjamin J. Morgan | System and Method for Timing Firearm Practice Drills |
US11112204B2 (en) | 2014-12-16 | 2021-09-07 | Kurt S. SCHULZ | Firearm simulators |
US10451376B2 (en) | 2014-12-16 | 2019-10-22 | Kurt S. SCHULZ | Firearm simulators |
US10132595B2 (en) | 2015-03-20 | 2018-11-20 | Larry E. Moore | Cross-bow alignment sighter |
US11320228B2 (en) | 2015-03-23 | 2022-05-03 | Ronnie A. Valdez | Simulated hunting devices and methods |
US20160282076A1 (en) * | 2015-03-23 | 2016-09-29 | Ronnie VALDEZ | Simulated hunting devices and methods |
US10508882B2 (en) * | 2015-03-23 | 2019-12-17 | Ronnie VALDEZ | Simulated hunting devices and methods |
ES2578806A1 (en) * | 2016-02-19 | 2016-08-01 | Eduardo PÉREZ CALLE | Internal reactive impact pointing device (d.i.s.i.r.) |
US20170261283A1 (en) * | 2016-03-10 | 2017-09-14 | Joe Jenius Inc. | Portable dry fire practice shooting system |
US10627183B2 (en) * | 2016-03-10 | 2020-04-21 | Joe Jenius Inc. | Portable dry fire practice shooting system |
US20170292813A1 (en) * | 2016-04-07 | 2017-10-12 | Jab Company Llc | Target shooting |
US9829280B1 (en) | 2016-05-26 | 2017-11-28 | Larry E. Moore | Laser activated moving target |
US10113836B2 (en) | 2016-05-26 | 2018-10-30 | Larry E. Moore | Moving target activated by laser light |
US11383151B2 (en) * | 2016-08-17 | 2022-07-12 | Fowling Enterprises, Llc | Automated game scoring and pin tracking system |
US10209030B2 (en) | 2016-08-31 | 2019-02-19 | Larry E. Moore | Gun grip |
JP2017067439A (en) * | 2016-12-28 | 2017-04-06 | 株式会社エイテック | Target system |
US10895435B2 (en) | 2017-02-27 | 2021-01-19 | Kurt S. SCHULZ | Firearm simulator targets and firearm simulation systems |
US20180245881A1 (en) * | 2017-02-27 | 2018-08-30 | Kurt S. SCHULZ | Firearm simulator targets and firearm simulation systems |
US10436538B2 (en) | 2017-05-19 | 2019-10-08 | Crimson Trace Corporation | Automatic pistol slide with laser |
DE102017006254A1 (en) | 2017-06-30 | 2019-01-03 | Simon Fröhlich | Apparatus for evaluating laser shots on targets |
US11278818B2 (en) * | 2017-07-05 | 2022-03-22 | Irisity AB | Method for apply gamification techniques to a security system |
IL256122A (en) * | 2017-12-05 | 2018-01-31 | Engelstein Tal | A smart safety contraption and methods related thereto for use with a firearm |
US11293722B2 (en) | 2017-12-05 | 2022-04-05 | Tal Engelstein | Smart safety contraption and methods related thereto for use with a firearm |
CN111433553A (en) * | 2017-12-07 | 2020-07-17 | 李国镇 | Integrated shooting simulation system using fisheye lens camera |
US10209033B1 (en) | 2018-01-30 | 2019-02-19 | Larry E. Moore | Light sighting and training device |
CN112513556A (en) * | 2018-03-21 | 2021-03-16 | 因韦里斯培训解决方案公司 | Device and method for detecting a firing event |
US11719511B2 (en) * | 2018-03-21 | 2023-08-08 | Inveris Training Solutions, Inc. | Apparatus and methods for detection of a shot firing event |
CN112513556B (en) * | 2018-03-21 | 2024-03-08 | 因韦里斯培训解决方案公司 | Apparatus and method for detecting a firing event |
JP7160607B2 (en) | 2018-09-11 | 2022-10-25 | 株式会社日立国際電気 | shooting training system |
JP2020041743A (en) * | 2018-09-11 | 2020-03-19 | 株式会社日立国際電気 | Firing training system |
US11882813B2 (en) | 2020-10-15 | 2024-01-30 | Ronnie A Valdez | Wildlife tracking system |
CN112665454A (en) * | 2021-01-18 | 2021-04-16 | 河北砺兵科技有限责任公司 | Target distribution method in man-machine confrontation training |
US20230280133A1 (en) * | 2022-03-04 | 2023-09-07 | Naki U. Soyturk | Laser-based firearm and target assembly and method of use |
US11874094B2 (en) * | 2022-03-04 | 2024-01-16 | Naki U. Soyturk | Laser-based firearm and target assembly and method of use |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20070190495A1 (en) | Sensing device for firearm laser training system and method of simulating firearm operation with various training scenarios | |
US6966775B1 (en) | Firearm laser training system and method facilitating firearm training with various targets and visual feedback of simulated projectile impact locations | |
US7329127B2 (en) | Firearm laser training system and method facilitating firearm training for extended range targets with feedback of firearm control | |
US20020012898A1 (en) | Firearm simulation and gaming system and method for operatively interconnecting a firearm peripheral to a computer system | |
US6322365B1 (en) | Network-linked laser target firearm training system | |
US20080213732A1 (en) | System and Method for Calculating a Projectile Impact Coordinates | |
US4923402A (en) | Marksmanship expert trainer | |
US6942486B2 (en) | Training simulator for sharp shooting | |
US20040014010A1 (en) | Archery laser training system and method of simulating weapon operation | |
US20160298930A1 (en) | Target practice system | |
US20070254266A1 (en) | Marksmanship training device | |
US20100233660A1 (en) | Pulsed Laser-Based Firearm Training System, and Method for Facilitating Firearm Training Using Detection of Laser Pulse Impingement of Projected Target Images | |
US20100092925A1 (en) | Training simulator for sharp shooting | |
US20070160960A1 (en) | System and method for calculating a projectile impact coordinates | |
KR101330060B1 (en) | Method and system for training full duplex simulator shooting tactics using laser | |
EP1398595A1 (en) | Network-linked laser target firearm training system | |
WO2004104508A2 (en) | Archery laser training system and method | |
JPH11142097A (en) | Shooting training device | |
CN215064086U (en) | Shooting range system | |
RU2774375C2 (en) | Shooting simulator | |
US10876819B2 (en) | Multiview display for hand positioning in weapon accuracy training | |
CN112166676B (en) | Simulated actual combat shooting training system | |
AU783018B2 (en) | Network-linked laser target firearm training system | |
UA109927U (en) | ELECTRONIC MODULAR SHOOTER TRAINER | |
AU2920202A (en) | Network-linked laser target firearm training system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: L3 COMMUNICATIONS CORPORATION, NEW YORK Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:KENDIR, O. TANSEL;YILDIRIM, RIFAT TOLGA;REEL/FRAME:018706/0117 Effective date: 20061220 |
|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |
|
AS | Assignment |
Owner name: L-3 COMMUNICATIONS CORPORATION, NEW YORK Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:L3 COMMUNICATIONS CORPORATION;REEL/FRAME:026602/0245 Effective date: 20110119 |