US20020115047A1 - Method and system for marking content for physical motion analysis - Google Patents

Method and system for marking content for physical motion analysis

Info

Publication number
US20020115047A1
US20020115047A1 (application US09/788,031; also published as US 2002/0115047 A1)
Authority
US
United States
Prior art keywords
information
analysis
video
sensed
swing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US09/788,031
Inventor
Michael McNitt
Jeffrey Parks
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Golftec Enterprises LLC
Original Assignee
GolfTEC Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by GolfTEC Inc filed Critical GolfTEC Inc
Priority to US09/788,031
Assigned to GOLFTEC, INC. reassignment GOLFTEC, INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: MCNITT, MICHAEL J., PARKS, JEFFREY J.
Assigned to GOLFTEC, INC. reassignment GOLFTEC, INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: ERB, DAVID ARTHUR
Assigned to GOLFTEC ENTERPRISES LLC reassignment GOLFTEC ENTERPRISES LLC ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: GOLFTEC, INC.
Priority to PCT/US2002/005217 (published as WO2002066119A1)
Assigned to GOLFTEC ENTERPRISES LLC reassignment GOLFTEC ENTERPRISES LLC ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: GOLFTEC INC.
Publication of US20020115047A1
Legal status: Abandoned (current)

Classifications

    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63B - APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B24/00 - Electric or electronic controls for exercising apparatus of preceding groups; Controlling or monitoring of exercises, sportive games, training or athletic performances
    • A63B24/0003 - Analysing the course of a movement or motion sequences during an exercise or trainings sequence, e.g. swing for golf or tennis
    • A63B24/0006 - Computerised comparison for qualitative assessment of motion sequences or the course of a movement
    • A63B2024/0012 - Comparing movements or motion sequences with a registered reference
    • A63B69/00 - Training appliances or apparatus for special sports
    • A63B69/0002 - Training appliances or apparatus for special sports for baseball
    • A63B69/0015 - Training appliances or apparatus for special sports for cricket
    • A63B69/36 - Training appliances or apparatus for special sports for golf
    • A63B69/38 - Training appliances or apparatus for special sports for tennis
    • A63B2220/00 - Measuring of physical parameters relating to sporting activity
    • A63B2220/80 - Special sensors, transducers or devices therefor
    • A63B2220/806 - Video cameras
    • A63B2220/807 - Photo cameras

Definitions

  • the invention relates generally to a method and system for providing physical motion training and instruction. More particularly, the invention relates to a computer-implemented system for providing athletic training and instruction. Even more particularly, the present invention relates to the saving of lesson files for future review.
  • Some motion analysis systems provide animation that depicts elements of a golf swing based upon captured data. However, the animation is crude and does not show the golfer what he/she actually looks like during a swing.
  • motion analysis systems are used with video analysis systems in order to try to overcome the problems associated with each system as it is used independently of the other.
  • the instructor will use the motion capture data and subjectively map the information to the video data. Although this provides more specific data to the instructor, it is associated with at least one significant problem.
  • the instructor, while viewing the video, must estimate the swing positions corresponding to the data points from the motion analysis information. Accordingly, analysis of the swing requires not only considerable effort, but also a significant amount of estimation in associating the positional data points with an associated position on the student's swing.
  • the systems for providing the video analysis are separate from the systems that provide motion capture information such that the instructor must manipulate numerous controls for displaying, to the student, the various positional measurement values as well as for providing separate video replays.
  • Another problem associated with current methods of providing instructional information to the student relates to the fact that following a teaching session, students are typically provided a copy of the recorded session. Given that the entire teaching session is recorded, much of the recorded material is redundant or otherwise unnecessary. Thus, in order to provide only relevant material to the student, the instructor must review the entire recorded lesson and select and separately save only the relevant material. Doing so consumes a significant amount of time and effort.
  • an analysis tool that synchronizes at least two signals carrying sensed information associated with physical motion.
  • the synchronized signals are used in providing analysis related to the physical motion conducted.
  • the analysis tool incorporates a processing environment and at least two sensors sensing information related to physical motion.
  • the processing environment synchronizes signals received from the sensors and processes the synchronized signals to generate analysis information.
  • the analysis information provides information to allow for correction and instruction.
  • the processing environment includes a synchronization module to perform the synchronization of the signals, a processing module for processing the sensed information into analysis information, and an analysis module for presenting the analysis information to the athlete. Consequently, the present invention synchronizes information signals carrying two different forms of information, processes these signals, and presents combined information to provide correction and instruction.
  • the analysis tool is used to provide athletic training and instruction. Two or more signals carrying sensed information associated with athletic motion are synchronized to provide an athlete with analysis regarding the athletic motion sensed.
  • the analysis tool is used for golf swing analysis. While used for golf swing analysis, the signals relate to video frame data signal carrying video information of a golf swing and a positional data signal carrying positional motion information associated with positional measurements of elements of the golf swing. The video frame data signal and the positional data signal are synchronized by the analysis tool to provide golf swing analysis.
  • the analysis tool might be used for educational analysis of any element of an athletic motion where the element is used as a measure through which a sport is conducted.
  • the present invention relates to an overall system for providing athletic training and instruction.
  • the system has a first sensor generating a first information signal carrying a first type of sensed information and a second sensor generating a second information signal carrying a second type of sensed information.
  • the analysis tool also includes a synchronization module receiving the first and second signal and synchronizing the first signal with the second signal to provide a combined signal that can be used for athletic training and instruction.
  • the present invention relates to a method for providing athletic training and instruction, wherein the method includes the acts of receiving a first signal carrying sensed information samples from a first sensor and receiving a second signal carrying sensed information samples from a second sensor.
  • the sensed information from the second sensor is associated with a different form of sensed information than carried by the first signal.
  • the method includes the act of synchronizing the first signal with the second signal to provide an analysis tool for providing athletic training and instruction.
  • the present invention relates to a display device that displays both motion capture information and video information, whether synchronized or not.
  • the system also provides one control panel that can control the replay of two or more video signals and/or positional information.
  • the present invention also relates to a method of marking information while the training session is in process. Marked content is saved to a separate file that may be reviewed at a later time. By marking content contemporaneously during the training session, the instructor is not forced to review the entire lesson to select the relevant material. Instead, only the marked material is reviewed for relevancy and saved to the lesson file.
  • the present invention relates to an analysis method and system for analyzing physical motion, e.g., a golf swing, wherein the physical motion occurs during a training session.
  • Sensors sense the physical motion and generate information signals related to sensed information.
  • the system also has a storage medium for storing the sensed information carried by the information signals and a marking module for marking a plurality of portions of the sensed information.
  • the marking module marks the portions of sensed information contemporaneously with the training session.
  • the marking module may be user-activated or might provide automatic marking of predetermined content.
  • the sensed information may include video information, audio information and/or positional measurement information. Additionally, some of the sensed information may be synchronized, e.g., the video information and the positional measurement information may be synchronized.
  • the saved lesson file may be uploaded to a server computer system, wherein the lesson file on the server computer system may be accessed by a remote client computer system over a network connection, such as the Internet. The marked content may be displayed on a monitor.
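The marking workflow described above lends itself to a small illustration. The following is a minimal sketch, not the patent's implementation: the names (MarkingModule, Mark, lesson_marks.json) and the fixed windows around each mark are assumptions used only to show how marked portions of a session could be extracted into a lesson file.

```python
# Minimal sketch of contemporaneous marking (names and file format are
# illustrative, not from the patent): marks placed during the session are
# later used to pull only the marked windows into a lesson file.
import json
import time
from dataclasses import dataclass, field
from typing import List


@dataclass
class Mark:
    t: float          # seconds since session start
    label: str = ""   # optional instructor note


@dataclass
class MarkingModule:
    session_start: float = field(default_factory=time.monotonic)
    marks: List[Mark] = field(default_factory=list)

    def mark(self, label: str = "") -> Mark:
        """User-activated mark placed while the training session is running."""
        m = Mark(t=time.monotonic() - self.session_start, label=label)
        self.marks.append(m)
        return m

    def lesson_windows(self, before_s: float = 3.0, after_s: float = 3.0):
        """(start, end) windows around each mark; only these portions of the
        recorded session need to be reviewed and saved to the lesson file."""
        return [(max(0.0, m.t - before_s), m.t + after_s) for m in self.marks]

    def save_lesson(self, path: str) -> None:
        with open(path, "w") as f:
            json.dump({"marks": [vars(m) for m in self.marks],
                       "windows": self.lesson_windows()}, f, indent=2)


# Usage: the instructor marks a notable swing; only marked windows are kept.
mm = MarkingModule()
mm.mark("good shoulder turn")
mm.save_lesson("lesson_marks.json")
```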
  • the invention may be implemented as a computer process, a computing system or as an article of manufacture such as a computer program product or computer readable media.
  • the computer program product may be a computer storage media readable by a computer system and encoding a computer program of instructions for executing a computer process.
  • the computer program product may also be a propagated signal on a carrier readable by a computing system and encoding a computer program of instructions for executing a computer process.
  • FIG. 1 is a functional diagram of an analysis tool in accordance with an embodiment of the present invention and the associated environment.
  • FIG. 2 is a simplified block diagram that illustrates functional components of an analysis tool such as the analysis tool shown in FIG. 1, the analysis tool having sensors and a processing environment.
  • FIG. 3 is a functional diagram of the sensors and the processing environment of the analysis tool shown in FIG. 2 in accordance with an embodiment of the present invention.
  • FIG. 4 is a flow diagram that illustrates operational characteristics for providing analysis information to an analysis module.
  • FIG. 5 is a flow diagram that illustrates operational characteristics shown in FIG. 4 in more detail in accordance with an embodiment of the present invention.
  • FIG. 6 is a flow diagram that illustrates operational characteristics related to control of analysis information using an input device.
  • FIG. 7 is a reproduction of a display screen for presenting analysis information from an analysis tool such as the analysis tool shown in FIG. 1.
  • FIG. 8 is a reproduction of a display screen for presenting synchronized analysis information from the analysis tool shown in FIG. 1.
  • FIG. 9 is a flow diagram that illustrates operational characteristics related to the video frame sample acquisition process shown in FIG. 5.
  • FIG. 10 is a flow diagram that illustrates operational characteristics related to providing physical motion training and instruction via the World Wide Web.
  • An analysis tool 100 used to provide synchronization of various elements is shown in FIG. 1.
  • the analysis tool 100 synchronizes signals from a video analysis system 102 with signals from a position analysis system 104 . Signals from the video analysis system 102 and the position analysis system 104 carry sensed information associated with a physical motion. The resulting synchronized signals may then be used to provide physical motion analysis related to the performed motion.
  • the analysis is presented to provide the person performing the motion with correction and instruction such as instruction to improve the person's golf swing.
  • analysis tool 100 is described below as a system and method for providing golf swing analysis, the analysis tool 100 might be similarly used to provide motion analysis in other sports, such as baseball, tennis, cricket, polo, or any other sports where an athlete's motion is a measure through which an element of the sport is conducted. Moreover, the analysis tool 100 might be similarly used to provide almost any form of physical motion analysis.
  • the video analysis system 102 uses video recording equipment to record a physical motion and to transmit a recorded video information signal 108 to a process environment 114 .
  • the position analysis system 104 captures positional information and transmits a positional information signal 110 to the process environment 114 .
  • Process environment 114 interprets the received video 108 and positional 110 information signals sent to the process environment 114 and synchronizes the signals.
  • the process environment 114 processes the synchronized signals in order to generate analysis, or teaching information, which may be used for golf swing analysis and training.
  • process environment 114 may be any combination of elements capable of receiving signals and synchronizing those signals.
  • process environment 114 may be a personal computer; it is capable of displaying information related to the synchronization of the signals, but such a display is not a necessary component of process environment 114 .
  • the analysis tool 100 also has an impact analysis system 106 , which captures impact information and conducts an impact information signal 112 to the process environment 114 .
  • process environment 114 synchronizes the three information signals 108 , 110 and 112 .
  • analysis tool 100 synchronizes a video information signal 108 provided by the video analysis system 102 with an impact information signal 112 provided by the impact analysis system 106 .
  • the analysis tool 100 synchronizes a positional information signal 110 provided by the position analysis system 104 with an impact information signal 112 provided by the impact analysis system 106 .
  • a pressure information signal (not shown) might be used with the analysis tool 100 . In this embodiment, the pressure information signal might be synchronized with any one of the information signals 108 , 110 or 112 , or a combination of the signals 108 , 110 , 112 .
  • A simplified illustration of the functional components of an analysis tool 199 that incorporates aspects of the analysis tool 100 shown in FIG. 1 is shown in FIG. 2.
  • the analysis tool 199 has at least two sensors 202 and 204 that communicate with a synchronization module 200 .
  • the synchronization module 200 synchronizes physical motion information received from the sensors 202 and 204 and communicates the resulting synchronized information to a processing module 212 .
  • the physical motion information might be associated with any form of physical motion subject to correction and instruction.
  • the physical motion information is associated with an element of an athletic sport, such as, but not limited to a swing, a stroke, a run, a throw, a catch, or any other motion associated with an element through which a sport might be conducted.
  • the physical motion information might be associated with motions related to physical or occupational therapy.
  • the physical motion information might be associated with a golf swing.
  • the processing module 212 receives synchronized information from the synchronization module 200 and, in turn, processes the synchronized information in order to provide analysis information to an end user. The analysis information is then used to provide physical motion correction and instruction.
  • analysis information is in a form suitable for review by the user. Therefore, analysis information may be a video replay of a golf swing, a visual representation of positional data that has been gathered, a visual representation of impact information that has been gathered, etc.
  • analysis information is provided to the user through a graphical user interface operating on a computer display.
  • the display may be located on-site, e.g., where the golf swing is performed or located remotely.
  • the remote display relates to replaying the recorded lesson on a television or computer at another location.
  • the recorded lesson may be recorded onto a videocassette, compact disc, floppy disc or other readable memory. Additionally, the recorded lesson may be stored on a web server and the user may access the lesson via the World Wide Web.
  • a first sensor 202 senses information and then transmits a first information signal 208 relative to the sensed information to the synchronization module 200 .
  • a second sensor 204 senses a different type of information than the first sensor 202 and transmits a second information signal 210 to the synchronization module 200 .
  • the information signals 208 and 210 may be either analog or digital and may be partitioned into time samples, or segments.
  • the analysis tool 199 might use more than two sensors in obtaining more than two forms of sensed information.
  • the information signals 208 and 210 are delivered to the synchronization module 200 substantially contemporaneously. Contemporaneous conduction of these signals may be achieved by real-time conduction of the signals as they are sensed by the sensors 202 and 204 .
  • the sensed information might be positional information related to a golfer's swing, video information relative to a golfer's swing, impact information relative to impact of the club head with a golf ball resulting from a golfer's swing, or pressure information relative to weight transfer associated with a golf swing.
  • information signals 208 and 210 might be a positional information signal 110 , a video information signal 108 , an impact information signal 112 , or a pressure information signal (not shown).
  • the sensed information might be any form of information related to a stroke, swing, movement, or motion of an athlete performing acts while engaged in any sport. Additionally, the sensed information might be any form of information related to physical motion.
  • a second sensor 204 transmits an independent information signal 210 relative to a type of sensed information other than the information sensed by the first sensor 202 .
  • the information signals 208 and 210 are synchronized. Synchronization of the information signals 208 and 210 may be accomplished in several ways to ensure that portions, or samples, of one signal (such as 208 ) relate to portions, or samples, of the other signal (such as 210 ) based on associated time information.
  • the information signals 208 and 210 might be time-stamped using an internal clock mechanism. Accordingly, each sample from the first information signal 208 corresponds to a sample from the second information signal 210 .
  • time stamps are administered on each information signal 208 and 210 on preset intervals such that corresponding samples of the signals 208 and 210 are identified by the same time stamp.
  • time stamps are administered on each information signal 208 and 210 independently and the association of the samples is accomplished through a comparative analysis performed by the synchronization module 200 .
  • Time stamping the information signals 208 and 210 creates synchronized information that is transmitted to the processing module 212 to provide synchronized analysis associated with the information acquired by the sensors 202 and 204 .
  • the information signals 208 and 210 may be synched using associated times derived from time stamps without a corresponding time stamp in the other signal.
  • the first information signal 208 may contain five samples to every one sample of the second information signal 210 .
  • the samples might be associated such that one sample of the first information signal 208 relates to five samples of the second information signal 210 .
  • interpolation might be used to supply missing data points to the signal of sensed information lagging in time samples. Interpolation can be administered through a conventional polynomial equation such that one sample of data related to the first information signal 208 exists for every sample of data related to the second information signal 210 .
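As an illustration of this sample-association step, here is a minimal sketch (assuming NumPy) that resamples the slower signal at the faster signal's time stamps; linear interpolation stands in for the "conventional polynomial equation" mentioned above, and all function and variable names are hypothetical.

```python
# Hypothetical helper, not the patent's implementation: resample the slower
# signal at the faster signal's time stamps so one sample of each signal
# exists per time point. Linear interpolation stands in for the patent's
# "conventional polynomial equation".
import numpy as np


def synchronize(t_fast, v_fast, t_slow, v_slow):
    """Return (t_fast, v_fast, v_slow_resampled) on the fast signal's timebase."""
    t_fast = np.asarray(t_fast, dtype=float)
    v_slow_resampled = np.interp(t_fast,
                                 np.asarray(t_slow, dtype=float),
                                 np.asarray(v_slow, dtype=float))
    return t_fast, np.asarray(v_fast, dtype=float), v_slow_resampled


# Example: video frames every ~16 ms, positional samples every ~33 ms.
t_video = np.arange(0.0, 0.5, 0.016)
t_pos = np.arange(0.0, 0.5, 0.033)
shoulder_turn = np.linspace(0.0, 90.0, t_pos.size)      # degrees, synthetic
_, _, turn_per_frame = synchronize(t_video, np.zeros_like(t_video),
                                   t_pos, shoulder_turn)
```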
  • An analysis tool 300 , in accordance with an embodiment of the present invention, is shown in FIG. 3.
  • the analysis tool 300 incorporates information signals from a position analysis system 350 and a video analysis system 352 .
  • the analysis tool 300 also incorporates an information signal from an impact analysis system 354 .
  • the information signals 362 , 360 , and 364 are respectively provided by the systems 352 , 350 , and 354 to the process environment 114 (FIG. 1) for synchronization.
  • the position analysis system 350 includes a motion capture system 320 , a motion data acquisition module 308 , and a communication connection to the process environment 114 (FIG. 1).
  • the video analysis system 352 includes a video capture system 322 , a capture board 306 , a video frame acquisition module 310 , and a communication connection to the process environment 114 (FIG. 1).
  • the impact analysis system 354 includes an impact event sensor 324 , an impact data acquisition module 312 , and a communication connection to the process environment 114 (FIG. 1).
  • the motion capture system 320 might be administered by at least one magneto-sensitive sensor contained in a magnetic field.
  • the motion capture system might be a Polhemus Iso-Track II™ magnetic sensor system. Multiple magneto-sensitive sensors are placed on the golfer's body at positions corresponding to particular swing elements of a golfer's swing. The magneto-sensitive sensors are used to transmit positional measurement information as the swing moves through the magnetic field. In a specific embodiment, thirty positional measurement data samples/second are captured in binary mode using two magneto-sensitive sensors.
  • the motion capture system 320 might be administered by at least one color or retro-reflective marker contained in an image field responsive to colors or reflectiveness of the marker.
  • the motion capture system 320 conducts positional measurement information via a positional information signal 360 to the motion data acquisition module 308 for data compilation and documentation. Additionally, the motion data acquisition module 308 might convert the signal 360 to a format recognizable by the processing module 314 if necessary.
  • a synchronization module 301 receives the positional information signal 360 and synchronizes the signal 360 to a video information signal 362 .
  • the positional information signal 360 carries information identifying sensed positional measurements of swing, or motion, elements relative to a three-dimensional coordinate system.
  • the three-dimensional coordinate system about which positional elements are measured might be an absolute coordinate system.
  • positional measurements for each element of a physical motion are taken with reference to a single, fixed origin that is independent from the person performing the physical motion.
  • an absolute origin is an origin used for all element measurements of the physical motion for each different user.
  • an axis system may be defined, as shown in FIG.
  • ⁇ s might represent a rotational angle of the shoulders around the x-axis
  • ⁇ s might represent a rotational angle of the shoulders around the y-axis
  • ⁇ h might represent a rotational angle of the shoulders around the z-axis
  • ⁇ h might represent a rotational angle of the hips around the x-axis
  • ⁇ h might represent a rotational angle of the hips around the y-axis
  • ⁇ h might represent a rotational angle of the hips around the z-axis
  • one of these rotational angles relates to shoulder and hip rotation, and another relates to shoulder and hip tilt. Measured with reference to the absolute coordinate system, positional elements related to bend, rotation, and tilt of both the shoulders and the hips are measured around the absolute origin.
  • the coordinate system might be a referenced coordinate system.
  • positional elements are measured with reference to coordinate origins that are unique to the user.
  • measurements related to rotation, bend, and tilt of the shoulder might be referenced to a coordinate system having an origin located on a golfer's hip.
  • the measurements described above while discussing the absolute coordinate system are used to determine rotational positions about the reference coordinate system.
  • the rotational position of the shoulders around the x-axis ( ⁇ sp ), the rotational position of the shoulders around the y-axis ( ⁇ sp ), and the rotational position of the shoulders around the z-axis ( ⁇ sp ) are defined as follows:
  • ⁇ sp ⁇ s + ⁇ h cos( ⁇ s + ⁇ h )+ ⁇ h sin(
  • ⁇ sp ⁇ sp + ⁇ h sin( ⁇ s + ⁇ h )+ ⁇ h cos(
  • positional measurements associated with the shoulders are used in this illustration
  • positional measurements of other elements such as hip rotation, wrist rotation, head rotation, or any other element associated with a golf swing or other physical motion
  • hip rotation might be measured around a coordinate system referenced to an origin located around a golfer's knees.
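A small sketch of the referenced-coordinate computation, written directly from the relations reconstructed above; the θ sp expression is an assumption (the text does not show it), and all names are placeholders rather than the patent's implementation.

```python
# Illustrative only: computes shoulder angles referenced to the hips from the
# reconstructed relations above. The theta_sp expression is an assumption (the
# text above does not show it), and all names are placeholders.
import math


def referenced_shoulder_position(alpha_s, beta_s, theta_s,
                                 alpha_h, beta_h, theta_h):
    """Absolute shoulder/hip angles (radians) -> shoulder angles referenced
    to a hip-centered coordinate system, per the relations given above."""
    arg = theta_s + theta_h
    alpha_sp = alpha_s + alpha_h * math.cos(arg) + beta_h * math.sin(arg)
    beta_sp = beta_s + beta_h * math.sin(arg) + beta_h * math.cos(arg)
    theta_sp = theta_s - theta_h   # assumed relative rotation, not stated above
    return alpha_sp, beta_sp, theta_sp
```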
  • the video capture system 322 includes at least one video recording device transmitting a video feed signal carrying video frame samples defining image information.
  • the video capture system 322 uses two analog, interlaced video cameras with s-video outputs operating at 60 frames/second. The video cameras are positioned such that the front and the side views of a golfer are captured. Moreover, the frame size of the cameras is 400 by 480 pixels, thereby filling two video windows on a graphical user interface on a display.
  • each video feed signal is transmitted to a capture board 306 .
  • Each capture board 306 converts the image information carried in the video feed signal to video frame data that the video acquisition module 310 can recognize.
  • each capture board might be a video framegrabber card configured for the s-video mode.
  • Video frame samples are carried in a video information signal 362 from the capture board 306 to the video acquisition module 310 for data documentation and compilation.
  • the video data acquisition module 310 might convert the signal 362 to a format recognizable by the processing module 314 .
  • the synchronization module 301 receives the video information signal 362 and synchronizes the video information signal 362 to the positional information signal 360 .
  • an impact analysis system 354 is incorporated into analysis tool 300 with the video analysis system 352 or the position analysis system 350 , or both.
  • the impact analysis system 354 senses impact information related to the impact of a golf club head and a golf ball (FIG. 1).
  • Impact measurement information is associated with clubface angles and measurements as the club approaches the golf ball, strikes the golf ball and follows through. Impact measurement information allows for calculations related to the velocity, distance, and direction of the golf ball upon impact. Such information is important in understanding the mechanics of a golf club swing.
  • impact event information is associated with the exact time of impact between the ball and the club and therefore, indicates the occurrence of an event.
  • Impact information is transmitted through the impact information signal 364 once impact occurs.
  • An impact analysis sensor 324 detects impact information and may be a radar sensor, a high-speed video recording device, a pressure sensor, a laser grid sensor, or any equivalent sensor for sensing impact-related information.
  • the impact analysis sensor 324 is a laser grid sensing the impact between a club head and the golf ball.
  • other types of sensors may be used to collect impact measurement information. Alternatively, impact measurement information might not be collected at all.
  • the impact analysis system 354 might be administered through a laser grid sensor, such as a Focaltron Achiever™ laser grid device contained in a custom mounting feature about the impact zone.
  • the laser grid sensor might serve as the only impact analysis sensor 324 collecting both impact measurement information and impact event information.
  • the laser grid sensor is used solely to collect impact measurement information.
  • the sensor positions a laser grid surrounding the point of estimated impact between a golf ball and a golf club. Once the clubface enters the laser grid, the sensor detects various clubface measurements as the clubface extends through the grid.
  • the impact information signal 364 carries information relative to the sensed clubface through the laser grid along with the impact event information.
  • a trigger event system 332 is used.
  • the trigger event system 332 is operably connected to processing module 314 so that a triggering event signal 334 can be communicated to the processing module 314 .
  • the triggering event signal 334 relates the occurrence of a trigger event, which provides the reference point in time, i.e., the trigger event time, that allows the processing module 314 to define a timing window for analysis.
  • the timing window may be defined by a start time equal to the trigger event time minus a predetermined period, e.g., 3 seconds, and an end time equal to the trigger event time plus a predetermined period, e.g., 3 seconds.
  • the data collected within the timing window is marked and stored for analysis and/or playback. If the collected data from the video 352 and position 350 analysis systems falls outside the timing window, then it is discarded from the buffers 302 , 304 .
  • the trigger event may be caused by manual selection of an input request, predetermined positional coordinates on a golfer's swing, or any other triggering operation associated with a golfer's swing. Additionally, the trigger event may be caused by impact between the golf ball and the golf club head. In accordance with one embodiment, the trigger event is sensed by a microphone, or other acoustical measurement sensor, sensing impact between a club head and a golf ball.
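As an illustration of the timing window described above, here is a minimal sketch that keeps only samples time-stamped within a window around the trigger event time; the function names are hypothetical, and the 3-second periods follow the example given above.

```python
# Sketch of the timing window around a trigger event: samples time-stamped
# inside [trigger - pre, trigger + post] are kept, everything else discarded.
# Function names and the 3-second periods follow the example above.
from typing import Iterable, List, Tuple


def timing_window(trigger_t: float, pre_s: float = 3.0,
                  post_s: float = 3.0) -> Tuple[float, float]:
    return trigger_t - pre_s, trigger_t + post_s


def keep_in_window(samples: Iterable[Tuple[float, object]],
                   window: Tuple[float, float]) -> List[Tuple[float, object]]:
    start, end = window
    return [(t, s) for t, s in samples if start <= t <= end]


# Example: impact (trigger event) sensed at t = 12.4 s.
window = timing_window(12.4)                                        # (9.4, 15.4)
kept = keep_in_window([(12.0, "frame"), (20.0, "frame")], window)   # keeps t=12.0
```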
  • the position analysis system 350 , the video analysis system 352 , and the impact analysis system 354 transmit information signals 360 , 362 , and 364 to a synchronization module 301 .
  • as information signals 360 and 362 enter the synchronization module 301 from the position analysis system 350 and the video analysis system 352 , respectively, samples on the information signals 360 and 362 are time-stamped as described in conjunction with FIG. 2.
  • the samples identified with the time stamp are stored in sample buffers 302 and 304 .
  • the samples stamped from the positional information signal 360 are stored in a metric sample buffer memory 302 and the samples stamped from the video information signal 362 are stored in a video sample buffer memory 304 .
  • the sample buffers 302 and 304 only hold the stamped data points for a limited amount of time.
  • the sample buffers 302 and 304 are preferably designed as first-in, first-out (FIFO) buffers. Accordingly, once the buffer memories 302 and 304 are full, earlier samples are erased as new samples are received. Buffer memories 302 and 304 continue storing information until the time period defined by the timing window has expired. Once expired, the information is marked and stored to disk or another portion of memory to be used during analysis.
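A minimal sketch of such a FIFO sample buffer follows, using a bounded deque as a stand-in for the patent's buffer memories 302 and 304; the capacity, file format, and names are assumptions.

```python
# Sketch of a FIFO sample buffer standing in for buffer memories 302 and 304:
# a bounded deque of (timestamp, sample) pairs drops the oldest entries as new
# ones arrive, and is flushed to disk once the timing window has expired.
# Capacity, file format, and names are assumptions.
from collections import deque
import pickle


class SampleBuffer:
    def __init__(self, maxlen: int = 2048):
        self._buf = deque(maxlen=maxlen)    # oldest samples erased automatically

    def append(self, timestamp: float, sample) -> None:
        self._buf.append((timestamp, sample))

    def flush(self, path: str, window) -> None:
        """Mark and store the samples inside the timing window, then clear."""
        start, end = window
        kept = [(t, s) for t, s in self._buf if start <= t <= end]
        with open(path, "wb") as f:
            pickle.dump(kept, f)
        self._buf.clear()


metric_buffer = SampleBuffer()   # positional measurement samples
video_buffer = SampleBuffer()    # video frame samples
```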
  • the impact analysis system 354 is not enabled until the impact analysis sensor 324 senses the impact event.
  • impact measurement information is not carried in the impact information signal 364 .
  • impact event information is time stamped, transmitted to the process environment 114 in the impact information signal 364 , and stored in impact sample buffer 326 .
  • the impact analysis system 354 is enabled as the iron of the golf club approaches and extends through the golf ball. Once the impact analysis system 354 is enabled, impact measurement information is collected, time-stamped, and carried by the impact information signal 364 to the process environment 114 . The impact measurement information is stored in the impact sample buffer 326 .
  • impact sample buffer 326 might not be used to store information when the trigger event system 332 is used without the impact analysis system 354 .
  • the video analysis system 352 and the position analysis system 350 continue collecting information until the timing window expires. Continuation of the information collection by the position 350 and video 352 analysis systems ensures that both systems 350 and 352 collect information related to the follow-through swing of the golfer. As long as the position analysis system 350 and the video analysis system 352 continue collecting information, the synchronization module 301 continues time stamping samples on the information signals 360 and 362 . In an alternative embodiment, the video analysis system 352 and the position analysis system 350 terminate information collection once a trigger event is sensed.
  • All data samples are time stamped using the same timebase.
  • the timebase might be a Win32™ high-precision timer.
  • the position analysis system 350 grabs a sample about every 33 ms and the video analysis system 352 grabs a sample about every 16 ms. Therefore, identical positional measurements are stored in the metric sample buffer 302 for more than one image record being stored in the video sample buffer 304 .
  • the timer information indicates the relative location in time at which the samples were gathered. Headers of the video sample buffer 304 contain information corresponding to positional measurement samples stored in the metric sample buffer 302 .
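To illustrate how frame headers can reference positional records on the shared timebase, here is a hedged sketch that pairs each roughly 16 ms video frame time with the most recent roughly 33 ms positional sample; the helper name and the binary-search approach are assumptions, not the patent's method.

```python
# Sketch (hypothetical helper) of pairing each ~16 ms video frame with the
# most recent ~33 ms positional sample on the shared timebase; each frame
# header can then store the index of its corresponding positional record.
import bisect


def frame_to_metric_index(frame_times, metric_times):
    """For each frame time, return the index of the latest positional sample
    at or before it; repeated indices are expected because positional samples
    arrive less often than frames."""
    indices = []
    for t in frame_times:
        i = bisect.bisect_right(metric_times, t) - 1
        indices.append(max(i, 0))
    return indices


# Example: 60 frames/s video paired with 30 samples/s positional data.
print(frame_to_metric_index([0.0, 0.016, 0.033, 0.049], [0.0, 0.033, 0.066]))
# -> [0, 0, 1, 1]
```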
  • Upon completion of the trigger countdown, the video capture system 352 is stopped. Once the collection and compilation of data by the position analysis system 104 and the video analysis system 102 is completed, e.g., once the timing window has completed, the stored positional and video frame data samples are transmitted from the synchronization module 301 into the processing module 314 .
  • the processing module 314 transforms the data stored in the buffers 302 , 304 , and 326 into analysis information.
  • the processing module 314 is a data processing system processing the data stored in the buffers 302 , 304 , and 326 into information of a form suitable for a user.
  • the processing module 314 might be a part of a desktop computer system having typical input/output capabilities, a processing unit, and memory functionality.
  • the processing module 314 receives information resulting from a timing window and automatically stores that information in such a manner that if the system crashes during the processing stage, then the information may be recovered. In such a case, all information received by the processing module 314 is stored to a temporary file. This temporary file may then be erased once the lesson is explicitly stored into a more permanent file. Additionally, this temporary file is typically only used to restore information due to a crash, but may be accessed for other reasons.
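A minimal sketch of this temporary-file safeguard is shown below; the staging format (one JSON record per line) and the file names are hypothetical, chosen only to illustrate staging, explicit save, and crash recovery.

```python
# Sketch of the temporary-file safeguard; the staging format (one JSON record
# per line) and the file names are hypothetical.
import json
import os

TEMP_PATH = "lesson_inprogress.tmp"
LESSON_PATH = "lesson.json"


def stage_record(record: dict) -> None:
    """Append each received record to the temporary file as it arrives, so the
    data survives a crash during the processing stage."""
    with open(TEMP_PATH, "a") as f:
        f.write(json.dumps(record) + "\n")


def save_lesson() -> None:
    """Promote staged records to the permanent lesson file, then erase the
    temporary file."""
    with open(TEMP_PATH) as f:
        records = [json.loads(line) for line in f]
    with open(LESSON_PATH, "w") as f:
        json.dump(records, f)
    os.remove(TEMP_PATH)


def recover_after_crash() -> list:
    """Restore whatever was staged before the crash."""
    if not os.path.exists(TEMP_PATH):
        return []
    with open(TEMP_PATH) as f:
        return [json.loads(line) for line in f]
```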
  • the processing module 314 discards redundant records of positional measurement samples.
  • the processing module 314 also may implement a spline fit algorithm to each of the positional measurement samples. Using the spline parameters based on the smooth motion being measured, the metric value at each frame time may be computed. This calculated data is written into a positional measurement file which is ultimately saved as part of an archived lesson.
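A minimal sketch of this spline step follows, assuming SciPy's CubicSpline as a stand-in for the unspecified spline fit algorithm; the duplicate-removal rule and all names are illustrative.

```python
# Sketch of the spline step, assuming SciPy's CubicSpline as a stand-in for
# the unspecified spline fit algorithm; the duplicate-removal rule and all
# names are illustrative.
import numpy as np
from scipy.interpolate import CubicSpline


def metric_at_frame_times(sample_times, sample_values, frame_times):
    """Fit a spline to one metric's positional samples and evaluate it at
    every video frame time (the values written to the positional file)."""
    t = np.asarray(sample_times, dtype=float)
    v = np.asarray(sample_values, dtype=float)
    keep = np.concatenate(([True], np.diff(t) > 0))   # drop redundant records
    spline = CubicSpline(t[keep], v[keep])
    return spline(np.asarray(frame_times, dtype=float))


# Example: 30 Hz shoulder-turn samples evaluated at 60 Hz frame times.
vals = metric_at_frame_times([0.0, 0.033, 0.066], [0.0, 5.0, 12.0],
                             [0.0, 0.016, 0.033, 0.049, 0.066])
```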
  • the analysis information is thereafter transmitted to the analysis module 315 through at least one analysis information signal 330 .
  • Analysis information is information derived from video, motion, or impact analysis and presented in a form which can be interpreted by a user.
  • the analysis information is presented to the analysis module 315 in real time so that a user may monitor a golf swing and various measurements associated with the golf swing as the swing is conducted.
  • the analysis module 315 might present positional analysis information synchronized with video analysis information while the user is monitoring both forms of analysis information at the same time he/she is conducting the swing.
  • the positional analysis information being presented as measurements dynamically varying as the swing is conducted.
  • the video analysis information presents an image of the swing at an address position
  • the positional measurement associated with a particular swing element is also defined at the address position.
  • recorded analysis information is presented to the analysis module 315 so that a user may review a golf swing and various measurements associated with the golf swing at a later time.
  • the recorded analysis information contains information from at least two analysis systems, such as the video 352 and position 350 analysis systems, that are synchronized to the common timebase.
  • weight transfer sensor information may be synchronized with video and/or the position information.
  • the analysis information that is displayed provides the synchronized information from the weight transfer information along with the video and/or position information.
  • grip pressure information may also be sensed by one of the sensor systems and synchronized along with the video, position, and/or weight transfer information and displayed accordingly.
  • Input device 318 is operably connected to the processing module 314 and may be used to control the selection, operation, and appearance of analysis information in accordance with an embodiment of the present invention. For instance, the input device 318 may control the selection of which signals are currently presented to the analysis module 315 . If the golfer only wants video and position analysis displayed on the analysis module 315 , such a request is preferably made through the input device 318 . Likewise, the input device 318 might allow the golfer or instructor to control a video playback of the golf swing. In accordance with another embodiment of the present invention, the input device 318 might be responsible for complete control of user selection, activation, operation, and termination of the analysis tool 300 . If the input device 318 is responsible for complete control of the analysis tool 300 , then the input device 318 might also be used as the trigger event system 332 .
  • the analysis module 315 might be a monitor.
  • the analysis module 315 contains a video adapter that has an s-video output to duplicate a monitor display on a conventional television.
  • the analysis module 315 might be a web server or a kiosk, thereby allowing a user to access the analysis information from a remote station.
  • one embodiment of the invention is presentation of the analysis information through an Internet connection such that a golfer may participate in a remote lesson.
  • the analysis module 315 might communicate to the remote station through an Ethernet, a wireless, or a TCP/IP protocol connection.
  • FIG. 10, described below, represents operations performed to provide physical motion training and instruction via the World Wide Web.
  • the analysis module 315 might be a hard disk, a floppy disc, a tape disk, a CD, or any other recordable medium allowing the golfer or instructor to download analysis information for later use.
  • FIG. 7 illustrates a screen shot 700 of the user interface of the analysis module 315 presenting analysis information to a user.
  • the screen shot 700 presents video analysis information 702 , or video clips, taken from the video analysis system 102 .
  • the screen shot 700 depicts a split screen 704 to show synchronized video frame data from two separate video capture systems 322 .
  • Screen division 706 presents a first video clip or video frame data from a first video recording device, such as video device 322 described above, and screen division 708 presents a second video clip or video frame data from a second video recording device.
  • the video recording devices simultaneously record video information of a swing from different angles. In other embodiments, more than two video capture systems 322 might be used to capture video frame information.
  • the screen is divided into two display regions or areas, 706 and 708 , wherein each region presents video analysis information 702 , i.e., video clips, derived from video frame data associated with one golfer.
  • screen divisions 706 and 708 might present video analysis information 702 derived from video frame data associated with two separate golfers.
  • screen division 706 might display a student golfer receiving golf swing training while screen division 708 presents a professional golfer performing a swing.
  • positional elements of the professional's swing are synchronized to the student's swing by using an impact or trigger event common to both swings. Such synchronization is realized through the synchronization module 301 in the fashion described in conjunction with FIG. 3.
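One way to realize such cross-swing synchronization is to align the two clips on their common impact or trigger event; the sketch below assumes hypothetical timestamps and names and is not the patent's method.

```python
# Sketch of aligning two different swings on a common impact/trigger event:
# the professional's timeline is shifted so both impacts coincide, after which
# a single scrollbar position can index both clips. Names and timestamps are
# illustrative.
def align_by_event(pro_times, student_impact_t, pro_impact_t):
    """Shift the professional's timestamps onto the student's timeline so the
    two impact events land at the same instant."""
    offset = student_impact_t - pro_impact_t
    return [t + offset for t in pro_times]


# Example: student impact at 4.10 s, professional impact at 2.85 s.
pro_on_student_timeline = align_by_event(
    pro_times=[1.8, 2.85, 3.7], student_impact_t=4.10, pro_impact_t=2.85)
# -> [3.05, 4.10, 4.95]
```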
  • Screenshot 700 further includes selection elements. Selection elements are selectable by the input device 318 and allow the presentation of different types of analysis information. Selection of motion capture selection elements 710 and 712 displays positional measurement analysis information (not shown), which has been collected by the position analysis system 104 and synchronized to the video analysis information 702 , on the screenshot 700 . If an impact measurement system is used with the impact analysis system 106 , selection of impact capture selection element 714 displays impact measurement analysis information, which has been collected by the impact analysis system 106 and synchronized to the video analysis information 702 , on the screenshot 700 . Scrollbar selection elements 716 and 718 allow the user of the analysis module 315 to select any swing element of the swing for display as the video analysis information 702 .
  • address 720 and 728 , top 722 and 730 , impact 724 and 732 , and finish 726 and 734 selection elements allow the user of the analysis module 315 to select exact swing elements of the swing for display as the video analysis information 702 .
  • selection of the address selection element 720 or 728 adjusts the video frame data presented in the video analysis information to an address swing element of the golfer's swing.
  • selection of the top 722 or 730 , impact 724 or 732 , and finish 726 or 734 selection elements adjusts the video frame data presented in the video analysis information to the top, impact, or finish elements, respectively.
  • selection elements 720 , 722 , 724 , 726 , 728 , 730 , 732 , and 734 allow a user of the analysis module 315 to identify various key positions in the measured motion, e.g., the address, top, impact and finish positions of the golfer's swing motion, and to quickly display these positions when selected.
  • Various other selection elements are presented on the screen shot 700 allowing a user to select various other functionalities associated with physical motion correction and instruction.
  • the separate video frames 706 and 708 may either be controlled separately or as one, allowing the use of only one set of controls to display synchronized information contemporaneously.
  • the synchronized video information relates to at least two video clips of information that were taken simultaneously, e.g., of the same swing.
  • the video frames are part of a graphical user interface that detects whether the video frame information being displayed in the two frames 706 and 708 is synchronized in time, i.e., time-synchronized.
  • information from two video cameras of the same swing, but taken from different angles (as shown in FIG. 7) are synchronized in time and, in such a case, the graphical user interface automatically detects this situation.
  • video information of a student's swing that is to be shown in one frame, such as frame 706 is not synchronized in time with video information of another golfer, such as a golf pro, that may also be shown in the other frame, such as frame 708 . Since the two sets of video information represent different swings, the two sets are not synchronized in time. Detecting whether the two signals are synchronized in time may be performed in a number of ways, such as by setting a flag, assigning an identification value, comparing associated time information or comparing format information, among others.
  • upon detecting that the two sets of video information are synchronized in time, the graphical user interface automatically links many of the selection elements together such that either group of control elements controls both video clips, i.e., both sets of video information.
  • controls that play, fast forward, reverse, and stop the video replay for one frame, e.g., controls 716 , 720 , 722 , 724 and 726 that normally control frame 706 , would also simultaneously control frame 708 when the video signals are synchronized in time.
  • controls 718 , 728 , 730 , 734 and 736 would simultaneously control frame 706 when the signals are synchronized, instead of merely controlling frame 708 .
  • upon detecting that the two video signals are not synchronized in time, e.g., that they represent different motions, the graphical user interface maintains the two sets of controls for each frame 706 and 708 as separate.
  • the graphical user interface may provide a selectable toggle button or element displayed on the screenshot 700 that could toggle the control of the two frames 706 and 708 from being controlled as one, or as two separate frames.
  • the selection of either address selection element 720 or 728 would automatically display the address video information in both frames 706 and 708 .
  • the selection of one of the selection elements 720 or 728 would only cause the display of the address video information in one of the two frames 706 or 708 , respectively.
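A minimal sketch of this linked-control behavior follows. Detection here compares an assumed session identifier, which is only one of the detection options the text lists (flags, identification values, time or format comparison), and the frame objects are placeholders for the real video widgets.

```python
# Sketch of linked playback controls: when the two frames show time-synchronized
# clips, one control event drives both; otherwise each frame keeps its own
# controls. Detection compares an assumed session identifier; the frame objects
# are placeholders for the real video widgets.
class DualFrameControls:
    def __init__(self, frame_a: dict, frame_b: dict):
        self.frames = [frame_a, frame_b]
        self.linked = self._detect_synchronized()

    def _detect_synchronized(self) -> bool:
        return self.frames[0].get("session_id") == self.frames[1].get("session_id")

    def toggle_link(self) -> None:
        """Selectable toggle: control the two frames as one, or separately."""
        self.linked = not self.linked

    def seek(self, target_frame: int, which: int = 0) -> None:
        targets = self.frames if self.linked else [self.frames[which]]
        for f in targets:
            f["position"] = target_frame    # stand-in for driving the widget


controls = DualFrameControls({"session_id": "s1", "position": 0},
                             {"session_id": "s1", "position": 0})
controls.seek(42)   # both frames jump, e.g., to the address position
```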
  • FIG. 8 illustrates a screen shot 800 of the user interface of the analysis module 315 presenting analysis information related to an embodiment described in FIG. 1. Selection elements, split screen divisions, and displayed information of the screen shot 800 are the same as those shown in screen shot 700 and described above in conjunction with FIG. 7. However, the screen shot 800 presents positional measurement analysis information 804 synchronized with video analysis information 802 .
  • the video frame analysis information 802 and the positional measurement analysis information 804 are synchronized such that each frame sample of video data corresponds to a measurement sample of position elements of the golfer's swing.
  • the shoulder turn measurement value 806 presented on the shoulder turn measurement display 808 will vary each time that scrollbar selection element 716 addresses a different video frame sample of the video data.
  • the positional measurement analysis information 804 is displayed through measurement displays 808 , 810 , and 820 , 822 , 824 , 826 , 828 , 830 , and 832 .
  • measurement displays 808 - 810 are associated with screen division 706
  • measurement displays 820 , 822 , 824 , 826 , 828 , 830 , and 832 are associated with screen division 708 . Accordingly, control over which measurement displays 808 , 810 , 820 , 822 , 824 , 826 , 828 , 830 , and 832 are presented is administered through motion capture selection elements 710 and 712 .
  • the video analysis information 702 might be linked to the positional measurement analysis information 804 in such a way that the positional measurement values are identified, highlighted, or displayed as the video playback shows the golfer conducting the swing.
  • the measurement displays 808 , 810 , 820 , 822 , 824 , 826 , 828 , 830 , and 832 might be presented as a particular color signifying an analysis of aspects of a golfer's swing. For instance, if the shoulder turn measurement display 808 is red, then the golfer has turned his shoulder to an angle that is not desirable in an instructed golf swing.
  • the positional elements of the swing may be compared to a table or database of values to determine whether such information relates to positional information that is desirable or not, wherein the database contains average values based on predetermined desirable swing mechanics. Consequently, if the shoulder turn measurement display 808 is green, then the angle of the golfer's shoulder turn, or rotation, is within a desirable range for an instructed golf swing as compared to the referenced database. Additionally, measurement displays 808 , 810 , 820 , 822 , 824 , 826 , 828 , 830 , and 832 might be presented as yellow, or another intermediate color, suggesting that a measurable element of a golf swing is about to shift outside a desirable range.
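A small sketch of this color-coding logic is shown below; the desirable range and margin are entirely illustrative (the patent gives no numeric ranges), and the table stands in for the database of predetermined desirable swing mechanics.

```python
# Sketch of the color-coding logic; the desirable range and margin below are
# placeholders, not values from the patent, and the table stands in for the
# database of predetermined desirable swing mechanics.
DESIRABLE_RANGES = {"shoulder_turn_deg": (80.0, 100.0)}   # illustrative values


def display_color(metric: str, value: float, margin: float = 5.0) -> str:
    low, high = DESIRABLE_RANGES[metric]
    if low <= value <= high:
        if value < low + margin or value > high - margin:
            return "yellow"   # about to shift outside the desirable range
        return "green"        # within the desirable range
    return "red"              # outside the desirable range


print(display_color("shoulder_turn_deg", 96.0))   # -> "yellow"
```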
  • measurement displays 808 , 810 , 820 , 822 , 824 , 826 , 828 , 830 , and 832 are capable of being positioned on the screen such that a user can move the displays 808 , 810 , 820 , 822 , 824 , 826 , 828 , 830 , and 832 to any desired location on the screen. That is, through the use of a user-input device, such as a mouse or other input device, the displays 808 , 810 , 820 , 822 , 824 , 826 , 828 , 830 , and 832 may be interactively positioned in different locations. For example, FIG.
  • screen division 706 might contain other measurement displays, such as a shoulder bend measurement display, a hip tilt measurement display, a hip bend measurement display, a shoulder tilt measurement display, or any other measurement display associated with analysis information derived from positional measurement samples.
  • values shown in measurement displays 808 , 810 , 820 , 822 , 824 , 826 , 828 , 830 , and 832 might be used in real time where the user is monitoring a display as he/she performs the golf swing, or other physical motion.
  • the values presented in the measurement displays 808 , 810 , 820 , 822 , 824 , 826 , 828 , 830 , and 832 dynamically vary as the user engages in the swing, or motion.
  • by presenting the positional measurement information in real time, a user is able to adjust a swing or motion as he/she is conducting it.
  • the measurement displays 808 , 810 , 820 , 822 , 824 , 826 , 828 , 830 , and 832 may also be highlighted in colors to alert the user, in real time, of a desirable range of motion for specific swing elements.
  • analysis information presenting analysis derived from only the video analysis 102 and the impact analysis 106 systems might be synchronized and displayed in similar fashion as described in conjunction with FIG. 7.
  • analysis information presenting analysis derived from only the position analysis 104 and the impact analysis 106 systems might be displayed in similar fashion as described in conjunction with FIG. 7.
  • analysis information presenting analysis from a variety of analysis systems other than a video 102 , position 104 , or impact 106 analysis system might be synchronized and displayed as discussed in FIG. 7 and FIG. 8.
  • the process environment 114 may be implemented as software, hardware, or any combination of hardware and software designed as an analysis tool in accordance with the embodiments and equivalents to the embodiments described for this invention.
  • the process environment 114 might include at least some form of computer-readable media accessible by a computing device capable of receiving at least two separate information signals simultaneously.
  • the process module 314 might be a computing device accessing the computer-readable media.
  • the computer-readable media might be stored on storage media including, but not limited to ROM, RAM, EPROM, flash memory or other memory technology, digital versatile disks (DVD), CD-ROM, or other optical storage, magnetic tape, magnetic cassettes, magnetic disk storage or other magnetic storage devices, or any other medium accessible by the computing device that can be used to store the analysis information and the information carried by the information signals 208 and 210 .
  • FIG. 4 generally illustrates operational characteristics for providing analysis information to an analysis module in order to provide physical motion training and instruction.
  • the processes described in conjunction with FIG. 4 are directed to providing golf swing analysis, the process might be similarly used to provide swing analysis in other sports, such as baseball, tennis, cricket, polo, or any other sport where an athlete's swing of an apparatus is a measure through which an element of the sport is conducted.
  • the process might be similarly used to provide any form of physical motion analysis associated with any form of physical motion subject to correction and instruction.
  • receive operation 402 receives a first information signal representing sensed information relative to a golf club swing.
  • the first signal is of a first type of information, e.g., video, position, weight transfer, pressure or impact information, among others.
  • receive operation 404 receives a second information signal representing sensed information relative to the golf club swing, wherein the second information signal is a different type of signal as compared to the first signal.
  • the first type of signal may be video information and the second type may be positional, weight transfer or impact information.
  • first receive operation 402 and second receive operation 404 simultaneously receive the first and second information signals.
  • the first information signal and the second information signal might be acquired substantially simultaneously.
  • Synchronization operation 406 synchronizes the two signals received in operations 402 and 404 .
  • synchronization operation 406 synchronizes the signals by time stamping samples of data points of each information signal.
  • Synchronization operation 406 time stamps each sample in relative fashion, thereby ensuring that portions of one signal relate to portions of the other signal based on associated time information.
  • synchronization of the signals is done in a way such that the first sampled data point on the first information signal is identified by the same time marking as the first sampled data point on the second information signal. Accordingly, subsequent sampled data points on the first information signal are identified by the same time marking as subsequent sampled data points on the second information signal.
  • the synchronization operation 406 might stamp the information signals with associated times derived from time stamps without a corresponding time stamp in the other signal.
  • synchronization operation 406 might stamp one information signal with five samples to every one stamped sample of the other information signal.
  • the samples might be associated such that one sample of the first information signal relates to five samples of the second information signal.
  • interpolation is used to supply missing data points to the signal of sensed information lagging in time samples. Interpolation is administered through a conventional polynomial equation to ensure that one sample of data of the first information signal exists for every sample of data of the second information signal.
  • the equation used may be predetermined based on the type of motion being measured. Consequently, depending on the type of motion being analyzed, e.g., a golf swing versus a baseball swing, the equation used to synchronize values may be different.
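As a rough illustration of the synchronization and interpolation steps described above, the following sketch linearly interpolates the slower stream of samples at the time stamps of the faster stream so that one sample of each signal exists per time stamp. Linear interpolation here merely stands in for whichever conventional polynomial equation an implementation selects for a given motion; the timing values in the example are assumptions.

```cpp
#include <cstdio>
#include <vector>

struct Sample {
    double t;      // time stamp, seconds on the shared timebase
    double value;  // sensed measurement at that time
};

// Interpolate the slower (lagging) stream at each time stamp of the faster
// stream, so that every fast sample has a matching slow-stream value.
std::vector<Sample> alignToFastStream(const std::vector<Sample>& slow,
                                      const std::vector<Sample>& fast) {
    std::vector<Sample> aligned;
    size_t j = 0;
    for (const Sample& f : fast) {
        while (j + 1 < slow.size() && slow[j + 1].t <= f.t) ++j;
        double v;
        if (j + 1 >= slow.size()) {
            v = slow.back().value;                       // hold the last value
        } else {
            double span = slow[j + 1].t - slow[j].t;
            double w = span > 0 ? (f.t - slow[j].t) / span : 0.0;
            v = slow[j].value + w * (slow[j + 1].value - slow[j].value);
        }
        aligned.push_back({f.t, v});
    }
    return aligned;
}

int main() {
    // e.g., positional samples every ~33 ms vs. video frames every ~16 ms.
    std::vector<Sample> position{{0.000, 10.0}, {0.033, 12.0}, {0.066, 15.0}};
    std::vector<Sample> video{{0.000, 0}, {0.016, 0}, {0.033, 0}, {0.050, 0}, {0.066, 0}};
    for (const Sample& s : alignToFastStream(position, video))
        std::printf("t=%.3f  interpolated position metric=%.2f\n", s.t, s.value);
}
```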
  • Process operation 408 interprets each information signal and generates analysis information.
  • the analysis information, presented on an analysis module, is used for golf swing analysis and training.
  • the sensed information might be positional information related to the motion of a golf club swing.
  • the sensed information might be video information associated with a golf club swing.
  • the sensed information might be impact information relative to impact of the club head with a golf ball resulting from a golf club swing.
  • the first and second information signals might be a positional information signal, a video information signal, or an impact information signal.
  • the sensed information might be any form of information related to a stroke, swing, movement, or motion of a person performing physical acts.
  • FIG. 5 illustrates operational characteristics for providing analysis information to an analysis module in order to provide athletic swing analysis of an athlete's swing.
  • FIG. 5 is a more detailed illustration of the operations described in conjunction with FIG. 4.
  • Start operation 500 is executed each instance that a single motion data sample and a video frame data sample are transmitted from each of a motion analysis and a video analysis system into an acquisition module.
  • receive operations 402 , 404 , 502 , and 504 are administered in acquisition modules.
  • a data sample is defined as a point, slice, or portion of an information signal.
  • Motion receive operation 502 , stamp operation 506 , and storage operation 510 make up a positional measurement acquisition process of a position analysis system and, in accordance with an embodiment, are operations of a Windows32® executable software program written in the C++ programming language.
  • Motion receive operation 502 acquires positional information associated with a golfer's swing. In accordance with an embodiment, the positional information is transmitted from the position analysis system and carried in a positional information signal.
  • Motion receive operation 502 reads positional information records from a serial port connected to a motion capture system. The start of each positional information record is indicated by a byte with the high bit on. Each record consists of the x, y and z cosine measurements from the motion capture system.
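The byte-level framing just described might be handled roughly as sketched below. Only the high-bit start byte and the presence of x, y and z cosine values come from the description; the record length, 16-bit field layout, and scaling are assumptions made solely for illustration.

```cpp
#include <cstdint>
#include <cstdio>
#include <vector>

// One decoded positional record: direction cosines reported by the sensor.
struct PositionRecord {
    double x, y, z;
};

// Scan a raw byte stream for records.  A set high bit (0x80) marks the first
// byte of a record; the six payload bytes that follow are assumed here to be
// three 16-bit values that scale to cosines in [-1, 1].
std::vector<PositionRecord> parseRecords(const std::vector<uint8_t>& bytes) {
    std::vector<PositionRecord> records;
    size_t i = 0;
    while (i + 7 <= bytes.size()) {
        if ((bytes[i] & 0x80) == 0) { ++i; continue; }   // hunt for a start byte
        auto field = [&](size_t k) {
            int16_t raw = static_cast<int16_t>((bytes[i + 1 + 2 * k] << 8) |
                                               bytes[i + 2 + 2 * k]);
            return raw / 32767.0;                        // assumed scaling
        };
        records.push_back({field(0), field(1), field(2)});
        i += 7;                                          // start byte + payload
    }
    return records;
}

int main() {
    // A fabricated two-record stream, used only to exercise the parser.
    std::vector<uint8_t> stream{0x80, 0x7F, 0xFF, 0x00, 0x00, 0x40, 0x00,
                                0x81, 0x00, 0x00, 0x3F, 0xFF, 0x20, 0x00};
    for (const PositionRecord& r : parseRecords(stream))
        std::printf("cosines: x=%.3f y=%.3f z=%.3f\n", r.x, r.y, r.z);
}
```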
  • computation operation 503 computes the required Euler angles for the parameters specified by the user.
  • the computed Euler angles are stored in a shared memory structure to be time stamped by position stamp operation 506 .
  • the shared memory structure contains a snapshot of the last computed record sample.
  • Positional data stamp operation 506 time stamps the motion data sample stored in a shared memory structure as computed Euler angles.
  • the time stamp administered by position stamp operation 506 relates the motion data sample to the associated video frame sample as described in conjunction with FIG. 4.
  • Positional data storage operation 510 stores the time-stamped motion data sample in a buffer as shown in FIG. 3.
  • trigger sensory operation 516 detects whether a trigger event has occurred.
  • Video receive operation 504, video stamp operation 508, and video frame storage operation 512 make up a video frame sample acquisition process of a video analysis system and, in accordance with an embodiment, are operations of a Windows32® executable software program written in the C++ programming language. Separate instances of the video frame sample acquisition process execute for each capture board in the video analysis system.
  • the video analysis system contains two capture boards.
  • the capture board hardware is initialized into a 60 Hz field-mode of 240 lines per field at the specified width and these parameters might be defined by the software manufacturer's double-buffering queued asynchronous technique.
  • video receive operation 504 awaits arrival of a video frame sample associated with the golfer's swing from the video capture system and, upon arrival, acquires the video frame sample.
  • the video frame sample is transmitted from the video analysis system and carried in a video information signal.
  • video frame stamp operation 508 time stamps the video frame sample acquired by the video receive operation 504 so that the video frame sample relates to a positional measurement sample.
  • operation flow passes to video frame storage operation 512 .
  • the video frame sample is stored in a buffer by the video frame storage operation 512 .
  • the buffer is a circular buffer having 120 records.
  • the circular buffer may have any number of records depending upon the length in time of the physical motion analyzed.
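A bare-bones circular buffer of the kind described above might look like the following sketch, using the 120-record capacity mentioned in this description; the record contents are simplified placeholders.

```cpp
#include <array>
#include <cstdio>
#include <utility>
#include <vector>

// One stored video record: a time stamp in its header plus the frame bytes.
struct VideoRecord {
    double timeStamp = 0.0;          // shared-timebase acquisition time
    std::vector<unsigned char> pixels;
};

// Fixed-capacity circular buffer: once full, the oldest record is overwritten,
// giving the first-in, first-out behavior described for the sample buffers.
template <size_t Capacity>
class CircularBuffer {
public:
    void push(VideoRecord record) {
        slots_[head_] = std::move(record);
        head_ = (head_ + 1) % Capacity;
        if (count_ < Capacity) ++count_;
    }
    size_t size() const { return count_; }
    // Oldest surviving record (index 0) through newest (index size()-1).
    const VideoRecord& at(size_t i) const {
        size_t start = (head_ + Capacity - count_) % Capacity;
        return slots_[(start + i) % Capacity];
    }
private:
    std::array<VideoRecord, Capacity> slots_;
    size_t head_ = 0;   // next slot to write
    size_t count_ = 0;
};

int main() {
    CircularBuffer<120> buffer;               // 120 records, per the text
    for (int frame = 0; frame < 150; ++frame) // 150 frames overflow the buffer
        buffer.push({frame / 60.0, {}});      // 60 fields per second
    std::printf("records held: %zu, oldest t=%.3f, newest t=%.3f\n",
                buffer.size(), buffer.at(0).timeStamp,
                buffer.at(buffer.size() - 1).timeStamp);
}
```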
  • All data samples are time stamped using the same timebase.
  • the timebase might be a Win32™ high-precision timer. Whereas the position analysis system grabs a sample about every 33 ms, the video analysis system grabs a sample about every 16 ms. Therefore, identical positional measurements are stored in the metric sample buffer for more than one image record being stored in the video sample buffer memory. Being on the same timebase, the timer information indicates the relative location in time at which the samples were gathered. Headers of the video sample buffer memory may contain information corresponding to positional measurement samples stored in the metric sample buffer.
  • the storage operations 510 and 512 store the value of each time stamp with each sample.
  • the storage operations 510 and 512 might store the samples in linked or adjacent buffers identified by the time stamp value so that the association between the two samples is maintained while the samples are stored in the buffer.
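One way to maintain that association, assuming both buffers hold samples stamped on the shared timebase, is to record for each video frame the index of the positional measurement sample closest to it in time, as in the sketch below; the container layout is an assumption for illustration.

```cpp
#include <cmath>
#include <cstdio>
#include <vector>

// For each video time stamp (~every 16 ms) find the index of the positional
// sample (~every 33 ms) closest in time; since positional samples arrive about
// half as often, the same positional index is reused for neighboring frames.
std::vector<size_t> associate(const std::vector<double>& metricTimes,
                              const std::vector<double>& videoTimes) {
    std::vector<size_t> indices;
    size_t j = 0;
    for (double vt : videoTimes) {
        while (j + 1 < metricTimes.size() &&
               std::fabs(metricTimes[j + 1] - vt) <= std::fabs(metricTimes[j] - vt))
            ++j;
        indices.push_back(j);
    }
    return indices;
}

int main() {
    std::vector<double> metric{0.000, 0.033, 0.066, 0.099};
    std::vector<double> video{0.000, 0.016, 0.033, 0.050, 0.066, 0.083};
    std::vector<size_t> link = associate(metric, video);
    for (size_t i = 0; i < video.size(); ++i)
        std::printf("frame at %.3f s -> metric sample %zu\n", video[i], link[i]);
}
```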
  • Data is stored in the buffer for a predetermined period of time so that, if neither an impact information signal nor a triggering event signal is transmitted within the predetermined time period, data in the buffer is stored on a first-in, first-out basis.
  • Trigger sensory operation 516 detects whether a trigger event has occurred.
  • the trigger event might be the manual selection of an input request (e.g., pressing a key on a keyboard), predetermined positional coordinates on a golfer's swing, or any other triggering operation associated with a golfer's swing. Additionally, the trigger event might be the impact between the golf club head and the golf ball as detected by a microphone. If trigger sensory operation 516 has not detected a trigger event, then operation flow returns to start operation 500 and receive operations 502 and 504 acquire a subsequent data sample. If trigger sensory operation 516 detects a trigger event, then operation flow passes to collection operation 518.
  • Collection operation 518 continues the collection, time stamping, and storage of video and motion data samples administered through receive operations 502 and 504 , stamp operations 506 and 508 , and storage operations 510 and 512 .
  • collection operation 518 might collect impact measurement data samples in the same fashion as receive operations 502 and 504 , stamp operations 506 and 508 , and storage operations 510 and 512 .
  • Impact measurement data samples represent coordinate and relative positions of the clubface of the golf club as the club head of the golf club enters and leaves a predetermined area surrounding the impact location between the club head and the golf ball.
  • continuation operation 514 limits the period of collection, stamping, and storage as defined by the timing window set by the trigger event.
  • Continue operation 514 sets a predetermined time period within which the execution of the positional measurement acquisition and video frame sample acquisition processes will continue, thereby allowing positional measurement and video frame data samples associated with the golfer's follow-through to be collected following detection of an impact or trigger event.
  • the predetermined time period is set to zero, thereby terminating collection once a trigger event occurs.
  • the predetermined time period is set to a finite time period other than zero upon occurrence of a trigger event.
  • the predetermined time period is set by a countdown timer that counts video frame data samples. After 20 video frame data samples have been captured following a trigger event, both the positional measurement and the video frame sample acquisition processes are terminated. This specific configuration results in a 100-frame pre-trigger circular buffer.
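Assuming the 60 Hz field rate and the 120-record circular buffer described elsewhere in this description, the 20 post-trigger frames correspond to roughly 20/60 ≈ 0.33 seconds of follow-through, and the remaining 120 - 20 = 100 records preserve approximately 1.67 seconds of motion leading up to the trigger event.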
  • Process operation 520 interprets the data samples stored for each analysis system.
  • Process operation 520 generates analysis information from the interpreted samples and transmits the analysis information to an analysis module in a format suitable for presentation to the golfer.
  • process operation 520 discards redundant records of positional measurement samples.
  • a spline fit is applied to each of the positional measurement samples. Using the spline parameters based on the smooth motion being measured, the metric value at each frame time is computed. This calculated data is written into a positional measurement file which is ultimately saved as part of an archived lesson.
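The specification does not identify the spline family, so the sketch below uses a Catmull-Rom spline purely as a stand-in: a smooth curve through uniformly spaced positional samples is evaluated at each video frame time. The sample values, spacing, and frame rate in the example are assumptions.

```cpp
#include <cstdio>
#include <vector>

// Evaluate a metric (e.g., shoulder rotation) at an arbitrary time using a
// Catmull-Rom spline through uniformly spaced positional samples.
double splineAt(const std::vector<double>& samples, double t0, double dt, double t) {
    if (samples.size() < 2) return samples.empty() ? 0.0 : samples.front();
    double s = (t - t0) / dt;                       // fractional sample index
    int i = static_cast<int>(s);
    int last = static_cast<int>(samples.size()) - 2;
    if (i < 0) i = 0;
    if (i > last) i = last;
    double u = s - i;
    auto clamp = [&](int k) {
        if (k < 0) k = 0;
        if (k >= static_cast<int>(samples.size())) k = static_cast<int>(samples.size()) - 1;
        return samples[k];
    };
    double p0 = clamp(i - 1), p1 = clamp(i), p2 = clamp(i + 1), p3 = clamp(i + 2);
    return 0.5 * ((2.0 * p1) + (-p0 + p2) * u +
                  (2.0 * p0 - 5.0 * p1 + 4.0 * p2 - p3) * u * u +
                  (-p0 + 3.0 * p1 - 3.0 * p2 + p3) * u * u * u);
}

int main() {
    // Positional metric captured every ~33 ms, evaluated at 60 Hz frame times.
    std::vector<double> shoulderTurn{0.0, 8.0, 22.0, 45.0, 70.0, 88.0};
    for (int frame = 0; frame < 10; ++frame) {
        double t = frame / 60.0;
        std::printf("frame %d (t=%.3f): %.2f deg\n", frame, t,
                    splineAt(shoulderTurn, 0.0, 1.0 / 30.0, t));
    }
}
```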
  • FIG. 9 is a specific embodiment of the operations of the video frame sample acquisition process 900 described in conjunction with FIG. 5.
  • Start operation 902 initiates the video frame sample acquisition process 900 .
  • the video frame sample acquisition process 900 is initiated at the beginning of, or a time prior to, the physical motion to be acquired by the video capture system.
  • hardware initialization operation 904 initializes each capture board into a 60 Hz field-mode of 240 lines per field at the specified width.
  • frame arrival operation 906 awaits arrival of a video frame sample.
  • Frame arrival operation 906 operates in an endless loop to wait for the video frame sample.
  • Upon arrival of the video frame sample, operation flow passes to next frame operation 908.
  • Next frame operation 908 queues a grab for the next video frame sample and operation flow passes to image copy operation 910 .
  • Image copy operation 910 copies the just-received image information of the video frame sample into the current record of the circular buffer.
  • the current record is the record in the circular buffer that is being accessed by the record pointer.
  • Operation flow then passes to time storage operation 912.
  • Time storage operation 912 stores the time associated with acquisition of the video frame sample in the record header of the current record.
  • time storage operation 912 stores the time stamp of video stamp operation 508 , which is described in conjunction with FIG. 5.
  • operation flow passes to trigger detection operation 914 .
  • Trigger detection operation 914 checks to see whether a trigger event has been administered. If a trigger event has been administered, then operation flow passes to countdown check operation 916 .
  • Countdown check operation 916 checks to see if the countdown timer initiated by the trigger event has completed counting. If trigger detection operation 914 has not detected a trigger event, then operation flow passes to record advance operation 918 . Record advance operation 918 advances the record pointer to the next record. Likewise, if countdown check operation 916 determines that the countdown has not been exhausted, operation flow passes to record advance operation 918 . Once the record pointer has been advanced to the next record, operation flow passes to frame arrival operation 906 and continues as earlier described. If countdown check operation 916 determines that the countdown is completed, the operation flow passes to video freeze operation 920 . Video freeze operation 920 sets a video freeze flag signaling termination of video frame acquisition.
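Read as pseudocode for the loop just described, the following sketch strings the FIG. 9 operations together in order: copy the frame into the current record, store its time in the record header, test for a trigger, run the countdown, and either advance the record pointer or set the video freeze flag. The frame source, the simulated trigger, and the specific variable names are stand-ins rather than calls into any real capture-board API.

```cpp
#include <cstdio>
#include <vector>

struct FrameRecord {
    double timeStamp = 0.0;              // record header: acquisition time
    std::vector<unsigned char> image;    // copied frame data
};

int main() {
    const int kRecords = 120;            // circular buffer size from the text
    const int kPostTriggerFrames = 20;   // countdown length from the text
    std::vector<FrameRecord> ring(kRecords);
    int recordPointer = 0;
    int countdown = -1;                  // -1: no trigger seen yet
    bool videoFreeze = false;

    for (int frame = 0; !videoFreeze; ++frame) {
        // Frame arrival and the next-frame grab are simulated; a real system
        // would block on the capture board's queued asynchronous grab here.
        double now = frame / 60.0;                   // 60 Hz field mode
        ring[recordPointer].image.assign(240, 0);    // copy image into record
        ring[recordPointer].timeStamp = now;         // store time in header

        bool triggerEvent = (frame == 130);          // simulated impact/trigger
        if (triggerEvent && countdown < 0) {
            countdown = kPostTriggerFrames;          // arm the countdown timer
        } else if (countdown > 0) {
            --countdown;                             // one post-trigger frame captured
        }

        if (countdown == 0) {
            videoFreeze = true;                      // set the video freeze flag
            std::printf("freeze after frame %d (t=%.3f s)\n", frame, now);
        } else {
            recordPointer = (recordPointer + 1) % kRecords;  // advance record pointer
        }
    }
}
```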
  • Start operation 600 begins operation flow for control of which analysis information is transmitted to the analysis module.
  • operations illustrated in FIG. 6 are sub-operations that are performed during process operation 520 .
  • Selection operation 602 acquires the selection request of the type of analysis module the user of the analysis tool has requested to use.
  • the user requests to use the analysis tool through a monitor.
  • the analysis module requested might be a hard disk, a floppy disc, a tape disk, a CD, or any other recordable medium allowing the golfer or instructor to download analysis information for later use.
  • the analysis module requested might be a web server or a kiosk, thereby allowing a user to access the analysis information from a remote station.
  • the selection request acquired by selection operation 602 is preferably sent by the analysis module when the user logs on to the analysis tool through the analysis module. In other embodiments, the selection request might be through a user request communicated to the analysis tool directly from an input device. Once a selection request of a particular analysis module is acquired by the analysis tool, operation flow passes to format operation 604 .
  • Format operation 604 converts the analysis information into a format suitable for presentation onto the selected analysis module if the analysis information is not already in a suitable format. Once the analysis information is formatted according to the selected analysis module, operation flow passes to presentation operation 606. Presentation operation 606 presents the formatted analysis information to the analysis module in order for the module to deliver the analysis information to the user of the analysis tool. Once the analysis information is presented, operation flow passes to control selection request operation 608. Control selection request operation 608 waits for an input selection request from the input device.
  • the input selection request might be any task associated with control over the analysis tool, including, but not limited to, activation of the analysis tool, operation of the analysis tool, appearance of the presentation to the analysis module, selection of which analysis systems are used and presented through the analysis tool, and any control operation associated with use of the analysis tool.
  • operation flow passes to execution operation 610 .
  • Execution operation 610 executes the task associated with the input selection request.
  • presentation operation 606 presents analysis information incorporating performance of the task to the analysis module as requested by the input selection.
  • operation flow passes to control operation 608 and continues as illustrated above.
  • FIG. 10 illustrates operations related to a web-based application in accordance with an embodiment.
  • the web-based application is an interactive application providing a golf instruction and training process 1000 to a golfer over the Internet.
  • Prior to beginning the golf instruction and training process 1000, analysis information related to the golf swing must be processed by an analysis tool 100. That is, either during or following a training session, a lesson file is created that contains analysis information related to that lesson, e.g., tips, tricks, video data, etc., that may be accessed for future reference.
  • analysis operation 1004 is used to compile the information into a lesson file.
  • a lesson file is a compressed, encoded computer readable file that may contain video, still images, synchronized sensor data, text information, recorded audio, and necessary instructions to recreate events or other marked portions of the training session for subsequent access by users with access to the authorized decoding/presentation software.
  • Analysis operation 1004 marks specific analysis information for the web-based golf lesson. Such analysis information is marked by selection elements on the user interface of the analysis tool.
  • save drill selection element 736 , save screen selection element 738 , save before selection element 740 , and save after selection element 742 mark portions of the analysis information that are to be used with the web-based lesson. For instance, a swing recorded with video and associated measurement data prior to professional instruction might be marked to show the golfer an example of an undesirable swing. Additionally, a swing recorded with video and associated measurement data following professional instruction might also be marked to show the golfer improvement in his/her swing.
  • analysis operation 1004 allows marking of all forms of analysis information, including, instructor and student comments, measurement values, video playback, still shots associated with the video playback, audio clips, such as comments and observations from an instructor, and any other form of analysis information derived from the analysis tool.
  • the marking method may be used in creation of any saved lesson. That is, the marked material may be stored to a file and saved to a computer disc, videocassette or any other type of recording medium such that the lesson can be viewed at a later time by the user. Marking material to be saved to a file while the actual lesson is occurring saves time since the instructor does not have to review a recording of the entire lesson and manually select pertinent information, e.g., swings, comments, drills, etc. Instead, the instructor merely selects the appropriate screen element to mark the pertinent information, either a swing, comment, still shot, etc., contemporaneously for saving to the final recorded lesson.
  • the actual marking essentially relates to storing the information into a temporary file, and then, once the lesson is completed, the temporary information may be stored to a more permanent file.
  • Contemporaneous marking relates to the selection of pertinent content during the training session. Indeed, with respect to specific portions or events that occur during the training session, the marking occurs before, during, or substantially immediately after the occurrence of that event to preserve the relevant data in a predetermined location, separate from the other sensed information. In this respect, substantially immediately thereafter relates to the marking of information, such as information related to a particular swing, following the swing, but before the occurrence of the next swing or lesson instruction.
  • the student may make three consecutive swings, and before the fourth swing, the instructor decides that the information stored on the system that relates to the third swing should be marked for saving to the final lesson file. Prior to the fourth swing, the instructor marks the third swing to be saved to the lesson file. Additionally, the instructor may mark audio instructions or discussions related to the third swing to be saved along with video and/or positional measurement information related to the third swing.
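A simplified illustration of contemporaneous marking along these lines appears below; the swing and lesson-file structures, field names, and file contents are assumptions used only to show marked content being copied out of the running session before the next swing occurs.

```cpp
#include <cstdio>
#include <string>
#include <vector>

// Everything captured for one swing during the live session.
struct SwingRecord {
    int number;
    std::string video;            // placeholder for video + measurement data
    std::string instructorAudio;  // placeholder for recorded comments
};

// The growing lesson file: only content marked during the session ends up here.
struct LessonFile {
    std::vector<SwingRecord> markedSwings;
};

// Marking copies the selected swing into the lesson file immediately, before
// the next swing, so no post-lesson review pass is needed.
void markSwing(const SwingRecord& swing, LessonFile& lesson) {
    lesson.markedSwings.push_back(swing);
}

int main() {
    LessonFile lesson;
    std::vector<SwingRecord> session{
        {1, "swing1.avi", ""},
        {2, "swing2.avi", ""},
        {3, "swing3.avi", "keep the left arm straighter"}};
    // After the third swing (and before a fourth), the instructor marks it.
    markSwing(session[2], lesson);
    std::printf("lesson file holds %zu marked swing(s); first is swing %d\n",
                lesson.markedSwings.size(), lesson.markedSwings.front().number);
}
```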
  • upload operation 1006 uploads the saved lesson file, i.e., the marked analysis information, to the web-based application resident on a server.
  • any saved lesson file may be saved and uploaded to a web-server.
  • the entire lesson may be recorded and uploaded to the web-server.
  • connection operation 1008 refers to the act of a user connecting to the server via the World Wide Web and accessing the web-based application from a remote computer.
  • connection operation 1010 accesses the lesson information located on the web server. Such access may involve downloading the identification of the user and the marked analysis information associated with the user to the user's computer system.
  • format operation 604 formats the marked analysis information so that presentation of the marked analysis information may be controlled and displayed through the web-based application via the World Wide Web. Once the marked analysis information is formatted, operation flow passes to presentation operation 606 and continues as described in conjunction with FIG. 6 with user-control over the marked analysis information being provided through an Internet connection.
  • the above described analysis tool significantly improves the analysis of physical motion and the overall learning process for learning the proper athletic motion. Indeed, replaying the synchronized signals provides a valuable teaching tool in that a user can visualize swing measurement values of their own motion. Providing the combination of these signals removes guesswork associated with trying to pinpoint problem areas and the degree to which they are a problem. Additionally, the present invention relates to many improvements in the lesson process, such as combining numerous signals (video, audio, motion capture, impact analysis, etc.), allowing for numerous display options (video with motion capture values, movable value boxes, predetermined color scheme, etc.), and numerous playback options (tape, Web, etc.).

Abstract

An analysis system and method for providing athletic training and instruction by sensing different types of information, such as video, positional, and weight transfer information, during a training session and storing the information to a lesson file. The method and system mark portions of the sensed information contemporaneously during the training session. Contemporaneous marking allows for saving of only pertinent information to the lesson file without the need to review an entire lesson to generate the lesson file.

Description

    CROSS REFERENCE TO RELATED APPLICATIONS
  • This application is related to subject matter disclosed in U.S. patent application for a Method and System for Physical Motion Analysis, Serial No. (Attorney Docket No. 40154.1US01), and U.S. patent application for a Method and System for Presenting Information for Physical Motion Analysis, Serial No. (Attorney Docket No. 40154.3US01), both of which are filed concurrently herewith; the subject matter of those applications is incorporated in this application by reference. [0001]
  • TECHNICAL FIELD
  • The invention relates generally to a method and system for providing physical motion training and instruction. More particularly, the invention relates to a computer-implemented system for providing athletic training and instruction. Even more particularly, the present invention relates to saving of lesson files for future review. [0002]
  • BACKGROUND OF THE INVENTION
  • Over the course of time, many different techniques have been implemented in order to teach the proper mechanics of swinging a golf club. Currently, most instructors, e.g., golf professionals, use a video analysis system to teach a student how to properly swing a golf club. Using a typical video analysis system, the student's golf swing is captured by a video-recording device. The instructor replays the recorded video information to illustrate the student's golf swing while providing feedback regarding the swing. Instructional feedback may be comments relative to problems associated with the student's swing, compliments regarding improvement in the student's swing, suggestions on correcting the user's swing, or any other verbal instructional comments in context with the student's swing. Visualizing one's personal golf swing in this manner has been recognized as a valuable tool in identifying problems as well as correcting those problems in order to improve the overall golf swing. [0003]
  • Although video analysis systems are widely used by golf professionals, these systems have particular drawbacks. One particular drawback relates to the fact that a golf professional must subjectively analyze the video information. Not only is this analysis subjective and therefore open to interpretation and subject to inaccuracies, but also such analysis is exacerbated by the fact that many problems associated with a golf swing are typically not captured by the video recording system given different camera angles, too few cameras, or loose clothing. Therefore, golf professionals are typically forced to guess the problem. Accordingly, the advice given by a golf professional may be inaccurate since it is difficult to isolate mechanics and measurements of the swing on video. [0004]
  • In order to overcome the drawbacks associated with typical video analysis systems, instructors have implemented motion or position analysis systems. Current motion analysis systems require the student/athlete to wear sensor elements on their body, and the sensor elements transmit positional data of isolated body parts, such as hands, hips, shoulders and head. The isolated points on the body are measured during a swing in accordance with an absolute reference system, e.g., a Cartesian coordinate system wherein the center point is a fixed point in the room. By using motion analysis, exact measurements are provided from which an instructor can more accurately determine problems in a student's swing. Even though motion analysis provides accurate positional data of the student's swing, it is not, in and of itself, particularly useful since it gives no visual aid as to where the problems may really be. When used by itself, the motion analysis system is not an effective teaching tool since the instructor is only provided with numbers and not a visualization of what the student is doing wrong. Some motion analysis systems provide animation that depicts elements of a golf swing based upon captured data. However, the animation is crude and does not show the golfer what he/she looks like during a swing. [0005]
  • Consequently, motion analysis systems are used with video analysis systems in order to try to overcome the problems associated with each system as it is used independently of the other. The instructor will use the motion capture data and subjectively map the information to the video data. Although this provides more specific data to the instructor, it is associated with at least one significant problem. The instructor, while viewing the video, must estimate the swing positions corresponding to the data points from the motion analysis information. Accordingly, analysis of the swing requires not only considerable effort, but also a significant amount of estimation in associating the positional data points with an associated position on the student's swing. Not only must the instructor estimate which portions of the video information relate to the corresponding portions of the positional measurement information, the instructor must also do so for hundreds, if not thousands, of data points if a complete analysis is performed. Clearly, this task is burdensome at best and most likely impossible. [0006]
  • Moreover, the systems for providing the video analysis are separate from the systems that provide motion capture information such that the instructor must manipulate numerous controls for displaying, to the student, the various positional measurement values as well as for providing separate video replays. [0007]
  • Another problem associated with current methods of providing instructional information to the student relates to the fact that following a teaching session, students are typically provided a copy of the recorded session. Given that the entire teaching session is recorded, much of the recorded material is redundant or otherwise unnecessary. Thus, in order to provide only relevant material to the student, the instructor must review the entire recorded lesson and select and separately save only the relevant material. Doing so consumes a significant amount of time and effort. [0008]
  • It is with respect to these and other considerations that the present invention has been made. [0009]
  • SUMMARY OF THE INVENTION
  • In accordance with this invention, the above and other problems are solved by an analysis tool that synchronizes at least two signals carrying sensed information associated with physical motion. The synchronized signals are used in providing analysis related to the physical motion conducted. In accordance with one aspect of the present invention, the analysis tool incorporates a processing environment and at least two sensors sensing information related to physical motion. The processing environment synchronizes signals received from the sensors and processes the synchronized signals to generate analysis information. The analysis information provides information to allow for correction and instruction. In accordance with other aspects, the processing environment includes a synchronization module to perform the synchronization of the signals, a processing module for processing the sensed information into analysis information, and an analysis module for presenting the analysis information to the athlete. Consequently, the present invention synchronizes information signals carrying two different forms of information, processes these signals, and presents combined information to provide correction and instruction. [0010]
  • In accordance with certain aspects of the invention, the analysis tool is used to provide athletic training and instruction. Two or more signals carrying sensed information associated with athletic motion are synchronized to provide an athlete with analysis regarding the athletic motion sensed. In accordance with one aspect of the present invention, the analysis tool is used for golf swing analysis. While used for golf swing analysis, the signals relate to video frame data signal carrying video information of a golf swing and a positional data signal carrying positional motion information associated with positional measurements of elements of the golf swing. The video frame data signal and the positional data signal are synchronized by the analysis tool to provide golf swing analysis. In accordance with other aspects, the analysis tool might be used for educational analysis of any element of an athletic motion where the element is used as a measure through which a sport is conducted. [0011]
  • In accordance with other aspects, the present invention relates to an overall system for providing athletic training and instruction. The system has a first sensor generating a first information signal carrying a first type of sensed information and a second sensor generating a second information signal carrying a second type of sensed information. The analysis tool also includes a synchronization module receiving the first and second signal and synchronizing the first signal with the second signal to provide a combined signal that can be used for athletic training and instruction. [0012]
  • In accordance with still other aspects, the present invention relates to a method for providing athletic training and instruction, wherein the method includes the acts of receiving a first signal carrying sensed information samples from a first sensor and receiving a second signal carrying sensed information samples from a second sensor. The sensed information from the second sensor is associated with a different form of sensed information than carried by the first signal. The method includes the act of synchronizing the first signal with the second signal to provide an analysis tool for providing athletic training and instruction. [0013]
  • In accordance with other aspects, the present invention relates to a display device that displays both motion capture information and video information, whether synchronized or not. The system also provides one control panel that can control the replay of two or more video signals and/or positional information. [0014]
  • In accordance with yet other aspects, the present invention also relates to a method of marking information while the training session is in process. Marked content is saved to a separate file that may be reviewed at a later time. By marking content contemporaneously during the training session, the instructor is not forced to review the entire lesson to select the relevant material. Instead, only the marked material is reviewed for relevancy and saved to the lesson file. [0015]
  • In accordance with these certain aspects, the present invention relates to an analysis method and system for analyzing physical motion, e.g., a golf swing, wherein the physical motion occurs during a training session. Sensors sense the physical motion and generate information signals related to sensed information. The system also has a storage medium for storing the sensed information conducted by the information signals and a marking module for marking a plurality of portions of the sensed information. The marking module marks the portions of sensed information contemporaneously with the training session. The marking module may be user-activated or might provide automatic marking of predetermined content. [0016]
  • The sensed information may include video information, audio information and/or positional measurement information. Additionally, some of the sensed information may be synchronized, e.g., the video information and the positional measurement information may be synchronized. Moreover, the saved lesson file may be uploaded to a server computer system, wherein the lesson file on the server computer system may be accessed by a remote client computer system over a network connection, such as the Internet. The marked content may be displayed on a monitor. [0017]
  • The invention may be implemented as a computer process, a computing system or as an article of manufacture such as a computer program product or computer readable media. The computer program product may be a computer storage media readable by a computer system and encoding a computer program of instructions for executing a computer process. The computer program product may also be a propagated signal on a carrier readable by a computing system and encoding a computer program of instructions for executing a computer process. [0018]
  • These and various other features as well as advantages, which characterize the present invention, will be apparent from a reading of the following detailed description and a review of the associated drawings.[0019]
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a functional diagram of an analysis tool in accordance with an embodiment of the present invention and the associated environment. [0020]
  • FIG. 2 is a simplified block diagram that illustrates functional components of an analysis tool such as the analysis tool shown in FIG. 1, the analysis tool having sensors and a processing environment. [0021]
  • FIG. 3 is a functional diagram of the sensors and the processing environment of the analysis tool shown in FIG. 2 in accordance with an embodiment of the present invention. [0022]
  • FIG. 4 is a flow diagram that illustrates operational characteristics for providing analysis information to an analysis module. [0023]
  • FIG. 5 is a flow diagram that illustrates operational characteristics shown in FIG. 4 in more detail in accordance with an embodiment of the present invention. [0024]
  • FIG. 6 is a flow diagram that illustrates operational characteristics related to control of analysis information using an input device. [0025]
  • FIG. 7 is a reproduction of a display screen for presenting analysis information from an analysis tool such as the analysis tool shown in FIG. 1. [0026]
  • FIG. 8 is a reproduction of a display screen for presenting synchronized analysis information from the analysis tool shown in FIG. 1. [0027]
  • FIG. 9 is a flow diagram that illustrates operational characteristics related to the video frame sample acquisition process shown in FIG. 5. [0028]
  • FIG. 10 is a flow diagram that illustrates operational characteristics related to providing physical motion training and instruction via the World Wide Web.[0029]
  • DETAILED DESCRIPTION OF THE INVENTION
  • An [0030] analysis tool 100 used to provide synchronization of various elements is shown in FIG. 1. In an embodiment, the analysis tool 100 synchronizes signals from a video analysis system 102 with signals from a position analysis system 104. Signals from the video analysis system 102 and the position analysis system 104 carry sensed information associated with a physical motion. The resulting synchronized signals may then be used to provide physical motion analysis related to the performed motion. In accordance with an embodiment of the invention, the analysis is presented to provide the person performing the motion with correction and instruction such as instruction to improve the person's golf swing. Although the analysis tool 100 is described below as a system and method for providing golf swing analysis, the analysis tool 100 might be similarly used to provide motion analysis in other sports, such as baseball, tennis, cricket, polo, or any other sports where an athlete's motion is a measure through which an element of the sport is conducted. Moreover, the analysis tool 100 might be similarly used to provide almost any form of physical motion analysis.
  • In accordance with an embodiment, the [0031] video analysis system 102 uses video recording equipment to record a physical motion and to transmit a recorded video information signal 108 to a process environment 114. Meanwhile, the position analysis system 104 captures positional information and transmits a positional information signal 110 to the process environment 114. Process environment 114 interprets the received video 108 and positional 110 information signals sent to the process environment 114 and synchronizes the signals. The process environment 114 processes the synchronized signals in order to generate analysis, or teaching information, which may be used for golf swing analysis and training. Although shown in FIG. 1 as a relatively typical personal computer type environment, process environment 114 may be any combination of elements capable of receiving signals and synchronizing those signals. Additionally, as process environment 114 is a personal computer, it is capable of displaying information related to the synchronization of the signals, but such a display is not a necessary component of process environment 114.
  • In an alternative embodiment, the [0032] analysis tool 100 also has an impact analysis system 106, which captures impact information and conducts an impact information signal 112 to the process environment 114. In this embodiment, process environment 114 synchronizes the three information signals 108, 110 and 112. In other embodiments, analysis tool 100 synchronizes a video information signal 108 provided by the video analysis system 102 with an impact information signal 112 provided by the impact analysis system 106. In yet other embodiments, the analysis tool 100 synchronizes a positional information signal 110 provided by the position analysis system 104 with an impact information signal 112 provided by the impact analysis system 106. In yet other embodiments, a pressure information signal (not shown) might be used with the analysis tool 100. In this embodiment, the pressure information signal might be synchronized with any one of the information signals 108, 110 or 112, or a combination of the signals 108, 110, 112.
  • A simplified illustration of the functional components of an [0033] analysis tool 199 that incorporates aspects of the analysis tool 100 shown in FIG. 1 is shown in FIG. 2. The analysis tool 199 has at least two sensors 202 and 204 that communicate with a synchronization module 200. The synchronization module 200 synchronizes physical motion information received from the sensors 202 and 204 and communicates the resulting synchronized information to a processing module 212. The physical motion information might be associated with any form of physical motion subject to correction and instruction. In accordance with one embodiment, the physical motion information is associated with an element of an athletic sport, such as, but not limited to a swing, a stroke, a run, a throw, a catch, or any other motion associated with an element through which a sport might be conducted. In other embodiments, the physical motion information might be associated with motions related to physical or occupational therapy. In accordance with the embodiment depicted in FIG. 1, the physical motion information might be associated with a golf swing. The processing module 212 receives synchronized information from the synchronization module 200 and, in turn, processes the synchronized information in order to provide analysis information to an end user. The analysis information is then used to provide physical motion correction and instruction.
  • The analysis information is in a form suitable for review by the user. Therefore, analysis information may be a video replay of a golf swing, a visual representation of positional data that has been gathered, a visual representation of impact information that has been gathered, etc. In accordance with an embodiment, such analysis information is provided to the user through a graphical user interface operating on a computer display. The display may be located on-site, e.g., where the golf swing is performed, or located remotely. Typically, the remote display relates to replaying the recorded lesson on a television or computer at another location. The recorded lesson may be recorded onto a videocassette, compact disc, floppy disc or other readable memory. Additionally, the recorded lesson may be stored on a web server and the user may access the lesson via the World Wide Web. [0034]
  • A [0035] first sensor 202 senses information and then transmits a first information signal 208 relative to the sensed information to the synchronization module 200. Likewise, a second sensor 204 senses a different type of information than the first sensor 202 and transmits a second information signal 210 to the synchronization module 200. The information signals 208 and 210 may be either analog or digital and may be partitioned into time samples, or segments. In an alternative embodiment, the analysis tool 199 might use more than two sensors in obtaining more than two forms of sensed information.
  • The information signals [0036] 208 and 210 are delivered to the synchronization module 200 substantially contemporaneously. Contemporaneous conduction of these signals may be achieved by real-time conduction of the signals as they are sensed by the sensors 202 and 204. In accordance with various embodiments of the present invention, the sensed information might be positional information related to a golfer's swing, video information relative to a golfer's swing, impact information relative to impact of the club head with a golf ball resulting from a golfer's swing, or pressure information relative to weight transfer associated with a golf swing. Accordingly, information signals 208 and 210 might be a positional information signal 110, a video information signal 108, an impact information signal 112, or a pressure information signal (not shown). Additionally, the sensed information might be any form of information related to a stroke, swing, movement, or motion of an athlete performing acts while engaged in any sport. Additionally, the sensed information might be any form of information related to physical motion. Regardless of the type of information sensed by the first sensor 202, a second sensor 204 transmits an independent information signal 210 relative to a type of sensed information other than the information sensed by the first sensor 202.
  • Once acquired by the [0037] synchronization module 200, the information signals 208 and 210 are synchronized. Synchronization of the information signals 208 and 210 may be accomplished in several ways to ensure that portions, or samples, of one signal (such as 208) relate to portions, or samples, of the other signal (such as 210) based on associated time information. In an embodiment, the information signals 208 and 210 might be time-stamped using an internal clock mechanism. Accordingly, each sample from the first information signal 208 corresponds to a sample from the second information signal 210. In this embodiment, time stamps are administered on each information signal 208 and 210 on preset intervals such that corresponding samples of the signals 208 and 210 are identified by the same time stamp. In another embodiment, time stamps are administered on each information signal 208 and 210 independently and the association of the samples is accomplished through a comparative analysis performed by the synchronization module 200. Time stamping the information signals 208 and 210 creates synchronized information that is transmitted to the processing module 212 to provide synchronized analysis associated with the information acquired by the sensors 202 and 204.
  • Alternatively, the information signals [0038] 208 and 210 may be synchronized using associated times derived from time stamps without a corresponding time stamp in the other signal. For example, the first information signal 208 may contain five samples to every one sample of the second information signal 210. In such an example, even though the samples do not correspond to the same time stamp, the samples might be associated such that one sample of the first information signal 208 relates to five samples of the second information signal 210. In accordance with another embodiment, if the samples do not correspond to the same time stamp, interpolation might be used to supply missing data points to the signal of sensed information lagging in time samples. Interpolation can be administered through a conventional polynomial equation such that one sample of data related to the first information signal 208 exists for every sample of data related to the second information signal 210.
  • An [0039] analysis tool 300, in accordance with an embodiment of the present invention, is shown in FIG. 3. The analysis tool 300 incorporates information signals from a position analysis system 350 and a video analysis system 352. In accordance with another embodiment, the analysis tool 100 also incorporates an information signal from an impact analysis system 354. The information signals 362, 360, and 364 are respectively provided by the systems 352, 350, and 354 to the process environment 114 (FIG. 1) for synchronization. In an embodiment, the position analysis system 350 includes a motion capture system 320, a motion data acquisition module 308, and a communication connection to the process environment 114 (FIG. 1). Likewise, the video analysis system 352 includes a video capture system 322, a capture board 306, a video frame acquisition module 310, and a communication connection to the process environment 114 (FIG. 1). Likewise, the impact analysis system 354 includes an impact event sensor 324, an impact data acquisition module 312, and a communication connection to the process environment 114 (FIG. 1).
  • In accordance with an embodiment of the present invention, the [0040] motion capture system 320 might be administered by at least one magneto-sensitive sensor contained in a magnetic field. Specifically, in one embodiment, the motion capture system might be a Polhemus Iso-Track II™ magnetic sensor system. Multiple magneto-sensitive sensors are placed on the golfer's body at positions corresponding to particular swing elements of a golfer's swing. The magneto-sensitive sensors are used to transmit positional measurement information as the swing moves through the magnetic field. In a specific embodiment, thirty positional measurement data samples/second are captured in binary mode using two magneto-sensitive sensors. In accordance with an alternative embodiment, the motion capture system 320 might be administered by at least one color or retro-reflective marker contained in an image field responsive to colors or reflectiveness of the marker.
  • The [0041] motion capture system 320 conducts positional measurement information via a positional information signal 360 to the motion data acquisition module 308 for data compilation and documentation. Additionally, the motion data acquisition module 308 might convert the signal 360 to a format recognizable by the processing module 314 if necessary. A synchronization module 301 receives the positional information signal 360 and synchronizes the signal 360 to a video information signal 362. The positional information signal 360 carries information identifying sensed positional measurements of swing, or motion, elements relative to a three-dimensional coordinate system.
  • In accordance with one embodiment of the invention, the three-dimensional coordinate system about which positional elements are measured might be an absolute coordinate system. Under an absolute coordinate system, positional measurements for each element of a physical motion are taken with reference to a single, fixed origin that is independent from the person performing the physical motion. For example, in the embodiment described in FIG. 1, rotations of the shoulder and hip during a golf swing are all measured with reference to an absolute origin, such as a fixed spot on the floor. Thus, an absolute origin is an origin used for all element measurements of the physical motion for each different user. As an example, from a fixed spot on the floor, an axis system may be defined, as shown in FIG. 1, e.g., where the x-axis and the z-axis are parallel to the floor and perpendicular to each other, and where the y-axis is perpendicular to the x-axis and z-axis and is orthogonal to the floor. Using this axis system, measurements may be taken using angular rotation values. [0042] For example, Φs might represent a rotational angle of the shoulders around the x-axis, Θs might represent a rotational angle of the shoulders around the y-axis, ζs might represent a rotational angle of the shoulders around the z-axis, Φh might represent a rotational angle of the hips around the x-axis, Θh might represent a rotational angle of the hips around the y-axis, and ζh might represent a rotational angle of the hips around the z-axis. Accordingly, Φ relates to shoulder and hip bend, Θ relates to shoulder and hip rotation, and ζ relates to shoulder and hip tilt. Measured with reference to the absolute coordinate system, positional elements related to bend, rotation, and tilt of both the shoulders and the hips are measured around the absolute origin.
  • In accordance with an alternative embodiment, the coordinate system might be a referenced coordinate system. In a reference coordinate system, positional elements are measured with reference to coordinate origins that are unique to the user. For example, in the embodiment described in FIG. 1, measurements related to rotation, bend, and tilt of the shoulder might be referenced to a coordinate system having an origin located on a golfer's hip. Illustrating this example, the measurements described above while discussing the absolute coordinate system are used to determine rotational positions about the reference coordinate system. [0043] Accordingly, the rotational position of the shoulders around the x-axis (Φsp), the rotational position of the shoulders around the y-axis (Θsp), and the rotational position of the shoulders around the z-axis (ζsp) are defined as follows:
  • Φsp=Φs−Φh cos(Θs−Θh)+ζh sin(|Θh|)
  • Θsp=Θs−Θh
  • ζsp=ζs−Φh sin(Θs−Θh)+ζh cos(|Θh|)
  • Whereas positional measurements associated with the shoulders are used in this illustration, positional measurements of other elements, such as hip rotation, wrist rotation, head rotation, or any other element associated with a golf swing or other physical motion, might be measured against a different reference coordinate system than used in this example. For instance, hip rotation might be measured around a coordinate system referenced to an origin located around a golfer's knees. [0044]
  • In accordance with an embodiment of the present invention, the [0045] video capture system 322 includes at least one video recording device transmitting a video feed signal carrying video frame samples defining image information. In accordance with a specific embodiment, the video capture system 322 uses 2 analog, 60 frames/second, interlaced video cameras with s-video outputs. The video cameras are positioned such that the front and the side view of a golfer are captured. Moreover, the frame size of the cameras is 400 by 480, thereby filling two video windows on a graphical user interface on a display.
  • In accordance with an embodiment, each video feed signal is transmitted to a [0046] capture board 306. Each capture board 306 converts the image information carried in the video feed signal to video frame data that the video acquisition module 310 can recognize. In accordance with a specific embodiment, each capture board might be a video framegrabber card configured for the s-video mode. Video frame samples are carried in a video information signal 362 from the capture board 306 to the video acquisition module 310 for data documentation and compilation. Additionally, the video data acquisition module 310 might convert the signal 362 to a format recognizable by the processing module 314. The synchronization module 301 receives the video information signal 362 and synchronizes the video information signal 362 to the positional information signal 360.
  • In accordance with an embodiment, an [0047] impact analysis system 354 is incorporated into analysis tool 300 with the video analysis system 357 or the position analysis system 350, or both. The impact analysis system 354 senses impact information related to the impact of a golf club head and a golf ball (FIG. 1). Two forms of impact information—impact measurement information and impact event information—are sensed by the impact analysis system 354. Impact measurement information is associated with clubface angles and measurements as the club approaches the golf ball, strikes the golf ball and follows through. Impact measurement information allows for calculations related to the velocity, distance, and direction of the golf ball upon impact. Such information is important in understanding the mechanics of a golf club swing. On the other hand, impact event information is associated with the exact time of impact between the ball and the club and therefore, indicates the occurrence of an event. Impact information is transmitted through the impact information signal 364 once impact occurs. An impact analysis sensor 324 detects impact information and may be a radar sensor, a high-speed video recording device, a pressure sensor, a laser grid sensor, or any equivalent sensor for sensing impact-related information. In one embodiment, the impact analysis sensor 324 is a laser grid sensing the impact between a club head and the golf ball. In other embodiments, other types of sensors may be used to collect impact measurement information. Alternatively, impact measurement information might not be collected at all.
  • In accordance with one embodiment of the present invention, the [0048] impact analysis system 354 might be administered through a laser grid sensor, such as a Focaltron Achiever™ laser grid device contained in a custom mounting feature about the impact zone. In accordance with another embodiment, the laser grid sensor might serve as the only impact analysis sensor 324 collecting both impact measurement information and impact event information. In other embodiments, the laser grid sensor is used solely to collect impact measurement information. In order to collect impact measurement information, the sensor positions a laser grid surrounding the point of estimated impact between a golf ball and a golf club. Once the clubface enters the laser grid, the sensor detects various clubface measurements as the clubface extends through the grid. The impact information signal 364 carries information relative to the sensed clubface through the laser grid along with the impact event information.
  • In order to determine which video frame and positional measurement information samples should be synchronized, a [0049] trigger event system 332 is used. The trigger event system 332 is operably connected to processing module 314 so that a triggering event signal 334 can be communicated to the processing module 314. The triggering event signal 334 relates the occurrence of a trigger event, which provides the reference point in time, i.e., the trigger event time, that allows the processing module 314 to define a timing window for analysis. The timing window may be defined by a start time equal to the trigger event time minus a predetermined period, e.g., 3 seconds, and an end time equal to the trigger event time plus a predetermined period, e.g., 3 seconds. The data collected within the timing window is marked and stored for analysis and/or playback. If the collected data from the video 352 and position 350 analysis systems falls outside the timing window, then it is discarded out of the buffers 302, 304. The trigger event may be caused by manual selection of an input request, predetermined positional coordinates on a golfer's swing, or any other triggering operation associated with a golfer's swing. Additionally, the trigger event may be caused by impact between the golf ball and the golf club head. In accordance with one embodiment, the trigger event is sensed by a microphone, or other acoustical measurement sensor, sensing impact between a club head and a golf ball.
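  • Because the embodiments described below are implemented as C++ programs, the timing-window bookkeeping above can be illustrated with a minimal C++ sketch. All names are hypothetical, the use of seconds as the time unit is an assumption, and the 3-second pre- and post-trigger periods are only the example values given above.

```cpp
// Minimal sketch (hypothetical names): deciding whether a time-stamped sample
// falls inside the timing window defined around a trigger event. Samples
// outside the window are discarded from the sample buffers.
#include <iostream>

struct TimingWindow {
    double startSec;   // trigger event time minus the pre-trigger period
    double endSec;     // trigger event time plus the post-trigger period
};

TimingWindow makeWindow(double triggerTimeSec,
                        double preSec = 3.0, double postSec = 3.0) {
    return TimingWindow{triggerTimeSec - preSec, triggerTimeSec + postSec};
}

bool insideWindow(const TimingWindow& w, double sampleTimeSec) {
    return sampleTimeSec >= w.startSec && sampleTimeSec <= w.endSec;
}

int main() {
    TimingWindow w = makeWindow(/*triggerTimeSec=*/10.0);
    std::cout << insideWindow(w, 8.5) << ' '    // 1: kept for analysis/playback
              << insideWindow(w, 14.2) << '\n'; // 0: discarded from the buffers
}
```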
  • In accordance with an embodiment of the present invention, the [0050] position analysis system 350, the video analysis system 352, and the impact analysis system 354 transmit information signals 360, 362, and 364 to a synchronization module 301. As information signals 360 and 362 enter the synchronization module 301 from the position analysis system 350 and the video analysis system 352, respectively, samples on the information signals 360 and 362 are time-stamped as described in conjunction with FIG. 2. The samples identified with the time stamp are stored in sample buffers 302 and 304. In particular, the samples stamped from the positional information signal 360 are stored in a metric sample buffer memory 302 and the samples stamped from the video information signal 362 are stored in a video sample buffer memory 304.
  • In accordance with an embodiment, the sample buffers [0051] 302 and 304 only hold the stamped data points for a limited amount of time. The sample buffers 302 and 304 are preferably designed as first-in, first-out (FIFO) buffers. Accordingly, once the buffer memories 302 and 304 are full, earlier samples are erased as new samples are received by the buffer. Buffer memories 302 and 304 continue storing information until the time period defined by the timing window has expired. Once the timing window has expired, the information is marked and stored to disk or another portion of memory to be used during analysis.
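  • A minimal C++ sketch of such a fixed-capacity FIFO sample buffer follows. The class and member names are hypothetical and the capacity is arbitrary; only the eviction behavior described above is modeled.

```cpp
// Minimal sketch (hypothetical names): a FIFO sample buffer that, once full,
// erases the earliest sample as each new sample arrives, as described for
// buffers 302 and 304.
#include <cstddef>
#include <deque>
#include <iostream>

template <typename Sample>
class FifoSampleBuffer {
public:
    explicit FifoSampleBuffer(std::size_t capacity) : capacity_(capacity) {}

    void push(const Sample& s) {
        if (samples_.size() == capacity_) {
            samples_.pop_front();   // earliest sample is erased when full
        }
        samples_.push_back(s);
    }

    std::size_t size() const { return samples_.size(); }

private:
    std::size_t capacity_;
    std::deque<Sample> samples_;
};

int main() {
    FifoSampleBuffer<int> buf(3);
    for (int i = 0; i < 5; ++i) buf.push(i);   // samples 0 and 1 are evicted
    std::cout << buf.size() << '\n';           // prints 3
}
```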
  • With respect to [0052] impact sample buffer 326 and in accordance with one embodiment, the impact analysis system 354 is not enabled until the impact analysis sensor 324 senses the impact event. In this embodiment, impact measurement information is not carried in the impact information signal 364. Once sensed, impact event information is time stamped, transmitted to the process environment 114 in the impact information signal 364, and stored in impact sample buffer 326. In accordance with another embodiment, the impact analysis system 354 is enabled as the iron of the golf club approaches and extends through the golf ball. Once the impact analysis system 354 is enabled, impact measurement information is collected, time-stamped, and carried by the impact information signal 364 to the process environment 114. The impact measurement information is stored in the impact sample buffer 326. In accordance with an alternative embodiment, impact sample buffer 326 might not be used to store information when the trigger event system 332 is used without the impact analysis system 354.
  • In accordance with an embodiment, the [0053] video analysis system 352 and the position analysis system 350 continue collecting information until the timing window expires. Continuation of the information collection by the position 350 and video 352 analysis systems ensures that both systems 350 and 352 collect information related to the follow-through swing of the golfer. As long as the position analysis system 350 and the video analysis system 352 continue collecting information, the synchronization module 301 continues time stamping samples on the information signals 360 and 362. In an alternative embodiment, the video analysis system 352 and the position analysis system 350 terminate information collection once a trigger event is sensed.
  • All data samples, whether video, positional, impact, audio or any other sample associated with a physical motion, are time stamped using the same timebase. In accordance with a specific embodiment, the timebase might be a Win32™ high-precision timer. In such an embodiment, the [0054] position analysis system 350 grabs a sample about every 33 ms and the video analysis system 352 grabs a sample about every 16 ms. Therefore, identical positional measurements are stored in the metric sample buffer 302 for more than one image record being stored in the video sample buffer 304. Being on the same timebase, the timer information indicates the relative location in time at which the samples were gathered. Headers of the video sample buffer 304 contain information corresponding to positional measurement samples stored in the metric sample buffer 302.
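  • The association of faster video samples with slower positional samples on the common timebase can be sketched in C++ as follows. The structure and function names are hypothetical, and the 33 ms and 16 ms intervals are only the approximate rates given above.

```cpp
// Minimal sketch (hypothetical names): for each video frame time stamp on the
// common timebase, find the most recent positional measurement sample, so the
// same positional sample may be associated with more than one image record.
#include <cstdint>
#include <iostream>
#include <vector>

struct MetricSample { std::uint64_t timeMs; double shoulderTurnDeg; };

// Returns the index of the latest metric sample taken at or before frameTimeMs.
std::size_t latestMetricAtOrBefore(const std::vector<MetricSample>& metrics,
                                   std::uint64_t frameTimeMs) {
    std::size_t best = 0;
    for (std::size_t i = 0; i < metrics.size(); ++i) {
        if (metrics[i].timeMs <= frameTimeMs) best = i;
    }
    return best;
}

int main() {
    std::vector<MetricSample> metrics = {{0, 10.0}, {33, 12.5}, {66, 15.0}};
    for (std::uint64_t frameTime : {0, 16, 33, 49, 66}) {
        std::cout << "frame@" << frameTime << "ms -> metric@"
                  << metrics[latestMetricAtOrBefore(metrics, frameTime)].timeMs
                  << "ms\n";
    }
}
```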
  • Upon completion of the trigger countdown, the [0055] video analysis system 352 is stopped. Once the collection and compilation of data by the position analysis 104 and video analysis 102 systems is completed, e.g., when the timing window expires, the stored positional and video frame data samples are transmitted from the synchronization module 301 into the processing module 314. The processing module 314 transforms the data stored in the buffers 302, 304, and 326 into analysis information. In accordance with an embodiment, the processing module 314 is a data processing system processing the data stored in the buffers 302, 304, and 326 into information of a form suitable for a user. Specifically, the processing module 314 might be a part of a desktop computer system having typical input/output capabilities, a processing unit, and memory functionality.
  • In an embodiment, the [0056] processing module 314 receives information resulting from a timing window and automatically stores that information in such a manner that if the system crashes during the processing stage, then the information may be recovered. In such a case, all information received by the processing module 314 is stored to a temporary file. This temporary file may then be erased once the lesson is explicitly stored into a more permanent file. Additionally, this temporary file is typically only used to restore information due to a crash, but may be accessed for other reasons.
  • In accordance with a specific embodiment, the [0057] processing module 314 discards redundant records of positional measurement samples. The processing module 314 may also apply a spline fit algorithm to the positional measurement samples. Using the spline parameters, which are based on the smooth motion being measured, the metric value at each frame time may be computed. This calculated data is written into a positional measurement file which is ultimately saved as part of an archived lesson. A sketch of this resampling step appears below.
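  • The following C++ sketch illustrates evaluating a positional metric at each video frame time. The embodiment above specifies a spline fit; piecewise-linear interpolation is used here only as a simplified stand-in for evaluating the fitted curve, and all names and sample values are hypothetical.

```cpp
// Minimal sketch (hypothetical names): resample a positional measurement at
// the video frame times so one metric value exists per stored video frame.
#include <iostream>
#include <vector>

struct Metric { double timeMs; double value; };

double evaluateAt(const std::vector<Metric>& samples, double tMs) {
    if (tMs <= samples.front().timeMs) return samples.front().value;
    if (tMs >= samples.back().timeMs)  return samples.back().value;
    for (std::size_t i = 1; i < samples.size(); ++i) {
        if (tMs <= samples[i].timeMs) {
            double span = samples[i].timeMs - samples[i - 1].timeMs;
            double frac = (tMs - samples[i - 1].timeMs) / span;
            return samples[i - 1].value +
                   frac * (samples[i].value - samples[i - 1].value);
        }
    }
    return samples.back().value;
}

int main() {
    // Hypothetical shoulder-turn samples at ~33 ms spacing.
    std::vector<Metric> shoulderTurn = {{0, 0.0}, {33, 20.0}, {66, 55.0}};
    for (double frameTime : {0.0, 16.0, 33.0, 49.0}) {   // ~16 ms frame times
        std::cout << frameTime << " ms -> "
                  << evaluateAt(shoulderTurn, frameTime) << " deg\n";
    }
}
```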
  • The analysis information is thereafter transmitted to the [0058] analysis module 315 through at least one analysis information signal 330. Analysis information is information derived from video, motion, or impact analysis and presented in a form which can be interpreted by a user. In accordance with one embodiment, the analysis information is presented to the analysis module 315 in real time so that a user may monitor a golf swing and various measurements associated with the golf swing as the swing is conducted. For instance, the analysis module 315 might present positional analysis information synchronized with video analysis information while the user is monitoring both forms of analysis information at the same time he/she is conducting the swing. The positional analysis information is presented as measurements that vary dynamically as the swing is conducted. For example, while the video analysis information presents an image of the swing at an address position, the positional measurement associated with a particular swing element is also defined at the address position. In accordance with another embodiment, recorded analysis information is presented to the analysis module 315 so that a user may review a golf swing and various measurements associated with the golf swing at a later time. The recorded analysis information contains information from at least two analysis systems, such as the video 352 and position 350 analysis systems, that are synchronized to the common timebase.
  • Alternatively, weight transfer sensor information may be synchronized with video and/or the position information. In such a case, the analysis information that is displayed provides the synchronized information from the weight transfer information along with the video and/or position information. Additionally, grip pressure information may also be sensed by one of the sensor systems and synchronized along with the video, position, and/or weight transfer information and displayed accordingly. [0059]
  • [0060] Input device 318 is operably connected to the processing module 314 and may be used to control the selection, operation, and appearance of analysis information in accordance with an embodiment of the present invention. For instance, the input device 318 may control the selection of which signals are currently presented to the analysis module 315. If the golfer only wants video and position analysis displayed on the analysis module 315, such a request is preferably made through the input device 318. Likewise, the input device 318 might allow the golfer or instructor to control a video playback of the golf swing. In accordance with another embodiment of the present invention, the input device 318 might be responsible for complete control of user selection, activation, operation, and termination of the analysis tool 300. If the input device 318 is responsible for complete control of the analysis tool 300, then the input device 318 might also be used as the trigger event system 332.
  • In an embodiment, the [0061] analysis module 315 might be a monitor. In accordance with a specific embodiment, the analysis module 315 contains a video adapter that has an s-video output to duplicate a monitor display on a conventional television. In other embodiments, the analysis module 315 might be a web server or a kiosk, thereby allowing a user to access the analysis information from a remote station. Indeed, one embodiment of the invention is presentation of the analysis information through an Internet connection such that a golfer may participate in a remote lesson. As such, the analysis module 315 might communicate to the remote station through an Ethernet, a wireless, or a TCP/IP protocol connection. FIG. 10, described below, represents operations performed to provide physical motion training and instruction via the World Wide Web. In yet other embodiments, the analysis module 315 might be a hard disk, a floppy disc, a tape disk, a CD, or any other recordable medium allowing the golfer or instructor to download analysis information for later use.
  • A user interface presenting analysis information derived from the [0062] video analysis system 352 is shown in FIG. 7, in accordance with an embodiment of the present invention. FIG. 7 illustrates a screen shot 700 of the user interface of the analysis module 315 presenting analysis information to a user. The screen shot 700 presents video analysis information 702, or video clips, taken from the video analysis system 102. In particular, the screen shot 700 depicts a split screen 704 to show synchronized video frame data from two separate video capture systems 322. Screen division 706 presents a first video clip or video frame data from a first video recording device, such as video device 322 described above and screen division 708 presents a second video clip or video frame data from a second video recording device. The video recording devices simultaneously record video information of a swing from different angles. In other embodiments, more than two video capture systems 322 might be used to capture video frame information.
  • In FIG. 7, the screen is divided into two display regions or areas, [0063] 706 and 708, wherein each region presents video analysis information 702, i.e., video clips, derived from video frame data associated with one golfer. In an alternative embodiment, screen divisions 706 and 708 might present video analysis information 702 derived from video frame data associated with two separate golfers. For example, screen division 706 might display a student golfer receiving golf swing training while screen division 708 presents a professional golfer performing a swing. Such an implementation allows student golfers to compare and contrast their swing with the professional's swing. In accordance with one embodiment of the invention, positional elements of the professional's swing are synchronized to the student's swing by using an impact or trigger event common to both swings. Such synchronization is realized through the synchronization module 301 in the fashion described in conjunction with FIG. 3.
  • [0064] Screenshot 700 further includes selection elements. Selection elements are selectable by the input device 318 and allow the presentation of different types of analysis information. Selection of motion capture selection elements 710 and 712 displays positional measurement analysis information (not shown), which has been collected by the position analysis system 104 and synchronized to the video analysis information 702, on the screenshot 700. If an impact measurement system is used with the impact analysis system 106, selection of impact capture selection element 714 displays impact measurement analysis information, which has been collected by the impact analysis system 106 and synchronized to the video analysis information 702, on the screenshot 700. Scrollbar selection elements 716 and 718 allow the user of the analysis module 315 to select any swing element of the swing for display as the video analysis information 702. Likewise, address 720 and 728, top 722 and 730, impact 724 and 732, and finish 726 and 734 selection elements allow the user of the analysis module 315 to select exact swing elements of the swing for display as the video analysis information 702. For example, selection of the address selection element 720 or 728 adjusts the video frame data presented in the video analysis information to an address swing element of the golfer's swing. Likewise, selection of the top 722 or 730, impact 724 or 732, and finish 726 or 734 selection elements adjusts the video frame data presented in the video analysis information to the top, impact, or finish elements, respectively. In accordance with an embodiment, selection elements 720, 722, 724, 726, 728, 730, 732, and 734 allow a user of the analysis module 315 to identify various key positions in the measured motion, e.g., the address, top, impact and finish positions of the golfer's swing motion, and to quickly display these positions when selected. Various other selection elements are presented on the screen shot 700 allowing a user to select various other functionalities associated with physical motion correction and instruction.
  • Importantly, the separate video frames [0065] 706 and 708 may either be controlled separately or as one, allowing the use of only one set of controls to display synchronized information contemporaneously. In this case, the synchronized video information relates to at least two video clips of information that were taken simultaneously, e.g., of the same swing. In an embodiment of the invention, the video frames are part of a graphical user interface that detects whether the video frame information that is being displayed in the two frames 706 and 708 is synchronized in time, i.e., time-synchronized. As an example, information from two video cameras of the same swing, but taken from different angles (as shown in FIG. 7), is synchronized in time and, in such a case, the graphical user interface automatically detects this situation. As another example, video information of a student's swing that is to be shown in one frame, such as frame 706, is not synchronized in time with video information of another golfer, such as a golf pro, that may also be shown in the other frame, such as frame 708. Since the two sets of video information represent different swings, the two sets are not synchronized in time. Detecting whether the two signals are synchronized in time may be performed in a number of ways, such as by setting a flag, assigning an identification value, comparing associated time information or comparing format information, among others.
  • Upon detecting that the two sets of video information are synchronized in time, the graphical user interface automatically links many of the selection elements together such that either group of control elements controls both video clips, i.e., both sets of video information. In essence, controls that play, fast forward, reverse and stop the video replay for one frame, e.g., controls [0066] 716, 720, 722, 724 and 726 that normally control frame 706, would also simultaneously control frame 708 when the video signals are synchronized in time. Similarly, controls 718, 728, 730, 732 and 734 would simultaneously control frame 706 when the signals are synchronized, instead of merely controlling frame 708.
  • Upon detecting that the two video signals are not synchronized in time, e.g., represent different motions, then the graphical user interface maintains the two sets of controls for each [0067] frame 706 and 708 as separate.
  • In an alternative embodiment, the graphical user interface may provide a selectable toggle button or element displayed on the [0068] screenshot 700 that could toggle the control of the two frames 706 and 708 from being controlled as one, or as two separate frames. As discussed above, when controlled as one, for example, the selection of either address selection element 720 or 728 would automatically display the address video information in both frames 706 and 708. On the other hand, when operated separately, the selection of one of the selection elements 720 or 728 would only cause the display of the address video information in one of the two frames 706 or 708, respectively.
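  • The control-linking behavior described above can be sketched in C++ as follows. The types and names are hypothetical; only the dispatch decision, drive one frame or both depending on whether the clips are time-synchronized (or linked control is toggled on), is modeled.

```cpp
// Minimal sketch (hypothetical names): a selection element drives one display
// region, and also the other region when the two clips are time-synchronized.
#include <iostream>
#include <string>

struct VideoFrameView {
    std::string name;
    void seekTo(const std::string& swingElement) {
        std::cout << name << " shows the " << swingElement << " position\n";
    }
};

void onSelectionElement(const std::string& swingElement,
                        VideoFrameView& selectedView,
                        VideoFrameView& otherView,
                        bool clipsSynchronized) {
    selectedView.seekTo(swingElement);
    if (clipsSynchronized) {
        // Same swing captured from two angles: one set of controls drives both.
        otherView.seekTo(swingElement);
    }
}

int main() {
    VideoFrameView front{"frame 706"}, side{"frame 708"};
    onSelectionElement("address", front, side, /*clipsSynchronized=*/true);
    onSelectionElement("impact",  side, front, /*clipsSynchronized=*/false);
}
```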
  • In accordance with another embodiment, a user interface presenting analysis information derived from the [0069] position analysis system 104 and the video analysis system 102 is shown in FIG. 8. FIG. 8 illustrates a screen shot 800 of the user interface of the analysis module 315 presenting analysis information related to an embodiment described in FIG. 1. Selection elements, split screen divisions, and displayed information of the screen shot 800 are the same as those shown in screen shot 700 and described above in conjunction with FIG. 7. However, the screen shot 800 presents positional measurement analysis information 804 synchronized with video analysis information 802. The video frame analysis information 802 and the positional measurement analysis information 804 are synchronized such that each frame sample of video data corresponds to a measurement sample of position elements of the golfer's swing. For example, the shoulder turn measurement value 806 presented on the shoulder turn measurement display 808 will vary each time that scrollbar selection element 716 addresses a different video frame sample of the video data. The positional measurement analysis information 804 is displayed through measurement displays 808, 810, and 820, 822, 824, 826, 828, 830, and 832. Whereas measurement displays 808-810 are associated with screen division 706, measurement displays 820, 822, 824, 826, 828, 830, and 832 are associated with screen division 708. Accordingly, control over which measurement displays 808, 810, 820, 822, 824, 826, 828, 830, and 832 are presented is administered through motion capture selection elements 710 and 712.
  • In accordance with an embodiment, the [0070] video analysis information 702 might be linked to the positional measurement analysis information 804 in such a way that the positional measurement values are identified, highlighted, or displayed as the video playback shows the golfer conducting the swing. Indeed, the measurement displays 808, 810, 820, 822, 824, 826, 828, 830, and 832 might be presented as a particular color signifying an analysis of aspects of a golfer's swing. For instance, if the shoulder turn measurement display 808 is red, then the golfer has turned his shoulder to an angle that is not desirable in an instructed golf swing. Indeed, the positional elements of the swing may be compared to a table or database of values to determine whether such information relates to positional information that is desirable or not, wherein the database contains average values based on predetermined desirable swing mechanics. Consequently, if the shoulder turn measurement display 808 is green, then the angle of the golfer's shoulder turn, or rotation, is within a desirable range for an instructed golf swing as compared to the referenced database. Additionally, measurement displays 808, 810, 820, 822, 824, 826, 828, 830, and 832 might be presented as yellow, or another intermediate color, suggesting that a measurable element of a golf swing is about to shift outside a desirable range.
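  • A minimal C++ sketch of this red/yellow/green coloring follows. The range values, margin, and names are hypothetical assumptions, not values taken from the reference database described above.

```cpp
// Minimal sketch (hypothetical names and thresholds): map a measured swing
// element against a desirable range to the display color described above.
#include <iostream>
#include <string>

struct DesirableRange { double low, high, margin; };

std::string displayColor(double measured, const DesirableRange& r) {
    if (measured < r.low || measured > r.high) return "red";     // undesirable
    if (measured < r.low + r.margin || measured > r.high - r.margin)
        return "yellow";   // near the edge of the desirable range
    return "green";        // within the desirable range
}

int main() {
    DesirableRange shoulderTurn{80.0, 100.0, 5.0};   // hypothetical values
    std::cout << displayColor(70.0, shoulderTurn) << '\n'    // red
              << displayColor(82.0, shoulderTurn) << '\n'    // yellow
              << displayColor(90.0, shoulderTurn) << '\n';   // green
}
```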
  • In accordance with another embodiment, measurement displays [0071] 808, 810, 820, 822, 824, 826, 828, 830, and 832 are capable of being positioned on the screen such that a user can move the displays 808, 810, 820, 822, 824, 826, 828, 830, and 832 to any desired location on the screen. That is, through the use of a user-input device, such as a mouse or other input device, the displays 808, 810, 820, 822, 824, 826, 828, 830, and 832 may be interactively positioned in different locations. For example, FIG. 8 shows shoulder tilt measurement display 830 positioned by the golfer's shoulder and measurement displays 820, 822, 824, 826, and 828 in a default arrangement on screen division 708. In yet another embodiment, screen division 706 might contain other measurement displays, such as a shoulder bend measurement display, a hip tilt measurement display, a hip bend measurement display, a shoulder tilt measurement display, or any other measurement display associated with analysis information derived from positional measurement samples.
  • In accordance with yet another embodiment, values shown in [0072] measurement displays 808, 810, 820, 822, 824, 826, 828, 830, and 832 might be used in real time where the user is monitoring a display as he/she performs the golf swing, or other physical motion. As such, the values presented in the measurement displays 808, 810, 820, 822, 824, 826, 828, 830, and 832 dynamically vary as the user engages in the swing, or motion. By presenting the positional measurement information in real time, a user is able to adjust a swing or motion as he/she is conducting such. As described above, the measurement displays 808, 810, 820, 822, 824, 826, 828, 830, and 832 may also be highlighted in colors to alert the user, in real time, of a desirable range of motion for specific swing elements.
  • In other embodiments, analysis information presenting analysis derived from only the [0073] video analysis 102 and the impact analysis 106 systems might be synchronized and displayed in similar fashion as described in conjunction with FIG. 7. In yet other embodiments, analysis information presenting analysis derived from only the position analysis 104 and the impact analysis 106 systems might be displayed in similar fashion as described in conjunction with FIG. 7. In yet other embodiments, analysis information presenting analysis from a variety of analysis systems other than a video 102, position 104, or impact 106 analysis system might be synchronized and displayed as discussed in FIG. 7 and FIG. 8.
  • The [0074] process environment 114 may be implemented as software, hardware, or any combination of hardware and software designed as an analysis tool in accordance with the embodiments and equivalents to the embodiments described for this invention. In an embodiment, the process environment 114 might include at least some form of computer-readable media accessible by a computing device capable of receiving at least two separate information signals simultaneously. Accordingly, the process module 314 might be a computing device accessing the computer-readable media. The computer-readable media might be stored on storage media including, but not limited to, ROM, RAM, EPROM, flash memory or other memory technology, digital versatile disks (DVD), CD-ROM, or other optical storage, magnetic tape, magnetic cassettes, magnetic disk storage or other magnetic storage devices, or any other medium accessible by the computing device that can be used to store the analysis information and the information carried by the information signals 208 and 210.
  • The logical operations of the various embodiments of the present invention are implemented (1) as a sequence of computer implemented acts or program modules running on a computing system and/or (2) as interconnected machine logic circuits or circuit modules within the computing system. The implementation is a matter of choice dependent on the performance requirements of the computing system implementing the invention. Accordingly, the logical operations making up the embodiments of the present invention described herein are referred to variously as operations, structural devices, acts or modules. It will be recognized by one skilled in the art that these operations, structural devices, acts and modules may be implemented in software, in firmware, in special purpose digital logic, or any combination thereof without deviating from the spirit and scope of the present invention as recited within the claims attached hereto. [0075]
  • FIG. 4 generally illustrates operational characteristics for providing analysis information to an analysis module in order to provide physical motion training and instruction. Although the processes described in conjunction with FIG. 4 are directed to providing golf swing analysis, the process might be similarly used to provide swing analysis in other sports, such as baseball, tennis, cricket, polo, or any other sport where an athlete's swing of an apparatus is a measure through which an element of the sport is conducted. Moreover, the process might be similarly used to provide any form of physical motion analysis associated with any form of physical motion subject to correction and instruction. [0076]
  • Initially, receive [0077] operation 402 receives a first information signal representing sensed information relative to a golf club swing. The first signal is of a first type of information, e.g., video, position, weight transfer, pressure or impact information, among others. Next, receive operation 404 receives a second information signal representing sensed information relative to the golf club swing, wherein the second information signal is a different type of signal as compared to the first signal. As an example, the first type of signal may be video information and the second type may be positional, weight transfer or impact information. In an embodiment, first receive operation 402 and second receive operation 404 simultaneously receive the first and second information signals. In another embodiment, the first information signal and the second information signal might be acquired substantially simultaneously.
  • [0078] Synchronization operation 406 synchronizes the two signals received in operations 402 and 404. In one embodiment, synchronization operation 406 synchronizes the signals by time stamping samples of data points of each information signal. Synchronization operation 406 time stamps each sample in relative fashion, thereby ensuring that portions of one signal relate to portions of the other signal based on associated time information. In an embodiment, synchronization of the signals is done in a way such that the first sampled data point on the first information signal is identified by the same time marking as the first sampled data point on the second information signal. Accordingly, subsequent sampled data points on the first information signal are identified by the same time marking as subsequent sampled data points on the second information signal. Alternatively, the synchronization operation 406 might stamp samples of one information signal at times that have no corresponding time stamp in the other signal. For example, synchronization operation 406 might stamp five samples of one information signal for every one stamped sample of the other information signal.
  • In such an example, even though the samples do not correspond to the same time stamp, the samples might be associated such that one sample of the first information signal relates to five samples of the second information signal. In accordance with an embodiment, if the samples do not correspond to the same time stamp, interpolation is used to supply missing data points to the signal of sensed information lagging in time samples. Interpolation is administered through a conventional polynomial equation to ensure that one sample of data of the first information signal exists for every sample of data of the second information signal. The equation used may be predetermined based on the type of motion being measured. Consequently, depending on the type of motion being analyzed, e.g., a golf swing versus a baseball swing, the equation used to synchronize values may be different. [0079]
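  • The supplying of missing data points can be sketched in C++ as shown below. A first-degree polynomial (linear interpolation) is used purely for illustration, since the embodiment notes the equation may be chosen per motion type; the function name, sample values, and factor of five are hypothetical.

```cpp
// Minimal sketch (hypothetical names): interpolate the slower-sampled signal
// so it has one data point for every sample of the faster signal.
#include <iostream>
#include <vector>

std::vector<double> upsample(const std::vector<double>& slow, int factor) {
    std::vector<double> out;
    for (std::size_t i = 0; i + 1 < slow.size(); ++i) {
        for (int k = 0; k < factor; ++k) {
            double frac = static_cast<double>(k) / factor;
            // Linear interpolation between consecutive slow-signal samples.
            out.push_back(slow[i] + frac * (slow[i + 1] - slow[i]));
        }
    }
    out.push_back(slow.back());
    return out;
}

int main() {
    std::vector<double> positional = {0.0, 10.0, 40.0};   // slower signal
    for (double v : upsample(positional, 5)) std::cout << v << ' ';
    std::cout << '\n';
}
```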
  • Once synchronized, the information signals are transmitted to process [0080] operation 408. Process operation 408 interprets each information signal and generates analysis information. The analysis information, presented on an analysis module, is used for golf swing analysis and training.
  • In an embodiment, the sensed information might be positional information related to the motion of a golf club swing. In another embodiment, the sensed information might be video information associated with a golf club swing. In yet another embodiment, the sensed information might be impact information relative to impact of the club head with a golf ball resulting from a golf club swing. Accordingly, the first and second information signals might be a positional information signal, a video information signal, or an impact information signal. Additionally, the sensed information might be any form of information related to a stroke, swing, movement, or motion of a person performing physical acts. [0081]
  • FIG. 5 illustrates operational characteristics for providing analysis information to an analysis module in order to provide analysis of an athlete's swing. In particular, FIG. 5 is a more detailed illustration of the operations described in conjunction with FIG. 4. [0082] Start operation 500 is executed each instance that a single motion data sample and a video frame data sample are transmitted from each of a motion analysis and a video analysis system into an acquisition module. In accordance with an embodiment, receive operations 402, 404, 502, and 504 are administered in acquisition modules. A data sample is defined as a point, slice, or portion of an information signal.
  • Motion receive [0083] operation 502, stamp operation 506, and storage operation 510 make up a positional measurement acquisition process of a position analysis system and, in accordance with an embodiment, are operations of a Windows32® executable software program written in the C++ programming language. Motion receive operation 502 acquires positional information associated with a golfer's swing. In accordance with an embodiment, the positional information is transmitted from the position analysis system and carried in a positional information signal. Motion receive operation 502 reads positional information records from a serial port connected to a motion capture system. The start of each positional information record is indicated by a byte with the high bit on. Each record consists of the x, y and z cosine measurements from the motion capture system.
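  • The record framing described above, a start byte with the high bit set followed by the x, y and z cosine measurements, can be sketched in C++. The payload encoding and record length are not specified here, so the fixed length below is a hypothetical assumption and the sketch only collects raw record bytes.

```cpp
// Minimal sketch under stated assumptions: frame positional records from a
// serial byte stream, resynchronizing whenever a byte with the high bit set
// (the record start marker) is seen.
#include <cstdint>
#include <iostream>
#include <vector>

constexpr std::size_t kRecordLength = 7;   // hypothetical: start byte + payload

bool isStartByte(std::uint8_t b) { return (b & 0x80) != 0; }

// Extracts complete raw records from a stream of serial bytes.
std::vector<std::vector<std::uint8_t>> frameRecords(
        const std::vector<std::uint8_t>& stream) {
    std::vector<std::vector<std::uint8_t>> records;
    std::vector<std::uint8_t> current;
    for (std::uint8_t b : stream) {
        if (isStartByte(b)) current.clear();   // resynchronize on a start byte
        current.push_back(b);
        if (current.size() == kRecordLength && isStartByte(current.front())) {
            records.push_back(current);
            current.clear();
        }
    }
    return records;
}

int main() {
    std::vector<std::uint8_t> stream = {0x81, 1, 2, 3, 4, 5, 6,
                                        0x82, 7, 8, 9, 10, 11, 12};
    std::cout << frameRecords(stream).size() << " records framed\n";   // 2
}
```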
  • Following receive [0084] operation 502, computation operation 503 computes the required Euler angles for the parameters specified by the user. The computed Euler angles are stored in a shared memory structure to be time stamped by position stamp operation 506. At any given time, the shared memory structure contains a snapshot of the last computed record sample.
  • While in the shared memory structure, the record sample is associated with a sample of the video frame data captured by a video capture system and transmitted through a video information signal. Positional [0085] data stamp operation 506 time stamps the motion data sample stored in a shared memory structure as computed Euler angles. The time stamp administered by position stamp operation 506 relates the motion data sample to the associated video frame sample as described in conjunction with FIG. 4.
  • Once the sample is stamped, operation flow passes to positional [0086] data storage operation 510. Positional data storage operation 510 stores the time-stamped motion data sample in a buffer as shown in FIG. 3. Each pass through the positional measurement acquisition process, trigger sensory operation 516 detects whether a trigger event has occurred.
  • Video receive [0087] operation 504, video stamp operation 508, and video frame storage operation 512 make up a video frame sample acquisition process of a video analysis system and, in accordance with an embodiment, are operations of a Windows32® executable software program written in the C++ programming language. Separate instances of the video frame sample acquisition process execute for each capture board in the video analysis system. In accordance with a specific embodiment, the video analysis system contains two capture boards. In this embodiment, the capture board hardware is initialized into a 60 Hz field-mode of 240 lines per field at the specified width, and these parameters might be defined by the software manufacturer's double-buffering queued asynchronous technique.
  • Once initialized, video receive [0088] operation 504 awaits arrival of a video frame sample associated with the golfer's swing from the video capture system and, upon arrival, acquires the video frame sample. The video frame sample is transmitted from the video analysis system and carried in a video information signal. In accordance with an embodiment, video frame stamp operation 508 time stamps the video frame sample acquired by the video receive operation 504 so that the video frame sample relates to a positional measurement sample. Once the samples are stamped, operation flow passes to video frame storage operation 512. The video frame sample is stored in a buffer by the video frame storage operation 512. In accordance with an embodiment, the buffer is a circular buffer having 120 records. In accordance with alternative embodiments, the circular buffer may have any number of records depending upon the length in time of the physical motion analyzed.
  • All data samples, whether video, positional measurement, impact, or any other sample associated with a physical motion, are time stamped using the same timebase. In accordance with a specific embodiment, the timebase might be a Win32™ high-precision timer. Whereas the position analysis system grabs a sample about every 33 ms, the video analysis system grabs a sample about every 16 ms. Therefore, identical positional measurements are stored in the metric sample buffer for more than one image record being stored in the video sample buffer memory. Being on the same timebase, the timer information indicates the relative location in time at which the samples were gathered. Headers of the video sample buffer memory may contain information corresponding to positional measurement samples stored in the metric sample buffer. [0089]
  • In accordance with an embodiment, the [0090] storage operations 510 and 512 store the value of each time stamp with each sample. In another embodiment, the storage operations 510 and 512 might store the samples in linked or adjacent buffers identified by the time stamp value so that the association between the two samples is maintained while the samples are stored in the buffer. Data is stored in the buffer for a predetermined period of time so that if neither an impact information signal nor a triggering event signal is transmitted in the predetermined time period, data in the buffer is stored on a first-in, first-out basis. Once storage operations 510 and 512 store the data samples, operation flow passes to trigger sensory operation 516.
  • Trigger [0091] sensory operation 516 detects whether a trigger event has occurred. As described earlier, the trigger event might be the manual selection of an input request (e.g., pressing a key on a keyboard), predetermined positional coordinates on a golfer's swing, or any other triggering operation associated with a golfer's swing. Additionally, the trigger event might be impact between the golf club head and the golf ball, as detected by a microphone. If trigger sensory operation 516 has not detected a trigger event, then operation flow returns to start operation 500 and receive operations 502 and 504 acquire a subsequent data sample. If trigger sensory operation 516 detects a trigger event, then operation flow passes to collection operation 518.
  • [0092] Collection operation 518 continues the collection, time stamping, and storage of video and motion data samples administered through receive operations 502 and 504, stamp operations 506 and 508, and storage operations 510 and 512. In accordance with an embodiment, collection operation 518 might collect impact measurement data samples in the same fashion as receive operations 502 and 504, stamp operations 506 and 508, and storage operations 510 and 512. Impact measurement data samples represent coordinate and relative positions of the clubface of the golf club as the club head of the golf club enters and leaves a predetermined area surrounding the impact location between the club head and the golf ball.
  • While [0093] collection operation 518 oversees the continued collection, stamping, and storage of sensed data samples, continuation operation 514 limits the period of collection, stamping, and storage as defined by the timing window set by the trigger event. Continuation operation 514 sets a predetermined time period within which the execution of the positional measurement acquisition and video frame sample acquisition processes will continue, thereby allowing positional measurement and video frame data samples associated with the golfer's follow-through to be collected following detection of an impact or trigger event. Once the predetermined time period has elapsed, the processes are terminated and operation flow passes to process operation 520. In accordance with an embodiment, the predetermined time period is set to zero, thereby terminating collection once a trigger event occurs. In accordance with other embodiments, the predetermined time period is set to a finite time period other than zero upon occurrence of a trigger event. In a specific embodiment, the predetermined time period is set by a countdown timer that counts video frame data samples. After 20 video frame data samples have been captured following a trigger event, both the positional measurement and the video frame sample acquisition processes are terminated. This specific configuration results in a 100-frame pre-trigger circular buffer.
  • Upon completion of the trigger countdown, the video frame and positional measurement acquisition processes are frozen. [0094] Process operation 520 interprets the data samples stored for each analysis system. Process operation 520 generates analysis information from the interpreted samples and transmits the analysis information to an analysis module in a format suitable for presentation to the golfer. In accordance with a specific embodiment, process operation 520 discards redundant records of positional measurement samples. A spline fit is applied to each of the positional measurement samples. Using the spline parameters based on the smooth motion being measured, the metric value at each frame time is computed. This calculated data is written into a positional measurement file which is ultimately saved as part of an archived lesson.
  • Operational characteristics of the video frame sample acquisition process are shown in FIG. 9. In particular, FIG. 9 is a specific embodiment of the operations of the video frame [0095] sample acquisition process 900 described in conjunction with FIG. 5. Start operation 902 initiates the video frame sample acquisition process 900. The video frame sample acquisition process 900 is initiated at the beginning of, or a time prior to, the physical motion to be acquired by the video capture system. Once initiated, hardware initialization operation 904 initializes each capture board into a 60 Hz field-mode of 240 lines per field at the specified width. Once the capture boards are initiated, frame arrival operation 906 awaits arrival of a video frame sample. Frame arrival operation 906 operates in an endless loop to wait for the video frame sample.
  • Upon arrival of the video frame sample, operation flow passes to [0096] next frame operation 908. Next frame operation 908 queues a grab for the next video frame sample and operation flow passes to image copy operation 910. Image copy operation 910 copies the just-received image information of the video frame sample into the current record of the circular buffer. The current record is the record in the circular buffer that is being accessed by the record pointer. Once the image information is copied, operation flow passes to time storage operation 912. Time storage operation 912 stores the time associated with acquisition of the video frame sample in the record header of the current record. In particular, time storage operation 912 stores the time stamp of video stamp operation 508, which is described in conjunction with FIG. 5.
  • Once the time has been stored, operation flow passes to trigger [0097] detection operation 914. Trigger detection operation 914 checks to see whether a trigger event has been administered. If a trigger event has been administered, then operation flow passes to countdown check operation 916. Countdown check operation 916 checks to see if the countdown timer initiated by the trigger event has completed counting. If trigger detection operation 914 has not detected a trigger event, then operation flow passes to record advance operation 918. Record advance operation 918 advances the record pointer to the next record. Likewise, if countdown check operation 916 determines that the countdown has not been exhausted, operation flow passes to record advance operation 918. Once the record pointer has been advanced to the next record, operation flow passes to frame arrival operation 906 and continues as earlier described. If countdown check operation 916 determines that the countdown is completed, the operation flow passes to video freeze operation 920. Video freeze operation 920 sets a video freeze flag signaling termination of video frame acquisition.
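  • The loop structure of FIG. 9 (copy the image, store the time in the record header, check the trigger and countdown, then advance the record pointer or freeze) can be sketched in C++ as follows. The names, the simulated trigger time, and the placeholder image copy are hypothetical; the 120-record buffer, roughly 16 ms frame interval, and 20-frame post-trigger countdown are the values given in the embodiments above.

```cpp
// Minimal sketch (hypothetical names): the circular-buffer acquisition loop
// described for FIG. 9.
#include <array>
#include <cstdint>
#include <iostream>
#include <vector>

struct FrameRecord {
    std::uint64_t timeMs = 0;           // record header: acquisition time
    std::vector<std::uint8_t> pixels;   // copied image information
};

int main() {
    constexpr std::size_t kRecords = 120;      // circular buffer size
    constexpr int kPostTriggerFrames = 20;     // countdown after trigger
    std::array<FrameRecord, kRecords> ring;
    std::size_t recordPointer = 0;

    bool triggerSeen = false, videoFrozen = false;
    int countdown = kPostTriggerFrames;

    for (std::uint64_t t = 0; !videoFrozen; t += 16) {      // ~16 ms per frame
        ring[recordPointer].timeMs = t;                     // time storage op
        ring[recordPointer].pixels.assign(240 * 400, 0);    // image copy op

        if (t == 800) triggerSeen = true;                   // simulated trigger
        if (triggerSeen && --countdown == 0) {
            videoFrozen = true;                             // video freeze flag
        } else {
            recordPointer = (recordPointer + 1) % kRecords; // advance pointer
        }
    }
    std::cout << "frozen at record " << recordPointer << '\n';
}
```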
  • Referring back to FIG. 6, an illustration of operations related to control of analysis information presented to the analysis module is shown in accordance with an embodiment of the present invention. [0098] Start operation 600 begins operation flow for control of which analysis information is transmitted to the analysis module. In particular, operations illustrated in FIG. 6 are sub-operations that are performed during process operation 520.
  • [0099] Selection operation 602 acquires the selection request of the type of analysis module the user of the analysis tool has requested to use. In an embodiment, the user requests to use a monitor as the analysis module. In other embodiments, the analysis module requested might be a hard disk, a floppy disc, a tape disk, a CD, or any other recordable medium allowing the golfer or instructor to download analysis information for later use. In yet other embodiments, the analysis module requested might be a web server or a kiosk, thereby allowing a user to access the analysis information from a remote station. The selection request acquired by selection operation 602 is preferably sent by the analysis module when the user logs on to the analysis tool through the analysis module. In other embodiments, the selection request might be communicated to the analysis tool directly from an input device. Once a selection request of a particular analysis module is acquired by the analysis tool, operation flow passes to format operation 604.
  • [0100] Format operation 604 converts the analysis information into a format suitable for presentation onto the selected analysis module if the analysis information is not already in a suitable format. Once the analysis information is formatted according to the selected analysis module, operation flow passes to presentation operation 606. Presentation operation 606 presents the formatted analysis information to the analysis module in order for the module to deliver the analysis information to the user of the analysis tool. Once the analysis information is presented, operation flow passes to control selection request operation 608. Control selection request operation 608 waits for an input selection request from the input device. The input selection request might be any task associated with control over the analysis tool, including, but not limited to, activation of the analysis tool, operation of the analysis tool, appearance of the presentation to the analysis module, selection of which analysis systems are used and presented through the analysis tool, and any control operation associated with use of the analysis tool. If an input selection request is received by the analysis tool, as determined by control selection operation 608, then operation flow passes to execution operation 610. Execution operation 610 executes the task associated with the input selection request. Once the task is performed, operation flow passes to presentation operation 606. Presentation operation 606 presents analysis information incorporating performance of the task to the analysis module as requested by the input selection. Following presentation, operation flow passes to control operation 608 and continues as described above.
  • Operations associated with presentation and control of analysis information through a World Wide Web based application are shown in accordance with an embodiment of the invention in FIG. 10. In particular, FIG. 10 illustrates operations related to a web-based application in accordance with an embodiment. The web-based application is an interactive application providing a golf instruction and [0101] training process 1000 to a golfer over the Internet. Prior to beginning the golf instruction and training process 1000, analysis information related to the golf swing must be processed by an analysis tool 100. That is, either during or following a training session, a lesson file is created that contains analysis information related to that lesson, e.g., tips, tricks, video data, etc., that may be accessed for future reference. Thus, operation 1004 is used to compile the information into a lesson file, where a lesson file is a compressed, encoded, computer-readable file that may contain video, still images, synchronized sensor data, text information, recorded audio, and necessary instructions to recreate events or other marked portions of the training session for subsequent access by users with access to the authorized decoding/presentation software.
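  • The kinds of content a lesson file is described as aggregating can be sketched with the C++ structures below. The actual compression and encoding scheme is not specified above, so the types and field names are hypothetical and only model the aggregation step of operation 1004.

```cpp
// Minimal sketch (hypothetical types): content aggregated into a lesson file.
#include <cstdint>
#include <string>
#include <vector>

struct MarkedClip {
    std::vector<std::uint8_t> videoFrames;     // marked video playback
    std::vector<double>       metricSamples;   // synchronized sensor data
    std::vector<std::uint8_t> audio;           // recorded instructor comments
    std::string               notes;           // text information
};

struct LessonFile {
    std::string studentId;                     // hypothetical identifier
    std::vector<MarkedClip> clips;             // before/after swings, drills
};

int main() {
    LessonFile lesson{"student-123", {}};
    lesson.clips.push_back(MarkedClip{});      // a contemporaneously marked swing
}
```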
  • [0102] Analysis operation 1004 marks specific analysis information for the web-based golf lesson. Such analysis information is marked by selection elements on the user interface of the analysis tool. In accordance with an embodiment, save drill selection element 736, save screen selection element 738, save before selection element 740, and save after selection element 742 mark portions of the analysis information that are to be used with the web-based lesson. For instance, a swing recorded with video and associated measurement data prior to professional instruction might be marked to show the golfer an example of an undesirable swing. Additionally, a swing recorded with video and associated measurement data following professional instruction might also be marked to show the golfer improvement in his/her swing.
  • By being marked, the recordings are saved to a lesson file and later used in the web-based lesson to provide the golfer with a comparison of his or her before and after swings. Moreover, [0103] analysis operation 1004 allows marking of all forms of analysis information, including instructor and student comments, measurement values, video playback, still shots associated with the video playback, audio clips, such as comments and observations from an instructor, and any other form of analysis information derived from the analysis tool.
  • Although described herein as the marking of information for a web-lesson, the marking method may be used in creation of any saved lesson. That is, the marked material may be stored to a file and saved to a computer disc, videocassette or any other type of recording medium such that the lesson can be viewed at a later time by the user. Marking material to be saved to a final lesson file while the actual lesson is occurring saves time since the instructor does not have to review a recording of the entire lesson and manually select pertinent information, e.g., swings, comments, drills, etc. Instead, the instructor merely selects the appropriate screen element to mark the pertinent information, either a swing, comment, still shot, etc., contemporaneously for saving to the final recorded lesson. The actual marking essentially relates to storing the information in a temporary file; then, once the lesson is completed, the temporary information may be stored to a more permanent file. [0104]
  • Contemporaneous marking relates to the selection of pertinent content during the training session. Indeed, with respect to specific portions or events that occur during the training session, the marking occurs before, during, or substantially immediately after the occurrence of that event to preserve the relevant data in a predetermined location, separate from the other sensed information. In this respect, substantially immediately thereafter relates to the marking of information, such as information related to a particular swing, following the swing, but before the occurrence of the next swing or lesson instruction. As an example, the student may make three consecutive swings, and before the fourth swing, the instructor decides that the information stored on the system that relates to the third swing should be marked for saving to the final lesson file. Prior to the fourth swing, the instructor marks the third swing to be saved to the lesson file. Additionally, the instructor may mark audio instructions or discussions related to the third swing to be saved along with video and/or positional measurement information related to the third swing. [0105]
  • Once the lesson has been saved, upload [0106] operation 1006 uploads the saved lesson file, i.e., the marked analysis information, to the web-based application resident on a server. Although described herein as using “marked” information for a web-lesson, referring to the contemporaneous marking of material to be used in the final saved lesson, it should be noted that any saved lesson file, whether contemporaneously marked or selected following the lesson, may be saved and uploaded to a web-server. Alternatively, the entire lesson may be recorded and uploaded to the web-server.
  • Following upload [0107] operation 1006, operation flow continues to connection operation 1008, which refers to the act of a user connecting to the server via the World Wide Web and accessing the web-based application from a remote computer. Once connected, operation flow passes to access operation 1010. During access operation 1010, the user accesses the lesson information located on the web server. Such access may involve downloading the identification of the user and the marked analysis information associated with the user to the user's computer system.
  • Once the information is accessed, then [0108] format operation 604 formats the marked analysis information so that presentation of the marked analysis information may be controlled and displayed through the web-based application via the World Wide Web. Once the marked analysis information is formatted, operation flow passes to presentation operation 606 and continues as described in conjunction with FIG. 6 with user-control over the marked analysis information being provided through an Internet connection.
  • The above-described analysis tool significantly improves the analysis of physical motion and the overall process of learning the proper athletic motion. Indeed, replaying the synchronized signals provides a valuable teaching tool in that a user can visualize swing measurement values of their own motion. Providing the combination of these signals removes guesswork associated with trying to pinpoint problem areas and the degree to which they are a problem. Additionally, the present invention relates to many improvements in the lesson process, such as combining numerous signals (video, audio, motion capture, impact analysis, etc.), allowing for numerous display options (video with motion capture values, movable value boxes, predetermined color scheme, etc.), and numerous playback options (tape, Web, etc.). [0109]
  • The various embodiments described above are provided by way of illustration only and should not be construed to limit the invention. Those skilled in the art will readily recognize various modifications and changes that may be made to the present invention without following the example embodiments and applications illustrated and described herein, and without departing from the true spirit and scope of the present invention, which is set forth in the following claims. [0110]

Claims (21)

What is claimed is:
1. An analysis system for analyzing physical motion, the physical motion occurring during a training session, the analysis system comprising:
a plurality of sensors that sense physical motion, wherein the sensors generate information signals related to sensed information;
a storage medium for storing the sensed information conducted by the information signals; and
a marking module for marking a plurality of portions of the sensed information, the marking module marks the portions of sensed information contemporaneously with the training session.
2. An analysis system as defined in claim 1 wherein the marking module is user-activated.
3. An analysis system as defined in claim 1 wherein the marking module automatically saves marked information to a separate location in the storage medium.
4. An analysis system as defined in claim 1 wherein the sensed information relates to audio information, video, still images, and positional measurement information.
5. An analysis system as defined in claim 1 wherein the plurality of sensors comprise a first sensor and a second sensor and wherein the sensed information from the first sensor is video information and the sensed information from the second sensor is positional measurement information.
6. An analysis system as defined in claim 5 wherein the sensed video information and positional measurement information is synchronized.
7. An analysis system as defined in claim 6 wherein the physical motion relates to a golf swing.
8. A method of storing lesson information, the lesson information relating to portions of a training session, the method comprising:
receiving sensed information during the training session and relating to the training session;
contemporaneously marking portions of the sensed information; and
storing marked portions of the sensed information to a lesson file.
9. A method of storing lesson information as defined in claim 8 wherein the sensed information is video information.
10. A method of storing lesson information as defined in claim 8 wherein the marked portions are automatically saved to a lesson file.
11. A method as defined in claim 8 further comprising:
contemporaneously recording audio information during the training session;
automatically linking the recorded audio information to a displayed image; and
marking the linked recorded audio information and displayed image for storage.
12. A method as defined in claim 8 wherein the sensed information comprises video information, audio information and positional measurement information.
13. A method as defined in claim 12 wherein the video information and the positional measurement information are synchronized.
14. A method as defined in claim 8 further comprising uploading the lesson file to a server computer system, wherein the lesson file on the server computer system may be accessed by a remote client computer system over a network connection.
15. A method as defined in claim 14 wherein the network connection is the Internet.
16. A computer program product readable by a computing system and encoding a computer program of instructions for executing a computer process for providing athletic instruction related to a physical motion, said computer process comprising:
receiving sensed information related to the athletic instruction;
contemporaneously marking portions of the sensed information; and
storing the marked portions to a lesson file.
17. A computer program product as defined in claim 16 wherein the sensed information is video information.
18. A computer program product as defined in claim 16 wherein the sensed information comprises video information, audio information and positional measurement information.
19. A computer program product as defined in claim 16 wherein the physical motion relates to a golf swing.
20. A computer program product as defined in claim 16 wherein the lesson file is stored on a computer readable medium.
21. A computer program product as defined in claim 16 wherein the process further comprises the following two acts that occur prior to the contemporaneous marking act:
synchronizing at least two different types of sensed information; and
presenting the synchronized information on a display.
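The following minimal sketch illustrates the flow recited in claims 8 through 15: sensed samples are received during the session, a user-triggered mark contemporaneously captures the buffered portion, and the marked portions are stored to a lesson file. The class, buffer window, and field names are illustrative assumptions only, not the claimed implementation.

```python
# Minimal sketch under assumed data shapes: receive sensed samples, mark the
# recent portion contemporaneously, and store marked portions to a lesson file.
import json
import time
from collections import deque

class LessonRecorder:
    def __init__(self, window_seconds: float = 5.0):
        self.window_seconds = window_seconds
        self.buffer = deque()        # recent (timestamp, sample) pairs
        self.marked_portions = []    # portions selected during the session

    def receive(self, sample: dict) -> None:
        """Store an incoming sensed sample and drop anything outside the window."""
        now = time.time()
        self.buffer.append((now, sample))
        while self.buffer and now - self.buffer[0][0] > self.window_seconds:
            self.buffer.popleft()

    def mark(self, label: str) -> None:
        """Contemporaneously mark the currently buffered portion of sensed information."""
        self.marked_portions.append(
            {"label": label, "samples": [s for _, s in self.buffer]}
        )

    def save(self, path: str) -> None:
        """Store the marked portions to a lesson file."""
        with open(path, "w") as f:
            json.dump({"portions": self.marked_portions}, f)

# recorder = LessonRecorder()
# recorder.receive({"video_frame": "f0102.jpg", "shoulder_turn_deg": 88.5})
# recorder.mark("top of backswing")
# recorder.save("lesson.json")
```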
US09/788,031 2001-02-16 2001-02-16 Method and system for marking content for physical motion analysis Abandoned US20020115047A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US09/788,031 US20020115047A1 (en) 2001-02-16 2001-02-16 Method and system for marking content for physical motion analysis
PCT/US2002/005217 WO2002066119A1 (en) 2001-02-16 2002-02-14 Method and system for analyzing physical motion

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US09/788,031 US20020115047A1 (en) 2001-02-16 2001-02-16 Method and system for marking content for physical motion analysis

Publications (1)

Publication Number Publication Date
US20020115047A1 true US20020115047A1 (en) 2002-08-22

Family

ID=25143229

Family Applications (1)

Application Number Title Priority Date Filing Date
US09/788,031 Abandoned US20020115047A1 (en) 2001-02-16 2001-02-16 Method and system for marking content for physical motion analysis

Country Status (2)

Country Link
US (1) US20020115047A1 (en)
WO (1) WO2002066119A1 (en)

Cited By (50)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030033602A1 (en) * 2001-08-08 2003-02-13 Simon Gibbs Method and apparatus for automatic tagging and caching of highlights
WO2004114203A1 (en) * 2003-06-20 2004-12-29 Inpractis Corporation Inc. Method and apparatus for activity analysis
US20050215336A1 (en) * 2004-03-26 2005-09-29 Sumitomo Rubber Industries, Ltd. Golf swing-diagnosing system
WO2007006346A1 (en) * 2005-07-12 2007-01-18 Dartfish Sa A method for analyzing the motion of a person during an activity
US20070196800A1 (en) * 2006-01-27 2007-08-23 Douthit Ronnie D Systems and methods for golfing simulation and swing analysis
EP1846115A2 (en) * 2005-01-26 2007-10-24 Bentley Kinetics, Inc. Method and system for athletic motion analysis and instruction
US20090089238A1 (en) * 2007-09-27 2009-04-02 Henry Colburn Stevenson-Perez Knowledge management portal for rapid learning and assessment of science
US20090239673A1 (en) * 2006-05-31 2009-09-24 Golfkick, Limited Golfing Aids
US20090270193A1 (en) * 2008-04-24 2009-10-29 United States Bowling Congress Analyzing a motion of a bowler
US20100015585A1 (en) * 2006-10-26 2010-01-21 Richard John Baker Method and apparatus for providing personalised audio-visual instruction
US20100117837A1 (en) * 2006-01-09 2010-05-13 Applied Technology Holdings, Inc. Apparatus, systems, and methods for gathering and processing biometric and biomechanical data
US20100175083A1 (en) * 2007-06-13 2010-07-08 Streamezzo Method of broadcasting a complementary element, corresponding server and terminal
US8162804B2 (en) 2007-02-14 2012-04-24 Nike, Inc. Collection and display of athletic information
WO2013056218A2 (en) * 2011-10-14 2013-04-18 Motion Signature Analysis Corp. Systems and methods for task-associated motion analysis
US20130113961A1 (en) * 2011-11-04 2013-05-09 Nike, Inc. Portable Movement Capture Device And Method Of Finite Element Analysis
US8457350B2 (en) 2001-08-10 2013-06-04 Sony Corporation System and method for data assisted chrom-keying
US20140085461A1 (en) * 2012-09-21 2014-03-27 Casio Computer Co., Ltd. Image specification system, image specification apparatus, image specification method and storage medium to specify image of predetermined time from a plurality of images
US20140180632A1 (en) * 2012-12-21 2014-06-26 Yamaha Corporation Motion Analysis Device
US20140289628A1 (en) * 2013-03-21 2014-09-25 Casio Computer Co., Ltd. Notification control apparatus for identifying predetermined frame in moving image
US20150005089A1 (en) * 2008-10-09 2015-01-01 Golf Impact, Llc Golf Swing Measurement and Analysis System
US20150018990A1 (en) * 2012-02-23 2015-01-15 Playsight Interactive Ltd. Smart-court system and method for providing real-time debriefing and training services of sport games
US20150029341A1 (en) * 2013-07-09 2015-01-29 Aditi Sinha Sport training equipment
US20160050358A1 (en) * 2008-03-21 2016-02-18 Disney Enterprises, Inc. Method and System for Multimedia Captures With Remote Triggering
US9339714B2 (en) 2014-05-20 2016-05-17 Arccos Golf Llc System and method for monitoring performance characteristics associated with user activities involving swinging instruments
US9348829B2 (en) 2002-03-29 2016-05-24 Sony Corporation Media management system and process
JP2016158702A (en) * 2015-02-27 2016-09-05 セイコーエプソン株式会社 Three-dimensional image processing system, three-dimensional image processing device, and three-dimensional image processing method
EP3086320A1 (en) * 2015-04-23 2016-10-26 Adidas AG Method and device for associating frames in a video of an activity of a person with an event
US20170001071A1 (en) * 2014-07-09 2017-01-05 Aditi Sinha Sport training equipment
US20170048466A1 (en) * 2012-03-21 2017-02-16 Casio Computer Co., Ltd. Image processing device that generates a composite image
US20170117019A1 (en) * 2014-03-21 2017-04-27 Golfzon Co., Ltd. Method for synchronizing data between different types of devices and data processing device for generating synchronized data
US20170157464A1 (en) * 2014-03-28 2017-06-08 Seiko Epson Corporation Information providing method, information providing device, information providing system, and information providing program
JP2017131327A (en) * 2016-01-26 2017-08-03 カシオ計算機株式会社 Instruction assist system, and instruction assist program
JP2017131308A (en) * 2016-01-26 2017-08-03 カシオ計算機株式会社 Instruction assist system, and instruction assist program
US9770639B2 (en) 2015-07-21 2017-09-26 Arccos Golf, Llc System and method for monitoring performance characteristics associated with user activities involving swinging instruments
US20180272220A1 (en) * 2017-03-24 2018-09-27 Robert Glorioso System and Method of Remotely Coaching a Student's Golf Swing
US20180310049A1 (en) * 2014-11-28 2018-10-25 Sony Corporation Transmission device, transmission method, reception device, and reception method
US10173100B2 (en) * 2016-09-17 2019-01-08 Navyaa Sinha Sport training equipment
JP2019010569A (en) * 2018-10-24 2019-01-24 株式会社ニコン program
JP2019010574A (en) * 2018-10-24 2019-01-24 株式会社ニコン program
USD842401S1 (en) 2017-11-02 2019-03-05 Daniel J. Mueller Baseball
JP2019081064A (en) * 2019-03-04 2019-05-30 株式会社ニコン program
JP2019170722A (en) * 2018-03-28 2019-10-10 カシオ計算機株式会社 Electronic apparatus, operation detection method, and operation detection program
US10573193B2 (en) * 2017-05-11 2020-02-25 Shadowbox, Llc Video authoring and simulation training tool
US10679413B2 (en) 2014-06-10 2020-06-09 2Mee Ltd Augmented reality apparatus and method
US10682562B2 (en) 2017-01-17 2020-06-16 Arccos Golf Llc Autonomous personalized golf recommendation and analysis environment
JP2020130957A (en) * 2019-02-26 2020-08-31 豊田合成株式会社 Swing training method
US10856037B2 (en) * 2014-03-20 2020-12-01 2MEE Ltd. Augmented reality apparatus and method
US10885333B2 (en) 2012-09-12 2021-01-05 2Mee Ltd Augmented reality apparatus and method
JP2021072881A (en) * 2010-11-10 2021-05-13 ナイキ イノベイト シーブイ Systems and method for time-based athletic activity measurement and display
US11944428B2 (en) 2015-11-30 2024-04-02 Nike, Inc. Apparel with ultrasonic position sensing and haptic feedback for activities

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3408750A (en) * 1965-09-15 1968-11-05 George T. Mccollough Visi-golf modern method of golf instruction
US5111410A (en) * 1989-06-23 1992-05-05 Kabushiki Kaisha Oh-Yoh Keisoku Kenkyusho Motion analyzing/advising system
US5342054A (en) * 1993-03-25 1994-08-30 Timecap, Inc. Gold practice apparatus
US5823786A (en) * 1993-08-24 1998-10-20 Easterbrook; Norman John System for instruction of a pupil
AU2123297A (en) * 1996-02-12 1997-08-28 Golf Age Technologies Golf driving range distancing apparatus and methods
US6159016A (en) * 1996-12-20 2000-12-12 Lubell; Alan Method and system for producing personal golf lesson video

Cited By (105)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030033602A1 (en) * 2001-08-08 2003-02-13 Simon Gibbs Method and apparatus for automatic tagging and caching of highlights
US8457350B2 (en) 2001-08-10 2013-06-04 Sony Corporation System and method for data assisted chrom-keying
US9348829B2 (en) 2002-03-29 2016-05-24 Sony Corporation Media management system and process
WO2004114203A1 (en) * 2003-06-20 2004-12-29 Inpractis Corporation Inc. Method and apparatus for activity analysis
AU2005201321B2 (en) * 2004-03-26 2007-08-09 Sri Sports Limited Golf swing-diagnosing system
GB2414190B (en) * 2004-03-26 2007-03-07 Sumitomo Rubber Ind Golf swing diagnosing system
US7857708B2 (en) 2004-03-26 2010-12-28 Sri Sports Limited Golf swing-diagnosing system
GB2414190A (en) * 2004-03-26 2005-11-23 Sumitomo Rubber Ind Golf swing diagnosis apparatus
US20050215336A1 (en) * 2004-03-26 2005-09-29 Sumitomo Rubber Industries, Ltd. Golf swing-diagnosing system
EP1846115A2 (en) * 2005-01-26 2007-10-24 Bentley Kinetics, Inc. Method and system for athletic motion analysis and instruction
EP1846115A4 (en) * 2005-01-26 2012-04-25 Bentley Kinetics Inc Method and system for athletic motion analysis and instruction
WO2007006346A1 (en) * 2005-07-12 2007-01-18 Dartfish Sa A method for analyzing the motion of a person during an activity
US20080094472A1 (en) * 2005-07-12 2008-04-24 Serge Ayer Method for analyzing the motion of a person during an activity
US8848058B2 (en) * 2005-07-12 2014-09-30 Dartfish Sa Method for analyzing the motion of a person during an activity
US11452914B2 (en) 2006-01-09 2022-09-27 Nike, Inc. Apparatus, systems, and methods for gathering and processing biometric and biomechanical data
US9907997B2 (en) 2006-01-09 2018-03-06 Nike, Inc. Apparatus, systems, and methods for gathering and processing biometric and biomechanical data
US20100121227A1 (en) * 2006-01-09 2010-05-13 Applied Technology Holdings, Inc. Apparatus, systems, and methods for gathering and processing biometric and biomechanical data
US11399758B2 (en) 2006-01-09 2022-08-02 Nike, Inc. Apparatus, systems, and methods for gathering and processing biometric and biomechanical data
US20100201512A1 (en) * 2006-01-09 2010-08-12 Harold Dan Stirling Apparatus, systems, and methods for evaluating body movements
US20100201500A1 (en) * 2006-01-09 2010-08-12 Harold Dan Stirling Apparatus, systems, and methods for communicating biometric and biomechanical information
US20100204616A1 (en) * 2006-01-09 2010-08-12 Applied Technology Holdings, Inc. Apparatus, systems, and methods for gathering and processing biometric and biomechanical data
US7821407B2 (en) 2006-01-09 2010-10-26 Applied Technology Holdings, Inc. Apparatus, systems, and methods for gathering and processing biometric and biomechanical data
US7825815B2 (en) 2006-01-09 2010-11-02 Applied Technology Holdings, Inc. Apparatus, systems, and methods for gathering and processing biometric and biomechanical data
US20100117837A1 (en) * 2006-01-09 2010-05-13 Applied Technology Holdings, Inc. Apparatus, systems, and methods for gathering and processing biometric and biomechanical data
US7978081B2 (en) 2006-01-09 2011-07-12 Applied Technology Holdings, Inc. Apparatus, systems, and methods for communicating biometric and biomechanical information
US10675507B2 (en) 2006-01-09 2020-06-09 Nike, Inc. Apparatus, systems, and methods for gathering and processing biometric and biomechanical data
US11819324B2 (en) 2006-01-09 2023-11-21 Nike, Inc. Apparatus, systems, and methods for gathering and processing biometric and biomechanical data
US11653856B2 (en) 2006-01-09 2023-05-23 Nike, Inc. Apparatus, systems, and methods for gathering and processing biometric and biomechanical data
US20100121228A1 (en) * 2006-01-09 2010-05-13 Applied Technology Holdings, Inc. Apparatus, systems, and methods for gathering and processing biometric and biomechanical data
US11717185B2 (en) 2006-01-09 2023-08-08 Nike, Inc. Apparatus, systems, and methods for gathering and processing biometric and biomechanical data
US20070196800A1 (en) * 2006-01-27 2007-08-23 Douthit Ronnie D Systems and methods for golfing simulation and swing analysis
US20090239673A1 (en) * 2006-05-31 2009-09-24 Golfkick, Limited Golfing Aids
US20100015585A1 (en) * 2006-10-26 2010-01-21 Richard John Baker Method and apparatus for providing personalised audio-visual instruction
US11210963B2 (en) * 2006-10-26 2021-12-28 Richard John Baker Method and apparatus for providing personalised audio-visual instruction
US20170206794A1 (en) * 2006-10-26 2017-07-20 Richard John Baker Method and Apparatus for Providing Personalised Audio-Visual Instruction
US8162804B2 (en) 2007-02-14 2012-04-24 Nike, Inc. Collection and display of athletic information
US10307639B2 (en) 2007-02-14 2019-06-04 Nike, Inc. Collection and display of athletic information
US11081223B2 (en) 2007-02-14 2021-08-03 Nike, Inc. Collection and display of athletic information
US8782689B2 (en) * 2007-06-13 2014-07-15 Streamezzo Method of broadcasting content and at least one complementary element, utilizing a server and a terminal
US20100175083A1 (en) * 2007-06-13 2010-07-08 Streamezzo Method of broadcasting a complementary element, corresponding server and terminal
US20090089238A1 (en) * 2007-09-27 2009-04-02 Henry Colburn Stevenson-Perez Knowledge management portal for rapid learning and assessment of science
US20160050358A1 (en) * 2008-03-21 2016-02-18 Disney Enterprises, Inc. Method and System for Multimedia Captures With Remote Triggering
US20090270193A1 (en) * 2008-04-24 2009-10-29 United States Bowling Congress Analyzing a motion of a bowler
US20150005089A1 (en) * 2008-10-09 2015-01-01 Golf Impact, Llc Golf Swing Measurement and Analysis System
US9604118B2 (en) * 2008-10-09 2017-03-28 Golf Impact, Llc Golf club distributed impact sensor system for detecting impact of a golf ball with a club face
US11935640B2 (en) 2010-11-10 2024-03-19 Nike, Inc. Systems and methods for time-based athletic activity measurement and display
JP7432499B2 (en) 2010-11-10 2024-02-16 ナイキ イノベイト シーブイ Systems and methods for measuring and displaying athletic activity on a time-based basis
JP2021072881A (en) * 2010-11-10 2021-05-13 ナイキ イノベイト シーブイ Systems and method for time-based athletic activity measurement and display
US11817198B2 (en) 2010-11-10 2023-11-14 Nike, Inc. Systems and methods for time-based athletic activity measurement and display
WO2013056218A3 (en) * 2011-10-14 2014-05-30 Motion Signature Analysis Corp. Systems and methods for task-associated motion analysis
WO2013056218A2 (en) * 2011-10-14 2013-04-18 Motion Signature Analysis Corp. Systems and methods for task-associated motion analysis
US20130113961A1 (en) * 2011-11-04 2013-05-09 Nike, Inc. Portable Movement Capture Device And Method Of Finite Element Analysis
US8982216B2 (en) * 2011-11-04 2015-03-17 Nike, Inc. Portable movement capture device and method of finite element analysis
US20180264342A1 (en) * 2012-02-23 2018-09-20 Playsight Interactive Ltd. Smart-court system and method for providing real-time debriefing and training services of sport games
US9999825B2 (en) * 2012-02-23 2018-06-19 Playsight Interactive Ltd. Smart-court system and method for providing real-time debriefing and training services of sport games
US10758807B2 (en) 2012-02-23 2020-09-01 Playsight Interactive Ltd. Smart court system
US20150018990A1 (en) * 2012-02-23 2015-01-15 Playsight Interactive Ltd. Smart-court system and method for providing real-time debriefing and training services of sport games
US10391378B2 (en) * 2012-02-23 2019-08-27 Playsight Interactive Ltd. Smart-court system and method for providing real-time debriefing and training services of sport games
US20190351306A1 (en) * 2012-02-23 2019-11-21 Playsight Interactive Ltd. Smart-court system and method for providing real-time debriefing and training services of sport games
US10382704B2 (en) * 2012-03-21 2019-08-13 Casio Computer Co., Ltd. Image processing device that generates a composite image
US20170048466A1 (en) * 2012-03-21 2017-02-16 Casio Computer Co., Ltd. Image processing device that generates a composite image
US10885333B2 (en) 2012-09-12 2021-01-05 2Mee Ltd Augmented reality apparatus and method
US11361542B2 (en) 2012-09-12 2022-06-14 2Mee Ltd Augmented reality apparatus and method
US20140085461A1 (en) * 2012-09-21 2014-03-27 Casio Computer Co., Ltd. Image specification system, image specification apparatus, image specification method and storage medium to specify image of predetermined time from a plurality of images
US20140180632A1 (en) * 2012-12-21 2014-06-26 Yamaha Corporation Motion Analysis Device
US20140289628A1 (en) * 2013-03-21 2014-09-25 Casio Computer Co., Ltd. Notification control apparatus for identifying predetermined frame in moving image
US9946346B2 (en) * 2013-03-21 2018-04-17 Casio Computer Co., Ltd. Notification control apparatus for identifying predetermined frame in moving image
US20150029341A1 (en) * 2013-07-09 2015-01-29 Aditi Sinha Sport training equipment
US9457228B2 (en) * 2013-07-09 2016-10-04 Aditi Sinha Sport training equipment
US10856037B2 (en) * 2014-03-20 2020-12-01 2MEE Ltd. Augmented reality apparatus and method
US11363325B2 (en) 2014-03-20 2022-06-14 2Mee Ltd Augmented reality apparatus and method
US20170117019A1 (en) * 2014-03-21 2017-04-27 Golfzon Co., Ltd. Method for synchronizing data between different types of devices and data processing device for generating synchronized data
US10016654B2 (en) * 2014-03-21 2018-07-10 Golfzon Co., Ltd. Method for synchronizing data collected from a camera device and a motion sensing device and data processing device generating the synchronized data
US20170157464A1 (en) * 2014-03-28 2017-06-08 Seiko Epson Corporation Information providing method, information providing device, information providing system, and information providing program
US10427017B2 (en) 2014-05-20 2019-10-01 Arccos Golf Llc System and method for monitoring performance characteristics associated with user activities involving swinging instruments
US9339714B2 (en) 2014-05-20 2016-05-17 Arccos Golf Llc System and method for monitoring performance characteristics associated with user activities involving swinging instruments
US11094131B2 (en) 2014-06-10 2021-08-17 2Mee Ltd Augmented reality apparatus and method
US10679413B2 (en) 2014-06-10 2020-06-09 2Mee Ltd Augmented reality apparatus and method
US20170001071A1 (en) * 2014-07-09 2017-01-05 Aditi Sinha Sport training equipment
US9724561B2 (en) * 2014-07-09 2017-08-08 Aditi Sinha Sport training equipment
US20180310049A1 (en) * 2014-11-28 2018-10-25 Sony Corporation Transmission device, transmission method, reception device, and reception method
US10880597B2 (en) * 2014-11-28 2020-12-29 Saturn Licensing Llc Transmission device, transmission method, reception device, and reception method
JP2016158702A (en) * 2015-02-27 2016-09-05 セイコーエプソン株式会社 Three-dimensional image processing system, three-dimensional image processing device, and three-dimensional image processing method
US9978425B2 (en) 2015-04-23 2018-05-22 Adidas Ag Method and device for associating frames in a video of an activity of a person with an event
EP3086320A1 (en) * 2015-04-23 2016-10-26 Adidas AG Method and device for associating frames in a video of an activity of a person with an event
CN106066990A (en) * 2015-04-23 2016-11-02 阿迪达斯股份公司 For the method and apparatus that the frame in the motion video of people is associated with event
US10589161B2 (en) 2015-07-21 2020-03-17 Arccos Golf, Llc System and method for monitoring performance characteristics associated with user activities involving swinging instruments
US9770639B2 (en) 2015-07-21 2017-09-26 Arccos Golf, Llc System and method for monitoring performance characteristics associated with user activities involving swinging instruments
US11944428B2 (en) 2015-11-30 2024-04-02 Nike, Inc. Apparel with ultrasonic position sensing and haptic feedback for activities
JP2017131327A (en) * 2016-01-26 2017-08-03 カシオ計算機株式会社 Instruction assist system, and instruction assist program
JP2017131308A (en) * 2016-01-26 2017-08-03 カシオ計算機株式会社 Instruction assist system, and instruction assist program
US10173100B2 (en) * 2016-09-17 2019-01-08 Navyaa Sinha Sport training equipment
US10682562B2 (en) 2017-01-17 2020-06-16 Arccos Golf Llc Autonomous personalized golf recommendation and analysis environment
US11219814B2 (en) 2017-01-17 2022-01-11 Arccos Golf Llc Autonomous personalized golf recommendation and analysis environment
US20180272220A1 (en) * 2017-03-24 2018-09-27 Robert Glorioso System and Method of Remotely Coaching a Student's Golf Swing
US10573193B2 (en) * 2017-05-11 2020-02-25 Shadowbox, Llc Video authoring and simulation training tool
USD842401S1 (en) 2017-11-02 2019-03-05 Daniel J. Mueller Baseball
JP7069953B2 (en) 2018-03-28 2022-05-18 カシオ計算機株式会社 Electronic devices, partial identification methods for operation data, and programs
JP2019170722A (en) * 2018-03-28 2019-10-10 カシオ計算機株式会社 Electronic apparatus, operation detection method, and operation detection program
JP2019010569A (en) * 2018-10-24 2019-01-24 株式会社ニコン program
JP2019010574A (en) * 2018-10-24 2019-01-24 株式会社ニコン program
JP2020130957A (en) * 2019-02-26 2020-08-31 豊田合成株式会社 Swing training method
WO2020174985A1 (en) * 2019-02-26 2020-09-03 豊田合成株式会社 Golf swing practice method
JP7115357B2 (en) 2019-02-26 2022-08-09 豊田合成株式会社 swing training method
JP2019081064A (en) * 2019-03-04 2019-05-30 株式会社ニコン program

Also Published As

Publication number Publication date
WO2002066119A1 (en) 2002-08-29

Similar Documents

Publication Publication Date Title
US6537076B2 (en) Method and system for presenting information for physical motion analysis
US6567536B2 (en) Method and system for physical motion analysis
US20020115047A1 (en) Method and system for marking content for physical motion analysis
US6514081B1 (en) Method and apparatus for automating motion analysis
US5868578A (en) Sports analysis and testing system
US9914018B2 (en) System, method and apparatus for capturing and training a swing movement of a club
US5184295A (en) System and method for teaching physical skills
US4891748A (en) System and method for teaching physical skills
EP3635951A1 (en) Augmented reality learning system and method using motion captured virtual hands
Wilson Development in video technology for coaching
US8020098B2 (en) Video analysis system of swing motion
US20040162154A1 (en) Kinetic motion analyzer
JP4646209B2 (en) Practical skill analysis system and program
US20050261073A1 (en) Method and system for accurately measuring and modeling a sports instrument swinging motion
JP2005110850A (en) Body movement evaluating method, swing motion evaluating method, and body movement measuring system
EP1930841B1 (en) Method and measuring device for motional performance
JP2002248188A (en) Multimedia analyzing system and its usage
JP2009050721A (en) Swing movement assessment method, swing movement assessment apparatus, swing movement assessment system, and swing movement assessment program
WO2006135160A1 (en) System and method for analyzing golf swing motion
JP2007313362A (en) Apparatus for teaching body motions
WO2007035878A2 (en) Method and apparatus for determining ball trajectory
Chun et al. A sensor-aided self coaching model for uncocking improvement in golf swing
EP3226229A1 (en) Motion evaluation method and system in a sport context
WO2010085704A1 (en) Video overlay sports motion analysis
JP5521265B2 (en) Swing display method and swing display system

Legal Events

Date Code Title Description
AS Assignment

Owner name: GOLFTEC, INC., COLORADO

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:MCNITT, MICHAEL J.;PARKS, JEFFREY J.;REEL/FRAME:011559/0477

Effective date: 20010216

AS Assignment

Owner name: GOLFTEC, INC., COLORADO

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:ERB, DAVID ARTHUR;REEL/FRAME:012522/0379

Effective date: 20011024

AS Assignment

Owner name: GOLFTEC ENTERPRISES LLC, COLORADO

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:GOLFTEC, INC.;REEL/FRAME:012347/0572

Effective date: 20011108

AS Assignment

Owner name: GOLFTEC ENTERPRISES LLC, COLORADO

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:GOLFTEC INC.;REEL/FRAME:013031/0778

Effective date: 20020509

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION