WO2009027917A1 - System and method for displaying anonymously annotated physical exercise data - Google Patents

System and method for displaying anonymously annotated physical exercise data

Info

Publication number
WO2009027917A1
WO2009027917A1 (PCT/IB2008/053386)
Authority
WO
WIPO (PCT)
Prior art keywords
data
person
physical
physical exercise
processing unit
Prior art date
Application number
PCT/IB2008/053386
Other languages
French (fr)
Inventor
Gerd Lanfermann
Original Assignee
Koninklijke Philips Electronics N.V.
Philips Intellectual Property & Standards Gmbh
Priority date
Filing date
Publication date
Application filed by Koninklijke Philips Electronics N.V., Philips Intellectual Property & Standards Gmbh filed Critical Koninklijke Philips Electronics N.V.
Priority to JP2010521520A priority Critical patent/JP2010536459A/en
Priority to US12/673,793 priority patent/US20110021317A1/en
Priority to EP08789619A priority patent/EP2185071A1/en
Priority to CN200880104207A priority patent/CN101784230A/en
Publication of WO2009027917A1 publication Critical patent/WO2009027917A1/en

Classifications

    • A61B 5/0002 Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network
    • A61B 5/1114 Tracking parts of the body
    • A61B 5/1127 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb, using a particular sensing technique using markers
    • A61B 5/222 Ergometry, e.g. by using bicycle type apparatus, combined with detection or measurement of physiological parameters, e.g. heart rate
    • A61B 5/744 Displaying an avatar, e.g. an animated cartoon character
    • A61B 2562/0219 Inertial sensors, e.g. accelerometers, gyroscopes, tilt switches
    • A63B 24/0003 Analysing the course of a movement or motion sequences during an exercise or training sequence, e.g. swing for golf or tennis
    • A63B 2024/0012 Comparing movements or motion sequences with a registered reference
    • A63B 2024/0096 Electric or electronic controls using performance related parameters for controlling electronic or video games or avatars
    • A63B 69/00 Training appliances or apparatus for special sports
    • A63B 69/36 Training appliances or apparatus for golf
    • A63B 71/06 Indicating or scoring devices for games or players, or for other sports activities
    • A63B 2071/0638 Displaying moving images of recorded environment, e.g. virtual environment
    • A63B 2071/0647 Visualisation of executed movements
    • A63B 2071/065 Visualisation of specific exercise parameters
    • A63B 2220/05 Image processing for measuring physical parameters
    • A63B 2220/40 Acceleration
    • A63B 2220/803 Motion sensors
    • A63B 2220/806 Video cameras
    • A63B 2220/833 Sensors arranged on the exercise apparatus or sports implement
    • A63B 2225/15 Identification means that can be read by electronic means
    • A63B 2225/20 Means for remote communication, e.g. internet or the like
    • A63B 2225/50 Wireless data transmission, e.g. by radio transmitters or telemetry
    • A63B 2230/04 Measuring heartbeat characteristics of the user, e.g. ECG, blood pressure modulations
    • A63B 2230/202 Measuring blood composition characteristics of the user: glucose
    • A63B 2230/207 P-O2, i.e. partial O2 value
    • G09B 19/0038 Teaching repetitive work cycles or sequences of movements: sports
    • G16H 20/30 ICT specially adapted for therapies or health-improving plans relating to physical therapies or activities, e.g. physiotherapy, acupressure or exercising
    • G16H 30/40 ICT specially adapted for processing medical images, e.g. editing
    • G16H 40/67 ICT specially adapted for the operation of medical equipment or devices for remote operation
    • G16H 50/50 ICT specially adapted for simulation or modelling of medical disorders

Definitions

  • The present invention relates to a system and method for displaying anonymously annotated physical exercise data to a person undertaking exercises.
  • Home rehabilitation exercises for persons suffering from a medical condition like a stroke or home training exercises for persons wishing to improve body motions like a golf swing can be recorded via sensors.
  • The exercises can also be evaluated by a professional, such as a physiotherapist or a golf instructor, in order to give the person direct feedback. If the reviewing professional is not present during the exercise, video camera recordings could be sent to him. Such recordings could be reviewed intuitively by the professional, and the commented recordings could be understood intuitively by the person undertaking the exercise. However, these recordings, especially when sent to a remote professional, could breach the person's privacy. Furthermore, completely automatic processing of such recorded images to provide meaningful feedback is a demanding task. Transmitting only the data from the sensors, by contrast, would not violate the person's privacy.
  • US 6,817,979 B2 relates to a system and method which provide for interacting with a virtual physiological model of a user with the use of a mobile communication device.
  • Physiological data associated with the user is acquired from the user.
  • The physiological data is transmitted to the mobile communication device, preferably using a wireless communication protocol.
  • The methodology further involves using the mobile communication device to communicate the physiological data to a network server.
  • The physiological data is integrated into the virtual physiological model of the user.
  • The user can access data and depictions of the user developed from the physiological data.
  • A user can create an avatar representative of the current physical state of the user.
  • The user can adjust the avatar to change its appearance to a more desired appearance.
  • The anatomical dimensions of the avatar can be changed to reflect desired waist, chest, upper arm and thigh dimensions.
  • Various training, diet and related fitness recommendations can be developed to establish a training regimen most suited to help the user achieve the desired fitness goals.
  • Physiological data is subsequently acquired and applied to the user's avatar, and compared to the desired avatar's data to determine if the training regimen is effective in achieving the desired fitness goals.
  • The interpretation of sensor signals in the frontend leads to difficulties on the part of the user: it is hard to relate to an abstract rendering of an artificial screen character.
  • The present invention is directed to a method for displaying anonymously annotated physical exercise data to a person undertaking exercises, comprising the steps of: a) gathering physical exercise data from a person undertaking exercises; b) synchronously gathering visual recordings of the person undertaking exercises; c) transmitting the physical exercise data to a physically separate annotation unit; d) annotating the physical exercise data at the physically separate annotation unit; e) transmitting the annotation information to a display and processing unit for review by the person undertaking exercises; f) displaying the visual recordings of the person undertaking exercises together with synchronized annotation information to the person.
  • The first two steps of the method describe how two different sets of information about the exercise of the person are gathered.
  • Physical exercise data is gathered, for example by continuously monitoring sensor signals from the person.
  • Visual recordings are gathered, for example by using a digital video camera.
  • The physical exercise data can then be transmitted to a physically separate annotation unit.
  • The physical separation of the annotation unit provides for anonymization of the data.
  • The physical exercise data can be processed into representations of the exercise for review by a third person.
  • The physical exercise data can then be annotated. This includes automatic processing of the data, for example by detecting deviations from motion templates.
  • The third person can add comments and suggestions to provide helpful feedback to the person performing the exercise.
  • The annotation information is transmitted to a display and processing unit at the site of the person performing the exercise.
  • The annotation information is joined with the visual recordings.
  • The recordings of the person undertaking exercises are then displayed to the person together with the synchronized annotation information.
  • The synchronization provides for displaying the annotation at the correct time, so the person can directly understand what has caught the attention of the reviewer or the automatic reviewing system.
  • An exercise of a person can thus be reviewed anonymously and feedback can be given to the person.
  • The anonymization allows for the sharing of professional resources, making the reviewing process more efficient.
  • When the person receives the feedback, it is shown to him very clearly, via the visual recordings, which part of the exercise has prompted the feedback.
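The flow of steps a) to f) can be sketched briefly in code. This is a minimal illustration under assumed names (none of the identifiers come from the patent): annotation timestamps produced remotely are paired with locally stored video frames, while only abstract sensor data ever crosses the network.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Annotation:
    """A comment made at the remote annotation unit (step d); it carries the
    time within the exercise it refers to, so step f) can synchronize it."""
    t: float     # seconds into the exercise
    text: str

def playback_plan(n_frames: int, annotations: List[Annotation],
                  fps: float = 25.0) -> List[Tuple[int, str]]:
    """Step f): pair each annotation with the index of the locally stored
    video frame at which it should be shown, using the shared exercise clock.
    The video itself never left the person's site, which preserves anonymity."""
    plan = []
    for ann in annotations:
        idx = min(int(round(ann.t * fps)), n_frames - 1)
        plan.append((idx, ann.text))
    return sorted(plan)

anns = [Annotation(t=2.0, text="keep the arm stretched")]
print(playback_plan(100, anns))   # the comment appears at frame 50 (2 s at 25 fps)
```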
  • An avatar is calculated based on the physical exercise data.
  • The term 'avatar' shall denote a computer-generated abstract rendering which represents the posture or motions of a person.
  • The avatar may be a stick figure.
  • The avatar may represent additional information such as the pulse rate, the amount of sweating, muscle fatigue and the like.
  • Step f) additionally comprises calculating an avatar and displaying the avatar synchronously with the visual recordings and the annotations to the person.
  • The person will then see the visual recording of his exercise, the annotations and the avatar.
  • The avatar may depict the person's motions more clearly if they are obscured in the visual recording by baggy clothing or have not been captured correctly on camera. Again, the avatar may be rotated to achieve the best viewing perspective. Another option is to provide multiple viewing angles with one or more avatars.
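A stick-figure avatar of the kind described can be derived from posture data by chaining body segments. The following sketch is a hypothetical illustration (the segment table, lengths and names are assumptions, not from the patent): each segment's orientation angle, as a posture sensor might report it, is converted into 2D joint coordinates that a renderer would connect with lines.

```python
import math

# Hypothetical segment table: (name, length in metres, index of parent joint).
SEGMENTS = [
    ("torso", 0.6, 0),       # grows upward from the origin joint
    ("upper_arm", 0.3, 1),   # attached at the top of the torso
    ("lower_arm", 0.3, 2),   # attached at the end of the upper arm
]

def avatar_joints(angles_deg, origin=(0.0, 0.0)):
    """Chain the segments: each angle is measured from the vertical.
    Returns the list of joint coordinates, starting with the origin."""
    joints = [origin]
    for (name, length, parent), a in zip(SEGMENTS, angles_deg):
        px, py = joints[parent]
        rad = math.radians(a)
        joints.append((px + length * math.sin(rad), py + length * math.cos(rad)))
    return joints

# Upright torso, arm raised to horizontal, forearm continuing horizontally:
print(avatar_joints([0.0, 90.0, 90.0]))
```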
  • Transmitting the physical exercise data in step c) and transmitting the annotation information in step e) is undertaken via an interconnected computer network, preferably the internet.
  • Suitable protocols include those of the TCP/IP suite.
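As one plausible wire format for such a transmission (an assumption for illustration; the patent does not prescribe a format), the sensor samples can be serialized as newline-delimited JSON, which travels over any TCP/IP connection and contains no image data:

```python
import json

def encode_samples(samples):
    """Serialize a list of sensor samples as one JSON object per line,
    suitable for streaming over a TCP connection; only abstract,
    anonymous measurement values are included."""
    return "\n".join(json.dumps(s, sort_keys=True) for s in samples)

def decode_samples(payload):
    """Inverse of encode_samples, as the annotation unit would apply it."""
    return [json.loads(line) for line in payload.splitlines() if line]

samples = [{"t": 0.04, "pulse": 72}, {"t": 0.08, "pulse": 73}]
wire = encode_samples(samples)
print(decode_samples(wire) == samples)  # True: lossless round trip
```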
  • The physical exercise data from the person is selected from the group comprising motion data, posture data, electromyographic data, pulse rate, blood pressure, blood oxygen content, blood sugar content, severity of perspiration and/or respiratory rate.
  • Some of these data types relate to the exercise itself, as in the case of motion and posture data.
  • Other data types relate to the overall condition or physical fitness of the person. Knowledge about these can give valuable insight into the effectiveness of rehabilitation or training measures. For example, it may be inferred whether the person is in the supercompensation phase after a training stimulus.
  • The annotation information is selected from the group comprising visual information, audio signals and/or speech recordings.
  • Visual information can take the form of markings, such as arrows pointing out a specific issue, that are inserted into the images of the avatar. Additionally, small video clips can be inserted to show the correct execution of the exercise.
  • Other visual information can be written comments or graphs showing statistics of data such as electromyographic data, pulse rate, blood pressure, blood oxygen content, blood sugar content, severity of perspiration and/or respiratory rate. This makes it possible to assess the situation of the person performing the exercise at a glance. Audio signals can be simple beeps when a movement is not performed correctly. Recorded speech comments can be added by the reviewer when this is the simplest way of explaining an exercise.
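The annotation kinds listed above can be captured in a single record type. This is a hedged sketch with illustrative field names (none are taken from the patent), including the automatic beep described for incorrectly performed movements:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Annotation:
    t: float                         # time within the exercise, in seconds
    kind: str                        # "marker" | "clip" | "text" | "beep" | "speech"
    text: Optional[str] = None       # written comment, if any
    media_ref: Optional[str] = None  # reference to a demo clip or speech recording
    target: Optional[str] = None     # body part a marker points at, e.g. "right elbow"

def beep_if_incorrect(t: float, deviation_deg: float,
                      threshold_deg: float = 15.0) -> Optional[Annotation]:
    """Automatic audio feedback: emit a beep annotation when the movement
    deviates from the motion template by more than the threshold.
    The 15-degree threshold is an assumed example value."""
    if abs(deviation_deg) > threshold_deg:
        return Annotation(t=t, kind="beep")
    return None
```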
  • The present invention is further directed towards a system for displaying anonymously annotated physical exercise data to a person undertaking exercises, comprising: a physical data processing unit; a display device in communication with the physical data processing unit; at least one posture recording device assigned to the person undertaking exercises and in communication with the physical data processing unit; a visual recording device in communication with the physical data processing unit; a data storage unit for storing and retrieving data from the physical data processing unit and the visual recording device, the data storage unit being in communication with the physical data processing unit; and a physically separate annotation unit in connection with the physical data processing unit, the connection being via an interconnected computer network.
  • The at least one posture recording device comprises a motion sensor on the person undertaking exercises, the sensor being selected from the group comprising acceleration sensors, inertia sensors and/or gravity sensors.
  • The motion sensors can be worn on the body of the person at selected locations such as the upper arm, lower arm, upper leg, lower leg or torso. They can be commercially available, highly integrated solid-state sensors.
  • The transmission of the sensor signals to the posture assessment unit can be undertaken via wire, wirelessly or in a body area network using the electrical conductivity of the human skin. After calculation of the person's posture, the result can be presented in the form of an avatar.
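As an illustration of how posture can be estimated from such sensors (a common technique, assumed here rather than specified by the patent): when a limb is momentarily at rest, a 3-axis acceleration sensor measures only gravity, so the segment's tilt from the vertical follows from the measured gravity vector.

```python
import math

def tilt_from_vertical_deg(ax: float, ay: float, az: float) -> float:
    """Angle in degrees between the sensor's z axis and the vertical,
    estimated from a static 3-axis accelerometer reading (m/s^2).
    Valid only when the limb is not accelerating."""
    g = math.sqrt(ax * ax + ay * ay + az * az)
    # Clamp to guard against rounding slightly outside acos's domain.
    return math.degrees(math.acos(max(-1.0, min(1.0, az / g))))

print(tilt_from_vertical_deg(0.0, 0.0, 9.81))  # sensor upright: ~0 degrees
print(tilt_from_vertical_deg(9.81, 0.0, 0.0))  # sensor horizontal: ~90 degrees
```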
  • The at least one posture recording device comprises an optical mark on the person undertaking exercises.
  • The posture recording device then employs an optical tracking system for tracking the at least one optical mark.
  • The optical marks can be borne on the body of the person at selected locations such as the upper arm, lower arm, upper leg, lower leg or torso.
  • The tracking of the marks can be effected with a single camera or a multitude of cameras. When a stereo camera is used, three-dimensional posture and movement data is generated. After image processing and calculation of the person's posture, the result can be presented in the form of an avatar.
  • A combination of motion sensors and optical tracking may provide complementary data to better calculate the posture of the person.
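For the stereo-camera case mentioned above, the third dimension follows from the standard disparity relation z = f · b / d. The sketch below assumes a calibrated, rectified stereo pair; parameter names are illustrative, not from the patent.

```python
def mark_depth(f_px: float, baseline_m: float,
               x_left_px: float, x_right_px: float) -> float:
    """Depth of an optical mark from a rectified stereo pair.
    f_px: focal length in pixels; baseline_m: camera separation in metres;
    x_left_px/x_right_px: horizontal image coordinates of the same mark
    in the left and right views."""
    disparity = x_left_px - x_right_px
    if disparity <= 0:
        raise ValueError("mark must have positive disparity")
    return f_px * baseline_m / disparity

# 20 px disparity with an 800 px focal length and 10 cm baseline:
print(mark_depth(800.0, 0.1, 420.0, 400.0))  # 4.0 metres
```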
  • A further aspect of the present invention is the use of a system according to the present invention for displaying anonymously annotated physical exercise data to a person undertaking exercises.
  • Fig. 1 shows a system according to the present invention.
  • Fig. 2 shows a synchronous overlay of visual recordings and an avatar representing physical exercise data.
  • Fig. 3 shows a flowchart of a method according to the present invention.
  • Fig. 4 shows modules for performing a method according to the present invention.
  • Fig. 1 shows a system according to the present invention for displaying anonymously annotated physical exercise data to a person undertaking exercises.
  • The person has motion sensors 3 situated on his thighs and his ankles.
  • Optical marks 3' are located on the wrist and the torso.
  • The signals of the motion sensors 3 are transmitted wirelessly to the physical data processing unit 1, where the raw sensor signals are processed into motion and posture data.
  • A video camera 4 records the motions of the person.
  • The physical data processing unit 1 performs optical tracking on the video stream of the camera 4 to identify the position and movement of the optical marks 3'. This is also processed into motion and posture data and complements the data obtained from the motion sensors 3.
  • The raw or processed sensor signals and positional information from the optical marks 3' are stored in a data storage unit 5. Furthermore, the video stream of the person performing the exercise is also stored there.
  • The data in the data storage unit 5 is stored together with information about the time of recording. This makes it possible to correlate or synchronize the information, for example knowing which position, as indicated by the posture recording devices 3, 3', corresponds to which frame of a video clip of the person performing the exercise.
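The time correlation just described amounts to a nearest-timestamp lookup. A minimal sketch, assuming posture samples stored with ascending recording times (all names illustrative):

```python
import bisect

def nearest_sample(sample_times, frame_time):
    """Return the index of the posture sample recorded closest in time to a
    given video frame. sample_times must be sorted ascending; ties go to
    the earlier sample."""
    i = bisect.bisect_left(sample_times, frame_time)
    if i == 0:
        return 0
    if i == len(sample_times):
        return len(sample_times) - 1
    # Choose whichever neighbour has the smaller time difference.
    if sample_times[i] - frame_time < frame_time - sample_times[i - 1]:
        return i
    return i - 1

# A 25 Hz video frame at t = 0.05 s falls between samples at 0.04 s and 0.08 s:
print(nearest_sample([0.0, 0.04, 0.08], 0.05))  # 1, the sample at 0.04 s
```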
  • Using an interconnected computer network such as the internet 7, the physical data processing unit 1 transmits the processed signals of the sensors 3 and the positional information from the optical marks 3' to a physically separate annotation unit 6. Temporal information is also transmitted.
  • This annotation unit then calculates a visual representation such as an avatar from the received physical data.
  • A physical therapist views the motion of the visual representation on his terminal 8 and comments on sequences, thus performing the annotation.
  • The annotation, together with the time within the exercise at which the annotation was made, is transmitted back to the physical data processing unit 1 at the location of the person undertaking exercises. Again, the transmission takes place over an interconnected computer network such as the internet 7.
  • The physical data processing unit 1 accesses the data storage unit 5 and retrieves the recorded data and video clips from the particular exercise that has been annotated.
  • A movie sequence is generated for viewing by the person and displayed on display 2.
  • The video stream of the person and an avatar calculated from the recorded data are shown simultaneously.
  • The comments of the physical therapist are also displayed or voiced to the person.
  • Fig. 2 shows a synchronous overlay of visual recordings and an avatar representing physical exercise data.
  • A person has been performing an exercise.
  • Physical data representing his motions has been recorded and used for calculation of an avatar representation.
  • The avatar's motion has been time-resolved and split into a stream of individual frames 20.
  • The person's movements have been recorded by a video camera.
  • This video image sequence has also been time-resolved and split into a stream of individual frames 21.
  • The time line in Fig. 2 beneath the frame streams arbitrarily begins at 4:16 minutes and ends at 4:21 minutes.
  • The person starts with both arms stretched and lowered.
  • The left arm is kept stretched and raised along the coronal plane until the hand is above the person's head.
  • The arm is kept in this position while the same movement is supposed to be performed with the right arm.
  • The person, however, is not able to keep his right arm outstretched in the horizontal position.
  • The arm is bent at the elbow.
  • A physical therapist remotely reviewing the avatar frames 20 can then single out the frame at 4:20 minutes and add a visual or verbal comment. This comment, together with the information that it is to be shown at 4:20 minutes into the exercise, is transmitted to the person for later review. At the person's location the annotation can then be combined with the visual recordings 21 so that the person can relate more directly to the exercise and contemplate his errors in performing it.
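Attaching the therapist's comment to the right moment of the recording reduces to a small calculation. The values below follow the Fig. 2 example (clip starting at 4:16, comment at 4:20); the function name and the 25 fps frame rate are illustrative assumptions.

```python
def frame_for_annotation(clip_start_s: float, annotation_s: float,
                         fps: float = 25.0) -> int:
    """Index of the video frame, counted from the start of the clip, at
    which an annotation made at annotation_s seconds into the exercise
    should appear."""
    if annotation_s < clip_start_s:
        raise ValueError("annotation lies before the start of the clip")
    return int(round((annotation_s - clip_start_s) * fps))

# Clip begins 4:16 into the exercise; the comment refers to 4:20:
idx = frame_for_annotation(clip_start_s=4 * 60 + 16, annotation_s=4 * 60 + 20)
print(idx)  # 100: the comment appears 4 seconds into the clip at 25 fps
```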
  • Fig. 3 shows a flowchart of a method according to the present invention.
  • the first step 30 is to record the exercise a person is performing visually, using a camera, and via posture data, using sensors.
  • the visual recordings are stored 31 and the posture recordings are transmitted to an annotation system 32.
  • an annotation system Using the annotation system, a person reviews the posture recordings and adds his comments and markers 33.
  • These annotations are transmitted back to the patient system 34, wherein 'patient' denotes the person performing an exercise.
  • the stored visual recordings are retrieved 35 and combined with the annotations 36 in order to give the person a comprehensive feedback that still does not compromise his anonymity.
  • Fig. 4 shows modules for performing a method according to the present invention to complement the depiction of a system in Fig. 1.
  • a sensor receiver 40 receives signals from motion sensors or information from the tracking of optical marks. This sensor receiver 40 communicates its data to a movement transmission module 41. Synchronously with the sensor receiver 40, a camera 42 captures a video sequence of the person performing exercises. These video sequences are stored in a storage facility 43.
  • the movement transmission module 41 transmits its data to a remotely located movement receiver 45. This is symbolized by barrier 44 separating the two sub-groups of modules.
  • the movement receiving module 45 passes the data on to a movement annotator 46 where the data is transformed into processible data and annotated by a reviewer.
  • the annotation together with information on the temporal position of the annotation within the exercise is passed on to annotation transmission module 47.
  • Aforementioned annotation transmission module 47 transmits the information to an annotation receiver 48 located at the sub-group of modules assigned to the person performing the exercise.
  • This annotation information reaches a processing and overlay module 49 which accesses video sequences from the storage module 43 and combines the sequences with the annotation so that the annotation is present at the appropriate time of the video sequence.
  • a rendering module 50 the overlaid video sequence is displayed to the person who has performed the exercise.

Abstract

The present invention relates to a method for displaying anonymously annotated physical exercise data to a person undertaking exercises. The physical exercise data is annotated at a physically separate annotation unit. At the location of the person, visual recordings of the person undertaking exercises together with synchronized annotation information are displayed to the person. A system for performing the method comprises a physical data processing unit (1), a display device (2), at least one posture recording device (3, 3'), a visual recording device (4), a data storage unit (5) and a physically separate annotation unit (6) in connection with the physical data processing unit (1), the connection being via an interconnected computer network (7).

Description

System and method for displaying anonymously annotated physical exercise data
BACKGROUND OF THE INVENTION
The present invention relates to a system and method for displaying anonymously annotated physical exercise data to a person undertaking exercises.
Home rehabilitation exercises for persons suffering from a medical condition like a stroke or home training exercises for persons wishing to improve body motions like a golf swing can be recorded via sensors. The exercises can also be evaluated by a professional such as a physiotherapist or a golf instructor in order to give the person direct feedback. If the professional performing the review is not present during the exercise, video camera recordings could be sent to him. These recordings could be reviewed intuitively by the professional and the commented recordings could be understood intuitively by the person undertaking the exercise. However, these recordings, especially when sent away to a remote professional, could breach the privacy of the person. Furthermore, a completely automatic processing of such recorded images to provide meaningful feedback is a demanding task. Alternatively, the sole transmission of data from the sensors would not violate the privacy of the person. In this respect, US 6,817,979 B2 relates to a system and method which provide for interacting with a virtual physiological model of a user with the use of a mobile communication device. Physiological data associated with the user is acquired from the user. The physiological data is transmitted to the mobile communication device, preferably with the use of a wireless communication protocol. The methodology further involves using the mobile communication device to communicate the physiological data to a network server. The physiological data is integrated into the virtual physiological model of the user. The user can access data and depictions of the user developed from the physiological data. By way of example, a user can create an avatar representative of the current physical state of the user. The user can adjust the avatar to change the appearance of the avatar to a more desired appearance.
For example, the anatomical dimensions of the avatar can be changed to reflect desired waist, chest, upper arms and thigh dimensions. Given differences between the desired avatar features and present avatar features, various training, diet and related fitness recommendations can be developed to establish a training regimen most suited to help the user achieve the desired fitness goals. Physiological data is subsequently acquired and applied to the user's avatar, and compared to the desired avatar's data to determine if the training regimen is effective in achieving the desired fitness goals. However, in general the interpretation of sensor signals in the frontend leads to difficulties on the part of the user. It is hard to relate to an abstract rendering of an artificial screen character.
Despite these efforts, there accordingly still exists a need in the art for a system and a method for displaying anonymously annotated physical exercise data to a person undertaking exercises.
SUMMARY OF THE INVENTION
To achieve this and other objects the present invention is directed to a method for displaying anonymously annotated physical exercise data to a person undertaking exercises, comprising the steps of: a) gathering physical exercise data from a person undertaking exercises; b) synchronously gathering visual recordings of the person undertaking exercises; c) transmitting the physical exercise data to a physically separate annotation unit; d) based on the physical exercise data, annotating the physical exercise data at the physically separate annotation unit; e) transmitting the annotation information to a display and processing unit for review of the person undertaking exercises; f) displaying the visual recordings of the person undertaking exercises together with synchronized annotation information to the person.
DETAILED DESCRIPTION OF THE INVENTION
Before the invention is described in detail, it is to be understood that this invention is not limited to the particular component parts of the devices described or process steps of the methods described as such devices and methods may vary. It is also to be understood that the terminology used herein is for purposes of describing particular embodiments only, and is not intended to be limiting. It must be noted that, as used in the specification and the appended claims, the singular forms "a," "an" and "the" include singular and/or plural referents unless the context clearly dictates otherwise. In the context of the present invention, the term anonymously annotated data denotes data where a third person performing the annotation has no knowledge about the identity of the person whose data he is annotating. In particular, the data does not allow for a recognition of the person. One way of achieving the anonymization is by assigning identification numbers to the data. Physical exercise data is data relating to movements or other exercises of a person.
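The ID-based anonymization described above could be sketched as follows. This is an illustrative sketch only, not the patent's implementation; the class and method names (`Pseudonymizer`, `pseudonymize`, `resolve`) are assumptions. The key property is that the mapping between identity and opaque ID never leaves the person's site.

```python
import uuid

class Pseudonymizer:
    """Replaces a person's identity with an opaque ID before data leaves
    the person's site; the reverse mapping is kept only locally, so the
    remote reviewer never learns whose data is being annotated."""

    def __init__(self):
        self._id_by_person = {}   # person -> opaque ID (kept locally)
        self._person_by_id = {}   # opaque ID -> person (kept locally)

    def pseudonymize(self, person, exercise_data):
        # Assign a stable opaque ID on first use.
        if person not in self._id_by_person:
            opaque_id = uuid.uuid4().hex
            self._id_by_person[person] = opaque_id
            self._person_by_id[opaque_id] = person
        # Only the opaque ID and non-identifying sensor data are sent out.
        return {"id": self._id_by_person[person], "data": exercise_data}

    def resolve(self, opaque_id):
        # Used locally to route returned annotations back to the person.
        return self._person_by_id[opaque_id]
```

The same ID is reused for repeated sessions of the same person, so the reviewer can track progress without learning an identity.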
The first two steps of the method describe how two different sets of information about the exercise of the person are gathered. Firstly, physical exercise data is gathered, for example by continuously monitoring sensor signals from the person. At the same time, visual recordings are gathered, for example by using a digital video camera. By synchronously gathering this data it is ensured that later on, a certain portion of the video stream can be attributed to a certain portion of the sensor signal stream and vice versa.
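The attribution this paragraph relies on — mapping a sensor sample to the video frame recorded at the same moment via a common clock — might look like the following minimal sketch. The function name and the representation of frame times as a sorted list of seconds are illustrative assumptions.

```python
import bisect

def frame_for_sample(sample_time, frame_times):
    """Return the index of the video frame closest in time to a sensor
    sample. Both streams are assumed to carry timestamps (in seconds)
    taken from a common clock; frame_times must be sorted ascending."""
    i = bisect.bisect_left(frame_times, sample_time)
    # The nearest frame is either the one just before or just after.
    candidates = [j for j in (i - 1, i) if 0 <= j < len(frame_times)]
    return min(candidates, key=lambda j: abs(frame_times[j] - sample_time))
```

The same lookup works in the other direction (frame to nearest sensor sample) by swapping the arguments' roles.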
As the visual recordings and the physical exercise data are separate entities, the physical exercise data can then be transmitted to a physically separate annotation unit. The physical separation of the annotation unit provides for an anonymization of the data. At the annotation unit the physical exercise data can be processed into representations of the exercise for review by a third person. The physical exercise data can then be annotated. This includes automatic processing of the data, for example by detecting deviations from motion templates. Furthermore, the third person can include comments and suggestions to provide helpful feedback to the person performing the exercise. Afterwards, the annotation information is transmitted to a display and processing unit at the site of the person performing the exercise. Here, the annotation information is joined with the visual recordings. The recordings of the person undertaking exercises are then displayed to the person together with the synchronized annotation information. The synchronization provides for displaying the annotation at the correct time so the person can directly understand what has caught the attention of the reviewer or the automatic reviewing system.
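The automatic detection of deviations from motion templates mentioned above could, in its simplest form, be a per-sample threshold comparison. The sketch below assumes the exercise data has already been reduced to a joint-angle trace; the function name and threshold convention are illustrative, not prescribed by the patent.

```python
def deviation_from_template(recorded, template, threshold):
    """Flag the time indices at which a recorded joint-angle trace
    deviates from a reference template by more than `threshold` degrees.
    Both traces are assumed to be sampled on the same time grid."""
    return [t for t, (r, ref) in enumerate(zip(recorded, template))
            if abs(r - ref) > threshold]
```

The flagged indices can then be turned into annotations automatically, or presented to the human reviewer as candidate moments to comment on.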
In summary, with the method according to the present invention an exercise of a person can be reviewed anonymously and feedback can be given to the person. The anonymization allows for the sharing of professional resources, making the reviewing process more efficient. At the same time, when the person receives the feedback it is very clearly shown to him, via the visual recordings, which part of the exercise has prompted the feedback.
In one embodiment of the invention, at the physically separate annotation unit in step d) an avatar is calculated based on the physical exercise data. For the purposes of this invention, the term 'avatar' shall denote a computer-generated abstract rendering which represents the posture or motions of a person. In simple cases, the avatar may be a stick figure. In more sophisticated cases, the avatar may represent additional information like the pulse rate, the amount of sweating, muscle fatigue and the like. An advantage of using an avatar representation is that the avatar can be rotated on the screen of the annotation unit while representing the exercise. This enables the reviewer to choose the best viewing angle for assessing the exercise.
In a further embodiment of the invention step f) additionally comprises calculating an avatar and displaying the avatar synchronously with the visual recordings and the annotations to the person. In summary, the person will then see the visual recording of his exercise, the annotations and the avatar. This is advantageous as the avatar may depict the motions of the person more clearly if they are obscured in the visual recording by baggy clothing or if they have not been recorded correctly on camera. Again, the avatar may be rotated to achieve the best viewing perspective. Another option is to provide multiple viewing angles with one or more avatars.
In a further embodiment of the invention transmitting the physical exercise data in step c) and transmitting the annotation information in step e) is undertaken via an interconnected computer network, preferably the internet. This allows a remotely located person to perform the review and the annotation. Suitable protocols can include those of the TCP/IP protocol suite.
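One plausible wire format for this transmission — newline-delimited JSON carried over a TCP stream — is sketched below. The message layout (an opaque ID plus a batch of timestamped samples) is an assumption for illustration; the patent does not specify a serialization.

```python
import json

def encode_message(opaque_id, samples):
    """Serialize a batch of timestamped posture samples as one
    newline-delimited JSON message suitable for writing to a TCP socket."""
    payload = {"id": opaque_id, "samples": samples}
    return (json.dumps(payload) + "\n").encode("utf-8")

def decode_message(raw):
    """Inverse of encode_message, run at the annotation unit."""
    return json.loads(raw.decode("utf-8"))
```

The newline delimiter lets the receiver split the byte stream back into individual messages without a length prefix.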
In a further embodiment of the invention the physical exercise data from the person is selected from the group comprising motion data, posture data, electromyographic data, pulse rate, blood pressure, blood oxygen content, blood sugar content, severity of perspiration and/or respiratory rate. Some of these data types relate to the exercise itself, such as motion and posture data. Others relate to the overall condition or physical fitness of the person. Knowledge about this can give valuable insight into the effectiveness of rehabilitation or training measures. For example, it may be inferred whether the person is in the supercompensation phase after a training stimulus.
In a further embodiment of the invention the annotation information is selected from the group comprising visual information, audio signals and/or speech recordings. Visual information can be in the form of markings, such as arrows pointing out a specific issue, that are inserted into the images of the avatar. Additionally, small video clips can be inserted to show the correct execution of the exercise. Other visual information can be written comments or graphs showing statistics of data like electromyographic data, pulse rate, blood pressure, blood oxygen content, blood sugar content, severity of perspiration and/or the respiratory rate. This enables the reviewer to assess the situation of the person performing the exercise at a glance. Audio signals can be simple beeps when a movement is not performed correctly. Recorded speech comments can be added by the reviewer when this is the simplest way of explaining an exercise.
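A minimal data structure for annotation information carrying its temporal position, as described above, might look like this. The field names and the 0.5-second display window are illustrative choices, not taken from the patent.

```python
from dataclasses import dataclass

@dataclass
class Annotation:
    time_s: float   # position within the exercise, in seconds
    kind: str       # e.g. "arrow", "text", "beep", "speech"
    payload: str    # marker coordinates, comment text, or a clip/audio reference

def due_annotations(annotations, t, window=0.5):
    """Annotations whose timestamp falls within `window` seconds of the
    current playback time t, i.e. those that should be shown now."""
    return [a for a in annotations if abs(a.time_s - t) <= window]
```

During playback, the display unit would call `due_annotations` once per frame to decide which markers, comments or sounds to present.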
The present invention is further directed towards a system for displaying anonymously annotated physical exercise data to a person undertaking exercises, comprising: a physical data processing unit; a display device in communication with the physical data processing unit; at least one posture recording device assigned to the person undertaking exercises and in communication with the physical data processing unit; a visual recording device in communication with the physical data processing unit; a data storage unit for storing and retrieving data from the physical data processing unit and the visual recording device, the data storage unit being in communication with the physical data processing unit; and a physically separate annotation unit in connection with the physical data processing unit, the connection being via an interconnected computer network.
In one embodiment of the invention the at least one posture recording device comprises a motion sensor on the person undertaking exercises, the sensor being selected from the group comprising acceleration sensors, inertia sensors and/or gravity sensors. The motion sensors can be worn on the body of the person at selected locations like upper arm, lower arm, upper leg, lower leg or torso. They can be commercially available, highly integrated solid-state sensors. The transmission of the sensor signals to the posture assessment unit can be undertaken via wire, wirelessly or in a body area network using the electrical conductivity of the human skin. After calculation of the person's posture the result can be presented in the form of an avatar.
In a further embodiment of the invention the at least one posture recording device comprises an optical mark on the person undertaking exercises. The posture recording device then employs an optical tracking system for tracking the at least one optical mark.
Based on the signals of the optical tracking system a representation of the person's posture is then calculated. The optical marks can be borne on the body of the person at selected locations like upper arm, lower arm, upper leg, lower leg or torso. The tracking of the marks can be effected with a single camera or a multitude of cameras. When a stereo camera is used, three-dimensional posture and movement data is generated. After image processing and calculation of the person's posture the result can be presented in the form of an avatar.
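One elementary posture computation from tracked mark positions — the angle at a joint given the 2-D coordinates of three marks, e.g. shoulder, elbow and wrist — can be sketched as follows. The patent does not prescribe this particular calculation; it is shown as a plausible building block of the posture representation.

```python
import math

def joint_angle(a, b, c):
    """Angle at joint b, in degrees, given the 2-D positions of three
    tracked marks a-b-c (e.g. shoulder-elbow-wrist). A straight limb
    yields 180 degrees; a right-angle bend yields 90 degrees."""
    v1 = (a[0] - b[0], a[1] - b[1])
    v2 = (c[0] - b[0], c[1] - b[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    n1 = math.hypot(*v1)
    n2 = math.hypot(*v2)
    return math.degrees(math.acos(dot / (n1 * n2)))
```

In the Fig. 2 example, a drop of this elbow angle well below 180 degrees is exactly what reveals that the right arm is bent rather than outstretched.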
It is also possible to combine several posture monitoring principles. For example, a combination of motion sensors and optical tracking may provide complementary data to better calculate the posture of the person.
A further aspect of the present invention is the use of a system according to the present invention for displaying anonymously annotated physical exercise data to a person undertaking exercises.
BRIEF DESCRIPTION OF THE DRAWINGS
The present invention will become more readily understood with reference to the following drawing, wherein
Fig. 1 shows a system according to the present invention,
Fig. 2 shows a synchronous overlay of visual recordings and an avatar representing physical exercise data,
Fig. 3 shows a flowchart of a method according to the present invention, and
Fig. 4 shows modules for performing a method according to the present invention.
DETAILED DESCRIPTION OF THE DRAWINGS
Fig. 1 shows a system according to the present invention for displaying anonymously annotated physical exercise data to a person undertaking exercises. As posture recording devices, the person has motion sensors 3 situated on his thighs and his ankles. Furthermore, optical marks 3' are located on the wrist and the torso. Being physical exercise data, the signals of the motion sensors 3 are transmitted wirelessly to the physical data processing unit 1 where the raw sensor signals are processed into motion and posture data. A video camera 4 records the motions of the person. Furthermore, the physical data processing unit 1 performs optical tracking operations on the video stream of the camera 4 for identifying the position and the movement of the optical marks 3'. This is also processed into motion and posture data and complements the data obtained from the motion sensors 3. The raw or processed sensor signals and positional information from the optical marks 3' are stored in a data storage unit 5. Furthermore, the video stream of the person performing the exercise is also stored there. The data in the data storage unit 5 is stored together with an information about the time of recording. This makes it possible to correlate or synchronize the information, for example knowing which position as indicated by posture recording devices 3, 3' corresponds to which frame of a video clip of the person performing the exercise.
Using an interconnected computer network such as the internet 7, the physical data processing unit 1 transmits the processed sensor 3 signals and positional information from the optical marks 3' to a physically separate annotation unit 6. Temporal information is also transmitted. This annotation unit then calculates a visual representation such as an avatar from the received physical data. A physical therapist views the motion of the visual representation on his terminal 8 and comments on sequences, thus performing the annotation. The annotation, together with the time within the exercise when the annotation has been made, is transmitted back to the physical data processing unit 1 at the location of the person undertaking exercises. Again, the transmission is achieved over an interconnected computer network such as the internet 7.
The physical data processing unit 1 then accesses the data storage unit 5 and retrieves the recorded data and video clips from the particular exercise that has been annotated. A movie sequence is generated for viewing by the person and displayed on display 2. In this case, the video stream of the person and an avatar calculated from the recorded data are shown simultaneously. At the appropriate time, the comments of the physical therapist are also displayed or voiced to the person. Fig. 2 shows a synchronous overlay of visual recordings and an avatar representing physical exercise data. A person has been performing an exercise. Physical data representing his motions has been recorded and used for calculation of an avatar representation. The avatar's motion has been time-resolved and split into a stream of individual frames 20. Likewise, the person's movements have been recorded by a video camera. This video image sequence has also been time-resolved and split into a stream of individual frames 21. As the physical exercise data and the visual recordings have been gathered simultaneously, one common time line can be assigned to them. The time line in Fig. 2 beneath the frame streams arbitrarily begins at 4:16 minutes and ends at 4:21 minutes. In the exercise of Fig. 2, the person starts with both of his arms stretched and lowered. In the images, the left arm is kept stretched and raised along the coronal plane until the hand is above the person's head. The arm is kept in this position while the same movement is supposed to be performed with the right arm. At 4:20 minutes, the person is not able to keep his right arm outstretched in the horizontal position. The arm is bent at the elbow. This makes it much easier to lift the arm so at this point no therapeutic benefit is gained. A physical therapist remotely reviewing the avatar frames 20 can then single out the frame at 4:20 minutes and add a visual or verbal comment.
This comment, together with the information that it is to be shown at 4:20 minutes into the exercise, is transmitted to the person for later review. At the person's location the annotation can then be combined with the visual recordings 21 so that the person can relate more directly to the exercise and contemplate his errors in performing it.
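Combining an annotation with the visual recordings at the appropriate time, as described for the 4:20-minute comment, could be sketched by mapping annotation timestamps to frame indices. The frame rate, names and data shapes below are assumptions for illustration only.

```python
def overlay_annotations(frames, annotations, fps=25.0):
    """Attach each (time_in_seconds, comment) annotation to the video
    frame nearest its timestamp, so it appears at the right moment
    during playback. Returns a mapping frame_index -> list of comments."""
    result = {}
    for ann_time, comment in annotations:
        # Clamp to the last frame in case an annotation lies past the end.
        idx = min(int(round(ann_time * fps)), len(frames) - 1)
        result.setdefault(idx, []).append(comment)
    return result
```

At playback, the renderer consults this mapping per frame and draws or voices any comments attached to the current frame index.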
Fig. 3 shows a flowchart of a method according to the present invention. The first step 30 is to record the exercise a person is performing visually, using a camera, and via posture data, using sensors. The visual recordings are stored 31 and the posture recordings are transmitted to an annotation system 32. Using the annotation system, a person reviews the posture recordings and adds his comments and markers 33. These annotations are transmitted back to the patient system 34, wherein 'patient' denotes the person performing an exercise. On the patient side, the stored visual recordings are retrieved 35 and combined with the annotations 36 in order to give the person comprehensive feedback that still does not compromise his anonymity.
Fig. 4 shows modules for performing a method according to the present invention to complement the depiction of a system in Fig. 1. A sensor receiver 40 receives signals from motion sensors or information from the tracking of optical marks. This sensor receiver 40 communicates its data to a movement transmission module 41. Synchronously with the sensor receiver 40, a camera 42 captures a video sequence of the person performing exercises. These video sequences are stored in a storage facility 43. The movement transmission module 41 transmits its data to a remotely located movement receiver 45. This is symbolized by barrier 44 separating the two sub-groups of modules.
The movement receiving module 45 passes the data on to a movement annotator 46 where the data is transformed into processible data and annotated by a reviewer. The annotation together with information on the temporal position of the annotation within the exercise is passed on to annotation transmission module 47. Aforementioned annotation transmission module 47 transmits the information to an annotation receiver 48 located at the sub-group of modules assigned to the person performing the exercise. This annotation information reaches a processing and overlay module 49 which accesses video sequences from the storage module 43 and combines the sequences with the annotation so that the annotation is present at the appropriate time of the video sequence. Finally, via a rendering module 50, the overlaid video sequence is displayed to the person who has performed the exercise. To provide a comprehensive disclosure without unduly lengthening the specification, the applicant hereby incorporates by reference each of the patents and patent applications referenced above.
The particular combinations of elements and features in the above detailed embodiments are exemplary only; the interchanging and substitution of these teachings with other teachings in this and the patents/applications incorporated by reference are also expressly contemplated. As those skilled in the art will recognize, variations, modifications, and other implementations of what is described herein can occur to those of ordinary skill in the art without departing from the spirit and the scope of the invention as claimed. Accordingly, the foregoing description is by way of example only and is not intended as limiting. The invention's scope is defined in the following claims and the equivalents thereto. Furthermore, reference signs used in the description and claims do not limit the scope of the invention as claimed.

Claims

CLAIMS:
1. A method for displaying anonymously annotated physical exercise data to a person undertaking exercises, comprising the steps of: a) gathering physical exercise data from a person undertaking exercises; b) synchronously gathering visual recordings of the person undertaking exercises; c) transmitting the physical exercise data to a physically separate annotation unit; d) based on the physical exercise data, annotating the physical exercise data at the physically separate annotation unit; e) transmitting the annotation information to a display and processing unit for review of the person undertaking exercises; f) displaying the visual recordings of the person undertaking exercises together with synchronized annotation information to the person.
2. Method according to claim 1, wherein at the physically separate annotation unit in step d) an avatar is calculated based on the physical exercise data.
3. Method according to claims 1 or 2, wherein step f) additionally comprises calculating an avatar and displaying the avatar synchronously with the visual recordings and the annotations to the person.
4. Method according to claims 1 to 3, wherein transmitting the physical exercise data in step c) and transmitting the annotation information in step e) is undertaken via an interconnected computer network, preferably the internet.
5. Method according to claims 1 to 4, wherein the physical exercise data from the person is selected from the group comprising motion data, posture data, electromyographic data, pulse rate, blood pressure, blood oxygen content, blood sugar content, severity of perspiration and/or respiratory rate.
6. Method according to claims 1 to 5, wherein the annotation information is selected from the group comprising visual information, audio signals and/or speech recordings.
7. A system for displaying anonymously annotated physical exercise data to a person undertaking exercises, comprising: a physical data processing unit (1); a display device (2) in communication with the physical data processing unit (1); at least one posture recording device (3, 3') assigned to the person undertaking exercises and in communication with the physical data processing unit (1); a visual recording device (4) in communication with the physical data processing unit (1); a data storage unit (5) for storing and retrieving data from the physical data processing unit (1) and the visual recording device (4), the data storage unit (5) being in communication with the physical data processing unit (1); and a physically separate annotation unit (6) in connection with the physical data processing unit (1), the connection being via an interconnected computer network (7).
8. System according to claim 7, wherein the at least one posture recording device (3, 3') comprises a motion sensor (3) on the person undertaking exercises, the sensor being selected from the group comprising acceleration sensors, inertia sensors and/or gravity sensors.
9. System according to claim 7, wherein the at least one posture recording device (3, 3') comprises an optical mark (3') on the person undertaking exercises.
10. Use of a system according to claims 7 to 9 for displaying anonymously annotated physical exercise data to a person undertaking exercises.
PCT/IB2008/053386 2007-08-24 2008-08-22 System and method for displaying anonymously annotated physical exercise data WO2009027917A1 (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
JP2010521520A JP2010536459A (en) 2007-08-24 2008-08-22 System and method for displaying anonymized annotated physical exercise data
US12/673,793 US20110021317A1 (en) 2007-08-24 2008-08-22 System and method for displaying anonymously annotated physical exercise data
EP08789619A EP2185071A1 (en) 2007-08-24 2008-08-22 System and method for displaying anonymously annotated physical exercise data
CN200880104207A CN101784230A (en) 2007-08-24 2008-08-22 System and method for displaying anonymously annotated physical exercise data

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
EP07114912.4 2007-08-24
EP07114912 2007-08-24

Publications (1)

Publication Number Publication Date
WO2009027917A1 true WO2009027917A1 (en) 2009-03-05

Family

ID=40122948

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2008/053386 WO2009027917A1 (en) 2007-08-24 2008-08-22 System and method for displaying anonymously annotated physical exercise data

Country Status (5)

Country Link
US (1) US20110021317A1 (en)
EP (1) EP2185071A1 (en)
JP (1) JP2010536459A (en)
CN (1) CN101784230A (en)
WO (1) WO2009027917A1 (en)

Cited By (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2012112900A1 (en) * 2011-02-17 2012-08-23 Nike International Ltd. Selecting and correlating physical activity data with image data
ITGE20120011A1 (en) * 2012-01-27 2013-07-28 Paybay Networks S R L PATIENT REHABILITATION SYSTEM
EP2750120A1 (en) * 2012-12-27 2014-07-02 Casio Computer Co., Ltd. Exercise information display system and exercise information display method
EP2873444A3 (en) * 2013-11-13 2015-07-15 Motorika Ltd. Virtual reality based rehabilitation apparatuses and methods
US9089182B2 (en) 2008-06-13 2015-07-28 Nike, Inc. Footwear having sensor system
WO2015110298A1 (en) * 2014-01-24 2015-07-30 Icura Aps System and method for mapping moving body parts
US9192816B2 (en) 2011-02-17 2015-11-24 Nike, Inc. Footwear having sensor system
US9381420B2 (en) 2011-02-17 2016-07-05 Nike, Inc. Workout user experience
US9389057B2 (en) 2010-11-10 2016-07-12 Nike, Inc. Systems and methods for time-based athletic activity measurement and display
US9410857B2 (en) 2013-03-15 2016-08-09 Nike, Inc. System and method for analyzing athletic activity
US9462844B2 (en) 2008-06-13 2016-10-11 Nike, Inc. Footwear having sensor system
US9549585B2 (en) 2008-06-13 2017-01-24 Nike, Inc. Footwear having sensor system
US9743861B2 (en) 2013-02-01 2017-08-29 Nike, Inc. System and method for analyzing athletic activity
US9756895B2 (en) 2012-02-22 2017-09-12 Nike, Inc. Footwear having sensor system
US10070680B2 (en) 2008-06-13 2018-09-11 Nike, Inc. Footwear having sensor system
US10568381B2 (en) 2012-02-22 2020-02-25 Nike, Inc. Motorized shoe with gesture control
US10926133B2 (en) 2013-02-01 2021-02-23 Nike, Inc. System and method for analyzing athletic activity
US11006690B2 (en) 2013-02-01 2021-05-18 Nike, Inc. System and method for analyzing athletic activity
US11684111B2 (en) 2012-02-22 2023-06-27 Nike, Inc. Motorized shoe with gesture control

Families Citing this family (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8500604B2 (en) * 2009-10-17 2013-08-06 Robert Bosch Gmbh Wearable system for monitoring strength training
JP5791726B2 (en) 2010-09-29 2015-10-07 ダカドー・アーゲー Automated health data acquisition, health data processing, and health data communication system
US9378336B2 (en) 2011-05-16 2016-06-28 Dacadoo Ag Optical data capture of exercise data in furtherance of a health score computation
CN102440774A (en) * 2011-09-01 2012-05-09 东南大学 Remote measurement module for related physiological information in rehabilitation training process
US20130178960A1 (en) * 2012-01-10 2013-07-11 University Of Washington Through Its Center For Commercialization Systems and methods for remote monitoring of exercise performance metrics
US9652992B2 (en) * 2012-10-09 2017-05-16 Kc Holdings I Personalized avatar responsive to user physical state and context
US9501942B2 (en) 2012-10-09 2016-11-22 Kc Holdings I Personalized avatar responsive to user physical state and context
JP2014199613A (en) * 2013-03-29 2014-10-23 株式会社コナミデジタルエンタテインメント Application control program, application control method, and application control device
US10484437B2 (en) * 2015-01-21 2019-11-19 Logmein, Inc. Remote support service with two-way smart whiteboard
US20160346612A1 (en) * 2015-05-29 2016-12-01 Nike, Inc. Enhancing Exercise Through Augmented Reality
WO2017055080A1 (en) * 2015-09-28 2017-04-06 Koninklijke Philips N.V. System and method for supporting physical exercises
CN105641900B (en) * 2015-12-28 2019-07-26 联想(北京)有限公司 A kind of respiratory state based reminding method and electronic equipment and system
KR102511518B1 (en) * 2016-01-12 2023-03-20 삼성전자주식회사 Display apparatus and control method of the same
CN105615852A (en) * 2016-03-17 2016-06-01 北京永数网络科技有限公司 Blood pressure detection system and method
JP7009955B2 (en) * 2017-11-24 2022-01-26 トヨタ自動車株式会社 Medical data communication equipment, servers, medical data communication methods and medical data communication programs
US11511158B2 (en) * 2018-08-07 2022-11-29 Interactive Strength, Inc. User interface system for an interactive exercise machine
US20200107750A1 (en) * 2018-10-03 2020-04-09 Surge Motion Inc. Method and system for assessing human movements

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5679004A (en) * 1995-12-07 1997-10-21 Movit, Inc. Myoelectric feedback system
JP3469410B2 (en) * 1996-11-25 2003-11-25 三菱電機株式会社 Wellness system
US20060247070A1 (en) * 2001-06-11 2006-11-02 Recognition Insight, Llc Swing position recognition and reinforcement
US20060183980A1 (en) * 2005-02-14 2006-08-17 Chang-Ming Yang Mental and physical health status monitoring, analyze and automatic follow up methods and its application on clothing
WO2006103676A2 (en) * 2005-03-31 2006-10-05 Ronen Wolfson Interactive surface and display system
US20090299232A1 (en) * 2006-07-12 2009-12-03 Koninklijke Philips Electronics N.V. Health management device
WO2009024929A1 (en) * 2007-08-22 2009-02-26 Koninklijke Philips Electronics N.V. System and method for displaying selected information to a person undertaking exercises

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0816986A2 (en) * 1996-07-03 1998-01-07 Hitachi, Ltd. Method, apparatus and system for recognizing motions
US20030054327A1 (en) * 2001-09-20 2003-03-20 Evensen Mark H. Repetitive motion feedback system and method of practicing a repetitive motion
US20040002634A1 (en) * 2002-06-28 2004-01-01 Nokia Corporation System and method for interacting with a user's virtual physiological model via a mobile terminal
US6817979B2 (en) 2002-06-28 2004-11-16 Nokia Corporation System and method for interacting with a user's virtual physiological model via a mobile terminal
US20060025229A1 (en) * 2003-12-19 2006-02-02 Satayan Mahajan Motion tracking and analysis apparatus and method and system implementations thereof
US20060166737A1 (en) * 2005-01-26 2006-07-27 Bentley Kinetics, Inc. Method and system for athletic motion analysis and instruction

Cited By (48)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10070680B2 (en) 2008-06-13 2018-09-11 Nike, Inc. Footwear having sensor system
US10912490B2 (en) 2008-06-13 2021-02-09 Nike, Inc. Footwear having sensor system
US11026469B2 (en) 2008-06-13 2021-06-08 Nike, Inc. Footwear having sensor system
US10314361B2 (en) 2008-06-13 2019-06-11 Nike, Inc. Footwear having sensor system
US9622537B2 (en) 2008-06-13 2017-04-18 Nike, Inc. Footwear having sensor system
US9089182B2 (en) 2008-06-13 2015-07-28 Nike, Inc. Footwear having sensor system
US11707107B2 (en) 2008-06-13 2023-07-25 Nike, Inc. Footwear having sensor system
US9549585B2 (en) 2008-06-13 2017-01-24 Nike, Inc. Footwear having sensor system
US9462844B2 (en) 2008-06-13 2016-10-11 Nike, Inc. Footwear having sensor system
US10293209B2 (en) 2010-11-10 2019-05-21 Nike, Inc. Systems and methods for time-based athletic activity measurement and display
US9389057B2 (en) 2010-11-10 2016-07-12 Nike, Inc. Systems and methods for time-based athletic activity measurement and display
US11600371B2 (en) 2010-11-10 2023-03-07 Nike, Inc. Systems and methods for time-based athletic activity measurement and display
US9429411B2 (en) 2010-11-10 2016-08-30 Nike, Inc. Systems and methods for time-based athletic activity measurement and display
US9757619B2 (en) 2010-11-10 2017-09-12 Nike, Inc. Systems and methods for time-based athletic activity measurement and display
US10632343B2 (en) 2010-11-10 2020-04-28 Nike, Inc. Systems and methods for time-based athletic activity measurement and display
US11817198B2 (en) 2010-11-10 2023-11-14 Nike, Inc. Systems and methods for time-based athletic activity measurement and display
US11568977B2 (en) 2010-11-10 2023-01-31 Nike, Inc. Systems and methods for time-based athletic activity measurement and display
US11935640B2 (en) 2010-11-10 2024-03-19 Nike, Inc. Systems and methods for time-based athletic activity measurement and display
US9192816B2 (en) 2011-02-17 2015-11-24 Nike, Inc. Footwear having sensor system
US11170885B2 (en) 2011-02-17 2021-11-09 Nike, Inc. Selecting and correlating physical activity data with image data
CN107122585A (en) * 2011-02-17 2017-09-01 耐克创新有限合伙公司 Selected using view data and associate sports data
WO2012112900A1 (en) * 2011-02-17 2012-08-23 Nike International Ltd. Selecting and correlating physical activity data with image data
KR101810751B1 (en) 2011-02-17 2017-12-19 나이키 이노베이트 씨.브이. Selecting and correlating physical activity data with image data
US9924760B2 (en) 2011-02-17 2018-03-27 Nike, Inc. Footwear having sensor system
CN107122585B (en) * 2011-02-17 2022-07-01 耐克创新有限合伙公司 Selecting and associating athletic activity data using image data
US9411940B2 (en) 2011-02-17 2016-08-09 Nike, Inc. Selecting and correlating physical activity data with image data
US10179263B2 (en) 2011-02-17 2019-01-15 Nike, Inc. Selecting and correlating physical activity data with image data
US9381420B2 (en) 2011-02-17 2016-07-05 Nike, Inc. Workout user experience
US8827815B2 (en) 2011-02-17 2014-09-09 Nike, Inc. Location mapping
ITGE20120011A1 (en) * 2012-01-27 2013-07-28 Paybay Networks S R L PATIENT REHABILITATION SYSTEM
US11071345B2 (en) 2012-02-22 2021-07-27 Nike, Inc. Footwear having sensor system
US11793264B2 (en) 2012-02-22 2023-10-24 Nike, Inc. Footwear having sensor system
US10357078B2 (en) 2012-02-22 2019-07-23 Nike, Inc. Footwear having sensor system
US10568381B2 (en) 2012-02-22 2020-02-25 Nike, Inc. Motorized shoe with gesture control
US11071344B2 (en) 2012-02-22 2021-07-27 Nike, Inc. Motorized shoe with gesture control
US9756895B2 (en) 2012-02-22 2017-09-12 Nike, Inc. Footwear having sensor system
US11684111B2 (en) 2012-02-22 2023-06-27 Nike, Inc. Motorized shoe with gesture control
EP2750120A1 (en) * 2012-12-27 2014-07-02 Casio Computer Co., Ltd. Exercise information display system and exercise information display method
US9656119B2 (en) 2012-12-27 2017-05-23 Casio Computer Co., Ltd. Exercise information display system, exercise information display method, and computer-readable storage medium having exercise information display program stored thereon
US10926133B2 (en) 2013-02-01 2021-02-23 Nike, Inc. System and method for analyzing athletic activity
US11006690B2 (en) 2013-02-01 2021-05-18 Nike, Inc. System and method for analyzing athletic activity
US9743861B2 (en) 2013-02-01 2017-08-29 Nike, Inc. System and method for analyzing athletic activity
US11918854B2 (en) 2013-02-01 2024-03-05 Nike, Inc. System and method for analyzing athletic activity
US10024740B2 (en) 2013-03-15 2018-07-17 Nike, Inc. System and method for analyzing athletic activity
US9410857B2 (en) 2013-03-15 2016-08-09 Nike, Inc. System and method for analyzing athletic activity
US9810591B2 (en) 2013-03-15 2017-11-07 Nike, Inc. System and method of analyzing athletic activity
EP2873444A3 (en) * 2013-11-13 2015-07-15 Motorika Ltd. Virtual reality based rehabilitation apparatuses and methods
WO2015110298A1 (en) * 2014-01-24 2015-07-30 Icura Aps System and method for mapping moving body parts

Also Published As

Publication number Publication date
CN101784230A (en) 2010-07-21
US20110021317A1 (en) 2011-01-27
JP2010536459A (en) 2010-12-02
EP2185071A1 (en) 2010-05-19

Similar Documents

Publication Publication Date Title
US20110021317A1 (en) System and method for displaying anonymously annotated physical exercise data
KR100772497B1 (en) Golf clinic system and application method thereof
JP4594157B2 (en) Exercise support system, user terminal device thereof, and exercise support program
US11069144B2 (en) Systems and methods for augmented reality body movement guidance and measurement
US9892655B2 (en) Method to provide feedback to a physical therapy patient or athlete
CN108289613B (en) System, method and computer program product for physiological monitoring
US20170136296A1 (en) System and method for physical rehabilitation and motion training
CA2844651C (en) Systems, apparatus and methods for non-invasive motion tracking to augment patient administered physical rehabilitation
US20140172460A1 (en) System, Method, and Computer Program Product for Digitally Recorded Musculoskeletal Diagnosis and Treatment
EP2635988B1 (en) Method and system for automated personal training
EP2643779B1 (en) Fatigue indices and uses thereof
JP4335456B2 (en) 2009-09-30 System that allows a person performing a series of exercises to self-manage
US20150327794A1 (en) System and method for detecting and visualizing live kinetic and kinematic data for the musculoskeletal system
US11395940B2 (en) System and method for providing guided augmented reality physical therapy in a telemedicine platform
JP2009542397A (en) Health management device
WO2013163204A1 (en) Equestrian performance sensing system
US9248361B1 (en) Motion capture and analysis systems for use in training athletes
JP2019118783A (en) Remote rehabilitation analysis device and method thereof
Huang et al. Smartglove for upper extremities rehabilitative gaming assessment
CN115115810A (en) Multi-person collaborative focus positioning and enhanced display method based on spatial posture capture
US20220277506A1 (en) Motion-based online interactive platform
KR20060088110A (en) Sportcare set-top-box monitoring system
US20200371738A1 (en) Virtual and augmented reality telecommunication platforms
KR20140082449A (en) Health and rehabilitation apparatus based on natural interaction
US20100262989A1 (en) System and method for generating individualized exercise movies

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase

Ref document number: 200880104207.6

Country of ref document: CN

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 08789619

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2008789619

Country of ref document: EP

ENP Entry into the national phase

Ref document number: 2010521520

Country of ref document: JP

Kind code of ref document: A

WWE Wipo information: entry into national phase

Ref document number: 12673793

Country of ref document: US

NENP Non-entry into the national phase

Ref country code: DE