US20060128263A1 - Computerized assessment system and method for assessing opinions or feelings - Google Patents

Computerized assessment system and method for assessing opinions or feelings

Info

Publication number
US20060128263A1
US20060128263A1 (application US11/297,498)
Authority
US
United States
Prior art keywords
facial expression
computer
subject
face image
feelings
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US11/297,498
Inventor
John Baird
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Psychological Applications LLC
Original Assignee
Psychological Applications LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Psychological Applications LLC
Priority to US11/297,498
Assigned to PSYCHOLOGICAL APPLICATIONS LLC (assignment of assignors interest; see document for details). Assignors: BAIRD, JOHN C.
Publication of US20060128263A1

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B 5/16: Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
    • A61B 5/74: Details of notification to user or communication with user or patient; user input means
    • A61B 5/7475: User input or interface means, e.g. keyboard, pointing device, joystick

Definitions

  • the present invention relates to methods for assessing opinions or feelings and, more particularly, to a computerized assessment system and method for assessing opinions or feelings by allowing a subject to dynamically adjust a variable facial expression on a computer-generated face image.
  • each of the faces 110 a - 110 f includes an outline of the head containing a mouth, nose, two eyes, and eyebrows.
  • the faces 110 a - 110 f may be distinguished one from the other by the shape of the mouth (smiling, neutral, or frowning), by the location of the eyebrows (curving down, curving up, or touching the tops of the eyes) and by the character of the eyes (light or dark, no tears, tears). The child is asked to choose the one static face that best describes how he/she is feeling at the moment.
  • One measurement problem is that there is no continuous variable that is associated with the facial expression and that changes systematically as the emotion being depicted ranges from the extremes of “no hurt” to “hurts worst.” Thus, there is no objective measure associated with the emotional or painful intensity being indicated.
  • Another problem is that the number and type of features depicted in each face is not always the same. In FIG. 1 , for example, eyelids are not present in faces 110 a - 110 c but are present in faces 110 d - 110 f; and tears are present in face 110 f but not in the other faces 110 a - 110 e. Thus, there is no single change in the face that is uniquely linked to differences in the emotional intensity supposedly depicted.
  • the Face Scale is asymmetric.
  • One implementation problem is that the verbal instructions given to the child may not match the facial expression being expressed.
  • the face 110 b labeled by the investigator as “hurts little bit” is depicted as smiling, not frowning.
  • Another implementation problem is that such methods do not allow the child to continuously change or fine-tune their judgments before deciding on a final judgment.
  • static methods do not allow for the recording of dynamic changes over time. The specific nature of these dynamic changes may provide insight into the judgment strategy being followed by the child, as well as provide a sensitive measure of the child's response to treatment.
  • ratings associated with the selected faces are not automatically stored in a computer file for later analysis. The ratings must be entered into the computer by hand, thus increasing the time burden on the investigator or clinician, and increasing the chance of error in the input of data.
  • the nominal scale only allows one to make statements of the following sort: “the pain level represented by this number (face) is different from the pain level represented by some other number (face) in the series.” The relative intensity of pain through comparison of one face with another may not be assessed.
  • FIG. 1 is an illustration of the static faces on a conventional faces pain rating scale.
  • FIG. 2 is a diagrammatic illustration of a computerized assessment system for assessing feelings or opinions by allowing dynamic adjustment of a variable facial expression on a computer-generated face image, consistent with one embodiment of the present invention.
  • FIG. 3 is a diagrammatic illustration of a computerized assessment system for assessing feelings or opinions implemented on a desktop computer, consistent with one embodiment of the present invention.
  • FIG. 4 is a diagrammatic illustration of a computerized assessment system for assessing feelings or opinions implemented on a handheld computer, consistent with another embodiment of the present invention.
  • FIGS. 5 and 6 are schematic block diagrams of a computerized assessment system for assessing opinions or feelings, consistent with different embodiments of the present invention.
  • FIG. 7 is an illustration of a range of possible facial expressions of the computer-generated face image, consistent with one embodiment of the present invention.
  • FIG. 8 is a graphical illustration of an algorithm used to change a facial expression on the computer-generated face image, consistent with one embodiment of the present invention.
  • FIG. 9 is a flow chart illustrating a method of assessing opinions or feelings, consistent with one embodiment of the present invention.
  • FIGS. 10A-10C are illustrations of computer-generated face images together with different graphical representations of toys being assessed, consistent with one application of the present invention.
  • FIGS. 11A-11C are illustrations of computer-generated face images together with different graphical representations of activities in a medical facility being assessed, consistent with another application of the present invention.
  • FIG. 12 is an illustration of a computer-generated face image together with a video being assessed, consistent with a further application of the present invention.
  • a computerized assessment system 200 may be used to assess feelings or opinions of a subject by allowing the dynamic adjustment of a variable facial expression on a computer-generated face image 210 .
  • the computerized assessment system 200 may be used to assess any feelings or opinions of a subject including, but not limited to, pain, anxiety, fear, happiness/sadness, pleasure/displeasure, and likes/dislikes.
  • the computerized assessment system 200 may be used to assess feelings or opinions of a child or other individual (e.g., an adult with a reading disability).
  • the computerized assessment system 200 may be used by the subject alone or together with one or more individuals conducting the assessment.
  • One exemplary application for the computerized rating system 200 is to assess the pain felt by a child in a medical facility. Other applications are also described in greater detail below.
  • the computerized assessment system 200 may include a display 202 for displaying the computer-generated face image 210 and a user input device 204 for controlling the dynamic adjustment of the variable facial expression on the face image 210 .
  • the user input device 204 may provide user input signals that cause the display 202 to change the facial expression on the face image 210 .
  • the face image 210 may include a two-dimensional schema of at least a head 212 , mouth 214 , eyes 216 , and a nose 218 , which may be proportionally sized and translated within the x-y plane. Those skilled in the art will recognize that the face image 210 may be represented in other ways (e.g., a more detailed three-dimensional image).
  • the variable facial expression of the face image 210 may be dynamically adjusted by changing the upward and downward curvature of the line representing the mouth 214 (i.e., smiling and frowning) and the opening and closing of the lines representing the eyes 216. These facial features may be changed without changing the circle representing the head 212 and the line representing the nose 218.
  • the user input device 204 may include one or more controls 230 , 232 to control the dynamic changes in the expression of the face image 210 .
  • one control 230 may provide a positive user input signal that causes the face 210 to change positively (e.g., mouth 214 curves upward and eyes 216 open) and another control 232 may provide a negative user input signal that causes the face 210 to change negatively (e.g., mouth 214 curves downward and eyes 216 close).
  • the controls 230 , 232 may include up and down arrows corresponding to the direction of movement. Those skilled in the art will recognize that other features may also be dynamically adjusted (e.g., changing eyebrows or tears from the eyes).
  • FIGS. 3 and 4 show computerized assessment systems 300 , 400 consistent with different embodiments of the present invention.
  • the computerized assessment systems 300 , 400 may stand alone or may be coupled to a network, for example, using either a wired or wireless connection.
  • the computerized assessment system 300 may be implemented using a personal computer (e.g., a desktop or a laptop computer) including a display 302 and one or more input devices 304 , 304 a (e.g., a mouse, a keyboard, joystick, or a separate remote control device) coupled to the personal computer.
  • the user (e.g., a subject being assessed or an individual conducting the assessment) may depress controls (e.g., buttons 330, 332 on a mouse or keys 330 a, 332 a on a keyboard) to change the expression of the computer-generated face image 310 displayed on the computer display 302.
  • the user input device may also include a remote control or a wireless device (not shown) communicating with the personal computer using a wireless protocol.
  • the user input device may also include a joystick (not shown) that the user moves in different directions to provide the user input signals.
  • the computerized assessment system 400 may be implemented using a handheld computer (e.g., a personal digital assistant) including a display 402 and an input device 404 located on the handheld computer.
  • the display 402 may be a touch screen display
  • the input device 404 may include control images 430 , 432 (e.g., arrows) displayed on the display 402 .
  • the user may touch the control images 430 , 432 on the display 402 (e.g., using a stylus) to change the expression of a computer-generated face image 410 displayed on the display 402 .
  • the input device 404 may be implemented using other controls (e.g., keys, push buttons, rollers) located on the handheld computer.
  • the computerized assessment system and method may also prompt the subject to provide an assessment in response to a target stimulus, for example, by providing a visual or audible representation of an item, activity, or concept for which the assessment is to be made.
  • the exemplary computerized assessment systems 300 , 400 may display an image 320 , 420 on the display 302 , 402 in proximity to the computer-generated face image 310 , 410 .
  • the image 320 , 420 may be a photograph, drawing or video depicting the item, activity or concept.
  • the computerized assessment systems 300 , 400 may play an audio clip describing or representing an item, activity, or concept.
  • a separate image or audio representation may be provided (e.g., using another device) instead of or in addition to the image and/or audio clip provided by the computerized assessment systems 300 , 400 .
  • An individual may also provide a verbal query or description of an item, activity, or concept instead of or in addition to the image and/or audio clip provided by the computerized assessment systems 300 , 400 . Examples of items, activities, or concepts that may be assessed are described in greater detail below.
  • the computerized assessment system may be implemented using software executed by a computing device.
  • the assessment software 520 resides on a stand-alone general purpose computer 510 ( FIG. 5 ), such as a PC or handheld computer, which allows the user to access the software 520 .
  • Files including images, videos, or audio clips used to prompt the assessment may also be stored on the general purpose computer 510 .
  • a display 502 for displaying the computer-generated face image and a user input device 504 for controlling the variable facial expression may be coupled to the stand-alone general purpose computer 510 .
  • the assessment software 620 resides on a server computer 612 ( FIG. 6 ) and is accessed using a computer 610 connected to the server computer 612 over a data network 630 , such as a local area network, a wide area network, an intranet, or the Internet. Files including images, videos, or audio clips used to prompt the assessment may also be stored on the server computer 612 .
  • a display 602 for displaying the computer-generated face image and a user input device 604 for controlling the variable facial expression may be coupled to the general purpose computer 610 .
  • the software 520 , 620 can be implemented to perform the functions described herein using programming techniques known to a programmer of ordinary skill in the art.
  • the assessment software 520 on the stand-alone computer 510 can be developed using a programming language such as Basic
  • the assessment software 620 residing on the server computer 612 can be developed using a programming language such as Java.
  • Embodiments of the software may be implemented as a computer program product for use with a computer system.
  • Such implementation includes, without limitation, a series of computer instructions that embody all or part of the functionality described herein with respect to the assessment system and method.
  • the series of computer instructions may be stored in any machine-readable medium, such as semiconductor, magnetic, optical or other memory devices, and may be transmitted using any communications technology, such as optical, infrared, microwave, or other transmission technologies.
  • Such a computer program product may be distributed as a removable machine-readable medium (e.g., a diskette, CD-ROM), preloaded with a computer system (e.g., on system ROM or fixed disk), or distributed from a server or electronic bulletin board over the network (e.g., the Internet or World Wide Web).
  • Alternative embodiments of the invention may be implemented as pre-programmed hardware elements or as a combination of hardware, software and/or firmware.
  • FIG. 7 illustrates seven of the numerous possible expressions of a computer-generated face image 710 a - 710 g, consistent with one embodiment of the present invention.
  • by activating the controls (e.g., up and down arrows) of a user input device, the user (e.g., a child subject) can change the mouth in successive steps from a state of smiling (e.g., face image 710 a) to a neutral state (e.g., face image 710 d) to a state of frowning (e.g., face image 710 g).
  • the eyes may vary from completely open (associated with the smiling face image 710 a ) to half open (associated with neutral face image 710 d ) to completely closed (associated with frowning face image 710 g ).
  • the user may change the facial expression in either direction (positive or negative) multiple times until satisfied.
  • the background color of the face image (e.g., white) and the outline of the face and nose (e.g., blue) may not change with the facial expression.
  • the color of the mouth and eyes may be associated with the expression that is being depicted. For example, all features of the face may be “blue” when the mouth is in the neutral position. For all expressions indicating a smile, the mouth and eyes may be “green”, and for all expressions indicating a frown, the mouth and eyes may be “red.” These colors can be changed so long as they do not interfere with the visibility of either the fixed or dynamic facial features. Those skilled in the art will recognize, however, that other colors are possible.
  • a computerized assessment system and method may also assign rating values to the different facial expressions and may record rating values associated with a selected facial expression.
  • the ratings are thus associated with the opinion or feeling of the subject.
  • FIG. 7 shows one example of scaled ratings associated with each of the facial expressions 710 a - 710 g, although such rating values may or may not be displayed with the computer-generated face image.
  • the rating values range from +99 to −99 with positive values associated with smiling faces 710 a - 710 c and negative values associated with frowning faces 710 e - 710 g.
  • the expressions may be equally spaced (e.g., 33 units) along an objective scale.
  • the ratings may be recorded, for example, by the computer used to make the assessment (e.g., general purpose computer 510 in FIG. 5 ) or by another computer on a network (e.g., server computer 612 in FIG. 6 ).
  • An x-y coordinate system 810 may be situated with its origin (zero) at the center of a line 812 a, 812 b representing the mouth.
  • a positive quadratic equation may be used to generate different degrees of smiling and a negative quadratic equation may be used to generate different degrees of frowning.
  • the maximum smile (line 812 a ) and the maximum frown (line 812 b ) are depicted in FIG. 8 to illustrate the extremes of facial expressions that can be generated.
  • the value x may range from the left corner (minimum x value) to the center (zero x value) and out to the right corner (maximum x value).
  • the variable y represents the curvature of the mouth (i.e., in the y axis) from the left corner to the right corner, which may be essentially continuous depending on the step size for x.
  • the value of the scalar multiplier λ may be under user control, and the user input signals generated by a user input device may be used to increase and decrease the value.
  • the step size determines the degree of change in the facial expression accompanying the activation of the input device.
  • the value of the scalar λ (or some number that is a transformation of λ) may be used as a rating value.
  • the range of scalar values from λ to −λ may represent an entire spectrum of opinions or feelings from extremely positive (e.g., no pain, happiness, pleasure) to extremely negative (e.g., pain, sadness, displeasure).
  • the value of the scalar indicates the rating, and the ratings may change dynamically with the changes in the facial expression.
  • the final rating may be taken to indicate the user's final opinion or feeling with regard to the target stimulus or the concept represented by the target stimulus.
  • a convenient range of values for the ratings can be obtained by the appropriate scaling of the scalar λ to create a range of possible scores or ratings from −100 to +100.
  • the opening and closing of the eyes on the face image may be scaled in magnitude with the smiling and frowning of the mouth.
  • the eyes may thus range from ‘completely open’ when the mouth shows maximum smiling, to ‘half open’ when the mouth is neutral, and then to ‘completely closed’ when the mouth shows maximum frowning.
  • a series of still images may be displayed, each having an incremental change in the facial feature(s) (e.g., the mouth and eyes).
  • the still images may be displayed sequentially to the user such that the mouth and/or eyes dynamically change.
  • Each of the still images may be associated with a rating value.
  • the assessment system displays 910 the computer-generated face image including the variable facial expression. Initially, the computer-generated face image may be displayed with a neutral expression (e.g., face image 710 d in FIG. 7 ). The subject being assessed may be prompted 912 to express an opinion or feeling on a particular matter (e.g., an item, activity or concept). As described above, the assessment system or an individual performing the assessment may prompt the subject.
  • a prompt is a single verbal query, such as “How does the bump on your head make you feel?” or “How does playing with a dog make you feel?”
  • a prompt is a nonverbal auditory event (e.g., a barking dog, thunder).
  • a further example of a prompt is a photograph or drawing (e.g., a picture of a person, a toy, or a food type).
  • Yet another example of a prompt is a changing display, such as a video and/or audio depiction of an event that unfolds over time.
  • the assessment system may then receive 914 a user input signal provided by the subject based on the subject's opinion or feeling. If the subject feels more positive than the computer-generated face displayed initially, for example, the subject may provide a positive input signal (e.g., using a control with an up arrow). If the subject feels more negative than the computer-generated face displayed initially, the subject may provide a negative input signal (e.g., using a control with a down arrow). In response to the user input signal, the variable facial expression on the computer-generated face changes 916 .
  • the user input signal may be received repeatedly and the facial expression may change dynamically as the user makes adjustments until the facial expression of the computer-generated face corresponds to the subject's opinion or feeling on a particular matter.
  • the system thus allows the subject to adjust the expression of the face and to continue making adjustments until satisfied that the expression (e.g., one of the expressions shown in FIG. 7 ) represents their opinions or feelings about a particular matter (e.g., item, activity, concept).
  • the assessment system may also store or record 918 a rating value associated with the displayed facial expression and thus the subject's opinion or feeling.
  • the rating value may be stored or written to a data file for later analysis.
  • the data file may be given a code name for each user or subject for whom ratings are recorded.
  • the rating values may be recorded together with information identifying the visual and/or audible representation prompting the assessment. If multiple stimulus conditions are employed in the same test trial (e.g., feelings about each of a series of pictures are expressed), for example, each rating value and its associated picture label may be stored in the file for later analysis.
  • rating values may be recorded in discrete steps as the subject changes the facial expression to permit the rating process to be reviewed and evaluated by an investigator at a later point in time.
  • each keystroke or user input can be stored in a file, as well as the time associated with each input. The assessment system and method may thus recreate the dynamic changes leading up to each final rating and the time required to make each change in a rating. Such information may prove valuable when attempting to understand the process used by the subject in making the assessment.
  • a child may express opinions or feelings (e.g., likes and dislikes) about the desirability of different toys.
  • a series of toys may be presented singly, for example, in an image adjacent to the computer-generated face image.
  • the actual toys may be presented to the child.
  • the child may adjust the facial expression (e.g., the shape of the mouth and simultaneously the eyes) to indicate their level of like or dislike for the toy.
  • a toy manufacturer could use this method to pre-test the desirability of different products from the standpoint of the child before deciding upon which toys should be marketed to the public. Those toys that receive high positive ratings from a group of children might be good candidates for the marketplace.
  • a child may express opinions or feelings (e.g., pleasure and displeasure) regarding different aspects of his or her hospital experience as a patient.
  • a series of hospital scenes (e.g., a play room, food service, and needle injection) may be depicted in cartoons presented alongside the face image.
  • the child may adjust the facial expression (e.g., mouth and eyes) to express a level of pleasure or displeasure with the scene depicted in the cartoon.
  • the child expresses a strong positive opinion about the scene where the little girl is playing with the therapy dog ( FIG. 11A ), a weaker positive opinion about a hospital meal ( FIG. 11B ), and a very negative opinion about receiving a needle injection ( FIG. 11C ).
  • Other cartoons might depict the child's room in the hospital, an attending doctor or nurse, the “X-ray room” or the “operating room.”
  • Embodiments of the present invention may allow a hospital to acquire information pertaining to different aspects of their service and facilities, as seen through the eyes of the young patient.
  • a child may express opinions or feelings about a topic or activity depicted in a video displayed (e.g., on a computer or television monitor) beside the face image. Sound (e.g., voices, music, etc.) can also accompany the visual display. As the video is played, the child makes adjustments of the facial expression to indicate pleasure or displeasure about the contents of the video. The child may adjust the facial expression of the face over time in order to represent a continuous sequence of varying opinions regarding scenes that are portrayed in the video. Because every user input (e.g., key stroke) leads to a change in the rating, it is possible to record ratings over the course of whatever events are depicted in the video.
  • the video may depict a short story of a child entering the hospital, being admitted, arriving at a patient room, having X-rays taken, lying sick in bed, and then later being discharged.
  • a child patient being tested at the computer may be asked to express how they felt throughout the video as different scenes were played out.
  • the child may change the facial expression according to how various parts of the story made him/her feel. This information could be used by hospital personnel to help identify those aspects of a child's hospital stay that were judged favorably and unfavorably.
  • a television producer may wish to pretest an educational video program intended for children by determining which parts of a storyline were well received and which parts were not well received. An iterative procedure may be followed whereby the story is shown and a group of children are tested using the computerized assessment system. After analyzing the data, the producer may wish to modify or omit the scenes that did not elicit the intended emotional response. This may be followed by tests on a new group of children. This procedure could continue until the children expressed the desired sequence of emotions over the course of the story, which met the goals of either the producer or of those individuals producing the video (e.g., parents or educators). Such information regarding children's opinions could assist television producers or educators in creating favorably-rated programs for children. This particular type of dynamic judgment over time may be cumbersome or impossible to implement if the child had to make repeated selections from a static, photographic series.
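  • A continuous record of this kind might be gathered by sampling the current rating at a fixed interval while the video plays, as in the short Python sketch below. This is an illustrative sketch only; the polling interval, function names, and the stand-in rating source are assumptions and are not specified by the patent:

        import time

        def rate_video(get_current_rating, duration_s, interval_s=1.0):
            """Poll the rating control periodically for the length of the video."""
            series = []
            start = time.monotonic()
            while (elapsed := time.monotonic() - start) < duration_s:
                series.append((round(elapsed, 1), get_current_rating()))
                time.sleep(interval_s)
            return series

        # A real system would read the value currently set with the up/down controls;
        # here a constant stand-in is used so the sketch runs on its own.
        print(rate_video(lambda: 0, duration_s=3, interval_s=1.0))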
  • the computerized assessment system and method may also be used to assess opinions or feelings of adult subjects.
  • the computerized assessment system and method may be particularly useful for adults who are disabled or who cannot read.
  • Those skilled in the art will also recognize numerous other applications for the computerized assessment system and method.
  • embodiments of the present invention include a computerized system and method for assessing opinions or feelings of a subject.
  • the computerized method includes displaying at least one computer-generated face image having a variable facial expression capable of changing dynamically to correspond to opinions or feelings of the subject.
  • the method also includes receiving at least one user input signal for changing the variable facial expression in accordance with opinions or feelings of the subject and displaying changes in the variable facial expression of the computer-generated face image in response to the user input signal.
  • the variable facial expression changes dynamically until a selected facial expression is displayed.
  • the computerized assessment system includes a display configured to display at least one computer-generated face image having a variable facial expression capable of changing dynamically to correspond to opinions or feelings of a subject.
  • the system also includes a user input device configured to generate at least one user input signal for changing the variable facial expression in accordance with opinions or feelings of the subject.
  • a computer is configured to change the variable facial expression of the computer-generated face image on the display in response to the user input signal.

Abstract

A computerized assessment system and method may be used to assess opinions or feelings of a subject (e.g., a child patient). The system and method may display a computer-generated face image having a variable facial expression (e.g., changing mouth and eyes) capable of changing to correspond to opinions or feelings of the subject (e.g., smiling or frowning). The system and method may receive a user input signal in accordance with the opinions or feelings of the subject and may display the changes in the variable facial expression in response to the user input signal. The system and method may also prompt the subject to express an opinion or feeling about a matter to be assessed.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • This application claims the benefit of co-pending U.S. Provisional Patent Application Ser. No. 60/634,709, filed on Dec. 9, 2004, which is fully incorporated herein by reference.
  • STATEMENT REGARDING FEDERALLY SPONSORED RESEARCH OR DEVELOPMENT
  • This invention was made with government support under SBIR grant No. 2 R44 NS042387-02 awarded by the National Institutes of Health. The Government has certain rights in the invention.
  • TECHNICAL FIELD
  • The present invention relates to methods for assessing opinions or feelings and, more particularly, to a computerized assessment system and method for assessing opinions or feelings by allowing a subject to dynamically adjust a variable facial expression on a computer-generated face image.
  • BACKGROUND INFORMATION
  • Studies have been performed using static facial expressions to assess a child's self-reported pain. Instruments used in the studies include a small set of cartoon faces or photographs to represent different degrees of pain. In a common application, the Faces Pain Rating Scale, six cartoons are employed and the child (e.g., age 3 yrs or older) is told that the faces are pictures of someone who is very happy because he doesn't hurt at all, hurts just a little bit, hurts even more, hurts a whole lot, and hurts as much as you can imagine. As shown in FIG. 1, each of the faces 110 a-110 f includes an outline of the head containing a mouth, nose, two eyes, and eyebrows. The faces 110 a-110 f may be distinguished one from the other by the shape of the mouth (smiling, neutral, or frowning), by the location of the eyebrows (curving down, curving up, or touching the tops of the eyes) and by the character of the eyes (light or dark, no tears, tears). The child is asked to choose the one static face that best describes how he/she is feeling at the moment.
  • These and related studies are described in various publications, incorporated herein by reference and identified as follows:
  • Beyer, J. E. (1984). “The Oucher” a User's Manual and Technical Report. Evanston, Ill.: Judson Press.
  • Bieri, D., and others (1990). The Faces Pain Scale for the self-assessment of the severity of pain experienced by children: development, initial validation, and preliminary investigation for the ratio scale properties, Pain, 41 (2), 139-150.
  • Buchanan, L., Voigtman, J., & Mills, H. (1997). Implementing the agency for health care policy and research pain management pediatric guideline in a multicultural practice setting. J. Nurs. Care Qual., 11(3), 23-35.
  • Keck, J. F., Gerkensmeyer, J. E., Joyce, B. A., & Schade, J. G. (1996). Reliability and validity of the faces and word descriptor scales to measure procedural pain. Journal of Pediatric Nursing, 11(6), 368-374.
  • McGrath, P. A., & Gillespie, J. (2001). Pain assessment in children and adolescents (pp. 97-118). In (D. C. Turk & R. Melzack, Eds.) Handbook of pain assessment, 2nd edition, The Guilford Press: New York.
  • McRae, M. E., Rourke, D. A., Imperial-Perez, F. A., Eisenrigh, C. M., & Ueda, J. N. (1997). Development of a research-based standard for assessment, intervention, and evaluation of pain after neonatal and pediatric cardiac surgery. Pediatric Nursing, 23(3), 263-271.
  • Sporrer, K. A., Jackson, S. M., Agner, S., Laver, J., & Abboud, M. R. (1994). Pain in children and adolescents with sickle cell anemia: a prospective study utilizing self-reporting. The American Journal of Pediatric Hematology/Oncology, 16(3), 219-224.
  • Tyler, D., Douthit, A., & Chapman, C. (1993). Toward validation of pain measurement tools for children: a pilot study. Pain, 52, 301-309.
  • West, N., Oakes, L., and others (1994). Measuring pain in pediatric oncology ICU patients. Journal of Pediatric Oncology Nursing, 11(2), 64-68.
  • Wong, D. L. (1999). Whaley & Wong's nursing care of infants and children (6th edition). St. Louis, Mo.: Mosby Year-Book.
  • Wong, D. L., & Baker, C. (1988). Pain in children: comparison of assessment scales, Pediatr. Nurs. 14(1), 9-17.
  • The static methods described in these publications raise questions about whether they fulfill the assumptions of an interval scale of measurement, employ an optimum mode of implementation, and provide a usable guide for the interpretation of results.
  • One measurement problem is that there is no continuous variable that is associated with the facial expression and that changes systematically as the emotion being depicted ranges from the extremes of “no hurt” to “hurts worst.” Thus, there is no objective measure associated with the emotional or painful intensity being indicated. Another problem is that the number and type of features depicted in each face is not always the same. In FIG. 1, for example, eyelids are not present in faces 110 a-110 c but are present in faces 110 d-110 f; and tears are present in face 110 f but not in the other faces 110 a-110 e. Thus, there is no single change in the face that is uniquely linked to differences in the emotional intensity supposedly depicted. A further problem is that the Face Scale is asymmetric. There are two smiling faces 110 a, 110 b, one neutral face 110 c, and three frowning faces 110 d-110 f. The investigators (Wong and Baker, 1988) who developed the Face Scale for measuring pain believe that the subjective differences between each face (in terms of pain) are equal, and hence, that the scale is an interval scale permitting the use of parametric statistics (e.g., means and standard deviations of ratings provided by groups of respondents). However, there appears to be no published (and peer reviewed) evidence to support this assumption.
  • One implementation problem is that the verbal instructions given to the child may not match the facial expression being expressed. For example, the face 110 b labeled by the investigator as “hurts little bit” is depicted as smiling, not frowning. Another implementation problem is that such methods do not allow the child to continuously change or fine-tune their judgments before deciding on a final judgment. A further implementation problem is that static methods do not allow for the recording of dynamic changes over time. The specific nature of these dynamic changes may provide insight into the judgment strategy being followed by the child, as well as provide a sensitive measure of the child's response to treatment. Another implementation problem is that ratings associated with the selected faces are not automatically stored in a computer file for later analysis. The ratings must be entered into the computer by hand, thus increasing the time burden on the investigator or clinician, and increasing the chance of error in the input of data.
  • The confluence of the measurement and implementation problems suggests that the Faces Pain Rating Scale, as well as the closely related picture scales described in the works cited here, can only be considered a “nominal” scale in that no measure of quantitative difference can be inferred from the child's choice of one face over the others (Baird, J. C. & Noma, E. (1978), Fundamentals of Scaling and Psychophysics, Chap. 1, John Wiley & Sons: New York; Stevens, S. S. (1946). On the theory of scales of measurement. Science, 103, 677-680.) In the case of assessing a child's pain, the nominal scale only allows one to make statements of the following sort: “the pain level represented by this number (face) is different from the pain level represented by some other number (face) in the series.” The relative intensity of pain through comparison of one face with another may not be assessed.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • These and other features and advantages will be better understood by reading the following detailed description, taken together with the drawings wherein:
  • FIG. 1 is an illustration of the static faces on a conventional faces pain rating scale.
  • FIG. 2 is a diagrammatic illustration of a computerized assessment system for assessing feelings or opinions by allowing dynamic adjustment of a variable facial expression on a computer-generated face image, consistent with one embodiment of the present invention.
  • FIG. 3 is a diagrammatic illustration of a computerized assessment system for assessing feelings or opinions implemented on a desktop computer, consistent with one embodiment of the present invention.
  • FIG. 4 is a diagrammatic illustration of a computerized assessment system for assessing feelings or opinions implemented on a handheld computer, consistent with another embodiment of the present invention.
  • FIGS. 5 and 6 are schematic block diagrams of a computerized assessment system for assessing opinions or feelings, consistent with different embodiments of the present invention.
  • FIG. 7 is an illustration of a range of possible facial expressions of the computer-generated face image, consistent with one embodiment of the present invention.
  • FIG. 8 is a graphical illustration of an algorithm used to change a facial expression on the computer-generated face image, consistent with one embodiment of the present invention.
  • FIG. 9 is a flow chart illustrating a method of assessing opinions or feelings, consistent with one embodiment of the present invention.
  • FIGS. 10A-10C are illustrations of computer-generated face images together with different graphical representations of toys being assessed, consistent with one application of the present invention.
  • FIGS. 11A-11C are illustrations of computer-generated face images together with different graphical representations of activities in a medical facility being assessed, consistent with another application of the present invention.
  • FIG. 12 is an illustration of a computer-generated face image together with a video being assessed, consistent with a further application of the present invention.
  • DETAILED DESCRIPTION
  • Referring to FIG. 2, a computerized assessment system 200 may be used to assess feelings or opinions of a subject by allowing the dynamic adjustment of a variable facial expression on a computer-generated face image 210. The computerized assessment system 200 may be used to assess any feelings or opinions of a subject including, but not limited to, pain, anxiety, fear, happiness/sadness, pleasure/displeasure, and likes/dislikes. The computerized assessment system 200 may be used to assess feelings or opinions of a child or other individual (e.g., an adult with a reading disability). The computerized assessment system 200 may be used by the subject alone or together with one or more individuals conducting the assessment. One exemplary application for the computerized rating system 200 is to assess the pain felt by a child in a medical facility. Other applications are also described in greater detail below.
  • The computerized assessment system 200 may include a display 202 for displaying the computer-generated face image 210 and a user input device 204 for controlling the dynamic adjustment of the variable facial expression on the face image 210. The user input device 204 may provide user input signals that cause the display 202 to change the facial expression on the face image 210. The face image 210 may include a two-dimensional schema of at least a head 212, mouth 214, eyes 216, and a nose 218, which may be proportionally sized and translated within the x-y plane. Those skilled in the art will recognize that the face image 210 may be represented in other ways (e.g., a more detailed three-dimensional image).
  • According to one embodiment, the variable facial expression of the face image 210 may be dynamically adjusted by changing the upward and downward curvature of the line representing the mouth 214 (i.e., smiling and frowning) and the opening and closing of the lines representing the eyes 216. These facial features may be changed without changing the circle representing the head 212 and the line representing the nose 218. The user input device 204 may include one or more controls 230, 232 to control the dynamic changes in the expression of the face image 210. For example, one control 230 may provide a positive user input signal that causes the face 210 to change positively (e.g., mouth 214 curves upward and eyes 216 open) and another control 232 may provide a negative user input signal that causes the face 210 to change negatively (e.g., mouth 214 curves downward and eyes 216 close). The controls 230, 232 may include up and down arrows corresponding to the direction of movement. Those skilled in the art will recognize that other features may also be dynamically adjusted (e.g., changing eyebrows or tears from the eyes).
  • FIGS. 3 and 4 show computerized assessment systems 300, 400 consistent with different embodiments of the present invention. The computerized assessment systems 300, 400 may stand alone or may be coupled to a network, for example, using either a wired or wireless connection.
  • The computerized assessment system 300, consistent with one embodiment, may be implemented using a personal computer (e.g., a desktop or a laptop computer) including a display 302 and one or more input devices 304, 304 a (e.g., a mouse, a keyboard, joystick, or a separate remote control device) coupled to the personal computer. The user (e.g. a subject being assessed or individual conducting the assessment) may depress controls (e.g., buttons 330, 332 on a mouse or keys 330 a, 332 a on a keyboard) on the user input device 304, 304 a to change the expression of a computer-generated face image 310 displayed on the computer display 302. Although a mouse user input device 304 and a keyboard user input device 304 a are shown, the user input device may also include a remote control or a wireless device (not shown) communicating with the personal computer using a wireless protocol. The user input device may also include a joystick (not shown) that the user moves in different directions to provide the user input signals.
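  • By way of illustration, the following short Python sketch shows how such a desktop implementation might bind the up and down arrow keys to positive and negative input signals and redraw a schematic face. It is a minimal sketch, not the patent's implementation: the use of Python/tkinter, the window geometry, the curvature limit, and the step size are assumptions made for this example, while the quadratic mouth curve and the blue/green/red coloring anticipate the algorithm and color scheme described below with reference to FIGS. 7 and 8:

        import tkinter as tk

        LAMBDA_MAX = 1.0   # assumed curvature extreme (maximum smile / maximum frown)
        STEP = 0.1         # assumed step size per activation of the input device

        class FaceWidget:
            """Schematic face whose mouth curvature and eye opening track one scalar."""
            def __init__(self, canvas):
                self.canvas = canvas
                self.lam = 0.0          # neutral expression
                self.redraw()

            def adjust(self, delta):
                """Apply a positive or negative user input signal."""
                self.lam = max(-LAMBDA_MAX, min(LAMBDA_MAX, self.lam + delta))
                self.redraw()

            def redraw(self):
                c = self.canvas
                c.delete("all")
                c.create_oval(50, 50, 250, 250, outline="blue", width=2)   # head (fixed)
                c.create_line(150, 130, 150, 170, fill="blue", width=2)    # nose (fixed)
                color = "green" if self.lam > 0 else "red" if self.lam < 0 else "blue"
                openness = 0.5 + 0.5 * self.lam / LAMBDA_MAX   # eyes open with the smile
                for ex in (110, 190):                          # two eyes
                    c.create_oval(ex - 12, 120 - 14 * openness,
                                  ex + 12, 120 + 14 * openness, outline=color, width=2)
                pts = []                                       # mouth: y = lam * x**2
                for i in range(-20, 21):
                    x = i / 20.0                               # -1 .. +1 across the mouth
                    pts.extend([150 + 40 * x, 210 - 25 * self.lam * x * x])
                c.create_line(*pts, fill=color, width=3, smooth=True)

        root = tk.Tk()
        root.title("Assessment face (sketch)")
        canvas = tk.Canvas(root, width=300, height=300, bg="white")
        canvas.pack()
        face = FaceWidget(canvas)
        root.bind("<Up>", lambda e: face.adjust(+STEP))     # positive input signal
        root.bind("<Down>", lambda e: face.adjust(-STEP))   # negative input signal
        root.mainloop()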
  • The computerized assessment system 400, consistent with another embodiment, may be implemented using a handheld computer (e.g., a personal digital assistant) including a display 402 and an input device 404 located on the handheld computer. For example, the display 402 may be a touch screen display, and the input device 404 may include control images 430, 432 (e.g., arrows) displayed on the display 402. The user may touch the control images 430, 432 on the display 402 (e.g., using a stylus) to change the expression of a computer-generated face image 410 displayed on the display 402. Alternatively, the input device 404 may be implemented using other controls (e.g., keys, push buttons, rollers) located on the handheld computer.
  • The computerized assessment system and method may also prompt the subject to provide an assessment in response to a target stimulus, for example, by providing a visual or audible representation of an item, activity, or concept for which the assessment is to be made. The exemplary computerized assessment systems 300, 400, for example, may display an image 320, 420 on the display 302, 402 in proximity to the computer-generated face image 310, 410. The image 320, 420 may be a photograph, drawing or video depicting the item, activity or concept. Alternatively or additionally, the computerized assessment systems 300, 400 may play an audio clip describing or representing an item, activity, or concept. According to a further alternative, a separate image or audio representation may be provided (e.g., using another device) instead of or in addition to the image and/or audio clip provided by the computerized assessment systems 300, 400. An individual may also provide a verbal query or description of an item, activity, or concept instead of or in addition to the image and/or audio clip provided by the computerized assessment systems 300, 400. Examples of items, activities, or concepts that may be assessed are described in greater detail below.
  • Referring to FIGS. 5 and 6, the computerized assessment system may be implemented using software executed by a computing device. In one embodiment, the assessment software 520 resides on a stand-alone general purpose computer 510 (FIG. 5), such as a PC or handheld computer, which allows the user to access the software 520. Files including images, videos, or audio clips used to prompt the assessment may also be stored on the general purpose computer 510. A display 502 for displaying the computer-generated face image and a user input device 504 for controlling the variable facial expression may be coupled to the stand-alone general purpose computer 510.
  • In another embodiment, the assessment software 620 resides on a server computer 612 (FIG. 6) and is accessed using a computer 610 connected to the server computer 612 over a data network 630, such as a local area network, a wide area network, an intranet, or the Internet. Files including images, videos, or audio clips used to prompt the assessment may also be stored on the server computer 612. A display 602 for displaying the computer-generated face image and a user input device 604 for controlling the variable facial expression may be coupled to the general purpose computer 610.
  • The software 520, 620 can be implemented to perform the functions described herein using programming techniques known to a programmer of ordinary skill in the art. For example, the assessment software 520 on the stand-alone computer 510 can be developed using a programming language such as Basic, and the assessment software 620 residing on the server computer 612 can be developed using a programming language such as Java.
  • Embodiments of the software may be implemented as a computer program product for use with a computer system. Such implementation includes, without limitation, a series of computer instructions that embody all or part of the functionality described herein with respect to the assessment system and method. The series of computer instructions may be stored in any machine-readable medium, such as semiconductor, magnetic, optical or other memory devices, and may be transmitted using any communications technology, such as optical, infrared, microwave, or other transmission technologies. It is expected that such a computer program product may be distributed as a removable machine-readable medium (e.g., a diskette, CD-ROM), preloaded with a computer system (e.g., on system ROM or fixed disk), or distributed from a server or electronic bulletin board over the network (e.g., the Internet or World Wide Web). Alternative embodiments of the invention may be implemented as pre-programmed hardware elements or as a combination of hardware, software and/or firmware.
  • FIG. 7 illustrates seven of the numerous possible expressions of a computer-generated face image 710 a-710 g, consistent with one embodiment of the present invention. By activating the controls (e.g., up and down arrows) of a user input device, the user (e.g., a child subject) can change the mouth in successive steps from a state of smiling (e.g., face image 710 a) to a neutral state (e.g., face image 710 d) to a state of frowning (e.g., face image 710 g). At the same time, the eyes may vary from completely open (associated with the smiling face image 710 a) to half open (associated with neutral face image 710 d) to completely closed (associated with frowning face image 710 g). The user may change the facial expression in either direction (positive or negative) multiple times until satisfied.
  • In one application, the background color of the face image (e.g., white) and the outline of the face and nose (e.g., blue) may not change with the facial expression. The color of the mouth and eyes may be associated with the expression that is being depicted. For example, all features of the face may be “blue” when the mouth is in the neutral position. For all expressions indicating a smile, the mouth and eyes may be “green”, and for all expressions indicating a frown, the mouth and eyes may be “red.” These colors can be changed so long as they do not interfere with the visibility of either the fixed or dynamic facial features. Those skilled in the art will recognize, however, that other colors are possible.
  • A computerized assessment system and method may also assign rating values to the different facial expressions and may record rating values associated with a selected facial expression. The ratings are thus associated with the opinion or feeling of the subject. FIG. 7 shows one example of scaled ratings associated with each of the facial expressions 710 a-710 g, although such rating values may or may not be displayed with the computer-generated face image. In this example, the rating values range from 99 to −99 with positive values associated with smiling faces 710 a-710 c and negative values associated with frowning faces 710 e-710 g. As shown, the expressions may be equally spaced (e.g., 33 units) along an objective scale. The ratings may be recorded, for example, by the computer used to make the assessment (e.g., general purpose computer 510 in FIG. 5) or by another computer on a network (e.g., server computer 612 in FIG. 6).
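  • As a small worked example of this equal spacing, seven expressions stepped in 33-unit increments yield the ratings computed below (Python; the count of seven expressions and the endpoint values follow the example of FIG. 7, everything else is illustrative):

        n = 7                                   # number of discrete expressions in FIG. 7
        step = 198 / (n - 1)                    # 33 rating units between adjacent expressions
        ratings = [round(99 - i * step) for i in range(n)]
        print(ratings)                          # [99, 66, 33, 0, -33, -66, -99]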
  • Referring to FIG. 8, one method of generating changes in facial expression and assigning a rating indicating the subject's opinion or feeling is described in greater detail. An x-y coordinate system 810 may be situated with its origin (zero) at the center of a line 812 a, 812 b representing the mouth. A positive quadratic equation may be used to generate different degrees of smiling and a negative quadratic equation may be used to generate different degrees of frowning. The maximum smile (line 812 a) and the maximum frown (line 812 b) are depicted in FIG. 8 to illustrate the extremes of facial expressions that can be generated.
  • The degree of smiling may be indicated by different curvatures according to the following equation:
    y = λx²   (1)
    where x corresponds to a point along the mouth (i.e., along the x axis) and λ is a scalar multiplier. The value x may range from the left corner (minimum x value) to the center (zero x value) and out to the right corner (maximum x value). The variable y represents the curvature of the mouth (i.e., in the y axis) from the left corner to the right corner, which may be essentially continuous depending on the step size for x. By changing the value of the scalar multiplier λ, the amount of smiling can be varied from neutral to its maximum. By changing the magnitude of the scalar multiplier λ in the equation by small incremental steps, essentially continuous changes in expression can be produced. By changing the magnitude of the scalar multiplier λ in the equation by larger incremental steps, larger, discrete successive changes are possible. The value of the scalar multiplier λ may be under user control, and the user input signals generated by a user input device may be used to increase and decrease the value. The step size determines the degree of change in the facial expression accompanying the activation of the input device.
  • In the same manner, a quadratic equation may be used to generate different degrees of frowning, except that the sign of the expression is negative, as shown in the following equation:
    y = −λx²   (2)
    For this case, as the scalar λ increases, the degree of frowning approaches its maximum.
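  • A minimal Python sketch of equations (1) and (2) is given below; a single signed scalar lam covers both cases (positive for smiling, negative for frowning). The mouth half-width, the number of sample points, and the step size are assumptions for illustration and are not taken from the patent:

        def mouth_points(lam, half_width=1.0, samples=41):
            """Sample y = lam * x**2 from the left mouth corner to the right corner."""
            pts = []
            for i in range(samples):
                x = -half_width + 2 * half_width * i / (samples - 1)
                pts.append((x, lam * x * x))
            return pts

        STEP = 0.05                 # small step: nearly continuous change in expression
        lam = 0.0                   # neutral mouth (a straight line)
        lam += STEP                 # one positive input signal: a slight smile
        lam -= 2 * STEP             # two negative input signals: a slight frown
        print(mouth_points(lam)[0], mouth_points(lam)[-1])   # corner heights equal lam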
  • According to this method, the value of the scalar λ (or some number that is a transformation of λ) may be used as a rating value. In this manner, the range of scalar values from λ to −λ may represent an entire spectrum of opinions or feelings from extremely positive (e.g., no pain, happiness, pleasure) to extremely negative (e.g., pain, sadness, displeasure). Thus, at each point in time during the adjustment process, the value of the scalar indicates the rating, and the ratings may change dynamically with the changes in the facial expression. When the subject stops changing the facial expression, the final rating may be taken to indicate the user's final opinion or feeling with regard to the target stimulus or the concept represented by the target stimulus. In one embodiment, a convenient range of values for the ratings can be obtained by the appropriate scaling of the scalar λ to create a range of possible scores or ratings from −100 to +100.
  • The opening and closing of the eyes on the face image may be scaled in magnitude with the smiling and frowning of the mouth. The eyes may thus range from ‘completely open’ when the mouth shows maximum smiling, to ‘half open’ when the mouth is neutral, and then to ‘completely closed’ when the mouth shows maximum frowning.
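  • The two linear rescalings just described (scalar to rating, scalar to eye opening) might be sketched as follows; the symmetric range assumed for the scalar and the function names are illustrative only:

        LAMBDA_MAX = 1.0                              # assumed extreme of the scalar

        def rating_from_lambda(lam):
            """Scale lam in [-LAMBDA_MAX, +LAMBDA_MAX] to a rating in [-100, +100]."""
            return round(100 * lam / LAMBDA_MAX)

        def eye_openness(lam):
            """1.0 at maximum smile, 0.5 at neutral, 0.0 at maximum frown."""
            return 0.5 + 0.5 * lam / LAMBDA_MAX

        print(rating_from_lambda(0.5), eye_openness(0.5))   # 50 0.75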
  • Those skilled in the art will recognize that other techniques may be used to dynamically change the facial expression. For example, a series of still images may be displayed, each having an incremental change in the facial feature(s) (e.g., the mouth and eyes). The still images may be displayed sequentially to the user such that the mouth and/or eyes dynamically change. Each of the still images may be associated with a rating value.
  • Referring to FIG. 9, one method of assessing opinions or feelings is described. The assessment system displays 910 the computer-generated face image including the variable facial expression. Initially, the computer-generated face image may be displayed with a neutral expression (e.g., face image 710 d in FIG. 7). The subject being assessed may be prompted 912 to express an opinion or feeling on a particular matter (e.g., an item, activity or concept). As described above, the assessment system or an individual performing the assessment may prompt the subject. One example of a prompt is a single verbal query, such as “How does the bump on your head make you feel?” or “How does playing with a dog make you feel?” Another example of a prompt is a nonverbal auditory event (e.g., a barking dog, thunder). A further example of a prompt is a photograph or drawing (e.g., a picture of a person, a toy, or a food type). Yet another example of a prompt is a changing display, such as a video and/or audio depiction of an event that unfolds over time.
  • The assessment system may then receive 914 a user input signal provided by the subject based on the subject's opinion or feeling. If the subject feels more positive than the computer-generated face displayed initially, for example, the subject may provide a positive input signal (e.g., using a control with an up arrow). If the subject feels more negative than the computer-generated face displayed initially, the subject may provide a negative input signal (e.g., using a control with a down arrow). In response to the user input signal, the variable facial expression on the computer-generated face changes 916. The user input signal may be received repeatedly and the facial expression may change dynamically as the user makes adjustments until the facial expression of the computer-generated face corresponds to the subject's opinion or feeling on a particular matter. The system thus allows the subject to adjust the expression of the face and to continue making adjustments until satisfied that the expression (e.g., one of the expressions shown in FIG. 7) represents their opinions or feelings about a particular matter (e.g., item, activity, concept).
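The adjust-until-satisfied interaction of FIG. 9 might look roughly like the following console sketch; the prompting, key names, step size, and rating scale are assumptions made only for illustration.

```python
# Illustrative sketch only: display/prompt (steps 910, 912), read inputs
# (step 914), and change the expression (step 916) until the subject is
# satisfied, then return the rating for the selected facial expression.

def run_assessment(prompt_text, step=0.1, lam_max=1.0):
    lam = 0.0  # start with a neutral expression
    print(prompt_text)
    while True:
        rating = round(100.0 * lam / lam_max)
        print(f"current expression rating: {rating:+d}")
        key = input("Enter + (more positive), - (more negative), or press Return when satisfied: ")
        if key == "+":
            lam = min(lam_max, lam + step)   # more smiling
        elif key == "-":
            lam = max(-lam_max, lam - step)  # more frowning
        else:
            return rating                    # selected facial expression reached
```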
  • The assessment system may also store or record 918 a rating value associated with the displayed facial expression and thus the subject's opinion or feeling. The rating value may be stored or written to a data file for later analysis. The data file may be given a code name for each user or subject for whom ratings are recorded. The rating values may be recorded together with information identifying the visual and/or audible representation prompting the assessment. If multiple stimulus conditions are employed in the same test trial (e.g., feelings about each of a series of pictures are expressed), for example, each rating value and its associated picture label may be stored in the file for later analysis.
  • In addition to storing a final rating value, rating values may be recorded in discrete steps as the subject changes the facial expression to permit the rating process to be reviewed and evaluated by an investigator at a later point in time. In addition, each keystroke or user input can be stored in a file, as well as the time associated with each input. The assessment system and method may thus recreate the dynamic changes leading up to each final rating and the time required to make each change in a rating. Such information may prove valuable when attempting to understand the process used by the subject in making the assessment.
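A sketch of such a record follows; the CSV layout, subject code name, and field names are assumptions rather than a format prescribed by the patent.

```python
# Illustrative sketch only: append one row per user input so that the final
# rating and the full time course of the adjustments can be reviewed later.

import csv
import time

def record_input(writer, subject_code, stimulus_label, key, rating):
    writer.writerow([subject_code, stimulus_label, key, rating, time.time()])

with open("ratings_subjectA.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["subject", "stimulus", "input", "rating", "timestamp"])
    record_input(writer, "subjectA", "teddy_bear", "+", 10)  # example keystrokes
    record_input(writer, "subjectA", "teddy_bear", "+", 20)  # last row holds the final rating
```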
  • According to one application of the system, illustrated in FIGS. 10A-10C, a child may express opinions or feelings (e.g., likes and dislikes) about the desirability of different toys. A series of toys may be presented singly, for example, in an image adjacent to the computer-generated face image. Alternatively, the actual toys may be presented to the child. The child may adjust the facial expression (e.g., the shape of the mouth and simultaneously the eyes) to indicate their level of like or dislike for the toy. A toy manufacturer could use this method to pre-test the desirability of different products from the standpoint of the child before deciding upon which toys should be marketed to the public. Those toys that receive high positive ratings from a group of children might be good candidates for the marketplace. Those toys that receive negative ratings might not be marketed or might be withdrawn from the market in the event they had already been offered for sale. In the present example, the child clearly likes the “teddy bear” (FIG. 10A), likes the “robot” less (FIG. 10B), and does not like the “puppet” (FIG. 10C).
  • According to another application of the system, illustrated in FIGS. 11A-11C, a child may express opinions or feelings (e.g., pleasure and displeasure) regarding different aspects of his or her hospital experience as a patient. A series of hospital scenes (e.g., a play room, food service, and needle injection) may be presented singly, for example, in cartoon images next to the computer-generated face image. The child may adjust the facial expression (e.g., mouth and eyes) to express a level of pleasure or displeasure with the scene depicted in the cartoon. In the present example, the child expresses a strong positive opinion about the scene where the little girl is playing with the therapy dog (FIG. 11A), a weaker positive opinion about a hospital meal (FIG. 11B), and a very negative opinion about receiving a needle injection (FIG. 11C). Other cartoons might depict the child's room in the hospital, an attending doctor or nurse, the “X-ray room” or the “operating room.”
  • An assessment may thus be obtained from the child without requiring an intervention by adult relatives, guardians or friends who may not be able to accurately determine the nature (positive or negative valence) of the child's experiences. Information regarding patient opinions could assist a hospital staff in making improvements in their service and in anticipating problems they might encounter in accommodating younger patients. It has become increasingly important for hospitals to assess the opinions of their patients about their treatment and care during hospital stays. Embodiments of the present invention may allow a hospital to acquire information pertaining to different aspects of their service and facilities, as seen through the eyes of the young patient.
  • According to another application, shown in FIG. 12, a child may express opinions or feelings about a topic or activity depicted in a video displayed (e.g., on a computer or television monitor) beside the face image. Sound (e.g., voices, music, etc.) can also accompany the visual display. As the video is played, the child makes adjustments of the facial expression to indicate pleasure or displeasure about the contents of the video. The child may adjust the facial expression over time in order to represent a continuous sequence of varying opinions regarding scenes that are portrayed in the video. Because every user input (e.g., keystroke) leads to a change in the rating, it is possible to record ratings over the course of whatever events are depicted in the video.
  • In one application, the video may depict a short story of a child entering the hospital, being admitted, arriving at a patient room, having X-rays taken, lying sick in bed, and then later being discharged. A child patient being tested at the computer may be asked to express how they felt throughout the video as different scenes were played out. The child may change the facial expression according to how various parts of the story made him/her feel. This information could be used by hospital personnel to help identify those aspects of a child's hospital stay that were judged favorably and unfavorably.
  • In another application, a television producer may wish to pretest an educational video program intended for children by determining which parts of a storyline were well received and which parts were not. An iterative procedure may be followed whereby the story is shown and a group of children are tested using the computerized assessment system. After analyzing the data, the producer may wish to modify or omit the scenes that did not elicit the intended emotional response. This may be followed by tests on a new group of children. This procedure could continue until the children expressed the desired sequence of emotions over the course of the story, meeting the goals of the producer or of the other individuals involved in producing the video (e.g., parents or educators). Such information regarding children's opinions could assist television producers or educators in creating favorably rated programs for children. This particular type of dynamic judgment over time may be cumbersome or impossible to implement if the child had to make repeated selections from a static, photographic series.
  • Although the exemplary embodiments describe a use for assessing children's opinions or feelings, the computerized assessment system and method may also be used to assess opinions or feelings of adult subjects. The computerized assessment system and method may be particularly useful for adults who are disabled or who cannot read. Those skilled in the art will also recognize numerous other applications for the computerized assessment system and method.
  • In summary, embodiments of the present invention include a computerized system and method for assessing opinions or feelings of a subject. Consistent with one embodiment of the invention, the computerized method includes displaying at least one computer-generated face image having a variable facial expression capable of changing dynamically to correspond to opinions or feelings of the subject. The method also includes receiving at least one user input signal for changing the variable facial expression in accordance with opinions or feelings of the subject and displaying changes in the variable facial expression of the computer-generated face image in response to the user input signal. The variable facial expression changes dynamically until a selected facial expression is displayed.
  • The computerized assessment system includes a display configured to display at least one computer-generated face image having a variable facial expression capable of changing dynamically to correspond to opinions or feelings of a subject. The system also includes a user input device configured to generate at least one user input signal for changing the variable facial expression in accordance with opinions or feelings of the subject. A computer is configured to change the variable facial expression of the computer-generated face image on the display in response to the user input signal.
  • While the principles of the invention have been described herein, it is to be understood by those skilled in the art that this description is made only by way of example and not as a limitation as to the scope of the invention. Other embodiments are contemplated within the scope of the present invention in addition to the exemplary embodiments shown and described herein. Modifications and substitutions by one of ordinary skill in the art are considered to be within the scope of the present invention, which is not to be limited except by the following claims.

Claims (20)

1. A computerized method for assessing opinions or feelings of a subject, said method comprising:
displaying at least one computer-generated face image, said computer-generated face image having a variable facial expression capable of changing dynamically to correspond to opinions or feelings of said subject;
receiving at least one user input signal for changing said variable facial expression in accordance with opinions or feelings of said subject; and
displaying changes in said variable facial expression of said at least one computer-generated face image in response to said user input signal, wherein said variable facial expression changes dynamically until a selected facial expression is displayed.
2. The computerized method of claim 1 wherein a range of rating values are associated with said variable facial expression, and further comprising storing at least one rating value associated with at least one selected facial expression.
3. The computerized method of claim 1 wherein said at least one computer-generated face image includes a mouth, and wherein a curvature of said mouth is capable of being changed to provide said variable facial expression.
4. The computerized method of claim 3 wherein said at least one computer-generated face image includes eyes, and wherein said eyes are capable of opening and closing to provide said variable facial expression.
5. The computerized method of claim 1 further comprising prompting said subject to express an opinion or feeling about a matter.
6. The computerized method of claim 5 wherein prompting said subject includes displaying a visual representation of said matter in proximity to said computer-generated face image.
7. The computerized method of claim 6 wherein said visual representation includes at least one picture.
8. The computerized method of claim 6 wherein said visual representation includes at least one video.
9. The computerized method of claim 5 wherein prompting said subject includes playing an auditory representation of said matter about which said subject has an opinion or feeling.
10. The computerized method of claim 9 wherein said auditory representation includes at least one verbal message.
11. The computerized method of claim 9 wherein said auditory representation includes at least one nonverbal auditory event.
12. The computerized method of claim 5 wherein prompting the subject includes providing an auditory query regarding said matter about which the subject has an opinion or feeling.
13. The computerized method of claim 5 wherein prompting said subject includes asking said subject to express feelings of pain felt by said subject.
14. A machine-readable medium whose contents cause a computer system to perform a method for assessing opinions or feelings of a subject, said method comprising:
displaying at least one computer-generated face image, said computer-generated face image having a variable facial expression capable of changing dynamically to correspond to opinions or feelings of said subject;
receiving at least one user input signal for changing said variable facial expression in accordance with opinions or feelings of said subject; and
displaying changes in said variable facial expression of said at least one computer-generated face image in response to said user input signal, wherein said variable facial expression changes dynamically until a selected facial expression is displayed.
15. The machine-readable medium of claim 14 wherein said method further comprises prompting said subject to express an opinion or feeling about a matter.
16. The machine-readable medium of claim 14 wherein said at least one computer-generated face image includes a mouth, and wherein a curvature of said mouth is capable of being changed to provide said variable facial expression.
17. The machine-readable medium of claim 16 wherein said at least one computer-generated face image includes eyes, and wherein said eyes are capable of opening and closing to provide said variable facial expression.
18. A computerized assessment system for assessing opinions or feelings of a subject, said system comprising:
a display configured to display at least one computer-generated face image having a variable facial expression capable of changing dynamically to correspond to opinions or feelings of a subject;
a user input device configured to generate at least one user input signal for changing said variable facial expression in accordance with opinions or feelings of said subject; and
a computer configured to change said variable facial expression of said computer-generated face image on said display in response to said user input signal.
19. The computerized assessment system of claim 18 wherein said at least one computer-generated face image includes a mouth, and wherein a curvature of said mouth is capable of being changed to provide said variable facial expression.
20. The computerized assessment system of claim 18 wherein said at least one computer-generated face image includes eyes, and wherein said eyes are capable of opening and closing to provide said variable facial expression.
US11/297,498 2004-12-09 2005-12-08 Computerized assessment system and method for assessing opinions or feelings Abandoned US20060128263A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US11/297,498 US20060128263A1 (en) 2004-12-09 2005-12-08 Computerized assessment system and method for assessing opinions or feelings

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US63470904P 2004-12-09 2004-12-09
US11/297,498 US20060128263A1 (en) 2004-12-09 2005-12-08 Computerized assessment system and method for assessing opinions or feelings

Publications (1)

Publication Number Publication Date
US20060128263A1 true US20060128263A1 (en) 2006-06-15

Family

ID=36584633

Family Applications (1)

Application Number Title Priority Date Filing Date
US11/297,498 Abandoned US20060128263A1 (en) 2004-12-09 2005-12-08 Computerized assessment system and method for assessing opinions or feelings

Country Status (1)

Country Link
US (1) US20060128263A1 (en)

Cited By (44)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070224584A1 (en) * 2006-03-02 2007-09-27 Michelle Hokanson Apparatus for use in assessing pain level
US20070255696A1 (en) * 2006-04-28 2007-11-01 Choicebot Inc. System and Method for Assisting Computer Users to Search for and Evaluate Products and Services, Typically in a Database
US20080275830A1 (en) * 2007-05-03 2008-11-06 Darryl Greig Annotating audio-visual data
US20090009341A1 (en) * 2007-07-05 2009-01-08 Alexander Gak Device, method and/or system for monitoring the condition of a subject
US20100298659A1 (en) * 2009-05-20 2010-11-25 Triage Wireless, Inc. Body-worn system for continuously monitoring a patient's bp, hr, spo2, rr, temperature, and motion; also describes specific monitors for apnea, asy, vtac, vfib, and 'bed sore' index
US20110224498A1 (en) * 2010-03-10 2011-09-15 Sotera Wireless, Inc. Body-worn vital sign monitor
US20120148161A1 (en) * 2010-12-09 2012-06-14 Electronics And Telecommunications Research Institute Apparatus for controlling facial expression of virtual human using heterogeneous data and method thereof
US8321004B2 (en) 2009-09-15 2012-11-27 Sotera Wireless, Inc. Body-worn vital sign monitor
US8364250B2 (en) 2009-09-15 2013-01-29 Sotera Wireless, Inc. Body-worn vital sign monitor
US8437824B2 (en) 2009-06-17 2013-05-07 Sotera Wireless, Inc. Body-worn pulse oximeter
US8475370B2 (en) 2009-05-20 2013-07-02 Sotera Wireless, Inc. Method for measuring patient motion, activity level, and posture along with PTT-based blood pressure
US8527038B2 (en) 2009-09-15 2013-09-03 Sotera Wireless, Inc. Body-worn vital sign monitor
US8545417B2 (en) 2009-09-14 2013-10-01 Sotera Wireless, Inc. Body-worn monitor for measuring respiration rate
US8602997B2 (en) 2007-06-12 2013-12-10 Sotera Wireless, Inc. Body-worn system for measuring continuous non-invasive blood pressure (cNIBP)
US8740802B2 (en) 2007-06-12 2014-06-03 Sotera Wireless, Inc. Body-worn system for measuring continuous non-invasive blood pressure (cNIBP)
US8747330B2 (en) 2010-04-19 2014-06-10 Sotera Wireless, Inc. Body-worn monitor for measuring respiratory rate
US8888700B2 (en) 2010-04-19 2014-11-18 Sotera Wireless, Inc. Body-worn monitor for measuring respiratory rate
US8979765B2 (en) 2010-04-19 2015-03-17 Sotera Wireless, Inc. Body-worn monitor for measuring respiratory rate
WO2015164260A1 (en) * 2014-04-21 2015-10-29 Acevedo William Carl Method and apparatus for providing state identification
US9173594B2 (en) 2010-04-19 2015-11-03 Sotera Wireless, Inc. Body-worn monitor for measuring respiratory rate
US9173593B2 (en) 2010-04-19 2015-11-03 Sotera Wireless, Inc. Body-worn monitor for measuring respiratory rate
US20160063317A1 (en) * 2013-04-02 2016-03-03 Nec Solution Innovators, Ltd. Facial-expression assessment device, dance assessment device, karaoke device, and game device
JP2016049438A (en) * 2014-08-28 2016-04-11 岡崎 章 Psychological amount measurement device that quantitatively measures psychology of person, and measuring method
JP2016049433A (en) * 2014-08-28 2016-04-11 岡崎 章 Emotion expression method, emotion expression apparatus, and computer program
US9313344B2 (en) 2012-06-01 2016-04-12 Blackberry Limited Methods and apparatus for use in mapping identified visual features of visual images to location areas
US9339209B2 (en) 2010-04-19 2016-05-17 Sotera Wireless, Inc. Body-worn monitor for measuring respiratory rate
US9364158B2 (en) 2010-12-28 2016-06-14 Sotera Wirless, Inc. Body-worn system for continuous, noninvasive measurement of cardiac output, stroke volume, cardiac power, and blood pressure
US9439574B2 (en) 2011-02-18 2016-09-13 Sotera Wireless, Inc. Modular wrist-worn processor for patient monitoring
US20180104822A1 (en) * 2016-10-13 2018-04-19 Toyota Jidosha Kabushiki Kaisha Communication device
US10064547B2 (en) * 2013-08-27 2018-09-04 Johnson & Johnson Vision Care, Inc. Means and method for demonstrating the effects of low cylinder astigmatism correction
US10357187B2 (en) 2011-02-18 2019-07-23 Sotera Wireless, Inc. Optical sensor for measuring physiological properties
US10420476B2 (en) 2009-09-15 2019-09-24 Sotera Wireless, Inc. Body-worn vital sign monitor
CN111093913A (en) * 2017-09-11 2020-05-01 Groove X 株式会社 Action-independent robot watching opposite side
US10806351B2 (en) 2009-09-15 2020-10-20 Sotera Wireless, Inc. Body-worn vital sign monitor
US10827973B1 (en) 2015-06-30 2020-11-10 University Of South Florida Machine-based infants pain assessment tool
US11030662B2 (en) * 2007-04-16 2021-06-08 Ebay Inc. Visualization of reputation ratings
US11202604B2 (en) 2018-04-19 2021-12-21 University Of South Florida Comprehensive and context-sensitive neonatal pain assessment system and methods using multiple modalities
US11253169B2 (en) 2009-09-14 2022-02-22 Sotera Wireless, Inc. Body-worn monitor for measuring respiration rate
US11330988B2 (en) 2007-06-12 2022-05-17 Sotera Wireless, Inc. Body-worn system for measuring continuous non-invasive blood pressure (cNIBP)
USD969216S1 (en) * 2021-08-25 2022-11-08 Rebecca Hadley Educational poster
US11607152B2 (en) 2007-06-12 2023-03-21 Sotera Wireless, Inc. Optical sensors for use in vital sign monitoring
US11631280B2 (en) 2015-06-30 2023-04-18 University Of South Florida System and method for multimodal spatiotemporal pain assessment
JP7266229B1 (en) * 2022-11-26 2023-04-28 株式会社Kansei Design Human psychological state evaluation tool
US11896350B2 (en) 2009-05-20 2024-02-13 Sotera Wireless, Inc. Cable system for generating signals for detecting motion and measuring vital signs

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5880731A (en) * 1995-12-14 1999-03-09 Microsoft Corporation Use of avatars with automatic gesturing and bounded interaction in on-line chat session
US5732232A (en) * 1996-09-17 1998-03-24 International Business Machines Corp. Method and apparatus for directing the expression of emotion for a graphical user interface

Cited By (88)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070224584A1 (en) * 2006-03-02 2007-09-27 Michelle Hokanson Apparatus for use in assessing pain level
US20070255696A1 (en) * 2006-04-28 2007-11-01 Choicebot Inc. System and Method for Assisting Computer Users to Search for and Evaluate Products and Services, Typically in a Database
US8326890B2 (en) * 2006-04-28 2012-12-04 Choicebot, Inc. System and method for assisting computer users to search for and evaluate products and services, typically in a database
US11030662B2 (en) * 2007-04-16 2021-06-08 Ebay Inc. Visualization of reputation ratings
US11763356B2 (en) 2007-04-16 2023-09-19 Ebay Inc. Visualization of reputation ratings
US20080275830A1 (en) * 2007-05-03 2008-11-06 Darryl Greig Annotating audio-visual data
US8126220B2 (en) * 2007-05-03 2012-02-28 Hewlett-Packard Development Company L.P. Annotating stimulus based on determined emotional response
US9668656B2 (en) 2007-06-12 2017-06-06 Sotera Wireless, Inc. Body-worn system for measuring continuous non-invasive blood pressure (cNIBP)
US10765326B2 (en) 2007-06-12 2020-09-08 Sotera Wirless, Inc. Body-worn system for measuring continuous non-invasive blood pressure (cNIBP)
US8602997B2 (en) 2007-06-12 2013-12-10 Sotera Wireless, Inc. Body-worn system for measuring continuous non-invasive blood pressure (cNIBP)
US9215986B2 (en) 2007-06-12 2015-12-22 Sotera Wireless, Inc. Body-worn system for measuring continuous non-invasive blood pressure (cNIBP)
US9161700B2 (en) 2007-06-12 2015-10-20 Sotera Wireless, Inc. Body-worn system for measuring continuous non-invasive blood pressure (cNIBP)
US11607152B2 (en) 2007-06-12 2023-03-21 Sotera Wireless, Inc. Optical sensors for use in vital sign monitoring
US8808188B2 (en) 2007-06-12 2014-08-19 Sotera Wireless, Inc. Body-worn system for measuring continuous non-invasive blood pressure (cNIBP)
US8740802B2 (en) 2007-06-12 2014-06-03 Sotera Wireless, Inc. Body-worn system for measuring continuous non-invasive blood pressure (cNIBP)
US11330988B2 (en) 2007-06-12 2022-05-17 Sotera Wireless, Inc. Body-worn system for measuring continuous non-invasive blood pressure (cNIBP)
US20090009341A1 (en) * 2007-07-05 2009-01-08 Alexander Gak Device, method and/or system for monitoring the condition of a subject
US8738118B2 (en) 2009-05-20 2014-05-27 Sotera Wireless, Inc. Cable system for generating signals for detecting motion and measuring vital signs
US8909330B2 (en) 2009-05-20 2014-12-09 Sotera Wireless, Inc. Body-worn device and associated system for alarms/alerts based on vital signs and motion
US9492092B2 (en) 2009-05-20 2016-11-15 Sotera Wireless, Inc. Method for continuously monitoring a patient using a body-worn device and associated system for alarms/alerts
US8672854B2 (en) 2009-05-20 2014-03-18 Sotera Wireless, Inc. System for calibrating a PTT-based blood pressure measurement using arm height
US8594776B2 (en) 2009-05-20 2013-11-26 Sotera Wireless, Inc. Alarm system that processes both motion and vital signs using specific heuristic rules and thresholds
US11918321B2 (en) 2009-05-20 2024-03-05 Sotera Wireless, Inc. Alarm system that processes both motion and vital signs using specific heuristic rules and thresholds
US20100298659A1 (en) * 2009-05-20 2010-11-25 Triage Wireless, Inc. Body-worn system for continuously monitoring a patient's bp, hr, spo2, rr, temperature, and motion; also describes specific monitors for apnea, asy, vtac, vfib, and 'bed sore' index
US11896350B2 (en) 2009-05-20 2024-02-13 Sotera Wireless, Inc. Cable system for generating signals for detecting motion and measuring vital signs
US11589754B2 (en) 2009-05-20 2023-02-28 Sotera Wireless, Inc. Blood pressure-monitoring system with alarm/alert system that accounts for patient motion
US10973414B2 (en) 2009-05-20 2021-04-13 Sotera Wireless, Inc. Vital sign monitoring system featuring 3 accelerometers
US10987004B2 (en) 2009-05-20 2021-04-27 Sotera Wireless, Inc. Alarm system that processes both motion and vital signs using specific heuristic rules and thresholds
US10555676B2 (en) 2009-05-20 2020-02-11 Sotera Wireless, Inc. Method for generating alarms/alerts based on a patient's posture and vital signs
US8956294B2 (en) 2009-05-20 2015-02-17 Sotera Wireless, Inc. Body-worn system for continuously monitoring a patients BP, HR, SpO2, RR, temperature, and motion; also describes specific monitors for apnea, ASY, VTAC, VFIB, and ‘bed sore’ index
US8956293B2 (en) 2009-05-20 2015-02-17 Sotera Wireless, Inc. Graphical ‘mapping system’ for continuously monitoring a patient's vital signs, motion, and location
US8475370B2 (en) 2009-05-20 2013-07-02 Sotera Wireless, Inc. Method for measuring patient motion, activity level, and posture along with PTT-based blood pressure
US11103148B2 (en) 2009-06-17 2021-08-31 Sotera Wireless, Inc. Body-worn pulse oximeter
US8437824B2 (en) 2009-06-17 2013-05-07 Sotera Wireless, Inc. Body-worn pulse oximeter
US11134857B2 (en) 2009-06-17 2021-10-05 Sotera Wireless, Inc. Body-worn pulse oximeter
US9775529B2 (en) 2009-06-17 2017-10-03 Sotera Wireless, Inc. Body-worn pulse oximeter
US11638533B2 (en) 2009-06-17 2023-05-02 Sotera Wireless, Inc. Body-worn pulse oximeter
US8554297B2 (en) 2009-06-17 2013-10-08 Sotera Wireless, Inc. Body-worn pulse oximeter
US9596999B2 (en) 2009-06-17 2017-03-21 Sotera Wireless, Inc. Body-worn pulse oximeter
US10085657B2 (en) 2009-06-17 2018-10-02 Sotera Wireless, Inc. Body-worn pulse oximeter
US11253169B2 (en) 2009-09-14 2022-02-22 Sotera Wireless, Inc. Body-worn monitor for measuring respiration rate
US10123722B2 (en) 2009-09-14 2018-11-13 Sotera Wireless, Inc. Body-worn monitor for measuring respiration rate
US10595746B2 (en) 2009-09-14 2020-03-24 Sotera Wireless, Inc. Body-worn monitor for measuring respiration rate
US8545417B2 (en) 2009-09-14 2013-10-01 Sotera Wireless, Inc. Body-worn monitor for measuring respiration rate
US8740807B2 (en) 2009-09-14 2014-06-03 Sotera Wireless, Inc. Body-worn monitor for measuring respiration rate
US8622922B2 (en) 2009-09-14 2014-01-07 Sotera Wireless, Inc. Body-worn monitor for measuring respiration rate
US10806351B2 (en) 2009-09-15 2020-10-20 Sotera Wireless, Inc. Body-worn vital sign monitor
US8527038B2 (en) 2009-09-15 2013-09-03 Sotera Wireless, Inc. Body-worn vital sign monitor
US10420476B2 (en) 2009-09-15 2019-09-24 Sotera Wireless, Inc. Body-worn vital sign monitor
US8364250B2 (en) 2009-09-15 2013-01-29 Sotera Wireless, Inc. Body-worn vital sign monitor
US8321004B2 (en) 2009-09-15 2012-11-27 Sotera Wireless, Inc. Body-worn vital sign monitor
US10213159B2 (en) 2010-03-10 2019-02-26 Sotera Wireless, Inc. Body-worn vital sign monitor
US10278645B2 (en) 2010-03-10 2019-05-07 Sotera Wireless, Inc. Body-worn vital sign monitor
US20110224498A1 (en) * 2010-03-10 2011-09-15 Sotera Wireless, Inc. Body-worn vital sign monitor
US8591411B2 (en) 2010-03-10 2013-11-26 Sotera Wireless, Inc. Body-worn vital sign monitor
US8727977B2 (en) 2010-03-10 2014-05-20 Sotera Wireless, Inc. Body-worn vital sign monitor
US8979765B2 (en) 2010-04-19 2015-03-17 Sotera Wireless, Inc. Body-worn monitor for measuring respiratory rate
US9339209B2 (en) 2010-04-19 2016-05-17 Sotera Wireless, Inc. Body-worn monitor for measuring respiratory rate
US9173593B2 (en) 2010-04-19 2015-11-03 Sotera Wireless, Inc. Body-worn monitor for measuring respiratory rate
US9173594B2 (en) 2010-04-19 2015-11-03 Sotera Wireless, Inc. Body-worn monitor for measuring respiratory rate
US8747330B2 (en) 2010-04-19 2014-06-10 Sotera Wireless, Inc. Body-worn monitor for measuring respiratory rate
US8888700B2 (en) 2010-04-19 2014-11-18 Sotera Wireless, Inc. Body-worn monitor for measuring respiratory rate
US20120148161A1 (en) * 2010-12-09 2012-06-14 Electronics And Telecommunications Research Institute Apparatus for controlling facial expression of virtual human using heterogeneous data and method thereof
US9585577B2 (en) 2010-12-28 2017-03-07 Sotera Wireless, Inc. Body-worn system for continuous, noninvasive measurement of cardiac output, stroke volume, cardiac power, and blood pressure
US10722130B2 (en) 2010-12-28 2020-07-28 Sotera Wireless, Inc. Body-worn system for continuous, noninvasive measurement of cardiac output, stroke volume, cardiac power, and blood pressure
US10722132B2 (en) 2010-12-28 2020-07-28 Sotera Wireless, Inc. Body-worn system for continuous, noninvasive measurement of cardiac output, stroke volume, cardiac power, and blood pressure
US10722131B2 (en) 2010-12-28 2020-07-28 Sotera Wireless, Inc. Body-worn system for continuous, noninvasive measurement of cardiac output, stroke volume, cardiac power, and blood pressure
US9364158B2 (en) 2010-12-28 2016-06-14 Sotera Wirless, Inc. Body-worn system for continuous, noninvasive measurement of cardiac output, stroke volume, cardiac power, and blood pressure
US10856752B2 (en) 2010-12-28 2020-12-08 Sotera Wireless, Inc. Body-worn system for continuous, noninvasive measurement of cardiac output, stroke volume, cardiac power, and blood pressure
US9380952B2 (en) 2010-12-28 2016-07-05 Sotera Wireless, Inc. Body-worn system for continuous, noninvasive measurement of cardiac output, stroke volume, cardiac power, and blood pressure
US10357187B2 (en) 2011-02-18 2019-07-23 Sotera Wireless, Inc. Optical sensor for measuring physiological properties
US9439574B2 (en) 2011-02-18 2016-09-13 Sotera Wireless, Inc. Modular wrist-worn processor for patient monitoring
US11179105B2 (en) 2011-02-18 2021-11-23 Sotera Wireless, Inc. Modular wrist-worn processor for patient monitoring
US9313344B2 (en) 2012-06-01 2016-04-12 Blackberry Limited Methods and apparatus for use in mapping identified visual features of visual images to location areas
US20160063317A1 (en) * 2013-04-02 2016-03-03 Nec Solution Innovators, Ltd. Facial-expression assessment device, dance assessment device, karaoke device, and game device
US10064547B2 (en) * 2013-08-27 2018-09-04 Johnson & Johnson Vision Care, Inc. Means and method for demonstrating the effects of low cylinder astigmatism correction
WO2015164260A1 (en) * 2014-04-21 2015-10-29 Acevedo William Carl Method and apparatus for providing state identification
JP2016049433A (en) * 2014-08-28 2016-04-11 岡崎 章 Emotion expression method, emotion expression apparatus, and computer program
JP2016049438A (en) * 2014-08-28 2016-04-11 岡崎 章 Psychological amount measurement device that quantitatively measures psychology of person, and measuring method
US10827973B1 (en) 2015-06-30 2020-11-10 University Of South Florida Machine-based infants pain assessment tool
US11631280B2 (en) 2015-06-30 2023-04-18 University Of South Florida System and method for multimodal spatiotemporal pain assessment
US20180104822A1 (en) * 2016-10-13 2018-04-19 Toyota Jidosha Kabushiki Kaisha Communication device
US10350761B2 (en) * 2016-10-13 2019-07-16 Toyota Jidosha Kabushiki Kaisha Communication device
CN111093913A (en) * 2017-09-11 2020-05-01 Groove X 株式会社 Action-independent robot watching opposite side
US11498222B2 (en) * 2017-09-11 2022-11-15 Groove X, Inc. Autonomously acting robot that stares at companion
US11202604B2 (en) 2018-04-19 2021-12-21 University Of South Florida Comprehensive and context-sensitive neonatal pain assessment system and methods using multiple modalities
USD969216S1 (en) * 2021-08-25 2022-11-08 Rebecca Hadley Educational poster
JP7266229B1 (en) * 2022-11-26 2023-04-28 株式会社Kansei Design Human psychological state evaluation tool

Similar Documents

Publication Publication Date Title
US20060128263A1 (en) Computerized assessment system and method for assessing opinions or feelings
Hilty et al. A review of telepresence, virtual reality, and augmented reality applied to clinical care
Rush et al. Nurses as imperfect role models for health promotion
Weddle et al. Professional perception and expert action: Scaffolding embodied practices in professional education
US20150302536A1 (en) Virtual information presentation system
CN101657845A (en) The apparatus and method that present personal story for medical patient
Joseph et al. Using immersive virtual environments (IVEs) to conduct environmental design research: A primer and decision framework
Lupton et al. Reimagining digital health education: Reflections on the possibilities of the storyboarding method
Laughey et al. Twelve tips for teaching empathy using simulated patients
Brunzini et al. A comprehensive method to design and assess mixed reality simulations
Vrana et al. Can a computer administer a Wechsler Intelligence Test?
Bonoti et al. ‘A smile stands for health and a bed for illness’: Graphic cues in children’s drawings
Kosmyna et al. " The thinking cap 2.0" preliminary study on fostering growth mindset of children by means of electroencephalography and perceived magic using artifacts from fictional sci-fi universes
Griffin et al. E-racing false narratives: A Black woman track star’s multimodal counterstory of possible futures
Becerril-Ortega et al. Design Process for a Virtual Simulation Environment for Training Healthcare Professionals in Geriatrics
O'Neill Perspectives of parents of children with cerebral palsy on the supports, challenges, and realities of integrating AAC into everyday life
Ceccato et al. Influence of stimuli emotional features and typicality on memory performance: insights from a virtual reality context
Schmehl Introduction to concept mapping in nursing: Critical thinking in action
Gormley Supporting Children with Complex Communication Needs to Communicate Choices during an Inpatient Stay: Effect of a Partner Training on Health Care Professionals
Mulcahy Jr Cognitive self-appraisal of depression and self-concept: Measurement alternatives for evaluating affective states
Tuononen et al. Zooming in on interactions: A micro-analytic approach examining triadic interactions between children with autism spectrum disorder and their co-participants
Bezemer Social semiotics: Theorising meaning making
Strange Improvised music to support Intensive Interaction for children with complex needs: A feasibility study of brief adjunctive music therapy
KR102446138B1 (en) Interactive communication training method, device and program for medical personnel
Campbell et al. Pedagogies of Rhetorical Empathy-in-Action: Role Playing and Story Sharing in Healthcare Education

Legal Events

Date Code Title Description
AS Assignment

Owner name: PSYCHOLOGICAL APPLICATIONS LLC, VERMONT

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:BAIRD, JOHN C.;REEL/FRAME:016993/0846

Effective date: 20051208

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION