CA2296119A1 - Interactive toy


Info

Publication number
CA2296119A1
Authority
CA
Canada
Prior art keywords
user
interactive
toy
visible
procedure
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
CA002296119A
Other languages
French (fr)
Inventor
Oz Gabay
Jacob Gabay
Nimrod Sandlerman
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Creator Ltd
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US09/081,255 external-priority patent/US6160986A/en
Application filed by Individual filed Critical Individual
Publication of CA2296119A1 publication Critical patent/CA2296119A1/en
Abandoned legal-status Critical Current

Classifications

    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B5/00 Electrically-operated educational appliances
    • G09B5/06 Electrically-operated educational appliances with both visual and audible presentation of the material to be studied
    • G09B5/065 Combinations of audio and video presentations, e.g. videotapes, videodiscs, television systems
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B5/00 Electrically-operated educational appliances
    • G09B5/04 Electrically-operated educational appliances with audible presentation of the material to be studied
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63H TOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
    • A63H13/00 Toy figures with self-moving parts, with or without movement of the toy as a whole
    • A63H13/005 Toy figures with self-moving parts, with or without movement of the toy as a whole with self-moving head or facial features
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63H TOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
    • A63H2200/00 Computerized interactive toys, e.g. dolls
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63H TOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
    • A63H3/00 Dolls
    • A63H3/28 Arrangements of sound-producing means in dolls; Means in dolls for producing sounds
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63H TOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
    • A63H3/00 Dolls
    • A63H3/36 Details; Accessories
    • A63H3/38 Dolls' eyes
    • A63H3/40 Dolls' eyes movable

Abstract

An interactive toy apparatus including a toy (10) having a fanciful physical appearance (17, 18, 19, 20), a speaker (58) mounted on the toy (10), a user input receiver (28), a user information storage unit (74) storing information relating to at least one user, a content controller (82) operative in response to current user inputs received via the user input receiver (28) and to information stored in the storage unit (74) for providing audio content to the user via the speaker (58).

Description

JUMBO APPLICATIONS/PATENTS
THIS SECTION OF THE APPLICATION/PATENT CONTAINS MORE THAN ONE VOLUME.
THIS IS VOLUME _ OF 3.
NOTE: For additional volumes, please contact the Canadian Patent Office.
INTERACTIVE TOY
FIELD OF THE INVENTION
The present invention relates to computer systems and methods generally and more particularly to development of interactive constructs, to techniques for teaching such development, and to verbally interactive toys.
BACKGROUND OF THE INVENTION
Various types of verbally interactive toys are known in the art. Generally speaking, these toys may be divided into two categories, computer games and stand-alone toys. The stand-alone toys, which typically have electronic circuitry embedded therein, normally provide a relatively low level of speech recognition and a very limited vocabulary, which often lead to child boredom and frustration during play.
Computer games enjoy the benefit of substantial computing power and thus can provide a high level of speech recognition and user satisfaction. They are characterized by being virtual in their non-verbal dimensions and thus lack the capacity of bonding with children.
The following patents are believed to represent the state of the art in verbally interactive toys:
US Patent 4,712,184 to Haugerud describes a computer controlled educational toy, the construction of which teaches the user computer terminology and programming and robotic technology.
Haugerud describes computer control of a toy via a wired connection, wherein the user of the computer typically writes a simple program to control movement of a robot.
US Patent 4,840,602 to Rose describes a talking doll responsive to an external signal, in which the doll has a vocabulary stored in digital data in a memory which may be accessed to cause a speech synthesizer in the doll to simulate speech.
US Patent 5,021,878 to Lang describes an animated character system with real-time control.
US Patent 5,142,803 to Lang describes an animated character system with real-time control.
US Patent 5,191,615 to Aldava et al. describes an interrelational audio kinetic entertainment system in which movable and audible toys and other animated devices spaced apart from a television screen are provided with program synchronized audio and control data to interact with the program viewer in relationship to the television program.
US Patent 5,195,920 to Collier describes a radio controlled toy vehicle which generates realistic sound effects on board the vehicle. Communication with a remote computer allows an operator to modify and add new sound effects.
US Patent 5,270,480 to Hikawa describes a toy acting in response to a MIDI signal, wherein an instrument-playing toy performs simulated instrument playing movements.
US Patent 5,289,273 to Lang describes a system for remotely controlling an animated character. The system uses radio signals to transfer audio, video and other control signals to the animated character to provide speech, hearing, vision and movement in real-time.
US Patent 5,388,493 describes a system for a housing for a vertical dual keyboard MIDI wireless controller for accordionists. The system may be used with either a conventional MIDI cable connection or by a wireless MIDI transmission system.
German Patent DE 3009-040 to Neuhierl describes a device for adding the capability to transmit sound from a remote control to a controlled model vehicle. The sound is generated by means of a microphone or a tape recorder and transmitted to the controlled model vehicle by means of radio communications. The model vehicle is equipped with a speaker that emits the received sounds.
The disclosures of all publications mentioned in the specification and of the publications cited therein are hereby incorporated by reference.
SUMMARY OF THE INVENTION
The present invention seeks to provide verbally interactive toys, and methods therefor, which overcome disadvantages of the prior art as described hereinabove.
There is thus provided in accordance with a preferred embodiment of the present invention interactive toy apparatus including a toy having a fanciful physical appearance, a speaker mounted on the toy, a user input receiver, a user information storage unit storing information relating to at least one user, and a content controller operative in response to current user inputs received via the user input receiver and to information stored in the storage unit for providing audio content to the user via the speaker.
Further in accordance with a preferred embodiment of the present invention the user input receiver includes an audio receiver.
Still further in accordance with a preferred embodiment of the present invention the current user input includes a verbal input received via the audio receiver.
Additionally in accordance with a preferred embodiment of the present invention the user input receiver includes a tactile input receiver.
Moreover in accordance with a preferred embodiment of the present invention the storage unit stores personal information relating to at least one user and the content controller is operative to personalize the audio content.
Further in accordance with a preferred embodiment of the present invention the storage unit stores information relating to the interaction of at least one user with the interactive toy apparatus and the content controller is operative to control the audio content in accordance with stored information relating to past interaction of the at least one user with the interactive toy apparatus.
Still further in accordance with a preferred embodiment of the present invention the storage unit also stores information relating to the interaction of at least one user with the interactive toy apparatus and the content controller also is operative to control the audio content in accordance with information relating to past interaction of the at least one user with the interactive toy apparatus.
Additionally in accordance with a preferred embodiment of the present invention the storage unit stores information input verbally by a user via the user input receiver.
Moreover in accordance with a preferred embodiment of the present invention the storage unit stores information input verbally by a user via the user input receiver.
Further in accordance with a preferred embodiment of the present invention the storage unit stores information input verbally by a user via the user input receiver.
Still further in accordance with a preferred embodiment of the present invention the interactive toy apparatus also includes a content storage unit storing audio contents of at least one content title to be played to a user via the speaker, the at least one content title being interactive and containing interactive branching.

Additionally in accordance with a preferred embodiment of the present invention the at least one content title includes a plurality of audio files storing a corresponding plurality of content title sections including at least two alternative content title sections, and a script defining branching between the alternative sections in response to any of a user input, an environmental condition, a past interaction, personal information related to a user, a remote computer, and a time-related condition.
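The script-driven branching described above can be illustrated with a small sketch. The `script` layout, the condition keys, and the section names here are all hypothetical illustrations, not the patent's implementation:

```python
# Hypothetical branching content title: each section names an audio file
# and an ordered list of branches, each guarded by a condition on the
# interaction context (user input, stored personal info, time, etc.).
script = {
    "intro": {
        "audio": "intro.wav",
        "branches": [
            # (condition function, next section id)
            (lambda ctx: ctx["user_input"] == "yes", "story_long"),
            (lambda ctx: ctx["user_input"] == "no", "story_short"),
            (lambda ctx: True, "prompt_again"),  # default branch
        ],
    },
    "story_long": {"audio": "long.wav", "branches": []},
    "story_short": {"audio": "short.wav", "branches": []},
    "prompt_again": {"audio": "again.wav", "branches": []},
}

def next_section(section_id, ctx):
    """Return the id of the next section to play, or None at a leaf."""
    for condition, target in script[section_id]["branches"]:
        if condition(ctx):
            return target
    return None

# Example: a "yes" answer branches to the long story section.
print(next_section("intro", {"user_input": "yes"}))  # story_long
```

The same table could equally branch on environmental or time-related conditions by adding those keys to the context dictionary.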
Moreover in accordance with a preferred embodiment of the present invention the interactive toy apparatus also includes a content storage unit storing audio contents of at least one content title to be played to a user via the speaker, the at least one content title being interactive and containing interactive branching.
Further in accordance with a preferred embodiment of the present invention the at least one content title includes a plurality of parallel sections of content elements including at least two alternative sections and a script defining branching between alternative sections in a personalized manner.
Still further in accordance with a preferred embodiment of the present invention the user information storage unit is located at least partially in the toy.
Additionally in accordance with a preferred embodiment of the present invention the user information storage unit is located at least partially outside the toy.
Moreover in accordance with a preferred embodiment of the present invention the content storage unit is located at least partially in the toy.
Further in accordance with a preferred embodiment of the present invention the content storage unit is located at least partially outside the toy.
Still further in accordance with a preferred embodiment of the present invention the user input receiver includes a microphone mounted on the toy, and a speech recognition unit receiving a speech input from the microphone.
Additionally in accordance with a preferred embodiment of the present invention the user information storage unit is operative to store the personal information related to a plurality of users each identifiable with a unique code and the content controller is operative to prompt any of the users to provide the user's code.
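The multi-user identification scheme above might look like the following sketch. The record layout, the codes, and the spoken prompts are illustrative assumptions only:

```python
# Hypothetical per-user records keyed by each user's unique code.
users = {
    "1234": {"name": "Dana", "stories_heard": 3},
    "5678": {"name": "Omer", "stories_heard": 0},
}

def identify_user(spoken_code):
    """Look up a user by the code the toy prompted them to provide."""
    user = users.get(spoken_code)
    if user is None:
        return "I don't know that code. Please say your code again."
    # Stored personal information lets the content controller
    # personalize the greeting and resume past interaction.
    return (f"Hello {user['name']}! Shall we continue with story "
            f"number {user['stories_heard'] + 1}?")

print(identify_user("1234"))
```

In practice the code would arrive through the speech recognition unit rather than as a typed string.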
Moreover in accordance with a preferred embodiment of the present invention the user information storage unit is operative to store information regarding a user's participation performance.
There is also provided in accordance with a preferred embodiment of the present invention toy apparatus having changing facial expressions, the toy including multi-featured face apparatus including a plurality of multi-positionable facial features, and a facial expression control unit operative to generate at least three combinations of positions of the plurality of facial features representing at least two corresponding facial expressions.
Further in accordance with a preferred embodiment of the present invention the facial expression control unit is operative to cause the features to fluctuate between positions at different rates, thereby to generate an illusion of different emotions.
Still further in accordance with a preferred embodiment of the present invention the toy apparatus also includes a speaker device, an audio memory storing an audio pronouncement, and an audio output unit operative to control output of the audio pronouncement by the speaker device, and the facial expression control unit is operative to generate the combinations of positions synchronously with output of the pronouncement.
There is also provided in accordance with a preferred embodiment of the present invention toy apparatus for playing an interactive verbal game including a toy, a speaker device mounted on the toy, a microphone mounted on the toy, a speech recognition unit receiving a speech input from the microphone, and an audio storage unit storing a multiplicity of verbal game segments to be played through the speaker device, and a script storage defining interactive branching between the verbal game segments.
Further in accordance with a preferred embodiment of the present invention the verbal game segments include at least one segment which prompts a user to generate a spoken input to the verbal game.
Still further in accordance with a preferred embodiment of the present invention the at least one segment includes two or more verbal strings and a prompt to the user to reproduce one of the verbal strings.
Additionally in accordance with a preferred embodiment of the present invention the at least one segment includes a riddle.
Moreover in accordance with a preferred embodiment of the present invention the at least one of the verbal strings has educational content.
Further in accordance with a preferred embodiment of the present invention the at least one of the verbal strings includes a feedback to the user regarding the quality of the user's performance in the game.
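A single round of such a verbal game, with a prompt, recognition of the user's reproduction, and feedback on performance, might be sketched like this; the riddle, the comparison logic, and the function names are all illustrative assumptions:

```python
# Hypothetical verbal game segment: the toy speaks two verbal strings
# and prompts the user to reproduce the correct one.
segment = {
    "prompt": "Which animal says moo: the cow or the duck?",
    "choices": ["the cow", "the duck"],
    "answer": "the cow",
}

def play_round(recognized_speech):
    """Score the speech-recognizer output and build spoken feedback."""
    correct = recognized_speech.strip().lower() == segment["answer"]
    if correct:
        feedback = "Well done! That was exactly right."
    else:
        feedback = f"Good try! The answer was {segment['answer']}."
    return correct, feedback

correct, feedback = play_round("The Cow")
print(feedback)  # Well done! That was exactly right.
```

The feedback string is what the script would route to the speaker device, and the boolean result is the sort of participation-performance record the storage unit could retain.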
Still further in accordance with a preferred embodiment of the present invention the interactive toy apparatus further includes multi-featured face apparatus assembled with the toy including a plurality of multi-positionable facial features, and a facial expression control unit operative to generate at least three combinations of positions of the plurality of facial features representing at least two corresponding facial expressions.
Additionally in accordance with a preferred embodiment of the present invention the facial expression control unit is operative to cause the features to fluctuate between positions at different rates, thereby to generate an illusion of different emotions.
Moreover in accordance with a preferred embodiment of the present invention the interactive toy apparatus also includes an audio memory storing an audio pronouncement, and an audio output unit operative to control output of the audio pronouncement by the speaker device, and the facial expression control unit is operative to generate the combinations of positions synchronously with output of the pronouncement.
Further in accordance with a preferred embodiment of the present invention the interactive toy apparatus further includes a microphone mounted on the toy, a speech recognition unit receiving a speech input from the microphone, and an audio storage unit storing a multiplicity of verbal game segments of a verbal game to be played through the speaker device, and a script storage defining interactive branching between the verbal game segments.
Still further in accordance with a preferred embodiment of the present invention the verbal game segments include at least one segment which prompts a user to generate a spoken input to the verbal game.
Additionally in accordance with a preferred embodiment of the present invention the at least one segment includes two or more verbal strings and a prompt to the user to reproduce one of the verbal strings.
Moreover in accordance with a preferred embodiment of the present invention the at least one segment includes a riddle.
Further in accordance with a preferred embodiment of the present invention the at least one of the verbal strings has educational content.
Still further in accordance with a preferred embodiment of the present invention and further including a microphone mounted on the toy, a speech recognition unit receiving a speech input from the microphone, and an audio storage unit storing a multiplicity of verbal game segments of a verbal game to be played through the speaker device, and a script storage defining interactive branching between the verbal game segments.
Moreover in accordance with a preferred embodiment of the present invention the verbal game segments include at least one segment which prompts a user to generate a spoken input to the verbal game.
Additionally in accordance with a preferred embodiment of the present invention at least one segment includes two or more verbal strings and a prompt to the user to reproduce one of the verbal strings. Additionally or alternatively, at least one segment comprises a riddle.
Still further in accordance with a preferred embodiment of the present invention at least one of the verbal strings has educational content.
Additionally in accordance with a preferred embodiment of the present invention the at least one of the verbal strings includes a feedback to the user regarding the quality of the user's performance in the game.
There is also provided in accordance with a preferred embodiment of the present invention a method of toy interaction including providing a toy having a fanciful physical appearance, providing a speaker mounted on the toy, providing a user input receiver, storing in a user information storage unit information relating to at least one user, and providing, via a content controller operative in response to current user inputs received via the user input receiver and to information stored in the storage unit, audio content to the user via the speaker.
Further in accordance with a preferred embodiment of the present invention the storing step includes storing personal information relating to at least one user and personalizing, via the content controller, the audio content.
Still further in accordance with a preferred embodiment of the present invention the storing step includes storing information relating to the interaction of at least one user with the interactive toy apparatus and controlling, via the content controller, the audio content in accordance with stored information relating to past interaction of the at least one user with the interactive toy apparatus.
Additionally in accordance with a preferred embodiment of the present invention the method further includes storing, in a content storage unit, audio contents of at least one content title to be played to a user via the speaker, the at least one content title being interactive and containing interactive branching.
Moreover in accordance with a preferred embodiment of the present invention the method further includes storing personal information related to a plurality of users each identifiable with a unique code and prompting, via the content controller, any of the users to provide the user's code.
Further in accordance with a preferred embodiment of the present invention the method further includes storing information regarding a user's participation performance.
Still further in accordance with a preferred embodiment of the present invention the method further includes providing multi-featured face apparatus including a plurality of multi-positionable facial features, and generating at least three combinations of positions of the plurality of facial features representing at least two corresponding facial expressions.
Additionally in accordance with a preferred embodiment of the present invention the method further includes causing the features to fluctuate between positions at different rates, thereby to generate an illusion of different emotions.
Moreover in accordance with a preferred embodiment of the present invention the method also includes storing an audio pronouncement, and providing the audio pronouncement by the speaker, and generating combinations of facial positions synchronously with output of the pronouncement.
There is also provided, in accordance with a preferred embodiment of the present invention, a system for teaching programming to students, such as school-children, using interactive objects, the system including a computerized student interface permitting a student to breathe life into an interactive object by defining characteristics of the interactive object, the computerized student interface being operative to at least partially define, in response to student inputs, interactions between the interactive object and humans; and a computerized teacher interface permitting a teacher to monitor the student's progress in defining characteristics of the interactive object.
Further in accordance with a preferred embodiment of the present invention, the computerized teacher interface permits the teacher to configure the computerized student interface.
Also provided, in accordance with a preferred embodiment of the present invention, is a teaching system for teaching engineering and programming of interactive objects to students, the system including a computerized student interface permitting a student to breathe life into an interactive object by defining characteristics of the interactive object, the computerized user interface being operative to at least partially define, in response to student inputs, interactions between the interactive object and humans, and a computerized teacher interface permitting a teacher to configure the computerized student interface.
Also provided, in accordance with another preferred embodiment of the present invention, is a computer system for development of emotionally perceptive computerized creatures including a computerized user interface permitting a user to develop an emotionally perceptive computer-controlled creature by defining interactions between the emotionally perceptive computer-controlled creature and natural humans including at least one response of the emotionally perceptive computer-controlled creature to at least one parameter, indicative of natural human emotion, derived from a stimulus provided by the natural human, and a creature control unit operative to control the emotionally perceptive creature in accordance with the characteristics and interactions defined by the user.
Further in accordance with a preferred embodiment of the present invention, the parameter indicative of natural human emotion includes a characteristic of natural human speech other than language content thereof.
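One common way to derive an emotion-related parameter from speech without using its language content is to measure prosodic features such as loudness. The sketch below, with invented thresholds and labels, classifies raw audio samples by short-term signal energy; it is an illustration of the general idea, not the patent's method:

```python
# Hypothetical emotion parameter derived from speech loudness alone;
# no language content is used. Samples are floats in [-1.0, 1.0].
def arousal_from_samples(samples):
    """Return 'excited', 'calm', or 'neutral' from mean signal energy."""
    energy = sum(s * s for s in samples) / len(samples)
    if energy > 0.25:      # loud speech: treat as high arousal
        return "excited"
    if energy < 0.01:      # very quiet speech: treat as low arousal
        return "calm"
    return "neutral"

loud = [0.8, -0.7, 0.9, -0.8]       # large-amplitude waveform
quiet = [0.05, -0.04, 0.03, -0.05]  # near-silent waveform
print(arousal_from_samples(loud))   # excited
print(arousal_from_samples(quiet))  # calm
```

A creature control unit could feed such a parameter into its interaction modes, for example choosing a soothing script when arousal is high.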
Also provided, in accordance with a preferred embodiment of the present invention, is a method for development of emotionally perceptive computerized creatures, the method including defining interactions between the emotionally perceptive computer-controlled creature and natural humans including at least one response of the emotionally perceptive computer-controlled creature to at least one parameter, indicative of natural human emotion, derived from a stimulus provided by the natural human, and controlling the emotionally perceptive creature in accordance with the characteristics and interactions defined by the user.
Additionally provided, in accordance with a preferred embodiment of the present invention, is a method for teaching programming to school-children, the method including providing a computerized visual-programming based school-child interface permitting a school-child to perform visual programming and providing a computerized teacher interface permitting a teacher to configure the computerized school-child interface.
Also provided is an emotionally perceptive computerized creature including a plurality of interaction modes operative to carry out a corresponding plurality of interactions with natural humans including at least one response to at least one natural human emotion parameter, indicative of natural human emotion, and an emotion perception unit operative to derive at least one natural human emotion parameter from a stimulus provided by the natural human, and to supply the parameter to at least one of the plurality of interaction modes, and, optionally, a physical or virtual, e.g. on-screen, body operative to participate in at least one of the plurality of interactions.

BRIEF DESCRIPTION OF THE DRAWINGS
The present invention will be understood and appreciated from the following detailed description, taken in conjunction with the drawings in which:
Fig. 1A is a simplified pictorial illustration of a toy forming at least part of an interactive toy system constructed and operative in accordance with a preferred embodiment of the present invention;
Fig. 1B is a back view of the toy of Fig. 1A;
Fig. 2 is a partially cut away pictorial illustration of the toy of Figs. 1A and 1B;
Fig. 3 is a simplified exploded illustration of elements of the toy of Figs. 1A, 1B, and 2;
Figs. 4A, 4B, 4C, 4D and 4E are illustrations of the toy of Figs. 1A - 3 indicating variations in facial expressions thereof;
Fig. 5 is a simplified block diagram illustration of the interactive toy apparatus of a preferred embodiment of the present invention;
Fig. 6 is a functional block diagram of a base station forming part of the apparatus of Fig. 5;
Fig. 7 is a functional block diagram of a circuitry embedded in a toy forming part of the apparatus of Fig. 5;
Figs. 8A - 8G, taken together, comprise a schematic diagram of base communication unit 62 of Fig. 5;
Figs. 8H - 8N, taken together, comprise a schematic diagram of base communication unit 62 of Fig. 5, according to an alternative embodiment;
Figs. 9A - 9G, taken together, comprise a schematic diagram of toy control device 24 of Fig. 5;
Figs. 9H - 9M, taken together, comprise a schematic diagram of toy control device 24 of Fig. 5, according to an alternative embodiment;
Figs. 10 - 15, taken together, are simplified flowchart illustrations of a preferred method of operation of the interactive toy system of Figs. 1 - 9G;
Figs. 16A and 16B, taken together, form a simplified operational flow chart of one possible implementation of the opening actions of a script executed by the "Play" sub-module of Fig. 10;
Figs. 17A - 17E, taken together, form a simplified operational flow chart of one possible implementation of a story script executed by the "Play" sub-module of Fig. 10;
Figs. 18A - 18G, taken together, form a simplified operational flow chart of one possible implementation of a game script executed by the "Play" sub-module of Fig. 10;
Figs. 19A - 19C, taken together, form a simplified operational flow chart of one possible implementation of a song script executed by the "Play" sub-module of Fig. 10;
Figs. 20A - 20C, taken together, form a simplified operational flow chart of one possible implementation of the "Bunny Short" story script of Figs. 17A - 17E and executed by the "Play" sub-module of Fig. 10;
Figs. 21A - 21F, taken together, form a simplified operational flow chart of one possible implementation of the "Bunny Long" story script of Figs. 17A - 17E and executed by the "Play" sub-module of Fig. 10;
Fig. 22 is a simplified operational flow chart of the "Theme Section" referred to in Figs. 17D, 18C, 19B, and 19C;
Fig. 23A is a pictorial illustration of the development and operation of a physical toy living creature in accordance with a preferred embodiment of the present invention;
Fig. 23B is a pictorial illustration of the development and operation of a virtual living creature in accordance with a preferred embodiment of the present invention;
Fig. 23C is a simplified semi-pictorial semi-block diagram illustration of a system which is a variation on the systems of Figs. 23A - 23B in that a remote content server is provided which serves data, programs, voice files and other contents useful in breathing life into a computerized living creature;
Fig. 24A is a pictorial illustration of a school-child programming a computerized living creature;
Fig. 24B is a pictorial illustration of human, at least verbal, interaction with a computerized living creature wherein the interaction was programmed by a student as described above with reference to Fig. 24A;
Fig. 24C is a pictorial illustration of a creature equipped with a built-in video camera and a video display such as a liquid crystal display (LCD);
Fig. 25 is a simplified software design diagram of preferred functionality of a system administrator;

Fig. 26 is a simplified software diagram of preferred functionality of teacher workstation 312 in a system for teaching development of interactive computerized constructs such as the system of Figs. 23A - 23C;
Fig. 27 is a simplified software diagram of preferred functionality of student workstation 310 in a system for teaching development of interactive computerized constructs such as the system of Figs. 23A - 23C;
Figs. 28 - 31 are examples of screen displays which are part of a human interface for the Visual Programming block 840;
Fig. 32 is a screen display which includes an illustra-tion of an example of a state machine view of a project;
Fig. 33 is a screen display which enables a student to create an environment in which a previously generated module can be tested;
Figs. 34 - 37 are examples of display screens presented by the teacher workstation 312 of any of Figs. 23A, 23B or 23C;
Fig. 38 is a simplified flowchart illustration of the process by which the student typically uses the student worksta-tion of any of Figs. 23A, 23B or 23C;
Fig. 39 is an example of a display screen generated by selecting Event in the Insert menu in the student workstation 310;
Fig. 40 is an example of a display screen generated by selecting Function in the Insert menu in the student workstation 310;
Fig. 41 is a simplified flowchart illustration of processes performed by the student in the course of performing steps 910 and 920 of Fig. 38;
Fig. 42 is a simplified flowchart illustration of an emotional interaction flowchart design process;
Figs. 43 - 102 illustrate preferred embodiments of a computerized programming teaching system constructed and operative in accordance with a preferred embodiment of the present invention.
Fig. 103 is a table illustration of an emotional analysis database;
Fig. 104 is an emotional analysis state chart;
Fig. 105 illustrates typical function calls and call-back notifications;
Fig. 106 illustrates typical input data processing suitable for a media BIOS module;
Fig. 107 illustrates typical input data processing suitable for a UCP implementation module;
Fig. 108 illustrates typical data processing suitable for user applications and an API module;
Fig. 109 illustrates a typical UCP implementation module and media BIOS output data processing;
Fig. 110 illustrates output data processing for a protocol implementation module and media BIOS module;
Fig. 111 illustrates typical figure configuration; and
Figs. 112 - 115 illustrate typical install-check up (BT 1/4, 2/4, 3/4 and 4/4 respectively).
Attached herewith are the following appendices which aid in the understanding and appreciation of one preferred embodiment of the invention shown and described herein:
Appendix A is a computer listing of a preferred software implementation of the interactive toy system of the present invention;
Appendix B is a preferred parts list for the apparatus of Figs. 8A - 8G; and
Appendix C is a preferred parts list for the apparatus of Figs. 9A - 9G.

DETAILED DESCRIPTION OF PREFERRED EMBODIMENTS
Reference is now made to Fig. 1A which is a simplified pictorial illustration of a toy, generally designated 10, forming at least part of an interactive toy system constructed and operative in accordance with a preferred embodiment of the present invention. While toy 10 may be implemented in any number of physical configurations and still maintain the functionality of an interactive toy system as is described herein, for illustration purposes only toy 10 is shown in Fig. 1A as typically having a fanciful physical appearance and comprising a body portion 12 having a number of appendages, such as arms 14, legs 16, eyelids 17, eyes 18, a nose 19, and a mouth 20. Arms 14 and legs 16 may be passive "appendages" in that they are not configured to move, while eyelids 17, eyes 18 and mouth 20 may be "active" appendages in that they are configured to move as is described in greater detail hereinbelow with reference to Figs. 3 - 4E.
Fig. 1B is a back view of the toy of Fig. 1A and additionally shows toy 10 as typically having an apertured area 22, behind which a speaker may be mounted as will be described in greater detail hereinbelow.
Fig. 2 is a partially cut away pictorial illustration of the toy of Figs. 1A and 1B showing a toy control device 24, typically housed within body portion 12, and a number of user input receivers, such as switches 26 in arms 14 and legs 16 for receiving tactile user inputs, and a microphone 28 for receiving audio user inputs. It is appreciated that the various user input receivers described herein may be located anywhere within toy 10, such as behind nose 19, provided that they may be accessed by a tactile or audio user input, such as verbal input, as required.
It is appreciated that any of a multitude of known sensors and input devices, such as accelerometers, orientation sensors, proximity sensors, temperature sensors, video input devices, etc., although not particularly shown, may be incorporated into toy 10 for receiving inputs or other stimuli for incorporation into the interactive environment as described herein regarding the interactive toy system of the present invention.
Additional reference is now made to Fig. 3 which is a simplified exploded illustration of elements of the toy 10 of Figs. 1A, 1B, and 2. A facial portion 30 of body portion 12 of Fig. 1 is shown together with nose 19 and mouth 20, and having two apertures 32 for receiving eyelids 17 and eyes 18. Facial portion 30 typically sits atop a protective cover 34 which is mounted on a protective box 36. Eyelids 17, eyes 18, and mouth 20 each typically cooperate with a motion element 38 which provides a movement to each appendage. Motion elements 38 are typically driven by a gear plate 40 which is in turn controlled by a gear shaft 42 and a motor 44. Circuitry 24 effects a desired movement of a specific appendage via a corresponding motion element 38 by controlling motor 44 and gear shaft 42 to orient and move gear plate 40 depending on the desired rotational orientation of gear plate 40 relative to the current rotational orientation as determined by an optical positioning device 46. Gear plate 40 preferably selectably cooperates with a single one of motion elements 38 at a time depending on specific rotational orientations of gear plate 40. A speaker 58 is also provided for audio output. Power is typically provided by a power source 48, typically a DC power source.
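The single-gear-plate arrangement above can be sketched in software as follows. This is an illustrative sketch only, not the code of Appendix A: the engagement angles and element names are hypothetical, assuming each motion element is engaged at a distinct rotational orientation of the gear plate.

```python
# Hypothetical gear-plate orientations (degrees) at which the plate
# engages each motion element; one element is engaged at a time.
ENGAGE_ANGLE = {"eyelids": 0, "eyes": 120, "mouth": 240}

def rotation_needed(current_angle, element):
    """Degrees the motor must turn the gear plate (clockwise, 0-359)
    from its current orientation, as reported by the optical
    positioning device, to engage the requested motion element."""
    target = ENGAGE_ANGLE[element]
    return (target - current_angle) % 360

# e.g. plate currently at 120 degrees (eyes engaged); engage the mouth:
print(rotation_needed(120, "mouth"))  # → 120
```

The modulo keeps the result in one rotational direction, mirroring a plate that is simply advanced until the optical positioner reports the desired orientation.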
Figs. 4A, 4B, 4C, 4D and 4E are illustrations of toy 10 of Figs. 1A - 3 indicating variations in facial expressions thereof. Fig. 4A shows eyes 18 moving in the direction indicated by an arrow 50, while Fig. 4B shows eyes 18 moving in the direction indicated by an arrow 52. Fig. 4C shows eyelids 17 having moved to a half-shut position, while Fig. 4D shows eyelids 17 completely shut. Fig. 4E shows the lips of mouth 20 moving in the directions indicated by an arrow 54 and an arrow 56. It is appreciated that one or both lips of mouth 20 may move.
Reference is now made to Fig. 5 which is a simplified block diagram illustration of the interactive toy apparatus constructed and operative in accordance with a preferred embodiment of the present invention. Typically, a computer 60, such as a personal computer based on the PENTIUM microprocessor from Intel Corporation, is provided in communication with a base communication unit 62, typically a radio-based unit, via an RS-232 serial communications port. It is appreciated that communication between the computer 60 and the base unit 62 may be effected via a parallel port, the MIDI and audio ports of a sound card, a USB port, or any known communications port. Unit 62 is typically powered by a power supply 64 which may be fed by an AC power source.
Unit 62 preferably includes an antenna 66 for communication with toy control device 24 of toy 10 (Fig. 2) which is similarly equipped with an antenna 68. Toy control device 24 typically controls motor 44 (Fig. 3), switches 26 (Fig. 2), one or more movement sensors 70 for detecting motion of toy 10, microphone 28 (Fig. 2), and speaker 58 (Fig. 3). Any of the elements 24, 44, 26, 28, 58 and 70 may be powered by power source 48 (Fig. 3).
Computer 60 typically provides user information storage, such as on a hard disk or any known and preferably non-volatile storage medium, for storing information relating to a user, such as personal information including the user's name, a unique user code alternatively termed herein as a "secret name" that may be a made-up or other fanciful name for the user, typically predefined and selected by the user, the age of the user, etc.
Computer 60 also acts as what is referred to herein as a "content controller" in that it identifies the user interacting with toy 10 and controls the selection and output of content via toy 10, such as via the speaker 58 as is described in greater detail hereinbelow. The content controller may utilize the information relating to a user to personalize the audio content delivered to the user, such as by referring to the user by the user's secret name or speaking in a manner that is appropriate to the gender of the user. Computer 60 also typically provides content storage for storing content titles, each comprising one or more content elements used in response to user inputs received via the user input receivers described above with reference to toy 10, in response to environmental inputs, or at random. For example, a content title may be a joke, a riddle, or an interactive story. An interactive story may contain many content elements, such as audio elements, generally arranged in a script for sequential output. The interactive story is typically divided into several sections of content element sequences, with multiple sections arranged in parallel to represent alternative interactive branches at each point in the story. The content controller selects a branch according to a current user input via toy 10, previous branch selections, or other user information such as past interactions, preferences, gender, or environmental or temporal conditions, etc.
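The branch-selection behavior described above can be sketched as follows. This is an illustrative sketch under stated assumptions, not the implementation of Appendix A: the story layout, section names, and fallback rule are all hypothetical.

```python
# Each section holds a sequence of content elements (e.g. audio clips)
# and the alternative sections that may follow it, keyed by user input.
STORY = {
    "start":  {"elements": ["clip_intro"],
               "branches": {"HONEY": "honey", "PEANUT BUTTER": "peanut"}},
    "honey":  {"elements": ["clip_honey"], "branches": {}},
    "peanut": {"elements": ["clip_peanut"], "branches": {}},
}

def select_branch(section, user_input, history):
    """Pick the next story section from the current user input,
    falling back to the user's most recent recognized selection
    (a stand-in for 'previous branch selections') when the current
    input is not recognized."""
    branches = STORY[section]["branches"]
    if user_input in branches:
        history.append(user_input)
        return branches[user_input]
    for past in reversed(history):      # fall back on past selections
        if past in branches:
            return branches[past]
    return None                         # no branch could be chosen

history = []
print(select_branch("start", "HONEY", history))  # → honey
```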
Computer 60 may be in communication with one or more other computers, such as a remote computer, by various known means such as by fixed or dial-up connection to a BBS or to the Internet. Computer 60 may download from the remote computer, either in real-time or in a background or batch process, various types of content information such as entirely new content titles, additional sections or content elements for existing titles such as scripts and voice files, general information such as weather information and advertisements, and educational material. Information downloaded from a remote computer may be previously customized for a specific user such as by age, user location, purchase habits, educational level, and existing user credit.
The content controller may also record and store user information received from a user via a user input receiver, such as verbal or other audio user inputs. Computer 60 preferably includes speech recognition capabilities, typically implemented in hardware and/or software, such as the Automatic Speech Recognition Software Development Kit for WINDOWS 95 version 3.0, commercially available from Lernout & Hauspie Speech Products, Sint-Krispijnstraat 7, 8900 Ieper, Belgium. Speech recognition may be used by the content controller to analyze speech inputs from a user to determine user selections, such as in connection with an interactive story for selecting a story branch. Speech recognition may also be used by the content controller to identify a user by the secret name or code spoken by the user and received by microphone 28.
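Identification by spoken secret name, once the recognizer has produced text, reduces to a lookup against the stored user records. The sketch below is illustrative only; the registry contents and the normalization rule are assumptions, and the speech recognition engine itself is not modeled.

```python
# Hypothetical registry mapping secret names to stored user records.
USERS = {"rainbow": {"name": "Dana", "age": 7},
         "bubble gum": {"name": "Sam", "age": 9}}

def identify_user(recognized_text):
    """Return the user record whose secret name matches the text
    produced by the speech recognizer, or None when no registered
    secret name matches."""
    key = recognized_text.strip().lower()
    return USERS.get(key)

print(identify_user("Rainbow")["name"])  # → Dana
```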
The content controller also provides facial expression control. The facial mechanism (Fig. 3) may provide complex dynamic facial expressions by causing the facial features to fluctuate between various positions at different rates. Preferably, each facial feature has at least two positions that it may assume. Two or more facial features may be moved into various positions at generally the same time and at various rates in order to provide a variety of facial expression combinations to convey a variety of different emotions. Preferably, the content controller controls the facial feature combinations concurrently with a user interaction or a content output to provide a natural accompanying expression such as lip synchronization and natural eye movements.
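The "various positions at different rates" idea above can be sketched as independent per-feature motions sampled over time. This is an illustrative sketch only: the normalized 0.0-1.0 position scale, the rates, and the "falling asleep" combination are assumptions, not taken from the patent's implementation.

```python
def position_at(t, start, end, rate):
    """Position of one facial feature t seconds into a move from
    `start` toward `end`, traveling at `rate` position-units per
    second and stopping at the target."""
    travel = min(abs(end - start), rate * t)
    return start + travel if end >= start else start - travel

# Hypothetical combined expression: eyelids close slowly while the
# mouth relaxes faster, both sampled 0.5 s into the motion.
eyelids = position_at(0.5, 1.0, 0.0, rate=0.4)   # ≈ 0.8 (still mostly open)
mouth   = position_at(0.5, 1.0, 0.2, rate=1.6)   # ≈ 0.2 (already relaxed)
```

Sampling each feature's position on a timer tick and commanding the corresponding motion element would produce the concurrent, different-rate motion the description calls for.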
The content controller preferably logs information relating to content provided to users and to the interactions between each user and toy 10, such as the specific jokes and songs told and sung to each user, user responses and selections to prompts such as questions or riddles or interactive stories, and other user inputs. The content controller may utilize the information relating to these past interactions of each user to subsequently select and output content and otherwise control toy 10 as appropriate, such as to play games with a user that were not previously played with that user or to affect the level of complexity of an interaction.
It is appreciated that computer 60 may be housed within or otherwise physically assembled with toy 10 in a manner in which computer 60 communicates directly with toy control device 24, rather than via base unit 62 and antennae 66 and 68, such as through wired means or optical wireless communications methods. Alternatively, computer 60 may be electronically integrated with toy control device 24.
Fig. 6 is a functional block diagram of base communication unit 62 of Fig. 5. Unit 62 typically comprises a microcontroller unit 72 having a memory 74. Unit 72 communicates with computer 60 of Fig. 5 via an adapter 76, typically connected to computer 60 via an RS-232 port or otherwise as described above with reference to Fig. 5. Unit 72 communicates with toy control device 24 of toy 10 (Fig. 2) via a transceiver 78, typically a radio transceiver, and antenna 66.
Fig. 7 is a functional block diagram of toy control device 24 of Fig. 5. Device 24 typically comprises a microcontroller unit 82 which communicates with base communication unit 62 of Fig. 5 via a transceiver 84, typically a radio transceiver, and antenna 68. Power is supplied by a power supply 86 which may be fed by power source 48 (Fig. 5). Unit 82 preferably controls and/or receives inputs from a toy interface module 88 which in turn controls and/or receives inputs from the speaker, microphone, sensors, and motors as described hereinabove. Transceiver 84 may additionally or alternatively communicate with interface module 88 for direct communication of microphone inputs and speaker outputs.
Reference is now made to Figs. 8A - 8G, which, taken together, comprise a schematic diagram of base communication unit 62 of Fig. 5. Appendix B is a preferred parts list for the apparatus of Figs. 8A - 8G.
Figs. 8H - 8N, taken together, comprise a schematic diagram of base communication unit 62 of Fig. 5, according to an alternative embodiment.
Reference is now made to Figs. 9A - 9G which, taken together, comprise a schematic diagram of toy control device 24 of Fig. 5. Appendix C is a preferred parts list for the apparatus of Figs. 9A - 9G.
Figs. 9H - 9M, taken together, comprise a schematic diagram of toy control device 24 of Fig. 5, according to an alternative embodiment.
Reference is now made to Figs. 10 - 15 which, taken together, are simplified flowchart illustrations of a preferred method of operation of the interactive toy system of Figs. 1 - 9G. It is appreciated that the method of Figs. 10 - 15 may be implemented partly in computer hardware and partly in software, or entirely in custom hardware. Preferably, the method of Figs. 10 - 15 is implemented as software instructions executed by computer 60 (Fig. 5). It is appreciated that the method of Figs. 10 - 15, as well as other methods described hereinbelow, need not necessarily be performed in a particular order, and that in fact, for reasons of implementation, a particular implementation of the methods may be performed in a different order than another particular implementation.
Fig. 10 describes the main module of the software and high-level components thereof. Operation typically begins by opening the communication port to the base unit 62 and initiating communication between computer 60 and toy control device 24 via base unit 62. The main module also initiates a speech recognition engine and displays, typically via a display of computer 60, the main menu of the program for selecting various sub-modules.
The main module typically comprises the following sub-modules:
1) "About You" is a sub-module that enables a user to configure the system to the user's preferences by entering parameters such as the user's real name, secret name, age and date of birth, color of the hair and eyes, gender, and typical bed-time and wake-up hours;
2) "Sing Along" is another sub-module that provides specific content such as songs with which the user may sing along;
3) "How To Play" is a sub-module tutorial that teaches the user how to use the system and play with the toy 10;
4) "Play" is the sub-module that provides the interactive content to the toy 10 and directs toy 10 to interact with the user;
5) "Toy Check-Up" is a sub-module that helps the user to solve technical problems associated with the operation of the system, such as low battery power in the toy or lack of sufficient electrical power supply to the base station; and
6) "Exit" is a sub-module that enables the user to cease the operation of the interactive toy system software and clear it from the computer's memory.
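The main-menu structure above amounts to a dispatch from a user selection to a sub-module routine. The sketch below is illustrative only; the handler functions are placeholder stubs, not the routines of Appendix A.

```python
# Placeholder stubs standing in for the six sub-modules of Fig. 10.
def about_you():    return "about-you"
def sing_along():   return "sing-along"
def how_to_play():  return "tutorial"
def play():         return "play"
def toy_check_up(): return "check-up"
def exit_program(): return "exit"

MAIN_MENU = {
    "About You":    about_you,
    "Sing Along":   sing_along,
    "How To Play":  how_to_play,
    "Play":         play,
    "Toy Check-Up": toy_check_up,
    "Exit":         exit_program,
}

def run_selection(choice):
    """Invoke the sub-module corresponding to the main-menu choice."""
    return MAIN_MENU[choice]()

print(run_selection("Play"))  # → play
```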
Fig. 11 shows a preferred implementation of the "open communication" step of Fig. 10 in greater detail. Typical operation begins with initialization of typical system parameters such as setting up the access to the file system of various storage units. The operation continues by loading the display elements, opening the database, initializing the toy and the communication drivers, initializing the speech recognition software engine, and creating separate threads for various concurrently-operating activities such that one user may interact with the toy while another user may use the computer screen and keyboard for other purposes, such as for word processing.
Fig. 12 shows a preferred implementation of the "About You" sub-module of Fig. 10 in greater detail. Typical operation begins when the user has selected the "About You" option of the main menu on the computer's screen. The user is then prompted to indicate whether the user is an existing user or a new user. The user then provides the user's identification and continues with a registration step. Some or all of the operations shown in Fig. 12 may be performed with verbal guidance from the toy.
Fig. 13 shows a preferred implementation of the registration step of Fig. 12 in greater detail. Typical operation begins by loading a registration database, selecting a secret name, and then selecting and updating parameters displayed on the computer's screen. When the exit option is selected the user returns to the main menu described in Fig. 10.
Fig. 14 shows a preferred implementation of the "Sing Along" sub-module of Fig. 10 in greater detail. Typical operation begins with displaying a movie on the computer screen and concurrently causing all the toys 10 within communication range of the base unit to provide audio content, such as songs associated with the movie, through their speakers. The user can choose to advance to the next song or exit this module and return to the main module, such as via keyboard entry.
Fig. 15 shows a preferred implementation of the "How To Play" and "Play" sub-modules of Fig. 10. Typical operation begins with the initialization of the desired script, described in greater detail hereinbelow, minimizing the status window on the computer screen, closing the thread, and returning to the main menu. The computer continues to operate the thread responsible for the operation of the toy, and continues to concurrently display the status of the communication medium and the script on the computer screen.
Reference is now made to Figs. 16A and 16B which, taken together, form a simplified operational flow chart of one possible implementation of the opening actions of a script executed by the "Play" sub-module of Fig. 10. The implementation of Figs. 16A and 16B may be understood in conjunction with the following table of action identifiers and actions:

OPENING

Audio    Text
op002    Squeeze my foot please
op015m   Hi! Good morning to you! Wow, what a morning! I'm Storyteller! What's your Secret Name, please?
op020m   Hi! Good afternoon! Wow, what an afternoon! I'm Storyteller! What's your Secret Name, please?
op025m   Hi! Good evening! Wow, what a night. I'm Storyteller! What's your Secret Name, please?
op036m   O.K. From now on I'm going to call you RAINBOW. So, hi Rainbow, whaddaya know! O.K., Rainbow, you're the boss. You choose what we do. Say: STORY, GAME or SONG.
op040m   Ace, straight from outer space! O.K., Ace, you're the boss. You choose what we do. Say: STORY, GAME or SONG.
op045m   Rainbow, well whaddaya know! O.K., Rainbow, you're the boss. You choose what we do. Say: STORY, GAME or SONG.
op050m   Bubble Gum, well fiddle de dum! O.K., Bubble Gum, you're the boss. You choose what we do. Say: STORY, GAME or SONG.
op060    Don't be shy. We'll start to play as soon as you decide. Please say out loud: STORY, GAME or SONG.

Typical operation of the method of Figs. 16A and 16B
begins by playing a voice file identified in the above table as op002. This is typically performed by instructing the toy to begin receiving a voice file of a specific time length. The voice file is then read from the storage unit of the computer and communicated via the radio base station to the toy control device, which connects the received radio input to the toy's speaker where it is output. Voice file op002 requests that the user press the microswitch located in the nose or the foot of the toy.
If the user presses the microswitch, the script then continues by playing one of voice files op015m, op020m or op025m, each welcoming the user in accordance with the current time of day, and then requests that the user pronounce his or her secret name to identify himself or herself to the system. The script then records the verbal response of the user for three seconds. The recording is performed by the computer, by sending a command to the toy to connect the toy's microphone to the toy's radio transmitter and transmit the received audio input for three seconds. The radio communication is received by the radio base station, communicated to the computer and stored in the computer's storage unit as a file. The application software then performs speech recognition on the recorded file. The result of the speech recognition process is then returned to the script program. The script continues according to the user response by playing a personalized welcome message that corresponds to the identified secret name, or another message where an identification is not successfully made. This welcome message also requests the user to select between several options such as a story, a game or a song. The selection is received by recording the user's verbal response and performing speech recognition. More detailed descriptions of simplified preferred implementations of a story, a game, and a song are provided in Figs. 17A to 17E, 18A to 18G, and 19A to 19C respectively.
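The opening interaction just described can be sketched as a small control-flow routine. This is an illustrative sketch only: the voice file identifiers come from the table above, while the callback names, the hour thresholds, and the user registry are assumptions rather than the code of Appendix A.

```python
def opening_script(hour, play, wait_for_switch, record_and_recognize, users):
    """Play op002, wait for the microswitch, greet by time of day,
    record roughly three seconds of speech, and return the user
    matched by the recognized secret name (None if unidentified)."""
    play("op002")                      # "Squeeze my foot please"
    if not wait_for_switch():
        return None                    # user never pressed the switch
    if hour < 12:
        play("op015m")                 # morning greeting
    elif hour < 18:
        play("op020m")                 # afternoon greeting
    else:
        play("op025m")                 # evening greeting
    heard = record_and_recognize(seconds=3)
    return users.get(heard.lower())    # None when identification fails

# Exercise the flow with stubbed-out toy I/O:
played = []
user = opening_script(
    hour=9,
    play=played.append,
    wait_for_switch=lambda: True,
    record_and_recognize=lambda seconds: "Rainbow",
    users={"rainbow": "user-1"},
)
print(played, user)  # → ['op002', 'op015m'] user-1
```

Injecting the toy I/O as callbacks keeps the script logic testable apart from the radio link, microphone, and speech recognizer.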
Figs. 17A - 17E, taken together, form a simplified operational flow chart of one possible implementation of a story script executed by the "Play" sub-module of Fig. 10. The implementation of Figs. 17A - 17E may be understood in conjunction with the following table of action identifiers and actions:

STORY MENU

Audio    Text
stm105   Hey Ace, it looks like you like stories as much as I do. I know a great story about three very curious bunnies.
stm110   Hey Rainbow, it looks like you like stories as much as I do. I know a great story about three very curious bunnies.
stm115   Hey Bubble Gum, it looks like you like stories as much as I do. I know a great story about three very curious bunnies.
stm125m  A story. What a great idea! I love stories! Let's tell one together. Let's start with "Goldilocks and the Three Bears."
stm130m  Once upon a time, there was a young girl who got lost in the forest. Hungry and tired, she saw a small, cozy little house. The door was open, so she walked right in.
stm135m  On the kitchen table were three bowls of porridge. She walked up to one of the bowls and put a spoonful of porridge in her mouth.
stm140m  Oooh! You tell me. How was the porridge? Too Hot, Too Cold or Just Right? Go ahead, say the words: TOO HOT, TOO COLD, or JUST RIGHT
stm150   (Sputtering) Too hot! That was Papa Bear's bowl. The porridge was too hot.
stm155   (Sputtering) Too cold! That was Mama Bear's bowl. The porridge was too cold.
stm160   Hmmm. Just right! That was Baby Bear's bowl. The porridge was just right! And Goldilocks ate it all up!
stm170   Telling stories with you makes my day! Do you want to hear another story? Say: YES or NO.
stm180   If you want to hear another story, just say YES. If you want to do something else, just say NO.
stm195   I'm going to tell you a story about three very curious little bunnies.
stm205m  Uh-oh! It looks like the bunnies are in a bit of trouble! Do you want to hear the rest of the Bunny story now? Say YES or NO.
stm206m  Remember the Bunny story? The bunnies were eating something yummy, and then they heard someone coming. Do you want to hear what happens? Say YES or NO.
stm215m  If you want to hear the rest of the Bunny story, say YES. If you want to do something else, say NO.
stm225   No? OK, that's enough for now. Remember that you can play with the Funny Bunny Booklet whenever you want. Let's see, what would you like to do now?
stm230   Would you like to play a game or hear a song now? Say GAME or SONG.
stm245   Now, let's play a game or sing a song. You decide. Please - GAME or SONG.

Figs. 18A - 18G, taken together, form a simplified operational flow chart of one possible implementation of a game script executed by the "Play" sub-module of Fig. 10. The implementation of Figs. 18A - 18G may be understood in conjunction with the following table of action identifiers and actions:

GAME MENU

Audio    Text
gm805    Hey Ace, so you're back for more games. Great! Let's play the Jumble Story again.
gm810    Hey Rainbow, so you're back for more games. Great! Let's play the Jumble Story again.
gm815    Hey Bubble Gum, so you're back for more games. Great! Let's play the Jumble Story again.
gm820m   A game! What a great idea! I love playing games. Especially games that come out of stories.
gm840    This game is called Jumble Story. The story is all mixed up and you're going to help me fix it.
gm845m   Listen to the sentences I say when you squeeze my nose, my hand or my foot. Then squeeze again in the right order so that the story will make sense.
gm847m   Here goes. Press my nose please.
gm855m   (sneezes) Oh, sorry. (sniffles) It's o.k. now, you can press my nose.
gm860    A woman came to the door and said she was a princess
gm865m   O.k. - now squeeze my foot
gm875m   Don't worry, I won't kick. Squeeze my foot please.
gm890    Soon after they got married and lived happily ever after
gm895    One more, now squeeze my hand please.
gm905m   Just a friendly squeeze shake if you please.
gm910    Once upon a time, a prince was looking for a princess to marry
gm915    Now try to remember what you squeezed to hear each sentence. Then squeeze my hand, my foot or press my nose in the right order to get the story right.
gm921    A woman came to the door and said she was a princess
gm922    Soon after they got married and lived happily ever after
gm923    Once upon a time, a prince was looking for a princess to marry
gm924    If you want to play the Jumble Story, press my nose, squeeze my hand and squeeze my foot in the right order.
gm925    The right order is HAND, NOSE then FOOT. Try it.
gm926m   You did it! Super stuff! What a Jumble Story player you are!
gm930m   And that's the way the story goes! Now it's not a jumbled story anymore! In fact, it's the story of "The Princess and the Pea." If you want, I can tell you the whole story from beginning to end. What do you say: YES or NO?
gm932    You played Jumble Story very well! Do you want to play a different game now? Say YES or NO.
gm933    We can try this game another time. Do you want to play a different game now? Say YES or NO.
gm940    OK, then, enough games for now. There's so much more to do. Should we tell a story or sing a song? Say: STORY or SONG.
gm945    You tell me what to do! Go ahead. Say: STORY or SONG.
gm965m   This is another of my favorite games. It's called the Guessing Game.
gm970    OK, let's begin. I'm thinking about something sticky. Guess - Is it A LOLLIPOP or PEANUT BUTTER? Say LOLLIPOP or PEANUT BUTTER.
gm972    Guess which sticky thing I'm thinking about. A LOLLIPOP or PEANUT BUTTER
gm975    That's right! I'm thinking about a lollipop. It's sticky and it also has a stick.
gm980    That's right! I'm thinking about Peanut Butter that sticks to the roof of your mouth.
gm984    That was fantasticky. Let's try another. What jumps higher, a RABBIT or a BEAR? Say RABBIT or BEAR.
gm982    Let's see. What jumps higher - a RABBIT or a BEAR
gm985m   A rabbit, that's right, a rabbit jumps (SERIES OF BOINGS) with joy unless it is a toy.
gm990    I'd like to see a bear jump but I'd hate to have it land on me.
gm1005   That was excellent game playing. Let's try something different. How about a story or a song now? You tell me: STORY or SONG.
gm997    Choose what we shall do. Say STORY or SONG.

Figs. 19A - 19C, taken together, form a simplified operational flow chart of one possible implementation of a song script executed by the "Play" sub-module of Fig. 10. The implementation of Figs. 19A - 19C may be understood in conjunction with the following table of action identifiers and actions:


SONG MENU

Audio     Text
Sng305    In the mood for a song, Ace from outer space? Super! Let's do the porridge song again. Come on. Sing along with me.
Sng310    In the mood for a song, Rainbow well whaddaya know? Super! Let's do the porridge song again. Come on. Sing along with me.
Sng315    In the mood for a song, Bubble Gum, fiddle de dum? Super! Let's do the porridge song again. Come on. Sing along with me.
Sng320    A song, a song, we're in the mood to sing a song.
Sng_prog  Short "Peace Porridge"
Sng370    Do you want me to sing the rest of the song? Just say: YES or NO.
Sng390    That song reminds me of the Goldilocks story. Remember? - Goldilocks liked her porridge JUST RIGHT!
Sng395    I just thought of another great song. We can hear another song, play a game, or tell a story. Just say: SONG or GAME or STORY.
Sng410    All right, we're going to do a great song now. Here goes... [SINGS short HEAD, HAND AND SHOULDERS song]
sng415    What a song! What a great way to get some exercise! Do you want to play a game or hear a story now? Say: GAME or STORY.
sng425    I'm in the mood for a great game or a cool story. You decide what we do. Tell me: GAME or STORY.

Figs. 20A - 20C, taken together, form a simplified operational flow chart of one possible implementation of the "Bunny Short" story script of Figs. 17A - 17E and executed by the "Play" sub-module of Fig. 10. The implementation of Figs. 20A - 20C may be understood in conjunction with the following table of action identifiers and actions:

BUNNY SHORT
Audio text rb300Sm music Rb005m (Sighing) "Dear me," said the Hungry W oman as she looked in her cupboard.

(Squeaky noise of cupboard opening). It was nearly empty, with nothing left except a jar of... You decide what was in the jar? HONEY, PEANUT BUTTER or MARSHMALLOW FLUFF?

rb0lS You decide what was in the jar. Say HONEY, PEANUT
BUTTER or MARSHMALLOW FLUFF

rb026 It was HONEY

rb0301 Honey!! Sweet, delicious, sticky honey, made by bees and looooved by bears.

rb0302 Peanut butter!! Icky, sticky peanut butter that sticks to the roof of your mouth.

rb0303 Marshmallow fluff Gooey, white, and sticky inside-out marshmallows that tastes great with peanut butter!

rb30SOm She reached up high into the cupboard for the one jar which was there. (Sound of woman stretching, reaching.), but she wasn't very careful and didn't hold it very well...the jar crashed to the floor, and broke. (Sound of glass crashing and brealang.) rb30S5 And sticky Honey started spreading all over the floor.

rb3060 And sticky Peanut butter started spreading all over the floor.

rb3065 And sticky Marshmallow fluff started spreading all over the floor.

rb090m "Now I have to clean it up before the mess gets worse, so where is my mop?"

(Sounds of doors opening and closing.) Oh, yes! I lent the mop to the neighbor, Mr. Yours-Iz-Mine, who never ever returns things.

rb3075 She put on her going-out shoes and rushed out of the house. Then, a tiny furry head with long pointed ears, a pink nose and cotton-like tail popped up over the window sill. (Sound effect of something peeping, action.)

rb110 What do you think it was? A giraffe? An elephant? Or a bunny? You tell me: GIRAFFE, ELEPHANT, or BUNNY.

rb120 No... Elephants have long trunks, not long ears.

Rb125 No... Giraffes have long necks, not long ears.

Rb130 It was a bunny! The cutest bunny you ever did see!
And the bunny's name was BunnyOne.

(Sniffing) "There's something yummy-smelling in here."

Rb195 Now when bunnies get excited, they start hopping up and down, which is exactly what BunnyOne started to do.

rb200 Can you hop like a bunny? When I say, "BOING," hop like a bunny. Every time I say "BOING," you hop again. When you want to stop, squeeze my hand.

(3 boings)

rb220m While BunnyOne was boinging away, another bunny came around. BunnyTwo was even more curious than BunnyOne and immediately peeked over the window sill.

"Hey, BunnyOne," BunnyTwo said.

rb230 "Let's go in and eat it all up."

"Oh, I don't know if that's a good idea..." said BunnyOne. "We could get into trouble."

rb231m music

Rb235 No sooner had BunnyOne said that than a third pair of long ears peeked over the windowsill. Who do you think that was?

Rb245 Right you are! How did you know that! This is fun, we're telling the story together!

rb3155 His name was BunnyThree!

rb3160 BunnyThree looked at BunnyOne and BunnyTwo and he hopped smack in the middle of the honey and started licking away.

rb3165 BunnyThree looked at BunnyOne and BunnyTwo and he hopped smack in the middle of the peanut butter and started licking away.

rb3170 BunnyThree looked at BunnyOne and BunnyTwo and he hopped smack in the middle of the marshmallow fluff and started licking away.

rb3175 BunnyOne and BunnyTwo saw BunnyThree licking away and hopped in as well.

rb2751 But even as the three bunnies were nibbling away at the honey, they heard footsteps.

rb2752 But even as the three bunnies were nibbling away at the peanut butter, they heard footsteps.

rb2753 But even as the three bunnies were nibbling away at the marshmallow fluff, they heard footsteps.

rb280m Music

Figs. 21A - 21F, taken together, form a simplified operational flow chart of one possible implementation of the "Bunny Long" story script of Figs. 17A - 17E, as executed by the "Play" sub-module of Fig. 10. The implementation of Figs. 21A - 21F may be understood in conjunction with the following table of action identifiers and actions:


BUNNY LONG
Audio Text

rb280m (Suspenseful music)

rb285 "Hey Bunnies - let's go," whispered BunnyOne, who as we know was the most cautious of the bunch. "Yeah, we're out of here," answered BunnyTwo and BunnyThree. But as they tried to get away, they saw to their dismay that they were --- stuck.

rb2901 Stuck in a honey puddle.

rb2902 Stuck in peanut butter freckle-like blobs.

rb2903 Stuck in a gooey cloud of sticky marshmallow fluff.

Rb295 "What do we do?" asked BunnyTwo.

rb2961 (aside) BUBBLE GUM, don't worry, these three rabbits always manage to get away.

rb2962 (aside) ACE, don't worry, these three rabbits always manage to get away.

rb2963 (aside) RAINBOW, don't worry, these three rabbits always manage to get away.

rb297m

rb300 The door opened, and in walked the Hungry Man, who had met the Hungry Woman coming back with the mop from Yours-Iz-Mine's house.

rb3051 "So you mean to tell me that all we have for dinner is bread and honey...

rb3052 "So you mean to tell me that all we have for dinner is bread and peanut butter...

rb3053 "So you mean to tell me that all we have for dinner is bread and marshmallow fluff...

Rb315 That's not even enough for a Rabbit?" Which was what he said when he walked in the door and saw the three bunnies stuck to the floor.

rb316m

Rb320 "Sweetie, I should have known you were kidding, but you should never kid around with me when I'm hungry. Rabbit for dinner - my favorite."

Rb330 "Hey, let's go," whispered BunnyOne.
"Yeah, we've got to get out of here," whispered BunnyTwo and BunnyThree. But when they tried to move, they found their feet firmly stuck.

Rb335 The Hungry Woman came in; she had no idea what the Hungry Man was talking about, until she saw the rabbits and said: "(giggle) - yes dear, I was just joking. Yummy rabbits for your dinner. Why don't you catch the rabbits while I get wood for a fire."

rb345 "No need to catch them," said the Hungry Man. "Those rabbits are good and stuck...

right where they are. I'll go out to the garden and pick some potatoes. By the time the fire is hot, I'll be back to help you put the rabbits in the pot." And he hurried off.

rb346m (Sounds of footsteps receding, door shutting.)

Rb350m "What are we going to do?" asked BunnyThree - he wasn't so brave any more.

"Let's try to jump out" said BunnyOne.

So they tried to (boing - distorted) and tried to (boing) but they couldn't budge.

Rb355m The Hungry Woman and Hungry Man came in with wood for the fire. They were whistling happily because they knew they were going to eat well. They started the fire and put on a pot of water, whistling as the fire grew hotter (whistling in the background). All this time, the rabbits stood frozen like statues.

Rb360 Can you stand as still as a statue? If you want to practice being a statue, just like the bunnies, squeeze my hand and then stand still. When you're finished being a statue, squeeze my hand again.

rb370 "Right, so now you're a statue and I'll wait until you squeeze my hand."

rb375 "Squeeze my hand before you play Statue."

rb382 That was a long time to be a statue.

rb385 "A little more wood and the fire will be hot enough to cook in," the Hungry Woman said to her husband, and they both went out to gather more wood.

rb386 (sound effect)

Rb390 "Did you hear that?" whispered BunnyTwo fiercely. "What, oh what, are we going to do?"

"Let's try to jump one more time," said BunnyOne.

Rb395m Rainbow, you know, you can help them. When you hear [BOING], hop as high as you can.

Rb400m Ace, you know, you can help them. When you hear [BOING], hop as high as you can.

Rb405m Bubble Gum, you know, you can help them. When you hear [BOING], hop as high as you can.

Rb410m [Sound of BOING] And up the bunnies hopped. [BOING] And again they hopped. [BOING] And again they hopped.

rb4151m One more [BOING] and they were free of the puddle of honey.

rb4152m One more [BOING] and they were free of the peanut butter blob.

rb4153m One more [BOING] and they were free of the marshmallow fluff sticky cloud.

rb4201 You know why? Because as the fire grew hotter, the honey grew thinner, thin enough for the rabbits to unstick their feet.

rb4202 You know why? Because as the fire grew hotter, the peanut butter grew thinner, thin enough for the rabbits to unstick their feet.

Rb4203 You know why? Because as the fire grew hotter, the marshmallow fluff grew thinner, thin enough for the rabbits to unstick their feet.

Rb425m One more [BOING] and they were on the window sill, and then out in the garden and scurrying away.

rb426m (music)

rb435m Just then, the Hungry Man and the Hungry Woman walked in the door with the wood and potatoes, singing their favorite song (Peas Porridge Hot in background).

Rb440 They walked in, just in time to see their boo hoo hoo rabbit dinner hopping out and away in the garden.

rb445m As they hopped, they were singing happily (Honey on the Table in background).

Fig. 22 is a simplified operational flow chart of the "Theme Section" referred to in Figs. 17D, 18C, 19B, and 19C. The Theme Section presents the user with a general introduction and tutorial to the overall application.
Appendix A is a computer listing of a preferred software embodiment of the interactive toy system described hereinabove. A preferred method for implementing software elements of the interactive toy system of the present invention is now described:
1) Provide a computer capable of running the WINDOWS 95 operating system;

2) Compile the source code of the sections of Appendix A labeled:
* Installation Source Code
* Application Source Code
* ActiveX Source Code for Speech Recognition
* CREAPI.DLL
* CRPRO.DLL
* BASEIO.DLL
* Toy Configuration Source Code

into corresponding executable files onto the computer provided in step 1);
3) Install the "Automatic Speech Recognition Software Development Kit" for WINDOWS 95 version 3.0 from Lernout & Hauspie Speech Products, Sint-Krispijnstraat 7, 8900 Ieper, Belgium;
4) Compile the source code of the sections of Appendix A labeled:

* Base Station Source Code
* Toy Control Device Source Code

into corresponding executable files and install into the base communication unit 62 of Fig. 5 and into the toy control device 24 of Fig. 5, respectively;
5) Run the executable file corresponding to the Installation Source Code;
6) Run the executable file corresponding to the Toy Configuration Source Code;
7) Run the executable file corresponding to the Application Source Code;
It is appreciated that the interactive toy system shown and described herein may be operative to take into account not only time of day but also calendar information such as holidays and seasons and such as a child's birthday. For example, the toy may output special messages on the child's birthday or may generate a "tired" facial expression at night-time.
Preferably, at least some of the processing functionalities of the toy apparatus shown and described herein are provided by a general purpose or household computer, such as a PC, which communicates in any suitable manner with the toy apparatus, typically by wireless communication such as radio communication.
Preferably, once the toy has been set up, the PC program containing the processing functions of the toy runs in background mode, allowing other users such as adults to use the household computer for their own purposes while the child is playing with the toy.
Preferred techniques and apparatus useful in generating computerized toys are described in copending PCT application No. PCT/IL96/00157, in copending Israel Patent Application No. 121,574 and in copending Israel Patent Application No. 121,642, the disclosures of which are incorporated herein by reference.
A portion of the disclosure of this patent document contains material which is subject to copyright protection. The copyright owner has no objection to the facsimile reproduction by anyone of the patent document or the patent disclosure, as it appears in the Patent and Trademark Office patent file or records, but otherwise reserves all copyright rights whatsoever.
In the present specification and claims, the term "computerized creature" or "computerized living creature" is used to denote computer-controlled creatures which may be either virtual creatures existing on a computer screen or physical toy creatures which have actual, physical bodies. A creature may be either an animal or a human, and may even be otherwise, i.e. an object.
"Breathing life" into a creature is used to mean imparting life-like behavior to the creature, typically by defining at least one interaction of the creature with a natural human being, the interaction preferably including sensing, on the part of the creature, of emotions exhibited by the natural human being.
A "natural" human being refers to a God-created human which is actually alive in the traditional sense of the word rather than a virtual human, toy human, human doll, and the like.
Reference is now made to Figs. 23A and 23B, which are illustrations of the development and operation of a computerized living creature in accordance with a preferred embodiment of the present invention. Fig. 23A shows a physical creature, while Fig.
23B shows a virtual creature.
As seen in Figs. 23A and 23B, a facility for teaching the development of interactive computerized constructs is provided, typically including a plurality of student workstations 310 and a teacher workstation 312, which are interconnected by a bus 314 with a teaching facility server 316 serving suitable contents to the teacher workstation 312 and the student workstations 310. Preferably, a creature life server 318 (also termed herein a "creature support server" or "creature life support server") is provided which provides student-programmed life-like functions for a creature 324 as described in detail below. Alternatively, servers 316 and 318 may be incorporated in a single server. As a further alternative, multiple creature support servers 318 may be provided, each supporting one or more computerized living creatures. As a further alternative (not shown), a single central computer may be provided and the student and teacher workstations may comprise terminals which are supported by the central computer.
As seen in Fig. 23A, creature life support server 318 is preferably coupled to a computer radio interface 320 which preferably is in wireless communication with a suitable controller 322 within the computerized living creature 324, whereby the actions and responses of the computerized living creature 324 are controlled and stored, and its internalized experiences are preferably retained and analyzed.

It is appreciated that the computerized living creature 324 preferably is provided, by creature life server 318, with a plurality of different anthropomorphic senses, such as hearing, vision, touch, temperature, position and preferably with composite, preferably student-programmed senses such as feelings. These senses are preferably provided by means of suitable audio, visual, tactile, thermal and position sensors associated with the computerized living creature. Additionally, in accordance with a preferred embodiment of the invention, the computerized living creature 324 is endowed with a plurality of anthropomorphic modes of expression, such as speech, motion and facial expression, as well as composite forms of expression such as happiness, anger, sorrow and surprise. These expression structures are achieved by the use of suitable mechanical and electromechanical drivers and are generated in accordance with student programs via creature life server 318.
Referring now to Fig. 23B, it is seen that a virtual computerized living creature 334 may be created on a display 336 of a computer 338 which may be connected to bus 314 either directly or via a network, such as the Internet. The virtual computerized living creature 334 preferably is endowed with a plurality of different anthropomorphic senses, such as hearing, vision, touch, position and preferably with composite senses such as feelings. These senses are preferably provided by associating with computer 338 a microphone 340, a camera 342, and a tactile pad or other tactile input device 344.
A speaker 346 is also preferably associated with computer 338. A server 348 typically performs the functionalities of both teaching facility server 316 and creature life server 318 of Fig. 23A.
Additionally, in accordance with a preferred embodiment of the invention, the virtual computerized living creature 334 is endowed with a plurality of anthropomorphic modes of expression, such as speech, motion and facial expression, as well as composite expressions such as happiness, anger, sorrow and surprise. These are achieved by suitable conventional computer techniques.
It is a preferred feature of the present invention that the computerized living creature can be given, by suitable programming, the ability to interact with humans based on the aforementioned anthropomorphic senses and modes of expression, both on the part of the computerized living creature and on the part of the human interacting therewith. Preferably, such interaction involves the composite senses and composite expressions mentioned above.
Fig. 23C is a simplified semi-pictorial semi-block diagram illustration of a system which is a variation on the systems of Figs. 23A - 23B in that a remote content server 342 is provided which serves data, programs, voice files and other contents useful in breathing life into the creature 324.
Fig. 24A is a pictorial illustration of a student programming the creature 324 (not shown), preferably using a simulation display 350 thereof. Programming is carried out by the student in interaction with the student workstation 310.
Interaction may be verbal or alternatively may take place via any other suitable input device such as keyboard and mouse.

WO 99/54015 PCT/IL99/00202

The command "play record", followed by speech, followed by "stop", means that the student workstation should record the speech content generated by the student after "play record", up to and not including "stop", and store the speech content in a voice file, and that the creature life server 318 should instruct the creature 324 to emit the speech content stored in the voice file.
"If-then-endif", "speech recognition", "speech type", "and" and "or" are all control words or commands or programming instructions, as shown in Fig. 31.
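The record/stop semantics described above may be sketched as follows. The token-level representation is an illustrative assumption (the actual input is spoken audio, not text), as is the function name:

```python
def extract_voice_content(tokens):
    """Collect everything between "play record" and "stop" --- the
    content the student workstation would store in a voice file for
    the creature 324 to emit later. Token-level sketch only."""
    recording = False
    content = []
    for tok in tokens:
        if tok == "play record":
            recording = True        # start capturing after the command
        elif tok == "stop":
            recording = False       # "stop" itself is not captured
        elif recording:
            content.append(tok)
    return content
```

Per the description above, "stop" ends the capture and is itself excluded from the stored voice content.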
Fig. 24B is a pictorial illustration of human, at least verbal interaction with a computerized living creature wherein the interaction was programmed by a student as described above with reference to Fig. 24A.
Figure 24C is a pictorial illustration of a creature 324 equipped with a built in video camera 342 and a video display 582 such as a liquid crystal display (LCD). The video camera provides visual inputs to the creature and via the creature and the wireless communication to the computer. The display enables the computer to present the user with more detailed information.
In the drawing the display is used to present more detailed and more flexible expressions involving the eyes and eye brows. Color display enables the computer to adopt the color of the eyes to the user or subject matter.
It is a particular feature of the present invention that an educational facility is provided for training engineers and programmers to produce interactive constructs. It may be *rB

appreciated that a teacher may define for a class of students an overall project, such as programming the behavior of a policeman.
He can define certain general situations which may be broken down into specific events. Each event may then be assigned to a student for programming an interaction suite.
For example, the policeman's behavior may be broken up into modules such as interaction with a victim's relative, interaction with a colleague, interaction with a boss, interaction with a complainer who is seeking to file a criminal complaint, interaction with a suspect, interaction with an accomplice, interaction with a witness. Each such interaction may have sub-modules depending on whether the crime involved is a homicide, a non-homicidal crime of violence, a crime of vice, or a crime against property. Each module or sub-module may be assigned to a different child.
Similarly, a project may comprise programming the behavior of a schoolchild. In other words, the emotionally perceptive creature is a schoolchild. This project may be broken into modules such as behavior toward teacher, behavior toward parent and behavior toward other children. Behavior toward other children may be broken up into submodules such as forming of a secret club, studying together, gossiping, request for help, etc.
To program a particular submodule, the student is typically expected to perform at least some of the following operations:
a. Select initial events which trigger entry into his submodule. For example, hearing the word "club" may trigger entry into a "Forming Secret Club" submodule. These initial events may form part of the state machine of the module or preferably may be incorporated by the students jointly or by the teacher into a main program which calls various modules upon occurrence of various events.
b. List topics appropriate to the dialogue to be maintained between the schoolchild and a human approaching the schoolchild. For example, in order to form a club, the club typically needs a name, a list of members, a password, a flag, rules, etc.
c. Determine relationships between these topics. For example, the password needs to be conveyed to all members on the list of members, once the list of members has been established.
d. Formulate a branched dialogue between the schoolchild and the human, designed such that each utterance of the schoolchild tends to elicit a response, from the human, which is easily categorizable. For example, the schoolchild may wish to ask only limited-choice questions rather than open-ended questions. If, for example, the schoolchild asks, "What color should the flag be: white or black or red?" then the system merely needs to recognize one of three words.
e. Determine how to detect emotion and determine the roles of different emotions in the schoolchild-human relationship. For example, if the schoolchild is defining, in conjunction with the human, the list of members, the schoolchild may notice that the human is becoming emotional. The schoolchild may therefore elect to recommend that the list of members be terminated and/or may express empathy. Alternatively or in addition, each utterance of the schoolchild may have a slightly different text for each of three or four different emotional states of the human.
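Operations d and e above may be sketched together as a limited-choice utterance whose text varies with the detected emotional state of the human. The emotion labels and prompt texts below are illustrative assumptions (only the "white or black or red" question is taken from the example above):

```python
# Sketch of a limited-choice question (operation d) with per-emotion
# text variants (operation e). Labels and texts are assumptions.
FLAG_QUESTION = {
    "neutral": "What color should the flag be: white or black or red?",
    "excited": "Great! Quick, pick a flag color: white or black or red?",
    "upset":   "No hurry. When you're ready: white or black or red?",
}
VALID_ANSWERS = {"white", "black", "red"}

def ask_flag_color(emotion, heard):
    """Accept a recognized answer, else re-ask using the prompt variant
    for the detected emotion; the recognizer only needs three words."""
    answer = heard.strip().lower()
    if answer in VALID_ANSWERS:
        return "The flag will be " + answer + "!"
    return FLAG_QUESTION.get(emotion, FLAG_QUESTION["neutral"])
```

Because every branch expects one of only three words, even an elementary speech recognition unit suffices, which is the point of operation d.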
Other projects include programming the behavior of a teacher, parent, pet, salesperson, celebrity, etc. It is appreci-ated that the range of projects is essentially limitless.
It is appreciated that the complexity of programming an emotionally perceptive being is anticipated to cause amusing situations whereby the emotionally perceptive being performs in a flawed fashion. This is expected to enhance the learning situation by defusing the tension typically accompanying a student error or student failure situation, by associating student error with a humorous outcome. The difficulty of programming an emotionally perceptive being is not a barrier to implementation of the system shown and described herein because the system's objective is typically solely educational, and correct and complete functioning of the emotionally perceptive being is only an artifact and is not the aim of the system.
Furthermore, although programming a being which is emotionally perceptive at a high level is extremely difficult, even simplistic emotional sensitivity, when featured by a machine, has a tremendous effect on the interaction of humans with the machine. Therefore, programming of emotional perceptiveness, even at the elementary level, is a rewarding activity and consequently is capable of motivating students to enhance their programming abilities through practice.
Fig. 25 is a simplified software diagram of preferred functionality of a system administrator. Preferably, one of the teacher workstations 312 doubles as a system administrator workstation.
Fig. 26 is a simplified software diagram of a preferred functionality of teacher workstation 312 in a system for teaching development of interactive computerized constructs such as the system of Figs. 23A - 23C.
Student administration functionality (unit 715 in Fig. 26) typically includes conventional functionalities such as student registration, statistical analysis of student characteristics, student report generation, etc.
Integration (unit 740) may be performed by groups of students or by the teacher. Preferably, the teacher workstation provides the teacher with an integration scheme defining the order in which the various modules should be combined.
Run-time administration functionality (unit 750) refers to management of a plurality of creature life servers 318. For example, a teacher may have at his disposal 15 creatures controlled by 3 creature life servers, and 30 projects, developed by 300 students and each including several project modules. Some of the project modules are alternative. The run-time administration functionality enables the teacher to determine that at a particular day and time, a particular subset of creatures will be controlled by a particular creature life server, using a particular project. If the project includes alternative modules, the teacher additionally defines which of these will be used.
Fig. 27 is a simplified software diagram of preferred functionality of student workstation 310 in a system for teaching development of interactive computerized constructs such as the system of Figs. 23A - 23C. The Analysis and Design block 815 in Fig. 27 typically comprises a word processing functionality, a flowchart drawing functionality and a database schema design functionality allowing the student to document his analysis of the project module.
The Visual Programming block 840 in Fig. 27 is preferably operative to enable a student to define and parametrize software objects and to associate these objects with one another.
Software objects preferably include:
Sub-modules; events such as time events, verbal events, database events, sensor events, and combinations of the above;
functions such as motion functions, speech (playback) functions;
states for a state machine; and tasks performed in parallel.
A typical session of visual programming may, for example, comprise the following steps:
a. Student selects "view" and then "state machine" in order to view the state machine currently defining his module of the project that his class has been assigned. In response, the system displays the current state machine to the student.
b. Student selects "insert" and then selects "state", thereby to add a new state to the state machine.
c. Student selects "insert" and "connection" in order to connect the new state to an existing state in the state machine.
d. Student defines an event and function for the selected connection. The function may be selected from among existing functions listed under the Functions option, or may be generated using the Program Block option and a third generation programming language such as Basic, or by opening a state machine within the function.
Selection may be implemented by any suitable interface mechanism such as drag-and-drop of icons from a toolbox or such as selection from a menu bar and subsequent selection from menus associated with menu bar items.
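The session above (steps a-d) may be sketched as operations on a simple state machine structure; the class and method names below are illustrative assumptions, not the actual programming interface of the Visual Programming block:

```python
# Sketch of steps (b)-(d): insert a state, connect it to an existing
# state, and attach an event and function to the connection.
class StateMachine:
    def __init__(self):
        self.states = set()
        self.connections = {}  # (src, dst) -> {"event":..., "function":...}

    def insert_state(self, name):          # step (b): "insert" -> "state"
        self.states.add(name)

    def insert_connection(self, src, dst): # step (c): "insert" -> "connection"
        self.connections[(src, dst)] = {"event": None, "function": None}

    def define(self, src, dst, event, function):  # step (d)
        self.connections[(src, dst)] = {"event": event, "function": function}

sm = StateMachine()
sm.insert_state("State 2")
sm.insert_state("State 6")
sm.insert_connection("State 2", "State 6")
sm.define("State 2", "State 6", "Event 7", "Function 7")
```

The state and connection names here anticipate the example of Fig. 29, in which the connection from State 2 to State 6 carries Event 7 and Function 7.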
The visual programming block 840 preferably allows a student to select one of a plurality of "views" each comprising a different representation of the module as programmed thus far by the student. The views may, for example, include:
a. sub-modules within the module assigned to the student;
b. a list of events within the module. Events typically include time events, sensor events, verbal events, database events (e.g. that a particular counter in the database has reached zero), and combinations of the above. An event can be generated from scratch, modified, or associated with an existing connection between a source state and a destination state;
c. a state machine illustrating states in the module and connections therebetween;
d. a list of tasks, wherein each task includes a sequence of functions and/or modules and wherein an association is defined between tasks in order to allow the sequences of the various tasks to be performed in parallel;
e. a list of functions within the module. Functions typically include verbal functions (e.g. talking, speech recognition and recording), actuator functions such as motor functions and lighting functions, and database functions such as computations performed on data stored in the database.
A function can be generated from scratch, modified or associated with an existing connection between a source state and a destination state.
Within each view, the student may modify or add to any aspect of the module represented in the view. For example, in order to modify an event associated with an individual connection in the state machine, the student may typically access the event list and change the definition of the event. Alternatively, the student may access the state machine and select a different event to associate with the individual connection.
Figs. 28 - 31 are examples of screen displays which are part of a human interface for the Visual Programming block 840.
As shown in the menu bar of Fig. 28, the student is preferably given the option of performing one of the following types of activity:
Conventional file operations, conventional editing operations, viewing operations, insert operations, simulation operations and conventional Window and Help operations.
Using the View menu, also shown in Fig. 28, the student may elect to view various representations of the module he has developed, such as a project map representation, module chart representation, list of tasks, etc.
In Fig. 28, the student has selected Connections in the View menu. In response, the student typically is shown, on the screen, a list of the existing state machine connections in his or her module. The student may then select one or another of the connections. As shown, the student has selected connection t6. In response, the student sees a screen display of the parameters of connection t6, including the connection's source and destination states, and the event and function associated with the connection.
Typically, each function is a combination of one or more function primitives such as "play", "record", "set expression", etc.
A list of the currently defined function primitives and their parameters is typically displayed to the student in response to a student selection of the "function primitive" option in the View menu.
Fig. 29 is an illustration of a state machine view of a module, generated in response to the student's selection of State Machine from the View menu. As shown, interactions are shown in state form, wherein the creature moves from state to state, wherein transition from state to state is conditional upon occurrence of the event which appears between the states, and is accompanied by occurrence of the function which appears between the states.
For example, the transition from State 2 to State 6 is associated with Function 7 and Event 7. This means that when the creature is in State 2, then if it detects Event 7, it performs Function 7 and moves to State 6.
Event 7 may, for example, be that the natural human is happy. This is a complex event, being a combination of several primitive events such as Loud Voice, High Pitch, Intonation Rises at End of Sentence, "happy" detected by speech recognition unit, etc. Function 7 may, for example, be emission of the following message: "It looks like you're in a great mood today, right?"
State 6 may, for example, be a Waiting For Confirmation Of Emotional Diagnosis state in which the creature waits for the natural human to confirm or reject the creature's perception that the natural human is "in a great mood".
State 2 may, for example, be an Emotion Change state in which a change in emotion has been detected but the new emotion has not yet been characterized.
"U" denotes an unconditional transition from one state to another.
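The run-time behavior just described may be sketched as follows: in a given state, occurrence of the event on an outgoing connection triggers that connection's function and the move to the destination state, and the complex Event 7 fires only when all of its primitive events are present. The specific primitive-event names and the conjunction (AND) of primitives are illustrative assumptions:

```python
# Sketch of the state machine semantics of Fig. 29.
TRANSITIONS = {
    # (current state, detected event) -> (function, next state)
    ("State 2", "Event 7"): ("Function 7", "State 6"),
}

# Assumed primitive events whose conjunction constitutes Event 7
# ("the natural human is happy").
HAPPY_PRIMITIVES = {"loud voice", "high pitch", "rising intonation"}

def detect_event(primitives):
    """Event 7 fires only when all of its primitive events are present."""
    return "Event 7" if HAPPY_PRIMITIVES <= primitives else None

def step(state, primitives):
    """Perform one transition: detect the event, then look up the
    function and destination state; stay put if nothing matches."""
    event = detect_event(primitives)
    function, nxt = TRANSITIONS.get((state, event), (None, state))
    return function, nxt
```

An unconditional ("U") transition would simply be an entry whose event always fires.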
In Fig. 30, the student is modifying the module by inserting a new function intended to be associated with a state-to-state connection within the state machine. The function which the student is shown to be inserting is the function "record for 2 seconds".
It is appreciated that the Functions option under the View option (Fig. 28) may be employed to define functions which are a sequence of existing functions.
The screen display of Fig. 32 includes an illustration of an example of a state machine view of a project. As shown, each connection between states is characterized by an event and by a function. Occurrence of an event causes the function to be performed and the process to flow from the current state to the next state. For example, if event E1 occurs when the system is in State 1, then the system performs F1 and advances to State 6.

In Fig. 32, states are represented by ovals, events by diamonds and functions by rectangles. To insert an event and a function for a connection, the student selects the desired connection from the display of Fig. 32, then selects Insert in the main menu bar of Visual Programming and then selects, in turn, Function and Event.
The screen display of Fig. 33 enables a student to create an environment in which a previously generated module can be tested. To do this, the student typically does as follows:
a. the student generates a simulation of the software that actuates the module (launch setup);
b. the student generates a simulation of the environment which deals with inputs to the module and outputs from the module. In other words, the environment simulation generated in step (b) simulatively provides inputs to the module and accepts and acts upon, simulatively, outputs by the module which would have caused the environment to act back onto the module;
c. the student defines a setup for monitoring the module's performance. Typically, the student defines that certain detected events will be displayed on the screen and certain detected events will be logged into a log file.
d. the student executes the simulation, simultaneously monitoring the screen; and
e. the student views the contents of the log file.
Figs. 34 - 37 are examples of display screens presented by the teacher workstation 312 of Figs. 23A, 23B or 23C.
Specifically, Fig. 34 is an example of a display screen generated within the Student Administration unit 715 of Fig. 26.
As shown, the display screen enables a teacher to enter and modify student identification particulars and also to view the project and module assigned to each student and preferably, the status of the project and module. The display screen also allows the teacher to assign a mark to the student. Alternatively, assigning marks may be part of execution monitoring (unit 760).
Assignment of students to projects and modules is typically carried out within the project module assignment unit 730 as described below with reference to Fig. 35.
Fig. 35 is an example of a display screen generated within the project module assignment unit 730 of Fig. 26. As shown, the teacher typically selects a project from among a menu of projects which typically displays characteristics of each project such as level of difficulty, number of modules, etc. In Fig. 35, the teacher has selected the "policeman" project. As shown, there are several modules within the project.
The teacher also selects a class to perform the project. In Fig. 35, the teacher has selected Class 3A and in response, the screen display has displayed to the teacher, a list of the students in Class 3A.
The screen display also displays to the teacher a list of the modules in the "policeman" project and the teacher assigns one or more students to each module, typically by clicking on selected students in the student menu.
Fig. 36 is an example of a display screen generated within the integration supervising unit 740 of Fig. 26. As shown, the teacher typically determines at least an order in which modules will be integrated to form the finished project. The system typically draws graphic representations of connections between modules which are to be integrated with one another. Each such connection is typically marked with a date and with a status indication (integrated/not-integrated).
Fig. 37 is an example of a display screen generated within the assign run-time unit 755 of Fig. 26. The assign run-time unit is particularly important if the creature generated is a physical creature rather than a virtual creature. If this is the case, then the physical creature typically is a scarce resource shared by a large number of students. As shown, the teacher typically selects a physical creature, such as a red policeman, from among an available pool of physical creatures. The selected physical creature performs the functionalities defined by the teacher's students when working on the policeman project, at a teacher-determined time. If two different modules are assigned to the same time and the same creature, i.e. if the red policeman is instructed to operate in his "victim's relative" module and in his "suspect" module, then the teacher typically defines a priority system such that overriding is minimal.
Fig. 38 is a simplified flowchart illustration of the process by which the student typically uses the student workstation 310 of Fig. 23.
A preferred flowchart illustration of processes performed by the student in the course of performing steps 910 and 920 of Fig. 38 is described hereinbelow with reference to Fig. 41.

As shown, initially, a teacher or project delineator defines states, i.e. categories of emotion (happy, sad, angry).
A student operationally defines each emotion category in terms of contents of and/or characteristics of verbal inputs recorded/received from a human. The student defines events to partition emotions into categories. Characteristics of verbal inputs include: voice amplitude, voice pitch, rate of speech and diction quality.
The student defines explicit interrogations confirming various categories of emotion. The student defines each interrogation as a state, each interrogation as a function, and each result of interrogation as an event.
The student and/or teacher determines modification of interaction with human according to category of human's emotion.
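An operational definition of emotion categories in terms of the listed verbal-input characteristics might look like the sketch below. The thresholds and category names are invented for illustration; a student's actual partition would be built from the events she defines.

```python
# Illustrative sketch: partitioning emotions into categories using
# characteristics of verbal inputs (amplitude, pitch, rate of speech).
# All thresholds are assumptions, not values from the patent.
def categorize_emotion(amplitude_db, pitch_hz, words_per_min):
    if amplitude_db > 70 and pitch_hz > 220:
        return "happy"
    if amplitude_db > 70 and words_per_min > 160:
        return "angry"
    if amplitude_db < 50 and words_per_min < 100:
        return "sad"
    return "neutral"
```

The creature could then confirm the diagnosis with an explicit interrogation, as in the Waiting For Confirmation Of Emotional Diagnosis state described above.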
Fig. 39 is an example of a display screen generated by selecting Event in the Insert menu in the student workstation 310.
As shown, the event which is being selected comprises closure of various switches. Specifically, the event comprises closure of a switch in the right hand of the creature 324 or closure of a switch in the right foot of the creature.
Fig. 40 is an example of a display screen generated by selecting Function in the Insert menu in the student workstation 310. As shown, the function which is being selected comprises an eye-motion. Specifically, the function comprises movement of the eyeballs to the left.
Preferred embodiments of the present invention and technologies relevant thereto are now described with reference to Figs. 42 - 68.

A preferred architecture of the LOLA application is described in chart form in Figs. 42 - 68.
The LOLA system is a distributed application that is composed of several main processes. Address and data space boundaries separate these processes, which can reside on one computer or on different computers in the network. These processes use a standard middleware (MW) like CORBA/DCOM/RMI in order to communicate transparently with each other.
The main processes are:
Task dispatcher:
This component runs on every radio base station that communicates with living objects. The main sub-components in this component are described in Figs. 42 - 68.
Proxy Objects:
Responsibilities: Every living object in the system has a corresponding object that represents it. All operation invocations that are done on a living object are first invoked on its proxy object, and all events generated by a living object are first received in its proxy object. In addition, the proxy object is responsible for storing and tracking the state of each living object.
The proxy object is a remote object in order to allow inter-process communication.
Services used by the proxies (collaborators):
* The proxies use the provided Java Bean in order to invoke operations on and receive events from the living object.
* The security manager in order to verify if a requested operation is legal.

* The log and event service in order to log messages and generate events.
Services provided to other components:
* The tasks that are spawned by the dispatcher interact locally with the proxies.
* The IDE can interact with the proxies in order to allow remote debugging or executions.
* The management console can remotely interact with the proxy in order to invoke diagnostics and monitoring operations.
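The proxy-object idea described above can be sketched as follows. The class and method names are illustrative assumptions (the real proxies are remote middleware objects; the transport here is a stub standing in for the radio link, and the security and log services are omitted).

```python
# Hedged sketch: a proxy forwards operation invocations to its living
# object, receives its events first, and tracks its last known state.
class LivingObjectProxy:
    def __init__(self, name, transport):
        self.name = name
        self.transport = transport   # stand-in for the radio base station link
        self.state = "idle"

    def invoke(self, operation):
        # A real proxy would first ask the security manager whether the
        # operation is legal, then log the invocation via the log service.
        self.transport.send(self.name, operation)
        self.state = "busy"

    def on_event(self, event):
        # Events generated by the living object arrive at the proxy first.
        self.state = event

class FakeTransport:
    def __init__(self):
        self.sent = []
    def send(self, name, operation):
        self.sent.append((name, operation))

t = FakeTransport()
proxy = LivingObjectProxy("red_policeman", t)
proxy.invoke("raise_right_hand")
proxy.on_event("idle")   # living object reports it finished
```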
Dispatcher engine:
Responsibilities: Gets from the task manager the registered tasks for execution, and executes each task in a separate thread. The tasks run in a sandbox in order to enforce security policies.
Services used by the dispatcher:
* The task manager in order to receive the registered tasks.
* The spawned tasks use the proxy objects in order to invoke operations on the living objects.
* The timer, in order to receive time events.
* The log and event service in order to log messages and generate events.
Services provided to other components:
* The IDE can interact with the dispatcher in order to coordinate remote debugging or executions.
* The management console can remotely interact with the dispatcher in order to invoke diagnostics and monitoring operations.
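The dispatcher engine's "each task in a separate thread" behavior can be sketched like this. The sandbox here is only an exception-catching wrapper; a real sandbox would restrict what the task may do, and the function names are assumptions.

```python
import threading

# Sketch: the dispatcher runs every registered task in its own thread.
def run_sandboxed(task, results, lock):
    # Minimal stand-in for a sandbox: isolate task failures.
    try:
        out = task()
    except Exception as exc:
        out = f"error: {exc}"
    with lock:
        results.append(out)

def dispatch(tasks):
    results, lock, threads = [], threading.Lock(), []
    for task in tasks:
        t = threading.Thread(target=run_sandboxed, args=(task, results, lock))
        t.start()
        threads.append(t)
    for t in threads:
        t.join()
    return results
```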
Timer:
Responsibilities: Generates time events to the registered listeners.
Services used by the timer:
* The timer doesn't use any service provided by the LOLA system.
It only uses OS services.
Services provided to other components:
* The dispatcher registers in the timer in order to receive time events.
LOLA Servers:
This component supplies the required services to all other components in the system. The main sub-components in this component are described in Figs. 42 - 68.
Log server:
Responsibilities: The log server is responsible to log messages of other components in the system, and to retrieve those messages according to several criteria. Log messages, unlike events, are just logs, i.e. they only record information, rather than expecting that some action will be triggered from those log messages.
Services used by the log server:
* The persistent storage service in order to keep the logs in a persistent storage.
Services provided to other components:
* The dispatcher and the proxies log certain events during task executions.
* The management console and the students' IDE in order to track the execution of particular tasks.
* The teacher management console in order to receive statistics about task executions.
Monitor engine:

Responsibilities: The monitor engine is responsible to receive events from other components in the system, and to act upon them according to event-condition-action logic. The monitor engine supplies such logic on a system wide basis, even though this component can in addition reside on every radio base station in order to allow local handling of events.
Services used by the monitor engine:
* The persistent storage service in order to keep the policies and the received events in a persistent storage.
Services provided to other components:
* The dispatcher and the proxies generate events during task executions, or when polling the system for its sanity.
* The management console in order to receive the events and act appropriately upon them.
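The event-condition-action logic the monitor engine supplies can be sketched as a list of (condition, action) policies evaluated against each incoming event. Event fields and policy contents here are invented examples.

```python
# Sketch of event-condition-action logic: each policy pairs a
# condition over an event with an action to perform when it holds.
policies = [
    (lambda e: e["type"] == "battery_low",
     lambda e: f"alert admin: {e['source']}"),
    (lambda e: e["type"] == "task_failed",
     lambda e: f"log failure: {e['source']}"),
]

def handle_event(event):
    # Run the action of every policy whose condition matches the event.
    return [action(event) for condition, action in policies if condition(event)]
```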
Security manager:
Responsibilities: The security manager keeps in a repository all the users, groups, and roles in the system, and according to that decides who has the permission to do what action.
Services used by the security manager:
* The persistent storage service in order to keep the users, groups and roles in a persistent storage.
Services provided to other components:
* The proxies in order to confirm remote operations that are invoked on them.
* The task manager in order to confirm that a specific task registration is allowed.

WO 99/54015 PCT/IL99/00202

Task Manager:
Responsibilities: The task manager keeps in a repository all the tasks in the system, and according to that supplies the appropriate radio base stations the tasks that they should execute.
Services used by the task manager:
* The persistent storage service in order to keep the tasks in a persistent storage.
* The security manager in order to confirm task registration.
Services provided to other components:
* The radio base stations in order to receive the registered tasks.
Management Console:
This component is the console of the administrator that monitors and controls the system behavior, and configures the system appropriately. In addition, it provides the teacher a console from which she can query the system in order to do tasks such as evaluate students' works, or assign permissions to her students to execute particular tasks.
The main sub-components in this component are illustrated in Figs. 42 - 68. An on-line view of these components is also shown in these figures.
On-line view:
Responsibilities: The console for on-line monitoring and control of the system. Views of things like the tasks that are running on each radio base station, and the state and status of each living object. The ability to invoke operations such as changing the channel of a particular living object. The ability to view all the events that are generated in the system.
Services used by the on-line view typically include:
* The proxy objects in order to invoke operations on them, and receive events from them.
* The dispatcher in order to monitor and control tasks executions in an on-line manner.
* The monitor engine in order to receive events on a system wide basis.
Services provided to other components:
* The on-line view is only a GUI client.
A configuration view is illustrated in the figures.
Responsibilities: The console for configuring the system during its run-time. Configurations such as definitions of users, groups, and roles are done from this console.
Services used by the configuration view:
* The security manager in order to authorize the invoked operations.
Services provided to other components:
* The configuration view is only a GUI client.
Off-line view:
Responsibilities: Configurations done to the system not during its normal executions, such as upgrade, adding living objects, and others.
Services used by the off-line view:
Services provided to other components:

* The off-line view is only a GUI client.
Teacher Console:
Responsibilities: The console to be used by the teacher in order to evaluate the students' works. The teacher will be provided with information such as the popularity of the students' works, and other statistics about the task executions. In addition, the teacher will be able to view the source of all the tasks that were written by her students.
Services used by the teacher console:
* The task manager in order to view the source of her students' tasks.
* The log server in order to obtain statistics about task executions.
Services provided to other components:
* The teacher console is only a GUI client.
Integrated Development Environment (IDE):
This component runs on each student programming station. The architecture supports the following three possibilities:
* A standalone PC residing in the student's home and not connected to the Internet.
* A PC residing in the student's home, and connected to the LOLA system via the Internet. A firewall can reside between the PC in the student's home, and the LOLA system.
* A PC residing in an internal intranet, and connected to other LOLA components via a standard middleware.


IDE core:
Responsibilities: The integrated development environment that is used by the students to write tasks that will be executed by the task dispatcher.
Services used by the IDE core:
* The IDE core uses the living object simulator in order to test the task before registering it for execution.
* The IDE core can use the proxy object in order to execute the task on a real living object. This feature can be used only if the IDE core can communicate with the proxy object via the middleware, i.e. only if the PC resides on the same intranet, or remotely from home if a firewall doesn't restrict packets of the middleware port, and the available bandwidth allows that.
Services provided to other components:
* The IDE core is only a client of services.
Proxies Simulator:
Responsibilities: Simulate the proxies of the living object in order to allow local debugging and executions of tasks.
Services used by the proxies simulator: none.
Services provided to other components:
* The IDE core uses the simulator for local task execution and debugging.
Tasks registration:
Responsibilities: Browser based component that provides the students the ability to add or delete tasks for execution on a radio-based PC.
Services used by the tasks registration component:
* The task registration server.
Services provided to other components:
Deployment:
This component is responsible for the deployment of all other components in the system. In particular, it is responsible for the deployment of all proxy objects and their corresponding simulators, and the building of these objects if necessary. The building of these objects is optional, and basically there are three alternatives regarding this issue:
* All objects are of the same type, i.e. all objects have the same interface regardless of the living object they represent. Operations that are specific to a particular living object are executed via a common interface like "send cmd". The advantage of this approach is simple deployment, maintenance and configuration of the system. The disadvantage is a command set that is less meaningful to its users, and more important, that improper use of a command will be detected only when the task is executed on the living object, rather than being detected before on the simulator or at compile time.
* All objects are of the same type at the API level, but every object knows its type. All types in the system reside in a repository. Thus, from a deployment and maintenance perspective this approach is less simple, and the API of the command set is still not meaningful, but errors can be detected when the task is executed on the simulator.

* Objects of different types have different APIs to access them. Thus, the deployment and maintenance of the system is even less simple because code is generated and built according to the types of the living objects, rather than just being kept in a repository, or not kept at all. However, the command set is more meaningful to its users, and errors will be detected as soon as the task is compiled. Thus, this approach is the preferred approach. However, implementing this approach requires more development effort, and thus can be implemented only in a secondary iteration.
Task and security managers data model:
Figs. 42 - 68 include a chart which describes the data models of the task and security managers.
* User:
* Name.
* Password: encrypted using one-way function.
* Groups: one or more groups the user belongs to.
* Group:
* Name.
* Users: zero or more users that belong to this group.
* Roles: zero or more roles that are associated with this group.
* Role:
* Name.
* Permissions: According to the following criteria:
* Living object types.
* Living objects.
* Computers.

* Times: capabilities like UNIX crontab.
* Task:
* Name.
* Location.
* Users: One or more users that wrote this task.
* Execution time: Where and when this task will execute. Must match the roles that are associated with the user's group.
* Living object:
* Name.
* Type.
* Host.
* Tasks: zero or more tasks that operate this living object.
* Living object type:
* Name.
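The user portion of the data model above, including the one-way-encrypted password, might be sketched as follows. The field names follow the chart; the choice of SHA-256 as the one-way function is an assumption for illustration.

```python
import hashlib
from dataclasses import dataclass, field

# Sketch of the User record: the password is stored only as a
# one-way hash, as the data model specifies.
@dataclass
class User:
    name: str
    password_hash: str
    groups: list = field(default_factory=list)   # groups the user belongs to

def make_user(name, password, groups=()):
    digest = hashlib.sha256(password.encode()).hexdigest()
    return User(name, digest, list(groups))

def check_password(user, attempt):
    # Verification re-hashes the attempt; the plaintext is never stored.
    return hashlib.sha256(attempt.encode()).hexdigest() == user.password_hash

dana = make_user("dana", "secret", ["class_3a"])
```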
Components descriptions
Security Manager:
The security manager exports two main services for other components:
* ConfigAuthorization: Responsible to build the repository of users, groups and roles. Its exported operations are remote operations. The administrator triggers the invocation of these operations whenever she decides to update the definitions of pupils, groups and roles. The administrator makes these changes through her GUI-based console that acts as a client that uses the above mentioned operations.
* ConfirmAuthorization: Responsible to check whether a specific operation is legal, by using the data in the repository.
The clients of this service are:
* The task manager - it asks for confirmations whenever a pupil registers a task.
* The proxy objects - it asks for confirmations whenever a pupil invokes a remote operation.
Task Manager:
The task manager keeps in a repository all the tasks in the system, and according to that supplies the appropriate radio base stations the tasks that they should execute.
Figs. 42 - 68 include a diagram illustration of the scenario where a pupil registers a task for execution. She first enters her user name and password, and the security manager checks the authorization of the pupil. If authorized, the pupil gets a menu of all the allowed operations, i.e. she gets a menu with the following operations:
* Add task
* Remove task
* Update task
* List all registered tasks
Suppose that the pupil decides to register a task for execution, so she chooses the "Add task" operation. The task manager receives the task content and the task info, and asks the security manager whether the pupil is permitted to register a task with the specified task info. If so, the task manager registers the task, and notifies the pupil that the registration ended successfully.
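The "Add task" scenario can be sketched as a task manager that consults the security manager before accepting a registration. All class and method names here are illustrative assumptions.

```python
# Sketch of the task-registration scenario: the task manager asks the
# security manager for confirmation before registering a pupil's task.
class SecurityManager:
    def __init__(self, permitted):
        self.permitted = permitted            # set of (user, action) pairs
    def confirm(self, user, action):
        return (user, action) in self.permitted

class TaskManager:
    def __init__(self, security):
        self.security = security
        self.tasks = {}
    def add_task(self, user, name, content):
        if not self.security.confirm(user, "add_task"):
            return "registration refused"
        self.tasks[name] = (user, content)
        return "registration ended successfully"

tm = TaskManager(SecurityManager({("dana", "add_task")}))
```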

Task scheduler:
The task scheduler is responsible for the scheduling of all the registered tasks. Whenever the execution time of a task arrives, the task scheduler is responsible to notify the appropriate dispatcher that it should download the task and spawn it.
When the scheduler starts, it iterates through the list of registered tasks, and for every SchedInfo object it builds a simple object that contains the next time that this task should be started and stopped.
The task scheduler keeps a list of indexes of all the registered tasks, according to their execution time. It then registers in the timer to receive events whenever the execution time of a task arrives. Upon receiving such event it notifies the appropriate dispatcher that it should download and execute the task.
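The scheduler's index of registered tasks, ordered by execution time, can be sketched with a priority queue. Plain integers stand in for execution times, and the names are illustrative; the real scheduler is driven by timer events rather than an explicit `now` argument.

```python
import heapq

# Sketch: tasks are indexed in a priority queue keyed by start time,
# so the nearest execution time is always at the front.
class TaskScheduler:
    def __init__(self):
        self.queue = []                       # (start_time, task_name)

    def register(self, start_time, task_name):
        heapq.heappush(self.queue, (start_time, task_name))

    def next_due(self, now):
        """Return the names of all tasks whose start time has arrived."""
        due = []
        while self.queue and self.queue[0][0] <= now:
            due.append(heapq.heappop(self.queue)[1])
        return due

sched = TaskScheduler()
sched.register(10, "policeman_greeting")
sched.register(5, "victim_interview")
```

On each timer event the scheduler would pop the due tasks and notify the appropriate dispatcher to download and spawn them.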
Task dispatcher:
The task dispatcher gets from the scheduler a registered task, whenever the start time of the task arrives.
Then, it executes the task in a separate thread. Each task runs in a sandbox in order to enforce security policies. The following state diagram describes the task dispatcher.
A diagram included in Figs. 42 - 68 describes the data flow among the task dispatcher, task scheduler and other components in the system. The task scheduler can receive time events from the timer, and taskListChange event from the task manager. The time event is generated when the start execution time of a task arrives. This event triggers the downloading and

spawning of a task from the scheduler to the dispatcher. The taskListChange event actually changes the list of the scheduled task, thus changes the registrations in the timer.
The management console can browse and change manually the tasks that are executing.
General considerations relating to preferred LOLA system implementations are now described.
The LOLA (Living Object LAboratory) is a computer class that enables pupils to build and experience animation of physical figures called living objects. The animation provides the living objects with the ability to interact with users in a human voice, in a human-like and intelligent manner.
The Living Objects Laboratory teaches pupils to analyze, design and program "Natural Intelligence" (NI) into physical objects - the Living Objects figures. The NI developed by the pupils over time accumulates and increases the ability of the Living Objects to interact with the pupils. The Living Objects figures are distributed over the schoolyard and are used as playing and educational objects for all the children in the schoolyard.
Natural Intelligence
Natural Intelligence is the ability of a computerized object to present "human-like behavior". Human beings, even the very young, are highly adaptive to their ever-changing environment. This skill enables a significant amount of freedom in the interaction between humans.
Computer based systems have a strict interaction protocol. The behavior of a computerized machine is highly predictable and very accurate as long as the communicator (user or another computerized machine) strictly follows the rules of the protocol. Deviation from the protocol typically leads to immediate cessation of the interaction.
Programming of computers and computer-based machines is oriented to "problem solving". The program ends (or pauses, waiting for a new input or event) when a well-identified target is reached. Human interaction is oriented towards building a growing shared understanding. Even when the final goal of the interaction is to solve a problem, the "continuous goal" of each step of the interaction is to collect and add relevant information to the collective pool of knowledge. This can be done until the final goal is reached. In many situations, the final goal is not known before the interaction begins, and is identified only later, as a result of the interaction.
Implementing Natural Intelligence into a machine enables the machine to perform the following loop:
1. Identify a situation.
2. Respond to a human being.
3. Deliver information that describes the accumulated or additional understanding of the situation.
4. Identify what information is missing.
5. Suggest additional information.
6. Request additional information.
7. Receive the human response and analyze it.
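The seven-step loop above can be sketched as a skeleton. Every step body is a placeholder; the function and parameter names are assumptions, and a real implementation would replace each logged string with actual perception and response behavior.

```python
# Skeleton of the Natural Intelligence loop (steps 1-7 above).
def ni_loop(situation, get_human_response, max_rounds=3):
    log = []
    for _ in range(max_rounds):
        log.append(f"identified: {situation}")               # step 1
        log.append("respond to human")                       # step 2
        log.append(f"deliver understanding of {situation}")  # step 3
        missing = "missing-info"                             # step 4: placeholder
        log.append(f"suggest/request: {missing}")            # steps 5-6
        response = get_human_response()                      # step 7
        if response is None:                                 # nothing more to learn
            break
        situation = response     # analysis of the response updates the situation
    return log

responses = iter(["clarified situation", None])
log = ni_loop("initial situation", lambda: next(responses))
```

Note how the loop embodies the "continuous goal" described above: each round adds to the shared understanding rather than terminating at a fixed target.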
Goals of LOLA

The first implementation of LOLA is targeted at high schools for educational purposes. These are the high level goals of the project:
* Teaches pupils to analyze, design and program "Natural Intelligence" (NI) into physical objects.
* A friendly and easy to use system that will attract pupils to learn high technology subjects.
* Supports teachers in task assignment and grading.
* Serves as content-based objects that amuse and provide information to the pupils and staff.
Services and their Use Case Analysis
The main actors in the system are pupil, teacher, administrator and user. This document specifies the important use-cases of the actors of the system. The use-cases are grouped by the actors targeted by the service: pupil, teacher, administrator and user. One person can act as one or more actors.
In particular, every pupil, teacher and administrator is also a user of the system. It might be that the same person acts as a teacher and an administrator.
The major components in the system are:
* Programming station: every station that contains the IDE (Integrated Development Environment) that provides the ability to program NI into Living Objects. The computer at the pupil's home can also be such a programming station, if Creator IDE was installed on it.
* Radio based station: every station that communicates with one or more Living Objects (via RF communication), and sends these objects commands.
* LOLA servers: Station that hosts the servers of the LOLA system, e.g. task server, security server.
* Teacher and administrator console: stations in the lab that are used by the teacher and administrator respectively.
* Living objects: Living objects are toys equipped with a control device. The control device contains a micro-controller, a radio transceiver and I/O ports. The I/O ports connect to various peripheral components also contained within the Living Objects, such as: speaker(s), microphone(s), sensors, actuators, motor(s), lamps, video camera, etc. The peripherals enable the Living Object to interact with humans in a human-like manner. The peripherals are operated by the micro-controller. The micro-controller receives its program instructions in real time from a radio-based PC via the built-in transceiver.
Two more secondary actors that provide data for building an internal database are later introduced. An information server that provides data for building an internal database that support queries made from pupils tasks, and a contents provider that provides contents that will be kept in a contents database. These contents will be scheduled for execution as determined.
We describe the services, and an analysis of the related use cases.
Pupil Services
The main services offered to pupils, who build the behaviors of the living objects, are illustrated in the drawings.

Name: Creator IDE Installation
Actors: Pupil if installed on her home PC; administrator if installed on a PC at school. Teacher might also install the IDE on her home PC in order to browse her pupils' tasks.
Goal: That Creator IDE will be installed correctly.
Forces in Context:
1) There could have been previous installations. In such a case, this installation will be an upgrade of previous installations.
2) InstallShield-type installation.
3) Pupil typically works on a Windows 95/98 based PC, but might also work on other environments such as Macintosh, Windows 3.11/DOS, Linux or NC (in such a case the installation will take place on the server).
Trigger: Actor starts the installation process from a CD, or from a downloaded file.
Summary: This use case captures the first and later installations of Creator IDE:
1) Actor is asked for several configuration parameters.
2) Actor advances to regular usage of Creator IDE.
Pre-conditions: Actor downloaded the package, or has a CD.
Post-conditions: Creator IDE is installed.
Related use cases: Create or Update living object types on a PC at home should follow immediately, or be deferred to a later time at the user's convenience.
Name: Add living object type at home
Actors: Pupil. Teacher might also be an actor of this use-case if she has installed the IDE on her home PC. Administrator is not an actor here: Administrator has a separate use case dealing with living object updates.
Goal: That the types of all living objects in the system will be known to Creator IDE, in order to support a simulator for every living object type.
Forces in Context:
1) The information source will typically be the LOLA system installed at school, and the update process will be browser based and done via the Internet. A firewall might reside between the pupil's browser at home and the LOLA system.
2) The pupil can put the required data on a floppy disk (or other media) at school, and then install it on her PC at home.
Trigger: Can be either one of the following triggers:
1) The Creator IDE has just been installed.
2) A new type of living object has been connected to the system.
Summary: Create or update the types of the living objects known to the IDE installed at the pupil's home.
Pre-conditions: Creator IDE has been installed.
Post-conditions:
1) The simulators in Creator IDE match the types of the available living objects.
2) Pupil can commence to build a decision tree.
Related use cases: 1) Creator IDE Installation 2) LOLA installation

Name: Build a decision tree
Actors: Pupil
Goal: Build a task that is ready for compilation.
Forces in Context:
1) No programming knowledge is required.
2) Easy to use, friendly GUI.
3) Can reuse decision trees or sub-trees made in previous tasks.
4) Can use built-in decision trees or sub-trees.
5) Pupil wants to use high level commands that are specific to the toy she is working with.
Trigger:
1) Teacher assigns homework to her pupils.
2) Pupil builds the decision tree during a class in the lab, or by his own free choice.
Summary: This use case captures the scenario where a pupil builds a decision tree in order to program NI into a living object.
1) Pupil launches Creator IDE.
2) Pupil builds a decision tree.
Pre-conditions: 1) Creator IDE is installed on the pupil desktop.
Post-conditions: 1) A task that is ready for compilation.
Related use cases:
1) Creator IDE installation: is a requirement.
2) Create or Update living object types on a PC at home: is a requirement.

Name: Build a highly customized decision tree
Actors: Pupil
Goal: Build a task that is ready for compilation.

Forces in Context 1) Basic programming skills are required.
2) Easy to use programming language and libraries.
3) Reuse decision trees or sub-trees made in previous tasks.
4) Use built-in decision trees or sub-trees.
5) Pupil wants to use high level commands that are specific to the toy she is working with.
Trigger I) Teacher assigns homework to her pupils.
2) Pupil builds the decision tree during a class in the lab, or by his own free choice.
Summary This use case captures the scenario where a pupil builds a decision tree in order to program NI into a living object.
1) Pupil launch Creator IDE.
2) Pupil builds a decision tree.
Pre-conditions 1) Creator IDE is installed on the pupil desktop.
2) Simulators simulate the living objects that exist in school.
Post-conditions 1) A task that is ready for compilation.
Related use cases 1) Creator IDE installation: is a requirement.
2) Create or Update living object types on a PC at home: is a requirement.
Name Compile a task Actors Pupil Goal Produce a task that is ready for execution on a living object, which behaves according to the decision tree built by the pupil.
Forces in Context 1) Pupil should not be familiar with the internal implementation of the decision tree.
2) If the pupil only built a decision tree, without the addition of pupil-defined macros/code, then the compilation process should be expected to pass in most cases.
3) Compilation errors/warnings should be displayed on a view of the decision tree. Only in cases where the pupil added macros should those added lines be displayed as well.
4) Friendly, easy to use.
Trigger 1) Pupil has built a decision tree.
Summary This use case captures the scenario where a pupil built a decision tree, and wants to compile it.
1) Pupil launches Creator IDE.
2) Pupil builds a decision tree.
3) Pupil compiles the task.
Pre-conditions 1) Pupil has built a decision tree.
Post-conditions 1) If compilation passes - a task that is ready for execution.
Related use cases 1) Build a highly customized decision tree or Build a decision tree is a requirement.
Name Execute a task Actors Pupil Goal Execute a task locally on the pupil PC in order to check it. The task interacts with a living object simulator that resides on the pupil PC or, if available, with a physical living object connected either to the pupil PC or to another PC in the network.
Forces in Context 1) The living object simulator should accurately simulate a physical living object's behavior. In particular, it should expose all errors that can occur when this task is executed alone on a living object.
2) Should look like an integral part of Creator IDE.
3) Friendly, easy to use GUI.
4) Security: check pupil permission in case she is trying to execute the task on a living object connected to a remote PC.

Trigger 1) Pupil built and compiled a task, and wants to execute it.
Summary This use case captures the scenario where a pupil has built a decision tree, and wants to run it immediately, typically in order to check the task.
1) Pupil launches Creator IDE.
2) Pupil builds a decision tree.
3) Pupil compiles the task.
4) Pupil executes the task.
Pre-conditions 1) Pupil has built a decision tree and compiled it.
Post-conditions 1) A task that is ready for execution on a living object.
Related use cases 1) Build a highly customized decision tree or Build a decision tree, and Compile a task, are requirements.
Name Debug a task Actors Pupil Goal Debug a task locally on the pupil PC. The task interacts with a living object simulator that resides on the pupil PC or, if available, with a physical living object connected to the pupil PC or to another computer in the network.
Forces in Context 1) The living object simulator should accurately simulate a physical living object's behavior. In particular, it should expose all errors that can occur when this task is executed with the living object alone.
2) Should look like an integral part of Creator IDE.
3) Friendly, easy to use GUI.
4) Security checks if pupil executes the task on a living object connected to a remote PC.
5) Pupil can trace task execution in steps, and can see graphically which node in the decision tree is currently being executed.
6) Pupil can step into lines of code added to the decision tree.
7) Usual debug capabilities such as step into, step over, run to cursor, set breakpoint, continue, watch, etc.
Trigger 1) Pupil built and compiled a task, and wants to debug it.
Summary This use case captures the scenario where a pupil has built a decision tree, and wants to debug it.
1) Pupil launches Creator IDE.
2) Pupil builds a decision tree.
3) Pupil compiles the task.
4) Pupil debugs the task.

Pre-conditions 1) Pupil has built a decision tree.
Post-conditions 1) A task that is ready for execution on a living object.
Related use cases 1) Build a highly customized decision tree or Build a decision tree, and Compile a task, are preferably requirements.
Name Task registration Actors Pupil Goal That the task will be installed correctly, and run when scheduled.
Forces in Context 1) Browser-based registration via the Internet or intranet.
2) Security, privacy.
3) Firewall can reside between the web-based client and the servers.
Trigger Pupil starts the registration process, typically after she has built, executed and debugged a task.
Summary This use case captures the case where pupil registers a task for execution.

1) Pupil is asked for a user-name and password.
2) Pupil is asked to send the file of the task.
3) Pupil can browse all her registered tasks, and perform additional operations such as remove previously registered tasks.
Pre-conditions Pupil has built, executed and debugged her task.
Post-conditions Task is registered for execution as scheduled.
Related use cases 1) Debug a task or Execute a task.
Name Browse task execution logs Actors The main actor is a pupil. A teacher or an administrator might also be the actors of this use-case, typically in order to help in problem solving.
Goal Browse the logs of a task that has already been executed, typically in order to diagnose problems.
Forces in Context 1) Pupils can browse the logs from every PC that is connected to the intranet.
2) Browser-based log browsing via the Internet, where a firewall resides between the PC at home and the LOLA system, is a nice-to-have feature.
3) Pupil can browse logs according to several criteria.
Trigger 1) Pupil's task has been executed, and pupil wants to browse the execution logs.
Summary This use case captures the scenario where a pupil has built a decision tree, registered it for execution, and wants to browse the logs of the execution.
1) Pupil launches Creator IDE.
2) Pupil builds a decision tree.
3) Pupil debugs the task.
4) Pupil registers the task.
5) Pupil browses the execution logs.
Pre-conditions 1) Pupil has registered a task, and that task has already been executed.
Post-conditions 1) Pupil understands how her task has been executed.
Related use cases 1) Task registration is a requirement.
Teacher Services Teacher is responsible for all aspects of task assignments, checking and evaluation.
Name Browse pupils tasks Actors Teacher Goal Browse pupils tasks in order to evaluate their tasks, or help with problem solving.
Forces in Context 1) Security, privacy - only a teacher can browse pupils tasks.
2) Teacher can browse every registered task.
3) Teacher uses creator IDE as the task browser.
4) According to the configuration, teacher can or cannot change pupils tasks.
Trigger Teacher wants to evaluate her pupils tasks, or help them in problem solving.
Summary 1) Teacher launches Creator IDE.
2) Teacher logs into the task manager.
3) Teacher loads a task from the server to her IDE.
Pre-conditions 1) Creator IDE is installed on the teacher desktop.
Post-conditions A pupil task appears on the teacher console.
Related use cases Creator IDE Installation is a requirement.
The use case of Executed tasks statistics is also used as a measure to evaluate pupils tasks.
Name Executed tasks statistics Actors Teacher Goal Teacher browses through the statistics gathered about her pupils tasks, typically in order to evaluate their work.
Forces in Context 1) Security, privacy - only a teacher can browse pupils tasks.
2) Teacher can browse every statistics related to her pupils tasks.
Trigger 1) Teacher wants to evaluate her pupils tasks.
Summary 1) Teacher logs into the statistics server.
2) Teacher queries the server for data, and browses this data.
Pre-conditions Pupils tasks have been already executed in the system.
Post-conditions Teacher has more measures to evaluate her pupils tasks.
Related use cases The use case of Browse pupils tasks is also used as a measure to evaluate pupils tasks.
Administrators Services The administrator is responsible for the installation, deployment, maintenance, diagnostics, monitoring and controlling of the system.
Name Installation Actors Administrator Goal That the LOLA system will be installed correctly. Forces in Context 1) Application components should be deployed in such a way that no bottlenecks occur, and the system runs efficiently.
2) Installation process can be done from a central location.
3) There could have been previous installations. In such a case, this installation will be an upgrade of previous installations.
4) InstallShield-like installation.
5) System should scale to support tens of living objects, and hundreds of pupils.
Trigger Administrator starts the installation process from CD, or from a downloaded file.
Summary This use case captures the first, and later installations of the LOLA system:
1) Administrator is asked for several configuration parameters.
2) Administrator advances to the update living object use case.
Pre-conditions Administrator has downloaded the package, or has a CD.
Post-conditions Everything is setup for defining living object types.
Related use cases 1) Update living object types can follow immediately, or be deferred to a later time at the user's convenience.
Name Add living object types Actors Administrator Goal That the types and objects of all living objects in the system will be known to the system, and appropriate application components will be deployed according to that.
Forces in Context 1) Done from a central location.
2) Living objects and objects types can be added or removed from the system during its lifetime, and not only after the installation.
3) In particular, the simulators residing in the IDE on the pupils PCs at home should be updated.
Trigger 1) The LOLA system has been just being installed.
2) New type of living object should be connected to the system.
Summary The system is configured according to the available living objects.
Pre-conditions Installation of the system.
Post-conditions All living object types are known in the system.
Related use cases 1) Installation 2) Trigger the use case of Create or update living object types on a PC at home.
Name Pupils, groups and roles definitions Actors Administrator Goal Pupils can log into the system, and perform actions according to their permissions.
Forces in Context 1) Flexibility - a pupil can belong to one or more groups, and each group can have one or more roles. The same role can be assigned to several groups.
2) This process can be done after installation, and configuration of the living object, as well as on a regular basis whenever new pupils, groups or roles should be added or removed.
3) Users definition is independent of the OS users.
Trigger The teacher asks the administrator to open accounts for her pupils, so that they can start using the system.

Summary This use case captures the scenario where a teacher of a class wants her pupils to be granted permission to use the system.
1) Administrator defines roles: each role definition consists of role name and the permissions that the owner of this role is granted. Permissions can be granted according to the following criteria:
* Living object types.
* Living objects.
* Times: capabilities like UNIX crontab.
2) Administrator defines groups: each group definition consists of group name, and zero or more roles that are associated with this group.
3) Administrator defines users: each user definition consists of user name, password (encrypted with a one-way function) and zero or more groups that are associated with this user.
Pre-conditions 1) Installation.
2) Update living object types.
Post-conditions Pupils can log into the system according to their permissions.
Related use cases 1) Installation and Update living object types are required.
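As an illustration only, the roles, groups and users scheme above can be sketched as follows. All class and field names here are assumptions, not part of the LOLA specification; the crontab-like time criterion and the one-way password hash are modeled as plain strings:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass(frozen=True)
class Permission:
    living_object_type: str      # e.g. "dog" (assumed type name)
    living_object: str           # a specific object id, or "*" for any
    times: str                   # crontab-like schedule string

@dataclass
class Role:
    name: str
    permissions: List[Permission] = field(default_factory=list)

@dataclass
class Group:
    name: str
    roles: List[Role] = field(default_factory=list)

@dataclass
class User:
    name: str
    password_hash: str           # stored as a one-way hash, never plaintext
    groups: List[Group] = field(default_factory=list)

    def effective_permissions(self) -> List[Permission]:
        """Union of the permissions of every role in every group."""
        return [p for g in self.groups for r in g.roles for p in r.permissions]

# Example: a pupil role permitted to use any dog-type creature on school days
pupil = Role("pupil", [Permission("dog", "*", "0 8-16 * * 1-5")])
class_a = Group("class-a", [pupil])
dana = User("dana", "<one-way-hash>", [class_a])
```

The same role object can be shared by several groups, matching the flexibility requirement above.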
Name Diagnose, monitor and control the system.
Actors Administrator Goal That the actor be able to diagnose, monitor and control the system.
Forces in Context 1) Potential problems should be detected in advance when possible.
2) Isolate problems through diagnostics tools.
3) Resolve problems through corrective measures.
4) Automatic sanity checks.
5) Allow the administrator to define automatic actions for certain events, e.g. change the RF channel upon receiving a specific time event.
6) Administrator can invoke operations on living objects, and receive events from them in an on-line manner.
7) Administrator can browse all events in the system.
8) Browser-based management console.
9) Security.
10) Integration with enterprise management console if exists.
Trigger Management of the system on a regular basis, or after a pupil or a teacher complains of problems.
Summary 1) Administrator launches a browser-based management station.
2) Administrator diagnoses, monitors, and controls the system.
Pre-conditions 1) System has already been installed.
Post-conditions System functions correctly.
Related use cases 1) Installation.
2) Browse and change scheduling time of tasks.
Name Browse and change scheduling time of tasks.
Actors Administrator Goal Control the execution time of tasks from a central location, and from a view of the whole system.
Forces in Context 1) Potential problems that stem from task scheduling should be detected in advance when possible.
2) Administrator should be able to see the scheduling time of all tasks in the system, and in several views.
3) Administrator should be able to change scheduling time of tasks, or to schedule unscheduled tasks for execution.
4) Security.
Trigger 1) Pupils have just registered their tasks for execution. Administrator wants to verify that they scheduled their tasks appropriately. Note: Pupils can only register tasks according to their permissions. However, they can still register tasks inappropriately; for example, if two or more pupils have registered tasks on the same living object with overlapping times, and those tasks act on the same sensors.
2) Pupils have registered tasks, but didn't specify the scheduling time, typically because the administrator wants to avoid conflicts and specify it herself. Thus, the administrator specifies the scheduling times of all tasks.
3) Tasks had been downloaded from a content provider server on the Internet. Administrator wants to schedule those tasks for execution.
Summary 1) Administrator launches browser-based management station.
2) Administrator browses all tasks in the system, and their scheduling times, if scheduled.
3) Administrator changes scheduling times of tasks, or schedules new tasks for execution.
Pre-conditions 1) System has already been installed. 2) Tasks have already been registered in the system, or downloaded into the system.
Post-conditions Tasks are scheduled for execution as desired.
Related use cases 1) Installation.
2) Diagnose, monitor and control the system.
Users Services The users can be anyone in the schoolyard who interacts with a living object. In particular, a user can be a pupil, a teacher, an administrator, or none of them.
Name Interaction with living object Actors User Goal The purpose of the interaction can be for amusement, education, task checking (pupil or teacher), or system checking (administrator).
Forces in Context 1) Friendly interaction.
2) The living object operates according to the registered tasks and the scheduler that schedules these tasks for execution.
Trigger User sees a living object in the schoolyard and decides to interact with it.
Summary This use case captures the scenario where a user interacts with a living object. User interacts with the living object by voice (listening or talking to it), by watching its reactions, or by triggering its sensors.
Pre-conditions One or more tasks are executing with the living object.
Post-conditions One or more of the following:
1) The user is amused, or more educated.
2) A task has been checked with a physical living object (student or teacher).
3) The living object has been checked for functionality (administrator).
Related use cases 1) Execute a task.
2) Debug a task.
3) Task registration.
Contents providers Services External servers that interact with the system in order to push data into LOLA database, or supply such data upon a request from a LOLA client.
Name Build contents database Actors Contents providers Goal Push or supply tasks (contents) that will run on living objects.
Forces in Context 1) Leverage the capabilities developed for the LOIS system.

2) Contents can be pushed automatically on a regular basis, or can be pulled upon a request.
3) Tasks written by contents providers are scheduled for execution in a similar way to tasks written by pupils.
Trigger Depends on the configuration:
1) Generally, the administrator will configure the push client to run updates at specific intervals, so the trigger is the push client scheduler.
2) Administrator may manually initiate a download.
Summary This use case captures the scenario where the administrator at school wants to schedule for execution tasks that were written by contents providers, and to update these tasks on a regular basis. These tasks are scheduled for execution in a similar way to tasks written by pupils.
All the use-cases that support that action, e.g. registration, billing, and the content-provider side, are considered part of the LOIS system.
Pre-conditions 1) The LOLA system has been installed.
2) The installation and registration use cases of the LOIS system.
Post-conditions 1) New content that is ready for execution resides now in the tasks database.
Related use cases 1) Installation.
Information servers Services External servers that interact with the system in order to push data into the LOLA database, or supply such data upon a request from a LOLA client.
Name Supplies information to build a database that supports queries of pupils tasks.
Actors Information servers Goal Push or supply data that will serve pupils database queries.
Forces in Context 1) Use standard tools and protocols to build this database.
2) Data can be pushed automatically on a regular basis, or can be pulled upon a request.
Trigger Depends on the configuration:
1) Generally, administrator will configure the push client to run updates at specific intervals, so the trigger is the push client scheduler.
2) Administrator may manually initiate a download.
Summary This use case captures the scenario where the administrator at school wants to build an internal database that pupils can query, instead of searching for the desired data on the web.
Pre-conditions The LOLA system has been installed.
Post-conditions 1) The database is updated.
Related use cases 1) Installation.
Fig. 42 is a simplified flowchart illustration of an emotional interaction flowchart design process.
Figs. 43 - 102 illustrate preferred embodiments of a computerized programming teaching system constructed and operative in accordance with a preferred embodiment of the present invention.
Figs. 69 to 102 are now described in detail.
Figure 69 is a general logical overview of the system network with the servers (such as the database server 316 and creature control server 318) at the center and the students' programming workstations 310, teacher station 312, administrator station 1200, and radio base station 320 clustered around the servers.
Figure 70 is a general logical overview of the control over the creatures 322, with the radio base station 320 (which provides the control over the creatures) at the center and the students' programming workstations 310, teacher station 312, and administrator station 1200 clustered around it.
Figure 71:

The main menu of the administrator station comprises four main sub-menus: Real-Time Information 1250 regarding the operation of the system, Diagnose 1260 for troubleshooting hardware and software problems, Configuration and registration 1270 of software and hardware components, and Task 1280 for the deployment and administration of the various tasks (projects, programs) provided by students and executed by the system.
Figure 72 illustrates the basic steps for developing and testing a task (project, program) at home. First the student develops the task (step 1290), then compiles the source code (step 1300), then executes the task using the simulator (step 1310). If the task does not perform as it was designed, the student uses the simulator (step 1320) to debug the program, find the problem, correct it and test the task again. If the task performs as designed, the student registers the task (step 1330) to be executed over a physical creature.
Figure 73 illustrates the process of developing, testing and registration of a task by a student at home and at school. The process begins with the student at home, similar to Fig. 62; however, the student transfers the task to school and continues with the same process at school.
Figure 74 is a flow chart describing a very simple "decision tree" (also termed "state machine"). This flow chart instructs the creature to enter "listen mode", thus recording the verbal utterances of the user and processing the recording by means of the speech recognition engine. The listen mode persists until the term "wake-up" is spotted, whereupon the task sings a song. After the song is finished the process repeats.
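A minimal sketch of this listen/sing decision tree as a two-state machine follows. The creature interface here is invented for illustration (the actual Creator commands are not shown in this text): spot_word stands in for the speech recognition engine, and sing_song for the creature's playback.

```python
class FakeCreature:
    """Hypothetical stand-in for the creature and its speech engine."""
    def __init__(self, utterances):
        self.utterances = iter(utterances)
        self.songs_sung = 0

    def spot_word(self):
        # Speech recognition: return the next spotted term
        return next(self.utterances)

    def sing_song(self):
        self.songs_sung += 1

def run_task(creature, max_songs=1):
    """LISTEN until "wake-up" is spotted, then SING; the process repeats."""
    state = "LISTEN"
    while creature.songs_sung < max_songs:
        if state == "LISTEN":
            if creature.spot_word() == "wake-up":
                state = "SING"
        else:  # state == "SING"
            creature.sing_song()
            state = "LISTEN"   # after the song, listen again
    return creature.songs_sung
```

Here max_songs merely bounds the loop for testing; on a real creature the cycle would run until the task is stopped.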
Figure 75 is a block diagram showing the main functions of the simulation engine. The simulation engine enables the student to test the program (task) developed for a physical creature without the physical creature itself. The simulation engine provides all the physical functions of the physical creature by means of standard computer peripherals, such as the computer microphone to simulate creature listen functions (1450), computer speakers to simulate creature talking functions (1460), simulation of the creature motion by displaying animation of the creature on the computer screen (1470), simulation of the creature sensors with the computer keyboard and mouse (1480), and simulation of the video display and video camera installed in the creature by means of the computer display and a peripheral video camera.
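One way to picture this arrangement is a single creature interface with a simulator backend that maps each physical function onto a PC peripheral. The class and method names below are assumptions for illustration, not the actual LOLA API; the peripheral calls are replaced by recorded state so the sketch is self-contained:

```python
class Creature:
    """Abstract creature functions (sketch)."""
    def say(self, text):
        raise NotImplementedError      # talking functions (1460)
    def move(self, gesture):
        raise NotImplementedError      # motion (1470)
    def sense(self, sensor):
        raise NotImplementedError      # sensors (1480)

class SimulatedCreature(Creature):
    """Maps each creature function onto a standard PC peripheral."""
    def __init__(self):
        self.spoken = []        # would be played on the computer speakers
        self.animation = []     # would be drawn as on-screen animation
        self.key_state = {}     # keyboard/mouse stand in for the sensors

    def say(self, text):
        self.spoken.append(text)

    def move(self, gesture):
        self.animation.append(gesture)

    def sense(self, sensor):
        return self.key_state.get(sensor, False)
```

A task tested against SimulatedCreature could later run unchanged against a class that drives the physical creature, which is the point of keeping one interface.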
Figure 76 is a flow chart describing the process of registration and execution of a project (task). In step 1500 the student or the teacher registers the task in the database server (Lola server) 316 for future execution by means of a specific creature control server 318 and a specific creature 324. In step 1510, at the appropriate date, time or other conditions as specified in the registration step 1500, the Lola server 316 sends the task to the appropriate creature control server 318 for execution. The Creature Control Server launches the program and executes it by sending commands via the radio base station (320) to the appropriate creature (324).
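The register-then-dispatch flow can be sketched as below. This is a hypothetical illustration: the class name, the tuple-based registry, and the due_tasks query are all assumptions, standing in for steps 1500 and 1510.

```python
import datetime

class LolaServer:
    """Sketch of the database server's task registry (step 1500/1510)."""
    def __init__(self):
        self.registry = []   # (run_at, task_name, creature_id)

    def register(self, run_at, task_name, creature_id):
        """Step 1500: a pupil or teacher registers a task for a creature."""
        self.registry.append((run_at, task_name, creature_id))

    def due_tasks(self, now):
        """Step 1510: tasks whose scheduled time has arrived; these would
        be handed to the matching creature control server for execution."""
        return [(task, cid) for (at, task, cid) in self.registry if at <= now]

server = LolaServer()
server.register(datetime.datetime(1999, 4, 1, 9, 0), "sing", "creature-324")
```

A real server would also check the registering user's permissions and remove tasks once executed; both are omitted here.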
Figure 77 is a Block diagram of the main services available to the teachers. Teachers can access exclusive extensions of the IDE (step 1600) to select and investigate each of the tasks of each of the students (step 1610). The teacher can browse the student tasks (1620), view statistics associated with the execution of the tasks (1630), such as absolute performance statistics (1640) and relative performance statistics (1650), and assign marks to the students (1660).
Figure 78 is a Block diagram of the Living Object Laboratory (LOLA) system topology, comprising the main subsystems:
The LOLA Server, comprising one or more servers, such as the database server and creature control servers; Administrator Station (1710); Teacher station (1720); Student Programming station (1740); and Radio Base Station (1750). All the main sub-systems, except for the radio base station, are interconnected by networking means such as HyperText Transport Protocol (HTTP) or middleware (MW), where middleware is any appropriate interfacing software. Typically, all subsystems except for the Radio Base Station are interconnected over a Local Area Network (LAN) such as Ethernet, while the Radio Base Station is connected by means of Universal Serial Bus (USB).
Figure 79 is a Block diagram of the Living Object Laboratory (LOLA) system presenting the main (logical) services provided by the system: The database engine 1760 manages all accesses to the database repository 1765. The log server 1770 logs details of the execution and performance of all creatures and tasks running in the system. The monitor engine 1775 presents to the users real-time information about the performance of tasks executed by the system at the time of monitoring. The security manager 1780 supervises all user access to the system and verifies that only authorized users have access to particular parts of the database, as predetermined by the administrator. The task manager 1785 supervises the operation of all tasks in the system according to instructions provided by authorized users. These services are typically provided by software subsystems that are separated and interconnected by conventional means of communication such as HTTP and middleware.
Figure 80 is a Block diagram of the main services available to the system administrator by means of the system administrator station 1200. These services typically comprise:
On-line console 1800 for all services that are available while the system functions regularly.
Off-line console 1810 for all services available when the system is shut down for major installation and maintenance procedures.
Configuration console 1820 that enables the system administrator to set up hardware peripherals, networking configuration, etc.
Deployment console 1830 that enables the system administrator to set-up new creatures or change the configuration of existing creatures.
Figure 81 is a Block diagram of the main modules of the software of the Creature Control Server, whether implemented as an independent server or as a part of another server such as the general LOLA server. The Creature Control Server comprises a multiplicity of Proxy Objects 1840, each of which is responsible for a specific creature, and a scheduler task that is responsible for the coordination and timing of the operation of the various proxies.
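The proxy-plus-scheduler layout just described can be sketched as follows. All names are illustrative assumptions; the real server would send commands over the radio base station rather than return them.

```python
class CreatureProxy:
    """Holds the pending commands for one specific creature."""
    def __init__(self, creature_id):
        self.creature_id = creature_id
        self.pending = []

    def enqueue(self, command):
        self.pending.append(command)

    def step(self):
        # One slot of radio time: emit the next command, if any
        if self.pending:
            return (self.creature_id, self.pending.pop(0))
        return None

class Scheduler:
    """Round-robin coordination and timing of the proxies' radio time."""
    def __init__(self, proxies):
        self.proxies = proxies

    def tick(self):
        sent = []
        for proxy in self.proxies:
            command = proxy.step()
            if command is not None:
                sent.append(command)   # would go out via the base station
        return sent
```

Round-robin is only one plausible policy; the text above does not specify how the scheduler divides time among proxies.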
Figure 82 is a Block diagram of the main services available to the student by means of the programming station.
These services are implemented as modules interconnected by means of interfacing such as HTTP and middleware. The three main modules/services are the Interactive Development Environment 1860 (IDE) that enables the student to perform the programming of the tasks assigned to him; the simulator 1870 that enables the student to test the developed program using virtual creatures animated on the computer screen; and task registration 1880 that enables the student to register the developed program for execution by means of a physical creature.
Figure 83 is a Block diagram of the main services available to the teacher by means of the teacher station. These services are identical to the services and module construction of the programming station except for the additional teacher console that enables the teacher to assign tasks to students, monitor their work, assign marks, etc.
Figures 84 to 93 together comprise a general description of a demonstration pilot project of a Living Object Laboratory.
Figure 84 is a block diagram of the pilot Living Object Laboratory, comprising two classes, each with five programming stations, one teacher station, one radio base station connected directly to the network, and one creature. Additionally, outside the two classes, there are one LOLA server, one administrator station and one base station, also connected directly to the network and controlling four creatures.
Figure 85 is a block diagram describing the methods and functions for installing the pilot laboratory and using it at the administrator level and within the two classes.
Figures 86 and 87 describe the software and the hardware topologies of the pilot system.
Figures 88 to 90 are a flow chart description of the steps in the activation of the demonstration program of the pilot project.
Figure 91 describes the main application modules of the pilot system.
Figures 92 and 93 illustrate the steps to be taken to make the LOLA system operative.
Figure 92 lists the software modules that have to be installed in order to activate the pilot demonstration software.
Figure 93 lists the configuration activity that has to be done before the activity described in Figs. 88 to 89 can be carried out.
The order in which the steps are executed is not important as long as all the steps are executed completely.
Figures 94 to 105 describe the structure and features of the Interactive Development Environment (IDE).
Figure 94 describes a typical construction of the screen of the IDE. The screen typically comprises a top menu bar 2000 and a bottom status bar 2005, as is common to all windows applications, and tool bars, such as 2010 and 2020, that can be placed anywhere on the screen and are shown adjacent to the top menu bar. Tool bar 2010 contains icons of software tools available to the programmer such as editing, compiling, simulating, etc.
Programming Tool bar 2020 contains icons of objects that the programmer can incorporate in the software program, such as states, events, functions, etc. An object can be dragged from the tool bar and dropped into the programming window 2030 to be connected with other objects in this window. When an object is selected the properties of the specific objects appear in the object inspector window 2040. The values of these properties can be modified by the programmer to create the necessary program behavior. When simulation is selected an animation of the programmed creature appears in the simulation window 2050. When the creature is instructed to collect input data (such as speech or tactile sensors) the popup menu 2060 appears and the programmer can interact with the creature by the appropriate selections from the popup menu. The message window 2070 provides the programmer with hints during the programming activity and with tracing data of the program execution during simulation activities.
Figure 95 describes the main functions (File, Edit, View, etc.) that are available to the programmer in the top menu bar 2000 of the IDE screen, and the sub-functions that are made available in a drop-down window when a main function is selected.
Figures 96A and 96B describe the main objects and programming tools available to the user in the object tool bar 2010 and the programming tool bar 2020.
Figure 97 describes the objects inspection window 2040 in more detail.
Figure 98 describes the main groups of messages that appear to the programmer in the message window 2070 in various situations. Such message groups are: programming syntax errors, compilation errors, progress indication messages for various functions such as compilation and debugging, and test logging messages that the system provides while debugging.
Figure 99 is a block diagram of the simulation process and module structure. When simulation is activated, the IDE module 2200 executes the tested program but sends the creature executable instructions to the virtual creature command interface 2210. Interface 2210 identifies the creature type and the appropriate creature function to be simulated, and selects and operates the appropriate function 2220. The function 2220 executes the appropriate animation 2230 of the virtual creature on the computer display.
Figure 100 describes the structure of the bottom status bar 2005.
Figures 101A to 101D describe in more detail the content and structure of the objects tool bar 2020 for various groups of objects when such a group is selected. Figure 101A
refers in detail to 2100 of Fig. 96A; Figure 101B refers in detail to 2120 of Fig. 96A; Figure 101C refers in detail to 2120 of Fig. 96A; and Figure 101D refers in detail to 2130 of Fig. 96A.
Emotional Analysis Concept Paper The goal of the Living Object Laboratory is to teach students the art of instilling human behavior in computerized machines. One major characteristic of humans is emotional sensitivity. That is, the ability to identify the emotional state and state transition in another human being and to respond accordingly. It is very difficult to teach emotional sensitivity to humans and it is much more difficult to instill emotional sensitivity in machines. However, even the most simplistic emotional sensitivity, when featured by a machine, has a tremendous effect on the interaction of humans and the machine. Therefore, the art of programming emotional sensitivity is important.
The goal of Emotional Analysis is to provide the main application with the capability to adapt to the emotional state of the human who interacts with the machine. Emotional analysis is a background process, or processes. Emotional analysis evaluates the emotional state of the person who interacts with the Living Object. The evaluation is performed continuously, in parallel to other processes. The process may be performed as a subroutine called by the main process or as a background task, as is appropriate for the level of complexity of the application system and the perceived ease of programming. The main module (or process) deals with the main goals of the application (such as playing the role of a teacher, a guard, a guide, a playmate, etc.). The Emotional Analysis communicates with the main task, receiving the required inputs and providing the main application with cues for an appropriate response to the interacting human.

The Emotional Analysis is mostly verbal. The Emotional Analysis process analyzes the content of verbal inputs recorded by the main application. According to the results of the analysis, the Emotional Analysis provides the main application with appropriate data. The data provided by the Emotional Analysis process to the main process may range from the perceived emotional state, or emotional state transition, of the interacting human, to detailed verbal phrases to be played by the main process. The final decision, to provide the Emotional Analysis with inputs and to follow the Emotional Analysis outputs, is in the hands of the main (application) process.
The Emotional Analysis is basically a program and can be programmed using the same programming means available for programming the main application. The Emotional Analysis program can be viewed as an algorithm, implemented as a state machine, where events are combinations of acoustic analysis and semantic analysis of verbal inputs received (recorded) from the interacting human and accumulated data.
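The state-machine formulation above can be sketched as follows. This is only an illustrative sketch: the state names, event names, and transitions are assumptions invented for the example, not taken from the Living Object system.

```python
# Hypothetical emotional-analysis state machine. Events are the outcome of
# combined acoustic/semantic analysis of the recorded verbal input; unknown
# events leave the perceived state unchanged.

TRANSITIONS = {
    # (current_state, event) -> next_state
    ("neutral", "loud_voice"): "angry",
    ("neutral", "keyword_sad"): "sad",
    ("neutral", "keyword_happy"): "happy",
    ("sad", "keyword_happy"): "happy",
    ("angry", "quiet_voice"): "neutral",
}

def analyze(events, state="neutral"):
    """Feed a sequence of acoustic/semantic events through the state machine."""
    for event in events:
        state = TRANSITIONS.get((state, event), state)
    return state
```

The main application would feed recorded events into `analyze` and use the resulting state as a cue for its response.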
The design of the Emotional Analysis process involves several stages such as:
Determining the scope of emotions, e.g., three emotions: sad, happy, angry.
Determining acoustic and semantic representations of the emotions to be detected in the received (recorded) verbal inputs from the interactive human, e.g.:
Voice amplitude (quiet or loud voice)
Voice pitch
Rate of speech
Diction quality (quality of speech recognition)
Specific words such as "sad", "happy", "angry"
Of course, the change in one of the above features may be more important than the feature itself. E.g., raising the voice carries more emotional information than a continuously loud voice.
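The point above, that a change in a feature carries more information than its absolute level, can be sketched numerically. The weights are assumptions chosen only to make the example concrete.

```python
# Illustrative scoring of a voice-amplitude sequence in which a *rise* in
# amplitude (hypothetical delta_weight) counts more than the amplitude level
# itself (hypothetical level_weight).

def emotion_score(amplitudes, level_weight=1.0, delta_weight=3.0):
    """Score consecutive amplitude samples; rising volume dominates the score."""
    score = 0.0
    for prev, cur in zip(amplitudes, amplitudes[1:]):
        score += level_weight * cur + delta_weight * max(0.0, cur - prev)
    return score
```

Under these assumed weights, a steadily loud voice scores lower than a voice that rises, matching the intuition in the text.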
Determining means for explicit interrogation of the emotions of the interactive human, such as direct questions, e.g. "Are you sad?"
Determining modifications of the application interaction according to the perceived emotional state of the interacting human. First the goal of the modification should be determined, and then the means. For example:
Goals:
Express empathy
Provide emotional support, encouragement, etc.
Affect (change) mood

Means:
Adaptation of appropriate amplitude (loudness), pitch and rate of verbal output.
Several versions of the same verbal content to be selected and played.
Default/standard phrases expressing empathy, interest, support, etc.
Determining the communication means (the protocol) between the application process(es) and the Emotional Analysis process.
Assigning Marks to Students' Programming Projects

Teachers usually evaluate examinations and assign marks based on a checklist. This is true for all subject matter, from exact sciences to humanities. It is also true for the evaluation of programming, from analysis through design to implementation.
Checklist evaluation can be automated, that is, be executed by means of a computer. Since the mechanism of computerized evaluation of examinations is common and the same for all subject matter, it is outside the scope of this document.
A program must also work properly, that is, the implementation must function on its own, without faults (crashes) and according to the specifications. It is obvious that the computer can track the performance of the executed program, analyze the performance according to the specifications, and report the results.
Automated (or computerized) evaluation is performed by means of a monitoring program that logs the performance of the monitored program, analyzes the log and reports the results. To enable the monitoring, several checkpoints are set within the monitored program, and the monitoring program logs every passage through each of these checkpoints together with the values of associated parameters.
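The checkpoint mechanism described above can be sketched minimally as follows. The checkpoint names, the monitored function, and the log layout are hypothetical, chosen only for illustration.

```python
# Minimal sketch of checkpoint monitoring: the monitored program passes
# through named checkpoints, and each passage is logged together with the
# values of its associated parameters.

monitor_log = []

def checkpoint(name, **params):
    """Record one passage through a checkpoint, with its parameter values."""
    monitor_log.append((name, params))

def monitored_greeting(user):
    # Hypothetical monitored code, with checkpoints at state entry and exit.
    checkpoint("enter_state", state="greeting", user=user)
    reply = "Hello, " + user
    checkpoint("exit_state", state="greeting", ok=True)
    return reply
```

After a run, the accumulated `monitor_log` can be analyzed per module and per student, as the text describes.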
LOLa's default monitoring logs every entry into and exit from each state (and hence, every entry to and exit from each state transition/connection). The monitoring program reports the results of the monitoring by program module and by student. A mark can be assigned according to the following criteria:
The percentage of states and state connections that have been entered (and hence have been tested).
The percentage of states and state connections that have been exited (and hence have performed successfully).
Internal performance balance, that is, the ratio between the number of entries to (exits from) the entity (state; connection) least visited (most visited) and the average number of entries (exits) within the module (for each and every module). More precisely, the square root of the sum of the squares of the differences between the entries (exits) of the least and the most visited entities and the average.
Overall performance balance, that is, the ratio between the number of entries (exits) in the module and the project average.
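Two of the criteria above, coverage and balance, can be sketched as small computations. This is a hedged sketch: the text does not fix how the quantities combine into a final mark, so no final weighting is invented here, and the function names are illustrative.

```python
# Coverage: percentage of entities (states, connections) visited at least
# once. Balance: square root of the summed squared deviations of visit
# counts from the module's average visit count, per the criterion above.
import math

def coverage_percent(visit_counts):
    """Percentage of entities entered (and hence tested) at least once."""
    return 100.0 * sum(1 for c in visit_counts if c > 0) / len(visit_counts)

def balance(visit_counts):
    """Root of the sum of squared deviations from the average visit count."""
    avg = sum(visit_counts) / len(visit_counts)
    return math.sqrt(sum((c - avg) ** 2 for c in visit_counts))
```

A perfectly even module (all counts equal) yields a balance of zero; larger values indicate that some states or connections were exercised far more than others.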
Fig. 103 is a table illustration of an emotional analysis database; and Fig. 104 is an emotional analysis state chart.
The emotional analysis apparatus is sensitive to mood changes of the user. Mood changes are associated with changes in features of speech of the user, such as loudness, rate, pitch (these are examples of implicit events), the use of specific terms by the user and the answers to direct closed questions (these are examples of explicit events) played by the creature.
Each such event has a weight and when the event occurs the weight is added to the relevant table cell. Only when a threshold is passed does the creature respond to a perceived mood change (by providing empathy, asking a closed question, and the like).
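The weighted-event accumulation just described can be sketched as follows. The specific events, weights, and threshold are assumptions for the example; the text only specifies the mechanism (weights summed into a per-mood cell, response only past a threshold).

```python
# Hypothetical weight table: implicit events (e.g. a loud voice) and explicit
# events (e.g. answering "yes" to "Are you sad?") each add a weight to the
# table cell of the associated mood.

WEIGHTS = {
    "loud_voice": ("angry", 2),
    "keyword_sad": ("sad", 3),
    "said_yes_to_are_you_sad": ("sad", 5),
}
THRESHOLD = 5  # assumed response threshold

def accumulate(events):
    """Sum event weights per mood; report moods whose cell passed the threshold."""
    table = {}
    for e in events:
        mood, w = WEIGHTS[e]
        table[mood] = table.get(mood, 0) + w
    return [m for m, total in table.items() if total >= THRESHOLD]
```

Only when a mood's accumulated weight crosses the threshold would the creature respond, for instance by expressing empathy or asking a closed question.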
Figs. 105 - 110 illustrate a preferred embodiment of Uniport, including a software architecture overview. Fig. 105 illustrates typical function calls and callback notifications.
Fig. 106 illustrates typical input data processing suitable for a media BIOS module. Fig. 107 illustrates typical input data processing suitable for a UCP implementation module. Fig. 108 illustrates typical output data processing suitable for user applications and an API module. Fig. 109 illustrates a typical UCP implementation module and media BIOS output data processing. Fig. 110 illustrates output data processing for a protocol implementation module and media BIOS module. In Fig. 110, MAX OB signifies the maximum number of elements in the out buffer.
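The role of MAX OB can be illustrated with a small bounded output queue. Note the hedging: the figure only defines MAX OB as the out buffer's capacity; the function names and the refuse-when-full policy below are assumptions made for the sketch.

```python
# Illustrative bounded output queue in the spirit of Fig. 110, where MAX_OB
# caps the number of elements held in the out buffer before transmission.
from collections import deque

MAX_OB = 4  # assumed maximum number of elements in the out buffer

out_buffer = deque()

def add_out_buffer(item):
    """Try to enqueue an output item; refuse when the buffer is full."""
    if len(out_buffer) >= MAX_OB:
        return False
    out_buffer.append(item)
    return True

def transfer_reset():
    """Empty the output queue, analogous to the TransferReset functions below."""
    out_buffer.clear()
```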
A description of typical exported functions is as follows:

crsrAddWords
This function is used to add words to the active context of the speech recognition engine.

crWaitForEvent
This function is used to wait for an event from the UNIT.

bioTransferReset
This function is used to reset the input and/or output queue in the Media BIOS module (see pic. 5, 2).

3.? Protocol implementation module exported function description

proSystemOpen
This function is used to open the system.

proSystemClose
This function is used to close the system.

proSendMessage
This function is used to send a control message.

proSendBuffer
This function is used to send a buffer.

proTransferReset
This function is used to reset the input and/or output queue in the Media BIOS module (see pic. 5, 2).

3.? API module exported function description

crBaseDetect
This function is used to detect the base (determine device ID).

crSystemOpen
This function is used to open the system.

crSystemClose
This function is used to close the system.

crSetNotification
This function is used to set up the callback notification mechanism in the user application.

crSendBuffer
This function is used to send a buffer.

crSendMessage
This function is used to send a message.

crUnitTalk
This function is used to play a sound file.

crBaseVersion
This function is used to get the version number from the base.

crsrGetWords
This function is used to get all the words from the active context of the speech recognition engine.

crsrCreateContext
This function is used to create a new context in the speech recognition engine.

crsrDeleteContext
This function is used to delete a context from the speech recognition engine.

crsrSelectContext
This function is used to select the context from the speech recognition engine.

crsrRemoveWords
This function is used to remove words from the active context of the speech recognition engine.

?.1 Media BIOS module exported function description

bioMediaConnect
This function is used to connect to the communication media.

bioMediaDisconnect
This function is used to disconnect from the communication media.

bioAddOutBuffer
This function is used to add the output buffer to the output queue (see pic. 5).

Figs. 111 - 115, taken together, form a toy configuration flowchart. Fig. 111 illustrates typical figure configuration. Figs. 112 - 115 illustrate typical install check-up (BT 1/4, 2/4, 3/4 and 4/4 respectively).
To generate a screen interface, the following texts may be recorded to serve as voice-overs of the onscreen texts:

Screen 0010-001?
Intro to the introduction:
Morning: Hey, hi there. Good morning to you!
Afternoon: Hey, what's up. Good afternoon to you!
Evening: Hey, g-o-o-ood evening to you!
Screen 0040: ABOUT YOU
Click here to give or change user information.
Screen 0050: HOT CLIP!
Click here to see the Storyteller dance and sing.
Screen 0060: HOW TO PLAY
Click here to hear how to play.
Screen 0070: PLAY
Click here to start to play.
Screen 0080: ** NEW! ** CHECK-UP
Click here to check the system.
Screen 0090: EXIT
Click here to close the Storyteller program.
Screen 0041
Double-click on the user's name or type the name of a new user. If you are typing a new name, press ENTER when done.
Screen 0042 Please type the name of a new user. When done, press ENTER.
Screen 0043 When we play together, I'll call you by a secret name. Double-click now on the secret name you want.


When they haven't chosen a secret name: I have to know your secret name first. Go ahead and choose one now.
About you
Click here to tell me all about yourself.
Instructions for filling in personal data
1. Click the cursor on an item.
2. A menu will appear. Make your choice.
3. Go through each item one by one.
4. When you are done, click on the MAIN MENU button.
General invitation to enter data, at top of ABOUT YOU/SECRET NAME/FAVORITES screen:
Please tell me all about you. I'm very happy to know you.
Favorites Click here to choose your favorite things.
~ What's your favorite color? Yellow? Red? Blue? Pick one.
~ What's your favorite food? Pizza? Macaroni and cheese? French fries? Pick one.
~ What's your favorite activity? Playing make-believe? Drawing? Playing computer games?
Pick one.
~ What's your favorite animal? Cats? Dogs? Horses? Pick one.
Graphic of cursor
Click here to return to the Storyteller's Main Menu.

It is appreciated that the software components of the present invention may, if desired, be implemented in ROM (read-only memory) form. The software components may, generally, be implemented in hardware, if desired, using conventional techniques.
It is appreciated that the particular embodiment described in the Appendices is intended only to provide an extremely detailed disclosure of the present invention and is not intended to be limiting.
It is appreciated that various features of the invention which are, for clarity, described in the contexts of separate embodiments may also be provided in combination in a single embodiment. Conversely, various features of the invention which are, for brevity, described in the context of a single embodiment may also be provided separately or in any suitable subcombination.

APPENDIX A

SUBSTITUTE SHEET (RULE 26)

Application Source Code

================================================================
Copyright (c) 1995-1998 Creator Ltd. All Rights Reserved
================================================================
Description : This is the Main unit.

unit Main;

interface

uses
  Windows, Messages, SysUtils, Classes, Graphics, Controls, Forms,
  Dialogs, StdCtrls, Toy, PDBEngine, XMIDILib_TLB, OleCtrls, NEWSRLib_TLB,
  ExtCtrls, MPlayer, ComCtrls, jpeg, Menus;
type
  TTalkThread = class(TThread)
  private
    ToyNumber : Integer;
    TalkFile  : string;
    Motion    : Integer;
  protected
    constructor Create(ToyNumber1: Integer; TalkFile1: string; Motion1: Integer);
    procedure Execute; override;
  end;
  TMainForm = class(TForm)
    Button1: TButton;
    SR1: TSR;
    XMidi1: TXMidi;
    MainMenu1: TMainMenu;
    test1: TMenuItem;
    space1: TMenuItem;
    Panel1: TPanel;
    MediaPlayer1: TMediaPlayer;
    Timer1: TTimer;
    procedure FormCreate(Sender: TObject);
    procedure FormClose(Sender: TObject; var Action: TCloseAction);
    procedure space1Click(Sender: TObject);
    procedure Timer1Timer(Sender: TObject);
  private
    { Private declarations }
    TalkingInThread : TTalkThread;
  public
    { Public declarations }
    CurrentPath  : string;
    CreatorPath  : string;
    DatabasePath : string;
    GraphicsPath : string;
    AudioPath    : string;
    UsagePath    : string;
    AboutYouPath : string;
    Toy          : TToy;
    ToyMachine   : string;
    PDBEngine    : TPDBEngine;
    ThreadInProgress : Boolean;
    ToyNameIsStoryTeller : string;
    ToyNameIsBear        : string;
    procedure ApplicationInitialization;
    procedure GotoMainMenu;
    procedure GotoCreator;
    function BackGroundTalking(TalkFile: string; Motion: string): Integer;
    function GetCurrentPath(CurrentExeName: string): string;
    function TalkInBackGround(ToyNumber: Integer; TalkFile: string;
      Motion: string): Integer;
  end;

var
  MainForm: TMainForm;

implementation

uses Menu, Status, creator;
{$R *.DFM}

procedure TMainForm.FormCreate(Sender: TObject);
begin
  // Screen.Cursor := crDefault;
  ToyNameIsStoryTeller := 'StoryTeller';
  ToyNameIsBear        := 'TeddyBear';

  Screen.Cursors[5] := LoadCursor(HInstance, 'PestoHand.Cur');
  Screen.Cursors[6] := LoadCursor(HInstance, 'PestoMenu.Cur');
  Cursor := 5;
  ApplicationInitialization;
  Cursor := crNone;
  Timer1.Interval := 1000;
  Timer1.Enabled  := True;
  if (Toy.ToyNumber > 0) and (Toy.ToyNumber < 32) then
    ToyMachine := ToyNameIsBear
  else
    ToyMachine := ToyNameIsStoryTeller;
end;
procedure TMainForm.ApplicationInitialization;
begin
  // Fill paths
  CurrentPath := GetCurrentPath(Application.ExeName);
  CreatorPath :=
    Copy(CurrentPath, 1, Length(CurrentPath) - Length('Executables\'));
  // CreatorPath := Copy(Application.ExeName, 1,
  //   Length(Application.ExeName) - 2?);
  DatabasePath := CreatorPath + 'PESTO\DATABASE\';
  GraphicsPath := CreatorPath + 'PESTO\GRAPHICS\';
  AudioPath    := CreatorPath + 'PESTO\AUDIO\';
  UsagePath    := CreatorPath + 'PESTO\USAGE\';
  PDBEngine := TPDBEngine.Create(Self);
  PDBEngine.DataBasePath := DatabasePath;
  PDBEngine.LoadRegistration;
  PDBEngine.SetChildNumber(1);
  PDBEngine.LoadConfiguration;
  Toy := TToy.Create(Self);
  Toy.ToyNumber := PDBEngine.ToyNumber;
  Toy.TurnOn;
  Application.Icon.LoadFromFile(GraphicsPath + 'PestoIcon.ico');
  AboutYouPath := AudioPath + 'AboutYou\';
end;
function TMainForm.GetCurrentPath(CurrentExeName: string): string;
var
  i : integer;
begin
  i := Length(CurrentExeName);
  while i > 0 do
  begin
    if Copy(CurrentExeName, i, 1) = '\' then
      i := 0
    else
    begin
      i := i - 1;
      CurrentExeName := Copy(CurrentExeName, 1, i);
    end;
  end;
  Result := CurrentExeName;
end;
procedure TMainForm.GotoMainMenu;
begin
  if Time < StrToTime('12:00:00') then
    BackGroundTalking(AboutYouPath + 'vo001.wav', 'S')
  else if Time > StrToTime('16:00:00') then
    BackGroundTalking(AboutYouPath + 'vo003.wav', 'S')
  else
    BackGroundTalking(AboutYouPath + 'vo002.wav', 'S');
  Space1.Enabled := False;
  Hide;
  Timer1.Enabled := False;
  MenuForm.Show;
end;
procedure TMainForm.FormClose(Sender: TObject; var Action: TCloseAction);
begin
  if ThreadInProgress then Exit;
  Toy.TurnOff;
  Toy.Free;
  PDBEngine.Free;
end;
procedure TMainForm.space1Click(Sender: TObject);
begin
  // space
  GotoCreator;
end;

procedure TMainForm.GotoCreator;
begin
  Space1.Enabled := False;
  Hide;
  CreatorForm.Show;
  CreatorForm.PlayMovie;
  Timer1.Enabled := False;
end;
function TMainForm.BackGroundTalking(TalkFile: string; Motion: string): Integer;
var
  Thread1 : TTalkThread;
begin
  ThreadInProgress := True;
  Thread1 := TTalkThread.Create(Toy.ToyNumber, TalkFile, 0);
  Result := 0;
end;

function TMainForm.TalkInBackGround(ToyNumber: Integer; TalkFile: string;
  Motion: string): Integer;
var
  Thread1 : TTalkThread;
begin
  ThreadInProgress := True;
  Thread1 := TTalkThread.Create(ToyNumber, TalkFile, 0);
  Result := 0;
end;
constructor TTalkThread.Create(ToyNumber1: Integer; TalkFile1: string;
  Motion1: Integer);
begin
  inherited Create(False);
  ToyNumber := ToyNumber1;
  TalkFile  := TalkFile1;
  Motion    := Motion1;
  FreeOnTerminate := True;
end;

procedure TTalkThread.Execute;
begin
  // 85 = 55H Broadcast
  if (MainForm.ToyMachine = 'StoryTeller') and (ToyNumber <> 85) then
    MainForm.XMidi1.ToyTalk2(ToyNumber, TalkFile, 0, 0, Motion, 0);
  if (MainForm.ToyMachine = 'TeddyBear') or (ToyNumber = 85) then
    MainForm.XMidi1.NetToyTalk(ToyNumber, TalkFile, 0, 9, 0);
  Terminate;
  MainForm.ThreadInProgress := False;
  Exit;
end;
procedure TMainForm.Timer1Timer(Sender: TObject);
begin
  // GotoCreator;
  GotoMainMenu;
end;

end.

================================================================
Copyright (c) 1995-1998 Creator Ltd. All Rights Reserved
================================================================
Description : This is the Checkup unit.
unit Checkup;

interface

uses
  Windows, Messages, SysUtils, Classes, Graphics, Controls, Forms,
  Dialogs, ExtCtrls, Jpeg, ComCtrls;

type
  TCheckUpForm = class(TForm)
    Image1: TImage;
    ExitImage: TImage;
    CarAnimate: TAnimate;
    ClownAnimate: TAnimate;
    DuckAnimate: TAnimate;
    DuckWalkAnimate: TAnimate;
    LightOrganAnimate: TAnimate;
    LogoMoveAnimate: TAnimate;
    MicrophoneAnimate: TAnimate;
    MotionLetersAnimate: TAnimate;
    SensorAnimate: TAnimate;
    SpeakerAnimate: TAnimate;
    Image2: TImage;
    procedure ExitImageClick(Sender: TObject);
    procedure FormCreate(Sender: TObject);
    procedure Image2Click(Sender: TObject);
  private
    { Private declarations }
  public
    { Public declarations }
    procedure ActivateTheAnimation(Value: Boolean);
  end;
var
  CheckUpForm: TCheckUpForm;

implementation

uses Menu, Main;

{$R *.DFM}

procedure TCheckUpForm.ExitImageClick(Sender: TObject);
begin
  //
  ActivateTheAnimation(False);
  MenuForm.Show;
end;
begin II
with CarArumate do begin FileName := MainForm.GraphicsPath +'Car.avi':
Left := 260;
Top .= 441;
Width := 80;
Height := 60;
end;
with ClownAnimate do begin FileName := MainForm.GraphicsPath +'Clown.avi':
Left := 652;
Top .= 393;
Width := 32;
Height := 40;
end:
with DuckAnitnate do begin SUBSTITUTE SHEET (RULE 26) FileName := MainForm.GraphicsPath +'Duck.avi';
Left := 613;
Top .= 114;
Width := 48;
Height := 50;
end;
with DuckWalkAnimate do begin FileName := MainForm. GraphicsPath + 'DuckWallc.avi ;
Left := 599;
Top .= 216;
Width := 128;
Height := 1 I5;
and;
with LightOrganAnimate do FileName := MainForm.GraphicsPath +'LightOrgan.avi':
Left :_ 4ss;
Top .= 440;
Width := 48;
Height := 70;
end;
With LogoMoveAnimate do begin FileName := MainForm.GraphicsPath + 'LogoMove.avi';
Left . 336;
Top := 19:
Width . 112;
Height : 45:
end;
with MicrophoneAnimate do begin FileName := MainForm.GraphicsPath + 'HubWave.avi';
Left .= 95;
Top := 365;
Width .= 80;

SUBSTITUTE SHEET (RULE 26) Height : 90;
end;
with MotionLetersAnimate do begin FileName := MainForm.GraphicsPath + 'MotionLeters.avi';
Left .- 468;
Top . 172;
Width . 144;
Height := 45;
end:
with SensorAnimate do begin FileName := MainForm.GraphicsPath + 'Sensor.avi'p Left .= 341;
Top .= 22?;
Width .= 96:
Height := 60:
end:
with SpeakerAnimate do begin FileName := MainForm.GraphicsPath + 'Speaker.avi':
Left .= 5?:
Top := 169:
Width . 96;
Height := 80;
end:
end:
procedure TCheckUpForm.ActivateTheAnimation(Value: Boolean);
begin
  //
  try CarAnimate.Active := Value; except end;
  try ClownAnimate.Active := Value; except end;
  try DuckAnimate.Active := Value; except end;
  try DuckWalkAnimate.Active := Value; except end;
  try LightOrganAnimate.Active := Value; except end;
  try LogoMoveAnimate.Active := Value; except end;
  try MicrophoneAnimate.Active := Value; except end;
  try MotionLetersAnimate.Active := Value; except end;
  try SensorAnimate.Active := Value; except end;
  try SpeakerAnimate.Active := Value; except end;
end;

procedure TCheckUpForm.Image2Click(Sender: TObject);
begin
  //
  // MainForm.Toy.TurnOff;
  Sleep(5000);
  ActivateTheAnimation(False);
  Close;
  MainForm.Close;
  WinExec(PChar(MainForm.CurrentPath + 'PESTOInstallation'), SW_SHOW);
  Application.Terminate;
end;

end.

================================================================
Copyright (c) 1995-1998 Creator Ltd. All Rights Reserved
================================================================
Description : This is the Creator unit.

unit creator;

interface

uses
  Windows, Messages, SysUtils, Classes, Graphics, Controls, Forms,
  Dialogs, Menus, ExtCtrls, MPlayer;

type
  TCreatorForm = class(TForm)
    MainMenu1: TMainMenu;
    test1: TMenuItem;
    space1: TMenuItem;
    MediaPlayer1: TMediaPlayer;
    Panel1: TPanel;
    Timer1: TTimer;
    Escape1: TMenuItem;
    procedure space1Click(Sender: TObject);
    procedure Timer1Timer(Sender: TObject);
    procedure FormCreate(Sender: TObject);
    procedure Escape1Click(Sender: TObject);
    procedure FormClose(Sender: TObject; var Action: TCloseAction);
  private
    { Private declarations }
  public
    { Public declarations }
    StartPlay : Integer;
    procedure PlayMovie;
    procedure GoToPestoSong;
  end;
var
  CreatorForm: TCreatorForm;

implementation

uses PestoSong, Main, Menu;

{$R *.DFM}

procedure TCreatorForm.space1Click(Sender: TObject);
begin
  // space
  GoToPestoSong;
end;
procedure TCreatorForm.GoToPestoSong;
begin
  //
  Space1.Enabled := False;
  MediaPlayer1.Stop;
  MediaPlayer1.Close;
  Hide;
  PestoSongForm.Show;
  PestoSongForm.PlayMovie;
  Timer1.Enabled := False;
end;
procedure TCreatorForm.PlayMovie;
begin
  //
  try
    Timer1.Enabled := True;
    MediaPlayer1.Play;
  except
  end;
end;
procedure TCreatorForm.Timer1Timer(Sender: TObject);
begin
  //
  StartPlay := StartPlay + 1;
  if StartPlay = 1 then exit;
  GoToPestoSong;
end;
procedure TCreatorForm.FormCreate(Sender: TObject);
begin
  //
  Timer1.Enabled := False;
  Panel1.Cursor := crNone;
  Timer1.Interval := 17000;
  StartPlay := 0;
  Cursor := crNone;
  MediaPlayer1.FileName := MainForm.GraphicsPath + 'OpenO.avi';
end;
procedure TCreatorForm.Escape1Click(Sender: TObject);
begin
  // Exit
  try
    MediaPlayer1.Stop;
  except
  end;
  try
    MediaPlayer1.Close;
  except
  end;
  Hide;
  MenuForm.Show;
end;
procedure TCreatorForm.FormClose(Sender: TObject;
  var Action: TCloseAction);
begin
  // Exit
  try
    MediaPlayer1.Stop;
  except
  end;
  try
    MediaPlayer1.Close;
  except
  end;
end;

end.

================================================================
Copyright (c) 1995-1998 Creator Ltd. All Rights Reserved
================================================================
Description : This is the Menu unit.

unit Menu;

interface

uses
  Windows, Messages, SysUtils, Classes, Graphics, Controls, Forms,
  Dialogs, StdCtrls, Buttons, ExtCtrls, ComCtrls, Intro, jpeg, Toy, Menus;

type
  TMenuForm = class(TForm)
    MenuImage: TImage;
SetupImage: TImage;
AnimationImage: TImage:
IntroImage: TImage:
PlayImage: TImage:
ExitImage: TImage;
TVAnimate: TAnimate:
PickUserImage: Tlmage:
OKButtonImage: TImage;
UserNameEdit: TEdit:
SetUpOrgImage: TImage:
CheckImage: TImage;
PickUserTitleLabel: TLabel:
PickUserLabell: TLabel:
PickUserLabel2: TLabel;
ImageFramel: TImage:
ImageFrame2: Tlmage;
ImageFrame3: TImage:
ImageFrame4: TImage~
ImageFrameS: TImage:
ImageFrame6: TImage:
UserNameLabell: TLabel:
UserNameLabel2: TLabel;
UserNameLabel3: TLabel;
UserNameLabel4: TLabel:
UserNameLabelS: TLabel;
UserNameLabel6: TLabel;

SUBSTITUTE SHEET (RULE 26) UserNameLabel7: TLabel;
UserNameLabel3: TLabei;
MainMenul: TMa~nMenu:
test)): TMenuItem;
Enter): TMenuItem;
Escape): TMenuItem;
ToyImage: TImage;
ToyAnimate: TAnimate;
procedure FormCreate(Sender: Tobject):
procedure UserNamelButtonCiick(Sender: TObject);
procedure UserName2ButtonClick(Sender: TObject);
procedure UserName3ButtonClicklSender: TObject);
procedure UserName4ButtonClick(Sender: TObject):
procedure UserNameSButtonClick(Sender: TObject)~
procedure UserName6ButtonClick(Sender: TObject):
procedure UserName7ButtonClick(Sender: TObject):
procedure UserNameBButtonClick(Sender: TObject);
procedure UserNamelButtonMouseMove(Sender: TObject; Shift: TShiftState;
X, Y: Integer):
procedure UserName2ButtonMouseMove(Sender: TObject: Shift: TShiftState:
X, Y: Integer):
procedure UserName3ButtonMouseMove(Sender: TObject: Shift: TShiftState:
X, Y: Integer):
procedure UserName4ButtonMouseMove(Sender: TObject; Shift: TShiftStatet X, Y: Integer):
procedure UserNameSButtonMouseMove(Sender: TObject: Shift: TShiftstate;
X, Y: Integer);
procedure UserName6ButtonMouseMove(Sender: TObject; Shift: TShiftState;
X, Y: Integer):
procedure UserName?ButtonMouseMove(Sender: TObject; Shift: TShiftState;
X, Y: Integer):
procedure UserNameBButtonMouseMove(Sender: TObject; Shift: TShiftstate;
X, Y: Integer).
procedure OKButtonlmageClick(Sender: TObject);
procedure ImageFramelMouseMove(Sender: TObject; Shift: TShiftState: X, Y: Integer).
procedure ImageFrame2MouseMove(Sender: TObject; Shift: TShiftStatet X, Y: Integer);
procedure ImageFrame3MouseMove(Sender: TObject; Shift: TShiftState; X, Y: Integer);
procedure ImageFrame9MouseMove(Sender: TObject; Shift: TShiftState: X, Y: Integer):
procedure ImageFrameSMouseMove(Sender: TObject; Shift: TShiftState; X.

SUBSTITUTE SHEET (RULE 26) Y: Integer):
procedure ImageFrame6MouseMove(Sender: TObject; Shift: TShiftState; X.
Y: Integer);
procedure ImageFramelClick(Sender: TObject):
procedure ImageFrame2Click(Sender: TObject):
procedure ImageFrame3Click(Sender: TObject):
procedure ImageFrame4Click(Sender: TObject):
procedure ImageFrameSClick(Sender: TObject);
procedure ImageFrame6Click(Sender: TObject);
procedure MenuImageMouseMove(Sender: TObject; Shift: TShiftState; X, Y: Integer);
procedure EnterlClick(Sender: TObject);
procedure EscapelClick(Sender: TObject):
procedure FortnClose(Sender: TObject: var Action: TCloseAction):
private ( Private declarations ) procedure ResetCurrentButton:
procedure UserNameDefaultColor:
procedure ClearUserName:
procedure AssignCursorsInImages:
procedure ShowRegistration(Value : string);
public ( Public declarations ) Threadl . TIntro:
CurrentButton . String;
end;
var MenuForm: TMenuForm;
const crPESTOHandCursor = 100:
crPESTOMenuCursor = I01:
implementation uses Main, Status, Registration, ToySimulation, MotionSimulation, Checkup.
PestoSong, SingAlong, creator;
{$R *.DFM}

procedure TMenuForm.ResetCurrentButton;
begin
  if CurrentButton = 'Setup' then SetupImage.Visible := False;
  if CurrentButton = 'Animation' then AnimationImage.Visible := False;
  if CurrentButton = 'Intro' then IntroImage.Visible := False;
  if CurrentButton = 'Play' then PlayImage.Visible := False;
  if CurrentButton = 'Check' then CheckImage.Visible := False;
  if CurrentButton = 'Exit' then ExitImage.Visible := False;
end;
procedure TMenuForm.FormCreate(Sender: TObject);
begin Screen.Cursors [crPESTOHandCursor] .
LoadCursorIHInstance, 'PESTOHAND.CUR');
Screen.Cursors [crPESTOMenuCursor] .
LoadCursor(HInstance, 'PESTOMENU.CUR');
TVAnimate.FileName := MainForm.GraphicsPath+'Noise.AVI':
TVAnimate.Active := True:
MainForm.Hide;
Threadl := nil:
SetupImage.Visible .= True;
SetUpOrgImage.Visible .= False:
AnimationImage.Visible := False;
IntroImage.Visible .= False;
Playlmage.Visible .= False:
CheckImage.Visible .= False;
ExitImage.Visible .= False;
SetupImage.cursor .= crPESTOMenuCursor;
SetUpOrgImage.cursor .= crPESTOMenuCursor;
AnimationImage.cursor .= crPESTOMenuCursor;
IntroImage.cursor .= crPESTOMenuCursor;
Playlmage.cursor .= crPESTOMenuCursor;
Checklmage.cursor .= crPESTOMenuCursor;
ExitImage.cursor .= crPESTOMenuCursor:
ImageFramel.cursor .= crPESTOHandCursor;

SUBSTITUTE SHEET (RULE 26) ImageFrame2.cursor .= crPESTOHandCursor;

ImageFrame3.cursor .= crPESTOHandCursor;

ImageFrame9.cursor .= crPESTOHandCursor:

ImageFrameS.cursor . = crPESTOHandCursor;

ImageFrame6.cursor . = crPESTOHandCursor;

OKButtonlmage.Cursor = crPESTOHandCursor;
.

CurrentButton := 'Setup';

// Unvisible Registration PickUserImage.Visible .= False:

// Reg 1 PickUserTitleLabel.Visible := False;

UserNameLabell.Visible := False;

UserNameLabel2.Visible := False?

UserNameLabel3.Visible := False:

UserNameLabel9.Visible := False;

UserNameLabelS.Visible := False;

UserNameLabel6.Visible := False;

UserNameLabel7.Visible := False;

UserNameLabelB.Visible := False:

// Reg 2 PickUserLabell.Visible .= False;

PickUserLabel2.Visible .= False;

UserNameEdit.Visible := False;

OKButtonImage.Visible .= False;

//

Cursor :- crPESTOMenuC ursor;

//AssignCursorsInImage s:

UserNameLabell.Caption : 'NEW USER';

With ToyAnimate do begin FileName := MainForm.GraphicsPath+'Eye.AVI';
Active := True:
Left .= 3?6;
Top .= 252;
Width .- 80;
Height := 40;
end:
With ToyImage do begin Left .= 265;

SUBSTITUTE SHEET (RULE 26) Top .= 177;
Width . 309;
Height : 368;
end:
end;
procedure TMenuForm.UserNamelButtonClick(Sender: TObjectl;
begin //
if MainForm.ThreadInProgress then exit;
PickUserTitleLabel.Visible := False;
UserNameLabell.Visible := False:
UserNameLabel2.Visible := False:
UserNameLabel3.Visible := False:
UserNameLabel4.Visible := False:
UserNameLabelS.Visible := False:
UserNameLabel6.Visible := False.
UserNameLabel7.Visible := False;
UserNameLabelB.Visible := False:
UserNameEdit.Text := '';
PickUserLabell.Visible := True;
PickUserLabel2.Visible .= True:
UserNameEdit.Visible .= True;
OKButtonlmage.Visible .= True:
UserNameEdit.SetFocus;
MainForm.PDBEngine.InsertNewChild;
MainForm.BackGroundTalking(MainForm.AboutYouPath + 'vo0042.wav','S');
end:
procedure TMenuForm.UserName2ButtonClick(Sender: TObject);
begin //
if MainForm.ThreadInProgress then exit:
MainForm.PDBEngine.SetChildNumber(1);
ShowRegistration(UserNameLabel2.Caption);
end;
procedure TMenuForm.UserName3ButtonClick(Sender: TObject);
begin //
if MainForm.ThreadInProgress then exit;
MainFarm.PDBEngine.SetChildNumber(2):

SUBSTITUTE SHEET (RULE 26) ShowRegistration(UserNameLabel3.Caption);
end:
procedure TMenuForm.UserName4ButtonClick(Sender: TObject);
begin //
if MainForm.ThreadInProgress then exit;
MainForm.PDBEngine.SetChildNumber(3);
ShowRegistration(UserNameLabel4.Caption);
end:
procedure TMenuForm.UserNameSButtonClick(Sender: TObject);
begin //
if MainForm.ThreadInProgress then exit;
MainForm.PDBEngine.SetChildNumber(4);
ShowRegistration(UserNameLabelS.Caption);
end:
procedure TMenuForm.UserName6ButtonClick(Sender: TObject):
begin //
if MainForm.ThreadInProgress then exit:
MainForm.PDBEngine.SetChildNumber(5):
ShowRegistration(UserNameLabel6.Caption):
end;
procedure TMenuForm.UserName7ButtonClick(Sender: TObject);
begin //
if MainForm.ThreadInProgress then exit;
MainForm.PDBEngine.SetChildNumber(6);
ShowRegistration(UserNameLabel7.Caption);
end:
procedure TMenuForm.UserNameBButtonClick(Sender: TObject);
begin //
if MainForm.ThreadInProgress then exit;
MainForm.PDBEngine.SetChildNumber(7);
ShowRegistration(UserNameLabelB.Caption);
end:

SUBSTITUTE SHEET (RULE 26) Wp 99/54015 PCT/IL99/00202 procedure TMenuForm.UserNamelButtonMouseMove(Sender: TObject;
Shift: TShiftState; X, Y: Integerf;
begin //
if UserNameLabell.Font.Color <> clGreen then begin UserNameDefaultColor;
UserNameLabell.Font.Color := clGreen;
end;
end:
procedure TMenuForm.UserNameDefaultColor;
var
  UserColor : TColor;
begin //
  UserColor := clRed;
  UserNameLabel1.Font.Color := UserColor;
  UserNameLabel2.Font.Color := UserColor;
  UserNameLabel3.Font.Color := UserColor;
  UserNameLabel4.Font.Color := UserColor;
  UserNameLabel5.Font.Color := UserColor;
  UserNameLabel6.Font.Color := UserColor;
  UserNameLabel7.Font.Color := UserColor;
  UserNameLabel8.Font.Color := UserColor;
end;

procedure TMenuForm.UserName2ButtonMouseMove(Sender: TObject;
  Shift: TShiftState; X, Y: Integer);
begin //
  if UserNameLabel2.Font.Color <> clGreen then
  begin
    UserNameDefaultColor;
    UserNameLabel2.Font.Color := clGreen;
  end;
end;

procedure TMenuForm.UserName3ButtonMouseMove(Sender: TObject;
  Shift: TShiftState; X, Y: Integer);
begin //
  if UserNameLabel3.Font.Color <> clGreen then
  begin
    UserNameDefaultColor;
    UserNameLabel3.Font.Color := clGreen;
  end;
end;

procedure TMenuForm.UserName4ButtonMouseMove(Sender: TObject;
  Shift: TShiftState; X, Y: Integer);
begin //
  if UserNameLabel4.Font.Color <> clGreen then
  begin
    UserNameDefaultColor;
    UserNameLabel4.Font.Color := clGreen;
  end;
end;

procedure TMenuForm.UserName5ButtonMouseMove(Sender: TObject;
  Shift: TShiftState; X, Y: Integer);
begin //
  if UserNameLabel5.Font.Color <> clGreen then
  begin
    UserNameDefaultColor;
    UserNameLabel5.Font.Color := clGreen;
  end;
end;

procedure TMenuForm.UserName6ButtonMouseMove(Sender: TObject;
  Shift: TShiftState; X, Y: Integer);
begin //
  if UserNameLabel6.Font.Color <> clGreen then
  begin
    UserNameDefaultColor;
    UserNameLabel6.Font.Color := clGreen;
  end;
end;

procedure TMenuForm.UserName7ButtonMouseMove(Sender: TObject;
  Shift: TShiftState; X, Y: Integer);
begin //
  if UserNameLabel7.Font.Color <> clGreen then
  begin
    UserNameDefaultColor;
    UserNameLabel7.Font.Color := clGreen;
  end;
end;

procedure TMenuForm.UserName8ButtonMouseMove(Sender: TObject;
  Shift: TShiftState; X, Y: Integer);
begin //
  if UserNameLabel8.Font.Color <> clGreen then
  begin
    UserNameDefaultColor;
    UserNameLabel8.Font.Color := clGreen;
  end;
end;

procedure TMenuForm.ClearUserName;
begin //
  PickUserImage.Visible := False;
  UserNameLabel1.Visible := False;
  UserNameLabel2.Visible := False;
  UserNameLabel3.Visible := False;
  UserNameLabel4.Visible := False;
  UserNameLabel5.Visible := False;
  UserNameLabel6.Visible := False;
  UserNameLabel7.Visible := False;
  UserNameLabel8.Visible := False;
  UserNameEdit.Visible := False;
  OKButtonImage.Visible := False;
end;
procedure TMenuForm.OKButtonImageClick(Sender: TObject);
begin //
  if MainForm.ThreadInProgress then exit;
  if Length(Trim(UserNameEdit.Text)) > 0 then
  begin
    with MainForm.PDBEngine do
    begin
      SecretName := '';
      ChildSex := '';
      BirthDay := '';
      ChildEyeColor := '';
      ChildHairColor := '';
      BedTimeHour := '';
      FavoriteColor := '';
      FavoriteFood := '';
      FavoriteActivity := '';
      FavoriteAnimal := '';
    end;
    ShowRegistration(TrimLeft(UserNameEdit.Text));
  end;
end;
procedure TMenuForm.AssignCursorsInImages;
begin //
  PickUserImage.Cursor := crPESTOHandCursor;
  UserNameLabel1.Cursor := crPESTOHandCursor;
  UserNameLabel2.Cursor := crPESTOHandCursor;
  UserNameLabel3.Cursor := crPESTOHandCursor;
  UserNameLabel4.Cursor := crPESTOHandCursor;
  UserNameLabel5.Cursor := crPESTOHandCursor;
  UserNameLabel6.Cursor := crPESTOHandCursor;
  UserNameLabel7.Cursor := crPESTOHandCursor;
  UserNameLabel8.Cursor := crPESTOHandCursor;
  UserNameEdit.Cursor := crPESTOHandCursor;
  OKButtonImage.Cursor := crPESTOHandCursor;
end;
procedure TMenuForm.ImageFrame1MouseMove(Sender: TObject;
  Shift: TShiftState; X, Y: Integer);
begin
  if CurrentButton <> 'Setup' then
  begin
    ResetCurrentButton;
    SetupImage.Visible := True;
    CurrentButton := 'Setup';
  end;
end;

procedure TMenuForm.ImageFrame2MouseMove(Sender: TObject;
  Shift: TShiftState; X, Y: Integer);
begin
  if CurrentButton <> 'Animation' then
  begin
    ResetCurrentButton;
    AnimationImage.Visible := True;
    CurrentButton := 'Animation';
  end;
end;

procedure TMenuForm.ImageFrame3MouseMove(Sender: TObject;
  Shift: TShiftState; X, Y: Integer);
begin
  if CurrentButton <> 'Intro' then
  begin
    ResetCurrentButton;
    IntroImage.Visible := True;
    CurrentButton := 'Intro';
  end;
end;

procedure TMenuForm.ImageFrame4MouseMove(Sender: TObject;
  Shift: TShiftState; X, Y: Integer);
begin
  if CurrentButton <> 'Play' then
  begin
    ResetCurrentButton;
    PlayImage.Visible := True;
    CurrentButton := 'Play';
  end;
end;

procedure TMenuForm.ImageFrame5MouseMove(Sender: TObject;
  Shift: TShiftState; X, Y: Integer);
begin
  if CurrentButton <> 'Check' then
  begin
    ResetCurrentButton;
    CheckImage.Visible := True;
    CurrentButton := 'Check';
  end;
end;

procedure TMenuForm.ImageFrame6MouseMove(Sender: TObject;
  Shift: TShiftState; X, Y: Integer);
begin
  if CurrentButton <> 'Exit' then
  begin
    ResetCurrentButton;
    ExitImage.Visible := True;
    CurrentButton := 'Exit';
  end;
end;
procedure TMenuForm.ImageFrame1Click(Sender: TObject);
begin
  if MainForm.ThreadInProgress then exit;
  // Load From DataBase
  MainForm.PDBEngine.SetChildNumber(1);
  UserNameLabel2.Caption := MainForm.PDBEngine.ChildName;
  MainForm.PDBEngine.SetChildNumber(2);
  UserNameLabel3.Caption := MainForm.PDBEngine.ChildName;
  MainForm.PDBEngine.SetChildNumber(3);
  UserNameLabel4.Caption := MainForm.PDBEngine.ChildName;
  MainForm.PDBEngine.SetChildNumber(4);
  UserNameLabel5.Caption := MainForm.PDBEngine.ChildName;
  MainForm.PDBEngine.SetChildNumber(5);
  UserNameLabel6.Caption := MainForm.PDBEngine.ChildName;
  MainForm.PDBEngine.SetChildNumber(6);
  UserNameLabel7.Caption := MainForm.PDBEngine.ChildName;
  MainForm.PDBEngine.SetChildNumber(7);
  UserNameLabel8.Caption := MainForm.PDBEngine.ChildName;
  // Registration
  SetupImage.Visible := False;
  SetupOrgImage.Visible := True;
  PickUserTitleLabel.Visible := True;
  PickUserImage.Visible := True;
  UserNameLabel1.Visible := True;
  UserNameLabel2.Visible := True;
  UserNameLabel3.Visible := True;
  UserNameLabel4.Visible := True;
  UserNameLabel5.Visible := True;
  UserNameLabel6.Visible := True;
  UserNameLabel7.Visible := True;
  UserNameLabel8.Visible := True;
  if UserNameLabel2.Caption = '' then UserNameLabel2.Visible := False;
  if UserNameLabel3.Caption = '' then UserNameLabel3.Visible := False;
  if UserNameLabel4.Caption = '' then UserNameLabel4.Visible := False;
  if UserNameLabel5.Caption = '' then UserNameLabel5.Visible := False;
  if UserNameLabel6.Caption = '' then UserNameLabel6.Visible := False;
  if UserNameLabel7.Caption = '' then UserNameLabel7.Visible := False;
  if UserNameLabel8.Caption = '' then UserNameLabel8.Visible := False;
  ImageFrame1.Enabled := False;
  ImageFrame2.Enabled := False;
  ImageFrame3.Enabled := False;
  ImageFrame4.Enabled := False;
  ImageFrame5.Enabled := False;
  ImageFrame6.Enabled := False;
  //Toy To TV
  ToyAnimate.Visible := False;
  ToyImage.Visible := False;
  with TVAnimate do
  begin
    Active := False;
    FileName := MainForm.GraphicsPath+'TV.AVI';
    Active := True;
    Left := 627;
    Top := 308;
    Width := 101;
    Height := 104;
  end;
  //
  MainForm.BackGroundTalking(MainForm.AboutYouPath + 'vo0041.wav','S');
end;
procedure TMenuForm.ImageFrame2Click(Sender: TObject);
begin // Sing Along
  Hide;
  CreatorForm.Show;
  CreatorForm.PlayMovie;
  SingAlongForm.PlaySongs;
  { with SingALongForm do
    begin
      Space1.Enabled := True;
      PlaySongs;
      Show;
    end; }
end;
procedure TMenuForm.ImageFrame3Click(Sender: TObject);
begin // Execute INTRO
  MainForm.Hide;
  Hide;
  StatusForm.Caption := 'Storyteller How-to-Play Status';
  StatusForm.Show;
  if MainForm.PDBEngine.ToyNumber < 0 then
  begin
    SimulationForm.Show;
    MotionSimulationForm.Show;
  end;
  Thread1 := TIntro.Create('Intro');
end;

procedure TMenuForm.ImageFrame4Click(Sender: TObject);
begin // Execute PLAY
  MainForm.Hide;
  Hide;
  StatusForm.Caption := 'Storyteller Play Status';
  StatusForm.Show;
  if MainForm.PDBEngine.ToyNumber < 0 then
  begin
    SimulationForm.Show;
    MotionSimulationForm.Show;
  end;
  Thread1 := TIntro.Create('Play');
end;

procedure TMenuForm.ImageFrame5Click(Sender: TObject);
begin //
  Hide;
  CheckUpForm.ActivateTheAnimation(True);
  CheckUpForm.Show;
end;

procedure TMenuForm.ImageFrame6Click(Sender: TObject);
begin // Exit
  Close;
  MainForm.Close;
end;
procedure TMenuForm.MenuImageMouseMove(Sender: TObject; Shift: TShiftState;
  X, Y: Integer);
begin
  CurrentButton := '';
  SetupImage.Visible := False;
  AnimationImage.Visible := False;
  IntroImage.Visible := False;
  PlayImage.Visible := False;
  CheckImage.Visible := False;
  ExitImage.Visible := False;
end;

procedure TMenuForm.ShowRegistration(Value : string);
begin
  with RegistrationForm do
  begin
    CurrentItem := '';
    SecretName := '';
    Gender := '';
    DateOfBirth := '';
    EyeColor := '';
    HairColor := '';
    BedTimeHour := '';
    FavoriteColor := '';
    FavoriteFood := '';
    FavoriteActivity := '';
    FavoriteAnimal := '';
    BoyImage.Visible := False;
    BoyHairYellowImage.Visible := False;
    BoyHairBlackImage.Visible := False;
    BoyHairOrangeImage.Visible := False;
    BoyHairBrownImage.Visible := False;
    BoyEyeBlueImage.Visible := False;
    BoyEyeGreenImage.Visible := False;
    BoyEyeBrownImage.Visible := False;
    BoyEyeBlackImage.Visible := False;
    BoyShirtYellowImage.Visible := False;
    BoyShirtBlueImage.Visible := False;
    BoyShirtRedImage.Visible := False;
    GirlImage.Visible := False;
    GirlHairYellowImage.Visible := False;
    GirlHairBrownImage.Visible := False;
    GirlHairOrangeImage.Visible := False;
    GirlHairBlackImage.Visible := False;
    GirlEyeBlueImage.Visible := False;
    GirlEyeGreenImage.Visible := False;
    GirlEyeBrownImage.Visible := False;
    GirlEyeBlackImage.Visible := False;
    GirlShirtYellowImage.Visible := False;
    GirlShirtBlueImage.Visible := False;
    GirlShirtRedImage.Visible := False;
    FavoritePanel.Visible := False;
    BirthDayPanel.Visible := False;
    BedTimeHourPanel.Visible := False;
  end;
  RegistrationForm.UserNameLabel.Caption := Value;
  MainForm.PDBEngine.ChildName := Value;
  MainForm.PDBEngine.UpDateCurrentChild;
  RegistrationForm.LoadFromDataBase;
  //
  with RegistrationForm do
  begin
    if SecretName = '' then
    begin
      AboutYouLabel.Visible := False;
      AboutSexLabel.Visible := False;
      AboutAgeLabel.Visible := False;
      AboutEyeLabel.Visible := False;
      AboutHairLabel.Visible := False;
      AboutBedTimeLabel.Visible := False;
      FavoritesLabel.Visible := False;
      FavoritesColorLabel.Visible := False;
      FavoritesFoodLabel.Visible := False;
      FavoritesActivityLabel.Visible := False;
      FavoritesAnimalLabel.Visible := False;
    end
    else
    begin
      AboutYouLabel.Visible := True;
      AboutSexLabel.Visible := True;
      AboutAgeLabel.Visible := True;
      AboutEyeLabel.Visible := True;
      AboutHairLabel.Visible := True;
      AboutBedTimeLabel.Visible := True;
      FavoritesLabel.Visible := True;
      FavoritesColorLabel.Visible := True;
      FavoritesFoodLabel.Visible := True;
      FavoritesActivityLabel.Visible := True;
      FavoritesAnimalLabel.Visible := True;
      DrawBoyOrGirl;
    end;
  end;
  //
  MainForm.PDBEngine.SetCurrentToFirst;
  //
  MainForm.Hide;
  Hide;
  RegistrationForm.Show;
  RegistrationForm.ShowVIfSelected;
  MainForm.BackGroundTalking(MainForm.AboutYouPath + 'vo0047.wav','S');
end;
procedure TMenuForm.Enter1Click(Sender: TObject);
begin // Enter = OK
  if OKButtonImage.Visible then OKButtonImageClick(nil);
end;

procedure TMenuForm.Escape1Click(Sender: TObject);
begin // Exit
  Close;
  MainForm.Close;
end;

procedure TMenuForm.FormClose(Sender: TObject; var Action: TCloseAction);
begin
  TVAnimate.Active := False;
end;

end.
==============================================================
Copyright (c) 1995-1998 Creator Ltd. All Rights Reserved
==============================================================
Description : This is the PanelControls unit.

unit PanelControls;

interface

uses
  Windows, Messages, SysUtils, Classes, Graphics, Controls, Forms,
  Dialogs, Buttons, ExtCtrls;

type
  TPanelControlForm = class(TForm)
    Panel1: TPanel;
    PauseButton: TSpeedButton;
    StartButton: TSpeedButton;
    StopButton: TSpeedButton;
    procedure StopButtonClick(Sender: TObject);
    procedure StartButtonClick(Sender: TObject);
    procedure PauseButtonClick(Sender: TObject);
    procedure FormCreate(Sender: TObject);
    procedure FormHide(Sender: TObject);
  private
    { Private declarations }
  public
    { Public declarations }
    Status : string;
  end;

var
  PanelControlForm: TPanelControlForm;

implementation

{$R *.DFM}

procedure TPanelControlForm.StopButtonClick(Sender: TObject);
begin
  Status := 'STOP';
end;

procedure TPanelControlForm.StartButtonClick(Sender: TObject);
begin
  Status := 'START';
end;

procedure TPanelControlForm.PauseButtonClick(Sender: TObject);
begin
  Status := 'PAUSE';
end;

procedure TPanelControlForm.FormCreate(Sender: TObject);
begin
  Status := '';
end;

procedure TPanelControlForm.FormHide(Sender: TObject);
begin
  Status := '';
end;

end.

==============================================================
Copyright (c) 1995-1998 Creator Ltd. All Rights Reserved
==============================================================
Description : This is the PDBEngine unit.

unit PDBEngine;
// Pseudo DataBase Engine

interface

uses
  Classes, Windows, SysUtils;

type
  TPDBEngine = class(TComponent)
  private
    // Registration
    FChildName         : string;
    FChildSex          : string;
    FChildEyeColor     : string;
    FChildHairColor    : string;
    FBedTimeHour       : string;
    FBirthDay          : string;
    FSecretName        : string;
    FFavoriteColor     : string;
    FFavoriteFood      : string;
    FFavoriteActivity  : string;
    FFavoriteAnimal    : string;
    FChildNumber       : Integer;
    FVisitSongMenu     : Integer;
    FVisitGameMenu     : Integer;
    FVisitStoryMenu    : Integer;
    FVisitBunnyShort   : Integer;
    FVisitBunnyLong    : Integer;
    FVisitPrincess     : Integer;
    FBunnyFavoriteFood : string;
    // Configuration
    FToyNumber         : Integer;
    FDataBasePath      : string;
    // Multi Toys
    FMultiToy1         : Integer;
    FMultiToy2         : Integer;
    FMultiToy3         : Integer;
    FMultiToy4         : Integer;
    FMultiToy5         : Integer;
    FMultiToy6         : Integer;
    FMultiToy7         : Integer;
    FMultiToy8         : Integer;
  protected
    // Data[ChildNumber,FieldNumber]
    Data : Array[1..10,1..40] of string; // change const; Example: Data[i,j] := 'test';
    procedure ClearDataArray;
  public
    procedure LoadRegistration;
    procedure SaveRegistration;
    procedure InsertNewChild; // Become First in the List (Array)
    procedure UpDateCurrentChild;
    procedure SetChildNumber (Value : Integer);
    procedure LoadConfiguration;
    procedure SaveConfiguration;
    procedure SetCurrentToFirst;
    procedure LoadMultiToys;
    procedure SaveMultiToys;
  published
    // Registration
    property ChildName : string read FChildName write FChildName;
    property ChildSex : string read FChildSex write FChildSex;
    property ChildEyeColor : string read FChildEyeColor write FChildEyeColor;
    property ChildHairColor : string read FChildHairColor write FChildHairColor;
    property BedTimeHour : string read FBedTimeHour write FBedTimeHour;
    property BirthDay : string read FBirthDay write FBirthDay;
    property SecretName : string read FSecretName write FSecretName;
    property FavoriteColor : string read FFavoriteColor write FFavoriteColor;
    property FavoriteFood : string read FFavoriteFood write FFavoriteFood;
    property FavoriteActivity : string read FFavoriteActivity write FFavoriteActivity;
    property FavoriteAnimal : string read FFavoriteAnimal write FFavoriteAnimal;
    property ChildNumber : Integer read FChildNumber write SetChildNumber;
    property VisitSongMenu : Integer read FVisitSongMenu write FVisitSongMenu;
    property VisitGameMenu : Integer read FVisitGameMenu write FVisitGameMenu;
    property VisitStoryMenu : Integer read FVisitStoryMenu write FVisitStoryMenu;
    property VisitBunnyShort : Integer read FVisitBunnyShort write FVisitBunnyShort;
    property VisitBunnyLong : Integer read FVisitBunnyLong write FVisitBunnyLong;
    property VisitPrincess : Integer read FVisitPrincess write FVisitPrincess;
    property BunnyFavoriteFood : string read FBunnyFavoriteFood write FBunnyFavoriteFood;
    // Configuration
    property ToyNumber : Integer read FToyNumber write FToyNumber;
    property DataBasePath : string read FDataBasePath write FDataBasePath;
    // Multi Toys
    property MultiToy1 : Integer read FMultiToy1 write FMultiToy1;
    property MultiToy2 : Integer read FMultiToy2 write FMultiToy2;
    property MultiToy3 : Integer read FMultiToy3 write FMultiToy3;
    property MultiToy4 : Integer read FMultiToy4 write FMultiToy4;
    property MultiToy5 : Integer read FMultiToy5 write FMultiToy5;
    property MultiToy6 : Integer read FMultiToy6 write FMultiToy6;
    property MultiToy7 : Integer read FMultiToy7 write FMultiToy7;
    property MultiToy8 : Integer read FMultiToy8 write FMultiToy8;
  end;
const HowManyChildren = 7%
ChildProperties = 40% // change alse Array implementation procedure TPDBEngine.LoadRegistration%
var F : TextFile%
i, j : integer%
begin ClearDataArray%

SUBSTITUTE SHEET (RULE 26) try AssignFile (F,DatabasePath+'Registration.CRE')~
Reset (F):
i : 1:
while not EOF(F) do begin for j:=1 to ChildProperties do Readln(F,Data[i,j)):
i := i + 1:
end:
CloseFile(F):
except SaveRegistration:
end:
ChildNumber := 1:
end: .
procedure TPDBEngine.SaveRegistration:
var F : TextFile:
i, j . Integer;
begin AssignFile (F,DatabasePath+'Registration.CRE'):
Rewrite(F1:
for i:=1 to HowManyChildren do begin for j:=1 to ChildProperties do WritelniF.Data[i.jJ)~
end:
CloseFile(F);
end:
// All The Data Shift 1 Step: Nine Become Ten, Eight Become Nine ....
// First Become Second.
// The New Data will be in the First Record.
procedure TPDBEngine.InsertNewChild:
var index : integer;
i . integer:
j . integer:
begin for i : (HowManyChildren-1) downto 1 do begin for j : 1 to ChildProperties do begin l76 SUBSTITUTE SHEET (RULE 26) Data[i+l, j] .= Data(i,j]~
end;
end;
// index := ChildNumber:
ChildNumber := 1:
UpDateCurrentChild:
// ChildNumber := index;
end:
procedure TPDBEngine.UpDateCurrentChild:
begin Data(ChildNumber,l] = ChildName .

Data[ChildNumber,2] = ChildSex .

Data[ChildNumber,3] = ChildEyeColor .

Data[ChildNumber,9] = ChildHairColor ;
.

Data(ChildNumber,5] = BedTimeHour ;
.

Data[ChildNumber,6] = BirthDay .

Data(ChildNumber,7] = SecretName .

Data(ChildNumber,8] = FavoriteColor .

Data(ChildNumbez,9] = FavoriteFood .

Data[ChildNumber,l0]= FavoriteActivity:
.

Data[ChildNumber,ll]= FavoriteAnimal .

Data[ChildNumber,l2]= IntToStr(VisitSongMenu) .

Data[ChildNumber,l3]= IntToStr(VisitStoryMenu) .

Data[ChildNumber,l4]= IntToStr(Visit8unnyShort) . ;

Data[ChildNumber,l5]= IntToStr(VisitBunnyLong) .

Data(ChildNumber,l6]= IntToStr(VisitGameMenu) .

Data(ChildNumber,l7]= IntToStr(VisitPrincess) .

Data[ChildNumber,l8]= BunnyFavoriteFood .

SaveRegistration:

end:

procedure TPDBEngine.SetChildNumber (Value : Integer):
begin if (Value > 0) and (Value < HowManyChildren+1) then FChildNumber := Value else FChildNumber := 1;
ChildName . Data[ChildNumber,l];
ChildSex . Data[ChildNumber,2];
ChildEyeColor . Data(ChildNumber,3]:
ChildHairColor . Data[ChildNumber,4]:
BedTimeHour . Data(ChildNumber,5]7 BirthDay . Data[ChildNumber,6]:
SecretName . Data[ChildNumber,7]t SUBSTITUTE SHEET (RULE 26) FavoriteColor Data(ChildNumber,8];
.

FavoriteFood Data[ChildNumber,9]:
.

FavoriteActivityData[ChildNumber,l0]:
.

FavoriteAnimal Data(ChildNumber,ll]:
.

VisitSongMenu StrToIntDef(Data[ChildNumber,l2],0);
.

VisitStoryMenu StrToIntDef(Data(ChildNumber,l3],0)J
.

VisitBunnyShort StrToIntDef(Data[ChildNumber,l4],0):
.

VisitBunnyLong StrToIntDef(Data(ChildNumber,l5],0):
.

VisitGameMenu StrToIntDef(Data(ChildNumber,l6],0):
.

VisitPrincess StrToIntDef(Data[ChildNumber,l7],0);
.

BunnyFavoriteFoodData(ChildNumber,l8]:
.

end:
procedure TPDBEngine.ClearDataArray7 var i . integer:
j . integer;
begin for i : 1 to HowManyChildren do begin foz j :- 1 to ChildProperties do begin Data(i,j] . "
end:
end:
end;
procedure TPDBEngine.LoadConfiguration:
Var F : TextFile:
begin FToyNumber :- 0:
try AssignFile (F,DatabasePath+'Configuration.CRE'):
Reset (F):
Readln(F,FToyNumber);
CloseFile(F):
except SaveConfiguration;
end;
end:
procedure TPDBEngine.SaveConfiguration:

SUBSTITUTE SHEET (RULE 26) var F : TextFile:
begin AssignFile (F,DatabasePath+'Configuration.CRE'):
Rewrite(F):
Writeln(F,FToyNumber):
CloseFile(F):
end: .
procedure TPDBEngine.SetCurrentToFirst;
var i . integer;
Temp : string:
begin //
While ChildNumber > 1 do begin for i := 1 to ChildProperties do begin Temp .= Data[ChildNumber,i]:
Data[ChildNumber,i] .= Data[ChildNumber-l,i]:
Data[ChildNumber-l,il := Temp:
end:
ChildNumber := ChildNumber - 1:
end:
end:
procedure TPDBEngine.LoadMultiToys;
var F : TextFile:
begin FToyNumber := 0:
try AssignFile (F,DatabasePath+'MultiToys.CRE'):
Reset (F):
Readln(F,FMultiToyl):
Readln(F,FMultiToy2):
Readln(F,FMultiToy3);
Readln(E,FMultiToy4):
Readln(F,FMultiToyS):
Readln(F,FMultiToy6):
Readln(F,FMultiToy7):
Readln(F,FMultiToyB):

SUBSTITUTE SHEET (RULE Z6) CloseFile(F):
except SaveConfiguration;
end;
end;
procedure TPDBEngine.SaveMultiToys;
var F : TextFile;
begin AssignFile (F,DatabasePath+'MultiToys.CRE'):
Rewrite(FI:
Writeln(F,FMultiToyl);
Writeln(F,FMultiToy2):
Writeln(F,FMultiToy3);
Writeln(F,FMultiToy4):
Writeln(F,FMultiToyS);
Writeln(F,FMultiToy6):
Writeln(F,FMultiToy7);
Writeln(F,FMultiToyB);
CloseFile(F)t end;
end.

SUBSTITUTE SHEET (RULE 26) xaaas==~~~aasasvs-=s=oas==e- =a=coaaa~esc===o~==csxaox~sxx Copyright (c) .I995-1998 Creator Ltd. A11 Rights Reserved ==aaoaasa:aaa-ssasccsv=s=sxaaocxaossaxaox=a=sasa=asxxcaxx~x~ocx Description : This is the PestoSong unit.
unit PestoSong:
interface uses Windows, Messages, SysUtils, Classes, Graphics, Controls, Forms, Dialogs, Menus, ExtCtrls, MPlayer:
type TPestoSongForm = class(TForm) MainMenul: TMainMenu;
testl: TMenuItem:
spacel: TMenuItem:
MediaPlayerl: TMediaPlayer;
Panell: TPanel:
Timerl: TTimer:
Escapel: TMenuItem:
procedure spacelClick(Sender: TObject):
procedure FormCreate(Sender: TObject):
procedure TimerlTimer(Sender: TObject)~
procedure EscapelClick(Sender: TObject):
procedure FormClose(Sender: TObject; var Action: TCloseAction):
private [ Private declarations ) public ( Public declarations ) FirstTimePlay : Boolean.
Section . Integer:
procedure PlayMovie:
procedure GoToMenu;
procedure PlaySection (value : Integer):
procedure ToyTalk(NumbersOfToy : string :Wave : string :motion :string):
end:
var PestoSongForm: TPestoSongForm:

SUBSTITUTE SHEET (RULE 26) implementation uses Main, Menu;
[$R *.DFM}
procedure TPestoSongForm.spacelClick(Sender: TObject);
begin //space if MainForm.ThreadInProgress = True then exit:
GoToMenu:
end:
procedure TPestoSongForm.GoToMenu:
begin Timerl.Enabled := False;
Spacel.Enabled := False;
MediaPlayerl.Stop:
MediaPlayerl.Close:
hide:
MenuForm.Show:
end;
procedure TPestoSongForm.PlayMovie:
begin MediaPlayerl.Play:
Toy2alk('All','StoryTeller.wav','S'):
end;
procedure TPestoSongForm.PlaySection (Value : integer);
begin MediaPlayerl.Close;
case Value of 1: begin MediaPlayerl.FileName := MainForm.GraphicsPath + 'Logo.avi';
ToyTalk('One','Logo.wav','S'):
end:
2: begin MediaPlayerl.FileName := MainForm.GraphicsPath + 'Alonel.mov':
ToyTalk('One','Alonel.wav','S'):
end;

SUBSTITUTE SKEET (RULE 26) WO 99/54015 PC?/IL99/00202 3: begin MediaPlayerl.FileName := MainForm.GraphicsPath + 'Alone2.mov':
ToyTalk('One','Alone2.wav','S'):
end;
4: begin MediaPlayerl.FileName := MainForm.GraphicsPath + 'Alone3.mov':
ToyTalk('One','Alone3.wav','S'):
end;
5: begin MediaPlayerl.FileName := MainForm.GraphicsPath + 'All.mov':
ToyTalk('All'.'All.wav','S'):
end:
end:
MediaPlayerl.open:
MediaPlayerl.Play:
end:
procedure TPestoSongForm.FormCreate(Sender: TObject):
begin Panell.Cursor := crNone:
Timerl.Enabled := False;
Timerl.Interval := 60000;
Cursor := crNone~
MediaPlayerl.FileName := MainForm.GraphicsPath + 'StoryTeller.avi':
MediaPlayerl.open;
FirstTimePlay := True;
end:
procedure TPestoSongForm.TimerlTimer(Sender: TObject):
begin //
GoToMenu;
end;
procedure TPestoSongForm.EscapelClick(Sender: TObject);
begin // Exit if MainForm.ThreadlnProgress = True then exit:
MediaPlayerl.stop~

SUBSTITUTE SHEET (RULE 26) MediaPlayerl.Close:
Hide;
MenuForm.Show;
end;
procedure TPestoSongForm.ToyTalk(NumbersOfToy : string :Wave : string ;motion :string);
var ToyNo : Integer;
begin if NumbersOfToy = 'All' then ToyNo := 85 //55H
else ToyNo := MainForm.Toy.ToyNumber;
MainForm.TalkInBackGround (ToyNo,MainForm.AudioPath +Wave, "):
end;
procedure TPestoSongForm.ForsnClose(Sender: TObject;
var Action: TCloseActionl:
begin try MediaPlayerl.stop:
except end:
try MediaPlayerl.Close:
except end;
end;
end.

SUBSTITUTE SHEET (RULE 26) =~:~aaas-co==a-ss==ao==s~ x= ~ea==sa~°x==cssxa=a==csx---~-ssx=a Copyright (c) 1995-1998 Creator Ltd. A11 Rights Reserved ssss:axaxxsxa=x- =o=s=o===a=xx==axaxxxmo==ssxx=sx3axsaaxxaxaaaams Description : This is the Registration unit.
unit Registration;
interface uses Windows, Messages, SysUtils, Classes. Graphics, Controls, Forms, Dialogs, ExtCtrls,jpeg, StdCtrls, Buttons, Spin, Grids, Calendar, ComCtrls, Menus;
type TRegistrationForm = class(TForm) RegistrationImage: TImage;
RegistrationBackImage: TImage:
UserNameLabel: TLabel;
BoyImage: TImage;
BoyHairYellowImage: TImage;
BoyHairBlackImage: TImage;
BoyHairOrangeImage: TImage;
BoyHairBrownImage: TImage;
BoyEyeBluelmage: TImage;
BoyEyeGreenImage: TImage;
BoyEyeBrownImage: TImage;
BoyEyeBlackImage: TImage;
BoyShirtYellowImage: TImage;
BoyShirtBlueImage: TImage;
BoyShirtRedImage: TImage;
Girllmage: TImage:
GirlHairYellowlmage: TImage:
GirlHairBrownImage: TImage:
GirlHairOrangelmage: Tlmage;
GirlHairBlackImage: Timage;
GirlEyeBlueImage: TImage;
GirlEyeGreenlmage: TImage;
GirlEyeBrownImage: TImage;
GirlEyeBlacklmage: Tlmage;
GirlShirtYellowImage: Tlmage;
GirlShirtBlueImage: TImage;
GirlShirtRedImage: TImage;

SUBSTITUTE SHEET (RULE 26) AboutYouLabel: TLabel;
AboutSexLabel: TLabel;
AboutAgeLabel: TLabel;
RboutEyeLabel: TLabel;
AboutHairLabel: TLabel;
AboutBedTimeLabel: TLabel;
FavoritesLabel: TLabel;
FavoritesColorLabel: TLabel;
FavoritesFoodLabel: TLabel;
FavoritesActivityLabel: TLabel;
FavoritesAnimalLabel: TLabel;
FavoritePanel: TPanel;
PanelImage: TImage;
PanelLabell: TLabel;
PanelLabel2: TLabel;
PanelLabel3: TLabel;
PanelLabel4: TLabel;
SecretNameLabel: TLabel;
GoOutArrowlmage: TImage:
BirthDayPanel: TPanel;
BirthDayImage: TImage;
Calendarl: TCalendar;
SpinEditl: TSpinEdit;
ComboBoxl: TComboBox;
BedTimeHourPanel: TPanel;
BedTimeHourImage: TImage;
ComboBox2: TComboBox;
BesrEyesAnimate: TAnimate;
SteemAnimate: TAnimate:
WheelsAnimate: TAnimate;
BirthDayOKImage: TImage;
BedTimeHourOKImage: TImage;
VGenderImage: TImage;
VBirthdaylmage: TImage;
VEyeColorImage: Tlmage;
VHairColorImage: Tlmage;
VBedTimeHourImage: TImage;
VFavoriteColorImage: TImage:
VFavoriteFoodlmage: TImage;
VFavoriteActivityImage: Tlmage;
VFavoriteAnimalImage: TImage:
MainMenul: TMainMenu:
testl: TMenuItem;

SUBSTITUTE SHEET (RULE 26) *rB

Escapel: TMenuItem;
BallJumpAnimate: TAnimate;
procedure FormCreate(Sender: TObject);
procedure RegistrationBacklmageClick(Sender: TObject);
procedure AboutSexLabelClick(Sender: TObject);
procedure AboutAgeLabelClick(Sender: TObject);
procedure AboutEyeLabelClick(Sender: TObject);
procedure AboutHairLabelClick(Sender: TObject);
procedure AboutBedTimei.abelClick(Sender: TObject);
procedure PanelLabellClick(Sender: TObject);
procedure PanelLabel2Click(Sender: TObject);
procedure PanelLabel3Click(Sender: TObject);
procedure PanelLabel4Click(Sender: TObject);
procedure FavoritesColorLabelClick(Sender: TObject);
procedure FavoritesFoodLabelClick(Sender: TObject);
procedure FavoritesActivityLabelClick(Sender: TObject);
procedure FavoritesAnimalLabelClick(Sender: TObjectl;
procedure AboutSexLabelMouseMove(Sender: TObject; Shift: TShiftState:
X, Y: Integer);
procedure AboutAgeLabelMouseMove(Sender: TObject; Shift: TShiftState:
X, Y: Integerl:
procedure AboutEyeLabelMouseMove(Sender: TObject; Shift: TShiftState:
X, Y: Integer):
procedure AboutHairLabelMouseMove(Sender: TObject; Shift: TShiftState:
X, Y: Integer):
procedure AboutBedTimeLabelMouseMove(Sender: TObject;
Shift: TShiftState; X, Y: Integer):
procedure PanelLabellMouseMove(Sender: TObject; Shift: TShiftState; X, Y: Integer);
procedure PanelLabel2MouseMove(Sender: TObject; Shift: TShiftState: X, Y: Integer):
procedure PanelLabel3MouseMove(Sender: TObject; Shift: TShiftState; X, Y: Integer):
pzocedure PanelLabel4MouseMove(Sender: TObject; Shift: TShiftState; X, Y: Integer):
procedure FavoritesColorLabelMouseMove(Sender: TObject;
Shift: TShiftState; X, Y: Integerl:
procedure FavoritesFoodLabelMouseMove(Sender: TObject;
Shift: TShiftState; X, Y: Integer);
procedure FavoritesActivityLabelMouseMove(Sender: TObject;
Shift: TShiftState; X, Y: Integer);
procedure FavoritesAnimalLabelMouseMove(Sender: TObject;
Shift: TShiftState; X, Y: Integer);

SUBSTITUTE SHEET (RULE 26) *rB

procedure SecretNameLabelClick(Sender: TObject);
procedure RegistrationImageMouseMove(Sender: TObject;
Shift: TShiftState; X, Y: Integer);
procedure SecretNameLabelMouseMove(Sender: TObject; Shift: TShiftState;
X, Y: Integer):
procedure RegistrationBackImageMouseMove(Sender: TObject;
Shift: TShiftState; X, Y: Integer);
procedure GoOutArrowlmageClick(Sender: TObject);
procedure ComboBoxlChangelSender: TObject);
procedure SpinEditlChange(Sender: TObject);
procedure CalendarlChange(Sender: TObject);
procedure BirthDayImageClickISender: TObject);
procedure RegistrationlmageClick(Sender: TObject);
procedure BedTimeHourlmageClick(Sender: TObject);
procedure ComboBox2Change(Sender: TObject);
procedure BirthDayOKImageClick(Sender: TObject);
procedure BedTimeHourOKImageClick(Sender: TObject);
procedure EscapelClick(Sender: TObject);
private ( Private declarations ) public ( Public declarations ) CurrentItem . string;
ChildName . string;
SecretName . string;
Gender . string;
DateOfBirth . string;
EyeColor . string;

HairColor . string;

BedTimeHour . string;

FavoriteColor . string;

FavoriteFood . string;

FavoriteActivity. string;

FavoriteAnimal . string:

procedure InitialReg;
procedure DrawBoyorGirl;
procedure AssignCurrentItem (Value : string);
procedure GoBackToMenu;
procedure ChoosePanelLabel(Value : Integer);
procedure SaveToDataBase;
procedure LoadFromDataBase;
procedure BackgroungSpeaking (Value: string);
procedure ShowVIfSelected;
l88 SUBSTITUTE SHEET (RULE 26) end:
var RegistrationFOrm: TRegistrationForm;
implementation uses Main, Menu;
{$R *.DFM}
procedure TRegistrationForm.FormCreate(Sender: TObject):
begin //Maximize WindowState := wsMaximized;
RegistrationBackImage.Cursor := crDefault:
InitialReg;
end;
procedure TRegistrationForm.RegistrationBacklmageClick(Sender: TObject);
begin //
if MainForm.ThreadInProgress then exit:
GoBackToMenu;
end:
procedure TregistrationForm.GoBackToMenu:
begin with MenuForm do begin PickUserImage.Visible .= False:
// Reg 1 PickUserTitleLabel.Visible: False;

UserNameLabell.VisibleFalse;
:=

UserNameLabel2.VisibleFalse;
:=

UserNameLabel3.VisibleFalse:
:=

UserNameLabel4.VisibleFalse:
:=

UserNameLabelS.VisibleFalse;
:=

UserNameLabel6.VisibleFalse:
:=

UserNameLabel7.VisibleFalse:
:=

UserNameLabelB.VisibleFalse;
:=

// Reg 2 PickUserLabell.VisibleFalse:
.=

SUBSTITUTE SHEET (RULE 26) PickUserLabel2.Visible .- False;
UserNameEdit.Visible .= False;
OKButtonlmage.Visible .= False;
SetUpOrgImage.Visible .= False;

SetupImage.Visible := True;

ImageFramel.Enabled .= True;

ImageFrame2.Enabled .= True;

ImageFrame3.Enabled .= True;

ImageFrame4.Enabled .= True;

ImageFrameS.Enabled .= True;

ImageFrame6.Enabled .= True;

//Toy To TV

ToyAnimate.Visible : = True;

ToyImage.Visible . = True;

with TVAnimate do begin Active .= False;

FileName := MainForm.6raphicsPath+'noise.AVI';

Active .= True;

Left := 629;

Top .= 318;

Width .= 112;

height .= 88;

end;

//
end;
SaveToDataBase;
Close;
MenuForm.Show;
end;
procedure TRegistrationForm.InitialReg;
begin
  RegistrationImage.Visible := True;
  RegistrationBackImage.Visible := True;
  UserNameLabel.Visible := True;
  BoyImage.Visible := False;
  BoyHairYellowImage.Visible := False;
  BoyHairBlackImage.Visible := False;
  BoyHairOrangeImage.Visible := False;
  BoyHairBrownImage.Visible := False;
  BoyEyeBlueImage.Visible := False;
  BoyEyeGreenImage.Visible := False;
  BoyEyeBrownImage.Visible := False;
  BoyEyeBlackImage.Visible := False;
  BoyShirtYellowImage.Visible := False;
  BoyShirtBlueImage.Visible := False;
  BoyShirtRedImage.Visible := False;
  GirlImage.Visible := False;
  GirlHairYellowImage.Visible := False;
  GirlHairBrownImage.Visible := False;
  GirlHairOrangeImage.Visible := False;
  GirlHairBlackImage.Visible := False;
  GirlEyeBlueImage.Visible := False;
  GirlEyeGreenImage.Visible := False;
  GirlEyeBrownImage.Visible := False;
  GirlEyeBlackImage.Visible := False;
  GirlShirtYellowImage.Visible := False;
  GirlShirtBlueImage.Visible := False;
  GirlShirtRedImage.Visible := False;
  AboutYouLabel.Visible := False;
  AboutSexLabel.Visible := False;
  AboutAgeLabel.Visible := False;
  AboutEyeLabel.Visible := False;
  AboutHairLabel.Visible := False;
  AboutBedTimeLabel.Visible := False;
  FavoritesLabel.Visible := False;
  FavoritesColorLabel.Visible := False;
  FavoritesFoodLabel.Visible := False;
  FavoritesActivityLabel.Visible := False;
  FavoritesAnimalLabel.Visible := False;
  FavoritePanel.Visible := False;
  //RegistrationImage.Cursor := 6;
  AboutSexLabel.Cursor := 5;
  AboutAgeLabel.Cursor := 5;
  AboutEyeLabel.Cursor := 5;
  AboutHairLabel.Cursor := 5;
  AboutBedTimeLabel.Cursor := 5;
  FavoritesColorLabel.Cursor := 5;
  FavoritesFoodLabel.Cursor := 5;
  FavoritesActivityLabel.Cursor := 5;
  FavoritesAnimalLabel.Cursor := 5;
  PanelImage.Cursor := 5;
  PanelLabel1.Cursor := 5;
  PanelLabel2.Cursor := 5;
  PanelLabel3.Cursor := 5;
  PanelLabel4.Cursor := 5;
  SecretNameLabel.Cursor := 5;
  RegistrationBackImage.Cursor := 5;
  GoOutArrowImage.Cursor := 5;
  BedTimeHourOKImage.Cursor := 5;
  BirthDayOKImage.Cursor := 5;
  CurrentItem := '';
  SecretName := '';
  Gender := '';
  DateOfBirth := '';
  EyeColor := '';
  HairColor := '';
  BedTimeHour := '';
  FavoriteColor := '';
  FavoriteFood := '';
  FavoriteActivity := '';
  FavoriteAnimal := '';
  ComboBox1.Items.Add('January');
  ComboBox1.Items.Add('February');
  ComboBox1.Items.Add('March');
  ComboBox1.Items.Add('April');
  ComboBox1.Items.Add('May');
  ComboBox1.Items.Add('June');
  ComboBox1.Items.Add('July');
  ComboBox1.Items.Add('August');
  ComboBox1.Items.Add('September');
  ComboBox1.Items.Add('October');
  ComboBox1.Items.Add('November');
  ComboBox1.Items.Add('December');
  SpinEdit1.Value := 1995;
  ComboBox2.Items.Add('6:00 PM');
  ComboBox2.Items.Add('6:30 PM');
  ComboBox2.Items.Add('7:00 PM');
  ComboBox2.Items.Add('7:30 PM');
  ComboBox2.Items.Add('8:00 PM');
  ComboBox2.Items.Add('8:30 PM');
  ComboBox2.Items.Add('9:00 PM');
  ComboBox2.Items.Add('9:30 PM');
  ComboBox2.Items.Add('10:00 PM');
  ComboBox2.Items.Add('10:30 PM');
  with BedTimeHourPanel do
  begin
    Left := 135;
    Top := 335;
    Width := 157;
    Height := 78;
  end;
  with BirthDayPanel do
  begin
    Left := 134;
    Top := 239;
    Width := 278;
    Height := 201;
  end;
  BearEyesAnimate.FileName := MainForm.GraphicsPath + 'BearEye.avi';
  SteemAnimate.FileName := MainForm.GraphicsPath + 'Steem.avi';
  WheelsAnimate.FileName := MainForm.GraphicsPath + 'Wheels.avi';
  BallJumpAnimate.FileName := MainForm.GraphicsPath + 'BallJump.avi';
  BearEyesAnimate.Active := True;
  SteemAnimate.Active := True;
  WheelsAnimate.Active := True;
  BallJumpAnimate.Active := True;
end;
procedure TRegistrationForm.AboutSexLabelClick(Sender: TObject);
begin
  //
  if MainForm.ThreadInProgress then exit;
  if Gender = 'Boy' then ChoosePanelLabel(1);
  if Gender = 'Girl' then ChoosePanelLabel(2);
  PanelLabel1.Caption := 'Boy';
  PanelLabel2.Caption := 'Girl';
  with FavoritePanel do
  begin
    Left := 139;
    Top := 204;
    Width := 225;
    Height := 85;
  end;
  FavoritePanel.Visible := True;
  BedTimeHourPanel.Visible := False;
  BirthDayPanel.Visible := False;
  CurrentItem := 'Gender';
  SteemAnimate.Visible := True;
  MainForm.BackGroundTalking(MainForm.AboutYouPath + 'ay62.wav','S');
end;
procedure TRegistrationForm.AboutAgeLabelClick(Sender: TObject);
var Temp : string;
begin
  if MainForm.ThreadInProgress then exit;
  Temp := DateOfBirth;
  if Length(DateOfBirth) = 10 then
  begin
    Calendar1.Year := StrToInt(copy(Temp,7,4));
    Calendar1.Day := StrToInt(copy(Temp,4,2));
    Calendar1.Month := StrToInt(copy(Temp,1,2));
  end;
  SpinEdit1.Value := Calendar1.Year;
  ComboBox1.ItemIndex := Calendar1.Month - 1;
  BirthDayPanel.Visible := True;
  FavoritePanel.Visible := False;
  BedTimeHourPanel.Visible := False;
  SteemAnimate.Visible := False;
  MainForm.BackGroundTalking(MainForm.AboutYouPath + 'ay63.wav','S');
end;
procedure TRegistrationForm.AboutEyeLabelClick(Sender: TObject);
begin
  //
  if MainForm.ThreadInProgress then exit;
  if EyeColor = 'Blue' then ChoosePanelLabel(1);
  if EyeColor = 'Green' then ChoosePanelLabel(2);
  if EyeColor = 'Brown' then ChoosePanelLabel(3);
  if EyeColor = 'Black' then ChoosePanelLabel(4);
  PanelLabel1.Caption := 'Blue';
  PanelLabel2.Caption := 'Green';
  PanelLabel3.Caption := 'Brown';
  PanelLabel4.Caption := 'Black';
  with FavoritePanel do
  begin
    Left := 134;
    Top := 269;
    Width := 225;
    Height := 145;
  end;
  FavoritePanel.Visible := True;
  BedTimeHourPanel.Visible := False;
  BirthDayPanel.Visible := False;
  CurrentItem := 'EyeColor';
  SteemAnimate.Visible := False;
  MainForm.BackGroundTalking(MainForm.AboutYouPath + 'ay66.wav','S');
end;
procedure TRegistrationForm.AboutHairLabelClick(Sender: TObject);
begin
  //
  if MainForm.ThreadInProgress then exit;
  if HairColor = 'Blond' then ChoosePanelLabel(1);
  if HairColor = 'Brown' then ChoosePanelLabel(2);
  if HairColor = 'Red' then ChoosePanelLabel(3);
  if HairColor = 'Black' then ChoosePanelLabel(4);
  PanelLabel1.Caption := 'Blond';
  PanelLabel2.Caption := 'Brown';
  PanelLabel3.Caption := 'Red';
  PanelLabel4.Caption := 'Black';
  with FavoritePanel do
  begin
    Left := 134;
    Top := 302;
    Width := 225;
    Height := 145;
  end;
  FavoritePanel.Visible := True;
  BedTimeHourPanel.Visible := False;
  BirthDayPanel.Visible := False;
  CurrentItem := 'HairColor';
  SteemAnimate.Visible := False;
  MainForm.BackGroundTalking(MainForm.AboutYouPath + 'ay67.wav','S');
end;
procedure TRegistrationForm.AboutBedTimeLabelClick(Sender: TObject);
begin
  if MainForm.ThreadInProgress then exit;
  ComboBox2.ItemIndex := ComboBox2.Items.IndexOf(BedTimeHour);
  BedTimeHourPanel.Visible := True;
  FavoritePanel.Visible := False;
  BirthDayPanel.Visible := False;
  SteemAnimate.Visible := False;
  MainForm.BackGroundTalking(MainForm.AboutYouPath + 'ay68.wav','S');
end;
procedure TRegistrationForm.PanelLabel1Click(Sender: TObject);
begin
  if MainForm.ThreadInProgress then exit;
  FavoritePanel.Visible := False;
  AssignCurrentItem (PanelLabel1.Caption);
  DrawBoyOrGirl;
  CurrentItem := '';
end;
procedure TRegistrationForm.PanelLabel2Click(Sender: TObject);
begin
  if MainForm.ThreadInProgress then exit;
  FavoritePanel.Visible := False;
  AssignCurrentItem (PanelLabel2.Caption);
  DrawBoyOrGirl;
  CurrentItem := '';
end;
procedure TRegistrationForm.PanelLabel3Click(Sender: TObject);
begin
  if MainForm.ThreadInProgress then exit;
  FavoritePanel.Visible := False;
  AssignCurrentItem (PanelLabel3.Caption);
  DrawBoyOrGirl;
  CurrentItem := '';
end;
procedure TRegistrationForm.PanelLabel4Click(Sender: TObject);
begin
  if MainForm.ThreadInProgress then exit;
  FavoritePanel.Visible := False;
  AssignCurrentItem (PanelLabel4.Caption);
  DrawBoyOrGirl;
  CurrentItem := '';
end;
procedure TRegistrationForm.FavoritesColorLabelClick(Sender: TObject);
begin
  //
  if MainForm.ThreadInProgress then exit;
  if FavoriteColor = 'Yellow' then ChoosePanelLabel(1);
  if FavoriteColor = 'Blue' then ChoosePanelLabel(2);
  if FavoriteColor = 'Red' then ChoosePanelLabel(3);
  PanelLabel1.Caption := 'Yellow';
  PanelLabel2.Caption := 'Blue';
  PanelLabel3.Caption := 'Red';
  with FavoritePanel do
  begin
    Left := 415;
    Top := 204;
    Width := 225;
    Height := 115;
  end;
  FavoritePanel.Visible := True;
  BedTimeHourPanel.Visible := False;
  BirthDayPanel.Visible := False;
  CurrentItem := 'FavoriteColor';
  SteemAnimate.Visible := True;
  MainForm.BackGroundTalking(MainForm.AboutYouPath + 'fa71.wav','S');
end;
procedure TRegistrationForm.FavoritesFoodLabelClick(Sender: TObject);
begin
  //
  if MainForm.ThreadInProgress then exit;
  if FavoriteFood = 'Pizza' then ChoosePanelLabel(1);
  if FavoriteFood = 'French Fries' then ChoosePanelLabel(2);
  if FavoriteFood = 'Macaroni And Cheese' then ChoosePanelLabel(3);
  PanelLabel1.Caption := 'Pizza';
  PanelLabel2.Caption := 'French Fries';
  PanelLabel3.Caption := 'Macaroni And Cheese';
  with FavoritePanel do
  begin
    Left := 415;
    Top := 236;
    Width := 225;
    Height := 115;
  end;
  FavoritePanel.Visible := True;
  BedTimeHourPanel.Visible := False;
  BirthDayPanel.Visible := False;
  CurrentItem := 'FavoriteFood';
  SteemAnimate.Visible := True;
  MainForm.BackGroundTalking(MainForm.AboutYouPath + 'fa72.wav','S');
end;
procedure TRegistrationForm.FavoritesActivityLabelClick(Sender: TObject);
begin
  //
  if MainForm.ThreadInProgress then exit;
  if FavoriteActivity = 'Drawing' then ChoosePanelLabel(1);
  if FavoriteActivity = 'Playing Computer Games' then ChoosePanelLabel(2);
  if FavoriteActivity = 'Pretending' then ChoosePanelLabel(3);
  PanelLabel1.Caption := 'Drawing';
  PanelLabel2.Caption := 'Playing Computer Games';
  PanelLabel3.Caption := 'Pretending';
  with FavoritePanel do
  begin
    Left := 415;
    Top := 271;
    Width := 225;
    Height := 115;
  end;
  FavoritePanel.Visible := True;
  BedTimeHourPanel.Visible := False;
  BirthDayPanel.Visible := False;
  CurrentItem := 'FavoriteActivity';
  SteemAnimate.Visible := True;
  MainForm.BackGroundTalking(MainForm.AboutYouPath + 'fa73.wav','S');
end;

procedure TRegistrationForm.FavoritesAnimalLabelClick(Sender: TObject);
begin
  //
  if MainForm.ThreadInProgress then exit;
  if FavoriteAnimal = 'Horse' then ChoosePanelLabel(1);
  if FavoriteAnimal = 'Dog' then ChoosePanelLabel(2);
  if FavoriteAnimal = 'Cat' then ChoosePanelLabel(3);
  PanelLabel1.Caption := 'Horse';
  PanelLabel2.Caption := 'Dog';
  PanelLabel3.Caption := 'Cat';
  with FavoritePanel do
  begin
    Left := 415;
    Top := 304;
    Width := 225;
    Height := 115;
  end;
  FavoritePanel.Visible := True;
  BedTimeHourPanel.Visible := False;
  BirthDayPanel.Visible := False;
  CurrentItem := 'FavoriteAnimal';
  SteemAnimate.Visible := True;
  MainForm.BackGroundTalking(MainForm.AboutYouPath + 'fa74.wav','S');
end;

begin if Value = 1 then PanelLabell.Font.Color= clFuchsia;
:

if Value = 2 then PanelLabel2.Font.Color= clFuchsia;
:

if Value = 3 then PanelLabel3.Font.Color= clFuchsia;
:

if Value = 4 then PanelLabel4.Font.Color= clFuchsia;
:

end:

procedure TRegistrationForm.AboutSexLabelMouseMove(Sender: TObject;
  Shift: TShiftState; X, Y: Integer);
begin
  AboutSexLabel.Font.Color := clTeal;
end;
procedure TRegistrationForm.AboutAgeLabelMouseMove(Sender: TObject;
  Shift: TShiftState; X, Y: Integer);
begin
  AboutAgeLabel.Font.Color := clTeal;
end;
procedure TRegistrationForm.AboutEyeLabelMouseMove(Sender: TObject;
  Shift: TShiftState; X, Y: Integer);
begin
  AboutEyeLabel.Font.Color := clTeal;
end;
procedure TRegistrationForm.AboutHairLabelMouseMove(Sender: TObject;
  Shift: TShiftState; X, Y: Integer);
begin
  AboutHairLabel.Font.Color := clTeal;
end;
procedure TRegistrationForm.AboutBedTimeLabelMouseMove(Sender: TObject;
  Shift: TShiftState; X, Y: Integer);
begin
  AboutBedTimeLabel.Font.Color := clTeal;
end;
procedure TRegistrationForm.PanelLabel1MouseMove(Sender: TObject;
  Shift: TShiftState; X, Y: Integer);
begin
  PanelLabel1.Font.Color := clRed;
  PanelLabel2.Font.Color := clTeal;
  PanelLabel3.Font.Color := clTeal;
  PanelLabel4.Font.Color := clTeal;
end;
procedure TRegistrationForm.PanelLabel2MouseMove(Sender: TObject;
  Shift: TShiftState; X, Y: Integer);
begin
  PanelLabel1.Font.Color := clTeal;
  PanelLabel2.Font.Color := clRed;
  PanelLabel3.Font.Color := clTeal;
  PanelLabel4.Font.Color := clTeal;
end;
procedure TRegistrationForm.PanelLabel3MouseMove(Sender: TObject;
  Shift: TShiftState; X, Y: Integer);
begin
  PanelLabel1.Font.Color := clTeal;
  PanelLabel2.Font.Color := clTeal;
  PanelLabel3.Font.Color := clRed;
  PanelLabel4.Font.Color := clTeal;
end;
procedure TRegistrationForm.PanelLabel4MouseMove(Sender: TObject;
  Shift: TShiftState; X, Y: Integer);
begin
  PanelLabel1.Font.Color := clTeal;
  PanelLabel2.Font.Color := clTeal;
  PanelLabel3.Font.Color := clTeal;
  PanelLabel4.Font.Color := clRed;
end;
procedure TRegistrationForm.FavoritesColorLabelMouseMove(Sender: TObject;
  Shift: TShiftState; X, Y: Integer);
begin
  FavoritesColorLabel.Font.Color := clTeal;
end;
procedure TRegistrationForm.FavoritesFoodLabelMouseMove(Sender: TObject;
  Shift: TShiftState; X, Y: Integer);
begin
  FavoritesFoodLabel.Font.Color := clTeal;
end;
procedure TRegistrationForm.FavoritesActivityLabelMouseMove(Sender: TObject;
  Shift: TShiftState; X, Y: Integer);
begin
  FavoritesActivityLabel.Font.Color := clTeal;
end;
procedure TRegistrationForm.FavoritesAnimalLabelMouseMove(Sender: TObject;
  Shift: TShiftState; X, Y: Integer);
begin
  FavoritesAnimalLabel.Font.Color := clTeal;
end;
if MainForm.ThreadInProgress then exit;
if SecretName = 'Bubble Gum' then ChoosePanelLabel(1):

SUBSTITUTE SHEET (RULE Z6) if SecretName = 'Rain8ow' then ChoosePanelLabel(21;
PanelLabell.caption : 'Bubble Gum';
PanelLabel2.caption : 'RainBow';
with FavoritePanel do begin Left .= 292;
Top .- 186:
Width .= 225;
Height := 85;
end;
FavoritePanel.Visible .= True;
BedTimeFiourPanel.Visible := False;
BirthDayPanel.Visible .= False;
CurrentItem : 'SecretName';
SteemAnimate.Visible := True;
MainForm.BackGroundTalking(MainForm.AboutYouPath +'fa75.wav','S'):
end:
procedure TRegistrationForm.RegistrationImageMouseMove(Sender:
TObject:

Shift: TShiftState; X, Y: Integer);

begin //

AboutSexLabel.Font.Color .= clBlue:

AboutAgeLabel.Font.Color .= clBlue;

AboutEyeLabel.Font.Color .= clBlue;

AboutHairLabel.Font.Color .= clBlue;

AboutBedTimeLabel.Font.Color .= clBlue;

FavoritesColorLabel.Font.Color:= clBlue;

FavoritesFoodLabel.Font.Color .= c181ue;

FavoritesActivityLabel.Font.Color:= clBlue;

FavoritesAnimalLabel.Font.Color.= clBlue;

PanelLabell.Font.Color .= clTeal;

PanelLabel2.Font.Color := clTeal;

PanelLabel3.Font.Color = clTeal;

PanelLabel4.Font.Color .= clTeal;

SecretNameLabel.Font.Color = clGray;

if CurrentItem = " then FavoritePanel.Visible := False;

GoOutArrowImage.Visible :=
False;

end;

SUBSTITUTE SHEET (RULE 26) procedure TRegistrationForm.SecretNameLabelMouseMove(Sender: TObject:
Shift: TShiftState: X, Y: Integer):
begin //
SecretNameLabel.Font.Color .= clFuchsia;
end;
procedure TRegistrationF'orm.DrawBoyOrGirl:
var dx : integer:
dy : integer:
begin //
BoyHairYellowImage.Visible := False:
BoyHairBlackImage.Visiblc .= False;
BoyHairOrangelmage.Visible .= False;
BoyHairBrownImage.Visible .= False:
BoyEyeBlueImage.Visible .= False;
BoyEye6reenImage.Visible := False:
BoyEyeBrownlmage.Visible .= False:
BoyEye8lackImage.Visible := False;
BoyShirtYellowImage.Visible .= False;
BoyShirtBlueImage.Visible .= False:
BoyShirtRedImage.Visible .= False:
GirlHairYellowImage.Visible .= False:
GirlHairBrownImage.Visible .= False;
GirlHairOrangeImage.Visible .= False:
GirlHairBlacklmage.Visible .= False:
GirlEye8lueImage.Visible .= False:
GirlEyeGreenlmage.Visible := False:
GirlEyeBrownlmage.Visible .= Falser GirlEyeBlackImage.Visible .= False:
GirlShirtYellowImage.Visible .= False;
GirlShirtBluelmage.Visible .= False;
GirlShirtRedImage.Visible .= False:
//
dx :- 32;

SUBSTITUTE SHEET (RULE 26) dy := 30;
if Gender = 'Boy' then begin GirlImage.Visible .= False;
with BoyImage do begin Left . = 272+dx;

Top . = 208+dy;

Width . = 201;

Height = 337;
.

Visible = True;
:

end;

if HairColor = 'Blond' then with BoyHairYellowImage do begin Left .= 309+dx;
Top .= 208+dy;
Width := 109:
Height .= 98:
Visible := True:
end;
if HairColor = 'Brown' then with BoyHairBrownImage do begin Left .= 312+dx;
Top .= 208+dy;
Width . 105;
Height .= 97;
Visible := True;
end;
if HairColor = 'Red' then with BoyHairOrangeImage do begin Left .= 312+dx;
Top .= 208+dy;
Width . 105;
Height .= 97;
Visible := True;
end;

SUBSTITUTE SHEET (RULE 2~

WO 99!54015 PCT/IL99/00202 if HairColor= 'Black' then with BoyHairBlackImage do begin Left . 311+dx;

Top .= 206+dy;

Width . 113;

Height 105;
.

Visible True:
:=

end:

if EyeColor'Blue' = then with BoyEye8luelmage do begin Left . = 352+dx:

Top . = 267+dy;

Width . = 46;

Height = 25:
.

Visible = True;
:

end;
if EyeColor'Green' = then With BoyEyeGreenImage do begin Left . = 356+dx;

Top . = 266+dy;

Width : = 49.

Height = 25;
.

Visible = True:
:

end:
if EyeColor = 'Brown' then with BoyEyeBrownimage do begin Left .= 352+dx;
Top := 267+dy;
Width .= 99;
Height .= 25;
Visible := True;
end:
if EyeColor = 'Black' then with BoyEyeBlacklmage do SUBSTTTUTE SHEET (RULE 26) begin Left . = 352+dx;

Top . = 265+dy:

Width = 99%
.

Height = 29:
.

Visible = True:
:

end:
if FavoriteColor = 'Yellow' then with BoyShirtYellowImage do begin Left .= 288+dx:
Top .= 320+dy;
Width := 185:
Height .= 193;
Visible := True;
end:
if FavoriteColor = 'Blue' then With BoyShirtBluelmage do begin Left . = 285+dx:

Top . = 319+dy:

Width . = 156:

Height = 192;
.

Visible = True;
:

end:

if FavoriteColor = 'Red' then with BoyShirtRedImage do begin Left .= 285+dx:
Top .= 312+dy:
Width .= 161:
Height .= 185:
Visible := True:
end;
end:
//
if Gender = 'Girl' then begin BoyImage.Visible .= False:

SUBSTITUTE SHEET (RULE 26) with GirlImage do begin Left . = 274+dx;

Top . - 197+dy;

Width - 177;
.

Height = 305;
.

Visible = True;
:

end;

if HairColor = 'Blond' then with GirlHairYellowImage do begin Left .= 281+dx;
Top .= 197+dy;
Width := 139;
Height .- 121;
Visible := True;
end;
if HairColor= 'Brown' then with GirlHairBrownImage do begin Left .= 277+dx:

Top .= 197+dy:

Width . 143;

Height 129;
:=

Visible True;
:=

end;

if HairColor= 'Red' then with GirlHairOrangeImage do begin Left .= 279+dx;

Top .= 197+dy:

width . 142;

Height 129:
.=

Visible True;
:=

end:
if HairColor = 'Black' then with GirlHairBlackImage do begin SUBSTITUTE SHEET (RULE 26) Left = 280+dx;
.

Top 197+dy;
.

Width = 139:
:

Height - 129;
.

Visible= True;
:

end;
if EyeColor'Blue' = then with GirlEyeBlueImage do begin Left := 360+dx;

Top .= 266+dy;

Width .= 49%

Height 33%
.=

visible True%
:=

end;
if EyeColor'Green'.
= then with GirlEyeGreenlmage do begin Left := 363+dx%

Top .= 266+dy%

Width . = 49%

Height = 25%
.

Visible = True:
:

end:
if EyeColor = 'Brown' then with GirlEyeBrownlmage do begin Left .= 363+dx%
Top := 266+dy%
Width .= 49%
Height .= 25:
Visible := True:
end;
if EyeColor'Black' = then with GirlEyeBlackImage do begin Left . = 359+dx:

Top . = 266+dy;

Width . = 49%

SUBSTITUTE SHEET (RULE 26) Height .= 25:
Visible := True;
end;
if FavoriteColor = 'Yellow' then with GirlShirtYellowImage do begin Left . = 305+dx;

Top . = 303+dy;

Width . = 144:

Height = 209:
.

Visible = True:
:

end:
if FavoriteColor = 'Blue' then with GirlShirtBlueImage do begin Left .= 302+dx:
Top .= 312+dy;
Width .= 147:
Height .= 193:
Visible := True;
end:
if FavoriteColor = 'Red' then with GirlShirtRedlmage do begin Left . = 305+dx;

Top . = 315+dy:

Width . - 143;

Height = 201;
.

Visible = True:
:

end:
end:
end:
procedure TRegistrationForm.AssignCurrentItem (Value : string):
begin if CurrentItem = 'SecretName' then begin SecretName := Value;
AboutYouLabel.Visible .= True:

SUBSTITUTE SHEET (RULE 26) AboutSexLabel.Visible .= True;

AboutAgeLabel.Visible .= True:

AboutEyeLabel.Visible .= True:

AboutHairLabel.Visible .= True;

AboutBedTimeLabel.Visible .= True;

FavoritesLabel.Visible .= True;

FavoritesColorLabel.Visible.= True:

FavoritesFoodLabel.Visible .= True:

FavoritesActivityLabel.Visible.= True:

FavoritesAnimalLabel.Visible.= True;

end:

if(CurrentItem = 'Gender') Value) then and (Gender <>

begin DateOfBirth . "

EyeColor :~ ":

HairColor :_ "

BedTimeHour . "

FavoriteColor . "

FavoriteFood . " 7 FavoriteActivity . "

FavoriteAnimal . "

Gender := Value:

end:

ifCurrentItem = 'DateOfBirth'thenDateOfBirth .= Value:

ifCurrentItem = 'EyeColor' thenEyeColor .= Value:

ifCuzrentItem = 'HairColor' thenHairColor .= Value%

ifCurrentItem = 'BedTimefiour'thenBedTimeHour .= Value;

ifCurrentItem = 'FavoriteColor'thenFavoriteColor .= Values ifCurrentItem = 'FavoriteFood'thenFavoriteFood := Value;

ifCurrentItem = 'FavoriteActivity'thenFavoriteActivity:= Value;

ifCurrentItem = 'FavoriteAnimal'thenFavoriteAnimal.= Value:

//
8ackgroungSpeaking (Value);
SteemAnimate.Visible := True:
ShowVIfSelected:
//
end:
procedure TRegistrationForm.BackgroungSpeaking (Value: string):
var TalkString : string:
begin SUBSTITUTE SHEET (RULE 26) TalkString : ":
if CurrentItem = 'SecretName' then begin if Value = 'Bubble Gum' then TalkString : 'fa75a';
if Value = 'RainBow' then TalkString : 'fa75b';
end;
if CurrentItem = 'Gender' then begin if Value = 'Boy' then TalkString : 'ay64';
if Value = 'Girl' then TalkString : 'ay65';
end:
ifCurrentItem= 'EyeColor' then begin if Value 'Blue' then TalkString 'ay66a';
= :

if Value 'Green' then TalkString 'ay66b':
= :

if Value 'Brown' then TalkString 'ay66c';
= :

if Value 'Black' then TalkString 'ay66d':
= :

end;

ifCurrentItem= 'HairColor' then begin if Value 'Blond' then TalkString : 'ay68a';
=

if Value 'Brown' then TalkString : 'ay68b';
=

if Value Talkstring : 'ay68c';
= 'Red' then if Value 'Black' then TalkString : 'ay68d';
=

end:

ifCurrentItem~ 'FavoriteColor'then begin if Value 'Yellow' then TalkString 'fa7la';
= :

if Value 'Blue' then TalkString 'fa7lb':
= :

if Value 'Red' then TalkString 'fa7lc't = :_ end:

ifCurrentItem= 'FavoriteFood'then begin if Value 'Pizza' then TalkString : 'fa72a':
=

if Value 'French Fries'then TalkString : 'fa72b'~
=

if Value 'Macaroni And eese' then TalkString : 'fa72c':
= Ch end;

SUBSTTfiTTE SHEET (RULE 26) if Currentltem = 'FavoriteActivity' then begin if Value = 'Drawing' then TalkString :_ 'fa73a';
if Value = 'Playing Computer Games' then TalkString : 'fa73b';
if Value = 'Play Make Believe' then TalkString : 'fa73c';
end;
if CurrentItem = 'FavoriteRnimal' then begin if Value = 'Horse' then TalkString : 'fa74a';
if Value = 'Dog' then TalkString : 'fa74b';
if Value = 'Cat' then Talkstring : 'fa74c';
end;
if TalkString <> " then MainForm.BackGroundTalking(MainForm.AboutYouPath +TalkString+'.wav','S');
end;
procedure TRegistrationForm.RegistrationBackImageMouseMove(Sender: TObject:
Shift: TShiftState; X, Y: Integer);
begin //
GoOutArrowImage.Visible := True;
end;
procedure TRegistrationForm.GoOutArrowImageClick(Sender: TObject);
begin if MainForm.ThreadInProgress then exit;
GoBackToMenu;
end;
procedure TRegistrationForm.LoadFromDataBase;
begin ChildName .= MainForm.PDBEngine.ChildName:

SecretName .= MainForm.PDHEngine.SecretName;

Gender .= MainForm.PDHEngine.ChildSex;

DateOfBirth .= MainForm.PDBEngine.BirthDay;

EyeColor := MainForm.PDBEngine.ChildEyeColor;

HairColor .= MainForm.PDBEngine.ChildHairColor;

BedTimeHour .= MainForm.PDBEngine.BedTimeHour;

FavoriteColor .= MainForm.PDBEngine.FavoriteColor;

FavoriteFood .= MainForm.PDBEngine.FavoriteFood;

SUBSTITUTE SHEET (RULE Z6) FavoriteActivity .= MainForm.PDBEngine.FavoriteActivity;
FavoriteAnimal .= MainForm.PDBEngine.FavoriteAnimal:
end:
procedure TRegistrationForm.SaveToDataBase;
begin MainForm.PDBEngine.ChildName . ChildName:

MainForm.PDBEngine.SecretName . SecretName;

MainForm.PDBEngine.ChildSex . Gender:

MainForm.PDBEngine.BirthDay . DateOfBirth;

MainForm.PDBEngine.ChildEyeColor EyeColor;
.

MainForm.PDBEngine.ChildIiairColorHairColor;
.

MainForm.PDBEngine.HedTimeHour BedTimeHour;
.

MainForm.PDBEngine.FavoriteColor FavoriteColor;
.

MainForm.PDBEngine.FavoriteFood FavoriteFood:
.

MainForm.PDBEngine.FavoriteActivityFavoriteActivity;
.

MainForm.PDBEngine.FavoriteAnimal FavoriteAnimal;
.

MainForm.PDBEngine.UpDateCurrentChild;

end:

procedure TRegistrationForm.ComboBox1Change(Sender: TObject);
begin
  Calendar1.Month := ComboBox1.ItemIndex + 1;
end;
procedure TRegistrationForm.SpinEdit1Change(Sender: TObject);
begin
  Calendar1.Year := SpinEdit1.Value;
end;
procedure TRegistrationForm.Calendar1Change(Sender: TObject);
var
  space1 : string;
  space2 : string;
begin
  if Calendar1.Month < 10 then space1 := '0' else space1 := '';
  if Calendar1.Day < 10 then space2 := '0' else space2 := '';
  DateOfBirth := space1 + IntToStr(Calendar1.Month) + '/' + space2 +
    IntToStr(Calendar1.Day) + '/' + IntToStr(Calendar1.Year);
end;
begin SUBSTITUTE SHEET (RULE 26) if MainForm.ThreadInProgress then exit;
BirthDavPanel.Visible := False;
ShowVIfSelected;
end;
procedure TRegistrationForm.RegistrationImageClick(Sender: TObject);
begin if MainForm.ThreadInProgress then exit;
BirthDayPanel.Visible := False;
FavoritePanel.Visible := False:
BedTimeHourPanel.Visible := False;
SteemAnimate.Visible := True:
ShowVIfSelected:
end:
procedure TRegistrationForm.BedTimeHourImageClick(Sender: TObjectl:
begin if MainForm.ThreadInProgress then exit;
BedTimeHourPanel.Visible := False:
ShowVIfSelected:
end:
procedure TRegistrationForm.ComboBox2Change(Sender: TObject):
begin BedTimeHour := ComboBox2.Text:
end;
procedure TRegistrationForm.BirthDayOKImageClick(Sender: TObject):
begin.
//
if MainForm.ThreadInProgress then exit;
BirthDayPanel.Visible := False;
ShowVIfSelected;
end:
procedure TRegistrationForm.BedTimeHourOKImageClick(Sender: TObject);
begin //
if MainForm.ThreadInProgress then exit;
BedTimeHourPanel.Visible := False;
ShowVIfSelected;
end:

SUBSTITUTE SHEET (RULE 26) procedure TRegistrationForm.ShowVIfSelected;
begin if Gender - " then vGenderImage.visible . = False else.VGenderImage.visible .= True;

if DateOfBirth = " then VBirthdayImage.visible . = False else VBirthdayImage.visible .= True;

if EyeColor - " then VEyeColorImage.visible . = False else VEyeColorlmage.visible .= True;

if HairColor - " then VHairColorImage.visible . = False else VHairColorImage.visible .= True;

if BedTimeHour = " then VBedTimeHourImage.visible = False .

else VBedTimeHourImage.visible .= True;

if FavoriteColor - " then VFavoriteColorImage.visible= False .

else VFavoriteColorImage.visible .= True:

if FavoriteFood = " then VFavoriteFoodImage.visible= False .

else VFavoriteFoodlmage.visible .= True;

if FavoriteActivity = " then VFavoriteActivityImage.visible= False :

else VFavoriteActivityImage.visible := True;

if FavoriteAnimal - " then VFavoriteAnimalImage..visible= False .

else VFavoriteAnimalImage.visible .= True;

end:
procedure TRegistrationForm.EscapelClick(Sender: TObject);
begin //
GoBackToMenu;
end:
end.

SUBSTTTUTE SHEET (RULE 26) *rB

sasxsss~ sxs~as~ xsesaxaaass~ sssssssssxcsssscos~xssssxsssssssss Copyright (c) 1995-1998 Creator Ltd. A11 Rights Reserved caaxsxsxxx-sxxxxx..-.xoccxa- eso~=xx~-- -====xoxs=xxxxxoxosxxxxcxacas Description : This is the SingAlong unit.
unit SingAlong;
interface uses Windows, Messages, SysUtils, Classes, Graphics, Controls, Forms, Dialogs, ExtCtrls, Menus:
type TSingAlongForm = class(TForm) Imagel: TImage:
StoryTellerImage: TImaget Peaslmage: TImage;
HeadImage: TImage:
MainMenul: TMainMenu:
testl: TMenuItem:
spacel: TMenuItem:
Timerl: TTimer:
procedure spacelClick(Sender: TObject):
procedure TimerlTimer(Sender: TObject);
procedure FormCreate(Sender: TObject)~
private ( Private declarations ) public (. Public declarations ) Song : Integer:
procedure PlaySongs:
end:
var SingAlongForm: TSingAlongForm:
implementation uses Menu, Main, creator, Pestosong:
($R '.DFM) SUBSTITUTE SHEET (RULE 26) procedure TSingAlongForm.spacelClick(Sender: TObject);
begin // stop playing and go back to menu Timerl.Enabled : False;
Spacel.Enabled := False;
Hide;
MenuForm.Show;
end;
procedure TSingAlongForm.TimerlTimerlSender: TObject);
begin Timerl.Enabled := False;
Hide;
CreatorForm.Show;
CreatorForm.PlayMovie;
end:
procedure TSingAlongForm.FormCreate(Sender: TObject);
begin Timerl.Enabled :- False;
with StoryTellerImage do begin Left . 125;
Top .= 80;
Width .= 453;
Height :- 453;
end;
with PeasImage do begin Left . 176;

Top . 176;

Width 186;
.

Height = 236;
:

end;

with HeadImage do begin Left . 152;

Top . 129;

Width = 258;
.

Height 346:
:

end;

StoryTellerImage.Visible := True;

SUBSTITUTE SHEET (RULE 26) Peaslmage.Visible .= False;
HeadImage.Visible .- False;
end;
procedure TSingAlongForm.PlaySongs;
begin // 85 = Broadcast with PestoSongForm do begin Spacel.Enabled := True;
Timerl.Enabled := True;
MediaPlayerl.Open:
end;
with CreatorForm do begin Spacel.Enabled := True;
Timerl.Enabled := True:
MediaPlayerl.Open;
end;
Timerl.Interval := 3000;
Timerl.Enabled := True:
Song := 1:
StoryTellerImage.Visible := True;
PeasImage.Visible .= False;
HeadImage.Visible .= False;
//MainForm.TalkInBackGround (85,MainForm.AudioPath +
'StoryTeller.wav', ");
end;
end.

SUBSTITUTE SHEET (RULE Z6) _sasmsas~mssssss~~=ssoosssvss- sso-sceasoxexso~ s=sssssssssssssss Copyright fc) 1995-1998 Creator Ltd. A11 Rights Reserved sssas=ossvscsxscss~ =c=ss~ ossosssss=sssssssssssossasssas==assssa Description : This is the Status unit.
unit Status;
interface uses Windows, Messages, SysUtils, Classes, Graphics, Controls, Forms, Dialogs, ComCtrls, Buttons, StdCtrls,jpeg, Gauges, ExtCtrls, Menus;
type TStatusForm = class(TForm) Statuslmage: TImage;
GotoMenuImage: TImage;
MinimizeImage: TImage:
StatusGauge: TGauge;
Labell: TLabel:
MainMenul: TMainMenu;
testl: TMenuItem:
Escapel: TMenuItem:
StatusAnimate: TAnimate;
SpeechLabel: TLabel:
StandByLabel: TLabel;
TalkErrorLabel: TLabel;
procedure FormCloselSender: TObject; var Action: TCloseAction);
procedure SpeedButtonlClick(Sender: TObject);
procedure FormCreate(Sender: TObject);
procedure SpeedButton2Click(Sender: TObject);
procedure GotoMenuImageClick(Sender: TObject);
procedure MinimizeImageClick(Sender: Tobject);
procedure EscapelClick(Sender: TObject):
procedure FormShow(Sender: TObject);
procedure FormHide(Sender: TObject);
private { Private declarations ) public [ Public declarations ) end;

SUBSTITUTE SHEET (RULE 26) var StatusForm: TStatusForm:
implementation uses Menu, Main, ToySimulation, MotionSimulation, PanelControls;
($R ;.DFM) procedure TStatusForm.FormClose(Sender: TObject; var Action: TCloseAction):
begin StatusAnimate.Active .= False:
MenuForm.Show:
if MenuForm.Threadl <> nil then begin MenuForm.Threadl.Terminate:
end:
end:
procedure TStatusForm.SpeedButtonlClick(Sender: TObject):
begin SimulationForm.Close~
MotionSimulationForm.Close;
close:
MenuForm.Show;
end:
procedure TStatusForm.FormCreate(Sender: TObject)~
begin //Icon.LoadFromFile(MainForm.GraphicsPath+'PestoIcon.ico'):
StatusAnimate.FileName := MainForm.GraphicsPath+'top.AVI':
StatusAnimate.Active .= Truep GotoMenuImage.Cursor : 5:
MinimizeImage.Cursor := 5;
end:
procedure TStatusForm.SpeedButton2Click(Sender: TObject);
begin Application. Minimize:
end;
procedure TStatusForm.GotoMenuImageClick(Sender: TObjectl:
begin SUBSTITUTE SHEET (RULE 26) SimulationFOrm.Close;
MotionSimulationForm.Close;
close;
MenuForm.Show;
end:
procedure TStatusForm.MinimizeImageClick(Sender: TObject);
begin Application. Minimize;
end:
procedure TStatusForm.EscapelClick(Sender: TObject);
begin SimulationForm.Close:
MotionSimulationForm.Close;
close;
MenuForm.Show;
end:
procedure TStatusForm.FormShow(Sender: TObject);
begin PanelControlForm.Show;
end:
procedure TStatusForm.FormHide(Sender: TObjectl;
begin PanelControlForm.Hide;
end;
end.

SUBSTITUTE SHEET (RULE 26) _~______-________________-__-_=_________________________=___--___ Copyright (c) 1995-1998 Creator Ltd. A11 Rights Reserved _~~________~_________________=___=--==m~ __________-____-_-____~
Description : This is the Toy unit.
unit Toy:
{This Unit contained several methods, converting from D1I/Ocx to Simple methods) interface uses Windows, Messages, SysUtils, Classes, Graphics, Controls, Forms, Dialogs:
type
  TToy = class(TComponent)
  private
    { Private declarations }
    FToyNumber : Integer;
  public
    { Public declarations }
    function Talk (TalkFiles : string; Motion : string) : Integer;
    function TalkAndListen (TalkFiles : string; TalkMotion : string;
                            ListenTime : Real; ListenMotion : string) : Integer;
    function Wait (ListenTime : Real; Motion : string) : Integer;
    function Listen (Map : string; DelayTime : Real;
                     Device : string; Motion : string) : Integer;
    function TurnOn : Boolean;
    function TurnOff : Boolean;
    function CheckToySystem : Integer;
    function ListenConv (ListenMotion : string) : Integer;
    function TalkConv (ListenMotion : string) : Integer;
    function RecordWave (WaveFile : string; DelayTime : Real;
                         Motion : string) : Integer;
    function ListenActive (Map : string; DelayTime : Real;
                           Device : string; Motion : string) : Integer;
    function TalkAll (TalkFiles : string; Motion : string) : Integer;
    function ToyTalkIn (ToyNumberValue : Integer; TalkFiles : string;
                        LTime : Integer; Motion : string) : Integer;
    function ToyListenIn (DTime : Integer; Motion : string) : Integer;
  published
    property ToyNumber : Integer read FToyNumber write FToyNumber;
  end;
const
  SBDevice = 0;
  AllToys  = 85;

implementation

uses Main, ToySimulation, MotionSimulation, Status;
//***************************************************************************
// Example : Talk('a.wav + b.wav + c.wav','Motor and Listen', 1.5);
// Listen Time = 1.5 sec
// sensors : 1-Nose 2-Hand 3-Foot ; 0-none
function TToy.Talk (TalkFiles : string; Motion : string) : Integer;
var
  LTime        : Integer;
  SensorFlag   : Boolean;
  SensorNumber : Integer;
  i            : Integer;
begin
  LTime := 12000;
  StatusForm.TalkErrorLabel.Visible := False;
  if ToyNumber < 0 then
  begin
    SimulationForm.ToyTalk (TalkFiles, Motion, 0{LTime});//fix
    While (SimulationForm.ToyTalkStatus = True) or (SimulationForm.ToyListenStatus = True) do
      sleep(500);
    //Sleep(1000);//Limitation of the Equipment //fix
    Result := SimulationForm.KeyPress;
  end
  //================================================================
  else
  begin // ToyNumber > 0
    Result := ToyTalkIn(ToyNumber,TalkFiles,LTime,Motion);
    if Result < 0 then
    begin
      sleep(250);
      Result := ToyTalkIn(ToyNumber,TalkFiles,LTime,Motion);
    end;
    if Result < 0 then
    begin
      StatusForm.TalkErrorLabel.Visible := True;
      sleep(1000);
      Result := ToyTalkIn(ToyNumber,TalkFiles,LTime,Motion);
    end;
    if Result < 0 then
    begin
      sleep(1000);
      Result := ToyTalkIn(ToyNumber,TalkFiles,LTime,Motion);
    end;
    if Result = -2 then
    begin
      sleep(20000);
      Result := ToyTalkIn(ToyNumber,TalkFiles,LTime,Motion);
    end;
    if Result = -2 then
    begin
      sleep(20000);
      Result := ToyTalkIn(ToyNumber,TalkFiles,LTime,Motion);
    end;
  end;
end;
//*******************************************************************
function TToy.TalkAll (TalkFiles : string; Motion : string) : Integer;
var
  LTime        : Integer;
  SensorFlag   : Boolean;
  SensorNumber : Integer;
  i            : Integer;
begin
  LTime := 50;
  if ToyNumber < 0 then
  begin
    SimulationForm.ToyTalk (TalkFiles, Motion, LTime);
    While (SimulationForm.ToyTalkStatus = True) or (SimulationForm.ToyListenStatus = True) do
      sleep(500);
    Sleep(1000);//Limitation of the Equipment
    Result := SimulationForm.KeyPress;
  end
  //================================================================
  else
  begin // ToyNumber > 0
    Result := ToyTalkIn(AllToys,TalkFiles,LTime,Motion);
    if Result < 0 then
      Result := ToyTalkIn(AllToys,TalkFiles,LTime,Motion);
  end;
end;
//***************************************************************
function TToy.Wait (ListenTime : Real; Motion : string) : Integer;
var
  LTime : Integer;
begin
  //==============================================================
  LTime := Trunc(1000*ListenTime);
  SimulationForm.ToyListen (LTime,Motion);
  While (SimulationForm.ToyTalkStatus = True) or (SimulationForm.ToyListenStatus = True) do
    sleep(200);
  Sleep(1000);//Limitation of the Equipment
  Result := SimulationForm.KeyPress;
  //==============================================================
  // Result := MainForm.XMidi1.ToyListen2(ToyNumber,Trunc(LTime*10),ListenConv(Motion));
  Result := MainForm.SR1.WaitForEvent('',Trunc(LTime*10),2); //Sensors
  //==============================================================
end;
//************************************************************************
//sensors : 1-Nose 2-Hand 3-Foot ; 0-none
function TToy.Listen (Map : string; DelayTime : Real;
                      Device : string; Motion : string) : Integer;
var
  DTime  : Integer;
  Flags  : Integer;
  SRisOn : Boolean;
begin
  sleep(100);
  DTime := Trunc(DelayTime*10);
  Flags := 0;
  if Pos('SR',Device) > 0 then
  begin
    Flags := Flags + 1;
    SRisOn := True;
  end
  else
    SRisOn := False;
  if Pos('Sensor',Device) > 0 then
    Flags := Flags + 2;
  StatusForm.SpeechLabel.Visible := True;
  Result := MainForm.SR1.WaitForEvent(Map,DTime,Flags);
  StatusForm.SpeechLabel.Visible := False;
  if (Result > 0) and (MainForm.SR1.GetPhraseConfLevel(1) < 5000) and (SRisOn = True) then
    Result := 0;
end;
//***************************************************************
//sensors : 1-Nose 2-Hand 3-Foot ; 0-none
function TToy.ListenActive (Map : string; DelayTime : Real;
                            Device : string; Motion : string) : Integer;
var
  DTime  : Integer;
  Flags  : Integer;
  SRisOn : Boolean;
begin
  Result := 1;
  DTime := Trunc(DelayTime*10);
  if ToyNumber > -1 then
  begin
    Result := ToyListenIn(DTime,Motion);
    if Result <> 1 then
      Result := ToyListenIn(DTime,Motion);
    if Result <> 1 then
    begin
      Result := ToyListenIn(DTime,Motion);
    end;
  end
  else
    Result := 1;
  Flags := 0;
  if Pos('SR',Device) > 0 then
  begin
    Flags := Flags + 1;
    SRisOn := True;
  end
  else
    SRisOn := False;
  if Pos('Sensor',Device) > 0 then
    Flags := Flags + 2;
  sleep(100);
  if Result = 1 then
  begin
    StatusForm.SpeechLabel.Visible := True;
    Result := MainForm.SR1.WaitForEvent(Map,DTime,Flags);
    StatusForm.SpeechLabel.Visible := False;
    if (Result > 0) and (MainForm.SR1.GetPhraseConfLevel(1) < 5000) and (SRisOn = True) then
      Result := 0;
  end
  else if Result = -2 then
    Result := -9999;
end;
//**************************************************************************
function TToy.TalkAndListen (TalkFiles : string; TalkMotion : string;
                             ListenTime : Real; ListenMotion : string) : Integer;
var
  Flags : Integer;
  LTime : Integer;
begin
  Flags := 0;
  LTime := 0;
  SimulationForm.ToyTalk (TalkFiles, TalkMotion, LTime);
  While (SimulationForm.ToyTalkStatus = True) or (SimulationForm.ToyListenStatus = True) do
    sleep(500);
  Sleep(1000);//Limitation of the Equipment
  Result := SimulationForm.KeyPress;
  //==============================================================
  //Result := MainForm.XMidi1.ToyTalk2(ToyNumber,TalkFiles,
  //                                   SBDevice,0,0,0);
  //==============================================================
end;
//***************************************************************
function TToy.TurnOn : Boolean;
var
  ResultSR    : Integer;
  ResultXMidi : Integer;
begin
  // open SR
  {with MainForm do
  begin
    SR1.DataBase := 'ASR1500 - Telephone';
    SR1.User     := 'Creator';
    SR1.Context  := 'Demo';
    SR1.OpenAttr := 0;
    ResultSR     := SR1.Init;
  end;}
  ResultSR := MainForm.SR1.Init;
  //open MIDI
  ResultXMidi := MainForm.XMidi1.StartHUB;
  if (MainForm.ToyMachine = MainForm.ToyNameIsBear) then
    MainForm.XMidi1.SetMotor(ToyNumber,0,1,200);
  Result := (ResultSR = 0) AND (ResultXMidi = 0);
end;
function TToy.TurnOff : Boolean;
var
  ResultSR    : Integer;
  ResultXMidi : Integer;
begin
  //close SR & MIDI
  ResultSR    := MainForm.SR1.Close;
  ResultXMidi := MainForm.XMidi1.StopHUB;
  Result := (ResultSR = 0) AND (ResultXMidi = 0);
end;

//  0 = OK , -1 = LowBattery , -2 = No Communication
// -3 = LowBattery & No Communication
function TToy.CheckToySystem : Integer;
begin
  Result := 0;
end;

function TToy.ListenConv(ListenMotion : string): Integer;
begin
  Result := 0;
  if ListenMotion = 'W' then Result := 0;
  if ListenMotion = 'X' then Result := 1;
  if ListenMotion = 'Y' then Result := 2;
  if ListenMotion = 'Z' then Result := 3;
end;
function TToy.TalkConv(ListenMotion : string): Integer;
begin
  Result := 0;
  if ListenMotion = 'S'  then Result := 0;
  if ListenMotion = 'EC' then Result := 1;
  if ListenMotion = 'E'  then Result := 2;
  if ListenMotion = 'EL' then Result := 3;
  if ListenMotion = 'S#' then Result := 4;
  if ListenMotion = 'X'  then Result := 5;
  if ListenMotion = 'X'  then Result := 6;
  if ListenMotion = 'X'  then Result := 7;
  if ListenMotion = 'X'  then Result := 8;
  if ListenMotion = 'X'  then Result := 9;
  if ListenMotion = 'X'  then Result := 10;
  if ListenMotion = 'X'  then Result := 11;
  if ListenMotion = 'X'  then Result := 12;
  if ListenMotion = 'X'  then Result := 13;
  if ListenMotion = 'X'  then Result := 14;
  if ListenMotion = 'X'  then Result := 15;
  if ListenMotion = 'X'  then Result := 16;
  if ListenMotion = 'X'  then Result := 17;
  if ListenMotion = 'X'  then Result := 18;
  if ListenMotion = 'X'  then Result := 19;
  if ListenMotion = 'X'  then Result := 20;
  if ListenMotion = 'X'  then Result := 21;
  if ListenMotion = 'X'  then Result := 22;
  if ListenMotion = 'X'  then Result := 23;
  if ListenMotion = 'X'  then Result := 24;
  if ListenMotion = 'X'  then Result := 25;
  if ListenMotion = 'X'  then Result := 26;
  if ListenMotion = 'X'  then Result := 27;
  if ListenMotion = 'X'  then Result := 28;
  if ListenMotion = 'X'  then Result := 29;
  if ListenMotion = 'X'  then Result := 30;
end;
function TToy.RecordWave (WaveFile : string; DelayTime : Real;
                          Motion : string) : Integer;
var
  DTime : Integer;
begin
  Wait(DelayTime,Motion);
  DTime := Trunc(DelayTime*10);
  Result := MainForm.XMidi1.ToyRecord(WaveFile,DTime);
end;

function TToy.ToyTalkIn (ToyNumberValue : Integer; TalkFiles : string;
                         LTime : Integer; Motion : string) : Integer;
begin
  sleep(100);
  if (MainForm.ToyMachine = 'StoryTeller') and (ToyNumber <> 85) then
    Result := MainForm.XMidi1.ToyTalk2(ToyNumber,TalkFiles,
                                       SBDevice,LTime,TalkConv(Motion),0);
  if (MainForm.ToyMachine = 'TeddyBear') or (ToyNumber = 85) then
    Result := MainForm.XMidi1.NewToyTalk(ToyNumber,TalkFiles,
                                         SBDevice,9,LTime);
end;

function TToy.ToyListenIn (DTime : Integer; Motion : string) : Integer;
begin
  sleep(100);
  if MainForm.ToyMachine = 'StoryTeller' then
    Result := MainForm.XMidi1.ToyListen2(ToyNumber,DTime,ListenConv(Motion));
  if MainForm.ToyMachine = 'TeddyBear' then
    Result := MainForm.XMidi1.ToyListenTime(ToyNumber,DTime);
end;

end.
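The TToy component above wraps the DLL/OCX toy interface in simple calls. The following sketch shows how such a wrapper might be driven; the toy number, wave-file names, and vocabulary map are illustrative assumptions, not part of the listing.

```pascal
// Hypothetical driver for the TToy wrapper above; the toy number,
// wave-file names and the 'yes/no' vocabulary map are assumptions.
procedure DemoInteraction(Toy: TToy);
var
  Answer: Integer;
begin
  if not Toy.TurnOn then Exit;          // open speech recognition and MIDI hub
  Toy.ToyNumber := 1;                   // assumed toy address
  Toy.Talk('hello.wav', 'S');           // play a prompt with motion code 'S'
  // wait up to 7 s for 'yes' or 'no', speech recognition only
  Answer := Toy.Listen('yes/1,no/2', 7, 'SR', 'W');
  if Answer = 1 then
    Toy.Talk('great.wav', 'S#');
  Toy.TurnOff;                          // release the devices
end;
```

Note the convention used throughout the listing: a `Map` string such as `'yes/1,no/2'` pairs each expected phrase with the integer the listen call returns, and the `Device` string selects speech recognition (`'SR'`), the body sensors (`'Sensor'`), or both.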

{==================================================================
  Copyright (c) 1995-1998 Creator Ltd.  All Rights Reserved
  Description : This is the Intro unit.
 ==================================================================}

unit Intro;

interface

uses Status, Main, Toy, PanelControls, Windows, Messages, SysUtils,
     Classes, Graphics, Controls, Forms, Dialogs, Registration;
type
  TIntro = class(TThread)
  private
    { Private declarations }
    GameStatus               : string;
    ChildName                : string;
    SecretName               : string;
    DateOfBirth              : string;
    EyeColor                 : string;
    HairColor                : string;
    BedTimeHour              : string;
    FavoriteColor            : string;
    FavoriteFood             : string;
    FavoriteActivity         : string;
    FavoriteAnimal           : string;
    ChildSex                 : string;
    IntroNextSection         : Integer;
    PlayNextSection          : Integer;
    PrincessNextSection      : Integer;
    TheStoryMenuNextSection  : Integer;
    RunStoryMenuNextSection  : Integer;
    BedTimeRitualNextSection : Integer;
    GrouchyNextSection       : Integer;
    BunnyNextSection         : Integer;
    PresentationNextSection  : Integer;
    VisitSongMenu            : Integer;
    VisitGameMenu            : Integer;
    VisitStoryMenu           : Integer;
    VisitBunnyShort          : Integer;
    VisitBunnyLong           : Integer;
    VisitPrincess            : Integer;
    BunnyFavoriteFood        : string;
  protected
    procedure Execute; override;
    procedure LoadDataFromDatabase;
    procedure SaveDataFromDatabase;
    procedure UpdateIntroBar;
    procedure UpdatePlayBar;
    procedure UpdatePrincessBar;
    procedure UpdateTheStoryMenuBar;
    procedure UpdateRunStoryMenuBar;
    procedure UpdateBedTimeRitualBar;
    procedure UpdateGrouchyBar;
    procedure UpdateBunnyBar;
    procedure UpdatePresentationBar;
    procedure ApplicationMinimize;
    procedure ClearStatusControl;
  public
    constructor Create (Status : string);
  end;

//sensors : 1-Nose 2-Hand 3-Foot
const
  NoseSensor = 2;
  HandSensor = 1;
  FootSensor = 3;
implementation

{ Important: Methods and properties of objects in VCL can only be
  used in a method called using Synchronize, for example,

      Synchronize(UpdateCaption);

  and UpdateCaption could look like,

      procedure TIntro.UpdateCaption;
      begin
        Form1.Caption := 'Updated in a thread';
      end; }

{ TIntro }

constructor TIntro.Create (Status : string);
begin
  inherited Create(False);
  FreeOnTerminate := True;
  GameStatus := Status;
end;
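The constructor takes a status string that selects which script the thread runs. A minimal sketch of how a caller might start it follows; the `Thread1` field name is taken from the Status unit earlier in this listing, but its exact declaration is an assumption.

```pascal
// Hypothetical launcher for the TIntro script thread.  Because
// FreeOnTerminate is set in Create, the thread frees itself when
// Execute returns; the caller only keeps the reference so it can
// call Terminate later (as TStatusForm.FormClose does).
procedure StartIntroScript;
begin
  MenuForm.Thread1 := TIntro.Create('Intro'); // or 'Play' to skip the tutorial
end;
```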
procedure TIntro.LoadDataFromDatabase;
begin
  // Get Current Data From database
  ChildName        := Trim(MainForm.PDBEngine.ChildName);
  SecretName       := Trim(MainForm.PDBEngine.SecretName);
  ChildSex         := Trim(MainForm.PDBEngine.ChildSex);
  DateOfBirth      := Trim(MainForm.PDBEngine.BirthDay);
  EyeColor         := Trim(MainForm.PDBEngine.ChildEyeColor);
  HairColor        := Trim(MainForm.PDBEngine.ChildHairColor);
  BedTimeHour      := Trim(MainForm.PDBEngine.BedTimeHour);
  FavoriteColor    := Trim(MainForm.PDBEngine.FavoriteColor);
  FavoriteFood     := Trim(MainForm.PDBEngine.FavoriteFood);
  FavoriteActivity := Trim(MainForm.PDBEngine.FavoriteActivity);
  FavoriteAnimal   := Trim(MainForm.PDBEngine.FavoriteAnimal);
  VisitSongMenu    := MainForm.PDBEngine.VisitSongMenu;
  VisitGameMenu    := MainForm.PDBEngine.VisitGameMenu;
  VisitStoryMenu   := MainForm.PDBEngine.VisitStoryMenu;
  VisitBunnyShort  := MainForm.PDBEngine.VisitBunnyShort;
  VisitBunnyLong   := MainForm.PDBEngine.VisitBunnyLong;
  VisitPrincess    := MainForm.PDBEngine.VisitPrincess;
  BunnyFavoriteFood := MainForm.PDBEngine.BunnyFavoriteFood;
end;

procedure TIntro.SaveDataFromDatabase;
begin
  // Save Current Data To database
  MainForm.PDBEngine.VisitSongMenu     := VisitSongMenu;
  MainForm.PDBEngine.VisitGameMenu     := VisitGameMenu;
  MainForm.PDBEngine.VisitStoryMenu    := VisitStoryMenu;
  MainForm.PDBEngine.VisitBunnyShort   := VisitBunnyShort;
  MainForm.PDBEngine.VisitBunnyLong    := VisitBunnyLong;
  MainForm.PDBEngine.VisitPrincess     := VisitPrincess;
  MainForm.PDBEngine.BunnyFavoriteFood := BunnyFavoriteFood;
end;

SUBSTITUTE SKEET (RULE Z6) procedure TIntro.Execute;

var ParamTalk . Integer;

ParamListen . Integer;

Toy , TToy;

LastPresentation . Integer;

Path . string;

IntroPath . string;

BedTimePath . string;

BunnyPath . string;

GrouchyPath . string;

PeasPath . string:

PlayPath . string;

RunStoryPath . string;

SillyPath . string;

SongMenuPath . string;

SongsPath . string;

StoryMenuPath. string;

PresentationPath . string:

GamePrincessstatus : string;

WaveSection . string;

Sensorl . Integer:

Sensor2 . Integer;

Sensor3 . Integer;

GetSensor . Integer;

LoopCount . Integer;

DTime . Integer;

BeginningPlay. Boolean;

begin ( Place threadcode here ToY .= MainForm.Toy;

Path .= MainForm.AudioPath;

IntroPath := Path+'Intro\';

BedTimePath .= Path+'BedTime\';

BunnyPath := Path+'Bunny\';

GrouchyPath .= Path+'Grouchy\';

PeasPath .= Path+'Peas\';

PlayPath .= Path+'Play\';

RunStoryPath = Path+'RunStory\':
.

SillyPath = Path+'Silly\';
.

SongMenuPath = Path+'SongMenu\':
.

SongsPath = Path+'Songs\';
:

StoryMenuPathPath+'StoryMenu\';
:

SUBSTITUTE SHEET (RULE 26) PresentationPath : Path+'Presentation\';
LastPresentation := 0;
ParamTalk .= 0;
LoopCount .= 0;
Sensorl := 0;
Sensor2 .= 0;
Sensor3 .= 0;
DTime .= 5;
BeginningPlay .= True:
Synchronize(LoadDataFromDatabase);
if GameStatus = 'Intro' then //__~= How To Play =---____ begin IntroNextSection :_ //UnPerform Intro Script -l;

PlayNextSection :_ //UnPerform Play Script -1;

PrincessNextSection :_ //UnPerform Princess Script -1;

TheStoryMenuNextSection. -1: //UnPerform TheStoryMenu Script RunStoryMenuNextSection. -1; //UnPerform RunStoryMenu Script BedTimeRitualNextSection: -1: //UnPerform BedTimeRitual Script GrouchyNextSection . -l: //UnPerform Grouchy Script BunnyNextSection :_ //UnPerform Bunny Script -1;

PresentationNextSection:_ //UnPerform Presentation -1; Demo Ver 1.0 end else begin //=a==== Play =__~___.____ IntroNextSection :_ //UnPerformIntro Script -1;

PlayNextSection . //UnPerformPlay Script -1;

PrincessNextSection . //UnPerformPrincess Script -1:

TheStoryMenuNextSection. //UnPerformTheStoryMenu Script -1;

RunStoryMenuNextsection. //UnPerformRunStoryMenu Script -1;

BedTimeRitualNextSection: //UnPerformBedTimeRitual Script -1;

GrouchyNextsection :_ //UnPerformGrouchy Script -l:

BunnyNextSection :_ //UnPerformBunny Script -1;

PresentationNextSection.= //Perform 1: Presentation Demo Ver 1.0 end;

StatusForm.StatusGauge.Progress := 0;
while not Terminated do begin // Checking (if StatusForm.WindowState = wsMinimized then SUBSTITUTE SHEET (RULE 26) begin //Synchronize(ApplicationMinimize):
end; ) // =g=====_----__= Presentation =___________-______=====x==x====-___ // ----------- write here all sessions ------------------------// =xa:a==e=s=a====:xss=======aaxx=x=sx========c=ms==saac=xxacsss=====
case PresentationNextSection of 1 : begin PresentationNextSection := 5;
end;
: begin ParamTalk := Toy.Talk(PresentationPath+'op002.wav','S'):
Synchronize(UpdatePresentationBarl:
PresentationNextSection := 6;
end:
6 : begin Pa~amList~n : Toy.ListenActive (" ,180,'Sensor','W');
if (ParamI.isten = FootSensor) or(ParamListen = NoseSensor) then PresentationNextSection := 10 else PresentationNextSection := 5:
//PresentationNextSection : 10;//?? Delete this line end;
IO : begin BeginningPlay := False:
if Time< StrToTime('12:00:00') then PresentationNextSection .= 15 .= 25 := 20;
else if Time> StrToTime('16:00:00'1 then PresentationNextSection else PresentationNextSection end:
: begin ParamTalk ~c= Toy.Talk(PresentationPath+'op015m.wav','S'):
Synchronize(UpdatePresentationBar);
PresentationNextSection := 35;
end;

SUBSTITUTE SHEET (RULE 26) 20 : begin ParamTalk := Toy.Talk(PresentationPath+'op020m.wav','S'):
Synchronize(UpdatePresentationHar);
PresentationNextSection := 35:
end;
25 : begin ParamTalk := Toy.Talk(PresentationPath+'op025m.wav','S')~
Synchronize(UpdatePresentatian8ar);
PresentationNextSection := 35;
end:
35 : begin SecretName : 'Rainbow':
ParamListen .= Toy. Listen ('rainbow/2,bubble gum/3', 7,'SR','W'):
case ParamListen of 2: begin PresentationNextSection := 45;
SecretName : 'Rainbow';
end:
3: begin PresentationNextSection := 50:
SecretName : 'BubbleGum':
end;
else PresentationNextSection := 36:
end;
Synchronize(UpdatePresentationBar)t end:
36 : begin ParamTalk := Toy.Talk(PresentationPath+'op036m.wav','S#'1;
Synchronize(UpdatePresentationBar):
PresentationNextsection := 55;
end:
45 : begin ParamTalk := Toy.Talk(PresentationPath+'op045m.wav','S#'):
Synchronize(UpdatePresentationBar):
PresentationNextsection := 55:
end:

SUBSTITUTE SHEET (RULE 26) 50 : begin ParamTalk := Toy.Talk(PresentationPath+'op050m.wav','S#'):
Synchronize(UpdateeresentationBar):
PresentationNextSection : 55;
end;
55 : begin ParamListen .= Toy. Listen ('story/l,game/2,song/3', 7,'SR','W'):
case ParamListen of 1: PresentationNextSection := 100: //story menu 2: PresentationNextSection := 800: //game menu 3: PresentationNextSection := 300; //song menu else PresentationNextSection := 60;
end:
Synchronize(UgdatePresentationBarl~
end:
60 : begin ParamTalk := Toy.Talk(PresentationPatht'op060.wav','S#'):
Synchronize(UpdatePresentationBar):
PresentationNextSection := 65;
end:
65 : begin ParamListen .= Toy. Listen ('story/l,game/2,song/3', 7,'SR','W'):
case ParamListen of 1: PresentationNextSection100; //story : menu 2: PresentationNextSection800: //game := menu 3: PresentationNextSection300: //song := menu else PresentationNextSection:= 100;

end:

Synchronize(UpdatePresentationBar):
end:
//story menu 100 : begin if VisitStoryMenu = 0 then PresentationNextSection := 125 else begin PresentationNextSection := 235;
if VisitHunnyShort = 0 then SUBSTITUTE SHEET (RULE 26) begin PresentationNextSection : 105;
if SecretName = 'BubbleGum' then PresentationNextSection : 115:
if SecretName = 'Rainbow' then PresentationNextSection := 110;
end;
if (VisitBunnyLong = 0) and (VisitBunnyShort <> 0) then PresentationNextSection := 206;
if (VisitBunnyShort <> 0) and (VisitBunnyLong <> 0) then PresentationNextSection := 235;
end:
VisitStoryMenu := VisitStoryMenu +1;
end;
110 : begin ParamTalk := Toy.Talk(PresentationPath+'stm110.wav','S');
Synchronize(UpdatePresentationBar);
PresentationNextSection := 200:
end;
115 : begin ParamTalk := Toy.Talk(PresentationPath+'stm115.wav','S'):
Synchronize(UpdatePresentationBar):
PresentationNextSection := 200:
end;
125 : begin ParamTalk := Toy.Talk(PresentationPath+'stm125m.wav','S');
Synchronize(UpdatePresentationBar);
PresentationNextSection := 130;
end;
I30 : begin ParamTalk := Toy.Talk(PresentationPath+'stm130m.wav','S');
Synchronize(UpdatePresentationBar);
PresentationNextSection := 135;
end;
135 : begin ParamTalk := Toy.Talk(PresentationPath+'stmi35m.wav','S');
Synchronize(UpdatePresentationBar);
PresentationNextSection := 140;

SUBSTITUTE SHEET (RULE 26) end:
140 : begin ParamTalk := Toy.Talk(PresentationPath+'stm140m.wav','S#'):
Synchronize(UpdatePresentationBar):
PresentationNextSection := 145:
end:
145 : begin ParamListen .= Toy. Listen ('too hot/l,too cold/2,just right/3', DTime,'SR','W'):
case ParamListen of 1: PresentationNextsection150:
:=

2: PresentationNextSection155:
:-3: PresentationNextSection160:
:=

else PresentationNextSection:=
165:

end:
Synchronize(UpdatePresentationBar):
end:
150 : begin ParamTalk := Toy.Talk(PresentationPath+'stm150.wav','S'):
Synchronize(UpdatePresentationBar):
PresentationNextSection := 170:
end:
155 : begin ParamTalk := Toy.Talk(PresentationPath+'stm155.wav','S');
Synchronize(UpdatePresentationBarl:
PresentationNextSection := 170:
end:
160 : begin ParamTalk := Toy.Talk(PresentationPath+'stm160.wav','S'):
Synchronize(UpdatePresentationBar):
PresentationNextSection := 170:
end:
165 : begin ParamTalk := Toy.Talk(PresentationPath+'stm165.wav','S'):
Synchronize(UpdatePresentation8ar):
PresentationNextSection : 170;
end:

SUBSTITUTE SHEET (RULE 26) WO 99!54015 PCT/IL99/00202 170 : begin ParamTalk := Toy.Talk(PresentationPath+'stm170.wav','S#'):
Synchronize(UpdatePresentationBar):
PresentationNextSection : 175:
end:
175 : begin ParamListen .= Toy. Listen ('yes/l,no/2',DTime,'SR','W');
case ParamListen of 1: PresentationNextSecticn :- 195;
2: PresentationNextSection := 225:
else PresentationNextSection := 180:
end:
Synchronize(UpdatePresentationBar);
end:
180 : begin ParamTalk := Toy.Talk(PresentationPath+'stm180.wav','S'):
Synchronize(UpdatePresentationBar);
PresentationNextSection := 185:
end:
185 : begin ParamListen .= Toy. Listen ('yes/l,no/2',DTime,'SR','W');
case ParamListen of 1: PresentationNextSection : 195;
2: PresentationNextsection := 230:
else PresentationNextSection := 230:
end:
Synchronize(UpdatePresentationBarl:
end:
195 : begin ParamTalk := Toy.Talk(PresentationPath+'stm195.wav','S'):
Synchronize(UpdatePresentationBar):
PresentationNextSection := 200:
end;
200 : begin ParamTalk := Toy.Talk(PresentationPath+'stm200.wav','S'):
Synchronize(UpdatePresentationear):
PresentationNextSection := 3000: //bunny short SUBSTITUTE SHEET (RULE 26) end:
205 : begin ParamTalk := Toy.Talk(PresentationPath+'stm205m.wav','S'):
Synchronize(UpdatePresentationBar):
PresentationNextSection := 210:
end:
206 : begin ParamTalk := Toy.Talk(PresentationPath+'stm206m.wav','S#'):
Synchronize(UpdatePresentationBar):
PresentationNextsection := 210:
end:
210 : begin ParamListen .= Toy. Listen ('yes/l,no/2',DTime,'SR','W');
case ParamListen of 1: PresentationNextSection := 2000://bunny long 2: PresentationNextsection := 225:
else PresentationNextSection := 215:
end:
Synchronize(UpdatePresentationBar):
end:
215 : begin ParamTalk := Toy.Talk(PresentationPath+'stm215m.wav','S#'):
Synchronize(UpdatePresentationBar):
PresentationNextSection := 220:
end:
220 : begin ParamListen .= Toy. Listen ('yes/l,no/2',DTime,'SR','W'1:
case ParamListen of 1: PresentationNextSection := 2000: //bunny long 2: PresentationNextSection := 225:
else PresentationNextSection := 225:
end:
Synchronize(UpdatePresentationBar):
end:
225 : begin ParamTalk := Toy.Talk(PresentationPath+'stm225.wav','S'):
Synchronize(UpdatePresentation8ar):

SUBSTITUTE SHEET (RULE 26) PresentationNextSection := 230:
end:
230 : begin ParamTalk := Toy.Talk/PresentationPath+'stm230n.wav','S#'):
Synchronize(UpdatePresentationBar);
PresentationNextSection := 240;
end;
240 : begin ParamListen .= Toy. Listen ('game/l,song/2,storyteller/3',DTime,'SR','W'):
case ParamListen of 1: PresentationNextSection :m 800://game menu 2: PresentationNextSection := 300: //song menu 3: PresentationNextSection := 5000 //theme song else PresentationNextSection := 245;
end:
Synchronize(UpdatePresentationBar):
end;
245 : begin ParamTalk := Toy.Talk(PresentationPath+'stm245n.wav','S#'l:
Synchronize(UpdatePresentationBar):
PresentationNextSection := 250;
end:
250 : begin ParamListen := Toy. Listen ('game/l,song/2,storyteller/3',DTime,'SR','W');
case ParamListen of l: PresentationNextSection := 800: //game menu 2: PresentationNextSection := 300; //song menu 3: PresentationNextSection : 5000://theme song else PresentationNextSection := B00:
end:
Synchronize(UpdatePresentationBar):
ends // song menu 300 : begin Synchronize(UpdatePresentationBar):
if VisitSongMenu = 0 then PresentationNextSection : 320 else SUBSTITUTE SHEET (RULE 26) *rB

begin PresentationNextSection := 305;
if SecretName = 'HubbleGum' then PresentationNextSection .= 315;
.= 310;
if SecretName = 'Rainbow' then PresentationNextSection end;
VisitSongMenu := VisitSongMenu +1:
end:
310 : begin ParamTalk := Toy.Talk(PresentationPath+'sng310.wav','S');
Synchronize(UpdatePresentationBar);
PresentationNextSection := 330;
end:
315 : begin ParamTalk := Toy.Talk(PresentationPath+'sng315.wav','S');
Synchronize(UpdatePresentationBar);
PresentationNextSection := 330:
end:
320 : begin ParamTalk := Toy.TalklPresentationPath+'sng320.wav','S');
Synchronize(UpdatePresentationBar);
PresentationNextSection := 330;
end;
330 : begin ParamTalk _:= Toy.Talk(PresentationPath+'sng prog.wav','S');
Synchronize(UpdatePresentationBar);
PresentationNextSection := 370;
end;
370 : begin ParamTalk := Toy.Talk(PresentationPath+'sng370.wav','S#'1;
Synchronize(UpdatePresentationBar);
PresentationNextSection := 375;
end:
375 : begin ParamListen .= Toy. Listen ('yes/l,no/2',DTime,'SR','W'l;
case ParamListen of SUBSTITUTE SHEET (RULE 26) 1: PresentationNextSection := 380;
2: PresentationNextsection := 395:
else PresentationNextSection := 395;
end;
Synchronize(UpdatePresentationBar);
end;
380 : begin ParamTalk := Toy.Talk(PresentationPath+'sng380m.wav','S');
Synchronize(UpdatePresentationBar);
PresentationNextSection := 390:
end:
390 : begin ParamTalk := Toy.Talk(PresentationPath+'sng390.wav','S');
Synchronize(UpdatePresentationBar);
PresentationNextSection := 395;
ends 395 : begin ParamTalk := Toy.Talk(PresentationPath+'sng395.wav','S#'):
Synchronize(UpdatePresentationBar);
PresentationNextSection := 400;
end;
400 : begin ParamListen .= Toy. Listen ('game/l,story/2,song/3',DTime,'SR','W'):
case ParamListen of 1: PresentationNextSection := 800; //game menu 2: PresentationNextSection :- I00; //story menu 3: PresentationNextSection := 910:
else PresentationNextSection := 410;
end;
Synchronize(UpdatePresentationBar);
end;
410 : begin ParamTalk := Toy.Talk(PresentationPath+'sng410m.wav','S');
Synchronize(UpdatePresentationBar);
PresentationNextSection := 415;
end;

SUBSTITUTE SHEET (RULE 26) 415 : begin ParamTalk := Toy.Talk(PresentationPath+'sng415n.wav','S#'):
Synchronize(UpdatePresentationBar):
PresentationNextSection := 420 end;
420 : begin ParamListen .= Toy. Listen ('game/l,story/2,storyteller/3',DTime,'SR','W');
case ParamListen of 1: PresentationNextSection := 800; //game menu 2: PresentationNextSection := 100; //story menu 3: PresentationNextSection := 5000: //theme song else PresentationNextsection := 425:
end:
Synchronize(UpdatePresentationBar):
end;
425 : begin ParamTalk := Tol~.Talk(PresentationPath+'sng425n.wav','S#'):
Synchronize(UpdatePresentationBar)~
PresentationNextSection := 430:
end:
430 : begin ParamListen .= Toy. Listen ('game/l,story/2,storyteller/3',DTime,'SR','W'):
case ParamListen of 1: PresentationNextSection :- 800: //game menu 2: PresentationNextSection := 100; //story menu 3: PresentationNextSection := 5000: //theme song else PresentationNextSection := 800;
end;
Synchronize(UpdatePresentationBar):
end:
//StandBy 699 : begin ParamListen .= Toy.ListenActive ('continue/l,youre on/2,theme/3',12,'SR','W')i case ParamListen of 1: PresentationNextSection := LastPresentation:
2: PresentationNextSection : 10:
3: PresentationNextSection :_ X00:

SUBSTITUTE SHEET (RULE 26) else PresentationNextSection :- 699:
end;
Synchronize(UpdatePresentationBar);
end;
700 : begin //Speech before theme song ParamTalk := Toy.Talk(PresentationPath+'sb700.wav','S');
Synchronize(UpdatePresentationBarl;
PresentationNextSection := 5000;
end;
// WakeUp 750 : begin ParamTalk := Toy.Talk(PresentationPath+'op005.wav','S#'):
if ParamTalk = -2 then begin PresentationNextSection :~ 750:
sleep1200);
end else PresentationNextSection := 760;
Synchronize(UpdatePresentationBar);
end;
760 : begin ParamListen .= Toy.ListenActive ('yes/l.no/2',12,'SR','W');
case ParamListen of 1: PresentationNextSection := LastPresentation:
2: PresentationNextSection := 10:
else PresentationNextSection :~ LastPresentation;
end;
Synchronize(UpdatePresentationBar);
end:
// Geme Menu 800 : begin if VisitGameMenu = 0 then PresentationNextSection : 820 else begin PresentationNextSection := BOS;
if SecretName = 'BubbleGum' then PresentationNextSection := 815;
:= 810;
if SecretName = 'Rainbow' then PresentationNextSection SUBSTITUTE SHEET (RULE 26) end;
VisitGameMenu := VisitGameMenu +1:
end:
810 : begin ParamTalk := Toy.Talk(PresentationPath+'gm810.wav','S');
Synchronize(UpdatePresentationBar);
PresentationNextSection : 841:
end;
815 : begin ParamTaik := Toy.Talk(PresentationPath+'gm815.wav','S');
Synchronize(UpdatePresentationBar);
PresentationNextSection := 841;
end;
820 : begin ParamTalk :~ Toy.Talk(PresentationPath+'gm820m.wav','S');
Synchronize(UpdatePresentationBar);
PresentationNextSection := 967:
end:
840 : begin ParamTalk := Toy.Talk(PresentationPath+'gm890.wav','S');
Synchronize(UpdatePresentationBar);
PresentationNextSection := 841:
end;
841 : begin ParamTalk := Toy.Talk(PresentationPath+'gm841.wav','S');
Synchronize(UpdatePresentationBar);
PresentationNextSection := 845;
end;
845 : begin ParamTalk := Toy.Talk(PresentationPath+'gm845.wav','S');
Synchronize(UpdatePresentationBar);
PresentationNextSection : 919;
end:
847 : begin ParamTalk := Toy.Talk(PresentationPath+'gm847m.wav','S'):

SUBSTITUTE SHEET (RULE 26) Synchronize(UpdatePresentationBar);
PresentationNextSection :- 850:
LoopCount := 0:
end;
850 : begin //sensors : 1-Nose 2-Hand 3-Foot ; 0-none ParamListen : Toy. Listen ( " ,DTime,'Sensor','W')://nose if ParamListen = NoseSensor then PresentationNextSection := 860 else PresentationNextSection := 855;
SynchronizefUpdatePresentationBar);
LoopCount := LoopCount +1;
if (LoopCount = 3) and (PresentationNextSection = 855) then PresentationNextSection := 860:
end;
855 : begin ParamTalk := Toy.Talk(PresentationPath+'gm855m.wav','S');
  Synchronize(UpdatePresentationBar);
  PresentationNextSection := 850;
end;
860 : begin ParamTalk := Toy.Talk(PresentationPath+'gm860.wav','S');
  Synchronize(UpdatePresentationBar);
  PresentationNextSection := 865;
end;
865 : begin ParamTalk := Toy.Talk(PresentationPath+'gm865m.wav','S');
  Synchronize(UpdatePresentationBar);
  PresentationNextSection := 870;
  LoopCount := 0;
end;
870 : begin //sensors : 1-Nose 2-Hand 3-Foot ; 0-none
  ParamListen := Toy.Listen('',DTime,'Sensor','W'); //foot
  if ParamListen = FootSensor then PresentationNextSection := 890
  else PresentationNextSection := 875;
  Synchronize(UpdatePresentationBar);
  LoopCount := LoopCount + 1;
  if (LoopCount = 3) and (PresentationNextSection = 875) then PresentationNextSection := 890;
end;
875 : begin ParamTalk := Toy.Talk(PresentationPath+'gm875m.wav','S');
  Synchronize(UpdatePresentationBar);
  PresentationNextSection := 870;
end;
890 : begin ParamTalk := Toy.Talk(PresentationPath+'gm890.wav','S');
  Synchronize(UpdatePresentationBar);
  PresentationNextSection := 895;
end;
895 : begin ParamTalk := Toy.Talk(PresentationPath+'gm895.wav','S');
  Synchronize(UpdatePresentationBar);
  PresentationNextSection := 900;
  LoopCount := 0;
end;
900 : begin //sensors : 1-Nose 2-Hand 3-Foot ; 0-none
  ParamListen := Toy.Listen('',DTime,'Sensor','W'); //hand
  if ParamListen = HandSensor then PresentationNextSection := 910
  else PresentationNextSection := 905;
  Synchronize(UpdatePresentationBar);
  LoopCount := LoopCount + 1;
  if (LoopCount = 3) and (PresentationNextSection = 905) then PresentationNextSection := 910;
end;
905 : begin ParamTalk := Toy.Talk(PresentationPath+'gm905m.wav','S');
  Synchronize(UpdatePresentationBar);
  PresentationNextSection := 900;
end;
910 : begin ParamTalk := Toy.Talk(PresentationPath+'gm910.wav','S');
  Synchronize(UpdatePresentationBar);
  PresentationNextSection := 915;
end;
915 : begin ParamTalk := Toy.Talk(PresentationPath+'gm915.wav','S');
  Synchronize(UpdatePresentationBar);
  PresentationNextSection := 920;
  Sensor1 := 0;
  Sensor2 := 0;
  Sensor3 := 0;
  LoopCount := 0;
end;
919 : begin PresentationNextSection := 920;
  Sensor1 := 0;
  Sensor2 := 0;
  Sensor3 := 0;
  LoopCount := 0;
end;
920 : begin //sensors : 1-Nose 2-Hand 3-Foot ; 0-none
  LoopCount := LoopCount + 1;
  if (Sensor1 = HandSensor) and (Sensor2 = NoseSensor) and (Sensor3 = FootSensor) then
    PresentationNextSection := 926 //success
  else begin
    PresentationNextSection := 920; // looping
    if LoopCount > 5 then // pressing not right
    begin
      PresentationNextSection := 924; //timeout
      if (Sensor1 > 0) or (Sensor2 > 0) or (Sensor3 > 0) then
        PresentationNextSection := 925; //press not right
    end
    else begin
      GetSensor := Toy.Listen('',DTime,'Sensor','W');
      if GetSensor > 0 then
      begin
        if GetSensor = NoseSensor then PresentationNextSection := 921; //nose
        if GetSensor = HandSensor then PresentationNextSection := 923; //hand
        if GetSensor = FootSensor then PresentationNextSection := 922; //foot
        Sensor1 := Sensor2;
        Sensor2 := Sensor3;
        Sensor3 := GetSensor;
      end;
      //timeout
      if (Sensor1 = 0) and (Sensor2 = 0) and (Sensor3 = 0) and (LoopCount = 3) then
        PresentationNextSection := 925;
    end;
  end;
end;
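Section 920 keeps the three most recent presses in Sensor1..Sensor3 and declares success when they read Hand, Nose, Foot in that order; each new press shifts the window left. A small Python sketch of that sliding three-slot buffer (names and the helper functions are illustrative, not from the listing):

```python
# Illustrative model of section 920's press buffer: Sensor1..Sensor3 act as a
# three-slot sliding window over the most recent sensor presses.
NOSE, HAND, FOOT = 1, 2, 3
TARGET = (HAND, NOSE, FOOT)  # winning order checked by section 920

def push(window, press):
    """Shift a new press into the window, discarding the oldest slot."""
    return (window[1], window[2], press)

def is_success(window):
    return window == TARGET

window = (0, 0, 0)
for press in (HAND, NOSE, FOOT):  # the correct sequence of presses
    window = push(window, press)
```

Because only the last three presses are kept, a child who fumbles early can still win: the wrong presses simply age out of the window.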
921 : begin ParamTalk := Toy.Talk(PresentationPath+'gm921.wav','S');
  Synchronize(UpdatePresentationBar);
  PresentationNextSection := 920;
end;
922 : begin ParamTalk := Toy.Talk(PresentationPath+'gm922.wav','S');
  Synchronize(UpdatePresentationBar);
  PresentationNextSection := 920;
end;
923 : begin ParamTalk := Toy.Talk(PresentationPath+'gm923.wav','S');
  Synchronize(UpdatePresentationBar);
  PresentationNextSection := 920;
end;
925 : begin ParamTalk := Toy.Talk(PresentationPath+'gm925.wav','S');
  Synchronize(UpdatePresentationBar);
  PresentationNextSection := 927;
end;
926 : begin ParamTalk := Toy.Talk(PresentationPath+'gm926m.wav','S');
  Synchronize(UpdatePresentationBar);
  PresentationNextSection := 1006;
end;
927 : begin //sensors : 1-Nose 2-Hand 3-Foot ; 0-none
  PresentationNextSection := 928;
  Sensor1 := Toy.Listen('',7,'Sensor','W');
  if Sensor1 = HandSensor then Toy.Talk(PresentationPath+'gm910.wav','S');
  if Sensor1 = NoseSensor then Toy.Talk(PresentationPath+'gm860.wav','S');
  if Sensor1 = FootSensor then Toy.Talk(PresentationPath+'gm890.wav','S');
  Sensor2 := Toy.Listen('',7,'Sensor','W');
  if Sensor2 = HandSensor then Toy.Talk(PresentationPath+'gm910.wav','S');
  if Sensor2 = NoseSensor then Toy.Talk(PresentationPath+'gm860.wav','S');
  if Sensor2 = FootSensor then Toy.Talk(PresentationPath+'gm890.wav','S');
  Sensor3 := Toy.Listen('',7,'Sensor','W');
  if Sensor3 = HandSensor then Toy.Talk(PresentationPath+'gm910.wav','S');
  if Sensor3 = NoseSensor then Toy.Talk(PresentationPath+'gm860.wav','S');
  if Sensor3 = FootSensor then Toy.Talk(PresentationPath+'gm890.wav','S');
  if (Sensor1 = HandSensor) and (Sensor2 = NoseSensor) and (Sensor3 = FootSensor) then
    PresentationNextSection := 926
  else PresentationNextSection := 928;
end;
928 : begin ParamTalk := Toy.Talk(PresentationPath+'gm928.wav','S');
  Synchronize(UpdatePresentationBar);
  PresentationNextSection := 1005;
end;
932 : begin ParamTalk := Toy.Talk(PresentationPath+'gm932.wav','S#');
  Synchronize(UpdatePresentationBar);
  PresentationNextSection := 936;
end;
933 : begin ParamTalk := Toy.Talk(PresentationPath+'gm933n.wav','S');
  Synchronize(UpdatePresentationBar);
  PresentationNextSection := 1006;
end;
936 : begin ParamListen := Toy.Listen('yes/1,no/2',DTime,'SR','W');
  case ParamListen of
    1: PresentationNextSection := 840;
    2: PresentationNextSection := 940;
    else PresentationNextSection := 940;
  end;
  Synchronize(UpdatePresentationBar);
end;
940 : begin ParamTalk := Toy.Talk(PresentationPath+'gm940n.wav','S#');
  Synchronize(UpdatePresentationBar);
  PresentationNextSection := 941;
  LoopCount := 0;
end;
941 : begin ParamListen := Toy.Listen('story/1,song/2,storyteller/3',DTime,'SR','W');
  case ParamListen of
    1: PresentationNextSection := 100; //story menu
    2: PresentationNextSection := 300; //song menu
    3: PresentationNextSection := 300; //theme song
    else PresentationNextSection := 945;
  end;
  Synchronize(UpdatePresentationBar);
  LoopCount := LoopCount + 1;
  if (LoopCount = 3) and (PresentationNextSection = 945) then PresentationNextSection := 100;
end;
945 : begin ParamTalk := Toy.Talk(PresentationPath+'gm940n.wav','S#');
  Synchronize(UpdatePresentationBar);
  PresentationNextSection := 941;
end;
965 : begin ParamTalk := Toy.Talk(PresentationPath+'gm965m.wav','S');
  Synchronize(UpdatePresentationBar);
  PresentationNextSection := 970;
end;
967 : begin ParamTalk := Toy.Talk(PresentationPath+'gm967.wav','S');
  Synchronize(UpdatePresentationBar);
  PresentationNextSection := 970;
end;
970 : begin ParamTalk := Toy.Talk(PresentationPath+'gm970.wav','S#');
  Synchronize(UpdatePresentationBar);
  PresentationNextSection := 971;
  LoopCount := 0;
end;
971 : begin ParamListen := Toy.Listen('lollipop/1,peanut butter/2',7,'SR','W');
  case ParamListen of
    1: PresentationNextSection := 975;
    2: PresentationNextSection := 980;
    else PresentationNextSection := 972;
  end;
  LoopCount := LoopCount + 1;
  if (LoopCount = 3) and (PresentationNextSection = 972) then PresentationNextSection := 984;
  Synchronize(UpdatePresentationBar);
end;
972 : begin ParamTalk := Toy.Talk(PresentationPath+'gm972.wav','S#');
  Synchronize(UpdatePresentationBar);
  PresentationNextSection := 971;
end;
975 : begin ParamTalk := Toy.Talk(PresentationPath+'gm975.wav','S');
  Synchronize(UpdatePresentationBar);
  PresentationNextSection := 984;
end;
980 : begin ParamTalk := Toy.Talk(PresentationPath+'gm980.wav','S');
  Synchronize(UpdatePresentationBar);
  PresentationNextSection := 984;
end;
984 : begin ParamTalk := Toy.Talk(PresentationPath+'gm984.wav','S');
  Synchronize(UpdatePresentationBar);
  PresentationNextSection := 983;
  LoopCount := 0;
end;
983 : begin ParamListen := Toy.Listen('rabbit/1,bear/2',DTime,'SR','W');
  case ParamListen of
    1: PresentationNextSection := 985;
    2: PresentationNextSection := 990;
    else PresentationNextSection := 982;
  end;
  LoopCount := LoopCount + 1;
  if (LoopCount = 3) and (PresentationNextSection = 982) then PresentationNextSection := 1005;
  Synchronize(UpdatePresentationBar);
end;
982 : begin ParamTalk := Toy.Talk(PresentationPath+'gm982.wav','S#');
  Synchronize(UpdatePresentationBar);
  PresentationNextSection := 983;
end;
985 : begin ParamTalk := Toy.Talk(PresentationPath+'gm985m.wav','S');
  Synchronize(UpdatePresentationBar);
  PresentationNextSection := 932;
end;
990 : begin ParamTalk := Toy.Talk(PresentationPath+'gm990.wav','S');
  Synchronize(UpdatePresentationBar);
  PresentationNextSection := 932;
  if MainForm.ToyMachine = 'TeddyBear' then PresentationNextSection := 1005;
end;
1005 : begin ParamTalk := Toy.Talk(PresentationPath+'gm1005.wav','S');
  Synchronize(UpdatePresentationBar);
  PresentationNextSection := 1006;
end;
1006 : begin ParamTalk := Toy.Talk(PresentationPath+'gm1006.wav','S#');
  Synchronize(UpdatePresentationBar);
  PresentationNextSection := 998;
  LoopCount := 0;
end;
998 : begin ParamListen := Toy.Listen('story/1,song/2,storyteller/3',DTime,'SR','W');
  case ParamListen of
    1: PresentationNextSection := 100; //story menu
    2: PresentationNextSection := 300; //song menu
    3: PresentationNextSection := 5000; //theme song
    else PresentationNextSection := 997;
  end;
  LoopCount := LoopCount + 1;
  if (LoopCount = 3) and (PresentationNextSection = 997) then PresentationNextSection := 100;
  Synchronize(UpdatePresentationBar);
end;
997 : begin ParamTalk := Toy.Talk(PresentationPath+'gm997.wav','S#');
  Synchronize(UpdatePresentationBar);
  PresentationNextSection := 998;
end;
// Bunny Long ======================================================
2000 : begin Synchronize(UpdatePresentationBar);
  VisitBunnyLong := VisitBunnyLong + 1;
  PresentationNextSection := 2280;
end;
2280 : begin ParamTalk := Toy.Talk(PresentationPath+'rb280m.wav','E');
  Synchronize(UpdatePresentationBar);
  PresentationNextSection := 2285;
end;
2285 : begin ParamTalk := Toy.Talk(PresentationPath+'rb286.wav','S');
  Synchronize(UpdatePresentationBar);
  PresentationNextSection := 2286;
end;
2286 : begin ParamTalk := Toy.Talk(PresentationPath+'rb287.wav','S');
  Synchronize(UpdatePresentationBar);
  PresentationNextSection := 2290;
end;
2290 : begin WaveSection := 'rb2901';
  if BunnyFavoriteFood = 'Honey' then WaveSection := 'rb2901';
  if BunnyFavoriteFood = 'Peanut' then WaveSection := 'rb2902';
  if BunnyFavoriteFood = 'Marshmallow' then WaveSection := 'rb2903';
  ParamTalk := Toy.Talk(PresentationPath+WaveSection+'.wav','S');
  Synchronize(UpdatePresentationBar);
  PresentationNextSection := 2295;
end;
2295 : begin Synchronize(UpdatePresentationBar);
  PresentationNextSection := 2296;
end;
2296 : begin WaveSection := 'rb2961';
  if SecretName = 'BubbleGum' then WaveSection := 'rb2961N';
  if SecretName = 'Ace' then WaveSection := 'rb2962N';
  if SecretName = 'RainBow' then WaveSection := 'rb2963N';
  ParamTalk := Toy.Talk(PresentationPath+WaveSection+'.wav','S');
  Synchronize(UpdatePresentationBar);
  PresentationNextSection := 2300;
end;
2297 : begin ParamTalk := Toy.Talk(PresentationPath+'rb297m.wav','E');
  Synchronize(UpdatePresentationBar);
  PresentationNextSection := 2300;
end;
2300 : begin ParamTalk := Toy.Talk(PresentationPath+'rb300.wav','S');
  Synchronize(UpdatePresentationBar);
  PresentationNextSection := 2305;
end;
2305 : begin WaveSection := 'rb3051';
  if BunnyFavoriteFood = 'Honey' then WaveSection := 'rb3051';
  if BunnyFavoriteFood = 'Peanut' then WaveSection := 'rb3052';
  if BunnyFavoriteFood = 'Marshmallow' then WaveSection := 'rb3053';
  ParamTalk := Toy.Talk(PresentationPath+WaveSection+'.wav','S');
  Synchronize(UpdatePresentationBar);
  PresentationNextSection := 2315;
end;
2315 : begin ParamTalk := Toy.Talk(PresentationPath+'rb315.wav','S');
  Synchronize(UpdatePresentationBar);
  PresentationNextSection := 2320;
end;
2316 : begin Synchronize(UpdatePresentationBar);
  PresentationNextSection := 2320;
end;
2320 : begin ParamTalk := Toy.Talk(PresentationPath+'rb320.wav','S');
  Synchronize(UpdatePresentationBar);
  PresentationNextSection := 2330;
end;
2330 : begin ParamTalk := Toy.Talk(PresentationPath+'rb330.wav','S');
  Synchronize(UpdatePresentationBar);
  PresentationNextSection := 2335;
end;
2335 : begin ParamTalk := Toy.Talk(PresentationPath+'rb336.wav','S');
  Synchronize(UpdatePresentationBar);
  PresentationNextSection := 2336;
end;
2336 : begin ParamTalk := Toy.Talk(PresentationPath+'rb337.wav','S');
  Synchronize(UpdatePresentationBar);
  PresentationNextSection := 2344;
end;
2344 : begin ParamTalk := Toy.Talk(PresentationPath+'rb343.wav','S');
  Synchronize(UpdatePresentationBar);
  PresentationNextSection := 2345;
end;
2345 : begin ParamTalk := Toy.Talk(PresentationPath+'rb344.wav','S');
  Synchronize(UpdatePresentationBar);
  PresentationNextSection := 2346;
end;
2346 : begin ParamTalk := Toy.Talk(PresentationPath+'rb346m.wav','S');
  Synchronize(UpdatePresentationBar);
  PresentationNextSection := 2350;
end;
2350 : begin ParamTalk := Toy.Talk(PresentationPath+'rb351.wav','S');
  Synchronize(UpdatePresentationBar);
  PresentationNextSection := 2351;
end;
2351 : begin ParamTalk := Toy.Talk(PresentationPath+'rb352.wav','S');
  Synchronize(UpdatePresentationBar);
  PresentationNextSection := 2355;
end;
2355 : begin ParamTalk := Toy.Talk(PresentationPath+'rb356.wav','S');
  Synchronize(UpdatePresentationBar);
  PresentationNextSection := 2356;
end;
2356 : begin ParamTalk := Toy.Talk(PresentationPath+'rb357.wav','S');
  Synchronize(UpdatePresentationBar);
  PresentationNextSection := 2360;
end;
2360 : begin ParamTalk := Toy.Talk(PresentationPath+'rb360.wav','S');
  Synchronize(UpdatePresentationBar);
  PresentationNextSection := 2365;
end;
2365 : begin //sensors : 1-Nose 2-Hand 3-Foot ; 0-none
  ParamListen := Toy.Listen('',DTime,'Sensor','W');
  case ParamListen of
    NoseSensor: PresentationNextSection := 2375;
    HandSensor: PresentationNextSection := 2370;
    FootSensor: PresentationNextSection := 2375;
    else PresentationNextSection := 2385;
  end;
  Synchronize(UpdatePresentationBar);
end;
2370 : begin ParamTalk := Toy.Talk(PresentationPath+'rb370.wav','S');
  Synchronize(UpdatePresentationBar);
  PresentationNextSection := 2380;
end;
2375 : begin ParamTalk := Toy.Talk(PresentationPath+'rb375.wav','S');
  Synchronize(UpdatePresentationBar);
  PresentationNextSection := 2365;
end;
case ParamListen of NoseSensor: PresentationNextSection := 2385;
HandSensor: PresentationNextSection := 2382;
FootSensor: PresentationNextSection := 2385;
else PresentationNextSection := 2385:
end;
Synchronize(UpdatePresentationBar);
end:
2382 : begin ParamTalk := Toy.Talk(PresentationPath+'rb382.wav','S')7 Synchronize(UpdatePresentationBarl;
PresentationNextSection := 2385;
end;
2385 : begin ParamTalk := Toy.Talk(PresentationPath+'rb385.wav','S');
Synchronize(UpdatePresentationBar);
PresentationNextSection := 2386;
end;
2386 : begin ParamTalk := Toy.Talk(PresentationPath+'rb386.wav','E');
Synchronize(UpdatePresentationHar);
PresentationNextSection := 2390;
end:
2390 : begin ParamTalk := Toy.Talk(PresentationPath+'rb390.wav','S');
Synchronize(UpdatePresentationBar);
PresentationNextSection := 2395;

SUBSTITUTE SHEET (RULE 26) end:
2395 : begin ParamTalk := Toy.Talk(PresentationPath+'rb395m.wav','S'):
SynchronizelUpdatePresentationBar);
PresentationNextSection := 2410:
end:
2400 : begin ParamTalk := Toy.Talk(PresentationPath+'rb400m.wav','S');
Synchronize(UpdatePresentationBar):
PresentationNextSection := 2410:
end;
2405 : begin ParamTalk := Toy.Talk(PresentationPath+'rb405m.wav','S'):
Synchronize(UpdatePresentationBar):
PresentationNextSection := 2410:
end:
2410 : begin ParamTalk := Toy.Talk(PresentationPath+'rb410m.wav','S');
Synchronize(UpdatePresentationBar):
PresentationNextSection := 2415:
end;
2415 : begin WaveSection :='rb4151m':
if BunnyFavoriteFood = 'Honey' then WaveSection .='rb4151m':
if BunnyFavoriteFood = 'Peanut' then WaveSection .-'rb9152m':
if BunnyFavoriteFood = 'Marshmallow' then WaveSection .='zb4153m':
ParamTalk := Toy.Talk(PresentationPath+WaveSection+'.wav','S'):
Synchronize(UpdatePresentationBar):
PresentationNextSection := 2420:
end:
2420 : begin WaveSection :-'rb4201':
if HunnyFavoriteFood = 'Honey' then Wavesection :-'rb4201':
if BunnyFavoriteFood = 'Peanut' then WaveSection :='rb4202':

SUBSTITUTE SHEET (RULE 26) if BunnyFavoriteFood = 'Marshmallow' then WaveSection :-'rb4203':
ParamTalk := Toy.Talk(PresentationPath+WaveSection+'.wav','S'):
Synchronize(UpdatePresentationSar):
PresentationNextSection := 2425;
end:
2925 : begin ParamTalk := Toy.Talk(PresentationPath+'rb425m.wav','S'):
Synchronize(UpdatePresentationBart;
PresentationNextSection := 2426;
ends 2426 : begin ParamTalk := Toy.Talk(PresentationPath+'rb426m.wav','EL'):
Synchronize(UpdatePzesentationBar):
PresentationNextSection := 2435;
end;
2435 : begin ParamTalk := Toy.Talk(PresentationPath+'rb435m.wav','S'):
Synchronize(UpdatePresentationBar);
PresentationNextsection := 2440:
end:
2440 : begin ParamTalk := Toy.Talk(PresentationPath+'rb490.wav','S'):
Synchronize(UpdatePresentation8ar):
PresentationNextSection := 2445;
end:
2445 : begin ParamTalk := Toy.Talk(PresentationPath+'rb495m.wav','S'):
Synchronize(UpdatePresentationBar):
PresentationNextSection := 245:
end:
// End of Bunny Long =______________________________________________ // Bunny Short _____________________________ ___________________ 3000 : begin Synchronize(UpdatePresentationBar):
VisitBunnyShort := VisitBunnyShort + 1;
PresentationNextSection : 3005;
end:

SUBSTITUTE SHEET (RULE 26) 3005 : begin ParamTalk := Toy.Talk(PresentationPath+'rb3005m.wav','E'):
Synchronize(UpdatePresentationBar):
PresentationNextSection : 3010;
end:
3010 : begin ParamTalk := Toy.Talk(PresentationPath+'rb005m.wav','S');
Synchronize(UpdatePresentationBar):
PresentationNextSection := 3015;
end:
3015 : begin ParamListen := Toy. Listen ('honey/l,peanut butter/2,marshmallow fluff/3',7,'SR','W');
case ParamListen of 1: PresentationNextSection := 3035:
2: PresentationNextSection := 3040:
3: PresentatioxWextSection := 3045;
else PresentationNextSection := 3020:
end:
Synchronize(UpdatePresentationBar)~
end:
3020 : begin ParamTalk := Toy.Talk(PresentationPath+'rb015.wav','S#')f Synchronize(UpdatePresentationBar):
PresentationNextSection := 3025 end:
3025 : begin ParamListen .= Toy. Listen ('honey/l,peanut butter/2,marshmailow fluff/3',DTime,'SR','W'):
case ParamListen of 1: PresentationNextSection3035:
:

2: PresentationNextSection3040;
:=

3: PresentationNextSection3045;
:-else PresentationNextSection:=
3030;

end:
Synchronize(UpdatePresentationBar);
end:

SUBSTITUTE SHEET (RULE 26) 3030 : begin BunnyFavoritefood : 'Honey':
ParamTalk := Toy.Talk(PresentationPath+'rb026.wav','S'l:
Synchronize(UpdatePresentationBar);
PresentationNextsection :- 3035:
end:
3035 : begin BunnyFavoriteFood : 'Honey') ParamTalk := Toy.TalkfPresentationPath+'rb0301.wav','S'):
Synchronize(UpdatePresentationBarl:
PresentationNextSection := 3050;
end:
3040 : begin BunnyFavoriteFood : 'Peanut':
ParamTalk := Toy.TalklPresentationPath+'rb0302.wav','S'):
Synchronize(UpdatePresentationSar)t PresentationNextsection := 3050 end:
3045 : begin BunnyFavoriteFood : 'Marshmallow':
ParamTalk := Toy.Talk(PresentationPath+'rb0303.wav','S'):
Synchronize(UpdatePresentationBar):
PresentationNextSection := 3050:
end:
3050 : begin ParamTalk := Toy.Talk(PresentationPath+'rb3050m.wav','S'):
PresentationNextSection := 3055;
if BunnyFavoriteFood = 'Honey' then PresentationNextSection .- 3055:
:= 3060:
.= 3065:
if BunnyFavoriteFood = 'Peanut' then PresentationNextSection if BunnyFavoriteFood = 'Marshmallow' then PresentationNextSection Synchronize(UpdatePresentationBar):
end:
3055 : begin ParamTalk := Toy.Talk(PresentationPath+'rb3055.wav','S'):
Synchronize(UpdatePresentationBar);

SUBSTITUTE SHEET (RULE 26) *rB

PresencationNextSection : 3075;
end:
3060 : begin ParamTalk := Toy.Talk(PresentationPath+'rb3060.wav','S');
Synchronize(UpdatePresentationBarl;
PresentationNextSection :- 3075:
end:
3065 : begin ParamTalk := Toy.Talk(PresentationPath+'rb3065.wav','S')~
Synchronize(UpdatePresentationBar):
PresentationNextSection := 3075;
end:
3075 : begin ParamTalk := Toy.Talk(PresentationPath+'rb3075n.wav','S');
Synchronize(UpdatePresentationBar):
PresentationNextSection := 3080;
end:
3080 : begin ParamTalk := Toy.Talk(PresentationPath+'rb110.wav','S#'):
Synchronize(UpdatePresentationBar);
PresentationNextSection := 3085:
end:
3085 : begin ParamListen .= Toy. Listen ('giraffe/l,elephant/2,bunny/3',7,'SR','W');
case ParamListen of 1: PresentationNextSection3095;
:=

2: PresentationNextSection3090:
:=

3: PresentationNextSection3100;
:-else PresentationNextSection:=
3100;

end;

Synchronize(UpdatePresentationBar);

end:

3090 : begin ParamTalk := Toy.Talk(PresentationPath+'rb120.wav','S'1;
Synchronize(UpdatePresentationBar);
PresentationNextSection := 3125;

SUBSTITUTE SHEET (RULE 26) end:
3095 : begin ParamTalk := Toy.Talk(PresentationPath+'rb125.wav','S'):
Synchronize(UpdatePresentationBar):
PresentationNextSection := 3125:
end:
3100 : begin ParamTalk := Toy.Talk(PresentationPath+'rb130.wav','S'):
Synchronize(UpdatePresentationBar):
PresentationNextSection : 3125:
end:
3125 : begin ParamTalk := Toy.Talk(PresentationPath+'rb220n.wav','S'):
Synchronize(UpdatePresentationBar):
PresentationNextsection := 3135:
end;
3135 : begin ParamTalk := Toy.Talk(PresentationPath+'rb231m.wav','EL'):
Synchronize(UpdatePresentationHar):
PresentationNextSection := 3140:
end:
3140 : begin ParamTalk := Toy.Talk(PresentationPath+'rb235n.wav','S#'1:
Synchronize(UpdatePresentationBar):
PresentationNextSection := 3145:
end;
3145 : begin ParamListen .= Toy. Listen ('bunnythree/1',DTime,'SR','W');
case ParamListen of 1: PresentationNextSection := 3150:
else PresentationNextSection := 3155:
end:
Synchronize(UpdatePresentationBar):
end:
3150 : begin ParamTalk := Toy.Talk(PresentationPath+'rb245.wav','S'):

SUBSTITUTE SHEET (RULE 26) PresentationNextSection : 3160;
if BunnyFavoriteFood = 'Honey' then PresentationNextSection := 3160;
.= 3165;
:= 3170;
if BunnyFavoriteFood = 'Peanut' then PresentationNextSection if BunnyFavoriteFood = 'Marshmallow' then PresentationNextSection Synchronize(UpdatePresentationBar);
end:
3155 : begin ParamTalk := Toy.Talk(PresentationPath+'rb3155.wav','S');
PresentationNextSection := 3160;
if BunnyFavoriteFood ~ 'Honey' then PresentationNextSection .= 3160;
.= 3165;
:= 3170;
if BunnyFavoriteFood = 'Peanut' then PresentationNextSection if BunnyFavoriteFood = 'Marshmallow' then PresentationNextSection Synchronize(UpdatePresentationBar);
end;
3160 : begin ParamTalk := Toy.Talk(PresentationPath+'rb3160.wav','S'):
Synchronize(UpdatePresentationBar);
PresentationNextSection := 3180;
end;
3165 : begin ParamTalk := Toy.Talk(PresentationPath+'rb3165.wav','S');
Synchronize(UpdatePresentationBar);
PresentationNextSection := 3185;
end:
3170 : begin ParamTalk := Toy.Talk(PresentationPath+'rb3170.wav','S');
Synchronize(UpdatePresentationBarl;
PresentationNextSection :- 3190;
end;
3180 : begin ParamTalk := Toy.Talk(PresentationPath+'rb2751.wav'.'S');
Synchronize(UpdatePresentationHar);

SUBSTITUTE SHEET (RULE 26) PresentationNextsection : 3195;
end;
3185 : begin ParamTalk := Toy.Talk(PresentationPath+'rb2752.wav','S');
Synchronize(UpdatePresentationBar);
PresentationNextSection := 3195;
end;
3190 : begin ParamTalk := Toy.Talk(PresentationPath+'rb2753.wav','S');
Synchronize(UpdatePresentationBar);
PresentationNextSection := 3195:
end:
3195 : begin ParamTalk := Toy.Talk(PresentationPath+'rb280m.wav','E');
Synchronize(UpdatePresentationBar);
PresentationNextSection := 2051 end;
// End of Bunny Short =°°°~°=~~-3°~~a=:~:a~moass~a=x=ago=~azss=oar=a~a // Princess and The Ped =~~=a~=-=~=°~=s=~m-~s~c~c~=s=c=~~s~cxaa-a 4000 : begin VisitPrincess := Visitprincess + 1;
PresentationNextSection : -1;
PrincessNextSection .= 1;
end;
4010 : begin PresentationNextSection :- 699; //go back from princess end;
// End of Princess and The Pea // Theme Song ==ammo:~-~~-==-e~c~==ass:=m~~avm=========e=~~c~e===s=
5000 : begin PresentationNextSection := 5010;
end;
5010 : begin ParamTalk := Toy.TalkAll(Path+'StoryTeller.wav','S');
Synchronize(UpdatePresentationBar);
PresentationNextSection :- 5020:

SUBSTITUTE SHEET (RULE 26) end:
5020 : begin //ParamTalk := Toy.Talk(Path+'Alonel.wav','S'):
SynchronizefUpdatePresentationBar):
PresentationNextSecticn :- 5030:
end;
5030 : begin //ParamTalk :~ Toy.TalkAll(Path+'All.wav','S'):
Synchronize(UpdatePresentationBar):
PresentationNextSection := 5040;
end:
5040 : begin //ParamTalk := Toy.Talk(Path+'Alone2.wav','S'):
Synchronize(UpdatePresentationBar):
PresentationNextSection := 5050:
end;
5050 : begin //ParamTalk := Toy.TalkAll(Path+'All.wav','S'):
Synchronize(UpdatePresentationBar):
PresentationNextSection := 5060:
end;
5060 : begin //ParamTalk := Toy.Talk(Path+'Alone3.wav','S'):
Synchronize(UpdatePresentationBar):
PresentationNextSection := 45:
if SecretName = 'BubbleGum' then PresentationNextSection : 50;
if SecretName = 'Ace' then PresentationNextSection :- 40:
if SecretName = 'Rainbow' then PresentationNextSection : 45:
end:
// End of Theme Song ==xa==x====ax=====__.=====s~ -=====x=xx====xx==
//PAUSE
10000 : begin PresentationNextSection : 10000:
sleep(200):
end;
end://End of Presentation SUBSTITUTE SHEET (RULE 26) if (PresentationPtextSecticn <> 699) and (PresentationNextSection <>
10000) and (PresentationNextSection <> 760) and (PresentationNextSection <>
750)then LastPresentation : PresentationNextSection:
(+
// ___ ________= I N T R 0 ___________________________________ // ----------- write here all sessions ------------------"-'-'-"
// ============ I N T R O ==========================================
// ----------- write here all sessions -----------------------------
// =================================================================
case IntroNextSection of
1 : begin //Toy.Wait(12,'W');
  {sleep(300);
  ParamTalk := Toy.Talk(IntroPath+'in001.wav','E');
  ParamListen := Toy.Listen('yes/1,no/2',1.5,'SR and Sensor');
  StatusForm.StatusGauge.Progress := IntroNextSection/4.5;}
  IntroNextSection := 5;
end;
2 : begin {ParamTalk := Toy.Talk(IntroPath+'in01a.wav','E');
  Synchronize(UpdateIntroBar);
  IntroNextSection := 3;}
end;
3 : begin {ParamTalk := Toy.Talk(IntroPath+'in01b.wav','EL');
  Synchronize(UpdateIntroBar);
  IntroNextSection := 5;}
end;
4 : begin {sleep(300);
  Synchronize(UpdateIntroBar);
  IntroNextSection := 5;}
end;
5 : begin ParamTalk := Toy.Talk(IntroPath+'in01.wav','S');
  Synchronize(UpdateIntroBar);
  IntroNextSection := 10;
end;
Synchronize(UpdateIntroBar);
IntroNextSection := 7;) end;
7 : begin (sleep(3001;
Synchronize(UpdateIntroBarl;
IntroNextSection := 8;}
end:
8 : begin (sleep1300);
Synchronize(UpdateIntroBar):
IntroNextSection := 9;) end;
9 : begin (sleep(300}:
Synchronize(UpdateIntroBar);
IntroNextSection := 10;) end;
10: begin ParamTalk .= Toy. Talk (IntroPath+'in02m.wav','SP1');
Synchronize(UpdateIntroBar):
IntroNextSection := 21:
if SecretName = 'Bubble gum' then IntroNextSection := 21:
if SecretName = 'Ace' then IntroNextSection := 22;
if SecretName = 'Rainbow' then IntroNextSection := 23;
end;
11: begin (sieep(3001;
Synchronize(UpdateIntroBar);
IntroNextSection : 12 ;) end:
12: begin (sleep(300);

SUBSTITUTE SHEET (RULE 26) Synchronize(UpdateIntroBar):
IntroNextSection : 13;) end:
13: begin (sleep(300);
Synchronize(UpdateIntroBar):
IntroNextSection : 14:}
end:
I4: begin (sleep(300):
Synchronize(UpdateIntroBar);
IntroNextSection : 15;) ends 15: begin {ParamTalk .= Toy. Talk (IntroPath+'in02b.wav','S'):
Synchronize(UpdateIntroBar);
IntroNextSection :- 16:) end:
16: begin (ParamTalk .= Toy. Talk (IntroPath+'in02c.wav','E'):
Synchronize(UpdateIntroBar):
IntroNextSection := 20;) end;
1?: begin (ParamTalk .= Toy. Talk (IntroPath+'in02c.wav','E'):
Synchronize(UpdatelntroBar);
IntroNextSection :- 18;) end;
18: begin (sleep(3001;
Synchronize(UpdateIntroBarl:
IntroNextSection : 19;) end;
19: begin (sleep(300);
Synchronize(UpdateIntroBar);

SUBSTITUTE SHEET (RULE 26) WO 99/54015 PCTlIL99/00202 IntroNextSection : 20:) end:
20: begin [ParamTalk .= Toy. Talk (IntroPath+'in02.wav','S');
Synchronize(UpdateIntroBar):
IntzoNextSection := 21:
if SecretName = 'Bubble gum' then IntroNextSection := 21;
if SecretName = 'Ace' then IntroNextSection := 22:
if SecretName = 'Rainbow' then IntroNextSection := 23;) end:
21: begin ParamTalk .= Toy. Talk (IntroPath+'in03.wav','S')p Synchronize(UpdatelntroBar):
IntroNextSection := 30:
end;
22: begin ParamTalk .= Toy. Talk (IntroPath+'in03a.wav','S'):
Synchronize(UpdateIntroBar):
IntroNextSection := 30;
end:
23: begin ParamTalk .= Toy. Talk (IntroPath+'in03b.wav','S'):
Synchronize(UpdatelntroBar)t IntroNextSection : 30:
end:
24: begin (sleep(300):
Synchronize(UpdateIntroBar):
IntroNextSection := 25;1 end:
25: begin (sleep(300):
SynchronizelUpdateIntroBar)~
IntroNextSection := 26;) end:
26: begin SUBSTITUTE SHEET (RULE 26) i (sieepl300);
Synchronize(UpdateIntroBar);
IntroNextSection := 27;) end:
27: begin (sleep(300);
Synchronize(UpdateIntroBar):
IntraNextSection := 28;) end;
28: begin (sleep(300);
Synchronize(UpdateIntroBar);
IntroNextSection := 29;) end:
29: begin (sleep(300);
Synchronize(UpdateIntroBar):
IntroNextSection := 30;) end;
30: begin ParamTalk := Toy. Talk (IntroPath+'in04.wav','S');
Synchronize(UpdateIntroBar);
IntroNextSection := 35;
end;
31: begin (ParamTalk .= Toy. Talk (IntroPath+'in04a.wav','EL');
Synchronize(UpdateIntroBar);
IntroNextSection := 35;) end;
32: begin (sleep(300);
Synchronize(UpdatelntroBarl;
IntroNextSection := 33;) end:
33: begin (sleep(300);

SUBSTITUTE SHEET (RULE 26) Synchronize(UpdateIatroBar):
IntroNextSection := 34;) end:
34: begin (sleep(300):
Synchronize(UpdatelntroBar):
IntroNextSection := 35;) end;
35: begin ParamTalk .= Toy. Talk (IntroPath+'inOSm.wav','SP2');
Synchronize(UpdateIntroBar);
IntroNextSection := 95;
end:
36: begin (sleep(300);
Synchronize(UpdatelntroBarl:
IntroNextSection := 37:) end:
37: begin (sleep(300):
Synchronize(UpdateIntroBar):
IntroNextSection := 38;) end:
38: begin (sleep(300);
Synchronize(UpdateIntroBar):
IntroNextSection :- 39;) end;
39: begin (sleep(300);
Synchronize(UpdateIntroBar):
IntroNextSection := 40;) end;
40: begin (ParamTalk .= Toy. Talk (IntroPath+'inOS.wav','S');
Synchronize(UpdateIntroBar):

SUBSTITUTE SHEET (RULE Z6) *rB

IntroNextSection := 45;) end;
4I: begin (sleep(300);
Synchronize(UpdateIntroBar);
IntroNextSection := 42;) end:
92: begin (sleep(300);
Synchronize(UpdateIntroBar);
IntroNextSection := 43;) end;
43: begin (sleep(300);
Synchronize(UpdateIntroBar):
IntroNextSection := 44;) end:
44: begin (sleep(300);
Synchronize(UpdateIntroBar);
IntroNextSection := 45;) end;
95: begin ParamTalk .= Toy. Talk (IntroPath+'in06.wav','S');
Synchronize(UpdateIntroBar);
IntroNextSectian := 50;
end;
96: begin (sieep1300):
Synchronize(UpdateIntroBarl;
IntroNextSection := 47;1 end;
47: begin (sleep(300);
Synchronize(UpdateIntroBar);
IntroNextSection := 48;) SUBSTITUTE SHEET (RULE 26) end;
48: begin (sleep(300);
Synchronize(UpdateIntroBar);
IntroNextSection := 49;) end;
49: begin (sleep(300);
Synchronize(UpdateIntroBar);
IntroNextSection := 50;) end:
50: begin ParamTalk .= Toy. Talk (IntroPath+'in07.wav','S');
Synchronize(UpdatelntroBarl;
IntroNextSection := 55;
end;
51: begin (sleep(300);
Synchronize(UpdateIntroBar);
IntroNextSection := 52;) end;
52: begin ~sleep(300);
Synchrcnize(UpdateIntroBar);
IntroNextSection := 53;) end;
53: begin (sleep(300);
Synchronize(UpdateIntroBarl;
IntroNextSection := 59;1 end:
54: begin (sleep(300);
Synchronize(UpdateIntroBar);
IntroNextSection := 55;}
end;

SUBSTITUTE SHEET (RULE 26) 55: begin ParamListen := Toy.Wait(12,'W');
if ParamListen = 1 then IntroNextSection := 60 else IntroNextSection := 65;
Synchronize(UpdateIntroBar);
end;
56: begin (sleep(300);
Synchronize(UpdateIntroBar);
IntroNextSection := 57;) end;
57: begin (sleep(300);
5ynchronize(UpdateIntroBar):
IntroNextSection := 58;) end;
58: begin (sleep(300);
Synchronize(UpdateIntroBar);
IntroNextSection := 59;) end;
59: begin (sleep(300);
Synchronize(UpdateIntroBarl;
IntroNextSection := 60;) end:
60: begin ParamTalk .= Toy. Talk (IntroPath+'in09.wav','S');
Synchronize(UpdateIntroBar);
IntroNextSection := 67;
end;
61: begin (sleep(300);
Synchronize(UpdateIntroBar):
IntroNextSection := 62;) end;

SUBSTITUTE SHEET (RULE 26) 62: begin (sleep(300);
Synchronize(UpdateIntroBarl:
IntroNextSection := 63;) end;
63: begin (sleep(300);
Synchronize(UpdateIntroBar);
IntroNextsection := 64;) end;
64: begin (sleep(300):
Synchronize(UpdateIntroBar);
IntroNextSection := 65;) end:
65: begin ParamTalk .= Toy. Talk (IntroPath+'inl0.wav','S');
if ParamTalk = 1 then IntroNextSection := 60 else IntroNextSection := 66;
Synchronize(UpdateIntroBar);
end;
66: begin ParamListen := Toy.Wait(12.'W');
if ParamListen = 1 then IntroNextSection := 60 else IntroNextSection := 67;
Synchroni2e(UpdateIntroBar);
end;
67: begin ParamTalk .= Toy. Talk (IntroPath+'******.wav','S');
Synchronize(UpdateIntroBar);
IntroNextSection := 70;
end;
68: begin (sleep(300);
Synchronize(UpdateIntroBar):
IntroNextSection :- 69;) SUBSTITUTE SHEET (RULE 26) end;
69: begin (sleepl300);
Synchronize(UpdateIntroBarl;
IntroNextSection := 70;) end;
70: begin ParamTalk .= Toy. Talk (IntroPath+'inl0b.wav','S');
5ynchronize(UpdateIntroBar);
IntroNextSection := 71;
if SecretName = 'Bubble gum' then IntroNextSection : 7I;
if SecretName = 'Ace' then IntroNextSection := 72;
if SecretName = 'Rainbow' then IntroNextSection := 73;
end;
71: begin ParamTalk .= Toy. Talk (IntroPath+'inll.wav','S');
Synchronize(UpdateIntroBar);
IntroNextSection := 80;
end:
72: begin ParamTalk .= Toy. Talk (IntroPath+'inlla.wav','S');
Synchronize(UpdateIntroBar);
IntroNextSection := 80;
end:
73: begin ParamTalk .= Toy. Talk iIntroPath+'inllb.wav','S');
Synchronize(UpdatelntroBar);
IntroNextSection := 80;
end;
74: begin {sleep(300);
Synchronize(UpdateIntroBar);
IntroNextSection := 75;} end;
75: begin {sleep(300);
Synchronize(UpdateIntroBar);
IntroNextSection := 76;} end;
76: begin {sleep(300);
Synchronize(UpdateIntroBar);
IntroNextSection := 77;} end;
77: begin {sleep(300);
Synchronize(UpdateIntroBar);
IntroNextSection := 78;} end;
78: begin {sleep(300);
Synchronize(UpdateIntroBar);
IntroNextSection := 79;} end;
79: begin {sleep(300);
Synchronize(UpdateIntroBar);
IntroNextSection := 80;} end;
80: begin ParamTalk := Toy.Talk(IntroPath+'in11m.wav','S');
Synchronize(UpdateIntroBar);
IntroNextSection := 85;
end;
81: begin {ParamTalk := Toy.Talk(IntroPath+'in12.wav','S');
Synchronize(UpdateIntroBar);
IntroNextSection := 85;} end;
82: begin {sleep(300);
Synchronize(UpdateIntroBar);

IntroNextSection := 83;} end;
83: begin {sleep(300);
Synchronize(UpdateIntroBar);
IntroNextSection := 84;}
end;
84: begin {sleep(300);
Synchronize(UpdateIntroBar);
IntroNextSection := 85;} end;
85: begin ParamTalk := Toy.Talk(IntroPath+'in12.wav','EL');
Synchronize(UpdateIntroBar);
IntroNextSection := 86;
end;
86: begin ParamTalk := Toy.Talk(IntroPath+'in12b.wav','S');
Synchronize(UpdateIntroBar);
IntroNextSection := 90;
end;
87: begin {sleep(300);
Synchronize(UpdateIntroBar);
IntroNextSection := 88;} end;
88: begin {sleep(300);
Synchronize(UpdateIntroBar);
IntroNextSection := 89;}
end;
89: begin {sleep(300);
Synchronize(UpdateIntroBar);
IntroNextSection := 90;} end;
Synchronize(UpdateIntroBar):
IntroNextsection := 95:
end:
95: begin ParamTalk .= Toy. Talk (IntroPath+'inl3a.wav','S'):
//randomize WAVE
Synchronize(UpdateIntroBar):
IntroNextSection := 100:
end:
100: begin ParamTalk .= Toy. Talk (IntroPath+'inl9.wav','S'):
Synchronize(UpdateIntroBarl:
IntroNextSection := 110:
end:
110: begin ParamListen := Toy.Wait(12,'W')f if ParamListen = 3 then IntroNextSection := 120 else IntroNextSection : 115:
Synchronize(UpdateIntroBar):
end:
115: begin ParamTalk .= Toy. Talk (IntroPath+'inl4a.wav','S'):
if ParamTalk = 3 then IntroNextSection : 120 else IntroNextSection : 116:
Synchronize(UpdateIntroBar):
end:
116: begin ParamListen := Toy.Wait(12,'W'):
if ParamListen = 3 then IntroNextSection : 120 else IntroNextSection :- 145;
Synchronize(UpdateIntroBar):
end:
120: begin SUBSTITUTE SHEET (RULE 26) ParamTalk .= Toy. Talk (IntroPath+'inl3a.wav','S');//randomize WAVE
Synchronize(UpdateIntroBar);
IntroNextSection :- 145;
end;
145: begin ParamTalk .= Toy. Talk (IntroPath+'inl5.wav','S'1;
Synchronize(UpdateIntroBar);
IntroNextSection := 155;
end;
155: begin ParamTalk .= Toy. Talk (IntroPath+'inl6.wav','S'):
Synchronize(UpdateIntroBar):
IntroNextSection :- 160:
if SecretName = 'Bubble gum' then IntroNextSection : 160:
if SecretName = 'Ace' then IntroNextSection :- 161;
if SecretName = 'Rainbow' then IntroNextSection := 162:
end;
160: begin ParamTalk .= Toy. Talk (IntroPath+'inl7.wav','S')~
Synchronize(UpdateIntro8arl:
IntroNextSection := 164;
end:
161: begin ParamTalk .= Toy. Talk (IntroPath+'inl7a.wav','S');
Synchronize(UpdateIntroBar):
IntroNextSection : 164;
end;
162: begin ParamTalk .= Toy. Talk (IntroPath+'inl7b.wav','S'):
Synchronize(UpdateIntroBarl;
IntroNextSection : 164:
end:
164: begin ParamTalk .= Toy. Talk (IntroPath+'inl8.wav','S')7 Synchronize(UpdateIntro8ar);
IntroNextSection : 165;

SUBSTITUTE SHEET (RULE 26) end;
I65: begin ParamTalk .= Toy. Talk (IntroPath+'SPin165.wav','SP3');
Synchronize(UpdateIntroBar);
IntroNextSection := 175;
end;
166: begin (ParamTalk .= Toy. Talk (IntroPath+'in20.wav','S');
SynchronizelUpdateintroBar);
IntroNextSection := 167;) end;
167: begin (ParamTalk .= Toy. Talk (IntroPath+'inbeep.wav','EL'):
Synchronize(UpdateIntroBar);
IntroNextSection := 168;) end:
168: begin (ParamTalk .= Toy. Talk (IntroPath+'inl9.wav','S');
Synchronize(UpdateIntroBar);
IntroNextSection := 169;) end;
169: begin (ParamTalk .= Toy. Talk (IntroPath+'inblerp.wav','EC');
Synchronize(UpdateIntroBar);
IntroNextSection := 170;) end;
170: begin (ParamTalk .= Toy.Talk (IntroPath+'in2l.wav +
'+IntroPath+'in22.wav','S');
Synchronize(UpdateIntroBar);
IntroNextSection := 171:) end;
171: begin (ParamTalk .= Toy. Talk (IntroPath+'inboop.wav','E');
Synchronize(UpdateIntroBar):
IntroNextSection := 172; ) SUBSTITUTE SHEET (RULE 26) end;
172: begin (ParamTalk .= Toy. Talk (IntroPath+'in26.wav','S');
Synchronize(UpdateIntroBar);
IntroNextSection := 175;1 end;
173: begin (ParamTalk .= Toy. Talk (IntroPath+'in23.wav','S');
SynchronizelUpdateIntroBar):
IntroNextSection := 175;) end;
175: begin ParamListen := Toy.Wait(12,'W');
if ParamListen = 1 then IntroNextSection :- 180 else IntroNextSection := 185:
Synchronize(UpdateIntroBar);
end;
180: begin ParamTalk .= Toy. Talk (IntroPath+'in24.wav','S');
5ynchronize(UpdateIntroBar);
IntroNextSection := 195;
end;
181: begin (ParamTalk .= Toy.Talk (IntroPath+'inbeep.wav','EL');//*****
check???
if ParamTalk = 2 then IntroNextSection := 185 else IntroNextSection : 182:
Synchronize(UpdateIntroBar);) end:
I82: begin (PatamTalk := Toy.Talk (IntroPath+'in25b.wav','S');//*****
check???
if ParamTalk = 2 then IntroNextSection := 185 else IntroNextSection : 184;
Synchronize(UpdateIntroBarl;) end;

SUBSTITUTE SHEET (RULE 26) 189: begin (ParamListen := Toy. Listen ( " ,10,'Sensor'):
if ParamListen = 2 then IntroNextSection : 185 else IntroNextSection : 190;
Synchronize(UpdateIntroBar);) end:
185: begin ParamTalk .= Toy. Talk (IntroPath+'SPinl85.wav','SP4'):
if ParamTalk = 1 then IntroNextSection : 180 else IntroNextSection : 190;
Synchronize(UpdateIntroBar):
end:
186: begin (ParamTalk .= Toy. Talk (IntroPath+'inboop.wav','E'):
if ParamTalk = 1 then IntroNextSection := 180 else IntroNextSection : 187:
Synchronize(UpdateIntroBar): ) end:
187: begin (ParamTalk := Toy. Talk (IntroPath+'in29b.wav','S');
if ParamTalk = 1 then IntroNextSection : 180 else IntroNextSection : 190;
Synchroni2e(UpdateIntroBar): ) end:
190: begin ParamListen := Toy.Wait(12,'W'):
if ParamListen = 1 then IntroNextSection : 180 else IntroNextSection : 195:
Synchronize(UpdateIntroBar):
end:
195: begin ParamTalk .= Toy. Talk (IntroPath+'SPin195.wav','SP5'):
Synchronize(UpdateIntroBar):
IntroNextSection :- 197;
end:
196: begin (ParamTalk .= Toy. Talk (IntroPath+'in23.wav','S'):

SUBSTTTUTE SHEET (RULE 26) Svnchronize(UpdateIntroBar):
IntroNextSection : 197:}
end;
197: begin ParamListen := Toy.Wait(12,'W');
if ParamListen = 2 then IntroNextSection := 200 else IntroNextSection := 205:
Synchrcnize(UpdateIntroBar);
end:
200: begin ParamTalk .= Toy. Talk (IntroPath+'in33.wav','S');
Synchronize(UpdateIntroBar);
IntroNextSection := 215;
end;
205: begin ParamTalk .= Toy. Talk (IntroPath+'SPin205.wav','SP6');
if ParamTalk = 2 then IntroNextSection := 200 else IntroNextsection := 210;
Synchronize(UpdateIntroHar);
cnd;
206: begin (ParamTalk .= Toy. Talk (IntroPath+'inbeep.wav','S');
if ParamTalk = 2 then IntroNextSection := 200 else IntroNextSection := 207;
Synchronize(UpdatelntroBar):}
end;
207: begin (ParamTalk .= Toy. Talk (IntroPath+'in34b.wav','S');
if ParamTalk = 2 then IntroNextSection := 200 else IntroNextSection := 210;
Synchronize(UpdateIntroBar):) end;
210: begin earamListen := Toy.Wait(12,'W');
if ParamListen = 2 then IntroNextSection := 200 else IntroNextSection := 215;
Synchronize(UpdateIntroBar);

SUBSTITUTE SHEET (RULE 26) *rB

end:
215: begin ?aram'Palk .= Toy. Talk (IntroPath+'SPin215.wav','SP7'):
Synchronize(UpdateIntroBar):
IntroNextSection := 217:
end:
216: begin (ParamTalk .= Toy. Talk (IntroPath+'in30.wav','S'):
Synchronize(UpdateIntroBar):
IntroNextSection := 217;) end:
217: begin ParamListen := Toy.Wait(12,'W'):
if ParamListen = 3 then IntroNextSection := 220 else IntroNextsection := 221:
Synchronize(UpdateIntroBar):
end;
220: begin ParamTalk .= Toy. Talk (IntroPath+'in36.wav','S'):
Synchronize(UpdateIntroBar);
IntroNextSection := 230:
end:
221: begin ParamTalk .= Toy. Talk (IntroPath+'SPin221.wav','SP8'):
if ParamTalk = 3 then IntroNextSection := 220 else IntroNextSection :.= 224:
Synchronize(UpdateIntroBar):
end:
222: begin (ParamTalk .= Toy. Talk (IntroPath+'inblerp.wav','EC'):
Synchronize(UpdateIntroBar):
IntroNextSection := 90:) end:
223: begin (ParamTalk .= Toy. Talk (IntroPath+'in37b.wav','S'):
Synchronize(UpdatelntroBar):

SUBSTITUTE SHEET (RULE 26) WO 99/54015 PCT/iL99/00202 IntroNextsection :_ 190;1 end;
229: begin ParamTalk .= Toy.Wait(12.'W');
if ParamTalk = 3 then IntroNextSection := 220 else IntroNextSection := 230;
Synchronize(UpdateIntroBar):
end:
230: begin ParamTalk .= Toy. Talk (IntroPath+'in38.wav','S');
Synchronize(UpdatelntroBar);
IntroNextSection := 235;
end;
235: begin ParamTalk .= Toy. Talk (IntroPath+'SPin235.wav','SP9');
if ParamTalk = 1 then IntroNextSection := 250 else IntroNextSection := 241;
Synchronize(UpdateIntroBar);
end;
241: begin ParamTalk .= Toy. Talk (IntroPath+'in40a.wav','S');
Synchronize(UpdatelntroBar);
IntroNextSection := 242;
end:
242: begin ParamTalk .= Toy. Talk (IntroPath+'SPin242.wav','SP10');
Synchronize(UpdateIntroBar);
IntroNextSection := 298:
end;
243: begin (Param2alk .= Toy. Talk (IntroPath+'inl2b.wav','S');
Synchronize(UpdateIntroBar);
IntroNextSection := 90;) end;
244: begin (ParamTalk .= Toy. Talk (IntroPath+'inl2b.wav','S');

SUBSTITUTE SHEET (RULE 26) Synchronize(UpdateIntroBar~:
~ntroNextSection : 90: ) end;
245: begin (ParamTalk .= Toy. Talk (IntroPath+'inl2b.wav','S'):
Synchronize(UpdateIntroBarl:
IntroNextSection := 90;) end:
246: begin (ParamTalk ._ Toy. Talk (IntroPath+'inl2b.wav','S'1:
Synchronize(UpdateIntroBar):
IntroNextsection := 90;) end:
247: begin (ParamTalk .= Toy. Talk (IntroPath+'in24.wav','S'):
Synchronize(UpdateIntroBar):
IntroNextSection := 190:) end;
248: begin ParamTalk := Toy. Talk (IntroPath+'SPin235.wav','SP9'):
if ParamTalk = 1 then IntroNextsection := 250 else IntroNextSection := 270;
Synchronize(UpdateIntroBar);
end;
250: begin ParamTalk .= Toy. Talk (IntroPath+'in39.wav','S'):
Synchronize(UpdateIntroBar):
IntroNextSection := 265;
end;
265: begin ParamTalk .= Toy. Talk (IntroPath+'in4l.wav','S');
Synchronize(Updatelntro8ar);
IntroNextSection := 270;
end;
270: begin ParamTalk .= Toy. Talk (IntroPath+'in9lm.wav','EL'):

SUBSTITUTE SHEET (RULE 26) Synchrorize(UpdateT_ntroBar);
IntroNextSection := 275;
end;
275: begin ParamTalk .= Toy. Talk (IntroPath+'in44.wav','S');
Synchronize(UpdateIntroBar);
IntroNextSection := 295;
end;
276: begin (ParamTalk .= Toy. Talk (IntroPath+'in44c.wav','E');
Synchronize(UpdateIntroBar);
IntroNextSection := 277;) end:
277: begin (ParamTalk .= Toy. Talk (IntroPath+'in44b.wav','S');
Synchronize(UpdateIntroBar):
IntroNextsection := 285;) end;
285: begin (ParamTalk .= Toy.Talk ( " ,'EC'); // sleep(1000) wait 1 sec Synchronize(UpdateintroBar);
IntroNextSection :- 300;) end:
290: begin (ParamTalk .= Toy. Talk (IntroPath+'inl2b.wav','S');
//?????~
Synchronize(UpdatelntroBar);
IntroNextSection := 90;) end;
295: begin ParamTalk .= Toy. Talk (IntroPath+'in49.wav','S');
Synchronize(UpdateIntroBar);
IntroNextSection := 300;
end;
300: begin SUBSTITUTE SHEET (RULE 26) ParamTalk .= Toy. Talk (IritroPath+'in50.wav','S');
Synchronize(UpdateIntroBar);
Introl3extSection := 305:
end;
305: begin ParamListen .= Toy. Listen ('too hot/7',12,'SR','W');
if ParamLiszen = 7 then IntroNextSection : 315 else IntroNextSeccion :- 310;
Synchronize(UpdateIntroBar);
end:
310: begin ParamTalk .= Toy. Talk (IntroPath+'in52.wav','S');
Synchronize(UpdatelntroBar);
IntroNextSection := 311;
end;
311: begin ParamListen .= Toy. Listen ('too hot/7',12,'SR','W');
if ParamListen = 7 then IntroNextSection := 315 else IntroNextSection := 320;
Synchronize(UpdateIntroBar);
end;
315: begin ParamTalk .= Toy. Talk (IntroPath+'in52m.wav','EL'1:
Synchronize(UpdateIntroBar):
IntroNextSection := 316;
end;
316: begin ParamTalk .= Toy. Talk (IntroPath+'in5l.wav','S');
Synchronize(UpdateIntroBar);
IntroNextSection : 320:
end:
320: begin ParamTalk .= Toy. Talk (IntroPath+'in53.wav','S');
Synchronize(UpdateIntroBar);
IntroNextSection :- 325:
end;

SUBSTITUTE SHEET (RULE 26) 325: begin ParamListen .= Toy. Listen ('too cold/7',12,'SR','W'):
if ParamListen = 7 then IntroNextSection : 335 else IntroNextSection := 331:
Synchronize(UpdatelntroBar):
end:
330: begin ParamTalk := Toy. Talk (IntroPath+'in55.wav','S'):
Synchronize(UpdateIntroHarl;
IntroNextSection := 331;
end;
331: begin ParamListen .= Toy. Listen ('too cold/7',12,'SR','W'):
if ParamListen = 7 then IntroNextSection := 335 else IntroNextSection := 340;
Synchronize(UpdateIntroBar)~
end:
335: begin ParamTalk .= Toy. Talk (IntroPath+'in55m.wav','E'1:
Synchronize(UpdateIntroBar):
IntroNextSection := 336;
end:
336: begin ParamTalk .= Toy. Talk (IntroPath+'in54.wav','S');
SynchronizefUpdateIntroBar):
IntroNextSection := 340 end:
340: begin ParamTalk .= Toy. Talk (IntroPath+'in56.wav','S');
Synchronize(UpdateIntroBarl:
IntroNextSection := 345:
end;
345: begin ParamListen .= Toy. Listen ('just right/7',12,'SR','W'I:
if ParamListen = 7 then IntroNextSection := 355 else IntroNextSection : 350;
Synchronize(UpdateIntroBar):

SUBSTITUTE SKEET (RULE 26) end;
350: begin ParamTalk .= Toy. Talk (IntroPath+'in58.wav','S');
Synchronize(UpdatelntroBar);
IntroNextSection :- 351;
end;
351: begin ParamListen .= Toy. Listen ('just right/7'.12,'SR','W');
if ParamListen = 7 then IntroNextSection := 355 else IntroNextSection :- 360;
Synchronize(UpdatelntroBar);
end;
355: begin ParamTalk .= Toy. Talk (IntroPath+'in58m.wav','EL');
Synchronize(UpdateIntroBar);
IntroNextSection := 356;
end;
356: begin ParamTalk .= Toy. Talk (IntroPath+'in57.wav','S');
Synchronize(UpdateIntroBar);
IntroNextSection := 360;
end:
360: begin ParamTalk .= Toy. Talk (IntroPath+'in59.wav','S');
Synchronize(UpdateIntroBar);
IntroNextSection := 365;
end:
365: begin ParamListen .= Toy. Listen ('bears/7',12,'SR','W');
if ParamListen = 7 then IntroNextSection :- 370 else IntroNextSection := 371;
Synchronize(UpdateIntroBar);
end;
370: begin ParamTalk .= Toy. Talk (IntroPath+'in60.wav','S');
Synchronize(UpdateIntroBar);

SUBSTITUTE SHEET (RULE 26) IntroNextSection : 375;
end;
371: begin ParamTalk .= Toy. Talk (IntroPath+'in60a.wav','S');
Synchronize(UpdateIntroBar);
IntroNextSection := 375;
end;
375: begin ParamTalk .= Toy. Talk (IntroPath+'in6lm.wav','SP11');
Synchronize(UpdateIntroBar);
IntroNextSection : 385:
end:
380: begin (ParamTalk .= Toy. Talk (IntroPath+'in6l.wav','S');
Synchronize(UpdateIntzoBar);
IntroNextSection := 385;) end:
385: begin ParamListen .= Toy. Listen ('too hot/l,too cold/2,just right/3', 12,'SR','W'):
case ParamListen of 1: IntroNextSection := 390;
2: IntroNextSection := 400;
3: IntroNextSection := 405;
else IntroNextSection :- 410;
end;
Synchronize(UpdateIntroBar);
end;
390: begin ParamTalk .= Toy. Talk (IntroEath+'in62.wav','S');
Synchronize(UpdateIntroBar);
IntroNextSection := 415;
end;
400: begin ParamTalk .= Toy. Talk (IntroPath+'in63.wav','S');
Synchronize(UpdateIntroBar);
IntroNextSection := 915;

SUBSTITUTE SHEET (RULE 26) DEMANDES OU BREVETS VOLUMINEUX
LA PRESENTS PARTIE DE CETTE DEMANDS OU CE BREVET
COMPREND PLUS D'UN TOME.
CECI EST LE TOME ~ DE 3 NOTE: Pour les tomes additionels, veuillez contacter le Bureau canadien des brevets JUMBO APPLlCATlONS/PATENTS

THAN ONE VOLUME
THIS !S VOLUME ,~ OF -F
f NOTE:.For additional volumes please contact'the Canadian Patent Office

Claims (57)

1. Interactive toy apparatus comprising:
a toy having a fanciful physical appearance;
a speaker mounted on the toy;
a user input receiver;
a user information storage unit storing information relating to at least one user:
a consent controller operative in response to current user inputs received via said user input receiver and to information stored in said storage unit for providing audio content to said user via said speaker.
2. Interactive toy apparatus according to claim 1 and wherein said user input receiver includes an audio receiver.
3. Interactive toy apparatus according to claim 2 wherein said current user input comprises a verbal input received via said audio receiver.
4. Interactive toy apparatus according to claim 1 and wherein said user input receiver includes a tactile input receiver.
5. Interactive toy apparatus according to claim 1 and wherein said storage unit stores personal information relating to at least one user and said content controller is operative to personalize said audio content.
6. Interactive toy apparatus according to claim 1 and wherein said storage unit stores information relating to the interaction of at least one user with said interactive toy apparatus and said content controller is operative to control said audio content in accordance with stored information relating to past interaction of said at least one user with said interactive toy apparatus.
7. Interactive toy apparatus according to claim 5 and wherein said storage unit also stores information relating to the interaction of at least one user with said interactive toy apparatus and said content controller also is operative to control said audio content in accordance with information relating to past interaction of said at least one user with said interactive toy apparatus.
8. Interactive toy apparatus according to claim 1 and wherein said storage unit stores information input verbally by a user via said user input receiver.
9. Interactive toy apparatus according to claim 5 and wherein said storage unit stores information input verbally by a user via said user input receiver.
10. Interactive toy apparatus according to claim 7 and wherein said storage unit stores information input verbally by a user via said user input receiver.
11. Interactive toy apparatus according to claim 1 and also comprising a content storage unit storing audio contents of at least one content title to be played to a user via the speaker, said at least one content title being interactive and containing interactive branching.
12. Interactive toy apparatus according to claim 11 wherein said at least one content title comprises:
a plurality of audio files storing a corresponding plurality of content title sections including:
at least one two alternative content title sections; and a script defining branching between said alternative user sections in response to any of a user input, an environmental condition, a past interaction, personal information related to a user, a remote computer, and a time-related condition.
13. Interactive toy apparatus according to claim 5 and also comprising a content storage unit storing audio contents of at least one content title to be played to a user via the speaker, said at least one content title being interactive and containing interactive branching.
14. Interactive toy apparatus according to claim 13 wherein said at least one content title comprises a plurality of parallel sections of content elements including at least two alternative sections and a script defining branching between alternative sections in a personalized manner.
15. Interactive toy apparatus according to claim 1 and wherein said user information storage unit is located at least partially in said toy.
16. Interactive toy apparatus according to claim 1 and wherein said user information storage unit is located at least partially outside said toy.
17. Interactive toy apparatus according to claim 1 and wherein said content storage unit is located at least partially in said toy.
18. Interactive toy apparatus according to claim 1 and wherein said content storage unit is located at least partially outside said toy.
19. Interactive toy apparatus according to claim 1 wherein the user input receiver comprises:
a microphone mounted on the toy; and a speech recognition unit receiving a speech input from the microphone.
20. Interactive toy apparatus according to claim 5 wherein the user information storage unit is operative to store said personal information related to a plurality of users each identifiable with a unique code and wherein said content controller is operative to prompt any of said users to provide said user's code.
21. Interactive toy apparatus according to claim 5 wherein the user information storage unit is operative to store information regarding a user's participation performance.
22. Toy apparatus having changing facial expressions, the toy comprising:
multi-featured face apparatus including a plurality of multi-positionable facial features; and a facial expression control unit operative to generate at least three combinations of positions of said plurality of facial features representing at least two corresponding facial expressions.
23. Apparatus according to claim 22 wherein the facial expression control unit is operative to cause the features to fluctuate between positions at different rates, thereby to generate an illusion of different emotions.
24. Toy apparatus according to claim 22 and also comprising:
a speaker device;
an audio memory storing an audio pronouncement; and an audio output unit operative to control output of the audio pronouncement by the speaker device, and wherein the facial expression control unit is operative to generate the combinations of positions synchronously with output of the pronouncement.
25. Toy apparatus for playing an interactive verbal game comprising:
a toy;
a speaker device mounted on the toy;
a microphone mounted on the toy;
a speech recognition unit receiving a speech input from the microphone;
and an audio storage unit storing:

a multiplicity of verbal game segments to be played through the speaker device; and a script storage defining interactive branching between the verbal game segments.
26. Toy apparatus according to claim 25 wherein the verbal game segments include at least one segment which prompts a user to generate a spoken input to the verbal game.
27. Toy apparatus according to claim 25 wherein at least one segment includes two or more verbal strings and a prompt to said user to reproduce one of the verbal strings.
28. Toy apparatus according to claim 25 wherein at least one segment comprises a riddle.
29. Toy apparatus according to claim 25 wherein at least one of the verbal strings has educational content.
30. Toy apparatus according to claim 25 wherein at least one of the verbal strings comprises a feedback to said user regarding the quality of said user's performance in the game.
31. Interactive toy apparatus according to claim 1 and further comprising:
multi-featured face apparatus assembled with said toy including a plurality of multi-positionable facial features; and a facial expression control unit operative to generate at least three combinations of positions of said plurality of facial features representing at least two corresponding facial expressions.
32. Interactive toy apparatus according to claim 31 wherein the facial expression control unit is operative to cause the features to fluctuate between positions at different rates, thereby to generate an illusion of different emotions.
33. Interactive toy apparatus according to claim 31 and also comprising:
an audio memory storing an audio pronouncement; and an audio output unit operative to control output of the audio pronouncement by the speaker device, and wherein the facial expression control unit is operative to generate the combinations of positions synchronously with output of the pronouncement.
34. Interactive toy apparatus according to claim 1 and further comprising:
a microphone mounted on the toy;
a speech recognition unit receiving a speech input from the microphone;
and an audio storage unit storing:
a multiplicity of verbal game segments of a verbal game to be played through the speaker device; and a script storage defining interactive branching between the verbal game segments.
35. Interactive toy apparatus according to claim 34 wherein the verbal game segments include at least one segment which prompts a user to generate a spoken input to the verbal game.
36. Interactive toy apparatus according to claim 34 wherein at least one segment includes two or more verbal strings and a prompt to said user to reproduce one of the verbal strings.
37. Interactive toy apparatus according to claim 34 wherein at least one segment comprises a riddle.
38. Interactive toy apparatus according to claim 34 wherein at least one of the verbal strings has educational content.
39. Interactive toy apparatus according to claim 34 wherein at least one of the verbal strings comprises a feedback to said user regarding the quality of said user's performance in the game.
40. A method of toy interaction comprising:
providing a toy having a fanciful physical appearance;
providing a speaker mounted on the toy;
providing a user input receiver;
storing in a user information storage unit information relating to at least one user:
providing, via a content controller operative in response to current user inputs received via said user input receiver and to information stored in said storage unit, audio content to said user via said speaker.
41. A method according to claim 40 and wherein said storing step comprises storing personal information relating to at least one user and personalizing, via said content controller, said audio content.
42. A method according to claim 40 and wherein said storing step comprises storing information relating to the interaction of at least one user with said interactive toy apparatus and controlling, via said content controller, said audio content in accordance with stored information relating to past interaction of said at least one user with said interactive toy apparatus.
43. A method according to claim 40 and further comprising storing, in a content storage unit, audio contents of at least one content title to be played to a user via the speaker, said at least one content title being interactive and containing interactive branching.
44. A method according to claim 40 and further comprising storing personal information related to a plurality of users each identifiable with a unique code and prompting, via said content controller, any of said users to provide said user's code.
45. A method according to claim 40 and further comprising storing information regarding a user's participation performance.
46. A method according to claim 40 and further comprising:
providing multi-featured face apparatus including a plurality of multi-positionable facial features; and generating at least three combinations of positions of said plurality of facial features representing at least two corresponding facial expressions.
47. A method according to claim 46 and further comprising causing the features to fluctuate between positions at different rates, thereby to generate an illusion of different emotions.
48. A method according to claim 46 and also comprising:
storing an audio pronouncement; and providing said audio pronouncement by said speaker; and generating combinations of facial positions synchronously with output of the pronouncement.
49. A system for teaching programming to schoolchildren using interactive objects, the system comprising:
a computerized school-child interface permitting a school-child to breathe life into an interactive object by defining characteristics of the interactive object, said computerized school-child interface being operative to at least partially define, in response to school-child inputs, interactions between said interactive object and humans;
and a computerized teacher interface permitting a teacher to monitor the school-child's progress in defining characteristics of the interactive object.
50. A system according to claim 49 wherein the computerized teacher interface permits the teacher to configure the computerized school-child interface.
51. A teaching system for teaching engineering and programming of interactive objects to students, the system comprising:
a computerized student interface permitting a student to breathe life into an interactive object by defining characteristics of the interactive object, said computerized user interface being operative to at least partially define, in response to student inputs, interactions between said interactive object and humans; and a computerized teacher interface permitting a teacher to configure the computerized student interface.
52. A computer system for development of emotionally perceptive computerized creatures comprising:
a computerized user interface permitting a user to develop an emotionally perceptive computer-controlled creature by defining interactions between the emotionally perceptive computer-controlled creature and natural humans including at least one response of said emotionally perceptive computer-controlled creature to at least one parameter, indicative of natural human emotion, derived from a stimulus provided by the natural human; and a creature control unit operative to control the emotionally perceptive creature in accordance with the characteristics and interactions defined by the user.
53. A system according to claim 52 wherein said parameter indicative of natural human emotion comprises a characteristic of natural human speech other than language content thereof.
54. A method for development of emotionally perceptive computerized creatures, the method comprising:
defining interactions between the emotionally perceptive computer-controlled creature and natural humans including at least one response of said emotionally perceptive computer-controlled creature to at least one parameter, indicative of natural human emotion, derived from a stimulus provided by the natural human; and controlling the emotionally perceptive creature in accordance with the characteristics and interactions defined by the user.
55. A method for teaching programming to students, the method comprising:
providing a computerized visual-programming based school-child interface permitting a school-child to perform visual programming; and providing a computerized teacher interface permitting a teacher to configure the computerized school-child interface.
56. A computerized emotionally perceptive computerized creature comprising:
a plurality of interaction modes operative to carry out a corresponding plurality of interactions with natural humans including at least one response to at least one natural human emotion parameter, indicative of natural human emotion; and an emotion perception unit operative to derive at least one natural human emotion parameter from a stimulus provided by the natural human, and to supply the parameter to at least one of the plurality of interaction modes.
57. A creature according to claim 56 and also comprising a physical body operative to participate in at least one of the plurality of interactions.
CA002296119A 1998-04-16 1999-04-15 Interactive toy Abandoned CA2296119A1 (en)

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
IL12412298 1998-04-16
IL124122 1998-04-16
US09/081,255 1998-05-19
US09/081,255 US6160986A (en) 1998-04-16 1998-05-19 Interactive toy
PCT/IL1999/000202 WO1999054015A1 (en) 1998-04-16 1999-04-15 Interactive toy

Publications (1)

Publication Number Publication Date
CA2296119A1 true CA2296119A1 (en) 1999-10-28

Family

ID=26323628

Family Applications (1)

Application Number Title Priority Date Filing Date
CA002296119A Abandoned CA2296119A1 (en) 1998-04-16 1999-04-15 Interactive toy

Country Status (7)

Country Link
US (1) US6959166B1 (en)
EP (1) EP0991453A1 (en)
JP (1) JP3936749B2 (en)
CN (1) CN1272800A (en)
AU (1) AU3343199A (en)
CA (1) CA2296119A1 (en)
WO (1) WO1999054015A1 (en)

Families Citing this family (114)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6663393B1 (en) * 1999-07-10 2003-12-16 Nabil N. Ghaly Interactive play device and method
US7878905B2 (en) 2000-02-22 2011-02-01 Creative Kingdoms, Llc Multi-layered interactive play experience
US7445550B2 (en) 2000-02-22 2008-11-04 Creative Kingdoms, Llc Magical wand and interactive play experience
US6773344B1 (en) 2000-03-16 2004-08-10 Creator Ltd. Methods and apparatus for integration of interactive toys with interactive television and cellular communication systems
JP2001277166A (en) * 2000-03-31 2001-10-09 Sony Corp Robot and behaivoir determining method therefor
WO2002013935A1 (en) * 2000-08-12 2002-02-21 Smirnov Alexander V Toys imitating characters behaviour
KR100396752B1 (en) * 2000-08-17 2003-09-02 엘지전자 주식회사 Scholarship/growth system and method using a merchandise of toy
JP4296714B2 (en) * 2000-10-11 2009-07-15 ソニー株式会社 Robot control apparatus, robot control method, recording medium, and program
KR20020044736A (en) * 2000-12-06 2002-06-19 스마트아이엔티 주식회사 Apparatus and method for an education terminal
US6910186B2 (en) 2000-12-08 2005-06-21 Kyunam Kim Graphic chatting with organizational avatars
US9625905B2 (en) * 2001-03-30 2017-04-18 Immersion Corporation Haptic remote control for toys
US7457752B2 (en) * 2001-08-14 2008-11-25 Sony France S.A. Method and apparatus for controlling the operation of an emotion synthesizing device
KR100624403B1 (en) * 2001-10-06 2006-09-15 삼성전자주식회사 Human nervous-system-based emotion synthesizing device and method for the same
US7037455B2 (en) * 2001-12-21 2006-05-02 Mattel, Inc. Insert molding method
US20070066396A1 (en) 2002-04-05 2007-03-22 Denise Chapman Weston Retail methods for providing an interactive product to a consumer
US20040043373A1 (en) * 2002-09-04 2004-03-04 Kaiserman Jeffrey M. System for providing computer-assisted development
US7238079B2 (en) * 2003-01-14 2007-07-03 Disney Enterprise, Inc. Animatronic supported walking system
US7248170B2 (en) * 2003-01-22 2007-07-24 Deome Dennis E Interactive personal security system
US9446319B2 (en) * 2003-03-25 2016-09-20 Mq Gaming, Llc Interactive gaming toy
JP4007224B2 (en) * 2003-03-27 2007-11-14 株式会社デンソー Robot fixing device
US7862428B2 (en) 2003-07-02 2011-01-04 Ganz Interactive action figures for gaming systems
US7534157B2 (en) 2003-12-31 2009-05-19 Ganz System and method for toy adoption and marketing
AU2004309432B2 (en) 2003-12-31 2010-12-09 2121200 Ontario Inc. System and method for toy adoption and marketing
US7465212B2 (en) * 2003-12-31 2008-12-16 Ganz System and method for toy adoption and marketing
US20070299694A1 (en) * 2006-06-26 2007-12-27 Merck David E Patient education management database system
US20080039247A1 (en) * 2006-08-02 2008-02-14 Sandra L. Uhler Footbag And A System Relating Thereto
EP1895505A1 (en) 2006-09-04 2008-03-05 Sony Deutschland GmbH Method and device for musical mood detection
EP1912193A1 (en) * 2006-10-02 2008-04-16 Koninklijke Philips Electronics N.V. Interactive storyteller system
US8307295B2 (en) 2006-10-03 2012-11-06 Interbots Llc Method for controlling a computer generated or physical character based on visual focus
US20080082301A1 (en) * 2006-10-03 2008-04-03 Sabrina Haskell Method for designing and fabricating a robot
US20080082214A1 (en) * 2006-10-03 2008-04-03 Sabrina Haskell Method for animating a robot
AU2007237363B2 (en) 2006-12-06 2010-04-29 2121200 Ontario Inc. Feature codes and bonuses in virtual worlds
TW200824767A (en) * 2006-12-08 2008-06-16 Yu-Hsi Ho Materialization system for virtual object and method thereof
EP2121155A1 (en) * 2007-02-12 2009-11-25 IM Smiling B.V. Method for controlling an external device via the usb-port of a personal computer
US20080195724A1 (en) * 2007-02-14 2008-08-14 Gopinath B Methods for interactive multi-agent audio-visual platforms
TW200836893A (en) * 2007-03-01 2008-09-16 Benq Corp Interactive home entertainment robot and method of controlling the same
US20080287033A1 (en) * 2007-05-17 2008-11-20 Wendy Steinberg Personalizable Doll
US8128500B1 (en) 2007-07-13 2012-03-06 Ganz System and method for generating a virtual environment for land-based and underwater virtual characters
CN101411946B (en) * 2007-10-19 2012-03-28 鸿富锦精密工业(深圳)有限公司 Toy dinosaur
US20090117819A1 (en) * 2007-11-07 2009-05-07 Nakamura Michael L Interactive toy
US8088002B2 (en) * 2007-11-19 2012-01-03 Ganz Transfer of rewards between websites
US8612302B2 (en) 2007-11-19 2013-12-17 Ganz Credit swap in a virtual world
US20090132357A1 (en) * 2007-11-19 2009-05-21 Ganz, An Ontario Partnership Consisting Of S.H. Ganz Holdings Inc. And 816877 Ontario Limited Transfer of rewards from a central website to other websites
US8626819B2 (en) 2007-11-19 2014-01-07 Ganz Transfer of items between social networking websites
JP2011507413A (en) * 2007-12-17 2011-03-03 プレイ・メガフォン・インコーポレイテッド System and method for managing bi-directional communication between a user and a bi-directional system
US8046620B2 (en) * 2008-01-31 2011-10-25 Peter Sui Lun Fong Interactive device with time synchronization capability
US8172637B2 (en) * 2008-03-12 2012-05-08 Health Hero Network, Inc. Programmable interactive talking device
CA2623966A1 (en) * 2008-04-01 2009-01-12 Ganz, An Ontario Partnership Consisting Of 2121200 Ontario Inc. And 2121 812 Ontario Inc. Reverse product purchase in a virtual environment
US20100041312A1 (en) * 2008-08-15 2010-02-18 Paul King Electronic toy and methods of interacting therewith
US20100053862A1 (en) * 2008-09-04 2010-03-04 Burnes Home Accents, Llc Modular digital image display devices and methods for providing the same
US20100100447A1 (en) * 2008-10-21 2010-04-22 Ganz Toy system and extravaganza planner
WO2010061286A1 (en) * 2008-11-27 2010-06-03 Stellenbosch University A toy exhibiting bonding behaviour
US8255807B2 (en) 2008-12-23 2012-08-28 Ganz Item customization and website customization
US8909414B2 (en) 2009-12-14 2014-12-09 Volkswagen Ag Three-dimensional corporeal figure for communication with a passenger in a motor vehicle
US8843553B2 (en) 2009-12-14 2014-09-23 Volkswagen Ag Method and system for communication with vehicles
US20110202863A1 (en) * 2010-02-18 2011-08-18 Corrallo Charles Shane Computer Entertainment Tracker Application for Limiting Use of Specific Computer Applications and Method of Use
US8308667B2 (en) * 2010-03-12 2012-11-13 Wing Pow International Corp. Interactive massaging device
US8836719B2 (en) 2010-04-23 2014-09-16 Ganz Crafting system in a virtual environment
US8775341B1 (en) 2010-10-26 2014-07-08 Michael Lamport Commons Intelligent control with hierarchical stacked neural networks
US9015093B1 (en) 2010-10-26 2015-04-21 Michael Lamport Commons Intelligent control with hierarchical stacked neural networks
US20120185254A1 (en) * 2011-01-18 2012-07-19 Biehler William A Interactive figurine in a communications system incorporating selective content delivery
US9180380B2 (en) 2011-08-05 2015-11-10 Mattel, Inc. Toy figurine with internal lighting effect
US20140178847A1 (en) 2011-08-16 2014-06-26 Seebo Interactive Ltd. Connected Multi Functional System and Method of Use
US20130268119A1 (en) * 2011-10-28 2013-10-10 Tovbot Smartphone and internet service enabled robot systems and methods
US9079113B2 (en) * 2012-01-06 2015-07-14 J. T. Labs Limited Interactive personal robotic apparatus
US8721456B2 (en) 2012-02-17 2014-05-13 Ganz Incentivizing playing between websites
US9649565B2 (en) * 2012-05-01 2017-05-16 Activision Publishing, Inc. Server based interactive video game with toys
CN103505880A (en) * 2012-06-29 2014-01-15 新昌县冠阳技术开发有限公司 Intelligent interaction device for user and toy under condition that user and toy do not meet
US10223636B2 (en) 2012-07-25 2019-03-05 Pullstring, Inc. Artificial intelligence script tool
US8972324B2 (en) 2012-07-25 2015-03-03 Toytalk, Inc. Systems and methods for artificial intelligence script modification
US10528385B2 (en) 2012-12-13 2020-01-07 Microsoft Technology Licensing, Llc Task completion through inter-application communication
US9675895B2 (en) 2013-03-13 2017-06-13 Hasbro, Inc. Three way multidirectional interactive toy
US9259659B2 (en) 2013-04-30 2016-02-16 Mattel, Inc. Twist-waist punching figure
US20140329433A1 (en) * 2013-05-06 2014-11-06 Israel Carrero Toy Stuffed Animal with Remote Video and Audio Capability
US10043412B2 (en) 2013-05-26 2018-08-07 Dean Joseph Lore System for promoting travel education
US9406240B2 (en) * 2013-10-11 2016-08-02 Dynepic Inc. Interactive educational system
CN104679378A (en) * 2013-11-27 2015-06-03 苏州蜗牛数字科技股份有限公司 Music media playing mode based on virtual head portrait
CN103778576A (en) * 2014-01-24 2014-05-07 成都万先自动化科技有限责任公司 Bodybuilding consultation service robot
CN103761932A (en) * 2014-01-24 2014-04-30 成都万先自动化科技有限责任公司 Robot for weather forecast broadcasting service
CN103753579A (en) * 2014-01-24 2014-04-30 成都万先自动化科技有限责任公司 News broadcasting service robot
CN103753538A (en) * 2014-01-24 2014-04-30 成都万先自动化科技有限责任公司 Company conference explanation service robot
CN103753582A (en) * 2014-01-24 2014-04-30 成都万先自动化科技有限责任公司 Noctivagation safety service robot
US9925456B1 (en) 2014-04-24 2018-03-27 Hasbro, Inc. Single manipulatable physical and virtual game assembly
US20150238879A1 (en) * 2014-05-23 2015-08-27 Bluniz Creative Technology Corporation Remote interactive media
JP6547244B2 (en) * 2014-06-30 2019-07-24 カシオ計算機株式会社 Operation processing apparatus, operation processing method and program
JP6476608B2 (en) * 2014-06-30 2019-03-06 カシオ計算機株式会社 Operation processing apparatus, operation processing method, and program
US9962615B2 (en) 2014-07-30 2018-05-08 Hasbro, Inc. Integrated multi environment interactive battle game
CN104575502A (en) * 2014-11-25 2015-04-29 百度在线网络技术(北京)有限公司 Intelligent toy and voice interaction method thereof
US10089772B2 (en) 2015-04-23 2018-10-02 Hasbro, Inc. Context-aware digital play
CN104959985B (en) * 2015-07-16 2017-10-17 深圳狗尾草智能科技有限公司 The control system and its method of a kind of robot
KR101824977B1 (en) * 2015-11-26 2018-02-02 엠텍씨앤케이 주식회사 a interworking method between contents and external devices
GB2540831B (en) * 2016-02-05 2018-04-04 The Eyelash Trainer Ltd An Apparatus for Practicing the Application of Eyelash Extensions
JP6763167B2 (en) * 2016-03-23 2020-09-30 カシオ計算機株式会社 Learning support device, learning support system, learning support method, robot and program
JP6756130B2 (en) * 2016-03-23 2020-09-16 カシオ計算機株式会社 Learning support device, robot, learning support system, learning support method and program
US10491380B2 (en) * 2016-03-31 2019-11-26 Shenzhen Bell Creative Science and Education Co., Ltd. Firmware of modular assembly system
HK1216278A (en) * 2016-04-27 2016-10-28 Kam Ming Lau An education system using virtual robots
CN106205612B (en) * 2016-07-08 2019-12-24 北京光年无限科技有限公司 Information processing method and system for intelligent robot
US10839325B2 (en) * 2016-11-06 2020-11-17 Microsoft Technology Licensing, Llc Efficiency enhancements in task management applications
CA3043016A1 (en) * 2016-11-10 2018-05-17 Warner Bros. Entertainment Inc. Social robot with environmental control feature
US11045738B1 (en) 2016-12-13 2021-06-29 Hasbro, Inc. Motion and toy detecting body attachment
CN106823378A (en) * 2017-02-20 2017-06-13 包伯瑜 A kind of role playing toy system
US10758828B1 (en) 2017-03-17 2020-09-01 Hasbro, Inc. Music mash up collectable card game
US10354176B1 (en) 2017-05-03 2019-07-16 Amazon Technologies, Inc. Fingerprint-based experience generation
US10965391B1 (en) * 2018-01-29 2021-03-30 Amazon Technologies, Inc. Content streaming with bi-directional communication
CN110400494A (en) * 2018-04-25 2019-11-01 北京快乐智慧科技有限责任公司 A kind of method and system that children stories play
CN108671552A (en) * 2018-05-03 2018-10-19 深圳市沃特沃德股份有限公司 intelligent toy control method and device
CN110035166B (en) * 2019-03-20 2021-03-26 广州美术学院 Interaction device associated with multiple mobile phone terminals
CN110211434A (en) * 2019-05-30 2019-09-06 江苏科斗教育科技有限公司 A kind of artificial intelligence education programming robot
RU2712349C1 (en) * 2019-07-24 2020-01-28 Федеральное государственное бюджетное образовательное учреждение высшего образования "Поволжский государственный технологический университет" Toy
US11389735B2 (en) 2019-10-23 2022-07-19 Ganz Virtual pet system
US11358059B2 (en) 2020-05-27 2022-06-14 Ganz Live toy system
TWI774208B (en) * 2021-01-22 2022-08-11 國立雲林科技大學 Story representation system and method thereof
DE102021106403A1 (en) * 2021-03-16 2022-09-22 Geobra Brandstätter Stiftung & Co. Kg Game system and game with the game system
CN115382224B (en) * 2022-08-02 2024-02-13 奥飞娱乐股份有限公司 Face switching mechanism of toy carrier and toy carrier

Family Cites Families (56)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE3009040C2 (en) 1980-03-08 1982-05-19 Hermann Dr. 8510 Fürth Neuhierl Toy vehicle with electric power supply, electric drive and radio remote control
JPS6055985A (en) 1983-09-05 1985-04-01 株式会社トミー Sound recognizing toy
US4679789A (en) 1983-12-26 1987-07-14 Kabushiki Kaisha Universal Video game apparatus with automatic skill level adjustment
US4712184A (en) 1984-09-12 1987-12-08 Haugerud Albert R Computer controllable robotic educational toy
US4642710A (en) * 1985-03-15 1987-02-10 Milton Bradley International, Inc. Animated display controlled by an audio device
US4670848A (en) 1985-04-10 1987-06-02 Standard Systems Corporation Artificial intelligence system
US4752068A (en) * 1985-11-07 1988-06-21 Namco Ltd. Video game machine for business use
EP0265438A1 (en) 1986-05-02 1988-05-04 SIROTA, Vladimir Toy
US4802879A (en) 1986-05-05 1989-02-07 Tiger Electronics, Inc. Action figure toy with graphics display
US4846693A (en) * 1987-01-08 1989-07-11 Smith Engineering Video based instructional and entertainment system using animated figure
US4840602A (en) * 1987-02-06 1989-06-20 Coleco Industries, Inc. Talking doll responsive to external signal
US4857030A (en) 1987-02-06 1989-08-15 Coleco Industries, Inc. Conversing dolls
US4923428A (en) * 1988-05-05 1990-05-08 Cal R & D, Inc. Interactive talking toy
US4858930A (en) * 1988-06-07 1989-08-22 Namco, Ltd. Game system
US5241142A (en) 1988-06-21 1993-08-31 Otis Elevator Company "Artificial intelligence", based learning system predicting "peak-period" ti
US4959037A (en) 1989-02-09 1990-09-25 Henry Garfinkel Writing doll
US5195920A (en) 1989-02-16 1993-03-23 Collier Harry B Radio controlled model vehicle having coordinated sound effects system
SE466029B (en) 1989-03-06 1991-12-02 Ibm Svenska Ab DEVICE AND PROCEDURE FOR ANALYSIS OF NATURAL LANGUAGES IN A COMPUTER-BASED INFORMATION PROCESSING SYSTEM
US5109222A (en) 1989-03-27 1992-04-28 John Welty Remote control system for control of electrically operable equipment in people occupiable structures
US5142803A (en) 1989-09-20 1992-09-01 Semborg-Recrob, Corp. Animated character system with real-time control
US5182557A (en) 1989-09-20 1993-01-26 Semborg Recrob, Corp. Motorized joystick
US5021878A (en) 1989-09-20 1991-06-04 Semborg-Recrob, Corp. Animated character system with real-time control
GB8922140D0 (en) 1989-10-02 1989-11-15 Blue Box Toy Factory A toy musical box
US5191615A (en) 1990-01-17 1993-03-02 The Drummer Group Interrelational audio kinetic entertainment system
AU652209B2 (en) * 1990-11-14 1994-08-18 Robert Macandrew Best Talking video games
US5261820A (en) * 1990-12-21 1993-11-16 Dynamix, Inc. Computer simulation playback method and simulation
DE69230968D1 (en) 1991-03-04 2000-05-31 Inference Corp CASE-BASED DEDUCTIVE SYSTEM
US5209695A (en) 1991-05-13 1993-05-11 Omri Rothschild Sound controllable apparatus particularly useful in controlling toys and robots
EP0527527B1 (en) 1991-08-09 1999-01-20 Koninklijke Philips Electronics N.V. Method and apparatus for manipulating pitch and duration of a physical audio signal
US5281143A (en) 1992-05-08 1994-01-25 Toy Biz, Inc. Learning doll
US5377103A (en) 1992-05-15 1994-12-27 International Business Machines Corporation Constrained natural language interface for a computer that employs a browse function
US5369575A (en) 1992-05-15 1994-11-29 International Business Machines Corporation Constrained natural language interface for a computer system
US5434777A (en) 1992-05-27 1995-07-18 Apple Computer, Inc. Method and apparatus for processing natural language
US5390281A (en) 1992-05-27 1995-02-14 Apple Computer, Inc. Method and apparatus for deducing user intent and providing computer implemented services
US5390282A (en) 1992-06-16 1995-02-14 John R. Koza Process for problem solving using spontaneously emergent self-replicating and self-improving entities
US5270480A (en) 1992-06-25 1993-12-14 Victor Company Of Japan, Ltd. Toy acting in response to a MIDI signal
ES2143509T3 (en) 1992-09-04 2000-05-16 Caterpillar Inc INTEGRATED EDITION AND TRANSLATION SYSTEM.
JPH0689274A (en) 1992-09-08 1994-03-29 Hitachi Ltd Method and system for supporting judgement
AU5363494A (en) 1992-10-19 1994-05-09 Jeffrey Scott Jani Video and radio controlled moving and talking device
US5615112A (en) 1993-01-29 1997-03-25 Arizona Board Of Regents Synthesized object-oriented entity-relationship (SOOER) model for coupled knowledge-base/database of image retrieval expert system (IRES)
US5388493A (en) 1993-11-17 1995-02-14 Curletto; Giorgio F. Extra low profile housing for vertical dual keyboard MIDI wireless controller for accordionists
US5694558A (en) 1994-04-22 1997-12-02 U S West Technologies, Inc. Method and system for interactive object-oriented dialogue management
US5704018A (en) 1994-05-09 1997-12-30 Microsoft Corporation Generating improved belief networks
US5733131A (en) * 1994-07-29 1998-03-31 Seiko Communications Holding N.V. Education and entertainment device with dynamic configuration and operation
US5724074A (en) 1995-02-06 1998-03-03 Microsoft Corporation Method and system for graphically programming mobile toys
US5636994A (en) * 1995-11-09 1997-06-10 Tong; Vincent M. K. Interactive computer controlled doll
US5752880A (en) 1995-11-20 1998-05-19 Creator Ltd. Interactive doll
US5779486A (en) 1996-03-19 1998-07-14 Ho; Chi Fai Methods and apparatus to assess and enhance a student's understanding in a subject
US5727951A (en) 1996-05-28 1998-03-17 Ho; Chi Fai Relationship-based computer-aided-educational system
US6134590A (en) * 1996-04-16 2000-10-17 Webtv Networks, Inc. Method and apparatus for automatically connecting devices to a local network
US5700178A (en) * 1996-08-14 1997-12-23 Fisher-Price, Inc. Emotional expression character
IL120857A (en) * 1997-05-19 2003-03-12 Creator Ltd Programmable assembly toy
US20010032278A1 (en) * 1997-10-07 2001-10-18 Brown Stephen J. Remote generation and distribution of command programs for programmable devices
US6160986A (en) 1998-04-16 2000-12-12 Creator Ltd Interactive toy
US6663393B1 (en) * 1999-07-10 2003-12-16 Nabil N. Ghaly Interactive play device and method
US6439956B1 (en) * 2000-11-13 2002-08-27 Interact Accessories, Inc. RC car device

Also Published As

Publication number Publication date
US6959166B1 (en) 2005-10-25
EP0991453A1 (en) 2000-04-12
WO1999054015A1 (en) 1999-10-28
JP2002505614A (en) 2002-02-19
CN1272800A (en) 2000-11-08
AU3343199A (en) 1999-11-08
JP3936749B2 (en) 2007-06-27

Similar Documents

Publication Publication Date Title
US6959166B1 (en) Interactive toy
US6160986A (en) Interactive toy
US11158202B2 (en) Systems and methods for customized lesson creation and application
US20190236975A1 (en) Integrated development environment for visual and text coding
Druin et al. Robots for kids: exploring new technologies for learning
Kelleher Motivating Programming: Using storytelling to make computer programming attractive to middle school girls
US20020068500A1 (en) Adaptive toy system and functionality
Fontijn et al. StoryToy the interactive storytelling toy
WO2001012285A9 (en) Networked toys
Resner Rover@ Home: Computer mediated remote interaction between humans and dogs
Williams PopBots: leveraging social robots to aid preschool children's artificial intelligence education
US20040043373A1 (en) System for providing computer-assisted development
JP2006308815A (en) Electronic learning system and electronic system
Rizzo et al. UAPPI: A platform for extending app inventor towards the worlds of IoT and machine learning
EP3576075A1 (en) Operating a toy for speech and language assessment and therapy
Lauwers Aligning capabilities of interactive educational tools to learner goals
Slootmaker EMERGO: a generic platform for authoring and playing scenario-based serious games
Bergqvist When Code Becomes Play: Appropriation in the Programming of Outdoor Play Spaces
Bonetti Design and implementation of an actor robot for a theatrical play
Paracha et al. Examining the theoretical schema of shimpai muyou! Narrative learning environment
Hamidi Rafigh: A Living Media System for Motivating Target Application Use for Children
Gnoli nutella: the construction and enactment of simulated macroworlds
Johnson et al. Keeping Mindful of Modality: A Comparison of Computer Science Education Resources for Learning
Verschoor et al. Using a social robot as facilitator in usability-tests with children
Laamanen Architecture for theatre robotics

Legal Events

Date Code Title Description
FZDE Discontinued

Effective date: 20030415