USRE39435E1 - Learning system with learner-constructed response based methodology - Google Patents


Info

Publication number
USRE39435E1
Authority
US
United States
Prior art keywords
learner
presenting
display
constructed response
storage medium
Prior art date
Legal status
Expired - Lifetime, expires
Application number
US10/653,748
Inventor
Dennis Ray Berman
Current Assignee
Vxi Global Solutions LLC
Original Assignee
Drb Lit Ltd
Priority date
Filing date
Publication date
Priority to US10/653,748 (USRE39435E1)
Application filed by Drb Lit Ltd
Assigned to DRB LIT LTD. Assignment of assignors interest. Assignors: BERMAN, DENNIS RAY
Priority to US10/815,330 (US7357640B2)
Application granted
Publication of USRE39435E1
Assigned to BERMAN, DENNIS R. Security agreement. Assignors: DRB LIT LTD.
Priority to US11/925,234 (US20080076109A1)
Priority to US11/924,844 (US20080076107A1)
Assigned to BERMAN, DENNIS R., MR. Patent security agreement. Assignors: DRB LIT LTD.
Assigned to LOCKIN, LLC. Assignment of assignors interest. Assignors: DRB LIT LTD., TRIVAC LTD.
Assigned to DRB LIT LTD. Release by secured party. Assignors: BERMAN, DENNIS R.
Assigned to MEMORY SCIENCE, LLC. Assignment of assignors interest. Assignors: LOCKIN, LLC
Assigned to VXI GLOBAL SOLUTIONS LLC. Assignment of assignors interest. Assignors: MEMORY SCIENCE, LLC
Adjusted expiration
Expired - Lifetime

Classifications

    • G: PHYSICS
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B: EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B7/00: Electrically-operated teaching apparatus or devices working with questions and answers
    • G09B7/02: Electrically-operated teaching apparatus or devices working with questions and answers of the type wherein the student is expected to construct an answer to the question which is presented or wherein the machine gives an answer to the question presented by a student
    • G09B7/04: Electrically-operated teaching apparatus or devices working with questions and answers of the type wherein the student is expected to construct an answer to the question which is presented or wherein the machine gives an answer to the question presented by a student, characterised by modifying the teaching programme in response to a wrong answer, e.g. repeating the question, supplying a further explanation

Abstract

A methodology in which a learner-constructed response is provided in answer to a question presented by the system, the response being evaluated by comparison with pre-defined expected responses and, based upon the evaluation, the system determining whether to proceed to another question or to offer remedial feedback. Such a learner-constructed response based evaluation methodology greatly reduces the potential for “guess-work” based correct responses and improves the training process through remedial feedback and advancement upon demonstration of knowledge.

Description

BACKGROUND
This invention relates to systems and methods for personnel training and, more particularly, to supervised or self-administered computer-based training systems that incorporate a learner-constructed response based testing methodology for improved evaluation of knowledge acquisition.
A variety of systems are available for automated learning and training using computers or other personal electronic devices. In current computer mediated learning and training systems, assessment of the “knowledge” gained by the user is carried out by, for example, true/false questions, matching (paired-associate) type questions, multiple choice questions, and marking questions. A multiple choice question differs from a marking question in that a multiple choice question has one correct answer, while a marking question has multiple correct answers. The foregoing question formats are not fully effective as learning aids, nor are they reliable in assessing actual knowledge, for various reasons. For example, in a true/false question, a learner has a fifty-fifty chance of answering correctly by guessing; in a four-way multiple choice question, the probability of a correct answer through guessing is twenty-five percent. Test results thus are not necessarily indicative of actual knowledge.
What is needed, therefore, is a methodology for use in computer based training that provides for improved learning, improved efficiency, and improved reliability in the assessment of a user's actual knowledge of subject matter.
SUMMARY
This invention provides a methodology in which a learner-constructed response is provided in answer to a question presented by the system, the response being evaluated by comparison with pre-defined expected responses and, based upon the evaluation, the system determining whether to proceed to another question or to offer remedial feedback. Such a learner-constructed response based evaluation methodology greatly reduces the potential for “guess-work” based correct responses and improves the training process through remedial feedback and advancement upon demonstration of knowledge.
Evaluation of responses involves identification of pre-defined keyword data pertaining to the subject matter being tested. Examples include passages of text with important keywords (keywords being defined herein to include one or more words, or phases, or related words and phases, or synonyms). Multiple choice questions may also include keywords, such that after the learner completes a sequence of reading material or any kind of current multiple-choice, mix or match, true false questions, the learner is prompted to enter answers to “fill-in-the-blank” or “verbal narrative” questions (a learner-constructed response). The learner entered responses are compared to standard solutions recorded on the system and remedial actions are provided.
The methodology may be used in a specially designed training system or in cooperation with existing computer based training systems. For every “choice” based question (e.g., multiple choice), for example, the methodology may prompt for a “user-constructed response” based upon a question that has associated with it all acceptable correct user-constructed responses to this question, the presentation to the learner being designed to include an area or mechanism for capturing a learner response either in the form of text or spoken words. The correct response is recognized if the response matches the keyword(s), e.g., primary/related keyword(s) or phrase(s) and/or synonym(s).
In one implementation, a computer program is provided for implementing a learning system with a learner-constructed response based methodology, the program including a presentation process for presenting at least one knowledge topic to the learner and for prompting the learner to enter a learner constructed response thereto; an evaluation information process for providing keyword data that corresponds to the knowledge topic; and an evaluation process for determining, based upon entry of a learner-constructed response to the knowledge topic, success or failure of the learner to know the knowledge topic, the success or failure being determined by comparison of the learner-constructed response with the keyword data.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 is a schematic block diagram illustrating a system for implementing a learning methodology of the present invention.
FIGS. 2A-2C are schematic representations illustrating components and question formats for the learning methodology.
FIGS. 3A and 3B are flow charts illustrating processes of the learning methodology.
DETAILED DESCRIPTION
In FIG. 1, the reference numeral 10 refers to a system for implementing a training methodology of the present invention, described in detail below. The system 10 includes a computer 12, having a processor 14 and a memory 16. It is understood that the computer 12 may comprise a personal or desktop computer, a television, a handheld device, or any other suitable electronic device. A display 18 and audio output 20 are connected to the computer. Inputs include a user keyboard 22, a mouse 24, or other suitable devices. The inputs are used for various purposes such as entering information initiated by the user, interacting with a software application running on the computer, etc. A disc input 26 is provided for supplying programming or content for operation on the system 10, it being understood that any suitable media may be used.
Programming, as discussed in detail below for implementing the present learning methodology, is stored on disc input 26 and/or memory 16 and is executed by the system 10. The learning methodology preferably is practiced using the foregoing system components, although it may be practiced with alternative components.
FIGS. 2A-2C illustrate schematically an example implementation of the learning methodology in which a presentation information component 28 is associated with an evaluation information component 30. A logical file 32 has encoded therein the components 28, 30 (it being understood that they may in some embodiments be encoded in distinct files). A learner evaluation and control program 34, including instructions for presentation control 36 and evaluation control 38, is provided to implement the methodology using the information contained in the file(s) 32. A graphical user interface (GUI) 40 operates with the program 34 for providing the presentation and interaction with the learner, it being understood that the GUI 40 may be implemented using commercially available software including, but not limited to, a web browser. The file(s) 32, program 34, and GUI 40 are understood to operate in conjunction with standard software on the computer 12 of the system 10, and may be stored in the memory 16, the disc input 26, or as otherwise appropriate for efficient operation.
The presentation information component 28 contains information for presenting the question, and may also include additional instructions, help information and an avenue for capturing learner-constructed responses (e.g., a text area or a record button for voice input). The evaluation information component 30 may include a sequence of phrases and, in one embodiment, these may take the form of standard HTML tags for the display of question information and a sequence of proprietary tags for the encoding of expected key-words or phrases under the “META-DATA” tag in HTML.
Referring to FIGS. 2A and 2B, in one example the presentation information component 28 includes a target knowledge component 42 comprising subject matter content for presentation (e.g., display) to a learner. As illustrated, example content may comprise a lesson on important sales techniques, to be presented in any variety of ways to the learner (e.g., text, video, graphics, sound). The evaluation information component 30 includes keywords that may include one or more primary keywords and/or phrases, related keywords and/or phrases, and/or synonyms. The keywords may also be formatted to identify negative constructs (wrong answers) and flag them. In the example of FIG. 2B, the evaluation information component includes a keyword component 44 as an associated set of words relevant to the target knowledge component 42. For example, keywords may describe one or more words or phrases that express the gist or main concepts of the topic under discussion, or the main concepts of a training session to be conveyed to the learners. Keywords may comprise one word, phrases, multiple associated words, synonyms and/or related phrases. For example, keywords may be organized as an n-tuple consisting of one main word or phrase, followed by associated synonyms and related phrases. Keywords in the component 44, as illustrated in the example of FIG. 2B, that correspond to the target knowledge component 42 are “empathy,” “understanding customer needs,” and “relating to customer requirements.” It is understood that the authors of “content” for the system 10 will supply the keyword data for the evaluation information component 30, corresponding to each passage or item to be tested in the component 28. It is understood that the target knowledge of the component 28 may be expressed as text passages, graphics, video, or multiple choice questions, or in some other fashion.
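The n-tuple keyword organization described above can be sketched as a simple data structure. This is an illustrative reading only; the `KeywordTuple` type and its field names are assumptions for exposition and are not part of the patent disclosure.

```python
from dataclasses import dataclass, field

# Hypothetical representation of one keyword n-tuple: a main word or
# phrase, followed by associated synonyms and related phrases, plus
# optional flagged negative constructs (wrong answers).
@dataclass
class KeywordTuple:
    main: str                                    # primary keyword or phrase
    synonyms: list = field(default_factory=list)
    related_phrases: list = field(default_factory=list)
    negative_constructs: list = field(default_factory=list)

# Keyword component 44 from the FIG. 2B example, organized as n-tuples
component_44 = [
    KeywordTuple(
        main="empathy",
        related_phrases=["understanding customer needs",
                         "relating to customer requirements"],
    ),
]
```

An authoring tool in the spirit of program 34 would populate such tuples for each passage or item to be tested in the component 28.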
The program 34 enables creation of the components 42, 44 for a desired training session. During the creation of the training “content” the authors are prompted to create different key-words and phrases that best describe the “gist” of the content or embody the essence of the knowledge topic under discussion. These key-words and phrases are utilized for the construction of questions. These key-words may also be analyzed to produce additional key-words, phrases or synonyms, and identify negative constructs (wrong answers).
Referring to FIG. 2C, illustrated are example question formats 46 and 48. Once the target knowledge component 42 is presented to the learner, a series of test questions may be provided to the learner. It is also understood that the target knowledge component 42 may itself take the form of multiple choice or other question or question/answer formats. As illustrated by the question format 48, eventually the learner will be presented with a question format that requires the learner to construct a response to a question about the target knowledge in which the learner must construct the target knowledge in his or her own words. The example illustrated is the format 48 in which the learner is prompted to fill in a blank in response to a displayed question. Other scenarios are similarly envisioned in which the learner must express an answer (audibly or in writing) in his or her own word or words. As described below, the learner's words are evaluated by the system 10 to determine whether remediation, further testing, or advancement to new material is appropriate.
FIG. 3A is a functional block diagram describing steps of an evaluation process 300 of the learning methodology as implemented by the program 34 operating on the computer 12 of the system 10, for example. FIG. 3B describes lexical pre-processing of a user-constructed response to eliminate negative, conjunctive, and non-definitive language constructs in the user-constructed response.
Referring to the process 300, in step 302 the learner is prompted to construct the target knowledge (presented previously, as described above) in his or her own words. One example of the prompt is the fill-in-the-blank format 48, above. In step 304, if the learner's response is verbal, the speech is converted into text data. After the learner's response has been fully entered, a comparison can be triggered automatically in a predetermined manner. For example, the learner can hit a particular key on the keyboard (e.g., an “Enter” key) or activate a particular area on the display screen to start the comparison. In step 306, the learner's response is compared with the pre-defined keyword data contained in the evaluation information component 30 (FIG. 2A). The comparison may involve a variety of analyses. For example, the comparison may:
(1) check for and correct spelling mistakes in the learner-constructed responses;
(2) determine whether the correct key word (words, phrases) appear in the learner-constructed response;
(3) determine whether synonyms of missing key word(s) appear in the learner-constructed response;
(4) determine whether related phrases that convey the same meaning as the expected key word(s) or phrases appear in the learner-constructed response;
(5) determine whether there are any incorrect key word(s) or phrases in the learner-constructed response or other negative constructs that might indicate a wrong answer.
A variety of logic selections for evaluation are contemplated. In one example, for purposes of improved learning and expediting the testing, a decision is made in step 308 whether the learner response fails a lexical analysis (described more fully in FIG. 3B), thereby indicating a possible wrong answer or misunderstanding. If yes, then in step 310 the methodology prompts the user for a positive construct. If not, in step 312 a determination is made whether or not expected keyword(s) are found in the response, albeit not necessarily in the exact way or phraseology preferred. If yes, then the methodology proceeds to step 314 and provides a success message to the evaluation control program and execution returns to the program for testing of other target knowledge topics. If not, then in step 316 a determination is made whether expected related phrase(s) are found in the learner's response (thus indicating a correct or partially correct answer). If yes, execution proceeds to step 314. If not, in step 318 a determination is made whether expected synonym(s) appear in the learner response, thereby indicating a correct or partially correct answer. If yes, execution proceeds to step 314. If not, the methodology proceeds to step 320. In step 320, a “failure” message is sent to the evaluation control program 34.
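The decision chain of steps 308-320 can be sketched as follows. This is a minimal illustration only: it assumes lower-cased text and simple substring matching, since the patent does not specify the matching mechanics, and the function and parameter names are hypothetical.

```python
def evaluate_response(response, keywords, synonyms, related_phrases,
                      fails_lexical_analysis):
    """Sketch of evaluation process 300 (steps 308-320).

    fails_lexical_analysis is a callable standing in for the FIG. 3B
    pre-processing; it returns True when the response contains a
    negative, conjunctive, or non-definite construct.
    """
    text = response.lower()
    # Step 308: a failed lexical analysis indicates a possible wrong answer.
    if fails_lexical_analysis(text):
        return "prompt_positive_construct"       # step 310
    # Step 312: expected keyword(s) found, even if not phrased exactly.
    if any(kw in text for kw in keywords):
        return "success"                          # step 314
    # Step 316: expected related phrase(s) found.
    if any(ph in text for ph in related_phrases):
        return "success"
    # Step 318: expected synonym(s) found.
    if any(syn in text for syn in synonyms):
        return "success"
    return "failure"                              # step 320: failure message
```

For example, `evaluate_response("Show empathy to the customer", ["empathy"], [], [], lambda t: False)` reaches step 314 and returns a success indication.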
Possible scenarios of a “failure” message to the evaluation control program 34 are that the evaluation control program may:
(1) Proceed to other questions and come back to the question upon which failure is indicated, until a satisfactory answer is received;
(2) Offer remedial questions or target information;
(3) Re-evaluate the learner with a focus on the missed part of the current topic.
Possible scenarios of a “success” message to the evaluation control program 34 are that the evaluation control program may:
(1) Discontinue further questioning on the target knowledge subject;
(2) Question the learner on the target knowledge again or in a different way to confirm understanding.
Referring to FIG. 3B, a lexical pre-processing algorithm 308 (described generally in FIG. 3A) is provided that eliminates negative, conjunctive, and non-definitive language constructs in user-constructed responses. In step 322, a user-constructed response is parsed and scanned for pre-defined language constructs.
In step 324, if the response contains negative constructs, the learner is prompted in step 326 for alternative responses. For example, if the learner types “no empathy” or “not empathy” or “don't XXX” or “can't YYY” a parsing algorithm that looks for “empathy” or “XXX” or “YYY” will normally flag this as correct even though the negative construct makes the meaning totally different. Accordingly, step 324 determines that the answer with the negative construct is incorrect and proceeds to step 326.
If in step 324 there are no negative constructs, in step 328 a determination is made whether the user-constructed response contains a “conjunctive” construct and, if so, in step 330 the learner is prompted for a single response. As an example, if “and” or “but” or “or” are included in the answer, indicating a possible guess or two possible answers, step 328 determines that the user-constructed response is not correct and prompts the learner in step 330.
If in step 328 there are no conjunctive constructs, a determination is made in step 332 whether there are non-definite constructs and, if so, the learner is prompted for a definite response. Example non-definite constructs include “maybe” and “perhaps.”
If in step 332 there are no non-definite constructs, in step 336 execution proceeds to the next phase of the analysis, as further described in step 312 of FIG. 3A (described above).
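The FIG. 3B pre-processing (steps 322-336) can be sketched as follows, assuming simple word-level scanning. The construct lists contain only the examples given in the text and are not exhaustive; the function name and return values are illustrative assumptions.

```python
# Example construct words drawn from the description; a real system
# would use a fuller, author-supplied vocabulary.
NEGATIVE = {"no", "not", "don't", "can't"}
CONJUNCTIVE = {"and", "but", "or"}
NON_DEFINITE = {"maybe", "perhaps"}

def preprocess(response):
    """Return the prompt required, or 'next_phase' when the response is
    free of negative, conjunctive, and non-definite constructs."""
    words = response.lower().split()             # step 322: parse and scan
    if any(w in NEGATIVE for w in words):        # step 324
        return "prompt_alternative_response"     # step 326
    if any(w in CONJUNCTIVE for w in words):     # step 328
        return "prompt_single_response"          # step 330
    if any(w in NON_DEFINITE for w in words):    # step 332
        return "prompt_definite_response"
    return "next_phase"                          # step 336: continue to step 312
```

This ordering matches the figure: “not empathy” is caught at step 324 before any keyword matching, so the mere presence of “empathy” is never mistaken for a correct answer.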
It is noted that at any given moment during the execution of the above mentioned learning methodology, various information pertaining to the training session or the performance of the learner is collected by the system 10 for different purposes. In one specific case, at the end of a training session, the collected information gives an in-depth view of how well the learner has been trained. The collected information can be analyzed to generate various reports to be delivered to a predetermined interested party. For instance, the analyzed information will help to identify comparative difficulties of different materials or subjects covered in the training session, or provide information on how the learner has performed on a per question basis, etc. A statistical analysis and report can also be generated in a similar fashion based on the performances of a group of learners with regard to the training session. Therefore, the interested party can evaluate the performance of a group of learners to make various decisions such as to determine whether the training session should be revised, or whether the group of learners can be profiled in a certain manner.
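The per-question performance reporting described above can be sketched as a simple aggregation. The record format and the statistics chosen here are assumptions for illustration; the patent leaves the collected information and report contents open-ended.

```python
from collections import defaultdict

def per_question_report(attempt_log):
    """attempt_log: list of (question_id, succeeded) pairs collected
    during a training session. Returns attempts and success rate per
    question, which helps identify comparatively difficult material."""
    stats = defaultdict(lambda: {"attempts": 0, "successes": 0})
    for question_id, succeeded in attempt_log:
        stats[question_id]["attempts"] += 1
        stats[question_id]["successes"] += int(succeeded)
    return {
        q: {"attempts": s["attempts"],
            "success_rate": s["successes"] / s["attempts"]}
        for q, s in stats.items()
    }
```

The same aggregation could be run over logs from a group of learners to support the group-level profiling decisions mentioned above.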
In summary, the system 10 provides a learning methodology that improves the speed and retention of learning, and furthermore provides improved accuracy in assessment of the learner. By requiring, perhaps in addition to traditional multiple choice or other testing techniques, a learner-constructed response in which the learner must use his or her own words in answering a question, greater assurance is provided that the learner indeed knows the subject matter. Also, the system allows for refinement of the testing as the learner gets closer to accurate responses, as enabled by the construction of a key word component associated with the target knowledge component, as enabled by the evaluation process.
Although illustrative embodiments of the invention have been shown and described, other modifications, changes, and substitutions are intended in the foregoing disclosure. Accordingly, it is appropriate that the appended claims be construed broadly and in a manner consistent with the scope of the invention.

Claims (174)

1. A computer readable storage medium storing a computer program, the computer program for execution by a computer system having a processor, a memory, and a display, the computer program for implementing a learning system with a learner-constructed response based methodology, comprising:
a presentation process for presenting on the display, using a graphical user interface, at least one knowledge topic to the learner and for prompting the learner to enter a learner-constructed response thereto;
a displaying process for presenting on the display, using the graphical user interface, the learner-constructed response;
an evaluation information process for providing keyword data that corresponds to the knowledge topic; and
an evaluation process for determining, based upon entry of a learner-constructed response to the knowledge topic, success or failure of the learner to know the knowledge topic, the success or failure being determined by comparison of the learner-constructed response with the keyword data wherein upon a determination of failure of the learner, remedial information is provided to the learner before the learner is prompted to enter another learner-constructed response.
2. The program of claim 1 wherein the comparison comprises a determination of whether or not expected keyword data appears in the learner-constructed response, the keyword data comprising at least one synonym.
3. A method for implementing an automated learning system, the method performed by a computer system having a processor, a memory, and a display, the method comprising:
presenting at least one knowledge topic on the display, using a graphical user interface, to the learner;
prompting the learner to enter a learner-constructed response thereto;
presenting on the display, using the graphical user interface, the learner-constructed response;
comparing keyword data that corresponds to the knowledge topic with the learner-constructed response; and
determining success or failure of the learner to know the knowledge topic, the success or failure being determined by whether or not expected keyword data appears in the learner-constructed response, wherein upon a determination of failure of the learner, remedial information is provided to the learner before the learner is prompted to enter another learner-constructed response.
4. A method for implementing an automated learning system, the method performed by a computer system having a processor, a memory, and a display, the method comprising:
presenting a series of knowledge topics on the display, using a graphical user interface, to the learner; and
prompting the learner to enter a learner-constructed response to each topic;
presenting on the display, using the graphical user interface, the learner-constructed responses;
comparing keyword data that corresponds to the knowledge topics with the learner-constructed responses; and
determining success or failure of the learner to know each of the knowledge topics, the success or failure being determined by whether or not expected keyword data appears in the learner-constructed response;
upon a determination of failure of the learner, providing remedial information to the learner and again prompting the learner to enter a learner-constructed response;
upon a determination of success of the learner, discontinuing presentation and prompting of the learner regarding the particular knowledge topic;
whereupon automated presentation of the series is completed when success is determined for each knowledge topic.
5. The method of claim 4 wherein the comparing comprises a determination of whether or not expected keyword data appears in the learner-constructed response, the keyword data comprising at least one exact keyword.
6. The method of claim 4 wherein the comparing comprises a determination of whether or not expected keyword data appears in the learner-constructed response, the keyword data comprising at least one exact phrase.
7. The method of claim 4 wherein the comparing comprises a determination of whether or not expected keyword data appears in the learner-constructed response, the keyword data comprising at least one synonym.
8. The method of claim 4 wherein the comparing comprises a determination of whether or not the learner-constructed response fails a lexical analysis.
9. The method of claim 4 further comprising:
collecting information regarding a performance of at least one learner during the presentation process, the evaluation information process and the evaluation process;
analyzing the collected information; and
generating a report based on the analyzed information for at least one predetermined party.
10. The computer readable storage medium of claim 1, wherein the presentation process for prompting includes a process for prompting the learner to fill in a blank in response to a displayed question.
11. The computer readable storage medium of claim 1, wherein the presentation process for prompting includes a process for utilizing HTML tags to display a question.
12. The computer readable storage medium of claim 1, wherein the evaluation process for determining includes a process for comparing the learner-constructed response with data encoded in an HTML tag.
13. The computer readable storage medium of claim 1, wherein the evaluation process for determining includes a process for comparing the learner-constructed response with data encoded in a meta-data tag.
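Claims 11 through 13 contemplate displaying questions via HTML and comparing the learner-constructed response with keyword data encoded in an HTML tag or meta-data tag. One plausible encoding, sketched below with Python's standard-library `html.parser`, stores the expected keywords in a `<meta>` tag; the `answer-keywords` attribute name is a hypothetical choice for illustration, not specified by the patent.

```python
from html.parser import HTMLParser

class KeywordTagParser(HTMLParser):
    """Collect expected keyword data encoded in meta tags, e.g.
    <meta name="answer-keywords" content="osmosis, diffusion">.
    The tag and attribute names are illustrative assumptions."""
    def __init__(self):
        super().__init__()
        self.keywords = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name") == "answer-keywords":
            self.keywords += [k.strip() for k in a.get("content", "").split(",")]

def expected_keywords(html_doc):
    parser = KeywordTagParser()
    parser.feed(html_doc)
    return parser.keywords

def response_matches(html_doc, learner_response):
    """Claims 12-13: success if any keyword encoded in the document's
    meta-data tag appears in the learner-constructed response."""
    text = learner_response.lower()
    return any(k.lower() in text for k in expected_keywords(html_doc))
```

Encoding the answer data alongside the question markup lets a single HTML document carry both the presentation content and the evaluation information the claims describe.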
14. The computer readable storage medium of claim 1, wherein the presentation process for presenting includes a process for presenting text.
15. The computer readable storage medium of claim 1, wherein the presentation process for presenting includes a process for presenting graphics.
16. The computer readable storage medium of claim 1, wherein the presentation process for presenting includes a process for presenting video.
17. The computer readable storage medium of claim 1, wherein the presentation process for presenting includes a process for presenting on a desktop computer display.
18. The computer readable storage medium of claim 1, wherein the presentation process for presenting includes a process for presenting on a television display.
19. The computer readable storage medium of claim 1, wherein the presentation process for presenting includes a process for presenting on a hand-held display.
20. The computer readable storage medium of claim 1, wherein the presentation process for presenting includes a process for presenting, using a web browser, on the display.
21. The computer readable storage medium of claim 1, further comprising a receiving process for receiving the learner-constructed response via voice input.
22. The computer readable storage medium of claim 1, further comprising a receiving process for receiving the learner-constructed response via voice input and converting the voice input into text.
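Claims 21 and 22 add a receiving process that accepts the learner-constructed response as voice input and converts it into text before evaluation. The pipeline can be sketched as below; the `recognizer` callable is a hypothetical stand-in for a real speech-to-text engine, which the patent does not name.

```python
def transcribe(audio_bytes, recognizer):
    """Claim 22: convert voice input into text. `recognizer` is any
    callable wrapping a speech-to-text engine (a stand-in here)."""
    return recognizer(audio_bytes)

def receive_spoken_response(audio_bytes, recognizer, evaluate):
    """Receive the learner-constructed response via voice input,
    convert it to text, then hand the text to the evaluation process."""
    text = transcribe(audio_bytes, recognizer)
    return text, evaluate(text)
```

Because the converted text feeds the same evaluation process as typed input, the keyword comparison logic is unchanged by the input modality.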
23. The method of claim 3, wherein the prompting includes prompting the learner to fill in a blank in response to a displayed question.
24. The method of claim 3, wherein the prompting includes utilizing HTML tags to display a question.
25. The method of claim 3, wherein the comparing includes comparing the learner-constructed response with data encoded in an HTML tag.
26. The method of claim 3, wherein the comparing includes comparing the learner-constructed response with data encoded in a meta-data tag.
27. The method of claim 3, wherein the presenting at least one knowledge topic includes presenting text.
28. The method of claim 3, wherein the presenting one knowledge topic includes presenting graphics.
29. The method of claim 3, wherein the presenting one knowledge topic includes presenting video.
30. The method of claim 3, wherein the presenting one knowledge topic includes presenting on a desktop computer display.
31. The method of claim 3, wherein the presenting one knowledge topic includes presenting on a television display.
32. The method of claim 3, wherein the presenting one knowledge topic includes presenting on a hand-held display.
33. The method of claim 3, wherein the presenting one knowledge topic includes presenting, using a web browser, on the display.
34. The method of claim 3, further comprising receiving the learner-constructed response via voice input.
35. The method of claim 3, further comprising receiving the learner-constructed response via voice input and converting the voice input into text.
36. The method of claim 4, wherein the prompting includes prompting the learner to fill in a blank in response to a displayed question.
37. The method of claim 4, wherein the prompting includes utilizing HTML tags to display a question.
38. The method of claim 4, wherein the comparing includes comparing at least one learner-constructed response with data encoded in an HTML tag.
39. The method of claim 4, wherein the comparing includes comparing at least one learner-constructed response with data encoded in a meta-data tag.
40. The method of claim 4, wherein the presenting a series of knowledge topics includes presenting text.
41. The method of claim 4, wherein the presenting a series of knowledge topics includes presenting graphics.
42. The method of claim 4, wherein the presenting a series of knowledge topics includes presenting video.
43. The method of claim 4, wherein the presenting a series of knowledge topics includes presenting on a desktop computer display.
44. The method of claim 4, wherein the presenting a series of knowledge topics includes presenting on a television display.
45. The method of claim 4, wherein the presenting a series of knowledge topics includes presenting on a hand-held display.
46. The method of claim 4, wherein the presenting a series of knowledge topics includes presenting, using a web browser, on the display.
47. The method of claim 4, further comprising receiving the learner-constructed response via voice input.
48. The method of claim 4, further comprising receiving the learner-constructed response via voice input and converting the voice input into text.
49. A computer readable storage medium storing a computer program, the computer program for execution by a computer system having a processor, a memory, and a display, the computer program for implementing a learning system with a learner-constructed response based methodology, comprising:
a presentation process for presenting on the display, using a graphical user interface, at least one knowledge topic to the learner and for prompting the learner to enter a learner-constructed response thereto;
a displaying process for presenting on the display, using the graphical user interface, the learner-constructed response;
an evaluation information process for providing keyword data that corresponds to the knowledge topic; and
an evaluation process for determining, based upon entry of the learner-constructed response to the knowledge topic, success or failure of the learner to know the knowledge topic, the success or failure being determined by comparison of the learner-constructed response with the keyword data wherein after a determination of failure of the learner, remedial information is provided to the learner, after which the learner is prompted to enter another learner-constructed response.
50. The computer readable storage medium of claim 49, wherein the presentation process for prompting includes a process for prompting the learner to fill in a blank in response to a displayed question.
51. The computer readable storage medium of claim 49, wherein the presentation process for prompting includes a process for utilizing HTML tags to display a question.
52. The computer readable storage medium of claim 49, wherein the evaluation process for determining includes a process for comparing the learner-constructed response with data encoded in an HTML tag.
53. The computer readable storage medium of claim 49, wherein the evaluation process for determining includes a process for comparing the learner-constructed response with data encoded in a meta-data tag.
54. The computer readable storage medium of claim 49, wherein the presentation process for presenting includes a process for presenting text.
55. The computer readable storage medium of claim 49, wherein the presentation process for presenting includes a process for presenting graphics.
56. The computer readable storage medium of claim 49, wherein the presentation process for presenting includes a process for presenting video.
57. The computer readable storage medium of claim 49, wherein the presentation process for presenting includes a process for presenting on a desktop computer display.
58. The computer readable storage medium of claim 49, wherein the presentation process for presenting includes a process for presenting on a television display.
59. The computer readable storage medium of claim 49, wherein the presentation process for presenting includes a process for presenting on a hand-held display.
60. The computer readable storage medium of claim 49, wherein the presentation process for presenting includes a process for presenting, using a web browser, on the display.
61. The computer readable storage medium of claim 49, further comprising a receiving process for receiving the learner-constructed response via voice input.
62. The computer readable storage medium of claim 49, further comprising a receiving process for receiving the learner-constructed response via voice input and converting the voice input into text.
63. A computer readable storage medium storing a computer program, the computer program for execution by a computer system having a processor, a memory, and a display, the computer program for implementing a learning system with a learner-constructed response based methodology, comprising:
a presentation process for presenting on the display, using a graphical user interface, at least one knowledge topic to the learner and for prompting the learner to enter a learner-constructed response to one of the at least one knowledge topic;
a displaying process for presenting on the display, using the graphical user interface, the learner-constructed response;
an evaluation information process for providing keyword data that corresponds to the one of the at least one knowledge topic; and
an evaluation process for determining, based upon entry of the learner-constructed response to the knowledge topic, success or failure of the learner to know the knowledge topic, the success or failure being determined by comparison of the learner-constructed response with the keyword data, wherein upon a determination of failure of the learner, remedial information is provided to the learner after the learner is prompted to enter the learner-constructed response.
64. The computer readable storage medium of claim 63, wherein the presentation process for prompting includes a process for prompting the learner to fill in a blank in response to a displayed question.
65. The computer readable storage medium of claim 63, wherein the presentation process for prompting includes a process for utilizing HTML tags to display a question.
66. The computer readable storage medium of claim 63, wherein the evaluation process for determining includes a process for comparing the learner-constructed response with data encoded in an HTML tag.
67. The computer readable storage medium of claim 63, wherein the evaluation process for determining includes a process for comparing the learner-constructed response with data encoded in a meta-data tag.
68. The computer readable storage medium of claim 63, wherein the presentation process for presenting includes a process for presenting text.
69. The computer readable storage medium of claim 63, wherein the presentation process for presenting includes a process for presenting graphics.
70. The computer readable storage medium of claim 63, wherein the presentation process for presenting includes a process for presenting video.
71. The computer readable storage medium of claim 63, wherein the presentation process for presenting includes a process for presenting on a desktop computer display.
72. The computer readable storage medium of claim 63, wherein the presentation process for presenting includes a process for presenting on a television display.
73. The computer readable storage medium of claim 63, wherein the presentation process for presenting includes a process for presenting on a hand-held display.
74. The computer readable storage medium of claim 63, wherein the presentation process for presenting includes a process for presenting, using a web browser, on the display.
75. The computer readable storage medium of claim 63, further comprising a receiving process for receiving the learner-constructed response via voice input.
76. The computer readable storage medium of claim 63, further comprising a receiving process for receiving the learner-constructed response via voice input and converting the voice input into text.
77. A computer readable storage medium storing a computer program, the computer program for execution by a computer system having a processor, a memory, and a display, the computer program for implementing a learning system with a learner-constructed response based methodology, comprising:
a presentation process for presenting on the display, using a graphical user interface, at least one knowledge topic to the learner and for prompting the learner to enter a learner-constructed response thereto;
a displaying process for presenting on the display, using the graphical user interface, the learner-constructed response;
an evaluation information process for providing keyword data that corresponds to the knowledge topic; and
an evaluation process for determining, based upon entry of a learner-constructed response to the knowledge topic, success or failure of the learner to know the knowledge topic, the success or failure being determined by comparison of the learner-constructed response with the keyword data, wherein upon a determination of failure of the learner, remedial information is provided to the learner before the learner is prompted to enter a last learner-constructed response.
78. The computer readable storage medium of claim 77, wherein the presentation process for prompting includes a process for prompting the learner to fill in a blank in response to a displayed question.
79. The computer readable storage medium of claim 77, wherein the presentation process for prompting includes a process for utilizing HTML tags to display a question.
80. The computer readable storage medium of claim 77, wherein the evaluation process for determining includes a process for comparing the learner-constructed response with data encoded in an HTML tag.
81. The computer readable storage medium of claim 77, wherein the evaluation process for determining includes a process for comparing the learner-constructed response with data encoded in a meta-data tag.
82. The computer readable storage medium of claim 77, wherein the presentation process for presenting includes a process for presenting text.
83. The computer readable storage medium of claim 77, wherein the presentation process for presenting includes a process for presenting graphics.
84. The computer readable storage medium of claim 77, wherein the presentation process for presenting includes a process for presenting video.
85. The computer readable storage medium of claim 77, wherein the presentation process for presenting includes a process for presenting on a desktop computer display.
86. The computer readable storage medium of claim 77, wherein the presentation process for presenting includes a process for presenting on a television display.
87. The computer readable storage medium of claim 77, wherein the presentation process for presenting includes a process for presenting on a hand-held display.
88. The computer readable storage medium of claim 77, wherein the presentation process for presenting includes a process for presenting, using a web browser, on the display.
89. The computer readable storage medium of claim 77, further comprising a receiving process for receiving the learner-constructed response via voice input.
90. The computer readable storage medium of claim 77, further comprising a receiving process for receiving the learner-constructed response via voice input and converting the voice input into text.
91. A computer readable storage medium storing a computer program, the computer program for execution by a computer system having a processor, a memory, and a display, the computer program for implementing a learning system with a learner-constructed response based methodology, comprising:
a presentation process for presenting on the display, using a graphical user interface, at least one knowledge topic to the learner and for prompting the learner to enter a learner-constructed response thereto;
a displaying process for presenting on the display, using the graphical user interface, the learner-constructed response;
an evaluation information process for providing keyword data that corresponds to the knowledge topic; and
an evaluation process for determining, based upon entry of a learner-constructed response to the knowledge topic, success or failure of the learner to know the knowledge topic, the success or failure being determined by comparison of the learner-constructed response with the keyword data wherein upon a determination of failure of the learner, remedial information is provided to the learner before the learner is prompted to enter a plurality of learner-constructed responses.
92. The computer readable storage medium of claim 91, wherein the presentation process for prompting includes a process for prompting the learner to fill in a blank in response to a displayed question.
93. The computer readable storage medium of claim 91, wherein the presentation process for prompting includes a process for utilizing HTML tags to display a question.
94. The computer readable storage medium of claim 91, wherein the evaluation process for determining includes a process for comparing the learner-constructed response with data encoded in an HTML tag.
95. The computer readable storage medium of claim 91, wherein the evaluation process for determining includes a process for comparing the learner-constructed response with data encoded in a meta-data tag.
96. The computer readable storage medium of claim 91, wherein the presentation process for presenting includes a process for presenting text.
97. The computer readable storage medium of claim 91, wherein the presentation process for presenting includes a process for presenting graphics.
98. The computer readable storage medium of claim 91, wherein the presentation process for presenting includes a process for presenting video.
99. The computer readable storage medium of claim 91, wherein the presentation process for presenting includes a process for presenting on a desktop computer display.
100. The computer readable storage medium of claim 91, wherein the presentation process for presenting includes a process for presenting on a television display.
101. The computer readable storage medium of claim 91, wherein the presentation process for presenting includes a process for presenting on a hand-held display.
102. The computer readable storage medium of claim 91, wherein the presentation process for presenting includes a process for presenting, using a web browser, on the display.
103. The computer readable storage medium of claim 91, further comprising a receiving process for receiving the learner-constructed response via voice input.
104. The computer readable storage medium of claim 91, further comprising a receiving process for receiving the learner-constructed response via voice input and converting the voice input into text.
105. A method for implementing an automated learning system, the method performed by a computer system having a processor, a memory, and a display, the method comprising:
presenting on the display, using a graphical user interface, at least one knowledge topic to the learner;
prompting the learner to enter a learner-constructed response to one of the at least one knowledge topic;
presenting the learner-constructed response on the display using the graphical user interface;
comparing keyword data that corresponds to the one knowledge topic with the learner-constructed response; and
determining success or failure of the learner to know the knowledge topic, the success or failure being determined by whether or not expected keyword data appears in the learner-constructed response, wherein upon a determination of failure of the learner, remedial information is provided to the learner before the learner is prompted to enter a last learner-constructed response.
106. The method of claim 105, wherein the prompting includes prompting the learner to fill in a blank in response to a displayed question.
107. The method of claim 105, wherein the prompting includes utilizing HTML tags to display a question.
108. The method of claim 105, wherein the determining includes comparing the learner-constructed response with data encoded in an HTML tag.
109. The method of claim 105, wherein the determining includes comparing the learner-constructed response with data encoded in a meta-data tag.
110. The method of claim 105, wherein the presenting the at least one knowledge topic includes presenting text.
111. The method of claim 105, wherein the presenting the at least one knowledge topic includes presenting graphics.
112. The method of claim 105, wherein the presenting the at least one knowledge topic includes presenting video.
113. The method of claim 105, wherein the presenting the at least one knowledge topic includes presenting on a desktop computer display.
114. The method of claim 105, wherein the presenting the at least one knowledge topic includes presenting on a television display.
115. The method of claim 105, wherein the presenting the at least one knowledge topic includes presenting on a hand-held display.
116. The method of claim 105, wherein the presenting the at least one knowledge topic includes presenting, using a web browser, on the display.
117. The method of claim 105, further comprising receiving the learner-constructed response via voice input.
118. The method of claim 105, further comprising receiving the learner-constructed response via voice input and converting the voice input into text.
119. A method for implementing an automated learning system, the method performed by a computer system having a processor, a memory, and a display, the method comprising:
presenting on the display, using a graphical user interface, at least one knowledge topic to the learner;
prompting the learner to enter a learner-constructed response thereto;
presenting the learner-constructed response on the display using the graphical user interface;
comparing keyword data that corresponds to the knowledge topic with the learner-constructed response; and
determining success or failure of the learner to know the knowledge topic, the success or failure being determined by whether or not expected keyword data appears in the learner-constructed response, wherein upon a determination of failure of the learner, remedial information is provided to the learner before the learner is prompted to enter a last learner-constructed response.
120. The method of claim 119, wherein the prompting includes prompting the learner to fill in a blank in response to a displayed question.
121. The method of claim 119, wherein the prompting includes utilizing HTML tags to display a question.
122. The method of claim 119, wherein the determining includes comparing the learner-constructed response with data encoded in an HTML tag.
123. The method of claim 119, wherein the determining includes comparing the learner-constructed response with data encoded in a meta-data tag.
124. The method of claim 119, wherein the presenting the at least one knowledge topic includes presenting text.
125. The method of claim 119, wherein the presenting the at least one knowledge topic includes presenting graphics.
126. The method of claim 119, wherein the presenting the at least one knowledge topic includes presenting video.
127. The method of claim 119, wherein the presenting the at least one knowledge topic includes presenting on a desktop computer display.
128. The method of claim 119, wherein the presenting the at least one knowledge topic includes presenting on a television display.
129. The method of claim 119, wherein the presenting the at least one knowledge topic includes presenting on a hand-held display.
130. The method of claim 119, wherein the presenting the at least one knowledge topic includes presenting, using a web browser, on the display.
131. The method of claim 119, further comprising receiving the learner-constructed response via voice input.
132. The method of claim 119, further comprising receiving the learner-constructed response via voice input and converting the voice input into text.
133. A method for implementing an automated learning system, the method performed by a computer system having a processor, a memory, and a display, the method comprising:
presenting on the display, using a graphical user interface, at least one knowledge topic to the learner;
prompting the learner to enter a learner-constructed response to the at least one knowledge topic;
presenting the learner-constructed response on the display using the graphical user interface;
comparing keyword data that corresponds to the knowledge topic with the learner-constructed response; and
determining success or failure of the learner to know the knowledge topic, the success or failure being determined by whether or not expected keyword data appears in the learner-constructed response, wherein upon a determination of failure of the learner, remedial information is provided to the learner after the learner is prompted to enter the learner-constructed response.
134. The method of claim 133, wherein the prompting includes prompting the learner to fill in a blank in response to a displayed question.
135. The method of claim 133, wherein the prompting includes utilizing HTML tags to display a question.
136. The method of claim 133, wherein the determining includes comparing the learner-constructed response with data encoded in an HTML tag.
137. The method of claim 133, wherein the determining includes comparing the learner-constructed response with data encoded in a meta-data tag.
138. The method of claim 133, wherein the presenting the at least one knowledge topic includes presenting text.
139. The method of claim 133, wherein the presenting the at least one knowledge topic includes presenting graphics.
140. The method of claim 133, wherein the presenting the at least one knowledge topic includes presenting video.
141. The method of claim 133, wherein the presenting the at least one knowledge topic includes presenting on a desktop computer display.
142. The method of claim 133, wherein the presenting the at least one knowledge topic includes presenting on a television display.
143. The method of claim 133, wherein the presenting the at least one knowledge topic includes presenting on a hand-held display.
144. The method of claim 133, wherein the presenting the at least one knowledge topic includes presenting, using a web browser, on the display.
145. The method of claim 133, further comprising receiving the learner-constructed response via voice input.
146. The method of claim 133, further comprising receiving the learner-constructed response via voice input and converting the voice input into text.
147. A method for implementing an automated learning system, the method performed by a computer system having a processor, a memory, and a display, the method comprising:
presenting on the display, using a graphical user interface, a series of knowledge topics to the learner;
prompting the learner to enter a learner-constructed response to each knowledge topic;
presenting the learner-constructed response on the display using the graphical user interface;
comparing keyword data that corresponds to each knowledge topic with each learner-constructed response;
determining a success or a failure of the learner to know each knowledge topic, the success or failure being determined by whether expected keyword data appears in each learner-constructed response;
after a determination of failure of the learner for a particular knowledge topic, providing remedial information to the learner for the particular knowledge topic and prompting the learner to enter a new learner-constructed response to the particular knowledge topic;
upon a determination of success of the learner for a particular one of the knowledge topics, discontinuing presentation and prompting of the learner regarding the particular one of the knowledge topics.
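Claim 147 describes the overall flow: present each knowledge topic in a series, prompt for a learner-constructed response, provide remedial information and re-prompt on failure, and discontinue presentation of a topic once success is determined. A sketch of that loop follows; all the callbacks (`present`, `prompt`, `evaluate`, `remediate`) and the attempt cap are hypothetical stand-ins for the GUI presentation and evaluation processes the claims recite.

```python
def run_lesson(topics, present, prompt, evaluate, remediate, max_attempts=3):
    """Sketch of the claim-147 flow over a series of knowledge topics."""
    results = {}
    for topic in topics:
        present(topic)                         # present the knowledge topic
        success = False
        for _ in range(max_attempts):
            response = prompt(topic)           # learner-constructed response
            if evaluate(topic, response):      # expected keyword data present?
                success = True
                break                          # discontinue this topic on success
            remediate(topic)                   # remedial information, then re-prompt
        results[topic] = success
    return results
```

The per-topic `break` on success implements the "discontinuing presentation and prompting" step, while the inner loop implements the remediate-and-re-prompt cycle for failures.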
148. The method of claim 147, wherein the prompting includes prompting the learner to fill in a blank in response to a displayed question.
149. The method of claim 147, wherein the prompting includes utilizing HTML tags to display a question.
150. The method of claim 147, wherein the determining includes comparing at least one learner-constructed response with data encoded in an HTML tag.
151. The method of claim 147, wherein the determining includes comparing at least one learner-constructed response with data encoded in a meta-data tag.
152. The method of claim 147, wherein the presenting the series of knowledge topics includes presenting text.
153. The method of claim 147, wherein the presenting the series of knowledge topics includes presenting graphics.
154. The method of claim 147, wherein the presenting the series of knowledge topics includes presenting video.
155. The method of claim 147, wherein the presenting the series of knowledge topics includes presenting on a desktop computer display.
156. The method of claim 147, wherein the presenting the series of knowledge topics includes presenting on a television display.
157. The method of claim 147, wherein the presenting the series of knowledge topics includes presenting on a hand-held display.
158. The method of claim 147, wherein the presenting the series of knowledge topics includes presenting, using a web browser, on the display.
159. The method of claim 147, further comprising receiving the learner-constructed response via voice input.
160. The method of claim 147, further comprising receiving the learner-constructed response via voice input and converting the voice input into text.
161. A method for implementing an automated learning system, the method performed by a computer system having a processor, a memory, and a display, the method comprising:
presenting on the display, using a graphical user interface, a series of knowledge topics to the learner;
prompting the learner to enter a learner-constructed response to each topic;
presenting the learner-constructed response on the display using the graphical user interface;
comparing keyword data that corresponds to the knowledge topics with the learner-constructed responses;
determining success or failure of the learner to know each one of the knowledge topics, the success or failure being determined by whether or not expected keyword data appears in each learner-constructed response; and
upon a determination of failure of the learner, providing remedial information to the learner at a later time and prompting the learner to enter another learner-constructed response.
162. The method of claim 161, wherein the prompting includes prompting the learner to fill in a blank in response to a displayed question.
163. The method of claim 161, wherein the prompting includes utilizing HTML tags to display a question.
164. The method of claim 161, wherein the determining includes comparing at least one learner-constructed response with data encoded in an HTML tag.
165. The method of claim 161, wherein the determining includes comparing at least one learner-constructed response with data encoded in a meta-data tag.
166. The method of claim 161, wherein the presenting the series of knowledge topics includes presenting text.
167. The method of claim 161, wherein the presenting the series of knowledge topics includes presenting graphics.
168. The method of claim 161, wherein the presenting the series of knowledge topics includes presenting video.
169. The method of claim 161, wherein the presenting the series of knowledge topics includes presenting on a desktop computer display.
170. The method of claim 161, wherein the presenting the series of knowledge topics includes presenting on a television display.
171. The method of claim 161, wherein the presenting the series of knowledge topics includes presenting on a hand-held display.
172. The method of claim 161, wherein the presenting the series of knowledge topics includes presenting, using a web browser, on the display.
173. The method of claim 161, further comprising receiving the learner-constructed response via voice input.
174. The method of claim 161, further comprising receiving the learner-constructed response via voice input and converting the voice input into text.
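Claims 161 through 165 describe a concrete procedure: compare a learner-constructed response against expected keyword data (optionally encoded in an HTML or meta-data tag), and treat the topic as failed unless every expected keyword appears. A minimal Python sketch of that matching step follows; the meta-tag format, function names, and keyword encoding are illustrative assumptions, not taken from the patent itself.

```python
import re

def extract_keywords(markup: str) -> list[str]:
    """Pull expected keyword data out of an HTML meta tag, in the
    spirit of claims 164-165. The tag name and comma-separated
    content format are assumptions for illustration only."""
    m = re.search(r'<meta\s+name="keywords"\s+content="([^"]*)"', markup)
    return m.group(1).split(",") if m else []

def check_response(response: str, keywords: list[str]) -> bool:
    """The determining step of claim 161: success only if every
    expected keyword appears in the learner-constructed response."""
    text = response.lower()
    return all(
        re.search(r"\b" + re.escape(k.strip().lower()) + r"\b", text)
        for k in keywords
    )
```

On failure, a system following claim 161 would record the topic for remediation and re-prompt the learner at a later time rather than revealing the answer immediately.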
US10/653,748 2000-10-17 2003-09-02 Learning system with learner-constructed response based methodology Expired - Lifetime USRE39435E1 (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
US10/653,748 USRE39435E1 (en) 2000-10-17 2003-09-02 Learning system with learner-constructed response based methodology
US10/815,330 US7357640B2 (en) 2003-07-02 2004-03-31 Lock-In Training system
US11/925,234 US20080076109A1 (en) 2003-07-02 2007-10-26 Lock-in training system
US11/924,844 US20080076107A1 (en) 2003-07-02 2007-10-26 Lock in training system with retention round display

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US09/690,223 US6461166B1 (en) 2000-10-17 2000-10-17 Learning system with learner-constructed response based testing methodology
US10/653,748 USRE39435E1 (en) 2000-10-17 2003-09-02 Learning system with learner-constructed response based methodology

Related Parent Applications (2)

Application Number Title Priority Date Filing Date
US09/690,223 Reissue US6461166B1 (en) 2000-10-17 2000-10-17 Learning system with learner-constructed response based testing methodology
US10/613,564 Continuation-In-Part US20050003336A1 (en) 2003-07-02 2003-07-02 Method and system for learning keyword based materials

Related Child Applications (2)

Application Number Title Priority Date Filing Date
US10/815,330 Continuation-In-Part US7357640B2 (en) 2003-07-02 2004-03-31 Lock-In Training system
US11/924,844 Continuation-In-Part US20080076107A1 (en) 2003-07-02 2007-10-26 Lock in training system with retention round display

Publications (1)

Publication Number Publication Date
USRE39435E1 true USRE39435E1 (en) 2006-12-19

Family

ID=24771617

Family Applications (2)

Application Number Title Priority Date Filing Date
US09/690,223 Ceased US6461166B1 (en) 2000-10-17 2000-10-17 Learning system with learner-constructed response based testing methodology
US10/653,748 Expired - Lifetime USRE39435E1 (en) 2000-10-17 2003-09-02 Learning system with learner-constructed response based methodology

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US09/690,223 Ceased US6461166B1 (en) 2000-10-17 2000-10-17 Learning system with learner-constructed response based testing methodology

Country Status (1)

Country Link
US (2) US6461166B1 (en)

Families Citing this family (40)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CA2396509A1 (en) * 2000-01-12 2001-07-19 Avis Gustason Methods and systems for multimedia education
US6461166B1 (en) * 2000-10-17 2002-10-08 Dennis Ray Berman Learning system with learner-constructed response based testing methodology
WO2002056279A1 (en) * 2001-01-09 2002-07-18 Prep4 Ltd. Training system and method for improving user knowledge and skills
GB2388699A (en) * 2001-01-23 2003-11-19 Educational Testing Service Methods for automated essay analysis
US6688889B2 (en) 2001-03-08 2004-02-10 Boostmyscore.Com Computerized test preparation system employing individually tailored diagnostics and remediation
US7062220B2 (en) * 2001-04-18 2006-06-13 Intelligent Automation, Inc. Automated, computer-based reading tutoring systems and methods
US7286793B1 (en) 2001-05-07 2007-10-23 Miele Frank R Method and apparatus for evaluating educational performance
US7074128B2 (en) 2001-08-03 2006-07-11 Drb Lit Ltd. Method and system for enhancing memorization by using a mnemonic display
US6816702B2 (en) * 2002-03-15 2004-11-09 Educational Testing Service Consolidated online assessment system
US7127208B2 (en) * 2002-01-23 2006-10-24 Educational Testing Service Automated annotation
JP3772205B2 (en) * 2002-02-06 2006-05-10 国立大学法人佐賀大学 Teaching material learning system
US8374540B2 (en) 2002-03-15 2013-02-12 Educational Testing Service Consolidated on-line assessment system
JP3839336B2 (en) * 2002-03-26 2006-11-01 富士通株式会社 Learning support method
JP3995081B2 (en) * 2002-03-27 2007-10-24 富士通株式会社 Learning support method and learning support program
US7088949B2 (en) 2002-06-24 2006-08-08 Educational Testing Service Automated essay scoring
US20040254794A1 (en) * 2003-05-08 2004-12-16 Carl Padula Interactive eyes-free and hands-free device
US7797146B2 (en) * 2003-05-13 2010-09-14 Interactive Drama, Inc. Method and system for simulated interactive conversation
US20050239022A1 (en) * 2003-05-13 2005-10-27 Harless William G Method and system for master teacher knowledge transfer in a computer environment
US20050239035A1 (en) * 2003-05-13 2005-10-27 Harless William G Method and system for master teacher testing in a computer environment
US7357640B2 (en) * 2003-07-02 2008-04-15 Drb Lit Ltd. Lock-In Training system
JP4387720B2 (en) * 2003-07-28 2009-12-24 株式会社浜松早期認知症研究所 Dementia test apparatus, dementia test server, dementia test client, and dementia test system
CN1886767A (en) * 2003-11-28 2006-12-27 语言的森林有限公司 Composition evaluation device
US7364432B2 (en) * 2004-03-31 2008-04-29 Drb Lit Ltd. Methods of selecting Lock-In Training courses and sessions
US7657220B2 (en) 2004-05-21 2010-02-02 Ordinate Corporation Adaptive scoring of responses to constructed response questions
WO2007015869A2 (en) * 2005-07-20 2007-02-08 Ordinate Corporation Spoken language proficiency assessment by computer
WO2007025168A2 (en) * 2005-08-25 2007-03-01 Gregory Tuve Methods and systems for facilitating learning based on neural modeling
US7657221B2 (en) * 2005-09-12 2010-02-02 Northwest Educational Software, Inc. Virtual oral recitation examination apparatus, system and method
US11741848B2 (en) * 2006-10-11 2023-08-29 Cynthia Elnora ASHBY Interactive system and method for diagnosing test-taking errors based on blooms taxonomy
US10037707B2 (en) * 2006-10-11 2018-07-31 Cynthia Elnora ASHBY Interactive method for diagnosing test-taking errors
US20100057708A1 (en) * 2008-09-03 2010-03-04 William Henry Billingsley Method and System for Computer-Based Assessment Including a Search and Select Process
TWI407393B (en) * 2008-09-16 2013-09-01 Univ Nat Taiwan Normal Interactive teaching game automatic production methods and platforms, computer program products, recording media, and teaching game system
US9679256B2 (en) * 2010-10-06 2017-06-13 The Chancellor, Masters And Scholars Of The University Of Cambridge Automated assessment of examination scripts
US8819169B2 (en) * 2011-05-20 2014-08-26 Hallmark Cards, Incorporated Prompting service
US20140162236A1 (en) * 2012-12-07 2014-06-12 Franco Capaldi Interactive assignment system including a simulation system for simulating models of problems
US20150037765A1 (en) * 2013-08-02 2015-02-05 Speetra, Inc. System and method for interactive electronic learning and assessment
TW201516989A (en) * 2013-10-22 2015-05-01 zhong-zhi Lin Editing system and method for interactive question answering game
US9530329B2 (en) 2014-04-10 2016-12-27 Laurence RUDOLPH System and method for conducting multi-layer user selectable electronic testing
KR101853091B1 (en) * 2017-05-19 2018-04-27 (주)뤼이드 Method, apparatus and computer program for providing personalized educational contents through user response prediction framework with machine learning
US11037459B2 (en) * 2018-05-24 2021-06-15 International Business Machines Corporation Feedback system and method for improving performance of dialogue-based tutors
US11574121B2 (en) 2021-01-25 2023-02-07 Kyndryl, Inc. Effective text parsing using machine learning

Patent Citations (78)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3408749A (en) 1967-04-11 1968-11-05 American Can Co Branching-instruction teaching device
US3606688A (en) * 1968-07-19 1971-09-21 Associated Research Lab Propri Method and apparatus for teaching a multiplicity of students
US3566482A (en) * 1968-10-24 1971-03-02 Data Plex Systems Educational device
US3671668A (en) * 1968-11-18 1972-06-20 Leonard Reiffel Teaching system employing a television receiver
US3715811A (en) * 1970-10-13 1973-02-13 Westinghouse Electric Corp Remedial branching teaching system
US4289313A (en) * 1979-09-07 1981-09-15 Delamontagne Robert P Management teaching game apparatus and method
US4416454A (en) * 1979-09-07 1983-11-22 Delamontagne Robert P Management teaching game method
US4817036A (en) 1985-03-15 1989-03-28 Brigham Young University Computer system and method for data base indexing and information retrieval
US5002865A (en) * 1985-04-24 1991-03-26 Konica Corporation Silver halide photographic material
US4833610A (en) 1986-12-16 1989-05-23 International Business Machines Corporation Morphological/phonetic method for ranking word similarities
US4895518A (en) * 1987-11-02 1990-01-23 The University Of Michigan Computerized diagnostic reasoning evaluation system
US5168565A (en) 1988-01-20 1992-12-01 Ricoh Company, Ltd. Document retrieval system
US4958284A (en) * 1988-12-06 1990-09-18 Npd Group, Inc. Open ended question analysis system and method
US5002491A (en) 1989-04-28 1991-03-26 Comtek Electronic classroom system enabling interactive self-paced learning
US5011413A (en) * 1989-07-19 1991-04-30 Educational Testing Service Machine-interpretable figural response testing
US5033969A (en) * 1989-07-21 1991-07-23 Pioneer Electronic Corporation Support device for resolving questions about reproduced information
US5112064A (en) * 1990-06-13 1992-05-12 Weedman Gail H Psychology game
US5424947A (en) 1990-06-15 1995-06-13 International Business Machines Corporation Natural language analyzing apparatus and method, and construction of a knowledge base for natural language analysis
US5307266A (en) 1990-08-22 1994-04-26 Hitachi, Ltd. Information processing system and method for processing document by using structured keywords
US5314340A (en) * 1990-10-30 1994-05-24 Texas Instruments Incorporated Electronic teaching apparatus having two-part partially and wholly actuated for indication of correct and incorrect answers
US5442780A (en) 1991-07-11 1995-08-15 Mitsubishi Denki Kabushiki Kaisha Natural language database retrieval system using virtual tables to convert parsed input phrases into retrieval keys
US5246375A (en) * 1991-09-23 1993-09-21 Wouter Goede Memory aiding device
US5265065A (en) 1991-10-08 1993-11-23 West Publishing Company Method and apparatus for information retrieval from a database by replacing domain specific stemmed phases in a natural language to create a search query
US5441415A (en) * 1992-02-11 1995-08-15 John R. Lee Interactive computer aided natural learning method and apparatus
US5325465A (en) 1992-03-04 1994-06-28 Singapore Computer Systems Limited End user query facility
US5463773A (en) 1992-05-25 1995-10-31 Fujitsu Limited Building of a document classification tree by recursive optimization of keyword selection function
US5511793A (en) * 1992-06-08 1996-04-30 Quantum Development, Inc. Composite chess game and method
US6256399B1 (en) * 1992-07-08 2001-07-03 Ncs Pearson, Inc. Method of distribution of digitized materials and control of scoring for open-ended assessments
US5528491A (en) 1992-08-31 1996-06-18 Language Engineering Corporation Apparatus and method for automated natural language translation
US6168440B1 (en) * 1993-02-05 2001-01-02 National Computer Systems, Inc. Multiple test item scoring system and method
US5475588A (en) 1993-06-18 1995-12-12 Mitsubishi Electric Research Laboratories, Inc. System for decreasing the time required to parse a sentence
US5519608A (en) 1993-06-24 1996-05-21 Xerox Corporation Method for extracting from a text corpus answers to questions stated in natural language by using linguistic analysis and hypothesis generation
US5696962A (en) 1993-06-24 1997-12-09 Xerox Corporation Method for computerized information retrieval using shallow linguistic analysis
US5384703A (en) 1993-07-02 1995-01-24 Xerox Corporation Method and apparatus for summarizing documents according to theme
US5632624A (en) * 1993-09-22 1997-05-27 Brainchild, Inc. Electronic study guide
US5540589A (en) * 1994-04-11 1996-07-30 Mitsubishi Electric Information Technology Center Audio interactive tutor
US5597312A (en) 1994-05-04 1997-01-28 U S West Technologies, Inc. Intelligent tutoring method and system
US5885087A (en) * 1994-09-30 1999-03-23 Robolaw Corporation Method and apparatus for improving performance on multiple-choice exams
US6086382A (en) * 1994-09-30 2000-07-11 Robolaw Corporation Method and apparatus for improving performance on multiple-choice exams
US6553382B2 (en) 1995-03-17 2003-04-22 Canon Kabushiki Kaisha Data management system for retrieving data based on hierarchized keywords associated with keyword names
US5749736A (en) * 1995-03-22 1998-05-12 Taras Development Method and system for computerized learning, response, and evaluation
US5689716A (en) 1995-04-14 1997-11-18 Xerox Corporation Automatic method of generating thematic summaries
US5768580A (en) 1995-05-31 1998-06-16 Oracle Corporation Methods and apparatus for dynamic classification of discourse
US6199034B1 (en) 1995-05-31 2001-03-06 Oracle Corporation Methods and apparatus for determining theme for discourse
US5694523A (en) 1995-05-31 1997-12-02 Oracle Corporation Content processing system for discourse
US5708822A (en) 1995-05-31 1998-01-13 Oracle Corporation Methods and apparatus for thematic parsing of discourse
WO1997018698A2 (en) * 1995-11-08 1997-05-29 Peter Geoffrey Mcdermott Memory aid
US5863208A (en) * 1996-07-02 1999-01-26 Ho; Chi Fai Learning system and method based on review
US6226611B1 (en) * 1996-10-02 2001-05-01 Sri International Method and system for automatic text-independent grading of pronunciation for language instruction
US5987302A (en) * 1997-03-21 1999-11-16 Educational Testing Service On-line essay evaluation system
US6295439B1 (en) * 1997-03-21 2001-09-25 Educational Testing Service Methods and systems for presentation and evaluation of constructed responses assessed by human evaluators
US6164974A (en) 1997-03-28 2000-12-26 Softlight Inc. Evaluation based learning system
US6115683A (en) * 1997-03-31 2000-09-05 Educational Testing Service Automatic essay scoring system using content-based techniques
US6345270B1 (en) 1997-05-26 2002-02-05 Fujitsu Limited Data management system
US6181909B1 (en) * 1997-07-22 2001-01-30 Educational Testing Service System and method for computer-based automatic essay scoring
US6356864B1 (en) * 1997-07-25 2002-03-12 University Technology Corporation Methods for analysis and evaluation of the semantic content of a writing based on vector length
US6311040B1 (en) * 1997-07-31 2001-10-30 The Psychological Corporation System and method for scoring test answer sheets having open-ended questions
US6173251B1 (en) 1997-08-05 2001-01-09 Mitsubishi Denki Kabushiki Kaisha Keyword extraction apparatus, keyword extraction method, and computer readable recording medium storing keyword extraction program
US6120297A (en) * 1997-08-25 2000-09-19 Lyceum Communication, Inc. Vocabulary acquistion using structured inductive reasoning
US6208832B1 (en) 1997-11-14 2001-03-27 Sony Corporation Learning system with response analyzer
US6267601B1 (en) * 1997-12-05 2001-07-31 The Psychological Corporation Computerized system and method for teaching and assessing the holistic scoring of open-ended questions
US6411924B1 (en) 1998-01-23 2002-06-25 Novell, Inc. System and method for linguistic filter and interactive display
US6160987A (en) 1998-01-29 2000-12-12 Ho; Chi Fai Computer-aided group-learning methods and systems
US6029043A (en) 1998-01-29 2000-02-22 Ho; Chi Fai Computer-aided group-learning methods and systems
US6254395B1 (en) * 1998-04-13 2001-07-03 Educational Testing Service System and method for automated testing of writing skill
US6077085A (en) * 1998-05-19 2000-06-20 Intellectual Reserve, Inc. Technology assisted learning
US6287123B1 (en) * 1998-09-08 2001-09-11 O'brien Denis Richard Computer managed learning system and data processing method therefore
US6548470B1 (en) 1998-12-14 2003-04-15 The Procter & Gamble Company Bleaching compositions
US6067538A (en) 1998-12-22 2000-05-23 Ac Properties B.V. System, method and article of manufacture for a simulation enabled focused feedback tutorial system
US6493690B2 (en) 1998-12-22 2002-12-10 Accenture Goal based educational system with personalized coaching
US5987443A (en) 1998-12-22 1999-11-16 Ac Properties B. V. System, method and article of manufacture for a goal based educational system
US6302698B1 (en) * 1999-02-16 2001-10-16 Discourse Technologies, Inc. Method and apparatus for on-line teaching and learning
US6292792B1 (en) * 1999-03-26 2001-09-18 Intelligent Learning Systems, Inc. System and method for dynamic knowledge generation and distribution
US6282404B1 (en) * 1999-09-22 2001-08-28 Chet D. Linton Method and system for accessing multimedia data in an interactive format having reporting capabilities
US6343935B1 (en) * 2000-03-01 2002-02-05 Castle Hill Learning Company, Llc Computerized interactive educational method and apparatus for teaching vocabulary
US6470170B1 (en) 2000-05-18 2002-10-22 Hai Xing Chen System and method for interactive distance learning and examination training
US6461166B1 (en) * 2000-10-17 2002-10-08 Dennis Ray Berman Learning system with learner-constructed response based testing methodology
US6554618B1 (en) 2001-04-20 2003-04-29 Cheryl B. Lockwood Managed integrated teaching providing individualized instruction

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
Crowder, Norman A., Arithmetic of Computers, An Introduction to Binary and Octal Mathematics. A Tutor Text, 1958, pp. i-iv and 1-18, Doubleday & Company, Garden City, NY. *
FullRecall. Retrieved from the Internet on Apr. 25, 2005: <URL: http://www.fullrecall.com>.
Spaced repetition - Wikipedia. Retrieved from the Internet on Apr. 26, 2005: <URL: http://en.wikipedia.org/wiki/Spaced_repetition>.
Super Memory. Retrieved from the Internet on Apr. 25, 2005 <URL: http://www.supermemo.com>.

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040234938A1 (en) * 2003-05-19 2004-11-25 Microsoft Corporation System and method for providing instructional feedback to a user
US20060141438A1 (en) * 2004-12-23 2006-06-29 Inventec Corporation Remote instruction system and method
US20070065797A1 (en) * 2005-09-20 2007-03-22 Ross Elgart System and method of preparing for essay examinations
WO2009089475A1 (en) * 2008-01-09 2009-07-16 Beauchamp Scott E Customized learning and assessment of student based on psychometric models
US20090202969A1 (en) * 2008-01-09 2009-08-13 Beauchamp Scott E Customized learning and assessment of student based on psychometric models
US8506305B2 (en) 2008-12-23 2013-08-13 Deck Chair Learning Systems Inc. Electronic learning system
US8851900B2 (en) 2008-12-23 2014-10-07 Deck Chair Learning Systems Inc. Electronic learning system
US8606170B2 (en) 2012-01-20 2013-12-10 Northrop Grumman Systems Corporation Method and apparatus for interactive, computer-based, automatically adaptable learning

Also Published As

Publication number Publication date
US6461166B1 (en) 2002-10-08

Similar Documents

Publication Publication Date Title
USRE39435E1 (en) Learning system with learner-constructed response based methodology
Uchihara et al. The effects of repetition on incidental vocabulary learning: A meta‐analysis of correlational studies
US10720078B2 (en) Systems and methods for extracting keywords in language learning
Mostow et al. Some useful tactics to modify, map and mine data from intelligent tutors
Zhao Measuring authorial voice strength in L2 argumentative writing: The development and validation of an analytic rubric
Chapelle et al. Assessing language through computer technology
US7062220B2 (en) Automated, computer-based reading tutoring systems and methods
Mislevy et al. Concepts, terminology, and basic models of evidence-centered design
US20060014130A1 (en) System and method for diagnosing deficiencies and assessing knowledge in test responses
US20090202969A1 (en) Customized learning and assessment of student based on psychometric models
US20150199400A1 (en) Automatic generation of verification questions to verify whether a user has read a document
JP2004524559A (en) Automatic paper analysis method
US20090239201A1 (en) Phonetic pronunciation training device, phonetic pronunciation training method and phonetic pronunciation training program
AU2002338449A1 (en) Automated, computer-based reading tutoring system and method.
Yoon et al. Rhetorical structure, sequence, and variation: A step‐driven move analysis of applied linguistics conference abstracts
KR20110079252A (en) Study management system for online lecture, and method for the same
Kobrin et al. The cognitive equivalence of reading comprehension test items via computerized and paper-and-pencil administration
O’Grady Adapting multiple-choice comprehension question formats in a test of second language listening comprehension
KR100532225B1 (en) A study apparatus using network and the method thereof
Malec Developing web-based language tests
CN111127271A (en) Teaching method and system for studying situation analysis
Reinertsen Why can't it mark this one?: A qualitative analysis of student writing rejected by an automated essay scoring system
US11854432B1 (en) Developing an e-rater advisory to detect babel-generated essays
US20220164713A1 (en) Proficiency assessment and instructional feedback using explainable artificial intelligence
Wang et al. Employing the Think-Aloud Method to Explore Listeners' Test-Taking Cognitive Processes

Legal Events

Code: AS — Assignment (effective date: 2003-08-29)
Owner name: DRB LIT LTD., TEXAS
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:BERMAN, DENNIS RAY;REEL/FRAME:014480/0687

Code: CC — Certificate of correction

Code: AS — Assignment (effective date: 2007-01-01)
Owner name: BERMAN, DENNIS R, TEXAS
Free format text: SECURITY AGREEMENT;ASSIGNOR:DRB LIT LTD.;REEL/FRAME:019181/0279

Code: FEPP — Fee payment procedure
Free format text: PAT HOLDER NO LONGER CLAIMS SMALL ENTITY STATUS, ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: STOL); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY

Code: FPAY — Fee payment
Year of fee payment: 8

Code: AS — Assignment (effective date: 2011-12-01)
Owner name: BERMAN, DENNIS R., MR., TEXAS
Free format text: PATENT SECURITY AGREEMENT;ASSIGNOR:DRB LIT LTD.;REEL/FRAME:027371/0283

Code: FEPP — Fee payment procedure
Free format text: PAT HOLDER CLAIMS SMALL ENTITY STATUS, ENTITY STATUS SET TO SMALL (ORIGINAL EVENT CODE: LTOS); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY

Code: AS — Assignment (effective date: 2014-01-31)
Owner name: LOCKIN, LLC, TEXAS
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:TRIVAC LTD.;DRB LIT LTD.;REEL/FRAME:032371/0018

Code: AS — Assignment (effective date: 2014-01-31)
Owner name: DRB LIT LTD., TEXAS
Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:BERMAN, DENNIS R.;REEL/FRAME:032370/0618

Code: FPAY — Fee payment
Year of fee payment: 12

Code: AS — Assignment (effective date: 2014-07-21)
Owner name: MEMORY SCIENCE, LLC, TEXAS
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:LOCKIN, LLC;REEL/FRAME:033367/0085

Code: AS — Assignment (effective date: 2019-05-03)
Owner name: VXI GLOBAL SOLUTIONS LLC, CALIFORNIA
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:MEMORY SCIENCE, LLC;REEL/FRAME:050298/0734