US20140308650A1 - Evaluation control - Google Patents

Evaluation control

Info

Publication number
US20140308650A1
Authority
US
United States
Prior art keywords
evaluation
artifact
indications
portfolio
evaluation criteria
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US14/252,402
Inventor
Miles T. Loring
Paul C. Grudnitski
Vishal Kapoor
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Pearson Education Inc
Original Assignee
Pearson Education Inc
Application filed by Pearson Education Inc filed Critical Pearson Education Inc
Priority to US14/252,402
Assigned to Pearson Education, Inc. Assignors: Vishal Kapoor, Miles T. Loring, Paul C. Grudnitski.
Publication of US20140308650A1
Priority to US15/491,888 (published as US10019527B2)
Priority to US15/719,114 (published as US20180307770A1)
Priority to US16/032,023 (published as US10977257B2)
Current legal status: Abandoned

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20 Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/24 Querying
    • G06F16/245 Query processing
    • G06F16/2457 Query processing with adaptation to user needs
    • G06F16/24578 Query processing with adaptation to user needs using ranking
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20 Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/24 Querying
    • G06F16/242 Query formulation
    • G06F16/243 Natural language query formulation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20 Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/24 Querying
    • G06F16/248 Presentation of query results
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/30 Information retrieval; Database structures therefor; File system structures therefor of unstructured textual data
    • G06F16/31 Indexing; Data structures therefor; Storage structures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/30 Information retrieval; Database structures therefor; File system structures therefor of unstructured textual data
    • G06F16/33 Querying
    • G06F16/338 Presentation of query results
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/90 Details of database functions independent of the retrieved data types
    • G06F16/95 Retrieval from the web
    • G06F16/955 Retrieval from the web using information identifiers, e.g. uniform resource locators [URL]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00 Administration; Management
    • G06Q10/06 Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
    • G06Q10/063 Operations research, analysis or management
    • G06Q10/0639 Performance analysis of employees; Performance analysis of enterprise or organisation operations
    • G06Q10/06398 Performance of employee with respect to a job function
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B7/00 Electrically-operated teaching apparatus or devices working with questions and answers
    • G09B7/02 Electrically-operated teaching apparatus or devices working with questions and answers of the type wherein the student is expected to construct an answer to the question which is presented or wherein the machine gives an answer to the question presented by a student
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00 Systems or methods specially adapted for specific business sectors, e.g. utilities or tourism
    • G06Q50/10 Services
    • G06Q50/20 Education
    • G06Q50/205 Education administration or guidance

Definitions

  • This disclosure relates in general to learning and can include traditional classroom learning or on-line or computerized learning including, but without limitation, learning or instruction with a Learning Management System (LMS) and/or Online Homework System (OHS).
  • Work product is frequently generated during the learning process.
  • The evaluation of the work product is important to facilitating learning and providing feedback to the creator of the work product. While evaluation of work product is important to the learning process, it also requires significant resources. Thus, better systems, methods, and devices are desired to facilitate the evaluation of work product.
  • the present disclosure relates to a system for verifying the evaluation of subject matter.
  • the system includes a processor that receives a portfolio that includes a compilation of artifacts that are work product of an evaluatee, provides one of the artifacts to an evaluator, and receives a plurality of first indications of an evaluation criteria, which evaluation criteria can include a plurality of sub-criteria.
  • the first indications of the evaluation criteria identify a first portion of the artifact and a first portion of the evaluation criteria.
  • the processor can further assign a value to the plurality of indications of the evaluation criteria, which value identifies the source of the indications of the evaluation criteria.
  • the source of the indications of the evaluation criteria is the evaluator.
  • the processor can receive a first evaluation of the artifact, which evaluation can be based on the evaluation criteria and the first indications of the evaluation criteria; receive a second evaluation of the artifact, second indications of the evaluation criteria, and second associated portions of the artifact; and compare the first and second evaluations of the artifact, the first and second indications of the evaluation criteria, and the first and second associated portions of the artifact.
  • the processor can provide an indication of the differences between the first and second evaluations of the artifact, between the first and second indications of the evaluation criteria, and between the first and second associated portions of the artifact.
  • the system can include a memory that can store information relating to the received portfolio, the first evaluation of the artifact, the second evaluation of the artifact, the second indications of the evaluation criteria and the second associated portions of the artifact, and the result of the comparison of the first and second evaluations of the artifact.
  • the processor can select the evaluator, which evaluator can be selected from one or several lists of potential evaluators according to one or several traits of the evaluator and/or of the artifacts being evaluated.
  • the identified portion of the evaluation criteria can be a sub-criterion.
  • the system can include a user device that can display the first evaluation of the artifact, display the first indications of the evaluation criteria, and display first associated portions of the artifact.
  • the comparison of the first and second evaluations of the artifact can include generation of a difference value that characterizes the degree of difference between the first and second evaluations, retrieval of an acceptance threshold that is a value demarking levels of acceptable and unacceptable differences between the first and second evaluations, and comparison of the difference value to the acceptance threshold.
  • the system can include comparing the first indications of the evaluation criteria and first associated portions of the artifact with the second indications of the evaluation criteria and second associated portions of the artifact if the comparison of the difference value to the acceptance threshold indicates an unacceptable level of difference between the first and second evaluations.
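  • For illustration only, the threshold comparison described in the two preceding items can be sketched as follows; the function name, the example scores, and the 0-100 scale are assumptions rather than terms of this disclosure.

```python
# Minimal sketch of the two-stage check: compare the overall evaluations
# first, and escalate to comparing indications (tags) only when the
# difference value exceeds the acceptance threshold.
def evaluations_acceptably_similar(first_score: float,
                                   second_score: float,
                                   acceptance_threshold: float) -> bool:
    difference_value = abs(first_score - second_score)
    return difference_value <= acceptance_threshold

# Example: scores of 82 and 90 against a threshold of 5 differ unacceptably,
# which would trigger the comparison of the first and second indications.
if not evaluations_acceptably_similar(82.0, 90.0, acceptance_threshold=5.0):
    print("compare indications of the evaluation criteria")
```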
  • the processor can generate a final score based on the received plurality of first indications of an evaluation criteria.
  • Some aspects of the present disclosure relate to a method of verifying the evaluation of subject matter.
  • the method includes receiving a portfolio that includes a compilation of artifacts that are work product of an evaluatee, providing one of the artifacts to an evaluator, and receiving a plurality of first indications of an evaluation criteria.
  • the evaluation criteria can be a plurality of sub-criteria.
  • the first indications of the evaluation criteria can identify a first portion of the artifact and a first portion of the evaluation criteria.
  • the method can include assigning a value to the plurality of indications of the evaluation criteria, which value can identify the source of the indications of the evaluation criteria; receiving a first evaluation of the artifact, which evaluation is based on the evaluation criteria and the first indications of the evaluation criteria; and providing the first evaluation of the artifact, the first indications of the evaluation criteria, and first associated portions of the artifact.
  • the method can include receiving a second evaluation of the artifact, second indications of the evaluation criteria, and second associated portions of the artifact; comparing the first and second evaluations of the artifact, indications of the evaluation criteria, and the associated portions of the artifact; and providing an indication of the differences between the first and second evaluations of the artifact, indications of the evaluation criteria, and the associated portions of the artifact.
  • the method can include selecting the evaluator.
  • the identified portion of the evaluation criteria can be a sub-criterion.
  • the comparison of the first and second evaluations of the artifact can include generating a difference value that characterizes the degree of difference between the first and second evaluations, retrieving an acceptance threshold that demarks levels of acceptable and unacceptable differences between the first and second evaluations, and comparing the difference value to the acceptance threshold.
  • the method includes comparing the first indications of the evaluation criteria and first associated portions of the artifact with the second indications of the evaluation criteria and second associated portions of the artifact if the comparison of the difference value to the acceptance threshold indicates an unacceptable level of difference between the first and second evaluations.
  • the method includes generating a final score based on the received plurality of first indications of an evaluation criteria.
  • Some aspects of the present disclosure relate to a method of training an evaluator.
  • the method includes generating a portfolio including a compilation of artifacts that are work product of an evaluatee; receiving evaluation criteria associated with the portfolio, which evaluation criteria can include a plurality of sub-criteria; and receiving a key that includes a plurality of indications of an evaluation criteria.
  • the indications in the key are the correct indications of an evaluation criteria for a portfolio.
  • the indications of the evaluation criteria in the key identify a portion of one of the artifacts and a sub-criterion of the evaluation criteria.
  • Some embodiments of the method include providing the portfolio to a trainee and receiving a portfolio evaluation that includes a plurality of indications of the evaluation criteria.
  • the indications in the portfolio evaluation are received from the trainee, and the indications of the evaluation criteria in the portfolio evaluation identify a portion of one of the artifacts and a sub-criterion of the evaluation criteria.
  • Some embodiments of the method include comparing the key and the portfolio evaluation according to a Boolean function to determine the accuracy of the portfolio evaluation and providing an indicator of the accuracy of the portfolio evaluation.
  • the comparison of the key and the portfolio evaluation includes comparing the score of the artifact in the portfolio evaluation with the score of the artifact in the key. In some embodiments, the comparison of the portfolio evaluation and the key includes generating a difference value that characterizes the degree of difference between the portfolio evaluation and the key, retrieving an acceptance threshold that demarks levels of acceptable and unacceptable differences between the portfolio evaluation and the key, and comparing the difference value to the acceptance threshold. In some embodiments, the comparison of the key and the portfolio evaluation includes the comparison of the indications of the evaluation criteria in the portfolio evaluation to the indications of the evaluation criteria in the key. In some embodiments, the method includes providing additional training material.
  • FIG. 1 is a block diagram illustrating one embodiment of an evaluation control system.
  • FIG. 2 is a block diagram illustrating one embodiment of a user device for use with an evaluation control system.
  • FIG. 3 is a flowchart illustrating one embodiment of a process for evaluation of one or several portfolios and/or artifacts.
  • FIG. 4 is a flowchart illustrating one embodiment of a process for evaluation control.
  • FIG. 5 is a flowchart illustrating one embodiment of a process for generating evaluation data.
  • FIG. 6 is a flowchart illustrating one embodiment of a process for displaying evaluation data.
  • FIG. 7 is a flowchart illustrating one embodiment of a process for training an evaluator.
  • FIG. 8 is a block diagram illustrating one embodiment of a computer system.
  • FIG. 9 is a block diagram illustrating one embodiment of a special-purpose computer.
  • the present disclosure provides a method for verifying the evaluation of the portfolio.
  • This method can include receiving a portfolio that includes work product which can be, for example, generated by one or several individuals.
  • all or portions of this work product are provided to an evaluator and one or several tags are received from the evaluator, which tags identify subject matter relevant to an evaluation criteria.
  • the evaluation of the portfolio and/or of one or several of the artifacts is received and used to generate first evaluation data.
  • This first evaluation data can be compared to second evaluation data which can be, for example, generated by a second evaluator.
  • the degree of differences and the differences between the first evaluation data and the second evaluation data can, in some embodiments, provide information relating to the accuracy of the evaluation. Specifically, the evaluation is more likely accurate if the first and second evaluation data match and/or closely match.
  • the present disclosure provides a method for training an evaluator, also referred to herein as a trainee, to evaluate a portfolio.
  • This method can include generating a training portfolio, evaluation criteria associated with the portfolio, and a second, verified evaluation, also referred to herein as a key.
  • this method can include receiving an indication of tags identifying a portion of the portfolio, a relevant portion of the evaluation criteria, and the trainee, and receiving a portfolio evaluation.
  • the received portfolio evaluation can be compared to the key and differences between the received evaluation and the key can be indicative of the training level of the trainee and/or potential areas for providing further training to the trainee. In some embodiments, these differences, and the degree of difference, can be used to determine when a trainee is a trained evaluator.
  • the evaluation control system 100 collects, receives, and stores data relating to an artifact.
  • the artifact can comprise any work product and can include, for example, digital work product.
  • the digital work product can include written work product and/or recorded work product which can include sound and/or video recordings.
  • the evaluation control system 100 can collect, receive, and store data relating to and/or facilitating the evaluation of one or several artifacts.
  • the evaluation control system 100 can receive and/or create evaluation criteria for one or several artifacts, can receive one or several indicators of portions of the artifact relevant to the evaluation criteria, and can receive an evaluation for one or several artifacts and/or portfolios.
  • the evaluation control system 100 can be configured to assess the evaluation of the one or several artifacts to facilitate training of one or several evaluators and/or for evaluation quality control.
  • the evaluation control system 100 can include a processor 102 .
  • the processor 102 can provide instructions to and receive information from the other components of the evaluation control system 100 .
  • the processor 102 can act according to stored instructions, which stored instructions can be located in memory associated with the processor and/or in other components of the evaluation control system 100 .
  • the processor 102 can comprise a microprocessor, such as a microprocessor from Intel® or Advanced Micro Devices, Inc.®, or the like.
  • the evaluation control system 100 can include one or several databases 104 .
  • the one or several databases 104 can comprise stored data relevant to the functions of the evaluation control system 100 .
  • the one or several databases 104 include a profile database 104 -A.
  • the profile database 104 -A can include profile data for one or several users which can include, for example, one or several evaluators and/or one or several supervisors.
  • the profile database 104 -A can include information relating to the creator of the work product, also referred to herein as an evaluatee.
  • the profile data can include any information relating to the user; in some embodiments, for example, this information can include an indication of the evaluator's progress through a training program, an indication of the quality and/or accuracy of one or several of the evaluator's evaluations, and/or an indication of the type and/or subject matter of artifacts that the evaluator may evaluate.
  • the profile database 104 -A can comprise login information. This information can include, for example, information identifying a user such as, for example, a username and password or a user identification number. In some embodiments, for example, when a user desires to access the evaluation control system 100 , the user can be prompted to enter identification information such as, for example, a username and password. After the user provides the identification information, the evaluation control system 100 can verify the identification information, and specifically, the processor 102 can compare the user-provided identification information to information stored within the profile database 104 -A to determine if the actual user is an authorized user.
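  • A minimal sketch of the credential check just described follows; the profile_db structure, the use of salted SHA-256 hashes, and all names are illustrative assumptions (a deployed system would typically use a dedicated password-hashing scheme rather than plain SHA-256).

```python
# Compare user-provided identification information to stored profile data
# (cf. profile database 104-A) to determine if the user is authorized.
import hashlib
import secrets

def verify_user(profile_db: dict, username: str, password: str) -> bool:
    record = profile_db.get(username)
    if record is None:
        return False  # unknown username: not an authorized user
    # Hash the supplied password with the stored per-user salt (bytes).
    candidate = hashlib.sha256(record["salt"] + password.encode()).hexdigest()
    # Constant-time comparison against the stored hash.
    return secrets.compare_digest(candidate, record["hash"])
```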
  • the one or several databases 104 can include a portfolio database 104 -B.
  • the portfolio database 104 -B can include one or several portfolios.
  • a portfolio can comprise a grouping of one or several artifacts. In some embodiments, these artifacts can be grouped according to type, content, evaluation type which can include information relating to how the artifacts in the portfolio can be evaluated, and/or work product author.
  • the evaluation control system 100 can include a tag database 104 -C.
  • the tag database 104 -C can include information used in evaluating one or several artifacts.
  • the tag database 104 -C can comprise a plurality of tags.
  • the tags can be indications of the application of one or several evaluation criteria or sub-criteria to all or portions of an artifact.
  • a tag can identify a portion of an artifact and a portion of an evaluation criterion relevant to the indicated portion of the work product; in some embodiments, the tag can also indicate the evaluator who created the tag, the time and date of the tag's creation, and/or any other desired information relating to the tag.
  • the tag can include information such as, for example, one or several comments, notes, and/or marks. These comments, notes, and/or marks can be created by the user and can, for example, provide feedback to the creator of the artifact and/or be used in scoring/evaluation of the artifact.
  • the tag can include a tag type. In some embodiments, the tag type can associate the tag with, for example, a portion of a rubric, a positive or negative attribute, or the like.
  • the tag type can indicate a misspelling; a grammatical error; a content error; a good, poor, or adequate technique; a good, average, or bad argument; a good, average, or poor use of content; a correct answer; an incorrect answer; or the like.
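  • One possible in-memory representation of a tag, combining the elements described in the preceding items, is sketched below; the field names are hypothetical and are not defined by this disclosure.

```python
# A tag identifies a portion of an artifact, the evaluation criterion or
# sub-criterion applied to it, its creator and creation time, a tag type,
# and an optional comment, note, or mark.
from dataclasses import dataclass, field
from datetime import datetime
from typing import Optional

@dataclass
class Tag:
    artifact_id: str             # artifact containing the tagged portion
    start: int                   # start of the tagged portion (character or time offset)
    end: int                     # end of the tagged portion
    criterion_id: str            # evaluation criterion or sub-criterion applied
    evaluator_id: str            # evaluator who created the tag
    tag_type: str = "neutral"    # e.g., "misspelling", "content_error", "good_argument"
    note: Optional[str] = None   # comment, note, or mark for feedback or scoring
    created_at: datetime = field(default_factory=datetime.utcnow)
```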
  • the evaluation control system 100 can include an evaluation database 104 -D.
  • the evaluation database 104 -D can include information that can facilitate the performing of an evaluation of one or several artifacts and/or portfolios. In some embodiments, this information can include one or several evaluation criteria for use in evaluating one or several artifacts and/or portfolios. In some embodiments, the evaluation criteria can comprise, for example, a rubric and/or other scoring aid. The evaluation criteria can, in some embodiments, comprise one or several sub-criteria that can, for example, focus on a specific aspect of the evaluation.
  • the evaluation database 104 -D can include information identifying the result of a started, partially completed, and/or completed evaluation. In some embodiments, the information in the evaluation database 104 -D can be organized by portfolio, artifact, evaluator, or in any other desired manner.
  • the evaluation database 104 -D can include information linking one or several artifacts' and/or portfolios' tags with one or several evaluation criteria and/or the result of the evaluation of one or several artifacts and/or portfolios.
  • the evaluation database 104 -D can include information used in assessing an evaluation of one or several artifacts and/or portfolios.
  • this information can include a second evaluation of the one or several artifacts and/or portfolios.
  • this second evaluation can be a verified second evaluation that reflects a desired and/or ideal evaluation, and in some embodiments, the second evaluation can be a non-verified evaluation.
  • the evaluation control system 100 can include one or several user devices 106 , which can include an evaluator device 106 -A and/or a supervisor device 106 -B.
  • the user devices 106 allow a user, including an evaluator, a supervisor, a trainer, and/or a trainee to access the evaluation control system 100 .
  • the details and function of the user devices 106 will be discussed at greater length in reference to FIG. 2 below.
  • the evaluation control system 100 can include a data source 108 , also referred to as a repository.
  • the data source 108 can be the source of the one or several artifacts and/or portfolios, the source of the one or several evaluation criteria, and/or the source of one or several second evaluations.
  • the data source can comprise an educational service provider, such as, for example, a school, a university, a college, and/or a Learning Management System (LMS).
  • the evaluation control system 100 can include a network 110 .
  • the network 110 allows communication between the components of the evaluation control system 100 .
  • the network 110 can be, for example, a local area network (LAN), a wide area network (WAN), a wired network, wireless network, a telephone network such as, for example, a cellphone network, the Internet, the World Wide Web, or any other desired network.
  • the network 110 can use any desired communication and/or network protocols.
  • the user device 106 can be configured to provide information to and/or receive information from other components of the evaluation control system 100 .
  • the user device can access the evaluation control system 100 through any desired means or technology, including, for example, a webpage, a web portal, or via network 110 .
  • the user device 106 can include a network interface 200 .
  • the network interface 200 allows the user device 106 to access the other components of the evaluation control system 100 , and specifically allows the user device 106 to access the network 110 of the evaluation control system 100 .
  • the network interface 200 can include features configured to send and receive information, including, for example, an antenna, a modem, a transmitter, receiver, or any other feature that can send and receive information.
  • the network interface 200 can communicate via telephone, cable, fiber-optic, or any other wired communication network.
  • the network interface 200 can communicate via cellular networks, WLAN networks, or any other wireless network.
  • the user device 106 can include a user interface 202 that communicates information to, and receives inputs from, a user.
  • the user interface 202 can include a screen, a speaker, a monitor, a keyboard, a microphone, a mouse, a touchpad, a keypad, or any other feature or features that can receive inputs from a user and provide information to a user.
  • the user device 106 can include a review engine 204 .
  • the review engine 204 can be configured to receive one or several artifacts and/or portfolios from the portfolio database 104 -B and provide the one or several artifacts and/or portfolios to the user via, for example, the user interface 202 .
  • the review engine can include features and/or software that allow it to provide a range of artifact types to the user including, for example, images, written documents, recordings, including sound and/or video recordings, and/or any other desired artifact type.
  • the user device 106 can include a tagging engine 206 .
  • the tagging engine can be configured to allow a user to add, remove, and/or edit a tag that can be, for example, associated with one or several artifacts and/or one or several portfolios.
  • the tagging engine 206 can be configured to allow a user to tag a portion of one or several artifacts and/or portfolios.
  • the tag can identify a portion of the one or several artifacts and/or portfolios including, for example, a starting point, an ending point, and/or a duration of the portion of the one or several artifacts and/or portfolios.
  • the tagging engine 206 can be configured to allow a user to associate the portion of one or several artifacts and/or portfolios with one or several evaluation criteria and/or evaluation sub-criteria.
  • the user device 106 can include an evaluation engine 208 .
  • the evaluation engine 208 can be configured to allow a user to evaluate one or several artifacts and/or portfolios.
  • the evaluation engine can be configured to group one or several tags associated with one or several artifacts and/or portfolios and provide these grouped tags to the user.
  • the evaluation engine 208 can group tags associated with one artifact and with one evaluation criteria and/or evaluation sub-criteria, and provide these tags to the user.
  • the evaluation engine 208 can be configured to allow the user to review the portions of the one or several artifacts and/or portfolios associated with the tags, and to receive an evaluation from the user based on those tags.
  • the evaluation engine 208 can be configured to allow the review of an evaluation of one or several artifacts and/or portfolios. In some embodiments, for example, the evaluation engine 208 can be configured to compare the evaluation of one or several artifacts and/or portfolios with a second evaluation that can be, for example, the verified evaluation. In some embodiments, the evaluation engine 208 can be further configured to generate and provide a comparison report identifying the differences between the evaluation and the second evaluation and indicating whether the evaluation is acceptable.
  • In FIG. 3 , a flowchart illustrating one embodiment of a process 300 for evaluation of one or several portfolios and/or artifacts is provided.
  • the process 300 can be performed by the evaluation control system 100 and/or components of the evaluation control system 100 .
  • the process 300 begins at block 302 wherein a portfolio is received.
  • the portfolio can be received by and/or from a component of the evaluation control system 100 , and in one embodiment, the portfolio can be received from the data source 108 .
  • the portfolio can comprise one or several artifacts which can be a collection of work product.
  • this work product can be generated by a user of the data source 108 , and in some embodiments, this work product can be collected by the data source 108 .
  • the portfolio can be stored within one of the databases 104 including, for example, the portfolio database 104 -B.
  • the process 300 proceeds to block 304 wherein the artifacts are provided.
  • the artifacts can be provided to the user via one of the user devices 106 including, for example, the evaluator device 106 -A.
  • the user device 106 can provide the artifacts to the user via the user interface 202 .
  • the artifacts can be retrieved from the one or several portfolios stored within the portfolio database 104 -B.
  • the processor 102 can query the portfolio database 104 -B for a stored artifact.
  • one or several artifacts can be selected from the portfolio database 104 -B and can be provided to the user.
  • the process 300 proceeds to block 306 wherein a tag is received and/or applied.
  • the tag can be received via one of the user devices 106 such as, for example, the evaluator device 106 -A, and can be stored in one or several of the databases 104 including tag database 104 -C.
  • a tag can be applied in that the tag, and the data relevant to the tag, is stored in one of the databases 104 .
  • the tag can identify a portion of the artifact, can identify a portion of the evaluation criteria relevant to the portion of the artifact, can include a note relating to the evaluation criteria and/or to the tagged portion of the artifact, and/or can identify the user adding, removing, and/or editing the tag.
  • the process 300 proceeds to block 308 wherein the evaluation is applied.
  • the evaluation can be applied based on the tags associated with the artifact and/or stored in the tag database 104 -C.
  • the evaluation can be applied based on the number of tags associated with one or several of the evaluation criteria and/or sub-criteria and/or based on information relating to the evaluation criteria and/or sub-criteria that do not have a related tag and/or have fewer related tags than a threshold value.
  • the application of the evaluation can, in some embodiments, be received from the user via the user device 106 and/or generated by the processor 102 .
  • the process 400 for evaluation control can be performed by the evaluation control system 100 and/or a component thereof.
  • the process 400 begins at block 402 wherein the portfolio is received.
  • the portfolio can be received by and/or from a component of the evaluation control system 100 , and in one embodiment, the portfolio can be received from the data source 108 .
  • the portfolio can comprise one or several artifacts which can be a collection of work product.
  • this work product can be generated by a user of the data source 108 , and in some embodiments, this work product can be collected by the data source 108 .
  • after the portfolio has been received, the portfolio can be stored within one of the databases 104 including, for example, the portfolio database 104 -B.
  • the process 400 proceeds to block 404 wherein an indication of artifacts is provided.
  • the indication of artifacts can be provided to the user via one of the user devices 106 such as, for example, via the evaluator device 106 -A.
  • the indication of artifacts can comprise an indicator of artifacts stored within the portfolio database 104 -B. In some embodiments, this indicator can comprise a listing, table, and/or index of artifacts stored in the portfolio database 104 -B.
  • the indication of the artifacts can be retrieved from the one or several portfolios stored within the portfolio database 104 -B.
  • the processor 102 can query the portfolio database 104 -B for indications of artifacts stored within the portfolio database 104 -B. These indications of artifacts stored within the portfolio database 104 -B can be provided to the user via, for example, the user interface 202 .
  • the process 400 proceeds to block 406 wherein a selection of one or several of the artifacts is received.
  • the selection of one or several of the artifacts can be received via one of the user devices 106 such as, for example, the evaluator device 106 -A.
  • the selected one or several artifacts can correspond to provided indications of artifacts stored within the portfolio database.
  • the process 400 proceeds to block 408 wherein an artifact is provided.
  • the artifacts can be provided to the user via one of the user devices 106 including, for example, the evaluator device 106 -A.
  • the user device 106 can provide the artifacts to the user via the user interface 202 .
  • the artifacts can be retrieved from the one or several portfolios stored within the portfolio database 104 -B.
  • the processor 102 can query the portfolio database 104 -B for a stored artifact.
  • one or several artifacts can be selected from the portfolio database 104 -B and can be provided to the user.
  • the process 400 proceeds to block 410 wherein a tag is received and/or applied.
  • the tag can be received from the user via the user device 106 , and specifically via, for example, the evaluator device 106 -A.
  • the tag can identify a portion of one or several artifacts and/or portfolios, a portion of a relevant evaluation criteria and/or evaluation sub-criteria, and/or an indicator of the identification of the person and/or evaluator adding, removing, and/or editing the tag.
  • the tag can be applied in that the tag and/or the portion of the artifact associated with the tag is stored.
  • the received tag can be stored in one of the databases 104 including, for example, the tag database 104 -C.
  • the process 400 proceeds to block 412 wherein the tag is correlated to the evaluation criteria.
  • this correlation can include retrieving tag information identifying a related one or several evaluation criteria and/or evaluation sub-criteria and storing this information within the evaluation database.
  • this can be performed by the processor 102 and/or by another component of the evaluation control system 100 including, for example, by the user device 106 and/or component thereof such as the tagging engine 206 and/or the evaluation engine 208 .
  • this evaluation can be received from and/or performed with one of the user devices 106 and/or other components of the evaluation control system 100 .
  • this step can include the grouping of one or several tags, providing the group of one or several tags to the user, and receiving an evaluation based on these tags and the evaluation criteria.
  • the received evaluation can be stored in one of the databases 104 such as, for example, the evaluation database 104 -D.
  • the process 400 proceeds to block 416 wherein portions of the one or several artifacts and/or portfolios identified by the one or several tags are saved.
  • these portions of the artifact can be saved within one of the databases 104 , and specifically within the portfolio database 104 -B and/or the tag database 104 -C.
  • the process 400 proceeds to block 418 wherein a first evaluation data is generated.
  • the first evaluation data can comprise information relating to the evaluation and allowing the re-creation of the evaluation. This information can include the evaluation of one or several artifacts and/or portfolios provided by the evaluator, the one or several tags associated with the one or several artifacts and/or portfolios identified by the evaluator, saved portions of the one or several artifacts and/or portfolios identified by the one or several tags, and/or an indicator of the identity of the evaluator.
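  • The first evaluation data might be represented as in the following sketch, which reuses the hypothetical Tag record from the earlier example; the field names are assumptions.

```python
# Container for evaluation data: the overall evaluation, the applied tags,
# the saved portions identified by the tags, and the evaluator's identity,
# which together allow the re-creation of the evaluation.
from dataclasses import dataclass
from typing import List

@dataclass
class EvaluationData:
    evaluator_id: str           # indicator of the identity of the evaluator
    artifact_ids: List[str]     # evaluated artifacts and/or portfolios
    overall_score: float        # the evaluation provided by the evaluator
    tags: List["Tag"]           # tags identified by the evaluator
    tagged_excerpts: List[str]  # saved portions identified by the tags
```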
  • the process 400 proceeds to decision state 420 wherein it is determined if second evaluation data has been received.
  • this determination can be made by the processor 102 and can include querying one or several of the databases 104 including, for example, the evaluation database 104 -D.
  • the process 400 proceeds to block 422 wherein evaluation data is provided.
  • the evaluation data can be provided to one or several of the user devices 106 including, for example, the evaluator device 106 -A and/or the supervisor device 106 -B.
  • this comparison can include comparing the evaluation of the one or several artifacts and/or portfolios to determine if the evaluation is the same, the comparison of the tags associated with the evaluation criteria of the first and second evaluation data to determine similarities and/or differences in tags applied in both instances, and/or the comparison of portions of the one or several artifacts and/or portfolios identified by the tags.
  • this comparison of the evaluation data can include comparison of the tags associated with the first and second evaluation data to determine the similarities/differences in the two taggings of the artifact. In one embodiment, for example, this comparison can include determining whether the tags associated with the two evaluations each identify the same positive and/or negative aspects/attributes of the artifact. In some embodiments, this comparison can include comparing the linking between the content of the artifact to the evaluation criteria to determine whether the evaluators linked similarly tagged content to similar portions of the evaluation criteria. Similarly, in some embodiments, this comparison can include a comparison of the overall evaluation and/or score for the artifact and/or portfolio.
  • the overall evaluations and/or scores for the artifact and/or portfolio can be compared. In some embodiments, if the comparison of the overall evaluations and/or scores for the artifact and/or portfolio indicate sufficient difference, then the association of the tags with content of the artifact and/or portfolio and/or the association of the tags with the evaluation criteria are compared.
  • the overall evaluation and/or score of the first and second evaluations are compared to generate a difference value indicating the degree of difference between the overall evaluation and/or score of the first and second evaluations.
  • this difference value is compared to an acceptance threshold.
  • the acceptance threshold can identify a degree of difference between evaluations that identifies acceptable/unacceptable differences between the evaluations and/or triggers additional comparison of the first and second evaluations.
  • if the comparison of the difference value to the acceptance threshold indicates that the evaluations are adequately similar, the process proceeds to decision state 426 , discussed at greater length below, whereas, if the comparison indicates that the evaluations are inadequately similar, then the comparison of the evaluation data can include a comparison of the tags as discussed above.
  • this comparison of the evaluation data can be performed by the processor 102 and/or by the evaluation engine 208 of one of the user devices 106 . In some embodiments, for example, this comparison can be performed according to a Boolean function wherein matching aspects of the first and second evaluation data are assigned a first value and nonmatching aspects of the first and second evaluation data are assigned a second value.
  • this determination can include determining whether there are any differences between the first and second evaluation data which can include, for example, determining whether the evaluations of the one or several artifacts and/or portfolios are different, determining whether one or several applied tags are different, and/or determining whether the portions of the one or several artifacts and/or portfolios identified by the tags are different. In some embodiments, for example, this determination can be performed with reference to values assigned to the first and second evaluation data as discussed above, and can be performed by the processor 102 and/or the evaluation engine 208 . If it is determined that there are no differences and/or that a difference threshold has not been met, then the process 400 proceeds to block 422 wherein evaluation data is provided.
  • the process 400 proceeds to block 428 wherein a difference report is generated.
  • the difference report can, for example, identify the differences between the first and second evaluation data including, for example, differences in the evaluation of the one or several artifacts and/or portfolios, differences in the tags applied to the one or several artifacts and/or portfolios, and/or differences in the portions of the artifacts and/or portfolios identified by the tags.
  • the difference report can be generated by the evaluation control system 100 , and can specifically be generated by the processor 102 and/or one of the user devices 106 or component thereof including, for example, the evaluation engine 208 .
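  • As a hedged illustration, the Boolean comparison and difference report described above might look like the sketch below; identifying a tag by its artifact, span, and criterion is an assumption, and the EvaluationData/Tag types are the hypothetical ones sketched earlier.

```python
# Matching aspects of the first and second evaluation data are assigned one
# Boolean value and nonmatching aspects the other; the nonmatching aspects
# are collected into a difference report.
def generate_difference_report(first: "EvaluationData",
                               second: "EvaluationData") -> list:
    report = []
    if first.overall_score != second.overall_score:
        report.append(("evaluation", first.overall_score, second.overall_score))

    def tag_key(t):  # identity of a tag for comparison purposes
        return (t.artifact_id, t.start, t.end, t.criterion_id)

    first_tags = {tag_key(t) for t in first.tags}
    second_tags = {tag_key(t) for t in second.tags}
    # Symmetric difference: tags applied in one evaluation but not the other.
    for mismatch in sorted(first_tags ^ second_tags):
        report.append(("tag", mismatch))
    return report
```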
  • the process 400 proceeds to block 430 wherein the difference report is provided.
  • the difference report can be provided to the user via one of the user devices including, for example, the supervisor device 106 -B, and specifically the user interface 202 of the user device 106 .
  • the process 400 proceeds to decision state 432 wherein it is determined if additional evaluation data has been received.
  • the additional evaluation data can be, for example, third evaluation data, fourth evaluation data, fifth evaluation data, sixth evaluation data, and/or any other evaluation data including, for example, nth evaluation data.
  • evaluation data can be collected until there is a convergence of the evaluation data.
  • additional evaluation data can be data relating to a further evaluation of one or several artifacts and/or portfolios that can be used to determine which of the first and/or second evaluation data is accurate, accurately reflects a correct evaluation of one or several artifacts and/or portfolios, and/or is the most accurate.
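  • One plausible reading of collecting evaluation data until there is a convergence is sketched below; the window size and tolerance are assumptions made only for illustration.

```python
# Treat the evaluation data as converged when the most recent overall scores
# agree to within a tolerance.
def scores_converged(overall_scores: list,
                     window: int = 3,
                     tolerance: float = 2.0) -> bool:
    recent = overall_scores[-window:]
    return len(recent) == window and max(recent) - min(recent) <= tolerance

# Example: first, second, and third evaluation data scoring 84, 86, and 85
# would be treated as converged under a tolerance of 2.
assert scores_converged([84.0, 86.0, 85.0])
```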
  • this determination can be made by the processor 102 and can include querying one or several of the databases 104 including, for example, the evaluation database 104 -D. If it is determined that there is no additional evaluation data, the process 400 can terminate.
  • this comparison can include comparing one or both of the first and second evaluation data with the additional evaluation data.
  • this comparison can include comparing the evaluation of the one or several artifacts and/or portfolios to determine if the evaluation is the same, comparing the tags associated with the evaluation criteria of the first, second, and/or additional evaluation data to determine similarities and/or differences in tags applied in those instances, and/or the comparison of portions of the one or several artifacts and/or portfolios identified by the tags.
  • this comparison can be performed by the processor 102 and/or by the evaluation engine 208 of one of the user devices 106 . In some embodiments, for example, this comparison can be performed according to a Boolean function wherein matching aspects of the first, second, and/or additional evaluation data are assigned a first value and nonmatching aspects are assigned a second value.
  • the process 400 can proceed to block 436 wherein a discrepancy report is provided.
  • the discrepancy report can identify differences between the first and second evaluation data and the additional evaluation data.
  • the discrepancy report can identify which of the first and/or second evaluation data most closely approximates the additional evaluation data.
  • the discrepancy report can be provided to the user via one of the user devices 106 including, for example, the supervisor device 106 -B, and specifically the user interface 202 of the user device 106 .
  • In FIG. 5 , a flowchart illustrating one embodiment of a process 500 for generating evaluation data is shown.
  • the process 500 can be performed as part of, or in the place of block 418 shown in FIG. 4 .
  • the process 500 can be performed by the processor 102 , the evaluation engine 208 of one of the user devices 106 , and/or by any other component of the evaluation control system.
  • the process 500 begins at decision state 502 , wherein it is determined whether to display data, and particularly, whether to display all or portions of the evaluation data including, for example, the first evaluation data. In some embodiments, this can include determining whether a user request for the display of evaluation data has been made, which user request can be a specific user request or a general rule or request to display evaluation data. If it is determined that evaluation data will not be displayed, then the process 500 proceeds to block 504 and returns to block 420 of FIG. 4 .
  • if it is determined that evaluation data will be displayed, then the process 500 proceeds to block 506 wherein artifacts are received.
  • this can include receiving all artifacts for which the evaluation data is relevant including, for example, all evaluated artifacts for a student, a class, a grade, a study, or any other group of artifacts.
  • these artifacts can be retrieved from, for example, one of the databases 104 such as, for example, the portfolio database 104 -B.
  • the process 500 proceeds to block 508 , wherein the tags are received/retrieved.
  • the retrieval of the tags can include the retrieval of information associated with the tags, including, for example, one or several comments, notes, or marks created and/or associated with the tags.
  • the tags can be retrieved from one of the databases such as, for example, the tag database 104 -C.
  • the process 500 proceeds to block 510 where grouping criteria are received.
  • the grouping criteria can include one or several rules for categorizing tags. In some embodiments, these one or several rules can categorize tags according to the artifact with which a tag is associated, the type of tag, the type of comment, note, or mark associated with the tag, or the like.
  • the grouping criteria can be created by the user and can be received via one or several of the user devices 106 and can be stored in one of the databases 104 such as the evaluation database 104 -D.
  • the process 500 proceeds to block 512 wherein the tags are grouped.
  • the grouping of the tags can include, for example, grouping the tags according to one or several attributes of the tag including, for example, the tag type, tag content, including any comment, note, or mark associated with the tag, or the like. In some embodiments, this grouping can be performed according to the grouping criteria.
  • the grouping of the tags can include storing information identifying the grouping of one or several of the tags, which data can be stored in, for example, the tag database 104 -C.
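  • The grouping of tags under received grouping criteria might be realized as in the following sketch, where a grouping criterion is modeled, by assumption, as a key function over the hypothetical Tag record.

```python
# Group tags by an attribute such as the associated artifact, the tag type,
# or the type of comment, note, or mark.
from collections import defaultdict

def group_tags(tags, grouping_key):
    groups = defaultdict(list)
    for tag in tags:
        groups[grouping_key(tag)].append(tag)
    return dict(groups)

# Example: categorize tags by type, e.g., collect every "content_error" tag.
# by_type = group_tags(tags, grouping_key=lambda t: t.tag_type)
```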
  • the process 500 proceeds to block 514 , wherein the tags are displayed.
  • the tags can be displayed to the user via, for example, the user device 106 .
  • the tags and artifacts can be simultaneously displayed to the user such that the tags are located in a first display portion and one or several artifacts, or portions thereof, are located in a second display portion.
  • the first and second display portions can be first and second portions of a display such as, for example, a screen or monitor.
  • In FIG. 6 , a flowchart illustrating one embodiment of a process 600 for displaying evaluation data is shown.
  • the process 600 can be performed as part of, or in the place of block 514 shown in FIG. 5 .
  • the process 600 can be performed by the processor 102 , the evaluation engine 208 of one of the user devices 106 , and/or by any other component of the evaluation control system.
  • the process 600 begins at block 602 wherein tag data is retrieved.
  • this retrieval of tag data can be the same as the receipt of tags in block 508 , and in some embodiments, this retrieval of tag data can include the retrieval of tag data in addition to that retrieved in block 508 .
  • This tag data can be retrieved from one of the databases 104 such as, for example, the tag database 104 -C.
  • the process 600 proceeds to block 604 wherein a count is incremented for each tag associated with received/retrieved tag data. In some embodiments, this incrementing can be performed based on all of the information received in one or both of blocks 508 and 602 . In some embodiments, the count can be stored in one of the databases 104 . After the count has been incremented for each of the received/retrieved tags, the process 600 proceeds to block 606 wherein the number of tags is determined. In some embodiments, this can be achieved via retrieval of the count.
  • after the number of tags has been determined, the process 600 proceeds to block 608 wherein tag type is extracted from the tag data.
  • this tag type information can be stored as part of the tag data, and can be extracted from the tag data received in block 602 . In some embodiments, this extraction can be performed by the processor 102 and/or the user device 106 .
  • the process 600 proceeds to block 610 wherein a tag score is generated.
  • the tag score can reflect the degree to which a tag affects the score of one or several artifacts.
  • the tag score can indicate a degree to which the tag increments, decrements, or does not affect an artifact score.
  • a tag indicating a negative aspect of an artifact can, based on the strength of the negative aspect of the artifact, decrease the score of the artifact.
  • a tag indicating a positive aspect of an artifact can, based on the strength of the positive aspect of the artifact, increase the score of the artifact.
  • the tag score can be stored in one of the databases 104 such as the tag database 104 -C.
  • the tag score can be generated according to scoring rules that can be stored in one of the databases 104 such as, for example, the evaluation database 104 -D.
  • the process 600 proceeds to block 612 , wherein a sum score is calculated.
  • the sum score can be a value representing the aggregate effect of some or all of the tags.
  • the sum score can be a rough score that can be converted to a final score for an artifact and/or a final score for a portfolio.
  • the sum score can be calculated by the combination of tag scores, which combination can include the addition of tag scores, subtraction of tag scores, and the application of one or several weighting factors to some or all of the tag scores based on the relative importance and/or weight associated with some or all of the tag scores.
  • the sum score can be stored in one of the databases 104 such as the tag database 104 -C.
  • the sum score can be generated according to scoring rules that can be stored in one of the databases 104 such as, for example, the evaluation database 104 -D.
  • the process 600 proceeds to block 614 wherein the sum score is compared to scoring data.
  • in embodiments in which the sum score is a rough score, this comparison can include the conversion of the sum score to a final score.
  • the final score can, in some embodiments, be a recommended final score, and/or final score range.
  • an evaluator may be able to select a score other than the recommended final score, and in some embodiments, the evaluator may be limited to selecting a score corresponding to the recommended final score, including a score from the range indicated by the recommended final score.
  • this conversion can include comparison of the sum score to scoring data, application of a scoring algorithm, or the like. In some embodiments, this conversion can be performed by the processor 102 and/or user device 106 .
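  • Blocks 610 through 614 might be realized as in the sketch below; the per-type scores, the weighting factors, the baseline, and the 0-100 scale are all illustrative assumptions rather than scoring rules from this disclosure.

```python
# Per-tag scores from hypothetical scoring rules, a weighted sum score, and
# conversion of the rough sum score to a recommended final score.
TAG_SCORES = {"good_argument": 2.0, "misspelling": -0.5, "content_error": -3.0}

def tag_score(tag, weights=None) -> float:
    weight = (weights or {}).get(tag.tag_type, 1.0)    # relative importance
    return TAG_SCORES.get(tag.tag_type, 0.0) * weight  # block 610

def sum_score(tags, weights=None) -> float:
    # Aggregate effect of some or all of the tags (block 612).
    return sum(tag_score(t, weights) for t in tags)

def recommended_final_score(rough_score: float, baseline: float = 70.0) -> float:
    # Convert the rough sum score to a 0-100 final score (block 614).
    return max(0.0, min(100.0, baseline + rough_score))
```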
  • the process 600 proceeds to block 616 , wherein the score is retrieved.
  • this score can be the final score, and the retrieval of this score can be the receipt of the result of the scoring algorithm, the output of the comparison of the sum score to the scoring data, or the like.
  • the final score can be stored within one of the databases 104 such as, for example, the evaluation database 104 -D.
  • Process 700 can be used to provide an evaluation task to a trainee and to qualify the results of that evaluation task.
  • the process 700 can be performed by the evaluation control system 100 and/or components of the evaluation control system 100 .
  • Process 700 begins at block 702 wherein the portfolio is generated.
  • a portfolio can be generated specifically for purposes of training an evaluator, which evaluator in training is also referred to herein as a trainee.
  • the generated portfolio can be created with the user device 106 such as, for example, the supervisor device 106 -B and/or with the data source 108 .
  • the process 700 proceeds to block 704 wherein evaluation criteria are received.
  • the evaluation criteria can be used in evaluating one or several artifacts and/or portfolios and can comprise indications of features of the one or several artifacts and/or portfolios, and a scoring effect of those features.
  • the evaluation criteria can comprise, for example, a rubric and/or other scoring aid.
  • the evaluation criteria can, in some embodiments, comprise one or several sub-criteria that can, for example, focus on a specific aspect of the evaluation.
  • the evaluation criteria can be created with the user device 106 such as, for example, the supervisor device 106 -B and/or with the data source 108 .
  • the process proceeds to block 706 wherein a key is received.
  • the key can comprise a second evaluation of the generated portfolio, and specifically, a verified evaluation of the portfolio.
  • the key can include information relating to the evaluation of the portfolio, and specifically to the ideal overall evaluation of the portfolio and/or one or several artifacts in the portfolio, information relating to ideal tagging associated with the portfolio and the evaluation criteria, and portions of the portfolio and/or one or several artifacts indicated by the ideal tags.
  • the process 700 proceeds to block 708 wherein the portfolio is provided.
  • the portfolio and/or one or several artifacts in the portfolio can be provided to the user via one of the user devices 106 including, for example, the evaluator device 106-A.
  • the user device 106 can provide the portfolio and/or one or several artifacts in the portfolio to the user via the user interface 202.
  • the artifacts can be retrieved from the one or several portfolios stored within the portfolio database 104-B.
  • the processor 102 can query the portfolio database 104-B for the stored portfolio and/or one or several artifacts in the portfolio.
  • the desired portfolio and/or one or several artifacts in the desired portfolio can be selected from the portfolio database 104-B and can be provided to the user.
  • the process 700 proceeds to blocks 710 through 712, which outline in greater detail the step of receiving the tag indicated in block 410 of FIG. 4.
  • the step of receiving the tag begins with block 710 wherein an indicator of one or several artifact and/or portfolio portions is received.
  • this indicator can be received via one of the user devices 106 including, for example, the evaluator device 106-A.
  • this can include an indication of one or several tagged portions of the one or several artifacts and/or portfolios.
  • this indicator can identify the tagged portion of the one or several artifacts and/or portfolios and can, for example, identify the beginning and/or end of the tagged portion of the one or several artifacts and/or portfolios.
  • this indicator can be stored in one of the databases 104 including, for example, the tag database 104-C.
  • the process 700 proceeds to block 712 wherein an indicator of a criteria is received.
  • the indicator of the criteria can be received via one of the user devices 106 including, for example, the evaluator device 106-A.
  • the indicator of the criteria can identify a portion of the evaluation criteria relevant to the indicated portion of the one or several artifacts and/or portfolios. In some embodiments, this indicator can be stored in one of the databases 104 including, for example, the tag database 104-C.
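For illustration (an editor's sketch, not from the disclosure), the indicators of blocks 710 and 712 could be combined into a single stored tag record; `tag_db` stands in for the tag database 104-C, and all names are hypothetical.

```python
# Editor's illustrative sketch of blocks 710-712; `tag_db` stands in for the
# tag database 104-C, and every name below is a hypothetical assumption.

def receive_tag(tag_db, artifact_id, start, end, criterion_id, evaluator_id):
    """Combine the artifact-portion indicator and the criteria indicator
    into one stored tag record."""
    tag = {
        "artifact_id": artifact_id,    # which artifact the portion belongs to
        "start": start, "end": end,    # beginning and end of the tagged portion
        "criterion_id": criterion_id,  # relevant portion of the evaluation criteria
        "evaluator_id": evaluator_id,  # who applied the tag (e.g., the trainee)
    }
    tag_db.append(tag)
    return tag

tag_db = []
receive_tag(tag_db, "artifact-1", start=120, end=184,
            criterion_id="rubric-2a", evaluator_id="trainee-7")
```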
  • after the indicator of the criteria has been received, the process 700 proceeds to block 714 wherein an indicator of the trainee is received. In some embodiments, this indicator can be received via one of the user devices 106 including, for example, the evaluator device 106-A. In some embodiments, this indicator can be used to determine the skill level of the trainee in evaluating a portfolio. In some embodiments, this indicator can be stored in one of the databases 104 including, for example, the tag database 104-C.
  • the process 700 proceeds to block 716 wherein the portfolio evaluation is received.
  • the portfolio evaluation can be received from and/or performed with one of the user devices 106 and/or other components of the evaluation control system 100.
  • this step can include the grouping of one or several tags, providing the group of one or several tags to the user, and receiving an evaluation based on these tags and the evaluation criteria.
  • the received evaluation can be stored in one of the databases 104 such as, for example, the evaluation database 104-D.
  • after the portfolio evaluation has been received, the process 700 proceeds to block 718 wherein the portfolio evaluation is compared to the key. In some embodiments, this comparison can include comparing the evaluation of the one or several artifacts and/or portfolios to determine if the evaluation and/or score is the same, comparing the tags associated with the evaluation criteria of the first and second evaluation data to determine similarities and/or differences in tags applied in both instances, and/or the comparison of portions of the one or several artifacts and/or portfolios identified by the tags.
  • this comparison can be performed by the processor 102 and/or by the evaluation engine 208 of one of the user devices 106.
  • this comparison can be performed according to a Boolean function wherein matching aspects of the portfolio evaluation and the key are assigned a first value and nonmatching aspects are assigned a second value. In some embodiments, this comparison can be performed in the same and/or similar manner to the comparison of block 424 of FIG. 4.
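A minimal sketch of such a Boolean comparison (an editor's illustration, assuming the hypothetical record shapes used in the earlier sketches, with True and False standing in for the first and second values):

```python
# Editor's sketch of a Boolean comparison of the portfolio evaluation to the
# key; matching aspects receive a first value (True) and nonmatching aspects
# a second value (False). Record shapes follow the earlier sketches.

MATCH, NONMATCH = True, False

def compare_to_key(evaluation, key_evaluation, tags, key_tags):
    results = {}
    # Compare the per-artifact evaluations/scores.
    for artifact_id, key_score in key_evaluation.items():
        results[("score", artifact_id)] = (
            MATCH if evaluation.get(artifact_id) == key_score else NONMATCH)
    # Compare the applied tags (artifact portion plus criterion) as sets.
    def as_set(ts):
        return {(t["artifact_id"], t["start"], t["end"], t["criterion_id"])
                for t in ts}
    results[("tags", "all")] = (
        MATCH if as_set(tags) == as_set(key_tags) else NONMATCH)
    return results

results = compare_to_key({"artifact-1": 2}, {"artifact-1": 3}, [], [])
# -> {('score', 'artifact-1'): False, ('tags', 'all'): True}
```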
  • the process 700 proceeds to decision state 720 wherein it is determined if the portfolio evaluation and the key include a different evaluation of the portfolio and/or of one or several artifacts in the portfolio.
  • this determination can be made by the processor 102 and/or by a component of one of the user devices 106 including, for example, the evaluation engine 208.
  • this determination can include retrieving the assigned values indicative of matching and/or nonmatching aspects of the portfolio evaluation and/or the evaluation of one or several artifacts in the portfolio.
  • if it is determined that the portfolio evaluation and the key include a different evaluation, the process 700 proceeds to block 722 wherein an indicator of the difference in the evaluation is stored.
  • this indicator can be stored in one of the databases 104 including, for example, the profile database 104-A, the portfolio database 104-B, and/or the evaluation database 104-D.
  • the process 700 proceeds to decision state 724 wherein it is determined if different tags have been assigned to the portfolio and/or to the one or several artifacts in the portfolio in the portfolio evaluation and the key.
  • this determination can be made by the processor 102 and/or by a component of one of the user devices 106 including, for example, the evaluation engine 208.
  • this determination can include retrieving assigned values indicative of matching and/or nonmatching tags in the portfolio evaluation and/or in the evaluation of one or several artifacts in the portfolio.
  • if it is determined that different tags have been assigned, the process 700 proceeds to block 726 wherein an indicator of the difference in the tags is stored.
  • this indicator can be stored in one of the databases 104 including, for example, the profile database 104-A, the portfolio database 104-B, and/or the evaluation database 104-D.
  • the process 700 proceeds to block 728 wherein a difference report is generated.
  • the difference report can, for example, identify the differences between the portfolio evaluation and the key including, for example, differences in the evaluation of the one or several artifacts and/or portfolios, differences in the tags applied to the one or several artifacts and/or portfolios, and/or differences in the portions of the artifacts and/or portfolios identified by the tags.
  • the difference report can be generated by the evaluation control system 100, and can specifically be generated by the processor 102 and/or one of the user devices 106 or a component thereof including, for example, the evaluation engine 208.
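Continuing the illustrative sketches (an editor's addition, not the patent's implementation), a difference report could be assembled from the stored match/nonmatch indicators produced by the comparison sketch above:

```python
# Editor's sketch of block 728: assembling a difference report from the
# match/nonmatch indicators produced by compare_to_key() above.

def difference_report(results):
    lines = []
    for (aspect, subject), matched in results.items():
        if not matched:
            if aspect == "score":
                lines.append(f"Evaluation of {subject} differs from the key.")
            else:
                lines.append("Applied tags differ from the ideal tags in the key.")
    return "\n".join(lines) or "Portfolio evaluation matches the key."

print(difference_report({("score", "artifact-1"): False, ("tags", "all"): True}))
# -> Evaluation of artifact-1 differs from the key.
```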
  • the process 700 proceeds to block 730 wherein the difference report is provided.
  • the difference report can be provided to the user via one of the user devices including, for example, the supervisor device 106-B, and specifically via the user interface 202 of the user device 106.
  • the process 700 proceeds to block 732 wherein training is recommended and/or training content is provided.
  • the difference between the portfolio evaluation and the key can be sufficient such that additional training can be beneficial.
  • this training can be recommended based on the difference in the evaluation of the one or several artifacts and/or portfolios, and in some embodiments, this training can be recommended based on the difference in the tags applied to the one or several artifacts and/or portfolios.
  • a component of the evaluation control system 100 such as the processor 102 and/or one of the user devices 106 can compare the difference between the portfolio evaluation and the key with a threshold for requiring additional training, and can, in some embodiments, recommend additional training and/or provide additional training material based on the relationship of the difference to that threshold, as sketched below.
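A minimal sketch of such a threshold comparison (an editor's illustration; the difference is summarized here as the fraction of nonmatching aspects, and the threshold value is purely an assumption):

```python
# Editor's sketch of recommending training when the portfolio evaluation and
# the key differ by more than a threshold; the difference here is simply the
# fraction of nonmatching aspects, and the threshold value is assumed.

TRAINING_THRESHOLD = 0.25  # assumption: over 25% nonmatching aspects

def recommend_training(results):
    nonmatching = sum(1 for matched in results.values() if not matched)
    difference = nonmatching / len(results) if results else 0.0
    return difference > TRAINING_THRESHOLD

print(recommend_training({("score", "artifact-1"): False, ("tags", "all"): True}))
# -> True (1 of 2 aspects nonmatching, i.e., 0.5 > 0.25)
```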
  • the computer system 800 can include a computer 802, a keyboard 822, a network router 812, a printer 808, and a monitor 806.
  • the monitor 806, computer 802, and keyboard 822 are part of a computer system 826, which can be a laptop computer, desktop computer, handheld computer, mainframe computer, etc.
  • the monitor 806 can be a CRT, flat screen, etc.
  • a user 804 can input commands into the computer 802 using various input devices, such as a mouse, keyboard 822, track ball, touch screen, etc. If the computer system 800 comprises a mainframe, a designer 804 can access the computer 802 using, for example, a terminal or terminal interface. Additionally, the computer system 826 may be connected to a printer 808 and a server 810 using a network router 812, which may connect to the Internet 818 or a WAN.
  • the server 810 may, for example, be used to store additional software programs and data.
  • software implementing the systems and methods described herein can be stored on a storage medium in the server 810.
  • the software can be run from the storage medium in the server 810.
  • software implementing the systems and methods described herein can be stored on a storage medium in the computer 802.
  • the software can be run from the storage medium in the computer system 826. Therefore, in this embodiment, the software can be used whether or not computer 802 is connected to network router 812.
  • Printer 808 may be connected directly to computer 802, in which case the computer system 826 can print whether or not it is connected to network router 812.
  • a special-purpose computer system 904 is shown.
  • the above methods may be implemented by computer-program products that direct a computer system to perform the actions of the above-described methods and components.
  • Each such computer-program product may comprise sets of instructions (codes) embodied on a computer-readable medium that directs the processor of a computer system to perform corresponding actions.
  • the instructions may be configured to run in sequential order, or in parallel (such as under different processing threads), or in a combination thereof. After loading the computer-program products on a general-purpose computer system 826, it is transformed into the special-purpose computer system 904.
  • Special-purpose computer system 904 comprises a computer 802, a monitor 806 coupled to computer 802, one or more additional user output devices 930 (optional) coupled to computer 802, one or more user input devices 940 (e.g., keyboard, mouse, track ball, touch screen) coupled to computer 802, an optional communications interface 950 coupled to computer 802, and a computer-program product 905 stored in a tangible computer-readable memory in computer 802.
  • Computer-program product 905 directs system 904 to perform the above-described methods.
  • Computer 802 may include one or more processors 960 that communicate with a number of peripheral devices via a bus subsystem 990.
  • peripheral devices may include user output device(s) 930 , user input device(s) 940 , communications interface 950 , and a storage subsystem, such as random access memory (RAM) 970 and non-volatile storage drive 980 (e.g., disk drive, optical drive, solid state drive), which are forms of tangible computer-readable memory.
  • Computer-program product 905 may be stored in non-volatile storage drive 980 or another computer-readable medium accessible to computer 802 and loaded into memory 970.
  • Each processor 960 may comprise a microprocessor, such as a microprocessor from Intel® or Advanced Micro Devices, Inc.®, or the like.
  • the computer 802 runs an operating system that handles the communications of product 905 with the above-noted components, as well as the communications between the above-noted components in support of the computer-program product 905.
  • Exemplary operating systems include Windows® or the like from Microsoft® Corporation, Solaris® from Oracle®, LINUX, UNIX, and the like.
  • User input devices 940 include all possible types of devices and mechanisms to input information to computer 802. These may include a keyboard, a keypad, a mouse, a scanner, a digital drawing pad, a touch screen incorporated into the display, audio input devices such as voice recognition systems, microphones, and other types of input devices. In various embodiments, user input devices 940 are typically embodied as a computer mouse, a trackball, a track pad, a joystick, a wireless remote, a drawing tablet, or a voice command system. User input devices 940 typically allow a user to select objects, icons, text and the like that appear on the monitor 806 via a command such as a click of a button or the like. User output devices 930 include all possible types of devices and mechanisms to output information from computer 802. These may include a display (e.g., monitor 806), printers, non-visual displays such as audio output devices, etc.
  • Communications interface 950 provides an interface to other communication networks 995 and devices and may serve as an interface to receive data from and transmit data to other systems, WANs, and/or the Internet 818.
  • Embodiments of communications interface 950 typically include an Ethernet card, a modem (telephone, satellite, cable, ISDN), an (asynchronous) digital subscriber line (DSL) unit, a FireWire® interface, a USB® interface, a wireless network adapter, and the like.
  • communications interface 950 may be coupled to a computer network, to a FireWire® bus, or the like.
  • communications interface 950 may be physically integrated on the motherboard of computer 802, and/or may be a software program, or the like.
  • RAM 970 and non-volatile storage drive 980 are examples of tangible computer-readable media configured to store data such as computer-program product embodiments of the present invention, including executable computer code, human-readable code, or the like.
  • Other types of tangible computer-readable media include floppy disks, removable hard disks, optical storage media such as CD-ROMs, DVDs, bar codes, semiconductor memories such as flash memories, read-only-memories (ROMs), battery-backed volatile memories, networked storage devices, and the like.
  • RAM 970 and non-volatile storage drive 980 may be configured to store the basic programming and data constructs that provide the functionality of various embodiments of the present invention, as described above.
  • RAM 970 and non-volatile storage drive 980 may also provide a repository to store data and data structures used in accordance with the present invention.
  • RAM 970 and non-volatile storage drive 980 may include a number of memories including a main random access memory (RAM) to store instructions and data during program execution and a read-only memory (ROM) in which fixed instructions are stored.
  • RAM 970 and non-volatile storage drive 980 may include a file storage subsystem providing persistent (non-volatile) storage of program and/or data files.
  • RAM 970 and non-volatile storage drive 980 may also include removable storage systems, such as removable flash memory.
  • Bus subsystem 990 provides a mechanism to allow the various components and subsystems of computer 802 to communicate with each other as intended. Although bus subsystem 990 is shown schematically as a single bus, alternative embodiments of the bus subsystem may utilize multiple busses or communication paths within the computer 802.
  • Implementation of the techniques, blocks, steps and means described above may be done in various ways. For example, these techniques, blocks, steps and means may be implemented in hardware, software, or a combination thereof.
  • the processing units may be implemented within one or more application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, other electronic units designed to perform the functions described above, and/or a combination thereof.
  • the embodiments may be described as a process which is depicted as a flowchart, a flow diagram, a swim diagram, a data flow diagram, a structure diagram, or a block diagram. Although a depiction may describe the operations as a sequential process, many of the operations can be performed in parallel or concurrently. In addition, the order of the operations may be re-arranged.
  • a process is terminated when its operations are completed, but could have additional steps not included in the figure.
  • a process may correspond to a method, a function, a procedure, a subroutine, a subprogram, etc. When a process corresponds to a function, its termination corresponds to a return of the function to the calling function or the main function.
  • embodiments may be implemented by hardware, software, scripting languages, firmware, middleware, microcode, hardware description languages, and/or any combination thereof.
  • the program code or code segments to perform the necessary tasks may be stored in a machine readable medium such as a storage medium.
  • a code segment or machine-executable instruction may represent a procedure, a function, a subprogram, a program, a routine, a subroutine, a module, a software package, a script, a class, or any combination of instructions, data structures, and/or program statements.
  • a code segment may be coupled to another code segment or a hardware circuit by passing and/or receiving information, data, arguments, parameters, and/or memory contents. Information, arguments, parameters, data, etc. may be passed, forwarded, or transmitted via any suitable means including memory sharing, message passing, token passing, network transmission, etc.
  • the methodologies may be implemented with modules (e.g., procedures, functions, and so on) that perform the functions described herein.
  • Any machine-readable medium tangibly embodying instructions may be used in implementing the methodologies described herein.
  • software codes may be stored in a memory.
  • Memory may be implemented within the processor or external to the processor.
  • the term “memory” refers to any type of long term, short term, volatile, nonvolatile, or other storage medium and is not to be limited to any particular type of memory or number of memories, or type of media upon which memory is stored.
  • the term “storage medium” may represent one or more memories for storing data, including read only memory (ROM), random access memory (RAM), magnetic RAM, core memory, magnetic disk storage mediums, optical storage mediums, flash memory devices and/or other machine readable mediums for storing information.
  • machine-readable medium includes, but is not limited to, portable or fixed storage devices, optical storage devices, and/or various other storage mediums capable of storing, containing, or carrying instruction(s) and/or data.

Abstract

Systems and methods for evaluation control are disclosed herein. An evaluation control system can be used in evaluating one or several artifacts. The evaluation control system can be used to verify an evaluation and/or to provide evaluation training. The evaluation system can include a processor and a user device. The evaluation system can provide one or several artifacts and evaluation criteria to an evaluator. The evaluation system can receive one or several tags associated with the one or several artifacts and/or the evaluation criteria. The one or several tags can include information that, in connection with the evaluation criteria, supports an evaluation and/or score for the one or several artifacts.

Description

    CROSS-REFERENCES TO RELATED APPLICATIONS
  • This application claims the benefit of U.S. Provisional Application No. 61/811,347, filed on Apr. 12, 2013, and entitled “EVALUATION CONTROL,” the entirety of which is hereby incorporated by reference herein.
  • BACKGROUND OF THE INVENTION
  • This disclosure relates in general to learning and can include traditional classroom learning or on-line or computerized learning including, but without limitation, learning or instruction with a Learning Management System (LMS) and/or Online Homework System (OHS).
  • Work product is frequently generated during the learning process. The evaluation of the work product is important to facilitating learning and providing feedback to the creator of the work product. While evaluation of work product is important to the learning process, it also requires significant resources. Thus, better systems, methods, and devices are desired to facilitate the evaluation of work product.
  • BRIEF SUMMARY OF THE INVENTION
  • In some aspects, the present disclosure relates to a system for verifying the evaluation of subject matter. The system includes a processor that receives a portfolio that includes a compilation of artifacts that are work product of an evaluatee, provides one of the artifacts to an evaluator, and receives a plurality of first indications of an evaluation criteria, which evaluation criteria can include a plurality of sub-criteria. In some embodiments, the first indications of the evaluation criteria identify a first portion of the artifact and a first portion of the evaluation criteria. The processor can further assign a value to the plurality of indications of the evaluation criteria, which value identifies the source of the indications of the evaluation criteria. In some embodiments, the source of the indications of the evaluation criteria is the evaluator. The processor can receive a first evaluation of the artifact, which evaluation can be based on the evaluation criteria and the first indications of the evaluation criteria; receive a second evaluation of the artifact, second indications of the evaluation criteria, and second associated portions of the artifact; and compare the first and second evaluations of the artifact, the first and second indications of the evaluation criteria, and the first and second associated portions of the artifact. The processor can provide an indication of the differences between the first and second evaluations of the artifact, between the first and second indications of the evaluation criteria, and between the first and second associated portions of the artifact. The system can include a memory that can store information relating to the received portfolio, the first evaluation of the artifact, the second evaluation of the artifact, the second indications of the evaluation criteria and the second associated portions of the artifact, and the result of the comparison of the first and second evaluations of the artifact.
  • In some embodiments of the system, the processor can select the evaluator, which evaluator can be selected from one or several lists of potential evaluators according to one or several traits of the evaluator and/or of the artifacts being evaluated. In some embodiments of the system, the identified portion of the evaluation criteria can be a sub-criterion.
  • In some embodiments, the system can include a user device that can display the first evaluation of the artifact, display the first indications of the evaluation criteria, and display first associated portions of the artifact. In some embodiments, the comparison of the first and second evaluations of the artifact can include generation of a difference value that characterizes the degree of difference between the first and second evaluations, retrieval of an acceptance threshold that is a value demarking levels of acceptable and unacceptable differences between the first and second evaluations, and comparison of the difference value to the acceptance threshold. In some embodiments, the system can include comparing the first indications of the evaluation criteria and first associated portions of the artifact with the second indications of the evaluation criteria and second associated portions of the artifact if the comparison of the difference value to the acceptance threshold indicates an unacceptable level of difference between the first and second evaluations. In some embodiments, the processor can generate a final score based on the received plurality of first indications of an evaluation criteria.
  • Some aspects of the present disclosure relate to a method of verifying the evaluation of subject matter. The method includes receiving a portfolio that includes a compilation of artifacts that are work product of an evaluatee, providing one of the artifacts to an evaluator, and receiving a plurality of first indications of an evaluation criteria. In some embodiments, the evaluation criteria can be a plurality of sub-criteria, and the first indications of the evaluation criteria can identify a first portion of the artifact and a first portion of the evaluation criteria. In some embodiments, the method can include assigning a value to the plurality of indications of the evaluation criteria, which value can identify the source of the indications of the evaluation criteria; receiving a first evaluation of the artifact, which evaluation is based on the evaluation criteria and the first indications of the evaluation criteria; and providing the first evaluation of the artifact, the first indications of the evaluation criteria, and first associated portions of the artifact. In some embodiments, the method can include receiving a second evaluation of the artifact, second indications of the evaluation criteria, and second associated portions of the artifact; comparing the first and second evaluations of the artifact, indications of the evaluation criteria, and the associated portions of the artifact; and providing an indication of the differences between the first and second evaluations of the artifact, indications of the evaluation criteria, and the associated portions of the artifact.
  • In some embodiments, the method can include selecting the evaluator. In some embodiments of the method, the identified portion of the evaluation criteria can be a sub-criterion. In some embodiments of the method, the comparison of the first and second evaluations of the artifact can include generating a difference value that characterizes the degree of difference between the first and second evaluations, retrieving an acceptance threshold that demarks levels of acceptable and unacceptable differences between the first and second evaluations, and comparing the difference value to the acceptance threshold. In some embodiments, the method includes comparing the first indications of the evaluation criteria and first associated portions of the artifact with the second indications of the evaluation criteria and second associated portions of the artifact if the comparison of the difference value to the acceptance threshold indicates an unacceptable level of difference between the first and second evaluations. In some embodiments, the method includes generating a final score based on the received plurality of first indications of an evaluation criteria.
  • Some aspects of the present disclosure relate to a method of training an evaluator. The method includes generating a portfolio including a compilation of artifacts that are work product of an evaluatee; receiving evaluation criteria associated with the portfolio, which evaluation criteria can include a plurality of sub-criteria; and receiving a key that includes a plurality of indications of an evaluation criteria. In some embodiments, the indications in the key are the correct indications of an evaluation criteria for a portfolio. In some embodiments, the indications of the evaluation criteria in the key identify a portion of one of the artifacts and a sub-criterion of the evaluation criteria. Some embodiments of the method include providing the portfolio to a trainee and receiving a portfolio evaluation that includes a plurality of indications of the evaluation criteria. In some embodiments, the indications in the portfolio evaluation are received from the trainee, and the indications of the evaluation criteria in the portfolio evaluation identify a portion of one of the artifacts and a sub-criterion of the evaluation criteria. Some embodiments of the method include comparing the key and the portfolio evaluation according to a Boolean function to determine the accuracy of the portfolio evaluation and providing an indicator of the accuracy of the portfolio evaluation.
  • In some embodiments of the method, the comparison of the key and the portfolio evaluation includes comparing the score of the artifact in the portfolio evaluation with the score of the artifact in the key. In some embodiments, the comparison of the portfolio evaluation and the key includes generating a difference value that characterizes the degree of difference between the portfolio evaluation and the key, retrieving an acceptance threshold that demarks levels of acceptable and unacceptable differences between the portfolio evaluation and the key, and comparing the difference value to an acceptance threshold. In some embodiments, the comparison of the key and the portfolio evaluation includes the comparison of the indications of the evaluation criteria in the portfolio evaluation to the indications of the evaluation criteria in the key. In some embodiments, the method includes providing additional training material.
  • Further areas of applicability of the present disclosure will become apparent from the detailed description provided hereinafter. It should be understood that the detailed description and specific examples, while indicating various embodiments, are intended for purposes of illustration only and are not intended to necessarily limit the scope of the disclosure.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The present disclosure is described in conjunction with the appended figures:
  • FIG. 1 is a block diagram illustrating one embodiment of an evaluation control system.
  • FIG. 2 is a block diagram illustrating one embodiment of a user device for use with an evaluation control system.
  • FIG. 3 is a flowchart illustrating one embodiment of a process for evaluation of one or several portfolios and/or artifacts.
  • FIG. 4 is a flowchart illustrating one embodiment of a process for evaluation control.
  • FIG. 5 is a flowchart illustrating one embodiment of a process for generating evaluation data.
  • FIG. 6 is a flowchart illustrating one embodiment of a process for displaying evaluation data.
  • FIG. 7 is a flowchart illustrating one embodiment of a process for training an evaluator.
  • FIG. 8 is a block diagram illustrating one embodiment of a computer system.
  • FIG. 9 is a block diagram illustrating one embodiment of a special-purpose computer.
  • In the appended figures, similar components and/or features may have the same reference label. Where the reference label is used in the specification, the description is applicable to any one of the similar components having the same reference label. Further, various components of the same type may be distinguished by following the reference label by a dash and a second label that distinguishes among the similar components. If only the first reference label is used in the specification, the description is applicable to any one of the similar components having the same first reference label irrespective of the second reference label.
  • DETAILED DESCRIPTION OF THE INVENTION
  • The ensuing description provides preferred exemplary embodiment(s) only, and is not intended to limit the scope, applicability or configuration of the disclosure. Rather, the ensuing description of the preferred exemplary embodiment(s) will provide those skilled in the art with an enabling description for implementing a preferred exemplary embodiment. It is understood that various changes may be made in the function and arrangement of elements without departing from the spirit and scope as set forth in the appended claims.
  • In one embodiment, the present disclosure provides a method for verifying the evaluation of the portfolio. This method can include receiving a portfolio that includes work product which can be, for example, generated by one or several individuals. In some embodiments, all or portions of this work product is provided to an evaluator and one or several tags are received from the evaluator, which tags identify subject matter relevant to an evaluation criteria. In some embodiments, the evaluation of the portfolio and/or of one or several of the artifacts is received and used to generate first evaluation data. This first evaluation data can be compared to second evaluation data which can be, for example, generated by a second evaluator. The degree of differences and the differences between the first evaluation data and the second evaluation data can, in some embodiments, provide information relating to the accuracy of the evaluation. Specifically, the evaluation is more likely accurate if the first and second evaluation data match and/or closely match.
  • In one embodiment, the present disclosure provides a method for training an evaluator, also referred to herein as a trainee, to evaluate a portfolio. This method can include generating a training portfolio, evaluation criteria associated with the portfolio, and a second, verified evaluation, also referred to herein as a key. In some embodiments, this method can include receiving an indication of tags identifying a portion of the portfolio, a relevant portion of the evaluation criteria, and the trainee, and receiving a portfolio evaluation. The received portfolio evaluation can be compared to the key and differences between the received evaluation and the key can be indicative of the training level of the trainee and/or potential areas for providing further training to the trainee. In some embodiments, these differences, and the degree of difference can be used to determine when a trainee is a trained evaluator.
  • With reference now to FIG. 1, a block diagram of one embodiment of an evaluation control system 100 is shown. The evaluation control system 100 collects, receives, and stores data relating to an artifact. In some embodiments, the artifact can comprise any work product and can include, for example, digital work product. In some embodiments, the digital work product can include written work product and/or recorded work product which can include sound and/or video recordings. In some embodiments, the evaluation control system 100 can collect, receive, and store data relating to and/or facilitating the evaluation of one or several artifacts. The evaluation control system 100 can receive and/or create evaluation criteria for one or several artifacts, can receive one or several indicators of portions of the artifact relevant to the evaluation criteria, and can receive an evaluation for one or several artifacts and/or portfolios. In some embodiments, the evaluation control system 100 can be configured to assess the evaluation of the one or several artifacts to facilitate training of one or several evaluators and/or for evaluation quality control.
  • The evaluation control system 100 can include a processor 102. The processor 102 can provide instructions to and receive information from the other components of the evaluation control system 100. The processor 102 can act according to stored instructions, which stored instructions can be located in memory associated with the processor and/or in other components of the evaluation control system 100. The processor 102 can comprise a microprocessor, such as a microprocessor from Intel® or Advanced Micro Devices, Inc.®, or the like.
  • The evaluation control system 100 can include one or several databases 104. The one or several databases 104 can comprise stored data relevant to the functions of the evaluation control system 100. The one or several databases 104 include a profile database 104-A. The profile database 104-A can include profile data for one or several users which can include, for example, one or several evaluators and/or one or several supervisors. In some embodiments, for example, the profile database 104-A can include information relating to the creator of the work product, also referred to herein as an evaluatee.
  • The profile data can include any information relating to the user; in some embodiments, for example, this information can include an indication of the evaluator's progress through a training program, an indication of the quality and/or accuracy of one or several of the evaluator's evaluations, and/or an indication of the type and/or subject matter of artifacts that the evaluator may evaluate.
  • In some embodiments, for example, the profile database 104-A can comprise login information. This information can include, for example, information identifying a user such as, for example, a username and password or a user identification number. In some embodiments, for example, when a user desires to access the evaluation control system 100, the user can be prompted to enter identification information such as, for example, a username and password. After the user provides the identification information, the evaluation control system 100 can verify the identification information, and specifically, the processor 102 can compare the user-provided identification information to information stored within the profile database 104-A to determine if the actual user is an authorized user.
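As a hedged illustration of this login check (an editor's sketch; the disclosure does not specify how credentials are stored or compared), a profile database could hold salted password hashes and the processor could compare a hash of the user-provided password against the stored value:

```python
# Editor's sketch of the login check; the disclosure does not specify storage
# or comparison details, so the salted-hash scheme below is an assumption.
import hashlib
import secrets

profile_db = {}  # stand-in for profile database 104-A: username -> (salt, hash)

def register(username, password):
    salt = secrets.token_hex(8)
    digest = hashlib.sha256((salt + password).encode()).hexdigest()
    profile_db[username] = (salt, digest)

def is_authorized(username, password):
    """Compare user-provided identification information to stored profile data."""
    if username not in profile_db:
        return False
    salt, digest = profile_db[username]
    return hashlib.sha256((salt + password).encode()).hexdigest() == digest

register("evaluator-1", "example-password")
print(is_authorized("evaluator-1", "example-password"))  # -> True
```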
  • The one or several databases 104 can include a portfolio database 104-B. The portfolio database 104-B can include one or several portfolios. In some embodiments, a portfolio can comprise a grouping of one or several artifacts. In some embodiments, these artifacts can be grouped according to type, content, evaluation type which can include information relating to how the artifacts in the portfolio can be evaluated, and/or work product author.
  • The evaluation control system 100 can include a tag database 104-C. The tag database 104-C can include information used in evaluating one or several artifacts. In some embodiments, for example, the tag database 104-C can comprise a plurality of tags. In some embodiments, the tags can be indications of the application of one or several evaluation criteria or sub-criteria to all or portions of an artifact. In some embodiments, a tag can identify a portion of an artifact, the tag can identify a portion of an evaluation criterion relevant to the indicated portion of the work product, and in some embodiments, the tag can indicate the evaluator who created the tagging, the time and date of the tag creation, and/or any other desired information relating to the tag. In some embodiments, the tag can include information such as, for example, one or several comments, notes, and/or marks. These comments, notes, and/or marks can be created by the user and can, for example, provide feedback to the creator of the artifact and/or be used in scoring/evaluation of the artifact. In some embodiments, the tag can include a tag type. In some embodiments, the tag type can associate the tag with, for example, a portion of a rubric, a positive or negative attribute, or the like. In one embodiment, for example, the tag type can indicate a misspelling; a grammatical error; a content error; a good, poor, or adequate technique; a good, average, or bad argument; a good, average, or poor use of content; a correct answer; an incorrect answer; or the like.
  • The evaluation control system 100 can include an evaluation database 104-D. The evaluation database 104-D can include information that can facilitate the performing of an evaluation of one or several artifacts and/or portfolios. In some embodiments, this information can include one or several evaluation criteria for use in evaluating one or several artifacts and/or portfolios. In some embodiments, the evaluation criteria can comprise, for example, a rubric and/or other scoring aid. The evaluation criteria can, in some embodiments, comprise one or several sub-criteria that can, for example, focus on a specific aspect of the evaluation.
  • The evaluation database 104-D can include information identifying the result of a started, partially completed, and/or completed evaluation. In some embodiments, the information in the evaluation database 104-D can be organized by portfolio, artifact, evaluator, or in any other desired manner. The evaluation database 104-D can include information linking one or several artifacts' and/or portfolios' tags with one or several evaluation criteria and/or the result of the evaluation of one or several artifacts and/or portfolios.
  • The evaluation database 104-D can include information used in assessing an evaluation of one or several artifacts and/or portfolios. In some embodiments, for example, this information can include a second evaluation of the one or several artifacts and/or portfolios. In some embodiments, this second evaluation can be a verified second evaluation that reflects a desired and/or ideal evaluation, and in some embodiments, the second evaluation can be a non-verified evaluation.
  • The evaluation control system 100 can include one or several user devices 106, which can include an evaluator device 106-A and/or a supervisor device 106-B. The user devices 106 allow a user, including an evaluator, a supervisor, a trainer, and/or a trainee to access the evaluation control system 100. The details and function of the user devices 106 will be discussed at greater length in reference to FIG. 2 below.
  • The evaluation control system 100 can include a data source 108, also referred to as a repository. The data source 108 can be the source of the one or several artifacts and/or portfolios, the source of the one or several evaluation criteria, and/or the source of one or several second evaluations. In some embodiments, the data source can comprise an educational service provider, such as, for example, a school, a university, a college, and/or a Learning Management System (LMS).
  • The evaluation control system 100 can include a network 110. The network 110 allows communication between the components of the evaluation control system 100. The network 110 can be, for example, a local area network (LAN), a wide area network (WAN), a wired network, wireless network, a telephone network such as, for example, a cellphone network, the Internet, the World Wide Web, or any other desired network. In some embodiments, the network 110 can use any desired communication and/or network protocols.
  • With reference now to FIG. 2, a block diagram of one embodiment of a user device 106 is shown. As discussed above, the user device 106 can be configured to provide information to and/or receive information from other components of the evaluation control system 100. The user device can access the evaluation control system 100 through any desired means or technology, including, for example, a webpage, a web portal, or via network 110. As depicted in FIG. 2, the user device 106 can include a network interface 200. The network interface 200 allows the user device 106 to access the other components of the evaluation control system 100, and specifically allows the user device 106 to access the network 110 of the evaluation control system 100. The network interface 200 can include features configured to send and receive information, including, for example, an antenna, a modem, a transmitter, receiver, or any other feature that can send and receive information. The network interface 200 can communicate via telephone, cable, fiber-optic, or any other wired communication network. In some embodiments, the network interface 200 can communicate via cellular networks, WLAN networks, or any other wireless network.
  • The user device 106 can include a user interface 202 that communicates information to, and receives inputs from, a user. The user interface 202 can include a screen, a speaker, a monitor, a keyboard, a microphone, a mouse, a touchpad, a keypad, or any other feature or features that can receive inputs from a user and provide information to a user.
  • The user device 106 can include a review engine 204. In some embodiments, the review engine 204 can be configured to receive one or several artifacts and/or portfolios from the portfolio database 104-B and provide the one or several artifacts and/or portfolios to the user via, for example, the user interface 202. In some embodiments, the review engine can include features and/or software that allow providing a range of software and/or artifact types to the user including, for example, images, written documents, recordings including sound and/or video recording, and/or any other desired software and/or artifact type.
  • The user device 106 can include a tagging engine 206. In some embodiments, the tagging engine can be configured to allow a user to add, remove, and/or edit a tag that can be, for example, associated with one or several artifacts and/or one or several portfolios. In some embodiments, for example, the tagging engine 206 can be configured to allow a user to tag a portion of one or several artifacts and/or portfolios. In some embodiments the tag can identify a portion of the one or several artifacts and/or portfolios including, for example, a starting point, an ending point, and/or a duration of the portion of the one or several artifacts and/or portfolios. In some embodiments, the tagging engine 206 can be configured to allow a user to associate the portion of one or several artifacts and/or portfolios with one or several evaluation criteria and/or evaluation sub-criteria.
  • The user device 106 can include an evaluation engine 208. The evaluation engine 208 can be configured to allow a user to evaluate one or several artifacts and/or portfolios. In some embodiments, for example, the evaluation engine can be configured to group one or several tags associated with one or several artifacts and/or portfolios and provide these grouped tags to the user. In one embodiment, for example, the evaluation engine 208 can group tags associated with one artifact and with one evaluation criteria and/or evaluation sub-criteria, and provide these tags to the user. The evaluation engine 208 can be configured to allow the user to review the portions of the one or several artifacts and/or portfolios associated with the tags, and to receive an evaluation from the user based on those tags.
  • In some embodiments, the evaluation engine 208 can be configured to allow the review of an evaluation of one or several artifacts and/or portfolios. In some embodiments, for example, the evaluation engine 208 can be configured to compare the evaluation of one or several artifacts and/or portfolios with a second evaluation that can be, for example, the verified evaluation. In some embodiments, the evaluation engine 208 can be further configured to generate and provide a comparison report identifying the differences between the evaluation and the second evaluation and indicating whether the evaluation is acceptable.
  • With reference now to FIG. 3, a flowchart illustrating one embodiment of a process 300 for evaluation of one or several portfolios and/or artifacts is provided. In some embodiments, the process 300 can be performed by the evaluation control system 100 and/or components of the evaluation control system 100.
  • The process 300 begins at block 302 wherein a portfolio is received. In some embodiments, for example, the portfolio can be received by and/or from a component of the evaluation control system 100, and in one embodiment, the portfolio can be received from the data source 108. In some embodiments, and as discussed above, the portfolio can comprise one or several artifacts which can be a collection of work product. In some embodiments, this work product can be generated by a user of the data source 108, and in some embodiments, this work product can be collected by the data source 108. In some embodiments, after the portfolio has been received, the portfolio can be stored within one of the databases 104 including, for example, the portfolio database 104-B.
  • After the portfolio has been received, the process 300 proceeds to block 304 wherein the artifacts are provided. In some embodiments, for example, the artifacts can be provided to the user via one of the user devices 106 including, for example, the evaluator device 106-A. In some embodiments, the user device 106 can provide the artifacts to the user via the user interface 202. The artifacts can be retrieved from the one or several portfolios stored within the portfolio database 104-B. In some embodiments, for example, the processor 102 can query the portfolio database 104-B for a stored artifact. In some embodiments, one or several artifacts can be selected from the portfolio database 104-B and can be provided to the user.
  • After the artifacts have been provided, the process 300 proceeds to block 306 wherein a tag is received and/or applied. In some embodiments, for example, the tag can be received via one of the user devices 106 such as, for example, the evaluator device 106-A, and can be stored in one or several of the databases 104 including tag database 104-C. In one embodiment, a tag can be applied in that the tag, and the data relevant to the tag, is stored in one of the databases 104. In one embodiment, for example, and as discussed above, the tag can identify a portion of the artifact, can identify a portion of the evaluation criteria relevant to the portion of the artifact, can include a note relating to the evaluation criteria and/or to the tagged portion of the artifact, and/or can identify the user adding, removing, and/or editing the tag.
  • After the tag has been received, the process 300 proceeds to block 308 wherein the evaluation is applied. In some embodiments, for example, the evaluation can be applied based on the tags associated with the artifact and/or stored in the tag database 104-C. In some embodiments, the evaluation can be applied based on the number of tags associated with one or several of the evaluation criteria and/or sub-criteria and/or based on information relating to the evaluation criteria and/or sub-criteria that do not have a related tag and/or have fewer related tags than a threshold value. The application of the evaluation can, in some embodiments, be received from the user via the user device 106 and/or generated by the processor 102.
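For illustration (an editor's sketch under assumed data shapes, not the disclosed implementation), applying an evaluation based on tag counts could flag evaluation criteria that have fewer related tags than a threshold value:

```python
# Editor's sketch of block 308: grouping tags by evaluation criterion and
# flagging criteria with fewer related tags than a threshold value. The tag
# shape and threshold are assumptions.
from collections import Counter

def criteria_needing_attention(tags, all_criteria, threshold=1):
    counts = Counter(tag["criterion_id"] for tag in tags)
    return [c for c in all_criteria if counts[c] < threshold]

tags = [{"criterion_id": "rubric-1"}, {"criterion_id": "rubric-1"}]
print(criteria_needing_attention(tags, ["rubric-1", "rubric-2"]))
# -> ['rubric-2']
```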
  • With reference now to FIG. 4, a process 400 for evaluation control is provided. In some embodiments, for example, the process 400 for evaluation control can be performed by the evaluation control system 100 and/or a component thereof. The process 400 begins at block 402 wherein the portfolio is received. In some embodiments, for example, the portfolio can be received by and/or from a component of the evaluation control system 100, and in one embodiment, the portfolio can be received from the data source 108. In some embodiments, and as discussed above, the portfolio can comprise one or several artifacts which can be a collection of work product. In some embodiments, this work product can be generated by a user of the data source 108, and in some embodiments, this work product can be collected by the data source 108. In some embodiments, after the portfolio has been received, the portfolio can be stored within one of the databases 104 including, for example, the portfolio database 104-B.
  • After the portfolio has been received, the process 400 proceeds to block 404 wherein an indication of artifacts is provided. In some embodiments, for example, the indication of artifacts can be provided to the user via one of the user devices 106 such as, for example, via the evaluator device 106-A. The indication of artifacts can comprise an indicator of artifacts stored within the portfolio database 104-B. In some embodiments, this indicator can comprise a listing, table, and/or index of artifacts stored in the portfolio database 104-B. The indication of the artifacts can be retrieved from the one or several portfolios stored within the portfolio database 104-B. In some embodiments, for example, the processor 102 can query the portfolio database 104-B for indications of artifacts stored within the portfolio database 104-B. These indications of artifacts stored within the portfolio database 104-B can be provided to the user via, for example, the user interface 202.
  • After the indication of the artifacts has been provided, the process 400 proceeds to block 406 wherein a selection of one or several of the artifacts is received. In some embodiments, for example, the selection of one or several of the artifacts can be received via one of the user devices 106 such as, for example, the evaluator device 106-A. The selected one or several artifacts can correspond to provided indications of artifacts stored within the portfolio database.
  • After selection of one or several artifacts has been received, the process 400 proceeds to block 408 wherein an artifact is provided. In some embodiments, for example, the artifacts can be provided to the user via one of the user devices 106 including, for example, the evaluator device 106-A. In some embodiments, the user device 106 can provide the artifacts to the user via the user interface 202. The artifacts can be retrieved from the one or several portfolios stored within the portfolio database 104-B. In some embodiments, for example, the processor 102 can query the portfolio database 104-B for a stored artifact. In some embodiments, one or several artifacts can be selected from the portfolio database 104-B and can be provided to the user.
  • After the artifacts have been provided, the process 400 proceeds to block 410 wherein a tag is received and/or applied. In some embodiments, for example, the tag can be received from the user via the user device 106, and specifically via, for example, the evaluator device 106-A. In some embodiments, the tag can identify a portion of one or several artifacts and/or portfolios, a portion of a relevant evaluation criteria and/or evaluation sub-criteria, and/or an indicator of the identification of the person and/or evaluator adding, removing, and/or editing the tag. In some embodiments, the tag can be applied in that the tag and/or the portion of the artifact associated with the tag is stored. In some embodiments, the received tag can be stored in one of the databases 104 including, for example, the tag database 104-C.
  • After the tag has been received, the process 400 proceeds to block 412 wherein the tag is correlated to the evaluation criteria. In some embodiments, for example, this correlation can include retrieving tag information identifying a related one or several evaluation criteria and/or evaluation sub-criteria and storing this information within the evaluation database. In some embodiments, for example, this can be performed by the processor 102 and/or by another component of the evaluation control system 100 including, for example, by the user device 106 and/or component thereof such as the tagging engine 206 and/or the evaluation engine 208.
  • After the tag has been correlated to the evaluation criteria, the process 400 proceeds to block 414 wherein an evaluation of the artifact and/or portfolio is received. In some embodiments, for example, this evaluation can be received from and/or performed with one of the user devices 106 and/or other components of the evaluation control system 100. In some embodiments, for example, this step can include the grouping of one or several tags, providing the group of one or several tags to the user, and receiving an evaluation based on these tags and the evaluation criteria. In some embodiments, the received evaluation can be stored in one of the databases 104 such as, for example, the evaluation database 104-D.
  • After the evaluation of the artifact and/or portfolio has been received, the process 400 proceeds to block 416 wherein portions of the one or several artifacts and/or portfolios identified by the one or several tags are saved. In some embodiments, for example, these portions of the artifact can be saved within one of the databases 104, and specifically within the portfolio database 104-B and/or the tag database 104-C.
  • After portions of the artifact identified by one or several tags have been saved, the process 400 proceeds to block 418 wherein a first evaluation data is generated. In some embodiments, for example, the first evaluation data can comprise information relating to the evaluation and allowing the re-creation of the evaluation. This information can include the evaluation of one or several artifacts and/or portfolios provided by the evaluator, the one or several tags associated with the one or several artifacts and/or portfolios identified by the evaluator, saved portions of the one or several artifacts and/or portfolios identified by the one or several tags, and/or an indicator of the identity of the evaluator.
  • After the first evaluation data has been generated, the process 400 proceeds to decision state 420 wherein it is determined if second evaluation data has been received. In some embodiments, for example, this determination can be made by the processor 102 and can include querying one or several of the databases 104 including, for example, the evaluation database 104-D.
  • If it is determined that there is no second evaluation data, the process 400 proceeds to block 422 wherein evaluation data is provided. In some embodiments, for example, the evaluation data can be provided to one or several of the user devices 106 including, for example, the evaluator device 106-A and/or the supervisor device 106-B.
  • Returning again to decision state 420, if it is determined that second evaluation data has been received, the process 400 proceeds to block 424 wherein the evaluation data are compared. In some embodiments, for example, this comparison can include comparing the evaluation of the one or several artifacts and/or portfolios to determine if the evaluation is the same, the comparison of the tags associated with the evaluation criteria of the first and second evaluation data to determine similarities and/or differences in tags applied in both instances, and/or the comparison of portions of the one or several artifacts and/or portfolios identified by the tags.
  • In some embodiments, for example, this comparison of the evaluation data can include comparison of the tags associated with the first and second evaluation data to determine the similarities/differences in the two taggings of the artifact. In one embodiment, for example, this comparison can include determining whether the tags associated with the two evaluations each identify the same positive and/or negative aspects/attributes of the artifact. In some embodiments, this comparison can include comparing the linking of the content of the artifact to the evaluation criteria to determine whether the evaluators linked similarly tagged content to similar portions of the evaluation criteria. Similarly, in some embodiments, this comparison can include a comparison of the overall evaluations and/or scores for the artifact and/or portfolio. In some embodiments, if the comparison of the overall evaluations and/or scores for the artifact and/or portfolio indicates a sufficient difference, then the association of the tags with the content of the artifact and/or portfolio and/or the association of the tags with the evaluation criteria are compared.
  • In one such embodiment, the overall evaluation and/or score of the first and second evaluations are compared to generate a difference value indicating the degree of difference between the overall evaluation and/or score of the first and second evaluations. In some embodiments, this difference value is compared to an acceptance threshold. In some embodiments, the acceptance threshold can identify a degree of difference between evaluations that identifies acceptable/unacceptable differences between the evaluations and/or triggers additional comparison of the first and second evaluations. In one embodiment, if the comparison of the difference value to the acceptance threshold indicates that the evaluations are adequately similar, then the process proceeds to decision state 426, discussed at greater length below, whereas, if the comparison of the difference value to the acceptance threshold indicates that the evaluations are inadequately similar, then the comparison of the evaluation data can include a comparison of the tags as discussed above.
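A minimal sketch of this gating step, assuming the overall evaluations reduce to numeric scores and the acceptance threshold is a scalar:

```python
def adequately_similar(first_score: float, second_score: float,
                       acceptance_threshold: float) -> bool:
    """Compare the difference value between two overall scores to the
    acceptance threshold; at or below the threshold counts as acceptable."""
    difference_value = abs(first_score - second_score)
    return difference_value <= acceptance_threshold

# An unacceptable difference triggers the deeper, tag-level comparison.
if not adequately_similar(3.5, 2.0, acceptance_threshold=1.0):
    print("Compare tags and tagged portions of the two evaluations.")
```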
  • In some embodiments, for example, this comparison of the evaluation data can be performed by the processor 102 and/or by the evaluation engine 208 of one of the user devices 106. In some embodiments, for example, this comparison can be performed according to a Boolean function wherein matching aspects of the first and second evaluation data are assigned a first value and nonmatching aspects of the first and second evaluation data are assigned a second value.
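One way to realize the Boolean function described above, with a first value of 1 for matching aspects and a second value of 0 for nonmatching aspects (the aspect keys and record layout are assumptions):

```python
def boolean_compare(first: dict, second: dict) -> dict:
    """Assign 1 to each matching aspect of two evaluation-data records and
    0 to each nonmatching aspect."""
    return {aspect: int(first.get(aspect) == second.get(aspect))
            for aspect in sorted(set(first) | set(second))}

first_eval = {"overall": 4, "tags": {"t1", "t2"}, "portions": {(10, 42)}}
second_eval = {"overall": 4, "tags": {"t1", "t3"}, "portions": {(10, 42)}}
print(boolean_compare(first_eval, second_eval))
# {'overall': 1, 'portions': 1, 'tags': 0}
```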
  • After the first and second evaluation data have been compared, the process 400 proceeds to decision state 426 wherein it is determined if there are differences between the first and second evaluation data. In some embodiments, for example, this determination can include determining whether there are any differences between the first and second evaluation data which can include, for example, determining whether the evaluation of the one or several artifacts and/or portfolios are different, determining whether one or several applied tags are different, and/or determining whether the portions of the one or several artifacts and/or portfolios identified by the tags are different. In some embodiments, for example, this determination can be performed with reference to values assigned to the first and second evaluation data as discussed above, and can be performed by the processor 102 and/or the evaluation engine 208. If it is determined that there are no differences and/or that a difference threshold has not been met, then the process 400 proceeds to block 422 wherein evaluation data is provided.
  • If it is determined that there are differences between the first and second evaluation data and/or that the difference threshold has been met, then the process 400 proceeds to block 428 wherein a difference report is generated. In some embodiments, the difference report can, for example, identify the differences between the first and second evaluation data including, for example, differences in the evaluation of the one or several artifacts and/or portfolios, differences in the tags applied to the one or several artifacts and/or portfolios, and/or differences in the portions of the artifacts and/or portfolios identified by the tags. In some embodiments, the difference report can be generated by the evaluation control system 100, and can specifically be generated by the processor 102 and/or one of the user devices 106 or component thereof including, for example, the evaluation engine 208.
  • After the difference report has been generated, the process 400 proceeds to block 430 wherein the difference report is provided. In some embodiments, for example, the difference report can be provided to the user via one of the user devices including, for example, the supervisor device 106-B, and specifically the user interface 202 of the user device 106.
  • After the difference report has been provided, the process 400 proceeds to decision state 432 wherein it is determined if additional evaluation data has been received. In some embodiments, the additional evaluation data can be, for example, third evaluation data, fourth evaluation data, fifth evaluation data, sixth evaluation data, and/or any other evaluation data including, for example, nth-evaluation data. In some embodiments, for example, evaluation data can be collected until there is a convergence of the evaluation data.
  • In some embodiments, additional evaluation data can be data relating to a further evaluation of one or several artifacts and/or portfolios that can be used to determine which of the first and/or second evaluation data is accurate, accurately reflects a correct evaluation of one or several artifacts and/or portfolios, and/or is the most accurate. In some embodiments, for example, this determination can be made by the processor 102 and can include querying one or several of the databases 104 including, for example, the evaluation database 104-D. If it is determined that there is no additional evaluation data, the process 400 can terminate.
  • If it is determined that there is additional evaluation data, the process 400 proceeds to block 434 wherein the evaluation data are compared. In some embodiments, for example, this comparison can include comparing one or both of the first and second evaluation data with the additional evaluation data. In some embodiments, for example, this comparison can include comparing the evaluation of the one or several artifacts and/or portfolios to determine if the evaluation is the same, comparing the tags associated with the evaluation criteria of the first, second, and/or additional evaluation data to determine similarities and/or differences in tags applied in those instances, and/or the comparison of portions of the one or several artifacts and/or portfolios identified by the tags. In some embodiments, for example, this comparison can be performed by the processor 102 and/or by the evaluation engine 208 of one of the user devices 106. In some embodiments, for example, this comparison can be performed according to a Boolean function wherein matching aspects of the first, second, and/or additional evaluation data are assigned a first value and nonmatching aspects of the first, second, and/or additional evaluation data are assigned a second value.
  • After the evaluation data has been compared, the process 400 can proceed to block 436 wherein a discrepancy report is provided. In some embodiments, for example, the discrepancy report can identify differences between the first and second evaluation data and the additional evaluation data. In some embodiments, for example, the discrepancy report can identify which of the first and/or second evaluation data most closely approximates the additional evaluation data. In some embodiments, for example, the discrepancy report can be provided to the user via one of the user devices 106 including, for example, the supervisor device 106-B, and specifically the user interface 202 of the user device 106.
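A hypothetical rule for the part of the discrepancy report that identifies which earlier evaluation most closely approximates the additional evaluation data: count aspect-level agreements with the additional data (record layout assumed as above):

```python
def agreement(a: dict, b: dict) -> int:
    """Count aspects on which two evaluation-data records match."""
    return sum(a.get(k) == b.get(k) for k in set(a) | set(b))

def closest_to_additional(first: dict, second: dict, additional: dict) -> str:
    """Name whichever of the first and second evaluation data agrees with
    the additional evaluation data on more aspects."""
    f, s = agreement(first, additional), agreement(second, additional)
    if f == s:
        return "tie"
    return "first" if f > s else "second"
```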
  • With reference now to FIG. 5, a flowchart illustrating one embodiment of a process 500 for generating evaluation data is shown. In some embodiments, the process 500 can be performed as part of, or in the place of block 418 shown in FIG. 4. The process 500 can be performed by the processor 102, the evaluation engine 208 of one of the user devices 106, and/or by any other component of the evaluation control system.
  • The process 500 begins at decision state 502, wherein it is determined whether to display data, and particularly, whether to display all or portions of the evaluation data including, for example, the first evaluation data. In some embodiments, this can include determining whether a user request for the display of evaluation data has been made, which user request can be a specific user request or a general rule or request to display evaluation data. If it is determined that evaluation data will not be displayed, then the process 500 proceeds to block 504 and returns to block 420 of FIG. 4.
  • If it is determined that evaluation data will be displayed, then the process 500 proceeds to block 506, wherein artifacts are received. In some embodiments, this can include receiving all artifacts for which the evaluation data is relevant including, for example, all evaluated artifacts for a student, a class, a grade, a study, or any other group of artifacts. In some embodiments, these artifacts can be retrieved from, for example, one of the databases 104 such as, for example, the portfolio database 104-B.
  • After the artifacts have been received/retrieved, the process 500 proceeds to block 508, wherein the tags are received/retrieved. In some embodiments, the retrieval of the tags can include the retrieval of information associated with the tags, including, for example, one or several comments, notes, or marks created and/or associated with the tags. In some embodiments, the tags can be retrieved from one of the databases such as, for example, the tag database 104-C.
  • After the tags have been received, the process 500 proceeds to block 510 where grouping criteria are received. In some embodiments, the grouping criteria can include one or several rules for categorizing tags. In some embodiments, these one or several rules can categorize tags according to the artifact with which a tag is associated, the type of tag, the type of comment, note, or mark associated with the tag, or the like. The grouping criteria can be created by the user and can be received via one or several of the user devices 106 and can be stored in one of the databases 104 such as the evaluation database 104-D.
  • After the grouping criteria have been received, the process 500 proceeds to block 512 wherein the tags are grouped. In some embodiments, the grouping of the tags can include, for example, grouping the tags according to one or several attributes of the tag including, for example, the tag type, tag content, including any comment, note, or mark associated with the tag, or the like. In some embodiments, this grouping can be performed according to the grouping criteria. In some embodiments, the grouping of the tags can include storing information identifying the grouping of one or several of the tags, which data can be stored in, for example, the tag database 104-C.
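A sketch of block 512: grouping tag records by an attribute named in the grouping criteria. The record layout and attribute names are illustrative assumptions:

```python
from collections import defaultdict

def group_tags(tags: list, attribute: str) -> dict:
    """Group tag records by one attribute, e.g. tag type or artifact."""
    groups = defaultdict(list)
    for tag in tags:
        groups[tag[attribute]].append(tag["id"])
    return dict(groups)

tags = [
    {"id": "t1", "tag_type": "grammar", "artifact": "essay-1"},
    {"id": "t2", "tag_type": "evidence", "artifact": "essay-1"},
    {"id": "t3", "tag_type": "grammar", "artifact": "essay-2"},
]
print(group_tags(tags, "tag_type"))
# {'grammar': ['t1', 't3'], 'evidence': ['t2']}
```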
  • After the tags have been grouped, the process 500 proceeds to block 514, wherein the tags are displayed. In some embodiments, the tags can be displayed to the user via, for example, the user device 106. In one embodiment, for example, the tags and artifacts can be simultaneously displayed to the user such that the tags are located in a first display portion and one or several artifacts, or portions thereof, are located in a second display portion. In some embodiments, the first and second display portions can be first and second portions of a display such as, for example, a screen or monitor. After the tags have been displayed, the process 500 proceeds to block 516 and returns to block 420 of FIG. 4.
  • With reference now to FIG. 6, a flowchart illustrating one embodiment of a process 600 for displaying evaluation data is shown. In some embodiments, the process 600 can be performed as part of, or in the place of block 514 shown in FIG. 5. The process 600 can be performed by the processor 102, the evaluation engine 208 of one of the user devices 106, and/or by any other component of the evaluation control system.
  • The process 600 begins at block 602 wherein tag data is retrieved. In some embodiments, this retrieval of tag data can be the same as the receipt of tags in block 508, and in some embodiments, this retrieval of tag data can include the retrieval of tag data in addition to that retrieved in block 508. This tag data can be retrieved from one of the databases 104 such as, for example, the tag database 104-C.
  • After the tag data has been received, the process 600 proceeds to block 604 wherein a count is incremented for each tag associated with received/retrieved tag data. In some embodiments, this incrementing can be performed based on all of the information received in one or both of blocks 508 and 602. In some embodiments, the count can be stored in one of the databases 104. After the count has been incremented for each of the received/retrieved tags, the process 600 proceeds to block 606 wherein the number of tags is determined. In some embodiments, this can be achieved via retrieval of the count.
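Blocks 604 through 606 amount to a per-tag count followed by a total; a minimal sketch with assumed record fields:

```python
from collections import Counter

tag_data = [{"id": "t1"}, {"id": "t2"}, {"id": "t1"}]  # retrieved tag data
counts = Counter(record["id"] for record in tag_data)  # increment per tag
number_of_tags = sum(counts.values())                  # block 606 total
print(counts, number_of_tags)  # Counter({'t1': 2, 't2': 1}) 3
```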
  • After the number of tags has been determined, the process 600 proceeds to block 608 wherein tag type is extracted from the tag data. In some embodiments, this tag type information can be stored as part of the tag data, and can be extracted from the tag data received in block 602. In some embodiments, this extraction can be performed by the processor 102 and/or the user device 106.
  • After the tag type has been extracted from the tag data, the process 600 proceeds to block 610 wherein a tag score is generated. In some embodiments, the tag score can reflect the degree to which a tag affects the score of one or several artifacts. In some embodiments, the tag score can indicate a degree to which the tag increments, decrements, or does not affect an artifact score. In some embodiments, for example, a tag indicating a negative aspect of an artifact can, based on the strength of the negative aspect of the artifact, decrease the score of the artifact. Similarly, a tag indicating a positive aspect of an artifact can, based on the strength of the positive aspect of the artifact, increase the score of the artifact. The tag score can be stored in one of the databases 104 such as the tag database 104-C. In some embodiments, the tag score can be generated according to scoring rules that can be stored in one of the databases 104 such as, for example, the evaluation database 104-D.
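A sketch of block 610, under the assumption that scoring rules map a tag type to a score direction and the strength of the tagged aspect scales the magnitude; the rule table is invented for illustration:

```python
SCORING_RULES = {
    "positive": +1.0,  # tag marks a positive aspect: increments the score
    "negative": -1.0,  # tag marks a negative aspect: decrements the score
    "neutral":   0.0,  # tag does not affect the artifact score
}

def tag_score(tag_type: str, strength: float = 1.0) -> float:
    """Generate a tag score: direction from the tag type, magnitude from
    the strength of the tagged aspect."""
    return SCORING_RULES[tag_type] * strength

print(tag_score("negative", strength=0.5))  # -0.5
```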
  • After the tag score has been generated, the process 600 proceeds to block 612, wherein a sum score is calculated. In some embodiments, the sum score can be a value representing the aggregate effect of some or all of the tags. The sum score can be a rough score that can be converted to a final score for an artifact and/or a final score for a portfolio. In some embodiments, the sum score can be calculated by the combination of tag scores, which combination can include the addition of tag scores, subtraction of tag scores, and the application of one or several weighting factors to some or all of the tag scores based on the relative importance and/or weight associated with some or all of the tag scores. In some embodiments, the sum score can be stored in one of the databases 104 such as the tag database 104-C. The sum score can be calculated according to scoring rules that can be stored in one of the databases 104 such as, for example, the evaluation database 104-D.
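The sum score of block 612 can then be sketched as a weighted combination of tag scores, with the weights standing in for the relative importance described above:

```python
def sum_score(tag_scores, weights=None):
    """Combine tag scores into a rough sum score, weighting each score by
    its relative importance; weights default to 1.0."""
    weights = weights or [1.0] * len(tag_scores)
    return sum(s * w for s, w in zip(tag_scores, weights))

print(sum_score([1.0, -0.5, 1.0], weights=[2.0, 1.0, 0.5]))  # 2.0
```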
  • After the sum score has been calculated, the process 600 proceeds to block 614 wherein the sum score is compared to scoring data. In some embodiments in which the sum score is a rough score, this can include the conversion of the sum score to a final score. The final score can, in some embodiments, be a recommended final score, and/or final score range. In some embodiments, an evaluator may be able to select a score other than the recommended final score, and in some embodiments, the evaluator may be limited to selecting a score corresponding to the recommended final score, including a score from the range indicated by the recommended final score. In some embodiments, this conversion can include comparison of the sum score to scoring data, application of a scoring algorithm, or the like. In some embodiments, this conversion can be performed by the processor 102 and/or user device 106.
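One hypothetical form the scoring data of block 614 could take is a table of sum-score ranges mapped to recommended final scores; the ranges and scores below are invented:

```python
SCORING_DATA = [          # (low, high, recommended final score) -- invented
    (float("-inf"), 0.0, 1),
    (0.0, 2.0, 2),
    (2.0, 4.0, 3),
    (4.0, float("inf"), 4),
]

def recommended_final_score(rough: float) -> int:
    """Convert a rough sum score to a recommended final score by comparing
    it against the scoring data ranges."""
    for low, high, score in SCORING_DATA:
        if low <= rough < high:
            return score
    raise ValueError("sum score outside scoring data")

print(recommended_final_score(2.0))  # 3
```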
  • After the sum score has been compared to the scoring data, the process 600 proceeds to block 616, wherein the score is retrieved. In some embodiments, this score can be the final score, and the retrieval of this score can be the receipt of the result of the scoring algorithm, the output of the comparison of the sum score to the scoring data, or the like. In some embodiments, the final score can be stored within one of the databases 104 such as, for example, the evaluation database 104-D. After the score has been retrieved, the process 600 proceeds to block 618 and returns to block 420 of FIG. 4.
  • With reference now to FIG. 7, a flowchart illustrating one embodiment of a process 700 for training an evaluator is shown. In some embodiments, this process 700 can be used to provide an evaluation task to a trainee and to qualify the results of that evaluation task. The process 700 can be performed by the evaluation control system 100 and/or components of the evaluation control system 100. Process 700 begins at block 702 wherein the portfolio is generated. In some embodiments, for example, a portfolio can be generated specifically for purposes of training an evaluator, which evaluator in training is also referred to herein as a trainee. In some embodiments, the generated portfolio can be created with the user device 106 such as, for example, the supervisor device 106-B and/or with the data source 108.
  • After the portfolio has been generated, the process 700 proceeds to block 704 wherein evaluation criteria are received. In some embodiments, the evaluation criteria can be used in evaluating one or several artifacts and/or portfolios and can comprise indications of features of the one or several artifacts and/or portfolios, and a scoring effect of those features. In some embodiments, the evaluation criteria can comprise, for example, a rubric and/or other scoring aid. The evaluation criteria can, in some embodiments, comprise one or several sub-criteria that can, for example, focus on a specific aspect of the evaluation. The evaluation criteria can be created with the user device 106 such as, for example, the supervisor device 106-B and/or with the data source 108.
  • After the evaluation criteria have been received, the process 700 proceeds to block 706 wherein a key is received. In some embodiments, for example, the key can comprise a second evaluation of the generated portfolio, and specifically, a verified evaluation of the portfolio. The key can include information relating to the evaluation of the portfolio, and specifically to the ideal overall evaluation of the portfolio and/or one or several artifacts in the portfolio, information relating to ideal tagging associated with the portfolio and the evaluation criteria, and portions of the portfolio and/or one or several artifacts indicated by the ideal tags.
  • After the key has been received, the process 700 proceeds to block 708 wherein the portfolio is provided. In some embodiments, for example, the portfolio and/or one or several artifacts in the portfolio can be provided to the user via one of the user devices 106 including, for example, the evaluator device 106-A. In some embodiments, the user device 106 can provide the portfolio and/or one or several artifacts in the portfolio to the user via the user interface 202. The artifacts can be retrieved from the one or several portfolios stored within the portfolio database 104-B. In some embodiments, for example, the processor 102 can query the portfolio database 104-B for the stored portfolio and/or one or several artifacts in the portfolio. In some embodiments, the desired portfolio and/or one or several artifacts in the desired portfolio can be selected from the portfolio database 104-B and can be provided to the user.
  • After the portfolio has been provided, the process 700 proceeds to blocks 710 through 712, which blocks outline the step of receiving the tag indicated in block 410 of FIG. 4 in greater detail. The step of receiving the tag begins with block 710 wherein an indicator of one or several artifact and/or portfolio portions is received. In some embodiments, this indicator can be received via one of the user devices 106 including, for example, the evaluator device 106-A. In some embodiments, this can include an indication of one or several tagged portions of the one or several artifacts and/or portfolios. In some embodiments, this indicator can identify the tagged portion of the one or several artifacts and/or portfolios and can, for example, identify the beginning and/or end of the tagged portion of the one or several artifacts and/or portfolios. In some embodiments, this indicator can be stored in one of the databases 104 including, for example, the tag database 104-C.
  • After the indicator of the portion of the one or several artifacts and/or portfolios is received, the process 700 proceeds to block 712 wherein an indicator of a criteria is received. In some embodiments, for example, the indicator of the criteria can be received via one of the user devices 106 including, for example, the evaluator device 106-A. In some embodiments, the indicator of the criteria can identify a portion of the evaluation criteria relevant to the indicated portion of the one or several artifacts and/or portfolios. In some embodiments, this indicator can be stored in one of the databases 104 including, for example, the tag database 104-C.
  • After the indicator of the criteria is received, the process 700 proceeds to block 714 wherein an indicator of the trainee is received. In some embodiments, for example, this indicator can be received via one of the user devices 106 including, for example, the evaluator device 106-A. In some embodiments, this indicator can be used to determine the skill level of the trainee in evaluating a portfolio. In some embodiments, this indicator can be stored in one of the databases 104 including, for example, the tag database 104-C.
  • After the indicator of the trainee has been received, the process 700 proceeds to block 716 wherein the portfolio evaluation is received. In some embodiments, for example, the portfolio evaluation can be received from and/or performed with one of the user devices 106 and/or other components of the evaluation control system 100. In some embodiments, for example, this step can include the grouping of one or several tags, providing the group of one or several tags to the user, and receiving an evaluation based on these tags and the evaluation criteria. In some embodiments, the received evaluation can be stored in one of the databases 104 such as, for example, the evaluation database 104-D.
  • After the portfolio evaluation has been received, the process 700 proceeds to block 718 wherein the portfolio evaluation and the key are compared. In some embodiments, for example, this comparison can include comparing the evaluations of the one or several artifacts and/or portfolios to determine if the evaluation and/or score is the same, comparing the tags associated with the evaluation criteria of the portfolio evaluation and the key to determine similarities and/or differences in tags applied in both instances, and/or the comparison of portions of the one or several artifacts and/or portfolios identified by the tags. In some embodiments, for example, this comparison can be performed by the processor 102 and/or by the evaluation engine 208 of one of the user devices 106. In some embodiments, for example, this comparison can be performed according to a Boolean function wherein matching aspects of the portfolio evaluation and the key are assigned a first value and nonmatching aspects of the portfolio evaluation and the key are assigned a second value. In some embodiments, this comparison can be performed in the same and/or similar manner to the comparison of block 424 of FIG. 4.
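Applied to training, the same kind of Boolean comparison can be sketched against the key's verified evaluation; the three aspects mirror those in the text, and the field names are assumptions:

```python
def compare_to_key(evaluation: dict, key: dict) -> dict:
    """Assign 1 where the trainee's portfolio evaluation matches the key
    and 0 where it does not, across score, tags, and tagged portions."""
    return {
        "score":    int(evaluation["score"] == key["score"]),
        "tags":     int(set(evaluation["tags"]) == set(key["tags"])),
        "portions": int(set(evaluation["portions"]) == set(key["portions"])),
    }

trainee = {"score": 3, "tags": ["t1", "t2"], "portions": [(10, 42)]}
key     = {"score": 4, "tags": ["t1", "t2"], "portions": [(10, 42)]}
print(compare_to_key(trainee, key))  # {'score': 0, 'tags': 1, 'portions': 1}
```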
  • After the portfolio evaluation and the key have been compared, the process 700 proceeds to decision state 720 wherein it is determined if the portfolio evaluation and the key include a different evaluation of the portfolio and/or of one or several artifacts in the portfolio. In some embodiments, for example, this determination can be made by the processor 102 and/or by a component of one of the user devices 106 including, for example, the evaluation engine 208. In some embodiments, this determination can include retrieving the assigned values indicative of matching and/or nonmatching aspects of the portfolio evaluation and/or the evaluation of one or several artifacts in the portfolio.
  • If it is determined that the portfolio evaluation and the key include a different evaluation of the portfolio and/or of one or several artifacts in the portfolio, the process 700 proceeds to block 722 wherein an indicator of the difference in the evaluation is stored. In some embodiments, for example, this indicator can be stored in one of the databases 104 including, for example, the profile database 104-A, the portfolio database 104-B, and/or the evaluation database 104-D.
  • After the indicator of the difference in the evaluation has been stored, or, returning again to decision state 720, if it is determined that the portfolio evaluation and the key do not include a different evaluation of the portfolio and/or of one or several artifacts in the portfolio, the process 700 proceeds to decision state 724 wherein it is determined if different tags have been assigned to the portfolio and/or to the one or several artifacts in the portfolio in the portfolio evaluation and the key. In some embodiments, for example, this determination can be made by the processor 102 and/or by a component of one of the user devices 106 including, for example, the evaluation engine 208. In some embodiments, this determination can include retrieving assigned values indicative of matching and/or nonmatching tags in the portfolio evaluation and/or in the evaluation of one or several artifacts in the portfolio.
  • If it is determined that the portfolio evaluation and the key include different tags, the process 700 proceeds to block 726 wherein an indicator of the difference in the tags is stored. In some embodiments, for example, this indicator can be stored in one of the databases 104 including, for example, the profile database 104-A, the portfolio database 104-B, and/or the evaluation database 104-D.
  • After the indicator of the difference in the tags has been stored, or, returning again to decision state 724, if it is determined that the portfolio evaluation and the key do not include different tags, the process 700 proceeds to block 728 wherein a difference report is generated. In some embodiments, the difference report can, for example, identify the differences between the portfolio evaluation and the key including, for example, differences in the evaluation of the one or several artifacts and/or portfolios, differences in the tags applied to the one or several artifacts and/or portfolios, and/or differences in the portions of the artifacts and/or portfolios identified by the tags. In some embodiments, the difference report can be generated by the evaluation control system 100, and can specifically be generated by the processor 102 and/or one of the user devices 106 or component thereof including, for example, the evaluation engine 208.
  • After the difference report has been generated, the process 700 proceeds to block 730 wherein the difference report is provided. In some embodiments, for example, the difference report can be provided to the user via one of the user devices including, for example, the supervisor device 106-B, and specifically the user interface 202 of the user device 106.
  • After the difference report has been provided, the process 700 proceeds to block 732 wherein training is recommended and/or training content is provided. In some embodiments, for example, the difference between the portfolio evaluation and the key can be sufficient such that additional training can be beneficial. In some embodiments, for example, this training can be recommended based on the difference in the evaluation of the one or several artifacts and/or portfolios, and in some embodiments, this training can be recommended based on the difference in the tags applied to the one or several artifacts and/or portfolios. In some embodiments, for example, a component of the evaluation control system 100 such as, for example, the processor 102 and/or one of the user devices 106 can compare the difference between the portfolio evaluation and the key with a threshold for requiring additional training, and can, in some embodiments, recommend additional training and/or provide additional training material based on the relationship of that difference to the threshold.
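A minimal sketch of this final training decision, assuming the comparison of the portfolio evaluation and the key reduces to a scalar difference and the training threshold is a number:

```python
def recommend_training(difference: float, training_threshold: float) -> bool:
    """Recommend additional training when the difference between the
    portfolio evaluation and the key meets the threshold."""
    return difference >= training_threshold

if recommend_training(difference=0.8, training_threshold=0.5):
    print("Recommend and/or provide additional training material.")
```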
  • With reference now to FIG. 8, an exemplary environment with which embodiments may be implemented is shown with a computer system 800 that can be used by a user 804 as all or a component of the evaluation control system 100. The computer system 800 can include a computer 802, keyboard 822, a network router 812, a printer 808, and a monitor 806. The monitor 806, computer 802, and keyboard 822 are part of a computer system 826, which can be a laptop computer, desktop computer, handheld computer, mainframe computer, etc. The monitor 806 can be a CRT, flat screen, etc.
  • A user 804 can input commands into the computer 802 using various input devices, such as a mouse, keyboard 822, track ball, touch screen, etc. If the computer system 800 comprises a mainframe, the user 804 can access the computer 802 using, for example, a terminal or terminal interface. Additionally, the computer system 826 may be connected to a printer 808 and a server 810 using a network router 812, which may connect to the Internet 818 or a WAN.
  • The server 810 may, for example, be used to store additional software programs and data. In one embodiment, software implementing the systems and methods described herein can be stored on a storage medium in the server 810. Thus, the software can be run from the storage medium in the server 810. In another embodiment, software implementing the systems and methods described herein can be stored on a storage medium in the computer 802. Thus, the software can be run from the storage medium in the computer system 826. Therefore, in this embodiment, the software can be used whether or not computer 802 is connected to network router 812. Printer 808 may be connected directly to computer 802, in which case, the computer system 826 can print whether or not it is connected to network router 812.
  • With reference to FIG. 9, an embodiment of a special-purpose computer system 904 is shown. The above methods may be implemented by computer-program products that direct a computer system to perform the actions of the above-described methods and components. Each such computer-program product may comprise sets of instructions (codes) embodied on a computer-readable medium that directs the processor of a computer system to perform corresponding actions. The instructions may be configured to run in sequential order, or in parallel (such as under different processing threads), or in a combination thereof. Loading the computer-program products onto the general-purpose computer system 826 transforms it into the special-purpose computer system 904.
  • Special-purpose computer system 904 comprises a computer 802, a monitor 806 coupled to computer 802, one or more additional user output devices 930 (optional) coupled to computer 802, one or more user input devices 940 (e.g., keyboard, mouse, track ball, touch screen) coupled to computer 802, an optional communications interface 950 coupled to computer 802, a computer-program product 905 stored in a tangible computer-readable memory in computer 802. Computer-program product 905 directs system 904 to perform the above-described methods. Computer 802 may include one or more processors 960 that communicate with a number of peripheral devices via a bus subsystem 990. These peripheral devices may include user output device(s) 930, user input device(s) 940, communications interface 950, and a storage subsystem, such as random access memory (RAM) 970 and non-volatile storage drive 980 (e.g., disk drive, optical drive, solid state drive), which are forms of tangible computer-readable memory.
  • Computer-program product 905 may be stored in non-volatile storage drive 980 or another computer-readable medium accessible to computer 802 and loaded into memory 970. Each processor 960 may comprise a microprocessor, such as a microprocessor from Intel® or Advanced Micro Devices, Inc.®, or the like. To support computer-program product 905, the computer 802 runs an operating system that handles the communications of product 905 with the above-noted components, as well as the communications between the above-noted components in support of the computer-program product 905. Exemplary operating systems include Windows® or the like from Microsoft® Corporation, Solaris® from Oracle®, LINUX, UNIX, and the like.
  • User input devices 940 include all possible types of devices and mechanisms to input information to computer 802. These may include a keyboard, a keypad, a mouse, a scanner, a digital drawing pad, a touch screen incorporated into the display, audio input devices such as voice recognition systems, microphones, and other types of input devices. In various embodiments, user input devices 940 are typically embodied as a computer mouse, a trackball, a track pad, a joystick, a wireless remote, a drawing tablet, and/or a voice command system. User input devices 940 typically allow a user to select objects, icons, text and the like that appear on the monitor 806 via a command such as a click of a button or the like. User output devices 930 include all possible types of devices and mechanisms to output information from computer 802. These may include a display (e.g., monitor 806), printers, non-visual displays such as audio output devices, etc.
  • Communications interface 950 provides an interface to other communication networks 995 and devices and may serve as an interface to receive data from and transmit data to other systems, WANs and/or the Internet 818. Embodiments of communications interface 950 typically include an Ethernet card, a modem (telephone, satellite, cable, ISDN), a (asynchronous) digital subscriber line (DSL) unit, a FireWire® interface, a USB® interface, a wireless network adapter, and the like. For example, communications interface 950 may be coupled to a computer network, to a FireWire® bus, or the like. In other embodiments, communications interface 950 may be physically integrated on the motherboard of computer 802, and/or may be a software program, or the like.
  • RAM 970 and non-volatile storage drive 980 are examples of tangible computer-readable media configured to store data such as computer-program product embodiments of the present invention, including executable computer code, human-readable code, or the like. Other types of tangible computer-readable media include floppy disks, removable hard disks, optical storage media such as CD-ROMs, DVDs, bar codes, semiconductor memories such as flash memories, read-only-memories (ROMs), battery-backed volatile memories, networked storage devices, and the like. RAM 970 and non-volatile storage drive 980 may be configured to store the basic programming and data constructs that provide the functionality of various embodiments of the present invention, as described above.
  • Software instruction sets that provide the functionality of the present invention may be stored in RAM 970 and non-volatile storage drive 980. These instruction sets or code may be executed by the processor(s) 960. RAM 970 and non-volatile storage drive 980 may also provide a repository to store data and data structures used in accordance with the present invention. RAM 970 and non-volatile storage drive 980 may include a number of memories including a main random access memory (RAM) to store instructions and data during program execution and a read-only memory (ROM) in which fixed instructions are stored. RAM 970 and non-volatile storage drive 980 may include a file storage subsystem providing persistent (non-volatile) storage of program and/or data files. RAM 970 and non-volatile storage drive 980 may also include removable storage systems, such as removable flash memory.
  • Bus subsystem 990 provides a mechanism to allow the various components and subsystems of computer 802 to communicate with each other as intended. Although bus subsystem 990 is shown schematically as a single bus, alternative embodiments of the bus subsystem may utilize multiple busses or communication paths within the computer 802.
  • A number of variations and modifications of the disclosed embodiments can also be used. Specific details are given in the above description to provide a thorough understanding of the embodiments. However, it is understood that the embodiments may be practiced without these specific details. For example, well-known circuits, processes, algorithms, structures, and techniques may be shown without unnecessary detail in order to avoid obscuring the embodiments.
  • Implementation of the techniques, blocks, steps and means described above may be done in various ways. For example, these techniques, blocks, steps and means may be implemented in hardware, software, or a combination thereof. For a hardware implementation, the processing units may be implemented within one or more application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, other electronic units designed to perform the functions described above, and/or a combination thereof.
  • Also, it is noted that the embodiments may be described as a process which is depicted as a flowchart, a flow diagram, a swim diagram, a data flow diagram, a structure diagram, or a block diagram. Although a depiction may describe the operations as a sequential process, many of the operations can be performed in parallel or concurrently. In addition, the order of the operations may be re-arranged. A process is terminated when its operations are completed, but could have additional steps not included in the figure. A process may correspond to a method, a function, a procedure, a subroutine, a subprogram, etc. When a process corresponds to a function, its termination corresponds to a return of the function to the calling function or the main function.
  • Furthermore, embodiments may be implemented by hardware, software, scripting languages, firmware, middleware, microcode, hardware description languages, and/or any combination thereof. When implemented in software, firmware, middleware, scripting language, and/or microcode, the program code or code segments to perform the necessary tasks may be stored in a machine readable medium such as a storage medium. A code segment or machine-executable instruction may represent a procedure, a function, a subprogram, a program, a routine, a subroutine, a module, a software package, a script, a class, or any combination of instructions, data structures, and/or program statements. A code segment may be coupled to another code segment or a hardware circuit by passing and/or receiving information, data, arguments, parameters, and/or memory contents. Information, arguments, parameters, data, etc. may be passed, forwarded, or transmitted via any suitable means including memory sharing, message passing, token passing, network transmission, etc.
  • For a firmware and/or software implementation, the methodologies may be implemented with modules (e.g., procedures, functions, and so on) that perform the functions described herein. Any machine-readable medium tangibly embodying instructions may be used in implementing the methodologies described herein. For example, software codes may be stored in a memory. Memory may be implemented within the processor or external to the processor. As used herein the term “memory” refers to any type of long term, short term, volatile, nonvolatile, or other storage medium and is not to be limited to any particular type of memory or number of memories, or type of media upon which memory is stored.
  • Moreover, as disclosed herein, the term “storage medium” may represent one or more memories for storing data, including read only memory (ROM), random access memory (RAM), magnetic RAM, core memory, magnetic disk storage mediums, optical storage mediums, flash memory devices and/or other machine readable mediums for storing information. The term “machine-readable medium” includes, but is not limited to, portable or fixed storage devices, optical storage devices, and/or various other storage mediums capable of storing, containing, or carrying instruction(s) and/or data.
  • While the principles of the disclosure have been described above in connection with specific apparatuses and methods, it is to be clearly understood that this description is made only by way of example and not as limitation on the scope of the disclosure.

Claims (18)

What is claimed is:
1. A system for verifying the evaluation of subject matter, the system comprising:
a processor configured to:
receive a portfolio, wherein the portfolio comprises a compilation of artifacts, wherein the artifacts comprise work product of an evaluatee;
provide one of the artifacts to an evaluator;
receive a plurality of first indications of an evaluation criteria, wherein the evaluation criteria comprises a plurality of sub-criteria, wherein the first indications of the evaluation criteria identify a first portion of the artifact and a first portion of the evaluation criteria;
assign a value to the plurality of indications of the evaluation criteria, wherein the value identifies the source of the indications of the evaluation criteria, wherein the source of the indications of the evaluation criteria is the evaluator;
receive a first evaluation of the artifact, wherein the evaluation is based on the evaluation criteria and the first indications of the evaluation criteria;
receive a second evaluation of the artifact, second indications of the evaluation criteria, and second associated portions of the artifact;
compare the first and second evaluations of the artifact, indications of the evaluation criteria, and the associated portions of the artifact; and
provide an indication of the differences between the first and second evaluations of the artifact, indications of the evaluation criteria, and the associated portions of the artifact; and
memory configured to store:
the received portfolio;
the first evaluation of the artifact;
the second evaluation of the artifact;
the second indications of the evaluation criteria and the second associated portions of the artifact; and
the result of the comparison of the first and second evaluations of the artifact.
2. The system of claim 1, wherein the processor is further configured to select the evaluator.
3. The system of claim 1, wherein the identified portion of the evaluation criteria comprises a sub-criterion.
4. The system of claim 1, further comprising a user device configured to:
display the first evaluation of the artifact;
display the first indications of the evaluation criteria; and
display first associated portions of the artifact.
5. The system of claim 4, wherein the comparison of the first and second evaluations of the artifact further comprises:
generation of a difference value, wherein the difference value characterizes the degree of difference between the first and second evaluations;
retrieval of an acceptance threshold, wherein the acceptance threshold is a value demarking levels of acceptable and unacceptable differences between the first and second evaluations; and
comparison of the difference value to the acceptance threshold.
6. The system of claim 5, further comprising comparing the first indications of the evaluation criteria and first associated portions of the artifact with the second indications of the evaluation criteria and second associated portions of the artifact if the comparison of the difference value to the acceptance threshold indicates an unacceptable level of difference between the first and second evaluations.
7. The system of claim 5, wherein the processor is further configured to generate a final score based on the received plurality of first indications of an evaluation criteria.
8. A method of verifying the evaluation of subject matter comprising:
receiving a portfolio, wherein the portfolio comprises a compilation of artifacts, wherein the artifacts comprise work product of an evaluatee;
providing one of the artifacts to an evaluator;
receiving a plurality of first indications of an evaluation criteria, wherein the evaluation criteria comprises a plurality of sub-criteria, wherein the first indications of the evaluation criteria identify a first portion of the artifact and a first portion of the evaluation criteria;
assigning a value to the plurality of indications of the evaluation criteria, wherein the value identifies the source of the indications of the evaluation criteria;
receiving a first evaluation of the artifact, wherein the evaluation is based on the evaluation criteria and the first indications of the evaluation criteria;
providing the first evaluation of the artifact, the first indications of the evaluation criteria, and first associated portions of the artifact;
receiving a second evaluation of the artifact, second indications of the evaluation criteria, and second associated portions of the artifact;
comparing the first and second evaluations of the artifact, indications of the evaluation criteria, and the associated portions of the artifact; and
providing an indication of the differences between the first and second evaluations of the artifact, indications of the evaluation criteria, and the associated portions of the artifact.
9. The method of claim 8, further comprising selecting the evaluator.
10. The method of claim 8, wherein the identified portion of the evaluation criteria comprises a sub-criterion.
11. The method of claim 10, wherein the comparison of the first and second evaluations of the artifact further comprises:
generating a difference value, wherein the difference value characterizes the degree of difference between the first and second evaluations;
retrieving an acceptance threshold, wherein the acceptance threshold is a value demarking levels of acceptable and unacceptable differences between the first and second evaluations; and
comparing the difference value to the acceptance threshold.
12. The method of claim 11, further comprising comparing the first indications of the evaluation criteria and first associated portions of the artifact with the second indications of the evaluation criteria and second associated portions of the artifact if the comparison of the difference value to the acceptance threshold indicates an unacceptable level of difference between the first and second evaluations.
13. The method of claim 10, further comprising generating a final score based on the received plurality of first indications of an evaluation criteria.
14. A method of training an evaluator comprising:
generating a portfolio comprising a compilation of artifacts, wherein the artifacts comprise work product of an evaluatee;
receiving evaluation criteria associated with the portfolio, wherein the evaluation criteria comprise a plurality of sub-criteria;
receiving a key, wherein the key comprises a plurality of indications of an evaluation criteria, wherein the indications in the key are the correct indications of an evaluation criteria for a portfolio; wherein the indications of the evaluation criteria in the key identify a portion of one of the artifacts and a sub-criteria of the evaluation criteria;
providing the portfolio to a trainee;
receiving a portfolio evaluation, wherein the portfolio evaluation comprises a plurality of indications of the evaluation criteria, wherein the indications in the portfolio evaluation are received from the trainee; wherein the indications of the evaluation criteria in the portfolio evaluation identify a portion of one of the artifacts and a sub-criteria of the evaluation criteria;
comparing the key and the portfolio evaluation according to a Boolean function to determine the accuracy of the portfolio evaluation; and
providing an indicator of the accuracy of the portfolio evaluation.
15. The method of claim 14, wherein the comparison of the key and the portfolio evaluation comprises comparing the score of the artifact in the portfolio evaluation with the score of the artifact in the key.
16. The method of claim 15, wherein the comparison of the portfolio evaluation and the key further comprises:
generating a difference value, wherein the difference value characterizes the degree of difference between the portfolio evaluation and the key;
retrieving an acceptance threshold, wherein the acceptance threshold is a value demarking levels of acceptable and unacceptable differences between the portfolio evaluation and the key; and
comparing the difference value to the acceptance threshold.
17. The method of claim 14, wherein the comparison of the key and the portfolio evaluation comprises the comparison of the indications of the evaluation criteria in the portfolio evaluation to the indications of the evaluation criteria in the key.
18. The method of claim 15, further comprising providing additional training material.
US14/252,402 2013-04-12 2014-04-14 Evaluation control Abandoned US20140308650A1 (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
US14/252,402 US20140308650A1 (en) 2013-04-12 2014-04-14 Evaluation control
US15/491,888 US10019527B2 (en) 2013-04-12 2017-04-19 Systems and methods for automated aggregated content comment generation
US15/719,114 US20180307770A1 (en) 2013-04-12 2017-09-28 Systems and methods for automated aggregated content comment generation
US16/032,023 US10977257B2 (en) 2013-04-12 2018-07-10 Systems and methods for automated aggregated content comment generation

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201361811347P 2013-04-12 2013-04-12
US14/252,402 US20140308650A1 (en) 2013-04-12 2014-04-14 Evaluation control

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US15/491,888 Continuation-In-Part US10019527B2 (en) 2013-04-12 2017-04-19 Systems and methods for automated aggregated content comment generation

Publications (1)

Publication Number Publication Date
US20140308650A1 true US20140308650A1 (en) 2014-10-16

Family

ID=50729849

Family Applications (8)

Application Number Title Priority Date Filing Date
US14/252,402 Abandoned US20140308650A1 (en) 2013-04-12 2014-04-14 Evaluation control
US15/491,888 Active US10019527B2 (en) 2013-04-12 2017-04-19 Systems and methods for automated aggregated content comment generation
US15/637,693 Active 2037-08-25 US10417241B2 (en) 2013-04-12 2017-06-29 System and method for automated aggregated content comment provisioning
US15/637,588 Abandoned US20180307769A1 (en) 2013-04-12 2017-06-29 Interface-based automated aggregated content generation
US15/719,365 Abandoned US20180307771A1 (en) 2013-04-12 2017-09-28 Systems and methods for automated aggregated content comment generation
US15/719,114 Abandoned US20180307770A1 (en) 2013-04-12 2017-09-28 Systems and methods for automated aggregated content comment generation
US16/032,023 Active 2035-03-23 US10977257B2 (en) 2013-04-12 2018-07-10 Systems and methods for automated aggregated content comment generation
US16/189,456 Active 2037-11-25 US11003674B2 (en) 2013-04-12 2018-11-13 Systems and methods for automated aggregated content comment generation

Family Applications After (7)

Application Number Title Priority Date Filing Date
US15/491,888 Active US10019527B2 (en) 2013-04-12 2017-04-19 Systems and methods for automated aggregated content comment generation
US15/637,693 Active 2037-08-25 US10417241B2 (en) 2013-04-12 2017-06-29 System and method for automated aggregated content comment provisioning
US15/637,588 Abandoned US20180307769A1 (en) 2013-04-12 2017-06-29 Interface-based automated aggregated content generation
US15/719,365 Abandoned US20180307771A1 (en) 2013-04-12 2017-09-28 Systems and methods for automated aggregated content comment generation
US15/719,114 Abandoned US20180307770A1 (en) 2013-04-12 2017-09-28 Systems and methods for automated aggregated content comment generation
US16/032,023 Active 2035-03-23 US10977257B2 (en) 2013-04-12 2018-07-10 Systems and methods for automated aggregated content comment generation
US16/189,456 Active 2037-11-25 US11003674B2 (en) 2013-04-12 2018-11-13 Systems and methods for automated aggregated content comment generation

Country Status (5)

Country Link
US (8) US20140308650A1 (en)
CN (1) CN105264555A (en)
AU (2) AU2014250772A1 (en)
CA (1) CA2908952A1 (en)
WO (1) WO2014169288A1 (en)

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9483954B2 (en) 2013-03-12 2016-11-01 Pearson Education, Inc. Educational network based intervention
US9928383B2 (en) 2014-10-30 2018-03-27 Pearson Education, Inc. Methods and systems for network-based analysis, intervention, and anonymization
US10019527B2 (en) 2013-04-12 2018-07-10 Pearson Education, Inc. Systems and methods for automated aggregated content comment generation
US10516691B2 (en) 2013-03-12 2019-12-24 Pearson Education, Inc. Network based intervention
CN110889775A (en) * 2019-10-29 2020-03-17 贵州电网有限责任公司 Key distribution equipment characteristic parameter system
US10789316B2 (en) * 2016-04-08 2020-09-29 Pearson Education, Inc. Personalized automatic content aggregation generation
US11068043B2 (en) 2017-07-21 2021-07-20 Pearson Education, Inc. Systems and methods for virtual reality-based grouping evaluation
US11126923B2 (en) 2016-04-08 2021-09-21 Pearson Education, Inc. System and method for decay-based content provisioning

Families Citing this family (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10095797B2 (en) * 2014-10-03 2018-10-09 Salesforce.Com, Inc. Suggesting actions for evaluating user performance in an enterprise social network
US10740292B2 (en) 2015-05-18 2020-08-11 Interactive Data Pricing And Reference Data Llc Data conversion and distribution systems
US10474692B2 (en) * 2015-05-18 2019-11-12 Interactive Data Pricing And Reference Data Llc Data conversion and distribution systems
US10026023B2 (en) * 2016-08-11 2018-07-17 International Business Machines Corporation Sentiment based social media comment overlay on image posts
US10078708B2 (en) * 2016-11-15 2018-09-18 Tealium Inc. Shared content delivery streams in data networks
US20180286267A1 (en) * 2017-03-31 2018-10-04 Pearson Education, Inc. Systems and methods for automated response data sensing-based next content presentation
JP6369706B1 * 2017-12-27 2018-08-08 Medi Plus Co., Ltd. Medical video processing system
US11205157B2 (en) * 2019-01-04 2021-12-21 Project Revamp, Inc. Techniques for communicating dynamically in a managed services setting
CN110366002B (en) * 2019-06-14 2022-03-11 北京字节跳动网络技术有限公司 Video file synthesis method, system, medium and electronic device
CN110599842A (en) * 2019-09-11 2019-12-20 广东电网有限责任公司 Virtual reality technology-based distribution network uninterrupted operation training system
US11188546B2 (en) * 2019-09-24 2021-11-30 International Business Machines Corporation Pseudo real time communication system
CN110942698B (en) * 2019-12-31 2021-09-21 上海东捷建设(集团)有限公司第五分公司 Simulation real operation control platform for distribution network uninterrupted operation
CN111415105B (en) * 2020-04-28 2023-06-02 中国联合网络通信集团有限公司 Comment verification method, node, population thermodynamic diagram data node and management system
EP4147452A4 (en) * 2020-05-06 2023-12-20 ARRIS Enterprises LLC Interactive commenting in an on-demand video
US11328031B2 (en) * 2020-07-11 2022-05-10 International Business Machines Corporation Automatically generated timestamps based on comment
CN112069524A (en) * 2020-09-15 2020-12-11 北京字跳网络技术有限公司 Information processing method, device, equipment and storage medium
CN113486632B (en) * 2021-05-27 2023-04-07 四川大学华西医院 Method and device for recording review opinions
US11558213B1 (en) * 2021-08-04 2023-01-17 International Business Machines Corporation Deep tagging artifact review session
US20230052148A1 (en) * 2021-08-10 2023-02-16 Keross Extensible platform for orchestration of data with built-in scalability and clustering
US11853341B2 (en) 2021-12-16 2023-12-26 Rovi Guides, Inc. Systems and methods for generating interactable elements in text strings relating to media assets
US20230199260A1 (en) * 2021-12-16 2023-06-22 Rovi Guides, Inc. Systems and methods for generating interactable elements in text strings relating to media assets
US11768867B2 (en) 2021-12-16 2023-09-26 Rovi Guides, Inc. Systems and methods for generating interactable elements in text strings relating to media assets

Family Cites Families (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH10333538A (en) * 1997-05-29 1998-12-18 Fujitsu Ltd Network type education system, record medium recording instructor side program of network type education system and record medium recording participant side program
US6978115B2 (en) 2001-03-29 2005-12-20 Pointecast Corporation Method and system for training in an adaptive manner
AU2003228705A1 (en) 2002-04-26 2003-11-10 Kumon North America, Inc. Method and system for monitoring and managing the educational progress of students
US7827208B2 (en) 2006-08-11 2010-11-02 Facebook, Inc. Generating a feed of stories personalized for members of a social network
JP2007274090A (en) * 2006-03-30 2007-10-18 Toshiba Corp Content reproducing apparatus, method, and program
US20080038705A1 (en) 2006-07-14 2008-02-14 Kerns Daniel R System and method for assessing student progress and delivering appropriate content
US8457544B2 (en) 2008-12-19 2013-06-04 Xerox Corporation System and method for recommending educational resources
US8635222B2 (en) * 2007-08-28 2014-01-21 International Business Machines Corporation Managing user ratings in a web services environment
KR100822839B1 2007-09-14 2008-04-16 Redduck Co., Ltd. System and method of customizing a weapon on the online first person shooting game
CA2770868C (en) * 2009-08-12 2014-09-23 Google Inc. Objective and subjective ranking of comments
US8958741B2 (en) 2009-09-08 2015-02-17 Amplify Education, Inc. Education monitoring
US20120040326A1 (en) 2010-08-12 2012-02-16 Emily Larson-Rutter Methods and systems for optimizing individualized instruction and assessment
US9384512B2 (en) * 2010-12-10 2016-07-05 Quib, Inc. Media content clip identification and combination architecture
US20120221687A1 (en) * 2011-02-27 2012-08-30 Broadcastr, Inc. Systems, Methods and Apparatus for Providing a Geotagged Media Experience
US9812024B2 (en) 2011-03-25 2017-11-07 Democrasoft, Inc. Collaborative and interactive learning
WO2014047425A1 (en) * 2012-09-21 2014-03-27 Comment Bubble, Inc. Timestamped commentary system for video content
US8753200B1 (en) 2013-02-01 2014-06-17 Pearson Education, Inc. Evaluation and rectification system
US9483954B2 (en) 2013-03-12 2016-11-01 Pearson Education, Inc. Educational network based intervention
US20140308650A1 (en) 2013-04-12 2014-10-16 Pearson Education, Inc. Evaluation control
US10079952B2 (en) * 2015-12-01 2018-09-18 Ricoh Company, Ltd. System, apparatus and method for processing and combining notes or comments of document reviewers

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040202991A1 (en) * 1993-02-05 2004-10-14 Ncs Pearson, Inc. Dynamic on-line scoring guide and method
US20110191286A1 (en) * 2000-12-08 2011-08-04 Cho Raymond J Method And System For Performing Information Extraction And Quality Control For A Knowledge Base
US20120078653A1 (en) * 2001-10-26 2012-03-29 Ubc Specialty Clinical Services, Llc Computer System and Method for Training, Certifying or Monitoring Human Clinical Raters
US20100070510A1 (en) * 2004-03-30 2010-03-18 Google Inc. System and method for rating electronic documents
US20120303635A1 (en) * 2008-08-15 2012-11-29 Adam Summers System and Method for Computing and Displaying a Score with an Associated Visual Quality Indicator

Cited By (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10516691B2 (en) 2013-03-12 2019-12-24 Pearson Education, Inc. Network based intervention
US9483954B2 (en) 2013-03-12 2016-11-01 Pearson Education, Inc. Educational network based intervention
US11003674B2 (en) 2013-04-12 2021-05-11 Pearson Education, Inc. Systems and methods for automated aggregated content comment generation
US10019527B2 (en) 2013-04-12 2018-07-10 Pearson Education, Inc. Systems and methods for automated aggregated content comment generation
US20180307769A1 (en) * 2013-04-12 2018-10-25 Pearson Education, Inc. Interface-based automated aggregated content generation
US10977257B2 (en) 2013-04-12 2021-04-13 Pearson Education, Inc. Systems and methods for automated aggregated content comment generation
US10417241B2 (en) 2013-04-12 2019-09-17 Pearson Education, Inc. System and method for automated aggregated content comment provisioning
US9928383B2 (en) 2014-10-30 2018-03-27 Pearson Education, Inc. Methods and systems for network-based analysis, intervention, and anonymization
US10366251B2 (en) 2014-10-30 2019-07-30 Pearson Education, Inc. Methods and systems for network-based analysis, intervention, and anonymization
US10083321B2 (en) 2014-10-30 2018-09-25 Pearson Education, Inc. Methods and systems for network-based analysis, intervention, and anonymization
US10789316B2 (en) * 2016-04-08 2020-09-29 Pearson Education, Inc. Personalized automatic content aggregation generation
US20200410024A1 (en) * 2016-04-08 2020-12-31 Pearson Education, Inc. Personalized automatic content aggregation generation
US11126923B2 (en) 2016-04-08 2021-09-21 Pearson Education, Inc. System and method for decay-based content provisioning
US11126924B2 (en) 2016-04-08 2021-09-21 Pearson Education, Inc. System and method for automatic content aggregation evaluation
US11651239B2 (en) 2016-04-08 2023-05-16 Pearson Education, Inc. System and method for automatic content aggregation generation
US11068043B2 (en) 2017-07-21 2021-07-20 Pearson Education, Inc. Systems and methods for virtual reality-based grouping evaluation
CN110889775A (en) * 2019-10-29 2020-03-17 贵州电网有限责任公司 Key distribution equipment characteristic parameter system

Also Published As

Publication number Publication date
US10417241B2 (en) 2019-09-17
CA2908952A1 (en) 2014-10-16
US20180307770A1 (en) 2018-10-25
US20190146975A1 (en) 2019-05-16
US20180307771A1 (en) 2018-10-25
WO2014169288A1 (en) 2014-10-16
CN105264555A (en) 2016-01-20
AU2020200909A1 (en) 2020-02-27
US20180307688A1 (en) 2018-10-25
US10019527B2 (en) 2018-07-10
AU2014250772A1 (en) 2015-10-29
US20180307769A1 (en) 2018-10-25
US20170364600A1 (en) 2017-12-21
US10977257B2 (en) 2021-04-13
US11003674B2 (en) 2021-05-11
US20190042578A1 (en) 2019-02-07

Similar Documents

Publication Title
AU2020200909A1 (en) Evaluation control
US9063975B2 (en) Results of question and answer systems
US9280908B2 (en) Results of question and answer systems
US9406239B2 (en) Vector-based learning path
US9268766B2 (en) Phrase-based data classification system
US9390378B2 (en) System and method for high accuracy product classification with limited supervision
US9412281B2 (en) Learning system self-optimization
WO2022111244A1 (en) Data processing method and apparatus, electronic device and storage medium
US10372763B2 (en) Generating probabilistic annotations for entities and relations using reasoning and corpus-level evidence
CN110377631B (en) Case information processing method, device, computer equipment and storage medium
US9916377B2 (en) Log-aided automatic query expansion approach based on topic modeling
US11194963B1 (en) Auditing citations in a textual document
US10628749B2 (en) Automatically assessing question answering system performance across possible confidence values
US10282678B2 (en) Automated similarity comparison of model answers versus question answering system output
US20150199909A1 (en) Cross-dimensional learning network
AU2018271315A1 (en) Document processing and classification systems
CN116611074A (en) Security information auditing method, device, storage medium and apparatus
JP6424315B2 (en) Learning support apparatus, learning support program, and learning support method
US10332411B2 (en) Computer-implemented systems and methods for predicting performance of automated scoring
US11735061B2 (en) Dynamic response entry
CN113312258A (en) Interface testing method, device, equipment and storage medium
CN112288584A (en) Insurance application processing method and device, computer readable medium and electronic equipment
US20180375926A1 (en) Distributed processing systems
WO2023060954A1 (en) Data processing method and apparatus, data quality inspection method and apparatus, and readable storage medium
US20210343174A1 (en) Unsupervised machine scoring of free-response answers

Legal Events

Date Code Title Description
AS Assignment

Owner name: PEARSON EDUCATION, INC., NEW JERSEY

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:LORING, MILES T.;GRUDNITSKI, PAUL C.;KAPOOR, VISHAL;SIGNING DATES FROM 20140413 TO 20140414;REEL/FRAME:032846/0365

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION