CA1275327C - Emulation process for creating and testing computer systems - Google Patents

Emulation process for creating and testing computer systems

Info

Publication number
CA1275327C
CA1275327C CA000530647A CA530647A
Authority
CA
Canada
Prior art keywords
data
input
output
information
screen
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Lifetime
Application number
CA000530647A
Other languages
French (fr)
Inventor
Isaac Meyer Perelmuter
Jerry James Ward
Ann Galgowski Lech
Daniel Frederick Belfer
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Iconectiv LLC
Original Assignee
Bell Communications Research Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Bell Communications Research Inc filed Critical Bell Communications Research Inc
Application granted granted Critical
Publication of CA1275327C publication Critical patent/CA1275327C/en
Anticipated expiration legal-status Critical
Expired - Lifetime legal-status Critical Current

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/36Preventing errors by testing or debugging software
    • G06F11/3664Environments for testing or debugging software

Abstract

Abstract of the Disclosure A method of operating a computer system is disclosed that facilitates the creation and testing of application system software by utilizing an autonomous environment to emulate the application system environment.
The emulation environment allows the user to call into view sequences of standard input-output (I/O) screen format pairs normally used to submit information to and receive information from the application system. In the emulation mode, these screens are prepared off-line and stored until the user desires to exercise the application system. Each input format is filled with information that will serve as actual input to the application system when it is exercised. Each output format is filled with information that comprises expected results when the application system is actually exercised. The expected results are compared to actual results after execution of each I/O pair and further application system processing is controlled by the comparison results. The stand-alone emulation environment provides editing and control routines and library functions to aid the user in defining and modifying the I/O sequences.

Description


Technical Field of the Invention

This invention relates generally to the process of developing large-scale software systems and, more particularly, to a methodology for creating and testing the software in an environment emulating the production environment in which the software is ultimately deployed.
Background of the Invention
The development and maintenance of a large-scale software system generally consumes a substantial amount of time, effort and financial resources and requires the coordinated interaction of personnel with diverse talents and skills throughout the development process.
Typically, a large-scale software system will comprise a number of software subsystems and each subsystem, in turn, is composed of numerous modules. Each module is designed to perform a selected low-level function. Various modules are then integrated at a higher functional level to form a particular subsystem capable of effecting a specific task. All the subsystems are further integrated at an even higher functional level and thereby provide the full functionality required of the entire system.
An illustrative example of a large software system that finds widespread use in the telecommunications environment is the TIRKS system (TIRKS is a trademark of Bell Communications Research, Inc.). At its inception during the early 1970's, the TIRKS system began as a relatively small system. An early version of the TIRKS
system was written in assembly language for deployment on a large mainframe computer system. At that time, the TIRKS system had only a few subsystems. Its function was two-fold, namely, to track inter-office circuit orders and to store and inventory inter-office equipment and facilities. To exploit advances in computer and
communications technology as well as to expand upon the capabilities of the TIRKS system, the system has been continuously updated and augmented by adding other subsystems and accompanying modules. Currently, the TIRKS
system contains about 50 different subsystems embodied in approximately 10 million lines of source code and composed of 23,000 different modules. These subsystems now provide a number of diverse functions, such as: marketing (accepting customer orders for inter-office network services); provisioning (equipment and facility inventorying, order tracking as well as inter-office equipment and facility assignment); and operations (establishing actual inter-office connections as well as the monitoring and testing of the inter-office network).
With this general organizational overview of a large-scale software system in focus, it is instructive to consider the so-called "life cycle" of a software system.
The "life cycle" generally may, for purposes of this discussion, be partitioned into a number of serially-occurring, basically mutually exclusive phases. The scope of activity that occurs at each phase varies depending upon whether the software system is entirely new or is on-going and evolving, such as the TIRKS system.
During an initial phase, which might be characterized as a "conceptualization" phase, generic requirements are produced wherein the high-level functionality of the overall system is specified. It is during this phase that the ultimate users of the system elucidate their requirements and objectives to the system analysts. Various alternative solutions aimed at achieving the objectives are proposed and the analysts select the most viable solution and then generate the requirements.
A second phase, the "implementation" phase, commences when the requirements are further analyzed and parsed to obtain a refined system definition. The system is now viewed as comprising an arrangement of stand-alone

but interdependent modules. These modules are usually developed separately by different groups or individuals.
The development effort at this juncture comprises coding of the modules in a specific computer language, such as the C language, and then testing the execution of the individual modules.
The third phase, called "integration", is initiated by combining appropriate modules into their respective subsystems, thereby forming the overall system.
Subsystem testing may also be effected to insure that the modules are properly linked and compatible.
A fourth phase, called "system test", begins when the overall system is handed off to the testers.
Thus, instead of releasing the system directly to the end users, an in-house test group is interposed in the "life cycle", with the charter of "trouble-shooting" the intended release. It is well-established that the cost of correcting software defects after a software system has reached the end user is significantly greater than correcting defects found before the system reaches that stage. This has led to an increased emphasis on reduction of errors during system development, primarily by testing. The objective of the testers is to locate any problems. A "problem" is a discrepancy between what was intended to be implemented and what the system actually does as revealed through testing. In a large system, system testers are faced with the dilemma of how to choose test cases effectively given practical limitations on the number of test cases that can be selected.
A final phase begins when the system is embedded in the user's environment for purposes of acceptance testing. In this phase, particularly if the system is to replace or enhance a similar software system, the replacement system may actually augment the operation of the prior software system. However, the primary purpose of acceptance testing is to determine if the system accomplishes what the user requested and, secondarily, to


detect problems for corrective action. After successful completion of this final phase, the system is released to become part of the user's production environment. Despite all of the testing in the various phases, it is possible that the system may contain an unforeseen error, called a "bug", which will only be discovered during actual use. Each "bug" is eliminated by first detecting its source, and then "conceptualizing" and designing a solution, followed by a suitable modification of the system. In effect, each major "bug" effects a new "life cycle". Other causes of a "life cycle" iteration include intended enhancements and modifications, or technology changes to take advantage of intervening advances that have occurred since the time the entire system was first developed.
As suggested by the above discussion, a large-scale system development effort is organized around the various phases of the "life cycle". Each organization performs its specified activities and hands off its portions of the system, or "units", to the next organization. Thus, each organization tends to have a local, rather than a global, viewpoint of the overall system.
In order to hand off the units to the next organization, it is typically required that the units satisfy some quantifiable but limited objective criteria.
For example, it might be required that all modules are present, that they are at the proper source code level and are compiled, and that the code is at least executable.
Thus, when an organization receives its units of interest, it is expected that the units are operational, even if only at a rudimentary level, so that the receiving organization may quickly begin to perform its activity.
However, even though objective criteria are established for hand-off, it is generally true that the ultimate decision to pass on a given unit is mainly the result of a subjective evaluation.


Control of the process during conventional software development tends to be somewhat subjective in nature because, unlike the case of traditional hardware development, there are no sophisticated, objective control procedures that pervade or encompass all phases of the development. In effect, there are no universally applicable methods or techniques that can quickly and accurately perform detailed measurements and report on results within acceptable time limits and resource usage. As also alluded to in the overview discussion of the "life cycle" phases, testing considerations, either implicitly or explicitly, occur during all development phases. However, controlled formal testing has been concentrated at the system level in the "system test"
organization. Principally, this is due to the lack of tools that would efficiently allow for the development and sharing of tests by different organizations.
Historically, the initial method of testing could be characterized as manual and autonomous. With respect to the TIRKS system, which uses a video display terminal (VDT) as an input/output device, "manual"
generally means a person positioned at the VDT makes terminal entries, transmits the entries to the unit under test, and then evaluates responses returned for display on the VDT. Based on the response, a new set of terminal entries is entered and a new response is evaluated. This request-response procedure continues until the person is satisfied with all responses or, if "bugs" are detected, until the "bugs" are fixed and the unit is retested. By "autonomous" is meant that only the person making the VDT
entries is aware of exactly how and why the unit was exercised during each test session (unless the test exercise was documented, which is typically not the case).
In effect, autonomous tests are not tests that can be repeated precisely nor are they in a form to be passed to the next organization.


In order to mitigate the autonomous nature of testing, at least at the overall system level, controlled formal testing was lodged in the "system test"
organization. So-called test scripts are developed and maintained by the testers. A test script is a listing of the entries to be supplied to a VDT by the tester as well as a listing of the responses expected for evaluation.
Even with this advancement to manual mode testing, clearly it was and is still time-consuming and prone to error.
Because of the shortcomings of manual mode testing, so-called automated mode testing was introduced in order to exercise the system by the tester. The test scripts are now supplied to the system under test in a rapid, repeatable fashion. Automated testing assures that the script is followed precisely, and evaluations via comparisons of responses on the VDT to stored, anticipated responses may also be done precisely. Automated tests can also be written to be deterministic. By deterministic is meant that during test execution, the test itself can determine if system responses are as expected; if not, the test reports an error. With deterministic tests, the time a tester spends on post-test analysis can be significantly reduced, and virtually eliminated if no unexpected responses are reported. Moreover, system testers are freed to do more productive duties, such as new script development, rather than performing the essentially clerical function of following a script with entries to a VDT. In addition, automated tests can be executed during any time period and do not require that the tester be present to run tests. Also, and especially important, automated scripts will be maintained to correspond to system updates and enhancements so as to benefit from the deterministic aspects of automated tests. In this way, "regression" testing, that is, comparison of results from a new version of a system to prior, "benchmark" results from an earlier or previously tested version, is readily accomplished. And finally, automated tests can

produce, as a side-effect of the testing itself, a precise machine record of what has been tested and the results of that testing. Therefore, there exists a provable and objective measurement as to the extent that the system under test operates as expected, which is precisely what needs to be measured as a software system moves through its "life cycle".
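The deterministic evaluation described above, in which the test itself compares actual responses to stored, anticipated responses and reports any discrepancy, can be sketched as follows. This is an illustrative Python reconstruction, not part of the patent; the function name and the field-dictionary representation are assumptions for the sketch.

```python
# Sketch of a deterministic automated test step: the test itself
# compares the actual screen response to the stored, anticipated
# response and reports errors, minimizing post-test analysis.
# All names and the dictionary representation are illustrative.

def run_deterministic_step(actual_fields, expected_fields):
    """Compare actual response fields to anticipated ones.

    Fields whose expected value is None are "don't care" and are
    skipped, so the same script can later serve as a regression
    benchmark against a new version of the system.
    """
    errors = []
    for name, expected in expected_fields.items():
        if expected is None:  # field not checked in this step
            continue
        actual = actual_fields.get(name, "")
        if actual != expected:
            errors.append(f"{name}: expected {expected!r}, got {actual!r}")
    return errors
```

A step that passes returns an empty list; a nonempty list is the test's own error report, which is what makes the script deterministic rather than dependent on a tester's visual inspection.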
Automated testing as it is practiced today, however, does have its own deficiencies and limitations.
Most troublesome is the need to learn a new computer language. A manual test script, which is carried out by terminal entries, is transmitted to the system under test via a block of data propagating over a channel. An automated test script version of a manual script must provide the equivalent of the data block without the necessity for human interactive terminal entries. In conventional automated testing, the new computer language performs the interfacing function of the off-line building of data blocks equivalent to the desired terminal entries, as well as the off-line storing of anticipated responses from the system for comparison to actual returned responses.
Even with automated testing, system testing is still lodged in the system testing organization because of the specialized expertise required of testers.
Oftentimes, in fact, it is required that two individuals be teamed to create the equivalent of a single tester.
One of the individuals on the team is familiar with the actual application environment and knows the techniques to exercise the system properly but not the special computer language; the second team member is knowledgeable about the special testing language but does not know the application environment in requisite detail. This pairing of individuals is expensive and leads to inefficiencies.
In concluding this Background section, it is instructive to conjure: (1) a first depiction of the conventional five phases in the life cycle of the system


development process; and (2) a second depiction of a model for an improved process. The second depiction serves as a point of departure for the principles of the present invention. With respect to the first visualization, developers have viewed the five life-cycle phases of one iteration as five activities placed side-by-side with a modicum of interaction between adjacent events and basically no interaction between non-adjacent events. A
second iteration causes a second set of five phases to be placed adjacent to the first set so that "acceptance testing" of the first set serves as input to "conceptualization" of the second set. This straight-line depiction is replicated throughout the life of the software system as new sets are juxtaposed to existing set groupings.
The second depiction views the life-cycle phases as mapping into a circle partitioned into five "pie-shaped" segments representing the life-cycle events. Now, "acceptance testing" is adjacent to "conceptualization"
and the cyclic nature of the development process is self-evident. Moreover, the center of the circle is common to all segments and represents the knowledge base that is common to all developers during all the phases. It is apparent that non-adjacent events may access, share and utilize the same information as adjacent events. This implies that the information be in a format that is usable by all parties and no party should be burdened with learning, for example, a new computer language to test the system. With this second depiction, it is possible to consider those tools conventionally construed narrowly as testing tools and treat these tools on a broader basis as tools influencing the very creation of the system ultimately to be tested.
Summary of the Invention
The above-described shortcomings and limitations of the conventional methods for developing and then exercising an application system are obviated, in

accordance with the principles of the present invention, by utilizing an emulation system environment that simulates the actual input/output interface of the application system. In this emulation environment, the user creates input/output transactions for eventual execution by the application system in a screen-driven, automated mode which is independent of, and off-line from, the application system.
In the application system, a transaction is broadly described as the supplying of information to an input screen format, the transmitting of the input information to the application system for execution, and the receiving of information on a corresponding output format in response to system execution. To generate a corresponding transaction in the emulation environment, the user: (i) calls into view a replica of the selected input format stored by the emulation system, populates the format fields with the input information, and then stores the filled-in format; and (ii) calls into view the associated output screen replica, supplies the anticipated application system responses to the appropriate fields, and then stores the filled-in output format. The series of transactions generated in this manner may then be passed to the application system for execution at a time determined by the user. The actual application system responses are then compared to the anticipated responses and, depending upon the results of the comparison, the next appropriate action is invoked.
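The transaction cycle just described (store a filled-in input replica, store the anticipated output replica, submit, compare, and let the comparison control further processing) can be sketched as follows. This is a minimal illustration only; the class, function, and field names are invented for the sketch rather than taken from the patent.

```python
# Sketch of the emulation-mode transaction cycle: each transaction
# pairs a filled-in input screen replica with a filled-in output
# replica holding the anticipated responses. Class, function, and
# field names are invented for illustration.

class Transaction:
    def __init__(self, input_screen, expected_output):
        self.input_screen = input_screen        # filled-in input format
        self.expected_output = expected_output  # anticipated responses

def execute_transactions(transactions, submit):
    """Submit each stored input screen via `submit` (a stand-in for
    the application-system interface) and compare actual responses
    to anticipated ones; the comparison result controls whether
    further transactions are processed."""
    results = []
    for txn in transactions:
        actual = submit(txn.input_screen)
        mismatches = {
            field: (want, actual.get(field))
            for field, want in txn.expected_output.items()
            if actual.get(field) != want
        }
        results.append(mismatches)
        if mismatches:  # comparison controls further processing
            break
    return results
```

An empty mismatch dictionary for every transaction indicates the application system responded exactly as anticipated.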
A feature of the present invention is that the user operates in a familiar environment and no new languages, such as a test language, need to be learned.
Moreover, anyone familiar with the application system formats may now formulate automated tests. Also, the off-line storage of the filled-in input/output screen replicas provides for repeatable exercising of the application system, and the storage contents constitute self-documentation.


The organization and operation of this invention will be understood from a consideration of the detailed description of the illustrative embodiment, which follows, when taken in conjunction with the accompanying drawing.
Brief Description of the Drawing

FIG. 1 depicts a conventional arrangement for implementing manual mode testing of a system under test;
FIG. 2 depicts a screen-driven, automated testing arrangement, in accordance with an illustrative embodiment of the present invention, for exercising the system under test;
FIG. 3 is a block diagram of the dialog controller system within processor 300 of FIG. 2 which initially controls interaction with the user via terminal 400 shown in FIG. 2;
FIG. 4 is a block diagram of the test generator of FIG. 3;
FIG. 5 depicts the manner in which new test sets are added to the test flow, including the linkages that uniquely define the test flow;
FIG. 6 depicts the information assembled by the screen definition controller and set part parser of FIG. 4 in order to produce an input data set part screen display;

FIG. 7 depicts the information assembled by the screen definition controller and set part parser of FIG. 4 in order to produce an output data set part screen display;
FIG. 8 depicts an expanded version of FIG. 7 wherein information supplied via VARIABLE and EXPRESSION windows is used to supply field data;
FIG. 9 illustrates the linkage between the expression and test arrays depicted generally in FIG. 7;
FIG. 10 depicts, in both pictorial and block diagram form, the display editor of FIG. 4;
FIG. 11 shows the alignment of FIGS. 11A and 11B;

FIGS. 11A and 11B are a depiction of the manner in which window buffers are linked on the buffer free list prior to allocation, as well as the linkage among allocated windows on the display list associated with the display device 400 of FIG. 4;
FIG. 12 depicts, in block diagram form, the command controller of FIG. 10;
FIG. 13 shows an exemplary command window menu;
FIG. 14 depicts the correspondence among the command menu, the location map for the menu, the contents of the command stack of FIG. 12, and the highlight array;
FIG. 15 shows an example of the command menu definition table file of FIG. 12;
FIG. 16 shows the correspondence between the command menu and the command stack for a selected command;
FIG. 17 further depicts the correspondence between the command menu and the command stack after a series of selected commands;
FIG. 18 is a flow diagram corresponding to the processing performed by the controller of FIG. 3;
FIG. 19 is a flow diagram corresponding to the processing performed by the generator of FIG. 4; and

FIG. 20 is a flow diagram of the method for submitting the test prepared by the emulation process to the system under test and for receiving and comparing the results produced by the system under test to those stored via the emulation process.
Detailed Description

To place in perspective the detailed description of the present invention, it is instructive to gain a basic understanding of both the manual and language-driven automated testing methodologies of conventional system testing arrangements. Accordingly, the first part of this detailed description discusses both of these methodologies as they are applied to the TIRKS system. This approach has the additional advantage of introducing terminology
and notation that will aid in elucidating the various aspects of the present invention.
1. Conventional Arrangement

Initially, as depicted in FIG. 1, the point of view taken is that of a system tester positioned at video display terminal (VDT) 200 and poised to operate in a manual mode. The system under test (SUT), such as the TIRKS system and as depicted by block 100, is accessible from VDT 200 via channel 210. Input information to SUT 100 is generally entered by the tester via keyboard 201 and output information is displayed in character or graphical format on video screen 202. Input is also displayed on screen 202 prior to transmission to SUT 100. A representative computer upon which SUT 100 resides is an IBM Model 3084 having the OS/MVS-type operating system. Representative of VDT 200 is an IBM
Model 3278 Terminal, which is connected to the Model 3084 via the 3274-type Controller, also supplied by IBM.
Since a tester accesses SUT 100 via the usual or standard interface, that is, via VDT 200 and interposed channel 210, the tester must follow the same protocol and procedures as any other user. It is only the motivation and goals of each that differentiate the tester from the user, and SUT 100 cannot discern those differences. The user is generally interested in providing real data to SUT 100 to produce usable results; the tester is not generally interested in results in themselves, but whether SUT 100 produces results in the manner intended.
In order to enable the tester/user to input data and receive output in a uniform manner, several types of so-called formats may be called into view on VDT 200.
After turning on VDT 200, and assuming SUT 100 is operational, the tester requests a format by spelling out, for example, /FOR, followed by the desired format, and then strikes the "ENTER" key on the keyboard, thereby transferring the request to SUT 100. As an example, it is supposed that the initial input to SUT 100 is a request to
"log-on". For log-on, the tester must supply password information so that SUT 100 may determine if the tester is a valid user. Thus, if the tester types, for instance, the log-on format "gclogo" after /FOR, followed by the "ENTER" key, then the following response (DISPLAY 1) is returned by SUT 100 and displayed on screen 202 (the underbars indicate where a user is to make entries):

***GOC LOGON/OFF***
ENTER RRO:
ENTER PASSWORD :
PF KEY ASSIGNMENT
1: FIND - DISPLAY LOGON STATUS
4: ADD - LOGON
8: REFRESH - REFRESH SCREEN
10: DELETE - LOGOFF
The user-supplied fields have the following meanings:
RRO - Responsible Reporting Office (i.e., user identification)
PASSWORD - valid password to enable tester to access SUT 100

The information with the heading "PF KEY ASSIGNMENT" (PF implies Program Function) lists the allowed response keys, which then initiate one of the following actions:
FIND - Displays requested RRO
ADD - Log-on to subsystem REFRESH - Clear all fields on the screen DELETE - Log-off subsystem It is supposed that the tester responds first by entering data in the two fields, namely, RRO is filled in with the three character string l'DFB" and PASSWORD
contains the six character string "PASSWD", and then the tester strikes the "PF4" key.


In general, screen display 202 may contain up to 1920 character positions, as determined by 24 lines per screen at 80 characters per line. VDT 200 translates the characters as well as the character-position information into a form suitable for transmission over channel 210.
The striking of the "PF4" key by the tester signifies that the tester has completed the screen entries and that the character information is to be translated within VDT 200 and then transmitted over channel 210 to SUT 100.
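The character-position bookkeeping described above (24 lines of 80 characters, 1920 positions in all) can be modeled as follows. This is an illustrative sketch only; it does not reproduce the translation VDT 200 performs for transmission over channel 210, and the helper names are invented.

```python
# Sketch of the 24-line by 80-column screen (1920 character positions)
# described above, with a helper that places a field entry at a given
# row and column. This models the character-position bookkeeping only;
# the translation VDT 200 performs for channel 210 is not reproduced.
# Helper names are invented for illustration.

ROWS, COLS = 24, 80

def blank_screen():
    """Return a flat buffer of 24 x 80 = 1920 blank positions."""
    return [" "] * (ROWS * COLS)

def put(screen, row, col, text):
    """Place `text` on the screen at 1-based (row, col)."""
    start = (row - 1) * COLS + (col - 1)
    screen[start:start + len(text)] = list(text)

screen = blank_screen()
put(screen, 2, 1, "ENTER RRO: DFB")
put(screen, 3, 1, "ENTER PASSWORD : PASSWD")
```

The flat buffer makes the 1920-position limit explicit: any (row, col) pair maps to a single offset in the block that would ultimately be transmitted.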
If the information supplied by the tester at "log-on" is complete and accurate so that it is acceptable to SUT 100, the next response returned by SUT 100 to VDT 200 is as shown in DISPLAY 2, which is essentially DISPLAY 1 above augmented with the additional line (line 24 of the display):

***GOC LOGON/OFF***
ENTER RRO:
ENTER PASSWORD :
PF KEY ASSIGNMENT
1: FIND - DISPLAY LOGON STATUS

4: ADD - LOGON
8: REFRESH - REFRESH SCREEN
10: DELETE - LOGOFF

GC10101I LOGON SUCCESSFULLY COMPLETED.
By visually inspecting the response line, the tester is now ready to select the next testing activity. Since in this example the log-on attempt was successful, the tester will move to a next planned testing step, as discussed shortly. However, if the log-on was unsuccessful, another branch of the so-called testing tree must be traversed, such as a retry on the RRO and/or PASSWORD entries. If such information is known to be valid upon input and it is not being accepted by SUT 100, it may be necessary to contact the system developers to investigate the


discrepancy between the expected and actual responses.
It is important at this juncture to reiterate and emphasize some characteristics of manual mode testing already encountered. First, the testing is interactive in that a tester supplies an input, evaluates a response, then supplies another input, and so forth. Testing is effected on-line and only when a tester is present. Also, any responses from SUT 100 must be visually interpreted from a full-screen display of basically disparate, oftentimes scattered information, as demonstrated in more detail shortly.
The screen information must be read and interpreted and, based upon an analysis, the tester decides which branch of the testing tree is most fruitful to pursue.
If it is presumed that log-on is successful, the tester now moves to the next testing iteration. For this step, a new format is to be called into view and then populated with data. Thus, after /FOR is displayed by the appropriate keystrokes, the tester types the name of the new format, now called "gcocml", and then strikes the "ENTER" key. SUT 100 responds with the requested format and it is displayed on screen 202. The tester now enters the data to exercise the various components of SUT 100 selected for testing.
DISPLAY 3 below depicts the pertinent part of the "gcocml" screen as prepared by the tester just prior to transmission to SUT 100:


/FOR_ *TIRKS ORDER CONTROL (MESSAGE)*
CLO: SLM___ ORD: B_ CKT: __
***************ORDER LEVEL***************
ORD TYP: N SLS/ORIG: TWS _ CUST: TWS-DEMO_ RAG:
APP: 012086 DD/MDFR: 021086 SWC:
WCO: _ ECO: _ TRO: _ OCO: __ CCO:
DOC: _ REL ORD:
ID: SID JEPS: _ _ SID:
***************CIRCUIT LEVEL***************
CMT: 001_ CID: 1001/df55ie/stlsmo01/m-/stlsmo02 FMT: m ACTN: pa MOD: _ DR: msgis CAC/TGAC: _

The unpopulated format screen comprises capitalized acronym fields followed by a colon; for example, "CLO", "ORD", "CKT", "APP", ..., "MOD", "DR" and "CAC/TGAC" are the screen fields. Information following the fields comprises data entered by the tester; for example, "SLM" after "CLO", "B" after "ORD", "012086" after "APP" and "msgis" after "DR" are entries typed by the tester. Certain fields remain blank; for example, "CKT", "MOD" and "CAC/TGAC"
from the above exemplary field list have no entries.
When the unfilled screen is first received by VDT 200, the field entries usually have a series of underbars to indicate where data may be entered and the total length of acceptable data. For instance, "CLO" originally had six places reserved for data and three of these have been filled with "SLM". "CKT", which remains unfilled in this instance, has two places reserved for data.
It is not important at this juncture to define the meanings of the field acronyms. Rather, the concept to understand is that for each format appearance, fields are displayed with indicators as to the position and length of anticipated data, and the tester enters only that subset of data deemed appropriate to exercise SUT 100. After the tester is satisfied with the entries, the contents of screen 202 are transmitted to SUT 100 via an appropriate Program Function Key -- "PF4" in the case of DISPLAY 2. Those
' ' : ' ' , , ~ ' '' ' ' -~ 1 ~ ''9 ~"' ~' Pr - ~7 ~

keys utilized to transmit sc~een 202 inforMation to SUT 100 are deEined by the format implementer~ ~n the case of DISPLAY 1, the allowed PF Keys were displayed to the tester. For the case of DISPLAY 3, the implementer chose ~ot to display the possible PF keys.
This implies ~hat, besides a compendium of display formats, the tester must have readily available reference material that provides information regarding the keys acceptable for transmitting screen 202 to SUT 100 for those formats without an explicit display of the PF keys.
Upon submission of the contents of DISPLAY 3 to SUT 100, SUT 100 processes the contents and then responds to VDT 200 with the results, which are shown in pertinent part in DISPLAY 4 below.

/FOR_          *TIRKS ORDER CONTROL (MESSAGE)*
CLO: SLM123 ORD: _ CKT: _ FMT: _ ACTN: _ MOD: _ DR: _
CAC/TGAC: AXYZ123_
GC102A ADD SUCCESSFUL

All field positions except "CLO" and "CAC/TGAC" are blank. The "CLO" field now has the original input entry "SLM" augmented with the three character data string "123" and "CAC/TGAC", which was blank upon input, now has the eight character data string "AXYZ123_". It is presumed for the present testing strategy that this returned data is not used immediately, but rather it is to be used as either input to other format requests or compared to results produced later in the test process from submission of other format requests. To save this data, the tester generally writes the data strings on a notepad so they are readily available for later comparisons. In addition to the field data, the message "GC102A ADD SUCCESSFUL" appears on line 24. This indicates to the tester that SUT 100 has processed the input format as anticipated. Other possible messages include, for example, "GC102E ORDER ALREADY EXISTS" or "GC103E INVALID CIRCUIT", and the next step in the testing is guided by the received message.
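By way of illustration only, the branching that the returned line-24 message induces may be sketched as follows; this is a minimal sketch, and the action labels are hypothetical rather than part of the application system.

```python
# Sketch: choose the next manual-test step from the message SUT 100 returns
# on line 24 of the screen. The message codes come from the example above;
# the returned action strings are hypothetical labels for illustration.

def next_step(message: str) -> str:
    """Map a returned message to the tester's next action."""
    if message.startswith("GC102A"):   # e.g., "GC102A ADD SUCCESSFUL"
        return "proceed to the find iteration"
    if message.startswith("GC102E"):   # e.g., "GC102E ORDER ALREADY EXISTS"
        return "retry with a different order"
    if message.startswith("GC103E"):   # e.g., "GC103E INVALID CIRCUIT"
        return "correct the circuit entries"
    return "consult reference material"
```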
For the sake of expediency, it is supposed that the next test iteration utilizes a portion of the notepad data for input and compares the remainder portion to one of the output results. Again, the format called into view is "gcocml", and it is shown in pertinent part in DISPLAY 5 below just prior to transmission to SUT 100:

/FOR_          *TIRKS ORDER CONTROL (MESSAGE)*
CLO: SLM123 ORD: _ CKT: __
FMT: _ ACTN: _ MOD: _ DR: _
CAC/TGAC: _

All field positions except "CLO" are blank. The "CLO" field has the entry "SLM123" taken from the data listed on the notepad. In fact, the "CLO" field is treated as comprising two juxtaposed subfields, namely, "CLO1" and "CLO2", each storing a three-character string; in this particular example, "CLO1" = "SLM" and "CLO2" = "123". (This partitioning into subfields is utilized for automated testing and is mentioned now so the description of automated test script generation may be expeditiously discussed below.)
When the first "gcocml" format, as shown in DISPLAY 3, was transmitted to SUT 100, the "PF4" key was utilized. This adds the screen data to the application system database. Now, to transmit DISPLAY 5 to SUT 100, the "PF1" key is stroked. This has the effect of finding a subset of the specific data in the application system database.
The information returned to screen 202 as a result of processing the contents of DISPLAY 5 is shown in pertinent part in DISPLAY 6, as follows:

/FOR_          *TIRKS ORDER CONTROL (MESSAGE)*
CLO: _ ORD: _ CKT: _
CAC/TGAC: AXYZ123_

Only the field "CAC/TGAC" has an entry, namely the "AXYZ123_" string, and line 24 is blank. The tester now refers to the data saved on the notepad and compares the "CAC/TGAC" data displayed on DISPLAY 6 to that returned earlier in DISPLAY 4. In this particular case, the two data strings are identical, thereby indicating SUT 100 processed the add-find sequence as intended by the program developers.
Earlier in this subsection, certain fundamental characteristics of manual mode testing, such as the interactive, on-line and interpretative nature of testing, were summarized. With the descriptions associated with DISPLAYS 3-6 now completed, it is possible to elucidate the subtle, sometimes burdensome, aspects of manual mode testing.
First, there is no formal mechanism, short of notepad tabulations and rekeying, to input the results from prior test steps into later test iterations. Similarly, there is no convenient way to compare results generated from earlier screen inputs to succeeding screen outputs. Moreover, there is no mechanism to enter other types of data which might also be characterized as variable. For instance, in DISPLAY 3, the field "APP" conveys a date (month/day/year), generally the date of first access to the format with the specialized data entries, and the adjacent field "DD/MDFR" conveys the "APP" date plus a given number of working days (15 working days for DISPLAY 3). This implies that the user/tester must either compute or have tabulated the "DD/MDFR" date each time a corresponding "APP" date is entered.
From these characteristics it is realized that the so-called test flow and test data are inextricably bound in the manual testing mode. Test flow is loosely defined at this point as the set of unfilled input-output format pairs displayed in succession during a test sequence. For instance, unfilled versions of DISPLAYS 1 and 2 above form one pair of the set and unfilled DISPLAYS 3 and 4 form a second pair. Test data is the character information supplied to the positions following the individual acronyms defining the various fields.
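By way of illustration only, the distinction between test flow and test data may be sketched as follows; the format names follow the example, while the log-on field values are hypothetical placeholders.

```python
# Sketch: test flow is an ordered list of unfilled input-output format
# pairs; test data is the field/value text kept separately. The same flow
# may therefore run with different data. Log-on values are hypothetical.

test_flow = [("gclogo", "gclogo"),   # DISPLAYs 1 and 2
             ("gcocml", "gcocml")]   # DISPLAYs 3 and 4

test_data = [{"RRO": "rro-entry", "PASSWORD": "secret"},        # hypothetical
             {"CLO": "SLM", "ORD TYP": "N", "APP": "012086"}]   # from DISPLAY 3

def transactions(flow, data):
    """Bind each flow pair to its data to form a complete test sequence."""
    return [(inp, out, fields) for (inp, out), fields in zip(flow, data)]
```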
It is now possible to contemplate a mechanized process that simulates various aspects of manual mode testing; the process may be characterized by: (1) the off-line storage of format pairs comprising the test flow; (2) successive calling into view of unfilled formats; (3) filling the formats with data, either fixed or variable, as might be stored from the processing of prior format requests; (4) submitting the combined test flow and test data to SUT 100 for step-by-step execution; and (5) the reporting of results in comparing data returned from SUT 100 to expected results. Accordingly, the remainder of this subsection describes one conventional type of automated procedure that effects such a mechanized process. As discussed in the Background Section, conventional automated procedures utilize a programming language; generally, it is a high-level language. This will be apparent as the description unfolds.
The approach taken to elucidate the principles of the language-driven automated test process is that of describing the program code written by a tester to exercise SUT 100 in the same manner as the above described manual mode test example. During the description, reference is made to Appendix A, which lists the complete code corresponding to the manual mode test. Each line of source code has a numerical label to aid in the description; the labels range from 1 through 47.
In particular, lines 1-10 of the code perform basically the initialization function of any high-level language. Thus, for instance, a universal program file ("com_defs" on line 2) is incorporated by reference into the code and certain dimensioned variables are defined ("today" and "t_c102", both of length 6, on lines 3 and 4). It should be emphasized that it is not expected that the full import of lines 1-10 be presently understood, but rather only the gist of their meaning is important for the immediate discussion.
Line 11 of the code equates "today" to a quantity obtained from a function call to an automated test system function designated "Ldatef()". Thus, the date need not be hard-coded into the test script and the date the code is executed will be substituted appropriately.
Line 12 calls a function that does other general run-time initialization.
Line 13 indicates the current format that may be considered to be in view of the tester--here the "gclogo" as per DISPLAY 1. Line 14 is a function that effects a series of steps, namely, the equivalent of the successive spelling out of /FOR, "gclogo" and the stroking of the "ENTER" key in the manual mode. Again, as per DISPLAY 1, the field acronyms are filled in appropriately on lines 15 and 16. To accomplish this, a compendium of formats is required so the acceptable field definitions may be gleaned. This compendium must be readily available since it is virtually impossible for the tester to have committed to memory all of the fields from the numerous formats. Moreover, the compendium will list the program function keys associated with each format. In this case, line 17 indicates the "PF4" key is selected, as was the situation in the manual mode test. Lines 18 and 19 show that the returned message (".msg") is to be compared to the depicted character string ("GC101...COMPLETED") and if there is no match, an error function is called which prints an error message and the screen that is returned from SUT 100.
Comments similar to the above for lines 13-19 may be used to describe code lines 20-36; that is, the current format (line 20) treated as in view is defined and entered (line 21), certain fields (lines 22-35) are populated and then transmitted to SUT 100 (line 36).
The correspondence of these lines to DISPLAY 3 is rather straightforward, except perhaps for lines 27 and 28. On line 27, the field "APP" is no longer hard-coded, but is defined by the "today" variable produced from an earlier function call (line 11). A second function "datey()" on line 28 increments the variable "today" by a given amount (15 days in this case) and is used so that test flow may be separated from test data.
In lines 37 and 38, temporary storage variables t_c102 and t_cac store output data and serve as the equivalent of the notepad entries obtained from DISPLAY 4 above.
Lines 39 and 40 are used to compare the returned message to a string array (GC102_2) stored in an auxiliary file ("c30cmsg").
Lines 41 and 42 correspond to DISPLAY 5 whereas lines 44 and 45 correspond to DISPLAY 6. There is no need, prior to line 41, to invoke a new format since the last called format (here "gcocml") is accessible until overwritten by another format call. The concatenation of the quantities defined in lines 41 and 42 forms the field "CLO"; as discussed earlier, this quantity is composed of fixed data "SLM" and variable data as stored in temporary location t_c102. Upon reception of the results produced by processing lines 41-43, the first seven positions of the field "CAC/TGAC" (".cac" on line 44) are compared to the temporary storage quantity t_cac. If these quantities do not compare, an error message is printed (lines 44 and 45).
Finally, line 46 provides a comment message within the script code and line 47 indicates the end of the test script definition.
Once the code is completed, the standard operations of compiling, linking, loading and then executing the code through a special driver interface to SUT 100 must be effected. If there are bugs in the test code, they must be found and corrected. The tester may spend significant time coding and debugging test code, and the tester is diverted from the intended goal of exercising SUT 100.
This completes the description of manual mode and language-driven automated testing. The following subsection presents in overview fashion the principles of the testing aspects in accordance with the present invention. The same testing example is used as the vehicle to convey these principles.

2. Overview in Accordance with the Present Invention
In FIG. 1, the tester is viewed as communicating with SUT 100 in an interactive mode via VDT 200 which, in turn, maintains an on-line connection to SUT 100 over channel 210.
Now, with reference to FIG. 2, the tester is viewed as operating at another video display terminal, namely VDT 400, generally unrelated to VDT 200. VDT 400 is, in turn, connected to an independent, off-line emulation processor 300 via channel 410.
Processor 300 provides a processing environment generally distinct from the environment of SUT 100, which is now referred to as the application environment of SUT 100. Exemplary of the combination of processor 300, VDT 400 and channel 410 is an IBM-AT personal computer running under control of the DOS operating system.
The processing environment provided by processor 300 emulates the interface subsystem of the application environment. The interface subsystem is that part of the application system that receives input format information over channel 210 as transmitted by VDT 200 and returns the output information via an output format to VDT 200 upon processing by SUT 100.
In the off-line processing environment, the tester may now create a test sequence--the combination of test flow and test data--in substantially the same operational mode as manual, on-line testing. Since the interface subsystem is emulated in the processing environment, the tester has the same capability of calling into view a desired input format and filling in the required fields as well as indicating the Program Function key used eventually to input the format to SUT 100. However, in addition, the tester can call into view the output format that should be returned upon correct processing by the application system, and the tester can populate this output format with the results expected when the input format is actually processed. A comparison of actual to expected results completes one input-output transaction in a test sequence.
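By way of illustration only, the comparison completing one input-output transaction may be sketched as follows; the field names come from the example displays, and the field-by-field compare helper is an assumption of this sketch.

```python
# Sketch: the expected output format, populated off-line by the tester, is
# compared field by field with the actual response later returned by the
# application system. An empty mismatch list means the transaction passed.

def mismatches(expected: dict, actual: dict) -> list:
    """Names of expected fields whose actual values differ (empty = pass)."""
    return [f for f, v in expected.items() if actual.get(f) != v]

expected = {"CAC/TGAC": "AXYZ123_"}    # populated by the tester off-line
actual   = {"CAC/TGAC": "AXYZ123_"}    # as returned by SUT 100
```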

For instance, with reference to DISPLAY 1 of the manual mode example, the tester requests, while in the emulation processing environment, a display equivalent to the "gclogo" format of DISPLAY 1. Information (RRO and PASSWORD) is provided by keystrokes to keyboard 401 of VDT 400. Once completed, the contents of the format, as present on screen 402, are stored in processor 300. The contents are stored in a form that allows the information to be actually conveyed from processor 300 to SUT 100, via interface processor 500 and links 101 and 301, at some future time for execution by SUT 100. The actual time of communication between processor 300 and processor 500, and ultimately SUT 100, is generally scheduled by the tester and is activated by processor 500 when the scheduled time is achieved. Processor 500 is typically an IBM 3270-type personal computer; line 101 is equivalent to line 210 of FIG. 1; and link 301 may be, typically, an RS-232 serial link.
In conjunction with supplying data to the "gclogo" input format, the tester also calls into view the "gclogo" output format and fills in the fields to produce the equivalent of DISPLAY 2. The contents of DISPLAY 2 are also stored in processor 300. Upon activation of SUT 100 by processor 500, SUT 100 processes the input as initially stored by processor 300 and then transmits the actual response as a corresponding output format to processor 500.
Processor 500 then compares this response to the expected response as originally stored by processor 300. The action taken upon comparison is dependent upon the comparison results as well as the next iteration planned by the tester. Such considerations, however, are basically the same as in manual mode or language-driven automated testing arrangements.
* Trade Mark


The mode of testing in accordance with the present invention may be characterized as automated, screen-driven testing. Any user, including a tester, basically utilizes the formats that such a user is familiar with already. There is no need for the user to learn an additional programming language in order to exercise the application system. Sequences of screens to exercise the system are prepared off-line and stored for later execution. This creates a library of routines of both a repeatable and self-documenting nature, thereby further enhancing the ability to exercise fully the application system by all users in the development cycle.
In line with completing the present example, certain additional foundational concepts and terminology relative to screen-driven testing are now introduced; these will also prove to be beneficial in understanding the more detailed discussion in the sequel.
The input and output displays appearing on VDT 200 are called formats when referring to the application environment. The displays appearing on VDT 400 are designated as screens when discussing the emulation environment. Screens therefore comprise formats as well as other displays used to describe and prepare a test.
A test sequence is composed of a series of transactions; each transaction is the complete cycle of sending information to the application system and evaluating the returned information. From another viewpoint, a test sequence comprises the independent entities of test flow and test data; it is generally possible to run the same test flow with different test data. To completely specify a test sequence, both test flow and test data are required.


In the emulation environment, a transaction is generated by supplying information to three functional screens that are called serially into view; these are the CONTROL screen, the INPUT DATA screen and the OUTPUT DATA screen. The latter two screens are equivalent, respectively, to the filled-in input format and the filled-in output format corresponding to the anticipated result. To demonstrate the utility of the CONTROL screen, DISPLAY 7 shows a completed CONTROL screen for the "gclogo" transaction.

*** CONTROL SCREEN ***
INPUT SCREEN : gclogo_
KEY : pf4
OUTPUT SCREEN : gclogo_

The DISPLAY comprises three input fields shown in upper case. The data supplied by the tester (shown in lower case) provides sufficient information to allow processor 300 to display, first, the "gclogo" format for the input and, second, upon request, the "gclogo" format for the output. The required Program Function Key is conveyed via the "KEY" field.
The three unfilled, functional screens form three components of an entity generically referred to as a "set"; each screen is said to comprise a "set part". Other possible set parts to be discussed shortly include an INPUT NOTE screen and an OUTPUT NOTE screen. Thus, the test flow for each transaction in a complete test sequence is identified with a corresponding set. The complete test is then defined by an array of sets and, to be particular, a file called the SETARR file stores the array in processor 300 each time a test sequence is defined. Also, test data for each set is stored, in separate but associated files, within processor 300 for later recall and use. For instance, if the set associated with DISPLAYS 1 and 2 is labeled as set "2", then file "2.1" contains CONTROL screen information, file "2.3" contains INPUT DATA screen information and file "2.5" contains OUTPUT DATA screen information. To execute a test, the corresponding SETARR file and its associated set part data files are passed to processor 500. Processor 500 then parses and prepares these files for submission to SUT 100 and for comparison to results returned from SUT 100.
Since test flow and test data are stored in files, such file operations as listing of the files or modifying the files via an editor are readily comprehended. Such operations are supplied as system utilities and enhance the versatility of the preferred embodiment of the present invention.
With language-driven automated testing as illustrated by Appendix A, system functions are provided so that certain test data need not be hard-coded into the test sequence (for example, Ldatef() to obtain today's date). Similar function call capability is provided with screen-driven testing. To effect this in a screen-driven environment, the tester positions the cursor of screen 402 to the field in the displayed format selected to be filled in with run-time dependent data and requests, via appropriate keystrokes, that an overlay on the displayed format be called into view; this overlay is designated by the term EXPRESSION window in the illustrative embodiment.
An overlay is used since functions and, more particularly, expressions are generally longer than the actual field length. In the EXPRESSION environment, certain pre-defined functions may be called and expressions defined. In turn, the function or expression is substituted for the field data at run time. For instance, in the case of the APP field of DISPLAY 3, the field data is defined by the function "today". Also, the due date field (DD/MDFR) may be provided by the expression "today+15". Upon exiting the EXPRESSION environment, the tester is reminded that the APP field is defined by a function and that the DD/MDFR field is filled by an expression since a series of, for example, six "@" symbols appear in the data portion of the field.
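By way of illustration only, the run-time substitution performed for such fields may be sketched as follows; the tiny evaluator is an assumption of this sketch, not the stored expression syntax, and interpreting the "+15" as working days (Monday through Friday), as in the DD/MDFR example, is likewise an assumption.

```python
# Sketch: entries such as "today" and "today+15" are evaluated when the
# test executes, while ordinary fixed field data passes through unchanged.
import datetime

def add_working_days(d: datetime.date, n: int) -> datetime.date:
    """Advance a date by n working days (weekend rule is an assumption)."""
    while n > 0:
        d += datetime.timedelta(days=1)
        if d.weekday() < 5:
            n -= 1
    return d

def resolve(entry: str, today: datetime.date) -> str:
    """Evaluate "today"/"today+N" expression entries to MMDDYY field data."""
    if entry == "today":
        return today.strftime("%m%d%y")
    if entry.startswith("today+"):
        n = int(entry.split("+", 1)[1])
        return add_working_days(today, n).strftime("%m%d%y")
    return entry                       # ordinary fixed field data
```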
Another feature included within the screen-driven emulation environment is the ability to associate variables with data fields or subfields. These variables provide the equivalent of notepad entries. The variables may be passed among sets of a test for comparison and input purposes. The environment to effect this association is called into view in essentially the same manner as the EXPRESSION window. This new environment is referred to as the VARIABLE environment. For instance, with reference to DISPLAY 4, the data of field CAC/TGAC is to be saved for comparison to the data returned in the same field of DISPLAY 6. Utilizing the VARIABLE window capability, data returned in CAC/TGAC on DISPLAY 4 may be stored in a variable quantity (e.g., t_cac as per Appendix A). The tester is reminded that a variable is associated with the field since the data portion of the field contains, for example, a series of "@" symbols. Later, as per DISPLAY 6, t_cac may then be used for comparison to the data returned in CAC/TGAC, and appropriate action undertaken based on the comparison.
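By way of illustration only, the VARIABLE window behavior may be sketched as follows; the variable and field names follow the example, and the dictionary store is an assumption of this sketch.

```python
# Sketch: field data returned in one output format is saved under a
# variable name (the notepad equivalent) and later compared to, or used as
# input for, a field in another set of the same test.

variables: dict = {}

def save(name: str, field_value: str) -> None:
    """Store returned field data under a variable name."""
    variables[name] = field_value

def matches(name: str, field_value: str) -> bool:
    """Compare a later returned field against the saved variable."""
    return variables.get(name) == field_value

save("t_cac", "AXYZ123_")              # captured from DISPLAY 4
```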
The above overview presented the basic design philosophy of the screen-driven, automated development and testing system. The following section describes the operational flow of the system in accordance with the illustrative embodiment of the present invention.
3. Process Flow
Initially, the focus of the following description is on that part of the emulation process utilized to process the test characterized by DISPLAYS 1-6 presented in the previous subsections. Also, enhancements to the process, particularly aids to modifying the test, are incorporated in the description. Later, further refinements to the overall operation of the emulation process are disclosed.


Upon start-up of processor 300 of FIG. 2, a menu of action items is displayed to the user on screen device 402; the user is required to select an action item.
The action menu in accordance with the preferred embodiment is as follows:

_ CREATE
_ MODIFY
_ EXECUTE
_ EXIT

To invoke the desired action item, the user places a specified keyboard symbol (e.g., Y for "yes") over the appropriate underbar and then sends a "display complete" signal (e.g., CONTROL E) to off-line processor 300.

With reference to FIG. 3, dialog controller 1000 controls the action menu communication with the user.
Upon start-up, controller 1010, which stores the unformatted contents of the action menu screen definition, passes these contents to terminal screen builder 1020 for formatting. The output of screen builder 1020 is then passed to terminal screen controller 1030 and, in turn, is conveyed from controller 1030 to VDT 400 via channel 410 for display as DISPLAY 8 on screen 402.
Central to dialog controller 1000 is dialog manager 1040. Once the user has responded to the action menu request, dialog manager 1040 sets up for display of the information screen of the chosen action item.
If it is supposed that the user requests a CREATE action, that is, a completely new test sequence is to be created, then dialog manager 1040 conveys to next screen definition controller 1050 the need to format and display the CREATE information screen. The data required to generate the CREATE screen is then passed from controller 1050 to screen builder 1020 and, in turn, to screen controller 1030 for display to the user. The user then enters the required information which, for the illustrative embodiment, is the name to be associated with the new test, and dialog manager 1040 passes control to the action item's controller, in this case create test controller 1060.
Create controller 1060 initiates (i) the creation of a basic SETARR file as well as an Initial set and a Final set, and (ii) the creation of a new test directory, identified by its user-defined name, which stores the SETARR file and associated data files. The Initial set is used to begin every test definition and comprises only a note screen that can be used to enter notes about the complete test sequence. The Final set is used to identify the end of every test and also comprises only a note screen. The test sequence as initially constructed is written to test file 1065, which is accessible to test generator 1070. Control is now passed from create controller 1060 to test generator 1070 and the user may begin to expand on the test definition existing at this point, namely, a skeleton SETARR file and Initial and Final set files. In the illustrative embodiment, test file 1065 resides on the hard disk of processor 300. The user is placed automatically into the "addset" mode, which means that after each new set is completed, another set will be readied for entry by the user. The user may end the "addset" mode with an appropriate command, as discussed below.
With respect to the remaining blocks in FIG. 3, controllers 1080 and 1090 correspond, respectively, to the MODIFY and EXECUTE options or modes on the action menu list. The operation of these controllers will be discussed in some detail later. Also, embedded within screen builder 1020 are certain tables, a mapper, an image builder and so forth, as identified by indicia 1021-1026; the function of each of these blocks will be discussed when required.

The detailed composition of test generator 1070 is depicted in block diagram form in FIG. 4. Generator 1070 is used to complete the building of the screen-driven test sequence as initiated earlier by create test controller 1060 of FIG. 3. Each block of FIG. 4 and its interaction with the other blocks is discussed broadly now; the make-up of each block is then treated in more detail.
The structure of the test as it resides in the set array portion of test file 1065 is brought into test parser 710 under control of display controller 720. As now illustrated, file 1065 contains the information for a completely built test; that is, besides the Set Array and Initial and Final Sets already created, other intermediate sets (e.g., Sets 10 and 20) are depicted. Although the intermediate sets have been shown now for completeness and reference purposes, they still must be created by the process soon to be described. Exemplary Set 10 is composed of five set parts, namely, CONTROL, INPUT NOTES, INPUT DATA, OUTPUT NOTES and OUTPUT DATA, which is generally the case. The results of parsing by test flow parser 710 reside in memory and are represented by test flow data structure file 711. These results, as well as updated results from editing by display editor 740, comprise updated flow structure file 741, which is also stored in memory. The overall contents of file 741 are available to test flow writer 780 whenever a test flow "save" command is issued. The updated test flow is written to file 1065 via flow writer 780 under control of display editor 740.
Display controller 720 also determines which sets and set parts are to be displayed, generally for editing purposes. The data for each set part is read in from file 1065 by set part parser 730. The data structures resulting from parsing reside in memory and are represented by set part data structure file 731. Display controller 720, based upon information received from test flow parser 710, provides to screen definition controller 770 the name of the screen layout, if any, that will be required to display the associated set part. The screen layout information is passed to set part parser 730, which builds the display from the layout and set part data of file 1065. Parser 730 builds the completed set part data structures and stores them in file 731. If any set part has been edited by the user, the updated set part data structures reside in memory, as represented by updated data structure file 750, and the updated set part is written to file 1065 by set part writer 760 when the "save" command is issued. After a "save", control may then be returned to display controller 720 and, in turn, to dialog controller 1000 of FIG. 3.
As indicated above, test parser 710 extracts and then parses the test flow that is stored by file 1065. The test flow is represented by a set of structures that embody the test sequence. One such structure is an array of set information, which has already been designated SETARR for discussion purposes; it is basically a link list which controls the ordering of sets in a test sequence. Each link in the list contains information specific to the set it represents.
To demonstrate link usage as well as naming conventions employed in the illustrative embodiment, reference is made to FIG. 5. Line (i) of FIG. 5 depicts pertinent contents of the SETARR file upon entry to test generator 1070 (FIG. 4) immediately following a CREATE action menu selection.
Block 651 is representative of the Initial set whereas block 652 is indicative of the Final set. Each set has a number of identifiers associated with its corresponding block representation. One such identifier is the internal or fixed index, shown by an integer in a circle (e.g., block 651 depicts a zero in a circle, C(0), whereas block 652 is shown with a C(1)). A second associated identifier is the non-fixed or external index, shown by an integer in the block representation (e.g., block 651 contains a 0 whereas block 652 shows a 10). A third identifier describes the type of set represented by the block (e.g., block 651 defines a LOOP set whereas block 652 describes a REGULAR set). In a test sequence, there is the possibility of other sub-loops besides the main loop; the Initial set is always the start of the main loop and is so indicated. Also, each block has both a forward and backward link to other blocks. The links fully describe the interconnection and hence the flow of the process. For instance, block 651 is linked in the forward direction to block 652, as shown by the "1" on the block interconnecting line. Each link is defined in terms of the internal index corresponding to the "to block" in a "from block-to block" interconnection. Similar comments apply to the backward link. A summary of the link list information is shown in TABLE I for line (i) of FIG. 5:
TABLE I
BLOCK FORWARD BACKWARD
LABEL LINK LINK
651 "1" "1"
652 "0" "0"
Finally, with respect to line (i) of FIG. 5, a location pointer, depicted by the "ADD" label on the arrowed line, indicates the location of the next set to be added to the test flow; this set is actually the first set added by the user since the Initial and Final sets were generated by create controller 10Ç0. In the example, the first set added represents the transaction of DISPLAY 7, namely the "gclogo-pf4-gclogo" transaction. Block 652, repres~nting the E'inal set, has been "pushed-down" in the right-hand stack and block 653 has been interposed. The "ADD" pointer indicates the location in the SETARR of the transaction which will be added next in the sequence (in the exemplary test flow, the "gcocml-pf4-gcocml"
transaction). The external indices have been adjusted (e.g., block 652 now has an external index of 20) whereas internal indices remain fixed (block 652 is still C(1)).
Accordingly, the link list also changes, as summarized in TABLE II:
TABLE II
; BLOCK FORWARD BACKWARD
LABEL LINK LINK
651 "2" "1"
652 "0" "2"
653 "1" "0"
Lines (iii) and (iv) of FIG. 5 depict, respectively, the test flow upon entry of the "gcocml-pf4-gcocml" transaction and then the "gcocml-pf1-gcocml"
transaction. The link list of the final test flow of the example, as shown by line (iv), is as follows:
TABLE III
BLOCK FORWARD BACKWARD TRANSACTION
LABEL LINK LINK COMMENT
651 "2" "1" Initial Set
652 "0" "4" Final Set
653 "3" "0" gclogo-pf4-gclogo
654 "4" "2" gcocml-pf4-gcocml
655 "1" "3" gcocml-pf1-gcocml
With the naming convention established for the sets of a test flow, it is now possible to indicate the naming convention associated with the sets in the SETARR.
As noted earlier, there are generally five set parts.
Thus, for any set having an internal index X, the set parts and corresponding files are as follows: "X.1" for CONTROL; "X.2" for INPUT NOTES; "X.3" for INPUT DATA;
"X.4" for OUTPUT NOTES; and "X.5" for OUTPUT DATA.
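The block structure and insertion behavior described above can be sketched in code. The following is an illustrative sketch only (class and field names are assumptions, not from the patent): internal indices are fixed at creation time, while the forward and backward links always name the internal index of the linked block, and each set owns five set part files "X.1" through "X.5".

```python
class TestFlow:
    def __init__(self):
        # Internal index 0 = Initial (LOOP) set, 1 = Final (REGULAR) set,
        # matching C(0) and C(1) of line (i) of FIG. 5.
        self.blocks = [
            {"kind": "LOOP", "fwd": 1, "bwd": 1},     # Initial set, block 651
            {"kind": "REGULAR", "fwd": 0, "bwd": 0},  # Final set, block 652
        ]

    def add_after(self, prev, transaction):
        """Insert a new REGULAR set after internal index `prev`."""
        new = len(self.blocks)                 # fixed internal index, never reused
        nxt = self.blocks[prev]["fwd"]
        self.blocks.append({"kind": "REGULAR", "fwd": nxt, "bwd": prev,
                            "transaction": transaction})
        self.blocks[prev]["fwd"] = new         # splice the new block into the loop
        self.blocks[nxt]["bwd"] = new
        return new

    def set_part_files(self, index):
        # Five set parts per set: CONTROL, INPUT NOTES, INPUT DATA,
        # OUTPUT NOTES, OUTPUT DATA -> files "X.1" .. "X.5".
        return [f"{index}.{part}" for part in range(1, 6)]

flow = TestFlow()
flow.add_after(0, "gclogo-pf4-gclogo")   # line (ii) of FIG. 5
flow.add_after(2, "gcocml-pf4-gcocml")   # line (iii)
flow.add_after(3, "gcocml-pf1-gcocml")   # line (iv)
```

Running the three insertions reproduces the forward/backward links of TABLE III for blocks 651-655, with the Final set's backward link retargeted at each insertion while every internal index stays fixed.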
Referring again to FIG. 4, the operation of test generator 1070 was described in general terms and there was no attempt to distinguish the CREATE option from the MODIFY option of the action item menu. As shown in
FIG. 3, create controller 1060 and modify controller 1080 both access test generator 1070 via the same path, namely path 1071, thereby indicating that processing by generator 1070 is substantially the same for either option. One key difference is tied to the "ADD" pointer discussed with reference to FIG. 5. This pointer is automatically "on" when entering test controller 1070 from the CREATE mode, whereas if the user desires to MODIFY a test sequence by adding a new transaction, the "pointer on" condition must be actively set by the user. This feature is important in that basically the same programming code may be utilized to process either option, thereby mitigating computer memory usage.
As alluded to earlier, display controller 720 has the responsibility of (i) initiating action by test flow parser 710 to obtain set information, starting with the Initial set, and then (ii) executing a continuous loop by calling set part parser 730 to sequentially process sets and associated set part values, again starting with the Initial set. The loop may be broken by the user whenever the user so desires. Upon return from parser 710, the first set part under scrutiny is the control set part of the Initial set. The loop advances, respectively, from this control set part to the input note set part, then to the input data set part, to the output note set part, to the output data set part, to the control information of the next set and so forth. Some of these may be bypassed at the discretion of the user or internal flow control. The next set is determined from the link list exemplified by TABLE III. If the current set part is an input or output data set part ("X.3" or "X.5"), then the format name embedded in the set array is used.
Controller 720 passes this format name to screen definition con-troller 770 to determine if there is a valid format definition corresponding to the format name. When a match is achieved, set part parser 730 is called.
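The traversal loop just described can be sketched as follows; this is a hedged illustration (the function and dictionary shapes are assumptions) of walking the link list from the Initial set and visiting the five set parts of each set in display order.

```python
# The five set parts visited for each set, in the order described above.
SET_PARTS = ["CONTROL", "INPUT NOTES", "INPUT DATA", "OUTPUT NOTES", "OUTPUT DATA"]

def walk(blocks, start=0, limit=100):
    """Yield (internal_index, set_part) pairs in display order.

    `blocks` is a link list keyed by fixed internal index; the walk stops
    when the forward links return to the Initial set (the main loop closes)
    or after `limit` sets, standing in for the user breaking the loop.
    """
    visited, current = 0, start
    while visited < limit:
        for part in SET_PARTS:
            yield current, part
        current = blocks[current]["fwd"]   # next set from the link list
        visited += 1
        if current == start:               # back at the Initial set
            break

# Initial set 0 -> regular set 2 -> Final set 1, as in line (ii) of FIG. 5.
blocks = [{"fwd": 2}, {"fwd": 0}, {"fwd": 1}]
order = [idx for idx, part in walk(blocks) if part == "CONTROL"]
```

The `order` list records the sequence in which sets are visited, confirming that the walk follows forward links rather than internal index order.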
SCREEN DEFINITION CONTROLLER 770 and SET PART PARSER 730
Set part parser 730 builds different structures based on the type of set part being parsed. The first type considered is an input data set part. With reference to FIG. 6, which depicts pertinent blocks of test generator 1070, screen definition controller 770 receives the format name as provided by display controller 720.
Screen controller 770 builds a skeleton screen image for the given format. In FIG. 6, it is presumed that the format corresponding to LOGON, that is, DISPLAY 1 from the previous subsection, is requested. Block 771 is representative of a memory buffer used to store the skeleton screen image. Associated with screen image buffer 771 are two other buffers, namely, location map 772 and symbol array 773. Map 772 depicts the locations reserved for the field entries. For instance, the first field of the LOGON format that is filled-in by the user is the RRO field. The three consecutive 1's in map 772 indicate the position as well as extent of the user-supplied data. Similarly, the third possible field to have entries is the "message" line or all of line 24 of the display. Symbol array 773 summarizes the information with respect to image 771 and map 772. For instance, in array 773, data for RRO is placed in row 2, column 13 of the format and is of length 3. The "link" column is an index on rows of symbol array 773 itself.
Control is passed from screen definition controller 770 to set part parser 730. Parser 730 reads in from file 1065 and checks to see if data has been supplied to this set part already (since the processing may result from a MODIFY request). Presuming that data is already present, parser 730 fills in the screen image, now depicted by block 732, for eventual display to the user.
To find the information necessary to build screen image 732, symbol array 773 is augmented with data obtained from the corresponding set part to produce new
symbol array 733. The additional data, in the "data link"
column, gives the linkage necessary to extract the field data residing in expression array buffer 734. A data link entry of zero indicates there is no stored data in array 734. Thus, for this example, line 24 of the screen image 732 (the message line) is blank, whereas RRO, with a data link of one, stores the user-entered character string "DFB".
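A minimal sketch of this fill-in step, assuming simple dictionary shapes for the symbol and expression arrays (the field layout below is invented for illustration and does not reproduce the actual LOGON format):

```python
def fill_screen(skeleton, symbols, expressions):
    """Return screen lines with stored field data written over the skeleton.

    Each symbol entry carries the row, column and length recorded in the
    symbol array, plus a "data link" index into the expression array; a
    data link of zero means no stored data, so the field stays blank.
    """
    screen = [list(line) for line in skeleton]
    for sym in symbols:
        link = sym["data_link"]
        if link == 0:
            continue
        row, col, length = sym["row"], sym["col"], sym["length"]
        data = expressions[link].ljust(length)[:length]
        screen[row - 1][col - 1:col - 1 + length] = data
    return ["".join(line) for line in screen]

# Toy two-line skeleton standing in for screen image 771.
skeleton = ["RRO: ___", "MSG: ____"]
symbols = [
    {"row": 1, "col": 6, "length": 3, "data_link": 1},  # RRO field
    {"row": 2, "col": 6, "length": 4, "data_link": 0},  # message line, no data
]
expressions = {1: "DFB"}   # expression array buffer; link 1 holds "DFB"
filled = fill_screen(skeleton, symbols, expressions)
```

With these inputs the RRO field is populated with "DFB" while the message line remains untouched, mirroring the example above.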
The second type of set part considered for parsing is an output data set part; the parsing is shown symbolically in FIG. 7. The description of FIG. 7 follows the pattern of the description of FIG. 6 with the primary emphasis on the differences between FIGS. 6 and 7. In FIG. 7, the pertinent blocks of test generator 1070, namely, screen definition controller 770 and set part parser 730, are depicted. Screen definition controller 770 receives the desired format name as communicated from display controller 720 and then builds the skeleton screen image for the desired format. In FIG. 7, it is presumed that the format corresponding to DISPLAY 2 is requested. Block 771 now stores the skeleton screen image for the format of DISPLAY 2. In the manner described above with respect to FIG. 6, location map 772 and symbol array 773 are built in buffer memory.
After the skeleton screen image is built, control is passed to set part parser 730 from screen definition controller 770. At this point in the processing, set part parser 730 is used to read in data for comparison to data eventually returned from SUT 100 (FIG. 2) and build the appropriate data structures, namely, screen image 732, symbol array 733, expression array 734 and test array 735. Whereas in FIG. 6, the focus is on the RRO and PASSWORD fields, the focus of FIG. 7 is on the "message" or MSG field (the last line of this particular screen image). The data link column in symbol array 733 points to data that is compared against data returned in this field as a result of exercising
SUT 100. In FIG. 7, the MSG field points to the data in link 1 of expression array 734 as the data to be used for comparison purposes.
The test link column in expression array 734 provides an index into test array 735. Each test named in test array 735 may be built from one or more field comparisons and test array 735 summarizes the comparison field list. FIG. 7 shows a link from expression array 734 to the single test named in test array 735, namely, TEST1.
Thus, the interplay among arrays 733-735 is such that the data returned in the "message" field upon execution of SUT 100 is associated with TEST1 and, for this example, there is a single comparison of TEST1 data to the single data line in array 734.
By way of generalizing the principles depicted in FIGS. 6 and 7 for the illustrative embodiment, reference is made to FIG. 8. In FIG. 8, an output set part for the LOGON format is considered. Focusing on the RRO field of screen image 732, it is seen that three "@"
characters appear as field data. Cross-referencing arrays 733 and 734 leads to the conclusion that the data returned in the RRO field upon execution of SUT 100 will be compared to the character string "DFB". Based on previous discussion, it is expected that the "DFB" character string would be displayed in screen image 732 rather than the "@" characters. However, as alluded to earlier, "@"
characters are displayed whenever a variable or an expression is defined for field data. In this case, RRO
is defined as a variable and this is evidenced by the addition of variable array 736 and the variable link column in symbol array 733. The data returned in field RRO may be used later, for example, as input to another format. The storing of data associated with variable array 736 provides the notepad function alluded to in the overview description.
In screen image 732, the PASSWORD field also contains a string of "@" characters. In this case, the "@" characters indicate that the data returned from SUT 100 is to be compared to an expression. This is discerned from cross-referencing arrays 733 and 734. In particular, the expression 'today' forms part of the PASSWORD and it will be evaluated at the run time of SUT 100.
In FIG. 8, test array 735 has been expanded relative to the one shown in FIG. 7 since three fields are now used for test purposes. A further refinement to the emulation process, leading to the expansion of test array 735, is shown in FIG. 9 wherein the pertinent parts of arrays 734 and 735 are depicted. The situation depicted in FIG. 9 indicates that the first comparison of TEST3 is the entry of line or link 3 of array 734. To determine if there are any more comparisons for TEST3, the 'next' column in array 734 is addressed. In this case, there is another comparison at link 4 of array 734. The data entry of link 4 shows the field is to be tested against 'LOOKUP[GOC-1111I]'. The 'lookup' command accesses a message file GOC and finds the string associated with the index 1111I. It is also possible to indicate whether this array entry is to be ORed or ANDed to previous comparison results.
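The chained-comparison scheme just described can be sketched as follows; the entry shapes, combine flags, and the comparison strings are assumptions for illustration (the second entry stands in for the string a real LOOKUP[GOC-1111I] would return):

```python
def run_test(expr_array, first_link, returned):
    """Evaluate a chained comparison list against data returned from the SUT.

    Each expression-array entry carries the comparison data, a 'next' link
    to a further comparison (zero ends the chain), and whether its result
    is ORed or ANDed with the running result.
    """
    link, result, combine = first_link, None, None
    while link:
        entry = expr_array[link]
        ok = (returned == entry["data"])
        if result is None:
            result = ok
        elif combine == "OR":
            result = result or ok
        else:                              # default: AND with previous result
            result = result and ok
        combine = entry.get("combine", "AND")
        link = entry.get("next", 0)
    return result

# TEST3 of FIG. 9: link 3 is the first comparison; link 4 is ORed to it.
expr_array = {
    3: {"data": "REQUEST COMPLETED", "next": 4, "combine": "OR"},
    4: {"data": "LOOKED-UP MESSAGE", "next": 0},
}
passed = run_test(expr_array, 3, "LOOKED-UP MESSAGE")
```

Because the two comparisons are ORed, the test passes when the returned field matches either stored string and fails only when it matches neither.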
The last type of set part considered for parsing is either an input or output note set. The parsing is effected in a manner similar to that described with respect to FIGS. 6 or 7. Basically, screen image 771 is a blank screen and screen image 732 is a display of the note data stored by file 1065.
Display editor 740 of FIG. 4 handles the display of a window, that is, the background screen plus any overlay, on device 402 of FIG. 3 as well as keyed-in data from keyboard 401. Also, editor 740 processes updates of
set part and set array data structures effected by the user during an editing session. The editing operation is accomplished either directly on the screen of device 402 or indirectly, via a SPECIAL WINDOW, overlayed on the screen of device 402. The mode selected depends on the type of change. Generally, the direct mode or the VARIABLE and EXPRESSION special windows are used to modify set part data structures and the COMMAND special window is utilized to change the contents and order of the test flow in the test data structures. The display on device 402 is built from a link list of window buffers; the order of the buffers in the link list shows the hierarchy of the window buffers. This hierarchy is used to determine those portions of the background screen and overlays that show
through for each screen display.
A block diagram for display editor 740 is depicted in FIG. 10. Terminal screen builder 1020 and terminal screen controller 1030 have already been depicted in FIG. 1 as part of the action menu selection process.
Both builder 1020 and controller 1030 are, in effect, subroutine-like processors that are accessed during various operations in the emulation process to perform the basic screen building and control functions.
Terminal screen builder 1020, utilizing window buffer list information associated with buffer block 1026, creates both the terminal screen image represented symbolically by block 1021 as well as blocks 1022-1025 which provide supplementary information utilized to manipulate the terminal screen image. With respect to the buffer list information encompassed by block 1026, FIG. 11 depicts the manner in which the window buffers are linked on the buffer free list prior to allocation as well as the linkage among allocated windows on the display list as they are prepared for display on screen 402. In particular, the window grouping of subset (i) of FIG. 11A
shows three free windows on the window link list as well as the forward and backward pointer arrangements coupling
the windows to the free list. Subset (ii) depicts the new state of the free list after WINDOW 1 has been extracted from the free list and placed on the window display list.
The display list and its associated windows are also linked via forward and backward pointers. Two additional pointers, namely CW and SCW, which point to the Current Window and Screen Current Window (that is, the main or underlying screen window), are also deployed to track overlays. In fact, subset (iii) of FIG. 11A depicts an overlay situation wherein two windows have been moved sequentially from the free list to the display list and wherein WINDOW 1 is the underlying window and WINDOW 2 is displayed in overlay fashion on WINDOW 1.
Subset grouping (iv) of FIG. 11B depicts that a special, stand-alone window, referred to as the SPECIAL WINDOW, may also be added to the display list whenever it is invoked by the user. A new pointer SW aids in tracking this overlay. The link lists for this situation are also depicted for both the free and display lists. Finally, grouping (v) of FIG. 11B depicts the buffer list arrangement prior to allocation of any windows off the free list and exhibits the stand-alone nature of the special window.
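The free-list/display-list discipline of FIG. 11 can be sketched as below. This is an assumption-laden illustration (Python lists stand in for the pointer-linked buffers, and only the CW pointer is modeled):

```python
class WindowManager:
    def __init__(self, count):
        self.free = list(range(1, count + 1))   # WINDOW 1..count on the free list
        self.display = []                       # bottom-to-top overlay order
        self.cw = None                          # Current Window pointer

    def allocate(self):
        """Move the next free window onto the top of the display list."""
        window = self.free.pop(0)
        self.display.append(window)
        self.cw = window                        # newest overlay becomes current
        return window

    def release(self):
        """Pop the top overlay back onto the free list."""
        window = self.display.pop()
        self.free.insert(0, window)
        self.cw = self.display[-1] if self.display else None
        return window

wm = WindowManager(3)
wm.allocate()            # subset (ii): WINDOW 1 moves to the display list
wm.allocate()            # subset (iii): WINDOW 2 overlays WINDOW 1
```

After the two allocations, WINDOW 2 is the top overlay and current window, WINDOW 1 underlies it, and WINDOW 3 remains free; a `release` reverses the most recent allocation.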
Again with reference to FIG. 10, screen image 1021 is built from screen text, such as described for FIG. 6, as well as stored information relating to the window size, window location and location map. Screen builder 1020 first builds a display for the top window of the overlays. The next window is checked for any locations that are not covered up by the top window. Each successive window is checked for any locations that are not covered by the windows on top of the window under consideration.
Cursor control table 1022 is built by using window location sizes, location map and symbol array to determine cursor protected positions and fields. A cursor protected position means that the cursor is not allowed to
be placed in that location or position on screen 402. A
protected field means the cursor may be moved into the location but a character may not be typed into the position on screen 402. For instance, an input note set part has a corresponding window with all screen positions and fields being unprotected since the user may type into any position on the screen. On an output data set part window, however, screen fields that are filled with "@"
symbols are protected fields but not protected positions.
The cursor may be moved to the field, but only control-like keystrokes are allowed.
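The two levels of protection can be sketched with a per-cell table; the one-character-per-cell encoding below is an assumption for illustration, not the patent's actual table layout:

```python
POSITION_PROTECTED = "P"   # cursor may not even enter the cell
FIELD_PROTECTED = "F"      # cursor may enter, but typing is refused
UNPROTECTED = "."          # cursor may enter and type

def may_move(table, row, col):
    """Cursor movement is blocked only by position protection."""
    return table[row][col] != POSITION_PROTECTED

def may_type(table, row, col):
    """Typing requires a fully unprotected cell."""
    return table[row][col] == UNPROTECTED

# One row of an output data set part window: literal screen text is
# position-protected, and the "@"-filled field is field-protected only.
table = ["PPPP....FFF"]
moves = (may_move(table, 0, 0), may_move(table, 0, 8))
types = (may_type(table, 0, 8), may_type(table, 0, 5))
```

The checks show the cursor cannot enter the position-protected cells, can enter the "@" field without typing into it, and can both enter and type in the unprotected cells.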
Boundary table 1023 contains window boundaries that may not be crossed by the cursor.
Screen layout mapper 1024 stores the index into the symbol array (e.g., array 773 of FIG. 6) for a field on the screen display. The index is used in conjunction with screen-to-window guide 1025 to determine the actual symbol array entry. Screen-to-window guide 1025 comprises two types of information contained in tables. One table is used to find which particular window buffer goes with a terminal position on screen 402. A second table points to the symbol array in the window buffer associated with the location data.
COMMAND CONTROLLER 743
Command controller 743 of FIG. 10 controls communication with the user whenever it is necessary to perform such activities as terminating the automatic addition of new sets while in the CREATE mode, or while in the MODIFY mode, altering test data structures or positioning to some specific set part within the test sequence. When initiated by the appropriate keyboard entry (e.g., control-C) on keyboard 401, command controller 743 calls into view on device 402 an overlay window characterized by a four-line display; this window was referred to as the special window in the discussion of FIG. 11B.
In elucidating the operation of command controller 743, reference is made to FIG. 12. The four-line menu is built in command menu builder 7432 upon instruction from command dialog controller 7431, which serves as the interface to screen handler 742 of FIG. 10. In general, the command menu is composed of a top line called the "direct command" line, second and third lines called "selection" and "data entry", respectively, and a bottom "message" line. The lines comprising any command menu are supplied to menu builder 7432 from menu table file 7435, with the particular lines selected for display being dependent on the given user request.
The initial command window that is built is displayed pictorially, in pertinent part, in FIG. 13. In FIG. 13, the "command" line is highlighted by cross-hatching (on an actual screen display the highlighting is accomplished, for example, by a light-on-dark display or blinking screen locations). The "selection" line and "data entry" line have descriptive token entries whereas the "message" line is blank. In other types of command window displays, the "data entry" line may have data entry positions adjacent to some or all of the descriptive tokens.
To demonstrate one prime processing function invoked from the command menu, it is supposed that the user decides that the test sequence of the illustrative example represented by DISPLAYS 1-6 defines the desired test. The user desires to end the automatic addition of sets in the CREATE mode and save the test sequence defined by DISPLAYS 1-6. It is recalled that DISPLAY 6 is the last output data set in the desired test sequence. To save the test sequence characterized by the SETARR file and the set parts associated with DISPLAYS 1-6, the user invokes the command window overlay on DISPLAY 6. To save the test flow and test data, the user positions the screen cursor to the "selection" field entry designated SAVE.
The user signals command dialog controller 7431, via a keyboard entry such as a carriage return, to process the SAVE "selection" entry. There is a so-called menu field number associated with the cursor position of each "selection" entry in the displayed window. This number is added to command stack 7436 in response to the carriage return and, in this particular case, the SAVE field number is the one placed on stack 7436. Control is then passed to command action controller 7434.
Command action controller 7434 controls the actual running of the desired request. Controller 7434 utilizes the information on stack 7436 to sequence the processing of command requests. In this case, the sole SAVE request is processed, and control is then passed back to command dialog controller 7431.
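The stack discipline just described can be sketched as follows, a hedged illustration in which field number 8 for SAVE is taken from FIG. 14 and the rest of the structure is assumed:

```python
class CommandStack:
    """Stand-in for command stack 7436: dialog handling pushes menu field
    numbers; the action controller later runs them in top-down fashion."""

    def __init__(self):
        self.stack = []

    def push(self, field_number):
        self.stack.append(field_number)

    def run(self, actions):
        """Process stacked field numbers top-down, returning what ran."""
        ran = []
        while self.stack:
            field = self.stack.pop()       # top of stack first
            ran.append(actions[field])
        return ran

actions = {8: "SAVE"}                      # menu field number 8 -> SAVE
stack = CommandStack()
stack.push(8)                              # carriage return on the SAVE entry
executed = stack.run(actions)
```

In the SAVE example only one field number is stacked, so a single request runs before control returns to the dialog controller.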
It is also possible to process a "selection"
command as well as a "data entry" command by utilizing the "direct command" line. Thus, rather than positioning the cursor to the desired field in the "selection" line, the user may enter the desired command by directly typing the command on the top line of the COMMAND window display.
For instance, the word SAVE could be typed in this top line and SAVE processing would then be effected. However, before the processing may be accomplished, it would be necessary to call command line parser 7433 in order to parse and thereby determine the appropriateness of the typed information. This direct entry capability is utilized to expedite command processing for experienced users.
As an example of certain other capabilities of command controller 743 which further illustrate its functionality, the technique for adding a regular set to an already defined test sequence is now considered. Such a situation is depicted by the transition between lines (i) and (ii) of FIG. 5. In the earlier discussion relating to FIG. 5, however, regular set 10 was added automatically because CREATE mode operation was assumed.
Now, it is presumed that line (i) of FIG. 5 represents the complete test definition and the MODIFY mode is deployed to further define the test sequence.
The user, at the action menu level, requests the MODIFY mode and provides the requisite test name. The screen displayed in response to the user entries is the CONTROL screen for LOOP 0 (FIG. 5(i)). The user next invokes the COMMAND special window to display the command menu and control now resides in command controller 743.
Command controller 743 initializes command stack 7436 with the field number of the initial menu.
This field number on stack 7436 is used to index an entry into menu table file 7435. Since this is the initial request for the COMMAND special window, command menu builder 7432 is invoked and the display represented by FIG. 13 is presented in overlay fashion on the already displayed CONTROL screen for LOOP 0.
To elaborate on the menu building process, reference is made to FIGS. 14 and 15, which depict, respectively, a more detailed version of FIG. 13 and the partial contents of menu table file 7435. In FIG. 14, the location map built by the command menu builder and associated with the initial menu is presented; as depicted, the field number associated with SETS is a 3, STRUCTURE corresponds to 5 and SAVE maps to an 8. The origin of this location map may be discerned with reference to FIG. 15. It is noted that the first entry in the "field number" column is a "1", and the text as well as the embedded field number associated with each text entry are shown in the top row of the next column. Thus, since the command menu builder initially receives and processes the first field number, as depicted in FIG. 14, the "selection" line is composed of the text on line 1 of FIG. 15. Moreover, by convention, the "data entry" line is composed of the tokens associated with the first entry on the "selection" line. In this case, the first entry is SETS, with a field number of 3. Cross-referencing field
number 3 indicates that JUMP, ADD and DELETE are displayed on the "data entry" line whenever SETS is the highlighted selection entry or the "command" line is highlighted.
Since the objective of this example is to add a regular set after set 0 (FIG. 5(i)), the user manipulates the initial menu display to select the SETS option in the menu (for instance, by operating the tab key). The user then strikes an appropriate key (e.g., ENTER) to indicate that an operation on SETS is to be performed. Command dialog controller 7431 appends the field number found on the location map at the cursor position, namely field number 3, to command stack 7436, as now depicted in FIG. 16. Also in FIG. 16, the old "data entry" line becomes the new "selection" line and the text in menu table 7435 associated with the JUMP option becomes the new "data entry" line, as may readily be comprehended with reference to FIG. 15, field number 10.
Since the user wishes to add a set, the cursor is positioned by the user to the token ADD. As the cursor is moved across the tokens in the "selection" line, the associated menu for the highlighted word on the selection line is placed on the "data entry" line. In this way, the user is alerted to the next level of menu options available if the pointed to item on the current level is selected.
The user proceeds in the select-tab-ENTER mode through the various levels of the menu. Generally, at some point in the selection process, the user is required to enter data adjacent to "data entry" line tokens. This is the case in the present example, as depicted by FIG. 17, wherein the user desires to add a set after set 0, so the appropriate information is entered on the command display. The contents of command stack 7436 and the corresponding COMMAND MENU are given by FIG. 17. Moreover, two additional stacks, namely, the "data" stack and "index to data" stack are shown in FIG. 17. As the name implies, the latter stack stores a pointer to the
actual data stored in the former stack. In FIG. 17, for example, the index stack points to location 2 of the data stack wherein "y" for yes is stored. The "y" represents the data placed before the token field AFTER in the displayed window of FIG. 17.
Once the user has completed the menu selection procedure and enters the appropriate information as required, control is passed to action controller 7434.
Action controller 7434 accesses stack 7436 and commences processing the commands associated with the field numbers placed on the stack in top-down fashion.
As suggested in the foregoing discussion, the user may also enter commands directly without stepping through the various levels of the menu. Direct commands are entered on the first line of the command menu window.
Using the previous example, after the initial menu of FIG. 13 is displayed, the user enters the following information string on the "direct command" line:
SETS ADD AFTER 0.
Upon entry of the appropriate key stroke (e.g., ENTER), command controller 7431 recognizes that the user has entered a direct command and passes control to command line parser 7433. Line parser 7433 sequences through menu table file 7435 and compares each menu entry to the first word placed on the "direct command" line. If a match occurs, command line parser 7433, by emulating the actions of a user during menu selection, places the field number of the matched command on command stack 7436. Command line parser 7433 processes subsequent words or data on the "direct command" line in a similar manner until all words and data are handled. Each time there is a match, the actions of a user are emulated (e.g., cursor positioning and ENTER key stroking) exactly as with menu selection.
However, the window displayed to the user is fixed, that is, there is no changing of data on the "selection" and "data entry" lines until after the "direct command" line is parsed.
If a match is not found between the word under consideration and any menu table entry, an error message is displayed on the "message" line of the command window.
A user may also supply less than a complete command list and thereby skip forward in the series of menu displays that the user would otherwise step through individually.
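The direct-command matching loop can be sketched as below. Only SETS=3, STRUCTURE=5, SAVE=8 and JUMP=10 come from FIGS. 14 and 15; the remaining field numbers, and the treatment of unmatched words as data answers, are assumptions for illustration:

```python
MENU_TABLE = {"SETS": 3, "STRUCTURE": 5, "SAVE": 8, "JUMP": 10,
              "ADD": 11, "DELETE": 12, "AFTER": 20}   # 11, 12, 20 assumed

def parse_direct_command(line):
    """Return (field numbers, data words), or an error message string.

    Each word typed on the "direct command" line is looked up in the menu
    table; a match pushes its field number, emulating the menu selections
    a user would otherwise have stepped through. A non-matching word that
    follows a matched token is treated as data for that token.
    """
    fields, data = [], []
    for word in line.split():
        if word.upper() in MENU_TABLE:
            fields.append(MENU_TABLE[word.upper()])
        elif fields:
            data.append(word)
        else:
            return f'unknown command "{word}"'   # shown on the message line
    return fields, data

result = parse_direct_command("SETS ADD AFTER 0")
```

Parsing the example string yields the field numbers for SETS, ADD and AFTER plus the data word "0"; an unrecognized first word instead yields the error text destined for the "message" line.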
WINDOW CONTROLLER 744
Window controller 744 of FIG. 10 serves a two-fold purpose. The first aspect pertains to the overall management of the window free list and the window display list as described earlier with reference to FIG. 11. The second aspect relates more particularly to the management of the EXPRESSION and VARIABLE special windows; these comprise two of the windows of the SPECIAL WINDOW block discussed generally with reference to FIG. 11B. A third window in the SPECIAL WINDOW category is the COMMAND
special window described in the last subsection.
The manner of calling either the EXPRESSION or VARIABLE windows into view and populating the displayed window with information is substantially similar to the operation of the COMMAND special window. The EXPRESSION
and VARIABLE windows are called into view in overlay fashion, generally onto an INPUT or OUTPUT DATA set part, to supply "expression" (e.g., today + 15) or "variable"
(e.g., t_cac) information.
To invoke the window, the cursor is positioned to the field of interest and special keyboard entries are stroked (e.g., control-X and control-V for the EXPRESSION
and VARIABLE windows, respectively). The appropriate window is overlayed proximate to the field under consideration. A typical VARIABLE window overlay has three lines of the form (commensurate to FIG. 13):
OVERLAY DISPLAY

add edit delete ...
[cb,ce] -> "variable"
msg:

The first or menu line provides a list of options that the user may sequence through to arrive at the option of choice. The second or definition line allows the user to type in the desired "variable" name.
The [cb,ce] designation (cursor begin, cursor end) provides the user with the capability of defining the variable for all or even a portion of the overall field length. For instance, if a field length is 10 positions, then the notation [3,6] designates only the third through sixth positions are associated with the particular defined variable. The following example of a definition line relates to DISPLAY 4 above:

[1,7] -> t_cac.

The third or message line provides feedback to the user, for instance, on allowed menu choices or error messages.
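The [cb,ce] subfield convention can be shown in a small sketch; the sample field contents below are invented for illustration:

```python
def bind_subfield(field_value, cb, ce):
    """Return the portion of a field covered by a [cb,ce] definition.

    Positions are counted from 1, and both cursor-begin and cursor-end
    are inclusive, per the [3,6] example above.
    """
    return field_value[cb - 1:ce]

# DISPLAY 4 example: [1,7] -> t_cac binds the first seven positions.
t_cac = bind_subfield("CAC4321XYZ", 1, 7)     # field contents are invented
part = bind_subfield("0123456789", 3, 6)      # positions 3 through 6
```

With a 10-position field, [3,6] captures exactly the third through sixth characters, while [1,7] captures the full seven-position prefix bound to t_cac.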
Storage of the information provided by a VARIABLE window has already been discussed with reference to FIGS. 6-9. It is recalled that information accepted as valid upon a call to a VARIABLE window is shown by a "fill" character (e.g., @) in the appropriate field positions upon exiting the window.
A typical EXPRESSION window overlay also has three lines generally of the form of the above OVERLAY
DISPLAY; however, the definition line is now of the form [cb,ce] "relation" "expression", where "relation" is an operation of the form "equal",
"greater than", "not equal to" (=, >, !=) and so forth.
Moreover, "expression" may be one from a standard set of system-supplied expressions such as "today" for the execution or run time date. The handling of the EXPRESSION window information was also covered in the description of FIGS. 6-9. A field defined by an expression also has an appropriate "fill" character to indicate this fact.
4. Program Flow
The description in the foregoing section focused on the emulation processing methodology in accordance with the illustrative embodiment of the present invention.
This section treats the corresponding program flow, that is, the detailed series of operational steps utilized to effect the process flow.
In FIG. 3, dialog controller 1000 depicts the arrangement of processing blocks used to communicate with the user. FIG. 18 shows the flow diagram corresponding essentially to the processing performed by controller 1000 of FIG. 3. In FIG. 18, upon start-up, emulation process initialization is accomplished by code block 2005. For instance, arrays are restored to default values and screen 402 of FIG. 3 is cleared. After initialization, the action menu screen exemplified by DISPLAY 8 above is built by code block 2015 using data stored on-line by file 2010. The window buffer for the action menu is allocated as already discussed with reference to FIG. 11. The completed window buffer is displayed to the user via code block 2020. Decision block 2025 processes the keyboard entry by the user.
Supposing a CREATE request, code block 2030 prepares the next window displayed to the user. The format for the CREATE screen is contained in on-line file 2035. The user, in response to the CREATE display on the screen, types the name selected by the user to be associated with the test. This invokes: code block 2040 wherein a directory file with the test name is entered

into the file structure; code block 2045 to write the Initial and Final Set data into file 1065; code blocks 2050 and 2055 to write out the skeleton SETARR file to test file 1065; and passing of control to code block 2060.
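The CREATE-mode scaffolding performed by code blocks 2040-2055 can be sketched as below; the storage layout (a dict standing in for file 1065 and the directory) and the set-part contents are assumptions for illustration:

```python
def create_test(store, name):
    """Populate `store` (a stand-in for file 1065) for a new test.

    Mirrors the three CREATE steps above: enter the test name in a
    directory, write the Initial and Final set data, and write a
    skeleton SETARR containing only those two sets linked to each other.
    """
    store.setdefault("directory", []).append(name)      # code block 2040
    store["0.1"] = "CONTROL: Initial set (LOOP)"        # code block 2045,
    store["1.1"] = "CONTROL: Final set (REGULAR)"       #   sets C(0) and C(1)
    store["SETARR"] = [{"fwd": 1, "bwd": 1},            # code blocks 2050/2055
                       {"fwd": 0, "bwd": 0}]
    return store

store = create_test({}, "mytest")
```

The resulting skeleton SETARR matches the two-block structure of FIG. 5(i), ready for regular sets to be spliced in between the Initial and Final sets.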
In the event the user selected the MODIFY mode, decision block 2025 directs control to code block 2070 wherein the MODIFY screen is created, using on-line file 2075, and displayed to the user. After the user supplies an already existing test name via the keyboard, control is also passed to code block 2060.
If the user selects the EXECUTE mode, control is passed to FIG. 20, which is discussed below.
Code block 2060 prepares the parameters needed by the next section of the code. The parameters depend on the manner in which code block 2060 is entered, that is, either from the CREATE mode or the MODIFY mode. For instance, if MODIFY is selected, then the command window menu is overlayed automatically on the next window displayed to the user. After parameter preparation, control is passed from code block 2060 to the program flow of FIG. 19. The flow of FIG. 19 corresponds basically to the processing effected by test generator 1070 of FIG. 4.
After an initialization phase performed by code block 2105 of FIG. 19, control is passed to code block 2110. This code effects a reading-in of the SETARR file from file storage 1065. Supposing the CREATE mode, the SETARR file is a skeleton file with the structure represented by FIG. 5(i). The initialization performed by block 2105 would cause a "yes" response in decision block 2115, thereby invoking the allocation and display of the CONTROL screen via code blocks 2120 and 2125. The layout of the required LOOP CONTROL screen is provided by on-line storage file 2124.
The CONTROL screen for the Initial set allows the user to provide such information as the names of notepad-type variables that will be used in the test flow description. This is in contrast to the CONTROL screen for a REG set wherein the INPUT DATA and OUTPUT DATA formats as well as the "message" field comparison information are typical user entries. The manner in which the format information is embedded in the SETARR file is presented shortly; similarly, the techniques for storing the "message" field as set part data are also elucidated below. Code blocks 2130, 2135 and 2140 extract the "message" field from set part data whenever required by a particular CONTROL screen.
Again, presuming the CREATE mode, code block 2130 causes a branch to user screen handler code block 2150, thereby bypassing code blocks 2135 and 2140.
Handler code block 2150, described in more detail below, processes user keyboard entries to the displayed control screen. If it is supposed that the defaults for the LOOP CONTROL screen are acceptable, such as no variables being defined, the user may invoke an immediate return from user handler 2150 (e.g. by using the EXIT command of the COMMAND special window). Code block 2155, coupled to code block 2150, allows the user to return to the action menu level. Supposing the user desires to continue in the CREATE mode, control is passed to code block 2160 wherein parameters are prepared in order to process the next set part. As a result of the parameters defined at this point, control is passed from block 2160 to code block 2145. This occurs since a LOOP set has only a CONTROL set part, which has already been displayed, so the "no" responses to blocks 2116-2119 are invoked, thereby leading to code block 2145.
Since the CREATE mode is under consideration, code block 2145 adds a set after the LOOP set, as represented by FIG. 5(ii). Control is passed from code block 2145 to code block 2115 via screen code block 2150 (not invoked yet since no user input is required), through return code block 2155 and preparation block 2160.
Control block 2115 yields a "yes" response as a result of parameters prepared in code block 2160. A REG CONTROL screen window is displayed to the user, via blocks 2120 and 2125 and file 2124. Since this set part is new, there is presently no data in OUTPUT DATA (in fact, the format of OUTPUT DATA is about to be defined on the REG CONTROL screen), so control is passed directly to screen handler 2150.
Using the example described previously, the data to be provided to this REG CONTROL screen is as presented earlier in DISPLAY 7, namely, the "gclogo-pf4-gclogo" transaction. Thus, the user enters this information on the REG CONTROL screen via the keyboard and handler code 2150 appropriately processes the information. Now, as control is passed from block 2150 through blocks 2155, 2160 and the "no" portion of blocks 2115 and 2116, the "yes" response path of block 2117 is traversed as a result of parameters prepared within block 2160 (namely, the CONTROL screen has already been displayed and no INPUT NOTE data will be provided by the user). An appropriate display is built by code blocks 2122 and 2127, the latter having on-line access to all formats that may be requested by a user. In the instant case, the format displayed corresponds to the "gclogo" format of unfilled DISPLAY 1. In the CREATE mode, code block 2171 is bypassed since there is no pre-existing data stored in the set part portion of file 1065. However, in the MODIFY mode, code block 2171 is invoked to fill-in the display via the technique described with reference to FIG. 6. Control is then passed to screen handler code block 2150 to again process user responses.




This time upon return from handler block 2150, control lodges in code block 2119. Following the "yes" path, the display corresponding to unfilled DISPLAY 2 is built by code blocks 2124 and 2129. Again, block 2129 has access to all formats that the user may wish to invoke. Before proceeding to user handler code block 2150, a determination regarding the need to add a new set is made in code block 2145. In the present case, a new set is added; it corresponds to the "gcocml-pf4-gcocml" transaction depicted in file structure form by FIG. 5(iii).
The processing by the code blocks of FIG. 19 continues, for the example, for two more iterations to cover the transactions represented by DISPLAYS 3 and 4 and DISPLAYS 5 and 6, respectively. As control is returned to screen handler code block 2150 after the third iteration, return code block 2155 is invoked. The test sequence presently embodied by the SETARR and set parts is written to file 1065 and the action menu is then displayed.

Canonic Form Representation of Sets

As indicated above, both the SETARR and set part files are stored, in a so-called canonic representation, in file 1065. The canonic representation for the example test flow is now discussed.
LISTING 1 below is essentially the canonic representation for the exemplary SETARR file.

LISTING 1

LINE    SETARR CANONIC FORM
 1      (1007(2007
 2        (1022(1023 today)(1023 t_c102)(1023 t_cac))
 3        (1006(2007
 4          (1018 0 0 -1)
 5          ' ' ' ' ' ')
 6        (2007
 7          (1002(2007
 8            (1018 2 10 3)
 9            'gclogo' 'pf4' 'gclogo'))
10          (1002(2007
11            (1018 3 20 4)
12            'gcocml' 'pf4' 'gcocml'))
13          (1002(2007
14            (1018 4 30 1)
15            'gcocml' 'pf1' 'gcocml'))
16          (1002(2007
17            (1018 1 40 0)
18            ' ' ' ' ' '
19          ))
20        ))
21      ))
In order to discuss the contents of LISTING 1, it is necessary to first introduce some terminology and notation. The basic building blocks of the canonical form are atoms and lists. An atom describes a unit which cannot be further subdivided. Atoms are combined into groups to form a list, and any list can be composed of other lists and atoms to form a hierarchical structure.
There are generally five types of atoms, namely: operators; numbers; string constants; variables; and defined references. Operators are positive integers. Numbers are integers that may be negative. A string constant is any group of characters surrounded by primes. A variable is any group of alphanumeric characters starting with a letter. Finally, a defined reference is a name that begins with a "." to indicate that it represents a symbolically defined screen address. For example, in LISTING 1: on line 1, the integer 1007 is an operator; on line 2, today is a variable; on line 4, -1 is a number; on line 9, 'gclogo' is a string constant. An example of a defined reference is shown in the set part listing


presented shortly.
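The five atom categories just described can be checked mechanically. The classifier below is a hypothetical Python sketch that mirrors the stated rules (the patent gives no such code); note that whether a positive integer is an "operator" or a "number" actually depends on its position, since the first atom of a list is the operator, so positive integers are reported here by form alone.

```python
# Hypothetical classifier for the atom types described above.
import re

def atom_type(tok):
    if re.fullmatch(r"'[^']*'", tok):
        return "string constant"          # characters surrounded by primes
    if re.fullmatch(r"-\d+", tok):
        return "number"                   # numbers may be negative
    if re.fullmatch(r"\d+", tok):
        return "operator (or number, by position)"
    if tok.startswith("."):
        return "defined reference"        # symbolically defined screen address
    if re.fullmatch(r"[A-Za-z][A-Za-z0-9_]*", tok):
        return "variable"                 # alphanumerics starting with a letter
    return "unknown"

for t in ("1007", "today", "-1", "'gclogo'", ".due_date"):
    print(t, "->", atom_type(t))
```

The examples fed to the loop are exactly those cited from LISTING 1 in the preceding paragraph.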
In a list, atoms and other lists are separated by spaces and surrounded by parentheses. Also, the first atom in a list is an operator. The operator is associated with an object and the operator acts as a tag and thereby indicates that the list is defining an instance of that object.
A simple example of operator tagging is provided by line 2 of LISTING 1; this line represents the storing of the EXPRESSION and VARIABLE symbol table for these quantities in the SETARR. The operator 1022 is associated with a symbol table and operator 1023 is associated with a symbol table entry. In the line (1022(1023 today)(1023 t_c102)(1023 t_cac)), the outer list operator (i.e., 1022) indicates that a symbol table is represented. Each of the three lists within the outer list represents the table entries.
As the canonical form is parsed, actions are taken as lists are recognized from the deepest nesting level outwards. Thus, in line 2 of LISTING 1, each symbol table entry is recognized first and then the symbol table operator is recognized. When a list is "recognized", a routine associated with the operator in that list is invoked. The components of the list are passed as parameters. If a component is an atom, it is passed directly. If a component is a list, then that list must first be replaced with the result of processing the list at parse time. Thus, with respect to line 2 of LISTING 1, the order of events is as follows:

(i) recognize the list (1023 today) and call the routine associated with symbol table entries. This routine has one input parameter, namely, today. As a result of processing, this entry will be created in the symbol table. The routine that built this entry passes back a pointer to the entry.



(ii) similarly, recognize the list (1023 t_c102) and then the list (1023 t_cac), build entries and return pointers.

(iii) recognize the list (1022 pointer1 pointer2 pointer3) and call the routine associated with operator 1022. This routine takes a collection of pointers and groups them to create a complete symbol table. A pointer to the table is returned as the final result of parsing the list.
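The innermost-first parse described in steps (i)-(iii) can be sketched as follows. This is a hypothetical Python reconstruction (the patent does not give parser code); the operator numbers 1022 and 1023 are taken from line 2 of LISTING 1, while the tokenizer, handler table, and function names are illustrative assumptions.

```python
# Assumed sketch of innermost-first canonic-form parsing with operator dispatch.
import re

def tokenize(text):
    # Parentheses, 'string constants', and bare atoms separated by spaces.
    return re.findall(r"\(|\)|'[^']*'|[^\s()]+", text)

def parse(tokens, pos=0):
    """Parse one list; nested lists are processed first, and their results
    replace them as parameters (the "recognized from the deepest level
    outwards" behavior described above)."""
    assert tokens[pos] == "("
    pos += 1
    items = []
    while tokens[pos] != ")":
        if tokens[pos] == "(":
            value, pos = parse(tokens, pos)    # nested list: process first
        else:
            value, pos = tokens[pos], pos + 1  # atom: passed directly
        items.append(value)
    op, args = int(items[0]), items[1:]
    return HANDLERS[op](*args), pos + 1        # "recognize" the list

# Illustrative handlers for the symbol-table operators of LISTING 1, line 2.
symbol_table = []
def entry(name):                # 1023: build one entry, return a "pointer"
    symbol_table.append(name)
    return len(symbol_table) - 1
def table(*pointers):           # 1022: group entry pointers into one table
    return list(pointers)

HANDLERS = {1023: entry, 1022: table}

line2 = "(1022(1023 today)(1023 t_c102)(1023 t_cac))"
result, _ = parse(tokenize(line2))
print(result)          # [0, 1, 2] (pointers to the three entries)
print(symbol_table)    # ['today', 't_c102', 't_cac']
```

Each entry routine runs before the table routine, exactly as in the ordering of events (i) through (iii) above.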

Now, by way of parsing the complete canonic form of LISTING 1, it is noted that the first innermost list appears on line 8 and has associated operator atom 1018.
Atom 1018 is designated as a Property Operator (PROPOP) and its function is to return the internal index, the external index and the forward link of a set. Regarding line 8, the parameters are 2, 10 and 3, respectively, as summarized by block 653 of FIG. 5(iv). There are three other innermost lists as given by lines 11, 14 and 17, respectively, and the parameters returned correspond to blocks 654, 655 and 652 of FIG. 5(iv).
The first list at the next highest level appears on line 7 and has the associated operator atom 2007, called a LISTOP (List Operator). Its function is to return a list of atoms, which in this case comprises the numbers 2, 10 and 3 as well as the strings 'gclogo', 'pf4' and 'gclogo'. Three other LISTOPs appear at this level on lines 10, 13 and 16, respectively. It is noted that the last LISTOP, which corresponds to the FINAL set or block 652 in FIG. 5(iv), has blanks in the string constant positions.
The first list at the next highest level also appears on line 7 and has the associated operator atom 1002, called a REGSETOP (Regular Set Operator). Its function is to delimit the appearance of each Regular Set


in the next highest list which is invoked by the next highest list atom 2007 on line 6. The LISTOP on line 6 returns all the information relating to lines 7 through 18 to the next level LISTOP appearing on line 3. Lines 4 and 5 also supply property information relating to the Initial Set to LISTOP 2007 of line 3. Also on line 3 appears the so-called LOOPOP (Loop Operator) having associated atom 1006. Its function is to associate all Regular Sets appearing as a result of the LISTOP processing of line 3 with this particular LOOP 0 (in this case, the only loop).
The highest level list has associated operator atom 1007, which designates the TSTFWOP (Test Flow Operator). Its function is to provide the complete test flow, including symbol table data and forward and backward links, when invoked. It is important to realize that the link data need not explicitly be set forth as part of the canonic form since this information can be deduced from the ordering and positioning of set information in the canonic form.
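One hedged illustration of that deduction: since sets appear in the canonic form in flow order, the forward and backward links can simply be read off from each set's neighbors. The patent does not spell out the algorithm, so the function below is an assumed reconstruction with illustrative set names.

```python
# Assumed sketch: links are not stored explicitly in the canonic form;
# they are recovered from the order in which the sets appear.
def deduce_links(ordered_sets):
    """Return {set_name: (backward_link, forward_link)}; None marks an end."""
    links = {}
    for i, name in enumerate(ordered_sets):
        back = ordered_sets[i - 1] if i > 0 else None
        fwd = ordered_sets[i + 1] if i + 1 < len(ordered_sets) else None
        links[name] = (back, fwd)
    return links

# Five sets in canonic-form order, mirroring the example test flow's
# Initial set, three REG sets, and Final set (names are illustrative).
links = deduce_links(["initial", "reg-2", "reg-3", "reg-4", "final"])
print(links["reg-3"])   # ('reg-2', 'reg-4')
```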

Canonic Form Representation of Set Parts

To exemplify the canonic form representation of set part data, the set parts corresponding to DISPLAY 3 and DISPLAY 6, namely set parts "3.3" and "4.5", are presented and discussed. Set part "3.3", in pertinent part, is shown in LISTING 2:


LINE    SET PART "3.3" CANONIC FORM
 1      (1027(2007
 2        (1001(1009 .clo_1) 'SLM')
 3        (1001(1009 .app) today)
 4        (1001(1009 .due_date)
 5          (2008 today 15))
 6      ))

The operator atoms are as follows: 1027 - INOP
(Input Operator); 2007 - LISTOP; 1001 - FASNOP (Field Assignment Operator); 1009 - FIOP (Field Operator); and 2008 - BINOP (Binary Operator). In parsing this canonic form, the innermost list is associated with BINOP atom 2008; its function is to perform the binary addition of the EXPRESSION variable 'today' generated at run time with the constant 15. The result is assigned, via FASNOP 1001, to .due_date, which is a defined reference.
FIOP 1009 indicates that this defined reference is to be associated with the "due-date" (DD/MDFR) field of the present format. INOP 1027 defines the lists and atoms on lines 2-6 as an INPUT DATA set part.
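The effect of lines 4 and 5 of LISTING 2, assigning today plus 15 days to .due_date, can be illustrated with ordinary date arithmetic. The MMDDYY string format follows the Ldatef("$M$D$Y") call of Appendix A; the helper name and this Python rendering are assumptions, not the patent's code.

```python
# Minimal sketch of the BINOP evaluation (2008 today 15): add a day count
# to the run-time "today" date string in MMDDYY form.
from datetime import datetime, timedelta

def add_days(mmddyy, days):
    d = datetime.strptime(mmddyy, "%m%d%y") + timedelta(days=days)
    return d.strftime("%m%d%y")

print(add_days("010187", 15))   # '011687'
```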
The canonic form for set part "4.5" is shown in LISTING 3 as follows:


LINE    SET PART "4.5" CANONIC FORM
 1      (1026(2007
 2        (1003(1011 TEST1)
 3          (2007
 4            (1010(1009 .cac) "!="
 5              t_cac)
 6          )
 7        )
 8      ))

The new operator atoms are as follows:
1026 - OUTOP (Output Operator); 1003 - TSTOP (Test Operator); 1011 - TSPROP (Test Property Operator); and 1010 - TSPTOP (Test Part Operator). In parsing this canonic form, the innermost operator 1009 indicates that the defined reference .cac is to be associated with the "cac" field (CAC/TGAC) of the present format. TSPTOP 1010 indicates that the field returned by FIOP 1009 is to be compared to variable t_cac and an error message is to be generated if the test results in a "not equal" ("!=").
The fact that a test is to be performed is indicated on line 2 by TSTOP 1003, and TSPROP 1011 enters this as TEST 1 in the test array file (see block 735 of FIG. 8) for the variable t_cac. OUTOP 1026 indicates that the set part canonic form is an OUTPUT DATA set part.
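A minimal sketch of this comparison step follows, assuming a simple relational-operator table and an illustrative error-message format (neither the table nor the message wording is specified by the patent).

```python
# Hedged sketch of the TSPTOP comparison: a screen field is compared to a
# saved variable under a relational operator, and an error record is
# produced when the flagged condition (here "not equal") holds.
import operator

OPS = {"!=": operator.ne, "==": operator.eq}

def run_test(test_name, actual_field, op, expected):
    """Return an error message if op(actual, expected) holds, else None."""
    if OPS[op](actual_field, expected):
        return f"{test_name}: field value {actual_field!r} {op} {expected!r}"
    return None

print(run_test("TEST1", "CAC0001", "!=", "CAC0002"))  # error message
print(run_test("TEST1", "CAC0001", "!=", "CAC0001"))  # None; test passes
```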
EXECUTE TEST CONTROLLER 1090

Execute test controller 1090, which has been shown generally in the block diagram of FIG. 3 and as a portion of the flow diagram in FIG. 18, is now shown in expanded flow diagram form in FIG. 20.
The so-called execution phase is initiated at the action menu level as one of the options selectable by the user. When invoked, with reference to FIG. 20, the user supplies a test name identified with the test sequence to be executed. It is possible to have many test files of the type exemplified by file 1065 resident on the storage medium associated with processor 300. The test name selected identifies the particular one to be prepared
for execution. Block 2205 of FIG. 20 displays the blank "execute screen" passed from block 2206 and then processes user input.
Presuming a valid test is named, the canonic form representation from file 1065 associated with the test name is provided to interface processor 500 of FIG. 2 by the processing of block 2210. As depicted in FIG. 2, file 1065 may be transmitted to processor 500 via channel 301. As an alternative, if processor 500 is configured with a drive to read in diskettes, file 1065 may be advantageously copied to a diskette by processor 300 and then loaded into the disk drive of processor 500 for accessing purposes. In either case, processor 500 is typically an IBM 3270 PC running under control of the 3270 PC Control Program and utilizing the High Level Language Application Program Interface.
Once processor 500 is placed in control of the processing, the set array and set part data are read into memory and parsed by block 2215 of FIG. 20. Processor 500 is configured and programmed to translate the canonic form representation of a test into screen-image format data for transmission to SUT 100 of FIG. 2 via channel 101.
Upon parsing by block 2215, execution is initiated starting with the first set. Presuming the exemplary problem represented by DISPLAYS 1-6 is the test under consideration (the first set is a loop set and there are three other sets), then decision blocks 2216 and 2217 divert the processing to block 2230.
Loop set information and variable information are obtained from the CONTROL screen for the loop set, the variables are initialized and then decision block 2216 is again invoked. Since the three REG sets remain for the example, processing by block 2235 is now activated.
The processing by block 2235 provides the INPUT DATA to the input format by extracting the appropriate set part information from file 1065. The input format has already been obtained from the SETARR processing by
block 2215. As part of the field fill-in process, certain EXPRESSIONS are evaluated within block 2240, and the results of the evaluations as well as extracted set part data are filled-in by the processing of block 2245. Also, block 2250 saves data into variables for later use as notepad-type entries. With the input preparation completed, the program function key and the input format are transmitted as a coded channel signal to SUT 100 via channel 101 as represented by processing block 2260.
SUT 100 processes the input format as any other user input since the emulation and interface processing are transparent to SUT 100.
SUT 100 responds by transmitting a block of data representative of the processing by SUT 100 and this data block is stored by processor 500 as the "actual" result from processing the input format. In order to compare the actual to the "expected" results, block 2260 builds the output format from OUTPUT DATA stored by file 1065. Also, block 2265 saves data into variables for later use as notepad-type entries. The comparison processing occurs in block 2270 wherein actual and expected results are parsed for discrepancies between them and an error message file is written if any comparison fails.
Decision block 2275 provides the next-activity strategy desired by the user. For instance, some unsuccessful test comparisons may require that processing immediately terminate (YES) whereas others may lead to continued execution (NO). If there is no need to terminate, processing by the path starting with block 2235 and ending with block 2270 is again effected through the last set or until a fatal error occurs.
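The submit-compare-decide cycle of blocks 2235 through 2275 can be sketched as follows, under the assumption that each set carries a flag telling decision block 2275 whether a failed comparison is fatal; the data layout, names, and toy system under test are all illustrative, not the patent's implementation.

```python
# Illustrative sketch of the execute-phase loop of FIG. 20.
def execute_sets(sets, submit):
    """sets: list of dicts with 'input', 'expected', and 'fatal' keys.
    submit: callable standing in for SUT processing of one input screen."""
    errors = []
    for s in sets:
        actual = submit(s["input"])               # blocks 2235-2260: submit
        if actual != s["expected"]:               # block 2270: compare
            errors.append(f"mismatch: expected {s['expected']!r}, got {actual!r}")
            if s["fatal"]:                        # block 2275, YES path: stop
                break
    return errors

# Toy SUT that upper-cases its input screen.
errs = execute_sets(
    [{"input": "ok", "expected": "OK", "fatal": True},
     {"input": "x", "expected": "y", "fatal": False},
     {"input": "z", "expected": "Z", "fatal": True}],
    submit=str.upper,
)
print(errs)   # one mismatch recorded; the NO path continued to the last set
```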
In the event that processing terminates, either by completing the submission of all the sets or as a result of unsuccessful comparisons, processing by block 2220 is invoked. The message file obtained by exercising SUT 100 is transferred to processor 300 and control is then passed to processor 300. The user may desire to see the file at this time or continue with another named test. Decision block 2221 provides this capability.
After execution, the user is transferred back to the action menu level as set forth in FIG. 18. The user is ready for another iteration or may exit the emulation process, as appropriate.
While this invention has been described and shown with reference to an illustrative embodiment thereof, it will be understood by those skilled in the art that changes in form and detail may be made therein without departing from the spirit and scope of the invention.


APPENDIX A

 1. /* initial set */
 2. #include "com defs"
 3. str today(6);
 4. str t_c102(6);
 5. str t_cac(7);
 6. #auxdef gocmsg "gocmsg.x"
 7. Lusrmain(ac,av)
 8. int ronly ac;
 9. str ronly av()[ac];
10. {
11. today = Ldatef("$M$D$Y");
12. scriptst();
13. #curmask "gclogos.z"
14. get("gclogo");
15. .rro = "DFB";
16. .password = "PASSWD";
17. Lxmit(k_pf4);
18. if ( [.msg] != "GC10101I LOGON SUCCESSFULLY COMPLETED" )
19. { error(1,10); }
20. #curmask "gcocmls.z"
21. get("gcocml");
22. .clo_1 = "SLM";
23. .order = "B";
24. .ord_type = "N";
25. .sls_orig = "TWS";
26. .customer = "TWS-DEMO";
27. .app = today;
28. .due_date = datey(today,15);
29. .wco = "WRO";
30. .sid = 11.~1;
31. .item_1 = "001";
32. .circuit_id = "1001/DF55IE/STLSMO01JM-/STLSMO02";
33. .fmt_1 = "M";
34. .action = "PA";
35. .dr = "MSGIS";
36. Lxmit(k_pf4);
37. t_c102 = [.clo_2];
38. t_cac = Lsubstr([.cac],0,7);
39. if ( [.msg] != gocmsg.GC102_2 )
40. { error(1,20); }
41. .clo_1 = "SLM";
42. .clo_2 = t_c102;
43. Lxmit(k_pf1);
44. if ( Lsubstr([.cac],0,7) != t_cac )
45. { error(1,30); }
46. /* final set */


Claims (18)

1. In combination with a process for executing an application system by operating on information submitted through an input format and by returning processed results through an output format, a method CHARACTERIZED BY THE STEPS OF:
prior to execution of the application system, emulating the output format in an autonomous emulation system and filling said emulated output format with expected results, and after execution of the application system, comparing the processed results with said expected results and then controlling further execution of the application system in accordance with said comparison.
2. In a processing system comprising a processor, a keyboard, a display device and a storage medium, a method for creating an input-output transaction executable by an autonomous, screen-driven application system, said method comprising the steps of emulating the application system input-output interface environment in the processing system environment, displaying on the device a selected one of the emulated input screen formats comprising said interface environment, supplying input information to the device from the keyboard in accordance with said displayed input format, processing said input information with the processor to obtain processed input information, and storing said processed input information in the storage medium, and displaying on the device a corresponding one of the emulated output screen formats comprising said interface environment, supplying output information to the device from the keyboard in accordance with said displayed output format, processing said output information with the processor to obtain processed output information, and storing said processed output information in the storage medium, said transaction comprising the combined information stored as a result of supplying said input and output information.
3. The method as recited in claim 2 wherein said step of emulating comprises the step of storing in the storage medium display data representative of said input formats and said output formats.
4. The method as recited in claim 3 wherein each step of displaying said emulated format comprises the step of converting within the processor said display data representative of said format to a form displayable by the device.
5. The method as recited in claim 2 wherein each step of processing information comprises the step of converting said information to a form executable by the application system.
6. The method as recited in claim 5 wherein said step of converting includes the step of translating said information to a canonic form representation.
7. The method as recited in claim 2 wherein each said step of processing information comprises the step of partitioning said information into a flow part and a data part and each step of storing comprises the steps of allocating distinct storage locations in the storage medium for said flow part and said data part and then saving said flow part and said data part in a canonic form in their respective locations.
8. The method as recited in claim 2 wherein each said step of supplying information includes the step of associating expressions executable by said processing system with selected fields comprising said formats.
9. The method as recited in claim 8 wherein each said step of supplying information further includes the step of associating variables with preselected fields comprising said formats.
10. The method as recited in claim 2 wherein each said step of supplying information to said displayed format includes the steps of requesting a window overlay comprising menu selection items, overlaying said window onto said displayed format and passing control to the processing associated with said overlay until menu selection activity is completed.
11. The method as recited in claim 2 wherein each said step of supplying information to said format further includes the steps of selecting one of the field locations comprising said displayed format, calling a window overlay to provide window data, supplying window data to said overlay, processing said window data, and upon termination of window overlay processing, storing said window data in locations of said storage medium associated with said one of the field locations.
12. The method as recited in claim 2 further comprising the steps, prior to said steps of displaying, of:
displaying a control screen on the device, and supplying control data via said control screen to designate said selected one of the input formats and said corresponding one of the output formats comprising said transaction.
13. The method as recited in claim 2 for generating a sequence of transactions further comprising the steps of repeating said first step of emulating and said second and third steps of displaying to produce said sequence.
14. The method as recited in claim 13 further comprising the steps of parsing each of said transactions to produce converted input data and converted output data, said converted data compatible with execution of the application system and said converted output data representative of expected results from processing said converted input data, submitting said converted input data to the application system for execution to produce actual results, returning said actual results to the processing system, and comparing said actual results to said expected results and controlling the processing of any remaining ones of said transactions in correspondence to said comparison.
15. A method for creating a program sequence in a form compatible with a screen-driven application system by utilizing a processing system comprising a processor, a keyboard, a display device and a storage medium, said method comprising the steps of emulating the application system input-output interface environment in the processing system environment by storing in the storage medium information representative of the input and output formats utilized by the application system, prompting the user to select, via the keyboard, a selected one of said emulated input screens and displaying on the device said selected input screen, supplying input data, via the keyboard, to the processor for storage in the medium as required by said displayed input screen, prompting the user to select, via the keyboard, a corresponding one of said emulated output screens and displaying on the device said selected output screen, and supplying output data, via the keyboard, to the processor for storage in the medium as required by said displayed output screen, said program sequence comprising the stored data resulting from said steps of supplying data.
16. A method for generating a process flow in a processing system environment utilizing sets of input and output formats corresponding to essentially identical sets of input and output formats in an application system environment, wherein the processing system comprises a processor, means for storing data and an entry/display terminal, and wherein the process flow is in a form executable by the application system, said method comprising the steps of (a) calling into view a selected one of the input formats for display on the terminal, (b) supplying selected input information to said displayed format via terminal entries, (c) storing the terminal contents of step (b), via the processor, into the storing means, (d) calling into view the output format corresponding to said selected one of the input formats for display on the terminal, (e) supplying selected output information to said displayed format in step (d) via terminal entries, (f) storing the terminal contents of step (e), via the processor, into the storing means, and (g) repeating steps (a) through (f) until the process flow is complete, said process flow comprising the successive sets of stored contents resulting from performing steps (c) and (f).
17. A method for executing an application system in the application system environment, the system comprising an interface subsystem for submitting input information to the system and for extracting output information from the system, said method comprising the steps of emulating the interface subsystem in a processing environment distinct from the application environment, formulating in said processing environment an execution sequence by utilizing program structures defined in said emulated interface subsystem, generating in said processing environment code corresponding to said execution sequence by invoking code generation software compatible with the application system, submitting said code from said processing environment to the application environment, said code now serving as the input information, and transmitting the output information resulting from the processing of said code from the application environment to said processing environment for comparison to expected responses stored in said processing environment.
18. In combination with a screen-driven application system, a processing system for generating an input-output transaction executable by the application system, said processing system comprising means for emulating the input-output interface subsystem comprising the application system, means for displaying a selected one of the emulated input screen formats comprising said interface subsystem, means for supplying input data to said input format, means for storing said input data in a form executable by the application system, means for displaying a corresponding one of the emulated output screen formats comprising said interface subsystem, means for supplying output data to said output format, and means for storing said output data in a form executable by the application system, said transaction comprising the combined executable data stored in said means for storing.
CA000530647A 1986-09-23 1987-02-26 Emulation process for creating and testing computer systems Expired - Lifetime CA1275327C (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US910,668 1986-09-23
US06/910,668 US5045994A (en) 1986-09-23 1986-09-23 Emulation process having several displayed input formats and output formats and windows for creating and testing computer systems

Publications (1)

Publication Number Publication Date
CA1275327C true CA1275327C (en) 1990-10-16

Family

ID=25429144

Family Applications (1)

Application Number Title Priority Date Filing Date
CA000530647A Expired - Lifetime CA1275327C (en) 1986-09-23 1987-02-26 Emulation process for creating and testing computer systems

Country Status (2)

Country Link
US (1) US5045994A (en)
CA (1) CA1275327C (en)

Families Citing this family (56)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5220658A (en) * 1986-03-10 1993-06-15 International Business Machines Corporation System for testing a performance of user interactive-commands using an emulator-overlay for determining the progress of the user timing response
US5410675A (en) * 1989-08-21 1995-04-25 Lisa M. Shreve Method of conforming input data to an output data structure and engine for accomplishing same
US5157782A (en) * 1990-01-31 1992-10-20 Hewlett-Packard Company System and method for testing computer hardware and software
GB9023633D0 (en) * 1990-10-31 1990-12-12 Int Computers Ltd Predicting the performance of a computer system
US5657438A (en) * 1990-11-27 1997-08-12 Mercury Interactive (Israel) Ltd. Interactive system for developing tests of system under test allowing independent positioning of execution start and stop markers to execute subportion of test script
US5640537A (en) * 1991-11-20 1997-06-17 Apple Computer, Inc. Apparatus for causing a computer system to respond to emulated user interaction in the absence of actual user interaction
US5410681A (en) * 1991-11-20 1995-04-25 Apple Computer, Inc. Interpreter for performing remote testing of computer systems
EP0616466B1 (en) * 1992-01-07 2000-06-07 Thomson Consumer Electronics, Inc. Horizontal panning for wide screen television
US5303166A (en) * 1992-04-14 1994-04-12 International Business Machines Corporation Method and system for automated network benchmark performance analysis
US5905494A (en) * 1992-08-12 1999-05-18 International Business Machines Corporation Method and system within an object oriented programming environment for enhanced efficiency of entry of operator inputs utilizing a complex object
CA2099737C (en) * 1992-09-08 1997-08-19 Terrence Kent Barrington Communications network test environment
US5600789A (en) * 1992-11-19 1997-02-04 Segue Software, Inc. Automated GUI interface testing
US5363501A (en) * 1992-12-22 1994-11-08 Sony Electronics, Inc. Method for computer system development verification and testing using portable diagnostic/testing programs
DE4321054C2 (en) * 1993-06-24 1995-12-14 Siemens Ag Procedures to support test routines
EP0660235A1 (en) * 1993-12-22 1995-06-28 International Business Machines Corporation Method for automated software application testing
WO1995025304A1 (en) * 1994-03-14 1995-09-21 Green Hills Software, Inc. Optimizing time and testing of higher level language programs
JPH07262025A (en) * 1994-03-18 1995-10-13 Fujitsu Ltd Execution control system
US5831608A (en) * 1994-04-21 1998-11-03 Advanced Transition Technologies, Inc. User interface for a remote terminal
US5500941A (en) * 1994-07-06 1996-03-19 Ericsson, S.A. Optimum functional test method to determine the quality of a software system embedded in a large electronic system
US5689705A (en) * 1995-02-13 1997-11-18 Pulte Home Corporation System for facilitating home construction and sales
US5687224A (en) * 1995-07-26 1997-11-11 Alley, Jr.; Willard Kent Telecommunications circuit provisioning and administration system
US5781449A (en) 1995-08-10 1998-07-14 Advanced System Technologies, Inc. Response time measurement apparatus and method
US5983001A (en) * 1995-08-30 1999-11-09 Sun Microsystems, Inc. Method and system for facilitating the automatic creation of test scripts
US5860071A (en) * 1997-02-07 1999-01-12 At&T Corp Querying and navigating changes in web repositories
US5754760A (en) * 1996-05-30 1998-05-19 Integrity Qa Software, Inc. Automatic software testing tool
US5968119A (en) * 1996-12-09 1999-10-19 Wall Data Incorporated Method of accessing information of an SNA host computer from a client computer using a specific terminal emulation
US5911059A (en) * 1996-12-18 1999-06-08 Applied Microsystems, Inc. Method and apparatus for testing software
US6065041A (en) * 1997-09-18 2000-05-16 Electronics For Imaging, Inc. Interface code architecture
US6286131B1 (en) * 1997-12-03 2001-09-04 Microsoft Corporation Debugging tool for linguistic applications
US6120298A (en) * 1998-01-23 2000-09-19 Scientific Learning Corp. Uniform motivation for multiple computer-assisted training systems
US6293801B1 (en) 1998-01-23 2001-09-25 Scientific Learning Corp. Adaptive motivation for computer-assisted training system
US6208640B1 (en) 1998-02-27 2001-03-27 David Spell Predictive bandwidth allocation method and apparatus
US6067638A (en) * 1998-04-22 2000-05-23 Scientific Learning Corp. Simulated play of interactive multimedia applications for error detection
US6113645A (en) * 1998-04-22 2000-09-05 Scientific Learning Corp. Simulated play of interactive multimedia applications for error detection
US6477571B1 (en) 1998-08-11 2002-11-05 Computer Associates Think, Inc. Transaction recognition and prediction using regular expressions
WO2000019664A2 (en) * 1998-09-30 2000-04-06 Netscout Service Level Corporation Managing computer resources
US6308146B1 (en) 1998-10-30 2001-10-23 J. D. Edwards World Source Company System and method for simulating user input to control the operation of an application
US6453435B1 (en) * 1998-12-29 2002-09-17 Fujitsu Network Communications, Inc. Method and apparatus for automated testing of circuit boards
US6321198B1 (en) * 1999-02-23 2001-11-20 Unisys Corporation Apparatus for design and simulation of dialogue
US6591377B1 (en) * 1999-11-24 2003-07-08 Unisys Corporation Method for comparing system states at different points in time
US20020032538A1 (en) * 2000-05-09 2002-03-14 Lee Young-Seok Software test system and method
US7949564B1 (en) * 2000-05-31 2011-05-24 Western Digital Technologies, Inc. System and method of receiving advertisement content from advertisers and distributing the advertising content to a network of personal computers
US7596484B1 (en) 2000-11-15 2009-09-29 Itt Manufacturing Enterprises, Inc. Network node emulator and method of node emulation
US7337429B1 (en) 2000-11-28 2008-02-26 International Business Machines Corporation Application system certification process
US7426052B2 (en) * 2004-03-29 2008-09-16 Dell Products L.P. System and method for remotely building an information handling system manufacturing image
US7499405B2 (en) * 2005-06-28 2009-03-03 International Business Machines Corporation Method for testing branch execution and state transition logic in session initiation protocol application modular components
US20070050676A1 (en) * 2005-08-24 2007-03-01 Suresoft Technologies Inc. Software testing device and method, and computer readable recording medium for recording program executing software testing
US7889953B2 (en) * 2007-03-28 2011-02-15 Dell Products L.P. System and method for managing images using parent-child relationship
US9740363B2 (en) * 2013-10-02 2017-08-22 Velocity Technology Solutions, Inc. Methods and systems for managing community information
EP3679458A1 (en) * 2017-10-11 2020-07-15 Google LLC Keyboard input emulation
US10769056B2 (en) 2018-02-26 2020-09-08 The Ultimate Software Group, Inc. System for autonomously testing a computer system
US10977155B1 (en) 2018-05-31 2021-04-13 The Ultimate Software Group, Inc. System for providing autonomous discovery of field or navigation constraints
US10599767B1 (en) 2018-05-31 2020-03-24 The Ultimate Software Group, Inc. System for providing intelligent part of speech processing of complex natural language
US11113175B1 (en) 2018-05-31 2021-09-07 The Ultimate Software Group, Inc. System for discovering semantic relationships in computer programs
US11010284B1 (en) 2018-05-31 2021-05-18 The Ultimate Software Group, Inc. System for understanding navigational semantics via hypothesis generation and contextual analysis
US10747651B1 (en) 2018-05-31 2020-08-18 The Ultimate Software Group, Inc. System for optimizing system resources and runtime during a testing procedure

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4458331A (en) * 1981-10-09 1984-07-03 International Business Machines Corporation Interactive display terminal with alternating data processing and text processing sessions with text processing status line operable during data processing session
US4604710A (en) * 1981-10-09 1986-08-05 International Business Machines Corporation System for converting data processing information to text processing format and vice versa
US4570217A (en) * 1982-03-29 1986-02-11 Allen Bruce S Man machine interface
US4509122A (en) * 1982-11-18 1985-04-02 International Business Machines Corporation Method for controlling the file transfer capability of an interactive text processing system that is emulating a host processing system terminal
US4601010A (en) * 1983-12-19 1986-07-15 Briscoe Robert J Converter device for a computer terminal
US4727480A (en) * 1984-07-09 1988-02-23 Wang Laboratories, Inc. Emulation of a data processing system

Also Published As

Publication number Publication date
US5045994A (en) 1991-09-03

Similar Documents

Publication Publication Date Title
CA1275327C (en) Emulation process for creating and testing computer systems
US6360332B1 (en) Software system and methods for testing the functionality of a transactional server
US6587969B1 (en) Software system and methods for testing the functionality of a transactional server
US6959431B1 (en) System and method to measure and report on effectiveness of software program testing
US5371883A (en) Method of testing programs in a distributed environment
AU722149B2 (en) Determination of software functionality
DE60212372T2 (en) SYSTEM AND METHOD FOR CREATING DIAGNOSTICS OF A PORTABLE DEVICE
US5485615A (en) System and method of interactively developing desired computer programs by using plurality of tools within a process described in graphical language
US7367017B2 (en) Method and apparatus for analyzing machine control sequences
US6654950B1 (en) Software rehosting system and method
US6993748B2 (en) Systems and methods for table driven automation testing of software programs
US5598511A (en) Method and apparatus for interpreting data and accessing on-line documentation in a computer system
US20030070120A1 (en) Method and system for managing software testing
US7107182B2 (en) Program and process for generating data used in software function test
EP1085418A2 (en) Method and system for testing behaviour of procedures
US20020029377A1 (en) System and method for developing test cases using a test object library
US5390131A (en) Apparatus and method for displaying wafer test results in real time
US5940617A (en) Debugger for controlling execution of software installed in object to be controlled on the basis of state transition model, debugging method thereof, record medium thereof, and method for correlating function specifications and code addresses
Bennett et al. A transformation system for maintenance-turning theory into practice
White et al. Test manager: A regression testing tool
CN111782539A (en) Test and diagnosis integrated development platform based on domestic operating system
Harrold et al. Aristotle: A system for development of program analysis based tools
US5381344A (en) Apparatus and method for obtaining a list of numbers of wafers for integrated circuit testing
CA1267229A (en) Reconfigurable automatic tasking system
CN113505061B (en) Automatic test software platform

Legal Events

Date Code Title Description
MKLA Lapsed