
TEAM

Test Execution and Test Management for Numerical Control Software

Best Practice Action IST-1999-20333

Deliverable D-3.2

Author(s): Joachim Mayer, Andreas Grosse, Thomas Bürger

Type: Deliverable, Experience report

Activity: WP 4.2; 4.3; 4.4 Set up of constructive QA activities

Date: 06.12.01

Status: Released

Name of document: D3_2_TEAM20333_V1.doc

Availability: IST


Table of Contents

1 Abstract
2 Objectives of WP-4
3 Standard forms for test execution (WP 4.2)
3.1 CM as basis for test execution
3.2 Preparation of standard forms at ISG
4 Realisation of specified constructive QA activities (WP 4.3)
4.1 Integration of INSURE++ into software development process
4.1.1 Appropriate configuration and use of INSURE++
4.1.2 Various test cases with INSURE++
4.1.3 Use of INSURE++ within the development process of ISG
4.2 Integration of regression test into software development process
4.2.1 Test program data base
4.2.2 Use of different configuration list sets
4.2.3 Periodical run of automatic test for bugfix versions
4.2.4 Integration of test environment into source control
4.2.5 Test of release versions
4.2.6 Set-up of an automatic test environment
4.3 Integration of already existing test methods into software development process
5 Validation of constructive QA activities (WP 4.4)
6 Glossary
7 References


1 Abstract

The submitted paper contains the results of Workpackages 4.2, 4.3 and 4.4 (WP-4.2, -4.3, -4.4) of the IST project IST-1999-20333. The work performed in these WPs is described in the present "Experience report", as planned in the Description of Work (DOW) /1/ (WP-4).

2 Objectives of WP-4

WP-4 contains all activities related to the set-up of constructive QA activities. Generally, WP-4 is subdivided into four subtasks:

• WP-4.1 Specification of constructive QA activities

• WP-4.2 Preparation of standard forms for test execution

• WP-4.3 Realisation of spec. constructive QA activities

• WP-4.4 Validation of constructive QA activities

According to the specified integration of the constructive QA activities into the software development process (WP-4.1, described in Deliverable D3.1 /2/), particular standard forms for test execution and documentation were defined and realised within WP-4.2. The specified constructive QA activities were put into practice within WP-4.3. Their validation was done within WP-4.4.


3 Standard forms for test execution (WP 4.2)

3.1 CM as basis for test execution

The purpose of the TEAM project is the optimisation of the QA-related activities of the software development process at ISG. This is done by introducing an established and precisely described QA method, the so-called QA sub-model of the V-Model /3/. To ensure the suitability of this method for ISG's demands, the required QA products and activities provided within this method were selected in a so-called "tailoring phase". By adapting the selected activities and products to the QA activities and products already existing in ISG's software development process, the necessary extensions and optimisations of ISG's QA activities were carried out.

For effective QA an appropriate CM tool is necessary. At ISG such a CM tool (ClearCase from Rational) was introduced into the software development process within the InCoMM project.

ClearCase is a comprehensive software version control and configuration management system, designed for development teams working in a local network. It supports a model of parallel software development in which elements are branched and merged.

With this tool, different variants for development purposes or error removal can be defined in a suitable way by defining different views. The definition of views is supported by assigning attributes to the individual software elements. Furthermore, the tool offers ISG enough possibilities to describe existing ISG-internal processes with the aid of the mechanisms it provides.

Figure 1 shows the incremental structure of a parallel software development process with different variants. The starting point is a stable, working version of the application, called the "mainstream version" at ISG. This version is frozen by assigning a version name (label). In general, different kinds of work should be performed on different development streams, called branches (variants). For each software change activity at ISG (functional extension, bugfix or change request, functional improvement /2/) a separate branch is created.

With this strategy the different software changes are isolated from each other. Projects with higher priority can be carried out independently of other developments.

Each project responsible has the authorisation in ClearCase to create and administer his own branches for his team before development begins. A developer or a team gets access to such a branch exclusively through a view, which is also made available by the project responsible. Each developer therefore works in his own view, editing programs, building software, testing, and so on. Although the views are separate, this produces a development environment in which developers on the same team:

• see the development tree in the same way,

• are totally isolated from work performed on other branches,

• are isolated from each other when they make changes,

• share each other's source code changes as soon as they save their changes in their common branch.

Some developers might belong to multiple teams. Such developers can switch views, depending on their current work.

For example, branches for customer-specific versions, bugfixes, functional software developments and integrations can be defined. After the conclusion of a development, the software changes are integrated by merging them into the mainstream or into other customer-specific variants. Into the mainstream in particular, software changes may only be integrated by a merge. The result is a new software version with a new label, which is the basis for further developments.


ClearCase allows product enhancements to be integrated incrementally and frequently. The more frequently new versions are created, the easier the tasks of merging parallel development work and testing the result become.

[Figure: branch diagram showing the mainstream with branch nodes (labels), a new-function branch, bugfix branches of label n and label n-1, a customer branch of label n, changes of source (development) and merges of the changes back into the mainstream.]

Fig. 1: Incremental structure of the software change activities within the CM tool ClearCase.

3.2 Preparation of standard forms at ISG

The different software change activities at ISG (functional extension, error removal or change request, functional improvement) must be supported by corresponding testing methods. To ensure an understandable and safe test process, standard forms for test execution must be defined. The following table (figure 2) gives an overview of the available testing methods and their use during the change activities. Some test methods are mandatory; all other test methods are optional, and their use depends on the functional changes.


[Table: required and optional testing methods during the development process. For each of the three change processes (functional enhancement / new development, bugfix / change request, functional improvement) the testing methods are assigned to the development phases and marked as MANDATORY or OPTIONAL. Methods and phases: docu review (specification); debugging, code review, test programs, manual block, INSURE++ static analysis, dynamic analysis, memory test, code coverage (coding); grafic tool, PLC simulation, HMI simulation, handwheel simulation (coding / integration); automatic test, machine test (integration).]

Fig. 2: Table of the required and optional testing methods during development process

Besides the standard forms for the already existing test methods described in /4/, standard forms must be defined for the commercial tool INSURE++ and for the regression test.

The use of both tools is advisable during integration, before and after the merge of a branch, because here the specific strengths of the tools can be exploited very efficiently /5/. Besides the static analysis during the coding phase, especially the dynamic analysis, memory test and code coverage of INSURE++ can improve the quality of software changes before and after an integration. So it seems useful to make this tool available to all developers and to prescribe in particular the use of static and dynamic code analysis. The regression test in its current realisation is useful for executing functional tests after integration into a mainstream or customer-specific version. INSURE++ can additionally be used for dynamic tests (figure 3).


[Figure: branch diagram as in fig. 1, annotated with "Insure" and "Regression test complemented by Insure" to mark where each kind of test is applied.]

Fig. 3: General fields of application for test executions with INSURE++ and regression test.


4 Realisation of specified constructive QA activities (WP 4.3)

4.1 Integration of INSURE++ into software development process

To increase software quality it is necessary to check each change during the development process. Besides the code reviews, an "automatic" check by a tool like INSURE++ can be integrated.

INSURE++ allows instrumented and non-instrumented programs to be checked at run time. Because INSURE++ source code instrumentation gets deeper inside the code being checked, instrumented code allows more complicated and subtle errors to be detected.
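As an illustration (a minimal stand-alone C sketch, not taken from the ISG NC-kernel), the following program compiles cleanly and usually appears to work, yet contains exactly the kind of subtle error that run-time instrumentation reports: a read one element past the end of a heap block.

#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    int *values = malloc(4 * sizeof *values);
    if (values == NULL)
        return 1;

    for (int i = 0; i < 4; i++)
        values[i] = i;

    int sum = 0;
    for (int i = 0; i <= 4; i++)      /* off-by-one: values[4] is past the end */
        sum += values[i];

    printf("sum = %d\n", sum);
    free(values);
    return 0;
}

A non-instrumented run normally prints some value and exits; the instrumented run reports the out-of-bounds read together with the allocation site and stack trace.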

4.1.1 Appropriate configuration and use of INSURE++

For simple parallel management of different configurations, a new INSURE++ configuration should be added to the project. This allows switching back and forth between instrumented and non-instrumented versions, just as between debug and release versions.

Runtime errors can be detected and fixed in the instrumented version, and the result can be checked in the release configuration, or even vice versa. For a short test, a single file can also be temporarily instrumented in the original debug configuration.

General settings are defined in the control panel of INSURE++. To manage different configurations, the checkbox "instrument all build in Visual C++" should not be selected by default. In this way the switch between the different configurations (instrumented and non-instrumented) can be chosen explicitly through the corresponding button on the toolbar.
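Since both configurations are built from the same sources, code that needs to behave differently under instrumentation can test the __INSURE__ preprocessor symbol, which is also used in the code fragments later in this report. A small hedged sketch (the timeout values and messages are invented for illustration only):

#include <stdio.h>

#ifdef __INSURE__
#define WATCHDOG_TIMEOUT_MS  60000   /* instrumented code runs much slower */
#else
#define WATCHDOG_TIMEOUT_MS   2000   /* normal debug/release build */
#endif

int main(void)
{
#ifdef __INSURE__
    puts("instrumented build (INSURE++)");
#else
    puts("regular build");
#endif
    printf("watchdog timeout: %d ms\n", WATCHDOG_TIMEOUT_MS);
    return 0;
}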


All files, and also the link step, can be instrumented by an INSURE++ build or rebuild action.

For explicit checking or review of newly developed functionality, single files can be instrumented manually. This is an option for short tests, because it is not as time-consuming as instrumenting all files of the project.

Instrumented and non-instrumented files can also be mixed. In this case, however, it is not guaranteed that INSURE++ detects the same errors as with full instrumentation, even if the file where the error occurred has been instrumented.
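The following sketch (file and function names are invented, and the behaviour described is only one plausible way in which detection can be lost) illustrates the problem: if the allocating module is not instrumented, the checker may not know the size of the block that the instrumented module overflows.

/* ---- module_a.c: assumed to be compiled WITHOUT instrumentation ---- */
#include <stdlib.h>
#include <string.h>

char *make_name(void)
{
    char *buf = malloc(8);              /* block size known only here     */
    if (buf != NULL)
        strcpy(buf, "spindle");         /* 7 characters + '\0' fill it up */
    return buf;
}

/* ---- module_b.c: assumed to be compiled WITH instrumentation -------- */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

char *make_name(void);

int main(void)
{
    char *name = make_name();
    if (name == NULL)
        return 1;

    /* Overflow: "-axis" does not fit into the 8-byte block. With full
     * instrumentation the checker knows the block size and reports the
     * write; if the allocating module was not instrumented, the same
     * defect may go unreported.                                          */
    strcat(name, "-axis");
    puts(name);

    free(name);
    return 0;
}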


4.1.2 Various test cases with INSURE++

Static test

Through static tests during compilation, INSURE++ can detect essential errors. These errors must be examined by the respective software developer, and the QA responsible must initiate subsequent actions. All recognised errors must be entered into the change management data base.

If a report has no essential meaning or requires no immediate bugfix, the report message of INSURE++ can be suppressed. In some cases INSURE++ reports an error although it is obviously not an error, or INSURE++ itself reports an internal problem (PARSER_ERROR). Suppression can be chosen for each file or error message individually; global settings are also possible.

E.g. such an error report can be disabled via the suppressions control panel.


The configuration of suppressed report messages can be saved as an INSURE++-specific setting. This setting must be saved together with each version of a customer branch. With this constant precondition the same log files can be reproduced, so that changes and new errors can be detected easily.

Dynamic test (start-up)

After the static test during compilation of the files, a first dynamic test is performed during start-up of the NC-kernel. For reliability reasons, in numerical controls the allocation of memory from the operating system is limited to the NC start-up sequence. By checking the start-up with INSURE++, basic NC-configuration errors and the resulting memory leaks can be detected. A short start-up/shutdown sequence lists the memory leaks in a report file. Depending on the real-time operating system used as the target system, memory leaks must be eliminated.

Runtime: Executed "target", pid=640

************************** INSURE SUMMARY ************************* v6.0 **
*  Program      : target
*  Arguments    : Not available
*  Directory    : D:\cc\v254.team\nc-kernel\awd_isg\pc85\make\obj_Insure
*  Compiled on  : Nov 22, 2001 15:37:07
*  Run on       : Nov 23, 2001 10:17:10
*  Elapsed time : 01:43:43
*  Malloc HWM   : 8713469 bytes (8509K)
***************************************************************************
252 outstanding memory references for 8686534 bytes (8482K).

Leaks detected at exit
----------------------
528 bytes   1 chunk   allocated at term_hdl.c, 354
    malloc()  (interface)
    isg_set_terminate_handler()  d:\cc\v254.team\nc-kernel\std\isg_util\term_hdl.c, 354
    main()  zst_main.c, 3167
116 bytes   1 chunk   allocated at threadex.c, 108
    _calloc_dbg()  (interface)
    _beginthreadex()  threadex.c, 108
    _beginthreadex()  (interface)
    isg_thread_create()  d:\cc\v254.team\nc-kernel\std\isg_util\os_spez\os_win32.c, 825
    isg_set_terminate_handler()  d:\cc\v254.team\nc-kernel\std\isg_util\term_hdl.c, 371
    main()  zst_main.c, 3167

Outstanding allocated memory
----------------------------
3953144 bytes   5 chunks   allocated at os_mem.c, 153
    calloc()  (interface)
    isg_alloc()  d:\cc\v254.team\nc-kernel\std\fblock\util\os_mem.c, 153
    alloc()  d:\cc\v254.team\nc-kernel\std\fblock\util\nc_util.c, 4426
    bf_install()  d:\cc\v254.team\nc-kernel\std\fblock\konfig\konf_all.c, 752
    list_step_intpr()  d:\cc\v254.team\nc-kernel\std\fblock\konfig\konfig.c, 537
    konf_ausfuehren()  d:\cc\v254.team\nc-kernel\std\fblock\konfig\konf_abl.c, 448
    konf_gestartet__gestartet()  d:\cc\v254.team\nc-kernel\std\fblock\konfig\konf_abl.c, 1033
    konf_ablst()  d:\cc\v254.team\nc-kernel\std\fblock\konfig\konf_abl.c, 1717
    fb_ablaufst()  d:\cc\v254.team\nc-kernel\std\fblock\fbbf\fb_ablau.c, 279
    task_rnd()  d:\cc\v254.team\nc-kernel\awd_isg\isg\c_rund\task_rnd.c, 797
    start_up()  zst_main.c, 2600
    _threadstartex()  threadex.c, 212
1625592 bytes   11 chunks   allocated at os_mem.c, 153
    calloc()  (interface)
    isg_alloc()  d:\cc\v254.team\nc-kernel\std\fblock\util\os_mem.c, 153
    alloc()  d:\cc\v254.team\nc-kernel\std\fblock\util\nc_util.c, 4426
    bf_install()  d:\cc\v254.team\nc-kernel\std\fblock\konfig\konf_all.c, 752
    list_step_intpr()  d:\cc\v254.team\nc-kernel\std\fblock\konfig\konfig.c, 537
    konf_ausfuehren()  d:\cc\v254.team\nc-kernel\std\fblock\konfig\konf_abl.c, 448
    konf_gestartet__gestartet()  d:\cc\v254.team\nc-kernel\std\fblock\konfig\konf_abl.c, 1033
    konf_ablst()  d:\cc\v254.team\nc-kernel\std\fblock\konfig\konf_abl.c, 1717
    fb_ablaufst()  d:\cc\v254.team\nc-kernel\std\fblock\fbbf\fb_ablau.c, 279
    task_int()  d:\cc\v254.team\nc-kernel\awd_isg\isg\c_int\task_int.c, 696
    start_up()  zst_main.c, 2599
    _threadstartex()  threadex.c,
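The allocation pattern behind such a report can be reproduced with a small stand-alone sketch (invented names, not the ISG code): all dynamic memory is requested during start-up, and anything that is not released in a short start-up/shutdown run shows up in the exit report of the leak checker.

#include <stdlib.h>
#include <string.h>

typedef struct {
    double *positions;                  /* per-axis data, kept for the whole run */
    size_t  count;
} axis_table;

static axis_table *g_axes;

/* All dynamic memory is requested here, as in the NC start-up sequence. */
static int start_up(size_t axis_count)
{
    g_axes = calloc(1, sizeof *g_axes);
    if (g_axes == NULL)
        return -1;

    g_axes->positions = calloc(axis_count, sizeof *g_axes->positions);
    g_axes->count     = axis_count;

    /* Scratch buffer used only during configuration; freeing it is
     * "forgotten" on purpose, so a leak checker reports it at exit.      */
    char *scratch = malloc(512);
    if (scratch != NULL)
        memset(scratch, 0, 512);

    return (g_axes->positions != NULL) ? 0 : -1;
}

static void shut_down(void)
{
    free(g_axes->positions);
    free(g_axes);
}

int main(void)
{
    if (start_up(3) != 0)               /* short start-up ...        */
        return 1;
    shut_down();                        /* ... and shutdown sequence */
    return 0;
}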

Test during individual time interval

To detect run-time errors in individual time intervals, the code may be modified in a special way to enable or disable the checking:

#ifdef __INSURE__
    _Insure_set_option("runtime", "off");
    /* ... code not to be tested ... */
    _Insure_set_option("runtime", "on");
#endif

To switch INSURE++ options, an NC interface can be provided which enables turning options on or off via the user interface or even through an NC program command. E.g. if the run-time check shall be turned on automatically at the start of a special program, the following code can be used:

Start of NC program:

#ifdef __INSURE__
    if (strcmp("insure.nc", &nc_program[0]) == 0)
        _Insure_set_option("runtime", "on");
#endif

End of NC program:

#ifdef __INSURE__
    _Insure_set_option("runtime", "off");
#endif
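Taken together, the two fragments can be folded into a pair of small helper functions that are called at program start and program end. This is only a sketch under the assumption that the current NC program name is passed in as a string and that the test program is named insure.nc; as in the fragments above, _Insure_set_option is available only in the instrumented build.

#include <string.h>

/* Called when an NC program is started; nc_program is assumed to hold
 * the program name (e.g. "insure.nc").                                 */
void insure_check_on_program_start(const char *nc_program)
{
#ifdef __INSURE__
    if (strcmp("insure.nc", nc_program) == 0)
        _Insure_set_option("runtime", "on");
#else
    (void)nc_program;                   /* no instrumentation: nothing to do */
#endif
}

/* Called when the NC program has finished. */
void insure_check_on_program_end(void)
{
#ifdef __INSURE__
    _Insure_set_option("runtime", "off");
#endif
}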


Dynamic test with display of actual system context

In some cases it is very useful to stop the program execution when INSURE++ reports an error. This can be achieved by setting a breakpoint on INSURE++'s trap function:

_Insure_trap_error

After stopping in case of an error, the contents of the relevant variables can be inspected and the following program sequence can be stepped through.

E.g. after a read overflows memory, it is not sufficient to see only the stack information and the report message of INSURE++; the values of the variables involved must also be visible. After catching the INSURE++ trap, a quick watch of the necessary variables visualises this information.

Runtime: Executing "target", pid=1117 [3 suppressed]
>> memcpy( p_dest, &p_descr->value, p_descr->size);

Reading overflows memory: <argument 2>
    bbbbbbbbbbbbbbbbbbbbbbbb | 28 | 136 | 120 | rrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr
    Reading    (r) : 0x03fcf758 thru 0x03fcf857 (256 bytes)
    From block (b) : 0x03fcf73c thru 0x03fcf7df (164 bytes)
                     description, declared at od_intpr.c, 3659
Stack trace where the error occurred:
    memcpy()  (interface)
    oam_init_container()  d:\cc\v254.team\nc-kernel\std\fblock\util\oam_lib.c, 7080
    binary_container_init()  d:\cc\v254.team\nc-kernel\std\fblock\util\od_intpr.c, 3706
    binary_containers_init()  d:\cc\v254.team\nc-kernel\std\fblock\util\od_intpr.c, 3760
    hmi_list_default()  d:\cc\v254.team\nc-kernel\awd_isg\isg\sys_abls\objects.c, 3435
    oam_interpreter()  d:\cc\v254.team\nc-kernel\std\fblock\util\oam_lib.c, 8156
    icp_manager()  d:\cc\v254.team\nc-kernel\std\kommu\cp_inter.c, 1995
    prcs_call()  d:\cc\v254.team\nc-kernel\std\kommu\cp_prcs.c, 677
    hmi_ablst()  d:\cc\v254.team\nc-kernel\std\sda\hmi\hmi_abls.c, 2515
    fb_ablaufst()  d:\cc\v254.team\nc-kernel\std\fblock\fbbf\fb_ablau.c, 279
    task_com()  d:\cc\v254.team\nc-kernel\awd_isg\isg\fb_com\task_com.c, 598
    start_up()  zst_main.c, 2601
    _threadstartex()  threadex.c, 212
    0x77f04ede()


Run-time check of individual functionality

To cover individual functions with a test case (e.g. newly developed functionality), the test can be executed with a test-specific program. Such a test program generates an INSURE++ test report, e.g.:

% R_TEST (dec00693.nc)
N10 P2= 1 RP2= 5 (R1 also permitted for radius)
N15 $IF R1 != 5
N16 #MSG ["Error reading/writing 'R'"]
N17 $ENDIF
N20 R1= 5 P2 = R1
N25 $IF P2 != 5
N26 #MSG ["Error reading/writing 'R'"]
N27 $ENDIF
N30 RR1 (equal to R=R1=R)
N31 R=R1
N32 R1=10
N33 XR YR1
N40 G17 G90 G01 X0 Y0 F25000
N50 G02 X100 R50
N60 G03 X200 R51
N70 G01 X0 Y0
N80 G02 R=52
N90 X100
N100 G01 X0 Y0
N110 R1=53
N120 G02 X100
N130 G03 X200
N140 G01 X0 Y0
N150 G02 X100 R1=-54
N160 G03 X200
M30

Runtime: Executing "target", pid=983
>> i_hash = (SGN16) ((64 * i_hash + p_param->index[dim]) % hash_size);

Reading array out of range: p_param->index[dim]
    Index used : -1
    Valid range: 0 thru 3 (inclusive)
Stack trace where the error occurred:
    calc_param_hash_index()  d:\cc\v254.team\nc-kernel\std\sda\decoder\parm_man.c, 1928
    dec_search_param()  d:\cc\v254.team\nc-kernel\std\sda\decoder\parm_man.c, 603
    p_funktion()  d:\cc\v254.team\nc-kernel\std\sda\decoder\din_fkt2.c, 4099
    ascii_auswerten()  d:\cc\v254.team\nc-kernel\std\sda\decoder\dec_ausw.c, 746
    ef_decoder()  d:\cc\v254.team\nc-kernel\std\sda\decoder\decoder.c, 1235
    dec_gestartet_bereit()  d:\cc\v254.team\nc-kernel\std\sda\decoder\dec_abls.c, 3111
    dec_ablst()  d:\cc\v254.team\nc-kernel\std\sda\decoder\dec_abls.c, 4670
    fb_ablaufst()  d:\cc\v254.team\nc-kernel\std\fblock\fbbf\fb_ablau.c, 279
    task_rnd()  d:\cc\v254.team\nc-kernel\awd_isg\isg\c_rund\task_rnd.c, 797
    cnc_thread()  zst_main.c, 972
    _threadstartex()  threadex.c, 212


4.1.3 Use of INSURE++ within the development process of ISG

Cyclic checks of Mainstream

The mainstream branch is used as the root branch for all development branches. It is therefore important to keep it at high quality by executing cyclic tests with INSURE++. The period of the checks depends on

• the amount of changes within the mainstream,

• the number of merges into the mainstream,

• and the setting of labels when new branches are taken out of a mainstream node.

E.g. if a new common node is defined by a label on the mainstream, the quality of this node must be guaranteed.

[Figure: branch diagram of the mainstream with its bugfix and customer branches; "Toolcheck" markers indicate the cyclic INSURE++ checks of the mainstream nodes.]

Fig. 4: INSURE++ checking of the ISG-mainstream branch.


Cyclic checks of bugfixes

Each release of a customer variant must be maintained and kept error-free. Any critical error found by ISG itself or reported by another customer is reported to the customer. After being informed about an error, the customer can request the bugfix. The bugfix itself must be performed in a common branch in order to provide it to all other variants affected by this error. To remove an error, the bugfix branch is merged into the individual customer release, incrementally or completely. Therefore it is important to check the bugfix branch cyclically with INSURE++.

[Figure: branch diagram with the mainstream, the bugfix branch of label n and the customer branch of label n; "Toolcheck" markers at intervals ∆t indicate the cyclic INSURE++ checks of the bugfix branch.]

Fig. 5: Cyclical checks of bugfix branch at ISG.


Checking before merges

Experience has shown that each development of a new function brings a large number of new errors. Before the new function is merged into other variants it has to be checked, and the errors must be removed as early as possible within the whole development process. This prevents errors from being copied to other variants. Basically, each merge can introduce existing errors from the source variant into the destination variant.

[Figure: branch diagram as in fig. 1; "Toolcheck" markers indicate the INSURE++ check points on the branches before their changes are merged.]

Fig. 6: Checks before merges at ISG


Checking after merges

New software errors can arise especially through merges: if files have been modified in two parallel branches at the same time and must be combined, even an automatic merge tool has problems, and in some cases a manual merge is necessary. The following errors can occur:

• wrong merge management (erasing the wrong parts),

• unconsidered functional side effects,

• taking over the wrong code part in a manual merge.

[Figure: branch diagram as in fig. 1; "Toolcheck" markers indicate the important INSURE++ check points after merges.]

Fig. 7: Important check points after merges at ISG


4.2 Integration of regression test into software development process

For the regression test, ISG developed an automatic test environment. This has led to a big improvement of NC test execution, because it is possible to execute a large number of test programs very fast. The time needed to detect and localise errors within the software components can thus be reduced.

The automatic test environment was installed manually for a software version. The components needed for an automatic test are as follows:

\v252_bugfix\nc-kernel\      (ISG NC-kernel software
    std\...                   that must be tested)
    error\...
    test\...
    awd_isg\
    5ax\
exe\                         (Executables)
    kern.exe
    ahmi.exe
    ...
script\                      (Test scripts)
    autotest.lst
    ...
listen\                      (Configuration
    achsmds1.lis              of NC-kernel)
    achsmds2.lis
    achsmds3.lis
    ...
prg\                         (NC-test programs)
    dec00001.nc
    dec00002.nc
    bav00003.nc
    ...
output\                      (Output files:)
    autotest.log             (Detailed logfile)
    autotest.lg1             (Logfile in short)
    dec00001.dec             (Printout of
    dec00002.dec              generated
    bav00003.bav              channel blocks)
    ...

Fig. 8: Components of the automatic test

For some test projects the developers have to install the automatic test environment on their computers. Before and after each new development or other source modification, e.g. a bugfix, these developers have to run the automatic test (see fig. 9). Afterwards the improvement, and also undesirable side effects on other functions, can be detected quickly by comparing the produced output files with a common file compare tool.


[Figure: test sequence for a source modification: a first automatic test produces output 1, the development (source changes) is carried out, a second automatic test produces output 2, and the two outputs are checked against each other.]

Fig. 9: Test sequence for source modifications

This test procedure always depends on the number of NC-test programs that are used. The aim was therefore to extend the number of test programs used, so that in the end all available functions of the ISG software version can be tested. For each new development and also for the test of bugfixes all developers must create NC-test programs. These NC-test programs, however, were not always collected by a person responsible for the test procedures. So this process was initiated, and new NC-test programs are now added to the automatic test script file periodically.

4.2.1 Test program data base

For better handling of the large number of test programs, a NOTES data base was developed (see fig. 10 - 13).

Fig. 10: Test program data base

For each test program, a data sheet with the description of the tested function, the NC-commands used and other properties for the automatic test is entered. If available, pictures of the graphical output are also added.


Fig. 11: Data sheet of a test program (part 1), description and how to use.

Fig. 12: Data sheet of a test program (part 2), listing of used NC-commands.

Fig. 13: Data sheet of a test program (part 3), enclosed test program file and, if available, the graphical reference output picture of the test program.

Because of the large number of files and the related large data storage size, the test programs are not stored on the computers of the developers. The files were moved onto a server disc. After changes in the configuration lists of the NC-kernel software, all developers can use these common files to run the automatic test on their computers.


[Figure: the NC-test program data base resides on a server disc; several developer computers access it via the LAN and run the automatic test locally.]

Fig. 14: NC-test programs on server disc

4.2.2 Use of different configuration list sets

Many functions of the NC-kernel software can be tested with the standard configuration lists, but other functions require different settings in the configuration lists. So additional sets of configuration lists were defined for the automatic test environment:

Configuration   Script for automatic test   Description
A               auto_a01.lst                3ax, 5ax milling conventional
B               auto_b01.lst                5ax, C-axis, lathe
C               auto_c01.lst                5ax, synchronous operation
D               auto_d01.lst                C-axis, YC-axis configuration

Fig. 15: Different configuration sets for automatic test

In the previously described automatic test environment only one configuration was tested, and the script file containing all NC-test programs to be tested was started manually from the input shell. With the increasing number of NC-kernel configuration sets, the manual start of the automatic test via the input shell became too inefficient. So a batch tool was developed which starts all available configuration sets one after the other and runs a file compare afterwards.


REM ###################################################################################
REM Automatic test of V252_bugfix
REM ###################################################################################
REM
REM ************** Adaptation *********************************************************
set AutotestDrive=E:
set AutotestFolder=v252_bugfix
set Logfile="auto_all.log"
...
REM Reference version
set RefDrive=F:
set RefDir=ref_outp_v252
REM ************** End of Adaptation **************************************************

REM ############## Test using configuration lists A ###################################
ECHO BATCH: Start auto_a01 test >>%Logfile%
call %AutotestDrive%%AutotestFolder%\nc-test\batch\hoch_a01.bat
ECHO BATCH: auto_a01 test finished >>%Logfile%

REM ############## Test using configuration lists B ###################################
ECHO BATCH: Start auto_b01 test >>%Logfile%
call %AutotestDrive%%AutotestFolder%\nc-test\batch\hoch_b01.bat
ECHO BATCH: auto_b01 test finished >>%Logfile%

REM ############## Test using configuration lists C ###################################
ECHO BATCH: Starting auto_c01 test >>%Logfile%
call %AutotestDrive%%AutotestFolder%\nc-test\batch\hoch_c01.bat
ECHO BATCH: auto_c01 test finished >>%Logfile%

REM ############## Test using configuration lists D ###################################
...

REM ###################################################################################
REM Check output via file compare
REM ###################################################################################
cd outp_a01
ECHO ********** diff configuration A ********** >>%Logfile%
diff auto_a01.lg1 %RefDrive%\%RefDir%\outp_a01\auto_a01.lg1 >>%Logfile%
cd ..\outp_b01
ECHO ********** diff configuration B ********** >>%Logfile%
diff auto_b01.lg1 %RefDrive%\%RefDir%\outp_b01\auto_b01.lg1 >>%Logfile%
cd ..\outp_c01
ECHO ********** diff configuration C ********** >>%Logfile%
diff auto_c01.lg1 %RefDrive%\%RefDir%\outp_c01\auto_c01.lg1 >>%Logfile%
cd ..\outp_d01
ECHO ********** diff configuration D ********** >>%Logfile%
diff auto_d01.lg1 %RefDrive%\%RefDir%\outp_d01\auto_d01.lg1 >>%Logfile%
...
:END
REM ###################################################################################

Fig. 16: Batch tool to start automatic test with all available configuration sets

The protocol file and the file compare results of each configuration set are logged into a global log file for a quick check of the batch tool results. The configuration-specific output of the automatic test environment (printout of the channel block outputs) remained the same. These printouts of the channel block output still had to be checked manually with a file compare tool by the operator.

For many source code modifications this automatic test and the following check via file compare tool worked well. But with some developments too many changes in the printout of the channel block outputs occurred, so it was very hard to check whether the output was OK or not. Because of the large number of output files it was very time-consuming to check each output file. This led to the decision that a test of a new development has passed if a certain number of the checked output files are OK. In the future, however, additional support by the automatic test environment will be necessary to improve the check of that many output files.

In the past, global software version tests were started at release labels only, because of the big effort needed to run the test programs manually and to check the output. Now, with the automatic test environment, it is easy to run many test programs. But there were still many changes in the output files from one label to the next.

4.2.3 Periodical run of automatic test for bugfix versions

Practice has shown that it is easier to check the output if the test run frequency is higher. So the automatic test environment is installed for all bugfix variants of ISG. Each bugfix variant is tested every second day, because one run of the automatic test with the different configuration sets takes more than 24 hours. Using the Windows scheduler (see fig. 17) it is possible to start the automatic test periodically at the programmed times.

Fig. 17: Periodic start of automatic test environment with scheduler

At the same frequency, the differences in the output files are checked by a person responsible for the test and are written into the change protocol document of the bugfix variant. So side effects on other functions caused by a bugfix integrated into the bugfix variant can be detected quickly. In addition, the changes of the output arrive in sequence and can therefore be checked more easily than before, because the file compare tool usually prints only a few changes from one run to the next.


[Table: excerpt from the change protocol. Columns: no. of the test run, NC-program name, error number (new / removed, with date of bugfix) and remarks. Run 87, configuration A: bav00519.nc, dec00360.nc, dec00421.nc, dec00422.nc and dec00720.nc each report error 21051; remark: new warning 21051 is OK (programming of S required), Gr/Hi, ER1679 / Nov 26, 01. Run 122, configuration A: dec00017.nc, dec00906.nc, dec01059.nc and dec01084 with errors 20375, 21000, 20364 and 21051.]

Fig. 18: Change protocol of V2.5.2 bugfix version

At ISG, two bugfix branches (variants) for the two software releases had to be checked at the beginning of this project. In the meantime two more software versions have been released, so four bugfix branches have to be checked now. Therefore the automatic test run frequency had to be reduced to once a week.

4.2.4 Integration of test environment into source control

To simplify the set-up and maintenance of the test environment, the directory structure was altered and all configuration, script and output files of the automatic test were placed under source control with Rational ClearCase.

The NC-test programs were moved to one directory (ClearCase VOB) "nc-prg" and stored in different subdirectories according to the tested software module.

The automatic test files were placed in the directory (ClearCase VOB) "nc-test". Here the following files can be found:

1. Batch files to start and run automatic test with different configuration sets.

2. Configuration files for the different configurations of NC-kernel.

3. Script files with the test programs to be started for the configuration.

4. Log files of automatic test.


5. Compiler protocol file.

6. Channel block output files of every test program.

\v252_bugfix\nc-kernel\...       (ISG NC-kernel software
                                  that must be tested)
\v252_bugfix\nc-prg\             (NC-test programs)
    bavo\
        bav00055.nc
        ...
    dec\
        dec00001.nc
        ...
\v252_bugfix\nc-test\            (Automatic test environment)
    batch\
        auto_all.bat             (Global batch file)
        auto_a01.bat             (Batch for configuration A)
        auto_b01.bat             (Batch for configuration B)
        ...
    listen\                      (Configuration files)
        a\
            achsmds1.lis
            achsmds2.lis
            ...
        b\
            achsmds1.lis
            achsmds2.lis
            ...
        ...
    script\                      (Test scripts for:)
        auto_a01.lst             (configuration A)
        auto_b01.lst             (configuration B)
        ...
    Logfile\
        auto_all.log             (Global log file of batch)
    Make\
        nmake.out                (Compiler output)
    outp_a01\                    (Channel block output files
        auto_a01.lg1              of ISG NC-kernel software)
        auto_a01.log
        bav00055.bav
        ...
    outp_b01\
    outp_...

Fig. 19: New directory structure of automatic test environment


The global batch that controls the automatic test was extended with source control commands. In a first step the latest files of the tested software variant are fetched from the source control tool, the projects are compiled, and it is checked whether the compilation was successful or not. If there was a compile error, the responsible person of the test department receives a pop-up message on his screen.

BATCH: Start autotest of V252_bugfix

BATCH: Update view of V252_bugfix

# ClearCase snapshot view update log

# Update session parameters:

#

FormatVersion: 4

SnapshotView: D:\cc\v252_bugfix

StartTime: 15-Nov-01.17:00:16

UserID: autotest@autotest

ProcessID: -485641

ConfigSpecSet: FALSE

PreviewOnly: FALSE

PreserveVOBTimes: FALSE

UpdateHijackedFiles: TRUE

RenameHijackedFiles: FALSE

#

# Actions taken to update the view:

#

Updated: nc-kernel\std\fblock\include\err.inc \main\share\244 \main\share\245

Updated: nc-kernel\std\fblock\include\err_eng.inc \main\share\34 \main\share\35

#

# Update session status:

#

EndTime: 15-Nov-01.17:00:43

BytesCopied: 1571275

Fig. 20: Example of protocol of global automatic test batch when getting the latest files out of source control

Making V2.5.2

Start of Compilation

********************************************

Compile kern

ACHSTEIL.C

ctrl_uti.c

d:\cc\v252_bugfix\nc-kernel\std\fblock\util\ctrl_uti.c(1465):

warning C4505: 'ctrl_el_error_function' : unreferenced local function has been removed

DEC_ABLS.C

...

********************************************

Fig. 21: Example of compiler log file


With a file compare of the compiler log file it can be quickly checked whether additional compiler warnings occur, and the developer can be asked to remove them.

All log files are automatically "checked out" of source control and "checked in" again at the end of the test. Therefore any changes of a test run can be examined with a file compare of two different file versions in the source control tool. This can also be done later, even if there were several test runs in between.
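The compile check and the warning check described above can be sketched as a small helper that scans the compiler protocol file (nmake.out in the directory layout of fig. 19) and returns a non-zero exit code when error lines are found, so that the calling batch file can raise the pop-up message. The matching rules used here are an assumption for illustration; the real check may look different.

#include <stdio.h>
#include <string.h>

int main(int argc, char **argv)
{
    const char *path = (argc > 1) ? argv[1] : "nmake.out";
    FILE *fp = fopen(path, "r");
    if (fp == NULL) {
        fprintf(stderr, "cannot open %s\n", path);
        return 2;
    }

    char line[1024];
    int errors = 0, warnings = 0;
    while (fgets(line, sizeof line, fp) != NULL) {
        if (strstr(line, "error C") != NULL || strstr(line, "fatal error") != NULL) {
            fputs(line, stdout);            /* echo the offending line */
            errors++;
        } else if (strstr(line, "warning C") != NULL) {
            warnings++;
        }
    }
    fclose(fp);

    printf("%d error line(s), %d warning line(s)\n", errors, warnings);
    return (errors > 0) ? 1 : 0;            /* non-zero exit: batch shows the pop-up */
}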

4.2.5 Test of release versions

If a release version must be tested, the latest files from source control are not used. For these tests, only files with a special test label are used.

4.2.6 Set-up of an automatic test environment

At the moment, the following work has to be done to set up the automatic test environment for a new software version:

1. Adapt path entries in batch and script files.

2. Adapt configuration lists (A, B, C, …) according to available functions in the software version.

With ClearCase it is easy to "merge" (re-use) a configuration list setting of an already adapted software version into another software version with the same baseline. Fig. 22 shows an example of the "branch" (software version) "v252.abc-main" on baseline V252, which can use the adapted configuration list of the branch "bugfix_v252". Afterwards, however, additional changes in the list according to the functions available in the software version have to be made.

Fig. 22: Example of configuration list handling with ClearCase


4.3 Integration of already existing test methods into software development process

The use of the already existing test methods at ISG depends on the functional extensions. These test methods are still necessary and cannot be replaced by the commercial tool INSURE++ or by the regression test. Therefore they will also be used in the future. The following table gives an overview of these test methods and the cases for which they are suitable.

Test method: Use

Docu review: Review of specifications or development documents. The review can be done by the customer or by other members of the project team. A development may only start after release by the project responsible.

Debugging: Standard test method during the coding phase. All new source parts must be executed and checked step by step at least once.

Code review: The code review can be done during debugging or after conclusion of coding. It must be done by another member of the project team together with the responsible developer.

Test programs: The NC-test programs are created by the developer to check the functional completeness of the specific functional extension after the coding phase. Therefore the NC-test programs must include all test cases. The NC-test programs are entered in a NOTES data base (see chapter 4.2.1) and are also used within the regression test.

Manual block: Suitable for a fast test of a functionality or a first examination of a problem. It is mainly used during debugging to execute a short NC sequence.

Grafic tool, PLC simulation, HMI simulation, Handwheel simulation: These test tools are mainly used during integration for the test of axis movements (graphical visualisation), the simulation of hardware elements (handwheel, joystick, buttons) or the test of interfaces to external control components (PLC, HMI).

Machine test: Executed only for large projects and if a suitable machine for the functional test is available. This kind of test is often already defined in the specification phase according to a customer agreement. In these cases the customer is responsible for providing a suitable test machine.


5 Validation of constructive QA activities (WP 4.4)

Results and outlook for INSURE++

One of the greatest problems in the daily use of INSURE++ is the required CPU performance. On a 1 GHz Pentium III with 512 MByte memory, compiling the project with instrumentation of all files by INSURE++ takes more than five times as long as compiling with standard settings. And not only the compilation takes long: running an instrumented executable is an even more time-consuming process. In most cases a slow system has no user acceptance in daily work.

Additionally it must be considered that the change in the time-critical interaction of the processes, caused by the instrumentation, may lead to different results.

Because the effect of an error cannot be predicted at all, the easiest way is to instrument all files of a project, to be sure to find all errors.

The cost of instrumentation must be seen in relation to the benefit gained from it. At the moment these disadvantages stand against the use of INSURE++ on every workstation and against the global instrumentation of all files. But the great advantage of every error found by the use of INSURE++ must be emphasised. The right balance in using INSURE++ must be found.

The disadvantages could be neglected if INSURE++ were integrated into the automatic regression test. In this case the report files must be saved so that they can be compared with any future version. Changes and new errors are then easy to detect.

Summary:

• tests with global instrumentation, because otherwise (due to side effects) not all errors can be detected,

• tests with partly instrumented files on request or to test new functionalities,

• integration of INSURE++ into the regression test.

Results and outlook for regression test

In the future, additional support by the automatic test environment is necessary to improve the check of the output files. A big time effort is still needed to check the file compare output. This can be reduced by modifying the printout of the channel block output and afterwards using a "grep" tool to extract the important parts of the output. The most important goal for the future will be the reduction of the file compare output.
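A possible shape of such a "grep" step is sketched below: only those lines of a channel block output file that match a small keyword list are copied into a filtered file, which is then used for the file compare. The keyword list and the file handling are assumptions for illustration, not the final solution.

#include <stdio.h>
#include <string.h>

static const char *keywords[] = { "Error", "error", "Warning", "warning" };

int main(int argc, char **argv)
{
    if (argc != 3) {
        fprintf(stderr, "usage: %s <channel-output> <filtered-output>\n", argv[0]);
        return 2;
    }

    FILE *in  = fopen(argv[1], "r");
    FILE *out = fopen(argv[2], "w");
    if (in == NULL || out == NULL) {
        fprintf(stderr, "cannot open input or output file\n");
        return 2;
    }

    char line[2048];
    while (fgets(line, sizeof line, in) != NULL) {
        for (size_t k = 0; k < sizeof keywords / sizeof keywords[0]; k++) {
            if (strstr(line, keywords[k]) != NULL) {
                fputs(line, out);           /* keep only the important lines */
                break;
            }
        }
    }

    fclose(in);
    fclose(out);
    return 0;
}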

During this project the automatic test environment was used by the test department and by some developers only. In the future the tool must be used by all developers to test their source changes directly on their computers before integration into any other software variant. The earlier an error is found, the fewer software variants must be fixed. To reach this goal, the set-up of the automatic test environment must be easy and fast.


6 Glossary

ISG Industrielle Steuerungstechnik GmbH

TEAM Test Execution and Test Management for Numerical Control Software

InCoMM Experimental Introduction of a Configuration Management Model for an Open Control System

NC Numerical Control

QA Quality Assurance

WP Work Package

DOW Description of Work

IST Information Society Technology

HMI Human Machine Interface

CM Configuration Management

PLC Programmable Logic Control

7 References

/1/ ISG: TEAM Project - Description of Work Version 5, Annex I of IST Contract No. IST-1999-20333, Stuttgart, 2000.

/2/ ISG: TEAM Project - Deliverable D3.1 – Set up of constructive QA activities

/3/ N.N.: Entwicklungsstandard für IT-Systeme des Bundes - Vorgehensmodell, 1997.

/4/ ISG: TEAM Project - Deliverable D1 - Tailoring

/5/ ISG: TEAM Project - Deliverable D2.2 – Set up of analytical QA activities