importFromCSV - import test cases and requirements into ApTest Manager


importFromCSV [-f|-u|-a] [-C|-c cont] [-n|-v|-q] [-d dir] [-r|-t] [-U user] suite csvFile [mapFile]


importFromCSV allows you to import records from a Comma-Separated Values (CSV) file into an existing ATM test suite. Records are extracted from the CSV file, optionally transformed based on the rules in a mapping file, and then saved as ATM test cases or requirements.



-f

Force existing tests or requirements to be overwritten if a record with a matching ID exists in the CSV file.


-u

If the test or requirement already exists, update its fields with the contents of the fields in the CSV file.


-a

As -u, but append to revision history fields rather than replacing them.

-c cont

If an ID field with the contents cont is found, append the row's data to the previous row. This allows you to split your data over multiple lines in your CSV file if required.

If a field is in a table, data from each row will be joined with ", ", producing a new row in the table. Commas in the field's contents will be escaped as "%2c". Other fields will have the data joined with newlines.
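As a sketch of the joining behaviour described above (this is illustrative Python, not the actual importFromCSV code; the merge_continuation function and its arguments are assumptions), a continuation row might be merged into the previous row like this:

```python
# Merge a continuation row (as selected by -c/-C) into the previous row.
# Assumes the ID/continuation-marker column is empty or already removed.
def merge_continuation(prev, cont, table_fields):
    """Merge `cont` (dict of column -> value) into `prev` in place.

    `table_fields` is the set of columns that belong to a table in the
    target suite (an assumption for this illustration).
    """
    for col, value in cont.items():
        if not value:
            continue
        if col in table_fields:
            # Escape commas so they survive the ", " join, then start a
            # new row in the table.
            escaped = value.replace(",", "%2c")
            prev[col] = f"{prev[col]}, {escaped}" if prev.get(col) else escaped
        else:
            # Plain fields are joined with newlines.
            prev[col] = f"{prev[col]}\n{value}" if prev.get(col) else value
```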


-C

Like -c '' - appends data to the previous row if the ID field is blank.


-n

Don't actually generate the test cases or requirements; just report what would have been done.


-v

Be verbose.


-q

Be quiet - write nothing to standard output. Errors will still be written to STDERR.

-d dir

Create tests or requirements in subdirectory dir. By default, all tests or requirements are placed in the top level directory of the test suite.


-r

Create requirements.


-t

Create test cases (default).


-U user

Default last-modifying user (defaults to 'ATMUser').


suite

The test suite to import into.


csvFile

The CSV file to import.


mapFile

The mapping file, if required.


The mapping file must contain the fields SOURCE, TARGET, REQRD, DEFAULT, and PATTERN.


SOURCE

The name of a column in the CSV file, corresponding to a field to be imported. The column need not actually exist in the CSV file; you can use it together with the DEFAULT value described below to supply values that are missing from the CSV data.


TARGET

The name of a field defined in the target suite.


REQRD

A flag indicating whether the field must contain a value in the CSV file. A required field must have a "Y" in this column.


DEFAULT

A default value for the field, if it is empty (or missing) in the CSV file.


PATTERN

An optional Perl substitution pattern that will be applied to the input before it is assigned to the test case or requirement.

For example:

name     id            Y      ""          s{-}{/}g
input    input         N      None
process  procedure     Y
purpose  requirements  Y
cdate    cdate         N      2007/01/01

According to the above file, the "name" column will contain values for the "id" field, and the values will have all "-" translated into "/". The "input" column contains values for the "input" field, but any missing values will be replaced with "None". The "process" and "purpose" columns map to the "procedure" and "requirements" fields, respectively, and must have values. The "cdate" column maps to the "cdate" field, and any missing values will be replaced with "2007/01/01". (In fact, the "cdate" column may well be missing from the CSV file entirely.)
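The effect of one mapping-file row can be sketched as follows (an illustration, not the script's internals; apply_rule and the (regex, replacement) form of the pattern are assumptions standing in for the Perl s{...}{...}g syntax):

```python
import re

def apply_rule(row, source, required, default, pattern=None):
    """Return the value for a target field given a CSV `row` (a dict),
    applying DEFAULT, REQRD, and PATTERN in that order."""
    value = row.get(source, "") or default   # fall back to DEFAULT
    if required and not value:
        raise ValueError(f"required column {source!r} is empty")
    if pattern:                              # (regex, replacement) tuple
        value = re.sub(*pattern, value)
    return value
```

For the example mapping file above, the "name" rule would be apply_rule(row, "name", True, "", (r"-", "/")) and the "cdate" rule apply_rule(row, "cdate", False, "2007/01/01").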


The CSV file MUST have as its first line the label for each column, so that the rest of the file can be analyzed in that context. If no mapping file has been specified, then the column names must correspond to fields defined in the target suite. If a mapping file is specified, then all columns in the CSV file MUST have entries in the mapping file.

There MUST be a column in the CSV file that corresponds to the "id" field in the target suite. See notes on IDs below for more details on the contents of the field.

Finally, remember that the input file for this script is a CSV file. CSV is a Windows/DOS file format, so lines are expected to end in CRLF, not solely NL as on UNIX systems, or solely CR as on classic Mac systems. If your copy of ATM is hosted on a UNIX system, and you are generating CSV files on Windows platforms, be certain to transfer the files to the UNIX system in "binary" mode, not in "text" or "ascii" mode. That way the line endings will be preserved, and this script will perform as expected.
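If you generate CSV files programmatically rather than exporting them from a spreadsheet, Python's standard csv module produces the expected CRLF line endings by default:

```python
import csv
import io

# csv.writer uses "\r\n" as its default line terminator, matching the
# Windows/DOS convention this script expects.
buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(["id", "procedure"])
writer.writerow(["TC-1", "Step one"])
assert buf.getvalue() == "id,procedure\r\nTC-1,Step one\r\n"
```

When writing to a real file, open it with newline="" so the csv module's line endings pass through untranslated.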


The ID field governs which test case or requirement will be created or altered. Note that the details below apply to the contents of the field after any pattern defined in the mapping file has been applied.

"Plain" ID suites

In suites with no auto-numbering turned on, the ID field must contain the name of the test or requirement to be created. It may optionally include a folder path, e.g.

    folder1/folder2/mytest

Folders are separated by forward slashes (/). Anything after the last / is taken to be the ID.
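The split described above - everything after the last / is the ID, everything before it the folder path - can be sketched like this (split_id is an illustrative helper, not part of importFromCSV):

```python
def split_id(cell):
    """Split an ID cell into (folder_path, test_id).

    Everything after the last "/" is the ID; the rest, if any, is the
    folder path (returned as None when no folder is given).
    """
    folder, _sep, test_id = cell.rpartition("/")
    return (folder or None, test_id)
```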

If a folder is specified, it is relative to the folder specified using the -d option, or the suite root if no -d option was given. If the folder does not exist, it will be created.

If the test case or requirement specified exists, it will be overwritten or updated only if one of the -f, -u, or -a options was given.

Auto-numbered suites

In suites with auto-numbering (by folder or by suite), existing test cases or requirements are referred to by their number only, e.g.

    folder1/27

If no numeric ID is given, a new test case or requirement will be created and assigned a suitable number, e.g.

    folder1/

The ID can also be empty.

If a folder is specified, it is relative to the folder specified using the -d option, or the suite root if no -d option was given. If the folder does not exist, it will be created.

Outline-numbered suites

It is not possible to specify an ID to replace or update in an outline-numbered suite.

The ID field should therefore contain only a folder path to show where the new test case or requirement should be created.


ApTest Manager understands two date formats: YYYY/MM/DD and "seconds since the epoch" (such as the value returned by the Perl function time()). You may be able to use mapping file patterns to translate dates in other formats, e.g.

cdate   cdate   N      2007/01/01  s{(\d{2})/(\d{2})/(\d{4})}{$3/$1/$2}
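For reference, the Perl pattern above translates MM/DD/YYYY dates into the YYYY/MM/DD form ApTest Manager understands; an equivalent substitution in Python would be:

```python
import re

def to_atm_date(value):
    """Rewrite MM/DD/YYYY as YYYY/MM/DD, the same reordering as the
    Perl pattern s{(\d{2})/(\d{2})/(\d{4})}{$3/$1/$2}."""
    return re.sub(r"(\d{2})/(\d{2})/(\d{4})", r"\3/\1/\2", value)
```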

Copyright © 2000-2013 Applied Testing and Technology, Inc. All rights reserved.