Once you create the data file required for the current import job, you can import it into the Capillary database by following the seven steps below.
The following are important points to consider before starting the import.

Note: The time periods mentioned below are in IST for the APAC and APAC2 clusters, CST for the China cluster, and GMT for the EU cluster.

| Parameter | Value |
| --- | --- |
| Duration for data import in the daytime (peak hours) | 6 AM - 10 PM |
| Duration for data import in the nighttime (FTP hours) | 10 PM - 6 AM |
| Number of records allowed in the last 24 hours by direct import | 200,000 |
| Number of records allowed in the last 24 hours by FTP import in the daytime (peak hours) | 200,000 |
| Number of records allowed in the last 24 hours by FTP import in the nighttime (FTP hours) | 1,000,000 |
| Maximum size of a single file | 200 MB |
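As a rough illustration of the limits in the table above, a file can be checked locally before upload. Only the limit values come from this documentation; the function name and parameters below are hypothetical.

```python
import csv
import os

# Import limits from the table above (records per 24 hours, file size in MB).
MAX_DIRECT_RECORDS = 200_000
MAX_FTP_NIGHT_RECORDS = 1_000_000
MAX_FILE_SIZE_MB = 200

def check_import_file(path, night_ftp=False):
    """Return a list of limit violations for a candidate import file.

    This helper is illustrative and not part of the Capillary product;
    only the limit values are taken from the documentation.
    """
    problems = []
    size_mb = os.path.getsize(path) / (1024 * 1024)
    if size_mb > MAX_FILE_SIZE_MB:
        problems.append(f"file is {size_mb:.1f} MB, limit is {MAX_FILE_SIZE_MB} MB")
    with open(path, newline="") as f:
        records = sum(1 for _ in csv.reader(f))
    limit = MAX_FTP_NIGHT_RECORDS if night_ftp else MAX_DIRECT_RECORDS
    if records > limit:
        problems.append(f"{records} records exceed the limit of {limit}")
    return problems
```

A file that passes these checks can still be rejected by the server for other reasons; this only catches the size and volume limits listed above.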
Step 1: Create data file (.csv)
First, create a CSV file based on the purpose of the import. Each purpose is associated with a different profile, such as registering customers, adding transactions, issuing points, or adding inventory. Create the CSV file according to the corresponding profile.
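A data file like this can be generated with any CSV tooling. The sketch below uses Python's csv module; the field names are hypothetical stand-ins for a customer-registration profile, and the actual supported fields are listed in Profiles, Supported Fields, and Template Configuration.

```python
import csv

# Hypothetical field names for a customer-registration profile; consult
# "Profiles, Supported Fields, and Template Configuration" for the
# fields your profile actually supports.
rows = [
    {"mobile": "9100000001", "first_name": "Asha", "email": "asha@example.com"},
    {"mobile": "9100000002", "first_name": "Ravi", "email": "ravi@example.com"},
]

with open("customer_registration.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["mobile", "first_name", "email"])
    # The header row can be excluded later using "No. of rows to be
    # ignored from top" in the upload form.
    writer.writeheader()
    writer.writerows(rows)
```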
To know more about different fields supported for each profile, see Profiles, Supported Fields, and Template Configuration.
Step 2: Upload the CSV file
Once you create the file, upload it through the Import Framework.
- On the Settings page, click Master Data Management > Data Import.
- Click Data Import. The Data Import page appears.
In the Direct Upload section, do the following:
- In Upload data csv, choose the CSV file that you want to import.
- In No. of rows to be ignored from top, set the number of rows from the top that you want to exclude from the import. For example, if this field is set to 2, the top two rows are not uploaded to the server. Use this option when the top rows contain field names, duplicate data, or blank data.
- In No. of rows to be ignored from bottom, set the number of rows from the bottom that you want to exclude from the import. For example, if this field is set to 15, the last 15 rows are ignored during the import. Use this option when the last rows of the CSV file contain duplicate, blank, or invalid entries.
- In File Delimiter, type the character used to separate each value in the file. For example, a file can have comma-separated, tab-separated, or semicolon-separated values. If ',' is specified, a new value starts after every comma found in the file.
- In Enclosure, type the special character used in the file to enclose 'string' values.
- In Escape Character, type the special character that must be ignored in the file.
- Click Submit.
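The skip-rows, delimiter, enclosure, and escape settings above can be sketched with Python's csv module. The setting values and sample data here are hypothetical; they only mirror the meaning of the form fields.

```python
import csv
import io

# Hypothetical settings mirroring the Direct Upload form fields.
SKIP_TOP = 1        # No. of rows to be ignored from top (e.g. a header row)
SKIP_BOTTOM = 1     # No. of rows to be ignored from bottom (e.g. a totals row)
DELIMITER = ","     # File Delimiter
ENCLOSURE = '"'     # Enclosure used around string values
ESCAPE = "\\"       # Escape Character to be ignored

raw = 'name,city\n"Asha, K",Bengaluru\n"Ravi",Chennai\ntotal,2\n'

reader = csv.reader(io.StringIO(raw), delimiter=DELIMITER,
                    quotechar=ENCLOSURE, escapechar=ESCAPE)
rows = list(reader)
# Drop the ignored rows from the top and bottom before importing.
data = rows[SKIP_TOP:len(rows) - SKIP_BOTTOM]
```

Note how the enclosure keeps "Asha, K" as a single value even though it contains the delimiter, which is exactly why the Enclosure setting matters for string fields.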
Step 3: Choose the respective profile and create a template
Based on the type of data, choose the profile name, and then create a new template for the profile or use an existing one. A profile template stores the Template Configuration settings and lets you reuse them whenever required instead of creating a new template every time.
To choose from the existing templates:
- In Choose Profile, select the profile based on the data that you are uploading. For more information on profiles, see Profiles, Supported Fields, and Template Configuration.
- In Choose Template, select the template that you want to use for the current import.
- Click Submit.
To create a new template:
- Check the Create new template box.
- In Name of the template, type a name for the new template.
- Click Submit.
Step 4: Configure the Profile Template
The options vary based on the profile chosen. Configure the profile template as explained in Profiles, Supported Fields, and Template Configuration, and then click Submit.
Step 5: Map each field with the respective column of the CSV file (Field Mapping)
Map each field with the respective column of the CSV file and click Submit. Fields marked with * are mandatory.
- Not Null
Check this box if you do not want to allow null values for a field. Entries with null values for that field are then ignored during the import.
- Data Transformer & Transformer Inputs
Use this option to modify the existing data in a column automatically using functions and other configurations.
There are four types of data transformers as mentioned in the following table.
| Transformer | Description |
| --- | --- |
| None | The field is not modified while importing data. |
| Merge | Merges the values of two or more columns into a single field. Write the merge function in the Transformer Inputs box as Columns=>x,y,z or Columns=>x,y,z;Separator=>[], where x, y, and z are column numbers. For example, if Columns=>1,2,3 is set, the text in columns 1, 2, and 3 is merged into a single field, separating each column with a comma (,), the default separator. You can set any delimiter using the Separator=> option. For example, if Columns=>1,2,3;Separator=>; is set in Transformer Inputs, columns 1, 2, and 3 are merged with a semicolon separating the value of each column. |
| Default Value | Replaces null values in the column with the value specified in Transformer Inputs as Value=>[Any Value]. For example, if Value=>store.server is set in Transformer Inputs, all null values identified in the column are replaced with 'store.server'. |
| Convert to Date & Time | Changes the date and time format across the column. This is applicable only to columns containing date and time values. Set the standard date and time format in the Transformer Inputs box using these parameters. Date parameters: d - day of the month (1-31); m - month (1-12); Y - year in four digits (for example, 2013); y - year in two digits (for example, 13). Time parameters: H - hours (24-hour format); h - hours (12-hour format, 1-12); i - minutes (00-59); s - seconds. The parameters can be used in any order. For example, a date can be set as d/m/Y, m/d/Y, or Y/m/d. However, it is recommended to use the standard date and time format of the organization. |
| Expression | Applies a mathematical calculation to two or more columns: Exp=>{Column no. of A} {+ or - or * or /} {Column no. of B}. For example, if the expression is set to Exp=>{1}*{2}-10, the value of the field becomes (value of column 1) * (value of column 2) - 10. This function can be used only on columns with integer values, not on string data types. |
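The behaviour of the transformers can be sketched as follows. The function names and sample row are illustrative only and are not part of the Capillary product; the semantics follow the table above, and Python's %d/%m/%Y stands in for the documentation's d/m/Y date format.

```python
from datetime import datetime

def merge(row, columns, separator=","):
    # Merge: e.g. Columns=>1,2;Separator=>; (column numbers are 1-based).
    return separator.join(row[c - 1] for c in columns)

def default_value(value, default):
    # Default Value: e.g. Value=>store.server replaces null/empty values.
    return default if value in ("", None) else value

def expression(row, a, b, op):
    # Expression: e.g. Exp=>{1}*{2} arithmetic on integer columns.
    ops = {"+": lambda x, y: x + y, "-": lambda x, y: x - y,
           "*": lambda x, y: x * y, "/": lambda x, y: x / y}
    return ops[op](int(row[a - 1]), int(row[b - 1]))

row = ["Asha", "Kumar", "", "10", "4"]
full_name = merge(row, [1, 2], separator=" ")    # columns 1 and 2 joined
server = default_value(row[2], "store.server")   # empty value replaced
product = expression(row, 4, 5, "*") - 10        # Exp=>{4}*{5}-10 style

# Convert to Date & Time: the documentation's d/m/Y corresponds to
# Python's %d/%m/%Y here.
converted = datetime.strptime("05/03/2013", "%d/%m/%Y").strftime("%Y-%m-%d")
```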
Step 6: Import data to the temporary database
After you submit the configurations and field mappings, a preview of the import job appears. Any errors identified in the import file are highlighted in red.
To import the data to the temporary database, scroll down to the Import to Temporary DB field and click Submit.

Step 7: Import data to the main database
Once the data is imported into the temporary database, you will see an option to import it to the main database.
- Check the Import to the Database checkbox.
- Click Submit.