public class DataImport extends Base
DataImport contains methods that can be used to import data from a test data file to the List of Maps format commonly used for DataSource records (importToRows()), or directly into a DataSource (importToDataSource()). By default the input file is expected to contain comma-delimited data like a .csv file, but JSON and XML are also supported.
Imported data may be transformed during import; for details, search the SmartClient Reference for "dataSourceField.importStrategy".
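For example, a minimal usage sketch (the file path and the "supplyItem" DataSource ID below are hypothetical):

// Import a comma-delimited webroot file directly into the "supplyItem" DataSource
InputStream is = ServletTools.loadWebRootFile("files/supplyItem.csv");
Reader reader = new InputStreamReader(is);
long result = new DataImport().importToDataSource(reader, "supplyItem");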
See also: ServletTools.loadWebRootFile(path)
Modifier and Type | Class and Description |
---|---|
static class | DataImport.ImportFormat Import formats that can be used: DataImport.ImportFormat.CSV, DataImport.ImportFormat.JSON, DataImport.ImportFormat.XML, DataImport.ImportFormat.AUTO |
class | DataImport.ParseDate Implements the translate method of the DataTranslator interface. |
class | DataImport.ParseDateTime Implements the translate method of the DataTranslator interface. |
class | DataImport.ParseFloat Implements the translate method of the DataTranslator interface. |
class | DataImport.ParseNumber Implements the translate method of the DataTranslator interface. |
class | DataImport.ParseText Implements the translate method of the DataTranslator interface. |
class | DataImport.ParseTime Implements the translate method of the DataTranslator interface. |
Constructor and Description |
---|
DataImport() Create a DataImport configured for CSV import with default quote string. |
DataImport(DataImport.ImportFormat theInputType, java.lang.String theDelimiter) Create a DataImport configured for the specified import format. |
DataImport(DataImport.ImportFormat theInputType, java.lang.String theDelimiter, java.lang.String theQuoteString) Create a DataImport configured for the specified import format and quote string. |
Modifier and Type | Method and Description |
---|---|
java.util.Map | importDataSourceRecord(java.util.Map record, DataSource dataSource) |
java.util.Map | importDataSourceRecord(java.util.Map record, java.util.List columns, java.lang.String dataSourceName) |
java.util.Map | importDataSourceRecord(java.util.Map record, java.util.Map columnRemap, DataSource ds) Imports provided record (as Map parameter) to dataSource record. |
java.util.Map | importDataSourceRecord(java.util.Map record, java.util.Map columnRemap, java.lang.String dataSourceName) Imports provided record (as Map parameter) to dataSource record. |
java.util.Map | importDataSourceRecord(java.util.Map record, java.lang.String dataSourceName) |
java.util.List | importDataSourceRecords(java.io.Reader in, java.util.List columns, java.util.Map translators, java.lang.String dataSourceName) |
java.util.List | importDataSourceRecords(java.io.Reader in, java.util.List columns, java.lang.String dataSourceName) |
java.util.List | importDataSourceRecords(java.io.Reader in, java.util.Map columnRemap, java.util.Map translators, java.lang.String dataSourceName) Import from a Reader in either CSV/TSV, JSON or XML format and return the imported records in a List. |
java.util.List | importDataSourceRecords(java.io.Reader in, java.util.Map columnRemap, java.lang.String dataSourceName) |
java.util.List | importDataSourceRecords(java.io.Reader in, java.lang.String dataSourceName) |
long | importToDataSource(java.util.Map record, java.util.List columns, java.lang.String dataSourceName) |
long | importToDataSource(java.util.Map record, java.util.Map columnRemap, java.lang.String dataSourceName) Imports provided record (as Map parameter) to dataSource. |
long | importToDataSource(java.util.Map record, java.lang.String dataSourceName) |
long | importToDataSource(java.io.Reader in, java.util.List columns, java.util.Map translators, java.lang.String tableName) |
long | importToDataSource(java.io.Reader in, java.util.List columns, java.lang.String tableName) |
long | importToDataSource(java.io.Reader in, java.util.Map columnRemap, java.util.Map translators, java.lang.String dataSourceName) Import from a Reader in either CSV/TSV, JSON or XML format and save the imported records to the target SQLDataSource. |
long | importToDataSource(java.io.Reader in, java.util.Map columnRemap, java.lang.String tableName) |
long | importToDataSource(java.io.Reader in, java.lang.String dataSourceName) |
java.util.List | importToRows(java.io.Reader in) |
java.util.List | importToRows(java.io.Reader in, java.util.List columns) |
java.util.List | importToRows(java.io.Reader in, java.util.List columns, java.util.Map translators) |
java.util.List | importToRows(java.io.Reader in, java.util.Map columnRemap) |
java.util.List | importToRows(java.io.Reader in, java.util.Map columnRemap, java.util.Map translators) Import from a Reader in either CSV/TSV, JSON or XML format to a List of Maps. |
void | setAutoInterpretBooleans(boolean auto) If true, the import process auto-interprets boolean field values, converting them to Booleans; otherwise leaves them as Strings. |
void | setPopulateDisplayFields(boolean populate) If true, the import process populates displayField values with the import data in cases where it transforms the underlying value using a related display record. |
public DataImport()
Create a DataImport configured for CSV import with the default quote string.
public DataImport(DataImport.ImportFormat theInputType, java.lang.String theDelimiter)
Parameters:
theInputType - the form of the data input
theDelimiter - a java.lang.String character used as the delimiter; by default, ","
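For example, a sketch of configuring tab-delimited input (this assumes TSV data is handled as the CSV format with a tab delimiter):

// Parse tab-separated values instead of the default comma-separated format
DataImport tsvImport = new DataImport(DataImport.ImportFormat.CSV, "\t");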
public DataImport(DataImport.ImportFormat theInputType, java.lang.String theDelimiter, java.lang.String theQuoteString)
Parameters:
theInputType - the form of the data input
theDelimiter - a java.lang.String character used as the delimiter; by default, ","
theQuoteString - for delimited input only, a java.lang.String ("\"" by default) used to signify a double quote, since a double quote would otherwise be misinterpreted as the end of a string by the parser
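For example, a sketch of a delimited import with a custom delimiter and quote string (the semicolon delimiter and the doubled-quote convention are assumptions for illustration):

// Semicolon-delimited input where an embedded double quote is written as ""
DataImport dataImport = new DataImport(DataImport.ImportFormat.CSV, ";", "\"\"");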
public void setPopulateDisplayFields(boolean populate)
If true, the import process populates displayField values with the import data in cases where it transforms the underlying value using a related display record. For example, consider a "countryId" field with a displayField of "countryName". If this flag is set, then as part of the process of importing a "countryId" value and transforming the import value "United States" into the corresponding id value, the importer also sets "countryName" on that record to the display value it just transformed (ie, "United States"). By default displayFields are not populated - you must pass true to this method if you require that behavior. See DataSource.getRelatedDisplayRecord(String, Object, DSRequest) for more details about related display records and import transformation.
Parameters:
populate - If true, populate displayFields with the import values used to derive a key via a related display record. If false, just leave displayFields unpopulated.
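For example, a sketch of enabling display-field population before an import (the file name and the "customer" DataSource ID are hypothetical):

// Populate displayFields (e.g. "countryName") alongside the key values derived from display values
DataImport dataImport = new DataImport();
dataImport.setPopulateDisplayFields(true);
List records = dataImport.importDataSourceRecords(new FileReader("customers.csv"), "customer");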
public void setAutoInterpretBooleans(boolean auto)
If true, the import process auto-interprets boolean field values, converting them to Booleans; otherwise it leaves them as Strings. Conversion rules:
Parameters:
auto - If true, the import process auto-interprets boolean field values, converting them to Booleans; otherwise leaves them as Strings.
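For example, a sketch of enabling boolean interpretation for an import (the file path and the "order" DataSource ID are hypothetical):

// Deliver boolean field values as Booleans rather than raw Strings
DataImport dataImport = new DataImport();
dataImport.setAutoInterpretBooleans(true);
long result = dataImport.importToDataSource(new FileReader("files/orders.csv"), "order");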
public long importToDataSource(java.io.Reader in, java.lang.String dataSourceName) throws java.lang.Exception
Throws: java.lang.Exception

public long importToDataSource(java.io.Reader in, java.util.List columns, java.lang.String tableName) throws java.lang.Exception
Throws: java.lang.Exception

public long importToDataSource(java.io.Reader in, java.util.Map columnRemap, java.lang.String tableName) throws java.lang.Exception
Throws: java.lang.Exception

public long importToDataSource(java.io.Reader in, java.util.List columns, java.util.Map translators, java.lang.String tableName) throws java.lang.Exception
Throws: java.lang.Exception
public long importToDataSource(java.io.Reader in, java.util.Map columnRemap, java.util.Map translators, java.lang.String dataSourceName) throws java.lang.Exception
Import from a Reader in either CSV/TSV, JSON or XML format and save the imported records to the target SQLDataSource. Search for "testData" in the SmartClient Reference for a description of the supported formats and examples; the same rules apply here.
An optional columnRemap can be provided to map column names in CSV/TSV, or property names in JSON, to the field names of the DataSource. Use null as a Map value to cause data for a column to be discarded.
For signatures that take "List columns", this is the same as providing a columnRemap that does not rename any input fields.
If no columnRemap is provided, or the provided columnRemap is incomplete, column names that are not re-mapped will be matched to DataSource fields by comparing against both the field name and the field title, ignoring letter case. Any column name that isn't matched to a DataSource field is discarded.
For delimited input, the header line may be omitted. DataImport will attempt to automatically detect whether the first line is a header by attempting to match the data to expected column names. If the header line is omitted, the columns will be assumed to be in the order of the columnRemap if provided (use a class such as LinkedHashMap to preserve order), or in the order of the DataSource fields if no columnRemap is provided. Any extra columns will be discarded.
For CSV/TSV and JSON data, a set of translators can be provided in the format expected by importToRows; the available translator classes are DataImport.ParseDate, DataImport.ParseDateTime, DataImport.ParseTime, DataImport.ParseText, DataImport.ParseNumber and DataImport.ParseFloat. By default, the following translations are applied based on DataSource field type:
int, integer, sequence or number: parsed as a Java Integer
float or decimal: parsed as a Java Double
date: parsed as a date value by standard Java DateFormat, lenient parsing in the current locale
datetime: parsed as a datetime value by standard Java DateFormat, lenient parsing in the current locale
Any SimpleType that inherits from one of these base types also uses the same translators.
Parameters:
in - input file as a Reader
columnRemap - an optional mapping between the input data and the field names of the target DataSource
translators - optional translators to use to transform the data from the input to the type the DataSourceField expects
dataSourceName - the ID of the target DataSource
Returns: 1 if the import was successful, and 0 otherwise
Throws: java.lang.Exception
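For example, a sketch that combines a columnRemap and a translator (the column names, file path and "supplyItem" DataSource ID are hypothetical):

// Remap spreadsheet-style headers to DataSource field names; LinkedHashMap preserves column order
Map columnRemap = new LinkedHashMap();
columnRemap.put("Item Name", "itemName");
columnRemap.put("Unit Cost", "unitCost");
columnRemap.put("Internal Notes", null);   // a null value discards this column
// Translators are keyed by the original column name, with the translator class name as the value
Map translators = new HashMap();
translators.put("Unit Cost", "com.isomorphic.tools.DataImport.ParseFloat");
Reader reader = new InputStreamReader(ServletTools.loadWebRootFile("files/supplyItem.csv"));
long result = new DataImport().importToDataSource(reader, columnRemap, translators, "supplyItem");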
public long importToDataSource(java.util.Map record, java.lang.String dataSourceName) throws java.lang.Exception
Throws: java.lang.Exception

public long importToDataSource(java.util.Map record, java.util.List columns, java.lang.String dataSourceName) throws java.lang.Exception
Throws: java.lang.Exception
public long importToDataSource(java.util.Map record, java.util.Map columnRemap, java.lang.String dataSourceName) throws java.lang.Exception
Imports the provided record (passed as a Map parameter) to the dataSource. This method uses the same column re-mapping logic as importToDataSource(...) and the general features of DataImport, such as transforming imported data according to import strategies.
Parameters:
record - input record as a Map
columnRemap - an optional mapping between the input data and the field names of the target DataSource
dataSourceName - the ID of the target DataSource
Returns: 1 if the import was successful, and 0 otherwise
Throws: java.lang.Exception
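For example, a sketch of importing a single record held in a Map (the field names and the "customer" DataSource ID are hypothetical):

// Remap the incoming "country" key to the DataSource field "countryId" during import
Map record = new HashMap();
record.put("name", "Acme Corp");
record.put("country", "United States");
Map columnRemap = new HashMap();
columnRemap.put("country", "countryId");
long result = new DataImport().importToDataSource(record, columnRemap, "customer");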
public java.util.List importDataSourceRecords(java.io.Reader in, java.lang.String dataSourceName) throws java.lang.Exception
Throws: java.lang.Exception

public java.util.List importDataSourceRecords(java.io.Reader in, java.util.List columns, java.lang.String dataSourceName) throws java.lang.Exception
Throws: java.lang.Exception

public java.util.List importDataSourceRecords(java.io.Reader in, java.util.Map columnRemap, java.lang.String dataSourceName) throws java.lang.Exception
Throws: java.lang.Exception

public java.util.List importDataSourceRecords(java.io.Reader in, java.util.List columns, java.util.Map translators, java.lang.String dataSourceName) throws java.lang.Exception
Throws: java.lang.Exception
public java.util.List importDataSourceRecords(java.io.Reader in, java.util.Map columnRemap, java.util.Map translators, java.lang.String dataSourceName) throws java.lang.Exception
Import from a Reader in either CSV/TSV, JSON or XML format and return the imported records in a List. Note that the imported records are not saved; to save them, consider using the DataImport.importToDataSource(Reader, Map, Map, String) API instead. Search for "testData" in the SmartClient Reference for a description of the supported formats and examples; the same rules apply here.
An optional columnRemap can be provided to map column names in CSV/TSV, or property names in JSON, to the field names of the DataSource. Use null as a Map value to cause data for a column to be discarded.
If no columnRemap is provided, or the provided columnRemap is incomplete, column names that are not re-mapped will be matched to DataSource fields by comparing against both the field name and the field title, ignoring letter case. Any column name that isn't matched to a DataSource field is discarded.
For delimited input, the header line may be omitted. DataImport will attempt to automatically detect whether the first line is a header by attempting to match the data to expected column names. If the header line is omitted, the columns will be assumed to be in the order of the columnRemap if provided (use a class such as LinkedHashMap to preserve order), or in the order of the DataSource fields if no columnRemap is provided. Any extra columns will be discarded.
For CSV/TSV and JSON data, a set of translators can be provided in the format expected by importToRows; the available translator classes are DataImport.ParseDate, DataImport.ParseDateTime, DataImport.ParseTime, DataImport.ParseText, DataImport.ParseNumber and DataImport.ParseFloat. By default, the following translations are applied based on DataSource field type:
int, integer, sequence or number: parsed as a Java Integer
float or decimal: parsed as a Java Double
date: parsed as a date value by standard Java DateFormat, lenient parsing in the current locale
datetime: parsed as a datetime value by standard Java DateFormat, lenient parsing in the current locale
Any SimpleType that inherits from one of these base types also uses the same translators.
Parameters:
in - input file as a Reader
columnRemap - an optional mapping between the input data and the field names of the target DataSource
translators - optional translators to use to transform the data from the input to the type the DataSourceField expects
dataSourceName - the ID of the target DataSource
Throws: java.lang.Exception
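For example, a sketch that transforms records without saving them, using the two-argument overload for brevity (the file path and the "supplyItem" DataSource ID are hypothetical):

// Parse and transform the records, then post-process them before saving them yourself
Reader reader = new InputStreamReader(ServletTools.loadWebRootFile("files/supplyItem.csv"));
List records = new DataImport().importDataSourceRecords(reader, "supplyItem");
for (Object r : records) {
    Map record = (Map) r;
    // validate, enrich or otherwise process each record here
}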
public java.util.Map importDataSourceRecord(java.util.Map record, java.lang.String dataSourceName) throws java.lang.Exception
Throws: java.lang.Exception

public java.util.Map importDataSourceRecord(java.util.Map record, DataSource dataSource) throws java.lang.Exception
Throws: java.lang.Exception

public java.util.Map importDataSourceRecord(java.util.Map record, java.util.List columns, java.lang.String dataSourceName) throws java.lang.Exception
Throws: java.lang.Exception
public java.util.Map importDataSourceRecord(java.util.Map record, java.util.Map columnRemap, java.lang.String dataSourceName) throws java.lang.Exception
Imports the provided record (passed as a Map parameter) to a dataSource record. This method uses the same column re-mapping logic as importDataSourceRecords(...) and the general features of DataImport, such as transforming imported data according to import strategies.
Parameters:
record - input record as a Map
columnRemap - an optional mapping between the input data and the field names of the target DataSource
dataSourceName - the ID of the target DataSource
Throws: java.lang.Exception
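For example, a sketch of transforming one record against a DataSource identified by name (the field names and the "customer" DataSource ID are hypothetical):

// Remap "Country" to "countryId" and let the import strategies transform the value
Map raw = new HashMap();
raw.put("Country", "United States");
Map columnRemap = new HashMap();
columnRemap.put("Country", "countryId");
Map imported = new DataImport().importDataSourceRecord(raw, columnRemap, "customer");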
public java.util.Map importDataSourceRecord(java.util.Map record, java.util.Map columnRemap, DataSource ds) throws java.lang.Exception
Imports the provided record (passed as a Map parameter) to a dataSource record. This method uses the same column re-mapping logic as importDataSourceRecords(...) and the general features of DataImport, such as transforming imported data according to import strategies.
Parameters:
record - input record as a Map
columnRemap - an optional mapping between the input data and the field names of the target DataSource
ds - instance of the target DataSource
Throws: java.lang.Exception
public java.util.List importToRows(java.io.Reader in) throws java.lang.Exception
Throws: java.lang.Exception

public java.util.List importToRows(java.io.Reader in, java.util.List columns) throws java.lang.Exception
Throws: java.lang.Exception

public java.util.List importToRows(java.io.Reader in, java.util.Map columnRemap) throws java.lang.Exception
Throws: java.lang.Exception

public java.util.List importToRows(java.io.Reader in, java.util.List columns, java.util.Map translators) throws java.lang.Exception
Throws: java.lang.Exception
public java.util.List importToRows(java.io.Reader in, java.util.Map columnRemap, java.util.Map translators) throws java.lang.Exception
Import from a Reader in either CSV/TSV, JSON or XML format to a List of Maps. Search for "testData" in the SmartClient Reference for a description of the supported formats and examples; the same rules apply here.
This API essentially performs part of the steps of importToDataSource() but does not actually insert records into a DataSource; instead it just returns a List of Maps which can then be inserted into a DataSource, validated further, serialized to XML or otherwise processed.
Input format and delimiter are specified in the constructor or via setInputType(). CSV/TSV delimited data is expected to have a leading line providing column names, which will be used as keys for the returned Maps.
An optional columnRemap can be provided to map column names in CSV/TSV, or property names in JSON, to a different set of column names to be used in the output. Use null as the value of a key in this Map to indicate that data for the column should be discarded. The matching process against the keys of the columnRemap is case-insensitive.
For signatures that take "List columns", this is the same as providing a columnRemap that does not rename any input fields.
An optional set of translators can be provided to translate to desired Java types. The translators Map should map column names (as they appear in the CSV/TSV file - not the remapped column names) to the fully qualified name of a specific inner class that will perform the translation (for example: "com.isomorphic.tools.DataImport.ParseText"). The class's translate method, which takes any Java Object and produces any Java Object as output, is invoked to perform the translation.
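For example, a sketch that applies a translator during importToRows (the "Price" column is hypothetical; the file path reuses the example shown below):

// Translators are keyed by the original CSV column name, not the remapped name
Map columnRemap = new HashMap();
columnRemap.put("Price", "price");
Map translators = new HashMap();
translators.put("Price", "com.isomorphic.tools.DataImport.ParseNumber");
Reader reader = new InputStreamReader(ServletTools.loadWebRootFile("files/portfolios.csv"));
List rows = new DataImport().importToRows(reader, columnRemap, translators);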
If the input type is set to autoDetect, importToRows()
will attempt to autodetect the input format. In addition, for delimited input, if the delimiter is unspecified, importToRows()
will attempt to detect whether the delimiter is a comma (",") or tab ("\t"), and will throw an exception if auto-detection fails.
Example usage for CSV parsing (which is the default input type and delimiter):

InputStream is = ServletTools.loadWebRootFile("files/portfolios.csv");
Reader reader = new InputStreamReader(is);
List portfolios = new DataImport().importToRows(reader);
Parameters:
in - a character-stream reader set to read the data input stream
columnRemap - optional mapping from input column/property names to output Map keys
translators - optional Map of translators
Throws: java.lang.Exception