The file-based connectors bridge Data Virtuality Server with local and remote file storage systems.

type name - description
ufile - access and manage files on the local filesystem
ftp - access and manage files via the FTP protocol
sftp - access and manage files via the SFTP protocol
scp - access and manage files via the SCP protocol
s3 - access and manage files stored in Amazon AWS S3 storage
blob - access and manage files stored in Azure Blob Storage

Metadata

Before issuing queries to a file data source, it needs to be configured using the appropriate Data Virtuality Server procedures:

Example

call SYSADMIN.createConnection( name => '<alias>', jbossCLITemplateName => '<type name>', connectionOrResourceAdapterProperties => '<connector-specific settings depending on type>');
call SYSADMIN.createDatasource( name => '<alias>', translator => 'ufile', modelProperties => '', translatorProperties => '');
SQL

(info) The translator has to be 'ufile' for all file-based data sources.
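For illustration, a configured local-file data source might look like the following sketch. The alias myfiles and the directory path are placeholders, and the ParentDirectory property name is an assumption for the ufile connector type; check the connector-specific settings for your installation.

call SYSADMIN.createConnection( name => 'myfiles', jbossCLITemplateName => 'ufile', connectionOrResourceAdapterProperties => 'ParentDirectory=/data/import');
call SYSADMIN.createDatasource( name => 'myfiles', translator => 'ufile', modelProperties => '', translatorProperties => '');
SQL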

(info) Data Virtuality Studio provides a convenient way to connect to data sources using graphical wizards. To do so, use the corresponding data source type under the File section in the Add data source wizard.

Usage

File data sources use stored procedures shared by all file-based connectors to gather data from their sources. This data may then be further processed by Data Virtuality Server, commonly with table functions (such as TABLE, TEXTTABLE, and XMLTABLE) in combination with parsing functions, depending on the structure of the data.

(info) Data Virtuality Studio provides a variety of Query Builders for this purpose. They make it easy to specify the file encoding and the structure of the data. These Query Builders are accessible via "SQL editor" - "Tools".
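As a sketch of this pattern, the following query reads CSV files via getTextFiles and parses them with TEXTTABLE. The alias s3, the file pattern, and the column names campaign and clicks are illustrative only, and the result column name file is an assumption to be verified against your installation.

SELECT t.campaign, t.clicks
FROM (call s3.getTextFiles('marketing/*.csv')) f,
     TEXTTABLE(f.file COLUMNS campaign STRING, clicks INTEGER HEADER) t;
SQL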

Stored procedures shared by all File Based Connectors

For each procedure, the input parameters (data type / whether nulls are allowed), the purpose, and an example call are given below.
getFiles

pathAndPattern (string / not null)

Retrieves all files as BLOBs at the given path, optionally filtered by an extension pattern.

If the extension pattern is specified, it filters the files in the directory referenced by the base path.

If the extension pattern is not specified and the path is a directory, all files in the directory are returned. Otherwise, the single file referenced is returned.

Usage

call <alias>.getFiles('pathAndPattern')
SQL

Example

call s3.getFiles('marketing/*.csv')
SQL
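As an illustrative sketch, the BLOB result of getFiles can be fed into XMLTABLE after parsing it as XML. The alias, path, element names, and columns are placeholders, and the result column name file is an assumption.

SELECT x.id, x.amount
FROM (call s3.getFiles('marketing/orders.xml')) f,
     XMLTABLE('/orders/order' PASSING XMLPARSE(DOCUMENT f.file) COLUMNS id INTEGER PATH '@id', amount STRING PATH 'amount') x;
SQL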
getTextFiles

pathAndPattern (string / not null)

Retrieves all files as CLOBs at the given path, optionally filtered by an extension pattern.

The same files as with getFiles are retrieved; the only difference is that the results are returned as CLOB values.

Usage

call <alias>.getTextFiles('pathAndPattern')
SQL

Example

call s3.getTextFiles('marketing/*.csv')
SQL
saveFile

pathAndPattern (string / not null)

value (clob, blob, xml / not null)

Saves the CLOB, BLOB, or XML value to the given path. The path should reference a new file location or an existing file to be completely overwritten. Both absolute and relative paths are valid as input.

Usage

call <alias>.saveFile('pathAndPattern', value)
SQL

(info) pathAndPattern has a slightly different meaning here than in the other file-based connector procedures: it specifies a path and file name, not a pattern.

Example

call <alias>.saveFile('marketing/july_2016.csv', (SELECT cast(data as xml) FROM report_server.marketing_data_july_2016));
SQL
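Conversely, query results can be written out as CSV by combining saveFile with the TEXTAGG aggregate function, as in the following sketch. The alias, target path, and source table are placeholders, and the exact TEXTAGG syntax should be verified against the server's documentation.

call s3.saveFile('exports/customers.csv', (SELECT TEXTAGG(FOR c.id, c.name HEADER) FROM customers c));
SQL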

listFiles

pathAndPattern (string / not null)

Lists all files in the specified directory.

Usage

call <alias>.listFiles('pathAndPattern')
SQL

Example

call s3.listFiles('marketing/*.csv')
SQL

deleteFiles

pathAndPattern (string / not null)

Deletes all files matching the pattern.

Usage

call <alias>.deleteFiles('pathAndPattern')
SQL

(warning) call <alias>.deleteFiles('') will delete all files in the directory without further confirmation.