How to Build an Export BatchJob


Here’s an example of how to export Units from Postgres to a file store location (AWS S3, Azure Blob Storage, etc.). Run c3ShowType(Export) for more details; essentially, the job performs a fetch by default if no specific action is defined and writes the fetch results in the format specified by the BatchExportSpec. Among other things, you can specify the number of files produced or the number of objects written.

Export.startExport(BatchExportSpec.make({
    targetType: "Unit",                        // type whose records are fetched and exported
    contentType: "csv",                        // output file format
    filter: "exists(name) && exists(concept)", // fetch filter applied to the target type
    csvHeader: "id,name,concept",              // columns written to the csv files
    targetPath: "Unit_Test",                   // directory the output files are written to
    numFiles: 1,                               // number of output files to produce
    deleteExisting: true                       // remove any previous files at targetPath first
}))

This startExport() call returns an Export object. (Note that Export extends BatchJob, so you can interact with a running Export the same way you’d interact with a running BatchJob.)
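
For example, you can hold on to the returned object and work with it directly. A minimal sketch (the spec is abbreviated here; treating cancel() as inherited from BatchJob is an assumption, so verify with c3ShowType(BatchJob) on your environment):

// Keep a handle on the Export returned by startExport (spec abbreviated).
var job = Export.startExport(BatchExportSpec.make({ targetType: "Unit", contentType: "csv", targetPath: "Unit_Test" }));

job.id;       // the export id used with the status()/files() calls below
job.cancel(); // assumption: cancel() is inherited from BatchJob and stops the running job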

To check the status of an Export, grab the id of the Export object returned by the startExport() method and run:

Export.status('export-id')

Once the job has completed (status = completed), you can grab the FILE_PATH with the following command:

Export.files('export-id')
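
Putting the two calls together, you can check for completion and then list the generated paths from the console. A rough sketch (assuming the status result exposes the job state as a status field, per the status = completed note above, and that the c3Grid console helper is available):

var st = Export.status('export-id');
if (st.status === 'completed') {
    var files = Export.files('export-id'); // the FILE_PATH(s) produced by the job
    c3Grid(files);                         // render the paths in the console grid
}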

Then append the FILE_PATH to your environment URL in the following format:

https://environment.c3.ai/file/1/TENANT_NAME/TAG_NAME/FILE_PATH

The FILE_PATH will look something like this (where azure:// can be exchanged for s3:// on AWS-backed environments):
azure://environment/fs/tenant/tag/inbox/export/Unit_Test/Unit_Test-GM9ZNWHCJ1-0.csv

Example Full URL:
https://demo-lightbulb.c3.ai/file/1/lightBulb/demo/azure://environment/fs/tenant/tag/inbox/export/Unit_Test/Unit_Test-GM9ZNWHCJ1-0.csv

The full URL (environment-specific URL + FILE_PATH) will display the CSV in the web browser.
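
In other words, the full URL is plain string concatenation. A quick sketch using the example values above (the host, tenant, and tag are environment-specific placeholders):

var envUrl   = "https://demo-lightbulb.c3.ai"; // environment-specific URL
var tenant   = "lightBulb";
var tag      = "demo";
var filePath = "azure://environment/fs/tenant/tag/inbox/export/Unit_Test/Unit_Test-GM9ZNWHCJ1-0.csv";

var fullUrl = envUrl + "/file/1/" + tenant + "/" + tag + "/" + filePath;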

To download the file, right-click in the webpage and select “Save As”; this will prompt you to save the file onto your machine.


Export Type Definition