How to Build an Export BatchJob


Here’s an example of how to export Units from Postgres to a file-store location (AWS S3, Azure Blob Storage, etc.). Run c3ShowType(Export) for more details; in short, an Export performs a fetch by default if no specific action is defined, and writes the fetch results in the format specified in the BatchExportSpec type. The number of files produced and the number of objects per file, among other things, can also be specified.

    var exportJob = Export.startExport(BatchExportSpec.make({
      targetType: "Unit",
      contentType: "csv",
      filter: "exists(name) && exists(concept)",
      csvHeader: "id,name,concept",
      targetPath: "Unit_Test",
      deleteExisting: true
    }));

This startExport() command will return an Export object. (Note that Export extends BatchJob, so we can interact with a running Export the same way we’d interact with any running BatchJob.)

To check the status of an Export, grab the id of the Export object returned by the startExport() method and run:
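For example, a minimal console sketch (this assumes the standard BatchJob status() method and uses a placeholder id; verify the exact signature with c3ShowType(Export)):

    var exportJob = Export.get("<export-id>");  // placeholder: id of the Export returned by startExport()
    exportJob.status();                         // standard BatchJob status call; inspect the status field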


Once the job has completed (status = completed), you can grab the FILE_PATH with the following command:
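One way to do this (an assumption, not a verified API; confirm the exact call with c3ShowType(FileSystem)) is to list the files written under the spec’s targetPath and read each entry’s url:

    // Hypothetical sketch: list the exported files under the targetPath ("Unit_Test")
    FileSystem.inst().listFiles("Unit_Test").files.map(function (f) {
      return f.url;  // each url is a FILE_PATH
    });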


Then append the FILE_PATH to your environment URL in the following format:
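With placeholders for the environment-specific pieces (both segments below are illustrative, not literal values):

    <ENVIRONMENT-SPECIFIC URL>/<FILE_PATH>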

The FILE_PATH will look something like this (where azure:// is replaced by s3:// on AWS-backed environments):
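A hypothetical illustration (the container and file names are invented for the example):

    azure://<container-name>/Unit_Test/Unit_Test_0.csv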

Example Full URL:
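Again with invented placeholder segments, the environment URL followed by the FILE_PATH:

    https://<environment-url>/azure://<container-name>/Unit_Test/Unit_Test_0.csv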

The above full URL (ENVIRONMENT-SPECIFIC URL + FILE_PATH) will display the CSV in the web browser.

To download, simply right-click in the webpage and select “Save As”, which will prompt you to save the file onto your machine.

Export Type Definition