


Connect Superblocks to S3 to build apps that can list, read, delete, and upload files in S3:

  • Read data from S3 and use it in scheduled reports
  • Upload data from API steps or local files to S3

Setting up AWS S3

1. Add integration

Select AWS S3 from the integrations page.

2. Configure settings

Fill out the form with the following settings:

Name (required): Name that will be displayed to users when selecting this integration in Superblocks
Region (required): AWS region where the S3 bucket is hosted, e.g. us-east-1
Access Key ID (required): Access key ID for your AWS account
Secret Key (required): Secret access key for your AWS account
IAM Role ARN (optional): ARN of the role for Superblocks to assume for accessing S3 resources

3. Test and save

Click Test Connection to check that Superblocks can connect to the data source.


If using Superblocks Cloud, add these Superblocks IPs to your allowlist (not necessary for the On-Premise Agent).

After connecting successfully, click Create to save the integration.

4. Set profiles

Optionally, configure different profiles for separate development environments.


S3 Connected: You can now list, read, delete, and upload files in S3 from any Application, Workflow, or Scheduled Job.

Creating AWS S3 steps

Connect to your S3 integration from Superblocks by creating steps in Application APIs, Workflows, and Scheduled Jobs. An S3 step can perform the following actions:

  • List files in an S3 bucket
  • Read a file from S3
  • Delete files from S3
  • Upload files to S3


Superblocks also supports connecting to AWS services with Boto3 in Python steps if you require additional functionality.
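For example, a Python step could use Boto3 to list the objects in a bucket. The sketch below is illustrative: the bucket name and prefix are placeholders, and credentials are assumed to come from the environment (or the integration's profile) rather than being hardcoded.

```python
def object_keys(response):
    """Extract object keys from a list_objects_v2-style response dict."""
    return [obj["Key"] for obj in response.get("Contents", [])]


def list_bucket(bucket, prefix=""):
    """List object keys in a bucket; AWS credentials come from the environment."""
    import boto3  # imported lazily so object_keys stays usable without AWS set up

    s3 = boto3.client("s3")
    response = s3.list_objects_v2(Bucket=bucket, Prefix=prefix)
    return object_keys(response)


# Hypothetical usage inside a Python step:
# keys = list_bucket("my-example-bucket", prefix="reports/")
```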

Use cases


Applications

Drag files into an application using the FilePicker component and upload them to S3. See the FilePicker guide for more details.

Use a form component and file picker to upload a file to S3
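A minimal sketch of that upload flow, assuming the picked file arrives as a dict with a `name` and base64-encoded `contents` (the exact FilePicker shape and the bucket name here are assumptions, not the documented API):

```python
import base64


def decode_filepicker_file(file):
    """Decode a FilePicker-style dict (hypothetical shape) into (name, bytes)."""
    return file["name"], base64.b64decode(file["contents"])


def upload_to_s3(bucket, key, data):
    """Upload raw bytes to S3 with put_object."""
    import boto3  # imported lazily so the decode helper works without AWS set up

    boto3.client("s3").put_object(Bucket=bucket, Key=key, Body=data)


# Hypothetical usage in an API step:
# name, data = decode_filepicker_file(FilePicker1.files[0])
# upload_to_s3("my-uploads-bucket", f"uploads/{name}", data)
```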


Workflows

Export a Google Sheet as a CSV to an S3 bucket.

Use S3 in a workflow to store an exported Google Sheet as a CSV
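The CSV export step of such a workflow could be sketched as follows; the rows-of-dicts input shape and the bucket/key names are assumptions for illustration:

```python
import csv
import io


def rows_to_csv(rows):
    """Serialize a list of dicts (e.g. Google Sheet rows) into a CSV string."""
    if not rows:
        return ""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=list(rows[0].keys()))
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()


def export_csv_to_s3(bucket, key, rows):
    """Upload the serialized CSV to S3; bucket and key are placeholders."""
    import boto3  # imported lazily so rows_to_csv stays testable offline

    boto3.client("s3").put_object(
        Bucket=bucket, Key=key, Body=rows_to_csv(rows).encode("utf-8")
    )
```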

Scheduled Jobs

Query order analytics, update an inventory prediction model, and upload it to S3 for the data science team to use.

Save a prediction model in S3 daily using a scheduled job
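The upload half of that job could look like the sketch below; the `models/prediction-<date>.pkl` key naming and pickle serialization are assumptions, not a prescribed convention:

```python
import pickle
from datetime import date


def model_key(run_date, prefix="models"):
    """Build a dated S3 key, e.g. 'models/prediction-2024-05-01.pkl' (naming is an assumption)."""
    return f"{prefix}/prediction-{run_date.isoformat()}.pkl"


def upload_model(bucket, model, run_date=None):
    """Pickle the model and upload it to S3 under a dated key."""
    import boto3  # imported lazily so model_key stays testable offline

    key = model_key(run_date or date.today())
    boto3.client("s3").put_object(Bucket=bucket, Key=key, Body=pickle.dumps(model))
    return key
```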


Check out our guide on common errors across database integrations. If you encounter an error that isn't covered in the guide, or the provided steps don't resolve it, please contact us.