Use this guide when you want Aampe to provide and manage the S3 bucket. You simply upload your event data to a bucket we provision for you.

Getting Started

Step 1: Request a Bucket from Aampe

Contact the Aampe team at [email protected] to request a dedicated S3 bucket. We will provision a bucket and provide you with:
  • Bucket name
  • Region
  • Write credentials (one of the options below)

Step 2: Choose a Credential Type

Aampe can provide credentials in several formats depending on your infrastructure:
  • IAM User Access Keys: Direct S3 uploads, AWS CLI, most ETL tools
  • Pre-signed URLs: One-time uploads, serverless workflows
  • Cross-Account Role: Organizations that prefer role-based access from their own AWS account
Let us know your preference when requesting the bucket.
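
If you choose the cross-account role option, your pipeline assumes the role from your AWS account and uses the temporary credentials it returns. A minimal boto3 sketch, assuming Aampe supplies the role ARN (the placeholder names below are illustrative):

import boto3

# Assume the cross-account role provisioned by Aampe; the role ARN is a
# placeholder that Aampe supplies along with your bucket details.
sts = boto3.client('sts')
assumed = sts.assume_role(
    RoleArn='arn:aws:iam::<AAMPE_ACCOUNT_ID>:role/<ROLE_PROVIDED_BY_AAMPE>',
    RoleSessionName='aampe-upload'
)

# Build an S3 client from the temporary credentials and upload as usual.
creds = assumed['Credentials']
s3 = boto3.client(
    's3',
    aws_access_key_id=creds['AccessKeyId'],
    aws_secret_access_key=creds['SecretAccessKey'],
    aws_session_token=creds['SessionToken']
)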

Step 3: Configure Your Data Pipeline

Once you receive credentials, configure your data pipeline to write to the Aampe bucket.

Using AWS CLI

aws s3 cp events_20240115.json s3://<AAMPE_BUCKET_NAME>/events_20240115.json
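
If you prefer not to pass credentials inline, you can store them in a named profile with aws configure --profile aampe and add --profile aampe to your s3 commands.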

Using AWS SDK (Python Example)

import boto3

# Create a client with the write credentials Aampe provides.
s3 = boto3.client(
    's3',
    aws_access_key_id='<ACCESS_KEY_PROVIDED_BY_AAMPE>',
    aws_secret_access_key='<SECRET_KEY_PROVIDED_BY_AAMPE>',
    region_name='<REGION_PROVIDED_BY_AAMPE>'
)

# Upload the local file to the Aampe bucket, keeping the same object key.
s3.upload_file('events_20240115.json', '<AAMPE_BUCKET_NAME>', 'events_20240115.json')
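
Note that upload_file transparently switches to multipart uploads for large files, so the same call works for files of any size. If you prefer not to hard-code credentials, boto3 also reads them from the AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY, and AWS_DEFAULT_REGION environment variables.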

Using Pre-signed URLs

If Aampe provides pre-signed URLs, you can upload directly via HTTP:

curl -X PUT -T events_20240115.json "<PRE_SIGNED_URL>"
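
The same upload works from Python with the requests library (a sketch, assuming requests is installed; the pre-signed URL is the one Aampe provides):

import requests

# PUT the file body to the pre-signed URL; no AWS credentials are needed
# because the URL itself carries a temporary signature.
with open('events_20240115.json', 'rb') as f:
    response = requests.put('<PRE_SIGNED_URL>', data=f)
response.raise_for_status()

Keep in mind that pre-signed URLs expire after a set period, so request a fresh one for each upload window.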

Step 4: Upload Your Data

Upload event data in one of these formats:
  • JSON or newline-delimited JSON (NDJSON) - Preferred
  • CSV
  • Parquet

File Naming Convention

Use a consistent naming pattern:
  • Daily files: events_YYYYMMDD.json (e.g., events_20240115.json)
  • With partitions: /year=2024/month=01/day=15/events_20240115.json
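
For example, a small Python helper (a sketch; build_key is a hypothetical name) that produces both patterns for a given date:

from datetime import date

def build_key(d: date, partitioned: bool = False) -> str:
    """Return the object key for a day's events file."""
    name = f"events_{d:%Y%m%d}.json"
    if partitioned:
        return f"year={d:%Y}/month={d:%m}/day={d:%d}/{name}"
    return name

print(build_key(date(2024, 1, 15)))        # events_20240115.json
print(build_key(date(2024, 1, 15), True))  # year=2024/month=01/day=15/events_20240115.json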

Data Format

Your event data should follow the Aampe Data Model.
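
For illustration, a sketch of writing events as NDJSON in Python; the field names here are placeholders, and the linked data model defines the actual schema:

import json

# Placeholder events; consult the Aampe Data Model for the real field names.
events = [
    {"user_id": "u-123", "event_name": "purchase", "timestamp": "2024-01-15T09:30:00Z"},
    {"user_id": "u-456", "event_name": "app_open", "timestamp": "2024-01-15T09:31:12Z"},
]

# NDJSON is one JSON object per line.
with open('events_20240115.json', 'w') as f:
    for event in events:
        f.write(json.dumps(event) + "\n")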

Verification

Once you start uploading, the Aampe team will confirm that we are successfully receiving your data.

Frequently Asked Questions

Where is the data stored?

Aampe buckets are hosted in AWS regions optimized for our data processing infrastructure. If you have specific data residency requirements, let us know and we will accommodate where possible.

Can I use my existing data pipeline?

Yes. Most ETL tools (Fivetran, Airbyte, dbt, etc.) and cloud services (AWS Glue, Lambda, Step Functions) support writing to S3 with access keys.
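
As one example, a minimal Lambda handler (a sketch; the trigger wiring and bucket names are assumptions) that copies newly landed objects from your own bucket into the Aampe bucket:

import boto3

s3 = boto3.client('s3')

def lambda_handler(event, context):
    # Triggered by an S3 "object created" notification on your bucket;
    # copies each new object into the Aampe bucket unchanged.
    # (Keys containing special characters arrive URL-encoded and may
    # need decoding with urllib.parse.unquote_plus.)
    for record in event['Records']:
        source_bucket = record['s3']['bucket']['name']
        key = record['s3']['object']['key']
        s3.copy_object(
            Bucket='<AAMPE_BUCKET_NAME>',
            Key=key,
            CopySource={'Bucket': source_bucket, 'Key': key}
        )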

What happens to my data?

Aampe processes your event data according to our data processing agreement. Data is used solely for providing personalization services to your organization.

Contact us at [email protected] if you need assistance setting up your data pipeline.