There are two ways to send data to Aampe from BigQuery:
  • Direct Read Access from Your BQ Dataset
  • Push to Google Cloud Storage Bucket

Option 1: Direct Read Access from Your BQ Dataset

Aampe will access your BigQuery dataset with a dedicated service account that we provide. To grant Aampe read-only access to your BQ dataset, create a custom role with the following permissions and assign it to the Aampe service account:
  • bigquery.datasets.get
  • bigquery.readsessions.create
  • bigquery.readsessions.getData
  • bigquery.tables.get
  • bigquery.tables.getData
  • bigquery.tables.list

Step-by-step Procedure

  1. Create the custom role:
    • Go to https://console.cloud.google.com/iam-admin/roles and press the Create Role button.
    • Fill in the form (title: Aampe Custom BQ Read Access).
    • Add the following permissions:
      • bigquery.datasets.get
      • bigquery.readsessions.create
      • bigquery.readsessions.getData
      • bigquery.tables.get
      • bigquery.tables.getData
      • bigquery.tables.list
  2. Grant permissions:
    • Go to https://console.cloud.google.com/iam-admin/iam?project= (append your project ID).
    • Press the Grant access button.
    • In the Add principals field, enter the Aampe service account (we will send you the service account to use for the integration via email).
    • In the Assign roles field, select the Aampe Custom BQ Read Access role and save. If you prefer to script these two steps, see the sketch after this list.
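If scripting is more convenient than the console, the sketch below shows one way to create the custom role and grant it to the Aampe service account with the Google API Python client. This is a minimal sketch, not part of the official setup: the project ID, role ID, and service account email are placeholders, and it assumes your default credentials can administer IAM on the project.

```python
# Sketch: create the custom role and grant it to the Aampe service account.
# Assumes google-api-python-client and google-auth are installed.
# PROJECT_ID, ROLE_ID, and AAMPE_SA are placeholders -- use your own project
# and the service account email that Aampe sends you.
from googleapiclient import discovery
import google.auth

PROJECT_ID = "your-project-id"
ROLE_ID = "aampeCustomBQReadAccess"
AAMPE_SA = "aampe-integration@example.iam.gserviceaccount.com"

PERMISSIONS = [
    "bigquery.datasets.get",
    "bigquery.readsessions.create",
    "bigquery.readsessions.getData",
    "bigquery.tables.get",
    "bigquery.tables.getData",
    "bigquery.tables.list",
]

credentials, _ = google.auth.default()

# 1. Create the custom role (equivalent to the Create Role form above).
iam = discovery.build("iam", "v1", credentials=credentials)
iam.projects().roles().create(
    parent=f"projects/{PROJECT_ID}",
    body={
        "roleId": ROLE_ID,
        "role": {
            "title": "Aampe Custom BQ Read Access",
            "includedPermissions": PERMISSIONS,
            "stage": "GA",
        },
    },
).execute()

# 2. Grant the role to the Aampe service account (equivalent to Grant access).
crm = discovery.build("cloudresourcemanager", "v1", credentials=credentials)
policy = crm.projects().getIamPolicy(resource=PROJECT_ID, body={}).execute()
policy.setdefault("bindings", []).append(
    {
        "role": f"projects/{PROJECT_ID}/roles/{ROLE_ID}",
        "members": [f"serviceAccount:{AAMPE_SA}"],
    }
)
crm.projects().setIamPolicy(resource=PROJECT_ID, body={"policy": policy}).execute()
```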
Once access is granted, provide us with the following details:
  • Project ID
  • Dataset ID
  • Data location
  • Table names (all the tables you would like to share data from)
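As a quick sanity check before sending these details, any principal that holds the custom role can list the shared tables and read a few rows. The sketch below assumes the google-cloud-bigquery package; the project and dataset names are placeholders.

```python
# Sketch: confirm that a principal holding "Aampe Custom BQ Read Access"
# can list and read the shared tables. Names below are placeholders.
from google.cloud import bigquery  # pip install google-cloud-bigquery

PROJECT_ID = "your-project-id"
DATASET_ID = "your_dataset"

client = bigquery.Client(project=PROJECT_ID)

dataset = client.get_dataset(f"{PROJECT_ID}.{DATASET_ID}")
print("Data location:", dataset.location)  # the "Data location" value to send us

for table in client.list_tables(dataset):
    # Reading a handful of rows exercises bigquery.tables.getData.
    rows = client.list_rows(table.reference, max_results=5)
    print(table.table_id, "->", rows.total_rows, "rows")
```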

Option 2: Push to Google Cloud Storage Bucket

When the data size is very large, we recommend pushing the data to a Cloud Storage bucket. We will then copy this data from your GCS bucket to one set up by Aampe for ingestion into our system. If you prefer, you can push directly to a bucket created by Aampe.

Steps

  1. Create a bucket on GCP.
  2. Provide Aampe with access to that bucket (we will send you the details of the account that needs access).
  3. Set up an export from BigQuery to the Cloud Storage bucket: Export Data
  4. Automate the export for each day of event data (a sketch follows after this list).
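As an illustration of steps 3 and 4, the sketch below runs a daily EXPORT DATA statement from BigQuery into a dated path in the bucket. It is only a sketch: the project, dataset, table, bucket, and timestamp column names are placeholders, and you would typically schedule it with a BigQuery scheduled query, Cloud Scheduler, or your own orchestrator.

```python
# Sketch: export yesterday's events from BigQuery to a dated GCS path.
# Project, dataset, table, bucket, and timestamp column names are placeholders.
import datetime
from google.cloud import bigquery  # pip install google-cloud-bigquery

PROJECT_ID = "your-project-id"
SOURCE_TABLE = "your_dataset.events"
BUCKET = "your-export-bucket"

client = bigquery.Client(project=PROJECT_ID)
export_date = (datetime.date.today() - datetime.timedelta(days=1)).isoformat()

query = f"""
EXPORT DATA OPTIONS (
  uri = 'gs://{BUCKET}/events/{export_date}/*.json',
  format = 'JSON',
  overwrite = true
) AS
SELECT *
FROM `{PROJECT_ID}.{SOURCE_TABLE}`
WHERE DATE(event_timestamp) = '{export_date}'
"""

client.query(query).result()  # waits for the export job to finish
print(f"Exported events for {export_date} to gs://{BUCKET}/events/{export_date}/")
```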
For reference on the data model, please refer to Data Models.