There are two ways to send data to Aampe from BigQuery:

Method 1: Direct Read Access from Your BQ Dataset

This method is generally used when events data is exported directly from Firebase. Aampe will access your BigQuery dataset using a dedicated service account, which we will provide to you. To grant Aampe read-only access to your BQ dataset, grant the following permissions to the Aampe service account:
  • bigquery.datasets.get
  • bigquery.readsessions.create
  • bigquery.readsessions.getData
  • bigquery.tables.get
  • bigquery.tables.getData
  • bigquery.tables.list
The predefined BigQuery Data Viewer role (the closest Google-provided role to what Aampe needs) is intended for internal read access within your company, and is therefore too broad to grant to an external party. We recommend that you:
  • Create a custom role with only the minimal permissions listed above
  • Grant this role to the Aampe service account
Note: make sure you have the “Role Administrator” and “Security Administrator” roles before you begin.
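As a sketch, the custom role can also be created from the command line with gcloud; the role ID aampeCustomBQReadAccess and PROJECT_ID below are illustrative placeholders:

```shell
# Create a custom role containing only the permissions Aampe needs.
# Replace PROJECT_ID with your GCP project; the role ID is illustrative.
gcloud iam roles create aampeCustomBQReadAccess \
  --project=PROJECT_ID \
  --title="Aampe Custom BQ Read Access" \
  --stage=GA \
  --permissions=bigquery.datasets.get,bigquery.readsessions.create,bigquery.readsessions.getData,bigquery.tables.get,bigquery.tables.getData,bigquery.tables.list
```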

Step-by-step Procedure

  1. Create the custom role:
    • Go to https://console.cloud.google.com/iam-admin/roles?project=
    • Click the Create role button, name the role “Aampe Custom BQ Read Access”, and add the six permissions listed above
  2. Grant the role:
    • Go to https://console.cloud.google.com/iam-admin/iam?project=
    • Click the Grant access button
    • In the “Add principals” field, enter the Aampe service account (we will email you the service account to use for the integration)
    • In the “Assign roles” field, select “Aampe Custom BQ Read Access”
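The grant in step 2 can equally be done from the command line; in this sketch, the service-account email is a placeholder (Aampe will send you the real one), as are PROJECT_ID and the custom role ID:

```shell
# Grant the custom role to the Aampe service account.
# The email below is a placeholder; use the account Aampe provides.
gcloud projects add-iam-policy-binding PROJECT_ID \
  --member="serviceAccount:aampe-integration@example.iam.gserviceaccount.com" \
  --role="projects/PROJECT_ID/roles/aampeCustomBQReadAccess"
```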
Finally, provide us with the following details:
  • Project ID
  • Dataset ID
  • Data location
  • Table names (all the tables you would like to share data from)

Method 2: Push to Google Cloud Storage Bucket

When the data volume is very large, we recommend pushing the data to a Cloud Storage bucket instead. We will then copy this data from your GCS bucket to one set up by Aampe for ingestion into our system. If you prefer, you can push directly to a bucket created by Aampe.

Steps

  1. Create a bucket on GCP.
  2. Grant Aampe access to that bucket (we will provide the details of the account that needs access).
  3. Set up an export from BigQuery to the Cloud Storage bucket (see the BigQuery Export Data documentation).
  4. Automate the export so that it runs daily, one export per day of event data.
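As an illustrative sketch of steps 2–4, assuming Firebase-style date-sharded events_YYYYMMDD tables and placeholder names (BUCKET, PROJECT_ID, DATASET, and the Aampe account email):

```shell
# Step 2: grant the Aampe account read access to the bucket
# (the service-account email is a placeholder; use the one Aampe provides).
gsutil iam ch \
  "serviceAccount:aampe-integration@example.iam.gserviceaccount.com:roles/storage.objectViewer" \
  gs://BUCKET

# Steps 3-4: export yesterday's events table to the bucket; run this daily,
# e.g. from cron or Cloud Scheduler. Note `date -d` is GNU date syntax.
EXPORT_DATE=$(date -u -d "yesterday" +%Y%m%d)
bq extract \
  --destination_format=NEWLINE_DELIMITED_JSON \
  --compression=GZIP \
  "PROJECT_ID:DATASET.events_${EXPORT_DATE}" \
  "gs://BUCKET/events/${EXPORT_DATE}/*.json.gz"
```

A BigQuery scheduled query with an EXPORT DATA statement is an alternative way to automate the daily export without an external scheduler.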
For reference on the data model, please refer to Data Models.