BigQuery Connector - Adam
1. Connect
2. Select Tables
3. Process
4. Download
Step 1: Connect to Google BigQuery

Provide your GCP project details and upload a service account key to connect.

Project ID: found in the GCP Console → Project selector
Dataset: the BigQuery dataset containing your tables

Click to upload or drag & drop

e.g. service-account-key.json

Or paste JSON directly below

Setup guide

1. Open GCP Console → IAM & Admin → Service Accounts
2. Create a new service account with the roles BigQuery Data Viewer and BigQuery Job User
3. Create a key → JSON format → download the file
4. Upload the downloaded JSON file here — Adam never stores keys beyond your session
5. If you use VPC Service Controls, add Adam's service account to the perimeter allowlist
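Before uploading, it can help to sanity-check the key file. The sketch below is an illustrative pre-upload check (not part of Adam itself): it verifies the JSON parses and contains the fields every GCP service-account key includes. The helper name and error messages are hypothetical.

```python
import json

# Fields present in every GCP service-account key file.
REQUIRED_FIELDS = {
    "type", "project_id", "private_key_id", "private_key",
    "client_email", "token_uri",
}

def validate_service_account_key(raw: str) -> list:
    """Return a list of problems; an empty list means the key looks usable.

    Illustrative pre-upload check only, not Adam's actual validation.
    """
    problems = []
    try:
        key = json.loads(raw)
    except json.JSONDecodeError as exc:
        return ["not valid JSON: %s" % exc]
    if key.get("type") != "service_account":
        problems.append("'type' must be 'service_account'")
    for field in sorted(REQUIRED_FIELDS - set(key)):
        problems.append("missing field: '%s'" % field)
    return problems
```

Pasting the raw JSON into the form field is equivalent to uploading the file; the same checks apply.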
Step 2: Select tables to process

Connected to my-analytics-project / production_analytics — 5 tables found

Select BigQuery tables to anonymise.
Schema preview — customer_profiles
customer_profiles · NATIVE TABLE · EU
Field               | Type      | Mode     | Description
customer_id         | STRING    | REQUIRED | Primary identifier
email_address       | STRING    | NULLABLE | PII — contact email
full_name           | STRING    | NULLABLE | PII — display name
age_band            | STRING    | NULLABLE | Age bracket, e.g. 25-34
acquisition_channel | STRING    | NULLABLE |
created_at          | TIMESTAMP | REQUIRED | Account creation time
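To make the per-field rules concrete, here is a sketch of how a row from this schema might be anonymised: PII columns are replaced by a deterministic keyed hash (so joins across tables still line up), the display name is redacted outright, and coarse non-PII fields pass through. The key, function names, and per-field choices are illustrative assumptions, not Adam's actual pipeline.

```python
import hashlib
import hmac

# Hypothetical per-run secret; a real run would use a managed secret.
PSEUDONYM_KEY = b"example-run-secret"

def pseudonymise(value: str) -> str:
    """Deterministic keyed hash, so the same input maps to the same token."""
    return hmac.new(PSEUDONYM_KEY, value.encode(), hashlib.sha256).hexdigest()

def anonymise_profile(row: dict) -> dict:
    """Apply per-field rules matching the customer_profiles schema above."""
    return {
        "customer_id": pseudonymise(row["customer_id"]),
        "email_address": pseudonymise(row["email_address"]) if row.get("email_address") else None,
        "full_name": "[REDACTED]" if row.get("full_name") else None,
        "age_band": row.get("age_band"),              # already coarse, kept as-is
        "acquisition_channel": row.get("acquisition_channel"),
        "created_at": row["created_at"],              # non-PII, kept
    }
```

Because the hash is keyed, rotating the secret between runs breaks linkability across exports when that is desired.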
Step 3: Processing tables

Streaming data from BigQuery, running PII detection, and applying anonymisation.

[10:44:01] Authenticated to GCP project: my-analytics-project
[10:44:01] BigQuery API: europe-west2 · dataset: production_analytics
[10:44:02] Starting processing run for 2 tables
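One common way to run a step like this server-side is to push the hashing into BigQuery itself, generating a Standard SQL SELECT that rewrites the detected PII columns. The sketch below builds such a query; the function name is hypothetical and the column lists are taken from the schema preview above, not from Adam's actual detector. `SHA256` and `TO_HEX` are real BigQuery Standard SQL functions.

```python
def build_anonymise_sql(table, pii_columns, all_columns):
    """Build a BigQuery Standard SQL SELECT that hashes PII columns in place.

    Illustrative sketch: a real run would derive pii_columns from detection.
    """
    exprs = []
    for col in all_columns:
        if col in pii_columns:
            # Hash the column's string form; keep the original column name.
            exprs.append("TO_HEX(SHA256(CAST(%s AS STRING))) AS %s" % (col, col))
        else:
            exprs.append(col)
    return "SELECT %s FROM `%s`" % (", ".join(exprs), table)
```

The generated query can then be submitted as a BigQuery job whose destination is the `_anon` table.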
Step 4: Download anonymised data

Processing complete. Your anonymised BigQuery exports are ready.

user_events_anon: 12.4M rows · Parquet · 284 MB
customer_profiles_anon: 284,100 rows · CSV · 18.7 MB
Provenance Report: PII detection + anonymisation audit trail
Explore Data →
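For the CSV export, the shape of the output is straightforward: a header row of field names followed by one row per anonymised record. A minimal stdlib sketch, assuming the rows are plain dicts (this is not Adam's actual export path):

```python
import csv
import io

def write_csv_export(rows, fieldnames):
    """Serialise anonymised rows to CSV text, header first.

    Illustrative sketch of the customer_profiles_anon export format.
    """
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=fieldnames)
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()
```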
Done.