BigQuery

With MadKudu + BigQuery, send your engagement, intent or CRM data to MadKudu.

Note: this integration is in Beta. Tell us what you think at product@madkudu.com

MadKudu Tip

If you'd like to send MadKudu data to your BigQuery instance (as a Destination), you can do so through the Amazon S3 Destination.

What does this integration do?

Turning on the BigQuery integration allows you to send BigQuery data (product usage, web visits, contact properties) to MadKudu, to surface to your Sales reps in the MadKudu iFrame in Salesforce or to leverage in a scoring model.

Important notes

  1. Data in BigQuery typically resides in tables. When sharing data with MadKudu, the best practice is to create separate tables for Event data and Contact data. You don't need to send all objects: for example, if your CRM is already connected to MadKudu, you can send only event data along with the contact email.

  2. Each table shared with MadKudu must contain a created_at and an updated_at column. MadKudu pulls data incrementally instead of re-reading your whole table at every pull, which keeps pulls fast (see the query sketch after these notes):

    • new rows are pulled based on created_at

    • updated rows are pulled based on updated_at

  3. The integration uses the following BigQuery scope: https://www.googleapis.com/auth/bigquery, which corresponds to "View and manage your data in Google BigQuery and see the email address for your Google Account". If you'd like to restrict MadKudu's access to only certain datasets and tables, configure these permissions in BigQuery for the user you'll connect to MadKudu with (as sketched below).
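
For reference, here is a minimal sketch of granting a dedicated user read-only access to a single dataset with the google-cloud-bigquery Python client. The project ID, dataset name, and connecting-user email are placeholders; the same result can be achieved in the BigQuery console, and this is only one way to scope MadKudu's access.

```python
from google.cloud import bigquery

# Placeholder values for illustration.
PROJECT = "your-project-id"
DATASET = "madkudu_share"
CONNECTING_USER = "madkudu-connector@yourcompany.com"

client = bigquery.Client(project=PROJECT)
dataset = client.get_dataset(f"{PROJECT}.{DATASET}")

# Grant dataset-level read access to the user you will authenticate with in MadKudu.
entries = list(dataset.access_entries)
entries.append(
    bigquery.AccessEntry(
        role="READER",
        entity_type="userByEmail",
        entity_id=CONNECTING_USER,
    )
)
dataset.access_entries = entries
client.update_dataset(dataset, ["access_entries"])

# Note: the connecting user typically also needs permission to run query jobs
# in the project (for example, the BigQuery Job User role).
```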

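As described in note 2, the created_at and updated_at columns are what make incremental pulls possible. The query below is only a rough illustration of that pattern, not MadKudu's actual implementation; the project, dataset, table name, and last-pull timestamp are placeholders.

```python
from datetime import datetime, timezone

from google.cloud import bigquery

client = bigquery.Client(project="your-project-id")

# Timestamp of the previous pull (placeholder value for illustration).
last_pull = datetime(2023, 12, 1, tzinfo=timezone.utc)

# Only rows created or updated since the last pull are read,
# instead of scanning the whole table every time.
query = """
    SELECT *
    FROM `your-project-id.madkudu_share.events`
    WHERE created_at > @last_pull
       OR updated_at > @last_pull
"""
job_config = bigquery.QueryJobConfig(
    query_parameters=[
        bigquery.ScalarQueryParameter("last_pull", "TIMESTAMP", last_pull),
    ]
)
rows = client.query(query, job_config=job_config).result()
```
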
What Objects can you send to MadKudu?

MadKudu works with the following entities and required properties. If your column names don't match these exactly, you'll be able to map them in MadKudu, but the columns do need to exist in your tables:

Contacts

| Column | Type | Example | Comment |
| --- | --- | --- | --- |
| id | VARCHAR(256) | 93d8AB4f69fa80 | Required |
| email | VARCHAR(256) | john@slack.com | Required |
| created_at | TIMESTAMP | 2023-12-02 20:23:12.010000 | Required. UTC |
| updated_at | TIMESTAMP | 2023-12-02 20:23:12.010000 | Required. UTC |
| contact_{property} | Numeric, VARCHAR, or BOOLEAN, depending on the property | | Optional |
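
If you're creating a dedicated table to share with MadKudu, here is a minimal sketch of a Contacts table with the required columns, using the google-cloud-bigquery Python client. The project, dataset, and the contact_plan property are placeholder names; any naming works as long as you map the columns in MadKudu.

```python
from google.cloud import bigquery

client = bigquery.Client(project="your-project-id")

# Required columns plus one optional contact_{property} column as an example.
# STRING is BigQuery's equivalent of the VARCHAR(256) type listed above.
schema = [
    bigquery.SchemaField("id", "STRING", mode="REQUIRED"),
    bigquery.SchemaField("email", "STRING", mode="REQUIRED"),
    bigquery.SchemaField("created_at", "TIMESTAMP", mode="REQUIRED"),  # UTC
    bigquery.SchemaField("updated_at", "TIMESTAMP", mode="REQUIRED"),  # UTC
    bigquery.SchemaField("contact_plan", "STRING", mode="NULLABLE"),   # optional contact_{property}
]

table = bigquery.Table("your-project-id.madkudu_share.contacts", schema=schema)
client.create_table(table)
```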

Events

| Column | Type | Example | Comment |
| --- | --- | --- | --- |
| id | VARCHAR(256) | 03492f12-061b-4f54-b866-764b4f69fa80 | Required |
| event_name | VARCHAR(256) | logged_in | Required |
| event_timestamp | TIMESTAMP | 2023-12-02 20:22:51.780000 | Required |
| contact_email | VARCHAR(256) | john@slack.com | Required (or send contact_id and the Contacts table) |
| created_at | TIMESTAMP | 2023-12-02 20:23:12.010000 | Required |
| updated_at | TIMESTAMP | 2023-12-02 20:23:12.010000 | Required |
| event_{property} | Numeric, VARCHAR, or BOOLEAN, depending on the property | | Optional |
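
To make the expected shape concrete, here is a minimal sketch that streams one sample event row into an existing Events table with the google-cloud-bigquery Python client. The project, dataset, and table names, the contact_email column (or whichever identifier you map), and the event_plan property are placeholders based on the layout above; a batch load or scheduled query into the table works just as well.

```python
import uuid
from datetime import datetime, timezone

from google.cloud import bigquery

client = bigquery.Client(project="your-project-id")
now = datetime.now(timezone.utc).isoformat()  # timestamps in UTC

# One sample row matching the Events layout above
# (event_plan stands in for an optional event_{property} column).
rows = [
    {
        "id": str(uuid.uuid4()),
        "event_name": "logged_in",
        "event_timestamp": now,
        "contact_email": "john@slack.com",
        "created_at": now,
        "updated_at": now,
        "event_plan": "enterprise",
    }
]

errors = client.insert_rows_json("your-project-id.madkudu_share.events", rows)
if errors:
    print("Insert failed:", errors)
```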

How to send BigQuery data to MadKudu?

Pre-requisite

You have access to your BigQuery account and to the MadKudu app.

For MadKudu to start pulling data from your BigQuery account, follow these steps:

  1. Log in to the MadKudu app (app.madkudu.com)

  2. Click on Integrations > BigQuery

  3. Click on "Grant access to Google BigQuery" and follow the authentication flow

  4. Once connected, you'll see a form asking for:

    • Project ID: located in your project list in the Google Cloud Console

    • Dataset: the name of your dataset

    • Location: the location of your dataset

    • Table names: enter the table names corresponding to Events, Contacts, or Accounts (coming soon). You don't need to send MadKudu all three, only the ones needed for your use case.

    • Column mapping: for each object, map your BigQuery column names to the required properties. For example, if the column holding the event time is named "timestamp" in BigQuery, it needs to be mapped to event_timestamp (the MadKudu naming).

    • Click Save

    • Click on Test connection to check that MadKudu can connect to your dataset
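
Before or after clicking Test connection, you can also sanity-check from your own environment that the connecting user sees the expected tables and columns. This optional sketch uses the google-cloud-bigquery Python client with placeholder project, dataset, and table names; the Test connection button above remains the authoritative check.

```python
from google.cloud import bigquery

client = bigquery.Client(project="your-project-id")
dataset_id = "your-project-id.madkudu_share"

# Columns MadKudu expects (directly or via mapping) on the Events table.
required_events_columns = {"id", "event_name", "event_timestamp", "created_at", "updated_at"}

for table_item in client.list_tables(dataset_id):
    table = client.get_table(table_item.reference)
    columns = {field.name for field in table.schema}
    print(table.table_id, "->", sorted(columns))
    if table.table_id == "events":
        missing = required_events_columns - columns
        if missing:
            print("  Missing (or to be mapped in MadKudu):", sorted(missing))
```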

MadKudu will start pulling data moving forward! For MadKudu to pull historical data from these tables, please submit a support ticket - it's a quick configuration to turn on from our end.

Example of request:

"Hi, we'd like to pull all data from our BigQuery instance connected to MadKudu

  • Table Events: from YYYY-MM-DD  (we recommend 9 months of data)

  • Table Contacts: from YYYY-MM-DD

  • Table Account: from YYYY-MM-DD (coming soon)

Thanks"

At any time, you can see the status of the integration in Settings > Processes: under Connectors - Pull, you should see the BigQuery integration and any errors.

After a few hours, you can see the volume of data already pulled by going to Data in the main left navigation bar and selecting BigQuery.

FAQ

Which BigQuery scope does this integration use when authenticating? 

The integration uses the following BigQuery scope: https://www.googleapis.com/auth/bigquery, which corresponds to "View and manage your data in Google BigQuery and see the email address for your Google Account".

How do BigQuery contacts get linked to Salesforce Accounts?

MadKudu uses the email domain of your provided contacts to link to the website of the respective Salesforce Account.
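
As a rough illustration of that idea (not MadKudu's actual matching logic), the email domain is compared to the domain of the Account's website, as in this hypothetical sketch:

```python
from urllib.parse import urlparse

def email_domain(email: str) -> str:
    # "john@slack.com" -> "slack.com"
    return email.split("@", 1)[1].lower()

def website_domain(url: str) -> str:
    # "https://www.slack.com/enterprise" -> "slack.com"
    host = urlparse(url).netloc or urlparse("//" + url).netloc
    return host.lower().removeprefix("www.")

# Hypothetical contact and Salesforce Account, for illustration only.
contact = {"email": "john@slack.com"}
account = {"Website": "https://www.slack.com"}

matches = email_domain(contact["email"]) == website_domain(account["Website"])
print(matches)  # True
```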