One of the strengths of the DataSift platform is the ease with which filtered interactions can be sent to a variety of destinations.
Google BigQuery is a good example. It takes no more than ten minutes to set up a dataset in Google BigQuery, and then no more than five minutes to configure a filter in the DataSift platform that sends matching interactions straight to BigQuery.
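Filters on the DataSift platform are written in CSDL. As a minimal illustration (the keyword here is arbitrary), a filter that matches any interaction mentioning BigQuery looks like this:

```
interaction.content contains "BigQuery"
```

Compiling a filter like this gives you a stream hash, which is what you attach a destination to.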
Sounds too good to be true? Register for this Thursday’s free workshop and I’ll show you how.
Google BigQuery is a Google Cloud Platform service for querying billions of rows of data within a table, using a dialect of SQL.
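Once DataSift has created and populated a table for you, querying it is plain SQL. Here's a sketch using BigQuery's legacy SQL syntax; the dataset, table, and column names are hypothetical, so substitute your own:

```sql
SELECT interaction_content, interaction_created_at
FROM [datasift_output.interactions]
WHERE interaction_content CONTAINS 'bigquery'
LIMIT 100;
```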
Once you’ve created a project, you only need to create a dataset. You can leave creating tables and writing field mappings to the DataSift platform.
In the DataSift platform, go to the Destinations page and click Google BigQuery. You’ll need a few details from your Google project so that DataSift knows where to send the data and has the correct authorization.
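The details in question all come from the Google Developers Console for your project. As a rough sketch (the key names below are my own illustrative labels, not DataSift's actual form fields), you'll be gathering something like:

```python
# Illustrative sketch of the details a BigQuery destination needs.
# Key names here are labels for this example, not DataSift's field names;
# the values all come from the Google Developers Console for your project.
bigquery_destination = {
    "project_id": "my-gcp-project",       # Google Cloud project that owns the dataset
    "dataset_id": "datasift_output",      # the dataset you created by hand
    "service_account": "svc@my-gcp-project.iam.gserviceaccount.com",
    "key_file": "key.p12",                # private key authorizing DataSift to write
}
```

Having these to hand before you start makes the destination form a copy-and-paste job.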
Once the new destination is saved, you’ll be able to select it for live or historic recordings.
Then select your new Google BigQuery destination. You can create multiple instances of a BigQuery destination, each sending data to a different table or dataset.
Next you’ll be shown a summary. Make sure you check the Processing Cost in DPUs before hitting Start Task.
Of course, all of this can be managed with the DataSift API. Sign up for this Thursday’s free workshop and I’ll walk through both the example above and the API. We limit numbers to allow plenty of time for questions and answers, so register quickly!
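To give a flavour of the API side, here is a minimal sketch of building the parameters for a push destination. The endpoint and `output_params.*` names follow DataSift's push API conventions, but treat the exact parameter names for the BigQuery connector as assumptions and check the API documentation before relying on them:

```python
API = "https://api.datasift.com/v1"

def push_create_payload(name, stream_hash, project, dataset, table):
    """Build the form parameters for a push/create call.

    The output_params.* keys below are illustrative; the real parameter
    names for the BigQuery connector may differ, so check the API docs.
    """
    return {
        "name": name,
        "hash": stream_hash,                   # hash of the compiled CSDL filter
        "output_type": "bigquery",             # assumed connector identifier
        "output_params.project_id": project,
        "output_params.dataset_id": dataset,
        "output_params.table_id": table,
    }

payload = push_create_payload(
    "bq-demo", "<stream-hash>", "my-gcp-project", "datasift_output", "interactions"
)

# Sending it would then be a single authenticated POST, e.g. with the
# requests library (credentials are placeholders):
#
#   import requests
#   requests.post(API + "/push/create",
#                 params={"username": "<user>", "api_key": "<key>"},
#                 data=payload)
```

The same pattern covers pausing, resuming, and deleting destinations, which is what makes the API route attractive once you're running more than a handful of filters.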
See the full schedule of free workshops here: DataSift Training