We know that for many of our clients, analyzing social media data is only one of their challenges: there are many more data buckets out there that they need to consider to achieve their overall digital marketing goals, and combining all these data points into one consistent story is what ultimately leads you in the right direction. That’s why at quintly we always try to be as flexible as possible, particularly when it comes to our exporting options. We already offer a wide range of export options, but sometimes a PNG, XLS, PPT or PDF export is not enough. That is why, in this article, we want to explain how to combine quintly data with other data from a client’s organization.
Setting up quintly with Google BigQuery and Tableau not only helps you streamline your data analysis process, but also ensures that all the data you need is integrated into one workflow.
We have worked closely with our clients to create a simple, automated process. Through this we want to make sure that they can access their quintly data in whatever system they want to use, and that the data is updated continuously.
In this blog article, I will dig deeper into the topic and show you how to integrate quintly with Google BigQuery and how to use this as a base for visualization tools like Tableau. If you follow this simple step-by-step process you can take the next step in automating your analytics workflow.
As a starting point, you need to define which data you want to export. You can do this by picking one of the default metrics in our tool or by building your own custom metric with QQL. Note that the visualization style doesn’t matter here, as the data is exported in raw format anyway; focus on getting the right content into your metric rather than on its design. How the data is ultimately represented is decided later, in whichever visualization tool you use.
Once you have found the data or metric that you want to integrate, you need the QQL query for this metric. You can get it by clicking on the little “?” icon; in the popup that opens, simply copy the query.
Now that you have decided what data to export and you have the QQL query, it’s time to create the destination table in Google BigQuery where the data will be written. To keep things simple, our systems assume that the columns have the same names as they come out of the QQL query (the one you got in Step 1), so if you want different column names, simply adjust the QQL query. Because Google BigQuery is not really designed for deleting data, you also have to create one additional column on top of the ones from the QQL query. This column must be called “importTime” and be of type “DateTime”.
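As a minimal sketch, the table layout described above could be generated like this in Python. The dataset, table, and QQL column names and types below are placeholders for illustration; only the extra “importTime” column of type DATETIME is a fixed requirement of the import process.

```python
def create_table_ddl(dataset, table, qql_columns):
    """Build a BigQuery Standard SQL CREATE TABLE statement for the
    destination table. `qql_columns` is a list of (name, type) pairs,
    with the names exactly as they come out of the QQL query."""
    cols = [f"`{name}` {dtype}" for name, dtype in qql_columns]
    # Extra column required by the import: filled with the time
    # each row was written into the table.
    cols.append("`importTime` DATETIME")
    return f"CREATE TABLE `{dataset}.{table}` (\n  " + ",\n  ".join(cols) + "\n);"

# Example with hypothetical columns from a fan-count metric:
ddl = create_table_ddl("social_media", "quintly_fans", [
    ("profileId", "INT64"),
    ("time", "DATETIME"),
    ("fans", "INT64"),
])
print(ddl)
```

You can paste the resulting statement into the BigQuery console, or create the same schema through the BigQuery UI if you prefer clicking over SQL.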
Every time we write a row into this table, this field is filled with the current time, so you always know when an entry was imported. This becomes especially important when we import raw-level data like Facebook posts or Twitter tweets: since we update the data every day, a new row is created for every post every day. The same post can therefore appear multiple times, with different import times. A little side note: to get the latest version of each post, keep only the row with the most recent “importTime” value.
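Assuming each post row carries a stable identifier (called “externalId” below, a placeholder name; the real column depends on your QQL query), the deduplication rule can be sketched like this:

```python
def latest_per_post(rows, key="externalId", ts="importTime"):
    """Keep only the most recently imported copy of each post.

    `rows` is a list of dicts as they might come back from the table;
    the field names `externalId` and `importTime` are assumptions and
    can be adjusted via the parameters."""
    latest = {}
    for row in rows:
        current = latest.get(row[key])
        # ISO 8601 timestamps compare correctly as strings.
        if current is None or row[ts] > current[ts]:
            latest[row[key]] = row
    return list(latest.values())

# Two imports of the same post plus one other post:
rows = [
    {"externalId": "post-1", "importTime": "2019-01-01T06:00:00", "likes": 10},
    {"externalId": "post-1", "importTime": "2019-01-02T06:00:00", "likes": 14},
    {"externalId": "post-2", "importTime": "2019-01-02T06:00:00", "likes": 3},
]
deduped = latest_per_post(rows)
```

In practice you would usually apply this filter directly in BigQuery with a window function rather than in client code; the sketch just makes the rule concrete.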
The next step is to automate the flow of data from quintly to Google BigQuery. At the moment there is no self-serve mode for this inside our tool, but you can get in touch with our support and we will help you set up everything that is needed. For most of our clients we have already set up a daily routine that updates the data in BigQuery, but we can adjust the frequency to your needs.
The last step is to get the data from BigQuery into Tableau. You (or someone else in your company) may already have done this with other data sources, and the quintly data works the same way. If you haven’t done it before, Tableau provides great material that explains it step by step, which can be found here.
One of the first clients we built such an integration with is Benefit Cosmetics. Their goal was to centralize all the data from the different tools they use, and as they have been a long-term quintly client, we were very happy to work with them on achieving that goal together.
This is what they have to say:
"Benefit Cosmetics is a brand with a large global footprint and quintly has been an invaluable partner in analyzing owned social media performance. Every quarter we launch new markets with their own respective social media accounts, and quintly helps seamlessly connect, collect, and analyze that data.
In global analytics, where our goal is to aggregate data across our markets and mine it for insights, quintly is connected straight to our BigQuery data warehouse, allowing us to efficiently blend, subset, and visualize our data with the other tools in our BI stack."
By following these four steps, you can take your overall analytics to the next level. We know that social media data is only one of the many data buckets out there, so we want to be as flexible as possible in integrating it into your processes. We do this by offering a modern API, but also with integrations like the one described in this article.
If you are interested in exploring a similar solution for your company, just get in touch with us and we will help you get started.