Deploying Analytics Reports to Focus on Insights

Kris

As analysts, we all know we should spend our time diving deep into the data to connect it to business goals and objectives. In practice, however, we end up dealing with tagging, process, QA, agreement on KPIs, too many irrelevant metrics, fighting report delivery deadlines, and so on. It is imperative that you spend your time and focus on bringing insights to the business, and that requires the right execution and deployment of your analytics reports.

You can argue about the human communication aspect of these challenges, but in my experience, there are plenty of challenges in bringing even basic ‘data visibility’ to the key stakeholders from a technical standpoint. Examples include (but are not limited to):

  • The data reporting environment and availability of data are sub-optimal. For example, clients working with multiple agencies get multiple reports in different formats. In most cases, the client’s data environment sits behind a firewall and is not open to agencies.
  • No alignment between different data handlers (e.g., clients and agencies, or analysts and marketers) on which data sources to use. Each data source lives in a different location, and differences in data collection methodologies are not well articulated to the end users.
  • BI (business intelligence) solutions aren’t well established in the company, for whatever reason. Data experts, especially analysts, aren’t necessarily experts in business intelligence.
  • The gap between the speed of change in digital data and the resources or speed available to deploy that data in traditional BI solutions.

With social media metrics booming and web analytics metrics becoming more fragmented, analysts face plenty of challenges just getting clean data sets to analyze.

I feel like a lot of businesses are feeling this pain from the huge disconnect between the speed of change in the data environment and the expectations placed on analytics reporting and insights. More data does not mean more data visibility, nor analysts churning out great reports…

Save some time on data cleansing and massaging

In my view, massaging and cleansing the data are the most time-consuming parts of the analysis workflow. I’m not saying we should hate them, but certain tasks are definitely not fun: exporting data to Excel, switching between different analytics tools to make sure your analysis is backed by more than one data point, copying and pasting charts/images/data, and so on. Analysts should be spending that time digging deeper into the meaning behind the outcome.

If you look at data like water, and you know that the right pipelines would bring that water in and out in a way that is ideal for you, then you need to build those pipes. That’s how I see BI, and I would expect the same idea to apply to tools that let analysts do more robust analysis. Google Analytics does a great job of letting an analyst segment data, create custom reports, and save them into dashboards, but it is not enough. Since one of the most important jobs for an analyst is to find correlations and dive into causation analysis, that analytics/report view is just the beginning.

My approach to saving time for analysis is to find your own analytical pattern and approach to data. Assuming your business has KPIs and goals, and you have those measures covered (perhaps automatically and nicely reported in some BI solution), the next step is to have operational metrics that answer what changed in those KPIs (up or down) and why.

So it is in analysts’ best interest to ask, ‘How do I get those operational measures flowing automatically, like water out of a pipe, so I do not need to do manual work every time I do the analysis?’ This question is pretty interesting because when I meet people in similar fields, everybody seems to approach it with different techniques, tools, and mindsets. Most of them are intended to save time or to get data in a consistent fashion.

Here are some example practices that you may have come across.

  • Scheduling email reports and opening them up occasionally for analysis.
  • Leveraging web services like REST APIs to pull data into Excel automatically whenever a refresh is needed (see the sketch after this list). In the past, I’ve built parameter string text files so that when a cell value changed (like a date), the Excel data puller would pull data based on that changed cell value.
  • All operational data (pretty much everything) is captured in the BI environment. You are lucky if you fall into this camp.
  • All operational/detail metrics are saved as favorites or bookmarks in web analytics solutions, but analysts still go across multiple platforms to do cross-analysis.
  • Data engineers handle the heavy lifting. In some magical way, they are able to get you the data you need anytime. Again, lucky you; you have many supportive resources to help get that detailed data.
  • Data engineers load the data you need into a database you can work with (MySQL, SQL Server, etc.). The rest is just a matter of tapping into that DB with SQL queries, or connecting your data visualization tool directly to it for further modeling.
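To make the REST API pattern above concrete, here is a minimal sketch in Python of that kind of data puller. The endpoint, API key, metric names, and response shape are hypothetical stand-ins for whatever your analytics vendor actually exposes:

```python
import csv

import requests

# Hypothetical endpoint and key; swap in your analytics vendor's real report API.
API_URL = "https://api.example-analytics.com/v1/report"
API_KEY = "YOUR_API_KEY"


def pull_report(start_date: str, end_date: str, out_path: str) -> None:
    """Fetch a date-bounded report and write it to a CSV that Excel can refresh from."""
    params = {"start": start_date, "end": end_date, "metrics": "visits,conversions"}
    resp = requests.get(
        API_URL,
        params=params,
        headers={"Authorization": "Bearer " + API_KEY},
        timeout=30,
    )
    resp.raise_for_status()
    rows = resp.json()["rows"]  # assumes the API answers {"rows": [{...}, ...]}
    with open(out_path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=rows[0].keys())
        writer.writeheader()
        writer.writerows(rows)


if __name__ == "__main__":
    # Changing the dates plays the role of the changed cell value in the Excel setup.
    pull_report("2012-01-01", "2012-01-31", "report.csv")
```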

Now, I’d like to share mine. It is definitely not a perfect way, but it has been working pretty well for me so far, especially if you’re running tight on resources for help.

Build technical and marketing business acumen to run a lean analytics practice

I was privileged to learn scripting languages during my first four years after college. Recently, I have come to realize that with a programming language like Ruby or Python, writing scripts is much easier than it was back when I wrote code in Perl, shell scripts, and the like. There are so many libraries for accessing and parsing REST APIs that return XML or JSON, whether from enterprise-level solutions or free data online. Also, databases like MySQL are free, so putting that data into storage (even locally) is not a big deal anymore.
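As a small illustration of how lightweight this has become, here is a sketch of parsing an XML response using nothing but Python’s standard library. The feed URL and row shape are made up, not any particular vendor’s API:

```python
import urllib.request
import xml.etree.ElementTree as ET

# Hypothetical XML metrics feed; the point is how few lines parsing takes today.
FEED_URL = "https://api.example-analytics.com/v1/report.xml"

with urllib.request.urlopen(FEED_URL) as resp:
    root = ET.fromstring(resp.read())

# Assumes rows shaped like <row day="2012-01-01" visits="1234"/>.
for row in root.iter("row"):
    print(row.get("day"), row.get("visits"))
```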

In a nutshell, the two technical acumens that have helped me the most are scripting and database programming. Once the data is stored in a database, you can use Excel or great visualization tools like Tableau, Spotfire, QlikView, etc.
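Here is a rough sketch of the database half, assuming the mysql-connector-python package, a local MySQL instance, and a pre-created daily_metrics table; the credentials and column names are hypothetical:

```python
import csv

import mysql.connector  # pip install mysql-connector-python


def load_csv_to_mysql(csv_path: str) -> None:
    """Load a pulled report into a local MySQL table for Excel/Tableau to query."""
    conn = mysql.connector.connect(
        host="localhost", user="analyst", password="secret", database="webmetrics"
    )  # hypothetical local credentials
    cur = conn.cursor()
    # Assumes: CREATE TABLE daily_metrics (day DATE, visits INT, conversions INT)
    with open(csv_path, newline="") as f:
        rows = [(r["day"], r["visits"], r["conversions"]) for r in csv.DictReader(f)]
    cur.executemany(
        "INSERT INTO daily_metrics (day, visits, conversions) VALUES (%s, %s, %s)",
        rows,
    )
    conn.commit()
    cur.close()
    conn.close()


if __name__ == "__main__":
    load_csv_to_mysql("report.csv")
```

From there, pointing Tableau (or plain SQL in any client) at that database is all it takes to start modeling.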

Rapidly churning out data and charts/graphs is nothing like the experience we had to go through a decade ago. Once you have the data in a database, the rest is easy. I’m not saying this to tell you to “go learn to program”. It is more about driving analysts to be creative beyond the common analytics tools’ capabilities, and to build fundamentals around data management where you could potentially be more creative with data.

Also, for analysts to work with technical folks, it is obvious that the analysts will need to articulate what data is needed. A better understanding of the underlying data allows analysts to explain to the business folks what is or is not reportable, and to explain the disclaimers. That is definitely one reason it is very hard to find great digital analytics experts who understand the data all the way from when it was created down to putting it into business context.

I really don’t expect every marketing-metrics-analyst type of role to do tagging, scripting, and database programming, but to me, great digital analysts are the ones who can understand and articulate the whole gamut of digital data from end to end.

With many reporting tools and analytics platforms moving into the cloud as PaaS, and various analytics services readily accessible as SaaS, data analysts benefit in many ways. Understanding this rapidly changing data/services environment, and acting on these data inputs in a snappy fashion, are great acumens digital analysts will need to build.

For me, it took about three full business days to build three prototype dashboards, going from a web analytics API to a MySQL database to a Tableau dashboard accessible through the intranet. In traditional BI, by my estimate, the same work would have taken about three weeks to deploy. It wasn’t my programming that made it this quick (I wish…), but rather the availability of a technical environment that is so different from years ago.
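For what it’s worth, the glue for a prototype like that can be as small as the sketch below, chaining the two hypothetical helpers from the earlier sketches; Tableau then simply points at the same MySQL database, and a cron job (or Task Scheduler) keeps the refresh hands-off:

```python
# Hypothetical glue script: pull yesterday's data from the analytics API and
# load it into MySQL; Tableau, connected to that database, serves the refreshed
# dashboard on the intranet.
from datetime import date, timedelta

from pull_report import pull_report        # hypothetical module from the earlier sketch
from load_mysql import load_csv_to_mysql   # hypothetical module from the earlier sketch

if __name__ == "__main__":
    day = str(date.today() - timedelta(days=1))
    pull_report(day, day, "daily.csv")
    load_csv_to_mysql("daily.csv")
```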

I highly recommend that digital experts learn and check out the following:
– Scripting languages
– Database trends
– Application stacks (Heroku, dotCloud, Amazon, etc.)
– Rapid BI tools (e.g., Tableau, Spotfire, QlikView)

Kris

As a data journalist, I enjoy curating and analyzing marketing trends and data. What fascinates me most is the transforming business landscape driven by evolving marketing technologies.