What is a Lean Dashboard?
A lean dashboard is a measurement tool for collecting and reporting strategy-related data in a streamlined, visual way. It helps measure growth and execution for continuous improvement, and it provides real-time measurements and feedback so you can confidently adjust your plans.
Wescover is an early-stage startup — so it’s essential that we measure our growth & execution and continuously improve ourselves. These metrics should be analyzed and presented to the whole company in an accessible dashboard.
The Analytics Pipeline
A simple data analytics pipeline looks like this:
- Report metrics from the Application
- Store them in a scalable data store
- Analyze with a BI tool
- Visualize on a dashboard
As an early stage startup, we should make sure we are efficient — putting in the minimal effort to provide a requested functionality. This effort should preferably form the foundations for a further expansion, but it should not be a bottomless endeavor of building the best solution.
With this in mind, we set out to build a dashboard in a lean manner. As a Serverless fanboy, my approach is to first see which services can be reused and “glued” together before reinventing the wheel myself. This is the account of how we implemented a simple analytics pipeline at Wescover.
Steps 1 & 2: Report and Store
Wescover runs on Azure cloud services, so it was fairly easy to use the Application Insights service for these steps. We used the Application Insights SDK to report custom metrics, and these were stored in the managed service without additional hassle.
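For example, with the Application Insights Node.js SDK a custom event report is a one-liner (a sketch, assuming a Node backend; the event name and properties are illustrative, and other language SDKs expose an equivalent trackEvent call):

```javascript
// Sketch: report a custom event with the Application Insights Node.js SDK.
// The instrumentation key comes from the Azure portal.
const appInsights = require("applicationinsights");
appInsights.setup(process.env.APPINSIGHTS_INSTRUMENTATIONKEY).start();

// "item_viewed" and its properties are illustrative names.
appInsights.defaultClient.trackEvent({
  name: "item_viewed",
  properties: { itemId: "123" },
});
```

The SDK batches and ships these events to the managed store, so no storage work is needed on our side.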
One thing to note, though: Application Insights’ data retention period is limited (90 days). This is good enough when the analytics are digested based on the last few weeks’ metrics. If a longer period is needed, you can configure a continuous export to Azure’s storage service and build an ETL pipeline on top of it.
Step 3: Analyze
Application Insights comes with the Analytics query language (a.k.a. Kusto), whose rich syntax enables the execution of complex queries. Below is an example of such a query. The query fetches the weekly totals for each custom event over the last 2 weeks:
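A query along these lines does the job (a sketch; the column aliases are illustrative, while `customEvents` is the standard Application Insights table for custom events):

```
customEvents
| where timestamp > ago(14d)
| summarize weekly_total = count() by name, week = startofweek(timestamp)
| order by week desc, name asc
```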
The result would look something like this:
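An illustrative result shape (the event names, dates and counts are made up for the example):

```
name        week                  weekly_total
item_view   2019-06-09T00:00:00Z  4870
search      2019-06-09T00:00:00Z  1203
item_view   2019-06-02T00:00:00Z  4412
search      2019-06-02T00:00:00Z  1155
```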
Step 4: Visualize
Now that we were able to query our data and analyze it, we’d like to display it on a nice lean dashboard for the whole team to see. We found that Google’s Data Studio is a simple yet sufficient solution for our needs. It offers a good variety of diagrams, it’s simple to use and customize, and it provides Google’s G Suite access control out of the box.
The Problem with Pulling Data from Azure’s Application Insights
Sounds great, but we had one (major) problem: Data Studio cannot pull data from Azure’s Application Insights. It connects to various Google services (BigQuery, Cloud SQL, Sheets, Analytics, DCM, etc.) and provides community connectors to third parties (Facebook Insights, Twitter, and more), but not to Azure. Surprise, surprise.
Serverless Scheduled Job
As mentioned above, Data Studio can connect to Google Sheets as a data source. If only we could push our analytics there 🤔
Easy! We can use a serverless scheduled job to query the analytics from Application Insights and push them into the Google Sheet. Azure Functions is a good candidate, but for this scenario Azure Logic Apps is a perfect match. So we built a Logic App with a Recurrence trigger that runs every hour and does the following:
- Query Application Insights via an Application Insights connector
- Put the results into a Google Sheet via a Google Sheet connector
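Stripped down, the workflow definition looks roughly like this (a sketch: the action names, connector path and query are placeholders, and the Google Sheets call is elided — the real app is richer):

```json
{
  "definition": {
    "triggers": {
      "Recurrence": {
        "type": "Recurrence",
        "recurrence": { "frequency": "Hour", "interval": 1 }
      }
    },
    "actions": {
      "Run_Analytics_query": {
        "type": "ApiConnection",
        "inputs": {
          "host": { "connection": { "name": "@parameters('$connections')['applicationinsights']['connectionId']" } },
          "method": "post",
          "body": { "query": "customEvents | where timestamp > ago(14d) | summarize count() by name" }
        }
      },
      "Insert_rows_into_sheet": {
        "type": "ApiConnection",
        "runAfter": { "Run_Analytics_query": ["Succeeded"] }
      }
    }
  }
}
```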
As simple as that: no servers were spun up, no code was written, and no third-party APIs had to be investigated.
BTW, remember the data retention limitation I mentioned before? Once the query results are in the Google Sheet, they remain there indefinitely. So the dashboard displays the full analytics history since the very beginning, not just the last 90 days.
We built a full analytics pipeline with little more than “gluing” together managed services — and we did it quickly.
The Wescover application reports custom metrics into the Application Insights store. A Logic App analyzes the data with an Analytics query and puts the results in a Google Sheet. Google Data Studio displays the analytics in a beautiful lean dashboard. 🤓
Bonus: Logic Apps Tips & Tricks
A few neat tips for Logic Apps:
- It’s useful to create an object for the query with the Compose action. This makes the query easier to maintain and to refer to from later actions in the flow.
- Use Scopes to group actions and separate them from other logical parts. This makes for a more organized app and comes in handy when you want particular error handling for specific parts of it. Our real Logic App actually runs several scopes on every trigger, each with a different query.
- A Logic App is in fact a JSON configuration, so you can store it in your repository and deploy it via the Azure CLI.
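For example, a deployment with the Azure CLI might look like this (a sketch; the resource group, app name and location are placeholders, and the `logic` CLI extension must be installed):

```shell
# One-time setup of the CLI extension (an assumption about your environment):
# az extension add --name logic

az logic workflow create \
  --resource-group my-resource-group \
  --name analytics-dashboard-app \
  --location westeurope \
  --definition @workflow.json
```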
Cheers — hope you found this handy! 👏