Build impactful Test Automation dashboards using ELK stack

In the previous post, I shared one of my many opinions on improving test automation visibility to succeed in an Agile transformation. If you happened to like those dashboards, keep reading. In this post, I attempt to explain how you can create those dashboards for your own use.

Technology

The technology behind the dashboards is the ELK stack.

  • Elasticsearch to store the logs/test results.
  • Logstash to ship the logs from your test runner to Elasticsearch.
  • Kibana to build dashboards on top of the data.

A minimal design could look like this:

[Screenshot: minimal design, with test results flowing from the test runner through Logstash into Elasticsearch and visualized in Kibana]

There are many references and video walkthroughs for standing up the environment. A few quick pointers: Elasticsearch setup, Logstash setup, Kibana setup, or maybe a Docker container.

In addition to this, we’ll also use another Elasticsearch plugin, Marvel, to talk to Elasticsearch directly via its API. Here is more info and an install guide.

Mapping

Now that the environment is up and running, the first step is to define the mapping. For lack of a better term, a mapping is something similar to a database schema. In order to pivot the data appropriately, we need to feed data in a consistent shape, and the mapping helps enforce that.

A simple mapping to begin with

[Screenshot: a simple mapping for the quality index, as entered in Sense]
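Since the screenshot doesn’t reproduce here, below is a sketch of what such an index-creation request might look like in Elasticsearch 1.x via Sense. The index name quality comes from later in the post; the type name test_run and the field names (environment, build, tests) are illustrative assumptions, not the original gist:

    PUT /quality
    {
      "mappings": {
        "test_run": {
          "_timestamp": { "enabled": true },
          "properties": {
            "environment": { "type": "string", "index": "not_analyzed" },
            "build":       { "type": "string", "index": "not_analyzed" },
            "tests": {
              "type": "nested",
              "properties": {
                "name":     { "type": "string", "index": "not_analyzed" },
                "status":   { "type": "string", "index": "not_analyzed" },
                "duration": { "type": "long" }
              }
            }
          }
        }
      }
    }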

A few quick points about the mapping:

  • Index – roughly the equivalent of a database
  • Type – roughly the equivalent of a table
  • Nested object – to store an array of objects inside a document
  • _timestamp – enabled so the timestamp of each document is indexed automatically
  • “index”: “not_analyzed” – tells Elasticsearch not to analyze the field. A good example is test case names, which are usually multi-word sentences, e.g. “User login to the app from mobile”. I don’t want Elasticsearch to dismantle the sentence and tokenize the words; I want to report the name as one sentence. In such cases, I prefer not to analyze the field.

(A best practice is to create one index per day, with the date embedded in the index name, since Elasticsearch can search efficiently across indices. For simplicity, I’ve created a single index with whatever is required for this attempt.)

Here is the gist to create the above index; you can simply run it from Marvel’s Sense console (for example, http://localhost:9200/_plugin/marvel/sense/index.html).

Publish data

Once the index is created, we’ll publish fake data using the APIs, because publishing real test execution data depends on a lot of external factors: the test cases themselves, a plugin or logic to parse your test results into the mapping structure, and a Logstash configuration to ship the data over to Elasticsearch. While the Logstash configuration might be the easy part, the other pieces have a huge spectrum of variables; for example, tests might be written in Jasmine + Protractor, SpecFlow + C#, or something in Java. So, at this point, let’s fake it until we make it.

[Screenshot: indexing a sample document from Sense, with the acknowledgement on the right]

Gist – publish data

In the picture above, I push sample data manually using the DSL from Marvel, and the right side is the acknowledgement showing the document was indexed.
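As a sketch of such an index request (assuming the illustrative mapping above; the field values here are made up, not from the original gist):

    POST /quality/test_run
    {
      "environment": "dev",
      "build": "2015.09.15-1",
      "tests": [
        { "name": "User login to the app from mobile", "status": "failed", "duration": 1320 },
        { "name": "User resets password", "status": "passed", "duration": 940 }
      ]
    }

Because _timestamp is enabled in the mapping, Elasticsearch stamps the document automatically at index time; in 1.x you can also pass a timestamp URL parameter to backdate documents when faking historical data.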

If you run “GET quality/_search”, it’ll show all the documents indexed so far. Use your favorite scripting language to publish as much fake data as you want, or stay in Sense and use the bulk API as sketched below.
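A minimal bulk sketch, indexing several fake runs in one request (each JSON body must sit on a single line; values again illustrative):

    POST /quality/test_run/_bulk
    { "index": {} }
    { "environment": "qa", "build": "2015.09.14-3", "tests": [ { "name": "User login to the app from mobile", "status": "passed", "duration": 1100 } ] }
    { "index": {} }
    { "environment": "dev", "build": "2015.09.14-3", "tests": [ { "name": "User login to the app from mobile", "status": "failed", "duration": 2400 } ] }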

Visualization

At this point, we have some data in Elasticsearch; let’s see how to create some visualizations using Kibana.

Launch Kibana (http://localhost:5601/) and confirm a few settings.

[Screenshot: Kibana settings page]

Set “quality” as the defaultIndex and make sure _timestamp is in metaFields; otherwise, no visualization will work.
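To double-check from Sense that the index is wired up the way Kibana expects, you can fetch the mapping back; if the index was created as sketched earlier, the response should show the _timestamp block with enabled set to true:

    GET /quality/_mapping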

Discover

The first step is to identify the subset of data we want to visualize. Navigate to the “Discover” tab, select the required fields from the available fields, and save the discovery.

[Screenshot: fields selected and the discovery saved in the Discover tab]

Visualize

Create a new visualization.

[Screenshot: creating a new visualization]

Select the data source for the visualization.

[Screenshot: selecting the data source]

Construct the desired visualization. In this case, I want to display a grid of data by timestamp, so it’s simply a matter of splitting the rows as shown below. Then save the visualization.

[Screenshot: building the data table visualization by splitting rows]
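For what it’s worth, a Kibana 4 visualization is essentially an Elasticsearch aggregation underneath. A roughly equivalent per-day split on _timestamp, runnable from Sense (the aggregation name runs_per_day is arbitrary):

    GET /quality/_search
    {
      "size": 0,
      "aggs": {
        "runs_per_day": {
          "date_histogram": { "field": "_timestamp", "interval": "day" }
        }
      }
    }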

Similarly, build a couple more visualizations.

[Screenshot: a few more visualizations]

Dashboard

Finally, we can bring all the visualizations together in the dashboard.

[Screenshot: dashboard combining the saved visualizations]

The visualizations built so far might not seem that interesting, because many test execution plugins already offer pie charts and other charts of test execution data.

The best part of using the ELK stack is the power of filters, which lets us go back in time, see the trend, and make use of the time series data.

Below, I’m applying the time filter.

[Screenshot: applying the time filter]

Here I can combine the time filter with another parameter of my choice. For example, I want to see only ‘failed’ test cases on the ‘dev’ environment in the last ‘1 week’.

[Screenshot: combining the time filter with field filters]
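The raw query behind such a filter would look roughly like this in Elasticsearch 1.x DSL. Note that with the illustrative nested mapping sketched earlier, the status filter needs a nested clause:

    GET /quality/_search
    {
      "query": {
        "filtered": {
          "filter": {
            "bool": {
              "must": [
                { "term":  { "environment": "dev" } },
                { "range": { "_timestamp": { "gte": "now-1w" } } },
                {
                  "nested": {
                    "path": "tests",
                    "filter": { "term": { "tests.status": "failed" } }
                  }
                }
              ]
            }
          }
        }
      }
    }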

This can help answer many more interesting questions.

Would it be nice if you could see the quality of the build that went to production 14 days back? How many regression cases passed or failed?

Would it be nice if you could correlate your application logs with load test logs and detect anomalies?

Would it be nice if you could go to a single pane of glass that shows the Functional, Performance, Security, and Regression test results of your application for a given period?

Endless possibilities. Elasticsearch also offers rich RESTful APIs to build and visualize things that Kibana doesn’t support out of the box.
