
Saturday, July 2, 2016

What is Web Analytics and how to Test it?

vodQA returned to ThoughtWorks, Hyderabad on 2nd July 2016 - this time with the theme of Testing Heuristics.

There, I spoke on "The What, Why and How of Web Analytics Testing".

Abstract

Learn what Web Analytics is, why it is important, and see some techniques for testing it manually as well as automating that validation. But just knowing about Web Analytics is not sufficient for business any more.
There are new kids in town - IoT and Big Data - two of the most used and heard-of buzzwords in the Software Industry!

With IoT, given a creative mindset looking for opportunities and ways to add value, the possibilities are infinite. With each such opportunity, there is a huge volume of data being generated - which, if analyzed and used correctly, can feed into creating more opportunities and increased value propositions.

There are 2 types of analysis that one needs to think about.
1. How is the end-user interacting with the product? This gives some understanding of how to re-position the product and focus on its true value-add features.
2. With the huge volume of data being generated by end-user interactions, and captured by all the devices in the food-chain of the offering, it is important to identify patterns from what has happened, and find new product / value opportunities based on usage patterns.
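
On the automation side (a minimal sketch, not from the talk): most Web Analytics tools report data by firing an HTTP "beacon" request from the page, so one way to validate them is to route the browser through a capturing proxy, perform the user action, and assert on the captured traffic. The sketch below assumes Selenium WebDriver with BrowserMob Proxy; the page URL and the Google Analytics endpoint check are illustrative placeholders.

import net.lightbody.bmp.BrowserMobProxy;
import net.lightbody.bmp.BrowserMobProxyServer;
import net.lightbody.bmp.client.ClientUtil;
import net.lightbody.bmp.core.har.HarEntry;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;
import org.openqa.selenium.chrome.ChromeOptions;
import org.openqa.selenium.remote.CapabilityType;

public class AnalyticsBeaconCheck {
    public static void main(String[] args) {
        BrowserMobProxy proxy = new BrowserMobProxyServer();
        proxy.start(0);  // 0 = pick any free port
        ChromeOptions options = new ChromeOptions();
        options.setCapability(CapabilityType.PROXY, ClientUtil.createSeleniumProxy(proxy));
        WebDriver driver = new ChromeDriver(options);
        try {
            proxy.newHar("home-page");              // start capturing traffic
            driver.get("https://www.example.com");  // illustrative page URL
            // Assert that at least one request went to the analytics endpoint
            boolean beaconFired = proxy.getHar().getLog().getEntries().stream()
                    .map(HarEntry::getRequest)
                    .anyMatch(req -> req.getUrl().contains("google-analytics.com/collect"));
            if (!beaconFired) {
                throw new AssertionError("Expected analytics beacon was not sent");
            }
        } finally {
            driver.quit();
            proxy.stop();
        }
    }
}

The same captured entries can be inspected further - for example, asserting on individual query parameters of the beacon - which is essentially what a tool like WAAT wraps behind a friendlier API.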

Slides


Video

PS: Apologies for the video quality - I am not seen very clearly - but the slides are bright & clear, and so is the audio - so the important aspects are covered!




Pictures









Feedback


My Takeaway & Learning

- The attendees did not have much exposure to Web Analytics and how it works. I should spend more time speaking about that
- I should spend more time on challenges and potential solutions related to Big Data & IoT
- A lot of people are interested in WAAT - that could be a separate, more detailed discussion

Wednesday, October 14, 2015

Good Trends for TTA in DevOps Summit

I spoke in DevOps Summit on 8th Oct in Bangalore on "To Deploy, or Not To Deploy - Decide using TTA's Trend & Failure Analysis".

The conversations during and after this talk with various veterans in the Software Industry, across various domains, reiterated my belief in the need to spend more time taking TTA to the next level and making it a more robust and feature-rich product.

Below are the details of the talk:


Abstract

In a fast-moving environment, where Continuous Integration (CI) and Continuous Delivery (CD) are a necessity and not a luxury, how can teams decide if a product is ready to be deployed to the next environment and go 'live'?

What is the health of your product portfolio at any point in time?
Can you identify patterns over a period of time that help you make better decisions to improve the quality of your product(s)?
Test Automation across all layers of the Test Pyramid enables teams to get quick feedback about the health of the product-under-test.

However, in an organization having multiple products in its portfolio, how can you get the collated quality / health information from all the products, quickly and in real-time? Or, for a large program of work, which has various projects being worked on in parallel by numerous teams across the world, how can the relevant people quickly get the consolidated quality / health information for the whole program?

In such cases, how can you:
- figure out any Trends / Patterns in the quality, or,
- do any meaningful Comparative Analysis (say between the quality of last release Vs the next release), or,
- do quick Failure Analysis and prioritize the 'fixing' of issues in an efficient fashion, and,
- do some quick Functional Performance Benchmarking.

At present this needs to be done manually.
Learn an effective way to answer the above questions - with TTA (Test Trend Analyzer), an open source product.

TTA gives you real-time and visual insights into the health of the product portfolio using the Test Automation results. This allows teams to take decisions on deploying the product to the next environment using actual data points, instead of 'gut-feel'.

Slides from the talk


Video from the talk

 

 


Monday, October 12, 2015

Web Analytics and the new kids in town!

I spoke in Agile Testing Alliance Global Gathering on 8th Oct in Bangalore on "The What, Why and How of Web Analytics Testing".

This talk was my take on explaining a very important, yet quite ignored, aspect of Product / Application Development - Web Analytics. Below is the abstract of the talk, followed by slides and video from the talk.


Topic: The What, Why & How of Web Analytics Testing

Learning Objectives:

The most used and heard-about buzzwords in the Software Industry today are … IoT and Big Data!

With IoT, given a creative mindset looking for opportunities and ways to add value, the possibilities are infinite. With each such opportunity, there is a huge volume of data being generated - which, if analyzed and used correctly, can feed into creating more opportunities and increased value propositions.

There are 2 types of analysis that one needs to think about.
1. How is the end-user interacting with the product? This gives some understanding of how to re-position the product and focus on its true value-add features.
2. With the huge volume of data being generated by end-user interactions, and captured by all the devices in the food-chain of the offering, it is important to identify patterns from what has happened, and find new product / value opportunities based on usage patterns.

Learn what Web Analytics is, why it is important, and see some techniques for testing it manually as well as automating that validation.


Slides from the talk

Wednesday, September 23, 2015

Selenium Conference 2015 - it simply came, and went so fast

It's been a crazy summer - the 2nd week of September 2015 just amplified that…

A good few months ago, we - the Selenium Conference Planning Committee - started on the journey of planning this year's Selenium Conference 2015. We started by debating where to hold this year's conference, till Portland magically came up on the radar and became a reality. We met over Google Hangout every 2 weeks initially, and then, as we got closer to the date, every week.

Can't believe that as I write this post, the conference has already been over for a couple of weeks …

The team put in a lot of hard work - me doing the least of it … and the turnout (approx. 500 people), the interactions and the quality of talks prove the hard work paid dividends.

I traveled from Pune, India on 5th Sept at around 6pm headed to Portland, Oregon. The journey - from home to the hotel took approximately 35 hours.

After a crazy 4 days, a total of around 25-30 hours of sleep across 5 nights (thanks to the jet lag), and having delivered 3 talks as well, it was another 35-hour trip back home ... The only good thing about this hectic trip - I never adjusted to the US time zone, which meant no jet lag when I came back home :) That was a first for me :)

Slides & Videos from Selenium Conference 2015:

All the slides and videos for all the talks are available here.

Below is the list of my talks:

To Deploy or Not-to-Deploy - decide using TTA's Trend & Failure Analysis

I got a lot of very good feedback on this talk, and quite a few people expressed interest in trying TTA out! Looking forward to feedback from their experiences!


Video of the talk is available on YouTube here:


Slides are available here:


Automate across Platform, OS, Technologies with TaaS

This topic is very relevant to anyone working in a large enterprise, or anywhere a common test automation framework is being "mandated".

Video of the talk is available here:

Slides are available here:


Say ‘No’ to (more) Selenium Tests

I paired with Bhumika on this talk. We were very agile in preparing for it - a day in advance, to be precise. It was also a very bold topic to bring to a Selenium Conference - standing in front of 200+ Selenium enthusiasts and telling them: do NOT write more Selenium tests. But it went pretty well ... given that we were able to walk out of the room on our own feet, and that people got the message we were trying to deliver :D

Video of the talk is available here:


Slides are available here:


Thursday, May 28, 2015

vodQA Pune - Innovations in Testing

vodQA Update - Agenda + Slides + Videos


Here is an update of the vodQA that went by at supersonic speed!


We had an intense and action-packed vodQA in ThoughtWorks, Pune on Saturday, 6th June 2015 - with the theme - Innovations in Testing!

Here are some highlights from the event:
  • You can find the details of the agenda + links to slides & videos from here or here.
  • After record-breaking attendee registrations (~500), we frantically closed off registrations. Based on historic attendance trends, this meant around 140-180 people would show up. 135 attendees made it to vodQA - the first person reaching the office at 8.30am, when the event was only supposed to start at 10am! That is enthusiasm!
  • We had 45+ speaker submissions (and we had to reject further submissions because registrations had already closed). After speaking to all the submitters, and a lot of dry-runs and feedback, we eventually selected 6 talks, 4 lightning talks and 4 workshops from this massive list.
  • We were unfortunately able to select only 2 external speakers (purely based on content + relevance to the theme). One of these speakers travelled all the way from Ahmedabad to Pune on his own to deliver a Lightning Talk.
  • We had a few ThoughtWorkers travelling from Bangalore (2 speakers + 1 attendee) and 1 (speaker) from Gurgaon
  • We had around 30-40 ThoughtWorkers participating in the conference. 
  • No event in the office can be possible without the amazing support from our Admin team + support staff!
  • Overall - we had around 200 people in the office on a Saturday!
  • For the first time, we did a live broadcast of all the talks + lightning talks (NO workshops). This allowed people to connect with vodQA as it happened. Also, what is usually the last and most cumbersome piece of post-event processing - uploading videos - was now the first thing completed. See the videos here on youtube. This update got delayed because we still have to get the link to the slides :(
  • We celebrated the 5th Birthday of vodQA!
  • Even though most projects in TW Pune are running at 120+% delivery speed, we were able to pull this off amazingly well! This can only happen when individuals believe in what they are contributing towards. Thank you all!
  • We wrapped up most of the post-event activities (office-cleanup, retro, post-vodQA dinner and now this update email) within 5 days of the vodQA day - another record by itself!
  • Some pictures are attached with this email.
You can see the tweets and comments in the vodQA group on facebook

Again, A HUGE THANKS to ALL those who participated in any way!

On behalf of the vodQA team + all the volunteers!










-----------------------------------------------------------------------------------------------------------------------------------------

[UPDATE]

The detailed agenda, with expected learning and speaker information, is available here (http://vodqa-pune.weebly.com/agenda.html) for vodQA Pune - Innovations in Testing.

NOTE:
- Each workshop has a limited number of seats.
- Registration for workshops will be done at the Attendee Registration Desk between 9am-10am on vodQA day.
- Registration will be on a first-come, first-choice basis.
- See each talk / workshop's details (below) for pre-requisites, if any.


----------------

vodQA is back in ThoughtWorks, Pune on Saturday, 6th June 2015. This time the theme is - "Innovations in Testing".

We got a record number of submissions from wannabe speakers and a HUGE number of attendee registrations. Selecting 12-14 talks from this list was no small task - we had to make a lot of tough decisions.

The agenda is now published (see here - http://vodqa-pune.weebly.com/agenda.html) and we are looking forward to a very rocking vodQA!

Tuesday, May 19, 2015

Role of Automation in Testing

I am speaking at the Discuss Agile 2015 conference on 13-14 June 2015 on the following topics - 

As part of this conference, I also did an interview with Saket Bansal and Atulya Mishra on - The Role of Automation in Testing.

This was an interesting virtual interview - interested people had asked questions during registration, and a lot of questions also came up live during the interview.

Below is the video recording of the interview. 


I also referenced some slides when speaking about some specific topics. Those can be seen below, or directly from slideshare.




Monday, May 11, 2015

vodQA Geek Night in ThoughtWorks, Hyderabad - Client-side Performance Testing Workshop

I am conducting a workshop on "Client-side Performance Testing" in vodQA Geek Night, ThoughtWorks, Hyderabad from 6.30pm-8pm IST on Thursday, 14th May, 2015.

Visit this page to register!

Abstract of the workshop:

In this workshop, we will see the different dimensions of Performance Testing and Performance Engineering, and focus on Client-side Performance Testing. 

Before we get to doing some Client-side Performance Testing activities, we will first understand how to look at client-side performance, and put that in the context of the product under test. We will see, using a case study, the impact of caching on performance - the good & the bad! We will then experiment with tools like WebPageTest and Page Speed to understand how to measure client-side performance.

Lastly - just understanding the performance of the product is not sufficient. We will look at how to automate this testing - using WebPageTest (a private instance setup), and experiment with yslow as a low-cost, programmatic alternative to WebPageTest.
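
As a taste of what that automation can look like (a sketch, not part of the workshop material): WebPageTest exposes a REST API where runtest.php submits a run and returns JSON containing a test id that can then be polled for results via jsonresult.php. Below is a minimal Java sketch; the API key and page URL are placeholders.

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class WebPageTestRun {
    public static void main(String[] args) throws Exception {
        String apiKey = "YOUR_API_KEY";              // placeholder
        String pageUrl = "https://www.example.com";  // page to measure
        URI uri = URI.create("https://www.webpagetest.org/runtest.php"
                + "?url=" + pageUrl + "&f=json&k=" + apiKey);
        HttpClient client = HttpClient.newHttpClient();
        HttpResponse<String> response = client.send(
                HttpRequest.newBuilder(uri).GET().build(),
                HttpResponse.BodyHandlers.ofString());
        // The JSON response carries the test id and result URLs; poll
        // jsonresult.php?test=<id> until the run completes, then read metrics
        // such as time to first byte, start render and document complete.
        System.out.println(response.body());
    }
}

A private instance works the same way - point the URI at your own WebPageTest server instead of webpagetest.org.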

Venue:

ThoughtWorks Technologies (India) Pvt Ltd.
3rd Floor, Apurupa Silpi,
Beside H.P. Petrol Bunk (KFC Building),
Gachibowli,
Hyderabad - 500032, India 

Wednesday, January 28, 2015

vodQA Cocktail - early in 2015

As we get ready for Celebrating Selenium's 10 year journey in vodQA Hyderabad, ThoughtWorks Chennai is ready to take vodQA to the next level on Saturday, 21st February, 2015 with an interesting Cocktail of topics related to Software Testing.

Register here as a speaker for vodQA Chennai, or here as an attendee.

Tuesday, January 13, 2015

Start 2015 by Celebrating Selenium in vodQA Hyderabad

ThoughtWorks, Hyderabad is proud to host its first vodQA - also the first vodQA of 2015 - and kick off the 10 Years of Selenium celebration. This event will be held on Saturday, 31st Jan 2015.

Look at the agenda of this vodQA and register soon. Given that this vodQA consists mostly of workshops, seats are going to be limited!

Here are the address and directions to the ThoughtWorks office.

UPDATE:

Slides for my talk on the "Future of Testing, Test Automation and the Quality Analyst" are now available here:

Wednesday, December 24, 2014

Disruptive Testing with Julian Harty

As part of the Disruptive Testing series, the last interview of 2014, with Julian Harty is now available here (http://www.thoughtworks.com/insights/blog/disruptive-testing-part-8-julian-harty) as a video interview. The transcript of the same is also published.

Also look at ThoughtWorks Insights for other great articles on a variety of topics and themes.

Saturday, November 22, 2014

To Deploy or Not to Deploy - decide using Test Trend Analyzer (TTA) in AgilePune 2014

I spoke on the topic - "To Deploy or Not to Deploy - decide using Test Trend Analyzer (TTA)" in Agile Pune, 2014.

The slides from the talk are available here, and the video is available here.



 

Below is some information about the content.


The key objective of organizations is to provide / derive value from the products / services they offer. To achieve this, they need to be able to deliver their offerings in the quickest time possible, and with good quality!
For these organizations to understand the quality / health of their products at a quick glance, typically a team of people scrambles to manually collate and collect the information needed to get a sense of the quality of the products they support.


So in this fast-moving environment, where CI (Continuous Integration) and CD (Continuous Delivery) are now a necessity and not a luxury, how can teams decide whether the product is ready to be deployed to the next environment?


Test Automation across all layers of the Test Pyramid is one of the first building blocks to ensure the team gets quick feedback on the health of the product-under-test.

The next set of questions are:
  • How can you collate this information in a meaningful fashion to determine - yes, my code is ready to be promoted from one environment to the next?
  • How can you know if the product is ready to go 'live'?
  • What is the health of your product portfolio at any point in time?
  • Can you identify patterns and do quick analysis of the test results - to help with root-cause analysis of issues that have happened over a period of time, and to make better decisions to improve the quality of your product(s)?
The current set of tools is limited, and fails to give a holistic picture of quality and health across the life-cycle of the products.
 

The solution - TTA - Test Trend Analyzer
 
TTA is an open source product that becomes the source of information giving you real-time and visual insights into the health of the product portfolio using Test Automation results, in the form of Trends, Comparative Analysis, Failure Analysis and Functional Performance Benchmarking. This allows teams to take decisions on deploying the product to the next environment using actual data points, instead of 'gut-feel' based decisions.
 
There are 2 sets of audience who will benefit from TTA:
1. Management - who want to know, in real time, the latest state of test execution trends across their product portfolios / projects. They can also use the data in the trend analysis views to make more informed decisions on which products / projects need more or less focus. Views like the Test Pyramid View and Comparative Analysis help in looking at results over a period of time, and using that as a data point to identify trends.

 
2. Team Members (developers / testers) - who want to do quick test failure analysis and get to the root cause as quickly as possible. Views like Compare Runs, Failure Analysis and Test Execution Trend help the team on a day-to-day basis.
 
NOTE: TTA does not claim to give answers to the potential problems. It gives a visual representation of test execution results in different formats which allow team members / management to have more focussed conversations based on data points.

Some pictures from the talk ... (Thanks to Shirish)








Saturday, November 15, 2014

The decade of Selenium

Selenium has been around for over a decade now. ThoughtWorks has published an eBook on the occasion - titled - "Perspectives on Agile Software Testing". This eBook is available for free download.

I have written a chapter in the eBook - "Is Selenium Finely Aged Wine?"

An excerpt of this chapter is also published as a blog post on utest.com. You can find that here.

Sunday, June 29, 2014

What reporting / reporters do you use with Selenium / WebDriver?

Test execution reports are usually an after-thought when doing automation.
 
  • What reporting techniques have you used on your automation projects (language / tools probably do not matter)? 
  • What is the test log format used? 
  • Do you use any special / different plugins, or rely on the CI tools with some plugins?
  • What value do you get out of these reports?
 
 

Monday, June 23, 2014

To Deploy or Not to Deploy - decide using Test Trend Analyzer (TTA)

[UPDATE: 18th July 2014] I spoke on the same topic - "To Deploy or Not to Deploy - decide using Test Trend Analyzer (TTA)" at Unicom's World Conference on Next Generation Testing in Bangalore on 18th July 2014. The slides are available here and the video is available here. In this talk, I also gave a demo of TTA.
 
I spoke in 3 conferences last week about "To Deploy or Not to Deploy - decide using Test Trend Analyzer (TTA)"

You can find the slides here and the videos here:

Here is the abstract of the talk:

The key objective of organizations is to provide / derive value from the products / services they offer. To achieve this, they need to be able to deliver their offerings in the quickest time possible, and with good quality. To understand the quality / health of their products at a quick glance, typically a team of people scrambles to manually collate and collect the information needed to get a sense of the quality of the products they support. So in this fast-moving environment, where CI (Continuous Integration) and CD (Continuous Delivery) are now a necessity and not a luxury, how can teams decide whether the product is ready to be deployed to the next environment? Test Automation across all layers of the Test Pyramid is one of the first building blocks to ensure the team gets quick feedback on the health of the product-under-test.
 

The next set of questions are:
    •    How can you collate this information in a meaningful fashion to determine - yes, my code is ready to be promoted from one environment to the next?
    •    How can you know if the product is ready to go 'live'?
    •    What is the health of your product portfolio at any point in time?
    •    Can you identify patterns and do quick analysis of the test results - to help with root-cause analysis of issues that have happened over a period of time, and to make better decisions to improve the quality of your product(s)?
 

The current set of tools is limited, and fails to give a holistic picture of quality and health across the life-cycle of the products.
 

The solution - TTA - Test Trend Analyzer
 

TTA is an open source product that becomes the source of information giving you real-time and visual insights into the health of the product portfolio using Test Automation results, in the form of Trends, Comparative Analysis, Failure Analysis and Functional Performance Benchmarking. This allows teams to take decisions on deploying the product to the next environment using actual data points, instead of 'gut-feel' based decisions.

Monday, May 19, 2014

WAAT at StarEast2014

I was speaking about "Build the 'right' regression suite using Behavior Driven Testing (BDT)" at StarEast 2014, and met Marcus Merrell, who was speaking about "Automated Analytics Testing with Open Source Tools". Deep into our conversation, I figured out that he had used WAAT; a few others at the table were also aware of WAAT and had used it before. Felt great!

Thursday, May 8, 2014

Update from Webinar on "Build the 'right' regression suite using BDT" for NY Selenium Meetup

I had a challenging, yet good, time speaking in a Webinar for the New York Selenium Meetup community on how to "Build the 'right' regression suite using Behavior Driven Testing (BDT)". The webinar was conducted on 6th May 2014 at 6.30pm, and I am very thankful to Mona Soni for helping organize it.

Before I speak about the challenges, here are the slides and the audio + screen recording from the webinar. The video is not cleaned up ... I started recording the session and we then waited a few minutes before starting off - you can forward to around the 01:15 mark, from where the audio starts.

This was challenging because of 2 main reasons:
> With a webinar, I find it difficult to connect with the audience. I am not able to gauge whether the content is something they already know (so I can proceed faster), whether they are not following (so I need to go slower), or whether the topic is just not interesting enough to them. There may be other reasons as well, but I just do not get the real-time feedback which is so important when explaining a concept and a technique.
Though there were some good interactions and great questions in the chat, I miss that eye-to-eye connection. This webinar was conducted using GoToMeeting. Maybe next time I do this, I should try to get webcams enabled for at least a good few attendees, to read that body language.

> The 2nd challenge was purely my own body not adjusting well enough. I had flown from India to Florida to speak at the STAREAST 2014 conference just a couple of days earlier, and was still adjusting to the jet lag. Evenings turned out to be my lowest-energy points of the day, and I found myself struggling to keep focus, talk and respond effectively. I would like to apologize to the attendees if they felt my content delivery was not up to the mark for this reason.

I appreciate any feedback on the session, and look forward to connecting with you to talk about Testing, Test Automation, my open-source tools (TaaS, WAAT, TTA) and, of course, BDT!

Thursday, April 10, 2014

Sample test automation framework using cucumber-jvm

I wanted to learn and experiment with cucumber-jvm. My approach was to think of a real **complex scenario that needs to be automated, and then build a cucumber-jvm based framework to achieve the following goals:
  • Learn how cucumber-jvm works
  • Create a bare-bones framework, with all the basic requirements, that can be reused
Since I already knew the basics and fundamentals of building scalable and maintainable Test Automation frameworks, it was really easy to apply my past learning and experience to pick up cucumber-jvm and build a framework from scratch.

So, without further ado, I introduce to you the cucumber-jvm-sample Test Automation Framework, hosted on github. 

The following functionality is implemented in this framework:

  • Tests specified using cucumber-jvm
  • Build tool: Gradle
  • Programming language: Groovy (for Gradle) and Java
  • Test Data Management: Samples to use data-specified in feature files, AND use data from separate json files
  • Browser automation: Using WebDriver for browser interaction
  • Web Service automation: Using cxf library to generate client code from web service WSDL files, and invoke methods on the same
  • Take screenshots on demand and save on disk
  • Integrated cucumber-reports to get 'pretty' and 'meaningful' reports from test execution
  • Using the Apache logger for storing test logs in files (and also reporting to the console)
  • Using AspectJ to do byte-code injection to automatically log the test trace to a file. Also creating a separate benchmarks file to track the time taken by each method. This information can be mapped separately in other tools, like Excel, to identify patterns in test execution.
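
To give a flavour of what tests in such a framework look like, here is a minimal feature + step definition pair for a Google search scenario. This is a sketch, not the actual repo code: it uses the current io.cucumber package names and Cucumber Expressions (the 2014-era repo would use the older cucumber.api.java.en imports and regexes), and the element locator is an illustrative assumption.

// search.feature (Gherkin):
//   Feature: Google search
//     Scenario: Searching for a term shows results
//       Given I am on the Google home page
//       When I search for "cucumber-jvm"
//       Then the page title contains "cucumber-jvm"

import io.cucumber.java.After;
import io.cucumber.java.en.Given;
import io.cucumber.java.en.Then;
import io.cucumber.java.en.When;
import org.openqa.selenium.By;
import org.openqa.selenium.Keys;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;

public class SearchSteps {
    private WebDriver driver;

    @Given("I am on the Google home page")
    public void openHomePage() {
        driver = new ChromeDriver();
        driver.get("https://www.google.com");
    }

    @When("I search for {string}")
    public void searchFor(String term) {
        // 'q' is Google's search box name - an assumption for this sketch
        driver.findElement(By.name("q")).sendKeys(term + Keys.ENTER);
    }

    @Then("the page title contains {string}")
    public void titleContains(String expected) {
        if (!driver.getTitle().contains(expected)) {
            throw new AssertionError("Title was: " + driver.getTitle());
        }
    }

    @After
    public void closeBrowser() {
        if (driver != null) driver.quit();
    }
}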

Feel free to fork and use this framework on your projects. If there are any other features you think are important to have in a Test Automation Framework, let me know. Even better would be to submit pull requests with those changes, which I will take a look at and accept if they make sense.

** Pun intended :) The complex test I am talking about is a simple search using google search.

Friday, March 28, 2014

WAAT Java v1.5.1 released today

After a long time, and with a lot of push from collaborators and users of WAAT, I have finally updated WAAT (Java) and made a new release today.

You can get this new version - v1.5.1 directly from the project's dist directory.

Once I get some feedback, I will also update WAAT-ruby with these changes.

Here is the list of changes in WAAT_v1.5.1:

Changes in v1.5.1

  • Engine.isExpectedTagPresentInActualTagList in the Engine class is made public
  • Updated Engine to work without creating a testData.xml file, by directly sending the expectedSectionList for the tags. Added a new method:
        Engine.verifyWebAnalyticsData(String actionName, ArrayList<Section> expectedSectionList, String[] urlPatterns, int minimumNumberOfPackets)
  • Added an empty constructor to Section.java to prevent a marshalling error
  • Support for fragmented packets
  • Updated Engine to support Pattern comparison, instead of String contains
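
For anyone trying the new API, here is a hedged usage sketch. Only the verifyWebAnalyticsData signature above comes from this changelog - the action name, URL pattern, and the way the Engine and Section instances are obtained are illustrative placeholders (WAAT's own imports are omitted; see the project README for the real setup).

import java.util.ArrayList;

public class WaatUsageSketch {
    // 'engine' and the Section list are assumed to be built as per the WAAT
    // README; the exact package names are deliberately not shown here.
    public void verifyHomePageTags(Engine engine, ArrayList<Section> expectedSectionList) {
        String[] urlPatterns = {"GoogleAnalytics"};  // illustrative URL pattern
        // New in v1.5.1: pass the expected sections directly - no testData.xml needed
        engine.verifyWebAnalyticsData("OpenHomePage", expectedSectionList, urlPatterns, 1);
    }
}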
Do let me know if you see any problems / issues with this update.

Thanks.