Wednesday, April 30, 2014
TaaS blog post featured on ThoughtWorks Insights
My presentation on "Automate Your Tests across Platform, OS, Technologies with TaaS" is featured on the ThoughtWorks Insights page.
Monday, April 21, 2014
What are the criteria for determining success for Test Automation?
Test Automation is often thought of as a silver bullet that will solve the team's testing problems. As a result, there is a heavy investment of time, money, and people in building an Automated Suite of Tests of different types, in the expectation that it will solve all problems.
Is that really the case? We know the theoretical aspects of what is required to make Test Automation successful for a team. I want to know, from a practical perspective and with context, what did or did not work for you.
I am currently writing about "Building an Enterprise-class Test Automation Framework", and am gathering experience reports based on what others in the industry have seen.
I am looking for people to share stories from their current or past experiences of full or limited success with Test Automation, answering the following questions (at a minimum):
Context:
- What is the product under test like? (small / med / large / enterprise) (web / desktop / mobile / etc.)
- How long is the Test Automation framework envisioned to be used? (a few months, a year or two, more than a few years, etc.)
- What is the size of the team (both overall and for test automation)?
- Is the testing team co-located or distributed?
- What are the tools / technologies used for testing?
- Are the skills and capabilities uniform for the team members?
- Is the domain a factor in determining the success criteria?
Framework-related:
- What are the factors determining the success / failure of Test Automation implementation?
- What worked for you?
- What did not work as well?
- What could have been different in the above to make Test Automation a success story?
- What are the enablers in your opinion to make Test Automation successful?
- What are the blockers / anchors in your opinion that prevented Test Automation from being successful?
- Does it matter whether the team follows a Waterfall or an Agile methodology?
Thursday, April 10, 2014
Sample test automation framework using cucumber-jvm
I wanted to learn and experiment with cucumber-jvm. My approach was to think of a real **complex scenario that needs to be automated, and then build a cucumber-jvm-based framework to achieve the following goals:
- Learn how cucumber-jvm works
- Create a bare-bone framework with all basic requirements that can be reused
So, without further ado, I introduce to you the cucumber-jvm-sample Test Automation Framework, hosted on GitHub.
The following functionality is implemented in this framework:
- Tests specified using cucumber-jvm
- Build tool: Gradle
- Programming language: Groovy (for Gradle) and Java
- Test Data Management: Samples for using data specified in feature files, and for using data from separate JSON files
- Browser automation: Using WebDriver for browser interaction (a step-definition sketch follows this list)
- Web Service automation: Using the Apache CXF library to generate client code from web service WSDL files and invoke methods on the generated clients
- Take screenshots on demand and save them to disk
- Integrated cucumber-reports to get 'pretty' and 'meaningful' reports from test execution
- Using the Apache logger to store test logs in files (and also report to the console)
- Using AspectJ byte-code injection to automatically log a test execution trace to file. A separate benchmarks file also tracks the time taken by each method; this information can be analyzed in other tools like Excel to identify patterns in test execution (a sketch of such an aspect follows below)
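To make the WebDriver and on-demand-screenshot pieces more concrete, here is a minimal step-definition sketch. It is not taken from the repository: the class name, package, step wordings, and output path are assumptions, and the annotation package (cucumber.api.java.en here, matching cucumber-jvm 1.x) varies by version.

```java
package sample.steps; // hypothetical package, not from the actual repo

import cucumber.api.java.After;
import cucumber.api.java.en.Given;
import cucumber.api.java.en.Then;
import cucumber.api.java.en.When;
import org.openqa.selenium.By;
import org.openqa.selenium.OutputType;
import org.openqa.selenium.TakesScreenshot;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.WebElement;
import org.openqa.selenium.firefox.FirefoxDriver;

import java.io.File;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.nio.file.StandardCopyOption;

import static org.junit.Assert.assertTrue;

public class GoogleSearchSteps {

    private WebDriver driver;

    @Given("^I am on the Google search page$")
    public void openSearchPage() {
        driver = new FirefoxDriver();
        driver.get("http://www.google.com");
    }

    @When("^I search for \"(.*)\"$")
    public void searchFor(String term) {
        // "q" is the name of Google's search input field
        WebElement searchBox = driver.findElement(By.name("q"));
        searchBox.sendKeys(term);
        searchBox.submit();
    }

    @Then("^the page title should contain \"(.*)\"$")
    public void verifyTitle(String expected) throws Exception {
        assertTrue("Title did not contain: " + expected,
                   driver.getTitle().contains(expected));

        // Screenshot "on demand": copy WebDriver's temporary screenshot file to disk
        File shot = ((TakesScreenshot) driver).getScreenshotAs(OutputType.FILE);
        Files.createDirectories(Paths.get("build/screenshots"));
        Files.copy(shot.toPath(),
                   Paths.get("build/screenshots/search-result.png"),
                   StandardCopyOption.REPLACE_EXISTING);
    }

    @After
    public void closeBrowser() {
        if (driver != null) {
            driver.quit();
        }
    }
}
```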
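And here is a hedged sketch of what the AspectJ trace/benchmark logging could look like: an around-advice that writes entry/exit trace lines and a per-method timing entry. The pointcut expression, logger names, package, and the choice of Apache log4j are illustrative assumptions, not the framework's actual implementation.

```java
package sample.aspects; // hypothetical package

import org.apache.log4j.Logger;
import org.aspectj.lang.ProceedingJoinPoint;
import org.aspectj.lang.annotation.Around;
import org.aspectj.lang.annotation.Aspect;

@Aspect
public class TraceAndBenchmarkAspect {

    private static final Logger TRACE = Logger.getLogger("testTrace");
    private static final Logger BENCHMARK = Logger.getLogger("benchmarks");

    // Advise every public method in the (assumed) step-definition package
    @Around("execution(public * sample.steps..*(..))")
    public Object logAndTime(ProceedingJoinPoint joinPoint) throws Throwable {
        String method = joinPoint.getSignature().toShortString();
        TRACE.info("Entering " + method);
        long start = System.currentTimeMillis();
        try {
            return joinPoint.proceed();
        } finally {
            long elapsed = System.currentTimeMillis() - start;
            TRACE.info("Exiting " + method);
            // Simple "method,milliseconds" entries that can later be pulled into Excel
            BENCHMARK.info(method + "," + elapsed);
        }
    }
}
```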
Feel free to fork and use this framework on your projects. If there are any other features you think are important to have in a Test Automation Framework, let me know. Even better would be to submit pull requests with those changes, which I will review and accept if they make sense.
** Pun intended :) The complex test I am talking about is a simple search using Google search.
Tuesday, April 8, 2014
Disruptive Testing - An interview with Matt Heusser
Read an interview with Matt Heusser to learn what he thinks about Lean Software Testing, Scaling Agile, Testing Metrics, and other thought-provoking topics.