
Showing posts from July, 2017

Test automation framework architecture. Part 1 - No architecture

Most of the "UI test automation" tutorials I have seen describe a test automation solution where Selenium WebDriver is used directly from test methods and no additional abstraction layer exists. This "architectural pattern" is so ubiquitous that I decided to describe it as well. I think we can call this "architectural pattern" No architecture. The structure of a test automation solution created with the No architecture pattern is presented (in a very rough way) in the picture below. Such a test automation solution usually consists of some number of test classes, each containing some number of test methods. All orchestration of interaction with the System Under Test (SUT) is done either (using JUnit terms) in the setUp and tearDown methods, or in the test methods themselves. The tool, whatever it is - Selenium, RestAssured, Selenide - is called directly in the test methods. Such an approach is a well-known industry anti-pattern. However, in some cases, it may be ok…
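To make the pattern concrete, here is a minimal sketch of what such a test class typically looks like (JUnit 4 plus Selenium; the login page URL and element ids are hypothetical, invented purely for illustration): the driver is created in setUp, Selenium is called directly from the test method, and everything is cleaned up in tearDown.

```java
import org.junit.After;
import org.junit.Before;
import org.junit.Test;
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;

import static org.junit.Assert.assertTrue;

// "No architecture": the test class talks to Selenium directly,
// with no page objects or other abstraction layer in between.
public class LoginTest {

    private WebDriver driver;

    @Before
    public void setUp() {
        // Driver is created right in the test class
        driver = new ChromeDriver();
        driver.get("https://example.com/login"); // hypothetical SUT URL
    }

    @Test
    public void userCanLogIn() {
        // Selenium calls live inside the test method itself
        driver.findElement(By.id("username")).sendKeys("demo");
        driver.findElement(By.id("password")).sendKeys("secret");
        driver.findElement(By.id("login-button")).click();
        assertTrue(driver.findElement(By.id("welcome-message")).isDisplayed());
    }

    @After
    public void tearDown() {
        // Cleanup is also handled directly in the test class
        driver.quit();
    }
}
```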

Test automation framework architecture. Preface.

Some time ago I wrote this post describing my understanding (at that time) of the architectures used to create a test automation solution (framework). While there's some information I still agree with, my understanding has evolved and I want to share it with others. First, let's coin the terminology I use (which is not necessarily the right one - feel free to suggest something different). Test automation framework: a framework that allows one to write automated tests. Usually one means a specialized framework, i.e. a framework that is specialized for one or several related applications under test. A framework does not include the tests themselves. Test automation solution: a complete solution used for test automation. It includes everything needed to perform the automated tests, including the tests themselves. A solution may be based on a framework; however, it is not mandatory. Architecture: may be described as an imaginary model t…

Agile team - rockband metaphor

I've been thinking about a proper metaphor for the traditional (aka waterfall) and adaptive (aka agile) processes and came to this - let's think about a rock band. Waterfall: You can create a neat studio album using random people. First, you record the drums, then the bass. After that, you can try different guitars... If the guitar sounds bad, you try to fix it or use something else. Then you do the mastering and mixing, adding effects and creating a nice cover. In a couple of months, you will have an album with clean sound, neat content and a great cover. The key is that here you should pay more attention to the process. You need to have a good plan (lyrics and music). This is going to be lengthy, but you will be fine even if the bass guitarist decides to leave in the middle of the process - just arrange a replacement. In music it works fine, but in business time plays against you and the plan may no longer be relevant. Agile: You and your band are on a stage. You did have time f…

Industry Carelessness

Looking through the LinkedIn feed I see a horrible thing. I think we can call it "Industry Carelessness". I see lots of posts of the kind: "Automating testing using XXXXium", "Digital transformation using microservices architecture (please download our brochure)", "We will transform your business, here's how", and so on. This is very disturbing. It is like "we don't know what problem you have (if any), but we already have a solution!" Moreover, I worked for a company whose whole business model was like that: delivering what it had in place instead of what was necessary. I interviewed employees who did not know where we were but already knew where we should go and how to get there. "Ok, we shall do this and that." "Hold on a second, don't you want to know what we already do and what results we get?" "Not necessary, the industry is going in that direction, so let's do things 'right'." And even worse, I wa…

An effective test automation

So what does it mean to have effective test automation? Let's say I can automate 10 000 tests a day. An impressive number, isn't it? Does it mean that my test automation is effective? We can hardly tell without knowing other things. It turns out that test automation effectiveness has little to do with how many tests you can automate in a day. To judge effectiveness we need to think about what value test automation adds. That leads me to the thought that test automation may really be the wrong term. Test automation is an activity, not a product. Even worse, the product that test automation creates (the value it produces) may be created by other means, too. What value does test automation create? Faster execution of 70 000 UI test scenarios? I don't think so. A nice test result report? Well, maybe, but not necessarily. My favourite one: "elimination of human error in testing". I will not even bother to comment. So what is the value it creates? My best guess nowadays would be that testing is a…