Showing posts from 2017

Manual and automated testing confusion

I am getting tired of posts about how "automated testing" is going to replace "manual testing".

Let me offer you a simple analogy. I have a car that has lots of useful self-check lights, like "check engine", "low gas", "low battery", etc. Automated tests are similar to those lights: if a light is on, most likely something is wrong and I need a mechanic to look at my car. Does having those self-check lights let me skip the yearly human-driven check at the mechanic? It doesn't. The fact that a self-check light is not on does not mean my car is OK, does it?

Areas covered well by self-checks may require less time to inspect (as there is a lower risk that they are broken). However, there are lots of areas that are not feasible to cover with self-checks. There are some car parts I am not even aware of, while the mechanic knows their weak points and can find an issue in a couple of minutes.

It is possible to have only automated checks and not have mechanics to loo…

The perceived quality level of software may have dropped, but testing is not the answer.

"Modern software is of a lower quality than it was in the past".

Maybe. The perceived quality of software may have decreased, but I don't think that "more testing" is the proper solution.
More testing does not mean more quality
More testing may find more issues, but not necessarily. Simply spending more time on an activity does not mean the results will be better. And somebody needs to fix the bugs and test the bug fixes.
So we can't say that more testing means more quality, but it certainly means bigger costs.
Software quality != No bugs
Bugs matter, but they are hardly the only factor in measuring the quality level. There is a whole bunch of other things that matter: UX, number of features, documentation, price, delivery model, cute logos...
Spending more time on testing may mean spending less time on these things.

Adequate quality level
So it is obvious that one can't spend 100 years testing every possible case. I think that each product has some adequate level of …

Agile is not the goal, but means

So we sit in the room and discuss the transformation plan for a company that had been working in a waterfall manner for nearly a decade. The final goal is to switch to iterative development with a reasonably small iteration (let's say, 3 weeks). The "only" problem we had is that we have 3 weeks of manual regression, and it does not really fit the plan. We have options:
- A long waterfall-like sprint, where in the first half of the sprint we develop features and in the second half we test them.
- A series of short sprints and one "hardening"/"release" sprint dedicated to regression when we finally decide to release things.
There's a third obvious option, which is to decrease the regression time, but no one seems to know how that could be done in the next 5 years. So it gets dropped.
"We need to have a potentially shippable increment each sprint; "hardening" sprints are not Agile!" - says one man in the room.

Well, that may be true. Maybe it is not as cool as Spotify Eng…

Two different views on Test Automation

I have published several posts with the aim of delivering one message: sometimes it is more efficient (faster and more convenient) to change the application under test (make it more testable or eliminate the need for testing at all) than to invent or employ complicated test automation techniques to check the same functionality. Even though there was a lot of misunderstanding caused by poor wording in those posts, I still think I struck something deeper.

For instance, this Twitter post made me think that we speak two different languages:

That's like asking a pharma company to self-certify that their drugs are safe without any independent approval! #softwaretesting#CIO — Ayush Trivedi (@ayushtrivedi) 4 September 2017
Now it has started to seem to me that there are two different views on what test automation is. The first, probably prevailing, point of view is that test automation is a part of the ages-old traditional QA process, where a test automation specialist is just a test specialist using tools to test

The broken concept of a Page object, or Why Developers Should Be Responsible For Test Automation

Preface: I am in the middle of writing a series of posts about test automation framework architecture. I am still going to continue that series, even though this post kind of devalues the whole test automation framework concept a bit. Sorry for that; I just can't stop ranting.

Looking through the internet I spotted a couple of posts where some test automation specialists were talking about the "page object" "pattern"/"model" as if it were something special they had invented.
Well, I also have something to say about the "page object".

"Page object" may be described as a pattern that allows us to decouple the things you can do with a web page (an external interface describing test/business logic) from the real implementation code you have to use to actually get things done with this bloody web page. If one has heard about GoF patterns, she may think about …
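A minimal sketch of that decoupling, with a stub standing in for Selenium's WebDriver (all the class names and locators here are illustrative, not from any real project):

```python
# Page Object sketch: the test talks to LoginPage in business terms;
# only LoginPage knows locators and driver calls. The FakeDriver below
# is a stand-in for a real Selenium WebDriver, just to keep this runnable.

class FakeDriver:
    """Stub driver: records what was typed into / clicked on which locator."""
    def __init__(self):
        self.typed = {}
        self.clicked = []

    def type(self, locator, text):
        self.typed[locator] = text

    def click(self, locator):
        self.clicked.append(locator)


class LoginPage:
    """External interface: what you can DO with the page."""
    USER_FIELD = "#username"
    PASS_FIELD = "#password"
    SUBMIT_BTN = "#login-button"

    def __init__(self, driver):
        self._driver = driver  # implementation detail hidden from tests

    def login_as(self, user, password):
        self._driver.type(self.USER_FIELD, user)
        self._driver.type(self.PASS_FIELD, password)
        self._driver.click(self.SUBMIT_BTN)


# A test never touches locators directly - it expresses intent:
driver = FakeDriver()
LoginPage(driver).login_as("alice", "secret")
```

If the page markup changes, only `LoginPage` changes; every test that calls `login_as` stays intact - which is the whole point of the pattern.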

Test automation framework architecture. Part 2 - Layered architecture

Probably the most popular architecture pattern used for test automation frameworks (TAF) is layered architecture. This pattern is so well known that in job interviews at some companies, when they ask you about TAF architecture, you are supposed to describe this one. If you don't, they think you know nothing about architecture at all.

I suggest first reading the brilliant description of the pattern on the O'Reilly web page, because in this post I am going to describe the pattern the way it is usually applied to build a test automation solution.

Usually there are three distinct layers, which may have different names but mostly follow the same logic. Sometimes those layers are called the test layer, the business layer and the core layer, but there are really no standard names. The key rules of layered architecture are the dependency direction (each layer depends on the layer below) and the call direction (no layer can call/reference code defined in a layer above).

The rough structure of such…
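As a rough illustration of the three layers and the top-down call direction (layer names and the checkout domain are made up for this sketch; a real core layer would wrap Selenium or RestAssured rather than a stub):

```python
# Layered TAF sketch: test layer -> business layer -> core layer.
# Each layer depends only on the one below it; nothing calls upward.

# --- Core layer: low-level tool wrapper, knows nothing about the domain.
class Core:
    def __init__(self):
        self.log = []  # records raw actions, standing in for real driver calls
    def open(self, url):
        self.log.append(("open", url))
    def click(self, locator):
        self.log.append(("click", locator))

# --- Business layer: domain actions, depends only on the core layer.
class CheckoutSteps:
    def __init__(self, core):
        self.core = core
    def purchase(self, item):
        self.core.open("/shop")
        self.core.click(f"#buy-{item}")
        self.core.click("#checkout")

# --- Test layer: scenarios, depends only on the business layer.
def test_purchase_reaches_checkout():
    core = Core()
    CheckoutSteps(core).purchase("book")
    assert ("click", "#checkout") in core.log

test_purchase_reaches_checkout()
```

The payoff of the dependency rule: a tool swap touches only `Core`, a workflow change touches only the business layer, and the tests keep reading like scenarios.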

Test automation framework architecture. Part 1 - No architecture

Most of the "UI test automation" tutorials I have seen describe a Test Automation Solution where Selenium WebDriver is used directly from test methods and no additional abstraction layer exists. This "architectural pattern" is so ubiquitous that I decided to describe it as well.

I think we can call this "architectural pattern" No architecture. The structure of a test automation solution created with the No architecture pattern is presented (in a very rough way) in the picture below:

Such a test automation solution usually consists of some number of test classes, each containing some number of test methods. All orchestration of the interaction with the System Under Test (SUT) is done either (using JUnit terms) in setUp and tearDown methods, or in the test methods themselves. The tool, whatever it is - Selenium, RestAssured, Selenide - is called directly from the test methods.
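The shape looks roughly like this (the driver below is a stub standing in for Selenium WebDriver, and the URL and locators are made up; the point is only that raw tool calls sit right inside the test method):

```python
# "No architecture" sketch: the tool is driven directly from the test
# method, with no page objects or layers in between.

class StubDriver:
    """Stand-in for a real WebDriver so the sketch runs anywhere."""
    def __init__(self):
        self.actions = []
    def get(self, url):
        self.actions.append(("get", url))
    def type_into(self, css, text):
        self.actions.append(("type", css, text))
    def click(self, css):
        self.actions.append(("click", css))

def test_login():
    driver = StubDriver()                     # setUp would normally create this
    driver.get("https://example.test/login")  # raw tool calls, no abstraction:
    driver.type_into("#user", "alice")        # every test repeats these locators
    driver.type_into("#pass", "secret")
    driver.click("#submit")
    assert ("click", "#submit") in driver.actions
    # tearDown would quit the driver

test_login()
```

Every test re-states URLs and locators, so one UI change ripples through the whole suite - which is why this counts as an anti-pattern at scale.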

Such an approach is a well-known industry anti-pattern. However, in some cases it may be OK to use. There are some…

Test automation framework architecture. Preface.

Some time ago I wrote this post describing my understanding (at that time) of the architectures used to create a test automation solution (framework). While there is some information in it I still agree with, my understanding has evolved and I want to share it with others.

First, let's coin the terminology I use (which is not necessarily the right one - feel free to suggest something different).

Test automation framework
It is a framework that allows one to write automated tests. Usually one means a specialized framework, i.e. a framework specialized for one or several related applications under test. The framework does not include the tests themselves.

Test automation solution
It is a complete solution used for test automation. It includes everything needed to perform automated tests, including the tests themselves. A solution may be based on a framework, but that is not mandatory.

May be described as an imaginary model that dictates how code is of the…

Agile team - rockband metaphor

I've been thinking about a proper metaphor for the traditional (aka waterfall) and
adaptive (aka agile) processes, and came to this: let's think about a rock band.

Waterfall
You can create a neat studio album using random people. First you will record the drums, then the bass. After that you can try different guitars...

If a guitar sounds bad you will try to fix it or use something else. Then you will do mastering and mixing, adding effects and creating a nice cover. In a couple of months you will have an album with clean sound, neat content and a great cover.

The key is that here you should pay more attention to the process. You need to have a good plan (lyrics and music). It is going to be lengthy, but you will be fine even if the bass guitarist decides to leave in the middle of the process - just arrange a replacement. In music it works fine, but in business time plays against you and the plan may lose its relevance.

Agile
You and your band are on stage. You did have time for a soundcheck but were drinking bee…

Industry Carelessness

Looking through the LinkedIn feed I see a horrible thing. I think we can call it "Industry Carelessness". I see lots of posts of this kind:
- Automating testing using XXXXium
- Digital transformation using microservices architecture (please download our brochure)
- We will transform your business, here's how
And so on. This is very disturbing. It is like "we don't know what problem you have (if any), but we already have a solution!" Moreover, I worked for a company whose whole business model was like that: delivering what it had in place instead of what was necessary.

I interviewed employees who did not know where we were, but already knew where we should go and how we would get there.

"OK, we shall do this and that."
"Hold on a second, don't you want to know what we already do and what results we get?"
"Not necessary; the industry is going in that direction, so let's do things 'right'."

And even worse, I was once the one who did…

An effective test automation

So what does it mean to have effective test automation? Let's say I can automate 10,000 tests a day. A pretty big figure, ain't it? Does it mean that my test automation is effective? We can hardly tell without knowing other things. It turns out that test automation effectiveness has little to do with how many tests you can automate a day. To judge effectiveness we need to think about what value test automation adds.

That leads me to the thought that "test automation" may really be the wrong term. Test automation is an activity, not a product. Even worse, the product that test automation creates (the value it produces) may be created by other means, too.

What value does test automation create? Faster execution of 70,000 UI test scenarios? I don't think so. A nice test result report? Well, maybe, but not necessarily. My favorite one is "elimination of human error in testing". I will not even bother to comment.

So what value does it create? My best guess nowadays would be that testing is an activity withi…

Musing about ethics and software development

There's something that I have never seen in the curriculum of IT degrees: professional ethics, and I think it is a huge miss. Ethics is taught to lawyers, MDs, teachers and students of lots of other degrees. Ethics tells us that there is something beyond our job responsibilities. Ethics reminds us that the one who is footing the bill may not be the final decision maker on everything. That if something is legal, it does not yet mean it is the right thing to do.

Let me share a story about one of my previous projects. I was working for an IT services company and we were helping our client deliver a new version of their software. A peculiar thing was that if there were a bug in the product, something horrible could happen (in the worst case, somebody could die). And I was the quality guy on the project.

Our sponsor (the one who was footing the bill, and the ultimate decision maker) had his deadlines. He had already made a demo for the marketing people and wanted to ship the software. Users we …

Exploratory testing and bug hunting fun

Being an SDET specialist, I do lots of coding, automating, meetings, refactoring, code reviewing and other related things these days. I do enjoy most of my daily activities, and I didn't think that being forced to do some manual testing could be anything interesting to me.

It so happened, though, that I was the only QA-focused specialist on the team, and we had to provide quality feedback on a product that had had little (close to none at all) unit test coverage. Inspired by James Bach's and Michael Bolton's Rapid Software Testing methodology, I devised an exploratory test session plan for the application, distributed the activities between team members and did my best to find any possible issues in the application we worked on.

Being overwhelmed by regular work activities, I couldn't even imagine how much fun this bug hunt could be! I found quite a few issues, and I felt like House, M.D., Detective Columbo, or somebody of a similar fashion. I was chasing from one hint to another, …