2010 Global Testing Survey Results: Automation Testing

Data was compiled and analyzed by Michael Hackett, LogiGear Senior Vice President. This is the first analysis of the 2010 Global Testing Survey. More survey results will be included in subsequent magazine issues. To read the overview, visit www.logigear.com/survey-response-overview.html

The target audience of the survey was black box testers. Please note that for these respondents, test automation is mainly about UI-level automation, not unit, performance, or load testing.

These automation survey results contain two mutually exclusive sections: one set of questions for teams that currently automate tests, and another for teams that currently do not automate any tests.

I. Overview

Before delving into the respondents’ frame of mind with their answers to the Test Automation questions, I will highlight some results from the Politics of Testing, Training, Strategy, and Overview sections that set the stage for a better understanding of the issues these respondents face in test automation.

PT1 (Politics of Testing)- What phase or aspect of testing do you feel the management at your company does not understand? (You can select multiple answers.)

Projects are most often behind schedule because of shifting requirements, not test delays. 43%

How to measure testing 41%

How to adequately schedule testing 41%

Test automation is not easy 40%

Testing requires skill 35%

The impossibility of complete testing 32%

Choosing good coverage metrics 23%

None, my management team fully understands testing. 17%

Result analysis: The third-highest response, virtually tied with those above it, is that management does not understand that test automation is not easy!

PT2- What challenges does your product team have regarding quality? (You can select multiple answers.)

Insufficient schedule time 63%

Lack of upstream quality assurance (requirements analysis and review, code inspection and review, unit testing, code-level coverage analysis) 47%

Feature creep 39%

Lack of effective or successful automation 36%

Poor project planning and management 33%

Project politics 31%

Poor project communication 27%

Inadequate test support tools 25%

No effective development process (SDLC) with enforced milestones (entrance/exit criteria) 25%

Missing team skills 23%

Low morale 16%

Poor development platform 8%

Result analysis: By far, the #1 answer here is insufficient schedule time. The #4 answer is a lack of successful automation. It is too easy to say that more investment in test automation will solve all your team’s problems, but it will definitely help: more, and more effective, test automation always helps projects.
It is not the answer to everything; as the second-highest choice makes clear, a lack of upstream quality practices cannot be solved by downstream test automation. But better test automation will go far in helping the manual test effort and, by doing so, at least relieve some tester stress.

T1 (Training)- What skills are missing in the group? (You may select multiple answers.)

Automation skill 52%

Coding/tool building 42%

Technical understanding of the code, system, platform, environment 35%

Subject matter/domain knowledge 33%

QA/testing skill 32%

Test tools (ALM, test case manager, bug tracking, source control, continuous integration tool) 22%

Results analysis: Interesting but not surprising, the most-chosen answer regarding what teams lack, selected by more than half the respondents, is test automation skills! It is clear that acquiring test automation skills is the single most important step a tester can take for job security.

S1 (Strategy)- How do you do regression testing?

Both (manual and automated) 47%

Manual 34%

Automated 15%

We do not do regression testing 4%

Results analysis: A very big surprise to me was the lack of automated regression! Wow. That is one of the biggest and most surprising results of the entire survey! Why do one-third of teams still do all-manual regression? Bad idea, bad business objective.

O1 (Overview)- Do you currently automate testing?

Yes 63%

No, not currently 37%

If no, has your team ever automated testing?

Yes 90%
No 10%

Results analysis: Over one-third of respondents currently do not automate tests, a result contrary to popular belief and to any sort of best practice. Out in the business world I see many teams that think everyone automates, and that they themselves automate enough. I also see many teams where all testing is manual and automation is seen as uncommon, too difficult, and not something testers do. This number is alarmingly high. Any team not automating has to look seriously at the service it is providing its organization, as well as the management support it is receiving from that organization!

II. For teams that currently automate testing

A1 (Automate)- Have you been trained in test automation?

Yes 70%

No 30%

Results analysis: For teams that currently automate, having 30% of staff untrained in test automation is deeply problematic. When this is the case, I often see too much technical knowledge centralized in too few people. This hurts the career growth of the rest of the staff, whether business analysts or subject matter experts, and it demonstrates that management does not invest in its people. If your team automates and you have been left behind, it is a good idea to get a book, take a class, educate yourself, and insinuate yourself into test automation for your own career planning!

A2- Estimate what percentage of the test effort is automated?

Less than 25% 36%

50 – 74% 32%

Over 75% 14%

25 – 49% 13%

Very little 5%

All our testing is automated 0%

Results analysis: It is still amazing to see how little test groups automate. Over 40% of teams automate less than 25% of their tests! That only 46% of teams automate more than half of their test effort shows that very significant strides can still be made in reducing the dependence on manual testing.

A3- How would you describe your automated test effort?

It is a very effective part of our test effort. 33.3%

It is key to meeting schedule dates. Our test schedule would be much longer without automation. 19%

It is key to improving product quality. 14.3%

It is key to product stability. 9.5%

It is effective for just what we have automated and does not significantly affect the rest of the test effort. 9.5%

It frees me up from repetitive tasks to do more interesting, exploratory testing. 9.5%

It is somewhat useful, but has not lived up to expectation or cost. 4.8%

It is the most important part of our test effort. 0%

It is a waste of time and does not tell us much. 0%

Results analysis: What surprises me most about this set of answers is that no one thought the automated tests were the most important part of their test effort. What does this say? The respondents take their automation for granted? It isn’t trusted or it isn’t good? Or quite possibly, it is important but having a human manually interact with the system is the most important part of the test effort! Exactly 1/3 said it was a very effective part of the effort. Almost 20% said it is key to meeting the schedule. The fact that these numbers are not higher shows how test automation has not yet achieved its full potential in helping teams.

A4- How do you measure test automation effectiveness?

Time saved from manual execution 81%

Percentage of lines of code covered by automated tests 14.3%

Number of bugs found 4.8%

Product stability measured by number of support/help desk calls from users or hot fix/patch fixes 0%

We do not measure test effectiveness for automation 0%

Results analysis: This is an overwhelming vote for measuring test automation by time saved from manual testing. An important observation: we do not measure test automation by bugs found. It is important for teams to understand this, and clearly they do.
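The time-saved metric the respondents favor is easy to compute. The following is a minimal sketch; the suite size and per-test timings are illustrative assumptions, not figures from the survey:

```python
# Hypothetical sketch of the survey's top metric: automation effectiveness
# measured as time saved versus manual execution. All numbers below are
# made-up examples, not survey data.

def hours_saved_per_cycle(num_tests, manual_min_per_test, automated_min_per_test):
    """Hours saved each time the suite runs automated rather than manually."""
    saved_minutes = num_tests * (manual_min_per_test - automated_min_per_test)
    return saved_minutes / 60

# Example: a 400-case regression suite, 6 minutes per test run manually,
# 0.5 minutes per test run automated, executed 10 times per release.
per_cycle = hours_saved_per_cycle(400, 6, 0.5)
per_release = per_cycle * 10
print(f"{per_cycle:.1f} hours saved per cycle, {per_release:.0f} per release")
```

Tracking this number per release cycle also makes the automation investment visible to the management teams that, per the Politics of Testing results, do not understand how to measure testing.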

A5- What strategy do you use to design your automated tests?

Direct scripting of manual test cases 33.3%

Data driven 19%

Action based/keyword 19%

Record and playback 14.3%

No specific method 14.3%

Results analysis: These results are interesting because they show the varying levels of sophistication across teams’ automation efforts.
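To make the first two strategies concrete, here is a minimal sketch using a hypothetical login() function as the system under test; the function, names, and data are assumptions for illustration, not from the survey:

```python
# Stand-in for the application under test (hypothetical).
def login(user, password):
    return user == "admin" and password == "s3cret"

# 1. Direct scripting: each manual test case becomes its own script.
def test_valid_login():
    assert login("admin", "s3cret") is True

def test_invalid_password():
    assert login("admin", "wrong") is False

# 2. Data driven: one script, many input rows. Adding coverage means
# adding a data row, not writing more code.
LOGIN_CASES = [
    ("admin", "s3cret", True),
    ("admin", "wrong", False),
    ("guest", "s3cret", False),
]

def test_login_data_driven():
    for user, password, expected in LOGIN_CASES:
        assert login(user, password) is expected

test_valid_login()
test_invalid_password()
test_login_data_driven()
print("all login checks passed")
```

The action-based/keyword strategy goes one step further: test steps are written as named actions in a table, and a small interpreter maps each keyword to code, separating test design from automation code.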

A6- What is the main benefit of test automation?

More confidence in the released product 42.1%

Faster releases/meeting the schedule 36.8%

Higher product quality 10.5%

Waste of time 5.3%

None/no benefit 5.3%

Finding more bugs 0%

More interesting job/skill than manual testing 0%

Finding fewer bugs 0%

Less focus on bug finding 0%

Slower releases 0%

Additional useful comment from a respondent: “Some of the best uses of automation are to exercise the application, perform redundant or error-prone tasks, configure or set up test environments, and to execute regression tests.”

Results analysis: The results are encouraging for automation: there is a consensus that testers believe test automation provides more confidence in the quality of the product and increases the ability to meet schedules.

III. For teams that do not automate testing

A7- Have you tried to automate tests for your product?

Yes 56%

No

44%

Results analysis: A surprising number of teams have never tried automating! I think this is another dark secret of software testing. Speculating, it is my opinion that many companies have never invested in test automation, do not realize its benefits, are afraid to try something new, or realize they need a significant investment to make it work and are not willing to further fund testing. It could also be that teams tried automating and gave up, were not supported by the development organization, or had test tool funding cut. These are situations that need to be addressed for test teams to provide long-term benefit to the organization.

A8- What would prevent you from automating tests now? (You may select multiple answers.)

Investment in automation program of training, staff, tool, and maintenance cost is too high. 43.8%

Tool cost too high. 37.5%

Management does not understand what it takes to successfully automate. 37.5%

Not enough technical skill to build successful automation. 37.5%

Code or UI is too dynamic to automate. 37.5%

Test case maintenance cost too high. 25%

Not enough knowledge about how to make automation effective. 25%

It will not help product ship with better quality or in shorter schedule. 25%

Bad project management will not be helped by automating tests. 12.5%

Results analysis: The reasons teams do not automate vary widely, but the pattern is clear: cost, management’s misunderstanding of automation, and lack of knowledge are the great downfalls of test automation.

A9- Are you, or is your current team, technically skilled to the point where you can build and maintain test automation? (Remember, this response is only from teams currently not automating.)

No 56%

Yes 44%

Results analysis: If you are a software tester without much knowledge about automation, it would be best for your own career security to dive into test automation, read as much as you can, learn about the best methods, and see what various tools can actually do. Take responsibility to learn about the future of your career.

A10- Would product quality improve if you automated tests?

Yes 69%

No 31%

Results analysis: It is problematic that 31% of respondents do not see product quality improving with automation. Some teams may think the quality of their product is already high and that they do not need to automate. For the less optimistic teams, a few ideas may be at work: not understanding what information automated tests do and do not give you; not understanding which tasks can be automated to free up time for deeper manual testing; and, for some teams, resignation to a low-quality product, regardless of how their testing gets done.

IV. Participant Comments

The following statements are comments from respondents and their experience with test automation:

  1. “Make sure the test cases that are designated for automation are of the right quality.”
  2. “I have oh so many. I worked for a test tools company for over 8 years. How much time do you have? ;-) Common problems were inflated expectations of automation, lack of skills to implement effectively, lack of cooperation with the dev teams in enabling testability features, etc.
    But the worst stories I have are related to an over-reliance on automation where tools replaced people for testing and products were shipped with confidence that failed in the field horribly upon first use. These scenarios were VERY embarrassing and cause me to often throw the caution flag when I see a team driving toward ‘automating everything.’”
  3. “Automation frees up resources to do more stuff. Instead of spending 5 days running a manual regression the automated regression runs in 1/2 day and the team can spend a day doing data validation and exploratory testing.”
  4. “Test automation requires collaboration between developers, automation engineer, and functional test engineer. The more transparent the automation effort is, the more useful it will be.”
  5. “Identify the appropriate tools for your project. Use it everywhere possible, when in the testing”
  6. “I’m not sure if this is experienced by testers worldwide, but I had several encounters of IT Project Managers having the misconception of test automation. They have the expectation that every functionality in the application should be fully automated.
    As a test automation engineer, I’ve [learned] that in an application, not all functionality could be automated due to limitations on the automated tool. Or the application function is too complex to automate, producing an ineffective test automation effort.
    My strategy to overcome this is to advise the manager to identify the critical functionalities that can be effectively automated, thus reducing manual testing effort significantly and reaping the most out of the automated tool.”

Next month’s survey analysis is on Test Strategy and SDLC.
