Everything You Wanted to Know About Test Automation


1
Everything You Wanted to Know About Test
Automation
  • Brian Le Suer

QAAC - Quality Assurance Association of CT
February 8, 2007
2
Overview
  • Test Automation Truths or Myths
  • Frequently Asked Questions About Test Automation
  • Testing Dynamic Applications

3
Truth or Myth?
  • Recorders make test automation easier
  • Myth
  • The Truth
  • They make recording easy
  • But they also make it easy to create automation
    that is difficult and expensive to maintain
  • A recorder can serve as a training device to help
    you learn a scripting language

4
Truth or Myth?
You can't develop test automation for an
    application until it is stable
  • Myth
  • The Truth
You should develop test plans as soon as the
    requirements are documented
  • You can also develop automated tests from
    wire-frames, mock-ups and prototypes
  • In the case of wire-frames, you can even test
    some of your methods
  • If tests are structured properly, only a small
    portion of your work will have to be maintained
    when GUI changes are made to an application that
    has not yet stabilized.

5
Truth or Myth?
  • Test automation is most effective for regression
    testing
  • Part Truth
  • Test automation is very effective for regression
    testing
  • It is also effective for first pass testing when
    a data driven approach is used
  • Automated tests should be designed to find bugs,
    not just verify that regression tests don't break

6
Truth or Myth?
  • It is difficult to automate a dynamic application
    where data changes constantly
  • Myth
  • The Truth
  • Data can be separated from test code so that only
    the data needs to be changed
  • Better yet, in some cases input data can be
    captured at runtime to meet the criteria required
    for a test
  • Or a database query can be executed to get data
    at runtime
  • Verification data can also be captured at runtime
    so that tests remain flexible in a dynamic
    environment
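
The separation of data from test code described above can be sketched as a data-driven test. This is a minimal Python illustration, not the deck's own code; the CSV columns, `price_of` stand-in, and function names are hypothetical. When the catalog changes, only the data rows change, not the test logic.

```python
import csv
import io

# Hypothetical test data kept outside the test code; in practice this
# would live in a separate .csv file maintained alongside the suite.
TEST_DATA = """sku,quantity,expected_total
A100,2,19.98
B205,1,5.49
"""

def price_of(sku):
    # Stand-in for the application under test.
    return {"A100": 9.99, "B205": 5.49}[sku]

def run_data_driven_tests(data_file):
    """Run the same test logic against every data row; return failing SKUs."""
    failures = []
    for row in csv.DictReader(data_file):
        total = price_of(row["sku"]) * int(row["quantity"])
        if abs(total - float(row["expected_total"])) > 0.001:
            failures.append(row["sku"])
    return failures

print(run_data_driven_tests(io.StringIO(TEST_DATA)))  # → []
```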

7
Truth or Myth?
  • Test automation is a software development task
  • Truth
  • Automated tests should be designed, developed and
    tested
  • Automated test components are assets that should
    be treated like application source code
  • Test automation standards should be developed
  • Test automation should be subject to peer reviews

8
Truth or Myth?
  • Productivity gains will be realized from test
    tools right away
  • Myth
  • Test automation is an investment
  • Quality gains will be realized right away
  • Productivity gains will be realized over time as
    skills develop and manual testing tasks are
    offset by completed automation

9
Truth or Myth?
  • Automated tests are difficult to maintain
  • Myth
  • The Truth
  • Separate business rules, test methods and data to
    promote maintainability
  • If properly structured, automated tests can be
    maintained across many releases of a target
    application
  • It's important to run automated tests on every
    build and make incremental changes as needed

10
63% of Test Automation Projects Fail
  • Misconceptions
  • Testing tools don't work
  • Testing tools are too difficult to use
  • My application is too complex
  • Our project schedules are too tight
  • Management will never support it
  • Automation will eliminate the need for all manual
    testing
  • Manual testing must have taken place before
    automating a feature

11
What are the most important components to have in
place before implementing test automation?
12
Setting realistic goals
  • Rules of thumb are difficult to apply
  • Conduct a small pilot project
  • Less than a month in duration
  • After creating just enough infrastructure,
    track time to automate each feature
  • Compare time to test manually
  • Use this data for subsequent projects
  • Continue to update data
  • Automation will become more efficient after the
    first few projects

13
Evaluating Staff Skill Sets
  • Need at least one member with experience
  • Need at least one member with programming skills
  • Methodology should be appropriate for staff skill
    sets
  • Provide training beyond test tool vendor
    offerings

14
Developing Good Test Plans
  • Test plans structured by test objectives are
    optimized for automation
  • Results in more efficient automation
  • Promotes reuse of test components
  • Test plan scenarios do not provide adequate
    structure
  • Results in automated tests that are long and
    complex
  • Results in automated tests that are too broad

15
Alternative Approaches
  • Scenario Based
  • Strengths
  • Provides detail and context to test tasks
  • Replicates how system is used in production
  • Provides a script to follow to execute the test
  • Weaknesses
  • No structured view of test requirements
  • (difficult to determine test coverage)
  • Often delayed until late in the dev cycle
  • Generally limited to 1 platform, 1 release
  • Outline Based
  • Strengths
  • Can start immediately (before product)
  • Provides a summary of coverage without reading
    and understanding entire document
  • Emphasizes what over how
  • Results in a more comprehensive set of tests
  • Results in tests that have a clear objective
  • Weaknesses
  • Requires better understanding of requirements
  • Requires earlier internal communications

16
Example Outline
17
Designing Good Test Cases
  • Test cases should have a single objective
  • Test cases should result in one of two
    dispositions pass or fail
  • Test cases are independent
  • No test case relies on the successful completion
    of another test
  • Test cases start and stop at a known base state
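
The principles above can be sketched in code: each test has a single objective, passes or fails, and starts from a known base state so no test depends on another. A minimal Python sketch, assuming a hypothetical `Cart` application object (the deck itself does not prescribe a framework):

```python
import unittest

class Cart:
    """Stand-in for the application state under test (hypothetical)."""
    def __init__(self):
        self.items = []
    def add(self, sku):
        self.items.append(sku)

class CartTests(unittest.TestCase):
    def setUp(self):
        # Every test starts from the same known base state,
        # so no test relies on another having run first.
        self.cart = Cart()

    def test_cart_starts_empty(self):
        # Single objective, single pass/fail disposition.
        self.assertEqual(self.cart.items, [])

    def test_add_single_item(self):
        self.cart.add("A100")
        self.assertEqual(self.cart.items, ["A100"])

result = unittest.TextTestRunner(verbosity=0).run(
    unittest.defaultTestLoader.loadTestsFromTestCase(CartTests))
print(result.wasSuccessful())  # → True
```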

18
Evaluating QA Culture
  • Test automation will not be successful if it's a
    skunk-works
  • Automated tests need to be integrated with manual
    testing
  • All QA staff members should have a role in test
    automation

19
Choosing Best Automation Candidates
  • Poor candidates
  • Long or complex transactions
  • One-offs
  • Unstable or difficult to predict results
  • Tests that cross multiple applications
  • Good candidates
  • Short or simple transactions
  • Many data combinations
  • Expected results are stable or easy to generate
    at runtime
  • Tests that are executed regularly
  • Tasks that are difficult to do manually
  • Highest priority features

20
Preparing Test Data
  • Maintain control of test data
  • Establish data ownership
  • Strive to re-establish data state where possible
  • Where data is dynamic, develop techniques to
    predict results at runtime
  • Search Engines
  • Retail Order Entry

21
Meeting Equipment Requirements
  • Provide 2 machines on the desktop of each
    automated test engineer
  • One for developing new automation
  • One for testing/debugging recent automation
  • Provide a lab for shared usage
  • Automation should run against every build
  • All automation should run within a reasonable
    time
  • Automation should run against all supported
    platforms

22
Building A Strong Framework
  • Make test case maintenance a top priority
  • Strive for early successes to gain management
    support
  • Maximize reuse of automated test components
  • Ensure that test cases run across supported
    platforms
  • Ensure that test cases are not machine-dependent

23
Establishing Source Code Control
  • Test code should be protected using the same
    standards as application code
  • Any of the commercial packages will suffice
  • Use the system being used for application code

24
Developing Standards
  • Implement automation standards
  • Coding is faster and more efficient
  • Maintenance costs are lower
  • Learning curves are reduced
  • Staff can be easily redeployed

25
What are the most important aspects to consider
when evaluating testing tools?
26
Evaluating Testing Tools
  • Does the test tool support all of the required
    platforms?
  • Is the learning curve of the testing tool and
    approach appropriate for staff skill set?
  • Weigh the ease of script maintenance at least as
    high as ease of developing scripts
  • Does testing tool provide a means for reusing
    automation components?

27
Evaluating Testing Tools
  • Does the test tool provide a recovery mechanism?
  • Is there a powerful scripting language?
  • Does the test tool recognize application objects?

28
When in the project should the test automation
effort begin?
  • Automation should begin during the design phase
  • Test plans can be written from requirements
    documents
  • Wire frames or mock ups can be used to capture
    application objects and write basic methods

29
How can an application be made friendly to
testing tools?
30
Building In Testability
  • Proper naming of application pages and objects
  • Addition of hidden controls
  • Use of standard objects
  • Build custom objects and choose third-party
    controls that are automation-friendly

31
Provide Unique Page Names
32
Name Arrays of Similar Controls
33
Adding Hidden Html Objects
  • Add a hidden html object for each control that
    returns the value of the control
  • <input type="hidden" name="qa_headerpage"
    value="qa_username=Tester,,qa_itemsincart=1
    item,,qa_cart_total=13.79" />
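
A test can read such a hidden control and split its value string back into name/value pairs. This Python sketch assumes the `",,"`-delimited, `"="`-separated format shown on the slide (the exact separators in the original deck are a reconstruction):

```python
def parse_qa_field(value):
    """Split a hidden QA control's value string into name/value pairs.

    Assumes ",," separates fields and "=" separates each name from its
    value, per the slide's (reconstructed) example.
    """
    pairs = {}
    for part in value.split(",,"):
        name, _, val = part.partition("=")
        pairs[name] = val
    return pairs

value = "qa_username=Tester,,qa_itemsincart=1 item,,qa_cart_total=13.79"
print(parse_qa_field(value)["qa_cart_total"])  # → 13.79
```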

34
Custom/Third Party Objects
  • Publish useful methods (e.g. SelectCell)
  • Publish properties (e.g. sClip)
  • Add new properties for test verification
  • Provide an hWin for each application control
  • Copy-enable text fields
  • Add hidden control (same color as background) to
    reflect value of rendered text

35
Custom/Third Party Objects
  • Implement keyboard short-cuts and accelerators
  • Follow platform standards for keyboard commands,
    e.g. <HOME>, <END>, <CTRL-HOME>, <SPACE>, etc.
  • Use test applets from third-party vendors to
    verify compatibility before purchasing

36
Automating Dynamic Applications
37
Automating Dynamic Applications
  • The Problem Defined: when building automated
    tests for dynamic applications
  • Application objects are not static
  • Test input data must change frequently
  • Expected results are difficult to predict
  • Requirements for a successful solution
  • Strategies for testing a dynamic application

38
The Problem
  • Dynamic applications change so rapidly that
    traditional approaches to test automation yield a
    poor return on investment
  • Paradox: Changes in applications invalidate the
    tests that were designed to verify that the
    changes haven't broken anything
  • Examples include eCommerce store-fronts, news web
    sites, eTrading sites, knowledge bases.

39
The Problem
  • Objects related to dynamic content are not
    static
  • Today's forecast
  • The day's top story
  • Share price of a stock
  • Featured item for sale
  • Available abstracts

40
Will This Look The Same Tomorrow?
41
Will This Look The Same Tomorrow?
42
Will This Look The Same Tomorrow?
43
Will This Look The Same Tomorrow?
44
Will This Look The Same Tomorrow?
45
The Problem
  • Input data must be frequently changed
  • Product catalogs change
  • Coupons expire
  • SKUs go out of stock
  • A stock price may rise or fall beyond a threshold
    required for a specific test
  • The calendar rolls forward

46
The Problem
  • Expected results are difficult to predict
  • Item prices change
  • Expiration dates are exceeded
  • Results of a search vary with available content
  • Inventory/Availability of products is fluid
  • Daily content keeps changing

47
Different Approach Required
  • Automated approach must be closer to the way a
    manual tester would react to changes
  • Automated tests must be more flexible
  • Dynamic objects can not be declared
  • Input data must be generated on the fly
  • Expected results must be captured at runtime

48
Strategies
  • Declare objects generically or dynamically
    instantiate objects
  • Parse content from page
  • Use data from one page to verify data on another
  • Use other available tools for verification
  • Query the database for test data that meet the
    attributes required for test cases

49
Declare Only Static Objects
  • Objects that tend to be static
  • Action Buttons
  • Category Links
  • Objects that tend to be dynamic
  • Most Links
  • HtmlText
  • Data stored in Tables

50
(No Transcript)
51
Create Generic Objects
  • Declare objects generically
  • HtmlLink FirstItem
  • tag httpCatalog?Browse?sku.asp?PageType
  • Dynamically instantiate objects
  • MyWindow.HtmlLink(iLink).GetText()
  • MyWindow.HtmlLink(iLink).Click()
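
The slide's `MyWindow.HtmlLink(iLink)` idiom addresses links by runtime index rather than by declared name. A stdlib-Python sketch of the same generic-object idea (the deck's own examples are SilkTest 4Test; the page content and names here are hypothetical):

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collect every link on a page so a test can address them by index
    instead of declaring each dynamic link individually."""
    def __init__(self):
        super().__init__()
        self.links = []
        self._in_link = False
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._in_link = True
            self.links.append({"href": dict(attrs).get("href"), "text": ""})
    def handle_data(self, data):
        if self._in_link:
            self.links[-1]["text"] += data
    def handle_endtag(self, tag):
        if tag == "a":
            self._in_link = False

page = '<a href="/sku/1">Featured Item</a> <a href="/sku/2">Daily Deal</a>'
collector = LinkCollector()
collector.feed(page)
i_link = 0  # index chosen at runtime, like MyWindow.HtmlLink(iLink)
print(collector.links[i_link]["text"])  # → Featured Item
```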

52
Parse content from page
  • Finding reliable tags for dynamic objects is not
    always possible
  • Columns and rows may not be seen consistently by
    the testing tool when table contents are dynamic
  • It's easy to copy the page contents onto the
    clipboard and then parse it to get the value of
    dynamic objects
  • Store the data in a file for verification of
    subsequent pages in a transaction

53
(No Transcript)
54
Writing a Method to Parse a Page
  • Send the keystroke ("<Ctrl-a><Ctrl-c>")
  • Click in a safe location to clear the highlight
  • Get the contents of the clipboard into a LIST OF
    STRING
  • Use your testing language's string functions to
    locate a static string to use as an anchor
  • Get the value of dynamic fields as off-sets of
    the anchor
  • Store the data in a .INI file
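
The steps above can be sketched in Python, with the captured clipboard text simulated as a list of strings (the deck's LIST OF STRING). The anchor string, field names, and offsets are hypothetical, not from the deck:

```python
import configparser

def parse_page(lines, anchor, offsets, ini_path, section):
    """Locate `anchor` in the captured page text, read dynamic fields at
    fixed line offsets from it, and store them in an INI section."""
    base = lines.index(anchor)
    config = configparser.ConfigParser()
    config[section] = {name: lines[base + off] for name, off in offsets.items()}
    with open(ini_path, "w") as f:
        config.write(f)

# Simulated clipboard capture of a cart page.
clipboard = ["Your Shopping Cart", "Item1Name", "Widget", "Item1Cost", "9.99"]
parse_page(clipboard, "Item1Name",
           {"item1name": 1, "item1cost": 3}, "page.ini", "CartPage")

check = configparser.ConfigParser()
check.read("page.ini")
print(check["CartPage"]["item1cost"])  # → 9.99
```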

55
The resulting LIST OF STRING
56
(No Transcript)
57
(No Transcript)
58
A ParsePage Method
59
The resulting INI file section
60
Using Parse Page Results
  • Use data captured by ParsePage as expected
    results for test cases
  • Write a generic method for comparing two pages or
    comparing a page to a list of predicted results
  • A VerifyPage method should compare data on the
    current page with data captured on a previous
    page (using ParsePage)

61
An Example VerifyPage Method Prototype
  • VOID VerifyPage (STRING sIniFile, STRING
    sPrevPageIniTag, STRING sThisPageIniTag, LIST OF
    STRING lsIniVerify)
  • sIniFile is the name of the Ini file
  • sPrevPageIniTag is the section in the Ini file
    containing the previous page's data
  • sThisPageIniTag is the section in the Ini file to
    write the current page's data
  • lsIniVerify is the list of values to verify in
    each section, e.g.
  • LIST OF STRING lsItem1AndOrderTotal =
    {"Item1Name", "Item1SKU", "Item1Cost",
    "Item1Shipping", "OrderTotal"}

62
Find Alternative Means of Verification
  • In dynamic applications, it is not always
    possible to explicitly predict the results of a
    transaction
  • For example, you may not be able to predict a
    stock price, an interest rate, the current
    temperature
  • It may not be possible or even desirable to
    completely control this data in a test
    environment
  • Can you determine the expected result at runtime?
  • Sometimes alternative means of verification are
    needed

63
Example Search
  • The results of a search may not be the same on
    any given date or time
  • An alternative verification step might be to
    ensure the search term displays in all matches
  • An easy way to implement such a step is to use
    the browser's Find On Page feature
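
The check itself reduces to "does every result mention the search term?" A minimal Python sketch, with hypothetical result titles standing in for whatever the search returns that day:

```python
def verify_search_results(term, result_titles):
    """Return titles that do not mention the search term; an empty list
    means the (unpredictable) result set still passes the check."""
    return [t for t in result_titles if term.lower() not in t.lower()]

results = ["Blue Widget", "Widget Pro 2000", "Deluxe widget kit"]
print(verify_search_results("widget", results))  # → []
```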

64
FindOnPage example
65
Query the Database
  • To ensure test case data is current, consider
    executing queries to mine for data during or just
    before a test is to be run
  • Build queries that search for test data that
    fulfill attribute requirements for test cases
  • For example, if a test requires a SKU that is
    out of stock, assign the SKU to be used in the
    test based on the results of the query
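
A sketch of mining the database at runtime, using an in-memory SQLite stand-in (the table, columns, and "out of stock" attribute are hypothetical, not from the deck):

```python
import sqlite3

# In-memory stand-in for the application's product database.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE products (sku TEXT, in_stock INTEGER)")
db.executemany("INSERT INTO products VALUES (?, ?)",
               [("A100", 5), ("B205", 0), ("C310", 12)])

def find_out_of_stock_sku(conn):
    """Query for a SKU that meets the test's attribute requirement
    (here: out of stock) just before the test runs."""
    row = conn.execute(
        "SELECT sku FROM products WHERE in_stock = 0 LIMIT 1").fetchone()
    return row[0] if row else None

print(find_out_of_stock_sku(db))  # → B205
```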

66
Other Verification Strategies
  • Use a range or tolerance instead of a specific
    expected result
  • Parse for required keywords when full content can
    not be predicted
  • Check for the absence of error messages and
    presence of a result even if the content of the
    result can not be predicted
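
The range-or-tolerance strategy is a one-liner in any language; a Python sketch (the stock-quote scenario and numbers are illustrative):

```python
def verify_within_tolerance(actual, expected, tolerance):
    """Pass when the actual value falls within expected ± tolerance,
    instead of demanding an exact, unpredictable number."""
    return abs(actual - expected) <= tolerance

# e.g. a stock quote captured at runtime vs. yesterday's close
print(verify_within_tolerance(101.3, 100.0, 5.0))  # → True
print(verify_within_tolerance(120.0, 100.0, 5.0))  # → False
```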

67
Thank you
Star Quality, 23 College Street, Hopkinton, MA 01748
508-497-3413 · www.starquality.biz