Acceptance Test Engineering Guide, Vol. I - Thinking about Acceptance RC1

Update: The Release Candidate of Volume I of the Acceptance Test Engineering Guide was released on October 26, 2009.

Download the guide

The Team

  • Main authors:
    • Grigori Melnik (program manager at Microsoft patterns & practices)
    • Gerard Meszaros (agile coach, test automation expert and author of xUnit Test Patterns - Refactoring Test Code)
  • Contributors and production:
    • Jon Bach (professional test strategist)
    • Michael Puleio (developer at p&p with a passion for testing)
    • Rohit Sharma (tester at p&p)
    • Hakan Erdogmus (applied researcher at Kalemun Research Inc.)
    • RoAnn Corbisier (technical writer and editor at p&p)
    • Dennis DeWitt (technical writer with Linda Werner & Associates Inc.)
    • Tina Burden (editor)
    • Richard Burte (production specialist)
    • Veronica Ruiz (graphic designer, CXR Design)
    • Members of the Advisory Board and invited reviewers
    • You, the community. We want to hear from you via the Discussion forum here, the Issue Tracker, and feedback comments on what we post. We will listen, even if we do not agree with the recommendations or comments.

Project Description

The Acceptance Test Engineering Guide will provide guidance for technology stakeholders (developers, development leads, testers, test leads, architects, etc.) and business stakeholders (managers, customers, end users, etc.) on the discipline of acceptance testing.

Why acceptance testing?

patterns & practices has produced only a few guides related to testing (including performance testing, security testing of web applications, and testing of .NET application blocks). However, we receive many requests from our customers for guidance on testing and test strategy in general, as well as on specific types of testing. Based on this customer feedback, and a review of the guidance already available, we determined that acceptance testing was the next area to invest in.

What is acceptance testing?

Working definitions for a number of terms are available in our Draft Glossary. The current definition that is framing our work and discussions is:
  • Acceptance Testing: Planned evaluation of a system by customers/customer proxies to assess to what degree it satisfies their expectations.
We are open to suggestions on this and other terms. Please leave comments on the Draft Glossary page.

What are we producing?

This guide is the first in a series of three dedicated to acceptance testing and requirements engineering:
  • Acceptance test engineering guide
  • Acceptance test automation guide
  • Tool support for acceptance test-driven development.

The first guide will cover the discipline of acceptance testing from several perspectives and contexts. It will provide models, heuristics, and a set of actionable job aids rooted in a sample application (a brief illustrative test sketch follows the list below). The focus is on:
  • How to Plan for Acceptance Testing
  • What Kinds of Acceptance Tests to Run
  • How to Create and Run Acceptance Tests
  • Defining What “Done” Means
  • How to Justify Your Approach
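
To make the "create and run" step concrete, here is a minimal sketch of an automated acceptance test written in a given/when/then style with Python's built-in unittest framework. The OrderService class, its place_order method, and the bulk-discount business rule are hypothetical, invented purely for illustration; the guide does not prescribe this example or any particular tool.

import unittest


# Hypothetical system under test: a tiny stand-in for a real order service.
class OrderService:
    def place_order(self, quantity, unit_price):
        if quantity <= 0:
            raise ValueError("quantity must be positive")
        total = quantity * unit_price
        # Assumed business rule for illustration: orders of 10+ items get a 10% discount.
        if quantity >= 10:
            total *= 0.9
        return round(total, 2)


class BulkDiscountAcceptanceTest(unittest.TestCase):
    """Acceptance criterion: customers ordering 10 or more items receive a 10% discount."""

    def test_bulk_order_receives_discount(self):
        # Given an order of 10 items at $5.00 each
        service = OrderService()
        # When the customer places the order
        total = service.place_order(quantity=10, unit_price=5.00)
        # Then the total reflects the 10% bulk discount
        self.assertEqual(total, 45.00)

    def test_small_order_pays_full_price(self):
        # Given an order below the discount threshold
        service = OrderService()
        # When the customer places the order
        total = service.place_order(quantity=2, unit_price=5.00)
        # Then the full price is charged
        self.assertEqual(total, 10.00)


if __name__ == "__main__":
    unittest.main()

Running the module executes both tests. The readable given/when/then structure is what allows customers or customer proxies to review the automated acceptance criteria alongside the team.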

What can you learn from the guide?

If any of the following goals apply to you, you will want to check out the guide.
[Figure: ATE_consumption_model.png]

Announcement: AcceptanceTestEngineering_Announcement.pdf

Fill out the p&p assets usage survey (new).
