Unit Test – Definition

Source: http://artofunittesting.com/definition-of-a-unit-test/

I used to feel that a ‘unit’ was the smallest possible part of a code base (a method, really). But in the past couple of years I’ve changed my mind. Here’s how I define a unit test, as of October 2011:

A unit test is an automated piece of code that invokes a unit of work in the system and then checks a single assumption about the behavior of that unit of work.

A unit of work is a single logical functional use case in the system that can be invoked by some public interface (in most cases). A unit of work can span a single method, a whole class or multiple classes working together to achieve one single logical purpose that can be verified.
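
To make this concrete, here is a minimal sketch using Python’s built-in unittest module (the OrderService and PriceCalculator names are hypothetical, not from the source): the unit of work spans two classes, is invoked through a single public entry point, and the test checks one assumption about the observable result.

```python
import unittest


class PriceCalculator:
    """Hypothetical collaborator: applies a percentage discount."""

    def __init__(self, discount_percent):
        self.discount_percent = discount_percent

    def final_price(self, base_price):
        return base_price * (1 - self.discount_percent / 100)


class OrderService:
    """Hypothetical public entry point; the unit of work spans both classes."""

    def __init__(self, calculator):
        self._calculator = calculator

    def total(self, base_prices):
        return sum(self._calculator.final_price(p) for p in base_prices)


class OrderServiceTests(unittest.TestCase):
    def test_total_applies_discount_to_every_item(self):
        # Invoke the unit of work through its public interface...
        service = OrderService(PriceCalculator(discount_percent=10))
        # ...and check a single assumption about its observable behaviour.
        self.assertAlmostEqual(service.total([100.0, 50.0]), 135.0)


if __name__ == "__main__":
    unittest.main()
```

The test neither knows nor cares how the work is split between the two classes; it only verifies the single logical outcome.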

A good unit test is:

  • Able to be fully automated
  • Has full control over all the pieces running (use mocks or stubs to achieve this isolation when needed; a sketch follows this list)
  • Can be run in any order if part of many other tests
  • Runs in memory (no DB or file access, for example)
  • Consistently returns the same result (you always run the same test, so no random numbers, for example; save those for integration or range tests)
  • Runs fast
  • Tests a single logical concept in the system
  • Readable
  • Maintainable
  • Trustworthy (when you see its result, you don’t need to debug the code just to be sure)
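
As a sketch of several of these criteria at once (again with hypothetical names, and Python’s unittest standing in for whatever framework you use), the test below keeps full control over its collaborators by handing the unit a hand-rolled in-memory stub instead of a real database, so it runs in memory, returns the same result every time, and stays fast:

```python
import unittest


class InMemoryUserRepository:
    """Hand-rolled stub standing in for a real database, so the test runs in memory."""

    def __init__(self, users_by_id):
        self._users = users_by_id

    def find_name(self, user_id):
        return self._users.get(user_id)


class GreetingService:
    """Hypothetical unit under test; it depends only on the repository's interface."""

    def __init__(self, repository):
        self._repository = repository

    def greet(self, user_id):
        name = self._repository.find_name(user_id)
        return f"Hello, {name}!" if name else "Hello, guest!"


class GreetingServiceTests(unittest.TestCase):
    def test_greets_known_user_by_name(self):
        # Full control over every collaborator: no database, no file access,
        # no randomness -- so the result is always the same and the test is fast.
        service = GreetingService(InMemoryUserRepository({42: "Ada"}))
        self.assertEqual(service.greet(42), "Hello, Ada!")


if __name__ == "__main__":
    unittest.main()
```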

I consider any test that doesn’t meet all of these criteria to be an integration test, and I put it in its own “integration tests” project.

Test Driven Development by Martin Blore

The path to TDD and software excellence…

Posted on June 2, 2010 by Martin Blore

The more I study TDD, and the more I read Robert C. Martin’s work on clean code principles, the more I see this pyramid of development maturity.

TDD feels like the icing on the cake. The cake must come first. In this case, the cake is the principles and practices of clean code. Without clean code and a knowledge of Agile software principles, TDD is a very difficult practice to apply. In my mind, the maturity model goes like this:

The TDD Maturity Model shows the level of difficulty and experience needed the further up the model you go. I have no doubt that some companies have taken TDD and enforced the process on their teams without first going through the preliminary stages of learning. This is dangerous. I have seen projects eventually bury themselves in complexity and grind to a halt for lack of clean code principles. When TDD is applied that way, not only will your software have fields of bad code inside it, but you now also have a test library to maintain with an equal amount of bad code, if not more. The level of complexity that long-running software projects can build up is scary. Maintenance costs and improvements become 10 times more expensive.

The investment in the above principles and practices is absolutely golden for maintaining cost-effective software development.

“Agile Principles, Patterns, and Practices in C#” is a fantastic book by Robert C. Martin. Also, check out his Bad Code presentation over at InfoQ. It definitely got me laughing.

Source: http://martinblore.wordpress.com/2010/06/02/the-path-to-tdd-and-software-excellence/

QA Tester vs. QA Engineer vs. QA Architect

QA Tester

  • Function – strong in test execution.
  • Writes and executes test cases – may not be coverage driven.
  • Determines quality. Good at answering – did you find any bugs?
  • Linear thinker; low capability for analysis and for re-using efforts/resources.
  • Requires a defined environment; typically weak at finding solutions in ambiguous or constrained environments.
  • Low process orientation.
  • May not be cost sensitive (time, effort, money, etc.).
  • Good for UI testing, regression testing and portability testing.
  • Typical involvement is in the later stages of the SDLC.

QA Engineer

  • Function – test planning, test design and execution.
  • Requirements-driven testing: prepares test plans, develops test cases and executes tests with a focus on coverage.
  • Engineers quality. Good at answering – what is the quality of the product?
  • Logical thinker; resolves issues using abstraction; capable of analysis, prediction and improvement.
  • Can reconcile conflicting constraints.
  • Process and metrics/measurement driven.
  • Cost sensitive.
  • Good for system/functionality testing, performance testing and automation, independent validations and domain testing; has a programming-skills orientation.
  • Best involved across the complete SDLC.

QA Architect

  • Function – defines the approach to testing entire systems.
  • Designs, plans, executes, monitors and improves the testing process for a testing engagement.
  • Provides answers to – what are the quality attributes and goals of the product? What should the test strategy, methodology and test tools be so as to ensure product quality?
  • Analytical and creative; systems thinking and quantitative/statistical thinking capabilities.
  • Able to define the environment.
  • Defines standards, guidelines, methodologies and metrics.
  • Cost sensitive.
  • Good for defining, planning and managing test engagements:
      • Ability to understand the goals of an organization and suggest a test architecture.
      • Ability to suggest alternative approaches and the benefits of each.
      • Ability to suggest improvements to process and technology areas for a test system.
      • Ability to define the framework for testing.
      • Ability to analyze risks and provide mitigation plans.
      • Ability to analyze test requirements and provide a solution in terms of test approach and design, suggested tools, etc.
      • Ability to design the entire test life cycle processes.
      • Capability to lead and coordinate a team of analysts for testing engagements.
      • Software development skills.
      • In touch with new methodologies and tools for software testing.
      • Ability to design, plan, execute and monitor a testing process.
  • Best involved across the complete SDLC.

Source: http://manishrathi.com/2009/09/11/qa-tester-vs-qa-engineer-vs-qa-architect/