
Zyntax Quality Solution


Zyntax supports its customers throughout the testing lifecycle, from initial advice through to analysis and process improvement. These supporting activities are shown in the lifecycle diagram below and explained in the sections that follow.


Testing Life Cycle

[Testing lifecycle diagram]

Advice

Before testing can begin, or even be planned in detail, we start with a discussion of the business situation, the application's architecture and current state, and the business requirements for the application.
From this we can advise on the areas of greatest risk (and therefore test focus), the testability of the application from a technical perspective, the suitability of the requirements and any gaps in them, and the resources needed to complete the testing successfully.


Test Planning

Test planning develops the initial schedule and priorities: what will be tested, with which priority, by whom, and how it will be achieved technically. At this stage we detail the expected test scripts, test data, scenarios, expected results, and acceptance criteria.
The key risks and requirements are also identified at this stage, such as appropriate access to applications, test database(s), obtaining relevant input data, availability of test environments, and obtaining releases of the application itself.



Test Development

During this stage we develop a test framework and test scripts based on that framework. Developing a test framework requires an initial investment of time, but it provides the basis for consistent test scripts, logs, and reporting throughout the rest of the test process.
In practice, the test framework and test scripts are developed concurrently, retaining the benefits of a framework while still producing results as early as possible.
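As an illustration, a minimal framework of this kind might look like the following sketch. Python is assumed here, and the test case, names, and checked behaviour are all hypothetical; the point is that every script inherits consistent timing, logging, and result capture.

```python
import time

class TestCase:
    """Minimal framework base: each script gets consistent timing and result capture."""
    name = "unnamed"

    def run(self):
        raise NotImplementedError

    def execute(self):
        start = time.perf_counter()
        try:
            self.run()
            status = "PASS"
        except AssertionError as exc:
            status = f"FAIL: {exc}"
        elapsed = time.perf_counter() - start
        record = {"name": self.name, "status": status, "seconds": round(elapsed, 3)}
        print(record)  # a real framework would also append to a shared test log
        return record

# A test script built on the framework; the checked behaviour is a stand-in.
class LoginResponds(TestCase):
    name = "login responds"
    def run(self):
        assert 1 + 1 == 2  # stand-in for a real application check

result = LoginResponds().execute()
```

Because every script produces the same record shape, logs from different test runs can be compared directly, which is the consistency benefit the framework investment buys.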


Test Execution

Test scripts are combined to form realistic scenarios, each of which can be used to emulate normal, peak, or stress workloads.
The most valuable measurements usually come from trends: executing multiple tests in which a single factor is varied, to identify how the application's behavior changes with that factor. Typical variations are the number of emulated users, the frequency of transactions, adjustments to the software configuration, or changes to the hardware or network infrastructure.
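A single-factor trend test of this kind could be sketched as follows. Python is assumed, and the "transaction" is a stand-in that merely sleeps rather than calling a real application; all names are illustrative.

```python
import time
from concurrent.futures import ThreadPoolExecutor

def transaction():
    """Stand-in for one emulated user transaction; a real test would call the application."""
    time.sleep(0.001)
    return 1

def run_scenario(users, transactions_per_user=10):
    """Run one test with a fixed number of emulated users; return throughput (tx/s)."""
    start = time.perf_counter()
    with ThreadPoolExecutor(max_workers=users) as pool:
        results = list(pool.map(lambda _: transaction(),
                                range(users * transactions_per_user)))
    elapsed = time.perf_counter() - start
    return len(results) / elapsed

# Trend test: vary a single factor (the number of emulated users) between runs,
# keeping everything else constant, so the runs are directly comparable.
trend = {users: run_scenario(users) for users in (1, 5, 10)}
for users, throughput in trend.items():
    print(f"{users:3d} users -> {throughput:8.1f} tx/s")
```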
Monitoring runs concurrently with test execution, so it should be set up before testing begins, or as soon as possible after it starts.


Monitoring of applications, systems and networks

Typical measurements are response time and throughput, which show how the responsiveness of an application changes with increased load; these results are then compared against requirements or acceptance criteria.
When response times exceed the acceptance level, or throughput falls below its requirement, additional data is needed to help identify the root cause(s) of the issues.
Data is typically collected during each test but is tailored to each environment based on application architecture, available capabilities, and permissions. In addition to resource data from web, application, interface, and database servers, application and system logs might be collected as well, allowing one to see which entries were made during the course of a specific test.
As a general rule, we automate the monitoring as much as possible. The automation can take an extra investment to set up, but provides a consistent set of data which can be compared from one test to another.
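Automated monitoring alongside a test run might be sketched like this. Python is assumed, and the resource sampler is a stand-in; a real monitor would read OS counters or query web, application, and database servers.

```python
import threading
import time

def sample_resources():
    """Stand-in for reading CPU/memory/network counters from a monitored host."""
    return {"timestamp": time.time(), "cpu_pct": 0.0}

class Monitor:
    """Collect resource samples at a fixed interval for the duration of a test."""
    def __init__(self, interval=0.01):
        self.interval = interval
        self.samples = []
        self._stop = threading.Event()
        self._thread = threading.Thread(target=self._run, daemon=True)

    def _run(self):
        while not self._stop.is_set():
            self.samples.append(sample_resources())
            self._stop.wait(self.interval)

    def __enter__(self):
        self._thread.start()
        return self

    def __exit__(self, *exc):
        self._stop.set()
        self._thread.join()

# Start monitoring before the test begins, so the whole run is covered.
with Monitor() as mon:
    time.sleep(0.05)  # the test itself would execute here
print(f"collected {len(mon.samples)} samples")
```

Because the same sampler runs on the same schedule for every test, the collected data sets are directly comparable from one test to another.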



Reporting

For performance testing, response times, throughput, and resource usage are reported. Response times are measured and reported individually for the key user activities, providing the minimum, maximum, average, and percentiles per activity. Depending on the circumstances, additional resource usage (application, database, system, and network) is measured and reported alongside the end-user response times.
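The per-activity response-time summary described above might be computed along these lines. This is a Python sketch with hypothetical sample data; the nearest-rank percentile method used here is one of several common definitions.

```python
import math
import statistics

# Hypothetical measured response times (seconds) per key user activity.
samples = {
    "login":  [0.42, 0.38, 0.51, 0.47, 0.95, 0.40],
    "search": [1.10, 1.35, 0.98, 2.40, 1.22, 1.05],
}

def summarize(times, percentile=90):
    """Return min, max, average, and a nearest-rank percentile for one activity."""
    ordered = sorted(times)
    rank = math.ceil(percentile / 100 * len(ordered)) - 1  # nearest-rank method
    return {
        "min": ordered[0],
        "max": ordered[-1],
        "avg": round(statistics.mean(ordered), 3),
        f"p{percentile}": ordered[rank],
    }

report = {activity: summarize(times) for activity, times in samples.items()}
for activity, stats in report.items():
    print(activity, stats)
```

Percentiles matter because an average can look acceptable while a slow tail of requests still violates the acceptance criteria.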
Where the test implementation and execution allow, subsequent tests are executed so that each differs from a previous test by a single change, allowing useful comparisons. These comparisons may also be part of the reporting provided.
For functional testing, the agreed quality measures are provided. Typical measures are test case pass/fail results, test case coverage, and progress versus planning.
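These functional measures might be derived as in the following sketch. Python is assumed, and the test-case identifiers, results, and plan size are all hypothetical.

```python
# Hypothetical test-case results for one reporting cycle.
results = {"TC-001": "pass", "TC-002": "fail", "TC-003": "pass", "TC-004": "not run"}
planned = 4  # test cases planned for this cycle

# A case counts as executed once it has a definitive pass or fail outcome.
executed = [r for r in results.values() if r in ("pass", "fail")]
passed = sum(1 for r in executed if r == "pass")

pass_rate = passed / len(executed)   # quality of what has been tested so far
progress = len(executed) / planned   # progress versus planning

print(f"pass rate: {pass_rate:.0%}, progress vs. plan: {progress:.0%}")
```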
Some significant types of tests border on both performance and functional testing, such as testing whether the application can handle two users simultaneously performing the same activity. For this reason we report based on the types of tests actually agreed and performed, not strictly in the categories of performance and functional.


Analysis and Process Improvement

Zyntax can help you analyze your application to identify performance problems and scalability issues. This in turn leads to recommendations for design and coding improvements aimed at better application performance.
We pride ourselves on our flexible scheduling and professional approach to performance test problems. With years of experience and a solid team backing each consultant, we can help solve your short- or long-term performance challenges.


Mentoring, Training, Coaching

Training content ranges from standard courses for IBM Rational functional or performance testing and for the Zyntax test adaptors, to courses tailored to specific testing needs; note that tailoring a course usually requires one or two days to plan and implement.
Training is most effective after a test framework has been developed and is working smoothly on-site; it then prepares individuals or teams to carry on the testing process themselves, possibly with follow-on coaching on an as-needed basis. A regular review process is recommended.



The Software Quality solution offers our customers help across the complete testing lifecycle, from initial advice through to making them self-sufficient in their own testing efforts.

The strength of Zyntax is its adaptation to customer needs, development of practical solutions, and flexibility in resourcing.

Copyright 2010 Zyntax Consulting BV, All Rights Reserved