Writing Test Protocol content from scratch for a GxP System

Developing tests for a system from scratch is a daunting task, and the first question on many minds is simply "where do I even start?"

For software or equipment in a pharmaceutical or medical device setting, the question "is it validated and tested?" arises across multiple scenarios, and it often makes system owners very uncomfortable. Tests are the proof that a system does what it is intended to do. It is a regulatory fundamental that a system meet its requirements, and herein lie the reasons for discomfort: during system introduction, commercial realities mean the schedule can overpower the quality function, and this translates into significant gaps in the test evidence generated.

Test case and test script appearance

Before getting into content, it is vital to establish a uniform structure and appearance for test documents. A test case is named after an area of testing, such as "Backup and Restore" or "Equipment Manual Mode". The three essential elements of a test case are the Objective (a summary of what we want to do), the Procedure (how to execute the tests) and the Acceptance Criteria (what the tests are compared against, such as an approved design document). An example is shown below.

Backup and Restore Verification

Objective –

To confirm the machine's PLC and HMI can be backed up using the TIA Portal on server "APPC11".

Procedure –

1) For each row in the test script table, execute the test description, confirm that the expected result occurs, and record the actual result, ensuring it matches the expected result.

2) Record a PASS and enter Initials – Date as per section 3.

Acceptance criteria –

1) The backup method detailed in SOP-00235 "WFI distribution system Administration Procedure" section 6 has been verified.

2) The software for the system is as per detailed design DS-00147 "WFI distribution system Software Design Specification".

The test script itself is completed in tabular form, with the following headings most commonly used:

  • Step ID
  • Test Description
  • Expected Result
  • Actual Result
  • PASS - FAIL
  • Initials – Date
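
For illustration, a completed row of the Backup and Restore test script might read as follows (the step wording, result text and initials are hypothetical, not taken from an actual system):

  Step ID: 1
  Test Description: Launch TIA Portal on server "APPC11" and upload the PLC program as per SOP-00235 section 6.
  Expected Result: The upload completes without error and the project file is saved on "APPC11".
  Actual Result: The upload completed without error and the project file was saved on "APPC11".
  PASS - FAIL: PASS
  Initials – Date: AB – 01-Jan-2024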

  

Creating test content based on a design and requirements

The interaction between a set of requirements and a series of design documents is not the focus here, but it is necessary to summarise it before elaborating on test content examples. Requirements are the bottom-line needs: a breakdown of what, ultimately, a system should do.

 

 

A WFI (Water for Injection) distribution system, for example, may be required to provide purified water to a manufacturing vessel. One requirement may state how much of this purified water needs to be generated per hour, and another may state that a specific PLC and HMI should be used (i.e. a Siemens S7-1500 with WinCC - DeltaV as the user interface). The design documentation then typically begins with a functional specification that details, initially in concept, how everything will be met; the functional specification provides the summary for agreement. Once the authors of the requirements specification are satisfied that their vision of the system is captured in the functional specification, detailed design documents are generated to focus on key areas such as:

  • Network Architecture Diagram
  • Software Design Specification (i.e. User Interface and PLC program)
  • Hardware Design Specification
  • Report Design Specification
  • P&ID drawings
  • Electrical Drawings
  • IO List 
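
Each test case can then be traced back to the requirement and the design document it verifies. A traceability excerpt might look like the following (the requirement numbers are invented for illustration; DS-00147 is the design document cited earlier):

  URS-012 (purified water flow per hour)  →  DS-00147 Software Design Specification  →  OQ test case "Manual Mode"
  URS-015 (Siemens S7-1500 PLC and HMI)   →  Hardware Design Specification           →  IQ test case "Installed Hardware"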

For the person responsible for testing, with all this content provided in the form of design documentation, the best mindset to have when the system is physically in front of you is:

  • "Is the content of this documentation correct?"
  • "Is the system in front of me a true reflection of the approved design?"

The difference between commissioning, qualification and validation testing

As far as testing goes in a GxP setting, the terms for the various stages of testing can confuse many. It is best to understand that, when it comes to acquiring evidence, testing is broken down into official and unofficial. In practice, this means that the testing shown to external reviewers of a system or software need only be the "Executed Qualification Test Evidence".

Most strategies call for a Site Acceptance Test (SAT) to be completed before a system is released for official validation. A SAT is full of tests which, if execution goes well and is not riddled with clear errors, can be made to represent official testing. Official validation testing is broken down into three main categories:

  • Installation Qualification (IQ) – verifying the system is installed as per the approved design
  • Operational Qualification (OQ) – verifying the system operates as intended across its operating ranges
  • Performance Qualification (PQ) – verifying the system performs consistently under routine operating conditions

Examples of test content and wording

The test description is a series of actions that must be followed. Using a WFI distribution system as an example, the action of manually running a sanitisation of the WFI storage tank could be described as follows:
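
The excerpt below is illustrative only; the step wording, HMI navigation and set-point reference are assumptions rather than detail taken from an actual system.

  Step 1
  Test Description: On the HMI, navigate to the sanitisation screen and start a manual sanitisation of the storage tank.
  Expected Result: The HMI prompts for confirmation and, once accepted, the sanitisation sequence starts.

  Step 2
  Test Description: Observe the storage tank temperature indication during the sequence.
  Expected Result: The tank temperature reaches and holds the sanitisation set-point for the duration specified in the approved design.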

 
