Developing test scripts: explicit versus vague wording

Technical writing: scripted perfection vs unscripted "get it done quick"

Good quality test scripts require clear wording in the test actions and equally clear wording in the interpretation of results. For me, protocols are the main tool used in the defence of a system's design. For a high-end GxP system, the protocols should cover each relevant requirement in the URS that has been elaborated in detail through a design specification.

This topic is increasingly relevant because conversations around CSA (Computer Software Assurance) are having an impact on test development, and on how much effort a validation professional should put into getting the wording perfect versus simply "good enough".

As a recap for those unfamiliar with CSA: it allows a tester to create tests that are either very explicit (scripted: step-by-step instructions) or implicit (unscripted: a vague, summarised, short instruction). The choice is based on a scoring system of high, medium, or low GxP risk. Many CSV professionals have struggled with the concept of unscripted testing, and have struggled to find documented examples.

Unscripted does not mean undocumented; it means very simple wording such as "Test the system to ensure it has backup and restore functionality". This has drawn the ire of those who defend regulations with evidence, and joy from those under pressure to complete tasks on tight schedules.

The goal of this blog post, however, is not to debate CSA vs CSV. The goal is to create one example of test instructions that are explicit and one example that is implicit, and nothing more. I would then like to discuss the implications for the defence a backroom team can mount during an audit.

A test scenario

As a quick example of a test case, a DCS automation system called DeltaV will be used, where the network connection will be tested with a user account. DeltaV has the ability to communicate over OPC to enabled devices and software. It is an extremely important area to test, as it can form the backbone of physical IO updating user interfaces where alarming is also a factor. DV IO Watch is an application used to simulate IO, parameters, and software logic.

The goal here will be to verify that the DV IO Watch software can connect to the DeltaV system, with some attachments included as evidence.

Test case header

The Explicit scripted Test example -

Attachments of steps 1 to 5 (a screenshot captured as evidence for each scripted step)
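The attachments above stand in for the scripted steps themselves. As a rough illustration only, assuming hypothetical step wording (the actions and expected results below are my own invented placeholders, not the actual DeltaV or DV IO Watch interface text), an explicit scripted test can be thought of as a fixed sequence of action/expected-result pairs, each producing an evidence attachment:

```python
# A minimal sketch of an explicit, scripted test case: every step carries
# exact wording for the action, the expected result, and an evidence slot.
# All step wording here is hypothetical; it only illustrates the level of
# detail, not real DeltaV screens or menus.
from dataclasses import dataclass

@dataclass
class ScriptedStep:
    number: int
    action: str          # exact instruction the tester must follow
    expected: str        # exact acceptance wording, no interpretation needed
    evidence: str = ""   # filled in at execution, e.g. "Attachment of step 1"
    result: str = "NOT RUN"

steps = [
    ScriptedStep(1, "Log on to the DeltaV station with the test user account.",
                    "The desktop loads and the user name is displayed."),
    ScriptedStep(2, "Launch the DV IO Watch application.",
                    "The DV IO Watch main window opens with no error dialogs."),
    ScriptedStep(3, "Connect DV IO Watch to the DeltaV OPC server.",
                    "The connection status shows connected."),
    ScriptedStep(4, "Browse to a simulated IO parameter and display its value.",
                    "The parameter value updates on screen."),
    ScriptedStep(5, "Capture a screenshot of the connected session.",
                    "The screenshot shows the connected state and is attached."),
]

def execute(step: ScriptedStep, passed: bool) -> None:
    """Record the outcome and the evidence reference for one step."""
    step.result = "PASS" if passed else "FAIL"
    step.evidence = f"Attachment of step {step.number}"

for s in steps:
    execute(s, passed=True)  # in reality the tester judges each step

print(all(s.result == "PASS" for s in steps))
```

The point of the structure is that every step's pass criterion is written down before execution, so the evidence trail maps one-to-one onto the pre-approved wording.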

Explicit scripted test development timeline

Creating these explicit tests is certainly time consuming. The payoff is being able to recreate the test successfully if an auditor requests it: the knowledge is not lost, and the backroom team have clear evidence. At the pre-approval step, it is also extremely clear what actions will be taken, not simply a high-level objective.

To create this level of test content detail, a person must be in front of the system, where the side effect of getting to know the system at a low level is realised. The great thing to note is that this happens before execution and all of the potential failures that go with it.

The Implicit unscripted Test example -
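For contrast, here is a minimal sketch of how the same unscripted test might be recorded (the wording is mine, invented for illustration): a single short objective, with evidence references and free-form comments added only at execution time.

```python
# A minimal sketch of an implicit, unscripted test record: one short
# objective, with evidence and comments gathered as the tester goes.
# The wording is illustrative only, not a template.
unscripted_test = {
    "objective": "Verify DV IO Watch can connect to the DeltaV system.",
    "result": "NOT RUN",
    "evidence": [],    # attachments collected at execution time
    "comments": [],    # the clarity lives here, written after the fact
}

def record_evidence(test: dict, attachment: str, comment: str) -> None:
    """Attach a piece of evidence with the comment that explains it."""
    test["evidence"].append(attachment)
    test["comments"].append(comment)

record_evidence(unscripted_test,
                "screenshot-01.png",
                "DV IO Watch shows a connected state against the DeltaV OPC server.")
unscripted_test["result"] = "PASS"

print(unscripted_test["result"])
```

Note that nothing in the record says *how* the connection was made; that knowledge exists only in the tester's head and in whatever comments they chose to write.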

 

Implicit unscripted test development timeline

It's hard not to smirk at the test description in an implied test script when a direct comparison is made to an explicit one. The objective is simply stated, and evidence is simply collected at the time and attached.

Creating that test took all of 30 seconds and placed all the mystery directly into execution. "We will learn as we go" came to mind here.

Comparing the value of each approach: positives and negatives for clarity

First off, I like explicit testing because of how I have worked with difficult companies, and with those that question how well a system was evaluated before production release. Unfortunately, management aggression comes out when things go wrong, so it is vital to show proof that the scope of the requirements and the design was met. The scripted explicit test allows no room for misinterpretation, especially if it was pre-approved by committee.

Implied unscripted tests are conceptual only: pre-approval happens on an idea, and test ideas float, waiting for refinement that comes only after approval.

So where does the clarity come from with unscripted testing? It comes in the form of comments on the attachments and in any comments box in the protocol.

The human aspect of execution

When executing instructions, people differ in their interpretation. Debating the meaning of words and refining test logic is a common feature of protocol work. Discussion of the need to be exact versus the need to allow a small amount of interpretation is a sign of experience and maturity. Here is an example:

Action - Open the tank drain valve manually through the user interface
Result - Verify the valve opens on the system and that the user interface drain valve updates from red to green within 5 seconds.
My eyes are drawn to the 5 seconds straight away. Is the 5 really important? If the valve opens in 6 seconds, then it's a fail. How were the 5 seconds counted? From a calibrated time source? Being explicit can trip us up and create failures when there is no need. Hands-on execution experience is invaluable when debating wording.
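The timing concern can be made concrete. A minimal sketch, assuming the numbers are illustrative: instead of a bare "5 seconds", state the target with an explicit tolerance that reflects how the time is actually measured, so a 6-second observation against an uncalibrated stopwatch is not an automatic failure.

```python
# Sketch of a timed verification with an explicit tolerance. The target
# and tolerance values are illustrative only; a real protocol would take
# them from the design specification and measurement method.
def verify_transition(elapsed_s: float, target_s: float = 5.0,
                      tolerance_s: float = 2.0) -> str:
    """PASS if the observed transition time does not exceed target + tolerance."""
    return "PASS" if elapsed_s <= target_s + tolerance_s else "FAIL"

print(verify_transition(4.8))  # well inside the window
print(verify_transition(6.0))  # passes with the stated tolerance
print(verify_transition(9.5))  # clearly outside, a genuine failure
```

Writing the tolerance into the result wording keeps the test explicit while removing the trap of a trivial, unrepeatable failure.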
 

Protocols are a means of legal defence

Regulatory requirements are a serious matter, and proof of performance needs to be clear on serious criteria. The explicit test script can be re-executed at a future date; a clear breakdown was provided, and we can see an exact reference to the design in the test case objective. Unscripted implied tests can have evidence too, but we are often relying on notes in post-approval comments. If the QA resource is not vigilant in their review, then the problems are clear.

Not being able to explain ourselves is not an option for a backroom team. Clarity provides the ability to defend our system in targeted questioning.

Conclusion

I have developed many test scripts for both software and equipment over the years. Explicit wording is a basic need when a requirement is a clear GxP item. The wording, though, needs to aid the execution and not let trivial errors pop up due to basic grammar or unnecessary verifications. The implied test wording mindset needs to be evaluated depending on the situation. It's an easy thing to write "PASS", especially when you are under pressure to meet a deadline.

 

For Queries - aphillips@appliedprojectsengineering.com
