How do you test a complex system that doesn't have any UI without developers on the test team? In this post I'll discuss the architecture we developed, and in a follow-up post I'll dig into some of the specific challenges we faced and how we overcame them.

One Big Adapter

One of the projects I’ve been working on is more or less a giant adapter. We need to keep data synchronized between two systems with different data definitions, and sometimes completely different data for similar concepts. On one side of the adapter application is a proprietary system that is accessed through an API. On the other side is a DDS bus, which is a reliable message queueing system, almost like a distributed database.

Our goal is near 100% automated black box testing for this project. These tests also serve as the regression tests for each build. We don’t have any developers on the test team, so the testers also needed a way to define tests without writing code.

Test Harness

We decided on an architecture consisting of a REST service, which we call the test harness, to communicate with the two systems on either end of our adapter application. The test harness can receive data as JSON, and create the appropriate objects in the requested system. It can also read data from a system and return it as JSON formatted data.
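To make the idea concrete, here is a minimal sketch of the harness contract: JSON comes in, an object is created in the target system, and reads come back out as JSON. The class, method, and endpoint names are hypothetical stand-ins, not the real harness API, and a plain map stands in for the systems on either side of the adapter.

```java
import java.util.HashMap;
import java.util.Map;

public class HarnessSketch {
    // Stand-in for the proprietary system (or the DDS bus) behind the harness.
    private final Map<String, String> system = new HashMap<>();

    // e.g. POST /objects/{id} — create an object from the JSON payload.
    public String create(String id, String json) {
        system.put(id, json);
        return "{\"status\":\"created\",\"id\":\"" + id + "\"}";
    }

    // e.g. GET /objects/{id} — read the object back as JSON.
    public String read(String id) {
        return system.getOrDefault(id, "{\"error\":\"not found\"}");
    }

    public static void main(String[] args) {
        HarnessSketch h = new HarnessSketch();
        h.create("42", "{\"name\":\"pump-1\"}");
        System.out.println(h.read("42")); // prints {"name":"pump-1"}
    }
}
```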

The developers generally start development by creating the test harness communications on one side or the other. This allows them to generate the scenario in that system that the adapter application has to respond to. The developers can then use cURL to exercise the REST endpoint while developing the adapter code, and easily change the data to fit their needs.

Once the adapter application can send data to the other system, the test harness is updated to communicate with that system so that the developer can validate their adapter code. Again, this is generally done using cURL and manual verification of the data.

With both sides of the test harness in place, testers can now start defining their tests. This brings up the first challenge. The JSON can get extremely complex, and there is no good way to publish the data types for each of the fields. Furthermore, we don’t want the testers to have to develop any code, including scripts, to exercise the test harness.

Focus Pocus

Using reflection against our test harness code, we are able to export all of the REST endpoints. This gives us a list of actions that a tester can add to a test. These endpoints can receive or return a specific type, so we use reflection to extract all of the class information for those types as XSD. With these generated files, we created an application that reads in this data and presents it to a tester with a simple UI. Because the tool abstracts so much from the tester, almost like magic, we call this tool Focus Pocus.
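A rough sketch of the reflection step might look like the following. It assumes a hypothetical @Endpoint annotation for illustration; a real harness would more likely carry its REST framework's own annotations (for example JAX-RS @Path). The point is the same: walk the class, collect each endpoint's path plus its parameter and return types, which is the raw material for the action list and the XSD export.

```java
import java.lang.annotation.*;
import java.lang.reflect.Method;
import java.util.ArrayList;
import java.util.List;

public class EndpointExporter {
    // Hypothetical annotation marking a REST action on the harness.
    @Retention(RetentionPolicy.RUNTIME)
    @Target(ElementType.METHOD)
    public @interface Endpoint { String path(); }

    // Example harness class with two annotated actions.
    public static class Harness {
        @Endpoint(path = "/orders/create")
        public String createOrder(String json) { return json; }

        @Endpoint(path = "/orders/read")
        public String readOrder(String id) { return "{}"; }
    }

    // Collect "path : parameter type -> return type" for every endpoint.
    public static List<String> export(Class<?> cls) {
        List<String> actions = new ArrayList<>();
        for (Method m : cls.getDeclaredMethods()) {
            Endpoint e = m.getAnnotation(Endpoint.class);
            if (e != null) {
                actions.add(e.path() + " : "
                        + m.getParameterTypes()[0].getSimpleName()
                        + " -> " + m.getReturnType().getSimpleName());
            }
        }
        actions.sort(String::compareTo); // getDeclaredMethods order is unspecified
        return actions;
    }

    public static void main(String[] args) {
        export(Harness.class).forEach(System.out::println);
    }
}
```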

Testers define a test by dragging one or more actions from the list of endpoints onto the test. Each action can call the test harness with a single argument, or pass data into it, and receive data back.

A UI is generated for creating the data to pass into the test harness based on the XSD exported from the test harness code. This UI validates the data for intrinsic data types, and creates nested fields for complex data types. It also generates lists of valid values for enumerations. Some test harness endpoints work with interfaces and/or base classes, so the UI also handles defining which concrete class to send and dynamically updates based on this choice.
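The intrinsic-type validation can be sketched as a simple check of a tester-supplied value against the field's declared XSD type. The type names and rules below are illustrative, covering just two of the XSD built-in types:

```java
public class TypedFieldCheck {
    // Validate a raw string value against a declared XSD type.
    public static boolean isValid(String xsdType, String value) {
        switch (xsdType) {
            case "xs:int":
                try { Integer.parseInt(value); return true; }
                catch (NumberFormatException e) { return false; }
            case "xs:boolean":
                return value.equals("true") || value.equals("false");
            default:
                return true; // treat unknown types as free-form strings
        }
    }

    public static void main(String[] args) {
        System.out.println(isValid("xs:int", "17"));   // true
        System.out.println(isValid("xs:int", "17.5")); // false
    }
}
```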

The UI also provides a mechanism to validate that data returned from the test harness is correct. The tester selects a specific field from the return type and adds an operator (==, !=, >=, etc.) and an expected value. The UI also ensures that the expected value is valid for the field’s data type, and allows selecting a value from a dropdown for enumerations.
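The evaluation of such a validation rule might boil down to something like this sketch, where the operator set is illustrative and any comparable field type can be checked:

```java
public class FieldValidator {
    // Compare an actual field value against an expected value with an operator.
    public static <T extends Comparable<T>> boolean check(T actual, String op, T expected) {
        int c = actual.compareTo(expected);
        switch (op) {
            case "==": return c == 0;
            case "!=": return c != 0;
            case ">":  return c > 0;
            case ">=": return c >= 0;
            case "<":  return c < 0;
            case "<=": return c <= 0;
            default: throw new IllegalArgumentException("unknown operator: " + op);
        }
    }

    public static void main(String[] args) {
        System.out.println(check(42, ">=", 10));      // true
        System.out.println(check("ON", "==", "OFF")); // false
    }
}
```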

A Wrench in the Architecture

The architecture was meant to keep the test harness completely separate from the adapter application. Unfortunately, each system has its own independent ID for each object, and the only place where the mapping between the two systems exists is in the adapter application. The test harness is an independent set of JAR files, and is now loaded into the adapter application at startup, when present. This allows the test harness to read the mappings between the two systems from the IoC container. We don't have complete separation of test code from production code, but the only interaction between the two is through the IoC container.
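That single touch point can be sketched as follows: the adapter registers its ID-mapping component in the container at startup, and the harness, if it happens to be loaded, looks it up by name. The container here is a plain map and the bean name is hypothetical; a real IoC container would be doing this work.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Optional;

public class IocBridge {
    // Stand-in for the real IoC container's registry.
    static final Map<String, Object> container = new HashMap<>();

    // Production side: the adapter owns the mapping between the
    // proprietary system's IDs and the DDS bus IDs.
    public static class IdMapper {
        private final Map<String, String> proprietaryToDds = new HashMap<>();
        public void map(String proprietaryId, String ddsId) {
            proprietaryToDds.put(proprietaryId, ddsId);
        }
        public String ddsIdFor(String proprietaryId) {
            return proprietaryToDds.get(proprietaryId);
        }
    }

    // Test-harness side: resolve the mapper only if it was registered.
    public static Optional<IdMapper> lookupMapper() {
        return Optional.ofNullable((IdMapper) container.get("idMapper"));
    }

    public static void main(String[] args) {
        IdMapper mapper = new IdMapper();
        mapper.map("PROP-7", "DDS-99");
        container.put("idMapper", mapper);           // adapter startup
        lookupMapper().ifPresent(m ->                // harness, when present
                System.out.println(m.ddsIdFor("PROP-7"))); // prints DDS-99
    }
}
```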

Report Generation

Focus Pocus saves all of this information in a database, along with information about the test, such as tags to categorize them, the requirement(s) the test is for, and a description of the test. Each action can also have a textual description associated with it. This information allows us to generate our test documentation directly from the actual tests stored in the database.

There is a command-line component of Focus Pocus that reads the database and executes the tests. The results of each test and its actions are also stored in the database, and an HTML report is generated showing the number of tests that passed/failed, and the output of each validation.
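The pass/fail tally at the top of such a report could be produced along these lines, assuming each stored result is simply a test name plus a boolean (the real schema doubtless records more):

```java
import java.util.List;

public class ReportSummary {
    // One stored result: test name plus whether every validation passed.
    public record Result(String testName, boolean passed) {}

    // Produce the headline counts for the HTML report.
    public static String summarize(List<Result> results) {
        long passed = results.stream().filter(Result::passed).count();
        return passed + " passed, " + (results.size() - passed) + " failed";
    }

    public static void main(String[] args) {
        List<Result> run = List.of(
                new Result("sync-new-order", true),
                new Result("delete-propagates", false),
                new Result("enum-mapping", true));
        System.out.println(summarize(run)); // prints 2 passed, 1 failed
    }
}
```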



Daryl Reed