No matter how much effort is put into testing, one can never be sure that the system under test will behave according to its requirements. Typically, once the system is deployed, little or no checking is performed to ensure that it has not deviated from the expected behaviour. If the system does deviate, the system administrator might not be able to detect the problem immediately. For example, an error of a few pence in financial transactions originating from a particular country might go unnoticed for a long while, until an audit discovers an accumulated discrepancy. The solution to this problem is to perform an ongoing audit of the system's behaviour, so that any deviation becomes instantly visible.
Many test engineers are familiar with the concept of an assertion: explicitly stating assumptions which should hold throughout a system's execution, and automatically raising an alarm if an assumption fails. Assertions do provide a means of ongoing auditing, but they are severely limited in expressive power. Using only basic assertions, even simple checks quickly clutter the code they are meant to verify.
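To illustrate the clutter problem, consider a minimal Java sketch (the `Account` class and its rules are hypothetical, invented for illustration): every business rule must be restated as an inline assertion at each point where it could be violated, interleaving checking logic with the logic being checked.

```java
// Hypothetical bank account used to illustrate inline assertions.
// Note: Java assertions are only evaluated when the JVM is run with -ea.
public class Account {
    private double balance;

    public Account(double openingBalance) {
        assert openingBalance >= 0 : "opening balance must be non-negative";
        this.balance = openingBalance;
    }

    public void withdraw(double amount) {
        // Pre-conditions restated here, and again in every similar method...
        assert amount > 0 : "withdrawal amount must be positive";
        assert amount <= balance : "cannot overdraw the account";
        balance -= amount;
        // ...plus the invariant re-checked after every state change.
        assert balance >= 0 : "invariant: balance never negative";
    }

    public double getBalance() {
        return balance;
    }
}
```

Even in this toy class, the assertions outnumber the lines of actual business logic; checks that span multiple method calls (e.g. "no more than three withdrawals per day") cannot be expressed at all without adding extra bookkeeping state to the class.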
This tutorial starts by exposing the challenges of expressing non-trivial checks using basic assertions. Subsequently, you will experiment with incrementally complex assertions and see how these can be elegantly expressed using aspect-oriented programming techniques. Towards the end of the tutorial, the use of aspect-oriented programming is superseded by a specialised monitoring tool which simplifies the task of specifying assertion logic even further. By the end of the tutorial you will have experimented with LarvaLight and will hopefully appreciate the benefits of using monitors to test a system at runtime.
The resources and instructions below should help you get started:
Ingredients: