Debugging Domain Specific Languages

One of the problems with debugging domain-specific languages (DSLs) is that a single DSL construct is often mapped to a multitude of aspects in the generated code or generic implementation. Thus, the classic concept of a breakpoint is difficult to interpret.

However, it seems to me that this could be overcome fairly easily. Simply allow users to set breakpoints targeted at specific aspects of the system. Once you think about this, it becomes clear that such breakpoints might even be useful in mainstream generic languages with non-trivial framework support, such as Java or C#. Imagine, for instance, being able to set breakpoints on a class declaration that trigger when an instance is constructed, serialized or deserialized. Now, I know you can do this by overriding the corresponding methods and setting a breakpoint there. Cumbersome, but workable. What, however, if the feature designer did not allow for overriding the aspect? Imagine such a class breakpoint triggering when an instance is retained in memory during garbage collection, and then some sort of inspection support that tells you what chain of references led to the retention!
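The workaround mentioned above can be sketched in Java. This is a minimal, hypothetical example (the class and field names are invented): the serialization hooks are overridden purely to give the debugger a concrete line that fires on construction, serialization and deserialization of instances.

```java
import java.io.IOException;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;

public class Address implements Serializable {
    private String street;

    public Address(String street) {
        this.street = street; // a breakpoint here fires on construction
    }

    public String getStreet() {
        return street;
    }

    // Overridden only to serve as a breakpoint anchor; delegates to the default.
    private void writeObject(ObjectOutputStream out) throws IOException {
        out.defaultWriteObject(); // a breakpoint here fires on serialization
    }

    private void readObject(ObjectInputStream in)
            throws IOException, ClassNotFoundException {
        in.defaultReadObject(); // a breakpoint here fires on deserialization
    }
}
```

Cumbersome indeed: three methods written solely so there is somewhere to click in the margin.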

Tool support for creating easily debuggable DSLs still seems a long way off. However, I think we could improve matters even today.

Don't be generic

Generic code is hard to debug. Users of your DSL will have to step through code using mostly abstract terms, covering all sorts of behaviour supported by the DSL, but not relevant in the particular case at hand. They will get stack traces full of abstract terms meaning nothing to them. And they will not be able to set straightforward breakpoints. Sophisticated users may be able to set breakpoints in the generic code with an appropriate trigger condition expression, but that will be the exception.

Contrast this with generated, specific code. This is straightforward to debug. It is, in essence, the same as debugging the code you would have written yourself. In addition, it may often be possible to spot the problem by simply looking at the generated code. If, then, the generated code contains hints about what elements in the DSL-based specification led to the inclusion or exclusion of parts of the generated code, the user even gets a clue as to what might be wrong in their DSL specification. Or else they may have found a bug in your generator. In any case, they do not feel helpless in the face of a problem. So:

Rule 1
Whenever you feel the urge to implement an aspect of your system using generic code, think twice. Be very sure that the advantages of the generic implementation (which, granted, can be big) more than outweigh the reduced debugging ability and confidence of your users.


Rule 2
When generating code, provide an option for including comments in the code that let users understand the decisions that led to the inclusion and exclusion of particular elements of the generated code. This will make it easier for them to find out what to change in their DSL-coded program to achieve a desired runtime effect.
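To illustrate rule 2, here is a hypothetical sketch of a generator that annotates each emitted block with the DSL decision that produced it. All the names (`CommentingGenerator`, `FieldSpec`, the `required` flag) are invented for this example; the point is only the decision comments woven into the output.

```java
import java.util.List;

public class CommentingGenerator {
    // A minimal stand-in for an element of the DSL-based specification.
    record FieldSpec(String name, boolean required) {}

    static String generate(String entity, List<FieldSpec> fields) {
        StringBuilder out = new StringBuilder();
        out.append("// Generated from entity '").append(entity).append("'\n");
        out.append("public class ").append(entity).append(" {\n");
        for (FieldSpec f : fields) {
            out.append("    // field '").append(f.name())
               .append("' declared in the model\n");
            out.append("    private String ").append(f.name()).append(";\n");
            if (f.required()) {
                // Decision comment: record *why* this element was included.
                out.append("    // validator included because '").append(f.name())
                   .append("' is marked required in the DSL\n");
                out.append("    void validate_").append(f.name())
                   .append("() { /* ... */ }\n");
            }
        }
        out.append("}\n");
        return out.toString();
    }
}
```

A user who wonders why a validator shows up (or fails to) can read the reason straight off the generated source, instead of reverse-engineering the generator.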


When generic, think about debugging

Language and system designers should, for every generic feature they add, think about how users are going to be able to debug the effects of the feature. Current SQL servers, for instance, partially address this problem by providing optimizer plan views. This is a great example. The key here is that, as with generated code, they make the result of the generic interpretation of the SQL query visible. Interestingly, they do it by using code generation internally. The SQL query is analysed and compiled into an execution plan (which is, essentially, what gets visualized). This plan is really nothing more than generated code for a specialized low-level data access language. So:

Rule 3
Try to implement a generic interpreter internally as a compiler to a lower-level language and an interpreter for that language. Make the visualization of the generated lower-level program accessible to your users during debugging. Design the lower-level language so it can be understood by your more sophisticated users.


In addition to the benefit for your users, it will actually often make your code cleaner and easier to understand, because analysis and execution are clearly separated. And, if you can reuse generated lower-level programs and/or trivially map your lower-level language to MSIL or Java byte code, you may even get better performance.
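Rule 3 can be sketched in miniature. The following hypothetical example (all names invented) takes a trivial high-level "query" (a numeric range), compiles it to a list of labelled low-level ops, and keeps analysis and execution cleanly separated. The `explain` output plays the role of the SQL server's plan view.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Predicate;

public class MiniPlanCompiler {
    // The "lower-level language": one op per step, each with a readable label.
    record Op(String label, Predicate<Integer> test) {}

    // Analysis phase: compile the high-level spec into low-level ops, once.
    static List<Op> compile(int min, int max) {
        List<Op> plan = new ArrayList<>();
        plan.add(new Op("FILTER x >= " + min, x -> x >= min));
        plan.add(new Op("FILTER x <= " + max, x -> x <= max));
        return plan;
    }

    // The plan view a user could inspect while debugging.
    static String explain(List<Op> plan) {
        StringBuilder sb = new StringBuilder();
        for (Op op : plan) sb.append(op.label()).append("\n");
        return sb.toString();
    }

    // Execution phase: interpret the low-level program, separate from analysis.
    static List<Integer> run(List<Op> plan, List<Integer> input) {
        List<Integer> out = new ArrayList<>();
        outer:
        for (int x : input) {
            for (Op op : plan) {
                if (!op.test().test(x)) continue outer;
            }
            out.add(x);
        }
        return out;
    }
}
```

Because the plan is plain data with human-readable labels, showing it to the user during debugging is almost free, and the interpreter stays a dumb loop.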

SQL servers still fail, however, to visualize the decision path that led to the final plan (see rule 2 above), and thus to make it accessible during performance tuning. This is one reason query tuning remains the province of scarce experts.

Simple tool improvements for generic code

A fairly typical case of generic code used to implement a DSL is where a model of business objects is mapped to a set of generated classes. These classes all inherit from a few framework-provided base classes that implement the bulk of the common functionality. The generated classes are thus very thin. They are the stubs providing symbolic integration between the DSL and the underlying implementation language, as well as being the place where you plug in your custom overrides of generic functionality. (This is, as I understand it, how Microsoft proposes one implement DSLs created with their new modelling tools in Visual Studio 2005.)

In this setup, a typical problem is that stack traces, for instance those obtained from logs of problems that occurred in deployed applications, are so generic as to be practically meaningless. Now, if only the stack-trace-producing code were changed so that, instead of getting
GenericObj.InternalDoSave
GenericObj.DoSave
GenericObj.Save

we would have
GenericObj[MyAddressObj].InternalDoSave
GenericObj[MyAddressObj].DoSave
GenericObj[MyAddressObj].Save

we would get some information that might actually help in reproducing the problem!
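One way a framework could support this, sketched under invented names: let the generic base class carry the name of the model type it stands for, and route every diagnostic through a helper that includes it. The `GenericObj` here is hypothetical, echoing the trace lines above.

```java
public class GenericObj {
    private final String modelType;

    public GenericObj(String modelType) {
        this.modelType = modelType;
    }

    // Every diagnostic goes through this, so the model type is always visible.
    protected String traceName(String method) {
        return "GenericObj[" + modelType + "]." + method;
    }

    public void save() {
        // A real framework would do the generic persistence work here;
        // this sketch only shows the improved failure message.
        throw new IllegalStateException(traceName("save") + ": no backing store");
    }
}
```

A failed `new GenericObj("MyAddressObj").save()` now reports `GenericObj[MyAddressObj].save` rather than a bare `GenericObj.save`, which is exactly the difference between the two trace listings above.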

The same applies for the display of reference values in the debugger in general. Here, at least, Visual Studio 2005 seems to offer a solution using the custom debugger visualizations one can provide.
