
Notable Design Patterns for Domain-Specific Languages

Diomidis Spinellis

Abstract

The realisation of domain-specific languages (DSLs) differs in fundamental ways from that of traditional programming languages. We describe eight recurring patterns that we have identified as being used for DSL design and implementation. Existing languages can be extended, restricted, partially used, or become hosts for DSLs. Simple DSLs can be implemented by lexical processing. In addition, DSLs can be used to create front-ends to existing systems or to express complicated data structures. Finally, DSLs can be combined using process pipelines. The patterns described form a pattern language that can be used as a building block for a systematic view of the software development process involving DSLs.

Keywords: design patterns; domain-specific languages.

1  Introduction

The realisation of domain-specific languages (DSLs) differs in fundamental ways from that of traditional programming languages. Although the idea of domain-specific languages is more than mature [Lan66], their role in the architecture, design, and implementation of software systems has only recently been acknowledged [Ram97]. Some DSLs are being designed as full-flavoured programming languages [Wir74] and implemented as interpreters or compilers using traditional programming language implementation techniques and tools [ASU85]. However, the software process and economics behind the realisation of a DSL are, more often than not, entirely different from those that drive the implementation of a traditional programming language. Specifically, DSLs are by definition part of a larger system and often implemented for a narrow usage domain. The resources available for designing and implementing them are therefore constrained to a small percentage of those available for the system they belong to, and difficult to amortise over a large user base. The constraints on the design and implementation effort and talent that can be devoted to the realisation of a DSL have brought forward a number of distinct and reusable strategies. These DSL realisation strategies solve specific problems of design and can be applied to many similar problems. The description of such reusable designs, often referred to as patterns [AIS+77,CS95,GHJV95], allows their dissemination and conscious reuse by DSL designers and software practitioners.

The remainder of this paper is structured as follows: in section 2 we introduce DSLs and outline their differences from executable specification and general purpose languages, while in section 3 we present the formalism of design patterns that we use for describing the DSL realisation strategies in section 4. Finally, section 5 concludes this paper with a discussion of the relationships between the outlined design patterns and directions of future research.

2  Domain-Specific Languages

A domain-specific language is a programming language tailored specifically to an application domain: rather than being general purpose it captures precisely the domain's semantics. A DSL-based development methodology addresses the need for increasing domain specialisation in the software engineering field [Jac99]. Examples of DSLs include lex and yacc [JL87] used for program lexical analysis and parsing, HTML [BLC95] used for document mark-up, and VHDL used for electronic hardware descriptions. Domain-specific languages allow the concise description of an application's logic, reducing the semantic distance between the problem and the program [BBH+94,SG97a].

Figure 1: UML diagram of a DSL-based system architecture.

DSLs are, by definition, special purpose languages. Any system architecture encompassing one or more DSLs is typically structured as a confederation of modules; some implemented in one of the DSLs and the rest implemented using a general purpose programming language (Figure 1). As a design choice for implementing software systems, DSLs present a number of distinct advantages over ``hard-coded'' program logic:

Concrete Expression of Domain Knowledge
Domain-specific functionality is not coded into the system or stored in an arcane file format; it is captured in a concrete human-readable form. Programs expressed in the DSL can be scrutinised, split, combined, shared, published, put under release control, printed, commented, and even be automatically generated by other applications.

Direct Involvement of the Domain Expert
The DSL expression style can often be designed so as to match the format typically used by the domain expert. This results in keeping the experts in a very tight software lifecycle loop where they can directly specify, implement, verify, and validate, without the need for coding intermediaries. Even if the DSL is not high-level enough to be used as a specification language by the domain expert, it may still be possible to involve the expert in code walkthroughs far more productive than those over code expressed in a general purpose language.

Although the DSL concept bears similarity to executable specification languages [Som89,TM87] such as OOSPEC [PH95], the DSL approach exhibits some important advantages:

Expressiveness
Executable specification languages, taking a Swiss army knife approach towards the problem of specification, offer facilities for specifying all types of systems, but often at a cost in clarity of expression. As an example, OBSERV [TY92] provides a multiparadigm environment allowing system specification using object-oriented constructs, finite state machines, and logic programming. In contrast, DSLs, being tailored towards a narrow, specific domain, can be designed to provide the exact formalisms suitable for that domain.

Runtime Efficiency
The possible interactions between different elements of a general purpose specification language, such as its type system and its support for concurrency, result in runtime inefficiencies. A narrowly focused DSL can employ the most efficient implementation strategy and specialised optimisations for satisfying the expressed specification.

Modest Implementation Cost
DSLs are typically implemented by a translator that transforms the DSL source code into source or intermediate code compatible with the rest of the system. Such an approach can often be implemented using string processing languages such as awk [AKW79] and Perl, language development tools such as lex and yacc, specialised systems such as TXL [CHHP91] and KHEPERA [FNP97], or declarative languages such as Prolog and ML. The DSL implementation cost is - and should always be - modest.

Reliability
As described in the previous paragraph, the limited scope of a DSL often allows a source-to-source transformation type of implementation. The small scale of the required implementation effort often results in a translator whose correctness can be trivially verified. The size of typical executable specification languages means that the implementor must often take the correctness of the language's implementation on trust.

On the other hand, the system architect contemplating the use of a DSL architecture should also have in mind the following potential shortcomings of this approach:

Tool Support Limitations
CASE and integrated software development tools offer only limited support for integrating DSLs into the development process. Ad hoc solutions are often required to smoothly integrate DSL code with existing revision control systems, compilers, editors, source browsers, and debuggers.

Training Costs
In contrast to established specification languages such as Z [PST91], system implementers and maintainers will by definition have no prior exposure to the DSL being used. This problem is somewhat mitigated by the fact that an appropriately chosen DSL will be familiar to other participants of the implementation effort such as those involved in the specification, beta testing, and final use. These participants will be able to perform DSL code walkthroughs - a task normally reserved for experienced software engineers.

Design Experience
DSL-based system architectures are not widely adopted within the software industry. As a result, there is an evident lack of design experience, prescriptive guidelines, mentors, design patterns, and supporting scientific literature. Early adopters will need to rely more on their own judgement as they adopt the approach in a stepwise fashion.

Software Process Integration
The use of DSLs is not yet an integral part of established software processes. Therefore, the software process being used has to be modified in order to take into account the design, implementation, integration, debugging, and maintenance of the adopted DSLs.

The implementation of a DSL differs from the implementation of a general purpose language. Compilers for general purpose languages are typically structured as a lexical analyser, a parser, a semantic analyser, an optimiser, and a target code generator. In contrast, the limited scope of a DSL allows and requires different implementation strategies. The lexical, syntactic, and semantic simplicity of DSLs often obviates the need for some elements that would be required by a general purpose language compiler; for example, instead of using a parser front-end, DSL implementations often process the source language using regular expressions. In addition, the often limited user population of a DSL does not justify a large implementation effort, forcing DSL implementers to choose the most economical realisation strategies; as an example, compilation into assembly code of the target machine is rarely a practical proposition. Finally, as DSLs are often part of the development process of a larger system, schedule pressures drive DSL builders towards implementation methods that can rapidly deliver results. The aim of this paper is to provide, in the form of a pattern language, a repertoire of methods often used in the implementation of a DSL.

3  Design Patterns

The notion of design patterns has its origins in the seminal work of the architect Christopher Alexander. Alexander outlines how the relationship between recurring problems and their respective solutions establishes patterns as follows:

``Each pattern describes a problem which occurs over and over again in our environment, and then describes the core of the solution to that problem, in such a way that you can use this solution a million times over, without ever doing it the same way twice.'' - [AIS+77]

Twenty years later Gamma et al. [GHJV95] cross-pollinated these ideas into the field of reusable object-oriented software design. Design patterns offer a convenient way to capture, document, organise, and disseminate existing knowledge from a given area in a consistent and accessible format. Patterns differ from algorithms and data structures in that the concepts they describe can not be coded and used as a subroutine or an object class. Patterns also differ from frameworks as they do not describe the structure of a complete system: interrelated patterns are typically used together to solve a general design problem in a given context.

In this paper we describe eight recurring patterns that we have identified as being used for DSL design and implementation. The description of these patterns provides the DSL designers with a clear view of the available DSL realisation strategies, the forces that will guide them towards the selection of a specific pattern, the consequences of that decision, examples of similar uses, and the available implementation alternatives. In our description of the patterns we followed - in free text form - the format and classification used by Gamma et al. [GHJV95]. We classify each pattern as creational if it involves the creation of a DSL, structural if it describes the structure of a system involving a DSL, and behavioural if it describes DSL interactions.

4  DSL Design Patterns

In the following sections we describe, for every pattern, the problem it addresses, the forces that guide its selection, the consequences of its adoption, examples of similar uses, and the available implementation alternatives.

4.1  Piggyback

Figure 2: The piggyback pattern.

The piggyback structural pattern (Figure 2) uses the capabilities of an existing language as a hosting base for a new DSL. Often a DSL needs standardised support for common linguistic elements such as expression handling, variables, subroutines, or compilation. By designing the DSL on top of an existing language the needed linguistic support is provided ``for free''. The piggyback pattern can be used whenever the DSL shares common elements with an existing language. Typically, the DSL language processor passes the linguistic elements that are expressed in the existing language to the language processor of the existing language. Where the DSL is implemented as a compiled language, a typical implementation compiles the DSL code into the base language: DSL code is compiled as needed, while embedded base-language elements are emitted unmodified. Consequently, the resulting output of the compilation consists entirely of the base language. If the DSL is implemented as an interpreter, a similar strategy can be applied if the base language provides a facility for calling its interpreter with suitable arguments from within the DSL interpreter.

Typical examples of this approach are the yacc [Joh75] and lex [Les75] processors. While the specifications of the input grammar (in the case of yacc) and the input strings (in the case of lex) are expressed in a DSL, the resulting actions for recognised grammar rules and tokens are specified in C, which is also the processors' output language. Yacc uses the piggyback approach more aggressively as it introduces special variables (denoted by the $ sign) to the C constructs used for specifying the actions.
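As an illustrative sketch of the pattern (the `when' rule notation, the generated handle function, and the event names are hypothetical, and Python serves as both the implementation and the base language), the following translator recognises its own DSL construct and emits the embedded base-language actions verbatim, in the spirit of the yacc and lex action blocks described above, so that its output consists entirely of the base language:

    import re

    RULE = re.compile(r"^when\s+(\w+)\s*:\s*(.*)$")   # the DSL's own construct

    def translate(dsl_lines):
        """Compile the rule DSL into base-language (Python) source."""
        out = ["def handle(event):"]
        for line in dsl_lines:
            m = RULE.match(line.strip())
            if not m:
                continue
            event, action = m.groups()
            out.append(f"    if event == {event!r}:")   # generated from the DSL part
            out.append(f"        {action}")             # base-language code, emitted unmodified
        out.append("    return None")
        return "\n".join(out)

    program = ["when start: print('starting up')",
               "when stop: print('shutting down')"]
    print(translate(program))        # the output is pure base-language code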

In structure, the piggyback approach resembles compiler front-ends that generate an intermediate language. However, the structure we propose uses an existing, human-readable, and typically general-purpose language as the compilation target, rather than a specialised, machine-readable intermediate language. The effort of translating the DSL into an existing human-readable language instead of implementing an alternative compiler front-end is substantially lower. In addition, the process of this translation is relatively straightforward and can be implemented as a simple source-to-source transformation - often merely using lexical processing constructs as described in section 4.3. In contrast, the implementation of a compiler front-end requires detailed knowledge of the intermediate language, and often intimate knowledge of a specific compiler implementation.

4.2  Pipeline

Figure 3: The pipeline pattern.

The pipeline behavioural pattern (Figure 3) solves a problem of DSL composition. Often a system can best be described using a family of DSLs. The prototypical example for such an application is the composition of diverse mark-up languages in text processing systems. Different such languages can be used to specify tables [Les79b], mathematical equations [KC74], chemical formulas [BJK87], pictures [Ker82], graphs [BK86], and organic element chemical structures [BJK87]. In cases where a number of DSLs are needed to express the intended operations, their composition can be designed and implemented using a pipeline. Typically, all DSLs are organised as a series of communicating elements. Each DSL handles its own language elements and passes the rest down to the others. Sometimes, the output of one DSL can be expressed in terms of the input expected by another DSL further down the pipeline chain [Ben86]. The use of the pipeline pattern encourages the division of responsibility among small specialised DSLs and discourages bloated feature-rich language designs. The DSL-based system can be built in a stepwise fashion, adding components as needed with new components utilising existing ones.

As suggested by its name, the pattern can often be implemented using a pipeline of independent communicating system processes. Many modern operating systems provide facilities for setting up such a pipeline, while the Unix shells also provide a supporting built-in notation. The pipeline approach has been used by the troff [Oss79] family of text processing tools. Elements of a troff-based text processing pipeline can include eqn [KC74] for processing equations, tbl [Les79b] for processing tables, pic [Ker82] for processing pictures, grap [BK86] for drawing statistical displays, dag [GNV88] for typesetting directed graphs, chem [BJK87] for typesetting chemical structures, and refer [Les79a] for processing references. A similar structure has also been used to produce algorithm animations [BK91]. In addition, if one considers the command line arguments passed to typical Unix commands as a mini-DSL, then typical pipelines of Unix tool invocations can also be considered as an application of this pattern. This mode of use allows the implementation of sophisticated applications such as spell checkers or complicated operations on images and sound using families of tools such as the system's text processing tools, the pbm [P+93] portable bitmap collection, and the sox sound tools.
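A minimal sketch of the pattern follows, assuming two hypothetical mark-up mini-languages (an EQ directive for equations and a TBL directive for tables) realised as Python generator stages rather than operating system processes; each stage expands only the directives it understands and passes every other line down the pipeline:

    def equations(lines):                     # first DSL: equation mark-up
        for line in lines:
            if line.startswith("EQ "):
                yield "<math>" + line[3:] + "</math>"
            else:
                yield line                    # not ours: pass it down the pipeline

    def tables(lines):                        # second DSL: table mark-up
        for line in lines:
            if line.startswith("TBL "):
                cells = [c.strip() for c in line[4:].split(",")]
                yield "<table>" + "".join("<td>%s</td>" % c for c in cells) + "</table>"
            else:
                yield line

    document = ["Some text.", "EQ x = y + 1", "TBL a, b, c", "More text."]
    for line in tables(equations(document)):  # stages composed left to right, as in a shell pipeline
        print(line)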

4.3  Lexical Processing

Figure 4: The lexical processing pattern.

The lexical processing creational pattern (Figure 4) offers an efficient way to design and implement DSLs. Due to their - by definition - limited field of applicability, DSLs impose severe restrictions on the effort that can be devoted to their design and implementation. Many DSLs can be designed in a form suitable for processing by simple lexical substitution techniques, without tree-based syntax analysis. The design of the DSL is geared towards lexical translation by utilising a notation based on lexical hints such as the specification of language elements (e.g. variables) using special prefix or suffix characters. The form of input for this family of DSLs is often line-oriented, rather than free form and delimited by character tokens. This design pattern can be used together with the piggyback pattern in cases where, after some lexical processing, the output of the DSL processor can be passed to the processor of the base language.

The utilisation of this pattern lowers the implementation cost for DSLs, making them a practical proposition for applications where the cost of a full parser-based translation would not be justified. As translators based on lexical structure are often implemented using interpreted or rapid prototyping languages, the DSL design and implementation can gracefully evolve together in a combined iterative process. Examples of this application include numerous DSLs implemented using tools such as sed [McM79], awk [AKW88], Perl [WS90], Python [Lut96], m4 [KR79], and the C pre-processor. Most of these tools offer a rich set of lexical processing and substitution facilities - often expressed in terms of extended regular expressions - that can be used to implement a complete DSL in tens of lines of code.
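A minimal sketch, assuming a hypothetical line-oriented configuration DSL whose variables carry a $ prefix as the lexical hint: the translator works purely by regular-expression substitution and, in the spirit of the piggyback pattern, hands its output to the base-language (Python) interpreter:

    import re

    def translate(line):
        # $name         ->  env["name"]      (pure textual substitution, no parse tree)
        line = re.sub(r"\$(\w+)", r'env["\1"]', line)
        # leading 'set' ->  removed, leaving an ordinary base-language assignment
        return re.sub(r"^set\s+", "", line)

    source = ["set $timeout = 2 * $retries",
              "set $banner = 'ready'"]
    env = {"retries": 5}
    for line in source:
        exec(translate(line))                 # pass the result to the base-language processor
    print(env)                                # {'retries': 5, 'timeout': 10, 'banner': 'ready'}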

4.4  Language Extension

Figure 5: The language extension pattern.

The language extension creational pattern (Figure 5) is used to add new features to an existing language. Often an existing language can effectively serve a new need with the addition of a few new features to its core functionality. In this case, a DSL can be designed and implemented as an extension of the base language. The language extension pattern differs from the piggyback pattern in the roles played by the two languages: the piggyback pattern uses an existing language as an implementation vehicle for a DSL, whereas the extension pattern is used when an existing language is extended within its syntactic and semantic framework to form a DSL. The design of a DSL using this pattern involves the addition of new language elements to an existing base language. These elements can include new data types, language block interaction mechanisms, semantic elements, or syntactic sugar. Typically, the DSL inherits all the syntax and semantics of the base language, while adding its own extensions. An object-oriented class hierarchy can thus be formed with a number of DSLs being derived from base languages and forming base languages for other DSLs.

The use of the language extension pattern frees the DSL designer from the burden of designing a full featured language. In addition, where the pattern is used to design a non-trivial DSL hierarchy, the pattern offers a clear way of organising the language relationships and interactions [SDE95]. Compiled-language implementations of the extension pattern are often structured in the form of a pre-processor which transforms the DSL into the base language. Alternatively, source-to-source transformations [CHHP91], code composition [SG97b], or intentional programming [Sim95] techniques can be used to augment the language using high level operators. One of the earliest examples of this pattern is the ``rational FORTRAN'' (Ratfor) pre-processor [Ker75] which provided a structured version of FORTRAN. The implementation of the original C++ compiler (cfront) also used this technique [Str84]. A current effort using the extension pattern involves the addition of generic types to the Java programming language [BOSW98]. Extensions of interpreted languages can also benefit from this design pattern by implementing the language extension using a meta-interpreter [SS86] or a metacircular evaluator [ASS90]. Examples include the examination of abstract syntax trees using an interpreter of Prolog extended with an ambient current object [Cre97] and the extension of ML for graph drawing [KH97].
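As a minimal sketch of a pre-processor style implementation (the `repeat ... times' construct and the example program are hypothetical), the base language below is Python and the extension adds a single new statement that is rewritten into ordinary base-language code, while every other line is passed through unchanged:

    import re

    REPEAT = re.compile(r"^(\s*)repeat\s+(.+?)\s+times:\s*$")   # the added construct

    def preprocess(lines):
        """Rewrite the extension into the base language; pass everything else through."""
        out = []
        for line in lines:
            m = REPEAT.match(line)
            if m:
                indent, count = m.groups()
                out.append("%sfor _ in range(%s):" % (indent, count))
            else:
                out.append(line)
        return "\n".join(out)

    extended = ["total = 0",
                "repeat 3 times:",
                "    total += 2",
                "print(total)"]
    exec(preprocess(extended))        # prints 6: the output is plain base-language code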

4.5  Language Specialisation

Figure 6: The language specialisation pattern.

Language specialisation (Figure 6) is a creational pattern that removes features of a base language to form a DSL. In some cases the full power of an existing language may prevent its adoption for a specialised purpose. A representative case arises when requirements related to the safety or security aspects of a system can be satisfied only by removing some ``unsafe'' aspects (such as dynamic memory allocation, unbounded pointers, or threads) from a language [Mot94]. In such cases a DSL may be designed and implemented as a subset of an existing language. Whenever some specific features of an existing language render it unsuitable for a given application, the design of a DSL following the specialisation pattern can result in a mature language that satisfies the given requirements. The design of the DSL involves the removal from the base language of the unwanted syntactic or semantic features. Since the DSL is effectively a subset of the base language, the removal can be guaranteed by a language processor that checks conformance to the DSL. In a limited number of cases additional run-time checks may be required. Examples of DSLs designed following the specialisation pattern are Javalight [NvO98] which is a type-safe subset of Java, the educational subsets of Pascal used for a stepwise introduction to the language [Sav95], the HTML [BLC95] application of SGML [ISO86], and the automotive `safer-subset' of C [ER97].
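A minimal sketch of such a conformance checker, assuming Python as the base language and a hypothetical subset from which dynamic evaluation and module imports have been removed; programs that stray outside the subset are rejected before they are run:

    import ast

    BANNED_CALLS = {"eval", "exec", "compile", "__import__"}   # features removed from the subset

    def check_subset(source):
        """Return a list of subset violations found in the given program text."""
        violations = []
        for node in ast.walk(ast.parse(source)):
            if (isinstance(node, ast.Call) and isinstance(node.func, ast.Name)
                    and node.func.id in BANNED_CALLS):
                violations.append("line %d: call to %s() is outside the subset"
                                  % (node.lineno, node.func.id))
            if isinstance(node, (ast.Import, ast.ImportFrom)):
                violations.append("line %d: import is outside the subset" % node.lineno)
        return violations

    program = "import os\nx = eval(input())\nprint(x)"
    for problem in check_subset(program):
        print(problem)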

4.6  Source to source transformation

Figure 7: The source to source transformation pattern.

The source to source transformation creational pattern (Figure 7) allows the efficient implementation of DSL translators. As outlined in section 2, the resources available for implementing a DSL are often severely constrained. Source to source transformation can be used to ease the burden of implementation. When the DSL can not be designed as a language extension, a specialisation, or using the piggyback pattern, it is often possible to leverage the facilities provided by existing language tools using a source to source transformation technique. The DSL source code is transformed via a suitable shallow or deep translation process into the source code of an existing language. The tools available for the existing language are then used to host - compile or interpret - the code generated by the transformation process.

When using this pattern one capitalises on the existing language processor infrastructure. This can include optimising compilers, linkers, and native code instruction schedulers. In addition, in some circumstances, even tools that rely on mappings between the source code and the machine code (such as profilers, execution tracers, and symbolic debuggers) can be used. In particular, some candidate host languages such as C offer a mechanism for specifying the file and source line of the DSL code that generated a particular sequence of host code instructions. The use of this pattern also makes it relatively easy to troubleshoot the DSL compilation process, because the resulting code will often be easy to read and reason about. Another possibility involves the translation of the DSL code into the intermediate language used by existing language compilers. The pattern can be implemented using a traditional lexical analysis, parsing, and host code generation process. In addition, a number of tools such as TXL [CHHP91] and KHEPERA [FNP97] can be used to speed up the implementation process.
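A minimal sketch of such a translator, assuming a hypothetical single-assignment arithmetic DSL: the Python program below emits C source and uses the #line mechanism mentioned above so that host-language tools report positions in the DSL source file rather than in the generated code:

    def to_c(dsl_lines, dsl_file="rules.dsl"):
        """Translate 'name := expression' DSL lines into a compilable C program."""
        body = []
        for lineno, line in enumerate(dsl_lines, start=1):
            name, expr = (part.strip() for part in line.split(":="))
            body.append('#line %d "%s"' % (lineno, dsl_file))   # map back to the DSL source
            body.append("    double %s = %s;" % (name, expr))
        return ("#include <stdio.h>\n"
                "int main(void) {\n"
                + "\n".join(body) + "\n"
                '    printf("%g\\n", total);\n'
                "    return 0;\n"
                "}\n")

    print(to_c(["rate := 0.07",
                "total := 100.0 * (1.0 + rate)"]))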

4.7  Data Structure Representation

Figure 8: The data structure representation pattern.

The data structure representation creational pattern (Figure 8) allows the declarative and domain-specific specification of complex data. Data-driven code [KP78] relies on initialised data structures whose complexity can often make them difficult to write and maintain. Complicated structures are better expressed using a language than through their underlying representation (e.g. a graph adjacency list may be more easily expressed as a list of path connections). Designing a DSL to represent the data offers an attractive solution to the problem. The pattern is of use whenever a non-trivial data structure (anything other than rectangular arrays) needs to be initialised with data. It is particularly applicable to the initialisation of data structures whose elements are interrelated such as trees, graphs, arrays of pointers to statically initialised structure elements, arrays of pointers to functions, and multilingual text elements.

The DSL typically defines a user-friendly, alternative but isomorphic, representation of the underlying data structure elements. The DSL compiler can then parse the alternative data representation and transform the data elements to the structure needed for the internal representation. The adoption of this pattern minimises the chances of initialising data structures with wrong or inconsistent data, as the DSL compiler can perform such checks when compiling the data into the internal format. In addition, the data can be generated in the most efficient internal representation using tools such as the perfect hash function generator gperf [Sch90]. The pattern is most often implemented as a DSL compiler from the external to the internal representation. Where runtime efficiency is not a major constraint, the DSL can be directly coded within the system's hosting language source code utilising the user-friendly alternative data structure and suitably interpreted at runtime. Such strategies are often employed in systems written in interpreted declarative languages such as Prolog or Lisp. A representative example of a DSL based on this pattern is FIDO [KS97] which is designed to concisely express regular sets of strings or trees. Other cases of DSL-based data specifications are the table initialisations generated by yacc [Joh75] and lex [Les75] for the table-driven parsing and lexical analysis automata they create.
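A minimal sketch, assuming a hypothetical path-connection notation of the kind mentioned above (one `node -> node' connection per line): the small compiler checks the notation and builds the adjacency-list structure that the rest of the system would actually use:

    def compile_graph(text):
        """Compile 'a -> b' connection lines into an adjacency-list dictionary."""
        adjacency = {}
        for lineno, line in enumerate(text.splitlines(), start=1):
            line = line.split("#")[0].strip()          # drop comments and blank lines
            if not line:
                continue
            parts = [node.strip() for node in line.split("->")]
            if len(parts) != 2 or not all(parts):
                raise SyntaxError("line %d: expected 'node -> node'" % lineno)
            src, dst = parts
            adjacency.setdefault(src, []).append(dst)
            adjacency.setdefault(dst, [])              # make the target a known node too
        return adjacency

    spec = """
    athens -> paris      # weekly connection
    paris  -> london
    london -> athens
    """
    print(compile_graph(spec))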

4.8  System Front-End

Figure 9: The system front-end pattern.

The configuration and adaptation of a system can often be relegated to a DSL front-end (Figure 9). Complicated software systems offer hundreds of configuration options, while their users require ever-increasing adaptation possibilities. Adding more features and configuration options can enlarge and complicate the system with diminishing returns on real functionality and user-friendliness. Making the system programmable by means of the DSL front-end structural pattern provides its users with a declarative, maintainable, organised, and open-ended mechanism for configuring and adapting it. Systems with more than a few configuration options, and systems whose operation can not be adequately specified by means of some arguments or a graphical user interface, typically benefit from the addition of a DSL front-end.

Using this strategy, the system's configuration parameters and internal functionality are exposed as elements of the DSL - e.g. as variables and functions respectively. At this point it is often advantageous to remove from the system all elements that can be specified by means of the DSL, and code them in terms of it, thus simplifying its structure.

Often the addition of a DSL to a system can reveal synergistic effects by enabling its communication with other systems, allowing for the automatic generation of DSL programs with increased functionality, establishing a common language among its user base, providing a mechanism for optimising or checking the system's configuration, and opening a market for third-party add-on applications. The pattern is most often implemented as an interpreted language embedded within the target system. A number of existing interpreted languages have been used for, or explicitly targeted at, this purpose. Lisp-like languages have been used by systems such as the Emacs [Sta84] editor and the AutoCAD package [RH98], while languages such as Tcl [Ous94], Perl [WS90], and Microsoft's Application Basic [Boc99] provide explicit support for system embedding.
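A minimal sketch of an embedded front-end, assuming a hypothetical editor-like application written in Python: its configuration parameters and internal functionality are exposed to the configuration script as variables and functions, as described above, and the host interpreter itself serves as the embedded language processor:

    class Editor:
        """A hypothetical host application configured through an embedded DSL."""
        def __init__(self):
            self.tab_width = 8                # configuration parameter
            self.key_bindings = {}            # internal functionality: key dispatch

        def bind_key(self, key, action):
            self.key_bindings[key] = action

        def configure(self, script):
            # names exposed to the configuration front-end
            namespace = {
                "set_tab_width": lambda n: setattr(self, "tab_width", n),
                "bind_key": self.bind_key,
            }
            exec(script, namespace)           # run the user's configuration program

    user_config = ("set_tab_width(4)\n"
                   "bind_key('C-s', lambda: print('saving...'))\n")
    editor = Editor()
    editor.configure(user_config)
    print(editor.tab_width)                   # 4
    editor.key_bindings["C-s"]()              # saving...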

5  Conclusions

Figure 10: Relationships within the DSL pattern language.

We have described a DSL pattern language consisting of eight DSL design patterns. Our pattern language does not include the design of a DSL using traditional programming language design and implementation techniques (lexical analysis, parsing, code generation), as these aspects are extensively covered in the existing literature. The relationships of the patterns we described are depicted in Figure 10. The interrelationships between the patterns are both interesting and typical of a pattern language focused on a specific domain. Throughout our literature research for drafting this work we were impressed by the multitude of DSL designs, implementation strategies, and resulting systems, and by the scarcity of supporting design frameworks and methodologies. We hope that the pattern language we presented can be used as a building block for a systematic view of the software development process involving DSLs.

Acknowledgements

We would like to thank the anonymous referees for their insightful comments on the previous version of this paper.

References

[AIS+77]
Christopher Alexander, Sara Ishikawa, Murray Silverstein, Max Jacobson, Ingrid Fiksdahl-King, and Shlomo Angel. A Pattern Language. Oxford University Press, 1977.

[AKW79]
Alfred V. Aho, Brian W. Kernighan, and Peter J. Weinberger. Awk - a pattern scanning and processing language. Software: Pract. & Exp., 9(4):267-280, 1979.

[AKW88]
Alfred V. Aho, Brian W. Kernighan, and Peter J. Weinberger. The AWK Programming Language. Addison-Wesley, 1988.

[ASS90]
Harold Abelson, Gerald Jay Sussman, and Julie Sussman. Structure and Interpretation of Computer Programs. MIT Press, 1990.

[ASU85]
Alfred V. Aho, Ravi Sethi, and Jeffrey D. Ullman. Compilers, Principles, Techniques, and Tools. Addison-Wesley, 1985.

[BBH+94]
J. Bell, F. Bellegarde, J. Hook, R. B. Kieburtz, A. Kotov, J. Lewis, L. McKinney, D. P. Oliva, T. Sheard, L. Tong, L. Walton, and T. Zhou. Software design for reliability and reuse: a proof-of-concept demonstration. In Conference on TRI-Ada '94, pages 396-404. ACM, ACM Press, 1994.

[Ben86]
Jon Louis Bentley. Programming pearls: Little languages. Communications of the ACM, 29(8):711-721, August 1986.

[BJK87]
Jon Louis Bentley, Lynn W. Jelinski, and Brian W. Kernighan. CHEM - a program for phototypesetting chemical structure diagrams. Computers and Chemistry, 11(4):281-297, 1987.

[BK86]
Jon Louis Bentley and Brian W. Kernighan. GRAP - a language for typesetting graphs. Communications of the ACM, 29(8):782-792, August 1986.

[BK91]
Jon Louis Bentley and Brian W. Kernighan. A system for algorithm animation. Computing Systems, 4(1):5-30, Winter 1991.

[BLC95]
T. Berners-Lee and D. Connolly. RFC 1866: Hypertext Markup Language - 2.0, November 1995. Status: PROPOSED STANDARD.

[Boc99]
David Boctor. Microsoft Office 2000 Visual Basic Fundamentals. Microsoft Press, 1999.

[BOSW98]
Gilad Bracha, Martin Odersky, David Stoutamire, and Philip Wadler. Making the future safe for the past: Adding genericity to the Java programming language. ACM SIGPLAN Not., 33(10):183-200, October 1998. Proceedings of the 1998 ACM SIGPLAN Conference on Object-Oriented Programming Systems, Languages and Applications (OOPSLA '98).

[CHHP91]
James R. Cordy, Charles D. Halpern-Hamu, and Eric Promislow. TXL: A rapid prototyping system for programming language dialects. Computer Languages, 16(1):97-107, January 1991.

[Cre97]
Roger F. Crew. ASTLOG: A language for examining abstract syntax trees. In Ramming [Ram97], pages 229-242.

[CS95]
James O. Coplien and Douglas C. Schmidt. Pattern Languages of Program Design. Addison-Wesley, 1995.

[ER97]
P. D. Edwards and R. S. Rivett. Towards an automotive `safer subset' of C. In Peter Daniel, editor, 16th International Conference on Computer Safety, Reliability and Security: SAFECOMP '97, pages 185-195, York, UK, September 1997. European Workshop on Industrial Computer Systems: TC-7, Springer Verlag.

[FNP97]
Rickard E. Faith, Lars S. Nyland, and Jan F. Prins. KHEPERA: A system for rapid implementation of domain specific languages. In Ramming [Ram97], pages 243-255.

[GHJV95]
Erich Gamma, Richard Helm, Ralph Johnson, and John Vlissides. Design Patterns: Elements of Reusable Object-Oriented Software. Addison-Wesley, 1995.

[GNV88]
E. R. Gansner, S. C. North, and K. P. Vo. DAG - a program that draws directed graphs. Software: Pract. & Exp., 18(11):1047-1062, November 1988.

[ISO86]
International Organization for Standardization, Geneva, Switzerland. Information processing - Text and office systems - Standard Generalized Markup Language (SGML), 1986. ISO 8879:1986.

[Jac99]
Michael Jackson. Specializing in software engineering. IEEE Software, 16(6):119-121, Nov/Dec 1999.

[JL87]
Stephen C. Johnson and Michael E. Lesk. Language development tools. Bell System Technical Journal, 56(6):2155-2176, July-August 1987.

[Joh75]
Stephen C. Johnson. Yacc - yet another compiler-compiler. Computer Science Technical Report 32, Bell Laboratories, Murray Hill, NJ, USA, July 1975.

[KC74]
Brian W. Kernighan and L. L. Cherry. A system for typesetting mathematics. Computer Science Technical Report 17, Bell Laboratories, Murray Hill, NJ, USA, May 1974.

[Ker75]
Brian W. Kernighan. Ratfor - a preprocessor for a rational Fortran. Software: Pract. & Exp., 5(4):395-406, 1975.

[Ker82]
Brian W. Kernighan. PIC - a language for typesetting graphics. Software: Pract. & Exp., 12:1-21, 1982.

[KH97]
Samual N. Kamin and David Hyatt. A special-purpose language for picture-drawing. In Ramming [Ram97], pages 297-310.

[KP78]
Brian W. Kernighan and P. J. Plauger. The Elements of Programming Style. McGraw-Hill, second edition, 1978.

[KR79]
Brian W. Kernighan and Dennis M. Ritchie. The M4 macro processor. In Unix Programmer's Manual [Uni79].

[KS97]
Nils Klarlund and Michael I. Schwartzbach. A domain-specific language for regular sets of strings and trees. In Ramming [Ram97], pages 145-156.

[Lan66]
P. J. Landin. The next 700 programming languages. Communications of the ACM, 9(3):157-166, May 1966.

[Les75]
Michael E. Lesk. Lex - a lexical analyzer generator. Computer Science Technical Report 39, Bell Laboratories, Murray Hill, NJ, USA, October 1975.

[Les79a]
Michael Lesk. Some applications of inverted indexes on the Unix system. In Unix Programmer's Manual [Uni79].

[Les79b]
Michael E. Lesk. TBL - a program to format tables. In Unix Programmer's Manual [Uni79].

[Lut96]
Mark Lutz. Programming Python. O'Reilly and Associates, 1996.

[McM79]
L. E. McMahon. SED - a non-interactive text editor. In Unix Programmer's Manual [Uni79].

[Mot94]
Motor Industry Research Association. Development guidelines for vehicle based software, November 1994.

[NvO98]
Tobias Nipkow and David von Oheimb. Javalight is type-safe - definitely. In Conference Record of POPL '98: The 25th ACM SIGPLAN-SIGACT Symposium on Principles of Programming Languages, pages 161-170, San Diego, California, 1998.

[Oss79]
J. F. Ossanna. NROFF/TROFF user's manual. In Unix Programmer's Manual [Uni79].

[Ous94]
John K. Ousterhout. Tcl and the Tk Toolkit. Addison-Wesley, 1994.

[P+93]
Jef Poskanzer et al. NETPBM: Extended portable bitmap toolkit. Available online ftp://ftp.x.org/contrib/utilities/, December 1993. Release 7.

[PH95]
Mohammad N. Paryavi and William J. Hankley. OOSPEC: an executable object-oriented specification language. In ACM 23rd annual computer science conference. CSC '95, pages 169-177. ACM, ACM Press, 1995.

[PST91]
Ben Potter, Jane Sinclair, and David Till. An Introduction to Formal Specification and Z. Prentice-Hall, 1991.

[Ram97]
J. Christopher Ramming, editor. USENIX Conference on Domain-Specific Languages, Santa Monica, CA, USA, October 1997. USENIX.

[RH98]
Rod R. Rawls and Mark A. Hagen. Autolisp Programming: Principles and Techniques. Goodheart-Willcox Co., 1998.

[RJB99]
James Rumbaugh, Ivar Jacobson, and Grady Booch. The Unified Modeling Language Reference Manual. Addison-Wesley, 1999.

[Sav95]
Walter Savitch. Pascal - An Introduction to the Art and Science of Programming. Benjamin/Cummings Pub. Co., Inc., fourth edition, 1995.

[Sch90]
Douglas C. Schmidt. Gperf: A perfect hash function generator. In USENIX C++ Conference, pages 87-100, San Francisco, CA, USA, April 1990. Usenix Association.

[SDE95]
Diomidis Spinellis, Sophia Drossopoulou, and Susan Eisenbach. Object-oriented technology in multiparadigm language implementation. Journal of Object-Oriented Programming, 8(1):33-38, March/April 1995.

[SG97a]
Diomidis Spinellis and V. Guruprasad. Lightweight languages as software engineering tools. In Ramming [Ram97], pages 67-76.

[SG97b]
James M. Stichnoth and Thomas Gross. Code composition as an implementation language for compilers. In Ramming [Ram97], pages 119-132.

[Sim95]
Charles Simonyi. The death of computer languages and the birth of intentional programming. Technical Report MSR-TR-95-52, Microsoft Corporation, Redmond, WA, USA, September 1995. Available online at ftp://ftp.research.microsoft.com/pub/tr/tr-95-52.ps.

[Som89]
Ian Sommerville. Software Engineering. Addison-Wesley, third edition, 1989.

[SS86]
Leon Sterling and Ehud Shapiro. The Art of Prolog. MIT Press, 1986.

[Sta84]
R. M. Stallman. EMACS: The extensible, customizable, self-documenting display editor. In D. R. Barstow, H. E. Shrobe, and E. Sandwell, editors, Interactive Programming Environments, pages 300-325. McGraw-Hill, 1984.

[Str84]
Bjarne Stroustrup. Data abstraction in C. Bell System Technical Journal, 63(8):1701-1732, October 1984.

[TM87]
Władysław M. Turski and Thomas S. E. Maibaum. The Specification of Computer Programs. Addison-Wesley, 1987.

[TY92]
Shmuel Tyszberowicz and Amiram Yehudai. OBSERV - a prototyping language and environment. ACM Transactions on Software Engineering and Methodology, 1(3):269-309, July 1992.

[Uni79]
UNIX Programmer's Manual. Volume 2 - Supplementary Documents. Bell Telephone Laboratories, Murray Hill, NJ, USA (also available online http://plan9.bell-labs.com/7thEdMan/), seventh edition, 1979.

[Wir74]
Niklaus Wirth. On the design of programming languages. In Jack L. Rosenfeld, editor, Information Processing 74: Proceedings of IFIP Congress 74, pages 386-393, Stockholm, Sweden, August 1974. International Federation for Information Processing, North-Holland Publishing Company.

[WS90]
Larry Wall and Randal L. Schwartz. Programming Perl. O'Reilly and Associates, Sebastopol, CA, USA, 1990.

Biographical Information

Diomidis Spinellis holds an MEng in Software Engineering and a PhD in Computer Science, both from Imperial College (University of London, UK). He is an assistant professor at the Department of Information and Communication Systems, University of the Aegean, Greece. He has contributed software to the 4.4BSD Unix distribution and the X Window System, and is the author of a number of open source software packages, libraries, and tools. His research interests include Software Engineering, Programming Languages, and Information Security. Dr. Spinellis is a member of the ACM, the IEEE, and a founding member of the Greek Internet User's Society. He is a co-recipient of the Usenix Association 1993 Lifetime Achievement Award.