
Showing posts from 2011

A Christmas Poem

(C) Professor Fish, aka Ralf Lämmel

Silent night! Ominous night!
All are asleep; alone awake is
only the Bachelor student, who never sleeps, who only runs.
So many exams, oh dear! So many exams, oh dear!

Silent night! Ominous night!
Once again it is pulled through without sleep.
Best student, oh how wickedly
the aversion to Bologna's tribunal laughs from your face.
So many regulations, oh dear! So many regulations, oh dear!

Silent night! Ominous night!
What have they made of university studies?
It is not Bologna alone; other trends fall in line here.
Thinking of Germany at night
has robbed the professor of his sleep as well.

Loud land! Ominous land!
O Marx, give me parents who challenge their children instead of tormenting teachers.
O Lenin, give me first-semester students who want to know instead of video-gaming.
Everything used to be better, oh dear! Everything used to be better, oh dear!

Silent night! Ominous night!
If it is to be Christmas now, then death to the E

A riddle regarding type safety

Of course, I have explained to the students in my language theory class what type safety means (remember: progress + preservation), and we have studied this notion time and again for some of the TAPL languages. However, I don't feel the notion has really sunk in universally. Some students consider all this semantics and calculus material obscure, and there is a certain critical mass of notation beyond which concepts are no longer understood (by some, if not many, of the Bachelor-level students in my class). Last night, I came up with a nice "semantics riddle" for type safety: what would be a super-trivial language with a type system and an SOS semantics such that type safety is violated? I gave the students two minutes to think about the question. I emphasized several times that the language should be super-trivial. I also asked them for a general strategy for mis-designing a language to be type-unsafe. One idea that popped up was to support some notion of cast.
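One concrete answer, sketched here in Haskell as my own illustration (not necessarily the riddle's intended solution): take a TAPL-style toy language of naturals and booleans, and deliberately make the typing rule for `Succ` accept any well-typed argument. The term `Succ Tru` then type-checks but can neither step nor count as a value, so progress fails.

```haskell
-- A deliberately type-unsafe toy language (hypothetical illustration):
-- the type system accepts a term on which the SOS semantics gets stuck.

data Term = Zero | Succ Term | Tru | Fls deriving (Show, Eq)
data Type = TNat | TBool deriving (Show, Eq)

-- Broken by design: Succ accepts ANY well-typed argument, not just TNat.
typeOf :: Term -> Maybe Type
typeOf Zero     = Just TNat
typeOf Tru      = Just TBool
typeOf Fls      = Just TBool
typeOf (Succ t) = typeOf t >> Just TNat

-- Small-step SOS: only a congruence rule for Succ; no rule ever
-- reduces Succ Tru.
step :: Term -> Maybe Term
step (Succ t) = Succ <$> step t
step _        = Nothing

isValue :: Term -> Bool
isValue Zero     = True
isValue Tru      = True
isValue Fls      = True
isValue (Succ t) = isNumeric t
  where isNumeric Zero     = True
        isNumeric (Succ u) = isNumeric u
        isNumeric _        = False
```

In GHCi, `typeOf (Succ Tru)` yields `Just TNat`, yet `isValue (Succ Tru)` is `False` and `step (Succ Tru)` is `Nothing`: a well-typed, stuck term, i.e., progress is violated.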

A more serious online partial evaluator

Admittedly, the illustrative partial evaluator of the previous blog post was quite limited. Consider, for example, the following attempts at partially evaluating different applications of the exponentiation function; the last attempt diverges:

> peval (lib, (Apply "exp" [Const 2, Const 3]))
Const 8

> peval (lib, (Apply "exp" [Var "x", Const 3]))
Binary Times (Var "x") (Binary Times (Var "x") (Binary Times (Var "x") (Const 1)))

> peval (lib, (Apply "exp" [Const 2, Var "n"]))
IfZero (Var "n") (Const 1) (Binary Times (Const 2) (IfZero (Binary Plus (Var "n") (Const (-1))) (Const 1) (Binary Times (Const 2) (IfZero (Binary Plus (Binary Plus (Var "n") (Const (-1))) (Const (-1))) (Const 1) (Binary Times (Const 2) (IfZero (Binary Plus (Binary Plus (Binary Plus (Var "n") (Const (-1))) (Const (-1))) (Const (-1))) (Const 1) (Binary Times (Const 2) (IfZero (Binary Plus (B

An illustrative online partial evaluator

Let's understand the basic mechanics of a (too?) simple online partial evaluator. (I am grateful to William Cook for encouraging the following Haskell exercise, and for sharing with me some stimulating slides and notes on the subject.) I recommend John Hatcliff's excellent and detailed course material on "Foundations of Partial Evaluation and Program Specialization", which is available online. For those in a hurry, try Wikipedia. Anyway, partial evaluation, macro systems, and multi-stage programming are a lot of fun. Here is also a paper (draft) by William Cook that I can warmly recommend. Consider the following syntax of a simple, pure, first-order, functional programming language:

type Prog = (FEnv, Expr)
type FEnv = [FDef]
type FDef = (String, ([String], Expr))

data Expr = Const Int
          | Var String
          | Binary Op Expr Expr
          | IfZero Expr Expr Expr
          | Apply String [Expr]
  deriving (Show)

data Op = Plus | Times
  deriving (Show)

Thus, a program consists of a main expression and
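The excerpt breaks off here. For completeness, here is a minimal online partial evaluator over this syntax, together with a plausible reconstruction of the `lib` used in the follow-up post. This is my own self-contained sketch (datatypes repeated for runnability) of the kind of evaluator the post describes, not necessarily its original code:

```haskell
type Prog = (FEnv, Expr)
type FEnv = [FDef]
type FDef = (String, ([String], Expr))

data Expr = Const Int | Var String | Binary Op Expr Expr
          | IfZero Expr Expr Expr | Apply String [Expr]
  deriving (Show, Eq)

data Op = Plus | Times deriving (Show, Eq)

-- Naive online partial evaluation: fold constants, residualize the
-- rest, and unfold every function application eagerly.
peval :: Prog -> Expr
peval (fe, main0) = pe main0 []
  where
    pe :: Expr -> [(String, Expr)] -> Expr
    pe e@(Const _) _ = e
    pe (Var x) env = maybe (Var x) id (lookup x env)
    pe (Binary op x y) env =
      case (pe x env, pe y env) of
        (Const a, Const b) -> Const (op2 op a b)
        (x', y')           -> Binary op x' y'
    pe (IfZero c t e) env =
      case pe c env of
        Const 0 -> pe t env
        Const _ -> pe e env
        -- Dynamic test: specialize both branches; this eager unfolding
        -- under a residual IfZero is the source of divergence.
        c'      -> IfZero c' (pe t env) (pe e env)
    pe (Apply f args) env =
      let Just (ps, body) = lookup f fe
      in pe body (zip ps (map (flip pe env) args))
    op2 Plus  = (+)
    op2 Times = (*)

-- A plausible library with the exponentiation function x^n.
lib :: FEnv
lib = [ ("exp", (["x", "n"],
          IfZero (Var "n")
                 (Const 1)
                 (Binary Times
                         (Var "x")
                         (Apply "exp" [ Var "x"
                                      , Binary Plus (Var "n") (Const (-1)) ])))) ]
```

With fully static arguments the call folds to a constant; with a static exponent the recursion unfolds into a residual product; with a dynamic exponent the eager unfolding never terminates, which is exactly the limitation discussed in the follow-up post.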

Empirical studies of software languages

Dear language researchers and engineers, if you were interested in empirical studies of software languages, what journals and conferences come to mind as potential publication venues for such studies? In different terms, what quality journals and conferences do you think would pop up if you were googling (binging) intelligently for empirical studies of programming languages, modeling languages, domain-specific languages, and other software languages you can think of? Please send your ideas by email to rlaemmel@acm.org by 19 May. The results of this inquiry will be published while preserving the anonymity of all participants. Thanks! Professor Fish PS1: The reason for asking is that we are engaged in a meta-empirical effort, continuing our SLE 2010 paper "Empirical language analysis in software linguistics" (Jean-Marie Favre, Dragan Gasevic, Ralf Lämmel, and Ekaterina Pek). We are going to use content analysis, where we aim for automation for data mining to t

Language processing for dummies

I recently enjoyed reading "Language Implementation Patterns: Create Your Own Domain-Specific and General Programming Languages" by Terence Parr (the ANTLR guy), as available from The Pragmatic Bookshelf. The book uses a design-pattern style to present techniques for language processing. Eventually the book leverages ANTLR for many problems, but it also explains language processing in terms of patterns that require regular programming effort without the help of a program generator. This is a clever way of teaching some basics of grammars, parsing, tree walking, and other concepts that are all too easily explained (elsewhere) in an overly complicated manner, or in a manner requiring substantial background. In my (very) advanced Bachelor's course Programming techniques and technologies, I face students who don't know much yet about formal languages and certainly nothing about compiler construction; they are aware, though, of context-free grammars (EBNFs) for the p

A tiny attribute grammar exercise

(C) 2011, Ralf Laemmel. Let's map Knuth's attribute grammar example to Haskell. This has been done many times. In fact, there is a long track record of excellent research on the relation between attribute grammars and functional programming. For instance, see the work by Doaitse Swierstra and his team over many years. Our development here is meant to just illustrate the basic mapping in as simple and self-contained a manner as possible. One should notice that the chosen example (due to Knuth) requires laziness for the straightforward encoding that is leveraged below. Hence, C programmers are encouraged to try to transcribe this Haskell program to their favorite language. The following context-free grammar is assumed:

digit : '0'
digit : '1'
digits : digit
digits : digits digit
number : digits
number : digits '.' digits

That is, we deal with binary numbers, both integer and fractional. We use the following algebraic datatypes to model the grammar:

data Digit
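The excerpt is cut off at the datatype declarations. Here is one way the development could continue; the names and the encoding are my own guesses, and, unlike the post's lazy knot-tying formulation, this sketch simply passes the inherited "scale" attribute down as a function parameter:

```haskell
-- Datatypes mirroring the grammar (hypothetical names; the post's
-- actual declarations are truncated above).
data Digit  = Zero | One
data Digits = Single Digit | Snoc Digits Digit
data Number = Integral Digits | Fractional Digits Digits

-- Attribute computations: "scale" (the exponent of the rightmost
-- digit) is the inherited attribute; "value" and "length" are
-- synthesized. Knuth's lazy formulation circulates all of these
-- attributes in a single traversal instead.
valD :: Digit -> Int -> Rational
valD Zero _ = 0
valD One  s = 2 ^^ s            -- 2^s; s may be negative

valDs :: Digits -> Int -> Rational
valDs (Single d)  s = valD d s
valDs (Snoc ds d) s = valDs ds (s + 1) + valD d s

lenDs :: Digits -> Int
lenDs (Single _)  = 1
lenDs (Snoc ds _) = lenDs ds + 1

valN :: Number -> Rational
valN (Integral ds)    = valDs ds 0
valN (Fractional i f) = valDs i 0 + valDs f (negate (lenDs f))
```

For instance, the binary number 1101.01 is represented as `Fractional (Snoc (Snoc (Snoc (Single One) One) Zero) One) (Snoc (Single Zero) One)` and evaluates to 53/4, i.e., 13.25.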

Test-driven grammar-class understanding with ANTLR

Suppose you would like to understand grammar classes such as LL(1) and LL(*). Then you might appreciate a test suite like the following, or you might actually try to discover such a test suite yourself. I used the following suite in a lab of my "Software Language Processing" class. It is a very convenient means of illustrating things. Students can copy and paste these codes and investigate further. It helps in understanding formal notions as well as actual, operational parser behavior. The first ANTLR input specifies a trivial grammar with a fixed lookahead of 1:

grammar LL1;
options { k = 1; }
s : 'a' 'x'
  | 'b' 'x'
  ;

Let's apply ANTLR:

java -classpath ".:antlr-3.2.jar" org.antlr.Tool LL1.g

ANTLR likes that grammar just fine. (There are no complaints.) Hence we are allowed to infer that the grammar is indeed LL(1). The next ANTLR input combines two alternatives that are clearly not LL(1). However, we still use a lookahead of 1.

gra
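The excerpt truncates the second grammar. A plausible shape for that non-LL(1) example (my guess, not the post's verbatim code) gives both alternatives the same first token, so a single token of lookahead cannot choose between them:

```
grammar NotLL1;
options { k = 1; }
s : 'a' 'x'
  | 'a' 'y'
  ;
```

Running ANTLR as above on this grammar should yield a complaint about the k=1 decision for rule s, since both alternatives begin with 'a'.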

A tiny bit of denotational semantics

All languages and assignments in this blog post are (C) 2011 Ralf Lämmel. This is actually an assignment for my paradigms/semantics class. Consider the following super-trivial interpreter written in Haskell and in the direct style of denotational semantics:

data Exp = Const Int | Add Exp Exp

eval :: Exp -> Int
eval (Const i) = i
eval (Add x y) = eval x + eval y

It's time to test:

> eval (Add (Const 20) (Const 22))
42

This incredible language handles integer constants and addition. Now assume that we are adding an "exit" form of expression. The intended semantics of a term "Exit e" is to define the final result of expression evaluation as the value of e. Hence, if an exit expression occurs inside an addition, then the addition is effectively cancelled. Thus:

data Exp = Const Int | Add Exp Exp | Exit Exp

For instance:

Add (Const 20) (Const 22) should evaluate to 42.
Add (Exit (Const 88)) (Const 42) should evaluate to 88.

A particularly clumsy
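The excerpt ends before the post's own solutions. One possible semantics for Exit, offered as my sketch rather than the assignment's intended answer, threads the "exited or not" status through evaluation with Either:

```haskell
data Exp = Const Int | Add Exp Exp | Exit Exp

-- Left marks a result established by Exit; it propagates outward and
-- cancels any enclosing addition. Right is a normal result.
evalE :: Exp -> Either Int Int
evalE (Const i) = Right i
evalE (Add x y) = do a <- evalE x; b <- evalE y; Right (a + b)
evalE (Exit e)  = either Left Left (evalE e)

-- At top level, an exited result and a normal result coincide.
eval :: Exp -> Int
eval = either id id . evalE
```

With this sketch, `eval (Add (Const 20) (Const 22))` yields 42, while `eval (Add (Exit (Const 88)) (Const 42))` yields 88, as required: the `Left` short-circuits the Either monad's bind in the Add case.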