DMD 1.021 and 2.004 releases

Jascha Wetzel "[firstname]" at mainia.de
Tue Sep 11 07:13:27 PDT 2007


Jari-Matti Mäkelä wrote:
> Kirk McDonald wrote:
> 
>> Walter Bright wrote:
>>> Stewart Gordon wrote:
>>>
>>>> Maybe.  But still, nested comments are probably likely to be supported
>>>> by more code editors than such an unusual feature as delimited strings.
>>>
>>> Delimited strings are standard practice in Perl. C++0x is getting
>>> delimited strings. Code editors that can't handle them are going to
>>> become rapidly obsolete.
>>>
>>> The more unusual feature is the token delimited strings.
>> Which, since there's no nesting going on, are actually very easy to
>> match. The Pygments lexer matches them with the following regex:
>>
>> q"([a-zA-Z_]\w*)\n.*?\n\1"
> 
> It's great to see Pygments handles so many possible syntaxes. Unfortunately
> backreferences are not part of regular expressions. I've noticed two kinds
> of problems in tools:
> 
> a) some can't handle backreferences, but provide support for nested comments
> as a special case. So comments are no problem then, but all delimited
> strings are.
> 
> b) some lexers handle both nested comments and delimited strings, but all
> delimiters must be enumerated in the language definition. Even worse, some
> highlighters only handle delimited comments, not strings.
> 
> Maybe the new features (= one saves on average < 5 characters of typing per
> string) are more important than tool support? Maybe all tools should be
> rewritten in Python & Pygments?

D's delimited strings can (luckily) be scanned with a regular language, 
because the enclosing double quotes are required. Otherwise the lexical 
structure wouldn't even be context-free, and it would be a nightmare for 
automatically generated lexers.
Therefore you can match q"[^"]*" and check the delimiters during 
(context-sensitive) semantic analysis.
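
A minimal sketch of that two-pass idea in Python, assuming a coarse 
q"[^"]*" lexical match followed by a separate delimiter check. The names 
(COARSE, PAIRS, check_delimiters) are illustrative only, and 
identifier-delimited heredoc forms are not handled here:

    import re

    # Hypothetical sketch of the two-pass idea above; not taken from any
    # real lexer.
    COARSE = re.compile(r'q"[^"]*"', re.DOTALL)       # coarse, purely regular match
    PAIRS = {'(': ')', '[': ']', '{': '}', '<': '>'}  # bracket delimiters close with their partner

    def check_delimiters(token):
        # token is the full coarse match, e.g. q"(hello)"
        body = token[2:-1]                   # strip the leading q" and trailing "
        if len(body) < 2:
            return False
        close = PAIRS.get(body[0], body[0])  # other delimiters close with themselves
        return body[-1] == close

    for m in COARSE.finditer('auto s = q"(hello world)";'):
        print(m.group(0), check_delimiters(m.group(0)))  # q"(hello world)" True

The point is only that the first pass stays regular; anything that depends 
on the actual delimiter characters is deferred to the second, 
context-sensitive pass.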


