Request for comments: std.d.lexer

Mehrdad wfunction at hotmail.com
Mon Jan 28 01:03:42 PST 2013


On Sunday, 27 January 2013 at 19:48:23 UTC, Walter Bright wrote:
> On 1/27/2013 2:17 AM, Philippe Sigaud wrote:
>> Walter seems to think if a lexer is not able to vomit thousands
>> of tokens a second, then it's not good.
>
> Speed is critical for a lexer.
>
> This means, for example, you'll need to squeeze pretty much all 
> storage allocation out of it. A lexer that does an allocation 
> per token is not going to do very well at all.


Semi-unrelated question -- how would you benchmark a _parser_?

Is there any way to get a _number_ as an answer, or is comparing 
against a rival parser the only way of benchmarking a parser?
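One common way to get an absolute number is to time the parser over a fixed corpus and report throughput, e.g. MB/s of source consumed or AST nodes produced per second. A minimal sketch of such a harness (in Python for illustration, with a trivial whitespace-splitting stand-in where the real parser under test would go):

```python
import time

def parse(source):
    # Stand-in "parser": splits on whitespace. A real benchmark
    # would invoke the parser under test here instead.
    return source.split()

def benchmark(source, runs=10):
    best = float("inf")
    nodes = []
    for _ in range(runs):
        start = time.perf_counter()
        nodes = parse(source)
        elapsed = time.perf_counter() - start
        best = min(best, elapsed)  # best-of-N damps scheduling noise
    mb = len(source.encode("utf-8")) / (1024 * 1024)
    return mb / best, len(nodes) / best  # MB/s, nodes/s

corpus = "int x = 42 ; " * 100_000
mbps, nps = benchmark(corpus)
print(f"{mbps:.1f} MB/s, {nps:.0f} nodes/s")
```

Such a number is only meaningful relative to the hardware and corpus used, so in practice published parser benchmarks usually pair the absolute figure with a comparison against a rival parser on the same input.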


More information about the Digitalmars-d mailing list