First class lazy Interval
Michel Fortin
michel.fortin at michelf.com
Fri Feb 27 19:43:57 PST 2009
On 2009-02-27 08:44:31 -0500, Andrei Alexandrescu
<SeeWebsiteForEmail at erdani.org> said:
> Michel Fortin wrote:
>> On 2009-02-27 04:43:46 -0500, bearophile <bearophileHUGS at lycos.com> said:
>>
>>> D2 supports the interval syntax in the foreach:
>>> foreach (i; 1..1000) {...}
>>>
>>> Such intervals are useful in a very large number of situations. So,
>>> with the new Range support, it may be useful to allow the interval
>>> syntax to be used in other contexts as well.
>>> So x..y may become a first-class lazy interval from x to y-1 that
>>> can be passed to functions too, etc., and not just used in foreach
>>> (the compiler can recognize it and often optimize it away,
>>> replacing it with a normal for() loop).
>>
>> I agree that having first-class intervals in the language would make it
>> better, especially when you want to pass intervals as function
>> arguments.
>
> I'm having trouble understanding what's wrong with the good old data
> types and functions.
Nothing, really. Specifying an interval as two separate function
arguments, as in f(a, b), is perfectly acceptable. Having a struct
Interval { int a, b; } is also a nice way to define and pass an
interval around. And writing intervals as a..b in foreach and when
slicing arrays is a pretty neat syntax.
But while each of these ways of passing intervals is nice on its own,
having an incoherent mix of them isn't. I think intervals should be
unified, and that probably means opening the a..b syntax to other
uses. That could be done easily by mapping a..b to a struct in the
standard library.
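A minimal sketch of what such a struct could look like (the names
Interval and interval are mine, and the half-open bounds match what
foreach (i; a..b) does today):

struct Interval(T)
{
    T a, b;  // half-open: covers a, a+1, ..., b-1

    // Range primitives, so an interval works wherever a range does.
    bool empty() { return a >= b; }
    T front() { return a; }
    T back() { return cast(T)(b - 1); }
    void popFront() { ++a; }
    void popBack() { --b; }
    size_t length() { return cast(size_t)(b - a); }
    T opIndex(size_t i) { return cast(T)(a + i); }
}

Interval!(T) interval(T)(T a, T b)
{
    return Interval!(T)(a, b);
}

With that in place, the compiler would only need to rewrite a..b into
interval(a, b) wherever the syntax appears.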
That said, as others have pointed out, it also opens some new possibilities:
It makes it easier to support multidimensional arrays. For instance,
marray[1..3, 3, 3..4] could translate to a call to
marray.opSliceIndex(interval!(int), int, interval!(int)).
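To illustrate, here is roughly what that could look like. Matrix is a
placeholder type of mine, and since no such rewrite exists yet, the
call has to be spelled out by hand for now:

struct Matrix
{
    // What marray[1..3, 3, 3..4] would be rewritten into:
    Matrix opSliceIndex(Interval!(int) rows, int col, Interval!(int) planes)
    {
        // ... extract and return the selected sub-matrix ...
        return this;
    }
}

// auto sub = marray.opSliceIndex(interval(1, 3), 3, interval(3, 4));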
Making intervals a type generalizes them as ranges. For instance, you
could create a "random" template function that chooses a random
element from a random-access range: random(range), then use that same
function with an integer interval: random(1..10). That's an
improvement over having two arguments, random(1, 10), because two
separate arguments can't be treated as a range (with front, back,
popFront, popBack, etc.). And while random(interval(1, 10)) is
technically the same, it is also a lot more cluttered.
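Here's a rough cut of such a random function, built on the Interval
sketch above; I'm assuming std.random's uniform(a, b) to pick the
index, and random(1..10) would simply be rewritten by the compiler
into random(interval(1, 10)):

import std.random;

// Chooses an element at a uniformly random index of any
// random-access range that has a length.
auto random(R)(R range)
{
    return range[uniform(0, range.length)];
}

void main()
{
    auto x = random([10, 20, 30]);     // any random-access range
    auto y = random(interval(1, 10));  // what random(1..10) would mean
}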
If the language can avoid the clutter in foreach and array slices,
why can't we avoid it elsewhere? Why should an interval look better
inside a language construct than anywhere else? Those are the
oddities that mapping a..b to a standard type, usable everywhere,
would avoid.
--
Michel Fortin
michel.fortin at michelf.com
http://michelf.com/