[Issue 6725] core.time.dur should accept floating point

via Digitalmars-d-bugs digitalmars-d-bugs at puremagic.com
Thu Jul 24 00:27:17 PDT 2014


https://issues.dlang.org/show_bug.cgi?id=6725

--- Comment #36 from Sobirari Muhomori <dfj1esp02 at sneakemail.com> ---
(In reply to Vladimir Panteleev from comment #34)
> (In reply to Sobirari Muhomori from comment #32)
> > It's meaningful to sleep for 200ms, but not for 0.2s. When you need a better
> > precision, you switch to the appropriate unit. How would you specify 1/60
> > fraction of a minute?
> 
> I still don't understand this argument. 200ms and 0.2s are the same thing,
> how can it be meaningful to sleep for 200ms but not for 0.2s?

Not quite the same. The second is an inadequate unit for specifying time with
sub-second precision. Don't misunderstand me: I'm not saying you should be
disallowed from using floating point to specify time, only that it shouldn't be
encouraged for wide use, hence it should not be included in the standard lib
but written in your own code, which is trivial: floatDur is a very small
function and could be included in the docs to help people get it right, while
warning and discouraging at the same time. I think that's the optimal choice
given the tradeoffs and goals.
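
Something like this (an untested sketch; the name floatDur follows the
discussion above, but the signature and the rounding policy are just one
possible choice, not library API):

import core.time;
import std.math : lround;

// Convert a floating-point quantity of the given unit to a Duration,
// rounding to the nearest hectonanosecond (Duration's resolution).
Duration floatDur(string units)(double value)
{
    return dur!"hnsecs"(lround(value * convert!(units, "hnsecs")(1)));
}

unittest
{
    assert(floatDur!"seconds"(0.2) == dur!"msecs"(200));
    // the "1/60 fraction of a minute" case from above:
    assert(floatDur!"minutes"(1.0 / 60) == dur!"seconds"(1));
}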

BTW, I just thought of another possible confusion: one can mistake 0.30h for
30 minutes, when it's actually 18. I use a decimal time system at work, but it
takes a considerable amount of time to get used to (it requires thinking about
time in quanta of 6 minutes) and can be confusing when you see it for the
first time.
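
Using the floatDur sketch above, the pitfall is easy to demonstrate:

unittest
{
    // 0.30h reads like "30 minutes" but is actually 18 minutes:
    assert(floatDur!"hours"(0.30) == dur!"minutes"(18));
    // half an hour is 0.50h:
    assert(floatDur!"hours"(0.50) == dur!"minutes"(30));
}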

> > Digital signature is an important example. Cryptographic security is an
> > important technology enjoying wide use.
> 
> So are thousands and thousands of other technologies being in use on your
> computer right now.

That was a reply to your assertion that hashing of timestamps is an esoteric
example. If the timestamp must be signed, you can't avoid it. If a duration
must be specified with sub-second precision, you can use the millisecond unit;
if it must be specified with week precision, you can use the week unit. I
don't see how the second alone can address all needs for duration
specification. The units+integers interface provided by the standard library
is superior to a float interface.
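
For comparison, core.time already covers all of these cases with exact integer
arithmetic (a small usage sketch):

import core.time;

unittest
{
    auto a = dur!"msecs"(200);   // sub-second precision: 200 ms, not "0.2 s"
    auto b = dur!"weeks"(2);     // week precision
    auto c = dur!"msecs"(1500);  // the "1.5 s" case, exactly
    // even 1/60 of a minute needs no floating point:
    assert(dur!"minutes"(1) / 60 == dur!"seconds"(1));
}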

> > Millisecond exists precisely for that purpose. In my experience millisecond
> > precision works fine up to a scale of a minute (even though you don't need
> > milliseconds for durations >2s).
> 
> Are you saying that the program should just accept an integer number at the
> lowest precision it needs? That's just wrong: 1) it puts abstract technical
> reasoning before user experience; and 2) it strays from a well-established
> convention.

A program should accept values in units of appropriate precision. I recommend
7zip as an example of a good interface for specifying values across a wide
range - from bytes to megabytes. I haven't checked, but I believe it doesn't
support float values, and frankly it doesn't need them.
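
A duration interface in the same spirit could look like this (a hypothetical
sketch; the parseDur name and the suffix set are made up for illustration):

import core.time;
import std.algorithm.searching : endsWith;
import std.conv : to;

// Accept an integer with a unit suffix, 7zip-style: "200ms", "90s", "2w".
Duration parseDur(string s)
{
    static Duration make(string units)(string s, size_t sufLen)
    {
        return dur!units(to!long(s[0 .. $ - sufLen]));
    }

    // Check two-letter suffixes first, so "200ms" isn't parsed as seconds.
    if (s.endsWith("ms")) return make!"msecs"(s, 2);
    if (s.endsWith("us")) return make!"usecs"(s, 2);
    if (s.endsWith("w"))  return make!"weeks"(s, 1);
    if (s.endsWith("d"))  return make!"days"(s, 1);
    if (s.endsWith("h"))  return make!"hours"(s, 1);
    if (s.endsWith("m"))  return make!"minutes"(s, 1);
    if (s.endsWith("s"))  return make!"seconds"(s, 1);
    throw new Exception("unknown duration unit in: " ~ s);
}

unittest
{
    assert(parseDur("200ms") == dur!"msecs"(200));
    assert(parseDur("2w") == dur!"weeks"(2));
}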

> > It's again a need for a precision better than a second. Though, I'd still
> > question that 1.5s is much better than 1 or 2 seconds.
> 
> I don't understand this argument. Are you saying that no program should ever
> need to sleep for 1.5 seconds?

Even if it does, specifying 1500ms is not an issue. But I'd still question its
utility: I don't see how a 25% difference can be practically noticeable.

--

