wut: std.datetime.systime.Clock.currStdTime is offset from Jan 1st, 1 AD

Jonathan M Davis newsgroup.d at jmdavisprog.com
Wed Jan 24 08:12:55 UTC 2018


On Wednesday, January 24, 2018 10:50:55 drug via Digitalmars-d wrote:
> 24.01.2018 10:25, Jonathan M Davis wrote:
> > If you need to interact with time_t, there's SysTime.toUnixTime,
> > SysTime.fromUnixTime, stdTimeToUnixTime, and unixTimeToStdTime -
> > assuming of course that time_t is unix time. But if it's not, you're
> > kind of screwed in general with regards to interacting with anything
> > else, since time_t is technically opaque. It's just _usually_ unix time
> > and most stuff is going to assume that it is. There's also
> > SysTime.toTM, though tm isn't exactly a fun data type to deal with if
> > you're looking to convert anything.
> >
> > But if you care about calendar stuff, using January 1st, 1 A.D. as your
> > epoch is far cleaner than an arbitrary date like January 1st, 1970. My
> > guess is that that epoch was originally selected to try and keep the
> > values small in a time where every bit mattered. It's not a
> > particularly good choice otherwise, but we've been stuck dealing with
> > it ever since, because that's what C and C++ continue to use and what
> > OS APIs typically use.
> >
> > - Jonathan M Davis
>
> I agree with you that 1 A.D. is a better epoch than 1970. IIRC, C++11 by
> default uses 1 nsec precision, so even 64 bits are not enough to represent
> datetimes from January 1st, 1 A.D. to the present day.

Yeah. Hecto-nanoseconds is essentially optimal. Any less precise, and you're
losing precision for nothing, and any more precise, and the range of allowed
values becomes too small. I'd love to be more precise, and I'd love to have
a greater range of values so that Duration could cover SysTime.max -
SysTime.min, but unfortunately, that would mean taking Duration up to cent
(assuming that it were implemented), which would be overkill. Even one more
bit would make a big difference, but 65 isn't a power of two.

The part of C++11's date/time stuff which horrified me was the fact that
they templatized their duration type. It does make it more flexible, but it
makes it a _lot_ less user-friendly to pass them around to different APIs.
For most purposes, hecto-nanoseconds is plenty accurate with a range of
values that is more than enough, resulting in a type that can be used in
most circumstances while still being user-friendly.

> And by the way, I'd like to thank you for your great work - in comparison
> to the (at least for me) very inconsistent means C/C++ provide to handle
> date and time, std.datetime is a great pleasure to work with.

Thanks. I did it because I was sick of time-related bugs at work, and I
wanted D to get it right. By no means do I claim that std.datetime is
perfect, but IMHO, it's way better than what most languages typically have.

LOL. I just ran into a time-related bug the other day when I tried out the NHL
apps for Android and Roku. I was one timezone to the east of the live game I
was watching, and in both applications, the bar representing the timeline
for the game claimed that the game was an hour longer than it was, with the
game being an hour farther along than it was. Presumably, they did something
with local time when they should have been using UTC.

Time is one of those things that seems like it should be easy to get right
but which is surprisingly easy to get wrong.

- Jonathan M Davis



