wut: std.datetime.systime.Clock.currStdTime is offset from Jan 1st, 1 AD

Jonathan M Davis newsgroup.d at jmdavisprog.com
Wed Jan 24 07:25:26 UTC 2018


On Wednesday, January 24, 2018 10:05:12 drug via Digitalmars-d wrote:
> 24.01.2018 03:15, Jonathan M Davis пишет:
> > On Tuesday, January 23, 2018 23:27:27 Nathan S. via Digitalmars-d wrote:
> >> https://dlang.org/phobos/std_datetime_systime.html#.Clock.currStdTime
> >> """
> >> @property @trusted long currStdTime(ClockType clockType =
> >> ClockType.normal)();
> >> Returns the number of hnsecs since midnight, January 1st, 1 A.D.
> >> for the current time.
> >> """
> >>
> >> This choice of offset seems Esperanto-like: deliberately chosen
> >> to equally inconvenience every user. Is there any advantage to
> >> this at all on any platform, or is it just pure badness?
> >
> > Your typical user would use Clock.currTime and get a SysTime. The badly
> > named "std time" is the internal representation used by SysTime. Being
> > able to get at it to convert to other time representations can be
> > useful, but most code doesn't need to do anything with it.
> >
> > "std time" is from January 1st 1 A.D. because that's the perfect
> > representation for implementing ISO 8601, which is the standard that
> > std.datetime follows, implementing the proleptic Gregorian calendar
> > (i.e. it assumes that the calendar was always the Gregorian calendar
> > and doesn't do anything with the Julian calendar).
> >
> > https://en.wikipedia.org/wiki/ISO_8601
> > https://en.wikipedia.org/wiki/Proleptic_Gregorian_calendar
> >
> > The math is greatly simplified by using January 1st 1 A.D. as the start
> > date and by assuming Gregorian for the whole way.
> >
> > C# does the same thing with its date/time stuff - it even uses
> > hecto-nanoseconds exactly like we do. hnsecs gives you the optimal
> > balance between precision and range that can be gotten with 64 bits (it
> > covers from about 22,000 B.C. to about 22,000 A.D., whereas IIRC, going
> > one decimal place more precise would reduce it to about 200 years in
> > either direction).
> >
> > - Jonathan M Davis
>
> I guess he meant it's inconvenient when working with C/C++, for example,
> having to add/subtract the difference between the epoch in C/C++ and the one in D
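[For reference, the range figures in the quoted message can be double-checked with plain arithmetic. A sketch in Python; the constants below are ordinary calendar math, not values taken from Phobos:]

```python
SECONDS_PER_YEAR = 365.25 * 86_400   # average year length in seconds
HNSECS_PER_SECOND = 10_000_000       # 1 hnsec = 100 ns
NSECS_PER_SECOND = 1_000_000_000

# Half the signed 64-bit range: how far a long can count in each direction.
half_range = 2 ** 63

years_at_hnsec = half_range / (HNSECS_PER_SECOND * SECONDS_PER_YEAR)
years_at_nsec = half_range / (NSECS_PER_SECOND * SECONDS_PER_YEAR)

print(f"hnsec resolution: about +/-{years_at_hnsec:,.0f} years")
print(f"nsec resolution:  about +/-{years_at_nsec:,.0f} years")
```

[This works out to roughly +/-29,000 years at hnsec resolution and roughly +/-292 years at full nanosecond resolution, so the "about 22,000" and "about 200 years" figures quoted from memory above are in the right ballpark but a bit low.]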

If you need to interact with time_t, there's SysTime.toUnixTime,
SysTime.fromUnixTime, stdTimeToUnixTime, and unixTimeToStdTime - assuming of
course that time_t is unix time. But if it's not, you're kind of screwed in
general with regards to interacting with anything else, since time_t is
technically opaque. It's just _usually_ unix time and most stuff is going to
assume that it is. There's also SysTime.toTM, though tm isn't exactly a fun
data type to deal with if you're looking to convert anything.
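For what it's worth, those unix-time conversions boil down to a fixed offset plus a scale factor. A sketch of the same arithmetic in Python (the function names mirror D's unixTimeToStdTime/stdTimeToUnixTime, but this is just the math, not the Phobos implementation):

```python
from datetime import date

HNSECS_PER_SECOND = 10_000_000   # "std time" ticks are hnsecs (100 ns)
SECONDS_PER_DAY = 86_400

# Days from January 1st, 1 A.D. to the unix epoch in the proleptic
# Gregorian calendar (Python's date ordinals use that same calendar).
DAYS_TO_UNIX_EPOCH = date(1970, 1, 1).toordinal() - date(1, 1, 1).toordinal()
UNIX_EPOCH_IN_HNSECS = DAYS_TO_UNIX_EPOCH * SECONDS_PER_DAY * HNSECS_PER_SECOND

def unix_time_to_std_time(unix_time: int) -> int:
    """Seconds since 1970 -> hnsecs since midnight, January 1st, 1 A.D."""
    return unix_time * HNSECS_PER_SECOND + UNIX_EPOCH_IN_HNSECS

def std_time_to_unix_time(std_time: int) -> int:
    """Hnsecs since January 1st, 1 A.D. -> seconds since 1970 (floor)."""
    return (std_time - UNIX_EPOCH_IN_HNSECS) // HNSECS_PER_SECOND
```

The offset comes out to 621,355,968,000,000,000 hnsecs, which is also the C# tick count for the unix epoch, since C# uses the same representation.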

But if you care about calendar stuff, using January 1st, 1 A.D. as your
epoch is far cleaner than an arbitrary date like January 1st, 1970. My guess
is that that epoch was originally selected to try to keep the values small
at a time when every bit mattered. It's not a particularly good choice
otherwise, but we've been stuck dealing with it ever since, because that's
what C and C++ continue to use and what OS APIs typically use.

- Jonathan M Davis



