https://xkcd.com/2867

Alt text:

It’s not just time zones and leap seconds. SI seconds on Earth are slower because of relativity, so there are time standards for space stuff (TCB, TCG) that use faster SI seconds than UTC/Unix time. T2 - T1 = [God doesn’t know and the Devil isn’t telling.]

  • ericbomb@lemmy.world · ↑124 ↓1 · 11 months ago

    We use datediff in sql and let God handle the rest.

    “Oh, but they’re in different time zones.” “Did you account for one being in daylight saving time while the other isn’t?” “Aren’t some of these dates stored in UTC and some local?”

    Are all problems I do not care about.
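
    For anyone who does care, a minimal java.time sketch of why the DST question bites (the zone and dates are only illustrative):

    import java.time.ZoneId;
    import java.time.ZonedDateTime;
    import java.time.temporal.ChronoUnit;

    public class DstGap {
        public static void main(String[] args) {
            ZoneId ny = ZoneId.of("America/New_York");
            // 2023-03-12 was the US "spring forward" date, so that calendar day is only 23 hours long.
            ZonedDateTime dayStart = ZonedDateTime.of(2023, 3, 12, 0, 0, 0, 0, ny);
            ZonedDateTime nextDay = ZonedDateTime.of(2023, 3, 13, 0, 0, 0, 0, ny);
            System.out.println(ChronoUnit.HOURS.between(dayStart, nextDay)); // prints 23, not 24
        }
    }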

        • phoneymouse@lemmy.world · ↑20 ↓1 · 11 months ago

          If your system hasn’t been upgraded to 64-bit types by 2038, you’d deserve your overflow bug

          • Appoxo@lemmy.dbzer0.com · ↑9 ↓1 · 11 months ago

            Let’s just make it 128-bit so it’s not our problem anymore.
            Hell, let’s make it 256-bit because it sounds like AES-256

            • phoneymouse@lemmy.world · ↑16 · 11 months ago

              64 bits is already enough not to overflow for 292 billion years. That’s 21 times longer than the estimated age of the universe.
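
              (Back-of-the-envelope: 2^63 s ≈ 9.2 × 10^18 s, and a year is about 3.16 × 10^7 s, so that’s ≈ 2.9 × 10^11 years, i.e. roughly 292 billion, about 21 times the ~13.8-billion-year age of the universe.)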

              • nybble41@programming.dev · ↑13 · 11 months ago

                If you want one-second resolution, sure. If you want nanoseconds a 64-bit signed integer only gets you 292 years. With 128-bit integers you can get a range of over 5 billion years at zeptosecond (10^-21 second) resolution, which should be good enough for anyone. Because who doesn’t need to precisely distinguish times one zeptosecond apart five billion years from now‽
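
                (Quick check: 2^63 ns ≈ 9.2 × 10^18 ns ≈ 9.2 × 10^9 s ≈ 292 years, while 2^127 zeptoseconds ≈ 1.7 × 10^38 × 10^-21 s ≈ 1.7 × 10^17 s ≈ 5.4 billion years.)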

                • Hamartiogonic@sopuli.xyz · ↑2 · 11 months ago

                  If you run a realistic physical simulation of a star, and you include every subatomic particle in it, you’re going to have to use very small time increments. Computers can’t handle anywhere near that many particles yet, but mark my words, physicists of the future are going to want to run this simulation as soon as we have the computers to do it. Also, the simulation should predict events billions of years in the future, so you may need to build a new time tracking system to handle that.

              • Faresh@lemmy.ml · ↑7 · 11 months ago

                With a 128-bit integer you can represent 340 undecillion seconds (or 340 sextillion, in long-scale notation), which is equivalent to about 10 nonillion years (10 quintillion, long scale). The universe will long since have stopped being able to support life by then, because stars will have stopped forming; enough time would have passed for that to happen a hundred quadrillion times over (a hundred thousand billion, long scale), assuming we start counting from the birth of the universe.
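
                (For reference: 2^128 s ≈ 3.4 × 10^38 s ≈ 1.1 × 10^31 years, and star formation is expected to wind down after roughly 10^14 years, hence the “hundred quadrillion times over”.)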

      • The_Lurker@lemmy.world · ↑5 ↓1 · 11 months ago

        Swatch’s Internet Beats make more and more sense every time Daylight Saving forces a timezone change. Why are we still using base 12 for time anyway?

  • marcos@lemmy.world · ↑67 · 11 months ago

    From Wikipedia:

    TCB ticks faster than clocks on the surface of the Earth by 1.550505 × 10^-8 (about 490 milliseconds per year)

    It’s amazing that this level of detail is relevant to anything.
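
    (Sanity check: 1.550505 × 10^-8 × 31,557,600 s per year ≈ 0.49 s, so the ~490 ms figure holds up.)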

    • KISSmyOS@lemmy.world · ↑50 · 11 months ago

      Without accounting for this, most people would no longer be able to drive anywhere they haven’t been before.

      • brianorca@lemmy.world · ↑15 ↓1 · 11 months ago

        “Wouldn’t be able to” is a bit of a stretch, since Thomas Guide maps existed long before GPS. But it wouldn’t be as easy as it is now.

          • nilloc@discuss.tchncs.de · ↑1 · 11 months ago

            I was around back then: I navigated my mom and younger brothers through a 1,600-mile road trip using an AAA TripTik. We also had AAA guidebooks for the route, for when I had to help pick a motel along the way because torrential rain slowed us down. It was a fun game of figuring out how far it was, whether it had enough stars (mom said at least 2, but the more the better), and the best price.

            It was the late 80s though, and I didn’t have a Game Boy, so it was kinda the only form of entertainment.

      • kurwa@lemmy.world · ↑9 ↓1 · 11 months ago

        I got to “The day before Saturday is always Friday” and I was like waaaa?

        • lad@programming.dev · ↑8 · 11 months ago

          I thought it was about when the Julian calendar was dropped in favour of the Gregorian, but that’s not it:

          Thursday 4 October 1582 was followed by Friday 15 October 1582

          • elvith@feddit.de · ↑9 · 11 months ago

            Also, some of the islands around the International Date Line have switched which side of the Date Line they’re on. So… they might have had a day twice or lost a whole day in the process. And maybe they didn’t change sides only once…

            E.g. see here https://youtu.be/cpKuBlvef6A

            • lad@programming.dev · ↑3 · 11 months ago

              Great video you linked; the missing Friday is in it at timestamp 22:45.

              Thursday, 29 December 2011 was followed by Saturday, 31 December 2011 in Samoa.

      • whoisearth@lemmy.ca · ↑5 ↓1 · 11 months ago

        Epoch is your friend, or use UTC. At least that’s my layman reasoning. I have no challenges working with DateTime except when I don’t know the underlying conditions applied from the source code.

    • randy@lemmy.ca · ↑10 · 11 months ago

      I really wish that list would include some explanations about why each line is a falsehood, and what’s actually true. Particularly the line:

      The software will never run on a space ship that is orbiting a black hole.

      If the author has proof that some software will run on a space ship that is orbiting a black hole, I’d be really interested in seeing it.

      • nybble41@programming.dev · ↑12 · 11 months ago

        Technically isn’t the Earth itself a sort of space ship which is orbiting (…a star which is orbiting…) the black hole at the center of the Milky Way galaxy? Not really close enough for time dilation to be a factor, but still.

      • elvith@feddit.de · ↑7 · 11 months ago

        All links to the original article are dead and even archive.org doesn’t have a capture. I guess the argument is along the lines of “it might not be relevant when you’re scripting away some tasks for your small personal projects, but when you’re working on a widely used library or tool, one day it might end up on a space vessel to explore whatever.”

        E.g. my personal backup script? Unlikely. The Linux kernel? Somewhat plausible.

      • Aceticon@lemmy.world · ↑2 · 11 months ago

        Well, in a very strict sense one can’t really say “never” (unless you can see into the future), but it’s probably safe to go along with “it’s highly unlikely, and if it does happen I’ll either fix it then or be long dead and won’t care”.

      • Hamartiogonic@sopuli.xyz · ↑2 · 11 months ago

        It’s a programmer thing. As you’re typing the code, you may suddenly realize that the program needs to assume certain things to work properly. You could assume that time runs at a normal rate, as opposed to something completely wild when traveling close to the speed of light or when orbiting a black hole.

        In order to keep the already way too messy code reasonably simple, you decide that the program assumes you’re on Earth. You leave a comment in the relevant part of the code saying that this part shouldn’t break as long as you’re not doing anything too extreme.

    • lad@programming.dev · ↑10 · 11 months ago

      This one is good (or evil, depends on how you see it):

      Human-readable dates can be specified in universally understood formats such as 05/07/11.

      • elvith@feddit.de · ↑9 · 11 months ago

        That one’s really good.

        Which one is it?

        • July 5th 2011
        • May 7th 2011
        • July 11th 2005
        • November 7th 2005

        And is it 2011/2005 or rather 1911/1905, 1811/1805,…?
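
        To make the ambiguity concrete, a quick java.time sketch (the three patterns are just the usual suspects):

        import java.time.LocalDate;
        import java.time.format.DateTimeFormatter;

        public class AmbiguousDate {
            public static void main(String[] args) {
                String raw = "05/07/11";
                // Same string, three common patterns, three different dates.
                System.out.println(LocalDate.parse(raw, DateTimeFormatter.ofPattern("MM/dd/yy"))); // 2011-05-07
                System.out.println(LocalDate.parse(raw, DateTimeFormatter.ofPattern("dd/MM/yy"))); // 2011-07-05
                System.out.println(LocalDate.parse(raw, DateTimeFormatter.ofPattern("yy/MM/dd"))); // 2005-07-11
            }
        }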

    • Kethal@lemmy.world · ↑8 · 11 months ago

      Does anyone know what is untrue about “Unix time is the number of seconds since Jan 1st 1970.”?

      • icydefiance@lemm.ee · ↑27 · 11 months ago

        When a leap second happens, unix time decreases by one second. See the section about leap seconds here: https://en.m.wikipedia.org/wiki/Unix_time

        As a side effect, this means some unix timestamps are ambiguous, because the timestamps at the beginning and the end of a leap second are the same.

        • nybble41@programming.dev · ↑4 · 11 months ago

          It might be more accurate to say that Unix time is the number of days since Jan 1st, 1970, scaled by 24×60×60. Though it gets a bit odd around the actual leap second since they aren’t spread over the whole day. (In some ways that would be a more reasonable way to handle it; rather than repeating a second at midnight, just make all the seconds slightly longer that day.)
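
          A rough sketch of that definition (the helper name is made up, and it deliberately ignores what happens inside an actual leap second):

          import java.time.LocalDate;
          import java.time.LocalDateTime;
          import java.time.temporal.ChronoUnit;

          public class UnixFromCivil {
              // Unix time reconstructed as "whole days since 1970-01-01, times 86 400,
              // plus seconds into the current day". Every civil day counts as exactly
              // 86 400 s, which is why a leap second never gets a number of its own.
              static long unixTime(LocalDateTime utc) {
                  long days = ChronoUnit.DAYS.between(LocalDate.of(1970, 1, 1), utc.toLocalDate());
                  return days * 86_400 + utc.toLocalTime().toSecondOfDay();
              }

              public static void main(String[] args) {
                  System.out.println(unixTime(LocalDateTime.of(2017, 1, 1, 0, 0, 0))); // 1483228800
              }
          }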

  • chuck@lemmy.ca · ↑49 · 11 months ago

    Ah, I’ve gotten to the point where I have to define what “frame” and epoch each time base is in before I’ll touch the representation of time (Unix, Gregorian, etc.). To be honest, I’m probably just scratching the surface of the time problem.

    Hell, probably the reason we haven’t seen time travellers is that we suck at tracking time: you’d need to know your time and place to very good precision to travel to a given point, and we can’t say where and when that is accurately enough to pick a landing spot. And people don’t want to land inside the Earth or 10,000 km away from a stable orbit. Maybe some writer can build that out for a time travel book, or find a reason to discount it, lol.

    • kurwa@lemmy.world · ↑20 ↓1 · 11 months ago

      I recall a short story like that where someone died because they time traveled, but didn’t account for position.

    • niktemadur@lemmy.world · ↑5 · 11 months ago

      Then there’s continental drift, which as Indiana Jones reminded us this past summer, Archimedes didn’t know about when he built his time machine.

      Pet peeve: brushing aside the time travel fantasy element, there is not a single shred of evidence of any type of connection between Archimedes and the Antikythera Mechanism.

      As if the only person clever enough in Ancient Greece was that one famous dude from Syracuse.
      Ionians: “Are we a joke to you?”

    • Omniraptor@lemm.ee · ↑1 · 11 months ago

      Could you ELI5 what frame and epoch are? I don’t get why Unix timestamps aren’t an adequate way to store time; they seem pretty easy and intuitive.

  • Alien Nathan Edward@lemm.ee · ↑35 · 11 months ago

    I just spent two days debugging a reporting endpoint that takes in two MM-YYYY parameters and tries to pull info between the first day of the month for param1 and the last day of the month for param2, and I ended up having to set my date boundaries as

    LocalDate startDate = new LocalDate(param1.getYear(), param1.getMonth(), 1); // first of param1's month - pretty straightforward, right?

    // bump the month by one (rolling December over to January of the next year),
    // set endDate to the first of that month, then subtract one day
    int endMonth = param2.getMonth() == 12 ? 1 : param2.getMonth() + 1;
    int endYear = param2.getMonth() == 12 ? param2.getYear() + 1 : param2.getYear();
    LocalDate endDate = new LocalDate(endYear, endMonth, 1).minusDays(1);

    This is extraordinarily simple for humans to understand intuitively, but coding it requires accounting for a bunch of backward edge/corner case garbage. The answer, of course, is to train humans to think in Unix epoch time.

        • Jarix@lemmy.world · ↑5 · 11 months ago

          Would you mind trying to explain (ELI5 style) what you did before and why you’re excited about this new method, for those of us who don’t understand code?

          • Alien Nathan Edward@lemm.ee · ↑8 · 11 months ago

            It does, in a way that’s been reviewed, vetted and tested by a lot of people, the thing that I’m trying to do with code that’s only ever been seen by me and one other guy and has been tested to the best of my ability, which I hope is quite good, but one person can easily miss edge cases and weird coincidences.

            • And009@reddthat.com · ↑2 · 11 months ago

              Tried and tested. Now gotta brush up those searching skills and use an LLM to get your work done quicker.

            • Jarix@lemmy.world · ↑2 ↓1 · 11 months ago

              So how is the new thing different/better (other than fewer lines, I guess), if you don’t mind me asking?

              • Alien Nathan Edward@lemm.ee · ↑4 · 11 months ago

                It’s simpler and a lot easier for another engineer to look at and understand later, so they can verify that it’s right, or change it if it’s wrong or we decide to do something a little bit different. It’s also been reviewed and tested by a lot of people working in a lot of cases that are all a little bit different from one another, so the odds that their code is correct are better than the odds that my code is correct, all other things being equal.

              • hikaru755@feddit.de · ↑4 · 11 months ago

                It’s easier to understand, easier to review for correctness, and less likely to cause problems with additional changes in the future. Even though it sounds counterintuitive, software developers generally try to write as little code as possible. Any code you write is a potential liability that has to be maintained, so if you can instead just call code that others have already written and that has been tested, you’ll want to do that. (Note that “less code” doesn’t mean fewer lines of code, it means less logical complexity, which is often, but not always, also less in terms of characters/lines)

                • Jarix@lemmy.world · ↑2 ↓1 · 11 months ago

                  So, like my English teacher taught me: Keep It Stupid Simple (though he would say “keep it simple, stupid” to some people in class, I’m just realizing now, 20+ years later).

          • drislands@lemmy.world · ↑5 · 11 months ago

            To break it down a bit further, the code that was provided is specifically trying to get the last day of a month, which I’ll call Month X since it will vary. The code is doing these things, in this order:

            1. Get the month after Month X
            2. If Month X is 12 (aka December) get Month 1 instead (aka January)
            3. Get the Date that is day 1 of the Month from step 2
            4. Get the Date that is one day before the Date from step 3

            All this to get the last day of the month from Month X. The reason they did it this way is so they didn’t have to say “Is this February? Then get day 28. Is this January/March/etc? then get day 31.” and so on.

            The code that the other user provided will instead get the last day of Month X without having to do all those steps. It’s doing something in the background to get the same data, but the coder doesn’t have to worry about exactly how because they can trust it will work as expected.

            It ultimately boils down to the user carving out a round piece of wood, fitting it on an axle and bolting it on, only to find that someone already has cheap wheels for sale that are more stable than what they just made.
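
            I don’t have the other user’s exact snippet in front of me (it isn’t quoted in this thread), but with java.time the whole dance collapses to something like this sketch:

            import java.time.LocalDate;
            import java.time.YearMonth;

            public class LastDayOfMonth {
                public static void main(String[] args) {
                    // YearMonth already knows month lengths and leap years,
                    // so "last day of Month X" is a single call.
                    LocalDate endDate = YearMonth.of(2024, 2).atEndOfMonth();
                    System.out.println(endDate); // 2024-02-29
                }
            }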

              • drislands@lemmy.world · ↑2 · 11 months ago

                You’re welcome! A big part of coding is finding how other people solved the problems you’re solving and finding how to incorporate their work into yours.

      • Alien Nathan Edward@lemm.ee · ↑9 · 11 months ago

        I was transcribing it from memory and that exact problem cost me like two hours when I was writing it the first time. Well spotted, now write me a unit test for that case.

        • drislands@lemmy.world · ↑2 · 11 months ago

          Y’know, I legitimately said to myself “I bet they were writing that from memory and just forgot the edge case. I wonder if that was a problem when doing it originally?” before I wrote that comment. 😂 Time to get some Spock tests set up!

    • Kogasa@programming.dev · ↑4 · 11 months ago

      Unix epoch time in UTC, making sure that your local offset and drift are current at the time of conversion to UTC…

    • Strykker@programming.dev · ↑3 ↓1 · 11 months ago

      All dates and times shall be stored and manipulated in Unix time. Only convert to a readable format at the top of the UI, and forget trying to parse user inputs :P that’s just impossible
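
      A minimal sketch of that split, with the stored value and the zone as pure placeholders:

      import java.time.Instant;
      import java.time.ZoneId;
      import java.time.format.DateTimeFormatter;

      public class DisplayOnly {
          public static void main(String[] args) {
              long storedUnixSeconds = 1_700_000_000L; // what the database keeps
              // Convert to something human-readable only at the presentation edge.
              DateTimeFormatter fmt = DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss zzz");
              String forHumans = Instant.ofEpochSecond(storedUnixSeconds)
                      .atZone(ZoneId.of("Europe/Berlin"))
                      .format(fmt);
              System.out.println(forHumans);
          }
      }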

    • chayleaf@lemmy.ml · ↑2 · 11 months ago

      Unix epoch time is wrong too, as it doesn’t include leap seconds, meaning your time difference will be off by up to 15 seconds.

  • BeautifulMind ♾️@lemmy.world · ↑29 · 11 months ago

    LOL whenever I have to work with DateTime systems that try to account for every possibility (and fail trying) I am reminded that in some disciplines, it’s acceptable to simplify drastically in order to do ‘close enough’ work.

    I mean, if spherical cows are a thing because that makes the math of theoretical physics doable, why not relativity-free or just frame-constant date-time measures that are willing to ignore exotic edge cases like non-spherical livestock?

    • Kogasa@programming.dev · ↑3 · 11 months ago

      Because “relativity” isn’t even close to your biggest problem with time. The way we communicate time changes historically, unpredictably, without obvious record. The only way to know what time you’re talking about is to know exactly how you got your information. What location, measured at what time relative to recorded changes in the local time zone, with how much drift relative to the last time you synchronized to which ntp server, and so on. These things easily account for hours or days of error, not just nanoseconds.

  • MxM111@kbin.social · ↑25 · 11 months ago

    Not even getting into general relativity, which this comic invokes for time slowing down due to gravity: events happen not just at a time but also at a particular location in space, relative to a particular inertial frame. Not specifying that, and not specifying the inertial frame in which the final answer is wanted, makes the calculation impossible even in special relativity, without any effects of gravity.

    • steventhedev@lemmy.world · ↑28 ↓1 · 11 months ago

      Here’s the shortlist of horrors I’ve had to deal with in my career:

      • Mixed US/ROW short date formats - DD/MM/YY, MM/DD/YY
      • mixed timezones in the same column
      • the wrong timezone (marked as PDT but actually UTC, or sometimes the other way around)
      • clock drift
      • timezones again…because timezones suck
      • historical timezones
      • NTP configurations

      Things I’ve read about but haven’t needed to deal with personally:

      • leap seconds
      • clock slew vs. step
      • hardware clocks
      • PTP

      The one thing I really don’t care about is relativity

  • Limonene@lemmy.world · ↑12 · 11 months ago

    C++ user with operator overloading: “T2 minus T1.”

    Let someone else implement the class. There’s probably a library for it.

    • worldsayshi@lemmy.world · ↑2 · 11 months ago

      Is there a straightforward way to know which overload will be used, or is it just roulette every time you add a sketchy new library to your build?

    • Kogasa@programming.dev · ↑1 · 11 months ago

      That’s fine if T1 and T2 include a datetime with the exact local time zone. A simple timestamp, or a timestamp + UTC offset, won’t work.