• MystikIncarnate@lemmy.ca · 5 hours ago

    In this thread: mostly people who don’t know how timekeeping works on computers.

    This is already something we’re solving for. At this point it’s 90% ready to go, or better.

    See: https://en.m.wikipedia.org/wiki/Year_2038_problem

    Timekeeping is commonly stored as a binary number that represents how many seconds have passed since midnight UTC on January 1st, 1970, the Unix epoch (1970-01-01T00:00:00Z). The year 10,000 isn’t 2^x seconds after the epoch for any integer x, so the counter doesn’t roll over at that calendar boundary, and any discrepancy between using “year” as a 4-digit number versus a 5-digit number is entirely a display issue (front end). The thing that does the actual processing, storing, and evaluation of time gives absolutely no fucks about what “year” it is, because the current datetime is just a binary number counting the seconds since the epoch.
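
    To make that concrete, here’s a minimal Python sketch (my own illustration, not anyone’s production code): the OS hands you a bare count of seconds, and a 4- or 5-digit year only appears once that number is rendered for display.

    ```python
    import time
    from datetime import datetime, timezone

    # What the machine actually tracks: seconds since 1970-01-01T00:00:00Z.
    # It is just a number; there is no "year" stored anywhere in it.
    now = time.time()
    print(now)  # a plain float, somewhere around 1.7e9 in the mid-2020s

    # The year only shows up when that number is formatted for humans.
    print(datetime.fromtimestamp(now, tz=timezone.utc).isoformat())
    ```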

    Whether that is displayed to you correctly or not doesn’t matter in the slightest. The machine will keep functioning even if you see some weird shit, like the year showing up as “99100” because some lazy person hard-coded the first two digits as “99”, then took the current year, subtracted 9900, and concatenated whatever was left: that scheme shows the year 9999 as “9999” (99 plus the leftover “99”) and the year 10000 as “99100” (99 plus the now three-digit leftover “100”).
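
    In code, the kind of lazy front-end formatting described above would look something like this (a hypothetical sketch in Python; the function name is made up):

    ```python
    def buggy_year_display(year: int) -> str:
        # Hard-code the leading "99", then tack on whatever (year - 9900) leaves over.
        # This is the Y10K cousin of the classic "19" + (year - 1900) = "19100" bug.
        return "99" + str(year - 9900)

    print(buggy_year_display(9999))   # "9999"  -- looks right, purely by accident
    print(buggy_year_display(10000))  # "99100" -- ugly, but nothing actually breaks
    ```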

    I get that it’s a joke, but the joke isn’t based on any technical understanding of how timekeeping works in technology.

    The whole Y2K thing was a bunch of fear-mongering horse shit. For most systems, the year would have shown as “19-100”, 1900, or simply “00” (or some variant thereof).

    • friendlymessage@feddit.org · 1 hour ago

      Y2K was definitely not only fear-mongering. Windows systems did not use Unix timestamps, and neither did many embedded systems or COBOL programs. So your explanation isn’t relevant to this problem specifically, and those systems were absolutely affected by Y2K because they stored time differently. The reason we didn’t have a catastrophic event was the preventative action that was taken.

      Nowadays you’re right: there will be no Y10K problem, mainly because storage is not the constraint it was in the 60s and 70s, when the affected systems were designed. Back then every bit of storage was precious and was therefore omitted when not necessary. Today there’s no issue, even for embedded systems, in setting aside 64 bits for timekeeping, which moves the problem to 292277026596-12-04 15:30:08 UTC (at one-second precision). By then we just add another bit to double the range, or we’re dead because the sun exploded.
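
      For anyone who wants to sanity-check that date, the back-of-the-envelope arithmetic is simple (Python, approximating with the average Gregorian year, so it only lands near the exact timestamp):

      ```python
      max_seconds = 2**63 - 1                        # a signed 64-bit counter of seconds
      seconds_per_year = 365.2425 * 86400            # average Gregorian year
      print(1970 + max_seconds / seconds_per_year)   # ~292277026596, matching the year above
      ```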

    • FooBarrington@lemmy.world · 3 hours ago

      My brother in Christ, there’s more to time than just storing it. Every datetime library I’ve ever used documents formatting/parsing support for at most four year digits. If they suddenly also supported five digits, I guarantee it would lead to bugs in handling existing dates, since not every date format could still be parsed unambiguously.
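
      Python’s standard library is one concrete example of that ceiling (checked against CPython; the try/except scaffolding is just for illustration):

      ```python
      import datetime

      print(datetime.MAXYEAR)  # 9999 -- the calendar type itself stops at four digits

      try:
          datetime.datetime(10000, 1, 1)
      except ValueError as err:
          print(err)  # year 10000 is out of range

      try:
          # strptime's %Y matches exactly four digits here, so a five-digit
          # year string does not even parse.
          datetime.datetime.strptime("10000-01-01", "%Y-%m-%d")
      except ValueError as err:
          print(err)
      ```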

      It won’t help you that time is stored perfectly if none of your applications support it.

      Regarding Y2K, it wasn’t horse shit - thousands upon thousands of developer hours were invested to prevent these issues before they occurred. Had they not done so, a bunch of systems would have broken, because parsing time isn’t just about displaying 19 or 20.

      • friendlymessage@feddit.org · 1 hour ago

        I would hope that these kinds of parsers are not used in critical applications where they could actually lead to catastrophic events; that’s definitely different from Y2K. There would be bugs, yes, but quite fixable ones.

        > Regarding Y2K, it wasn’t horse shit - thousands upon thousands of developer hours were invested to prevent these issues before they occurred. Had they not done so, a bunch of systems would have broken, because parsing time isn’t just about displaying 19 or 20.

        “There’s no glory in prevention.” I guess it’s hard to grasp nowadays that mankind at some point actually tried to stop catastrophes from happening, and succeeded.

        • FooBarrington@lemmy.world · 23 minutes ago

          Even if such parsers aren’t used directly in critical systems, they’ll surely be used in the supply chains of critical systems. Your train won’t randomly derail, but disruptions in the supply chain can cause repair parts not to be delivered, that kind of thing.

          And you can be certain such parsers are used in almost every application dealing with datetimes that hasn’t been specifically audited or secured. 99% of software is held together with duct tape.