• minimalfootprint@discuss.tchncs.de
    ↑61 ↓1 · 9 months ago

    Y2K is similar. Most people will remember not much happening at all. Lots of people worked hard to solve the problem and prevent disaster.

    • ThatWeirdGuy1001@lemmy.world
      ↑11 ↓26 · edited · 9 months ago

      Was there ever really a threat to begin with? The whole thing sounds like Jewish space lasers to me.

      Edit: Gotta love getting downvoted for asking a question.

      • Verxiq@lemmynsfw.com
        ↑23 · 9 months ago

        Yes. Most administrative programs (think hospitals, municipalities, etc.) stored the year as only two digits. Yesterday's timestamp would suddenly read as 99 years in the future, since the new year is 00. Imagine every to-do item of the last 20-odd years suddenly being pushed onto your to-do list. Timers set to fire every x amount of time couldn't check when something last happened. Time-critical nuclear safety mechanisms, computers getting stuck from data overload: everything needed to be looked at to determine the risk.

        So you take all the dates, widen the storage to hold the extra digits, add 1900 to the years, and you are set. In principle a very straightforward fix, but it takes time to implement properly. Because everyone was made aware of the potential issue, IT professionals could more easily lobby for the time and funds to make the necessary changes before things went awry.
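
        A minimal sketch of the "windowing" variant of that fix (the pivot year of 70 is purely illustrative; real systems picked whatever cutoff fit their data):

        // Expand a stored two-digit year to four digits.
        // 70-99 are taken to mean 1970-1999, 00-69 to mean 2000-2069.
        func expandYear(_ twoDigit: Int, pivot: Int = 70) -> Int {
            return twoDigit >= pivot ? 1900 + twoDigit : 2000 + twoDigit
        }

        print(expandYear(99), expandYear(3))   // 1999 2003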

        • ThatWeirdGuy1001@lemmy.world
          ↑7 · 9 months ago

          That’s fuckin wild and seems like a massive oversight.

          Did they just not expect us all to live that long or did they just not think of it at all?

          • Ithi@lemmy.ca
            ↑13 · 9 months ago

            Yeah, I would imagine poor/lazy planning, or they thought their tools would be replaced by then, or that computers were just a fad and there was no way they'd still be in use in the year 2000.

          • withabeard@lemmy.world
            ↑11 · 9 months ago

            Depends on the “they”…

            But generally, back in the day, data storage, memory and processing power were expensive. Many times more expensive than they are now. Storing a year with two digits instead of four was a saving worth making. Over time, some people just kept doing what they had been doing. Some people just learned from mentors to do it that way, and kept doing it.

            It was somewhat expected that systems would improve and that over time the saving wouldn't be needed. Which was true: by the year 2000, "modern" systems didn't need to make that saving. But there was a lot of old code and old systems still running just fine that hadn't been updated to modern code/hardware. It became a bit of a rush job at the end to make the same upgrade.

            There is a similar issue coming up in the year 2038. A lot of computing platforms store dates as the number of seconds since the beginning of 1970-01-01 UTC. As I type this comment there have been 1,710,757,161 seconds since that date. It's a simple way to store time/date in a way that can be converted back to a human-readable format quite easily. I've written a lot of code which does exactly this. I've also written a lot of code and data storage systems that store this number as a 32-bit integer. Without drilling down into what that means, the limit of that data storage type will be a count of 4,294,967,296. That means at 2038-01-19 03:14:07 UTC, some of my old code will break, because it won't be able to properly store the dates.

            I no longer work for that employer, and I no longer maintain that code. Back when I wrote that code, a 32-bit integer made sense. If I wrote new code now, I would use a different data type that would last longer. If my old code is still in use, then someone is going to have to update it. Because of the way business, software and humans work, I don't expect anyone will patch that code until sometime around the year 2037.

            • BorgDrone@lemmy.one
              ↑4 · 9 months ago

              Without drilling down into what that means, the limit of that data storage type will be a count of 4,294,967,296.

              A little nitpick: the count at that time will be 2,147,483,647. time_t is usually a signed integer.
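
              For anyone curious, a quick sketch of where that signed limit lands (Foundation's Date is only used here to print the moment):

              import Foundation

              // The largest second count a signed 32-bit integer can hold,
              // and the moment it corresponds to.
              let limit = Int32.max                                // 2,147,483,647
              print(Date(timeIntervalSince1970: TimeInterval(limit)))
              // 2038-01-19 03:14:07 +0000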

            • cqthca@reddthat.com
              ↑2 · edited · 9 months ago

              I often wonder what happened to the code I wrote in 2010 and used for production coordination, which was working fine when I retired (2018). I figured the minute I left, the hotshot kids would want to upgrade to their own styles. Not everyone liked it bc it wasn't beautiful, but no one could say it wasn't functional, so it persisted. I was busy learning to design and assemble CNC routers, but it worked and I didn't have time to make a selection of backgrounds & banners. It's just Excel, AutoCAD, & Access using VBA; everyone says they are going to deprecate VBA but, alas, people still want it. I remember Autodesk announcing the deprecation of VBA c. 2012, and I just looked, and I guess they changed their mind, bc there are VBA modules available:

              "What is the future of VBA?" (Stack Overflow, 14 years ago): https://stackoverflow.com/questions/1112491/what-is-the-future-of-vba
              "Download the Microsoft VBA Module for AutoCAD" (Autodesk, Feb 7, 2024): https://www.autodesk.com/support/technical/article/caas/tsarticles/ts/3kxk0RyvfWTfSfAIrcmsLQ.html - links to the VBA module downloads for their products; you pick the appropriate download, close all programs, and run the self-extracting EXE.

              sometimes legacy methods last longer bc no one wants to be a hotshot.

          • SpaceCowboy@lemmy.ca
            ↑3 ↓4 · 9 months ago

            The Mayans figured a calendar that only went to 2012 would be good enough. And they were right: their civilization didn't exist anymore in 2012. The only relevance their calendar system had in 2012 was that some people felt like it was a prophecy about the end of the world. Nope, it was just an arbitrary date the Mayans rightly assumed would be far enough away that it wouldn't matter.

            While I suppose you could make a date format that was infinitely expandable, it would take more processing power and is really unnecessary.

            Anyway, we've got until 2038 before we have to deal with a popular date format running out of bits. We'll probably be in some kind of Mad Max post-apocalyptic world before then, so it won't matter.

            • Ultraviolet@lemmy.world
              ↑10 · edited · 9 months ago

              That's a misconception. The Maya (not Mayan, that's the language) long count for December 20, 2012 was 12.19.19.17.19. December 21, 2012 was 13.0.0.0.0. Today is 13.0.11.7.4. It continues the same way indefinitely; it's just the number of days since some arbitrary date (August 11, 3114 BCE if you're curious) in base 20, with the second-to-last digit in base 18. That seems odd at first, but it rather cleverly means the third digit from the right (worth 360 days) can stand in as a rough approximation of a year, and the one above it (worth about 20 years) as roughly a generation. Now, October 13, 4772 could be seen as an endpoint, but there's nothing that says it can't be extended with one more digit to 1.0.0.0.0.0, and then you're good for another 150,000 years or so.
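
              A rough sketch of how the mixed base works, taking a day count from the 0.0.0.0.0 epoch as input (the correlation to our calendar is left out on purpose):

              // Convert days since 0.0.0.0.0 into long count digits.
              // Every place is base 20 except the uinal (second from the
              // right), which is base 18, so one tun = 18 * 20 = 360 days.
              func longCount(daysSinceEpoch: Int) -> [Int] {
                  var d = daysSinceEpoch
                  let kin = d % 20;   d /= 20
                  let uinal = d % 18; d /= 18
                  let tun = d % 20;   d /= 20
                  let katun = d % 20; d /= 20
                  return [d, katun, tun, uinal, kin]  // baktun first
              }

              print(longCount(daysSinceEpoch: 13 * 144_000))         // [13, 0, 0, 0, 0]
              print(longCount(daysSinceEpoch: 13 * 144_000 + 4104))  // [13, 0, 11, 7, 4]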

              Now, there was a creation myth that said 0.0.0.0.0 was the previous world's 13.0.0.0.0, but there was no recorded belief that this was any sort of recurring cycle; in fact, plenty of Maya texts predicted astronomical events millennia past 2012. The idea that it was recurring was probably borrowed from the similar Greek construct of ekpyrosis, which doesn't specify any sort of time frame.

        • SpaceCowboy@lemmy.ca
          ↑3 ↓8 · 9 months ago

          You’re saying “imagine” a lot there.

          Were there concrete examples of critical software that actually would've failed? At the time I remember there was one consultant who was on the news constantly, saying everything from elevators to microwaves would fail on Y2K. Of course, this was creating a lot of business for his company.

          When you think about it storing a date with 6 bytes would take more space than using Unix time which would give both time and date in four bytes. Y2K38 is the real problem. Y2K was a problem with software written by poor devs that were trying to save disk space by actually using more disk space than needed.

          And sure, a lot of software needed to be tested to be sure someone didn't do something stupid. But a lot of it was indeed an exaggeration. You have to reset the time on your microwave after a power outage but not the date; common sense tells you your microwave doesn't care about the year. And when a reporter actually followed up with the elevator companies, it was the same deal. Most software simply doesn't just fail when it's run in an unexpected year.

          If someone wrote a time-critical safety mechanism for a nuclear reactor that involved parsing a janky homebrew time format from a string, then there are serious problems in that software way beyond Y2K.

          In the instances of the Y2K bug I saw in the wild, the software still worked; it just displayed the date wrong.

          Y2K38 is the really scary problem, because people who don't understand binary numbers don't understand it at all. And even a lot of people in the technology field think it's not a problem because "computers are 64-bit now." Doesn't matter how many bits the processor has; it's only the size that's compiled and stored that counts. And unlike some janky parsed string format, Unix time is a format I could see systems at power plants actually using.
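
          A tiny sketch of that point (the values are made up; what matters is that the stored field is 32-bit, not how wide the CPU is):

          import Foundation

          // On a 64-bit machine, with a 64-bit language integer...
          let now = Int64(Date().timeIntervalSince1970)
          let after2038 = Int64(2_200_000_000)   // a moment past the Jan 2038 rollover

          // ...a value still has to fit the 32-bit field it gets stored in.
          print(Int32(exactly: now) != nil)        // true, it still fits (for now)
          print(Int32(exactly: after2038) != nil)  // false, the stored field can't hold it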

          • AA5B@lemmy.world
            ↑9 · edited · 9 months ago

            Some of the software at my employer at the time would have failed. In particular, I fixed some currency trading software.

          • BorgDrone@lemmy.one
            ↑2 ↓1 · 9 months ago

            When you think about it storing a date with 6 bytes would take more space than using Unix time which would give both time and date in four bytes. Y2K38 is the real problem. Y2K was a problem with software written by poor devs that were trying to save disk space by actually using more disk space than needed.

            You don’t store dates as Unix time. Unix timestamps indicate a specific point in time. Dates are not a specific point in time.

            • SpaceCowboy@lemmy.ca
              ↑2 ↓1 · 9 months ago

              You also don’t store dates in a string that you’ll have to parse later. I’ve had to deal with MM-DD-YYYY vs. DD-MM-YYYY problems more times than I can count.

              And you understand that you could have a date in unix time and leave the time to be midnight, right? You’d end up with an integer that you could sort without having to parse every goddamn string first.

              And for God’s sake if you insist on using strings for dates at the very least go with something like YYYY-MM-DD. Someone else may someday have to deal with your shit code, at the very least make the strings sortable FFS.
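
              A small sketch of why that ordering matters, with made-up sample dates:

              // DD-MM-YYYY strings don't sort chronologically...
              print(["02-01-2024", "15-12-2023", "31-03-2024"].sorted())
              // ["02-01-2024", "15-12-2023", "31-03-2024"]  (wrong order)

              // ...but YYYY-MM-DD strings do, because lexicographic order matches
              // chronological order when the biggest unit comes first.
              print(["2024-01-02", "2023-12-15", "2024-03-31"].sorted())
              // ["2023-12-15", "2024-01-02", "2024-03-31"]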

              • cqthca@reddthat.com
                ↑1 · 9 months ago

                You don’t have a line that checks the format and auto converts to your favorite?

              • BorgDrone@lemmy.one
                ↑1 · edited · 9 months ago

                You also don’t store dates in a string that you’ll have to parse later

                Depends. If the format is clearly defined, then there's no problem. Or you could use a binary format. The point is that you store day/month/year separately, instead of a Unix timestamp.

                And you understand that you could have a date in unix time and leave the time to be midnight, right?

                No, you can’t.

                First of all, midnight in what timezone? A timestamp is a specific instant in time, but dates are not, the specific moment that marks the beginning of a date depends on the timezone.

                Say you store the date as midnight in your local timezone. Then your timezone changes, and all your stored dates are incorrect. And before you claim timezones rarely change, they change all the time. Even storing it as the date in UTC can cause problems.

                You use timestamps for specific instants in time, but never for storing things that are in local time. Even if you think you are storing a specific instant in time, you aren't. Say you make an appointment in your agenda at 14:00 local time, and you store this as a Unix timestamp. It's a specific instant in time, right? No, it's not. If the timezone rules change so that, for example, DST goes into effect at a different time, your appointment could suddenly be an hour off, because that appointment was not supposed to be at that instant in time, it was supposed to be at 14:00 in the local timezone, so if the timezone changes the absolute point in time of that appointment changes with it.
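
                A small sketch of the difference, with Swift's DateComponents standing in for "store the components" (the appointment and timezones are made up for illustration):

                import Foundation

                // Wall-clock components: "14:00 on 2024-03-18", no fixed instant implied.
                let appointment = DateComponents(
                    year: 2024, month: 3, day: 18, hour: 14, minute: 0)

                // It only becomes an absolute instant once a timezone (and its
                // current rules) is applied; a different zone gives a different instant.
                var cal = Calendar(identifier: .gregorian)
                cal.timeZone = TimeZone(identifier: "Europe/Amsterdam")!
                print(cal.date(from: appointment)!)   // 2024-03-18 13:00:00 +0000
                cal.timeZone = TimeZone(identifier: "America/New_York")!
                print(cal.date(from: appointment)!)   // 2024-03-18 18:00:00 +0000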

                • SpaceCowboy@lemmy.ca
                  ↑1 · 9 months ago

                  First of all, midnight in what timezone? A timestamp is a specific instant in time, but dates are not, the specific moment that marks the beginning of a date depends on the timezone.

                  What are you talking about? The same problems apply no matter which format you're talking about. Depending on which side of the date line your timezone is on, you could wind up with different dates.

                  Does your janky string format of "18-03-2024" suddenly have to become aware of the timezone if I tack a "0:00" onto the end of it? Or maybe you will always have timezone issues no matter what the precision of the time you want to store.

                  I think you've got it in your mind that you can't do anything other than Timestamp=getdate(), and that if it's a date only you have to use a string. That's not the case. You can indeed translate a date into any number of formats, and unix time is one of them. I assure you that 1710720000 will translate to the same janky "18-03-2024" format you're using every single time unless you deliberately mess with timezones in code where you admit that you don't want to deal with timezones. But your string jankiness breaks just as easily when someone parses it with MM-dd-yyyy, and that may not even require anyone to deliberately break it. Depending on the library that's being used and the localization settings of the OS, this can happen automatically. If your code will break because someone has different OS settings than yours, you are writing bad code.

                  If the goal is to save space, then your format uses 10 bytes, while the timestamp uses 4 (with Y2K38 problems) or 8 with 64-bit epoch time. If you're not too worried about saving space (you really shouldn't be these days), then use the appropriate structs defined by the language you're using and the DB you're using.

                  Even this would be better than a string:

                  struct {
                      int  year;
                      unsigned char month;
                      unsigned char day;
                  }

                  Six bytes as opposed to 10, and there would be no confusion over the dd and MM parts of the string. It's still shit (use existing date libraries instead) but it still won't have as many problems as what you're doing. Seriously, anything is better than just dumping a date into a string. And as I say, using the dd-MM-yyyy format is bad for multiple reasons.

                  Though congratulations, you've convinced me that Y2K might've been a bigger problem than I thought, given how adamant you are about repeating the same kind of mistakes that caused those issues. I guess even when there are very obvious problems with how someone's doing things, they will insist on doing it that way even when all the problems are pointed out. I can imagine someone in the 80s and 90s pointing out the Y2K problem to someone writing the code and getting some arrogant bullshit back about how only mid-level programmers worry about that. "Experts put dates in strings LOL!"

                  • BorgDrone@lemmy.one
                    ↑1 · edited · 9 months ago

                    Does your janky string format of "18-03-2024" suddenly have to become aware of the timezone

                    No, there is no timezone, and that is the entire point. In the majority of cases you just want to store the local date. The point is that a local date or time is not necessarily a fixed point in time. If I have drinks at 18:00 every Friday, that doesn’t change when we switch to or from DST, it’s still 18:00 in local time. I don’t need a timezone, I know what timezone I live in.

                    Now, in cases where timezones do matter, for example if you have a Zoom meeting with someone from another country, you can store as local time + timezone. But this is still very different from storing a Unix timestamp. This meeting will be at a specific time in a specific timezone, and the exact moment in time will adjust when changes are made to that timezone. Again, a Unix timestamp does not allow for this, as it’s always UTC.

                    I assure you that 1710720000 will translate to the same janky “18-03-2024” format you’re using every single time unless you deliberately mess with timezones in code

                    No, it doesn't. You can't convert it to any date unless you "mess with timezones", because 1710720000 is a specific moment in time and you have to provide a timezone when converting it to a date. You are mistaking the fact that some systems implicitly use UTC when converting for some sort of universal standard, which it isn't.

                    Run the following Swift code:

                    import Foundation
                    // Formats the same instant using the machine's current locale and timezone.
                    let d = Date(timeIntervalSince1970: 1710720000)
                    print(d.formatted(date: .complete, time: .omitted))
                    

                    You’ll get a different date depending on your location.

                    If your code will break because someone has different OS settings than yours, you are writing bad code.

                    Yes, and your bad code will break simply because you are abusing a datatype for something beyond its intended use. If you want to store an absolute point in time, by all means use a Unix timestamp, but if you want to store a local time you should never use it, because it's not meant for that and it doesn't encode the information needed to represent a local time.

                    Even this would be better than a string:

                    struct {
                        int  year;
                        unsigned char month;
                        unsigned char day;
                    }

                    Yes, that's fine. I'm not arguing that you should store it as a string, I'm arguing that you should store it as individual components, in whatever format, instead of seconds since the epoch. As long as the format is well specced it doesn't really matter. Strings are used all the time for representing dates, by the way. For example, ASN.1, which is used everywhere, stores dates and times as strings and it's perfectly fine because the format is specified unambiguously.

                    Six bytes as opposed to 10

                    In what archaic system are ints still 4 bytes? In Swift, Int is 64 bits, or 8 bytes, on any modern machine, and C makes no promise about the width at all. If I read your format on a 64-bit machine assuming a different int size, it'll break. Also, is that int little or big endian? Your code still breaks if you spec an int32 and you store your date on an x86 machine (little endian) and I read it on a big-endian machine. You know what's not ambiguous? "This time is stored as an ISO 8601 string".
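
                    One illustrative way to pin that down if you do go binary: fix the widths and the byte order yourself (this little encoder is only a sketch, not anyone's actual format):

                    // Fixed widths, fixed (big-endian) byte order:
                    // always 4 bytes, read the same way on any machine.
                    func encode(year: UInt16, month: UInt8, day: UInt8) -> [UInt8] {
                        [UInt8(year >> 8), UInt8(year & 0xFF), month, day]
                    }

                    print(encode(year: 2024, month: 3, day: 18))   // [7, 232, 3, 18]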

      • jjjalljs@ttrpg.network
        ↑8 · 9 months ago

        You're probably getting downvoted because you asked here instead of a search engine, many people think it's common knowledge, and it was already answered in this thread.

        Sometimes an innocent question looks like someone JAQing off.

      • Overshoot2648@lemm.ee
        ↑4 · 9 months ago

        It was a massive threat, as it would have broken banking records and aircraft flight paths. Those industries spent millions to fix the problem. In 14 years (2038) we'll have a similar problem, with all 32-bit computers breaking if they haven't had firmware updates to store UTC time as a 64-bit number (sometimes held as two 32-bit words). Lots of medical, industrial, and government equipment will need to either be patched or replaced.

      • jemikwa@lemmy.blahaj.zone
        ↑3 · 9 months ago

        By comparison, there were a few systems that had issues on February 29th because of leap day. Issues with such a routine thing should be unthinkable these days.
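
        Leap-day bugs often come down to the rule being hand-rolled wrong; for reference, the full Gregorian rule is a one-liner (sketch):

        // Gregorian leap year: divisible by 4, except centuries,
        // unless the century is divisible by 400.
        func isLeapYear(_ year: Int) -> Bool {
            (year % 4 == 0 && year % 100 != 0) || year % 400 == 0
        }

        print(isLeapYear(2024), isLeapYear(1900), isLeapYear(2000))   // true false true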

      • Strykker@programming.dev
        ↑2 · 9 months ago

        There wasn't much of a real "threat", in that planes wouldn't have fallen out of the sky, but banking systems would probably have gotten quite confused, potentially leaving people unable to access money easily until it got fixed.

      • bloom_of_rakes@lemm.ee
        ↑1 ↓3 · 9 months ago

        You insinuate that these people might be gullible dopes who swallow whatever it’s popular to swallow, no brains involved.

        We have a zero tolerance policy for that attitude.