• elvith@feddit.de

      I don’t know what I did wrong, but the bug must be somewhere in HelloWorldExampleClassForTutorialBuilderFactory.HelloWorldExampleClassForTutorialBuilderFactory(StringBuilderFactory myHelloWorldExampleClassForTutorialStringBuilder, int numberOfTimesToDisplayHelloWorld)

  • AlmightySnoo 🐢🇮🇱🇺🇦@lemmy.world

    I know the guy meant it as a joke, but in my team I see the damage “academic” OOP/UML courses do to a programmer. In a library that’s supposed to be high-performance C++ code, doing things like solving certain PDEs and running heavy Monte Carlo simulations, the people with an OOP/UML background tend to abuse dynamic polymorphism (they put on a Pikachu face when you show them that static polymorphism also exists) and write a lot of bad code full of indirections. Many of them aren’t aware that virtual functions and dynamic_casts have a price, and an especially ugly one if you pay it at every step of your iterative algorithm.

    They’re usually used to garbage collectors, so when they switch to C++ they become paranoid and abuse shared_ptr because it gives them peace of mind: the resource is guaranteed to be freed when it’s no longer needed and they don’t have to think about when that is. They obviously ignore that under the hood every ref-count increment is an atomic operation (I removed the shared pointers a dev on our team had scattered everywhere and our code became twice as fast).

    Like the guy in the screenshot, I certainly wouldn’t want someone on my team who was molded by Java and UML diagrams.
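    To give a rough idea of what I mean, here’s a simplified sketch (not our actual code; all the names are invented) of the same hot loop written with dynamic polymorphism plus shared_ptr versus static polymorphism:

    #include <cstdio>
    #include <memory>

    // Dynamic polymorphism: every step pays for a virtual call, and copying the
    // shared_ptr inside the loop bumps an atomic reference counter each time.
    struct Payoff {
        virtual ~Payoff() = default;
        virtual double operator()(double spot) const = 0;
    };

    struct CallPayoff : Payoff {
        double strike;
        explicit CallPayoff(double k) : strike(k) {}
        double operator()(double spot) const override {
            return spot > strike ? spot - strike : 0.0;
        }
    };

    double sum_dynamic(std::shared_ptr<Payoff> payoff, int n) {
        double acc = 0.0;
        for (int i = 0; i < n; ++i) {
            auto p = payoff;         // atomic increment + decrement, every iteration
            acc += (*p)(100.0 + i);  // virtual dispatch, every iteration
        }
        return acc;
    }

    // Static polymorphism: the concrete type is known at compile time, so the call
    // can be inlined and there is no reference counting at all.
    template <class PayoffT>
    double sum_static(const PayoffT& payoff, int n) {
        double acc = 0.0;
        for (int i = 0; i < n; ++i)
            acc += payoff(100.0 + i);  // resolved at compile time
        return acc;
    }

    int main() {
        auto shared = std::make_shared<CallPayoff>(100.0);
        std::printf("dynamic: %f\n", sum_dynamic(shared, 1000));
        std::printf("static:  %f\n", sum_static(CallPayoff{100.0}, 1000));
    }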

    • ForegoneConclusion@lemmy.world

      Depends on the requirements. Writing the code in a natural and readable way should be number one.

      Then you benchmark to find out what actually takes time, and optimize from there.

      At least that’s my approach when working with mostly functional languages. No need to obsess over the performance of something that’s run only a dozen times per second.

      I do hate over-engineered abstractions, though. But not for performance reasons.

      • Ryan@programming.dev

        You need to be careful about benchmarking to find performance problems after the fact. You can get stuck in a local maximum where there is no particular cost center but it’s all just slow.

        If performance specifically is a goal, there should probably at least be a theory of how it will be achieved, and then that theory can be refined with benchmarks and profiling.

      • CanadaPlus@lemmy.sdf.org

        Writing the code in a natural and readable way should be number one.

        I mean, even there it depends what you’re doing. A small matrix multiplication library should be fast even if it makes the code uglier. For most coders you’re right, though.
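        For example, a fixed-size multiply over flat arrays is uglier than a nice generic Matrix class, but it’s exactly the kind of thing a compiler can fully unroll and vectorize (a rough sketch, not taken from any particular library):

        // 4x4 multiply over flat row-major arrays: no Matrix class, no allocations,
        // sizes fixed at compile time so the compiler can unroll and vectorize.
        void mul4x4(const float a[16], const float b[16], float out[16]) {
            for (int i = 0; i < 4; ++i) {
                for (int j = 0; j < 4; ++j) {
                    float acc = 0.0f;
                    for (int k = 0; k < 4; ++k)
                        acc += a[i * 4 + k] * b[k * 4 + j];
                    out[i * 4 + j] = acc;
                }
            }
        }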

            • Aceticon@lemmy.world

              You can add tons of explanatory comments with zero performance cost.

              Also, in programming in general (so, outside things like being a quant), the fraction of code written where high performance is the top priority is minuscule (and I say this having designed high-performance software systems for a living). As @ForegoneConclusion explained earlier: you don’t optimize upfront, you optimize when you figure out it’s actually needed.

              Thinking about it, if you’re designing your own small matrix multiplication library (i.e. reinventing the wheel), you’re probably failing at the software design level: as long as the licensing is compatible, it’s usually better to take something that already exists, is performance-oriented, and has been in use for decades than to make your own (almost certainly inferior, and with fresh new bugs) version.

              PS: Not a personal criticism. I too still have to remind myself at times not to reinvent what’s already there. It’s only natural for programmers to trust their own skills above whatever random people wrote some library, and to want to program rather than spend time evaluating what’s out there.

              • CanadaPlus@lemmy.sdf.org

                Thinking about it, if you’re designing your own small matrix multiplication library (i.e. reinventing the wheel)

                I thought of this example because a fundamental improvement was actually made with the help of AI recently. The 4x4 case specifically was improved noticeably IIRC, and if you know a bit about matrix multiplication, that ripples out to large-matrix algorithms.

                PS: Not a personal criticism

                I would not actually try this unless I had a reason to think I could do better, but I come from a maths background and do have a tendency to worry about efficiency unnecessarily.

                I think in most cases (matrix multiplication being probably the biggest exception) there is a way to write an algorithm that’s easy to read, especially with comments where needed, and still approaches the problem the best way. Whether it’s worth the time trying to build that is another question.

      • Aceticon@lemmy.world

        In my experience we all go through a stage at the designer-developer level where, having discovered things like Design Patterns, we overengineer our software designs to the point of making them nearly unmaintainable (for others, or for ourselves six months down the line).

        The next stage is discovering the joys of KISS and, like you described, of refraining from premature optimization.

    • magic_lobster_party@kbin.social

      I think many academic courses are stuck on old OOP theories from the 90s, while the rest of the industry learned from their failures a long time ago and moved on to more refined OOP practices. Turns out inheritance is one of the worst ways to achieve OOP.

      • fidodo@lemm.ee

        I think a lot of academic OOP adds inheritance for the heck of it. It’s like they’re more interested in creating a tree of life for programming than in creating a maintainable, understandable program.

        • magic_lobster_party@kbin.social

          OOP can be good. The problem is that in Java 101 courses it’s often taught by heavily using inheritance.

          I think inheritance is a bad representation of how stuff is actually built. Let’s say you want to build a house. With the inheritance way of thinking, you start by imagining all possible types of buildings you could make: houses, apartment buildings, warehouses, offices, mansions, bunkers, etc. Then you imagine how all these buildings are related to each other and start to draw a hierarchy.

          In the end you’re not really building a house. You’re just thinking about buildings as an abstract concept. You’re tasked to build a basic house, but you are dreaming about mansions instead. It’s just a curious pastime for computer science professors.

          A more direct way of building a house is to think about all the parts it’s composed of and how they interact with each other. These are the objects in an OOP system. Ideally the objects should be as independent as possible.

          This concept is called composition over inheritance.

          For example, you don’t need to understand all the internals of the toilet to use it. The toilet doesn’t need to be aware of the entire plumbing system for it to work. The plumbing system shouldn’t be designed for one particular toilet either: you should be able to install a new, improved toilet if you wish, as long as it’s compatible with your plumbing. And the fridge should still work even if there’s no toilet in the house.

          If you do it right you should also be able to test the toilet individually without connecting it to a real house. Now you suddenly have a unit testable system.

          If you ever need polymorphism, you should use interfaces.
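          To make the toilet/plumbing idea concrete, here’s a minimal sketch (in C++ since that’s what most of this thread is about; the same shape works with a Java interface, and all the names here are invented):

          #include <cstdio>

          // The "interface": everything a Toilet needs from the plumbing, and nothing more.
          struct Plumbing {
              virtual ~Plumbing() = default;
              virtual void drain(double liters) = 0;
          };

          // The toilet is composed with *some* plumbing; it doesn't know or care which.
          class Toilet {
              Plumbing& plumbing;
          public:
              explicit Toilet(Plumbing& p) : plumbing(p) {}
              void flush() { plumbing.drain(6.0); }
          };

          // A real implementation for the house...
          struct CityPlumbing : Plumbing {
              void drain(double liters) override { std::printf("draining %.1f L\n", liters); }
          };

          // ...and a fake one so the toilet is unit-testable without a house.
          struct FakePlumbing : Plumbing {
              double total = 0.0;
              void drain(double liters) override { total += liters; }
          };

          int main() {
              FakePlumbing fake;
              Toilet toilet{fake};   // swap in any Plumbing; no inheritance tree of buildings needed
              toilet.flush();
              std::printf("the fake plumbing saw %.1f L\n", fake.total);
          }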

      • einsteinx2@programming.dev

        That’s the problem: a lot of CS professors never worked in industry or did anything outside academia, so they never learned those lessons… or the last time they did work was back in the 90s lol.

        Doesn’t help that most universities don’t seem to offer “software engineering” degrees and so everyone takes “computer science” even if they don’t want to be a computer scientist.

        • jungle@lemmy.world

          There’s an alternative system where this doesn’t happen: pay university professors less than a living wage.

          You do that, and you’ll get professors who work in the industry (they have to) and who love teaching (why else would they teach?).

          I studied CS in a country where public university is free and the state doesn’t fund it properly. Which obviously isn’t great, but I got amazing teachers with real-world experience.

          My son just finished CS in a country with paid and well-funded universities, and some of the professors were terrible teachers (I watched some of his remote classes during COVID) and completely out of touch with the industry. His course on AI was all about Prolog. Not even a mention of neural networks, even while GPT-3 was all the rage.

          • rbhfd@lemmy.world

            who love teaching (why else would they teach)

            Professors love doing academic research. Teaching is a requirement for them, not a passion they pursue (at least not for most of them).

            • jungle@lemmy.world

              Yeah, that makes it even worse.

              To be clear, I’m not advocating for not paying living wages to professors, I’m just describing the two systems I know and the results.

              I don’t know how to get teachers who are up to date with industry and love teaching. You get that when teaching doesn’t pay, but it’d be nice if there was a better way.

      • Aceticon@lemmy.world

        The Design Patterns book itself (for many, an OO Bible) spends its first seventy-something pages on general good OO programming advice, including the (repeatedly emphasised) point that OO design should favour delegation over inheritance.

        Personally (as someone who started programming professionally in the 90s), I find that first part of the book at least as important as the rest of it.

        However, a lot of people seem to have learned Patterns as a fad (popularized by oh-so-many people who never read a proper book about it and sit at the end of a long Chinese-whispers chain about what those things actually mean), rather than as a set of tools to use if and when appropriate.

        (Ditto for Agile, where so many seem to have learned loose practices from it as recipes, without understanding their actual purpose and applicability)

        I’ll stop ranting now ;)

    • wolf@lemmy.zip

      I fully agree about the damage done at universities. I also fully agree that the professors teaching have been out of the game too long, or were never at a level worth teaching to other people. A term I first heard from William Kennedy is ‘mechanical sympathy’, and IMHO that is the big missing thing in modern CS education. (OK, add to that the missing parts about proper OOP, proper functional programming, and literally anything taught to CS grads other than relational/automata theory and mathematics (summary: mathematics) :-P). In the end I wouldn’t trust anyone who cannot write assembler and C and doesn’t know about compiler construction to write useful low-level code or even tackle C++/Rust.

    • Dohnakun@lemmy.fmhy.mlB

      OOP/UML courses

      Luckily, I had only one, and the ace who code-golfs in assembler did the work for the three of us.

    • odbol@lemmy.world

      That’s wild that shared_ptr is so inefficient. I thought everyone was moving towards it because it was universally better. No one mentions the performance hit.

      • Duralf@lemmy.world

        Atomic instructions are quite slow, and it adds up if they run a lot… Rust has two reference-counted pointer types for exactly that reason: Arc, with atomic reference counting for multithreaded code, and Rc, non-atomic, for single-threaded code. Reference counting is usually overkill in the first place and can be a sign that your code doesn’t have proper ownership.

  • Coreidan@lemmy.world

    This thread reminds me that most “developers” are terrible and don’t take the time to understand the language.

    All of these Java developers you guys hate are the result of schools pushing out idiots. It’s not the language but rather the type of people you hire. These people will suck at writing in any language, regardless of what order they learn them in.

    • cloudy1999@sh.itjust.works

      Agreed, good tools can be used badly. Over the years I’ve written Java, C++, and PHP professionally, and I’ve seen excellent and horrible impls in each. Today, I mostly use Java and this thread is reminding me that I need to learn a new for-fun language.

        • clutchmatic@lemmy.world

          Kotlin won’t save your skin if the code you wrote is supposed to be performant but you’ve layered it into a heap of abstract classes, interfaces, factories, etc. that, realistically, no one else will ever use or extend.

  • TrismegistusMx@lemmy.world

    When I was in the military, the shooting instructors said they preferred training females because they hadn’t been trained poorly by somebody else.

    EDIT: Designating recruits as male and female is the way the military does things. I don’t use the terms male and female when referring to groups of humans. I felt the need to clarify since somebody already took offense.

        • Kryomaani@sopuli.xyz

          I’m from a country with mandatory conscription for men, so yes, I’ve been in the military, and I’ve seen the misogyny (among countless other varieties of bigotry) rampant in that system from front-row seats. We had a handful of female volunteer conscripts, and one of my NCOs was a woman, and it was blatantly obvious they were not receiving the same treatment as the majority of us who were men (and not in a good way, if there was any room for confusion).

          Experiences like that are among the key reasons I’m not happy to see people keep perpetuating that kind of behavior, especially in other traditionally male-centric contexts like the IT industry and even here on this forum.

          • ScrimbloBimblo@lemmy.dbzer0.com

            Whether or not you personally agree with the military’s choice of language is not relevant. You’re assuming the trainer agrees with your political views, but you weren’t there, so you have no idea what they said or didn’t say.

        • Kryomaani@sopuli.xyz

          Perhaps because people aren’t going around calling others “males” to demean them?

          These are not difficult concepts if you turn on your brain.

      • grandkaiser@lemmy.world

        Nah. In the military, you aren’t “men and women” you are “soldiers” (or sailors, Marines, or airmen). If you are referring specifically to a specific gender such as a “female” soldier, then that’s what you call them.

        No one says “women soldiers” except maybe a civilian.

        • Kryomaani@sopuli.xyz

          No one says “women soldiers” except maybe a civilian.

          And I’m not telling you to; stop putting words in my mouth. Female as an adjective is fine, “female soldier” is fine; calling a group of human women “females”, as a noun, is demeaning and incel lingo.

      • oce 🐆@jlai.lu

        In French, the literal translations of “female” and “male” are only used for animals in everyday language, but I was taught that in English it’s OK to use them for humans. Is that not the case in your region?

        • Kryomaani@sopuli.xyz

          Female/male are used in English as adjectives when describing humans, but as nouns they only refer to animals. “She is a woman” and “She is a female actress” are both okay, but calling women “females” is purposefully demeaning and sexist. I don’t believe there is any regional difference in this, nor should we really care, since there are no regions on a global forum.

          • oce 🐆@jlai.lu

            nor should we really care about such since there are no regions when we’re on a global forum

            English being a third language for me, I’m actually interested in understanding the differences between cultures that I may not be aware of. I find global forums nice for this reason, although they tend to be dominated by North American culture.

            • kboy101222@lemm.ee

              I can’t speak with certainty, but I can speak subjectively on this -

              I have family and friends all over the US (I’m only missing someone from both Dakotas to complete the collection), so I’ve heard damn near every accent, regional dialect, language, etc that there is to hear in the US, including some near dead native languages and Pennsylvania Dutch.

              The only people I know who use “females” instead of “women” are either in the military, like OP, or they’re sexist. Sometimes it’s blatantly misogynistic, sometimes casually sexist.

              Or they’re both. It’s frequently both.

  • flashgnash@lemm.ee

    Man, if I were in the US I’d apply for that job in a heartbeat. It looks like it was written by a head dev who actually knows what he’s talking about rather than by some recruiter.

  • pwndave@lemmy.ca

    That’s really interesting. Maybe it’s like @nxtsuda@lemmy.world said: for a lot of folks, OOP was the way we learned and operated for years.

    Could they have just asked it differently? Or do they just have Java hate?

    • SpaceNoodle@lemmy.world

      It’s obviously an embedded role. Java and its developers are notorious for throwing memory and compute usage out the window.

    • Nechesh@beehaw.org

      If all I knew about Java was some of the garbage projects I’ve inherited over the years, I might hate it too.

    • serinus@midwest.social

      OOP is fine. It’s particularly Java culture that’s terrible.

      I never want to see the word Factory in a class name ever again.

      When a Java dev writes in any other language, you can tell. Too many layers of abstraction is a key indicator. They make simple problems complex.

      I once inherited a C# website project from a Java dev. I couldn’t even figure out how to modify the CSS. And I’m a C# dev.

      • Aceticon@lemmy.world

        I’ve worked with Java for decades (kid you not: I learned it from reading the Java Language Specification 1.0 back when it came out), and there’s definitely a stage (often a long one) in one’s career when one thinks oneself so great at OO that one overengineers every single software design way (way, WAY) beyond the actual objective behind the whole OO design concept (maintainability and bug reduction), actually achieving the opposite: an unmaintainable POS, riddled with hard-to-track bugs, because way too many unnecessary details have overwhelmed the developer’s ability to keep track of it all.

        Eventually you learn KISS design and Refactoring as a sort of housekeeping practice for code and design.

        But yeah, as a freelancer I’ve very commonly landed in the middle of maintenance-stage projects with existing code bases that were clearly done by somebody at that oh-so-special stage in their career, and often it’s better to just reverse engineer the business requirements from the application and redo the whole thing (in the process cutting the codebase size to a small fraction of what it was).

      • kboy101222@lemm.ee

        I was part of a fun era at my university where they switched from C++, which is what I took in intro to programming, to Java. So by the time I was doing group projects senior year, I was working in C# with people who had only done Java.

        They wanted to abstract everything. Everything had to be a class. Any time they repeated 2 lines of code it got put into a helper class.

        We ran into an issue where the code just would not run no matter how hard we tried, and of course no one on the project but me bothered to use git (they would literally send me the zipped-up project on Discord and I had to copy and paste everything into the actual code). I ended up rewriting the entire project overnight. It actually wasn’t that bad once I got into the flow of things. Turns out none of them knew how to program without being explicitly told how.

        Still not the worst college group project though. Maybe top 5.

  • topperharlie@lemmy.world

    lol, last time I switched jobs some years ago I did the same but from the other side: I had a small side section listing my level of expertise in various programming languages, and I explicitly added Java at 1/10 to send a clear message xD

    (It’s not that radical given that I’ve been an embedded/graphics programmer for most of my career, but still, funnier than not mentioning it.)

  • Aceticon@lemmy.world

    I actually have a ton of professional Java experience and have done a lot of microcontroller work lately (mainly for fun), and if you approach software for ARM Cortex-M microcontrollers the Java way, you’re going to end up with overengineered bloatware.

    It’s not a Java-only thing, however: most of those parts have too little memory and processing power to design the whole program in a pure OO way, and you’re pretty much coding at the low level (with at most a thin Hardware Abstraction Layer between your code and direct register manipulation). Only ever having used high-level OO languages isn’t good preparation for that, which applies not only to people with Java-only experience but also to those whose entire experience is with things like C#/.NET or the smartphone frameworks and languages (Objective-C, Kotlin, Swift).
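    To make “coding directly on the low level” concrete, bare-metal code typically looks something like this (the register layout, address, and bit positions below are invented for illustration, not taken from any real datasheet):

    #include <cstdint>

    // Hypothetical memory-mapped GPIO peripheral (layout and address made up).
    struct GpioRegs {
        volatile std::uint32_t MODER;  // pin mode configuration register
        volatile std::uint32_t ODR;    // output data register
    };

    GpioRegs* const GPIOA = reinterpret_cast<GpioRegs*>(0x40020000u);

    void led_on() {
        GPIOA->MODER |= (1u << 10);  // configure pin 5 as an output (fictional bit layout)
        GPIOA->ODR   |= (1u << 5);   // drive pin 5 high
    }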

    • odbol@lemmy.world

      I used to write a lot of performance-critical Java (oxymoron I know) for wearables, and one time I got a code reviewer who only did server-side Java, and the differences in our philosophies were staggering.

      He wanted me to convert all my code to a functional style, using optionals and streams instead of simple null checks and array iterations. When I explained that those things are slower and take more memory, it was like I was speaking an alien language. He had never even had to consider that code might run on a system with limited RAM and CPU cycles; he didn’t even understand how that was possible.

      • eth0p@iusearchlinux.fyi

        This may be an unpopular opinion, but I like some of the ideas behind functional programming.

        An excellent example would be where you have a stream of data that you need to process. With streams, filters, maps, and (to a lesser extent) reduction functions, you’re encouraged to write maintainable code. As long as everything isn’t horribly coupled and lambdas are replaced with named functions, you end up with a nicely readable pipeline that describes what happens at each stage. Having a bunch of smaller functions is great for unit testing, too!

        But in Java… yeah, no. Java, the JVM, and Java bytecode are not optimized for that style of programming.

        As far as the language itself goes, the lack of suffix functions hurts readability. If we have code to do some specific, common operation over streams, we’re stuck with nesting. For instance,

        var result = sortAndSumEveryNthValue(2, 
                         data.stream()
                             .map(parseData)
                             .filter(ParsedData::isValid)
                             .map(ParsedData::getValue)
                        )
                        .map(value -> value / 2)
                        ...
        

        That would be much easier to read at a glance if we had a pipeline operator or something like Kotlin extension functions.

        var result = data.stream()
                        .map(parseData)
                        .filter(ParsedData::isValid)
                        .map(ParsedData::getValue)
                        .sortAndSumEveryNthValue(2) // suffix form
                        .map(value -> value / 2)
                        ...
        

        Even JavaScript has a pipeline operator proposal to solve this kind of nesting problem.

        And then we have the issues caused by the implementation of the language. Everything except primitives is an object, and only objects can be used with generics.

        Lambda functions? Short-lived instances of anonymous classes that implement some interface.

        Generics over a primitive type (e.g. HashMap<Integer, String>)? Short-lived boxed primitives that automatically desugar to the primitive type.

        If I wanted my functional code to be as fast as writing everything in an imperative style, I would have to trust that the JIT performs appropriate optimizations. Unfortunately, I don’t. There’s a lot that needs to be optimized:

        • Inlining lambdas and small functions.
        • Recognizing boxed primitives and replacing them with raw primitives.
        • Escape analysis and avoiding heap memory allocations for temporary objects.
        • Avoiding unnecessary copying by constructing object fields in-place.
        • Converting the stream to a loop.

        I’m sure some of those are implemented, but as far as benchmarks have shown, Streams are still slower in Java 17. That’s not to say that Java’s functional programming APIs should be avoided at all costs—that’s premature optimization. But in hot loops or places where performance is critical, they are not the optimal choice.

        Outside of Java but still within the JVM ecosystem, Kotlin actually has the capability to inline functions passed to higher-order functions at compile time.

        /rant

  • nayhel89@lemmy.sdf.org

    The horror of the single inheritance that forces you to use composition instead.
    The boredom of knowing which exceptions every method throws.
    The narrowness of generics that don’t allow duck typing.
    The oppression of monads and pattern matching.
    The poverty of a central package repository and only 2 package managers.
    The pressure of choice between dozens of garbage collectors for different workloads.
    The promiscuity of the single platform that interconnects various programming languages and allows all of them to use features like state-of-the-art profile-guided optimizations.

    The horrible world of Java.

  • lasagna@programming.dev

    In most programming I have done, we treat the users as the dumb mofos. In Java, the programmers are treated as the dumb mofos. As a dumb mofo, I have a great dislike toward Java’s standard development ecosystems.