• 0 Posts
  • 60 Comments
Joined 2 years ago
Cake day: July 5, 2023

  • A few years ago I had a software problem, and in the course of trying to solve it I found someone on SO with an almost identical problem, although no one had posted a solution. Later on, I managed to piece some facts together and come up with a solution that worked for me. To make life easier for others with the same problem, I posted my solution to that SO question, along with a brief explanation of what I thought the underlying problem was and how my solution addressed it.

    I got several upvotes, and one or two comments from people saying it worked for them too, which was nice. There was also a post from someone it didn’t work for, and they outlined why they thought that might be, which was constructive.

    Unfortunately there was also some salty grump who weighed in just to tell me that my solution wasn’t “correct”. Not that it didn’t work, mind you, just that it wasn’t good enough for them. As far as I bothered to look into their vague comments, my solution may have fixed the issue more as a side effect than directly, but it did fix the issue. Meanwhile this person offered no alternative instructions of their own.

    As time goes on, I seem to run across this sort of – not just unhelpful but “anti-helpful” – attitude more and more often on SO.


  • That’s kind of the bare bones of how it works, underneath all the abstraction layers and pretty GUIs.

    Then it evolves.

    First, you start splitting your code into multiple source files, either because your programs get too big to keep scrolling up and down one huge file to cross-check things, or because you want to incorporate someone else’s code into your program, and it’s more than just one or two functions you can easily copy and paste. You can still keep compiling and linking all of this in one step, but the command gets so long that you make a shell script/batch file as a shortcut.

    After that, you might want to mix-and-match various source files to target different platforms, or to make other bulk changes, and you start going down the rabbit hole of having your shell script take arguments, rather than having a dozen different scripts. And then one day you take another look at “make” and realize that whereas before it seemed like impenetrable overengineering, it now makes complete and obvious sense to you.

    Then you discover using “make” (or a similar utility) to split compilation and linking into separate steps, which used to seem nonsensical. But now you’re dealing with codebases that take more than a couple of seconds to compile, or with precompiled libraries or DLLs, and you get comfortable with the idea of just hanging on to compiled object files and (re)using them whenever the source for that part of the program hasn’t changed (there’s a small sketch of this at the end of this comment).

    And finally (maybe) you look at some of the crazy stuff in fancy IDEs and understand why it’s there: it’s just a representation of all this other stuff that you now know about and feel competent with. I say “maybe” because I’ve been programming for over 35 years, occasionally professionally but mostly as a hobbyist, and there are still things in IDEs that I either don’t understand or don’t see the point of. But knowing the underlying principles makes me feel comfortable enough to ignore them.
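
    Here’s the small sketch I mentioned above: roughly what a minimal makefile might look like at that stage. The file names are invented purely for illustration, and remember that in a real makefile the indented command lines have to start with an actual tab character:

    CC = gcc
    CFLAGS = -Wall -O2

    # link step: combine the object files into the executable
    game: main.o audio.o
        $(CC) main.o audio.o -o game

    # compile steps: each .o is rebuilt only when its own sources change
    main.o: main.c common.h
        $(CC) $(CFLAGS) -c main.c

    audio.o: audio.c common.h
        $(CC) $(CFLAGS) -c audio.c

    After editing only audio.c, running “make” recompiles audio.o and relinks game, and leaves main.o alone, which is exactly the object-file reuse I was talking about.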


  • I hadn’t heard of Kate before, so I can’t offer much hands-on advice. I dug around and found a “handbook” here: https://docs.kde.org/stable5/en/kate/kate/index.html

    Unfortunately it does look like you need to define a project to compile/run anything, which appears to require manually creating a .kateproject file in the directory as outlined here: https://docs.kde.org/stable5/en/kate/kate/kate-application-plugin-projects.html#project-create

    I had exactly the same problem when I moved from languages that were interpreted or that combined the IDE and runtime environment into one, and started using languages that had their own external compiler. Unfortunately, open-source project user documentation is often terrible for beginners (what I found above for Kate seems to be no exception), and IDEs often seem to be written by people who don’t really expect anyone to actually use the included build options (to be fair, most folks seem to like using their own separate build utilities, so that’s probably often the case).

    If you can tell us which compiler or interpreter you’re using (e.g. gcc, clang, Python), someone can probably tell you how to compile and/or run a single-file program from the terminal with a fairly simple command.
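
    Just as an illustration (the file names here are placeholders), a single-file C program typically compiles and runs from a terminal with something like:

    gcc hello.c -o hello    # compile and link in one step
    ./hello                 # run the result

    while an interpreted language such as Python skips the compile step entirely:

    python3 hello.py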


  • Whoops! When I looked at the second place where the shift value is calculated, I wondered whether it would be inverted from the first, but for some reason I decided it wouldn’t be. Looking at it again, it’s clear that (1 - i) = (-i + 1) = ((~i + 1) + 1), making bit 0 the inverse. Then I wondered why there wasn’t more corruption, and realized that the author’s compiler must perform postfix increments and decrements immediately after the variable is used, so the initial shift is also inverted. That’s why the character pairs are flipped but otherwise still decode correctly. I hope this version works better:

    long main () {
        char output;
        unsigned char shift;
        long temp;
        
        if (i < 152) {
            shift = (~i & 1) * 7;
            temp = b[i >> 1] >> shift;
            i++;
            output = (char)(64 & temp);
            output += (char)((n >> (temp & 63)) & main());
            printf("%c", output);
        }
    
        return 63;
    }
    

    EDIT: I just got a chance to compile it and it does work.


  • I first learned about Java in the late 90s and it sounded fantastic. “Write once, run anywhere!” Great!

    After I got past “Hello world!” and other simple text output tutorials, things took a turn for the worse. It seemed like if you wanted to do just about anything beyond producing text output with compile-time data (e.g. graphics, sound, file access), you needed to figure out what platform and which edition/version of Java your program was being run on, so you could import the right libraries and call the right functions with the right parameters. I guess that technically this was still “write once, run anywhere”.

    After that, I learned just enough Java to squeak past a university project that required it, then promptly forgot all of it.

    I feel like Sun was trying to hit multiple moving targets at the same time, and failing to land a solid hit on any of them. They were laser-focused on portable binaries, but without standardized storage or multimedia APIs at a time when even low-powered devices were starting to come with those capabilities. I presume that things are better now, but I’ve never been tempted to have another look. Even just trying to get my machines set up to run other people’s Java programs has been enough to keep me away.


  • Did you read all the way to the end of the article? I did.

    At the very bottom of the piece, I found that the author had already expressed what I wanted to say quite well:

    In my humble opinion, here’s the key takeaway: just write your own fucking constructors! You see all that nonsense? Almost completely avoidable if you had just written your own fucking constructors. Don’t let the compiler figure it out for you. You’re the one in control here.

    The joke here isn’t C++. The joke is people who expect C++ to be as warm, fuzzy, and forgiving as JavaScript.
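
    For anyone who hasn’t read the article, here’s a toy sketch of the difference (my own example, not the article’s) between leaving it all to the compiler and writing your own constructors:

    #include <string>
    #include <utility>

    // Compiler-generated constructors: with "Implicit a;" the id field is left
    // uninitialized, and nothing documents what a valid Implicit looks like.
    struct Implicit {
        int id;
        std::string name;
    };

    // Hand-written constructors: you decide what a valid object is.
    struct Explicit {
        int id;
        std::string name;

        Explicit() : id(0), name("unnamed") {}
        Explicit(int id_, std::string name_) : id(id_), name(std::move(name_)) {}
    };

    int main() {
        Implicit a;                // a.id is indeterminate here
        Explicit b;                // b.id == 0, b.name == "unnamed"
        Explicit c(42, "answer");  // initialized the way its author said it could be
        (void)a; (void)b; (void)c;
        return 0;
    }

    Nothing clever, but it’s the difference between the compiler guessing what you meant and you writing it down.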


  • Let me know if you find one that uses AI to find groupings of my search terms in its catalogues instead of using AI to reduce my search to the nearest common searches made by others, over some arbitrary popularity threshold.

    Theoretical search: “slip banana peel 1980s comedy movie”
    Expected results in 2010: Pages about people slipping on banana peels, mostly in comedy movies, mostly from the 80s.
    Expected results in 2024: More than I ever wanted to know about buying bananas online, the health impacts of eating too many or not enough bananas, and whatever “celebrities” have recently said something about them. Nothing about movies from the 80s.


  • That was my first take as well, coming back to C++ in recent years after a long hiatus. But once I really got into it I realized that those pointer types still exist (conceptually) in C, but they’re undeclared and mostly unmanaged by the compiler. The little bit of automagic management that does happen is hidden from the programmer.

    I feel like most of the complex overhead in modern C++ is actually just explaining in extra detail about what you think is happening. Where a C compiler would make your code work in any way possible, which may or may not be what you intended, a C++ compiler will kick out errors and let you know where you got it wrong. I think it may be a bit like JavaScript vs TypeScript: the issues were always there, we just introduced mechanisms to point them out.

    You’re also mostly free to use those C-style pointers in C++. It’s just generally considered bad practice.
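
    A toy example of what I mean by that extra detail (the names are invented):

    #include <memory>

    struct Widget { int value = 0; };

    // C-style: the raw pointer says nothing about who owns the Widget, and an
    // early return or exception before the delete would leak it.
    void c_style() {
        Widget* w = new Widget;
        w->value = 1;
        delete w;
    }

    // C++-style: ownership is stated to the compiler, and the Widget is freed
    // automatically when the unique_ptr goes out of scope.
    void cpp_style() {
        auto w = std::make_unique<Widget>();
        w->value = 1;
    }

    int main() {
        c_style();
        cpp_style();
        return 0;
    }

    The second function is basically the first with the ownership rule written down where the compiler can see it and enforce it.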