• 0 Posts
  • 306 Comments
Joined 1 year ago
Cake day: June 22nd, 2023

  • Because usually, if you end up at the API reference in that situation, it’s a code / project smell that something else is going wrong.

    If you want to use a library to do something, you should be able to search for what you want to do plus the language / framework, find the library’s docs, follow the install instructions, look through the highest-level API / instructions, and just go from there.

    If you find yourself confused by unhelpful API references, that just means the project has badly written top-level API docs, badly written intros, or quite probably just badly written APIs.


  • I mean, research funding is a huge problem, but half the problem is that journalists and reporters are largely people who went into English or Communications and stopped learning any science past the high school level, so they don’t actually know how to read papers or report on them. Not to mention that critically reading a scientific paper and evaluating it in the context of other research takes a significant amount of time, more time than is given to write a normal newspaper article.

    And they’re reporting that science to people who on average know as much as or less than they do, so their mistakes and misreporting are never caught or corrected.


  • For software to run on a computer, it needs to tell the computer what to do: “display this picture of a flower”, “move my character to the left”, “save this poem to a file”.

    And for a bunch of different software to all run on the same machine, they all need to use the same basic set of instructions; this is called the machine’s Instruction Set.

    Because the instruction set has to work for any software, these instructions don’t look that readable to us; instead of “show this flower” they might be “move this bit of memory into the processor”, but software builds up millions of those instructions to eventually display a flower.

    Intel processors used a set of instructions called x86, and when AMD made a rival processor, they made theirs use the same instruction set so that their processors would be compatible with all the software written for Intel processors (and when processors needed to move from 32-bit to 64-bit instructions, AMD created the 64-bit extension commonly called x86-64 or x64).

    Meanwhile, Apple computers for a long time used processors built by IBM and Motorola that used the PowerPC instruction set.

    Now many companies are using the ARM instruction set, but ARM is still proprietary and you have to pay Arm licensing fees to use it, so RISC-V is rising as a new, truly open, royalty-free instruction set.





  • > Or is this a battle I can pick to shield my self from ms

    Read the post before coming to the comments to reply.

    OP is asking on here about whether or not to pick this battle and fight his company over it. Yes, you are probably technically correct that a company can’t force you to install an authenticator app on your phone. However, that is a battle that you will have to fight with them that will accomplish essentially nothing if you win.

    In Canada right now there is a major auto manufacturer that is being sued by the union over this very issue. It is a years-long legal case that had to be escalated through the union, its lawyers, and now arbitration. Does that not sound like a battle to you?


  • Because an object is good at representing a noun, not a verb, and when expressing logical flows and concepts, despite what Java will tell you, not everything is, in fact, a noun.

    I.e. in OOP languages that do not support functional programming as a first-class feature (like Java), you end up with a ton of overhead, unnecessary complications, and objects named like generatorFactoryServiceCreatorFactory, because the language forces you to create a noun (object) to take an action rather than just create a verb (function) and pass that around.
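
    To make that concrete, here’s a rough sketch in plain Java; the names (Greeter, GreeterFactory) are invented for illustration, and the lambda half obviously assumes Java 8+, where passing behaviour around finally became practical:

    ```java
    import java.util.function.Supplier;

    public class NounsVsVerbs {
        // The "noun" route: a one-method interface plus a factory object,
        // all to wrap what is really a single action.
        interface Greeter { String greet(String name); }

        static class GreeterFactory {
            Greeter create() {
                return new Greeter() {
                    @Override
                    public String greet(String name) { return "Hello, " + name; }
                };
            }
        }

        // The "verb" route: accept the behaviour itself as a value.
        static String greetWith(Greeter greeter, String name) {
            return greeter.greet(name);
        }

        public static void main(String[] args) {
            // Object-heavy style: build an object whose only job is to produce another object.
            Greeter fromFactory = new GreeterFactory().create();
            System.out.println(fromFactory.greet("Ada"));

            // Functional style (Java 8+): the action is just a lambda you hand over.
            System.out.println(greetWith(name -> "Hi, " + name, "Grace"));

            // Behaviour can also be stored and passed along like any other value.
            Supplier<Greeter> lazyGreeter = () -> name -> "Hey, " + name;
            System.out.println(lazyGreeter.get().greet("Linus"));
        }
    }
    ```

    Both halves do the same thing; the difference is how much ceremony the language makes you write just to say “here is some behaviour”.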




  • Answer: there’d be far less software in the world, it would all be more archaic and less useful, and our phones and laptops would just sit at 2% utilization most of the time.

    There’s an opportunity cost to everything, including fussing over whether that value can be stored as an int instead of a double to save a few bytes of space. High-level languages let developers express their feature and business logic faster, with fewer bugs, and with much lower ongoing maintenance costs.


  • I’m not as hardcore as most, I run Windows as my main OS, but I do love my LG Gram 17" laptop from ~3-4 years ago.

    It’s powerful enough for general use, webdev, and very light 3D modelling, and it is insanely light and portable. I have a 14" MacBook at work, and the Gram is lighter, thinner, not that much bigger, and far more durable.

    Great keyboard and trackpad, giant screen (I wish it was brighter, but this is the version from 3-4 years ago), and surprisingly solid Bluetooth, microphone, Thunderbolt, etc.



  • masterspace@lemmy.ca to Programmer Humor@programming.dev · GOD DAMMIT STEVEN! NOT AGAIN!

    > No it wouldn’t. You’d have git beginners committing IDE configs and secrets left and right if -A was the default behavior.

    No, you wouldn’t, because no one is a git beginner; they’re a software development beginner who needs to use git. In that scenario, you are almost always using repos that were created by someone else or by some framework with pre-created gitignore files.

    You know what else it could do? Say “hey, you’ve run add with no files selected, press enter to add all changed files”.

    > Esc, :, q. Sure it’s a funny internet meme to say vim is impossible to quit out of, but any self-respecting software developer should know how, and if you don’t, you have google. If you think this is hard, no wonder you struggle with git.

    Dumping people into an archaic CLI program that doesn’t follow the universal conventions for exiting a CLI program, all for the goal of entering 150 characters of text that could be captured through the CLI with one prompt, is bad CLI design.

    There is no reason to ever dump the user to an external editor unless they specifically request it, yet git does, knowing full well that that means VIM in many cases.

    And no, a self-respecting software developer wouldn’t tolerate standards-breaking, user-unfriendly software and would change their default editor away from vim (see the config sketch below).

    > Git’s authors were the first users. The team that started the linux kernel project created it and used it because no other version control tool in existence at that time suited their needs. The subtle implication that you, as a user of git, know better than the authors, who were the original users, is laughable.

    Lmao, the idea that we should hero worship every decision Linus Torvalds ever made is the only thing laughable here.
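
    For anyone who actually wants to do that, a quick sketch using standard git config options (nano and VS Code are just example choices, pick whatever you like):

    ```sh
    # Use a friendlier terminal editor for commit messages (nano is just an example)
    git config --global core.editor "nano"

    # Or, if you have VS Code's CLI installed, have git wait on a GUI editor instead
    git config --global core.editor "code --wait"
    ```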




  • Lol, if an employer can’t have an intelligent discussion about user-friendly interface design, I’m happy to not work for them.

    In every interview I’ve ever been in, there’s been some moment where I say ‘yeah, I don’t remember that specific command, but conceptually you need to do this and that; if you want I can look up the command’, and they always say something along the lines of ‘oh no, yeah, that makes conceptual sense, don’t worry about it, this isn’t a memory test’.



  • Yeah man, me too.

    I went to school for electrical engineering, my first job was at an architecture firm designing the electrical stuff for buildings (including making all the electrical drawings for bank branches so we had some professional crossover 😋), and I ended up teaching myself software to automate a bunch of our designs and processes. I was literally directly making building design and construction more efficient … Buuuut… The arch industry pays poorly and I realized there was no way of ever owning a house at the pace I was going, so I left for software and doubled my salary in like 2 years. I went from senior electrical engineer to intermediate software engineer and saw a 50% increase… All in a country experiencing a massive, potentially existential housing crisis, and the industry pay disparity directly incentivized me to stop working on it and go work doing mostly bullshit software work.

    The software industry is grossly overpaid for how hard we work and for how critical our relative contributions are to society, though even in the software industry the pay is incredibly distorted. Orders of magnitude more money goes to random social media bullshit and VC startups that go nowhere than to mission critical teams doing stuff like maintaining security and access control software.


    > git add with no arguments outputs a message telling you to specify a path.

    Yes, but a more sensible default would be -A since that is how most developers use it most of the time.

    > git commit with no arguments drops you into a text editor with instructions on how to write a commit message.

    Git commit with no arguments drops you into vim, less a text editor and more a cruel joke of figuring out how to exit it.

    > Again, I recognize that git has a steep learning curve, but you chose just about the worst possible examples to try and prove that point lol.

    Git has a steep learning curve not because it’s necessary but because it chose defaults that made sense to the person programming it, not to the developer using it and interacting with it.

    It is great software and obviously better than most other version control systems, but it still has asinine defaults and its CLI surface is overcomplicated. When I worked at a MAANG company and had to learn their proprietary version control system, my first thought was “this is dumb, why wouldn’t you just use git like everyone else”, but then I went back to git and realized how much easier and more sensible their proprietary system actually was.
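
    To spell out the commands being argued over here (all standard git and vim, nothing exotic; the commit message is just a placeholder):

    ```sh
    # Stage every change in the working tree (what a default -A would do)
    git add -A

    # Commit with the message inline so no editor ever opens
    git commit -m "Fix typo in README"

    # If git does drop you into vim: press Esc, then type :q! to bail out
    # without saving, or :wq to save the message and quit
    ```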