• 0 Posts
  • 35 Comments
Joined 2 years ago
Cake day: June 26th, 2023


  • After reading a lot of comments in this thread, I’m not sure I know what spaghetti code is. I thought spaghetti code was when the order of execution is obfuscated by excessive jumps and GOTOs. But a lot of people are citing languages without those as examples of spaghetti code. Is this just a classic case of “I don’t like this programming language, and I don’t know much about it,” or is there something I’m missing?
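
    To illustrate what I mean, here is a minimal, purely hypothetical C sketch (not from any real codebase). The whole program just sums the integers 0 through 5, but the gotos scatter the order of execution so you have to trace every jump to see that:

        #include <stdio.h>

        /* Hypothetical example: the logic is trivial (sum 0..5), but the
         * gotos hide the loop structure, so the control flow has to be
         * traced jump by jump. */
        int main(void) {
            int i = 0, sum = 0;
            goto check;
        add:
            sum += i;
            goto next;
        check:
            if (i > 5) goto done;
            goto add;
        next:
            i++;
            goto check;
        done:
            printf("sum = %d\n", sum); /* prints: sum = 15 */
            return 0;
        }

    The same loop written with a plain for statement would read top to bottom; that difference is what I understood “spaghetti” to mean.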



  • That’s not what I’m saying at all. What I’m trying to say is that I can’t think of any way a program working with numeric types could start outputting string types. I could maybe believe a calculator program that disables exceptions could do that, but even then, who would do that?



  • stingpie@lemmy.world to Programmer Humor@lemmy.ml · True Story
    1 year ago

    If C++/C were real languages for real programming, they’d enforce unreadability in the compiler.

    No sane language designer would say “It is imperative that you write the most unreadable code possible” then write a compiler that says “oh your code doesn’t triple dereference pointers? lol lmao that rocks”

    They have played you all for fools.



  • Recently, I’ve just given up on trying to use CUDA for machine learning. Instead, I’ve been using (relatively) CPU-intensive activation functions & architecture to make up the difference. It hasn’t worked, but I can at least consistently inch forward.


  • I’m not sure I understand your argument. Are you saying that the emulated processor executes instructions while the SoC doesn’t? Every instruction that goes to the emulated x86 is broken down into several SoC instructions, which the SoC executes in order to do what an x86 would do. Saying that the emulated x86 is booting/running Linux but the SoC is not is like saying that computers can’t run Java code, only the JVM.
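
    To make that concrete, here is a toy fetch/decode/execute loop in C for a made-up two-instruction guest ISA (purely hypothetical, nothing like real x86 decoding). Each guest instruction is carried out as several host-side steps, which is exactly the relationship between the emulated x86 and the SoC:

        #include <stdint.h>
        #include <stdio.h>

        /* Toy guest "ISA": an opcode byte followed by its operands.
         * A conceptual sketch of emulation, not real x86 decoding. */
        enum { OP_LOAD_IMM = 0x01, OP_ADD = 0x02, OP_HALT = 0xFF };

        int main(void) {
            uint8_t regs[4] = {0};
            /* Guest program: r0 = 7; r1 = 35; r0 = r0 + r1; halt. */
            const uint8_t program[] = {
                OP_LOAD_IMM, 0, 7,
                OP_LOAD_IMM, 1, 35,
                OP_ADD, 0, 1,
                OP_HALT
            };

            for (size_t pc = 0;;) {
                uint8_t op = program[pc];            /* host step 1: fetch  */
                switch (op) {                        /* host step 2: decode */
                case OP_LOAD_IMM:                    /* host steps 3+: execute */
                    regs[program[pc + 1]] = program[pc + 2];
                    pc += 3;
                    break;
                case OP_ADD:
                    regs[program[pc + 1]] += regs[program[pc + 2]];
                    pc += 3;
                    break;
                case OP_HALT:
                    printf("r0 = %d\n", regs[0]);    /* prints: r0 = 42 */
                    return 0;
                }
            }
        }

    The guest program "runs" in every meaningful sense, even though only host instructions ever touch the silicon.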