After reading a lot of comments in this thread, I’m not sure I know what spaghetti code is. I thought spaghetti code was when the order of execution was obfuscated by excessive jumps and GOTOs, but a lot of people are citing languages without those as examples of spaghetti code. Is this just a classic case of “I don’t like this programming language, and I don’t know much about it,” or is there something I’m missing?
stingpie@lemmy.world to Programmer Humor@programming.dev • Mom can we have Scratch? We have scratch at home. Scratch at home: • 1 month ago
You could do this in basic ASCII, with only three defines: replace “_ ” with “{”, replace “_;” with “}”, and “_” with nothing. If your compiler processes macros in the correct order, it will become valid code. (You would use semicolons as the vertical lines.)
That’s not what I’m saying at all. What I’m trying to say is that I can’t think of any way a program working with numeric types could start outputting string types. I could maybe believe a calculator program that disables exceptions could do that, but even then, who would do that?
I refuse to believe the Python one ever happens. Unless you’re importing libraries you don’t understand and refuse to read the documentation for, I don’t see how a string could magically appear from numeric types.
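To be concrete, here’s a quick sketch of what I mean (my own toy example, plain CPython, no third-party libraries assumed): a stray string in numeric code fails loudly with a TypeError instead of quietly turning the result into a string.

```python
# Plain Python: mixing a stray string into numeric work raises a TypeError
# instead of silently producing a string result.
total = 0
for value in [1, 2.5, "3"]:   # "3" stands in for a stray string from bad input
    try:
        total += value
    except TypeError as err:
        print(f"refused to add {value!r}: {err}")

print(total)  # 3.5 -- the string never sneaks into the numeric result
```

You’d have to go out of your way (catch the TypeError and str() everything yourself) to end up with string output from numeric code.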
Anything that is Turing complete & has enough RAM can emulate x86, and an x86 emulator can boot Linux.
How can you tell this is AI? I don’t see any of the characteristic AI probabilistic blurs, and the reflections & caustics seem right.
stingpie@lemmy.world to Programmer Humor@programming.dev • An extremely crude comic about programming languages • 8 months ago
Yeah, I’m not a model for good programming. I don’t program professionally, I just like challenging myself in my hobby projects.
stingpie@lemmy.world to Programmer Humor@programming.dev • An extremely crude comic about programming languages • 8 months ago
No, I don’t do anything professionally. I just enjoy challenging myself.
stingpie@lemmy.world to Programmer Humor@programming.dev • An extremely crude comic about programming languages • 8 months ago
I am both the left guy and the right guy. If you can’t program without using a memory-safe language, it’s a skill issue. But I also don’t want to switch to Rust because I like the challenge of manual memory management. (Also, Rust’s syntax and semantics look like they were designed by a monkey attacking a typewriter.)
stingpie@lemmy.world to Programmer Humor@programming.dev • It’s official, Rust is an anti C/C++ elitist slur • 9 months ago
Rust is already obsolete, compared to Stingpie’s excellent assembly language, paired with object oriented programming!
This is the SEALPOOP specification:
- an assembler must create a binary program which satisfies the programmer’s specifications.
- a compiler must translate the programmer’s code into SEALPOOP’s parallel instruction set source, which should be fed into the SEALPOOP PISS assembler.
If C++/C were real languages for real programming, they’d enforce unreadability in the compiler.
No sane language designer would say “It is imperative that you write the most unreadable code possible” then write a compiler that says “oh your code doesn’t triple dereference pointers? lol lmao that rocks”
They have played you all for fools.
stingpie@lemmy.world to Programmer Humor@lemmy.ml • I am the Rust programmer, I will rewrite everything in Rust. • 1 year ago
Rust is the WORST programming “language.”
- it is against the natural order for a PROGRAM to tell the PROGRAMMER how to fix an error. Fixes should ONLY come from PROPHETIC DREAMS.
- obfuscation should be done for FUN by PROGRAMMERS to SCARE python programmers. It should NOT be a MANDATORY feature of a language.
- Memory leaks are a GIFT given to us by GOD. Programmers will ALWAYS PRAY TO GOD for SOLUTIONS as long as there are MEMORY LEAKS.
Recently, I’ve just given up trying to use CUDA for machine learning. Instead, I’ve been using (relatively) CPU-intensive activation functions & architecture to make up the difference. It hasn’t worked, but I can at least consistently inch forward.
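For a rough idea of what that looks like (a toy sketch of my own, plain NumPy with no CUDA anywhere; the shapes and the Mish activation are just illustrative choices):

```python
# A toy, CPU-only layer: NumPy for the math, Mish as a heavier-than-ReLU
# activation. Nothing here touches CUDA or a GPU.
import numpy as np

def mish(x):
    # Mish: x * tanh(softplus(x)). More arithmetic per element than ReLU,
    # but it runs fine on a CPU.
    return x * np.tanh(np.log1p(np.exp(x)))

rng = np.random.default_rng(0)
x = rng.standard_normal((64, 128))         # a fake batch of inputs
w = rng.standard_normal((128, 32)) * 0.1   # one toy layer's weights

hidden = mish(x @ w)                       # forward pass, entirely on CPU
print(hidden.shape)                        # (64, 32)
```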
stingpie@lemmy.world to Programmer Humor@lemmy.ml • Select all SoCs which can boot mainline linux. • 1 year ago
I’m not sure I understand your argument. Are you saying that the emulated processor executes instructions while the SoC doesn’t? Every instruction that goes to the x86 is broken down into several SoC instructions, which the SoC executes in order to emulate what an x86 would do. Saying that the emulated x86 is booting/running Linux but the SoC is not is like saying that computers can’t run Java code, they can only run the JVM.
stingpie@lemmy.world to Programmer Humor@lemmy.ml • Select all SoCs which can boot mainline linux. • 1 year ago
I respectfully disagree. The Turing machine is not doing any set-up before the emulated CPU begins execution, and all of the actual BIOS work is done by the emulated CPU.
stingpie@lemmy.world to Programmer Humor@lemmy.ml • Select all SoCs which can boot mainline linux. • 1 year ago
Emulated processors can do the same things as physical processors, including booting from disk.
stingpie@lemmy.world to Programmer Humor@lemmy.ml • Select all SoCs which can boot mainline linux. • 1 year ago
Yes. Any Turing-complete processor can perfectly emulate any other Turing-complete processor, whether it is x86, ARM, or RISC-V. Mainline Linux can then run on this emulated processor without modification.
stingpie@lemmy.world to Programmer Humor@lemmy.ml • Select all SoCs which can boot mainline linux. • 1 year ago
Anything that’s Turing complete, has enough RAM, and has a C compiler can run Linux. Theoretically, you could program a CPLD to run brainfuck and you could still run Linux.
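To make the layering concrete, here’s a deliberately tiny sketch of my own (not anything from the thread): Python hosting a brainfuck program. The machine underneath only ever executes Python, yet the brainfuck code really is running, in the same sense that an emulated x86 really is booting Linux even though the SoC only executes its own instructions.

```python
# A toy illustration: Python hosting a brainfuck program. The CPU underneath
# only runs Python, yet the brainfuck program genuinely executes, just like an
# emulated x86 genuinely boots Linux on top of a SoC.
def run_bf(code: str, tape_len: int = 30_000) -> None:
    tape, ptr, pc = [0] * tape_len, 0, 0
    jumps, stack = {}, []          # pre-match brackets so loop jumps are O(1)
    for i, ch in enumerate(code):
        if ch == "[":
            stack.append(i)
        elif ch == "]":
            j = stack.pop()
            jumps[i], jumps[j] = j, i
    while pc < len(code):
        ch = code[pc]
        if ch == ">": ptr += 1
        elif ch == "<": ptr -= 1
        elif ch == "+": tape[ptr] = (tape[ptr] + 1) % 256
        elif ch == "-": tape[ptr] = (tape[ptr] - 1) % 256
        elif ch == ".": print(chr(tape[ptr]), end="")
        elif ch == "[" and tape[ptr] == 0: pc = jumps[pc]
        elif ch == "]" and tape[ptr] != 0: pc = jumps[pc]
        pc += 1

run_bf("++++++++[>++++++++<-]>+.")  # prints "A"
```

Swap the brainfuck program for an x86 emulator with enough RAM behind it and you get the Linux-booting version of the same argument.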
Rust isn’t real. It’s just a bunch of abbreviated gibberish created by religious fanatics worshipping safe programming practices and reasonable error handling!
I don’t understand how not using a keyword to define a function causes the meaning to change depending on imports. I’ve never run into an issue like that before. Can you give an example?