Mossy Feathers (They/Them)


  • 0 Posts
  • 177 Comments
Joined 1 year ago
Cake day: July 20th, 2023


  • My biggest complaint about Sims-likes is that the visual style always looks too serious. It gives me the feeling that whatever I’m going to do with my not-Sims, it’s gonna be something that makes me regret my real life.

    You wanna know what I did the last time I played the Sims 2, though? I repeatedly held parties at my Sim’s house and then lured the guests into a room they couldn’t get out of. I also used the moveobjects cheat to collect police cars whenever a cop showed up to shut the party down. By the time I was done, I had amassed around 70 urns, many hysterical immortal Sims (Sims with households can’t die while visiting someone’s house in the Sims 2), 4 police cars and a fire truck.

    The Sims has a mischievous air to it that tickles the devil on your shoulder and begs you to listen to them. None of the Sims-likes I’m aware of seem to have the same air.

    Edit: now I want to play the Sims again.




  • RollerCoaster Tycoon 1 and 2; Need for Speed 2 and 3; SimCity 3k.

    Also, check your monitor properties. Afaik most CRT monitors (not TVs; those run at 60Hz/50Hz depending on region) are meant to run at 75~85Hz. If it’s running at 60Hz when it’s meant to run at a higher refresh rate, that might be why it’s nauseating (my CRT has a very noticeable flicker at 60Hz, but it goes away at 75Hz).

    Edit: to expand on this for any late-comers: CRTs work by using an electron gun (aka particle accelerator, aka a motherfucking PARTICLE CANNON) to fire an electron beam at red, green and blue phosphors. When the beam hits a phosphor, the phosphor emits light in its color. The beam sweeps over the phosphors at a speed dictated by the display’s refresh rate, illuminating them one-by-one until it has covered the entire screen. This is why taking a picture or video of a CRT requires you to sync your shutter speed with the CRT; if your shutter isn’t synced, the monitor will appear to be strobing or flickering (because it is, just very, very quickly).

    These phosphors have a set glow duration, which varies based on the intended refresh rate. A refresh rate that is too low will let the phosphors dim before the beam comes back around to re-illuminate them, while a refresh rate that’s too high can cause ghosting, smearing, etc., because the phosphors haven’t had a chance to “cool off”. TVs are designed to run at 60Hz/50Hz, depending on the region, so their phosphors have a longer glow duration to eliminate flickering at that refresh rate. Computer monitors, on the other hand, were high-quality tubes typically geared for 75Hz and up. The result is that if you run them at 60Hz, you’ll get flickering, because their phosphors have a shorter glow duration than a TV’s.

    Note: this is a place where LCD/LED panels solidly beat CRTs, because they can refresh the image without de-illuminating the panel, avoiding flicker at low refresh rates.
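
    To make that concrete, here’s a minimal sketch of the flicker effect, assuming a simple exponential-decay model of phosphor glow. The persistence constants are made-up round numbers picked to show the trend, not measured values for any real tube:

    ```python
    import math

    # Toy phosphor model: brightness decays exponentially after the beam
    # passes, then gets re-excited on the next sweep. Persistence values
    # are illustrative assumptions, not measurements.

    def brightness_dip(refresh_hz: float, persistence_ms: float) -> float:
        """Fraction of peak brightness lost just before the next sweep."""
        frame_ms = 1000.0 / refresh_hz
        return 1.0 - math.exp(-frame_ms / persistence_ms)

    TV_PERSISTENCE_MS = 25.0      # longer-glow TV phosphor (assumed)
    MONITOR_PERSISTENCE_MS = 8.0  # shorter-glow monitor phosphor (assumed)

    for hz in (50, 60, 75, 85):
        print(f"{hz} Hz: TV dips {brightness_dip(hz, TV_PERSISTENCE_MS):.0%}, "
              f"monitor dips {brightness_dip(hz, MONITOR_PERSISTENCE_MS):.0%}")
    ```

    With these toy numbers, the monitor tube loses most of its glow between 60Hz sweeps while the TV tube doesn’t, and pushing the monitor to 75~85Hz is what closes the gap. Real flicker perception also depends on how your eyes integrate the light, but the direction of the effect is the point.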

    Edit 2: oh! Also, use game consoles with CRT TVs, not computer monitors. Old consoles, especially pre-3D consoles, “cheated” on sprites and took advantage of the blur of a standard CRT TV signal to blend pixels. The result is that you may actually lose detail if you play them on a CRT computer monitor or a modern display. That’s why a lot of older sprite-based games unironically look better on a real CRT TV or with a decent CRT emulator video filter (toy example below).
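
    If you’re curious what those CRT filters are approximating, here’s a toy sketch in Python/NumPy: a simple 1-2-1 horizontal blur (standing in for the smear of a real TV signal) plus scanline darkening. It’s an illustration of the blending idea, not any particular emulator’s shader:

    ```python
    import numpy as np

    def crt_ish(frame: np.ndarray) -> np.ndarray:
        """Apply a crude CRT-TV look to an (H, W, 3) float image in [0, 1]."""
        # Horizontal blend: each pixel mixes with its neighbors, the way a
        # low-bandwidth TV signal smears adjacent pixels together.
        # (np.roll wraps at the edges, which is fine for a demo.)
        blurred = (np.roll(frame, 1, axis=1) + 2 * frame
                   + np.roll(frame, -1, axis=1)) / 4
        # Darken every other row to mimic visible scanlines.
        blurred[1::2] *= 0.7
        return blurred

    # Alternating 1-pixel columns of pure red and pure blue...
    dither = np.zeros((4, 4, 3))
    dither[:, ::2, 0] = 1.0   # red columns
    dither[:, 1::2, 2] = 1.0  # blue columns

    # ...come out as a purple-ish mix: a color the console never output.
    print(crt_ish(dither)[0, 1])  # -> [0.5 0.  0.5]
    ```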


  • I’m… honestly kinda okay with it crashing. It’d suck, because AI has a lot of potential outside of generative tasks, like science and medicine. However, we don’t really have the corporate ethics or morals for it, nor do we have the economic structure for it.

    AI at our current stage is guaranteed to cause problems even when used responsibly, because its entire goal is to do human tasks better than a human can. No matter how hard you try to avoid it, even if you do your best to think carefully and hire humans whenever possible, AI will end up replacing human jobs. What’s the point in hiring a bunch of people with a hyper-specialized understanding of a specific scientific field if an AI can do their work faster and better? If I’m not mistaken, normally having some form of hyper-specialization would be advantageous for the scientist because it means they can demand more for their expertise (so long as it’s paired with a general understanding of other fields).

    However, if you have to choose between 5 hyper-specialized and potentially expensive human scientists, or an AI designed to do the hyper-specialized task with 2~3 human generalists to design the input and interpret the output, which do you go with?

    So long as the output is the same or similar, the no-brainer would be to go with the 2~3 generalists and AI; it would require less funding and possibly less equipment - and that’s ignoring that, from what I’ve seen, AI tends to be better than human scientists in hyper-specialized tasks (though you still need scientists to design the input and parse the output). As such, you’re basically guaranteed to replace humans with AI.

    We just don’t have the society for that. We should be moving in that direction, but we’re not even close to being there yet. So, again, as much potential as AI has, I’m kinda okay if it crashes. There aren’t enough people who possess a brain capable of handling an AI-dominated world yet. There are too many people who see things like money, government, economics, etc as some kind of magical force of nature and not as human-made systems which only exist because we let them.








  • You’re the one contradicting yourself when you’re saying that linux requires a Translation layer. And the translations are not always 1:1. Please show me the benchmarks.

    How is this a contradiction? It seems like it’d be the opposite. Translation layers reduce performance as they translate programs from one system to another, so the fact that Linux can run games through a translation layer and still get performance as good as, or better than, Windows means that Linux is fast enough to make up for the translation penalty.
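
    To put rough numbers on that inference (all figures assumed for illustration; the 10% layer cost is a stand-in, not a measurement):

    ```python
    # If a game hits the same FPS through a translation layer as it does
    # natively on Windows, the rest of the stack must be absorbing the
    # layer's cost. Every number here is an assumption for illustration.
    windows_fps = 100.0  # native performance on Windows (assumed)
    proton_fps = 100.0   # same game via a translation layer on Linux (assumed)
    layer_cost = 0.10    # assumed fraction of performance lost to translation

    implied_native_linux_fps = proton_fps / (1.0 - layer_cost)
    print(f"Windows native: {windows_fps:.0f} FPS, "
          f"implied native Linux: {implied_native_linux_fps:.0f} FPS")
    # -> ~111 FPS: matching Windows *through* a 10% penalty implies the
    #    underlying Linux stack is ~11% faster, not slower.
    ```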

    Regardless, here are some benchmarks.

    From 2019, Windows 10 vs Pop!_OS:

    https://www.forbes.com/sites/jasonevangelho/2019/07/17/these-windows-10-vs-pop-os-benchmarks-reveal-a-surprising-truth-about-linux-gaming-performance/?sh=6035a5e65e74

    While these are all in 1080p, several are also running in translation layers. The ones running native were faster on Linux, while the ones running in Proton achieved roughly the same performance. This was also 4~5 years ago, and Proton has improved a lot since. Additionally, these were run on an Nvidia card using the proprietary drivers, and Linux is known to be AMD-biased.

    So here’s another one from a couple years ago with Windows 11 vs Manjaro (benchmark totals for 4k, 1440p and 1080p at the end): https://m.youtube.com/watch?v=xwmNLqJL7Zo

    While they found that games tended to perform better on Windows at 4k, they also found that 1440p results were roughly the same, while 1080p averaged faster on Linux despite running a mix of Proton, Proton-GE, and Wine. This is also a couple of years old though, and while the average might be better on Linux, there were some pretty significant performance gaps at the top and bottom of the chart.

    Here’s a third one from about 6 months ago. This was pretty highly circulated on Lemmy, so I’m surprised you didn’t see it, but here it is:

    https://discuss.tchncs.de/post/5340976

    They claim to have seen an average 17% improvement on the games they benchmarked, and included a video of the benchmarks. There was a later benchmark where they claimed they got +20% performance using a tweaked version of Garuda Linux, but that required user tweaks and I’m mainly concerned with “un-tweaked” performance.


    Linux isn’t perfect, and if you want to play games with no hassle, then Windows is probably still your best bet. However, in situations where you’re trying to squeeze as much performance as you can out of an underpowered device, Linux seems like the obvious choice. Standardized hardware lets you spend the time and effort to iron out bugs and deficiencies with far fewer edge cases than non-standardized hardware would give you. I think that’s why SteamOS on the Steam Deck is so good: it runs on standardized hardware, so it’s easy for Valve to configure and optimize for user-friendliness without worrying about ten billion different hardware configurations.

    Also, as a side note, I’ve found that older games just run better on Linux. They ironically tend to be way less of a hassle to get working. It’s because Wine (and I think Proton/Proton-GE) has compatibility for 16-bit programs, while 64-bit Windows doesn’t. On Windows you have to run a virtual machine with Windows XP or earlier to run 16-bit programs, and I’ve found that to be a mess.

    Seriously, I cannot get a Windows 98 virtual machine set up on Windows 10 to save my life. It just won’t install properly on software like VMware, and I’ve had to resort to actual PC emulators (which are slow as fuck) to get 16-bit games running on a modern Windows PC. I’ve read it has something to do with AMD CPUs? I don’t know what the specific issue is, just that it supposedly works fine on Intel but not AMD. I haven’t run into that mess on Linux, however.
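
    As an aside, if you want to check whether an old .exe is actually a 16-bit binary (and therefore needs Wine or a VM rather than 64-bit Windows), the file header tells you. A quick sketch following the standard MZ/NE/PE header layout:

    ```python
    import struct
    import sys

    def exe_kind(path: str) -> str:
        """Classify an .exe by its header: DOS, 16-bit NE, or 32/64-bit PE."""
        with open(path, "rb") as f:
            data = f.read()
        if data[:2] != b"MZ":
            return "not an MZ executable"
        if len(data) < 0x40:
            return "plain DOS executable"
        # e_lfanew at offset 0x3C points to the post-DOS header, if any
        off = struct.unpack_from("<I", data, 0x3C)[0]
        if off == 0 or off + 2 > len(data):
            return "plain DOS executable"
        sig = data[off:off + 2]
        if sig == b"NE":
            return "16-bit Windows (NE): needs Wine or a VM on 64-bit Windows"
        if sig == b"PE":
            return "32/64-bit Windows (PE)"
        return "plain DOS executable (or another format)"

    if __name__ == "__main__":
        print(exe_kind(sys.argv[1]))
    ```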

    Edit: as an amusing side-side note, I’m old enough that a number of my favorite games from when I was growing up can no longer run on Windows because they require a 16-bit OS (or a 32-bit OS with 16-bit compatibility). Despite that, my grandfather’s Hoyle card game, which is older than I am, still somehow runs flawlessly on Windows 10. What the fuck?


  • Many of the games are made to be run on windows, windows is still a effecient os, it’s just a lot of bloat, which can be disabled.

    A) As someone else pointed out, “bloat” and “efficient” are mutually exclusive. You can argue that Windows is efficient in some areas and bloated in others, but “bloated” and “efficient” are contradictory when applied generally.

    B) Yes, most, if not all of it, can be disabled through registry edits and third-party hacks. However, in my experience, the more you try to debloat Windows, the less stable it gets, and the instability sticks around even after updates reinstall/re-enable the bloat you removed. Then you get to disable everything all over again, which destabilizes it further. Granted, this was with Windows 10, but I imagine the same is more or less true for Windows 11.

    Also a lot of optimizations in nt has been done for gaming, features which are missing in the linux kernel, but there are RFCs to add nt like synchronization primitives, in the linux kernel.

    C) And yet, iirc, recent Linux vs Windows 11 benchmarks show Windows games running on Linux via Proton/Proton-GE anywhere from slightly slower to slightly faster than on Windows, despite the translation layer, while Linux-native games typically run faster than their Windows counterparts.

    Windows is just that bloated.




  • A typing game like Mario Teaches Typing or Typing of the Dead except all the sentences are ad slogans or brand names.

    Emergency phone lines have ads at the beginning of the call to help pay for emergency services (because the government won’t pay for them).

    Revoke the regulations that require disclaimers on paid endorsements (in other words, you have no idea whether someone is endorsing a product because they like it or because they were paid to talk about it).

    Digital piracy is now a felony on par with drug felonies.

    Ad blocking is now digital piracy.

    Copyright is now indefinite, applied retroactively. An agency is formed to pursue copyright infringement on behalf of deceased rights holders and defunct companies.

    Criticism is no longer considered free speech if it leads to direct or indirect economic damage (“your rights end where mine begin!”).

    Referencing or speaking about a copy-protected work in-depth constitutes copyright infringement. However, enforcement is up to the rights holder except in the case of deceased individuals or defunct companies.

    The last three may seem tangential, but together they mean companies can take action against you for talking negatively about their advertisements and products, regardless of how old those are. Now companies like Disney can use copyright to permanently erase things like Song of the South or Walt Disney’s Nazi boner.

    Advertising is allowed on voter ballots (the voting process can be expensive after all).

    Politicians must publicly endorse companies which endorse them (it’s only fair). Failing to do so is considered a form of ad blocking.

    Public schools may include advertisements in their curriculum to augment teacher salaries. There are no restrictions on how many advertisements are presented, how they are presented, or the extent of their presentation. Choosing not to present an advertisement that is part of the curriculum is considered a form of ad blocking. “You have to pay teachers somehow, and I’ll be damned if it comes out of my pocket.”

    I could probably come up with more, but this is making me depressed.