is sort of essentially blockchain without the decentralized ledger part
So a [Merkle tree](https://en.wikipedia.org/wiki/Merkle_tree)?
Gentoo is the espresso you get when your coffee-obsessed friend with >$10k worth of barista equipment asks if you’d like a coffee. It’s the best damn thing you’ve ever tasted, but by the time your friend has finished preparing and all the settings are dialed in, it’s around midnight and you should have gone home hours ago.
Reading the blog post, it’s a lot more nuanced than that: someone reported a CVE, which was related to a possible int overflow in client code handling the timeout between requests. NVD chose to grade this as a 9.8/10 on their severity scale (for context, CVE-2014-0160, also known as Heartbleed, got a 7.5/10), which is ludicrous for a bug that could at most change the retry timeout of your request from the years you intended down to a few seconds. Daniel says that this is not a security vulnerability at all and has no business being listed in the CVE database, whereas NVD argues that it’s a bug, it’s been reported to them, and because overflows are undefined behavior anything can happen, so it’s a security vulnerability.
In the end, they agreed to at least adjust the severity down to a 3.3, but I can understand that Daniel is still somewhat miffed about it. Personally I also agree that it’s not really a security issue and that even a 3.3 is too high in terms of severity.
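For the curious, here’s a minimal sketch (in C++, with made-up variable names, not curl’s actual code) of the kind of overflow being argued about, just to show how converting an absurdly large user-supplied delay to milliseconds can wrap into something tiny:

```cpp
#include <cstdint>
#include <cstdlib>
#include <iostream>

int main(int argc, char** argv) {
    // Hypothetical illustration only: a user-supplied retry delay in seconds
    // gets converted to milliseconds. For a huge input the signed
    // multiplication overflows, which is undefined behavior; in practice the
    // value typically wraps to something far smaller (or negative) than what
    // the user asked for.
    std::int64_t delay_seconds =
        (argc > 1) ? std::strtoll(argv[1], nullptr, 10)
                   : 18446744073709551LL;          // roughly 584 million years

    std::int64_t delay_ms = delay_seconds * 1000;  // overflows std::int64_t

    std::cout << "requested seconds: " << delay_seconds << '\n'
              << "resulting ms:      " << delay_ms << '\n';
}
```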
Many debuggers (at least in the Java world, which is what I work with for a living) support more advanced features, like triggering a breakpoint only when a certain condition holds, or only on every Xth hit.
Also, if you try to debug using prints in the main game loop, wouldn’t that write so much to the console/log that it’s effectively unreadable?
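To illustrate both points with a completely made-up loop (names and numbers are mine, not from any real game): an unconditional print at ~60 FPS floods the log within seconds, while a guarded print, which is roughly what a conditional breakpoint gives you without touching the code, only fires for the interesting case:

```cpp
#include <chrono>
#include <iostream>
#include <thread>

int main() {
    double player_health = 100.0;  // pretend game state

    for (int frame = 0; ; ++frame) {
        player_health -= 0.05;  // pretend game logic

        // Unconditional print debugging: ~60 lines per second, unreadable fast.
        // std::cout << "frame=" << frame << " health=" << player_health << '\n';

        // Guarded print: roughly what a conditional breakpoint with the
        // condition "player_health < 10" does, except the breakpoint needs no
        // recompile and can also be set to trigger only every Nth hit.
        if (player_health < 10.0) {
            std::cout << "frame=" << frame << " health=" << player_health << '\n';
            break;
        }

        std::this_thread::sleep_for(std::chrono::milliseconds(16));  // ~60 FPS
    }
}
```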
I recently discovered Manic Miners, a remake of 1999’s Lego Rock Raiders, and ever since I’ve been busy reliving my childhood in 1080p. Now if only someone could remake Lego Racers 1&2…
Beyond that, I found out that the Steam release of Dwarf Fortress totally passed me by last year, and so I’ve been getting back into that and I keep marveling at the lovely graphics and the mouse control. I’m happy that I can support the creators this way after years of playing the game every once in a while. Still waiting for stuff like Dwarf Therapist, but for the first time I’m playing DF without tons of add-ons and it’s actually pretty neat. I’m looking forward to all the FUN I’ll be having! :P
Interestingly, the guy who made the referenced post, ‘avis’, is allegedly the new name of ‘birdie’, a well-known troll on the forums who was banned a while back. Basically everyone there agrees that it’s him, yet no action has been taken against the new account.
Especially when the original article is about anything related to Rust. An hour after the article is live you’ll have 50 posts arguing and trolling like there’s nothing more important in the whole wide world. So entertaining!
Anyone expecting to use Linux the same way they are using Windows, without any changes, is going to be disappointed. You cannot reasonably expect to keep the same learned workflows from one system and use them on a completely different system without having to at least tweak some of it.
Learning is part of such switchovers, and loudly complaining that “Thing X is not working like I know it to, this is why people don’t like Linux” is not making anyone more likely to help you, nor is it going to solve your problem. I’m glad that you managed to find a way to do what you need in any case, and maybe that command will stick around in the back of your head for when you need something similar sometime in the future :)
It was underpowered when the Switch released, yes, but I’d wager that it was a good choice for the application when Nintendo started designing the Switch. Couple that with the (not unreasonable IMO) expectation that there would be successors to the X1 that they could hypothetically put into the Switch to release a higher-perf revision with minimal changes, and I can see why they chose it. Unfortunately, Nvidia dropped the X1 line and that (again, purely speculative) scenario never manifested.
The heavy stuff would be things like shader compilation and state management for multiple different graphics APIs (OpenGL and Vulkan mostly).
AFAIK Linux graphics drivers are usually separated into a userspace and a kernel-space component, like amdgpu on the kernel side and RADV/RadeonSI within Mesa on the userspace side.
So you do not need a full reboot to benefit from e.g. performance optimizations within Mesa, things like faster shader compilation or more efficient draw call submission, which I think is what most people care about when doing driver updates. In fact you don’t even need to soft reboot: once Mesa is updated, all following uses of it already run the new version, no reboot required. However, if your GPU is not yet supported by the kernel side, then Mesa is of no use to you.
That being said, yes, the kernel side is a very important part of the driver, but it’s such a low-level component that very few people would be able to do much of anything with it on its own, which is why I made that distinction.
Yes they do, Mesa being one. Only the close-to-the-metal stuff and kernel DRM are handled in kernel space; most of the heavy stuff is done in user space.
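As a rough sketch of that split (this is not how Mesa is structured internally, and the device path will differ per system): the kernel half is reached through ioctls on a DRM device node, while the userspace half is just a library loaded into your process, which is also why a Mesa update takes effect for newly started programs without a reboot. Assuming libdrm is installed, something like this queries the kernel-side driver:

```cpp
// Rough illustration of the kernel/userspace split discussed above.
// Requires libdrm (compile with e.g. -I/usr/include/libdrm -ldrm).
#include <fcntl.h>
#include <unistd.h>
#include <xf86drm.h>  // thin userspace wrapper around the kernel DRM ioctls

#include <iostream>

int main() {
    // Kernel side: the DRM driver (amdgpu, i915, nouveau, ...) sits behind a
    // device node; the exact path may differ on your system.
    int fd = open("/dev/dri/renderD128", O_RDWR);
    if (fd < 0) {
        std::cerr << "could not open DRM render node\n";
        return 1;
    }

    if (drmVersionPtr ver = drmGetVersion(fd)) {  // ioctl into the kernel driver
        std::cout << "kernel driver: " << ver->name << " "
                  << ver->version_major << "." << ver->version_minor << "."
                  << ver->version_patchlevel << '\n';
        drmFreeVersion(ver);
    }
    close(fd);

    // Userspace side: shader compilation, state tracking etc. happen in the
    // GL/Vulkan library (Mesa) loaded into the application process itself.
    return 0;
}
```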
To be fair™ they did at least do a little bit to deal with the existing answers becoming obsolete by changing the default answer sorting. The “new” (it’s already been at least a year IIRC) sorting pushes down older answers and allows newer answers to rise to the top with fewer votes. That still doesn’t fix the issue that the accepted answer likely won’t change as new ways of doing things become standard, but at least it’s a step in the right direction.
I think rsync is short for remote sync.
Yes they’re usually called “<fontfamily> Display”. IIRC Display variants are optimized to be used on digital displays (usually on the web), where a lower resolution (72ish DPI) than printing (~300 DPI) is quite common.
lol no, it’s used almost everywhere performance is important and people want(ed) OOP, from tiny projects to web browsers (Chrome, Firefox) to game engines (Unreal, CryEngine). Many of these are hugely complex and do encounter segfaults on a somewhat frequent basis.
Saying C++ is mostly used for embedded applications is like saying C# is mostly used for scripting games, i.e. it doesn’t nearly cover all the use cases.
This depends on your definition of “higher-level”, but many people would argue that C++ is on a similar level to Java or C# in terms of abstraction. The latter two do, however, have a garbage collector, which vastly simplifies memory management for the programmer (generally, anyway).
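As a small illustration of that last point (nothing here is tied to any of the projects mentioned above): modern C++ offers abstractions comparable to Java/C#, but object lifetimes are handled deterministically via RAII and smart pointers instead of a garbage collector:

```cpp
#include <algorithm>
#include <iostream>
#include <memory>
#include <string>
#include <vector>

struct Player {
    std::string name;
    int score = 0;
};

int main() {
    // High-level abstractions much like Java/C#: containers, algorithms, lambdas.
    std::vector<int> scores{3, 1, 4, 1, 5};
    std::sort(scores.begin(), scores.end(), [](int a, int b) { return a > b; });

    // But no garbage collector: the unique_ptr releases the Player
    // deterministically when it goes out of scope at the end of main().
    auto player = std::make_unique<Player>();
    player->name = "Urist";
    player->score = scores.front();

    std::cout << player->name << " scored " << player->score << '\n';
}
```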