Bevy, cause I’m a sucker for Rust
This is a use-after-free, which should be impossible in safe Rust thanks to the borrow checker. The only ways for it to happen are incorrect unsafe code (still possible, but a dramatically reduced code surface to worry about) or a compiler bug. To allocate heap space in safe Rust, you have to use types provided by the language like `Box`, `Rc`, `Vec`, etc. To free that space (in Rust terminology, dropping it, either by calling `drop()` or by letting it go out of scope) you must be the owner of it, and there may be no outstanding borrows (i.e. no references may exist). Once the variable is dropped, it is dead, so accessing it is a compile error, and the compiler/std handles freeing the memory.
There are some extra semantics to some of that, but that’s pretty much it. These kinds of memory bugs are basically Rust’s raison d’être - it’s been carefully designed to make most memory bugs impossible without using `unsafe`. If you’d like more information I’d be happy to provide!
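To make that concrete, here’s a tiny sketch of the kind of thing the compiler rejects (the exact error codes and messages are what current rustc prints and may differ slightly between versions):

```rust
fn main() {
    // Heap allocation through a safe standard-library type.
    let data = Box::new(42);
    let borrowed = &data;

    // Dropping (i.e. moving) `data` while it's still borrowed is rejected:
    // drop(data); // error[E0505]: cannot move out of `data` because it is borrowed
    println!("{}", borrowed);

    // Once no borrows remain, dropping is fine...
    drop(data);

    // ...but using the variable afterwards is a compile-time error,
    // not a runtime use-after-free:
    // println!("{}", data); // error[E0382]: borrow of moved value: `data`
}
```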
That’s the point. Malicious compliance.
I made the choice to use GTK for a Rust project years ago - before a lot of the more Rust-friendly frameworks were around - and this is exactly why I chose it. Nothing to do with DEs or any of that, just looking for a better coding experience. Now I’d probably choose one of the several Rust-focused solutions that have popped up though.
It’s a way to detect which way the stick is pointing using magnets. It’s way more accurate and incredibly reliable.
Yep! Just need faster internet so I can share with more friends 😭
Same here. And especially for watch parties Jellyfin has been great.
I’d be interested in setting up the highest-quality models to run locally. I don’t have the budget for a GPU with anywhere near enough VRAM, but my main server PC has a 7900x and I could afford to upgrade its RAM - is it possible, and if so how difficult, to get this stuff running on CPU? Inference speed isn’t a sticking point as long as it’s not unusably slow, but I do have access to an OpenAI subscription, so there wouldn’t be much point in lower-quality models except as a toy.