• 0 Posts
  • 124 Comments
Joined 5 years ago
Cake day: October 2nd, 2020

  • (ok i see, you’re using the term CPU colloquially to refer to the processor. i know you obviously know the difference & that’s what you meant - i just mention the distinction for others who may not be aware.)

    ultimately op may not require exact monitoring, since they compared it to standard system monitors etc, which are ofc approximate as well. so the tools Eager Eagle listed in this comment may be sufficient for the general use op described?

    eg. these - the screenshots look pretty close to what i imagined op meant

    now onto your very cool idea of substantially improving the temporal resolution of memory bandwidth measurement…you’ve got me very interested :)

    my initial sense is that counting completed L3/L4 cache misses sourced from DRAM (and similar events) might be a lot easier - though as you point out, that will inevitably accumulate event counts within a given time interval rather than capture individual events. (rough sketch of what i mean below.)
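
    to make that concrete, here’s a rough, untested sketch in C using perf_event_open to count LLC misses on one core over a short window. the generic PERF_COUNT_HW_CACHE_MISSES event, the 64-byte line size and the 1 ms window are my assumptions - the DRAM-sourced events you’d actually want are vendor-specific uncore/raw events:

    ```c
    /* rough sketch (untested): count last-level cache misses on cpu 0 for ~1 ms,
     * then estimate bytes pulled from DRAM. assumptions: the generic
     * PERF_COUNT_HW_CACHE_MISSES event and a 64-byte cache line; real
     * DRAM-sourced events are vendor-specific raw/uncore events.
     * needs perf_event_paranoid <= 0 (or CAP_PERFMON) to monitor a whole cpu. */
    #define _GNU_SOURCE
    #include <stdio.h>
    #include <stdint.h>
    #include <string.h>
    #include <unistd.h>
    #include <sys/ioctl.h>
    #include <sys/syscall.h>
    #include <linux/perf_event.h>

    static long perf_event_open(struct perf_event_attr *attr, pid_t pid,
                                int cpu, int group_fd, unsigned long flags)
    {
        return syscall(SYS_perf_event_open, attr, pid, cpu, group_fd, flags);
    }

    int main(void)
    {
        struct perf_event_attr attr;
        memset(&attr, 0, sizeof(attr));
        attr.type = PERF_TYPE_HARDWARE;
        attr.size = sizeof(attr);
        attr.config = PERF_COUNT_HW_CACHE_MISSES; /* generic LLC-miss counter */
        attr.disabled = 1;

        /* any pid, cpu 0, no group, no flags */
        int fd = perf_event_open(&attr, -1, 0, -1, 0);
        if (fd < 0) { perror("perf_event_open"); return 1; }

        ioctl(fd, PERF_EVENT_IOC_RESET, 0);
        ioctl(fd, PERF_EVENT_IOC_ENABLE, 0);
        usleep(1000);                      /* ~1 ms measurement window */
        ioctl(fd, PERF_EVENT_IOC_DISABLE, 0);

        uint64_t misses = 0;
        if (read(fd, &misses, sizeof(misses)) != sizeof(misses)) {
            perror("read");
            return 1;
        }
        /* each miss pulls one cache line; 64 B is the usual line size */
        printf("LLC misses in ~1 ms on cpu0: %llu (~%llu KiB from DRAM)\n",
               (unsigned long long)misses,
               (unsigned long long)(misses * 64 / 1024));

        close(fd);
        return 0;
    }
    ```

    the point being it only tells you “n misses happened in this window”, nothing about when each one landed within the window - which is exactly the temporal resolution limit you’re describing.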

    i understand the role of parity bits in ECC memory, but i didn’t quite understand how & which ECC fields you would access, and how/where you would store those results with improved temporal resolution compared to event counts?

    would love to hear what your setup would look like? :) which ECC-specific masks would you monitor? where/how would you store/process such high resolution results without impacting the measurement itself?








  • edit: nvm i re-read what you wrote

    i agree it does mostly fulfill the criteria for libre software. perhaps not entirely in the same spirit as some other projects, but that is indeed a separate discussion.

    ~~how many communities are doing that right now? i suspect you may be drastically understating the barriers for that. but would be delighted to be proven wrong…~~


  • so many classics listed already, some others…

    • the one where a civilisation downloads their lives into picard. the whole ep you’re wondering wtf is up with picard, this is slow, as he lives some dude’s life - then the reveal is amazing, so full of positivity and warmth - appropriately titled “the inner light”. hats off for capturing positivity so well onscreen (for some reason an elusive skill)

    • when picard defends data’s rights as a sentient being



  • Thanks for the distinctions and links to the other good discussions you’ve started!

    For the invasive bits that are included, it’s easy enough for GrapheneOS to look over the incremental updates in Android and remove the bits that they don’t like.

    That’s my approximate take as well, but it wasn’t quite what I was getting at.

    What I meant was: we should ask ourselves why that is the case. A LOT of it is because google wills it to be so.

    Not only in terms of keeping it open, but also in terms of making it easy or difficult - it’s almost entirely up to google how easy or hard it’s going to be. Right now we’re all reasonably assuming they have no serious incentive to change their mind. After all, why would they? The minuscule % of users who go to the effort of installing privacy-enhanced versions of chromium (or an android-based os) are a tiny drop in the ocean compared to the vast majority running vanilla, who have probably never even heard of privacy-enhanced versions.




  • excellent writeup with some high quality referencing.

    minor quibble

    Firefox is insecure

    i’m not sure many people would disagree with you that FF is less secure than Chromium (hardly a surprise given the disparity in their budgets and resources)

    though i’m not sure it’s fair to say FF is insecure if we are by comparison implying Chromium is secure? ofc Chromium is more secure than FF, as your reference shows.


    another minor quibble

    projects like linux-libre and Libreboot are worse for security than their counterparts (see coreboot)

    does this read like coreboot is proprietary? isn’t it GPL2? i might’ve misunderstood something.


    you make some great points about open vs closed source vs proprietary etc. again, it shouldn’t surprise us that many proprietary projects, or Global 500-funded opensource projects, with considerably greater access to resources, often arrive at more robust solutions.

    i definitely agree you made a good case for the currently available community privacy-enhanced versions based on open source projects from highly commercial entities (Chromium->Vanadium, Android/Pixel->GrapheneOS) etc. something to note here, i think, is that without these base projects actually being opensource, i’m not sure eg. the graphene team would’ve been able to achieve their technical goals in the time they have - and legally they’d likely have had even less success.

    so in essence, in their current forms at least, we have to make some kind of compromise. choosing the thing we know is technically more robust means blindly trusting the organisation’s (likely malicious) incentives. therefore, as you identify, the best answer is obviously to privacy-enhance the project - which then involves some semi-blind trust in the extent of the privacy enhancement process. even assuming good faith in the organisation providing the privacy enhancement, there is still an implicit arms race, where privacy-corroding features might be implemented at various layers and degrees of opacity vs the inevitably less-resourced team trying to counter them.

    is there some additional semi-blind ‘faith’ we’re also employing, where we’re probably assuming the corporate entity currently has little financial incentive to undermine the opensource base project because they can simply bolt on whatever nastiness they want downstream? it’s probably not a bad assumption overall, though i often wonder how long that will remain the case.

    and ofc on the other hand, we have organisations whose motivation we supposedly trust (mostly…for now), but where we know we have to compromise on technical robustness. eg. while FF lags behind the latest hardening methods, it’s somewhat visible to the dedicated user where they stand from a technical perspective (it’s all documented, somewhere). so then the blind trust is in the purity of the organisation’s incentives, which is where i think the politically-motivated, wilfully-technically-ignorant mindset can sometimes step in. meanwhile mozilla’s credibility will likely continue to be gradually eroded, unless we as a community step up and fund them sufficiently. and even then, who knows.

    there’s certainly no clear single answer for every person’s use-case, and i think you did a great job delineating the different camps. just wanted to add some discussion. i doubt i’m as up to date on these facets as OP, so i welcome your thoughts.


    I’m sick of privacy being at odds with security

    fucking well said.


  • our sensory capabilities are probably better than you think

    however good our current capabilities are, it’s not exactly reasonable to think we’re at the apex. we don’t know everything - perhaps we never will, but even if we do it’ll surely be in 100, 1,000 or 10,000 years, rather than 10 years.

    i’m not aware of any sound argument that the final paradigm in sensing capability has already happened.

    there is really no scenario where this logic works

    assuming you mean there’s no known scenario where this logic works? then yes, that’s the point - we currently don’t know.

    this is asklemmy, not a scientific journal. there can be value or fun in throwing ideas around about the limits of what we do know, or in helping op improve their discussion, rather than shitting on it. afaict they’ve made clear elsewhere in this thread that they’re just throwing ideas around & not married to any of it.



  • everyone in here gleefully shitting on op (in a rather unfriendly fashion btw)

    getting hung up on the 1:99 thing, when what they actually said was

    As long as the percentage is not 100%

    obviously i’m not saying op has presented firm evidence of the supernatural. but there’s irony in supposedly espousing the scientific method while completely ignoring the critical part of op’s argument.

    who here is claiming to know that 100.000000% of all supernatural evidence is absolutely disproven? that would be an unscientific claim to make, so why imply it?

    is the remaining 10^-x % guaranteed “proof” of ghosts/aliens? imo no, but it isn’t unreasonable to consider that it may suggest something beyond our current reproducible measurement capacity (which has eg. historically been filed under “ghosts”). therefore the ridicule in this thread - rather than friendly/educational discussion - is quite disappointing.

    it’s not exactly reasonable to assume we’re at the apex of human sensory capability - history is full of this kind of misplaced hubris.

    until the invention of the microscope, germs were just “vibes” and “spirits”



  • imo

    Main Points

    1. most people (including most men) do not actually give a fuck.

    2. a tiny, insignificant group mumbling in a dark corner probably do care, but no one should give a shit or listen to them.

    3. instead their voice is amplified in social/legacy media as a typical divide-and-conquer tactic (men vs women is ‘powerful’ as it’s half the planet vs the other half).

    4. unoriginal drones parrot those amplifications because they’ll get angry about whatever their screens tell them to this week.

    5. society has leaned male-dominant for too long, so genuine efforts to be fair are perceived by some idiots (see #2,#4) as “unfair”.

    6. corporations don’t actually give a shit about equality, so their maliciously half-arsed pretense at fairness rings hollow, adding more fuel to the flames.

    Bonus

    If you want to know more about this, see the Bechdel test. Once you see it, you can’t unsee it - you’ll notice it everywhere you go:

    The test asks whether a work features at least two female characters who have a conversation about something other than a man.