• 0 Posts
  • 7 Comments
Joined 1 year ago
Cake day: June 24th, 2023

  • itsnotlupus@lemmy.world to Linux@lemmy.ml · “raw man files?”
    27 points · edited · 1 year ago

    You can list every man page installed on your system with man -k . , or just apropos .
    But that’s a lot of random junk. If you only want “executable programs or shell commands”, only grab man pages in section 1 with apropos -s 1 .

    You can get the path of a man page by using whereis -m pwd (replace pwd with your page name.)

    You can convert a man page to html with man2html (may require apt install man2html or whatever equivalent applies to your distro.)
    That tool adds a couple of useless lines at the beginning of each file, so we’ll want to pipe its output into a | tail -n +3 to get rid of them.

    Combine all of these together in a questionable incantation, and you might end up with something like this:

    mkdir -p tmp ; cd tmp
    apropos -s 1 . | cut -d' ' -f1 | while read page; do whereis -m "$page" ; done | while read id path rest; do man2html "$path" | tail -n +3 > "${id::-1}.html"; done
    

    List every command in section 1, extract the id only. For each one, get a file path. For each id and file path (ignore the rest), convert to html and save it as a file named $id.html.
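
    The trickiest part is the word-splitting in the second loop, so here’s a minimal sketch of just that step, with a hard-coded whereis-style line standing in for a live whereis call. It uses ${id%:} to strip the trailing colon, which is the POSIX-portable equivalent of bash’s ${id::-1}:

    ```shell
    # Simulated line of `whereis -m` output ("name: /path"); in the real
    # pipeline this comes from the apropos/whereis loop above.
    echo "pwd: /usr/share/man/man1/pwd.1.gz" | while read id path rest; do
      # ${id%:} strips the trailing colon whereis appends to the name
      # (same effect as ${id::-1} in bash)
      echo "${id%:} -> $path"
    done
    ```

    read splits on whitespace, so id gets the name, path gets the first file, and rest swallows any extra paths whereis found.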

    It might take a little while to run, but then you could run firefox . or whatever and browse the resulting mess.

    Or keep tweaking all of this until it’s just right for you.



  • Running strange software grabbed from unknown sources will never not be a risky proposition.

    Uploading the .exe you just grabbed to virustotal and getting the all-clear can indicate two very different things: it’s either actually safe, or it hasn’t yet been detected as malware.

    You should expect that malware writers have already uploaded some variant of their work to virustotal before seeding it, to ensure maximum impact.
    Getting happy results from virustotal could simply mean the malware author tweaked their work until they saw those same results.

    Notice I said “yet” above. Malware tends to eventually get flagged as such, even when it has a head start of not being recognized correctly.
    You can use that to somewhat lower the odds of getting infected, by waiting. Don’t grab the latest crack that just dropped for the hottest game or whatever.
    Wait a few weeks. Let other people get infected first, and let antivirus DBs learn to recognize the new malware. Then maybe give it a shot.

    And of course, the notion that keygens are often flagged as “bad” software by unhelpful antiviruses just further muddies the waters, since it teaches you to ignore or altogether disable your antivirus in one of the riskiest situations you’ll ever put yourself in.

    Let’s be clear: There’s nothing safe about any of this, and if you do this on a computer that has access to anything you wouldn’t want to lose, you are living dangerously indeed.


  • There are a near infinity of those out there, many of which just grab other scanlation groups’ output and slap their ads on top of it.

    Mangadex is generally my happy place, but you’ll have to wander out and about for various specific mangas.

    Several of the groups that post on Mangadex also have their own website and you may find more stuff there.

    For example right now I’ve landed on asurascans.com, which has a bunch of Korean and Chinese long strips, with generally good quality translations.

    The usual sticking point with all those manga sites is the ability to track where you are in a series and continue where you left off when new chapters are posted.
    Even Mangadex struggles with that: their “Updates” page is the closest thing they have, and it’s still not very good.

    If you’re going to stick to one site for any length of time, and you happen to be comfortable with userscripts, I’d suggest you head over to greasyfork.org, search for the manga domain you’re using, and look for scripts that might improve your binging experience there.


  • One of my guilty pleasures is to rewrite trivial functions to be statement-free.

    Since I’d be too self-conscious to put those in a PR, I keep those mostly to myself.

    For example, here’s an XPath wrapper:

    const $$$ = (q,d=document,x=d.evaluate(q,d),a=[],n=x.iterateNext()) => n ? (a.push(n), $$$(q,d,x,a)) : a;
    

    Which you can use as $$$("//*[contains(@class, 'post-')]//*[text()[contains(.,'fedilink')]]/../../..") to get an array of matching nodes.

    If I was paid to write this, it’d probably look like this instead:

    function queryAllXPath(query, doc = document) {
        const array = [];
        const result = doc.evaluate(query, doc);
        let node = result.iterateNext();
        while (node) {
            array.push(node);
            node = result.iterateNext();
        }
        return array;
    }
    

    Seriously boring stuff.

    Anyway, since var/let/const are statements, I have no choice but to use optional parameters instead, and since loops are statements as well, recursion saves the day.
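
    To see the same trick in a form that runs outside the browser, here’s a made-up analogue (the name drain is just for illustration): it collects any iterator into an array using only default parameters in place of let/const and recursion in place of a while loop.

    ```javascript
    // Default parameters hold what would otherwise be local declarations:
    // `a` is the accumulator, `n` is the next iterator result.
    // Recursion replaces the loop; the comma operator sequences the push.
    const drain = (it, a = [], n = it.next()) =>
        n.done ? a : (a.push(n.value), drain(it, a));

    // Usage:
    drain([10, 20, 30][Symbol.iterator]());  // → [10, 20, 30]
    ```

    Same shape as $$$ above: every recursive call passes it and a explicitly, so only the first call evaluates the defaults.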

    Would my quality of life improve if the lambda body could be written as => if n then a.push(n), $$$(q,d,x,a) else a ? Obviously, yes.