TL;DR: for stuff that is NOT from sonarr/radarr (e.g. downloaded a long time ago / gotten from friends, RSS feeds, whatever), is there a better way to find subs than downloading everything from manual DDL sites and trying each one until something works (matching English text and correctly synced)?

I am not currently using Bazarr, and I understand that it can catch anything from Sonarr that is missing subs, but that is not the use-case I need. I'm still open to it, but since most of the new stuff I get already has subs, I'm looking more at my stuff that is NOT coming from Sonarr, bc that's where I have the most missing subs. Thinking, since their GitHub says:

"Be aware that Bazarr doesn't scan disk to detect series and movies: It only takes care of the series and movies that are indexed in Sonarr and Radarr."

that most of my use-case is going to be manual searches. It also sounds like Bazarr uses the same kind of DDL sites (OpenSubtitles, Subscene) that I am already using as its backend/source, so I'm curious whether there is any advantage vs looking up old stuff on the sites directly.

And especially whether there is some way to match existing files with the correct subs, even if the file/folder names no longer contain the release group (e.g. via duration or other mediainfo data, or maybe even via checksums). I know VLC can do it for a single file… but since I have a LOT of stuff with missing subs, I'm looking for a way to do something similar from a bash script or some other bulk job without getting a bunch of unsynced subs.

  • Meuzzin@lemmy.world
    11 months ago

Jellyfin has a couple of subtitling features and plugins. It'll write subs as it streams, and if the metadata is still intact, it'll automatically download subs.

    • BlackFlagsForever@lemmy.dbzer0.comOP
      11 months ago

was hoping to keep it more lightweight and not bring in a media server, but I guess if I'm having this much of a pain doing things the old-fashioned way, it's still an idea worth trying, so thanks.

as far as metadata goes, any clue what it looks for?

asking cuz my collection is a hodgepodge of a bunch of different sources. Most of the stuff that is missing subs is a mix of TV shows and movies that came from either:

      • makemkv rips and OTA recordings from a few buddies
      • older tv releases that came from public tracker sites
      • ??? no fucking clue, maybe i ddl’ed it years ago? not sure

I was just poking around with mediainfo on a few movies I'm currently looking for subs for, and I see that some of the downloaded ones appear to still have the original release name in the Movie name field (including the release group). The OTA rips I kinda feel like I'm probably fucked on, bc they aren't even gonna match a standard duration, but I will check it out.
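If the release name did survive in the container metadata, that check can be done in bulk. A rough sketch, assuming `mediainfo --Output=JSON` is on PATH and that the "Movie name" field appears under the General track's `Movie` key in the JSON output (the key name is an assumption; it can vary between mediainfo versions, so check one file by hand first):

```python
import json
import subprocess
from pathlib import Path


def movie_name_from_json(text):
    """Pull the 'Movie name' field out of mediainfo's JSON output.

    Looks at the General track; returns None if the field is absent.
    (Key assumed to be "Movie" -- verify against your mediainfo version.)
    """
    tracks = json.loads(text).get("media", {}).get("track", [])
    for track in tracks:
        if track.get("@type") == "General":
            return track.get("Movie")
    return None


def scan_for_release_names(root, pattern="*.mkv"):
    """Walk a directory and report files whose metadata kept a release name."""
    for path in Path(root).rglob(pattern):
        out = subprocess.run(
            ["mediainfo", "--Output=JSON", str(path)],
            capture_output=True, text=True,
        ).stdout
        name = movie_name_from_json(out)
        if name:
            print(f"{path}: {name}")
```

Files that print a release name can then be searched against the sub sites by that exact name instead of the renamed filename.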

      • Meuzzin@lemmy.world
        11 months ago

If the video was ripped and prepared as a scene release, it'll download the specific subs for that release using the metadata (assuming they added it when released). If not, I haven't run into a single issue using Jellyfin's OpenSubtitles plugin to grab a generic subtitle file for the movie/show when there is no scene info. It's always lined up well.

You don't really need a very powerful server to run Jellyfin. Most NAS hardware, or a Raspberry Pi 3+, handles it just fine. I ran it on a Raspi 3b for several years.

Jellyfin's own "on the fly" subtitle writing works fine too…