• circuscritic@lemmy.ca · 32 upvotes · edited · 5 months ago

    Awesome. Truly spectacular.

    Generative AI is so energy intensive ($$$) that Google is requiring users to subscribe to Gemini.

    Google is entirely dependent on advertising sales. Ad revenue subsidizes literally everything else, from Android development to whichever 8-12 products and services they launch and subsequently cancel each year.

    Now, Google wants to remove web results and make generative AI, rather than search, its default user interface.

    So, like I said: Awesome.

    • pup_atlas@pawb.social · 12 upvotes · edited · 5 months ago

      While I agree in principle, one thing I’d like to clarify is that TRAINING is super energy intensive; once the network is trained, it’s more or less static. Actually using the network doesn’t take dramatically more energy than any other indexed database lookup.
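
      A rough back-of-envelope comparison can make the training-vs-inference gap concrete. The sketch below uses a common rule of thumb from the scaling-law literature (training ≈ 6·N·D FLOPs total, inference ≈ 2·N FLOPs per generated token); the model and dataset sizes are purely illustrative, not numbers from this thread.

```python
# Back-of-envelope FLOP comparison. Rule of thumb from the
# scaling-law literature: training ~ 6*N*D FLOPs total,
# inference ~ 2*N FLOPs per generated token.

N = 70e9    # model parameters (illustrative, e.g. a 70B model)
D = 1.4e12  # training tokens (illustrative)

training_flops = 6 * N * D            # one full training run
inference_flops_per_token = 2 * N     # one forward pass per token

# Tokens you could generate for the compute cost of one training run:
tokens_equivalent = training_flops / inference_flops_per_token

print(f"Training run: {training_flops:.2e} FLOPs")
print(f"Per token:    {inference_flops_per_token:.2e} FLOPs")
print(f"Break-even:   {tokens_equivalent:.2e} generated tokens")
```

      By this estimate a single training run costs as much compute as generating trillions of tokens, which is why any one query looks cheap by comparison (though inference still adds up at search-engine scale).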

      • towerful@programming.dev · 8 upvotes · 5 months ago

        Training will never stop, though.
        New models will keep coming out, and datasets and parameters are going to change.

        • pup_atlas@pawb.social · 2 upvotes · 5 months ago

          I firmly believe it will slow down significantly. My prediction is that there will be a much bigger focus on a few “base” models that are tweaked slightly for different roles, rather than the “from the ground up” retraining we see now. The industry is already starting to move in that direction.
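
          Parameter-efficient fine-tuning (e.g. LoRA-style low-rank adapters) is one concrete form this “tweak a base model” approach already takes. A hedged illustration of why it is so much cheaper than full retraining, with all sizes invented for the example:

```python
# Illustrative trainable-parameter counts: full fine-tuning vs.
# LoRA-style low-rank adapters. All sizes are made up.

d_model, n_layers = 4096, 32
rank = 8  # low-rank adapter dimension

# Rough count of weights in the attention projections alone
# (4 projection matrices of size d_model x d_model per layer).
full_params = n_layers * 4 * d_model * d_model

# A LoRA-style adapter replaces each d x d weight *update* with
# two low-rank factors: (d_model x rank) and (rank x d_model).
adapter_params = n_layers * 4 * (d_model * rank + rank * d_model)

print(f"Full fine-tune:     {full_params:,} trainable weights")
print(f"Adapter (rank={rank}): {adapter_params:,} trainable weights")
print(f"Ratio: {full_params // adapter_params}x fewer")
```

          With these invented sizes, the adapter touches a few hundred times fewer weights than full retraining, which is the economic pressure pushing the industry toward shared base models.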