• abcdqfr@lemmy.world · 4 months ago

    Wake me up when it works offline. “The Llama 3.1 models are available for download through Meta’s own website and on Hugging Face. They both require providing contact information and agreeing to a license and an acceptable use policy, which means that Meta can technically legally pull the rug out from under your use of Llama 3.1 or its outputs at any time.”

    • just another dev@lemmy.my-box.dev · edited · 4 months ago

      WAKE UP!

      It works offline. When you use it with Ollama, you don’t have to register or agree to anything.

      Once you have downloaded it, it will keep working; Meta can’t shut it down.

        • just another dev@lemmy.my-box.dev · 4 months ago

          Oh, sure. For the 405B model it’s absolutely infeasible to host it yourself. But for the smaller models (70B and 8B), it can work.

          I was mostly replying to the part where they claimed Meta can take it away from you at any point, which is simply not true.

    • RandomLegend [He/Him]@lemmy.dbzer0.com · edited · 4 months ago

      It’s available through Ollama already. I’m running the 8B model on my little server with its 3070 right now.

      It’s really impressive for an 8B model.
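      The offline workflow described in this thread can be sketched with the standard Ollama CLI; a minimal example, assuming Ollama is installed and its daemon is running (the `llama3.1:8b` tag is Ollama’s name for the 8B model):

      ```shell
      # One-time download: no account, no contact form, no click-through license
      ollama pull llama3.1:8b

      # Runs entirely on local hardware; works with the network disconnected
      ollama run llama3.1:8b "Explain what a context window is."
      ```

      Once pulled, the weights live on local disk (by default under `~/.ollama/models`), which is why a later license change can’t retroactively disable an already-downloaded copy.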

    • sunzu@kbin.run · 4 months ago

      I was able to set up a small one via Open WebUI.

      It did ask me to make an account, but I didn’t see any pinging home when I did it.

      What am I missing here?