Just a guy doing stuff.

  • 0 Posts
  • 249 Comments
Joined 1 year ago
Cake day: June 14th, 2023

  • You can dislike the statement all you want, but they literally do not have a way to know things. They provide a convincing illusion of knowledge through statistical likelihood of the next token occurring, but they have no internal mechanism for looking up information.

    They have no fact repositories to rely on.

    They do not possess the ability to know what is and is not correct.

    They cannot check documentation or verify that a function, library, or API endpoint exists, even though they will confidently generate calls to it.

    They are statistical models: they calculate how likely each possible next token is by applying transformations in a high-dimensional space where the relationships between the existing tokens are represented as vectors, then pick from that distribution (the toy sketch at the end of this comment illustrates the selection step).

    They have their uses, but relying on them for factual information (which includes knowledge of APIs and libraries) is a bad idea. They will present a fabricated answer just as confidently as a real one.

    They are good for inspiration or a jumping off point, but should always be fact checked and validated.

    They’re fantastic at transforming data from one format to another, or at extracting structured data from natural-language text. I’m even using one in a project to take a guess at filling in a form based on an incoming customer email (sketched further below).
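
    To make the “statistical model” point concrete, here’s a toy sketch of the selection step: scores over a tiny made-up vocabulary get turned into probabilities and sampled. The vocabulary and numbers are invented for illustration; a real LLM does this over tens of thousands of tokens with billions of learned parameters, but nothing in this step consults a store of facts.

    ```python
    # Toy illustration of next-token sampling. The logits are made up;
    # the point is that the "answer" is drawn from a probability
    # distribution, not looked up anywhere.
    import math
    import random

    vocab = ["Paris", "London", "Berlin", "purple"]
    logits = [4.1, 2.3, 2.0, -1.5]   # hypothetical model scores per token

    # Softmax: turn raw scores into a probability distribution
    exps = [math.exp(x) for x in logits]
    total = sum(exps)
    probs = [e / total for e in exps]

    # The model "answers" by sampling from this distribution; a wrong token
    # can win whenever its score happens to be high enough.
    next_token = random.choices(vocab, weights=probs, k=1)[0]
    print(dict(zip(vocab, [round(p, 3) for p in probs])))
    print("next token:", next_token)
    ```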
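
    And here’s roughly the shape of the form-filling idea, as a sketch rather than the actual project code: ask the model to return JSON for a fixed set of fields, then parse and validate it before trusting anything. `call_llm` is a hypothetical stand-in for whatever completion API is used, and the field names are made up.

    ```python
    # Sketch: extract structured form fields from a free-text customer email.
    import json

    FIELDS = ["customer_name", "order_number", "issue_summary"]

    PROMPT_TEMPLATE = (
        "Extract the following fields from the email below and reply with JSON "
        "only, using null for anything not present: {fields}\n\nEmail:\n{email}"
    )

    def call_llm(prompt: str) -> str:
        # Stand-in for a real completion call (OpenAI, llama.cpp, whatever).
        # Returns a canned reply here so the sketch runs end to end.
        return ('{"customer_name": "Jane Doe", "order_number": null, '
                '"issue_summary": "Package arrived damaged"}')

    def extract_form_fields(email_text: str) -> dict:
        prompt = PROMPT_TEMPLATE.format(fields=", ".join(FIELDS), email=email_text)
        raw = call_llm(prompt)
        data = json.loads(raw)  # fails loudly if the model returns junk instead of JSON
        # Keep only the expected keys; every value is a guess for a human to review
        return {key: data.get(key) for key in FIELDS}

    print(extract_form_fields("Hi, my package arrived damaged. - Jane Doe"))
    ```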



  • Not the person you’re replying to, but my main hangup is that LLMs are just statistical models; they don’t know anything. As such, they very often hallucinate language features and libraries that don’t exist. They suggest functions that aren’t real, and they are effectively always going to produce average code - and average code is horrible code.

    They can be useful for exploration and learning, sure. But lots of people are literally just copy-pasting code from LLMs - they just do it via an “accept Copilot suggestion” button instead of an actual copy-paste.

    I used Copilot for months and eventually stopped, because the vast majority of its suggestions were garbage, and I was constantly pausing while I typed to wait for them, which broke my flow state and tired me out more than it ever helped.

    Months later, I’m still finding bugs it introduced. It’s great for unit tests, but that’s basically it in my case. I don’t let the AI write production code anymore.





  • Hexarei@programming.dev to Memes@lemmy.ml · please · 5 months ago

    The main thing people are upset about isn’t that OneDrive exists or that Microsoft is pushing it. It’s that updates have enabled OneDrive folder backup automatically, without user permission - files get uploaded to OneDrive without anyone ever being asked. That is a privacy nightmare.

    I personally host my own copy of Nextcloud and use that for anything I need to sync or back up. I have a regular backup job that snapshots the Ceph cluster it uses for storage and copies the snapshot to my own NAS box here in the house, and that NAS is automatically replicated over a Nebula network (like Tailscale or ZeroTier, but fully self-managed) to an identical NAS at my parents’ house across town.
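
    A minimal sketch of what a job like that could look like, assuming the Nextcloud data sits on a Ceph RBD image (not necessarily how mine is actually laid out); the pool, image, paths, and host names are hypothetical placeholders, and the NAS-to-NAS sync over Nebula is assumed to run as a separate job.

    ```python
    #!/usr/bin/env python3
    # Sketch of a nightly backup: snapshot an RBD image, export the snapshot,
    # and rsync the export to the in-house NAS.
    import subprocess
    from datetime import datetime

    POOL_IMAGE = "nextcloud-pool/nextcloud-data"         # hypothetical RBD pool/image
    EXPORT_DIR = "/tank/backups"                         # hypothetical local staging dir
    NAS_TARGET = "backup@nas.local:/volume1/nextcloud"   # hypothetical NAS path

    def run(cmd):
        print("+", " ".join(cmd))
        subprocess.run(cmd, check=True)

    def main():
        stamp = datetime.now().strftime("%Y%m%d-%H%M%S")
        snap = f"{POOL_IMAGE}@backup-{stamp}"

        # Take a point-in-time snapshot of the image backing Nextcloud storage
        run(["rbd", "snap", "create", snap])

        # Export the snapshot to a local file so it can be shipped as plain data
        export_path = f"{EXPORT_DIR}/nextcloud-{stamp}.img"
        run(["rbd", "export", snap, export_path])

        # Copy the export to the in-house NAS; offsite replication happens elsewhere
        run(["rsync", "-a", "--partial", export_path, NAS_TARGET])

    if __name__ == "__main__":
        main()
    ```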