Actually, I really liked the Apple Intelligence announcement. It must be a very exciting time at Apple as they layer AI on top of the entire OS. A few of the major themes:

Step 1 Multimodal I/O. Enable text/audio/image/video capability, both read and write. These are the native human APIs, so to speak.

Step 2 Agentic. Allow all parts of the OS and apps to interoperate via “function calling”; a kernel-process LLM can schedule and coordinate work across them given user queries.
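A rough sketch of what that could look like, with entirely hypothetical names (this is not Apple's actual API): apps register callable capabilities, and a system-level coordinator dispatches whatever calls the LLM decides to make.

```swift
// Hypothetical sketch, not Apple's actual API: apps expose capabilities as
// "tools", and a system coordinator runs the tool calls an LLM emits.
import Foundation

// A capability an app registers with the OS (name + string arguments).
protocol Tool {
    var name: String { get }
    func invoke(arguments: [String: String]) -> String
}

struct CreateReminder: Tool {
    let name = "create_reminder"
    func invoke(arguments: [String: String]) -> String {
        let title = arguments["title"] ?? "Untitled"
        return "Created reminder: \(title)"
    }
}

// The "kernel process" side: given the tool call an LLM produced for a
// user query, look up the matching tool and run it.
struct Coordinator {
    private var tools: [String: any Tool] = [:]

    mutating func register(_ tool: any Tool) { tools[tool.name] = tool }

    func dispatch(toolName: String, arguments: [String: String]) -> String {
        guard let tool = tools[toolName] else { return "Unknown tool: \(toolName)" }
        return tool.invoke(arguments: arguments)
    }
}

var coordinator = Coordinator()
coordinator.register(CreateReminder())
// Pretend the LLM parsed "remind me to call Mom" into this call:
print(coordinator.dispatch(toolName: "create_reminder",
                           arguments: ["title": "Call Mom"]))
```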

Step 3 Frictionless. Fully integrate these features in a highly frictionless, fast, “always on”, and contextual way. No going around copy-pasting information, prompt engineering, etc. Adapt the UI accordingly.

Step 4 Initiative. Don’t just perform a task when given a prompt; anticipate the prompt, suggest, initiate.

Step 5 Delegation hierarchy. Move as much intelligence as you can on-device (Apple Silicon is very helpful and well-suited here), but allow optional dispatch of work to the cloud.
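If you squint, the routing policy behind that could be as simple as this toy sketch (a made-up heuristic, not Apple's actual policy): handle the request on-device when it fits the small model, otherwise ship it to the cloud.

```swift
// Toy sketch of on-device-first routing; the threshold and policy are
// made up, not Apple's.
enum ModelTier {
    case onDevice   // small model running locally on Apple Silicon
    case cloud      // larger model reached over the network
}

struct Router {
    // Stand-in for whatever real policy applies (context size, task type,
    // user settings, network availability, ...).
    let onDeviceTokenLimit = 512

    func route(promptTokens: Int) -> ModelTier {
        promptTokens <= onDeviceTokenLimit ? .onDevice : .cloud
    }
}

let router = Router()
print(router.route(promptTokens: 100))    // onDevice
print(router.route(promptTokens: 4_000))  // cloud
```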

Step 6 Modularity. Allow the OS to access and support an entire and growing ecosystem of LLMs (e.g. the ChatGPT announcement).
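One way to read that: the OS talks to a single chat interface, and the built-in model or a third-party model like ChatGPT are interchangeable backends. A hypothetical sketch (none of these names are real APIs):

```swift
// Hypothetical pluggable-backend sketch; the names are made up.
protocol ChatBackend {
    var id: String { get }
    func complete(prompt: String) -> String
}

struct BuiltInModel: ChatBackend {
    let id = "apple.builtin"
    func complete(prompt: String) -> String { "local answer to: \(prompt)" }
}

struct ChatGPTBackend: ChatBackend {
    let id = "openai.chatgpt"
    func complete(prompt: String) -> String { "cloud answer to: \(prompt)" }
}

// The OS picks a backend per request (user opt-in, task type, etc.).
func answer(_ prompt: String, using backend: any ChatBackend) -> String {
    backend.complete(prompt: prompt)
}

print(answer("Summarize my unread emails", using: BuiltInModel()))
print(answer("Write a sonnet about WWDC", using: ChatGPTBackend()))
```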

Step 7 Privacy. <3

We’re quickly heading into a world where you can open up your phone and just say stuff. It talks back and it knows you. And it just works. Super exciting, and as a user I’m quite looking forward to it.

https://x.com/karpathy/status/1800242310116262150?s=46

  • helenslunch@feddit.nl · 6 months ago

    I watched an abbreviated video. Pretty much everything they announced was available on other platforms 5+ years ago.

      • helenslunch@feddit.nl · 6 months ago

        I mean, don’t get me wrong, these things are a huge QoL improvement. It’s just that they should be embarrassed to be lacking such basic functionality on such insanely expensive devices. And people should be embarrassed to carry them around. There are still tons of basic features missing.