I posted about it in this community; it’s got a link to the GitHub comment regarding the matter.
They rolled it back, and delayed the hard requirement to next year.
After a quick skim, my favs:
Inline completions for multiple cursors
and command duration tracking.
I imagine that for people that don’t have Copilot, these updates must feel pretty barren.
Here’s an article regarding the matter:
https://www.omgubuntu.co.uk/2024/02/vscode-drops-ubuntu-18-04-support-leaves-devs-screwed
Great job on the contributions, very grateful for the hard work that we can all enjoy.
you’re in a room with ten people, and you shout “I like turtles”
vs
you’re in a room with ten people and a thousand bots, and you shout “I like turtles”
what’s your problem again?
so happy they are expanding the Profiles feature.
see, it’s not always a Jupyter update \s
probably easier to look for something OS-wide.
A relationship is not a court case, and bringing up the exact words used means very little. Sometimes even tone supersedes the meaning of the words. Empathy rules, and if the other party has no empathy for your understanding of what was actually communicated, it can all be very hard. In this situation I would bet that not only Rachel but most of us viewers have little empathy for Ross’s understanding of the current status quo of their relationship.
TLDR: Ross acting dumb to not be in the wrong. 😜
maybe put “This is what it took to defederate from exploding-heads after being federated with them since the start of lemmy.world” before the screenshot of the post, as right now it seems confusing.
Are we assuming AI won’t be able to create a good prompt? 😂
People really don’t understand the current state of LLMs. It’s like the pictures they generate: “It’s a really good picture of what a dog would look like, but it’s not actually a dog.” Like a police sketch, with a touch of “randomness” so you don’t always get the same picture.
I’m guessing they will try to solve this issue with some cheap human labour to review what is being generated. These verifiers will probably not be experts on all the subjects the LLM will be spitting out; more of a “That does kind of look like a dog, APPROVED”.
Let’s say I’m wrong, and LLMs can write an article as good as any human’s. The space would be so saturated with content (even a Tumblr user could now make as much content, and as good, as one of these companies) that I would expect the companies to be joining in on all the strikes 😆.
Funny world we are all going into.
Happy New Year
🧟John is a 🍎lemm.app user; he subscribes to the 🐢turtle community on 🍌lemm.ban.
He is the first user on 🍎lemm.app ever to do this,
so 🍎lemm.app creates a copy with the last 20 posts and will keep it in sync with all future posts.
👩🚀Jill is also a user on 🍎lemm.app; she subscribes to the same 🐢turtle community, but a year later.
She will be able to see all the posts from that year, all the way back to those original 20 posts.
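For the curious, here is roughly the caching behaviour I mean, as a minimal sketch. All the names and numbers (`remote_posts`, `subscribe`, the 20-post copy) are made up for illustration; the real Lemmy/ActivityPub code is obviously more involved.

```python
# Hypothetical sketch of an instance caching a remote community.
# None of these names are the actual Lemmy/ActivityPub API.

remote_posts = {"turtle@lemm.ban": [f"post {i}" for i in range(1, 101)]}  # lives on lemm.ban
local_cache = {}  # what lemm.app keeps locally, per community

def subscribe(community: str) -> list[str]:
    if community not in local_cache:
        # First subscriber on this instance: copy only the last 20 posts.
        # From now on, new posts get pushed to us as they are federated.
        local_cache[community] = remote_posts[community][-20:]
    return local_cache[community]

def federate_new_post(community: str, post: str) -> None:
    remote_posts[community].append(post)
    if community in local_cache:  # only synced if someone here follows it
        local_cache[community].append(post)

# John subscribes first: he sees only the last 20 posts.
print(len(subscribe("turtle@lemm.ban")))  # 20

# A year of new posts gets federated over.
for i in range(101, 151):
    federate_new_post("turtle@lemm.ban", f"post {i}")

# Jill subscribes a year later: she sees that year plus the original 20.
print(len(subscribe("turtle@lemm.ban")))  # 70
```

The point is that the local copy only starts when the first local subscriber shows up, which is why Jill never sees anything older than John’s initial 20 posts.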
a few community blocks here and there should help
Local workspace extensions
This one will be useful for when I’m coding inside a Docker container that only has a persistent workspace.
nice update, reminded me that I have no action bound to middle-button clicking 😱