C is so old, it has a way to work around that! In case your 198x keyboard was not set to ASCII you know. Not sure if Morse covers all the characters needed for the replacement trigraphs though.
The YouTube channel Looking Glass Universe (highly recommended!) also has a video on how AlphaFold works.
You should report it to the Bundesnetzagentur, spam calls are illegal.
https://www.bundesnetzagentur.de/DE/Vportal/AnfragenBeschwerden/Beschwerde_Aerger/start.html
Server’s down :(
Yes, but at some point it doesn’t matter. The AI is trained to replicate human writing. There will be a point where it becomes so good that the result is a perfect replica, indistinguishable from human text. At that point even a perfect detector will not be able to confidently flag it as AI written, ever, because there is no difference.
I bet AI detection is going to get a lot better over time.
I doubt it. ChatGPT 3.5 is good enough to rewrite small snippets of text with better phrasing, and ChatGPT 4.0 can write a paragraph if given enough support. Good enough as in “the output is indistinguishable from what a human would have written.”
Of course you can do even more with the currently available tools - and get found out.
There is a way to make AI generated text detectable: by slightly pushing the output towards a consistent pattern a detector can reliably judge long pieces of text as AI generated.
Imagine the AI is biased towards consecutive words starting with consecutive letters of the alphabet (e.g. “a blue car”, whose initials run a, b, c, instead of “a navy vehicle”). Not strongly biased, but enough that across 1000 words you can look at the frequency of consecutive words starting with consecutive letters and get a clear result.
There are two problems though: this only works with proprietary systems and only with long texts.
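A minimal sketch of the detection side of such a scheme (hypothetical, as in the comment above): count how often adjacent words start with consecutive letters of the alphabet, and compare that rate against the small baseline you would expect from unbiased text.

```python
def consecutive_initial_rate(text):
    """Fraction of adjacent word pairs whose initials are consecutive
    letters of the alphabet (e.g. 'blue car' -> b, c counts as a hit)."""
    words = [w.lower() for w in text.split() if w and w[0].isalpha()]
    pairs = list(zip(words, words[1:]))
    if not pairs:
        return 0.0
    hits = sum(1 for a, b in pairs if ord(b[0]) - ord(a[0]) == 1)
    return hits / len(pairs)

# "a blue car" scores 1.0 (a->b and b->c are both hits),
# an unbiased text would hover near chance level.
print(consecutive_initial_rate("a blue car"))
```

A real watermark detector would use a statistical test over thousands of words, but the idea is the same: a slight, deliberate bias becomes measurable once the sample is long enough.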
Make both hands into fists and hold them out in front of you so that the knuckles are visible. Now start on a pinky and count the knuckles and the valleys between them. Knuckles are 31 days, valleys are 30 (and February). The switch between hands doesn’t count as a valley.
Left pinky knuckle: January, 31 days
Left pinky/ring valley: February, 28 (29 in leap years)
Left ring knuckle: March, 31
Left ring/middle valley: April, 30
Left middle knuckle: May, 31
Left middle/index valley: June, 30
Left index knuckle: July, 31
Right index knuckle: August, 31
Right index/middle valley: September, 30
Right middle knuckle: October, 31
Right middle/ring valley: November, 30
Right ring knuckle: December, 31
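The knuckle rule above translates into a few lines of code; this is just the mnemonic written out, with February handled separately:

```python
MONTHS = ["January", "February", "March", "April", "May", "June",
          "July", "August", "September", "October", "November", "December"]

def days_in_month(name, leap=False):
    """Knuckle rule: months alternate knuckle (31) / valley (30) across
    both fists; the hand switch July -> August puts two knuckles in a row."""
    i = MONTHS.index(name)  # 0-based month position
    if name == "February":
        return 29 if leap else 28
    if i <= 6:  # left hand: Jan..Jul, knuckles at even positions
        return 31 if i % 2 == 0 else 30
    return 31 if i % 2 == 1 else 30  # right hand: Aug..Dec, parity flips

print(days_in_month("July"), days_in_month("August"))  # the double knuckle
```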
I think it depends a lot on where and when you grew up. Afaik in China it’s very much uncommon to be able to swim.
I’m not sure what to tell you, other than that yes, you can simply take your hands off the handlebars on most bikes if you’re going fast enough.
That is true for most current “self driving” systems, because they are all just glorified assist features. Tesla is misleading its customers massively with their advertisement, but on paper it’s very clear that the car will only assist in safe conditions, the driver needs to be able to react immediately at all times and therefore is also liable.
However, Mercedes (I think it was them) have started to roll out a feature where they will actually take responsibility for any accidents that happen due to this system. For now it’s restricted to nice weather and a few select roads, but the progress is there!
Definitely JS if you want to also have a website. Use Electron to turn your website into an executable for the desktop. Python+Qt is fine for desktop apps, but does not work for a website.
Languages that compile to WASM would also be an option (e.g. https://egui.rs with Rust), but as far as I am aware none of the languages you’ve listed are in that set. (Of Python, Ruby, JS and Go, only Go would even be a contender.)
Eh it’s not that great.
One million Blackwell GPUs would suck down an astonishing 1.875 gigawatts of power. For context, a typical nuclear power plant only produces 1 gigawatt of power.
Fossil fuel-burning plants, whether that’s natural gas, coal, or oil, produce even less. There’s no way to ramp up nuclear capacity in the time it will take to supply these millions of chips, so much, if not all, of that extra power demand is going to come from carbon-emitting sources.
If you ignore the two fastest growing methods of power generation, which coincidentally are also carbon free, cheap and scalable, the future does indeed look bleak. But solar and wind do exist…
The rest is purely a policy rant. Yes, if productivity increases we need some way of distributing the gains from said productivity increase fairly across the population. But jumping to the conclusion that, since this is a challenge to be solved, the increase in productivity is bad, is just stupid.
So it stops once someone doesn’t finish?
Alternatively the y axis could be “blog posts not about …”
You can literally run large language models with a single exe download: https://github.com/Mozilla-Ocho/llamafile
It doesn’t get much simpler than that.
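Roughly, running a llamafile looks like this on Linux/macOS (the file name is a placeholder; pick whichever model file the project’s README links to):

```shell
# MODEL.llamafile stands in for the model file you downloaded
chmod +x MODEL.llamafile   # mark the download as executable
./MODEL.llamafile          # launches a local chat UI in your browser
```

On Windows you rename the file to end in .exe instead of using chmod.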
Addendum:
The docs say: “For reproducible outputs, set temperature to 0 and seed to a number.”
But what they should say is: “For reproducible outputs, set temperature to 0 or seed to a number.”
Easy mistake to make
I appreciate the constructive comment.
Unfortunately the API docs are incomplete (insert obi wan meme here). The seed value is both optional and irrelevant when setting the temperature to 0. I just tested it.
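A toy sampler (not the OpenAI implementation, just a sketch of the principle) shows why the seed is irrelevant at temperature 0: that setting collapses sampling into a plain argmax, so the random number generator is never consulted.

```python
import math
import random

def sample_token(logits, temperature, seed=None):
    """Toy next-token sampler. Temperature 0 means greedy argmax,
    so the seed never enters the picture."""
    if temperature == 0:
        return max(range(len(logits)), key=lambda i: logits[i])
    rng = random.Random(seed)
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract max for numerical stability
    weights = [math.exp(s - m) for s in scaled]
    return rng.choices(range(len(logits)), weights=weights)[0]

logits = [2.0, 1.0, 0.5]
# At temperature 0 every seed yields the same token:
assert all(sample_token(logits, 0, seed=s) == 0 for s in range(100))
```

With a nonzero temperature the seed does matter: the same seed reproduces the same draw, different seeds can diverge.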
Yeah no, that’s not how this works.
Where in the process does that seed play a role, and what do you even mean by numerical noise?
Edit: I feel like I should add that I am very interested in learning more. If you can provide me with any sources to show that GPTs are inherently random I am happy to eat my own hat.
Ah, gotcha.
Is there some list where you can enter your server so that other people use it as an NTP server? Or how did you advertise it to get 2800 requests flooding in?
No, because you can’t mathematically guarantee that pi contains long strings of predetermined patterns.
The 1.101001000100001… example by the other user was exactly that: an example. That number has infinitely many digits, but never contains a 2. Pi also has infinitely many digits, but does it contain e to 100 digits of precision? Maybe, maybe not. The point is, we don’t know, and we can’t prove it either way (other than by actually stumbling upon it in the digits).