• 0 Posts
  • 487 Comments
Joined 1 year ago
Cake day: June 30th, 2023



  • Yeah, but how was that food?

    I just tried a fine dining restaurant for the first time this past weekend.

    After watching a bunch of cooking competitions on Netflix, I was curious how good that kind of food could actually be, so I decided to find a Michelin star restaurant and give it a try.

    While the portions were small, the food was on another level. Even the “worst” of it was only the worst relative to the rest; it was still really good, just not amazing.

    The food was so good that when I got home and snacked that night, it was hard to enjoy any of my usual favorite snacks because it all felt so basic after that.

    It was fancy in other regards, too. Like when my buddy went to the bathroom, someone came over and folded his cloth napkin rather than leave it bunched up on the table.

    Plus, even though the portions were tiny and we joked about whether we’d need to stop for fast-food afterwards, by the end of the 9 or so courses, I felt completely satisfied. Even the snacking I mentioned was more due to the munchies than actual hunger.

    It was expensive though. Two tasting menus plus two drinks each came to about 500 CAD plus tip. And it was one of the cheaper options. There was a two Michelin star sushi place that advertised seats starting at 800, and I’m not even sure that includes any food, though I think it gets you the “chef cooks what he wants” menu, which tbf would probably be way better than what I’d order anyway.

    This place only needed to be booked like a month in advance, so the place you’re talking about sounds like it’s on another level itself. Though I’m curious how much that other level translates to better food.








  • Or sometimes they think it’s moving 3D shapes on a screen until they fit together, and, to show how difficult it is, the entire thing will fall apart during the hacking/programming montage.

    Though to be fair, I don’t think the producers of that media think it’s like that. Trying to put actual programming on screen would probably be boring unless it was just a montage of reactions: starting with an overwhelmed look, followed by confidence or pride, then a completely baffled look and wtf expressions, then a “fuck, I was stupid when I wrote this yesterday” look, then maybe a bigger wtf and some physically acted-out frustration, then a eureka look, all capped off with a satisfied smile and nodding as the montage music ends and another character says, “I can’t believe it’s finally done and hasn’t crashed in 30 minutes!” Though I bet that would be more entertaining for programmers who can relate to the stages of development and debugging than for non-programmers.











  • There can be a lot of subtle differences going from one uarch to another.

    Eg, C/C++ on x64 and ARM both use a special register to store the pointer to thread-local storage. On x64, you can read from memory at an offset from that register in a single instruction, so the access is atomic. On ARM, you first need to load the TLS pointer into a core register, and only then can you read from memory at an offset from it. This makes accessing thread-local memory on ARM more complicated to do in a thread-safe manner than on x64, because you need to be sure you don’t get pre-empted between those two instructions, or one thread can end up with another thread’s thread-local memory pointer. Some details might be off, it’s been a while since I dealt with this issue. I think there was another thing that had to line up perfectly for the bug to happen (like it happening during a user-mode context switch).
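    A minimal sketch of the thread-local storage being described (plain C `__thread`; names here are made up, and this is not the actual code from that bug). On x86-64, compilers typically read a variable like `tls_val` in one instruction relative to the FS segment base, while on AArch64 they first fetch the TLS base with `mrs x0, tpidr_el0` and then do a separate load at an offset from it, which is the two-instruction window the comment above talks about:

    ```c
    /* Each thread gets its own copy of tls_val; writes in one thread
     * never show up in another. */
    #include <pthread.h>
    #include <stdio.h>

    static __thread int tls_val = 0;  /* one copy per thread */

    static void *worker(void *arg) {
        tls_val = *(int *)arg;        /* writes this thread's copy only */
        printf("thread sees tls_val = %d\n", tls_val);
        return NULL;
    }

    int main(void) {
        pthread_t t1, t2;
        int a = 1, b = 2;
        pthread_create(&t1, NULL, worker, &a);
        pthread_create(&t2, NULL, worker, &b);
        pthread_join(t1, NULL);
        pthread_join(t2, NULL);
        /* main's copy was never written, so it's still 0 */
        printf("main sees tls_val = %d\n", tls_val);
        return 0;
    }
    ```

    The language-level guarantee is the same on both uarchs; the difference the comment describes is purely in the instruction sequences the compiler and runtime have to emit to make it hold.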

    And that’s an example for two relatively similar uarchs. I’m not familiar with Cell, but I understand it to be a lot more different from x64 than ARM is. Sure, they’ve got all the documentation and probably still even have the collective expertise such that everything is known by at least someone without needing to look it up, but those individuals might not have the same understanding on the x64 side of things to see the pitfalls before running into them.

    And even once they experience various bugs, they still need to be debugged to figure out what’s going on, and there’s potential that the solution isn’t even possible in the paradigm used to design whatever go-between system they were currently working on.

    They are both Turing complete, so they are computationally equivalent (ie, anything one can compute, the other can). But that doesn’t mean both will do it as fast as each other. An obvious example: desktops with 2024 hardware and desktops with 1990 hardware have that same equivalence, but the recent machines run circles around the older ones.