Godspeed with your effort.
I’m really enjoying the fedi, but a return to the text, the ANSI graphics, and the community of a ’93 BBS keeps calling me.
The author’s primary gripe, IMHO, has legs: the question of the oven’s relationship to baking is buried inside bake(), and that is weird. But the solution is not the left-hand code; it is to port a good, old-fashioned OOP pattern: dependency injection. Pass a reference to an Oven into createPizza() and the problem is solved.
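Something like this minimal Go sketch is what I have in mind; Pizza, Order, and the Oven interface here are stand-ins of my own, not the blog post’s actual types:

    // Pizza and Order are placeholder types; the real post's fields don't matter here.
    type Pizza struct{ Baked bool }
    type Order struct{ Toppings []string }

    // Oven is an interface, so createPizza never cares which oven it was handed.
    type Oven interface {
        Bake(p *Pizza) error
    }

    // createPizza receives its oven instead of conjuring one inside bake().
    func createPizza(oven Oven, order Order) (*Pizza, error) {
        pizza := &Pizza{} // dough, sauce, toppings elided
        if err := oven.Bake(pizza); err != nil {
            return nil, err
        }
        return pizza, nil
    }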
Doing so also addresses the other concern about whether an Oven should be a singleton (yes, that’s good for a reality-oriented contrived code sample) or manufactured anew for each pizza (yes, that’s good for a cloud runtime where each shard/node/core will need to factory its own Oven). The logic for cloud-oven (maybe like ghost kitchens?) versus singleton-oven is settled outside the narrative frame of createPizza(). Again, the joy of dependency injection.
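A rough wiring sketch, building on the stand-in types above (again, the names are mine): the caller picks singleton or per-worker, and createPizza() never knows which.

    // HomeOven is one concrete Oven; a CloudOven could sit right beside it.
    type HomeOven struct{ tempC int }

    func (o *HomeOven) Bake(p *Pizza) error {
        p.Baked = true
        return nil
    }

    // Option 1: one shared oven, wired up at the edge of the program.
    var sharedOven Oven = &HomeOven{}

    // Option 2: each shard/worker factories its own oven.
    func worker(orders <-chan Order) {
        oven := &HomeOven{}
        for order := range orders {
            _, _ = createPizza(oven, order) // error handling elided
        }
    }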
To their other point, shouldn’t the internals of preheating be enclosed in the oven’s own logic? Why yes, that’s probably the case as well. And so, for a second time, this code seems to recommend OOP. In Sandi Metz-style OOP in Ruby (or pretty much any other OOP language) this would be beautiful and rational. Heck, if the question of whether to preheat is sufficiently complex, that logic can itself be made a class.
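A rough sketch of that last idea, still using the stand-in types above; the PreheatPolicy interface is my own embellishment of the “make that logic a class” point:

    // PreheatPolicy captures the "should we preheat, and to what?" decision.
    type PreheatPolicy interface {
        TargetTempC(p *Pizza) int
    }

    // SmartOven owns its preheating; createPizza never sees any of this.
    type SmartOven struct {
        policy PreheatPolicy
        tempC  int
    }

    func (o *SmartOven) Bake(p *Pizza) error {
        if want := o.policy.TargetTempC(p); o.tempC < want {
            o.tempC = want // preheating is the oven's business
        }
        p.Baked = true
        return nil
    }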
As I wrote this, I thought: “How is golang so bad at abstraction?” I’m not sure that’s actually the case, but as a writer of engineering education, I think the examples the Google Testing Blog chose don’t serve the point well. Real-world-domain examples work really well in OOP languages; fast-execution, “systems thinking” examples work great in golang or C. Perhaps the real problem is that the contrived example caters to showing off the strengths of OOP but uses a language that loves procedural/imperative style. Perhaps the Testing Blog folks assumed everyone was already on board with “small, factored methods are best” as an article of faith and could accept the modeled domain as a hand-wave toward whatever idea they were presenting.
Here’s the biggest reason: we evolved from savannah primates for whom the ability to make eye contact and hold it was a signal of “you can trust me, I’m not about to bite you.” Paper and pen don’t signal “I have decided to break this evolutionary/social contract” the way a phone or an open laptop does.
I help mentor a lot of young people early in their careers, and for their generation a phone out in a meeting is an excuse waiting to happen for an X-er/boomer interviewer to punt them. It’s career- and comp-limiting, right or not.
Also, if you find a note you took is missing something, contact the original party. A conversation that begins with “you got me thinking about this more deeply, and I think I may have missed something…” is the key to mentorship, advocacy, and growth.
In short, from a pure transcoding-of-bits perspective, other media may be better. But for those that acknowledge human constraint and opportunity, a nice notebook and (a cheap shill from me) a Lamy Safari medium-nib fountain pen will do you quite well.
The A-series paper standard and the metric system. A Pythagorean can dream.
The best way to get Linux in that era was to get a box of floppies from a guy at the 2600 meeting. Got Slackware in ’93, along with a goofy little video game called Wolfenstein, made by some guys up I-45 in Mesquite. Wonder what happened to them.
Thanks; the details of that early decade of the Year of Desktop Linux are growing murky.
Just to confirm, SuSE has no ties to SCO and that weird crusade Darl McBride was on, right?
During the pandemic nadir I kept thinking about “viewing” and “seeing” and Zoom and Gladia.