I write about technology at theluddite.org

  • 6 Posts
  • 93 Comments
Joined 1 year ago
Cake day: June 7th, 2023


  • The difference is that, unlike Craigslist, OnlyFans takes a massive 20% cut of all revenue. For comparison, Patreon takes a little more than 5%. Purely from a labor perspective, that’s outrageous, so I do think it’s fair to demand that they at least do more to justify it, which ought to include protecting the people who actually do the work.

    There’s also what’s to me the bigger problem: OnlyFans obviously didn’t invent online sex work, but it did radically reshape it. They are responsible for mainstreaming the Patreon-style, girl-next-door porn actress that people expect to interact with on a parasocial level. Those are features that OnlyFans purposefully put in to maximize their own profit, but they seem particularly ripe for the kind of nauseating small-scale abuse that the article discusses in depth. Suddenly, if an abusive partner wants to trap and control someone, there’s a mainstream, streamlined path to making that profitable. Again, OnlyFans didn’t create that, in the same way that Uber didn’t create paying some random person with a car for a ride to the airport, but they did reshape it, systematize it, mainstream it, and profit handsomely off it. Craigslist was just a place to put classifieds, but OnlyFans is a platform that governs every detail of these relationships between creators and fans, down to the font of their DMs. If the way they’ve built the platform makes this kind of abuse easier, that’s a huge problem.

    I agree with you that this article doesn’t do a good job articulating any of this, though.




  • Investment giant Goldman Sachs published a research paper

    Goldman Sachs researchers also say that

    It’s not a research paper; it’s a report. They’re not researchers; they’re analysts at a bank. This may seem like a nit-pick, but journalists need to (re-)learn to carefully distinguish between the thing that scientists do and corporate R&D, even though we sometimes use the word “research” for both. The AI hype in particular has been absolutely terrible for this. Companies have learned that putting out AI “research” that’s just them poking at their own product, dressed up in a science-lookin’ paper, leads to an avalanche of free press from lazy, credulous morons gorging themselves on the hype. I’ve written about this problem a lot. For example, in this post, which is about how Google wrote a so-called paper about how their LLM compares to doctors, only for the press to uncritically repeat (and embellish) the results all over the internet. Had anyone in the press actually fucking bothered to read the paper critically, they would’ve noticed that it’s junk science.





  • I know that this kind of actually critical perspective isn’t the point of this article, but software always reflects the ideology of the power structure in which it was built. I actually covered something very similar in my most recent post, where I applied Philip Agre’s analysis of the so-called Internet Revolution to the AI hype, but you can find many similar analyses all over the STS literature, or just throughout Agre’s work, which really ought to be required reading for anyone in software.

    edit to add some recommendations: If you think of yourself as a tech person, and don’t necessarily get or enjoy the humanities (for lack of a better word), I recommend starting here, where Agre discusses his own “critical awakening.”

    As an AI practitioner already well immersed in the literature, I had incorporated the field’s taste for technical formalization so thoroughly into my own cognitive style that I literally could not read the literatures of nontechnical fields at anything beyond a popular level. The problem was not exactly that I could not understand the vocabulary, but that I insisted on trying to read everything as a narration of the workings of a mechanism. By that time much philosophy and psychology had adopted intellectual styles similar to that of AI, and so it was possible to read much that was congenial – except that it reproduced the same technical schemata as the AI literature. I believe that this problem was not simply my own – that it is characteristic of AI in general (and, no doubt, other technical fields as well).








  • Other people have already posted good answers so I just want to add a couple things.

    If you want a very simple, concrete example: Healthcare. It depends on how you count, but more than half the world’s countries have some sort of free or low-cost public healthcare, whereas in the US, the richest country in the history of countries, that’s presented as a radical, left-wing, kooky, unrealistic, communist Bernie idea. This isn’t an example of a left-wing policy that we won’t adopt; it’s an example of what much of the world treats as a normal public service, almost like roads, that we can’t adopt because anti-socialism in this country is so malignant and metastasized that it can be weaponized against even that.

    A true left wing would not just support things like healthcare but advocate for an economic system in which workers, not bosses, have control over their jobs. That is completely absent.

    Also, this meme:

    Two-panel comic: the top panel is labeled Republicans, the bottom Democrats. Both show planes dropping bombs, except the Democrats’ plane flies an LGBT flag and a BLM flag.

    It’s glib, but it’s not wrong. Both parties routinely support American militarism abroad. Antimilitarism in favor of internationalism has been a cornerstone of the left since the left began.



  • I completely and totally agree with the article that the attention economy in its current manifestation is in crisis, but I’m much less sanguine about the outcomes. The problem with the theory presented here, to me, is that it’s missing a theory of power. The attention economy isn’t an accident, but the result of the inherently political nature of society. Humans, being social animals, gain power by convincing other people of things. From David Graeber (who I’m always quoting lol):

    Politics, after all, is the art of persuasion; the political is that dimension of social life in which things really do become true if enough people believe them. The problem is that in order to play the game effectively, one can never acknowledge this: it may be true that, if I could convince everyone in the world that I was the King of France, I would in fact become the King of France; but it would never work if I were to admit that this was the only basis of my claim.

    In other words, just because algorithmic social media becomes uninteresting doesn’t mean the death of the attention economy as such, because the attention economy, in some form, is innate to humanity. Today it’s algorithmic feeds, but 500 years ago it was royal ownership of printing presses.

    I think we already see the beginnings of the next round. As an example, the YouTuber Veritasium has been doing educational videos about science for over a decade, and he’s by and large good and reliable. Recently, he did a video about self-driving cars, sponsored by Waymo, which was full of (what I’ll charitably call) problematic claims that were clearly written by Waymo, as fellow YouTuber Tom Nicholas pointed out. Veritasium is a human who makes good videos. People follow him directly, bypassing algorithmic shenanigans, but Waymo was able to leverage its resources to get into that trusted, no-algorithm space. We live in a society that commodifies everything, and as human-made content becomes rarer, more people like Veritasium will be presented with more and increasingly lucrative opportunities to sell bits and pieces of their authenticity for manufactured content (be it by AI or a marketing team), while new people who could be like Veritasium will be drowned out by the heaps of bullshit clogging up the web.

    This has an analogy in our physical world. As more and more of our physical world looks the same, as a result of the homogenizing forces of capital (office parks, suburbia, generic blocky buildings, etc.), the few remaining parts that are special, like, say, Venice, become too valuable for their own survival. They become “touristy,” which is itself a sort of ironically homogenized, commodified authenticity.

    edit: oops I got Tom’s name wrong lol fixed



  • I cannot handle the fucking irony of that article being in Nature, one of the organizations most responsible for fucking it up in the first place. Nature is a peer-reviewed journal that charges people thousands upon thousands of dollars to publish (that’s right, charges, not pays), asks peer reviewers to volunteer their time, and then charges the very institutions that produced the knowledge exorbitant rents to access it. It’s all upside. Because they’re the most prestigious journal (or maybe one of two or three), they can charge rent on that prestige, then leverage it to buy and start other subsidiary journals. Now they have this beast of an academic publishing empire that is a complete fucking mess.