He / They

  • 16 Posts
  • 1.41K Comments
Joined 3 years ago
Cake day: June 16th, 2023


  • immoral people existing is not the problem here

    True. The profit motive is. People pushing harmful content are doing it because it makes them money, not because they’re twirling their moustaches as they relish their evil deeds. You remove the profit motive, you remove the motivation to harm people for profit.

    the difference is that there isn’t an algorithm that acts as a vector for harmful bullshit

    The algorithms boost engagement according to 1) what people engage with, and 2) what companies assess to be appealing. Facebook took the lead in having the social media platform own the engagement algorithms, but the companies and people pushing the content can and do also have their own algorithmic targeting. Just as Joe Camel existed before social media and still got to kids (and not just on TV), harmful actors will find and join discords. All that Facebook and Twitter did was handle the targeting for them; the targeting would still exist without the platforms’ assistance.

    Said bad actors do not exist in anywhere near the same capacity. Imo the harm of public chat rooms falls under the “parents can handle this” umbrella. Public rooms are still an issue, but from experience being a tween/teen on those platforms, it’s not even close to being as bad.

    It wasn’t as bad on those… back when we were teens. It absolutely is now. If anything, you’ll usually find that a lot of the most harmful groups (red-pill/ manosphere influencers, body-image influencers, especially those built around inducing EDs) actually operate their own discords that they steer/ capture kids into. They make contact elsewhere, then get them into a more insular space where they can be more extreme and forceful in pushing their products, out of public view.

    If it was the case that it was just individual actors on the platform causing the harm and not the structure of the platforms incentivizing said harm, then we would see more of this type of thing in real life as well.

    I’m not saying it’s all individuals, I’m saying the opposite; it’s companies. Just not social media companies. Social media companies are the convenient access vector for the companies actually selling and pushing the harmful products and corollary ideas that drive kids to them.

    I struggle to think of a more complete solution to the harm caused by social media to children than just banning them.

    Given that your immediate solution was to regulate kids instead of regulating companies, I don’t think you’re going to be interested in my solutions.





  • despite how harmful it is for society as a whole, and especially children

    If you don’t understand that the motivation is to target kids with ads and influencer content designed to push products, you’re not going to solve anything. Kids have to have spaces to communicate with each other in order to develop healthy socialization skills. Locking them in a proverbial box is not healthy, and guess what, we killed off 99% of third spaces that welcome kids.

    If social media is banned for under 16’s, then children would have to communicate with normal chat apps.

    I feel like you are envisioning “chat apps” to mean “text-only”, but chat apps have been multimedia/ multi-modal, and multi-user (i.e. not 1:1 messaging) for a long time now, and can be just as easily infiltrated by the same actors targeting kids on social media.

    at some point some systemic problems are better served by systemic solutions

    This is not a solution, this is a band-aid that doesn’t attack the root cause whatsoever.






  • Capitalism gonna Capitalism.

    We are watching one of the greatest wastes of money in history, all as people are told that there “just isn’t the money” to build things like housing, or provide Americans with universal healthcare, or better schools, or create the means for the average person to accumulate wealth.

    This could have been written about the War on Terror.

    I can find no analyst commentary on Meta making sixteen billion dollars on fraud, because it doesn’t matter to them, because this is the Rot Economy, and all that matters is number go up.

    I’m not sure why he thinks morality is a factor in market movement. You’ll not find the stock market negatively reacting to money being spent on genocide in the Middle East or murders in the Caribbean, or to Palantir expanding into a mass-surveillance apparatus either.

    Analysts that do not sing the same tune as everybody else are marginalized, mocked and aggressively policed… By not being skeptical or critical you are going to lead regular people into the jaws of another collapse.

    Yes, market collapses are actually loved by large wealth holders, because unless the entire currency itself collapses, the people with the most currency are the ones best-positioned to benefit. Investors will ride the economy off a cliff so they can salvage the scrap at the bottom. In his White House presser, Sam Altman literally opined about ‘redefining’ the social contract when AI collapses the economy.

    Analysts have, on some level, become the fractional marketing team for the stocks they’re investing in.

    Because major news, analysis firms, and banks are all owned by the oligarchy, and no one is being punished for using that power to manipulate the market. They know that if they’re a big firm and they say, “this stock is amazing!” it will go up, and since they own that stock, they get richer.

    When it happens, I promise I won’t be too insufferable, but I will be calling for accountability for anybody who boosted AI 2027, who sat in front of Sam Altman or Dario Amodei and refused to ask real questions, and for anyone who collected anything resembling “detailed notes” about me or any other AI skeptic.

    It’s sad to me that Ed lived through 2008 and still thinks there will be accountability in this system. At some point you have to accept that the purpose of a system is what the system does. Our system cyclically collapses, economically, in order to enrich billionaires. It happened during the dot-com bubble, it happened in 2008, and it happened during COVID, and that’s just within my short lifespan.

    I realize I’m pearl-clutching over the amoral status of capitalism and the stock market

    I really don’t think you are. You haven’t even begun to reach the bare minimum level of disdain and disgust-inducing realism one should have about capitalism, never mind anything remotely close to pearl-clutching.


  • As someone who is not anti-tool just because big companies and capitalism are misusing said tool (that’s a ‘big companies’ and ‘capitalism’ issue that applies to far more than LLMs), this seems like a non-starter for any business use of the platform.

    Enterprise tools definitely come with an expectation of 1) not having ads placed in them, and 2) not having their users tracked for third-party data sale. That’s not because companies love their employees, but because they’re scared someone could infer proprietary business information from correlated user metadata. No company wants its new product ‘blown’ early because its devs’ internet activity was aggregated and the product inferred, or worse, to have a competitor get the jump on them because of it. Most companies begrudgingly accept the use of e.g. Google, but corporate policies will absolutely limit the kind of information you can put in a Google search. ChatGPT, by its nature, is just much more likely to end up with proprietary data put into it (because it’s a ‘conversation’).

    The “promise” that OpenAI will only use said data to target ads is laughable, even if OpenAI believes it.




  • Using ‘casual’ and ‘hardcore’ in the traditional gatekeep-y, “filthy casuals” way that e.g. Dark Souls players often do, isn’t really what the article is talking about.

    CoD and other battlepass-ridden live-service games don’t actually require high skill levels; they require high time investment. Destiny 2 stopped being a casual game in this sense once they started removing content, because it now places demands on the players’ time, rather than allowing players to engage with it casually/ at their leisure. Also, Destiny 2 has raids. No game with instanced raids is casual. I don’t play Fortnite, which is why I asked whether they have time-limited events, and I don’t particularly care where it falls versus others; I just tend to see most live-service games as inherently less casual because of this.

    My ‘hardcore’ game for many, many years was Eve Online, and let me tell you, there’s nothing casual about leaving work early or setting alarms for 4am and coordinating with several hundred people around the globe to all be online when a POS timer is finishing. It’s a hardcore game, but it’s not about twitch-aiming or dodge-timing gameplay.