Discussion about this post

Brandon

AI developing this way is a problem of capitalism, which is a problem of our political institutions breaking down. That's what has us barreling into a future that no one asked for and we never had a public conversation about.

AI rises within a society driven solely by market logic, whose most powerful leaders at worst don't care about normal people, and at best are indifferent to their lives, the stability of their own society, or the future of the planet.

It's too narrow to talk of AI having a narrative problem -- disgust with AI exists along the historical arc of big tech: social platforms, business models built on the extraction of our attention and data, and the fomenting of antagonism and anger.

The more informed among us may root for Anthropic over OpenAI -- but people less informed about these distinctions (new companies vs. old big-tech incumbents) throw all these companies into the bucket of "big tech," and the distinctions aren't all that directionally important anyway. The market logic wins, and we know that corporate elites like the ones who own and run the technology, even if they have moral compunctions, will be driven by the arms race of market logic to grow at all costs.

Only within a system where corporations have become more powerful than governments can they roll up decades of our collective expertise, thought, and craft to build tools to displace us with nearly no consequence -- this is the same society that federally prosecuted Aaron Swartz for downloading articles from JSTOR without permission.

And at just the moment we need democratic governance to politically and socially harness AI, re-orient socioeconomic policy, and build a new New Deal, we are governed by the most morally vacant and corrupt president and administration, probably ever. But beyond Trump, people recognize that their lives are being governed by big tech and a handful of techno-oligarchs too.

So yeah, I don't support violence, but the AI populism is justified.

Nick Herman

"One nightmare is a future where we get AI that’s good enough to wreak social and economic havoc, but not yet good enough to cure cancer / solve climate change / deliver 10% GDP growth. In that world… who pays?"

It's actually the opposite on at least part of that, since data centers are contributing to climate change and environmental damage. Taking the long-term view that people in tech espouse (or can't be bothered to take, where the little people are concerned), I'm actually more disturbed by this complete disregard for that acceleration than by short-term job losses and reshuffles (tragic, but it has always been the case with all technologies).

What's different in the present day is our much greater biological and climate impact on the world, in a far more tenuous moment than any time in the recent past, and with existing and incoming tech that accelerates the feedback loop in the wrong direction. It's almost like people in AI (the ones who can even be bothered to have thoughts like this) are taking it as an opportunity to see how quickly they can mentally relegate environmental concerns to the dustbin, after, probably, all that "wasted time" over recent decades by some annoying groups focusing on it, getting in the way of "progress."

For my own part, although almost everyone is wrong at least partially when it comes to prediction, I can easily imagine a time in the future when AI (and whatever iterations and variations come after the current wave) has popped, we are left in an even deeper doom loop of climate, environment, species depopulation, dead oceans, and food shortages, and all this is simply another grim bullet point of the past -- more evidence that we did nothing about the things that really matter and are really real in the physical world we live in. I'm sure the ultra-rich will prosper, justify, and rationalize it away no matter what, securely sequestered away.

An excellent recent book I'd recommend to you and anyone who sees this is Goliath's Curse, by Luke Kemp. There are many common themes with your interests and some of the deeper narratives you're discussing, but I think one of the most important aspects comes in the very beginning of the book, when he points out that for 99% of the time we've existed, our nature as humans was to be mostly highly egalitarian. This is not new information, but the current evidence is pretty convincing, and it shows how unnatural such (literally) world-destroying oligarchies are -- technology is just the tool wielded by those in charge, as in our present time.

His central thesis is that Goliath entities/states sow the seeds of their own destruction. On some level, if you agree with the conclusions (which I think are convincing), it's somewhat meaningless and disingenuous to say you understand threats or violence toward such entities while disapproving of them, because that is the natural mechanism left in society for power to shift away from the much greater violence that is already being done and will be done -- when other methods have already repeatedly failed or been suppressed, as they absolutely have, particularly in the US.
