Utopia Talk / Politics / Maybe we need to slow down?
Nimatzo
iChihuaha
Wed Jul 07 05:37:40
The speed of innovation and progress has been accelerating probably since we discovered fire. To take that as an example, we burned a lot of stuff for great progress before we realized burning stuff is bad. Perhaps if we were not so horny about progress and slowed things down, we wouldn’t be creating so many problems and existential threats.

There are things we have done and are doing that we know have considerable risks, like nuclear weapons, AI and biotech, but once the cat is out of the bag, everyone who can will commit to an arms race. The most recent of these is the development of autonomous weapons. Do you really want to ban this research if China (or vice versa) is doing it? You should, but you won’t.

We are fucked, discuss… While you can.
jergul
large member
Wed Jul 07 05:41:05
It's why I attribute viral qualities to humanity. Not unnatural. We know that humans and viruses have coevolved. On the bright side, viruses have taken on human characteristics (they try to live and let live).
Nimatzo
iChihuaha
Wed Jul 07 06:13:08
I see two components in the virus argument: the first being simply procreation, the other rapid evolution. In the case of the latter, it has become decoupled from genetics and now manifests in the making and breaking of stuff.

They both matter in that equation, but the evolution part creates the existential threats. Simply expressed: whatever inherent problem there is in how many we are is multiplied by the problems that come with accelerating innovation (e.g. AI, biotech). Though some of those problems are cumulative, like certain toxins in the air and water.
Habebe
Member
Thu Jul 15 10:37:33
Clearly there are those, at least in small communities, who have such ideas. I grew up near the Amish and went to school with moderate Mennonites, for example.

But as you pointed out, we should never underestimate the other guy's greed.

We are genetically evolved, I think, to try and outdo the other guy; those who didn't probably died off quicker.

"we wouldn’t be creating so many problems and existential threats."

I would argue that, for as many problems as we create, we eventually fix them, just as we have fixed previous problems.

Now, economically, arguing against a fixation on growth in favor of more sustainable, flat growth levels, that's interesting.
Nimatzo
iChihuaha
Thu Jul 15 10:46:55
You don’t understand existential threat. As my home boy Biggie said, ”dead niggas don’t make no moves”. If civilization collapses, to fix that is a tall order, or worse we make ourselves go extinct, then there is no fixing that. The context here is runaway tech, nukes, biotech etc.
Habebe
Member
Thu Jul 15 11:01:56
Nimatzo, I'm arguing: what is the alternative?

Many of these new techs may be just what saves us. I think it's imperative, to save our species, that we become a two-planet species; the thing is that there is some overlap between (potentially) dangerous tech and species-saving tech.

You've already mentioned it's near impossible to stop it.

Honestly, it's quite the accomplishment that we have kept nuclear tech as contained as we have for so long.
Habebe
Member
Thu Jul 15 11:03:38
As for existential threats, throughout history we have had plenty, which have been fixed through innovation.

In the 70s, people thought we would have worldwide famines due to overpopulation.
Dukhat
Member
Thu Jul 15 11:39:29
It's too late. Kleptocrats can spread memes and instantly get dumb men across the world to vote for their cause.

The solution to unfettered technology is some sensible regulation, and that isn't going to happen with conservative populism run amok.
Nimatzo
iChihuaha
Thu Jul 15 11:44:45
The alternative is to slow down, which would be an alternative cultural mode to the current one. Progress is written into our DNA, but the speed is partly due to the accumulation of knowledge and the acceptance of a certain economic modus operandi.

And no, until the 20th century we had not delivered any kind of technology that was an existential threat, nothing with exponential effects.
Nimatzo
iChihuaha
Thu Jul 15 12:02:09
There is a conservative approach to progress that involves a heavy dose of ”first do no harm”. And yes, it looks impossible, because we are very convinced, very impressed as we are speeding down the highway, that this ride will end well. If you accept the premise, that we are facing the risk of worldwide collapse or species-wide ruin, then asking ”what is the alternative” is probably the wrong question.
OsamaIsDaWorstPresid
Member
Thu Jul 15 13:35:40
”first do no harm”

u think da chinks care bout that? rofl

theyd abort there baby dauthtar then feed it 2 da dog 2 plump it up 4 food
Habebe
Member
Thu Jul 15 14:05:53
I stand by it as the right question to ask.

First off, these are possible existential threats (in a way: AI, biotech).

So it's not an assured threat. If we were to slow down, we could very well face existential threats from that, depending on the execution.

So it's apt to ask what the alternative is and what effects it comes with.

So far these have all been rather vague references, so I'm not certain what the details are.
Dukhat
Member
Thu Jul 15 14:10:10
You can see what China is doing. They use AI and constant surveillance to track people across their real and online interactions. Anyone deemed an insurrectionist has their hidden "score" lowered, and they lose the ability to borrow money or travel, among other rights. That's for ethnic Han Chinese.

Minorities just up and disappear and no one is the wiser.

Meanwhile, in the US it's a free-for-all, with social media companies going for "engagement", which mostly means getting stupid people addicted to narratives that fill them with hatred and fear.

This is how we got the insurrection, QAnon, and the Trump cult in general.

That's just the tip of the iceberg. AI and other stuff will be huge in the future.
Habebe
Member
Thu Jul 15 14:13:29
Dukhat's lack of self-awareness continues to astound me...
Dukhat
Member
Thu Jul 15 14:17:24
Irony is deep and wrenching.
Dukhat
Member
Thu Jul 15 14:19:22
The point being, arriving at honest answers and solutions to our problems that maintain the rights and dignity of the individual will not happen today, because so much of the world is deeply and fundamentally ignorant.

And made even more so by the sheer complexity and power of the media and the bubbles of information directed at them by kleptocrats who couldn't care less about the future of the world and only want religious zealot judges and tax cuts today.
Nimatzo
iChihuaha
Thu Jul 15 14:59:55
Habebe
Ok, fair enough, I will rephrase it for you: it is the wrong question to ask if one thinks the survival of the species is important. I.e. if our behavior creates existential risks, we have to figure out alternatives. It is the meta level of progress, you could say :-)
I did not create this thread because I had a solution to the problem, an alternative, but in principle I laid out the solution. Progress must be burdened with the risks that are being externalized. Regulation is a minimum, but it is also a matter of economy and culture.
Habebe
Member
Thu Jul 15 15:29:37
Nimatzo, "Ok fair enough, I will rephrase it for you, it is the wrong question to ask if one thinks the survival of the species is important. i.e if out behavior creates existential risks, we have to figure out alternatives"

And that's fine; I'm saying we need to make sure the cure isn't worse than the poison.

The thing you're suggesting is to intentionally slow down the rate of progress (we won't even get into the HOW).

Progress very well could be the solution. Not to mention we already face existential threats to mankind for which many people agree the solution is even more progress.

If your goal is to remove as many existential threats to mankind as possible, our best hope is greater technology.

Are you speaking generally, or do you have certain technologies you want slowed down? I realize you're not offering a solution.

#Letthe liquor do the thinking#Iamtheliquor
Nimatzo
iChihuaha
Fri Jul 16 05:03:01
We are not really connecting here; I am talking about the inherent risks in the speed of technological progress. You are confusing developing technology to avert external existential threats with existential threats _caused by us_ and our behavior; there is no adversarial relationship between those different threats. We can shoot down asteroids or colonize Mars and, as an example, not invent a super virus that kills everyone.

In fact, when you look at it, there are far more existential threats that are consequences of our behavior and technological inventions than natural ones existing in the solar system. Basically, a massive asteroid or a solar flare are the only extinction-level risks out there in nature. Only our imagination acts as a limit on the existential risks we can ”innovate”.

First do no harm means just that; it doesn’t mean let the patient die.
Habebe
Member
Tue Aug 03 05:03:03
http://freakonomics.com/podcast/save-the-planet-rebroadcast/

I thought of this thread while listening to this post, thought it was relevant.


"We are not really connecting here, I am talking about the inherent risks in the speed of the progress of technology. "

Yes, I get that, I really do. I'm simply arguing that there are inherent risks to slowing down as well.

Look at the FDA (US); its job is to slow down the rate of new drug approvals to make sure they are safe.

The counterargument is that the FDA has done more damage than good by restricting access to beneficial drugs while people suffer.

I think there is a good parallel in that example.


Habebe
Member
Tue Aug 03 05:03:27
While listening to this podcast*
Nimatzo
iChihuaha
Tue Aug 03 07:18:14
Ok, well, the FDA regulation is actually a great example. The USA has banned the use of federal money for gain-of-function research. I would argue that _is_ an example of slowing down, because the argument is made that gain-of-function research is a way of staying ahead of the natural evolution of vectors and predicting it.

"I'm simply arguing that there is inherent risks to slowing down as well."

Yes, we risk getting advances more slowly, but there is no existential risk to us as a species. Individually there is; some people will die because we didn't find the cure fast enough. As heartless as that sounds, these are marginal losses.
Habebe
Member
Tue Aug 03 08:05:32
I think I'm just more on the fence about whether or not the losses from reduced advancement are that marginal. Maybe they are, maybe they aren't; we haven't invented a working crystal ball yet.

I think gain of function gets a bad rap; I just think it needs to be more secure, like literally put the researchers on an island and sequester them from the public for 90 days before they can return, that kind of deal.

I'm on the fence because, well, look at COVID-19. What if it was far deadlier? For the sake of argument, let's say we know for a fact it was organic in origin.

Well, through gain of function, it is plausible that we could have prevented the pandemic. This one wasn't end-the-species bad, but the next one might be.

Nuclear weapons: dangerous stuff, potentially species-ending. However, since their invention we have become less violent.

I just think the pro and con arguments are more ambiguous.

As far as being heartless, if the answers were more concrete, I don't think it would be heartless; at worst it would be a minor negative side effect for the greater good.
Seb
Member
Tue Aug 03 15:21:16
Nimatzo:

One could equally argue that we are presently at a point where we are unsustainable as a civilization and either need to speed up or risk catastrophe.
Nimatzo
iChihuaha
Thu Aug 05 16:30:58
Ok, do it, argue it. What are you thinking about specifically that is moving towards catastrophe and cannot, at least technically, be averted?
Sam Adams
Member
Thu Aug 05 17:22:23
"and either need to speed up or risk catastrophe."

Logically, speed up the right things.

Speed up fusion research, solar, batteries, spaceships, genetic engineering, simple automation.

Slow down supercomputers and especially AI.
Sam Adams
Member
Thu Aug 05 17:24:08
"What are you thinking about specifically that is moving towards catastrophe that can not at least technically be averted?"

AI. Its very existence will ultimately threaten ours. Stop that shit.
habebe
Member
Thu Aug 05 17:40:39
What Sam's saying answers my question: identifying specific areas to slow down or speed up.

Theoretically, at least; we all agree it's not politically feasible anyway.

I agree that if the goal is to continue the species, we need to colonize as many planets as we can, as fast as we can.
Habebe
Member
Thu Nov 11 06:34:26
Ttt
jergul
large member
Thu Nov 11 07:47:56
Why do I get the idea that weather prognosis might be vulnerable to automation by way of algorithms? :D
jergul
large member
Thu Nov 11 07:49:18
habebe
Fix shit here first. It is the very baseline test of space viability.

If we can't do that, then there's no point worrying about lifeship strategies. Everyone leaving will die stupidly.
Nimatzo
iChihuaha
Thu Nov 11 08:09:11
But what if the problems here are the result of layers of historical legacy and can't be fixed in time?
Habebe
Member
Thu Nov 11 08:19:22
Jergul, with that attitude I'd still be in England or Hesse.
jergul
large member
Thu Nov 11 08:58:34
habebe
Instead of sleeping on your parents' couch? Trust me, that may seem mighty comfy now, but it would not do well in outer space when the CO2 recycler broke down.

Nimi
Not going to escape layers of historical legacy on blast off. Fix here, then dream of perpetual human potential.
Habebe
Member
Thu Nov 11 09:17:39
Oh, listen, I don't want to be one of the first Martians.

But jergulnometry can't dispute that a two-planet species has a greater chance of survival than a one-planet species.
jergul
large member
Thu Nov 11 09:18:32
That sounded meaner than I meant. You do what you have to do to help your parents out. But it is not ideal that your system makes this the practical way of them getting the help they need.
jergul
large member
Thu Nov 11 09:22:48
We are not going to be a two-planet species, nor a one-planet species, if we can't figure shit out here.

This is easy comparatively. Earth is a paradise with all the creature comforts of home. Yet we insist on shitting on the dining room table.

Maybe stop shitting on the table before deciding the hard stuff is any kind of solution at all?
kargen
Member
Thu Nov 11 13:03:51
"While on top of Everest, I looked across the valley towards the great peak Makalu and mentally worked out a route about how it could be climbed. It showed me that even though I was standing on top of the world, it wasn't the end of everything. I was still looking beyond to other interesting challenges."
Edmund Hillary

That is why we won't and can't stop advancing in all kinds of ways. There will always be people who ask what's next. They will go forward thinking only of the progress, and others will follow behind and think of how they can benefit.
murder
Member
Thu Nov 11 13:13:18

There is no artificial intelligence. Only artificial stupidity.
