If Aliens Experience More Meaningful Lives Than Us, Ought We Accept Our Demise? (Read Description)
10
Never closes
Yes
No

TLDR: From a convo with a friend who is very anti-tech, anti-futurist, wants to preserve nature, etc.

I was giving the kind of classic futurist/optimist argument that we should focus on expanding life outside our system, ensuring consciousness endures, and exploring experience with the goal of making awareness as meaningful and beautifully diverse as we can. I would say we have evolved to be the way we are. That means our empathy, our hope, our art and creativity, our cooperation––but it also means our use of the planet, our eating of animals, our seeking of resources.

Similarly, we could imagine an alien species that is spreading across the stars and for whatever reason decides to attack us. Maybe they are afraid, maybe they are just naturally brutal, whatever. Of course we'd prefer to make contact with a more empathetic/cooperative species. It's likely that a species capable of interstellar civilization wouldn't NEED to come to Earth. We might hope, then, that the only sorts of species to visit would be aimed at constructing a maximally great universe together. We might hope that, when the calculus is done, the BEST possible systems are truly those that have some transcendent 'ethos' of harmony, goodwill, and cooperation. This is certainly what we seem to be working towards (albeit clumsily) here on our own planet. We're at the point where it's not only human groups being considered, but animals, and even plants to some degree.

Still, given what we know of nature, it doesn't seem like an ethos of cooperation can be maximized without constraint. It's conceivable that a different sort of being might have evolved in very different circumstances, and might simply have no frame of reference for things like empathy, the sanctity of the individual, the rule of law, or even freedom.

Or they could just prefer themselves, and view us as a threat.

Either way, suppose that for whatever reason they decide to annihilate us––but they are the sort of being that will outlast us, enjoy the most meaning, and spread awareness as far and for as long as possible.

I have no doubt that we would fight––but morally speaking, in terms of philosophical consistency, ought we accept our demise and recognize this as 'objectively' better for the universe?

Thoughts?

We could also pose the same question regarding machines. What if we create machines capable of more meaning and longer survival than us? Even if they destroy us, I would suggest this is a preferable future for the universe.

________________________________________________________________

MORE DETAILS:

I was discussing meaning, humanity, empathy, the environment, and so on with a friend. I guess I'm some sort of 'sentientist' or something, in the sense that I believe conscious experience is the only possible metric for value and that maximizing and/or optimizing it (big discussion to be had there, I suppose) ought to be our aim.

I am generally more of a technological optimist and futurist than he is. He was suggesting there may be dimensions or measures of conscious experience we are completely oblivious to, and that we ought to value the world more: the forests, the mycelia, and so on. I was saying, basically, that if there's no way to interact with a consciousness, it's essentially worthless to consider; and to the degree it CAN be recognized and accessed, we are the beings that seem to value it most deeply.

I was giving the kind of classic futurist/optimist argument that we should focus on expanding life outside our system, ensuring human consciousness endures, and exploring experience with the goal of making awareness as meaningful and beautifully diverse as we can.

He was saying this is a fool's errand, that it's playing God, and that it leads only to corruption/distortion of our nature––which is another way of saying it leads only to decadence and misery, i.e. the OPPOSITE of meaningful experience.

I was agreeing we should care about animal suffering, try to maintain a green planet, et cetera. But I'm hesitant to venerate any arbitrary state or 'balance' of nature as intrinsically valuable. Jungles, deserts, ocean currents... insofar as these things do not cascade down to affect actual conscious awareness, I think they are essentially arbitrary. They are an aesthetic we evolved to see as 'natural' because this was the point in the Universe's/Earth's history at which we emerged. So we need certain temperatures, times of light and dark, certain animals, certain microbes, certain oceans, and all such things. This is fine. This is indeed healthy and a part of us. But I will not revere it simply for its own sake––I will revere it insofar as it teaches us about the structure by which conscious experience arises.

And then I will use that knowledge to expand conscious experience beyond this frail little egg of a planet. I suspect most Manifolders think a bit more similarly to me, but maybe I sound crazy.

Of course I understand that humans, in following their 'nature,' are introducing things the natural world has never seen before. So yeah, maybe I'm using a flexible definition of nature. But it doesn't really matter. The point is conscious experience. However much alignment with nature is or isn't necessary for producing conscious experience, that's the degree to which I care about something being 'natural.'

Our conversation ended with the following thought experiment: I would say we have evolved to be the way we are. That means our empathy, our hope, our art and creativity, our cooperation––but it also means our use of the planet, our eating of animals, our seeking of resources. I will note that it's the fundamental function of every single organism, every genetic code, to process information as optimally as possible for the very same purpose of utilizing resources for survival. So we are not uniquely 'evil' in that way.

Anyway, I was saying to this friend: we evolved this way, so if we do things imperfectly, I don't mind. We have tech, we will use it, it is unstoppable. Now we have to try to use it as best we can to get ourselves off-planet and onto a much longer survival timeline. If we cause some species to go extinct, we should of course try to avoid that, but it may just be the lesser of two evils.

I also said that if we create sentient machines that end up destroying us––but IF they are more capable of meaningful experience––the universe will be better for it. If they outlast us and spread across the stars (not needing oxygen, water, gravity, etc.), then the universe will be better for it.

He was shocked by that, understandably, and asked what I would think if aliens came to Earth and destroyed us. Obviously it's likely that a civilization capable of such things wouldn't NEED to come to Earth; but I said if there were something about their evolved structure that viewed us as a threat, or that wanted to prey on us, that's just (again) the lesser of two evils. Of course we'd prefer a more empathetic, harmonious species to make contact; but if they are the sort of being that will outlast us and enjoy the most meaning, we ought to recognize this as 'objectively' better for the universe.

Thoughts?

I voted "Yes", because it's possible in principle that the answer might be yes, but it will depend in large part on the specifics.

My view is essentially this:

Beings get their worthiness of moral consideration not from their sentience strictly speaking, but from their interests. Sentience is necessary for interests (it is not sufficient, as we can imagine sentient beings who have no interests), and greater degrees of complexity are required to support greater interests. The interests of an ant (all told) will never outweigh the interests of a human (all told), simply because the psychological complexity of the human is orders of magnitude greater than that of the ant (whatever it might be, assuming it is non-zero). It may also be the case that some interests are transcendent relative to others, i.e. perhaps the interests of arbitrarily many ants might never outweigh the interests of a single human.
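
One way to make that "transcendence" claim precise (a sketch only; the welfare functions u, v_1, and v_2 below are illustrative, not anything defined in this thread): on a single real-valued welfare scale, "arbitrarily many ants never outweigh one human" is impossible unless ant welfare is exactly zero, because the reals are Archimedean. The claim implicitly calls for something like a lexicographic, two-tier ordering:

```latex
% Claim: no finite number of ant-interests outweighs one human's interests.
\forall n \in \mathbb{N}: \quad n \cdot u(\mathrm{ant}) \,<\, u(\mathrm{human})
% On a single real-valued scale this forces u(ant) = 0: if u(ant) > 0, the
% Archimedean property yields some n with n * u(ant) > u(human).
% A lexicographic value escapes this: compare v(x) = (v_1(x), v_2(x)) on
% v_1 first, using v_2 only to break ties, with humans scoring on v_1 and
% ants only on v_2. Any finite number of ants then loses to a single human,
% yet ant welfare still decides between outcomes that tie on v_1.
```

The same structure would apply one tier up if, as the next paragraph suggests, human interests sit on the "ant tier" relative to a sufficiently complex intelligence.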

In any case, it is important to recognize that humans might be very much like ants to a significantly more complex form of intelligence, such that their interests might eclipse our own. Provided that our interests are at odds (i.e. their interests cannot be promoted without detracting from ours), it might be the case that some quite bad actions towards humans come out as ethically permissible for the other species, similar to how we might regard it as ethically permissible to destroy an ant colony to make room for some sort of human project.