This is an impossible standard, and I don’t believe it’s one you actually subscribe to: for instance, pretty much everyone is ok with sterilizing stray dogs and cats, and there is never a question of consent.
I don’t claim to 100% live in an ideal way. I try to keep improving but I don’t think I’ll ever be perfect
I think in cases where consent is difficult or impossible to obtain, we should act in the best interest of the experiencer in question. That example is a tough one, though: at first glance I think we shouldn’t sterilize them, but when I consider what will almost certainly happen if they’re not sterilized, I think it’s probably worth doing the one bad thing to prevent worse things from happening. It’s an example where I think a utilitarian approach makes the most sense, since the variables are relatively clear.
And a Bible-believing Christian has a clear answer: it doesn’t matter, you have dominion, do what you want. I imagine you don’t like that reasoning, but it, too, gives clear guidance on the morality.
I’m not talking about whether you live your values, I’m suggesting you don’t understand the implications of your own values, and under scrutiny you would find them internally inconsistent.
which is fine, as long as you’re not going out and telling others the right thing to do.
I think I do understand them; I’ve thought about that problem before. Can you go into more detail on what you mean by internally inconsistent? By my understanding, situations can come about in the world where values need to be weighed against each other, or where there are only bad choices available, but that doesn’t mean those values should be discarded or replaced, or that they shouldn’t be shared and spread.
Either you can write an axiom that says “sentient beings should always consent to anything that is done to them” or you can write an axiom that says “you should always do what will bring about the most happiness or the least distress.”
Those axioms are in conflict with one another. It’s not that there are only bad choices; it’s that you’ve given yourself conflicting standards.
Neither of those is an axiom I hold. The axiom “all sentient beings are morally relevant” does not specify how to go from there, and I am not convinced that any one ethical framework is “the one.” There are some things that all the frameworks I’m aware of converge on from a sentientist perspective, but there are also edge cases, like whether to euthanize stray animals, where they don’t converge.
someone experiencing it should have a say in whether or not they experience it
once again, we are going to be disagreeing on the relevant definitions of “someone”.
the experiencers should have a say in whether or not they experience it
How do you define “someone”?