In my view, although I am neither a neurologist nor a philosopher, moral weight should absolutely scale with neural complexity, and it should do so in a non-linear way. I dislike harming an animal with a complex brain, such as mammals or cephalopods, much more than I dislike harming the equivalent nerve mass in insects, for instance.
That’s also the way I feel, but I think that’s probably human bias, closely related to the evolutionary pressure behind my mirror neurons: how strongly they trigger correlates with how outwardly sentient the creature appears.
I think if I knew what it felt like (if anything) to be an ant colony, I might have different views around the casual use of boric acid (and related compounds) to keep ants out of human spaces.