It almost sounds as though you're suggesting (or at least leading the reader in this direction) that asking whether something *is* autonomous isn't really useful, and that degrees of autonomy make more sense. That seems like a more useful framework: instead of a Turing Test, which in theory could never be passed for the reasons you outline here, we could describe system A as "level 3 autonomous," system B as "level 4 autonomous," and so on.
Don't ask me for any details, though! I'm more of an "idea guy."
I throw an idea out there and run away quickly when the going gets tough.
A good springboard for thoughtful conversation here, Michael! I hope others take the opportunity.