It's interesting to me that people get so upset about AI suggestions in writing tools. Not the ethics of it (I understand the ethical debates), but just the sheer annoyance of the suggested correction. That little underline or tooltip about a misplaced comma really gets to some people, myself included at times. So I can understand why it would be frustrating to have a computer constantly trying to tell you how to do something, especially if it's something you're already good at and the computer is not. But for me, it's actually often helpful.
It's probably strange, but whenever I see a really bad AI suggestion, it motivates me to keep writing. I can't stand seeing something wrong and not doing anything about it. I'm hardly unique in this; one of the most obnoxious (but true) pieces of advice I've ever seen is that if you want to get a question answered on the internet, and merely asking doesn't get a response, say something on the topic that you know is wrong -- then people will leap to correct you, and you'll get an answer to your original question much more easily.
I've never quite been able to bring myself to do this -- the reputational damage seems like more trouble than an answer is worth, and I'm certainly not going to go through the trouble of making a sockpuppet account in order to get an answer... but it's always on my mind.
Something something XKCD; duty calls when someone is wrong on the internet ;)
To be clear, I don't think this happens because people are unhelpful or anything. Often people won't answer your question because they feel like they're missing information. But if they see that you've said something very wrong, and they know it's wrong, they feel motivated to point out the error.
That said, I think part of the reason people get so irritated by AI suggestions is because they feel like the computer is telling them they're doing something wrong. And in some ways, I think that's true. If you're using a writing tool and the AI keeps suggesting changes, it can feel like the computer is saying your writing is bad. Being constantly nagged about low-stakes things is annoying whether it's a computer doing it, or a spouse, or a child.
I can imagine that it feels a bit like getting cut off mid-sentence, like when you're talking and somebody interrupts to finish the sentence for you. Some people like that, because it makes them feel like the other person is engaged, listening, and on the same wavelength; other people hate it, because it throws off their train of thought and feels a little insulting, like the other person didn't want to hear what they had to say. So I imagine preferences about these tools are, in some ways, like conversation preferences.
Sometimes, though, it's useful. Motivating, in the same way that some highly successful celebrities report that a big reason they kept pushing past the point of reason was a desire to spite someone who told them they wouldn't make it in their field. It's not really healthy for relationships, but computers don't care if we like them, so they just keep blindly pushing until we turn off the annoying feature.
But I try to resist the desire to turn off the annoying thing. I think the AI suggestions can be helpful, even if they're irritating; they help me focus and stay on task, if only because fixing each little problem is an easier task than starting from scratch, even if what I wind up with bears absolutely no resemblance to the original suggestion my computer shoved at me 😅