One training lecture for new employees includes reading the book QBQ and then sharing at least 100 words in the group about what you've learned and how you feel after reading it.
I found it interesting to read the thoughts shared one by one in the group. Some people's takes on the book were genuinely engaging. And some were obviously generated by a nonhuman. By AI, that is. It's not hard to spot the familiar sentences and structures in those AI texts. People today don't even have the patience to read a book that can be finished in less than an hour. Pathetic. I couldn't help wondering whether they knew how easy it was to tell their writing was AI-generated.
I once watched a crime drama in which the detectives investigated a letter left by a girl who had died at a hotel. The assistant couldn't tell whether the letter had genuinely been written by the victim or faked by the murderer. It was then that the professor told her the letter was authentic, and explained why.
Human writing tends to have grammar errors, typos, and a style of its own, particularly when we're writing something personal that carries our own thinking and expression. And if the text is handwritten, we'll likely find various marks as well.
AI-generated text is trained on human writing, so it's possible for words that come from a machine to sound human, too. But more often than not, the content becomes less human after being reconstructed by an algorithm. In the process of learning patterns and piecing words together, what the machine generates grows less "reasonable" from a human perspective.
Don't get me wrong. I am not against AI. I have used AI tools to generate articles, more than once. But I always adjust the content AI gives me. I add a human touch, even when it's just a generic instructional SEO article.
AI can help us. But it (or they? or...?) can't replace humans. Not yet. It gives us the raw material for the content we ask for, but we shouldn't treat what it produces as the final work.
Don't rely on AI. Don't let it take "your" feelings away from you.