It just feels too good to be true.
I’m currently using it for formatting technical texts and it’s amazing. It doesn’t generate them well on its own, but if I give it the bulk of the info it makes them pretty af.
Also just talking to it and asking for advice on the most random kinds of issues. It gives seriously good advice. But it makes me worry about whether I’m volunteering my personal problems and innermost thoughts to a company that will misuse them.
Are these concerns valid?
It’s not conscious or self-aware. It’s just putting words together that don’t necessarily have any meaning. It can simulate language, but meaning is a lot more complex than putting the right words in the right places.
I’d also be VERY surprised if it isn’t harvesting people’s data in the exact way you’ve described.
You don’t need to be surprised; it’s written pretty plainly in their ToS that anything you write to ChatGPT will be used to train it.
nothing you write in that chat is private.