I’ve copy-pasted a few of the things I’ve posted on this subreddit into the [playground](https://beta.openai.com/playground/p/default-essay-outline?model=text-davinci-003) (requires a free account) – and after ending the post with a clear question (e.g. “any thoughts?”) it spits out something that’s… actually mildly comforting. Yes, if you ask enough follow-up questions it becomes repetitive, and yes, it’s definitely not a catch-all, but it’s still scarily effective. Maybe I’m unsettled because I expected gibberish – though it makes sense, since it was trained on actual human input (possibly from Reddit as well lol) – but yeah.

2 comments
  1. Just answered somebody’s question on this sub 5 mins ago with a ChatGPT-generated response. Also, here’s a response to yours: *It is certainly impressive how well ChatGPT can mimic human conversation and provide helpful advice. However, it is important to remember that it is still just a machine and should not be relied upon as a replacement for professional therapy or counseling. It is always best to seek out real support from trained professionals if you are struggling with mental health issues.*

    Although I agree that it is a genius therapist, it just seems to have a case of imposter syndrome

  2. Well, it’s a text generation AI. It just guesses (or rather, calculates with a bit of randomness) which words would fit best.
    If you fed it the kinds of things therapists say, it would of course say something akin to what a therapist says.

    You should know that it doesn’t really replace an actual human. It just makes you believe it knows what it’s saying.

    Everything it says is basically normal, general advice you can easily find on the internet. Most people don’t bother to read it on the internet anyway.
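That “calculates with a bit of randomness” description is roughly temperature sampling over next-word probabilities. A minimal toy sketch (the candidate words and their scores are made up for illustration; real models score tens of thousands of tokens):

```python
import math
import random

def sample_next_word(scores, temperature=0.8, rng=random):
    """Turn raw word scores into probabilities (softmax) and sample one.

    Lower temperature sharpens the distribution, making the most likely
    word dominate; higher temperature makes the output more random.
    """
    words = list(scores)
    logits = [scores[w] / temperature for w in words]
    m = max(logits)  # subtract max for numerical stability
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    probs = [e / total for e in exps]
    return rng.choices(words, weights=probs, k=1)[0]

# Hypothetical scores for the word after "I hear that you are feeling ..."
candidates = {"anxious": 2.1, "overwhelmed": 1.8, "fine": 0.3}
print(sample_next_word(candidates))
```

Run it a few times and you mostly get “anxious” or “overwhelmed”, occasionally “fine” – which is why the same prompt can produce slightly different replies each time.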
