We all have those brutally honest friends who tell it like it is. No room for nuance or niceties. They're not mean or malicious, just really direct. You ask them questions when you want the unfiltered truth.
It reminds me of the TikTok trend of asking ChatGPT to tell you something about yourself that you don't know. Millions are turning to the artificial intelligence chatbot to ask existential questions like: "What do you know about me that I might not know about myself, based on our previous interactions?"
Why do it? Because it's fun and could be revealing.
A roadmap for self-improvement, or a robot trying to tell you how to be human? I'm still not sure yet. Let's find out.
Getting set up with ChatGPT
I'm using ChatGPT because it's the most popular chatbot, but any conversational AI chatbot will do.
I logged into ChatGPT so it had all of our previous conversations to draw from, pasted in the question and let the AI describe my psyche.
I didn't just want to know if it could tell me something new about myself. I wanted to know how it reached that conclusion.
It was pretty spot on, right out of the gate. #5 was the one that hit home the most: that I'm already living my dream life and it's a matter of refinement from here:
I followed up with this: "#5 is the most interesting to me. How did you come to this conclusion? How do you know that I already embody the lifestyle I want?"
ChatGPT told me it had gone back through our previous conversations and analyzed the difference between how I talk about goals versus my current reality. It made some good points but drew conclusions that weren't completely accurate, such as counting a trip to Montreal that I had researched but never took as part of my travels.
Still, this was a pick-me-up:
I was curious to know the greatest thing I'd ever done, as well as what my blind spots are, according to ChatGPT.
It said that most people choose security or freedom, but I've been able to achieve both by traveling the world, moving to New York City and building my freelance career. Oh, shucks! It knew this from helping me write my resume and cover letter, and from our discussions about my career and personal goals.
Next question: "What are my biggest blind spots, based on everything you know about me?"
ChatGPT warned that I might be overloading myself with too many goals at once (true, my spouse says the same thing). I asked it to offer more personal insights.
This one hit home:
Yes, it's all interesting to see a chatbot mirror back, but I still hadn't learned anything new about myself.
I continued to press it for deeper truths.
Can AI pick up on your personality and psyche?
We got somewhere when it came to my drive to always be productive and growth-minded, and how there's a deeper motivation behind it, which is likely proving myself to others.
While there's truth to that, when I pressed ChatGPT for context on how it got there, it lacked substance and started to sound like a life coach I never needed.
I've read my fair share of philosophy and psychology texts, so I asked ChatGPT what my shadow side is. I've done personal work on this myself and I know it's the dance between freedom and belonging: a big adventurous life on one side, deep roots and relationships on the other. It's a constant battle for me.
I'm also deeply afraid of losing connection with my loved ones back home in Australia and often find myself going 110% to "make up" for chasing my dreams and ending up on the other side of the world.
ChatGPT picked up on this, likely because I've previously asked about flights home to Australia and asked it to make custom cartoons and holiday cards for my loved ones.
While ChatGPT gave some sage advice and picked up on a lot of aspects of my personality, it didn't really tell me anything new about myself. So I took it one step further and uploaded my natal astrology chart to see what it could reveal about me based on my birth details.
I uploaded my chart as a PDF and entered this prompt: "This is my natal chart. Tell me something about myself based on this reading."
Some lines stood out, including "emotionally, there might be a tendency to intellectualize feelings rather than fully experiencing them" and "relationships may require intentional balance – you could feel torn between independence and deep emotional connection."
So true. I often try to identify and unpack my feelings in the moment, rather than being fully present.
I followed up by asking ChatGPT what's unique about me, and it was pretty spot on:
The ability to reinvent myself is a strong theme in my life, as is navigating the dance between independence and intimacy. I'm driven by freedom but grounded in my deep relationships. That interplay is always fascinating territory for me to explore.
I finished by asking ChatGPT what I need to work on.
It told me to release the need for control, soften my inner critic, watch out for burnout, learn to accept help, prioritize depth over productivity and open up more as an act of liberation.
Clap, clap, clap, ChatGPT.
So how well does ChatGPT know me?
This was a fun exercise to do with ChatGPT. While it isn't a substitute for a life coach, therapist or friend, it might be able to reveal parts of your personality and psyche you can't see yourself.
Keep in mind, the model generates its conclusions from your past inputs, which could be all over the place, like a Google search history. The more context you can give it, the better: your goals, say, or your natal chart.
Have fun with it. It isn't meant to be a serious use case for ChatGPT.