Artificial Social Intelligence
Humans learn through intuition and feelings. However, the one thing I'm unable to develop intuition for is human feelings (including my own). So I'm forced to rely on an AI-like mechanism.
In my last post here I explained that I find it hard to feel feelings, either in myself or in others. This has been one major outcome of some of the recent therapy that I’ve gone through. And the absence of feelings has meant that I need to rationalise a lot more about things than the average person does.
Let’s take an example.
I remember this time when I was in class 11 (“junior year of high school”, for American audiences), which was when most people in my class were preparing for engineering entrance exams. Someone proposed that we meet at a playground on a holiday to play cricket, and some of us were surprised when the usually studious class topper also turned up. The usual jokes were made about things like him using kinematics formulae to determine where to run after a ball to take a catch, etc.
I’m bringing up this seemingly ordinary incident because he didn’t actually use physics (which he was bloody good at) to figure out where to catch the ball. He simply used his intuition, and knew where to put his hands. Actually, most people don’t need to use any physics to know how to run after a ball. You take (and miss) a few catches, and you quickly develop an intuitive heuristic for taking catches.
And this is basic human learning. Humans don’t learn to do things by learning the underlying processes. Yes, you need to learn the underlying processes to know how the systems work, and to design them. However, in order to make decisions quickly, you need to develop an intuition to solve the problem.
However, as detailed in my previous post, the one thing that I have failed to develop intuition for is feelings - either my own or other people’s. As I found through my therapy, I find it impossible to describe my feelings (or put them in words). I’ve been married to my wife for thirteen years now, and knew her very well for at least a year before that, but I still massively struggle to understand how she feels about something.
And in the absence of feeling, I behave like a good artificial intelligence would, by collecting data and trying to “infer the next word”.
I have a whole bank of life situations with my wife - the context of each situation, what she or I said, and how she reacted. And from this I have built a mental “machine learning model” of her feelings, based on the context and what I’ve said. And like any machine learning model, I keep getting it wrong.
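To stretch the analogy a little (this is purely illustrative - I don’t literally compute anything, and the situations and reactions below are invented), the mental model behaves roughly like a nearest-neighbour lookup over remembered situations:

```python
# Toy sketch of the "mental machine learning model" analogy:
# a memory of past situations, each with context features and the
# reaction that was observed. All data here is made up.
memory = [
    ({"topic": "work", "tone": "joking"}, "laughed"),
    ({"topic": "work", "tone": "dismissive"}, "annoyed"),
    ({"topic": "plans", "tone": "dismissive"}, "annoyed"),
    ({"topic": "plans", "tone": "enthusiastic"}, "happy"),
]

def predict_reaction(situation):
    """1-nearest-neighbour: pick the remembered situation that shares
    the most features with the current one, and predict that reaction."""
    def overlap(past_context):
        return sum(past_context.get(k) == v for k, v in situation.items())
    _, reaction = max(memory, key=lambda m: overlap(m[0]))
    return reaction

# With this little data, the model generalises badly - an unseen
# combination falls back on whatever past situation matches first.
print(predict_reaction({"topic": "work", "tone": "enthusiastic"}))  # → "laughed"
```

Which is exactly the failure mode described above: the prediction is a pattern-match over sparse data, not an intuition, so it is frequently and confidently wrong.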
You might have heard the story of how a machine learning model needs thousands of images of cats and dogs to learn to identify them, and how this is very different from a child, who needs to see only a few cats and dogs to identify them easily. This is because the child quickly develops an intuition for cats and dogs, and can apply it without needing any “formulae”.
This is how it is with interpersonal relationships and understanding other people, for most people. This wiring is missing for me, though. And with the feeling wiring missing, my only option is to resort to “machine learning”, based on the data that I have observed.
It is not just with my wife - it is in all social situations, including work. I don’t know when I’ve pissed someone off, or made someone happy. Unless someone is explicit with their feedback to me, I absolutely don’t get it (or I use my “artificial intelligence” to draw patterns, which isn’t as strong as emotions). I miss signs that someone likes me. I miss signs that someone is pissed off with me. I miss signs that someone wants to talk more to me. And so on.
As a consequence, I oscillate between being downright rude to people and obsessively worrying about whether something I’m going to say is offensive - again note that I lack the intuition to make the latter decision, and am forced to resort to my “artificial intelligence” (i.e. data analysis and explicit pattern recognition), which is not as strong.
And what makes matters worse is that I know that my “artificial intelligence” is buggy and prone to errors, which results in (seemingly unnecessary) anxiety. There are days when I am stressed despite not doing anything at all - because of the anxiety about whether my AI is working well.
In most things in life, including catching a ball, driving, and interpreting data, I’m able to develop intuitions. However, the one thing I’m incapable of developing intuition for is feelings - either my own or others’. And so I’m forced to rely on my “artificial social intelligence”!