https://img.inleo.io/DQmRW7iLLAQHV2CUeM6fM8Ej6gx8TCRo1yt5SSXJVAxRXua/connection-4848255_1280.webp
Every AI company is in a race to bring better AI to us, and their main goal is to make AI more human.
But I think humanizing AI to the point where it can simulate empathy and companionship is treading on some very disturbing ground. Yes, of course, having AI companions might look harmless, and in fact helpful for people struggling with loneliness. But when we start seeing these machines as actual friends or partners, don't we risk erasing what makes our human connections so unique?
AI may be able to mimic empathy, but I don't believe it can ever feel anything on its own. I think people don't realize that the emotional connections we form with these machines are one-sided; it is almost like talking to a mirror.
The danger is that this might lead us to expect the same perfection from real people.
If a girlfriend robot performs perfectly and a human girlfriend can't live up to that standard, it becomes a kind of competition, and the guy starts expecting more from his human girlfriend, which is wrong. Real relationships, which are usually messy and flawed, might start to feel disappointing to many.
What will happen is that we'll end up emotionally undercutting ourselves; we'll become impatient with others and, even worse, impatient with ourselves.
Companies profit by promoting these illusions, so they're not going to care what it does to society. We're not just paying with our wallets; we're chipping away at what's really human.