The article is about AIs that listen to your tone of voice and watch your body language to figure out how you're feeling. A couple of times it seems to be talking about machines that have emotions, but I think that's me misreading.
So here's a prediction. There will come a time when these things are common but not quite as good as humans are at reading us. Good enough though. At that point we will get into the habit of exaggerating things like our facial expressions for the benefit of the machine, a bit like we speak slowly and clearly (and with an American accent) to speech recognition systems now.
Then the machines will catch up and we'll stop doing that, although some of us will find it works quite well for our interactions with humans too (as speaking slowly and clearly, and sometimes with an American accent, does) and we'll carry on doing it.