Emotions and AI

Love and multiagent systems and AI in general…  Love can motivate and inspire us.  Affect could be incorporated into an agent’s utility function, where it shifts perceived costs away from actual costs.  It is a lens through which agents could view the world.  Is this emotion necessary for AI to function appropriately?  What emotions are necessary?
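
Here is a rough sketch of what I mean, in Python.  Everything in it (the AffectiveAgent class, the love parameter, the example numbers) is made up just to illustrate the idea of affect discounting perceived costs inside a utility function; it is not an established model.

```python
from dataclasses import dataclass


@dataclass
class Action:
    label: str
    reward: float         # objective benefit of the action
    actual_cost: float    # objective cost of the action
    benefits_other: bool  # whether the action mainly helps another agent


@dataclass
class AffectiveAgent:
    # love in [0, 1]: 0 = indifferent, 1 = fully devoted to the other agent
    love: float = 0.0

    def perceived_cost(self, action: Action) -> float:
        # The affective "lens": love discounts the felt cost of actions
        # that benefit the other agent, so perceived cost can fall below
        # the actual cost.
        if action.benefits_other:
            return action.actual_cost * (1.0 - self.love)
        return action.actual_cost

    def utility(self, action: Action) -> float:
        # Utility is computed against the perceived cost, not the actual one.
        return action.reward - self.perceived_cost(action)

    def choose(self, actions: list[Action]) -> Action:
        return max(actions, key=self.utility)


if __name__ == "__main__":
    options = [
        Action("help partner move", reward=3.0, actual_cost=5.0, benefits_other=True),
        Action("stay home and rest", reward=2.5, actual_cost=1.0, benefits_other=False),
    ]
    indifferent = AffectiveAgent(love=0.0)
    devoted = AffectiveAgent(love=0.9)
    print(indifferent.choose(options).label)  # "stay home and rest"
    print(devoted.choose(options).label)      # "help partner move"
```

The two agents face the same actual costs, but the devoted agent's perceived costs are lower for actions that help the other agent, so it makes a different choice.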

I think this could be an allegory.  Maybe it could relate somehow to autism and people who have a hard time with emotions.  It might already be a movie or book.
I found an article on what it’s like to have never felt an emotion: