Ted Chiang has the rare ability to tell stories of deep emotional complexity in a language so simple it can appear, at times, shallow. That’s the trick; that’s the magic of the narrator. In many of his stories, the slow unfolding of the plot carries the reader like an unmanned canoe along a quiet tropical river. The scenery flows monotonously past our eyes. Feeling no friction, we absorb the characters and their lives, their relationships and their thoughts. Without noticing, we are immersed in their world. The fictional element - a new technology, a breakthrough in AI, an alien arrival - blends in as if we were already familiar with it. We don’t get distracted by silly questions of possibility or probability; we are free to focus on the meaning. The shallowness of the story allows us to find its depth.

The Lifecycle of Software Objects makes us spectators to the lives of Ana, Derek and a few others who happen - by chance more than by will - to be involved with an attempt to breed sentient digital avatars called digients. Chiang doesn’t pursue any specific objective with the plot. The future he paints is neither utopia nor dystopia. There is no climax or surprising plot twist, no obvious moral or lesson. Yet readers will catch their minds reflecting on the irresistible human desire to play the role of the creator; or asking what truly separates objects - however smart they may be - from beings; or why committing to a cause so often requires the suffering of those we hold near.

By the time I put down the book, I had two particular thoughts in mind. The first relates to how technology emerges and what we can do about it. I tend to oscillate in my opinion on whether technology has a will of its own or merely derives one from us - whether, in other words, technology wants something by itself, or only wants what we want it to want. Reading this story knocked me back to the determinist (it wants something) position, albeit with a twist. Chiang brings humanity to the centre of the stage while making it clear that technology doesn’t depend on any individual human. Technological progress is presented as both personal and impersonal. Personal, because the emergence of something new is the product of very visible individual journeys - you cannot disentangle the story of Facebook from the story of its founders and early team. Impersonal, because the absence of a grand plan or individual stroke of genius gives us the certainty that we would have ended up in a similar - if not the same - place had those specific individuals taken another path.

When it comes to artificial intelligence, in its different shapes and forms, I get the feeling that the same steady current that carries us along Chiang’s plot will carry us to realising some version of it. A current combining the inescapable attraction of the possible (“because it’s there”), greed, and the need to put a meal on the table at the end of the day (the main character, Ana, accepts the job at the digient maker as a surrogate for an unachievable - real - animal-care career). The same mix of ex post inevitable and in itinere contingent that defines most breakthrough innovations.¹

The second thought relates to the title of the story, the idea of lifecycle in software - but also non-software - objects. In some ways, the presence of mismatched lifecycles is the kernel of the narration. Human lives - Ana’s and Derek’s - span the life of Blue Gamma - the startup behind the digients - and of Data Earth - the platform that initially hosted them. They precede the digients but might not outlast them, if the digients find a way to self-host. The digients themselves are longer-lasting than traditional software, and that causes the tension between fleeting consumers, who want to replace them with something new, and the people who grow attached to them to the point of being unable to ditch them as one would ditch an inanimate object.

There is something more in the realisation that software objects tend to have shorter lifecycles than animate ones. In certain ways, software loses some of its superpowers in the effort to be humanised. Like Arwen giving up her immortality to marry a mortal, becoming intelligent - in the sense of emotional intelligence more than the purely computational kind - slows down the snowballing process of trial and error typical of software development. New factors come into the picture, new motives, new needs - in the Maslovian sense - and above all the need for life itself. Survival instinct goes hand in hand with intelligence.

Now that they have established a relationship with their digients, Ana and Derek are unable to move on. Now that they have established a relationship with Ana and Derek, the digients are unwilling to be phased out. One day, software will overcome this problem. It will learn to write itself and to port itself to the newest platform. Until that day, establishing a software-human relationship will bring with it the same trade-offs as any human-human relationship. When we develop emotional ties to one another, we put down roots, and we lose optionality. When we get attached to software, we trade speed for backward compatibility. When we truly love a software object, we end up like those people who keep an old computer at the office to run that one specific application they cannot live without.


  1. Just as I was reading this story, I heard about the interest of Lux Capital’s Josh Wolfe in animal cognition.