Being Human in the Age of AI

being human is big, bigger than we usually think it is

A robot's human-like face looks back at us amidst a chaotic background.
Too Human by David August

There is a lot of talk about artificial intelligence (AI) doing as well as or better than humans at things. Machines have long out-performed people at a lot of tasks. And this isn't just true of complicated machines like cars; simple machines, like the lever, have long amplified our efforts or outdone us outright. A pry bar is better at prying things than our bare hands. But humans have not, ever since the first person used a rock to make their efforts easier, been made obsolete by the machines we've made.

Sure, some things we have machines do that people used to do. But humans haven't stopped existing because of this. If anything, machines have allowed us to grow: machines and technology are part of why there are more humans currently alive than there have been at any other point in history. Humans haven't been made obsolete by tech.

So why is AI disrupting things typically done only by humans? Things like writing, art-making, film-making and acting are starting to be threatened by AI systems. Why artists, and not pry bars? Life and limb.

A self-driving car getting things wrong immediately collapses into the tyranny of atoms: concrete and often irreversible consequences follow when a machine driving itself ends a life or ruins someone's property. While self-driving vehicles may get to the point that they are safer than human drivers, they will never be perfect, and that imperfection, when life and limb are involved, makes them hard to cheer for or even experiment with out in the real world.

Not so with media, entertainment and the arts. Artistic creations made by non-humans don't leave grotesque consequences when they fail: any bad movies, books, music and more made by AI will fade into the noise floor of un-celebrated human artistic attempts. No life or limb is imperiled if an AI-made film falls flat; instead, bank accounts and careers take the hit (and audiences are disappointed).

This lack of life-and-limb stakes can make creative and artistic endeavors feel like the "consequence-free" world of software and web development: as if one can fail quickly and iterate, experimenting in public and refining as one goes. But the context of software and any context where life and limb are in play react very differently when one tries to fail fast. Experimenting and learning are invaluable, but at scale and in public, the tyranny of atoms makes doing so with self-driving vehicles, autonomous weapons and other life-and-limb scenarios far more dire than an unpleasing painting or a film that flops at the box office.

While many life-and-limb contexts benefit from the fact that automated and AI systems don't suffer from fatigue, emotional compromise or inebriation, some of the same things that make humans tired, emotional, or otherwise inconsistent are vital to making good decisions in other ways. Purely logical decisions with zero emotion can be low-quality decisions (there are studies suggesting this is true, but they are beyond the scope of our discussion here).

Artificial intelligence (AI) promises the benefits of intelligence with none of the "downsides" like inconsistency or asking for a paycheck. Taking humans out of the loop may be possible, even beneficial, in some contexts. But imagining an artificial system will inherit all the strengths we take for granted in humans (or even see as liabilities) seems a mistake. Humans are imperfect according to humans. What we find desirable and what the universe has evolved us into for our survival as a species are not the same. Eugenics makes many errors; this is one among them. Humans are great at a lot, even things we don't think of as great.

Being human includes the foibles and messy parts. It's not pre-Copernican to suggest that humans, after millions of years of evolution, are excellent at being human. This includes messing stuff up, hurting ourselves and also creating great things, beauty, connection, grace, mercy, cruelty: all of it. To imagine that in the space of a few years or decades one can do what has taken billions of people filtering through millions of years of evolution is the magical thinking here. To fall so in love with human ingenuity and creation as to believe that in a few generations we can out-create the forces of evolution across geologic epochs: that, to me, is the fallacy in this.

Yes, AI can do, and maybe even excel at, specific human-like tasks. The question of what is necessary and sufficient for something artistic to be art, of what essential quality makes art in fact art, is an entire branch of philosophy. There is little consensus on this, and what makes art human, or what human component is necessary for something to connect and resonate, may not have consensus either. That said: I think claiming there is an X factor, some amorphous and undefined thing, is a cop-out.

The generative AIs, for both images and text, are sophisticated mirrors. Highly complex, yes, but reflecting back their prompt and their training data is all they're actually doing. They are often compelling enough for us to forget this, but that is what is happening even if the results feel like more.

So here goes:

It is the vast and minute combination of human intuition, rationality, emotion, fear, power, weakness, apprehension, insight and ignorance. At the risk of paraphrasing the "What a piece of work is a man" speech in Hamlet, we might sum it up as the human condition.

Wikipedia, a source of common meanings, sums the human condition up as "the characteristics and key events of human life, including birth, learning, emotion, aspiration, morality, conflict, and death." Mortality and love (not just romantic love) certainly seem to figure into what I think the condition of being human includes. And how love and mortality figure into making art, or any creation, is somehow uniquely human. We could spend a long time and many words defining, describing and discussing this.

For our purposes: knowing we will die perhaps underscores all our actions with a fleeting immediacy, even if that immediacy is an undercurrent and not the focus of what we do moment to moment, day to day. Does a machine have a functional lifespan? Of course. But none of its activities are filtered, in any real way, through a lens of that mechanized mortality. Even in the back of our minds, under our awareness, the fact that we may stop existing informs, in both subtle and bold ways, how we do what we do.

Fictional characters are often revealed by how they do what they do. In many ways, how we do what we do is who we are. How a character says what they say matters almost as much as what they've said. I leave the studies on non-verbal communication dominating the information exchanged between people for others to detail elsewhere.

How someone lifts a coffee cup can speak volumes about them: what they have done, are doing and will do. Does the way they lift it reveal the old injury from their time in a past captivity? Do they lift it in preparation to enjoy a moment's respite from current stress? Are they preparing to go without coffee because they are due to enter a coffee-less meeting? All of these options and a myriad more shape how the simple act of lifting a cup of coffee happens. They can be revealed and communicated in many ways: how quickly, slowly, angrily, happily, and a host of other details that we all notice, even if not consciously.

And this rich, deep, full-bodied (pun intended) action and all it communicates (literally, emotionally and maybe even spiritually) happens on some level even in this mundane act of picking up a cup of coffee. It is filtered (pun intended again) through the whole of the human experience of the person lifting the cup, and also through the whole of our own human experience as we witness it. With varying levels of awareness and attention, our neurology encounters and interprets what we behold in a rich amalgam of ways that help us understand, cognitively and otherwise, the world and those in it.

All of this can easily become an overwhelming, almost unmeasurable amount of "data" were we to try to digitize it. Both the actor and the viewer are affected by it all, and in turn affect the interaction in countless ways.

To imagine that we, in a handful of years, will be able to do this with a machine is folly. To replicate or emulate with a machine (let alone innovate) such a deep and complex moment as raising a cup of coffee is a little bit of hubris.

What's that you say? An mp3 compresses sound to about a tenth the size (in data) of CD-quality audio by removing what its algorithm was designed to assume we won't miss. And mp3s still give us the emotional and other impacts of the higher-quality sound file? Our music on our phones and streaming services is still music, right? Yes, but if something walks like a duck, quacks like a duck and isn't actually a duck, it still matters that it isn't a duck. Some people consciously hear the difference between an mp3 and CD-quality music. The rest of us hear it unconsciously (studies of galvanic skin response suggest different audio qualities and film frame rates affect us differently physiologically).
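That "about a tenth" figure is easy to sanity-check with back-of-envelope arithmetic. A quick sketch, assuming standard CD audio (44.1 kHz, 16-bit, stereo) and a common, though not universal, MP3 bitrate of 128 kbps:

```python
# Rough comparison of CD-quality audio vs. a typical MP3 (assumed values:
# 44.1 kHz sample rate, 16 bits per sample, 2 channels; 128 kbps MP3).
cd_bps = 44_100 * 16 * 2   # CD bitrate: 1,411,200 bits per second
mp3_bps = 128_000          # a common MP3 bitrate
ratio = cd_bps / mp3_bps   # about 11x smaller, i.e. roughly "a tenth"
print(f"CD: {cd_bps} bps, MP3: {mp3_bps} bps, ratio ~ {ratio:.1f}x")
```

Higher MP3 bitrates (192 or 320 kbps) shrink that ratio, but the essay's point stands: a large share of the raw signal is discarded on the bet that we won't miss it.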

Why does that matter? Because there are many things a simulacrum (a simulation or reproduction) of a thing does not have that the actual thing does. People still flock to see the Mona Lisa despite reproductions, even very high-quality and faithful ones. Those reproductions may offer a "better" viewing experience than traveling to Paris and dealing with crowds. Is that some sort of fame fetish, or placebo effect? Maybe.

The fact remains: when we have had whatever day we have had, and crave the comfort of stories, we often yearn to have humans help us make sense, not just of events, but of our own experience, emotions and spirituality from the day. We have done this as long as we have done anything, even if at first it was only around the embers of a fire in caves long ago. Now we use tech to send reproductions of people to the far reaches of the globe, but the purpose and our need for it are the same.

If people still care about an original painting made centuries ago, can we really assume they won't care about original people on screen now? Can creations made only with AI really out-human our ability to human? No. AI can enhance, stand in for and supplement humans. Much as shoes do more for the fragile soles of our feet than going barefoot does, so too will AI allow us to do new things. And like shoes, or a pry bar, or any other technology, AI will change things, even in ways we can't yet guess at. But it seems deeply unlikely any artificial intelligence will somehow, inexplicably, transcend its makers.

What feature could allow any digital or electronic system to suddenly cross the threshold of passing from not human to beyond human? Since we don't agree on what being human is to begin with, we may not even be able to measure or meaningfully discuss when something we have made is more us than we are.

Our imagining that we already consciously understand the what and why of how we connect with humans and human stories may be the big fallacy here. Can we make something that imitates human stuff? Probably. We can probably even make imitations we will believe are the real thing, at least in some contexts. Will this somehow become better at being human, whatever that means to us, than humans are? Probably not. Whatever our weaknesses and strengths, we probably aren't going to buck or defy our own natures enough to make a more human-than-human non-human.

