I picked up a copy of an AI-assisted book called Imaginoids, by an author using the pen name Ether Busker. It was written in 2021, apparently using GPT-3.

It’s got some interesting language, though overall it feels more like a psychedelic trip report than an AI speaking. It’s a little meandering and light on narrative, though I’m not finished with it yet. The best way to read it is probably just to let it wash over you…

The key takeaway I have gotten so far from the book actually appears in the intro, and I would guess is primarily human-written. Excerpted below (slightly out of order):

“I produced this book with the firm conviction that artists, dreamers, creators, culture designers, and oddball freaks have a supremely important job to do. If we want our children to enjoy a livable AI-powered future, we artists must roll up our sleeves…

This is a job for artists, as much as for software engineers, if not more so…

What if zany artists would call shotgun for the front passenger seat to co-pilot AI development?”

This author is, I think, making an excellent point that bears repeating: we’re putting just about all of our eggs into the “engineer” basket in the development of AI, and only secondarily serving other kinds of people with the byproducts that get generated as a result.

In a perfect world, that might be enough. In our raggedly imperfect world, it is extremely far from being enough. Engineers, for all their amazing attributes, are neither the only nor necessarily the ideal representatives of the human race. But they hold a shit-ton of power in the development of these technologies… How can we better balance it with other types and modalities of human knowledge, experience, and – dare I say it – spirit?

There’s plenty of talk in AI circles about inclusive development, but this almost always has to do with representing different races, gender identities, etc. All of which is important, and all of which has its place… But apart from the book introduction quoted above, I have not really heard anybody suggest that we need different kinds of humans to participate in developing and steering these technologies. Artists, it so happens, might just fit the bill.

So how do you actually execute on this need, once you’ve become aware of it? How as an artist do you feed back into the development of the tools?

One way is obviously testing, experimentation, sharing of results, and sharing ample feedback with product teams. Again, all of this is important, but it is very different from – say – every engineering team also giving artists – and moreover humanists – an equal say in how these things ought to go.

Ethicists, to a certain degree, fill this role of “let’s ask a human how this does or might impact people.” But the risks and opportunities that they look for are a much more constrained set than what artists will gravitate towards.

I’m not sure of the answer here. Working in technology, I’ve seen that engineers are valued so much more highly, and are so much more in demand, than “arts & letters” type people that the rest of us non-engineers are almost not even in the running. Yes, artists might sometimes wind up in product or project management positions (or, more obviously, design positions), but even that ends up being somewhat constrained in my experience.

Again, I don’t know how you should execute this in practice. I suppose AI artist residencies are one pathway that has been established for this, where participants get to play around with the tech and presumably feed back more directly into product development. That’s very cool, but from what I’ve seen, those opportunities are extremely few, and most of the listings I’ve found for them are expired. And anyway, how, from a business perspective, can one even quantify the contributions of artists in something like this? Especially in the downturn tech is currently undergoing.

Difficult problem, but an important one that we need to keep talking about.