
Stress-testing AI-imagined art — or “your mother is a tracer”

 1 year ago
source link: https://uxdesign.cc/stress-testing-ai-imagined-art-or-your-mother-is-a-tracer-2222cef8b124


I used machine learning to teach myself embroidery, and it helped me remember what humans are good at: editing.

split image of an embroidery — one imagined by AI and one made by a human
top image imagined by Stable Diffusion, bottom image exists in reality / is a real tangible thing

I first trained a GAN to produce things to sew in 2017, and it was messy and complicated and barely successful. Then OpenAI released DALL-E, and the model was pre-primed for my purposes. What purpose? To have a machine imagine a thing (the machine owns the intellectual property), and then have a human produce it (the human is the lean machine). It’s like an inverse industrial revolution — so, I started and documented it on Instagram.

How? The concept is this:

  1. Have an AI imagine something handmade
  2. Process that image as a vector file (in this case: SVG)
  3. Process that file to prep it for an embroidery machine (in this case: PES)
  4. Have a machine embroider it
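
Step 2 is where the trouble described later begins. As a toy illustration (all names here are mine, not from the article), the sketch below converts an already-thresholded 1-bit grid into an SVG string, one rectangle per filled cell. Real tracers like potrace fit smooth curves instead; this naive version shows how mechanical tracing multiplies shapes.

```python
# Minimal sketch of step 2 (image -> SVG), assuming the AI image has
# already been reduced to a 0/1 grid. One <rect> per filled cell is
# exactly the kind of shape explosion auto-tracing produces.

def bitmap_to_svg(grid, cell=10):
    """Turn a 2D list of 0/1 values into a crude SVG string."""
    rects = []
    for y, row in enumerate(grid):
        for x, v in enumerate(row):
            if v:
                rects.append(
                    f'<rect x="{x * cell}" y="{y * cell}" '
                    f'width="{cell}" height="{cell}" fill="black"/>'
                )
    w = len(grid[0]) * cell
    h = len(grid) * cell
    return (f'<svg xmlns="http://www.w3.org/2000/svg" '
            f'width="{w}" height="{h}">' + "".join(rects) + "</svg>")

# A 3x3 blob with 5 filled cells becomes 5 separate rectangles.
svg = bitmap_to_svg([[0, 1, 0], [1, 1, 1], [0, 1, 0]])
print(svg.count("<rect"))  # 5
```

Five cells, five shapes: scale that to a full-resolution image and you can see why the downstream embroidery software chokes.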

So (sew) easy, right? No, not so much, and definitely not when I am stomping around replacing my third needle of the day on the Brother SE600 embroidery machine. Why? I have an idea why.

In 1997, the movie Chasing Amy came out. I had a huge crush on Jason Lee. Shortly after, I took a class on vector graphics and was introduced to the concept of Live Tracing, and these two concepts became permanently linked in my mind.

this has multiple bad words and is definitely nsfw

The instructor prefaced the lesson by letting us know we were welcome to use live tracing, but 1) they would know we used it, 2) so would everyone else, 3) it would not look good, and 4) we would spend more time fixing its mistakes than if we had just done it by hand. So: just do it by hand.

Several students decided they were smarter than the instructor and yes, the instructor was correct. We all knew. It could not be more obvious.

Twenty-five years pass. Many things change. Computers fit in pockets.

AI companies pursue artificial general intelligence in similar ways: many examples of a thing “train” an understanding of the patterns that make a thing what it “is”.

again, many bad words nsfw

What makes a thing a thing?

We understand the world around us through exposure and interaction — this is the basis of constructivism, so it makes sense to see Piaget’s concepts of cognitive development applied to artificial intelligence. Children learn the differences between a house cat and a lion by collecting and interpreting patterns and information. AI is no different: it simply has a much larger capacity to consume information.

I wanted to see if AI can create a new quilt pattern, knowing this is a much more complex statement than it seems. Quilts use geometry to create “quilt square patterns” — the underlying logic of a quilt for sewing and construction.

Current artificial intelligence interprets patterns from whatever material the model has been given to learn from. As someone who follows along with DIY happenings on the internet, this outcome of my prompt was 0% surprising.

DALL-E image of a single hand holding a quilt in an open field
DALL-E imagining a “modern quilt square pattern”

It understands the basic concept: the quilt has geometry, top stitching, binding. It also understands that humans commonly take pictures of their finished quilt pieces outside in nature. Or, rather, that a single hand commonly takes pictures of their finished quilt pieces outside in nature (where is the rest of the human and what is up with those fingers 🤢).

A quick glance at Pinterest gives some insight into how this image was hallucinated.

a bunch of quilts on pinterest out in nature
modern quilts on pinterest

Generative AI is great at many things, but it is also not great at many things. It lacks editorial skill and deep meaning, because these tend to be subjective, individual skills.

What do live tracing and constructivism have to do with all this?

The basic concept I learned in that graphic design class has not changed: some things are better left to humans. Trying to clean up what the computer “created” is an annoying hassle, and you will spend more time correcting its mistakes than if you had just done it yourself from the beginning (or, “just do it the right way from the start,” as you may have heard once or twice).

We can use technology to supplement, augment, create efficiency, and autocomplete. But, like live tracing, asking a machine to create something from nothing? To imagine an embroidery design, then convert that design to shapes, then convert those shapes to fillable paths for another machine? Nope.

Not only does technology not accomplish this well, it also simply does not work and can cause damage. Anyone who has used live trace knows that it makes its best guess at what a shape is, which results in many, many unnecessary and illogical shapes in the document. Not only does the result look “not that great,” but the embroidery machine consistently strains and breaks when it tries to apply the rules of sewing to those shapes.
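One partial mitigation — my own workaround sketch, not something the article describes — is to prune degenerate shapes out of the traced SVG before it ever reaches the PES converter. A minimal stdlib-only Python sketch, assuming the tracer output uses plain `rect` elements (the threshold and element choice are illustrative):

```python
import xml.etree.ElementTree as ET

def drop_tiny_rects(svg_text, min_px=2.0):
    """Remove rect elements smaller than min_px in either dimension.

    Hypothetical cleanup pass for auto-traced SVG: specks below the
    threshold would become impossible stitch instructions downstream.
    """
    ns = "http://www.w3.org/2000/svg"
    root = ET.fromstring(svg_text)
    for parent in root.iter():
        # snapshot children so removal is safe while looping
        for child in list(parent):
            if child.tag == f"{{{ns}}}rect":
                w = float(child.get("width", 0))
                h = float(child.get("height", 0))
                if w < min_px or h < min_px:
                    parent.remove(child)
    return ET.tostring(root, encoding="unicode")

svg = ('<svg xmlns="http://www.w3.org/2000/svg">'
       '<rect width="10" height="10"/><rect width="1" height="1"/></svg>')
cleaned = drop_tiny_rects(svg)
print('width="1"' in cleaned)  # False: the 1px speck is gone
```

This does not fix live trace’s judgment, only its worst debris; the real editorial work is still a human job.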

There is a hole punched clear through my bobbin carrier from an angry needle bucking against an impossible instruction the machine fed to it.

The images created by artificial intelligence show impossible colors and depth, stitches that cannot be reproduced, unnerving hands, and a rudimentary at best concept of how an embroidery hoop closure works.

slider showing an embroidery imagined by AI, and one made by an embroidery machine in real life.
Stable Diffusion Image || Embroidered Final Piece

But artificial intelligence also “gets us” — it collects and synthesizes patterns about the things humans make, because it has learned from our shared history and the photo documentation that supports it. The images it makes show the things we gravitate towards, repeat, mimic, the consistent design elements and choices. It is a bit like listening to a recording of yourself — uncanny, and unnerving.

keep calm and put a bird on it, etc.

I set off on this story hoping to sit back and munch on some popcorn while the machines did all of the hard work, and it took me eight “fine but not great” embroideries before I admitted that the issue wasn’t any of the automation settings I was using. It was the fact that I was using automation to create a pattern.

AI is excellent at understanding patterns, but understanding patterns is not at all the same thing as creating patterns.

✌️ 🦾 🪡 Watch me make mistakes at Sewing Machine Learning.

