AI: An Artist’s Friend or Foe? Discussion with Dan Jeffries, CIO of Stability.ai (Stable Diffusion)

What does AI really mean for artists?

Will artificial intelligence be the friend we’ve always wanted by eliminating monotonous tasks and speeding up workflows? Or will it turn out to be a foe and make digital artists obsolete? Joey sits down with Daniel Jeffries, CIO of Stability AI, to discuss the real-world use cases of artificial intelligence and the moral, ethical, and philosophical ramifications of AI art. 

Show Notes

People

Dan Jeffries

Garry Kasparov

Pieces

The Mandalorian

Avatar (2009)

Tools

Stable Diffusion

Midjourney

Dall-E 2

Avatar AI

Hairstyle AI

Unreal Engine

Deforum Stable Diffusion

Resources

Stability.ai

Imagen

Adobe MAX

Science Saturday: AI enables early identification, intervention in debilitating lung disease

AI Researchers At Mayo Clinic Introduce A Machine Learning-Based Method For Leveraging Diffusion Models To Construct A Multitask Brain Tumor Inpainting Algorithm

Deep Blue (chess computer)

A New AI Trend: Chinchilla (70B) Greatly Outperforms GPT-3 (175B) and Gopher (280B)

Transcript

00:00:15:00 - 00:00:29:21
Daniel Jeffries
A lot of times people are just looking, oh, here's my quick prompt generator or whatever, and that's kind of all they see, and they stop seeing anything else. But when you start seeing it as a giant workflow for concept designers and motion designers, all these things, texture designers, fashion houses, etc., this is when it becomes truly exciting.

00:00:30:15 - 00:00:59:03
Joey Korenman
Unless you've been quite literally living under a rock, you know that AI has finally come to the world of art. DALL-E, Midjourney, and Stable Diffusion are basically household names in the motion design industry, and we are all figuring out in real time how to grapple with the awesome power of these tools. To help figure out what it all means, we invited Dan Jeffries, the chief intelligence officer of Stability AI, the company behind Stable Diffusion, to come on the podcast to talk about their AI model.

00:00:59:17 - 00:01:20:13
Joey Korenman
Dan is a brilliant writer, speaker, and futurist, and has a lot to say about what this technology does and does not mean. We get into the moral, ethical, legal, and even philosophical ramifications of AI and talk about the real-world use cases that are coming soon. If you're a human being trying to grapple with this brave new world, this episode is for you.

00:01:20:16 - 00:01:40:11
Joey Korenman
And if you're listening to this episode, you should know that we recorded this interview with video and we'll be showing things that we talk about on the video in the edited version, which will be on School of Motion’s YouTube channel. So check that out if you'd like to hear and see this podcast episode. And now let's meet Dan Jeffries right after we hear from some of our amazing School of Motion alumni.

00:01:41:02 - 00:01:46:09
SOM Alumni
Hi! Hey School of Motion. Hello there. School of Motion courses offer such a compelling experience.

00:01:46:10 - 00:01:46:25
SOM Alumni
I was now part

00:01:46:25 - 00:01:53:26
SOM Alumni
of this huge community of motion designers from all around the world. Such a tight class and community. They just had a huge impact

00:01:53:26 - 00:01:54:24
SOM Alumni
on my life. Thank you

00:01:54:24 - 00:01:58:19
SOM Alumni
School of Motion. I couldn't have done it without you.

00:02:00:20 - 00:02:08:25
Joey Korenman
Well, Dan, thank you so much for coming on, man. It's really nice to meet you and chat with you. And I have so many questions about this amazing technology, so I appreciate you being here.

00:02:09:11 - 00:02:11:01
Daniel Jeffries
Thanks so much for having me. I really appreciate it.

00:02:11:13 - 00:02:22:00
Joey Korenman
Right on. Well, the first thing I wanted to ask you is about your job title, because I've never heard of a chief intelligence officer. You're the CIO of Stability AI. So maybe you could just explain what that means.

00:02:23:01 - 00:02:47:02
Daniel Jeffries
What's a chief intelligence officer? CIO actually used to stand for chief information officer when you would use that three-letter acronym. In the past, I always joked that my general job in the world is thinking for folks. And I'm very good at thinking through a lot of the steps. So when people want to actualize something, you know, they're not aware that there's something like 40 or 50 steps that sort of have to be done to make something a reality.

00:02:47:03 - 00:03:02:07
Daniel Jeffries
I'm a bit older than I look, and so I've got a lot of templates in my head. I guess I'm a well-trained guy at this point, in that I can kind of look back and go, cool, this is how other companies or other things that I've come across have done it. This is how you build a team, or this is how you implement a policy, or whatever.

00:03:02:15 - 00:03:20:08
Daniel Jeffries
And I've been kind of a jack of all trades, master of none. And so I've got a good sense of business, a good sense of legal, a good sense of writing and communications, a good sense of technology, a good sense of business structure, these kinds of things. So that's kind of where my role fits in at Stability, and that's how I think of it.

00:03:20:21 - 00:03:44:07
Joey Korenman
Well, I'm sure you're navigating lots of landmines at this point. This is a very new technology. And there's this quote I was reminded of by Arthur C. Clarke: any sufficiently advanced technology is indistinguishable from magic. I'm sure this is not the first time someone's thrown this line at you with what you're dealing with. But tools like Stable Diffusion, which Stability has created, and Midjourney, I mean, these things are basically just magic to me.

00:03:44:07 - 00:04:00:12
Joey Korenman
It's inconceivable that they work and I don't really understand it. And I'm an artist and I'm surrounded by artists all the time. There are several PhDs worth of work that have gone into these tools, but is there a way that you can help me and the audience wrap our heads around how these tools are actually doing what they do?

00:04:01:05 - 00:04:21:23
Daniel Jeffries
Well, first of all, it is magic. And look, the interesting thing about artificial intelligence, and maybe this is the best way to think about it, is that in traditional coding you have a developer who thinks about all of the logic and then explains that logic in a very structured programming language.

00:04:21:23 - 00:04:42:09
Daniel Jeffries
They lay out the logic. That's really what it is. If this, then that; while this; while you're waiting for this signal, do this thing, right? It's loops, it's conditionals. It's basically structured logic, right? And that's been very effective for us as a society. Think about the tool we're using right now to record this: that's all traditional hand coding. Think about the browser.

00:04:42:09 - 00:05:07:28
Daniel Jeffries
If you think about all the apps on your phone, the operating system on your phone, any of the apps that pop up, the vast majority of those are traditional, hand-coded logic. What started to happen, though, is that we started to reach the limits of that hand-coded logic and what we can actually do with it. And so think back to the ImageNet competition, which was a giant competition out of Stanford with the wonderful Fei-Fei Li as the professor there, who then went on to Google. That was a contest to do image recognition.

00:05:07:28 - 00:05:35:11
Daniel Jeffries
And if you look at everything before AlexNet, which was one of the first major neural nets to use GPUs and to really push the envelope, that was a lot of people trying to use traditional logic and not getting better than 60 or so percent in terms of accuracy. It's very easy to do something structured like, I need to get a username and password and log in, but how do you describe what a cat is to a computer?
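That login-versus-cat contrast can be sketched in a few lines. This is a minimal illustration, not anything from Stability's code; the function and the accounts dict are invented for the example:

```python
# Hand-coded logic: every rule is explicit, just conditionals.
def can_log_in(username, password, accounts):
    if username not in accounts:           # rule 1: the user must exist
        return False
    return accounts[username] == password  # rule 2: exact string match

accounts = {"ada": "hunter2"}
print(can_log_in("ada", "hunter2", accounts))  # True
print(can_log_in("bob", "guess", accounts))    # False

# There is no equivalent rule set for "is this picture a cat?",
# which is why that kind of problem fell to learned models instead.
```

The point is that every rule of the login check can be written down exhaustively; "cat" cannot.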

00:05:35:23 - 00:05:53:23
Daniel Jeffries
Well, it has fur, except when it doesn't. But what is fur? It has pointy ears, except when it doesn't. And what are pointy ears, right? There's no real way to describe that. And so neural networks come along, and they were lambasted for a long time because we didn't have the data and we didn't have the compute power to be able to do it.

00:05:53:23 - 00:06:13:15
Daniel Jeffries
And it's not like theoretical physics, where you can make a living just writing equations on a board without being able to test them. It's very practical. In machine learning you practically have to do stuff with it, and we just didn't have the data for that. So once AlexNet comes along and all of a sudden crushes the competition, they go from one neural network that takes it to like 80% in that year.

00:06:13:15 - 00:06:29:10
Daniel Jeffries
And then all of a sudden the next year, everyone except one contestant was a neural network, and they pushed it up to like 90-plus. And then the year after that, it was already beyond sort of human capability and they had to shut the contest down. Right. So it was a very quick timeline with these neural nets.

00:06:29:28 - 00:06:51:01
Daniel Jeffries
Sometimes neural nets make people upset, because in traditional AI we want to add all of our own hand-tuned logic. But there's an old joke from Richard Sutton that every time you fire a linguist, the language model gets better. And what we've generally found, in Sutton's bitter lesson as he calls it, is that only scale and search really worked well.

00:06:51:01 - 00:07:08:10
Daniel Jeffries
And so the more we scale up compute, scale up the data, scale up the search, the more we're able to do these things. And so basically the neural network learns the characteristics of the underlying data, and it learns to approximate those things and make predictions about the next bits that are happening.

00:07:08:13 - 00:07:23:04
Daniel Jeffries
Right. And I think some people describe it a bit as spreadsheets on steroids. But you can do a lot of cool things with spreadsheets on steroids. Maybe it's not truly intelligent in the way we think of it. But, you know, we might not think of a squirrel as intelligent either. It's better to think of it this way:

00:07:23:04 - 00:07:41:27
Daniel Jeffries
There are lots of kinds of intelligence. You might not think a squirrel is intelligent in the way that a human is, but try to keep that squirrel out of your garden, and I guarantee you're going to find out that squirrel is pretty smart, right? Because it's optimized to get into your garden. And so I think these things are narrow kinds of tools that are able to do interesting things like drive a boat, or drive a car, or recognize images, or now create images.

00:07:41:27 - 00:08:06:10
Daniel Jeffries
And the diffusion process behind Stable Diffusion is trained on a lot of data. It learns how to add noise to that data and then reverse it. And then it learns to add its own characteristics over time. And there are some other interesting bits underneath the hood that make it happen. But these diffusion models have been particularly fantastic at generative AI, which is now becoming incredibly important in the entire artificial intelligence space.
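The add-noise-then-reverse idea can be shown in a toy NumPy sketch. This is a simplification, not Stable Diffusion's actual training loop: a real diffusion model is trained to predict the noise from the noisy image; here we cheat and keep the true noise, just to show that the blending step is exactly invertible:

```python
import numpy as np

rng = np.random.default_rng(0)
T = 100  # number of noising steps in this toy schedule

def add_noise(x, t):
    """Forward process: blend the image with Gaussian noise.
    At t = T the result is (almost) pure noise."""
    alpha = 1.0 - t / T
    eps = rng.standard_normal(x.shape)
    return np.sqrt(alpha) * x + np.sqrt(1.0 - alpha) * eps, eps

x = rng.standard_normal((8, 8))    # a toy 8x8 "image"
noisy, eps = add_noise(x, t=50)

# A trained model *predicts* eps from the noisy image; generation then
# runs this subtraction step by step starting from pure noise.
alpha = 1.0 - 50 / T
recovered = (noisy - np.sqrt(1.0 - alpha) * eps) / np.sqrt(alpha)
assert np.allclose(recovered, x)   # knowing the noise undoes the blend
```

Learning to estimate that noise at every step is what lets the model "improvise" a new image instead of replaying a stored one.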

00:08:06:27 - 00:08:25:16
Joey Korenman
Thank you, that's a really good explanation. I want to try to dig a little bit deeper, because I want to talk to you in a little bit about the, I guess, potential limitations of this technology and how far it can really go. I've seen videos where there's noise, and it just looks like static, and then it slowly clears up and turns into something that looks like a master painter made it.

00:08:25:27 - 00:08:45:18
Joey Korenman
I mean, are these neural networks basically just learning what the most common pixel color is when I see this pixel color over here? And you just get enough data, and it has no idea what it's making; it just knows that in general this word goes with this arrangement of pixels, and voila, now we have an astronaut riding a horse.

00:08:45:19 - 00:08:47:00
Joey Korenman
Is that how they work?

00:08:47:25 - 00:09:03:09
Daniel Jeffries
I mean, it's not necessarily pixels, in other words. For that, it's sometimes more interesting to look back at things like a convolutional neural network, right, which had layers, and when they would peel back some of what was happening under the hood, you would see that each layer learns a different set of characteristics.

00:09:03:09 - 00:09:31:24
Daniel Jeffries
So one of them might learn about edges, one of them might learn about swirls and blobs of color, one of them might learn about pixels, one of them might learn about different layers of it. And it learns these different characteristics, or what they call features. We might think of features as, well, whether someone has a large nose or a small nose or an aquiline nose, or they might have bright eyes or small eyes, they might have large lips, or they might have a bald head, or they might have a lot of hair on their face.

00:09:31:24 - 00:09:51:03
Daniel Jeffries
But no, for the network it's more granular characteristics of those things: the way these lines work together, the way the colors are adjacent to each other. Those kinds of things are what it starts to learn over time. And there's a similar sort of process, although it works differently, in diffusion, in that it's modeling the adding of noise to the image and then the reversal of that process.
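The "one layer learns edges" idea can be made concrete with a single hand-written filter of the kind early convolutional layers often end up learning on their own. The filter and the toy image below are invented for illustration:

```python
import numpy as np

# A 3x3 vertical-edge detector: negative weights on the left,
# positive on the right, so it fires where brightness changes.
edge_filter = np.array([[-1.0, 0.0, 1.0],
                        [-1.0, 0.0, 1.0],
                        [-1.0, 0.0, 1.0]])

def conv2d(img, kernel):
    """Valid convolution (no padding), written with plain loops for clarity."""
    kh, kw = kernel.shape
    out = np.zeros((img.shape[0] - kh + 1, img.shape[1] - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

# Toy image: dark left half, bright right half.
img = np.zeros((5, 6))
img[:, 3:] = 1.0

response = conv2d(img, edge_filter)
print(response[0])  # prints [0. 3. 3. 0.]: it fires only over the boundary
```

A trained network stacks many such learned filters, so later layers can combine edge responses into swirls, textures, and eventually whole shapes.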

00:09:51:03 - 00:10:08:03
Daniel Jeffries
Right? And it doesn't become a one-to-one where it immediately recreates the image. Although it's kind of doing that during its training, it's essentially learning how to improvise from there. Right. And that's really the important thing, because I think some people especially have tried to characterize it.

00:10:08:03 - 00:10:23:21
Daniel Jeffries
They say it's just this kind of giant database of images or whatever, stitching together a bunch of images, and that this is how it works. It's not searching a database. That's not how it works. And in many ways, it's similar to how an artist learns in life. We just learn slightly differently, but not that differently from machines.

00:10:23:21 - 00:10:39:25
Daniel Jeffries
Like, when I was in school, drawing and painting was one of my first loves. I had a book called Drawing Lessons from the Great Masters, and I learned how to draw all the old pictures from Da Vinci and Raphael and the rest. And there was a lot of analysis in there of the shading and how those lines came together.

00:10:39:25 - 00:11:02:27
Daniel Jeffries
Right. And if you think about when the picture was zoomed in and there was an overlapping of shadows in a particular area, and the way lines come together, that's a feature it can learn. And so it learns a lot of these granular features. It doesn't necessarily learn them on a huge scale; it learns them on sort of a smaller scale, and it learns slightly differently in terms of how it puts those things together.

00:11:03:05 - 00:11:19:28
Daniel Jeffries
We may have sub-algorithms that we don't understand, where we do certain types of pattern matching. Like, I know a face should be at the top and should be connected by a neck and shoulders, and if it's not right, there's something in my brain that intuitively goes, wait a minute, the head shouldn't be floating off in space.

00:11:20:04 - 00:11:37:14
Daniel Jeffries
So sometimes you see those weird kind of Cronenberg images where it hasn't quite learned that underlying 3D geometry, and of course there are the hands. There was a great one with Blade Runner, where they were asking the questions trying to figure out whether it's an AI, and he goes, show me your hands.

00:11:37:14 - 00:11:44:07
Daniel Jeffries
And it has like eight fingers. So those details are challenging for it to learn, but not insurmountable.

00:11:44:12 - 00:12:02:24
Joey Korenman
Yeah, that's funny. I mean, you're making me think, when you learn to draw, one of the things that I think is probably the hardest to teach is to stop seeing objects as the object itself. Don't see it as hands; just look at the shape and the shading and recreate that, and ignore what it is. That's actually kind of a way to get better at drawing.

00:12:02:24 - 00:12:24:11
Joey Korenman
And it seems like that's what these tools are doing automatically. They don't know that it's a hand, obviously, because otherwise they wouldn't put eight fingers on it. That's really interesting. I want to understand why, this year, everything seemed to happen all at the same time. I know you've been talking about this stuff for years, as have other people, but I think up until maybe February or March of this year, it never occurred to me that this was possible.

00:12:24:17 - 00:12:32:21
Joey Korenman
And everyone else I've talked to in our industry felt the same way. So was there some breakthrough? Was it like some new GPU came out? Like, what was it?

00:12:33:15 - 00:13:03:09
Daniel Jeffries
So I don't think so. My philosophy fits very much with Steven Johnson's. If you've ever watched him, he's a fantastic writer. He wrote the book How We Got to Now, and he has a BBC series where he goes through the history of glass, for instance, and how important it's been to society over time, and everything it's been part of: from the screen on your phone, to fiber optic cables moving things across the world, to filtering out certain aspects of sunlight so that you can't actually get a sunburn inside the house while still letting through the visible rays, and mirrors and what they

00:13:03:09 - 00:13:22:07
Daniel Jeffries
meant to the development of consciousness. So there's sort of this hummingbird effect that happens, right, where there are a lot of different developments, and then they have these beneficial effects in an unexpected domain. And so I think what happened was you just had a lot of research being poured into it. And what was interesting is, when you look back at neural nets, they were not doing well for a long period of time.

00:13:22:07 - 00:13:37:01
Daniel Jeffries
There were like one or two universities in Canada that were kind of keeping the flame alive when most people thought these were just toys that were really not going to be able to do anything. There wasn't enough compute, there wasn't enough data, like we talked about before. And because of that, you didn't have a lot of people except the hardcore true believers doing it.

00:13:37:01 - 00:13:53:16
Daniel Jeffries
And they told kids, don't study this, it's a waste of time. So that's why we don't have enough AI researchers now, and people are scrambling to learn this stuff. But then you started getting things like ImageNet, right? And then you start saying, wait a minute, GPUs are very good at doing matrix math. In other words, instead of it being these kind of old-school processors where we had four or five cores.

00:13:53:25 - 00:14:09:08
Daniel Jeffries
It's actually better when you have thousands of little cores in there and you can spread the math across them like a giant spreadsheet, right? So that started to happen. And then you had companies like Google start hiring those researchers out of academia, and there were a bunch of breakthroughs where they were training up neural translation.
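The "spread the math across thousands of little cores" point comes down to the fact that every cell of a matrix product is independent of the others. A loop version makes that visible; this is an illustrative sketch only, not how a real GPU kernel is written:

```python
import numpy as np

def matmul_loops(A, B):
    """Matrix product written as scalar loops. Each out[i, j] depends
    only on row i of A and column j of B, so all m*n cells could be
    computed simultaneously by separate cores, which is roughly
    what a GPU does with the same computation."""
    m, k = A.shape
    _, n = B.shape
    out = np.zeros((m, n))
    for i in range(m):
        for j in range(n):
            for p in range(k):
                out[i, j] += A[i, p] * B[p, j]
    return out

rng = np.random.default_rng(1)
A, B = rng.random((4, 3)), rng.random((3, 5))
assert np.allclose(matmul_loops(A, B), A @ B)  # same answer either way
```

Neural-network training and inference are dominated by exactly this kind of operation, which is why hardware with thousands of small cores suddenly mattered so much.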

00:14:09:12 - 00:14:25:10
Daniel Jeffries
Right. And they trained up neural translation, which suddenly made Google Translate much better in one huge leap. Before that, they had a rule-based, hand-coded system. And then they had to develop their own kind of processor, the TPU, to deal with that. And then Nvidia saw it and was like, wait a minute, we want to be in this space.

00:14:25:10 - 00:14:43:07
Daniel Jeffries
So they started pouring their huge budget and R&D into it, and they made huge leaps. So you have this perfect storm: the Internet wells up with lots and lots of data, you have all these chips getting smarter, and Nvidia going from a pure gaming company to one focused on artificial intelligence.

00:14:43:20 - 00:14:59:05
Daniel Jeffries
And then you have more and more people getting into it, pouring research dollars into it, primarily out of all of the tech companies, etc. But then you start to get these independent research labs. You get folks like OpenAI that pour sort of billions of dollars into this kind of pure research, and they start doing stuff that you just couldn't have done years ago.

00:14:59:05 - 00:15:15:26
Daniel Jeffries
You couldn't have taken a cluster of 10,000 GPUs, put them together, and gotten the data to do it. It would have been inordinately expensive. So I think the compute expanded, plus the pricing came down, because we started to have the data, and plus we had interest in it, because we were starting to see little breakthroughs.

00:15:16:01 - 00:15:32:10
Daniel Jeffries
There's an old joke that when something works, we don't call it AI anymore. So when you talk to your phone and it understands what you're saying, you just call it voice recognition. That's artificial intelligence, right? We just don't call it that anymore. Remember what I said earlier? We reached the limits of what we could do with traditional coding.

00:15:32:10 - 00:15:48:13
Daniel Jeffries
There are some things that we're probably always going to do that way, but there's a whole bunch of things that we simply cannot solve with traditional coding. It doesn't matter if we put a trillion dollars into it and a million developers; we likely would not be able to solve many of the challenges that we can solve with artificial intelligence. And that's sort of what happened.

00:15:48:13 - 00:15:56:23
Daniel Jeffries
And so I think you just had a lot of these little building blocks kind of waiting to be put together, and it was just time for it to make the leap.

00:15:57:06 - 00:16:21:03
Joey Korenman
So it just sort of bubbled up underneath the zeitgeist, and then it cracked the surface and everyone saw it. I'm fascinated. I follow a bunch of, you know, indie hackers and entrepreneurs online, and it's been amazing to me how quickly little micro tools have spun out, built on top of these platforms. I mean, there's Avatar AI, where you upload like 20 photos of yourself and it makes different styles of you that you can use as your profile picture.

00:16:21:03 - 00:16:38:23
Joey Korenman
There's another one that just popped up which is near and dear to my heart, which is a hairstyle generator, where it takes your face and puts hair on it. I mean, I think we could both benefit from that one. So, one of the big differences with Stable Diffusion, though, and I'm curious why this decision was made, is that it's open source.

00:16:38:29 - 00:16:52:02
Joey Korenman
So DALL-E and Midjourney had sort of demo periods, but now you have to pay to use them. Stable Diffusion is open source, so you don't have to pay. So why was that decision made, and what does that mean in terms of a business that's raised a lot of money?

00:16:52:29 - 00:17:11:07
Daniel Jeffries
Well, I mean, I think we've seen a lot of success with the open source movement in the past. I was at Red Hat for a decade, so I've seen it firsthand. I started there when there were 1,400 people or so, and I was there for ten years, so it grew much larger over time. In the early days, we were trying to explain why Linux wasn't communism and wasn't going to destroy the world.

00:17:11:07 - 00:17:30:27
Daniel Jeffries
Right? And now, when you think about it, it seems almost crazy, because Linux literally runs everything. It runs all the public clouds, including Microsoft's own business. If Microsoft had been successful in crushing it in the early days, they would have shot their future business in the foot, right? Because now it powers a great deal of the Azure cloud, it powers nuclear subs, it powers the whole world.

00:17:31:03 - 00:17:50:20
Daniel Jeffries
Almost all the world's supercomputers, the routers that are in your house, edge devices; it powers Android. It's just everywhere. And because it's ubiquitous, people are able to take it, bend it, break it, and think about it in a new way. So when you put something out that way, it allows the creative forces of a lot of different people to work on it.

00:17:51:00 - 00:18:05:23
Daniel Jeffries
The challenge is when you just keep it behind an API and you kind of strictly say what people can and can't do with it, you're going to have to think of everything that they could do with it, right? Because there's just an infinite pool of creativity out there. And no matter how smart your team is, you're not going to think of everything.

00:18:05:24 - 00:18:26:25
Daniel Jeffries
And so what you end up with, with DALL-E and Midjourney, and Midjourney is looking fantastic with the new version, and OpenAI is amazing, they do incredible work, I love them, but you know, there's really only like a painting program out of both of those. But when I saw people using Stable Diffusion, what it really showed is that there's this pent-up demand from people who aren't just traditional coders or great technologists.

00:18:26:25 - 00:18:47:24
Daniel Jeffries
It's people who are dabblers, like artists with a technical bent, whatever, who all of a sudden could take a fully fledged, powerful model and figure out ways to use it that nobody else had. So I've seen synthetic brain scans developed, synthetic lung scans developed, right? I was just reading about the Mayo Clinic training up a diffusion model to basically generate synthetic lung cancer data.

00:18:48:00 - 00:19:04:20
Daniel Jeffries
They weren't using Stable Diffusion, but they trained up a diffusion model, and they probably got the idea from the fact that Stable Diffusion is out there, unlimited and so powerful. Right. And why is that important? Because of being able to get that data. Remember I said the data is really important? Sometimes, when you talk about medical data, there are a lot of challenges to getting that data.

00:19:04:20 - 00:19:27:19
Daniel Jeffries
Maybe you don't have enough of it, right, because you just don't have enough instances that are unique for the model to be able to learn the features of cancer so that it can do better detections. I saw people doing AR stuff, like showing wallpaper suddenly in a room so that they could look around, right? I've seen fantastic artists building it into their workflow. And I understand the kind of existential scare that people sometimes feel about job uncertainty and these kinds of things.

00:19:27:19 - 00:19:53:28
Daniel Jeffries
We live in this world, but my sense of it is that it's a tool, and it's just going to be worked into the workflow. You still need the human intelligence. And the perfect example: I saw a fellow, an awesome illustrator, who did the underpainting of a cool frog and then put it through image-to-image, and it generated an awesome-looking frog. Then they pulled it into Photoshop and painted glasses and a different outfit on it, put that back through image-to-image, did 20 iterations of that, and then went, oh, wait a minute, that one's cool.

00:19:54:06 - 00:20:07:12
Daniel Jeffries
I'm going to pull that one in and mess with it, right? So it's just rapid iteration. And I've always thought these creative tools, when I look at them, are going to be just like that, right? Someone's going to be playing the guitar, and they're going to feed it to the machine, and they're going to go, give me 20 continuations of that.

00:20:07:12 - 00:20:20:22
Daniel Jeffries
And you go, wait a minute, number 15 is awesome. Let me hear that one again. Okay, number 15, give me more of that, give me a hundred iterations of that. Okay, that's cool. Although, wait a minute, I'm going to play this here, right? So it's going to be this kind of centaur, as we call it.

00:20:21:02 - 00:20:36:10
Daniel Jeffries
It's really a collaborative tool. And the centaur idea really came out of Garry Kasparov, after he lost to Deep Blue in chess. Right. And that was sort of this big moment where we thought that was going to be the breakthrough, but it turned out to actually just be kind of brute-force search.

00:20:36:10 - 00:20:54:05
Daniel Jeffries
So it wasn't very intellectual, right? But it was still impressive. Now, Deep Blue was very impressive, but it wasn't true artificial intelligence yet, which is why we didn't see the breakthrough starting then. But he held a contest the next year where you could enter as an AI, a person, or an AI-human combo. And the AI-human combo ended up winning.

00:20:54:12 - 00:21:11:02
Daniel Jeffries
And it was really a collection of a couple of expert players, not even grandmasters, playing with an AI, that beat the grandmasters and beat the grandmaster-level artificial intelligence program. And so that's, to me, what artificial intelligence is about for a huge chunk of time. It's not that it won't ever do some types of jobs.

00:21:11:06 - 00:21:31:06
Daniel Jeffries
Right. We do sometimes replace or change jobs throughout history. But again, I don't know that anybody is calling for us to go back to whale-oil lamps, to slaughtering giant sperm whales and digging the gunk out of their heads, just because we're afraid of the electric light. So I think the vast majority of things that we have here are really about these kinds of tools and workflows.

00:21:31:06 - 00:21:51:03
Daniel Jeffries
And I see it's going to speed up and it's going to create new kinds of workflows for artists. A lot of times people are just like, Oh, here's my quick prompt generator or whatever, and that's kind of all they see and they stop seeing anything else. But when you start seeing it as a giant workflow for concept designers and motion designers, all these things, texture designers, fashion houses, etc. this is when it becomes truly exciting.

00:21:51:20 - 00:22:16:06
Joey Korenman
Yeah. I want to get into in a little bit the philosophy behind all of this, because I think you're exactly right. Like my take is that the obvious use case for this, especially in our industry, is generating concepts and art direction. One of the things I would do is I'd get on Google Image Search and if I'm doing a commercial about whatever pizza, I'll type in pizza and you'll see a thousand images of pizza in different styles and it triggers something and I think that this is just another way to do that.

00:22:16:21 - 00:22:31:21
Joey Korenman
But there are some, I think there are some differences with this versus, I don't know, say you're a 3D artist and you're modeling a scene and you need a model of a guitar. Well, you could take a week and model the guitar, or you could just buy a model that someone else has made.

00:22:31:29 - 00:22:44:14
Joey Korenman
Is that cheating? Does that cheapen what you've done? I don't think so. But if you didn't really have to do anything to get the image other than imagine it and type it in, does that cheapen it? I don't want you to answer that yet; I want to save it, because I think that's going to be fun to get into.

00:22:44:15 - 00:23:13:12
Joey Korenman
I want to hear a little bit more about where this technology is going. You brought up some really cool use cases for creating artificial data that can be used, I'm assuming, to train humans and also to train other AIs. But specifically thinking about image generation: the fidelity just gets better and better, but there are certain things, like you mentioned hands, and I also notice eyes seem to give these tools trouble sometimes, or if there are multiple people in the scene, sometimes they sort of intersect in weird ways.

00:23:13:12 - 00:23:24:27
Joey Korenman
What does it take to fix all of that stuff? Is it just a matter of more data, or is it possible that there's some limitation we'll bump up against that really can't be overcome through this method?

00:23:24:28 - 00:23:45:22
Daniel Jeffries
I would say it's probably more data and better data. Like, if you look at the Chinchilla paper, that was sort of a giant breakthrough, where they were able to essentially say that if you make the data a lot better, it actually improves the performance of the model a tremendous amount. And you're starting to see this kind of focus on data quality, or what they call data-centric AI.
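The Chinchilla finding Daniel mentions is often summarized as a compute-optimal rule of thumb of roughly 20 training tokens per model parameter. A minimal sketch of that back-of-the-envelope math (the function name and the 20x simplification are my framing, not anything from Stability):

```python
def chinchilla_optimal_tokens(n_params: float) -> float:
    """Rough compute-optimal training-token budget per the Chinchilla
    rule of thumb: about 20 training tokens per model parameter."""
    return 20 * n_params

# A 70B-parameter model (Chinchilla's size) wants on the order of
# 1.4 trillion tokens; a 175B GPT-3-sized model would want about 3.5T.
print(chinchilla_optimal_tokens(70e9))   # 1.4e12
```

The point of the rule of thumb is that many large models were undertrained on data relative to their parameter count, which is why a smaller model trained on more (and better) data can outperform a much bigger one.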

00:23:45:22 - 00:24:03:10
Daniel Jeffries
Right. And this idea that we spent all this time figuring out the models and tweaking the models and experimenting and doing all these things, and there was noise in the data. Like, say you were trying to create a model that can understand what I'm saying in the car: maybe you only had 10% of samples where the kids were yelling, are we there yet,

00:24:03:10 - 00:24:32:15
Daniel Jeffries
and the music was too loud and there was wind noise in the background. So every time that happens, it's particularly bad at understanding, right? The typical approach in the past would have been to go tweak the algorithm and tweak the model a bunch until you could overcome the deficiency in the data. The answer now would be to go back and augment or enhance the data: maybe purchase a lot more sounds from a sound database or whatever, a database of wind and music and kids babbling and car noises or whatever.
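The augmentation Daniel describes, programmatically overlaying background noise onto the clean samples you already have, can be sketched like this (a toy numpy illustration, not Stability's actual pipeline; the function name and the target signal-to-noise framing are my assumptions):

```python
import numpy as np

def mix_at_snr(clean: np.ndarray, noise: np.ndarray, snr_db: float) -> np.ndarray:
    """Overlay a background-noise clip onto a clean clip at a target SNR (dB)."""
    # Loop or trim the noise so it matches the clean clip's length.
    if len(noise) < len(clean):
        noise = np.tile(noise, int(np.ceil(len(clean) / len(noise))))
    noise = noise[: len(clean)]
    # Scale the noise so the clean/noise power ratio hits the target SNR.
    clean_power = np.mean(clean ** 2)
    noise_power = np.mean(noise ** 2)
    scale = np.sqrt(clean_power / (noise_power * 10 ** (snr_db / 10)))
    return clean + scale * noise
```

Running each clean clip through this at a few different SNRs and with different noise types (wind, music, babble, engine) multiplies the dataset without recording anything new.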

00:24:32:26 - 00:24:49:25
Daniel Jeffries
And I could then programmatically take the samples that I already have and make new samples where I've added that to the background noise. If I wanted to train something to detect, say, boat crashes visually, I'm not going to have a lot of footage of boat crashes, but I could probably create it with Unreal Engine or, again, by augmenting with some of the engines that are out there.

00:24:49:25 - 00:25:13:26
Daniel Jeffries
Right. And so that kind of creates that. So you want to have better data. I've been advocating for a hand database, because I do find it kind of funny: hands were some of the hardest things to draw as an artist, for myself when I was a kid and for almost every artist. Like you said, you have to break it down to the geometry and shading level and forget that it's a hand and just see the intersection between the fingers and how far apart they are.

00:25:13:26 - 00:25:29:29
Daniel Jeffries
This distance versus that distance, and there's no real edge, so you don't draw a line around it; it's all these layers. So there are all these intricate things you have to learn, and I think it's the same with artificial intelligence. So it's actually not surprising that two of the most expressive parts of the human body, right?

00:25:29:29 - 00:25:47:17
Daniel Jeffries
The eyes, the windows to the soul, and the hands are incredibly challenging, because where our eyes are pointed, whether I'm looking intently at you, or looking off into the distance, or looking up and to the right thinking about something or whatever, has to match the pose, and you understand that intuitively because you understand human mechanics.

00:25:47:29 - 00:26:03:07
Daniel Jeffries
The hand is also the most flexible thing. You have all these digits that can move independently, right? They can come together in a fist, they can open up, they can do one finger pointing, they can touch like this, right? They can make all these sorts of symbols. And then it's attached to this arm, which is kind of free-floating.

00:26:03:07 - 00:26:18:20
Daniel Jeffries
It can be in all these different shapes and convey all these different expressions. And so I think the challenge is that artificial intelligence hasn't quite understood that level of flexibility and that level of expression. And it's something that, with a larger set of data, it would probably learn over time. Is it ever going to be perfect?

00:26:18:20 - 00:26:38:12
Daniel Jeffries
No. And sometimes, actually, this is the thing you hear from people who are like, hey, AI is stupid and it'll never achieve true intelligence. But usually the benchmark is human intelligence, with a lack of understanding that there are many types of intelligence. Again, we don't think of a squirrel as human-level intelligent, but again, try to keep it out of your yard, right?

00:26:38:12 - 00:27:00:02
Daniel Jeffries
And there are many types of intelligence that are perfectly functional and useful. We don't need human-level intelligence; I couldn't care less, though it's great if we ever get there. In the short term, there are millions of applications for narrow, kind of not-too-smart AI. And so you don't need this kind of perfection. Like, if you think about something like a self-driving car: okay, we kill 1.2 million people on the road every year, and 50 million people are injured.

00:27:00:12 - 00:27:17:25
Daniel Jeffries
So, you know, when a Tesla crashes, the media is like, look at this evil technology, it's destroying everything, it'll never work. And, okay, yeah, but during that same time as that one Tesla crash, there were like 10,000 people getting injured, right? So you look at the machines: are they ever going to be perfect?

00:27:17:25 - 00:27:37:12
Daniel Jeffries
No. There's no human that's perfect either; humans are actually tremendously terrible drivers. It's something that requires perfect concentration for an extended period of time, which is something we don't do very well, right? So the cars don't need some higher level of intelligence to be able to deal with 99.9% of the scenarios they're going to face.

00:27:37:12 - 00:27:59:29
Daniel Jeffries
And in fact, Waymo talked about this. They essentially said, look, the cars are 90% of the way there, and the last bit is the hardest part. What they're really looking for is these weird edge cases: somebody jumps out into the middle of the road, or there's something that looks like a barrier but isn't, a piece of trash that's not actually a solid thing that's going to cause a crash, or something that is solid that could cause you to veer off the road.

00:27:59:29 - 00:28:14:09
Daniel Jeffries
These are the kinds of weird edge-case situations that we're used to dealing with. You go down a road and it's closed off: are you allowed to back up, or drive over the sidewalk? These kinds of decisions are very challenging. So those are the edge cases, and I would agree that they're challenging to get to.

00:28:14:16 - 00:28:31:23
Daniel Jeffries
But if these cars, and maybe you're just thinking about trucks driving on the highway or whatever, are able to reduce the number of fatalities by half, for instance, when does this get good enough? And again, to bring this back to our tools and away from the life-and-death stuff: right now, it can be awesome for prototyping.

00:28:31:23 - 00:28:49:02
Daniel Jeffries
It doesn't even need to be a production-level thing where you automatically get a perfect asset at the end of it, right? It just has to be good enough for, okay, I want to iterate on a prototype and get all these ideas, and, oh wait, that was really cool, and now I can apply my artistic skills to make it even better and get it to finished production.

00:28:49:12 - 00:28:53:00
Daniel Jeffries
That's fantastic to me. Even if it never gets beyond that, it's still really cool.

00:28:53:16 - 00:29:10:24
Joey Korenman
I wonder how big of an effect the Terminator series had on the public perception of these things, because there's definitely a fear baked in of giving a machine too much control. And as an artist, I totally understand it too. So we talked about certain things that are going to be really hard to ever get perfect.

00:29:10:24 - 00:29:23:28
Joey Korenman
And I've also seen workarounds for this, where maybe if you get an image out of Stable Diffusion and the eyes are messed up, there's some other AI tool that's really good at eyes and you run it back through that, and now the eyes are fine. I know a lot of people are doing that.

00:29:24:11 - 00:29:29:13
Daniel Jeffries
It becomes a workflow, right? A higher-level workflow where you're chaining it through a series of tools to get a particular effect.

00:29:29:22 - 00:29:44:16
Joey Korenman
Yeah, I mean, I've seen people do things like upload 20 images of themselves, train the model on that, and then get a bunch of options of, you could look like this as an anime character. And then they run that through a thing that cuts out the background. And it all happens in a few minutes, and I think that's really powerful.

00:29:44:18 - 00:30:04:12
Joey Korenman
I gave a talk earlier this year at a conference in Chicago to a bunch of motion designers, mostly 3D artists, and all of this stuff had just started to really pop in the public consciousness. And I was showing all these images, these beautiful renders, and then I said, well, it's okay, because none of these things can animate anything yet.

00:30:04:17 - 00:30:23:07
Joey Korenman
But then the next slide was: some people have been using, and I don't know what the terminology is, some program or application built on Stable Diffusion called Deforum that was actually creating animations using the tool. And there's this weird quality to every single one I've seen. It's like a fever dream.

00:30:23:12 - 00:30:42:03
Joey Korenman
There's this coherence problem where things pop up and they don't quite seem right. And my intuition is that the nature of the way this tool works is causing that, and I'm sure it's going to get better, but right now it's way off. Like, if you wanted a photorealistic shot of something, we're not very close to that.

00:30:42:03 - 00:30:59:01
Joey Korenman
I think if you want something really cool looking, we're there, and if you want to do a music video, you could use it. But how much of a challenge is it to get from where we are, where you can have a photorealistic image, to 24 frames a second of a shot that looks indistinguishable from something shot with a camera?

00:30:59:16 - 00:31:01:25
Joey Korenman
How far off are we from that? And how difficult is that?

00:31:01:25 - 00:31:18:00
Daniel Jeffries
I don't know that it's necessarily difficult. I think it's a matter of the researchers focusing in on it with the correct dataset and the correct approach to the problem. I talked to a company, for instance, that has image data where they have the beginning of the image, the transformations the artist did to it, and then the final rendition.

00:31:18:00 - 00:31:39:17
Daniel Jeffries
So we're talking about potentially training up a model that's able to look at those transitions and then approximate those kinds of transitions. That could be a path to doing those kinds of things. If you think about the way animators worked in the past, it's funny, because I think the maker of Ghost in the Shell said you can't even make an animation the way we used to, where you had everyone hand-drawing everything; people won't even do it.

00:31:39:17 - 00:32:03:01
Daniel Jeffries
They don't even want to start it, because it's a slow, boring process, right? So you would have these keyframe animators who would draw, say, Mickey Mouse like this and then like this, and then you'd have someone draw all the interim steps in between. So my sense is that you're probably going to start to see breakthroughs where people are able to think about it from a keyframe perspective and feed the machine various types of keyframes.
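The in-betweening workflow Daniel describes, keyframes plus machine-generated interim frames, reduces in its crudest form to interpolation between two keyframe arrays (a toy linear sketch; a learned model would produce far more plausible motion than a blend):

```python
import numpy as np

def inbetween(key_a: np.ndarray, key_b: np.ndarray, n: int) -> list:
    """Generate n interim frames by linearly blending two keyframe arrays."""
    # t sweeps from 0 to 1; drop the endpoints, which are the keyframes themselves.
    return [(1 - t) * key_a + t * key_b for t in np.linspace(0, 1, n + 2)[1:-1]]
```

Fed two pixel arrays, this just gives a cross-dissolve; the research bet is that a model trained on real animation learns the actual motion in between instead of a fade.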

00:32:03:01 - 00:32:22:15
Daniel Jeffries
And then I think it will actually be relatively quick at learning to approximate the movements in between those. So I do think we get to animation relatively quickly. We're working on some animation stuff, and I know some other folks are too. Like you said, right now a lot of it looks like a fever dream, where it grows a head on the back of the head in one frame and that sort of thing.

00:32:22:15 - 00:32:38:06
Daniel Jeffries
And I think we'll look back on that early AI aesthetic, but I do think we get to the point where we're able to do video and imagery and these kinds of things in the not-too-distant future. And again, I think in the same way, it's really just going to enhance the workflow of what we do.

00:32:38:06 - 00:32:58:29
Daniel Jeffries
Right. For instance, I wrote a blog post 15 or 20 years ago where I said, look, video games and film will converge, and they'll converge when video game engines get to the point of photorealism and near-perfect physics. And I think we're almost there now, probably, with Unreal Engine 5.

00:32:58:29 - 00:33:18:27
Daniel Jeffries
Right, or very close. And by the way, they're using the same assets for The Mandalorian and the PlayStation; it's just showing a different amount of detail by doing mathematical approximations. So even if it's Unreal 6 or 7 or whatever that gets us there, eventually you reach a point where, what more of an engine do you need? Essentially, you can already do photorealism, or any style, and it can do these kinds of physics.

00:33:18:27 - 00:33:33:21
Daniel Jeffries
So now, is that a bad thing? Well, I don't know. Avatar cost, what, $340 million or whatever, and that's out of the range of the average person. But what if we get to a point where the average person out of film school, with a great creative team, could do it for 20? We could do an Avatar for $20 million.

00:33:33:28 - 00:33:54:09
Daniel Jeffries
What kind of explosion of creativity do we get at that point? And then what kind of explosion of creativity do you get when people are able to create some kind of storyboarding system, where they do a bunch of prompts to create storyboards based on their ideas and their text, and then from that they're able to craft out a rough approximation of it, or whatever, that artists can take from there?

00:33:54:09 - 00:34:08:21
Daniel Jeffries
And then it reduces it to five or ten million dollars. What that means is there's more creativity. It means there are more options, and there are a number of folks who are able to break through who you would never have seen, because they wouldn't have gotten past the old gatekeepers. And to me, whenever you're enabling more creativity, that's a good thing.

00:34:08:21 - 00:34:32:16
Daniel Jeffries
Yes, that means you're going to get a higher volume, and that means there's going to be a lot of crap to wade through. There's already a lot of crap to wade through, right? So from my standpoint, it's never a bad thing when you're enabling more creativity, when you're bringing down the barriers to entry. And these tools also allow people to express themselves in new ways, in the same way that the brush and certain kinds of paint don't replace the pencil.

00:34:32:16 - 00:34:34:08
Daniel Jeffries
They just offer a new type of expression.

00:34:34:17 - 00:34:54:13
Joey Korenman
Let's talk a little more about this animation thing. So, the way that Stable Diffusion and DALL-E and Midjourney create images, there's no underlying 3D scene or 3D model. But I have seen other things pop up on Twitter and other places where, okay, here's an AI that makes geometry: you type in "plane" and it gives you a plane.

00:34:54:15 - 00:35:13:03
Joey Korenman
And then there's this other technology, I don't know if it's related, but, you know, neural radiance fields, where you can get these 3D approximations of a scene with the lighting and the reflections. Do you think it's really the combination of all of these things that's going to turn into text-to-90-minute-action-movie-starring-Tom-Cruise, as opposed to text-to-image, which is what we have now?

00:35:13:17 - 00:35:32:15
Daniel Jeffries
I don't think the technology even needs to be super advanced. In other words, we absolutely get to generating 3D models of stuff from text in the very near future; the work is already being done on it. We have a number of groups banging on our door with datasets and commercial partnerships to do that, and the researchers are chomping at the bit to do it.

00:35:32:15 - 00:35:49:15
Daniel Jeffries
We're not the only ones; there are plenty of folks out there already researching it. The stuff that's come out of Google, the stuff that's come out of Nvidia, where they're able to approximate point clouds and then smooth a mesh over the top of it and then give you real information that you can port into any kind of 3D program. Whether it's, I don't know what the kids are using today, but back in the day I used Maya.

00:35:49:15 - 00:35:50:26
Daniel Jeffries
Are we still using that? I don't know.

00:35:50:26 - 00:35:53:12
Joey Korenman
It's out there. The kids like Blender these days. Blender's open-source.

00:35:53:27 - 00:36:16:20
Daniel Jeffries
Okay. Yeah. So the tools change, but they just get a little bit better over time. So look, that's actually going to happen. Now, take the tools we have now and just max out how far they can go based on the technology, and that's what I always do as a futurist.

00:36:16:21 - 00:36:42:13
Daniel Jeffries
I do kind of a, I don't know, probability analysis in my head of the major branches, right, that float out into the future. And if I do a maximization of 3D image generation, or sound generation, video generation, etc., what I don't get to is: describe the movie, and it just makes it in completely photorealistic fashion and tells the story and sets the cinematic direction.

00:36:42:17 - 00:36:44:09
Joey Korenman
With the John Williams score and the whole thing.

00:36:44:22 - 00:37:05:16
Daniel Jeffries
Yeah, like that. I don't see the technology, even if it's fully maxed out, achieving that. But I do see all those tools very much changing the way that film and streaming and television and illustration work, right? In the same way that, when we look back at Jurassic Park, there's a great documentary on it where they're like, you can't use digital stuff, that doesn't make any sense.

00:37:05:16 - 00:37:20:15
Daniel Jeffries
Practical effects are the only thing that'll work. And then they had these two folks who were like, no, we absolutely could do this, and they were able to make the T. rex, working overtime, figuring it out on their own. And all of a sudden they were doing digital film editing, right?

00:37:20:16 - 00:37:42:25
Daniel Jeffries
Back when people were still literally cutting and splicing things together, they were like, oh, this is the way everyone's going to do it, right? That's what Spielberg was seeing at the time. If you look at the media at the time, they were like, that's crazy, why would anyone do this? And of course, now it's incredibly rare for anyone to use physical film. And you have seen a resurgence of practical effects, or actually the merger of practical and digital effects now.

00:37:42:25 - 00:37:58:02
Daniel Jeffries
Right. You see some of that stuff with The Mandalorian and Disney, where they put up those giant screens that are able to shift and move with the actors. So they're no longer just in a suit with little dots all over them, talking to a green screen and trying to pretend there's a dragon, right?

00:37:58:06 - 00:38:15:20
Daniel Jeffries
There would have to be some unobtainium-level breakthrough, which I don't see on the horizon yet, I don't see it in the research, I don't see it as a possibility, where I would be able to, even if I wrote it out in a 20-page treatment, have it then generate the next Mission Impossible 55.

00:38:15:21 - 00:38:35:15
Daniel Jeffries
I don't see that happening yet. I don't know what the technology would be that would get us there. But I think that cuts to the heart of what people worry about with this kind of thing, that we would suddenly be able to do it at that level. If we were able to do it at that level, we'd have a very different level of intelligence in artificial intelligence than I see in the pipeline currently.

00:38:35:26 - 00:38:54:09
Joey Korenman
I mean, that is what artists worry about, and I think this is a good segue to talk about this stuff. So I've heard people pontificate on the internet, as they do, that exactly that, the situation I described: you're going to be able to type in, I want a movie with dinosaurs in Lakewood Ranch, Florida, with a hurricane outside.

00:38:54:09 - 00:39:09:28
Joey Korenman
And our hero, Dan Jeffries, is fighting, and it'll create it. And, I don't know, I will say I was totally wrong about the potential of these image generation tools. If you'd asked me a year ago, will you be able to type in a sentence and it gives you this high-fidelity thing, I would have said no.

00:39:09:28 - 00:39:30:16
Joey Korenman
And I was wrong. But the thing I still feel pretty strongly about is that even as incredible as these tools are, they're only as powerful as the concepts and the ideas of the artist who has them, right? And so when I gave this talk, I was in a room with a lot of people who make their living producing work, and it's going to get easier and easier to produce that work, right?

00:39:31:06 - 00:39:58:11
Joey Korenman
So the value of that work, maybe eventually even the monetary value of a 3D modeler, for example, is going to be lower, because these tools just make it easier; they democratize it. So then I'm curious, and you're an artist too: what is the important thing artists should be focusing on, knowing that in maybe five years it's going to be a lot easier to be, say, a texture artist, which used to require really specialized knowledge?

00:39:58:12 - 00:40:11:03
Joey Korenman
Now you're just going to go out with your iPhone, point it at the texture you want, and it's just going to make it. So from your perspective, projecting out, what won't these tools be able to do? What's the thing that artists should be focusing on?

00:40:12:06 - 00:40:25:06
Daniel Jeffries
I think any artist should just be thinking of it as another tool that they get familiar with, like a paintbrush or a camera or anything else, right? And the problem is that the media has really treated it unfairly, and I think some folks have been, understandably...

00:40:25:06 - 00:40:26:21
Joey Korenman
Media doesn't do that. Damn that.

00:40:26:27 - 00:40:42:20
Daniel Jeffries
Look, we have a long history of the way it's been portrayed. It's either been killer robots or the end of the world, right, or this existential thing, or the death of all jobs. I've even written about it in my own short stories, like 25 years ago: robots doing this kind of thing.

00:40:42:20 - 00:41:06:09
Daniel Jeffries
But I think it's a fairly tired theme at this point, and I think it's also, unfortunately, created this kind of weird existential fear. And when I think about it, I don't really worry about sentient robots, because humans don't have any trouble being jerks on their own. So what I worry about sometimes is narrow AI in the hands of evil, selfish jerks, directing a thing that has no consciousness, right?

00:41:06:21 - 00:41:24:29
Daniel Jeffries
So I don't worry about the AI rising up and taking over, and I don't worry about this kind of mythical death of all jobs thing. First of all, we've already recreated all the jobs multiple times in history. You didn't hunt the water buffalo and tan the leather to make your clothes today; you didn't grow your own food, you hunted it down in the grocery store.

00:41:24:29 - 00:41:44:07
Daniel Jeffries
And there was a whole series of folks who were feeding that grocery store, and then you were able to do other things, like host podcasts. So I think the jobs do change over time, and I think it would be disingenuous to say that they won't change at all. But in the same way, if you're a radiologist, you should know that artificial intelligence is going to assist in terms of radiology.

00:41:44:07 - 00:41:58:24
Daniel Jeffries
So you should be comfortable with that and utilize it, but still develop your own intuition, your own critical sense of it. When I think about the fact that we're not that far away from large language models being able to crank out an article in the style of Dan Jeffries? I don't know. I don't really worry about it.

00:41:58:24 - 00:42:18:02
Daniel Jeffries
I'm still Dan Jeffries. I'm still going to be able to create my articles. I still have my thoughts and my experience and the things I'm doing in my life that are going to inform the things I want to create as an artist. I've been doing it for 20 years; I don't care if an AI is doing it too. And in fact, I'll probably use that tool in the same iterative way, to say, wait a minute, I'm really having trouble writing the conclusion of this.

00:42:18:02 - 00:42:38:14
Daniel Jeffries
Like, go look at the story that I'm on my fifth draft of and give me a better ending. Okay, that's pretty cool; I'm going to change this and this, and it gives me a new idea to move forward with. So I'd say just embrace the tools. Don't fall into this fallacy that the AI is this inhuman thing that's going to take over the world or kill everyone or whatever.

00:42:38:21 - 00:42:58:07
Daniel Jeffries
It's really just a science fiction story, and science fiction stories are about conflict. Stories are about conflict. When the media portrays this, it's always about the conflict, because fear drives clicks, right? And even the awesome, uplifting news site I read, there are like 20 million people on it and I forget the name of it, but even that one.

00:42:58:07 - 00:43:16:06
Daniel Jeffries
When I go on it, the uplifting stories are about a puppy dog being saved and a new cure for cancer, and you go, oh, these are great stories, and you read them for about two minutes. And then, human nature, I go right back to reading The Economist, and it's all the end of the world and the economy crashing and wars. I think we are, by our nature, in many ways a sort of fear-based creature.

00:43:16:09 - 00:43:41:22
Daniel Jeffries
Fear animates a lot of things: creating art and building cities and farming, these kinds of things. But there are other emotions, right? Joy is the true creativity of art. And I think as long as people are willing to evolve, and understand that nobody's thing is fixed in time, that we want to be in a state of thriving and growing, and that to do that means adapting to the new things available to us and embracing them.

00:43:42:00 - 00:43:55:11
Joey Korenman
I mean, this is a tale as old as time. There's some famous story I remember learning as a kid about a guy who had a giant sledgehammer and would drive railroad spikes, and then some machine came along that could do it faster, and they raced. And of course the machine wins and the guy dies of a heart attack.

00:43:55:11 - 00:44:17:29
Joey Korenman
But it's sort of a parable about how technology comes for us all. And when I gave this talk in Chicago, I was urging everybody to do some soul-searching and figure out, what is it about making the art that is fulfilling, right? Is it the end result that you're getting, or is it the process of doing it? Because as the end result gets easier and more democratized, it does feel like a little bit of an ego hit.

00:44:18:00 - 00:44:36:00
Joey Korenman
If you worked for 15 years to be able to make a 3D render that looks a certain way, and now a 12-year-old who can type in the right prompt can get the exact same image, what does that mean for you as an artist? And it is funny, too, because there are things that AI tools have been helping with in our industry for years already, like rotoscoping.

00:44:36:00 - 00:44:52:08
Joey Korenman
No one complains, oh, all the rotoscoping artists are out of work now; no one complains about that. But when it comes to doing style frames for a commercial or something like that, and being able to knock out 20 variations and pitch those without ever actually having to open Photoshop, it is a little bit of an ego hit, I think.

00:44:52:08 - 00:45:08:22
Joey Korenman
And so I'm just curious. I mean, this isn't really Stability AI's responsibility to figure out for society, but I'm curious if you have any thoughts, because as artists, a lot of times our ego gets wrapped up in the result of what we're doing. And really, my philosophy is it should be about the process of doing it.

00:45:08:22 - 00:45:16:07
Joey Korenman
And if that brings you joy, then guess what? You're an artist. How do you think about that in the context of these tools making the result easier to get?

00:45:17:00 - 00:45:34:03
Daniel Jeffries
I mean, I think we're talking about the difference between commercial art and the philosophy, or the feeling, of art. And for me, it's certainly that joy you're talking about: that process of zoning out and losing track of time, that flow experience of writing, is why I still write. I make pretty good money off of writing.

00:45:34:10 - 00:45:58:08
Daniel Jeffries
I could probably do it full time if I wanted to focus on it. It's not quite as lucrative as this technology, but I could certainly make a living doing it and be quite happy about it. Maybe happier than with some of the stress sometimes in the tech world, right? But for me, actually, the most important thing is when I have those two or three hours where I'm just focused, and everything's on snooze mode and my phone is off and nobody else is bothering me, and I'm able to zone in with my thoughts and focus and type out something.

00:45:58:08 - 00:46:19:08
Daniel Jeffries
And I lose track of that time, and I just feel better for the rest of the day. I feel my purpose in life, and that's exciting. That's not ever going to go away. People still have the option to create those things. Now, when we're talking about the process of commercial art essentially being sped up in some way, I would argue that we don't have less art because the tools have been democratized.

00:46:19:08 - 00:46:36:06
Daniel Jeffries
I would argue we have more. I mean, people right now are talking about the golden age of streaming. I can't wait to go home and watch The Crown, which just dropped. I've been watching it in awe, and it's some of the best television I've ever seen, sci-fi or not. And I look at the graphics behind it and I go, man, look at the stuff from when we were kids, it looks so janky and horrible.

00:46:36:16 - 00:46:52:13
Daniel Jeffries
And I look at the stuff they're producing on television now. It looks so immersive. It's astonishing. It's incredible. And then you have this amazing acting, and we have tons of streaming networks now. We have content creators on the web. People have cameras in their phones. They're creating a million different things per second.

00:46:52:13 - 00:47:11:15
Daniel Jeffries
Per second, right? You've got all of these different things now. I think we have more art now than we've ever had in the history of mankind, and that's because of the democratization of all of these tools. It's because of tools like Photoshop. It's because of the World Wide Web and changes in distribution. It's because of Kindle. And AI is really just going to accelerate that.

00:47:11:15 - 00:47:30:14
Daniel Jeffries
Again, I think when people look at it and go, oh my gosh, like the rotoscoping example again, I'm not able to do this the way I used to, right? Yes, that is a challenge, and we have to kind of adapt, and that's tough. Some people are better at adaptation than others. And I understand where a lot of the artists are coming from, and I think some folks from the AI community are very "us versus them" about that kind of thing.

00:47:30:14 - 00:47:48:00
Daniel Jeffries
Like, "it's technology, just deal with it." No, I don't think about it that way either. It's painful sometimes to try to adapt, or to change, or to learn a new tool. It's also tremendously exciting. And there's a joy in learning a new thing from scratch. Creativity is about change. There's joy in learning a new process.

00:47:48:00 - 00:47:54:13
Daniel Jeffries
There's joy in adaptation, and there's joy in being a part of this eternal creative flow where life is always changing and never static.

00:47:55:04 - 00:48:13:20
Joey Korenman
Preach. I love it. I love it, man. All right, so let's talk about a common objection I see. And I'm not even sure it's an objection, because, I mean, as we were talking before we started recording, I've been really surprised by how accepting of this new technology my industry has been. I expected a lot more gnashing of teeth and tearing of clothes.

00:48:13:20 - 00:48:33:08
Joey Korenman
"They're going to come for our jobs." I haven't really seen that. I've seen a lot of artists embrace this. But one criticism that I have seen, and I suspect that it's not entirely accurate, is some version of this: these AI models were trained on millions and billions of images on the Internet, some of which I may have made and I own the copyright to.

00:48:33:12 - 00:48:48:12
Joey Korenman
And I didn't give you permission to crawl the Internet and grab these images, and now you've built a tool that is using my image as inspiration to make new ones. So I'm curious: A, is that perception of it at all accurate, and how would you respond to that?

00:48:48:27 - 00:49:05:27
Daniel Jeffries
Look, I think Stability is very much working with the community. I think of things like a kind of robots.txt, you know, a "do not crawl" style file. I think it's something we have to be sensitive to and we have to think about. I can sympathize and sort of understand. I would also say, historically, we have to be careful what we wish for.

00:49:06:02 - 00:49:35:07
Daniel Jeffries
Right? Like, there's been a kind of push to say somebody should be able to copyright a style now. And if you look back at the history of the courts for 30 or 40 years, they've come down on the opposite side of that again and again, right? So in other words, if you create an exact reproduction, you take somebody's image off a wall and you use it in your film or your advertisement without telling them, that's always been illegal and should be. At the same time, with this idea that you could copyright a style, imagine the sort of wormhole that opens

00:49:35:07 - 00:49:53:15
Daniel Jeffries
up if you allow that, and suddenly I go out tomorrow and copyright anime style, I go out tomorrow and copyright heavy metal, I go out tomorrow and copyright, like, a guy wearing a t-shirt or whatever, and sue everyone. It's the patent troll, right? Artistic patent trolls. There's a reason the courts have understood that. Maybe that has to change in this era,

00:49:53:16 - 00:50:11:12
Daniel Jeffries
I don't know. But it will have a ripple effect, and I think folks should think carefully about this. I also think that, again, in the early days of the Web, there was a movement to say that for every website you link to, you should have to get permission from them. You can imagine how that would have destroyed the World Wide Web, right?

00:50:11:12 - 00:50:25:04
Daniel Jeffries
It's like you would have this whole army of people inside CNN answering these requests, and the whole site would never get back to you, so you'd never be able to link to it. It would just be impossible. It would have changed the whole nature of it. So my thinking is, look, humans learn from looking at other people's art.

00:50:25:04 - 00:50:41:23
Daniel Jeffries
I certainly did when I was growing up. I certainly learned how to write staccato and run-on sentences from Ernest Hemingway and other great artists, and that mashed up into my own kind of evolution over time. Maybe my early stuff looks a lot like Hemingway, with all of its strengths and weaknesses, but over time you learn which of those techniques you like.

00:50:41:23 - 00:50:56:29
Daniel Jeffries
And as you get better as an artist, with ten or 20 years of experience, you refine them. And so imagine if Tom Brady had to call up Joe Montana in order to learn how to study footage to play pro football. Imagine if you had to get the permission of every artist just to draw their thing, right?

00:50:56:29 - 00:51:08:22
Daniel Jeffries
If you had to learn from all the different anime folks and get their permission just to learn how to draw in your style. So I think AI learns in a very similar way. And that was maybe the concern that surprised me the most.

00:51:08:22 - 00:51:24:26
Daniel Jeffries
I sort of predicted all of the other concerns that folks would have, so none of them really surprised me at all; that one took me a little by surprise. Again, though, I'm sensitive to it. I want to understand it. And if people want to be able to opt out of things, then I think as a community we'll be able to come up with an answer to that.

00:51:24:26 - 00:51:45:08
Daniel Jeffries
I think that's fair if they want that. I think what will happen is, in the short term, there'll be a small group of people who won't want to be trained on, but over time it'll be something that folks don't even think about. Right? In ten years, it won't even really be a debate. You'll just say this is the way things work, and you'll want your stuff to be in there so you can use it for your own style or whatever.

00:51:45:08 - 00:52:03:11
Daniel Jeffries
But I think people should understand the ripple effects of saying something like "we should be able to copyright a style," those kinds of things. Because, again, there's a reason the courts haven't come down on that side before, and you don't want me copyrighting anime and putting everyone out of business, or making them pay a toll every time they decide to put something up on ArtStation.

00:52:03:11 - 00:52:16:24
Daniel Jeffries
But you do want to get to a point, maybe, where people can put in a do-not-crawl kind of tag or something like that, and the various machine learning groups would respect it. I think we have to find, again, a healthy middle ground, a healthy balance, right? And whenever I think of it, I'm sort of a radical centrist.

00:52:16:24 - 00:52:34:13
Daniel Jeffries
I always try to think of how we get to a practical solution that's fair to 80% of the folks. Then maybe the 10% on the far left and the far right are upset. So be it. You're not going to please everyone in the world. But I think we can get to a solution that is beneficial for everyone, or the majority anyway.

00:52:34:13 - 00:52:35:00
Joey Korenman
And then that's what.

00:52:36:07 - 00:52:41:08
Daniel Jeffries
That's when you've compromised well: when everyone gets most of what they want, but not everything.

00:52:41:08 - 00:52:57:10
Joey Korenman
No one should be getting everything. Well, I love talking about this stuff. I think there are sort of the legal questions of it, right, which I'm sure will be litigated over the decades. And then there are the moral questions. And, you know, here's an example. I don't know if I've seen this myself, but I've heard people talking about this kind of thing.

00:52:57:10 - 00:53:31:11
Joey Korenman
Right. And maybe it's not even real yet, maybe it's just a fear. But when you go on these tools and you type in a prompt, you can type "a unicorn flying through the air, backlit by the moon, in the style of" and name an artist. And if you put Picasso or Andy Warhol, that's one thing. But if you put an artist that happens to be a modern working 3D artist, someone who's really well known and has a particular style, where brands come to them and pay them lots of money to make something in that style, and now this thing can make your style, and it's the style

00:53:31:11 - 00:53:47:23
Joey Korenman
associated with your name. I think that's where it gets a little squirrely feeling. And I don't know what the solution is. I mean, that's an edge case for sure, but are conversations about that sort of thing going on, and is that influencing the direction of the product at all?

00:53:48:11 - 00:54:11:21
Daniel Jeffries
I mean, I don't have a great answer to that question, that one particular edge case, and I think you can always come up with a lot of different edge cases. I think too many folks have focused on the idea that you could put in a particular name and get back an approximation of that style, because I don't think that's how the vast majority of people are using these tools. It's actually one variable among, like, 20 or 50 things.

00:54:11:21 - 00:54:25:08
Daniel Jeffries
And the vast majority of people who say, well, I just want this in the style of so-and-so, are just sort of playing around. They're not creating a commercial thing. They're not trying to go sell that artwork or do that kind of thing. Although there's tons of TV.

00:54:25:12 - 00:54:26:07
Joey Korenman
Not yet anyway.

00:54:26:14 - 00:54:43:19
Daniel Jeffries
Promos or whatever. But the vast majority of people are not really doing that. And I think, and you were saying this on Twitter, the vast majority of folks seem to underestimate that most people want to create their own unique style through the prompt. In other words, they want to combine 50 different keywords of various things.

00:54:43:19 - 00:55:06:07
Daniel Jeffries
And again, that style is just one aspect of it. This person's name is just one variable in there, and it's usually a mixture of three or four different people plus 25 other keywords, like metallic and cinematic lighting and digital paint style or whatever. And I think the vast majority of people want to create their own unique thing.

00:55:06:07 - 00:55:27:04
Daniel Jeffries
They're not really interested in copying anyone else, in the same way that, as an artist, I learned from Hemingway, but I don't want to just write like Hemingway. I wanted to learn the interesting stuff that Hemingway does and weave that into my own creation, so that when I'm writing a short, punchy scene, I know why he was choosing staccato sentences or run-on sentences for action to keep it flowing.

00:55:27:04 - 00:55:46:01
Daniel Jeffries
But I don't want it to be exactly the same. I want to understand the underlying thought process behind it. And so I think the vast majority of folks want to do that, right? They want to create something unique. And I've also actually seen, by the way, artists training their own work into the model and then using it as a new iteration.

00:55:46:02 - 00:56:05:16
Daniel Jeffries
I saw a very famous artist typing in prompts in the style of his own work in order to create iterations of potential sculptures, either resin or bronze or whatever, so he could iterate in these tools right now. I was like, oh man, that's super cool. It's like having an avatar of yourself, in a way, that you can iterate on things with. And I don't appreciate it

00:56:05:16 - 00:56:30:15
Daniel Jeffries
when folks in the artistic community are totally dismissive of these things. I think there's a middle ground, and I think there are legitimate concerns at times. But I think in general, over time, we're going to evolve solutions to these kinds of things. We're going to say, if it's an advertising campaign where somebody is mimicking a famous photographer or whatever, the industry will figure out whether that's even a good idea for you to do, because maybe they're never going to work with you again, or there's some sort of legal ramification to doing it.

00:56:30:15 - 00:56:47:12
Daniel Jeffries
I don't have a good answer to those things, but I do know that over time, society will evolve a fair answer to those kinds of things, and I trust society to figure those things out. We always treat technology as something outside of us, this idea that technology is external. But technology comes from us. It's an extension of us and it's a part of us.

00:56:47:12 - 00:57:04:15
Daniel Jeffries
Right. And when it changes us and changes the way things work, we adapt to it. We always find a way to integrate a new technology. We find a way to evolve into it. We created new regulations for cars when we'd just had horses and carriages before. We created new ways of dealing with those cars over time.

00:57:04:23 - 00:57:20:13
Daniel Jeffries
In the early days, we had no idea how fast they should go, or whether we should wear a seatbelt. Even wearing a seatbelt was controversial. Oh my gosh, should we be allowed to do that, or should we be able to force people to do that? Literally, that was the debate, right? But over time, very few people think twice about putting on a seatbelt anymore.

00:57:20:13 - 00:57:35:22
Daniel Jeffries
Right. Even though that was a tremendous debate. It really was "how dare you make me wear a seatbelt," right? That was actually the debate at the time. So I think, again, there's always this push and pull that happens, and then the system stabilizes and people are able to come to a middle ground where there's a consensus answer for these kinds of things.

00:57:35:22 - 00:57:37:24
Daniel Jeffries
And that plays out over time naturally.

00:57:37:24 - 00:57:53:05
Joey Korenman
Yeah, it's made me think, it's almost like you're open-sourcing that part of this process, which is really, I mean, I think that's the only way to do it. You can't centrally plan how society is going to react to something like this and figure out all the rules ahead of time. I think you have to kind of bump into those rough edges, right?

00:57:54:00 - 00:58:20:10
Daniel Jeffries
You do. And that's the way that all great ideas have filtered out into the world. It's the way that all technologies have filtered out into the world. They filter out into society, and then there are some good things you can do with them, there are some bad things, and there's some stuff in between. And we learn to mitigate the bad things and embrace the good side of those things, through a combination of consensus and regulation and best practices and evolution over time.

00:58:20:11 - 00:58:26:12
Daniel Jeffries
And I think that's the nature of how society develops, the nature of the way we sort of evolve as a species.

00:58:26:12 - 00:58:50:06
Joey Korenman
So on that topic, why don't we go a little deeper into the potential bad things that could happen because of this technology. Recently Adobe had their big annual conference, Adobe MAX, and they have a lot of AI tools finding their way into Photoshop and other products. And they made a big point this year that they're doing a lot of work to try and ensure that this stuff is used ethically.

00:58:50:06 - 00:59:06:13
Joey Korenman
I mean, I think the obvious example would be a deepfake of the President of the United States saying something horrible and it causing World War Three. I think this is probably a real danger of having tools like this be so easily accessible, and I don't know what you do about that. So I'm curious, what's your philosophy on:

00:59:06:20 - 00:59:26:13
Joey Korenman
Okay, now there's a new weapon out there. I was thinking of the famous quote from Oppenheimer when he saw the nuclear bomb go off for the first time: "Now I am become death, the destroyer of worlds." This technology theoretically could be used to do some pretty bad stuff. Are there ways to safeguard against that? What are companies like Stability doing to prevent worst-case scenarios?

00:59:27:10 - 00:59:51:01
Daniel Jeffries
Stability is firing up a contest now, for like $200,000, for the community to build deepfake detectors, and then we'll just open source that for free and give it out to the world. My sense is that the way it will all essentially work is a bit like the way spam filters work. Initially you had these traditional hand-coded rules to try to stop spam, but they only worked 60 or 70% of the time. They weren't very effective, and spammers were getting around them.

00:59:51:13 - 01:00:07:10
Daniel Jeffries
Then you started to see these more machine-learning-based Bayesian approaches, and they started getting better and better over time. The spammers would then take their tools and learn how to get around the filters by injecting certain keywords or whatever, but the filters got there anyway. And I would argue spam is largely a solved problem at this point.

01:00:07:15 - 01:00:22:05
Daniel Jeffries
With the vast majority of commercial software, if you're on Gmail or whatever, you're just not going to see 99.9% of spam. And that's because, eventually, over time, the tools were there to detect it or mitigate it. So I think we'll see the same thing here. We'll see the rise of tools on either side.

01:00:22:05 - 01:00:37:14
Daniel Jeffries
I also want to say, though, that nobody's been waiting around for diffusion models to do this. I mean, again, there's been a shift in the way we think about these things, right? Photoshop has been able to do this, put celebrity heads on bodies or whatever, and skilled artists could do it.

01:00:37:14 - 01:00:56:08
Daniel Jeffries
Now, does it get potentially easier over time? Sure. But there are already machine learning models being used in the war in Ukraine, on both sides or whatever, to make fake videos and statements. So those kinds of things are already happening. So as a society we already have to deal with it, and we're going to need better deepfake detection, we're going to need better tools to authenticate images.

01:00:56:08 - 01:01:11:08
Daniel Jeffries
Maybe it's hidden information in the images. We're looking at some stuff like that on the research side, and you could potentially have hidden watermarks or things of that nature in there. I think a lot of researchers are looking at those things. So you're going to have tools that develop over time.

01:01:11:08 - 01:01:28:06
Daniel Jeffries
Maybe there's some sort of cryptographic technique that allows you to authenticate the image as it's coming off of the camera, or things like that. I think there's also been a shift that's a little puzzling in some respects, where we've changed the concept of where the responsibility lies in a lot of things.

01:01:28:06 - 01:01:54:17
Daniel Jeffries
Right. In the old days, if somebody put a celebrity head on a different body in Photoshop, we didn't call up the executives at Adobe and say, look, you can't put out Photoshop until you find a way to make sure this never, ever happens. Right? We somehow got into this place with artificial intelligence where we're supposed to be able to mitigate every possible sort of human behavior. It would be like saying, well, you can't give anyone a credit card until you can figure out every way that they'll potentially commit fraud.

01:01:54:17 - 01:02:15:15
Daniel Jeffries
I think that's a strange shift for society. It actually concerns me a great deal, because there are these overwhelming benefits that are super important for folks to understand. Right? Like, we haven't said to the kitchen knife manufacturers that they have to mitigate, that it can only be used for cutting vegetables, and if you can't mitigate somebody cutting their finger with it or stabbing somebody, then you can't put it out.

01:02:15:15 - 01:02:30:13
Daniel Jeffries
And to me, that's strange. It is strange. I understand that every technology exists on a continuum of good to bad. To me, artificial intelligence is very much in the center. It can be used for good and bad things, but I think the vast majority of people are good, and the vast majority of people are going to use it for good.

01:02:30:16 - 01:02:46:09
Daniel Jeffries
And maybe a gun is farther toward the dark side of that continuum, but I could still use it to hunt and feed my family or protect my family. Whereas a lamp might be farther toward the light side, and I could still pick it up and hit you over the head with it. Right? But we haven't told the light bulb and lamp manufacturers, look, you can't put the lamp out until

01:02:46:09 - 01:03:01:28
Daniel Jeffries
you can make sure nobody ever gets hit over the head with this thing. But I do think, for the vast majority of things, the vast majority of people are good, and so are the benefits. So when we have red teams out there in the world looking at what a tool can do wrong or how it can be harmful, companies should also be spinning up a green team.

01:03:01:28 - 01:03:19:06
Daniel Jeffries
And we're doing that as well, red and green teams. The green team basically does an analysis of what would happen if we don't release: what are the myriad benefits of synthetic cancer scans, the ability to create new wallpapers, the ability to create myself in 12 different styles? What are the downsides and the benefits to society?

01:03:19:06 - 01:03:36:12
Daniel Jeffries
What benefits are lost from not releasing these things? I think that's just as important. And somehow we've focused on the dark side to the exclusion of all else, and it kind of saddens me in a lot of ways, because I just feel like there are so many good benefits and the vast majority of people are good. And if people do something wrong or illegal with it, then we should place the blame on them, in the same way that it works

01:03:36:12 - 01:03:52:12
Joey Korenman
Now, I love the kitchen knife metaphor. And I mean, I think you're right, part of it is just that doom gets clicks, and I think that's why anything negative rises to the top these days. Like, going back to that Oppenheimer quote and the nuclear bomb, that's probably not a fair comparison at all.

01:03:52:12 - 01:03:53:17
Daniel Jeffries
It's a little tough. Yeah.

01:03:54:14 - 01:03:57:07
Joey Korenman
It's almost like comparing someone to Hitler once you've gone there.

01:03:58:00 - 01:04:03:17
Daniel Jeffries
But yeah, then where are they going from there? It's like you've already jumped the shark. You've already drawn the argument out to its extreme.

01:04:03:17 - 01:04:24:29
Joey Korenman
Just to play devil's advocate, and hopefully this holds: the world developed nuclear technology, the United States used it in World War Two, and other countries have nukes. But there is this understanding that we're not going to use this technology. We have it, but let's not use it, because it's opening Pandora's box. I guess there is a case where something is in a category all by itself.

01:04:24:29 - 01:04:26:16
Joey Korenman
And so would.

01:04:26:16 - 01:04:42:19
Daniel Jeffries
You say autonomous killing machines? Yeah, no autonomous killing machines, is that right? I mean, there are already groups at the U.N. that are looking into that seriously, and whether those things should be banned. But there's also probably an arms race that potentially allows that to happen. Those kinds of things, I think, are super important for society to debate.

01:04:42:19 - 01:04:56:19
Daniel Jeffries
But again, with that analogy, this is where we go too far into the doomsday scenario. Think about my idea that technology exists on a continuum; all technologies exist on a continuum. I can drink water from this, but I could also hit you over the head with it. But mostly I'm going to drink water from it.

01:04:57:00 - 01:05:16:00
Daniel Jeffries
Okay? And 99.99% of people are going to drink water from this thing. So in the one edge case where somebody decides to hit somebody, does that mean we should not have this water bottle? But when you're talking about something like a nuclear weapon, you're talking about something whose very purpose is to destroy a lot of things very quickly.

01:05:16:00 - 01:05:33:27
Daniel Jeffries
That's what it is. That is not a fair comparison to artificial intelligence, right? If I created an artificial intelligence that was just designed to kill or harm people, to do targeting practice or whatever, then yes, that should be heavily regulated, or banned, and something we would want to think very deeply about as a society.

01:05:33:27 - 01:05:56:14
Daniel Jeffries
But the rest of artificial intelligence technology generally, I would say, falls either much farther toward the good end of the spectrum or somewhere in the middle, and when we use these kinds of extreme analogies, it creates misunderstanding about the nature of those things. So of course we want to regulate nuclear weapons. But somebody made the analogy of the nuclear weapon.

01:05:56:14 - 01:06:16:01
Daniel Jeffries
It's an absurd observation. I'm sorry, but it is an absurd observation, and it's completely unfair. If you're worried about the AIs getting too intelligent, I would argue that people need to learn how to think more clearly before they worry about that, because, really, artificial intelligence is not even close to a nuclear weapon. Maybe AGI, in this mythical sense, could become something like that.

01:06:16:01 - 01:06:32:21
Daniel Jeffries
But I don't know. Frankly, I wonder if AI could just do a better job. I mean, it certainly couldn't do a worse job. When I think about the kind of cruel, sadistic, and horrific conquerors and kings and queens of the days of old, maybe AI will do a better job. Maybe we haven't even considered that possibility, and we should.

01:06:32:21 - 01:06:53:12
Daniel Jeffries
But again, I think it's just super important to be careful about these kinds of extreme examples. Once again, that kind of thing creates unnecessary fear that really is unrealistic, and people need to look at each individual tool and use case, and they need to look at it with a clear mind and logic and a sane, middle-ground understanding.

01:06:53:12 - 01:07:01:14
Daniel Jeffries
And certainly, if you want to think about something like a lethal machine or whatever, that's a category that must be considered seriously. But the rest of the stuff needs to be considered in a very different way.

01:07:01:24 - 01:07:21:20
Joey Korenman
Yeah, no, I totally agree with you, and I think you nailed it there. Nuclear weapons are literally designed to just destroy stuff. This could very likely help cure certain cancers and things like that, right? So sure, it could be used by bad actors. I'm thinking of this guy Palmer Luckey. He invented the Oculus VR headset.

01:07:21:20 - 01:07:41:01
Joey Korenman
That was his company, and now he runs a weapons startup; he makes weapons. And there's another way to look at this too, which is that Stability AI and Midjourney, these are being created in developed Western countries where we have a legal system and a morality that most people share. I mean, I think we're all on the same page.

01:07:41:01 - 01:07:55:02
Joey Korenman
Most people don't want to hurt each other with this stuff, but there are bad actors out there. And so we need to develop the technology too, because they're going to be doing it whether we do it or not, and I'd rather have the good guys developing it. So that's another thing that I think is good to keep in mind.

01:07:55:10 - 01:08:10:15
Daniel Jeffries
There are examples in totalitarian states where they use these things for scaling surveillance and those kinds of things, and we should be really wary of those. Those are the things we should be out there yelling about and protesting, making sure that we don't allow that to creep into our own societies as well.

01:08:10:16 - 01:08:30:11
Daniel Jeffries
Right. I find those cases to be incredibly unethical, and I couldn't support them. I also understand that, for example, somebody created a drug discovery model that could generate novel chemicals for combating cancer, and then somebody put out a paper saying, oh my God, it can also be used to create chemicals that hurt people.

01:08:30:15 - 01:09:06:23
Daniel Jeffries
Chemistry can be used to do that by a rogue chemist too. Once a chemist understands how to combine chemicals, they could make things that cure cancer, or they could poison people. So again, it falls into the hands of the user, the person who's doing this. And I believe very much that the vast majority of people are good, and that these tools are too important: to our ability to fight cancer, discover new drugs, create new ways of doing art and work, create new ways of expressing ourselves, and create translations for languages that don't have a great deal of speakers anymore.

01:09:06:23 - 01:09:25:28
Daniel Jeffries
And we don't want to lose all those precious texts that exist, and now we can start to train these gigantic models that are able to preserve some of these languages in ways that have never been done. We're also able to look back at historical records and things we had trouble translating from archaeological sites, and suddenly we can translate them and understand what people were thinking in the past, in these ancient scripts.

01:09:25:28 - 01:09:31:29
Daniel Jeffries
And so there are just so many cool things, and it's so exciting. I just want people to be excited about it.

01:09:32:02 - 01:09:46:24
Joey Korenman
So I guess the last question I have for you, Dan. This has been really awesome and eye-opening. I'm a natural optimist, so I was already super excited about all these tools, and now I feel even more excited, because you've given a hundred examples of ways this technology can be used that I'd never thought of.

01:09:46:24 - 01:10:03:10
Joey Korenman
So maybe we can wrap up with this: you're a futurist and a writer, and clearly you think about this stuff a lot. Could you paint a picture of what it's going to look like in five, ten, twenty years, when this technology is more mature? What are the things we're not even thinking about now?

01:10:03:10 - 01:10:12:19
Joey Korenman
I mean, image creation is great; it makes it easier to do our jobs. But what else is going to come out of these models, these experiments, and the research that's being done?

01:10:13:06 - 01:10:27:08
Daniel Jeffries
I mean, you're going to have these amazing cutting-edge models that are always being hosted and released by big companies, and then they'll get cheaper over time and trickle down into the hands of regular folks, who will think of brand-new things to do with them and run them on smaller amounts of compute.

01:10:27:08 - 01:10:41:27
Daniel Jeffries
You'll probably have something like an Akamai of inference running all over the world that's able to run these gigantic models, and you'll be able to consume them via APIs or consume them directly. The kinds of things we'll be able to do are amazing. I think it's going to revolutionize our ability to do drug discovery.

01:10:41:27 - 01:11:00:07
Daniel Jeffries
Protein folding was one of the first examples of it. We'd synthesized maybe a million different proteins in our entire history, and it took months and lots of expensive equipment to do it. Then we were able to make predictions for 200 million proteins practically overnight. What does that mean for medicine as we apply that sort of technology?

01:11:00:07 - 01:11:15:28
Daniel Jeffries
It certainly means personalized medicine. Think about it now: drugs are so expensive because it's really expensive to research drugs that cure things. So the big pharma companies primarily focus on the things that hurt or kill the most people.

01:11:15:28 - 01:11:32:03
Daniel Jeffries
They're focused on cancer, different types of cancer. But there are all kinds of smaller diseases they would never look at. A friend of mine got skin writing, where you're itchy all the time and have to take an allergy pill every couple of days, and if you scratch your skin, it brings up a red welt.

01:11:32:09 - 01:11:45:19
Daniel Jeffries
No one's going to create a drug for that; no one's going to spend the R&D money to do it. But an artificial intelligence might be able to look at all the characteristics of that particular disease, look at the drugs that have been created, look through the possible combinations, and create one. And then maybe there's a drug 3D printer.

01:11:45:20 - 01:12:06:00
Daniel Jeffries
These things already exist: chemical 3D printers, DNA 3D printers. So you'd be able to go to the drugstore and get personalized medicine for just your particular disease. And maybe there's another artificial intelligence looking at those drugs to make sure they pass a certain baseline, that they're not going to hurt people and that they have certain characteristics, by running them through virtual tests.

01:12:06:09 - 01:12:25:18
Daniel Jeffries
Maybe the FDA gets involved there too. I think you're going to have breakthroughs in economics, where you're able to analyze the way systems are working in real time. We're already starting to see this, where AIs are able to go, wait a minute, that country might not be telling the full truth about its economic production, because we're looking at all the container ships coming out of there, calculating that, and putting it up against the data that already exists.

01:12:25:18 - 01:12:42:20
Daniel Jeffries
And we've already talked about some of the other ones, like cars, certainly. I'm not convinced we're going to solve any sort of climate issues just by eating one less burger and occasionally not flushing the toilet. I think it's going to take much larger things, right? So if artificial intelligence were able to reduce the number of cars on the road

01:12:42:20 - 01:13:00:05
Daniel Jeffries
and reduce the fatalities, all of a sudden, say you're in a big city: you've spread out 10,000 cars, and you can get a car anywhere in the city within two minutes based on the distribution of those cars, and they can get you anywhere. I think that's revolutionary. It's awesome. And all of a sudden those parking lots could be converted into very interesting things.

01:13:00:05 - 01:13:17:25
Daniel Jeffries
I think we're going to have new material science, right? We're going to look at all kinds of different materials and design new things. I think it's going to be turtles all the way down: we're already seeing AI designing new chips. I think we'll have personal assistants that know us probably better than we know ourselves. And you'll be able to say, I forgot my sister's birthday, and I'm on my way to her house.

01:13:17:25 - 01:13:30:23
Daniel Jeffries
And it's able to look at what it knows about her and what it knows about you, how much time you've got, and say, okay, she does like beer, but she doesn't like wine, and she's not into chocolate.

01:13:30:23 - 01:13:47:12
Daniel Jeffries
So there's a store on the way where you should probably pick something up for her, and there's another one in case you miss that one. I think that would be very awesome. I'm always an optimist, very excited about things. I know sometimes there's a downside to these things.

01:13:47:12 - 01:14:06:06
Daniel Jeffries
But humans, like I said, have a dark side, and we've done negative things with or without technology throughout time. We've had wars, we've had conquests, we've had these kinds of things. That aspect of human nature is unchanged. But in terms of our ability to create, do new things, and improve life for the vast majority of people, I think it's one of the luckiest times to be alive.

01:14:06:06 - 01:14:23:24
Daniel Jeffries
I think we're among the luckiest one percent of one percent to ever live. We're really in a golden age of material science, a golden age of science and technology, a golden age of longevity and health. All of that has been because of technology. That doesn't mean we can't have setbacks, and we do have setbacks along the way.

01:14:24:05 - 01:14:42:04
Daniel Jeffries
It doesn't mean some terrible things haven't come out of it as well. Weapons of war: World War II was incredibly destructive because of those tools, versus swords and stones. But when I look at the vast majority of human history, I think this is absolutely the luckiest time to be alive. Things just keep getting better.

01:14:42:04 - 01:14:59:21
Daniel Jeffries
Too often we're looking at all the negative things in life. It's like a great statistician I once saw said: if I keep showing you the bottom of my shoe all the time, you're going to think that's the whole shoe. And it's just not. This thing I have right here keeps my drink cold for the entire day, or warm for the entire day, without any ice.

01:14:59:23 - 01:15:11:05
Daniel Jeffries
We don't even think about it, but this is magical. It's awesome to have this thing. And I just think we're going to have more and more of those things, and people should embrace it and be excited to be alive, because it's an amazing, amazing time.

01:15:11:05 - 01:15:29:13
Joey Korenman
I want to thank Dan for coming on. There are incredible demands made on his time these days, as you can imagine, and it was an honor to chat with him. Check out his Substack, Future Histories, at danieljeffries.com to go deeper into his thoughts and ideas about the technology we discussed. Thank you so much for watching, and I'll see you next time.
