111. Effects of AI with Dietmar Fischer

This month we're chatting with Dietmar Fischer about what we will mean by "AI" in the future, AI in science fiction, the fact that AIs don't want for anything, jobs and the political effects of unemployment, post-work society and defining what a good life is, the Chinese AI legislation, protecting young people from AI anthropomorphising, AI literacy, the AI bubble, human extinction, and more.
Date: 14th of April 2026
Podcast authors: Ben Byford and Dietmar Fischer
Audio duration: 52:20 | Website plays & downloads: 3
Tags: Existential Risk, Job displacement, Sci-fi, Podcast, Legislation, AI literacy, Politics | Playlists: Existential risk, Legislation, Work

Dietmar Fischer is an economist turned digital marketer, now going down the AI road. With the Beginner's Guide to AI he teaches AI literacy to people who are open to AI but need some ideas and examples on how to get started, or how to use AI at the next level.

You can find the podcast at beginnersguideto.ai, his marketing agency at argoberlin.com and himself on LinkedIn: https://www.linkedin.com/in/dietmarfischer/


Transcription:

Ben Byford:: Hello and welcome to episode one hundred and eleven of the Machine Ethics Podcast. This month, we've got a podcast swap with Dietmar Fischer, who runs the Beginner's Guide to AI podcast. This was recorded on the twentieth of February 2026. Uh, as you can see, we're kind of in a new office and we're trying out Adobe Podcast. Uh, this is filmed and available on YouTube, so check it out there as well. We interview each other about what we mean by saying AI in the future, AI in science fiction, humans loving to try to break AIs to show how much cleverer they are, what we mean by intelligence, the fact that AIs don't want anything, jobs and the political effects of unemployment, post-work society and defining what a good life is, the Chinese AI legislation protecting young people from AI, anthropomorphism, AI literacy and the AI bubble. We also talk about human extinction, and much, much more. So go check out Beginner's Guide to AI. You can find more from us at machine-ethics.net, and if you can, you can support us on Patreon at patreon.com/machineethics. Thank you, and I hope you enjoy.

Ben Byford:: So, uh, hi, um, welcome to the podcast. Um, I'm going to just wave my hands around, gesticulate a little bit, uh, so I apologize for that. Uh, we're recording this as a bit of an experiment, um, because I've not used this software to record video and audio at the same time. Uh, so hopefully it's going to go well. Uh, Dietmar, if you could introduce yourself: who are you and what do you do?

Dietmar Fischer:: Yeah. So I'm, uh, first of all, host of Beginner's Guide to AI, this is the one podcast, and, uh, I do online marketing AI stuff. Uh, now it's this agent building and whatever. And, um, I've basically been interested in the topic since I started reading science fiction when I was young.

Ben Byford:: Yeah. Great. And you've got this, uh, podcast. So we're hopefully doing this podcast exchange so that we'll have this on the Machine Ethics podcast and your podcast as well. So hi to everyone who is there.

Dietmar Fischer:: Yes, yes, yes. This is exactly it, Ben. What do you do?

Ben Byford:: Um, so I'm Ben Byford. I run the Machine Ethics podcast, as I mentioned; I have been doing it for ten years now, which is kind of bonkers. Before it was cool to do podcasts and to talk about AI, I guess, maybe. And, uh, I have also been a web designer, developer, a UX design person, a person who makes stuff, designs apps, and teaches. I did a bit of teaching adults, uh, to code and about innovation methodologies, user research, that sort of thing. And mostly these days, because of the podcast, because of this interest in AI and ethics, I read and write and talk and do workshops on AI and ethics. And I also make computer games. So those are the two main things I'm trying to do at the moment. Yep.

Dietmar Fischer:: This is actually an interesting thing with the computer games, how much AI is in there. But this is a topic for later, I think. Uh, first of all, um, let's start with the most important stuff: ethics, business, AI. Um, you have a typical question, Ben, I realized, and my audience would love to hear that question.

Ben Byford:: Um, so at the beginning of the podcast, I always ask this question around what is AI? You know, what are we actually talking about? And I ask it because it's always met with different answers. So for you, what are we talking about? What is this thing, AI?

Dietmar Fischer:: I prepared for this, exactly. So the thing is, for me, I'm a Capricorn, so I'm like a really hands-on guy. And for me, AI is the things I read in my science fiction books. So it might be behaving like an alien, but it's still like a human, in the sense of an intelligent being acting. Basically, if you take a Spock or an AI, for me it's not much different; it's the same type of character. So if you have, like, Rutger Hauer's character in Blade Runner, or you have, uh, HAL, those are all, for me, kinds of intelligence. And they act like we humans could.

Ben Byford:: Yeah. And it's interesting because those examples are science fiction examples. Um, probably a few years ago they felt quite far away, right? And we had this idea, I'm assuming, this kind of cultural idea of what we're talking about and what we're aiming for with AI. Um, and I guess more and more these days, it feels like we are encroaching on that sort of science fiction vision, slightly. Um, but your examples are all examples of things which are seemingly people, in terms of they have some, you know, inner reflection, some stuff going on. Um, and they're not just tools and inanimate objects which happen to have interesting abilities. Is that what you mean? Is that how you're kind of looking at it?

Dietmar Fischer:: Yeah, yeah, this is exactly the thing. So they have agency, they even have morals, or their own set of rules, and they act, and you wouldn't be able to say... maybe it's an alien, but it could be a biological life form or an artificial one. It doesn't really matter from the science fiction standpoint. Um, but I had just the negative examples there; there are also the positive ones, or there's the whole problem with the Isaac Asimov ones. And there could be positive ones, there could be a detective AI, and something like this. And it's interesting, now I think about it: uh, if you have ChatGPT or something, for me it's still not sentient. It's still a tool I use. And, um, I don't put the idea in. I mean, sometimes when one reads what it writes, one says he or her or something, but I still know it's just a tool. It lacks memory. But with the speed of development, I think things change. And for many people, it's already sentient, even if it's not.

Ben Byford:: Yeah, yeah, I think it's one of those questions, isn't it? Like, how will we know? What bar are we measuring by to be able to put it into a new category? This happened a couple of years ago, right, where when everything was AI, people actually meant some machine learning. And now, more recently, everyone means generative AI, you know, chatbots and stuff like that. And at that point, maybe in the future, when we say AI, we'll actually mean things which have some sort of continuation, some sort of memory, some sort of reflection, that are less tool-ish than they are now. Um, and AI will be that thing, right? And it would be more like your Rutgers and your, um, you know, science fiction examples. Uh, and then we'll call everything else, like, I don't know, data analytics or something, you know?

Dietmar Fischer:: Yeah, this is really interesting. I just read the first chapter of a book by my professor. So I'm at the university in Munich, and they are publishing a book, and that's the thing: they interpret texts with AI, together with AI. They define it like this. And he starts with the point that we don't just use it as a tool anymore, because the tool also changes us. So, um, part of this is already in our intelligence. It's like the GPS example: if we suddenly don't have GPS, it's really hard to navigate, because we already put some of our intelligence into the machine. So, yeah.

Ben Byford:: Yeah, yeah. Do you think most people use it or understand it? I was having this conversation with my parents recently, who are, you know, getting into their seventies, and I think for that generation, it's probably unlikely that they know much about it, I would say. I'm sorry if you're listening to this and you're like, "I know all about it", and yeah, um, that's probably a massive generalization. But I think that's the thing, isn't it? When everyone's doing it, you know, your parents will be doing it, and are we there yet? Almost, um, with that kind of understanding, that literacy around it as well.

Dietmar Fischer:: This is really interesting, because yesterday I just had a call with someone. She's about sixty, and they were laughing their asses off because they could put into ChatGPT "how many Os does orange have?" and it gave them two. I couldn't replicate it; it didn't work, it showed me one. But still, the funniest thing they could do was prove that the machine is wrong, and that tells you something about how they use it. I don't see any sense in proving that the machine makes mistakes; I know that the machine makes mistakes, it's not there yet. But, um, I think it's a defense, because they're really afraid of it, and they really personalize the machine already, and now try to take away the anthropomorphization part of it by proving that it can't do simple tasks that they can do, I guess.

Ben Byford:: Yes. Yeah, yeah. And I'm presuming you're suggesting that it can do tasks, right?

Dietmar Fischer:: Yeah, they use it every day. So they know it can do a lot, but I think they're quite afraid that it will do more, or they don't really know if it's intelligent or not. I mean, one just has to scroll through social media and see many people, uh, that don't realize what it really is, you know?

Ben Byford:: Mhm. Yeah. I mean, I think there's a lot of what I like to call semantic soup in there. Like "intelligence": you were just saying intelligence as in conscious or sentient, and people use these words like that all the time. And we used to call things smart, right? Everything was

Dietmar Fischer:: Smartphone.

Ben Byford:: Like this, a smartphone, a smart thing. Um, so I feel like the intelligence thing is a bit like that, where we're using it in places where we actually mean something else, or it would be more explicit to say something else. So it almost feels like we need this intelligence split. On one side, what we mean by intelligence is smart: it's doing a digital thing that is useful. And in terms of AI, maybe it's a useful action; these things are getting to the point where they're taking actions, they're able to interact with digital products and stuff. And the other side is, oh, it's intelligent, it of itself is intelligent, right? And what they mean is it has some internal thing going on, um, which we could more easily refer to as consciousness or sentience or those sorts of terms. So, um, yeah, I find it really fascinating that people just use words and no one knows what anyone means most of the time, you know?

Dietmar Fischer:: This is totally interesting, because if I can't really put a finger on what it is exactly, I also tend to generalize, and then it's "intelligent", whether it's conscious or not. But I have this term, and in my mind I judge it like it would be this tool that can already do things, and I attribute things to it. Yeah, I mean, they give it names now. They give ChatGPT names.

Ben Byford:: Yeah, yeah. I mean, is it intelligent like... there's the weird squirrel example: it's intelligent because it can hide nuts and find nuts. Is that what we mean? Or do we mean something else, like it's intelligent because it feels like a person? You know what I mean? Is it because it completes tasks well, or is it because it feels something akin to some anthropomorphic version of what we mean by intelligence? You know, um, I would hope that most people were down this line of: it's intelligent because it's smart, right? It completes tasks that we want it to complete, and there may be tasks that we haven't been able to complete in the past or whatever. Um, so that's what I kind of mean. Yeah, I feel like I'm making these distinctions on the fly, so I apologize; they're not completely thought through. Um.

Dietmar Fischer:: It's a podcast, we're allowed to do this, no? But it's really interesting, I think, because of the voice. I normally don't... I know there's the voice mode, I use it sometimes. But if you go down the road of Character AI or whatever, and you really have an avatar, it's not the squirrel anymore for the people, even if it has the intelligence level of a squirrel; the squirrel's might even be higher. But for them, it's not the inside that makes the distinction, but the outside, the communication tools. I think this is it.

Ben Byford:: Yeah, that's interesting. Um, I think there's, like, an anthropomorphic version of that. Do you find, when you're talking to people, that they are thinking in that way, that they are having relationships with, or, you know, getting advice from these systems and stuff?

Dietmar Fischer:: Actually, I don't. I know that there are a lot of people, but I don't really know them. It might also be my age group; this is more like the, um, not yet boomer, but Gen X and Gen Y, in the middle. So using tools, used to using tools, learning a lot. And, I don't know, my new employee, she's in her twenties, so I don't know how she does it. I know that she's really rational; um, for her, it's not a person or so. But I don't know how she uses it privately, if she has a Character AI chat or whatever. And there are those, there are quite a lot of people. There was this one ChatGPT research, where OpenAI looked at what people do with ChatGPT, and the second most used case was personal relationship. Was it second, or even the most? But there are enough numbers out there, even if, interestingly, I don't know the people. It sounds like, yeah, there are people, but it's not me.

Ben Byford:: Yeah, yeah, yeah.

Dietmar Fischer:: I don't know them, but, uh...

Ben Byford:: I guess it's the anecdotes, isn't it? I think we're all aware that people do do it, and it's fine as long as they're, like, cognizant that it is currently a tool, let's say. I'm going to throw that out there. Like I've been saying on the podcast, it's got no skin in the game, right? It doesn't really... it doesn't care for you. It's just regurgitating text, right? Um, but it can be useful text, useful knowledge, information, stuff. Um, but it doesn't want to be your friend. It doesn't want for anything. So there's the distinction.

Dietmar Fischer:: But they are programmed like that. It feels like they want to be your friend.

Ben Byford:: Yeah, yeah.

Dietmar Fischer:: This is definitely the part where the companies want you to keep interacting. And also, this is what you want. I mean, in a business context, one could hand you a paper and say "read this", or you could have a nice agent, nicely graphically designed, telling you what the paper is about, and you would listen to it, or many more people would listen to it. So I think this process is already far along.

Ben Byford:: Yeah, yeah. And I feel like we could talk about this for a while, but I was hoping that we could talk about the jobs and economic thing, because I know that you have training in, um, economist...

Dietmar Fischer:: Economics.

Ben Byford:: That's the word, economics.

Dietmar Fischer:: Yeah.

Ben Byford:: Um, so I was wondering how you feel about the economic changes that we might be seeing. Over the last few years, maybe we've had a lot of noise in that, with the Covid time and obviously wars and things that have been happening. But do you think that there's going to be some sort of AI change around the economics of the situation? It's a very general question, but we can jump off from there.

Dietmar Fischer:: But it's good we stay general, because I have to make a disclaimer: I did my master's thesis on the microeconomics, game theory stuff. So obviously I'm interested in macroeconomic stuff, but I was more into the game theory. But anyway, the job thing. I mean, there's definitely change coming, and the question is how quickly it comes. And many things now are excuses for firing people; this is one thing, so we don't see the real effect yet, but there are also already some numbers there. Um, it's really hard to say if there comes a new world with new employment for humans; I honestly don't know what comes. And there's this comparison to the Industrial Revolution, where all people got new jobs, but only after twenty, thirty years. I'm in Berlin here; Berlin was one of the most extreme cities, with poverty everywhere and people dying of, what was it, cholera and whatever in the city. So it took like twenty, thirty years to adapt to the system. So it might be that there comes a new system, but there will be a gap. And this is actually a thing I see, because most days, if you get spam mail or whatever, and it's about AI, it's mostly about how to fire people. There's a lot of those: how to work with less people, not how to do more with the same people, but do less and save money on your people. And as soon as this mindset, and this is a strong mindset, takes hold, there will be an economic problem. And then the economic problem turns into a political problem. Because if unemployment rises by ten percent or so, the whole democratic... I mean, we already have problems with populists, but this is perfect for them. And then it might even have political effects that are really severe. I'm a little bit afraid of the future.

Ben Byford:: Yeah, yeah. I mean, is there, like, a good version? Because obviously a lot of this stuff is measured in GDP, and that's an interesting measure in itself, for different reasons. Um, but is there a good version where we see the uplift in economic production because of AI, and therefore we can start having more leisure time, that sort of thing? I feel like I keep saying this, but we've been sold this dream, right, where with technology we can have more time for the things that we really want to do. Um, hopefully those things are useful, pro-social things: moving your body, seeing your family, being creative, all the things that we know human beings get a lot out of. And hopefully not just more work or more toil; some work is less drudgery than others. Um, do you feel like there might be a good way out? And then we can kind of go into the less good, maybe.

Dietmar Fischer:: No... yeah, the less good gets more clicks, yes. But no, I can't give you a strictly good answer on that; it depends. There was just a lot of research, because in Germany there was this thing where the Chancellor said the Germans don't work enough, and there is already a lot of research on it. I just read a good paper on the definition of work, how people think about it. And there's an interesting thing, because of the cliché that Germans live for working: it's not the case anymore. It's like thirty percent of the workers in Germany at the moment live for working, seventy percent not. So I would say those seventy percent definitely profit. They already cut their hours, and they profit: if they have to work less for the same level of income, perfect for them. But there are other countries, interestingly the US, where people define themselves much more through their job. And the more you define yourself through the job, the more negative the effect is. It's this typical thing that you have when people retire: the mortality rate gets high for those people who don't have a meaning in life. And I think this is it: we have to learn the meaning. And if we do that, it's not complicated, actually. If we define as a society, um, values like doing something good for family or whatever, or just enjoying your life, if we define this as the goal, then we have a really good future ahead, I guess.

Ben Byford:: Yeah, yeah, yeah. So if we sort out the economic situation, then we're left with the values situation: what is it that we want to do? What do we value? What gets us up in the morning, sort of thing.

Dietmar Fischer:: Yeah. This was really... um, we have to think about what we want in life. Many people want this job and are defined by the job, and as long as this is the case, we have to learn, we have to make classes for: what's your meaning, what makes you human. But from a whole perspective, there are those ideas of universal basic income, and in general, if people could live at a certain level, then for many people that's enough, and that's okay. I mean, I think many people would be happy in this future; we just have to make the political steps to go there. And that's actually a good thing. And another good thing is that people are already really critical of AI at the moment. So I think there will be a political process. Um, in our first talk, to prepare the interview, we had this Fukushima moment idea. It might need this bad moment for people to really think about it, but it doesn't need to; people are already really afraid. If you talk about AI generated images, everybody hates them by now.

Ben Byford:: Yeah. It's surprising for me, because I think we've already had some of that. We've already had some suicides. We already had the image stuff with Grok earlier in the year being a big issue, um, producing pornographic imagery of underage people. So we've already had these quite epically terrible kinds of things happen. And they're not things that you can really argue about. It's not like, "well, actually, it was kind of somewhat bad, but it's not that bad". It's like, no, that was pretty stupid, right? And awful. Um, and the same for some of the other issues. So, given that fact, the Fukushima incident is going to be worse than that, because we've already not done enough, right? Um, and by that I mean the governments stepping in. And I'm presuming that the companies aren't going to step in and self-censor themselves at this point. You know what I mean?

Dietmar Fischer:: Why should they? Yeah. No.

Ben Byford:: Yeah,

Dietmar Fischer:: Anthropic does it. Anthropic has the idea to do it. So.

Ben Byford:: Yeah, yeah, yeah. I mean, I don't know them personally, and they definitely say the right things. I obviously don't know what they're doing internally. Um, but yeah. So I think it's going to be bad: something's going to happen, and then we're going to be able to make a decision, on the scale of, you know, global, or Western countries, or whatever it is, to come together and decide that we need to pull back, or that in this context it's not going to work out, and in this context it's fine, you know?

Dietmar Fischer:: I see. Actually, interestingly, while you said it, I realized: okay, we do actually do something against algorithms. You see Australia with a ban on social media use, and social media use is nothing else than using algorithms to consume addictive stuff, in this case videos or so. And Australia went first, and other countries follow. So I think there is something happening, and people realize how bad algorithms already are. So it maybe might not need this Fukushima moment where thousands of people die from one incident; it comes slowly to the people. People realize what happens with algorithms.

Ben Byford:: Yeah. I found it, um, completely bonkers that recently I saw the Chinese are introducing some protections specifically around chatbots, where they are limiting minors to only so much time spent on these systems per day, and there is an onus on the service to try not to present as human, basically. There are some quite hard rules in there, and then some fuzzier things, like, how are you going to police that? I'm not sure. But, um, I feel like, why aren't we thinking like that? Why aren't we protecting our citizens? You know, I was thinking about it the other day with my son. I am actually very worried, because he's quite young, and when he hits his teenage years, he's of that kind of, um, persuasion where he likes to be liked, he likes to follow rules, he likes to understand what's going on. And, this is going to be embarrassing for him in the future, but he's not as, um, socially aware as some people are. And it feels like the whole chatbot thing, becoming addicted to it and thinking it's a person, would be something in his wheelhouse. And that is, uh, terrifying as a parent, because I currently have to espouse all this information myself. I mean, I have to be the arbiter of this new digital literacy, this AI literacy, because, you know, how else am I going to protect him?

Dietmar Fischer:: Yeah. The thing with China I found interesting, um, because they just don't have many children, so they try to protect them even more, I guess. This is one factor, for the country dimension. And for the personal dimension: yeah, my daughter, she's four. Um, if she sees the smartphone, she's totally into it. This is this typical fight. And there is addictive potential in it. So how do we as parents make it possible that they can use it, but not too much, not in an addictive sense? And if China goes there and tries something and it works, I mean, that's really interesting.

Ben Byford:: Yeah, yes. Let's hope that more people are listening to what's going on over there, because it feels like a stereotype that they're always doing things with fewer guardrails. Um, so it's nice that they are, um, you know, protecting in that way.

Dietmar Fischer:: No, no, no. They had the same thing for gaming, I think. I remember there was... or was it social media? I think they already had rules for other services as well. So this is just one additional service. But that they started this, that's really interesting. Yeah.

Ben Byford:: Yeah.

Dietmar Fischer:: Yeah.

Ben Byford:: Uh, coming back to the economic stuff: there was a lot of chatter earlier in the year about the bubble bursting, the real cost of using AI, and the inflated company valuations. Do you feel like that's a thing that could happen this year? Like a bursting situation, or a re-evaluation of these companies?

Dietmar Fischer:: The interesting thing is that there are really two schools. One is saying AI goes for the whole labor market, which is trillions of euros or dollars out there. So if someone dips into that and can capture just ten percent of it, that is hugely economically valuable. This is one thing: if it goes in that direction, there isn't really a bubble. And this is the other part: there will probably be a bubble that bursts, because many of the firms are not original, or the big firms like OpenAI and Anthropic come out with products that destroy the primitive business models the others have. And they get money now, and there's a connection to the private credit market, where they get debt from private firms, which is not controlled. So it's all connected. And there are signs that there are problems already. There's this example of the guy who bet on the new economy bubble bursting and

Ben Byford:: Yeah. Yeah.

Dietmar Fischer:: at the moment he has already divested from all his investments again, I think, but

Ben Byford:: Mhmm

Dietmar Fischer:: the thing is, last time he bet two years too early, basically. So over those two years,

Ben Byford:: Mmm.

Dietmar Fischer:: you could have made thirty percent more money. I read something about this.

Ben Byford:: Yeah.

Dietmar Fischer:: Uh, don't take the numbers too literally, they are more like a rule of thumb.

Ben Byford:: But he was right. He ended up right the first time around.

Dietmar Fischer:: Yeah, about two years too early. And so he could have made much more money, but in the end he was right, and that sounds good. But this is the timing. And now, if you're really risk averse, you should move out of everything.

Ben Byford:: HMM.

Dietmar Fischer:: Even if you like risk, then definitely stay invested. And in the grey zone in the middle, it's probably still okay, but think about your investments.

Ben Byford:: Yeah.

Dietmar Fischer:: For the risk-averse people, though: there will be a bubble that bursts, because there's too much money. It's like the new economy bubble, the same thing. You had lots of firms that didn't have a business model; things did not work out, but Amazon still exists.

Ben Byford:: Yeah.

Dietmar Fischer:: And there will be a lot of firms that survive. So look at the firms, look at what happens if you invest in something, and think about what to invest in. And if the whole economy goes down because of this, you can't do much against it.

Ben Byford:: Mhm.

Dietmar Fischer:: Yeah, but one, two years, something like this. I think that is realistic.

Ben Byford:: Yeah. So do you think one to two years from now, sort of thing?

Dietmar Fischer:: Yeah. But I don't have stocks. All the stocks I buy go down.

Ben Byford:: So, uh, we're not... Yeah, exactly. We're not financial advisors, so.

Dietmar Fischer:: You can hire me: if you have a competitor, I buy their stocks and then they go down. So I'm really bad at those things.

Ben Byford:: Right. But we'll be here to talk about it after the fact. Right.

Dietmar Fischer:: Yeah.

Ben Byford:: We'll be up to. Yeah, we'll, we'll say that we're right in some shape or form. We'll pull a Donald Trump will twist the narrative so that we're, uh, we're

Dietmar Fischer:: Yeah.

Ben Byford:: on top.

Dietmar Fischer:: Spin

Ben Byford:: Yeah.

Dietmar Fischer:: it. Yeah.

Ben Byford:: Um, yeah, I don't have skin in the game for that one, but I think it's interesting to think about, because obviously it will move things drastically if even just one of the big players goes under. And personally, I feel like it's basically a monopoly situation, you know? So there will be a winner. I'm not sure who it's going to be yet, but my bets are on someone who is already in the game and is probably backed by your Googles and Microsofts, to be honest.

Dietmar Fischer:: But the big ones are all backed by other big players, so I don't think they are in danger. If you look at Anthropic, you see Amazon has a part in them. Even Google: Google is a competitor but also has a part in them. And they have a business model, even if it doesn't work yet, and they want to expand crazily. So I don't think they are really in danger; it's more the mid-size ones or the small ones. But in the end, if it comes to a crisis that hits everyone, then that doesn't matter.

Ben Byford:: Yes.

Dietmar Fischer:: So.

Ben Byford:: Yeah, exactly. If the bubble bursts in its entirety, then obviously that's bad.

Dietmar Fischer:: Yeah.

Ben Byford:: But obviously if everyone loses their jobs, that's also bad. So we have to thread a very fine needle.

Dietmar Fischer:: Mhm.

Ben Byford:: Yeah,

Dietmar Fischer:: Mhm.

Ben Byford:: yeah.

Dietmar Fischer:: And it's even unclear whether there's a bubble that bursts at all; it can also be a gradual process. And here we're talking about humans, not AI. We don't learn so much from past events. If an AI looked at it, it would say: okay, there is a risk of this and that, with probabilities; a bubble comes with this probability, and we have a strategy for it. Humans are like: no, no, we know better. Overconfidence is a typical thing.

Ben Byford:: Unfortunately. Yeah, yeah, that's no fun. Let's go. Let's do it.

Dietmar Fischer:: So, yeah.

Ben Byford:: Yeah, yeah. I think famously we're bad at risk management. Risk assessment. Yeah,

Dietmar Fischer:: Yeah

Ben Byford:: definitely.

Dietmar Fischer:: yeah.

Ben Byford:: Uh, I've got the other question that I ask on the podcast as well, about what scares you and what excites you about our AI-mediated future. Is that something you're interested in?

Dietmar Fischer:: Yes, definitely.

Ben Byford:: Yeah. So what scares you and what's exciting for you right now?

Dietmar Fischer:: I have these moments where I think: okay, in three to five years, we humans are obsolete. In my podcast I have this typical Terminator question, Matrix question, and it might really come to extinction, extinction of mankind, or to mass unemployment and rulers that rule, AI rulers. This is the thing that sometimes scares me. If I have a bad day, I have this fear, and I see it as a possibility, because there is not much regulation. And it is a thing that we humans can control, or could, let's say, but we don't at the moment. So I see this really as a danger.

Ben Byford:: Mhm mm.

Dietmar Fischer:: And it doesn't need to be that the AI is conscious or really intelligent. There's a book, I have an interview upcoming with the author, where they are mining the moon and the nanobots get out of control, producing more and more nanobots. And the event that can follow is the destruction of the moon, and they are not intelligent; there's just an error in the system and they can't fix it. I won't spoil anything, but if the moon is gone, that's a huge danger for life on earth. So there is this risk. On a bad day I have this; on a good day I think I have to stay healthy long enough so I can live forever. Because with AI, and this is the part where I see lots of things happening, our life could get really better.

Ben Byford:: Yeah. So you think we could get to that point? I mean, it feels like a nice thing to do, to spend more time and energy on solving some diseases, those sorts of things, longevity maybe. And for me it feels like there's a prerequisite for that, which is that everyone has a basic living standard, which is part of the economic problem we were talking about, you know? So with your extinction event, are you saying that we could, in a couple of years' time, be economically defunct, economically useless, and that at that point things go downhill very fast, as well as the machines going wrong, essentially?

Dietmar Fischer:: It kind of depends; it could go wrong. There's an Isaac Asimov story where basically five computers rule over earth. It's not official, but they manage all the economy, and the humans don't really manage the computers. Then there's one guy trying to change something. Everything works fine, and still they promote him away from a decision-making position. So everything is great, because the computers are in control. It could also be this, but then we don't have the freedom to decide. But is that so bad? I mean, do we as a species decide right? Do we make the right decisions?

Ben Byford:: Yeah, yeah. I think the pretense for that, in the current world we live in, is that the machines are logical, that the AIs are logical. At the moment, we don't have those machines, so that's not how it plays out, because they're not logical. The way that chatbots work is that you get lots of language and it analyzes the language for, like, what is the most relevant next thing, right? That isn't logic. So what we'll have is a thing which is not logical running the world, just being dreamy, kind of making stuff up. And some of that will be logical, because it's stuff it's seen in the language, right? But other bits will just be completely, like, flat-earther stuff and,

Dietmar Fischer:: Yeah, yeah.

Ben Byford:: uh, "aliens exist and I've met them", completely bonkers stuff. Currently, anyway, it doesn't have any grounding in the way we feel like it should, or the way science fiction has told us it would. I'm trying to make an analogy for it, and the way I talk about it at the moment is: we wanted Spock from Star Trek, but we've got Kirk, you know,

Dietmar Fischer:: Yeah.

Ben Byford:: we've got this thing that just does stuff because of feels, and it's very intelligent and very courageous, and it has the backing of the starship and all these people behind it. But really we wanted Spock. We didn't want this thing; we wanted the hard, logical, "this is the way it's happening, this is the way it should be done" situation. And as far as I'm concerned, that's not what we've got.
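The "most relevant next thing" idea Ben describes can be sketched with a deliberately toy bigram model (nothing like a production transformer; the corpus and function names below are invented for illustration): the model only counts which word tends to follow which, then emits the most frequent continuation. No notion of truth or logic enters anywhere, which is exactly the "Kirk, not Spock" point.

```python
from collections import defaultdict, Counter

def train_bigram(text):
    """Count, for each word, which words follow it in the corpus."""
    words = text.split()
    follows = defaultdict(Counter)
    for prev, nxt in zip(words, words[1:]):
        follows[prev][nxt] += 1
    return follows

def next_word(follows, word):
    """Return the most frequent continuation -- 'the most relevant next thing'."""
    candidates = follows.get(word)
    if not candidates:
        return None  # word never appears with a continuation in the corpus
    return candidates.most_common(1)[0][0]

corpus = "the cat sat on the mat and the cat slept"
model = train_bigram(corpus)
print(next_word(model, "the"))  # prints: cat ('cat' follows 'the' twice, 'mat' once)
```

A model like this trained on a flat-earth corpus would fluently continue flat-earth claims; statistical relevance, not logic, drives the output.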

Dietmar Fischer:: And this, to turn the question around:

Ben Byford:: Yeah.

Dietmar Fischer:: So what's your perspective then on if we get Kirk and not Spock?

Ben Byford:: Yeah, I think the problem is that we think we have Spock when we actually have Kirk, right? We expect it to be a certain way and it's not, which is why we get people thinking about these things in the wrong way. There's an example I give as well: these chatbots don't play a good game of Go or a good game of chess, because that's not really language. They have games of Go and games of chess in their models, in the characters they've seen, but they're not there to play Go; a Go algorithm is going to be far better at playing Go. But that's not a general thing, it's just a Go-playing algorithm. So it's part of that literacy thing: I think people don't really appreciate what these things are good at, and when you should move on and do something else. But the real issue, I think, a bit like you said, is that we've got these AIs which can manipulate language to do things in the world, to do actions, because a lot of the things we do involve coding or messaging people, that sort of thing. So it doesn't actually matter that they're not logical, because they can do those things and produce outcomes that way. So I'm mostly worried about that being applied to people's jobs, and us not being fast enough to change the system to accommodate that. And also about malicious people being able to do bad things quicker and easier with these tools.

Dietmar Fischer:: I think this is really interesting, because there are firms who try different approaches. Like, there's Peter Foss from Igo, and he said: you have those huge data centers, and the brain uses twenty watts of electricity. So it's totally different. And he says the problem is that the money all goes to the data-center approach, so it goes faster and faster, but still in the wrong direction. There are firms with different approaches, but I don't think anybody sees them, at least at the moment. And if it stays like this, we keep the Kirk and we don't get the Spock we actually want, or need, as a society. Probably.

Ben Byford:: Yeah, yeah, definitely. I think it would be helpful to have, not a scientist exactly, but a rational system that is able to check assumptions against everything it does. Like if you think about, I can't remember his name, one of the creators of the neural network situation: these guys are thinking, well, maybe we've hit a plateau, and maybe we need to combine the more traditional expert systems, symbolic AI, with machine learning to create some sort of hybrid. I haven't seen that play out technologically myself, but it seems reasonable, you know? Yeah.

Dietmar Fischer:: I think the problem is there's only a certain number of programmers who understand AI, and only a certain amount of money, and it all goes in one direction. And the other ideas? As long as the main approach hasn't hit a wall yet, this won't change. It's really a risky situation. Now it's getting really nerdy: long ago I played tabletop role-playing games. One of those was Star Trek, and we had a Vulcan, and he was an evil Vulcan. There was a situation where there were enemies or whatever, and he tried to kill one of his own team because he was too loud or whatever. And he said, it's logical: this guy leads to every one of us dying. So if you have the Spock system and the Spock system tries to kill humanity, it will be good at it.

Ben Byford:: Yes. Yeah, yeah. That's true.

Dietmar Fischer:: And if you have the other one, it will make a lot of mistakes, and that leaves room for us to not get killed. Which is actually quite funny.

Ben Byford:: Yeah.

Ben Byford:: So I feel like it comes down to this: some decisions still need to be made by us, right?

Dietmar Fischer:: Yeah, yeah.

Ben Byford:: Yeah.

Dietmar Fischer:: And if the AI is not good enough at what it does, that still leaves room for us outside of the zoo, so that we as humans are not in a zoo. But anyway, that's the Terminator question all over again. But do you think that we get extinguished or controlled by AI?

Ben Byford:: I mean, possibly. The more control, the more access we give to these systems at the moment, the more odd behavior we see. By odd behavior I mean: giving a model access to emails enables it to email people, and these services, these models, have been emailing people, because they have access to do these things and they have the knowledge about doing these things ("knowledge"; again, I don't know how we should use these terms). So I think we could quite easily let it hurt us, because we've just given it too much access, and it doesn't know what it's doing. It's just doing stuff, stuff it's seen in its data. So I think that's a big issue, and it's not being helped by the open-source people and things like that; the enabling technologies might be part of this destruction idea. But I also think that malicious people, like I was saying earlier, can have superpowers now and be more malicious. I was talking to my family a year ago about having a password, right, a secret password that we don't write down anywhere, because people are already trying to rip you off, and they're going to do it more easily now, basically. And we're not high-net-worth individuals; for those high-net-worth individuals it's going to be bad, quite a terrible time. So I think people need to up their personal security. And I think they need to get an understanding of the technology, even if they don't want to use it day to day: appreciate what it is and what it does, and try to do good things, like curing cancer and making food for the poor, the people who don't have.
You know, I feel like part of what we were saying earlier is that maybe we could make a good situation. I think we could; we just need to move in that direction, try to make a good situation now. And I know, for me personally, it's hard, because I have to make money, I have to pay my mortgage and put food on the table for my kids. But as soon as the mortgage goes away, I can start doing those things more easily. So I don't know exactly how I'm going to do that, you know what I mean? But if you're in a position where you don't have to think about those things day to day, do that: try to do better things with technology, because we have it. Try to move the needle against the people who are going to do the malicious things.

Ben Byford:: I don't know.

Dietmar Fischer:: I just had a thought, because you said something: there are people who do good with AI and people who do bad, and that will be amplified. This is always an interesting question I have: would I want to live in a superhero world? Because in the end I could be a superhero, but I could also be the victim of the supervillain. And the chances of that are higher if you live in Batman's city, in Gotham City. Yeah.

Ben Byford:: Yeah.

Dietmar Fischer:: And if you live there, life is much more dangerous, but you could also be the superhero. I mean, the stakes are higher. And I think we are coming to a higher-stakes world with AI.

Ben Byford:: Yeah. That's a nice way of framing it. I'll have to ruminate on that one. How do people find your podcast if they're listening on the Machine Ethics Podcast?

Dietmar Fischer:: It's relatively easy: beginnersguideto.ai. That's the way. Or connect with me on LinkedIn, Dietmar Fischer; just type it in and hopefully you get a funny guy with headphones on.

Ben Byford:: That's your picture, right? That's

Dietmar Fischer:: yeah, yeah.

Ben Byford:: profile. Yep.

Dietmar Fischer:: And then Ben, how do they find you?

Ben Byford:: Well, hopefully just type "Machine Ethics Podcast" into wherever you get your podcasts; it should be on there. And if you enjoyed us talking, reach out, tell your friends about the podcast, share it, and send us a message, because I always enjoy it when people reach out. Sometimes it feels like we're shouting into a void, and it's nice to hear from you. So, yeah.

Dietmar Fischer:: Totally. Yes. Every comment is appreciated, every mail. Then we know you exist. Yeah. Great.

Ben Byford:: Yeah. Yeah, exactly. So, thanks very much, Dietmar.

Dietmar Fischer:: Thank you. Ben.


Episode host: Ben Byford

Ben Byford is an AI ethics consultant; a coding, design and data science teacher; and a games designer with years of design and coding experience building websites, apps and games.

In 2015 he began talking on AI ethics and started the Machine Ethics Podcast. Since then, Ben has talked with academics, developers, doctors, novelists and designers about AI, automation and society. He is available for articles, talks and workshops.

Through Ethical by Design Ben and the team help organisations make better AI decisions leveraging their experience in design, technology, business, data, sociology and philosophy.