Facing the Inevitable: Technology in the 21st Century

How do you think these technological advancements mentioned in the book will lead to a more communal society? Are there any potential drawbacks?


Welcome back to "Building a Coaching Culture"! In this episode, we dive into the fascinating topic of technological advancements and their impact on our society. From the colonization of Mars advocated by Elon Musk to the increasing number of sensors tracking our every move, we explore the opportunities and challenges that arise. We also discuss Kevin Kelly's book, "The Inevitable," which predicts 12 technological advancements that will shape the next 30 years. As we navigate through these transformative times, we reflect on the communal aspects of digital culture, the exponential growth of technology, and the need to address potential risks. Join us as we delve into the future and discuss what lies ahead in this ever-evolving technological landscape.


In this episode, you'll discover:

  • The benefits and risks of sensor data availability
  • Discussion of the book "The Inevitable" by Kevin Kelly
  • Monitoring consumption of resources through sensors
  • Predicted impact of technological advancements on society
  • The game-changing emergence of ChatGPT


I invite you to listen to the full episode to gain deeper insights into these thought-provoking topics. Link to the episode in the comments below!


Building a Coaching Culture is presented by Two Roads Leadership

Produced, edited, and published by Make More Media

Building a Coaching Culture - #76: Facing the Inevitable: Technology in the 21st Century

Lucas Flatter: [00:00:00] There's this whole developer joke about how all we do is Google on Stack Overflow, which is like the question-and-answer site for programmers, and it's completely true. Why would I hit my head against the wall when I can just see if somebody else has already had the problem?

J.R. Flatter: Hey, welcome back, everybody. It's J.R. Flatter with our co-host Lucas. How are you doing?

Lucas Flatter: Pretty good.

J.R. Flatter: So they got Declan's haircut yesterday. Was that traumatic, or relatively?

Lucas Flatter: I think it went well. Lena went without me.

J.R. Flatter: Oh, okay.

Lucas Flatter: She didn't report any freak-out.

J.R. Flatter: So what we're talking about today in our Building a Coaching Culture podcast is this book, The Inevitable, by Kevin Kelly. Hardly a day goes by that I don't talk about this book, and it really surprised me that you and I hadn't talked about it yet. When I'm teaching, I'm always referring to this book and its premise. It was written in 2017, which is nearly seven years ago now, depending on when you're listening to this. He predicted 12 things that are technologically inevitable across the next 30 years. Rereading the book recently and looking back seven years, a lot of what he talks about is coming true already, but we're also seven years into the 30, so we really only have 23 years until all of these things come to fruition, or some generation of them. Kevin Kelly, if you don't know, is the co-founder of Wired magazine. He's a really strong voice in the world of technology, and by his own admission, a couple of decades ago he was relatively libertarian, which requires one to be an individualist. Yet he sees a very communal requirement emerging from these inevitable technologies. So I just wanted to plant that seed as we walk through here.
What does that mean for me at the tail end of the baby boomers, for you at the tail end of the millennials, for your son, for all of my grandchildren, and for a lot of the listeners and viewers? What does it mean to us across the next 30 years? And then, as he wraps the book up, he tells you all these things, and the twelfth thing he tells you is that we're at the very beginning of this transformation, the very beginning of the third millennium. It's a really incredible book. What are your initial thoughts? I know you've read it and took a recent look at it.

Lucas Flatter: I mean, yeah, you think about all the progress we've made with the internet, and there are all these theories on what's going to happen to the economy because of it, and what's going to happen because of AI. I think this book says we're at the beginning of another radical reinvention of the office, work, data, and things like that. So it's looking forward: instead of just saying, look how much the internet's revolutionized everything, it asks what's going to happen next to everything.

J.R. Flatter: Yeah, and I think that's one of the reasons I like to talk about the ending at the beginning, at least when I'm talking about this book and these ideas: to plant the seed early that all of these things, as magnificent as they are, and some of them come with significant risk, magnificent and risky, are only just the beginning. One of the things he talks about is how unimaginable the future is to us, introducing new realms of the possible. You and I talk about it in our coaching professions: self-limiting perceptions, self-limiting beliefs, some of these things. When I read this book seven years ago, I hadn't even heard of the term AI, and I looked at it again today, and wow, now it's part of the everyday vernacular, everyday language. Whereas seven years ago it was just something geeks might have talked about, right?
Not a business person or whomever else. You coming from that world, but also talking to leaders, I think is a nice, unique intersection in itself. So first I thought maybe we'd talk about all 12 of them together, but that didn't really make sense as I looked at it, so I'm just going to talk about them one at a time. Jump in whenever you want, as always, and if we want to link them together or reference back and forth, let's certainly stop and do that.

The first one he talks about is becoming. Sometimes the titles of his inevitable categories are intuitive, and this one is kind of intuitive, though it really wasn't to me from the title he chose. The becoming relates to something we talk about a lot in our leadership development and coach training: entropy. He doesn't mention that scientific term by name, but what the chapter really reminded me of, and he says it, is that everything is falling apart around us. You buy a new computer, you buy a new phone. The scientific principle of entropy is just that: the world's tendency is toward disorder. And what are we doing to account for that tendency toward disorder, especially in this very disruptive phase of technology? The one bullet that really stood out for me, and again, I don't think he says this exactly, is the idea that there are no more upgrades. Seven years ago, when he talked about this, we were just at the cusp of having daily upgrades to everything we interact with. I get in my car now and I get a message from the manufacturer saying your map is upgrading, or some technological advance is being downloaded into my car's computer. No more versions of Windows, no more versions of the graphical user interfaces that we use, web crawlers, however you want to describe them. They're all happening automatically.
When I turn my computer off every night, I have to wait 30 seconds while it tells me not to shut down, because my antivirus is upgrading every day. What do you think about when you think about this first inevitable, the becoming?

Lucas Flatter: The buzzword in programming and web applications is software as a service. So instead of "I'm going to go buy Photoshop," it's "I'm going to subscribe to Photoshop," and, like you said, continual updates. From the value-creation side, the business side, you are now in the role of continually prioritizing and anticipating problems, because what's the point of updating something every week or every day? It's because you know that entropy you mentioned is causing an issue, or the users have discovered an issue you couldn't have anticipated, and you are getting ahead of that. So you almost have to always be one step ahead in that kind of paradigm.

J.R. Flatter: Yeah, I see parallels here with very human intergenerational challenges, right? We try to put these labels on generations: I am a baby boomer, so born between 1940-something and 1960-something. But even that was back in the days when Moore's Law was somewhat predictive, and Moore's Law is long ago dead and gone. The doubling of technologies over time is now so rapid as to make continual updates necessary; think of the scale of change in 12 months, or in a single quarter. As I was rereading this book and preparing for our talk today, it struck me again and again: things that I had looked at literally only days, weeks, months, or years ago showed (a) how accurate Kelly was, but also (b) how much had changed in a relatively short period of time.

The second topic is cognifying. Again, you and I talk about technical intelligence, cognitive intelligence.
Emotional intelligence too. But I've always thought about it from the human aspect, the human brain in a cognitive mode. Kelly talks about it as technological cognification, and specifically talks about artificial intelligence. This is probably one of the more glaring "wow, that really happened" predictions. Here we are, where we are in time, and AI is here. It's very real, very accessible, and very valuable, even to a layman like me; I'm not a technology guy. Kelly talks about adding AI to X. And again, to jump to the end, we're only at the beginning. ChatGPT just hit the market within this calendar year. I'm reminded of when the iPhone hit the market and all the peripherals started to come out, and companies emerged from nowhere. OtterBox, for one, right? The OtterBox case to protect your phone: nothing to do with Apple, but it just added X to Y, and that's what Kelly's talking about in cognifying. How do we take artificial intelligence and add it to almost unimaginable tasks? I know you're a computer scientist; you're probably using this every day already.

Lucas Flatter: I think something these AI tools have been really good at, and this is kind of by definition with these language models, is that they take something you understand, turn it into data, and then turn the data back into something you can understand. So if you collect a bunch of survey data or user data, whatever data you're already collecting, plus AI, you get the insight that you would otherwise need an expert analyst, someone with a lot of existing insight, to look at the data and pull out. The AI can tell you a story, and that's super powerful.

J.R. Flatter: Oh yeah. I have a background in statistical analysis, and I consider myself pretty good at looking at data and teasing out relevance: "Oh, wow, I think I might see a connection here," or "perhaps you notice a theme emerging here." But that's at the speed my brain works.
Now you add AI to that and it's infinite speed; it's what I would call exhaustive. When you get a doctorate, you write a dissertation, and part of that dissertation is a literature review. You're supposed to review the literature and write about it concisely in a chapter, and I don't care how good you are or how hard you work, you're nowhere near exhaustive. But AI, in the blink of an eye, can produce an exhaustive review of the literature and present it in a consumable format that a layman can use. You don't need that cognitive intermediary that I used to be, looking at hundreds of thousands of data points as the user interface between the data and the decision maker, if you would.

Lucas Flatter: Yeah, and then think about it in the reverse direction. Say I wrote an essay and I want you to tell me how effective the theme is. These tools can give you some objective measurement: how often did you mention it? How strong was the language you used when you mentioned it?

J.R. Flatter: And that's the part where we talked early on about risk. One of the risks of artificial intelligence is that even though it's reviewing the entirety of knowledge, which is a metaphor more than a reality, it still has an algorithm. That algorithm is created by a subjective human being, or a collective group of subjective human beings. It gets tweaked by human beings. Artificial intelligence gets more intelligent, and so it becomes knowledgeable unto itself. So how do you control for that? We're trying to be as objective as possible, but no human being is truly objective, and human beings write the algorithms. So how do you know (a) that the answer you're getting is "objective," making quotations with my fingers, and (b) that it's not purposefully agenda-driven?

Lucas Flatter: I mean, I think that some
of the other markers, or what does he call them, the forces, kind of get at that libertarian ideal, where this is no longer centralized behind Twitter and Google and Facebook. But yeah, the only way to have it be more of an objective truth is to get everybody's input on what's valuable and what's true.

J.R. Flatter: I don't know why, but I'm just getting this vision of Neo running through the Matrix, thinking it's real. And that's you and me as we interact with AI now: you ask it to write an essay for you, you read it, it seems to make sense, seems to jibe with the knowledge you do have. But are you ever really going to know?

So the third topic Kevin talks about, one of his inevitables, is flowing. I really love this idea. When I first started reading the chapter, I had to read it a couple of times before it really sank in for me: the idea of infinite duplication. You and I might have talked about this last week in a session, but when the library at Alexandria burned, all that knowledge was "lost," and that will never happen again because of infinite duplication. Right now I have pictures; one of the biggest tragedies of a house burning, setting aside the risk of injury or death, is that even if everybody survives, everything is gone. Pictures are gone, love letters are gone, people's entire histories are erased in a blaze. Those days are gone. Even in the short time I've been working with technology, there have been several reinventions of how one creates this duplication, and now it happens behind the scenes and I have nothing to do with it. You know the metaphor of the cloud, and I know our technology team has put tools in place so that when I close a document, it's not just sitting on the hard drive in my computer; it's in this cloud that can't burn and that has duplication. Even if that physical presence were destroyed,
there would be a duplicate of it somewhere. So that was the first part of flowing, this idea of duplication. The second part is the literal death of the office metaphor, right? I didn't even realize this until I read it in this book, but we come kicking and screaming into the 21st century with all of our old labels: files, desktops, as if there were a physical filing cabinet, as if there were a physical desk, and we even have those icons on our computers. Those days are going away. What do you think about this idea of flowing?

Lucas Flatter: Yeah, I think about building a tool. For example, if you wanted to build an AI tool and you said, "I want to take sports data and turn it into a story or a narrative," you can connect to a data source that's providing every new match of professional sport X, Y, Z, and maybe you connect to a separate database to get demographic information about your users. There's all this power you can just tap into that other people are creating. And like you said earlier with those cottage industries, maybe I provide a particular source that nobody else can, and that's now the value I'm providing to this infinite stream.

J.R. Flatter: Yeah, and you just connected two dots for me, because one of the things I talk about a lot, and I learned this at probably about your age, not for any particular reason, is this idea of self-selection: the world asks us to self-select, not to wait to be asked. So if you see a connection, if you see a way to use flowing, to use infinite duplication and create value, you don't need anybody's permission to do that in this world. Twenty, thirty, forty years ago, the barriers to market entry went beyond the capital resources necessary: bureaucracies and licenses and other inhibitors. Those are fading away.
Kelly talks about it a little later on in this idea of the communal world we now live in. So, number four. There are 12, so now we're a third of the way through: screening. And he means literally screening. This is one where he later talks about filtering, and "screening" could mean filtering, but here he literally means looking at a screen, not screening as in filtering. He talks about this clash of cultures, and I can see it in myself even today. As you and I are having this conversation, there are at least three very common formats of books, if you can even call them books now. A book used to be a physical object with a cover and pages. There's this clash of literal books versus books on screen, and now we have books on audio. A decade or two ago those were physical, on a cassette tape, but now, given the cloud and streaming and some of the other technologies, you can literally have a book in your hand, a book on the screen, or a book in your ear. This clash of books versus screens goes far beyond the physical to the cultural. It touches something you and I talk about a lot in our training and education: what is truth? As I read, this really brought to light some of the challenges we see in our culture right now, because there are different truths in different generations: different principles, different values, different goals, different life goals, different work-family-self balances. Now Kelly is introducing this clash of cultures, the physical versus the digital, and boiling it all the way down to a definition of truth. To the book generation, and I won't call it Boomer or Gen Z or Gen X, but to the book generation, truth was law. What did the law say? What does the Constitution say? What does precedent say? But in the screen world, and this is Kelly's definition, truth is what the technology tells us.
Lucas Flatter: And I would say the pattern I'm seeing is getting a little worse. We had broadcast TV, so there were a few sources of information that would populate your screen with audio and video. Then came VCRs, game consoles, and now streaming devices: you would bring your own source of information and just plug it into the screen. But as far back as the iPad, or even the first smartphones, the screen is now attached to a particular provider. So if you have a device where the screen is attached to it, there's an intermediary between you and whatever truth, and you have less control. Right now we still have monitors and TVs that are kind of generic and agnostic to whatever is displayed on them, but that's going away, and I think that's where our interaction with screens is headed in the next couple of years.

J.R. Flatter: I have a recent example. I was at a trade show and I bought a device that was separate and distinct from my phone. So for the first time in a long time, I have this device that I have to power, plug in, put sensors on me, and it provides massage and different things I can set to different settings. It just struck me as I looked at it: I haven't had a separate device in a long time, and how long is this separate and distinct device going to survive before it gets subsumed by some of these inevitables we're talking about here?

Lucas Flatter: One other example that comes to mind before we move on: all those diving computers and adventure computers. Now the new Apple Watch does all those things. It's this tiny sliver of the demographic that would need that, but they just destroyed that device class, like you're saying.

J.R. Flatter: Absolutely.
And therein lies the beauty and the risk of this time we're in. So, number five: accessing. Not assessing, but accessing: how do you gain access? I really, really enjoyed this chapter and this inevitable because I'm living it. Within the last 12 months I have transitioned, and this seems insignificant, but for a boomer it's not, from cable to streaming. You want to talk about a learning curve. Where do I get live television from? Where do I get all of these shows that I've been accessing through cable television, through this decades-old relationship I had with them, with tens if not hundreds of thousands of dollars, and the inconvenience of a physical cable that I had to plug everything into? Now, 12 months in, maybe even 18 months in, I downloaded a new app last night because there was a new show I wanted to watch. I did have access to it, and it seemed rather seamless. Kelly talks about accessing versus possessing, right? For my entire life I've heard and said the phrase "possession is nine-tenths of the law." There's even precedent in law: who has it physically? Well, those days are going away. There are things we still physically possess. I have this pen, this phone, this computer, this desk. But a lot of the value I have, I don't physically have; it's floating digitally somewhere. Why would you own anything when you can borrow everything legally? Not just your neighbor's lawnmower, but movies, cars. Probably the one that struck me the most was the idea of dematerialization. I know that sounds very fancy, but think about the size of this phone versus what it would have been 10 or 20 years ago. Things are just getting smaller and smaller, less material. It's not the idea that we're becoming less materialistic.
It's the idea that literally the materials we're using are getting smaller and fewer, and we're borrowing a lot more rather than having to physically possess things. What do you think about accessing? You're a streamer, I know you are.

Lucas Flatter: Yeah. Yeah.

J.R. Flatter: I learned from you.

Lucas Flatter: Yeah, I mean, think about going back to cassettes and vinyl and CDs. It's physical media that you need technology in order to access. I guess you could potentially build your own record player, but for anything more complicated than that, you need to ask Sony or Hisense or whoever to make you a VCR, because you're not going to figure that out yourself. So from a consumer's perspective, it's: okay, if I've got to plug this DVD player into my TV anyway, why am I holding this physical totem of the movie instead of just watching it? But then, going back to those centralized sources: now they know when you're watching something, how many times you watched it, probably who was in the room when you watched it, et cetera.

J.R. Flatter: Oh yeah, this is where we're getting into tracking, which comes later on. I was reminded as recently as yesterday that we're not completely into this accessing place, because there are those red boxes outside of drugstores where you can still go physically.

Lucas Flatter: Yeah, they're just called Redbox.

J.R. Flatter: Yeah, Redbox, that's what they're called. So we're not completely through this transition. But the Redbox even tells you: you don't really need to do this, because you could just stream it from here. So they're in the middle of that transition. One of the classic business stories is Netflix versus, what's the name of it, Lucas? Yeah, Blockbuster. There's one Blockbuster left, at least there was last time I checked. They didn't make that transition. Hopefully Redbox can.
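Lucas's point that streaming platforms log every view can be made concrete with a toy event log. This is a minimal sketch, not any real service's telemetry; every user, title, and field name below is invented for illustration.

```python
from collections import Counter
from datetime import datetime, timezone

# Toy model of the viewing telemetry a streaming service might collect.
# All names (users, titles, fields) are hypothetical.
watch_log = []

def record_view(user, title, when=None):
    """Append one playback event, the way a streaming client reports home."""
    watch_log.append({
        "user": user,
        "title": title,
        "at": when or datetime.now(timezone.utc).isoformat(),
    })

record_view("viewer_1", "Some Documentary")
record_view("viewer_1", "Some Documentary")
record_view("viewer_1", "A Live Broadcast")

# The provider can now answer "how many times did you watch it?"
views_per_title = Counter(event["title"] for event in watch_log)
print(views_per_title["Some Documentary"])  # 2
```

Once events like these flow to a central source, the "who was in the room" questions become a matter of adding more fields, which is exactly the tracking theme picked up later.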
So, from accessing to sharing. This is probably the hardest one for me to swallow, just because of my own life view: the communal aspects of digital culture. You know a lot more about this than I do, so I'll be quiet. But the 20th-century version of this, well, I grew up as a farmer, and we had co-ops where the farmers would get together and buy things in bulk so they could share in the scale, or a communal farm where a group of people would get together. Back in the frontier days, we needed those groups for protection. But now we're doing this kind of communal sharing in a digital world. I'd really love to hear your thoughts on this communal aspect.

Lucas Flatter: Yeah. Part of it is the fact that we can share so much. If somebody invents a solution for something that just works a hundred percent of the time, it's probably going to be shared across industries, and whether it has a single source or not, it's the same algorithm and the same piece of code. There's this very egalitarian idea in software development: there are plenty of industry-wide pieces of code, libraries, that are completely open source. Anybody can update them, anybody can download them, anybody can modify them and even create sold products with them. If you're developing something, you're always thinking: what's out there that is already being shared that I can use? And even if I'm not using any libraries, I can just say, "oh, I have this problem, let me look up people who have already asked this question." There's this whole developer joke about how all we do is Google on Stack Overflow, which is the question-and-answer site for programmers, and it's completely true. Why would I hit my head against the wall when I can just see if somebody else has already had the problem?

J.R. Flatter: I'd go right back to AI and the battle AI is having with traditional education. It reminds me of when I was young, even before I was a teenager, and a company called Texas Instruments came out with a handheld calculator that a normal person could afford. The entire education system went up in arms: now children are going to be stupid, they're going to cheat on tests, et cetera, et cetera. You're hearing the same chatter with AI. I think it was Einstein who said, why would I memorize anything that's already written down? I can just look it up in a book. The same is true for me with AI. And it comes to a larger theme of all these inevitables, and a larger theme of change: you could wish to sweep back the ocean, and good luck trying. These things, as Kelly says, are inevitable. So even for me: get on board. Be an adopter rather than, what would be the exact opposite of an adopter?

Lucas Flatter: A naysayer. A Luddite.

J.R. Flatter: Yeah, Luddite. I was looking for that word today, because one of the topics is really relevant to the Luddites: the idea that everything that's ever going to be invented has already been invented. So how do I reconcile capitalism with the communal nature of digital culture? Because capitalism, for better or worse, is me seeking my interest in the marketplace. We talked about OtterBox; we talked about self-selecting if we see an opportunity. So how do you reconcile individualism with this communal 21st-century world?

Lucas Flatter: Yeah, I was actually thinking about that when I was reading Jurassic Park last year. That was written in the nineties, and they're talking about how everybody thinks the internet is going to make everyone more individualistic and produce more innovation, all these ideas popping up all over the place.
But in a lot of ways, what they were predicting in the book was that it just pushes some voices to the top, and you might lose some of that innovation. Because, like I said earlier, my convenience is that I don't have to solve this problem; but potentially, if I solved 10 problems for myself, maybe one of them would be unique and innovative, and I might be avoiding that. So I don't know the answer, but I think maybe it's going to reduce individualism for a while.

J.R. Flatter: All right, number seven: filtering. I go back to the idea of screening; this one literally is filtering, as you might think it would be: reducing the size. Kelly uses a very interesting example here: all recorded music is about 720 terabytes, and in 2017, when he wrote this book, it would have cost you about $72,000 to buy that much storage. So every song ever recorded, you could have had in something the size of a brick for $72,000. Now that same storage runs closer to $7,200, a tenth of the price, and projecting to the future, I'm sure it's going to go down by a factor of 10 again. So with all of that available to you, how do you filter all of that information into some usable framework? In the military world there's this idea of Joint All-Domain Command and Control, a very fancy way of saying "across all of the ways that one could fight": land, sea, air, space, and cyber are the five domains. Getting all of that information to every warrior is a great ideal, but with infinite data, how do you filter it? That's what inevitable number seven is all about: finding a way to filter. And I found myself doing it just today. I'm out having my car washed, and literally the guy is 10 feet away, but I used the front desk. I left the key at the front desk and asked the attendant, hey, when the car wash guy gets here, can you please interact with him?
And filter out that one activity, which allows me to do another activity. So it's very real.

Lucas Flatter: Yeah, that story made me think: you have all these experiences all day, every day, and how do you even make sense of any of it? Usually it's through some kind of narrative; there's a beginning, middle, and end to the day, and this person went against me, this person helped me. Right now you think about these filters as: okay, the algorithm is recommending me something because I listen to this, so they think I want to listen to that, or because I'm this age. And that's the 2017 version, probably. At some point you're going to see these meta-narratives: oh, you're going on a musical journey, and you're going to have all these different experiences, and we're planning it out for you, making it more of an interactive journey. I think one of the twelve is interacting, so that's part of this futuristic experience we're heading toward.

J.R. Flatter: Yeah, you've just coached me, because this is exactly what a coach does: you connected two dots for me. You and I talk about principles all the time, and I just connected the dots: principles are a way of filtering. Not right or wrong, not judgmental: this is for me, and was for our family, and perhaps even for our company. One or two of our principles help us filter what we say yes to and what we say no to, without having to go through those terabytes of data.

Lucas Flatter: And relationships. There are a lot of human filters too: by having a wife or a child, you're now having a completely different experience.

J.R. Flatter: Yeah. Saying yes to one thing closes the door to many other things.
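The storage figures quoted above are easy to sanity-check with back-of-the-envelope arithmetic. Using only the numbers from the episode (720 TB of recorded music, $72,000 in 2017), a tenfold price drop lands at $7,200, and it takes a second tenfold drop to reach $720. A minimal sketch, illustrative only, not market data:

```python
# Back-of-the-envelope check of the storage figures quoted in the episode:
# 720 TB of recorded music at a 2017 price of $72,000.

MUSIC_TB = 720          # "all recorded music", per the episode
COST_2017 = 72_000      # dollars for that much storage in 2017

per_tb_2017 = COST_2017 / MUSIC_TB   # implied 2017 price per terabyte
after_one_drop = COST_2017 / 10      # after one tenfold price drop
after_two_drops = COST_2017 / 100    # after two tenfold drops

print(per_tb_2017)     # 100.0  -> $100 per TB in 2017
print(after_one_drop)  # 7200.0 -> $7,200, a tenth of the 2017 price
print(after_two_drops) # 720.0  -> $720, a hundredth
```

So each "scale of 10" the episode projects knocks one zero off the price, which is the whole point of the filtering problem: the data keeps getting cheaper to hold than to make sense of.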
So, number eight. I love this idea of remixing. You've referenced this indirectly a couple of times, and I go back to one of my physics lessons that always comes to mind: that the mass of the universe is constant. That's kind of Kelly's idea of remixing. The world has finite resources, we have finite time, so how do we use an inevitable to take advantage of what we have? That is the idea of remixing. We see it in [00:38:00] movies, we see it in products. Large car manufacturers almost never innovate. They follow the innovation of the little guys and gals who take the risk: keyless entry, or automatic windows, or halogen headlights. Let somebody else take the risk, and then if it works, we'll adopt it. That's the idea of remixing. And I love this quote from him: "the supreme fungibility of digital bits." I don't know why that struck me as so almost whimsical, but it's absolutely true. In the digital world, fungibility is just talking about how you can change things. This idea of infinite change within the digital world, I find really intriguing. Lucas Flatter: Yeah, the thing that came to mind for me when I read this one was [00:39:00] the whole history of video games. You can play them all on your computer now, and they used to be on proprietary hardware. So there were all these barriers: oh, I need this old Nintendo or this old whatever hardware. But now, any computer from the past five years can play everything. So I was thinking about it like, what if there were no barriers and I could jump from here to here, to here, to here? Going back to where I'm not selecting one experience and then selecting another experience. Yeah, give me peanut butter and jelly. Give me two experiences at once, just by that fungibility of bits. Take the code from here, take the code from there. That's super labor-intensive right now.
But you know, in a couple of years it could be streamlined. J.R. Flatter: Well, the idea that I could talk into my remote control and have pretty high [00:40:00] predictability that I'm gonna find the show I'm actually looking for, even if I'm pronouncing it with an accent. We're at number nine now, so we're into the last third of these: interacting. This is us as human beings interacting with the digital world, and how inevitable that is. We might commonly call that virtual reality, which, again, when he wrote this book, virtual reality was one of those phrases you might've heard occasionally, and it was in its early days of actually being available to a common person in any usable way. And the phrase that really sticks out for me, when Kelly's talking about this virtual reality and our interaction with it, was this "unshakable sense of presence." I pulled that directly from the book; that's his quote. And again, high reward, but also high risk [00:41:00] here. One of the things that perpetuates our species is our interactions with one another. And it's very common to see people walking down a sidewalk with a set of headphones on, purposely not interacting with any other human being, but interacting with the digital world. I think there's a risk here if we somehow remove the need to interact with one another completely. And what does that mean? Lucas Flatter: Yeah, when you think about interaction with the computer, it's: how am I transmitting my information to the computer? Either by typing or clicking or scrolling, or with a game controller. But all of those are so much more limited than what we do every day with our hands, picking things up, looking at things, making facial expressions. And so we're getting [00:42:00] closer to the point where, you know, right now we have the technology where I can look at somewhere on the screen, and the screen knows it, and I can select things that way.
But having everything that I can portray as a person be recognized by some kind of program, whether it's VR or something else. And then interaction: every action has an equal and opposite reaction. So because it's just my hands in the air, I need extra visual feedback that tells me, okay, the computer knows that my hands just moved. So I think about, okay, how are we gonna be interacting? Am I gonna be wearing gloves or glasses, or what's the hardware going to be? I guess, if we do get to that point, then yeah, there won't be much difference between interacting with [00:43:00] somebody across the country versus in the same room. And they've been saying that for 30 years as well, right? It's like they're in the same room with you. J.R. Flatter: I almost hesitate to say it, but we're really talking about the Matrix now, right? Neo's really in the Matrix now. And I guess this goes back to Kelly's 12th inevitable, the fact that we're at the beginning. We have nearly 8 billion people on the planet right now, and Elon Musk is talking about, yeah, we have to get to Mars 'cause we're gonna run out of things here. I suspect in my heart of hearts that we're gonna figure this out, and it's not gonna be the catastrophic outcome one might imagine from the Matrix, where none of us are really actually interacting with one another. So on to number 10, and this is one that's probably the most real for me: the idea of tracking. For years, you could buy a second home [00:44:00] somewhere in a tax-free state and claim that you lived there because you had a home there. Those days are gone. There are so many sensors, for good, bad, or indifferent, available to the world right now that the world knows where you are almost every day. Not that there's actually somebody out there watching, but there are different groups of people watching for different reasons. People wanna sell you stuff.
People want to tax you, people want to do whatever they wanna do, and they use those sensors accordingly. I travel a lot, and so every time I get on a flight, every time I get on a train, the world knows I was on that train. Rideshare: did I get a Lyft or an Uber? Yeah, the world's tracking you. The more sensors there are, the more we can be tracked, and are tracked. And how do you have privacy? The world knows how much [00:45:00] gas you're using. The world knows how much electricity you're using, how much water you're using, what you're eating, because the world knows through its sensors what you're buying at the store. Again, great reward, but high potential risk. Lucas Flatter: Yeah, I think the most negative thing I see all the time is products that are optimized based on whatever they're tracking. You can kind of tell when something is off: okay, why is this user experience different? But it's not easier for me; it's more frustrating for me. It's because of some reason, and the reason is usually, well, this keeps people on the platform longer, or it keeps you scrolling to the next thing longer. So sometimes, by tracking all this information, they change the product, and now they're not really tracking what they were originally tracking. They're doing this social experiment on people. But then, [00:46:00] from the benefit side: Brittany was telling me that her teenage daughter was saying that young people appreciate when they get recommended things based on ads. They're like, well, they know me, and it makes it easier to select the next pair of pants I buy or something.
And I guess if you think about it, the ultimate smartphone would be like a personal assistant that actually knows you. And if a real, physical personal assistant is going to follow you around all day, they're gonna know where you're going and where you're driving and everything. So it's: do I want to trust one point of failure that knows everything about me? J.R. Flatter: That's a great example. I hadn't even connected those two dots. There you go again. I spend hours and hours and hours planning travel. I like to leave at a certain time and arrive at a certain time, take [00:47:00] nonstops. I'm really agnostic on who takes me, even the mode of transportation, and where I stay, where I eat. That takes hours and hours and hours. I would love to have somebody tracking that for me, and to just say, I wanna go to St. Louis and spend one night, can you please plan that travel given my parameters? I can see where this is going, and I don't mind that. I don't mind even being tracked. I don't mind that when I buy gas in Florida, or if I buy gas in Virginia, there's a sensor and the IRS has access to that data. I don't mind that. Lucas Flatter: Ultimately, you would hope so. I brought up the example of poor optimization, but they have the ability to do it differently. Instead of Twitter optimizing for money, they could be optimizing for how much joy I'm having while I'm using Twitter. Or the government could say, [00:48:00] how do we optimize for human happiness, and then let's make the laws based on that. J.R. Flatter: Yeah, and the challenge is it's so individualized. It's literally infinite. It's the reason we all take personality profiles. We all have different preferences, and they're often, if not regularly, irrational. You can't show me the logic of how you chose your life spouse, or how I chose my life partner. But we did, and probably, given 10 choices, we would've always chosen pretty close to the same thing.
All right, number 11: questioning. I think this was probably one of the more powerful ones, and it really speaks to me across generations. I look across the room and I have three of what I would call sheepskins there, the degrees that I have, and the authority or expertise that they demonstrate. Well, not so much anymore, right? [00:49:00] Age used to be a questioning factor: do you have sufficient wisdom? Do you have sufficient experience? Not so much anymore. So there are some real challenges. There's real value, but also some real challenges. One of them is, yeah, making mistakes and having repetitions. Different experiences create muscle memory, create wisdom, so there's something to be said for that. We shouldn't throw the baby out with the bathwater, but I'm also pretty confident the 21st century is telling us you probably don't need a sheepskin hanging on your wall to prove your excellence or your expertise in any given area. You've got a young son. The idea that questioning is inevitable: he's going to question you and Lena. Lucas Flatter: Mm-hmm. Yeah, I [00:50:00] guess questions are always pointing to unknown things. And if I can quickly find the answer to something, it's probably either been done before in some way, or been figured out to the point where it's not worth a whole lot of thought. So there's a whole ton of opportunity in, like we mentioned earlier, connecting several known quantities, but in a way that nobody's done before. The combination is novel. But the questions are what lead to actual novelty, I think, to new ideas. J.R. Flatter: Yeah, and I wrote in the notes as I was getting ready for today that it was one of those life-changing events, for its significance or even insignificance. I think I was a little older than I wrote down, but I was in the third or fourth grade, and we had this computer scientist who came in.
I mean, you can think about what computers looked like in [00:51:00] 1970. And I raised my hand and asked the scientist, are we ever gonna have a robot that walks independently and is computerized? And he laughed at me. He said, son, do you have any idea how big that computer would need to be? It would literally fill this gymnasium. Sit down. And even at that age, I would've been eight or nine years old, it was like, I'm not so sure this guy's right. And so maybe you benefited from this as you were coming up: the idea that maybe I didn't know, and maybe I should listen and have an open mind about these things. Lucas Flatter: Yeah, thinking about your own experiences, and when that coaching kind of leadership or mentorship came up versus when somebody just told you to do something. People talk about [00:52:00] those pivotal moments all the time. If that guy would've said, oh, you know what? That's really interesting. Computers are shrinking this fast, and it'll take 45 years to get there. Then maybe your curiosity would've been opened up a little. J.R. Flatter: All right, number 12. We've been talking about this the whole time: we're just at the beginning. Kelly grabs this phrase that he calls the holos. I think it's short for "holosphere"; he doesn't say distinctly whether it is or isn't. But the idea is that here at the beginning of the third millennium, a quarter of the way through its first century, the scale of what's to come is almost unimaginable. And I find that both very exciting and, not threatening exactly, but carrying significant risk. I heard a phrase recently: we've already fought two world wars, with their catastrophic outcomes, [00:53:00] and where will a third world war leave us? That's the unimaginable scale of destruction that this digital age can bring, adding AI to X, as Kelly says.
But on the very positive side, if we can figure out how to live with one another and use this for good rather than destruction, I think it's gonna be incredible. My as-yet-unborn great-grandchildren are gonna live in a very, very different world than the one you and I grew up in, and then their great-grandchildren even more so. It's Moore's Law to the hundredth scale. Lucas Flatter: I think it's pretty much that idea we mentioned earlier, of everybody controlling all information and all technology. But then, yeah, the current paradigm is kind of that different nations are [00:54:00] defending themselves, and different people are attached to different nations. But you're talking about getting rid of all borders and everybody behaving as a planetary society. So if nobody's making decisions, if there's no centralized source, how are we soliciting those decisions and making sure that everybody's happy with the outcome? Do we need to convince an AI that we can keep our nukes or something, you know what I mean? Yeah. I saw a funny video last night. It was a self-driving car, and a police officer was asking it to pull over. It was at a construction site, and the car couldn't understand what the hand signals were all about and kept edging forward, like, hey, get outta my way. J.R. Flatter: Yeah, from a very human perspective, how do we learn to [00:55:00] get along with one another when, like you said, there are no borders and nation states, and no curbs to entry to markets, and therefore a lot of the value of a central government fades away? There are gonna be some real challenges, some great beauty, but some great challenges too. I can imagine the Luddites, being here in the first century of the third millennium, would be like, wow, how did that happen? Lucas Flatter: But even among people that do participate, there are gonna be bad actors, people making poor choices and selfish choices, et cetera. J.R.
Flatter: All right, my friend. Well, that's it: the 12 inevitables across the next 30 years. Probably one of the reasons I love the book so much is that we talk about 30-year visions all the time, and along comes Kevin Kelly doing the same thing.[00:56:00] It just backs up the idea that we do need to be looking decades into the future and taking action today, to take advantage of, and thrive and survive amongst, all that change.

© 2024. All Rights Reserved.
