Sonix is an automated transcription service. We transcribe audio and video files for storytellers all over the world. We are not associated with The Joe Rogan Experience. Making transcripts available for listeners and those that are hearing-impaired is just something we like to do.
: Ah, ha, ha, ha. Four, three, two, one, boom. Thank you. Thanks for doing this, man. Really appreciate it.
: You're welcome.
: It's very good to meet you.
: Nice to meet you too.
: And thanks for not lighting this place on fire.
: You're welcome. That's coming later.
: How does one, just in the middle of doing all the things you do, create cars, rockets, all the stuff you're doing, constantly innovating, decide to just make a flamethrower? Where do you have the time for that?
: Well, we didn't put a lot of time into the flamethrower. This was an off-the-cuff thing. It's sort of a hobby company called The Boring Company, which started out as a joke, and we decided to make it real and dig a tunnel under LA. And then, other people asked us to dig tunnels. And so, we said yes in a few cases.
: Now, who-
: And then, we have a merchandise section that only has one piece of merchandise at a time. And we started off with a cap. And there was only one thing on it, which is BoringCompany.com/hat. That's it. And then, we sold the hats, limited edition. It just said, "The Boring Company."
: And then, I'm a big fan of Spaceballs, the movie. And in Spaceballs, Yogurt goes through the merchandising section, and they have a flamethrower in the merchandising section of Spaceballs. And, like, the kids love that one. That's the line when he pulls up the flamethrower. It's like, "We should do a flamethrower." So, we-
: Does anybody tell you no? Does anybody go, "Elon, maybe for yourself, but selling a flamethrower, the liabilities, all the people you're selling this device to, what kind of unhinged people are going to be buying a flamethrower in the first place? Do we really want to connect ourselves to all these potential arsonists?"
: Yeah, it's a terrible idea. It's terrible. Don't buy one. I said, "Don't buy this flamethrower. Don't buy it. Don't buy it." That's what I said, but, still, people bought it.
: There's nothing I can do to stop them. I did not stop them.
: You build it, they will come.
: I said, "Don't buy it. It's a bad idea."
: How many did you make?
: It's dangerous. It's wrong. Don't buy it. And, still, people bought it. I just couldn't stop them.
: How many did you make?
: And they're all gone?
: In three — I think, four days. They sold out in four days.
: Are you going to do another run?
: No, that's it?
: Oh, I see.
: I said we're doing 20,000, and that's it. We did 50,000 hats, and that was a million dollars. I thought, "Okay. Well, we'll sell something for 10 million," and that was 20,000 flamethrowers at $500 each. They went fast.
: Yeah. How do you have the time? How do you have the time to do that though? I mean, I understand that it's not a big deal in terms of all the other things you do, but how do you have time to do anything? I just — I don't understand your time management skills.
: I mean, I didn't spend much time on this flamethrower. I mean, to be totally frank, it's actually just a roofing torch with an air rifle cover. It's not a real flamethrower.
: Which is why it says, "Not a flamethrower."
: That's why we were very clear, this is not actually a flamethrower. And, also, we are told that various countries would ban shipping of it, that they would ban flamethrowers. So, we're very — To solve this problem for all of the customs agencies, we labeled it, "Not a flamethrower."
: Did it work? Was it effective?
: I don't know. I think so. Yes.
: So far.
: Now, but you do-
: Because they said you cannot ship a flamethrower.
: But you do so many different things. Forget about the flamethrower. Like, how do you do all that other shit? Like, how does one decide to fix LA traffic by drilling holes in the ground? And who do you even approach with that? Like, when you have this idea, who do you talk to about that?
: I mean, I'm not saying it's going to be successful or something, you know. I'm not, like, asserting that it's going to be successful. But so far, I've lived in LA for 16 years, and the traffic has always been terrible. And so, I don't see any other, like, ideas for improving the traffic. So, in desperation, we're going to dig a tunnel. And maybe that tunnel will be successful, and maybe it won't.
: I'm listening.
: Yeah. I'm not trying to convince you it's going to work.
: And are the people that you-
: I mean, or anyone.
: But you are starting this though. This is actually a project you're starting to implement, right.
: Yeah, yeah, no. We've dug about a mile. It's quite long. It takes a long time to walk it.
: Yeah. Now, when you're doing this, what is the ultimate plan? The ultimate plan is to have these in major cities, and anywhere there's mass congestion, and just try it out in LA first?
: Yeah. It's in LA because I mostly live in LA. That's the reason. It's a terrible place to dig tunnels. This is one of the worst places to dig tunnels, mostly because of the paperwork. You might think it's like, "What about seismic?" It's like, actually, tunnels are very safe in earthquakes.
: Why is that?
: Earthquakes are essentially a surface phenomenon. It's like waves on the ocean. So, if there's a storm, you want to be in a submarine. So, being in a tunnel is like being in a submarine. Now, the way the tunnel is constructed, it's constructed out of these interlocking segments, kind of like a snake. It's sort of like a snake exoskeleton with double seals.
: And so, even when the ground moves, the tunnel actually is able to shift along with the ground like an underground snake, and it doesn't crack or break. And it's extremely unlikely that both seals would be broken. And it's capable of taking five atmospheres of pressure. It's waterproof, methane-proof, well, gas-proof of any kind, and meets all California seismic requirements.
: So, when you have this idea, who do you bring this to?
: I'm not sure what you mean by that.
: Well, you're implementing it. So, you're digging holes in the ground.
: Like, you have to bring it to someone that lets you do it.
: Yes. There are some engineers from SpaceX who thought it would be cool to do this. And the guy who runs it, like, day-to-day is Steve Davis. He's a longtime SpaceX engineer. He is great. So, Steve was like, "I'd like to help make this happen." I was like, "Cool." So, we started off with digging a hole in the ground. We got a permit for a pit, just a pit, and dug a big pit.
: And you have to tell them what the pit's for, or you just said, "Hey, we just want to dig a hole."
: I just filled out a form.
: That's it?
: Yeah, it was a pit in our parking lot.
: But do you have to give them some sort of a blueprint for your ultimate idea? And do they have to approve it? Like, how does that work?
: No. We just started off with a pit.
: A big pit. And, you know, it's not really — You know, they don't really care about the existential nature of a pit. You just say like, "I want a pit."
: Yeah. And it's a hole in the ground. So then, we got the permit for the pit, and we dug the pit, and we dug it in, like, I don't know, three days, two to three days. Actually, I think two, 48 hours, something like that, because Eric Carr said he was coming by for the Hyperloop Competition, which is like a student competition we have for who can make the fastest pod in the Hyperloop. And he was coming.
: The finals are going to be on Sunday afternoon. And so, Eric is coming by on Sunday afternoon. He's like, "You know, we should dig this pit, and then, like, show Eric." So, this was like Friday morning. And then, yeah. So, about a little over 48 hours later, we dug the pit. We worked on it 24/7, 48 straight hours, something like that, and dug this big pit, and we're like, "Show Eric the pit." It's like, obviously, it's just a pit. But, hey, a hole in the ground is better than no hole in the ground.
: And what did you tell him about this pit? I mean, you just said this is the beginning of this idea.
: We're going to build tunnels under LA to help funnel traffic better.
: And they just go, "Okay." But we've joked around about this on the podcast before, like, what if a person can go to the people that run the city and go, "Hey, I want to dig some holes in the ground and put some tunnels in there," and they go, "Oh, yeah, okay."
: We're not the only one with a hole in the ground.
: But it's a-
: People dig holes in the ground all the time.
: But my question is, like, I know how much time you must be spending on your Tesla factory. I know how much time you must be spending on SpaceX. And yet, you still have time to dig holes under the ground in LA, and come up with these ideas, and then implement them. Like-
: I have a million ideas.
: I'm sure you do.
: There's no shortage of that. Yeah.
: I just don't know how you manage your time. I don't understand it. It doesn't seem — It doesn't even seem humanly possible.
: You know, I do, basically — I think, people, like, don't totally understand what I do with my time. They think, like, I'm a business guy or something like that. Like my Wikipedia page says business magnate.
: What would you call yourself?
: A business magnet. Can someone please change my Wikipedia page to magnet?
: They'll change it for you.
: Please change.
: Right now, it's probably already changed.
: It's locked. So, somebody has to be able to unlock it and change it to magnet.
: Someone will get that.
: I want to be a magnet. No, I do engineering, you know, and manufacturing, and that kind of thing. That's like 80% or more of my time.
: Ideas, and then the implementation of those ideas.
: Those are like hardcore engineering, like-
: … designing things, you know.
: It's structural, mechanical, electrical, software, user interface engineering, aerospace engineering.
: But you must understand there's not a whole lot of human beings like you. You know that, right? You're an oddity-
: … to chimps like me.
: We're all chimps.
: Yeah, we are.
: We're one notch. One notch above a chimp.
: Some of us are a little more confused. When I watch you doing all these things, I'm like, "How does this motherfucker have all this time, and all this energy, and all these ideas, and then people just let him do these things?"
: Because I'm an alien.
: That's what I've speculated.
: Then, I'm on record saying this in the past. I wonder.
: It's true.
: I mean, what if there was one? I was like, "If there was, like, maybe an intelligent being that we created, you know, like some AI creature that's superior to people, maybe it's just hanging around with us for a little while like you've been doing, and then fix a bunch of shit." I mean, that's the way.
: I might have some mutation or something like that.
: You might. Do you think you do?
: Do you wonder? Like, around normal people, do you ever think, "Hmm, what's up with these boring, dumb motherfuckers?"
: Not bad for a human, but, I think, I will not be able to hold a candle to AI.
: You scared the shit out of me when you talked about AI, between you and Sam Harris.
: Oh sure.
: I didn't consider it until I had a podcast with Sam once.
: That's great.
: And he made me shit my pants. Talking about AI, I realized, like, "Oh, this is a genie that once it's out of the bottle, you're never getting it back in."
: That's true.
: There was a video that you tweeted about one of those Boston Dynamics robots.
: And you're like, "In the future, it will be moving so fast, you can't see it without a strobe light."
: Yeah. You can probably do that right now.
: And no one's really paying attention too much, other than people like you, or people that are really obsessed with technology, while all these things are happening. And these robots are — Did you see the one where PETA put out a statement that you shouldn't kick robots?
: It's probably not wise.
: For retribution.
: Their memory is very good.
: I bet it's really good.
: It's really good.
: I bet it is.
: And getting better every day.
: It's really good.
: Are you honestly legitimately concerned about this? Are you — Is, like, AI one of your main worries in regards to the future?
: Yes. It's less of a worry than it used to be, mostly due to taking more of a fatalistic attitude.
: So, you used to have more hope, and you gave up some of it. And, now, you don't worry as much about AI. You're like, "This is just what it is."
: Pretty much. Yes, yes, yes.
: Was that not? Yes but no.
: It's not necessarily bad. It's just it's definitely going to be outside of human control.
: Not necessarily bad, right?
: Yes. It's not necessarily bad. It's just outside of human control. Now, the thing that's going to be tricky here is that it's going to be very tempting to use AI as a weapon. It's going to be very tempting. In fact, it will be used as a weapon. So, on the on-ramp to serious AI, the danger is going to be more humans using it against each other, I think, most likely. That will be the danger. Yeah.
: How far do you think we are from something that can make its own mind up whether or not something's ethically or morally correct, or whether or not it wants to do something, or whether or not it wants to improve itself, or whether or not it wants to protect itself from people or from other AI? How far away are we from something that's really truly sentient?
: Well, I mean, you could argue that any group of people, like a company is essentially a cybernetic collective of people and machines. That's what a company is. And then, there are different levels of complexity in the way these companies are formed. And then, there's a sort of like a collective AI in the Google, sort of, Search, Google Search, you know, where we're all sort of plugged in as like nodes on the network, like leaves on a big tree.
: And we're all feeding this network with our questions and answers. We're all collectively programming the AI. And Google, plus all the humans that connect to it, are one giant cybernetic collective. This is also true of Facebook, and Twitter, and Instagram, and all the social networks. They're giant cybernetic collectives.
: Humans and electronics all interfacing, and constantly now, constantly connected.
: Yes, constantly.
: One of the things that I've been thinking about a lot over the last few years is that one of the things that drives a lot of people crazy is how many people are obsessed with materialism and getting the latest greatest thing. And I wonder how much of that is — Well, a lot of it is most certainly fueling technology and innovation. And it almost seems like it's built into us. It's like what we like and what we want that we're fueling this thing that's constantly around us all the time.
: And it doesn't seem possible that people are going to pump the brakes. It doesn't seem possible at this stage where we're constantly expecting the newest cellphone, the latest Tesla update, the newest MacBook Pro. Everything has to be newer and better. And that's going to lead to some incredible point. And it seems like it's built into us. It almost seems like it's an instinct that we're working towards this, that we like it. Our job, just like the ants build the anthill, our job is to somehow fuel this.
: Yes. I mean, I made this comment some years ago, but it feels like we are the biological bootloader for AI. Effectively, we are building it. And then, we're building progressively greater intelligence. And the percentage of intelligence that is not human is increasing. And, eventually, we will represent a very small percentage of intelligence. But the AI is informed strangely by the human limbic system. It is, in large part, our id writ large.
: How so?
: I mean, all those things, the sort of primal drives. All of the things that we like, and hate, and fear, they're all there on the internet. They're a projection of our limbic system. That's true.
: No, it makes sense. And the thinking of it as a — I mean, thinking of corporations, and just thinking of just human beings communicating online through these social media networks in some sort of an organism that's a — It's a cyborg. It's a combination. It's a combination of electronics and biology.
: Yeah. This is — In some measure, like, the success of these online systems is sort of a function of how much limbic resonance they're able to achieve with people. The more limbic resonance, the more engagement.
: That's, like, probably one of the reasons why Instagram is more enticing than Twitter.
: Limbic resonance.
: Yeah. You get more images, more video.
: It's tweaking your system more.
: Do you worry or wonder, in fact, what the next step is? I mean, a lot of people didn't see Twitter coming. You know, who knew communicating with 140 characters, or 280 now, would be a thing that people would be interested in? Like, it's going to excel. It's going to become more connected to us, right?
: Yes. Things are getting more and more connected. They're, at this point, constrained by bandwidth. Our input/output is slow, particularly output. Output got worse with thumbs. You know, we used to have input with 10 fingers. Now, we have thumbs. But images are also another way of communicating at high bandwidth. You take pictures and you send pictures to people. That communicates far more information than you can communicate with your thumbs.
: So, what happened with you where you decided, or you took on a more fatalistic attitude? Like, was there any specific thing, or was it just the inevitability of our future?
: I tried to convince people to slow down AI, to regulate AI. This was futile. I tried for years, and nobody listened.
: This seems like a scene in a movie-
: Nobody listened.
: … where the robots are going to fucking take over. You're freaking me out. Nobody listened?
: Nobody listened.
: No one. Are people more inclined to listen today? It seems like an issue that's brought up more often over the last few years than it was maybe 5-10 years ago. It seemed like science fiction.
: Maybe they will. So far, they haven't. I think, people don't — Like, normally, the way that regulations work is very slow. It's very slow indeed. So, usually, it will be some new technology. It will cause damage or death. There will be an outcry. There will be an investigation. Years will pass. There will be some sort of insight committee. There will be rule-making. Then, there will be oversight, eventually regulations. This all takes many years. This is the normal course of things.
: If you look at, say, automotive regulations, how long did it take for seatbelts to be implemented, to be required? You know, the auto industry fought seatbelts, I think, for more than a decade. It successfully fought any regulations on seatbelts even though the numbers were extremely obvious. If you had seatbelts on, you would be far less likely to die or be seriously injured. It was unequivocal. And the industry fought this for years successfully. Eventually, after many, many people died, regulators insisted on seatbelts. This is a — This time frame is not relevant to AI. You can't take 10 years from a point of which it's dangerous. It's too late.
: And you feel like this is decades away or years away from being too late? If you have this fatalistic attitude, and you feel like it's going — We're almost in, like, a doomsday countdown.
: It's not necessarily a doomsday countdown. It's a-
: Out of control countdown?
: Out of control, yeah. People call it the singularity, and that's probably a good way to think about it. It's a singularity. It's hard to predict, like a black hole, what happens past the event horizon.
: Right. So, once it's implemented, it's very difficult because it would be able to-
: Once the genie is out of the bottle, what's going to happen?
: Right. And it will be able to improve itself.
: That's where it gets spooky, right? The idea that it can do thousands of years of innovation very, very quickly.
: And, then, it will be just ridiculous.
: We will be like this ridiculous biological shitting, pissing thing trying to stop the gods. "No, stop. We're like living with a finite lifespan, and watching, you know, Norman Rockwell paintings."
: It could be terrible, and it could be great. It's not clear.
: But one thing is for sure, we will not control it.
: Do you think that it's likely that we will merge somehow or another with this sort of technology, and it'll augment what we are now, or do you think it will replace us?
: Well, that's the scenario. The merge scenario with AI is the one that seems like probably the best. Like if-
: For us?
: Yes. Like if you can't beat it, join it. That's-
: Yes, yeah.
: You know. So, from a long-term existential standpoint, that's like the purpose of Neuralink is to create a high bandwidth interface to the brain such that we can be symbiotic with AI because we have a bandwidth problem. You just can't communicate through fingers. It's too slow.
: And where's Neuralink at right now?
: I think we'll have something interesting to announce in a few months that's, at least, an order of magnitude better than anything else. I think better than, probably, anyone thinks is possible.
: How much can you talk about that right now?
: I don't want to jump the gun on that.
: But what's like the ultimate? What's the idea behind that? Like, what are you trying to accomplish with it? What would you like best case scenario?
: I think, best case scenario, we effectively merge with AI where AI serves as a tertiary cognition layer, where we've got the limbic system. Kind of the, you know, primitive brain essentially. You got the cortex. So, you're currently in a symbiotic relationship. Your cortex and limbic system are in a symbiotic relationship. And, generally, people like their cortex, and they like their limbic system. I haven't met anyone who wants to delete their limbic system or delete their cortex. Everybody seems to like both.
: And the cortex is mostly in service to the limbic system. People may think that the thinking part of themselves is in charge, but it's mostly their limbic system that's in charge. And the cortex is trying to make the limbic system happy. That's what most of that computing power is oriented towards: "How can I make the limbic system happy?" That's what it's trying to do.
: Now, if we do have a third layer, which is the AI extension of yourself, that is also symbiotic. And there's enough bandwidth between the cortex and the AI extension of yourself, such that the AI doesn't de facto separate. Then, that could be a good outcome. That could be quite a positive outcome for the future.
: So, instead of replacing us, it will radically change our capabilities?
: Yes. It will enable anyone who wants to have super human cognition, anyone who wants. This is not a matter of earning power because your earning power would be vastly greater after you do it. So, it's just like anyone who wants can just do it in theory. That's the theory. And if that's the case then, and let's say billions of people do it, then the outcome for humanity will be the sum of human will, the sum of billions of people's desire for the future.
: That billions of people with enhanced cognitive ability?
: Radically enhanced?
: And which would be — But how much different than people today? Like, if you had to explain it to a person who didn't really understand what you're saying, how much different are you talking about? When you say radically improved, like, what do you mean? You mean mind reading?
: It will be difficult to really appreciate the difference. It's kind of like, how much smarter are you with a phone or computer than without? You're vastly smarter, actually. You know, you can answer any question. If you're connected to the internet, you can answer any question pretty much instantly, any calculation. Your phone's memory is essentially perfect. You can remember flawlessly. Your phone can remember videos, pictures, everything perfectly. That's the-
: Your phone is already an extension of you. You're already a cyborg. You don't even — What most people don't realize, they are already a cyborg. That phone is an extension of yourself. It's just that the data rate, the rate at which — The communication rate between you and the cybernetic extension of yourself, that is your phone and computer, is slow. It's very slow.
: And that is like a tiny straw of information flow between your biological self and your digital self. And we need to make that tiny straw like a giant river, a huge high-bandwidth interface. It's an interface problem, a data rate problem. If we solve the data rate problem, I think we can hang on to human-machine symbiosis through the long term. And then, people may decide that they want to retain their biological self or not. I think they'll probably choose to retain the biological self.
: Versus some sort of Ray Kurzweil scenario where they download themselves into a computer?
: You could essentially be snapshotted into a computer at any time. If your biological self dies, you could probably just upload into a new unit. Literally.
: Pass that whiskey. We're getting crazy over here. This is getting ridiculous.
: Down the rabbit hole.
: Grab that sucker. Give me some of that. This is too freaky. See, if I was just talking-
: I've been thinking about this for a long time, by the way.
: I believe you. If I was talking to one — Cheers, by the way.
: Cheers. It is a great whiskey.
: Thank you. I don't know where this came from. Who brought this to us?
: I'm trying to remember. I can't-
: Somebody gave it to us. Old Camp. Whoever it was-
: It's good.
: … thanks.
: It's good.
: Yeah, it is good. This is just inevitable. Again, going back to when you decided to have this fatalistic viewpoint. So, you weren't — You tried to warn people. You talked about this pretty extensively. I've read several interviews where you talked about this. And then, you just sort of just said, "Okay, it just is. Let's just-" And, in a way, by communicating the potential for — I mean, for sure, you're getting the warning out to some people.
: Yeah. Yeah. I mean, I was really sounding the warning quite a lot. I was warning everyone I could. Yeah, I met with Obama, and just for one reason, like, "Better watch out."
: Just talk about AI.
: And what did he say? "So, what about Hillary? Worry about her first. Shh, everybody, quiet."
: He listened. He certainly listened. I met with Congress. I met with — I was at a meeting of all 50 governors and talked about just the AI danger. And I talked to everyone I could. No one seemed to realize where this was going.
: Is it that, or do they just assume that someone smarter than them is already taking care of it? Because when people hear about something like AI, it's almost abstract. It's almost like it's so hard to wrap your head around it.
: It is.
: By the time it happens, it will be too late?
: Yeah. I think they didn't quite understand it, or didn't think it was near-term, or weren't sure what to do about it. And I said, like, you know, an obvious thing to do is to just establish a committee, a government committee, to gain insight. You know, before you do oversight, before you make regulations, you should, like, try to understand what's going on. So, you have an insight committee. Then, once they learn what's going on, they get up to speed. Then, they can maybe make some rules or propose some rules. And that would be probably a safer way to go about things.
: It seems — I mean, I know that it's probably something that the government's supposed to handle, but it seems like I wouldn't want the — I don't want the government to handle this.
: Who do you want to handle this?
: I want you to handle this.
: Oh geez.
: Yeah. I feel like you're the one who could ring the bell better because if Mike Pence starts talking about AI, I'm like, "Shut up, bitch. You don't know anything about AI. Come on, man. He doesn't know what he's talking about." That's just games.
: I don't have the power to regulate other companies. I don't know if I'm supposed to, but, you know.
: Right, but maybe companies could agree. Maybe there could be some sort of a — What I mean is we have agreements where you're not supposed to dump toxic waste into the ocean, you're not supposed to do certain things that could be terribly damaging, even though they would be profitable. Maybe this is one of those things.
: Maybe we should realize that you can't hit the switch on something that's going to be able to think for itself and make up its own mind as to whether or not it wants to survive, and whether or not it thinks you're a threat, or whether or not it thinks you're useless. Like, "Why do I keep this dumb, finite life form alive? Why? Why keep this thing around? It's just stupid. It just keeps polluting everything. It's shitting everywhere it goes, lighting everything on fire, and shooting at each other. Why would I keep this stupid thing alive?" Because, sometimes, it makes good music, you know. Sometimes, it makes great movies. Sometimes, it makes beautiful art, and sometimes, you know, sometimes it's cool to hang out with. Like with my-
: Yes, for all those reasons.
: Yeah. For us, those are great reasons.
: But anything objective standing outside that would go, "This is definitely a flawed system." This is like if you went to the jungle and you watched these chimps engage in warfare and beat each other with wooden sticks.
: Chimps are really mean.
: They're fucking real mean.
: They're fucking mean.
: They're real mean.
: I saw a movie, Chimpanzee. I thought it was going to be like some Disney thing. Like, holy cow.
: What movie was that?
: It's called Chimpanzee.
: Is it a documentary?
: Yeah, yeah. It's kind of like a documentary. I was like, "Damn, these chimps are mean."
: They're mean.
: They're cruel.
: Yeah. They're calculated. Yeah.
: They sneak up on each other and-
: Like, I didn't realize chimps did calculated cruelty.
: I was pretty — I left that movie kinda like, "This is dark."
: Right. Well, we know better because we've advanced. But if we hadn't, we'd be like, "Man, I don't want to fucking live in a house. I like the chimp ways, bro. Chimp ways to go. This is it, man, chimp life. You know, we got-
: Simple chimp life.
: Chimp life right now. But we, in a way, to the AI, might be like those chimps and like, "These stupid fucks launching missiles out of drones, and shooting each other underwater." Like we're crazy. We got torpedoes, and submarines, and fucking airplanes that drop nuclear bombs indiscriminately on cities. We're assholes.
: They might go, "Why are they doing this?" It might, like, look at our politics, look at what we do in terms of our food system, what kind of food we force down each other's throats. And they might go, "These people are crazy. They don't even look after themselves."
: I don't know. I mean, how much do we think about chimps? Not much.
: Very little.
: It's like-
: It's true.
: … these chimps are at war. Like, look — it's like groups of chimps just attack each other, and they kill each other. They torture each other. That's pretty bad. They hunt monkeys. They're — Like, this is probably the most, but, you know. I mean, when was the last time you watched chimps?
: All the time.
: You do.
: You're talking to the wrong guy.
: Okay. Well, unfortunately, yeah.
: This fucking podcast, dude, we're talking about chimps every episode.
: It's chimp city? Okay.
: People are laughing right now. Yeah, constantly. I'm obsessed.
: I saw that David Attenborough documentary on chimps where they were eating those colobus monkeys and ripping them apart.
: Yes, this was rough.
: I saw that many, many years ago.
: It's gruesome.
: It just changed how-
: I go, "Oh, this is why people are so crazy. We came from that thing."
: Yeah, exactly.
: What about the bonobos?
: They got, like, better philosophy.
: Yeah, they're like swingers.
: Yeah, they really are. They seem to be way more — Even than us, way more civilized.
: They just seem to resolve everything with sex.
: Yeah. The only rules they have is the mom won't bang the son. That's it.
: That's it. Mom won't bang her sons. They're good women.
: Good women in the bonobo community. Everybody else is banging it out.
: Yeah. I haven't seen the Bonobo Movie.
: Well, it's disturbing just watching bonobos at the zoo.
: They're just constantly going.
: Constantly fucking, yeah. It's all they do.
: It's just nonstop.
: Yeah. And they don't care, gay, straight, whatever. Let's just fuck. What's with these labels?
: I haven't seen bonobos at a zoo. I just probably like-
: I don't think I have either.
: And not on the PJ section.
: Yeah, I don't think they have them at many zoos. We've looked at it before too, didn't we?
: It's probably pretty awkward.
: Yeah. I think that's the thing. They don't like to keep bonobos at zoos because they're just always jacking off and-
: Fucking it.
: In San Diego.
: What's that? They have in San Diego?
: San Diego's got some, yeah.
: Really? Interesting.
: Probably separate them. Yeah.
: I mean, how many are there in a cage, you know? I was like-
: … "It's going to be pretty intense."
: Yeah, yeah. Yeah, we're a weird thing, you know. And I've often wondered whether or not we're — you know, our ultimate goal is to give birth to some new thing. And that's why we're so obsessed with technology because it's not like this technology is really — I mean, it's certainly enhancing our lives too in a certain way, but, I mean, ultimately, is it making people happier right now? Most technology I would say no. In fact, you and I were talking about social media before this about just not having Instagram on your phone, and not dealing, and you feel better.
: Yes. I think, one of the issues with social media, it's been pointed out by many people, is that, I think, maybe particularly Instagram people look like they have a much better life than they really do.
: By design.
: Yeah. People are posting pictures of when they're really happy. They're modifying those pictures to be better looking. Even if they're not modifying the pictures, they're, at least, selecting the pictures for the best lighting, the best angle. So, people basically seem way better looking than they really are.
: And they're way happier seeming than they really are. So, if you look at everyone on Instagram, you might think, "Man, there are all these happy beautiful people, and I'm not that good looking, and I am not happy. So, I must suck," you know. And that's going to make you feel sad; when, in fact, those people you think are super happy, actually, not that happy. Some of them are really depressed. They're very sad. Some of the happiest-seeming people are actually some of the saddest people in reality. And nobody looks good all the time. It doesn't matter who you are.
: No. It's not even something you should want.
: Why do you want to look great all the time?
: Yeah, exactly. So, I think things like that can make people quite sad just by comparison, because people generally think of themselves relative to others. It's like we are constantly re-baselining our expectations. And you can see this if you watch some show like Naked and Afraid, or if you just go and try living in the woods by yourself for a while. You're like, "Man, civilization is quite great." People want to come back to civilization pretty fast on Naked and Afraid.
: Wasn't there a Theodore quote, "Comparison is the thief of joy"?
: Yeah. Happiness is reality minus expectations.
: That's great too, but "comparison is the thief of joy" really holds true for people. Who said it?
: Theodore Roosevelt.
: Roosevelt, fascinating. And when you're thinking about Instagram, what Instagram essentially is for a lot of people is you're giving them the opportunity to be their own PR agent, and they always go towards the glamorous, you know. And when anybody does show, you know, #nofilter, they really do do that. "Oh, you're so brave. Look at you, no makeup," you know, and they look good anyway.
: "You look great. What are you doing? Oh my God. You don't have makeup on. You still look hot as fuck. You know what you're doing. I know what you're doing too." They're letting you know. And then, they're feeding off that comment section. Sort of sitting there like it's a fresh stream of love. Like you're getting right up to the sources as it comes out of the earth, and you're sucking that sweet, sweet love water.
: A lot of emojis, smiley emojis.
: A lot of emojis.
: My concern is not so much what Instagram is. It's that I didn't think that people had the need for this or the expectation for some sort of technology that allows them to constantly get love and adulation from strangers, and comments, and this ability to project this sort of distorted version of who you really are.
: But I worry about where it goes. Like, what's the next one? Is it going to be some sort of weird augmented or virtual reality Instagram-type situation, where you're not going to want to live in the real world, where you're going to want to interface with this sort of world that you've created through your social media page and some next-level thing?
: Yeah. Go live in the simulation.
: Yeah, man.
: In the simulation.
: Some Ready Player One type shit that's real. That seems — we have that HTC Vive here. I've only done it a couple times, quite honestly, because it kind of freaks me out.
: My kids fucking love it, man. They love it. They love playing these weirdo games and walking around that headset on. But part of me watching them do it goes, "Wow, I wonder if this is like the precursor." Just sort of like if you look at that phone that Gordon Gekko had on the beach and you compare that-
: Yes, the big cell phone.
: Yeah, you compare that to, like, a Galaxy Note 9.
: Like how the fuck did that become that, right? And I wonder when I see this HTC Vive, I'm like, "What is that thing going to be 10 years from now when we're making fun of what it is now?" I mean, how ingrained, and how connected and interconnected is this technology going to be in our life?
: It will be, at some point, indistinguishable from reality.
: We will lose this. We'll lose this. Like you and I are just looking at each other through our eyes.
: Are we?
: I see you. You see me, I think, I hope.
: You think so?
: I think you probably have regular eyes.
: This could be some simulation.
: It could. Do you entertain that?
: Well, the argument for the simulation, I think, is quite strong because if you assume any improvements at all over time, any improvement, 1%, 0.1%, just extend the time frame, make it a thousand years, a million years. The universe is 13.8 billion years old. Civilization, if you count it, if you're very generous, civilization is maybe 7000 or 8000 years old if you count it from the first writing. This is nothing. This is nothing.
: So, if you assume any rate of improvement at all, then games will be indistinguishable from reality, or civilization will end. One of those two things will occur. Therefore, we are most likely in a simulation.
: Or we're on our way to one, right?
: Because we exist.
: Well, not just because we exist.
: Pretty exactly.
: We could most certainly be on the road. We could be on the road to that, right? It doesn't mean that it has to have already happened.
: It could be in base reality. It could be in base reality.
: We could be here now on our way to the road or on our way to the destination where this can never happen again, where we are completely ingrained in some sort of an artificial technology or some sort of a symbiotic relationship with the internet or the next level of sharing information. But, right now, we're not there yet. That's possible too, right? It's possible that a simulation is, one day, going to be inevitable, that we're going to have something that's indistinguishable from regular reality, but maybe we're not there yet. That's also possible.
: Yes, it is.
: Though we're not quite there yet. This is real. You want to touch that wood?
: It feels very real.
: Maybe that's why everybody is like into like mason jars and shit.
: Mason jars.
: Suede shoes. People like craft restaurants, and they want raw wood. Everyone wants the raw metal. It seems like people are longing toward some weird log cabin type nostalgia.
: Sure, reality.
: Yeah, like holding on. Like clinging.
: Dragging their nails through the mud like, "Don't take me yet."
: "I want to-"
: But then, people go get a mason jar with a wine stem or a handle. That's dark.
: It makes me-
: It makes me lose faith in humanity.
: Mason jar, wine stem and a handle, they have those?
: The sturdy people? That's just assholes. That's like people who make pet rocks.
: Right. Some people are just assholes. They take advantage of our generous nature.
: It was made with the wine stem. Made with handle.
: They made it that way?
: Yes. They're manufactured like that.
: So, they welded it on to the mason jar. You fuck.
: But that would be fine if they had, like, glued it on or something.
: Right. There would be like-
: But it was made that way.
: Like trash shit. Oh, this is disgusting. Look at this. It is right there.
: Yes, it's pretty harsh. Yup.
: This is terrible. Yeah. That's like fake breasts that are designed to be hard. Like fake breasts from the '60s. It's like if you really long for the ones with ripples, here we go. Yeah. That's almost what that is.
: What are you going to do, man? There's nothing, you know. There's nothing you can do to stop certain terrible ideas from propagating.
: Yeah. Anyway, I don't want to sound like things are too dark because I think like you kind of have to be optimistic about the future. There's no point in being pessimistic. It's just too negative because it is-
: It doesn't help.
: It doesn't help, you know. I think you want to be — I mean, my theory is like you'd rather be optimistic. I think, I'd rather be optimistic and wrong than pessimistic and right.
: At least, we're on that side.
: Right, yeah.
: Because if you're pessimistic, it's going to be miserable.
: Yeah. Yeah, nobody wants to be around you anyway if it's the end of the world. You're like, "I fucking told you, bro."
: Yeah, exactly.
: The world is ending. Yeah. It is what it is for all.
: I did my part.
: I mean-
: Enjoy the journey.
: Right. If you really want to get morose, I mean, it is what it is for all of us anyway. We're all going to go, unless something changes.
: I mean, ultimately, you know, even if we just sort of existed as humans forever, we'd still eventually face the heat death of the universe-
: Gazillion years from now.
: Right, even if we get it past the sun.
: If we figure out a way past the sun running out of juice.
: Eventually, it's going to end. It's just a question of when.
: So, it really is all about the journey.
: Or transcendence from whatever we are now into something that doesn't worry about death.
: The universe, as we know it, will dissipate into a fine mist of cold nothingness eventually.
: And then, someone's going to bottle it and put a fragrance to it, sell it to French people in another dimension.
: It's just a very long time.
: So, I think it's really just about, how can we make it last longer?
: Are you a proponent of the multiverse theory? Do you believe that there are many, many universes, and that even if this one fades out, there are other ones that are starting fresh right now, and there's an infinite number of them, and they're just constantly in a never-ending cycle of birth and death?
: I think most likely. This is just about probability. There are many, many simulations. These simulations, we might as well call them reality, or we could call them the multiverse.
: These simulations you believe are created like someone has manufactured-
: They're running on the substrate.
: That substrate is probably boring.
: How so?
: Well, when we create a simulation, like a game or a movie, it's the distillation of what's interesting about life. You know, it takes a year to shoot an action movie, and then that's all distilled down into two or three hours. So, let me tell you, if you've seen an action movie being filmed, it's freaking boring. It's super boring. There's lots of takes. Everything's in front of a green screen. It looks pretty goofy. It doesn't look cool. But once you add the CGI and have great editing, it's amazing.
: So, I think, most likely, if we're a simulation, it's really boring outside the simulation, because why would you make the simulation boring? You'd make the simulation way more interesting than base reality.
: That is if this right now is a simulation.
: And, ultimately, inevitably, as long as we don't die or get hit by a meteor, we're going to create some sort of simulation if we continue on the same technological path we're on right now.
: But we might not be there yet. So, it might not be a simulation here. But you feel it most likely is other places.
: This notion of a place or where is-
: Flawed perception.
: Like that Vive you have, which was made by Valve. It's really Valve that made it. HTC did the hardware, but it's really a Valve thing.
: Makers of Half-Life.
: Yes. Great company.
: Great company.
: When you're in that virtual reality, which is only going to get better, where are you? Where are you really?
: You aren't anywhere.
: Well, whereas-
: You're in the computer.
: You know, what defines where you are?
: It's your perception.
: Is it your perceptions or is it, you know, a scale that we have under your butt. You're right here. I've measured you. You're the same weight as you were when you left. But meanwhile, your experience is probably different-
: Why do you think you're where you are right now? You might not be.
: I'll spark up a joint if you keep talking. Your man is just going to come in here. We might have to lock the door.
: Right now, you think you're in a studio in LA.
: That's what I heard.
: You might be in a computer.
: Man, I think about this all the time. Yeah, I mean, it's unquestionable that one day that will be the case, as long as we keep going, as long as nothing interrupts us. And even if we start from scratch, and, you know, we're single-celled organisms all over again, then, millions and millions of years later, we become the next thing that is us, with creativity and the ability to change this environment. It's going to keep monkeying with things until it figures out a way to change reality. To change — I mean, almost like punch a hole through what this thing is into what it wants it to be and create new things. And then, those new things will intersect with other people's new things, and there will be this ultimate pathway of infinite ideas and expression, all through technology.
: And then, we're going to wonder like, "Why are we here? What are we doing?"
: Let's find out.
: I mean, I think we should take the actions, the set of actions that are most likely to make the future better.
: Yes, right.
: Right. Right. And then, we evaluate those actions to make sure that it's true.
: Well, I think there's a movement to that. I mean, in terms of like a social movement. I think some of it's misguided, and some of it's exaggerated, and there's a lot of people that are fighting for their side out there. But it seems like the general trend of, like, social awareness seems to be much more heightened now than has ever been in any other time in history because of our ability to express ourselves instantaneously to each other through Facebook, or Twitter, or what have you. And that the trend is to abandon preconceived notions, abandon prejudice, abandon discrimination, and promote kindness and happiness as much as possible. Looking at this knife? Somebody gave it to me. Sorry.
: Yeah. What is it?
: Fuck you. My friend, Donnie, brought this with him, and it just stayed here. I have a real samurai sword, if you want to play with that. I know you're into weapons. That's from the 1500s. The samurai sword's over there on the table.
: That's cool.
: I'll grab it. Hold on. Yeah, that's legit samurai sword from an actual samurai from the 1500s. If you pull out that blade, that blade was made the old way where a master craftsman-
: Folded metal?
: Folded that metal and hammered it down over and over again over a long period of time, and honed that blade into what it is now. What's crazy is that more than 500 years later, that thing is still pristine. I mean, whoever took care of that and passed it down to the next person who took care of it, and you know until it got to the podcast room, it's pretty fucking crazy.
: One day, someone's going to be looking at a Tesla like that. How many of these have the fucking back doors that pop open sideways like a Lamborghini?
: You should see what the Tesla can do. I'll show you sometime.
: Well, I've driven one. I love them.
: Yeah, but most people don't know what it can do.
: In terms like ludicrous mode? In terms of like driving super fast and irresponsibly on public roads, is that what you're saying?
: Any car can do that.
: Yeah. What can it do that I need to know about?
: I mean, the Model X can do this like ballet thing to the Trans-Siberian Orchestra. It's pretty cool.
: Wait, it dances?
: Legitimate, like it goes around?
: Why would you program that into a car?
: It seemed like fun.
: That's what I get about you. That's what's weird. Like when you showed up here, you were all smiles, and you pull out a fucking blowtorch and not a blowtorch, but I'm like, "Look at this-"
: Not a flamethrower.
: Not a flamethrower. Like, "He's having fun."
: I want to be clear, it's definitely not a flamethrower.
: But you're having fun. Like this thing, you know, you program a car to do a ballet dance, you're having fun.
: It's great.
: But how do you have the time to do that? I don't understand why you're digging holes under the earth, and sending rockets into space, and powering people in Australia. Like how the fuck do you have time to make the car dance ballet?
: Well, I mean, in that case there were some engineers at Tesla that said, "You know, what if we make this car dance and play music?" I'm like, "That sounds great. Please do it. Let's try to get it done in time for Christmas." We did.
: Is there a concern about someone just losing their mind and making it do that on the highway?
: No, it won't do that.
: What if it's in bumper-to-bumper traffic?
: No, it won't do it.
: No. Actually, you have to sneeze drag.
: Oh, sneeze drag.
: Yeah, that's why people don't know about it. But if you have the car-
: It's like it could do lots of things, lots of things.
: Once Reddit gets a hold of it, everyone's going to know already.
: You just have to — Everyone, if you search for it on the internet, you will find out.
: They will find.
: But people don't know that they should even search for it.
: Well, they do now.
: There's so many things about the Model X, and the Model S, and the Model 3 that people don't know about. We should probably do a video or something to explain it, because I have close friends, and I say, "Do you know the car can do this?" and they're like, "Nope."
: Do you want to do a video of that? Do you like the fact that some people don't know?
: No, I think it's probably not good. We should tell people.
: Yeah, probably.
: That would help your product. I mean, it's not like you don't sell enough of them. You sell almost too many of them, right.
: I mean, I think a Tesla is the most fun thing you could possibly buy ever. That's what it's meant to be. Our goal is to make — it's not exactly a car. It's actually a thing to maximize enjoyment, maximum fun.
: Okay. Electronic, like big screen, laptop, ridiculous speed, handling, all that stuff.
: Do you have a-
: And we're going to put video games in it.
: You are?
: Is that wise?
: What kind of video games? Candy Crush?
: You won't be able to drive while you're playing the video game. But, like, for example, we're just putting the Atari emulator, the MAME emulator, in it. So, we'll have Missile Command, and Lunar Lander, and a bunch of other things. Yeah.
: That sounds cool.
: It's pretty fun.
: I like that.
: Yeah. I mean, we improved the interface for Missile Command because it's too hard with the old trackball. So, there's a touch screen version of Missile Command. So, you have a chance.
: Do you — You have an old car, don't you? Don't you have like an old Jaguar?
: Yeah. How did you know that? Let's pause for that. I have a '61 series 1 E-type Jaguar.
: I love cars.
: It's great.
: Yeah, I love old cars.
: The only-
: That's one of the things-
: Yeah, the only two gasoline cars I have are that and an old Ford Model T that a friend of mine gave me. Those are my only two gasoline cars.
: Is the Ford Model T all stock? Oh, there's your car. Look at that.
: I have the convertible.
: That is a gorgeous car.
: It's a soft top.
: God, that's a good looking car.
: Is that yours?
: That is — It's not mine. It's extremely close to mine, but I don't have a front license plate on mine.
: It's a beautiful car. They nailed it. That new type-
: Mine looks like that.
: God, they nailed that.
: That's what mine looks like. Maybe it is mine.
: There's certain iconic shapes.
: And there's something about those cars too. They're not as capable, not nearly as capable as like a Tesla, but there's something really satisfying about the mechanical aspect of like feeling the steering, and the-
: … grinding of the gears and the shifting. Something about those that's extremely satisfying even though they're not that competent. Like I have a 1993 Porsche 964. It's like lightweight. It's an RS America. It's not very fast. It's not like in comparison to a Tesla or anything like that. But the thing about it is like it's mechanical, you feel it. Everything's like-
: It's like it gives you this weird thrill, like you're on this clunky ride, and there's all this feedback. There's something to that.
: Yeah. Yeah, absolutely. I mean, yeah. My E-Type has, like, basically no electronics.
: And so, you like that, but you also like electronics.
: Like the Tesla is, like, the far end of electronics.
: It drives itself.
: It's driving itself better every day.
: We're about to release the software that will enable you to just turn it on, and it'll drive from highway on ramp, to highway exit, do lane changes, overtake other cars-
: To go from one interchange to the next. If you get on, say, the 405, get off 300 miles later, and go through several highway interchanges, and just overtake other cars, and hook into the nav system, and then-.
: And you're just meditating, om.
: While your car is just traveling.
: It's kind of eerie. It's kind of eerie.
: What did you think when you saw that video of that dude fallen asleep behind the wheel? I'm sure you've seen it, the one in San Francisco, like right outside of San Jose. He's out cold, like this. And the car's inching along bumper-to-bumper in traffic, moving along.
: You've seen it, right?
: Yeah, yeah. We just changed the software. Changed the software. That's, I think, an old video. We changed software. If you don't touch the wheel, it will gradually slow down, and put the emergency lights on, and wake you up.
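The fallback described here (no steering input, then escalating alerts, then a controlled slowdown with hazards on) is, at heart, a small escalation state machine. A purely hypothetical sketch, not Tesla's actual code; the class, state names, and timings are all invented for illustration:

```python
# Hypothetical driver-attention fallback, loosely following the behavior
# described: escalate from a visual nag to an audible alert to a gradual
# stop with emergency lights on. All thresholds are made up.
from dataclasses import dataclass

@dataclass
class AttentionMonitor:
    hands_off_s: float = 0.0  # seconds since steering-wheel torque was last detected

    def tick(self, dt: float, wheel_touched: bool) -> str:
        if wheel_touched:
            self.hands_off_s = 0.0
            return "normal"
        self.hands_off_s += dt
        if self.hands_off_s < 15:
            return "normal"
        if self.hands_off_s < 30:
            return "visual_nag"
        if self.hands_off_s < 45:
            return "audible_alert"        # the honk that wakes you up
        return "slow_down_hazards_on"     # gradual stop, emergency lights on

m = AttentionMonitor()
states = [m.tick(dt=10, wheel_touched=False) for _ in range(6)]
print(states)
```

With the wheel never touched, the sketch walks through each escalation stage in order, which is the point: the car never simply lets the driver stay asleep.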
: Oh, that's hilarious.
: That's hilarious.
: Can you choose what voice wakes you up?
: Well, it's sort of more of a — It sort of honks.
: It honks.
: There should be like, "Wake up, fuckface. You're endangering your fellow humans."
: We could gently wake you up with a sultry voice.
: That would be good like something with a southern accent. "Hey, wake up."
: Wake up, sunshine.
: Hey, sweetie.
: Why don't you wake up?
: You could pick your-
: Right, like-
: Like whatever you want. Yes.
: Yeah, I choose the Australian girl for Siri.
: I like her voice.
: Do you want it seductive?
: It's my favorite. I like Australian.
: What flavor? Do what you want it to be angry. It could be anything.
: You want those Australian prison lady genes. Now, when you program something like that in, is this in response to a concern, or is it your own?
: Do you look at it and go, "Hey, they shouldn't just be able to fall asleep. Let's wake them up"?
: Yeah, yeah. It's like — You know, we're like — Yeah, people are falling asleep. We've got to do something about that.
: Right. But when you first released it, you didn't consider it, right? You're just like, "Well, no one's going to just sleep."
: People fall asleep in their cars all the time.
: All the time.
: They crash.
: Yeah, it's horrible.
: At least, our car doesn't crash. That's better.
: It's better not to crash.
: Imagine if that guy had fallen asleep in a gasoline car, they do all the time.
: For sure, yeah.
: They would crash into somebody.
: And, in fact, the thing that really, you know, got me to — It's like, "Man, we better get autopilot going and get it out." A guy was in an early Tesla driving down the highway, and he fell asleep, and he ran over a cyclist, and killed him. I was like, "Man, if we had autopilot, he might have fallen asleep, but, at least, he wouldn't have run over that cyclist."
: So, how did you implement it? Like did you just use cameras and-
: … programmed the system, so that if it sees images, it slows down? And how much time do you get? And like-
: Does the person who's in control of it get to program how fast it goes?
: Yes. Yeah, you can program it to be more or less, like more conservative or like more aggressive driver. And you can say what speed you want it to — What speed is okay.
: I know you have ludicrous mode. Do you have douche bag mode?
: Well, in-
: It just cuts people off.
: Well, for lane changes, it's tricky because if you're in like LA, like unless you're pretty aggressive, right, it's hard to change lanes sometimes.
: You can't. It's hard to be Satnam. It's hard to be Namaste here in LA.
: If you want to hit that Santa Monica Boulevard off-ramp-
: I mean, you've got to be a little pushy.
: You've got to be a little pushy, yeah.
: On the freeway.
: Especially if you were angry.
: If you're a little angry, they don't let you in, and they speed up.
: Sometimes, yeah, I think, people like overall are pretty nice on the highway, even in LA, but sometimes they're not.
: Do you think the Neuralink will help with that?
: Everybody will be locked in together, this hive mind.
: Tunnels will help it. We wouldn't have traffic.
: That will help a lot.
: How many of those can you put in there?
: Nice thing about tunnels-
: Are you thinking about for everybody?
: Nice thing about tunnels is you can go 3D.
: Oh right.
: So, you can go many levels.
: Until you hit.
: Yeah, but you can go — You can have 100 levels of tunnels.
: Jesus Christ. I don't want to be on 99. That would be a negative 99 floors.
: This is one of the fundamental things people don't appreciate about tunnels is that it's not like roads. The fundamental issue with roads is that you have a 2D transport system and a 3D living and workspace environment. So, you've got all these tall buildings or concentrated work environments. And then, you want to go into those like 2D transport system with-
: Hugely inefficient.
: … pretty low density, because cars are spaced out pretty far. And so, that, obviously, is not going to work. You're going to have traffic guaranteed. But if you can go 3D on your transport system, then you can solve all traffic. And you can either go 3D up with a flying car, or you can go 3D down with tunnels. You can have as many tunnel levels as you want, and you can arbitrarily relieve any amount of traffic. You can go further down with tunnels than you can go up with buildings. You can go 10,000 feet down if you want. I wouldn't recommend it, but-
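The "go 3D" argument reduces to simple arithmetic: corridor throughput scales linearly with the number of stacked tunnel levels. A back-of-the-envelope sketch; the lane counts and per-lane flow rates are illustrative assumptions, not Boring Company figures:

```python
# Back-of-the-envelope: stacked tunnel levels multiply a corridor's
# capacity linearly, since each level is an independent roadway.
def corridor_capacity(levels: int, lanes_per_level: int = 2,
                      vehicles_per_lane_hour: int = 2000) -> int:
    """Vehicles per hour the corridor can move (illustrative numbers)."""
    return levels * lanes_per_level * vehicles_per_lane_hour

surface = corridor_capacity(levels=1)   # a single 2-lane surface road
stacked = corridor_capacity(levels=10)  # ten tunnel levels underneath it
print(surface, stacked)                 # 4000 40000
```

The takeaway matches the claim in the conversation: a surface road is stuck at one level, while tunnels can keep adding levels until the capacity matches demand.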
: What was that movie with — What's his face? Bradley — Not Bradley Cooper, Christian? No. What the fuck is his name? Batman. Who is Batman?
: Christian Bale.
: Christian Bale, where they fought dragons. Him and Matthew McConaughey. He went down deep into the earth. How deep can you go?
: I don't think that was Batman.
: Yeah, it was. Yeah, it was.
: Batman fought dragons? I don't-
: No, it wasn't Batman but it's Christian Bale.
: Reign of Fire.
: Reign of Fire.
: Never saw that?
: Terrible. Terrible but good. I would look at it some time.
: I wouldn't recommend drilling super far down but the earth is a big-
: Yeah, but you can't drill deep. It gets hot, right?
: … molten
: The earth is a giant ball of lava with a thin crust on the top, which we think of as the surface, this thin crust. And it's mostly just a big ball of lava. That's earth, but 10,000 feet is not a big deal.
: Have you given any consideration whatsoever to the flat earth movement?
: That's a troll situation.
: Oh, it's not. No, it's not. You would like to think that-
: … because you're super genius. But I, as a normal person, I know these people are way dumber than me. And they really, really believe. They watch YouTube videos, which go on uninterrupted, and spew out a bunch of fucking fake facts very eloquently and articulately. And they really believe. These people really believe.
: I mean, if it works for them, sure. Fine.
: It's weird though, right, that in this age where, you know, there's ludicrous mode in your car, it goes zero to 60 in 1.9 seconds.
: That's 2.2.
: 2.2. Which one's 1.9? The Roadster?
: The Next Generation Roadster.
: Standard edition.
: Yeah, I'm on top of this shit.
: That's just without-
: Standard edition.
: Yeah. So, it's not the performance package.
: What performance package?
: What the fuck do you need?
: We put a rocket thruster in it.
: For real?
: What are they going to burn?
: Nothing. Ultrahigh pressure compressed air.
: Whoa. Just air?
: They're just called cold gas thrusters.
: Then, do you have the air tanks or the-
: Sucking air, okay.
: Yeah. It has an electric pump.
: Pump it up like 10,000 PSI.
: And how fast are we talking? Zero to 60.
: How fast you want to go?
: I want to go-
: We could make this thing fly.
: I want to go back in time.
: I can make it fly.
: You make it fly?
: Do you anticipate that as being — I mean, you're talking about the tunnels and then flying cars. Do you really think that's going to be real?
: It's too noisy, and there's too much airflow. So, the fundamental issue with flying cars: I mean, if you get, like, one of those toy drones, think of how loud those are and how much air they blow. Now, imagine if that's, like, a thousand times heavier. This is not going to make your neighbors happy. Your neighbors are not going to be happy if you land a flying car in your backyard.
: It will be very helicopter-like.
: Or on your roof. It's just really going to be like, "What the hell. That was annoying."
: You can't even — Like, if you want a flying car, just put some wheels on a helicopter.
: Is there a way around that? Like, what if they figure out some sort of magnetic technology, like all those Bob Lazar type characters who were thinking that was a part of the UFO technology they were doing at Area 51? Remember, didn't they have some thoughts about magnetics?
: Nope.
: No? Bullshit?
: Yeah. There's a fundamental momentum exchange with the air. So, you must accelerate the air. You have a mass, and you have gravitational acceleration. And your mass times gravity must equal the mass flow of the air times the acceleration of that airflow to have a neutral force: MG = MA.
: So, it's impossible to go around-
: And then you won't move.
: If MG is greater than MA, you will go down. And if MA is greater than MG, you will go up. That's how it works.
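The force balance being described can be put in numbers. A minimal hover sketch; the vehicle mass and downwash velocity here are assumptions for illustration, not figures from the conversation:

```python
# Hover force balance, as described above: to hover, thrust must equal weight,
# and thrust comes from momentum exchange with the air: mass * g = mdot * v,
# where mdot is the mass of air moved per second and v the velocity given to it.
g = 9.81          # m/s^2, gravitational acceleration
mass = 1000.0     # kg, assumed vehicle mass (illustrative)
v_air = 50.0      # m/s, assumed downwash velocity (illustrative)

mdot = mass * g / v_air   # kg/s of air that must be pushed downward
print(f"Required air mass flow: {mdot:.1f} kg/s")  # 196.2 kg/s
```

Pushing roughly 200 kg of air downward every second is why the noise and airflow objections in the surrounding discussion are hard to engineer away.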
: There's just no way around that?
: There is definitely no way around it.
: There's no way to create some sort of a magnetic something or another that allows you to float?
: Technically, yes. You could have a strong enough magnet, but that magnet would be so strong that you would create a lot of trouble.
: It would just suck cars up into your car? Just pick up axles and do that?
: I mean, it would have to repel off of either material on the ground or, in a really nutty situation, off of Earth's gravitational field, and somehow make that incredibly light. But that magnet would cause so much destruction. You'd be better off with a helicopter.
: So, if there was some sort of magnet road, like you have two magnets, and they repel each other, if you had some sort of a magnet road that was below you, and you could travel on that magnet road, that would work?
: Yes. Yes, you can have a magnet road.
: A magnet road. Is that too ridiculous?
: No, it will work. So, you could do that.
: That's ridiculous too, right?
: I would not recommend it.
: There's a lot of things you don't recommend.
: I would super not recommend that. Not good. Not wise, I think.
: Magnet roads?
: No. No. No, definitely not. Definitely not. Yeah, it would cause a lot of trouble.
: So, you put some time and consideration into this, you know, unlike my foolishly rendered thoughts. So, you think that tunnels are the way to do it?
: Oh, it will work, for sure.
: That'll work?
: And these tunnels that you're building right now, these are basically just like test versions of this ultimate idea that you have?
: You know, it's just a hole in the ground.
: Right. We played videos of it where your ideas-
: It's just a hole in the ground.
: … that you drop into that hole in the ground. There's a sled in it, and the sled goes very fast, like 100 miles an hour plus.
: Yeah, it can go real fast. You can go as fast as you want. And then, if you want to go long distances, you can just draw the air out of the tunnel, make sure it's real straight.
: Draw the air out of the tunnel?
: Yeah, it's sort of a vacuum tunnel. And then, depending on how fast you want to go, you can use wheels, or you could use air bearings depending upon the ambient pressure in the tunnel, or you could maglev it if you want to go super fast.
: So, magnet road?
: Yes, underground magnet roads.
: Underground magnet roads?
: Yeah. Otherwise, you're going to really create a lot of trouble because of those metal things.
: Oh. So, magnet road is the way to go, just underground.
: If you want to go really fast underground, you would be maglev in a vacuum tunnel.
: Maglev in a vacuum tunnel.
: Magnetic levitation in a vacuum tunnel. Fun.
: With rocket launchers?
: No, I would not recommend putting any-
: Come on.
: … exhaust gas in the tunnel.
: Oh, okay. I see what you're saying because then the air will be gone.
: Because, then, you'd have to pump it out.
: Right. You have to pump it out, and you probably have a limited amount of air in the first place. Like, how much can you breathe? Do you have to pump oxygen into these cubicles, these tubes?
: No. We have a pressurized pod. It'd be like a little tiny underground spaceship basically.
: Like an airplane because you have air on airplanes. It's not getting new air in.
: It is.
: It is?
: You have like a little hole?
: Yeah, they have a pump.
: So, it gets it from the outside?
: Wow, I didn't know that.
: Airplanes have it easy because, essentially, they're pretty leaky, but-
: Yeah, but as long as the air pumps are working, it's fine. I mean, they have backup pumps, like three pumps, or four pumps, or something. And then, it exhausts through the outflow valve and through whatever seals are not sealing quite right. Usually, the door doesn't seal quite right on the plane, so there's a bit of leakage around the door. But the pumps exceed the outflow rate, and that sets the pressure in the cabin.
: Now, have you ever looked at planes and gone, "I can fix this."
: "I just don't have the time."
: I have a design for a plane.
: You do?
: A better design?
: I mean, probably. I think it is, yes.
: Who have you talked to about this?
: I've talked to friends.
: Friends and-
: I'm your friend.
: Girlfriends and-
: You can tell me. What you got? What's going on?
: Well, I mean, the exciting thing to do would be some sort of electric vertical takeoff and landing, supersonic jet of some kind.
: Vertical takeoff and landing meaning no need for a runway. Just shoot up straight in the air.
: How would you do that? I mean, they do that in some military aircraft, correct?
: Yes. The trick is that you have to transition to level flight. And then, the thing that you would use for vertical takeoff and landing is not suitable for high-speed flight.
: So, you have two different systems? Vertical takeoff is one system?
: I've thought about this quite a lot.
: I guess, thinking about an electric plane is that you want to go as high as possible, but you need a certain energy density in the battery pack because you have to overcome gravitational potential energy. Once you've overcome gravitational potential energy, and you're out at a high altitude, the energy use in cruise is very low. And then, you can recapture a large part of the gravitational potential energy on the way down. So, you really don't need any kind of reserve fuel, if you will, because you have the energy of height, gravitational potential energy. This is a lot of energy.
: So, once you can get high, the way to think about a plane is it's a force balance. A plane that is not accelerating is in a neutral force balance. You have the force of gravity. You have the lift force of the wings. Then, you've got the force of whatever thrusting device, so the propeller, or turbine, or whatever it is. And you've got the resistance force of the air.
: Now, the higher you go, the lower the air resistance is. Air density drops exponentially, but drag increases with the square, and exponential beats the square. The higher you go, the faster you will go for the same amount of energy. And at a certain altitude, you can go supersonic with less energy per mile, quite a lot less energy per mile than an aircraft at 35,000 feet because it's just a force balance.
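The density argument can be sketched with standard rough numbers. The ~8.5 km scale height is a textbook approximation for the atmosphere, and the sea-level speed is an arbitrary assumption, not a figure from the conversation:

```python
import math

# Drag force ~ rho * v^2, and air density falls roughly exponentially with
# altitude: rho(h) = rho0 * exp(-h / H). Holding drag force fixed, the
# achievable speed scales as sqrt(rho0 / rho(h)), so it grows with altitude.
rho0 = 1.225      # kg/m^3, sea-level air density
H = 8500.0        # m, approximate atmospheric scale height

def density(h):
    return rho0 * math.exp(-h / H)

v_sea = 250.0     # m/s, assumed sea-level speed for some fixed drag force
for h in (0, 10_000, 20_000):
    v = v_sea * math.sqrt(rho0 / density(h))
    print(f"{h:>6} m: rho = {density(h):.3f} kg/m^3, same-drag speed = {v:.0f} m/s")
```

The exponential density falloff is why the same drag budget buys dramatically more speed at altitude, which is the core of the supersonic-at-height argument.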
: I'm too stupid for this conversation.
: It makes sense though.
: No, I'm sure it does. Now, when you think about this new idea of design, when you have this idea about improving planes, are you going to bring this to somebody to check it out?
: Well, I have a lot on my plate.
: Right. That's what I'm saying. I don't know how you do what you do now, but if you keep coming up with these... It's got to be hard to pawn this off on someone else, like, "Hey, go do a good job with this vertical takeoff and landing system that I want to implement in regular planes."
: The airplane, an electric airplane, isn't necessary right now. Electric cars are important. We need-
: We need some sort of-
: Solar energy is important. Stationary storage of energy is important. These things are much more important than creating an electric supersonic VTOL. Also, you really want high energy density in the battery for an aircraft, and that's improving over time. So, you know, it's important that we accelerate the transition to sustainable energy. That's why it matters whether electric cars happen sooner or later. You know, we're really playing a crazy game here with the atmosphere and the oceans.
: We're taking vast amounts of carbon from deep underground and putting this in the atmosphere. It's just crazy. We should not do this. It's very dangerous. So, we should accelerate the transition to sustainable energy. I mean, the bizarre thing is that, obviously, we're going to run out of oil in the long term. You know, we're going to — There's only so much oil we can mine and burn. It's totally logical. We must have a sustainable energy transport and energy infrastructure in the long term.
: So, we know that's the endpoint. We know that. So, why run this crazy experiment where we take trillions of tons of carbon from underground and put it in the atmosphere and oceans? This is an insane experiment. It's the dumbest experiment in human history. Why are we doing this? It's crazy.
: Do you think this is a product of momentum that we started off doing this when it was just a few engines, a few hundred million gallons of fuel over the whole world, not that big of a deal? And then, slowly but surely over a century, it got out of control. And now, it's not just our fuel, but it's also, I mean, fossil fuels are involved in so many different electronics, so many different items that people buy. It's just this constant desire for fossil fuels, constant need for oil-
: Without consideration of the sustainability.
: You know, the things like oil, oil, coal, gas, it's easy money.
: It's easy money. So-
: Have you heard about clean coal? The president's been tweeting about it. It's got to be real. CLEAN COAL, all caps. Did you see? He used all caps. Clean coal.
: Well, you know, it's very difficult to put that CO2 back in the ground. It doesn't like being in solid form.
: Have you thought about something like that?
: It takes a lot of energy.
: Like some sort of a filter, a giant building-sized filter that sucks carbon out of the atmosphere? Is that possible?
: No, no, it doesn't. It's not possible.
: Nope, definitely not.
: So, we're fucked?
: No, we're not fucked. I mean, this is quite a complex question.
: You know, the more carbon we take out of the ground and add to the atmosphere, and a lot of it gets permeated into the oceans, the more dangerous it is. I think we're okay right now. We can probably even add some more, but the momentum towards sustainable energy is too slow.
: Like, there's a vast base of industry, a vast transportation system. There's two and a half billion cars and trucks in the world. And new car and truck production, even if it were 100% electric, is only about 100 million per year. So, if you could snap your fingers and instantly turn all new cars and trucks electric, it would still take 25 years to change the transport base to electric. It makes sense because how long does a car or truck last before it goes into the junkyard and gets crushed? About 20 to 25 years.
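The turnover arithmetic here is simple division, using the figures as quoted in the conversation:

```python
# ~2.5 billion vehicles in the world, ~100 million produced per year: even if
# every new vehicle were electric, full fleet turnover takes fleet / production.
fleet = 2_500_000_000          # cars and trucks in the world
production = 100_000_000       # new vehicles per year
years = fleet / production
print(f"Years to turn over the fleet: {years:.0f}")  # 25
```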
: Is there a way to accelerate that process, like some sort of subsidies or some encouragement from the government financially?
: Well, the thing that is going on right now is that there is an inherent subsidy in any oil-burning device. Any power plant or car is fundamentally consuming the carbon capacity of the oceans and atmosphere, or just the atmosphere for short. So, like, you can say, okay, there's a certain probability of something bad happening past a certain carbon concentration in the atmosphere.
: And so, there's some uncertain number where if we put too much carbon into the atmosphere, things overheat, oceans warm up, ice caps melt, ocean real estate becomes a lot less valuable, you know, if something's underwater, but it's not clear what that number is. But, definitely, scientists, it's really quite — The scientific consensus is overwhelming. Overwhelming.
: I mean, I don't know any serious scientist, actually zero, literally zero, who doesn't think, you know, that we have quite a serious climate risk that we're facing. And so, that's fundamentally a subsidy occurring with every fossil fuel-burning thing: power plants, aircraft, cars, frankly even rockets. I mean, rockets burn fuel, but with rockets, there's just no other way to get to orbit, unfortunately. So, it's the only way.
: But with cars, there's definitely a better way with electric cars. And to generate the energy, do so with photovoltaics because we've got a giant nuclear reactor in the sky called the sun. It's great. It sort of shows up every day, very reliable. So, if you can generate energy from solar panels, store up with batteries, you can have energy 24 hours a day.
: And then, you know, you can send power to the poles or to the north with, you know, high-voltage lines. Most of the northern parts of the world tend to have a lot of hydropower as well. But, anyway, all fossil fuel-powered things have an inherent subsidy, which is their consumption of the carbon capacity of the atmosphere and oceans.
: So, people tend to think like why should electric vehicles have a subsidy, but they're not taking into account that all fossil fuel-burning vehicles fundamentally are subsidized by the cost, the environmental cost to earth, but nobody's paying for it. We are going to pay for it, obviously. In the future, we'll pay for it. It's just not paid for now.
: And what is the bottleneck in regards to electric cars, and trucks, and things like that? Is it battery capacity?
: Yeah. You got to scale up production. You got to make the car compelling, make it better than gasoline or diesel cars.
: Make it more efficient in terms of, like, the distance it can travel? You're going to be fueling-
: Yeah, you're going to be able to go far enough, recharge fast.
: And your Roadster, you're anticipating 600 miles. Is that correct?
: Yeah, yeah.
: What is it? What is that?
: Yeah, 600 miles.
: Is that right now? Like have you driven one 600 miles now?
: No. We could totally make one right now that would do 600 miles, but the thing is too expensive. So, like the car's got to-
: How much more so?
: Well, you know, just have a bigger kilowatt-hour battery pack, and you can go 600 miles as long as you're-
: Right, versus what do you have now?
: 330-mile range. That's plenty for most people.
: 330-mile range. And what does that mean in terms of kilowatts?
: Well, for a Model S, a 100-kilowatt-hour pack will do about 330 miles. Maybe 335. Some people have hypermiled it to 500 miles.
: Hypermiled it. What does that mean?
: Yeah, just like go on-
: 45 miles an hour or something?
: Yeah, like 30 miles an hour or so. It's like on level ground with — You pump the tires up really well, and go on a smooth surface, and you can go for a long time. But, you know, like definitely comfortably do 300 miles.
: Is there any-
: This is fine for most people. Usually, 200 or 250 miles is fine. 300 miles is — You don't even think about it really.
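The pack and range figures quoted above imply a consumption number, and that number sizes a 600-mile pack. A back-of-envelope check, assuming roughly constant consumption per mile:

```python
# A 100 kWh pack giving ~330 miles implies an average consumption of
# pack energy / range. The same consumption then sizes a 600-mile pack.
pack_kwh = 100.0
range_miles = 330.0
wh_per_mile = pack_kwh * 1000.0 / range_miles
print(f"Consumption: ~{wh_per_mile:.0f} Wh/mile")        # ~303 Wh/mile

pack_for_600 = 600.0 * wh_per_mile / 1000.0
print(f"Pack for 600 miles: ~{pack_for_600:.0f} kWh")    # ~182 kWh
```

Needing nearly twice the battery is why the 600-mile car is described as technically doable but too expensive.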
: Is there any possibility that you could use solar power, make them solar-powered one day, especially in Los Angeles? I mean, as you said, it's a giant nuclear reactor, a million times bigger than Earth, just floating in the sky. Is it possible that one day, you'll be able to just power all these cars on solar power? I mean, we don't ever have cloudy days, and if we do, it's just three of them.
: Well, the surface area of a car is pretty small, without making the car look really blocky or having some-
: Like a G wagon.
: Yeah, and just like if it looked a lot of surface area, or like maybe like solar panels fold out, or something-
: Like your E class. That's what it needed.
: That E type?
: Yeah, the Jaguar E type with a giant long hood, that could be a giant solar panel.
: Well, at the beginning of Tesla, I did want to have this unfolding solar panel thing. You'd press a button, and it would just unfold these solar panels and recharge your car in the parking lot. Yeah, we could do that, but I think it's probably better to just put that on your roof.
: Right, yeah.
: And then, it's going to — It should be facing the sun all the time because like-
: What car have that on the roof?
: Otherwise, your car could be in the shade. You know, it could be in the shade, it could be in a garage, or something like that.
: Didn't the Fisker have that on the roof? The Fisker Karma New Generation for — I believe, it was only for the radio. Is that correct?
: Yeah, I mean, but I think it could like recharge like two miles a day or something.
: Did you laugh when they started blowing up when they get hit with water? Do you remember what happened?
: They got what?
: Yeah, they had a dealership or-
: Oh yeah.
: The Fisker Karmas were parked-
: Is that like that with a flood in Jersey?
: Yes, yes.
: When the hurricane came in, they got overwhelmed with water, and they all started exploding. There's a fucking great video of it. Did you watch the video?
: I didn't watch the video, but I did see — It's like some picture of the aftermath.
: If I was you, I'd be naked, lubed up, watch that video, laugh my ass off. They all blow up. They got wet, and they blew up. That's not good.
: Yeah, we made our battery waterproof, so that doesn't happen. Actually-
: Smart move.
: Yeah, there was a guy in Kazakhstan that — I think it was Kazakhstan that he just boated through a tunnel, an underwater tunnel, like a flooded tunnel, and just turned the wheels to steer, and pressed the accelerator, and it just floated through the tunnel.
: And he steered around the other cars. I mean, like-
: That's amazing.
: It's on the internet.
: What happens if your car gets a little sideways, like if you're driving in snow? Like what if you're driving, if you're autopilot is on, and you're in like Denver, and it snows out, and your car gets a little sideways, does it correct itself? Does that-
: Oh yeah. It's got great traction control.
: But does it know how to, like, correct? You know how, like, when your ass end-
: Oh yeah, sure.
: … kicks out, you know how to counter-steer?
: Oh, yeah. No, it's really good.
: It knows how to do it?
: It's pretty crazy.
: That's pretty crazy.
: So, like if you're going sideways, it knows how to correct itself?
: It generally won't go sideways.
: It won't?
: Why not?
: It will correct itself before it goes sideways.
: Even on black ice?
: Yeah. There's videos where you could see the car, the traction-
: Not alone.
: The traction control system is very good. It makes you feel like Superman. It's great. It will make you feel like this incredible driver.
: I believe it.
: Now, how do you program that?
: We do have testing on like an ice lake in Sweden.
: Oh really?
: Yeah. And like Norway, and Canada, and a few other places.
: Porsche does a lot of that too? They do-
: They did it as well?
: They do some of their driver training school on these frozen surfaces. So, the car is going sideways whether you like it or not, and you have to learn how to slide into corners. Is that how you test?
: Yeah. Electric cars have really great traction control because the reaction time is so fast.
: With a gasoline car, you've got a lot of latency. It takes a while for the engine to react, but electric motors are incredibly precise. Like, imagine if you had a printer or something. You wouldn't have a gasoline-engine printer. That would be pretty weird. Or a surgical device. It's going to be an electric motor on the surgical device, on the printer. A gasoline engine would just be chugging away. It's not going to have the reaction time.
: But an electric motor is operating at the millisecond level. So, it can turn traction on and off within, like, inches. Let's say you're driving over a patch of ice. It will turn traction off, and then turn it back on a couple inches after the ice, because in the frame of the electric motor, you're moving incredibly slowly. You're like a snail. You're just moving so slowly because it can see at a thousand frames a second. So, by the time you say "one Mississippi," it's thought about things a thousand times.
: So, it realizes that your wheels are not getting traction. It understands there's some slippery surface that you're driving on.
: And it makes adjustments in real time.
: Yes, in milliseconds.
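The "thousand frames a second" point can be put in distance terms. A quick sketch; the 1 kHz loop rate comes from the conversation, while the 65 mph speed is an assumption for illustration:

```python
# At a 1 kHz control loop, how far does the car travel between
# traction-control updates at highway speed?
speed_mph = 65.0                     # assumed highway speed
speed_mps = speed_mph * 0.44704     # mph -> m/s conversion
loop_hz = 1000.0                    # control-loop rate ("1000 frames a second")
cm_per_cycle = speed_mps / loop_hz * 100.0
print(f"~{cm_per_cycle:.1f} cm per control cycle")  # ~2.9 cm
```

A few centimeters per decision is what makes the "turn traction off over the ice, back on right after it" behavior plausible.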
: That would be so much safer than a regular car.
: Yes, it is.
: Just that alone, for loved ones, you'd want them to be driving your car.
: Yes. The-
: Or on board. Fuck motors. Dude, fuck regular motors.
: The S, X, and 3 have the lowest probability of injury of any cars ever tested by the US government.
: Yeah, but it's pretty crazy. Like, you know, people still sue us. They'll have some accident at 60 miles an hour where they twisted an ankle, and they slipped. They would have been dead in another car, and they still sue us.
: But that's to be expected, isn't it?
: It is to be expected.
: Do you take that into account with like the same sort of fatalistic, you know, undertones to sort of just go, "You've got to just let it go. This is what people do."
: I tell you I've got-
: This is what it is.
: … quite a lot of respect for the justice system. Judges are very smart. So far, I've found judges to be very good at justice, and juries are good too. Like, they're actually quite good. You know, you read about occasional errors in the justice system, but let me tell you, most of the time, they're very good.
: Like the guy I mentioned who fell asleep in the car and rode over a cyclist. That was what encouraged me to get autopilot out as soon as possible. That guy sued us.
: He sued you for falling asleep?
: Yes. I'm not kidding. He blamed it on the new car smell.
: He blamed him falling asleep on your new car smell. Does someone that's a lawyer-
: This is a real thing that happened.
: Someone that's a lawyer that thought that through in front of his laptop before he wrote that up.
: Yes, he got a lawyer, and he sued us, and the judge was like, "This is crazy. Stop bothering me. No."
: Thank God.
: Thank God. Thank God there's a judge out there with a brain.
: I tell you, judges are very good.
: Some of them.
: I have a lot of-
: What about that judge that sent all these boys up the river in Pennsylvania who was selling those kids out? You know about that story?
: Judge was selling young boys to prisons. He was like literally-
: Yeah, literally, under bribes for — He was-
: Was this an elected judge or-
: He was-
: Because sometimes you have a judge that's like actually a politician.
: No, he was an elected judge. This is a very famous story.
: He's in jail right now, I think, for the rest of his life. He would take a young boy who did something like steal something from a store, and he would put him in detention for, you know, five years. Something ridiculously egregious. And they investigated his history, and they found out that he was literally being paid off. Was it by private prisons? Is that what the deal was? There was some sort of — But, anyway, this judge is-
: Actually, two judges.
: Two judges?
: Two judges. The "kids for cash" scandal, they call it.
: 2008, yeah. Common pleas judges. So, I think they are elected.
: And who was paying them? It's proven, to the point where they're in jail now, that someone was paying them to put more asses in the seats in these private prisons.
: It was like a million-dollar payment from a youth center builder.
: A million-dollar payment?
: I do think these private prisons thing is-
: Someone's business.
: … creating a bad incentive.
: It's dark.
: Right, yes. But, I mean, that judge is in prison.
: Thank God.
: Yes, but for people who think perhaps the justice system consists entirely of judges like that, I want to assure you-
: … this is not the case. The vast majority of judges are very good.
: I agree.
: And they care about justice, and they could have made a lot more money if they wanted to be a trial lawyer. And instead, they cared about justice, and they made less money because they care about justice. And that's why they're judges.
: I feel that same way about police officers.
: I feel like there's so many interactions with so many different people with police officers that the very few that stand out that are horrific, we tend to look at that like, "This is evidence that police are all corrupt." And I think that's crazy.
: No. Most police are very honest.
: And like the military-
: Like they have an insanely-
: … personnel that I know-
: … are very honorable, ethical people.
: And much more honorable and ethical than the average person. That's my impression.
: I agree. That's my impression as well.
: And that's not to suggest that we be complacent and assume everyone is honest and ethical. And, obviously, if somebody is given a trusted place in society, such as being a police officer or a judge, and they are corrupt, then we must be extra vigilant against such situations-
: … and take action. But we should not think that this is somehow broadly descriptive of people in that profession.
: I couldn't agree more. I think there's also an issue with one of the things that happens with police officers, prosecutors, and anyone that's trying to convict someone or arrest someone is that it becomes a game. And in games, people want to win.
: And sometimes, people cheat.
: Yes, yes. I mean, you know, if you're a prosecutor, you should not always want to win. There are times when you should like, "Okay. I just should not want to win this case." And then, you know, like just pass on that case. Sometimes, people want to win too much. That is true.
: I think, also, it becomes tough. If you're a district attorney, you know, you tend to see a lot of criminals. And then, your view of the world can skew negative.
: You know, you can have a negative view of the world because, you know, you're just interacting with a lot of criminals. But, actually, most of society does not consist of criminals.
: And I actually had this conversation at dinner several years ago with, I guess, it's Tony. I was like, "Man, it must, sometimes, seem pretty, pretty dark because, you know, man, there's some terrible human beings out there." And he was like, "Yup." And he was dealing with some case, which consisted of a couple of old ladies that would run people over somehow for insurance money. It was rough. Like, "Wow, that's pretty rough." It's hard to maintain faith in humanity if you're a district attorney, but, you know, it's only a few percent of society that are actually bad.
: And then if you go to the worst, say, 0.1% of society, one in a thousand, then one in a million, you know. Like, how bad is the millionth worst person in the United States? Pretty damn bad. Like, damn evil.
: Like, one in a million of evil is so evil, people cannot even conceive of it. But there are 330 million people in the United States. So, that's 330 people out there somewhere. But by the same token, there are also 330 people who are incredible angels and unbelievably good human beings.
: On the other side.
: But because of our fear of danger, we tend to — our thoughts tend to gravitate towards the worst-case scenario.
: And we want to frame that. And that's one of the real problems with prejudice, whether it's prejudice towards different minorities, or prejudice towards police officers, or anything. It's like we want to look at the worst-case scenario and say, "This is an example of what this is all about."
: And you see that even with people, how they frame genders. Some men frame women like that. They get ripped off by a few women, and they said, "All women are evil." Some women get fucked over by a few men, "All men are shit." And this is very toxic.
: It is.
: And it's also a very unbalanced way of viewing the world. It's very emotionally based, and it's based on your own anecdotal experience. And it can be very influential to the people around you. It's a dangerous thought process and pattern to promote.
: It is. It is very dangerous. But I really think, you know, people should give other people the benefit of the doubt and assume that they are good until proven otherwise. And, I think, really, most people are actually pretty good people. Nobody's perfect.
: They have to be.
: If you think of vast numbers of us that are just interacting with each other constantly-
: … we have to be better than we think we are.
: Yes. I mean, like-
: There's no other way.
: I mean, here are all these weapons, but, like, presumably nobody's tried to murder you and you're-
: Nobody yet.
: Yes, nobody. It's like the sword right there.
: Not the flamethrower, fake flamethrower here-
: It's not a flamethrower. Now, if we've got a real problem, I'm going to put it on that side, near him, and leave it for the guests.
: I'm like, "Look, man, if I say something that fucked up, it's right there."
: It will liven things up for sure. It's guaranteed to make any party better.
: Yeah. Well, I mean, that's the armed civilization theory, right? An armed community is a safe and polite community.
: You know, in Texas, it's kind of true. Yeah. I mean-
: People in Texas are super polite because they've all got guns.
: Yes. Don't make somebody angry.
: We don't know what's going to happen.
: Yeah, it's a good move.
: Piss people off, and everybody's going to have a gun.
: You're better off just letting that guy get in your lane.
: Yeah, yeah. You know, we got a big test site in Central Texas near Waco.
: Oh yeah? Beautiful.
: Yes, SpaceX in McGregor. It's about 15 minutes away from Waco.
: That's close to where Ted Nugent lives.
: It is?
: Shout out to Ted Nugent.
: Okay, cool.
: Yeah, there's — You know, we have lots of fire, and loud explosions, and things, and people-
: I bet.
: … they are cool with it.
: They don't give a fuck out there.
: They're very supportive.
: Yeah. You can buy fireworks where, you know, your kids go to school.
: Yeah. You know, it's dangerous.
: Yeah, but it's free.
: It's free.
: There's something about Texas-
: … that's very enticing because of that. It is dangerous, but it's also free.
: Yeah. I kind of like Texas actually.
: I prefer it over places that are more restrictive but more liberal, because you could always be liberal. Just because things are free, and just because you have a certain amount of, you know, right-wing type characters, it doesn't mean you have to be that way, you know.
: And, honestly, there's a lot of those people that are pretty fucking open minded and let you do whatever you want to do.
: As long as you don't bother them.
: Yeah, exactly.
: That's my hope right now with the way we're able to communicate with each other today and how radically different it is than generations past because we all — Just, the dust settles. We all realize, like what you're saying that most people are good.
: Most people are good.
: The vast majority?
: Yes. I think if you give people the benefit of doubt, for sure.
: I think you're right. You know who could help with that? Mushrooms.
: Don't you think?
: They're delicious.
: Yeah, right.
: They're good for you too.
: All of them. All kinds of them. What do you see in terms of, like, when you think about the future of your companies, what do you see is like bottlenecks? Want some more of this?
: Sure. Thank you.
: What do you see in terms of like bottlenecks of things that are holding back innovation? Is it regulatory commissions and people that don't understand the technology that are influencing policy? Like what could potentially be holding you guys back right now? Is there anything that you would change?
: Yeah, that's a good question. You know, I wish politicians were better at science. That would help a lot.
: That's a problem.
: There's no incentive for them to be good at science.
: There isn't. Actually, you know, they're pretty good at science in China, I have to say.
: Yeah. The mayor of Beijing has, I believe, an environmental engineering degree, and the deputy mayor has a physics degree. I met them. And the mayor of Shanghai is really smart and-
: You're up on technology. What do you think about this government policy of stopping use of Huawei phones? And there's something about the worry about spying. I mean, from what I understand from real tech people, they think it's horseshit.
: Oh I-.
: Like phones.
: I don't know. I don't know.
: Like the government says, "Don't buy Huawei phones." Are you up on that at all? No? Should we just abandon this idea?
: Well, I think, like, I guess, if you have like top secret stuff, then you want to be pretty careful about what hardware you use. But, you know, like most people do not have top secret stuff.
: And, like, nobody really cares what porn you watch like, you know.
: Right, yeah.
: It's like nobody actually cares, you know. So-.
: If they do, that's kind of them.
: It's just like-
: National spy agencies do not give a rat's ass which porn you watch. They do not care. So, like, what secrets does a national spy agency have to learn from the average citizen? Nothing.
: Well, that's the argument against the narrative. And the argument by a lot of these tech people is that the real concern is that these companies, like Huawei, are innovating at a radical pace, and they're trying to stop them from integrating into our culture and letting this. Like right now, they're the number two cell phone manufacturer in the world.
: Samsung is number one. Huawei is number two. Apple is now number three. They surpassed Apple as number two. And the idea is that this is all taking place without them having any foothold whatsoever in America. There's no carriers that have their phones. You have to buy their phones unlocked through some sort of a third party, and then put-
: And the worry is, you know, that these are somehow or another controlled by the Chinese government. The Communist Chinese government is going to distribute these phones. And I don't know if the worry is economic influence or they'll have too much power. I don't know what it is. Are you paying attention to any of this?
: Not really.
: I don't think we should worry too much about Huawei phones, you know. Maybe, you know, a national security agency shouldn't have Huawei phones. Maybe that's a question mark. But I think for the average citizen, this doesn't matter. I'm pretty sure the Chinese government does not care about the goings-on of the average American citizen.
: Is there a time where you think that there will be no security, it will be impossible to hold back information that whatever bottleneck we'll let go, we're going to give in? That whatever bottleneck between privacy and ultimate innovation will have to be bridged in order for us to achieve the next level of technological proficiency that we're just going to abandon it, and there'll be no security, no privacy?
: Do people want privacy? Because they seem to put everything on the internet. Practically-.
: Well, right now, they are confused. But when you're talking about your Neuralink, and this idea that one day we're going to be able to share information, and we're going to be some sort of a thing that's symbiotically connected?
: Yeah. I think we'd really worry about security in that situation.
: And when-
: For sure. That's like security will be paramount.
: But, also, what we will be. This will be so much different. Our concerns about money, about status, all of these things will seemingly fall by the wayside if we really become enlightened, if we really become artificially enlightened by some sort of an AI interface where we have this symbiotic relationship with some new internet-type connection to information. But, you know, what happens then? What is important? What is not important? Is privacy important when we're all gods?
: I mean, I think the things that we think are important to keep private right now-
: … we probably will not think going forward.
: Shame, right? Information, right? What are we hiding? Emotions? What are we hiding?
: I mean, I think, like, I don't know. Maybe it's like embarrassing stuff.
: Right, embarrassing stuff.
: But there's actually — Like, I think, there's not that much that's kept private that is actually relevant.
: That other people would actually care about. When you think other people care about it, but they don't really care about it. And, certainly, governments don't.
: Well, some people care about it. But, then, it gets weird when it gets exposed. Like Jennifer Lawrence, when those naked pictures got exposed, like, I think, in some ways, people liked her more.
: They realized like she's just a person. It's just a girl who likes sex, and is just alive, and has a boyfriend, and sends him messages. And, now, you get to look into it, and you probably shouldn't have, but somebody let it go, and they put it online, and all right.
: She seems to be doing okay.
: She's a person. She's just you, and me, and it's the same thing. She's just in some weird place where she's on a 35-foot tall screen with music playing every time she talks.
: Yeah. I mean, I'm sure like not-
: No, but she's fine.
: She's not happy about it, but she's-
: But she's clearly doing fine.
: But once this interface is fully realized where we really do become something far more powerful in terms of our cognitive ability, our ability to understand irrational thoughts, and mitigate them, and that we're all connected in some sort of an insane way. I mean, what are our thoughts on wealth, our thoughts on social status? Like how many of those just evaporate? And our need for privacy, maybe our need for privacy will be the ultimate bottleneck that we'll have to surpass.
: I think, the things that we think are important now will probably not be important in the future, but there will be things that are important. It's just, like, different things.
: What will be more important?
: I don't know. There might be some war of ideas, potentially. I don't think Darwin's going away.
: Darwin's going to be there.
: That was that, yeah.
: Darwin will be there forever.
: Forever, yeah.
: It would just be a different arena. Different arena.
: A digital arena.
: Different arena. Darwin is not going away.
: What keeps you up at night?
: Well, it's quite hard to run companies.
: Especially car companies, I would say. It's quite challenging.
: The car business is the hardest one of all the things you do?
: Yes, because it's a consumer-oriented business as opposed to like SpaceX and-
: Not that SpaceX because SpaceX is no walk in the park, but a car company, it's very difficult to keep a car company alive. It's very difficult. You know, there's only two companies in the history of American car companies that haven't gone bankrupt, and that's Ford and Tesla. That's it.
: Yeah, Ford rode out that crazy storm, huh? They're the only one.
: By the skin of their teeth.
: Shout out to the Mustang.
: Yeah, by the skin of their teeth. That is interesting, right?
: Same with Tesla, we barely survived.
: How close did you get to folding?
: Very close. I mean, 2008 is not a good time to be a car company, especially a startup car company, and especially an electric car company. That was like stupidity squared.
: And this is when you had those cool Roadsters with the T-top?
: With a targa top?
: Yeah. We had like a — It was a highly modified Elise chassis. The body was completely different. By the way, that was a super dumb strategy that we actually did because we-
: What's dumb?
: It was based on two false premises. One false premise was that we would be able to cheaply convert the Lotus Elise and use that as a car platform, and that we'd be able to use technology from this little company called AC Propulsion for the electric drivetrain and the battery. But the AC Propulsion technology did not work in production, and we ended up using none of it in the long term. None of it. We had to redesign everything.
: And then, once you added a battery pack and electric motor to the car, it got heavier. It got 30% heavier. It invalidated the entire structure, all the crash structure. Everything had to be redone. Nothing. Like, I think, less than 7% of the parts were common with any other car.
: Everything? Including tires, and wheels, bolts, brakes?
: Yeah, even every-
: Steering wheel? Seat?
: The steering wheel was — I think, the steering wheel was almost the same. Yes, the windscreen. The windscreen.
: No. I think, the windscreen is the same.
: Yes. I think, we were able to keep the windscreen.
: But the last was 7%. So, that's basically-
: Every body panel is different. The entire structure was different. We couldn't use the, like, the HVAC system, the air conditioner. It was a belt-driven air conditioner. So, now, we needed something that was electrically driven. We needed a new AC compressor.
: And all that takes away from the battery life as well, right?
: Yeah. We needed a small, highly efficient air conditioning system that fit in a tiny car and was electrically powered, not belt-driven. It was very difficult.
: How much do those weigh, those cars, the Roadster?
: I think it was 2700 pounds.
: That's still very light.
: 27. Depending on which version, 2,650 to 2,750 pounds, something like that.
: And what was the weight distribution?
: It was about 50 — Well, there were different versions of the car. So, it's about 55 on the rear.
: That's not bad.
: It was rear bias.
: Right, but not bad. Considering like a 911, which is like one of the most popular sports cars of all time. Heavy rear end bias.
: Well, I mean, yeah. The 911, no joke, is like the master despite Newton not being on their side.
: I guess, fighting Newton, it's very difficult.
: It's like you've got those — The moments of inertia on a 911 don't make any sense.
: They do once you understand them. Once you understand-
: You don't want to hang the engine off the ass. This is not a wise move.
: You don't want to let up on the gas when you're in a corner.
: The problem with something where the engine is mounted over the rear axle or off the rear axle towards the rear is that your polar moment of inertia is fundamentally screwed. You cannot solve this. It's unsolvable. You're screwed. Polar moment of inertia, you're screwed.
: Like, essentially, if you spin the car like a top, that's your polar moment of inertia. You're just — I promised I wouldn't swear on this show, by the way.
: Says who?
: This was for a friend.
: Tell that friend to go fuck himself. Who told you not to swear?
: A friend.
: He's not a good friend.
: That friend needs to-
: I said I wouldn't swear.
: … realize you're fucking Elon Musk. You can do whatever you want, man. If you ever get confused, call me.
: I'll swear in private. Swear up a storm.
: Okay, just say freaking. It's a fun way. It's like old housewives and moms and shit that have children, "Oh, this freaking thing."
: Yeah. But, anyway, like the Porsche, it's kind of incredible how well it handles given the physics-
: The moments of inertia are so messed up. To actually still make it work well is incredible.
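For the physics behind this exchange: the yaw (polar) moment of inertia grows with the square of each mass's distance from the car's center of gravity, which is why an engine hung behind the rear axle is so punishing. A toy comparison in Python, with all masses and positions invented purely for illustration (not real 911 or Tesla figures):

```python
# Toy yaw-inertia comparison: why hanging the engine off the back of a
# car hurts handling. All numbers below are made up for illustration.

def yaw_inertia(masses):
    """Sum of m * r^2 about the yaw axis through the center of gravity.

    `masses` is a list of (mass_kg, longitudinal_position_m) pairs.
    """
    total_m = sum(m for m, _ in masses)
    cg = sum(m * x for m, x in masses) / total_m
    return sum(m * (x - cg) ** 2 for m, x in masses)

# Chassis plus engine, engine placed near the middle vs. hung off the back.
mid_engine  = [(1000, 0.0), (200, 0.2)]
rear_engine = [(1000, 0.0), (200, 1.6)]

print(yaw_inertia(mid_engine))   # small: mass concentrated near the CG
print(yaw_inertia(rear_engine))  # much larger: same mass, farther out
```

Moving the same 200 kg eight times farther from the middle multiplies its contribution by sixty-four, since inertia scales with distance squared.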
: Well, if you know how to turn into the corner once you get used to the feeling of it, there's actual benefits to it. You know, there are some benefits.
: I enjoyed it. The car I had before Tesla was a 911.
: That was-
: 997 or 6?
: Yeah. Great car, man.
: Yeah. I mean, in particular, the Porsche didn't have the variable vanes on the turbo, so it had the turbo lag.
: That was really great. The turbo lag is, like, you know, if you floor it, like, phone home, call your mom.
: The older one, right?
: It's like about an hour later-
: … the car accelerates.
: And super dangerous too because where it will start spinning and-
: Yeah. There's something fun about it though like feeling that rear weight kicking around, you know. And again-
: No, it's great.
: … it's not efficient.
: It had a good feel to it.
: Yeah, I agree.
: But that's what I was talking about earlier about that little car that I have, the '93 911. It's not fast. It's not the best handling car, but it's more satisfying than any other car I have because it's so mechanical. Everything about it, like cracks, and bumps, it gives you all this feedback. And I take it to the Comedy Store because when I get there, I feel like my brain is just popping, and it's on fire. It's like a strategy for me now. I really stopped driving other cars there. I drive that car there just for the brain juice, just for the-
: The interaction.
: I mean, you should try Model S P100D.
: I'll try it.
: It will blow your mind-
: … and your skull.
: Tell me what to order, I'll order it.
: Model S P100D.
: Okay. Jamie, write it down.
: That's the car that I drive.
: Okay. Okay, I'll get the car you drive. Okay.
: It will blow your mind-
: How far can I drive?
: … out of your skull.
: I believe you.
: How far can I drive? How far can I drive?
: About 300 miles.
: That's good. For LA regular days, that's good.
: You will never notice the battery.
: How hard is it to get like one of them crazy plugs installed in your house? That difficult?
: No, it's super easy. It's like, yeah.
: Do you-
: It's like a dryer plug. It's like a dryer outlet.
: Didn't you come up with some crazy tiles for your roof that are solar paneled?
: Yeah, yeah. I have it on my roof right now actually. I'm just trying it out. The thing is it takes a while to test roof stuff because roofs have to last a long time.
: So, like, you want your roof to last like 30 years.
: Could you put it over a regular roof?
: No. So, there's two versions. It's like the solar panels you put on a roof. So, like, it depends on whether your roof's new or old. So, if your roof's new, you don't want to replace the roof. You want to put like solar panels on the roof.
: So, that's like retrofit, you know. And we're trying to make the retrofit panels look real nice. But then, the new product we're coming out with is, if you're building a house or you're going to replace your roof anyway, then you make the tiles have solar cells embedded in the tiles.
: And then, it's quite a tricky thing because you want to not see the solar cell behind the glass tile. So, you have to really work with the glass, and the various coatings, and the layers, so that you don't see the solar cells behind the glass. Otherwise, it doesn't look right.
: So, it's really tricky.
: There it is. Jamie, put it up there.
: Man, that looks good. Is there a-
: See, like, if you look closely, you can see. If you zoom in, like, you can see the cell. But if you zoom out, you don't see the cell.
: Right, but it looks though.
: Like that's hard.
: That's invisible solar cells.
: It's really hard because you have to get the sunlight to go through.
: But when it gets reflected back out, it hides the fact that there's a cell there.
: Now, are those available to the consumer right now?
: Well, we have — I think, that's-
: Those on that roof right there?
: That's amazing. Oh, that looks good.
: Ooh, I like that.
: That one is hard.
: Oh. So, you get that kind of fake Spanish looking thing. I like that.
: That's French slate.
: That's why people in Connecticut are smoking pipes. Look at that one.
: That's badass, dude. So, now-
: This will actually work.
: I believe you. So, the solar panels that are on that house that we just looked at, is that sufficient to power the entire home?
: It depends on your energy use, on how efficient-
: Yeah, yeah.
: So, generally, yes. I would say it's probably for most. It's going to vary, but anywhere from more than you need to maybe half. Like call it half to 1.5 times the energy that you need, depending on how much roof you have relative to living space.
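That "half to 1.5 times" range can be sanity-checked with a rough sizing sketch. Every number below is an assumed ballpark, not a Tesla spec; panel output, sun hours, and household usage all vary widely by location:

```python
# Rough solar-roof sizing sketch. All constants are assumed ballparks
# for illustration, not product specifications.

ROOF_AREA_M2 = 50            # usable sun-facing roof area
WATTS_PER_M2 = 140           # delivered output per square meter of tile
SUN_HOURS_PER_DAY = 4.5      # average equivalent full-sun hours
HOUSEHOLD_KWH_PER_DAY = 30   # typical household consumption

generated_kwh = ROOF_AREA_M2 * WATTS_PER_M2 * SUN_HOURS_PER_DAY / 1000
coverage = generated_kwh / HOUSEHOLD_KWH_PER_DAY

print(f"{generated_kwh:.1f} kWh/day generated, "
      f"{coverage:.0%} of household demand")
```

With these particular assumptions the roof covers roughly all of the household's demand; a smaller roof or a bigger air conditioning load pushes the ratio down toward the half mentioned above.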
: And how ridiculous you are with your TV.
: TVs no problem. Air conditioning.
: Air conditioning.
: Air conditioning is the problem. If you have an efficient air conditioner, and you don't — and depending on how — like, are you air conditioning rooms when they don't need to be air conditioned, which is very common-
: … because it's a pain in the neck, you know. It's like programming a VCR. It's like-
: Now, it's just blinking 12:00. So, people are just like, "The hell with that. I'm just going to make it this temperature all day long."
: Right. You know, like a smart home where, if you're in the room, then it stays cool, right?
: Yeah, it should predict when you're going to be home, and then cool the rooms that you're likely to use with a little bit of intelligence. We're not talking about like genius home here. We're talking like elementary basic stuff.
: You know, like if you could hook that into the car, like manage you coming home. Like there's no point cooling the home-
: … keeping the home really cool when you're not there.
: But it can tell that you're coming home, it's just going to cool it to the right temperature right when you get there.
: Do you have an app that works with your solar panels or anything like that?
: Yeah. Yeah, we do.
: But we need to hook it into the air conditioning to really make the air conditioning work.
: Have you thought about creating an air conditioning system? I know you have. Trick question.
: Cannot answer questions about the future of potential products.
: Okay. Let's just let it go. We'll move on to the next thing.
: That would be an interesting idea.
: Yeah, I would say radiant heating and all that, good ideas. Now, when you think about the efficiency of these homes, and you think about implementing solar power and battery power, is there anything else that people are missing? Is there any other — Like, I just saw a smartwatch that is powered by the heat of the human body, and some new technology.
: It's able to fully power that way?
: I don't know-
: … if it's fully or if it's — Like this watch right here, this is a Casio.
: It's called a Pro Trek. And it's like an outdoors watch, and it's solar-powered.
: And so, it has the ability to operate for a certain amount of time on solar.
: So, if you have it exposed, it could function for a certain amount of time on solar.
: Yeah. Well, you know, like there's self-winding watches where-
: … you know, it's just got a weight in the watch. And as you move your wrist, the weight moves from one side to the other, and it winds the watch up. That's a pretty cool thing.
: Yeah, yeah.
: Well, it's amazing that, like, with Rolexes, it's all done mechanically.
: There's no batteries in there. There is no nothing.
: Yeah. You could do the same thing. You create a little charger that's based on wrist movement. It really depends on how much energy your watch uses.
: You know what's fucked up about that though? We accept a certain amount of like fuckery with those watches. Like I brought my watch. I have a Rolex that my friend, Lorenzo, gave me, and I brought it to the watch store, and I said, "This thing's always fast." I said, "It's always like after a couple of months, it's like five minutes fast." And they go, "Yup." They go, "Yeah."
: "It's just what it does."
: I go, "Hold on." I go, "So, you're telling me that it just is always going to be fast?" They're like, "Yeah. It's just like every few months, you get like reset it."
: It seems like they should recalibrate that thing.
: They can't. They tried. They say, every few months, whether it's four months, or five months, or six months, it's going to be a couple of minutes fast.
: Okay. It seems like they should really recalibrate that because-
: You should figure that shit out.
: … if it's always fast, you can just-
: … you know, delete those minutes.
: You need to fucking kick down the door at Rolex and go, "You bitches are lazy."
: It's kind of amazing that you can keep time mechanically on a wristwatch with these tiny little gears.
: It's amazing.
: I mean, the whole luxury watch market is fascinating. I'm not that involved in terms — Like I don't buy them. I've bought them as gifts. I don't buy them for myself. But when I look at them online, there's million-dollar watches out there now that have, like, little rotating moons and stars.
: And they live — Like look at this thing. How much is that one, Jamie?
: I don't know. I just picked one.
: These are fucking preposterous, I guess. I like gears. I love them. I love them.
: Yeah. I think that is beautiful.
: But there's some of these people that are just taking it right in the ass. They're buying these watches for like $750,000. Like, "Yeah, that's a Timex, son." Nobody knows. It's not any better than some Casio that you could just buy on — Like, look at that though.
: Well, here's the thing. If you're a person that doesn't just want to know the time, you want craftsmanship, you want some artisan's touch, you want innovation in terms of like a person figuring out how gears and cogs all line up perfectly, to every time it turns over, it's basically a second. I mean, that's just — There's this art to that.
: Yeah, I agree.
: Yeah, it's not just telling time. Yeah, I like this watch a lot, but if it got hit by a rock, I wouldn't be sad.
: It's just a watch. It's a mass-produced thing that runs on some quartz battery. But those things, there's art to that.
: Yeah. No, I agree. It's beautiful.
: Yeah. Love it.
: Yeah. There's something amazing about it. It's-
: Because it represents the human creativity. It's not just electronic innovation. There's something. It's a person's work in that.
: You don't have a watch on.
: I used to have a watch.
: What happened?
: My phone tells the time. So-
: That's a good point. Well, if you lose your phone? Do you — Wait, hold on.
: It's true.
: Let me guess, you are a no case guy.
: That's correct. Living on the edge. Living on the edge without a case.
: Neil deGrasse Tyson. Neil deGrasse Tyson was in here last week. I marveled at his ability to get through life without a case.
: That's right.
: You know, he takes his phone, and he flips it in between his fingers like a soldier would do with his rifle.
: He just rolls that shit in between his fingers.
: It's marvelous.
: He says that's the reason why they do it. He said, "When you look at someone who has a rifle, why would they do that? Why would they flip it around like that?"
: It's like, if it goes to drop, they have it in their hand. They catch it quickly.
: So, that's what he does with his phone. He's just flipping his phone around all the time. I got that in Mexico. I was hoping it holds joints.
: Does it do anything? It tips to open.
: Just a hole?
: It's just a hole.
: You could store things in there.
: Yeah. But like try it. Put a joint in there. Close it. You put like one blunt. One, that seems pretentious. You know, that's the idea behind it. I bought it when I was in Mexico because I figured it would be a good size to hold joints, but it's not.
: So, is that a joint or is it a cigar?
: It's marijuana inside of a tobacco.
: Okay. So, it's like part tobacco, part pot.
: Yeah. You never had that?
: Yeah. I think I tried one once.
: Come on, man. You probably can't because of stockholders, right?
: I mean, it's legal, right?
: Totally legal.
: How does that work? Do people get upset at you if you do certain things? It's just tobacco and marijuana in there. That's all it is. The combination of tobacco and marijuana is wonderful. First turned on to it by Charlie Murphy, and then reignited by Dave Chappelle. There you go.
: Plus whiskey.
: Perfect. It balances it out.
: Alcohol is a drug that's been grandfathered in.
: Well, it's not just a drug. It's a drug that gets a bad rep because you just have a little, it's great.
: Yeah, little sip here and there, and your inhibitions are relaxed, and it shows your true self. And, hopefully, you're more joyous, and friendly, and happy, and everything. The real worry is the people that can't handle it. Like the real worry is about people who can't handle cars that go 0 to 60 in 1.9 seconds or anything.
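As a sanity check on that 0-to-60-in-1.9-seconds figure, the implied average acceleration is well beyond what most drivers have ever felt:

```python
# Average acceleration implied by 0-60 mph in 1.9 seconds.

MPH_TO_MS = 0.44704   # meters per second per mph
G = 9.80665           # standard gravity, m/s^2

v = 60 * MPH_TO_MS    # final speed, ~26.8 m/s
a = v / 1.9           # average acceleration, m/s^2
print(f"{a:.1f} m/s^2, about {a / G:.2f} g")
```

That's roughly 1.4 g sustained for the whole run, which is the point being made: hardware most people genuinely can't handle.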
: Have you ever considered something that — Like, imagine if one day, everyone has a car that's on the same, at least, technological standard as one of your cars, and everyone agrees that the smart thing to do is not just to have bumpers but to perhaps have some sort of a magnetic repellent device, something, some electromagnetic field around the cars that as cars come close to each other, they automatically radically decelerate because of magnets or something.
: Well, I mean, our cars brake automatically.
: Yeah. When they see things?
: But like a physical barrier, like-
: Well, the wheels work pretty well.
: The wheels do.
: Yeah, yeah. They work pretty well. They decelerate at, you know, 1.1 to 1.2 g, that kind of thing.
: Is your concern that one day all your cars will be on the road, and then, there'll still be regular people with regular cars 20-30 years from now that will get in the mix and be the main problem?
: Yeah. I think, it'd be sort of like, you know, there was a time of transition where there were horses and gasoline cars on the road at the same time. That was pretty weird.
: That would be the weirdest.
: Yeah. I mean, horses were tricky. You know, back when Manhattan had like 300,000 horses, then figure out, like, if a horse lives 15 years, you got 20,000 horses dropping dead every day, or every year, I should say. Every year, it's 20,000 horses, if there's 300,000 horses with a 15-year lifespan.
: Back in the Gangs of New York days, that movie.
: It's a lot of dead horses. You needed a horse to move the horse.
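The back-of-the-envelope above checks out: at steady state, 300,000 horses with a 15-year lifespan means about 20,000 deaths a year, or roughly 55 a day:

```python
# Steady-state horse mortality from the figures in the conversation:
# ~300,000 horses in Manhattan, ~15-year lifespan.

HORSES = 300_000
LIFESPAN_YEARS = 15

per_year = HORSES / LIFESPAN_YEARS
per_day = per_year / 365
print(f"{per_year:.0f} horses per year, about {per_day:.0f} per day")
```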
: They'd probably get pretty freaked out if they have to move a dead horse.
: Do you think they know what's going on?
: Do you think it's as hard?
: I mean, it's got to be like pretty weird.
: No, I would imagine.
: Like, in my mind, dragging this dead, you know, horse around, and I'm a horse.
: Do you-
: They might not like it.
: Do you ever stop and think about your role in civilization? Do you ever stop and think about your role in the culture? Because me, as a person, who never met you until today, when I think of you, you know, I've always thought of you as being this weirdo super inventor dude who just somehow or another keeps coming up with new shit, but there's not a lot of you out there. Like everybody else seems to be — I mean, obviously, you make a lot of money, and there's a lot of people that make a lot of money. You like that clock?
: Pretty dope, right?
: This is a great clock.
: You want one? I'll get you one.
: Okay, done.
: I like weird things like this.
: Oh, this is the coolest. It's TGT Promotion. What is this? TGT Studios? TGT Studios.
: Yeah. So, a gentleman who makes all this by hand. Yeah, it's really cool.
: My study is filled with weird devices.
: Well, get ready for another one.
: All right.
: I'm sending it your way.
: You want a werewolf too? I'll hook you up.
: All right. I'll take one.
: Okay. You want a werewolf and one clock coming up. Do you think about your role in the culture? Because me, as a person, who never met you until today, I've always looked at you and like, "Wow." Like, "How does this guy just keep inventing shit?" Like, how do you keep coming up with all these new devices? And do you ever consider how unusual — Like I had a dream once that there was a million Teslas. Instead of like one Tesla, there was a million Teslas.
: Not just the car but Nikola.
: Oh, yeah, sure.
: And that in his day, there was a million people like him who were radically innovative.
: It was a weird dream, man. It was so strange. And I've had it more than once.
: That would result in a very rapid technology innovation. That's for sure.
: It's one of the only dreams of my life I've had more than one time.
: Okay, wow.
: Like where I've woken up, and it's in the same dream. I'm in the same dream. And in this dream, it's 1940s, 1950s, but everyone is severely advanced. There's flying blimps with like LCD screens in the side of them. And everything is bizarre and strange. And it stuck with me for whatever — Obviously, this is just a stupid dream. But for whatever reason, all these years, that stuck with me. Like it takes one man, like Nikola Tesla, to have more than a hundred inventions that were patents, right. I mean, he had some-
: He's pretty great.
: … pretty fucking amazing ideas.
: But there was-
: In his day, there was very few people like him.
: Yeah, that was true.
: What if there was a million? Like what in the experience-
: Things would advance very quickly.
: Right, but there's not a million Elon Musks. There's one motherfucker. Do you think about that or you just try to not?
: I don't think. I don't think you'd necessarily want to be me. That'd be good.
: Well, what's the worst part about you?
: I should. I never thought people would like it that much.
: Well, most people would, but they can't be. So, that's like some superhero type shit. You know, we wouldn't want to be Spiderman. I'd rather just sleep tight in Gotham City and hope he's out there doing his job.
: It's very hard to turn it off.
: Yeah. What's the hardest part?
: It might sound great if it's turned on, but what if it doesn't turn off?
: Now, I showed you the isolation tank, and you've never experienced that before.
: I think that could help you turn it off a little bit just for the night.
: Yeah. Just give you a little bit of sleep, a little bit of perspective. It's magnesium that you get from the water as well that makes you sleep easier because the water has Epsom salts in it. But may be some sort of strategy for sacrificing your — or not sacrificing but enhancing your biological recovery time by figuring out a way whether it's through meditation or some other ways to shut off that thing at night. Like you must have like a constant stream of ideas that's running through your head all the time. You're getting text messages from chicks.
: No. I'm getting text messages from a friend saying, "What the hell are you doing smoking weed?"
: Is that bad for you? It's legal.
: I mean-
: It's government approved.
: It's not — You know, I'm not a regular smoker of weed.
: How often do you smoke it?
: Almost never. I mean, it's-
: How does it feel?
: I don't actually notice any effect.
: Well, there you go. There was a time where, I think it was Ram Dass or someone, gave some Buddhist monk a bunch of acid.
: And he ate it, and it had no effect on him.
: I doubt that.
: I would say that too, but I've never meditated to the level that some of these people have, where they're constantly meditating all day. They don't have any material possessions. And all of their energy is spent trying to achieve a certain mindset. I would like to cynically deny that. I'd like to cynically say, "Hey, they're just full of shit and think the same way I do. They're just hanging out with flip flops on and making weird noises." But maybe not.
: You know, I know a lot of people like weed, and that's fine, but I don't find that it is very good for productivity.
: For you.
: Not for me.
: Yeah. I mean, I would imagine that for someone like you, it's not. For someone like you, it would be more like a cup of coffee, right. You want to have a latte.
: Yeah. It's more like the opposite of a cup of coffee.
: What is that?
: It's like a cup of coffee in reverse.
: Weed is?
: No, I'm saying more, like, what would be beneficial to you. It would be like coffee.
: I like to get things done. I like to be useful. That is one of the hardest things to do is to be useful.
: When you say you like to get things done-
: … like, in terms of like what-
: I should get things done.
: … gives you satisfaction? When you complete a project, when something that you invent comes to fruition, and you see people enjoying it, that feeling.
: Yes, doing something useful for other people that I like doing.
: That's interesting for other people.
: So, that, do you think that that is maybe the way you recognize that you have this unusual position in the culture where you can uniquely influence certain things because of this? I mean, you essentially have a gift, right.
: I mean, you would think it was a curse, but I'm sure it's been fueled by many, many years of discipline and learning. But you, essentially, have a gift and that you have this radical sort of creativity engine when it comes to innovation and technology. It's like you're just you're going at very high RPMs.
: All the time. That doesn't stop.
: What is that like?
: I don't know what would happen if I got into a sensory deprivation tank.
: Let's try it.
: It sounds a little concerning.
: But why?
: It's like running the engine with no resistance. That is-
: Is that what it is though? Maybe it's not.
: Maybe it's fine. I don't know.
: How much-
: I'll try it. I'll try it.
: Have you ever-
: It's fine.
: … experimented with meditation or anything?
: What do you do, or what have you done rather?
: I mean, just sort of sit there, and be quiet, and then repeat some mantra, which acts as a focal point. It does still the mind, but I don't find myself drawn to it frequently.
: Do you think that perhaps productivity is maybe more attractive to you than enlightenment or even the concept of whatever enlightenment means. Like, what are you trying to achieve when you're meditating all the time? With you, it seems like almost like there's a franticness to your creativity that comes out of this burning furnace. And in order for you to like calm that thing down, you might have to throw too much water on it.
: It's like a never-ending explosion.
: Like what is it like? Try to explain it to a dumb person like me. What's going on?
: Never-ending explosion.
: It's just constant ideas just bouncing around.
: So, when everybody leaves, it's just Elon sitting at home brushing his teeth, just a bunch of ideas bouncing around your head.
: Yeah, all the time.
: When did you realize that that's not the case with most people?
: I think, when I was, I don't know, five or six or something. I thought I was insane.
: Why did you think you were insane?
: Because it was clear that other people did not. Their minds weren't exploding with ideas all the time.
: So, they weren't expressing it. They weren't talking about it all day. And you realized by the time you were five or six like, "Oh, they're probably not even getting this thing that I'm getting."
: No. It was just strange. It was like, "Hmm, kind of strange." That was my conclusion, kind of strange.
: But did you feel diminished by it in any way? Like knowing that this is a weird thing that you really probably couldn't commiserate with other people, they wouldn't understand you.
: I hope they wouldn't find out because they might like put me away or something.
: You thought that?
: For a second, yes.
: When you were little?
: Yeah. They put people away. What if they put me away?
: Like when you were little, you thought this?
: Wow. Well, you thought, "This is so radically different than the people that are around me if they find out I got this stream coming in."
: But, you know, I was only like five or six probably.
: Do you think this is like — I mean, there's outliers biologically. I mean, there's people that are 7 foot 9, there's people that have giant hands, there's people that have eyes with 20/15 vision. There's always the outliers. Do you feel like you caught this, like you've got some — you're like on some weird innovation-creativity sort of wave that's very unusual? Like you tapped into — I mean, just think of the various things you've accomplished in a very short amount of time, and you're constantly doing this. That's a weird — You're a weird person, right.
: Right, I agree.
: Yeah. Like what if there's a million Elon Musks?
: Well, that would be very, very weird.
: Yeah, it would be pretty weird. I agree.
: Real weird.
: What if there were a million Joe Rogans?
: There probably is. There's probably two million. I mean, I think that's the case with a lot of folks.
: Yeah. I mean, but, like, you know, my goal is to try to do useful things, try to maximize the probability that the future's good, make the future exciting, something you look forward to, you know. You know, with Tesla, I want to try to make things that people love. Like, how many things can you buy that you really love, that really give you joy? So rare, so rare. I wish there were more things. That's what we try to do. Just make things that somebody loves.
: When you-
: That's so difficult.
: When you think about things that someone loves, like, do you specifically think about like what things would improve people's experience, like what would change the way people interface with life that would make them more relaxed or more happy? You really think, like, when you're thinking about things like that, is that like one of your considerations? Like what could I do that would help people-
: … that maybe they wouldn't be able to figure out?
: Yeah. Like what are the set of things that can be done to make the future better? Like, you know, like so, I think, a future where we are a space-faring civilization and out there among the stars. This is very exciting. This makes me look forward to a future. This makes me want that future. You know, the things, there need to be things that make you look forward to waking up in the morning.
: You wake up in the morning, you look forward to the day, you look forward to the future. And a future where we are a space-faring civilization and out there among the stars, I think, that's very exciting. That is a thing we want; whereas, if we knew we would not be a space-faring civilization but forever confined to Earth, this would not be a good future. That would be very sad, I think.
: It would be so sad in terms-
: Like I don't want a sad future.
: … just the finite lifespan of the Earth itself-
: … and the solar system itself. But even though it's possibly — You know, I mean, how long do they feel like the sun and the solar system is going to exist? How many hundreds of millions of years?
: Well, it's probably, if you're saying when does the sun boil the oceans-
: About 500 million years.
: So, is it sad that we never leave because in 500 million years, that happens? Is that what you're saying?
: No. I just think like if there are two futures, and one future is we're out there among the stars, and the things we read about and see in science fiction movies, the good ones, are true, and we have these starships, and we're going to see what other planets are like, and we're a multi-planet species, and the scope and scale of consciousness is expanded across many civilizations, and many planets, and many star systems, this is a great future. This is a wonderful thing to me. And that's what we should strive for.
: But that's biological travel. That's cells traveling physically to another location.
: Do you think that's definitely where we're going?
: Yeah, I don't think so either. I used to think so. And now, I'm thinking it's less likely than ever. Like almost every day, less likely.
: We can definitely go to the moon and Mars.
: Yeah. Do you think we will colonize?
: I think we will go to the asteroid belt. And we can go to the moons of Jupiter, Saturn, even get to Pluto.
: That'd be the craziest place ever if we colonize Mars, and terraform it, and turn it into like a big Jamaica. Just oceans and-
: I think, we should. I think that would be great.
: I mean, imagine that there is-
: That would be great. Amazing.
: It's possible, right?
: We can turn the whole thing into Cancún.
: I mean, over time.
: It wouldn't be easy but yes.
: You could just warm — You could warm it up.
: Yeah, you can warm it up. You could add air. You get some water there. I mean, over time, hundreds of millions of years or whatever it takes.
: We'll be a multi-planet species.
: Yeah, that would be amazing.
: We're a multi-planet species.
: If we could-
: That's what we want to be-
: … legitimately like air-condition-
: … Saturn.
: I'm pro-human.
: Me too. Yeah, me too.
: I love humanity. I think it's great.
: We're glad as a robot that you love humans because we love you too, and we don't want you to kill us and eat us. And-
: I mean, you know, strangely, I think a lot of people don't like humanity and see it as a blight, but I do not.
: Well, I think one of those — I think, part of that is just they've been — you know, they've been struggling. When people struggle, they associate their struggle with other people. They never internalize their problems. They look to other people as holding them back, and people suck, and fuck people, and it's just — You know, it's a never ending cycle. But not always. Again, most people are really good. Most people, the vast majority.
: This may sound corny.
: It does sound corny.
: But love is the answer.
: It is the answer.
: Yeah, it is. It sounds corny because we're all scared. You know, we're all scared of trying to love people, being rejected, or someone taking advantage of you because you're trying to be loving.
: What if we all could just relax and love each other?
: It wouldn't hurt to have more love in the world.
: It definitely wouldn't hurt.
: It would be great.
: Yeah, we should do that.
: Yeah, I agree, man.
: Like really.
: How are you going to fix that? Do you have a love machine you're working on?
: No, but probably spend more time with your friends and less time on social media.
: Now, deleting social media from your applications, from your phones, will that give you a 10% boost to happiness? What do you think the percentage is?
: I think probably something like that, yeah.
: Yeah, a good 10%.
: Yeah, I mean, the only thing I've kept is Twitter because I kind of need some means of getting a message out, you know.
: Well, that's about it. So far so good.
: Well, what's interesting with you, you actually occasionally engage with people on Twitter.
: Yeah, that's-
: What percentage of that is a good idea?
: Good question.
: Probably 10%, right? It's hard.
: It's mostly — I think it's, on balance, more good than bad, but there's definitely some bad. So-
: Do you ever-
: Hopefully, the good outweighs the bad.
: Do you ever think about how odd it is, the weird feeling that you get when someone says something shitty to you on Twitter, and you read it? That weird feeling. This weird little negative jolt. It's like a subjective negative jolt of energy that you don't really need to absorb, but you do anyway. Like, "I want to fuck this guy. Fuck him."
: I mean, there's a lot of negativity on Twitter.
: It is, but it's weird in its form. Like the way, if you ingest it as if you're — like you try to be like a little scientist as you're ingesting it, you're like, "How weird is this? I'm getting upset at some strange person saying something mean to me. It's not even accurate."
: I mean, of the vast number of negative comments, the vast majority I just ignore.
: Every now and again, you get drawn in. Something not good.
: It's not good.
: You make mistakes.
: Yes, you can make mistakes.
: We can make some mistakes.
: We're all human. We can make mistakes. Yeah, it's hard. And people love it when you say something, and you take it back, and they're like, "Fuck you. We saved it forever. I'll fucking screenshot that shit, bitch. You had that thought. You had that thought." I'm like, "Well, I deleted it." "Not good enough. You had the thought. I'm better than you. I never had that thought. You had that thought, you piece of shit. Look, I saved it. I put it on my blog. Bad thought."
: Yeah. I'm not sure why people think that anyone would think that deleting a tweet makes them go away. It's like, "Hello, been on the internet for a while."
: Yeah. Well it's even like-
: Anything is forever.
: And the thing is they don't want you to be able to delete it because the problem is if you don't delete it, and you don't believe it anymore, it's really hard to say, "Hey, that thing above, I don't really believe that anymore. I changed the way I view things."
: Because people would go, "Well, fuck you. I have that over there. I'm going to just take that. I'm not going to pay attention to that shit you wrote underneath it."
: It's on your permanent record.
: Yeah. It's forever like a tattoo.
: Like high school, "We'll put this on your permanent record."
: Yeah. It's like a tattoo. You keep it.
: Yeah. Well, it's this thing where there's a lack of compassion. It's a lack of compassion issue. People are just like intentionally shitty to each other all the time online, and trying to catch-
: They're more trying to catch people doing something that's arrestable, like a cop trying to, like, get, you know, arrests on his record. It's like they're trying to catch you for something, more than they're logically looking at it thinking it's a bad thing that you've done, or that it's an idea they don't agree with so much, they needed to insult you. They're trying to catch you.
: Yeah, yeah. I mean, it's way easier to be mean on social media than it is to be mean in person.
: Way easier.
: It's weird. It's not a normal way of human interacting. It's cheating.
: You're not supposed to be able to interact so easily when the person is not looking at you.
: You would never do that. You wouldn't be so mean looking somebody in the eyes. If you did, you'd feel like shit.
: Most people.
: Yeah, unless you're a sociopath, you'd feel terrible.
: Elon Musk, this has been a pleasure.
: Yeah, likewise.
: It really has been.
: It's been an honor. Thank you for having me.
: Thanks for doing this because I know you don't do a lot of long form stuff like this. I hope I didn't weird you out, and I hope you don't get mad that you smoked weed.
: I mean-
: It's not bad. It's legal. We're in California. This is just as legal as this whiskey we've been drinking.
: This is all good, right?
: Cheers. Thank you. Is there any message you would like to put out other than love is the answer, because I think you really nailed it with that.
: No. I think, you know, I think people should be nicer to each other, and give more credit to others, and don't assume that they're mean until you know they're actually mean. You know, just, it's easy to demonize people. You're usually wrong about it. People are nicer than you think. Give people more credit.
: I couldn't agree more. And I want to thank you not just for all the crazy innovations you've come up with and your constant flow of ideas but that you choose to spread that idea, which is very vulnerable, but it's very honest, and it resonates with me.
: It's true.
: And I believe it.
: It's true.
: I believe it's true too. So, thank you.
: You're welcome.
: All you assholes out there, be nice. Be nice, bitch. All right. Thank you, everybody. Thank you, Elon.
: All right, thank you.
: Good night, everybody.