On this episode of Recode Decode, hosted by Kara Swisher, Y Combinator Partner Daniel Gross talks with Kara and The Verge's Casey Newton about why he returned to the startup incubator that gave him his start in Silicon Valley. He says YC is vitally important for bringing new outsider voices into the highly networked tech industry.
Gross co-founded the personal search engine Cue, which Apple bought in 2013 for a reported $35 million and integrated into iOS, and is currently interested in applications of artificial intelligence.
You can read some of the highlights here, or listen to the entire interview in the audio player below. We’ve also provided a lightly edited complete transcript of their conversation.
If you like this, be sure to subscribe to Recode Decode on Apple Podcasts, Spotify, Pocket Casts, Overcast or wherever you listen to podcasts.
Kara Swisher: Recode Radio presents Recode Decode, coming to you from the Vox Media Podcast Network. Hi, I’m Kara Swisher, executive editor of Recode. You may know me as director of a zombie movie set in Silicon Valley called Why Zombinator, but in my spare time I talk tech, and you’re listening to Recode Decode, a podcast about tech and media’s key players, big ideas and how they’re changing the world we live in. You can find more episodes of Recode Decode on Apple Podcasts, Spotify, Google Play Music or wherever you listen to your podcasts, or just visit Recode.net/podcasts for more.
Today I’m in San Francisco with Casey Newton, the Silicon Valley editor of The Verge and the host of an upcoming podcast, Converge. We have Casey joining me for several special episodes of Recode Decode this month because this Converge is coming, right?
Casey Newton: Yup.
KS: Can you go into it just a little bit?
CN: Yeah, absolutely. If you like Kara’s podcast, I think you might like Converge as well. You know, The Verge has never done an interview show. The Verge, we’re fascinated by Silicon Valley, we’re fascinated by the people here and the products they make. Where did they come from? How do they see the world? Some of the conversations that I have in my reporting life for me are more fun to do than the articles I wind up writing, and so I thought if I could bring those conversations to more people, we could have a really fun show. So it’s coming together quickly, and we can’t wait to share it with you.
KS: All right. Are you basically trying to steal my thunder?
CN: Here’s how I’ve always seen it:
KS: Patel put you up to it.
CN: Yeah. There’s always a healthy rivalry between us Vox Media brands, and I love your podcast, and when I listen to it you seem the most fascinated by money and power. I think that’s what gets you going: money and power. And I love listening to money and power.
My perspective and what I’m interested in day to day is a little bit different. I truly am fascinated by how products get made and by how some of these visionaries see the world. We get to use their products, but we often don’t know where they came from.
CN: We often don’t know about their stories and about how they formed their worldview, so I want to dig into that, I want to have some fun. You know, I do improv comedy on the side, I like to tell jokes, so I just like a little bit of fun back and forth, let’s have some laughs.
KS: Are you saying I’m no fun?
CN: I think you’re a lot of fun. Actually, when I can make you laugh, that is one of the highlights of my day, because you talk so fast it’s like if I can stop you with a one-liner, that’s a great day for me.
KS: Absolutely. And all joking aside, Casey and I have been talking about having to do this podcast for a while.
CN: Super supportive.
KS: And it’s here and the ideas behind it ... Because a lot of the things that are really interesting are the people you don’t know in Silicon Valley who create all kinds of things.
KS: So Casey’s going to be here for four special episodes, which are appearing on Wednesdays — we still have regular Recode Decodes on Mondays — and so, well, four really interesting people that he is selecting, he’s hand selecting ...
CN: Hand curated.
KS: Hand curated, artisanal picks, the kind of people he’s going to be having on Converge, which is coming soon. The first person we’re having here is Daniel Gross, in the red chair. How you doing, Daniel? He’s been a partner at Y Combinator for almost a year, and before that he co-founded the search engine Cue, which was bought by Apple in 2013. Before he joined Y Combinator, he was director of machine learning at Apple, kind of a big deal there.
Daniel, welcome to Recode Decode. Casey, I’ll let you start.
CN: Yeah, Daniel, thanks for coming by.
Daniel Gross: Thank you so much for having me.
CN: It’s funny, when I thought about a fun guest to have on this very special episode of Recode Decode, I thought of a conversation that you and I had in maybe 2013. I don’t know if you remember this at all, we met at the 21st Amendment Brewery for lunch.
One of San Francisco’s finest.
CN: Yeah, and you definitely could not drink at that point.
CN: And also it was lunch, so why would we do that?
Right. When in New York.
KS: Why Casey, why?
CN: Well, I work better when I’m sober most of the time. But you said this thing to me that stuck with me. Okay, so how old were you in 2013? You were?
I was actually 22.
CN: Oh, were you that? Okay.
Yeah, I think I was. Well, I was turning 22.
CN: Right on. Well, you said this thing to me that stuck with me, which was that — because I was sort of asking you what made you decide to come to Silicon Valley, because you grew up in a faraway land.
That’s right, I’m from Israel originally.
CN: You sort of picked up stakes and you came here to do your thing, and when I asked you why, you said, “Look, I have a very small chance to change the world, but I do think I have a chance, so why not take the chance?” And it just stuck with me, because you encapsulated this view I hear from so many of these entrepreneurs who come here from all over the world. I wrote a story about your arrival at the time, and then you went on to do all these crazy things, so when I thought about starting up this podcast and guest hosting with Kara, I thought, “Let’s check in with Daniel and see what he’s been up to in all that time.”
Lord, yeah, I have been failing at my goal of changing the world, but I keep on trying. But it’s true, and that was one of the reasons why I wanted to go back to my roots and join Y Combinator, kind of to do unto others what YC did for me. So, to find promising, interesting people in random corners of the world and to elevate them and to bring them into the fold of Silicon Valley, with the hope that they’ll be more successful than I was.
KS: Why don’t you give your background a little bit, just for people who don’t know you.
Of course. So I’m originally from Jerusalem, Israel, born and raised, and when I was 18, I applied to Y Combinator; I was on the precipice of starting to serve in the Israeli Army. I got accepted into the interview, and so they offered to pay half my ticket for the flight from Israel to San Francisco.
CN: Which is so cheap knowing how much money they have now, honestly.
KS: Exactly. Half the ticket.
Back then, yeah. Back then it was basically ...
KS: So it was like to Omaha?
KS: New York.
I think it was a $2,000 ticket, they paid I think $600. Scrounged up money for the rest of it, flew out, thought I would totally fail the interview, but boy, at least I got a discounted ticket to San Francisco Bay Area, see stuff. Came out, didn’t totally fail the interview, they accepted me, although they wanted me to work on a completely different idea from the one I applied with, which I was willing to do.
KS: Do they just get to impose these things on you?
Well, yeah, at the time, yeah, they have the leverage.
CN: Because at the time, you were actually the youngest founder ever accepted into Y Combinator.
Yeah, and that record has since been broken by multiple people, but at the time, it was revolutionary.
KS: Yeah, that 11-year-old was great, but go ahead.
CN: Yeah, 11 is now the median age of a Y Combinator founder.
Yeah, you may soon find us in the NICU.
KS: You’re washed up, Gross. Go ahead.
So I got accepted into the program, and because I started to work on a brand new idea right when the Y Combinator batch started, I kept on iterating through it. At some point I launched it midway through the YC program, didn’t really get a great reaction, and so I was that person, throughout the three-month batch, that was constantly changing their idea. Constantly, basically, building something new, trying to launch it, see what people thought. Right before Demo Day, the moment when the batch ...
KS: The beauty contest.
Yeah, the beauty contest. Thank you for summarizing that so eloquently. The way your mind works. Anyway, so I had this idea unrelated to Cue, and it was making quite a bit of money from Amazon affiliate revenue. We don’t have to get into the details, but Amazon effectively shut it down about 48 hours before Demo Day.
KS: Come on.
And so I had this tremendous graph of revenue that crashed, which sucks. I went to Paul Graham’s house and I said, “Look, I don’t know what to do now,” and he said, “Well you’ve got three options. You can get up onstage and not really tell everyone that your revenue is about to collapse, you can defer Demo Day and do it later, or you can come up with something brand new in 48 hours.” And as he would whisper to me before I went up onstage, “The goal is to not let anyone know this is 48 hours old.” So I did.
KS: This is like an episode of “Silicon Valley,” but go ahead.
Yeah, maybe. I’ve literally never seen an episode, so you would have to tell me.
KS: Hideous that it is.
Thank you so much, I guess? Anyway, I basically built Cue, which at the time had a terrible name, it was called Greplin, which means a lot to an engineer, but to everyone else in the world, it just sounds like “gremlin,” so I ended up changing the name later on. I effectively built a prototype in those 48 hours, got up onstage, demoed it, told everyone it was hopefully going to be the next Google, and raised a bunch of money for it, ended up ...
KS: Wait, this is a 48-hour idea you’ve got money for that you just ...
I didn’t get money at that time.
KS: Explain what it is. Explain what Greplin is.
Sure, sure, sure, so what I’d built — and this ended up becoming a real company, real money.
CN: And it was a cool product, I will say. I liked it.
It had its moment in the sun. So the idea was pretty simple. The idea was to build a search engine, but instead of searching the internet, like the public web, we’d search all of your private, personal content, so Slack, Gmail, Dropbox, Evernote, Salesforce, Basecamp, whatever you used, in one place. So if you’re trying to figure out where to get to that dinner party, the location of that event could literally be in three different places: It’s a Facebook event, a Google Calendar event, could be an email if it’s an Evite thing. And so the idea was, you don’t have to worry about this anymore, especially on mobile, where paging between different apps used to be hard. Now it’s slightly better. The idea was we were going to kind of unify that experience for you and let you search through all of your information in one place.
Over time, we tried to make the app even smarter, and we tried to have it predict what you would want to see, such that you didn’t even have to think to search. We were going to kind of be this assistant walking behind you with a clipboard that’s kind of dishing things out to you, very similar to Google Now. We built that out, ended up raising a Series A and Series B from Sequoia, meeting a co-founder who to this very day is a good friend of mine, and built up a team, and we ended up building a really good product and a really terrible business.
In our world, a user wasn’t a row in the database, it was terabytes and terabytes of information we had to mine and process, and so the only way to really do that and satisfy the unit economics was to build an enterprise product. We kind of realized at this point, three years into it, that we weren’t really an enterprise culture. You can’t really take a consumer company culture and just say at an all-hands, “Well, we’re going to start selling to sales folks,” whereas ...
KS: It’s called Dropbox.
CN: Yeah, I was going to say.
Well, you know, kudos to them for figuring it out, because it’s hard. It’s a hard cultural shift to move from a world where the engineer is the desired card you want in your deck to one where the salesperson is. We didn’t really have to think that through too much because Apple approached us and effectively offered to integrate what we were doing into iOS and OS X. So that’s what I ended up doing at Apple, trying to improve Spotlight and Safari and running a bunch of machine learning teams. For example, today when your iPhone tells you that it’s time to leave for an event based on your current location, or when you pull up Spotlight and it tries to predict what app you want to run, that’s all my old team and some of our old code being integrated.
CN: I have a question. So when you were building this, did it feel like AI to you at the time? Or at what point did you start thinking, “I’m working on AI.”
Yeah, we were pre-AI hype.
CN: Way before.
Way before, and by that I mean like two years.
CN: Neolithic era.
It did not feel like AI at all, it felt like a predictive search. That’s what I kept on calling it in my head, and of course today if I was starting this company, I would rebrand it as AI search.
CN: And you would have raised 10 times as much money.
Ten times as much money, and burned through 10 times as much money, mostly with the same number of people, just increased San Francisco rent prices. Yeah, it didn’t feel like AI, it definitely felt like search, and I think that raises an interesting point, which is that in many ways, a lot of this hype around AI is, I think, effectively other technologies we’ve decided to now repackage as AI. So search companies will now be called AI, companies that are really just building simple logistic regression over ... building probably useful stuff, but simple logistic regression ...
CN: It’s stuff that Kara and I do just for fun.
Yeah, just on a Saturday night, yeah, exactly. That will be called AI as well, and I’m not sure that’s a bad thing. I actually think it’s a good thing, and I think one of the byproducts of all of this buzz around AI is that projects that honestly don’t involve breakthrough technologies but just should get funded will get funded now because we’re all excited about it, whereas they wouldn’t have gotten funded five, 10 years ago.
KS: Give an example about this predictive search. What fits into that basket?
Another example is there are a bunch of companies in a bunch of different industries that are selling software that effectively sorts lists better. So one example is you’re a large company like Starbucks, and you get thousands of people applying to you for jobs. Boy, it’d be great if you had a more efficient way to wade through all those thousands of people that are applying, based on fairly simple machine learning. Previously, when that pitch came into the room of a VC, I think it may have sounded kind of boring. Well, it’s a slog, resumes, whatever. But now, they have the opportunity to kind of use this hype around AI, and I’m not sure that the company is using any of Levandowski’s technology from Google, it’s not hardcore AI. But they’re still able to get funded and they may very well build a better product.
I think one of the right frameworks in terms of how to think about AI today is to really think about it a little bit at the end of the day as like a database. I think we’re all looking for an AI company, but it’s more like companies using AI, like companies would be using a database to do various things, in one particular case sort through resumes better.
KS: So it’s basically just tricking venture capitalists?
Well, I think humans ... At the end of the day, what Elon Musk is doing is I think what humans respond to best, which is you create a lot of hype and enthusiasm around something and then occasionally, you actually create the future just because of the hype and enthusiasm about it, so I’m not quite sure. We could talk about actual technological breakthroughs happening if we want later on, but I guess my point is even in the areas where there’s no tech breakthrough, it still is probably a net good thing that that stuff is getting funded and those approaches are getting tried.
CN: Right. I want to come back to AI real quick, but first, I do want to ask you about this experience that you had as an entrepreneur, because it seems totally crazy. Do you reflect at all on this thing that you did, where you leave your home at 18 and come to a new country and raise money and place all these immense expectations on yourself and then have to grow this thing, and then you get three years to do it and you’re like, “I don’t know what the business is”? For people who are listening who want to start their own companies, what are the day-to-day emotions of that roller coaster ride?
The day-to-day emotional roller coaster ride is intense, but the thing you need to realize — if you’re listening to this and you’re outside of Silicon Valley and it all looks hard and unapproachable — is that it’s very gradual, and everyone starts out with very, very humble beginnings. One of the most important things for me coming into Silicon Valley was meeting some of these folks that I thought were titans of the world and thinking to myself, “You know, you’re not that great.” And then realizing that I could do it too.
KS: Feet of clay.
Yeah, and so you need to realize that everything is very gradual and everyone who seems formidable, at the end of the day started out putting on their pants in the same way you do in the morning.
CN: By having their servant bring them to him.
KS: We’ve just got a few minutes in this section, but when you think about that, you moved rather fast compared to a lot of other entrepreneurs, and we’re going to get into the next section about what you’re doing at Y Combinator and how you got there, but when you think about that, what Casey was talking about, what was the key aspect of getting to that realization for you?
And what realization are ...
KS: Meaning that everyone had, everyone was a fraud, I think that’s what you’re saying.
No, I think you were framing it, that’s your job.
KS: Though it is an important thing that people realize that. I knew Jeff Bezos when he had five people, so I have a very different perspective, and now he’s the smartest person on earth apparently.
KS: What does that give you when you think like that?
It definitely gives me ... I think there’s this Steve Jobs quote where he says, “At some point, you realize, you walk around the world and you realize everything was built by people no smarter than you.” And that is probably not true. That is to say, I’m pretty sure the bridge behind us was built by architects smarter than me, but I kind of suppress that, and I kind of tell myself that I can do whatever I want, because you do meet a lot of these people when they’re small and you do realize they’re not that great.
And I think it’s actually a sad fact that people want to fall in love with heroes, and so as a result, in Silicon Valley, we have this process where we build everyone up, everyone’s perfect, no one’s having a bad day, you know, Jeff Bezos is just crushing it all the time on top of windmills breaking champagne bottles.
But he started out small, and I’m sure he has bad days too, and it’s really important for me to convey this to entrepreneurs, or people that think about it — maybe entrepreneurs around the world — that it’s not as hard as it seems and everyone starts out like you. To the extreme at which I often wonder: If I had never gotten into Y Combinator, part of me would like to believe that I would have started a company anyway and I would have been formidable anyway. Part of me wonders; I would have probably been in the army, an orthodox Jew married with six or seven kids.
Yeah. Yeah. Well ...
KS: When are you getting started on that one, I wasn’t aware of that. Anyway, we’re here with Daniel Gross and we’re here also with Casey Newton, the Silicon Valley editor of The Verge, and he has a new podcast called Converge that is coming and he’s practicing here on Recode Decode.
CN: Getting my reps in.
KS: In special episodes. You’re doing very well so far, by the way.
CN: Thank you.
KS: I’m giving you an A right now.
CN: Positive feedback.
KS: We’ve been talking to ... Negative comes next. We’re thrilled to have Daniel Gross here in the red chair, he’s a new partner at Y Combinator, relatively new, and he co-founded the search engine Cue, which was bought by Apple for a gigantically ridiculous sum of money in 2013, and we’re talking about being an entrepreneur and other things. And when we get back, we’ll talk about Y Combinator and why he moved there and left Apple.
We’re back with Daniel Gross, he’s been a partner at Y Combinator for almost a year, and before that he co-founded a search engine, Cue, which was bought by Apple in 2013. We were just talking about how he got to where he got, including calling everything he does AI so that he gets funding. I’m teasing you Daniel, relax. We’re here also with Casey Newton, who is the Silicon Valley editor of The Verge, and he has a new podcast coming on the Vox Media podcast network called Converge, and so he’s here practicing, getting some exercise in.
Daniel, let’s keep talking. How did you get to Y Combinator? You were at Apple, I mean, that’s sort of the center of the universe.
Center of the universe according to Apple. I was at Apple ...
KS: You’re a sassy one.
Yeah. I’m just learning from you.
KS: (in Hebrew) That’s not good.
CN: You set a good example.
KS: (in Hebrew) Doesn’t matter to me.
Wow. At what point did you leave the tribe?
KS: I’m just saying, I’m not Jewish, I just know a lot of Hebrew.
KS: I know, see?
CN: We’re going to offer an alternate audio track to people who don’t speak Hebrew, by the way.
Anyway, yes, so answering your question, how did I leave Apple? I loved Apple, I still love Apple. But at some point, my responsibilities kind of reached the point where I felt like I’d hit the curve of diminishing returns in terms of what I was learning, and I had always wanted to work at Y Combinator. I feel like I owe so much of my career to them. So the opportunity came about to join and be a partner there, and effectively try to do unto others what they did to me, and so I made the leap.
KS: Talk about diminishing returns, because that’s interesting. What does that mean? Here you are, at really the most powerful, richest, certainly richest company on the planet with so much reach and so much influence.
Yeah, so Apple is an amazing company. My role in particular involved a lot of cross-functional management, which to those that are new to the game here, that basically means managing people that don’t directly report to you, which means convincing a lot of people to do stuff that they don’t want to do, where you have no authority. Over time, I was being given a lot of responsibility and not the authority to do various things throughout the company, and so that increasingly felt like a hill I just couldn’t summit. I just felt like I wasn’t learning as much as I could, so I, you know ...
KS: What were you supposed to do?
My charter was — and the team’s still at Apple doing that — trying to make every single Apple product you use, whether it’s your tablet, your Mac or your phone, your watch, smarter. So doing everything from automatically keeping your address book up to date by scanning through your email and figuring out on your device without sending any data to Apple servers when someone sends you their new phone number or sends you an event. All the way from that to trying to predict an app you’re going to search for next.
Our charter is, or was, incredibly broad. The main problem is you end up having ... You build the initial technology set, and then you have to integrate it with various other teams, and those teams have their own set of responsibilities and goals, and so it ends up becoming this constant push and pull of, “Daniel and the machine learning guys suddenly need time from my team, I don’t want to do that, I just want to make the address book pretty,” or something like that. That ended up being slightly frustrating.
KS: That’s a good way to describe the problems of big companies.
CN: So you head to Y Combinator where you had this experience, they’ve sort of given you your start. For people who are maybe less familiar with it, talk a little bit about what Y Combinator is and why it’s a special enough place for you to spend some time there.
The way Y Combinator works is, twice a year ... We fund companies in batches, and our recent batch was 150 companies, give or take, and so we do this once in the winter and once in the summer. The way it works is you go online, you fill out an application, you tell us what you’re building. You record a short video of yourself as well, and if that works out, we invite you to interview, and if that works out, we end up funding your company and giving you $120,000.
You then spend three months going through the program, where you’re hopefully taught to avoid the 10,000 very common mistakes people make when starting companies. It turns out that regardless of what you’re building, whether you’re working on gene editing or AI or the merger of both, there are very common, repeated mistakes, so we try to help you avoid them. And then the program culminates in this day called Demo Day, where you get to go up onstage — Kara calls it a beauty pageant — and you talk about ...
KS: It’s a lot of big investors, big names, famous people.
I happen to think the best investors in the world show up, and you kind of pitch what you’re doing, and many companies are raising millions of dollars out of Demo Day.
KS: All right, I’ll call it “The Voice” then, whatever you want.
Sure, yeah, that seems more ...
KS: But it is ... You have to impress them with your presentation.
You have to impress them with your presentation, you have to impress them with what you’re building, and then you end up raising money out of Demo Day.
The relationship with Y Combinator continues as your company continues to grow. A couple of years ago we raised a fund called the YC Continuity Fund, where we try to continue to invest alongside you in future rounds, as well as try to help people avoid the whole other set of mistakes that you commonly make, not when you have five people on your team but when you have 50 or so. The idea is to really help find and support the next generation, the next 10,000 entrepreneurs or so.
KS: It’s global, right?
It’s global. We ask that everyone, at least for the duration of the program, come to Mountain View, where we host it, but we’ve accepted people from many, many, many different countries. The reasons to go through that are, first, if you’re not from Silicon Valley, coming here and not knowing anyone is quite challenging. It’s a very network-driven kind of economy; you need to know the right people in order to gain access to various things. I think Y Combinator is probably the finest vehicle that we have in Silicon Valley to bring outsiders in.
Second is, I do believe the curriculum is genuinely valuable, it was valuable for me. Weirdly, it was valuable for me in this non-intuitive way where it was emotionally very valuable. I was kind of alone throughout some of the program building stuff and it was really useful meeting the various partners — not in the sense that the strategy that I was given was that useful, but in the sense that I walked in feeling dejected and broken and walked out feeling like a million bucks. This will make more sense to someone who’s gone through the program. It’s a little bit like describing a marathon to someone.
Lastly, the network, Y Combinator’s network, is quite useful, both in the sense that the active founders in YC are occasionally running multi-billion dollar companies like Dropbox or Stripe, Coinbase or Airbnb most notably, but also there’s this whole cadre of folks whose companies were acquired — kind of like I was at Apple — who are now executives at a large company. There are many reasons why I think Y Combinator is useful, but I think those are probably the major ones.
CN: Yeah, it’s sort of become Silicon Valley’s Harvard. Most young entrepreneurs I talk to at least wanted to go through YC at some point for all of the reasons that you mentioned. So you go back there to work on AI stuff, do you sprinkle AI fairy dust on all the entrepreneurs now or what do you do?
I wish I had that fairy dust; I’d be sprinkling it away if I had some. Yeah, so I’m a partner at YC, and partners don’t have an exclusive focus; the specific companies I work with span the entire gamut. Like I mentioned earlier, a lot of the feedback that we give isn’t specific to the industry anyway; we’re really relying on the founder for that. However, I have a slight lean towards trying to help fund the next thousand great AI companies, with the premise that there are some breakthrough technologies now that are going to make that possible today where it wasn’t possible, say, five, 10 years ago.
In particular, the one thing we did when I got to YC is we launched this thing called YCAI, which is a YC vertical dedicated to AI companies. Companies that go through the vertical get the same YC experience that everyone else gets, but they get a few domain-specific perks. All of these perks are basically modeled after the common infrastructure I saw at Apple that actually made it hard for startups to compete.
Starting a real AI company today is much easier within Google or Apple than it is as a startup, for three reasons primarily, I think. The first is you lack density of talent. There are a lot of things that are on the fringe between research and commodity, where you actually need researchers to help you build something.
The second is compute infrastructure. If you’re starting a company to provide automatic speech recognition as a service, you’d need to spend tens of thousands of dollars training models, so there’s this upfront cost that suddenly reminds us a lot of the startups of old, when you had to go and rack servers.
Then the third thing that you have if you’re a large company that you don’t have if you’re a startup is data sets. You have access to proprietary data sets, which today — this may change over time, but today — are kind of the currency for building anything smart.
So what we try to do at YCAI is provide those three to startups. In terms of access to talent, they can book time with OpenAI scientists, kind of as if they were at Google and could book time with someone who works at Google Brain. In terms of access to compute infrastructure, we give these folks a lot: at this point, north of a quarter of a million dollars in training time on the various cloud services. Plus, we have a company we partner with to literally rack GPUs in San Jose, in case anyone needs the latest and greatest.
There’s, by the way, a small sidebar: There’s this fascinating shortage of high-end GPUs on services like Amazon, and so there’s this race to kind of get the GPUs now.
CN: Like the physical hardware?
Like the physical. If you want like 1,000 of the best GPUs on Amazon ...
CN: And I do.
... and mostly for gaming.
CN: For gaming and my Chrome bat ... well, Chrome just destroys my laptop.
Oh this will help you, yeah.
CN: Of course.
CN: It’s hard to get ahold of them. Kara’s like, “Oh, use Safari, it’s great.”
The third one is actually the hardest one to solve, and I think one of the most important ones, which is: How can we give our startups common data sets that don’t exist today in the world, so that they can train on something that works?
KS: This was a complaint that Elon Musk actually made at Code in his interview, that Google and Facebook had too much of the power in AI in terms of the programmers, the ability to overwhelm any other startup that came along, and he was talking about OpenAI. Before we get to that — because I’d love to know where you think the real AI is, because that suggests that only the big companies are going to dominate the next era — but before you get off of YC, you talked about the positive elements of it. Let’s talk about some of the ones that are a lot tougher, like diversity. You look literally like the person I think would be a partner at YC, you know, a nice white guy with the geeky kind of thing and a nice sweatshirt. You’re sitting here, no insult to you, but what are the challenges you looked at when you came there? What do you think are the things they need to improve on?
Yeah, first of all ...
KS: Because if they’re Harvard and they’re letting in a lot of the same people, we’re going to get the same result.
Yeah, the CEO happens to be black.
KS: No, I get that.
He’s not here today because he’s probably not as wordy as me, but I do think we’re doing better than most. Second is we do a lot.
KS: There was a big complaint at YC for a long time.
Yeah, and I think Michael has done a very good job of trying to address it. Obviously we’ll continue to do more, but one thing we did in the previous ... Well, in preparation, actually for the winter batch is we went out of our way, every single partner went out of our way, mostly on Michael’s direction, so he gets credit for all of this, on trying to recruit underrepresented founders and minorities. I think we can move the needle more than most organizations.
KS: Absolutely, that’s why.
Yeah, for two reasons. One is we operate at a fairly large volume, and the second is because, as we mentioned earlier, we’re really a system to bring outsiders in. And so we went out of our way to try to recruit underrepresented founders in the various cities that we visit. In between YC batches once a year, we actually send the partners on kind of a world tour, and we tried to instrument in our software the proportion of people that partners are bringing in who are underrepresented. That’s, for us, kind of an important feedback loop: There’s a little leaderboard of which partner is bringing in the most underrepresented founders, and we’re kind of comparing them. We feel like we need to work harder toward that goal.
We are kind of both instrumenting software and procedure to try to recruit more underrepresented founders.
KS: What do you think the problem is? I mean, obviously, you don’t have to get into these topics now around sexual harassment, but what do you think the issue has been?
I think the issue is at an earlier ... The sexual harassment stuff is much later in the game. I’m more concerned about how do we get more underrepresented people to even apply to YC. And I think the issue is there’s already — as we discussed earlier — an incorrect, caricatured image of the successful founder, you know, the Elon Musk “I’m Iron Man,” or the Jeff Bezos champagne windmill. And I think there’s not enough understanding of, again, how silly and small all these people seemed in the early times. As a result, I think if you’re underrepresented, you’re thinking, “I already feel behind, there’s no way I’m going to achieve what those guys achieved.”
So I think the solution for this is to really try to emphasize how silly some of these people seemed early in their careers, to kind of make it feel more approachable. To me, that’s kind of the most important thing. I think our largest issue is that people self-edit. They know about YC, so we did a good job of promoting it, and they know what it does, but they don’t even think they should apply.
The most actionable thing we can do is twofold. One is, as partners we try to go around dispersing the gospel, and second is I would love to start producing more content about people’s earliest days. I think it very much humanizes the experience. So those are two things I think we should be doing to move the needle.
CN: You mentioned this thing I think is interesting, which is that someone has to level the playing field if startups are going to compete with the giants when it comes to AI, and we’re at this time where there is so much concern and suspicion around big tech. What happens if there’s only three or four or five companies that can control the world?
I’d be curious to know what kind of conversations you’re having with the founders you’re seeing about that and about whether you have been able to actually level the playing field or if you are still worried that Google and Facebook just have ...
KS: Google, Facebook, Amazon, Apple ...
I actually would be curious to get Kara’s take on this, but from my reading of history books and Wikipedia pages, IBM felt very, very big ...
KS: That’s the hope.
That’s the hope.
KS: No, of course. I remember AOL was big, and someone was talking about something the other day, I go, “Remember when we were scared of Microsoft?” It does pass, of course; things have a cycle.
Yeah, and things have a cycle and every generation needs to have its own existential dilemma and threat, and so this is kind of ours. So one, in the back of my mind, I’m hoping at the end of the day capitalism will continue, large companies who create a lot of value become inefficient over time and young upstarts will fix them.
KS: Well, you didn’t stay at Apple, so that’s interesting. You wanted to go somewhere else to try ... I think you’re right, I think that one of the issues is that these companies do ... Power does wane, they become less cohesive as a group of people, they get older, all kinds of things, as they get tired, they get everything else, but it really does ... It’s predicated on the fact that there’s another thing coming. What’s the next thing? And some of these technologies do require what you were talking about, the three or four things that are critical. And so it seems like they could take an advantage here, and there’s more of them, it’s not just Microsoft, it’s ...
CN: They also seem like they’re better students of history than IBM was. Everyone at a high level of Facebook is very aware of this phenomenon and they’re doing everything they can ...
KS: But there’s six of them, not one. It was IBM and it was Microsoft, and then it was Google sort of, but still, you know, you hope that entropy will work.
I think so. I think it’ll take a while for another reason, which is kind of Nietzsche’s Übermensch philosophy, and so I think the other thing going on is a lot of those companies today are still being run by their founders, who are truly remarkable.
KS: Yeah, and they’re religious in a way, if you think about it.
Yeah, they are religious in a way. So hopefully over time, you know, Jeff Bezos gets more interested in launching himself into space and Zuck runs for president and then those companies go the way of the IBM. I do think there are very few people in the world working on the particular thing I’m working on because most people are not incentivized to do it. I’m working on infrastructure to make it easier for startups to start up in the AI space. There are very few organizations that have that type of goal alignment. My hope is that there are more of these.
I also have this other crazy belief in particular about the AI world that makes me slightly more optimistic, which goes something like this. My belief is that what’s happened is that Google has turned the AI problem into an infrastructure problem, because that’s what Google knows how to do. Google knows how to crunch large numbers, and so they’ve taken all of this AI hype around neural nets and they’re solving it the way they organizationally know how to solve problems, which is Jeff Dean comes up with a way to parallelize everything and do it really efficiently, really cheaply.
I wonder if there are other ways to build real AI. And of course the premise for this is humans, who seem to be generalizing and learning with much less compute power.
So one of my beliefs is that with the right amount of funding to the right places, real AI will emerge, and it’ll look completely different from what we have today.
KS: No, I would agree. Google’s good at making nickels, I don’t know how else to put it.
Yeah, exactly. And I think around 40 percent of the AI research papers published today are published by Google Brain, and so they have the publishing volume. And what happens in academia is people just fast follow whatever is the latest fashion. So you have this whole religion of folks working on massive, large-scale neural nets to try to understand what’s inside images and texts, but I actually think that’s only one approach to the problem.
One of my other goals, it’s this other side project I have, it’s a thing called AI Grant, which is kind of like a distributed, nonprofit AI research lab where we try to ... Just like Y Combinator funds interesting entrepreneurs around the world, we try to fund interesting researchers around the world that are working on approaches that are not Google’s approaches. We’re looking to bring research diversity into the ecosystem, with the long-term hope being that one of those approaches works out and we end up finding a patent clerk in Bern that ends up being an Einstein type of thing. And if that works out and we manage to build something that’s able to learn from far fewer examples, boy, then a lot of those moats we were describing earlier that Google has don’t exist.
KS: Absolutely, you know, it’s riveting. We’ll talk more about where AI is going when we get back. We’re here with Daniel Gross, he is a partner at Y Combinator, he’s been there for almost a year, he’s focusing on AI, obviously. We’re also here with Casey Newton, the Silicon Valley editor of The Verge, and he’s the host of an upcoming podcast called Converge.
We’re here on Recode Decode with Daniel Gross in the red chair, he’s a partner at Y Combinator and we’ve been talking about really riveting questions around AI, which is the next frontier of computing and the big buzzword, at least in Silicon Valley. [We’re] talking about how you take the power out of the hands of the big players like Google, pretty much primarily Google, but Facebook, Apple and others. We’re also here with Casey Newton, the Silicon Valley editor of The Verge, and Daniel was just making a big point of how do you get people thinking in different ways beyond the agendas the big companies want to set.
CN: And you were saying you have this nonprofit that you started up and I wanted to know who is applying for these grants, are these postdocs or undergraduates? Who’s actually coming and who are you giving money to?
Yeah, so just like Y Combinator founders kind of span the entire gamut, the same is true here as well. We’ve given out 30 grants to date, we’ve had about 1,600 people apply, and they’ve ranged from people who are currently working at Google Brain and just want funding to explore this random side project they have, to 17-year-olds in random pockets of the world.
KS: Don’t give money to the Google Brain people, but go ahead, all right.
Look, at the end of the day, we’ll fund anything that is interesting and diverse.
KS: All right.
So, I appreciate your sentiment, but I reject your idea.
KS: A lot of money over there. Very good food.
Yeah, they mint money so they can afford to hand it out.
We’re often most excited when we come across someone very unconventional. There’s a team of students working on a project, and they’re all literally in high school. That to me is ...
KS: Where are they from?
They are from, I don’t exactly know, somewhere in the U.S. We try to fund people internationally as well.
KS: What’s the most interesting, unusual?
Probably, in terms of projects?
The most interesting project. Boy, there are a lot. One that’s very understandable, because some of these are knee deep in research, but one that’s very understandable is there’s a team of folks working on trying to use this thing called generative adversarial networks — we can get into what that is later on — to anonymize medical data and effectively build some type of AI that can look at private X-rays. And it kind of learns what X-rays look like from the private data, and then it becomes smart enough to generate its own X-ray images that are inherently not private, that other people can train on. So this would be somewhat the equivalent of me going into a room, learning a lot of Shakespeare, and then kind of speaking out Shakespeare prose. It’s actually not Shakespeare, but it kind of feels like it.
Now, this is really important, because it would allow us to have massive datasets that are quote unquote public that others can train on.
KS: That were private but aren’t private.
So this is like a very important fundamental building block if it’s possible to do.
CN: I love the phrase generative adversarial network, which is the perfect description of Twitter. That does sound like a really cool project. I want to talk about a big AI idea, because I wrote this story about AI this year that I wound up feeling sort of bad about.
CN: So a year and a half or so ago, I started noticing YouTube recommendations getting really good. YouTube is a place where my niche interests are, and so it knows that I love cooking, it knows that I love video games.
You’re watching those Tasty videos?
CN: I love Tasty videos. They are hypnotic.
KS: I also have determined through Kara Swisher AI what Casey wants with cable, but go ahead.
CN: Yeah, we share a cable password. But anyways, I get so fascinated by this, so I call up YouTube, and I say, “I’ve got to hand it to you gang, y’all have cracked it, how are you doing this?” They say, “Come on down, we’ll show you.” So I go, interview these people and I write about how they built this algorithm and it was like thumbs up.
Two months later, we start seeing the stories that these same algorithms are being used to surface the absolute most terrifying videos to children, where there’s all sorts of really problematic stuff, where weird creators are sort of picking up beloved children’s characters and then making these really bad animations, like showing them in situations where they’re distressed. Anyway, set the content aside, the algorithms are the same, and something I observed was the reason that YouTube wasn’t aware of this is precisely because their own experiences are so personalized. I’m visiting YouTube every day, only seeing an AI that’s bringing me cool stuff; these poor kids are seeing Peppa Pig getting slaughtered at the dentist’s office.
CN: Yeah, so how do you think about building AIs that we actually retain some level of oversight into, when the results are so individualized and it becomes opaque to outside observers?
I think what a lot of teams that are productionizing AI today will tell you is that very frequently they take the simple approach over the complicated, what’s called, I guess, black-box end-to-end system, because it’s impossible to debug those. Very concretely speaking, at almost all of the companies working on autonomy, at the end of the day, when the decisions need to be made about whether you hit the brakes or not, that stuff is actually very heuristic-driven. It’s not a black-box neural net, because they need to be able to debug it. The true byproduct is that when you are either doing end-to-end learning or you have a highly personalized system, you have no clue what’s going on, and most importantly, your feedback mechanisms do not correlate to long-term human happiness, they correlate to engagement, and that is, I think, the actual problem.
I think it’s compounded by the fact that engagement correlates to short-term happiness, and it’s unclear what’s happening long term. I’m sure you’ve seen, I think, Tristan Harris’s work around different apps that, the more you use them, the less happy you actually become, and I think this is one of the main worries that I personally have around AI. We’re going to build an incredibly sweet, irresistible cookie that we’re just going to want more and more and more until we feel bloated, and there’s no way that regulation will catch up in time. So suppressing the urge to go to YouTube and get more engagement from the thing that the AI is showing you is going to be harder and harder to resist.
I don’t exactly know what the solution is. I think the best thing we can do today is actually to do more of that time well spent stuff. The more we can radicalize people about the fact that, with the help of AI, your Facebook News Feed, your Twitter feed, your YouTube feed are getting sweeter and sweeter and sweeter, but that may not be what you want long term, the more people become self-aware of it.
KS: Or having people in leadership that do think about something besides constant growth and there’s other ways to ... Their business models are predicated on engagement to the extreme, and they don’t even think about it. I think they don’t, like how to make it more useful or more ... They want to know, push that red button. Push the button ...
I’d like to offer a slightly more complex narrative, if possible.
I think they do think about it, but I think it’s always hard at the end of the day, when your forcing function is engagement and growing quarterly earnings, it’s hard for that stuff to get prioritized. So I guess the twist is I actually believe they’re fundamentally good people, I just think they can’t beat the incentive structure machine that they have in front of them.
KS: Except that they’re in charge of it. That’s such an out, they think the same thing on gender issues, the same thing on everything. Where are we to understand? They’re billionaires in charge of everything, and so you do have choices when you’re building companies.
Yeah, they have choices, but it is a common refrain in Silicon Valley that the incentives always win, and on diversity stuff, I think the incentives are actually the opposite. The incentives are clearly to be more diverse. When it comes to engagement, though, it seems like the incentive is keep coming back and keep coming back.
KS: Maybe, maybe not. That’s what I’m saying. One of the recent things I’ve been thinking about is when you’re talking about finding talent at Y Combinator or anywhere in the world, I am certain that there’s a girl geek in Afghanistan who could have done something amazing who is never going to because they won’t reach her, they don’t get her, or a girl here in ... person of color here in Oakland who doesn’t get reached and maybe they’re the next Mark Zuckerberg or the solver of cancer or something else and the structure is thus that they can’t ever get up, and so that’s what I think about is that these people have choices to be creative with their business plans and in a way that isn’t so ... It’s like cigarette makers, as far as I can tell.
I guess that’s right. I think it requires a fundamental, almost religious-like belief in the narrative in order to overcome the organizational incentives. I will say that I was blown away, when I was at Apple, by how rigorous the executive team’s belief in customer privacy is, to the detriment ...
KS: Apple’s the one that doesn’t rely on advertising and the demented obsession with attention, like the attention slot machine. Apple’s the only company that actually doesn’t. Now, they could do a lot more on Apple devices to just have utilities like Uber. You don’t spend a lot of time with your Uber, like, “I think I’ll look at the Uber app again,” but you certainly do with the Facebook or the Twitter app, and so they could do things to move things, in the way the iPhone is structured, that would radically change things if they wanted to.
Yeah, I think you’re right, it’s just I think the challenge is always you’re running Apple, and so there’s that initiative, you’ve got constrained engineering, everyone’s constrained all the time, so you can either spend your time on that or you could spend your time on making sure the screen for the next iPhone looks even better.
KS: But everything is choices — when you’re raising children, when you’re doing anything — but that’s not true. Their argument is, “We have to just do this.” It plays into, actually ... Apple, for example, has a very different business model, which is: We don’t do advertising, but we’re going to make this an even better phone experience for you, and then you’ll want to use an Apple phone because it’s not littered with what Walt Mossberg used to call crapplets, do you remember?
KS: And a lot of these services are crapplets or sugar, whatever you want to call it.
Yeah, I very much agree with you. I’m just trying to say that it’s actually really hard. I guess I worry that a lot of people are as a result just going to pick various executives at Facebook or Google and just say, “Oh, they’re evil.”
KS: It’s not evil.
But actually I think the narrative is we’re asking them to climb Everest, and they’re able to do Hawk Hill.
KS: I guess if they weren’t billionaires and they didn’t run the world, I guess I’d feel bad for them, but I don’t. I don’t. I think it’s not true, they have the influence and power. Anyway.
I think the action item for us is the way ... I was blown away, when I was inside Apple, at how much some of these blogs influence the internal culture and dialogue.
When Marco writes something, Marco Arment, he’s moving the organization, and so I think the move is to do more of the time well spent stuff because that stuff internally gets people talking and can reprioritize.
CN: I think it’s one of the big tech trends to watch next year, because not only is it people like us saying Facebook and others ought to look at this, it’s the former Facebook employees, it’s the people that built the “Like” button that are saying we need to watch out.
One other point on Facebook I just want to make, I’m hearing this phrase in the culture where people say to me some variation of, “I’m trying to spend less time in the News Feed.” Which is such an interesting thing. You think of every other product, people’s problem is getting them to use it at all. When it comes to Facebook, people are saying, “I’m trying to use it less,” which suggests that they’re failing.
KS: Or slash Twitter or whatever.
CN: Right, and people say it about Twitter too, but it really seems to come up with Facebook, and in part that’s a story about what an amazing thing they’ve built in part using some of these AI recognition techniques.
Yeah, I think that’s right, and I’m at the point honestly where I do things where I openly feel like an addict who’s trying to ... for example, I don’t bring my phone into my bedroom. I have the charger outside.
KS: That’s a very Ariana Huffington thing for you to do.
Well, I don’t know why, but I’ll guess I’ll take that as a compliment.
KS: No, she has a little bed for her phones.
When I go running in the morning, I try to just run with a watch, and that’s like my time away from the phone. So yeah, it’s increasingly hard. And the generation I worry a lot about is if you’re a kid now, I don’t know how you do homework.
CN: There’s just so many distractions.
Because it’s you against 1,000 data science engineers at Facebook trying to make the most compelling thing possible, and homework’s going to lose.
KS: Facebook’s always telling you, you know, people pick what they want. I’m like, but you have 1,000 engineers making those kids push the red button, aren’t you figuring out how to ... Actually it’s interesting, because we just did a podcast with my son Louie. I think they’re a little more in control of themselves. It seems they really can discount and throw out things rather quickly.
CN: Yeah, Louie, we do a podcast with him, and Louie ... I think a lot of teens like him do see through some of this bullshit, like they are aware on some level that they are being played, and so they’re going to get out of Snapchat what they want to get out of Snapchat, and no more.
KS: Yeah, I would agree.
Yeah, there’s another interesting narrative, which is that humans adapt very quickly at the end of the day. Just like Americans are able to process carbohydrates much more efficiently than people who are having them for the first time (kind of a made-up fact, but it feels like it’s right), it kind of feels like kids will just adjust very quickly.
KS: Or your idea of these things pass, these companies get less powerful, because Louie’s not using Facebook. He’s using Instagram, but he certainly isn’t using Facebook, and the disdain is like, “Why would I do that?”
Right, I would never use Facebook, but Instagram, yeah, I’ll spend three hours a day there.
CN: Okay, I feel like we’re maybe coming to a close, so I thought maybe you could give us some cool AI stuff to get excited about.
KS: Just to be super cool.
All right, the technology I’m most excited about is this concept I mentioned earlier called a generative adversarial network. Without getting into the details, the tl;dr is that the computer is able to generate content based on examples that it’s learned. Actually, the way it does it is kind of cool, and to your point about Twitter, it’s basically the same thing.
The way it works is there are basically two AIs, and one is kind of a conman and one is kind of a cop. The way it gets really smart at generating, I’m sure you’ve all seen images of fake bedrooms or whatever, is the cop is trained on a little bit of what reality looks like, and the conman is constantly trying to counterfeit something that gets past the cop, and they get smarter by fighting each other. One against the other, iteration after iteration after iteration, and so they’re able to develop a much deeper sense of intelligence around something.
KS: The cop is learning from the conman, presumably?
Yeah, and the conman is trying to figure ... The conman is like, “Oh, okay, a cop caught me this time, let me try a slight variation on this image, I’m going to send it to the cop and he’s going to try and figure out whether this image is legitimate or not.” That’s what they’re fighting about.
“I’m going to generate an image, you’ll tell me if it’s real or not,” so conman is constantly ...
KS: But the conman is the teacher.
They’re both teaching each other. Right. I understand you’re kind of looking for a Plato-esque metaphor here, but they’re both teaching each other.
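The cop-and-conman loop Gross describes can be sketched numerically. The following is a toy illustration only, not anything from YC or the companies he mentions: a tiny generator learns to counterfeit samples from a simple one-dimensional “real” distribution by fighting a logistic-regression discriminator, with all numbers and update rules chosen for readability rather than realism.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# "Reality": samples from N(4, 1). The cop (discriminator) learns what real
# samples look like; the conman (generator) learns to counterfeit them.
def real_batch(n):
    return rng.normal(4.0, 1.0, n)

w, b = 0.1, 0.0   # cop: D(x) = sigmoid(w*x + b), probability that x is real
a, c = 1.0, 0.0   # conman: G(z) = a*z + c, with noise z ~ N(0, 1)

lr = 0.05
for step in range(2000):
    x = real_batch(64)
    z = rng.normal(0.0, 1.0, 64)
    g = a * z + c                      # counterfeit samples

    # Cop update: push D toward 1 on real samples and toward 0 on counterfeits
    d_real, d_fake = sigmoid(w * x + b), sigmoid(w * g + b)
    w -= lr * np.mean(-(1 - d_real) * x + d_fake * g)
    b -= lr * np.mean(-(1 - d_real) + d_fake)

    # Conman update (non-saturating GAN loss): push D(G(z)) toward 1,
    # i.e. produce counterfeits the cop can no longer reject
    d_fake = sigmoid(w * g + b)
    a -= lr * np.mean(-(1 - d_fake) * w * z)
    c -= lr * np.mean(-(1 - d_fake) * w)

# After the fight, the conman's forgeries should cluster near the real mean of 4
fake_mean = float(np.mean(a * rng.normal(0.0, 1.0, 1000) + c))
print(fake_mean)
```

In a real GAN both players are deep networks and the data are images or audio, but the adversarial structure, two models improving by alternately updating against each other, is the same as in this sketch.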
KS: No, but that’s interesting. It’s like Trump and the U.S. electorate.
CN: That’s what I was waiting for.
KS: I had to.
It’s funny, once you learn this concept, you actually end up using that metaphor quite a bit, or so I’ve found. Anyway, cool science, but what is this useful for? So one is that anonymization thing I mentioned earlier. Another very practical one is there’s a cool company that’s effectively building a version of Photoshop or Illustrator where, if you’re an animator today at Pixar, not only do you have to draw the frames, you have to color them in. What this thing does is, basically, you make an initial color palette and it kind of learns from that, and then you draw literally like four or five quick lines, and it generates this beautiful image of a scene. I’m excited about this because obviously it’ll make life for folks at Pixar easier, which they need with the loss of John Lasseter. But imagine, just like GarageBand made it possible for everyone to be a musician, imagine that for art.
KS: Whose art is it then? Is the first four lines is the original artist? Is it based on their stuff or does it plagiarize?
CN: We’re not stoned enough to have that conversation.
KS: I know.
Anyway, so that’s one slightly weirder one that’s more interesting than just me espousing the virtues of self-driving cars and trucks. I think another really interesting one that’s related, it’s generative, is actually a YC company called Lyrebird, and what they do is they’ll take ...
KS: Liar Bird?
L-Y-R-E-B-I-R-D. Lyrebird. They take about two minutes of your voice — from this podcast, possibly — and then they’re able to build a text-to-speech engine of your voice, so they can make Siri sound like you, like me, like Arnold Schwarzenegger, with about two minutes.
CN: This would fundamentally undermine all credibility in the media for the rest of time.
CN: And by the way, if you ever heard me say anything that sounds untoward, I guarantee you Lyrebird has something to do with it.
KS: I don’t think that, I think that’s not the case.
It’ll be an interesting societal change, because today when you hear someone’s voice on the phone, your default assumption as a human is, “Oh, it’s real, it’s them,” and that’s not going to be the case, I think, maybe two years from now.
CN: So imagine you’re like a CEO and somebody says, “Hey, Jeff Bezos is on the line,” and you’re like, “All right Jeff, here’s the product roadmap,” and it turns out you were talking to TimCo.
Yeah, that would probably never happen, because you know these are the games Tim used to play in order to get an edge. So yeah, so there’s the dystopian narrative, which is that this is the end of the world, fake news will get even more fake, what have you. I personally believe that we’ll adapt fairly quickly. Today, when you look at an image on The Onion, you know that it’s not real, it’s Photoshopped, and so that will happen to voice, so we’ll learn pretty quickly.
There’s similar techniques for videos, I’m sure you’ve seen them online where they got Barack Obama to at least look like he’s saying something that he’s not, so I think it’s just going to be another interesting part of 2020, where there’s going to be a lot of content that is very blatantly not real.
The consumer benefits of this, though, are pretty cool. It’d be nice to have any audio book you want read out by your favorite comedian or any form of content you want read out by ...
KS: If they want to read it.
Yeah, so there’s some really interesting open-ended law that the guys are figuring out now, around whether making a consumer copy of a voice is considered impersonation, which is totally legal to do, so ...
CN: I would sell my voice for a pretty low price. Maybe I’ll get the price up when the podcast launches.
Yeah, spread it across the land, let everyone hear your voice.
KS: Yeah, wow, that’s cool. So just finishing up, any predictions for where we’re going in 2018? What do you think is hyped, or the opposite, that people should be paying attention to?
One thing that I think is hyped that will take longer than everyone expects, I do think the roll-out of self-driving cars will take a long time.
KS: And because?
Well, a couple of reasons. One is uptake of cars takes a long time, so even if the car was available for sale today, it would be like three years until they were mass available, you know, available en masse.
Second is I think regulation is always going to lag, and by self-driving cars, I mean there’s like a lane where there’s self-driving cars in it; I think that will take like five, six years, if not 10. Lastly, I think that we have in front of us, sadly, a few more gnarly cases where horrific accidents that actually look really bad, like a truck overturned, you know, whatever, happen, and then of course there’ll be a negative backlash against them.
KS: Lots and lots of those.
So I think we have to have a few more cycles of that before it actually ends up rolling out, so that’s one that I think we’re a little bit too optimistic about. Although obviously startups are making tremendous progress, and so is Waymo.
KS: Not in court, but go ahead.
Okay, not in court, yeah. That’s your jurisdiction to cover.
KS: Yeah. I meant Uber, Waymo’s doing great.
CN: Yeah, the Waymo lawyers deserve their holiday bonuses.
I’ve got to say, Judge Alsup for President.
KS: Love him. I know.
CN: Yeah. We profiled him recently; dedicated ham radio operator.
KS: Yeah, it’s interesting, I mean he really handed it to them. We were talking to some Uber executives just the other night and they’re like, “What are we going to do?” And they’re really like, “Aagh,” and I was like, “You need to fire your entire legal team.”
CN: Last week.
KS: Yeah, something like that. It’s tough, but it is moving forward.
CN: It is moving forward. That’s interesting. Anyway, one other interesting category: there’s a plethora of companies building silicon dedicated to AI or machine learning, effectively trying to do what Google has done with the TPU, but for everyone else. I think this is going to be a fascinating ...
KS: That’s interesting.
Yeah. Fascinating category to watch, because there are a bunch of different narratives here. One is that there’s going to be another Nvidia or Intel made, because effectively GPUs are trying to do a lot of things, so if you shrink it down to the bare essentials, you can get something done far more efficiently.
Second is that Nvidia will be this company and they’ll just figure it out, and all of these companies getting funded (you know, Graphcore or Cerebras or a bunch of other prelaunch ones) are just a byproduct of hype, and people don’t realize that actually doing a tape-out of a chip is really complicated; it’s a bunch of software guys just assuming that AI chips will happen, as opposed to anyone who’s grounded in reality.
I think the third narrative, which is probably the most likely one, is that a bunch of large companies sadly end up acquiring one of these companies because they need it at the end of the day in order to satisfy their cost structure. And in particular, it’s worth noting that the AI chip market is really split into two.
There are folks working on what’s called inference, which is to say you’re in the field, you’ve got the drone out there, it’s flying and it’s trying to infer what it’s seeing, versus training, which is you’ve got the data scientist in the room and he needs to train a crazy model.

I’m really excited intellectually about the training stuff, because I think there are a lot of things that are not possible today that will be possible without any interesting algorithmic improvements or breakthroughs, just by getting better silicon. Today, most neural nets that we train are really like a couple hundred layers deep, but if you want to get to a world where you have a neural net that’s thousands and thousands of layers deep, you actually need silicon that’s far more efficient at it.
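To make the inference-versus-training split concrete, here’s a minimal illustrative sketch in pure Python (not from the conversation, and deliberately framework-free): “inference” is just the forward pass, while “training” adds gradient computation and weight updates on top of it, which is why the two workloads favor different silicon.

```python
# Illustrative sketch: the same tiny model used in two modes.
# Inference = forward pass only; training = forward pass + gradients + updates.

def forward(w, b, x):
    # Inference: a single linear neuron, y = w*x + b
    return w * x + b

def train_step(w, b, x, target, lr=0.1):
    # Training: forward pass, squared-error loss gradient, weight update
    y = forward(w, b, x)
    err = y - target       # dLoss/dy for loss = 0.5 * err**2
    w -= lr * err * x      # dLoss/dw = err * x
    b -= lr * err          # dLoss/db = err
    return w, b

w, b = 0.0, 0.0
for _ in range(200):
    w, b = train_step(w, b, x=2.0, target=7.0)  # learn to map 2.0 -> 7.0

print(round(forward(w, b, 2.0), 2))  # prints 7.0
```

Even in this toy, the training loop does strictly more arithmetic per example than inference does; at the scale of nets hundreds or thousands of layers deep, that gap is what separates a datacenter training chip from an on-device inference chip.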
One of the reasons I think we’ll need to do that is if we truly want to understand language; for a bunch of technical reasons we could never get into because we’re out of time, I think we’re going to need a very, very deep neural net. The whole AI silicon space, I think, is one really interesting one worth watching, and it may yield one of the few examples we’ll be able to point to of an AI company, as opposed to a company that uses AI.
KS: Yeah, that’s riveting, that’s actually the most riveting thing you’ve said so far.
KS: But I’m going to put you on the spot.
Finally we get to something good.
KS: Just one minute.
KS: Not enterprise, but consumer.
What happens with consumer?
KS: Yeah, you look at other things, they don’t just stick you in the AI ...
AI room. Yeah, here’s one weird one, I would gladly fund social networks all day.
Yeah, I’m excited about that because no one else wants to do it. Here’s my framework for it; it’s two-fold. One is I kind of think that these things are much more like TV shows or fashion trends than they are like technology companies, and so at the end of the day, Simon Cowell wasn’t the first creator of a TV show, he just made the right TV show. And second, in terms of moat, because I know what you’re going to tell me: you’re going to tell me Facebook will crush all of this.
KS: I love this idea, not at all.
I think there are a bunch of different areas that feel so weird that Facebook wouldn’t even bother going after them. I’ll give you an example. If there were like a new desktop-only social network, I can totally imagine that being something Facebook will never chase — it’s not on mobile, desktop is shrinking — and so it actually gets big. So that type of weird is something I’m excited about. The other interesting consumer thing that I hope happens (I hope, hope, hope) is finally a non-bitcoin consumer-facing experience for any of these blockchain or crypto technologies. It’s not clear to me what that’s going to be, but ...
KS: The AOL of bitcoin. I know it sounds crazy, but AOL ...
Yeah, let’s bring back Steve Case.
KS: Yeah, bring him back. He talks about bitcoin. Daniel, this has been riveting, this is really fun, and thank you for talking to us and thank you for coming on the show.
Thank you so much for having me.
KS: You’re very funny, you’re coming back. You actually can speak in full sentences, which is nice too. And thank you, Casey, you cannot, but no, you can.
KS: You did a great job co-hosting and we have three more to go. And get busy on doing Casey’s voice for AI or whatever.
CN: I’m going to have a generative podcast.
KS: “Hello. This is Casey.”
I like it, that’s a million-dollar idea right there.
KS: It’s really not, it’s more like a $10,000 ...
CN: $10,000 ...
KS: I was thinking $10.
This article originally appeared on Recode.net.