🎂 Steve Wozniak & John Graham-Cumming Fireside Chat
Presented by: John Graham-Cumming, Steve Wozniak
Originally aired on September 27, 2021 @ 2:00 PM - 2:30 PM EDT
2021 marks Cloudflare's 11th birthday. For our annual Birthday Week celebration CFTV is featuring an array of new products and guest speakers, as well as a look back at some of our favorites from last year.
In this Cloudflare TV segment, Cloudflare CTO John Graham-Cumming hosts a fireside chat with Steve Wozniak, Co-Founder of Apple, Inc.
Transcript (Beta)
All right, well, good afternoon from Lisbon, and I think good morning from wherever my guest is currently sitting, somewhere in California or Nevada or somewhere.
Steve Wozniak, who you will all know as the creator of the first Dial-A-Joke service in the San Francisco Bay Area.
Woz, welcome. Thank you for the unusual introduction.
A lot of people don't know about that. It was hard to do. And why was I the first?
Because way back in the old days, we had one phone company, a monopoly, AT&T, and you could not own a phone.
You could not own an answering machine.
Other companies were not allowed to make an answering machine. You had to lease the one they had, big and expensive, made for money-making theaters: the Code-A-Phone 700.
And for a young engineer, like just out of college, your biggest expense is your apartment rental.
And renting that machine every month cost half again my apartment rental.
But I wanted to have a number you could call and tell a joke, because I was so into humor.
And not just because nobody else spent the money to do it on their own.
Eventually after I did it, a local radio station started one.
And so people would call in and leave a joke? Is that the idea? And then someone could call up and hear a joke?
Yes, I'm sure. I'm sure that I was the most called single line number with no extensions in the United States, because the jokes were so short.
And it was just one call to the next call to the next call to the next call, all on one machine that went for every call in my Cupertino apartment.
This is, you know, a few years before Apple. Yeah, so then you veered off into Apple, which I think other people might have heard of.
And it's a pity the dial-a-joke thing didn't continue on as long as Apple did.
No, it didn't. But it did well.
It got a lot of notice. Eventually, I was telling Polish jokes, and the Polish American Congress, Incorporated, a group, got on my back.
So I said, what if I change them to Italian jokes?
Those were the joke books that were going around in those days.
And they said, that's fine. There was no political correctness about telling those kind of jokes.
I mean, it was just all the kinds. And I said, it's just humor.
But eventually, yeah, it got to where even a joke that's a little risqué, one that mentions another ethnic group but isn't about them, was a problem.
It doesn't put them down or anything. It's just talking maybe about a word in Spanish and a word in English that could have two different meanings.
Makes it funny.
No, you get criticized for that, even. Well, I'm glad that you helped us with that with Apple, though, because that was a pretty big deal.
And I was reading something on Hacker News the other day.
And there was somebody who was like, I really want to know the answer to this.
I wonder if Woz would answer it. And you weren't there.
So I'm going to go straight into something super nerdy, which is somebody wanted to know why the Apple II had 280 characters across the screen and why it was seven bits rather than eight bits per character.
So I'm going to make this Hacker News reader's dream come true by asking you that.
Why was that? OK, you meant to say 280 dots, not 280 characters.
Sorry. All right. Yeah, I did say that. Sorry about that.
I was an analog engineer and then a digital engineer. And I could, you know, easily repair color TVs even.
I mean, I was really into that analog world. So I looked at all the specs, you know, how many microseconds per line you had going across horizontally. Eight bits per byte for 40 bytes would show up well on some direct monitors.
But this had to be for the home, had to be where nobody had money for a monitor.
And it was your home TV, the analog TVs that are broadcast on a frequency and need a little air room.
And that really was pretty much the top level.
The real reason I'm going to get into, though, is that the Apple I was only text characters, and on the Apple II that was one of the primary modes.
We didn't go to bitmap graphics because we didn't have enough speed and processors to handle a screen, a bitmap graphics that well.
So I built in characters.
And the way you did it, you had a character generator that if you put in the ASCII code for an A, it would give you the rows and the columns.
What are the ones and zeros that are on?
But there were these character generator chips. To do something all on my own, with my own chips, would have been so many chips and so much expense.
So you buy a character generator chip to turn an A into the dots on a screen that make an A.
And they were five by seven, meaning five dots horizontally by seven vertically.
And I could put one extra space in to make the vertical a nice binary eight.
But going across horizontally, I had five dots and then two blanks between characters, which makes seven. Three blanks would have looked too weird.
The characters would be split apart, and I couldn't have fit 40 on the screen going across.
So, right. Even though it was a little costly in terms of circuit design to use seven bits per byte going across the screen horizontally, that was the solution. I did all the calculations on paper, thinking very deeply about what the right numbers were.
And I like it when things all fit together, sort of serendipitously, you know, in what it comes out to be.
And at that point in time you didn't have chips costing nothing and you didn't have memory costing nothing.
No, these were also economic considerations.
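The screen-width arithmetic Wozniak describes can be checked in a few lines. This is an illustrative sketch, not period code; the variable names are mine, but the numbers (5x7 glyphs, two blank columns, one blank row, 40 columns of text) come straight from the conversation:

```python
# Apple II text-mode geometry, as described in the conversation.
GLYPH_W, GLYPH_H = 5, 7          # dots from the 5x7 character generator chip
BLANK_COLS, BLANK_ROWS = 2, 1    # spacing Wozniak added between characters

cell_w = GLYPH_W + BLANK_COLS    # 7 dots per character horizontally
cell_h = GLYPH_H + BLANK_ROWS    # 8 dots vertically, a "nice binary eight"

chars_per_line = 40
dots_across = chars_per_line * cell_w   # dots across the screen

print(cell_w, cell_h, dots_across)      # 7 8 280

# Three blank columns would have meant 8-dot-wide cells, and 40 of
# those is more dots than a home TV could usefully resolve:
too_wide = chars_per_line * (GLYPH_W + 3)
print(too_wide)                          # 320
```

So the "280" in the question is 40 characters times 7 dots, and the odd 7-bit width falls out of wanting exactly 40 characters on an ordinary television.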
Yeah. So let's talk about that; let's contrast. I'm also an 8-bit kind of person; I grew up in that world, where you had a computer that was really highly optimized in terms of the hardware.
You knew how everything in the machine worked.
Right. You've got a circuit diagram. You've got a disassembly of the ROM and you really owned it.
So today, where all of the hardware I use, I, you know, if I took it apart, I couldn't understand it.
And it's very difficult to use.
What's stayed the same and what's changed in this, you know, sort of 40 year period?
Moore's law. What you're getting at is that, well, it was like open source then.
When you bought even in the analog days, you bought a radio or a television.
You got the full schematic showing which pins on the chips were connected to which resistors and electronic parts and all that.
And you had it all and you could look at it.
And if you had enough smarts, you could also analyze and fix any of the problems, you know, or change them a little bit.
That was the world we grew up in. And although we didn't have personal computers yet, I had looked at manuals of the minicomputers five years before.
And I learned so much from them.
I'd see a person's code. I'd even remember their name.
And that's how you do this type of operation with machine language. That was like open source.
Everything was open source. And I wanted our Apple II to be very open source and show people, here's how computers are made.
Here's the circuit.
Here are the codes. So we published everything. And to this day, you can't go in and understand it all yourself as a person because it's millions of lines of code from hundreds of thousands of people.
Sometimes, you know, what makes an iPhone?
You know, a skilled engineer, even working in Apple, couldn't go back and actually figure every little detail out.
A lot of bugs just have to continue on through.
And it's just too huge. When we started Apple, we had no idea. Where's Moore's Law going to taper out, reach its end?
You don't know how far it's going to go.
When we started Apple, the amount of RAM that would hold a song cost close to a million dollars.
Did we ever envision an iPod where you'd hold a thousand songs in your hand?
No. Or that there'd be a chip someday, a tiny chip that had 500 movies on it.
No, because Moore's Law says it's going to happen. But Moore's Law, you just don't know.
Are we going to hit a limit of physics and all that?
And we're kind of getting, we seem to keep getting close to the limit of physics.
But then we now have like, if you look at the iPhone, multiple cores in it. So now you've got, it's becoming a supercomputer itself.
And one of the things I'm wondering about is that period, it was, there was this world where you could kind of get it all, encompass it all in your mind.
What do you think the Woz of today is doing?
Right. If there's somebody who's going to be you in 40 years time, what are they working on today that is equivalent of where they really got control over their whole world?
Sure. Well, I didn't think in terms of these are computers that are going to be in everyone's hands and turn the world over.
Of course, I said that it's going to be a revolution.
Everybody will afford their own computer and be able to do things they never could before.
But nowadays, oh, I kind of forget your question.
Well, just like, you were there, you built this thing, and you were using the available knowledge.
Who, with the available knowledge, do you think is doing that today, who's going to turn out to be like you in 40 years' time?
Someone else is going to be interviewing them, saying, wow, what was it like in 2020?
Well, you know, I was just trying to design new computers to show the world. There are people out there trying to show off their technical skills, whether it's programming or even building a little hardware.
And this Internet of Things world has opened up tons of possibilities because you just look around your own life, your own home.
Here's a device or something else that could be digitized in a way.
Trouble is, nowadays, you have to think of putting everything on the Internet to make money.
It's like, we can't sell something the person owns that works forever.
No, we got to have subscriptions and all that. So the world has changed a lot in the business models that were made possible by this more virtual world.
That didn't happen right away. You know, back in the early personal computers.
No, you bought it. You owned it. It worked on its own. You could buy a CD that had an encyclopedia and there was your information to the world.
So it's changed a lot.
Now, so a young person starting out, or let's say even an older person, has an idea.
Basically, you just have to be motivated enough. It's not so much a deal, a thing of knowledge, because you can get a lot of knowledge online, this and that, talking to friends.
You can learn it if you have to, to get something built, to show people just as a fun little show off thing.
I want to impress my friends.
You still do that. And I say, look at all the people who in school encounter Arduinos and Raspberry Pis and projects on those.
And they actually, in their science fairs and projects, a number of the students actually go very far and create great things.
So it's easy for beginners to find this world and jump into it.
And young people, of course, can go much faster than even I can jumping into Raspberry Pis.
And it's interesting you bring this up, because in the 1980s, in the UK, where I was growing up, the BBC did this thing where they tried to teach everybody about computers.
It was called the Computer Literacy Project.
It was a series of TV programs. And the idea was, we need to bootstrap everybody up into computing because it's coming.
And part of that was programming.
Do you think, you know, that this kind of fashionable idea that everyone should know how to program is is right?
Should people know how to program computers?
Well, that BBC show, you said, was for everyone. It was available to everyone, but it wasn't, like, forced on them like a subject in class you have to take.
Now, the way I thought of it, I was in the Homebrew Computer Club, and nobody thought we'd have computers of our own.
And we thought it was a revolution. It was going to be affordable.
And I thought it out. Well, I'd been programming for years, luckily.
And I got great grades and experience in college.
I thought, if you have a computer and you have an idea or you have a puzzle or something needs solving, type the numbers in.
Here's the answer. I did it myself.
I didn't have to use somebody else's tool. I was independent. And I thought everyone, at least masses.
I sort of thought everyone will learn how to program and that will make them more powerful as people.
They won't have to give it over to someone else.
And we'll put out a tool that can let them program. And right away, oh, people started putting out programs where it's sort of done for you.
A tiny little word processor, a tiny little database or whatever. It's done for you, and games mainly.
So people were not writing games. They were just buying them prebuilt and running them.
So the masses in the world really weren't going to turn out to be programmers.
I think programming is a valuable skill to teach in schools because it involves thinking out a sequence of steps that lead to a solution.
And everything in life is like that. It's really good preparation. But I almost then wonder, should you force a grade on people?
Some people are meant to be that way.
They are that way in their heads. And some people are just some other way, maybe more artistic in certain ways, whether it's painting, music, whatever.
And so it should it be for everyone.
And I thought, what does society need? Does society need everyone to be a programmer?
Society needs some programmers. Oh, yes. Very important things in our life are going to come from that.
But it doesn't need 100 percent of the people.
How many? Five percent, 10 percent, whatever. Some small percentage.
So it should be thought of in career terms, as an option, with, you know, optional courses you would take.
And I'm thinking K-12 school primarily, because where you start is usually where you end.
And I was hoping that that would go for programming.
We had a lot of programming introduction in our early Apple II manuals.
You'd read page one, page two, and it would take you to some simple programs.
BASIC was perfect for that. I had never programmed in BASIC in my life, but I had to write it for the Apple II, because I knew the simplest language there could be, you know, wouldn't intimidate people.
And for a computer in the home, you need a simple language if you're going to program.
But the world turned; those first steps really turned into games. And you needed games in the home.
You weren't going to sell a computer into the home to keep track of your inventory levels and sales figures and salaries, like computers were used in the business world.
So there was a lot of, yeah, thinking, you know, about how the computers were really going to be used.
The whole general picture was in my head.
And pretty soon I gave up the idea that everyone should, of course, program, you know, but it certainly should be available.
And maybe just as an introduction. Here's what programming is about. That's not a bad idea, because some people discover, oh, this is what I want, with the Apple II computer.
It's the same thing as the Raspberry Pi. If you read the first pages in the book as a tool that you can discover what computers are and say, this is me.
This is what I want to do for my life. And maybe you'll turn out to be a CEO of a company someday.
I've run into a lot of them who started on the Apple II. And so discovering it matters; you know, schools in the past had absolutely nothing on computers, and there wasn't an easy way to discover it, except maybe a voluntary class taught by a programmer before school or after school.
And one of the things that I thought came through, we were just sort of chatting about this before this started, was there was a lot of fun in this, right?
There's a lot of fun in this building this machine and then programming this machine at the beginning.
Do you think the fun is still there in computing?
You know, I have to assume that people have a built in nature to want fun and find it in their own ways.
And, you know, whereas I mean, it was fun for me because I'm creating all these new things you wouldn't have believed existed.
I could even play pranks and tricks with things that other people didn't know about.
Today, I think people ask, how do you have fun?
Well, you go and become an influencer on Instagram or something.
That's their type of fun. Or just making your screens on Facebook and your presentations interesting, a little more artistic.
I think it leads you into creativity because you kind of want to be creative.
For us, when it was happening, though, it was like you're seeing a startling new world because the big computer companies had missed it.
They missed it.
It was little startups coming up from, you know, college kids and less. And that appealed to a lot of people because they saw themselves in that role a lot.
And, you know, one of the things is, I go back to my own experience, but I look at a lot of other startup companies.
You're doing just something that may or may not go.
It's risky. And you just believe in it. You've got some little lead on the world, maybe.
And you don't know if it's valuable. That's the most fun time in your life because you have to work so hard, so long.
It's so, so important what you're doing.
And every little new program that came out in those early days of the personal computers, a new program, a new thing that does this, a new type of disc.
Every single one of those was exciting. Every week there was something so exciting that you almost couldn't imagine.
So, and that doesn't happen as much anymore.
Of course it does happen, you know, with the big famous startups, be it search engines and Google, be it, you know, social networks and Facebook. They still come about. But I don't know.
It's not like every single week from these big companies' products. The big companies kind of own the world now.
What have they done?
That is so shocking, outstanding? Oh, Amazon rearranged the pages for finding what you want.
Well, big deal. You know, shuffling the deck chairs on the Titanic, or something like that, comes to mind.
I hope it's not the Titanic, but you know, one of the things that strikes me is that there was also a period where the number of people who were doing things with computing was really small.
And now it's kind of okay that everybody's using them and Amazon is rearranging their pages because it just reflects that computers kind of won.
Like they're everywhere. Well, there's a, then you get entangled with what really is a computer.
Sure. An iPhone has a big computer inside, but there's almost nothing on it that's a computer.
You might buy a game for a phone that just runs on the phone using its computer and graphics.
But almost everything tries to tie into Internet where other people have the control.
So there's a money, there's a money funneling process.
And I don't call that computers. I call them personal devices, handling your personal needs in life, your music, your friends, your reservations, that sort of thing.
And these are just personal things.
That's not what computers really are about. They're about programming the steps and steps and steps that create windows on a screen and ask questions and make decisions and all that.
That's more in the programmer realm and not very many people really get into that in personal life.
I think mostly it's just the ones who are good at math who wind up going in that direction by college and become the programmers.
Funny thing is, though, because digital and the Internet came and cell phones came, every single company in the world now, if they're making lamps, if they're making chairs, they need programmers, a lot of programmers, involved in everything.
So it's kind of one of those universal talents if you're good at it.
So be good at whatever you do.
All right. So it's changed radically from when you were designing the early Apple machines, when I was first getting into 8-bit stuff. What is it in technology today that excites you?
If it's not... you know, the iPhone is very cool, but it sounds like it's not a computer. What gets you excited?
Well, you know, a lot of the move towards electric cars and will they be self driving, artificial intelligence gets me excited.
The trouble is I also disdain it a lot.
I think deeply about it, and about making it think, like it's a brain.
I've really gone full swing around, and I really think we're not close to the brain.
We don't understand how the brain works. We try to make chips that can simulate a brain and do that sort of thing. Someday, well, what if we had a machine that could just learn, be conscious like a human? And I was at a company once, Fusion-io, where the engineers did figure out how to make a brain.
It takes nine months. And did they do it? Some of them did it.
And, you know, it was not silicon, I assume. Not silicon, no. I'm joking that we don't really know how to make a real brain, a machine that would say, what should I do today?
What is this thing I've never seen before? A human can kind of figure out where it must have come from, its history, and think so many ways in so many directions. Our artificial intelligence, what do we do? Well, we can train a Google computer to recognize a picture of a dog faster than any human, but it doesn't know what a dog is, and a one-year-old girl knows what a dog is.
So we have artificial intelligence for all these little specific categories, and they're all really good, better than a human, but we don't tie them together into the whole picture and the meanings.
We haven't figured out how to do that. I think that's a step for the future.
So you're both excited and concerned about this AI world. What is it you're excited about?
Well, I'm excited whenever I hear of a new technology that does more than you thought it could before. You know, maybe quantum computing, it's like it's running out of its excitement level, but maybe it'll come through and just amaze you with being able to do things we never could before.
I love technology, and it's not just what's in the latest program out, what's the latest computing product, the latest smartphone.
To me, technology always starts kind of with the atoms, the scientists, the physicists determining how to use different materials that lead over a long time, maybe a decade or more into better products than we had before.
And, you know, we always talk silicon, now we have Silicon Valley, but they're getting down to the single-atom level where they can analyze things, and, you know, maybe there are these little carbon nanotubes that operate at terahertz and all that.
And different technologies come out. So I like to read science, although I do have a little bit of thinking, wait a minute, is some science really meaningful for the future, or is some science a little bit overboard?
You know, I think a little of both. A little of both. What's an example of a science that's overboard?
Well, sometimes you hear, here is a new technique for making batteries, and almost never does it come into play when you hear things presented like that at an early stage, because they look at one facet, you know, maybe it's joules per ounce or joules per volume, some one little category. And one category can even be made less expensively, but that doesn't mean the whole product will be.
We had the same thing in programming, even in the days when I was there. It was like, a programmer has an idea, or apply this if you're a designer: if we put these two chips in, oh my gosh, we can do this one part ten times faster.
But it's a part that you only run one time in a hundred, so you're already down to, like, you know, one in a thousand, nothing that the human user would notice.
And maybe the money those two chips cost could actually get something better for a user, put into the user interface, something like that.
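The trade-off Wozniak sketches here is essentially Amdahl's law: speeding up a part that rarely runs barely moves the overall result. A minimal, hypothetical illustration (the function and the exact numbers are mine, not from the transcript, though the "ten times faster, one time in a hundred" figures echo his example):

```python
def overall_speedup(fraction: float, part_speedup: float) -> float:
    """Overall speedup when `fraction` of the work runs `part_speedup` times faster.

    This is Amdahl's law: 1 / ((1 - f) + f / s).
    """
    return 1.0 / ((1.0 - fraction) + fraction / part_speedup)

# Two extra chips make one part 10x faster, but it is only 1% of the work:
print(overall_speedup(0.01, 10.0))  # ~1.009, under 1% faster overall
```

So a dramatic local gain can be invisible to the user, which is why Wozniak argues the chip budget might be better spent on the interface.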
So given all of that, be the futurist now and say, 10 years out, 2030: what does it look like in terms of technology for the average Joe?
Well, one of the nice things about getting older is you realize how short life is. But how could we look back and say, 20 years ago, what would we have envisioned for the average Joe?
And, you know, 30, 40 years back, I look at one example, which was that the amount of memory that would hold one song cost close to a million dollars.
Would we ever get there?
And, you know, we got there. But it's very difficult. It's easy to make up stories and be a science fiction writer and be a futurist.
We're going to have everybody flying around in their cars, zooming, vertical takeoff and landing.
Every car is going to avoid every collision possible and drive itself.
You'll have no steering wheel in your car. But then sometimes the hardcore realities and physics of the world and real engineering needs keep you from getting there.
You kind of have to have a balance. So the future is something I never really liked to predict very much.
Obviously the great future in the world would be if we could create out of machinery, out of computers, the next species, the one that replaces us.
And maybe it takes care of us because we are the creators and we are the pets and we'll be the family pets and all of our needs will be taken care of.
We'll have a fun time and we'll have entertainment and play and food and clothing.
We'll have everything that your pet dog needs. So when I started thinking about that... I mean, we don't know when we're going to get there, because we haven't gotten down to a completeness of a brain or an understanding of how the brain is wired.
We don't even know that our memories are stored in our brain.
We just assume it for good reasons, but we don't know it.
And so are we ever going to get there? I think that's why, you know, you said decades out, decades out.
I think that really is the major goal because all these things would theoretically be possible.
But today, every time we try to apply artificial intelligence to doing things humans do with their heads, it never comes close to a human yet.
And if you just think about the flip side, which is the worries, what were the things you worried about over the life of Apple?
And then after Apple, were there things you were thinking, oh no, this is going to be dangerous if we go this way, or this is never going to work and never going to be able to happen?
Well, worries are emotions. And when I was 20 years old, I thought out how I was going to approach the world: really, you know, kind of scientific, going on evidence.
And I don't want to get really strained psychologically, going, whoa, whoa, whoa, you know. And I don't want to be up and down.
I want to have a nice, even course that lets me be productive in the long run.
So worries, I dropped out of my life. I came up with a formula. Life is about happiness.
It's not about achievement. It's not about how much money you make, how big a house you have.
Life, the way you measure yourself is how happy you are. And my formula was, you know what?
You don't want to ever argue with people. You don't wind up changing minds.
You don't wind up happy. You wind up frowning. And if your car gets dented, my father said, well, don't worry about blaming somebody for it.
Just take the constructive steps. You go get it fixed. And so I said, happiness is smiles, things that make you happy emotionally, minus frowns, things that make you sad.
So I don't want to look at part of the life. Oh, here's something I'm not really happy about that gave me grief of some sort, you know, problems.
And in the early Apple days, I don't know why.
Some magic was spinning out of my brain.
I was like Bob Dylan in his best 10 years, his early 10 years. And my head was on course to make the right decisions.
Like the one we discussed about the number of bits across the screen of the Apple II.
I knew that my mind was always making the right decisions based upon a full set of data that I had.
So why would I look back and say, oh, I made a wrong decision.
Maybe I should have done something else and it would have turned out better.
Well, that's just a sort of grief that is not a part of my life.
You don't look back. You look now and you look forward.
All right. Well, I'm going to ask you to look back on one thing, because there's a woman on our team called Beth who really wanted to know about Dancing with the Stars.
And I realize it's nothing to do with technology, but you seemed extremely happy doing that.
I watched it myself. How did that come about?
Yeah, I was unusual on Dancing with the Stars in the way that you mentioned.
Risk taking is a big part of our existence, you know, programmers, startup companies and all that.
I never took a risk as big as that one. I don't watch TV and I had never seen ballroom dancing in any form, live or on TV.
I had no idea what it was.
And they called my assistant, said, Dancing with the Stars, why don't you?
I said, what is that? A dancing show? They got the wrong person. No, no, no.
And I kept denying. And then I met the producers twice and they really, really believed in their art and they believed it was important.
And they wanted people like me who knew nothing about it.
And they would teach us and see how we did.
And so I said no again. A friend in my car said, Steve, they need someone like you.
And I thought he meant they need someone who stands up as a good member of society and a geek and all that.
And so I said, OK, I'll do it. And then I found out later that friend had never seen the show.
But no, I got in there. You know what? And it was so hard.
When your muscles have never moved that way in your life and you've never seen other people do it, the training, it was like six weeks of seven days a week, six hours a day.
Pain, pain from toes to hips. I couldn't even feel a broken toe that I got doing it.
I just couldn't feel it; everything felt that bad. And you work that hard, and you work for a few minutes, you know, 10 minutes, and you sweat, and wipe off the sweat.
Then you go out and try it again, try it again. It was so hard to learn even one little dance.
And then I moved down to L.A. near the studio and finally got to where I could actually go through the routine we planned for the first dance.
And I said, I am so lucky to be a geek on this show, a person like me being on one of the most wonderful shows, wonderful stage in the world and people and being a part of it.
And I just smiled. I got so happy. Everything I did in my life was so good, to get me here.
And I just was so happy. And I wasn't even worried thinking about my steps.
In the first dance, I saw all the other celebrities were thinking.
I could tell in their heads they didn't want to be disturbed.
They were thinking all their steps out. No, I just got happy. I looked straight at the audience.
I, you know, smiled at them and just said, this is going to be a fun thing in life.
And I took it that way. I was a little unusual that way.
Well, we're literally almost out of time. And it sounds like a wonderful place to end.
Because it sounds like your formula has been some combination of deliberately being happy and taking risks that ended up bringing you even more happiness.
So, Woz, thank you so much for doing this. Thank you for being part of our 10th anniversary celebration.
Thanks for being part of Fusion IO. And we needed it back in the day.
And, you know, good luck. I'll be watching out for your next appearance on Dancing with the Stars, I hope.
Thanks, John, for this honor.