Cloudflare TV

*APAC Heritage Month* This is Your Tech Leader Speaking

Presented by Gretchen Scott, Dr. Linda McIver
Originally aired on 

Technologists are dependent on the quality of their education. What is the state of data science education in Australia, and what can we do to make it better?

Come along to discuss the current situation of data science education with Dr McIver and look at what we should start to question.

English
APAC Heritage Month
Interview

Transcript (Beta)

Hi there, I'm Gretchen. Thanks for joining us today for this session of This is Your Tech Leader Speaking.

We're running a series of talks with local APAC industry leaders and have been lucky enough to curate a guest list that's filled with technologists who have leadership and just the broadest range of knowledge possible.

So today's guest is Dr. Linda McIver. She has a PhD in computer science education and extensive experience teaching STEM, data science and computational science at both secondary and tertiary levels.

After pioneering authentic data science and computational science projects with real impact at the John Monash Science School in Australia, she started her own Australian Data Science Education Institute in 2017.

She did this so she could ensure that all Australian students had the opportunity to learn STEM and data science skills in the context of projects that empowered them to actually solve real problems, I guess, in the community.

And it wasn't just toy playing. So she's actually got a new book coming out very shortly called Raising Heretics.

And it's aimed at empowering our kids to save the world.

Linda is an inspiring keynote speaker. I always go to her talks and hassle her from the side.

It's true. She's appeared on ABC's panel program Q&A.

And it's well worth Googling and looking up, because I love her questions and answers on there.

She regularly delivers super engaging professional development programs for primary, secondary and tertiary educators across all disciplines.

And the feedback she gets from these is unbelievable.

So Linda, welcome. And it's so great to see you this morning.

Thanks so much, Gretchen. It's awesome to be here. I'd like to, I mean, we've talked back in the green room, there is so much to discuss in this space.

And you are so knowledgeable. I'd really like to dive right in. I know you're passionate about data science education.

But surely, here in Australia, education is pretty good.

Aren't the kids already learning STEM and data science? That question kind of makes me want to cry.

Because, you know, on paper, yeah, they are.

Our schools are doing a huge push for STEM and there's data all through the curriculum.

But they're absolutely divorced from context. More often than not, the STEM programs that we have are people coming in and doing robotics for a day, or schools are very excited because they have these wonderful STEM centers with laser cutters and 3D printers.

And, you know, they might have some soldering irons and things.

They're like, look, we're doing STEM. What they're doing, more often than not, is buying the toys before they've actually figured out what to do with them.

We have this idea that toys are STEM. And if you do a one-day robotics program, you can say, tick, I've done STEM and we're good.

And now we can move on. The thing is, that's not STEM. It's playing with toys.

Right, expensive toys. And so, you know, the STEM toy industry is going great guns and doing wonderful things.

But in terms of education, the kids are not understanding what STEM is actually for.

And if you think of STEM, you know, you've got your science, your technology, your engineering, and your maths.

Those are for problem solving. Those are for understanding the world.

Those are for actually making change and fixing things in the world.

But that's not how we use them in education. So think about the way we do science.

We take known inputs and we apply a known process and we get a known output.

And the kids who don't get the right output, what do they do? They fake it until they do, or they copy their neighbor's results.

What we're teaching here is not science.

We're teaching confirmation bias. We're teaching them to get the outcome they expect to get.

I think I did that in my school chemistry. Everybody did.

Like, it's normal. It's the standard approach. It's funny, when I say that in talks, you see this kind of guilty ripple go through the room as everyone's like, oh yeah, I did that.

It's normal, because the marks are for getting the results that you're supposed to get, for having the graph right.

And for, you know, writing down, this happened because this process that we know did this thing that we know about.

And also, you know, why would you care?

You know, the kids who are into science are fine if a little bored, but the kids who aren't into science are not going to get into science on the basis of this kind of tedious cookie cutter process.

And it's the same with STEM.

When you bring in tech toys, everyone's like, oh, everyone likes to play with toys.

But no two people like to play with exactly the same toys. And so one of the problems with these STEM toys is you bring in robots and everyone's like, oh, robots, they're great.

And they're really exciting. And everyone loves robots.

No, everyone doesn't love robots. The kids who don't love robots, what they learn is robots are just as tedious as they thought.

They also don't work nearly as well as they expected.

And they can't do any of the things they wanted to do with them.

And look, I was right. I can't do STEM, which is catastrophic. It's exactly the wrong result.

It's exactly what we're trying to avoid. We're trying to engage kids with STEM.

And what we're teaching them is they really can't do it just like they thought.

Oh, sorry. That's really quite... I've depressed you. There's a good bit, though.

That's the positive bit, which is that I figured out, when I was teaching at John Monash, that you could actually teach using data sets that haven't been analyzed yet.

And goodness knows they're easy enough to find.

And you can have the kids doing problems where they don't know what the answer should be, where you don't know what the answer should be.

How do you mark that, then, if you're a teacher?

That's the exciting part. You're actually marking the process.

You're marking the kids on how they went about finding their question, how they went about answering their question, and how they went about communicating that answer.

But most importantly, how they went about validating that their answer was actually correct.

Right?

That's the thing. That's where it all gets really exciting, because we don't do that.

We don't do that in business. We don't do that in government. We certainly don't do it in education.

We do something new. We don't stop to see if it worked. We're on to the next thing.

Imagine if, by default, we teach kids that every time they try something, they check to see if it worked.

And they check to see how well it worked.

And they check to see who it didn't work for. Imagine if we did that in the tech industry.

Stop it, Linda. Right? A lot of things would change. So, that's the really exciting part.

And that took me a couple of years to get to, when I started doing this work.

I started doing it because I'd found a way to get more kids into tech, and more kids into STEM, and more kids to realize that even if STEM isn't something they want to do long-term, these are skills that they can gain, and skills that they can use to actually do real things.

And that's always been the central driver.

Doing real stuff is really motivating. So, that's why...

So, we started off saying how I thought that our STEM education system was doing fine, right?

But I've got a child, early high school, and one who's finished high school.

And what you hear is, we're doing STEM. We've done STEM. We did the STEM stuff.

We had robots. We had Spheros. Someone came in for a day and taught us some fun stuff.

But you've talked about project-based learning. What does that look like?

And why do you think it's better than toys? So, what I discovered when I started teaching data science is, I was working with a group of year 10s who were science nerds, right?

So, you would think that teaching them tech would be easy.

They'd be right into it. But we were teaching them with toys. And they were drawing pretty pictures.

They were playing with robots, blah, blah, blah.

And the overwhelming feedback from that subject was, why are you making me do this?

It's not relevant to me. I don't want to do it. I'm never going to use it. And you could tell them until you were blue in the face that programming skills are programming skills.

It doesn't matter which language you learn. It's going to stay, you know, blah, blah, blah.

They were like, I'm never going to want to program a robot to push another one out of a circle.

This is pointless. But as soon as we started doing the same programming skills in the context of real data sets, their attitude completely changed.

As well as those programming skills, we're now teaching some data literacy, right?

So, you're looking at a graph and saying, what's wrong with this graph?

Is it misleading in any way? You know, where's the zero on that scale?

All those kind of questions. But also, they're learning to communicate their results.

So, they're learning to make graphs that are not just technically correct, but they're actually meaningful and communicating in a compelling way, but also a valid way.
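(A quick aside for readers: here's a minimal sketch of the "where's the zero on that scale?" exercise, in Python with matplotlib. The numbers, labels and variable names are made up for illustration; the transcript doesn't specify which tools or data the class actually used.)

```python
# Two views of the same made-up data: one with a truncated y-axis,
# one with zero on the scale.
import matplotlib.pyplot as plt

years = [2018, 2019, 2020, 2021]
enrolments = [980, 990, 1000, 1010]  # about a 3% rise overall

fig, (ax_trunc, ax_zero) = plt.subplots(1, 2, figsize=(10, 4))

# Left: the axis starts just below the data, so a small change looks dramatic.
ax_trunc.plot(years, enrolments, marker="o")
ax_trunc.set_ylim(975, 1015)
ax_trunc.set_title("No zero on the scale: looks like a boom")

# Right: the axis starts at zero, showing the change in proportion.
ax_zero.plot(years, enrolments, marker="o")
ax_zero.set_ylim(0, 1100)
ax_zero.set_title("Zero on the scale: a modest rise")

for ax in (ax_trunc, ax_zero):
    ax.set_xlabel("Year")
    ax.set_ylabel("Enrolments")
    ax.set_xticks(years)

plt.tight_layout()
plt.show()
```

Both panels plot identical data; only the y-axis changes the story, which is exactly the kind of question the students learned to ask of graphs in the newspaper.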

That's really important. And suddenly, they were saying, oh, this is so useful.

And it's so important. And I'm using it for my science, you know, project.

And I'm using it over here. And I'm using it over there.

And I saw this graph in the newspaper the other day. And oh, my God, you know, it didn't have a zero on the scale.

And it was so misleading.

And, you know, they were suddenly applying it to all of the things in their life that they were doing.

And they could see the point. And they could see the meaning.

Not only that, they were, yeah, they were super engaged. And they learned that these were skills they could actually use to make change, to solve problems in their own communities.

And that's the essence of STEM, right? That's what it's for.

Yeah. You're right. I hadn't thought of it that way, actually. You're right.

The reason we have STEM is to make the world a better place, right? To solve the tricky problems.

So, if we're not teaching it in that way, why are we expecting people to implement it that way?

It's a pretty big gap there. Right. Yeah. It's huge.

And the thing that I found when I started doing this problem-based learning stuff: not only were they more engaged, they could see the point, because the point was built in.

I can't tell you how many times my eldest came home when she was studying math methods.

And she'd come home just really frustrated and go, Mom, I'm learning surds and they don't make any sense.

When am I ever going to use this rubbish?

And I would be like, you're probably not.

I can't lie to you. Right? It's very difficult to engage kids with things that they don't see the point of learning.

They're learning to tick a point off on the curriculum.

They could not give a rodent's backside about the curriculum. They just want to learn things that are real and meaningful to them.

So, as soon as you're doing problem-based learning, the meaning is obvious.

It's right there. And you can see why you're learning to do this and the point of it and, you know, what you might use it for later on.

And the interesting thing that I hadn't expected was that in the Year 11 subject, which was elective, I suddenly had a huge jump in students choosing to do the Year 11 elective, and I had a lot more girls.

But it wasn't just the girls. Like, the girls were the obvious, the easily measurable part, but I got a lot more boys as well.

And that's when it really hit me that we're not only excluding girls from tech, we're also excluding the boys who don't see themselves as represented, who don't see themselves as Sheldon from the Big Bang Theory.

The boys who don't get it. I think it's almost worse for the boys when they think they can't code, because they see the other boys can code.

And so there's this kind of feedback loop that says you really don't belong here, because you should be amazing.

Right? Because we know girls can't code. I mean, obviously, I want to kill somebody for that.

But the boys think they're supposed to be able to code.

So when we do these tech things where we bring in toys that they're supposed to find fun, and they don't, and we teach them code that they find difficult, and they see other people who've been coding since they were pretty much in the womb, you know, flying through, you get this double whammy of imposter syndrome that strikes the boys as well as the girls.

And so we're excluding a whole range of personality types and background experiences from tech, by doing this stuff that they're supposed to find fun and don't.

It just reaffirms for them. I told you I couldn't do STEM.

I told you it wasn't for me. Like, I suck at tech. I don't belong here. It was totally reinforced.

And then they back away super fast. And that's who we need here.

Right? Yeah. We need all the others. You mentioned using real data sets?

Have you got an example of those? And where do you find them? I know you've said you can find them.

Do you just Google "data set"? Well, that is part of the problem. One of the reasons I started my charity is that the only reason I could do this work when I was a teacher is because I was half time.

So I was spending all my own time finding the data sets, making sense of them, building projects around them, and then putting them into the class.

You can't do that if you're a full time teacher, it's just not on.

And also, I had a PhD in computer science.

So I had a bit of a leg up in making all of that work. Most tech teachers, most teachers don't have a PhD in computer science.

So you know, I felt like I needed to cut through a lot of that drama.

So part of the purpose of ADSEI is to provide lesson plans and project plans and data sets that have already been found and already been made sense of, so that you don't have to do all of that legwork.

We've worked with data about micro bats, which are teeny tiny little bats about this big, super cute and kind of weird looking.

That was collected by researchers in Melbourne, at Ivanhoe and Organ Pipes National Park.

I didn't even know we had micro bats.

I know, right? It was so exciting. And it was just someone I met through some science thing that my kids did.

And I started that connection and grabbed all the data and started to run with it.

We've worked with electoral commission data.

So one of the first projects I did with my year 10s was we downloaded a file that contained every vote for the Senate for the federal election in Victoria.

So it was over three million lines of basically a spreadsheet file, a CSV file.

And the kids couldn't even open a file that large in Excel. So they had to code.

I like that. Yeah, it was like, oh, sorry, you're going to have to code.

Um, yeah, it was awesome. And they could see now why they had to code, because they weren't just trying to draw pretty pictures, they were actually trying to extract information.

And the great part about that was it actually had the entire ballot paper.

So every line in the CSV was one vote. And you had the contents of every box on the ballot paper, which meant that the kids could find their own questions.

So there was no problem now with plagiarism, because every kid had a different question, asking things like, you know, how was my own electorate different from the whole of Victoria?

Or how was my particular polling booth different from my electorate?

One kid looked at the proportion of votes that women below the line got versus men.

And, you know, compared to how many women there were below the line.

They looked at which parties' voters followed the how-to-vote cards, and all this kind of stuff.

So they all had a different question.

They used code to find the answer. And then they visualised it. It was great.
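(For the curious, here's roughly what that kind of analysis can look like: a minimal sketch in Python with pandas, answering "how was my electorate different from the whole of Victoria?" by first-preference vote share. The file name and column names, senate_votes_vic.csv, Division and FirstPrefParty, are hypothetical stand-ins; the real AEC file has a different layout, and the transcript doesn't say exactly which tools the class used.)

```python
import pandas as pd

MY_DIVISION = "Melbourne"  # hypothetical electorate of interest

state_counts = pd.Series(dtype="float64")
mine_counts = pd.Series(dtype="float64")

# Read the ~3-million-row CSV in chunks: a file this size is exactly
# what Excel refuses to open, which is why the students had to code.
for chunk in pd.read_csv("senate_votes_vic.csv", chunksize=100_000):
    # Tally first-preference votes per party, statewide...
    state_counts = state_counts.add(
        chunk["FirstPrefParty"].value_counts(), fill_value=0)
    # ...and again for just my electorate.
    mine = chunk.loc[chunk["Division"] == MY_DIVISION, "FirstPrefParty"]
    mine_counts = mine_counts.add(mine.value_counts(), fill_value=0)

# Compare vote shares: my electorate versus the whole state.
comparison = pd.DataFrame({
    "my_electorate_%": 100 * mine_counts / mine_counts.sum(),
    "victoria_%": 100 * state_counts / state_counts.sum(),
})
print(comparison.sort_values("victoria_%", ascending=False).head(10).round(1))
```

Swap in a different division, or a different question entirely, and every student ends up with their own project, which is part of what made plagiarism a non-issue.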

So was that intentional, to have a data set so large that they were forced to do it?

Or is that one of those accidental learnings where you went?

It was. I knew they couldn't open it. And I had intended to teach them code.

But it wasn't like I went looking for a data set that was too big to open in Excel. That was just a happy accident: oh, gosh, you can't do this in Excel, even if you wanted to.

Sorry. And so you did all this work for free in your spare time. But then you went, okay, everyone should have access to this, right?

And you have a very unique skill set, because I'm not sure I would have made any sense of that.

I think I'm quite technically literate, but I would not have been able to take that data set and do something of use with it and deliver it to a year 11 class and say, hey, let's do some fabulous things.

So is that what led you to create ADSEI?

Yeah, so it was really twofold. One was to, as I say, find and make sense of the data sets and then share those with people so that they could just take advantage of that work and run with it.

And build the lesson and project plans around them.

But also to start skilling up teachers so that they can think in this way.

The thing is, most of it is not actually about technical skills. Most of it is really about critical thinking skills.

So the first thing that we do when we look at a data set is ask: what's wrong with it?

How is this data not the data that we think it is, not the data that we want it to be?

Who's missing from this data?

How, you know, where are the gaps? What are the flaws? And that's something we don't do in the real world.

We don't do it in the data science industry.

We don't do it in the tech industry. We don't do it in government. Like we need to be asking that by default.

And questions like, who does this data harm?

That's a good question to be asking, I think.

And we have this really amazing belief and trust in numbers.

Why is that, do you think? There's a researcher from Canada called Luke Stark who calls it the charisma of numbers.

And it seems to be something, I don't know whether it's built in or trained, but we do, we see numbers and graphs and we tend to kind of bend at the knees when actually we should be asking really, really difficult questions about the data sets.

You know, what was the sample size?

How did you collect that data? You know, was there some sampling bias in that?

You know, did you only collect the data from people who are willing to answer their phones and talk to researchers?

Because I suspect that's a very specific group of people that is shrinking all the time.

You know, but if you present numbers or you present the outcome of a computer program, people tend to put way too much faith in that.

I mean, just look at the ATAR, the Australian Tertiary Admission Rank.

We're told that it effectively ranks students according to their ability and knowledge and things like that.

Does it? It really doesn't. It's a very flawed measure and it's a single point in time.

You know, there's all kinds of issues with it, but we place a lot of weight on it and give it far more importance than it actually deserves.

And every time we try to reduce people to a number or a rank or, you know, try to organize people with computer systems, it tends to go horribly wrong.

Look at Robodebt.

Yeah, Australia has had some really catastrophic examples of how we've done this, right?

And I'd just like to say, for people not in Australia, the ATAR is our ranking system for your final two years of schooling.

And it's this weird combination of weighting some subjects and not others, and then putting it all together in this magic mix and going, hey, you sit in, I don't know, maybe the 75th percentile of the other students that did this on that day.

And that's, I don't know, there is a lot wrong with it, but that's how we assess university entrance and what course you get into.

Although we do that based on a supply-and-demand model that is perplexing.

I think there was a lot of thought that went into how we took the real world and put it into numbers.

But then we do the maths in the middle, and the computing and all this stuff.

And when we pop it back out the other side, it's like we just forgot about the first bit, where we were just aggregating things randomly, really.

And we don't call that out.

Also, one of the issues is, and this has been well researched, that as soon as you measure something, the measure becomes a metric, and you use it to gauge how well you're doing at the thing.

And then the metric becomes the output that you're aiming for, rather than what you were actually trying to measure in the first place.

For example, the other day, I nearly cried because I heard a teacher talking about literacy and she's doing wonderful work in the literacy space.

And she said, our goal is to improve the NAPLAN results and improve the VCE results.

And I was like, no, no, no, that's a measure.

That's not the goal. The goal is to improve literacy. But what happens is that, rather than literacy being the target, NAPLAN becomes the target.

And that's what's happening. That's what happens in the education system as a whole.

Take medicine, for example: think about how doctors are selected for their courses.

Some universities are moving to an interview system, which is wonderful.

But basically, they select entrants to the medical degree by how well they did on exams, and then they get through the course and they're measured and ranked by how well they do on exams.

And I don't give a flying rodent's backside how my doctor did on exams.

I care how well they can diagnose me, how well they can communicate with me, how well they can listen, because that's a really rare skill.

And, and, you know, how, how well they can actually help me. And none of those things are very easily measured by exams.

They're quite difficult to measure anyway, especially because medicine has such a practical element to it.

And you're right, we go through and say, you're good at filling out forms and taking tests.

Therefore, we revere you.

And not just in Australia, I think we teach to the test quite well.

Well, we have to, like the education system has to do that because that's how it's measured.

And it is quite punitively handled if it doesn't, you know, if the schools don't meet the targets, then the schools get punished.

And it's, it's a vicious cycle. But that's not the outcome that we really want for our kids.

No. So then, because you're a doer and a changer and an influencer, you've experienced all this, you started your own foundation to be the change.

But you're also in the process of writing a book.

And it's called Raising Heretics, which I just, oh my gosh, I love that just to begin.

What can I expect to learn from your book when it's out in the ether? Yeah.

So the book's goal is really to lay out the case for a radical change in the education system.

And as I said, when I started doing this, I thought that what I was trying to do was get kids into STEM.

But actually, it's so much bigger than that.

What we're really teaching is critical thinking skills. We're teaching kids to question and challenge their own results, as well as the results of others, and to question and challenge the orthodoxy.

For example, one piece of orthodoxy that sorely needed challenging was the idea that COVID-19 wasn't spread by aerosols, which we now know it is.

But, long story, I won't go into it because we'll be here all day.

But the scientific orthodoxy was that it can't be spread by aerosols.

And that actually has directly led to hundreds of thousands, if not millions of deaths, and problems with policy that are still not being fixed.

For example, hotel quarantine. One of the big reasons hotel quarantine doesn't work is because COVID-19 is spread by aerosols.

So somebody needed to stand up and go, you're wrong about the aerosol thing.

And doctors did, and researchers did.

But it took time to break through the wall of "this is the orthodoxy."

We've always done it this way. Right. And the thing is that we need more and more people.

We need everybody really trained to question the orthodoxy and to go, well, this is what we believe, but how can we be sure?

And how can we test it?

And if we ask that by default, and if we ask that all the time of everything, then we get to the point where we can change things more easily, and we can improve things and we can solve problems.

So that's really what the essence of the book is about.

It's about what would the world look like if it were evidence-based, rather than the kind of weird ideological mishmash that we have now?

And how do we get there? Because it's interesting, there's a part of me that thought and still kind of does hold on to this idea that we are evidence-based.

But when you talk about the aerosol spread, I think I read a similar paper to you, and it literally was because someone decided a number was the number to represent something and held on.

They just held on to it. There was no research; they just believed it because someone said it.

It was taken as the whole truth and never questioned again.

And the way they sorted that out was the researcher went, well, where did that number come from?

And it took them some time, but they managed to drill down and find where the number came from and figure out that it was actually being used completely incorrectly.

That stuff happens all the time.

And, you know, when I wrote the medical section, the "what would the world look like if it were evidence-based" part, it was really horrifying, because our medical system is not nearly as based on evidence as we'd like it to be.

A lot of it is based on historical, you know, the way we've always done things.

And a lot of it is based on really sheer bloody-minded arrogance.

And when you actually look at the number of things that have happened in the medical industry that have no evidence behind them and that weren't picked up, that weren't measured, weren't tracked, weren't evaluated...

So then we come back to that idea that imagine if we evaluated everything by default, you know, if we always evaluated things to see how well they work, we'd pick up on things a lot quicker.

We, you know, it would make a radically different world.

Okay, so we've got time for one final question, Linda. What do you think the biggest myth in tech education is?

I'm going to cheat with the premise of the question and take two.

First is the one that we've already talked about, which is that toys will engage kids in tech.

Toys, for the most part, engage the kids we've already engaged in tech.

They don't engage the others.

They're just dispiriting for the reasons we've already discussed. But the other big reason is that some people can do tech and some people can't.

And that drives me wild.

Anyone can do tech. Anyone can do tech. Part of the reason that we have this myth that people can't do tech is because tech has been designed by a very narrow type of person for a very narrow type of person, which is quite othering and excluding for those of us who don't fit that mold.

But the other reason is that we keep saying that only some people can do tech.

And so we believe it. We just have to stop saying that.

Well, it's a form of gatekeeping. It's nice to know things that other people don't know and to feel like you're special because you can do tech.

I get very popular when someone's phone breaks. You must get the same thing.

It's like, my friend Gretchen's in tech. She could help me fix this problem.

People think you're amazing and it's nice to have that feeling.

So why would we want to lose that? But also, I think it's an orthodoxy now and we have to smash the orthodoxy.

We have to start questioning it. There's a piece around the questioning.

We have to question, but we have to be okay with being questioned as well in this space.

Yeah, which is why it's so important that the kids learn to question their own results as well as those of others, because it's very easy to question other people.

It's very easy to criticize other people.

But to actually question and criticize your own work is much, much harder and a much more important skill.

And to have other people do it to you as well and not take it as an angsty personal insult.

Oh my gosh, I feel like we're going to have to do a whole other session sometime and explore all of that.

I cannot wait to see your book when it comes out and I'm going to share it in all the social medias all over the world because I think it's such an important read and I've been lucky enough to have a little glimpse of some of it on the way through.

So keep an eye out on all the social media stuff via us, because I'll be spruiking this far and wide.

But other than that, Linda, thank you so much for coming on the show this morning with me.

I've learned a lot and I'm a bit heartbroken about the education system, but we're changing it.

Hey, we'd love everyone to join us next week on Wednesday when I'm actually going to talk with one of Australia's leading HR specialists in technology, because that's wild now.

Thanks so much.
