Chris Lintott – TRANSCRIPT
So let me tell you what cutting edge astronomy looked like not so long ago. It looked like this.
First of all, that’s just a sketch. Somebody stood at a telescope night after night, incidentally wearing a top hat, sketching what they could see. And secondly, they’re looking at a single object. In this case, it’s what was known as a spiral nebula. An island universe. A mysterious, massive cluster of stars that we’d now call a galaxy.
And back then, what we could know about the universe was limited by the amount of data that we could collect, and things just aren’t the same anymore. Modern astrophysics looks not like this, but like this. A picture of a million galaxies from the Sloan Digital Sky Survey. Each dot, an individual system of hundreds of billions of suns.
And if we want to understand the evolution of the universe, if we want to understand how we got this wonderful universe that we see around us, we need to study these galaxies. But there’s a problem. The problem is that we don’t really like the universe that we’ve ended up with. We’ve got 96% of the universe in a form that we don’t understand: dark matter and dark energy.
And so we need to pay closer attention to each one of these millions and billions of galaxies. We need to treat them not as points of light, but as individual spirals or ellipticals, because the shapes of the galaxies tell us about their history. So you can tell that this spiral galaxy has had a very different past from a big ball of stars that we call an elliptical. That’s what we set out to do. We set out to try and identify the shapes of the galaxies.
And that’s a task that humans are much better at than computers. We’re very good at this sort of pattern recognition task, and computers are really rather poor. We tried getting a student to look at a million galaxies, and I can tell you that, after about the first 50,000, they give up. And so we needed a new solution. We decided to call for help.
We set up a website called “Galaxy Zoo.” This is what it looks like today. And Galaxy Zoo asked everyone in the world to help us classify these galaxies, to say what shape they are. And on the first couple of days at Galaxy Zoo, we got this amazing response. We were doing 70,000 galaxy classifications every hour.
And while we haven’t continued at that speed, over time, we’ve done hundreds of millions of classifications from hundreds of thousands of people. And the even better news is that, taken together, those classifications are more accurate than those supplied by professional astronomers. The crowd does not make mistakes, and it has endless enthusiasm for this task of sorting through pictures of the universe. But actually that’s not the interesting part. Something else very interesting happens when you invite hundreds of thousands of people to take part in your research.
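Galaxy Zoo’s real pipeline weights classifiers by their track record, but the core idea is simple: pool many independent votes on the same galaxy into a consensus label. A minimal sketch of that pooling step (the function name and labels here are illustrative, not the project’s actual code) might look like this:

```python
from collections import Counter

def aggregate_classifications(votes):
    """Combine many independent classifications of one galaxy
    into a consensus label plus the fraction of votes it won.

    votes: list of labels, e.g. ["spiral", "spiral", "elliptical"]
    """
    if not votes:
        return None, 0.0
    counts = Counter(votes)
    label, n = counts.most_common(1)[0]
    return label, n / len(votes)

label, fraction = aggregate_classifications(
    ["spiral"] * 33 + ["elliptical"] * 7
)
# label == "spiral", fraction == 0.825
```

With dozens of votes per galaxy, the vote fraction doubles as a confidence score: galaxies where the crowd splits evenly can be flagged for closer inspection.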
What happens is that you have to remember that yes, there’s a crowd, but the crowd is made up of individuals. So here’s one of our classifiers. You may recognize him. This is Brian May from the rock band Queen, who’s actually a PhD astronomer, it should be said. And Brian, not right now, not in this image, but Brian is an avid Galaxy Zoo classifier.
He inspired many other people to come to Galaxy Zoo and join in. He inspired in particular this woman. This is Hanny van Arkel, who’s a teacher in the Netherlands. As you can see, she’s got a guitar that’s just like Brian’s. Anything that he did, she was going to do.
And so she took up Galaxy Zoo too, and she became the first person in history to pay attention to this object. And this illustrates the ability of humans to get distracted from what they are supposed to be doing. What we wanted Hanny to do was classify the galaxy. It’s a warped spiral. A rather beautiful one.
But of course like you, she wanted to know what the blob underneath was and she called it a “Voorwerp”, which I believe is Dutch for “thingy”, a term which I’m proud to say we’ve worked into the scientific literature. Because this object is literally one in a million. We didn’t know it was there. It’s a gas cloud, a galaxy sized gas cloud, hanging in space, energized by a jet, driven by a black hole in that neighboring galaxy, which has recently shifted from being an active black hole to a quiet one. It’s a process that we knew happened, but never before have we managed to catch a galaxy in the act, catch it in the act of this transformation.
But because we had so many eyes and we could pay individual attention to each image, we were able to catch this and follow it up. We also discovered that, as astronomers, we’re not the only people struggling with this data flood, with having more information than we knew what to do with. This is a problem that confronts scientists in field, after field, after field. And so over the last few years, we’ve helped researchers not just look at galaxies, but provide the most accurate forecasts of whether solar flares are going to hit the Earth with a project called “Solar Stormwatch.” We’ve transcribed more than a million log book pages from World War I Royal Navy logs to help climate scientists understand the weather of the past, and thus predict the weather of the future.
We have a side project, trying to find out whether whales have accents. And we’re also helping cancer scientists in the UK do pathology by classifying images, like the one on the screen. Each of these projects makes use of the time and the pattern recognition abilities of hundreds of thousands of people. It uses them collectively to accelerate science. We’ve collected all of these projects together in what we call the “Zooniverse”, a platform for this kind of citizen science, a place where people can sit in front of their web browsers and, within a few minutes, see something that no one has ever seen before, and, more importantly, make an authentic contribution to science.
This is science education, but it’s also the cutting edge. People actually get to help us. Even that’s not the exciting part. Magical things happen once you convince people that they too can take part in science, once you convince them that they have the ability to contribute. And to illustrate this, I’d like to talk about a project called “Planet Hunters.”
So Planet Hunters is a project that’s been running for a few years now, and we thought we’d try and discover planets around other stars. This is the most exciting area of astronomy right now. We thought there might be a few planets left over for our citizen scientists to catch. Now whenever you go hunting planets, you can’t see the planets directly. What you have to do instead is look for the indirect signs that they’re there.
And one of the most effective methods is to look for what we call “transits.” A planet transiting in front of its star will cause the star to dip in brightness by much less than 1%. But if you can catch that dip, then you can infer that the planet is there. It’s a rather simple case of dark matter, if you like. You can infer that this planet is there from the data.
And we have data from NASA’s Kepler satellite, which is staring — actually Kepler, I feel rather sorry for it, it’s got the most boring job in the world, because it just stares at 150,000 stars, and every 29 minutes tells us how bright they are. It can send that data down here, and we can look for these transits. And sometimes these transits are really obvious (beeping). In fact, in this case, they’re so obvious that you can hear them. That modulation is the dip at the repeated transit of a planet orbiting its star in just a few days.
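The transit search described above can be boiled down to its simplest form: scan a star’s brightness measurements for samples that dip below the ordinary level. Real searches fit box-shaped transit models and fold the data on trial orbital periods, so this is only a toy sketch of the idea (the function and the light curve are invented for illustration):

```python
def find_transits(brightness, threshold=0.995):
    """Flag samples where a star's relative brightness dips
    below a threshold -- the signature of a possible transit.

    brightness: flux values normalised so the star's
    ordinary level is 1.0; a transit is a dip of well
    under 1%, so the default cutoff is 0.5% below normal.
    """
    return [i for i, flux in enumerate(brightness)
            if flux < threshold]

# A toy light curve: flat at 1.0, with a 0.8% dip at samples 4-5.
light_curve = [1.0, 1.0, 1.0, 1.0, 0.992, 0.992, 1.0, 1.0]
print(find_transits(light_curve))  # -> [4, 5]
```

The hard cases are exactly the ones this misses: dips comparable to the noise, or transits with odd shapes and timings — which is where human eyes, and Planet Hunters, come in.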
But sometimes the transits aren’t so obvious. This is what it might just look like to be on a moon, orbiting a Neptune-sized world, now known as Planet Hunters 1b. This was a planet discovered by a group of citizen scientists using our project. And Planet Hunters 1b is currently a unique world. It’s the only world that we know of that has 4 suns in its sky.
So this is a system with 2 pairs of stars, and a planet orbits one of those pairs. It’s a planet that simulations tell us shouldn’t exist. So we didn’t even know to look for this and yet there it is. And it’s there because a group of volunteers saw something strange in the data and they didn’t even tell us about it. They went out, they got more data from the professional NASA website.
They did their own analysis, and they came to us when they’d worked out what was going on. These were people who did not start off as scientists, who’d gone through the engine of motivation, being convinced that they could do something real by taking part in the main project, and then collaborated together to make a truly gobsmacking discovery. And then we made the nice image, so that they could try and imagine what that planet was like. This is real citizen science. And so we’re at a stage now, where we’re running more than 20 projects, where we have humans and robots collaborating together.
We have the robotics systems that take the data, like poor Kepler up in space, and then we have the humans down here on Earth, who are able to analyze the data. So humans and robots are able to work together in harmony. Except that it never works like that. We’ve all seen the Sci-Fi movies. We all know that the robots always go wrong in the end, and this is the vision of my nightmare.
This is the future that we’re heading towards, except it doesn’t quite look like this, it looks more like this. Because this particular machine that’s going to run roughshod over our classifiers is the Large Synoptic Survey Telescope. This is a telescope being built right now. It’s going to be on a mountaintop in Chile. It’s as big as the biggest telescopes in the world right now.
But it’s a survey telescope. It’s going to scan the whole sky once every 3 nights. And it’s no exaggeration to say, it’s going to provide a movie of the universe. We’re going to see all the asteroids whizzing around the solar system. We’re going to see all the stars flickering as star spots cross their surfaces.
We’re going to see the planets, we’ll catch those transits. And we’re going to see the centers of galaxies flicker, as material falls down into the black holes that lurk within their centers. But to cope with LSST is going to be a challenge. It’s going to produce roughly 30 terabytes of reduced data each and every night. If all we care about is things that change in the universe — let’s simplify the problem: a conservative estimate is that we’re going to have half a million alerts, every single night.
Even if all of you, even if everyone watching, even if we all commit to doing Galaxy Zoo and its friends, we’re not going to cope with this. So we need new solutions to combine the machine with the human. But luckily, there are things we can do. The first thing we can do is we can just be smarter about using people’s attention. We could pay attention to what it is that machines can do.
Maybe they can classify the routine events. We can let them do that and we can give only those things that really need human attention to the humans. The other thing we could do is that we can educate our volunteers. We can use the power of the Zooniverse, the power of citizen science to train up a generation of volunteers, so that they’re able to take on the task of dealing with the truly unusual things that LSST is able to provide us. And that’s what we’re trying to do.
You can see here one of my colleagues, fighting off school children as they rush to classify, hoping to discover their own planet. And so the message I want to leave you with is that there is a solution to this problem of big data. There is a solution to this problem of having more data about the universe than we can possibly use. The solution is to invite hundreds of thousands, millions of people to come along and join us in our scientific adventures. And if we can find a way to do that so that they feel they’re making a real contribution, we not only educate these volunteers, but we can change their lives, and change their attitudes towards science as well.
Thank you very much.