Welcome back. You may recognize that we're in the campus usability lab, and I'm here with Lauren today, where we're going to spend an entire session on eye tracking. If you'll take a look at the screen we have before us, this is not a normal monitor. This is actually a fairly sophisticated and, frankly, fairly expensive eye tracker. Built into the monitor, above and below the display, are LEDs that bounce infrared light off of whatever they hit, in this case particularly the eyeball of the person sitting at the display, and cameras that receive that light. Extensive processing then goes on, through another computer as well, to track where exactly on the screen you're looking. We're going to go through one example a number of times to take a look at how we might use this for assessing usability, but also how we might use this if we were just exploring an existing site. A couple of words before we start. This type of eye tracker is fairly expensive, but before we reach the end of this lecture, we're going to talk about some cheaper alternatives that are already out there and that are becoming more reliable and more affordable for people who may want to use this technique. The second thing I'm going to say is that eye tracking has some really interesting properties. One of them is that people move their eyes in interesting and, in some cases, predictable ways, depending on what they're doing. A classic example of this is that if you give somebody a hard problem to think about, their eyes look up. I don't think we completely understand why, but we recognize this. And one of the results is that, as best we can, whenever we're doing eye tracking we try to have somebody in a comfortable position, in as natural a setting for carrying out the task as possible, because if we make things artificial, we may disrupt the pattern in which people look at things.
And so, when you've seen us in the lab before, we've done think-aloud protocols where we've asked people to tell us what they're doing and what decisions they're making. What you're going to see today is going to be a lot quieter. We're going to watch what's going on, and we're going to interpret through the lens of watching, not primarily through the lens of asking. So with that, we're going to start with a process known as calibration. And let's take it away with our calibration step. >> Okay, sounds good. Hi Lauren, I'm going to be working with you today, and I'm going to start by getting you calibrated on our eye-tracking monitor. It's going to be fairly simple from your side. I'd like you to begin by sitting comfortably, like you ordinarily would when using the computer. >> All right. >> With your hands comfortably at the keyboard, so we get the right distance away. >> Good. >> Is the display a comfortable distance from you when you're seated that way? >> Yes, it is. >> Okay, thanks. In that case, I'm going to make sure we've got you in the window. And it looks like I'm going to budge this down a little bit. >> All right. >> Just looking at some stuff on my side. And I'd like you to look at the monitor. >> All right. >> Okay, and I'd like you to lean in a little bit. Okay, there, we've got you. So I'd like you to sit in that general window; it doesn't have to be absolutely rigid, but that's about the right spot to start. >> All right. >> I'm going to show you a little blue dot. I'd like you to focus on the center of that blue dot whenever it comes to a stop. Are you ready? >> I'm ready. >> Okay, I've got these calibration results in, and I'm going to redo a couple of spots so that we get a little closer calibration. And again, I'm going to ask you to lean forward just a little bit, so we don't lose you. And again, I'm going to show you that dot. Okay, that looks like a great calibration. I'm going to go ahead and save that.
And whenever you select start recording, it will present the site that you want to test. >> Great, but wait one second. I'll introduce the task, and then we'll do that. So, just as a word for those who are following along, you'll notice that the system here was smart enough to recognize when it couldn't capture a good calibration at one part of the screen. It circled that spot on the screen and said, in effect, that's not working, go do it again. These systems are pretty intelligent. The task I'm going to ask you to do involves a great passion here in Minnesota: ice hockey. And there's no one we would rather beat in ice hockey than our rivals at the University of Wisconsin. As it turns out, I have advance information that, like every year, we're going to play the University of Wisconsin one weekend here in ice hockey. Your challenge is to see what the best tickets are that you can get for the Saturday night game against the University of Wisconsin. Just proceed as you normally would; we'll take you to the University of Minnesota sports site so you have a reasonable place to start. Take us through it, tell us when you're done, and we'll be recording as we go. >> Is it time? >> Let's do it. >> Okay. >> Let's start recording. >> Okay. In that case, I will work with the calibration we already have, and I will present that website to you shortly. Hopefully that will trigger, and I will take it full screen now. And it's under your control. >> Take it away. >> Okay, I think I have it. >> We will let you stop there. So, just by watching this in real time, we were able to see a number of things. We were able to see, for instance, that on that home page, wow, those boxed ads at the top attracted attention. We were able to see that those bold headings seemed to be effectively scannable.
We also saw some areas of frustration, which I'm not going to talk about until we come back to this later, because we're actually going to go through a couple of these, not all of them on camera, so that we can come back and show you a composite of what happens when we have a couple of different people who've gone through this task.
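Observations like "the boxed ads attracted attention" come from watching where fixations cluster. Offline, a common way to recover fixations from raw gaze samples is a dispersion-threshold filter (often called I-DT), followed by hit-testing each fixation against areas of interest such as the ad region. The sketch below is a minimal version of that idea; the thresholds and the AOI rectangle are made-up illustrative values, not numbers from this session or any particular tracker's software.

```python
# Minimal dispersion-threshold (I-DT-style) fixation filter, plus a check of
# which fixations land inside an area of interest (AOI). Thresholds and the
# AOI rectangle are illustrative assumptions only.

def _dispersion(window):
    """Dispersion of a gaze window: (max x - min x) + (max y - min y)."""
    xs = [p[0] for p in window]
    ys = [p[1] for p in window]
    return (max(xs) - min(xs)) + (max(ys) - min(ys))

def detect_fixations(samples, max_dispersion=35.0, min_samples=6):
    """samples: time-ordered (x, y) gaze points in screen pixels.
    Returns fixation centroids as (x, y) tuples."""
    fixations, i, n = [], 0, len(samples)
    while i + min_samples <= n:
        j = i + min_samples
        if _dispersion(samples[i:j]) <= max_dispersion:
            # Grow the window while dispersion stays under threshold.
            while j < n and _dispersion(samples[i:j + 1]) <= max_dispersion:
                j += 1
            window = samples[i:j]
            cx = sum(p[0] for p in window) / len(window)
            cy = sum(p[1] for p in window) / len(window)
            fixations.append((cx, cy))
            i = j
        else:
            i += 1
    return fixations

def in_aoi(fix, rect):
    """rect = (left, top, right, bottom), e.g. the boxed-ad region."""
    x, y = fix
    left, top, right, bottom = rect
    return left <= x <= right and top <= y <= bottom
```

Counting how many fixations (and how much total fixation time) fall in each AOI is one standard way to turn the real-time impression from this session into a comparable number across the several participants mentioned above.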