
The K12 Engineering Education Podcast


The Future of Assessments in Engineering Design


Description

Now that states have written engineering into K-12 science and technology learning standards, more institutions are thinking about how to run standardized tests in the subject. The educational assessments organization ETS is one institution interested in an in-depth, fair, and large-scale assessment for engineering learning. Deb Brockway, Senior Research Associate at ETS, talks about some of her work in this area, especially in engineering design. She describes a current research project to assess students as they work together designing in a virtual environment.

Our closing music is “Yes And” by Steve Combs, used under a Creative Commons Attribution License.

Subscribe and leave episode reviews wherever you get your podcasts. Support Pios Labs with regular donations on Patreon or by buying a copy of the reference book Engineer's Guide to Improv and Art Games by Pius Wong. You'll also be supporting educational tools and projects like Chordinates! or The Calculator Gator. Thanks to our donors and listeners for making the show possible. The K12 Engineering Education Podcast is a production of Pios Labs.

Transcript

Pius Wong  0:00 

When a student is learning engineering design, what's the best way to test them on their learning? Let's hear some ideas on The K12 Engineering Education Podcast.

 

Pius Wong  0:15 

The name ETS might be familiar to you. ETS is a major organization in educational assessment and testing, involved in programs like Advanced Placement or AP tests, and some state standardized tests for K-12. Over the last few years, they've been looking at how to assess learning in K-12 engineering, too. And recently I got to speak to a researcher at ETS about that effort. I'm Pius Wong. I spoke to ETS senior research associate Deb Brockway about how she thinks about assessing learning in engineering design. Deb spoke by phone from New Jersey.

 

Pius Wong  0:58 

Welcome, Deb Brockway, to this podcast, and I really appreciate you taking the time out from ETS to speak to me. Can you introduce yourself?

 

Deb Brockway  1:08 

Yeah. I'm Deb Brockway, and I am a senior research associate at ETS, where I engage in research related to engineering design assessment, particularly as it's integrated into science. I have had some background -- I was a classroom teacher, science teacher for 15 years. And my very first position was as a pyrotechnic specialist, and I did some work on a team that was doing some design. So that's where I had the first taste of all of this.

 

Pius Wong  1:46 

Wow, how do you become a pyrotechnics specialist?

 

Deb Brockway  1:50 

Well, actually, I fell into the position. In college I had studied biology and chemistry, and that was -- where I was living at the time, that was the closest position available that would allow me to use some of my science background. So one of the projects that I was working on, doing the groundwork for, really, was supporting the development of an airbag device for cars.

 

Pius Wong  2:19 

Well, going back to your research today, you had mentioned that you study the evaluation of engineering design at ETS. What does that look like? What does that mean?

 

Deb Brockway  2:30 

So what I'm doing is, I have a couple of projects that engage students in a virtual environment to actually engage in engineering design, so that they are working on a design challenge. And they go into this virtual space where they can collaborate. There's a whiteboard space, and they can draw, and there's a chat window so they can discuss their designs. And so the benefit of that -- There are lots of things happening here, but the benefit of that is, as a classroom teacher, and as a former classroom teacher, I would always wonder at the end of the day when the students come up with a design and the final product, who contributed what to this final design? And also what understandings do these students have about the science or math concepts that support their design decisions? And so the virtual environment allows us to do that.

 

Pius Wong  3:36 

It sounds like blended learning.

 

Deb Brockway  3:37 

Blended learning? How so?

 

Pius Wong  3:40 

Well, that's what I'm wondering. So you say it's a virtual environment. I'm assuming these are -- this is an engineering team that's working together at a distance, maybe? Is that true, or is this -- Are these students in a classroom environment?

 

Deb Brockway  3:54 

Well, they are in a classroom environment, but the idea is that they could be working at a distance. There is that benefit to working at a distance. And I think it models, really, the way science and engineering happens. It's that the members of the team are often not co-located. So students actually get to experience what it might be like to work on a design team and be working remotely. But the benefit for classroom teachers, really, is to be able to see individual students' work. So even though they're in the same classroom, they're not sitting next to each other, so they are really forced to work within the virtual environment. So then teachers can actually look at the work and say, Oh, well, I see this student has a misconception with respect to the science that's supporting the design, or this student is struggling with the idea of optimizing the design, and so I can work with this student on engineering practice. I can work with a student with respect to the science concepts that I'm trying to teach in my classroom. So that allows the teachers to be able to see individual students' contributions. So in this case, it's not so much for -- It's not really intended at this point, anyway, to be a summative assessment. So it's not expected that the teachers would be assigning a grade based on what they're seeing, but that the teachers would use this to inform their next instructional steps. So what should I be doing in the classroom tomorrow to advance the students' learning?

 

Pius Wong  5:40 

Right. So what's the advantage, then, again, of using this virtual environment as opposed to monitoring it, you know, in the traditional engineering notebook, the hard copies of the work that they're doing?

 

Deb Brockway  5:54 

Well, you know, when I have seen -- So this project has the virtual environment where they work together. And then it has a virtual environment where the students work individually. And when students work individually and in a group, we see some differences in what's happening in the group versus what's happening individually. So when they record their information in an engineering notebook, all of the students likely will have the same information there. And that's what we see, sort of, in this collaboration space in the virtual environment. The students have one product for all of the students in the group. Then when we go to an individual component of the task, then we see that the students are not really all in agreement about directions they maybe have taken, or about the underlying concepts. And so this allows teachers to be able to see where there are disconnects there. And yes, all of the students agreed to this, but really, they're holding on to some preconceived ideas, or they have design ideas that they are holding on to. So the virtual environment allows teachers to be able to identify these kinds of situations to best address students' understanding and advance their understanding.

 

Pius Wong  7:25 

Okay. Why would this research be important?

 

Deb Brockway  7:31 

So it's important, I think, in a number of ways. One is that engineering at the K-12 level is new, you know. It's been around for a bit, but when you compare it to the other traditional courses, we really know very little about students' learning in engineering and design, and so the research is important to be able to guide our decisions about what teachers -- what works best in the classroom, but also what will work in terms of a summative assessment. So eventually getting to the point where we can engage students in a design environment in a summative assessment, so that we're assigning grades, rather than saying -- asking students in a static environment to critique a design or to critique a group's work.

 

Pius Wong  8:27 

Okay. Yeah, that has always been an interesting conversation with teachers and principals when we talk about engineering and assessing it. It seems very difficult to do right now. Where do you think that's going to go in the future? I know you're doing -- you're building, I guess, the building blocks right now to be able to do summative assessment of engineering learning. Do you think it's even possible to create some kind of summative assessment tool for K-12 engineering?

 

Deb Brockway  9:00 

Definitely. And I would even say definitely with respect to engaging students in a design process. So, you know, there are aspects of current science assessments that are being developed, and engineering literacy assessments that are being developed, that ask students questions about engineering, and engage students in aspects of a design process. But to do it more fully, we definitely can get there. It will take time. But some of the things that we can do when we're doing this in a virtual environment -- So if we have a student engaged in this virtual environment, we collect the process data. So what are they doing when they're working in this virtual environment? So if you give them a device, let's say, and we want them to improve upon that device, there are different actions that they can take to improve upon that device. So I can -- Maybe it would be helpful if I give you a specific example.

 

Pius Wong  9:59 

Sure.

 

Deb Brockway  9:59 

So students have a solar still that will take ocean water, and through evaporation and condensation will produce pure drinking water for a family that has limited access to pure water for drinking. So they get this flawed design. And they need to improve the design so that it will be more efficient and be more usable. So when the students are working in this environment, we can collect data about the process that they're taking to solve the problem. So for example, one of the things we can look at is: Are they taking a science inquiry approach to solve the problem? Are they controlling variables? Or are they using science and math content knowledge to support design decisions and choices? Are they using a targeted, reasoned approach to make a systematic decision about how to go about investigating this and coming up with a better design? Or is it scattershot? And we can collect the data on their interactions with the system to know how they're approaching this. So are they using a design approach, an engineering approach to solving the problem, for example.

 

Pius Wong  11:22 

When you talk about collecting that data, it sounds interesting. It sounds like you're encoding these qualitative judgments that teachers make all the time in the classroom, and you're putting some numbers to it. Is that true? Are you measuring how, like, systematically a student is working?

 

Deb Brockway  11:40 

Yeah, we definitely can measure how systematically they are working. So it's not just collecting the data on the students in the classroom. We also conduct interviews with students, one on one, who are engaging with the system, so that we're not sort of theoretically saying, Well, I think the student may be doing this because -- We actually conduct interviews with students who are engaging with the system and find out why they're doing what they're doing. And then we can generalize to some extent if we collect enough data like that. We can generalize and say, Well, now when we have 1,000 students or more, we have some understanding of the approach that they're taking and why they're taking that approach.

 

Pius Wong  12:29 

And are you working, then, with real K-12 classrooms right now? Is that how you're getting your data? You're partnering with teachers and talking with those students?

 

Deb Brockway  12:38 

Yes, actually, I am working with three school districts right now. And three of the teachers -- one teacher from each of the districts -- has been involved in the project for various amounts of time. Two of them have been involved for three years. So they're helping. They're providing feedback as we develop the tasks with respect to what is appropriate and what the students are capable of doing, and how to frame it so that it really is effective for real classrooms. Many of their colleagues have also signed on. So we're collecting data this year and next year to answer some research questions related to it, to find out, in fact, how well is this working as a formative assessment? Does it provide teachers with usable information so that they can make decisions about what they should be doing with their students the next day or the next week, to increase students' understanding and abilities with respect to science and engineering?

 

Pius Wong  13:44 

So this virtual environment, it sounds like it has a lot of potential for a lot of data. It might take time to collect all this data, clearly. How long will it be, do you think, before we start seeing the results of this out in the wild, like out here in Texas, for example?

 

Deb Brockway  14:00 

Well, that's hard to say. I think we will see some of this rather soon. And of course, research years are a little bit different than the K-12 environment, I guess. But we're doing this already to some extent with science inquiry. So I think it won't take as long with engineering, because we've already made that step. So for example, we are giving students -- and I mean we, meaning the larger group of assessors and researchers -- giving students tasks where they investigate variables and conduct a science inquiry investigation in a virtual environment, and we are collecting data, and we can make conclusions about their abilities to conduct an investigation by using this information that we're collecting in a virtual environment.

 

Pius Wong  14:57 

Yeah. I mean even with this virtual environment, it sounds like a lot of the work is still done by the teacher or by the assessor, for example conducting the interview. A lot of this data isn't just automatic, it sounds like. Do you think that these assessment tools for engineering will always have to be that hands-on? Or do you think -- Basically, do you think that there's always going to have to be that teacher or some guide really talking to all those students? Or can it be this idealized world where students learn completely on their own?

 

Deb Brockway  15:32 

Oh, students can -- Well, completely on their own? Yes, I think that's possible, too. At the present time, what we're looking at is providing teachers with feedback. But there's no reason why students couldn't get that feedback at the same time. In fact, that's part of formative assessment. The idea with formative assessment is to provide information back to students and to teachers, so that students can reflect on what steps they need to take, or where are they weak, and where do they need to work in order to be able to improve? That information can go to students, certainly in a slightly different format than we would provide to teachers. But that's possible. And we are working on that. We do have an interface that teachers can go into at this point to see the students' responses. What we need to work on more is, now, how do we interpret that, and provide teachers with recommendations for next steps? And that's what some of the teachers are going to be helping us with this summer when I meet with them.

 

Pius Wong  16:52 

In Austin, Texas, recently at South by Southwest, ETS was there presenting a lot of cool ideas and research that you all were doing. And I know that you're not involved in every single branch of research at ETS, but I am curious how you decide what type of research to pursue. I know that you already have done and do a lot of science learning research in K-12. Why all of a sudden this shift to engineering, for example? And what else will you be pursuing?

 

Deb Brockway  17:25 

So I actually have been working in research related to engineering education at the K-12 level for 10 years. But I joined ETS a little over three years ago. And so my interest was different. I was at a university, which was an engineering and science university. And we had an interest in K-12 engineering education for a variety of reasons, but certainly because of the institution. And coming to ETS, I think one of the reasons for making it a larger emphasis at ETS is because of the Next Generation Science Standards. So now, engineering is in science. So it needs to be tested, because it's in science. So we need to be able to create tests that are fair and valid, that will evaluate students' abilities with respect to engineering practices. And so there are lots of other programs at the K-12 level that relate to engineering, but it's not necessarily a large-scale assessment. So we need to be able to -- If we're going to report on students' abilities, we need to be able to conduct research so that we are confident that we're reporting on students' abilities, that it's valid reporting, and that the tests are fair.

 

Pius Wong  18:57 

Right, that word fair came up a lot in all these discussions I've been hearing previously. People think it's really hard to come up with a fair engineering evaluation just because there is all of that process measurement. So I'm really glad that you're looking at it. That does go back to why I was asking the question previously: How long would it take? I think teachers -- Well, maybe not teachers, but maybe school districts can be impatient about finding a metric for this type of learning. Do you think that we'll have a way to do summative assessments pretty soon?

 

Deb Brockway  19:32 

Well, yes, but I would say that what we're going to be able to do with summative assessments now is that we will be doing much more constrained assessments. There will be static items where students critique designs, for example, or maybe critique another group's process. In fact, I'm providing input on some simulations that might be able to be used for engineering. So this would be much more focused, not so broad as what I'm doing with a formative assessment. But students would be able to engage with the simulation, and that potentially could give us information about design. So that's near future. The rest of it is a little bit farther out on the horizon. Let me give another example. There is the NAEP Technology and Engineering Literacy Assessment, and that engages students in aspects of a design process. In that case, however, it's engineering literacy, in the sense that students are not required to use science or math in order to be able to solve the problems. I correct myself -- they are required to use it, but the science or math that they need to use is provided to them. They're not required to bring it to the task. So that's a little bit different. So the area that I'm looking at, and that is sort of bounded by the NGSS, is having students engage in engineering design where they do bring their content knowledge. And so they're solving the design problems, and it's relying on them also understanding the science concepts to support design decisions.

 

Pius Wong  21:27 

Right. Going back to formative assessments -- I guess I never asked specifically, is your research just on students of a particular age?

 

Deb Brockway  21:38 

Well, I started with middle school, but just for practical reasons: we were working with students on science assessment at the middle school level, so it was convenient to start with middle school. So that's why. And I think, from my own perspective, it's because students at the middle school level -- If we're having an impact on interest, for instance, and career decisions, middle school is a good place to start. By the time students are at the high school level, many of them have more fully formed ideas of where they're going, where they have a little more direction than at the middle school level.

 

Pius Wong  22:21 

And is your research continuing to focus on middle school students?

 

Deb Brockway  22:25 

Well, one project is. So the project that I've been working on the longest, with the solar still, that's middle school. But I have another project that is at the high school level. And we have not at this point collected data with high school students. We have collected data with adults. And so I'm hopeful that next year we can collect data with high school students. It is engaging in a different kind of a design challenge, and using a simulation, and there isn't a collaborative component. So it's quite different, but this would be a more constrained way of looking at how we might evaluate students' abilities with respect to design.

 

Pius Wong  23:07 

You brought up the idea, the very important idea of getting students' interests when they're younger, maybe in middle school. I feel like that sometimes is a separate question of whether students have learned engineering content. Like the question of whether students are interested in engineering versus whether they've learned engineering. Do you think that the formative assessments or the tools that you're developing, that they can be helpful at all in increasing interest in engineering, not just content knowledge, or process knowledge?

 

Deb Brockway  23:40 

I think it's possible. I'm not collecting enough data to be able to say that for certain, but we do ask students questions when we conduct the interviews, one-on-one with them, about how this activity compares to the other kinds of activities that they're doing in the classroom and about their level of interest. But not -- we're not collecting -- on this project, we're not collecting enough information to be able to inform that.  Yeah, so I guess I don't really know.

 

Pius Wong  24:12 

Okay. I always have tons of questions about the future and what ifs, but I know that as a researcher, you don't want to just make bold claims that are unsupported by evidence. But so what I do want to ask you then as a final closing thought is, what questions do you still have as a researcher at ETS, about how we teach or how we assess engineering learning? What are you still trying to find out?

 

Deb Brockway  24:41 

Well, really, what I'm trying to find out -- what many of the teams that I work with are trying to find out -- is: How can we, in a valid and fair way, get a better picture of students' abilities with respect to engineering design? And so I think that's really the crux of it: How can we do that? And so we're looking at a bunch of different ways. And one that I haven't really touched on much at all is the collaboration component. And you had mentioned the question about how do we even measure collaboration. And we actually do have ways of measuring collaboration, too. In the formative way, we can measure it, because we have students working with each other. But also -- and it's not too far on the horizon, either -- by collecting these data that we're collecting with real students working with each other, we can take those conversations, and now we can use those as models to have a student interact with an avatar in a virtual environment, in a collaborative environment, and then use that in a summative assessment. So that would again push the envelope with respect to having students engage in design in a summative assessment in a way that better represents what happens in the field.

 

Pius Wong  26:13 

Yeah, that's really interesting. I'll have to stay tuned for more research on that. Well, thank you so much, Deb. If you have any other closing thoughts, anything you'd like to share?

 

Deb Brockway  26:23 

I don't think so. I think we've pretty much covered it. I could talk all afternoon.  Thanks so much for the opportunity to share this with you.

 

Pius Wong  26:34 

That was Deb Brockway, senior research associate at ETS. What do you think about testing learning in engineering design? Send me a message over email, Twitter, SoundCloud, or wherever you are on the internet, and we can talk about it. For more on any subjects you've heard from our talk today, read this episode's show notes and check out the links. You can also find these notes and some episode transcripts at the podcast website, k12engineering.net. I'll get more transcripts up there soon, I promise. Connect with the show on Twitter @K12Engineering, or find it on the Public Radio Exchange, Facebook, RadioPublic, Patreon, and many other places. Learn more about how to connect at k12engineering.net. Our closing music is from the song "Yes And" by Steve Combs, used under a Creative Commons Attribution License. The K12 Engineering Education Podcast is a production of my independent studio, Pios Labs, in Austin, Texas. If you listen to the show, please consider sponsoring it. Go to patreon.com/pioslabs to donate, or find links to Patreon from the show website, k12engineering.net. Thanks so much to all the current supporters. Your encouragement is fantastic, and thank you, listener, for listening.