Talking about Assessment
Jun 29, 2015 | By: Beth McGinnis-Cavanaugh
We participated in the National Science Foundation's (NSF) Teaching and Learning Video Showcase in May 2015. This was an online poster presentation of sorts, although, obviously, with videos instead of posters. Over one hundred videos were presented in a digital "videohall"; these videos represented projects that cut a swath across NSF programs CIRCL, CAISE, CADRE, MSPnet, STELAR, CS10K, and ARC. Through My Window is funded by the National Science Foundation under the Informal Science Education (ISE) program and is a CAISE project.
Participants were vetted via an application process. Presentations were required to showcase "intervention, innovation, and/or research" as well as address "potential impact, promise, and challenges" in three-minute videos. Our submission was "The Through My Window Story", a video that highlights the development of the Through My Window learning environment and its basis in cutting-edge research. Questions and comments were based on the video. The event was open to the public for commenting, and NSF also appointed facilitators to stimulate and guide good discussion on submitted entries. Many thanks to NSF for hosting this showcase, as it was a valuable learning and networking experience.
Several themes emerged from a thankfully energetic and robust Q&A on our presentation. The topic of greatest interest was assessment. (Another major theme was the use of story, which I will feature in another installment.) Since assessment drew so much interest throughout the showcase, I thought I'd post some of the questions and answers surrounding that topic here. We continued to hear from a few folks after the showcase ended, too, and invite follow-ups on this topic at any time! And, since we'll have a hefty data set this summer, it will be interesting to follow this blog with a later one that supports our responses below.
Another big thanks to all those who submitted questions. They gave us pause for thought in many instances—always a good thing—although our responses may not have consistently reflected deep thinking. Remember that the back and forth was very casual, conversational, and "in the moment". The language that follows is reproduced without edits!
Q. This does sound like a very creative project, and I like the use of narrative as a tool to engage students. Following up on the previous question, how will you evaluate the impact of Through My Window on students? Will you compare your approach to a non-narrative based approach?
A. As I briefly mentioned above, we have a number of ways we’re evaluating the impact of Through My Window. This includes student surveys that look at student attitudes about, engagement in, and perceptions of engineering; online student journal entries that give us a window into student thinking about the specific content areas of each of the learning adventures; and student focus groups. In addition, we think educator perspectives and feedback are incredibly important (in many ways, they’re the experts!), so we also have educator surveys, interviews, and a feedback portal on our educator website (teamthroughmywindow.org).
While we aren’t directly testing Through My Window against a non-narrative control group, educators have told us that narrative provides certain advantages, including engaging a broader range of students and providing an opportunity to integrate the teaching of English/literacy and STEM.
Q. I like your approach of collecting data from students and also feedback from educators about the materials and approach. For the students, can you say a little more about the online journal entries, and how you will use those to learn about students’ ideas about engineering and the specific content areas?
A. Look at my answer to Cynthia, below—but I’ll provide another example here. In one part of the Rio’s Brain artificial intelligence online learning adventure, students watch videos of machines completing certain tasks, like pouring a glass of water, walking through a crowd, dancing, etc. Students are asked to journal about what they think the machine would need to “know” in order to complete each task.
What we’ve found is that by the time they get to this point in the adventure (they’ve already completed the journaling and activities I described to Cynthia below, as well as encountering some new ideas about tasks they think are easy or hard for machines), they are able to frame their answers in terms of machines “knowing” certain things—having rules or specific functions—and being able to act on them. That is, in and of itself, the definition of classical artificial intelligence—so students really are “getting it”!
We should have more examples as we collect more student data and as we release more learning adventures.
Q. The use of narrative is hugely important but I imagine that it can be challenging to capture the value of it in terms of “data.” What have been the most effective methods of documenting your findings and conveying them to funders?
A. With respect to data, at this time we are primarily focusing on student pre- and post-surveys as well as analysis of journal entries in the AI learning adventure. Student surveys address the effectiveness of the learning environment in engendering student understanding of the work of engineers as well as STEM identity. Do students see themselves as engineers or are they more likely to consider engineering (STEM) as a career after using Through My Window?
Analyzing changes in student learning of engineering concepts—the “big” ideas in the learning adventure and novel—is done with embedded assessment. In the AI learning adventure, students are prompted to journal answers to questions like: “What is intelligence?” “What do intelligent machines need to know to complete different tasks?” and “What knowledge can a machine have and how can it operate on that knowledge?”. Students can answer and revise their answers to these types of questions as they move through the learning adventure; they can share and compare their answers with those of their peer group and revise. The focus is always on ideas and idea improvement—not on right or wrong answers!
A “data team” within the project looks at initial ideas versus post ideas. While we are early in the process, preliminary data indicates significant idea improvement within the AI learning adventure—improvement that signals deeper understanding and broader conception—regardless of the level of sophistication of the initial idea. We are, of course, reporting these findings to NSF as well as disseminating via papers—ASEE, for example, among other opportunities. We’re always open to suggestions when it comes to dissemination methods!
Q. Just curious—do you ever have engineers take a look at the student journals—and provide feedback on the “authenticity” of their ideas and the growth in their understanding?
A. The people who look at the student journal entries are an educational theory expert/principal investigator on our grant/professor at Smith College, Al Rudnitsky; an engineering education expert/principal investigator/Smith professor, Glenn Ellis; and undergraduate students of Ellis and Rudnitsky in both fields. As I mentioned below, they are looking closely at student journal entries for growth and changes in ideas as well as understanding of certain content areas.
Were you thinking more specifically of having engineers “in the field” so to speak look at the journal entries—or comparing what students write to the way professional engineers answer the same questions? That’s an interesting idea—I’ll have to run that by our team!!
Q. I did mean engineers “in the field” but you’ve expanded my ideas to include comparisons with ways professional engineers answer the same question—It would be great to make a few of those comparisons, even if restricted to a small sample.
A. Both really excellent ideas, and I’ve brought them up to our team as possibilities for the future!
Q. Very impressed with the integration of IE and KB to address engineering challenges. How do you plan to broaden participation and evaluate impact?
A. Through My Window consists of three major components—the novel (Talk to Me), the website with interactive learning adventures, and a teachers’ curriculum guide with offline enrichment activities.
We’re using a variety of channels to broaden participation, including exhibitions and presentations at conferences (e.g. the American Society for Engineering Education, the Society of Women Engineers, the National Afterschool Association); collaborations with state after school networks, including those in Connecticut and Massachusetts; social media (including Facebook and Twitter); and strategic networking with formal and informal educational districts and networks.
We are evaluating impact through student surveys that measure student attitudes about engineering as well as through the online journal entries students write as they complete the online learning adventures. These journal entries give us an idea of how their ideas about engineering and the specific content area (e.g. design or artificial intelligence) are improving and changing as they progress in the learning adventure.