Category Archives: STEM

Teachers Partnering with Artificial Intelligence: Augmentation and Automation


By Pati Ruiz and Judi Fusco

Artificial intelligence systems are increasingly being deployed in K-12 educational settings, and we expect this trend to continue. Our starting point is that AI systems should support or augment, but never replace, a teacher. To ensure this, these systems should be developed with input from teachers, students, and families.

So, what types of AI systems do teachers want to see developed? A group of teachers from the Merlyn Mind Practitioner Advisory Board shared ideas for how AI might help teachers better support their students. One scenario emerged around students who have Individualized Education Programs or Plans (IEPs)1. In this post we will describe how an AI system might support teachers and students by automating:

  1. Planning and Supporting Preferences
  2. Monitoring
  3. Documentation

Planning and Supporting Preferences

First, a teacher could input student instructional plans into the system. The system can then review the plans, make recommendations, and alert the teacher when something may not work for a student. In the alert, the system could suggest adaptations to lessons or assignments based on each student’s needs. For example, an AI system can scan what’s coming up in an instructional unit and alert the teacher that a website they selected does not meet the accessibility standards required by students in the classroom. A more advanced system could also suggest an alternative option, or even better, search for multiple appropriate resources and let the teacher decide which are best suited to their students’ instructional needs. In all cases, the AI system is only helping by making suggestions that the teacher may act on.
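To make the accessibility-scan idea concrete, here is a minimal sketch of one such check: flagging images on a lesson webpage that lack alt text, one basic accessibility criterion. It uses the Python requests and BeautifulSoup libraries; the URL and function name are placeholders of ours, and a real system would check many more standards than this.

```python
# A minimal sketch: flag images on a lesson page that lack alt text,
# one basic accessibility criterion. The URL below is a placeholder.
import requests
from bs4 import BeautifulSoup

def find_images_missing_alt(url):
    """Return the sources of images on the page that have no alt text."""
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    return [img.get("src", "(inline image)")
            for img in soup.find_all("img")
            if not img.get("alt", "").strip()]

flagged = find_images_missing_alt("https://example.com/lesson-resource")
if flagged:
    print(f"Alert: {len(flagged)} image(s) are missing alt text:")
    for src in flagged:
        print(" -", src)
```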

An AI system can also allow for a broader range of inputs from students for assignments based on their needs. For example, if a student accommodation includes submitting assignments as recorded audio, but the teacher prefers written assignments, an AI system can convert the student’s audio to text so the teacher can review or grade the text. The speech-to-text tool should also allow the teacher to hear the student’s voice for a particular sentence or phrase, for example, if the transcription was not accurate. Alternatively, if a student needs to hear the teacher’s comments on their assignments instead of reading them, the AI system can convert the written comments into speech for the student to hear. To further help the teacher, the system might suggest comments that they had written for another student so the teacher can reuse or repurpose them. The system might also remind the teacher of a student’s preference for feedback; if the student prefers verbal feedback, the teacher could read and record the comments for that more personal touch.
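As a rough illustration of the two conversions described above, the sketch below uses the open-source speech_recognition and pyttsx3 Python packages. It is only a sketch; a classroom-ready tool would need stronger accuracy checks and privacy safeguards.

```python
# A rough sketch of the two conversions described above, assuming the
# speech_recognition and pyttsx3 packages are installed.
import speech_recognition as sr
import pyttsx3

def audio_submission_to_text(wav_path):
    """Transcribe a student's recorded audio assignment to text."""
    recognizer = sr.Recognizer()
    with sr.AudioFile(wav_path) as source:
        audio = recognizer.record(source)
    # Keep the original recording on file so the teacher can replay any
    # sentence whose transcription looks wrong.
    return recognizer.recognize_google(audio)

def comments_to_audio(comments, out_path):
    """Convert a teacher's written feedback into a spoken audio file."""
    engine = pyttsx3.init()
    engine.save_to_file(comments, out_path)
    engine.runAndWait()
```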

Monitoring

To support teachers in providing adequate accommodations for their students, an AI system can monitor student IEP information and make automated recommendations for needed support. For example, the system could identify students who require extended time and either share a list with the teacher or make appropriate adjustments to due dates for individual students in a learning management system. Here, we point out the need for AI systems to be able to interact with other systems or be embedded within them. Additionally, the system must do this in a way that does not expose sensitive information about students to the whole class.
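As an illustration of the extended-time idea, here is a hypothetical sketch. The lms object and its methods (get_assignment, set_due_date) are invented stand-ins for a real learning management system’s API, and the accommodation data is made up; as noted above, such data is sensitive and must never be exposed to the whole class.

```python
# Hypothetical sketch: privately extend due dates for students whose
# IEPs specify extended time. The lms client and its methods are
# invented stand-ins for a real LMS API.
from datetime import timedelta

# Time multipliers drawn from (made-up) IEP records.
EXTENDED_TIME = {"student_042": 1.5, "student_117": 2.0}

def apply_extended_time(lms, assignment_id, base_days):
    """Adjust individual due dates without exposing who received them."""
    assignment = lms.get_assignment(assignment_id)
    for student_id, multiplier in EXTENDED_TIME.items():
        extra = timedelta(days=base_days * (multiplier - 1))
        lms.set_due_date(assignment_id, student_id,
                         assignment.due_date + extra)
```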

Related to the text-to-speech and speech-to-text ideas discussed above, an AI system can also provide individualized read-aloud capabilities for students who need that support. The system could also remind the teacher to provide tools, like headphones or closed captioning for students who need to listen to content. We firmly believe that AI systems can help by doing things that machines are good at, while continuing to enable teachers to focus on what humans do best—like developing interpersonal relationships and identifying nuanced needs. With these types of automated supports, it is important to ensure that teachers have the ability to make the final decisions about students’ needs and that students have the agency to accept and decline supports as they go.

Documentation

Supporting a classroom with students who have varying needs—whether they are documented in an IEP or not—requires a significant amount of monitoring and reporting on the part of educators. An AI system could support teachers by not only monitoring the individual requirements of students, but also documenting the adjustments and accommodations that were made for each student. This documentation could then be shared with the students’ families to provide a summary of the work that students have accomplished and how they have been supported in completing that work. Of course, a teacher would review and verify that the summary produced by the AI system is accurate and flag any issues with the write-ups that would need to be addressed by the AI design team.

By the end of the instructional unit, teachers would be able to review reports of student progress, identify what worked and what didn’t, and ensure that all students are making meaningful progress. Automating planning, tracking, and documentation can give a teacher more time to care for students; however, given the various risks AI systems bring, it is crucial that teachers also be able to override an AI system when needed.

Risks

The imagined AI system described above helps teachers do what they do best by ensuring their students receive the accommodations they require and by documenting those accommodations. Using such systems will come with risks, and AI systems that engage with student IEP data need the highest level of data privacy and oversight. As we discussed earlier, educators must be involved; for example, the teacher is in charge of giving feedback, but the system may make suggestions that help the teacher give better feedback. If educator experts are not in the loop, there could be harmful consequences for students. Educators must be diligent and not assume that every accommodation determined by an AI system is correct or the best decision. AI systems lack full context and the ability to make human decisions. Educators must have oversight and be able to verify and approve every decision made by the system.

Educator Voices

This blog post presents an imagined AI system based on conversations with a group of practitioners from the Merlyn Mind Practitioner Advisory Board. We need more teachers and educators involved in these conversations, so please consider this blog post an invitation to connect with us and join the conversation on the future of AI in education. In addition to Merlyn Mind, if you are interested in getting involved, please visit the links below.

1 An IEP is a legal document in the United States that is developed for all public school children who need special education. It is created by district personnel with input from the child’s guardians and is reviewed every year. For more information see https://www2.ed.gov/about/offices/list/ocr/docs/edlite-FAPE504.html

Apprentice Learner: Artificial Intelligence (AI) in the Classroom

by Sarah Hampton

One of my favorite things about CIRCLS is the opportunity to collaborate with education researchers and technology developers. Our goal as a community is to innovate in education, using technology and the learning sciences to give more learners engaging educational experiences that help them gain deep understanding. To reach that goal, we need expertise from many areas: researchers who study how we learn best, teachers who understand how new technologies can be integrated, and developers who turn ideas into hardware or software.

Recently I was reminded of one such opportunity: Judi, Pati, and I met with Daniel Weitekamp in June of 2020. Daniel, a PhD student at Carnegie Mellon University at the time, was developing an AI tool for teachers called Apprentice Learner.

Figure 1. The Apprentice Learner interface, showing a stacked addition problem with carrying. The user can request a hint or type in an answer and then hit done.

Apprentice Learner looks a bit like a calculator at first glance, so an onlooker might be tempted to say, “What’s so cutting edge about this? We’ve been able to do basic math on calculators for years.” But we need to understand the difference between traditional software and software using artificial intelligence (AI) to appreciate the new benefits this kind of tool can bring to the education table.

In a basic calculator, there’s an unchanging program that tells the screen to display “1284” when you type in “839+445.” There’s no explanation given for how and why the programming behind a calculator works. Yet, for each math problem someone could type in a calculator, there is an answer that has been explicitly programmed to be displayed on the screen.

Contrast a calculator with Apprentice Learner, which uses machine learning (a type of artificial intelligence). No one tells Apprentice Learner to display “1284” when it sees “839+445.” Instead, it has some basic explicit instructions and is given lots of examples of correctly solved problems adding two or more columns of numbers. Then it has to figure out how to answer new questions. The examples it is given are called training data. In this case, Apprentice Learner was given explicit instructions about adding single-digit numbers and then lots of training data (multidigit addition problems with their answers), maybe problems like “21+43=64,” “49+8=57,” and “234+1767=2001.” Then, it starts guessing at ways to arrive at the answers given in the training data.

The first guess might be to stack the numbers and add each column from left to right. That works perfectly for “21+43,” but gives an incorrect answer of “129” for “49+8.”


The second guess might be to stack the numbers and add each column from right to left. Again, that works perfectly for “21+43.” Unfortunately, that would give an answer of “417” for “49+8.”

The software continues finding patterns and trying out models until it finds one that fits the training data best. You can see below that, eventually, Apprentice Learner “figured out” how to regroup (aka carry) so it could arrive at the correct answer.

A stacked addition problem solved correctly by regrouping (carrying).
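The guess-and-check induction described above can be sketched in a few lines of Python. This toy version simply tests each candidate procedure against the training data; it is a simplification for illustration, not Apprentice Learner’s actual algorithm.

```python
# A toy version of induction over the training data from the post:
# try candidate addition procedures and keep the one that fits.
TRAINING_DATA = [(21, 43, 64), (49, 8, 57), (234, 1767, 2001)]

def add_left_aligned_no_carry(a, b):
    """Guess 1: stack the numbers left-aligned and add each column."""
    x, y = str(a), str(b)
    width = max(len(x), len(y))
    x, y = x.ljust(width, "0"), y.ljust(width, "0")
    return int("".join(str(int(p) + int(q)) for p, q in zip(x, y)))

def add_right_aligned_no_carry(a, b):
    """Guess 2: stack the numbers right-aligned and add each column."""
    x, y = str(a), str(b)
    width = max(len(x), len(y))
    x, y = x.rjust(width, "0"), y.rjust(width, "0")
    return int("".join(str(int(p) + int(q)) for p, q in zip(x, y)))

def add_with_regrouping(a, b):
    """The model that finally fits: column addition with carrying."""
    return a + b

for guess in (add_left_aligned_no_carry, add_right_aligned_no_carry,
              add_with_regrouping):
    fits = all(guess(a, b) == total for a, b, total in TRAINING_DATA)
    print(f"{guess.__name__}: {'fits' if fits else 'fails'} the training data")
```

Running the sketch shows guess 1 failing on “49+8” (it answers 129), guess 2 failing on the same problem (it answers 417), and only the regrouping model fitting all three examples, mirroring the sequence above.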

So what are the implications for something like this in education? Here are a few of my thoughts.

Apprentice Learner models inductive learning, which can help pre-service teachers

Induction is the process of establishing a general law by observing multiple specific examples. It’s the basic principle machine learning uses. In addition, inductive reasoning tasks such as identifying similarities and differences, pattern recognition, generalization, and hypothesis generation play important roles when learning mathematics (see Haverty, Koedinger, Klahr, and Alibali). Multiple studies have shown that greater learning occurs when students induce mathematical principles themselves first rather than having the principles directly explained at the outset (see Zhu and Simon, Klauer, and Koedinger and Anderson).

However, instructional strategies that prompt students to reason inductively prior to direct instruction can be difficult for math teachers to implement if they haven’t experienced learning math this way themselves. Based on conversations with multiple math teacher colleagues over the years, most of us learned math in a more direct manner, i.e., the teacher shows and explains the procedure first and then the learner imitates it with practice problems. (Note: even this language communicates that there is “one right way” to do math, unlike induction, in which all procedures are evaluated for usefulness. This could be a post in its own right.)

Apprentice Learner could provide a low-stakes experience to encourage early-career teachers to think through math solutions inductively. Helping teachers recognize and honor multiple student pathways to a solution empowers students, helps foster critical thinking, and increases long-term retention (see Atta, Ayaz, and Nawaz and Pokharel). This could also help teachers preempt student misconceptions (like column misalignment caused by a misunderstanding of place values and digits) and be ready with counterexamples to show why those misconceptions won’t work in every instance, much like I demonstrated above with Apprentice Learner’s possible first and second guesses at how multi-digit addition works. Ken Koedinger, professor of human-computer interaction and psychology at CMU, put it like this: “The machine learning system often stumbles in the same places that students do. As you’re teaching the computer, we can imagine a teacher may get new insights about what’s hard to learn because the machine has trouble learning it.”

The right training data is crucial

What would have happened if the training data contained only some types of problems? What if they were all two-digit numbers? Then it wouldn’t have mattered whether you stacked them left to right or right to left. What if none required regrouping/carrying? Then adding each column without ever carrying would fit every example. But when all the edge cases are included, the model is more accurate and robust.

Making sure the training data is large enough, and varied enough to cover all the edge cases, is crucial to the success of any AI model. Consider what has already happened when insufficient training data was used for facial recognition software: “A growing body of research exposes divergent error rates across demographic groups, with the poorest accuracy consistently found in subjects who are female, Black, and 18-30 years old.” Some of the most historically excluded people were most at risk of negative consequences when the AI failed. What’s important for us as educators? We need to ask questions about things like training data before using AI tools, and do our best to protect all students from negative consequences of software.

Feedback is incredibly advantageous

Figure 2. A flowchart of the different ways a user can give direct feedback to the Apprentice Learner system.

One of the most interesting things about Apprentice Learner is how it incorporates human feedback while it develops models. Instead of letting the AI run its course after the initial programming, it’s designed for human interaction throughout the process. The developers’ novel approach allows Apprentice Learner to be up and running in about a fourth of the time compared to similar systems. That’s a significant difference! (You can read about their approach in the Association for Computing Machinery’s Digital Library.)

It’s no surprise that feedback helps the system learn; in fact, there’s a parallel between helping the software learn and helping students learn. Feedback is one of the most effective instructional strategies in our teacher toolkit. As I highlighted in a former post, feedback had an average effect size of 0.79 standard deviations, a greater effect on students’ performance than prior cognitive ability, socioeconomic background, or reduced class size. I’ve seen firsthand how quickly students can learn when they’re given clear, individualized feedback exactly when they need it. I wasn’t surprised to see that human intervention could do the same for the software.

I really enjoyed our conversation with Daniel. It was interesting to hear our different perspectives around the same tool. (Judi is a research scientist, Pati is a former teacher and current research scientist, Daniel is a developer, and I am a classroom teacher.) I could see how this type of collaboration during the research and development of tools could amplify their impacts in classrooms. We always want to hear from more classroom teachers! Tweet @EducatorCIRCLS and be part of the conversation.

Thank you to Daniel Weitekamp, PhD candidate at Carnegie Mellon University, for talking with us and reviewing this post.

Learn More about Apprentice Learner:

Learn More about Math Teaching and Learning:

Supporting Computationally Rich Communication During Remote Learning: Lessons Learned

By Colin Hennessy Elliott & the SchoolWide Labs Team

This post was written by a member of the SchoolWide Labs research team, about their experience during the pandemic and what they learned from middle school science and STEM teachers as part of a larger Research-Practice Partnership between a university and a large school district in the United States. The post was reviewed by practicing Educator CIRCLS members. The purpose of the blog is to help open the door between the worlds of research and practice a bit wider so that we can see the differing perspectives and start a dialogue. We are always looking for more practitioners and researchers who want to join us in this work.

The COVID-19 pandemic pushed many school communities in the US online last school year. Teachers were charged with accommodating many needs while maintaining care and compassion for students and their families. As a multi-year research project aimed at supporting teachers in integrating computational thinking into science and STEM learning, we worked with renewed senses of compassion, creativity, and struggle. We witnessed how students and teachers innovatively developed computationally rich communication using the technologies from our project while teaching and learning remotely. Below we share a few moments from the 2020-21 school year that have helped us learn what it takes to engage middle school students in computational practices (e.g., collaborating on programming a physical system, interpreting data from a sensor) that are personally relevant and community-based. These moments offer lessons on how collaboration and communication are key to learning, regardless of whether the learning takes place in person or remotely.

Who we are

The SchoolWide Labs research team, housed at the University of Colorado Boulder with collaborators at Utah State University, has partnered with Denver Public Schools (DPS) for over five years. We work with middle school science and STEM teachers to co-develop models for teacher learning that support the integration of computational thinking into science and STEM classrooms. The team selected, assembled, and refined a programmable sensor technology with input from teachers on what would be feasible in their classrooms and in collaboration with a local electronics retailer (SparkFun Electronics). This collaboration focused particularly on programmable sensors because they offer opportunities for students to develop deeper relationships with scientific data, as producers rather than just data collectors.1 This aligns with modern scientific practice where scientists often tinker with computational tools to produce the data they need to answer specific questions.

The Data Sensor Hub (DaSH) is a low-cost physical computing system used in the curriculum and professional learning workshops developed by the SchoolWide Labs team. Ensuring the DaSH would be low cost was a priority for the team as an issue of access and equity. The DaSH consists of the BBC micro:bit, a connection expander called the gator:bit, and an array of sensors that can be attached to the micro:bit and gator:bit with alligator clips (see Figure 1). Students can easily assemble the DaSH themselves to experience the physical connections and hard wiring. Students and teachers can write programs for the DaSH using MakeCode, a block-based programming environment that runs in a web browser, making it easy to use with various computer setups. For students with more programming experience, MakeCode also supports Python and JavaScript for programming the micro:bit.

Figure 1. The Data Sensor Hub (DaSH). The picture on the left depicts the components of the DaSH used with the Sensor Immersion Unit, including the micro:bit, the gator:bit, and three sensors (top to bottom: soil moisture sensor, microphone sensor, environmental sensor), connected with alligator-clip wires. The picture on the right shows a teacher and student interacting with the DaSH set up with just the microphone sensor.
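To give a sense of what student programs look like, here is a minimal sketch in MicroPython, another common text-based way to program the micro:bit alongside MakeCode. It uses only the micro:bit’s built-in temperature sensor; reading the external gator:bit sensors would require the corresponding SparkFun extensions.

```python
# A minimal MicroPython sketch of the kind of program students write:
# press button A to scroll the current temperature (°C) across the LEDs.
from microbit import *

while True:
    if button_a.was_pressed():
        display.scroll(str(temperature()))
    sleep(100)  # check for a button press every 100 ms
```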

Before the COVID-19 pandemic, our research team co-designed curricular units with teachers interested in using the DaSH to engage middle school students in scientific inquiry. Currently there are four units available on our website, three that use the DaSH and one that uses a 3-D printer. The Sensor Immersion Unit – the only unit teachers implemented remotely in the 2020-21 school year – has students explore the DaSH in use via a classroom data display, learn basic programming, and create their own displays that collect environmental data (sound, temperature, carbon dioxide levels, or soil moisture) to address a question of their choice. For example, one group of students decided to investigate climate change by measuring atmospheric carbon dioxide levels in their neighborhoods and exploring the impact of plants and trees. The goal is for students to develop ownership of the DaSH as a data collection tool by wiring the hardware and programming the software. In the process, they engage in computational thinking and computationally rich communication when they discuss their use of the DaSH with peers and the teacher.

In the 2020-21 school year most middle schools in Denver Public Schools were remote. Several STEM teachers, with more curricular flexibility, decided to provide DaSHs to students who wanted the responsibility of having them for a period of time. Having the DaSHs in students’ homes made the barriers between home and school less visible, as students conducted place-based investigations. For example, some students shared temperature data and carbon dioxide levels in and around their homes with the class. In these moments, students emergently took on the role of data producers. Below, we share two examples from observing student and teacher interactions in virtual mediums that helped our research team learn about what is possible using the DaSH. We also developed new supports to help teachers facilitate extended student collaboration and communication when using the DaSH.

Lesson Learned 1: Increasing student collaboration in virtual settings

One middle school STEM teacher, Lauren (a pseudonym), had the opportunity to teach different cohorts of eighth graders in the first two quarters of the 2020-21 school year. A new SchoolWide Labs participant, she was enthusiastic about implementing the Sensor Immersion Unit with her first cohort in the first quarter. She navigated the logistical challenges of getting DaSHs to over half her students along with the pedagogical challenges of adapting the curriculum to a remote setting. After her first implementation, she shared that she was disappointed that her students rarely collaborated or shared their thinking with each other when they were online. We heard from other teachers that they had similar struggles. Before Lauren’s second implementation, we facilitated several professional learning sessions aimed at supporting teachers to elicit more student collaboration in remote settings. Through our work together, we identified the importance of establishing collaboration norms for students, offering continued opportunities to meet in small groups virtually, and modeling how to make their work visible to each other. In Lauren’s second implementation with new students the next quarter, she intentionally discussed norms and roles for group work in “breakout rooms,” or separate video calls for each group (her school was not using software with built-in breakout room functionality). One of the resulting virtual rooms with three eighth graders during the Sensor Immersion Unit was especially encouraging for both Lauren and our research team. Without their cameras on at any point, the three boys shared their screens (swapping depending on who needed help or wanted to show the others) and coordinated their developing programs (on different screens) in relation to the DaSHs that two of the students had at home. Their collaboration included checking in to make sure everyone was ready to move on (“Everyone ok?”) and the opportunity to ask for further explanation from others at any point (“hold on, why does my [DaSH]…”). With their joint visual attention on the shared screen, the three successfully navigated an early programming challenge using their understanding of the programming environment (MakeCode) and hints embedded in it (developed by the research team).

Lesson Learned 2: Adapting debugging practices to a virtual environment

Many science and STEM teachers new to our projects have struggled to find confidence in supporting students as they learn to program the DaSH. They specifically worry about knowing how to support students as they debug the systems, which includes finding and resolving potential and existing issues in computer code, hardware (including wiring), or their communication. This worry was further magnified when learning had to happen remotely, even with some students having the physical DaSH systems at home. Common issues teachers encountered in students’ setups were consistent with the bugs we have identified over the course of the project, including: 1) code and wiring that do not correspond, 2) problems in the students’ code, 3) a program that is not properly downloaded onto the micro:bit, and more.2

Being unable to easily see students’ physical equipment and provide hands-on support made some teachers wary of even attempting to use the SchoolWide Labs curriculum in a remote environment. However, those teachers who were willing to do so made intriguing adaptations to how they supported students in identifying and addressing bugs. These adaptations included: 1) meeting one on one briefly to ask students questions about their progress and asking them to hold up their systems to the camera for a hardware check, 2) holding debugging-specific office hour times during and outside of class time, and 3) having students send their code to teachers to review as formative assessments and debugging checks. Although debugging required more time, patience, and creativity from teachers and students, these activities were generally successful in making the DaSHs work and helping students become more adept users.

Future Directions

As teachers and students have gone back to attending school face-to-face, the lessons learned during remote instruction continue to inform our work and inspire us as a SchoolWide Labs community. As these two examples show, the ingenuity of teachers and students during a tough shift to virtual learning led to new forms of computational thinking, communication, and collaboration. It became clearer that at least some students critically needed the physical systems at home to deeply engage in the process of being data producers, a reflection of how material-rich the curriculum is. Yet simply having the DaSH in hand did not ensure that students would participate in the kinds of communication important for their engagement in and learning of the targeted science and computational thinking practices. While we continue to explore the complexity of teacher and student interactions with the DaSH, this past summer and fall we have been working with participating teachers (new and returning from last year) to develop a more specific set of norms, including small group communication norms and roles. Additionally, we have begun to consider other strategies that may support students in learning programming and debugging skills, such as student-created flowcharts representing their view of the computational decisions of the DaSH. The shift to virtual learning, and now back to face-to-face instruction, has required us to reflect more deeply on the professional learning that best supports teachers in using the DaSH and accompanying curriculum in a variety of instructional settings, with both anticipated and unanticipated constraints. We welcome the opportunity to continue learning with and from our colleagues who are similarly engaged in this type of highly challenging but extremely rewarding endeavor to promote computationally rich and discourse-centered classrooms.

More Information

More on SchoolWide Labs work:

  • Visit our website.
  • Gendreau Chakarov, A., Biddy, Q., Hennessy Elliott, C., & Recker, M. (2021). The Data Sensor Hub (DaSH): A Physical Computing System to Support Middle School Inquiry Science Instruction. Sensors, 21(18), 6243. https://doi.org/10.3390/s21186243
  • Biddy, Q., Chakarov, A. G., Bush, J., Hennessy Elliott, C., Jacobs, J., Recker, M., Sumner, T., & Penuel, W. (2021). A Professional Development Model to Integrate Computational Thinking Into Middle School Science Through Codesigned Storylines. Contemporary Issues in Technology and Teacher Education, 21(1), 53–96.
  • Gendreau Chakarov, A., Recker, M., Jacobs, J., Van Horne, K., & Sumner, T. (2019). Designing a Middle School Science Curriculum that Integrates Computational Thinking and Sensor Technology. Proceedings of the 50th ACM Technical Symposium on Computer Science Education, 818–824. https://doi.org/10.1145/3287324.3287476

Important references for our work

Hardy, L., Dixon, C., & Hsi, S. (2020). From Data Collectors to Data Producers: Shifting Students’ Relationship to Data. Journal of the Learning Sciences, 29(1), 104–126. https://doi.org/10.1080/10508406.2019.1678164

Weintrop, D., Beheshti, E., Horn, M., Orton, K., Jona, K., Trouille, L., & Wilensky, U. (2016). Defining Computational Thinking for Mathematics and Science Classrooms. Journal of Science Education and Technology, 25(1), 127–147. https://doi.org/10.1007/s10956-015-9581-5
______________________
1 See Hardy, Dixon, and Hsi (2020) for more information about data producers and collectors.
2 Table 2 in Gendreau Chakarov et al. (2021) has a full list of common DaSH bugs students have encountered across the project.


Exploring the 2021 STEM For All Video Showcase

Featuring 287 short videos of federally funded projects aimed at improving STEM and Computer Science education, the 2021 STEM For All Video Showcase highlighted strategies to engage students and address educational inequities. The array of 3-minute videos showed the depth of work going on in the field to think about equity and social justice in the wake of COVID-19. Below are some favorites of our CIRCLS team that we hope you enjoy as well!

Co-Creating Equitable STEM Research Led by Communities
Contributed by Leah Friedman
This video features a project partnership between the Cornell Lab of Ornithology and community organizations around the country that are historically excluded from science research. Centering community wisdom and leadership, the group investigates the impact of noise pollution on public health in order to co-create appropriate solutions. This project is an amazing model of upending typical hierarchies of knowledge creation or control in STEM research, provides a really concrete framework for conducting research with community members, and exemplifies ‘broadening’ in every sense of the word.

Interest Stereotypes Cause Gender Gaps in STEM Motivation
Contributed by Judi Fusco
Thinking about stereotypical gendered messages that young children, older children, teens, and even adults receive about whether they belong somewhere is so important. These messages may be subtle, nuanced, and not intended, but they happen; we need to make sure we aren’t excluding anyone, especially without realizing it.

Activity for Stories of Algebra for the Workplace
Contributed by Jeremy Roschelle
What if every student could tell a story of how they’ll use math in a future career? Although this is just a beginning, it seems to me that the technology for personalized, AI-driven STEM storytelling will arrive soon enough and could help students create their own STEM identities.

You Deserve A Seat at The Table: The Data Economy Workforce
Contributed by Jonathan Pittman
This video features a project at Bethune Cookman University that uses an immersive game learning experience to help students gain 21st century digital workforce skills. Using gamified immersion is an excellent approach to build workforce skills and learn about the future of work.

Big Data from Small Groups: Learning Analytics and Adaptive Support in Game-based Collaborative Learning
Contributed by Dalila Dragnić-Cindrić
In this project, groups of up to four students work together in a 3D game-based environment called Crystal Island to solve complex eco-problems. A research team from Indiana University and North Carolina State University is investigating how students in small groups communicate and coordinate with each other when problem solving, using learning analytics to drive adaptive support.
The lead presenter is one of our Emerging Scholars, Asmalina Saleh. The PIs are James Lester and Cindy Hmelo-Silver, and the co-PI is Krista Glazewski.

Activity for “WHIMC: Using Minecraft to Trigger Interest in STEM”
Contributed by Wendy Martin
If you are a fan of Minecraft or alternative histories you should check out H. Chad Lane’s video about his project: What-If Hypothetical Implementation in Minecraft (WHIMC). I enjoyed learning about how those researchers were encouraging students to create alternate worlds to help them better understand the phenomena that shape our own world.

To explore videos from past video showcases, visit the STEM For All Multiplex.

Artificial Intelligence & Learning + Out of School Time Learning

by Merijke Coenraad


Tenth graders collaborate on an engineering project. Photo by Allison Shelley/The Verbatim Agency for EDUimages

Welcome back to our blog series on Ambitious Mashups! Today, we are going to focus on the use of artificial intelligence in learning and mash it up with learning in out-of-school time.

Artificial Intelligence and Learning. The most common technologies used within projects were (a) intelligent tutoring systems, (b) machine learning, (c) speech, vision, and natural interactions, and (d) social robotics and avatars. While these were the most common technologies, many projects used a mashup of technologies and focused on how the technology could be used in new ways within the classroom to support learning. If you want to read more about the types of AI, we recommend the AI4K12 project and its poster on the 5 Big Ideas in AI as a starting point.

Need an example of what this looks like? Check out the Inq-Blotter project. It provides teachers with real-time tools that alert them to students’ science needs as they are learning. Students use the Inq-ITS platform, which provides them with inquiry science learning experiences through labs and simulations. In addition to providing the experiences, Inq-ITS is an intelligent tutoring system and is able to assess students in real time. Inq-Blotter builds on these capabilities to send teachers messages about how students are doing so they are able to provide just-in-time support. Inq-Blotter gives teachers the opportunity to gather formative assessment data and support quality inquiry learning. This ambitious project took multiple years of research and mashed up data science, assessment, science learning, and intelligent tutoring tools.

If you are interested in intelligent tutors that help you understand what your students know, you can also check out our post on ASSISTments, an intelligent tutor for math learning. You can also see our webinar that discusses both ASSISTments and Inq-ITS.

So, what does this mean for your classroom? Think about:

  • What technologies might lie ahead, and how do you want to use them? What would an intelligent tutoring system or social robot look like in your classroom?
  • Are you interested in using an intelligent tutor like Inq-ITS or ASSISTments in your classroom now? What are the implications of using these technologies for your teaching and practices within the classroom?
  • How could these emerging technologies affect your classroom practices and pedagogy? How will you continue to promote equitable learning opportunities when using them?

Out-of-school-time Learning. While we discussed technology in a formal setting during the school day, some of the projects also investigated learning with educational technology in out-of-school environments.

Mash-it! Let’s look at a project that ambitiously mashes up AI and out-of-school learning!

The Virtual STEM Buddies for Personalized Learning Experiences in Free Choice Informal Learning Settings project brings together museum learning with intelligent agent buddies to support students’ STEM learning at a children’s museum. The computerized character interacts with the child as they move through the museum and acts as both a mentor and a peer. The AI agent, aka the buddy, is able to give instructions based on teachable moments and help children find exhibits that aren’t crowded. AI in this out-of-school setting can provide youth with plenty of opportunities to learn and make the most of their museum experience. This ambitious project brought together team members from multiple universities and the Children’s Museum of Atlanta to mash up intelligent tutors, STEM, and informal learning.

What do you think of the possibilities with AI? Tweet us @EducatorCIRCLS and tell us about your innovative technology use and stay tuned for future blogs in this series about CIRCL Ambitious Mashups.

Models for Science Learning: Answering the NGSS Call

By Korah Wiley
Korah Wiley is a learning sciences researcher at Digital Promise with over ten years of classroom teaching experience. Her prior work as a STEM researcher instilled a passion for making the STEM fields more accessible to students and educators.

As a student, I loved all the animal-related topics—topics about plants…not so much. When I became a biology teacher and got to the section on plant biology and photosynthesis in the curriculum I was using, I knew that I, like my students, would need to “hit the books.” However, I quickly found myself deep in the world wide web of teaching and learning resources available online, because I knew that reading a textbook was only going to take my understanding so far. To really understand the material deeply enough to teach it, I needed a multimedia resource. I searched high and low and finally found an animation of the process at a level of detail that gave me confidence I understood the process well enough to answer my students’ questions and support them in their learning process.

The learning process that I sought to engage my students in wasn’t the standard “memorize this information and take a test in a couple of weeks.” Rather, it was the kind of learning called for by the Next Generation Science Standards (NGSS)—the three-dimensional integration type. At that time, the North Carolina School of Science and Mathematics was one of the lead state partner organizations for the development, adoption, and implementation of the NGSS. In preparation for the 2010-2011 school year, the science department dean shared the draft NGSS documents and essentially said, “This is the future of science learning and we will help lead the way.” So, as a department, we revised our curriculum and instruction to align with the call of the NGSS to engage students in the practices of science and engineering, with the goal of developing an integrated understanding of disciplinary core ideas and crosscutting concepts.

Finding this photosynthesis animation was great because 1) it helped me to understand photosynthesis better and 2) I could use it to engage my students in the science practice of using a model to understand natural phenomena, particularly ones that are invisible to the naked eye. My students and I went on a journey inspired by the NGSS to learn more than just the what and why of photosynthesis; we were also learning the how. Learning how photosynthesis took place led us to an even more interesting question: what if? What if human cells could harness light and make energy? (It’s actually not as far-fetched as it sounds; Goodman & Bercovich, 2008.)

The question of “what if” led me down new paths when I joined a team to develop a middle school, STEM enrichment program for minoritized and first-generation, college-bound students, called Labs for Learning. What if we developed the program curriculum to engage the participants, rising 7th graders, in a rigorous learning experience, similar to the curriculum we developed to align with the NGSS? Would it be too much for students who were barely in middle school and in woefully under-resourced middle schools at that? Encouraged by the learning experiences we were supporting for our high school students, we took a chance!

I was responsible for teaching biology topics to the 7th graders, which, to my chagrin, included even more about plants! I relied on what I knew worked, the photosynthesis animation that was so helpful for me and my high school students. The animation, for all its awesomeness, was just out of reach for the middle school students, who were really intimidated by the names of the molecules and complexes. Wanting to figure out a way to still use the animation, (knowing that it could help them develop a deeper understanding of key concepts like energy and matter transformation), I told them to just focus on the process and ignore the names. (I figured if they understood the process then they could learn the names later.) This scaffolding ultimately led to physical reenactments of the process, where we turned the abbreviations of the molecule and complex names into initials of the characters. We all had a fantastic time, they all learned the process, and many were inspired to learn the full names of their characters. (It was so exciting to watch!)

These experiences stuck with me when I was deciding on my dissertation focus. In particular, there were three things that followed me into graduate school:

  1. the limited number of resources available to support secondary students in understanding the mechanism of biological phenomena,
  2. the deep capacity of middle school students for mechanistic reasoning, and
  3. the power of a well-designed animation to support robust learning for me and my students.

Motivated by these observations, I decided to create a photosynthesis animation that focused on the mechanism of photosynthesis so that middle school students (and their teachers) could develop the type of scientific and integrated understanding called for by the NGSS.

After making the animation, I embedded it into an online photosynthesis unit in the Web-based Inquiry Science Environment (WISE) to evaluate whether and to what extent it supported students in meeting the NGSS performance expectation for photosynthesis (MS-LS1-6). I found that, similar to my Labs for Learning experience, middle school students are capable of understanding far more complex ideas than we give them credit for (publication under review). Even starting with as little knowledge as the inputs and outputs of photosynthesis, namely that carbon dioxide and water go into the plant and sugar (glucose) and oxygen come out, they were able to learn the biochemical mechanism of the process. While the assessment boundary for the photosynthesis performance expectation states that assessment for the standard does not include the biochemical mechanism of photosynthesis, my findings, along with those of numerous other studies, suggest that middle school students can handle it and can benefit from it in their future STEM learning (Ryoo & Linn, 2012; Russ et al., 2008; Krist et al., 2018). The framework documents for the NGSS, too, recognize the need for understanding mechanisms when developing and constructing scientific explanations (National Research Council, 2012). Answering the call of the NGSS and other ambitious science reform efforts to support students in developing integrated and multi-dimensional science knowledge requires an exploration of mechanisms.

Admittedly, deep exploration into unfamiliar topics is scary, especially as a teacher who is expected to know the answers. But what better way can a teacher support students in the learning process than to join the process themselves? As the world changes and learners can look in many places for answers, what they need is not the answer; they need a model of how to learn in a world where information abounds. Such a model will position students to know more than just the answers. They will know how to discover, how to use the wealth of resources available to them to find out. That’s what we can model for our students by learning with them.

At the rate that new information is being generated, there is no way any one person can know everything. I suggest finding resources that push you to your edge and inviting your students to explore the edge of their own knowledge and ability. You might not know the biochemical mechanism of photosynthesis, for example, but that’s okay; you can learn it with them. Find a resource that helps you and scaffold it to help them. Doing so will model for your students how to move from not knowing to knowing a little more, and then a little more.

When you do this, you can also help them understand why it matters, and more importantly, why it matters to you. Share with them what’s interesting about the topic to you. Invite them to explore their ideas and share their experience to find out why it matters to them. Position them as pioneers in a space that could make that knowledge worth knowing for someone else. Invite them into the world of imagination and what if, prompting them with: this is the current state, but what could be?

These are just some of the learning adventures that you can take with your students. The NGSS is an invitation to deeper, more meaningful discovery and learning, for students as well as teachers. Your students need a brave guide into the world of the unknown. If you can find resources that allow you to share that space with them, they will appreciate your guidance and your example of how to learn throughout life.

Now that I’ve done this work, I understand how exploring the mechanisms of different phenomena creates rich and transformative learning experiences for ourselves and our students. With the world moving and changing as fast as it is, we need to support students in learning as much as they can, which oftentimes is more than we think!

Acknowledgments. I need to note that the animation discussed here was created in collaboration with a multistakeholder design team that included disciplinary experts, learning scientists, software developers, teachers, and students. My dissertation work was funded by the National Science Foundation (DRL: 1418423; 1813713).

References:
Krist, C., Schwarz, C. V., & Reiser, B. J. (2018). Identifying essential epistemic heuristics for guiding mechanistic reasoning in science learning. Journal of the Learning Sciences, 28(2), 160–205. doi: 10.1080/10508406.2018.1510404

National Research Council. (2012). A Framework for K-12 Science Education: Practices, Crosscutting Concepts, and Core Ideas. Washington, D.C.: National Academies Press. doi: 10.17226/13165

Russ, R. S., Scherr, R. E., Hammer, D., & Mikeska, J. (2008). Recognizing mechanistic reasoning in student scientific inquiry: A framework for discourse analysis developed from philosophy of science. Science Education, 92(3), 499–525. doi: 10.1002/sce.20264

Ryoo, K., & Linn, M. C. (2012). Can dynamic visualizations improve middle school students’ understanding of energy in photosynthesis? Journal of Research in Science Teaching, 49(2), 218–243. doi: 10.1002/tea.21003

Data Science in Ambitious Mashups

by Merijke Coenraad

This post will focus on Trends at NSF and Beyond: Data Science.

No matter what subject you teach, it is likely that data comes into play in your classroom. Whether it is statistical analysis of data in math, collecting and analyzing data in science, or analyzing historical and contemporary data to understand the present, data can be found in many classroom activities. A trend within CIRCL projects was harnessing the data revolution. With more and more data collected each day and the accessibility of data for you and your students, there are many ways you can learn from these projects and bring them into play within your classroom. For example, check out these two projects: STEM Literacy through Infographics and Data Clubs.

STEM Literacy Through Infographics

Created by a team of researchers from across the US, STEM Literacy Through Infographics focuses on helping students create infographics in both classroom and out-of-school settings to make visual, argumentative claims (watch a 3-minute video about the project). The project aims to provide students with the skills they need to ask and answer questions about their own lives and communicate that data to others through data journalism. This ambitious project brings together experts in educational technology, mathematics, and learning, and mashes up data science, data visualization, and citizen science opportunities to help students make sense of the data available to them. If you’re interested, you can try out infographics in your classroom using their helpful step-by-step guide “How to Make Infographics”, classroom lesson plans and resources, and an asynchronous professional development workshop.

Data Clubs

Data Clubs are designed by a team of researchers at TERC and SCIEDS to provide middle school students with data skills that allow them to visualize and analyze data about life-relevant topics. Within the lessons, students write their own questions, collect their own data, and learn how to represent their data by hand and with computer software. This ongoing ambitious project uses design-based research, collecting data about students’ data dispositions and interviewing them about their experiences. It mashes up mathematics, informal learning, data visualization, and statistics to help students think about the who, when, where, how, and why of data. Try out the currently available modules with your students!

These projects demonstrate the importance of quality data experiences for students and the role that data visualization can play in students’ learning from the large data sets available to them. Besides trying out materials from these projects, how can you use data science in your classroom? Here are some ideas:

  • Explore infographics on Information is Beautiful and have students create their own by hand (as seen in Dear Data) or using a computer program.
  • Engage students with visualizations of climate change on Climate.gov, run by NOAA. The platform provides a number of data visualization environments in which students can explore climate data.
  • Explore the Johns Hopkins US or World Coronavirus maps to discuss current trends (click on counties to see more specific data)
  • Explore data visualization of the 2020 election from Statista or CISA to discuss trends in voting and the role that data visualizations play in data communication (consider showing this clip of CNN analyzers using live data visualizations to discuss visualizations in election reporting)
  • Allow students to use data from Our World in Data and/or the CODAP platform to explore data and create their own visualizations (a starter sketch for a simple visualization in code follows this list)
  • Lead students in a discussion about data collection in their lives and the amount of data collected from their use of social media, online shopping, and other internet-connected activities. Provide students with the opportunity to critically analyze how companies are making money off of their data collection and what they could do to advocate for and protect themselves from harmful data use.
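For classrooms where students are ready to build visualizations in code, here is a small starter sketch using the pandas and matplotlib Python libraries. The data is invented for illustration; real data could come from Our World in Data, a CODAP export, or students’ own collections.

```python
# A starter sketch for a classroom data visualization. The noise-level
# numbers below are made up for illustration.
import pandas as pd
import matplotlib.pyplot as plt

data = pd.DataFrame({
    "day": ["Mon", "Tue", "Wed", "Thu", "Fri"],
    "cafeteria_noise_db": [62, 65, 71, 68, 74],
})

data.plot(x="day", y="cafeteria_noise_db", kind="bar", legend=False)
plt.ylabel("Noise level (dB)")
plt.title("How loud is our cafeteria this week?")
plt.tight_layout()
plt.show()
```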

How do you use data in your classroom? Tweet us @EducatorCIRCLS and tell us about your innovative technology use and stay tuned for future blogs in this series about CIRCL Ambitious Mashups.

Ambitious Mashups and CIRCLS

By CIRCL Educators

CIRCL, the Center for Innovative Research in Cyberlearning, has come to an end, but don’t worry, we’re getting ready to roll over to a new project called CIRCLS, the Center for Integrative Research in Computing and Learning Sciences. Stay tuned here and we’ll keep you apprised of any changes. Of course we’ll still be working to bridge practice and research and share what CIRCLS is doing and what we, as educators, are thinking about and facing in our work. If you’d like to get more involved with our work, please contact us! We’re looking for more educators to think and write with.

In the meantime, before we transition to CIRCLS, we want to dive into the final report from CIRCL. In it, we reflect on what we’ve learned since 2013 when CIRCL started. The world and technology have both changed quite a bit. Over the years, CIRCL worked with the approximately 450 projects funded by the National Science Foundation through their Cyberlearning program. The term Cyberlearning is a hard word to grasp, but the program and the projects in it were about using what we know about how people learn and creating new design possibilities for learning with emerging technology. In addition, in a 2017 report, we noted a strong commitment to equity in the CIRCL community. That commitment continues and is discussed in our final report with recommendations for future work to strengthen this important theme.

One thing we were struck by, in the review of the projects, was that there were many innovative designs to enhance learning with technology. As we tried to categorize the projects, we noticed that most contained combinations of multiple technologies, learning theories, and methods. While this may sound confusing, these combinations were purposefully designed to help augment learning and deepen our understanding of the technologies and how people learn. We looked for a term to explain this phenomenon and couldn’t find one, so we came up with a new one: ambitious mashups. Beyond the importance of mashing things up, the report discusses several other findings as well.

Next week, we’ll be part of a webinar and talk through the different sections of the report. The webinar welcomes practitioners who want to learn more about research on emerging technologies from NSF-funded projects. While the projects aren’t always ready for use in a school today, they offer ideas for new projects and new ways to think about how to use technology to support learning. The ambitious mashup projects think about learning in different ways and show how grounding activities in what we know about how people learn can help meet learning goals and outcomes. Ambitious mashups are usually exciting and spark new ideas. CIRCL Educator Sarah Hampton says CIRCL reports can “help you get excited about the future landscape of education.”

We invite you to join us to learn more in the webinar, Ambitious Mashups and Reflections on a Decade of Cyberlearning Research.
Date: 10/28/2020
Time: 4 pm Eastern / 3 pm Central / 1 pm Pacific

Register

Book Review: You Look Like a Thing and I Love You

By Judi Fusco

During CIRCL Educators’ Summer of Artificial Intelligence (AI), I read the book You Look Like a Thing and I Love You: How AI Works and Why It’s Making the World a Weirder Place1, by Dr. Janelle Shane. I got the recommendation for it from fellow CIRCL Educator, Angie Kalthoff.

I found the book helpful even though it is not about AI in education. I read and enjoyed the e-book and the audio version. As I started writing this review, I was driving somewhere with one of my teenagers and I asked if we could listen to the book. She rolled her eyes but was soon laughing out loud as we listened. I think that’s a great testament to how accessible the book is.

Teaching an AI

Many of us use AI products, like Siri or Alexa, on a regular basis. But how did they get "smart"? In the book, Dr. Shane writes about the process of training machine learning2 systems to be "intelligent." She tells us how they certainly don't start smart. Reading about the foibles, flailings, and failings that she has witnessed in her work helped me understand why it is so important to get the training part right, and what needs to be considered as new products are developed.

Dr. Shane starts out by comparing machine learning and rule-based AI systems, two very different approaches. Briefly, a rule-based system uses rules written by human programmers as it works with data to make decisions. By contrast, a machine learning algorithm3 is not given rules. Instead, humans pick an algorithm, give it a goal (maybe to make a prediction or decision), and give it example data that helps it learn4; the algorithm then has to figure out how to achieve that goal. Depending on the algorithm, it may discover its own rules (for some algorithms, this means adjusting weights on the connections between inputs and outputs). From the example data, the algorithm "learns," or rather, it improves what it produces through its experience with that data. It's important to note that the algorithm, not a human programmer, is doing the work to improve. In the book, Dr. Shane explains that after she sets up an algorithm with a goal and gives it training data, she goes to get coffee and lets it work.
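
To make the contrast concrete, here is a tiny sketch in Python. This is my own illustration, not from the book, and the "spam filter" features and numbers are invented. The first part is a rule a human wrote; the second is a perceptron-style learner that discovers its own weights from labeled examples.

# Rule-based AI: a human writes the decision rule explicitly.
def rule_based_is_spam(num_links, has_greeting):
    return num_links > 3 and not has_greeting

# Machine learning: pick an algorithm (here, a tiny perceptron), give it a
# goal (match the labels) and example data, and let it adjust its own weights.
examples = [((5, 0), 1), ((1, 1), 0), ((4, 0), 1), ((0, 1), 0)]  # ((num_links, has_greeting), label)
weights, bias = [0.0, 0.0], 0.0
for _ in range(20):                          # repeated passes over the data
    for (x1, x2), label in examples:
        prediction = 1 if weights[0] * x1 + weights[1] * x2 + bias > 0 else 0
        error = label - prediction           # how wrong was the guess?
        weights[0] += error * x1             # nudge the weights toward the goal
        weights[1] += error * x2
        bias += error
print(weights, bias)                         # the "rule" the algorithm found for itself

The division of labor is the point: in the rule-based version, the human supplies the decision logic; in the learned version, the human supplies only the goal and the examples.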

Strengths and Weaknesses

There are strengths and weaknesses in the machine learning approach. A strength is that, as the algorithm tries to reach its goal, it can detect relationships and features in the data that the programmer may not have thought were important, or may not even have been aware of. This can be either good or bad.

On the positive side, an AI sometimes tries a novel solution because it isn't bogged down by knowledge of the world's constraints. However, not knowing about those constraints can also lead to impossible ideas. For example, Dr. Shane discusses how, in simulated worlds, an AI will try things that won't work in our world because it doesn't understand the laws of physics; to help it, a human programmer needs to specify what is and isn't possible. An AI will also take shortcuts that reach the goal but may not be fair. In one case, an AI playing a game discovered that a specific move would make its opponent's computer run out of RAM and crash, so it made that move every time and always won. Dr. Shane describes many other instances where an AI exploits a weakness to look like it's smart.

In addition, machine learning can highlight and exacerbate problems in its training data. Much training data comes from the internet, and much of the data on the internet is full of bias. When biased data are used to train an AI, the biases and problems in the data become what guide the AI toward its goal, so the biases found on the internet become perpetuated in the decisions the algorithms make. (Read about some of the unfair and biased decisions that have occurred when AI was used to make decisions about defendants in the justice system.)

Bias

People often think that machines are "fair and unbiased," but this can be a dangerous perspective. Machines are only as unbiased as the humans who create them and the data that train them. (Note: we all have biases! Also, our data reflect the biases in the world.)

In the book, Dr. Shane says machine learning algorithms work by "copying humans"; they don't find the "best solution" or an unbiased one, they seek "what the humans would have done" (p. 24) in the past, because of the data they were trained on. What do you think would happen if an AI were screening job candidates based on how companies typically hired in the past? (Spoiler alert: hiring practices do not become less discriminatory, and the algorithms perpetuate and extend biased hiring.)
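
Here is a tiny, made-up illustration of that spoiler (again my own sketch, not from the book; the "school A" feature and all the numbers are invented). The "model" simply copies past hiring rates, which is enough to show the failure mode.

from collections import Counter

# Synthetic history: (years_experience, attended_school_A, was_hired).
# In this invented past, hiring tracked school A, not experience.
past_hiring = [
    (2, True, True), (8, False, False), (1, True, True),
    (9, False, False), (3, True, True), (7, False, False),
]

# "Training": tally hire outcomes for each value of the school feature.
outcomes = {True: Counter(), False: Counter()}
for _experience, school, hired in past_hiring:
    outcomes[school][hired] += 1

def screen(years_experience, attended_school_A):
    # Predict "hire" the way the past data would have -- bias included.
    return outcomes[attended_school_A][True] > outcomes[attended_school_A][False]

# A highly experienced candidate who didn't attend school A is screened out.
print(screen(10, False))  # False: the model copied the old pattern

Nothing told this model that school A should matter; it reproduced the pattern in the past decisions because that is all it was given.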

A related problem comes about because machine learning AIs make their own rules. In many machine learning algorithms, these rules are not explicitly stated, so we (humans, aka the creators and the users) don't always know what an AI is doing. There are calls for machine learning systems to write out the rules they create so that humans can understand them, but this is a very hard problem and it won't be easy to fix. (In addition, some algorithms are proprietary, and companies won't let us know what is happening.)
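
One family of models whose learned rules can be written out is the decision tree (see note 3 below). Here is a hedged sketch assuming the scikit-learn library, reusing the invented spam-filter data from earlier:

from sklearn.tree import DecisionTreeClassifier, export_text

X = [[5, 0], [1, 1], [4, 0], [0, 1]]   # [num_links, has_greeting]
y = [1, 0, 1, 0]                       # 1 = spam, 0 = not spam

tree = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)
print(export_text(tree, feature_names=["num_links", "has_greeting"]))
# Prints the if/else rules the tree discovered, something like:
# |--- num_links <= 2.50
# |   |--- class: 0
# |--- num_links >  2.50
# |   |--- class: 1

Getting this kind of readable printout from a deep neural network is exactly the hard, unsolved problem Dr. Shane describes.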

Integrating AIs into our lives

It feels necessary to know how a machine is making decisions when it is tasked with decisions about people's lives (e.g., prison release, hiring, and job performance). We should not blindly trust how AIs make decisions; an AI has no idea of the consequences of its decisions. We can still use them to help us with our work, but we should be very cautious about the types of problems we automate. We also need to ensure that an AI makes clear what it is doing, so that humans can review the automation, override decisions, and understand the consequences of an incorrect decision. Dr. Shane reminds us that an "AI can't be bribed but it also can't raise moral objections to anything it's asked to do" (p. 4).

In addition, we need to ensure the data we use for training are as representative as possible to avoid bias, make sure the system can't take shortcuts to meet its goal, and make sure the systems work for many different populations (e.g., across gender, race, and learning differences). Also, an AI is not as smart as a human; in fact, Dr. Shane shares that most AI systems using machine learning (in 2019) have the approximate brainpower of a worm. Machine learning can help us automate tasks, but we still have a lot of work to do to ensure that AIs don't harm people.

What are your thoughts or questions on machine learning or other types of AI in education? Tweet to @CIRCLEducators and be part of the conversation.

Thank you to James Lester for reviewing this post. We appreciate your work in AI and your work to bring educators and researchers together on this topic.

See a recent TED Talk by author Janelle Shane.


Notes:

  1. Read the book to find out what the title means!
  2. Machine learning is one of several AI approaches.
  3. Machine learning is a general term that also includes neural networks and the more specialized neural network class of deep learning. Note also, a famous class of ML algorithms that produce explicit rules is the decision-tree family.
  4. Some algorithms “learn” with labeled examples and some without, but that’s a discussion beyond the scope of this post.

EAGER: MAKER: Studying the Role of Failure in Design and Making

by Angie Kalthoff

One of the videos in the STEM for All Video Showcase covered learning what failure means. In the video, Alice Anderson says, "we really heard from educators that they wanted their learners to struggle and persist through struggle and problem solve." I connected with what Alice described; it made me think, "This is it! This is what I want too!" This blog post shares my reflections on the video. I recommend watching it so you can think along with me about the role of failure in learning.

As an educator, I value a growth mindset: being able to say to myself, I don't know it yet, but I can work towards it and try my best. I work to instill this mindset in my students at the early childhood, elementary, and university levels. When reviewing this STEM for All project, I connected with a comment made by Adam Maltese: "My sense is that the best way to achieve this is to create a culture where iteration toward improvement is a core ideal."

There’s often not time nor the correct culture for iteration in schools. With my students, I use the word FAIL as an acronym: First Attempt In Learning; I tell my students that I expect them to fail, I fail too. I know it can be frustrating to fail and I try to support them with read-alouds of books to show how the characters have dealt with failure.  (Side note, for inspiration and to show grit, Rosie Revere, Engineer books are great to share with students.) I will share more from the video about projects and findings around failure and some ideas for the classroom.

Overview of the program featured in the video

Over the last few years, the research team that created the video explored how kids (9-15 years of age) engage in making, in both formal and informal contexts, to see how they respond when things do not go as planned. They looked at how youth reacted to moments of failure and what role adults play in these experiences, in hopes of finding ways to help kids persist through frustration, keep coming back, and understand how they define failure.

Kids in this study were from three locations: a museum-based maker program, middle school classrooms, and an after-school making and tinkering program.

Around persistence, the team considered whether the attitude a student displayed was related to how familiar the learner was with the tools and materials. Could familiarity have influenced persistence? For example, if a student gave up after only a few attempts with a new material but persisted much longer on another project, was there a correlation between experience with a tool and the amount of time spent problem solving?

How to use in practice

If you’re interested in thinking more about failure, this project gave some practical tips teachers can implement in their classrooms. To illustrate, I make connections from their suggestions to similar things I’ve done in classrooms.

They suggest: Save failures to learn from.

After attending DevTech professional development and working with students in the Early Childhood Technology (ECT) Graduate Certificate Program at Tufts University (where I currently work), I was inspired to keep a Kibo "hospital" of broken parts (Kibo is a robot for young children, available through Kinderlab Robotics). Instead of throwing away parts that no longer work, we keep them for kids to see and explore. They get to examine broken motors and come to understand why it's important to use them appropriately. By keeping the things that don't go right, the fails, kids get to learn from the experience. In this STEM for All research project, the researchers explain how, at one summer camp, they created a Museum of Bent Nails: they took the frustration of learning to hammer nails, and the sense of failure one might feel when a nail bends, and turned it into a learning experience. Fellow CIRCL Educator Sarah Hampton also loves how it turns all the hard work and learning into a badge of honor; it's such a tangible way to value failure as a necessary part of the process!

They suggest: Facilitate learning, don’t fix things that aren’t working.

When I was first learning to facilitate kids' programming, I attended a Code.org workshop. There, I heard a suggestion: pretend to hold a teacup, or actually hold something like a cup or a book, to stop yourself from taking control of a device to fix a broken program. By doing this as I walk around, I'm not tempted to touch student projects and fix things for students. Instead, I ask questions to help when students are stuck. (It's really easy not to realize you're doing things FOR students!)

Some tips from educators in the video:

  • Keep your hands behind your back while talking with learners (so you don't handle their projects)
  • Ask for permission before touching student projects
  • Suggest that learners ask two other people before asking a teacher

They suggest: Take time to reflect on your own behaviors.

Related to facilitating learning and not doing it for the students, co-presenter Amber Simpson shared what she did: "I decided to wear a GoPro camera to capture my interactions with upper elementary students engaging in making activities. It is alarming what you learn in watching yourself on video as I was not necessarily modeling appropriate behavior for the undergraduate students I was working with in the space. I found myself not allowing the elementary students to experience failure as much as I thought (or hoped for). However, being on this project has made me aware of such instances and trying to be mindful of my response (or not) to failures not only in making contexts but other contexts such as an academic setting."

If you have the resources, it is great to watch yourself on video. Okay, it might be a little painful, but the insights are so important.

They suggest: Think about how the word FAILURE is used.

The researchers discussed the reluctance they saw in educators to use the term failure. For K12 classroom educators, it can be hard to embrace because of the need to assign grades. Informal educators, who are often bound by the need to make the experience fun, may find the word failure antithetical to their purposes. A teacher's background also relates to how they use the word. For example, educators with an engineering background are very familiar with iteration in a design cycle and bring that in. Educators with an artistic background talk about the process of creating and never reaching "the end." That notion can be daunting (to think one is never done) or comforting (to know you can always continue to improve).

As an educator, I am still curious about a few other things related to practice:

  • Mindset around failure. What were people already thinking, and how did past experiences influence how they approached failure?

I know some of my students are more ready to think about and handle failure.  How can I help all of them?

  • Working through struggles. How can adults help kids redefine failure as a chance to try something new?

I have some new ideas, but I’m going to keep thinking about this.

  • Developing practical experiences around struggle. Can a particular experience be designed to help all kids and adults become comfortable with struggle?

Again, no easy answer, I’ll keep thinking here.

Practical note: I discussed this project and the idea of failure with Sarah Hampton. She and I agree that it is important to build the iteration/design process into lessons, and yet we find it hard to make the time given current academic expectations and demands in the school day. If you have suggestions for us, please share via Twitter at @circleducators and #CIRCLedu.