
A Cyberlearning Project looking at Collaboration

By Judi Fusco

Our last post discussed embodied learning and Cyberlearning. Cyberlearning is many different things; the CIRCL site has an overview of Cyberlearning. In this post, we’ll look at another example: a new Cyberlearning project developing technology that may help support teachers and the collaborative learning process.

It can be difficult to understand what is happening during collaborative work in a classroom when there are multiple groups of students and just one teacher. In a previous post we discussed how hard it is for an administrator to walk into a classroom, approach a group, and instantly understand what the students are doing. It’s also hard for teachers because they can’t be in all of the groups at the same time. Of course, teachers wish they could be a fly on the wall in each group so they could ensure that each group is staying on-task and learning, but that’s impossible. Or is it?

At the end of that previous post, I asked whether cyberlearning researchers could help create tools to better understand collaboration. When I did that, I was kind of setting myself up to introduce you to a Cyberlearning researcher, Cynthia D’Angelo. She has a project that may lead to a new Cyberlearning tool addressing the problem that a teacher can’t be in more than one place at a time. Watch this 2-minute video about Speech-Based Learning Analytics for Collaboration (SBLAC) and see what you think.

Cynthia’s research is still in early stages, but all the practitioners I’ve told about it find it interesting and want it for their classroom. Here’s a little more about the project:

In this project, the researchers are working to determine whether technology that examines certain aspects of speech — such as the amount of overlapping speech or prosodic features (like pitch or energy) — can give real-time insights into a group’s collaborative activities. If it can, and SBLAC makes it into classrooms, then teachers could get instant information about how each group’s collaboration is going, even when they aren’t present in that group.

The proposed technology would require a “box” of some sort to sit with each group and analyze the group’s speech features in real time. One research question in the project is, “Are non-content-based speech features (such as amount of overlapping speech or vocal pitch) reliable indicators for predicting how well a group is collaborating?” Initial results suggest this is promising. (Note: this technology doesn’t analyze the content of the students’ speech, just features of the speech. Hopefully, this helps to preserve student privacy.)
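To make the idea more concrete, here is a minimal sketch (in Python) of how non-content speech features like these might be computed for one group. This is not the project’s actual system: it assumes a separate, hypothetical voice-activity/diarization step has already marked which students are speaking in each half-second frame, and it runs on simulated data. Real prosodic features such as pitch would require signal processing on the audio itself.

```python
import numpy as np

def overlap_and_energy_features(voiced, energy, frame_sec=0.5):
    """Compute simple non-content speech features for one group.

    voiced: boolean array, shape (n_speakers, n_frames); True where a
            speaker is talking in a frame (assumed to come from a prior
            voice-activity/diarization step -- hypothetical input).
    energy: float array, shape (n_speakers, n_frames); per-frame signal
            energy for each speaker's channel.
    frame_sec: duration of one analysis frame, in seconds.
    """
    speaking_counts = voiced.sum(axis=0)           # speakers talking in each frame
    speech_fraction = (speaking_counts > 0).mean() # fraction of frames with any speech
    overlap_fraction = (speaking_counts >= 2).mean()  # frames with overlapping speech
    turn_share = voiced.mean(axis=1)               # how much each speaker talks
    participation_spread = turn_share.std()        # low spread = more balanced participation
    mean_energy = energy[voiced].mean() if voiced.any() else 0.0
    return {
        "speech_fraction": float(speech_fraction),
        "overlap_fraction": float(overlap_fraction),
        "participation_spread": float(participation_spread),
        "mean_voiced_energy": float(mean_energy),
        "minutes_analyzed": voiced.shape[1] * frame_sec / 60.0,
    }

# Toy usage: 4 students, 10 minutes of half-second frames (simulated data).
rng = np.random.default_rng(0)
voiced = rng.random((4, 1200)) < 0.3
energy = rng.random((4, 1200)) * voiced
print(overlap_and_energy_features(voiced, energy))
```

The point of the sketch is simply that features like “how much overlapping speech was there” or “how evenly was talk distributed” can be computed without ever transcribing or analyzing the words students say, which is consistent with the privacy-preserving framing described above; a real system would feed such features into a model trained against human judgments of collaboration quality.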

It’s important to support groups during collaboration because sometimes groups aren’t effective or an individual student gets left behind. This work, while it is still in early stages, could potentially help teachers identify groups having problems during collaboration. A teacher would no longer have to guess how a group was working when s/he wasn’t present and could target the groups having difficulties to help them improve.

If you want to learn more about the project, watch Cynthia’s 3-minute video shared at the NSF 2016 Video Showcase: Advancing STEM Learning for All. Or you can read the NSF award abstract. Stay tuned, as we’ll have more about this project from two teachers who are working with Cynthia on SBLAC this summer.

SBLAC requires teachers and researchers to work together on the hard problem of collaboration as it tries to create new tools to help in the classroom. What do you think of the idea? What do you think is hard or important about collaboration? What kind of feedback would you want on the groups in your classroom? Could SBLAC help administrators understand collaboration? Going forward, we’ll talk more about collaboration and collaborative learning, so feel free to leave questions or comments about collaboration, too.

NSF Video Showcase


By Judi Fusco

Hey CIRCL Educators, this year and last year, NSF researchers have made short videos to share information about their projects.   I think the videos are great, as they introduce people to different NSF projects, let you know how you can get involved with those projects, and provide inspiration.  

Check out the Cyberlearning Project Videos, vote, and give feedback, if you have time! (I know it’s that crazy time of year, with the end of school and everything else going on, but these videos are worth watching.) The opportunity to ask a question or give feedback on the videos is open for another few days; the videos will be available indefinitely. Please share with your colleagues! If you have a favorite project you’d like to see featured and discussed here on the CIRCL Educators’ blog, let us know!

P.S. Here’s what I’ve watched so far today…

STEM Learning through Infographics
Diverse Learning Technologies
Using Data Visualizations to Empower Informal STEM Educators 
Speech-Based Learning Analytics for Collaboration
A cyber-ensemble of inversion, immersion, collaborative workspaces, query and media-making in learning
and
Project-Based Inquiry Science (PBIS) CyberPD System with 24/7 Online Resources and 3-D Learning Support


Perspective on learning from an administrator

By Judi Fusco

Today, for something completely different, I include snippets from conversations with Katie Hong, an administrator at a school-wide Title 1 middle school in a large school district. Katie is also a doctoral student pursuing her Ed.D. in the Pepperdine EDLT program.

One of the first things Katie told me was how Keith Sawyer got it right when he said, “Many teachers spend their entire careers mastering the skills required to manage an instructionist classroom, and they understandably have trouble envisioning a different kind of school” (Sawyer, 2014, p. 3). Teachers are told to implement Common Core Standards with student-driven learning, emphasizing collaboration, but they have not been equipped to implement or facilitate constructivist methods in their classrooms. Compounding the problem, administrators often evaluate teachers based on the instructionist view, and as they evaluate, they convey to teachers that they want to see traditional classroom practices. When Katie was a young teacher, she did student-driven, collaborative lessons; in one on Mesopotamia, students worked together exploring the role of irrigation and how it affected the growth of civilization. Her principal walked in to evaluate her and was a little miffed because the class wasn’t quiet. He told her he’d come back when she was “teaching,” as he couldn’t evaluate her with her students so off-task.

Administrators have huge power over teachers, and teachers often continue to focus on traditional classroom practices because they want to please their administrator, receive an effective evaluation, and be viewed as an effective teacher by their colleagues. Administrators aren’t completely to blame, as there aren’t good evaluation instruments or tools to help them evaluate constructivist methods or classes doing cooperative learning. Also, many administrators lack sufficient knowledge about student-driven methods and collaboration.

As Katie and I have continued talking, she has made many observations that have stayed with me. She spoke about how an ideal teacher evaluation would involve much more time than it’s given. Often there’s only time for one classroom visit with a pre- and post-meeting, but it would be better to have visits on a continuous basis throughout the year. She told me that she, as an administrator, would like to observe teachers facilitating student-driven lessons, but teachers often don’t use student-driven lessons on days she’s evaluating them unless she specifically asks them to in their pre-meeting. She also wishes she had tools to help her understand more quickly what is happening when she walks into a classroom where students are collaborating. When there are a lot of groups, it can be hard to understand and evaluate what is occurring. And the forms she has to use for evaluation often involve answering many questions that may not capture the most important details. For her own research, she’s interested in thinking about how to help administrators evaluate a constructivist classroom effectively. She said, “I want to see the interaction with the students and teacher and how the teacher facilitates–that would be my ideal observation. I learn so much more when I talk to the students. I want to see if they can synthesize material and apply it. I know the teacher knows the material. I don’t need to see them lecture. I want to observe what the students have learned and understand.”

Thanks for the important perspective, Katie. We’ll have more of your thoughts on student-driven learning in another post, soon. Administrators and teachers, what are your thoughts about teacher evaluations and student-driven learning? What do you need to be successful? If you teach teachers, do you talk with them about the topics covered in this blog post? Cyberlearning researchers, can we help Katie with some new tools for evaluation of student-driven collaboration?  

Sawyer, R. K. (2014). Introduction: The new science of learning. In R. K. Sawyer (Ed.), The Cambridge handbook of the learning sciences (2nd ed., pp. 1-18). New York: Cambridge University Press.

Learning Scientists and Classroom Practice

By Judi Fusco

As I promised in the previous post, here’s a closer look at Tesha Sengupta-Irving and Noel Enyedy’s 2015 article, one study that shows the kind of work learning scientists do in classrooms with teachers.

Some teachers (and principals, parents, and others) question whether student-driven (open) pedagogies work for students; they worry that if students are on their own, they might waste valuable instructional minutes, especially in math classes. However, by exploring data, discussing and debating, and constructing their own understanding, students in a student-driven, open instructional approach achieve the instructional goals of the course as well as students in a teacher-led (guided or instructivist) approach. In addition, and importantly, students seem to enjoy learning mathematics more when taught with an open or constructivist approach than with a guided approach. In their article, Sengupta-Irving and Enyedy (2015) discuss how important enjoyment is in learning, and why and how a student-driven instructional approach helps students learn.

In the study, students’ test performance was the same for both the teacher-led and student-driven approaches. So why don’t we just stick with teacher-led techniques? Why do we want to switch to more student-driven approaches? Sengupta-Irving and Enyedy, and many other learning scientists, don’t think it’s enough to create mathematically proficient students; they also want to help students develop an interest in (or even love for) the subject, which the student-driven approach helps create. Learning without enjoyment seems like a lost opportunity that may prevent students from doing well in the future. The authors think that if students learn and enjoy subjects, those students might want to go further in the subject and take more classes.

Using Learning Science as the Foundation to Build Practical Classroom Practices
So what did the students in the student-driven condition do while learning? On their own, the students started with a discussion to explore the data, tried to understand the problem, and debated the approach or solution with peers. They also experimented and, during their discussion, “invented” an understanding, in this case, of statistics. They (hopefully) invented what the teacher would have told them during a lecture. It may seem inefficient to let students invent, because, after all, we could just tell them what they need to know, but the discussion and inventing engage them, help them enjoy the subject, and strengthen their learning.

After they have gained some understanding on their own in their discussion, the teacher has a discussion with the students and helps them learn formal terms. Exploring first contrasts with what students do in the instructivist or guided condition, where the teacher tells them the formal terms, a great deal of information about the problem, what the important concepts are, and the approaches they should take in solving the problem. In the guided condition, students are not given an opportunity to explore informally.

For a long time, learning scientists have known that “telling” students after they have the opportunity to explore and develop their own understanding is more effective than telling them before they have had that opportunity (Schwartz & Bransford, 1998). Sengupta-Irving and Enyedy employ this learning science principle and find that students do well and seem to enjoy the lesson more. 

One other issue that is sometimes raised about student-driven approaches is whether students go off-task when on their own. It is true, student-driven classrooms are usually noisier than instructivist ones, but that’s because there is learning occurring—in my experience, I have found that learning is a slightly noisy phenomenon. The researchers looked at off-task behavior in the two instructional approaches and found no meaningful difference: there were slightly more instances of off-task behavior in the teacher-led condition than in the student-driven condition, and approximately the same number of minutes of off-task behavior in the two conditions. I think it’s important to note that the teacher in this research reported that she was more comfortable with the teacher-led approach. Because of that, the teacher may not have used an open approach very often, and her students may not have been as familiar with an open approach–yet there was no extra off-task behavior. To address concerns that student-driven approaches require more time, both instructional approaches used the same amount of lesson time.

I want to go back to the issue of enjoyment. If, after a lesson, students don’t want to think about it any more—because it’s boring, one of the terms the students in the teacher-led condition used to describe the lesson—then we probably have not done the best we can for the students. Sure, if we tell students about something, we’ve gotten through the lesson and are able to cross that topic off the list. But shouldn’t learning be something more than just an item on a checklist? What if learning was enjoyable and students left wanting more? Learn the same amount, in the same amount of time, with very little off-task behavior, and enjoy it = win-win-win-win. And, add the bonus that enjoyment can potentially help students in their future work and motivate them to continue their studies. I’d make time for that in my classroom.

I’d love to know what you think about the article and their findings. In future posts, we’ll talk about how to do student-driven approaches and hear from teachers who have some good tips. I’d also love to hear how you teach and what you’ve seen or experienced in your classroom. Below you can read more details of the study.

Sengupta-Irving, T., & Enyedy, N. (2015). Why engaging in mathematical practices may explain stronger outcomes in affect and engagement: Comparing student-driven with highly guided inquiry. Journal of the Learning Sciences, 24(4), 550-592, DOI: 10.1080/10508406.2014.928214.

Schwartz, D. L., & Bransford, J. D. (1998). A time for telling. Cognition and Instruction, 16(4), 475-522.


Details of the study
In the study, one 5th grade classroom teacher taught two sets of students the same mathematics topic, for the same amount of time, using two different approaches: open (student-driven; 27 students) and guided (instructivist; 25 students). The teacher was more comfortable with the guided approach, but had learned how to facilitate the open method and taught one class of students that way. The data collected included written assessments of the students’ work (a test), a survey inquiring about the students’ affect during the lessons, and video of the 5 hours of class time devoted to the topic in each instructional approach. The researchers report three main findings based on their analysis of these data:

  1. Assessment data showed that when students were given the opportunity to explore and solve problems in an open way, working with their peers, they performed just as well as students in the guided (instructivist) condition.
  2. Survey responses indicated that students in the open condition enjoyed the lesson significantly more than students in the guided condition. Also, students in the open condition did not make any negative affect statements, but guided students did. (“Bored” was one of the negative affect statements made by the guided students.)
  3. Video analysis showed that the amount of time spent in teacher-student interaction and in students working together was very similar across the two conditions. For example, in both conditions a little over 3 hours were spent in whole-class activity and about 2 hours in small-group work; during the small-group work, adults spent about 1.5 hours helping the students with the lesson or managing behavior. Off-task time was roughly equal in the two conditions: there were 18 off-task instances (involving approximately 11 of the 300 minutes in adult intervention) in the guided condition, and 14 off-task instances (involving approximately 13 of the 300 minutes) in the open condition.

Learning, teachers, and learning scientists…oh my!

By Judi Fusco

I’m thrilled to be writing in the Educators’ Corner. I’m Judi Fusco, and I have been working with teachers (K-12 and higher education instructors and professors) for almost 20 years. I currently work at the Center for Innovative Research in Cyberlearning (CIRCL) and teach at Pepperdine University. If you want to learn more about me and my work, you can see my bio for SRI and visit the archive of Tapped In, the online community for education professionals that I helped co-found in 1997 (links to many of my publications).

I love to think about learning, and there are many directions we can go, but today I’m going to give some background about the learning sciences. Why? One reason is that I believe that with better knowledge of the learning sciences, practitioners can help researchers do a better job designing Cyberlearning tools and environments. A second reason is that researchers can help practitioners understand when learning is occurring, why it isn’t occurring, and even how to help make it occur. I think that as partners, we can do far more than we can alone. To become better partners, we need to speak the same language. This post is a start; I hope that you will join in the conversation about learning and learning science.

In the class I teach for first-year doctoral students at Pepperdine, many of whom are (fabulous) K-12 teachers, my students and I think deeply about how people learn. Our conversation starts by asking, “What are the learning sciences?” We use this book, The Cambridge Handbook of the Learning Sciences (2nd Edition), to guide much of our thinking. It’s a big book, and some students were dubious, but after reading it, they told me they enjoyed it. Below I share some of the topics we discuss as we read the first chapter of the book. If you like, the first (introductory) chapter is available as a free sample from Amazon so you can read it. If you want a quick summary, here are some of the things my students and I typically discuss when we read:

Who are learning scientists? Learning scientists are people from diverse backgrounds who care about how people learn in schools, in museums, in after-school organizations, on the job, or anywhere. They have a deep understanding of cognitive processes (what happens in the mind of the learner during learning) and social processes (what happens in situations and interactions between students and other students, and between students and teachers), and they use their knowledge of learning to design and improve the settings that they study. Their research and studies are often done in partnership with practitioners and students to see learning theories at work in the real world, not just as theory. Some learning scientists started their careers as teachers or other practitioners, so they have a very good understanding of what the real world is like.

Traditional approach to schooling versus new thinking. Instructionism, or the teacher as the expert telling students what they need to know and the students accepting it without question, is the traditional model of schooling (Papert, 1993). Many of us have experience being lectured to by teachers who subscribe to instructionism. Instructionism assumes that we can fill empty minds with knowledge and that presenting the same materials to different learners will produce the same results. Instructionism is contrasted with constructivism, where we assume learners are different and need to construct their own understanding, based on what they already know, through interacting with new information and with others. I have heard from teachers that as the Common Core and Next Generation Science Standards (NGSS) become more widely implemented in schools, different methods for helping learners–not just telling students what they should know–are needed. In informal settings and in some schools, hands-on, active learning with inquiry- or production-centered methods is used to help learners.

Learning scientists (Sawyer, 2014) take the view that in learning situations, knowledge needs to be generated, constructed, and practiced, and that learning with others in collaborative situations works well. (That’s not to say there’s no place for lecture; there sometimes is, but lecture shouldn’t be the only mode.) Learning scientists use what they know from research in the learning sciences to design good learning situations and environments.

What is Deep Learning? There’s a lovely table on page 5 of the book (at about 33% of the free sample, if you downloaded it) that contrasts deep learning with traditional classroom practices.

According to the book, deep learning is about interrelating and interconnecting new knowledge with the knowledge a person already has. It is about helping learners see patterns and understand underlying similarities, differences, and principles. It is often done in collaboration with others through discussion of a topic. After a discussion, you’ll also have a better understanding of what you know and don’t know (even if you feel as though you have more questions than answers). Discussions allow participants to reflect on what they understand and to see that others see things differently. Unfortunately, in many schools, learners aren’t really exposed to methods that make learning relevant to them, so they often focus on grades or certification and decide to cram knowledge into their heads just in time for a test.

Sengupta-Irving and Enyedy (2015) show that when teachers and researchers think about learning goals and implement pedagogical strategies that focus on deep learning, learning becomes more relevant and enjoyable (as it should). I’ll share more about their paper in the next post and talk more about the specific work learning scientists do.

This post has gotten long, so I’ll end here. In future posts, I plan to provide you with examples from Cyberlearning research projects that have taken technologies and designed new approaches, based on learning science research, to help people learn. In this blog, I want to start a conversation because conversations with diverse groups help advance thinking. Please post comments and questions. I look forward to thinking about learning with you.

You can read more about the learning sciences and learning scientists here at the CIRCL site.  

References
Papert, S. (1993). The children’s machine: Rethinking school in the age of the computer. Basic Books.

Sawyer, R. K. (2014). Introduction: The new science of learning. In R. K. Sawyer (Ed.), The Cambridge handbook of the learning sciences (2nd ed., pp. 1-18). New York: Cambridge University Press.

Sengupta-Irving, T., & Enyedy, N. (2015). Why engaging in mathematical practices may explain stronger outcomes in affect and engagement: Comparing student-driven with highly guided inquiry. Journal of the Learning Sciences, 24(4), 550-592, DOI: 10.1080/10508406.2014.928214.