Category Archives: Culturally Responsive Teaching


Building Classroom Community through Trust

By Angie Kalthoff

Neuroscience tells us the brain feels safest and most relaxed when we are connected to others we trust to treat us well.

I recently participated in an informal conversation with other educators where we were discussing teaching and learning in a distance learning setting. Current teachers were sharing ice breakers and back-to-school activities that they were finding for their very different-than-normal back to school. I asked for resources around how to talk to kids about their current situations due to the state of the world. People are dealing with a lot of emotions right now. World events like a pandemic, wildfires, and social justice conversations around the murder of George Floyd are a lot for adults to digest. I can’t even imagine how children or teens are processing it all. In this conversation, no one had a resource to share that was specific to online learning, but we did talk about Culturally Responsive Teaching and the Brain by Zaretta Hammond. I learned about this book when I was a technology integrationist. Additionally, CIRCL Educators has been focusing on it, and on other books and topics related to social justice, bias in algorithms, techquity, and other anti-racist practices, over the past few months.

In Ms. Hammond’s book, I learned about the importance of trust. Research shows that a positive relationship between students and teachers is crucial for students to reach their fullest potential. Of course! Ask any educator and they can talk to you about the importance of relationships and trust. I experienced this early on in my teaching career. But, if you had asked me to explain why, I wouldn’t have been able to connect it to the research and history shared in this book. This phenomenon is rooted in our history, from the time when humans began living in communities for protection from animals. From those experiences, it is thought that the brain developed a social engagement system to ensure humans form communities, build trustful relationships, and work to maintain them. In this post, I will introduce research from neuroscience, discuss why we developed these systems, and explain how you can use this research in your classroom, with culturally responsive teaching in mind, to influence positive behaviors.

To start, if you are new to Culturally Responsive Teaching (CRT), one of the questions I continually ask myself is, “Am I thinking of my students whose lived experiences are different than mine, and what perspectives am I not thinking about?” In her book, Ms. Hammond focuses on questions like, “How do I treat my students who are different from me?” They could have a different skin color, speak a different language, have different abilities, and have different lived experiences than I do. And, am I building their self-esteem and creating the positive affirmations that will benefit them in life outside of my classroom? Ms. Hammond defines CRT as the process of using familiar cultural information and processes to scaffold learning. She emphasizes communal orientation and focuses on relationships, cognitive scaffolding, and critical social awareness.

I began my teaching career as an English as a Second Language (ESL) teacher in 2008. This title has since transitioned to Multilingual teacher and has also been referred to as English Language (EL) teacher and teacher of English Language Learners (ELLs). Many of my students spoke more than one language, and I appreciate the thought that has gone into the transition of the title. I learned a lot about teaching and the importance of relationships early in my career. As a white teacher who grew up in the Midwest, my background and lived experiences are very different from those of many of my students. While students in my classroom moved to Minnesota from all over the world, a large part of my student population came from refugee camps in Somalia. It was during this time I learned about the importance of building trust with students and families. I wish I had a resource like Culturally Responsive Teaching and the Brain, but I didn’t. In this post, I will share specific research-based practices that you can take into your classroom, whether it is online in a virtual setting or in person in a physical building.

Neuroscience for Teaching Practice

Affirmation

When your brain feels safe and relaxed, it releases oxytocin (the bonding hormone), which in turn makes us want to build trusting relationships with the people we are engaging with. Neuroscience tells us the brain feels safest and most relaxed when we are connected to others we trust to treat us well. How does the brain know when to do this? In most people, the brain releases oxytocin in response to simple gestures like these:

  • A smile
  • A nod of the head
  • A pat on the back
  • A touch of your arm

Affirmation in your school environment.

One way that you can bring this into your learning situation is through an affirmation. In the book, a study is described where a principal takes the time to greet each student by giving them her full attention, getting to their level, and offering a bow. Students in this study would light up based on this affirmation and respect, both figuratively and literally.

Mirror Neurons

When we are around others with whom we have a trusting relationship, mirror neurons may help keep us in sync with them. Some researchers think mirror neurons help us have empathy with others. Additionally, they may help us make and strengthen bonds. Have you ever thought about why you smile when someone smiles at you? This action may be connected to the mirror neuron system. Early studies showed that mirror neurons mirror what we see. For example, when we see a behavior such as smiling, mirror neurons in the region of our brain that relates to smiling activate. Some researchers believe that we also mirror the behavior we see by performing the behavior ourselves (smiling in this example) and that this mirroring signals trust and rapport.

This section had me searching the Internet for more information, and one analogy that often came up was “Monkey See, Monkey Do.” This makes me think about young children and babies. Have you ever had an interaction with a young one where they try to copy a noise, facial expression, or gesture? That imitation may be related to the mirror system. You can watch this introductory video if you want to learn more. (It’s from early on when we were just starting to learn about mirror neurons, but it presents many questions that researchers are still investigating.) Note, the mirror system is fascinating, but there is still much research to be done to fully understand it.

Mirror Neurons in your school environment.

While researchers are still learning more about how the mirror system works, many of the big ideas discussed are important for practice. We definitely have areas in our brains that help make us feel connected to others. It may well be that the synchronized dance of mirror neuron systems between people is what is responsible, or it may be something else. Regardless, there is no doubt that connections are related to feeling more relaxed and trusting — important for learning. As a teacher, it really is important to make a personal and authentic connection with your students.

To apply the research from this chapter and begin building a different kind of relationship, there are two things you can start working on today that relate to empathy and connection: listening with grace and building trust.

To Listen with Grace

In chapter five there are a few examples of how to listen with grace. They include:

  • Give one’s full attention to the speaker and what is being said
  • Understand the feeling behind the words and be sensitive to the emotions being expressed
  • Suspend judgement and listen with compassion
  • Honor the speaker’s cultural way of communicating

I know we are all so busy that it’s hard to sometimes take the time to be fully present, but listening and connecting in whatever ways we can is even more important in the online space.

Trust Generators

In the book, Zaretta Hammond shares five ways to help create trust; I will discuss one of those, Selective Vulnerability. I chose Selective Vulnerability due to the state of the world we are living in as we live through a pandemic. Our lives and routines have changed. For many, this means taking what we have known as education and changing it drastically. Educators who have been teaching in classrooms for their whole careers are now expected to move to an online environment. Children who have benefited from the in-person learning environment are now having to learn from a device outside of school. I think, as a learning community, we might all benefit from selective vulnerability. CIRCL Educator Sarah Hampton and I both agree that there is room to grow in being transparent with students about our own growing pains as learners.

Trust Generator: Selective Vulnerability

Definition: People respect and connect with others who share their own vulnerable moments. It means showing that your human side is not perfect.

What It Looks Like: Sharing with a student a challenge you had as a young person or as a learner, or sharing new skills you are learning and what is hard about them. In either case, the information shared is carefully selected to be relevant. Think about who you are talking to and what you have in common. The goal here is to connect and show that you are a fallible human being. A student with a very different background may not be able to relate to certain examples, and there is the possibility that your example could alienate rather than build rapport.

As I mentioned before, one of the important questions in the book is “How do I treat my students who are different from me?” I think the focus of thinking about the perspective of the person in the interaction is so important. This year has brought so much for all of us to deal with, and as teachers, we need to know who we are talking to and what experiences have shaped them, so that we can work to make connections as a foundation for teaching and learning. If you want to dig deeper into listening with grace and building trust, Ms. Hammond has that and so much more in her book.

What do you think? Connect with us on social media @CIRCLeducators and share how you show affirmation to your students!


Algorithms, Educational Data, and EdTech: Anticipating Consequences for Students

By Pati Ruiz and Amar Abbott

The 2020-2021 school year is underway in the U.S. and for many students, that means using edtech tools in a fully online or blended learning environment. As educators, it is our responsibility to consider how students are using edtech tools and what the unanticipated consequences of using these tools might be. Before introducing edtech tools to students, administrators should spend time considering a range of tools to meet the needs of their students and teachers. In a recent blog post, Mary Beth Hertz described the opportunities for anti-racist work in the consideration and selection of the tools students use for learning. Hertz identified a series of questions educators can ask about the tools they will adopt to make sure those tools are serving the best interest of all of their students. Two of the questions in Hertz’s list ask us to consider data and algorithms. In this post, we focus on these two questions and Hertz’s call to “pause and reflect and raise our expectations for the edtech companies with which we work while also thinking critically about how we leverage technology in the classroom as it relates to our students of color.” The two questions are:

  1. How does the company handle student data?
  2. Has the company tested its algorithms or other automated processes for racial biases?

To help us better understand the issues around these two questions, we will discuss the work of two researchers: Dr. Safiya Noble and Dr. Ruha Benjamin. This post expands on our previous post about Dr. Noble’s keynote address — The Problems and Perils of Harnessing Big Data for Equity & Justice — and her book, Algorithms of Oppression: How Search Engines Reinforce Racism. Here, we also introduce the work of Dr. Ruha Benjamin, and specifically the ideas described in her recent book Race After Technology: Abolitionist Tools for the New Jim Code.

Student Data

In order to understand how companies handle student data, we need to first consider the concept of data. Data are characteristics or information that are collected in a manner capable of being communicated or manipulated by some process (Wiktionary, 2020). In Dr. Noble’s keynote speech, she discusses the social construction of data and the importance of paying attention to the assumptions that are made about the characterization of data that are being collected. In her book, Dr. Noble shows how Google’s search engine perpetuates harmful stereotypes about Black women and girls in particular. Dr. Benjamin describes the data justice issues we are dealing with today as ones that come from a long history of systemic injustice in which those in power have used data to disenfranchise Black people. In her chapter titled Retooling Solidarity, Reimagining Justice, Dr. Benjamin (2019) encourages us to “question, which humans are prioritized in the process” (p. 174) of design and data collection. With edtech tools, the humans who are prioritized in the process are teachers and administrators; they are the “clients.” We need to consider and prioritize the affected population: students.

When it comes to the collection and use of educational data and interventions for education, there is much work to be done to counteract coded inequities of the “techno status quo.” In her keynote, Dr. Noble offered a list of suggestions for interventions, including:

  1. Resist making issues of justice and ethics an afterthought or additive
  2. Protect vulnerable people (students) from surveillance and data profiling

Center Issues of Justice and Ethics

As described by Tawana Petty in the recent Wired article Defending Black Lives Means Banning Facial Recognition, Black communities want to be seen and not watched. The author writes:

“Simply increasing lighting in public spaces has been proven to increase safety for a much lower cost, without racial bias, and without jeopardizing the liberties of residents.”

What is the equivalent of increasing light in education spaces? What steps are being taken to protect students from surveillance and data profiling? How are teachers and students trained on the digital tools they are being asked to use? How are companies asked to be responsible about the kinds of data they collect?

Schools have legal mandates meant to protect students’ rights, such as the Family Educational Rights and Privacy Act (FERPA) in the U.S. and other policies that protect student confidentiality regarding medical and educational records. Although a lot of forethought has gone into protecting students’ confidentiality, has the same critical foresight been applied when purchasing hardware and software?

In Dr. Noble’s keynote speech, she described the tracking of students on some university campuses through the digital devices they connect to campus Internet or services (like a library or learning management system). The reasoning behind tracking students is to allocate university resources effectively to help students be successful. However, in this article, Drew Harwell writes about the complex ethical issues regarding students being digitally tracked and an institution’s obligation to keep students’ data private. So, before software or hardware is used or purchased, privacy and ethics issues must be discussed and addressed. Special energy needs to be dedicated to uncovering any potential “unanticipated” consequences of the technologies as well. After all, without the proper vetting, a bad decision could harm students.

Protect Vulnerable Students

Protecting vulnerable students includes being able to answer Hertz’s question: “Has the company tested its algorithms or other automated processes for racial biases?” But, even when a company has tested its algorithms and automated processes, there is often still work to be done because “unanticipated” results continue to happen. Twitter spokesperson Liz Kelley recently posted a tweet saying: “thanks to everyone who raised this. we tested for bias before shipping the model and didn’t find evidence of racial or gender bias in our testing, but it’s clear that we’ve got more analysis to do.”

She was responding to the experiment shown below where user @bascule posted: “Trying a horrible experiment…Which will the Twitter algorithm pick: Mitch McConnell or Barack Obama?”

Twitter’s machine learning algorithm chose to center the white face instead of the Black face when presented with an image in which the white profile picture appeared on top, followed by white space, and then the Black profile picture. It did the same when the Black profile picture appeared on top, followed by white space, and then the white profile picture.

Image: The Twitter experiment with face detection described above; the algorithm selects the white face regardless of placement.

As we can see, the selection and use of tools for learning is complicated and requires balancing many factors. As CIRCL Educators we hope to provide some guidance to ensure the safety of students, families, and their teachers. Additionally, we are working to demystify data, algorithms, and AI for educators and their students. This work is similar to the work being done by public interest technologists in the communities and organizations described by both Noble and Benjamin. We don’t have all of the answers, but these topics are ones that we will continue to discuss and write about. Please share your thoughts with us by tweeting @CIRCLEducators.


References

Benjamin, R. (2019). Race After Technology: Abolitionist Tools for the New Jim Code. Cambridge, UK: Polity Press.

data. (2020, August 12). Wiktionary, The Free Dictionary. Retrieved 15:31, August 26, 2020 from https://en.wiktionary.org/w/index.php?title=data&oldid=60057733.

Noble, S. (2018). Algorithms of Oppression: How Search Engines Reinforce Racism. New York: NYU Press.


Breaking Barriers in Computer Science via Culturally Relevant Educational Tools (Part 3)

By Joseph Chipps, Ed.D.

In the last post, I gave background on ethnocomputing and culturally situated design tools, two constructs I used to develop culturally relevant education in newer, equity-designed computer science classes (e.g., Exploring Computer Science (ECS)1 and AP Computer Science Principles (AP CSP)2). It is much more challenging to build those theories and tools into AP Computer Science A due to the use of shared tools and languages. So what can be done in AP Computer Science A?

First, we can increase sociopolitical awareness. In computing, we invite students to identify and address social inequities as developers of technology. For example, in the e-textiles unit of ECS, students construct a textile computing artifact with touch sensors and collect the ranges read by the sensor when their peers use the artifact. As each student will produce a different range, the developer must synthesize the data while coding to make distinct cases for their product to follow. Inevitably, some students will be unable to use the product due to gender or race because of the way the technology was developed. This then leads to discussions of technology as biased (e.g., airport scanners discriminating against people of color (POC) and facial recognition systems failing to recognize POC). An opportunity I took in the AP Computer Science A curriculum was to introduce the concept of variable declarations using a Student object that contains data (name, age, gpa, and gender). I invited groups of students to discuss what type of data they would make each variable, and to explain their reasoning. Some of the groups assigned gender to a boolean variable (one of two possible states, i.e., true/false) while others assigned gender to a String variable (any array of characters, e.g., “Female”). As students shared stories and engaged in dialogue, the class quickly realized that assigning gender to a boolean value transfers human bias to technology and rejects the existence of non-binary genders.
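To make this discussion prompt concrete, here is a minimal sketch of the two competing declarations. This is my own illustration of the kind of code students compare, not the official AP Computer Science A curriculum; the class and field names are hypothetical.

public class Student {
    private String name;
    private int age;
    private double gpa;

    // Option A: a boolean can only ever hold two states (true or false),
    // so this design hard-codes a gender binary into the software.
    // private boolean isFemale;

    // Option B: a String can hold any sequence of characters, which leaves
    // room for students to self-identify, including non-binary identities.
    private String gender;

    public Student(String name, int age, double gpa, String gender) {
        this.name = name;
        this.age = age;
        this.gpa = gpa;
        this.gender = gender;
    }
}

Seeing the two options side by side is usually enough to start the conversation about what each data type assumes about people.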

Second, we can implement practices from culturally relevant education by using the experiences of students as an asset in the classroom. One of the ways to attach learning to the experiences of students is to connect curriculum to the real world. For example, DiSalvo, Guzdial, Bruckman, and McKlin3 studied how Black and Latino male high school students negotiate between geeking out and being cool when testing games. When students learned it was a real job in the world, they were more likely to maintain their interest and identity in computer science. A second way to attach learning to the lived experiences of students is to connect to students’ perceptions of self (or self-identity). Frederick, Donnor, and Hatley4 conducted a meta-analysis of culturally responsive education (CRE) programs in technology education and determined that courses that utilize CRE should acknowledge students’ representations of self. To achieve this, curriculum should present diverse and realistic perspectives as well as provide spaces for students’ voices and self-expression. I cannot think of examples in computer science; however, Nichole Pinkard’s5 (2001) Rappin’ Reader and Say, Say, Oh Playmate expertly used oral traditions and play rituals, part of the prior knowledge of African-American children, as a method of early literacy instruction. Furthermore, the authors advise that those who acknowledge the lived experiences of students also need to find ways to counter harmful narratives and deficit identity formation.

Third, we can build connections with local community members. For example, Lachney6 leveraged the expertise of programmers, students, and schools in the development of the cornrow curves CSDT. The programmers worked with community hair braiders to help develop the computational patterns in the software. But how do we build culturally situated design tools that can be part of the shared tools and languages of industry? Ogbonnaya-Ogburu, Smith, To, and Toyama7 adapted critical race theory to human-computer interaction (HCI) and concluded that the technology sector is prone to interest convergence; that is, the inclusion of POC in technology requires benefits to those in power. For example, changes in designs may not occur unless they help all people, not just POC. Talking with students about this helps create awareness. This overlaps with good techquity practices.

Finally, we can invite students to personalize computing artifacts. Kafai, Fields, and Searle8 studied the experiences of students personalizing electronic textile projects. The final project of the e-textiles unit invites students to develop any textile artifact of their choice as long as it has touch sensors and LEDs, and exhibits four different behaviors based on the user input. Projects have included play mats that force arguing siblings to hold hands for a certain amount of time, t-shirts and hats that display teams and schools in bright lights, and plush dolls that play music when squeezed. The authors found that allowing for personalization showcased aesthetic designs while revealing the diversity embedded in technology development.

When I design CRE curriculum for computer science, I frame the class as project-based, where students collaboratively construct videos, images, applications, presentations, PDFs, and other artifacts to demonstrate their understanding of the material. Rather than present information for them to consume, I provide methods of inquiry, where students must reflect, research, discuss, and build their own understanding of content from their unique sociocultural context. Inquiry is situated in real-world contexts. When I can, I invite students to develop their artifacts using culturally situated design tools. To go beyond the curriculum, I challenge students to question the impacts of computing on the economy, society, and culture while providing space for ideas and dialogue, independent of a patriarchal, capitalist, and white framework. Which algorithms are biased, and how can we deconstruct bias within the logic? What does an anti-racist programming language look like? How do we counter deficit narratives in classes like AP Computer Science A, where the shared tools and languages could negatively impact purposefully excluded communities (PECs)? And to reiterate, what does anti-racist education look like when students are forced to use the shared tools and languages of a profession that purposely excludes them?

Read Part 1 of the series.

Read Part 2 of the series.

  1. Goode, J., Chapman, G., & Margolis, J. (2012). Beyond curriculum: The Exploring Computer Science Program. ACM Inroads, 3(2), 47–53. https://doi.org/10.1145/2189835.2189851
  2. Astrachan, O., Cuny, J., Stephenson, C., & Wilson, C. (2011, March). The CS10K project: mobilizing the community to transform high school computing. In Proceedings of the 42nd ACM technical symposium on Computer science education (pp. 85-86).
  3. DiSalvo, B., Guzdial, M., Bruckman, A., & McKlin, T. (2014). Saving face while geeking out: Video game testing as a justification for learning computer science. Journal of the Learning Sciences, 23(3), 272-315.
  4. Frederick, R., Donnor, J., & Hatley, L. (2009). Culturally responsive applications of computer technologies in education: Examples of best practice. Educational Technology, 49(6), 9-13.
  5. Pinkard, N. (2001). Rappin’ Reader and Say Say OH Playmate: Using Children’s Childhood Songs as Literacy Scaffolds in Computer-Based Learning Environments. Journal of Educational Computing Research, 25(1), 17–34. https://doi.org/10.2190/B3MA-X626-4XHK-ULDR
  6. Lachney, M. (2017) Culturally responsive computing as brokerage: Toward asset building with education-based social movements. Learning, Media and Technology, 42(4), 420-439. doi:10.1080/17439884.2016.1211679
  7. Ogbonnaya-Ogburu, I. F., Smith, A. D., To, A., & Toyama, K. (2020, April). Critical Race Theory for HCI. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems (pp. 1-16).
  8. Kafai, Y. B., Fields, D. A., & Searle, K. A. (2014). Electronic textiles as disruptive designs: Supporting and challenging maker activities in schools. Harvard Educational Review, 84(4), 532-556. doi:10.17763/haer.84.4.46m7372370214783

Breaking Barriers in Computer Science via Culturally Relevant Educational Tools (Part 2)

By Joseph Chipps, Ed.D.

In the last post, I gave background on my school, my situation, and the problems I was trying to address in order to bring more people of color (POC) and female students into the white and Asian male dominated Computer Science (CS) courses in my school. I also gave background on Culturally Relevant Education (CRE)1, the term I will use to encompass culturally relevant pedagogy and culturally relevant teaching.

In order to develop Exploring Computer Science (ECS)2 and AP Computer Science Principles (AP CSP)3 using a culturally relevant educational framework, curriculum developers of those courses relied on theories and tools positioned within culturally relevant education: ethnocomputing and culturally situated design tools.

Ethnocomputing attempts to bridge the gap between culture and computing in that it assumes that computing is not a neutral activity; rather, computing is informed by capitalist, patriarchal, and western logic, beliefs, and tools4. Ethnocomputing originated from the idea that computing should be taught using relevant cultural artifacts and references of the local learners; that is, the cultural contexts of the learner5. In the ECS curriculum, through collaborative practices and methods of inquiry, students develop their own understanding of computing using journal writing, dialogue, construction of culturally meaningful artifacts, and presentations. In the code.org AP CSP curriculum, students develop a protocol for sending a color image through a network by creating their own personal favicon, the little icon at the top of a browser tab. This activity allows students to develop icons from their sociocultural backgrounds; students create their own symbols for computing, and through those symbols construct meaning as well as perception of self. Furthermore, the biases of the instructor are acknowledged as students work together to construct their own ideas and interpretations of computing.

AP Computer Science A externally tests students’ understanding of Java, an object-oriented programming language. Object-oriented refers to a style of programming in which we use data structures called objects to hold data that belongs to the object (e.g., a Student’s name, age, and gpa). I give a detailed example at the end of the post that shows how I attempt to use items from students’ lived experiences to construct the rationale and embedded logic of encapsulating data within a single entity, while using a design artifact from industry to help students code-switch.

As I teach about Class and Object in Java, I know they are symbolic tools, shaped by generations of programmers over time. Even the diagram in the example below is a constructed symbol, formed by decisions and negotiations over time within the programming community. So I have to ask: am I acclimating my students to cultural norms embedded within a larger system that purposely excludes them or am I supporting their futures by teaching them tools and languages required for code-switching?

ECS and AP CSP have the privilege of using tools and languages not shared by the programming community because they were designed to exist outside of professional communities for the specific purpose of increasing participation, but activity within a course that has historically used industry standard languages will always be mediated by the shared tools and languages of the professional community. Yes, I can create student-centered activities that allow students to construct their own ideas of concepts and logics, and invite students to raise sociopolitical consciousness in their and other communities. But am I doing a disservice to those students by forcing them to construct the logic, symbols, and beliefs of a culture that purposely excludes them? Or am I helping them enter this community?

Culturally Situated Design Tools (CSDTs) support ethnocomputing in that they are collaboratively developed tools that exist outside of the shared tools of computing and are inspired by the cultures of purposefully excluded communities (PECs). For example, a collaborative project in ECS requires students to present the cultural background of Native American bead looms, connections between bead looms and mathematics, and their own authentic bead loom designs that they construct using a CSDT. Embedded in this lesson is the realization that computing and mathematical concepts are not singularly defined and owned by white, patriarchal, western history; rather, they are also embedded within Native American cultures. Alternatively, in ECS, after learning the history of cornrows, students use a CSDT to design and reflect on the mathematics of cornrow curves. Students investigate recursion through a CSDT that simulates cornrow curves. When I was taught recursion in a Java class, I heard names like Fibonacci and solved problems that required some understanding of basic number theory. Students constructing recursive artifacts using a non-western tool like a cornrow design simulator is anti-racist computing education. We are putting the stories that were removed from education back into the curriculum.
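For readers who have not seen the cornrow curves tool, here is a rough sketch of the recursive idea behind it, with the classic Fibonacci example for contrast. This is only my own illustration; the plait sizes, scaling factor, and rotation are made-up parameters, not values from the CSDT.

public class CornrowSketch {

    // Each recursive call places one "plait" that is slightly smaller and
    // slightly more rotated than the one before it, the same self-similar
    // idea the cornrow curves simulator visualizes graphically.
    static void drawBraid(double size, double angle) {
        if (size < 1.0) {            // base case: stop when the plaits get tiny
            return;
        }
        System.out.printf("plait: size %.1f, angle %.1f degrees%n", size, angle);
        drawBraid(size * 0.9, angle + 10.0);   // recursive case: shrink and rotate
    }

    // The example I was taught instead, for contrast.
    static int fibonacci(int n) {
        if (n <= 1) {
            return n;
        }
        return fibonacci(n - 1) + fibonacci(n - 2);
    }

    public static void main(String[] args) {
        drawBraid(20.0, 0.0);
        System.out.println("fibonacci(10) = " + fibonacci(10));
    }
}

The point is not the numbers but the structure: each plait is defined in terms of the one before it, which is exactly what makes the braid a natural home for recursion.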

But what does a CSDT look like when the purpose of a course is to introduce students to the shared tools and languages of the professional community? How can we leverage the experiences and voices of those who have not been included in the development of tools we use to design and execute computing? How do we promote anti-racist education when the tools and languages we use are embedded in exclusionary culture6? These were the barriers I faced when trying to implement ethnocomputing via culturally situated design tools in my AP Computer Science A curriculum. I still do not have answers. Perhaps a next step in computing is to design anti-racist computing tools and languages for industry. How can we use heritage cultural artifacts and vernacular culture to support the development of anti-racist computing tools and languages that can be used in industry as well as education? In my next post, I will explore what can be done in courses like AP Computer Science A such as increasing sociopolitical awareness, using the experiences of students, building connections within the community, and personalizing student-constructed artifacts.

Java Example Details:

For example, a class called Student would be an archetypical framework for how to define a student in a computer, and would include three parts: what data a student has (name, age, gpa, etc.); how to create a student (which data can we set initially vs. which data can be set later); and actions we can take with the student data (update data, access data, add new scores to the gpa). While a class is a template for an object, an object is an instance that we can create. For example, once I have defined the template for what a Student is in a file called Student.java, then in a runner file, I can create a Student named Alice and input all of Alice’s data. I can then store all of Alice’s data within the object called Alice. Over time, I can access and manipulate Alice’s data, and even have Alice’s data interact with other Students’ data if, for example, I want to know the average GPA of the school or any selection of Students. The concept of objects is essential to AP Computer Science A. Consequently, I developed a lesson inspired by ethnocomputing for the first week of the course that invites students to interpret experiences from their life into an object (discussed above). Figure 1 shows an example of what students must create.
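Before looking at Figure 1, here is a rough sketch of the Student and Alice example in Java. The field names follow the description above, but the specific values, the second student, and the averaging code are my own illustrative additions.

// Student.java: the class (template) describing what data a Student has
public class Student {
    private String name;
    private int age;
    private double gpa;

    public Student(String name, int age, double gpa) {
        this.name = name;
        this.age = age;
        this.gpa = gpa;
    }

    public double getGpa() {
        return gpa;
    }
}

// Runner.java: creating objects (instances of the template) and using them
public class Runner {
    public static void main(String[] args) {
        Student alice = new Student("Alice", 16, 3.8);
        Student bao = new Student("Bao", 17, 3.5);

        // Objects can interact, e.g., to compute an average GPA
        double averageGpa = (alice.getGpa() + bao.getGpa()) / 2.0;
        System.out.println("Average GPA: " + averageGpa);
    }
}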

MatzohBallSoup
– chefName : String
– ingredientNum : int
– temperature : double
+ getChefName(): String
+ getIngredientNum(): int
+ getTemperature(): double
+ setChefName(String): void
+ setIngredientNum(int): void
+ setTemperature(double): void
+ toString(): String

Figure 1. Example of object design from my Java curriculum
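For reference, the design in Figure 1 translates into Java roughly as follows. This is one way a student might implement it; the diagram only specifies the names, types, and visibility, so the method bodies and the sample values in main are my own assumptions.

public class MatzohBallSoup {
    // "–" in the diagram marks these fields as private
    private String chefName;
    private int ingredientNum;
    private double temperature;

    // "+" in the diagram marks these methods as public
    public String getChefName() { return chefName; }
    public int getIngredientNum() { return ingredientNum; }
    public double getTemperature() { return temperature; }

    public void setChefName(String chefName) { this.chefName = chefName; }
    public void setIngredientNum(int ingredientNum) { this.ingredientNum = ingredientNum; }
    public void setTemperature(double temperature) { this.temperature = temperature; }

    public String toString() {
        return chefName + "'s soup: " + ingredientNum + " ingredients at " + temperature + " degrees";
    }

    // A small demonstration, like the Alice example above
    public static void main(String[] args) {
        MatzohBallSoup soup = new MatzohBallSoup();   // uses the default constructor
        soup.setChefName("Grandma Ruth");
        soup.setIngredientNum(8);
        soup.setTemperature(180.0);
        System.out.println(soup);                     // implicitly calls toString()
    }
}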

Read Part 1 of the series.

Read Part 3 of the series.

  1. Aronson, B., & Laughter, J. (2016). The theory and practice of culturally relevant education: A synthesis of research across content areas. Review of Educational Research, 86(1), 163-206. doi: 10.3102/0034654315582066
  2. Goode, J., Chapman, G., & Margolis, J. (2012). Beyond curriculum: The Exploring Computer Science Program. ACM Inroads, 3(2), 47–53. https://doi.org/10.1145/2189835.2189851
  3. Astrachan, O., Cuny, J., Stephenson, C., & Wilson, C. (2011, March). The CS10K project: mobilizing the community to transform high school computing. In Proceedings of the 42nd ACM technical symposium on Computer science education (pp. 85-86).
  4. Tedre, M., Sutinen, E., Kahkonen, E., & Kommers, P. (2006). Ethnocomputing: ICT in cultural and social context. Communications of the ACM. 49(1), 126-130. doi: 10.1145/1107458.1107466
  5. Babbitt, B., Lyles, D., & Eglash, R. (2012). From ethnomathematics to ethnocomputing. In Swapna Mukhopadhyay & Wolff-Michael Roth (Eds.). Alternative forms of knowing mathematics (pp. 205–219). doi: 10.1007/978-94-6091-921-3_10
  6. Margolis, J., Estrella, R., Goode, J., Holmes, J.J. and Nao, K. Stuck in the Shallow End: Education, Race, and Computing. MIT Press, Cambridge, MA, 2010.