
Students sit around a large paper on the floor and draw on, look at, or point to it.

Considering Techquity in the Classroom

By Merijke Coenraad

Merijke Coenraad is a PhD Candidate in the Department of Teaching & Learning, Policy & Leadership in the College of Education at the University of Maryland. She is a former middle school teacher. Her research focuses on the intersections of educational technology and equity, including the creation of materials, platforms, and experiences in partnership with teachers and youth through participatory design methods.

Flashback to a Spanish Classroom (2016)

Chromebooks out. Hushed silence. Each student leans over their computer. Tension in the air. I yell, “GO!” and with that one word, the room erupts as groups hurriedly work together to identify vocabulary words before their classmates. In loud whispers, students ask their partners for words: “Calcetines, who has socks?” One mistake and the group will have to start over; the stakes are high, and no star student can single-handedly win the game for their peers.

Quizlet transformed flashcards, a time-consuming (and often lost or forgotten) physical learning tool, into a digital learning experience. My students practiced their vocabulary words through drills and games all week, and on Friday we played Quizlet Live.

When I was still in the classroom, I loved to bring new technology into my social studies and Spanish lessons. I got excited discovering tools like EdPuzzle and Padlet when they were first breaking onto the education stage. With 1:1 Chromebooks in my middle school classroom, there was hardly a class period where students were not somehow connected to technology, and each of these technologies meant creating a new account. Looking back, I realize that I was naïve while teaching. As I brought tool after tool to my students, I didn’t think deeply about the data collection ramifications or the way that the very tools that could enhance learning might be treating my students inequitably and perpetuating the structural racism and human biases that I worked each day to dismantle. The educational technology that I brought into my classroom had positive effects, but it also had hidden consequences, most of which I might never know.

In the four years since I left the classroom to begin my PhD, my work has come to focus on one thing: Techquity, the intersection of technology and equity. This focus is driven by the students I taught and the many times I saw technology act as both an access point and a barrier to their education. Even though I wasn’t thinking about data collection, algorithmic bias, and the effects of AI for the students in my classroom, I was still focused on how technology helped and hindered my students’ education. But those barriers and hindrances go beyond the devices and internet access I have long considered. In the last year, I have learned a lot about the forces within and around technology that cause inequities. I have learned about the Coded Gaze of AI technologies from Joy Buolamwini and the New Jim Code from Ruha Benjamin. I’ve learned about the biases inherent in the very design of technologies from Sara Wachter-Boettcher and how algorithms can be Weapons of Math Destruction from Cathy O’Neil. This learning has led me to focus not only on how I can be more cognizant of the biases of technology, but also on how I can teach students about them.

Techquity: Co-designing with Kids

To learn more about what kids think about Techquity concerns, I partnered with a youth design team to hear what they had to say about Techquity and to learn which Techquity concerns interested them most. I find that kid insight is critical whenever I am exploring new topics to teach students. The team consisted of seven Black youth between the ages of 8 and 13 who met twice a week to design technologies and learn about being designers.

Let’s look a little bit at what the kids had to say about Techquity.

While they didn’t have the vocabulary to name algorithmic bias or biases in voice recognition technology, the kids quickly began offering examples of how technologies can be good and bad and how even a single technology can have good and bad sides. For example, one group identified Siri as helpful because “she” can give information without typing, but they also worried that Siri doesn’t always understand them and that “SIRI CAN LISTEN TO US!!!!” While the AI in their phones allowed the students to access all sorts of information, they were not immune to considerations of what it meant for a device to always be listening for “Hey Siri…”

As our conversation turned and I introduced the kids to some common examples of Techquity concerns such as data collection, targeted advertising, misidentification by AI, and non-diverse tech design teams, the kids continued to describe their own examples. They could recollect times when they received targeted advertising based on location or a recent website visit.

Techquity Concerns

The common Techquity concerns we discussed included:

  • Algorithms (computer programs) don’t treat everyone fairly
  • Technology development teams are frequently not diverse
  • Alexa, Google Home, and Siri are always listening to me
  • I get personalized ads based on data companies collect about me
  • Technology is not always accessible for individuals with disabilities
  • Companies sell my data
  • Sensors and systems like Alexa, Google Home, and Siri get confused about how I look or what I say
  • People don’t understand how technology works
  • Machine learning and facial recognition aren't trained well enough to recognize everyone

The kids each ranked the Techquity concerns from “very important to me” to “not very important to me.” The two most highly ranked concerns were algorithmic bias and non-diverse tech companies. The kids were especially concerned that individuals who looked like them were not represented on design teams, even though they themselves were designers, and they wondered what this meant for the technologies being designed.

As their final design task, the kids designed ways to teach other kids about Techquity by drawing out their ideas on an online platform mimicking paper and pencil. Interestingly, the kids didn’t want to move away from technology just because it could be biased; they just wanted it to be created in more equitable ways and to be used to teach others. Their teaching often included advanced algorithms and even AI. They designed scenarios using robots and adaptive software to allow other kids to experience obvious Techquity concerns and learn from those experiences. One girl, Persinna, explicitly discussed the three-member design team shown in her game as having two girls and one boy because “that is Techquity.” Kabede felt very strongly that data collection by tech companies was a big concern. He started making connections to actual tools he knew, such as DuckDuckGo, a search engine that does not profile users and focuses on user privacy.

What I Would Consider Now If I Were Still a Teacher

I’d start from what these kids already know about Techquity and how algorithms and AI are affecting their lives, and build on that. I would educate students about the biases inherent in Google searches, which rank results based not simply on the popularity of links, as is commonly assumed, but on user profiles and advertising. I would use Kabede’s recommendation and have students use a search engine like DuckDuckGo to prevent tracking and allow for private searches. I would challenge students to think about where algorithms, AI, and technology design are already affecting their lives and how technologies might work better for some individuals than they do for others. We would talk about the sensors in automatic sinks, paper towel dispensers, and medical devices, and how those sensors rely on reflected light and often work better for people with lighter skin. We would discuss Joy Buolamwini’s experiences and work, and talk about how machine learning training sets are often not adequate to identify all people well and how this has direct consequences for the use of AI in policing and surveillance.

While the students in my classroom wouldn’t be the ones causing the technology bias, I would make sure they were aware of it and of how it had direct implications for their lives. Most of all, I would base these discussions in students’ lived experiences. Just like the kids on the design team, my students have inevitably experienced technology bias; they just might not have had words for it or known why it was happening. The more I could teach my students and bring Techquity concerns to their attention, the more they could protect themselves (and their communities) and make educated decisions about their lives with technology. I know that my middle school students wouldn’t give up their technology, and knowing about the biases held by the designers of that technology probably wouldn’t change their opinion that technology is, as Joshua said in the design session, “the best thing ever.” Still, knowing more about their digital footprint and how companies use their information gives them a small advantage. In this case, knowledge of Techquity concerns could give them power over their data and their technology use.

Student Hands

Why assessment?

by Kip Glazer, Ed.D.

Summary

In a distance learning environment, assessment can become much more challenging. This article offers six suggestions for how a high school teacher can assess students effectively to improve student learning.

Introduction

In my first article, I made four suggestions to support our staff in a distance learning environment. This article will focus on the importance of assessment and how we should leverage it in this new era of learning, which is sometimes fully remote and sometimes without large-scale standardized assessments. I suggest teachers consider six different ways to leverage assessment to improve student learning:

  1. Ask your students to create tests and quizzes
  2. Integrate student-created tests and rubrics
  3. Focus on assessing critical skills
  4. Give students a place to interact meaningfully around the subject matter
  5. Leverage peer evaluation to scaffold student learning
  6. Create consistency in grading across all similar courses

Background

Many teachers are trained to create learning experiences for our students, a practice known as teaching. Especially for secondary teachers, teaching includes creating lesson plans that deliver specialized content to our students and then giving the students assessments (i.e. quizzes and tests) to gauge what the students have learned. However, in an online learning environment, traditional assessments such as quizzes and tests are not as effective because the learning environment itself has changed.

In an in-person learning environment, many teachers rely on the publisher’s test bank or textbook questions for assessment for a variety of reasons, including a desire to align their assessments to the approved curriculum they are asked to deliver. Others use them to save time; some use them because they don’t feel confident enough to create their own assessments. Over the years, I have worked with many teachers who were not terribly thrilled with the quality of the publisher’s assessments yet used them because they felt they were not skilled enough to create test questions. Even teachers who are well trained in generating effective assessments often struggle to create them, as constructing valid assessments takes time and expertise. Furthermore, high school teachers have the added pressure of preparing students for high-stakes standardized tests such as the SAT, ACT, or AP exams that are created by experts. Even if a teacher knows and wants to implement skill- or competency-based assessment, the pressure to prepare his or her students for standardized tests can create tension. I personally experienced this as an AP English Literature teacher for many years.

Scope

Having taught only at the high school level, I do not presume to know how well this article will apply to the K-6 setting. Although some of the suggestions will likely be applicable across the 7-12 setting, I do not presume to be an expert in every subject taught in secondary schools. I intend to provide a few examples and strategies grounded in sound learning theories so that teachers can augment their instructional practices should they find this article useful.

Needs

High school teachers need their instructional leaders to provide a clear and concise standard for instruction and assessment, as the results of assessment lead to grades that colleges review as a factor in admission decisions. Variability in assessment, therefore, has many practical and long-lasting consequences. Furthermore, having a clear understanding of what is being assessed and how it will be assessed can guide instructional practices. Having good assessments is vital to measuring the effectiveness of teaching and learning.

Suggestions

In order to maximize the impact of assessment, I suggest six assessment practices. The suggestions are rooted in Papert’s Constructionism.

Learning, according to Papert, is both situated and pragmatic, and the construction of artifacts to demonstrate learning is not only useful but imperative (Papert & Harel, 1991). I argue that we should focus on moving towards more student-created assessments.

1. Ask your students to create tests and quizzes

I suggest that teachers use fact-based and time-bound quizzes and tests as learning tools rather than as grade-bearing assessments by having students create them.

In an online environment, students tend to have more resources available at their fingertips, including their peers. It is not uncommon for your students to have additional off-line conversations while they are in your class, a behavior known as “dual-screen interactivity” (Nee & Dozier, 2017, p. 5). Examples of dual-screen interactivity include searching for additional information beyond the primary screen, connecting with others who are interacting with the same content, and creating external posts such as social media posts or memes (Nee & Dozier, 2017). In fact, a teacher should expect these behaviors to happen. Rather than fighting against them, I suggest you leverage them for learning.

For example, consider giving a group of students a section of a textbook and asking them to create multiple-choice, true-false, or sequential questions. I used this strategy often when I taught social studies, where knowledge of facts is very important. Not only did each group have to create the quiz questions, but each group also had to explain why they chose the topics and content to be included in the test. Once the students created the questions, I had others in the class take the quiz to verify that the questions were of high quality based on the justifications provided by the questions’ authors. Then I, as the teacher, chose the questions that I thought were great and added them to the official assessments. This practice allowed my students to interact with the materials multiple times without having to listen to a lecture. It also taught the students to look for critical information rather than focusing on obscure facts to trick each other. Finally, it allowed me to leverage four of the five principles of game-based learning (competition and goals, rules, choices, and challenges; Charsky, 2010) as students competed for the coveted position of becoming the author of the final assessment. Even if a group chose to find questions online, they had to figure out the justifications and answers, which were harder to copy.

2. Integrate student-created tests and rubrics

If you are teaching a course such as English, where developing foundational skills rather than acquiring discrete information is the center of the course, I suggest you encourage your students to create the rubrics they will use to grade their own learning, as student-created tests and rubrics can improve student agency in learning. I used to have my students research various rubrics, evaluate them, and create their own to evaluate each other’s work.

According to Garrison and Ehringhaus (2007), students learn best when they are involved in the assessment process. By allowing students to be part of rubric creation, a teacher can not only improve student learning but also assess what students know about the skills they are being taught.

3. Focus on assessing critical skills

When I say skills, I mean quoting, citing, summarizing, paraphrasing, and video creation. Because students have unrestricted access to additional resources, being able to create new content to demonstrate what they learned is becoming increasingly important. No matter how much teachers try to secure their assessments, a student can always take a screenshot and share it with other students. If a test only requires recalling facts, it is likely to be ineffective in measuring the authentic level of learning. Rather than spending time limiting access to additional resources, I suggest teachers encourage students to add new information and then examine that information to understand why the students thought it was important to include in their final products.

Mathematics teachers can also encourage students to find problems online that assess the procedures and content of the lesson and ask them to explain why a question should or should not be included in a future assessment. They can take this a step further by asking students to create an instructional video and having them evaluate each other’s videos to see which one provides the clearest instruction.

4. Give students a place to interact meaningfully around the subject matter

I also suggest using a discussion forum as an assessment tool. According to Balaji and Chakrabarti (2010), a robust online discussion forum has a significant positive effect on student participation and learning. However, the forum should not be used as one more place where the teacher asks questions of their students. An online forum should be a place where students pose questions to one another. Also, teachers should not consider the number of posts an indicator of student engagement and learning (Song & McNary, 2011). Instead, teachers should encourage students to pose better questions to each other based on Webb’s Depth of Knowledge (1997, 1999, 2005).

5. Leverage peer evaluation to scaffold student learning

As teachers leverage peer-to-peer interactions to improve learning, I suggest that they make peer feedback a component of every assessment.

For example, I used the discussion forum’s embedded star-rating feature. Rather than posing questions for my students to answer, I asked my students to create 2-3 questions each week based on their reading. They were then required to answer 2-3 questions posed by other students in the class. If they discovered that a question was similar or identical to one they had already posted, they were to post one additional question noting that someone else had already posted the same question, which encouraged them to get to the forum quicker than the others. As they answered each other’s questions, they were also encouraged to critique the quality of each question by giving it 1-5 stars. Once again, they were to provide feedback as to why they gave those stars, based on Webb’s Depth of Knowledge (1997, 1999, 2005). After a few rounds of questioning, I asked the students to justify why they felt that DOK level 1 and 2 questions were necessary in some contexts.

6. Create consistency in grading across all similar courses

Finally, I suggest leveraging the Professional Learning Community (PLC) to create consistency in grading across all similar courses. In a distance learning environment especially, parents and students may feel that students are not being fairly assessed, a feeling based on personal perceptions rather than on what is actually happening in class. I strongly suggest that each PLC create common practices around the type and frequency of assessments to reduce subjectivity among its members in how their students are being assessed. In a distance learning environment, sharing expertise around assessment is not only useful but also vital, as it allows us to preserve our most precious commodity: our time.

Specific considerations for educators

As we discuss assessment, we should consider the following:

  • Even though a grade can be an indication of student learning, we must look at assessment independently of grades, as there are many ways to assess student learning without assigning a grade.
  • Unfortunately, many high school students will not take an assessment seriously unless there is a grade attached. Therefore, any discussion around assessment in high school should address the connection between assessments and grades.
  • In an online environment, traditional assessments that are time-bound and fact-based are not as effective because there are many opportunities to circumvent even the most effective security measures.
  • Additionally, for California educators: California Education Code 49066(a) states, “When grades are given for any course of instruction taught in a school district, the grade given to each pupil shall be the grade determined by the teacher of the course and the determination of the pupil’s grade by the teacher, in the absence of clerical or mechanical mistake, fraud, bad faith, or incompetency, shall be final.” In other words, teachers have the final say in a grade.

Conclusion

Being able to accurately assess student learning is one of the most challenging parts of being an effective teacher. We (teachers and administrators) have often used state-based, large-scale standardized assessments to evaluate the effectiveness of our teaching. As states suspend these conventional tests, which may not have been the most effective way to assess our teaching anyway, we need to look to new assessment options. The absence of these tests may be a great opportunity for us to look at assessment from a completely different perspective. As we move forward with the 100% distance learning model, I urge instructional leaders to pay close attention to how teachers are assessing their students. By paying close attention to our assessment practices, we will be able to improve our understanding of student learning considerably.

Additional resources:

Authentic Assessment – Indiana University, Bloomington

Introduction to competency-based Education – Aurora Institute

References:

Ackermann, E. (2001). Piaget’s constructivism, Papert’s constructionism: What’s the difference? Retrieved from http://learning.media.mit.edu/content/publications/EA.Piaget%20_%20Papert.pdf

Aurora Institute (n.d.). Introduction to Competency-Based Education. Retrieved July 26, 2020, from https://aurora-institute.org/our-work/competencyworks/competency-based-education /

Balaji, M. S., & Chakrabarti, D. (2010). Student interactions in online discussion forum: Empirical research from “Media Richness Theory” perspective. Journal of Interactive Online Learning, 9(1), 1-22.

California Legislative Information (n.d.). California Law. Retrieved July 26, 2020, from http://leginfo.legislature.ca.gov/faces/codes_displaySection.xhtml?lawCode=EDC&sectionNum=49066.

Center for Innovative Teaching and Learning (n.d.). Assessing Student Learning: Authentic Assessment. Retrieved July 26, 2020, from https://citl.indiana.edu/teaching-resources/assessing-student-learning/authentic-assessment/index.html

Charsky, D. (2010). From edutainment to serious games: A change in the use of game characteristics. Games and Culture, 5(2), 177-198. doi:10.1177/1555412009354727

Garrison, C., & Ehringhaus, M. (2007). Formative and summative assessments in the classroom.

Nee, R. C., & Dozier, D. M. (2017). Second screen effects: Linking multiscreen media use to television engagement and incidental learning. Convergence, 23(2), 214-226.

Papert, S., & Harel, I. (1991). Situating constructionism. In Constructionism. Retrieved from http://www.papert.org/articles/SituatingConstructionism.html

Song, L., & McNary, S. W. (2011). Understanding Students’ Online Interaction: Analysis of Discussion Board Postings. Journal of Interactive Online Learning, 10(1).

Webb, N. L. (1997). Criteria for Alignment of Expectations and Assessments in Mathematics and Science Education. Research Monograph No. 6.

Webb, N. L. (1999). Alignment of Science and Mathematics Standards and Assessments in Four States. Research Monograph No. 18.

Webb, N. L. (2005). Web alignment tool. Wisconsin Center of Educational Research. University of Wisconsin-Madison.

Dr. Glazer speaks into a microphone at an assembly

Suggestions for Supporting Staff in a Distance Learning Environment

by Kip Glazer, Ed.D.

Kip Glazer is the principal at San Marcos High School. She has an Ed.D. in Learning Technologies and wrote this to share her thoughts and expertise with district leadership. The leaders were very open to the suggestions. Full disclosure: Santa Barbara Unified School District is entering a consulting relationship with Digital Promise and working with some of the CIRCL Educators in our Fall 2020 Professional Learning.

Summary

This article identifies the four major types of needs of a high school during distance learning. It suggests that we apply the Core Conceptual Framework and the TPACK framework when creating teacher professional development (PD); we choose a different type of learning management system; we curate research-based teaching practices intentionally and systemically; and we implement robust assessment and accountability measures.

Introduction

As a teacher, administrator, and scholar, I have always centered my professional interests on developing strong pedagogical skills among our teachers. This document is intended to provide our district leaders with some suggestions to improve our instructional practices as we embark on distance learning. I wrote it from the perspective of a high school teacher and administrator, based on my professional experience and expertise.

Background

In addition to writing a dissertation on game-based learning after participating in a hybrid program and engaging in different game-based learning projects, I have experience with a variety of asynchronous and synchronous learning and teaching activities. For example, my former students in Bakersfield, many of whom were English Language Learners or bilingual students, participated in an asynchronous online writing mentoring project with 6th graders in Chicago. These experiences have afforded me a unique perspective on effective distance and hybrid learning.

Scope

There are numerous topics related to distance learning, such as online security, student data privacy, and cyberbullying. Although I acknowledge that those topics are important, this document will primarily focus on online instructional practices as they relate to teacher professional development (PD) and the subsequent quality control of teaching.

Needs

As the District implements 100% distance learning next school year, we must address the following needs:

  • Needs of all learners, including technological, linguistic, cultural, emotional, physical, and academic needs.
  • Needs of parents, who want consistent, calibrated, highly responsive, and personalized instruction for their students.
  • Needs of teachers, who must provide distance learning to students they have never met and whose needs range from lacking basic technology access to having an abundance of at-home resources in all areas.
  • Needs of the community, which is looking to the District to provide comprehensive yet flexible instructional solutions that maximize all available financial and human resources.

Considerations

As we work to address the above needs, we must consider the following:

  • Social-emotional needs of the staff, students, and parents.
    • Successful distance learning requires strong relationships between students and teachers, and we must address these relationships before any content-based instruction begins.
  • Choosing and establishing a coherent instructional framework and/or theoretical framework to build our instruction practices.
    • We must consider hardware, software, and how we leverage both in a learning environment to achieve an optimal result. In order for our technology department to be effective, we must have resources, systems, and structures, grounded in a sound theoretical framework, that address all three components. This allows us to avoid chasing the latest and greatest technology tools unnecessarily. All leaders must act as noise-cancelers, leading the teaching force by evaluating and advocating for tools that fit our chosen instructional framework.
  • Quality control over instructional practices.
    • One of the biggest and most important tasks is to improve the overall quality of our instruction; we must consider this a moral imperative in whatever conditions we educate our students.
  • Ongoing monitoring of effectiveness beyond teacher or student preference.
    • We must develop a rigorous evaluation protocol that reveals the effectiveness of a tool or instructional practice.

Suggestions

To address the needs above, I suggest the following:

1. Teacher PD

  • Address the needs of the teachers based on a Core Conceptual Framework immediately and urgently.
    • According to Desimone (2009), effective teacher PD must (1) be content-focused (i.e. PD activities centered on the content that the teachers teach and how their students will learn it), (2) include active learning (e.g. participating in lesson studies or group review and grading of sample student work), (3) be coherent (i.e. PD aligned with teachers’ beliefs and knowledge and with the goals of the district, site, and department based on a common instructional focus), (4) extend over a period of time (i.e. PD that spreads activities over a semester rather than over a few days), and (5) facilitate collective participation (i.e. PD provided for a group of teachers who teach the same subject or belong to the same professional learning community).
  • Adopt the Technological Pedagogical and Content Knowledge (TPACK) Framework as the singular framework for teaching.
    • Use the TPACK framework to guide the creation and evaluation of all PD offerings. The TPACK framework addresses the need for seamless integration of three major elements – technology, pedagogy, and content – in today’s educational environment (Koehler & Mishra, 2009). The TPACK model illustrates the importance of balancing all three elements in forming a dynamic learning environment that improves student learning (Harris, Mishra, & Koehler, 2009; Mishra & Koehler, 2006).

TPACK: Technological Pedagogical Content Knowledge Framework

The TPACK image. Adapted from “The TPACK Image,” by M. Koehler & P. Mishra, 2012.

  • Provide personalized learning in all three areas of the TPACK Framework based on the Core Conceptual Framework.
    • We should ensure that any online PD platform is able to provide the necessary training for our teachers to address all three areas of knowledge while addressing the needs of adult learners.

2. Technological tool

  • Choose a singular learning platform that is robust and flexible.
    • The District must choose a robust and flexible LMS that includes tools that maximize student participation, such as chats, wikis, forums, and blogs. It must allow easy integration of tools such as Google Apps and various video conferencing software. It must also provide detailed analytics and click counts that allow easy monitoring of students’ activities. Finally, it must include tools that support family engagement.

3. Teaching and learning practices

  • Curate instructional practices that reflect research- and data-based best practices.
    • Many of the resources shared on our internal Learning at Home for Teachers Google site are about digital tools. We must expand the site to include (1) pacing guides, (2) major benchmarks, (3) assessment tools, including performance rubrics, (4) best practices, and (5) unit plans. For example, rather than just sharing the technology readiness rubric for students, the site should show how a teacher would use it in his or her lesson. Rather than sharing short videos on a topic, the site should provide examples of those videos being used in a lesson.

4. Assessment and accountability

  • Continue collecting data around the effectiveness of each tool, pedagogical practices, and content acquisition.
    • One of the benefits of distance learning is that we will have access to a great deal of data; therefore, we must build robust data analytics to quickly identify areas for growth so that we can respond with solutions.
  • Provide a clear and concise plan for common practices among teachers.
    • Distance learning, no matter how well planned, is often a disorienting experience. Just as we ask our teachers to reduce the amount of content and set explicit expectations for their students, we must set two to three very clear expectations and adhere to them.

Conclusion

This document is by no means a comprehensive guide to distance learning. Because distance learning is not likely to go away any time soon, we must act now. We cannot afford to lose any valuable time before creating a comprehensive instructional plan, especially for our high school seniors, who will experience significant loss. I look forward to working with our staff and district leaders to continue improving our practices.

Additional resources:

Assessment and Data toolbox from Dallas ISD

Cyberbullying

Digital Citizenship

FERPA for Educators

Screen Time

Social Media

UDL for Distance Learning

References:

Common Sense Media. (2020, April 07). Everything You Need to Teach Digital Citizenship. Retrieved July 18, 2020, from https://www.commonsense.org/education/digital-citizenship

Common Sense Media. (n.d.) Screen Time. Retrieved June 18, 2020, from https://www.commonsensemedia.org/screen-time

Desimone, L. M. (2009). Improving impact studies of teachers’ professional development: Toward better conceptualizations and measures. Educational researcher, 38(3), 181-199.

Educational Technology. (2012). The TPACK Model. Retrieved July 18, 2020, from http://www.rt3nc.org/edtech/the-tpack-model/

Harris, J., Mishra, P., & Koehler, M. (2009). Teachers’ technological pedagogical content knowledge and learning activity types: Curriculum-based technology integration reframed. Journal of Research on Technology in Education, 41(4), 393-416. Retrieved from http://files.eric.ed.gov/fulltext/EJ844273.pdf

Koehler, M. J., & Mishra, P. (2009). What is technological pedagogical content knowledge? Contemporary Issues in Technology and Teacher Education, 9(1), 60-70. Retrieved from http://www.citejournal.org/vol9/iss1/general/article1.cfm

Magid, L., & Gallagher, K. (n.d.). The Educator’s Guide to Social Media. Retrieved July 18, 2020, from https://www.connectsafely.org/eduguide/

Michigan Virtual. (2020, March) Teaching Continuity Readiness Rubric. Retrieved June 18, 2020, from https://michiganvirtual.org/wp-content/uploads/2020/03/Teacher-Continuity-Readiness-Rubric.pdf

Mishra, P., & Koehler, M. J. (2006). Technological Pedagogical Content Knowledge: A Framework for Teacher Knowledge. Teachers College Record, 108(6), 1017-1054. doi:10.1111/j.1467-9620.2006.00684.x

Quillen, I. (2013, March 7). Student Mentors: How 6th and 12th Graders Learn From Each Other. KQED Mind/Shift. Retrieved July 18, 2020, from https://www.kqed.org/mindshift/27542/student-mentors-how-6th-and-12th-graders-learn-from-each-other#more-27542

Rappolt-Schlichtmann, G. (2020, March 18). Distance Learning: 6 UDL Best Practices for Online Learning. Retrieved July 18, 2020, from https://www.understood.org/en/school-learning/for-educators/universal-design-for-learning/video-distance-learning-udl-best-practices?_ul=1%2A1vi266z%2Adomain_userid%2AYW1wLUhYa3ZJQUFrcVNWb29EM0RzaExjUGc

Secondary Remote Learning Resources (n.d.) Learning at Home – Teachers. Retrieved July 18, 2020, from https://sites.google.com/sbunified.org/learning-at-home/secondary?authuser=2

StopBullying. (2020, May 07). What Is Cyberbullying? Retrieved July 18, 2020, from https://www.stopbullying.gov/cyberbullying/what-is-it

Sung, K. (2015, October 27). Books-to-Games: Transforming Classic Novels Into Role Playing Adventures. KQED Mind/Shift. Retrieved July 18, 2020, from https://www.kqed.org/mindshift/42538/books-to-games-transforming-classic-novels-into-role-playing-adventures

The PL Toolbox (n.d.) The PL Toolbox. Retrieved July 18, 2020, from https://www.thepltoolbox.com/