Screenshot from MinecraftEDU Tutorial World

Incorporating Games Into Classroom Assessments

In an earlier blog post, I wrote about the specific application of using games to help students with executive functioning issues. However, games are engaging for all students and can help all of them learn course content and life skills.

Students can learn skills and techniques for planning through digital games like strategy games, simulation games, and role-playing games. In these games, they learn skills such as predicting game events and switching between short- and long-term goals. They learn to prepare for an event by stocking up on items in their inventory, and they learn from their mistakes (Kulman, 2014).

Games like Minecraft and LittleBigPlanet and coding apps like Scratch help support planning skills through fun, interactive means. Students need “foresight, planning, dividing the plan into steps, and then actually producing the work” (Kulman, 2014, p. 118) to be successful.

I have some experience using and teaching coding applications like Scratch, Scratch Jr., Tynker, and Hopscotch, and I live in a house with teenagers, some of whom went through long and obsessive Minecraft phases. But until this week, I had not spent any time in Minecraft myself.

With that background, my first trip through the MSU College of Education’s hosted tutorial world was interesting. When I went to the library to work on my homework, I took my 15-year-old son, a Minecraft expert, with me. I am so glad I did.

In this video, I walk through parts of the MinecraftEDU Tutorial World and narrate my walkthrough with some information about my experience and how I see Minecraft in my teaching context.

My Difficulty / Challenge

I experienced several difficulties as I explored the MinecraftEDU Tutorial World. Two of them are particularly worth noting.

First, I had a difficult time making a long jump over a bridge that had fence pieces on each side. I did not screencast that scene and cannot remember exactly where it was, but I remember that my son had to help me. He initially made the jump for me, but I stubbornly insisted on doing it myself, which took ten more minutes of frustrating falling and running up the hill on my part and lots of head shaking on his. Really, he was a very patient teacher, and once I got across that bridge on my own, I was thrilled.

Second, I had a hard time figuring out what to do in the Dig and Build zone. There were “Dig” and “Build” signs in bold all over the place, but I could not actually dig anywhere. Finally, I did what my own kids would have done: I read the wiki, where I learned that all the players before me had already dug out the ground and built stairs leading out. So I followed those stairs out.

Relating My MinecraftEDU Experience to My Background

I think I had difficulty with the jump because I did not grow up playing video games. I have owned a few systems, and playing games has been a fun activity in short bursts of my life, but I have not logged a lot of hours. So when I got to that jump, I simply did not have the experience. I probably could have let my son play me through it, but I am stubborn and tried again and again. In the end, it was worth it.

My second difficulty resulted from the busyness of my life: I was probably one of the last students to explore the tutorial world, and all the digging and building had already been done. The “Dig” and “Build” signs didn’t apply to me because I was simply too late.

Minecraft as an Assessment Tool

Two of the criteria in my Rubric 4.0 are based on Universal Design for Learning (Meyer, Rose, & Gordon, 2014).

Allowing students to use games like Minecraft as an assessment tool can potentially meet both of them. Minecraft could allow for multiple means of representation (criterion 6) and multiple means of action and expression (criterion 10).

When students can demonstrate learning through Minecraft (multiple means of action and expression), they are engaged and motivated. Teacher John Miller, for example, says that when he had students create in Minecraft, they were so completely engaged that they filled his classroom at lunch every day and groaned when he had to turn off the server at the end of class (Gallagher, 2014).

My Teaching and Minecraft

Minecraft could definitely have a place in my classroom. In my current role as a STEM teacher, I am typically on a tight time frame and am responsible for delivering specific content to many students. Even so, I can think of some general ways to use Minecraft now:

  1. As a way to introduce content to my students
  2. As an enrichment activity for early finishers
  3. To support the Tynker lessons I teach in fourth grade. Minecraft and Tynker have partnered to let students do Tynker coding in a Minecraft environment (https://www.tynker.com/minecraft/featured); a rough sketch of what that kind of coding involves follows below.
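Tynker’s Minecraft lessons use Tynker’s own block language, which I can’t reproduce here, but the underlying ideas are the ones any Minecraft coding API exposes: coordinates, sequencing, and loops. Purely as an illustration, here is a short sketch using the open-source mcpi Python library (a different tool from Tynker, and my own hypothetical example rather than anything from their curriculum):

```python
# Illustration only: the open-source mcpi library (the Minecraft: Pi
# Edition API), not Tynker's block language. The same core ideas --
# coordinates, sequencing, iteration -- drive both.
from mcpi.minecraft import Minecraft
from mcpi import block

mc = Minecraft.create()       # connect to a running Minecraft: Pi Edition world
pos = mc.player.getTilePos()  # the player's current position, in whole blocks

# Build a five-block stone tower two blocks from the player,
# one setBlock call per loop pass.
for height in range(5):
    mc.setBlock(pos.x + 2, pos.y + height, pos.z, block.STONE.id)

mc.postToChat("Tower complete!")  # feedback inside the game world
```

Even a loop this small asks students for the foresight and step-by-step planning that Kulman (2014) describes.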

These are general ideas but I look forward to developing more specific ones in time.

References

Gallagher, C. [Colin Gallagher]. (2014, February 19). Minecraft Minechat Episode 23: John Miller [Video file]. Retrieved from https://www.youtube.com/watch?v=4ev0R_xzMEo&feature=youtu.be

Kulman, R. (2014). Playing smarter in a digital world: A guide to choosing and using popular video games and apps to improve executive functioning in children and teens. Plantation, FL: Specialty Press.

Meyer, A., Rose, D.H., & Gordon, D. (2014). Universal design for learning: Theory and practice. Wakefield, MA: CAST.

 

Using a CMS to Create an Assessment

In a previous blog post, I wrote about my critical review of three Course Management Systems (CMSs) and why I chose Canvas to use with elementary school students. Taking it a step further, I have used Canvas to create a digital assessment for my second graders.

In this video, I walk the viewer through my assessment and describe its purpose and my teaching context. I also critique the assessment based on three of the criteria from my Rubric 4.0 and give a rationale for using Canvas as my CMS. You can read more about all of this below.

Purpose of This Assessment

This is a summative assessment of physical science concepts in second grade. The same assessment will be given as a pre-test and as a post-test to assess students’ understanding of course concepts and to measure their growth over the unit.

Although this is a summative assessment, similar questions can be asked throughout the module in formative assessments using digital tools like Kahoot! or using offline polling. The results of those assessments determine whether course concepts need to be revisited individually or retaught as a whole group. In other words, assessment affects classroom instruction.

Teaching Context

In my two elementary schools, I teach a STEM curriculum developed by a nonprofit. It is a comprehensive curriculum that includes a variety of activities, projects, and assessments, and I modify it when necessary to suit my students and to meet time constraints. I co-teach with classroom teachers in their spaces, adapting instruction based on the formative and summative assessments that take place.

The learning objectives that are assessed are based on the following standards.

Next Generation Science Standards

  • PS1.A: Structure and Properties of Matter – Different kinds of matter exist and many of them can be either solid or liquid, depending on temperature. Matter can be described and classified by its observable properties.
  • K-2-ETS1-3. Analyze data from tests of two objects designed to solve the same problem to compare the strengths and weaknesses of how each performs.
  • ETS1.B: Developing Possible Solutions – Designs can be conveyed through sketches, drawings, or physical models. These representations are useful in communicating ideas for a problem’s solutions to other people.
  • Science and Engineering Practice – Analyzing and Interpreting Data – Analyzing data in K–2 builds on prior experiences and progresses to collecting, recording, and sharing observations.

Common Core English Language Arts

  • W.2.7 Participate in shared research and writing projects (e.g., read a number of books on a single topic to produce a report; record science observations).

Rubric 4.0

Earlier this semester I created Rubric 4.0, a tool with which to assess other assessments. When I created it, I considered the project-based learning environment in which I teach and the projects my students create. Although the assessment I am describing here is a test, not a project, I used Rubric 4.0 to assess it. The following three criteria are addressed by this CMS assessment. Other criteria, such as Criterion 10 (provides multiple means of action and expression), are not met by the test alone but are satisfied when one considers that students also complete a culminating project in addition to the test.

Criterion 3: Aligns with Established Goals

This assessment aligns with established goals (standards) as well as with short-term and unit-specific goals. Therefore, this criterion is met.

Criterion 4: Transparent Learning Targets

Students are presented with the learning targets early in this module. They are posted in the classroom in the form of “I can” statements.

Criterion 9: Technology Component

Students use iPads to log in to Canvas in order to complete this assessment. This criterion is met.

Rationale for Choosing Canvas

As part of my work in CEP813, I performed a critical review of three CMSs. CMSs offer many different features, but I focused on the assessment features when making my choice.

Based on the results of that review, Canvas is the most robust system and can be tailored to provide easy access, even for young students. In the free version of Canvas, teachers can build courses from scratch and gain unlimited access to the system (in terms of time and number of classes). I have both built Canvas courses from scratch and customized existing ones, so this suits me, but it may be a limiting factor for teachers who prefer an out-of-the-box CMS package.

For the purposes of creating assessments in CEP813, I decided to use Canvas, which scored the most points in my critical review. Although I previously created a hybrid art course in Canvas for CEP820, I wanted to further explore the full-featured assessment and tracking capabilities that Canvas provides.

Assessments in Canvas can be configured to include a variety of question types. Teachers can identify correct answers for some question types so that Canvas scores those questions automatically. For the rest, teachers can use Canvas’s SpeedGrader feature to grade many assessments quickly and easily.
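For teachers who want to build quizzes at scale rather than by hand, Canvas also exposes a public REST API. The sketch below shows what creating one auto-scored multiple-choice question could look like; I built my own assessment through the web interface, and the token, course ID, and question content here are placeholders:

```python
# A minimal sketch using Canvas's public REST API (as documented by
# Instructure). Placeholders: YOUR_API_TOKEN, the course ID, and the
# question content. My actual assessment was built in the web interface.
import requests

BASE = "https://yourschool.instructure.com/api/v1"  # your Canvas instance
HEADERS = {"Authorization": "Bearer YOUR_API_TOKEN"}
COURSE_ID = 12345                                   # placeholder course ID

# 1. Create the quiz shell.
quiz = requests.post(
    f"{BASE}/courses/{COURSE_ID}/quizzes",
    headers=HEADERS,
    json={"quiz": {"title": "Properties of Matter Pre-Test",
                   "quiz_type": "assignment"}},
).json()

# 2. Add one multiple-choice question; answer_weight=100 marks the
#    correct answer, which lets Canvas score the question automatically.
requests.post(
    f"{BASE}/courses/{COURSE_ID}/quizzes/{quiz['id']}/questions",
    headers=HEADERS,
    json={"question": {
        "question_name": "States of matter",
        "question_text": "Which of these is a liquid at room temperature?",
        "question_type": "multiple_choice_question",
        "points_possible": 1,
        "answers": [
            {"answer_text": "Juice",  "answer_weight": 100},
            {"answer_text": "Crayon", "answer_weight": 0},
        ],
    }},
)
```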

Images

The following images were used in the creation of my CMS assessment:

“Big Balloons” by Chris Breeze is licensed under CC BY-NC 2.0

“Bubbles” by kirahoffman is licensed under CC0

“Crayons” by Max Pixel is licensed under CC0

“Fire Truck” by Garciacom is licensed under CC0

“It Took 38 Year for the Hose to Spring a Leak” by oddharmonic is licensed under CC BY-SA 2.0

“Juice” by Rebecca Siegel is licensed under CC BY 2.0 / Cropped from original

“Popsicle – Orange Cherry Grape” by Ken is licensed under CC BY-NC 2.0

“Teddy Bear” by Polimerek is licensed under CC BY-SA 2.0

“Water” by jarmoluk is licensed under CC0

Header image in blog post:

“Frozen Waterfall” by Peter Griffin is licensed under CC0 / Cropped from original

Second Iteration of a Formative Assessment for Fifth Grade

In my fifth-grade classes, students actively work in small groups to build, modify, and program a robot to move autonomously (with minimal human intervention). They use technology and navigate social learning situations to solve a problem anchored in the real world.
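PLTW’s robotics platform has its own programming environment, so the sketch below is not the code my students write; it is only a hypothetical, simulated illustration of the sense-decide-act loop that makes a robot’s movement “autonomous”:

```python
# Hypothetical illustration of a sense-decide-act loop; PLTW's platform
# and language differ. A simulated robot stands in so the sketch runs.
import time

class SimulatedRobot:
    """Stand-in for a real robot with a distance sensor."""
    def __init__(self, start_cm=100):
        self._cm = start_cm

    def drive(self, speed):
        print(f"driving at speed {speed}")

    def distance_cm(self):
        self._cm -= 5          # pretend the robot closes 5 cm per reading
        return self._cm

    def stop(self):
        print("stopped")

def drive_until_obstacle(robot, stop_cm=10):
    """Drive forward, polling the distance sensor each pass, and stop
    when an obstacle is closer than stop_cm -- no human intervention."""
    robot.drive(speed=50)                 # act
    while robot.distance_cm() > stop_cm:  # sense and decide
        time.sleep(0.05)
    robot.stop()

drive_until_obstacle(SimulatedRobot())
```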

This module is challenging, but using “focused questions, feedback, and diagnostic assessment” (Wiggins & McTighe, 2005, p. 46) helps to uncover the misunderstandings, questions, and assumptions my students hold. In turn, this informs my instruction and helps students learn more, retain what they learn, and transfer what they know to other situations.

To plan for and reflect on one of the formative assessments within this fifth-grade robotics module, I have developed Formative Assessment Design Version 2.0. My prior iteration is Formative Assessment Design Version 1.0, which I wrote about in an earlier blog post.

References

Wiggins, G.P. & McTighe, J. (2005). Understanding by design. Alexandria, VA: Association for Supervision and Curriculum Development. Retrieved from http://p2047-ezproxy.msu.edu.proxy1.cl.msu.edu/login?url=https://search-ebscohost-com.proxy1.cl.msu.edu/login.aspx?direct=true&db=e000xna&AN=133964&scope=site

 

A Critical Review of Course Management Systems for Assessments

Online learning is becoming more and more common in our schools. It spans a wide range, from fully online courses to face-to-face courses with an online component.

Course Management Systems

To help facilitate online instruction, teachers may benefit from using a Course Management System (CMS) that allows them to organize and manage course content, assessments, students, and records. These can range from websites to all-inclusive, commercial systems.

As an elementary teacher of science, engineering, and technology, I often include a digital component in my classes. In the last year, I have regularly used Canvas and Seesaw, and when I was a middle school art teacher, I used Google Classroom to facilitate assignments and record keeping.

Critical Review of CMSs

When I was tasked with critically reviewing the assessment features of three CMSs for my current K-5 technology teaching position, I chose Canvas, Google Classroom, and Edmodo. Although I have not used Edmodo as a teacher, I have used it as a parent.

I also included Seesaw as a fourth option. While it may not technically be considered a CMS, Seesaw has been a great help to me in getting online content to my young students and collecting assignments back from them, and I was curious how it would compare to the full CMSs under review.

In comparing these CMSs, I used criteria that were provided to me, as well as four other criteria that I consider to be important in a CMS. I scored each criterion with a 1 for yes or 0 for no, so that I could easily compare the features of the CMSs.
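The tally itself is simple enough to express in a few lines of Python. The scores below are placeholders for illustration, not my actual review results:

```python
# Binary scoring: 1 if a CMS meets a criterion, 0 if not; totals are
# then compared. Placeholder scores, not my actual review results.
criteria_scores = {
    "Canvas":           [1, 1, 1, 1, 1],
    "Google Classroom": [1, 0, 1, 0, 1],
    "Edmodo":           [1, 0, 0, 1, 0],
    "Seesaw":           [1, 1, 0, 0, 1],
}

totals = {cms: sum(scores) for cms, scores in criteria_scores.items()}
for cms, total in sorted(totals.items(), key=lambda kv: -kv[1]):
    print(f"{cms}: {total} point(s)")
```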

My Results

Based on the results of my critical review, Canvas is the most robust system and can be tailored to provide easy access, even for young students. In the free version of Canvas, teachers can build courses from scratch and gain unlimited access to the system (in terms of time and number of classes). I have both built Canvas courses from scratch and customized existing ones, so this suits me, but it may be a limiting factor for teachers who prefer an out-of-the-box CMS package.

For the purposes of creating assessments in CEP813, I will use Canvas, which scored the most points in my critical review. Although I previously created a hybrid art course in Canvas for CEP820, I still want to explore further the full-featured assessment and tracking capabilities that Canvas provides.

Images

“Wordle Cloud for Drexler (2010)” by Chris Jobling is licensed under CC BY-SA 2.0 (as header image)

Assessment Rubric 4.0: Including Technology and Universal Design for Learning

Over the past six weeks, I have been developing and revising a rubric by which to assess other assessments. I have written about each of the previous iterations in earlier blog posts.

This week’s final iteration is Rubric 4.0, which I have updated to include criteria specifying the importance of a technology component in assessment and Universal Design for Learning (UDL). Universal Design for Learning stresses the importance of providing multiple means of engagement, representation, and action and expression to make education accessible for all students (Meyer, Rose, & Gordon, 2014).

References

Meyer, A., Rose, D.H., & Gordon, D. (2014). Universal design for learning: Theory and practice. Wakefield, MA: CAST.

Creating a Formative Assessment for Fifth Grade

Formative assessment, assessment for learning that occurs during a unit of instruction, is dynamic assessment. It gives teachers the opportunity to find out what students are able to do on their own or with adult help and guidance (Shepard, 2000).

By making students’ thinking visible and open to examination, it can reveal what a student understands and what misconceptions they hold (Trumbull & Lash, 2013). It also provides opportunities for scaffolding steps between one activity and the next, for each individual student (Shepard, 2000).

Guided by Rubric 3.0, my third iteration of a rubric to assess other assessments, I have created the first draft of a formative assessment. Formative Assessment Design Version 1.0 is meant to be used during a fifth-grade robotics module that I teach. During a typical school year, I teach this module four or five times, so I look forward to revising this formative assessment over time to make it the best it can be.

References

Shepard, L. (2000). The role of assessment in a learning culture. Educational Researcher, 29(7), 4-14.

Trumbull, E. & Lash, A. (2013). Understanding formative assessment: Insights from learning theory and measurement theory. San Francisco: WestEd. Retrieved from www.wested.org/online_pubs/resource1307.pdf

 

Assessment Rubric 3.0: Carefully Considering Feedback

According to Hattie and Timperley (2007), feedback, information a learner receives about aspects of their performance, can come from a variety of sources, such as a teacher, parent, classmate, book, or an experience. Feedback makes an impact when it helps to close “the gap between where students are and where they are aiming to be” (p. 90). Its impact on student learning can be significant, although some types of feedback have a greater positive effect than others.

Among other additions, I gave special consideration to the impact of feedback as I created Rubric 3.0, the third iteration of my rubric by which to assess other assessments.

Earlier iterations of this assessment rubric are Rubric 1.0 and Rubric 2.0, each of which I wrote about in an earlier blog post.

In the coming weeks, Rubric 3.0 will become Rubric 4.0. By then, it will include ten carefully selected criteria for judging assessment instruments. I look forward to learning more and sharing that rubric soon.

Images

“Feedback” by geralt is licensed under CC0

References

Hattie, J., & Timperley, H. (2007). The power of feedback. Review of Educational Research, 77(1), 81–112.

A Critical Review of Art Projects as an Assessment Genre

When I was a visual arts educator, a staple of art class was the art project: the kind that gets hung in the hallways and displayed at district-wide art shows. Typically, each unit included a culminating art project. These projects might be 2-dimensional or 3-dimensional, digital or physical, and made from any of a variety of materials.

I used this assessment genre with all of my classes, kindergarten through high school. Even when I was a student in visual arts education classes, I made artwork for the same reason: so that my teachers could assess my understanding of the concepts we had been learning in class.

Because of the importance of this assessment genre to art education, I decided to do a critical review of the art project genre for my latest assessment assignment.

Iterations of an Assessment Rubric

“To understand is to be able to wisely and effectively use—transfer— what we know, in context; to apply knowledge and skill effectively, in realistic tasks and settings” (Wiggins & McTighe, 2005, p. 7). An authentic performance task is one way to assess students’ ability to transfer what they know (Wiggins & McTighe, 2005). It only makes sense, then, that one assignment in my Assessment class is to create a rubric by which to judge other assessment instruments.

I began this assignment two weeks ago with Rubric 1.0, which you can read about in an earlier blog post. In that first iteration, I identified three criteria for quality assessments: direct and specific feedback, transparent learning targets, and a self-assessment component.

In this second iteration, Rubric 2.0, I added two criteria for quality assessments: the assessment requires only target knowledge, skills, and abilities (KSAs) to complete; and the assessment requires a transfer of knowledge to demonstrate understanding.

Requires Only Target Knowledge, Skills, and Abilities (KSAs) to Complete

One approach to creating valid and fair assessments is to require only target knowledge, skills, and abilities (KSAs) to complete the assessment. Assessment designers first identify what evidence is needed to judge whether students have demonstrated specified aspects of learning and which KSAs that evidence requires. They then examine the assessment tasks to determine whether any unwanted, non-target KSAs, such as language or math skills, are also required. If they are, the assessment will yield results that mix target and non-target KSAs (Trumbull & Lash, 2013). Therefore, non-target KSAs should be eliminated.

Assessment Requires Transfer of Knowledge to Demonstrate Understanding

As stated in the introduction, a well-crafted assessment that assesses students’ ability to transfer what they know should include an authentic performance task (Wiggins & McTighe, 2005). The assessment tool should clearly describe criteria for degrees of understanding. Understanding should be assessed separately from other traits, like mechanics, organization, and craftsmanship. According to Wiggins and McTighe (2005), those other traits should be assessed in a separate rubric, or all of the traits should be assessed in a grid-style rubric.

Conclusion

Eventually, my Rubric 2.0 will become Rubric 3.0, and finally, Rubric 4.0. By then, it will include ten carefully selected criteria for judging assessment instruments. I look forward to learning more and sharing those rubrics in future posts.

References

Trumbull, E. & Lash, A. (2013). Understanding formative assessment: Insights from learning theory and measurement theory. San Francisco: WestEd. Retrieved from www.wested.org/online_pubs/resource1307.pdf

Wiggins, G.P. & McTighe, J. (2005). Understanding by design. Alexandria, VA: Association for Supervision and Curriculum Development. Retrieved from http://p2047-ezproxy.msu.edu.proxy1.cl.msu.edu/login?url=https://search-ebscohost-com.proxy1.cl.msu.edu/login.aspx?direct=true&db=e000xna&AN=133964&scope=site

Assessing My Own Assessment

Assessment is an important topic in education, with teachers, administrators, parents, students, and policymakers all staking a claim to the results of various types of assessments (NWEA, 2015).

Assessment can be used to inform teaching and provide feedback to students. When used effectively, it can “support and enhance learning” (Shepard, 2000, p. 4).

Testing is just one form of assessment. Drawing by Sarah Van Loo, 2017.

In an effort to improve my assessment practices, I critically examined one of my own assessments. First, I chose three elements that “make it possible for assessment to be used as part of the learning process” (Shepard, 2000, p. 10). Then I began drafting a rubric with which to assess other assessments, Rubric 1.0. As the name implies, this rubric is a work in progress.

Rubric for an Art Project

The word assessment can refer to both the instrument and the process (MAEA, n.d.). The assessment tool that I chose to examine is a rubric for a comic strip. The last time I used this tool was a few years ago. Nevertheless, I created it using a format that I often use for middle school art rubrics, so I think it is useful to examine.

The assessment process was a project: the creation of a comic strip by each student in my middle school art class. The purpose of this assessment was to evaluate students’ understanding of craft, character development, story, and the basic elements of a comic strip through their creation of one.

When I created this assessment tool, I assumed that my students were able to read and interpret each of the criteria and descriptions, and that they understood the vocabulary used in the tool.

Examination of My Comic Strip Rubric

Assessment doesn’t have to be a monster. Drawing by Sarah Van Loo, 2017.

In examining my rubric, I assessed whether it met the three criteria I used to create Rubric 1.0: feedback to students is direct and specific, learning targets are transparent, and it includes a component of self-assessment by the student.

Feedback to Students is Direct and Specific

According to Black and Wiliam (1998), feedback to students should be direct and specific, giving advice to students so they know what can be improved. This helps students recognize their own ability to improve.

In my experience, students sometimes view themselves as “talented” or “not talented.” With specific feedback about their own performance, they develop a growth mindset and learn that they can improve regardless of where they started.

The comments section of my assessment tool offers a space to give specific feedback to students. If the teacher skips the comments and only circles the pre-written descriptions, students may view the feedback as vague.

Learning Targets are Transparent

Students should have access to the criteria by which they will be graded, providing them with the opportunity to strive for excellence and the ability to see the “overarching rationale” of classroom teaching (Black & Wiliam, 1998, p. 143).

I have noticed that when students have clear expectations laid out for them, many questions never need to be asked. Students do not have to guess what quality work looks like because clear guidelines have already been established.

The comic strip rubric sets forth clear expectations for quality of work, quantity of work, and use of time in class. It is possible that more elements of a good comic strip could be added, but this rubric sets forth standards for excellent work, as well as work that could be improved.

Includes a Component of Self-Assessment by the Student

When students assess their own work, the criteria of the assignment and feedback from others becomes more important than the grade alone. Students who assess their own work usually become very honest about it and are prepared to defend their work with evidence (Shepard, 2000).


When students assess their own work, they use what they discover to improve it. I have noticed that they iterate on their projects and make improvements without prompting.

The comic strip rubric allows for student self-assessment, offering one bonus point for completing it. In my experience, this provides an incentive for some students; others do not see the inherent value and pass on assessing themselves. Rather than an optional bonus point, self-assessment could be a required element of the rubric.

Conclusion

At this point, the comic strip rubric does include the elements of Rubric 1.0. As I revise Rubric 1.0, though, I expect to discover ways to improve my comic strip rubric.

References

Black, P. & Wiliam, D. (1998). Inside the black box: Raising standards through classroom assessment. Phi Delta Kappan, 80(2), 139-144.

MAEA. (n.d.). CEP 813 module 1.1 introduction. Retrieved from https://d2l.msu.edu/d2l/le/content/585048/viewContent/5241064/View

NWEA. (2015). Assessment purpose. Retrieved from https://assessmentliteracy.org/understand/assessment-purpose/

Shepard, L. (2000). The role of assessment in a learning culture. Educational Researcher, 29(7), 4-14.

Beautifully Questioning My Teaching Practice

As I prepared to do this, my final project for my three summer classes, I was stuck. These classes have been exhilarating, challenging, and rewarding. Sometimes there were tears, both frustrated and proud.

Today I created something wonderful and hoped that the excitement from it would fuel me through this post. It didn’t. The trouble was that after so many weeks of pushing myself so hard, my brain was stuck.

So I looked at Twitter, read some news, looked at the ceiling. Nothing. One of my objectives for the assignment was to apply Warren Berger’s Why, What If, and How questioning methodology from A More Beautiful Question (Berger, 2016) to my own practice. So I eventually, begrudgingly started with that.

First-graders love coding using ScratchJr.

I began by creating a list of Why questions related to my teaching. I teach Project Lead The Way (PLTW) to K-5 students in two schools: engineering concepts to all of my students, and coding to everyone except my kindergarteners.

Unsticking the Lid

As soon as I started asking questions, my imagination took flight. As Frances Peavey once said, a good question is like “a lever used to pry open the stuck lid on a paint can” (Berger, 2016, p. 15). That was it! I simply needed to start asking questions, and I was unstuck, just like the lid of the paint can.

As Berger suggests, I started by asking Why questions. If we’re paying attention, we ask Why when we encounter a suboptimal situation (Berger, 2016). Although I love my job and most of my students enjoy my classes, there are some who just don’t. Those students led me to ask:

Why?

  • Why do some students keep asking if they’re doing the problem “right”?
  • Why do some of my students think they can’t code?
  • Why do some of my students refuse to participate?

What If?

As I considered my Why questions, I focused on the fact that “Integrating coding into classes is being perceived by many as a way to stimulate computational and creative thinking” (Johnson, Becker, Estrada, & Freeman, 2015, p. 21). Therefore, I decided to address the question: Why do some of my students think they can’t code?

The ScratchJr coding environment is user-friendly for young students, but still offers the opportunity to learn computational thinking.

Pondering this question, I realized that my first- and second-grade students have great confidence when it comes to coding. It is my third- through fifth-grade students who are more likely to struggle.

With my Why question in mind, I began to ask What If. During this time of creative, wide-open questioning, I asked What If questions to help me consider possibilities for change (Berger, 2016).

  • What if I let my older students start with ScratchJr (typically only first and second graders use ScratchJr)?
  • What if I made time for Hour of Code or other warm-up activities before starting on our unit together?
  • What if I ran an after-school coding club?
  • What if I worked more closely with the media specialist to coordinate coding lessons?

How?

Asking How is about focusing on making progress toward a solution, about deciding which ideas to pursue (Berger, 2016). One of the great conundrums of my schedule is that I never seem to have enough time.

I co-teach, pushing in to each classroom, typically for a couple of weeks at a time. When I’m in a class, I have a great deal to do to complete a module, and I don’t want to waste any of the classroom teacher’s time, so I carefully avoid straying from my lesson plans. The problem is that some of my students simply need more. So I asked How…

  • How can I find time to let some students practice coding more outside of class?
    • After school
    • During lunch
    • On my prep hour
    • During other periods in the school day with a PLTW iPad
    • During other periods in the school day using another device

This practice could be with ScratchJr or with the app they’ll use during PLTW. It could even be a different app, as long as they get the opportunity to practice the computational thinking they need to improve their coding skills and gain confidence.

Next Steps

Prior to completing this assignment, I had vaguely considered this issue but had not gotten far. By taking the time to work through this questioning process, I feel I have taken my first steps toward solving a complex problem in my practice. My next step will be to talk to my classroom teachers and figure out how we can work together on behalf of our students.

References

Berger, W. (2016). A more beautiful question: The power of inquiry to spark breakthrough ideas. New York: Bloomsbury.

Johnson, L., Becker, S. A., Estrada, V., & Freeman, A. (2015). NMC horizon report: 2015 K-12 edition. Austin, TX: The New Media Consortium.

Images

All images in this blog post were created by Sarah Van Loo.

Curiosity Never Grows Old

Since I was little, I have loved to draw. I enjoyed everything about it. I wanted to learn how to make animated movies but never did. Now as an art teacher and technology teacher, I have access to great technologies that can help me. In fact, I spent last year teaching K-5 students coding in ScratchJr, Hopscotch, and Tynker.

This summer I decided to take what I already know about coding from those applications and do what I’ve always wanted to do: make an animated movie. I created this animation using Scratch.

I drew all the sprites, customized the background, and put it all together. It’s only one minute long, but I am so proud of myself, and I’m thrilled with the result. I am delighted to share that movie here:

Images

All images and videos in this blog post were created by Sarah Van Loo.

Questioning the Wicked Problem of Teaching Complex Thinking

Each year The New Media Consortium reports on key trends, significant challenges, and important developments in the field of educational technology. Among the significant challenges of 2015 was teaching complex thinking (Johnson, Becker, Estrada, & Freeman, 2015). The problem itself is complex enough that we can call it a “wicked problem”: Koehler and Mishra describe wicked problems as having “incomplete, changing and contradictory requirements” (as cited in Week 4 – Learn, 2017).

“Rodin’s The Thinker” by Andrew Horne, retrieved from https://commons.wikimedia.org/w/index.php?curid=15582363, is licensed under Public Domain.

Because of the changing nature of wicked problems, it is impossible to come up with a perfect solution. Instead, my teammates Laura Allen, Guadalupe Bryan, and Alex Gorton and I worked to investigate and offer a “best bad idea” in response to the problem of teaching complex thinking (as cited in Week 4 – Learn, 2017).

We approached this wicked problem from the perspective of A More Beautiful Question. We hoped to ask “an ambitious yet actionable question that can begin to shift the way we perceive or think about something – and that might serve as a catalyst for change” (Berger, 2016, p. 8).  Although our problem is unsolvable, we can still be a catalyst for change – if we know what to do.

Using the method presented in A More Beautiful Question, we asked Why, What If, and How. The most challenging aspect of this approach was giving time and thoughtful consideration to each phase in order to ask good questions. Berger points out that we’re deluged with answers, but “to get to our answers, we must formulate and work through the questions ourselves” (Berger, 2016, p. 3).

In our shared planning document, we brainstormed and took notes. Together, we asked 55 Why questions. When we ask Why, it helps to approach the problem from an inquisitive, almost childlike perspective. This led to our beautiful, driving question:

How are teachers addressing the complex thinking skills necessary for students to become productive and innovative 21st-century learners?

I needed to give our complex problem the consideration it deserved. Before moving on to the What If phase, I crafted this infographic about the complexity of our problem:

After arriving at our driving question, we responded by asking What If. When we ask What If, we use creative, divergent thinking to expand the possibilities to explore.

Around this time, we surveyed other educators in our professional learning networks (PLNs) about our wicked problem. Based on the results of the survey and on the What If questions we asked, our team singled out one What If question:

What if students had more freedom/choice in developing their complex thinking skills?

With our survey results in and our What If question settled on, we investigated current research around the question of How. We researched four current educational trends around student choice: project-based learning, genius hour, authentic inquiry, and student choice in assessments.

Check out our ThingLink below to see our group’s presentation of this entire process. We describe our methods, survey and results, and practical ways to introduce student choice in a 21st-century classroom. Don’t miss our references in the lower left if you want to learn even more. (If your browser doesn’t allow you to click on all the links, go directly to the ThingLink site.)

 



Reflections

Collaborative teamwork is a 21st-century skill that our group used to great effect. Even though we were never all in the same room for this, apps like Zoom, text messaging, Google Docs, and email helped us undertake this complex project.

The process was challenging at times, but the results were worth it. I am excited to try out some of our suggestions in my own class this fall.

References

Berger, W. (2016). A more beautiful question: The power of inquiry to spark breakthrough ideas. New York: Bloomsbury.

Johnson, L., Becker, S. A., Estrada, V., & Freeman, A. (2015). NMC horizon report: 2015 K-12 edition. Austin, TX: The New Media Consortium.

Week 4 – Learn. (2017, July 22). Retrieved from http://www.msuedtechsandbox.com/MAETely1-2017/week-4-wicked-problems/week-4-learn/

Images

Unless otherwise captioned, all images and videos in this blog post were created by Sarah Van Loo or the students of MAET Year 1.