Differentiated Goal Writing: Tiered Targets

Approaches to Writing Differentiated Goals

Student Growth Goals (also referred to as Growth Targets) are detailed, measurable goals for student learning and growth developed collaboratively by educators and their evaluators.  They are based on student learning needs identified by a review of students’ baseline skills.  The goals are aligned to course standards and measure learning in the big ideas and essential skills of the class or course.  The goals are rigorous, yet realistic targets of student learning.

There are many approaches to setting student growth goals (or growth targets).  Choosing your method of writing the target is partly personal preference and partly an exercise in critical analysis of the unique students on your roster.  This document highlights the tiered method of growth goal writing.

Whole Group Goals


The whole-group method is when a teacher writes one goal that applies to every student on the roster. It differs from the tiered goal in that there is no differentiation. The whole-group growth goal is the simplest way to write a growth target. These goals are set by deciding how much growth is expected of students and then adding that amount to every student's pre-test score. These targets often state large generalizations of growth and learning that are then applied to all students, for example: all students will grow by 20%, or all students will move one level on the writing rubric. Because each student is unique, there is some concern that this approach may not produce appropriate goals for every student on the teacher's roster.
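As a minimal sketch, a whole-group target such as "all students will grow by 20 percentage points" can be computed by applying one formula to the entire roster. The roster names and pre-test scores below are invented purely for illustration:

```python
# Sketch: whole-group growth targets (roster and scores invented for illustration)
pretest_scores = {"Student A": 40, "Student B": 55, "Student C": 72}

GROWTH = 20  # the same expectation, in percentage points, for every student

# One formula is applied to the entire roster, capped at 100%.
targets = {name: min(score + GROWTH, 100) for name, score in pretest_scores.items()}

print(targets)
```

Note that every student gets the same formula regardless of where they started, which is exactly the concern raised above.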

Tiered Growth Goals

Tiered targets are created by grouping students together based on pre-assessment (baseline) scores.  Teachers divide students into three or more categories, or tiers.  Then teachers identify growth expectations for each tier and apply them to each student in that tier.

Notice that with this method, all students have growth targets they are expected to reach, but the teacher does not calculate the targets with a single formula applied to the whole roster.  Instead, the teacher sets growth targets based on the amount of learning expected during the instructional interval for each specific category of students.

When setting a target of either a minimum expectation or points gained, teachers use their knowledge of the curriculum, current student data and performance, and historical student performance to determine targets.
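The tiering process described above can be sketched in a few lines. The cut-off scores and per-tier gains here are invented assumptions for illustration, not recommendations from this article:

```python
# Sketch: tiered growth targets; tier cut-offs and gains are invented assumptions
pretest = {"Student A": 6, "Student B": 34, "Student C": 33,
           "Student D": 56, "Student E": 58}

# (upper bound of tier on the baseline, expected gain in percentage points)
tiers = [(25, 30), (50, 20), (100, 15)]

def tier_target(baseline):
    """Return the growth target for the tier this baseline falls into."""
    for upper, gain in tiers:
        if baseline <= upper:
            return min(baseline + gain, 100)

targets = {name: tier_target(score) for name, score in pretest.items()}
print(targets)
```

Each student's target comes from the expectation for their tier rather than from one roster-wide formula.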

[Figure 2: Tiered targets with a single growth formula applied to each tier]

[Figure 3: Tiered targets with a minimum expected score for each tier]

Teachers may apply a single formula to each tier (see Figure 2), or they may set a minimum expected score for each tier (see Figure 3).

Advanced Tiered Growth Goals


Advanced Tiered Growth Goals are much like Tiered Growth Goals in that they are created by grouping students based on baseline results.  However, this method takes into consideration that students earning similar scores just above or just below the cut-off points between tiers may otherwise be held to very different growth expectations.  This method therefore combines a constant target and a variable target; the expectation is that the student reaches whichever of the two targets is greater.
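One way to sketch the "whichever is greater" rule is below. All cut-offs, minimum scores, and gains are invented for illustration:

```python
# Sketch: advanced tiered targets combining a constant and a variable target.
# All cut-offs, minimum scores, and gains are invented for illustration.
tiers = [
    # (upper bound of tier, constant minimum score, variable gain)
    (25, 40, 25),
    (50, 60, 15),
    (100, 80, 10),
]

def advanced_target(baseline):
    for upper, minimum, gain in tiers:
        if baseline <= upper:
            # The student is expected to reach whichever target is greater:
            # the tier's constant minimum, or baseline plus the tier's gain.
            return max(minimum, min(baseline + gain, 100))

print(advanced_target(24))  # max(40, 24 + 25) = 49
print(advanced_target(40))  # max(60, 40 + 15) = 60
```

The variable (baseline-plus-gain) target keeps expectations realistic for lower scores within a tier, while the constant minimum keeps them rigorous for higher scores.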


[Figure: Advanced tiered growth targets combining a constant and a variable target]

Download Our Handout explaining Tiered Goals in more depth.

What’s a Great SLO? Online Course

Time to practice with support!  Don’t be afraid of writing your SLO!  Build confidence through practice and coaching in a five-week online course. You can Register Here.

This course is designed to help you complete a quality SLO. From identifying the learning goal and applicable assessments to analyzing baseline data and setting goals, our instructors walk you through each stage and share current best practices and discussion questions.



Don’t let writing Growth Goals be daunting!

You have many different ways to measure and monitor your students’ learning. What is the best way to write a goal, and what tools should you use to measure learning? In this course you’ll buckle down and learn about the SLO Cycle, choosing the right growth assessment, simple data analysis strategies that make your data speak to you, and how to write a goal that is both attainable and ambitious.


COURSE GOALS:

  • Understand how to choose assessment measures for growth
  • Learn about the current research in the SLO cycle
  • Dig into data analysis strategies that simplify the process and make sense of your data
  • Create SMART goals that focus on your students and realistic results
  • Implement goal writing systems and templates that will help you write a goal for your students
  • Complete an SLO with your classroom data or our practice data
  • Collaborate with and learn from other educators

Week 1: SLO Basics

  • Explore the research behind the SLO model and how teachers across the country are reflecting on the process
  • Understand the SLO learning goal and how you might choose a “Big Idea” to monitor for this purpose
  • Dig into assessment strategies to measure growth and how to best measure student learning in your course
  • Choose an assessment strategy for your course SLO

Week 2: Action Research and Data

  • Learn strategies for data analysis that won’t make your head spin. We will look at simple color-coding techniques and four-step plans to make data analysis bearable, even exciting.
  • Explore the purpose behind linking baseline assessment data to instructional changes and pivot points
  • Build your confidence through hands-on data analysis of your own students or mock student groups

Week 3: Make it SMART-G

  • Learn how to design a goal that is Specific, Measurable, Attainable, Relevant, Time-bound, and Growth-dependent
  • Learn about categories of goals from whole group to individual and why you might choose one over the other
  • Design the best parameters for an actual goal based on your own students

Week 4: Completing the SLO

  • Learn to explain your rationale and flesh out your goal for presentation to your evaluator.
  • Examine and critique example completed tools and goals
  • Complete your own SLO with feedback from other participants and discussions

Week 5: Putting it Together

  • Reflect on aspects of the SLO design process and learn how this process would continue through the instructional interval
  • Leave the class with a comprehensive toolkit for writing your future SLO goals.

Learn More!

Writing Better Multiple Choice: Pitfalls and Solutions


The multiple choice question, also known as the selected-response or forced-choice question, is an important element of any teacher’s assessment toolbox.

Quality assessment hinges on appropriate question formats, which often include the multiple choice question.

Multiple Choice Question Anatomy

Most forced-choice questions consist of a question and several options (typically four), one of which is the single, undisputed correct answer.

[Figure: Anatomy of a multiple choice question]

The other choices, the “wrong answers,” are called distractors.  Don’t underestimate the power of the wrong answer choices when developing valid questions.

Multiple Choice Pitfalls to Avoid:

  1. Wordy Distractors:  These choices test students’ reading ability rather than their mastery of the objectives.
  2. Overlapping Options:  Often considered “trick questions,” these lack validity.  Choices should be mutually exclusive.
  3. Heterogeneous Structure:  Mixed length, style, etc. can provide clues about the right answer.  Write your distractors with the same tense, sentence type, length, etc.


Possible Distractor Types:

The “Obvious” Distractor:

Some assessment experts recommend including a single choice that most students can easily eliminate. This helps students narrow the focus from four choices to three.  However, be careful: the obvious option often becomes a “throw-away” answer that can make the question too easy.

[Figure: Example question with an obvious distractor]

Most students will be able to eliminate “c” as it is not a large land animal.

The Unsupported Statement

This statement could possibly be true but is not supported by the text, evidence, or data provided.  Often this answer choice sounds “smart,” includes big words, and appeals to readers’ biases. It may appear to be true, or potentially be true, but it is not supported by the evidence provided.

[Figure: Example question with an unsupported-statement distractor]

The story explains that the comics used to be called the “funnies.”  Charlie Brown was often sad, and sometimes bad things even happened to him.  “A” is unsupported because the article does not discuss the dog or its human qualities.  There may not have been other comic strips at that time with a dog like Snoopy, but we don’t know that from the article.

The Distortion of Truth

This choice represents a conclusion that distorts what is provided in the text, evidence, or data.  It might contradict the meaning of the source, draw a conclusion beyond what the provided information supports, or take words or phrases out of context.


[Figure: Example question with a distortion-of-truth distractor]

Choice “c” is a distortion of the truth.  The article does say that Charles Schulz had comics rejected by the newspapers, so it is partly a true statement.  However, it is likely not what makes him different from other comic strip authors, as they may have been rejected as well.

The Extremist

These choices contain exaggerations of the truth, using extreme words like “everyone,” “all of the time,” or “never” when the text, evidence, or data does not support them.  Discerning students can typically eliminate these choices with a more careful look.

[Figure: Example question with an extremist distractor]


Of course, the extremist here is “A,” claiming that Peanuts is the funniest, when there is no evidence in the text about its ranking compared to other comic strips of the time.

The Skip and Switch

This is the choice a student reaches when a step in thinking is skipped or altered.  It is a common variation of wrong answers in math or science problems.  The choice might give a correct answer to a different question, or it might be reached by solving the problem with an incorrect method.

[Figure: Example coordinate-plane question with skip-and-switch distractors]

Here, choice “D” is one skip and switch, catching students who have swapped the “x” and “y” variables.  Choice “c” is also a skip and switch, as it is the location of a different point.

[Figure: Example order-of-operations question with skip-and-switch distractors]

In this second example, “a” and “c” come from skipping steps or applying the order of operations incorrectly.  These wrong answers typically reveal misconceptions in student thinking and are therefore useful for reteaching.


Turning Pre-Assessment Data into Growth Goals

It can be overwhelming.  Many teachers ask me, “How can I translate this spreadsheet of numbers into actionable items?”  When presented with a list of numbers, it can be a struggle to translate it into students and need levels.  Here are some starter ideas to help teachers get more comfortable with their data.

Imagine a sample set of data from an initial, baseline assessment:

Student A: 6%

Student B: 34%

Student C: 33%

Student D: 56%

Student E: 58%

Step 1: Highlight outliers: data points which are far above or below the majority of the other data points

The 6% score of Student A would be an outlier because it is at the lowest end of the data set and 27 points from the next lowest score.  Outliers identify students who are far below or far above the majority of the class.  These students will likely require unique attention.
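A minimal sketch of this outlier check, assuming an invented 20-point "gap to the nearest score" rule of thumb (the article does not prescribe a threshold):

```python
# Sketch: flag outliers as scores far from every other score.
# The 20-point gap threshold is an invented rule of thumb, not from the article.
scores = {"Student A": 6, "Student B": 34, "Student C": 33,
          "Student D": 56, "Student E": 58}

def outliers(data, gap=20):
    flagged = []
    for name, score in data.items():
        # Distance to the closest other score in the class
        nearest = min(abs(score - other) for n, other in data.items() if n != name)
        if nearest >= gap:
            flagged.append(name)
    return flagged

print(outliers(scores))  # ['Student A'] -- 27 points from the next lowest score
```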

Step 2: Examine both the Average & Range: look at how the class did as a whole as well as the spread within the scores.  Small ranges indicate most students are in the same place; large ranges indicate a wide variety of ability.

In this case, the average is 37.4%.

The range is quite large (52 points, from 6% to 58%).  This indicates a variety of ability levels in the classroom.  With large ranges, teachers may want to group similar students to differentiate the various need levels.  A large range also makes the average score less meaningful; in this case, grouping score clusters and looking at the averages of the smaller groups would be very valuable.
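For the sample data, the average and range can be checked directly:

```python
# Average and range of the sample baseline scores
scores = [6, 34, 33, 56, 58]

average = sum(scores) / len(scores)      # 187 / 5 = 37.4
score_range = max(scores) - min(scores)  # 58 - 6 = 52

print(average, score_range)
```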

Step 3: Zoom Out: This is an easy way to look for natural breaks in the data or clusters for tiered targets.

In the example data set, we can see some similar score groups.  Student B and Student C, at 34 and 33, scored similarly, whereas Student D and Student E are clustered at 56 and 58.  A larger set of scores, which most teachers will have, will reveal more clusters.
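A simple way to sketch this clustering is to sort the scores and split wherever neighboring scores differ by more than some gap. The 10-point gap threshold here is an invented choice, not a fixed rule:

```python
# Sketch: find natural breaks by sorting scores and splitting at large gaps.
# The 10-point gap threshold is an invented choice for illustration.
scores = sorted([6, 34, 33, 56, 58])

clusters = [[scores[0]]]
for prev, cur in zip(scores, scores[1:]):
    if cur - prev > 10:
        clusters.append([cur])      # large gap: start a new cluster (tier)
    else:
        clusters[-1].append(cur)    # small gap: same cluster

print(clusters)  # [[6], [33, 34], [56, 58]]
```

Each resulting cluster is a natural candidate for a tier in the tiered-target methods described earlier.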

Step 4: Zoom In:  Look at individuals; using other data (observation, RtI, etc.), teachers can make better choices about realistic goal setting.

This is a great time to look at outliers.  Students who scored far above or below the majority should be examined using other available data.  This may include talking with other teachers who work with the individual, or reviewing the student’s historical data.