Updated Middle School Math Decision Guidelines - click for link
Multi-Tiered Systems of Support
Overarching Themes of MTSS/RTI (NASDSE, 2008)
1: High Quality, Evidence-Based Instruction and Intervention
Problem Solving and Planning Process:
This year, many have heard the acronym MTSS rather than RTI. The change reflects the objective of varying the intensity of services based on the intensity of student need. Some of our students may require multiple levels of support rather than a single ‘intervention’. MTSS better describes a framework that is dynamic and allows a better fit to each student's needs. MTSS also allows for an integration of academic and behavioral support rather than addressing one area or the other. Through analysis of assessment data, services can be efficiently and effectively allocated.
- Universal Screening
- High Quality Intervention
- Skill by Treatment
- High Quality Instruction
- Monitoring Progress Using CBM/DBR
- Problem Solving Teams (PST)
- Data-Based Decision Making
- Cheat Sheets / Resources
DISTRICT RESOURCES (Target/Trigger Tables):
Link to the current academic year assessment calendar - click for link. Assessments are listed by grade level and time of year.
Score Range Descriptions and Rationale:
On all district guiding documents, the following color coding will be used. (See Universal Screening Resource Table above)
Seasonal targets and triggers were determined based on the following rationale and are color coded in the following manner:
Blue - Talent Development Consideration: A score or level that is well above the proficiency target.
Green - On Track: Met the Tier 1 target or proficiency scores obtained from the 2017 NWEA Linking Study. The study used an equalization method to provide target scores for each grade and season. Students meeting or exceeding target scores are likely to obtain proficient scores on the Forward Exam.
Yellow - Above Trigger/Not on Track Yet: Student did not meet the Tier 1 target but scored above warning trigger levels. Consider targeted instruction based on needs.
Red - Not on Track/Below Warning Trigger: When possible, the 25th percentile or below was used as a warning indicator or 'trigger' score. On TC running records, triggers were established using levels indicating students were below expectations. Furthermore, Fletcher, Lyon, Fuchs, and Barnes (2007) indicated that the 25th percentile is typically the level at which students begin to be considered at risk for reading or math disorders and/or require consideration for intervention.
White column - Score to Predict ACT Score of 24 (MAP RIT Score): Thum and Matta (2015) updated the work of Theaker and Johnson (2011) to provide guidelines regarding fall and spring scores that are likely to indicate ACT scores of at least 24.
Average RIT Gains (MAP RIT Scores): Based on the 2015 National RIT Scale Norms Study (NWEA), the average growth from Fall to Winter, Winter to Spring, Fall to Spring, and Fall to Fall of the following year is listed. These scores provide estimates of expected growth, so teachers, parents, and possibly students can set realistic goals for moving from one range to another, such as moving from the 25th percentile toward the 50th percentile at a pace consistent with expected growth.
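The color bands above amount to a simple decision rule. A minimal sketch of that rule follows; the function name and all threshold values are placeholders, not actual district targets from the target/trigger tables:

```python
def score_band(score: float, tier1_target: float, warning_trigger: float,
               talent_cutoff: float) -> str:
    """Map a screening score to the district color band.

    tier1_target, warning_trigger, and talent_cutoff stand in for the
    grade- and season-specific values from the district target/trigger
    tables; the numbers in the example below are hypothetical.
    """
    if score >= talent_cutoff:
        return "Blue"    # Talent Development consideration
    if score >= tier1_target:
        return "Green"   # On Track
    if score > warning_trigger:
        return "Yellow"  # Above trigger / not on track yet
    return "Red"         # Not on track / below warning trigger

# Hypothetical fall reading values for one grade:
print(score_band(212, tier1_target=205, warning_trigger=195, talent_cutoff=220))  # Green
```

The same function could be reused for any grade and season by swapping in the published target and trigger values.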
Winter screening will no longer occur, in an effort to make more efficient use of existing measures and to reduce assessment time. Furthermore, recent research revealed that winter scores did not significantly add to decision-making accuracy (VanDerHeyden, Burns, & Bonifay, 2018). When winter updates are needed by teams, it is recommended that benchmark Curriculum-Based Measures (CBM) be conducted.
References for Tiers of Performance:
The School District of Elmbrook Academic Universal Screening measures are:
Northwest Evaluation Association Measures of Academic Progress® (MAP®) given three times per year (Oct, Jan and May) for K-8 and selected high school students
NWEA_2015_Full_Norming_Study.pdf (new, used for updated targets/triggers)
- MAP Reading and Math will be assessed during the fall, winter, and spring test windows.
Oral Reading CBM: easyCBM Word Reading and aimsweb Passage Reading Fluency curriculum based measures (CBM)
Teachers College of Reading and Writing: Running Records
Wisconsin Forward: Grades 3-8 and 10
Reading Fluency: aimsweb Oral Reading Fluency
Reading Fluency will be assessed using passages from aimsweb. Students in grades one through five will be asked to read three passages. Words read correctly in a minute will be recorded along with the student's accuracy.
Teachers College of Reading and Writing:
Classroom teachers administer these assessments to students individually. These assessments help teachers identify which level of texts students can read independently and will therefore be able to practice all the reading strategies they are learning during the Reading Workshop.
Universal Screening Resources and References:
Interpreting the easyCBM Progress Monitoring Test Results. (2013). Riverside. (resources are available on the 'lite' side of the easyCBM website, click to access)
Northwest Evaluation Association (NWEA). (2011). RIT scale norms study. Available at: http://www.nwea.org/sites/www.nwea.org/files/resources/NWEA%202011%20Norms%20Report_01.17.2012_2.pdf
Wisconsin Department of Public Instruction. Wisconsin Forward Score Ranges. Retrieved from: https://dpi.wi.gov/assessment/forward/resources
High Quality Intervention
The interventions listed below are linked to one page summaries. The summaries provide a general description of the interventions, skills targeted, and entry and exit guidelines. Monitoring guidelines including measures, frequency, and discontinuation guidelines are included.
Area of Need: Reading Decoding/Basic Reading
Middle: Just Words, Sonday Level 1, Sonday Level 2, Wilson Reading
High: Sonday Level 1, Sonday Level 2, Just Words, Wilson Reading
Elementary: HELPS, Quick Reads, Read Live (formerly Read Naturally), Repeated Reading
Middle: HELPS, LLI-Purple, LLI Gold, Read Live
High: Read Live, HELPS-depending on reading level, Repeated Reading Intervention- (when probes unavailable or all used)
Link to Read Live Materials- click here - courtesy of Read Live
Link to HELPS Reading Fluency Intervention
Elementary: Bridges Intervention, Do the Math, Number Worlds, Spring Math (pilot 17-18)
Middle: ALEKS, Do the Math Now!
Incremental Rehearsal: Consider use of this Intervention Protocol when mastery of math facts is needed (prerequisite skill for math computation). Adaptable for all grade levels.
Elementary: Dreambox Learning
Middle: Math Support, ALEKS
High: Do the Math Now!
Supplemental Resources: As of the 2016-2017 school year, students will have access to Dreambox Learning and ALEKS.
Dreambox Learning: Grades K-5 (approximately)
ALEKS- Grades 3-12 (approximately)
Writing: Here we go! This will be developed over time, so check back often. Links and information will continue to be added and modified. For now, all writing intervention will be provided K-12 at the sentence and passage levels.
Self-Regulated Strategy Development (SRSD)
- In Summer 2018, we partnered with ThinkSRSD to begin the process of adopting SRSD as our writing intervention at the passage level. More training will occur this summer.
- Resources are available from ThinkSRSD; sign up to gain access.
Behavior : click to view past version of interventions on MTSS- Behavior side of site
Elementary: Behavior Contract, Check in Check out, Social/Academic Skills Group
Middle: Behavior Contract, Check in Check out, Social/Academic Skills Group
High: Behavior Contract, Check in Check out, Lancer House-BCHS
An intervention can be defined as anything a school does, above and beyond what all students receive, that helps a child succeed in school. The Wisconsin Department of Public Instruction (DPI) has also provided some recommendations regarding the definition of ‘intensive intervention’ if students do not respond to instruction or intervention.
DPI recommends that intensive interventions are:
- Used with individuals or small groups
- Focus on single skills or small groups of targeted skills
- Substantial number of minutes (core general ed. + 30, 60, 90 minutes)
Interventions should also meet the following criteria:
- Delivered in a manner consistent with design
- Be aligned to student need (based on data)
- Be culturally appropriate
In the School District of Elmbrook, we will select interventions that match the student level of skill acquisition. Haring et al. (1978) established the following stages of learning: Acquisition, Fluency, and Generalization/Adaptation. Targeting intervention by skill matched to the stage of learning provides maximum potential for benefit to our students.
The term ‘treatment integrity’ is interchangeable with ‘treatment fidelity’ when used to describe delivery of an intervention. Both terms are receiving more attention and focus in education, psychology, and medicine. Without knowing how well an intervention or practice was delivered, how can one know whether the obtained results reflect the practice itself or the way it was delivered? Although measuring intervention integrity is difficult, we will strive to provide high quality practices, methods, and interventions to our students. This LINK presents a proposed process for developing intervention integrity checklists. The focus of the checklist is twofold: 1) the checklist is an efficient reminder for the teacher to focus on the key elements of instruction for the intervention, and 2) the checklist is a method for peer observation, so that those who are less familiar with the intervention can still provide feedback to the interventionist. The sole purpose of an intervention integrity checklist is to serve as a vehicle for conversations to improve outcomes for students.
Initially, interventions present in the system were used to build our inventory. As we continue to refine our assessment and intervention frameworks, continual review of intervention outcomes will be conducted. Continual review of existing and exploration of new tools and interventions will be conducted to ensure we provide our students the best service possible.
To evaluate new products, this process will be followed:
1. The team identifies the discrete skill addressed by the intervention.
2. Independent Research Review: use of ERIC, What Works Clearinghouse, National Center for Intensive Intervention, Evidence for ESSA, Google Scholar or other academic database to collect relevant, independent supporting research of the product in review.
3. Review of Publisher based research.
4. Review of district data (assessment) to determine the intensity of need.
5. Development of a pilot plan to deploy the intervention or product to a limited group. A team of representative professionals will be selected to evaluate use of the intervention or product.
6. Selection of assessments to be used to conduct program evaluation (pre-post) or effect of intervention on pilot group.
7. To summarize findings and plan for the future, the Hexagon Tool from the National Implementation Research Network (NIRN) will be completed by the representative group. Click to access the Hexagon Tool.
Questions to ask when considering new products or materials. - Click for Link.
|Completed Intervention Reviews-click link to access review|
Blase, K., Kiser, L., and Van Dyke, M. (2013). The Hexagon Tool: Exploring Context. Chapel Hill, NC: National Implementation Research Network, FPG Child Development Institute, University of North Carolina at Chapel Hill.
Please call or email Chris Birr, MTSS Coordinator with questions/comments.
High Quality Instruction
High quality instruction (curriculum, instruction, and assessment) is engaging, standards-based, data-driven, and research-based. All students should receive high quality, core academic and behavioral instruction that is differentiated for student need and aligned with the district standards for mathematics and English language arts and other state and local standards. Standards assist in providing consistent grade-level benchmarks. Core instruction refers to the curriculum, instruction, and assessment that all students receive. Core curriculum stems from and is directly shaped by the standards, the district curricular framework, and the effective use of formative, summative, and benchmark assessments. All curriculum and instructional practices should be examined against their evidence base and the educational context within which they will be implemented.
Research summaries will be developed to guide best practices in instruction and assessment. Archives will be maintained below.
Interventions are intended to increase student performance in the general curriculum for students who are not meeting benchmarks in a particular curricular area. Additional challenges are intended to meet the needs of students who are exceeding benchmarks. For students whose screening data indicate they are likely either to not meet or to exceed benchmarks in a particular instructional area, educators use data in a collaborative process to determine appropriate interventions or additional challenges, matched to a student's particular area of need. Within this process, the intensity of intervention or additional challenge is also determined. Students continue to access core curriculum, instruction, and assessment in addition to these small group or individual interventions or additional challenges. In certain cases when students exceed benchmarks, a collaborative team may determine that an additional challenge may most appropriately take place in lieu of core instruction.
The process of identifying and using interventions and additional challenges is flexible and fluid. The intensity and nature of the interventions or additional challenges should be adjusted based on a student's responsiveness as evidenced by multiple data sources. Instructional time, frequency of instructional sessions, size of the instructional group, level of instruction, instructional technique, and instructional provider are examples of adjustments that can be made to respond to student need. Interventions and additional challenges, as components of high quality instruction, should each be culturally responsive and appropriate for the students being served (Wisconsin RTI, 2010).
Below is a listing of high quality resources used in the development of the Elmbrook MTSS framework.
Monitoring Progress using Curriculum Based Measurement (CBM) and/or Direct Behavior Rating (DBR)
Elmbrook Progress Monitoring Guidelines- click for link (REVISED 3/19)
Curriculum-Based Measurement (CBM) is a form of assessment that allows for efficient and repeated measurement of a child's academic skills in specific areas. CBMs can be used as a universal screening tool (2-3 times a year for all students) or to monitor progress for those receiving additional, more intensive instruction. In general, a child is administered various measures on a regular basis (e.g., monthly, biweekly, or weekly) to assess progress toward expected outcomes. The frequency of assessment depends on the child's progress and the intensity of intervention. Alternate forms of CBM are controlled for difficulty so progress can be assessed as the child receives instruction or intervention; gains in performance then accurately reflect skill acquisition. CBMs are typically used to monitor the progress of students in academic interventions. For students receiving interventions or supplemental instruction, we will monitor progress at least every other week.
The advantages to using CBM to monitor progress are:
- efficiency: measures take 1-10 minutes to administer
- alignment to basic skills
- reliability (similar scores result from different assessors)
- ability to use as a universal screening tool and to monitor skills more frequently
Questions to consider when viewing student CBM performance:
- Is the student making adequate progress to close the gap between his/her current performance and grade level peers? CONTINUE
- Is the student making progress, but not enough to close the gap with grade level peers? CHANGE- intensity, frequency, dosage
- Is the student making less progress than his/her peers? CONSIDER SOMETHING ELSE
- Initial decisions, such as whether to increase intensity or frequency, can be made after 4-6 weeks
- More significant decisions require at least 10-12 data points
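The questions and data-point rules above can be sketched as a single decision function. This is an illustrative simplification, not a district formula; the slope comparison used to operationalize "closing the gap" is an assumption:

```python
def cbm_decision(student_slope: float, peer_slope: float,
                 n_data_points: int) -> str:
    """Apply the CONTINUE/CHANGE/CONSIDER questions to a monitoring graph.

    Slopes are growth rates (e.g., words correct per minute gained per
    week); peer_slope is the typical rate for grade-level peers. Mapping
    the questions to slope comparisons is an illustrative assumption.
    """
    if n_data_points < 10:
        return "Wait: 10-12 data points are needed for significant decisions"
    if student_slope > peer_slope:
        return "CONTINUE: the gap with grade-level peers is closing"
    if student_slope > 0:
        return "CHANGE: adjust intensity, frequency, or dosage"
    return "CONSIDER SOMETHING ELSE"

# A student gaining 1.5 words/week against a typical 1.0 words/week:
print(cbm_decision(1.5, 1.0, 12))
```

In practice the team would read these values off the student's progress monitoring graph rather than compute them by hand.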
Hosp, M. K., Hosp, J. L., & Howell, K. W. (2006). The ABCs of CBM: a practical guide to curriculum-based measurement (1st ed.). New York: Guilford Publications.
Direct Behavior Rating (DBR) is an evidence-based measurement method that allows for quick, efficient, and reliable monitoring of behavior. More information on DBR can be found here. The typical starting point for DBR is measurement in three areas:
- Academic Engagement- actively or passively participating in the classroom activity
- Respectful- compliant and polite behavior in response to adult direction and/or interactions with peers and adults
- Disruptive- student action that interrupts regular school or classroom activity
Baseline for DBR consists of 7-10 data points (e.g. 5 days of AM and PM ratings). Monitoring could consist of twice weekly checks that are entered and graphed in ion.
Goals for DBR areas are:
8-10 (80-100% of the time) for Academic Engagement and Respect
0-2 (0-20% of the time) for Disruption
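The goal ranges above can be checked mechanically for a single rating set. A minimal sketch (the function name is hypothetical; ratings are on the standard 0-10 DBR scale):

```python
def dbr_on_target(engagement: int, respect: int, disruption: int) -> bool:
    """Check one set of 0-10 DBR ratings against the goal ranges above:
    8-10 for Academic Engagement and Respect, 0-2 for Disruption."""
    return engagement >= 8 and respect >= 8 and disruption <= 2

print(dbr_on_target(engagement=9, respect=8, disruption=1))  # True
```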
Progress Monitoring Resources
District Guidance Documents/Resources:
NEW- Writing CBM Cheatsheet
Below are links to collections of articles and references used in the selection and justification of CBM as an assessment to monitor progress.
School District of Elmbrook Problem Solving Team (PST) Process:
1. Problem Identification: Identify the gap between the student's performance and the expected performance for his/her grade level. When available, use two sources of data to identify a difficulty (e.g., MAP and Oral Reading, MAP and Teachers College Running Record). The following sources may be considered:
· Universal Screening Measures
· Grade level expectation and student’s level of performance (gap)
· Past performance and educational history
· Classroom performance and available classroom data
2. Problem Analysis: Once a problem statement has been identified, the team should use available data and additional assessments to narrow in on a root cause or specific area of concern. Again, use 2 out of 3 measures to confirm a weakness is present. Additional assessment using CBM should be conducted when assessment results are contradictory. Prior performance can also be considered: is the weakness consistent?
To accurately target an intervention, the Stages of Learning (Haring & Eaton) will be referenced. The stages are:
1. Acquisition
Student responses are slow with frequent errors and require supervision and frequent feedback. Goal: 90-95% accuracy before allowing independent practice.
2. Fluency
Goal: facilitate automaticity of skills with a high level of accuracy (timed and accurate, 95% or more).
3. Generalization and Adaptation
Goal: use learned skills differently than taught (e.g., horizontal to vertical alignment of math calculations).
Example: If a student struggles to learn how to decode text, providing a fluency intervention will likely result in frustration and limited results.
3. Plan Development: Following the problem analysis, the team will select an intervention that best meets the student's need and enter the plan in Ion.
The plan will include:
· Specific skill area addressed
· Number of days/week
· Number of minutes/session
· Goal and expected rate of improvement
4. Plan Implementation : The PST plan will be implemented and reviewed monthly. Monitor fidelity of delivery.
- Is the student present for the intervention?
- Is the intensity enough? (too many students, too few days)
- Are the right skills targeted?
5. Plan Evaluation: After a minimum of 4 weeks, teams will check and determine if the student is demonstrating gains in the intervention. If results are less than expected, consider intensifying the intervention and ensuring only 1-2 skills are targeted. To have adequate confidence in progress monitoring results, 8-12 weeks of data are necessary. However, monitor outcomes and adjust as needed.
Burns, Matthew K., and Kimberly Gibbons. Implementing Response-to-intervention in Elementary and Secondary Schools: Procedures to Assure Scientific-based Practices. New York: Routledge, 2012.
Thomas, Alex, and Jeff Grimes. Best Practice in Instructional Consultation and Instructional Consultation Teams. Best Practices in School Psychology. Bethesda, MD: National Association of School Psychologists, 2008.
Data-Based Decision Making:
- Below 50th Percentile on most recent MAP: Give a closer look at all students scoring below the 50th percentile. (This does NOT imply intervention is necessary)
- Reading: Examine Oral Reading and determine if students demonstrate accuracy below 93% correct.
- Confirm with performance on Running Record
- Rule in or out phonological awareness needs (see Reading TLS and/or school psych for more information)
- Math: Examine previous MAP scores to determine if current performance is consistent or an outlier
- Can also view Math Fact Fluency to identify weaknesses
- Triggers do not mean mandatory intervention, teams need to consider student need and all relevant information to select best way to increase skills.
"2 out of 3" rule continues to apply.
- For example, if MAP and OR scores are below triggers, the team should consider intervention. If only one measure is below its trigger, examine student performance (time of assessment, subtest scores) or collect more data, such as a CBM.
- CBM is a confirmatory measure when assessment results contradict.
- When making student level decisions, intervention is not mandated when 2 triggers are met.
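The "2 out of 3" rule above can be sketched as a small tally over the screening measures. The function name and return strings are illustrative; the rule itself informs, but does not mandate, the team's decision:

```python
def review_triggers(flags: dict) -> str:
    """Apply the '2 out of 3' rule to a student's screening triggers.

    flags maps measure names (e.g., 'MAP', 'Oral Reading', 'CBM') to
    whether the score fell below that measure's trigger.
    """
    n_below = sum(flags.values())
    if n_below >= 2:
        return "Team should consider intervention (not mandatory)"
    if n_below == 1:
        return "Examine performance or collect more data, such as a CBM"
    return "No triggers met"

print(review_triggers({"MAP": True, "Oral Reading": True, "CBM": False}))
```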
School teams will use screening data (MAP, TC, OR) to make decisions regarding student need and whether or not more intensive instruction or intervention is needed.
In some situations, additional information will be needed to either confirm decisions or help narrow in on specific skills for intervention. Diagnostic assessments will be used to gather more specific information to guide intervention or instruction. The assessments may be delivered by a classroom teacher, teaching and learning specialist, learning support staff, or school psychologist.
Winter Benchmark Guidelines
Winter MAP will no longer be administered to reduce testing load and due to evidence that decisions were not greatly enhanced by this information. The following link provides suggestions of measures to administer if teams would like an interim measure for students who are close to exiting interventions or may appear to need more intensive instruction.
Triggers considered are:
Below the 25th Percentile Nationally:
- Below 25th percentile in MAP Reading and/or Math
- Below 25th percentile on Oral Reading Assessments
- If confirmatory CBM are used, scores below the 25th percentile (+/- 2 points)
- Below 93% read correctly on passages (OR reading assessment)
- Below 75% read correctly on Word Reading Fluency (1st grade Fall)
- K-1 PALS: missed sum benchmark - automatic entry into intervention per statute
- 1st PALS is administered only in fall
- Minimal or Basic on Wisconsin Forward (Formerly WKCE, Badger 3-8)
Below district criteria:
- Below Trigger for Teachers’ College Assessment (Reading)
- Below district targets/triggers for writing assessment (TBD)
- Below expectations on Math fluency measures- less than 20 correct per minute
Consider whether intervention is necessary:
- What specific skill instruction does this student need to close the gap and eventually meet tier 1 targets?
- How can those skills be most effectively taught to the student?
Progress Monitoring Rules applicable to students receiving academic intervention:
- Initial CBM conducted within the fall benchmark period (Sept 1-Oct 15) or within 2 weeks of the initiation of intervention.
Minimum monitoring is every other week. Conduct weekly progress monitoring when:
- MAP Reading is below the 15th percentile AND CBM scores are below the 25th percentile
- The rate of improvement is unclear and/or below the aimline
Teams can increase the frequency of monitoring to weekly.
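The frequency rules above reduce to a short decision function. A sketch, with hypothetical names (percentiles are the student's most recent national percentile ranks):

```python
def monitoring_frequency(map_percentile: float, cbm_percentile: float,
                         below_aimline: bool) -> str:
    """Choose a progress monitoring frequency per the rules above.

    Weekly monitoring when MAP Reading is below the 15th percentile AND
    CBM is below the 25th, or when growth is unclear/below the aimline;
    otherwise the every-other-week minimum applies.
    """
    if map_percentile < 15 and cbm_percentile < 25:
        return "weekly"
    if below_aimline:
        return "weekly"  # teams may increase frequency when growth is unclear
    return "every other week"

print(monitoring_frequency(10, 20, below_aimline=False))  # weekly
```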
The following measures can be used up to weekly :
- Letter Name, Letter Sounds, Phoneme Segmenting- early Literacy Fluency Measures
- Continue Early Literacy areas until spring 50th percentile score reached with >90% accuracy
- Word Reading Fluency: Nonsense Word Fluency: First grade
- Discontinue when spring 50th %ile score reached with 90% accuracy
- Passage Reading Fluency: End of First-Eighth grade
- AIMSWeb: CBM-OR (reading fluency), Maze, Math computation, Math applications and concepts, spelling, writing
Exit Guidelines : Triangulate data when considering exit from intervention. Consider monitoring using CBM up to weekly when making exit decisions.
- Student is approaching the 50th percentile on three subsequent CBMs and has 4 points above the goal line.
- If trendline is close to 50th and student hits 50th, consider exit
- MAP and/or other measure is approaching 50th percentile
- EXIT RULE OF THUMB: If student is approaching the 50th percentile for the next benchmark season, consider exit after 2-3 probes.
- The objective is to establish a clear trend that the student is on track to reach the 50th percentile with accuracy. Maximize the probability of future success as best you can.
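The exit rule of thumb above can be sketched as a check on recent probes. Note the 10% margin used to operationalize "approaching the 50th percentile" is an illustrative assumption, not district policy:

```python
def consider_exit(last_three_scores: list, next_benchmark_50th: float,
                  points_above_goal_line: int) -> bool:
    """Sketch of the exit rule of thumb above.

    'Approaching the 50th percentile' is approximated as all of the last
    three probes falling within 10% of the next benchmark season's 50th
    percentile score (an illustrative assumption), combined with at least
    4 data points above the goal line.
    """
    approaching = all(s >= 0.9 * next_benchmark_50th for s in last_three_scores)
    return approaching and points_above_goal_line >= 4

# Three recent probes near a hypothetical 50th percentile cut of 100:
print(consider_exit([95, 98, 102], 100, points_above_goal_line=4))  # True
```

As the guideline says, the point is to maximize the probability of future success; the team triangulates this check with MAP and other measures before exiting.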
Link to Document regarding Data Management Systems in Elmbrook- click to access a description of where data is entered, stored, and analyzed
ion- Data Management System
ion will be our software to record student plans and view reports on students. Cheat sheets are available at the link below:
ion information- click here for cheat sheets- updated frequently
aimsweb will continue to be our primary progress monitoring tool to assess the progress of students receiving academic intervention or supplemental instruction. Minimum monitoring will occur every other week.
All presentations will be listed here with the most recent listed first:
Collaboration resource for private schools interested in RTI/MTSS
March 2, 2016 Data Based Decision Making Using CBM - WASDA Data Summit, Green Bay, WI
October 16, 2015 AWSA Elementary Principals' Convention- click for link
March 5, 2015 RTI Summit in Green Bay, WI - Universal Screening Process.pdf
December 8, 2014 New Teacher MTSS Presentation
Gifted and Talented Parent Advisory Board: 10/15/2014
Quality Educators Conference Presentation from June 2014
Teaching and Learning Committee Update from 10/2013 with glossary of terms from WI-RTI Center:
MTSS/RTI related articles written for associations and journals are posted below:
Wisconsin School Psychologists Association (WSPA) Articles: