December 21, 2024

Q&A – Rate of improvement for EasyCBM


Great question from Leasha:

Please help! Our new state policy manual has listed your website as the place to go for calculating our rate of learning for the upcoming school year. Pretty much every county in the state uses Dibels so it is set up and ready to go for them. However, my county uses EasyCBM for reading. Is it possible for me to just edit the benchmark goals, and then the Excel would calculate it properly for students entered, or would I need to do some special tweaking?

My Response:

Absolutely! You can adjust the benchmarks to whatever you need them to be, and the Excel sheet should automatically adjust the ROI calculations for the students you enter.  Good luck!
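For anyone curious about the arithmetic the spreadsheet is doing behind the scenes, here is a rough sketch in Python. The benchmark numbers below are placeholders, not actual EasyCBM benchmarks; substitute your grade-level fall and spring benchmarks and the number of instructional weeks between them.

# Minimal sketch of the arithmetic a benchmark-based ROI sheet performs.
# The benchmark values are hypothetical placeholders.
fall_benchmark = 30      # hypothetical fall benchmark score
spring_benchmark = 90    # hypothetical spring benchmark score
weeks = 36               # instructional weeks between benchmark windows

expected_roi = (spring_benchmark - fall_benchmark) / weeks
print(f"Expected (typical) ROI: {expected_roi:.2f} points per week")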


Q&A – Slope for standard scores


Great question from Lavonne:

My question is similar to that of Jen on July 18, 2011. I am wondering if Rate of Improvement can be graphed and slope comparisons used with standard scores such as those generated by STAR Reading. We also have standard scores on our statewide assessments and could easily generate Excel graphs to show the difference between our students’ performance and that required to pass. We can easily generate our own Excel graphs; we just want to make sure the data are valid. Thank you.

P.S. Thanks so much for this site!

My response:

You can calculate slope for scores that have an equal interval between data points.  STAR Reading, STAR Math, and STAR Early Literacy are good examples of non-CBM data that have been validated as data sets for which you can calculate slope.  Joe Kovaleski and colleagues (2013) just published a book that describes using rate of improvement with computer adaptive tests (CATs), specifically the STAR assessments. I highly recommend getting a copy!
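If you want to see exactly what Excel is computing when it reports a slope, here is a quick sketch in Python. The weekly STAR Reading scaled scores below are made up for illustration; the ordinary least-squares slope is the rate of improvement.

# Sketch of the slope calculation Excel performs with SLOPE() or a linear
# trendline, applied to hypothetical weekly STAR Reading scaled scores.
import numpy as np

weeks = np.array([1, 2, 3, 4, 5, 6, 7, 8])                    # assessment weeks
scores = np.array([410, 415, 422, 420, 431, 438, 441, 447])   # hypothetical scaled scores

slope, intercept = np.polyfit(weeks, scores, 1)                # ordinary least squares
print(f"Rate of improvement: {slope:.1f} scaled-score points per week")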

Kovaleski, J. F., VanDerHeyden, A. M., & Shapiro, E. S. (2013). The RTI approach to evaluating learning disabilities. New York, NY: Guilford Press.


PA RtI Implementer’s Forum

Gearing up to present with Andy McCrea tomorrow as part of PaTTAN’s RtI Implementer’s Forum. As Andy and I were updating the PowerPoint this past week, we realized we’ve been speaking on this topic for 5 years now! It’s been interesting to see the evolution of awareness in our participants over time. Many more participants have access to curriculum-based measurement (CBM) data than when we first started. Even more have figured out how to graph data and generate trend lines in Microsoft Excel, some without realizing they are essentially computing linear regression statistics! Our audience started out as mainly school psychologists but has expanded to teams who are implementing components of RtI in their schools. Tomorrow’s workshop is formatted for just that – teams! Should be another great interactive workshop!
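For anyone who wants to see that last point for themselves, here is a small illustration with made-up weekly oral reading fluency scores: the slope you get by hand from the least-squares formula is the same slope Excel’s linear trendline reports.

# Sketch showing that a linear trendline is ordinary least-squares regression:
# the hand-computed slope matches the library result. Data are hypothetical
# weekly words-correct-per-minute (WCPM) scores.
from statistics import mean, linear_regression

weeks = [1, 2, 3, 4, 5, 6]
wcpm = [42, 45, 44, 49, 53, 55]   # hypothetical ORF scores

x_bar, y_bar = mean(weeks), mean(wcpm)
slope_by_hand = (
    sum((x - x_bar) * (y - y_bar) for x, y in zip(weeks, wcpm))
    / sum((x - x_bar) ** 2 for x in weeks)
)

fit = linear_regression(weeks, wcpm)   # what a trendline computes
print(slope_by_hand, fit.slope)        # the two slopes agree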

RtI Implementer’s Forum Link


Pennsylvania RtI Implementers Forum 2013

Andy and I were just confirmed to conduct a workshop on graphing, calculating, and interpreting rate of improvement in June 2013! Our state technical assistance network is hosting an implementers forum where school-based RtI teams from across the state can come together for professional development. We’re looking forward to presenting to a mixed group of professionals!


Computer Adaptive Tests

I’m in my second year at my current district, where I serve the secondary student population (grades 7-12). My district’s elementary schools are doing a really nice job with their response to intervention (RtI) framework, but we have a lot of work to do at my middle and high schools. At the end of last year, we purchased licenses to use STAR Math and STAR Reading (and STAR Early Literacy for the elementary folks) through Renaissance Learning. I’ve recently been through a couple of trainings on how to administer and interpret the assessments and reports. I’m hopeful that these assessments will do a better job of capturing my older students’ skills, especially for the students in special education, since they’ve been using the same CBM probes for years. Our STAR assessments can be used for universal screening, progress monitoring, and diagnostic purposes. Plus, they tie into the Common Core standards, which is so helpful for teachers who need to make the connection between assessment and high school classes!

Another feature I’m impressed with so far (surprise) is that the system provides you with a Student Growth Percentile. For instance, if a 7th grade student scores a grade equivalent of 4.5 on the Reading assessment, the report will tell me how much a student with that same profile (7th grade scoring 4.5) will typically “grow” by the end of the year. I’ve always wondered “how much growth can we expect?” from our students. I’ll have to see how this plays out for the school year. Because I need another project… :)
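To make the growth-percentile idea concrete, here is a toy sketch of the general logic. This is just the intuition with made-up numbers, not Renaissance Learning’s actual statistical model: compare one student’s fall-to-spring gain to the gains of other students who started with a similar score.

# Rough sketch of the *idea* behind a growth percentile (not Renaissance
# Learning's actual methodology). All values are made up for illustration.
peer_gains = [12, 18, 22, 25, 27, 30, 33, 35, 41, 48]  # hypothetical gains for similar-profile peers
student_gain = 31

percentile = 100 * sum(g <= student_gain for g in peer_gains) / len(peer_gains)
print(f"Student grew as much as or more than about {percentile:.0f}% of similar-profile peers")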

After attending some sobering workshops at my national school psych convention in February, I’m a little worried about the weight we place on student rate of improvement data when research suggests that we need at least 14 data points to have a reliable oral reading fluency trendline (Christ, Zopluoglu, Long, & Monaghen, 2012)! The STAR assessments can provide a reliable trendline after 4 data points. Think about how much sooner we could be making solid instructional decisions?! I’m curious if anyone else is using computer adaptive tests. It will certainly be a learning curve (ha…) for me this year!
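Out of curiosity, here is a quick simulation of the general statistical point (only an illustration, not a replication of Christ and colleagues’ study): slope estimates based on just a few noisy weekly data points bounce around far more than estimates based on many.

# Quick simulation: with noisy weekly scores, slope estimates from only a few
# data points vary much more than estimates from many. Values are hypothetical.
import numpy as np

rng = np.random.default_rng(0)
true_slope, noise_sd, trials = 1.5, 8.0, 2000

for n_points in (4, 8, 14):
    weeks = np.arange(1, n_points + 1)
    slopes = [np.polyfit(weeks, true_slope * weeks + rng.normal(0, noise_sd, n_points), 1)[0]
              for _ in range(trials)]
    print(f"{n_points:2d} data points: slope SD = {np.std(slopes):.2f}")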


NASP Workshop 2012

Thanks to everyone who came out to the workshop session yesterday! Our workshop focused on several aspects of rate of improvement, including (a) how we arrived at the conclusion that rate of improvement is a meaningful statistic that can be used as part of data-based decision-making, and (b) the need to be consistent in how we calculate, graph, and interpret rate of improvement. It was a great opportunity for us to work with practitioners and educators interested in the topic of student growth in relation to eligibility decision-making.  While a presentation that incorporates technology can be daunting, especially given the variability between software versions, we seemed to get through the workshop in a way that reached everyone. However, if you have lingering questions, feel free to email us! We will be posting the latest PowerPoint to the Downloads section of the site this weekend. We appreciate feedback, comments, and questions! We are hoping to be invited to present again at next year’s convention in Seattle, WA!
