Podding is a teaching strategy popular within RTI in which teachers organize students into groups based on individual learning goals. Ideally, students from multiple classes are organized into three groups according to the learning goals they are struggling with. Teachers then take turns, usually in two-week rotations, teaching each group. For example, you could have one group for students learning what theme is, one group for students learning how to identify a theme in a narrative, and one group for students learning how to support their interpretation of a theme with evidence through literary analysis.
The learning goals should be connected and scaffolded. Once a student masters a learning goal, they move into the next group. This fluidity is seen as integral to the process: by keeping the groups fluid, you promote a growth mindset and, hopefully, avoid the risk of students feeling labeled if placed in a lower group. That, at least, is how Podding is supposed to work. However, I fear that Podding is more commonly organized as a below-grade-level group, an at-grade-level group, and an above-grade-level group. This second way of doing Podding is very risky, as it can make students in the lowest group feel labeled as unintelligent, something which, as I believe I have established in previous articles on the topic of expectations, is the worst thing a teacher can do pedagogically speaking.
RTI and Podding are not mutually exclusive concepts; either strategy can be executed independently of the other. However, the two ideas are closely connected, and many schools that participate in RTI implement Podding. According to meta-analysis, RTI is a high-yield strategy: John Hattie reports an effect size of 1.29 for RTI. However, as Dylan Wiliam has pointed out, in the majority of RTI studies the effect size was negative. For these negative results to be balanced out into such a high overall effect size, RTI studies must be producing both extremely high and extremely low results.
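To illustrate how a high average effect size can coexist with a majority of negative studies, here is a toy calculation. The numbers below are entirely hypothetical and invented for illustration; they are not drawn from any actual RTI study:

```python
# Ten imagined studies: seven with slightly negative effect sizes,
# three with extremely positive ones. All values are made up.
effects = [-0.2, -0.3, -0.1, -0.25, -0.15, -0.2, -0.1, 4.5, 4.8, 4.9]

negative = sum(1 for e in effects if e < 0)
mean = sum(effects) / len(effects)

print(f"{negative} of {len(effects)} studies are negative")
print(f"mean effect size = {mean:.2f}")
```

In this invented scenario, seven of ten studies show a negative effect, yet a few extreme positives pull the average effect size all the way up to 1.29. The point is simply that a headline average can hide a wildly polarized distribution of results.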
I believe Podding might be a possible explanation for these discrepancies. I have worked in schools that use Podding and have experimented with it myself over the course of several years. I have seen how wonderful a tool it can be when used as intended. However, I have also seen how easy and tempting it is for schools and teachers to implement it with fixed rather than fluid groups. Now, to be clear, RTI instructors do not recommend that schools implement fixed groups; however, I think this gap between theory and practice, and the overall difficulty of implementing Podding effectively, might provide a plausible explanation for the extreme discrepancies in RTI research in general.
While there are, to the best of my knowledge and research, no meta-analyses of Podding directly, we do have a great deal of research, including meta-analyses, on fixed between-class ability grouping. Over the past 100 years, this research has consistently shown low to negative results. Steenbergen-Hu et al. conducted a second-order meta-analysis that looked at 11 other meta-analyses and 280 individual studies on the topic. I have attempted to break down their research below, to highlight the overall ineffectiveness of between-class ability grouping and the problem with organizing Podding in this way.
Kulik and Kulik conducted a meta-analysis in 1982 and found no significant overall effect for between-class ability grouping. They repeated their meta-analysis in 1984, 1985, 1987, and 1992, each time finding no significant effect. Noland et al. conducted a meta-analysis in 1986, again with no significant overall effect. Robert Slavin conducted a meta-analysis in 1987 and found a significantly negative effect size (g = -0.56). Slavin repeated his meta-analysis in 1990 and 1991 with only slightly negative results of g = -0.03 and g = -0.01 respectively, which I would argue are statistically insignificant. Henderson's 1989 meta-analysis found a negative effect size of g = -0.34. Mosteller conducted a meta-analysis in 1996, which also found no significant impact. And in 2016, Steenbergen-Hu et al. conducted a second-order meta-analysis of all previous meta-analyses and found an ever-so-slightly negative effect size of g = -0.03.