Comments

TIPSS #2 – We Could Do Better — 8 Comments

  1. Since Fall 2008 was the start of the recession, how do these patterns compare with students who entered in Fall 2006 or 2007, before the sudden change in our student body?

    • The purpose of this is to show how patterns of persistence unfold over time for a single exemplar cohort. Rather than replicate this with two earlier cohorts, I can provide a direct comparison between cohorts on some of the same types of outcome metrics. I need to figure out how I can post a nicely formatted table as part of my reply, but for now I can provide the following simple table:

                                                                2006  2007  2008  2009  2010  2011
      Completed one or more awards (Cert or AA) within 3 years    7%    8%    9%    9%    8%    6%
      Transferred to a 4-year school within 3 years              11%   12%   11%   10%    8%    9%


      It is very true that the recession changed our students as well as their motivations for persisting. Three-year award completion rates peaked for the 2008 and 2009 cohorts. In contrast, transfer rates were somewhat higher before the recession. Initial retention rates (not shown here) were only slightly higher for the 2008 cohort than for the prior two fall cohorts, but in fall 2009 and fall 2010 initial retention was significantly elevated. The exact impact of the recession varies with the time frame of the outcome metric.

  2. I would like to see an overlay of the changes in financial aid on this continual decline in completion. I do believe that there is a direct correlation.

    • This figure actually doesn’t show any decline in completion – it only shows how one group of new fall-start students (from 2008) persisted over a six-year period. In my reply to Kyle above I tried to include some data showing how completion rates have changed over time; there you can see that completion rates went up during the recession and have since come back down.

      It is certainly true that the percentage of our students applying for aid, and the percentage of applicants who receive aid, both went up sharply during the recession. In addition, there have been some changes to aid rules and to the total amount of aid disbursed. Completion rates ARE associated with financial aid funding at both the student level and the college level, but the story is fairly complex. Students who are economically disadvantaged are at risk for poor outcomes, yet because they qualify for more and better financial aid they tend to do better on some metrics – like initial persistence. The net effect of these competing influences isn’t entirely straightforward, and it depends on which outcome metric you examine. The following table shows how the 3-year completion rate compares for students who received Pell grants and those who did not (split out by whether they applied for any aid).

                                              2006  2007  2008  2009  2010  2011
      No FAFSA                                  6%    8%    9%    8%    7%    6%
      FAFSA, but no Pell award                 13%   13%   14%   12%   14%   10%
      Any Pell grant award in year 1           10%    9%   12%   12%    9%    8%


      It shows the same peak in completion rates for the 2008 cohort, and that in every cohort the group with the highest completion rate is students who applied for aid but did not receive a Pell grant. Most of these students received other aid. In most cohorts, Pell recipients were less likely to complete within 3 years. However, given their relative economic disadvantage, their completion rates are encouraging compared to the rates for students who did not apply for aid at all. Students who did not apply for aid were the most likely to transfer within 3 years.

      Short answer – it’s complicated. Thanks for your comment. Happy to talk to you more about the role of financial aid in student success. Perhaps we will address that in a future TIPSS!
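      The aid-group breakdown in the table above can be reproduced from student-level records. Here is a minimal sketch in Python; the records are entirely made up for illustration (the group names mirror the table rows, and none of the counts or rates are real data):

```python
from collections import defaultdict

# Hypothetical student records: (cohort_year, aid_group, completed_in_3yr).
# These values are illustrative only, not actual Lane data.
students = [
    (2008, "No FAFSA",       False),
    (2008, "No FAFSA",       True),
    (2008, "FAFSA, no Pell", True),
    (2008, "Pell year 1",    True),
    (2008, "Pell year 1",    False),
    (2009, "No FAFSA",       False),
    (2009, "FAFSA, no Pell", True),
    (2009, "Pell year 1",    False),
]

# Tally cohort size and completers for each (year, aid group) cell.
totals = defaultdict(int)
completers = defaultdict(int)
for year, group, completed in students:
    totals[(year, group)] += 1
    if completed:
        completers[(year, group)] += 1

# Print a completion rate per cell, like one cell of the table above.
for key in sorted(totals):
    rate = completers[key] / totals[key]
    print(f"{key[0]}  {key[1]:<16} {rate:.0%}")
```

      The same tally generalizes to any outcome column (transfer, retention) by swapping the boolean field.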

  3. It is very exciting to see data! Thanks!
    Question: Is there another visual format for presenting the data that might tell the visual story more clearly? Perhaps a different kind of chart, or orienting the data differently (like current at top?). This one is a real puzzle to interpret for me.

    • You are not alone in being puzzled about this graphic. We’ve had similar feedback from several sources. In a future TIPSS we will try to illustrate long term patterns of persistence, transfer, and completion using a more readily understandable graph. This one is unusual because time is illustrated from top to bottom on the Y axis. Many people assumed that each bar was representing a different group of students, when in fact this tracks one group of students at each time point over the 6 year period.
      In fall 2008 all of them were new students enrolled at Lane (100% in blue). By the next term (winter 2009), 80% were still enrolled at Lane, 1% had transferred elsewhere, and 19% were no longer enrolled at any college (in red). By spring 2014 (the last bar on the bottom), 3% of this original group of students were still enrolled at Lane, 29% had transferred, 16% had earned an award from Lane, and 52% were not enrolled anywhere. This last group in red was labelled simply as ???????, but such students are sometimes referred to as “stop outs” or “drop outs.” To be clear, that group did not transfer or earn a Lane credential at any time in the six years since starting at Lane.
      Hope this helps.
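      The mutually exclusive statuses described above can be sketched in code. This is an illustration only: the field names, the tiny example cohort, and the priority order (awards and transfers counted even after a student leaves) are assumptions for the sketch, not the actual logic behind the figure.

```python
from collections import Counter

def status(enrolled_at_lane, earned_lane_award, transferred):
    """Classify one student at one snapshot term into exactly one bucket.

    Priority order is an assumption: an award or a transfer is counted
    even if the student is no longer enrolled anywhere.
    """
    if earned_lane_award:
        return "Completed Lane award"
    if transferred:
        return "Transferred"
    if enrolled_at_lane:
        return "Enrolled at Lane"
    return "Not enrolled anywhere"

# A made-up five-student cohort at one snapshot:
# (enrolled_at_lane, earned_lane_award, transferred)
cohort = [
    (True,  False, False),
    (False, True,  False),
    (False, False, True),
    (False, False, False),
    (False, False, False),
]

counts = Counter(status(*s) for s in cohort)
n = len(cohort)
for label, c in counts.items():
    # Because every student lands in exactly one bucket,
    # these shares sum to 100% - one bar of the chart.
    print(f"{label}: {c / n:.0%}")
```

      Running this classification at each term, for the same fixed group of students, produces the term-by-term bars in the figure.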

  4. I have great difficulty giving credibility to a study that admits it has no data on over half of its target study group. As it stands, the graduation rate, for example, could be anywhere from 16% to 68%; what is shown here is basically useless as a diagnostic tool, because it is woefully deficient in collected data. Until there is better tracking, the question of whether there even IS a problem cannot be reliably answered. Without that, whatever action is taken may be a solution to a completely different problem from the real one, or a solution to a problem that doesn’t even exist. That would be utterly irresponsible.

    • I’m sorry if this presentation didn’t explain the data better. Your comments stem from the impression that the group in red (labeled as “???????”) is one on which we have “no data.” That is not the case at all. We have the same data on these students as all others. These are students who are not enrolled at Lane, have not graduated, and have not transferred. Such students are sometimes classified as “stop-outs” or “drop-outs,” but we didn’t want to apply either of those terms. Perhaps that was not the best choice.
      The actual graduation rate for the cohort of students who entered in fall 2008, after 6 years at Lane, is exactly 16%, as shown in the last row. The 52% of students shown in red did not complete any Lane credential, and there is essentially no missing information regarding the awarding of Lane credentials. Nor is there any significant source of missing information regarding Lane credit enrollments. The only potentially significant issue here is whether data on transfers is perfectly reliable, and we acknowledge that it is not. There is a small percentage of students who transfer that we do not know about: about 2% of post-secondary institutions do not share data with the National Student Clearinghouse, some students ask that their data not be shared (less than 5%), and sometimes there are errors in names and dates of birth that result in a failure to match.
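      To put rough numbers on the transfer undercount, the miss sources quoted above can be combined. A back-of-the-envelope sketch, assuming the three sources are independent and assuming a 1% name/date-of-birth match-error rate (no figure for that is given above):

```python
# Approximate miss rates for a true transfer going undetected.
p_nonreporting = 0.02   # institution not in the Clearinghouse (quoted above)
p_blocked      = 0.05   # student opted out of data sharing (upper bound quoted above)
p_match_error  = 0.01   # name/DOB mismatch (assumed for illustration, not stated)

# Probability at least one miss source applies, assuming independence.
p_missed = 1 - (1 - p_nonreporting) * (1 - p_blocked) * (1 - p_match_error)
print(f"Share of true transfers we could miss: {p_missed:.1%}")

# The figure shows a detected 6-year transfer rate of 29% for the 2008
# cohort; the true rate could be roughly detected / (1 - p_missed).
detected = 0.29
print(f"Implied true transfer rate: {detected / (1 - p_missed):.1%}")
```

      Even at these upper-bound rates, the undercount shifts the transfer figure by only a couple of percentage points, so the 52% in red cannot plausibly all be unrecorded transfers.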
