See the related literature review for research background on this project. This is an update from my previous innovation plan, reflecting new research and a refined focus, with more to come.
Community colleges have long fought the perception that they are just an extension of high school, or “high school with ashtrays.” This is certainly understandable, as they devote substantial time and resources to ensuring that the quality of education is college-level. In another sense, however, easing the transition from high school to college is an important role that community colleges fill — whether for students who are not independently wealthy, not able or ready to move away, or not yet prepared for college-level work.
That third category — developmental education — is an area of the community college that is ripe for disruption. At Sauk Valley Community College, fewer than half of students pass developmental education courses on the first attempt, and only half will continue as students past the first year. The prevailing model treats these courses as a barrier to be cleared before taking college-level coursework. A student — who may well have been receiving the message for years that they are “not good enough” or “not college material” — takes a placement test, where they are told they are not good enough for college-level work. They must then enroll in and pay for classes — often multiple semesters’ worth — for which they will receive no credit. If they cannot pass the first time, the process repeats. Is it surprising, then, that completion rates are so low?
Developmental education reform has attracted a great deal of research interest, and the results are promising. Some of these approaches — multiple measures, guided pathways, noncognitive assessments, predictive data modeling, and intrusive advising — are highlighted in the following presentation.
What method or methods will work best at Sauk? I believe the answer to that is a solid, “it depends.” Approaches that work well for, say, a large, urban college will not necessarily be the best approaches for Sauk as a small, rural campus. What works in California, Florida, or Texas may or may not work well in Illinois.
What is clear to me is that we must use data to determine which innovative approaches have and have not worked. When we try new approaches, a plan must be put into place to carefully analyze results. When students arrive for advising, we must ensure we are collecting enough relevant data so that advisors and faculty (as appropriate) have the information they need to help each student succeed.
I would like to implement a three-phase approach during the 2018-19 school year.
Phase 1: Noncognitive Data and Student Retention Data
This phase is already underway as part of the College’s HLC Quality Initiative portion of the accreditation process. For this initiative, the College is discussing collecting noncognitive data via the College Student Inventory™ (CSI) and developing a program for more intrusive advising for those students who we believe can be most effectively helped by more intervention.
However, for this program to be most successful, data needs to be aggregated from several different systems (e.g., the student information system [SIS], learning management system [LMS], CSI, and others) and analyzed. Ideally, a system would aggregate this data in a way that could provide (or allow us to add later) predictive analytics or machine learning to help us find patterns we didn’t even know to look for. In addition, the system needs to be useful and meaningful for advisors and faculty, so they have access to the data and communication tools necessary to intervene on a student’s behalf in a timely manner.
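To make the aggregation idea concrete, here is a minimal sketch of merging per-student records from three systems into one view an advisor could act on. All field names, record shapes, and flag thresholds below are illustrative assumptions, not the actual schemas of our SIS, LMS, or the CSI.

```python
def aggregate_student_data(sis, lms, csi):
    """Merge records keyed by student ID into one dict per student.

    Each source is a dict of {student_id: {field: value}}; fields are
    prefixed with their source name so nothing collides.
    """
    merged = {}
    for source_name, source in (("sis", sis), ("lms", lms), ("csi", csi)):
        for student_id, record in source.items():
            merged.setdefault(student_id, {"id": student_id})
            for field, value in record.items():
                merged[student_id][f"{source_name}_{field}"] = value
    return merged


def flag_for_outreach(student, min_logins=5, min_receptivity=3):
    """Flag students whose combined data suggests early intervention:
    low LMS engagement plus a CSI score indicating openness to help."""
    low_engagement = student.get("lms_logins_last_week", 0) < min_logins
    receptive = student.get("csi_receptivity_to_help", 0) >= min_receptivity
    return low_engagement and receptive


# Invented sample records for one student:
sis = {"A1": {"credit_hours": 12, "placement": "developmental math"}}
lms = {"A1": {"logins_last_week": 2}}
csi = {"A1": {"receptivity_to_help": 4}}

students = aggregate_student_data(sis, lms, csi)
print(flag_for_outreach(students["A1"]))  # low logins + open to help
```

A real system would pull these records over each product’s export or API rather than from literals, but the shape of the problem — join on student ID, then apply intervention rules — stays the same.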
We are currently reviewing vendor demonstrations, and I hope to have a solution selected or designed (if we decide to go with an in-house solution) and beginning implementation by the end of the fall 2018 semester.
Phase 2: Study Current and Past Approaches’ Effects on Student Success
Over the years, a number of solutions have been tested or implemented. Often, however, no data — or only anecdotal data — has been collected. I would like to develop a study of these approaches to see what effect they have had on student success measures; the results could then be used to identify potential trials or approaches for the third phase. Starting in the summer 2018 semester, with at least preliminary results available by the end of the fall 2018 semester, I would like to:
- Develop a baseline metric against which to measure student acceleration
- Determine what impact past initiatives have had on student retention, pass rates for gateway courses, or acceleration
- Change to current ELA prerequisite/corequisite model
- Change from COMPASS to Accuplacer/ALEKS for placement testing
- Creation of first-year experience (FYE) course
- Student success coordinator and activities (success week, success coaching)
- Creation of College Study Skills (CSS) class
- Creation of math lab and usage of Pearson MyMathLab
Studying these results will (1) give us a better sense of whether those initiatives are working and worth continuing and (2) give us better direction as we look to expand into new initiatives.
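The baseline comparison described above can be sketched simply: compute a first-attempt pass rate for cohorts before an initiative, pool them into a baseline, and compare the post-initiative cohort against it. The cohort counts below are invented for illustration only; they are not Sauk data.

```python
def pass_rate(passed, attempted):
    """First-attempt pass rate as a fraction of students who attempted."""
    return passed / attempted if attempted else 0.0


# cohort label -> (passed on first attempt, attempted); numbers invented
cohorts = {
    "2015 (pre-initiative)": (88, 200),
    "2016 (pre-initiative)": (92, 205),
    "2017 (post-initiative)": (118, 210),
}

# Pool the pre-initiative cohorts into a single baseline rate.
baseline = pass_rate(88 + 92, 200 + 205)
post = pass_rate(118, 210)

print(f"baseline: {baseline:.1%}, post: {post:.1%}, change: {post - baseline:+.1%}")
```

A real study would add a significance test and control for cohort differences (placement levels, course mix), but even this simple before/after rate is more than we have collected for most past initiatives.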
Phase 3: Trial Initiatives
As mentioned above, this phase will depend heavily on results from the previous phase. However, some possible initiatives include developing a corequisite support class for a gateway math course, developing high-quality online developmental education courses, and expanding multiple measures for placement to include factors such as high school GPA. I would like the trials for this phase to be developed in the spring 2019 semester for implementation in summer or fall 2019.
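As one example of what a multiple-measures trial could look like, here is a sketch of a placement rule in which a student places into the college-level course if either the placement test score or the high school GPA clears a cutoff. Both cutoffs are invented for illustration and are not Sauk policy or Accuplacer guidance.

```python
# Illustrative cutoffs only; a real trial would set these from Phase 2 data.
TEST_CUTOFF = 250   # a hypothetical placement-test scaled score
GPA_CUTOFF = 3.0    # high school GPA on a 4.0 scale


def place(test_score=None, hs_gpa=None):
    """Return 'college-level' if any available measure clears its cutoff,
    otherwise 'developmental'. Missing measures are simply skipped."""
    if test_score is not None and test_score >= TEST_CUTOFF:
        return "college-level"
    if hs_gpa is not None and hs_gpa >= GPA_CUTOFF:
        return "college-level"
    return "developmental"


print(place(test_score=240, hs_gpa=3.4))  # GPA clears the bar
print(place(test_score=240))              # test alone falls short
```

The appeal of a rule like this is that it only moves students up: no one who would have placed into college-level work by test score is moved down, so the trial’s effect can be measured on exactly the students the extra measure promoted.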