As I look at my renewed focus on applying data analytics and human creativity to the problem of community college developmental education, I see that it is, at its heart, an Action Research problem. That is, my goal is to empower and encourage advisors to apply Action Research methodologies to their advising. Accomplishing this, however, will require some research of my own to “prime the pump.”
An essential place to start is to research how effective the changes we’ve tried in the past have been. Specifically, a few years back we changed our ELA prerequisite requirements to allow instructors to classify their courses as needing an ELA prerequisite, a corequisite, or no requirement. Unfortunately, the College hasn’t studied how effective that change has been. A study here would lay the groundwork for evaluating future changes and experiments against an existing set of success metrics.
My initial research question is, “Have the implemented recommendations of the SVCC ELA Task Force improved student success and time to completion at Sauk Valley Community College?”
Here’s a brief overview of how this study could be performed and how I plan to lay the groundwork for future Action Research in this area.
Step 1: Define Success Criteria
- Will need to discuss with Institutional Research, Recruiting, and Student Services
- Starting points might be:
  - Retention rates
  - Pass rates for gateway courses (e.g. ENG 101, 103)
  - Program completion rates
- Crucial to select appropriate success criteria so they can be used to compare all changes/trials going forward
Step 2: Compare Results
- Use success criteria to compare before/after ELA change.
- Time period for comparison would be Fall 2010 – Spring 2018 semesters (2013-14 was a transition year, so this allows for 4 years before and after complete implementation)
- Data would be extracted from College’s student information system (SIS) and (if applicable) learning management system (LMS)
- Control for other possible factors
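As a rough sketch of what this before/after comparison could look like, here is a hypothetical Python/pandas example. The column names (`term_year`, `course`, `passed`) and the export format are assumptions for illustration only, not the College’s actual SIS data model:

```python
# Hypothetical sketch: compare gateway-course pass rates before and after
# the ELA prerequisite change. All column names are assumed, not real.
import pandas as pd

def pass_rates_by_period(enrollments: pd.DataFrame) -> pd.Series:
    """Return pass rates for the pre- and post-change periods.

    Limits the window to academic years 2010-2017 (four years on each
    side) and excludes the 2013-14 transition year.
    """
    df = enrollments[enrollments["course"].isin(["ENG 101", "ENG 103"])]
    df = df[df["term_year"].between(2010, 2017)]  # academic year start
    df = df[df["term_year"] != 2013]              # drop transition year
    period = df["term_year"].map(lambda y: "pre" if y < 2013 else "post")
    return df.groupby(period)["passed"].mean()

# Toy data just to show the shape of the result
data = pd.DataFrame({
    "term_year": [2011, 2012, 2014, 2015],
    "course": ["ENG 101", "ENG 103", "ENG 101", "ENG 101"],
    "passed": [1, 0, 1, 1],
})
print(pass_rates_by_period(data))  # pass rate for "pre" vs. "post"
```

A real analysis would of course join in the control variables mentioned above (demographics, placement scores, and so on) before drawing any conclusions from the raw rates.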
Step 3: Review the Literature
- Compare ELA change results with similar changes in published trials
- Search for other potential trials that may work at SVCC
Step 4: Develop/Find Other Trials
- Compare other trials (existing and future) against success criteria
- Use data from successful and unsuccessful trials to select future trials
- Repeat the process
Community colleges have long fought the perception that they are just an extension of high school, or “high school with ashtrays.” This is certainly understandable, as they devote substantial time and resources to ensuring that the quality of education is college-level. However, in another sense, easing the transition from high school to college is an important role that community colleges fill — whether for students who are not independently wealthy, not able or ready to move away, or not yet prepared for college-level work.
That third category — developmental education — is an area of the community college that is ripe for disruption. At Sauk Valley Community College, fewer than half of students are successful in developmental education courses the first time, and only half will continue as students past the first year. The current prevailing model places these courses as a barrier to be overcome before taking college-level coursework. A student — who may well have been receiving the message for years that they are “not good enough,” or “not college material” — takes a placement test, where they are told they are not good enough for college-level work. They must then enroll in and pay for classes — often multiple semesters’ worth — for which they will not receive credit. If they cannot pass the first time, the process repeats. Is it surprising, then, that the completion rates are so low?
This is an area of much interest, and the research shows very promising results. Some of these approaches—multiple measures, guided pathways, noncognitive assessments, predictive data modeling, and intrusive advising—are highlighted in the following presentation.
What method or methods will work best at Sauk? I believe the answer to that is a solid, “it depends.” Approaches that work well for, say, a large, urban college will not necessarily be the best approaches for Sauk as a small, rural campus. What works in California, Florida, or Texas may or may not work well in Illinois.
What is clear to me is that we must use data to determine what innovative approaches have worked and have not worked. When we try new approaches, a plan must be put into place to carefully analyze results. When students arrive for advising, we must be sure that we are collecting enough relevant data about the student so the advisors and faculty (as appropriate) have enough information about the student to be able to help them be successful.
I would like to implement a three-phase approach during the 2018-19 school year.
Phase 1: Noncognitive Data and Student Retention Data
This phase is already underway as part of the College’s HLC Quality Initiative portion of the accreditation process. For this initiative, the College is discussing collecting noncognitive data via the College Student Inventory™ (CSI) and developing a program for more intrusive advising for those students who we believe can be most effectively helped by more intervention.
However, for this program to be most successful, data needs to be aggregated from several different systems (e.g. student information system [SIS], learning management system [LMS], CSI, and others) and analyzed. Ideally, a system would aggregate this data in a way that could provide predictive analytics (or the ability to add them later) or apply machine learning to help us find patterns that we didn’t even know to look for. In addition, the system needs to be useful and meaningful for advisors and faculty so they have access to the data and communication tools necessary to intervene on a student’s behalf in a timely manner.
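To illustrate the kind of aggregation described above, here is a minimal, hypothetical sketch that left-joins LMS activity and CSI scores onto a SIS roster. Every field name here (`student_id`, `gpa`, `lms_logins`, `csi_risk_score`) is made up for the example; the real systems would each have their own schemas and identity-matching concerns:

```python
# Illustrative sketch only: combine student records from several systems
# into a single frame keyed on a shared student identifier.
import pandas as pd

def aggregate_student_data(sis: pd.DataFrame,
                           lms: pd.DataFrame,
                           csi: pd.DataFrame) -> pd.DataFrame:
    """Left-join LMS activity and CSI scores onto the SIS roster,
    keeping every SIS student even when other systems have no record."""
    merged = sis.merge(lms, on="student_id", how="left")
    merged = merged.merge(csi, on="student_id", how="left")
    return merged

# Toy exports from each (hypothetical) system
sis = pd.DataFrame({"student_id": [1, 2], "gpa": [3.1, 2.4]})
lms = pd.DataFrame({"student_id": [1], "lms_logins": [42]})
csi = pd.DataFrame({"student_id": [1, 2], "csi_risk_score": [0.2, 0.7]})

combined = aggregate_student_data(sis, lms, csi)
print(combined)  # one row per student; missing LMS data shows as NaN
```

A production system would also have to handle privacy constraints, record matching across systems, and ongoing refreshes, but a single keyed join surface like this is the foundation any later predictive modeling would sit on.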
We are currently reviewing vendor demonstrations, and I hope to have a solution selected or designed (if we decide to go with an in-house solution), with implementation beginning by the end of the fall 2018 semester.
Phase 2: Study Current and Past Approaches’ Effects on Student Success
Over the years, a number of solutions have been tested or implemented. However, often no data or only anecdotal data has been collected. I would like to develop a study on these approaches to see what effect they had on student success measures. These results could then be used to identify potential trials or approaches for the third phase. I would like to begin studying the following areas beginning in the summer 2018 semester with at least preliminary results available by the end of the fall 2018 semester:
- Develop a baseline metric against which to measure student acceleration
- What impact did these initiatives have on student retention, pass rates for gateway courses, or acceleration?
- Change to current ELA prerequisite/corequisite model
- Change from COMPASS to Accuplacer/ALEKS for placement testing
- Creation of first-year experience (FYE) course
- Student success coordinator and activities (success week, success coaching)
- Creation of College Study Skills (CSS) class
- Creation of math lab and usage of Pearson MyMathLab
Studying these results will (1) give us a better sense of whether those initiatives are working and worth continuing and (2) give us a better direction as we look to expand into new initiatives.
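As one example of how a before/after result from one of these initiatives could be checked against random noise, here is a simple two-proportion z-test in plain Python. The retention counts below are placeholders, not SVCC data:

```python
# Hedged sketch: a two-proportion z-test asking whether a change in
# retention rate after an initiative is likely more than chance.
import math

def two_proportion_z(success_a: int, n_a: int,
                     success_b: int, n_b: int) -> float:
    """Return the z statistic comparing two retention proportions
    (group A = before the initiative, group B = after)."""
    p_a, p_b = success_a / n_a, success_b / n_b
    p_pool = (success_a + success_b) / (n_a + n_b)  # pooled proportion
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# e.g. retention of 250/500 before vs. 290/500 after an initiative
z = two_proportion_z(250, 500, 290, 500)
print(round(z, 2))
```

A z statistic above roughly 1.96 would suggest the difference is unlikely to be chance at the 95% confidence level, though a real study would still need to control for the other factors noted earlier before crediting the initiative itself.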
Phase 3: Trial Initiatives
As mentioned above, this phase will depend heavily on results from the previous phase. However, some possible initiatives might include the following: developing a corequisite support class for a gateway math course, developing high-quality online developmental education courses, and expanding multiple measures for placement (such as adding high school GPA). I would like the trials for this phase to be developed in the spring 2019 semester for implementation in Summer 2019 or Fall 2019.
In recent months, it has become clear to me that I need to shift focus for my innovation plan, so I wanted to take a moment here to explain that shift and how it came about.
What’s Not Changing
I became interested in developmental education because it’s one of the most clear, well-documented problems affecting higher education, and community colleges in particular. However, as clearly understood a problem as it is, the solutions are anything but clear. To my mind, this is a perfect place to perform research and try experiments. Everyone is hungry for a solution and actually willing to try things (even if that means failing miserably).
After I started digging into it, I became passionate about developmental education. There is so much squandered potential there, and, most importantly, I could see myself and my kids in those students being slowly rejected by a higher education system that’s trying desperately to help them (and I believe they REALLY ARE trying their best to help).
What is Changing
I have slowly come to the realization that my plan to develop these online, self-directed bridge courses was just not going to work. The main reason it wouldn’t work is that I simply wouldn’t be able to make it happen. I’m not a teacher. I’m not developing courses. I don’t have connections into the high schools to get buy-in from that side. Another reason is that, as I started paying closer attention to these issues, I started noticing that a lot more people are working on similar things. Unlike me, they are in the trenches and able to make things happen.
As I was developing a professional development plan, though, I focused on one section of my plan—academic advising—and things started to get easier and ideas started to flow. This, once again, is the COVA model in action. Because I had a choice, I developed increased ownership in the learning process, so I was able to find my voice and my learning was more authentic.
The “Discussion” section of my literature review strongly hints at the direction I’ll be heading. I’ve decided to focus on the academic advising process, making sure that advisors have the data they need to be able to identify students who will need extra assistance but giving them the ability to apply COVA in their own area of expertise–helping students. It’s exciting and more importantly, it’s something I can actually accomplish.
My innovation plan has undergone a substantial shift particularly over the past couple months. Whereas my previous focus had been on developing a set of college preparatory courses based on maker principles, my focus has now shifted to the admissions, advising, and placement process. As I studied the subject more, I’ve discovered that some similar projects were already in play. Further, in my role as Director of Information Services, I have much more ability to directly influence these strategies—particularly as they relate to providing access to data.
Following is a presentation slide deck I developed to share with a Sauk audience outlining my vision for exploring data-driven advising at SVCC.
Particularly with my new focus on a data-driven placement and remediation model in my innovation plan, a professional learning plan is a vital part of this project’s success. Previously, I gave a brief outline of a professional learning approach to include the 5 principles of effective professional development. Now, to flesh out that outline a bit, I have developed a modified 3-Column Table for the learning plan. In addition, I have developed the framework and initial content for a hybrid online/in-person professional learning course as part of the prior approach. I will continue to flesh out and revise the course content with more citations and relevant content. Contributors will also have the ability to add additional resources, so the course will continue to grow throughout its duration as well.
In addition to the mentoring program mentioned in the outline, I have also incorporated videos from a number of different colleges that have implemented these programs, providing another level of modeling.
I believe that this approach of encouraging employees to work collaboratively to help solve the problem of ineffective developmental education, combined with providing them with resources and access to data, will give them ownership of the process and allow them to make the most of this professional learning opportunity.
If my innovation plan to provide alternatives to the current model of developmental education at Sauk Valley Community College is to be successful, there will need to be effective professional development. As I looked at effecting organizational change, I narrowed my focus to reducing barriers and providing alternatives to traditional developmental education in the admissions and advising process. Since I am not a classroom teacher and my position at the college does not directly deal with instruction, I think this will be a more effective direction to take my innovation plan.
In developing a plan for professional development, then, I will again focus on the initial phase of this project: gathering additional relevant student data to identify trends we can use to determine which students may need additional intervention and which students are likely to be successful in regular courses with additional resources, lessening the burden of developmental education courses. This outline is the beginning of a professional development plan for the admissions and academic advising areas.
- Incorporate Gulamhussein’s 5 principles of effective professional development:
  - Significant, ongoing duration – Training sessions would take place over the course of an academic year, beginning before the data collection project officially starts and continuing at least until after one full semester’s registration and advising cycle has been completed.
  - Support during implementation – In addition to the training, prompts and reminders would help advisors and admissions representatives to know what they need to collect. Good user interface and database design will ease the workload and streamline the process. As aggregate data becomes available, it would be shared with employees so they can see the result of their efforts.
  - Active initial exposure – Training sessions would not be just an instructor going through PowerPoint slides, but would contain hands-on exercises designed to simulate real-life experiences. Special attention would be paid to outlier situations to help employees think critically about what they should do in a given situation.
  - Modeling – A mentoring program would be developed at SVCC, starting in the Student Services area to allow more experienced employees to help newer employees to understand procedures and practices.
  - Content specific to area – The mentoring program will also allow colleagues to apply content specifically to their area. In addition, hands-on training sessions could be partially split up by area (academic advising, enrollment management, etc.) and partially mixed (to allow employees to see how their job contributes to the whole).
- Collaboration – Collaboration is vitally important, and it will be central to the mentoring program and in the hands-on training exercises. In addition, we should explore other means of communication such as forums, email listservs, group chats, and the like, to allow for more open sharing.
- Training leaders – Leadership would be shared among Student Services personnel (dealing with students, understanding the “why”), Institutional Research (why the data is important, how it contributes to student success), and Information Services (how to enter and share the data, information security).
- Audience – Primary audience will be academic advisors and enrollment management representatives, but should also include all student services personnel in some form.
- Instructional Design – I will develop the plan using backwards design and a 3 Column Table with a “Big, Hairy, Audacious Goal” (BHAG).
- Timeline – To adequately prepare for upcoming registration periods, trainings would need to begin in fall 2018 for spring 2019 registration. In addition to allowing time for preparatory training, implementing in the spring makes for an easier rollout, as spring registration is generally smaller than the summer/fall registration period.
- Resources – Will need Information Services to develop streamlined data entry screens, Institutional Research to develop data requirements, and Student Services leadership to make time for training sessions and provide guidance to employees.
Professional development (PD) is at somewhat of a transition point at SVCC, making this an ideal time to present an alternative vision of how PD can be done. The most effective place to begin would be Leadership Council (which consists of administrators and the faculty leaders for the different academic areas), so I put together a brief overview intended as an introduction to an open discussion time. This group tends to be pretty open and collaborative, so an introduction of the topic and a nudge in the right direction should be all that’s necessary to start a productive discussion. When presenting to other groups, however, it may be helpful to have a more guided discussion.
In describing the “what is,” I thought it was important to present our currently available PD in a positive light and to look at opportunities to improve on what’s good, rather than presenting the current state as a completely broken system.
Technically speaking, I firmly believe in the adage that “less is more” with regard to presentation graphics, so I kept the presentation simple, with some mild humor and photos at the beginning to engage the audience before giving way to a more traditional slide deck. I developed the slide deck in Google Slides, wrote a manuscript to follow (I usually end up farther off-script when giving a presentation live but am more confident when I write a manuscript first), recorded the screen capture using Snagit, and then did final edits in Adobe Premiere Pro CC.
Few things are as daunting as organizational change, both for the change agent and for the person being asked to change. The change agent can easily be overwhelmed by the immense task ahead of them, while everyone else tends to feel like the proverbial “old dog” being asked to learn new tricks. This is why it is so important for me to start with a common understanding of the why, how, and what of my innovation plan. This common baseline will help start everyone on the same page and establish a common goal for organizational change.
Building on that common understanding, then, I will have a basis to be able to explore how to motivate people to change using the Influencer model’s Six Sources of Influence. By addressing the structural, social, and personal spheres of motivation and ability, I will be more likely to meaningfully influence others in my organization to want to change. When we are personally motivated to change—that is, when we have ownership in the process—the project will be more successful.
Motivation alone, though, does not guarantee a project’s success. Perhaps more detrimental to the change process than lack of motivation is the project being choked out by the daily grind, or what the Four Disciplines of Execution calls the “whirlwind.” Having established a strategy to help motivate people, we must move on to execute that strategy through five stages of change—in spite of the whirlwind. This requires singular focus, commitment, and accountability.
Ultimately, though, the biggest impact I can have on an organization—whether I am the one in charge or at the bottom of the organizational ladder—is going to be through the individual dealings I have with others. It’s also the area over which I have the most control. Enter the concepts of self-differentiated leadership and crucial conversations.
Much like in the Influencer model, Friedman’s concept of self-differentiated leadership understands the relationship between the social and personal spheres, but refuses to blur the line and descend into groupthink. One way of doing this is what Patterson, Grenny, McMillan, and Switzler refer to as stating your path and asking for others to state theirs. They advise,
“So once you’ve shared your point of view—facts and stories alike—invite others to do the same. If your goal is to keep expanding the pool of meaning rather than to be right, to make the best decision rather than to get your way, then you’ll willingly listen to other views.” (p. 143)
Seeing myself as part of a larger whole, yet unique from it, also allows me to see others in the same way, which encourages the humility and respect necessary to successfully navigate these conversations. Treating others with this respect is a cornerstone of the Crucial Conversations methodology. Without this differentiation and respect, the techniques become mere manipulation.
Again, here, a common understanding of the “why” is important; Patterson et al. call it starting with heart. When change becomes confrontational, having established common ground will allow us to come together for the already-agreed-upon common goal. Then, with that common goal established, we can work together to maintain a safe conversational environment where fear doesn’t dominate the exchange and both sides are able to share their ideas and concerns openly yet respectfully. Once agreement has been reached, we will be able to move to action together.
These very different, yet very similar, approaches to organizational change work together beautifully to minimize resistance to change and allow for maximum impact.