As I look at my renewed focus on applying data analytics and human creativity to the problem of community college developmental education, I see that it really is an Action Research problem. That is, my goal is to empower and encourage advisors to apply Action Research methodologies to their advising. Accomplishing this, however, will require some research of my own to “prime the pump.”
An essential place to start is to research how effective the changes we’ve tried in the past have been. Specifically, a few years back we changed our ELA prerequisite requirements to allow instructors to classify their courses as needing an ELA prerequisite, a corequisite, or no requirement. Unfortunately, though, the College hasn’t studied how effective that change has been. A study here would lay the groundwork for evaluating future changes and experiments against an existing set of success metrics.
Here’s a brief overview of how this study could be performed and how I plan to lay the groundwork for future Action Research in this area.
Step 1: Define Success Criteria
Will need to discuss with Institutional Research, Recruiting, and Student Services
Starting points might be
Pass rates for gateway courses (e.g. ENG 101, 103)
Program completion rates
Crucial to select appropriate success criteria so they can be used to compare all changes/trials going forward
Step 2: Compare Results
Use success criteria to compare before/after ELA change.
Control for other possible factors
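As a sketch of what the Step 2 comparison might look like, here is a simple two-proportion z-test on gateway-course pass rates before and after the ELA change. The cohort numbers are made up for illustration, and a real analysis would also need to control for other factors (cohort makeup, concurrent placement changes), likely with a regression model built with Institutional Research:

```python
from math import sqrt, erf

def two_proportion_z(passes_a, n_a, passes_b, n_b):
    """Two-sided two-proportion z-test: did the pass rate change?"""
    p_a, p_b = passes_a / n_a, passes_b / n_b
    p_pool = (passes_a + passes_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return p_a, p_b, z, p_value

# Hypothetical ENG 101 cohorts: before vs. after the ELA change
p_before, p_after, z, p = two_proportion_z(180, 300, 210, 310)
```

A low p-value here would suggest the change in pass rates is unlikely to be noise, but it says nothing about cause; that is why controlling for other factors matters.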
Step 3: Review the Literature
Compare ELA change results with similar changes in published trials
Search for other potential trials that may work at SVCC
Step 4: Develop/Find Other Trials
Compare other trials (existing and future) against success criteria
Use data from successful and unsuccessful trials to select future trials
Community colleges have long fought the perception that they are just an extension of high school, or “high school with ashtrays.” This is certainly understandable, as they devote a lot of time and resources to striving to ensure that the quality of education is college-level. However, in another sense, assisting the transition from high school to college is an important role that community colleges fill — whether for those students who are not independently wealthy, not able or ready to move away, or who are not yet prepared for college-level work.
That third category — developmental education — is an area of the community college that is ripe for disruption. At Sauk Valley Community College, fewer than half of students are successful in developmental education courses the first time, and only half will continue as students past the first year. The current prevailing model places these courses as a barrier to be overcome before taking college-level coursework. A student — who may well have been receiving the message for years that they are “not good enough,” or “not college material” — takes a placement test, where they are told they are not good enough for college-level work. They must then enroll in and pay for classes — often multiple semesters’ worth — for which they will not receive credit. If they cannot pass the first time, the process repeats. Is it surprising, then, that the completion rates are so low?
This is an area of much interest and the research shows very promising results. Some of these approaches—multiple measures, guided pathways, noncognitive assessments, predictive data modeling, and intrusive advising—are highlighted in the following presentation.
What method or methods will work best at Sauk? I believe the answer to that is a solid, “it depends.” Approaches that work well for, say, a large, urban college will not necessarily be the best approaches for Sauk as a small, rural campus. What works in California, Florida, or Texas may or may not work well in Illinois.
What is clear to me is that we must use data to determine which innovative approaches have and have not worked. When we try new approaches, a plan must be put into place to carefully analyze results. And when students arrive for advising, we must be sure that we are collecting enough relevant data so that advisors and faculty (as appropriate) have the information they need to help each student be successful.
I would like to implement a three-phase approach during the 2018-19 school year.
Phase 1: Noncognitive Data and Student Retention Data
This phase is already underway as part of the College’s HLC Quality Initiative portion of the accreditation process. For this initiative, the College is discussing collecting noncognitive data via the College Student Inventory™ (CSI) and developing a program for more intrusive advising for those students who we believe can be most effectively helped by more intervention.
However, for this program to be most successful, data needs to be aggregated from several different systems (e.g. student information system [SIS], learning management system [LMS], CSI, and others) and analyzed. Ideally, a system would aggregate this data in a way that could provide predictive analytics or machine learning (or the ability to add them later) to help us find patterns that we didn’t even know to look for. In addition, the system needs to be useful and meaningful for advisors and faculty so they have access to the data and communication tools necessary to intervene on a student’s behalf in a timely manner.
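To illustrate the kind of aggregation I have in mind, here is a minimal sketch using pandas to join hypothetical extracts from the SIS, LMS, and CSI on a shared student ID. All of the column names and values are assumptions for illustration, not our actual schemas:

```python
import pandas as pd

# Hypothetical extracts from each system, keyed on a shared student ID
sis = pd.DataFrame({"student_id": [1, 2, 3],
                    "hs_gpa": [3.1, 2.4, 3.8],
                    "placement": ["coreq", "dev", "none"]})
lms = pd.DataFrame({"student_id": [1, 2],
                    "logins_per_week": [5, 1]})
csi = pd.DataFrame({"student_id": [1, 3],
                    "dropout_proneness": [0.2, 0.7]})

# Left-join everything onto the SIS roster so every student appears,
# even when another system has no record for them yet
merged = (sis.merge(lms, on="student_id", how="left")
             .merge(csi, on="student_id", how="left"))
```

The left join matters: a student with no LMS activity or CSI result still shows up in the merged view (with missing values), which is exactly the student an advisor may most need to see.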
We are currently reviewing demonstrations from vendors, and I hope to have a solution selected or designed (if we decide to go with an in-house solution), with implementation underway by the end of the fall 2018 semester.
Phase 2: Study Current and Past Approaches’ Effects on Student Success
Over the years, a number of solutions have been tested or implemented. However, often no data, or only anecdotal data, has been collected. I would like to develop a study on these approaches to see what effect they had on student success measures. These results could then be used to identify potential trials or approaches for the third phase. I would like to begin studying the following areas in the summer 2018 semester, with at least preliminary results available by the end of the fall 2018 semester:
Develop a baseline metric against which to measure student acceleration
What impact did these initiatives have on student retention, pass rates for gateway courses, or acceleration?
Change to current ELA prerequisite/corequisite model
Change from COMPASS to Accuplacer/ALEKS for placement testing
Creation of first-year experience (FYE) course
Student success coordinator and activities (success week, success coaching)
Creation of College Study Skills (CSS) class
Creation of math lab and usage of Pearson MyMathLab
Studying these results will (1) give us a better sense of whether those initiatives are working and worth continuing and (2) will give us a better direction as we look to expand into new initiatives.
Phase 3: Trial Initiatives
As mentioned above, this phase will depend heavily on results from the previous phase. However, some possible example initiatives might include the following: develop a corequisite support class for a gateway math course, develop high-quality online developmental education courses, and expand multiple measures for placement to include other measures (such as high school GPA). I would like the trials for this phase to be developed in the spring 2019 semester for implementation in summer 2019 or fall 2019.
My innovation plan has undergone a substantial shift particularly over the past couple months. Whereas my previous focus had been on developing a set of college preparatory courses based on maker principles, my focus has now shifted to the admissions, advising, and placement process. As I studied the subject more, I’ve discovered that some similar projects were already in play. Further, in my role as Director of Information Services, I have much more ability to directly influence these strategies—particularly as they relate to providing access to data.
Following is a presentation slide deck I developed to share with a Sauk audience outlining my vision for exploring data-driven advising at SVCC.
Particularly with my new focus on a data-driven placement and remediation model in my innovation plan, a professional learning plan is a vital part of this project’s success. Previously, I gave a brief outline of a professional learning approach to include the 5 principles of effective professional development. Now, to flesh out that outline a bit, I have developed a modified 3-Column Table for the learning plan. In addition, I have developed the framework and initial content for a hybrid online/in-person professional learning course as part of the prior approach. I will continue to flesh out and revise the course content with more citations and relevant content. Contributors will also have the ability to add additional resources, so the course will continue to grow throughout its duration as well.
In addition to the mentoring program mentioned in the outline, I have included videos from a number of different colleges that have implemented these programs, providing another level of modeling.
I believe that this approach of encouraging employees to work collaboratively to help solve the problem of ineffective developmental education, combined with providing them with resources and access to data, will give them ownership of the process and allow them to make the most of this professional learning opportunity.
If my innovation plan to provide alternatives to the current model of developmental education at Sauk Valley Community College is to be successful, there will need to be effective professional development. As I looked at effecting organizational change, I narrowed my focus to reducing barriers and providing alternatives to traditional developmental education in the admissions and advising process. Since I am not a classroom teacher and my position at the college does not directly deal with instruction, I think this will be a more effective direction to take my innovation plan.
In developing a plan for professional development, then, I will again focus on the initial phase of this project: gathering additional relevant student data to identify trends. Those trends can help us determine which students may need additional intervention and which students are likely to be successful in regular courses with additional resources, lessening the burden of developmental education courses. This outline is the beginning of a professional development plan for the admissions and academic advising areas.
Incorporate Gulamhussein’s 5 principles of effective professional development.
Significant, ongoing duration – Training sessions would take place over the course of an academic year, beginning before the data collection project officially starts and continuing at least until after one full semester’s registration and advising cycle has been completed.
Support during implementation – In addition to the training, prompts and reminders would help advisors and admissions representatives to know what they need to collect. Good user interface and database design will ease the workload and streamline the process. As aggregate data becomes available, it would be shared with employees so they can see the result of their efforts.
Active initial exposure – Training sessions would not be just an instructor going through PowerPoint slides, but would contain hands-on exercises designed to simulate real-life experiences. Special attention would be paid to outlier situations to help employees think critically about what they should do in a given situation.
Modeling – A mentoring program would be developed at SVCC, starting in the Student Services area to allow more experienced employees to help newer employees to understand procedures and practices.
Content specific to area – The mentoring program will also allow colleagues to apply content specifically to their area. In addition, hands-on training sessions could be partially split up by area (academic advising, enrollment management, etc.) and partially mixed (to allow employees to see how their job contributes to the whole).
Collaboration – Collaboration is vitally important, and it will be central to the mentoring program and in the hands-on training exercises. In addition, we should explore other means of communication such as forums, email listservs, group chats, and the like, to allow for more open sharing.
Training leaders – Leadership would be shared among Student Services personnel (dealing with students, understanding the “why”), Institutional Research (why the data is important, how it contributes to student success), and Information Services (how to enter and share the data, information security).
Audience – Primary audience will be academic advisors and enrollment management representatives, but should also include all student services personnel in some form.
Instructional Design – I will develop the plan using backwards design and a 3 Column Table with a “Big, Hairy, Audacious Goal” (BHAG).
Timeline – To adequately prepare for upcoming registration periods, trainings would need to begin in the Fall 2018 for Spring 2019 registration. In addition to giving time for preparatory training, implementation for the spring semester makes for an easier implementation as spring registration is generally smaller than the summer/fall registration period.
Resources – Will need Information Services to develop streamlined data entry screens, Institutional Research to develop data requirements, and Student Services leadership to make time for training sessions and provide guidance to employees.
Professional development (PD) is at somewhat of a transition point at SVCC, making this an ideal time to present an alternative vision of how PD can be done. The most effective place to begin would be Leadership Council (which consists of administrators and the faculty leaders for different academic areas), so I put together a brief overview intended as an introduction to an open discussion time. This group tends to be pretty open and collaborative, so an introduction of the topic and a nudge in the right direction should be all that’s necessary to start a productive discussion. However, when presenting to different groups, it may be helpful to have a more guided discussion.
In describing the “what is,” I thought it was important to describe it in a way that presented our currently available PD in a positive light and look at the opportunities for improving on what’s good rather than presenting the current state as a completely broken system.
Technically speaking, I firmly believe in the adage that “less is more” with regard to presentation graphics, so I kept the presentation simple, with some mild humor and photos at the beginning to engage the audience, then giving way to a more traditional slide deck. I developed the slide deck in Google Slides, wrote a manuscript to follow (I usually end up farther off-script when giving a presentation live but am more confident when I manuscript it first), recorded the screen capture using Snagit, and then did final edits in Adobe Premiere Pro CC.
Few things are as daunting as organizational change, both for the change agent and the person being asked to change. The change agent can easily be overwhelmed by the immense task ahead of them, while everyone else tends to feel like the proverbial “old dog” being asked to learn new tricks. This is why it is so important for me to start with a common understanding of why, how, and what for my innovation plan. This common baseline will help to start everyone on the same page and establish a common goal for organizational change.
Building on that common understanding, then, I will have a basis to be able to explore how to motivate people to change using the Influencer model’s Six Sources of Influence. By addressing the structural, social, and personal spheres of motivation and ability, I will be more likely to meaningfully influence others in my organization to want to change. When we are personally motivated to change—that is, when we have ownership in the process—the project will be more successful.
Motivation alone, though, does not guarantee a project’s success. Perhaps more detrimental to the change process than lack of motivation is the project being choked out by the daily grind, or what the Four Disciplines of Execution calls the “whirlwind.” Having established a strategy to help motivate people, we must move on to execute that strategy through five stages of change—in spite of the whirlwind. This requires singular focus, commitment, and accountability.
Ultimately, though, the biggest impact I can have on an organization—whether I am the one in charge or whether I’m at the bottom of the organizational ladder—is going to be through the individual dealings I have with others. It’s also the area over which I have the most control. Enter the concepts of self-differentiated leadership and crucial conversations.
Much like in the Influencer model, Friedman’s concept of self-differentiated leadership understands the relationship between the social and personal spheres, but refuses to blur the line and descend into groupthink. One way of doing this is what Patterson, Grenny, McMillan, and Switzler refer to as stating your path and asking for others to state theirs. They advise,
“So once you’ve shared your point of view—facts and stories alike—invite others to do the same. If your goal is to keep expanding the pool of meaning rather than to be right, to make the best decision rather than to get your way, then you’ll willingly listen to other views.” (p. 143)
Seeing myself as part of a larger whole, yet unique from it, also allows me to see others in the same way, which encourages the humility and respect necessary to successfully navigate these conversations. Treating others with this respect is a cornerstone of the Crucial Conversations methodology. Without this differentiation and respect, the techniques become mere manipulation.
Again, here, a common understanding of the “why” is important; Patterson et al. call it starting with heart. When change becomes confrontational, having established a common starting ground will allow us to come together for the already-agreed-upon common goal. Then, with that common goal established, we can work together to maintain a safe conversational environment where fear doesn’t dominate the exchange and both sides are able to openly yet respectfully share their ideas and concerns. Once agreement has been reached, then, we will be able to move to action together.
These very different, yet very similar, approaches to organizational change work together beautifully to minimize resistance to change and allow for maximum impact.
While the Influencer model–in particular the Six Sources of Influence (6SI)–deals with motivating and removing barriers to change, the Four Disciplines of Execution (4DX) model deals with the mechanics of bringing about organizational change. The goal of 6SI is to help an individual want to change, to be personally motivated and enabled to change, and support that with social and structural forces. The 4DX model largely operates in the structural sphere and employs a more “top-down” approach. Both models are helpful and appropriate for different situations, though 6SI is more broad and can be used by anyone to help encourage change.
The two models do have a lot of similarities and overlap as well. While using different terms, both models discuss the importance of a primary, measurable goal which can be influenced by smaller, measurable actions. While not identical, vital behaviors are very similar to lead measures, and the measurable result is similar to the lag measure. Both models emphasize the need for accountability and a small, focused number of goals. Both prioritize regular, quick feedback.
Success in 4DX–especially in the early days–depends largely on how clearly the objectives are developed and stated. Our Wildly Important Goal (WIG) will be to decrease the number of incoming students needing to take developmental education courses. However admirable that goal may be, as stated it is very difficult to gauge progress or success toward it. Stating the WIG in the “From X to Y by WHEN” format, then, will give us an actionable WIG and lag measure.
WIG: Reduce the percentage of incoming students taking developmental education courses from 50% to 45% by the beginning of the Fall 2019 semester.
Lag Measure: Percentage of incoming students taking developmental education courses to total incoming students.
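Stated in the “From X to Y by WHEN” format, the lag measure becomes trivially computable each semester. A minimal sketch, with hypothetical enrollment numbers standing in for real census data:

```python
# WIG: reduce the share of incoming students placed into developmental
# courses from 50% to 45% by the beginning of the fall 2019 semester
WIG_START, WIG_TARGET = 0.50, 0.45

def lag_measure(dev_ed_students: int, incoming_students: int) -> float:
    """Lag measure: developmental-ed placements as a share of all incoming students."""
    return dev_ed_students / incoming_students

# Hypothetical mid-project check-in (the counts are made up for illustration)
fall_2018 = lag_measure(dev_ed_students=235, incoming_students=500)
halfway = (WIG_START + WIG_TARGET) / 2   # the value we would hope to hit mid-project
on_track = fall_2018 <= halfway
```

Even a rough midpoint check like this gives the WIG meetings something concrete to react to, instead of a vague sense of progress.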
A few lead measures that would have a direct effect on the lag measure are the following:
Recruit Sauk-bound students for the college prep courses
Improve courses with more authentic learning experiences
Collect full data sets for incoming students to identify other, earlier methods of remediation
A dashboard interface would be developed in the College’s reporting platform to show the following:
a graph for the lag measure should show at least 3 years of past data so a trend can be observed (since data for the lag measure can only be collected once a semester),
a weekly graph showing the enrollment in the college prep courses,
a weekly graph showing the level of engagement in the courses (activities performed, interactions recorded, etc.),
a weekly graph showing the count of students enrolled with/without full data sets collected, and
a series of graphs showing the data collected from incoming students.
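The weekly with/without full-data-set counts could be computed with nothing more than a counter keyed on week and completeness status. The records below are hypothetical; in practice they would come from the aggregated student data:

```python
from collections import Counter
from datetime import date

# Hypothetical intake records: (week the student enrolled, full data set collected?)
records = [
    (date(2018, 9, 3), True),
    (date(2018, 9, 3), False),
    (date(2018, 9, 10), True),
    (date(2018, 9, 10), True),
]

# Counts feeding the weekly with/without full-data-set graph
weekly = Counter(records)
for (week, complete), count in sorted(weekly.items()):
    status = "full" if complete else "incomplete"
    print(f"week of {week}: {count} {status}")
```

The reporting platform would render these counts as the weekly bar graph; the point is that the underlying query is simple once the data is aggregated in one place.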
A task force composed of admissions representatives, academic advising representatives, and instructors would be assembled to begin weekly WIG meetings. Kickoff meetings could be scheduled for the beginning-of-semester kickoff day or mid-semester workshop day to demonstrate that this has support from the administration.
As adoption begins, care will need to be taken to celebrate achievements and keep on pace. Accountability is crucial, here, to avoid losing momentum and help to overcome resistance. Focusing on helping the “potentials” improve (instead of focusing on the “resisters,” which may be the tendency) will help keep spirits high and have more of an impact on the final goal.
Particularly with the course quality and engagement lead measure, it will be important to look at ways to improve as the project moves into the optimization phase. As the data is analyzed, too, the plans can be refined and changed.
With a project like this, it is hard to imagine ever getting to the point where the project is “complete”–there will always be room for improvement–but seeing higher success rates for students and a new normal where more incoming students aren’t blocked by developmental courses will pave the way for the next WIG.
If my innovation plan is to be successful, it will need to be supported by a strategy to influence Sauk stakeholders to take part in the process. For this post, I am focusing on just one facet of the plan–collecting data as part of the advising process.
Currently, the developmental education process at Sauk is pretty cut and dried. Depending on their placement test scores, incoming students are placed in the appropriate course. Incorporating other methods of placement such as the college preparation course I am proposing, though, introduces complexity to the process and so it also increases the chance for error. In addition, judging the effectiveness of the process will depend on collecting and analyzing as much data as possible.
Grenny, Patterson, Maxfield, McMillan, and Switzler’s model in Influencer calls for finding a “vital behavior” to change and then applying six sources of influence. To achieve the result of having enough data to analyze the effectiveness of the plan, it is imperative that academic advisors collect as much relevant data as possible. While specific relevant measures still need to be defined, some examples of relevant data would include the following: high school GPA and specific course grades, ACT/SAT scores, at-risk markers, and participation indicators from the college preparatory course (if applicable).
Therefore, for this influence plan, the vital behavior is to ensure that a complete data set is collected for at least 80% of incoming students. It is not reasonable to expect 100% collection as data may not be available for all students and some students may not be willing to provide all data points.
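The 80% threshold is easy to monitor once “complete data set” is pinned down to a concrete list of required fields. A minimal sketch, assuming a hypothetical field list (the real one would come from Institutional Research):

```python
# Hypothetical required fields; the actual list would be defined with
# Institutional Research before launch
REQUIRED = ("hs_gpa", "act_sat", "at_risk_markers")

def is_complete(record: dict) -> bool:
    """A record is complete when every required field has been collected."""
    return all(record.get(field) is not None for field in REQUIRED)

def collection_rate(records: list) -> float:
    """Share of incoming students with a complete data set (target: >= 0.80)."""
    return sum(map(is_complete, records)) / len(records)

# Two hypothetical incoming students: one complete record, one missing a GPA
students = [
    {"hs_gpa": 3.2, "act_sat": 21, "at_risk_markers": []},
    {"hs_gpa": None, "act_sat": 19, "at_risk_markers": ["first_gen"]},
]
rate = collection_rate(students)
```

Reporting this rate back to advisors weekly would also double as one of the structural supports discussed below.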
The Six Sources of Influence
To be successful, I will need to engage all of the six sources of influence from Grenny, et al. While the matrix in Influencer is useful, I prefer to think of the model as concentric circles as it shows the difficulty of penetrating all the way to the Personal level as well as the relationship between the Structural, Social, and Personal levels.
Changes are easiest to make at the structural level, as they are what McChesney, Covey, and Huling call a “stroke-of-the-pen strategy,” a change that can just be made by saying it needs to be done. Structural changes will most directly affect the social level, which will in turn affect the individuals at the personal level.
Structural, or external, motivation could be accomplished by providing printed or digital materials (posters, computer wallpapers, etc.) that remind advisors to ask for all the information, not just the minimum necessary to get the student’s immediate needs taken care of.
In addition, some silly rewards such as “most math scores this month” or “collected 500 high school GPAs” could be given at monthly staff meetings. Besides turning the data collection into a game, these rewards will help to encourage healthy peer pressure among advising staff. Between the healthy competition among advisors and the effect of seeing that other advisors are collecting the information, the social motivation will provide powerful encouragement for advisors to remember to collect the information.
Healthy peer pressure will help advisors to be personally motivated, but much more can and should be done to affect the personal motivation realm. One key way to do this will be to clearly contextualize the data by repeatedly discussing the overall goal of the project and sharing data as it becomes available. This will help advisors to see the results of their work and how it is helping students succeed.
To help advisors’ ability to collect the necessary information, the most important structural accommodation will be to make sure the database and collection forms are user-friendly and easily accessible.
Good database design will also enable multiple advisors (and other personnel) to collect and enter information, decreasing the load on each individual advisor.
Proper training, of course, is paramount to the success of any program like this. In addition to contextualizing the need for the data collection, training sessions can also equip advisors with clear descriptions of what data need to be collected and responses to common objections students may provide. A mnemonic device to help advisors remember the pieces of information that need to be collected could also be helpful.
I believe that, with these measures in place, 80% data collection is an achievable result and will contribute greatly to the success of the overall project.