Action Research Plan

Topic/Purpose

As I look at my renewed focus on applying data analytics and human creativity to the problem of community college developmental education, it really is an Action Research problem. That is, my goal is to empower and encourage advisors to apply Action Research methodologies to their advising. Accomplishing this, however, will require some research of my own to “prime the pump.”

An essential place to start is to research how effective the changes we’ve tried in the past have been. Specifically, we made a change a few years back to our ELA prerequisite requirements to allow instructors to classify their courses as needing an ELA prerequisite, corequisite, or no requirements. Unfortunately though, the College hasn’t studied how effective it’s been. A study on this would lay the groundwork to further study changes and experiments against an existing set of success metrics.

My initial research question is,

“Have the implemented recommendations of the SVCC ELA Task Force improved student success and time to completion at Sauk Valley Community College?”

Design/Methods/Measurement

Determining how to measure student success is a crucial step in this process. A review of the literature provides an excellent starting point and benchmark. Based on my literature review, I believe that the following measures will be important to examine and compare:

  • Gateway English course (ENG 101) pass rates (defined as “C” or above)
  • Pass rates for the next English course (ENG 103)
  • Pass rates for courses with ELA corequisite or no prerequisite requirements
  • 2-year and 4-year graduation rates

In addition, Sauk looks at other metrics to gauge student success, such as fall-to-fall retention, fall-to-spring retention, and persistence. While not all of these measures will necessarily be significant, I intend to include them in my study for completeness and comparison. Finally, I would like to introduce another measure, “acceleration,” to capture the rate at which students move through their developmental sequence.

The time period for the comparison study would be the Fall 2010 – Spring 2018 semesters, since the 2013-14 school year was a transition year, with full implementation in Fall 2014. This allows ample time to compare the periods before and after the change. Some measures (for example, 4-year graduation rates) may warrant a longer study period, but this will be enough data to determine whether the changes have been successful.
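To make the before/after comparison concrete, here is a minimal sketch of how the gateway pass-rate measure might be computed from anonymized course records, excluding the 2013-14 transition year. The record layout, term labels, and sample data are all hypothetical, not the actual SIS export format:

```python
# Sketch: ENG 101 pass rates ("C" or above) before and after the
# Fall 2014 implementation. All names and sample data are hypothetical.

PASSING_GRADES = {"A", "B", "C"}

def period(term):
    """Classify a term label like 'Fall 2012' as pre, transition, or post."""
    season, year = term.split()[0], int(term.split()[1])
    if year <= 2012 or (year == 2013 and season == "Spring"):
        return "pre"         # Fall 2010 - Spring 2013
    if (year == 2013 and season == "Fall") or (year == 2014 and season == "Spring"):
        return "transition"  # 2013-14 transition year, excluded from comparison
    return "post"            # Fall 2014 - Spring 2018

def pass_rates(records, course="ENG 101"):
    """records: iterable of (anon_id, course, term, grade) tuples."""
    counts = {"pre": [0, 0], "post": [0, 0]}  # period -> [passed, attempted]
    for _anon_id, rec_course, term, grade in records:
        p = period(term)
        if rec_course != course or p == "transition":
            continue
        counts[p][1] += 1
        if grade in PASSING_GRADES:
            counts[p][0] += 1
    return {p: (passed / attempted if attempted else 0.0)
            for p, (passed, attempted) in counts.items()}

sample = [
    ("s1", "ENG 101", "Fall 2011", "B"),
    ("s2", "ENG 101", "Spring 2012", "D"),
    ("s3", "ENG 101", "Fall 2013", "A"),   # transition year, excluded
    ("s4", "ENG 101", "Fall 2015", "C"),
    ("s5", "ENG 101", "Spring 2016", "A"),
    ("s6", "ENG 101", "Fall 2017", "F"),
]
rates = pass_rates(sample)  # pre: 1 of 2 passed; post: 2 of 3 passed
```

The same structure would extend to the other measures (ENG 103 pass rates, corequisite course pass rates) by swapping the course filter and the success definition.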

Data would be collected from the College’s student information system (SIS) and (if applicable) learning management system (LMS) using anonymized exports of student records. This will make it possible to track individual students’ progress through multiple courses and through their programs of study. Additional comparison and benchmark data may be collected from IPEDS and other publicly available sources.
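One way to produce anonymized exports that still allow tracking a student across the SIS and LMS is to replace the student ID with a keyed hash, so the same student maps to the same token in every extract while the real ID never leaves the source systems. A minimal sketch, where the key and ID format are placeholders:

```python
import hashlib
import hmac

def anonymize_id(student_id: str, secret_key: bytes) -> str:
    """Replace a student ID with a stable, non-reversible token.

    Using HMAC rather than a plain hash means that someone without the
    key cannot re-identify students by hashing known ID numbers.
    """
    digest = hmac.new(secret_key, student_id.encode("utf-8"), hashlib.sha256)
    return digest.hexdigest()[:16]  # a shortened token is enough for joining

KEY = b"replace-with-a-real-secret"  # placeholder; never shipped with the export

# The same student yields the same token in both the SIS and LMS extracts,
# so records can be joined without exposing the underlying ID.
sis_token = anonymize_id("A0012345", KEY)
lms_token = anonymize_id("A0012345", KEY)
```

Because the tokens depend on the key, the key itself should stay with whoever produces the extracts (for example, Institutional Research), not travel with the data.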

After obtaining initial results, I will need to examine additional demographic and academic data to control for other factors that may explain part or all of the results and to determine future areas of study and trials. This will likely consume the majority of the study time, and while it is difficult to predict all the factors I will need to examine, I would need to examine the following at minimum: age, gender, high school, and (to the extent possible) socioeconomic factors.
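A simple first pass at controlling for a factor such as age is stratification: compute the pass rates within each subgroup and check whether the pre/post gap persists inside the strata. The sketch below uses hypothetical subgroup labels; a fuller analysis would likely move to a regression model:

```python
from collections import defaultdict

def stratified_pass_rates(records, passing=("A", "B", "C")):
    """records: iterable of (period, subgroup, grade) tuples.

    Returns {(period, subgroup): pass_rate}, so the pre/post difference
    can be inspected within each subgroup (e.g. age band) separately.
    """
    counts = defaultdict(lambda: [0, 0])  # (period, subgroup) -> [passed, attempted]
    for period, subgroup, grade in records:
        counts[(period, subgroup)][1] += 1
        if grade in passing:
            counts[(period, subgroup)][0] += 1
    return {key: passed / attempted for key, (passed, attempted) in counts.items()}

# Hypothetical records: if the post-change improvement shows up within
# each age band, age alone is unlikely to explain the overall difference.
sample = [
    ("pre", "under-25", "C"), ("pre", "under-25", "F"),
    ("pre", "25-plus", "B"),
    ("post", "under-25", "A"), ("post", "under-25", "B"),
    ("post", "25-plus", "D"),
]
rates = stratified_pass_rates(sample)
```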

Timeline

July 2018 – Approval.
The first step is to obtain approval from the SVCC Institutional Review Board to conduct human subject research. I have filed an application and expect to receive a response by mid-July.
July-August 2018 – Initial Success Results.
After I have received approval, I will pull initial student data and begin sorting and analyzing to obtain initial results.
September – November 2018 – Follow the Data.
Once I have the initial success data compiled, I will examine additional demographic and academic data to control for other factors and to identify future areas of study.
November – December 2018 – Write Paper.
With the data compiled and analyzed, I will write a formal research paper. The paper can then be shared with relevant faculty, administration, and the College’s developmental education committee for feedback. Once the paper is finalized, I will begin to submit it for publication.
January 2019 – Share Results.
Results would be shared with faculty and staff at the College Spring Kickoff, with a copy of the study shared prior for a robust discussion of next steps and further study. I also hope for the opportunity to discuss developing a regular research cycle under the auspices of the developmental education committee.

Have the implemented recommendations of the ELA Task Force improved student success and time to completion at Sauk Valley Community College? – A Literature Review

Update 7/2/2018: Corrected typographical errors

Have the implemented recommendations of the ELA Task Force improved student success and time to completion at Sauk Valley Community College? (PDF)

Action Research Outline

Update 7/1/2018: Added formal research question and additional detail in Step 2 (time period, data sources).

My initial research question is, “Have the implemented recommendations of the SVCC ELA Task Force improved student success and time to completion at Sauk Valley Community College?”

Here’s a brief overview of how this study could be performed and how I plan to lay the groundwork for future Action Research in this area.

Step 1: Define Success Criteria

  • Will need to discuss with Institutional Research, Recruiting, and Student Services
  • Starting points might be
    • Retention rates
    • Pass rates for gateway courses (e.g. ENG 101, 103)
    • Program completion rates
  • Crucial to select appropriate success criteria so they can be used to compare all changes/trials going forward

Step 2: Compare Results

  • Use success criteria to compare before/after ELA change.
  • Time period for comparison would be Fall 2010 – Spring 2018 semesters (2013-14 was a transition year, so this allows for 4 years before and after complete implementation)
  • Data would be extracted from College’s student information system (SIS) and (if applicable) learning management system (LMS)
  • Control for other possible factors

Step 3: Review the Literature

  • Compare ELA change results with similar changes in published trials
  • Search for other potential trials that may work at SVCC

Step 4: Develop/Find Other Trials

  • Compare other trials (existing and future) against success criteria
  • Use data from successful and unsuccessful trials to select future trials
  • Repeat the process

Academic Advising Approaches for Student Success in Developmental Education

See the related literature review for research background on this project. This is an update from my previous innovation plan, reflective of new research and a refined focus, with more to come.

Community colleges have long fought the perception that they are just an extension of high school, or “high school with ashtrays.” This is certainly understandable, as they devote considerable time and resources to ensuring that the quality of education is college-level. However, in another sense, assisting the transition from high school to college is an important role that community colleges fill — whether for students who are not independently wealthy, who are not able or ready to move away, or who are not yet prepared for college-level work.

Developmental Education

That third category — developmental education — is an area of the community college that is ripe for disruption. At Sauk Valley Community College, fewer than half of students are successful in developmental education courses the first time, and only half will continue as students past the first year. The current prevailing model places these courses as a barrier to be overcome before taking college-level coursework. A student — who may well have been receiving the message for years that they are “not good enough,” or “not college material” — takes a placement test, where they are told they are not good enough for college-level work. They must then enroll in and pay for classes — often multiple semesters’ worth — for which they will not receive credit. If they cannot pass the first time, the process repeats. Is it surprising, then, that the completion rates are so low?

Innovative Methods

This is an area of much interest, and the research shows very promising results. Some of these approaches—multiple measures, guided pathways, noncognitive assessments, predictive data modeling, and intrusive advising—are highlighted in the following presentation.

What method or methods will work best at Sauk? I believe the answer to that is a solid, “it depends.” Approaches that work well for, say, a large, urban college will not necessarily be the best approaches for Sauk as a small, rural campus. What works in California, Florida, or Texas may or may not work well in Illinois.

What is clear to me is that we must use data to determine which innovative approaches have and have not worked. When we try new approaches, a plan must be put into place to carefully analyze the results. When students arrive for advising, we must be sure we are collecting enough relevant data so that advisors and faculty (as appropriate) have the information they need to help students be successful.

The Plan

I would like to implement a three-phase approach during the 2018-19 school year.

Phase 1: Noncognitive Data and Student Retention Data

This phase is already underway as part of the College’s HLC Quality Initiative portion of the accreditation process. For this initiative, the College is discussing collecting noncognitive data via the College Student Inventory™ (CSI) and developing a program for more intrusive advising for those students who we believe can be most effectively helped by more intervention.

However, for this program to be most successful, data needs to be aggregated from several different systems (e.g. student information system [SIS], learning management system [LMS], CSI, and others) and analyzed. Ideally, a system would aggregate this data in a way that could provide (or allow adding later) predictive analytics or machine learning to help us find patterns we didn’t even know to look for. In addition, the system needs to be useful and meaningful for advisors and faculty, so they have access to the data and communication tools necessary to intervene on a student’s behalf in a timely manner.
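As a sketch of what such an aggregation layer would do, the following joins extracts from the SIS, LMS, and CSI on a shared (anonymized) student identifier to build one combined profile per student. The field names and extract shapes are placeholders, not the actual vendor schemas:

```python
def aggregate_student_data(*sources):
    """Merge per-student field dicts from several systems into one profile.

    Each source maps anonymized student ID -> dict of fields. Later sources
    contribute additional fields, and a student appears in the result even
    if only one system knows about them.
    """
    profiles = {}
    for source in sources:
        for student_id, fields in source.items():
            profiles.setdefault(student_id, {}).update(fields)
    return profiles

# Hypothetical extracts keyed by the same anonymized token.
sis = {"t1": {"program": "AA", "gpa": 2.8}, "t2": {"program": "AS", "gpa": 3.4}}
lms = {"t1": {"logins_per_week": 4}}
csi = {"t1": {"receptivity_score": 7}, "t3": {"receptivity_score": 3}}

profiles = aggregate_student_data(sis, lms, csi)
```

A combined profile like this is also the natural input for the predictive or machine-learning layer, since pattern-finding needs all of a student’s signals in one place.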

We are currently reviewing vendor demonstrations, and I hope to have a solution selected or designed (if we decide to go with an in-house solution) and beginning implementation by the end of the fall 2018 semester.

Phase 2: Study Current and Past Approaches’ Effects on Student Success

Over the years, a number of solutions have been tested or implemented. However, often no data or only anecdotal data has been collected. I would like to develop a study on these approaches to see what effect they had on student success measures. These results could then be used to identify potential trials or approaches for the third phase. I would like to begin studying the following areas beginning in the summer 2018 semester with at least preliminary results available by the end of the fall 2018 semester:

  • Develop a baseline metric against which to measure student acceleration
  • What impact did these initiatives have on student retention, pass rates for gateway courses, or acceleration?
    • Change to current ELA prerequisite/corequisite model
    • Change from COMPASS to Accuplacer/ALEKS for placement testing
    • Creation of first-year experience (FYE) course
    • Student success coordinator and activities (success week, success coaching)
    • Creation of College Study Skills (CSS) class
    • Creation of math lab and usage of Pearson MyMathLab

Studying these results will (1) give us a better sense of whether those initiatives are working and worth continuing and (2) give us better direction as we look to expand into new initiatives.
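One possible definition for the acceleration baseline in the list above: the average number of semesters from a student’s first developmental enrollment to passing the gateway course. This formula is an assumption for illustration; the actual metric would need to be settled with Institutional Research and the committee.

```python
def acceleration(sequences):
    """Average semesters from first developmental course to gateway pass.

    sequences: iterable of (first_dev_semester, gateway_pass_semester) pairs
    using a running semester index (e.g. Fall 2014 = 0, Spring 2015 = 1, ...).
    Students who have not yet passed the gateway course (None) are excluded
    here; a real study would need to account for them separately.
    """
    spans = [passed - started for started, passed in sequences if passed is not None]
    return sum(spans) / len(spans) if spans else None

# Three students passed the gateway course 2, 4, and 3 semesters after
# starting developmental education; one has not passed yet.
baseline = acceleration([(0, 2), (1, 5), (2, 5), (0, None)])
# (2 + 4 + 3) / 3 = 3.0 semesters on average
```

Computed separately for the pre- and post-change cohorts, a drop in this number would indicate that students are moving through their developmental sequence faster.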

Phase 3: Trial Initiatives

As mentioned above, this phase will depend heavily on results from the previous phase. However, some possible initiatives might include the following: develop a corequisite support class for a gateway math course, develop high-quality online developmental education courses, and expand multiple measures for placement to include other factors (such as high school GPA). I would like the trials for this phase to be developed in the spring 2019 semester for implementation in Summer 2019 or Fall 2019.



A Change of Focus for My Innovation Plan

In recent months, it has become clear to me that I need to shift focus for my innovation plan, so I wanted to take a moment here to explain that shift and how it came about.

What’s Not Changing

I became interested in developmental education because it’s one of the clearest, best-documented problems affecting higher education, and community colleges in particular. However, as clearly understood a problem as it is, the solutions are anything but clear. To my mind, this is a perfect place to perform research and try experiments. Everyone is hungry for a solution and actually willing to try things (even if that means failing miserably).

After I started digging into it, I became passionate about developmental education. There is so much squandered potential there, and, most importantly, I could see myself and my kids in those students being slowly rejected by a higher education system that’s trying desperately to help them (and I believe they REALLY ARE trying their best to help).

What is Changing

I have slowly come to the realization that my plan to develop these online, self-directed bridge courses was just not going to work. The main reason it wouldn’t work is that I simply wouldn’t be able to make it happen. I’m not a teacher. I’m not developing courses. I don’t have connections into the high schools to get buy-in from that side. Another reason is that, as I started paying closer attention to these issues, I started noticing that a lot more people are working on similar things. Unlike me, they are in the trenches and able to make things happen.

As I was developing a professional development plan, though, I focused on one section of my plan—academic advising—and things started to get easier and ideas started to flow. This, once again, is the COVA model (Choice, Ownership, and Voice through Authentic learning) in action. Because I had a choice, I developed increased ownership in the learning process, so I was able to find my voice, and my learning was more authentic.

The “Discussion” section of my literature review strongly hints at the direction I’ll be heading. I’ve decided to focus on the academic advising process, making sure that advisors have the data they need to identify students who will need extra assistance while giving them the ability to apply COVA in their own area of expertise–helping students. It’s exciting, and more importantly, it’s something I can actually accomplish.

Developmental Education Innovation Plan – Presentation

My innovation plan has undergone a substantial shift, particularly over the past couple of months. Whereas my previous focus had been on developing a set of college preparatory courses based on maker principles, my focus has now shifted to the admissions, advising, and placement process. As I studied the subject more, I discovered that some similar projects were already in play. Further, in my role as Director of Information Services, I have much more ability to directly influence these strategies—particularly as they relate to providing access to data.

Following is a presentation slide deck I developed to share with a Sauk audience outlining my vision for exploring data-driven advising at SVCC.

Alternative Placement & Remediation Professional Learning Plan

Particularly with my new focus on a data-driven placement and remediation model in my innovation plan, a professional learning plan is a vital part of this project’s success. Previously, I gave a brief outline of a professional learning approach to include the 5 principles of effective professional development. Now, to flesh out that outline a bit, I have developed a modified 3-Column Table for the learning plan. In addition, I have developed the framework and initial content for a hybrid online/in-person professional learning course as part of the prior approach. I will continue to flesh out and revise the course content with more citations and relevant content. Contributors will also have the ability to add additional resources, so the course will continue to grow throughout its duration as well.

View the Course in Canvas

In addition to the mentoring program mentioned in the outline, I have included videos from a number of colleges that have implemented these programs, providing another level of modeling.

I believe that this approach of encouraging employees to work collaboratively to help solve the problem of ineffective developmental education, combined with providing them with resources and access to data, will give them ownership of the process and allow them to make the most of this professional learning opportunity.

Professional Development Plan Outline

If my innovation plan to provide alternatives to the current model of developmental education at Sauk Valley Community College is to be successful, there will need to be effective professional development. As I looked at effecting organizational change, I narrowed my focus to reducing barriers and providing alternatives to traditional developmental education in the admissions and advising process. Since I am not a classroom teacher and my position at the college does not directly deal with instruction, I think this will be a more effective direction to take my innovation plan.

In developing a plan for professional development, then, I will again focus on the initial phase of this project: gathering additional relevant student data to identify trends we can use to determine which students may need additional intervention and which students are likely to be successful in regular courses with additional resources, lessening the burden of developmental education courses. This outline is the beginning of a professional development plan for the admissions and academic advising areas.

  1. Incorporate Gulamhussein’s 5 principles of effective professional development.
    1. Significant, ongoing duration – Training sessions would take place over the course of an academic year, beginning before the data collection project officially starts and continuing at least until after one full semester’s registration and advising cycle has been completed.
    2. Support during implementation – In addition to the training, prompts and reminders would help advisors and admissions representatives to know what they need to collect. Good user interface and database design will ease the workload and streamline the process. As aggregate data becomes available, it would be shared with employees so they can see the result of their efforts.
    3. Active initial exposure – Training sessions would not be just an instructor going through PowerPoint slides, but would contain hands-on exercises designed to simulate real-life experiences. Special attention would be paid to outlier situations to help employees think critically about what they should do in a given situation.
    4. Modeling – A mentoring program would be developed at SVCC, starting in the Student Services area to allow more experienced employees to help newer employees to understand procedures and practices.
    5. Content specific to area – The mentoring program will also allow colleagues to apply content specifically to their area. In addition, hands-on training sessions could be partially split up by area (academic advising, enrollment management, etc.) and partially mixed (to allow employees to see how their job contributes to the whole).
  2. Collaboration – Collaboration is vitally important, and it will be central to the mentoring program and in the hands-on training exercises. In addition, we should explore other means of communication such as forums, email listservs, group chats, and the like, to allow for more open sharing.
  3. Training leaders – Leadership would be shared among Student Services personnel (dealing with students, understanding the “why”), Institutional Research (why the data is important, how it contributes to student success), and Information Services (how to enter and share the data, information security).
  4. Audience – Primary audience will be academic advisors and enrollment management representatives, but should also include all student services personnel in some form.
  5. Instructional Design – I will develop the plan using backwards design and a 3 Column Table with a “Big, Hairy, Audacious Goal” (BHAG).
  6. Timeline – To adequately prepare for upcoming registration periods, trainings would need to begin in Fall 2018 for Spring 2019 registration. In addition to allowing time for preparatory training, implementing in the spring makes for an easier rollout, as spring registration is generally smaller than the summer/fall registration period.
  7. Resources – Will need Information Services to develop streamlined data entry screens, Institutional Research to develop data requirements, and Student Services leadership to make time for training sessions and provide guidance to employees.


More Effective Professional Development

Professional development (PD) is at somewhat of a transition point at SVCC, making this an ideal time to present an alternative vision of how PD can be done differently. The most effective starting point for this discussion would be Leadership Council (which consists of administrators and the faculty leaders for different academic areas), so I put together a brief overview intended as an introduction to an open discussion time. This group tends to be pretty open and collaborative, so an introduction of the topic and a nudge in the right direction should be all that’s necessary to start a productive discussion. However, when presenting to different groups, it may be helpful to have a more guided discussion.

In describing the “what is,” I thought it was important to describe it in a way that presented our currently available PD in a positive light and look at the opportunities for improving on what’s good rather than presenting the current state as a completely broken system.

Technically speaking, I firmly believe in the adage that “less is more” with regard to presentation graphics, so I kept the presentation simple, with some mild humor and photos at the beginning to engage the audience, giving way to a more traditional slide deck. I developed the slide deck in Google Slides, wrote a manuscript to follow (I usually end up farther off-script when giving a presentation live but am more confident when I manuscript it first), recorded the screen capture using Snagit, and then did final edits in Adobe Premiere Pro CC.

