Effective Online Programs

Udacity – I have taken a few Udacity courses over the years, and I have found them to be quite effective. In addition to “one-off” courses, they also offer what they call Nanodegree programs, which provide a useful credential for the student.
Georgia Tech OMSCS – Georgia Tech’s Master’s program in computer science is built on the Udacity platform, and many (if not all) of the courses are freely available as MOOCs, though one obviously needs to be enrolled in the program to receive graduate credit.
Khan Academy – One of the originals and still hard to beat for math education. I used some of the videos as course content and extended resources in my own online math course.
Lynda (now LinkedIn Learning) – Another of the early online learning resources, and still a great place to go to get accessible learning on a wide variety of subjects.
Lamar DLL – At the risk of sounding like a brown-noser, I have learned much about effective course design as a participant in the DLL courses.

Online Learning Reflection

Relevance and Importance of Online Learning

Online learning, properly developed, offers tremendous benefits for students. The course I have developed, in particular, gives remedial students a way to gain understanding of key concepts in preparation for college-level work, and gives non-traditional students the ability to refresh those concepts quickly. In addition, short mini-courses like this one allow students to take a quick refresher if they need additional support while working through college-level courses. While this type of support is not the best fit for every student, I believe it can help a great many students succeed in college without feeling like they have been labeled “the dumb kid,” an approach that study after study has shown simply is not effective.

Learning Theories

While an online learning environment can be effectively developed using any learning theory (or theories), I tend to think in terms of constructivist/constructionist or connectivist theories. For this course in particular, I leaned mostly on connectivist ideas. Connectivism “is focused on connecting specialized information sets, and the connections that enable us to learn more are more important than our current state of knowing” (Siemens, 2005). Since this course was focused on remediation (making connections that students have missed or forgotten), the first two modules focus on connecting familiar concepts with order of operations. With that background, the course moves on to methods often associated with behaviorism: explaining the material and practicing problems. Once students have demonstrated understanding of the concept, the last module connects the newly learned concept with other concepts that will build on the new understanding.

Implementation

Having previously developed a detailed 3-column table (3CT) and Understanding by Design (UbD) template, developing the actual course was mostly a matter of locating appropriate resources and writing the introductory and connecting text. I also added the final connecting module, which was not in the initial 3CT or UbD template. Future areas for improvement would be to include an introductory video, and possibly connecting videos, to add a more direct, personal connection with students.

Lessons Learned

While developing this course, I was struck in particular by the importance of robust preparation. Having already put in several hours developing the 3CT and UbD made developing the course itself dramatically easier than it would have been otherwise. However, the inverse was also interesting: even with all that preparation, it was not until the course was taking shape that I noticed some areas that needed improvement. I am confident, as well, that teaching the course will require constant re-evaluation and revision. If online learning is to be effective, it will be because of many hours of preparation, evaluation, and research.



Order of Operations: Online Course

Building on my 3-Column Table and UbD outline, I wanted to create an online mini-course for order of operations. This course primarily has two types of student in mind: a student preparing to go to college who wants to “brush up” rather than take developmental education classes, and a non-traditional student who needs a refresher.

Because of that, the course takes a little extra time looking at the concept from different angles, with some fun analogies and exercises. Then, having established the need for order of operations, the next module explains order of operations on a more technical level (the way students have likely heard it before) before moving on to sets of practice problems so students can try and self-assess.
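The precedence rules the module teaches can be illustrated with a short Python snippet (Python is used here purely as a neutral calculator; the course itself lives in Schoology):

```python
# Order of operations in practice: parentheses first, then exponents,
# then multiplication/division left to right, then addition/subtraction.
print(3 + 4 * 2)           # 11, not 14: multiplication happens before addition
print((3 + 4) * 2)         # 14: parentheses override the default order
print(2 ** 3 * 4 - 6 / 3)  # 30.0: 2**3=8, 8*4=32, 6/3=2.0, 32-2.0=30.0
```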

A final, ungraded quiz helps students assess whether they are ready to move on. The last module then connects the current mini-course to another, related skill: it introduces the FOIL method, which is related to order of operations and can help bridge the student into further algebraic concepts.

There are many other subjects that would be good to cover with mini-courses like this. In Stigler, Givvin, and Thompson’s (2009) research, the two most commonly missed types of problems involved fractions (adding, subtracting, multiplying, and dividing them, mixed with other whole or decimal numbers) and least common multiple/greatest common factor. Both of those seem like good candidates for a course of this type.



Action Research Plan

Topic/Purpose

As I look at my renewed focus on applying data analytics and human creativity to the problem of community college developmental education, it really is an Action Research problem. That is, my goal is to empower and encourage advisors to apply Action Research methodologies to their advising. Accomplishing this, however, will require some research of my own to “prime the pump.”

An essential place to start is to research how effective the changes we’ve tried in the past have been. Specifically, we made a change a few years back to our ELA prerequisite requirements to allow instructors to classify their courses as needing an ELA prerequisite, corequisite, or no requirements. Unfortunately though, the College hasn’t studied how effective it’s been. A study on this would lay the groundwork to further study changes and experiments against an existing set of success metrics.

My initial research question is,

“Have the implemented recommendations of the SVCC ELA Task Force improved student success and time to completion at Sauk Valley Community College?”

Design/Methods/Measurement

Determining how to measure student success is a crucial step in this process. A review of the literature provides an excellent starting point and benchmark. Based on my literature review, I believe that the following measures will be important to examine and compare:

  • Gateway English course (ENG 101) pass rates (defined as “C” or above)
  • Pass rates for the next English course (ENG 103)
  • Pass rates for courses with ELA corequisite or no prerequisite requirements
  • 2-year and 4-year graduation rates

In addition, Sauk looks at other metrics to gauge student success, such as fall-to-fall retention, fall-to-spring retention, and persistence. While not all of these measures will necessarily be significant, I intend to include them in my study for completeness and comparison. Finally, I would like to introduce another measure, “acceleration,” to gauge the rate at which students move through their developmental sequence.
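These measures could be computed from the anonymized student records along the following lines. This is a minimal sketch, and the record fields (student_id, course, grade, terms_to_complete_dev_sequence) are hypothetical stand-ins rather than the College’s actual schema:

```python
# Pass is defined as "C" or above, per the measures listed above.
PASSING_GRADES = {"A", "B", "C"}

def pass_rate(records, course):
    """Share of attempts in `course` earning a C or above."""
    attempts = [r for r in records if r["course"] == course]
    if not attempts:
        return None
    passed = sum(1 for r in attempts if r["grade"] in PASSING_GRADES)
    return passed / len(attempts)

def acceleration(records):
    """Proposed 'acceleration' measure: mean number of terms students take
    to move through their developmental sequence (lower is faster)."""
    terms = [r["terms_to_complete_dev_sequence"] for r in records
             if r.get("terms_to_complete_dev_sequence") is not None]
    return sum(terms) / len(terms) if terms else None

# Tiny illustrative dataset (entirely made up):
records = [
    {"student_id": 1, "course": "ENG 101", "grade": "B", "terms_to_complete_dev_sequence": 2},
    {"student_id": 2, "course": "ENG 101", "grade": "D", "terms_to_complete_dev_sequence": 3},
    {"student_id": 3, "course": "ENG 103", "grade": "A", "terms_to_complete_dev_sequence": 1},
]
print(pass_rate(records, "ENG 101"))  # 0.5
print(acceleration(records))          # 2.0
```

The same functions could be run separately against any subgroup (by cohort, course, or demographic) so that every comparison uses identical definitions.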

The time period for the comparison study would be the Fall 2010 – Spring 2018 semesters, since the 2013-14 school year was a transition year with full implementation in Fall 2014. This allows ample time to compare the periods before and after the change. Some measures (for example, 4-year graduation rates) may warrant a longer study period, but this should be enough data to determine whether the changes have been successful.
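Splitting records into pre- and post-change cohorts might look like the sketch below; the `term` field and its "Season Year" format are assumptions for illustration, not the actual export format:

```python
# Terms before Fall 2013 count as pre-change; Fall 2014 onward counts as
# post-implementation; the 2013-14 transition year is excluded from both.
TERM_ORDER = {"Spring": 0, "Summer": 1, "Fall": 2}

def term_key(term):
    """Sortable (year, season) key for terms like 'Fall 2012'."""
    season, year = term.split()
    return (int(year), TERM_ORDER[season])

def split_cohorts(records):
    before, after = [], []
    for r in records:
        key = term_key(r["term"])
        if key < term_key("Fall 2013"):      # pre-change terms
            before.append(r)
        elif key >= term_key("Fall 2014"):   # full implementation onward
            after.append(r)
        # transition-year records (Fall 2013 - Summer 2014) are excluded
    return before, after

before, after = split_cohorts([
    {"term": "Fall 2012"}, {"term": "Spring 2014"}, {"term": "Fall 2016"},
])
print(len(before), len(after))  # 1 1
```

The success measures can then be computed identically for each cohort, keeping the comparison apples-to-apples.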

Data would be collected from the College’s student information system (SIS) and (if applicable) learning management system (LMS) using anonymized exports of student records. This will allow tracking individual students’ progress through multiple courses and through their program of study. Additional comparison and benchmark data may be collected from IPEDS data and other publicly available data.

After obtaining initial results, I will need to examine additional demographic and academic data to control for other factors that may explain part or all of the results and to determine future areas of study and trials. This will likely consume the majority of the study time, and while it is difficult to predict all the factors I will need to examine, I would need to examine the following at minimum: age, gender, high school, and (to the extent possible) socioeconomic factors.

Timeline

July 2018 – Approval.
The first step is to obtain approval from the SVCC Institutional Review Board to conduct human subject research. I have filed an application and expect to receive a response by mid-July.
July-August 2018 – Initial Success Results.
After I have received approval, I will pull initial student data and begin sorting and analyzing to obtain initial results.
September – November 2018 – Follow the Data.
Once I have the initial success data compiled, I will examine additional demographic and academic data to control for other factors and to identify future areas of study and trials.
November – December 2018 – Write Paper.
With the data compiled and analyzed, I will write a formal research paper. The paper can then be shared with relevant faculty, administration, and the College’s developmental education committee for feedback. Once the paper is finalized, I will begin to submit it for publication.
January 2019 – Share Results.
Results would be shared with faculty and staff at the College Spring Kickoff, with a copy of the study shared prior for a robust discussion of next steps and further study. I also hope for the opportunity to discuss developing a regular research cycle under the auspices of the developmental education committee.

Have the implemented recommendations of the ELA Task Force improved student success and time to completion at Sauk Valley Community College? – A Literature Review

Update 7/2/2018: Corrected typographical errors

Have the implemented recommendations of the ELA Task Force improved student success and time to completion at Sauk Valley Community College? (PDF)

Action Research Outline

Update 7/1/2018: Added formal research question and additional detail in Step 2 (time period, data sources).


My initial research question is, “Have the implemented recommendations of the SVCC ELA Task Force improved student success and time to completion at Sauk Valley Community College?”

Here’s a brief overview of how this study could be performed and how I plan to lay the groundwork for future Action Research in this area.

Step 1: Define Success Criteria

  • Will need to discuss with Institutional Research, Recruiting, and Student Services
  • Starting points might be
    • Retention rates
    • Pass rates for gateway courses (e.g. ENG 101, 103)
    • Program completion rates
  • Crucial to select appropriate success criteria so they can be used to compare all changes/trials going forward

Step 2: Compare Results

  • Use success criteria to compare before/after ELA change.
  • Time period for comparison would be Fall 2010 – Spring 2018 semesters (2013-14 was a transition year, so this allows for 4 years before and after complete implementation)
  • Data would be extracted from College’s student information system (SIS) and (if applicable) learning management system (LMS)
  • Control for other possible factors

Step 3: Review the Literature

  • Compare ELA change results with similar changes in published trials
  • Search for other potential trials that may work at SVCC

Step 4: Develop/Find Other Trials

  • Compare other trials (existing and future) against success criteria
  • Use data from successful and unsuccessful trials to select future trials
  • Repeat the process

Academic Advising Approaches for Student Success in Developmental Education

See the related literature review for research background on this project. This is an update from my previous innovation plan, reflective of new research and a refined focus, with more to come.

Community colleges have long fought the perception that they are just an extension of high school, or “high school with ashtrays.” This is certainly understandable, as they devote a lot of time and resources to striving to ensure that the quality of education is college-level. However, in another sense, assisting the transition from high school to college is an important role that community colleges fill — whether for those students who are not independently wealthy, not able or ready to move away, or who are not yet prepared for college-level work.

Developmental Education

That third category — developmental education — is an area of the community college that is ripe for disruption. At Sauk Valley Community College, fewer than half of students are successful in developmental education courses the first time, and only half will continue as students past the first year. The current prevailing model places these courses as a barrier to be overcome before taking college-level coursework. A student — who may well have been receiving the message for years that they are “not good enough,” or “not college material” — takes a placement test, where they are told they are not good enough for college-level work. They must then enroll in and pay for classes — often multiple semesters’ worth — for which they will not receive credit. If they cannot pass the first time, the process repeats. Is it surprising, then, that the completion rates are so low?

Innovative Methods

This is an area of much interest and the research shows very promising results. Some of these approaches—multiple measures, guided pathways, noncognitive assessments, predictive data modeling, and intrusive advising—are highlighted in the following presentation.

What method or methods will work best at Sauk? I believe the answer to that is a solid, “it depends.” Approaches that work well for, say, a large, urban college will not necessarily be the best approaches for Sauk as a small, rural campus. What works in California, Florida, or Texas may or may not work well in Illinois.

What is clear to me is that we must use data to determine what innovative approaches have worked and have not worked. When we try new approaches, a plan must be put into place to carefully analyze results. When students arrive for advising, we must be sure that we are collecting enough relevant data about the student so the advisors and faculty (as appropriate) have enough information about the student to be able to help them be successful.

The Plan

I would like to implement a three-phase approach during the 2018-19 school year.

Phase 1: Noncognitive Data and Student Retention Data

This phase is already underway as part of the College’s HLC Quality Initiative portion of the accreditation process. For this initiative, the College is discussing collecting noncognitive data via the College Student Inventory™ (CSI) and developing a program for more intrusive advising for those students who we believe can be most effectively helped by more intervention.

However, for this program to be most successful, data needs to be aggregated from several different systems (e.g. student information system [SIS], learning management system [LMS], CSI, and others) and analyzed. Ideally, a system would aggregate this data in a way that provides predictive analytics or machine learning (or the ability to add them later) to help us find patterns we didn’t even know to look for. In addition, the system needs to be useful and meaningful for advisors and faculty, giving them access to the data and communication tools necessary to intervene on a student’s behalf in a timely manner.

We are currently reviewing demonstrations from vendors, and I hope to have a solution selected or designed (if we decide to go with an in-house solution), with implementation beginning by the end of the fall 2018 semester.

Phase 2: Study Current and Past Approaches’ Effects on Student Success

Over the years, a number of solutions have been tested or implemented. However, often no data or only anecdotal data has been collected. I would like to develop a study on these approaches to see what effect they had on student success measures. These results could then be used to identify potential trials or approaches for the third phase. I would like to begin studying the following areas beginning in the summer 2018 semester with at least preliminary results available by the end of the fall 2018 semester:

  • Develop a baseline metric against which to measure student acceleration
  • What impact did these initiatives have on student retention, pass rates for gateway courses, or acceleration?
    • Change to current ELA prerequisite/corequisite model
    • Change from COMPASS to Accuplacer/ALEKS for placement testing
    • Creation of first-year experience (FYE) course
    • Student success coordinator and activities (success week, success coaching)
    • Creation of College Study Skills (CSS) class
    • Creation of math lab and usage of Pearson MyMathLab

Studying these results will (1) give us a better sense of whether those initiatives are working and worth continuing and (2) give us better direction as we look to expand into new initiatives.

Phase 3: Trial Initiatives

As mentioned above, this phase will depend heavily on results from the previous phase. However, some possible initiatives might include the following: developing a corequisite support class for a gateway math course, developing high-quality online developmental education courses, and expanding multiple measures for placement to include other measures (such as high school GPA). I would like the trials for this phase to be developed in the spring 2019 semester for implementation in summer or fall 2019.



A Change of Focus for My Innovation Plan

In recent months, it has become clear to me that I need to shift focus for my innovation plan, so I wanted to take a moment here to explain that shift and how it came about.

What’s Not Changing

I became interested in developmental education because it’s one of the most clear, well-documented problems affecting higher education, and community colleges in particular. However, as clearly understood a problem as it is, the solutions are anything but clear. To my mind, this is a perfect place to perform research and try experiments. Everyone is hungry for a solution and actually willing to try things (even if that means failing miserably).

After I started digging into it, I became passionate about developmental education. There is so much squandered potential there, and, most importantly, I could see myself and my kids in those students being slowly rejected by a higher education system that’s trying desperately to help them (and I believe they REALLY ARE trying their best to help).

What is Changing

I have slowly come to the realization that my plan to develop these online, self-directed bridge courses was just not going to work. The main reason it wouldn’t work is that I simply wouldn’t be able to make it happen. I’m not a teacher. I’m not developing courses. I don’t have connections into the high schools to get buy-in from that side. Another reason is that, as I started paying closer attention to these issues, I started noticing that a lot more people are working on similar things. Unlike me, they are in the trenches and able to make things happen.

As I was developing a professional development plan, though, I focused on one section of my plan, academic advising, and things started to get easier and ideas started to flow. This, once again, is the COVA model (Choice, Ownership, and Voice through Authentic learning) in action. Because I had a choice, I developed increased ownership of the learning process, and so I was able to find my voice and my learning was more authentic.

The “Discussion” section of my literature review strongly hints at the direction I’ll be heading. I’ve decided to focus on the academic advising process, making sure that advisors have the data they need to identify students who will need extra assistance, while giving them the ability to apply COVA in their own area of expertise: helping students. It’s exciting and, more importantly, it’s something I can actually accomplish.

Developmental Education Innovation Plan – Presentation

My innovation plan has undergone a substantial shift, particularly over the past couple of months. Whereas my previous focus had been on developing a set of college preparatory courses based on maker principles, my focus has now shifted to the admissions, advising, and placement process. As I studied the subject more, I discovered that some similar projects were already in play. Further, in my role as Director of Information Services, I have much more ability to directly influence these strategies—particularly as they relate to providing access to data.

Following is a presentation slide deck I developed to share with a Sauk audience outlining my vision for exploring data-driven advising at SVCC.