At Yellowdig, when we talk about “standard discussion pedagogy,” we are referring to a framework in which there are assignments for each course module or topic (usually weekly), students have a deadline for responding to a prompt, and there is a requirement to make a certain number of posts and comments in response to those prompts (usually 1 post and 2 comments on others’ posts). There are many variations on this methodology, but the script is shockingly similar across institutions and departments… as is the sentiment that these “discussions” are a time-consuming and disappointing exercise. Let’s consider this the shallow end of the swimming pool. It feels safer, but there is still plenty of water to drown in, and you definitely can’t safely dive very deep here.
The deep end of the pool is Yellowdig’s community-building system, which is based on a different approach. Yellowdig’s patented point system is designed to change student behavior while still requiring a regular baseline of participation in open-ended conversations, and its social point-earning rules inspire timely, relevant, and high-quality posting and commenting. These conversations are typically started by students sharing news articles or asking questions, without specific instructor-led prompts.
The point system can be set up to require a certain amount of participation each week, but it is not intended to constrain how students participate to earn their points. The expectation is that comments will form the bulk of point earning in a healthy community, such that the comment-to-post ratio will be much greater than the 2-to-1 that standard discussion pedagogy dictates. We typically hope these comment-to-post ratios (which we call “conversation ratios”) are upward of 7.0, because higher conversation ratios are associated with higher instructor and learner satisfaction, more voluntary participation beyond the requirements, and better grades. This is the fun end of the pool.
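To make the conversation-ratio arithmetic concrete, here is a minimal sketch in Python. The function name and the zero-post handling are our illustrative assumptions, not part of Yellowdig’s product or API:

```python
def conversation_ratio(comments: int, posts: int) -> float:
    """Comments divided by posts; higher values indicate richer back-and-forth."""
    if posts == 0:
        return 0.0  # avoid division by zero in an empty community
    return comments / posts

# Standard discussion pedagogy mandates roughly 2 comments per post:
sdp_ratio = conversation_ratio(comments=2, posts=1)        # 2.0

# A healthy Yellowdig community is hoped to land upward of 7.0:
healthy_ratio = conversation_ratio(comments=84, posts=12)  # 7.0
```

Under this definition, a community hitting the 7.0 target produces more than three times as much commenting per post as the standard 2-to-1 requirement.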
Same Pool, Different Rules
As the above descriptions make clear, standard discussion pedagogy (SDP) and Yellowdig’s intended pedagogy for course-based use are quite different. In a course context, Yellowdig may be playing in the same pool as LMS Discussions, but the rules around what you can do couldn’t be more different (as seen in the figure below). As a result, some instructors see Yellowdig as a paradigm shift requiring a major change in thinking. Change can be scary.
Although many people try Yellowdig because they are unhappy with their outcomes from SDP, many still try to use SDP within Yellowdig. The big problems with this are:
1) The Yellowdig system is designed to circumvent the rigidity of SDP. The point system, the moderation controls, the topic system, and other aspects of Yellowdig are not easy to align with SDP. Inevitably, misalignments create instructor and student friction that erodes most of the value of Yellowdig’s system.
2) SDP is a major contributor to the lackluster outcomes in other technologies. Instructors who do not change pedagogy and the way they interact with students continue to see underwhelming outcomes because our system cannot compensate for a discussion pedagogy that does not work.
Due to these factors, following SDP within Yellowdig creates the impression that Yellowdig is “just ok.” Instructors using SDP generally think Yellowdig is better than LMS discussion forums but tend not to see results that fundamentally transform the learning experience. Most lukewarm or negative responses to Yellowdig have come from situations where instructors refused to adopt the vast majority of Yellowdig’s proven practices. Not changing pedagogy in step with the adoption of a new technology is akin to continuing to use chalk when someone has installed a whiteboard in your classroom. You can’t really blame the whiteboard for not displaying your writing better; nor can you be surprised that the chalk is still making your hands dusty.
As illustrated in the figure below, not adopting Yellowdig pedagogy yields a large gap between course outcomes and ideal outcomes, while also leaving a lot of unexplored territory in terms of knowing the results from different implementations (i.e., “Unknown Outcomes”).
Another problem with this approach is that it doesn’t put Yellowdig’s intended pedagogy to the test. SDP instructors have no way to know whether their fears about Yellowdig’s pedagogy are grounded in reality for their student group. As shown below, because SDP places rules and prompts in weekly modules and over-emphasizes points for creating new posts, discussions built around SDP cannot easily be course-corrected, and finding your ideal discussion pedagogy will take many semesters of non-ideal implementations.
This scenario may sound hypothetical, but we have regularly observed (and learned from) these shifts among our partners, especially some of our early adopters who started before we had enough data to make confident recommendations. The data in the following table are from an instructor who started with a standard discussion approach to setting up the point system, favoring the production of posts over comments. In the summer of 2018, he arrived at point settings close to our recommendations, which yielded peak engagement relative to the requirements (i.e., the highest “Avg. % of Course Goal”).
As noted in our blog post about our proven practices, aside from the logic of using a tool as it was designed to be used, we see that trying all of Yellowdig’s proven practices and intended pedagogy at the very least ensures interest and engagement. Although people may feel that prompting students is “better” educationally, an approach to learning cannot truly be valid if students will not actively participate in it.
An interesting thing to note about the above undergraduate business case is that, after arriving near our current recommendations in the summer of 2018, the professor continued to explore high points for comments and high social points: the deep end. As any good instructor should, he continues to key in on the pedagogy that he thinks maximizes learning outcomes for his specific students. In the meantime, these settings have all yielded more conversation and higher participation than any of the first 6 semesters of use. This is a strong signal, which we have confirmed by talking to this professor, that the increased conversation ratio and participation of these latter semesters is also reflected in the general quality of student posts and comments. If his initial pedagogy from the first 6 semesters had produced qualitatively better course discussion than the last 6, he would have quickly returned to it.
One of the main reasons people consider Yellowdig to begin with is because students complain about discussion forums and merely “check the box” while doing discussion assignments. If that’s the starting point, why not try something very different?
Dive right into the Yellowdig end of the swimming pool!
From a purely practical standpoint, it makes sense to start by trying Yellowdig’s recommended pedagogy. As we demonstrated in our previous blog post, following best practices is far less risky than ignoring them. Try Yellowdig as it was intended to be used, take the engagement benefits that come with it on the first attempt, and then work back toward implementing more structure if that seems necessary for achieving your educational aims.
There are a few major benefits to this approach in terms of arriving quickly at your ideal pedagogy.
1) By bracketing your old pedagogy and trying a brand-new one, you immediately learn the full breadth of the outcomes you can expect; there are no unknown areas of the spectrum. This lets you know with confidence whether you need to adjust your pedagogy.
2) You immediately have a chance to test any assumptions you may have about our suggested pedagogy. Our data strongly support our claims about the benefits of our system, and we encourage you to give them a fair shake.
3) It’s easier to design a course without the structure inherent in SDP and subsequently add prompts and structure if needed. This approach allows you to arrive at an ideal pedagogy in a few weeks rather than a few semesters. Taking structure away from a course is almost impossible; giving your students a prompt is easy if it turns out they would benefit from a little structure.
Please take a swim in our end of the swimming pool. You’ll have a lot of fun and we will make sure you don’t drown. If you need some more encouragement to take that first leap off the diving board, just reach out!
Brian Verdine, Ph.D. is the Head of Client Success at Yellowdig. Brian received his Ph.D. in Psychology from Vanderbilt University’s Peabody College of Education and Human Development. He went on to a postdoctoral position in the Education department at the University of Delaware, where he later became, and continues to be, an Affiliated Assistant Professor. His academic research and his now-primary career in educational technology have focused on understanding and improving learning outside of classrooms, in less formal learning situations. At Yellowdig he manages all aspects of Client Success, with a strong focus on how implementation in classes influences instructor and student outcomes.