How Not to Engage Parents: Lessons from the Ontario Ministry of Education
Posted on December 2, 2019 in Governance Policy Context
Source: PolicyAlternatives.ca / BehindtheNumbers.ca – Shorthand
By Ricardo Tranjan (CCPA@policyalternatives.ca)
Public engagement is a double-edged sword: done right, it invigorates democratic institutions; done wrong, it fuels distrust in government. The Education in Ontario Consultation offers a glaring example of public engagement gone wrong.
Here’s why.
Instead of creating mechanisms for ongoing dialogue with the diverse and well-organized education sector, the Ministry opted for one-off consultations.
Instead of designing consultations that promote two-way learning and understanding, the Ministry heavily relied on a poorly designed online survey.
Instead of making all consultation results public and accessible, the Ministry released only a portion of them in a confusing format.
Instead of using public input to refine policies, the Ministry is touting this process – which engaged less than 0.5% of Ontarians – as evidence that the government is “listening to parents”.
This blog post unpacks each of these criticisms; it includes a map of the number of survey responses by postal code.
Choosing a public engagement model
The International Association for Public Participation (IAP2) uses a five-level spectrum to classify public engagement processes according to the public’s role in decision-making. On one end of the spectrum, governments simply inform the public about an issue or plan. This type of process increases government transparency but doesn’t offer the public opportunities to influence decisions. An example would be a brochure detailing a neighbourhood revitalization plan. On the opposite end of the spectrum, the public is empowered with the decision-making authority, like in a referendum. In between these two ends, there are less clear-cut processes where governments consult, involve, and collaborate with the public but hold onto the decision-making power.
The Ministry of Education chose to go beyond information sharing, which is positive. And it couldn’t have transferred decision-making authority to participants, due to legislation and the large number of topics under review. The choice was between one-off consultations and more sustained involvement of, and collaboration with, stakeholders.
The Ministry opted for one-off consultations.
In consultations, government officials usually present the issues they have identified and the solutions they are considering for those issues, and gather public feedback to fine-tune policies and inform implementation. While consultations allow the public to react to proposals, they seldom offer opportunities for dialogue, thus failing to tap into the knowledge and experience within communities. They are also less likely to lead to collaborative implementation processes since the consulted parties have less ownership over the plans.
Ontario’s education sector is highly organized, with a myriad of parent groups, professional associations and independent advocacy organizations, in addition to a strong education research community outside of the government. This is the perfect context for sustained engagement in the form of ad-hoc committees, advisories and serial engagement activities, formats which allow individuals and organizations to help governments to identify and understand public concerns and aspirations, and develop solutions that take various perspectives into account. The people engaged (not simply consulted) become involved in implementation processes (e.g. educators), or at least closely monitor them (e.g. parents and researchers), creating a positive feedback loop.
The Ministry of Education may have carried out the “largest public consultation on education,” as it claims, but it certainly failed to start a sustained and meaningful dialogue with parents and other education sector stakeholders.
Rolling out the consultation
Some consultations are better than others.
As far as format goes, public meetings generally allow for some two-way interaction between the public and government officials, giving both the chance to learn from each other. When well planned, these meetings provide marginalized communities the support they need to meaningfully participate. Written and automated consultations, on the other hand, reinforce the one-side-speaks, one-side-reacts dynamic that constitutes the major shortcoming of this form of public engagement.
In the Education Ministry consultation conducted at the end of 2018, 58% of responses came via an online survey, 4% came through participation in telephone town halls, and 38% through open submission forms.
In other words, 62% of participants answered multiple-choice questions with little to no chance of engaging public officials; the remaining 38% used an open form to voice their views but didn’t receive any response or account of how their input was received.
The survey included standard demographic questions, several well-formulated policy questions, and at least ten questions (25% of the survey) that won’t generate any actionable information.
Here’s a sample of the ill-formulated questions:
Are you willing to do a consultation like this about Ontario’s education system every 5 years or so? A tiny share of Ontarians completed the survey, and the education system welcomes a new cohort of parents every year, so the answer to this question should have no bearing on whether other consultations are carried out.
How will you know a student has the needed math fundamentals? Select all that apply. The answer choices provided were not mutually exclusive; in fact, they were complementary ways of assessing math skills. As a result, most respondents selected all choices and no useful information was generated.
Ontario needs to improve student achievement in math. Where should we focus? The survey asks respondents to rate seven answers from less to more impactful (a Likert scale). Six of the seven answers received very similar ratings. Since this question was about prioritization (“where to focus”), a ranking question should have been used instead of Likert scales.
The survey also asks if and how much time and money parents spend to help their children learn math outside of the classroom. Caution is needed in interpreting the results for these questions. Eighty-three percent of parents reported the type of school their children attend. Of these, the share who reported having home-schooled children (2.7%) is much higher than the average for Ontario (0.3%). Respondents with children in private/independent schools (9.6%) were also overrepresented in relation to the provincial average (6.7%).
Even if a consultation is chosen as the public engagement method, and a written survey is used as the consultation format, the Ministry should have carefully designed questions to ensure their applicability.
Releasing and communicating results
There are clear expectations about how governments should handle the results of public consultations: release them, use them in the formulation of policies, and don’t misrepresent the process. It sounds simple. Yet, the Ministry has failed here too.
In the broad 2018 consultation, 38% of participants used the open submission form. The Ministry has not released any summary of this input. Although the online survey and telephone town hall results are available online, the data is presented in peculiar ways: in some cases, counts and percentages are provided, in other cases only counts, and in a few cases only percentages. Except for “select all that apply” questions, where percentages cause confusion, there is no good rationale for omitting counts or percentages: providing both makes the report easier to read.
Here’s an example:
Respondents were asked if they were familiar with the Ontario College of Teachers. Results were provided in counts and percentages. The next question asked respondents to rate, on a scale of 1 to 5, “How important is it to you that the Ontario College of Teachers exists?” Because only counts were provided, the reader has to calculate that the 2,440 respondents who answered 1 (“not important”) represent 7% of all respondents, whereas the 15,182 people who answered 5 (“very important”) represent 46% of all respondents. And it isn’t clear whether everyone was asked the second question or only the people who answered Yes to the first question (the latter being the right design).
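The omitted percentages are straightforward to recover from the published counts. A minimal sketch, assuming an overall total of roughly 33,000 respondents (a figure back-calculated from the counts and shares reported above, not stated in the Ministry's report):

```python
# Recover the percentages the Ministry's report omitted, given the counts.
# The total is an assumption, back-calculated from the figures above
# (15,182 responses reported as 46% implies ~33,004 respondents).
total_respondents = 33_004

counts = {
    "1 (not important)": 2_440,
    "5 (very important)": 15_182,
}

for rating, count in counts.items():
    share = round(count / total_respondents * 100)
    print(f"{rating}: {count:,} responses = {share}%")
```

This is exactly the arithmetic a reader is currently forced to do by hand, which is why reports should publish counts and percentages side by side.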
Transparency is clearly lacking. And then there is the misrepresentation of the process.
Consultations are intended to support policy development processes, and not to provide a carte blanche for governments to speak on behalf of the consulted population.
Getting the policy mix right is difficult. Policymaking deals with complex issues for which government officials have to develop timely, effective, efficient and sensitive responses. A consensus has formed in the past decades that officials are more likely to get it right if they ask for input from the people who will be affected by the policy in question. The public shares their input in good faith, hoping it will lead to better results. But public trust is undermined when governments use the input from a consultation to justify any and all policy changes.
Here are some examples of misuse of the results of the public consultation.
In the 2019 Budget, page 127, a long paragraph starts with “The Province is listening to parents” and ends with “consultations will help shape the government’s plan.” The next paragraph lays out the plan to increase class sizes, the consultation on which had closed only a few weeks earlier, with results never released to the public.
In a question period, on March 19, 2019, the leader of the opposition contended that introducing mandatory e-learning was part of an attempt to balance the budget on the backs of kids. Premier Doug Ford responded, “We consulted with over 72,000 parents. Over 72,000 parents told us what was important, and unlike the previous government, we actually listen to the parents.” The consultation with 72,000 responses didn’t include any question on e-learning.
This 72,000 figure is repeatedly cited in education announcements, be it about sex education, math curriculum, or class sizes. It became a mystical number that justifies any change to Ontario’s education system. This is a misuse of the consultation results, for all the reasons explained above.
If consultation results are going to be misused as a carte blanche from parents, instead of as input on specific policies, here’s a reality check: 72,000 is a very small percentage of the Ontario population – less than 0.5%. To illustrate just how small this is, the interactive map below shows the share of the population who participated in the online survey by postal code (forward sortation area).
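The "less than 0.5%" figure is easy to verify. A quick check, assuming an Ontario population of roughly 14.5 million in 2019 (an approximation not stated in the post itself):

```python
# Sanity-check the participation share behind the "72,000 parents" talking point.
# The population figure is an assumption (roughly Ontario's 2019 population).
participants = 72_000
ontario_population = 14_500_000

share_pct = participants / ontario_population * 100
print(f"Participation: {share_pct:.2f}% of Ontarians")
```

Under that assumption the share comes out just under half a percent, which is the point: the number is large for a survey but tiny as a mandate.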
In the first months of 2019, the Ministry carried out a second, smaller consultation focused on changes to class sizes. This time only open-form submissions were accepted. The Ministry has not released any results for this consultation. Parents tried to gain access to it through a freedom of information request, which was denied: “After careful review, the Ministry has determined the records must be withheld.”
While parents should be commended for participating in these consultations, and the input collected must be seriously considered, the poor design of these public engagement processes and the misuse of their results are a disservice to all Ontarians, especially our children.
Ricardo Tranjan is a senior researcher with the Canadian Centre for Policy Alternatives’ Ontario office; he’s the author of the book Participatory Democracy in Brazil: Socioeconomic and Political Origins and other academic works on citizen participation.
http://behindthenumbers.ca/shorthand/how-not-to-engage-parents-lessons-from-the-ontario-ministry-of-education/?mc_cid=e2ce0765ca&mc_eid=299b6a00a6
Tags: ideology, participation