Alexandra Kitty

Intel Update: Please panic in an orderly fashion while I deconstruct the narrative.

The Damage Report


Where reputations, lies, and PR campaigns get slabbed. Autopsies on media, crime, and power, no anesthetic.

Four Ontario Colleges: An Institutional Culture Typology


Purpose and Method

This document codifies observed institutional traits across four Ontario colleges into a provisional typology. The primary data source is direct, multi-year teaching experience at each institution, supplemented by sector-level research on faculty development, teaching culture, student deficit thinking, and program quality in Ontario’s college sector. The intent is to produce a typology of institutional cultures as experienced from the front of the classroom, not as they present themselves in strategic plans or quality reports.

The framework draws on established concepts in higher-education research, including institutional teaching culture typologies, the distinction between deficit and asset orientations toward students, faculty development structures and their effectiveness, and program quality and responsiveness to feedback.


The Four Types at a Glance

| Dimension | Sheridan | Mohawk | Niagara | Conestoga |
| --- | --- | --- | --- | --- |
| Type label | Organized but territorially political | Deficit-bureaucratic chaos | Cautious, process-heavy, intermittently responsive | Autocratic, metrics-driven extraction |
| View of students | Largely neutral; differentiated by department | Deficit; “not going to get jobs anyway” | Neutral to cautious; formal feedback mechanisms exist | Instrumental; students as enrolment units |
| Faculty development quality | Faculty-led, professor-initiated in early 2000s; organized and coherent | Absent or nominal in key departments | Formal CAE structure; uneven implementation | Mandatory, paid, standardized; content dated and in places incorrect |
| Instructor autonomy | High in approved courses; blocked when instructor tries to set new agenda | High; able to redesign mediocre courses substantially | Moderate; feedback invited; changes possible but slow | Low; scripts enforced; deviation from course outline prohibited |
| Course design floor | Moderate to good; innovation platformed at conferences | Low in some departments, but instructors can raise it significantly | Moderate; improvement possible through formal channels | Low; standardized design locked in place; improvement blocked |
| Administrative competence | Organized at institutional level; individual leaders can be problematic | Openly nepotistic in some departments; chaotic | Process-heavy; bureaucratic but not malicious | Autocratic; top-down; resistant to instructor input |
| Responsiveness to feedback | Conditional; feedback welcomed if it doesn’t threaten turf | Poor in some departments; better in some CE and program areas | Formal feedback loops exist and are sometimes acted on | Minimal; format changes (e.g., 10-day to 3-week course) without examining practice |
| How success happens | Within-system; platform exists if you work through it | Despite the system; instructor quietly subverts weak design | Sometimes through the system; CAE provides some scaffolding | Against the system; individual instructor effort invisible and unsupported |
| Invisible labour recognized? | Partially; contributions visible if institutionally legible | No; extra loads absorbed without career benefit | No; CE contributions systematically underattributed | No; nightly support work and course redesign entirely invisible to admin |
| Career pathways for part-time faculty | Limited; proposal acceptance possible but department politics decisive | Non-existent in observed departments; long-serving part-timers never converted | Some openness; CE relationships can lead to credit-side entry | Not observed; contractual and temporary by design |
| Security and infrastructure | Adequate for credit courses; CE more variable | Weak; equipment and physical environment unreliable in some areas | Porous, especially in CE; expensive equipment vulnerable to theft | Standardized but thin; infrastructure serves metrics, not practice |
| Student outcomes under good instruction | Strong; students respond well to experimental, thinking-heavy tasks | Strong; students exceed deficit expectations when given real-world assignments | Moderate to strong; students capable when given adequate support | Variable; compressed formats hostile to learning without instructor heroics |

Sheridan: Organized, Faculty-Development Friendly, Territorially Political

What it did well

Sheridan, at least in the early-to-mid 2000s, had the most coherent faculty development culture of the four institutions. A new teaching credential for faculty was initiated by a professor who designed it around what instructors actually needed from classroom experience: a faculty-led, practitioner-driven model that aligns with what research identifies as the most effective form of faculty development, responsive to real teaching contexts rather than driven by administrative compliance. The credential was complemented by the Make It So Conferences, which created a local community of practice among faculty, an informal but powerful vehicle for sharing teaching innovation.

Instruction at Sheridan was structured around giving faculty real autonomy within approved courses. Experimental course designs (blogs in the classroom, “pop” assignments requiring students to think on their feet) were not just tolerated but platformed at sector conferences. This signals what researchers call a “collegial” teaching culture, one in which faculty expertise is genuinely respected and given room to develop.

Formal policies on faculty educational development and program review at Sheridan later codified these instincts into procedures, with structured support for teaching improvement and program revision.

Where it failed

The collegial surface concealed territorial dynamics around intellectual ownership of courses and prestige content. When an experienced, credentialed instructor proposed a new course that department leadership wanted for itself, the approval process became a slow, opaque block. The successive objections (the title, the textbook, the scope) each collapsed under scrutiny: a textbook used at Ivy League universities was deemed insufficient for Sheridan, and the survey format of the proposed popular culture course was standard at comparable institutions everywhere, revealing the objections as pretextual.

Leadership could be aggressive when challenged but retreated under factual counter-pressure. This is consistent with what research identifies as a gap between institutional rhetoric about teaching excellence and actual distribution of power over curriculum. The practical result: Sheridan could support innovation within existing frameworks, but was politically resistant when a non-insider tried to originate new academic territory.


Mohawk: Deficit-Bureaucratic Chaos

The deficit orientation

Mohawk’s most consistent feature in the observed period was a low-expectation, classist orientation toward its student body, most visible in broadcasting and language studies, where students were characterized as unlikely to find work and given correspondingly minimal academic demands. This is textbook deficit thinking as described in the higher-education literature: attributing low outcomes to student deficiencies (ability, motivation, background) rather than to institutional structures, teaching quality, or assignment design.

The deficit framing was applied unevenly and class-specifically. Students in programs with higher social capital (trades, applied technology) were treated differently from those in language and communication programs, reflecting broader stratification within the college itself.

Sector research has found that Ontario colleges serving high proportions of first-generation, working-class, and newcomer students often develop these deficit cultures as a defensive institutional response to under-resourcing and high attrition, blaming students rather than examining structural causes.

The chaos layer

The administrative culture in Language Studies was openly dysfunctional: nepotistic hiring (family members and their spouses placed into full-time roles for which they were not qualified), jealousy toward faculty with professional credentials outside academia, and a complete absence of basic operational knowledge (course lengths, program requirements). Long-serving part-time faculty had no path to full-time employment regardless of their contribution or student outcomes, consistent with Ontario-wide patterns of chronic reliance on contract faculty with no conversion mechanisms.

The literacy/admissions inversion, accepting students first and assessing literacy readiness in-class afterward, reflects a sector-level problem: HEQCO’s research on college-level literacy found most Ontario colleges assess literacy post-admission rather than pre-admission, creating situations where instructors must simultaneously teach course content and remediate foundational skills.

The rescue capacity

Despite this, Mohawk had one structural feature the other three institutions lacked: high instructor autonomy. An instructor could substantially redesign a mediocre course, replace movie reviews with real journalistic assignments, build portfolio-driven work around live events (a McMaster strike, a MuchMusic VJ interview, a jaywalking crackdown), and the institution would not intervene. The weak bureaucratic culture that made it chaotic also made it porous, and skilled instructors could use that porosity to produce genuinely strong outcomes against the institution’s own low expectations for its students.

The result was a recurring pattern: institutional culture predicted failure; skilled instruction produced success; the institution took no lessons from the discrepancy.


Niagara: Cautious, Process-Heavy, Intermittently Responsive

The formal structure

Niagara’s Centre for Academic Excellence (CAE) is the most formally developed faculty support infrastructure of the four institutions. It provides a structured program review and renewal cycle (six-year, action-plan-driven, curriculum-mapped), explicit processes for program and curriculum change initiated by advisory committees, student feedback, or instructor proposal, and formal course feedback surveys built into every credit course every term.

The CAE explicitly frames its mandate around gathering and responding to feedback, with guidance on how instructors should tell students what changed as a result of their input. This is the most coherent institutional articulation of a feedback loop of the four institutions reviewed here.

The gap between structure and experience

The formal infrastructure did not fully translate into practice. Continuing Education, where much of the teaching observed occurred, was more loosely governed: courses could be developed, iterated, and improved by individual instructors with relative freedom, but the same looseness meant physical security was poor (expensive equipment vulnerable to theft), administrative attribution was partial (significant behind-the-scenes coordination work invisible to the institution), and external disruptions (a professor strike) could destabilize courses with no protective buffer.

When feedback was provided through informal channels, the institution could be genuinely responsive, willing to listen and make changes, but this was uneven and dependent on individual administrators rather than the formal structures the CAE documents. This is the “cautious” quality of the Niagara type: the systems exist and sometimes work, but they require a willing intermediary to activate them.

Comparative position

Niagara sits between Sheridan’s organized-but-political culture and Mohawk’s deficit chaos. It is neither as tightly politically managed as Sheridan nor as structurally dysfunctional as Mohawk. Its closest analogue in the organizational literature is what researchers call a “developmental” teaching culture, one that values improvement and supports it through structures, but has not yet reached the “collegial” level where faculty expertise drives the agenda.


Conestoga: Autocratic, Metrics-Driven Extraction

Standardization as a control mechanism

Conestoga’s most distinctive feature was its use of standardized course shells as a mechanism for control rather than quality. Communications courses placed significant grade weighting on single 3–5 sentence paragraphs at mid-semester, tasks appropriate as Week 1–2 diagnostics in a writing development sequence; the design treated compliance with the course outline as more important than the pedagogical logic of the sequence. Instructors were explicitly prohibited from deviating from the approved outline, even when the existing design was clearly insufficient for the students and context.

This is the inverse of what effective writing instruction looks like: a well-designed communications course scaffolds through authentic genres (memos, complaint letters, reports, emails) that mirror the work students will do professionally, building from simpler to more complex, and allows instructors to adjust pace and emphasis to meet real student needs. Conestoga’s model prioritized cross-section consistency and administrative legibility over any of those considerations.

Compressed formats and invisible heroics

The accelerated PSW Communications course, a 14-week course compressed into 10 days, delivered to a predominantly BIPOC, immigrant, working, and family-bearing cohort, was a product of provincial and institutional pressure to rapidly expand the PSW workforce. Students arrived frightened; the format was hostile; the institution’s response to demonstrated learning problems was a format change (10 days to 3 weeks) that did not address the underlying pedagogical problem or provide support to instructors.

When an instructor established nightly open Zoom office hours, working until every student had what they needed, every night for the duration of the course, students moved from panic to demonstrably competent performance in ten days. The institution neither recognized nor replicated this practice. No one asked if it was sustainable. No support was offered. The learning gain existed entirely in the space between the institutional structure and the instructor’s individual commitment.

This is the defining feature of the Conestoga type: outcomes depend entirely on instructor heroics that the institution systematically ignores, fails to resource, and cannot reproduce.

The professional development mirror

The internal micro-credential apparatus at Conestoga adds a further dimension. Ontario’s micro-credential framework was built around language of “flexible, timely, in-demand skills.” Conestoga adopted it maximally, routing hundreds of faculty through mandatory internal teaching micro-credentials, but the content of those credentials was, in practice, watered-down, dated, and in places incorrect when evaluated against research-grounded comparative benchmarks (e.g., the Harvard Bok Center Higher Education Teaching Certificate).

A Sheridan teaching credential from 2002, faculty-initiated and practitioner-grounded, outperformed the mandatory internal version Conestoga was delivering in 2021. The gap is not incidental: it reflects an institution that used the appearance of quality (completion numbers, credential counts, quality reports) as a substitute for its substance, across two decades of aggressive growth.


Cross-Cutting Findings

Invisible labour is the sector-wide constant

Across all four institutions, the work that actually produced student success (curriculum redesign, after-hours support, portfolio building, literacy scaffolding, industry coaching) was performed by instructors who received no additional recognition, pay, or institutional learning from their effort. The gap between what the course outline said and what the course did in practice was filled entirely by individual discretion and commitment.

This is not a Conestoga anomaly. It is the structural baseline of Ontario’s college sector: chronic reliance on part-time and contract faculty, post-admission literacy triage, and a quality assurance apparatus that measures completion and compliance rather than learning and growth.

The deficit vs. asset divide is the most predictive variable

Of all the differences between these four institutions, the orientation toward students is the most predictive of outcomes: deficit (students are the problem) versus asset (students have capacity that the institution is responsible for developing). Where instructors were supported in treating students as capable, students met the expectation. Where the institutional culture communicated low expectations, those expectations mostly held, not because of the students, but because of what was designed for them.

No student should depend on a single instructor’s empathy

The most consequential structural implication across all four cases is this: student outcomes were far too dependent on whether a given cohort happened to get an instructor willing and able to work against institutional constraints. Radio students who showed up voluntarily on a March break Friday. PSW students who arrived terrified and left confident. Advertising students who enrolled in an Ideas in TV and Film course because the instructor, not the institution, gave them a reason to.

These outcomes were real and durable. They were also not reproducible by the institution, because the institution never understood what produced them. Designing AI-assisted scaffolding, not as a replacement for skilled instructors but as a floor that functions when the skilled instructor is unavailable, is the logical response to a sector that has repeatedly demonstrated it cannot guarantee the conditions for good teaching on its own.


Additional research by Perplexity.