When I was invited to present at a seminar on The Politics and Ethics of Global Urban Practice, I had a mate staying with me. She teaches on the women and gender studies program(me) at Georgetown and, as we shot the breeze about teaching, it struck me that what I wonder about on my bike home is actually central to all of her classes.
I am the woman teaching a wee bit on other people's courses about 'The Environment' or 'Sustainability' or 'Ethics' or some other ambiguous topic that eventually requires our engineering students (reluctantly) to submit a written task. I was that student in the nineties, and those were pretty much the only women who lectured me.
And, worse, I am clearly failing at the ethics bit of this because I had to spend the morning before the seminar dealing with three international students who had cheated the plagiarism-checking algorithm when they submitted their essays to me. Ugh. So, I thought I would blog my presentation just to document some experiments with squeezing politics and ethics into my teaching. Necessarily, episode one of this pedagogical work has been about talking to myself, back then, over there – the 'old me': young, adventurous and dumb. The second element is about mixed feelings: is a little politics and ethics on engineering courses a dangerous thing? In another post, I'll cover the troubling project, funded by UCL Grand Challenges, that really had me questioning this...
Firstly, I should explain four special things about my department (CEGE at UCL):
- CEGE requires high A-level grades (UK exams for 18-year-olds) but no subject prerequisites and (consequently) has about 30% female undergraduates, compared with a national average of 17% on civil engineering courses that usually ask for A-level physics
- Since 2004, CEGE has undergone a massive curriculum reform, shifting away from traditional technical modules towards scenarios and projects focused on applying engineering knowledge in social, economic and environmental contexts
- We are on a cramped London site, with too little lab space and time for students to make things, play with materials and fail with their hands. This gap is not central to this post but I will come back to it at the end
- I’ve been lucky to pick up teaching on unconventional and important engineering courses developed by Sarah Bell (Sustainable Infrastructure, Engineering Thinking, Urban Flooding and Drainage) and Dina D’Ayala (Strengthening of Low-Engineering Buildings)
Ok. Now, in spite of all these great things, remember that I am usually in a room with undergraduate engineers or aspiring (and sometimes experienced) staff from international humanitarian organisations. So I operate on the assumption that many are like the 'old me': drawn to projects, models, solutions, silver bullets, audiobooks not pages, diagrams not paragraphs, matrices not stories, corporate bureaucracy, clarity of hierarchy and so on. As I try to bring my practice (a.k.a. mistakes) to bear on this work, I teach against the pillars of this worldview: processes, codes and categories. I love them all.
At the seminar, it happened that the architects and engineers were the ones with slides and pictures rather than text and words. I hope that, in my spidery illustrations, you will see a further point here that ties into our ways of acquiring and preferring particular kinds of knowledge.
Process: the algorithm might be wrong
This is about challenging an appealing formula by using pictures and stories to question 'normal' sequences, hierarchies and perspectives.
The image contrasts the "standard" algorithm presented in guidance on transitional shelter (p.111 of this pdf) with a photograph that I took about 12 days after the 2010 earthquake in Haiti. In the lecture, students are asked to think about why, when I returned later the same day, people had already gone back to a damaged house at the bottom of the ravine - a ravine that floods with sewage every rainy season - and were hanging out clothes by a flickering lantern at dusk. One answer is that this hillside is an excellent example of risk mitigation. It was a choice of location, from a limited set of choices, that minimised the risk of eviction, reduced rents, cut the time and costs of getting to work and allowed people to live closer to acquaintances already in the city and in buildings carefully adapted to protect from theft and hurricanes.
This is a process that has a history and a specific context that resists the algorithm.
Codes and Standards: critical thresholds and risks
This is about practising the application of codes and standards to show that standards may have unintended and undesirable consequences for engineers keen to do good and make things healthy and safe.
In a lecture on seismic mitigation, we look at the guidance published by a real international organisation in a real country. It is designed to help decide which schools should be retrofitted to withstand an earthquake.
Students are given information about three schools on a map and have to work out whether the retrofit is 'affordable'.
The most remote, tumbledown buildings turn out to be too costly to fix so the priority schools are those closest to roads, cities (and ministries!).
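For readers who want the arithmetic spelt out, here is a minimal sketch of that screening rule, assuming 'affordable' simply means cost per pupil below a fixed cut-off. The school names, costs and threshold are all invented for illustration, not taken from the real guidance.

```python
# Toy version of the retrofit 'affordability' screen.
# Every figure here is invented for illustration.

schools = [
    # (name, structural retrofit cost, access/transport cost, pupils)
    ("roadside school", 40_000, 2_000, 400),
    ("edge-of-town school", 55_000, 8_000, 300),
    ("remote mountain school", 70_000, 45_000, 150),
]

THRESHOLD_PER_PUPIL = 250  # invented affordability cut-off

for name, retrofit, access, pupils in schools:
    cost_per_pupil = (retrofit + access) / pupils
    verdict = "retrofit" if cost_per_pupil <= THRESHOLD_PER_PUPIL else "too costly"
    print(f"{name}: {cost_per_pupil:.0f} per pupil -> {verdict}")
```

Note that, in this sketch, it is the access costs rather than the buildings that push the remote school over the threshold.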
Students are then given information about the livelihoods of families in each place and have to comment on the possibilities of spatial or distributional injustice and whether mitigating one risk could embed or exacerbate others.
In a lecture on water quality, students are asked to comment on the risky features of a water system using a sketch taken from WHO sanitary inspection forms (p. 150 Annex 2): tools for non-engineers to identify the worst risks to health in their own water systems.
The process of measuring water quality is explained as expensive, slow and dependent on laboratory infrastructure and engineers. Students are then asked which poses the greater risk for people trying to manage their own water supplies: visual inspection by community members or high-tech testing by experts.
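The scoring logic behind those forms is simple enough to sketch in a few lines, which is rather the point: no laboratory, no courier, no engineer. The checklist items below are paraphrased for illustration; the real WHO forms are specific to each type of water supply.

```python
# Sketch of a WHO-style sanitary inspection: yes/no observations,
# scored by counting the 'yes' answers.

inspection = {
    "latrine within 10 m of the well": True,
    "stagnant water pooling around the wellhead": True,
    "well cover absent or cracked": True,
    "rope and bucket left on the ground": False,
    "fence missing, so animals can reach the well": False,
}

score = sum(inspection.values())  # True counts as 1
print(f"sanitary risk score: {score} / {len(inspection)}")
```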
In both cases, students are asked: should people in poor countries accept lower standards for building codes and drinking water quality? Could imposing high, unachievable standards for particular risks, or for 'sexy' problems we think we can solve, create greater risks in the aggregate for those "other" people? This provokes but does not resolve further questions: what is an acceptable risk, and who decides what is acceptable?
To bring this closer to home and avoid a "them and us" debate, students are introduced to the QALY (quality-adjusted life year) system for deciding on health investments in the UK. Crudely, if a new intervention can extend life for less than, say, £22k per quality-adjusted year, the country invests in it. Then a student has to read out this quote to show the process and political content of the decision:
“the threshold was not based on "empirical research" as no such research existed anywhere in the world … [it is] really based on the collective judgment of the health economists we have approached across the country. There is no known piece of work which tells you what the threshold should be.”
Former Chairman of NICE, Professor Michael Rawlins (House of Commons, 2007)
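The arithmetic the students apply is deliberately trivial; as the quote makes clear, the politics all sit in the threshold. A minimal sketch of the decision rule, with both example interventions and their figures invented:

```python
# The QALY threshold rule, reduced to its arithmetic.
# The £22k figure follows the text above; the two example
# interventions are invented.

THRESHOLD = 22_000  # pounds per quality-adjusted life year

def fund(cost: float, qalys_gained: float) -> bool:
    return cost / qalys_gained <= THRESHOLD

print(fund(60_000, 3.0))   # £20,000 per QALY -> True: invest
print(fund(100_000, 2.0))  # £50,000 per QALY -> False: do not
```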
To wrap up, and to link this to professional codes of conduct and standards of behaviour, we look at two ideas elaborated by Jonathan Wolff at UCL. The first is that, as professionals, we have to account for our decisions. Unlike a doctor, who is obliged (fraught with power and knowledge asymmetries as this may be) to sit with each patient to seek consent to treatment, infrastructure engineers cannot sit with each rail passenger to outline the risks of riding a train. The second is that, instead of informed consent, society may prefer what Wolff calls “comfortable de-focusing”. This, then, has implications for our technical formulation of risk (Wolff, 2006):
risk = hazard x likelihood
becomes:
risk = hazard x likelihood x blame x shame
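To see what the extra terms do, here is a toy numerical reading. The multipliers are entirely invented; Wolff attaches no numbers, and his point is precisely that these terms resist calculation.

```python
# Toy reading of Wolff's extended risk formulation,
# with invented blame and shame weights.

def technical_risk(hazard, likelihood):
    return hazard * likelihood

def social_risk(hazard, likelihood, blame, shame):
    return hazard * likelihood * blame * shame

# A rail crash and a road crash with identical technical risk...
print(technical_risk(100, 0.001))  # 0.1 for both

# ...diverge once there is an operator to blame and outrage to answer for.
print(social_risk(100, 0.001, blame=3.0, shame=2.0))  # rail: 0.6
print(social_risk(100, 0.001, blame=1.0, shame=1.0))  # road: 0.1
```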
And this goes beyond calculations and liabilities to the institutional, social and political context of engineering.
Categories: it matters who decides and how
Whether you end up 'living above or below the algorithm' depends on a whole series of technical and political processes which, depending on the context, implicate and have implications for engineers.
In this example, students are read a quote from an engineer after the Christchurch earthquake to illustrate the consequences of categories which are decided by politicians on the advice of engineers:
“In retrospect, I wish we had just had red zones (uninhabitable) and green zones (you can go home)… We thought, as engineers, that where we needed more detailed assessment, we should mark the zone as orange. Then we could investigate before letting people go home. But, in fact, the uncertainty, the not knowing…. it was worse for those people. People from the orange zones had higher rates of depression and suicide.”
Students are asked to think of other similar examples.
In this example, students are introduced to simple categories of "vulnerability" used by international organisations to target support to the most needy. Then they are told a story about the 1995 earthquake in Kobe to look at how "being old" was not in itself the vulnerability; rather, many older people lived in old, wooden houses in the old city centre. Their modest incomes came from letting out rooms to poorer, younger people looking for work. These houses were disproportionately affected by the earthquake and the fire that followed, so this population was disproportionately represented in temporary housing. And because their housing was their only asset and their income from tenants had dried up, this group found it hard to borrow money. And because their housing was high density and on valuable land, with multiple occupants and owners, and the layout pre-dated the latest urban planning standards, reconstruction was delayed. Students are asked whether they think vulnerability is a property of a person, a structure or a system.
These images also show how quickly the neat categories get tangled and ambiguous when a story is told or a picture is drawn. These categories are not amenable to diagrams.
Humility and Curiosity: necessary but insufficient?
In all cases, I was hoping, at least, to seed curiosity and humility by making the examples real, the stories personal and the dilemmas uncomfortable. The sessions were supposed to provoke awareness of alternative possibilities (extending the concepts or frameworks). When we worked through examples, there was also a chance to practise, to navigate and to discover that questions could not be resolved (experiential, discursive, uncomfortable). In most of our courses, however, the objective is to become fluent and confident (assessed, achieved, solved). If I had left students feeling fluent and confident with any of this material, I would surely have failed.
But what if these students were like the old me?
What if learning was just an accidental side-effect of wanting to solve or win? For me, this meant that I did not experience certain ways of knowing, and I enjoyed a certain laziness about material that could not be acquired this way. Indeed, to my horror, I often see that same laziness in myself and sense it in our students.
The next post will look at why this is so very troubling...