Monday 29 November 2021

The North Wind doeth blow


View from my bedroom window
Storm Arwen tore through my wood on Friday night, and caused very significant damage.

There is no significant damage to the house; certainly nothing which compromises its integrity. The wind turbine is still standing and still working. The cattle shed is also undamaged.

A strip about 15 to 20 metres wide through the wood, from the Summer Palace glade to the blow from three years ago, has lost substantially all its trees: it is just utter chaos. This is about a quarter to a third of the whole area of the wood, and includes some of the largest trees. The Summer Palace itself is entirely gone. The wreckage is immediately behind the house, and three trees fell on the house itself.

Around the area where the trees have all fallen, there are further trees which are still standing but unstable. The parts of the wood that remain largely unaffected are:

* A strip around five metres wide along the east side of the wood;
* A strip at least 15 metres wide along the south edge of the wood;
* A more substantial area of at least thirty metres wide along the west edge, although this is affected by a much smaller blow in the south-west corner which happened five years ago.

The fallen trees are going to have to be hauled or winched out into Commons Meadow, which means I'm going to need a new 12-foot gate. But that fence is going to need work anyway, because one of the strainer posts was braced back to a tree in the wood, and consequently part of the fence has been demolished.

There are more pictures here.

Tuesday 16 November 2021

Open Source Climate Models: initial review


Climate models are normally built to do real science; that is not my aim. Rather, I want something which will form a component of an educational game which allows players to make policy decisions to attempt to maintain the climate within ‘safe’ limits, given constraints of population, consumption, demand for strategic materials and so on.

Consequently, I need the model to run on the ordinary PCs that people may be expected to have, or perhaps even on something like an Xbox. It also needs to be able to simulate a year in at most about twenty minutes, with enough processor cycles left free to run user interface code.

It’s quite likely that no existing climate model will work under these constraints.

Systems considered

| Name | Language | Licence | Status | Documentation | Builds? |
|------|----------|---------|--------|---------------|---------|
| ClimateMachine | Julia | Apache License v2.0 | Released | Present | Failing, probably fixable |
| Community Earth System Model | Fortran, C, Python, Perl | BSD-style | Released | Present | Failing |
| Hector | C++, R | GPL v3 | Released | Present | Apparently successful |
| E3SM | Fortran, C | BSD-style | Not suitable for consumer-grade machines | Present | Not attempted |
| atlas | C, Fortran | Apache License v2.0 | Probably too low level | Minimal | Not attempted |
| Isca | Fortran, Python | GPL v3 | Released | Present | Apparently successful, documentation slightly wrong |

Climate Machine

Build failed with the following output:

ERROR: LoadError: UndefVarError: LLVMPtr not defined
 [1] include(::Module, ::String) at ./Base.jl:377
 [2] top-level scope at none:2
 [3] eval at ./boot.jl:331 [inlined]
 [4] eval(::Expr) at ./client.jl:449
 [5] top-level scope at ./none:3
in expression starting at /home/simon/.julia/packages/CUDA/wTQsK/src/CUDA.jl:9
ERROR: LoadError: LoadError: Failed to precompile CUDA [052768ef-5323-5732-b1bb-66c8b64840ba] to /home/simon/.julia/compiled/v1.4/CUDA/oWw5k_BxRo2.ji.
 [1] error(::String) at ./error.jl:33
 [2] compilecache(::Base.PkgId, ::String) at ./loading.jl:1272
 [3] _require(::Base.PkgId) at ./loading.jl:1029
 [4] require(::Base.PkgId) at ./loading.jl:927
 [5] require(::Module, ::Symbol) at ./loading.jl:922
 [6] include(::Module, ::String) at ./Base.jl:377
 [7] include(::String) at /home/simon/tmp/climate/ClimateMachine.jl/src/ClimateMachine.jl:1
 [8] top-level scope at /home/simon/tmp/climate/ClimateMachine.jl/src/ClimateMachine.jl:12
 [9] include(::Module, ::String) at ./Base.jl:377
 [10] top-level scope at none:2
 [11] eval at ./boot.jl:331 [inlined]
 [12] eval(::Expr) at ./client.jl:449
 [13] top-level scope at ./none:3
in expression starting at /home/simon/tmp/climate/ClimateMachine.jl/src/Arrays/MPIStateArrays.jl:3
in expression starting at /home/simon/tmp/climate/ClimateMachine.jl/src/ClimateMachine.jl:12

The error appears to be caused by a problem in Julia’s CUDA library, which should hand off computation to my graphics processor (which would be a good thing as this has some serious compute power).

All tests also fail, but that is almost certainly because the build failed.

This is probably fixable without a huge amount of effort.

Overall, although I have no experience with Julia, the codebase looks very clean and well designed. The installation process was big and complex, but ran commendably cleanly, with no installation problems.

Community Earth System Model

The build instructions appear to be incomplete. Nothing compiles. There is no executable product. I don’t even know where to start with attempting to investigate further.


Hector

The pure R build appeared to work cleanly, but I didn’t understand what I’d got well enough to carry out any meaningful tests. I think it worked.

The Makefile (standalone) build also appeared to build satisfactorily, produced an executable, and I was able to use this to do a test run; but again I don’t understand what I’m doing well enough to interpret what I got. Still, this is promising.


E3SM

From the documentation it appeared exceedingly unlikely that E3SM would run satisfactorily on the hardware available to me, so I didn’t attempt it.


atlas

I think atlas is probably a useful library for people who know how to build climate models, but it’s too low level for what I want to do. Build was not attempted.


Isca

Installation/build appeared to work correctly, but there was a slight error in the build documentation.

Where the documentation says to run

(isca_env)$ pip install -e .

I got the following error:

(isca_env) simon@mason:~/tmp/climate/Isca$ pip install -e .
Obtaining file:///home/simon/tmp/climate/Isca
ERROR: file:///home/simon/tmp/climate/Isca does not appear to be a Python project: neither 'setup.py' nor 'pyproject.toml' found.

I found a file under src/extra/python/, so I ran

pip install -e src/extra/python/

This gave the following output:

Obtaining file:///home/simon/tmp/climate/Isca/src/extra/python
  Preparing metadata (setup.py) ... done
Requirement already satisfied: sh in /home/simon/bin/miniforge3/envs/isca_env/lib/python3.9/site-packages (from Isca==0.2) (1.13.1)
Requirement already satisfied: jinja2 in /home/simon/bin/miniforge3/envs/isca_env/lib/python3.9/site-packages (from Isca==0.2) (3.0.3)
Requirement already satisfied: f90nml in /home/simon/bin/miniforge3/envs/isca_env/lib/python3.9/site-packages (from Isca==0.2) (1.3.1)
Requirement already satisfied: numpy in /home/simon/bin/miniforge3/envs/isca_env/lib/python3.9/site-packages (from Isca==0.2) (1.21.4)
Requirement already satisfied: pandas in /home/simon/bin/miniforge3/envs/isca_env/lib/python3.9/site-packages (from Isca==0.2) (1.3.4)
Requirement already satisfied: xarray in /home/simon/bin/miniforge3/envs/isca_env/lib/python3.9/site-packages (from Isca==0.2) (0.20.1)
Requirement already satisfied: MarkupSafe>=2.0 in /home/simon/bin/miniforge3/envs/isca_env/lib/python3.9/site-packages (from jinja2->Isca==0.2) (2.0.1)
Requirement already satisfied: python-dateutil>=2.7.3 in /home/simon/bin/miniforge3/envs/isca_env/lib/python3.9/site-packages (from pandas->Isca==0.2) (2.8.2)
Requirement already satisfied: pytz>=2017.3 in /home/simon/bin/miniforge3/envs/isca_env/lib/python3.9/site-packages (from pandas->Isca==0.2) (2021.3)
Requirement already satisfied: six>=1.5 in /home/simon/bin/miniforge3/envs/isca_env/lib/python3.9/site-packages (from python-dateutil>=2.7.3->pandas->Isca==0.2) (1.16.0)
Installing collected packages: Isca
  Running develop for Isca
Successfully installed Isca-0.2

So I think that worked satisfactorily. To test it I then attempted to run a test case, which failed, complaining about missing environment variables. So I think the installation is working, and I just need to read the documentation more carefully to get it running.
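
For what it’s worth, my reading of the Isca documentation is that its run scripts expect a handful of GFDL_* environment variables to be set before an experiment will run; the variable names and example paths below are my assumption from those docs, not something taken from this post, so check Isca’s own documentation for the authoritative list:

```python
# Hypothetical sketch: environment variables which, on my reading of the
# Isca docs, must be set before a test case will run. Paths are examples.
import os

os.environ["GFDL_BASE"] = "/home/simon/tmp/climate/Isca"  # the source checkout
os.environ["GFDL_ENV"] = "gfortran"         # which compiler environment to use
os.environ["GFDL_WORK"] = "/tmp/isca_work"  # scratch space for compilation
os.environ["GFDL_DATA"] = "/tmp/isca_data"  # where model output is written

# With these set, one of the bundled test cases can then be launched, e.g.
#   python exp/test_cases/held_suarez/held_suarez_test_case.py
```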

It needs to be said this was also a very big, complex installation process and the fact that it all ran cleanly is very commendable: this isn’t easy.


Conclusions

All climate models are complex pieces of software and, being mathematically intensive, tend to be written in relatively special-purpose languages (R, Julia, Fortran) with which I’m not familiar. I’m also intensely prejudiced against Python, because I hate significant white space, so where systems use Python as a front-end scripting language, that isn’t familiar territory for me either.

Hector and Isca built satisfactorily without much difficulty. Hector successfully ran test cases (and commendably quickly), although it didn’t produce output I am able to interpret at this stage. Climate Machine didn’t build, but I think this is fixable with a little more work.

All three of these systems are promising and worthy of further investigation. Whether any will do what I need I am not yet certain.

Monday 15 November 2021

The Everyone Dies Event Class

Lytton burns
The climate, globally, is warming. Everyone acknowledges that. It’s not warming equally, or consistently, or evenly; I think everyone acknowledges that as well. Rather, the atmosphere is a heat engine: as you put more energy into it in the form of heat, you get more work out of it, in the form of turbulence. Winds get stronger, precipitation more intense, and heat waves hotter.

Human beings function in a fairly constrained temperature band. Healthy body temperature is 37° Celsius, plus or minus about one degree. The human body cools itself by evaporation. If water can’t evaporate from your skin, you can no longer cool yourself; instead, you take on heat from the environment. A body temperature above 40° Celsius is a life-threatening emergency, and above 42.3°, denaturing of proteins, especially in the brain, may occur rapidly. This is not survivable.

But the operation of the human body generates heat continually. The beating of the heart generates heat. The movement of the chest cavity in breathing generates heat. Even brain activity – thinking – generates heat.

So in order to survive we need to be able to dump a small amount of heat into our environment continually. If the air is humid enough that sweat can’t evaporate, we need a small heat gradient to make that possible. So we need it to be actually cooler than 37°, and the survivable number generally quoted is a “wet bulb temperature” of 35°.

What does ‘wet bulb temperature’ mean? It is the lowest temperature to which air can be cooled by evaporating water into it, and it’s a function of the actual (‘dry bulb’) temperature of the air and of the humidity: the saturation of water vapour in the air. As the air becomes increasingly humid, the wet bulb temperature rises towards the air temperature, because evaporation becomes less and less effective. In very dry atmospheric conditions, you can easily survive air temperatures well above 35° Celsius, provided you can drink enough fluids to enable you to sweat.
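
This relationship can be made concrete with Stull’s empirical approximation (2011), which estimates wet bulb temperature from air temperature and relative humidity. The sketch below is mine, added for illustration; the formula is only valid roughly between 5% and 99% humidity and between -20° and 50° Celsius:

```python
import math

def wet_bulb(temp_c, rh_percent):
    """Stull's empirical approximation of wet-bulb temperature (deg C),
    given dry-bulb air temperature (deg C) and relative humidity (%).
    Accurate to within about 1 deg C for RH 5-99%, T -20 to 50 deg C."""
    T, RH = temp_c, rh_percent
    return (T * math.atan(0.151977 * math.sqrt(RH + 8.313659))
            + math.atan(T + RH)
            - math.atan(RH - 1.676331)
            + 0.00391838 * RH ** 1.5 * math.atan(0.023101 * RH)
            - 4.686035)

# Dry heat is survivable: 45 deg C air at 10% humidity has a wet bulb
# temperature of only about 21 deg C...
print(round(wet_bulb(45, 10), 1))
# ...but at 100% humidity the wet bulb temperature is essentially the
# air temperature itself, so saturated 35 deg C air is at the limit.
print(round(wet_bulb(35, 100), 1))
```

This is why the deadly combinations are humid ones: a ‘wet bulb temperature of 35°’ needs either extreme dry heat or merely very hot air at near-total saturation.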

So: there is a combination of temperature and humidity at which everyone exposed to the air just dies, and dies pretty quickly. How close are we to hitting those conditions?

The answer, of course, is that there are places on Earth which regularly exceeded those limits even before the days of significant anthropogenic warming, but people don’t live in those places. Examples include Death Valley in California, parts of the Arabian Peninsula and of the Sahara, and Pakistan’s northern Sindh province, in which a Victorian British Brigadier had the bright idea of building a city. What’s interesting is that these are not generally humid places; on the contrary, they’re exceptionally dry.

But the consequence of anthropogenic heating is that we’re seeing both higher average temperatures and higher variations in temperature. Which means we’ve been seeing a lot more exceptional heat events than we’ve been used to.

Over the past decade, ten places on Earth have recorded wet bulb temperatures at or above the theoretical ‘everybody dies’ limit. So far, we haven’t had an event in which a very large proportion of a population has died suddenly of heat stroke, despite the fact that three of the places which have seen the highest wet-bulb temperatures – Ras Al Khaimah, Jacobabad and Mecca – have significant populations.

But we cannot be very far away from an ‘everybody dies’ event, and the first probably won’t be somewhere that’s accustomed to very high temperatures.

Lytton, in British Columbia, Canada, wasn’t, until this year, accustomed to exceptionally high temperatures. Yes, it had frequently been the hottest place in Canada, but Canada is not on the whole a very hot place. The extreme heat event that hit Lytton in June 2021 – 49.6° Celsius – was fully 5° Celsius warmer than had ever been recorded there before. This is, as I’ve said before, because having more heat in the atmosphere makes it more turbulent and thus more heterogeneous.

Lytton is 50°13’52" North. For comparison, that’s about eleven miles south of Plymouth in Devon, England, or about six miles north of Prague, in the Czech Republic.

Lytton didn’t quite exceed the ‘everybody dies’ limit, although it came very close. But it did exceed the ‘everything burns’ limit, which is why it isn’t there any more. And what Lytton proves is that, in an era of increasingly unstable weather, extreme heat events do not only happen in the tropics.


To summarise:

  1. We’re currently at 1.19° Celsius of anthropogenic warming over pre-industrial levels;
  2. At 1.19°, we’re seeing local temperature records being exceeded by 5° not only in the tropics but even in temperate zones;
  3. At 1.19°, our margin of safety for ‘everyone dies’ events in previously habitable zones appears to be completely exhausted;
  4. If capitalism continues (SSP4), we’re heading for 3.2° Celsius of warming; even on the much more optimistic SSP2 track, we’re now heading for 2.7°.

So: if at 1.19° average warming, we’re seeing local records broken by 5°, by what amount will we see them broken at 3.2° average? This isn’t going to be a simple linear curve; it seems to me that it must be at least to some degree exponential, because there are clearly accelerating effects and feedback loops in there.

But let’s assume that it’s simply linear. Then we would be looking at current local records being exceeded by fifteen degrees, more or less anywhere in tropical or temperate zones. Indeed, given what we saw in Siberia in 2020, we could easily see ‘everyone dies’ events happening as far north as the Arctic Circle. Obviously, they will be more common in the tropics. But no currently populous place on Earth will be immune.
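
As a sanity check on that linear assumption, the scaling is trivial to compute (my sketch, using only the figures quoted above):

```python
# Scale the record-breaking excess linearly with mean warming,
# as the simple linear assumption in the text suggests.
current_warming = 1.19   # deg C above pre-industrial (from the text)
current_excess = 5.0     # deg C by which local records are being broken
future_warming = 3.2     # deg C, the warming track quoted for SSP4

future_excess = current_excess * future_warming / current_warming
print(round(future_excess, 1))  # about 13.4 deg C, before any
# accelerating feedback effects, which would push the real figure
# higher still, towards the fifteen degrees suggested above.
```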

So, just sit for a moment, and imagine. It’s 2035. It’s June. You’ve seen ‘everyone dies’ events, covering thousands of square kilometres, happening in other parts of the world already. Now the weather forecast tells you that there’s a 30% probability of an ‘everyone dies’ event in the area where you live in the next ten days.

What do you do?

That weather forecast is going to trigger everyone who can move to move. It will cause gridlock on every means of transport. It will cause mass civil unrest. A fortnight later we’ll see news reports of ‘rescue’ workers – probably troops – going into the places where people gathered in their last desperate attempts to survive, and finding that they have died in heaps. Those places may be subway stations, or underground vaults; they may be public buildings where there had been some hope that the air conditioning would not fail.

And each such event will leave a dead zone behind it to which – although we’ll know that these events will happen more or less randomly, like lightning strikes – few people will want to return.

Let’s be clear about this. 3.2° of warming isn’t ‘survivable’. 2.7° of warming isn’t survivable. We’re on the very edge of seeing ‘everyone dies’ events, covering areas the size of small European countries, now, at 1.19°. We cannot afford any more warming.

Yet already we’re hitting cascade effects. As the Arctic warms, so methane frozen in permafrost or beneath the sea thaws and is released into the atmosphere, ratcheting up warming. As ice melts, new, darker sea and land surfaces are exposed, ratcheting up warming. As ocean circulation systems are disrupted, their moderating effects break down, ratcheting up warming. Even if we all stop burning fossil fuels today, the temperature will still rise to about 1.6° Celsius above pre-industrial levels.

We are already in the disaster zone. Every barrel of oil we pump makes it worse. We have to just stop.

Monday 1 November 2021

Owning Scotland's Land

Scotland's land. 
OK, yes, it is my croft.
The white saviours, led by Benedict Macdonald, are again taking up the white man's burden to save Scotland's 'Wild Land' from the wild Scots. His business model is essentially to rent-seek off subsidies provided by the Scottish government intended to support rural communities and rural development, and divert the money, instead, to his friends in the City of London. Yes, this is just the latest cover of a very old tune, but it's long past time we said 'enough'.

The Right Solution

It remains my view that the right solution to Scotland's land problem is to apply a single, universal, highly progressive land tax to every square metre of Scotland, with the proceeds going to local government at a layer far closer to our current community councils than to our present regional councils. This taxation should be so progressive that it would essentially bankrupt all large estates immediately, causing them to surrender the vast majority of their land to those same councils, which would hold it as common.

These councils would be empowered to let parts of their common land to individuals on a non-heritable liferent basis, and to let parts to corporate entities on a maximum term of fifty years, but they would not be empowered to sell common land. At all. Ever.

However, for this solution to be implemented, we'd need a bold, far-sighted and progressive Scottish government, and there is at present no prospect of our ever having one. So we need to look for other solutions.

The Co-operative solution

So, let us suppose we set up a co-operative fund, called for the remainder of this document 'The Co-op Land Fund'. This fund would sell shareholdings to members of the public, who could put in any amount (over a set minimum of perhaps £100); but each member would get only one vote, irrespective of the amount of money they had put in.

This fund would buy parcels of land, normally in blocks of over 100 hectares, and would lease those parcels of land on long leases to further co-operatives ('management co-ops') which would manage and/or occupy that land (ideally, both).

There are two potential models for this:

  1. The Co-op Land Fund could speculatively buy a parcel of land and invite proposals from people interested in forming management co-ops to manage/occupy it;

  2. A nascent management co-op could approach the Co-op Land Fund and say 'there is this particular parcel of land which is now on the market, which we would like to manage/occupy; will you buy it and lease it to us?'

Obviously, the Co-op Land Fund could, and I hope would, operate both these models.

The idea here is that the management co-ops would either represent existing communities of place - a village would set up its own co-op to bid for land in its locality - or intentional communities which had the intention of becoming communities of place, by co-residing on the parcel of land which they leased.

Ideally, the management co-ops would hold the land more or less in perpetuity. They would pay a rental to the Co-op Land Fund, which could provide, in addition to finance for land, also finance for buildings and capital plant, and could also provide land management and ecological advice services.

Management co-ops should be essentially local, holding one parcel of land or at most a few parcels of land within easy cycling distance of one another. Everyone living on, or working on, the land held by a management co-op should be entitled to membership of that co-op.

In short, I view these management co-ops as being essentially something like Standingstone.

The Co-op Land Fund should set broad guidance as to which activities and developments should be encouraged, which discouraged, and which disallowed on these parcels of land; this guidance should generally be designed:

  1. To maintain the land in good long-term ecological health;

  2. As part of that, to sequester carbon;

  3. To produce strategic goods including food, natural fibres, timber and electrical power;

  4. To foster community and repopulation.


Thursday 19 August 2021

Where's the steel?

A blast furnace in operation
From the discovery of iron working techniques, about 3,200 years ago, up until the widespread exploitation of fossil fuels, about 250 years ago, iron and steel were rare, precious materials. The average person, across the whole world, almost certainly had less than 500 grammes of it. A knife, probably; some tool of their trade, possibly. Even members of the elite -- warriors who fought in full armour, for example -- probably owned no more than 30 kg of iron and steel.

The use of fossil fuel changed all that, of course. There's about one car for every two people in the UK, and the average car now weighs 1,857 kg, so that's almost a tonne per person in cars alone, not to mention all the steel we now have in buildings and infrastructure. But it's fossil fuels that have made that possible. In future, we can't use them. So how much steel will we have?

Steel costs about 4 MWh per tonne to make. Current production of steel is about 1.8 billion tonnes per annum. That's about 7.2 billion MWh, or 7.2 million GWh, or 7,200 terawatt hours per annum -- which is 23% of total world industrial energy consumption.

Total world electricity production is around 27,000 terawatt hours, so making all our steel electrically would take over a quarter of our total electricity generation. But only 28%, or 7,560 TWh, of this is renewable. In other words, if we converted all our current steel-making capacity to electric arc, it would use virtually all of the world's production of renewable energy.

About 260 GW of new renewable generating capacity is being added annually, but that figure is a bit misleading, since renewable plant cannot operate at full capacity all the time. Solar panels only operate during hours of daylight, and at full capacity only when the sky is cloudless and the sun is significantly above the horizon. Wind turbines operate at full capacity only in a fairly narrow band of wind speeds. So 260 GW of capacity does not translate into 2,277 TWh of electricity actually produced per year, but much less. How much less? I don't know, but about a third of that, or roughly 760 TWh, seems a reasonable guess.

However, only a proportion of steel is made using electric arc furnaces; the rest is made using fossil fuels, largely coal. The exact proportion is hard to establish, since, especially in the West, new electric arc capacity is being built quickly; but the best figure I can get is around 29%. That means, of course, that of the 29% of steel which is made electrically, only 28% -- or about 8% of total steel making -- is carbon neutral.

It also means that, disregarding the proportion of existing electric arc furnaces which are using fossil-fuel generated electricity, it would take seven years of our total new renewable energy capacity to replace existing fossil fuel steel making capacity with renewable. And that's before a single joule of electricity becomes available to power any of the new electric cars, electric trucks, electric trains, etc, that we want to build with that steel.
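
The arithmetic in the last few paragraphs is easy to check (a sketch using only the round figures quoted above):

```python
# Checking the back-of-envelope figures quoted in the text.
steel_tonnes = 1.8e9         # annual world steel production, tonnes
energy_per_tonne_mwh = 4.0   # MWh of energy needed per tonne of steel

steel_energy_twh = steel_tonnes * energy_per_tonne_mwh / 1e6  # MWh -> TWh
print(steel_energy_twh)      # 7200.0 TWh per year

world_electricity_twh = 27_000
renewable_twh = world_electricity_twh * 0.28     # 7560 TWh renewable

# 260 GW of new renewable capacity a year, at roughly a one-third
# capacity factor, yields around 760 TWh of new annual production.
nameplate_twh = 260 * 8760 / 1000   # ~2278 TWh if always at full power
actual_new_twh = nameplate_twh / 3  # ~760 TWh

# About 71% of steel is still made with fossil fuels; years of new
# renewable build-out needed just to electrify that:
fossil_steel_twh = steel_energy_twh * 0.71
print(round(fossil_steel_twh / actual_new_twh, 1))  # ~6.7: about seven years
```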

Except we couldn't build the new capacity that quickly, because it takes (a lot of) steel to build both new steelmaking plant and new renewable electricity generating capacity. We are going to have to shut down the approximately 70% of steelmaking capacity that is fossil fuel powered, and we're going to have to shut it down soon. And in a world critically short of a material as strategic as steel, making the right choices about how to allocate it is going to be hard and contentious.

So no. We're not all going to have electric cars. We're going to have a lot less steel.

Wednesday 21 April 2021

Death, Glory, and computer games

the rookie, quick

Let's suppose for a moment that you're a member -- the most junior member, the rookie -- of a squad. Your squad may be police, it may be corporate security, it may be a criminal gang; this doesn't matter. What does matter is that it exists in an environment in which all these things exist, and compete; in which they compete using lethal force.

As the rookie, you've been issued with a weapon. It isn't a very good weapon; it's old, worn out, not particularly powerful; and you're not yet very skilled in its use. But this doesn't matter; there are half a dozen other folk in your squad who are all more experienced and better armed than you. Your leader is a very experienced -- famous, perhaps notorious -- combat veteran. You feel safe, and your squad is moderately successful.

the killer in the dark

Then, one dark night, within the space of about five minutes, your comrades are picked off one by one by an unknown assailant using a sniper rifle from cover. Most of them never see him; most of them don't get a single shot off on target.

And now they're all -- all your experienced, well armed, competent comrades -- dead. Every one of them, dead. There's only you left. There's nothing in particular to defend. No harmless civilians depending on your protection, no pass that must not be sold, not even any very significant amount of booty.


In those circumstances, what do you, the last survivor, the poorly armed rookie, do?

Do you hide, in the dark, in an environment that is full of hiding places? Do you flee into the complex maze of streets and alleys around you, or in one of the fast getaway cars that your squad so often has parked around the scenes of such encounters? Do you stay alive at all costs, to inform your superiors, the rest of your faction, of what has happened, of this new threat they face? Do you throw down your weapon, put your hands up, and beg for mercy?

Or do you fight, to certain death, against a warrior you already know is far more deadly -- and far better armed -- than you are?

the rookie, dead

And yet, this is what (almost) all such characters in (almost) all video games do.

The game I am playing at present is Cyberpunk, so the game I am ranting about at present is Cyberpunk. But that's unfair, because this behaviour is ubiquitous. The very rare exceptions (and as far as I'm aware, outside the 'Beat on the Brat' boxing competition, there are none in Cyberpunk) are specifically scripted individual events. The default behaviour for all opponent NPCs is to fight to the death, to fight without quarter given, to the very last man or, all too often, woman.

In Cyberpunk, enemies who are hunting you (yes, even that poor unfortunate last rookie) will call out "come out with your hands in the air," "we only want to talk to you," or "I promise not to kill you."

These apparent offers of quarter are in fact insincere. You are given no way to respond to them directly; but if you do put your weapons down and just stand there, you will be killed. Equally, you are given no way to call on your victims to surrender, or to offer them quarter.

True, you, the player, can walk away from a fight, and (usually) the enemy won't pursue you very far. But, while you remain in the area in which the fight broke out, any leftover survivors will continue, suicidally, to attack.

And surely -- surely -- it can't just be for me that this universal, this irrational, this suicidal refusal either to surrender or to flee wrenches at the willing suspension of disbelief?

A digression: 'non-lethal' weapons

Of course, also, in Cyberpunk, you can use 'non lethal' weapons. 'Non lethal' weapons include most or all blunt melee weapons -- clubs, batons, et cetera -- but also include firearms to which you have added a modifier called 'Pax' which somehow, magically, renders the firearms 'non lethal'.

You can probably guess from all the scare-quotes that I consider this a moral cop-out, and it is. Anyone with experience in the real world knows that blows with a blunt instrument sufficiently forceful to knock the victim out for a substantial time have a substantial chance of causing permanent brain injury or death. Anyone with experience in the real world knows that rubber bullets or plastic baton rounds fired from firearms frequently cause permanent injury or death.

A character sufficiently injured with a 'non-lethal' weapon falls immediately to the ground and never, no matter how long you wait, recovers in any way. Characters felled in this way with 'non-lethal' weapons never reappear in the game.

There is a series of missions, given to you by a fixer called Regina Jones, in which she urges you to use non-lethal weapons and says that a medical team will come to recover the victim for treatment; but, no matter how long you wait, that medical team is never seen to arrive.

In practice I think 'non-lethal' weapons are just a flimsy screen behind which CD Projekt can claim "well, you don't have to kill hundreds of people, you can use 'non lethal' weapons." But, in fact, to complete a playthrough of Cyberpunk -- even just the 'main plot' quests -- you do have to permanently incapacitate many tens, probably hundreds of characters. In practice, 'non lethal' weapons are a difference that makes no difference.

The reason for 'No Surrender'

So, why? Why does the rookie with the most basic, most useless weapon, sole survivor of his or her squad, still charge into battle against an opponent who has clearly demonstrated martial superiority? Why does the rookie continue to attack even when wounded?

It's partly down to something I've been arguing for ten years now. Because every interaction in modern video games has to be voice acted, characters have very limited vocal repertoire. So there isn't sufficient repertoire for negotiations of surrender (either way). But to an extent the repeated calls of "come out with your hands in the air" give the lie even to this suggestion. There is not just one voice acted instance of that call, there are several. True, they do become repetitive; true, they are not individually voice acted for each opponent.

But the very fact that they are not indicates that it would be possible for the developers to record a few variants of "I give up," "I've had enough," "I surrender." There's an animation for civilian non-player characters caught in the middle of a firefight: they crouch in the open with their hands above their heads. A defeated opponent could put his or her weapon down, call out "I surrender," and do the same.

An injured opponent could writhe on the ground, crying out in pain or whimpering, and beg for medical help. An uninjured opponent could just run away, either abandoning their weapon or taking it with them.

In any of these cases the player would have moral choices. They could succour the wounded. They could handcuff wanted criminals and tag them for the police to collect. They could let fugitives go. Or, they could leave the wounded in agony, could demand bribes from wanted criminals for not calling the police. Or, they could just kill the defeated. In any of these options, the player could either collect the victims' weapons (and, optionally, other valuables) for resale, or not do so.

And, of course, if the player leaves the victim free with their weapon, then there should be at least some chance that the victim will then break their implied parole and restart the fight.

But it's a role playing game. The player should have scope to play their role, to make moral choices. Some players will choose to be merciful; some, even, quixotic. Those choices should be there.

Not providing these options -- having every non-player opponent continue to fight, obstinately, mechanically, robotically, to the bitter end, to the death of every last member of the squad, without ever considering or trying or begging for other outcomes -- is lazy. It's not good game design. It rends at the willing suspension of disbelief. It sucks.

Friday 1 January 2021

T-Bug, memory management, and Cyberpunk

A Cyberpunk 2077 character in T pose
I have no inside information about the development of Cyberpunk 2077, but I am a software engineer with 35 years of experience, and I have written mods for both CD Projekt and Bioware games.

Cyberpunk is essentially two products: RED Engine, and Cyberpunk itself, which runs on top of RED Engine. The engine is very much analogous to a JVM: it abstracts the platform for the game code that runs on top of it.

The Cyberpunk layer itself, and the graphical and audio assets, are probably identical between PC, and old and new versions of the XBox and PlayStation platforms. It is the engine that differs between the platforms. The Cyberpunk layer seems to me to be in a good state of completion — there are bugs, but they're relatively minor.

The version of RED Engine used for Cyberpunk is surprisingly little changed from the version used for Witcher 3, CD Projekt's previous major game. The main obvious change is improved background loading of assets. On a PC towards the upper end of recommended spec, this too is reasonably solid: I have had one crash, one significant audio glitch, and two or three minor visual glitches in twenty hours of game play.

But it's clearly in the engine that the problems lie, and that, I think, is where they have always been.

The game was launched simultaneously on PC, and on both 'last gen' and 'next gen' XBox and PlayStation consoles, although on both XBox and PlayStation, the code at release used the 'last gen' APIs, and next gen consoles run this in their respective backwards compatibility modes. A further release using next gen APIs is promised, but is not yet available. 

However, the game runs reasonably well on modern gaming PCs and on next generation consoles. But it runs extremely poorly on last generation consoles, to the extent of causing a great deal of negative comment. So why? What's going wrong? 

I emphasise again: I don't know, I have no inside information. This essay is reasonably well informed speculation, and nothing more. However, this is my opinion. 

What sets the older XBox and PlayStation platforms apart is that they have much more limited i/o speed, and much more limited main memory, than the newer generation (or than current PCs). They also have slower processors and more limited graphics subsystems.

Night City — the setting for Cyberpunk — is an extraordinarily ambitious and complex visual environment. To render a single static scene, hundreds of models and textures must be loaded from backing store. 

But the scenes are not static. On the contrary, the user can look around freely at all times, and can move quickly through the environment. At the same time, dozens of non-player characters, vehicles, aircraft and other mobile game objects are also moving (some rapidly) through the scene.

From a development and testing point of view, it's easy to test that a given asset can be loaded into memory and rendered in a given time. It's even relatively easy to test whether a given set of assets can be loaded in a given time.

But what I have particularly seen in the videos of the game running on old-generation hardware is:
  1. Late loading of higher resolution textures; and
  2. Assets (particularly non-player characters) being rendered in default poses.
I also hear that there are a lot of crashes, which I'll come back to.

The two issues I've described above both seem down to the program being i/o bound — it can't get data from disk to screen fast enough, because of limitations in bandwidth. That's hard physics: yes, you can work to make the graphics selection and loading code as efficient as possible, but if you need all those bits on the screen to render a scene and the system doesn't have the raw bandwidth, it isn't going to happen.

The problem is made worse by limited main memory. Where there is main memory to spare, it can be used to cache near-screen assets, so that if, for example, the player turns their head, the required assets are already in main memory. But if main memory is exhausted with all the assets currently on screen, then when the player turns their head, unwanted assets must be culled and fresh assets loaded, immediately.
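That caching behaviour can be sketched as a least-recently-used cache with a byte budget. This is purely an illustrative sketch of the general technique; the class and method names are my own invention, not RED Engine code:

```python
from collections import OrderedDict

class AssetCache:
    """Illustrative LRU asset cache with a fixed byte budget.

    A sketch of the general technique only, not any real engine's code:
    assets stay resident until the budget forces eviction of the
    least recently used ones."""

    def __init__(self, budget_bytes):
        self.budget = budget_bytes
        self.used = 0
        self.assets = OrderedDict()  # asset_id -> size in bytes

    def fetch(self, asset_id, size):
        """Ensure the asset is resident, evicting LRU assets as needed."""
        if asset_id in self.assets:
            self.assets.move_to_end(asset_id)  # mark most recently used
            return asset_id
        # Evict least recently used assets until the new one fits.
        while self.used + size > self.budget and self.assets:
            _, evicted_size = self.assets.popitem(last=False)
            self.used -= evicted_size
        self.assets[asset_id] = size
        self.used += size
        return asset_id
```

With spare memory the cache simply fills up and head-turns are cheap; once `used` reaches `budget`, every new asset forces an immediate eviction and a fresh load from backing store, which is exactly the behaviour described above.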

This raises the issue of crashing. These game assets are big. Culling and reloading will rapidly fragment the heap. But pauses for garbage collection are really undesirable in a fast moving real time environment. Near real time GC of rapidly fragmenting heaps is hard.
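To see concretely why fragmentation bites, here is a toy first-fit allocator (purely illustrative, and nothing to do with RED Engine's actual allocator): after freeing two non-adjacent blocks, an allocation fails even though enough total memory is free.

```python
class ToyHeap:
    """Toy first-fit heap keeping a free list of (offset, size) blocks.
    Deliberately does no coalescing or compaction, to show how churn
    fragments free space. Illustrative only."""

    def __init__(self, size):
        self.free = [(0, size)]

    def alloc(self, size):
        """First-fit allocation; returns an offset, or None on failure."""
        for i, (off, sz) in enumerate(self.free):
            if sz >= size:
                if sz == size:
                    del self.free[i]
                else:
                    self.free[i] = (off + size, sz - size)
                return off
        return None  # no single contiguous block is big enough

    def release(self, off, size):
        self.free.append((off, size))

# Fill a 100-byte heap with 25-byte blocks, then free two
# non-adjacent ones: 50 bytes are now free, in two separate holes.
heap = ToyHeap(100)
blocks = [heap.alloc(25) for _ in range(4)]  # offsets 0, 25, 50, 75
heap.release(0, 25)
heap.release(50, 25)
```

Here `heap.alloc(50)` returns `None` even though 50 bytes are free. Compacting those holes back together is exactly the garbage-collection pause that a fast-moving real time renderer cannot afford mid-frame.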

Worse, I suspect, is what happens if the assets required to render a scene in themselves exhaust main memory. I'm pretty sure this happens, because it's noticeable that scenes rendered on old generation consoles contain fewer non-player characters than similar scenes rendered on PC. There's clearly code that decides whether to cull non-plot-critical non-player characters when memory load is high.

But thrashing is likely to occur — or at least, there will be need for sophisticated code to prevent thrashing — when assets required to render a scene cannot be accommodated without removing other assets also required to render the exact same scene.
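The alternative to thrashing is to drop optional assets until the frame's working set fits. A sketch of priority-based culling, with hypothetical names; this is my guess at the shape of such code, not CD Projekt's actual algorithm:

```python
def cull_to_budget(assets, budget_bytes):
    """Drop optional assets, lowest priority first, until the frame's
    working set fits the budget.  `assets` is a list of tuples
    (name, size_bytes, priority, essential); essential assets
    (scenery, plot-critical characters) are never dropped.
    Purely illustrative, not any real engine's code."""
    total = sum(size for _, size, _, _ in assets)
    # Consider optional assets in ascending priority order.
    optional = sorted((a for a in assets if not a[3]), key=lambda a: a[2])
    dropped = set()
    for name, size, _priority, _ in optional:
        if total <= budget_bytes:
            break
        total -= size
        dropped.add(name)
    kept = [a for a in assets if a[0] not in dropped]
    return kept, total
```

The hard part, of course, is not this loop but what happens when even the essential assets alone exceed the budget: at that point there is nothing left to cull, and the engine must either thrash or fail.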

This sort of code — especially when it is being developed under pressure — is very susceptible to the sort of bugs which cause crashes.

So, from a quality point of view, where does that leave us? All these aspects of engine performance are suitable for unit tests, integration tests and characterisation tests. Characterisation tests – does this code behave exactly the same as that code? – may be particularly relevant when testing ports to multiple platforms.

If there is not a comprehensive test suite and a continuous integration platform then someone has been very derelict in their duty, and I do not believe that is the case. CD Projekt strike me, in both artistic and technical proficiency, as pretty thorough.
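A characterisation test in this sense just pins behaviour down: run the reference implementation over a grid of inputs, record its outputs, and assert the port produces exactly the same. A minimal sketch, with both implementations stubbed out under hypothetical names:

```python
def reference_damage(level, armour):
    """Stand-in for the original implementation whose behaviour we pin."""
    return max(0, level * 10 - armour)

def ported_damage(level, armour):
    """Stand-in for the port under test; must match the reference exactly."""
    return max(0, level * 10 - armour)

def characterise(fn, cases):
    """Record fn's output for every input case."""
    return {case: fn(*case) for case in cases}

# Pin the reference behaviour over a grid of inputs, then check the port.
cases = [(lvl, arm) for lvl in range(1, 6) for arm in (0, 5, 50)]
expected = characterise(reference_damage, cases)
actual = characterise(ported_damage, cases)
```

In practice `expected` would be captured once on the reference platform and committed as a fixture, so every port and every continuous integration run is checked against the same recorded behaviour.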

Furthermore, we've seen very impressive renderings of scenery and action for two years now, so the upper bound to the size and numbers of assets required for scenes has been known for at least that time. So the performance and stability problems on old generation consoles must have been known.

That implies to me that management ought to have said, at least a year ago, "we will launch only on PC and next-gen platforms, and a degraded version for old generation consoles may follow later but we don't know when."

Obviously, investors and owners of older consoles would have been disappointed, but it would have avoided a significant hit to reputation.

This essay started as a comment on a YouTube video, which, if you're interested, you should watch.


Creative Commons Licence
The fool on the hill by Simon Brooke is licensed under a Creative Commons Attribution-ShareAlike 3.0 Unported License