Friday 31 October 2014

Scottish Devolution, and Socialism in One Nation

Image courtesy of Stewart Bremner
[This is a submission to the Smith Commission, written mainly by me but on behalf of Radical Independence Dumfries and Galloway. It's separate from (and deals with different issues to) my own submission, which is here. My instructions from Comrade Lucy were 'Just scrawl "full communism now" on rahbackuvvah silver rizla'. I may have written slightly more. Thanks to everyone who contributed.]


Dear Commissioners

It is a truth universally acknowledged that, compared to the United Kingdom as a whole, Scotland tilts a little to the left. How far to the left Scotland tilts is something we don't yet really know; in the run-up to the Independence Referendum the SNP pursued a disappointingly unradical course, running an almost Labourite technocratic administration distinguished only by its relative competence. But the fact remains that even counting Labour as a right of centre party, there are more left of centre MSPs than right of centre MSPs – there is a majority at Holyrood which is to the left of the Labour left-wingers, at a time when the Scottish Labour Party is itself somewhat to the left of its UK parent.

This was one of the key drivers of the Yes campaign in the referendum. The left in Scotland has lost confidence in the possibility of seeing progressive government in the UK, and consequently sees the prospect of independence as a means by which a more left wing state can be achieved.

How left wing? We don't know. No-one knows. Given the chance, it will develop. It may become Scandinavian style social democracy. It may become Clause Four socialism. It may become full blown communism. It should be up to the people of Scotland to decide. And, as today's Ipsos MORI poll shows, the djinni is not back in the bottle. Any devolution settlement which does not permit Scotland to evolve – if it so chooses – Socialism in One Home Nation is not a devolution settlement that can last.

That means that it is not possible to devolve this tax but not that tax, this benefit but not that benefit. The possibility must be allowed for in which the constituent nations of the United Kingdom raise taxes on bases wholly different from one another. Scotland must have the option of raising taxes principally from wealth, or property, rather than from income or consumption. Scotland must have the option of instituting a Citizens' Income rather than retaining a complicated and inconsistent mess of means-tested benefits. We're not saying that there is now an appetite in Scotland for any of these things, but rather that Scotland is on a journey and we don't yet know where that journey will lead; but if the devolution settlement is not sufficiently open and flexible to accommodate the path that Scotland chooses, then the devolution settlement will fail.

This, inevitably, raises the matter of TTIP. It should not be in the gift of the UK government to offer up, through secret treaties entered into without public consultation, the public services of Scotland to bids from private industry. It is for Scotland to decide whether our health care, our social care, our street cleaning and bin collections, our fields, hills, factories and housing are to be in public hands or private. If Scotland chooses


“to secure for the workers by hand or by brain the full fruits of their industry and the most equitable distribution thereof that may be possible upon the basis of the common ownership of the means of production, distribution and exchange, and the best obtainable system of popular administration and control of each industry or service”

then that is Scotland’s choice, and must prevail.

The same applies to policy on energy. Scotland has different energy resources, and different energy opportunities, from the other home nations. It has significantly higher average windspeeds, and a much longer coastline exposed to rougher seas and fiercer tides; it has far greater fossil hydrocarbon reserves already discovered than it can responsibly afford to bring to the surface and burn. At the same time it also has scandalous levels of fuel poverty.

Scotland should not be forced to pay to subsidise a nuclear power station in Somerset which it categorically does not need. But similarly, Scotland should not be forced to accept fracking on its own soil. Scotland may choose to frack, or not; it may choose to buy expensive baseload power from the French government’s power station in Somerset, or not. Energy policy, like social policy, taxation policy, education policy, transport policy, must be devolved, and for the same reasons.

So the question becomes, what should not be devolved? It is a fact which cannot be denied that there was a referendum on independence, and the Yes side – on which we stood – lost. Because the Yes side lost, the pro-union parties are feeling encouraged in a spirit of triumphalism to row back from the generous offers they made in the closing days of the campaign. They would be well advised to be cautious in doing so. The Ipsos MORI poll published by STV today suggests that pro-independence candidates – the SNP – will take 54 of the 59 Scottish Westminster seats at the general election in just six months' time. And if that should happen, then the outcome of the referendum is moot: there will be a clear and incontestable mandate for independence. The will of the Scottish people is by no means settled.

The positive case for the Union, in so far as it was made at all by the No side during the referendum campaign, seems to be that the Union allows a Labour government to impose, with Scottish votes, policies on England that the English do not want and the Scots would not themselves tolerate; and it allows Conservative governments to underpin their economic incompetence with the proceeds of North Sea oil. We're sorry: you cannot have either of those. The West Lothian Question is a paradox at the core of our constitution; it cannot continue. English votes for English laws may be a slogan of the right, but it's also clearly democratic and just. The English should be allowed to implement policies which the Scots would not choose for themselves, just as much as we should be allowed to implement policies which the English would not choose for themselves.

In order to prevent that – in order to retain the UK – the pro-union parties must live up to and exceed their pledges. The irreducible minimum that is required for a functioning UK is that Westminster should control Defence, Foreign Policy and Monetary Policy. We acknowledge that if there is to be a Union at all, these things must be retained by the centre. We caution the Unionist parties that little else should be, if they wish the Union to survive a twelvemonth.

And acknowledging that Defence policy should continue to be a reserved matter does not mean that we are content that the United Kingdom should continue to park its weapons of mass destruction on Scotland's lawn, in Scotland's landscape and within spitting distance of Scotland's largest city. On the contrary: the United Kingdom has entered into a solemn commitment under Article VI of the Nuclear Non-Proliferation Treaty to entirely disarm. These weapons are illegal, and in retaining them the United Kingdom is in breach of its obligations. There is no reason that Scotland should be required to shoulder the burden of this ongoing shame.

We wish to endorse the submissions of our members Simon Brooke and Janet Moxley, and those of Dumfries and Galloway Green Party and Eddi Reader.

Sincerely

Simon Brooke, for and on behalf of Radical Independence Dumfries and Galloway



Saturday 25 October 2014

Post Scarcity Hardware

A second-generation Connection Machine in use.
Each light represents one active processor node.
Eight years ago, I wrote an essay which I called Post Scarcity Software. It's a good essay; there's a little I'd change about it now - I'd talk more about the benefits of immutability - but on the whole it's the nearest thing to a technical manifesto I have. I've been thinking about it a lot the last few weeks. The axiom on which that essay stands is that modern computers - modern hardware - are tremendously more advanced than modern software systems, and would support much better software systems than we yet seem to have the ambition to create.

That's still true, of course. In fact it's more true now than it was then, because although the pace of hardware change is slowing, the pace of software change is still glacial. So nothing I'm thinking of in terms of post-scarcity computing actually needs new hardware.

Furthermore, I'm a software geek. I know very little about hardware; but I'm very much aware that as parallelism increases, the problems of topology in hardware design get more and more difficult. I've no idea how physically to design the machines I'm thinking of. But nevertheless I have been thinking more and more, recently, about the design of post-scarcity hardware to support post-scarcity software.

And I've been thinking, particularly, about one issue: process spawning on a new processor, on modern hardware, with modern operating systems, is ridiculously expensive.

A map of the problem

What got me thinking about this was watching the behaviour of the Clojure map function on my eight core desktop machine.

Mapping, in a language with immutable data, is inherently parallelisable. There is no possibility of side effects, so there is no particular reason for the computations to be run serially on the same processor. MicroWorld, being a cellular automaton, inherently involves repeatedly mapping a function across a two dimensional array. I was naively pleased that this could take advantage of my modern hardware - I thought - in a way in which similarly simple programs written in Java couldn't...

...and then was startled to find it didn't. When running, the automaton would camp on a single core, leaving the other seven happily twiddling their thumbs and doing minor Unixy background stuff.

What?

It turns out that Clojure's default map function simply serialises iterations in a single thread. Why? Well, one finds out when one investigates a bit. Clojure provides two different versions of parallel mapping functions, pmap and clojure.core.reducers/map. So what happens when you swap map for pmap? Why, performance improves, and all your available cores get used!
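
To see the difference in miniature, here's a toy REPL experiment of my own (not MicroWorld code), with each step made artificially expensive enough to be worth a thread:

(defn slow-inc
  "A deliberately slow increment: each call costs about a millisecond."
  [n]
  (Thread/sleep 1)
  (inc n))

;; serial: camps on one core
(time (doall (map slow-inc (range 1000))))

;; parallel: spreads across all available cores
(time (doall (pmap slow-inc (range 1000))))

(The doall calls matter: both map and pmap are lazy, so without them nothing would actually be computed inside the time.)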

Except...

Performance doesn't actually improve very much. Consider this function, which is the core function of the MicroWorld engine:

(defn map-world
  "Apply this `function` to each cell in this `world` to produce a new world.
   The arguments to the function will be the world, the cell, and any
   `additional-args` supplied. Note that we parallel map over rows but
   just map over cells within a row. That's because it isn't worth starting
   a new thread for each cell, but there may be efficiency gains in
   running rows in parallel."
  ([world function]
    (map-world world function nil))
  ([world function additional-args]
    (into []
          (pmap (fn [row]
                  (into [] (map #(apply function
                                        (cons world (cons % additional-args)))
                                row)))
                world))))

This maps across a two dimensional array: over each of the rows of the array and, within each row, over each cell in the row. As you can see, in this current version, I parallel map over the rows but serial map over the cells within a row.

Here's why:

Hybrid parallel/non-parallel version

This is the current default version. It runs at about 650% processor loading - i.e. it maxes out six cores and does some work on a seventh. The eighth core is doing all the Unix housekeeping.

(time (def x1 (utils/map-world
        (utils/map-world w heightmap/tag-altitude (list hm))
        heightmap/tag-gradient)))
"Elapsed time: 24592.327364 msecs"
#'mw-explore.optimise/x1

Pure parallel version

Runs at about 690% processor loading - almost fully using seven cores. But, as you can see, it takes roughly half as long again as the hybrid version.

(time (def x2 (utils/map-world-p-p
       (utils/map-world-p-p w heightmap/tag-altitude (list hm))
        heightmap/tag-gradient)))
"Elapsed time: 36762.382725 msecs"
#'mw-explore.optimise/x2

(For completeness, clojure.core.reducers/map is even slower, so it is not discussed in any further detail.)


Non parallel version

Maxes out one single core, and takes about 3.6 times as long as the hybrid version. But, in terms of processor cycles, that's a considerable win - because 6.5 cores for 24 seconds is 156 core-seconds, against the 88 seconds the serial version takes on one core: roughly a 75% overhead in running threads across multiple cores.

(time (def x2 (utils/map-world-n-n
       (utils/map-world-n-n w heightmap/tag-altitude (list hm))
        heightmap/tag-gradient)))
"Elapsed time: 88412.883849 msecs"
#'mw-explore.optimise/x2

Now, I need to say a little more about this. It's obvious that there's a considerable set-up/tear-down cost for threads. The reason I'm using pmap for the outer mapping but serial map for the inner mapping rather than the other way round is to do more work in each thread.

However, I'm still simple-mindedly parallelising the whole of one map operation and serialising the whole of the other. This particular array is 2048 cells square - so over four million cells in total. But, by parallelising the outer map operation, I'm actually asking the operating system for 2048 threads - far more than there are cores. I have tried to write a version of map using Runtime.getRuntime().availableProcessors() to find the number of processors I have available, and then partitioned the outer array into that number of partitions and ran the parallel map function over that partitioning:

(defn adaptive-map
  "An implementation of `map` which takes note of the number of available cores."
  [f coll]
  (let [cores (.availableProcessors (Runtime/getRuntime))
        ;; ceiling division, so any stragglers form a (smaller) final block
        size  (max 1 (long (Math/ceil (/ (count coll) (double cores)))))]
    (apply concat (pmap #(map f %) (partition-all size coll)))))
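
It's a drop-in replacement for map, so at the four-million-cell scale of the world above one would invoke it as, for instance (a toy call, not MicroWorld code):

(adaptive-map #(* % %) (range (* 2048 2048)))

which on an eight core machine should split the work into eight blocks of half a million cells, one per core.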

Sadly, as A A Milne wrote, 'It's a good sort of brake But it hasn't worked yet.'

But that's not what I came to talk about.

We are reaching the physical limits of the speed of switching a single processor. That's why our processors now have multiple cores. And they're soon going to have many more cores. Both Oracle (SPARC) and ARM are demoing chips with 32 cores, each 64 bits wide, on a single die. Intel and MIPS are talking about 48 core, 64 bit wide, chips. A company called Adapteva is shipping a 64 core by 64 bit chip, although I don't know what instruction set family it belongs to. Very soon we will have more; and, even if we don't have more cores on a physical die, we will have motherboards with multiple dies, scaling up the number of processors even further.

The Challenge


The challenge for software designers - and, specifically, for runtime designers - is to write software which can use these chips reasonably efficiently. But the challenge, it seems to me, for hardware designers, is to design hardware which makes it easy to write software which can use it efficiently.

Looking for the future in the past, part one


Thinking about this, I have been thinking about the Connection Machine. I've never really used a Connection Machine, but there was once one in a lab which also contained a Xerox Dandelion I was working on, so I know a little bit about them. A Connection Machine was a massively parallel computer having a very large number - up to 65,536 - of very simple processors (each processor had a register width of one bit). Each processor node had a single LED lamp; when in use, actively computing something, this lamp would be illuminated. So you could see visually how efficient your program was at exploiting the computing resource available.

[Incidentally while reading up on the Connection Machine I came across this delightful essay on Richard Feynman's involvement in the project - it's of no relevance to my argument here, but nevertheless I commend it to you]

The machine was programmed in a pure-functional variant of Common Lisp. Unfortunately, I don't know the details of how this worked. As I understand it each processor had its own local memory but there was also a pool of other memory known as 'main RAM'; I'm guessing that each processor's memory was preloaded with a memory image of the complete program to run, so that every processor had local access to all functions; but I don't know this to be true. I don't know how access to main memory was managed, and in particular how contention on access to main memory was managed.

What I do know from reading is that each processor was connected to twenty other processors in a fixed topology known as a hypercube. What I remember from my own observation was that a computation would start with just one or a small number of nodes lit, and flash across the machine as deeply recursive functions exploded from node to node. What I surmise from what I saw is that passing a computation to an unoccupied adjacent node was extremely cheap.

A possibly related machine from the same period which may also be worth studying but about which I know less was the Meiko Computing Surface. The Computing Surface was based on the Transputer T4 processor, a 32 bit processor designed specifically for parallel processing. Each transputer node had its own local store, and very high speed serial links to its four nearest neighbours. As far as I know there was no shared store. The Computing Surface was designed to be programmed in a special purpose language, Occam. Although I know that Edinburgh University had at one time a Computing Surface with a significant number of nodes, I don't know how many 'a significant number' is. It may have been hundreds of nodes but I'm fairly sure it wasn't thousands. However, each node was of course significantly more powerful than the Connection Machine's one bit nodes.

A caveat


One of the lessons we learned in those high, far off, arrogant days was that special purpose hardware which could do marvellous things, but was expensive, lost out to much less capable but cheaper general purpose hardware. There's no point in designing fancy machines unless there's some prospect that they can be mass produced and widely used, because otherwise they will be too expensive to be practical; which presumes not only that they have the potential to be widely used, but also that you (or someone else involved in the project) are able to communicate that potential to people with enough money to back it.


Hardware for Post Scarcity software


Before going forward with this argument, let's go back. Let's go back to the idea of the Clojure map function. In fact, let's go back to the idea of a function.

If a processor is computing a function, and that function has an argument, then before the function can be computed the value of the argument must be computed; and, as the function cannot be computed until the value of the argument has been computed, there is no point in handing off the processing of the argument to another processor, because the first processor will then necessarily be idle until the value is returned. So it may just as well recurse up the stack itself.

However, if a function has two arguments and values of both must be computed, then if the first processor can hand off processing of one of them to another, similar, processor, potentially the two can be processed in the time in which the original processor could process just one. Provided, that is, that the cost of handing off processing to another processor is substantially less than the cost of evaluating the argument - which is to say, as a general thing, the closer one can get the cost of handing off to another processor to the cost of allocating a stack frame on the current processor, the better. And this is where current-generation hardware is losing out: that cost of handing off is just way too high.
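
On today's hardware the nearest available analogue of 'handing off to a neighbour processor' is handing off to another thread, so here is a minimal Clojure sketch of the two-argument case, with a future standing in for the neighbour (the names are mine, purely for illustration):

(defn parallel-apply-2
  "Apply `f` to the values of two expensive thunks, evaluating one
   on another thread and the other locally."
  [f thunk-a thunk-b]
  (let [remote (future (thunk-a)) ; hand one argument off
        local  (thunk-b)]         ; evaluate the other ourselves, meanwhile
    (f @remote local)))           ; join: block only if the hand-off is slower

The hand-off only pays, of course, when evaluating thunk-a costs substantially more than creating the future - which is exactly the point at issue.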

Suppose, then, that our processor is a compute node in a Connection-Machine-like hypercube, able to communicate directly at high speed with twenty close neighbours (I'll come back to this point in detail later). Suppose also that each neighbour-connection has a 'busy' line, which the neighbour raises when it is itself busy. So our processor can see immediately without any need for doing a round-robin which of its neighbours are available to do new work.

Our processor receives a function call with seven arguments, each of which is a further function call. It hands six of these off to idle neighbours, pushes the seventh onto its own local stack and computes it, returns to the original stack frame, waits for the last of the other six to report back a value, and then carries on with its processing.
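
Generalising the sketch above to this seven-argument case (again with threads standing in for neighbour processors):

(defn parallel-apply
  "Apply `f` to the values of any number of expensive thunks, handing
   all but the last off to 'neighbours' and computing the last locally."
  [f & thunks]
  (let [remote (mapv future-call (butlast thunks)) ; six go to the neighbours
        local  ((last thunks))]                    ; the seventh stays local
    (apply f (conj (mapv deref remote) local))))   ; wait for the others to report back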

The fly in the ointment here is memory access. I assume all the processors have significant read-only cache (they don't need read-write cache, we're dealing with immutable data; and they only need a very small amount of scratchpad memory). If all six of the other processors find the data they need (for these purposes the executable function definition is also data) in local cache, all is good, and this will be very fast. But what if all have cache misses, and have to request the data from main memory?

This comes down to topology. I'm not at all clear how you even manage to have twenty separate data channels from a single node. To have a data channel from each node, separately, to main memory simply isn't possible - not if you're dealing with very large numbers of compute nodes. So the data bus has to be literally a bus, available to all nodes simultaneously. Which means, each node that wants some data from main memory must ask for it, and then sit watching the bus, waiting for it to be delivered. Which also means that as data is sent out on the bus, it needs to be tagged with what data it is.

Looking for the future in the past, part two


In talking about the Connection Machine which lurked in the basement of Logica's central London offices, I mentioned that it shared a lab with one of the Xerox 1108 Dandelions I was employed to work on. The Dandelion was an interesting machine in itself. In typical computers - typical modern computers, but also typical computers of thirty years ago - the microcode has virtually the status of hardware. While it may technically be software, it is encoded immutably into the chip when the chip is made, and can never be changed.

The Dandelion and its related machines weren't like that. Physically, the Dandelion was identical to the Star workstations which Xerox then sold for very high end word processing. But it ran different microcode. You could load the microcode; you could even, if you were very daring, write your own microcode. In its Interlisp guise, it had all the core Lisp functions as single opcodes. It had object oriented message passing - with full multiple inheritance and dynamic selector-method resolution - as a single opcode. But it also had another very interesting instruction: BITBLT, or 'Bit Block Transfer'.

This opcode derived from yet another set, that developed for an earlier version of the same processor on which Smalltalk was first implemented. It copied an arbitrary sized block of bits from one location in memory to another location in memory, without having to do any tedious and time consuming messing about with incrementing counters (yes, of course counters were being incremented underneath, but they were in registers only accessible to the microcode, and which ran, I think, significantly faster than the 'main' registers). This highly optimised block transfer routine allowed a rich and responsive WIMP interface on a large bitmapped display on what weren't, underneath it all, actually terribly powerful machines.

BITBLT for the modern age


Why is BITBLT interesting to us? Well, if we can transfer the contents of only one memory location over the bus in a message, and every message also needs a start-of-message marker and an object reference, then clearly the bus is going to run quite slowly. But if we can say, OK, here's an object which comprises this number of words, coming sequentially after this header, then the overhead of queuing messages on the bus is significantly reduced. Nor need we limit ourselves to sending, as single messages on the bus, only data which was contiguous in main memory.

Most of the things which will be requested will be either vectors (yes, Java fans, an object is a vector) or lists. Vectors will normally point to other objects which will be needed at the same time as the vector itself is needed; list structures will almost always do so. Vectors will of course normally be contiguous in memory but the things they point to won't be contiguous with them; lists are from this point of view like structures of linked vectors such that each vector has only two cells.

So we can envisage a bus transfer language which is in itself like a very simple lisp, except decorated with object references. So we might send the list '(1000 (2000) 3000) over the bus as notionally

[ #00001 1000 [ #00002 2000 ] 3000 ]

where '[' represents start-of-object, '#00001' is an object reference, '1000' is a numeric value, and ']' is end-of-object. How exactly is this represented on the bus? I'll come back to that; it isn't the main problem just now.
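
To make the structure (as opposed to the electrical representation) concrete, here is a toy Clojure serialiser for the notation - the token scheme, and the way object references are issued, are pure invention on my part:

(def object-counter (atom 0))

(defn serialise
  "Flatten `datum` into the notional bus representation: `:open` plus a
   freshly issued object reference begins an object, `:close` ends it;
   atomic values pass through unchanged."
  [datum]
  (if (sequential? datum)
    (concat [:open (swap! object-counter inc)]
            (mapcat serialise datum)
            [:close])
    [datum]))

;; with a fresh counter:
;; (serialise '(1000 (2000) 3000))
;; => (:open 1 1000 :open 2 2000 :close 3000 :close)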

Requesting and listening


Each processor can put requests onto the 'address bus'. Because the address bus is available to every processing node, every processing node can listen to it. And consequently every processing node does listen to it, noting every request that passes over the bus in a local request cache, and removing the note when it sees the response come back over the data bus.

When a processing node wants a piece of data, it first checks its local memory to see whether it already has a copy. If it does, fine, it can immediately process it. If not, it checks to see whether the piece of data has already been requested. If it has not, it requests it. Then it waits for it to come up the bus, copies it off into local store and processes it.
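
As a sketch of that discipline - with atoms standing in for the local store and the two buses, and a polling loop standing in for watching the data bus; every name here is invented:

(def address-bus (atom #{})) ; object refs currently requested, as snooped
(def data-bus    (atom {}))  ; object ref -> value, as delivered

(defn fetch
  "Fetch the object named by `obj-ref`: local store first, then the
   snooped request cache, then ask on the bus and wait."
  [local-store obj-ref]
  (or (get @local-store obj-ref)                 ; 1. already local?
      (do (when-not (contains? @address-bus obj-ref)
            (swap! address-bus conj obj-ref))    ; 2. not yet asked for: ask
          (loop []                               ; 3. watch the data bus
            (if-let [value (get @data-bus obj-ref)]
              (do (swap! local-store assoc obj-ref value) ; 4. copy it off
                  value)
              (do (Thread/sleep 1) (recur)))))))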

That all sounds rather elaborate, doesn't it? An extremely expensive way of accessing shared storage?

Well, actually, no. I think it's not. Let's go back to where we began: to map.

Mapping is a very fundamental computing operation; it's done all the time. Apply this same function to each of these closely related arguments, and return the results.

So, the first processor gets the map, and passes a reference to the function and arguments, together with indices indicating which arguments to work on, to each of its unemployed neighbours. One of the neighbours then makes a request for the function and the list of arguments. Each other processor sees the request has been made, so just waits for the results. While waiting, each in this second tier of processors may sub-partition its work block and farm out work to unemployed third-tier neighbours, and so on. As the results come back up the bus, each processor takes its local copy and gets on with its partition, finally passing the results back to the neighbour which originally invoked it.
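
The recursive subdivision might be sketched like this, with futures once again standing in for neighbour processors and an explicit count of idle neighbours (an invented parameter) bounding the split:

(defn farm-out
  "Map `f` over the cells of the vector `args` between indices `lo`
   (inclusive) and `hi` (exclusive), splitting the range with a
   neighbour whenever one is idle."
  [f args lo hi idle-neighbours]
  (if (or (zero? idle-neighbours) (<= (- hi lo) 1))
    (mapv f (subvec args lo hi))        ; no help available: work the block
    (let [mid    (quot (+ lo hi) 2)
          remote (future (farm-out f args lo mid (quot idle-neighbours 2)))
          local  (farm-out f args mid hi (quot idle-neighbours 2))]
      (into @remote local))))           ; rejoin the partial results in order

;; e.g. (farm-out inc (vec (range 8)) 0 8 7) => [1 2 3 4 5 6 7 8]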

The memory manager


All this implies that somewhere in the centre of this web, like a fat spider, there must be a single agent which is listening on the address bus for requests for memory objects, and fulfilling those requests by writing the objects out to the data bus. That agent is the memory manager; it could be software running on a dedicated processor, or it could be hardware. It really doesn't matter. It's operating a simple fundamental algorithm, maintaining a garbage collected heap of memory items and responding to requests. It shouldn't be running any 'userspace' code.
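
In the same toy model as the fetch sketch above, the memory manager's core loop is almost trivially simple (and again, the names are invented; a real memory manager would be silicon or microcode, not Clojure):

(def main-memory (atom {:obj-1 '(1000 (2000) 3000)})) ; the managed heap

(defn memory-manager-step
  "Serve every request currently visible on the address bus by writing
   the requested object out on the data bus."
  []
  (doseq [obj-ref @address-bus]
    (when-not (contains? @data-bus obj-ref)
      (swap! data-bus assoc obj-ref (get @main-memory obj-ref)))))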

Obviously, there has to be some way for processor nodes to signal to the memory manager that they want to store new persistent objects; there needs to be some way of propagating back which objects are still referenced from code which is in play, and which objects are no longer referenced and may be garbage collected. I know I haven't worked out all the details yet. Furthermore, of course, I know that I know virtually nothing about hardware, and have neither the money nor the skills to build this thing, so like my enormous game engine which I really know I'll never finish, it's really more an intellectual exercise than a project.

But... I do think that somewhere in these ideas there are features which would enable us to build higher performance computers which we could actually program, with existing technology. I wouldn't be surprised to see systems fairly like what I'm describing here becoming commonplace within twenty years.

[Note to self: when I come to rework this essay it would be good to reference Steele and Sussman, Design of LISP-based Processors.]

Sunday 5 October 2014

Submission to the Smith Commission

Dear Commissioners

First let me say I wish you well in your immensely difficult task. While separating Scotland out of the Union might have been hard, keeping Scotland in the Union under present circumstances looks a great deal harder. Your commission was set up in the immediate aftermath of the recent referendum. Germane to the establishment of your commission were

  1. 'The Vow', signed by all the leaders of the major UK parties, whose terms as written are so woolly as to be virtually meaningless, but which was represented to the Scottish electorate as being a promise of DevoMax
  2. A series of extraordinary interventions by Gordon Brown MP, in which he promised - on the basis of what authority it is not clear - a solution '...as close to federalism as we can go in a country where one nation accounts for 80% of the population.' [1]

While the Command Paper which, as I understand it, will set out your full remit is not due to be published until the end of this month, your own statement of your terms of reference states that your recommendations '...will deliver more financial, welfare and taxation powers, strengthening the Scottish Parliament within the United Kingdom.' [2]


England does not seek subdivision


In his comments as quoted above, Gordon Brown lays his finger on a core issue which, in my view, your commission cannot duck. England does account for more than 80% of the population of the entire union, and there seems little appetite in England for any significant devolution to smaller units. There is no campaign to recreate Wessex, Mercia or Northumbria as states of the union in their own right. Rather, it seems clear that England, in so far as it wants anything at all from this process, wishes to remain England. It does not seem to me that it would be either desirable or in the medium term sustainable to seek to impose division on England in order to facilitate devolution in Scotland.

Political opinion is not homogeneous across the union


While some commentators rightly point out that the degree of difference between the political centre ground of the Scottish, Welsh, Northern Irish and English polities can be over-emphasised, it remains broadly the case that Wales and Scotland choose more left-leaning politicians and parties, while England chooses more right-leaning. The situation in Northern Ireland is confused by factors which do not apply in the remainder of the Union. While none of these things are set in stone, the broad picture of a more right-leaning polity closer to the centre of power in Westminster and more left-leaning polities further from Westminster has remained consistently apparent over the past 75 years. Furthermore, in recent elections this divergence has accelerated [3]. It is true that the line of division between the more rightward-leaning and more leftward-leaning territories of the Union does not necessarily accord with national subdivisions.

The West Lothian question and after


On 19th July 2004, Sir Thomas Dalyell Loch, the Eton- and Cambridge-educated 11th Baronet of the Binns, voted at Westminster to raise the university tuition fee cap to £3,000 per year - for university students in England only.

Sir Thomas sat then for the constituency of Linlithgow, in West Lothian, in Scotland; his constituents' education was - then as now - governed by the Scottish Parliament at Holyrood. And thus he answered his own West Lothian Question, which has dogged the British constitutional settlement ever since it was first asked, and which has an added frisson in the aftermath of Scotland's failed butterfly rebellion.

Clearly, it is unjust and inappropriate - intolerable to fair minded citizens of any of the nations of the Union - that Members of Parliament representing constituencies in the peripheral nations can vote, as Sir Thomas did, on matters which uniquely affect England. The anomaly which Sir Thomas identified must be addressed.

However, what is being proposed now by Conservative voices is a parliament - the Westminster parliament - which will continue to debate both bills affecting the whole United Kingdom and also bills affecting England only; but with the quirk that Welsh, Northern Irish and Scots MPs will be unable to vote on the English-only bills. This looks, on the face of it, sensible. It will not work.

An English Parliament is inevitable

If Labour wins the next Westminster election, but without a majority of English seats, who then is the English Secretary of State for Health? For Education? For the Environment? For Transport? For Rural Affairs? Fully half of the current UK cabinet have portfolios which cover only England [4]. If Labour wins a majority in the UK but not in England, Miliband will find himself on the horns of a dilemma.

Either he appoints Labour members to head the English departments, in which case they will none of them have a majority in the chamber to pass any legislation; or else he appoints Tories to his cabinet, in which case fully half of his cabinet are from the opposition. In either scenario, England is quite ungovernable.

The broad consequence of this is that the Union parliament and the English parliament cannot be the same institution. An administration which commands the confidence of the union may not command the confidence of England, and vice versa. It is not coherent to have two separate administrations within the same parliament. Consequently, a new, English parliament is a necessity of any settlement - whether the English want it or not.

The House of Lords cannot continue to be a union institution


This merely adds to the anomalous position of the house in which your chairman sits. No democracy in the world could tolerate a situation in which more than half of its parliamentarians are unelected; the United Kingdom cannot pretend to be a democracy precisely because more than half of its parliamentarians are unelected. Members of the House of Lords represent no constituency - indeed, a significant number of its members hold their seats precisely because they have been roundly defeated in democratic elections. Others hold their seats in consequence of financial donations to political parties - a practice we would condemn as rankly corrupt in any other country.

Although people appointed to the House of Lords take territorial designations in their titles, they in no way represent the views of the people of those territories. Whatever their place of birth, most are by the time they are appointed simply members of the metropolitan elite. So it is not the case that the membership of the House of Lords can conveniently be divided into 'English', 'Welsh', 'Scots' and 'Northern Irish' members. Furthermore, whatever may be the views of the people of England, I cannot see Scots willingly accepting a rag-bag of failed politicians and the undeserving rich as an 'Upper House' for the Scottish Parliament.

The House of Lords has been a serious and scandalous anomaly in the United Kingdom constitution for two centuries now; it simply cannot continue in the new circumstances.

The Union parliament cannot be elected on a population-share basis


The primary reason for the political dissatisfaction which led up to and gave rise to the recent referendum has been that, whoever Scotland votes for, we are always and automatically outvoted in parliament by England. The same, of course, is equally true for Wales and Northern Ireland. It is not possible to have a federal system in which one federate can always, in all votes, outvote all the others put together.

The United States constitution, and the European Parliament, both provide potential solutions to this. In the US, each state, regardless of its population, elects precisely two Senators to Congress [5], whether it is Wyoming with 582,658 people or California with thirty-eight million, sixty-five times as many. In the European Parliament [6], smaller states elect members from much smaller constituencies, so that one Maltese MEP represents 74,000 electors, whereas each German MEP represents 833,000 - eleven times as many. In the United Kingdom, it is probably not politically feasible to propose that England, Scotland, Wales and Northern Ireland each elect ten members to the Union parliament; but it does not seem to me impossible to suggest that Northern Ireland and Wales might each elect six, Scotland ten, and England fifteen.

These relatively small numbers are based on the notion of a system as Gordon Brown proposed, '...as close to federalism as we can go...' Few responsibilities would remain for the Union parliament to administer; each of the devolved administrations would collect tax and pay a subvention to the Union administration; and consequently, with little to do, the Union parliament would not have to be large. If the degree of devolution is less radical, then these numbers might be increased tenfold; but the general principle that England should not be able to outvote the other nations combined must not be lost.

Taxation cannot easily be split


Scotland has subsidised the rest of the UK for every one of the last 35 years [7]. At the same time, Scotland has suffered areas of great deprivation and especially of shockingly poor health outcomes. This injustice has been partially reflected in the Barnett Formula, but Barnett recognises only the additional social need in Scotland, not the additional tax take. This historical imbalance raises many difficult issues. While there is some merit in the argument that richer areas should pay more to the common weal than poorer areas, and that Scotland is a richer area, nevertheless there is historical injustice, consequent historical lack of capital spending, and a legacy of anger and mistrust to take into account.

To add any sort of adjustment of taxation into this mix will be fraught.

Currently Scotland has powers to vary income tax, but has not, under either Labour/Liberal Democrat or SNP administrations, used them. Given the continuing commitment of all the Westminster parties to austerity, a cut in the rate of income tax is fiscally impossible - public services are already stretched to breaking point. But a rise has been seen by all parties as too politically risky; and because the current power only extends to the basic rate of tax, any rise would be regressive, which would run counter to the social policy of any political party which could conceivably be electable in Scotland in the medium term.

However, the 'offers' made by the UK political parties are each in their own way predicated on naked partisan self-interest. None is in Scotland's interest. Income tax alone will not fund the Scottish government's current expenditure, let alone the increased expenditure that must necessarily be a consequence of increased powers. So the Conservative proposals are aimed squarely at undermining the Scottish government's ability to spend, and consequently at undermining its popularity. Labour's bizarre proposals of tinkerings at the margins are aimed at preserving the ability of Scots MPs - of whom Labour optimistically assumes it will continue to have the preponderance - to vote on the Union's, and consequently England's, income tax rates.

The only solutions which will work on taxation are

  1. Devolve no tax-raising powers, and continue with an enhanced Barnett Formula. It will be very hard to get a political consensus on this, and it is against Scotland's interests since the long-term drift of Union policy is towards ever more austerity and ever deeper cuts to social and health provision, leading to lower Barnett Formula payments to Scotland; or
  2. Devolve all tax raising powers completely, and have Scotland pay a subvention to the Union. Again, this will not be easy to build a political consensus around, and the formula for the subvention will be particularly contentious; but it is clearly the only solution which will work in the medium term.

And so I end as I began: by wishing you well in a task which I know will prove thankless, suspect will prove fruitless, and believe will prove pointless. I think that, notwithstanding the verdict of electors in the recent referendum, the slow dissolution of the United Kingdom is already past the point of no return.

Sincerely

Simon Brooke


  1. https://www.pressandjournal.co.uk/fp/news/politics/316983/brown-interdependence-is-the-future/
  2. http://www.smith-commission.scot/news/lord-smith-of-kelvin-announces-details-of-his-commission-at-the-scottish-parliament/
  3. http://www.theguardian.com/news/datablog/2010/may/08/general-election-2010-results-maps
  4. https://www.gov.uk/government/ministers
  5. http://en.wikipedia.org/wiki/United_States_Senate
  6. http://en.wikipedia.org/wiki/European_Parliament
  7. http://bellacaledonia.org.uk/2014/10/08/smith-and-the-subsidy-myth-makers/

Creative Commons Licence
The fool on the hill by Simon Brooke is licensed under a Creative Commons Attribution-ShareAlike 3.0 Unported License