Values and Identity

By Colin Turfus

We hear much about “values” and “identity” in discussions in the media these days. Often the debate about values is specifically around so-called “British values”; and the discussion about identity is often in the context of what is referred to as “identity politics.” The discourse on both these topics in my experience tends to be tedious and unilluminating, so I would like to try to consider these topics from a fresh perspective, in particular to look at their relationship and consider what light the one can shine on the other.

What Do I Mean by Values?

In relation to values, it is often assumed that, to be worthy of the name, they need to be universal or universalisable, i.e. worthy or capable of being upheld by all people, in principle at all times (although this last idea is rather difficult to square with the commonly held perception that values somehow evolve as the human race becomes more “enlightened”). One of the consequences of this approach is that the concept of “British values” becomes almost an oxymoron, and we tend only to list amongst them things like “fairness,” “democracy” and “respect for the rule of law” which we would advocate that all people in all nations should adopt on the basis of their self-evident merit, arguing to that end along the lines of Kant’s categorical imperative.

Personally, I consider Kant’s philosophy to be unduly influential in our public debate, not least in his insistence on universalisability as a means of determining rules for what constitutes the good. While such arguments are helpful if we wish to compel others to adopt a mandated value perspective, or at least to behave and speak in public as if they did, much of what we really mean by “values” is not really universal at all; indeed it is often quite idiosyncratic and personal. Not only that, I would even propose that idiosyncratic value perspectives are crucial in bringing people together in social groupings and enabling the members thereof to see themselves as distinct from members of other groupings on the basis not only of what Aristotle would refer to as the telos, or intrinsic purpose, of the group but also of the values to which this telos gives rise.

What I am arguing, therefore, is that we should think of values as inhabiting intersecting spheres, mirroring the fact that as multifaceted individuals we ourselves inhabit separate spheres in our lives, such as work, family, sports clubs, choirs, and discussion groups, each with its own telos and its own values, some of which may be universal and others of which may be highly exclusive. What I want to emphasise is that those which are universal are not necessarily higher or more important than others. Indeed they probably only offer a lowest common denominator. Who would wish it to be said as their epitaph only that they always did what was required of them? Or that they never strayed even once into political incorrectness? Surely a life replete with value has to go beyond the mundane and be infused with some idiosyncratic personal passion?

Problems with Conflicting Values

Another common misunderstanding—and this brings me on to the question of identity—is the opposite one: that when we feel the values of some particular group are antithetical to our own, there is some onus on us to show respect for the values of that group and indeed for whatever is the object of their valuing. To my mind that is not just nonsense but dangerous nonsense. The group and their values are different precisely because they are idiosyncratic, not universal. To demand that we respect the values and the valuing of another group is to suggest that we should adopt in some measure their idiosyncrasies. But in order to do so, we must at the same time relinquish hold on some of those associated with our own group (and identity). Thus we are required to pay homage to the values of groups to which we do not belong, and so to sublimate our own values. Presumably the other group is expected to reciprocate. As can be seen, if we were to take this process to its logical conclusion, our very identity, shaped as this is by the groups we belong to and the aims and activities we share with their members, would be undermined.

Much of such discussion about respect for other people’s values revolves around the idea of rights, which of course are inextricably linked with universalisability. Human rights law does indeed protect freedom of expression and freedom of conscience. Consequently there is an onus on each of us to respect the rights of individuals and groups to hold and give expression to their values, but not necessarily to respect the values (or objects of valuing) themselves. Of course, if the expression of “values” is antithetical to universal rights held by others, or worse illegal, even the right of expression is curtailed.

In summary, “values” may by their nature and the role they play in our social lives divide us as much as they unite us, and to believe or wish otherwise is not just mistaken, but potentially dangerous as it may result in drawing groups into unnecessary conflict and undermining their ethos and the seminal role they play in underpinning civil society. Protecting spheres of value requires a judicious measure of separation to be maintained between social groupings. There can be such a thing as too much “unity”.

Problems with Multiple Identities?

As I have suggested above, the idiosyncratic values we hold to are a reflection of the social groupings we belong to (or used to) and vice versa. These give rise in a natural way to our identity, which will naturally be multi-faceted. This state of affairs and the recognition of the ultimate incommensurability of diverse value perspectives has given rise in the modern era to the postmodernist narrative, or perhaps I should say multiplicity of narratives. (How this squares with the purportedly necessary Kantian condition of universalisability of values has never to my knowledge been elucidated.) In accordance with this perspective, no individual or group has the right to prioritise their perspective or values over any other.

Well that’s the theory. The reality in the UK and indeed across Western society in general is of course what has become known as the multi-cultural society, whereby non-indigenous communities are encouraged to hold fast to their separate identity and culture, express their grievances against the majority community and demand preferential access to resources; and are often permitted, encouraged even, to flout the law, as with the so-called sanctuary cities in the US and the refugee camps at Sangatte. Thus we enter the realm of identity politics in accordance with which the basis of this entitlement to assert one’s identity or culture is a sense of victimhood, or the experience of prejudice or microaggression at the hands of the “majority” community. We have veritable industries now established in our universities manufacturing theories and devising ever more ways to fan the flames and identify new injustices which had hitherto gone unnoticed. How this can be reconciled with the idea of a society of universal shared values is hard to fathom.

But it does not even stop there. Under the postmodernist agenda new minority communities are all the time being identified, for example through the agency of gender dysphoria which has gone within a matter of years from being a pathological psychological state to being the major battleground in the crusade to evict inherited/traditional values from their erstwhile home at the centre of society. What was previously referred to as the LGBT community has become a veritable alphabet soup where soon we will be running out of letters to represent all the rainbow of gender perspectives which the theory (or, absent that, the ideology) seeks to accommodate. And don’t get me started on the haves versus the have-nots debate (where interestingly it appears uniquely to be the minority who are deemed to be the oppressors!).

Under this narrative, the authenticity of the perspective expressed is deemed to arise not from any coherent philosophical perspective or historical narrative but from the grievances which are evinced, so civilised discussion is barely possible, only capitulation lest one be seen as manifestly part of the problem and labelled as embodying this or that phobia or -ism.

Where Does This Leave Us?

This whole business seems to me to be a misuse of the idea of identity. From the perspective I outlined above, this should be about shared values within a community or social grouping, not shared grievances and enmities. Interestingly I would see the new revanchist nationalism evident in the US and Europe (it was rarely if ever absent anywhere else in the world, so is seldom remarked upon other than in Europe or North America) as a reaction against the perceived injustice of precisely this privileging of minority over majority interests. Unfortunately the manifestation of this tends all too often to be again through the expression of shared grievances and enmities, which is not really leading towards a resolution.

It would appear there is urgent need to put positive values back at the centre of the concept of identity and indeed of our moral/societal/political discourse.

Adam Smith and the Rationality of Self-Interest


Since Adam Smith the prevailing view in economics has been that the free market operates through a principle of rational self-interest. Much as Darwin later identified the underlying mechanism for the variety and dynamism of nature operating at the individual level, so Smith atomised the creation of wealth to the individual’s self-interest: “It is not from the benevolence of the butcher, the brewer, or the baker that we expect our dinner, but from their regard to their own interest”. The notion of rational self-interest, though, needs to be subject itself to rational scrutiny, as it may contain assumptions about human nature which may limit the idea of the type of society which is possible.

Taking Smith’s own assertion at face value, what is it that constitutes the traders’ “own interest”? Clearly, making a living for themselves, which means the buying and selling of goods to and from others, the point being that trade presupposes the existence of others going about their business. Although we can safely assume that Smith had in mind an economy of more than three or four persons, and sustained by more than meat, beer and bread, pleasurable and sufficient as that may sound to some, for the purpose of this thought experiment let us assume a minimal economic model of four players, the butcher, the brewer, the baker and “I” representing the expectant diner. In such a model, it seems clear that whatever the self-interest of each individual is, it cannot be considered in isolation, but only in relation to the self-interest of others. The three traders and “I” rely on each other and can only participate in the market if each is solvent. Therefore, logically, trade in this state is not a zero-sum game, but depends on a certain level of parity, in which only incremental competitive gains are allowed.

Now, suppose that one of the traders defects from this cooperative model in order to gain an economic advantage over the other two. This could be due to simple greed, or it could be due to a fear that one of the others will jump first. In game theory, a branch of mathematics concerned with the logical outcomes of people behaving rationally under given conditions, this situation is known as the prisoners’ dilemma. Named after a specific example, it states generally that when a player has more to gain individually by cheating than by cooperating with a partner, but more to gain by cooperating than by both cheating, the two will nevertheless both end up cheating and so with a worse result than if they had cooperated. The reasoning runs as follows: if I cheat I will end up with the best result (even though the other person will end up with little or nothing); I would like to cooperate, but if I can think of cheating so can my partner, and if my partner cheats I will end up with little or nothing; therefore, it is in my interest to cheat. The logical result of rational self-interest is that both partners cheat and end up with less than if they had cooperated.
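This logic can be made concrete with a minimal payoff table. The numbers below are illustrative only, not drawn from the text; any values with the same ordering (exploiting > mutual cooperation > mutual cheating > being exploited) produce the same conclusion:

```python
# Illustrative prisoners' dilemma payoffs (higher is better for the player).
# Keys are (my choice, partner's choice); values are (my payoff, partner's payoff).
PAYOFF = {
    ("cooperate", "cooperate"): (3, 3),  # mutual cooperation: both do well
    ("cooperate", "cheat"):     (0, 5),  # I am exploited
    ("cheat",     "cooperate"): (5, 0),  # I exploit my partner
    ("cheat",     "cheat"):     (1, 1),  # mutual cheating: worst joint outcome
}

def best_response(partner_choice):
    """Return my individually optimal move given the partner's choice."""
    return max(["cooperate", "cheat"],
               key=lambda me: PAYOFF[(me, partner_choice)][0])

# Cheating dominates whatever the partner does...
assert best_response("cooperate") == "cheat"
assert best_response("cheat") == "cheat"

# ...yet mutual cheating leaves each party worse off than mutual cooperation.
assert PAYOFF[("cheat", "cheat")][0] < PAYOFF[("cooperate", "cooperate")][0]
```

The assertions capture the dilemma exactly: each trader's dominant strategy is to cheat, and yet the resulting outcome is jointly inferior to cooperation.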

Suppose that the baker, in order to gain a competitive advantage over the butcher and the brewer, starts selling meat and beer, judging that “I” the customer will flock to his store for all my necessities; if he succeeds and drives the butcher and brewer out of business, he will have gained all my custom and “I” will have gained a more convenient shop. On the downside the baker will have to diversify the business, which will require more work and may result in a loss of edge in the former area of expertise, opening the potential for targeted competition. The baker will also have lost two important suppliers and customers, and potentially made two enemies. From “my” perspective, disregarding the loss of esteem “I” may have had for the baker (for the moment), this places me in a more vulnerable position economically as, if the baker were to go out of business, “I” would have nowhere to buy my victuals.

There is another scenario: in this one the brewer and the butcher do not fold but respond to the baker by similarly diversifying, thus depriving the baker of any advantage gained by jumping first. They gain no advantage over the former cooperative scenario and take on the disadvantages that the baker had previously assumed; there is not even the prospect of my undivided custom. However, there is a payoff if the brewer, butcher and “I” conspire to deprive the baker of trade. Some experiments have looked at the relationship between our sense of fairness and spite. They turn on adding a new element to the prisoners’ dilemma. If the option for the exploited to pay for the punishment of those who defect is added, the outcome is very different. Despite the exploited losing even more, they experience satisfaction at seeing the exploiters punished. Moreover, in future rounds group cooperation is far more common.
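A toy model of this punishment variant shows why defection stops paying once punishment is on the table. All the numbers here are made up purely for illustration; the point is only the ordering of outcomes:

```python
# Hypothetical per-round payoffs for the punishment variant of the dilemma.
GAIN_COOP = 3      # each trader's payoff when all cooperate
GAIN_DEFECT = 5    # the defector's gross payoff from exploiting the others
PUNISH_COST = 1    # what each punisher pays to impose the punishment
PUNISH_FINE = 4    # what the defector loses per punisher

def defector_payoff(num_punishers):
    """Defector's net payoff once the exploited club together to punish."""
    return GAIN_DEFECT - PUNISH_FINE * num_punishers

def punisher_payoff():
    """An exploited party gains nothing from the round, then pays to punish."""
    return 0 - PUNISH_COST

# With two punishers (say, the brewer and "I"), defection now pays worse
# than cooperation would have:
assert defector_payoff(2) < GAIN_COOP

# The punishers end the round out of pocket -- they accept a loss for the
# satisfaction of seeing the exploiter punished, which is what makes
# cooperation far more common in subsequent rounds.
assert punisher_payoff() < 0
```

The apparently irrational act of spending resources on spite is what restores the rationality of cooperation for the group as a whole.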

In real economies, as opposed to simplified models or experiments, there is a huge capacity to absorb the effects of defection, to the extent that both perpetrators and victims might imagine that there are no consequences for the defector, hence no justice. This capacity is not unlimited, however, and the timescales for restitution – at least for exposure – are growing shorter in this increasingly connected world. Humans are highly attuned to fairness or the lack of fairness in a situation. This may be one of the reasons for the continuing appeal of socialism; it responds at a deeply atavistic level to the inherent injustice of so much of the world’s economic poverty and institutionalises grievance against those who are seen as unjustly favoured (such as bankers in the current climate). The same is probably true of the wave of populism sweeping the developed economies which harnesses, similarly through partial truths and vicarious appropriation, the dispossessed’s resentment against the winners from globalisation.

Keynes was one of the few economists who attempted to integrate human irrational impulses into his economic theory. Mostly, though, they have been ignored in the pursuit of pure rationality, exemplified by the extreme mathematization of orthodox economics. Rational self-interest as a real-world strategy does not exist in a solipsistic vacuum, however, but must take account of human feelings and sociality, even absorbing short-term disadvantages for longer-term benefits. Most economists despair at the irrationality of voters who turn their backs on the benefits of the free market, specifically global free trade, in favour of the planned economies of socialism or the protectionist policies of the right wing populists. In light of the scenarios considered, though, this does not necessarily violate the principle of rational self-interest, but it reveals that in open societies the concept is more complex and subtle than often thought. Swings in political culture, while manifesting irrational tendencies, may from a broader perspective be reinforcing economic rationality by reining in the irrational outcomes of defection from cooperation, that defection being entailed by supposedly rational objectives.

It is a fact that free trade has had a beneficial effect on a global level by bringing millions out of poverty, but also that in doing so it has had a devastating effect on traditional jobs and communities in the developed world, not to mention the effect it is also having on the environment. It is little comfort to be told midway through life that one must retrain for a new career in the digital economy because one’s job has been exported, and that one must be prepared to uproot oneself, family and community. These people vote; and in line with rational self-interest they will, in sufficient numbers, vote for those who promise an end to such deprivation, for this is less about declining living standards than about economic survival. Among these voters there are true believers; yet probably many more vote with suspended disbelief to punish those in power and the rich even at the cost of punishing themselves. When the euphoria of populism dies down and the reality of broken promises sets in, there will be a reaction and hopefully this will see movement towards a more cooperative economic culture, in which social concerns are integrated into the market ethos.







Mythopoeic Memory and the Oral Tradition


Memory is not a passive depository of facts, but an active process of creation of meanings. (Alessandro Portelli) 1

The relationship of memory to reality is something that we have all, at one time or another, had to face, not just the fact that our memory is unreliable, but that even our most cherished memories can turn out to be partly – even sometimes wholly – figments of our imagination.2 Dreams are an obvious case in which what we remember never actually occurred. Spanish surrealist Salvador Dali’s The Persistence of Memory, probably his most famous painting, plays on the themes of dreaming, time, space and memory. Its most noticeable feature is the number of timepieces and the fact that they are visually distorted. Like any surrealist work, it has been variously interpreted,3 but one persuasive reading is that it is suggesting allegorically that in memory time itself becomes meaningless, that there is a collapse of the past and the present (also represented by the eternal landscape of the desert in the background), and that the passage of time itself distorts the actuality, the facticity – if indeed there can ever be such a thing – of the events through which we pass as we remember them.

While not disputing the Dalian insight into memory, there is another perspective that can be added, which is that the passage of time rather than simply distorting the reality of events, by blurring the detail allows their true significance to emerge. This is what Hans-Georg Gadamer refers to as the ‘temporal distance’ established between a historical event and the present moment, mediated by a history of interpretation, which is the basis of our own understanding,4 or what Heidegger refers to as ‘the fore-structure of understanding’5. What we understand by a particular event is not ultimately a solitary act of comprehension, but is constructed from the collective discourse around the event, and not simply a contemporary discourse but one located within a historical framework: an interaction with one’s peers, with one’s elders, and with the historical record.

I recently attended a school reunion. Many years have passed and everyone has grown older. It is particularly poignant because the school closed a couple of years after I left, ending a history of 500 years. So my year group is among the youngest at the meeting, and over the years the roll-call has become noticeably shorter. Since leaving school we have dispersed over the country, in some cases the world, and have made our own lives, staying in contact mainly through this annual gathering. Therefore, most of our shared memories, and certainly this is so collectively, are of the time we spent together at school. It is as if when we meet we are still schoolboys and not grown men with families and careers, in some cases grandchildren. Anecdotes about school life are shared, the reputations of faded or former athletes, scholars and villains resurrected and burnished in the glow of reminiscence.

The strange thing is how different our memories are; even though we were passing through many of the same experiences, we recall them differently; some things we remember – or think we remember – that others have no recollection of at all, until what we remember is no longer an individual act but a shared act of collective narration, and vicarious recollection, in which we no longer discern (or wish to discern) the difference between personal memory and folk memory. Given the impossibility of omniscient grasp of an event, this is how many memories are constructed: through a contemporaneous retelling of collective narrative. Memory is, therefore, linked to the significance of events;6 but whereas memory is the construction we retain within our own experience of the world, the significance of events is more apt to exist at an institutional level to which we contribute and from which we draw. This takes place whenever and wherever people come together – in a family, at a school reunion, in a gathering of colleagues, among friends sitting around a campfire – with the young or novices initiated into the event. In such cases, an event is no longer singular, but becomes part of a tradition.

Oral traditions and the collective memory are fundamental to social ontology, to the sense of meaning that we acquire as individuals and to the continuing existence of social institutions. According to the anthropologist John Foley: “oral tradition stands out as the single most dominant communicative technology of our species as both a historical fact and, in many areas still, a contemporary reality”.7 If I ask my students what they consider to be the greatest invention of all time, I can guarantee that most of them will say the Internet. I can understand why they think so, to a degree; the Internet has transformed totally the functional aspects of our lives, lightening much of the drudgery: shopping for goods and services, waiting around for something to happen, or spending time in pointless travel. However, the Internet also poses an existential threat to human societies by privileging information over human intercourse and discourse, for which presence, Heidegger’s Dasein,8 is a fundamental prerequisite.9

In all cultures an oral tradition preceded the creation of a written one. In some cultures a written tradition never developed, and this underlies our tendency to think of these cultures as inferior or ‘primitive’. The advent of writing is considered to mark the transition from pre-history to history, as records preserve details of societies that we would otherwise struggle to understand. There is no doubt that there is much that is fascinating in the information we can learn from the study of the past, and our whole education and much of our civilisation is built upon such knowledge. However, I think that we are living through a time of excessive focus on information, and that this has made us undervalue the living history of oral traditions. Even when the literary remnants or artefacts of the past remain, in the absence of a living tradition they are like the fossils of some prehistoric beast that we can imagine but never truly know. Contrast that with the preliterate folk memories of the Australian Aboriginals, who still gather to speak of and act out their mythopoeic memories of the Dreamtime, the creation of the world, as they have done for tens of thousands of years.

Some traditions, like lost tribes, will die out, as our school reunions will end when there are too few survivors to continue them any longer. God forbid that anyone writes a book about them in the meantime; that would really be the last nail in the coffin and we would know that we had already passed into history.

Notes and References

  1. Cited on p.77 of Alistair Thomson (2011), op. cit.
  2. A considerable amount of research has centred on the reliability of witness statements in court cases. A spate of well-publicised ‘recollections’ of satanic abuse in the 1980s and 1990s turned out to be entirely fictitious. The resulting ‘moral panic’ reached government level in the US, spread to Britain and resulted in a number of innocent people going to prison. Daniel Yarmey (2001). Does eyewitness memory research have probative value for the courts? Canadian Psychology/Psychologie canadienne, 42(2), May 2001, pp.92-100; Nicholas P. Spanos, Cheryl A. Burgess & Melissa Faith Burgess (1994). Past-Life Identities, UFO Abductions, and Satanic Ritual Abuse: The Social Construction of Memories. International Journal of Clinical and Experimental Hypnosis, 42(4), pp.433-446.
  3. For example, as evidence of Dali’s mental instability, or as a comment on Einstein’s theory of relativity.
  4. Hans-Georg Gadamer (1994). Truth and Method. London: Continuum Publishing Group.
  5. Martin Heidegger (1962). Being and Time. (Trans. J. Macquarrie and E. Robinson). New York, NY: Harper & Row.
  6. Alistair Thomson (2011). Memory and Remembering in Oral History. In Donald A. Ritchie (Ed.), The Oxford Handbook of Oral History. Oxford: Oxford University Press, pp.77-95.
  7. John Miles Foley (1999). What’s in a Sign? In E. Anne MacKay (Ed.), Signs of Orality. Leiden: Koninklijke Brill, pp.1-2.
  8. Heidegger, op. cit.
  9. We all know the corrosive effects of human isolation, both psychologically and for society. This undoubtedly underlies the unrestrained aggression encountered on the internet, when views are aired unmediated by any presence. By contrast, many conflicts can be resolved by meeting and discussing differences with others.

The Rise of Populism Considered from the Perspective of Evolutionary Constraints on Our Moral Choices

May you live in interesting times (apocryphal Chinese curse)

Men are qualified for civil liberty in exact proportion to their disposition to put moral chains upon their own appetites (Edmund Burke)

A scholar of impeccable academic credentials once suggested to me that revolutions are spaced about the average lifespan of a human apart, about 70 years. I was sceptical, as this sounded like numerology, but I did some digging and there are indeed some interesting patterns: the French Revolution (1789) to the European uprisings of 1848 is admittedly only 60 years, but if the American Revolution (1776) is counted in, that is about 70; from there to the communist revolution in 1917 is roughly 70 years; and from the revolution to the fall of the Soviet Union (1989) is about 70 years. Of course these events are selective, and I am not suggesting there is a grand plan. However, they may point to an underlying truth: that real social change occurs in a highly disruptive manner, not as a result of gradual progress, and that this change is generational, as it takes the space of about two generations for the contradictions implicit in any system to become apparent and momentum for a new direction to grow to a critical point.
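The spacing claim is easy to check arithmetically, using the dates given above:

```python
# Dates as cited in the text.
events = {
    "American Revolution": 1776,
    "French Revolution": 1789,
    "European uprisings": 1848,
    "Russian Revolution": 1917,
    "Fall of the Soviet Union": 1989,
}

# The intervals the argument rests on.
gaps = {
    "1776 -> 1848": events["European uprisings"] - events["American Revolution"],   # 72
    "1789 -> 1848": events["European uprisings"] - events["French Revolution"],     # 59
    "1848 -> 1917": events["Russian Revolution"] - events["European uprisings"],    # 69
    "1917 -> 1989": events["Fall of the Soviet Union"] - events["Russian Revolution"],  # 72
}

# Each gap falls roughly in the 60-72 year band -- near the "average lifespan"
# spacing the scholar suggested, though hardly proof of a law.
assert all(55 <= g <= 75 for g in gaps.values())
```

Of course, as noted, the selection of events does most of the work here; the calculation only confirms that the chosen dates fit the pattern.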

The historian and philosopher of science Thomas Kuhn borrowed the term ‘paradigm’ from the obscurity of the social sciences and controversially applied it to revolutionary changes in scientific outlook, in the process reinvigorating the concept, redefining and enlarging it and, admittedly, setting it on course to become the de rigueur cliché for any and all sorts of change; it is surely, though, something that corresponds closely to a paradigm shift that we are living through. We are now, in the West, standing about 70 years on from the end of the Second World War, from a time when a transnational consensus was established around such institutions as the United Nations, NATO, the beginnings of the EU, the welfare state in Britain, the founding of modern Israel in the Middle East, the demilitarisation of Germany and Japan, the growth of the military-industrial complex in the USA, a period of US economic and political hegemony in general. Within this span many changes have occurred outside the western democratic sphere: the collapse of the Soviet Union, the emergence of China as a communist nation and then as an economic superpower, the resurgence of Islam as a powerful political idea in the Middle East and beyond, the stirring of real political and economic progress in parts of Africa and in India. In the West itself cracks are beginning to show in many of the post-war settlements and institutions, while there is a pervasive sense of economic stagnation and the loss of international leadership, manifest in the seeming inability to deal with the crisis of migration and endemic war in the Middle East. It is against such a background that we are seeing the arrival of a new kind of politics, anti-establishment and populist in its appeal.

Commentary on the spectre of populism tends to alight on its most prominent figures, such as Donald Trump, Nigel Farage, Marine Le Pen or Geert Wilders, focussing on their oddity, unsavoury characteristics or questionable beliefs. Although in the main it seems to be a manifestation of right-wing politics, there are also populists on the left, such as Bernie Sanders, Podemos in Spain, Beppe Grillo and the Five Star movement in Italy and Syriza in Greece, as well as Jeremy Corbyn in the British Labour party. It is interesting to note that a significant number of voters in America switched from Bernie Sanders, an anti-establishment left-winger, to Trump, rather than Clinton, an embodiment of the liberal elite. Populism has perhaps less to do with the particular political flavour than with its anti-establishment stance and the identification, even if to some extent a fabrication, of grievance and loss within a significant proportion of the national population, a loss – whether of identity, jobs or prestige – caused by the policies of the liberal establishment, an establishment, moreover, that has profited by and large from these same policies. However, I want to propose the hypothesis that, underlying even these more obvious political triggers, there is an actuality – not a perception – of moral decline that should be worrying us far more than it actually is.

Since such a hypothesis is going to enrage some people even before it is explained, let me first set out what I am not saying. I am not saying that people now are worse than they were in the past; human nature does not change much over time. The general belief, however, is that we are much better, and this is an illusion. We may kill people less, at least in the developed world, but this is because of advances in wealth, technology, political structure, religious belief and law. We may believe we are more generous, and cite increasing donations to various causes, but altruism is a natural human trait which is found across all cultures and has no correlation with money (although self-congratulation may). In this context, it is worth noting the rise in xenophobic attacks following the EU referendum vote and Trump’s election, which demonstrates how fragile the shackles of self-restraint are. Or consider the pervasive maliciousness of the online world, which exists outside conventional social restraints.

The roots of moral life from an evolutionary perspective lie in our sociality, and the natural institutions that both create and emerge out of that sociality. By this I mean those social bonds that are rooted in our biology: physical survival, protection, reproduction, genetic inheritance and genetic closeness. These are the underlying infrastructure – if one can call it that in want of a better term – of our sociality and the institutions of family and community, such as marriage, parenthood, kinship, friendship and economic occupation (which includes any activity to support the family and have standing in the community, whether that be hunting, farming or banking). There is no human society in which these things have not been fundamental, despite whatever other advances or changes have occurred. Societies have always flourished at a time when these institutions have been strong; and no society historically has flourished when these have been weak, neglected or under attack.

Moreover, in evolutionary terms, beyond our mere physicality we have an 'excess' in our neurological constitution (which we variously refer to as mind, soul or spirit) in which we entertain beliefs about the world of our experience. Interpretations of what this means for our self-understanding vary enormously (my own view is that in the primal state it is a survival mechanism), but whatever the ontological reality of our beliefs, they must not fatally undermine our natural sociality; if they do, they will be eliminated by natural selection. On this point I am closer to Richard Dawkins than to Julian Huxley, grandson of Darwin's 'bulldog' T. H. Huxley, who believed that we have transcended natural selection. This selective process eventually manifests itself through the political process, particularly in times of upheaval.

The crux of my argument is that the liberal establishment has allowed, and even facilitated, the erosion of this evolutionary infrastructure of sociality, and that this has had a disproportionate effect on the less well-educated, less mobile and less wealthy, who in any society constitute the majority. Liberalism has not simply allowed the export of blue-collar jobs to countries where labour is cheaper (often to be replaced by jobs that pay too little for a person to buy a home, marry and raise a family), but, more seriously, has persistently undermined the foundations of family and community which enable social solidarity to emerge. Economic hardship alone, while an important contributing factor, is not sufficient to accomplish this. For the past fifty years the liberal establishment, consisting primarily of academics, the media and the entertainment industry, has advanced an agenda that undercuts the foundations of social solidarity: marriage as the unique core of family life, the historical narrative of national identity, and religion as one of the core facilitators of communal life. This agenda has gradually been institutionalised in education, law and politics. Whatever one's political views, when teaching children gender fluidity becomes a greater priority than ensuring a sound basic education for all, and when celebrating diversity becomes more important than celebrating full employment, we are entitled to wonder whether liberalism has reached its hubristic apotheosis.

As much as I am an advocate of maximising human freedom, freedom comes with its own built-in constraints: an internal logic dictating that freedom cannot undermine itself, that is, allow actions that result in its own destruction. These constraints are those determined by our evolutionary heritage and by the institutional structures, referred to above, that emerge from it and constitute our social being. That people choose not to marry or become parents, or choose to have children without marrying, or choose to divorce, or choose an alternative lifestyle: these are individual choices and rights in liberal democracies. Nevertheless, they have social consequences, and if these tendencies become prevalent they have demographic consequences. The influx of immigrants into Europe, for example, is argued by some to be a necessity given the population shortfall created by a low native birth rate. This, of course, is not what people want to hear. We have become used to thinking about our sociality in purely individualistic terms, in terms of our freedoms and rights and of the social reality in which we want to live; but just as, for all our cleverness and ingenuity, we cannot ignore fundamental forces such as gravity, so, for all our social experimentation, we cannot ignore the evolutionary parameters of our being without consequences.

It is at the interface between individual choice and social necessity that the most interesting political choices are made and the most virulent arguments take place. It is likely that this argument will never be finally settled, as this dynamic of competing views is at the heart of democratic culture and ensures its adaptability to changing circumstances. Excessive liberality is bad for societies, just as excessive authority is, and when pushed to one extreme a counter-force inevitably appears. Populism, therefore, can be seen as a collective, unconscious reaction to the ills that plague modern liberal democracies. If it had not been Trump, Farage or Le Pen, other figures would have arisen with similar grievances and similar policies.

Therefore, to assert categorically that populists represent the doom of democracy is to be entirely enclosed within a dying paradigm and to misunderstand the underlying dynamic of the paradigm shift they represent. To accuse them of being by definition anti-democratic is to forget that democracy was in the ancient world, and has been in modernity, a revolutionary force, representing the wishes of the mass of people more closely than any other system of governance that has existed. It is also to be wilfully blind to the gradual hijacking of this system by self-perpetuating elites who have empowered and enriched themselves. As long as people generally felt they were making progress, they were willing to acquiesce to the elites in Washington or Brussels; but since the financial crisis in particular, with the ever-widening gap between the rich and the poor, or those simply struggling to stay afloat, there has been a growing anger on which populism has capitalised.

Nevertheless, there is a great deal of uncertainty and potential danger in these developments. In a recent article, the author Robert Harris argued that the political situation today resembles that of the 1930s more closely than at any time since. I do not think we are even close yet, but the signs should serve as a warning. Berlin in the 1930s was not only a place of great social unrest, economic turmoil and political agitation, but a byword for moral turpitude. Lest people think these factors unconnected, the National Socialists made great play of their intention to clean up Germany morally, which was one factor in their gaining popular support. The present populists are hardly moral paragons, and tend on the whole to be morally liberal, but they advance authoritarian policies which could be a step towards allowing more extreme policies to follow.

Karl Marx expected the communist revolution to take place in Britain, the most industrially developed country of the nineteenth century; that it did not may be due in part to the great reforms of the Victorian era, in particular those bearing on the spiritual, educational and economic conditions of the working classes. The populist platform today could be derailed by centrist parties having the courage to undertake reforms of similar magnitude. Having said this, I have no expectation that people today will willingly change their behaviour: it is human nature to resist difficult choices, moral or otherwise, unless circumstances force our hand. My hope is that our existing institutions are strong enough to withstand the uncertain times into which we are moving, and that in hindsight we may be able to view this period of history as a readjustment of the balance between freedom and moral obligation within democratic society, rather than the beginning of a civilisational holocaust from which we must build anew.


Capital Punishment: Marx, Markets and Mortgages

By James Walker

James Walker is the editor of the literary graphic novel Dawn of the Unread, created in response to alarming literacy statistics among young people across the UK. What alarms him now is how a changing labour market is making it impossible for his son to get on the housing ladder. What is required, he believes, is greater economic literacy; to that end he has joined a reading group exploring Karl Marx's Capital Vol. 1.

First off, let’s have a few statistics about the miserable mess we’re in. I’m not talking about Brexit, Trident or Sam (Big Sam) Allardyce becoming the England manager. I’m talking about two four letter words that define our lives: work and home.

According to the Ministry of Justice (MoJ) there were 10,732 repossessions of rented and mortgaged homes by bailiffs between January and March. Although this was down by 123 on the same period in 2015, it was up by 479 on the final quarter of 2015. But we should be grateful: the Council of Mortgage Lenders (CML) believes that if repossessions continue to drop at the current rate we'll see the lowest annual numbers since 1982. Back when houses were affordable.

There are some reasons to be cheerful in terms of buying a property. The standard variable rate for a mortgage has plummeted, and a rise in stamp duty has somewhat deterred property developers from swallowing up entire streets. But this has been offset by the ridiculous increase in house prices, which simply makes it impossible for anyone to save up a deposit, let alone get a mortgage. I bought my first home when I was twenty-one and it cost roughly 3 times my annual wage. My current home is 7 times my annual wage. The house is the same size.

This may explain why rents in both the social and private sectors have risen this year by around 7-9%. The landlords who've had their wings clipped by the Chancellor are passing the cost on to those who can't afford to get onto the property ladder. According to the MoJ there were 10,636 evictions during the first quarter of the year. Expect this to increase, as the cap on housing allowance kicked in at the beginning of April. Then there are the 7.2 million who, according to Churchill Insurance, have moved back in with their parents because a relationship ended and they are too poor to rent alone.

For those without the luxury of parents, there are the streets. You always know when the privileged are in power because the number of people 'begging' zooms up. On an average walk across town I probably get stopped between 5 and 10 times for 'a spare bit of change'. Expect more of this as hostels, Citizens Advice and public sector support staff increasingly evaporate.

What we really need is change.

Speaking of which, banks have a lot of loose change at the moment. They've saved a bundle in wages by adopting the trend set by supermarkets and kitting out their branches with self-service machines. The unidentified item in the bagging area is staff. People are losing their jobs in every area of work as technology slowly takes hold. Ring up for a taxi and you'll no longer be put through to a call centre of eternally bored operators; instead there's an efficient automated service that tells you where you want picking up from before you've even said a word. And you know things are seriously wrong when Waitrose gets in on the trend and dismisses checkout staff in favour of self-service machines.

Banks need to cut back on wages because they've finally been caught with their pants down. According to the CCP Research Foundation, the top twenty banks paid out £252bn in conduct charges over the past five years: six banks were fined a record £4.3bn for rigging foreign exchange rates, and Lloyds paid a £4bn penalty for mis-selling payment protection insurance. So why exactly did we bail out the banks again?

According to the Sutton Trust, the poorest British students will graduate with debts in excess of £50,000. (In the US, by contrast, where students study for an extra year, the average debt at a private for-profit university is £29,000.) Although repayments on state-sponsored loans are linked to future earnings, the debts themselves accrue interest linked to inflation, so the amount owed keeps growing. Students who studied a decade or so ago will tell you that although their debts were a lot smaller, the loans have been sold off to debt agencies, despite the promise that they wouldn't be, and they now fear earning a penny above a certain threshold because it will trigger larger repayments.

For those of us fortunate enough to have a job there is the constant restructuring of departments and the shoehorning of two jobs into one, with reduced hours as an added bonus. Some of us have had our wages frozen for so long we have to put gloves on when we draw money out of the bank. We're told we should be grateful that we've got a job, and we're expected to smile when we receive the 'Happy Friday' email wishing us the very best for the weekend and reminding us not to be late back in on Monday.

For adolescents who've skipped further education there are the temp agencies, where you're guaranteed the minimum of work for the minimum amount of money. One lad I spoke to told me he had to drive to Grimsby to do a two-hour shift and wasn't paid for his petrol or for the four hours the round trip took. He had to do it because if he refused they wouldn't consider him for other work. Work left him out of pocket. Of course this is completely illegal, but it goes on all the time. 'Keep calm and carry on' is the expression. This translates as 'Shut up and do as you're told'.

Zero-hours contracts are the reality for most of us now. University lecturers are paid by the term and join an expendable workforce who can be got rid of with the flicker of an eyebrow. And this is where the Big Society steps in. The volunteers who run our libraries. The volunteers who cut down the forests. The volunteers who write for free for magazines because they have the deluded idea they can make a difference. So in some respects we’ve been complicit.

All of which finally gets me to my point. If we are expected to live flexibly in a big society on zero-hours contracts, isn’t it time we had a more flexible mortgage, a ‘zero-hours’ mortgage, to reflect the reality of our lives?

A zero-hours mortgage would work exactly like a zero-hours contract: if there's no work, there's no mortgage payment. Simple. It's not your fault that you're losing your job in the call centre to the latest Siri. If you do work a few hours, you pay a proportionate payment. Yes, calculating this could be tedious, but isn't that better than repossessing a home and putting a family out on the street, which is ultimately more costly for society?

A university lecturer told me recently that universities need to throw out all of their liberal newspapers and stock the Financial Times. He said that's where the power is: in the things people don't understand, the things that are deliberately made complicated. For this reason he believes economics should be at the heart of everything that is taught, no matter what the discipline. It's for this reason that I've joined a reading group where we are slowly working our way through Karl Marx's Capital Volume One, reading a hundred pages per week. It's complicated, but far more humorous and literary than I would have imagined. I don't believe in communism, and I certainly don't believe in capitalism in its current manifestation. All I know is that something isn't right at the moment and the system needs a bit of tinkering. Hopefully this book group, comprising PhD students, the unemployed, artists and others from Manchester, Mansfield and other places not necessarily beginning with M, will help me figure it out.


(James Walker is a lecturer and journalist. He won a 2015 Guardian Teaching Excellence Award for his efforts to improve literacy through the online graphic novel Dawn of the Unread. He has sought to promote Nottingham’s literary history and was the last person to interview the acclaimed novelist Alan Sillitoe. He was also director of Nottingham’s successful bid to become a UNESCO City of Literature. @TheSpaceLathe)


The Banality of Madness

"Those whom the gods wish to destroy they first make mad" (Longfellow, The Masque of Pandora)

"History repeats itself, first as tragedy and then as farce" (Marx, The Eighteenth Brumaire of Louis Bonaparte)


For the modern world, which does not believe in evil but only in blame and victimhood, the currents of the present must augur an existential crisis; for we lack the language, and even the thought, to encompass the monstrous nature of events such as that which took place in Nice on Thursday evening. Families out enjoying a celebration are now the targets in an asymmetric war in which the mundane becomes the source of terror and mayhem, and in which fellow citizens become a proxy army, infected as if by some alien virus.

Without mitigating in any sense the awfulness for those involved, there are reassurances in all of this. It becomes clear that this has little to do with Islam, a religion with a long and noble spiritual history; so, perhaps after a period of doubt, we can embrace our Muslim neighbours and compatriots as fellow sufferers and accept their condemnation at face value, not as merely self-serving platitudes. For finally the heart of IS is revealed not as piety but as pure evil, the offspring of madness and criminality. As their so-called caliphate loses ground in Syria and Iraq, their remaining currency is only death: indiscriminate death, to as many people as possible. For in their logic everyone is an infidel, whether Christian, Jew, Muslim, Sikh, Hindu, Buddhist or of no religion, if they live in the lands outside the caliphate or are not at war with their own country.

As the shock of repeated tragedies wears down our vocabulary of outrage and horror until it becomes clichéd, we risk not indifference or terror, but levity. The spectacular insanity of Nice, in betraying our expectations of reason, humanity, compassion and mercy, cannot be encompassed by the rational mind, but only by something like absurdist theatre. Cartoonists understand this well and frequently tread the fine line between humour and offence. There is, of course, an antidote to this: as the ghost of Marley reminds Scrooge in Dickens's A Christmas Carol, mankind is our business, and the common welfare.

The indiscriminate nature of such acts means we are all potential targets, but there is at the same time something comforting in knowing that this madness has taken on the whole world as its enemy and is unlikely to irrupt in our vicinity, or only as likely as anywhere else. We expect our government to take the necessary steps to protect the population, just as we would be wise to be more vigilant ourselves. We are going to have to live with this as the new reality for the foreseeable future, and factor in this risk, as we do others, in going about our daily lives. Beyond that, we should offer an empathetic Gallic shrug and get on with our business.

Foundations of the Moral Order

By Colin Turfus

Part 1: The Moral Basis of the European Project

The Two Pillars

For over half a century now a project has been under way to transform European society from what it was at the mid-point of the 20th century, a disparate collection of peoples possessed of distinct national identities and traditions, into a coherent unified whole based on principles of co-operation and solidarity. This project has been known by various names through its 60-year evolution but is now constituted as the European Union. Opinions vary as to whether the cost of what has been lost along the way is outweighed by the undoubted gains made in terms of both co-operation and solidarity, particularly when one compares the history of the European project with the half-century which preceded its inception, with its two world wars, both initiated in West/Central Europe. But it cannot be denied that, in terms of its growth in size and scope, it has been an extremely successful political project. To what can this success be attributed? And why is it that when, as at present, voices of dissent are heard ever more widely on a growing number of issues which are so obviously damaging the life chances and disrupting the lives of many individuals and families across Europe, so many continue to defend the track record of the European Union and maintain faith in its founding vision of a united Europe, indeed of a United States of Europe?

In a recent report on the subject, written for the Theos think tank [8], Ben Ryan argues that the founding vision was essentially a moral one, based on the twin principles or "pillars" of solidarity and subsidiarity. The former, he suggests, is captured in the May 1950 Schuman Declaration as follows:

“Europe will not be made all at once or according to a single plan. It will be built through concrete achievements which first create a de facto solidarity.”

while subsidiarity is defined in the glossary of the EU website [9] as a principle that

“…aims to ensure that decisions are taken as closely as possible to the citizen and that constant checks are made to verify that action at Union level is justified in light of the possibilities available at national, regional or local level.”

Specifically, it is the principle whereby the Union does not take action (except in areas which fall within its exclusive competence) unless such action would be more effective than action taken at national, regional or local level.

It will be my claim below that the success and enduring traction of the European project in winning the hearts and minds of citizens across Europe and beyond lies in its appeal to these two principles; indeed, that these two principles lie at the heart of any functioning moral framework.

Following the setting out of the founding vision in the first part of his report, Ryan goes on to describe in a second part how the original moral and spiritual vision has been lost and suggests how the various crises which are causing increasing disillusionment with the European project in a growing number of countries are a consequence of this. The third and final part of his report is then devoted to his proposal for “putting a soul (back) in the union.”

While I concur with the broad conclusions Ryan draws in his second part, I will seek to argue that what has gone wrong in the European project is not adequately characterised simply as a loss of moral vision: it must also be seen as a failure to grasp the nature of the moral order and to understand its foundations. On that basis I will draw my own conclusions about what can be done to restore the lost moral dimension and provide a new direction for development in the UK in particular, and more generally across the European continent.

The Solidarity Principle

As I suggested, it is hard to argue against the solidarity principle. It is in our fundamental human nature that we share a common identity with all members of the human race. This has led to the framing of the golden rule, formalised by Kant, as we shall see below, into his categorical imperative. However, the key phrase I want to bring out is "common identity." Whereas some aspects of our identity are shared with all our fellow human beings, others are specific to the smaller groups or communities to which we belong. The fact that solidarity is premised on a shared identity, and rightly so, inevitably means that a greater degree of solidarity is felt for those with whom we have more in common and with whom we "identify" more strongly. This point was well brought out by David Goodhart [10] in his analysis of UK post-war immigration, where he pointed out the unresolved tension at the heart of the "multicultural society": the separate identity of minorities is promoted while at the same time the inculcation of a universal sense of solidarity is sought.

It is likewise human nature to feel greater solidarity with those to whom we feel most closely connected. But when greater "solidarity" is advocated these days, it is often not in relation to those surrounding us in our daily lives but to people in other countries or remote parts of the world, about whose lifestyles and circumstances we know little but who are perceived or portrayed as being in need. I would suggest that this is a misuse of the concept: solidarity is something that arguably should exist independently of the needs of those with whom we feel it, and it furthermore tends to be mutual. Such "solidarity" with relative strangers is more accurately characterised as sympathy or compassion resulting in an expression of support: no less a virtue, but a different one.

Within the European Union, one of the main avenues for the expression of solidarity is the so-called Solidarity Fund, whereby the cost of projects in less-developed areas of the Union (or even in accession states) is subsidised by the more developed nations. This is done in such a way that the European Union itself, and not the donors, is perceived as the origin of the funding; indeed this perception is reinforced by the imposition of large plaques which must be displayed, on pain of hefty fines for failure to comply. In this way the European project has been able to expand and sell itself successfully to ever more countries, to the point where it is now running out of European countries and starting to talk about membership for Turkey. This is perhaps not surprising when one re-reads the excerpt from the Schuman Declaration above and realises that solidarity is there defined not as an end in itself but as a means to fulfilling "the plan" through "concrete achievements" (their words, not mine).

Another point I would make about solidarity is that it is a property associated with a community rather than with an individual, whereas sympathy and compassion represent the personal response of an individual. As I have already mentioned, they are also conditioned on the circumstances of another person, which elicit the response. Further, in the age of the welfare state and universal care, meeting the needs of those facing disadvantage or hardship in developed societies becomes less and less the responsibility of individuals motivated by compassion, and more the responsibility of government and (publicly funded) NGOs. So it is natural that advocating more funds for such purposes becomes a substitute for a direct expression of compassion for those whose needs we learn of through first-hand experience (rather than sound-bites on the BBC News at Ten or artful photojournalism). Even "charitable work," for most of those who engage in it in a voluntary capacity, consists of raising funds for organisations whose charitable outreach is invariably done these days by (well-)paid professionals.

So if UK taxpayers (or those of any other country) are to make funds available to support infrastructure development projects in other parts of Europe, let the case be made by our, or their, elected politicians as to which projects should be supported where; and let us have our say on the proposals in an election. Then it really will be solidarity, and the satisfaction we feel will be all the greater for it, as will the appreciation and recognition of the recipients for that which is freely given. We in Britain already give about twice as much in overseas aid, as a fraction of our GDP, as any other country. It is not as if we need the example of our partners in Europe to find generosity in our hearts. Though to hear how we are criticised by them for our lack of "solidarity," one might easily imagine the situation to be otherwise.

And let us bear in mind that solidarity is mainly about our relationship with those with whom we live in community. Yes, it makes sense to feel and demonstrate solidarity with the Syrian refugees or the Polish immigrants who have moved in down the road or whom we meet at the local school. But our duty towards those who may be facing difficulty in Poland or Syria is a different matter. We should take care lest our attempts to address their issues, based on "compassion" rather than familiarity with local circumstances, and driven more by a desire to salve our conscience or signal our virtue, do more harm than good and are rewarded not with reciprocated solidarity but with accusations of meddling or even "cultural imperialism."

The Subsidiarity Principle

If the problem identified above, the hollowing out of the concept of solidarity within the European Union to the point where it is largely about the enforced transfer of funds, is acknowledged as meriting consideration, the shortcomings in this regard pale into insignificance beside the obfuscation and disingenuousness that surround subsidiarity.

But first, why is subsidiarity important in a moral context? As I stated above, the concept of solidarity (or, if you like, empathy) gives rise to the golden rule and provides the justification for Kant’s categorical imperative which, I shall argue in Part 2 below, is the foundation of the modern doctrine of human rights. If the enumeration and enforcement of such rights were a sufficient condition for the establishment of a harmonious world order (or even a single nation), nothing further would need to be said and the argument for subsidiarity would be more difficult to make. But for reasons I shall return to below, human rights have become problematic in a number of ways.

The essence of the problem is that values, and consequently what we see as "rights," have a habit of turning out to be incommensurable with one another. This reflects the fact that our values are intrinsically connected to our identities, which are in turn shaped by our history (personal and national) and our community, or, more properly in a modern context, the diverse communities, real and virtual, in which we live out our lives. I like to think of this multiple connectedness in terms of an individual living at the intersection of multiple hyperplanes, each with its own set of rules and conventions. As we move around on each hyperplane we follow the conventions appropriate to that social context, which are shared either implicitly (between friends) or explicitly (as, for example, in the workplace). As long as activities on different hyperplanes remain partitioned, this works fine. But such separation is not always possible, since the hyperplanes intersect. Moreover, particularly in the modern multicultural society, the rights of diverse groups thrown together in community to live according to their traditional values and lifestyles may explicitly prevent others from doing so. Whereas in the past this was seen as a problem mainly for the newcomer or immigrant, the pendulum appears to have swung the other way, to the point where it is the incumbents who are expected to make concessions when a conflict arises.

There are two opposed approaches to addressing this issue. One is to seek a reduction in the dimensionality of the hyperspace and impose, as far as possible, a one-size-fits-all set of rules to which everyone is expected to conform. This is the essence of the human rights approach, whereby making society "fairer" comes to be about identifying groups who are disadvantaged or discriminated against and seeking redress through expressions of "solidarity": publicising the purported injustices and obliging others (through the courts if necessary) to acknowledge those rights explicitly and to "respect" the chosen lifestyles or belief systems of those asserting them. One can certainly see a strong strand of this way of thinking in the policy direction pursued by those driving the European Project.

The alternative approach is based on subsidiarity, whereby we allow established hyperplanes to persist and manage conflict between them according to agreed rules or compromise. Communities, once established, are wherever possible entitled to self-determination, unless some good reason can be given as to why this is antithetical to the greater public good (a Kantian categorical imperative). Interestingly, this second approach appears to be aligned with the idea of the multicultural society, and I would argue that it may be. But it does not mean, as advocates of the multicultural society often argue, that we should necessarily side with the minority community and offer them preferential treatment. Much more could be said on this, but the point is not central to my main argument.

In the context of the European Union, though, the half-heartedness of its commitment to the principle of subsidiarity is evident in the very language used. For example, areas where the Union has “exclusive competence” are explicitly excluded from challenge by the subsidiarity principle. But one of the biggest criticisms made against the European Project is the avidity with which it arrogates competences to itself at the expense of national parliaments, and the anti-democratic nature of such behaviour. Of course, the fact that power was previously exercised at a subsidiary level but then arrogated to the centre is clear evidence that the principle of subsidiarity is being turned on its head. How is this allowed to occur? The answer is again clear from the EU’s own words: checks are made, it is claimed, to ensure that action is not taken by the Union which would be more effective if taken at a local level. But who is carrying out the checks and making the decisions? We know the answer to that. And what visibility is granted to lower-level authorities of the decision-making processes which would allow them to influence the outcome? Clearly, subsidiarity properly understood is something to which the EU is likely only ever to pay lip service.

So we come to understand the crippled state in which the EU now finds itself, where it maintains its popularity by identifying and paying homage to the twin pillars of solidarity and subsidiarity which support the moral order. But it has hollowed out the one and turned the other on its head. The consequence is that it heaps criticism and contempt on countries and groups which challenge its one-size-fits-all policies, policies presented as if they were categorical imperatives but actually hypothetical, conditioned on the support they provide for the advancement of the European Project. It evades the need to universalise its arguments by vilifying the disenfranchised, impoverished masses who suffer the consequences of its misguided economic and social policies, justifying its anti-democratic approach on the basis that its critics are populist upstarts, led on by demagogues and racists and pursuing a self-serving nationalist agenda. It pretends to be listening but is only really interested in being seen to be listening. And rather than exercising subsidiarity, it gets lackeys like our Prime Minister David Cameron to trumpet the virtues of the European Project and ram it down the throat of the electorate with such force that his reputation and that of his chancellor will probably be damaged beyond repair. And indeed his own party will probably take a long time to recover from the battering it has visited on itself in recent weeks.


In the second part of this essay I shall develop a framework for thinking more broadly about the establishment of a moral order in which some of the issues we currently face, not only in Europe but across the globe, might be better addressed.




The Intransigence of the Absurd: the Discourses of Racial and Sexual Identity in ‘Identity Politics’

There has long been popular and scientific fascination with feral children, reared and cared for by animals and with no contact with human society, who behave like the species they live among and, we assume, identify themselves as. Such behaviour is not limited to humans; there are many examples, particularly among domesticated animals, of individuals adopted by another species that come to assume some of the characteristics of that species. This suggests that what we call identity is a universal of higher intelligence, and that it is fairly plastic.

Humans, though, as is often the case, test this theory to destruction. An American woman, Rachel Dolezal, was recently denounced for identifying herself as black and living as a black woman, when her parents were both white; yet men who declare themselves to be women, dress as women and even undergo gender reassignment surgery are increasingly celebrated and accepted on their own terms, such as the much-publicised Caitlyn Jenner. Those who do not react viscerally to this conjunction and implied equivalence may be as puzzled as I am; but even those who do should reflect on why these two cases should be considered so different.

Something I read a few years ago struck me then – as it still does – as so outrageous that I struggle to convince myself that it was not an imagined memory rather than an actual one. It was a brief article in some sort of educational magazine, a serious article, not a spoof to the best of my knowledge. It stated, as proof of commitment to the principle of inclusivity, that a particular school was being kept open at night because one of the students, a girl – let us call her Samantha – was a vampire, and could only work at night. Putting to one side the issues of the veracity of memory, journalistic objectivity and the wisdom of local education authorities (their respective dysfunctions are legendary), the central issue is not whether Samantha was a vampire, because clearly she was not, but why some assertions have assumed the power of fact, when the only fact is the fact of assertion.

Throughout history people have always sought to establish and assert their identity, but this process is complex and its focus has shifted over historical time among the kaleidoscope of possible markers such as region, religion, wealth and education. However, the fundamentals of identity are always the same: a playing out of our twin desires for individual freedom, particularly that of expressing our individual difference, and belonging, in which we find and sustain our similarity with others. This process can occur at several levels, as part of our individuality derives from belonging to a hierarchy of in-groups, such as our specific family, neighbourhood, city, region and country, in distinction to a series of out-groups characterised by otherness. Importantly, the precise definition of the other – as outcast, rebel, stranger, outlaw, scapegoat or victim – has a role in our self-definition as not-other.

On the other hand, sometimes our individuality is itself a form of self-imposed otherness, where we alienate ourselves from the mass to which we implicitly belong, in an act of self-exclusion that arouses, at best, a sneaking reflexive admiration for the outsider hero – oneself – or, at worst, self-pity in the identification of oneself as victim. Paradoxically, this self-identification can become the basis for delusional group identity, in which there is a curious but toxic admixture of feelings of inferiority and superiority.

On one level it is strange that race is such a sensitive issue. After all, the boundaries of race are rather fluid, and science has never managed to establish a consistent or agreed definition. There are genetically homogeneous groups such as Icelanders, Ashkenazi Jews and the Japanese, but this is due to geographic and cultural isolation, and these pools do not correspond to what we normally call race, only to the more limited concept of ethnicity. Race and ethnicity are actually complex cultural artefacts, and never more so than when we talk about the labels ‘black’ and ‘white’. From an evolutionary perspective the terms are nonsense; the only people who perhaps have the right to a generic and widespread genetic distinction are aboriginal Africans, not because they are black in colour – the !Kung bushmen of Namibia, for example, have reddish skin – but because they do not have the 1-4% of Neanderthal genes that the rest of mankind has inherited from prehistoric interbreeding between the two human species. The mixing of peoples in the West, particularly in the Americas, means that genetic makeup and skin colour are on a spectrum of continuous variation; a surprising number of white Americans have some black ancestry.

Nonetheless, we can discern that the problematic nature of race resides not in biology but in history, and that what we call black culture is really a shared history, a history that includes slavery, prejudice, apartheid, persecution, ridicule, drudgery and social deprivation, but also the enormous personal, communal and political forces that have forged great social and cultural gains from such a disadvantageous position. What we see, in fact, is a historically subjugated part of a heterogeneous population seeking common cause to overturn past injustices, rather than a distinct and homogeneous entity. But in identity politics the narrative has assumed the status of a categorical assertion, wherein being ‘black’ is recognised as a necessary and sufficient condition for identifying oneself as part of a wronged community, which has become, perversely if understandably, a badge of honour. In its most radical form it assumes that dangerous polarisation of simultaneous inferiority and superiority referred to above, in which the mantle of the suffering victim and outsider can symbolically be asserted, not necessarily on the basis of experience (although many young black men can testify to being the subject of police harassment, known as ‘arrested for being black’), but simply on the basis of the colour of one’s skin. This was Rachel Dolezal’s perceived moral transgression: she assumed a badge of honour to which she was not entitled.

Interestingly, the older generation of radical feminists, such as Germaine Greer, apply much the same criterion of exclusion to transsexual women, regarding them as pretend women who have no right to assume the innate moral superiority of real women, achieved through resistance to male domination. Now they find themselves sidelined and – in a recent neologism – ‘no-platformed’ by the younger generation of activists. This disparity in the reception of the trans-racial and the transsexual is hard to explain on the surface. It may be partly due to the great strides made in women’s equality in the last generation, which have defanged the political radicalism of the earlier feminism, whereas racial equality lags behind; but I do not find this a persuasive answer.

I believe that underlying this phenomenon is something we could call the intransigence of the absurd: the assertion of something for which there is no scientific evidence, but which must be uncompromisingly defended by rhetoric and the layering of myth, most forcefully, naturally, by those who seek political leverage. A prototypical example is the assumed historical destiny and moral superiority of nationhood asserted by nationalists of all stripes. The notion of race is one such absurdity, including that of being ‘black’ or ‘white’ or ‘Asian’, which must be vociferously perpetuated by all those seeking to take advantage of individuals who need to ground their tenuous sense of self by ascribing identity or otherness to those who bear a passing similarity or difference to themselves; a notion, moreover, that must then be imposed on those who wish to make no such distinctions.

The reception of transsexuals into the community of women is based on a different narrative logic. The status of male and female sexual identity is so firmly established in biological reality that a man believing himself to be a woman (or vice versa) and acting the part is a patent absurdity, recognised as such by everyone. The deception therefore has a theatricality that is acknowledged on all sides, as it has been throughout history in many different cultures. There is a twist to this, however. There is no political leverage in mere acceptance of this theatre; therefore, human sexual differentiation has been mythologised in the notion of gender, a radicalised state of indifferance and a socio-political chimera that fuses two notions of moral transgression: that of non-acceptance of the myth; and that of the traditional boundaries of the sexes, which must be preserved in order to be wilfully flouted.

The sociological functions of race and gender ultimately diverge. Race is about belonging and exclusion, while gender has become about inclusion and freedom, specifically the freedom to define one’s sexual identity. Both notions are part of the narrative of how we establish social identity in a complex world, and should be tolerated on that understanding. However, we should never lose sight of their fundamental absurdity in inverting reality. That absurdity correlates strongly with an intransigent defence of the absurd; having abandoned evidence, it is not too great a step to abandon reason, openness and a willingness to entertain alternative viewpoints.

Asserting identity should be – as the word implies – about seeking universality above all, as a basis for accepting diversity. Identity politics does precisely the opposite. By repeatedly invoking historical injustices and incubating the fragmentation of human experience to create new forms of victimhood, it promotes belligerence as the essence of the shared social space, inclusion as a tool of exclusion, and the eternal past as the future.

(Note: the term ‘indifferance’, a play on Derrida’s concept of ‘differance’, denotes the prescribed ignoring of difference, distinction or differentiation, leading to moral indifference, rather than toleration, which recognises both difference and moral boundaries.)


Human rights: ‘nonsense on stilts’ or an axiom of human community?


Depending on one’s perspective, the origins of human rights can be traced back to Cyrus the Great, Magna Carta or Immanuel Kant’s Categorical Imperative. Like all important social concepts, the idea was implicit in human institutions before being given greater philosophical clarity. Whatever its origins, there can be little doubt that the twentieth century was a high watermark in the development of human rights and their embodiment in political and legal institutions through the Universal Declaration of Human Rights at the end of the Second World War, as both a statement of modern democratic civilised values and a measure by which to evaluate one’s own nation’s progress and, perhaps more commonly, to criticise other cultures or regimes that did not meet its standards.

By contrast, while the twenty-first century has witnessed an acceleration of human rights legislation in the West, and Europe in particular, the status of human rights as a mark of civilised culture is increasingly threatened. This has happened externally through the rise of China as a world economic and political power that does not adhere to the ideology of human rights, nor observe them in practice, certainly not the rights of free speech and expression of dissident views. The rise of political Islam and its rejection of democracy and Western legal systems in favour of frequently extreme interpretations of Sharia have also dented the idea of human rights as a universal given. Sometimes even the West’s own stance on human rights has been compromised, for example by the acceptance of torture in the ‘War on Terror’ and by attitudes towards the present influx of migrants arriving from Africa and the Middle East.

Some would consider the greatest damage, though, to be that which has seemingly been self-inflicted by an almost endless stream of legal judgements said to offend common sense, standards of decency, conscience and national sovereignty, and which threaten to bring the very notion of human rights into disrepute. The recent report by a UN working group on arbitrary detention, which asserts that the WikiLeaks founder Julian Assange has been unlawfully detained, was met, certainly within the establishments in Britain and Sweden, with incredulity, a feeling widely shared by the press and most of the public; after all, how could someone who had voluntarily taken refuge in the embassy of a third country, with which neither the UK nor Sweden has an extradition treaty, be considered ‘detained’ in any meaningful sense of the word? Examples like this have, over the past few years, tarnished the image of human rights among the general public, though this has not precipitated as much serious analysis of rights as their widespread permeation of our institutions and discourses deserves.

The critique of rights is not peculiarly recent, however. From the inception of its use in the modern sense, probably with Locke’s Two Treatises of Government, the status of the whole idea of universal rights has been questioned. Bentham famously excoriated what were then referred to as natural rights as ‘nonsense upon stilts’. Bentham was not objecting to the idea of ‘securities against misrule’, something that had been established in thinking on rights since the signing of Magna Carta; what he objected to was the idea that rights have a foundation in human nature (Schofield, 2003). It was this latter idea, advanced by some of the thinkers and ideologues behind the American and French revolutions, and then destabilising governments across Europe, that Bentham and others in the Empiricist tradition decried (Smith, 2012).

The contemporary English philosopher Roger Scruton, in a recent lecture on natural rights (Scruton, 2011) has undertaken an analysis of the various components of the human rights issue, which is exemplary in its clarity and exposition of the various distortions and consequences to which it has given rise in the law courts. He follows, though not necessarily in the same language, the tripartite division of rights introduced by the Czech jurist Karel Vasak (1979) into three ‘generations’: first-generation, ‘civil-political’ rights; second-generation, ‘socio-economic’ rights; and third-generation, ‘collective-developmental’ rights of peoples and groups. For Scruton and others following in the Benthamite tradition only the first of these, civil-political rights, has any validity. Scruton offers a plausible definition of rights as having ‘the function of enabling people to claim a sphere of personal sovereignty: a sphere in which their choice is law’ (2011, para 33). These spheres ‘define the boundaries behind which people can retreat and which cannot be crossed without transgression’ (ibid). It is the compromises between free and sovereign individuals that constitute the ‘cement of society’ and it is this sovereignty that needs to be protected against incursions of the state. The only valid understanding of human rights, therefore, is as ‘instruments which safeguard sovereignty’ (ibid, para 35).

Scruton puts forward three objections to socio-economic and collective-developmental rights: 1) they create a sense of obligation, to respect the right, where no relationship, no reciprocity and, therefore, no obligation exists; 2) they result in the enlargement of the state to provide the economic and social benefits to which people are thereby entitled, in the absence of a specified entity or person upon whom this obligation falls; 3) legal judgements based on rights result in a zero-sum game rather than a reasonable compromise, thus loosening the ‘cement of society’. He also notes that, despite the increasing frequency of such judgements, the term ‘human rights’ is used indiscriminately and unreflectively in most cases.

Though I cannot fault the logic of Scruton’s argument, I think the premises upon which it is based are debatable, as are some of the assumptions that lie behind it. First, he does not offer a definition of human rights as such, but at best a definition of their function. Second, he argues on the basis of individualism, which is an ideological stance not shared across all cultures, nor indeed wholly shared within the Western academic tradition. As such, the approach of Scruton and others of a similar philosophical outlook is predisposed to find that human rights are meaningless nonsense outside of the protection of the individual and their privacy from the encroachments of the state. I propose instead to start from the assumption that in asserting a right of whatever nature or ‘generation’ a person is doing something that is meaningful to them, and to explore what that meaning is.

There can be little doubt that when a person claims a right they are asserting something about their own identity and their worth as seen through others’ eyes. Anthropologically speaking, while we have individuality in our physical being and in our experience of the world, we are also clearly, throughout our lives from birth to death, social beings. If we do not live in human society, relating to others, we are clearly diminished in our humanity. At the same time, throughout our lives, in different ways at different times in our development through the life course, we negotiate space for our individuality to flourish. Therefore, in our full humanity both freedom and belonging are essential aspects of what we are and, I suspect, of what the discourse of human rights sets out to achieve. While I have great respect for the British empiricist tradition out of which philosophers like Scruton argue, I believe that their overemphasis on freedom, important as it is, ultimately leads to a distorted view of human actuality and potentiality.

Rights, of course, have no ontological reality outside of human discourse, but within human society – as opposed to the state of nature – the discourse of rights is, stripped down, one of power: the power to negotiate with power a compromise between freedom and belonging; freedom from the intrusive or abusive power of a collective other, such as a state, a religion, an institution or a mob, but also the power to establish an acceptance of belonging, even against the sometimes exclusionary force of the other. I think that one of the errors of the individualist position is to assume that there are legitimate and illegitimate forms of power, and that the claims of the individual based on religious conviction, conscience, intuition or revelation – or even reason for that matter – take precedence over those of the collective, such as the community or the state. But I find such a view lacking justification. All power is ultimately arbitrary (Bourdieu and Passeron, 1977); no state, institution, social group or individual has ultimate legitimacy, but all have differential power, and while the state may have a virtual monopoly of physical violence, useful in waging war and keeping order, the discourse of rights is a coercive weapon in redressing the balance of power in an attempt to establish the optimal balance between freedom and belonging.

I agree with Scruton’s point that asserting a right, rather than entailing a set of responsibilities – which obviously he, amongst many others, feels would be fair and just – imposes a duty on others, with whom one does not necessarily have any relationship, to respect the right. But I would argue that this is just the point of a right: it forces the recognition of a relationship in which there is acceptance of both a degree of belonging and a degree of freedom. Society is an interlocking complex of such obligations, to the extent that they tend towards de facto reciprocity, despite there being no logical entailment for such. For example, the rights to free association and free speech for citizens create a duty for a government to uphold those rights against those who would deny them to others with whom they disagree, even if, under provocation, that be the government (or a branch of it) itself. Conversely, the government asserts the right to collect taxes and the citizenry therefore has a duty to pay them, notwithstanding the fact that taxes are probably the last remnant of the state’s exercise of arbitrary power. Both these rights bind through obligation, but the rights themselves are two mere facts between which there is no causal or other relationship. One could argue, though, that it is the sum of such brute facts that holds societies together. Such a view, it seems to me, is not that different from that of those empiricists who advocate the self-regulation of the commons through ties of obligation (Ridley, 1996).

For this reason, I do not accept the objection of Scruton and others to the role of the state as a matter of principle, though I accept many of the criticisms of states’ practices around the world, including those of democratic states. Modern life would be impossible without the existence of a well-developed state, which marshals capabilities and resources beyond the capacities of individuals and organisations. This seems to me aptly demonstrated by attempts to diminish the role of the state, which not only reduce its effectiveness, but also, paradoxically, result in an extension of its powers in certain areas. I think this prejudice arises from two sources: one is the idyll of the past, the belief that somehow things were better in ages gone by when the state was smaller and less powerful, an idyll to which we all succumb at some time or other, although a study of history should soon disabuse us of that idea; the other is the belief that there are both legitimate and illegitimate forms of power, whereas all power is arbitrary. The discourse of rights is one that seeks to redress imbalances in power, and one which is likely to continue indefinitely.

The problem lies, ultimately, I believe, in a diminished idea of the individual and of individualism: a type of individualism defined in opposition to the collective, where the collective is antagonistic to the individual and the individual’s interests. In reality, the individual and the collective are interdependent, and the type of freedoms and independence we pursue today can only be guaranteed by a powerful state. At the same time, demands made upon the state throughout history have reshaped the state, and continue to do so. The fact that many of our demands are incompatible with each other virtually ensures the expansion of the state, as Scruton asserts; but claims for – and responses to – more autonomy for cities and regions, for example, suggest that states are capable of adaptability as well as continuity. Rather than the diminution of the state, I suggest that what we require is for it to become more benevolent, and to that extent more distant and less visible, with power residing more locally and individually and more equitably distributed. But this absolutely requires an interlocking complex of rights and duties, even if they are sometimes incompatible.

Twiss (1998), in fact, argues that the three generations of rights are not just compatible but reinforce each other, and that the privileging of one, over time, jeopardises the entire social fabric that accommodates it. The reaction to some of the recent court decisions, of the sort to which Scruton among others refers, shows that there is an awareness, to which governments are not entirely immune, that when rights become unbalanced the social fabric is threatened. There is even growing disquiet among some leading gay rights campaigners that some court decisions which have favoured gay customers refused services by those claiming to act out of conscience may have resulted in an injustice (Phillips, 2016). Concerning the Assange case, my first reaction was disbelief. But looked at from a more dispassionate distance, one begins to see that there may be some validity to the perspective of the panel. Regarding the panel’s report, a UK government official made the point that Britain does not accept the principle of diplomatic asylum (Bowcott and Crouch, 2016). I am in no position to assess the validity of this principle, except to note that the fact that the UK government does not accept it does not of itself invalidate it as a potential principle of justice in the complex world of interactions between states.

Both individuality and sociality are hard-wired into our nature, and therefore so is the aspiration for both freedom and belonging. Different societies, peoples and national cultures at different times favour one over the other, and this drives forward social evolution. There is no doubt that belonging forces obligations on us that can become burdensome and that we sometimes need to retreat into the privacy of the individual ‘sphere of sovereignty’; indeed, such a place is constantly on the agenda of the human rights discourse. But, equally, so are our inborn sociality and our obligations to others, particularly the alienated, excluded and persecuted, and the acceptance of this burden – whether willingly or not – is a mark of our humanity. In the end we must weigh this against the genuine inconvenience placed upon us, and the sometimes fabricated outrage inspired in us, when these obligations are given legal bite.


Owen Bowcott and David Crouch (2016), UN panel calls on UK and Sweden to end Julian Assange’s ‘deprivation of liberty’, The Guardian (online, 5th February 2016, para 9) available at: The Guardian

Pierre Bourdieu and Jean-Claude Passeron (1977). Reproduction in Education, Society and Culture. London: Sage.

Melanie Phillips (2016), Gay activists want to have their cake and eat it, The Times (online, February 5th 2016), available at: The Times

Matt Ridley (1996). The Origins of Virtue. London: Penguin.

Philip Schofield (2003), Jeremy Bentham’s ‘Nonsense upon Stilts’, Utilitas, Volume 15, Issue 01 (March 2003), pp 1-26.

Roger Scruton (2011), Nonsense on Stilts (Prepared for a Conference on Human Rights, Lincoln’s Inn, London, 2011), available at:

George H. Smith (2012), Jeremy Bentham’s Attack on Natural Rights, Libertarianism (online, June 26, 2012, para 7), available at:

Sumner B. Twiss (1998), Moral Grounds and Plural Cultures: Interpreting Human Rights in the International Community, The Journal of Religious Ethics, volume 26 (2), pp 271-282.

Karel Vasak (1979), Pour la troisième génération de droits de l’homme: les droits de solidarité [For the Third Generation of Human Rights: the Rights of Solidarity (Inaugural lecture to the tenth study session of the International Institute of Human Rights, Strasbourg)], Revue des Droits de l’Homme, 1979, 3.

A Paean to Serendipity

This article was first published in Dawn of the Unread, an online graphic novel serial exploring Nottingham’s literary history. In March 2015 it won the Teaching Excellence Award at the Guardian Education Awards. It was created by James Walker, who is also a Director of Nottingham’s bid to become a UNESCO City of Literature.

One of the fascinations of languages is that on occasion they throw up words so connected with the spirit of a culture that they defy both translation and even definition. ‘Serendipity’ is a word imbued with something peculiarly English, largely untranslatable and indefinable. It is as if the meaning were conveyed through its connotations, as a type of felt experience. This does not mean that definitions have not been offered, but in my opinion they largely fall short. For example, two chosen at random from the many online offerings, “the fact of finding interesting or valuable things by chance” (Cambridge) and “an aptitude for making desirable discoveries by accident”, both have shortcomings. The second is better than the first; it focuses on ‘aptitude’, a talent that is potentially learnable and improvable, and on ‘discovery’, an association which, as I will argue, can enrich an understanding of both terms when considered in relationship to serendipity. However, the focus on “chance” and “accident” reinforces the idea that its occurrence is random. I believe it is more correct to see it as an emergent property of certain conditions.

The origin of the idea to which serendipity gives its name can be traced through a literary historiography and, like many such terms, has a wonderfully layered texture, a sort of stratigraphy of narrative, interpretation and contingency, almost as though the concept were an example of the very thing it named. The term was coined by Horace Walpole, an art historian, writer and political figure of the eighteenth century, based on a Persian fairy tale called The Three Princes of Serendip (Serendip being the Persian name for Ceylon/Sri Lanka). The tale concerns three exiled princes who survived and prospered by their wits. Actually it has little to do with serendipity as we understand it today; it details something more like the application of deductive logic to evidence, a proto-semiotics. In the most famous story, the princes, by reading telltale signs, are able to describe a stolen camel that they have never seen – understandably leading to suspicion falling on them – before explaining their method and saving their own necks. The story is part of a collection, called the Hasht Bihisht, written in the fourteenth century, which itself draws on Persian tales of several centuries earlier (1). The tale initially reached English readers through Italian and French translations. But it has had most influence through Voltaire’s book Zadig, the story of a Babylonian wanderer based on The Three Princes of Serendip, which examines the interplay of reason, signs and fate and went on to influence the development of both scientific method and detective fiction (2).

Serendipity has been immensely important in science, and it is largely through scientific anecdotes that the meaning of the term as we know it today has been shaped (3). Even though science is equated in the popular mind with being a rational activity, rationality only plays a part. The core of science is discovery, and for Karl Popper, probably the most influential philosopher of science of the twentieth century – paradoxically, in a book entitled The Logic of Scientific Discovery – science develops firstly through bold imaginative leaps, which are then subjected to rigorous testing (the rational part), testing which he insisted should consist of attempts at refutation rather than searches for confirmatory evidence. I am not a scientist, but I have experience of research. What Popper says seems partly true, but it also exaggerates the element of leaping in the dark. It ignores, to my mind, the aspect of immersion (4). Discovery in any enterprise takes place in the messy physicality of the world, whether that is the scientist’s lab, the artist’s studio or the writer’s desk, through immersion in the discourse of the discipline and the problematic issues to which one is dedicated to finding a solution. No one who is not immersed in science makes a scientific breakthrough, and no one not immersed in music writes a musical masterpiece.

It is in literature, though, that the idea gestated and still provides the most accessible experiential context. A bookshop in a Derbyshire village that I visit from time to time embodies perfectly the conditions for serendipitous discovery. A narrow shop front that could be from another age opens up into a labyrinthine interior. Visitors typically spend several hours there, undoubtedly absorbed in the books but also possibly in wondering how to extricate themselves. If you have read Umberto Eco’s novel The Name of the Rose or Jorge Luis Borges’ short story The Library of Babel, you will get the idea, although the point is not so much the indecipherability of the floor plan or the magnitude of its extent as the sheer profusion of books and, in particular, the chaotic nature of their display (5). I use the term ‘chaotic’ advisedly: I do not mean there is no order; on the contrary, books are themed as they are in all bookshops. But the owners have taken the maximisation of space to such extraordinary lengths, utilising every possible volume, area and intersection, that a wonderfully chaotic complexity arises. On each floor narrow aisles lead to other aisles, which lead to tiny alcoves or to culs-de-sac. There are books shelved on the stairways to the upper floors, on the landing of each floor, in the restaurant (entered through a bookcase) and even in the toilets (although perhaps my imagination has got the better of me there). In moving through the shop, even if looking for something particular, you are never more than an arm’s length from something completely different. And this is the point: you cannot avoid being exposed to such a myriad of alternative influences that a fortuitous discovery becomes likely (6). I rarely come out of this shop without having bought something, and usually something that I neither knew existed nor realised that I had any interest in.

Although serendipity does not allow of simple descriptions or pat definitions, it is possible to say something meaningful about it. Firstly, it is putting oneself in the way of discovery, but without any guarantee of discovery. In other words, though chance is involved, it is not a random happening. It is to be immersed in the concerns of a particular field of human activity. Secondly, it is to be in an environment of ordered chaos, whether by design or accident, where unrelated things lie in proximity, and to have the capacity to see something previously unseen in that juxtaposition, whether a causal relation or merely a suggestive pattern. Thirdly, another element of serendipity, not captured in the definitions, is the element of surprise. Something cannot truly be serendipitous unless its appearance is a surprise; the searched-for thing turns out not quite to be the searched-for thing after all, but something other whose appearance is a revelation. One might say it is one of the vestiges of the sacred that have survived into the scientific age; at least, until now.

I wonder if the serendipitous nature of discovery is under threat from the increasingly technocratic way in which information is processed, both societally and personally. The modern approach to knowledge is instrumental. We increasingly no longer put ourselves in the way of discovery. We decide what we want and instigate a search on the web. This is wonderful in its own way, but does not allow the joy of unexpected discovery. Search engines are logical and literal in their operation; in order to accommodate new ideas they must be fed new search terms. But in this way we are only extending what we already know, or projecting from what we know into the less known. There is no route into the great realms of ignorance, that which we truly do not know, the ignorance of which we are profoundly ignorant. Even in the age of the world wide web most new ideas and recommendations come through personal synergy. As many others have noted, the digitisation of data, and its access through search engines, not only circumvent the contexts within which serendipity operates, but also increase bureaucratic and commercial control of our lives. Under the banners of convenience and choice we are increasingly delimited: our time by an increased capacity for administrative delegation, and our choices through algorithmically generated feedback on data-deficient decisions.

A final thought: could it be that the untranslatability of the term ‘serendipity’ renders it a uniquely English experience, in the way that the Inuit experience of snow calls on a rich vocabulary unparalleled in any other culture? Neither the cross-cultural context of the term’s origins nor the universality of human experience suggests that this is true. And yet there are hints that English culture has a unique relationship to the idea. English common law arose through a process of judicial discovery of a ‘truth’ in the particular case, which then became the precedent in resolving similar future disputes, unlike the Napoleonic codes based on Roman law prevalent throughout the rest of Europe, which are the imposition of an executive-derived body of law through fiat. Then there is the observation, which does seem to be supported by evidence, that the English are good at scientific discovery, but rather poorer at developing finds into practical technology, which requires greater rationalisation. English society has always existed in the hinterland between order and chaos, which perhaps renders it favourable to a culture of discovery. Yet what has bonded that society is partly a shared literary and discursive tradition that has allowed the cross-fertilisation of ideas. Worryingly, this is disappearing as the tactile, shared and concrete is increasingly displaced by the virtual, solipsistic and evanescent, and the sages of the Internet age have exhibited no capacity, as yet, to see this as problematic.



  1. The tale of The Three Princes of Serendip was translated from Persian into Italian during the late Renaissance, and into French and German in the eighteenth century, but not into English until the 1960s. The collection of stories of which it is a part goes back to the eleventh century, but is built around an epic treatment of the fifth-century Persian monarch Bahram V. Though the origins go back to pre-Islamic Persia, there are variants of the stories throughout the Middle East, the Balkans, India, Russia and China.

T. G. Remer, Ed. (1965). Serendipity and the Three Princes of Serendip; Trans. from Michele Tramezzino, Peregrinaggio di tre giovani figliuoli del re di Serendippo (1557). University of Oklahoma Press.

  2. Voltaire’s Zadig inspired Georges Cuvier, one of the founders of the science of palaeontology, to see inference of an extinct animal’s nature and environment from minimal evidence as a valid form of reasoning.

Thomas Henry Huxley (1880), On the Method of Zadig: Retrospective Prophecy as a Function of Science. Popular Science Monthly‎, Volume 17‎ (August 1880).

  3. Not only technological innovations, but also theoretical breakthroughs and archaeological finds have often had a strong element of ‘luck’ or ‘good fortune’ in their discovery.

Royston M. Roberts (1989), Serendipity: Accidental Discoveries in Science. New York: Wiley.

  4. Serendipitous discovery has much in common with Thomas Kuhn’s concept of the ‘paradigm shift’ that occurs when an existing paradigm within which normal scientific research takes place no longer accommodates the weight of new and challenging data. Kuhn considers the historical context in which discovery is made, whereas Popper is looking from an epistemological perspective.

Thomas Kuhn (1962). The Structure of Scientific Revolutions. Chicago: The University of Chicago Press.

Karl Popper (1959). The Logic of Scientific Discovery. London: Routledge.

  5. Eco and Borges were not addressing the topic of serendipity; if anything they were making the opposite point that knowledge and the world are opaque to understanding. However, it is their images of libraries of enormous complexity or infinite extent that stay in the mind.

Eco, Umberto (1983). The Name of the Rose. Harcourt.

Jorge Luis Borges (1981). Labyrinths. New York: Penguin.

  6. There have been many attempts to structure the workplace in order to maximise creative interaction among the workforce in companies in the hope of generating serendipitous innovations:

Rachel Emma Silverman (April 30, 2013), The Science of Serendipity in the Workplace. The Wall Street Journal.