History of Japan from the Heian period through the Second World War

For readers who like to study history, whether rigorously or simply for the enjoyment of historical discussion: last week I finished listening to a series of podcasts – more like a series of extensive lectures, each entry spanning 4 to 5 hours – on the history of Japan and its involvement in the Second World War. The talks by Dan Carlin (see bottom) have been published over the course of the past two years, and serve almost as a narration of Japanese history (from Carlin’s view) beginning roughly from the Heian period through to the events of the Asiatic-Pacific theatre.

Carlin, it must be said, is not a professional historian (it is fair to call him an amateur historian). And while many historians applaud his podcast and his popular engagement with history, it is important to approach his presentation, first and foremost, as a form of popular history. That is to say, I take it as history as a way of seeking and exploring lessons, and thus also as a way of speculatively theorising connections between factual historical events. This is not to say that Carlin is not brilliant at presenting history. He is in every sense one of the best popular history presenters, who, as I see it, has a first-principles motivation to give context to historiography by highlighting the human experience of an event. History is bloody, absolutely; and the ‘human factor’ that Carlin ensures is not lost is deeply important, not least philosophically. It must also be said that he provides plenty of support for his views and never fails to supply his full list of references, which usually includes both primary and secondary sources; but, again, the nuance to insist on is that rigorous historical study and popular history are two very different things. The point of discretion here is simply that one must approach each talk critically, for example discerning when Carlin is presenting his own theory or views and when he is directly citing a primary or secondary reference.

In more ways than one, listening to Carlin’s historical presentations – especially his emphasis on the human aspect of history – reminds me of an allegory on history by Albert Camus. This is something I should maybe return to and write about sometime.


Admittedly, the history of Japan is something I know only in discrete, disconnected pieces. It’s just not something that has been a focus of my history studies. Like a puzzle, some parts of the picture I have filled in, but mostly in passing or in unconcentrated ways. For example, I have some understanding of its prehistoric period, mainly from books covering our best-known research on early human migration that happened to include the Japanese archipelago. Over time I have also picked up some bits on ancient Japan and things like Heian culture, famously the era in which the samurai emerged. I’ve also read bits on Japan’s involvement in World War II but, again, my focus has largely been on the European theatre. Both of my grandfathers, one on my Scottish side and the other on my English side, were involved in the war. I grew up with the Second World War being a regular topic of discussion, with the Battle of Britain and other notable events often a focal point. As a kid, I also studied planes; I really liked the old British warplanes, like the Spitfire, and used to build models of them as a hobby. All of this is to say that I’ve never studied the Asiatic-Pacific theatre with the same focus I’ve given the European.

This is perhaps one reason why I found myself thoroughly enjoying Carlin’s series. One can approach his telling of the history with a clear aim in mind: a study of Japan’s involvement in both world wars. For this reason the focus is narrowed to pertinent historical and cultural developments preceding the great wars, before finally covering the events in the Pacific theatre. There is far too much to comment on, as the range of subject matter is vast. One thing I found interesting is Carlin’s emphasis on colonialism as it relates to Japan’s motivation, military emergence, and ultimately resource-focused campaign in Asia. But before this, there were so many pertinent socio-cultural and historical developments in Japan’s history, as Carlin tells it, which contribute to what is described as a certain cultural and behavioural fanaticism. This fanaticism is expressed, in one way, through the eyes of the Japanese soldier of the time, and it finally culminated in the extreme barbarity that very much defined the Asiatic-Pacific theatre.

Carlin starts by first examining the phenomenon of Hiroo Onoda, the last Japanese soldier to come out of hiding from a Philippine jungle and surrender in 1974. What drove Onoda to behave in a way that, in one frame, may be described as going beyond the valour of duty, or in another frame may be described as fanatical and delusional, is a driving question in Carlin’s thesis. It is what shapes his telling of the history, because it leads Carlin, in the prologue, to introduce the observation – very much as we observe across all societies, I would argue – that human beings are malleable for better or for worse, and the ways in which we may be shaped or perhaps even deformed in extreme ways are based on our sociohistorical-cultural circumstances. So what were these circumstances? How did they develop? And what are the deep historical roots, not least related to Japan’s foray into imperialism?

Again, there is much to say, given the range of Japan’s history covered. I encourage the reader to listen to the series, because, while at times Carlin seems to make some drastic theoretical connections, the way he tells the story is absolutely gripping and, no doubt, within his recounting of many first-hand accounts, there are kernels of truth disclosed that are overwhelmingly moving in the sense that the Asiatic-Pacific theatre, in its sometimes unrelenting barbarity, was a deeply human tragedy.


I will leave the reader with this comment, as it is particularly on my mind. The prologue to Carlin’s series, described above, and much of how he traces key developments in Japan’s history reminded me very much of Edgerton’s study on social pathology, in which it is argued that a society may be more or less pathological, with the degrees of variance characteristic of the particular sociohistorical-cultural moment. This was also the thesis of my book, Society and Social Pathology, published a few years ago. Within it I argued that, if we are to understand social pathology in a critical way, conceptualising the complex interconnection between the individual subject and his or her social conditions is the first place to start. In studying the relation between one’s sociohistorical-cultural conditions and the impact those conditions have on the individual subject, my thesis argued toward a more comprehensive, systems view of society, its development, its pathology, and its discontents. As a matter of perspective, if nothing else, a number of questions that Carlin asks – for instance, what leads to the development of the sort of behaviour displayed by Onoda – reminded me of similar questions I encountered when coming to study the importance of obtaining a well-defined and rigorous concept of social pathology. Below is an excerpt from when I was thinking about such matters:

One incredibly important argument that we will discuss […] concerns how […] all societies, just like individuals, can be pathological to greater or lesser degree (Edgerton, 2010). This is an important feature of my present thesis. In a survey of literature on the history of human society, it would appear fairly safe to conclude that social pathology as defined in this book is a recurring characteristic across cultures and epochs. Overcoming the pathological development of human society is, to borrow the words of Kenan Malik (2014), “a historical challenge”. That is why, although capitalism […] may take a central focus in the present study, due to the fact that capitalism as a particular social formation is what defines our present social world, this particular period of human social development is also part of a significantly broader history. For this reason, if the intention is to look at the facts, the realities, the many social phenomena which define a large part of modern life, in an attempt to understand why needless social suffering persists and why irrationality prevails, to accomplish this task we must also come to grips with […] a philosophy of history [that] intersects with and combines numerous disciplines, from anthropology and archaeology to psychology. And it will help us contextualize a framework for understanding both the ongoing process of pathological development throughout history, as well as the ongoing process pertaining to our present conditions.


*Image: Pacific Theater Areas, Wikipedia.

The US election and my holiday reading list

It is reasonable to wait for confirmation; however, as things stand, it appears Donald Trump is about to get walloped in the election, losing both in terms of the electoral college and the popular vote. One might indeed take a moment to say, ‘good riddance!’. But it will take a lot more than a Biden victory to defeat the prejudiced, often anti-science, and certainly contra-enlightenment views that have been amplified in the past years, not to mention the shady funding behind them.

What is particularly interesting, I think, is that when looking at the numbers the definitive nature of the urban/rural divide in the United States is made explicit. It is old sociology, to be sure, and I can think of no better description than that there seems a complete contradistinction of views. It is a rigid contest, and it is certainly not as simplistic as pitting the young, educated urban dweller against the opposite in the country bumpkin. For example, it is noticeable that in some circles the right-wing voices in support of Trump have expressed anti-globalisation views in almost identical ways to some circles on the left, the difference primarily being in the framing. Although this doesn’t factor in ideological residues, the point is that the split seems rather nuanced, with as much economic as cultural import – not so different from what we have also witnessed here in the UK. For many reasons, I have also been reminded in recent years of Stephen Bronner’s analysis of anti-modernist movements. It will be interesting to read new studies on these socio-economic and cultural dynamics in the coming months, as no doubt a few books are already in the works.

One last comment before moving onto other things: it is hopefully telling that Biden’s first speech as president-elect made explicit mention of the need to reconnect with science. Rebuilding trust in scientific impartiality is imperative, after much opportunism that subordinated key scientific institutions to political bias and ideological ends. Surely, also, such a rebuilding effort coincides with the demand to strengthen evidence-based approaches to policy. Philosophically, such approaches are still not completely without their problems, but it is a project we ought to work toward.

Unification was also a key message, and quite understandably. Whatever one may think of French President Macron, last week he gave what I thought was a nice talk on the principle of enlightenment in the form of communicative reason (it reminded me very much of Habermas): to continue to work to create a public space structured in such a way that rational dialogue and debate may be achieved. This runs completely counter to demonisation and intense polarisation – the old habits of tribalism. Objective reason is not compatible with ideology, inasmuch as analysis should work continuously to free itself from bias. It is unfortunate to see that it has become a tenet of general discourse to succumb to irrational worldviews in which political and cognitive biases overshadow the normative process of reason. Biden spoke the other night of the battle between our better angels and our darkest impulses, which I interpreted in these terms – an expression that describes the enlightenment project. From a systems view, I am sceptical; but I am also open to seeing what he does. As with an incredibly difficult calculation or an important proof, it is about incremental steps.


Now that I have submitted my thesis on double sigma models and field theory, I have two months’ holiday before I am due to return to university. During my break, I plan to catch up on a lot of reading. I would also like to finish a number of essays and potentially start drafting several more. For example, I am currently finishing an essay on braneworlds from my studies in autumn 2019. I also have a long essay being polished on deriving general relativity from string theory, among a list of others in my current area of the doubled string, generalised geometry, and de Sitter. So I look forward to my holiday, where I can be in my own space a bit and enjoy writing on these fantastic topics.

I also have some other essays that I would like to write in other fields. For example, one essay that I have been working on for some time concerns the epistemology of the early medieval university, in which Aristotelianism was formally introduced to Europe. For that purpose, I have added Nicholas Orme’s book on Medieval Schools to my holiday reading list.

There is another book at the top of my reading list: Vincent Azoulay’s acclaimed ‘Pericles of Athens‘. I’ve been thinking of Pericles lately, perhaps partly to do with the experience of a year marked by a global pandemic. Indeed, the collapse of Periclean Athens was instigated in no small way by its own terrible malady – a rather vile plague that proved catastrophic for what was one of the earliest egalitarian and democratic experiments in human history. (A nice discussion between historians was recently presented here.) The intent is in no way to draw analogies with our contemporary times, although with current trends it is not completely outlandish to suggest that contemporary democracy – and certainly the present economic models in which it is housed – is facing a challenge. Indeed, it’s not just the pandemic but many trends in behaviour, not least what we have been seeing politically in the past years, that have highlighted the utter idiocy human beings are capable of in a test of democracy at its very foundations. Isaac Asimov once wrote, ‘Anti-intellectualism has been a constant thread winding its way through our political and cultural life, nurtured by the false notion that democracy means that “my ignorance is just as good as your knowledge“‘. In a similar vein, there is a great passage by Bertrand Russell that strikes the same point, and a most famous passage by Walt Whitman also comes to mind. Considering that some would argue the prospect of enlightenment has not been fully realised, nor the prospect of democracy completely fulfilled, it will be interesting to read more about the struggles of Periclean Athens. Pericles was an inspiration to some prominent enlightenment thinkers, and engagements with him are often dominated by the same old question that Socrates once asked: has philosophy made citizens (and, I would add, the life of citizens) better?
History can often serve as a magnifying glass, and if philosophy’s remaining relevance is tied to asking the question of the existence of needless (social) suffering, maybe there is still something in Pericles to write about.

As a fun read, I’ve picked up Stephen Brusatte’s celebrated title exploring the latest research on the history of dinosaurs. It is a book that I have been desperate to read since the summer. I can’t wait to dig into its pages!

Finally, I was thinking of Brockman’s cross-field collection ‘Life’, with a contribution by Dyson on the garbage-bag model, followed by Ackerman’s ‘The Genius of Birds’. This was actually one of the first on my list, as it relates to my interests in mathematical biology (plus, I enjoy studying birds in my spare time).

If there is time, I’ve been wanting to read Edward Wilson’s ‘Consilience: The Unity of Knowledge’. In addition to his now dated ‘On Human Nature‘, these books will certainly inspire a number of essays (his book ‘The Diversity of Life’ is also worth mentioning), because I have studied epistemology at great length and I am always keen to delve with nuance into evolutionary psychology and its issues. But they will likely have to wait until my summer holidays, along with a list of others (it is an ever-growing list!).

These mostly comprise my general academic reading and don’t really get into some of my ongoing non-academic books. I’ve been reading through a lot of the Star Wars canon recently. For my holiday, I picked up the new Marvel Kylo Ren comic series as well as the new Darth Vader series, and so far I have been enjoying both.

Brownian Motion

R.C. Smith

In the past I have regularly written about and reflected on numerous examples concerning the practice of science. This is attached to my main, core interests as a physics student aspiring to become a good physicist and a good scientist, which inspires me to think deeply about science. But I also like to study the history of science as a medium through which to reflect more broadly – and perhaps philosophically – on the development of scientific knowledge, the history of science in relation to that development, and the fundamentals of scientific practice.

It often strikes me that, whether in science class or within the general domain of culture, there isn’t enough popular or widespread emphasis on the why of modern science. The same, I think, can be said of mathematics in particular. At the start of my mathematical career I was interested in the why of mathematics, but this isn’t usually the focus of our school lessons. Oftentimes, it is only after we study mathematics to a high level that we turn to thinking about the why of our mathematical concepts and systems. I think the same can generally be said of science.

But it is the why of science that reveals some of the deepest sources of scientific passion and inspiration. To neglect the why is to not fully embrace the depth of meaning that modern science offers human beings. It is the why of a scientific theory, or of basic scientific knowledge, that can enliven what today we might merely take for granted as general principles.

More than that, I often find joy in following the logic behind the development of a concept or the evolution of a theory on the basis of first principles – what led to the invention of the first microscope or to the discovery of penicillin? There is a lot of rich and interesting content here.

A nice example that I have pulled from my notebook concerns Brownian motion. The history behind Brownian motion is quite interesting.

In short and overly simple terms, many will already know that Brownian motion helped confirm that matter is made up of lots of tiny particles. In other words, it helped confirm the existence of atoms, the smallest units of a chemical element that can exist. It is named after the botanist Robert Brown, who in 1827 reached a most curious conclusion:

*Image: Translational motion, by Greg L at the English language Wikipedia, CC BY-SA 3.0.

Brown was studying pollen grains at the time. Pollen is of course a very fine powder that many will already be familiar with: if you rub your finger against the petal of certain flowers, you will see the dust – the pollen particles – against your skin. One of Brown’s experiments entailed the study of pollen grains suspended in water. Placing the grains under his microscope, he noticed that the pollen particles were moving almost at random. One might describe this movement as “jittery” or as a “zig-zag”. When Brown perceived the movement, he concluded that the grains were somehow “alive”.

However, what was really happening was that the grains were colliding with water molecules. And these water molecules were too small to see under Brown’s microscope, which led Brown to think that the pollen grains were “alive”.

It was only later, when the effects of the collisions could be seen – with better microscopes – that Brown’s observations could be deepened with scientific theory and then verified empirically, contributing to our understanding of particles.

From Ancient Rome to Einstein and Perrin

What is interesting, and why I like the example of Brownian motion (there are many great examples), is that the history of the idea goes all the way back to Ancient Rome.

*Image: Lucretius, Wikimedia Commons, CC0.

The Roman poet and philosopher Titus Lucretius Carus, in what is described on Wikipedia and elsewhere as a “scientific poem”, offered, with tremendous phenomenological attention, a detailed account of the Brownian motion of dust particles, which he presented as proof of the existence of atoms. It comes from verses 113–140 of Book II of “On the Nature of Things” (c. 60 BC), and I offer the quote as cited on Wikipedia:

“Observe what happens when sunbeams are admitted into a building and shed light on its shadowy places. You will see a multitude of tiny particles mingling in a multitude of ways… their dancing is an actual indication of underlying movements of matter that are hidden from our sight… It originates with the atoms which move of themselves [i.e., spontaneously]. Then those small compound bodies that are least removed from the impetus of the atoms are set in motion by the impact of their invisible blows and in turn cannon against slightly larger bodies. So the movement mounts up from the atoms and gradually emerges to the level of our senses, so that those bodies are in motion that we see in sunbeams, moved by blows that remain invisible.”

It is, in simple terms, a perfect representation of an experiential observation – the common phenomenon in which concentrated beams of sunlight break through a glass window and we see, caught in these beams, lots of tiny dust particles floating in the air. I believe this to be one of my own ‘earliest scientific experiences’: observing dust particles as they passed through beams of sunlight, curious about their frantic and irregular patterns and wondering to myself, “what is this microworld?”

It is immediate, experiential observation and knowledge – the object of the particle of dust and its behaviour in a gas. The dust particles move randomly. Why? Well, bracketing air currents, which are one cause, the jittery motion of the dust particles is also caused by the resultant force of the air molecules – which are too small to see – in contact with the dust particles from all sides. In other words, Brownian dynamics.

But like with most ancient scientific observation and knowledge – largely limited to experiential epistemologies – we can move beyond Lucretius’ account. And in this history we see another example of the power of modern science.

Brown paved the way. But then, in 1905, Brownian motion offered an important clue to Albert Einstein, who gave an explanation of it in terms of a “random force”. Employing Newtonian mechanics (i.e., F = ma), the concept of random force helped explain the random, zig-zag, jittery motion of tiny particles, such as pollen particles suspended in water. Another very basic but common and easy-to-do experiment involves capturing smoke in a transparent box, illuminating the smoke with a light, and then observing the smoke particles through a microscope.

In any case, it was Einstein who, in one of his most prolific years, explained why, experimentally, there must be a random force: the collisions between molecules and the particle, from all sides, have a resultant force, causing in this case pollen particles to move in a jittery way.

Einstein also worked out that we can measure the diffusion constant. Diffusion can take on slightly different meanings in different disciplines. At its most basic, in physics, diffusion refers to the process that results from the random motion of molecules. It describes, as a result of the kinetic energy of molecules’ or atoms’ random motion, the net movement from a region of high concentration to a region of low concentration. There is also a frictional force. And in measuring the diffusion constant and the frictional force, Einstein discovered that we can find the kinetic energy of the particle – the amount of agitation of the particle – which can then be related to the absolute temperature of the fluid. Einstein’s theory thus resulted in an important contribution combining Newtonian mechanics and thermodynamics.
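Einstein’s key quantitative result here – that the mean squared displacement of a Brownian particle grows linearly with time, ⟨x²⟩ = 2Dt in one dimension – is easy to check numerically. Below is a minimal sketch (the function name and parameter values are my own illustration, not drawn from any particular source): each step of a particle’s walk is drawn from a Gaussian whose variance is 2D·dt, and averaging the final squared displacement over many particles should recover 2D·t.

```python
import math
import random

def simulate_brownian_msd(n_particles=2000, n_steps=500, dt=1e-3,
                          d_coeff=1.0, seed=42):
    """Simulate 1D Brownian motion and return the mean squared
    displacement (MSD) at the final time.

    Each step is a Gaussian increment with variance 2*D*dt, the
    discrete analogue of Einstein's result <x^2> = 2*D*t.
    """
    rng = random.Random(seed)
    sigma = math.sqrt(2.0 * d_coeff * dt)  # std dev of a single step
    total = 0.0
    for _ in range(n_particles):
        x = 0.0
        for _ in range(n_steps):
            x += rng.gauss(0.0, sigma)  # random kicks from molecular collisions
        total += x * x
    return total / n_particles

# Einstein's prediction for the elapsed time t = n_steps * dt:
t_total = 500 * 1e-3
print("simulated MSD:", simulate_brownian_msd())
print("predicted 2*D*t:", 2.0 * 1.0 * t_total)
```

With 2000 particles the simulated MSD lands within a few percent of the predicted value; increasing `n_particles` tightens the agreement, which is the same statistical averaging that Perrin exploited experimentally.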

[Note: You could say that the direction of the force of atomic bombardment is constantly changing, and at different times the particle is hit more on one side than another, leading to the seemingly random nature of the motion.]

Later, in 1908, Einstein’s explanation of Brownian motion was verified experimentally by Jean Perrin, for which Perrin won the Nobel Prize in Physics in 1926. Perrin’s confirmation of Einstein’s calculations provided significant empirical groundwork.

One could write pages on the finer details and expand on the historical chronicle, as well as deepen the explanation of the concepts, but even at this level of the history a remarkable picture of scientific pursuit and knowledge emerges. At the end of the sequence of discoveries and refutations, deepened theories and experimental verification, we arrive at an even more sharpened and expanded knowledge: convincing scientific evidence that atoms and molecules exist.

On Current Events, Evolution and Civilization

R.C. Smith

As far back as 4 million years ago, in the period of Australopithecus afarensis, the standard model of hominin evolution offers valuable insight into the importance of humanity’s great duel with nature. The current consensus places a significant moment in hominin evolution at the Great Rift Valley approximately 1.8 million years ago, marked by a notable increase in brain size and in the number of hominin species, including Homo erectus. And while at the very frontier of science new knowledge is constantly being gained and debates are underway, the most widely accepted model, namely the recent single-origin hypothesis, suggests that rapidly changing climatic conditions played a role in the evolutionary jump in the Rift Valley.

Taking several significant jumps forward, it is not entirely misguided to suggest a link, however philosophically, between the human duel with nature and the earliest evidence of agriculture. There are of course many competing theories for the domestication of agriculture, and it is not my intention here to propose or emphasize one over another. Rather, I am simply suggesting the basic idea that insight is offered in understanding the value of what agriculture provided. There is a simple logical sequence to follow: the domestication of crops provided something of notable benefit, otherwise it would likely not have crystallized as an established trend. In that a more stable food supply was established, with larger numbers of people able to live together, the philosophical description of an increased freedom from immediate existential precarity is given an incredible sense of meaning. The seeds of human civilization were sown. Freedom, in the particular sense of an increased freedom from precarious existential conditions, coincided with the transition from hunter-gatherers to an entirely new conception of human life.

The foundation of civilization was laid with the advent of agriculture. From there human social organization could take on much more complex forms. Writing, thought, culture – as well as spatial and geographical planning and city development – were enabled to slowly emerge over time. The rest, one could say, is history. This history is both deeply positive and transformative as well as bloody and barbaric. It would be prejudiced of us to acknowledge only one side and not the other. It is in reference to this latter aspect of human history that, in the first episode of Star Trek: The Next Generation, Q declares that humanity is to be put on trial. The message – the philosophical emphasis and scientific sensibility present in this episode – is, for me, classic Roddenberry: human beings have accomplished many amazing things, but we have also committed an astonishing list of crimes.

With that said, over thousands of years it is difficult to argue that there hasn’t been progress in almost every conceivable dimension of human life and society. Moral progress is especially evident in the last few hundred years, as Steven Pinker highlights in his book The Better Angels of our Nature (2011). Over recent centuries, with the rise of the modern scientific endeavour, there has been an exponential growth in knowledge which, in my personal opinion, has yet to even glimpse full maturation. With modern science, especially in and through the Enlightenment, a counter-force to human prejudice and bias crystallized. Modernity emerged. Values of reason and scientific rigour, along with critical conceptions of progress, democracy, egalitarianism and cosmopolitanism, were developed and impacted the human social world in an almost unparalleled fashion.

We have not even considered a fraction of the total picture, and already one might sense how remarkable it is that we’ve come so far as a species. But it is also hard to argue that, in the midst of the discontinuities brought about by social and cultural transformation, historical progress and thousands of years of social evolution and development, the darker side of human nature – the less admirable qualities of human experience and behaviour – does not maintain some sense of continuity. Prejudice continues. Bigotry would seem omnipresent on the worst of days. War, famine, ignorance – one cannot attempt to weigh the total picture without also acknowledging that, in the midst of so much change and progress, as Brian Cox notes, ‘irrational, unscientific, superstitious, mythic, tribal, nationalistic, dominant, violent, and power driven patterns of behaviour remain prevalent within the human universe’.

From all I’ve read and from my current vantage point, I’ve come to the conclusion that the principle of reason is what continues to give humanity hope. In this respect, I very much enjoyed reading Cox’s book Human Universe (2014), not for the physics, with which I was already deeply familiar, but because there is something magnificently grand and satisfying in the questions he asks. Human Universe reminds me of some of Carl Sagan’s most inspirational literary moments. One example in relation to the continuities of the darker side of human nature follows Cox’s discussion of the advance of technological civilization. Saving detailed discussion of the Drake Equation and the Fermi Paradox for another time, Cox offers a nice site of reflection:

Perhaps it is L, the lifetime of civilisations, that is the fundamental reason for the great silence. This is a sobering thought. The reason we have made no contact with anyone is not because of a lack of stars, or planets, or living things; it’s because of the in-built and unavoidable stupidity of intelligent beings. (p. 113)

Whether such stupidity is unavoidable is up for debate, but the deeper point is sound.

This might seem a bit strong, but it is a view shared at Green Bank by Manhattan Project veteran Philip Morrison. Morrison was intimately involved in the design and development of the first atomic bomb, and he helped load Little Boy onto the Enola Gay, destined for Hiroshima. The fact that human beings had deployed a potentially civilisation-destroying weapon twice, against civilian targets, and that Morrison had personally loaded one of the bombs, must never have left him, and on the eve of the Cuban missile crisis it must have seemed likely that we would do it again on a much grander scale. Drake realised this as well, which is certainly one of the reasons why he introduced the time that a technological civilisation can endure into his equation: we can after all only communicate with nearby civilisations if they exist at the same time as us. This is a possible resolution to the Fermi Paradox.

[…] One could argue that mutually assured destruction, the guiding principle of the Cold War, did act to stabilise our civilisation and is still doing so today. Perhaps no intelligent beings will knowingly destroy their civilisation, which is what global nuclear war on Earth would surely do […]. Similarly, one assumes that the submersion of Miami and Norwich by rising sea levels would silence the so-called climate change sceptics […] and trigger a change of policy that will avert catastrophic, civilisation-threatening climate change in good time. (pp. 113-114)

This passage and many others like it represent a number of sobering and reflective moments offered by Cox. With science and fact, as well as a rich body of research on human evolution, history and civilisation behind him, empirical assessment of where we’ve come from has a way of raising fundamentally important questions about where we would like to go as a species. In the context of current events – from the ongoing struggle against climate change and ‘climate sceptics’ to the increasing tension on the Korean peninsula, with North Korean and US leaders boasting of their nuclear arms – it seems to me that there is a kernel of truth to the fear of stupidity. It never seems far from being an integral feature of our past and present reality.

Cox responds with a tentative signal of hope, namely:

that a small planet such as Earth cannot continue to support an expanding and flourishing civilisation without a major change in the way we view ourselves. The division into hundreds of countries whose borders and interests are defined by imagined local differences and arbitrary religious dogma, both of which are utterly irrelevant and meaningless on a galactic scale, must surely be addressed if we are to confront global problems such as mutually assured destruction, asteroid threats, climate change, pandemic disease, and who knows what else […]. The very fact that the preceding sentence sounds hopelessly utopian might provide a plausible answer to the Great Silence. (pp. 114-115)

It is a penetrating, incisive passage of reflection. It brings to mind not only some of my own studies in my spare time on irrationality, but also a much wider series of questions. The main question is not limited to individual irrationality. Rather it refers to what some philosophers call the “social deficit of reason”, which is broader and more systemic in scope. Our global political system is deeply inadequate. Likewise, our current economic system can be argued to be insufficient. The current parameters of society as a whole are littered with logical inconsistencies. Cox (pp. 217–218) emphasizes the point with an apt example: ATLAS (the Asteroid Terrestrial-impact Last Alert System), an important early warning system for potential asteroid collisions, could be considered almost a global insurance policy for humanity. The cost of such a global insurance policy? One third of the annual wages of top British footballer Wayne Rooney (Cox, 2014, p. 218). “Such comparisons always sound childish”, writes Cox (2014, p. 218), “’Is it reasonable to spend less on asteroid defense than on a footballer’s annual salary?’”

Just as in this essay, for Cox the question of rational choice comes into the human equation. Will the potential of human rationality ultimately prevail? The fact that, for example, “the United States spends more on pet grooming than on fusion research” (p. 238) might inform one’s answer. But Cox continues, “I am a believer in the innate rationality in human beings; given the right education, the right information and the right tuition in how to think about problems […]. I believe that if I said to someone: ‘Here’s the ideal. You can have limitless clean energy for your lifetime, for your children and grandchildren’s lifetimes and beyond, in exchange for grooming your own cat’, then most people would reach for the comb. I have to believe that, otherwise this book is a futile gesture” (p. 238).

A Moment with Newton

R.C. Smith

Over the summer break I set myself the task of reading through a fantastic, recently published edition of the Principia. It has so far been immensely enjoyable. And, in the future, I do not doubt that I will dedicate an entire series of blog posts or essays to my working through Newton’s Principia.

[Image: Principia – new translation]

In the meantime, with each new page I’ve been reminded of my personal appreciation for Newton. Growing up, Sir Isaac Newton was one of my idols. I remember my first introduction to the name Newton when I was no more than six or seven years old. It was in science class, and the initial introduction was basic: it was left simply that Isaac Newton was an important historical figure who contributed much to science. In that vein, the presentation was no different than when I was first introduced to Benjamin Franklin, Thomas Jefferson, or Galileo Galilei. Looking back, I wish there had been more emphasis in the curriculum on exploring these figures and the openness of scientific inquiry shared among them (and other notable historical thinkers). There are very basic lessons, I think, that can be learned about the force or principles of the modern scientific endeavor in the study of some of its most significant figures. One learns that it is not just the individual, himself, but an entire history of human thought and enquiry leading up to that eureka moment. But, importantly, at a young age there is much inspiration to be gleaned from the stories of Newton, Galileo, Maxwell, Faraday (one of my favourite biographies), Friedmann, Wigner, Einstein, and on and on. They all evidence an energy and passion for discovery, and arguably also an incredible creativity in thought. Important values in early education, I would be inclined to argue.

I think it is plausible that most scientists, whether you’re a young student like myself or a seasoned physicist or biologist or whatever, experience one or two “wow!” moments early on that help foster an interest in the pursuit of science. Mine directly relates to Newton.

Newton revolutionized physics by unifying all of mechanics in three laws of motion, but my eureka moment (if I can call it that) relates to his discovery – or invention – of calculus. The formulation of the law of universal gravity, the eventual development of Newtonian field theory – these are all amazing feats in the history of human thought. But it was Newton’s discovery of calculus – needed to solve the equations Newtonian mechanics produced – that really cemented my love for both mathematics and physics. And I suppose, in a very direct way, this all-embracing and inspiring “wow!” moment that struck me when I was younger also owes a debt to Gottfried Leibniz.

Why? Well, admittedly, my introduction to calculus came first through Newton; but it was the eventual realization, as I sought to piece together the history of calculus and its first principles, as well as that of classical physics, where two profound facts struck me in a way I’ll never forget. The development of calculus, along with the concept of vectors, was in a deeply important way absolutely vital to the mechanics of the time; and when one studies the history of mathematical ideas in and around this period, there is a coherent, and certainly observable, logical consistency in the build-up to the final result. Indeed, to preface the following with one more remark: it was intentional that Newton’s invention of calculus was described earlier as a “discovery”. In short: given the demands of the new mathematics that had emerged, without calculus and vectors the very idea of instantaneous velocity, which we rather take for granted today, would have been very difficult. We had arrived at a point, in the history of mathematics and more broadly in the history of human thought, where what was required was a mathematics of change – a mathematics that accounts for change. This was demanded by Newtonian mechanics. But the “wow!” moment, if you will, is not only in how Newton discovered calculus and its relation to the scientific study of nature. Independently of Newton, Gottfried Leibniz also developed calculus! Together, they are both responsible for one of those special and key historical moments of realization, where, in the study of the history of science and of ideas, one is confronted with the special relation between the systematic human pursuit of maths and science and the study of Nature.
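To make the point concrete – using the standard modern limit definition rather than Newton’s own fluxional notation – instantaneous velocity is precisely the kind of quantity that demands calculus:

```latex
v(t) \;=\; \frac{dx}{dt} \;=\; \lim_{\Delta t \to 0} \frac{x(t + \Delta t) - x(t)}{\Delta t}
```

Without a rigorous way of taking this limit, “velocity at an instant” threatens to collapse into the paradox of dividing zero by zero – exactly the conceptual gap that Newton’s fluxions and Leibniz’s differentials filled.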

That Newton, given the demands of his mechanics, and Leibniz, at a similar point in human history, independently discovered calculus suggests that calculus was very much the next logical step.

Now, depending on where one sits with regard to the debates around nominalism and platonism, and the epistemological arguments about knowledge of abstract objects, this realization may be met with great intrigue or a simple shrug. In the case of the latter, if you don’t think numbers exist – that they do not form an integral or constituent part of objective reality – you might simply state that calculus needed to be invented according to the logic of the system within which modern thinking operates, and thus that all mathematics is human invention. But if you’re like me, and you see mathematics as a constituent part of reality, an argument reaching all the way back to the Pythagoreans, then the development of calculus is nothing short of awe-inspiring: it was not so much an act of invention as the next step in the study of the nature of reality. Indeed, it becomes all the more striking if one considers Archimedes’ method of exhaustion for calculating the area under a parabola (anticipating modern integration), developed as far back as the third century BC (Archimedes lived c. 287–212 BC).
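For the curious, Archimedes’ result – the quadrature of the parabola – can be stated as a geometric series: the area A of a parabolic segment is 4/3 the area T of its inscribed triangle, obtained by summing ever-smaller triangles that exhaust the region:

```latex
A \;=\; T \left( 1 + \frac{1}{4} + \frac{1}{4^{2}} + \frac{1}{4^{3}} + \cdots \right) \;=\; \frac{4}{3}\,T
```

That a limiting sum of this kind was handled rigorously nearly two millennia before Newton and Leibniz is, to my mind, part of what makes the eventual arrival of calculus feel less like invention and more like discovery.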

There are many similar events throughout the history of physics and mathematics where the same solution, conclusion or development was arrived at by independent parties. Likewise, in physics, one could add the number of times theory has precisely predicted future measurements (beyond the experimental capabilities of the time), or when two independent theories arrive at the same objective outcome. Physics is incredibly rich in inspiration in this regard. Whenever I come across new examples, it always deepens for me the idea of the objective.

As Albert Einstein once remarked, ‘the idea that truth is independent of human beings is something I cannot prove but something I think is basic’. ‘The problem’, stated Einstein, ‘is the logic of continuity’. Max Tegmark, the celebrated cosmologist, would even go so far as to say that mathematics is reality. Whether that view represents an emerging consensus is debatable, but I’ll save the arguments toward those ends for another time.

In closing, it was in 1665 that Newton began to develop his ideas on calculus. ‘Fluxions’, as he called it, related very directly to Newton’s early study of the laws of motion. Though I know the textbook version of Newton’s Laws of Motion and Newtonian mechanics like the back of my hand, I am only now starting to read through the Principia. My drive toward first principles has me eager to read Opticks and Methodus Fluxionum (on calculus), the latter published posthumously.

As arguably the world’s greatest physicist, to whom more recent heroes like Maxwell and Einstein owe a great debt, Newton was known as a master scientist. What’s most fascinating about the man, since we’re on the subject, is that he also had an extremely energized private interest in religious and mystical pursuits. Apparently, he produced a thousand-page manuscript on his own theological studies, as well as a significant collection of thoughts on things like alchemy. But in public, from what I understand, he would not partake in such wild speculation, which leaves me to wonder whether there was ever a tension between his standards of scientific practice and his personal orthodoxy. It reminds me a little of perhaps one of the more famous stories about Einstein.

I would say, personally, that Einstein is the most creative, critically inquisitive and challenging scientist to have existed. His entire career was based on challenging the status quo. But even Einstein suffered a personal moment of fallibility with respect to the human vulnerability to orthodoxy. As Max Tegmark (2014, p. 43) recounts:

Einstein himself realized that a static universe uniformly filled with matter [Newton’s laws] didn’t obey his new gravity equations. So what did he do? Surely, he’d learned the key lesson from Newton to boldly extrapolate, figuring out what sort of universe did obey his equations, and then asking whether there were observations that could test whether we inhabit such a universe. I find it ironic that even Einstein, one of the most creative scientists ever, whose trademark was questioning unquestioned assumptions and authorities, failed to question the most important authority of all: himself, and his prejudice that we live in an eternal unchanging universe.

To his credit, when further evidence emerged, Einstein admitted that adding this extra term to his equations to account for a static, eternal universe was his greatest blunder. It shows his remarkable character as an iconic scientist to admit to such a moment of prejudice, and to analyze challenges to that prejudice in an open and rational way. I like to think that, with all we now know in our current moment of history, Newton’s response to challenges to his personal beliefs would have been the same.
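For reference, the “extra term” in question is the cosmological constant Λ, which Einstein added to his field equations (written here in a standard modern form) to permit a static universe:

```latex
G_{\mu\nu} + \Lambda\, g_{\mu\nu} \;=\; \frac{8\pi G}{c^{4}}\, T_{\mu\nu}
```

Once Hubble’s observations showed the universe to be expanding, the term was no longer needed for that purpose – though, in a final irony, a positive Λ has since returned to modern cosmology as a description of dark energy.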