August seems like an unlikely time to be talking about Christmas, but it’ll be here before you know it so we might as well start the debate early this year.
December is usually the time of year when Christians all over the world start going back to church in droves, preparing for the Christmas mass on the 25th of December.
It has long been a bone of contention for regular churchgoers, who see those other people coming during the holy times of year out of duty rather than living a life dedicated to the church year round. Sometimes you have to wonder whether some of them only show up to church to show off their sons and daughters in new Christmas outfits.
But either way, Christmas is not only a time of celebration, but also a time of great controversy. As you may or may not know, there has been an ongoing debate about the existence of Jesus Christ for as long as anyone can remember, and it’s no wonder.
I mean, have you read the Bible? It seems a long shot that one person could have lived such a life, so it’s no wonder that so many scholars continue to question the existence of a real-life Jesus.
Why a growing number of people suggest Jesus may have never existed
While there is a growing body of internet literature arguing against the existence of the man we know as Jesus Christ, much of it is muddied by atheist writers who are simply trying to make a point.
The academic and scientific communities are still at odds about the existence of such a man, but there’s no doubt that these stories had to have come from somewhere, based on someone’s life, even loosely.
What’s interesting is that the stories themselves are flawed; for example, it’s widely known that entire years of Jesus’ life are missing from the Bible stories. Perhaps no one was around to record those periods of the man’s life.
But whatever the reason, it just adds fuel to the fire about whether or not there was an actual man named Jesus Christ.
What’s more, evidence has accumulated over the years indicating that the Bible stories have been heavily misinterpreted and edited for the gain of the Catholic and Christian churches.
Several well-researched books question the traditional account of Jesus’ life, including Zealot by Reza Aslan, Nailed: Ten Christian Myths That Show Jesus Never Existed at All by David Fitzgerald, and How Jesus Became God by Bart Ehrman.
One historian even wrote a 600-page manuscript on the history of Jesus that makes many compelling arguments that the stories of Jesus were fabricated and manipulated to create the legend of a god-like man.
If Jesus has been fabricated, it shouldn’t be surprising
It shouldn’t seem surprising that humanity has held onto the belief in a godly figure for so long. After all, the Greeks, Romans, pagans, and other peoples believed in celestial gods. Entire civilisations believed they were ruled by gods and obeyed them for protection and knowledge.
Why is no one wondering about the existence of Zeus or Thor? They aren’t just comic book characters, you know. People actually believed in these entities.
The conversation about whether there was a real man named Jesus, the one the Bible was written about and whom millions of people worldwide believe in, has many layers.
There are so many elements to these stories that it is hard not to believe them. Yet, many more millions of people don’t believe in God, Jesus or any other worldly being.
As a society, there seems to be a shift toward believing in the power of the universe, fate, or other things beyond our control. People have no trouble praising the universe for their existence, but when it comes to praising God or Jesus, they clam up; a certain embarrassment is present these days.
So while the atheists and the scholars might not think they are making headway in disproving the existence of Jesus, people are taking notice and holding their beliefs close to their chests.
Perhaps because they, too, question what they think they know, and are finding it difficult to navigate a world in which Jesus is not real.
There is a growing trend of people turning away from religion. Those left standing in the path of righteousness seem to have difficulty understanding this global shift.
It seems that the most affluent people, the ones with money, excess, and access, are the ones turning their backs on organized religion, and it’s causing a lot of concern.
One theory behind this conundrum is that as people become better able to care for themselves through financial means and external resources, they worry less about dying and feel less need to believe in a higher power than they would have even 50 years ago.
People are more about “living in the moment” now than planning for a life in “paradise” with a God they’ve never seen.
Religion won’t be going away anytime soon, however
While a great deal of evidence is mounting in support of atheism (mostly due to the incredible access to information we all have these days), there are also many solid arguments from religious scholars who say these minority groups of atheists are not powerful enough to change an entire religion’s belief system.
Those who believe still currently outnumber those who don’t believe. However, it is predicted that atheism will overtake religion by 2038.
Why people are becoming less religious
Outside of the argument for the existence of God, there are a number of global factors that affect how families are raised and what they believe. For example, as more and more women enter the workforce, family sizes continue to decrease. Many couples are opting not to raise children at all, ending the transmission of their religious beliefs to the next generation.
Another factor to consider in reports and studies is that people who believe in God, or some form thereof, may not disclose that belief. Some people believe in a God without being “religious”, so the use of the word religion is tricky. Others believe in a God superficially and only tap into that belief when it serves them in the moment.
It seems that people in economically developed countries such as Spain, Canada, Switzerland, Germany, and France are split between being religious and being atheist.
In fact, research has shown that countries with split religious beliefs, like Canada and France, rank high on morality, social trust, and economic equality, and have lower crime rates than countries with higher numbers of religious believers.
While religion and the notion of God are a highly personal matter, people feel compelled to spread their messages of support or disdain, making it difficult for a person to decide what they believe.
It may be that people find it easier to just “give up” their beliefs in order to fit into a world where God doesn’t seem to be present; but the topic is hot, and both sides of the debate are left with plenty of unanswered questions about whether religion is going anywhere, anytime soon.
Buried within the archives of a museum in Missouri, an essay on the search for alien life has come to light, 78 years after it was penned. Written on the brink of the second world war, its unlikely author is the political leader Winston Churchill.
If the British prime minister was seeking solace in the prospect of life beyond our war-torn planet, would the discovery of a plethora of exoplanets aid or hinder such comfort?
The 11-page article – Are We Alone in the Universe? – has sat in the US National Churchill Museum archives in Fulton, Missouri since the 1980s; it was reviewed by astrophysicist Mario Livio in this week’s edition of the journal Nature.
Livio highlights that the unpublished text shows Churchill’s arguments were extremely contemporary for a piece written nearly eight decades previously. In it, Churchill speculates on the conditions needed to support life but notes the difficulty in finding evidence due to the vast distances between the stars.
Churchill fought the darkness of wartime with his trademark inspirational speeches and championing of science. This latter passion led to the development of radar, which proved instrumental to victory over Nazi Germany, and a boom in scientific advancement in post-war Britain.
Churchill’s writings on science reveal him to be a visionary. Publishing a piece entitled Fifty Years Hence in 1931, he detailed future technologies from the atomic bomb and wireless communications to genetically engineered food and even humans. But as his country faced the uncertainty of another world war, Churchill’s thoughts turned to the possibility of life on other worlds.
In the shadow of war
Churchill was not alone in contemplating alien life as war ripped across the globe.
The British government was also taking the prospect of extraterrestrial encounters seriously, receiving weekly ministerial briefings on UFO sightings in the years following the war. Concern that mass hysteria would result from any hint of alien contact resulted in Churchill forbidding an unexplained wartime encounter with an RAF bomber from being reported.
Faced with the prospect of widespread destruction in a global war, people’s heightened interest in life beyond Earth could be interpreted as being driven by hope.
Discovery of an advanced civilisation might imply the huge ideological differences revealed in wartime could be surmounted. If life was common, could we one day spread through the Galaxy rather than fight for a single planet? Perhaps if nothing else, an abundance of life would mean nothing we did on Earth would affect the path of creation.
Churchill himself appeared to subscribe to the last of these, writing:
I, for one, am not so immensely impressed by the success we are making of our civilisation here that I am prepared to think we are the only spot in this immense universe which contains living, thinking creatures.
A profusion of new worlds
Were Churchill prime minister now, he might find himself facing a similar era of political and economic uncertainty. Yet in the 78 years since he first penned his essay, we have gone from knowing of no planets outside our Solar System to the discovery of around 3,500 worlds orbiting other stars.
Had Churchill lifted his pen now – or rather, touched his stylus to his iPad Pro – he would have known planets could form around nearly every star in the sky.
This profusion of new worlds might have heartened Churchill and many parts of his essay remain relevant to modern planetary science. He noted the importance of water as a medium for developing life and that the Earth’s distance from the Sun allowed a surface temperature capable of maintaining water as a liquid.
He even appears to have touched on the fact that a planet’s gravity would determine its atmosphere, a point frequently missed when considering how Earth-like a new planet discovery may be.
To this, a modern-day Churchill could have added the importance of identifying biosignatures; observable changes in a planet’s atmosphere or reflected light that may indicate the influence of a biological organism. The next generation of telescopes aim to collect data for such a detection.
By observing starlight passing through a planet’s atmosphere, the composition of gases can be determined from a fingerprint of missing wavelengths that have been absorbed by the different molecules. Direct imaging of a planet may also reveal seasonal shifts in the reflected light as plant life blooms and dies on the surface.
Where is everybody?
But Churchill’s thoughts may have taken a darker turn in wondering why there was no sign of intelligent life in a Universe packed with planets. The question “Where is everybody?” was posed in a casual lunchtime conversation by Enrico Fermi and went on to become known as the Fermi Paradox.
The solutions proposed take the form of a great filter or bottleneck that life finds very difficult to struggle past. The question then becomes whether the filter is behind us and we have already survived it, or if it lies ahead to stop us spreading beyond planet Earth.
Filters in our past could include a so-called “emergence bottleneck” that proposes that life is very difficult to kick-start. Many organic molecules such as amino acids and nucleobases seem amply able to form and be delivered to terrestrial planets within meteorites. But the progression from this to more complex molecules may require very exact conditions that are rare in the Universe.
The continuing interest in finding evidence for life on Mars is linked to this quandary. Should we find a separate genesis of life in the Solar System – even one that fizzled out – it would suggest the emergence bottleneck didn’t exist.
It could also be that life is needed to maintain habitable conditions on a planet. The “Gaian bottleneck” proposes that life needs to evolve rapidly enough to regulate the planet’s atmosphere and stabilise conditions needed for liquid water. Life that develops too slowly will end up going extinct on a dying world.
A third option is that life develops relatively easily, but evolution rarely results in the rationality required for human-level intelligence.
The existence of any of those early filters is at least not evidence that the human race cannot prosper. But it could be that the filter for an advanced civilisation lies ahead of us.
In this bleak picture, many planets have developed intelligent life that inevitably annihilates itself before gaining the ability to spread between star systems. Had Churchill considered this on the eve of the second world war, he may well have judged it a probable explanation for the Fermi Paradox.
Churchill’s name went down in history as the iconic leader who took Britain successfully through the second world war. At the heart of his policies was an environment that allowed science to flourish. Without a similar attitude in today’s politics, we may find we hit a bottleneck for life that leaves a Universe without a single human soul to enjoy it.
Perhaps nutrition is the field most in the spotlight. It took several decades for cholesterol to be absolved and for sugar to be re-indicted as the more serious health threat, a delay owed partly to a research programme the sugar industry sponsored in the 1960s and 1970s, which successfully cast doubt on the hazards of sucrose while promoting fat as the dietary culprit.
We think of science as producing truths about the universe. Triumphs of science, like the recent confirmation of the existence of gravitational waves and the landing of a probe on a comet flying around the sun, bring more urgency to the need to reverse the present crisis of confidence in other areas of the scientific endeavour.
Science is tied up with our ideas about democracy – not in the cold war sense of science being an attribute of open democratic societies, but because it provides legitimacy to existing power arrangements: those who rule need to know what needs to be done, and in modern society this knowledge is provided by science. The science-knowledge-power relationship is one of the master narratives of modernity, whose end was announced by philosopher Jean-François Lyotard four decades ago. The contemporary loss of trust in expertise seems to support his views.
Still, techno-science is at the heart of contemporary narratives: the convictions that we will innovate our way out of the economic crisis, overcome our planetary boundaries, achieve a dematerialised economy, improve the fabric of nature, and allow universal well-being.
The appeal of reassuring narratives about our future depends on our trust in science, and the feared collapse of this trust will have far-reaching consequences.
The cult of science is still adhered to by many. Most of us need to believe in a neutral science, detached from material interests and political bargaining, capable of discovering the wonders of nature. For this reason, no political party has so far argued for a reduction in science funding on the basis of the crisis in science, but this threat could soon materialise.
The crisis we saw coming
The crisis in science is not a surprise – some scholars of history and philosophy of science had predicted it four decades ago.
Derek de Solla Price, the father of scientometrics – literally the scientific study of science – feared the quality crisis. He noted in his 1963 book, Little Science, Big Science, that the exponential growth of science might lead to saturation, and possibly to senility (an incapacity to progress any further). For contemporary philosopher Elijah Millgram, this disease takes the form of disciplines becoming alien to one another, separated by different languages and standards.
Jerome R Ravetz noted in 1971 that science is a social activity, and that changes in the social fabric of science – once made up of restricted clubs whose members were linked by common interests and now a system ruled by impersonal metrics – would entail serious problems for its quality assurance system and important repercussions for its social functions.
Ravetz, whose analysis of science’s contradictions has continued to the present day, noted that neither a technical fix nor a system of enforced rules would remedy this. Scientific quality is too delicate a matter to be resolved with a set of recipes.
A perfect illustration of his thesis is the recent debate about the P value – commonly used in experiments to judge the quality of scientific results. The inappropriate use of this technique has been strongly criticised, provoking alarm – and statements of concern – at the highest levels in the profession of statistics. But no clear agreement has been reached on the nature of the problem, as shown by the high number of critical comments in the ensuing debate.
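One strand of that criticism is easy to demonstrate: when the null hypothesis is true, p-values are uniformly distributed, so screening many comparisons at the conventional 0.05 threshold reliably produces “significant” flukes. The following is a minimal Python simulation of that point (my own illustration, not drawn from the debate itself; the z-test helper and sample sizes are arbitrary choices):

```python
import math
import random

def two_sided_p(a, b):
    """Approximate two-sided p-value for a difference in means,
    using a normal approximation (both groups here have variance 1)."""
    n = len(a)
    diff = sum(a) / n - sum(b) / n
    z = diff / math.sqrt(2.0 / n)
    return math.erfc(abs(z) / math.sqrt(2))

# 2,000 experiments in which the null hypothesis is true by
# construction: both groups come from the same N(0, 1) distribution.
random.seed(42)
trials = 2000
false_positives = 0
for _ in range(trials):
    a = [random.gauss(0, 1) for _ in range(50)]
    b = [random.gauss(0, 1) for _ in range(50)]
    if two_sided_p(a, b) < 0.05:
        false_positives += 1

rate = false_positives / trials
print(f"false-positive rate at p < 0.05: {rate:.3f}")  # close to 0.05
```

Roughly one experiment in twenty clears the threshold even though no real effect exists; run enough comparisons and a publishable-looking result is all but guaranteed.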
Philip Mirowski’s recent book offers a fresh reading of the crisis in terms of the commercialisation of science’s production. Scientific research deteriorates when it is entrusted to contract research organisations, working on a short leash held by commercial interests.
Science-based narratives and the social functions of science will then lose their appeal. No solution is possible without a change in the prevailing vision and ideology, but can scientific institutions offer one?
The supremacy of expertise
Here the stakes are high and perverse systems of incentives entrenched. Many scientists are highly defensive of their work. They adhere to the deficit model, in its standard or glorified form, whereby if only people understood science – or at least understood who the true experts were – then progress would be achieved.
Scientists often subscribe to the myth of one science, and promote actions for or against a policy based on their position as scientists. In a recent case, more than 100 Nobel laureates took a side in a dispute over a genetically modified rice, a rather complex case where more prudence would have been in order.
Climate is another battlefield where the idea that “science has spoken” or “doubt has been eliminated” have become common refrains.
There is an evident tension between this view and what takes place in the arena of evidence-based (or informed) policy. Here legislation developed to fight racketeering is used by activists and scientists to target their peers in the opposing faction, in hot fields from climate to biotechnologies.
The science of economics is still in control of the master narrative. The same craft that failed to predict the latest great recession – and worse, directly engineered it through its financial recklessness – is still dictating market-based approaches to overcoming present challenges. By its own admission, the discipline, which supported austerity policies with a theorem based on a coding error, has little clue as to what to do if the global economy faces another downturn.
The economic historian Erik Reinert notes that economics is the only discipline impermeable to paradigm shifts. For economics, he says, the earth is round and flat at the same time, all the time, with fashions changing in cyclical shifts.
One can see in the present critique of finance – as something having outgrown its original function into a self-serving entity – the same ingredients of the social critique of science.
Thus the ethos of “little science” reminds us of the local banker of old times. Scientists in a given field knew one another, just as local bankers had lunch and played golf with their most important customers. The ethos of techno-science or mega-science is similar to that of the modern Lehman bankers, where the key actors know one another only through performance metrics.
If this wave of concern merges with the science crisis, then important facets of our modernity might be up for discussion. Will this lead to a new humanism as hoped by some philosophers, or to a new dark age, as feared by others?
The conflicts described thus far involve values in conflict, of the type dealt with in something called “post-normal science”. Many dislike the name of this approach for its postmodern associations, but appreciate its model of extended peer communities. These communities bring together experts from across disciplines – as different disciplines see through different lenses – and anyone affected or concerned with the subject at hand, with possibly different views about what the problem is.
Today, extended peer communities are set up by some activist citizens and scientists. This format encourages a humbler, more reflexive attitude. It suggests to citizens a more critical and participatory attitude in matters of science and technology, with less deference towards experts.
If this process leads to reform in science and challenges the monopoly of knowledge and authority – as to some extent we see happening in health – then we might go some way to rebuilding trust in one of the most important facets of modern life.
Modern societies are usually defined as relatively unreligious, dominated by money and power rather than belief in gods. This idea marks them out as modern when compared to traditional societies as well as highlighting the many issues of modernity including capitalism, growth, overproduction and climate change.
But why are we so sure that secularisation and the dominance of politics and economics are in the DNA of modern societies? Our answer to this question defines and confines our problem-solving ability.
A recent article in the journal Futures shows that most strategic management tools and models of the future have a strong bias towards politics, economics and science, thus systematically neglecting religion, law, art, or education.
Given that this bias is unconscious and unjustified, we risk constantly looking for solutions to the wrong problems.
We undertook big data research on the digital database created by the Google Books project, which has scanned and digitised over 25 million of the estimated 130 million books ever published worldwide.
To systematically screen this huge collection of text, we used the Google Books Ngram Viewer, a free online graphing tool that charts annual word counts as found in the Google Books project. The Ngram Viewer comes with an intuitive interface where users can enter keywords, choose the sample period, define the desired language area, and modulate the shape of the graphical output.
One of our challenges was to find the right keywords. For this, we used an open source tool by Jan Berkel.
The result was a list of the 10,000 most frequently used words and strings in books published between 1800 and 2000. This period covers a considerable proportion of the era commonly referred to as modernity, and Google regards the data for it as reliable.
We repeated the procedure until we had compiled one list each for English, Spanish, Russian, French, German, and Italian. We then screened the word frequency lists for terms that make unambiguous and distinct keywords. Money or God make good examples of such keywords, whereas we omitted terms such as tax or constitution as they refer to both politics and economy or law.
Finally, we entered stacks of the five most frequent religious, political, economic, and other pertinent keywords to run comparative analyses of word frequency time-series plots as displayed by the Google Ngram Viewer.
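In spirit, the comparative analysis reduces to normalised word counts summed over keyword stacks. The following is a toy Python sketch of that idea (the mini-corpus and helper names are invented for illustration; the actual study used the Ngram Viewer’s own counts):

```python
import re
from collections import Counter

def word_frequencies(text):
    """Per-million-word frequencies, mirroring the Ngram Viewer's
    normalised counts for a single year's corpus."""
    words = re.findall(r"[a-z']+", text.lower())
    counts = Counter(words)
    total = len(words)
    return {w: c / total * 1_000_000 for w, c in counts.items()}

def stack_frequency(freqs, keywords):
    """Combined frequency of a keyword stack, e.g. the five most
    frequent religious or political terms."""
    return sum(freqs.get(k, 0.0) for k in keywords)

# Invented mini-corpus standing in for one year of books
corpus = "the king prayed to god and the church held mass while parliament debated"
freqs = word_frequencies(corpus)

religion = stack_frequency(freqs, ["god", "church", "faith", "holy", "mass"])
politics = stack_frequency(freqs, ["parliament", "state", "government", "king", "law"])
```

Plotting such stack totals year by year, one line per language corpus, yields the kind of time series the comparative analyses rest on.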
The figure below shows word frequencies of combined religious, political, economic, scientific, and mass media-related keywords in the English-language Google Books corpus between 1800 and 2000.
Since we analysed a considerable proportion of humanity’s collective memory between 1800 and 2000, and since the outcomes of our research resemble classical electroencephalography (EEG) recordings (see figure), we also linked our research to the global brain discourse.
The basic idea here is that the worldwide network of information and communication technology acts as the global brain of planet earth. In this sense, our electroencephalographic big data internet research is the first example of a global brain wave measurement, which was heralded by Peter Russell in his 1982 book The Global Brain.
Secularisation, much politics, and no capitalism
Looking at the global brain wave recordings (figure below), we find that our method performs well in capturing the expected decline of religion (orange line), which, by the way, is less significant in Spanish and Italian.
The chart for English language shows notable interactions between the two world wars and the significance of politics (blue line), and we find similar interactions in other areas, too.
In the Russian segment, the importance of politics is considerably increased during the (post-) October Revolution period and even more dramatically so in the context of the second world war.
In the French case, the years around the first world war see a steep rise in the importance of politics, whereas the interaction during the second world war is much more moderate. The German data follow a similar pattern as the French, but exhibit the dramatic rise of politics in the post-second world war era, peaking at around 1970.
The rise of the importance of the mass media corresponds to the timeline of the information age (green line). What we do not see, however, is the dominant position of the economy (purple line) in modern societies. There is a short period between 1910 and 1950 when the economy was second to a much stronger politics.
This image of a war economy is the closest approximation to a capitalist situation to be found in the English segment, in which the economy is outperformed by science (red line) soon after the second world war and by mass media in the 1990s, ranking fourth at the end of the sample period.
There’s no sign of the golden age of capitalism in the 19th century either, as narratives of the Industrial Revolution would lead us to believe.
The charts for the other languages also do not support the idea of modern societies as being capitalist or otherwise dominated by the economy. The only exception is the French segment, where economy has been second again to a much stronger politics since the end of the second world war.
The economy ranks third in the Russian segment only from the late 1950s to the 1990s, and in the German segment not before the 1970s; it ranks well below par in the Spanish and Italian segments.
New god of modern societies
Our big data research hence suggests that modern societies are heavily politicised; that most investigated societies are clearly secularised, with reservations applying to the societies where Spanish and Italian are spoken; and that science plays a remarkable role, ranking second in the English-, Russian-, and German-language areas in the second half of the 20th century.
None of the investigated societies is dominated by the economy, with the minor reservation discussed above applying only to French. Even this finding reflects only the idea of capitalism as an economy-based political ideology, not the idea of the primacy of the economy.
The major finding of our research is that political power rather than economics has dethroned religious faith as the dominant guiding principle between 1800 and 2000.
Our data definitely suggests that, despite all contradicting ideologies or habits of mind, the economy is of only moderate importance to modern societies. This implies that, in the future, we may wish to think twice before we continue to label our societies as money-driven, economy-biased, or simply capitalist.
One major shortcoming of our research is that it focused only on books. But this focus is adequate as the ideas of capitalism and the primacy of economy have been developed precisely in the books we investigated.
The idea that the definitions of modern societies as capitalist or economy-dominated are probably rooted in misconceptions rather than modern scientific worldviews may be counter-intuitive or even shocking for both capitalists and anti-capitalists.
Acknowledgement: Our research was first presented at 2016 City University of Hong Kong Workshop on Computational Approaches to Big Data in the Social Sciences and Humanities. I am grateful to Jonathan Zhu and the entire team of the Web Mining Lab at the CityU Department of Media and Communication for the invitation to Hong Kong as well as for valuable feedback.
But that doesn’t mean the Reef is out of danger. Afforded World Heritage recognition in 1981, the Reef has been on the warning list for nearly three years. It’s not entirely evident why UNESCO decided not to list the Reef as “in danger” at this year’s meeting, given the many ongoing threats to its health.
However, the World Heritage Committee has made it clear they remain concerned about the future of this remarkable world heritage site.
The reef is still in deep trouble
UNESCO’s draft decision (the adopted version is not yet released) cites significant and ongoing threats to the Reef, and emphasises that much more work is needed to get the health of the Reef back on track. Australia must provide a progress report on the Reef in two years’ time – and the committee wants to see efforts to protect the reef accelerate.
About 40% of this vegetation clearing is in catchments that drain to the Great Barrier Reef. Land clearing contributes to gully and streambank erosion, which means that soil (and whatever chemical residues are in it) washes into waterways and flows into the reef lagoon, reducing water quality and affecting the health of corals and seagrass.
Land clearing also directly contributes to climate change, which is the single biggest threat to the Reef. The recent surge in land clearing in Queensland alone threatens Australia’s ability to meet its 2030 emissions reduction target. Yet attempts by the Queensland Government to control excessive land clearing have failed – a concern highlighted by UNESCO in the draft decision.
A time for action, not celebration
The Reef remains on UNESCO’s watch list. Just last month the World Heritage Committee released a report concluding that progress towards achieving water quality targets had been slow, and that it does not expect the immediate water quality targets to be met.
The draft decision still expressed UNESCO’s “serious concern” and “strongly encouraged” Australia to “accelerate efforts to ensure meeting the intermediate and long-term targets of the plan, which are essential to the overall resilience of the property, in particular regarding water quality”.
This means reducing run-off of sediment, nutrients and pollutants from our towns and farmlands. Improving water quality can help recovery of corals, even if it doesn’t prevent mortality during extreme heatwaves.
The Great Barrier Reef is the most biodiverse of all the World Heritage sites, and of “enormous scientific and intrinsic importance” according to the United Nations. A recent report by Deloitte put its value at A$56bn. It contributes an estimated A$6.4bn annually to Australia’s economy and supports 64,000 jobs.
But the reef cannot exist in the long term without international efforts to curb global warming. To address climate change and reduce emissions, we need to act both nationally and globally. Local action on water quality (the focus of the Reef 2050 Plan) does not prevent bleaching, or “buy time” to delay action on emissions.
We need adequate funding for achieving the Reef 2050 Plan targets for improved water quality, and a plan to reach zero net carbon emissions. Without that action, an “in danger” listing seems inevitable in 2020. But regardless of lists and labels, the evidence is clear. The Great Barrier Reef is dying before our eyes. Unless we do more, and fast, we risk losing it forever.