© Academic Connections, International

     In the foreword to David K. Naugle’s Worldview: The History of a Concept—which has become something of a “standard reference” on the subject—Christian philosopher Arthur Holmes writes:

“…What Christian academics have long asserted is that biblical religion is not inimical to serious scholarship but motivates it, illumines the mind, opens new avenues for inquiry and draws things together in a meaningful whole. All truth in the final analysis is about the ways and works of God. But the secular academy under the spell of modernity found it outrageous that a place be given to scholarship from a religious point of view: the rule of ‘reason alone’ excludes it…”[i]

     Because of that, contemporary Christian scholars who wish to think Christianly about their academic work face a great challenge: being taken seriously by their secular colleagues. Indeed, thinking Christianly about one’s academic work is often considered crossing a line, with serious consequences for one’s academic career; for that reason alone, many scholars who are Christians at secular institutions hesitate to let what they know as Christians inform any of their research or work.

     For that matter, Christian apologists have faced a similar dismissive marginalization when they wish to defend the Christian faith in a milieu that deems their arguments specious simply because of the “insights” of modernity. But just what are these supposed insights of modernity that provide that kind of leverage?

Faith and Reason

     The goal of this essay is to uncover some of the history of how what Holmes called the “spell of modernity” came about, to offer a brief analysis and critique of it, and finally to challenge Christian scholars to engage in further research and discussion. It seems necessary to get to the bottom of this transformation in the history and sociology of ideas so that we might critically investigate current academic orthodoxy, and thereby speak to every generation about the underlying assumptions of each discipline and its particular methodologies, and ask how our Christian faith informs these things.

     To introduce that, in Part A we will provide a first approximation of the current meanings of the terms “faith” and “reason.” In Part B we will supplement that by considering the roots and impact of logical positivism in forming a legacy within the sciences that shaped that understanding. In Part C, we will look at the impact of methodological conventions and the naturalizing tendencies that follow from those methods. Finally, in Part D we will discuss how the humanities’ critique of the Christian faith and its scholarship has influenced these terms, and offer an ambitious challenge to Christian scholars to set the record straight.

Part A — A First Approximation of the Problem

     To begin, we think the issues we are addressing can be revealingly framed by asking how the terms “faith” and “reason” are understood and used in contemporary academe. Putatively, “faith” fits into the category of the subjective, the unsubstantiated and the prejudicial, and is thus not worthy of serious concern; “reason,” by contrast, is associated with things thought to be objective, substantiated, unbiased and worthy of consideration and reflection. Another preliminary way of saying this: faith is about subjective believing and reason is about objective knowing. Because of the evolution of these terms in the history of thought, we think this putative understanding is not quite right, and we will need to explore that evolution to be clearer. However, it is close enough to explain, practically, why faith has been excluded from a position of authority in the great academies of the world, while what is of Reason is widely accepted as credible and as having authority. Surely not everything claimed to be of “reason” gets a free pass; however, nearly everything of “faith” is excluded.

     The university, especially through its successes and advances in science, technology and mathematics, has created substantial cultural credibility for itself and, for the most part, rightly so. This success allows the university to play a dominant role in forming what Lesslie Newbigin has called, in sociological terms, a cultural “plausibility structure”—that filter which separates the intellectual wheat from the chaff, culturally and ideologically speaking. That filter and its authority are of extreme importance because they can even be written into the law of the land. This cannot be a matter of casual concern for contemporary Christian scholars or defenders of the Christian faith, despite the fact that a majority of professors in the United States currently have some religious affiliation.

Part B – The Roots of 20th Century Logical Positivism & Its Impact

     If that first approximation of the current state of affairs is in the ballpark, the next question to ask is where we received the tradition that led to this understanding. We think the proximate development of the received tradition regarding these terms was significantly shaped by two things: 1) the rise of modern science, through the evolution of thought and the achievements of people like Francis Bacon, Galileo, Newton, Faraday, and Einstein, to name a few, an influence that continues to shape the dialogue and narrative of the received tradition to this day; and 2) the evolving conversation about epistemology among Enlightenment philosophers,[ii] including Descartes, Locke, Hume and especially Kant, and their legacy.

     The rise of what we now call modern science had an even more radical effect on the modern worldview of academe than the Renaissance did. It played a major (perhaps the major) role in undermining confidence in revealed religion because, as it matured, it seemed to directly contradict traditional readings of the scriptures. Indeed, the beginnings of deference to the deliverances of scientific inquiry would also start us on a road that eventually undermined a great deal of classic Greek and Roman wisdom. For reasons of length we can speak only briefly about that heritage here; there are resources on our ACI website where you can more carefully peruse the relevant literature and what we have to say about it.

     There is more than one place we could begin a more fine-grained analysis of our heritage. Some prefer starting with Francis Bacon; for our purposes we will start with the Catholic canon Copernicus. Unsatisfied with the way the Ptolemaic geocentric model of our system satisfied the Aristotelian requirement of the circular motion of the planets, he concluded that the Ptolemaic picture of our solar system was incorrect. The bad news for faith was that this old system had been adopted by the Catholic church into Christian doctrine. Nonetheless, the sun, not the earth, was at the center of what we now call our solar system. In the not-too-distant future this conflict between doctrine (not necessarily scripture) and science would become front and center for the Roman Catholic church and Galileo.

     Near the end of the 17th century, Newton, whom many still consider the greatest scientist who has ever lived, presented the world with a picture of the universe as a great machine that could be mathematically understood. It was as if a great mathematician had created a world—a mechanistic world—that worked according to empirically discoverable causes and effects that could be expressed in equations. God was still the Creator and the Sustainer of the universe, but after Creation He wasn’t needed for understanding its regularities; the math would do that. Philosophically speaking, deism is what we would eventually call that picture of the God-world relation.

     The Enlightenment intellectuals of the 18th and 19th centuries wanted to push the investigation of human understanding further. Most of the heavy philosophical lifting of the period was done by those who wished to plumb the depths of human epistemic foundations.[iii] A forceful motivation for these inquiries was the sense that what one could not “prove” by sound arguments (arguments having proper form and true premises), one could not rationally know. What could not clear that bar of reason was thought to be merely a matter of subjective opinion—a belief. This approach had the “benefit” of ridding the intellectuals of concern about God, values, ethics and revealed religion’s claims on humanity, with all their distasteful baggage, while still holding onto something of what they wanted—the physical world. Sadly, it was evolving into an understanding of the physical world without objective meaning or moral Truth. That perspective set the stage on which humanity, not God, could more formally take on the mantle of being the measure of all things.

     (You want to keep an eye on that motivation, because you may find it ineluctably leads to a bootstrapping problem for all claims to Truth, and that might mean the undermining of all claims to objective knowledge (and Truth), including an objective understanding of the external world. Wouldn’t it be ironic if those who held a secular perspective on this and pressed their point, finding the passing joy of a cosmic Oedipus complex fulfilled, thereby cut themselves off from claiming to know objective meaning, purpose, ethics and Truth?)

     At the beginning of the modern philosophical era, Descartes—widely considered the first modern philosopher because he attempted to separate himself from tradition and establish his systematic thinking on the basis of reason alone—was interested in getting to the bottom of how we know what we claim to know. He thought such an explanation might in important ways resemble the model of mathematical knowledge. That is, we would begin with something we obviously know (in mathematics these are called definitional axioms) and build deductively from there. Descartes wanted to identify indubitable “axioms” through reasoning (really skeptical reasoning)—not from his experience, but by merely “thinking” about it—and by that method arrive at what he called “clear and distinct ideas” from which he could then build.

     Descartes thought he discovered such a clear and distinct idea by means of radical skepticism: the discovery of the certainty of his own existence. That is, though an experiential skeptic, he could not meaningfully doubt his own existence. He wanted to begin with that certainty and see what he could build upon it by means of deduction. Deduction was important to his method because it provided a chain of certainty that avoided the possible deceptions of sensory perception and induction. You will likely recognize this approach as characteristic of the epistemic strategy of the “rationalist” school of thought; that is, the term “rationalism” in this epistemic sense names a program holding that we come to know things by starting with what cannot be doubted and deducing from there. This approach was juxtaposed with the “empirical” school of thought and was meant to escape the fallible foundations of our experience. Innate ideas (roughly, his axioms) played a major explanatory role in getting this project off the ground.

     On the other hand, epistemic representatives of the empirical school of thought, people like Locke and Hume, held that what we know through self-evident definitions is very little at all: logical entailments and maybe our own existence, but not even the nature of our existence. Instead, they thought one should develop a knowledge of the world through the probable means of our experience and reflection on that experience. They generally eschewed, for various reasons, confidence in innate ideas. However, the roughly contemporary analysis of the enfant terrible David Hume, even though he leaned toward a nuanced empiricist understanding in many ways, may have undercut the theoretical underpinnings of both schools of thought. Indeed, we think it did.

     Roughly speaking, a kernel idea that emerges from Hume’s analysis and impacts our discussion is that while rationalism and strict empiricism both succumbed to skepticism, at least the empirical world practically impinged on us in a greater way than did religion, and we were forced as a practical matter to pay more attention to it. Hume built upon this to suggest that what we know, if we know anything, is the empirical world and its relations; thus, things like religion and morality, which were not discovered through our senses, were to be considered subjective, prejudicial and obviously not objective knowledge. As such, they were fit only for “the flames.”

     After reading Hume’s analysis, the philosopher Immanuel Kant said that he was “awakened from his dogmatic slumber,” and he subsequently began to develop its implications in a rather complicated pragmatic or instrumental understanding of empirical knowledge. In his system we would not know the things of reality in themselves, but only the things as they appear to us. The view Kant was developing was a form of philosophical idealism. The distinction in that view between knowing the phenomena (the thing to me) and not knowing the nature of the noumena (the reality of the thing in itself) has had an enormous influence on the evolution of thinking about the meaning of “faith” and “reason.”

     Despite the instrumental nature of the knowledge produced in Kant’s scheme, the perceived successes of the empirical sciences—Galilean and Newtonian science, for example—lent gravitas to the common-sense empiricist intuition that we are, in some unspecified sense, in touch with reality. At the same time that modern science was advancing in its discoveries, conflicts between literal readings of the Bible and the purported deliverances of science began to surface. Especially crucial in this narrative is the impact in the 19th century of Darwin’s theory of natural selection, acting on what was then the black box of genetics to produce more successfully adapted progeny, and the conflict it had with a literal reading of the special creation account in Genesis.

     Returning to Kant, a difficulty he is thought to have exposed was that the ideas we form in our minds are a product of our mind’s construction. This view was implicit in his discussion of what he called categories pre-existing in our mind, that is, pre-existing our experience. These inherent categories were the means by which we apprehend or “know” the world. This “new” schema was a metaphorical Copernican revolution: it turned the old idea that our perceptual faculties passively receive the external world mostly as it is (the common-sense view) on its head. In its place came the idea that our mind’s categories actively create and organize our perceptions, and that baked into those categories are limits on how we can “see” the world, because that is the only way we can understand anything.

     Kant worked out the skeptical implications of this project in some detail, but he held that the degree of coherence of our collective experience allowed for a practical “knowledge” of this world as it appears to us. Even though we could not compare those appearances to the things in themselves without begging the question (assuming what was to be proved), we could still treat this as a sort of “knowledge.” Nonetheless, compared to Descartes’ vision of what counted as knowledge, this was an obviously deflationary view, and it persists.

     To be clear, let us repeat: the instrumental nature of the “knowledge” Kant’s analysis produced, with its profound agnosticism about the objective nature of the things we experience, marked a radical departure from a critical common-sense knowledge of the nature of the external world. The move was to a view whose gravitas depended on our common experience, which fortunately in practice had a great deal of coherence. It was a constructive (non-realist), instrumental form of “knowing.”

     Nonetheless, it is notoriously difficult on Kant’s analysis to say whether the coherence we find among people’s experience is due to our common “wiring” or programming (presumably a result of either guided or unguided evolution) through which we access the world, or to the coherence of the external world in itself. The latter is typically presumed. However, the presence of enough coherence, whatever the underlying reason, combined with the scientific success of this instrumental knowing, allowed for the acceptance of a workable but, as we said earlier, deflationary solution to the problem of “knowledge.” Implicit in this is the sense that it enabled us to predict to a very precise degree where planets would show up in the future, allowed us to make useful gadgets, and the like.

     It is worth reminding ourselves that, as a matter of historical fact, the basic Kantian linguistic legacy, that we can know the phenomena (the thing to me) but not the noumena (the thing in itself), remains deeply embedded today in the way we speak and think about what constitutes “reason.” Again, not to put too fine a point on it, this is explicitly a philosophically non-realist (or anti-realist) view.

     That was what some thought Kant’s view led to, but there are also quite a few latter-day dissenters to this interpretation of the Kantian picture. That is, they object to holding that the extent of coherence that science gives us, along with the ability to predict things to a very high degree of accuracy, provides sufficient justification for thinking that our knowledge of the phenomenal world is closing in on the truth--the truth of the noumenal world. 

     Despite this ubiquitous intuition shared by many bench scientists, the received skeptical philosophical tradition about Kant’s approach instead concludes that the coherence found among many of our experiences provides insufficient justification for concluding that it is knowledge of the real world. Rather, it holds that it merely provides justification for calling this an instrumental “workability,” not knowledge. The key problem cited for this conclusion is that you would need faculties capable of knowing reality before you construct the appearances you do; otherwise you could not compare your ideas to Reality to see if you are closing in on it. However, why think that unguided evolution provided us with that? What the unguided legacy seemed to provide were the sensory and cognitive faculties that instrumentally increased one’s chances of getting one’s DNA into the next generation. Adding to the linguistic confusion, the term “useful” is often swapped out for the term “truth”; something that is exposed when we see the former Stanford philosopher Richard Rorty implicitly and infamously making the swap: “…truth is what your contemporaries will let you get away with.”

     Again, this is certainly not a Cartesian vision of deductive knowledge derived from still more basic indubitables, but rather a vision of fallible and correctable perceptual knowledge that is not necessarily giving us knowledge of the underlying real world. Here also are the roots of a coherence theory of knowledge. Its claim to fame was that its coherence could be improved upon when inconsistencies were identified within theories of our experience, either by falsifying the old theory or by proposing newer, more coherent theories and then verifying them by repeatable empirical experiments.

     So, herein lies part of the problem of parsing out a contemporary understanding of the meaning of the terms “faith” and “reason.” There is a somewhat tortuous and subtle narrative we have to navigate in order to understand how each has come to mean what it does today. The past common-sense version of humanity’s understanding of reality was thought to have been undermined by a bootstrapping problem: common sense (a sort of faith) could not prove its first principles, and therefore it was understood as subjective and a matter of opinion.

     So, that iteration of explanation was replaced by an instrumental understanding of our experience, one that produced what were thought to be ever more coherent and rigorous explanations of our common experience. Old understandings (or paradigms) were in principle supplanted by newer, better paradigms; but an important question still looms: can this new understanding prove its own first principles by means of sound arguments? It would be ironic indeed if this new strategy impaled itself as well.

     Nonetheless, despite the consequences, could it pull this off? How does one step outside one’s framework of ideas to provide its justification? Would not this paradigm of practical knowledge suffer from some sort of “incompleteness problem” related to the one Gödel saw in the foundations of mathematical systems?

     We and many others suspect that it too suffers a bootstrapping problem, and what is a secularist to do about that? We think that instead of facing the problem head on, the secularist attempts to side-step the issue: it is argued that this instrumental understanding of our experience is not to be understood as a system that needs provable grounding. But then, we assert, it still follows that even this newer version of practical, workable knowledge is itself either ungrounded or “grounded” in the subjective.

     At bottom, this explanation of instrumental knowledge is still a subjective understanding of our experience that does not necessarily lead us to Truth; we cannot even be sure it is closing in on it. Instead, what it leads us to is the most coherent understanding of our experience. Unfortunately, coherence is a necessary but not sufficient condition for knowing. This kind of subjectivism is in a relevantly similar position to the epistemic positions of which it was critical.

     Yet there is something in the language of how this was expressed that seemed to treat similar things differently. Many scientists, and sometimes philosophers of science, talk as if they have objective truth by means of this instrumental scientific method. C.S. Lewis pointed to what we consider to be this problem in his book Miracles:

“…If the Naturalists do not claim to know any truths, ought they not to have warned us rather earlier of the fact? For really from all the books they have written in which the behaviour of the remotest nebula, the shyest photon and the most prehistoric man are described, one would have got the idea that they were claiming to give a true account of real things. The fact surely is they nearly always are claiming to do so. The claim is surrendered only when the question discussed in this chapter is pressed; and when the crisis is over the claim is tacitly resumed…"

     What we see is a linguistic tendency to keep on taking the systematic, practical study of the empirical world seriously, as if it were knowledge of real things, with all of its phenomenal warts. But how could this go on? Science, in the persons of its practitioners who were in the know, was quietly giving up the claim to knowledge of the external world as it is in itself and was now restricting itself to a knowledge of appearances, though they often did not sound like that was what they were doing. Surely this expedient explanation for the tacit resumption of a linguistic habit is an important thing to see and understand; but the clarification of the problem seems better understood in terms of the sociology of ideas than in terms of the analysis of ideas.

     What you have to like about the way this move was made is that, in making it, its supporters were still, for some curious reason, able to hold on to the old authoritative cachet and privileges. The way this is put is that current scientific theory is authoritative until it is overthrown by a more substantiated theory or paradigm; the new theory then inherits the language and garb of the new provisional authority. Despite the subtle but underlying deflationary move implicit here (the move restricts the meaning of the term “authority” to its provisional nature), it was still taken seriously enough not to undermine the normal use of the term “truth,” which was carried along…despite the switcheroo.

     Instead of concern about the kind of skepticism that provisional authority engenders in scientific endeavors, its implications seem to have been somehow overlooked, even though its authoritative reign was always temporary. Again, this understanding became a main constituent of what we now term “reason.” That is, empirical results had authority enough to still exclude (and we think unjustifiably exclude) morality and religion. Why were they excluded? Because they were thought not to be derived from any empirical experience--the critical line of demarcation--and thus, in secularist eyes, religion and morality still had only an appearance of truth and authority.

     The historical trajectory of this way of thinking led, in the early 20th century, to the attempt to identify a criterion of meaningfulness that would put what were considered the empirically unverifiable claims of common and religious language where they belonged—committed to the flames, or at least the dust bin of ideas. Thus, from the seminal ideas of Hume and Kant, picked up by the Vienna Circle of philosophers and perhaps best expressed by the 20th century philosopher A.J. Ayer, emerged a position that has come to be known as logical positivism.

     In a more formal way, knowledge, and especially knowledge language, was to be restricted in principle to the empirically verifiable and its relations. The short story of the demise of this move is that the dreaded principle of verification, when clarified, failed to pass its own test for meaningfulness and was nonsense on its own terms: the claim that only empirically verifiable statements (and tautologies) are meaningful is itself neither empirically verifiable nor a tautology. It was unhappily hoist with its own petard. We think the collapse of the logical positivist position in the world of ideas by mid-century is the most important philosophical development of the 20th century.

     We think that this ill-fated “move” to embrace positivism (whether implicitly or explicitly accepted in academe) was a philosophical mistake of the first order, and it had enormous negative effects on the development of the university’s intellectual ambiance that linger to this day. While we think full-on positivism has been defeated philosophically, its spirit somewhat surprisingly survives in pockets of some scientific circles, that is, until it is confronted.

     However, another kind of defensive move, one we hinted at, was in the offing: to use those philosophical terms not to define “reason” but to define what science is. That is, science was to be understood in terms of the “in principle” empirically verifiable rule, but the rule would not more broadly define what is rational and reasonable. What is subtle here is that despite the pivot, the linguistic tradition of talking about truth in correspondence terms, as scientists sometimes do, helps confuse the conversation and conflate how the term is used from that point onwards. How could this happen? Well, you could speak as if science just is the paradigm of knowledge, the best and only way of knowing, and if that gets by, then it is a short step to thinking that science per se, or the deliverances of science, just is what reason amounts to.

     Sociological situations like this evolution in the meaning of the terms still encouraged a sort of stealth logical positivism to persist, on which only what empirical science has to say qualifies as being capable of being true or false; whatever is left over is still literally nonsense. It also encouraged a hubristic “scientism,” on which the scientific method of knowing is alleged to have universal application to all domains of knowing. We hold that both of these moves are demonstrably self-defeating (self-referentially incoherent) conclusions, and Christian apologists need to master the deconstructive arguments and clarifications needed to defeat these modes of its continuing influence. We also hold that in critiquing this we are not anti-science; rather, we disapprove of some of the exclusivist claims made by some philosophers and some scientists about the reach of science.

Part C - Methodological Conventions, Naturalizing Tendencies and Justification for Metaphysics

     Another serious part of this “modern” legacy of the “thinking for ourselves” attitude, which declined to take the existence of God (or metaphysics) into account in our knowing, was that the methodology of science—methodological naturalism—lent itself to a “naturalizing” tendency in our explanations of the world. That way of thinking produced a culture that eventually thought we have no need of, and left no place for, the metaphysics of theism, or indeed for metaphysics at all. We were all limited in what we could talk about and know, it was thought, to practical, fallible knowledge based on our experience of the world.

     That is, whether one thinks as a metaphysical naturalist or strictly uses the convention of methodological naturalism, the result of that restriction (whether conventional or metaphysical) is to focus our attention on what must be, by rule, the only possible explanation for states of affairs: other empirical causes. Baked into this view is that there is no room for believing any claims of theistic activity in the world, and no room for theistic explanation as the cause of the world. Religion and morality were still pseudo-science.

     There is only one problem: to pull that off and try to stick to principle, one would have to abandon the strict rule of counting only sound and provable arguments as justification for one’s position. But, as we have already pointed out, that kind of justification was a bridge too far for them to achieve. Sadly for them, the alternative of abandoning that rule leaves an unwanted backdoor open for theism to raise its ugly head again. Put another way, if science, even in its humbler form, is given a pass on its inability to justify all of its foundations, then why not morality and religion? What’s sauce for the goose is sauce for the gander.

     We do, however, think that some if not many of the naturalizing tendencies are justifiable and have had good research consequences—for instance, we check maps for directions and see doctors for cures to illnesses. However, we think this naturalizing tendency can go, and at times has gone, too far. Despite Hume and his followers, there is no definitive reason to think that all causes in nature are natural; and there are good reasons to think it is logically possible for things (actually, persons like God) to have causal effects in the world. Indeed, we further think there is room for non-natural causes, that is, the non-physical causes of human agents.

     What that amounts to saying is that there are limitations to scientific investigation that should be interrogated on a case-by-case basis. And again, remember that since science cannot “prove” its foundational support, it does not itself escape a circle or an infinite regress of justification. Perhaps one should abandon the idea that it, and it alone, has greater footing than any other way of knowing.

     Is Hume's skepticism of rationalism and empiricism defensible? Strictly speaking, within a metaphysical naturalist worldview we think there are serious epistemic defeaters (indeed, undefeated defeaters) for trusting our cognitive faculties. When by your own lights you cannot trust your faculties, you are in trouble of a self-referentially incoherent kind. Theism, on the other hand, has the metaphysical resources to overcome this doubt. This is a distinction with a difference: theism has the metaphysical resources to support the conclusion that science can be closing in on the truth, while the secular approach is stuck with a practical knowledge it has no metaphysical resources to secure.

     For example, as Alvin Plantinga has argued, if metaphysical naturalism is held in conjunction with unguided evolution, there are good reasons to be skeptical that our cognitive faculties are reliable, that is, that they form mostly true beliefs. That is because unguided evolution, if it is "aimed" at all, is aimed at survival and at getting our genetic material into the next generation, not necessarily at forming true beliefs. Unfortunately for the metaphysical naturalist, many false beliefs are adaptive and would still allow us to be genetically successful; this should at least lead the secularist to doubt that her faculties are producing mostly true beliefs. That is a serious epistemic problem for your colleagues who are metaphysical naturalists. And if they retreat to a more defensible pragmatic approach, we should make sure they do not talk as if what they have reached is metaphysical Truth.

     In summary, we think: 1) methodological naturalism is a convention whose results do not and cannot confirm metaphysical naturalism, even if a large number of intelligent people fail to see why and incorrectly think they do; 2) logical positivism still flourishes in the academy except where it is exposed, so it is important for Christian apologists to understand the roots and shoots of positivism and to make use of decisive de jure arguments wherever this approach reappears; 3) scientism flourishes in the academy largely because of the "success" of science in the domain of empirical things, yet some wish to extrapolate that success to other domains for which the methodology is not properly fitted, and that, too, needs to be exposed. Finally, we think Christian apologists should also engage in uncovering "bad" science where it occurs—for example, broad generalizations from too little data, the non-repeatability of some experimental results, and the like.

Part D – The Humanities’ Critique of Christianity

     What we have discussed above is mainly how the development of the received tradition in the sciences led to the general impression that Christianity—Christian theism, and theism in general—is somehow intellectually, metaphysically and epistemically subpar, and thus below the standards of any minimally self-respecting academic institution or discipline. One consequence is that it became a secular project, even a duty, to expose whatever influence Christian theism had on academic thinking and to do away with it.

     One could ask: where would that influence emerge in the humanities? First, it is alleged that the basic ideas of Christian theism come into conflict with academic sensibilities in the humanities, not because theism holds that there is an empirical world independent of minds (as it typically does), but because theism does not limit itself to reporting, exploring and explaining the empirical and natural world and its properties. More recently, Christianity's exclusivistic claims are cited as conflicting with human rights and social justice. These accusations gain ready traction in the humanities and are typically associated with a ubiquitous "hermeneutic of suspicion" reserved for religious perspectives.

     However, theists, and Christian theists in particular, hold not only that there is something transcendent to the physical world, but that this something (really Someone) has the power to create, sustain and, if He so desires, "interfere" with the physical world and its processes. This Someone is also thought to have an essential nature such that there are objective moral requirements and obligations for human beings, flowing from that Being's nature and from our being created in His image.

     The humanities have in various ways made use of the skepticism engendered by the Humean/Kantian critique—that we do not know the noumena, only the phenomena—and concluded that we should retain a skepticism about the existence of God, angels and all else their existence implies. One thing many do believe about the Humean/Kantian critique is that it supports the view that "man is the measure of all things." This autonomy and subjectivism are so rife in the humanities that it is fashionable today for part of the field to think of itself as postmodern (PM). That is, many prominent thinkers in the humanities regard their domains as free from the antiquated constraints of a quest for ahistorical truths of essences, and especially from the truth claims of religion.

     The most skeptical brands of postmodernism hold that science itself is not immune to this skepticism; that is, it is not without the subjective influence of one's culture and time, and thus its results are not ahistorical or free of prejudice.[iv] This epistemic skepticism, however, is sometimes extended to a metaphysical skepticism which holds that we not only construct words and symbols in an attempt to represent the world, but that our words and symbols actually construct the entirety of reality.

     This latter view, we think, is a much more virulent postmodern strain, and it is not sustainable if it claims (as we think it does) that the PM analysis is not itself biased, not historically conditioned, and not itself an attempt to grab power. Believing as they do that there is no Truth, its adherents seek to gain and guard power in order to shape the world into their own image, into what they like and value. Not to put too fine a point on it, postmodernism in many of its forms is rife with reliance on propaganda as a means to its ends, and is perhaps best understood in terms of a philosophy of the will to power, which is itself nothing new.


     In summary, it is important to see that these developments in the history of ideas led to increasing skepticism about what we could properly claim to know. Hume thought that "faith" claims should be committed to the flames, and that view seemed to carry the day in academe in the late nineteenth and early twentieth centuries. What was developed and preferred instead was a pragmatic and deflationary way of "knowing" through coherent theories of our common experience; nonetheless, scientists often sounded as if they had the Truth, whereas religion—the Christian faith—was still to be thought of as nonsense.

     Thus, the non-empirical or non-natural areas of academe—for instance, ethics, value judgments and parts of religion—were thought to be without any grounding and to amount to merely subjective prejudices or opinions. This bifurcation led the nineteenth-century intelligentsia, and eventually the universities of the time, to want to distance themselves from religious and moral claims to knowledge. That development would eventually produce a real crisis of authority for the Christian faith in its relation to the academic disciplines. Additionally, the drumbeat of new empirical findings that seemed to contradict and undermine the credibility of a literal reading of Scripture—especially in Genesis—created the sense that science and religion were at war, and religion, to understate things a bit, was not seen as doing very well.

     What is often overlooked, and what is incomplete about this narrative, is that the skepticism of moral and religious propositions that emerged during the Enlightenment (the proximate root of logical positivism) was itself based on a self-referentially incoherent restrictive principle. That this was the case did not become obvious until the rise of logical positivism in the early part of the twentieth century and its subsequent demise by mid-century. As bad, or perhaps worse, for secularism's hegemonic prospects was the price it willingly paid in characterizing Truth, science and epistemology in deflationary terms. On this construal of science, we were no longer dealing with, or closing in on, Reality in itself, but only with commonly experienced subjective phenomena—an approach that worked instrumentally but still could not rationally claim a purchase on the Truth.

      The consequence of that move weakens science's claim of authoritative dominance in all the areas to which it saw itself speaking. Nonetheless, defeated as that project was, it seemed to survive in a rhetorical sense on the lips of many dismissive educators in the sciences and humanities. The record needs to be set straight, and we need to take up the newer challenges as well, that is, the newer claims that secularism is not only in touch with objective morality but has the resources to justify that claim.


     As apologists for Christian theism, our job is to continue to parse out the details of this narrative and the implications of this state of affairs in the history of ideas, providing nuance where it is justified. Do the results of science provide decisive support for metaphysical naturalism? Again, we think not, because coherence is a necessary but not a sufficient condition for metaphysical truth.

[i] Naugle, David K. Worldview: The History of a Concept. Grand Rapids, MI: William B. Eerdmans Publishing, 2002.

[ii] The thought leaders of the Renaissance were for the most part rejecting the Medieval consensus by looking back to classical Greek and Roman literature for wisdom and ideas; by contrast, the thought leaders of the Enlightenment wished to reject not only church authority but also ancient authority, including much of the Renaissance's reverence for the classics, urging intellectuals simply to think for themselves.

[iii] Many philosophers of this period deserve our attention, but three whose influence deserves to be highlighted most are René Descartes, David Hume and Immanuel Kant.

[iv] This deserves explication, though the details go beyond the aims of this essay; this concern about bias in science has been raised by feminists and other minority voices.

This non-technical essay was written for a popular but educated audience and was published in August of 2023; it has been insubstantially revised several times for the purpose of greater clarity.


     Our next article in this series on Faith and Reason will take up where we left off. In this article we looked at the proximate roots and foundations of how the terms "faith" and "reason" came to be understood in contemporary academe. We concluded that the term "truth" in science is not the same as Truth, and that the implications of this are not often made carefully explicit. We think the deflationary sense of the term "truth" is seldom clarified, nor are its consequences adequately explored when it is used in theory making. At the end of this discussion, we saw that the attempt to confer on the natural sciences, and on them alone, the mantle of "reason" by means of positivist criteria ran into serious problems by the middle of the last century. But other formative streams of thought at the end of the nineteenth and beginning of the twentieth centuries were on hand to attempt to fill the void left by the spectacular failure of logical positivism. In our next article we will discuss other serious attempts to provide a way out for secular academics.