Remarks on Empirical vs Ex Cathedra Solutions to Enduring Problems

“Facts and their properties” has occupied my thinking for much of the past year. I suggested earlier that the current term *fact* needs to be supplemented by terms which express the idea that many matters once considered factual have lost their credentials and have since been dishonourably discharged.

A famous example of this was the discovery (c. 1886) that eels possess sexual organs, i.e., that they reproduce themselves in a “normal” manner by sexual coupling, a discovery consonant with Darwin’s view that fish reproduced sexually and were not worms reproducing through spontaneous generation, the view advocated by Aristotle (c. 330 BC), two thousand years earlier. This 19th-century discovery meant that statements which supported “spontaneous generation” as a mechanism for generating new life-forms were weakened to the point of extinction; such statements therefore joined the ranks of “factoids”, as part of dead science. In short, spontaneous generation was no longer an option which could be summoned to account for the emergence of new species.

Here then is a model for the transition of statements which describe the world in empirically false terms and how, at some later stage, such statements are replaced by a new body of statements, by new knowledge. It is a complicated process which often meets fierce resistance, that is, resistance by those who have been entrusted by their fellow citizens to guard our hoard of “knowledge”, like the Nibelungen’s guardian dragons protecting the golden treasure of the Rhine. Old treasured tales do not pass gently into the night but fiercely resist attempts to demystify them. It is not simply that some newer theory is found to be correct, but that finally someone convinced others of the error of their ways, and the old theory was discovered to be faulty, perhaps in several respects. Spontaneous generation, the pre-Darwinian theory used to account for new species, was a plausible theory at the time these matters were first discussed, but in the end it was deemed inadequate and was therefore rejected by the scientific community as a whole, and by those we had entrusted with the safekeeping of our knowledge.

In summary, the challenge faced by biologists at the turn of the 19th century was to discover evidence which would suffice to continue support for an older theory of speciation (as sanctified by Aristotle two thousand years earlier), or to find evidence which contradicted the theory proposed by Darwin and others during the first half of the 19th century: that speciation was an ongoing (contemporary) process powered by a combination of mutations in cells (about which very little was known at the time) and the adaptation of such mutants to their ecological niche. These were different but complementary tasks: (1) find supporting evidence for two conflicting positions about speciation and/or (2) find evidence which contradicts one, or both, of the theories proposed to explain the large variety of species observed and the source of their often small inter-species differences. These matters were first debated when both microbiology and especially cell biology were in their infancy, still half a century away from the great breakthroughs of the late 1950s. The initial problems were set by conflicting theories which had been formulated when knowledge about these matters was sketchy and mostly conjectural. Historically, this was a case of problems being approached and resolved, step by step, often secretively, through empirical investigations.

But the history of our knowledge about the world also records many cases where solutions were adopted ex cathedra, that is, by declaring a solution to a problem which was based primarily on arguments from broadly defined first principles. If there were public disagreements about these, they concerned how well the deductions had been derived from the assumptions adopted. These first principles, as they came to be known, referred to assumptions which were not themselves directly challenged, but which were assumed to depict and reflect an existing state of affairs, on the grounds that they were self-evident to the theorist (the person who mattered) or because they appeared to be the best (as in the most rational) available under the prevailing circumstances to the writer and his friends.

The most persuasive cases cited from the past of solutions reached in this manner were the proofs of Euclidean geometry. These proofs had been available to the educated elites of successive historical periods, who assumed that space is best represented by a flat two-dimensional surface. Thus, all the conclusions reached by Euclid and his many successors over the next 2,000 years were deemed to hold when applied to what is basically a “flat earth” model of the earth; however, such conclusions did not hold for spaces which were concave or convex, i.e., did not hold for the surfaces of globes. The assumptions that the earth is flat, that the earth is stationary, that celestial objects move relative to the earth, that the movements of celestial bodies are uninfluenced by their proximity to the earth, that light and sound travel through a medium, and specifically that light travels in a straight trajectory, etc., were not questioned until the end of the 19th century. When these older assumptions were challenged and exposed to experimental investigation, the change in approach also marked the end of solutions to problems reached by the purely deductive approach.

Of course, deductions from first principles remained valid when done strictly according to a priori rules of logic, but the deductions themselves could not answer questions about what composed the universe to start with, or how things worked during “post-creation” periods! Such questions demanded that any claim about a state of affairs be independently demonstrated, that there be a correspondence between a state of affairs as perceived and what was being asserted about it. Under some conditions the meeting of two points in space does not make “sense” and therefore needed to be viewed as an “impossible act”!

Once it was accepted that empirical investigations could reveal new facts, the door was opened to the (dangerous?) idea that old, existing facts could be tarnished, even faulted, perhaps that new discoveries could be superior to old facts. To which old facts? All, or only some? Those facts declared to be so were the ones supported by the first layer of assumptions made. It was a dangerous idea.

The history of comets is a case in point. Comets had been reported for thousands of years by both Eastern and Western sky-watchers, but were thought to be aberrations from a pre-ordained order of things which portended unusual events, like the births and deaths of prominent people (e.g. Caesar’s death, Macbeth’s kingship, Caliban’s fate — Shakespeare was, as is well known, well-versed in the Occult, as were many in his audience). But where did comets come from, and how did they travel through the (layered) sky? What propelled them, and for what goodly reasons did they come menacingly across the sky? It required special agents to interpret such rare public events. These agents, moreover, were thought to be messengers from the gods, inspired seers who were believed (before the advent of Christianity) to be equipped to find answers to especially difficult questions! Thus if one assumed (as was common for thousands of years) that celestial bodies travelled around the earth on fixed translucent platforms – perhaps on impenetrable crystalline discs, each of which was “nailed” permanently to an opaque or translucent wall in the sky – this belief opened a number of possibilities for answering previously unanswered questions. (We create worlds for ourselves which make it possible to answer questions which bother us!)

There were other assumptions involved, as for example the assumption that whoever created the world (the great mover, as assumed by some early Greek philosophers) must also have created everything perceivable in accordance with a perfect plan which employed perfect forms, e.g., perfect geometric forms and patterns. Such assumptions had to be jettisoned before one could consider alternatives which dispensed with the notions (a) that perfect forms existed ab initio, even before the world did, or (b) that anything imperfect necessarily referred to an illusion, a distortion, an aberration, and was therefore itself unnatural! Comets, according to ancient astronomers, priests and others, were not to be viewed as natural phenomena but as unnatural aberrations, beyond what super-intelligent entities, a species which could intervene in the normal, divine order of things, would do! Thus our ancestors provided for the possibility that a construction could arise which was built by following imperfect rules of construction and which could, by virtue of this, also implode unexpectedly!

The last paragraph illustrates graphically what I have tagged as ex-cathedra procedures, and contrasts them with a naturalistic philosophy based on the assumption that knowledge attained by empirical discovery is inherently superior to knowledge derived or deduced from first principles. The issue is to justify why either choice would be superior in effect to its alternatives: it has remained a casus belli between different factions of metaphysicians for two and a half thousand years – perhaps even longer. We have of course few records, if any, which would support either position wholly or fully. This may perhaps change in future as we increasingly and assiduously store all earlier findings and speculations, regardless of how well supported, and before we undertake the awesome task of assessing each on the strength of its merits. There are, as far as we know, no ultimate judges who can undertake this task and also assume responsibility for any final recommendations they may make!

The Fictions We Create 4: Our Inside and the Rapidly Expanding Outside

We have come to accept that the inside world is large, although ordinary people do not have a very large vocabulary with which to report “inside” experiences. They cover this by saying “I think…”, “I feel…”, “I sense that…”. In other words, most people tend to leave the description of their “feelings” to our poets or songwriters and musicians!

We are, in fact, demonstrably more adept at describing the outside world — our common world. Some say that this is so because we live in a materialistic culture which is primarily focussed on the world around us. (Thus culture plays a “shaping” role).

However, both “domains” — the inside and the outside — are currently expanding rapidly, the latter at a faster rate than the former.* What do we do to meet our needs to express and refer to these changes? We create additional (new) terms, words and expressions which mark and label different events. Here the term creation has four references:

(a) Inventing new sounds (or symbols which substitute for sounds) to be used routinely in an existing language. The new sound — a complex event — is then assigned an “official meaning” either by fiat or later by including it in a current (on-line?) dictionary (which is a relatively new invention! — see the footnote** below);

(b) We borrow already existing words from a foreign language (e.g. Greek ἰδέα idea “form, pattern,” from the root of ἰδεῖν idein, “to see”; Oxford English Dictionary 2014) but import into the host-language only one of its several meanings from its original, its donor-language (see earlier articles on neolidesm);

(c) We transform an existing word which has been selected from within our home-language and “quietly” assign an additional meaning to it by “analogy”, i.e. by referring to the likeness/similarity between it and its new “reference”. In an earlier blog I have called this an “analogical spread”;

(d) We adapt an existing word in our home-language by changing its context of use. Using this method we assign new meanings to many old, well-worn words. It is sometimes the origin of what are now referred to as slang words, though not their only source; I previously referred to this as a special case of neolidesm. (See ** footnote below.)

* I have some reservations about this statement. One could argue that much of the “inside” world gets expressed in the expressive modes of popular culture, including its songs, dances and arts.

** A brief statement about dictionaries taken mostly from Wikipedia:

Dictionaries go back several thousand years, but the first monolingual dictionary written in Europe was a Spanish one, Sebastián de Covarrubias’ Tesoro de la lengua castellana o española, published in 1611 in Madrid, Spain.

Several attempts were made to produce a reference book to serve the current needs of English speakers, but it was not until Samuel Johnson’s A Dictionary of the English Language (1755) that a reliable English dictionary was produced. By this point dictionaries had evolved to feature textual references for most words, with entries arranged alphabetically rather than by topic (a previously popular form of arrangement, which meant that all animals would be grouped together, etc.). Johnson’s masterwork was the first to bring all these elements together, thereby creating the first dictionary to be published in “modern” form.

The Fictions We Create 3: Describing the Inside and Outside

Common sense distinguishes between objects that stand on the “outside” of a room or enclosure, and those located “within” a room, or a box, or a carton, or an awkwardly configured shell/enclosure. Thus objects are invariably assumed to occupy space. Amongst their many diverse properties is that objects are located somewhere, in some place which can itself be described and conceptualized, as in “a vessel 20 leagues under the sea”! To be in space assumes therefore that one is located “within” or “outside” an enclosure.

Thinking outside the box

We speak routinely about “locations” but — as we shall see — this is speaking figuratively and metaphorically. By contrast, when we discuss our feelings about an issue — e.g. about a neighbour who has just won a large prize in a lottery, or the person who called the police to report an ongoing burglary of his home, or about the doctor’s report that someone known to us has contracted HIV — we not only report the bare facts of a case, but also refer to how these events affected or influenced us as individuals — specifically, how we feel, or retrospectively felt, about what happened on a particular occasion.

Of course, we usually or normally do not receive two reports, one of a particular event and the other of our feelings about it, although on many occasions we get these “mixed up”. Thus we may say, “I’m sorry that I appear all excited and wound-up but the following happened as I walked to the Forum — Caesar was killed by Brutus, and by a host of others!”

So humans, as a rule, report not only what they have experienced but also how this experience may have affected them. In our culture we are also trained to be clear that reports and narratives can be both about “objective events” and about how these events influenced us “personally”! We learn therefore how to present “what has occurred out there, somewhere” and “how we (I) feel about a matter” in a clear, possibly concise, manner. The common distinction is therefore between (a) “objective” events and (b) events which “impact” our feelings and our personal reactions to such events.

We refer to these latter as “subjective reports” — and in this manner ensure that these are excluded from being “objective” or “scientific”! We assign to such events a special status. But there are exceptions, as when a person is totally focussed on events taking place “outside”, even when these are far in the past.

It would be a gross error of judgement to describe such events in “personal” terms, like the eruption of Mount Vesuvius in 79 AD which destroyed the Roman city of Pompeii. Not quite! It would be inappropriate for a historian or archeologist to do so, but not for a writer of fiction, a novelist like Lord Lytton, or even a stray eye-witness. (Were there any surviving witnesses?) In other words, we ourselves choose whether to treat an event objectively or in a personalized manner. In summary, there are times when we have the opportunity and the “right” to choose between these two options: whether to see and present ourselves to others as impartial witnesses, or as involved participants!

The Fictions We Create 1: Knowledge by Definition and Conception

Herewith four articles which treat an aspect of a theme launched centuries ago, when it was claimed that if we managed somehow to integrate all the knowledge about things we have collected in the past — and which we continue to collect even now — we would attain a comprehensive, synoptic view of our universe, of our world. The assumption made was that we live in a finite world, so that all that could be known about this world would become known!

This is a deceptive over-assessment of what can be done with such a “collection”, assuming of course that we had it! An opposing insight is that the task of getting to know everything is not possible because such a collection includes much that is at present contradictory. Much of this material cannot be reconciled. For example, there are likely to be theories in the collection which are plainly incorrect, which yield contradictions and therefore should be viewed as “fictions” — not as facts — and these cannot be part of a “totality of facts”. A “totality” assumes, I believe, that what has been collected also includes incompatible items. Anything contradictory would need to be excluded.

The Fictions We Create 1: Knowledge by Definition and Conception

We have a way of stating that something is “true”, or “is the case”, by definition, by convention, which avoids the suggestion that it is empirically true. But we lack a method for indicating that something is the case by virtue of our description of it.

How does one inform another person what “the Jabberwock” refers to? No real definition of this creature appears in the poem in Alice through the Looking Glass. However, we have an image of it from an illustration in the book; that is, we have admittedly only a vague concept of it, which we share with all other lovers of the Alice books and their fabulous, unique menagerie of characters.

I here suggest that we adopt the convention “x=cpt” to indicate that something, x, is the case by virtue of its conception. Correspondingly, we would use “x=df” to indicate that it is true by definition or convention.

To refer to someone as “Pickwickian” also illustrates this point. Mr. Pickwick is a character described in loving detail by Charles Dickens. He has since become a universal image for a certain kind of 19th-century person of the English middle class. He cannot be described well or adequately by a single sentence, or even by a small set of sentences, but in a sense he “emerges” through acquaintance with the many descriptions given of him throughout the novel bearing his name. He has become what Freudians called an “imago”, a prototype based strongly on unconscious factors.

This, then, is what is meant by having knowledge by conception, for which I have now suggested the expression, “x=cpt”.

Thoughts on Language Expansion

Jeff Berg’s comments on my article The Factoid: A Revised Interpretation were most helpful. They raised one issue I wish to discuss further: what aspect of the meaning of a term borrowed from one language and introduced into another is actually transferred? To make things easier, may I suggest that we call the language from which a word or a phrase is borrowed the DONOR-language and the language into which it is introduced the HOST-language.

Clearly whenever a language hosts or imports words and phrases, that language grows, regardless of whether its new terms are derived from a foreign language or whether these are “home made”, that is, invented by its current speakers! Modern American English is full of such “home-grown” words and phrases. The word *rap*, for example, has several meanings, including the act of talking or discussing, freely, openly, or volubly (as suggested in Wikipedia). It is also related to the idea of establishing rapport, that is, developing a sense of fellowship among speakers and members of a group. These group members will then determine which of several meanings apply! It is a case where the meaning of a term is strongly influenced by its context, not only by its dictionary definition(s). In short, the meaning of a word is usually — perhaps most often — multi-determined. Thus, to search for a unique meaning is, generally speaking, foolhardy.

Yet there are exceptions to this rule. The major exceptions arise whenever a word has been deliberately imported from a donor- to a host-language. The import is commonly done by an individual agent — a person who usually remains anonymous. The agent may believe that the home-language is deficient, currently short of a single word to express a unique and important idea, or that the home-language is already so muddled that a single word, given enough publicity or expression, may be just enough to cure it of confusion, indeed may “cleanse” it. The importation of a word borrowed from elsewhere may therefore — it is thought — have a “therapeutic”, a curative, effect!

Jeff’s comments also forced me to re-examine my previous position in several ways, most importantly regarding how new words enter an existing language. Are new terms just “inventions”, added on the spur of the moment by a rogue interloper, or do such terms enter a language in more thoughtful ways? What actually happens? To answer this question requires much empirical research – and therefore lies beyond the mission of this blog. But clearly a process is involved which has been largely overlooked not only by philosophers, but also by others who profess an interest in the growth and modification of a language. Language remains, of course, our principal method for communicating with others. But it also seems that it may involve a very human disposition: a little pilfering or theft! “Seize from others what you fancy, or what you think you currently need!” Let me clarify this a little.

When we run short of words and expressions in our HOME-language we have several options open to rectify this deficiency, including drawing on other languages for help. In short, we can make up a perceived deficiency in our own language by unashamedly borrowing from another. Our own language then acts as “host” to entries from elsewhere. What is transferred by this act are not only the sounds typical of another language (which can be fun — try Xhosa!) but a very particular meaning that the borrowed word already possesses in its original language. However, the broader meaning of the borrowed term may get damaged or lost during the transfer from donor- to home-language, although this need not matter too much. Who cares? Whoever has borrowed would not necessarily know about the damage done, or the loss of meaning which resulted from the transfer from the donor-language to the host-language. Jeff, for example, cites the fact that the French *meuble*, i.e. furniture, implies mobility, whereas the opposite is true for the English term; in Italian, meanwhile, mobili means “furnishings” and immobili means property, therefore non-movement!

Let me summarize the above: a borrowed word from a donor-language enters a host-language carrying only one of its meanings. It therefore enters the host-language without any ambiguity whatsoever! Its meaning in the host-language is pristine, unambiguous. It carries only the meaning assigned to it at the time it was imported by whoever borrowed the term! The borrower called the shots, but he/she cannot control the future of the imported term. It has now become public property in the host language. It may even “wander off” by itself and, within a short time, acquire meanings other than those assigned by the original borrower. Since words are social objects and therefore subject to the vagaries of such objects, this is as expected! The borrower may have borrowed with very good intent, including a wish to combat existing multiple meanings in the host language, but the truth is that he/she has no further control over the situation.

Meanings wander off? Does this mean that one cannot permanently bind a meaning to a word or to an expression? If the answer is “one can indeed do so, one can tie a word to a meaning”, one would also have to state under what conditions this becomes possible. It does not appear to be a universal rule which applies to a living, and therefore developing, language. On the contrary, such languages appear to “grow”, a euphemism which suggests that changes will be largely unpredictable! It is a common but also troublesome phenomenon. It can however be given a distinctive name: I propose to call it an “analogical spread”.

The name given to this phenomenon covers the idea that many words which are currently uniquely distinctive, and therefore have a limited meaning, will nevertheless change over time. Indeed such words often extend their original limits quite extensively. The original term then comes to encompass more and more points of reference — a process which of course also increases its ambiguity, something often dreaded, except by poets and politicians. The term then assumes an analogical stance: it now says much more than was originally (earlier) intended. It also means that many important terms can no longer be carefully, or uniquely, defined. The term has now become a suitable candidate for a Thesaurus, where each meaning overtly depends on the context in which it is used. It is a case where what may be good for poetry may be bad for scientific or hyper-accurate communication!

Although the borrower of a word from a foreign language may have only noble aims, perhaps of eliminating or reducing confusion, he/she may not succeed in cleansing an existing language of its many ambiguities. Such an act of “cleansing” can however be encouraged by inventing new words (neologisms). These may be constructed in a variety of ways of which the most common method has been to use a person’s name as a label for a new object or product, e.g. *pasteurization*, or for an idea, e.g., *Freudian wish*.

In the second article of this series I will give a short guide to three terms which appear critical for the analysis given so far. I shall also suggest a fourth term, which covers new ground.

The Factoid: A Revised Interpretation

This article is a revision of an earlier version which was published in 2013.

The term *factoid* was coined by Norman Mailer to express the idea that many things we believe to be true — and therefore do not challenge — are products of endless, ill-intended and often vicious reiteration by some of our news media with the intent to mislead others. It is usually done for personal gain rather than out of the ignorance of journalists and editors. What is proclaimed to be true is too often found to be inversely related to truth. There may be a smidgen of truth hidden under the surface, enough to get by, but most of what is claimed is false. When factoids are promoted by higher authorities, like governments, or by special political interest groups or party-machines, such material is known as propaganda.

The underlying principle was already voiced and practiced with consummate and diabolical skill by Josef Goebbels, the infamous henchman of Hitler during the long nights of 1933-1945. His mantra can be expressed as follows:

“Repeat an untruth often enough and people will accept it as truth.”

Of course, it was not invented by Goebbels — but was adapted from the practices of religious organizations who controlled whatever media were available in their respective times. Such control was in force throughout Europe for nearly two thousand years, often with a little help from their skilled enforcers.

George Orwell also wrote extensively about this; how all of us easily become confused by willful deceit and how our language, our most powerful method of molding other people’s opinions and mind-sets, gets misused in the process (see both Animal Farm and 1984).

Both here and elsewhere I have assigned a different meaning to *factoid* than the one invented by Norman Mailer. I argued that we already have a term which covers the idea that endlessly reiterated assertions may come to be accepted by others as statements of truth even when there is no scintilla of truth in them: *propaganda*. The meaning I propose is not as pejorative as Mailer’s, but focuses on a historical feature of statements, namely that many facts become dated because they are overtaken by new discoveries and therefore no longer reflect the evidence they were meant to summarize. Sometimes the evidence supporting a claim is only, “People say that…”. However, it is often critical to know which people supported a particular claim when there is more than one claim, who subsequently contested it — and for what reasons they supported one rather than an alternative version.

For example, which toothpaste really, truly, reliably whitens teeth? I now propose that the new meaning of *factoid* implies that whatever evidence previously supported a specific claim has now become less compelling. This may be due to improved research, or because new remedies have been discovered since the earlier claim was made. We should always ask whether any claim can be improved upon, can be corrected, or whether it remains true in the face of new evidence and has finally withstood the test of time. To ask these questions is obligatory.

Many things, as we all now know, have failed tests over time, have not survived, for example:

  • that the earth is a stationary object in the firmament
  • that light always travels in a straight line
  • that women are inherently evil
  • that water must boil at 100 degrees Celsius
  • that the Jews killed Christian babies to use their blood in ceremonies or to bake unleavened bread
  • that the earth is flat
  • that hallucinogenic drugs enable one to see into the future
  • that language skills reflect innate intelligence

It seems as if there is a direct coupling between progress in science and technology (a distinction increasingly difficult to defend) and the abandonment of what were once thought to be indubitable verities!

The transition from a fact to a factoid refers specifically to those cases where something which had at first met all the criteria used to certify that an event was a fact subsequently failed new tests. It means that the former statement of fact has to be de-certified! It no longer meets critical criteria and has therefore transitioned from fact to factoid.

I therefore view a factoid as a fallen angel, a verity which has become a liability. It now lacks truth. Whenever we assert that a given statement is true, we should also add that this is most probably so only for the brief moment, and not for all eternity – not even for the foreseeable future! For a wise man the future is largely unpredictable, even unfathomable; for the ignorant person the future is an extension of the present. For such persons all earlier explanations of the present must also hold for the future.

To conclude: I have rehabilitated the term *factoid* by assigning it a meaning which helps us to understand our current world better, as part of a developing situation. A factoid is therefore a claim made about some feature of the world which had once been secure, was viewed as part of its unchanging furniture, steady as a rock, anchored in reality, etc., whereas the test of time has demonstrated the contrary: that it was transient, not a permanent phenomenon. Indeed, the notion of a permanent phenomenon is ambiguous, and is an idea which needs to be revisited and reappraised.

In the scheme of things as outlined here, facts are viewed as the latest kids on the block but they too will become fallen angels, and pass into our history, some unnoticed, some celebrated. Some may enter a hall of fame, even after they have become blemished: e.g., the claim that the earth is flat.

One understands the present inevitably in terms of the past, that is, one has to know about the critical, salient errors made in the past since, as far as I know, there is no error-less learning, no future science without a past science whose paths were studded with pot-holes and major diversions into the unknown. Humans may stumble but many find a path that leads to somewhere.

Dedicated to Mark Reczkiewicz in grateful thanks for reminding me that one should not take human attributes, like language, out of their social context, and that our history also becomes a part of our destiny.

Jabberwockians Part 3: J’s, Theists, and Causes

This is a continuation of Part 2, which can be read here.

The great majority of Western people usually speak as though, and imply that, there is only one deity. Many go as far as to claim that their “god” is also the cause or creator of everything that exists — and which will exist, of all creatures yet unborn and those that no longer are. Frankly, I don’t know how they know this, how they can be so sure in this matter, how they came to have this all-inclusive knowledge, or where and how they found the key to this treasure-trove.
A particular version of this deistic belief claims that their god not only programmed and planned the universe as we have come to know it so far, but that their deity also has plans to dismantle it at some time in the future. It is part of an apocalyptic vision — which is not shared by all deists — and is distinctly Jabberwockian, as previously defined. So it is argued that, as there was a beginning, perhaps a prelude to the show, there will also be an end-play, a Götterdämmerung. The “gods” will move out and find a better playing field, but before they do so they will curse our world. Something to look forward to, especially when “all those arms and legs and heads chopped off in a battle, shall join together in the latter day” (Henry V, Act IV, Scene 1) — a happy end-game for some and perdition for most others.

Those who deny the fantasy I have just described, and who claim that the world is as it is, that it may have had some sort of beginning and may well have some sort of grand finale, but that — in sum — all else is fantasy and delusion, are AJ’s! These non-believers are often reviled and called “a-theists” (the prefix “a-” being the Greek privative, “without”). Atheists, we are told, will face particularly severe punishments and retributions (perdition) from an otherwise benevolent deity in the afterworld, since their denial of a god is a particularly heinous crime which demands heavy retribution.

I will take my chances.

However, not all J’s are theists. It is perfectly possible to construct a vision of the world without having a vision of a maker or a first cause. Deists have to face an epistemic question: “What set of circumstances destroys a particular thesis, including that of deism?” The point about J’s is that they claim the right to populate the world with their creatures, whereas their opponents enforce rules of inclusion and exclusion and thereafter remain forever watchful that these rules are strictly followed by all players. AJ’s insist on spelling out rules for inclusion and exclusion, and on stating clearly under what conditions one is permitted to replace one position with another. Not all visions, according to this credo, have equal legitimacy.

What is the strong card in the deist’s pack? What set of circumstances, in their view, would undermine the deist position? The general public is growing in numbers as well as in its level of education/sophistication — and for this reason is probably more influenced by pragmatics than by first principles. But the general public has been tutored to follow rules of method, and therefore every now and then questions whether a viewpoint now touted was arrived at by following such “accepted rules”.

I suspect that we are more tuned into the idea that some positions are more valid than others, which means that when two viewpoints conflict there is a tendency to rally around the most favoured position until ultimately this position becomes a flag-bearer. Woe unto us! The recent debates between creationists and traditional evolution-oriented biologists, for example, could continue for some time yet, provided that the issues discussed have empirical entailments which can make a difference to both parties. Current debates between the parties do not appear to have this quality; the debates are too esoteric, too convoluted, to influence and sway public opinion or to change mind-sets at all. In that sense, the debates may be irrelevant.

However, there is a variety of beliefs which falls into its own class. I have called these Ajabberwockisms. These are beliefs that the Jabberwock and others, such as the pushmi-pullyu described in Dr. Dolittle, do not necessarily exist beyond the imagination. They are fictional characters, imaginary objects. Our imagination consists of conjectures about such objects, creatures and events, but the conjectures also have to be vindicated, and the possibility that these exist has to be independently established.

Not always an easy task. “If only I could see what bites me!” That which bites is sensed, but may not yet be visible. It has taken humans a few hundred thousand years to fulfill this wish. Do other living things pine away for lack of adequate devices for observing things? Probably not. However, the Jabberwockists have developed persuasive arguments to support and strengthen their claims. Furthermore they meet regularly at scientific conferences to reaffirm their beliefs, the causes which bind them — and their creed. Some are dressed to look like Druids, but others go about their daily business in normal attire, although they may carry a copy of Lewis Carroll’s great poem in their proverbial briefcases, or, in modern miniature editions, in their trouser pockets or handbags, together with their smart-phones and similar memory devices.

What are the distinguishing features of Jabberwocky beliefs? First and foremost that all questions can be answered — often conclusively answered — regardless of the nature of the question. If one wants or needs an explanation, the Jabberwockists have one well-prepared. Their answers take the simple form of asserting that in addition to standard replies to any question there is always the answer that a special being exists who already has the answer to any question posed.

So the art lies in the articulation of a question, not in finding its answer. When in doubt one has permission from the priests to invent a series of beings, called “causes”, which, when well selected, will provide an appropriate framework for a question.

The astute reader will recognize that this approach has a catch: one can always phrase a question in a manner which suggests its answer. The question is not necessarily directed to getting an unexpected reply. On the contrary: only those issues can be legitimately raised to which ready-made answers already exist. It is more like saying, “did you already know that such and such is the case?” Thus, one starts all enquiries by assuming that the answer is already known — but not handy!

The most famous example comes from our most famous and rightly exalted thinker, Socrates, who demonstrated that we already know what there is to know about basic principles. When these principles are applied we can generate all solutions.

It is easy to conclude that we know all we need to know. But the message had dreadful and alarming practical consequences, including that there is no basis for discovery, only for uncovering secrets. We learn how to do things, but we discover what there is to know. The two are distinguishable and should not be confused.

Jabberwockians Part 2: Creatures in the Mind and Those in the World

This is a continuation of Part 1, which can be read here.

We could also talk about the itch on the sole of my foot as something that occurs “in my imagination”. It is not that I imagine the itch — the itch is very real, all-encompassing for me! But I agree that it is different in many or most respects from the lamp-post. We refer to items in our imagination in two ways: (a) as independent of anything that is happening in the world; (b) as something which also has a counterpart in our objective world.

To give an example: the Jabberwock, described by Lewis Carroll in Alice through the Looking Glass, is an imagined creature that has no counterpart in our normal world, although it is kith and kin to other imagined creatures of the “monster class”. Compare this to a predatory tiger which stalks the open fields in search of prey. This tiger is not described by “Tyger Tyger, burning bright” but can be described in many ways in ordinary language. We can therefore talk about an AJ creature — short for Anti-Jabberwockian — as well as a J creature, which may have “eyes of flame”, whiffles as it moves and “burbles” too. It had a neck! The rest is left to our imagination, stimulated by the sounds of a set of non-descriptive words. Such creatures only exist in our minds, in our imagination, and have no counterparts in nature. Don’t look for them! One can add to their attributes, but we do not expect that the same additional attributes will be discovered by other people as they roam through their imaginations.

This difference between J’s and AJ’s, in the conditions under which one believes in the reality of things, is important.

What is an AJ? What is a J?

Those of us who are J’s are traditional believers; their beliefs could include that Jabberwocks exist, so that if there is one, presumably there are many! This conclusion follows from the more abstract belief that all creatures have parents, even when these have not been sighted, i.e. they could be sighted, could be discovered, are discoverable — therefore must be found!

AJ’s, on the other hand, are believers who demand that on every occasion solid grounds be given for the things/events in which we are asked to believe. They advocate a rigorous acceptance-principle, which is applied to every instance where a belief is actively promoted. Such beliefs are often promoted by an interest group — of which there are very many! I think Alice herself is an AJ. Even very young North American girls (and boys, of course) often become convinced AJ’s quite early in their lives: e.g. most believe, as toddlers, in Father Christmas or in the tooth-fairy, but both beliefs are gradually shed.

Furthermore, AJ’s adopt moveable, flexible demarcation criteria, which change over time and with experience. Thus, they tend to be comfortable with the idea that what was “true” — and therefore real — yesterday (!) may not be so tomorrow. Indeed reality itself becomes a changing idea: the world may not be a sea of chaos — as many philosophers once surmised — but it is not like a sheet of calm, unruffled water either.

AJ’s therefore reserve judgement about whether creatures of the imagination can (or do) retain their legitimacy for long, or whether legitimacy implies that there are strictly enforceable legitimation rules which are unbreakable and cannot be revised. Of course, such rules need to be justified on grounds other than mere momentary convenience! Much of the work by contemporary philosophers of science (a separate breed!) has been devoted to this task. Often their work has de-legitimized earlier research by demonstrating that so-called firm conclusions were prematurely reached, since the methods of data collection and data treatment were unwarranted, that is, cannot be justified by today’s standards. Medical research has been particularly hard hit, since all too often studies on, for example, the short- and long-term effects of medication require elaborate designs and procedures which could endanger the subjects of the investigations — and could therefore not be undertaken at this time.

But placing restrictions on what can be done in an experiment also places limits on the interpretation of the data collected on such occasions: the jury is out, so to speak! Of course, when this is so it limits the validity and generality of any conclusion reached at the end of the investigation. “The jury is out” is in fact an important contribution to research: it limits what can be concluded, what matters can be temporarily included in the “Book of Truths”!

It therefore immediately influences the integrity of any model which was involved in the inquiry itself and therefore also of the theory being “tested” directly or indirectly by the experimental investigation. From one point of view, this scenario is a downer for researchers, for people who are usually committed to the theory being tested. To be sure, there are those whose primary interest is in “discovery” — or as is often said “in the facts” — whereas others are tuned into the validity of the theory which generated the predictions being tested. Theories — one could say — are not discovered, but created, whereas facts are uncovered.

It seems from all this that Jabberwockians (the J’s) are the salt of the earth, given that they are willing to entertain a world which is by no means coherent, and are those who are willing to walk into positions which are not laced up by tight strands of logic. Anti-Jabberwockians (AJ’s), on the other hand, can be insufferable because they insist that creatures like the Jabberwock don’t make sense, that creatures of the imagination always have to stand the test of logical coherence before they can be declared real. It is not a matter of “value”, but a matter of what furniture has been placed into each room in our complex living space, and of how flexible we are in our demands for attaining the best of all possible worlds.

This is the second of a 3-part blog. Part 3 will appear in a few days.

Jabberwockians Part 1: The Concept of Jabberwockians and Anti-Jabberwockians

This is the first of a three-part blog. Parts 2 and 3 will appear in a few days.

There are two sorts of people: the Jabberwockians (the J’s of this world) and the Anti-Jabberwockians (the AJ’s). The former are all those who believe that the Jabberwock described by Lewis Carroll in Alice through the Looking Glass is a real creature and is as described! Anti-Jabberwockians, by contrast, believe that this creature is a hoax, that it does not exist except in the minds of young children and infantile adults.

The J’s therefore believe in the (real) possible existence of creatures of their imagination, just as they believe in their own existence (I think therefore I am) but also in the reality of butterflies. The follow-up question is, “What creatures in their imagination do not exist and are therefore imaginary?” This question demands that we submit a set of criteria which demarcates creatures that are officially real from those deemed non-real. By what standards can these two classes of creatures be reliably identified?

The truth is that even the most committed solipsist will wander up and down the boulevard declaring that some objects encountered are real whereas others are not, i.e., are unreal. At the least, they may express doubts about whether all things on their walk were as perceived. We wish to know, however, by what standards the solipsist makes his/her judgements, because we are often placed in a position where we need to decide for ourselves whether specific claims made are justified or seem ill supported. We do not argue that there are indeed more imaginary creatures than objects encountered on our walk — though that may be so for some people.

Two classes of creatures? Surely there are more categories? Some are more likely than others to have the required attributes. We often also assume there is a dimension which helps us to distinguish matters, so that our choice is not confined to either/or! There may be many choices that have to be made. If we confine ourselves to two anchor-points, we join the group of either/or thinkers, those who bifurcate, for whom the world is left/right, up/down, right/wrong, and where the notion of a dimension has no place. For me such a world is unthinkable, although it was the dominant mode of thought for long periods. We may ask: is “exist/does-not-exist” part of this world? I think not, because we have always allowed room for the category of “could exist”, for possibilities, and in recent times we have encouraged the habit of suspending judgement.

When Macbeth exclaims, “Is this a dagger which I see before me?” the audience realizes that there is no miraculous dagger suspended before him as he believes; that the dagger is imagined by him, or as we commonly say, “in his mind”. The audience can tell — and knows — the convention whereby matters or common objects are regularly identified and also on what occasions this is possible and feasible. In short, we have all learned to distinguish things we think about — which are deemed to be private — from things which are public, which are assumed to be available for everyone to see and sense.

It would be more correct to talk about “private as well as public”, given that matters can be private and NOT public. “My toothache” is a case in point, but so is “to my mind”. The statement “I think of Jeannie with the light brown hair” is private because there may be no such person as Jeannie! Whatever is referred to as “in my thoughts” is therefore private unless a claim is attached that it is also assumed to be public! Such a claim is a game-changer.

We know therefore that others — occasionally we ourselves — see and detect matters without cause, so that we can usually (but not always) tell or distinguish the real from the illusory. Example: “(I thought) I heard a burglar in the bathroom, and therefore shot my gun in that direction.” The cause of the action in this case is the presumed burglar, and we are asked to accept that this was a reasonable thought to have under the circumstances and — all things considered — not just an illusion or an aberration.

Here is the critical question which arises from the above example: what is the status of events which we agree are “private” compared to those we accept as “public” events? The lamp-post at the corner of our street is a public event/object. We assume it will stand there regardless of whether it is seen by anyone or not. It has endurance, a lasting presence. It does not “will itself” to stand! It was placed there by someone and probably for some reason. In this respect the lamp-post is quite different from an itch on the sole of my foot.

More to come in Part 2 in a few days!

Form and Constructivism (Part 1)

A definition of *entity* given in 1596 states that the word derives from the Latin entitas, from ens, proposed by Julius Caesar as the present participle of esse, *to be*.

I interpret this to mean that anything claimed “to be” also exists. Note that terms like *entity*, *substance*, and *object* were already used by Greek writers (before 450 BC) and have continued to be used as a vital part of the vocabulary of philosophers from then until the present day. Of course the meaning has changed over time. However, our uninterrupted use of the term indicates that we are able to make good use of it even when its meaning (as in interpretation) changes. As with so many words, meanings and interpretations have changed often, even imperceptibly, throughout the past hundreds of years.

Words are like sewage pipes: they have to be cleaned out periodically, preferably by certified plumbers. Thus Greek thinkers enjoyed speculating about the nature of “Nature” and raised questions about how, and by whom, the Nature they knew had been composed and organized. Often they avoided questions about its origin because they lacked acceptable standards for getting clear and authoritative answers to their questions. Indeed, some writers have claimed that the continuing task of philosophers is to propose standards and methods by which we can come to know what is justified and true.

Thus, asking relevant questions without having an effective technology to move these beyond what was already known at the common sense level seemed futile. Yet it was (and still is!) a game compulsively played despite its significant and recognized hazards. Some answers were deemed to be better than none! Incorrect solutions or unpopular solutions were actually dangerous — for such might offend the Gods and their worldly representatives, or so it was opined. (Socrates was neither the first nor the last person to face expulsion from his home, even death, for raising such “origin” questions!) By casting doubt on an existing — perhaps even a powerful religious — order and its doctrines, one also cast doubts on the existing socio-political order, an order, it was thought, that was dictated by the Gods and therefore part of a “Divine Order”.

Questions about how the world is composed are, generally speaking, raised by people who are prone and inspired to give answers different from those already accepted within a community — by so-called dissidents. Surely such men and women are inherently dangerous, and therefore deserve to be neutralized! Their questions could easily unravel the tightly woven explanatory tapestry which had been completed with great care, over many centuries, by wise (but of course not disinterested) thinkers!

An early example of such a challenge occurred in approximately 550 BC, when several Ionian thinkers suggested that the physical world — the world of physical objects and their interactions — had been constructed by unknown forces from one or more basic fundamental materials, i.e. atoms, the indivisibles. Two factors were speculated upon: one, material objects in their most fundamental, most primitive forms; and two, the manner by which these were structurally related. The former (*atoms*) were not considered by Greek thinkers to be pliable, whereas *forms* or *structures*, it was assumed, were. It was not clear by what rules these “atoms” had been assembled to become new entities, new phenomena, but it was assumed that knowledge about how this happened could ultimately be gained — although not how this knowledge could be gathered.

Conceptually, Ionian ideas represent an advance over earlier ideas about the origin of the Cosmos, including proposals made within traditional mythologies, such as the widely held suggestion that one or more super-humans had “engineered” and assembled the cosmos! (The origin of science fiction?) Thus, with very little evidence to support their ideas, Ionian philosophers put forward a constructivist theory of the origin and composition of nature, of the world and its entities as these exist throughout space and time! The issue of which principles were involved in producing this staggering result remained unclear, but it was discussed more and more amongst learned and enlightened men, some of whom had been officially entrusted by their contemporaries with “speculating about deep matters”: priests and the lovers of wisdom, those who make good judgements.

Others also participated in these discussions. They offered varying solutions, which created a massive problem: namely, how to adjudicate between the different solutions proffered, both to problems about the material structure of the world and to problems about how to make good judgements about human actions. How can this be done without first enumerating principles according to which decisions between alternatives can be taken? How does one evaluate alternative solutions offered to such problems? To do so was clearly the task of a meta-discipline, whose task would be to lay down what was possible in all possible worlds. What had to be recognized was that their own thinking was inescapably multi-dimensional: it could not only move forward and backward in time, but could also move up and down within a hypothesized multi-dimensional conceptual space.

Humans have accepted on many occasions that there is an immediate and even a far-distant past, and that this does not depend entirely on experience. On the contrary, our experience mainly appears to reflect something that takes place independently of us. But there are also steps which do not involve the passage of time, but which reflect that different events entail one another, in the sense that a strong, fierce wind, for example, is viewed as part of a more extended event, i.e., a storm. The event itself (the storm) has many discernible attributes, but with a certain amount of effort these can be ordered so that some events stand at a different level from others.

When this is done we have the bare bones of the structure of an event. A common example is when someone describes the skeleton of a deceased person, perhaps a skeleton found in a burial site in a cave, in a district where it is known from other evidence that humans deposited their dead thousands of years ago. Here is a case where an image of the past is reconstructed from the scattered fragments (remains) found during the dig. The reconstructed image is then the product of someone’s imagination which has been guided by much background knowledge. Of course, it is also subject to commonly accepted corrective standards, such as those incorporated in any currently-espoused methodology of the sciences.