Sunday, 21 December 2014

Christmas: the one thing that can be all things to all people

I never liked Christmas much. It is a season of disappointments; a season of darkness, literal and figurative. No amount of contrived, gaudy gaiety can redeem such a jet-black night.

As a believer in the Christ as my Lord and Savior I see no contradiction, no lack of generosity, no crippling pessimism: it is said that it is darkest before dawn, and what a fitting metaphor:

In the beginning was the Word, and the Word was with God, and the Word was God. He was with God in the beginning. Through him all things were made; without him nothing was made that has been made. In him was life, and that life was the light of all mankind. The light shines in the darkness, and the darkness has not overcome it. (John 1: 1-5)

Hanukkah is encapsulated by an acronym that says: A great miracle happened there.

I am no saint. This meditation is what stirs and animates my love—for my kids, my family, and those who have touched my life, both positively and negatively.

I recently came across Isaiah Berlin (6 June 1909 – 5 November 1997). I think I've always sought out thinkers who have a sense of irony, and what exquisite irony this man wielded. If Christ-like irony be the measure of a humble man, Berlin is worthy of our attention:

Certainly no politics are more real than those of academic life, no loves deeper, no hatreds more burning, no principles more sacred. (To Freya Stark, 12 June 1944)

Nobody is so fiercely bureaucratic, or so stern with soldiers and regular civil servants, as the don disguised as temporary government official armed with an indestructible superiority complex. (ibid)

Doesn't that one describe the dullard, Harper, to a tee? (Actually, for someone who described his own writings as having so little politics, Berlin cuts rather deeply to the quick, and counts certainty a great evil to be rightly and promptly ridiculed.)

Here is a personal lesson for me (someone so obviously prone to easy intellectual pride and vanity):

I am a hopeless dilettante about matters of fact really and only good for a column of gossip, if that. (To Walter Turner, 12 June 1945)

Pluralism, which he advocated mightily and so capably, was often criticized as merely a form of relativism. Rather than having a conniption or walking away in disbelief and disappointment, he responded thus:

Injustice, poverty, slavery, ignorance – these may be cured by reform or revolution. But men do not live only by fighting evils. They live by positive goals, individual and collective, a vast variety of them, seldom predictable, at times incompatible. (‘Political Ideas in the Twentieth Century’ (1950), L 93 [FEL 40])

Unless there is some point at which you are prepared to fight against whatever odds, and whatever the threat may be, not merely to yourself but to anybody, all principles become flexible, all codes melt, and all ends-in-themselves for which we live disappear ... (To Philip Toynbee, 24 January 1958)

What the historian says will, however careful he may be to use purely descriptive language, sooner or later convey his attitude. Detachment is itself a moral position. The use of neutral language (‘Himmler caused many persons to be asphyxiated’) conveys its own ethical tone. (Introduction to ‘Five Essays on Liberty’ (1969) , L 22–3 [FEL xxix])

This final quote in defense of pluralism is what originally caught my eye:

Those, no doubt, are in some way fortunate who have brought themselves, or have been brought by others, to obey some ultimate principle before the bar of which all problems can be brought. Single-minded monists, ruthless fanatics, men possessed by an all-embracing coherent vision do not know the doubts and agonies of those who cannot wholly blind themselves to reality. (ibid. 47 [lv])

I cannot honestly bring myself to believe that the dark hours of the winter solstice mark the birth of our Savior, though I can honestly say that giving gifts to my loved ones brings me sincere joy. This season, however, is full of metaphorical meaning for me. And I am of the belief that no one can sincerely account for one's years to one's self without some measure of objective standard, no matter how amorphous, dynamic and vague its resolution.

The nuggets just quoted contrast with, and gauge, how I'd like to measure myself against the Gospel of my Lord.

Merry Christmas and a Prosperous New Year,

Jay

Sunday, 14 December 2014

"I open my eyes and my eyes are filled"

-the title of this entry is taken from David Berlinski (2009)

Many features of the human brain seem to obey the principles of quantum physics: how we see the world (matter) in all its richness of colour (wave-like); how our actions and mental states can apparently be influenced by our notions of past, present and future; etc. Even the way we so effortlessly express and comprehend language has pseudo-quantum-physical features.

In thinking about language I'm always struck by the human mind's ability to take in and express fully-formed and/or spontaneously forming ideas in the act of speaking and listening. This may be just an illusion of timing—I concede this only contingently, because whether it be an illusion or not it is a fine and good thing to have. At any rate, the notion of completion (quantum, if you will) seems built into our aesthetic/semantic sensibilities such that an incomplete thought always leaves us scrambling to complete it: our "linguistic turn" demands satisfaction.

The morpho-syntactic structure of the Inuit language is highly mathematical: the interaction between phonology and morphology is so beautifully regular that we (those who speak the language) can and do anticipate, and can and do predict, the place and manner of articulation a variant will take on depending on the following morpheme. Cause apparently follows effect in this case, because most of the phonological assimilation rules in the Inuit language are "regressive" (ie, the cause/effect relation runs backwards).

ani- becomes anijumajunga  "I would like to exit (now)"

pisuk- becomes pisugumajunga  "I would like to (take a) walk"

isiq- becomes isirumajunga  "I would like to enter (now)"

These are examples of "progressive" assimilation (the basic form remains [juma] when it follows a root ending in a vowel; /k/ changes to [g] in the second case; /q/ changes to [r] in the third). In all cases, the place of articulation of the initial segment is left unchanged, but the voicing is changed (as is typical of Inuit language assimilation rules).

But, in the case of

pisuktunga  "I am walking"

pisuglanga "let me walk"

the assimilation rules are "regressive"—again, following the general rules of Inuktitut, the place of articulation of the final segment of the root verb (pisuk-) remains unchanged, but the change in voicing is triggered by the following morphemes ([-tunga] and [-langa]), which differ not only in semantic content but, more importantly, in the voicing of their initial segments.
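The two patterns just described can be sketched as a toy rule system. This is my own simplification for illustration only—the function name, the rule inventory, and the set of "voiced" segments are assumptions, not a full account of Inuktitut morphophonology:

```python
def attach(root, suffix):
    """Join a verb root and a suffix, applying the simplified
    assimilation rules described above (a toy model, not a grammar)."""
    # Progressive: suffix-initial /j/ fuses with a root-final stop,
    # keeping the stop's place of articulation but voicing it:
    # k + j -> [g], q + j -> [r]; after a vowel, /j/ surfaces unchanged.
    if suffix.startswith("j"):
        if root.endswith("k"):
            return root[:-1] + "g" + suffix[1:]
        if root.endswith("q"):
            return root[:-1] + "r" + suffix[1:]
        return root + suffix
    # Regressive: a root-final /k/ voices to [g] before a voiced
    # suffix-initial segment (e.g. /l/), but stays [k] before voiceless /t/.
    if root.endswith("k") and suffix[0] in "lgjrmnv":
        return root[:-1] + "g" + suffix
    return root + suffix

# Progressive examples from the text:
print(attach(attach("ani", "juma"), "junga"))    # anijumajunga
print(attach(attach("pisuk", "juma"), "junga"))  # pisugumajunga
print(attach(attach("isiq", "juma"), "junga"))   # isirumajunga

# Regressive examples:
print(attach("pisuk", "tunga"))  # pisuktunga
print(attach("pisuk", "langa"))  # pisuglanga
```

The point of the sketch is simply that the five surface forms above fall out of two small, regular rules—exactly the kind of predictability a speaker exploits in real time.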

This morpho-phonological feature is not unique to the Inuit language. English, to some extent, also has it—in fact, all human languages do; the only qualification seems to be that the phonological change be phonetic and not phonemic (ie, that it carry no semantic significance).

In English, the word "miss" ends with a segment /s/:

"I miss her"  -no change in the final /s/ (say it out loud);

but, /s/ becomes [sh] in

"I miss you" (compare it with "I miss her")

This change in /s/ is called "fronting", and it anticipates the following "you", which is articulated relatively close to the lips in comparison to "her", which is pronounced further back in the sound-production apparatus (ie, the mouth as a whole).

The production of speech (and the expression/comprehension of thoughts) is largely psychological: the encoding and decoding processes happen too fast for the conscious mind to discern. The currency of speech comprises idealized quanta of fully formed thoughts. This interplay apparently pays little attention to individual sounds, words and phrases, which smear out and bleed into each other forwards and backwards, and the process must go unhindered lest meaningfulness simply evaporate—try repeating the same word over and over, or staring at an individual word for a while, and your mind's eye will just naturally glaze over, because there is no real meaning left to hold its attention.

The various narratives and semiotic frameworks (ie, our prior experiences) determine our ability to interpret and decode meaning, and comprehension/take-away is often idiosyncratic rather than collective: our ability to learn and acquire new frameworks demands it. The collective follows the idiosyncratic.

When the process of language acquisition is disrupted in a sustained way—as happens with inappropriate "pedagogies" like the so-called "whole language" approach—what comes naturally is lost. Whether this is permanent or not, I don't know. It certainly has devastating consequences, as the histories of colonized peoples bear out time and again.

I think this disruption can be overcome simply because it is proven time and again that the human mind is extremely resilient and seems to naturally seek ways of transcendence and compensation, after its own way and fashion.

The ability to switch back and forth between various narratives/semiotic frameworks seems to be the key. Intelligence, at the end of the day, is a given. But the more learned and educated a person becomes, the more natural and well-oiled the ability to shift intellectual gears becomes. Exposure to new and well-articulated ideas and experiences facilitates language acquisition; rote memorization, not so much.

Jay

Saturday, 13 December 2014

Why 'dialectics' is important to the vitality of civilization

The term "dialectics" is defined as follows:

Dialectic (also dialectics and the dialectical method), from Ancient Greek διαλεκτική, is a method of argument for resolving disagreement that has been central to European and Indian philosophy since antiquity. The word dialectic originated in ancient Greece, and was made popular by Plato in the Socratic dialogues. The dialectical method is discourse between two or more people holding different points of view about a subject, who wish to establish the truth of the matter guided by reasoned arguments.

The term dialectics is not synonymous with the term debate. While in theory debaters are not necessarily emotionally invested in their point of view, in practice debaters frequently display an emotional commitment that may cloud rational judgement. Debates are won through a combination of persuading the opponent; proving one's argument correct; or proving the opponent's argument incorrect. Debates do not necessarily require promptly identifying a clear winner or loser; however clear winners are frequently determined by either a judge, jury, or by group consensus. The term dialectics is also not synonymous with the term rhetoric, a method or art of discourse that seeks to persuade, inform, or motivate an audience. (http://en.wikipedia.org/wiki/Dialectic)

The Wikipedia entry on Dialectic continues:

Different forms of dialectical reasoning have emerged throughout history from the Indosphere (Greater India) and the West (Europe). These forms include the Socratic method, Hindu, Buddhist, Medieval, Hegelian dialectics, Marxist, Talmudic, and Neo-orthodoxy. (ibid)

The Talmudic dialectical method (of Rabbinic Judaism), especially the Avot Tractate, or "The Ethics of the Fathers", relies on the four principles of interpretation called PaRDeS (an acronym of the four levels of exegesis), but it places extreme importance on fidelity to the Holy Scriptures—not only to what they say but to every stroke, and to the sequential integrity, of the actual Hebrew letters:

Each type of Pardes interpretation examines the extended meaning of a text. As a general rule, the extended meaning never contradicts the base meaning [emphasis mine]. The Peshat means the plain or contextual meaning of the text. Remez is the allegorical meaning. Derash includes the metaphorical meaning, and Sod represents the hidden meaning. There is often considerable overlap, for example when legal understandings of a verse are influenced by mystical interpretations or when a "hint" is determined by comparing a word with other instances of the same word. (http://en.wikipedia.org/wiki/Pardes_(Jewish_exegesis))

Ideology and its twin, Dogma, are two demons that ever seek to lull the unwary into dead-ends, and they seem to be inherently built into any rational discourse, especially where definitive answers/solutions are necessarily rare and hard-won only by sheer effort and perseverance. There seem to have been two spectacular instances where the twin demons apparently won in the venerable discourse of scientific thought: the so-called String Theory and Darwinian evolution.

In the Darwinian theory of evolution, the will to monopoly (through no fault of anyone, really) seems to have asserted itself right from the get-go. In David Berlinski's book, The Devil's Delusion: Atheism and Its Scientific Pretensions (2009), there is a beautifully crafted passage (is there any other way he writes?) that illustrates this need for dialectical exchange and the consequences when it is ignored (quoted here in its entirety just because no other way is possible):

Together with Charles Darwin, Alfred Wallace created the modern theory of evolution. He has been unjustly neglected by history, perhaps because shortly after conceiving his theory, he came to doubt its provenance. Darwin, too, had his doubts. No one reading On the Origins of Species could miss the note of moral anxiety. But Darwin's doubts arose because, considering its consequences, he feared his theory might be true; with Wallace, it was the other way around. Considering its consequences, he suspected his theory might be false.

In an interesting essay published in 1869 and entitled "Sir Charles Lyell on Geological Climates and the Origins of Species," Wallace outlined his sense that evolution was inadequate to explain certain obvious features of the human race. The essay is of great importance. It marks a falling-away in faith on the part of a sensitive biologist previously devoted to ideas he had himself introduced. Certain of "our physical characteristics," he observed, "are not explicable on the theory of variation and survival of the fittest." These include the human brain, the organs of speech and articulation, the human hand, and the external human form, with its upright posture and bipedal gait. It is only human beings who can rotate their thumb and ring finger in what is called ulnar opposition in order to achieve a grip, a grasp, and degree of torque denied any of the great apes. No other item on Wallace's list has been ticked off against real understanding in evolutionary thought. What remains is fantasy of the sort in which the bipedal gait is assigned to an unrecoverable ancestor wishing to peer (or pee) over tall savannah grasses.

The argument that Wallace made with respect to the human body he made again with respect to the human mind. There it gathers force. Do we understand why alone among the animals, human beings have acquired language? Or a refined and delicate moral system, or art, architecture, music, dance or mathematics? This is a severely abbreviated list. The body of Western literature and philosophy is an extended commentary on human nature, and over the course of more than four thousand years, it has not exhausted its mysteries. "You could not discover the limits of soul," Heraclitus wrote, "not even if you traveled down every road. Such is the depth of its form."

Yet there is no evident distinction, Wallace observed, between the mental powers of the most primitive human being and the most advanced. Raised in England instead of the Ecuadorian Amazon, a native child of the head-hunting Jivaro, destined otherwise for a life spent loping through the jungle, would learn to speak perfect English, and would upon graduation from Oxford or Cambridge have the double advantage of a modern intellectual worldview and a commercially valuable ethnic heritage. He might become a mathematician, he would understand the prevailing moral and social codes perfectly, and for all anyone knows (or could tell), he might find himself a BBC commentator, explaining lucidly the cultural significance of head-hunting and arguing its protection.

From this it follows, Wallace argued, that characteristic human abilities must be latent in primitive man, existing somehow as an unopened gift, the entryway to a world that primitive man does not possess and would not recognize.

But the idea that a biological species might possess latent powers makes no sense in Darwinian terms. It suggests the forbidden doctrine that evolutionary advantages were front-loaded far away and long ago; it is in conflict with the Darwinian principle that useless genes are subject to negative selection pressure and must therefore find themselves draining away into the sands of time.

Wallace identified a frank conflict between his own theory and what seemed to him obvious facts about the solidity and unchangeability of human nature.

The conflict persists; it has not been resolved. (Berlinski, 2009, pp. 157-159)

As if the idiocy of over-extending an incomplete scientific theory into fields not of its purview, and the ill-advisedness of pontificating and issuing papal bulls prematurely (as the Dawkinses and the Pinkers of this world have done), were not abundantly clear yet, Berlinski continues further down:

Writing in the 1960s and 1970s, the Japanese mathematical biologist Motoo Kimura argued that on the genetic level—the place where mutations take place—most changes are selectively neutral. They do not help an organism survive; they may even be deleterious. A competent mathematician and a fastidious English prose stylist, Kimura was perfectly aware that he was advancing a powerful argument against Darwin's theory of natural selection. "The neutral theory asserts," he wrote in the introduction to his masterpiece, The Neutral Theory of Molecular Evolution, "that the great majority of evolutionary changes at the molecular level, as revealed by comparative studies of protein and DNA sequences, are caused not by Darwinian selection but by random drift of selectively neutral or nearly neutral mutations." (ibid, p. 194)

Most people who know me, or have read enough entries in this blog, may have already surmised that I'd say BURNED! to both sides of the Creationist/Evolutionist divide—and to the right-wing nuts who use social Darwinism to justify their arrested development on the way to becoming fully human.

Sorry.

Dialectics, then—rather than promising definitive answers—seek to draw in as wide a range and diversity of thought in knowledge acquisition as possible, as long as advocacy is rationally articulated and done in good faith.

Had the theory(ies) of evolution been sifted through the dialectic sieve, people like the Nazis, the Stephen Harpers, the Ayn Rands, the Tea Party militants, and the so-called Moral Majority movements of this world might have proven much less virulent, even peripheral, to Western civilization. The United States of America—that brightly shining mansion on the hill—might have been given half a chance to truly be the promise of suffering humanity, rather than the sorry, pathetic husk it has become in pursuit of its manifest destiny, after the image of the military-industrial complex.

Jay

A curious case of the "four" (in the Inuit language)

The universe appears to be just one of those things.
-Frank Wilczek

The other day, I was doing an informal presentation on the Inuit language consonant chart to one of my colleagues at the Arctic College in preparation for Inuit Language courses we're simultaneously holding for the next couple of weeks (I'm teaching one of the four classes) when she asked me how I would explain the dialectal variants of the number 4—in some instances, it occurs as "tisamat" and in others as "sitamat".

Rather than giving the unsatisfying "explanation" that it is probably "just one of those things", I decided to try looking at it in terms of autosegmental analysis. I've mentioned "autosegmental phonology" before in this blog; I believe it has productive possibilities for the analysis of Inuit language variations.

Says Wikipedia on autosegmental phonology:

The working hypothesis of autosegmental analysis is that a large part of phonological generalizations can be interpreted as a restructuring or reorganization of the autosegments in a representation. Clear examples of the usefulness of autosegmental analysis came in early work from the detailed study of African tone languages, as well as the study of vowel and nasal harmony systems. (http://en.wikipedia.org/wiki/Autosegmental_phonology)

More specifically, using this gambit:

There are situations in which the rule applies not to a particular value of a feature, but to whatever value the feature has. In these situations, it is necessary to include the presence of the feature, but not to specify its value. (ibid)

I thought some light may be shed on this interesting problem.

How I'd explain the implications of the above insight is to say that a given word comprises a series of obligatory slots to which consonants and vowels are assigned. For example, the word for "four" in Inuktitut may be rendered (in highly abstract terms, of course) as comprising this series of consonants and vowels:

CiCamat—ie, the word is sometimes 'tisamat' and sometimes 'sitamat' depending on how your dialect pronounces the word.

Let me draw your attention to the capitalized "C"s (consonants), where the first consonant is sometimes /t/ and sometimes /s/ while the second consonant slot takes what the first doesn't choose (it comes out as [t] if the first chooses /s/; [s] if the first chooses /t/).

The choice in the expression of the second consonant between the two seems to be psychologically real, and, in fact, it seems to be set in stone: "after thy father, thou may be such but not some other".

I think what is happening here is that the sequence of the first two vowels in the word—the /i/ and the /a/, respectively—is exerting some irresistible force on this particular slot assignment. I say this because one may have:

sitamat   but not  *satimat

and

tisamat  but not  *tasimat

In obeying the slot assignments, in fact, one may mispronounce the first two consonants (in this case, change the voicing feature of the consonants in question while keeping the vowels as they are) and not lose the meaning:

zidamat  (the hypothetical version of 'sitamat'); and,

dizamat (the hypothetical version of 'tisamat').
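The slot analysis can be made concrete with a small sketch. This is my own toy formalization, not an established analysis—the function name, the template check, and the "relaxed" voicing mode are all assumptions for illustration:

```python
def fits_four_template(word, strict=True):
    """Check a candidate word against the CiCamat template for 'four'.
    In strict mode the two C slots must share {t, s} between them;
    in relaxed mode, voicing 'mispronunciations' (d for t, z for s)
    are tolerated, as with the hypothetical zidamat/dizamat."""
    # The vowel frame -i-amat is obligatory; only the C slots vary.
    if len(word) != 7 or word[1] != "i" or word[3:] != "amat":
        return False
    c1, c2 = word[0], word[2]
    if strict:
        # The two slots must split {t, s} between them, in either order.
        return {c1, c2} == {"t", "s"}
    # Map voiced 'mispronunciations' back to their voiceless counterparts.
    devoice = {"d": "t", "z": "s"}
    return {devoice.get(c1, c1), devoice.get(c2, c2)} == {"t", "s"}

print(fits_four_template("sitamat"))   # True
print(fits_four_template("tisamat"))   # True
print(fits_four_template("satimat"))   # False: vowel frame violated
print(fits_four_template("tasimat"))   # False: vowel frame violated
print(fits_four_template("zidamat", strict=False))  # True
print(fits_four_template("dizamat", strict=False))  # True
```

Note how the relaxed mode captures the observation above: voicing can wobble without destroying the word, but the vowel frame and the complementary t/s slot assignment cannot.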

...

Ferdinand de Saussure—and the structuralist school after him—claimed that the production of words in human languages is a completely arbitrary affair:

Saussure posited that linguistic form [ie, a given word] is arbitrary, and therefore that all languages function in a similar fashion. According to Saussure, a language is arbitrary because it is systematic, in that the whole is greater than the sum of its parts.(http://en.wikipedia.org/wiki/Ferdinand_de_Saussure)

meh...

There is something incomplete about this perspective. Having seen that /dog/ may express itself as qimmiq, chien, or dog (or schweinhund, as my friend Kalman likes to pepper his speech while mimicking a German accent), Saussure makes a rather vague (and odd) claim that "language is arbitrary because it is systematic" and no less "greater than the sum of its parts". He goes from word to language with little or no intermediary step in between. That he started from the word outward apparently gave him no pause for thought, let alone his treating a given word as having no internal structure at all—as fully formed and permanent.

Language structures do have internal logic systems that interplay elemental phono-morpho-syntax with history and linguistic evolution in highly constrained and rigid ways. Rather.

The analysis of "four" in Inuktitut above instead suggests that these systematic linguistic structures are "information-rich" (ie, greater than the sum of their parts) not only because they are capable of encoding meanings (semantic content) but because they evolve according to lock-stepped iterative processes, much like how plants grow and evolve, or how chemical elements combine to make compounds, each according to the available physical properties of its elements and their unique emergent possibilities when combined a certain way.

Jay

Sunday, 30 November 2014

Quantum mystery

In semiotics a sign is rarely, if ever, just the sum of its parts. A hunter may know where to go to optimize his/her chances of bagging prey, but there are no guarantees that they will succeed every time. Not having gamed the system (perhaps by intention), the Inuit psyche seems especially open to the probabilistic, the mysterious, in its pristine state.

The word "mystery" as used here is not a weasel (or, all-purpose) word serving some ulterior motivation, but rather a recognition of a state of affairs within one's ken and pondering but currently beyond one's articulation and psychological mastery.

When I was a child I often thought about the nature of infinity, having heard of the concept in church as "without beginning or end". The endless fog bank, the lead in the ice that went on forever, etc. Once, looking at a boulder underwater, I wondered how long that boulder must have gone without experiencing dryness (literally forever, in my mind)—certainly not yesterday, nor any of the countless yesterdays I would be inclined to examine. What about the inside of the boulder itself: has it ever seen the light of day?

When I hit upon the science of classical and quantum physics, my mind (in my mind) was ready to receive their mysteries. Of course, I had preconceived notions of what reality was all about and how it must be just so. But I grew up in a culture that seems to have little use for preconceptions, and a rich metaphoric imagination that even a child knows is metaphoric rather than realistic.

I loved the philosophical discourse of physics if not outright captured by the math—interestingly enough, my restless mind finds calm in the equations for some strange reason: I can look at and ponder the meanings of equations (grammatically and geometrically) and get something akin to spiritual completion. For a long time, I'd talk ad nauseam (literally) about physics to anyone who had the misfortune of being in my presence.

I'd talk about imaginary (or, complex) arithmetic, Lorentz transformation laws, the impossibility of imagining our reality without movement in contrast to the oft-claimed impossibility of seeing four-dimensional space-time. It wasn't so much that the people I spoke to understood, but that I worked through and revelled in these objects of my obsession: I swear I intuited the Pauli exclusion principle by examining the periodic table of elements before I ever read about the principle itself, and that I reconciled the dual nature of particle physics by realizing that we see things in colour.

I read about Michael Faraday and he became my inspiration, because I saw that knowledge acquisition in science is more like his than that of the mathematical prodigies who seem to formulate theorems that arrive fully formed, impassive and inscrutable to the human mind: informed imagination, more often than not, precedes articulated formulation.

Every once in a while I'll come across a precious gem of a thought. But my compulsion to insist upon sharing an insight is much diminished and is diminishing still. My ego still asserts its irresistible power but, by and by, it seems to become more comfortable in its own skin. My insights, once gotten, rarely ever expire. I find waiting for the right moment to share is much more satisfying (and mature). I often find that someone else will have come across the same thought, and, often, have better language to articulate them.

About this fact of a maturation process, I have sometimes wondered if Hermann Hesse's Magister Ludi is subconsciously influencing/modelling my experience, or whether it is true of all deliberately cultivated minds and would hold true even if I had never read the book.

Ironically, even now, I realize that I still have much work to do on my patience, my ego, my still crippling visceral fear of ignorance. I'm psychologically repulsed by closed minds (including my own, upon reflection), and, it is especially in these moments of existential terror, I turn to Christ, the greatest mystery, the mystery, that is capable of affirming and assuring meaning on an otherwise seemingly meaningless, terrifying existence.

Jay

Saturday, 29 November 2014

The illusion of self?

A very close friend of mine recently gave me a book called The Self Illusion: How the Social Brain Creates Identity, by Bruce Hood (2012).

I've always had an interest in the cognitive sciences whether it be popularized recounting of findings in the science itself, philosophical speculation, linguistics, etc.—I, in fact, tend to regard my spirituality in such terms: ie, it is the whole package of the Judeo-Christian faith (the literary/linguistic structures, the aesthetic appeal of contrasting archetypes, the practical wisdom, reflecting upon the veracity of my Master's Gospel using the principles of PARDES (the Jewish tradition of Scriptural exegesis)—in short, trying to be human to the fullest—all the time trying not to set or delimit my expectations).

Believing that purely literal interpretation (fundamentalism and dogmatic thought) is not only a perversion but utterly ignorant and blasphemous, I do not pretend that the world is only 6000 years old (who did the math anyway?), do not believe in the so-called creationism, take ecclesiastical hierarchies seriously only up to a point, truly believe that G*d can and does speak truth to anyone He wishes, and truly believe that spontaneous rejoicing/ecstasy is a possibility built into the architecture of our being as much as cripplingly dark pessimism (ie, revelation and reception of truth is a matter of perspective and emotive intent).

I've been re-reading David Berlinski's, The Devil's Delusion (2009)—a scathing tour-de-force response to militant atheism—along with the Hood book (which is really an apologia for militant atheism).

The Hood book is highly selective in advancing its arguments (but so is the great Berlinski, for that matter). For instance, Hood contends that "the illusion of self" can be and is demonstrated by brain experiments that examine what is called the "readiness potential":

Prior to most voluntary motor acts, such as pushing a button with a finger, a spike of neural activity occurs in the brain's motor cortex region that is responsible for producing the eventual movement of the finger. This is known as a readiness potential, and it is the forerunner to the cascade of brain activation that actually makes the finger move. Of course, in making a decision, we also experience a conscious intention or free will to initiate the act of pushing the button about a fifth of a second before we actually begin to press the button. But here is the spooky thing. [Californian psychologist, Benjamin] Libet demonstrated that there was a mismatch between when the readiness potential began and the point when the individual experienced the conscious intention to push the button.
...

One might argue that half a second is hardly a long time but, more recently, researchers using brain imaging have been able to push this boundary back to 7 seconds. They can predict on the basis of brain activity which two buttons a subject will eventually press. This is shocking. As you can imagine, these sorts of findings create havoc for most people. How can we be so out of touch with our bodies? Do we not have conscious control? The whole point about voluntary acts is that we feel both the intention to act and the effort of our agency. We feel there is a moment in time when we have decided to do something, which is followed by the execution of the act. Brain science tells us that, in these experiments, the feeling of intention occurs after the fact. (Hood (2012), pp. 128-129)

The missing premise I see in this line of reasoning is the development of the cerebral cortex—that part of the human brain that epitomizes the evolution of Homo sapiens sapiens—which is said to "censor" our impulses (I myself much prefer "edit" to "censor") and which activates right before we execute the expression of a given thought or action. It is telling that the index of Hood's book makes no reference to this significant area of the human brain. I guess it would screw up the whole point of the tome.

I'm not dismissing Hood out of hand. There are many interesting tidbits, already common knowledge, about the workings of the human brain that he alludes to throughout the book. But like any interesting piece of "scientific" writing, it is necessarily incomplete, and should be assayed in light of all the findings and extant scientific and philosophical thought that inform this corpus of knowledge.

Jay

Sunday, 23 November 2014

Has Canada become boorish?

As I type up this entry, the Harper Government is announcing "Road to Mental Readiness"—apparently "new" funding for a war vets' mental health program (a thinly veiled, highly cynical bid for re-election, as everyone knows). Given Harper's track record in treating our society's most vulnerable—including, and especially, our military's wounded and broken—we can already expect the funding to lapse. Michael Blais (an outspoken veterans' advocate who blogs at http://www.canadianveteransadvocacy.com/blog/?author=2) pointed out right after the announcement that it comes from a government that has fought tooth and nail in the courts to establish that it has no constitutional obligation, no "social contract/covenant," to care for our war vets.

It is bad enough that the Harper Government has had to be shamed into making this announcement.

According to Michael Harris' Party of One: Stephen Harper and Canada's Radical Makeover, Stephen Harper is a third-generation admirer of all things military. Yet neither his grandfather, nor his father, nor Stephen Harper himself has ever gone further than being a military wannabe—I'd surmise that actually enlisting for active service would break the spell. But one can readily imagine these Harper men (during the wars of their generations) goose-stepping around the house, imposing regimentation on their unfortunate charges, and pontificating on the nobility of military service when not actually telling "war stories".

Boorish.

It is truly ironic that a self-avowed military/history buff would spurn not only our military but also foreign affairs, the very field in which a military force carries the stick. Here is a quote from Harris' Party of One:

As a politician, this prime minister seems to look out from a kind of intellectual suburbia onto a cosmopolitan world that is poorly understood, uninteresting, and perhaps even unimportant to him except in terms of the economic opportunities it provides. It is his instinctive position. When Harper was a Reform MP, Preston Manning tried to broaden his acolyte's horizons by introducing him to the virgin territory of foreign affairs. Harper balked. "One thing that did surprise me about Stephen as an MP. He had no interest in international stuff," Manning told me. "We simply couldn't get him to travel."

Perhaps it was Harper's parochial bent; perhaps it was a deeply ingrained mistrust of international politics, diplomacy, or leaders with different views than himself. Whatever the reason, soon after winning his majority government in 2011, Stephen Harper became the proverbial skunk at the diplomatic garden party. (Harris 2014, p. 218)

Harper certainly cops a good line that, though hardly ever original, is always peppered with value-laden terms: duty, a strong Canada, an energy superpower, patriotism, etc. But all this comes from a man of whom Preston Manning says: "Stephen doesn't think words mean very much".

Heaven forbid that an actual big-leaguer like Putin should ever call us out on Harper's rhetoric. Harper has so far proven an inept leader when it comes to military procurement: his rhetoric on the Canadian Arctic has not translated at all into actually producing the icebreakers needed to monitor and enforce our claims to the Northwest Passage, let alone the North Pole.

Harper definitely has "book knowledge", but it seems to be nothing more than an impressive talent for rote memorization of required reading, without much understanding of the real implications and applications of the briefing. As I said: Boorish.

Jay