Wednesday, December 20, 2006

A Method of Trisecting Angles


Folks, as promised, here is the method for dividing any angle into three equal parts. That's how it looks to me anyway, and until someone demonstrates that the sections are not equal (rather than simply pointing to the damn proof and saying, "Look, it's impossible!") I will trust my eyes on this one. Try it yourself and see...

Trisection of an angle using only compass and straightedge.

1- Draw an arbitrary angle A-O-A' (no greater than 180°; if greater than 180°, bisect it, apply the steps below to each half, then rejoin).

2- Draw the bisector B of the angle, through O.

3- Draw a circle P' of arbitrary size, centered at a point P on bisector B and tangent to both sides of angle A-O-A'.

4- Draw parallels C and D to bisector B, tangent to circle P'.

5- Draw parallels E and E', through point P, to the sides of angle A-O-A'.

6- Draw parallels F and F' to the sides of angle A-O-A', tangent to circle P'.

7- Draw a line from the intersection of lines F and C at G to the intersection of lines D and F' at H, crossing B at J.

8- Draw circle O' with center at O, through J and the new points K, L, M, N.

9- Draw circles J', L', and M' around points J, L, and M, the same size as circle P'.

10- Draw trisector X-O, from O, tangent to both circles J' and L'.

11- Draw trisector Y-O, from O, tangent to both circles J' and M'.

12- One may also draw hexsectors Z and Z' through points L and M from O. Line D-O is also a hexsector.
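For readers who would rather check with a calculator than a compass, here is a short Python sketch. It is based on my own reading of steps 3, 7, and 10, not on anything in the original article: I place the bisector on the x-axis, write α for the half-angle, and assume that F is the tangent parallel to side O-A on the far side of the circle and that the trisector is the line from O tangent to the circle around J. On that reading, the circle of step 3 has radius d·sin α, J lands on the bisector at distance d·(2 + cos α) from O, and the candidate trisector makes an angle of arcsin(sin α / (2 + cos α)) with the bisector, where an exact trisector would make the angle α/3.

```python
import math

def trisector_from_bisector(full_angle_deg, d=1.0):
    """Angle (degrees) between the bisector and the candidate trisector,
    under one reading of the construction. Exact trisection would return
    full_angle_deg / 6 (a third of the half-angle from the bisector)."""
    alpha = math.radians(full_angle_deg) / 2.0   # half-angle; bisector = x-axis
    r = d * math.sin(alpha)     # step 3: circle at (d, 0) tangent to both sides
    # Step 7 (my reading): G = F ∩ C, with F: y = tan(alpha)*(x - 2d) and
    # C: y = r; H mirrors G across the bisector, so line GH meets the
    # bisector at J = (x_j, 0).
    x_j = 2.0 * d + r / math.tan(alpha)   # simplifies to d * (2 + cos(alpha))
    # Step 10 (my reading): trisector = line through O tangent to the circle
    # of radius r centered at J, at angle arcsin(r / |OJ|) from the bisector.
    return math.degrees(math.asin(r / x_j))

for a in (30, 60, 90, 120, 150):
    print(a, round(trisector_from_bisector(a), 3), round(a / 6.0, 3))
```

On this interpretation the construction is a very good approximation - within about 0.05° of a true trisector for a 60° angle - but the error grows with the angle (roughly 0.27° at 120°), which would be nearly impossible to detect with a compass. If my reading of the steps is wrong, of course, these numbers say nothing about the actual method.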

I have found it possible to extend this method, enabling one to 5-sect an angle (pentasect, for word techies), 7-sect it, or divide it into any odd number of equal parts.

It seems evident to me that, measuring with a compass, the angles are indeed each 1/3 of the original angle. Still, a proper mathematical proof is required. As mentioned earlier, this would entail an investigation into Wantzel's proof to determine where he went wrong. (Readers can download that paper here.)

Alfred King Aldin. "A Pythagoreanism." Philosophic Research and Analysis, Fall 1971, pp. 6-7.
(This is an obscure journal out of Boulder, Colorado, and it has been defunct for some time now. Aldin also published a response to readers in the Late Winter 1973 issue.)

M.L. Wantzel (1837). "Recherches sur les moyens de reconnaître si un Problème de Géométrie peut se résoudre avec la règle et le compas". Journal de Mathématiques Pures et Appliquées 1 (2): 366–372.
(Yes, dear research students, I copied that link right off Wikipedia! And I enjoyed it!)

(Image scanned and posted on the Internet with evil delight from Mr. Aldin's original article.)

Tuesday, December 12, 2006

On "Difficulty"


At last I'm on vacation! Time to try and get some work done - but first some long-postponed blogging...

One thing that has always annoyed me is being told that something is difficult. "Mathematics is very hard," some people say. Even teachers fall into this trap.

Why is something hard? Who says it's hard? I don't find it easy, but that doesn't mean it's hard for everyone. Some people even find math easy; I wish I were one of those folks.

How many times have you heard a story about somebody who was interested in something, went ahead and figured it out, and then heard after the fact that he wasn't supposed to be able to do it? This is worth considering.

What makes something difficult? Not the material itself. It's entirely relative to the individual, or culture, or species, depending on what the nature of the challenge is. And then it's largely a matter of practice.

For human beings, it's very hard to walk on two legs - at first. Once they get it, though, they're off and running before you know it. For other critters, they're off and running once they hit the ground.

Answering fundamental questions about the world - now that's pretty hard, always has been. Just when you think you've got it, something or somebody comes along and messes up your tidy little theory. And it's back to square one.

For some cultures, it's hard to imagine an impersonal God. I'm not talking about believing in one, just the idea of one. And the reason for that is they're used to conceiving of a personal God with a face (sometimes literally).

For some individuals, math is hard. Maybe it'll always be that way, but maybe they'll get it and even be pretty good at it. It all depends on the person and the teachers they get - blockheads or angels.

Some things are supposed to be impossible for anybody. There are even proofs to show you that the difficulty isn't just in your head. Take trisection, for example. The task is simple: given any angle, divide it into three equal angles. Easy, right?

Wrong. People have beaten their heads against the wall for centuries in an effort to figure out a way to do this. But in 1837 a proof was published by a mathematician named Wantzel that this task is not only difficult but impossible. Mathematicians have been grinning smugly ever since. (You can read all about it here and here. Archimedes cheated - see here.)

The only trouble is, there seems to be a method that works. I found one in an obscure journal years ago, and it obeys the rules of the game. Now, I'm not a mathematician, but I know what I see, and this method looks very convincing to me. I've seen many other attempts, but they are either obviously wrong or inapplicable beyond a certain angle. This method apparently suffers from neither shortcoming.

(Yes, I do have a copy of it. Just not at the time of this writing. I will try and post it later.)

What have I learned from this?

1. I need to learn math.
2. I need to learn math so that I can, among other things,
a. understand that damn proof,
b. see what's wrong with it,
c. prove that this method works.

Is there any practical benefit to this exercise? No, not if you're really asking, "Can it fix your car?" or "Will it make you money?" But those aren't the only benefits around. Consider these:

1. A hidden assumption will be exposed.
2. We will have yet another example showing that we can do more than we think.
3. How we think greatly influences our abilities.

One thing about inquiry is that, to a large degree, it concerns finding limitations. That's what world records are for. Kids do it all the time: How far can I jump? How many pieces of chewing gum can I stuff into my mouth? How much can I get away with before Teacher smacks me? Grown-ups too: What happens if I mix these chemicals together? How long can I ride a unicycle? If I leave this number out, can I pay less on my income tax? The basic question underlying all these is, How far can you go?

If you don't try, you'll never find out. Even when the proofs say it can't be done, it's worth asking, Why not?

(Image joyfully pirated from http://mathworld.wolfram.com/AngleTrisection.html)

Originality, creativity, and philosophy


Everybody knows Steve Martin from the movies; some still remember that he did stand-up comedy. Not so many know that he studied philosophy. He said that philosophy was one of the best things for thinking creatively. Was he right? At first this might sound odd: after all, aren’t philosophers just those dead white males who analyze concepts? How can that help you be creative? I want to show that Steve Martin indeed was right, maybe more so than he knew at the time he said it.

I should preface this by saying that although I practice philosophy, and am writing this, I’m not doing so in an effort to justify my existence. I don’t need to do that. What I’m doing is showing what the arrows point to, and further how weird it is.

I taught a course in doing philosophical research this semester. It’s a first-year course for BA students, and it gave me a lot of food for thought for the process of doing research – as well as for how to teach it. I’m not sure if my approach was a success; we’ll see with the final papers and feedback forms.

As stimulating as it was for me, I wasn’t able to get all my thoughts out. This is just an attempt to take down some of those so that they will be of further use, either to my students or myself.

The Place of Originality and Creativity in Research. One day I harped on the word “originality” as the most overworked in our working vocabulary, and I still stand by that. The unfortunate thing is that I may have been perceived as being against originality – or its allied word, creativity.

I’m not against either one, really; what I can’t stand is the overuse of the words. There’s a big difference, and it’s a shame we have no substitute. Which, paradoxically, is not so bad. It forces us to come to terms with what those words really mean. Hopefully then we’ll be more judicious about using them in the future.

Let’s look at them one by one. Each of these is often used to mean that some product or person is unique and positive; we usually save them for solutions to problems and artistic fields. “An original solution to the problem of universals,” says the blurb on a book cover. “The author has written an ingenious book that is sure to stimulate the reader.” What does this tell us? – Buy the book! It’s a gem of an answer shining out among tired, old worn-out ones.

Originality. Such a tired word (yawn). Go back to the etymology, and find what it said: origō, point or place of beginning. As far as the activity is concerned, ask this: who is the source of the questions being asked? Who is in the driver’s seat of the investigation? Nobody can die for you, and nobody can think for you. In fact, nobody can do anything for you. If you are compelled to do something “– or else,” you are still the one choosing to act.

What does it mean when we say a writer has done “original research?” Most of us think of research as mousing around in the library, poring over piles of books. That’s supposed to be unoriginal. But that’s untrue. All those books and note-taking are driven by the questions you ask. Every corporation has an R&D department; are they reading dusty tomes? No, they’re doing experiments. They’re testing whatever it is they’re interested in, tweaking it here and there to see what happens; hopefully they can discover something and use it in some new machine or laundry detergent.

“Research,” according to my Oxford, means “a careful study of a subject, especially in order to discover new facts or information about it.” The idea is simple enough; knowing how that translates into action is the stuff of myth-busting.

In product development, research is for finding new things to sell. In philosophy, there’s something very similar going on. (Please don’t think I put them on the same level; work with me here.) Ideas are the stuff philosophers work with, which is why books are their medium. But don’t mistake the book for the thing itself – the idea is one thing, the words another. When you read up, you’re doing background research; that’s to see what has been done before. The reasons for this are various: to prevent reinventing the wheel, to prevent making the same mistakes. The aim is one – economy of effort. You don’t want to do what’s already been done, and you don’t goof the same way. Find another way to screw up.

Experts focus on what’s known, creators on the unknown. It only makes sense to know the difference so you don’t repeat things unnecessarily. Do your homework. You want to find out what is known, but also what is not yet known. This requires some thinking outside the box.

“Library research” is past-oriented; original research is future-oriented. They are not mutually exclusive.

In the course, what I focused on overtly was giving structure to the library efforts, the preliminary work. This is what is usually considered “research”, but I made it plain (perhaps too often) that this is only half of it. When you study a concept you’re doing research. Analyzing that concept – taking it apart to see how it works, turning it over in your mind, asking lots of questions – this too is research.

Creativity. Another tired but starry-eyed word. From creāre, to bring forth, produce, to cause to grow. Thanks to Romanticism, this and “originality” are begging for a bullet in the head. (By the way, I don’t mean to slag the best of the best Romantics – Schiller, Herder, and the like. I mean the romantic notion in the most ordinary of ordinary senses. It’s an unfortunate fact: great minds do not move the world; mediocre interpretations of great minds move the world. The former pose bracing questions, the latter proffer stultifying answers.)

As we understand it now, creative thinking has as its essence the questioning of basic assumptions – thinking outside the box.[1] A classic example is the nine-dot puzzle. If you’re not familiar with this, try it before reading on…

Did you get it? The key to solving this is to take it strictly on its terms. Four straight lines, connect the dots – that’s all. You don’t have to stay inside the square of dots, and the lines don’t have to end on a dot. That’s all just assumption. Try it now and see how you fare…

Did you get it now? I hope so. You just saw through an assumption, and found a way out.
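For the programmatically inclined, a four-line solution can even be checked mechanically. The sketch below is my own illustration, not from any cited source: the nine dots sit on a 3×3 grid, and one classic solution is a connected path of four straight lines that runs outside the square twice.

```python
def on_segment(p, a, b, eps=1e-9):
    """True if point p lies on the segment from a to b."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    cross = (bx - ax) * (py - ay) - (by - ay) * (px - ax)
    if abs(cross) > eps:
        return False                      # not even on the infinite line
    dot = (px - ax) * (bx - ax) + (py - ay) * (by - ay)
    return -eps <= dot <= (bx - ax) ** 2 + (by - ay) ** 2 + eps

dots = [(x, y) for x in range(3) for y in range(3)]   # the nine dots
# One classic solution: two of its corners lie outside the 3x3 square.
path = [(0, 0), (0, 3), (3, 0), (0, 0), (2, 2)]
segments = list(zip(path, path[1:]))                  # four straight lines
print(all(any(on_segment(d, a, b) for a, b in segments) for d in dots))  # → True
```

Every dot lies on at least one of the four segments, so the check prints True.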

Children are often said to be naturally creative. There is a kind of reading into children’s activity here: if creativity has to do with questioning assumptions, children cannot be said to be creative. Why not? Because they simply haven’t formed assumptions yet. One thing I remember about growing up was that I wasn’t surprised by much. Why not? Since I was young, I hadn’t built up many expectations, and surprise always plays on expectations. Until I realized this, my memory of not being surprised had puzzled me. I’m not a very remarkable person, never was, and I wouldn’t find it unusual if others shared the same experience.

Among other things that the wrong kind of experience and education do, they teach kids what to accept without questioning. It needn’t be that way. One thing to do is simply observe what is right there before your eyes. (I said simple, not easy.)

If you ever stopped to think about a simple piece of technology, for example a car – I mean really stopped to think about what it took to get that thing in front of you – if you ever thought about it, you might find it rather amazing. First the manufacturing: the metal has to be mined and refined, then molded; the pieces have to be assembled just right. Then the marketing: somebody had to sell the damn thing, and somebody had to buy it. Now for the design: that car was designed by a person, or more likely, several people. Go back through the history: before that car was designed, there were others from the same company, other companies that were trying to muscle into the market; earlier cars weren’t as efficient, so they had to be improved on; before cars even existed we had horses and carriages, chariots. And to get the technology that made the car possible – well, you get the idea.

So you started with this old ’78 Peugeot which you wanted to junk, and ended up with (among other things) a headful of history. If you’ve got an ounce of sensitivity in your soul, you’ll appreciate what it took to get that rust-bucket on the road. Thinking outside the box.

Now look at philosophy – what else is it but questioning the most basic assumptions of our world? There is a world, and we’re all in it. If we think about it at all, we just shrug our shoulders: yeah, so what? But wait – why should the world exist to begin with? I’m not demanding the extinction of the world, I’m calling for an explanation of it. We can comprehend this thing, and we naturally seek reasons for things; if we can comprehend the world, then of course we’ll want to know the reason for its being around at all.

Why is there a world? Is that a nonsensical question? Why? If you give me reasons for its being a nonsensical question (“Because it’s stupid” is not a reason), you have to know the limits of reasoning to back it all up. If you don’t, then your answer is not sufficient. That means that if you take the question seriously, you question the assumptions of that question – and those of your own answer. When was the last time you did that? Thinking outside the box.

What then is creative research? It’s explanation that calls the taken-for-granted into question, and seeks a more satisfying answer. Thinking outside the box.

* * * * *

So philosophers engage in creative thinking in the purest form. Does that mean we’re privileged human beings? Yes, with qualifications. (What’s a philosophical answer without qualifications?) Yes, we enjoy a privilege not everybody partakes in. Does that mean we’re some elite class? No, without qualification. I’d argue that every human being can do philosophy, and that everyone who asks a question such as “What’s it all about?” is thinking philosophically. Sadly, many shake off the question like so much dust from their shoe. They find the question overwhelming, and hence decide they cannot answer it – or, worse, that it’s unanswerable. Maybe that’s so, but I’ve never seen a proof of it. And even if I did, I wouldn’t believe it. I’ve seen proofs undermined, proofs deemed irrefutable by hundreds of experts. (See the entry above.)

If we go back to the root of creativity, we find it has an organic meaning. This suggests a product is intimately connected to the source, to that which made it grow. That would explain why plagiarism strikes at the heart of a writer or artist. And it would suggest that questioning assumptions is itself a sort of generative act.

If we look at the aspect of producing, bringing into existence what wasn’t there before, we run up against the idea of naïveté. To question an assumption is to look at it without the rose-colored spectacles which keep us from seeing it in the first place – in other words, to see it for the first time. Like a child, who does not even know the assumption exists, except that we do know it. Uncovering an assumption can be an exhilarating experience; it is liberating to see it for what it is – and that we don’t have to accept it.

Immanuel Kant urged us, “Sapere aude! Dare to know!”[2] Buckminster Fuller urged us, “Dare to be naïve.”[3] Both are right. Both involve questioning. Knowing, questioning, being naïve: these are three points that yield a form. I put it this way:

Sapere aude! Quære aude! Nascī aude!



[1] Russell L. Ackoff & Sheldon Rovin. Beating the System: Using Creativity to Outsmart Bureaucracies. San Francisco: Berrett-Koehler, 2005, p. 25.

[2] Immanuel Kant. “What is Enlightenment?” Political Writings. Cambridge: Cambridge University Press, 1991, p. 54.

[3] R. Buckminster Fuller. Synergetics. New York: Macmillan, 1982. Look here.

(Image daringly lifted from www.rhapsody.com/stevemartin)

Friday, September 29, 2006

Black Light: The Julian Barrett


A college friend of mine has released a CD recently - kind of Johnny Cash meets Sid Vicious while wallowing in the penumbra of popular culture. It's not a new thing for him, though it might be for you. Listen to The Julian Barrett or Winocreep, and read a little about the voice behind the mike.

These lyrics are not happy, though they are often blackly funny, and some lines are downright blasphemous (or so it would seem). Why, then, does it strike me as being somehow - brace yourselves, this might sound moralistic - good? I don't just mean technically. They're definitely well-written songs, but that's a different matter. This, I'd say, is both. I don't have an answer; that's just how it is at the moment. Theories don't fit at this time; better to let it cook a while and see what comes out. I just want to enjoy the music. Thoughts come of their own accord.

(Image gleefully swiped from http://www.crawlspacerecords.com/photosmay.html.)

What Education Points To

This entry will likely evolve as it sits there. I just had to post the topic, which I'd like to develop further.

Aristotle, one of the greatest thinkers ever, had his finger right on the pulse of life when he wrote:

No one will doubt that the legislator should direct his attention above all to the education of youth; for the neglect of education does harm to the constitution. The citizen should be moulded to suit the form of government under which he lives. For each government has a peculiar character which originally formed and which continues to preserve it. The character of democracy creates democracy, and the character of oligarchy creates oligarchy; and always the better the character, the better the government. --Politics, 1337a10-16
In other words, teach your children into the state. If you want to see what a state is really up to, don't look at the overt workings - this may well be misleading - rather, look at the schools. I think it would give a far more accurate reading, if only we read it with clear eyes.

What are they doing in the schools of the most powerful nation in the world? If you know, please drop me a line. I'm currently on another continent, so it's hard to tell. I've got my ideas, but prefer to chew on them for a little while...

Statistics aren't everything, but they are a good indicator of performance (provided they don't get twisted to an agenda). According to the Organization for Economic Co-operation and Development, the leading scorers tend to come from Japan, Korea, and Finland. The U.S. clocks in around the middle. Of course it isn't a simple matter, but one cannot help questioning why this is. And I can't help asking - to what purpose are students in various countries educated? This does not show up on any standardized test; it calls for deeper study.

Thursday, September 21, 2006

The Most Moral Weapon Ever Invented?


Just happened to find a write-up published on Boing Boing on Samuel Cohen, a fascinating read for various reasons. For those who don't know who he is, Cohen invented the neutron bomb. It sounds absurd, wacky, and peripheral - but there are some core insights on the military and politics to be gleaned from this little bit. Yes, there is the whole discussion of the game theory behind deterrence policy - i.e. let's assume the other guy is ready to push The Button, so we'd better scare him out of that - but there is more to it than that. More can surely be had from reading Cohen's own book, Shame. It's not available on Amazon.com, but the UK branch has it.

One interesting thing is Cohen's rationale for inventing the device. He was a member of RAND, the first think tank and perhaps the think tank. A passage:

Sam Cohen might have remained relatively unknown, troubled by ethical lapses in government and the military but unable to do anything about them, if he had not visited Seoul in 1951, during the Korean war. In the aftermath of bombing sorties he witnessed scenes of intolerable devastation. Civilians wandered like zombies through the ruins of a city in which all services had ceased. Children were drinking water from gutters that were being used as sewers. "I'd seen countless pictures of Hiroshima by then," Cohen recalls, "and what I saw in Seoul was precious little different. . . . The question I asked of myself was something like: If we're going to go on fighting these damned fool wars in the future, shelling and bombing cities to smithereens and wrecking the lives of their surviving inhabitants, might there be some kind of nuclear weapon that could avoid all this?"
Years later, after he finally got backing to develop it, the design of the bomb was reworked, effectively dissolving Cohen's intent.

One bit is very mistaken, however, but also very revealing:

The bomb would still kill people--but this was the purpose of all weapons.
That's not true. The purpose of weapons, and of war, never was to kill the enemy but to overpower him. And you don't have to kill to accomplish that. The ammunition of military rifles is steel-jacketed, whereas hunting rifles use bullets with copper jackets. Why? Steel-jacketed bullets will just pass through the body, wounding but not necessarily killing. Brass- or copper-jacketed rounds, however, are softer and are slowed down more by the body on impact. Why not just use hunting ammo then? After all, it's more lethal - and don't you want to kill 'em? No. If you kill a bunch of enemy soldiers, the other side has to recruit more soldiers; but if you wound a bunch of soldiers, they have to recruit more and nurse the casualties - a significantly more expensive, exhausting, demoralizing consequence. When they can't afford to keep it up, they surrender. (We're assuming, of course, that the opposition doesn't consider leaving the wounded alone as a viable option.)

The military is to the government as the fist is to the brain. War is above all a political tool; people often forget this, as they're hung up on the killing thing. Even Mr. Cohen forgot this to an extent, which gives you an idea as to the force of conceptions on our thinking. I didn't realize the real aim of war until I read Sun Tzu; when a WWII veteran explained to me the thing about bullets, it only confirmed that. He knew the purpose too. What is needed is to see things with a fresh eye, so that the stale ideas we inherit have no undue power over our minds. If you can put 2 and 2 together, you're reasoning just fine. The thing to be concerned about is to see.

(Image ruthlessly hijacked from www.shadowfist.com/html/gallery/cardgallery7.htm)

Wednesday, September 20, 2006

The 3 Rs: Rantin', Ravin', & Rippin'


The older I get, the less patience I have with complacency, especially when it comes to education. I've seen two news items in two days that have gotten my blood boiling - that's a lot of blood!

One article was an op-ed piece by Johan Huizinga (not of Homo Ludens fame). You can find it here.

It wasn't the article per se that I disliked - I rather liked it - what I disliked were the troubling situations he brings up and how they're dealt with - or not dealt with, as it were. There was also one comment by a reader who simply got the wrong idea about it. Satire can be quite subtle, and it seems she missed the irony. I posted a rebuttal there that was...well, sharp and lacking in subtlety. (It should go up in the next few days, I expect.)

My problem was in unraveling the subtleties of Huizinga's article for my response. There's quite a bit going on there; he touches on the complexities of things today, which is why it's hard to just say he means the opposite. One thing we often do is equate simplicity with ease; really, we should know better. They are not the same thing, not by a long shot. Some of the most difficult things are the simplest. Try just sitting in one place for ten minutes, and focus on your breathing. Nothing else going on around you, just sit and concentrate. That's about as simple as you can get, but I don't believe for a minute you'll find it easy.

In my opinion, talk about education problems has no place for subtle wit that can be (mis)taken at face value. That's my only qualm with the piece. It's too easy for someone to misread it, and if that someone is a school admin - well, they could walk away thinking they're doing a good job. I doubt that would happen, but it's apparent that misinterpretation of the satire does happen.

The other piece has to do with a new book that argues against homework. There's an interview with one of the authors of The Case Against Homework on MSNBC.

In the book, Sara Bennett argues that homework is a waste of time. Any homework. The claim is that the quantity of homework has no correlation with achievement, as measured by teachers' tests. But if the interview is any indication of the content of the book, it has a wildly off-target thesis.

The interview centers on reading. Her example is that reading novels for school is bad because of the attendant tasks - looking up words, answering questions after each chapter. The reason it's so bad is that there is a method imposed on reading, a method which has no place. More precisely, it's the teacher's view on reading that gets imposed. As Ms. Bennett puts it, "You don't want to be interrupted every five minutes when you're reading or when you're watching a great movie." The reading experience is being taken away from the students.

Am I the only one who sees the glaring contradiction here? First they're talking about the worthlessness of homework - how invalid the very idea of homework is - and then they're talking about the kind of homework that's being doled out! Ms. Bennett doesn't have an issue with homework as such but with its quality, yet in making her case she's throwing out the baby with the bathwater.

According to her, teachers don't have a clue as to why they assign any homework. They weren't trained on it, they weren't told its importance. They were told to just do it, and they just did it. (What good little Nike-wearing sports.) What does that tell me? The homework lacks any direction; there is no purpose to it. But who's to blame: the homework, the teachers who thoughtlessly assign it, or the trainers who never bothered to explain its purpose or how it's to be done?

Ms. Bennett says that American schools are currently in "testing mode" (which she never explains), and homework is the teachers' way of foisting unfinished lessons onto the students. Does this sound like homework is inherently evil, or is the current practice evil?

Now I don't disagree with everything Ms. Bennett says. She does state that a family dinner is the most important factor in academic success. I don't deny this; in fact I agree that families don't spend enough time in the same place at once (dare I say it?) communicating and getting along. I also agree there is a profound problem with the way education is being conducted. But I also think that she's wrongly diagnosed the problem.

OK, here's where my unsolicited advice comes in.

Problem #1: teachers don't know what homework's all about. Solution: teach them!

Problem #2: Educators don't know what would constitute good homework assignments. Solution: find out! Ask what you want students to get out of a lesson, and what they want to get out of a homework assignment. Then ask how to get those results. Do research, experiment with different approaches, formats, exercises, etc. Then and only then are you in a position to judge homework as a whole.

Problem #3: Students see homework as a waste of time. Solution: explain it to them! and give 'em meaningful assignments, for cryin' out loud. Kids aren't stupid. Of course they'll complain about how meaningless it is, because it is meaningless. They don't see the point because there isn't any point. Craft the assignments with a little bit of care, tell them how it should be done, and why. If possible, demonstrate the payoff.

Problem #4: Teachers palm unfinished assignments off as homework. Solution: stop it, stop it, stop it! How freakin' hard is that to figure out? If you don't finish a lesson, something has to be changed - the timetable, the class hours, the number of students, whatever. But expecting students to teach themselves is irresponsible when it's done in this manner. Yes, you want to get students to be independent thinkers. This is not the way to do it.

Now I know what one objection will be: "But we don't have enough teachers, and the ones who are there are overworked and underpaid." Sounds like we've finally got some real problems. Part of the difficulty is a lack of funding, true. But throwing money at something won't solve it (though it will make teaching more attractive from one point of view). There are a number of things to be attended to, all at once, which means there's no easy solution. Practical matters, such as the number of teachers or the size of classrooms, need to be juggled alongside pedagogical matters. I cannot say there's a quick fix; I can only say that the case against homework is but a symptom of a crisis in the schools, and we need to do something about it pronto.

A deeper problem is the very attitude we have towards schools, education, and teachers. This is no small matter: it goes right to the core of our value system. It is well known that America has an anti-intellectual streak a mile long, which does not speak well for us as a culture. People of an intellectual bent face strong opposition (usually subtle, sometimes not so subtle).

Take nerds, for example. You know them, the ones who actually like chemistry class, the ones who enjoy doing (yeccch) math. The stereotype of these guys getting picked on by jocks - it's true! Granted, some can hold their own, but they've got support from family and their environment; many don't have that. Where does all this happen? The good ol' U.S. of A. Now here I am in Belgium, have been for a few years now - do I see nerds? No. Why not? The bias against intellectual pursuits isn't there. I won't say kids here are perfect, but they don't have to deal with the sort of thing I grew up with; either it's so minor I haven't noticed it, or it doesn't exist here at all.

Now if we red-blooded Americans took a sober look at ourselves, and asked what really counts for us, what are we going to say? If we ask, What sort of future do I want my children to live in? how are we going to answer? It's time to start thinking about where we want to go, rather than complain about how we're not getting anywhere.

(Image shamelessly stolen from www.hellofriend.org/parents/homework.html)

Monday, September 11, 2006

A Note to Fellow Readers

If you haven't done so already, take a moment now, and think about what happened five years ago today. Put yourself there, looking out of the gaping hole in the side of the building, high over New York City. And remember that, for their sake.

For those of you who enjoyed the last entry (see below), we'd like to thank you for your support. However, those coming here expecting me to be some authority on religion are bound to be disappointed. I'm hardly an authority there, or on anything for that matter. Really. There are plenty of folks out there far more competent than me, with better souls to boot. Here's one, a good friend of mine:

http://www.oregonlive.com/weblogs/religionblog/

You'll find a lot to chew on. Do check it out.

Sunday, September 10, 2006

The Barbarians Go South Again

Here is an article in Newsweek concerning a recent wave of prominent thinkers plugging atheism:

http://msnbc.msn.com/id/14638243/site/newsweek/

The authors it focuses on are Daniel Dennett, Richard Dawkins, and Sam Harris. Apparently it was written because all these guys are publishing their arguments at the same time. What irritates me, however - what irritates me enough to interrupt the work I should be doing - is how juvenile their notions of religion and God are. Without exception, they speak as if Christianity were on a par with superstition: God is this vindictive old guy with a white beard who sits in heaven and makes us suck up to Him. (The last bit is Dawkins's phrase, not mine, and I think all the others mentioned in the article would agree.)

Harris is explicitly mentioned as taking up a literal reading of the Bible, which doesn't speak well of him. This stance implies that any non-literal interpretation is just hedging. But clearly there's a lack of sensitivity to the text. I see no reason why it must be taken completely literally or completely metaphorically, and the article gives no reasons for any particular reading.

I get the feeling that most arguments against the validity of the Bible come from people who are very uninformed about the Good Book, reading it selectively, partially, or not at all.

The same questions are posed. If there's a God, how can there be evil in the world? Really, this question presupposes a lot; it's a loaded question, in fact, which is why I dislike it. Presumably God would not allow disasters, either natural or man-made, because He's so gosh-darn good. But because these do occur - well, how could there be a God? So goes the reasoning.

I don't know about you, but when I was younger, there were a lot of things I didn't understand about my parents. If they didn't think or act as I would've liked them to, they were idiots. And they were idiots because their actions didn't make sense to me. Looking back, though, I see they had good reasons for what they did. I can't say I think everything they did was perfect, but I can see why they acted as they did, and in a lot of cases - dare I say it? - it's a good thing they didn't do what I wanted.

Now, if it's so easy to understand that fact, why isn't it so easy to see it in religious thinking? Maybe God knows something we don't? - what a concept! Maybe God doesn't have to play by our rules. Indeed, why should He? I'm not saying I can sit back and cheerfully watch all the hell on earth around us today, like some Dr. Pangloss, only that the combined intellect of those atheists - the combined intellect of the human species, for that matter - is pretty paltry when held up against the wisdom of God.

Another bit that annoys me is when they say that believers get their ideas out of a book:
They ask: where do people get their idea of God? From the Bible or the Qur'an. "Tell a devout Christian ... that frozen yogurt can make a man invisible," Harris writes, "and he is likely to require as much evidence as anyone else, and to be persuaded only to the extent that you give it. Tell him that the book he keeps by his bed was written by an invisible deity who will punish him with fire for eternity if he fails to accept its every incredible claim about the universe, and he seems to require no evidence whatsoever."
Granted, some people do have a notion of religion this simplistic and sheeply. But for anyone who has a sense of religiousness at all, they know this is not the case. The question of where people get their idea of God is another loaded one; in fact it's not even the question it claims to be. What the atheists imply is that religion is purely a textual matter - if it weren't for the Bible or the Qur'an, we wouldn't be religious. Don't believe me? Check it out: the book by my bed "was written by an invisible deity"! Human beings could not have written it out of a response to some phenomenon, no-siree Bob. Religious experience doesn't exist, and if it did it would obviously be chalked up as delusional.

The question "Where do people get their idea of God?" deals more with how our thinking about God has been conditioned, not with the original source of the idea.

What we find here, then, is a position that is unassailable - unbeatable because it refuses to fight. Having walled itself up in its own circle of logic, it is impenetrable. Kind of like conspiracy theories.

What I find most striking about the so-called debate is how much religious thinkers have developed over the years, and how little the atheists have come along. They marshal the same tired questions, the same evidence, and draw the same conclusions. If this were a real debate, they might listen to the opposing side and learn something from them, if only in an attempt to convince them of the error of their ways.

But they don't, and that's telling. What it tells me is that they have simply ignored religious thought, preferring the sanctuary of their own fantasy-image to actual research. If they had done that research, the article mentioned above would probably have included new material, new questions, new refutations. There isn't any of this.

Is this what passes for enlightened thinking? Is this what's called progress? Looks more like regress to me.

(Image brashly cribbed from www.nivbed.com/junk/ancient_garbage/)

Sunday, September 03, 2006

Activity, Clean and Alive

"But there is also hope in this: music invents itself through musicians working on behalf of music, rather than themselves. This is healthy music, and can be experienced as such. After listening, or playing, one feels stronger and cleaner. No elaborate metamusics is needed to demonstrate this, for it can be simply experienced. The question for the musician is this: do I become alive playing this? For the audience, it is: do I become alive listening to this?" - Robert Fripp, "The Act of Music." Via 10 (1990), p. 88.


I finished my workout last night, feeling pleasantly drained as usual. Nothing much - pushups, pullups, Hindu squats, and so on - I'm clawing my way out of sedentary life, you see. As I cycled home from the playground, the word bubbled up: clean. That's how I felt, and there was no other way to put it.

Clean is the way I feel after a good workout, when it's not too hard but still challenging. It's always been that way, even back when I was practicing tae kwon do, though I never used that word. But there it was last night, and it reminded me of the article by Robert Fripp; you can see he uses the same adjective to describe the experience of playing music. Or listening to it.

Looking back on other moments, I become aware of other times when the idea of clean has come to the fore. And its opposite. When I saw Natural Born Killers, I felt dirty. That is a movie that shouldn't have been made. It's not the violence; I've seen equivalent levels of that in films, but it doesn't register the same way. No, something else in its treatment of violence strikes me as unsavory and - well, let's just say it - wrong. The soundtrack is enough to bring on the feeling of being soiled.

Of course when we act in a way that is wrong, we often say we feel dirty, and want to make a clean breast of it. This is such a commonplace, there's no need to hunt for citations. Rituals of purification work on this notion of cleaning - inwardly, outwardly, or both. Sometimes that takes the form of going through the dirt as part of the cleansing, purging. Catharsis.

Having an interest in logic, I can't help considering cleanness in terms of consistency. There's an element of it there too: if we consider Mary Douglas's definition of dirt as "matter out of place", the notion of coherence is unmistakable. Things have their place in relation to one another, and so a misplaced item runs against the order of things.

Which brings me to another, connected sensation - the sensation of conflict. This takes on several forms, depending on the degree and nature of conflict. A paradox piques by its apparent self-opposition, but we don't feel it to be painful; rather, the pique is exciting, stirs us to action. There is dissonance, to be sure, but somewhere at bottom there is a coexistence of the two notes; the dissonance isn't absolute. Contradiction, however, presents us with a dissonance that is absolute, hence intolerable. "Stand there! - no, don't stand there!" Oh, make up your mind! It hurts to be pulled this way and that. We've all been there.

Before I veer too far from the topic at hand, let me just say I'd like to go deeper into the matter. That I'm not alone in noticing the sensation of cleanness in relation to action suggests that there's something important to it in our experience. The phenomenology of cleanness and dirtiness needs to be addressed, if it hasn't been already. If anybody out there knows of studies in this matter, let me know. I'd be very grateful.

Saturday, September 02, 2006

The Divining-Rod

[This is a short short story I wrote last year, one of my last. It gives a window on my view of the human condition; it was born out of a thought, so you could say it's a vessel for propaganda, but hopefully it stands on its own as a story.]
Dr. Jude Theodore awoke one morning late September. He awoke a different man. He gazed at a red leaf that had lighted on the window sill, then turned to his wife and said, “I’m not going to work today.”

“Not going?” she exclaimed. “Why not?”

“I don’t need it. Nobody needs it. We know enough already.”

“Know enough?”

“Enough. Plenty. We could go on just like this for the rest of our lives and not ask for anything more. Live like kings.”

“And queens,” she added.

Dr. Theodore phoned the laboratory: “I’m not coming in today.”

“No?” said the assistant. “Not feeling well?”

“No, not really.”

“Well, take it easy, but come in as soon as you feel better.”

But he did not come in. Not the next day, not a few days later, not a week later. His wife brushed her auburn hair before the mirror every morning, wondering when he would snap out of it.

“Why don’t you go back to work?” she asked one day.

“We know too much. First we looked for homes, then we made them. We hunted for food till we learned to grow it. We weren’t happy killing one man, we had to find a way to kill thousands. Worlds we make and worlds we kill: that’s playing God, that’s why we find things out. We know too much.”

A week later she found him sitting by the window, eyes closed, brow furrowed like a wadded piece of paper. He seemed to be listening to the naked branches in the wind.

“What are you doing?” she asked.

“Setting an example,” he said.

“What are you doing?”

“Forgetting.”

He did not get up from that chair, not later that day, not the next day, not after a week.

She found him there, gazing out the window through the branches. She sat down across from him. Again she asked, “What are you doing?” All she received for an answer was a blank sheet of a look.

She would bring food to him, but he just stared at it, uncomprehending. She wept, shook him, screamed at him, pleaded for him to snap out of it, but – he did nothing. For a time he shook his head when she spoke, but even that disappeared.

One morning, as snow was falling in clusters onto the window sill, she sat down across from him; she stared, mumbling, “Who are you? What’s happened? Why?” Frost gathered on the pane, each question hanging like the steam of her breath. But she knew the answers inside: He is Dr. Jude Theodore, my husband. He is forgetting. We know too much. Soon they too began to fade. By nightfall she was silent, silent save for her breathing; and when the sun rose they still sat there, a king and his queen, gods among gods.

On a Battle over Voices

"I have written a wicked book, and feel spotless as the lamb." - Herman Melville, letter to Nathaniel Hawthorne, 17 Nov. 1851. Correspondence, vol 14, The Writings of Herman Melville, ed. Lynn Horth (1993).

I was asked by a friend what I thought about Moby-Dick: did I find it hard to read? Well, I said, it is somehow very nineteenth-century, but that's nothing - it's an amazing book. Why ask? I was curious. Then my friend told me about a debate surrounding two competing German translations of the work; one is very difficult but "close" to the original, the other a freer rendering but more readable. Which one was better? Fellow translators in Germany seem to favor the latter. Why?

The discussion seemed to pool around the mixing of styles, my friend explained, especially in the voice of Captain Ahab. The joining of high diction and low, using Biblical rhetoric and Shakespearian talk along with the language of sailors, and -

- Wait a minute, I said, what Shakespearian talk? Well, all the Thou arts and Thees, my friend replied.

Oh no. Problem. That was not Shakespeare's talk, it was Nantucket's. Nantucket dialect in the 19th century was peculiar in this respect, retaining that archaic style. Ahab wasn't the only one to use Thees and Thous; the whole damn town did that. Several characters in the book, and in real life as well. This is no secret; Melville comments on this trait, mentioning the source of that peculiarity in the process:
"The Nantucketer, he alone resides and riots on the sea; he alone, in Bible language, goes down to it in ships; to and fro ploughing it as his own special plantation." -Moby-Dick, ch. XIV "Nantucket"

Hardly Shakespearian, and hardly restricted to the voice of Ahab.

What does this mean? What does it mean, that translators never picked up on the speech pattern, so clearly marked, of a community? Were all these translators stupid? I don't think so. Did every one of them skip this chapter? I hope not. What I suspect is that Ahab's voice drowned them out. He is a singular character, so striking and overwhelming that they listened to him and forgot entirely about all the others who had used similar words before him.

You may be wondering why I'm so worked up about this issue. For one thing, I love Moby-Dick; there's good reason why it's still around. A pet peeve of mine is when people - especially critics, who should know better - knock the book for being a stylistic failure. How can they say that? Ahab talks one way in one scene, another way in another scene. So it's artistically wrong to portray a complex character?? My friend related this from the online discussion about it, and I couldn't believe my ears. A work of art exists for its own logic, not the expectations of critics. I hate that presumptuous attitude.

Another thing I hate is seeing widespread misunderstanding when it simply needn't exist. As seen above, there is proof positive that one claim of the debaters is unfounded. If that passage weren't enough, all they need to do is read the other characters who speak in the same way.

Before you think I'm being arrogant here, you're only half-right. I'm chiding myself and them equally. I make the same errors, which is probably why I get so irritated when I come across this sort of thing.

That mixing of styles, my friend continued, doesn't that seem post-modern to you? Then, responding to that very question, added: The same thing happens in the Persian Letters by Montesquieu. And in Robinson Crusoe there are those interminable descriptions of the guy growing corn, or building a hut! Isn't that post-modern too? So the technique isn't so new after all.

Right. And I'm not sure what counts as post-modern, nor do I think we can clearly tell what would really distinguish the period as such. We're too close to it. We may think we know it, and we should certainly try to articulate that perspective, but I don't really believe we have the clearest eye on ourselves - precisely because we are caught up in it. A hundred years from now, historians may carve up the 20th century into three periods, or five, or whatever. Or post-modernism might be replaced by some other term that captures the essence. But our opinions will still be valuable to them; for they will be able to know how we saw ourselves, and can see just how wildly wrong we were.

Moby-Dick was not a literary success when it was published, my friend said, or in Melville's lifetime. I would chalk this up as evidence that we do not know our own time. None of us. What works will last as masterpieces? If Melville's own generation did not recognize the genius of that wicked book, and if they were as human as us, can we truly say we have a clearer self-estimate?

In my opinion we are as deluded as any generation has ever been, maybe even more so. For an age in which we profess to be more pluralistic and inclusive, we sure do act arrogantly. In honor of being so humble in our self-estimate, we honor ourselves still more. We know better, we judge more fairly: that's quite an honor. Progress goes forward, we learn from our mistakes. It doesn't take long to poke and prod that semblance before the usual suspects come reeling out, betraying our unstated belief that we are the bearers of truth, over and above our predecessors.

Is this new? Hardly. I dare say it's the human condition. My take on this soon to come.

(Image ably nicked from www.whalecraft.net/Whaling_Books.html)

Sunday, August 06, 2006

Do you feel lucky, monkey? Well, do ya?


Here's one for you: an Indian train company has hired a monkey to keep deadbeat monkeys from hitching a ride on their trains.

Apparently one got on, scared off honest ticket-paying monkeys - er, passengers - and enjoyed the trip for three stops. Train employees had to clear the car while they tried to catch the bugger. So, in an effort to stem the tide of freeloading monkeys, the company hired a primate bouncer.

It's a good idea and all...now what are they going to do about the lawyers?

"Go ahead. Make my day."

http://www.msnbc.msn.com/id/14121007/

Corporate Culture is an Oxymoron

I just finished dinner, watching a documentary called The Corporation. It's fantastic. Really, it should be required viewing for everyone. You can download it in 6 parts at

http://www.question911.com/linksall.htm

The Corporation is one of the best videos on the site - well researched and brilliantly edited.

One thing to know is that the website is full of videos on 9/11 and the conspiracy some believe is behind it. While I don't agree with most of that, I can say I fully respect their right to express their dissatisfactions. If they're wrong, they're wrong; but if they're right to any degree, we should sit up and take notice. And for that, my hat's off to them.

This is the thing I like about conspiracy theorists: they've got the nerve to question the status quo. Maybe it's in their minds that there's something behind it, and maybe they line their baseball caps with aluminum foil, but it's a hell of a lot better than sitting there like a marshmallow getting roasted on a stick.

Tuesday, August 01, 2006

I'm sorry, Sam, I'm afraid you can't play that again.

Some one-off thoughts on an article I read...

A recent issue of The Economist had an interesting series on computers; they were all quite neat to read (that magazine has got to have the best writing of any magazine out there), but one article caught my eye. It’s on “music intelligence” software (“Sounds Good?” Economist 10 Jun 06, p. 8 of the Technology Quarterly section).

According to the article the computer programs analyze music pieces into 30 parameters, among them sonic brilliance, octave, cadence, frequency range, fullness of sound. Pop music producers can use these readings to predict which songs will be hits – or advise the musicians on how to improve their songs.

Disparate pieces of music can share underlying parameters: the article mentions that several U2 ditties have a bit in common with some Beethoven pieces. Trained ears apparently don’t detect this. Why? Mike McCready, co-founder of the software company Platinum Blue, explains it like this: “Songs conform to a limited number of mathematical equations.” I imagine that since it’s a mathematical thing rather than an ear thing, this would explain the trouble a trained ear has - when focusing on the overt sound of the notes and rests, rather than on deeper relationships among the elements, it's easy to overlook those connections.
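As a toy illustration of how a mathematical comparison might surface kinships the ear misses (this is my own sketch, not the actual Platinum Blue software - the parameter values below are invented), one could reduce each song to a vector of parameters and compare the vectors numerically:

```python
import math

# Hypothetical parameter vectors, standing in for a few of the article's
# thirty-odd parameters (sonic brilliance, cadence, frequency range,
# fullness of sound). Values are made up for illustration only.
songs = {
    "U2 ditty":        [0.82, 0.40, 0.75, 0.66],
    "Beethoven piece": [0.78, 0.44, 0.70, 0.71],
    "Novelty single":  [0.15, 0.90, 0.30, 0.20],
}

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors: 1.0 means same 'direction'."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

u2 = songs["U2 ditty"]
for name, vec in songs.items():
    print(f"U2 ditty vs {name}: {cosine_similarity(u2, vec):.3f}")
```

On these made-up numbers, the U2/Beethoven pair scores close to 1.0 while the novelty single scores much lower - exactly the sort of structural kinship a listener attending to the surface sound might never suspect.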

The repercussions for the industry are obvious. Big companies will feel more confident backing one artist, because they’ll stand a better chance of return on investment. Artists in turn will profit – experienced ones, because they’ve already proved themselves, and newcomers too, since their work can be run through the machine before a contract is signed. The unwritten consequence of all this is that the quality of music will improve generally.

Copyright violation is another avenue for the software. Plagiarism lawsuits could use the readings as evidence, lending “objectivity” to the whole affair. It could even be used to prevent such stuff from happening. (Though you wouldn’t have needed it when Bush shamelessly stole the riff from Bon Jovi’s “You Give Love a Bad Name.” Sampling is one thing, but that was pure theft.)

One thing the article does not consider is the prospect of learning from the program. If the trained ear can’t tell the similarities between U2 and Beethoven, maybe it’s because they’re not listening in the same way. If the producer – or the artist – keys into that mode (pun fully intended), they can eventually dispense with the software altogether. My guess is that musicians and producers perform this kind of calculation already, but are only aware of it at a superficial level; the link between music and mathematics has been noted several times before, making this latest development just a push deeper into the inquiry.

Another thing the article does not consider is what quality is. What if, say, you have the slickest song in the world – but the most insipid lyrics in history? In other words, can we really say Britney Spears’s music would improve? That side of the music hasn’t had a program written for it yet, though surely someone’s working on it. One could argue that nobody listens to the words anyway, so there’s no need to bother. Brian Eno discovered this early on in his own career (Eric Tamm, Brian Eno: His Music and the Vertical Color of Sound, p. 120), but is this unanimously true? And if so, why do songwriters insist on penning meaningful lyrics? Perhaps they’re just behind the times.

As suggested above, it could be that the music of the words themselves has yet to be incorporated into the software. Some nonsense just sounds better than others – I’m even tempted to say that some gibberish rings “truer” than others, though I cannot back this up.

And while I’m at it, what would those programs make of Mizar? I love how Amazon says,
“People who bought 'The King of the Stars' also bought:

No titles available.”

http://www.mizar.us/

Go on, check it out. You know you wanna.

Monday, July 31, 2006

Blogging, the Chautauqua, and their Place in our World

"I told Wittgenstein that my friend James, who had been working on his Ph.D. thesis for a year, had decided in the end that he had nothing original to say and would therefore not submit his thesis or obtain his degree.

"Wittgenstein: For that action alone he should be given his Ph.D. degree.

"Drury: Dawes Hicks was very displeased with James about this decision. He told James that when he started to write his book on Kant he had no clear idea what he was going to say. This seems to me an extraordinary, queer attitude.

"Wittgenstein: No, Dawes Hicks was quite right in one way. It is only the attempt to write down your ideas that enables them to develop." M. O'C. Drury, from Recollections of Wittgenstein, p. 109

When I told my girlfriend I'd created a blog, she said (among other things) that I was wasting my time. She's right, of course, and I know it - but I can't help demurring. Yes, it takes up time, and that's something I don't have much of. What the Federal budget is to money, my work and research is to time. This is true.

But not completely. There's the need to articulate thoughts, and to do so in a public sphere - and blogging fulfils that deep-seated need. When you don't get a chance to have a good conversation outside the regular sphere, you start to go funny.

What I mean here is, I do have good conversations with my darling. She's one of the most important people in my life, and I wouldn't give her up for anything; she lends a big part to the wholeness of my life. But if I couldn't talk to anyone else, I'd go crazy. She would do the same. It's human nature. We have our first loves, and we have our friends, and we have acquaintances. All of them lend something to the richness of our lives. It's a communal thing.

The wholeness of life is an open whole. The body is a complete thing in itself, but it has eyes for seeing - what is not the eyes; it has ears for hearing - what is not the ears. And so on. Similarly, the conversations I have with other people serve a different need, a more public one.

I recently joined a listserv dealing with C.S. Peirce and his philosophy. The host's welcome message makes just this point - that there is a conspicuous lack of a forum for intellectual discussion today, and its absence is felt by many. Specialists in their fields may have conferences, they may read journals, but face-to-face talks are not always forthcoming.

The next best step is something, well, like what's happening online: almost-real-time discussions taking place with dozens, perhaps hundreds of people interested in the same topic as you, each with their unique point of view. That there is a lack of this discussion-space is telling. Sure, there are always informal discussions around a mug of beer, or during a party, but what about something a bit more formal - or simply organized?

I can't help thinking of the Chautauqua tradition in the late 1800s-early 1900s. This is, I think, a uniquely American thing: it was a group of performers who traveled from town to town, staying in each for a few days, where they would put on a show. It wasn't just something people would sit back and watch, they got to participate too. It was a mix of music, theater, oratory, and there was often a discussion that ensued; it was a way for people to get their brains going, and it worked well.

Movies all but killed Chautauquas. I remember one coming through North Dakota, where I grew up; as hokey as it may sound to today's jaded teenager, it was quite a bit of fun. Don't get me wrong, I love a good film, but I won't pretend it's necessarily superior to pre-Hollywood entertainment. Now although I don't recall any discussions, I do remember a guy dressed up like Teddy Roosevelt and giving a speech, and he did it pretty well. And I imagine that with the right speech, you could get a talk going - and wouldn't that be something?

Blogging is no Chautauqua; in fact it's like comparing apples and oranges. No, away with the cliches. If talks were drinks, a chautauqua would be a big mojito - a festive cocktail for the soul that livens up an otherwise mundane bar. Blogs would be that coffee you're nursing while brooding on something - while you're trying to get your thoughts right, the cup gets cold, the coffee goes bitter. It's nothing special, but in its way it is. Mojitos have a longer half-life than blogs, but you wouldn't give up that cuppa joe, now would you? I didn't think so.

Blogging is my way of straightening out my thoughts in a public way, while maintaining the solitude needed for straightening them out. Writing helps me articulate my thoughts better because I'm having a conversation with myself. Once I get that out of the way, I'm better able to actually talk face-to-face. So you could say that blogging is my way of paying the ticket to the Chautauqua of real life.

Of course that's not always possible - real life doesn't wait for me, and neither does my darling, which is actually a good thing for my pokey old butt. I wouldn't want to say, "Good question, let me blog it and I'll get back to you." But sometimes I just like having this place carved out for myself to do as I please. Virginia Woolf argues that it's essential for everyone to have a room of one's own, and she's right. Everyone needs a space to call their own; this here is mine.

http://www.chautauqua.org/history.html

Monday, July 24, 2006

Babylon Belgium

After living in Belgium for nearly eight years - and not mastering any of the three official languages (in typical ugly-American fashion) - I've become painfully aware of some issues surrounding the use of any given tongue.

Of course there is the practical side of life. You have to get around, buy food, wait in city hall for your number to come up, etc. When you don't have the local language, it makes things difficult to say the least. The Flemish are wonderful at learning foreign languages, and most speak English quite well; you could say they have learned how to learn languages. As enthusiastic language learners, they also want to practice speaking; they are very obliging to foreigners as well, switching languages adroitly.

But there's a down side to that. I hate to say this, but sometimes their enthusiasm for learning other languages hinders visitors from learning to speak on local terms. Water flows downhill, and people generally won't learn a language if they don't need to. I'm surrounded by Dutch (or Flemish, depending on how strong your provincial feeling is), and my competence hasn't improved much over the last five years; I learned enough to get by, and that was it. When I'm in the shops, I speak Dutch till I run out of words; then, if the shopkeeper doesn't switch to English, I ask "Excuseer, mevrouw/meneer, spreekt U Engels?"

This isn't an apology for my linguistic incompetence. Without excusing myself, I'm simply describing some of the effects I've noticed. They're not limited to me; I know people from other countries here who share the same difficulties.

Now English is the common language in Leuven, a university town bustling with 20,000+ students, more than 1,000 of whom are from abroad. As the tongue of the world's most powerful nation, its businesses, and its popular culture, English currently holds the status of lingua franca (!). In Brussels the demand for English instruction is greater than for any other language; at one point there were more contracts for English in my school than for French and Dutch combined!

You would think this would solve our communication problems: everybody speaks English already, so why not just make it the official language? But native speakers have a leg up on their second-language colleagues. It's painfully obvious to me when I edit a master's thesis written by a student who is competent in the field, but not so competent in the language.


"America was the land where they were old and sick, Norway where they were young and full of hopes - and much smarter, for you are never so smart again in a language learned in middle age nor so romantic or brave or kind. All the best of you is in the old tongue, but when you speak your best in America you become a yokel, a dumb Norskie, and when you speak English, an idiot. No wonder the old-timers loved the places where the mother tongue was spoken, the Evangelical Lutheran Church, the Sons of Knute lodge, the tavern, where they could talk and cry and sing to their hearts' content." --Garrison Keillor, Lake Wobegon Days, p. 65.

The political dimension of language cannot be overestimated, either. As Keillor's poor Norwegian knows, the weaker user of a language sits lower on the totem pole than the stronger ones. And in Leuven, the students who are native speakers of English enjoy an advantage over everyone else - simply because they were born in the right place at the right time. Of course some are better with words than others, but at a certain level it cannot be denied that native is just better.

I haven't even gone into the domestic politics of language here. This is a problem as old as the country itself: ever since the Francophones lorded it over the Flemish, resentment has been bred in the north of Belgium. That feeling makes itself known even in the bakeries - literally.

Once I was in a bakery to get some worstenbroodjes (sausage rolls) for lunch; I ordered in Dutch, and the woman behind the counter asked if I wanted the rolls heated up. Everything was going fine until she rang it up and said, "Twee euro twintig, alstublieft" ("Two euros twenty, please"). I didn't hear her clearly, so she repeated it in English: "Two euros twenty." I gave her the money and answered in Dutch, and Dutch the transaction remained to the end.

While I was waiting for my sausage rolls, the next guy in line stepped up. He didn't even bother to try Dutch; he just ordered in French. The gal got his things and rang them up - she obviously understood, but she never stopped speaking Dutch. English would have been the more neutral choice, but the Flemish-Walloon tension has this backlash effect, even in the simplest places.

And it's not just me. A Greek friend of mine was nearly beaten up one New Year's Eve, simply because he was wishing folks "Bonne année" ("Happy New Year"). He didn't know any better; it was his first year here. But he learned fast.

If we want to look at something more "objective", consider the European Union and its legions of translators and interpreters. Any document has to be translated from its original language into every other member language, and the premium on good work is high: a misunderstood clause can mean a difference of a few million euros. In meetings, speakers address a headphone-wearing audience, listening not to the speaker directly but through an interpreter. Naturally all these services cost something, and the cumulative total must be staggering.

It's all in the interest of fairness. After all, how can you make one language the official one for the whole EU? But again, things are complicated enough without adding to it.

You can probably guess where I'm headed. What we need, I argue, is a language that is politically neutral, easy to learn, and lively enough to make learning it less of a chore. Esperanto would fill that need easily. I know there are criticisms, and I'd like to address them in the near future.

Sunday, July 23, 2006

Howdy pardner

Greetings, and welcome to my blog! Hopefully this will stay up longer than the last attempt. I won't complain about it, but let's just say the blogger-blog relationship did not go well.

What I'd like to do here is simply to publish thoughts on this or that issue as it comes across my mind. Should things evolve in radically different directions (as most likely will happen), I'll set up another blog for the occasion.

As farewells are not my strong suit, let me sign off with a simple "Don't be a stranger!"