Thursday, September 2, 2010

The American In Bruges:

Spoilers ahead!

After I watched The American, I made an offhand observation to a friend: the movie reminded me of 2008's In Bruges. I didn't seriously consider the similarities, but the observation stuck with me as I was driving home. Note: I haven't seen In Bruges in over a year, so I've refreshed my memory with several online sites: Wikipedia, IMDB, etc.; similarly, I've only seen The American once, so I've used the aforesaid sites to sort out any details I may have missed/forgotten. Anyway, the [haphazardly listed] similarities:

First, the plot: The American [as I understand it] follows the same general plot as In Bruges: an assassin ["Jack" and Ray, respectively] makes a mistake, and his boss sends him on a "vacation" while secretly scheming to have him killed. In both cases, the mistake is collateral damage: in In Bruges, Ray accidentally kills a bystander during a hit; in The American, Jack has been "making friends", and those friends have been getting killed; in the opening sequence [the only explicit "mistake", but the boss implies that this has happened before], Jack kills his "friend" when she witnesses his murderous response to a failed attempt against his life: "She knows too much", etc. Note: the aforementioned mistakes haunt their respective assassins.

In both films, the boss sends the assassin to an idyllic locale, and the locale is viewed through a travelogue lens; architecture is in the forefront: In Bruges with its medieval structures, The American with its nautilus shell city.

In both films, the assassin's would-be murderer* is a confidant[e]: The American's would-be murderer doesn't have the same emotional connection with the target as In Bruges' would-be murderer [any connection isn't reciprocated, anyway]; however, to the extent that Jack is able to maintain personal relationships, his would-be murderer is "close" to him.

In both films, the assassin befriends a nonviolent arbitrator native to the locale: the hotel operator Marie [In Bruges], and Father Benedetto [The American]; both arbitrators support the assassin to an extent, but neither "joins" the assassin.

In Bruges' Chloe and The American's Clara [the love interests] make for the most obvious similarity: both are in negative occupations [risky drug dealing and prostitution, respectively]; both have negative relationships with men [an abusive boyfriend and "Johns", respectively]; and both hope to use their respective assassin-cum-lover to escape their negative situations. A tangential similarity: in The American, the man seems out of place in the restaurant; In Bruges' Chloe seems out of place in that film's restaurant. Both films address foreignness.

In both films, the would-be murderer betrays their [and the assassin's] boss. The scenes are similar, though the event sequence is reversed: In Bruges' would-be murderer betrays his boss, and is killed; in The American, the would-be murderer is injured, and then betrays her boss before she dies. Also: both would-be murderers die from a fall; interestingly, the injury incurred prior to the fall by the would-be murderer in The American is almost identical to an injury incurred by one of In Bruges' peripheral characters.

In both films, the assassin's boss arrives in the locale and attempts to murder the assassin; in both films, the assassin kills the boss [albeit indirectly in In Bruges]; in both films, the assassin is severely injured, but not killed onscreen. In both films, the assassin and the boss had maintained a relationship, though this is more pronounced in The American--speaking of which:

Jack's role in said relationship is childlike** ["What have I told you about making friends, Jack?"], and his overall curiosity is childlike. His fixation on butterflies seems innocently curious; similarly, you have the oral sex scene. I'm fascinated by it; focus on his eyes during that scene: he's curious, innocently servile... childlike--which makes the scene even more complex. Similarly, Ray is impishly childlike; again, there's that curiosity and innocent servility in the eyes--especially around Chloe.

Lastly: both films end with ethereally dreamlike, ambiguous scenes† collecting the players important to the dying assassins.††

I'm sure I'll see The American again [and again] and sporadically add to this list, but it's a start.


*Is an assassin's assassin a murderer?

**There's an interesting psychoanalysis of Jack waiting to be written.

†I'll write an analysis of The American's ending scene.

††Only the butterfly and Clara are important to Jack; see '†'

Friday, July 9, 2010

Moderatism:

In politics and religion [and the two are becoming increasingly indistinguishable], moderatism is in vogue. Contrasted with fundamentalism [or "radicalism"], moderatism is more "tolerant" and pliable: it doesn't necessitate fundamentalist ideology. Moderatism isn't apathetic, in the sense that moderates do "care"--they have opinions, and it would be nice if other people shared those opinions, but it's cool, man; are we still on for that beer this weekend?

Before I get into the relationship between fundamentalism and moderatism, I should note that this discussion is going to presuppose morality for the sake of argument: whether or not morality exists as an objectively provable natural phenomenon is irrelevant, since the notion of morality is entertained by nearly every person. If I suggest that objective morality doesn't exist, any subsequent conversation is stillborn: the debate descends into a philosophical quagmire. So, since the notion of morality exists, I'm going to treat morality as existent--that is, that there are objectively "right" things, and that there are objectively "wrong" things, and that we can know which things fall into which category.

And following that position, this discussion is going to assume that fundamentalism isn't inherently or necessarily "wrong"; if the moral propositions held by a fundamentalist are "right" propositions, then that fundamentalist is more "right" than the moderate or opposing fundamentalist. For example: if it can be objectively proved that killing another person is wrong, then the pacifist who claims that killing another person is always wrong is going to be more right than the moderate who claims that killing another person is wrong, yes, but that there are some exceptions where it's a necessary evil. In short: if a fundamentalist is right, the fundamentalist is going to be more right than the moderate who believes a diluted version of the fundamentalist's ideology.

Moderatism Defined:

Moderatism is defined by fundamentalism. Suppose that for every proposition X, there are two opposing fundamentalist conclusions: X, and not X--i.e., either something is right, or it is wrong. If a person believes X, then we can call that person a fundamentalist; if a person believes not X, then we can call that person a fundamentalist; and if a person believes X with some exceptions, or not X with some exceptions, then we can call that person a moderate.
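
For concreteness, the same definition can be put in quantifier form--the notation below is my own sketch, added only as an illustration, reading X as a moral claim evaluated case by case over the set of relevant cases C:

```latex
% A sketch of the definitions above (notation mine, for illustration only):
% X is a moral claim evaluated case by case over the set of relevant cases C.
\begin{align*}
\text{Fundamentalist [pro]:} \quad & \forall c \in C,\; X(c)\\
\text{Fundamentalist [con]:} \quad & \forall c \in C,\; \neg X(c)\\
\text{Moderate [pro]:} \quad & \exists E \text{ with } \emptyset \neq E \subsetneq C
    \text{ such that } \big(\forall c \notin E,\; X(c)\big) \wedge \big(\forall c \in E,\; \neg X(c)\big)
\end{align*}
% The moderate [con] case is symmetric: not X everywhere except on the exception set E.
```

The only structural difference between the moderate and the fundamentalist is the nonempty exception set E.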

Moderatism's Appeal:

Moral propositions have an inherent problem: you're either going to be right, or you're going to be wrong; and if you're a fundamentalist, you're either going to be very right, or you're going to be very wrong. Suppose that X is objectively wrong: the fundamentalist who claims that X is wrong is going to be right; the moderate who claims that X is wrong with some exceptions is going to be right, with some exceptions; the moderate who claims that X is right with some exceptions is going to be wrong, with some exceptions; and the fundamentalist who claims that X is right is going to be wrong. And therein lies moderatism's appeal: morality seems to be a gamble, with fundamentalism having the highest risk; moderatism seems to be a safe bet: you might be wrong, but you won't be completely wrong.

The Problem With Moderatism:

Consider the development of fire: throughout prehistory, fire was a major threat to our survival as a species; it destroyed food supplies and environments, and was a danger to any individual who found himself near it. Only a risk taker would attempt to harness something so destructive, but with too many risk takers, the survival of the species would be threatened; too few risk takers, and the species' development would stagnate. So, as a whole, our species has evolved to produce some risk takers, but not to be risk takers; this is why moderatism is the predominant moral position: most of us are programmed to play it safe, and moderatism plays it safe.

Since moderatism seeks to reduce risk, moderatism is tolerant of fundamentalism: moderatism doesn't support fundamentalism, but it doesn't oppose fundamentalism. To the moderate, supporting or opposing fundamentalism would be a positive action, while simply tolerating fundamentalism is a neutral action; it never occurs to the moderate that tolerance implicitly supports whatever is tolerated by allowing the tolerated thing to perpetuate itself. And therein lies the problem with moderatism: by tolerating fundamentalism, moderatism allows fundamentalism to perpetuate itself. But it's more than simply tolerating fundamentalism: by allowing fundamentalism to exist, moderatism is redefined. As previously mentioned, moderatism is defined by fundamentalism: as fundamentalism propagates, moderatism and fundamentalism polymerize, creating a more fundamentalist moderatism, and a more radical fundamentalism. Consider the previously mentioned example of killing another person:

Suppose that killing another person is objectively wrong: on one end of the spectrum, you have fundamentalists who claim that it is wrong; and on the other end of the spectrum, you have fundamentalists who claim that it is not wrong. However, most people will be moderates who claim either that killing another person is wrong, with some exceptions, or that killing another person is not wrong, with some exceptions; in either case, most people will be moderates. As the fundamentalist positions propagate, they will adopt the illusion of moderatism; if more people come to believe that killing another person is not morally wrong, then that position will appear more moderate than the opposing position; and if that position appears more moderate [if it appears to have less risk], moderates will gravitate towards it, unknowingly becoming fundamentalists. Invariably, the end result is an ideologically dictatorial fundamentalism that has assimilated moderatism.
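
To see how that assimilation might play out mechanically, here is a toy simulation--entirely my own sketch, with made-up numbers, and not anything taken from the argument above. Opinions sit on a line from -1 to +1; the endpoints are the two fundamentalisms; "moderate" is read off the current population average; risk-averse moderates drift toward that perceived center; and anyone who drifts far enough hardens into a fundamentalist. Because one pole starts with more adherents, the perceived center is never actually neutral:

```python
# A toy, illustrative model of the drift described above -- my own sketch,
# with invented parameters; it is not drawn from the essay or from any
# real opinion-dynamics literature.

import random

POPULATION = 100             # total agents
PRO_FUNDAMENTALISTS = 15     # start fixed at +1
CON_FUNDAMENTALISTS = 5      # start fixed at -1
DRIFT_RATE = 0.2             # fraction of the gap a moderate closes toward the perceived center each round
HARDENING_THRESHOLD = 0.4    # past this, a moderate "unknowingly becomes" a fundamentalist
ROUNDS = 60

random.seed(1)

# Fundamentalists hold fixed positions; moderates start near zero.
fundamentalists = [1.0] * PRO_FUNDAMENTALISTS + [-1.0] * CON_FUNDAMENTALISTS
moderates = [random.uniform(-0.2, 0.2)
             for _ in range(POPULATION - len(fundamentalists))]

for round_number in range(1, ROUNDS + 1):
    everyone = fundamentalists + moderates
    perceived_center = sum(everyone) / len(everyone)

    still_moderate = []
    for position in moderates:
        # The "safe bet": move toward whatever currently looks moderate.
        position += DRIFT_RATE * (perceived_center - position)
        if abs(position) >= HARDENING_THRESHOLD:
            # Assimilated by the nearer pole.
            fundamentalists.append(1.0 if position > 0 else -1.0)
        else:
            still_moderate.append(position)
    moderates = still_moderate

    if round_number % 10 == 0 or not moderates:
        pro = fundamentalists.count(1.0)
        con = fundamentalists.count(-1.0)
        print(f"round {round_number:2d}: pro={pro:3d} con={con:3d} "
              f"moderates={len(moderates):3d} center={perceived_center:+.2f}")
    if not moderates:
        break
```

With these hypothetical numbers, the moderates cluster, creep toward the larger pole for a few dozen rounds, and then convert almost all at once--the assimilation described above, in miniature.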

An Answer To Moderatism:

If there are objectively "right" things and objectively "wrong" things, and if we can know which things fall into which category, then the problem of moderatism becomes obvious: if the right fundamentalism eventually refutes all other ideologies despite moderatism, then moderatism was unnecessary; and if the wrong fundamentalism eventually silences all other ideologies despite moderatism, then moderatism was useless. At a basic level, morality is a coin toss: you're either going to be right, or you're going to be wrong; moderatism doesn't change those odds: if you tolerate both sides, you're not changing the odds of either side being right or wrong. But even though moderatism doesn't affect the odds of whether something is "right" or "wrong", it does affect whether one ideology will gain a foothold over another ideology.

Surprisingly, the answer to moderatism is fundamentalism: if only fundamentalists existed, then any moral conflict would be quickly decided; assuming an objective morality, the "right" fundamentalism would quickly become obvious and accepted, or the "wrong" fundamentalism would silence the opposition. In either case, moderatism's involvement wouldn't change the possible outcomes: it would simply draw out the conflict. One could argue that moderatism might swing the odds in favor of the "right" fundamentalism, but moderatism could just as easily be a boon to the "wrong" fundamentalism; other variables aside, the odds wouldn't change--only the timetable.

Consider the moral question of killing another person: moderatism has continued this debate across thousands of battlefields and hundreds of millions of deaths. If killing another person is not morally wrong, then the fundamentalism espousing that ideology has already won. If killing another person is morally wrong, one might argue that moderatism is keeping the debate alive; however, think of what might have been if there were no moderates: either the pacifists would have won the debate millennia ago, avoiding untold conflicts, or the pacifists would have been silenced, and in the latter case, the result would have been essentially indistinguishable from the results effected by moderatism's involvement in the debate.

In our conflicted world, moderatism has serious consequences: moderatism doesn't end genocide; moderatism doesn't feed starving people; moderatism doesn't volunteer in AIDS-stricken Africa; moderatism doesn't combat civil injustice; moderatism doesn't guarantee children an intellectually honest education. In any conflict, moderatism is either unnecessary or useless; moderatism has never decided anything--moderatism simply perpetuates the debate.

Thursday, May 20, 2010

Cologne:

I consider myself a cologne connoisseur; I wear different colognes for different seasons: citrus-tinted ones in warmer months, spice-tinted ones for cooler months; my cologne choice also depends on the time: more noticeable ones for the daytime, more subtle ones for the nighttime.

My first cologne was Curve, which I started wearing in grade seven. I continued wearing that cologne until my sophomore year in high school, when I took a brief hiatus from it and wore a sandalwood-based cologne marketed by Hollister, a clothing company I frequented at the time. I continued wearing that cologne for several months, but I eventually migrated to Black, a more subtle and cinnamon-tinged cologne. I continued wearing that cologne through high school, and my sparing use of it [don't overuse a good cologne] allowed one bottle to last several years; during that time, I returned to using Curve, and also began using Polo Blue, though I reserved the latter for the winter months. Since graduating high school, I've used umpteen other colognes [I've forgotten most names, unfortunately], and I've begun experimenting with mixing different colognes. In short, I consider myself a cologne connoisseur. And this brings me back to my point: don't wear Axe.

Monday, May 17, 2010

Stoned:

The New York Times has started a new philosophy section, "The Stone":

http://opinionator.blogs.nytimes.com/2010/05/16/what-is-a-philosopher/

I've always understood philosophy to be an implicit science: every discussion--political, biological, psychological, etc.--is implicitly and necessarily philosophical. Yes, you can have explicitly philosophical discussions [the nature of "good" and "evil", for example], but that leads to my second point[s]:

Many of those questions have been answered, contrary to book sales figures; the questions have survived the centuries only because people cannot tolerate [or even comprehend] being wrong. Consider how ideas propagate:

Teach Idea A, while demonizing Idea B. Idea A is logically disproved, but Idea B has already been demonized, which effects cognitive dissonance: logically, Idea A cannot be tolerated, but Idea B has been so demonized that it is impossible to accept. Unfortunately, most individuals will adapt reality to Idea A, and the disproved idea continues.

Which brings us back to the New York Times article: it's going to tolerate disproved ideas; philosophy tolerates disproved ideas, because many "philosophers" tolerate disproved ideas. That it's apparently radical to suggest that people can be completely wrong ["Everyone has their own reality, and we have to be tolerant."] is ridiculous: most people are completely wrong about many things--I've been wrong about many things; the difference is, I've adapted to what's logical rather than to what's illogical; I only wish we [philosophers, society, etc.] expected everyone to do so.

Tuesday, April 20, 2010

Video Games Can Never Be Art:

Roger Ebert recently posted this blog entry, "Video Games Can Never Be Art":

http://blogs.suntimes.com/ebert/2010/04/video_games_can_never_be_art.html

I enjoy Roger Ebert's writing on film [not to mention his wonderfully meandering discussions on personal historical minutiae] in the same way I enjoy Richard Dawkins' writing on evolutionary biology: when they're writing what they know, they're wonderful to read, but once they stand on the philosophical soapbox--well, the proverbial shit hits the proverbial fan.

Anyway, several issues with Ebert's entry:

[regarding the cave wall art at Chauvet-Pont-d'Arc]

"[Kellee Santiago] begins by saying video games 'already are art.' Yet she concedes that I was correct when I wrote, 'No one in or out of the field has ever been able to cite a game worthy of comparison with the great poets, filmmakers, novelists and poets.' To which I could have added painters, composers, and so on, but my point is clear. [...] She shows a slide of a prehistoric cave painting, calling it 'chicken scratches on walls', and contrasts it with Michelangelo's ceiling of the Sistine Chapel. Her point is that while video games may be closer to the chicken scratch end of the spectrum, I am foolish to assume they will not evolve."

Ebert then says, regarding the cave art:

"They were great artists at that time, geniuses with nothing to build on, and were not in the process of becoming Michelangelo or anyone else."

This elicits the question, Why should video games be exempt from this "genesis" stage, where the artists press into the unknown, creating something without a blueprint?

Ebert's diatribe then focuses on the dichotomy between "arts" and "games"; he says, "One obvious difference between art and games is that you can win a game. It has rules, points, objectives, and an outcome. Santiago might cite an immersive game without points or rules, but I would say then it ceases to be a game and becomes a representation of a story, a novel, a play, dance, a film. Those are things you cannot win; you can only experience them."

The unaddressed issue here is these arbitrary notions of "rules, points, objectives, and an outcome". Firstly, "rules": I'll assume [since he never defined his terms] that he means, "conditions that must be satisfied by the viewer in order to complete the viewer's experience", or something to that effect [again, I'm trying to fairly play his side, since he neglected to do so himself]--though that elicits the question of which conditions are necessary, and leaves those "conditions" undefined. I'll concede this point: games have intended rules; however, I would ontologically argue that art also has rules. And even if "rules" created a definable palisade between "art" and "games", I would ask why "rules" bar games from being art.

When Ebert mentions "points", he again fails to define this term, which negates any meaningful debate: axiological value theory would suggest that games and art are both subject to "points"--or, to the application of value [and as a film critic, Ebert owes his career to this philosophy]; suggesting that games and art have differing "values" is fine, but said suggestion cannot logically serve as a predicate when it remains undefined and unsubstantiated.

Similarly, "objectives" arguably apply to both games and art: one might argue that games differ in that viewers are asked to reach an objective, but this elicits the question of whether artists demand that similar objectives be met [which would suggest that art is subject to rules] in order for the viewer to view the art as the instantiation intended by the artist; Ebert even [unintentionally?] suggests that art has objectives: "Yet what ideas are contained in Stravinsky, Picasso, 'Night of the Hunter', 'Persona', 'Waiting for Godot', 'The Love Song of J. Alfred Prufrock'? Oh, you can perform an exegesis or a paraphrase, but then you are creating your own art object from the materials at hand."

Finally, the nature of the "outcome" depends on how one defines rules, points, and objectives: any contention would be meaningless unless one satisfied each predicate--and, subsequently, each predicate's predicate[s]; as I've stated, Ebert has unfortunately presented his arguments as stillborn statements.

[regarding Braid]

"You can go back in time and correct your mistakes. In chess, this is known as taking back a move, and negates the whole discipline of the game."

Having played this game, I can assure you that "taking back a move" doesn't make the game's puzzles any less difficult or meaningful [as Ebert seems to imply]; consider this: if you fall off a cliff, and then "rewind" time, you're back on the mountain, but the cliff is still there.

"Nor am I persuaded that I can learn about my own past by taking back my mistakes in a video game. She also admires a story told between the games levels, which exhibits prose on the level of a wordy fortune cookie."

I would counter this with Ebert's own preceding statements:

"I might argue that the novels of Cormac McCarthy are so motivated, and Nicholas Sparks would argue that his novels are so motivated. But when I say McCarthy is "better" than Sparks and that his novels are artworks, that is a subjective judgment, made on the basis of my taste (which I would argue is better than the taste of anyone who prefers Sparks). [...] Countless artists have drawn countless nudes. They are all working from nature. Some of there paintings are masterpieces, most are very bad indeed. How do we tell the difference? We know. It is a matter, yes, of taste."

[regarding A Voyage To The Moon]

"Obviously, I'm hopelessly handicapped because of my love of cinema, but Melies seems to me vastly more advanced than her three modern video games. He has limited technical resources, but superior artistry and imagination."

Again, I would counter with one of Ebert's previous statements:

"They were great artists at that time, geniuses with nothing to build on, and were not in the process of becoming Michelangelo or anyone else."

Which returns the argument to my [unanswered] question: Why should video games be exempt from this "genesis" stage, where the artists press into the unknown, creating something without a blueprint?

[regarding cited game examples]

"The three games she chooses as examples do not raise my hopes for a video game that will deserve my attention long enough to play it. They are, I regret to say, pathetic. I repeat: 'No one in or out of the field has ever been able to cite a game worthy of comparison with the great poets, filmmakers, novelists and poets.'"

To which I would respond, again, with Ebert's own statement:

"They were great artists at that time, geniuses with nothing to build on, and were not in the process of becoming Michelangelo or anyone else."

Ebert then ironically channels Rush Limbaugh:

[note the poorly disguised, patronizing belligerence]

"Why are gamers so intensely concerned, anyway, that games be defined as art? Bobby Fischer, Michael Jordan and Dick Butkus never said they thought their games were an art form. Nor did Shi Hua Chen, winner of the $500,000 World Series of Mah Jong in 2009. Why aren't gamers content to play their games and simply enjoy themselves? They have my blessing, not that they care.

Do they require validation? In defending their gaming against parents, spouses, children, partners, co-workers or other critics, do they want to be able to look up from the screen and explain, "I'm studying a great form of art?" Then let them say it, if it makes them happy.

I allow Santiago the last word. Toward the end of her presentation, she shows a visual with six circles, which represent, I gather, the components now forming for her brave new world of video games as art. The circles are labeled: Development, Finance, Publishing, Marketing, Education, and Executive Management. I rest my case."

All of which is to say, Ebert wrote a shitty, shitty article: this blog entry would be laughed out of any serious philosophical discussion [assuming those still happen], but Ebert [unintentionally?] adopts the role of the patronizing polemicist, unconcerned with actually making an argument. Here's a thought experiment: read [or reread] Ebert's article, and replace references to video games with references to "liberals" and "liberalism"; it reads like a talk radio ad lib. I don't go to Ebert's blog to read this sort of dreck. Ideally, Ebert wouldn't write more articles like this, and he'd return to his strengths: film criticism, and his wonderfully meandering discussions on personal historical minutiae; but, as he said, "The circles are labeled: Development, Finance, Publishing, Marketing, Education, and Executive Management. I rest my case."

Note: I'm not angry, unlike the hundreds of angst-filled comments on Ebert's blog. I don't understand being "angry", really; I'm simply disappointed that he was so incorrect.

Update, 4/21/10:

Roger Ebert, via Twitter:

"I'm not too old to "get" video games, but I may be too well-read."

My comment was disconcertingly [and unfortunately] accurate: Ebert [unintentionally?] adopts the role of the patronizing polemicist, unconcerned with actually making an argument.



Thursday, April 8, 2010

Cooking 101:

Cooking is generally viewed as a result of our ancestors' experimentation with fire: after harnessing fire, they developed cooking; stated thusly, this assumes a rather unlikely [and illogical] "creativity": prior to [and even after] cooking was developed, food in any amount was a crucial, necessary resource; to suppose that human ancestors would risk this resource for an unknown reward is evolutionarily improbable, and even nonsensical. So, an alternative theory [probably already suggested]:

Our ancestors evolved in woodlands, but eventually moved into grassier areas; this theory is equally valid in either geography, but more probable in the latter. Consequently, our ancestors encountered fires. Now, it can be assumed that our ancestors encountered fire prior to developing it; this uncontrolled fire was directly and indirectly deadly to the individual. Therefore, the "protective" argument for our ancestors harnessing fire seems questionable: if fire was detrimental to an individual's survival, it seems unlikely that an individual would try to create a known risk.

Which brings us back to the grasslands, where our ancestors encountered brush fires. These fires certainly consumed many animals, and our ancestors probably encountered these "cooked" remains. Food's uncertain availability was certainly compounded by a fire, so our ancestors were probably more desperate for a meal: this probably led our ancestors to sample the "cooked" remains of other animals; in doing so, our ancestors would have made two crucial discoveries: that "cooked" food was better food [read as, more efficient food], and that fire "cooked" food; the latter epiphany would have established a method to cook--fire; by understanding fire as a cooking method, our ancestors would have had cause to develop fire, i.e., fire was no longer only a risk--it had a benefit. While our ancestors probably didn't yet possess the understanding to develop fire, establishing fire's benefit would have changed our ancestors' opinion of fire; when a method to create fire was eventually [and probably accidentally] developed, our ancestors had reason to understand and retain the ability to create fire, rather than discarding it due to fire's risk. The rest, as they say, is history.

Tuesday, March 23, 2010

Wild America:

So, it's come to this: Discovery is buying Sarah Palin's new "reality show" for upwards of one million dollars per episode*; Sarah Palin's Alaska, shot in the style of the wonderful Planet Earth, will take viewers on a tour of the state's natural beauty.

I have nothing against Alaska, the environment; I have an issue with a Palin-ized Alaska. I still cannot understand this ignorant bitch's cultural longevity. I mean, I understand tolerating her as intellectual shtick, but too many people [read as, more than none] view her as a legitimate, intelligent, "common sense-y" conservative; giving her another outlet [one outside the isolated reality of the Fox Network] only legitimizes her idiocy, and allows her to further polymerize herself with our culture.

And, as if legitimizing idiocy weren't enough, this new "reality show" will undoubtedly debut Sarah Palin, The Narrator; program narration quality is already in decline, and her adenoidal bleating will be another nail in narration's coffin. I'm not unreasonable: I know Attenborough can't narrate every program. And, public figures have turned in successful narrations: Sigourney Weaver [needlessly] narrated the stateside release of the aforementioned Planet Earth, and Oprah Winfrey narrates Discovery's current Life; but do we really need Sarah "I Can See Russia From My House" Palin narrating anything?

*I'm sure that paycheck will be philanthropically recycled.

Wednesday, February 10, 2010

Song Of The Angler:

From A.J. McClane's "Song Of The Angler" [Field & Stream, 1967]:

Who but an angler knows that magic hour when the red lamp of summer drops behind blackening hemlocks and the mayflies emerge from the dull folds of their nymphal robes to dance in ritual as old as the river itself? Trout appear one by one, and the angler begins his game in movements as stylized as Japanese poetry. Perhaps he will hook that wonder-spotted rogue, or maybe he will remain in silent pantomime long into the night with no visible reward.


Vis-à-vis "Digital Archeology", silicon-cemented ephemera:

Thunderheads erased the setting sun; our captain deviated an additional fifty miles into the Gulf Of Mexico, fading the storm into the distance. As night fell, we [the boat carried twenty-odd anglers] baited our hooks [squid, or shrimp], set our lines, and waited. The ocean was calm; our boat, a gentle pendulum, hypnotically pitching. Deck lights illuminated the agate-colored water, attracting bait fish; blood-colored squid, attracted by the bait, darted through the water; attracted by the squid was the occasional predator: barracuda, small sharks, mackerel. A sojourning leatherback floated beneath us, undisturbed by the food chain. Flying fish scintillated through the air [and occasionally into our boat], gliding on the breeze pushed out to sea by the distant storm. As Hemingway wrote, "A man is never lost at sea."

Never lost at sea, and never lost canoeing a somniferous stream: lazing in a canoe, floating under a willow canopy; catching the occasional bronzeback, or simply pulling the canoe ashore and swimming in a cerulean pool. Floating down the river, using the sun as a map: if the sun isn't setting, you don't need to be anywhere else.

Streams evolve into rivers, and angling evolves in tandem: into fly fishing. Of course, one can fly fish a stream, but the intellectual [and physical] demands of fly fishing often run contrary to the languor of a slow-running stream--though fly fishing a stream can certainly be more rewarding [and demanding] than most angling experiences. And fly fishing is physically demanding--not in exertion, but in specificity. And that specificity demands the intellectual challenge: in choosing [or tying] the proper fly, and knowing where and how that fly needs to be placed on the water: so begins that silent pantomime mentioned by A.J. McClane.

Sunday, January 17, 2010

Pandora:

Watching Avatar, I felt the same way I felt when I was six, staying up late to read [and reread] Dinotopia; Avatar evokes the soul of biology: the primal, overwhelming wonder of the living world.

[collected essays]


Luminous, Vibrant Fantasy World Has Science At Its Core:

Inventing Pandora's Flora:

On The Rationalism Of The Na'vi:

Avatar's Animated Acting:
