Ahhhh, the good ol’ practice of admitting we’re wrong. You know, as hard as it is, it actually feels kind of good once we do it. It’s not instinctive, and it usually takes a long time to come around to it. I’ve heard it said that criminals often begin to leave clues behind because secretly they want (need) to get caught. When they finally are caught, I don’t doubt that relief is the prevailing emotion. But confessing not only goes against our survival instincts and the baser aspects of our nature, it also goes against the culture. Many social critics regard the “no fault” divorce as one of the greatest tragedies in wording, as it has allowed an entire “no fault” culture to emerge. All that complaining we heard from our grandparents about how no one is responsible anymore, or how all these silly lawsuits are making life more expensive for us, stems from the frustration of watching the concept of “no fault” become in vogue.
In Protestant circles, I wonder if a renewed discussion on the merits of individual confession and forgiveness would make a dent in the “no fault” delusion. Given that the Roman Catholic Church continues to consider confession (Penance) a sacrament, and that priests tell stories of waiting 15 years for a single penitent to come, perhaps it wouldn’t make a difference. But when Luther considered Confession and Forgiveness to be a mere extension of baptism, and thus not a sacrament on its own, I wonder if he could have foreseen the way the Church would essentially abandon confession and eventually hold baptism in such high esteem that it almost becomes a superstitious event of universal forgiveness. If Protestants had never officially declared Confession to no longer be a sacrament, how different would our world be?
Again, Catholics officially retained the practice of Penance, but did Protestants influence them to think they don’t need it, either? It’s impossible to say. But one thing I’m pretty sure of is that it is very easy to believe the lie that we don’t sin, that we can save ourselves, and that confessing silently in our mind or heart is the same as confessing to our brother. It’s not. When we allow ourselves to confess to another person, we experience the true relief of admitting guilt as well as the security of community. With another person, and not just some vague conceptual God, we experience the sensation of admitting something we would prefer to keep hidden, and still having that person accept us. It is a rare experience, possible only with a spouse or truly great friend outside of clergy, and one that goes against our tendency to hide the things about ourselves we think will be the most damaging to our precious reputations.
That being said, I have only been to Confession once and it was a rather intimidating experience. The pastor was very good at it, and helped me find the balance of admitting appropriate guilt without beating myself up. I can understand why even the most practiced Christian (much less non-Christian) finds the confessional booth or kneeler anathema.
But not only is confession largely ignored by individuals; the liturgy of my own Church is moving away from the concept altogether. Having “successfully” replaced the practice of individual confession with a “Brief Order of Confession and Forgiveness,” our new service book takes it a step further by offering an option in place of the “Brief Order”: a “Thanksgiving for Baptism.” I understand the huge and profound importance of baptism, and I don’t want to neglect it in any form or fashion. And I do agree that forgiveness is an extension of baptism, etc. But when we get away from confessing our nature, and instead turn worship into a gigantic celebration, what are we saying about God? Does God desire, or even need, us to be honest about the way we fall short, or does he just want us to be happy that we’re baptized? Isn’t the possibility that such glorification lets us take the gift of baptism for granted worse than the “medieval” act of confession?
Finally, what has happened, and what will happen, as we distance ourselves from our own sin? I have heard that individual confession is gaining, very slowly but surely, in “popularity” again. Priests and pastors are encouraging it, and there are a few more takers than in years past. Good; abandoning the lies we can easily slip into is the beginning of spiritual health. And this isn’t to say we go into the confessional booth to beat ourselves up, or to get beat up. Quite the contrary: we hear about the grace of God in an entirely new way, spoken to us as individuals by a priest or pastor acting in the stead of God. Given that many Protestants and even some Catholics have never experienced this even once, I lament from time to time that Luther didn’t hang onto Penance as a third sacrament.
Thursday, June 29, 2006
Friday, June 23, 2006
Looking Back at Postmodernism
Thinking back on my recent trip to Houston, it was obvious to me that the city contains relatively little historic architecture (pre-World War styles of classical and Gothic derivation). Instead, the city serves as a broad catalogue of post-War styles: there are fine examples of minimalist mid-century modernism such as the additions to the Museum of Fine Arts, Houston by Mies van der Rohe, the high-tech pavilion of Renzo Piano’s Menil Collection, as well as some refined examples of 1960’s Brutalism.
My most recent impressions of Houston revealed a city characterized particularly by the Postmodern. This architectural movement rose in popularity during the 1970’s and became the ascendant architectural mode for much of the 1980’s, particularly in the United States. The style is often identified by the revival of historic formal elements, the superficial use of ornament, and a pastel color palette, all composed in a way that acknowledges the surrounding context. Knowing full well that the term 'postmodern' can be used as a catch-all for the many concurrent and contradictory architectural trends since the Sixties, I am referring specifically to a mode of design that showcases the consistent elements and characteristics just described.
As a broader phenomenon, Postmodernism is often treated as a blanket term describing all aspects of our contemporary culture from a point of view that evolved out of the social turbulence of the 1960’s. It was a reaction to the prevailing Modernist worldview, which understood the world in terms of elegant systems, machine precision, scientific logic, the universality of man, and thus the possibility of an objective and undeniable truth. Progress moved along a narrow path that favored abstraction, peeling back the layers to reveal an ideal world governed by reason and constant perfectibility. Modernism got its start in Renaissance Humanism and was given a more sober and secular character by the influences of the French Enlightenment, the scientific revolutions of the nineteenth century (including the contributions of Charles Darwin), Marxist materialism, and finally the nihilist works of Friedrich Nietzsche. These ideas achieved an architectural synthesis near the turn of the twentieth century with the first structures composed of primary forms, straightforward modern structural systems (reinforced concrete, steel), and facades abstracted to such a degree as to abandon traditional ornament completely. Abstraction contributed to a building’s ‘honesty’ by exposing the reality of its structure and opening the walls to the outside with large expansive windows. Form was the product of function and nothing more. Building was an important part of moving civilization towards an ideal, with large expanses of cities being torn down to make way for ‘urban renewal’.
This worldview began to collapse in the face of growing social disenchantment with Modernism's seemingly repressive universalism. In philosophy, a deconstruction of general assumptions about what is believed to be true was taking shape, resulting not only in the questioning of the validity of scientific knowledge, but also of the very structure of language itself. In architecture, minimalist glass and concrete cubes were subverted by a restoration of historic styles and traditional idioms. But just as Postmodern philosophy had tried to deconstruct language and describe the codified meanings of signs and symbols (semiotics), the revival of old and familiar architectural forms no longer necessarily connected those forms to their original meanings or uses. ‘Classical’ forms such as the column, the keystone, the gable, the arch and the profiles of moldings provided designers with countless opportunities to compose facades that deliberately evoked irony by exaggerating proportions and the traditional architectural relationships between these elements. Modernist 'honesty' was replaced by a Postmodern superficiality that revealed multiple layers of meaning dressed in humor. Modernist architecture was devoid of such ‘complexity and contradiction’, since it was designed under a different, more sober mindset that tended towards unity by abstraction.
The architect Robert Venturi and his wife, Denise Scott Brown, are largely credited with providing the first major theoretical underpinnings for the postmodern movement. Venturi’s 1966 book, Complexity and Contradiction in Architecture, was metaphorically the postmodern rebuttal to Modernism’s most influential manifesto, Le Corbusier’s Towards a New Architecture, published nearly forty years before. The most memorable quip from Venturi’s essay is that, far from Mies van der Rohe’s axiom that ‘less is more’, the reality for most people is that ‘less is a bore.’ The book cited countless architectural masterpieces throughout history that embodied formal complexity and were composed in such a way as to suggest contradictory relationships between their elements. Mannerism and Baroque architecture are particularly useful in describing this idea, in that designers were permitted tremendous flexibility to reinterpret traditional architectural motifs, subverting the tectonic and proportional relationships between column, arch and entablature. The period between the Renaissance’s (16th century) obsession with copying the ancients and the Neoclassical period’s (1750’s-1800) embrace of strict empiricism was one rich in architectural innovation, sensuous and passionate expression, and dramatic tension.
Subsequent Postmodern architects gladly delved into creating a new Mannerist style, assembling familiar classical elements in an eclectic, purposefully inelegant way. Scale was often distorted, proportions exaggerated, and fine detailing neglected in favor of enhancing a building’s symbolic effect from far away. Venturi’s house for his mother, Michael Graves' Portland office building, Philip Johnson’s ‘Chippendale’ building, Robert Stern’s projects for Disney and Charles Moore’s Piazza d’Italia in New Orleans are the most widely studied examples of the Postmodern practice of play and ambiguous symbolism.
Because most of the public is rather fond of traditional architectural modes, Postmodernism overtook its Modern predecessor as the preferred choice for most construction projects. Any recent building that evokes a traditional style or displays abstracted classical ornamental elements could be classified as Postmodern. If the look of a building appears to represent something else, whether a long-lost style, a familiar building typology, or even a symbol embodying meaning, it is Postmodern.
Older major American cities are blessed with beautiful Beaux-Arts structures commonly built during the latter half of the nineteenth century and the first decades of the twentieth. Houston, a city heavily bankrolled by the fortunes of the petroleum industry, seems to have built its great civic and commercial monuments en masse during the Seventies, when oil prices were high. As a result, Houston has become home to a vast number of ‘authentically’ Postmodern (an oxymoron, I know) buildings, a treasure trove of the architectural idiosyncrasies popular during the seventies and eighties, from the flamboyant use of pastel colors to the pastiche stucco facades with their seemingly cartoonish scale and abstraction.
The Houston skyline is dotted with Philip Johnson’s postmodern experiments, the profiles of his buildings appropriating familiar typologies and forms of the past into the new context of the corporate office building. His design for the architecture school at the University of Houston is nothing but a literal reproduction of an unbuilt scheme by the eighteenth-century French architect Claude Nicolas Ledoux. The opera house, especially its interior, is a vivid demonstration of the ability of ornament to emphasize festive pageantry, but it is also a testament to postmodernism’s ephemeral nature. John Outram’s Duncan Hall at Rice University can be an overwhelming visual experience. Its bold color, elaborate ornamentation, and Egyptian hypostyle hall stimulate the visitor to the extreme, while equally demonstrating the potential to activate a space by recycling past motifs. Venturi Scott Brown's design for Houston's Children's Museum is another excellent example of Postmodern flamboyance. The Federal Reserve branch bank in Houston by Robert Stern at first appears from far away as an abstracted Greek temple, but the highly contrasting colors, the broken façade planes, and the gigantic painted ‘brick’ mortar joints reveal a rather postmodern treatment. The front façade resembles a child’s drawing of a house more than a temple, and the gaps between the punched planes undermine the bank’s traditional image as a strong fortress that protects wealth.
Rediscovering the examples above, along with other postmodern structures, revealed to me how the style had a character and visual consistency that make it so identifiable. It is truly a product of its times, defining the built environment much as fashions defined social life during the 1970’s and 1980’s. Unlike its Modernist predecessor, postmodern architecture was not a radical rethinking of what built space could be. It gave little attention to the connection between outside and inside or to the fluidity of open space, nor did it call attention to new materials and technologies. Postmodernism in architecture was rather a restoration of what went on before Modernism’s violent break with the past, beginning after the First World War. Victorian, Arts & Crafts and Art Nouveau (as well as Art Deco of the 1920’s & 30’s) were attempts at creating new surface styles to dress the many new building types that emerged near the turn of the twentieth century. Postmodernism is mostly about the manipulation of surface, similar to those earlier, more beloved styles. The difference is that Postmodernism was less about simple embellishment and a systematic formal vocabulary than about using signifiers to make a statement. Also, its level of craftsmanship and attention to detail is mostly lacking compared to the older styles.
Although ironic humor is often used to describe postmodernism’s architectural expression, the underlying meanings of these designs are quite profound. It was an architecture that tried to make a point, ‘speaking’ through signs, quotations from the past and subversive proportions. Modernist glass boxes aimed to say as little as possible, their meanings pared down to nothing more than the simple fact of merely existing. Nowadays, postmodernism no longer stimulates the interest of the architectural vanguard. Modernism experienced a revival during the 1990’s, while the digital revolution has ushered in the rise of biomorphism or ‘blobitecture’ while also breathing new life into deconstruction. Postmodernism is now looked on as a clumsy effort to respond to the conceptual dead end of late Modernism, suffering from its lack of timelessness and its indifference to actually being beautiful. Young designers often cringe at the sight of a postmodern masterpiece from the seventies or eighties.
Update: Sam Jacob over at "Strange Harvest" explains why he admires the contributions of Postmodern pioneers Venturi Scott Brown in this insightful post. Hat tip: Progressive Reactionary
Wednesday, June 21, 2006
Church Hypocrites: Hate the Rich, Want Their Money
First, a defense of hypocrites. Hypocrites, by nature, profess to believe in something. Usually, hypocrites profess to believe in things of value, noble principles or virtues, though I suppose it’s possible for a hypocrite to believe in things that have little moral value. Either way, hypocrites, whether they mean it or not, stand up for a belief. It is in that risk-taking that the character of the hypocrite is revealed to be flawed, thus negating the very values the person upholds. So by that measure, I consider every Christian to be a perfect example of hypocrisy, and that’s not entirely a bad thing. Unlike their atheist or agnostic brethren, at least they confess to believe in something beyond themselves. And in that confession, they fall short, and are easily labeled as hypocrites.
So I am a proud hypocrite, but I do not hold that all hypocrisies are created equal. Blatant hypocrisy based on conceit or pride is clearly worse than the hypocrisy that is inevitable when one does one’s best to live out a moral life and falls short. So I will have the audacity to name the kind of hypocrisy that irritates me the most. I was listening to a speech tonight by a pastor who is nothing short of courageous, who gave a very engaging speech, but who also repeated a lot of very tired clichés that have been disproved enough times to be discredited. Yet here they were again, presented as gospel truth to impressionable teenagers who are too young to understand economics well enough to debate them.
The sentiment was great and the intentions well-meaning, but how long do we have to hang on to the following untruths: that the minimum wage needs to be increased, that the rich are greedy, and that huge companies like Wal-Mart are helping to keep the poor impoverished? There was no discussion of reliance on social programs, or any hint that such programs might make poverty worse. There was no discussion of the importance of the father in the home. At least he, unlike the State, understood that real change could come to the truly down-and-out if they could make better life decisions. He, for example, worked closely with many to help them become free of addictions.
One story in particular jumped out at me. The pastor, who works in inner-city Milwaukee, asked a banking CEO who makes $4.6 million a year when “enough was enough.” That’s a fair question to ask. Eventually, it seemed to inspire this CEO to donate over a million dollars to scholarships for poor African-Americans so that they might achieve. Yet the same pastor could not see the irony when he lamented the woes of capitalism and recommended such socialist drivel as “Pedagogy of the Oppressed” or “Nickel and Dimed.” What was it that allowed the banker to donate the $1 million? Was it socialism? Communism? No, it was the capitalist system, and the prosperity that came as a result, that allowed him to give away a third of a year’s salary.
At one point, he asked, “Why does this country so hate the rich?” I wondered when he had last watched the news. Has he not heard that we are the most generous nation in the history of the world? Americans are more charitable per capita than anyone else, are they not? I guess because the minimum wage is only $5.15, America must hate the poor. But what about the data showing that the minimum wage has never statistically helped the poor, or the simple common sense that if it did, the first minimum wage would have licked poverty? Inflation, you say? What do you think causes it? Increases in wages lead to increases in consumer prices, and so the cycle goes.
I won’t preach to the choir (pardon the pun) any more than necessary; I just have one request. For those who bemoan capitalism, please do me the favor of not benefiting from its fruits. If money from capitalism is as good as “blood money,” please be happy running a commune. I realize there is corruption in capitalism as in any other economic system, but given its voluntary nature, I will defend it as a more moral system than the ones Paulo Freire or Barbara Ehrenreich espouse. Meanwhile, I’ll be looking the moral capitalists in the eye at my church and thanking them for their generous donations, which we will use as best we can to help those in need.
Wednesday, June 14, 2006
Crankin' out the Hits: Why Many Architects Won't Design Traditionally
On the question of what is good design, architects and non-architects rarely seem to agree. Particularly since the ascendancy of Modernism during the last century, architects have used a different set of values in deeming what a well-designed building should look like. Much of the public, particularly in the U.S., has expressed its dislike of modern design, often for very good reason: modern architecture struggled to express monumentality, and function was not apparent to users due to the style’s minimalism and lack of signifying ornament. Modern architecture also seemed contemptuous of the local context, deliberately clashing with the surrounding built fabric instead of harmonizing with it.
In response, trends in design have adapted to these criticisms, incorporating context to form a new synthesis commonly called critical regionalism. The rise of the Post-Modern style addressed the importance of traditional architectural motifs and the need for a building to clearly express what it is (albeit in its ironic and ambiguous way). And yet much of the design most highly regarded by the architectural profession still borrows heavily from the formal innovations of the earliest Modernist pioneers. This incessant tendency among the most highly regarded architects to design in the modern idiom fails to win many fans outside professional and academic circles or people with a general appreciation for modern art. One gets the impression that if only architects would return to traditional styles, conform to classical rules of design, and re-use time-tested building typologies, then all would be hunky-dory and architects wouldn’t come off as smug aesthetes.
Quite a few architects have taken that route with much success, but in my opinion the reason the most ambitious designers dedicate themselves to the modern style has a lot to do with the nature of architecture as a profession. For much of the history of Western civilization, a pleasing design was one that embodied the visual harmony brought forth by the skillful use of proportional systems, by following proper architectural vocabularies evolved from local tradition, and often by applying symmetry where applicable. Building types were few, and simple technologies derived from masonry construction were all that was available for large permanent structures. The profession of architecture as it is currently understood did not exist throughout most of our history, since the job of designing structures was left to the master mason along with the engineer. Vitruvius, who wrote the first major theoretical manual on architecture, was a Roman military officer and engineer, and much of his text is devoted to the successful planning of encampments and fortresses. During the European Middle Ages, the Gothic cathedrals soared because of the knowledge of the master mason, who drew the plan of the church only schematically and guarded the particulars of each design in the form of shop secrets.
The notion that a building should be designed by an academically educated professional came about during the Italian Renaissance. This new kind of specialist, the architect, would be steeped in the theoretical knowledge inherited from the Ancients (Vitruvius’s manual, for instance) as well as broadly exposed to the prevailing intellectual doctrines of humanism, science, mathematics and art. Rather than coming from the work site as a mason and being closely involved in the manufacture of every building component, as was typical of construction projects before then, the architect would generate a plan and compositions for the facades of the structure through the abstract techniques of drafting on paper. He would naturally use the knowledge he had acquired from his theoretical education instead of methods and common practices learned from a mason’s many years of apprenticeship. It is not surprising that at this time architects began to be credited with authorship, with individuals like Alberti, Brunelleschi and Palladio becoming among the first to be identified with their designs.
With the establishment of the first schools of architecture during the eighteenth and nineteenth centuries, the profession became more formalized and naturally more mired in design theory. Although architecture schools did not train designers in how to manage a practice, they did inculcate a sense that what distinguished architects from the other construction trades was a unique poetic insight. Before Modernism took over the beaux-arts curriculum at the architecture schools, the debate was over the proper rules that should govern design, such as the tension between respect for the traditional orders and the desire to innovate within them. The Bauhaus-inspired curriculum would throw out such concerns, concentrating instead on re-learning the most basic concepts of form, visual relationships, color and technology.
Contemporary architects in the U.S. are mostly educated under a curriculum loosely based on Bauhaus principles. They have studied the history of architecture to some degree, often traveling to the old European capitals to sketch their marvels. And yet the notion that to be a modern architect is therefore to be a creative and artistic professional is taken as a given. Modernism’s lack of precise rules of composition allows anyone to believe that they are generating a scheme as unique as any other. Everyone gets to be a special designer who can leave a distinct signature on a building. The only problem is that a minuscule number of such designers actually have the talent to pull off an original yet transcendently beautiful building. For many, the design process is a true joy, but achieving a moving design is extremely hard. It is doubly difficult when applying the Modernist style, precisely because the rules are too few or too subtle. In my ideal world, those who are short on design talent (but good at everything else) would adopt a traditional style and diligently apply classical rules of proportion and composition. It is a foolproof means toward pleasingly attractive buildings, and would be a much better alternative to the numerous lazy modernist design experiments gone wrong.
When people are given the right to express themselves as individuals in any vocation, it is rarely relinquished voluntarily. Creating an original object, whether it is art, an artifact, or a building, is often fulfilling precisely because it allows an individual to realize himself in the physical world. The architectural profession during the last century has made this experience accessible to those who are neither quite artists nor pure engineers. Demanding that architects give up Modernist design in favor of historicism is in reality a demand that architects renounce their identity as theoretically educated specialists with poetic license. It is a demand that architects resume a role akin to that of master masons, mere practitioners of classical rules of composition.
The American Institute of Architects supposedly awards projects that demonstrate genuine design talent and technical mastery. It does not award the skillful application of historic styles. Such awards are a recognition by the architectural community of efforts in innovation and quality. Judging by the comments from this post over at 2 Blowhards, it is clear that quality and sophistication are defined quite differently by those within the architectural profession and those outside it. For those outside the profession, it is fundamental to understand that the buildings awarded were very difficult to execute and that the risks involved were much higher than with a more traditional solution. The result might look minimalist at times, but such designs require tremendous meditation on the part of the designer. Classicist design, because of its elaborate codification handed down through the ages, requires relatively little reflection. It is the meditative aspect of the profession that inspires the young to become architects, and it serves as a major basis for judging excellent work.
Would you rather have architects crank out the old hits anyway? Most of them think they are more than that, to their own success or demise.
Travel Lightly: Discipleship, Phase II
For those unfamiliar with trends in the Church, who would usually assume them to be rather boring, I find some of the recent movements in mainline churches fascinating. In my own denomination, I’ve been impressed by the way “discipleship” language has effectively taken over the conversation as churches realize their antique views of membership were at best self-centered, and at worst conceited. It seems that some mainline churches have made real efforts to get out of “head above water” mode and have struggled with how to be an evangelistic people. Phase I of this was simple: change the language so that people in the pews see themselves not as “members” so much as literal “disciples” of Christ, folks who “follow” him and abide by his teachings as the disciples did 2,000 years ago. Phase II is even more interesting, as the next step may be some sort of de-centralization, so that individual churches become “headquarters” for distribution centers rather than the totality of that church’s ministry.
Is this the future of business as well? As my fiancée enjoys Mondays as her official work-from-home day, it seems that the Internet and the increasingly manual-labor-free nature of work in America have made centralization of the workplace less relevant. Is it possible that in the future one building may house workers from dozens of different companies, serving as a convenient place to meet rather than as the nexus of a single company, sort of like a mall housing many retailers? Will companies become more de-centralized as America’s economy revolves more around ideas, information and automation than physical labor? In other words, will the lunch pail and hardhat continue to be replaced by work-from-home days and de-centralized companies?
Perhaps we should give Southwest Airlines a lot of credit for proving the impossible in the airline industry. They said they would be more efficient as a “hubless” airline, focusing on short, non-stop flights. It worked, and remarkably well at that. Most of the other major airlines were already too reliant on hubs (and too beholden to labor unions) to compete and have since declared bankruptcy. But Southwest found a new way to be lean and mean, to rid itself of the idea of centralization in travel. I wonder if this is the sort of de-centralization that will continue to define American business and even the Church.
Like I said, Phase II of the discipleship model seems to be not only for Christians to see themselves as disciples of Christ more than as members of a church, but also to see the church building as more of a headquarters than the totality of the life of that church. Think of a bicycle wheel, with the center being the church and the spokes being its ministries. It’s as close as we may get to entrepreneurship in the Church: there is still a center, but there are many ministries that can think for themselves. Most churches continue to operate from the hub mentality, with themselves at the focus of everything. They put so many financial eggs into that basket that there is little left for anything else. But can churches thrive within a city having one lean base of operations and many other “distribution centers”? Will Christian churches in a post-Christian world come to be outposts along the way rather than the center of the city square?
And how did we get to where we are, anyway? It seems to me that the basic model Jesus gave us in the life of discipleship was fundamentally altered as Church and culture became synonymous in the West. If I remember correctly what one Oxford professor told me, a city wasn’t technically a “city” unless it had a cathedral, at least in England. This was the difference between “town” and “city.” Either way, cities and towns alike were often built around the church. It was the center of public life, often at the center of the town. (Architects and city planners, feel free to offer comments on historical accuracy.) Why bother to “go” or “send” if the church is in the middle of town and everyone knows it’s there? Even though this model changed physically as cities changed, and mentally and spiritually with post-modernism, many churches cling to this lazy sort of evangelism. I admit, it is awfully tempting.
The reality is that cities changed as populations grew, transportation improved, and mass communication became the norm. As cities changed, so did the role of the Church within them. Churches could no longer assume the same place of centrality, but it has taken several generations for the Church to accept that. For several decades, my own denomination has been in denial of this change, seemingly still looking to the shores for more boats of Norwegians, Germans or Swedes to come and fill the pews. Well, wouldn’t you know it, the boats stopped coming. And so we arrive at what always was: the command by Jesus to “Go.” And not only go, but travel lightly, because it’s hostile out there. Now that Phase II is upon us, how will the Church react, especially those who haven’t entered Phase I?
Is this the future of business as well? As my fiancée enjoys Mondays being her official work-from-home day, it seems that the Internet and the manual-labor-free nature of work in America have made centralization of the workplace less relevant. Is it possible that in the future, one building may house workers for dozens of different companies, serving as a convenient place to meet more than the nexus of the company, sort of like a mall houses many retailers? Will companies become more de-centralized as America’s economy revolves more around ideas, information and automation than physical labor? In other words, will the lunch pail and hardhat continue to be replaced by work-from-home days and de-centralized companies?
Perhaps we should give Southwest Airlines a lot of credit for proving the impossible in the airline industry. They said they would be more efficient by being a “hubless” airline, focusing on short, non-stop flights. It worked, and remarkably well at that. Most of the other major airlines were already too reliant on hubs (and too beholden to labor unions) to compete and have since declared bankruptcy. But Southwest found a new way to be lean and mean, to rid of the idea of centralization in travel. I wonder if this is the sort of de-centralization that will continue to define American business and even the Church?
Saturday, June 10, 2006
Can We Save the World? Less is More with Charity Endeavors
A few comments on some recent articles have been especially inspiring. I like to think that if nothing else, most of the writing on this site is committed to reality. And not grim, dour, and pessimistic reality with a “there’s nothing we can do about it” mindset that assumes we are mere automatons in a machinated world. But we strive to be grounded in reality because we hold the conviction that the truth is invaluable in solving social problems. The assumption here is that the world’s persistent ills are not solved by romantic, catch-all solutions that promise everything and do little, but by accepting the reality that surviving and thriving in the world is the result of working, doing our part, and, especially, learning from history.
So when I write about the fallacy of the “all you need is love” mantra or the over-reliance on “justice” instead of “mercy” as a concept for helping others, it is this realistic worldview I am appealing to. Thus, a slogan that appeared in a comment struck me as yet another example of the way we continually delude ourselves, the way our best intentions often die like a fish out of water, and the way the arrogance of our modern age drives sacred and secular culture alike to make promises it cannot possibly keep. “One Spirit. One Will. Zero Poverty” is the slogan of Bread for the World (bread.org), a massive non-profit whose stated goal is to seek “justice for the world's hungry people by lobbying our nation's decision makers.” (There’s that justice word again.)
Certainly, Bread for the World may do a great deal of good work, and may seek to follow the example given by Jesus Christ when he fed the poor. I will give BFTW the benefit of the doubt that its intentions are good and godly, performed out of compassion generated by seeing truly suffering people. It’s not the intentions I question, but the stated goal. This phrase, this implied promise, “zero poverty,” strikes me as so beyond the pale that it makes me wonder what the real intentions of BFTW are. Who of truly noble intent has ever made such promises, or has even considered such an impossible goal to be possible? Even Jesus (whom even secular folks would concede was an exceptionally charitable person) never promised utopia this side of heaven, even as he fed four or five thousand at a time with a few fish and loaves. The number of charlatans and dictators who have made such promises is striking, however. And BFTW’s goal is centered squarely in a materialistic framework, yet it has the audacity to borrow spiritual understandings of “justice” to make its work appear solely inspired by Jesus-loving piety.
If the adage that “it is better to tell a big lie than a little lie” is correct, BFTW is certainly telling a whopper. Even if by their math they have number-crunched the wealth of the world against the hungry of the world, and by their idealistic ways of thinking they can see how there could no longer be poverty, who would actually believe that upping the federal budget here will get the money into the hands of the poor in “developing” nations? The power is not in the legislation, but in the distribution. All of the hopes and dreams for zero poverty go out the window as soon as a middleman gets involved and funnels American taxpayers’ money to warlords in a corrupt country.
All of this pie-in-the-sky thinking about “zero poverty” is a misguided waste of time. Creating real wealth for people hinges not on the good intentions of wealthy countries like the US, but on the rule of law in the countries that are poor. Without property rights, relatively clean government, and enough virtue in the culture to sustain the ever-fragile rule of law, we should never expect anything but poverty. Basic economics suggests, and real history has proved, that poverty is never eradicated by BFTW’s ridiculous solution for attaining “zero poverty”: upping America’s federal budget and giving the money to politicians in poor countries. We would only be empowering the corrupt leaders who keep the poor subjugated as it is.
So let’s stick with reality. Forget zero poverty. Let’s tighten the scope, get a little more myopic, and help those we can most easily help. If you want to help the poor in third-world countries, give them goats or cattle, or dig them a water well. The Church here could start by helping those who are dependent on government aid to become self-sufficient (in an economic sense, not a theological one). Perhaps BFTW could attempt to create “zero poverty” in its own neighborhood before having the arrogance to tell Congress what to do. Or hey, just read up on the failed socialist/communist economies of the past. A little history lesson would make slogans like BFTW’s all but impossible to ever consider.
Wednesday, June 07, 2006
Why America Will Lose....Unless...
America will lose the war in Iraq as it stands. The loss will not come as a result of insufficient training, inferior equipment, or even a lack of willingness by our troops on the ground. The loss will be a result of the American media and the weakness of our elected federal government officials. Our federal government is under constant attack by those in the media and by treasonous liberal officials who wish more troops would die. They wish this because the more troops die, the worse it looks for our President. George Bush knows that the war in Iraq is not being fought by the commanders on the ground over there, but here on Capitol Hill and in the pages of the press.
Our government has become nothing but a reaction-based body that jumps and does things for the liberal press. This is no surprise to those of us who follow politics and live on the conservative side of the aisle. We are disgusted at what we see in Washington, and know that those who founded this country are rolling in their graves. Soldiers are constantly saying that we need to let them do what they’ve been trained to do. In boot camp we try to make our soldiers into warriors, but then we send them off to sensitivity training. There is no sensitivity in war, there is no sensitivity in killing, and our enemy has no sensitivity for our way of life. We are trying to fight a sensitive war with a foe that has no tolerance for anyone’s way of life. A good analogy is this: two men are fighting. One of these men can only punch with his right hand. The other man can fight however he pleases; he can kick to the groin, pull hair, and poke eyes while the first can use only his right hand. Who do you think will win?
As we know, the military, specifically the Marine Corps, is under attack for what the press and the Democratic Party are calling a “murderous rampage.” Seven Marines and one Navy Corpsman are accused of killing “innocent” civilians in a fire-fight in Iraq. First of all, we don’t know if they were innocent; we are taking the word of the Sunni people over in Iraq. We haven’t even charged our soldiers with anything yet, but have them shackled in solitary confinement. Better treatment is given to child molesters and rapists in prison. This is an absolute disgrace to all Americans. Yes, the Sunni people have provided footage of bullet holes in the heads of Iraqi civilians, but that doesn’t say where those holes came from. I believe those bullets were put there, after those civilians were dead, by Iraqis opposing our involvement in their country. A building was brought down in the fire-fight, and thus civilians were unfortunately killed. This gave the Sunnis the chance they needed to demean the American forces. But wait, wasn’t Saddam a Sunni? Aren’t the people telling us this information loyal to Saddam? Of course they are, so why in the hell are we taking their word for it? Because our press is also loyal to Saddam. Our government has once again jumped and sent our soldiers to “sensitivity training” to teach them how to cope with the Iraqi people. How weak can we be? Patton would be sick! How in the world can we continue to have a volunteer military when people see their government turn its back on and betray the military men and women who have given so much? I am sickened. Finally, those like John Kerry have gotten their way.
While we are trying to fight the nice war and sending our weakened military to die, the enemies of our country are laughing at us. They mock us and know that we are weakened by the liberal press and the reaction-based government that cowers to that liberal press. Osama Bin Laden even said that the American people don’t have the stomach for casualties. The thought that we can stop Iran is laughable. The thought that we will prevent a civil war in Iraq is far-fetched. Our government simply does not have the testicular fortitude to do what is needed: rain hell from above. The job of the military is to inflict the maximum amount of destruction in the shortest time possible. The military is there to break the will of the enemy. If that means carpet-bombing a city to rubble, that’s what it means. War is hell. We haven’t won a war since World War II. Not Vietnam, not Iraq. Iraq hasn’t ended yet; yes, it started in 1991, but it hasn’t ended. We’re still there. We will lose if things remain in this constant state of mismanagement. We will lose for the same reasons Vietnam was lost. Our press and our mindless people don’t have the stomach for death. They don’t have the stomach for war. In war, innocent people die, warriors lust for killing the enemy, and the enemy is brought to submission. Then, and only then, can the enemy country be rebuilt, just like Japan and Germany were after World War II.
So maybe the liberals are right. Maybe we shouldn’t be in Iraq. If you can’t do it right, don’t do it at all. My opinion is that the soldiers who have died over there will have died in vain if things don’t change. They have not been backed by our government and have been lashed over and over by our media. The job cannot be accomplished using political correctness. There is only one way: complete victory. If I had known that our government would send our boys out and be as weak as it is, I wouldn’t have supported the war effort. I believe the reasons for going to war are just, and I believe the world is a better place without Saddam, but our leadership is entirely unfit to wage war. They are, unfortunately, “empty suits.” Only a leader formed in the mold of George Patton will lead us to victory. Until then, we will continue to see our soldiers die for a failing cause.
Monday, June 05, 2006
The Beatles Were Wrong: We Need More Than Love
I don’t want to be one of those people who blames the Baby Boomers for everything. I look at my own parents and find it hard to believe there aren’t great Boomers out there producing, working hard, and carrying on the valuable vestiges of the Greatest Generation. But one of the more irritating facets of this generation is its radical sense of idealism, which leads to paramount levels of impatience. Though Gens X and Y clearly do not share this idealism (it seems to have been replaced by cynicism, if not outright pessimism), the impatience passed down by the Boomers has officially been diagnosed as ADD/ADHD/etc. The theme song of the Boomers’ teenage years, “All You Need is Love,” turned out to be a daydream, a fanciful notion that never quite panned out as planned. Just ask The Beatles.
To function successfully in the world, we need more than love, or at least we need love properly understood. Often, I find that when those in my generation speak of love, they have either learned to cling to their parents’ (or the culture’s) romantic notions of love, or have rejected them wholesale. But love is an enterprise in sacrifice more than the living out of a gut feeling. The problem with gut feelings, or the kind of love the pop artists tend to sing about, is that gut feelings change, as do life circumstances. It strikes me that if The Beatles had been singing about love as sacrifice, love as joy, or love adapting and evolving to life’s curveballs, they would have been dead-on.
But I’m not sure the Boomers took it that way. This was their time; they were the generation that was going to prove their parents’ Depression-era advice wrong, that was finally going to have its cake and eat it too. All they needed was love. If they could just corral their good intentions into some sort of policy, they could finally achieve the utopia we had been on the verge of for centuries but could never quite reach. The Boomers would achieve what so many other generations couldn’t.
This sort of daydreaming led to scores of failed policies, from conception to implementation. The easiest examples are economic: policies that actually helped the pre-welfare poor were scrapped in favor of massive spending that made the problem worse. The unintended consequences of government growth and/or action were never thoughtfully considered by the Boomers and their legislators. I’m no foreign policy expert, but a continued weakness in how we wage/win war is, I think, a byproduct of this idealism run amok. War, at best, is a necessary evil, and daydreaming about it leads to second-guessing that only makes the problem worse.
But secular society isn’t the only place this motto has been adopted. I find my own church body (and similar church bodies) operating within this idealistic model, which, of course, contains much truth. The Bible is clearly full of endorsements of love: “For God so loved the world, that he gave his only begotten son,” from St. John, or “And now abide faith, hope, love, these three; but the greatest of these is love,” from St. Paul. (These are the “greatest hits” of Bible verses on love, and for good reason.) But when scripture speaks of love, I find that it can mean something different from what even the church calls love.
A recent television advertisement (is this what evangelism has become?) by the United Church of Christ (UCC) utilizes the slogan, “Jesus didn’t turn people away. Neither do we,” implying that the key feature of the UCC is that it is welcoming of all people, just as Jesus was. Whatever your lifestyle, we welcome you. Other mainline churches have similar advertising slogans, making sure the general non-churchgoing public knows that they are welcome. Fine, but is that all the Church is about? It seems that beneath all of this is the mantra, “All you need is love.” What about the law? What about virtue? What about living out our faith? What about sacrifice, carrying the cross, all that good stuff?
It’s probably not a good career move for me to seemingly place myself at odds with Jesus, which I don’t see myself doing at all. But I’ve found that if you criticize the love-only model of the faith, you’re quickly labeled a legalist. I just want to accurately define what we mean by grace and love, not only within our personal relationships, but also within our sacred and even secular institutions. The idealism of needing love and love alone has gotten us into trouble, not because the intentions were bad, but because the warmest, fuzziest blanket statement, “All you need is love,” is just too good to let go when the policy proves a failure. Give me two people who merely like each other but agree in principle as to what marriage is about, and I would wager their marriage would be more successful than that of a couple wildly in love who hold immature views of marriage. The same is true for a moral society. It’s great that we should all love each other. But how do we live together in the meantime?
Thursday, June 01, 2006
"You Must Watch This Tape!": Truth in Documentaries
Under most circumstances I try to avoid getting into political debates with my friends. For one thing, politics is a serious matter for most people, and I prefer to make time with my friends as enjoyable as possible. Another aspect of political debates is that they are often adversarial, since many of my friends hold an opposite point of view. It is futile to try to persuade a few of my friends who are distrustful of any facts I provide, arguing that whether information comes from a government agency or widely read news sources, a hidden agenda corrupts it all anyway. Thus statistics on the job market from the Department of Labor are suspect because they are supposedly influenced by the current presidential administration. News stories and other tidbits from network news broadcasts are to be doubted because the networks are all owned by a handful of large private corporations. Somehow the editors and news anchors are forced to toe the corporate line, which always seems to favor powerful business against the powerless. Facts matter little to such people, even as they insist that truth on an issue does exist.
What is ironic is that those who doubt factual data never hesitate to embrace information from the most subjective forms of media. One good friend of mine was proud that he neither watched the news nor read the papers, but he was eager to watch any video handed to him by his politically like-minded friends. Every time I began to lay out my position on a particular topic, he was quick to reply by declaring: “I have this tape!” This tape was supposed to validate his argument, as if an amateurishly edited thirty-minute video funded on a limited budget by a partisan non-profit organization were the only thing that articulated the truth on any issue. Having watched a few of these tapes, I asked him how such footage could be less agenda-driven than the other media he was so quick to reject. The explanation he offered was that these tapes were purposefully clandestine to avoid censorship and suppression from the almighty media and government agencies out to stamp out dissent. There was something quite romantic in it that captivated my friend: having exclusive access to the “truth” and organizing below the surface with like-minded comrades.
In high school I took a class on television production. Our final project was to create a music video, using camera, audio and editing equipment owned by the school. I remember the hours spent in the editing room, using bulky editing machines to splice footage into a clean sequence to better express the meaning of Pink Floyd’s “The Wall.” One thing anyone who has produced a video knows is the tremendous amount of editing involved, the importance of montage in developing a narrative, and the oodles of footage that never make the final cut. Video is an art form that must stimulate the viewer’s interest quickly, and its linear sequence of action requires that the information be condensed as effectively as possible. Such condensation makes videos prone to leaving out lots of important information, making them a poor resource for inquiry compared to journal publications, books or the internet. Therefore, of all the sources of information one could use to build a valid argument, the last thing one would ever want to cite is a video.
A video documentary is mostly an argument made with an assortment of facts, interviews, images and music. Unlike the inept rhetoric most of us can muster in person, a carefully crafted video plays on the viewer’s psychological impulses to deliver a convincing explanation of, and possible solutions to, a problem. But since all documentaries are heavily edited, one cannot help but wonder what kind of useful information was left out. A documentary often tests the limits of how compelling an argument one can make with as few facts as possible, using visual aids instead of raw data and deep analysis.
That is why I almost never watch documentaries at a movie theater. I go to be entertained, not informed. If I wanted to be informed I would read about an issue rather than rely on a documentary filmmaker’s deliberately composed jumble of footage and hand-picked facts. A person who uses these documentaries as the factual basis for an argument is basically borrowing someone else’s argument, in the form of a video, to make his or her own. This isn’t so much making up one’s own mind on an issue by carefully constructing an argument as it is simply repeating someone else’s more visually elaborate view on the topic at hand. Between “Fahrenheit 9/11” and an esoteric, brooding French film, I choose the latter. Even if a more conservative documentary were on offer, I would still prefer watching French actors smoke while contemplating their anxieties in life. At least a fictional art film can potentially reveal far more about the truth behind the human condition than the best-intentioned documentary.
What really lies at the bottom of why certain people cannot distinguish good objective information from the subjective kind is the way we perceive truth. To many people, truth and fact are unrelated. Truth often consists of an almost spiritual zeal, a sense of the way the world should work. Facts are inanimate fragments of information that can be manipulated simply to serve truth. The facts are made to fit the truth, not vice versa. Thus, a video documentary is mostly an exposition of a particular truth. To appreciate a documentary is not to praise its skillful exposition of an issue in general, but to agree with the filmmaker’s truth. If the film’s truth is opposite the viewer’s, the label of propaganda is used to discredit it.
Indeed, documentaries are produced with the goal of informing viewers, but rarely with the goal of providing a broad, balanced view of a topic. It seems that the more a film is driven by the desire to make money, as in a theatrical release, the less it is concerned with balance and the more it seeks out viewers who already share in the stated truths of the film. Interestingly, I find documentaries made for public television less concerned with preaching a truth than with simply describing all aspects of an issue in a seemingly fair way. What I hope people who seek information will realize is that to really understand an issue, one first has to acknowledge that things are often too complex to grasp from one point of view. One then has to do the work of researching and comparing all sorts of facts, and all the while avoid subscribing too quickly to any sort of truth.
A clear-headed view of what many video documentaries really are makes the most recent revelations regarding Michael Moore’s practice of highly selective editing completely unsurprising. His intent was never to illustrate the War on Terror and Iraq as complicated subjects requiring complete investigative detachment. Rather, Mr. Moore was laying out to the audience his truth in as clean a narrative as possible, from the purported relations between the President and the Saudis to the supposed desperation of American soldiers in Iraq to the apparent joy of Iraqi citizens under Saddam Hussein. If that demanded taking out of context the words of an armless veteran to reinforce the truth of his narrative, then so be it. To him, sharing his truth with viewers eager to accept any kind of truth was well worth the necessary twisting of the facts.
In the end, searching for a truth on any serious topic is more an emotional exercise than an empirical one. Facts are what they are. But to film editors, facts are what we choose them to be, and embellishing them results in their distortion in service of a “truth.”
For a complementary take on this issue, Patrick Hynes makes some good observations.
Hat tip: Instapundit