ads without products

finance vs. academia

with one comment

According to The Financial Times today:

In 1975, more than a decade before the Big Bang that deregulated the City, the average London financial services worker was paid about £3,800 a year, a salary that was outstripped by a sizeable proportion of other professionals. Academics were paid about £5,000, around a third more, while natural scientists and engineers received roughly 10 per cent more than finance workers.

Now the average London financial services salary is about £102,000 including bonuses while academics are paid about £48,000, natural scientists average about £42,000 and mechanical engineers £46,000.
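
The reversal is stark enough to be worth running the arithmetic on. A quick sketch: the salary figures are the FT's, as quoted above; only the ratio calculation is added.

```python
# Rough check of the FT's figures: pay relative to London finance
# workers, 1975 vs. 2014. The 1975 science/engineering number uses
# the FT's "roughly 10 per cent more than finance workers".

salaries_1975 = {
    "finance": 3800,
    "academia": 5000,
    "science & engineering": 3800 * 1.10,
}
salaries_2014 = {
    "finance": 102000,
    "academia": 48000,
    "natural science": 42000,
    "mechanical engineering": 46000,
}

for year, table in (("1975", salaries_1975), ("2014", salaries_2014)):
    base = table["finance"]
    for profession, pay in table.items():
        print(f"{year}: {profession} at {pay / base:.2f}x finance pay")
# 1975: academia at 1.32x finance pay ("around a third more")
# 2014: academia at 0.47x finance pay
```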

Written by adswithoutproducts

February 15, 2014 at 8:30 pm

Posted in Uncategorized

eliot / auerbach

with one comment

Very strange, and not at all sure what to do with this yet. Might just be a false echo… But as I’ve indicated on here before, I’ve long been fascinated by the final paragraph of Erich Auerbach’s Mimesis. Namely…

Beneath the conflicts, an economic and cultural levelling process is taking place. It is still a long way to a common life of mankind on earth, but the goal begins to be visible. And it is most concretely visible now in the unprejudiced, precise, interior and exterior representation of the random moment in the lives of different people. So the complicated process of dissolution which led to fragmentation of the exterior action, to reflection of consciousness, and to stratification of time seems to be tending toward a very simple solution. Perhaps it will be too simple to please those who, despite all its dangers and catastrophes, admire and love our epoch for the sake of its abundance of life and the incomparable historical vantage point which it affords. But they are few in number, and probably they will not live to see much more than the first forewarnings of the approaching unification and simplification.

I discuss it, for instance, in a (strange, wandering) post here. At any rate, I’ve been getting ready to give a lecture today on T.S. Eliot’s essays, and found the following in his 1921 piece The Metaphysical Poets. 

We can only say that it appears likely that poets in our civilization, as it exists at present, must be difficult. Our civilization comprehends great variety and complexity, and this variety and complexity, playing upon a refined sensibility, must produce various and complex results. The poet must become more and more comprehensive, more allusive, more indirect, in order to force, to dislocate if necessary, language into his meaning.

The play of simplicity vs. difficulty (and the gap of a few very important decades) does make me wonder whether there’s a responsive echo going on in Auerbach. Something to look into… (If only there were a good Auerbach biography in English!) What makes it more interesting, perhaps, is that arch-small-c-conservative Eliot is in the midst of laying out his theory of the “dissociation of sensibility” that somehow happened after the seventeenth century (hmmm) while – if very obliquely – Auerbach seems to be suggesting a sort of “re-association of sensibility” in the aftermath of modernism…

More soon if I can find a way / get a chance to look into this further…

Written by adswithoutproducts

February 9, 2014 at 1:53 pm

porn and democracy

with 4 comments

From Hannah Dawson’s review (paywalled) of Margret Grebowicz’s Why Internet Porn Matters in the current issue of the TLS:

Grebowicz […] argues that despite its theoretical potential, internet pornography tends to oppress rather than emancipate. The “free speech” that it embodies still belongs in large part to men, objectifying and subjugating other human beings. The many testimonies of self-empowerment from the “victims” of the industry are matched by first-person reports of misogyny, degradation, rape and incarceration. Rather than opening up an egalitarian space for self-construction, the file-sharing, file-ranking chatrooming online realm is creating communities all the more powerfully by normalizing discourses that preclude our saying anything new or real. “Internet pornography”, writes Grebowicz, “emerges as the perfect manifestation of the babbling political body, the speechless mass, in which every subject is interchangeable for every other, exercising its rights and expressing, more and more, telling us what we already know, climaxing, climaxing, always recognizable and predictable.”

The final sentence of the paragraph rings a bit oddly against what precedes it. While “speechless” sounds ominous, the “babbling” of the “political body” sounds like an only slightly pejorative rendition of democracy. Interchangeability is ambiguous too, as it is both the result of capitalism’s reduction of us labourers to replaceable parts and, again, a quality of democratic equality. The exercise of rights, in particular the right to expression, is of course also a staple of the democratic diet. But the point of the paragraph seems to be that porn is underwritten by, and a source of profitable reinforcement to, the powers that be – in particular, men.

So there’s complexity at play here: internet pornography presents an ambiguous vision of freedom that is subtended by a business apparatus that depends upon the very opposite of freedom. In this, it stands (like so many other cultural products, but more intensely and viscerally) as an uncannily accurate aesthetic mirror – a reflection more than a representation – of the political and economic conditions that obtain today in the world. On the aggregation sites, it seems, everyone has a voice: the cascading streams of thumbnails suggest a world in which all are represented, all represent themselves, and all are of course taking great pleasure in this rhythm of representing and being represented. And the consumer in turn sifts her or his pleasure out of this capacious pot of pleasure-taking and freedom-having. Everyone is equal, ostensibly, in their interchangeability – one’s acts are as free and pleasurable as those of the next. It can start to sound almost utopian, when described this way. But, of course, in the end and as in the world itself, almost all of this performance is stage-managed by those who profit from the exploitation of others.*

Given all this, a few questions to start. First, a quiet aesthetic question posed by internet pornography, perhaps, is what we do with its banality – the fact that it is constantly “telling us what we already know” – given that we incessantly come back to taste that banality again. There is a further quiet question, this time politico-aesthetic, about what this banality has to do with the conditions of its production and the means of its distribution.

But beyond these two, there’s an age-old matter of ethics – and the ethics of the aesthetic – at play, one that queries the relationship between exploitation and representation, empathy and what we might call “forced performance”, which has troubled the better sort of critic and writer since the very beginnings of literature itself, and which manifests itself at certain vividly aporetic moments as history moves forward. (One relatively recent example – Ruskin’s implication that the gothic cathedral is actually more beautiful than the pyramids because of the freedom of the workers who built it. But can that be right? Does fair-trade coffee actually taste better than that which is more exploitatively sourced?)

How much relieved sexual dissatisfaction is the suffering of a single human being worth? What am I to make of my enjoyment of the fruits of others’ struggles? Does it matter whether I am aware of the mechanics of production of that which I enjoy? How are we to understand the nexus of volition and exploitation, of willed self-exploitation and exploited wilfulness, that underwrites not only pornography but the increasingly illiberal world-space of “liberal capitalism”?

I have a sense that this perpetually recoded algorithm of suffering and enjoyment, repression and representation, is one of the matters that has always been, and still is, essentially worthwhile for us to take up. Further, it is a question that has everything to do with the issues at play in the article by J.M. Coetzee that I discuss in this post. But more on that, I promise, soon… A continuation of this is already in the works…

* Please note that I am – for the sake of starting up a line of thought – side-stepping for the moment several very important issues here. They include the very non-representative nature of porn (obviously not anything like “everyone” is represented there, no matter how many hundreds of thousands of videos exist to be viewed) as well as the extremely complex issues of exploitation and agency in the production of porn. These need to be addressed… but for now, let me just juggle a bit with the terms of the argument and description of the situation as presented in the review I have started from…

Written by adswithoutproducts

January 6, 2014 at 1:30 pm

the default mode: sociality, distraction, and concentration

with one comment

From Julian Baggini’s essay / review on “sociality” in today’s Financial Times. Here, in particular, he’s discussing Matthew Lieberman’s book Social: Why Our Brains Are Wired to Connect.

Lieberman’s task is in some ways the most straightforward. His aim in Social is to impress upon us just how much we have learnt in recent years about the wiring of our brains. Social thinking is so fundamental that it fills our consciousness whenever we switch off from any pressing task. This “default mode network” activity “precedes any conscious interest in the social world”, having been detected in babies as young as two days.

Most neuroscientists believe we have a dedicated system for social reasoning, quite different to the one that is used for non-social thinking. What’s more, when one system is on, the other turns off. Lieberman explains how the social system fulfils three core tasks. First, it must make connections with others, which involves feeling social pains and pleasures, such as those of rejection or belonging. Second, it must develop mind-reading skills, in order to know what others are thinking, so as to predict their behaviour and act appropriately. Finally, it must use these abilities to harmonise with others, so as to thrive safely in the social world.

This notion – that we have two parallel systems, one for the non-social thinking involved in dealing with “pressing tasks” and another “default mode” that is inherently angled towards the social – is an interesting one. And it’s one that seems intuitively true, given how incredibly easy it is to slip from concentrated hard work to checking in on what one’s friends (real and “only electronic”) are saying on social media networks and the like. Of course, it doesn’t even take an online connection to feel the gravitational pull of the social (or even that particularly intense form of the social known as the “sexual”) when we are hard at work on something that requires fixed attention. (See, for instance, Inigo Thomas’s recent LRB piece on the British Library pick-up scene – he’s basically written down what everyone chatters about when on the topic of the BL.)

But what I’m wondering about here is not so much the default social mode (which I’m convinced exists) but the other mode, the mode that runs when completing the “pressing task” – the mental state that we are in when we successfully refuse ourselves yet another check of our email accounts or twitter scrolls. I’m working from a review here, so Lieberman might have it completely differently in his book, but isn’t there also a complex “sociality” to the ostensibly non-social forms of concentrated attention?

In my own work, for instance, which consists mainly of writing, preparing to teach, and marking students’ work, there’s always an implicit, virtual conversation going on as I compose or correct with the readers or audiences that I am planning or at least hoping to communicate with. Of course, the forms of work that require my concentration may be more social than what others have to accomplish – an accountant or an engineer isn’t necessarily expecting to deliver the fruits of her spreadsheet calculations or CAD diagrams to a lecture hall full of people. But even then – and even when I have to devote myself to the calculation of student marks or the tedious estimation of MA admissions returns – isn’t there a deeper, more cryptic sociality involved in the completion of these seemingly inherently solitary tasks? After all, without a sense of the students who are awaiting the marks, my boss who expects me to get the numbers right and on time, the university superstructure that expects students to graduate with full transcripts and next year’s seminar rooms to be full of students, would I ever even begin to get these (often excruciatingly boring) things done?

Freud coined a term for the deepest, darkest, and most hardwired of the secret interlocutors that converse (if “converse” is the word, rather than “cajole”, “chastise,” etc) with us as we complete our daily tasks – or engage in any aspect of our quotidian behaviour. He called it the “superego,” that aspect of our psychologies which takes the shape of an internalized, virtual version of authority figures – first parents, later other figures like teachers or religious leaders – and with whom we negotiate constantly. As he has it in The Ego and the Id:

The super-ego retains the character of the father, while the more powerful the Oedipus complex was and the more rapidly it succumbed to repression (under the influence of authority, religious teaching, schooling and reading), the stricter will be the domination of the super-ego over the ego later on—in the form of conscience or perhaps of an unconscious sense of guilt.

In other words, I agree with Lieberman’s thesis about the inherent sociability of human thought and work. But – again conceding that I haven’t read his actual book, only the review – I wonder whether the “default mode” isn’t even more default than he’s making it there. I wonder, in short, if it isn’t when we’re most alone, when we’re as concentrated and “unplugged” as we can be, that the voice of the other – even if it comes from nowhere but within our own minds – shouts at us the loudest.

Written by adswithoutproducts

January 4, 2014 at 12:33 pm

Posted in distraction, sociality

americans in limbo

leave a comment »

Most Americans – me included before I moved here – have a difficult time reading British “class” through accent and its other accoutrements. Sure, there’s My Fair Lady cockneyism on the one side and chinless Royal Familyism on the other, we can detect that, but between lies just a vast undifferentiated middle. Which is of course not how British people hear it, not in the least, as they sniff each other out with the subtle discernment of dogs testing each other’s asses.

But on the other hand: Americans are completely indiscernible to Brits as well. They can’t detect the subtle differences of speech and gesture that mark the well-born or earned-through from the other sorts, and all the complicating and obfuscating play that goes on in between. But whereas Americans default to “rich and polished” when they hear Brits, I think Americans are assigned a lower and more ambiguous place in the eyes of my hosts here. The best analogy I can come up with for where we are placed is the way that Dante handles the virtuous non-Christians in Inferno. Greek philosophers and the like aren’t mixed into the bottom, not quite, but they don’t quite merit the middle berthing either.

They are placed in Limbo, for lack of anywhere else to settle them – technically in the game but ultimately not really.

Written by adswithoutproducts

October 10, 2013 at 11:01 am

forgetting to remember the reminders of old

with 80 comments

The experience of a new sense of paranoia, about our intellectual capacities, our attention spans, our abilities to concentrate, to retain. “I simply don’t seem to have the wherewithal to make it through a long book anymore – twitter’s ruined it all.” “I can remember when I’d simply sit at my desk and will my way to finishing an essay, as an undergraduate, more than a decade ago. But now, there are all of these sites to check, and emails and texts pinging their way into my awareness all of the time, and so…”

And so… one lies in bed at night worrying that the game really is up, what one could once do one can do no more, lost now in the funhouse of the always-on mediasphere. “In or around June 1995 human character changed again,” a recent essay tells us. Another, by a self-proclaimed saint of seriousness, warns us of a coming apocalypse. Reading in bed, yes, it’s true – why can’t I remember what happened in the previous chapter of this history of Byzantium? Why, furthermore, am I still not finished with this history, months after my trip to Istanbul? In the early morning, more panic to ring in the day with worry: will today be like yesterday, and the yesterday before that, where despite my best intentions I still don’t get anything done, instead always taking “five more minutes” to scan the social media screens, to surf around in the flotsam of trivial news?

Between the articles and the personal sense of guilt, then, a creeping sense of despair. Perhaps it’s the personal and intellectual version of what the ancient Romans must have felt about their Greek predecessors. Despite all these resources, all of this wealth and power and worldly awareness, why can’t we get the statues to stand up without props? Why can’t we write an Odyssey or an Oedipus Rex? Where are our Aristotles, our Platos? 

But then this morning a second thought about all of this: Undoubtedly, undoubtedly, all of these new screens and devices, fora and threads, have a major impact on my – and all of our – mental and psychological ecosystems. There’s no doubt either that having the world’s body of information searchable on my desk has made me lazy about retaining information, and the ease of electronic contact has made me less willing and able to do the quiet, self-circumscribed work that I used to do when there simply weren’t many options for finding continual, casual contact with friends and strangers. But…

I am wondering this morning when, exactly, was my worklife not organised around long periods of apathy and distraction, punctuated by sudden rushes of illumination, focus, and productivity? Long before I had a working web browser and wifi setup, that’s for sure. I can’t remember what happens in novels or histories now, sure – but then look back at the notebook after notebook I filled with notes during my undergraduate and graduate years. How much of War and Peace did I really have in hand, despite just having read it, back in 1996? And further, when was it that I didn’t blow off reading interminable critical monographs to read the newspaper, magazines, or whatever was at hand? In short, when wasn’t my internal intellectual life organised in a manner resembling a factory with lazy workers, constantly off for a smoke break or getting distracted in conversation, and with a manager staring down at it all in despair, occasionally shouting at the shiftless individuals to get the hell back to work?

Not sure there’s a wider point to all of this, except perhaps to offer a slight rejoinder to the prophets of social media apocalypse who would tell us that we’re screwed… and who often succeed, as with my night-time worries, in convincing us of this. More than that, I guess I’m trying to remind myself – to remind myself that I’ve always needed reminders, and that if ADHD or dementia there is, growing in my brain and mind, it’s been growing there from the very start.

Written by adswithoutproducts

September 18, 2013 at 10:33 am

Posted in distraction

crime and punishment

leave a comment »

The sentence that is wrong in this otherwise interesting post is this one:

“Perhaps this is our world-historical punishment for the failure of communism.”

Who is the “our” in that sentence? Who is doing the punishing? Who is it that’s concerned, in that sentence, with the failure of communism?

Got to take care with your metaphors, as they’ll trick your political analysis into theology… And theology leads, as it always has, to the worst sort of quietism.

Written by adswithoutproducts

August 17, 2013 at 7:57 pm

Posted in catastrophe

ads without an audience / an audience without ads

leave a comment »

From Choire Sicha’s review of Jaron Lanier’s Who Owns the Future? in the current Bookforum:

Put most simply: “The primary business of digital networking has come to be the creation of ultrasecret mega-dossiers about what others are doing, and using this information to concentrate money and power.” There is, quite literally, no future in this for almost any of us. Apart from this sprawling system of digital vampirism, publishing in general (books and newspapers especially) has taken a major hit from technological change—as did, you know, the lives of people who made cars and worked in offices. (The number of people in the labor force in America has now returned to the levels of the late 1970s, also known as the heyday of postwar economic malaise.) Colleges may very well be next—Harvard Business School professor Clayton Christensen said earlier this year that “higher education is just on the edge of the crevasse.” So might various science industries, or home health care, or international shipping, or taxi drivers, or accountants, or who knows what. We can each in turn go to our deaths giving away our value for some other entity’s benefit while working in industries that are losing their value as well, all for someone else’s disruption game.

The “creation of ultrasecret mega-dossiers” is, of course, as of now generally used to fuel the algorithms that serve us up ads on the websites that we visit to see the very stuff that no one’s being paid to produce anymore. As Sicha notes, this process has put, or threatens to put, us all out of work: the media, artists, writers, and next educators, scientists, health care workers, shippers, taxi drivers, accountants – the list heads off toward encompassing just about everyone employed in post-industrial first world society. (Even the bankers and hedge-funders are turning into machine minders and point-of-sale contact assistants.)

But of course, there’s a massive historical irony haunting this process – one that is both very old and utterly new. The value of these universal archives of marketing-friendly information declines as the general financial welfare of the population declines – it’s useless running ads for those without money to spend, and the system itself threatens to make us all into just that. The situation takes the shape of a national or global version of the Walmart that moves into a rural town, undercuts the local shops, destroys the livelihoods of those that live in town, and eventually is left with no one to sell to and thus closes up shop.

The situation feels ripe for the emergence of some sort of new, post-modern Fordism, where it dawns on the information industries that they themselves need to maintain some sort of consumer base to sell to, just as Henry Ford realised that if his own workers couldn’t buy his cars, there wouldn’t be many customers left over to sell to. If not, it seems we’re tending towards the situation in Alfonso Cuarón’s Children of Men, where the ads somehow keep rolling, even though there’s barely anyone left to view them.


Relatedly, a strange situation is emerging at the intersection of internet technologies and television. See this post by Jessica Lessin, which I found via a David Carr article in the New York Times.

For more than a year, Apple has been seeking rights from cable companies and television networks for a service that would allow users to watch live and on-demand television over an Apple set-top box or TV.

Talks have been slow and proceeding in fits and starts, but things seem to be heating up.

In recent discussions, Apple told media executives it wants to offer a “premium” version of the service that would allow users to skip ads and would compensate television networks for the lost revenue, according to people briefed on the conversations.

Consumers, of course, are already accustomed to fast-forwarding through commercials on their DVRs, and how Apple’s technology differs is unclear.

It is a risky idea. Ad-skipping would disrupt the entrenched system of television ratings—the basis for buying TV ads. In fact, television broadcasters sued Dish Network when it introduced similar technology last year.

On the other hand, it is no secret that fewer and fewer people are watching commercials thanks to DVRs; networks may very well be eager to make, rather than lose, money off the practice.

It is a “risky idea,” for both parties, but what is interesting is that the added loops of the situation bring to the fore some of the strange effects that I’d like to label the “metaphysics of advertising.” (I’m following Marx’s description of the commodity in the section of Capital on commodity fetishism: “A commodity appears, at first sight, a very trivial thing, and easily understood. Its analysis shows that it is, in reality, a very queer thing, abounding in metaphysical subtleties and theological niceties”; italics mine.)

At any rate, in the case of the television negotiations, the advertisement is there at once to bring to market the value of the customers’ eyeballs, but also as a sort of ransom-able distraction from the content itself. I.e., one side pays to put them in, the other side pays to have them taken away. In a sense, we’re drawing close to the situation I blogged about yesterday with what I called the “Sisyphusian capitalism” of Goldman Sachs’s entry into the commodity handling business, where they draw a rent simply by slowing an economic process down. Could one imagine a situation where the advertisements are included solely so that they might be destroyed?

Written by adswithoutproducts

July 30, 2013 at 9:58 am

aluminum can’t: sisyphusian capitalism

with one comment

From the New York Times a few days back:

Hundreds of millions of times a day, thirsty Americans open a can of soda, beer or juice. And every time they do it, they pay a fraction of a penny more because of a shrewd maneuver by Goldman Sachs and other financial players that ultimately costs consumers billions of dollars.

The story of how this works begins in 27 industrial warehouses in the Detroit area where a Goldman subsidiary stores customers’ aluminum. Each day, a fleet of trucks shuffles 1,500-pound bars of the metal among the warehouses. Two or three times a day, sometimes more, the drivers make the same circuits. They load in one warehouse. They unload in another. And then they do it again.

This industrial dance has been choreographed by Goldman to exploit pricing regulations set up by an overseas commodities exchange, an investigation by The New York Times has found. The back-and-forth lengthens the storage time. And that adds many millions a year to the coffers of Goldman, which owns the warehouses and charges rent to store the metal. It also increases prices paid by manufacturers and consumers across the country.
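
The arithmetic of the dance is simple enough to sketch: storage rent accrues per tonne per day, so every extra day the shuffle adds converts directly into revenue for the warehouse owner. A toy model follows; every number in it is an illustrative guess of mine, not a figure from the Times investigation.

```python
# Toy model of the warehouse shuffle: rent is charged per tonne per
# day of storage, so lengthening the storage time converts directly
# into revenue. All numbers below are illustrative assumptions, not
# figures reported by the Times.

tonnes_stored = 1_500_000       # aluminium sitting in the warehouses (assumed)
rent_per_tonne_per_day = 0.48   # dollars per tonne per day (assumed rate)
extra_days = 200                # extra storage the back-and-forth adds (assumed)

extra_rent = tonnes_stored * rent_per_tonne_per_day * extra_days
print(f"Extra rent from the shuffle alone: ${extra_rent:,.0f}")
# -> Extra rent from the shuffle alone: $144,000,000
```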

Thoughts right now about what it would feel like to be one of the drivers of that fleet of trucks. Absolutely meaningless efforts in the service of bending a crimp into the system’s hose, ironically, to keep the profits fluid. Like a not quite as dark version of this:

I avoided a vast artificial hole somebody had been digging on the slope, the purpose of which I found it impossible to divine. It wasn’t a quarry or a sandpit, anyhow. It was just a hole. It might have been connected with the philanthropic desire of giving the criminals something to do. I don’t know. Then I nearly fell into a very narrow ravine, almost no more than a scar in the hillside. I discovered that a lot of imported drainage-pipes for the settlement had been tumbled in there. There wasn’t one that was not broken. It was a wanton smash-up. At last I got under the trees. My purpose was to stroll into the shade for a moment; but no sooner within than it seemed to me I had stepped into the gloomy circle of some Inferno. The rapids were near, and an uninterrupted, uniform, headlong, rushing noise filled the mournful stillness of the grove, where not a breath stirred, not a leaf moved, with a mysterious sound — as though the tearing pace of the launched earth had suddenly become audible.

At any rate, go read the rest…

Written by adswithoutproducts

July 29, 2013 at 3:27 pm

Posted in conrad, inefficiency

notes on the novel, genre, woolwich

leave a comment »

What else does the novel, by the very nature of its elemental form, teach us than that there is some relation, or at least should be, between our internal subjective states and the world in which we move? Foreground / background. Protagonist / context. Romance / history. The family / the city. Wires run from the one to the other, from the outside in and back again. Almost every name of a novelistic subgenre or period movement (realism, naturalism, modernism, postmodernism, to name just a few of the recent ones) names a different mode of wiring. Shifts in genre represent new ideas about how to write the machine. How tangled or untangled it is, how many wires run hither and how many yon, what buttons there are to push to control the voltage and wattage of the link up, how much bandwidth in total is carried.

***

Has there ever been a “terrorist attack” as uncanny as the one that happened yesterday in Woolwich? And uncanny is the right word – utterly familiar (tropes of beheading, tropes of “bringing the fight back to the oppressor,” the visibility of violence) yet at the same time utterly not (the refusal of either escape or self-immolative martyrdom, the implicit invocation of the laws of war when it comes to “innocent bystanders,” the further refusal to “let the event speak for itself,” or be spoken for by leadership organisations far away and ex post facto, or through pre-recorded statements aired after the event, and the immediate extinguishing of the fear of further attacks, at least by the same actors, as per Boston). With this one, we seem to slip from the genre called “terrorism” to something else: a gruesome morality play about the calculus of war, the algebra of carnage. Street theatre allegory that trades the fake blood for the real.

So was it the “genre shift” that explains the strange reactions of the bystanders who observed the attack and its aftermath? Women reportedly ran over, in the course of the attack itself, to attempt to help the dying or dead soldier, thinking that the three actors in this play were rehearsing an all-too-common everyday scene we call “a car accident.” Who was it, and why was it, that someone stayed to film a man whose arms were drenched in blood, who carried a knife and a cleaver in his left hand, while he delivered his final soliloquy? What to make of these recorded conversations between the killers and their audience?

Is there a better answer than that a genre had been disrupted or reinvented, and thus the rules that normally apply (murderers try to escape, bystanders flee, etc.) were unavailable for consultation?

***

Genre is also another name for myth. While it sometimes postures as science, it has far more in common with superstition. Throw salt over your shoulder, and good luck will follow. One character says something; the other, naturally, touches wood. We now, in our pharmacologically-lexiconed period, are far more likely to call superstitious practices the symptoms of Obsessive Compulsive Disorder. One has to check, and check again, that the water’s not running in the bathroom before one leaves the flat. Push hard three times on the front door to make sure it’s locked… or else another storyline will ensue, the one that has an evening return to a gaping door, the laptop gone, the bedroom drawers dumped. This is literally it – some sort of chemical depletion or superfluity occurs, some traumatic event takes place, and then an almost mystical belief in certain irrational storylines takes over. To disobey the mandates of genre is to open oneself to an unhappy ending.

Last night: this news-story. On television and especially on the web. Fraught conversations about the arithmetic of death. And then a phone call. Bad news of the sort that late night phone calls usually bring. The trope of the middle-aged son and the ailing parent. The novel teaches us to think of the one thing as related, however complexly, to the other. At least metaphorically, or even just formally. What is happening out there of course is a prelude to what is about to happen right in here, in the space of the family home and especially the skulls (and bodies) of those that inhabit it.

Think of the script. The call in the night in the movie. The early middle-aged son who ignores the call momentarily, caught up as he is in an argument about the gruesome news on television. The politics of violence, the physics of the world system. The cigarette whose space allows a second thought, a second glance at the mobile phone. Ominous – we can imagine what will happen next. The film that will play out from its start in a graphic sequence of news images morphs into a dark family drama. How does one cope when the worst comes home to roost?

***

A fallacy (a word quite close to “myth” and “superstition”) that doesn’t have a name, one that is hardwired into the DNA of the novel as a form. I’ve tried to name it in things that I’ve written, in seminars that I’ve led. Sometimes it seems to have more to do with temporality. What happens after what, or at the same time as each other. We could call it presumptive fallacy. Retro-prospective fallacy. The fallacy of coincidence. Sometimes it’s simply about the structural mandate that the foreground be read in the light of the background and vice versa. Contextual fallacy? Flaubert, disrupter through over-fulfilment of so many genre mandates, so early in the game, was aware of the problem. Think of Frédéric waiting for Madame Arnoux while the revolution kicks off a few blocks away in L’Éducation sentimentale. The New Critics liked to label fallacies on the part of the reader. I am more interested in the fallacies inherent in artistic forms themselves, even though obviously these can turn into the former and often do through the sort of training that novels provide.

***

But of course, myths are also true in a very serious sense. I don’t simply mean that we are what we believe, that what we think is the only thing there is. Although that may well be true. In this case, it is also useful to think of myth or superstition or even fallacy as a customary practice, a mode of operation, running orders against confusion. The world, as we know, lives out the demands of its many operative genres every single day. Perhaps now as much as ever. A myth is habitus, generated by practice, an operating manual written and re-written each time we act.

The novel makes us stupid in one sense, solipsistic, tends to make us look for our angle on things: what does this mean to us? What were the attackers yesterday, in both their words and their deeds, during and after the attack, trying to say to me? Or at least us? There is a counter-instinct, for those disciplined a certain way, to try to climb up the ladder of transcendent wisdom, to disavow the inwrought narcissism of our conditioned response. To gasp and yell when the news commentators reduce a global to a local question, and a serious question to a matter of insanity or unanchored spite. They might think what they want, but they have no right to act it out here. To force us into these stringent attempts to adjust the genre back to something we’re comfortable with.

But the attempt to climb out of the fray of self-interest, however complex, however Wallace-ianly convoluted and self-reflexive, is of course a trope in yet another sort of story, another sort of myth, one that – we need to remind ourselves – has the deepest affinities with an imperial mindset, one that takes the world panoptically, one for whom impersonality is a transferable skill.

What retards political development – and really contemporary thought as a whole – more right now than an inability to come to terms with the relationship between the self, located wherever it might be, and the world-system as a whole? At least here where we are? What are we, sequestered in the posh uptowns and suburbs of the global system, meant to think or say when we are in the wrong jurisdiction? We know not to fall into the ethical mode, charity is of no use, but there may be an exitless cloverleaf, a highway cul de sac, ahead if…

Despite all the complicities of the novel, these generic demands and the demands of its sub-genres, the promise remains that the bad faith strictures themselves make space for revelatory manipulation, clarifying détournement. They even, potentially, lead us toward the formulation of simpler questions, questions more pressing in their semi-solipsistic simplicity. Like this one, which with a little revision, some shifts in seemingly inevitable consequence, the script I outlined above could be made to ask:

Who has to die in the prime of life, and who is afforded the luxury of death that comes at an actuarially appropriate stage? 

the talking cure: ads that speak back

leave a comment »


I remember, as a kid, having several books of BASIC programs, mostly games, that I could dutifully type into my IBM PC and save on a floppy to play again and again. One of them, which I remember intrigued me at the time, was called Eliza, which, according to Wikipedia, is

a computer program and an early example of primitive natural language processing. ELIZA operated by processing users’ responses to scripts, the most famous of which was DOCTOR, a simulation of a Rogerian psychotherapist. Using almost no information about human thought or emotion, DOCTOR sometimes provided a startlingly human-like interaction. ELIZA was written at MIT by Joseph Weizenbaum between 1964 and 1966.

The outcome would look something like the exchange sketched below.
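
Here is a minimal sketch of the trick, far cruder than Weizenbaum's script-driven original (and in Python rather than the BASIC of those type-in books): it just swaps the speaker's pronouns and wraps the statement back up as a reflective question.

```python
import random

# A crude ELIZA-style responder: swap the speaker's pronouns and
# wrap the statement back up as a reflective question. Weizenbaum's
# DOCTOR script was far more elaborate; this is only the bare trick.

PRONOUN_SWAPS = {
    "i": "you", "me": "you", "my": "your", "am": "are",
    "you": "I", "your": "my",
}

TEMPLATES = [
    "Why do you say that {0}?",
    "Can you elaborate on why {0}?",
    "How do you feel when {0}?",
    "Do you like talking about yourself?",  # the meta-question
]

def reflect(statement: str) -> str:
    """Recast the user's statement from the listener's point of view."""
    words = statement.lower().strip(".!?").split()
    return " ".join(PRONOUN_SWAPS.get(word, word) for word in words)

def respond(statement: str) -> str:
    """Turn any statement into a therapist-ish question."""
    return random.choice(TEMPLATES).format(reflect(statement))

print(respond("I am unhappy with my work"))
# e.g. -> Why do you say that you are unhappy with your work?
```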

The implicit joke at work with Eliza is that one of the easiest conversational models to simulate would be that of the classic psychoanalyst, the content of whose speech is (proverbially) meant to be meaningless. Rather, it’s only the reflective form of the speech (“Can you elaborate on that?” “What do you mean by that?”) that matters. The therapist turns every statement of the patient’s into an opportunity to ask another question – and in particular meta-questions about the meaning of the meaning. (“Do you like talking about yourself?”)

The video at the top of the post is a pitch from a company called Nuance, best known for its Dragon speech-recognition software, to develop voice-recognition and -response driven ads for mobile devices. Of course, most of us are all too familiar with this sort of mechanical conversation from dealing with our banks and utilities, which install robots to hear and respond to us and to find us the department that we need to be in touch with (or, more often it seems to me, to find a way of confusing us away from the proper department).

But what seems to me interesting about all of this is the way that the things I’ve posted above relate to one another. (Of course, there are several famous episodes in the history of advertising that render the relationship between it and psychoanalysis rather clear, starting with the fact that the “father of modern public relations,” Edward Bernays, was Sigmund Freud’s nephew, and certainly wasn’t afraid to bring his uncle’s ideas to bear upon his own work.) Listen to the Nuance video again. It seems to me significant – even (we might almost say) psychoanalytically significant – that in both of the “sample” applications of their software, the computer-driven voice digresses into pseudo-therapeutic responses on its way to the delivery of the product message. In the case of the florist ad, it’s the “Of course you don’t!” as an answer to the revelation of the husband’s ignorance, whereas in the deodorant pitch, it’s the bizarrely chirpy wish-fulfilment about “This is America.”

What else, in the end, is a computer going to do as it makes conversation with us, other than provide a pseudo-therapeutic sounding board? But given that that’s the “selling point” of the technology on offer, it’s almost as if the ultimate point of the ad – the delivery of the commercial message – comes as a non sequitur interruption, rather than the other way around, as is especially clear in the deodorant ad above (“And while you’re at it…”). After all, it’s not the ability to deliver the product pitch that the company is selling as an innovation. It’s the calming but uncanny banter that is meant to disinhibit the potential consumer, to get her or him to “open up” the mind and ultimately the wallet.

It’s hard not to see Nuance’s sample responses, driven we might imagine by conversational algorithms not vastly more complicated than ELIZA’s, as subtly demonstrating the deep formal relationship between the two modes of discourse at play. So, on the one hand, the computerized conversations above gesture towards the sort of content-free “idle talk” that characterises certain versions of therapeutic discourse. On the other – and of more interest to me – the form of the Nuance exercise also reflects the deep affinities of advertising with psychoanalysis, as both quest for a) an understanding of what the patient/customer “really wants” and b) a means to spur the patient/customer on to the fulfilment of that want.

At any rate, as some of you might guess, I am going to try to get some work done this summer on my long-deferred “advertising” project. True to this post, what is of most interest to me is the strange relationship between the functional and aesthetic aspects of advertising – i.e. an analysis of the parts of advertising that aren’t the direct product pitch, the announcement of details / utility / price. In a sense, we’re used to moving in the other direction in our considerations of the “aesthetic.” That is, we find a case of it and then we work to show the material underpinnings of its emergence. Advertising forces us to work the other way around. We take the functionality of the work as primary, but then it’s all of the supplementary, ostensibly not-directly-functional elements that stand as a mysterious remainder. (As you might further guess, the project would be reflexively anti-Adornoian from the start. Or maybe that’s not the right way to put it. It’s bound up with this thinking, but attempts to walk the path in the other direction. Or perhaps in the same direction, but backwards…) We intuitively understand, in other words, that they want to sell us deodorant. The question is why they and we need the conversation with the computer in order for them to do that.

Written by adswithoutproducts

April 29, 2013 at 11:13 am

first thoughts on coetzee’s “the childhood of jesus”

with 2 comments

I was asked by students and others several times last week what I made of Coetzee’s new novel. I’ve been a bit annoyed with myself that I haven’t really had any good answers yet, and have been forced to make the same gestures towards “bafflement” that just about all the reviews I’ve read have made. But I’m starting to think that it’s our bafflement itself that we should be looking into – that there’s more to be made of it than a shoulder-shrug.

Chris Tayler, in his review of the novel in the LRB, gives us a good start at a list of the questions begged but left unanswered in the course of the narrative:

As a reading experience it’s utterly absorbing, with almost painful levels of meta-suspense as you try to work out where the story is aiming to lead you. Questions are as close as Coetzee comes to direct statements, and the novel is richly generative of these. Is the world it depicts an afterlife, a pre-life, a mere stage in an unending transmigration of souls, a realm of ideal images as discussed in Coetzee’s recent essay on Gerald Murnane in the New York Review of Books, or none of the above? How does the Jesus plot fit in with this? How come Inés has access to sausages? Do the deadpan jokes get less frequent or just ascend to a higher sphere?

One of the things that I try to teach my students is to develop a more nuanced take on literary “difficulty.” Most of us, especially when we’re starting out at reading “difficult” books and thus insecure about our ability to understand, let alone interpret, them, take it on instinct that there always is something to figure out in such works. One acquires a “reader’s guide” to Ulysses, one takes up the challenge of the notes at the end of The Waste Land – one struggles to “solve” the riddles of the poems, to understand the allusions, etc. But what if (so I argue in my first-year seminars) we’re meant in dealing with these texts not so much to penetrate the difficult but to have an experience of difficulty’s opacity itself? (My favourite example to use in teaching is the beginning of the second section of The Waste Land, where I think Eliot’s putting us through a sort of routine having to do with the “dissociation of sensibility.” We simply can’t see the image described, and perhaps that’s meant to make us feel our own post-lapsarianness…)

Why does Inés have access to meat – and what is La Residencia in the first place?

It has been a preoccupation of Coetzee’s for quite a while, to tantalise the reader with the sense that there are answers to questions raised by the text, that there is an interrogate-able reality lurking behind the narrative itself, and thus, when the answers fail to arrive, perhaps to push the reader back into an awareness of her or his own need for answers in the first place. (Think for instance of Disgrace, where the reader is left in the same position as David Lurie himself – completely unable to understand the reasons why his daughter Lucy does what she does [or doesn’t do what she doesn’t do] in the wake of her rape.) In this case, why, in the end, are we bothered by Inés’s access to sausages? Why are we worried about the nature of La Residencia? It feels as though, at the beginning of the work, Simón would have asked them too – but by the end of the novel, he’s lost his appetite for questions of this sort – his appetite for questions about appetite and its fulfilment. In other words, the reader’s persistence in wondering falls out of sync with the characters in the text – it’s we readers who remain new arrivals at Novilla.

Likewise with the question “How does the Jesus plot fit in with this?” Not only is the abstraction inherent in this sort of typology or allegorical sense incompatible with the putative Jesus’s incessant refusal of such abstraction, but the question is exactly the sort that Coetzee’s fiction time and again refuses to solve for us – or stages the struggle and failure to solve on the part of his characters. Again, think of Lurie’s attempts to place his daughter into a discernible “category” of rape victim after their attack, or even more pressingly, the efforts of the administrators of the camp that Michael K ends up in at the end of his novel to deduce the “meaning” of this man who has come into their care and custody.

Michaels means something, and the meaning he has is not private to me. If it were, if the origin of this meaning were no more than a lack in myself, a lack, say, of something to believe in, since we all know how difficult it is to satisfy a hunger for belief with the vision of times to come that the war, to say nothing of the camps, presents us with, if it were a mere craving for meaning that sent me to Michaels and his story, if Michaels himself were no more than what he seems to be (what you seem to be), a skin-and-bones man with a crumpled lip (pardon me, I name only the obvious), then I would have every justification for retiring to the toilets behind the jockey’s changing-rooms and locking myself into the last cubicle and putting a bullet through my head.

With just a shift of a few details and a reduction in intensity, this passage from Michael K could stand as a rendition of what I was feeling when asked last week “what the new novel means” and probably isn’t all that far away from the sort of frustration that the reviewers felt as they worked up their pieces for the magazines, or so I guess…

Coetzee is often – with obvious justification – labelled a “meta-fictional” writer: his works build on and distort previous literary works, or are “about” the act of writing itself. But they are also books that generate – or should generate – a sort of “meta-reading.” Just as the writer is writing about writing, when we read them, we are reading about reading. Or at least that seems to be the point. Were a new (or even the first) messiah to arrive on earth, would we be so concerned with his meaning and relation to precedent, his conformity or lack of conformity to the models that we would impose, that we would fail to listen to him right from the start? With our inherited instrumental logics and instinct for abstract categorization, our need to extract reified meanings from things, would we be able to read him at all?

Written by adswithoutproducts

March 17, 2013 at 12:30 pm

a belated new year’s message

leave a comment »

2012 is over. Now back to the regularly scheduled programming.

Written by adswithoutproducts

January 11, 2013 at 9:47 am

Posted in dystopia

emotional unemployment

with 2 comments

Fredric Jameson in the LRB on Hemingway and Carver: 

This is not to suggest that minimalism finds its realisation in the repudiation of the category of expression as such. On the contrary, the inaugural model of minimalism, Ernest Hemingway, simply opened up another alternative path to expression, one characterised by the radical exclusion of rhetoric and theatricality, for which, however, that very exclusion and its tense silences and omissions were precisely the technique for conveying heightened emotional intensity (particularly in the marital situation). Hemingway’s avatar, Raymond Carver, then learned to mobilise the minimalist technique of ‘leaving out’ in the service of a rather different and more specifically American sense of desolation and depression – of emotional unemployment, so to speak.

Interesting thought: that the outrolling of literary history and influence reveals that the apophatic isn’t just “mentioning by not mentioning” but in the long run is an index of the fact that there was nothing to mention in the first place. Carver takes up a style that is meant to suggest depths by remaining on the surface only to realise that the depths are only ever surface. The ineffable shifts from what can’t be said to what’s not there to be said in the first place. Or even that the adoption of minimalism leads fiction into a perversely Pascalian situation: minimalise, delete your words, and you will believe that there was nothing to delete in the first place.


Written by adswithoutproducts

January 10, 2013 at 1:42 pm

Posted in fiction, jameson

march of the headless women / fictionality, character identification, and whateverness

with 8 comments

Interesting synchronicity. The other day I was in a Waterstones and was stunned yet again at the fact that the “headless women” book covers are still proliferating. What are the “headless women” book covers? Well, take a look here or here or here. Or take a look at this one, which happened to be on display on the 3-for-2 rack at the Waterstones in question, and which was written by an author I’ve met a few times.

It’s pretty obvious what’s interesting / discomforting / grating about the proliferation of covers of this sort. Implicit in their ubiquity is a sense on publishers’ parts that female readers, when choosing a novel, want to be able to project themselves into the work, to occupy the place of the female protagonist. If the person pictured on the cover of the book were to possess a head, and in particular a face, this would somehow block the ability for them to do so: But I don’t have red hair! But my eyes aren’t that colour! My cheekbones aren’t at all like that! It’s notable that works aimed at male audiences don’t take the same tack – often forgoing the depiction of people on the cover altogether.

Pretty condescending, isn’t it? Unfortunately one has a sense that the publishers know what works, and wouldn’t be doing this if it didn’t work to some degree. I’ve seen an argument on twitter – now lost to us, as it was months ago – in which a PR person for a publisher responded to criticism of the practice with something like “I know, I know – it’s awful. But what do you want us to do about it? The books won’t move off the shelves if we don’t.”

Depressing. But here’s the interesting part. It just so happens that I had assigned – and had to prepare to teach early this week – a fantastic essay by Catherine Gallagher called “The Rise of Fictionality,” which was published in Franco Moretti’s magisterial anthology on the novel. (Luckily for you – and for me as I rushed to get the students a copy of it – PUP has the essay on-line here.) The essay is a vivid and succinct historicization of the emergence of fiction as a category in eighteenth-century Britain, a category born out of divergence both from “factual” writing and (and here’s where the brilliance of the piece truly lies) “fantastical” writing as well.

I won’t go into all the nuances of the argument here – do yourself a favour and read the piece. But here are a few paragraphs that seem especially relevant to the acephalous women of Waterstones:

That apparent paradox—that readers attach themselves to characters because of, not despite, their fictionality—was acknowledged and discussed by eighteenth-century writers. As I have already mentioned, they noticed that the fictional framework established a protected affective enclosure that encouraged risk-free emotional investment. Fictional characters, moreover, were thought to be easier to sympathize or identify with than most real people. Although readers were often called to be privileged and superior witnesses of protagonists’ follies, they were also expected to imagine themselves as the characters. “All joy or sorrow for the happiness or calamities of others,” Samuel Johnson explained, “is produced by an act of the imagination, that realizes the event however fictitious . . . by placing us, for a time, in the condition of him whose fortune we contemplate” (Johnson 1750). What seemed to make novelistic “others” outstanding candidates for such realizations was the fact that, especially in contradistinction to the figures who pointedly referred to actual individuals, they were enticingly unoccupied. Because they were haunted by no shadow of another person who might take priority over the reader as a “real” referent, anyone might appropriate them. No reader would have to grapple with the knowledge of some real-world double or contract an accidental feeling about any actual person by making the temporary identification. Moreover, unlike the personae of tragedy or legend, novelistic characters tended to be commoners, who would fall beneath the notice of history proper, and so they tended to carry little extratextual baggage. As we have noticed, they did carry the burden of the type, what Henry Fielding called the “species,” which he thought was a turntable for aiming reference back at the reader; a fictional “he” or “she” should really be taken to mean “you.” But in the case of many novel characters, even the “type” was generally minimized by the requirement that the character escape from the categorical in the process of individuation. The fact that “le personnage . . . n’est personne” was thought to be precisely what made him or her magnetic.

Some recent critics are reviving this understanding and venturing to propose that we, like our eighteenth-century predecessors, feel things for characters not despite our awareness of their fictionality but because of it. Consequently, we cannot be dissuaded from identifying with them by reminders of their nonexistence. We have plenty of those, and they configure our emotional responses in ways unique to fiction, but they do not diminish our feeling. We already know, moreover, that all of our fictional emotions are by their nature excessive because they are emotions about nobody, and yet the knowledge does not reform us. Our imagination of characters is, in this sense, absurd and (perhaps) legitimately embarrassing, but it is also constitutive of the genre, and it requires more explanation than the eighteenth-century commentators were able to provide.

That is to say, the “headlessness” of the fictional character, their availability to us because they are unblocked by connection to a “real person” and thus readily available for readerly identification, may be “absurd and (perhaps) legitimately embarrassing,” as are the images on the covers in the bookshop, but it is also one of the things that makes fiction what it is, and is what accounts for the special mental and emotional states that we experience as we read them. But to take this a step further (and here I am drawing out some of Gallagher’s arguments and taking them in a slightly different direction), it’s possible that reflections of Gallagher’s sort (and even the instinct catered to by the contemporary covers) point us to a different sensibility about the ideology of fiction.

In short, we are made anxious about the protagonism of fiction, the structural mandate that it forces or soothes us into identification with the autonomous or semi-autonomous individual as such, that it serves as an advertisement for intricate interiority and in so doing may urge us away from the consideration of the exterior. But if it is the case that the fictionality of the fictional character is grounded on a certain availability, a certain openness, even a certain whateverness, we might be licensed to think that the ideological underpinnings of fiction are far more complex than conventional (literary Marxist) wisdom suggests. Rather than a cult of personality, fiction, at base, might start to seem a space for the emergence of impersonality – and rather than simply markers of readerly solipsism and commercial cynicism, the book covers above might suggest a nascently radical instinct lurking just below the surface of the Waterstones transaction.

Written by adswithoutproducts

January 10, 2013 at 12:49 pm

Posted in agamben, fiction, novel, whatever