"Poppycock!" said Gertrude Stein.
With this unusually lucid and brief remark the writer who has grown famous for her "a rose is a rose is a rose" style dismissed recent efforts of scientists to explain her work.
"Poppycock is poppycock is science is poppycock," Miss Stein might have been expected to say. But she did not, according to her report. For once, she failed to repeat herself or to bewilder her hearers.
The scientific explanation is that her writing is done with her wrist and not with her mind. Automatic writing is the scientific term for it. Miss Stein not only disagrees, but takes the view that her writing does not need explaining.
If you have seen her play, "Four Saints in Three Acts," or have read any of her other strange writings, you probably feel that she needs as much explaining as that other famous "stein"—Einstein—who also always draws a capacity crowd but whom hardly anyone in the audience understands.
—Jane Stafford, "Gertrude Stein Explained," Science News-Letter, 2 March 1935
Sunday, September 25, 2011
It's not "the job market"; it's the profession (and it's your problem too)
I enjoyed Kathleen Fitzpatrick's recent piece in the Chronicle* on risk-taking and the responsibility of mentors to back up those junior scholars who are doing nontraditional work. The piece's key insight is that it's one thing to urge people to "innovate" and quite another to create the institutional frameworks that make innovation not only possible but consequential.**
Kathleen's observation comports with some ideas that have been floating around in my head lately, especially around "digital humanities." I and my Fox Center colleague Bart Brinkman were recently called upon to define digital humanities for the other fellows in residence, and in the process of talking it over with Bart, and during the discussion at the CHI, I've come to realize that I have some real pet peeves around the notion of the "job market" that come into relief specifically around the field of digital humanities.
It boils down to this: peeps, we're all connected.
The recent rise to prominence of digital humanities is indistinguishable from its new importance in "the job market" (I insist on those scare quotes); after all, digital humanities and its predecessor, humanities computing, have been active fields for decades. What's happening now is that they are institutionalizing in new ways. So when we talk about "digital humanities and the 'job market,'" we are not just talking about a young scholar's problem (or opportunity, depending on how you see it). We are talking about a shift in the institutional structures of the profession. And, senior scholars, this is not something that is happening to you. You are, after all, the ones on the hiring and t&p committees. It is a thing you are making—through choices that you make, and through choices that you decline to make.
There's something a little strange about the way that digital humanities gets promoted from the top down; it gets a lot of buzz in the New York Times; it's well known as dean-candy and so gets tacked onto requests for hires; digital humanities grant money seems to pour in (thanks, NEH!) even as philosophy departments across the country are getting shut down; university libraries start up initiatives to promote digital humanities among their faculty. I am waiting for the day when administrators and librarians descend upon the natural sciences faculty to promote history of science. No, I really am.
So it seems quite natural that there should be wariness and resistance to the growing presence of digital humanities. Perhaps there is some bitterness that you might get your new Americanist only on condition that her work involves a Google Maps mashup, because it was easy to persuade people that your department needed a new "digital humanist," whatever the hell that is, and it was not easy to persuade people that you needed somebody to teach Faulkner.
The situation is not improved by the confrontational attitudes of certain factions of the digital humanities establishment (such as it is), which are occasionally prone to snotty comments about how innovative DH is and how tired and intellectually bankrupt everybody else's work is. (Not so often, I find—but even a little is enough to be a problem.) Under those circumstances, DH seems clubby and not liberating; not a way of advocating the humanities but an attack on it, and specifically on the worth of that Faulkner seminar that you teach, and that non-digital research that you do. Why, an established scholar might reasonably ask, should I even deal with this "digital humanities" nonsense? Shouldn't I just keep teaching my Faulkner seminar, because somebody ought to do it, for Christ's sake?
Well, whatever else DH is, it is highly political, and it has political consequences. So, in short, no.
I'm persuaded that the widespread appeal of DH has much to do with the leveling fantasy it offers, a fantasy of meritocracy that is increasingly belied elsewhere in the professional humanities. As Tom Scheinfeldt points out in his useful "Stuff Digital Humanists Like,"
Innovation in digital humanities frequently comes from the edges of the scholarly community rather than from its center—small institutions and even individual actors with few resources are able to make important innovations. Institutions like George Mason, the University of Mary Washington, and CUNY and their staff members play totally out-sized roles in digital humanities when compared to their roles in higher ed more generally, and the community of digital humanities makes room for and values these contributions from the nodes.

This is true. Those involved in digital humanities have also seen the ways that THATCamps, blogs, and Twitter allow junior scholars and scholars at non-R1 institutions to cut geodesics across the profession, allowing them to spread their ideas, collaborate, and achieve a certain prominence that would have been impossible through traditional channels. I'm convinced that real possibilities lie here.
And as traditional scholarly publishing becomes more and more constricted and humanities department budgets are slashed, the fiction of academic meritocracy becomes harder and harder to sustain. Perhaps on the web, we think, through lean DIY publishing and postprint review, meritocracy (or its semblance) can return to the academy. It seems at once a way forward and a way to return to a (fabled) time when people cared about scholarship for the sake of scholarship—not because they needed X number of well-placed articles or a line on the cv or a connection at Y institution without which their careers would disappear. Perhaps DH offers us a way out of the increasingly rationalized death-spiral of "impact scores" and credential inflation. Perhaps it will let us out-quantify the quantifiers, or sidestep them altogether.
Of course, the web always comes with liberatory rhetoric that usually turns out to mean little more than "what the market will bear," and the ostensible meritocracy of digital humanities in the present moment is really no more than a misalignment between its alternative (and potentially even more aggressively capitalistic) value systems and those of the institutionalized humanities more generally. It can be disturbingly easy for the genuinely progressive intentions of digital humanists to become assimilated to the vague libertarianisms of "information wants to be free" and "DIY U," and from there to Google Books and charter schools and the privatization of knowledge—an enclosure of the digital commons ironically in the name of openness. At the same time, the naming of the "alt-ac" "track" (it is generally not a track, of course, by definition) seems to provide new opportunities for young scholars even as it raises research expectations for staff and requires those on the "track" to subordinate their research interests to those of the institutional structure that employs them. Digital forms are exceptionally good at occluding labor. How to navigate those waters thoughtfully—to realize the real promise of DH—is a question to which we must all apply ourselves.
So you see what I mean when I say that "digital humanities and 'the job market'" as it now manifests isn't a narrow, merely administrative sliver of life of interest solely to junior academics who are still gravely listening to advice about how to "tailor" the teaching paragraphs in their cover letters. Digital humanities has become important to "the job market" exactly insofar as it is causing major shifts in the institutions of the profession. These shifts are political. And if you are in my profession, then they are your concern.
*I know, "enjoyed" and "Chronicle" in one sentence... mirabile dictu.
**As we all know, I have a complex relationship with the word "innovation" and do not consider it an unqualified good, nor a transhistorical value. For today, however, we will leave that particular word a black box.
Thanks to Bart and Colleen for sitting through a less-worked-out live version of this rant last week.
Friday, September 23, 2011
Saying hey
Via Ben Friedlander on Twitter, I was recently treated to Anne Boyer's "Damnatio Memoriae." Please read it—it's short and brilliant.
As Miriam Posner points out, it's oddly moving, not in spite of its repetition of the hilariously banal phrase "saying hey," but because of it. It serves as a subversively anti-dramatic counterpoint to the question, "Can the subaltern speak?" After all, here they are, saying hey. Only they're saying hey across the centuries, across the continents, "across deep time." These mysteries of the low register are the genius of flarf and the reason it's poetry, even if it's irritating poetry. Why does lameness sometimes flare out in the form of awesomeness?
[Better than Storify? Worse? I sometimes think Twitter conversations amount to more than the sum of their parts, but it can be difficult to render them usefully on a blog.]
Beddini (reading a telegram): 'Come ahead stop stop being a sap stop you can even bring Alberto stop my husband is stopping at your hotel stop when do you start stop.' I cannot understand who wrote this.
Dale: Sounds like Gertrude Stein.
—Top Hat (1935)
Thursday, September 22, 2011
Etta Cone offered to typewrite Three Lives and she began. Baltimore is famous for the delicate sensibilities and conscientiousness of its inhabitants. It suddenly occurred to Gertrude Stein that she had not told Etta Cone to read the manuscript before beginning to typewrite it. She went to see her and there indeed was Etta Cone faithfully copying the manuscript letter by letter so that she might not by any indiscretion become conscious of the meaning. (713)
—The Autobiography of Alice B. Toklas

All you need to know, really.
Friday, September 16, 2011
Consensus and knowledge according to Colbert
I wonder what folks would think of teaching this Stephen Colbert clip (September 14, 2011) alongside Leviathan and the Air-Pump or Laboratory Life.
This clip brings the issues at stake in the notion of scientific consensus into rather stark relief, reflecting as it does on current public health policy. It also puts a brake on any too-quick readings of science studies that might construe the political nature of expertise as a debunking of expertise.
In the clip, Colbert mocks Rep. Michele Bachmann for presenting as truth an unnamed stranger's claim that the HPV vaccine (Gardasil) caused cognitive dysfunction in her daughter. The segment is funny, but it's also uncomfortable when we see how very flatly Colbert pits "the entire medical establishment" against "some lady." It's not a joke about method; it's a joke about authority, and who doesn't have it. Bachmann doesn't have very many people on her "team."
The clip forces us to confront the substantiveness of expertise as well as its political nature—its reliance on modest witnesses, on trustworthiness. Bachmann's statement genuinely doesn't hold up; it's about as epistemologically unsound a way to establish fact as we can imagine—it's no more than hearsay. But the reason it's hearsay to begin with is that we know so little about this woman or her daughter, about her methods, about her discernment. We don't have enough of those markers of trustworthiness.
Colbert is interesting when it comes to issues of consensus and knowledge. I've taught Colbert's segment on "Wikiality" before in the context of a media studies unit on wikis and citation. In it, Colbert pushes an extreme relativism that the bit is supposed to mock; the idea (contrary to the suggestion in the more recent clip about Michele Bachmann) is that reality is not determined by consensus, and a wiki encyclopedia is therefore an epistemologically untenable free-for-all.
That Colbert fans rather persistently vandalized the "elephant" entry on Wikipedia just to prove his point shows both Wikipedia's limitations and its relative strength: most of the time such things don't happen on Wikipedia. Colbert's overstatement of the consensus narrative led most of my students to come to see consensus as a potentially epistemologically strong method, under some circumstances, i.e. more than a mere convention. More practically, it led many of them to understand Wikipedia as a tenable project—without, however, losing sight of its limitations. It made for a very productive discussion, and I suspect the more recent clip would too.
Sunday, September 4, 2011
Modern Female Automatisms
I'm not teaching this semester, but my book list for next semester is due exceedingly soon. I think it'll have to be one of those late-nite activities, since looking up ISBNs doesn't take a lot of brain. ("Night," when preceded by "late-," is properly spelled "nite." True facts.)
Lately I've done a poor job of articulating the course's interest and importance, mostly because I haven't been in the teaching zone, but it's about gender and the discourses of automatism circa 1900, and is in some degree related to the talk I'll be giving at MSA next month on Stein and repetition. Repetition structures normality and (as a "compulsion") pathology, habit and obsession; it's evidence of mechanicity and, in its ability to provoke laughter, also a site of evidence of the human. Butler brilliantly makes repetition the scene of gender.
We'll read/watch some of the classic Lady Robots texts of the Gilded Age and early C20—L'Ève future, Metropolis, "In the Cage," "Melanctha." We'll also look at some contemporary nonfiction theories of mechanicity and gender, like Otto Weininger's theory of variability, the biometrics of Lombroso and Bertillon, and of course Freud, contextualizing them in more recent work by Haraway, Oreskes, Kittler, Hayles, and Fleissner. I had sort of a lovely (that is, entertaining) Twitter conversation with Chris Forster, Jentery Sayers, and Stephen Ross (probably among others) a week or two ago about modernist humor and the role of gender in Michael North's Machine-Age Comedy, which is one of the problems I intend for the class to investigate.
Roughly, the course will use the rubric of "automatism" to look at female labor; the gendering of humor; affect and the human; objectivity and knowledge; psychopathology c. 1900; and biological determinisms.
Needless to say, I'm still in that grandiose, overly ambitious phase of syllabus-planning. I haven't done all the necessary cutting down, which will have to happen soon. I'm also contemplating some sort of introspective exercise (observing one's repetitions, or the like) that I haven't quite worked out yet. Suggestions welcome.