Bruder & Rudmann
  • A Poetics of Infrastructural Salvage

    Presentation draft delivered at Oxford University on 4 March 2026

    I. Frame the Problem

    One way of framing the current poly-crises of academia is as a significant and structural failure of imagination. A failure of imagination is found in reducing the university to a space for workforce preparedness, thereby allowing for uncritical technological adoption - as if these technologies were inevitably beyond question, and as if the university sat somewhere neutral and apolitical, with a mere function of degree conferral. A failure of imagination is present in an inability to express, or even gesture toward, the value and import of a space that brings people together for the purpose of sharing knowledge.

    A failure of imagination is a failure in articulating a vision for the present and future of the university.

    And so instead we get management and surveillance and seek to flatten the contributions of the people who make up a university into metrics compressed into spreadsheets for accounting.

    Datafication willfully unsees every moment of understanding that happens in this place - because it happens between people and on small scales. And once we stop regarding people, once their unique human-ness has no spot on our charts, we stop hearing them, we lose their presence, and they become disposable.

    If a failure of imagination is a failure to envision, then it is also a loss of belief. In one another and in the future. And if we follow that road, where we no longer trust our neighbors and colleagues and we no longer trust in the possibility of a better day to come, all that is left to do, if you find yourself in the right position, is to extract, get what you can, and get out. And we can clearly witness the attendant effects of a looting of the university: punching down, the infantilization of students, massive work-related burnout, the loss of autonomy and diversity.

    But we know better.

    I want to tell you about two experiments that we are undertaking in and around Utrecht University that seek to re-centre people as the heart of the University. These might be understood as commoning or collectivizing, they might be building communities - but what they also are, implicitly, is critique of the failure of imagination that pervades scholarship.

    And I want to make the case that this, in the end, can be the purpose of Open Science. Not just a movement or a practice, or an alternative set of metrics, or a declaration to be signed. Open Science, or Open Scholarship is a lens of critique on the institution. A way to build a form of understanding, a poetics, toward a university that we imagine together, everyone.

    II. Cooperative Open University Publishing

    Let’s start with publishing, let’s wag the dog. We know academic publishing impacts every aspect of scholarly culture - there’s a reason Open Access serves as the precursor to Open Scholarship. Communicating research is a consequential act of making known and useful and accessible and actionable.

    The way we share information has profound implications for the way we form a society. A university is one of the critical conduits for information sharing - along with media and government - to sanctify meaning and establish a community based on trust. But we messed up.

    We alienated ourselves from the process. We let in other parties, who had different needs and understandings of the world, to manage our publishing work. And so we end up with organisations that seek to frack every aspect of the research communication process - that is, to extract a profit from every conceivable nook and cranny within publishing - promoting an unknowable glut of writings without much accountability, becoming what Sarah Lamdan calls Data Cartels, and more recently leveraging AI not really as a technology but as what Sonja Drimmer calls a permission structure for extractive modes of profit. This is all to say: we lost trust.

    And so, a few years back, we developed something of an organisational vision.

    I had just come from Austin, Texas, where communities of artists work together to realize each other’s projects. You play drums on my record, I’ll play guitar on your record, or I’ll do the mastering for you if you help me record or illustrate the album art, and we’ll help each other book some shows, and so on.

    And I was now working at Copim, where I was tasked with developing business models that could better support scholar-led publishing. And a couple of us - Ellie Gerakopolu and myself - settled on the idea of cooperatives: that we could self-organize into collectives to share labor and realize one another’s projects. This meant mapping out and reintegrating all of the labor that went into scholarly publishing; it meant making space for forms and work we haven’t yet anticipated. So we wrote and shared information about this emerging idea.

    A couple of years later, I realize a bit of branding and form for this model. I call it Cooperative Open University Publishing, and I make an argument that the time for implementation has arrived, due to: 1) The maturity of our digital infrastructures - there are a lot of tools out there to publish scholarship, and they’re quite good! 2) Widespread literacy of the publishing process and its ethical implications - people just understand what it means to publish with certain entities and in certain venues, in a way that was just not present 10 years ago when I started working in Open Access. And, as a result of the first two: 3) New forms and content for scholarly publishing - we’re willing to try new things, to accept new things, in the midst of a collapsing system and new potentialities. Blog post? Scholarship. Podcast? Scholarship. Multimodal installation? Scholarship. And, as a librarian, because our capacity for metadata and persistent identifiers was emerging and flexible enough to recognise these works, the barriers to entry had changed. The gatekeeping was changing. And that leads to 4) An emerging landscape of people at knowledge organisations who could cooperatively undertake the work of publishing. In the name of Open Science, Dutch universities brought in data stewards and community managers and research engineers - all of them people with the skills and capacity to open up the way we make public and communicate the work going on within our institutions.

    After a year or so of giving talks on this model - arguing for its efficacy and immediacy - an opportunity arises to put it into practice. There is a wide understanding of Dutch academia as being at the forefront of Open Science. How this happened is that, back in 2017 or so, groups of researchers got together to talk about, and help one another figure out, this new thing called Open Science. Those gatherings grew and organised into Open Science Communities, which popped up across the country and are now international. The researchers did not organise by institution but by region, so OSCU contains Utrecht University but also the school of applied sciences and the university for humanistic studies and the design school and the theological seminary. And by congregating in that way, researchers began to tell the university how they wanted to work and to find pathways to reorganise. And the university administrations said, “oh wow, great, let’s bring this in and make it structural!”

    But once Open Science becomes the University’s, the OSCs lose a bit of their air. And so what to do to make these communities empowered once again? For the past year we have been working on the social and technical infrastructure for OSCU Publishing. We wrote something of an introduction or manifesto to the project and went around to different faculties and institutions explaining the vision to researchers.

    And just about a year ago, Julien Tacquet completed work on a publishing environment for the Louvre. I find it quite beautifully designed, lightweight, and flexible. And as part of that contract, the publishing system, which is called Velour, is made free and open source. So, now, Julien is working with us to implement an instance of Velour for OSCU Publishing.

    We have our first book in process, a collection of texts addressing, of course, the question of “What is Open Science, really?” And as people begin to see the possibilities of this model, it activates their imagination. And suddenly we have proposals for a number of new works. Their ability to be published will not be based on whether their idea passes someone’s gatekeeping standards, but rather on whether or not the idea’s originators can organise the labour and get enough collective participation from their colleagues to see the project through.

    And so we mean both to publish interesting and collectively driven works and to help people realize that they have the ability and capacity within their institutions to publish. We can regain not only our shared understanding, but also bring about what Sam Moore and Janneke Adema discuss as academic citizenship - building stronger bonds amongst each other and our publics.

    III. Library School

    Library School is a story about what is possible when you build trust within your organisation.

    One of the greatest things that Utrecht University has done in recent years - partly in the name of Open Science - is eliminate the distinction between scholarly staff and support staff. Realizing that we all have a critical role to play in the production of knowledge, and realizing that structural hierarchies can and do inhibit that production, Utrecht University released a new policy naming all of those people as colleagues on equal footing.

    It’s great news for the library, where people like me sit. I hold a PhD from the University of Texas at Austin in language and literature, and the whole of my training and experience in academia comes to bear upon my work in the library. So there is now a framework for my contribution. My colleague and co-contributor on this presentation, Anton Bruder, holds a PhD from Cambridge in Renaissance book history. And there are indeed many people at Utrecht with diverse experiences and expertise relevant to library work who can find a home in this place.

    But the movement from policy to practice takes effort and intention.

    And so, recognizing the challenges around us - those I alluded to at the start - my colleague and I began having discussions about the state of the university and what it could be. Imagining, that is. Imagining how we share information, how we conceive of the university in terms of emerging technologies, imagining new ways to articulate the import and weight of the library.

    And as our conversations progressed we thought, well more people should be involved in these conversations and we should have them on a regular basis. So we sketched out a few plans, and we relied on design rather than explanation. We shared an image or two inviting people to join us without further explanation. We wanted to leave plenty of space for people to see themselves in this. We began sharing these images over the summer, to let the imagination wander. And not just to library staff, but to researchers and students and academics in Amsterdam and folks we know who live in Utrecht and have nothing to do with education. “All are welcome,” is our consistent refrain.

    And on a Thursday in the second week of September, to a room filled with people, some of whom brought baked goods, we met to discuss metaphors for the library. And we have met roughly every two weeks since in one of our two library locations.

    And we have a class. We have a topic, loosely construed, sometimes a reading to pull things in a certain direction, but the gravity often shifts, and we continue to leave enough space for people to talk about whatever they want, with a trust in one another and in ourselves, that we can come away with greater understanding.

    Sometimes the room is full, sometimes no one shows up, and a lot of the time we find ourselves talking about AI - not because we find it a useful technology, but because conversations around AI call us to reconsider every aspect of education and research. We maintain the space, and we would be having these conversations anyway, but we find a way to imbue them with new meaning.

    But as we continue with these experiments, something funny happens. We have a seminar, an idea for publishing; then we start to think about our discussions and other forms of communication and publishing. Then we start to think about bringing in people who might be further afield, and how we can create a communication structure to invite further participation. And before you know it you have the beginnings of a journal, maybe a conference, other forms of expression, like a radio show - and you start to turn to your colleagues and neighbors and say, like Paul Simon once did, “hey, you know, that’s quite astute, why don’t we get together and call ourselves an institute.”

    IV. A Poetics for Infrastructural Salvage

    In the face of a structural failure of imagination, I say we are called to think more expansively, to imagine more ways of salvaging the incredible social, physical, and digital infrastructures that we have at our institutions. Truly, we at Utrecht can look around and feel overwhelmed by how brilliant and thoughtful everyone is; we hold the responsibility of hundreds of years of accumulated knowledge in our physical spaces; and we have these ridiculous machines that allow us to share knowledge in a way that was never before possible in the whole of human history. And all three of those things need to be opened up, made more free. If we can empower them with autonomy and self-confidence - think of how vibrant and powerful our organisations might be.

    Stuart Lawson wrote that the purpose of a university is emancipation.

    If theirs is a failure of imagination, then we are compelled to imagine more, to take a long view, to envision entire new systems - salvaging the infrastructures that begin to point the way. And if we can have these visions and articulate them, we get to tell a different story about the University. And if we start to produce a narrative of the University, we get to rethink its culture and operations, and, ultimately, our institutional governance.

    We are all familiar with that triangle for culture change within institutions that is rolled out at every conference. I want to suggest strongly here a structural flaw in the design of that triangle. Somewhere between “Make it Easy” and “Make it Required” we missed the crucial step that determines whether the change we wish to develop is significant, lasting, and inclusive: “Make it Meaningful.” We build new infrastructures for people to see themselves in, to spark imaginations, and collectively set out to fulfill those visions.

    A university is an unceasing, ever-expanding event of learning. But we are up against compression - a lack of vision and a deficiency in narrative that says, “well, the University can’t be all of this - it must be efficient and future-proof.”

    We don’t need to accept that. There are opportunities to make our own choices.

    → 3:38 PM, Mar 9
  • First they came for the Palestinians and I didn’t speak out because, well, y’know, something about terrorists. Then they came for the students speaking out for the Palestinians and I didn’t speak out because fuck them kids. Then they came for the teachers of those students speaking out for the Palestinians and I didn’t speak out because universities are elitist institutions and those professors are elitist Marxist leftists or something. And then they came for the people of Minneapolis and I realized we’re all fucked.

    → 9:07 AM, Jan 16
  • Failure of Imagination - fragment I

    An ability to accept our present condition comes with a requirement to remove a creative part of our selves. The loss of our ability to create our conditions.

    One of the most pernicious and harming choruses we repeat lately: “it is what it is.” This is an utterance of failure. A repeated failure to grasp what it is. We fail to see, to feel, to understand, to imagine. It is more than that.

    We accept mistreatment, a lack of justice, corrupt institutions, that someone should go hungry, war. We think of things as inevitable because we lack the vision for an alternative. So many of the problems in our present moment are caused by a failure of imagination.

    → 8:15 AM, Jan 4
  • Data and Detail – Some Thoughts

    Cultural theorist and historian Aby Warburg (1866-1929) used to say that God is in the detail: der liebe Gott steckt im Detail. This phrase expressed a sort of methodological credo for Warburg and the school of cultural historians which formed in the library he founded: namely, that it is through the attention to, identification and observation of the telling detail that truths about human history may be uncovered. For the Warburg School, human history and the history of culture were almost interchangeable concepts. According to the Warburgian philosopher Ernst Cassirer, “culture” could be understood as the home the human mind makes in nature, and thus the history of culture (which is a history of forms) is the history of humanity’s homemaking in the universe.

    This philosophy of detail, however, was not an idiosyncratic insight unique to Warburg or even to his school. Rather, it is itself a detail out of which a culture and a history may be extrapolated. What we may term ‘detail orientation’ is a (perhaps the) key characteristic of humanist culture.

    Humanism in this sense is a culture centered on the study and practice of the so-called language arts of grammar, rhetoric and logic. Grammar is the fundamental science for humanism: the study of the parts and inner workings of human language. In the Western tradition, the “dead” languages of Greek and Latin take precedence because on the one hand their grammar was thoroughly theorized in Antiquity, and on the other they provide in their fossilized state a perennial and for all intents and purposes an unchanging common ground (an essential prerequisite for any shared concept of truth). In rhetoric (the science of persuasive language) we find ultimately the roots of all the genres we associate with literature in its broadest sense: language intentionally wrought, whether into a speech, a poem, a detective novel, a work of historiography, a sensationalist newspaper article, and so on. Logic in turn is the science of meaning, or of meaningful propositions, and the ways in which these may be combined and elaborated.

    An intellectual praxis which weaves and blends all three together with a historical consciousness of change can be described as philological. Detail orientation is preeminently modelled for us in philology, in which tiny details of style, vocabulary, spelling, and even sometimes of the ways individual letters are drawn, help the reader place a text in its historical cultural context. Doing so is the first step towards unlocking the full potential of a textual artefact to mean. Historicizing a text does not preclude its capacity to speak to us in the present; rather, it is like replacing a monophonic recording with stereo sound. Philology thus understood appears to have been practised in Classical Antiquity, but was forgotten during the Middle Ages. The culture of the Middle Ages, so thoroughly Christian-centric, may oddly enough be described as humanist in that of preeminent importance was the textual corpus consisting of Scripture and the many works of commentary on it. But because the Biblical text was not historicized (in other words, it was believed to be transcendental and eternally and universally valid), the medievals had no use for philology. With the rediscovery of the Latin classics in the Renaissance, however, such precise attention to language came to be viewed as necessary again, initially in order to explain the striking discrepancy between the eloquence of Ciceronian Latin on the one hand and the jargon of Scholastic Latin on the other.

    The story of Renaissance philology and the humanist culture it fed may indeed be told as the story of closely observed details. In the fourteenth century, the poet Petrarch noticed that the written Latin of his contemporaries had declined tremendously in elegance in comparison with the “Golden Age” Latin of Cicero. In the fifteenth century, a scholar named Lorenzo Valla brought a finely tuned attention to detail to the text of the so-called Donation of Constantine, and exposed it as a forgery, thereby dealing a blow to Papal claims to temporal authority. In sixteenth-century France, a jurist and historian, Etienne Pasquier, wrote that even the smallest changes in the usage and the forms of words could contain within them the evidence of empires.

    From these examples (which could easily be multiplied) we see that a humanistically trained attention to detail in the early modern period was fundamentally comparatist: a critical gaze constantly oscillating between past and present. What we might call the value orientation of this gaze would itself oscillate over subsequent centuries, favoring sometimes the old over the new (tradition over innovation) and sometimes the opposite - originality and achievement over imitation. But in general, after the Renaissance, in a humanist culture past and present would exist simultaneously in an uneasy, unresolved, and incredibly productive tension.

    The hallmark attention to detail of the Renaissance humanists would ultimately reach far beyond the written word. The relationship between humanism and the emergence of modern science in the early modern period is a difficult and fascinating one, and what is certain is that it was not a history of simple replacement or progressive succession. The “science” or “scientific worldview” of a Copernicus, a Galileo, a Descartes, a Newton did not replace “the humanities”, but was rather nurtured by and in a pedagogical culture of detail orientation rooted in the rudiments of historicized grammatical analysis. The medieval idea of the world as a book was still current in the early modern period (c.1500s-1700s) and certainly inspired those whom we look on today as the scientists of that time. If for the medievals the book of nature had been written in the Latin of the Vulgate, for the “natural philosophers” of the 1600s onwards the book of nature spoke a different language, but a language nonetheless: that of mathematics. Conjecture and comprehension, key tasks of the humanities scholar faced with an arcane manuscript, became the moves of the scientist grappling with the book of nature. And the idea of mathematics as the language of nature is not mere imagery. There is a sense in which mathematics was (and perhaps is, I don’t know) written and practiced almost like a literary language, replete with idioms and redolent with style. In one episode from the early modern history of science, an anonymous proof submitted to a competition was identified as belonging to Newton on stylistic grounds: one could “recognize the lion from his claw”, in the words of the Swiss mathematician Johann Bernoulli.

    All this is to say that the sciences and the humanities as they exist today side by side in the modern university are not two different worldviews, as they are so often framed, but sibling enterprises. We might say that in their shared quest for meaning, both privilege detail over data.

    In a discussion about the different meaning of the word information in an interpersonal context and in relation to computer science, philosopher Roger Scruton points out in The Soul of the World (2014) that the information we communicate between ourselves about the world is always information that something is the case, information about something. “Information, in this sense, is an intentional concept, which describes states that can be identified only through their content.” (p. 57) In a computer science context, however, “information means the accumulated instructions for taking this or that exit from a binary pathway” (ibid.), and is therefore not (strictly speaking) about anything. This, to me, seems like a useful way of thinking about data. Data of course emerge from contexts, but data are more of a radical abstraction of informational contexts than an informative description of them. The power of data lies precisely in the leverage which abstraction affords. Data are information points decoupled from their immediate contexts of meaning, information devoid of detail. Data are not about anything, but claim to represent the thing in itself in an essential, algorithmically parsable form. If detail is the preserve of the humanities and the pure sciences to which they gave rise, then data may be said to belong to an applied field which we can term “technics”. Technics is an extremely abstract field, which unlike the humanities or the natural sciences does not investigate reality but seeks to replicate parts of it in simplified and controllable imitations or approximations, from simple mechanics to computational models and fiat currency.

    Technics results in the creation of tools, which may usefully be defined as “externalizations of originally integral functions […]: Tools do not introduce new principles but they greatly extend the range of conditions under which the discovered control principle may be effectively employed[.]” (Buckminster Fuller, Operating Manual for Spaceship Earth, 1969). Fuller argues that when humans discover certain principles through their investigations of reality they often develop technologies which specialize in or embody those principles, which can then be left to work with ever greater degrees of autonomy. Technics is thus a way of integrating our intellectual discoveries (Fuller calls them metaphysical discoveries) into our physical reality. I suggest that “datafication” is the defining tool of our moment: a tool for turning meaning into money.

    Society today is under the sway of a harsh technics. Global finance and the “tech” industry are reducing to data the world in its every meaningful detail. This economy of datafication has its infrastructure of banks and farms through which the shared meaning of our lives is systematically transmuted into unshared money. We are becoming cognizant of a choice: between making money, or making meaning. The supreme irony is that those who profit financially from the datafication of our world will not be exempt from decontextualization and the loss of meaning. In the last analysis, the billionaire will stand alone upon a mountain of money, with nothing and no-one to spend it on or with. It behooves us all to attend to the details of the world around us, to demand detail of those who speak only of data, and to fight the slide into meaningless nothing which a “data-driven” world would mean.

    → 12:51 AM, Dec 24
  • The Start of History

    In 1992, Francis Fukuyama wrote a book called The End of History and the Last Man, in which he argued that with the collapse of the Soviet Union in 1991, history as a conflict between opposing socio-economic and political paradigms was over. Free market democracy had won out.

    Fukuyama’s thesis did not fare particularly well. What appears to have bothered his first critics was the conclusion that in a post-historical world there would be nobody left to fight. Samuel P. Huntington, for example, argued in The Clash of Civilizations (1993) that the conclusion of the Communism/Capitalism debate merely left the stage of history open for the return of a far more primal conflict between East and West, Muslim and Christian. In light of 9/11 and its aftermath, to many observers Huntington’s thesis would appear to have proved correct, and at the very least it still corresponds broadly to the dominant tenor of the western political imagination. Fukuyama and Huntington have been interpreted as representatives of two opposing trends in politics at the turn of the century: a naive and misguided optimism on the one hand, and a grim realism on the other.

    Don’t worry, dear reader, this essay is not going to be an in-depth comparison of their respective arguments. Before we can judge an argument (facts, figures, proposed relationships of causality, interpretations) we first need to check the premise of the argument. If this is faulty, then none of the rest matters, and we may safely stop reading. Fukuyama and Huntington share the same faulty premise: that history is a horizontal conflict between competing ideologies, be they economic, political, religious or otherwise. That such conflict exists and has always existed across space and time is undoubted. But this horizontal axis of conflict between what we may term ideological communities is secondary to the vertical axis within those communities: the axis of oppression.

    An ideological community is a hierarchical and pyramidal polity in which those at the top oppress and feed off of those at the bottom. Ideological communities are thus symbiotes (organisms made up of two or more organisms), and may take many forms: nations, religions, nuclear families, corporations, economic communities, etc. The glue that holds the pyramidal structure together and which ensures a flow of wealth from the bottom to the top may be termed an ideology: a story told by the top about itself and to the bottom. (In a sense, we might say that an ideology is the perversion of a philosophy for the purpose of extractive oppression.) The fact of these symbiotes - their emergence, their endurance - is the primordial fact of history. Conflict between symbiotes as and when they encounter one another is a secondary phenomenon.

    Only when societies cease to be hierarchical - that is, when the extractive, pyramidal symbiote is replaced by something centerless and vector-less, like a rhizome - can we start to talk about the end of history as we have known it. This has happened from time to time over the centuries. The dynamic of commoning which characterises the rhizomatic has tended however to be met with fierce resistance by the top of the given symbiote in which it emerges, being identified by that top group quite rightly as a threat to its comfortable and unearned dominance.

    The history of the later twentieth century in the West reveals many instances of rhizomatic commoning within the bloated symbiote of free market capitalism. For example, the fact that so many average folk became cultural leaders in all the arts - from music and the plastic arts to the intellectual and scientific arts of academia - and in such numbers is historically pretty unprecedented. There was in the twentieth century a kind of popular culture never seen before, and what we’re seeing now is it being enclosed and sold back to us.

    The assault on knowledge which we see both in the proprietary mechanization of our educational institutions, and in the dismantling, enclosure, and strip-mining of our cumulative cultural archive (painstakingly assembled over decades if not centuries) is a breaking of the bridges between us and a vision of a truly horizontal history briefly attained for some people for some years in the late twentieth-century West.

    It also, arguably, represents the emergence of a new symbiote, a new player on the field of history. We are all being engulfed within its pyramidal structure, and bathed in its techno-messianic ideology. Those at the top no longer want our muscle power, as the Industrialists did 200 years ago, nor do they want our killing and dying power, as did the Imperialists of 150 years ago. They want our brainpower. By reducing our brainpower to a steady state of bare attention, they want to syphon it all away and into their beloved computers.

    We must, all of us, pay attention - and above all, to whom and for what we are paying it.

    → 10:17 PM, Dec 22
  • Fact and Friction

    In Four Frictions: or, How to Resist AI in Education, 16/12/2025, Sonja Drimmer and Christopher J. Nygren contextualise the struggle against “technosolutionism in teaching,” and call for “a resistance comprising the collective force of small acts of friction.” For the authors, friction is a feature rather than a bug of education, which they define as “the result of human grappling with the parts of the world that resist us and our capacity to understand.” We might even go so far as to say that when we encounter friction in our lives it is an indication that we should stay a while and learn.

    The authors call for a pedagogy of friction which directs the flow of attention in the classroom towards “centering humanity” rather than towards ‘AI’. They group examples of “small acts of friction” under four principles:

    1. “Resolutely center students in our teaching.” By performing care for the student and for their studies, the teacher can inspire students to take responsibility for their own education as something meaningful.
    2. “Cultivate the moments between graded reckonings; slow down the momentum of ‘optimizing.’” By tending to a pedagogical landscape filled with opportunities for in-person interactions and physical assignments (reading groups, class conversation), students are encouraged to see their education as a major part of their own lives, and not just as a means to an end external to education.
    3. “Interrupt the digital landscape.” By bringing back material handouts, planning physical and in-person assignments, and foregrounding the material aspect of the topics being studied, we poke holes in the digital curtain wall being built around us.
    4. “Ask questions.” We must rediscover the recursive “why” reflex of our toddlerhood (and expect the same degree of irritation in response).

    In Wikipedia is Resilient Because it is Boring, 04/09/2025, Josh Dzieza offers a thorough account of Wikipedia: its history, values, editorial process, culture - and the threats its community and the very Wikipedia project (“Imagine a world where all knowledge is freely available to everyone,” Jimmy Wales, founder) face today. (Note by the way the restrained yet somewhat Utopian Scholastic aesthetic of the web article, with its many collages of wildly juxtaposed images from our world. The Wikipedia logo itself can be thought of as a vestige of this aesthetic.)

    Drawing on an analysis of the relationship between Truth and Politics by the 20th-century philosopher Hannah Arendt, Dzieza argues that Wikipedia functions for the Internet - and thus also for the world of people connected to the Internet - as “a stubborn common ground of shared reality.” Such common ground, following Arendt, is a basic preliminary condition for politics, for “collective human life.” What makes this common ground of shared reality on which a polity may be built so stubborn is its foundation in facts.

    Facts, according to Arendt, “possess an infuriating stubbornness,” which Dzieza attributes to “a fact’s dumb arbitrary quality of being the case for no particular reason and no matter your opinion or influence.” Facts are true whether you believe in them or not. Facts make no claims on our allegiance; they do not ask for trust or loyalty or sacrifice, they do not ask for anything. Facts are curious artefacts, and their careful production is a key task of any civil society that wishes to remain civil. This task is too important to be entrusted to any single institution, and is therefore shared between several impartial institutions, including the Judiciary, the Press, and Academia. (Ideally, Government should also be a machine for ascertaining facticity, but its proximity to power makes it especially susceptible to corruption. The Market too is an institution of contemporary society, but given the way it enshrines private interest as sacred, it is hard to see how it can have much of a role to play in the administration of civil society.)

    Reading these two articles in conjunction, a connection proposes itself between the necessary friction of meaningful education on the one hand and the stubbornness of facts on the other. Facts are frictional (or frictious, which is apparently also a word), caltrops in the path of slippery fascists, too afraid to face them head on. And friction too is factual, a fundamental characteristic of our being in the world, our human experience. Friction too, let it be said, is how fire is made.

    Any kind of being without friction is a being without facts. Philosophically speaking, any discourse that shows a disregard for facts is bullshit. And while a lack of friction may be a desideratum for the bull in question, civil society deserves more than relaxed stool.

    → 6:26 PM, Dec 17
  • It’s Time to Stop Worshipping the Market Oriented Mindset

    (rejoinder to It’s Time to Stop Worshipping the Liberal Arts)

    In today’s evolving political landscape, ‘liberal arts institutions’ (which in a UK and European context we may understand as the humanities departments of our universities) must confront a hard reality: reverence for liberal education as a public good does not justify resistance to private sector greed (It’s Time to Stop Worshipping the Liberal Arts).

    For too long, humanities departments have clung to the notion that being human means cultivating closeness with our shared cultural heritage. For too long, they have assumed that training future citizens to be critically engaged members of a democratic community would be recognized forever as a public good.

    A particularly stubborn myth is that the humanities and liberal arts stand somehow in opposition to STEM. While this belief often undergirds successful calls to defund humanities education, it is hopelessly muddled. To begin with, the ‘M’ in STEM, mathematics, is itself a hallowed member of the liberal arts. Geometry was a must for the ancient Greek philosopher Plato when processing admissions to his academy, and both geometry and arithmetic have been core members of the seven traditional liberal arts since at least the fifth century AD. Far from being a natural law of the universe, the belief in a sharp divide between ‘liberal arts’ and ‘hard sciences’ arguably has a much less impressive pedigree. It dates to the infamous “Two Cultures” debate instigated by the forgotten novelist C. P. Snow in a lecture of that name which he gave in 1959. In it, he painted a picture of a community of navel-gazing literary scholars who looked down on the technical sciences. This picture was derived from Snow’s own unpleasant (but sadly not uncommon) experience of being snubbed by Oxbridge dons, a highly exclusive group representative of little outside of their own bubble. Snow contrasted their elitist culture with the supposedly more enlightened one of contemporary industrial science (the very same science, let it be noted, which seven years prior had tested the first hydrogen bomb). The present letter thus gently suggests that we finally let go of the dichotomy between arts and sciences, rooted as it is in the deeply limited and rather sad culture wars of the 1950s.

    Critics of a supposed (and untrue) economic exceptionalism enjoyed by liberal arts institutions often also attack the humanities for claiming a monopoly on critical thinking. Yet, once again, this is not so. Of course the natural sciences cultivate critical attitudes. Without a searching, questioning attitude towards the mysteries of the material universe there would have been no Galileo, no Newton, no Darwin, no Einstein. If the humanities claim anything at all, it is far humbler, though perhaps also closer to home: the responsibility to cultivate critical attitudes towards the phenomena of our social world (smaller by far than the kingdom of nature). Thus if there is a meaningful divide between the humanities on the one hand and the natural and technical sciences on the other, it is limited to their methods and focus: source criticism of human-made artefacts in the humanities, and the empirical analysis of natural phenomena in the sciences. And of course, as soon as we recall that humans and their works are indeed always already part of nature, even this flimsy division is threatened. In short, the humanities and the sciences are two horns on a single goat named Critical Thinking.

    Ironically, a recent attack on the monopolization of “critical thinking” by the humanities (see linked article above) is itself a shining example of the deep need for critical awareness when writing and reading. Even at a cursory glance, that letter contains numerous and rather serious faults of argumentation, including false dichotomies (liberal arts vs. STEM, but also social values vs. market forces) and a host of straw men (the myth of the liberal arts’ monopoly on critical thinking, the myth of a noble and economically insulated humanities, etc.). The letter’s thesis statement manages to combine a false dichotomy, a straw man, hyperbole, and a non sequitur in the space of just over twenty words: “While liberal arts institutions do have intrinsic value, that doesn’t mean they are entitled to be socially favoured or economically exceptional for ever.” The two topics in the two clauses are not logically related, and in any case, surely intrinsic value would indeed justify some sort of protection? This is not nitpicking; it is diagnostic of a fundamental state of confusion deep in the heart of thought and speech.

    The letter this essay responds to called on us to stop worshipping the liberal arts, yet all it offers in their place is the worship of big business, the further cultivation of a “market-oriented mindset”. Rather than worry our heads about what the academic humanities do or don’t do, we should all beware lazy conformism to a market-oriented mindset. Calls to be businesslike may sound like common sense, but they mask a greedy desire to lay the institutions which structure our society open to strip-mining by private interests. As long as our only conception of profit is monetary, as long as we continue to equate the success of the stock market with our success as a society, and as long as we persist in a fantasy vision of a “real world” limited to the tiny world of finance, we will forever be trapped in a spiraling sacrifice of values, rights and duties on the altar of “market needs”. As a late liberal artist, David Foster Wallace, once said, in the day-to-day trenches of adult life there is no such thing as not worshipping: the only choice we get is what to worship. Liberty and art – or “market needs”? The choice is ours.

    → 10:17 AM, Nov 30
  • Enough

    In the 21st century it is difficult to distinguish innovation from extraction. The people at the helm of their organisations are strip-mining them for parts. Developers break down communities. Publishers hoard and enclose data. The mining has reached its limit, and if we continue forward the structure can only collapse.

    Is collapse inevitable? No. We have learned a thing or two in the last fifty years. But the main lesson is this: there is enough for everyone.

    We waste nearly half the food we produce, yet people go hungry. Beaches are lined with never-used clothing, yet people go cold. Every main street has more empty buildings than occupied ones, yet people sleep on the curb outside them.

    We prop up artificial scarcity for what? So we can feel like we have accomplished something in being able to find a room or buy another jacket? Imagine the emptiness we would feel if it were revealed that it was all for nothing. That we haven’t achieved anything, because the only achievement for our time is emancipation, and we are further from that goal by the day.

    Freedom from the known, freedom by means of knowledge as true knowledge can only produce peace and love, freedom from what we think are our only choices.

    → 11:53 AM, Nov 21
  • Technology, Philosophy and Responsibility - A Rejoinder to a Recent NYT Opinion

    “A.I. Is on Its Way to Something Even More Remarkable Than Intelligence,” New York Times Online, Nov. 8, 2025

    www.nytimes.com/2025/11/0…

    By Barbara Gail Montero. Dr. Montero is a philosophy professor who writes on mind, body and consciousness.

    We should not allow ourselves to be unduly stunned by the sensationalist claim that: “Not long ago, A.I. became intelligent.” By the standards of Alan Turing’s theoretical test, computers became intelligent long ago - back in 1966, to be precise. This was the year computer scientist Joseph Weizenbaum created ELIZA, a chatbot programmed according to a script derived from Rogerian psychotherapy - a method in which the therapist repeats the patient’s comments back to them - as an ironic display of the limits of what Weizenbaum termed ‘computer power’ in contrast to ‘human judgement.’ Yet when Weizenbaum asked his secretary to test ELIZA by engaging with it in conversation, he was bemused when after a couple of exchanges the secretary asked him to leave the room and give her and ELIZA privacy. Soon, psychiatrists were clamoring for ELIZAs to be installed in hospitals across the USA, leaving a now worried Weizenbaum to ponder what exactly these mental health professionals thought they were doing if they felt their work could be carried out as effectively by a pre-programmed script.

    What the ELIZA experiment unexpectedly revealed was how incredibly low the Turing Test bar really is. We humans were already more than willing to credit machines with intelligence in 1966; the only difference between ELIZA and ChatGPT 4.5 is the amount of computer power behind them. To paraphrase Professor Montero, as far as we can tell, there is no direct implication from the claim that a computer has greater power to the conclusion that it deserves to be called intelligent.

    “Consider the atom.” Yes, let’s. As Professor Montero accurately summarizes, the history of atomic thought can be thematized as one of ever greater complexity of conception: from the solid little spheres of Democritus to the quantum clouds of uncertainty of Heisenberg, Schrödinger and Bohr. Sadly for Montero, this is actually a disanalogy for her argument about our evolving understanding of intelligence. By her account - that intelligence today is whatever the average American thinks it is (vox populi, vox dei - good grief) - our idea of intelligence has grown not more but less complicated. We are, according to Montero, placated by the appearance of intelligence in response to our “prompt,” and this is considered sufficient. This is a shocking claim to come from an education professional. Anyone who has ever been in a teaching role knows that the real spark of intelligent engagement on the part of a student comes not in the form of a brilliant response to the teacher’s question, but in the form of a brilliant question - and above all an unprompted one.

    The question of what intelligence is, is fundamental. But Montero confidently asserts that instead of defining intelligence and then seeing if AI meets those standards “we do something more dynamic: We interact with increasingly sophisticated A.I., and we see how our understanding of what counts as intelligence changes.” Firstly, what is this “understanding of what counts as intelligence” if not a definition, or at least a premise? Indeed, Montero is clearly aware that postulating a definition and then gradually adjusting it through experience is the fundamental dynamic of epistemology - how we come to have knowledge of the world. Thus by opposing a definition of intelligence as a starting point in favour of interacting with AI, Montero is setting up what’s called a straw man argument: no one ever said we should have a fixed and immutable definition of what intelligence is. But having some kind of definition to begin with is essential.

    Secondly, the alternative Montero proposes sounds an awful lot like saying: instead of coming up with our own definition of what intelligence is, let’s just have AI tell us. But “AI” is not a thing with thoughts and opinions. It is a chatbot, an ingenious device designed to respond engagingly and effectively to prompts. It draws on the sum of human knowledge available as text on the internet (so, really just a sliver of human knowledge, all things considered), and can never do more than combine and recombine the parts of that corpus: in other words, reflect our human achievements back to us time and time again. “AI” is not a mind, it is a discourse chameleon. And while chameleons fuel their adaptations to their environment with no more than their fair share of flies, “AI” turns its tricks at the cost of vast amounts of energy, water, money, jobs, etc. “AI” is as Montero states certainly on its way to something even more remarkable than intelligence, wondrous to tell: the wholesale abandonment of intelligence in favour of madness, against a backdrop of amazed gasps and applause.

    We cannot, in good conscience, pass over Professor Montero’s comments about consciousness in silence. They deserve to be quoted in full, for the way the professor proudly manifests a total disregard for the moral implications of consciousness:

    “Some worry that if A.I. becomes conscious, it will deserve our moral consideration — that it will have rights, that we will no longer be able to use it however we like, that we might need to guard against enslaving it. Yet as far as I can tell, there is no direct implication from the claim that a creature is conscious to the conclusion that it deserves our moral consideration. Or if there is one, a vast majority of Americans, at least, seem unaware of it. Only a small percentage of Americans are vegetarians. Just as A.I. has prompted us to see certain features of human intelligence as less valuable than we thought (like rote information retrieval and raw speed), so too will A.I. consciousness prompt us to conclude that not all forms of consciousness warrant moral consideration. Or rather, it will reinforce the view that many already seem to hold: that not all forms of consciousness are as morally valuable as our own.”

    In the words of Lin Zexu, a Qing dynasty scholar-official, in his 1839 letter to Queen Victoria over the British Empire’s aggressive introduction of opium into the Chinese economy: “let us ask, where is your conscience?” Firstly, it is logically absurd to say, “there is no direct implication from the claim that a creature is conscious to the conclusion that it deserves our moral consideration,” and then contradict this with an admission that there may after all be a link between consciousness and moral responsibility. Secondly, it is philosophically absurd to claim that, because most Americans still express support for animal cruelty through their eating habits, such a link must surely be unimportant.

    And for an American above all, at this point in our global history, to speak so glibly of slavery, of the potential enslavement of an entity which she has just publicly announced is on the road to consciousness, is such an enormity as to defy adequate response. That a university professor of philosophy could be complicit in such wanton cheapening of ideas, of discourse, of argumentation, of journalism, speaks volumes about the fascism gripping the USA. Sadly, the shelf space for such volumes is already full to bursting.

    → 5:31 PM, Nov 13
  • Prompt and Response

    The academic assessment of generative artificial intelligence is intrinsically a matter of source criticism, deeply embedded in historical approaches. So is the technical functioning of generative AI and LLMs. AI as we have it today is fundamentally a history machine: it scrutinizes vast quantities of historical sources (or data) and interprets them statistically in order to output predictions with a high degree of certainty. Any success it enjoys, however, is due not to the immensity of the data it is able to process, but rather to the opposite. Its quantitative data sets can never hope to grasp the fullness of reality in all its profound mystery. But, by delineating the field, by excising the qualitative, “AI” creates the very world it is able to predict. Our society is being radically reduced to a matrix of quantitative data points, the bare one dimension presciently described by Herbert Marcuse over 60 years ago.

    (Prompt taken from this recent job posting)

    → 11:13 PM, Nov 1
  • Teachers and Troublemakers

    There is a shift in the language of higher education from the word “academic” to the word “researcher”. As the following note will attempt to make clear, this shift reflects yet another expression of a growing intolerance for ambiguity - the organic ambiguity which characterizes culture as a healthy expression of human activity.

    “Academic” is a useful term for anyone engaged by a university faculty in ways which bring them into close contact with the core faculty activities of research and teaching. Academics come in many different shapes and sizes, and being an academic may best be described as a professional profile.

    Direct employment by a faculty is not in itself a necessary criterion. At Oxford and Cambridge, for example, much of the actual teaching is done in small groups outside of the faculty. The tutors or supervisors who deliver this teaching are not necessarily affiliated with any faculty, and are paid for by the colleges (administrative units which together with the faculties form the university as a corporate entity). University library professionals often also share an academic profile, depending on how closely their work brings them to the fundamental work of teaching and research which form the raison d’être of the faculty.

    Nor can academics be simply defined in opposition to a university’s administrative or support staff, since many workers with an academic profile are also expected to fulfil administrative work divorced from either teaching or research, or to take turns serving on the boards and committees which make the faculty world go round.

    Finally, being an academic is not simply a question of holding a PhD and being called “Dr.”. Many with a doctorate never set foot in a university again, and many who teach at university (especially among adjuncts, who carry the brunt of teaching at many institutions) are, with a master’s degree, amply qualified to inspire and instruct.

    So: being a direct employee of a faculty, holding a certain title (professor, adjunct, assistant professor, associate professor, lecturer, etc.), or even holding a certain degree - none of these is either a necessary or a sufficient criterion for being an academic. What emerges as necessary is a professional profile that unites proximity to teaching and research at a (proximately) tertiary educational level.

    (the term “researcher” thus isolates just part of the meaning of the word “academic”. Part of its rhetorical appeal is that the researcher is a problem solver: the activity is inspired by and responds to “societal problems” in clear-cut, empirical ways. What has been cut out/off, isolated, discarded? My contention: not simply the teacher, but the teacher as troublemaker, in opposition to the researcher as problem solver. I use the word troublemaker ironically to emphasize the necessarily troubling role of the pedagogue. True pedagogy moreover does not divide teaching and research but combines the two always - Socrates was learning at the same time as encouraging learning in others, and was executed as a troublemaker. The loss of disinterested, open-ended enquiry in a way that involves the formation of students is itself a societal problem. We must struggle not only against external capitalistic pressures, but also against internal and even personal forces of intellectual pride coupled with insecurity. Let pride and responsibility be our watchwords, perhaps. In any event, the race for funding is distracting us from the work of the academic.)

    (tbc)

    Some texts out of order:

    Ivan Illich, Deschooling Society, 1979
    Paulo Freire, Pedagogy of the Oppressed, 1970
    Neil Postman & Charles Weingartner, Teaching as a Subversive Activity, 1971
    bell hooks, Teaching to Transgress: Education as the Practice of Freedom, 1994
    bell hooks, Teaching Community: A Pedagogy of Hope, 2004
    Jacques Rancière, Le maître ignorant : cinq leçons sur l’émancipation intellectuelle, 2004
    C. S. Lewis, The Abolition of Man, 1943
    Mary Harrington, ‘Thinking is Becoming a Luxury Good’, NY Times, July 2025

    → 9:53 AM, Oct 26
  • Artificial Authority

    The present relationship between knowledge organisations and the US Government might be best contextualised through an understanding of Grover Norquist. The right-wing strategist, who has been instrumental in denying public money to education and research since the Reagan administration, gave a particularly telling statement at the Conservative Political Action Conference in St. Louis in 2013. He said that his goal is that people within institutions “begin to look at each other a little bit more like the second to last scene in one of those life-boat movies.” He wants us to ask “who we are going to eat and who we are going to throw overboard.”

    This is, I argue here, an admission of the limits of the American conservative movement’s power. Norquist is aware that he can damage the university up to a point. What he wants, instead, is to create the conditions in which a university hurts itself. And in the intervening twelve years since that statement, this strategy has been incredibly successful.

    Adjunctification, the loss of secure jobs within US academia, supports a tolerance for abuse and untenable working conditions. This in turn damages the accord with students and undermines their trust in, and relationship to, the university. These are conditions exacerbated, but not imposed, by outside pressures. This is coming from the inside of the institution.

    And as we continue to alienate more people from the project of higher education, as we sever the connections that should be the bedrock of an open and collective pursuit of science, we arrive somewhere truly troubling.

    But it’s more than that - when the university sends enforcement officers against students, knowing full well that such a decision will impact people’s lives and result in severe injury, we send the signal that we are not a community. That we as an institution no longer consider it our responsibility to shelter and care for one another. And that is what people like Norquist have been waiting for.

    As those cracks within the university are made plain, we can now observe the overt incursions, coming from the outside, intent on breaking the institution. Forcibly detaining university people, and subsequent attempts to assume governance control. The stifling of free speech. Withholding public funds from research at an unprecedented scale. And - as we saw last week, at the Library of Congress - removing the leadership of knowledge organisations, based presumably on those leaders’ egalitarian values and identity.

    To be clear, what I am talking about here is the infrastructure of knowledge organisations. Yes, infrastructure is the software, instruments, tools, machines, money, and buildings that make up the institution. But underlying all of that, the infrastructural foundation of a knowledge organisation and its generative function, are its people and their efforts and labor. Sometimes we refer to this as social infrastructure as opposed to things like digital or technical infrastructure, but I’m not sure such a distinction is helpful.

    Nonetheless, let’s focus a minute on the digital side of infrastructure because the project to undermine the university has a crucial new element here. Just as IBM once provided the machinery to reduce people to data and make them easier to sort, transport, and ultimately exterminate, our present moment has its own tool to displace and harm people with efficiency and un-accountability.

    But there is a difference: in concordance with people like Emily Bender, artificial intelligence is not a cohesive tool. There are significant tools bundled in here, like increasingly advanced compute. But rather than a technology, AI signifies a particular vie for power, one that notably encroaches upon the domain of erudition by pirating the language of intelligence and consciousness and the actions of sense-making.

    This is an attempt, once again, to alienate authority, to vest it in something that cannot be held to account - to create something of a higher power. Behind it stands a particular set of resource-heavy people who are covetous of the position of knowledge organisations and have long wished to supplant them.

    Why? Because they see knowledge organisations as places that democratise and emancipate at a moment where they wish to consolidate and control. This is, I think, quite generous to universities, but it also points to the way forward.

    Open Science and Infrastructure, and in particular human-centered Open Science and Infrastructure, could be considered the rejoinder to authoritarian control. This framework provides the counter narrative to conceptualise abundance within the university - abundance of knowledge and ideas, abundance within people - to oppose the scarcity mindset that would have us eating one another or throwing anyone overboard.

    And so the moment calls us to reflect upon our understanding and articulation of Open Science and how it is not just a set of principles or practices but a system that undergirds the generation of knowledge and its application within our world. And here we come to the crux of open infrastructure, which is governance.

    A common refrain among Norquist and his crowd is that the university must be governed by outside entities, like a state or federal body, because any alternative would allow ‘the inmates to run the asylum.’ But we who make up this space know better. Whether we will have our databases, publications, and agency within academia will come down to the question of our inclusion in the governance of infrastructure. Opening governance is the mechanism to repair our institutions and our relationships as a community because we have the proximity and immediacy to be both responsive and accountable.

    I want to quote Kathryn Gindlesparger’s work here: “When faculty answer the calls of governance, they create openings in its ceremony for new actors, behaviors, and ideas. These openings are places where inclusion can flourish, and inclusion builds governance as a responsive and meaningful system. When people matter in governance, governance matters.”

    So the question is not so much what they are going to do to us next - in some ways we already know, and it’s bad. The more important question, I hope, is this: can we commit to one another, can we shelter and protect each other in the way we should have all along?

    Are we a community?

    → 8:08 AM, Oct 17
  • A Library is...

    An instrument of discovery

    A garden

    A place where people come to work

    A place where people come not to work

    A big building full of books

    A bigger building full of nothing

    Somewhere we never go

    Somewhere we went when we were children

    A place for sharing (which is lesson no. 1 at school)

    → 7:30 PM, Oct 13
  • What is a Librarian?

    A librarian is someone who knows how to find information and who believes in your right to have it.

    The knowledge of how to find information is key. Where to find it is secondary. The ‘where’ might well be the institution at which the librarian works, i.e. a library, but it doesn’t have to be. When the ‘where’ of information is unknown, the librarian can tell you how to find it. Not where you will find it, and not what you will find, but the steps to follow in order to find it.

    Knowledge of how to find information may in turn be compared with the knowledge of how to find data, which is the essential characteristic of the researcher. The researcher knows how to bring their attention to a certain slice of reality and sieve it for data. Data arranged according to an interpretive framework becomes information. Information assimilated by a mind becomes knowledge. Data are the raw ingredients. Information is a dish made of data. Knowledge is a full stomach.

    → 7:21 PM, Oct 13
  • The State of Education

    Before we can answer the question “What is a Library?”, we have to take a look at the state of education. It’s not a popular thing to do (perhaps nobody feels wholly comfortable when looking in a mirror), but unlike most tasks which have been left for far too long, this overview should only take us a moment.

    What does education look like today? What form does it take and what effect does it have? To answer the first question, let’s stick to the education of the young as required by law in much of the world, and frequently pursued by choice for several years longer. The institutions which provide this education are commonly referred to as the education system, and sometimes as the education industry. This is subdivided into various levels arranged according to what the cultures of education in different countries deem necessary for young people at various stages of their development. Broadly speaking, these divisions are primary, secondary, and tertiary. Additionally, the whole system (at least up to the tertiary stage) comes in two forms: one in which the state (or public sector) provides the various necessary infrastructures, and another where all is privately sourced and paid for. The rest of this post is going to ignore the private/state school debate, the better to draw out a dynamic which I believe is characteristic of our education system no matter who is paying for it, from first year primary all the way up to final year PhD.

    Postulate: the primary effect of the education system today is to rob us of our intellectual autonomy (the natural ability to think for ourselves), to neg us into conformity by undermining our mental abilities. This may seem paradoxical (i.e. contrary to (para) common sense (doxa)), but paradox is not a synonym for false.

    Argument: the process of schooling involves having one’s intellectual autonomy taken by the institution and handed back to you piecemeal the further you progress through the system. This process of dispossession and restitution correlates with what we might term “social mobility”, roughly in the following manner: the better one is at passing the system’s tests, the more likely it is that one will be deemed a safe bet for higher levels of educational investment, and for the socio-economic opportunities such education affords. Rather than an infrastructure for developing intelligence and knowledge, therefore, the education system may be seen as a vetting process working organically to conserve a social status quo.

    The sad reality of primary and secondary schooling is that after years of low marks, failed tests, struggles with homework and clashes with teachers, many young people leave the system with the ingrained conviction that intellectual work of any kind - the purported purpose of school - is not really for them. Worse, they may feel an inherent aversion to and distrust of “cleverness” altogether. These young people are delighted and relieved to be able to escape school at 16. I suggest that this is a searing indictment of the education system in itself. The school should be a place that encourages dwelling. Yet it does not, and these scarred youths for the most part go on to join the army of technicians which keeps our world lit, clean, and moving on time.

    Students who demonstrate above-average success in the system's arbitrary intelligence tests are permitted to continue submitting themselves for further testing beyond what is legally required. They will not have to clean or scan or dig or click for poverty wages. Rather, they can now be shown socially appropriate directions in which to channel their mental energies. In the UK these directions generally take the form of careers in accountancy and civil law (as opposed to criminal law, which does not pay), medicine, (data) engineering etc.; in other words, “the professions”, such as they exist today. Thus for many who shine in school tests the natural next step is to go to university - gatekeeper, since the thirteenth century, to the professional world.

    The undergraduate experience is the clearing house of society’s systemically ratified intelligence. Students are made aware that no matter how much they may have enjoyed getting an education the ride is almost over and the pressure is on to hop off and into a salaried job. And it is at this point that the bell curve plotting academic attainment against expected financial compensation in the form of salaries and career progression hits its zenith for many individual trajectories. For as soon as study risks turning into research - as soon as one sets out to make personal sense of some aspect of reality, to fully reclaim that intellectual autonomy long held hostage - here the (much touted) directly proportional relationship between educational attainment and socio-economic climbing starts to become inversely proportional.

    With a BA in modern languages or history from a good university, a young graduate in the UK can get a training contract at a corporate conglomerate and cash in their years of education for a more or less guaranteed lifetime of work. In the US and in Europe, an MA may often be an added requirement for entry to the professions. In the UK, master's degrees are either a stepping stone towards very specific and often academic careers or are pursued out of passion or nostalgia by those who can afford the time out from work. As for the doctorate, while the potential return on investment is high (one might become a medical doctor, or a tenured professor), the risk of leaving the programme overqualified and unable to find work is daunting. Indeed, “overqualification” may not even be the worst of it. The true risk of continuing with higher education, of striving to bring the weight of all that is known onto a tiny point of the unknown, is the risk of seeing the world for the first time in the full light of history: a story of oppression punctuated by mad dashes for freedom.

    In such a case, one may be described as having finally and fully ransomed one's intellectual autonomy back from the hegemonic culture, but at the expense of ever finding a permanent place within it. And this is one way in which a certain kind of intellectual is minted. An intellectual is thus not one who pursues education, but rather one who has come out on the other side. Of course not every PhD holder is an intellectual, and not every intellectual holds a PhD - to be an intellectual is merely to be someone who values and cultivates the act of thinking for oneself. But this is precisely the activity which a good PhD programme enshrines and honours with a degree. Historically and globally speaking, many - maybe most? - humans have been born and lived out their lives with this intellectual autonomy. In this light, our modern West with its mandatory education system might just be the stupidest civilization to have ever existed. Paradoxes all the way down?

    → 9:48 AM, Oct 12
  • Testing...

    This is but a test. A test post for Bruder & Rudmann - an experiment in collective writing and thinking on the state of the library, university, and knowledge. All is a test, but this is a real test.

    The state of the library… But what is a library? Isn’t it at least a little like a coral reef? Built anonymously over generations, rich in vital (biblio)diversity, it sits between the shore of civil society and the ocean of the unknown: not as a barrier, but as a net.

    → 3:51 PM, Oct 9