THE MODERN PERIOD / MODERNITY

What do people mean by the modern period?  A number of significant events that occurred in the first half of the seventeenth century allow people to date the emergence of a distinctly modern period from then.  But there are those who would put the birth of the modern even earlier, using the term “early modern” to describe the rapidly changing and turbulent times once referred to as the Renaissance.  To date the modern period from the first half of the seventeenth century is to locate the emergence of some leading though often contradictory assumptions, which are supposed to form the basic attitudes of modernity.  But we need to make one more qualification before proceeding.  In the form of a question (like so much Critical Theory): is the concept of period adequate to account for what we mean when we talk about the experience of modernity?  The answer, strictly speaking, will be “no,” but we will give reasons for this later on.  In the meantime, the following concepts and trends will help us to excavate the site of what we call modernity.

1. Empiricism

Empiricism usually denotes a specific way of thinking about knowledge: about where it comes from, what knowledge is, and how much or how little can actually be known.  So while you might read about empiricist philosophers like John Locke, David Hume or John Stuart Mill (all British, notably), you will also come across references to “the empirical sciences,” which dominate until well into the twentieth century.  But apart from explicitly and avowedly empiricist trends, we find that empiricism dominates even those who oppose it.  Empiricism teaches that all knowledge is derived from experience and must be tested only with reference to palpable evidence.  That is, at its extreme, everything we know concerns what we can actually see, hear, taste, touch or smell.  The world is just as it appears to us through the senses and can be understood only through painstaking empirical analysis.  Empirical thinkers, then, dispute any knowledge said to be derived from supernatural, transcendental, mysterious origins and direct our attention to things, facts, matters that can be understood as having real existence.  This way of thinking, as you can guess, leads to what we now know as “good common sense.”  As we have seen, even those who oppose empirical thought as such would be unlikely to deny the existence in some form of things that are apparently real--but they would deny the lofty status that the empiricists ascribe to them.  What is important here is that empiricism simultaneously draws limits to what can be known--knowledge is restricted to the empirical sphere--yet it opens knowledge up to the promise of completion.  Ultimately it denies any such sphere as the ethical or any metaphysical foundation that lies beyond finite empirical experience; but in so doing, it provides a basis for the development of something like a total science.  The assumption is that the more we can understand about the universe on scientific grounds, the more control we should have over it.  It is difficult within the terms of modernity to deny the extraordinary advances that have been made in the development of scientific thinking and in technological progress, much of which must be at least connected with modern empiricist trends.  But some of the most effective and radical developments in critical and cultural theory have been made against empiricist assumptions, often apparently against good common sense.  That no doubt leaves theory open to ridicule from certain points of view, which we will have to examine quite carefully.  However, the logical conclusion to be drawn from empiricist thought is that we ought to be able to do without theory altogether--a conclusion that would also rule out empiricist thought itself, which is, after all, a theory.

2. Rationality

The very fact that empiricism has another side ought to be enough to draw it into question.  But rationality, its alter ego, is generally constructed as being its opposite.  Throughout the eighteenth century a philosophical battle raged between empiricists on one hand and rationalists on the other (with the empiricists apparently holding the upper hand until, towards the end of the eighteenth century, the groundbreaking critical philosophy of Immanuel Kant entirely shifted the terms of the debate).  So where the empiricist claims that knowledge is available through the senses alone, the rationalist asserts that there is a transcendent source of knowledge in the human mind (Anaxagoras’s nous, of course) and this is needed to organise and understand the world of empirical experience.  The privileged example for rationalists (following Plato) is the realm of mathematical and geometrical proofs which demonstrate a kind of universal truth beyond deceptive and capricious empirical finitude.  It is striking, in this respect, that both Galileo and Descartes can be found independently asserting the possibility of a mathematical understanding of the universe and its contents.  The assumption of a rationalist is that the universe is itself rational.  Only much later, in the twentieth century, do scientists and mathematicians start asking awkward questions about phenomena such as cloud formations and coastlines, which do not seem to conform to traditional mathematical principles.  These questions lead, via what will be called “Chaos Theory,” to an entirely new and very interesting mathematics, to which we will return.
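
A hedged aside on the coastline example (the notation here is illustrative, not part of the argument above): in the mathematics that grows out of these questions, the measured length of a coastline turns out to depend on the length of the ruler used to measure it, roughly according to the relation

\[ L(\varepsilon) \approx F\,\varepsilon^{\,1-D} \]

where ε is the ruler length, F is a constant and D is the figure’s “fractal dimension.”  For a smooth curve D = 1, so the measured length settles to a fixed value as the ruler shrinks; for a coastline D is greater than 1 (Richardson’s classic estimate for the west coast of Britain was roughly 1.25), so the measured length keeps growing without limit as the ruler gets smaller--which is the precise sense in which such shapes refuse to conform to traditional geometrical expectations.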

3. Freedom

It is not by chance that a certain concept of freedom comes to be valorised and affirmed at the same time as the tension between empiricism and rationalism gets more tightly defined.  As the clarion call of Enlightenment during the eighteenth century, the concept of freedom signals both an empirical problem and a rational principle.  Jean-Jacques Rousseau begins his famous and influential “The Social Contract” (1762) with the observation, “Man is born free, and everywhere he is in chains” (41).  And towards the end of the eighteenth century the critical philosopher Immanuel Kant describes Enlightenment as man’s emergence from his “self-incurred tutelage” (“What is Enlightenment?”).  Notice the assumptions that inform both these statements.  First, the concept of freedom used here suggests something like an original state, which has been lost.  (The echo of Rousseau in Joy Adamson’s story about her life with lions in Africa, Born Free, and the sentimental song that goes with the film version of it are simplistic reminders of this Enlightenment notion of freedom.)  Freedom, according to these conceptions, is a natural and original state that has been lost to Man but might somehow be regained in the process of his emancipation.  In that case modern civilisation must be regarded as being in some sense a form of enslavement, enchainment or imprisonment.  The analogy with Plato’s cave, then, is not so far behind these conceptions, for it is Man’s essential freedom of thought that is supposed to mark him out from machines and the brutes.  The fact that this notion of freedom emerged while the slave trade was at its height, and while the notorious “Middle Passage” between Africa and America was at its busiest, is a telling indication of the contradictory historical processes at work in modernity.  The other important assumption in both Rousseau’s and Kant’s formulations involves the concept Man.  Man, it seems, ought now to be able to take centre stage in the unfolding of his world.  Not only do his scientific (both empiricist and rationalist) projects allow a hitherto undreamed-of understanding of the objective universe, but his freedom allows him to intervene instrumentally in the process of his own history.  So, on the one hand, freedom and science march forward together.  But on the other, they are fundamentally opposed, because where science is concerned with uncovering unchanging necessary truths, freedom concerns Man’s intervention in a world considered in terms of its absolutely unpredictable contingency.  What’s more, Man’s thought doesn’t seem to be constrained or determined in the same way as his body is governed by, say, the laws of gravity.  There are no established principles that can be used to predict the actions of Man.  Man is free to make a science of his universe but he cannot make a science of his freedom.  One of the urgent tasks of much eighteenth- and nineteenth-century philosophy thus becomes that of reconciling the claims of scientific necessity and moral freedom.

4. Man

Who, then, is this “Man”?  Now that is a question.  Some would argue that for eighteenth-century thinkers Man is not, as was overtly affirmed, just anybody, but that, covertly, Man turns out to be based on the image of a specifically modern, rational, western, white, bourgeois, educated male.  This would thus exclude madmen, women, the lower classes, members of other races and men from other (selected) historical eras, as well as machines and brutes.  But despite the now obvious sexism and racism that hindsight allows us to pinpoint with easy clarity, the notion “Man” is a bit more enigmatic than that and deserves further elaboration.  Man is not, in itself, a new concept.  Nor is freedom.  Aristotle, in The Politics, qualifies his statement that “Anthropos is a political animal” by saying that this is a being that is capable of both the best and the worst of actions--hence the need for a politics.  And in the late Middle Ages controversies about “grace” concerned precisely the extent to which Man could be held responsible for his own actions.  But in each case Man (or Anthropos) is characterised in terms of a teleology that is beyond him.  A teleology is a long-term system or plan that gives a final purpose to the things that fall under it.  So the human race, in the late Middle Ages, could be characterised in terms of what lay beyond the finite existence of its individuals--some kind of afterlife, all part of God’s great plan.  But mythical notions of destiny and fate, far from falling out of use altogether, become covertly disseminated within the new scientific vocabulary that begins to take over from religious cosmologies in providing representations of the world as such.  So where the ancient conceptions of the universe rested on the consolation of richly detailed mythologies and theologies, the new scientifically oriented conceptions place the emphasis firmly upon the shoulders of Man himself, who thus becomes a new mythology rigged out with a rationality supposedly equal to the new calculability of the universe in its entirety.  So if God was stricto sensu unknowable, Man, who takes over the transcendental role, tends to become his own blind spot in an otherwise increasingly transparent frame of vision.  In other words, the enigma behind the question “who is Man?” turns out to be just that--enigmatic in a necessary way.  The answer would spoil everything.

5. Progress

Progress is a name for one of the ways in which mythological fate, or Man’s destiny, gets disseminated within the new vocabulary of modernity.  We have already learnt that empiricism has limited the field of knowledge to the finitude of empirical experience.  The effect of this is to transform that finitude (which has since Plato and throughout Christianity been nothing more than an impoverished and imperfect temporary condition) into the rich and fecund totality of existence itself (were it not for some pesky reservations that we must come back to shortly).  It’s not that finitude is any less impoverished just at the moment (as it were).  Rather, the new conception involves the belief that this finitude can be transformed by the scientific and moral manipulations of rational Man himself until it becomes that perfect world which everyone hitherto had assumed was in some fabulous beyond.  Modernity thus involves a rather spectacular shift in the experience of time itself.

Arguably, contemporary critical theory evolves as a set of responses to certain crises, or at least problems, with the general condition of what is often referred to as modernity.  This can be considered in terms of philosophical, aesthetic, historical, economic and technological developments that have occurred in the modern period, beginning in the sixteenth century.  However, modernity does not become visible as such without certain important trends within it, which reflect back upon it, often very critically.  These self-critical trends are often seen most clearly in the various forms of aesthetic production known as artistic or literary modernism.  What is striking about these modes of self-reflexive critique is, indeed, their aesthetic strategies.  Modernism is both a reaction to and a constituent part of modernity that involves a set of engagements utilising a variety of aesthetic, political and ideological strategies.  The modernist avant-garde, especially, attempts to shake up consciousness of the present and to rethink relations to the past and to the future.  The problems of modernity become clearer in the kinds of conspicuous engagements that are made in the name of modernism and these engagements thus constitute a vital source for critical theory.

Centrism

Modernity becomes visible only when the conditions that modern subjects take for granted emerge as problems.  These taken-for-granted conditions, then, turn out to be the consequence of an evolution of ideas, a historical development, which imperceptibly governs the thoughts and experiences of modern subjects.  Modernity thus names the constructed field of historical ideas and attitudes that underlie the experience of being modern.  A constituent part of that experience, however, is its naturalisation.  That is, the modern subject implicitly assumes that his or her experience and view of ordinary things like the self, the world and other people is the normal or natural view to take.  Scientific rationality (a uniquely modern perspective) is equated with a certain “common sense,” which only primitives and the insane or stupid would refuse.  Other perspectives tend to be relegated to the past (primitive cultures) or to the margins (third world cultures, women, lower classes, etc.).  This experience thus involves certain forms of centrism. 

Ethnocentrism

Ethnocentrism involves implicit or explicit privileging of one ethnic region over all others (which are thus placed on or outside the margins).  As the political, technological, aesthetic and philosophical achievements of the west develop, a parallel set of assumptions about the primitive and regressive nature of other parts of the world develops too.  Whenever problems have greeted the progress of the western cultures, on the other hand, a corresponding hyperbolic praise of exotic “others” around the globe has often accompanied the response.  The result of ethnocentrism is usually a forgetting of the specific ethnicity of the centralised ethos (western culture).  Instead, that ethos is regarded either as the universal against which everything else should be measured, or else as not quite finished, needing only this or that imagined quality to complete it.

Androcentrism

The root here is the Greek andr- (man, male), which also gives us the word androgynous, from Greek androgynos, meaning hermaphrodite.  Being androgynous could mean either having the characteristics or nature of both male and female, or being neither specifically feminine nor masculine.  Or it can mean having traditional male and female roles obscured or reversed, as in an androgynous marriage.  In each case the meaning implies a neutralising of the differences between the sexes.  Androcentrism, on the contrary, implies a normalisation of one sex (in western culture men) to the exclusion of all others.  So sexual difference, it is assumed, does not matter in the basic definition of what it is to be human.  This assumption, together with the highly stratified accounts of the actual differences between men and women, comprises one of the central contradictions of modernity.  The implicit assumption is that there is only one sex and that sex is male as opposed to female.  If one tried to live according to this contradiction the feminine could only cause trouble, which, of course, it does, to the impoverishment of actual women’s experience as they are forced to live inside this kind of representation.

Phonocentrism

This is a slightly surprising one, but since the work of the French philosopher Jacques Derrida began to appear in the early 1960s we have been getting used to the fact that this version of centrism is crucial.  Phonocentrism implies the privileging of the phonic unit as the key component of language.  What this means is that the spoken word tends to get privileged over the written one.  Somehow speaking, which seems very close to breathing, comes to be correlated with meaning.  The living truth of what I say begins its journey of compromise and corruption only when it is written down.  Hence the phonic unit (the spoken “a”) is privileged over the graphic unit (the written “a”).  As you can see on the page, there’s no real difference, is there?  However, what I say and what is written down bear no relation to each other whatsoever, except via the conventions by which we unconsciously associate the one (a) with the other (a).  As we go on we will see that this type of connection (i.e., absolutely no relation) comes to trouble and determine all attempts to identify and assert any unit of language whatsoever.  Derrida’s analyses suggest that this prejudice against the graphic mark is based on a desire to maintain belief in something that cannot be presented to experience but must nonetheless be affirmed, that is, the notion of transcendental truth.

Logocentrism

Logocentrism involves the belief in the existence and identity of meaning beyond and outside the various modes of representation, like language and other sign systems.  Sign systems include aesthetic genres as well as ideas, encompassing both epistemological ideas like understanding and reason and moral ones like virtue and crime.  In many cases philosophers have had to conclude that even the basic elements of experience, like perception, memory, imagination and surprise, are formed of systems of representation, and that the world we experience is at best the consequence of human modes of framing it.  The logocentric belief accepts the vulnerability of representation but asserts an independent truth, or a realm of meanings, that we get access to only through the most painstaking philosophical and/or scientific labour.  Generally, then, logocentrism involves the stubborn rejection of the inescapable power of rhetoric to make or break the world.  A committed critical theory is obliged to respect this attempt, for it reveals, on the one hand, the inescapable power of rhetoric and, on the other, possibilities of an outside or beyond that would allow for development or change away from oppressive forces.  Such an escape is not likely to be possible, but that needn’t mean we have to rest within oppressive systems and institutions of meaning, as we will discover.

Descartes’ Judgment

The philosophy of René Descartes represents, along with a number of other developments, a peculiarly modern way of thinking, one that reaches into all domains of thought and action.  What distinguishes Descartes from earlier forms of philosophy is the way in which he establishes the relation between thought and the world of objects.  As with Plato, the hierarchy remains.  Ideas inform our knowledge of the world around us.  Ideas are not sensible (reducible to the senses).  They are intelligible and belong to a dimension of thought, which operates independently of the empirical dimension.  Descartes’ project, as he presents it, is to establish a method for arriving at precise judgments about the world.  He therefore begins with the need to form clear and distinct ideas as a basis on which to build a more complex but no less precise epistemological edifice.  The interest that Descartes provides for a developing critical theory lies in the succinct elegance of his arguments.  Descartes contributes lasting treatments of (at least) three philosophical concepts.  First, the subject of philosophy, which stands over against the world of objects, has never before been given so much attention.  But the activity of the theorising, objectivising subject is for us another embodiment of philosophy’s great aporia, the difference between the empirical and transcendental dimensions.  The Cartesian subject is in principle entirely disembodied and free and is therefore entirely transcendent.  The empirical world lies at his disposal.  Once the right method is learned, Descartes argues, the subject can advance his knowledge unhindered.  Secondly, the precision of the judgment by which clear and distinct ideas are applied to the world of objects allows a knowledge of the universe and everything within it that had hitherto been impossible.  It is on this basis that we also witness the rapid growth of modern scientific thinking.  Thirdly, the idea of infinity as a clear and distinct idea replaces embodied notions of the Christian God.  Descartes argues that one of the most basic ideas we have is of God.  But once we read the texts carefully, we see that this God is essentially unknowable except as the idea of infinity.  Now, in this section I am going to explain how the privileged role of the subject, the refinement of the judgment, and the idea of infinity combine to give us a picture of the activity of theory.  Before looking at the philosophy of Descartes in more detail, let’s have a look at the problems that it must respond to.

Otherness, infinity and difference

Philosophy has always been concerned with relations to what for want of a better term we may call “the other.”  Philosophy's “other” is anything that cannot be conceptualised, that lies beyond the representations of the mind.  Many would argue that the feminine is one such other, philosophy being resolutely masculine (I wouldn’t argue that unreservedly, but there is plenty of evidence to show that the philosophical norm is often simply masculine).  Many would say that Western notions of otherness concern other cultures, peoples who do not fit the Western philosophical interpretation of humanity, which contrasts humans with animals and machines.  There are problems with these notions of otherness, not least because, strictly speaking, the other is unpresentable.  Otherness in general constitutes a philosophical problem in so far as it has no concept--it has no repeatable ideal form--so there is often a tendency for people to slot some empirical object in its place (it is possible to show that the woman and the African, for instance, are both “others” for certain powerful nineteenth-century westerners for whom the white male constituted the norm).  The problem is thus the relation between thought and its outside, and the identity of the thinker.  If you don’t have a concept of the other (as black and/or female, for instance) then it is just as difficult to come up with a definitive concept of the self.

So the problem of otherness had its correlative in seventeenth-century philosophy as the problem of maintaining a single unified subject who could remain objective in a world of plurality, difference and change.  If thought belongs to a mind (as Anaxagoras described it) that is both infinite and independent, then what is the relation between this naked singularity and the many objects that surround it?  What is the relationship between the one and the many?  You can always bring strange objects into your domain by attempting to categorise them under concepts.  But will there ever be an end to it?  Will there ever come a time when all the possible objects of knowledge have been categorised and stored up in ordered hierarchies?

The short answer to that is “no.”  For a start, “the many” certainly looks like being infinitely many.  But also, and more importantly, this infinity is not simply the number of particular others that would need to be taken into account by a total knowledge.  Rather this infinity is what remains absent in the finite world (including unheard-of arrivals and events that might occur in the future).  All this points to the existence in the finite world of something that at all times resists knowledge: otherness.  What this means is that otherness has something important to do with the infinite and, crucially here, something to do with the possibility of knowledge, which seems to come into being as a need in the face of otherness.  It is, paradoxically, a part of knowledge that lies outside knowledge, infinitely.  It is what is always missing from knowledge.

When I know a table I frame the object with my concepts of table (its status as furniture, its relation to chairs and to other tables, other types of table, the materials to which the concept of table gives shape and purpose, the wood, the glass, the metal); I add a whole network of ideas, beliefs and relations to what I see on the surface.  My knowledge, looked at in that way, is a set of restrictions that enables me to understand my world.  I share this knowledge with many others.  They include designers, manufacturers and retailers of tables, who know what they are to be used for and must thus follow the very conventions that also activate my knowledge of what to do with them, though many fashionable designers would stretch the conventions to their limits.  They also include my friends, relations and acquaintances, who often gather together around tables for eating, drinking, talking and related social rituals.  When westerners visit the east for the first time they may get confused about certain things.  What are those low ornate boards around which silk cushions are arranged?  The presence of containers on these boards signals that they serve in the same way as tables do in the West.  But how do you sit?  On your knees, or squatting, or with legs crossed?  Cultural difference often reveals knowledge to be limited, conventional and, of course, cultural.  But it is important to see that “otherness” is not “other cultures.”  Otherness is what makes other cultures (including yours) possible and it is the strangeness that intervenes between cultures (making both yours and mine strange).

Science fiction writers and film producers (maybe even set designers) have to consider unknowns beyond their cultural background.  What would a table in the twenty-third century be like?  The future is unknowable but it can be represented.  Isn’t that a terrible paradox?  How can you represent something that has never been present?  How can you represent the future?  The answer is through fiction--this is not a twenty-third century table; it is just a fiction.  Furthermore, the fiction is unlikely to depart very far from the conventions of a particular cultural background--you’ve got to think of your audience, after all.  But the paradox is not limited to science fiction.  Otherness is not just the future.  The unknown surrounds us like a night fog.  When something emerges from it or when we enter into it with the torches of our knowledge flaring we can only understand it (whatever it is) by the light of the concepts we already know, which we then apply until “the other” is no longer other.  Sure, the effects of otherness may change our knowledge, but never by giving us otherness itself.  Otherness (like the future, infinity, aliens etc.) remains always and by definition beyond the frame of our conscious gaze.  If there is something we cannot know we can still represent it through the possibility of inventing it.  This possibility seems to infect the possibility of all our knowledge.

The example of something as humble as a table is complicated enough, but what then do I mean when I say I know a person?  How much more complex it all is when compared to the conditions for knowing a table.  What does it mean when I say that you are “not acting yourself” or “that’s not like you”?  The sentence has both a literal meaning and a more involved implied meaning.  Literally the sentence says, “the way you are acting does not conform to the way you are.”  There is a discrepancy between what you are and the way you are acting.  The more involved implied meaning suggests that the way you are acting does not conform to the norms that my concepts of you provide.  Less analytically, if you like, the way you are acting does not conform to what I think of as “you.”  What that actually means depends upon context.  Perhaps you have fallen out of love with me and show less interest in me than you used to do.  Perhaps you are drunk and I want to censure you.  Perhaps you are being rude to someone and I want to caution you.  On the other hand, perhaps you have fallen in love with me and have given me a present.  Perhaps for the first time in months you are not drunk but articulate and sober, like Anaxagoras.  On another tack altogether, perhaps some devilish professor has created an android that looks just like you but hasn’t quite got the fine-tuning sorted out.  Or maybe your twin has turned up out of the blue as in some banal soap opera plot.

Where then does my concept “like you” come from?  When I meet someone for the first time I have to rely on some very flimsy presuppositions in order to form a preliminary knowledge.  Appearances help--dress, gender, age are all things that can help me form a preliminary category or concept because there are conventions in all societies that through habits of practice and representation (TV, film, advertising, magazines, etc.) allow us to make quick assumptions.  Anyway, I have nearly always already formed quite involved concepts on those issues.  Previous knowledge can help--what I know by what other people say or by repute may give me some idea of what to expect.  But of course appearances can also hinder--they are deceptive--and reporting by others, as we know, can be extremely unreliable.  In any case I cannot avoid prejudice (now hold on, I’m not prejudiced--but yes you are): knowledge depends on prejudice, on making judgments about an object before impartial judgments can be made.  To an extent all these prejudices (at least in terms of the basic judgments, if what I see is, say, a 28-year-old woman wearing jeans and a T-shirt or a 40-year-old man in an expensive suit and tie) can be tested when measured against repeated behaviour and things said and done.  My concept of you will be gradually “enriched” as time goes by just through your repeating certain anticipated ways of behaving.  (Depending on my psychological state I may well be capable of completely missing “uncharacteristic” ways of behaving if the “characteristic” ones are repeated regularly enough, which simply means that I see what I want to see.)  Only when you act “out of character” (out of concept) does the concept need modifying.  In a world where the likelihood is high that even one’s closest friends and relations might suddenly and unannounced act out of character or do something different, it is no surprise that so much is invested in making things “the same.”  Difference is disturbing.  It is as if getting to know someone is a form of domesticating his or her difference.  But at the same time we desire difference, as if mechanical repetition were somehow unreal.  As if a relationship would get dull or stale unless it was with someone for whom your concept was never complete.  Things will be all right as long as there is always the risk that after, say, twenty years of being together one of an apparently happy couple might suddenly leave the other.  If that did happen, of course, it would reveal an aspect that was never evident, acknowledged only in retrospect while the bereaved lover’s concept of the beloved undergoes its inevitable crisis.  What has happened?  The lover’s other, the otherness of the beloved, has outstripped the will-to-knowledge of love.

The problem of course is not just restricted to our dealings with other people.  As the philosophers never tire of explaining, knowledge, whether passed on by others or gained through painstaking observation, is intrinsically capable of deception.  This is why scientific knowledge demands so much rigour and painstaking empirical research.  Knowledge can take you in.  You might always have been wrong.  On the one hand this is clearly a bad thing for the philosopher who wants to know things with certainty.  But on the other hand, with the concept of “otherness” (which, remember, is not a concept at all), it is possible to begin to see this “bad thing” as being also something that makes things possible.  What would you have without the other?  Everything would be reduced to your own thoughts, cut off from the world of legislations, negotiations and love.  Take it a step further: without the possibility of such associations would you even have your own thought?  Don’t your concepts already come out of some other domain in order to be applied to the domain of the other?  Otherness: without it knowledge would be nothing.  Otherness is the possibility of knowledge.

How not to define the other

The point to be clear about is this.  The other is not a concept, not a name for some thing or someone.  It doesn’t name an object.  It is used to gesture towards that which is only ever noticeable by its absence.  How then is it possible to know it?  It isn’t.  Knowledge of the other is precisely impossible.  But it is possible to present accounts that demonstrate that this impossibility is absolutely necessary.  Remember that Plato uses analogy to gesture towards a truth that cannot be represented.  (The truth is this: the truth is impossible to represent.  Here is a good single-sentence précis of Plato: “it is impossible to represent the truth, and that’s the truth.”)

It is also possible to communicate by analogy the importance of not defining the other.  Psychology tells us that there are cases of people whose nervous disorders cause them to see nothing as a terrifying and persecuting presence, a “no-thing.”  Beginning with the absence of, say, the mother’s breast, they develop a pathological fear of absences, spaces, gaps, etc., each of which represents the monstrous no-thing, which seems to reflect back upon them the sense of their being nothing.  In place of nothing a terrifying monstrous presence is hallucinated.  What these cases illustrate is that in order to cope with existence we all must learn to cope with the vivid and threatening experience of the no-thing.  The very transition from infantile dependence to independent adulthood seems to be about learning to cope with absences, lacks and unconsummated desires.  But we often do this by replacing the nothing with a something that supports the sense of self--something we believe completes us, fills in the gap that is our own otherness.

Philosophy too sometimes seems unable to cope with the absences that foil aspirations to total, universal knowledge, the absences that make philosophy forever incomplete, fundamentally open (and that is what calls for closure).  So even things that cannot be named get named anyway, and in the place of the absent other some innocent third party fills the role.
