BARBELITH underground
 



"Stuff that works"

 
  


 
 
We're The Great Old Ones Now
11:14 / 11.10.02
Douglas Adams wrote that we're stuck with technology when all we want is stuff that works. I was reminded of this when I read Tom's recent ponderings of flawed but functional technology.

Then I thought about Teramac, a project involving flawed chips cross-patched so that the errors just aren't a problem. Something in me loves that idea, I don't know why.

Perhaps it's the part of me drawn to Rowland Emett's machines (if you find a decent link, let me know - I've never managed it yet). Emett said that 'proper' machines perform a function and excite fear and loathing, whereas his machines - constructed of old shoes, badminton racquets, saucepans and bicycle wheels, and no small quantity of utter lunacy - did nothing of any use and excited only love.

Maybe these things represent the beginning of a technology we can live with. A mechanic's technology, a human's technology, not the shining Futurist (fascist?) surfaces of technology as it's presented now. These machines have to try. They don't foster an illusion of perfection - either theirs or ours.

The encounter between the perfection of ideas and the real world is (I think according to Lyotard, but I may have the reference wrong) the basis of Terror. Reification is not a gentle process, it seems, but a violent, sharp-edged reshaping of the world, which costs human life. Perhaps a technology which is fuzzy is one way to cushion the blow.
 
 
Lurid Archive
12:33 / 11.10.02
I'm unsure about whether to contribute as I am not entirely clear on what your point might be, Nick. Let me just throw about a few thoughts.

My first reaction to your post is one of disagreement. It's not that there is any particular point I disagree with; I just think the post is misframed. If I think about it, I might even venture that the attitudes I perceive behind the post are exactly what I think is wrong with technology. Let me try to flesh that out.

Perhaps I should start with a point we agree on. Technology is, largely, badly designed. I remember having long conversations with a computing guy who spent lots of his time analysing and proposing solutions for the "design problem". In short, he felt that the problem was one of attitude. Technology is badly designed because good design requires foresight and initial investment. Most technology is designed by people with neither the resources nor the skills to do a good job. Moreover, his attempts to construct theories of good design were met with a lukewarm reception. He felt that his cross-disciplinary approach was one of the biggest stumbling blocks.

When I read your post, Nick, I get the impression of someone who has constructed a view of technology that borders on caricature. For you, technology embodies a cold "perfection" and is perhaps the "basis of Terror" not to mention potentially "fascist". Almost as if technology is a dangerous alien force to be feared. You like Emett's machines since they humanise technology, in your eyes.

I think what we have is a separation of people from the artifacts around them - echoes of Arthur C Clarke. Technology is something that 'they' do and 'we' should make fuzzier. Apart from being reminded of the Sokal article, I can't help feeling that this is like suggesting we combat racist literature by producing fuzzier, more human fonts. The problems with technology go to the core of how we, as a society, relate to it. And you can't relate to something without engaging substantially with it.
 
 
grant
13:38 / 11.10.02
Actually, I think what Nick is getting at is a technology that *admits its own fallibility*, rather than just blindly fumbling onward.
One that checks itself.
A sort of organic tech, one with no assumption of perfection... because that assumption is one of the greatest problems with tech design. You know, like the Titanic, or like pretty much every horror movie revolving around machines and computers. The "things" don't work right because they're overdesigned and incapable of accepting the possibility of error.
In other words, truly elegant machines check themselves. They stamp their feet on the rug before coming in, because they know their shoes might be dirty.
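grant's picture of a machine that stamps its feet before coming in translates quite directly into code: compute a result, then verify it independently before handing it over, and admit failure rather than return garbage. A minimal, purely illustrative sketch - the function name and error handling here are invented, not taken from any real library:

```python
# A computation that "checks its shoes at the door": it does not assume
# its own perfection, but verifies the answer against an independent
# criterion and raises an error instead of silently returning a bad value.

def checked_sqrt(x, tolerance=1e-9):
    """Newton's-method square root that verifies its own answer."""
    if x < 0:
        raise ValueError("no real square root")
    guess = x or 1.0
    for _ in range(100):
        guess = 0.5 * (guess + x / guess)
    # Self-check: does the answer actually square back to the input?
    if abs(guess * guess - x) > tolerance * max(1.0, x):
        # Admit fallibility rather than propagate a silent error.
        raise ArithmeticError("self-check failed for input %r" % x)
    return guess

print(checked_sqrt(2.0))  # ~1.41421356...
```

The self-check costs almost nothing here, but it turns an invisible failure mode into a visible one - which is close to what "elegant machines check themselves" means in practice.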

--------------

Oh, and Rowland Emett: [pictures of Emett's machines]

He apparently refers to himself as a "3D Cartoonist" and made the "breakfast machines" for the Chitty Chitty Bang Bang movie.

He's got stuff up in Ontario. It comes out once a year. He also apparently designed aircraft - real aircraft - during WWII.


"The first principle in science is to invent something nice to look at
and then decide what it can do." -- Rowland Emett
 
 
We're The Great Old Ones Now
13:41 / 11.10.02
I'm not sure there's a strong point yet, Lurid. And I'll state for the record that I'm a frantic neophile/technophile.

And grant's just said wise things and posted pictures, so I feel better.

As to my caricaturing technology - perhaps I'm caricaturing (or just examining) our relationship with it, and being sloppy about it. But since our relationship with technology ultimately shapes both us and it, and our perceptions will feed its design, and so on, I think the distinction is moot.
 
 
Lurid Archive
13:50 / 11.10.02
Doesn't that agree with much of what I've said above, Nick? I don't think you are being sloppy. Rather, I think that the problem is one of perception which, as you say, is ultimately the reality but approached from a particular direction.
 
 
We're The Great Old Ones Now
13:58 / 11.10.02
In which case the original picture stands...
 
 
Kit-Cat Club
16:17 / 11.10.02
Do any of you remember 'The Great Egg Race'? A programme in which teams would compete against each other to produce a contraption (from rubbish) which would perform a specified function at the end of the hour's recording... some of the results were very Heath Robinson, but quite a few of them did what they were supposed to.

It's efficiency really, isn't it? We want things that work... efficiently.
 
 
Lurid Archive
19:43 / 11.10.02
In which case the original picture stands...

Yes. In the sense that thinking about technology as a source of dehumanising evil is unlikely to have a positive effect on it. It stands more as a symptom of the problem than an analysis, however.
 
 
We're The Great Old Ones Now
08:43 / 13.10.02
I wasn't thinking about technology as a source of dehumanising evil. I was thinking about it as a possible area where the ethos of a creative endeavour does not necessarily mesh well with humanity. Hence my old favourite quote from my nuclear physicist house mate, "morality has nothing to do with science", which is both true and idiotic. The perception we have of how technology ought to be sometimes gets in the way of making 'stuff that works' and 'stuff which is human' in the sense of making machines for living with, rather than just things which do things in the most efficient and cost-effective way. Like any other area, the pure Idea in technology need not be the best thing for the lifeworld. But technology, as it is still often conceived, is about remodelling the world, and hence that pure idea is more legitimate than perhaps it should be.

That's 'technology' as much as the actual gizmos are.
 
 
Lurid Archive
13:03 / 13.10.02
Hmmm. Don't you think that the mismatch between scientific and technological advance and ethics or humanity may be a problem of perception?

I'd suggest that technology in the abstract has few innate qualities. It is what we make of it. I think the alienation that many feel feeds into a construction of technology that has some of the qualities you describe. I've said this above.

As for morality and science: I think that the separation is important. When scientific discoveries are informed by morality, we have an imposition of ideology. Economics - as in Deva's thread - is an example of this. However, what science is funded, its code of practice and its objectives are largely in the domain of morality. Science may inform morality - knowing what is, is a prerequisite for making a moral decision on what to do - but science doesn't dictate morality.
 
 
We're The Great Old Ones Now
14:04 / 13.10.02
I'm not at all sure you can separate gizmos from our attitude to them and our perceptions of them. Technology is what we think and how we relate to and deploy the engineered product as much as it is the gizmo itself.

And yes, science needs to be separated from ideology in order to do it properly, but there is no excuse for ignoring the implications of science and technology in the real world.
 
 
Lurid Archive
16:05 / 13.10.02
Technology can come to resemble our model for it. But we could think differently about it and so change it. In that sense, technology doesn't have to be a particular way and this is the sense in which I'd talk about separating perception from reality.

But who talks about ignoring the implications of science? Could it be our old bogeyman, the scientist? Perhaps this is a culture clash, but the opinions some people seem to have about scientists - their megalomania, amorality and hubris - just don't match my own experiences. If anything, it is non-scientists who seem to want to ignore science if it doesn't match their agenda.
 
 
We're The Great Old Ones Now
07:54 / 14.10.02
Lurid, we're going round in circles. You're talking about 'technology in the abstract' and I'm not at all sure there is such a thing. There's science and the applications of science, but 'technology' is always going to be a cultural shape. Technology is what actually gets done. It's informed by our attitudes. Yes, we could do something else - but only if we were else.

So when I talk about a possible alternative culture of technology which is enmeshed in the lifeworld rather than seeking to be perfect, I'm also, I suppose, talking about a kind of bootstrapping project operating on ourselves.

And as to who talks about ignoring the implications of science, I just told you. A bunch of scientists I spent time with at university. Obviously, that's not a representative sample, any more than your own is. Andrew's position was simple: his job was to find stuff out. Doing science did not include any reference to the possible consequences of his work. So if his employer asked him to speculate on the conditions in the heart of a theoretical star which was hotter and reacting more violently than stars do, he'd do it, without thinking that the 'heart of a star' discussions which took place in many scientific journals in the Cold War were a way for blockaded scientists to tell the rest of the world about the weapons they were working on, and that he was basically being asked to be part of a weapons programme. And just as a journalist has a responsibility to tell the truth, and not just hir editor's political opinion, so a scientist has a duty to think about what will be done with hir work.

Yes, it's quite true that many political agendas ignore science. That's another discussion.
 
 
Lurid Archive
09:21 / 14.10.02
You're talking about 'technology in the abstract' and I'm not at all sure there is such a thing.

I'm not making myself clear. It seems to me that you have fixed on technology as situated in the world. Fair enough. And then you are insisting on a particular way to look at it. You appear unwilling to allow different viewpoints, insisting that the picture you paint is the "real" one, while others are abstract. I'll accept that you have probably outlined a majority view, and hence it is worth debating. But if we were talking about majority views in the Headshop, say, we would also talk about alternatives, ways to change the mainstream and the problems that perception projects.

Isn't that the debate you want to have?

so a scientist has a duty to think about what will be done with hir work.

Couldn't agree more.
 
 
We're The Great Old Ones Now
10:00 / 14.10.02
I think, if you go back to my original post, you'll find it's considerably less dogmatic than that. If you want to quarrel with the idea that the acme of technology is seen as perfect function, fully reliable, textureless and devoid of idiosyncrasy/personality, and that error is seen as something 'wrong' rather than a part of normal operation, that's interesting, though perhaps a little far afield from where I was going.

The notion of perfection as Totalitarian/Terroristic is from Lyotard, and possibly Habermas. It's relatively obvious, both experientially and theoretically, that the encounter between an abstracted idea and the messy real world will lead to one of them having to compromise - and interestingly, people often compromise the (life)world rather than the idea/ideology, even where that means quite considerable suffering or remodelling. I included 'fascist' because I mentioned Futurism, and the geometric - but perhaps non-human - spaces of Futurism were, if I remember, a source of inspiration to Fascist architecture, and were heavily identified with the industrial-commercial world.

The possibility of Science itself being, both a means of knowing and hence controlling the world, and as the attempt by the Enlightenment Project to reduce human life to a set of rules, inherently Totalitarian, isn't an idea I really want to get into in this thread, but it's one which we might look at elsewhere.
 
 
Persephone
14:03 / 14.10.02
This is really interesting. I was thinking about it over the weekend when I went apple-picking and wished I had brought my camera, because there was an exhibit of old-timey machines - a hay baler at 1/3 scale, a washing machine and wringer, etc. All the machines were making as much noise as lawnmowers and were spewing black smoke and waggling and were conspicuously belt-operated. But the thing to take pictures of was the look of love on the people's faces looking at these comical machines. At one time, though, these machines were the face of technology. Did they seem perfect in their time? There's an interesting comparison in the Little House books, of all places - Laura's Pa rents a grain-threshing machine and is all for progress; Almanzo's father is against technology and flails by hand.
 
 
grant
14:55 / 14.10.02
I think you might be touching on something, Persephone - the gradual removal of "human" or "organic" elements from technology. Modern threshing machines are much less... anthropomorphosizable... than the old ones. They have less character, you know. They're less moody. They don't wobble when they operate, like people do.
 
 
Saveloy
15:59 / 14.10.02
I may have the wrong end of the stick here, or even the wrong stick entirely, but - can anyone think of any product that evokes 'fear and loathing' because it works? Or one that doesn't incite hatred in its owner when it fails? I have got the wrong stick, haven't I?

Regarding the anthropomorphising business that Persephone and grant mention: I personally (and I suspect this may be common) have no problem regarding even a door as being alive, but only, for some reason, when it goes wrong. And especially if it does me harm (see the Bad Design Kills by Degrees thread in the Conversation). I assume any assault, any refusal to perform as it should, to be a personal attack, a deliberate act. If not itself alive, it is channelling the malicious intentions of some distant, invisible, sentient being or power.

As long as something is doing what it is meant to do (by which I mean, what I want it to do, which may not be the same as what the manufacturer intended) then it really is just a lifeless machine, for me, and that is a huge source of joy. Its lifelessness, its neutrality is very appealing.
 
 
We're The Great Old Ones Now
16:22 / 14.10.02
Emett was talking, I think, about the sense that mechanisation was costing jobs. In which context I think it's pretty obvious where the fear and loathing comes from.
 
 
Persephone
16:31 / 14.10.02
it really is just a lifeless machine, for me, and that is a huge source of joy

Something that I frequently think about when I'm cleaning the house is the idea of getting a cleaning lady. It's kind of a meditative routine, I clean and I think about getting a cleaning lady. And it's always the same train of thought, I think about people I know who have cleaning ladies & how they always complain about this or that. I think about how people in the novels I read talk about the servants, how one has to take care about talking in front of the servants and so forth. Then I think that I definitely don't want cleaning people, but cleaning machines. I want a machine that I can treat like a machine. And it always ends with thinking about the robot maid on The Jetsons - a robot with a personality, with quirks and feelings and a will of its own & what a nightmare that would be.
 
 
Saveloy
12:51 / 15.10.02
Nick:

"Emett was talking, I think, about the sense that mechanisation was costing jobs."

Right, I think I get you now. I was getting hung up on the immediate, practical relationship between object and user (eg a man and a teapot), rather than the effects - possibly side-effects - that use of the object has on the world at large, including people who don't actually use it themselves - which I think is the problem that you are concerned with. Is that right? I apologise for not sussing it from your initial discussion with Lurid but I get confused easily.

Even if I have got that bit sussed, I'm still not clear what the new approach to technology that you propose would actually entail, what the practical application of it would involve. Could you give us some examples, or some idea of how it would work in the real world? Are you talking about physical changes, or legislation, or what?
 
 
We're The Great Old Ones Now
13:23 / 15.10.02
Sav - not to worry. My exchange with Lurid confused the hell out of me...

I'm not sure what we're talking about in practical terms. Certainly a different attitude to how technology works with humans - and certainly less technology which requires human behaviour to change in order to use it properly.

I keep thinking about Sci-Fi movies where the good guys have fucked-up, nuts and bolts tech, which requires clobbering and lots of shouting. Why is that? What does it tell us about them?

I'm wondering whether it's also about changing the notion of a 'unit'. Each computer of the same model is identical, right? Wrong. They never are. As soon as you put software on 'em, it becomes apparent that they're different. They have slightly different builds and firmware, and they work differently.

Teramac works because it acknowledges that it has flaws, and it is designed to get around them. A lot of other tech is predicated on the idea that it will come out of the factory perfect and remain that way.
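For what it's worth, the Teramac trick - test first, map the defects, route work around them - can be caricatured in a few lines. This is a toy sketch with invented names (the real Teramac used configurable FPGA crossbars and a hardware defect database, not anything like this):

```python
# A toy model of defect-tolerant hardware: a grid of unreliable compute
# cells, a self-test that records which cells actually work, and a
# scheduler that assigns tasks only to the known-good ones. The flaws
# are never repaired, just routed around.

import random

random.seed(42)  # deterministic "manufacturing defects" for the demo

class DefectTolerantArray:
    """A grid of unreliable compute cells plus a defect map."""

    def __init__(self, size, defect_rate=0.25):
        # Manufacture: some fraction of cells are simply broken.
        self.cells = [random.random() >= defect_rate for _ in range(size)]
        self.defect_map = None

    def self_test(self):
        # Probe every cell and record the indices of the working ones.
        self.defect_map = [i for i, ok in enumerate(self.cells) if ok]

    def run(self, tasks):
        if self.defect_map is None:
            self.self_test()
        good = self.defect_map
        # Assign each task to a known-good cell, skipping defects entirely.
        return {task: good[i % len(good)] for i, task in enumerate(tasks)}

array = DefectTolerantArray(size=100)
assignment = array.run(["fft", "sort", "render"])
# Every task lands on a cell that passed self-test, even though
# roughly a quarter of the "hardware" is dead on arrival.
```

The point of the caricature is the one Nick makes: the design assumes imperfection from the start, so individual failures cost capacity rather than correctness.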

I'm reaching for something. Help me out.
 
 
Pepsi Max
14:27 / 17.10.02
This discussion is still positing technology as something human beings interact with. Which, to a certain extent, is true. However, you can also view human beings and things as actors in huge (or tiny) networks of machination.

Nick> Latour's idea of 'black box' might be of use here. Maybe what you are looking for is machinery that doesn't try to portray itself as a 'black box' but owns up to its place in the network - and allows us to own up to our place too.
 
 
Cat Chant
08:54 / 18.10.02
I keep thinking about Sci-Fi movies where the good guys have fucked-up, nuts and bolts tech, which requires clobbering and lots of shouting. Why is that? What does it tell us about them?

That they have a very laudable distrust of efficiency. I keep thinking about the Vonnegut/Trout short story where everything is done by machines & people are encouraged to kill themselves - the guy in the suicide booth asking What are people for? There's something scary about the vision of a perfectly smooth, perfectly efficiently-running world which would make human input redundant. Technology constantly poses and repositions the question of the human relation to the world... And, because most of the theorists I read are big Heideggerians, I keep thinking about the (fairly obvious) link between technology/automation and fascism that he poses, when the factory tech of mass production, with its orientation towards efficiency, became implicated in the "mass reproduction" of human society.

Not that I'm anti-technology, obviously. (I interpret technology pretty broadly, as any structure or implement which affects or effects human interaction with the world, so I couldn't be.) I suppose the flip side of the sci-fi good guys' clunky machinery is that it makes it clear that the machinery is subordinate to the human, whilst also making the machines' resistance visible: you can see what's going wrong (and fix it), rather than being spookily drawn into the correct functioning as invisible (and hence irresistible). But it still makes the relation between human and tech into an intractable thing, rather than positing technology as an invisible, indispensible, relation between human and world, which might be what the spooky thing is.
 
 
Cat Chant
08:58 / 18.10.02
And I just spotted this

"How do you prevent people from doing bad things? I don't think this is a technical problem," says Balakrishnan.

over in this thread.
 
 
Lurid Archive
10:52 / 18.10.02
Not sure I should really be contributing here, since the thread is about exploring an idea rather than challenging it. But I thought that I should just reiterate that I don't share the view of most of the contributors here. For instance,

I keep thinking about the (fairly obvious) link between technology/automation and fascism - Deva

It may be my ignorance of "Heideggerians" and others showing, but I think that this association is far from obvious. Much of the thread, in fact, seems to me a bewildering sequence of almost tautological deductions from preconceived positions. The common thread is the implicit agreement that actual constraints on technology are superfluous to the discussion.

For instance, the words "perfection" and "error" are deployed in such a way as to make the link with fascism almost inevitable. Here "perfect" has the qualities of being frigid, inhuman and uncaring. This is a rather one-sided usage of the word that plays off its other meanings. Think of Lou Reed's "Perfect Day". (I know I'm objecting to rhetoric, but it is so central to the discussion that I can't really avoid it.)

By way of contrast, I would say that any engineer would quickly concede that the human hand is the most "perfect" gripping tool ever seen. The fact that it (or some equivalent) isn't part of human technology says more about the difficulties of design than about the ideology of engineers. One example of this is Gaudí's Sagrada Família cathedral in Barcelona, which has an organic design. In fact, it is mathematically more "perfect" than conventional designs. It is just harder to build.

I was going to contribute earlier and say that I thought the opinions expressed seemed to borrow heavily from dystopian sci-fi which again feeds into my point about the tautological nature of the discussion. But I thought that would be too snarky.

However, I'm sure that you are all aware that not all sci-fi is dystopian. Iain Banks' Culture is founded upon absolute perfection while the society is utopian and anarchistic.
 
 
We're The Great Old Ones Now
13:15 / 18.10.02
Deva: I keep thinking about the (fairly obvious) link between technology/ automation and fascism...

Lurid: I think that this association is far from obvious.


It is at least well-grounded historically, in terms both of the emergence of Fascism and the link between industrialisation and Imperialism/Totalitarianism. In the theoretical arena, it's also not a difficult case - see Lyotard and Habermas.

The common thread is the implicit agreement that actual constraints on technology are superfluous to the discussion.

Would you mind unpacking that, please? I don't get it.

the opinions expressed seemed to borrow heavily from dystopian sci-fi which again feeds into my point about the tautological nature of the discussion

Actually, I think it may be the other way around. Many dystopian writers draw in concerns about technology which have a theoretical basis.

However, I'm sure that you are all aware that not all sci-fi is dystopian. Iain Banks' Culture is founded upon absolute perfection while the society is utopian and anarchistic.

At the risk of derailing the thread, I think you're mistaken. The Culture is rather less clear cut than that - it has an anarchistic flavour for as long as there are no external threats or pressing needs, but it can and does pressgang its citizens, assassinate, torture, and wage war. The 'Special Circumstances' section of Contact is a far from benevolent crew which is feared and worshipped in equal measure by Culture citizens. The Culture also acts with extreme force and utter lethality against non-member civilisations at times. You could easily characterise it as a Totalitarian or Fascist Utopia which simply hasn't any issues of scarcity. Banks' ambivalence toward the Culture is evident in the books - if you don't believe me, read 'em again.
 
 
grant
13:56 / 18.10.02
Is now a good point to insert the idea that "efficient" technology is good for doing a single task or task-set, while "efficient" organic tools (which includes hands as well as spaceships held together with duct tape) are good for adapting to multiple tasks, including unforeseen ones?
It's two different approaches to the single word "efficiency."

For instance, the human hand can grip a pencil, a jam jar, a wriggling infant and a falling fruit. That's efficient. But a robot arm built to grip a particular way - say, to crush walnuts in a walnut-packing factory, or to pick up rocks on the surface of Mars - is going to be able to do that one repeated task with more strength, stamina and accuracy than any human. That's also efficient - but in the scary way, not the duct-tape way.

Maybe it's that the duct-tape efficiency has the capacity to transcend design - to outperform its specifications.
 
 
Lurid Archive
14:35 / 18.10.02
It is at least well-grounded historically, in terms both of the emergence of Fascism and the link between industrialisation and Imperialism/Totalitarianism

I'm sure you'll disagree with me, but isn't there a risk of confusing an enabling factor with a causative one? Technology has been very successful in increasing levels of wealth - here I mean wealth in a general sense, not necessarily tied to capitalism. A political movement is likely to be more significant if supported by resources, but it doesn't follow that a certain kind of wealth encourages Fascism and/or Totalitarianism. This is debatable, of course.

One might argue that, given the opportunity, humans are a nasty bunch and so technology is bound to make it worse. Unless one makes an attempt to explain a historical connection, things are unclear. Correlation and causation are not the same.

If you could give me more specific references for Lyotard and Habermas, I would be grateful.

The common thread is the implicit agreement that actual constraints on technology are superfluous to the discussion.

Would you mind unpacking that, please? I don't get it.


Grant's comment that machines are less anthropomorphisable, for instance, ignores the constraints of expertise and physics. Also, the analysis that puts "perfection" as the goal of technology is seen as obvious, while I would think it required some support, especially if seen as an inescapable trend. The implicit standing assumption is that form is purely a product of viewpoint. Arguable for art, more problematic for technology.

The Culture:

Again, we disagree. Do we agree about anything, Nick?
Granted, Special Circumstances do some very unpleasant things. But most of what they do is minimal in terms of harm while maximising a particular impression for effect.

Also, Banks is not ambivalent about the Culture, unless it is ambivalence to acknowledge flaws. For example here,

I've always said the Culture is as close as something remotely like us could ever get to a genuine Utopian state; I never said it was perfect - Banks

He just enjoys exploring the morally dubious aspects of it, and exploring difficult situations. The point you make about the wealth of the Culture is ok, but it is a strange Totalitarian state where the "rulers" have the ability to take complete control but decide not to and bend over backwards to maintain a self imposed moral code.

To call the Culture "Totalitarian" requires impressive intellectual gymnastics. Ironically, the opponents of the Culture in the books who think of it as a controlling state are technophobes. But you are right, this is hopelessly off topic.
 
 
Saveloy
14:41 / 18.10.02
I'm having another meandering splurge....

Nick:
"I keep thinking about Sci-Fi movies where the good guys have fucked-up, nuts and bolts tech, which requires clobbering and lots of shouting. Why is that? What does it tell us about them?"

Deva:
"That they have a very laudable distrust of efficiency."

I wouldn't infer, from the examples that I can remember (and there may well be many that I can't remember which will prove me wrong), that the good guys actually choose to have inefficient machinery, or consider it to be the right and proper alternative to stuff that works. They certainly don't seem to be particularly chuffed when it goes wrong (see Han Solo and Leia in The Empire Strikes Back - the Millennium Falcon escape near the beginning). Often the suggested reason the tech fails is that it is old and has long been in the goodies' service, so you can reasonably assume that they once knew it when it was new and efficient. The fact that they use it is meant to throw them in a good light, sure, but it seems more likely to me to be because:

a) it reinforces the good guys' underdog status

and/or

b) they are prepared to put up with inefficiency - which doesn't necessarily imply that they dislike or mistrust efficiency - possibly because they have a long-standing relationship with the thing in question. We are meant to appreciate their loyalty to a trusty (rusty) friend who just happens to be mechanical (see again Han Solo and the Millennium Falcon).

Which brings me back to anthropomorphism, which Sci-Fi (especially film) loves to use as a way to make perfectly efficient machines lovable and interesting. And I suggest that that is not because efficient, un-humanised machines are inherently bad or threatening, but simply because they are dull (except, occasionally, as a spectacle). A classic example of that is Silent Running, where one of the cute robots (non-humanoid, but very toddler-like in size and gait) gets blown into space and its remaining leg is given a formal burial among the plants that it tended, and we all cry copious tears. The robots in question are completely unthreatening, but they also happen to be perfectly reliable and hard-working. They do start off with some basic toddler-ish characteristics (all physical, to do with the way they move), which are only by-products of their design. It is their owner's treatment of them that humanises them further - reprogramming them, teaching them cards, etc. The point is, they don't lose any of their efficiency, or become any less reliable at doing the jobs they are assigned to. There is nothing to suggest, btw, that their owner could not just as easily teach them to burn people's faces off.

[ed. - this sort of ties in with grant's observation about adaptability, maybe?]
 
 
We're The Great Old Ones Now
15:31 / 18.10.02
isn't there a risk of confusing an enabling factor with a causative one?

There's always a risk of that kind of thing in historical analysis. I don't happen to think I'm doing it, however.

As to specific references from Lyotard and Habermas, you can do the websearch as well as I can - I'm drawing on my memory, not sitting in a library. The notion of the encounter between perfect idea and fleshy world resulting in Terror comes directly from Lyotard, I think, possibly in 'Answering the Question: What is Postmodernism?' - though he is also (as I am, remember) in favour of technology in many ways. Elizabeth Flynn of Michigan Technological University writes of 'The Postmodern Condition': [Lyotard's] insights continue to have relevance today. As a faculty member at a technological and highly bureaucratized and computerized university, I certainly feel increasingly powerless in the face of a growing emphasis on performativity and a decreasing emphasis on truth and justice. And I am sure I am not alone. If Lyotard was right fifteen years ago (and I certainly hope he has not changed his mind on this), then what we need to alter this reality is more information, more discussion, more knowledge. What we need is a new politics that would respect the unknown, the unwritten.

The concept of the scientific/Enlightenment project as one of control and domination of the natural world is derived from Horkheimer and Adorno's 'Dialectic of Enlightenment'.

There's also an interesting article here on war and technology (the mechanisation of warfare is identified by Giddens as one of the defining aspects of the Modern era).

And Banks - it seems you're correct. He's not ambivalent about the Culture, but rather about the human ability to achieve Utopia at all. And Special Circumstances do not do 'minimal' harm, they do whatever harm is necessary to achieve their goal. They employ frankly and acknowledgedly disturbed, dangerous, and sadistic entities who do not use 'minimum force'. They strive to generate a reputation for lethality and sudden and total assaults or revenges - in other words, they are experts in the use of political Terror. That the vast majority stay in line with the Culture's desires, and therefore do not feel the brunt of their actual force, is not relevant, nor is their rather woolly version of Star Trek's Prime Directive. The Culture's peace is based on utterly overwhelming military might and a demonstrable willingness to use it, combined with a secret police of horrendous power and reach. What's intriguing about this is that almost everyone who reads the novels would be happy to live in the Culture for all that - largely because the Minds and Ships have cool, funny names, and the day-to-day business of running the empire is conducted via witty email. In other words, Banks' Culture has the best image of any Totalitarian state yet posited.
 
 
Pepsi Max
05:52 / 19.10.02
Nick> Theoretical things first:

Habermas - H's beef is specifically with 'technocracy' (see "Technology and Science as 'Ideology'" - a paper published in the late 60s/early 70s, I think): the use of technological jargon and means-ends rationality to control the public sphere and undermine democratic debate. He has little to say about technology as such. His interest is more in rhetoric.

Lyotard - In "The Postmodern Condition", 10 years later, Lyotard makes a similar comment regarding the rise of 'performativity' (i.e. means-ends rationality) as the over-riding criterion for judgement. Although he talks about Science (and identifies certain sciences as postmodern - e.g. the then-fashionable catastrophe theory, pretty much anything to do with information theory), he's pretty weak on technology as such. (The subtitle to this book is 'A Report on Knowledge'.)

To this list you could probably add Deleuze and his Societies of Control.

But again, as far as I can see most of these theorists don't *get* technology. They fail to understand the complexity of what they're dealing with.

It is at least well-grounded historically, in terms both of the emergence of Fascism and the link between industrialisation and Imperialism/Totalitarianism.

How about the anti-technological totalitarian states you get - China under the Cultural Revolution, Cambodia under Pol Pot?

You can make a link between the Industrial Revolution and Imperialism - new machines leading to massive overproduction of commodities and the need to find new markets in which to sell these goods. But demographics, economics and geopolitics also come into play. I think your argument is in danger of getting overdeterministic.

There is no link that I can see between Technology and Nazism (even less with the Italian Fascist variant). Please spell it out to me.

saveloy> Absolutely agree with you. More later.
 
 
We're The Great Old Ones Now
07:52 / 19.10.02
Please spell it out to me.

Why? You're obviously smart enough. Do some work if you're interested. Haus does crit, I do synth. I'm bored of popping out ideas so other people can try to find the holes in 'em.
 
 
Lurid Archive
12:07 / 19.10.02
Nick. Sadly, I could argue about the Culture point by point. I don't agree with either your reading or your analysis but I think it would take us too far from the discussion. Your idea that Sci-fi naturally makes the association between technology and dystopia is also one I disagree with. But if you really think that the Culture is a Totalitarian state then I see why you might think that.


isn't there a risk of confusing an enabling factor with a causative one?

There's always a risk of that kind of thing in historical analysis. I don't happen to think I'm doing it, however.


I realise that you don't want to spend your time arguing the case, but I don't think that you have done anything more than assert a position. Unlike Deva, I find your ideas far from obvious and it isn't laziness to want you to spell it out. An unsupported and bald "idea" is of limited value.

But again, as far as I can see most of these theorists don't *get* technology. They fail to understand the complexity of what they're dealing with. - Pepsi Max

At the risk of sounding arrogant and dismissive, that has been my experience with much postmodernist writing about science and technology. In fact, Sokal and Bricmont wrote a whole chapter about Lyotard, in the section where they criticise people who talk about science while making gross errors.
 
 
Pepsi Max
12:40 / 20.10.02
Nick>

Why? You're obviously smart enough. Do some work if you're interested. Haus does crit, I do synth. I'm bored of popping out ideas so other people can try to find the holes in 'em.

Oh right, I'll just go off and talk to myself.

"So, do you think Nick's referring to the links between Marinetti and Mussolini? Or maybe he's talking about the relationship between IBM and the Nazis? Or perhaps he's simply reiterating Marcuse's arguments?"
"Well, I'd like to be able to answer you, but the fairies have borrowed my shiny, fascistic mind-reading machine and won't bring it back."

Feel free to join in when the mood takes you.
 
  
