Avatar | |
---|---|
Theatrical release poster | |
Directed by | James Cameron |
Produced by | James Cameron, Jon Landau |
Written by | James Cameron |
Starring | Sam Worthington, Zoë Saldaña, Stephen Lang, Michelle Rodriguez, Giovanni Ribisi, Sigourney Weaver |
Music by | James Horner |
Cinematography | Mauro Fiore |
Editing by | James Cameron, John Refoua, Stephen E. Rivkin |
Studio | Lightstorm Entertainment |
Distributed by | 20th Century Fox |
Release date(s) | December 16, 2009 (world premiere);[1] December 17, 2009 (Puerto Rico, Australia & New Zealand);[1] December 18, 2009 (USA)[2] |
Running time | 161 minutes[3] |
Country | United States |
Language | English |
Budget | US$237 million[4] |
Avatar, also known as James Cameron's Avatar, is an American 3-D science fiction epic film written and directed by James Cameron, due to be released on December 16, 2009[1] by 20th Century Fox. The film is produced by Lightstorm Entertainment and focuses on an epic conflict on a far-away world called Pandora, where humans and Pandora's native species, the Na'vi, wage war over the planet's resources and the Na'vi's continued existence.[5]
The film will be released in 2D and 3D formats, along with an IMAX 3D release in selected theaters. It is being touted as a breakthrough in filmmaking technology for its development of 3D viewing and stereoscopic filmmaking, using cameras specially designed for the film's production.[6] The film has been rated PG-13 by the United States MPAA for "intense epic battle sequences and warfare, sensuality, language and some smoking."[7]
In A.D. 2154,[8] the story's protagonist, Jake Sully (Sam Worthington), is a former U.S. Marine who was wounded and paralyzed from the waist down in combat on Earth. Jake is selected to participate in the Avatar program, which will enable him to walk again, and travels to Pandora, a lush, jungle-covered moon inhabited by sentient life and orbiting Polyphemus, one of three gas giants that circle Alpha Centauri A,[8] 4.3 light years from Earth.
Pandora's biosphere is filled with incredible life forms, some beautiful, many terrifying. This world is also home to the Na’vi, a sentient humanoid race, who are considered primitive, yet are more physically capable than humans. Standing three meters tall (approximately 10 feet), with tails and sparkling blue skin, the Na’vi live in harmony with their unspoiled world. As humans encroach deeper into Pandora's forests in search of valuable minerals, the Na’vi unleash their formidable warrior abilities to defend their threatened existence.
Jake has unwittingly been recruited to become part of this encroachment. Since humans are unable to breathe the air on Pandora, they have created genetically bred human-Na'vi hybrids known as Avatars. On Pandora, through his Avatar body, Jake will be able to walk again. Sent deep into Pandora's jungles as a scout for the soldiers that will follow, Jake encounters many of Pandora's beauties and dangers. There he meets a young Na'vi female, Neytiri (Zoë Saldaña).
Over time, Jake integrates himself into the Na'vi clan, and begins to fall in love with Neytiri. As a result, Jake finds himself caught between the military-industrial forces of Earth and the Na’vi, forcing him to choose sides in an epic battle that will decide the fate of Pandora.
In 1994, director James Cameron wrote a 114-page scriptment for Avatar.[9] Cameron said his inspiration was "every single science fiction book I read as a kid", and that he was particularly striving to update the style of Edgar Rice Burroughs' John Carter series. In August 1996, Cameron announced that after completing Titanic, he would film Avatar, which would make use of "synthetic", or computer-generated, actors.[29] The project would cost $100 million and involve at least six actors in leading roles "who appear to be real but do not exist in the physical world".[30] Special effects house Digital Domain, with whom Cameron has a partnership, joined the project, which was supposed to begin production in the summer of 1997 for a 1999 release.[31]
In June 2005, it was announced that Cameron was working on a project tentatively titled "Project 880", concurrently with another project, Battle Angel.[32] By December, Cameron said that he planned to film Battle Angel first for a summer 2007 release, and to film Project 880 for a 2009 release.[33] In February 2006, Cameron said he had switched the order of the two projects – Project 880 was now scheduled for 2007 and Battle Angel for 2009. He indicated that the release of Project 880 might be delayed until 2008.[34] Later that February, Cameron revealed that Project 880 was "a retooled version of Avatar", a film that he had tried to make years earlier,[35] citing the technological advances in the creation of the computer-generated characters Gollum, King Kong and Davy Jones.[9] Cameron had chosen Avatar over Battle Angel after completing a five-day camera test the previous year.[36]
Cameron's early scriptment for Avatar had circulated on the Internet for years. When the project was re-announced, copies were subsequently removed from websites.[37] In June 2006, Cameron said that if Avatar was successful, he hoped to make two sequels to the film.[38]
From January to April 2006, Cameron worked on the script. Working with Paul Frommer, a linguist and Director of the Center for Management Communication at USC, he developed an entire language and culture for the Na'vi, the indigenous race on Pandora.[9] In July, Cameron announced that he would film Avatar for a summer 2008 release and planned to begin principal photography with an established cast by February 2007.[39] In August, the visual effects studio Weta Digital signed on to help Cameron produce Avatar.[40] Stan Winston, who had collaborated with Cameron in the past, joined Avatar to help with the film's designs.[41] In September 2006, it was announced that Cameron would use his own Reality Camera System to film in 3-D. The system would use two high-definition cameras in a single camera body to create depth perception.[42]
At Comic Con 2009, Cameron told attendees that he wanted to make "something that has this spoonful of sugar of all the action and the adventure and all that, which thrills me anyway as a fan, but also wanting to do something that has a conscience, that maybe in the enjoying of it makes you think a little bit about the way you interact with nature and your fellow man."[43] He added that "the Na'vi represent something that is our higher selves, or our aspirational selves, what we would like to think we are," and "the humans in the film, even though there are some good ones salted in, represent what we know to be the parts of ourselves that are trashing our world and maybe condemning ourselves to a grim future."[43]
The themes most cited are imperialism and biodiversity.[44]
"It's this form of pure creation where if you want to move a tree or a mountain or the sky or change the time of day, you have complete control over the elements."
—James Cameron on virtual filmmaking[45]
In December 2006, Cameron explained that the delay in producing the film since the 1990s had been to wait until the technology necessary to create his project was advanced enough. The director planned to create photo-realistic computer-generated characters by using motion capture animation technology, on which he had been doing work for the past 14 months. Unlike previous performance capture systems, where the digital environment is added after the actors' motions have been captured, Cameron's new virtual camera allows him to observe directly on a monitor how the actors' virtual counterparts interact with the movie's digital world in real time and adjust and direct the scenes just as if shooting live action; "It’s like a big, powerful game engine. If I want to fly through space, or change my perspective, I can. I can turn the whole scene into a living miniature and go through it on a 50 to 1 scale."[46] Cameron planned to continue developing the special effects for Avatar, which he hoped would be released in summer 2009. He also gave fellow directors Steven Spielberg and Peter Jackson a chance to test the new technology.[47] Spielberg and George Lucas were also able to visit the set to watch Cameron direct with the equipment.[48]
Other technological innovations include a performance-capture stage called The Volume, six times larger than any previously used, and an improved method of capturing facial expressions. The tool is a small, individually fitted skull cap with a tiny camera mounted in front of the actor's face; the camera collects information about facial expressions and eye movement and transmits it to the computers. In this way, Cameron intends to transfer about 95% of the actors' performances to their digital counterparts. Besides a real-time virtual world, the team is also experimenting with a way of letting computer-generated characters interact with real actors on a live-action set while shooting live action.[49]
In January 2007, Fox announced that the studio's Avatar would be filmed in 3D at 24 frames per second. Cameron described the film as a hybrid, with a full live-action shoot combined with computer-generated characters and live environments. "Ideally at the end of the day the audience has no idea which they're looking at," Cameron said. The director indicated that he had already worked four months on nonprincipal scenes for the film. Principal photography began in April 2007[50] and took place in and around Los Angeles as well as in New Zealand. The live action was shot with a modified version of the proprietary digital 3D Fusion Camera System, developed by Cameron and Vince Pace.[51] According to Cameron, the film will be composed of 60% computer-generated elements and 40% live action, as well as traditional miniatures.[52] The performance-capture photography would last 31 days at the Hughes Aircraft stage in Playa Vista, Los Angeles, California.[36][53] In October, Cameron was scheduled to shoot live action in New Zealand[16] for another 31 days.[9]
To create the human mining colony on Pandora, production designers visited the Noble Clyde Boudreaux drilling rig in the Gulf of Mexico during June 2007. They photographed, measured and filmed every aspect of the rig, which would be replicated on screen with photorealistic CGI.[54] More than a thousand people worked on the production.[53] Before shooting began, Cameron sent the cast off to the jungle for boot-camp bonding exercises.[55]
A.I. Artificial Intelligence | |
---|---|
Directed by | Steven Spielberg |
Produced by | Steven Spielberg, Stanley Kubrick, Jan Harlan, Kathleen Kennedy, Walter F. Parkes, Bonnie Curtis |
Written by | Short story: Brian Aldiss; Screen story: Ian Watson; Screenplay: Steven Spielberg |
Narrated by | Ben Kingsley |
Starring | Haley Joel Osment, Frances O'Connor, Jude Law, Sam Robards, Jake Thomas, William Hurt |
Music by | John Williams |
Cinematography | Janusz Kamiński |
Editing by | Michael Kahn |
Studio | Amblin Entertainment |
Distributed by | Warner Bros. (USA), DreamWorks (non-USA) |
Release date(s) | June 29, 2001 |
Running time | 146 minutes |
Country | United States |
Language | English |
Budget | $100 million |
Gross revenue | $235,926,552 |
A.I. Artificial Intelligence, also known as Artificial Intelligence: A.I. or simply A.I., is a 2001 science fiction film directed, produced and co-written by Steven Spielberg. Based on Brian Aldiss's short story Super-Toys Last All Summer Long, the film stars Haley Joel Osment, Frances O'Connor, Jude Law, Sam Robards, Jake Thomas and William Hurt. Set sometime in the future, A.I. tells the story of David, a child-like android programmed with the unique ability to love.
Development of A.I. originally began with Stanley Kubrick in the early 1970s. Kubrick hired a series of writers up until the mid-1990s, including Brian Aldiss, Bob Shaw, Ian Watson and Sara Maitland. The film languished in development hell for years, partly because Kubrick felt computer-generated imagery was not advanced enough to create the David character, whom he believed no child actor could believably portray. In 1995, Kubrick handed A.I. to Steven Spielberg, but the film did not gain momentum until Kubrick's death in 1999. Spielberg remained close to Watson's film treatment for the screenplay and replicated Kubrick's secretive style of filmmaking. A.I. was greeted with mostly positive reviews from critics and became a moderate financial success. The film is dedicated to Kubrick's memory, with a brief credit after the closing credits reading "For Stanley Kubrick".
In the mid-22nd century, global warming has led to ecological disasters all over the world and a drastic reduction of the human population. Humanity's best efforts to maintain civilization have led to the creation of mechas, advanced humanoid robots capable of emulating thoughts and emotions. In 2104, David (Haley Joel Osment), an advanced prototype created by Cybertronics, is designed to resemble a human child and to feel love for its human owners. The company tests its creation on one of its employees, Henry Swinton (Sam Robards), and his wife Monica (Frances O'Connor). The Swintons have a son, Martin (Jake Thomas), who has been placed in suspended animation until a cure can be found for his rare disease. Although Monica is initially frightened of David, she eventually warms to him after activating his imprinting protocol, which irreversibly causes David to feel love for her as a child loves a parent. As he continues to live with the Swintons, David is befriended by Teddy (voiced by Jack Angel), a robotic teddy bear who takes responsibility for David's well-being.
Martin is suddenly cured and brought home, and a sibling rivalry ensues between Martin and David. Martin's scheming backfires when he and his friends activate David's self-protection programming at a pool party. Martin is saved from drowning, but David's actions prove too much for Henry. It is decided that David will be destroyed at the factory where he was built, but Monica instead abandons him, along with Teddy, in a forest to live as an unregistered mecha. David is captured for an anti-mecha Flesh Fair, an event where useless mechas are destroyed before cheering crowds. David is nearly killed, but the crowd is swayed by his realistic nature (David, unlike other mechas, pleads for his life) and he escapes along with Gigolo Joe (Jude Law), a male prostitute mecha on the run after being framed for murder.
The two set out to find the Blue Fairy, whom David remembers from the story The Adventures of Pinocchio. As in the story, he believes that she will transform him into a real boy so that Monica will love him and take him back. Joe and David make their way to the decadent metropolis of Rouge City, where information from a holographic volumetric-display answer engine called "Dr. Know" (voiced by Robin Williams) eventually leads them to the top of Rockefeller Center in the flooded ruins of New York City. They travel there in a submersible vehicle (named the Amphibi-copter) stolen from the authorities, who are still hot on Joe's tail. In New York, David destroys an android that looks exactly like him; his human creator, Professor Hobby (William Hurt), then appears and excitedly tells David that finding him was a test, one that has demonstrated the reality of his love and desire. A disheartened David attempts suicide by falling from a ledge into the ocean, but Joe rescues him. Shortly afterwards, Joe is captured by the authorities with the use of an electromagnet, but David escapes.
David and Teddy take the submersible to the fairy, which turns out to be a statue from a submerged attraction at Coney Island. Teddy and David become trapped when the Wonder Wheel falls on their vehicle. Believing the Blue Fairy to be real, he asks to be turned into a real boy, repeating his wish without end, until the ocean freezes.
Two thousand years later, Manhattan is buried under several hundred feet of glacial ice, and humans are extinct.[1] Earth is now being excavated and studied by advanced humanoid robots, which appear to be made mostly of energy and to possess some form of telekinesis and telepathy.[2] They find David and Teddy, the only two functional mechas who knew living humans. David wakes up and, coming upon the frozen Blue Fairy, tries to touch her; because of the time passed and the damage to the statue, it cracks and collapses immediately, and David realizes that the fairy was fake. Using David's memories, the alien-looking humanoid mechas reconstruct the Swinton home and explain to him, via a mecha of the Blue Fairy (voiced by Meryl Streep), that he cannot become human. However, they recreate Monica from a lock of her hair that Teddy has faithfully saved; she will live for only a single day, and the process cannot be repeated, as one of the aliens (voiced by Ben Kingsley) explains. David spends the happiest day of his life playing with Monica and Teddy. Monica tells David that she loves him and has always loved him as she drifts slowly away from the world. This is the "everlasting moment" David had been looking for; he closes his eyes and goes "to that place where dreams are born".
Minority Report | |
---|---|
Theatrical release poster | |
Directed by | Steven Spielberg |
Produced by | Gerald R. Molen, Bonnie Curtis, Walter F. Parkes, Jan de Bont |
Written by | Screenplay: Scott Frank, Jon Cohen (uncredited: John August); Short story: Philip K. Dick |
Starring | Tom Cruise, Colin Farrell, Samantha Morton, Steve Harris, Neal McDonough, Max von Sydow |
Music by | John Williams |
Cinematography | Janusz Kamiński |
Editing by | Michael Kahn |
Studio | Amblin Entertainment, Cruise/Wagner Productions |
Distributed by | 20th Century Fox (USA), DreamWorks (non-USA) |
Release date(s) | June 21, 2002 |
Running time | 139 min. |
Country | United States |
Language | English |
Budget | $102 million |
Gross revenue | Domestic: $132,072,926; Foreign: $226,300,000; Worldwide: $358,372,926 |
Minority Report is a 2002 science fiction film directed by Steven Spielberg and loosely based on the short story "The Minority Report" by Philip K. Dick. It is set primarily in Washington, D.C. and Northern Virginia in the year 2054, where "Precrime", a specialized police department, apprehends criminals based on foreknowledge provided by three psychics called "precogs". The cast includes Tom Cruise as Precrime officer John Anderton, Colin Farrell as Department of Justice agent Danny Witwer, Samantha Morton as the senior precog Agatha, and Max von Sydow as Anderton's superior Lamar Burgess. The film has a distinctive look, featuring high contrast for dark colors and shadows, resembling film noir.
Minority Report was one of the best reviewed films of 2002,[1] and was nominated for and won several awards.[2] These included an Academy Award nomination for Best Sound Editing, and four Saturn Awards, including Best Science Fiction Film and Best Direction. Produced on a budget of $102 million, the film was also a commercial success, earning more than three times that in worldwide box office returns and selling four million DVDs in its first few months of release.[3][4]
In 2054, John Anderton (Tom Cruise) is a member of an experimental Washington, D.C. police force known as Precrime, which uses future visions generated by three "precogs", mutated humans with precognitive abilities, to stop murders. Visions from their minds are displayed on screens for Anderton and Precrime to view; while the precogs are able to provide the names of the victims and perpetrators and the time of the murders, other facts (chiefly the location) must be deduced by sorting through the images manually. The beginning of the film shows Anderton successfully stopping a man from murdering his adulterous wife and her lover; because the Precrime project is public knowledge, crimes are rarely premeditated, and the majority of murders are now crimes of passion decided on the spot (the precogs tend to detect these late; in this case, Anderton had only about 30 minutes to figure out the location, travel there, and apprehend the criminal). Due to the unit's actions, D.C. has been essentially murder-free for six years. Though chief of the force, Anderton has been addicted to an illegal psychoactive drug since the disappearance of his son Sean (Dominic Scott Kay), which also caused his wife Lara (Kathryn Morris) to leave him.

With the Precrime concept poised to go nationwide, the system is audited by Danny Witwer (Colin Farrell), a member of the Department of Justice. Witwer doubts the legality of the system, pointing out that stopping future crimes essentially changes the future and creates a paradox. During the audit, the precogs predict that Anderton will murder a man named Leo Crow (Mike Binder) in 36 hours. Believing the incident to be a setup by Witwer, who is aware of Anderton's addiction, and knowing that he has never even met Crow, Anderton attempts to hide the case and quickly flees before Witwer begins a manhunt for him. Anderton seeks the advice of Dr. Iris Hineman (Lois Smith), the lead researcher of the Precrime technology. She explains that the three precogs—the children of drug addicts who used experimental drugs years ago—may see different visions of the future. When this happens, the system only provides data on the two reports which agree; a "minority report", showing a future in which the perpetrator may not actually commit the murder, is discarded. According to Dr. Hineman, the female precog Agatha (Samantha Morton) is likely the one who witnesses the minority reports.
Anderton has his eyes surgically replaced to avoid iris-recognition scanners before traveling back to Precrime and kidnapping Agatha. Agatha's kidnapping prevents the precogs' hive mind from working and shuts down the system. Anderton takes Agatha to a hacker, who is able to extract both Agatha's vision of Crow's murder—which reveals that John's case lacks a minority report—and another vision depicting the murder of a woman named Anne Lively (Jessica Harper), which Agatha had also shown to Anderton the day before he was incriminated. While trying to outrun the police, Anderton and Agatha end up at the apartment building where Crow is to be killed. Anderton breaks into Crow's room and finds hundreds of pictures of children, including his son, on Crow's bed, leading him to conclude that Crow is the man responsible for Sean's disappearance. When Crow arrives, Anderton holds him at gunpoint but ultimately decides to control his anger and place Crow under arrest. Crow admits that he was hired to plant the photos and be killed so that his family would be paid handsomely; realizing Anderton will no longer kill him, Crow grabs the officer's hand, making him fire at point-blank range, and dies. After assessing Crow's "murder", Witwer doubts that Anderton killed in cold blood and approaches the Precrime division's director, Lamar Burgess (Max von Sydow), at Anderton's apartment. Witwer, who has also discovered Agatha's recording of the Anne Lively murder, points out that it differs slightly from the original recording and concludes that someone must have manipulated the system to fake the murder. Witwer deduces that it would have to be someone high up in Precrime with access to the precog visions, whereupon Burgess kills Witwer with Anderton's gun. Because Agatha is with Anderton, Precrime is unable to detect the murder.
Anderton approaches his estranged wife Lara for refuge, and realizes that his knowledge of the Lively case is why he is being targeted: Lively was Agatha's mother, and shortly after requesting to see her daughter again she disappeared, even though an earlier attempt on her life had failed when the precogs predicted it. The Precrime unit eventually captures Anderton and restores Agatha to the system. Burgess attempts to comfort Lara, but accidentally reveals that he knows more about Lively's death than he has let on. Lara uses this information to free Anderton.
At a banquet celebrating the success of the Precrime unit and Burgess, Anderton plays Agatha's vision of the Lively murder for the gathered crowd, clearly showing Burgess as the murderer. Anderton explains that Burgess had hired a drifter to kill Lively, only for the killing to be prevented by Precrime. Having viewed the precog vision, Burgess then killed Lively in exactly the same way as in the vision. Because precogs sometimes experience relapses of past murders, or "echoes", Precrime had dismissed this new murder as an echo (Witwer had realized the two visions were separate murders after noticing the water lapping in opposite directions). As Burgess sneaks off to confront Anderton and silence him about the Lively murder, a new Precrime report is created: Anderton is the victim and Burgess is the murderer. When Burgess finds Anderton, Anderton tells him it is over and presents him with a no-win situation: if Burgess kills Anderton, he proves the system works, but at the cost of his own incarceration; if he does not, the system will have failed and the Precrime division will be shut down. Anderton points out the fundamental flaw of the system: if one knows one's own future, one can change it. Burgess resolves the paradox by killing himself. Anderton and Lara get back together and try to have another baby. The Precrime program is shut down, all those jailed as a result of Precrime are paroled and released, and the precogs are given the chance to lead full lives. The film ends with the precogs living together in a cabin on a remote island, far from anyone who would trouble them with visions.
"We don't choose the things we believe in; they choose us."
—Lamar Burgess
The main themes of Minority Report are the classic philosophical questions surrounding foreknowledge and free will vs. determinism.[29][30] One of the main questions the film raises is whether the future is set or whether free will can alter it.[31] As critic C.A. Wolski commented, "At the outset, Minority Report... promises to mine some deep subject matter, to do with: do we possess free will or are we predestined to our fate?"[29] There is also the added question of whether the precogs' visions are correct.[31] As reviewer James Berardinelli asked, "is the Precogs' vision accurate, or has it in some way been tampered with? Perhaps Anderton isn't actually going to kill, but has been set up by a clever and knowledgeable criminal who wants him out of the way."[31] The precog Agatha also states that since Anderton knows his future, he can change it. However, the film also indicates that Anderton's knowledge of the future may actually be the factor that causes Leo Crow's death. Berardinelli describes this as the main paradox regarding free will vs. determinism in the film:[31] "[h]ere's the biggest one of all: Is it possible that the act of accusing someone of a murder could begin a chain of events that leads to the slaying. In Anderton's situation, he runs because he is accused. The only reason he ends up in circumstances where he might be forced to kill is because he is a hunted man. Take away the accusation, and there would be no question of him committing a criminal act. The prediction drives the act – a self-fulfilling prophecy. You can see the vicious circle, and it's delicious (if a little maddening) to ponder."[31] This paradox of choice also presents a personal dilemma: if Anderton chooses not to kill Crow, Precrime is thrown into doubt, but if he chooses to kill Crow, he proves that the system works, at the cost of his own freedom. Spielberg also noted that the film's treatment of diminished free will had a real-world background, saying, "We're giving up some of our freedom so that the government can protect us."[32] Most critics gave this element of the film positive reviews,[33] with many ranking it as the film's main strength.[30][31][34] Other reviewers, however, felt that Spielberg did not adequately deal with the issues he raised.[29][35]
Minority Report is a futuristic film that portrays elements of both a dystopian and a utopian future. It renders a much more detailed view of a near-term future world than the original short story, depicting a number of technologies related to the film's themes.[36] The scene in which Anderton dreams about his son's kidnapping at the pool is shot in "normal" color.
From a stylistic standpoint, Minority Report resembles Spielberg's previous film A.I.[26] The picture was deliberately overlit, and the negative was bleach-bypassed during post-production.[37] This gave the film a distinctive look, with desaturated colors yet high-contrast blacks and shadows, almost like a film noir picture.[37] Elvis Mitchell, formerly of The New York Times, commented that "[t]he picture looks as if it were shot on chrome, caught on the fleeing bumper of a late '70s car."[38]
Total Recall | |
---|---|
Theatrical release poster | |
Directed by | Paul Verhoeven |
Produced by | Mario Kassar, Andrew G. Vajna |
Written by | Short story: Philip K. Dick; Screenplay: Ronald Shusett, Dan O'Bannon, Jon Povill, Gary Goldman |
Starring | Arnold Schwarzenegger, Rachel Ticotin, Sharon Stone, Michael Ironside, Ronny Cox |
Music by | Jerry Goldsmith |
Cinematography | Jost Vacano |
Editing by | Carlos Puente, Frank J. Urioste |
Distributed by | TriStar Pictures, Guild |
Release date(s) | June 1, 1990 |
Running time | 113 min. |
Country | United States |
Language | English |
Budget | $65 million |
Gross revenue | $261,299,840[1] |
Total Recall is a 1990 American science fiction action film starring Arnold Schwarzenegger and Sharon Stone, based on the Philip K. Dick story "We Can Remember It for You Wholesale". The film was directed by Paul Verhoeven and written by Ronald Shusett, Dan O'Bannon, Jon Povill, and Gary Goldman. It won a Special Achievement Academy Award for its visual effects, and Jerry Goldsmith's soundtrack won the BMI Film Music Award.
The story is set in the year 2084. Douglas Quaid (Schwarzenegger) is a construction worker who has been experiencing dreams about exploring the planet Mars with a brunette woman. After seeing an ad from Rekall, a company that sells imaginary adventures by implanting false memories, he decides to buy a "vacation" on Mars: a trip in which he will take a vacation from himself by becoming a spy in a clichéd, "James Bond in space" scenario promising lurid entertainment and thrills. Before buying the vacation, Quaid is cautioned by a co-worker that Rekall is risky and that failed memory implants can cause recipients to suffer permanent brain damage. Quaid hesitates, but disregards the warning. After the procedure starts, Quaid has a violent outburst and tries to break free, yelling incoherently. At first it seems as though he is merely acting out the "spy" portion of the memory implant, but the doctors insist they have not implanted the memories yet, and realize that the memories are already there: someone else had previously erased his memory.
After being subdued, Quaid is returned home with no memory of ever going to Rekall, but his old friends try to kill him. His wife Lori (Stone) also attacks him; after he overpowers her, she tells him that everything he remembers, including their marriage, is a series of false memories implanted less than two months before. While evading assailants, Quaid receives a phone call from a man claiming to be a former colleague from his agency days on Mars, who had been asked to deliver a briefcase if Quaid ever disappeared. The briefcase contains fake IDs, money, weapons, and a video player with a message Quaid left for himself. The "Quaid" on the video identifies himself as Hauser, a secret agent who had worked for Mars administrator Vilos Cohaagen (Cox). Pursued by Richter (Ironside), a man working for Cohaagen, Quaid travels to Mars to discover the truth. On Mars, Quaid finds out that Cohaagen rules an airtight city through his monopoly on air production, and that the poor workers in the city's slums have been mutated by cosmic rays, which the thin atmosphere of Mars cannot block. He makes allies, including a cabbie named Benny and the woman from his dreams, Melina (Rachel Ticotin), who works as a prostitute on Mars. Quaid is confronted by Lori and Dr. Edgemar from Rekall, who attempt to convince him that he is trapped in a hallucination brought on by a faulty memory implant at Rekall. Quaid notices how nervous Edgemar is and shoots him. Suddenly a group of hitmen storm Quaid's room, knock him out, and capture him. Melina arrives shortly after and kills the hitmen but is disarmed by Lori. The two women engage in a fistfight, which Lori wins by knocking Melina out. Lori is about to slit Melina's throat when Quaid, who has come to, shoots her in the head.
Melina and Quaid flee and eventually meet resistance leader Kuato, who is revealed to be a mutant growing out of his own brother's abdomen. With Kuato's psychic help, Quaid sees a mysterious alien machine in the Martian mines, but Cohaagen's forces then storm the resistance hideout and kill Kuato. Quaid and Melina are betrayed by the mutant Benny and captured, and Cohaagen reveals that Hauser willingly had his mind wiped in order to gain Kuato's trust, the only way the mutant psychic could be fooled. The whole incident, with the exception of Richter's maniacal pursuit of Quaid and Quaid's activation at Rekall, was planned, and Cohaagen provides another video from Quaid's alter ego, Hauser, who congratulates Quaid on his performance and confirms Cohaagen's words. Cohaagen decides to eliminate the rebels by cutting off the air supply to their section of the city, and orders that Quaid's mind be restored to Hauser's and that Melina's mind be altered to make her subservient to Quaid/Hauser. However, Quaid and Melina escape and hurry to the alien machine, killing Benny and Richter on the way.
As they approach the machine, Quaid tells Melina that the device is a giant reactor meant to melt the frozen core of Mars, releasing oxygen from the ice and giving the planet an atmosphere. If Mars had oxygen, Cohaagen would lose his monopoly, which is why he has kept the machine's existence secret. As Quaid tries to activate the machine, Cohaagen arrives but is shot by Melina. He then attempts to set off a bomb by remote, but Quaid throws the bomb away, and the blast blows out the windows of the city. The vacuum draws Cohaagen onto the surface of Mars, where he dies a painful death of asphyxiation and decompression. Quaid and Melina nearly suffer the same fate, echoing the dream Quaid had in the opening scene of the movie, but the alien machine creates a breathable atmosphere quickly enough to save them and the mutants just in time, and a blue sky forms over Mars. Quaid wonders whether the whole thing has been real or whether he is dreaming. He and Melina kiss as a bright flash of white light illuminates the screen like an opening eye, and the credits roll.
The film explores the question of reality versus delusion, a recurrent topic in Philip K. Dick's works. The plot calls for the lead character and the audience to question whether the character's experience is real or is being fed directly to his mind. There are several visual and informational clues pointing in both directions. Verhoeven plays up the intentional ambiguity to the very end, and no definitive answer is ever given. However, the first cue on the film's soundtrack is titled "The Dream" and the last "End of a Dream". On the DVD commentary, Verhoeven and Schwarzenegger come to opposite conclusions about how real the post-Rekall events of the film actually were. The viewer is thus left wondering whether the events actually happened, whether the entire story is simply the memory purchased at Rekall gone terribly awry, or whether Rekall had in fact simply delivered on its original promise of "action" and "adventure." This theme has since been revisited in similarly themed films such as The Matrix, eXistenZ, The Thirteenth Floor, and Vanilla Sky.
A consistent motif throughout the film is the presentation of striking opposites: Earth/Mars; Quaid/Hauser; the mutants Kuato and his brother George; the use of holographic doubles by Quaid and Melina; reflections of Quaid, Lori and Dr. Edgemar in mirrors in Quaid's hotel room; Melina/Lori. The latter example subverts a standard film noir convention, the saintly blonde versus the devilish brunette; in Total Recall, the blonde turns out to be the villain and the brunette the heroine.[7]
Aliens | |
---|---|
Theatrical release poster | |
Directed by | James Cameron |
Produced by | Gale Anne Hurd, Gordon Carroll, David Giler, Walter Hill |
Written by | Story: James Cameron, David Giler, Walter Hill; Screenplay: James Cameron |
Starring | Sigourney Weaver, Carrie Henn, Michael Biehn, Lance Henriksen, William Hope, Paul Reiser |
Music by | James Horner |
Cinematography | Adrian Biddle |
Editing by | Ray Lovejoy |
Distributed by | 20th Century Fox |
Release date(s) | July 18, 1986 |
Running time | 137 minutes |
Country | United States, United Kingdom |
Language | English |
Budget | $18,500,000 |
Gross revenue | $131,060,248[1] |
Preceded by | Alien |
Followed by | Alien 3 |
Aliens is a 1986 science fiction action film directed by James Cameron and starring Sigourney Weaver, Carrie Henn, Michael Biehn, Lance Henriksen, and Bill Paxton. A sequel to the 1979 film Alien, Aliens is set fifty-seven years after the first film and is regarded by many film critics as a benchmark for the action and science fiction genres.[2][3] In Aliens, Weaver's character Ellen Ripley returns to the planetoid LV-426 where she first encountered the hostile Alien. This time she is accompanied by a unit of Colonial Marines.
Aliens' action-adventure tone contrasted with the horror motifs of the original Alien. Following the success of The Terminator (1984), which helped establish Cameron as a major action director,[4] 20th Century Fox greenlit Aliens with a budget of approximately $18 million. It was filmed in England at Pinewood Studios and at a decommissioned power plant.
Aliens earned $86 million at the United States box office during its 1986 theatrical release and $131 million worldwide.[5] The movie was nominated for seven Academy Awards, including a Best Actress nomination for Sigourney Weaver. It won in the categories of Sound Effects Editing and Visual Effects.
Ellen Ripley (Sigourney Weaver), the only survivor of the space freighter Nostromo, is rescued and revived after drifting for fifty-seven years in hypersleep. At an interview before a panel of executives from her employer, the Weyland-Yutani Corporation, her testimony regarding the Alien is met with extreme skepticism as no physical evidence of the creature survived the destruction of the Nostromo. Ripley loses her space flight license as a result of her "questionable judgment" and learns that LV-426, the planetoid where her crew first encountered the Alien eggs, is now home to a terraforming colony. Some time later, Ripley is visited by Weyland-Yutani representative Carter Burke (Paul Reiser) and Lieutenant Gorman (William Hope) of the Colonial Marines, who inform her that contact has been lost with the colony on LV-426. The company is dispatching Burke and a unit of marines to investigate, and offers to restore Ripley's flight status and pick up her contract if she will accompany them as a consultant. Traumatized by her previous encounter with the Alien, Ripley initially refuses to join, but accepts when she realizes that the mission will allow her to face her fears. Aboard the warship Sulaco she is introduced to the Colonial Marines, including Sergeant Apone (Al Matthews), Corporal Hicks (Michael Biehn), Privates Vasquez (Jenette Goldstein) and Hudson (Bill Paxton), and the android Bishop (Lance Henriksen), toward whom Ripley is initially hostile due to her previous experience with the android Ash aboard the Nostromo.
The heavily-armed expedition descends to the surface of LV-426 via dropship, where they find the colony seemingly abandoned. Two living Alien facehuggers are found in containment tanks in the medical lab, and the only colonist found is a traumatized young girl nicknamed Newt (Carrie Henn). The marines determine that the colonists are clustered in the nuclear-powered atmosphere processing station. There they find a large Alien nest filled with the cocooned colonists. The Aliens attack and kill most of the unit, but Ripley rescues Hicks, Vasquez, and Hudson. With Gorman knocked unconscious during the rescue, Hicks assumes command and orders the dropship to recover the survivors, intending to return to the Sulaco and destroy the colony from orbit. A stowaway Alien kills the dropship pilots in flight, causing the vessel to crash into the processing station. The surviving humans barricade themselves inside the colony complex.
Ripley discovers that it was Burke who ordered the colonists to investigate the derelict spaceship where the Nostromo crew first encountered the Alien eggs, and that he hopes to return Alien specimens to the company laboratories where he can profit from their use as biological weapons. She threatens to expose him, but Bishop soon informs the group of a greater threat: the damaged processing station has become unstable and will soon detonate with the force of a thermonuclear weapon. He volunteers to use the colony's transmitter to pilot the Sulaco's remaining dropship to the surface by remote control so that the group can escape. Ripley and Newt fall asleep in the medical laboratory, awakening to find themselves locked in the room with the two facehuggers, which have been released from their tanks. Ripley is able to alert the marines, who rescue them and kill the creatures. Ripley accuses Burke of attempting to smuggle implanted Alien embryos past Earth's quarantine inside her and Newt, and of planning to kill the rest of the marines in hypersleep during the return trip. The electricity is suddenly cut off and the Aliens attack through the ceiling. Hudson, Burke, Gorman, and Vasquez are killed and Newt is captured by the Aliens.
Ripley and an injured Hicks reach Bishop and the second dropship, but Ripley is unwilling to leave Newt behind. She rescues Newt from the hive in the processing station, where the two encounter the Alien queen and her egg chamber. Ripley destroys most of the eggs, enraging the queen, who escapes by tearing free from her ovipositor. Closely pursued by the queen, Ripley and Newt rendezvous with Bishop and Hicks on the dropship and escape moments before the colony is consumed by the nuclear blast. Back on the Sulaco, Ripley and Bishop's relief at their narrow escape is interrupted when the Alien queen, stowed away on the dropship's landing gear, tears Bishop in half. Ripley battles the queen using an exosuit cargo-loader. The two of them tumble into a large airlock, which Ripley then opens, expelling the queen into space. Ripley clambers to safety and she, Newt, Hicks, and the still-functioning Bishop enter hypersleep for the return to Earth.
Philosopher Stephen Mulhall has remarked that the four Alien films represent an artistic rendering of the difficulties the woman's "voice" faces in making itself heard in a masculinist society, as Ripley continually encounters males who try to silence her and force her to submit to their desires. Mulhall sees this depicted in several places in Aliens, particularly the inquest scene in which Ripley's explanation for the deaths and the destruction of the Nostromo, as well as her attempts to warn the board members of the alien danger, are met with officious disdain. However, Mulhall believes that Ripley's relationship with Hicks illustrates that Aliens "is devoted ... to the possibility of modes of masculinity that seek not to stifle but rather to accommodate the female voice, and modes of femininity that can acknowledge and incorporate something more or other of masculinity than our worst nightmares of it."[39]
Several movie academics, including Barbara Creed, have remarked on the color and lighting symbolism in the Alien franchise, which offsets white, strongly lit environments (spaceships, corporate offices) against darker, dirtier, 'corrupted' settings (derelict alien ship, abandoned industrial facilities). These black touches contrast or even attempt to take over the purity of the white elements.[40] Others, such as Kile M. Ortigo of Emory University, agree with this interpretation and point to the Sulaco with its "sterilized, white interior" as representing this element in the second film of the franchise.[41]
Academics analyzing the role of the Ripley character remark on the symbolism of the Sulaco's cryo chamber. Ripley is compared with an incorrupt Catholic saint preserved in a glass coffin (akin to Saint Bernadette of Lourdes, both in her lying in state in the cryotube as well as her incorrupt body, which has twice survived being almost "impregnated" by the Alien). Accompanied by the Agnus Dei of the Ordinary Mass playing in the background of the opening scene, these scholars argue that the Sulaco is transformed "into a holy site where the iconic bodies of a fetishistic religion lie in state," setting the scene for a lone facehugger attacking its victim (corrupting it) and also causing the emergency system to eject the cryotubes into space and to plunge to Fiorina "Fury" 161 (representing the Fall of Man).[42]
While some claim that the shape of the Sulaco was based on a submarine,[43] the design has most often been described as a 'gun in space' resembling the rifles used in the movie.[44] Author Roz Kaveney called the opening shot of the ship traveling through space 'fetishistic' and 'shark-like', "an image of brutal strength and ingenious efficiency"—while the militarized interior of the Sulaco (designed by Ron Cobb) is contrasted with the organic interior of the Nostromo in the first movie (also designed by Cobb).[45] David McIntee noted the homage the scene pays to the opening tour through the Nostromo in Alien.[46]
The android character Bishop has been the subject of literary and philosophical analysis as a high-profile fictional android conforming to science fiction author Isaac Asimov's Three Laws of Robotics, and as a model of a compliant, potentially self-aware machine.[47] His portrayal has been studied by writers for the University of Texas Press for its implications relating to how humans deal with the presence of an "Other,"[48] as Ripley treats androids with fear and suspicion and a form of "hi-tech racism and android apartheid" is present throughout the series.[49] This is seen as part of a larger trend of technophobia in films prior to the 1990s, with Bishop's role being particularly significant as he proves his worth at the end of the film, thus confounding Ripley's expectations.[50]
Jean-François Lyotard | |
---|---|
Jean-François Lyotard, photo by Bracha L. Ettinger, 1995 | |
Full name | Jean-François Lyotard |
Born | 10 August 1924, Versailles, France |
Died | 21 April 1998, Paris, France |
Era | 20th-century philosophy |
Region | Western Philosophy |
School | Postmodernism |
Main interests | Metanarrative |
Notable ideas | The "postmodern condition"; collapse of the "grand narrative" |
Jean-François Lyotard (French pronunciation: [ʒɑ̃ fʀɑ̃swa ljɔˈtaʀ]; 10 August 1924 – 21 April 1998) was a French philosopher and literary theorist. He is well-known for his articulation of postmodernism after the late 1970s and the analysis of the impact of postmodernity on the human condition.
Lyotard was born in 1924 in Versailles, France, to Jean-Pierre Lyotard, a sales representative, and Madeleine Cavalli. He studied at the Lycées Buffon and Louis-le-Grand in Paris and later began studying philosophy at the Sorbonne. After graduation, in 1950, he took up a position teaching philosophy in Constantine, in French Algeria. Lyotard earned a Ph.D. in literature. He married twice: in 1948 to Andrée May, with whom he had two daughters, and for a second time in 1993 to the mother of his son, who was born in 1986.
In 1954 Lyotard became a member of Socialisme ou Barbarie, a French political organisation formed in 1948 in response to the inadequacy of Trotskyist analysis to explain the new forms of domination in the Soviet Union. His writings in this period are mostly concerned with ultra-left politics, with a focus on the Algerian situation, which he had witnessed first-hand while teaching philosophy in Constantine.[1] Socialisme ou Barbarie became increasingly anti-Marxist, and Lyotard was prominent in Pouvoir Ouvrier, a group that rejected this turn and split away in 1963.[2]
In the early 1970s Lyotard began teaching at the University of Paris VIII (Vincennes), where he remained until 1987, when he became Professor Emeritus. During the next two decades he lectured outside France, notably as Professor of Critical Theory at the University of California, Irvine, and as a visiting professor at universities around the world, including Johns Hopkins, Berkeley, Yale and the University of California, San Diego in the U.S., the Université de Montréal in Québec (Canada), and the University of São Paulo in Brazil. He was also a founding director and council member of the Collège International de Philosophie in Paris. Before his death, he split his time between Paris and Atlanta, where he taught at Emory University as Woodruff Professor of Philosophy and French.
Lyotard repeatedly returned to the notion of the postmodern in essays gathered in English as The Postmodern Explained to Children, Toward the Postmodern, and Postmodern Fables. In 1998, while preparing for a conference on Postmodernism and Media Theory, he died unexpectedly of a rapidly advancing leukemia. He is buried in Père Lachaise Cemetery in Paris.
Lyotard's work is characterised by a persistent opposition to universals, meta-narratives, and generality. He is fiercely critical of many of the 'universalist' claims of the Enlightenment, and several of his works serve to undermine the fundamental principles that generate these broad claims.
In his writings of the early 1970s, he rejects what he regards as theological underpinnings of both Marx and Freud: "In Freud, it is judaical, critical sombre (forgetful of the political); in Marx it is catholic. Hegelian, reconciliatory (...) in the one and in the other the relationship of the economic with meaning is blocked in the category of representation (...) Here a politics, there a therapeutics, in both cases a laical theology, on top of the arbitrariness and the roaming of forces".[3] Consequently he rejected Adorno's negative dialectics which he regarded as seeking a "therapeutic resolution in the framework of a religion, here the religion of history".[4] In Lyotard's "libidinal economics" (the title of one of his books of that time), he aimed at "discovering and describing different social modes of investment of libidinal intensities".[5]
Most famously, in La Condition postmoderne: Rapport sur le savoir (The Postmodern Condition: A Report on Knowledge) (1979), he proposes what he calls an extreme simplification of the "postmodern" as an 'incredulity towards meta-narratives'.[6] These meta-narratives – sometimes called 'grand narratives' – are large-scale theories and philosophies of the world, such as the progress of history, the knowability of everything by science, and the possibility of absolute freedom. Lyotard argues that we have ceased to believe that narratives of this kind are adequate to represent and contain us all. We have become alert to difference, diversity, and the incompatibility of our aspirations, beliefs and desires, and for that reason postmodernity is characterised by an abundance of micronarratives. For this concept Lyotard draws on the notion of 'language-games' found in the work of Wittgenstein.
In Lyotard's works, the term 'language games', sometimes also called 'phrase regimens', denotes the multiplicity of communities of meaning, the innumerable and incommensurable separate systems in which meanings are produced and rules for their circulation are created.
This becomes more crucial in Au juste: Conversations (Just Gaming) (1979) and Le Différend (The Differend) (1983), which develop a postmodern theory of justice. It might appear that the atomisation of human beings implied by the notion of the micronarrative and the language game suggests a collapse of ethics. It has often been thought that universality is a condition for something to be a properly ethical statement: 'thou shalt not steal' is an ethical statement in a way that 'thou shalt not steal from Margaret' is not. The latter is too particular to be an ethical statement (what's so special about Margaret?); it is only ethical if it rests on a universal statement ('thou shalt not steal from anyone'). But universals are impermissible in a world that has lost faith in metanarratives, and so it would seem that ethics is impossible. Justice and injustice can only be terms within language games, and the universality of ethics is out of the window. Lyotard argues that notions of justice and injustice do in fact remain in postmodernism. The new definition of injustice is indeed to use the language rules from one 'phrase regimen' and apply them to another. Ethical behaviour is about remaining alert precisely to the threat of this injustice, about paying attention to things in their particularity and not enclosing them within abstract conceptuality. One must bear witness to the 'differend.'
"I would like to call a differend the case where the plantiff is divested of the means to argue and becomes for that reason a victim. If the addressor, the addressee, and the sense of the testimony are neutralized, everything takes place as if there were no damages. A case of differend between two parties takes place when the regulation of the conflict that opposes them is done in the idiom of one of the parties while the wrong suffered by the other is not signified in that idiom." [7]
Lyotard was a frequent writer on aesthetic matters. He was, despite his reputation as a postmodernist, a great promoter of modernist art. Lyotard saw 'postmodernism' as a latent tendency within thought throughout time and not a narrowly limited historical period. He favoured the startling and perplexing works of the high modernist avant-garde. In them he found a demonstration of the limits of our conceptuality, a valuable lesson for anyone too imbued with Enlightenment confidence. Lyotard also wrote extensively on a few contemporary artists of his choice: Valerio Adami, Daniel Buren, Marcel Duchamp, Bracha Ettinger and Barnett Newman, as well as on Paul Cézanne and Wassily Kandinsky.
He developed these themes in particular by discussing the sublime. The "sublime" is a term in aesthetics whose fortunes revived under postmodernism after a century or more of neglect. It refers to the experience of pleasurable anxiety that we experience when confronting wild and threatening sights like, for example, a massive craggy mountain, black against the sky, looming terrifyingly in our vision.
Lyotard found particularly interesting the explanation of the sublime offered by Immanuel Kant in his Critique of Judgment (sometimes Critique of the Power of Judgment). In this book Kant explains this mixture of anxiety and pleasure in the following terms: there are two kinds of 'sublime' experience. In the 'mathematically' sublime, an object strikes the mind in such a way that we find ourselves unable to take it in as a whole. More precisely, we experience a clash between our reason (which tells us that all objects are finite) and the imagination (the aspect of the mind that organises what we see, and which sees an object incalculably larger than ourselves, and feels infinite). In the 'dynamically' sublime, the mind recoils at an object so immeasurably more powerful than we, whose weight, force, scale could crush us without the remotest hope of our being able to resist it. (Kant stresses that if we are in actual danger, our feeling of anxiety is very different from that of a sublime feeling. The sublime is an aesthetic experience, not a practical feeling of personal danger.) This explains the feeling of anxiety.
The feeling of pleasure comes when human reason asserts itself[citation needed]. What is deeply unsettling about the mathematically sublime is that the mental faculties that present visual perceptions to the mind are inadequate to the concept corresponding to it; in other words, what we are able to make ourselves see cannot fully match up to what we know is there. We know it's a mountain but we cannot take the whole thing into our perception. What this does, ironically, is to compel our awareness of the supremacy of the human reason[citation needed]. Our sensibility is incapable of coping with such sights, but our reason can assert the finitude of the presentation. With the dynamically sublime, our sense of physical danger should prompt an awareness that we are not just physical material beings, but moral and (in Kant's terms) noumenal beings as well. The body may be dwarfed by its power but our reason need not be. This explains, in both cases, why the sublime is an experience of pleasure as well as pain.
Lyotard is fascinated by this admission, from one of the philosophical architects of the Enlightenment, that the mind cannot always organise the world rationally. Some objects are simply incapable of being brought neatly under concepts. For Lyotard, in Lessons on the Analytic of the Sublime, but drawing on his argument in The Differend, this is a good thing. Such generalities as 'concepts' fail to pay proper attention to the particularity of things. What happens in the sublime is a crisis where we realise the inadequacy of the imagination and reason to each other. What we are witnessing, says Lyotard, is actually the differend; the straining of the mind at the edges of itself and at the edges of its conceptuality.
Some argue that Lyotard's theories may seem self-contradictory because The Postmodern Condition seems to offer its own grand narrative in the story of the decline of the metanarrative. Against this it can be argued that Lyotard's narrative in The Postmodern Condition declares the decline of only a few defunct "narratives of legitimation" and not of narrative knowledge itself. It is not logically contradictory to say that a statement about narratives is itself a narrative, just as when Lyotard states that "every utterance [in a language game] should be thought of as a 'move' in a game"[8] his statement is itself a 'move' in a language game.
See also the critical analysis of David Harvey in his book 'The Condition of Postmodernity' (Blackwell, 1989). Harvey's materialistic perspective finds traits of postmodernity to be rooted in the large-scale shifts from Fordist to flexible accumulation through a period of pronounced 'time-space compression' taking place in conjunction with the technological advances happening roughly around the 1970s. Far from being liberating, postmodernity draws us into ever more chaotic and disruptive spirals of accumulation that are ultimately as damaging as the Enlightenment project.
In critical theory, and particularly postmodernism, a metanarrative (from meta-narrative, sometimes also known as a master- or grand narrative) is an abstract idea that is thought to be a comprehensive explanation of historical experience or knowledge. According to John Stephens it "is a global or totalizing cultural narrative schema which orders and explains knowledge and experience".[1] The prefix meta means "beyond" and is here used to mean "about", and a narrative is a story. Therefore, a metanarrative is a story about a story, encompassing and explaining other 'little stories' within totalizing schemes.
On Lyotard's account there is ultimately only one metanarrative: the story of progress through universal human reason, of Logos triumphing over Mythos, by which modern philosophers seek to legitimate knowledge. The problem is that once proof is accepted as the standard of believability, we must not only prove our claims but also prove our proofs, and so on, ad infinitum. This is what Lyotard was referring to when he claimed that the postmodern condition is one of incredulity toward metanarratives.
The concept was criticized by Jean-François Lyotard in his work, The Postmodern Condition: A Report on Knowledge (1979). In this text, Lyotard refers to what he describes as the postmodern condition, which he characterized as increasing skepticism toward the totalizing nature of "metanarratives" (or "grand narratives," typically characterised by some form of 'transcendent and universal truth'):
"Simplifying to the extreme, I define postmodern as incredulity toward metanarratives. This incredulity is undoubtedly a product of progress in the sciences: but that progress in turn presupposes it. To the obsolescence of the metanarrative apparatus of legitimation corresponds, most notably, the crisis of metaphysical philosophy and of the university institution which in the past relied on it. The narrative function is losing its functors, its great hero, its great dangers, its great voyages, its great goal. It is being dispersed in clouds of narrative language elements--narrative, but also denotative, prescriptive, descriptive, and so on [...] Where, after the metanarratives, can legitimacy reside?" - Jean-Francois Lyotard[2]
Lyotard and many other poststructuralist thinkers have viewed this as a positive development for a number of reasons. First, attempts to construct grand theories tend to dismiss the naturally existing chaos and disorder of the universe. Second, metanarratives are created and reinforced by power structures and are therefore not to be trusted. 'Metanarratives' ignore the heterogeneity or variety of human existence. They are also seen to embody unacceptable views of historical development, in terms of progress towards a specific goal. The latent diverse passions of human beings will always make it impossible for them to be marshalled under some theoretical doctrine and this is one of the reasons given for the collapse of the Soviet Union in the early 1990s.
According to the advocates of postmodernism, metanarratives have lost their power to convince – they are, literally, stories that are told in order to legitimise various versions of "the truth". With the transition from modern to postmodern, Lyotard proposes that metanarratives should give way to 'petits récits', or more modest and "localised" narratives.[citation needed] Borrowing from the works of Wittgenstein and his theory of the "models of discourse",[3] Lyotard constructs his vision of a progressive politics that is grounded in the cohabitation of a whole range of diverse and always locally legitimated language games. Postmodernists attempt to replace metanarratives by focusing on specific local contexts as well as the diversity of human experience. They argue for the existence of a "multiplicity of theoretical standpoints"[4] rather than grand, all-encompassing theories.
Lyotard's analysis of the postmodern condition has been criticized as being internally inconsistent. For example, thinkers like Alex Callinicos[5] and Jürgen Habermas[6] argue that Lyotard's description of the postmodern world as containing an "incredulity toward metanarratives" could be seen as a metanarrative in itself. According to this view, post-structuralist thinkers like Lyotard criticise universal rules but postulate that postmodernity contains a universal skepticism toward metanarratives; and this 'universal skepticism' is in itself a contemporary metanarrative. On this reading, postmodernism amounts to a neo-romanticist metanarrative of its own, building up a 'meta' critique, a 'meta' discourse and a 'meta' belief according to which Western science is merely taxonomic, empiricist and utilitarian, claiming a supposed sovereignty for its own reason while pretending to be neutral, rigorous and universal; this is itself another 'meta' story, contradicting the postmodern critique of the metanarrative.[citation needed]
Intertextuality is the shaping of texts' meanings by other texts. It can refer to an author’s borrowing and transformation of a prior text or to a reader’s referencing of one text in reading another. The term “intertextuality” has, itself, been borrowed and transformed many times since it was coined by poststructuralist Julia Kristeva in 1966. As critic William Irwin says, the term “has come to have almost as many meanings as users, from those faithful to Kristeva’s original vision to those who simply use it as a stylish way of talking about allusion and influence” (Irwin, 228).
Kristeva’s coinage of “intertextuality” represents an attempt to synthesize Ferdinand de Saussure’s structuralist semiotics—his study of how signs derive their meaning within the structure of a text—with Bakhtin’s dialogism—his examination of the multiple meanings, or “heteroglossia,” in each text (especially novels) and word (Irwin, 228). For Kristeva (69), “the notion of intertextuality replaces the notion of intersubjectivity” when we realize that meaning is not transferred directly from writer to reader but instead is mediated through, or filtered by, “codes” imparted to the writer and reader by other texts. For example, when we read Joyce’s Ulysses we decode it as a modernist literary experiment, or as a response to the epic tradition, or as part of some other conversation, or as part of all of these conversations at once. This intertextual view of literature, as shown by Roland Barthes, supports the concept that the meaning of an artistic work does not reside in that work, but in the viewers. More recent post-structuralist theory, such as that formulated in Daniela Caselli's Beckett's Dantes: Intertextuality in the Fiction and Criticism (MUP 2005), re-examines "intertextuality" as a production within texts, rather than as a series of relationships between different texts. Some postmodern theorists[citation needed] like to talk about the relationship between "intertextuality" and "hypertextuality"; intertextuality makes each text a "mosaic of quotations" (Kristeva, 66) and part of a larger mosaic of texts, just as each hypertext can be a web of links and part of the whole World-Wide Web.
There is also a distinction between the notions of "intertext", "hypertext" and "supertext". Take for example the Dictionary of the Khazars by Milorad Pavić. As an intertext it employs quotations from the scriptures of the Abrahamic religions. As a hypertext it consists of links to different articles within itself and also every individual trajectory of reading it. As a supertext it combines male and female versions of itself, as well as three mini-dictionaries in each of the versions.
Some critics have complained that the ubiquity of the term "intertextuality" in postmodern criticism has crowded out related terms and important nuances. Irwin (227) laments that intertextuality has eclipsed allusion as an object of literary study while lacking the latter term's clear definition. Linda Hutcheon argues that excessive interest in intertextuality obscures the role of the author, because intertextuality can be found "in the eye of the beholder" and does not necessarily entail a communicator's intentions. By contrast, parody, Hutcheon's preferred term, always features an author who actively encodes a text as an imitation with critical difference. However, there have also been attempts at more closely defining different types of intertextuality. The British film theoretician John Fiske has made a distinction between what he labels 'vertical' and 'horizontal' intertextuality. Horizontal intertextuality denotes references that are on the 'same level', i.e. when books make references to other books, whereas vertical intertextuality is found when, say, a book makes a reference to a film or song or vice versa. Similarly, the linguist Norman Fairclough distinguishes between 'manifest intertextuality' and 'constitutive intertextuality' (Fairclough 1992: 117). The former signifies intertextual elements such as presupposition, negation, parody, irony, etc. The latter signifies the interrelationship of discursive features in a text, such as structure, form, or genre. Constitutive intertextuality is also referred to as interdiscursivity (Agger 1999), though generally interdiscursivity refers to relations between larger formations of texts.
While the theoretical concept of intertextuality is associated with post-modernism, the device itself is not new. New Testament passages quote from the Old Testament, and Old Testament books such as Deuteronomy or the prophets refer to the events described in Exodus (though on using 'intertextuality' to describe the use of the Old Testament in the New Testament, see Porter 1997). Whereas a redaction critic would use such intertextuality to argue for a particular order and process of the authorship of the books in question, literary criticism takes a synchronic view that deals with the texts in their final form, as an interconnected body of literature. This interconnected body extends to later poems and paintings that refer to Biblical narratives, just as other texts build networks around Greek and Roman Classical history and mythology. Bulfinch's 1855 work The Age of Fable served as an introduction to such an intertextual network;[citation needed] according to its author, it was intended "...for the reader of English literature, of either sex, who wishes to comprehend the allusions so frequently made by public speakers, lecturers, essayists, and poets...".
Sometimes intertextuality is taken as plagiarism, as in the case of the Spanish writer Lucía Etxebarria, whose poem collection Estación de infierno (2001) was found to contain metaphors and verses from Antonio Colinas. Etxebarria claimed that she admired him and was applying intertextuality.
A pastiche is a literary or other artistic genre that is a "hodge-podge" or an imitation. The word is also a linguistic term used to describe an early stage in the development of a pidgin language.
In this usage, a work is called pastiche if it is cobbled together in imitation of several original works. As the Oxford English Dictionary puts it, a pastiche in this sense is "a medley of various ingredients; a hotchpotch, farrago, jumble." This meaning accords with the word's etymology: pastiche is the French version of the Italian dish pasticcio (Greek pastitsio), which designated a kind of pie made of many different ingredients.
Some works of art are pastiche in both senses of the term; for example, the David Lodge novel and the Star Wars series mentioned below appreciatively imitate work from multiple sources.
A pastiche mass is a mass where the constituent movements are from different Mass settings.
Masses are composed by classical composers as a set of movements: Kyrie, Gloria, Credo, Sanctus, Agnus Dei. (Examples: the Missa Solemnis by Beethoven and the Messe de Nostre Dame by Guillaume de Machaut.) In a pastiche mass, the performers may choose a Kyrie by one composer and a Gloria by another, or a Kyrie from one setting by a given composer and a Gloria from another.
Most often this convention is chosen for concert performances, particularly by early music ensembles.
In this usage, the term denotes a literary technique employing a generally light-hearted tongue-in-cheek imitation of another's style; although jocular, it is usually respectful.
For example, many stories featuring Sherlock Holmes, originally created by Arthur Conan Doyle, have been written as pastiches since the author's time. A similar example of pastiche is the posthumous continuation of the Robert E. Howard stories, written by other writers without Howard's authorization; this includes the Conan stories of L. Sprague de Camp and Lin Carter. David Lodge's novel The British Museum Is Falling Down (1965) is a pastiche of works by Joyce, Kafka, and Virginia Woolf. Tom Stoppard's Rosencrantz and Guildenstern Are Dead is a pastiche of Shakespeare's Hamlet.
The fantasy writer Terry Pratchett is known for his use of pastiche, particularly in his early works Strata, a pastiche of various science fiction themes, The Light Fantastic, a humorous pastiche of the fantasy genre, and Wyrd Sisters, which was inspired by the plays of William Shakespeare, particularly Macbeth and Hamlet.
Pastiche is also found in non-literary works, including art and music. For instance, Charles Rosen has characterized Mozart's various works in imitation of Baroque style as pastiche, and Edvard Grieg's Holberg Suite was written as a conscious homage to the music of an earlier age. Perhaps one of the best examples of pastiche in modern music is that of George Rochberg, who used the technique in his String Quartet No. 3 of 1972 and Music for the Magic Theater. Rochberg turned to pastiche from serialism after the death of his son in 1963.
Many of "Weird Al" Yankovic's songs are pastiches: for example, "Dare to Be Stupid" is a Devo pastiche, and "Bob" from the album Poodle Hat is a pastiche of Bob Dylan.
"Bohemian Rhapsody" by Queen is unusual as it is a pastiche in both senses of the word, as there are many distinct styles imitated in the song, all 'hodge-podged' together to create one piece of music.
Pastiche is prominent in popular culture. Many genre writings, particularly in fantasy, are essentially pastiches. The Star Wars series of films by George Lucas is often considered to be a pastiche of traditional science fiction television serials (or radio shows). The fact that Lucas's films have been influential, spawning their own pastiches such as the 1983 3D film Metalstorm: The Destruction of Jared-Syn, can be regarded as a function of postmodernity.
The films of Quentin Tarantino are often described as pastiches, as they often pay tribute to (or imitate) pulp novels, blaxploitation and/or Chinese kung fu films, though some say his films are more of an homage. The same definition is said to apply to the video games of Hideo Kojima as well, since they adopt many conventions of action films.
Pastiche can also be a cinematic device wherein the creator of the film pays homage to another filmmaker's style and use of cinematography, including camera angles, lighting, and mise en scène. A film's writer may also offer a pastiche based on the works of other writers (this is especially evident in historical films and documentaries but can be found in non-fiction drama, comedy and horror films as well).
Well-known academic Fredric Jameson has a somewhat more critical view of pastiche, describing it as "blank parody" (Jameson, 1991), especially with reference to the postmodern parodic practices of self-reflexivity and intertextuality. By this is meant that rather than being a jocular but still respectful imitation of another style, pastiche in the postmodern era has become a "dead language", without any political or historical content, and so has also become unable to satirize in any effective way. Whereas pastiche used to be a humorous literary style, it has, in postmodernism, become "devoid of laughter" (Jameson, 1991).
In urban planning, pastiche refers to neighborhoods built as imitations of the building styles conceived by major planners. Many post-war European neighborhoods can in this way be described as pastiches of planners like Le Corbusier or Ebenezer Howard.
Postmodern art, media and literature can be characterized by intertextuality as the narrative mode, and the postmodern period can be characterized by the death of the grand narratives as proclaimed by Jean-François Lyotard in The Postmodern Condition: A Report on Knowledge (1979). The grand narratives such as religions, ideologies and the Enlightenment project have been substituted by small, local narratives, e.g. love of one's family. Pastiche is intertextual in its very form, as it is a recreation of an earlier text. In the postmodern pastiche the older text (the hypotext) may reflect one of the bygone grand narratives, yet its new postmodern version may reflect a local narrative, so that the two enter into a dialogue in the pastiche. This is, for instance, the case with Francis Glebas' "Pomp and Circumstance", the seventh segment of Fantasia 2000 (1999), in which the grand religious narrative of the Deluge is merged with the local narrative of personal love, personified in Donald Duck and Daisy. Though the grand narratives may be dead as ontological frames, in the pastiche narrative they can regain some of their ontological strength when the local narratives are confronted by them in this way.
Jean Baudrillard | |
---|---|
Full name | Jean Baudrillard |
Born | 27 July 1929 Reims, France |
Died | 6 March 2007 (aged 77) Paris, France |
Era | 20th / 21st-century philosophy |
Region | Western Philosophy |
School | Post-Structuralism · Marxism · Post-Marxism |
Main interests | Postmodernity · Mass Media |
Notable ideas | Hyperreality · Simulacra · Sign value |
Jean Baudrillard (27 July 1929 – 6 March 2007) (IPA: [ʒɑ̃ bo.dʁi.jaʁ])[2] was a French sociologist, philosopher, cultural theorist, political commentator, and photographer. His work is frequently associated with postmodernism and post-structuralism.
Baudrillard was born in Reims, north-eastern France, on July 27, 1929. He told interviewers that his grandparents were peasants and his parents were civil servants. He became the first of his family to attend university when he moved to Paris to attend Sorbonne University.[3] There he studied German, which led him to begin teaching the subject at a provincial lycée, where he remained from 1958 until his departure in 1966. While teaching, Baudrillard began to publish reviews of literature and translated the works of such authors as Peter Weiss, Bertolt Brecht and Wilhelm Mühlmann.[4]
Toward the end of his time as a German teacher, Baudrillard began to transfer to sociology, eventually completing his doctoral thesis Le Système des objets (The System of Objects) under the tutelage of Henri Lefebvre. Subsequently, he began teaching the subject at the Université de Paris-X Nanterre, at the time a politically radical institution which would become heavily involved in the events of May 1968.[5] At Nanterre he took up a position as Maître Assistant (Assistant Professor), then Maître de Conférences (Associate Professor), eventually becoming a professor after completing his accreditation, L'Autre par lui-même (The Other, by himself).
In 1986 he moved to IRIS (Institut de Recherche et d'Information Socio-Économique) at the Université de Paris-IX Dauphine, where he spent the latter part of his teaching career. During this time he had begun to move away from sociology as a discipline (particularly in its "classical" form), and, after ceasing to teach full time, he rarely identified himself with any particular discipline, although he remained linked to the academic world. During the 1980s and 1990s his books had gained a wide audience, and in his last years he became, to an extent, an intellectual celebrity,[6] being published often in the French- and English-speaking popular press. He nonetheless continued supporting the Institut de Recherche sur l'Innovation Sociale at the Centre National de la Recherche Scientifique and was Satrap at the Collège de Pataphysique. He also collaborated at the Canadian philosophical review Ctheory, where he was abundantly cited.
Baudrillard was a social theorist and critic who is best known for his analyses of the modes of mediation and of technological communication. His writing, although consistently interested in the way technological progress affects social change, covers diverse subjects — from consumerism to gender relations to the social understanding of history to journalistic commentaries about AIDS, cloning, the Rushdie affair, the first Gulf War and the attacks on the World Trade Center in New York City.
His published work emerged as part of a generation of French thinkers including Gilles Deleuze, Jean-Francois Lyotard, Michel Foucault, Jacques Derrida and Jacques Lacan who all shared an interest in semiotics, and he is often seen as a part of the poststructuralist philosophical school.[7] In common with many poststructuralists, his arguments consistently draw upon the notion that signification and meaning are both only understandable in terms of how particular words or "signs" interrelate. Baudrillard thought, as many post-structuralists did, that meaning is brought about through systems of signs working together. Following on from the structuralist linguist Ferdinand de Saussure, Baudrillard argued that meaning is based upon an absence (so "dog" means "dog" not because of what the word says, as such, but because of what it does not say: "cat", "goat", "tree" etc.). In fact, he viewed meaning as near enough self-referential: objects, images of objects, words and signs are situated in a web of meaning; one object's meaning is only understandable through its relation to the meaning of other objects, in other words, one thing's prestige relates to another's mundanity.
From this starting point Baudrillard constructed broad theories of human society based upon this kind of self-referentiality. His pictures of society portray societies always searching for a sense of meaning — or a "total" understanding of the world — that remains consistently elusive. In contrast to poststructuralists such as Foucault, for whom the formations of knowledge emerge only as the result of relations of power, Baudrillard developed theories in which the excessive, fruitless search for total knowledge leads almost inevitably to a kind of delusion. In Baudrillard's view, the (human) subject may try to understand the (non-human) object, but because the object can only be understood according to what it signifies (and because the process of signification immediately involves a web of other signs from which it is distinguished) this never produces the desired results. The subject, rather, becomes seduced (in the original Latin sense, seducere, to lead away) by the object. He therefore argued that, in the last analysis, a complete understanding of the minutiae of human life is impossible, and when people are seduced into thinking otherwise they become drawn toward a "simulated" version of reality, or, to use one of his neologisms, a state of "hyperreality." This is not to say that the world becomes unreal, but rather that the faster and more comprehensively societies begin to bring reality together into one supposedly coherent picture, the more insecure and unstable it looks and the more fearful societies become.[8] Reality, in this sense, "dies out."[9]
Accordingly, Baudrillard argued that the excess of signs and of meaning in late 20th century "global" society had caused (quite paradoxically) an effacement of reality. In this world neither liberal nor Marxist utopias are any longer believed in. We live, he argued, not in a "global village," to use Marshall McLuhan's phrase, but rather in a world that is ever more easily petrified by even the smallest event. Because the "global" world operates at the level of the exchange of signs and commodities, it becomes ever more blind to symbolic acts such as, for example, terrorism. In Baudrillard's work the symbolic realm (which he develops a perspective on through the anthropological work of Marcel Mauss and Georges Bataille) is seen as quite distinct from that of signs and signification. Signs can be exchanged like commodities; symbols, on the other hand, operate quite differently: they are exchanged, like gifts, sometimes violently as a form of potlatch. Baudrillard, particularly in his later work, saw the "global" society as without this "symbolic" element, and therefore symbolically (if not militarily) defenceless against acts such as the Rushdie Fatwa[10] or, indeed, the September 11, 2001, terrorist attacks against the United States and its military establishment (see below).
In 2004, the International Journal of Baudrillard Studies was launched.
In his early books, such as The System of Objects, For a Critique of the Political Economy of the Sign, and The Consumer Society, Baudrillard's main focus is upon consumerism, and how different objects are consumed in different ways. At this time Baudrillard's political outlook was loosely associated with Marxism (and situationism), but in these books he differed from Marx in one significant way. For Baudrillard, it was consumption, rather than production, which was the main drive in capitalist society.
Baudrillard came to this conclusion by criticising Marx's concept of "use value." Baudrillard thought that both Marx's and Adam Smith's economic thought accepted the idea of genuine needs relating to genuine uses too easily and too simply. He argued, drawing from Georges Bataille, that needs are constructed, rather than innate. Whereas Marx believed that uses genuinely lay beneath capitalism's "commodity fetishism," Baudrillard thought that all purchases, because they always signify something socially, have their fetishistic side. Objects always, drawing from Roland Barthes, "say something" about their users. And this was, for him, why consumption was and remains more important than production: because the "ideological genesis of needs"[11] precedes the production of goods to meet those needs.
He wrote that there are four ways of an object obtaining value. The four value-making processes are as follows:[12] the functional value of an object (its instrumental purpose, or use value); the exchange value of an object (its economic value); the symbolic value of an object (the value a subject assigns to it in relation to another subject, as with a ring given as a token of love); and the sign value of an object (its value within a system of objects, signifying, for example, prestige or social status rather than any added function).
Baudrillard's earlier books were attempts to argue that the first two of these values are not simply associated, but are disrupted by the third and, particularly, the fourth. Later, Baudrillard rejected Marxism totally (The Mirror of Production and Symbolic Exchange and Death). But the focus on the difference between sign value (which relates to commodity exchange) and symbolic value (which relates to Maussian gift exchange) remained in his work up until his death. Indeed it came to play a more and more important role, particularly in his writings on world events.
As he developed his work throughout the 1980s, he moved from economically-based theory to the consideration of mediation and mass communications. Although retaining his interest in Saussurean semiotics and the logic of symbolic exchange (as influenced by anthropologist Marcel Mauss) Baudrillard turned his attention to Marshall McLuhan, developing ideas about how the nature of social relations is determined by the forms of communication that a society employs. In so doing, Baudrillard progressed beyond both Saussure's and Roland Barthes' formal semiology to consider the implications of a historically-understood (and thus formless) version of structural semiology. The concept of Simulacra also involves a negation of the concept of reality as we usually understand it. Baudrillard argues that today there is no such thing as reality.
Throughout the 1980s and 1990s, one of Baudrillard's most common themes was historicity, or, more specifically, how present day societies utilise the notions of progress and modernity in their political choices. He argued, much like the political theorist Francis Fukuyama, that history had ended or "vanished" with the spread of globalization; but, unlike Fukuyama, Baudrillard averred that this end should not be understood as the culmination of history's progress, but as the collapse of the very idea of historical progress. For Baudrillard, the end of the Cold War was not caused by one ideology's victory over the other, but the disappearance of the utopian visions that both the political Right and Left shared. Giving further evidence of his opposition toward Marxist visions of global communism and liberal visions of global civil society, Baudrillard contended that the ends they hoped for had always been illusions; indeed, as his book The Illusion of the End argued, he thought the idea of an end itself was nothing more than a misguided dream:
Within a society subject to and ruled by fast-paced electronic communication and global information networks the collapse of this façade was always going to be, he thought, inevitable. Employing a quasi-scientific vocabulary that attracted the ire of the physicist Alan Sokal, Baudrillard wrote that the speed society moved at had destabilized the linearity of history: "we have the particle accelerator that has smashed the referential orbit of things once and for all."[14]
In making this argument Baudrillard found some affinity with the postmodern philosophy of Jean-Francois Lyotard, who famously argued that in the late Twentieth Century there was no longer any room for "metanarratives." (The triumph of a coming communism being one such metanarrative.) But, in addition to simply lamenting this collapse of history, Baudrillard also went beyond Lyotard and attempted to analyse how the idea of forward progress was being employed in spite of the notion's declining validity. Baudrillard argued that although genuine belief in a universal endpoint of history, wherein all conflicts would find their resolution, had been deemed redundant, universality was still a notion utilised in world politics as an excuse for actions. Universal values which, according to him, no one any longer believed universal were and are still rhetorically employed to justify otherwise unjustifiable choices. The means, he wrote, are there even though the ends are no longer believed in, and are employed in order to hide the present's harsh realities (or, as he would have put it, unrealities). "In the Enlightenment, universalization was viewed as unlimited growth and forward progress. Today, by contrast, universalization is expressed as a forward escape."[15]
Part of Baudrillard's public profile, as both an academic and a political commentator, comes from his 1991 book, titled for its provocative main thesis, "The Gulf War Did Not Take Place". His argument described the first Gulf War as the inverse of the Clausewitzian formula: it was not "the continuation of politics by other means", but "the continuation of the absence of politics by other means". Accordingly, Saddam Hussein was not fighting the Allied Forces, but using the lives of his soldiers as a form of sacrifice to preserve his power (p. 72, 2004 edition). The Allied Forces fighting the Iraqi military were merely dropping 10,000 tonnes of bombs daily, as if proving to themselves that there was an enemy to fight (p. 61). The Western media were likewise complicit, presenting the war in real time and recycling images of war to propagate the notion that the two enemies, the US (and its allies) and the Iraqi Army, were actually fighting; but such was not the case. Saddam Hussein did not deploy his military capacity (the Iraqi Air Force), his politico-military power was not weakened (he suppressed the Kurdish insurgency against Iraq at the war's end), and politically little had changed in Iraq: the enemy went undefeated, the victors were not victorious, and therefore there was no war; the Gulf War did not occur.
Much of the notoriety that Baudrillard gained as a result of the book (originally a series of articles in the British newspaper The Guardian and the French newspaper Libération, published in three parts: "The Gulf War Will Not Take Place" during the American military and rhetorical buildup, "The Gulf War Is Not Taking Place" during military action, and "The Gulf War Did Not Take Place" after the action was over) was based on the criticism that the Gulf War was not ineffectual in the way Baudrillard portrayed it: people died, the political map was altered, and Saddam Hussein's regime was harmed. Some critics accuse Baudrillard of instant revisionism: a denial of the physical action of the conflict (part of his denial of reality in general). Consequently, Baudrillard was accused of lazy amoralism, cynical scepticism, and Berkelian idealism. Sympathetic commentators (such as William Merrin, in his book Baudrillard and the Media) have argued that Baudrillard was more concerned with the West's technological and political dominance and the globalization of its commercial interests, and with what these mean for the present possibility of war. Merrin has asserted that Baudrillard did not deny that something happened, but merely questioned whether that something was a war; rather, it was "an atrocity masquerading as a war". Merrin's book viewed the accusations of amorality as redundant and based on a misreading; Baudrillard's own position was more nuanced. In Baudrillard's own words (p. 71-72):
Saddam liquidates the communists, Moscow flirts even more with him; he gases the Kurds, it is not held against him; he eliminates the religious cadres, the whole of Islam makes peace with him ... Even ... the 100,000 dead will only have been the final decoy that Saddam will have sacrificed, the blood money paid in forfeit according to a calculated equivalence, in order to conserve his power. What is worse is that these dead still serve as an alibi for those who do not want to have been excited for nothing: at least these dead will prove this war was indeed a war and not a shameful and pointless hoax ...
In contrast to the "non-event" of the Gulf War, in the essay The Spirit of Terrorism he characterised the terrorist attacks on the World Trade Center in New York City as the "absolute event." Seeking to understand them as an (ab)reaction[clarification needed] to the technological and political expansion of capitalist globalization, rather than as a war of religiously-based or civilization-based warfare, he termed the absolute event and its consequences as follows (p. 11 in the 2002 version):
This is not a clash of civilisations or religions, and it reaches far beyond Islam and America, on which efforts are being made to focus the conflict in order to create the delusion of a visible confrontation and a solution based upon force. There is indeed a fundamental antagonism here, but one that points past the spectre of America (which is perhaps the epicentre, but in no sense the sole embodiment, of globalisation) and the spectre of Islam (which is not the embodiment of terrorism either) to triumphant globalisation battling against itself.
Baudrillard thus placed the attacks, as accords with his theory of society, in context as a symbolic reaction to the continued expansion of a world based solely upon commodity exchange. This stance was criticised on two counts. First, Richard Wolin (in The Seduction of Unreason) forcefully accused Baudrillard and Slavoj Zizek of all but celebrating the terrorist attacks, essentially claiming that the United States of America received what it deserved. Zizek, however, countered that accusation in the journal Critical Inquiry, characterising Wolin's analysis as a form of intellectual barbarism and saying that Wolin fails to see the difference between fantasising about an event and stating that one is deserving of that event. Merrin (in Baudrillard and the Media) argued that Baudrillard's position affords the terrorists a type of moral superiority. In the journal Economy and Society, Merrin further noted that Baudrillard gives the symbolic facets of society unfair privilege above semiotic concerns. Second, authors questioned whether the attacks were unavoidable. Bruno Latour, in Critical Inquiry, argued that Baudrillard believed that their destruction was forced by the society that created them, alluding to the idea that the Towers were "brought down by their own weight". In Latour's view, this was because Baudrillard conceived of society only in terms of a symbolic and semiotic dualism.
Baudrillard's writing and his uncompromising positions have led to his being criticised fiercely by many.[citation needed] For example, Denis Dutton, founder of Philosophy & Literature's "Bad Writing Contest" (which listed examples of the kind of willfully obscurantist prose for which Baudrillard was frequently criticised), had the following to say:
However, only one of the two major confrontational books on Baudrillard's thought — Christopher Norris's Uncritical Theory: Postmodernism, Intellectuals and the Gulf War (ISBN 0-87023-817-5) — seeks to reject his media theory and position on "the real" out of hand. The other — Douglas Kellner's Jean Baudrillard: From Marxism to Postmodernism and Beyond (ISBN 0-8047-1757-5) — seeks rather to analyse Baudrillard's relation to postmodernism (a concept with which Baudrillard had a continued, if uneasy and rarely explicit, relationship) and to present a Marxist counter. Regarding the former, William Merrin (as discussed above) has published more than one denunciation of Norris's position. The latter Baudrillard himself characterised as reductive (in Nicholas Zurbrugg's Jean Baudrillard: Art and Artefact).
William Merrin's work has presented a more sympathetic account, which attempts to "place Baudrillard in opposition to himself." Merrin has thereby argued that Baudrillard's position on the semiotic analysis of meaning undercuts his own position on symbolic exchange. Merrin thus alludes to the common criticism of post-structuralist work (a criticism not dissimilar in Baudrillard, Foucault or Deleuze) that emphasising interrelation as the basis for subjectivity denies the human agency from which social structures necessarily arise. (Alain Badiou and Michel de Certeau have made this point generally, and Barry Sandywell has argued as much in Baudrillard's specific case.)
Finally, Mark Poster, Baudrillard's editor and one of a number of present day academics who argue for his contemporary relevance, has remarked (p. 8 of Poster's 2nd ed. of Selected Writings):
Nonetheless Poster is keen to refute the most extreme of Baudrillard's critics, the likes of Alan Sokal and Norris who see him as a purveyor of a form of reality-denying irrationalism (ibid p. 7):