
Transformers 2 Hdfilme - Table of Contents
Transformers 2 - Die Rache (Revenge of the Fallen): stream it free of charge and legally. Enjoy films with HDfilme as if you were in a real cinema. Transformers - Die Rache [German dub/original version]; IMDb 6.0; 2 h 29 min; X-Ray. The battle for Earth goes on: Sam Witwicky, who is now off to college… Bigger, louder, faster: in the first Transformers sequel, Michael Bay shakes the entire Earth, because our planet serves the Autobots and Decepticons as a… Transformers 2: Die Rache Blu-ray UHD (2 discs); Transformers 2: Revenge of the Fallen, USA; category: 4K Ultra HD films; genre: adventure. Type: film; original name: Transformers: The Last Knight. Films in German; Transformers 5: The Last Knight, HD films to watch in German: as a portal opens to connect the two worlds, one army faces destruction. 'Transformers 4' is redesigned from top to bottom, says Michael Bay; the filmmaker reveals he is taking a smaller approach with his upcoming sequel. Transformers 2 - Die Rache, a film by Michael Bay with Shia LaBeouf and Megan Fox. Synopsis: a sequel to Transformers, in which Sam Witwicky (Shia LaBeouf)…

A typical example of such models is BERT. Note that the only difference between autoregressive models and autoencoding models is in the way the model is pretrained.
Therefore, the same architecture can be used for both autoregressive and autoencoding models. When a given model has been used for both types of pretraining, we have put it in the category corresponding to the article in which it was first introduced.
Sequence-to-sequence models use both the encoder and the decoder of the original transformer, either for translation tasks or by transforming other tasks to sequence-to-sequence problems.
They can be fine-tuned to many tasks but their most natural applications are translation, summarization and question answering.
The original transformer model is an example of such a model (used only for translation), while T5 is an example that can be fine-tuned on other tasks.
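As a hedged usage sketch of such a sequence-to-sequence model with the Hugging Face transformers library (the "t5-small" checkpoint and the pipeline task name are the standard public ones, but treat the exact call as illustrative rather than the only way to use T5):

```python
from transformers import pipeline

# T5 casts translation as a text-to-text (sequence-to-sequence) problem:
# the encoder reads the source text, the decoder generates the target text.
translator = pipeline("translation_en_to_de", model="t5-small")
result = translator("The battle for Earth goes on.", max_length=40)
print(result[0]["translation_text"])
```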
Multimodal models mix text inputs with other kinds (like images) and are more specific to a given task.

Autoregressive models: as mentioned before, these models rely on the decoder part of the original transformer and use an attention mask so that, at each position, the model can only look at the tokens that came before it in the attention heads.
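To make the causal attention mask concrete, here is a minimal NumPy sketch (all names are my own, and real implementations work on batches and multiple heads); it shows how a lower-triangular mask lets position i attend only to positions at or before i:

```python
import numpy as np

def causal_attention_weights(q, k):
    """Toy scaled dot-product attention with a causal (lower-triangular) mask.

    q, k: arrays of shape (seq_len, d). Each position may only attend to
    itself and to earlier positions, as in autoregressive models.
    """
    seq_len, d = q.shape
    scores = q @ k.T / np.sqrt(d)                   # (seq_len, seq_len)
    mask = np.tril(np.ones((seq_len, seq_len)))     # 1 where attention is allowed
    scores = np.where(mask == 1, scores, -1e9)      # block future positions
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    return weights / weights.sum(axis=-1, keepdims=True)  # row-wise softmax

q = k = np.random.randn(5, 8)
print(np.round(causal_attention_weights(q, k), 2))  # upper triangle is ~0
```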
GPT: the first autoregressive model based on the transformer architecture, pretrained on the Book Corpus dataset. GPT-2: a bigger and better version of GPT, pretrained on WebText (web pages from outgoing links on Reddit with 3 karma or more).
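A minimal generation sketch with the Hugging Face transformers library follows; the "gpt2" checkpoint name is the standard public one, and the sampling arguments are illustrative choices rather than recommended settings:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

inputs = tokenizer("The robots landed in Cairo and", return_tensors="pt")
# Autoregressive decoding: each new token only attends to the tokens before it.
output_ids = model.generate(**inputs, max_new_tokens=30, do_sample=True, top_k=50)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```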
CTRL: same as the GPT model, but adds the idea of control codes. Text is generated from a prompt (which can be empty) and one or several control codes that are then used to influence the text generation: for example, generate text in the style of a Wikipedia article, a book, or a movie review.
Transformer-XL: same as a regular GPT model, but introduces a recurrence mechanism for two consecutive segments (similar to a regular RNN with two consecutive inputs).
In this context, a segment is a number of consecutive tokens that may span across multiple documents, and segments are fed to the model in order.
Basically, the hidden states of the previous segment are concatenated to the current input to compute the attention scores.
This allows the model to pay attention to information that was in the previous segment as well as the current one. By stacking multiple attention layers, the receptive field can be increased to multiple previous segments.
This changes the positional embeddings to relative positional embeddings (as the regular positional embeddings would give the same results for the current input and the current hidden state at a given position) and requires some adjustments in the way attention scores are computed; a sketch of the recurrence mechanism follows.
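The sketch below is not the actual Transformer-XL implementation (it omits relative positional embeddings, causal masking, heads and batching); it only illustrates, under those simplifying assumptions, how cached hidden states from the previous segment can serve as extra keys and values for the current segment:

```python
import numpy as np

def attend_with_memory(h_current, memory, w_q, w_k, w_v):
    """Segment-level recurrence sketch: keys/values cover the cached hidden
    states of the previous segment plus the current one; queries come only
    from the current segment."""
    context = np.concatenate([memory, h_current], axis=0)   # (mem+cur, d)
    q = h_current @ w_q                                      # (cur, d)
    k = context @ w_k                                        # (mem+cur, d)
    v = context @ w_v
    scores = q @ k.T / np.sqrt(q.shape[-1])
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v                                       # (cur, d)

d = 16
w_q, w_k, w_v = (np.random.randn(d, d) for _ in range(3))
memory = np.random.randn(4, d)       # hidden states cached from segment t-1
h_current = np.random.randn(4, d)    # hidden states of segment t
print(attend_with_memory(h_current, memory, w_q, w_k, w_v).shape)  # (4, 16)
```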
Reformer: an autoregressive transformer model with lots of tricks to reduce memory footprint and compute time. Those tricks include the following.
Use axial position encoding (see below for more details). Replace traditional attention with LSH (locality-sensitive hashing) attention (see below for more details).
Avoid storing the intermediate results of each layer by using reversible transformer layers to obtain them during the backward pass (subtracting the residuals from the input of the next layer gives them back), or by recomputing them for results inside a given layer (less efficient than storing them, but saving memory); a runnable sketch of the reversible-layer idea appears after this model's description.
With those tricks, the model can be fed much larger sentences than traditional transformer autoregressive models.
Note: this model could very well be used in an autoencoding setting; there is no checkpoint for such a pretraining yet, though.
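The following is a minimal sketch of the reversible-residual idea mentioned in the list of tricks above (in the style of RevNets, which Reformer builds on), not Reformer's actual code: the layer's inputs can be recomputed from its outputs during the backward pass, so they do not need to be stored.

```python
import numpy as np

def f(x):   # stand-ins for the attention and feed-forward sublayers
    return np.tanh(x)

def g(x):
    return 0.5 * x

def reversible_forward(x1, x2):
    y1 = x1 + f(x2)
    y2 = x2 + g(y1)
    return y1, y2

def recover_inputs(y1, y2):
    """Recover the layer inputs from its outputs instead of storing them."""
    x2 = y2 - g(y1)
    x1 = y1 - f(x2)
    return x1, x2

x1, x2 = np.random.randn(3, 4), np.random.randn(3, 4)
y1, y2 = reversible_forward(x1, x2)
r1, r2 = recover_inputs(y1, y2)
print(np.allclose(x1, r1), np.allclose(x2, r2))  # True True
```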
XLNet is not a traditional autoregressive model, but uses a training strategy that builds on that: it permutes the tokens in the sentence, then allows the model to use the last n tokens to predict token n+1. The library provides a version of the model for language modeling, token classification, sentence classification, multiple-choice classification and question answering.
Autoencoding models: as mentioned before, these models rely on the encoder part of the original transformer and use no mask, so the model can look at all the tokens in the attention heads.
BERT: for pretraining, inputs are a corrupted version of the sentence, usually obtained by masking tokens, and the targets are the original sentences. The model must predict the original sentence, but it has a second objective: inputs are two sentences A and B with a separation token in between.
The model has to predict whether the sentences are consecutive or not. The library provides a version of the model for language modeling (traditional or masked), next-sentence prediction, token classification, sentence classification, multiple-choice classification and question answering.
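A hedged usage sketch of the masked-language-modeling objective with the library's fill-mask pipeline follows; "bert-base-uncased" is the standard public checkpoint, and the example sentence is my own:

```python
from transformers import pipeline

# Masked-language-modeling head of a pretrained BERT checkpoint:
# the model predicts the most likely tokens for the [MASK] position.
fill_mask = pipeline("fill-mask", model="bert-base-uncased")
for candidate in fill_mask("The Autobots protect the [MASK]."):
    print(candidate["token_str"], round(candidate["score"], 3))
```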
ALBERT: next-sentence prediction is replaced by a sentence-ordering prediction: in the inputs, we have two consecutive sentences A and B, and we either feed A followed by B or B followed by A.
The model must predict whether they have been swapped or not. The library provides a version of the model for masked language modeling, token classification, sentence classification, multiple-choice classification and question answering.
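A toy sketch of how such sentence-ordering training pairs can be built (helper name and label convention are mine, not the library's):

```python
import random

def sentence_order_pairs(document):
    """Sketch of the sentence-ordering objective: consecutive sentence pairs
    are fed either in order (label 0) or swapped (label 1)."""
    pairs = []
    for a, b in zip(document, document[1:]):
        if random.random() < 0.5:
            pairs.append((a, b, 0))   # original order
        else:
            pairs.append((b, a, 1))   # swapped -> the model must detect this
    return pairs

doc = ["Sam goes to college.", "Bumblebee stays behind.", "The Decepticons return."]
for a, b, label in sentence_order_pairs(doc):
    print(f"[CLS] {a} [SEP] {b} [SEP] -> label {label}")
```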
DistilBERT: same as BERT but smaller, trained by distillation of the pretrained BERT model. The actual objective is a combination of: predicting the masked tokens correctly, matching the output probabilities of the teacher model, and a cosine similarity between the hidden states of the student and the teacher. The library provides a version of the model for masked language modeling, token classification, sentence classification and question answering.
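The PyTorch sketch below illustrates a generic distillation-style objective of that shape (soft-target loss against a teacher, a masked-LM loss, and a cosine loss on hidden states); the exact weighting and temperature used for DistilBERT are not reproduced here, and all tensors are random placeholders:

```python
import torch
import torch.nn.functional as F

def distillation_objective(student_logits, teacher_logits, labels,
                           student_hidden, teacher_hidden, temperature=2.0):
    """Sketch of a combined distillation objective (illustrative weights)."""
    soft_loss = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * temperature ** 2
    mlm_loss = F.cross_entropy(student_logits.view(-1, student_logits.size(-1)),
                               labels.view(-1), ignore_index=-100)
    cosine_loss = 1 - F.cosine_similarity(student_hidden, teacher_hidden, dim=-1).mean()
    return soft_loss + mlm_loss + cosine_loss

vocab, seq = 30522, 8
student_logits = torch.randn(2, seq, vocab)
teacher_logits = torch.randn(2, seq, vocab)
labels = torch.randint(0, vocab, (2, seq))
hidden_s, hidden_t = torch.randn(2, seq, 768), torch.randn(2, seq, 768)
print(distillation_objective(student_logits, teacher_logits, labels, hidden_s, hidden_t))
```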
XLM: a transformer model trained on several languages. There are three different types of training for this model, and the library provides checkpoints for all of them:
Causal language modeling (CLM), which is the traditional autoregressive training (so this model could be in the previous section as well).
One of the languages is selected for each training sample, and the model input is a sentence of tokens that may span several documents in one of those languages.
Masked language modeling (MLM): one of the languages is selected for each training sample, and the model input is a sentence of tokens that may span several documents in one of those languages, with dynamic masking of the tokens.
Translation language modeling (TLM): this consists of concatenating a sentence in two different languages, with random masking. To predict one of the masked tokens, the model can use both the surrounding context in language 1 and the context given by language 2 (see the sketch after this model's description).
Checkpoints indicate which method was used for pretraining by having clm, mlm or mlm-tlm in their names. On top of positional embeddings, the model has language embeddings.
The library provides a version of the model for language modeling, token classification, sentence classification and question answering.
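A toy sketch of how a translation-language-modeling input can be assembled follows; it uses whitespace tokenization and made-up special tokens purely for illustration, not the actual XLM preprocessing:

```python
import random

def tlm_example(sentence_lang1, sentence_lang2, mask_token="[MASK]", p=0.15):
    """Sketch of a TLM input: a sentence and its translation are concatenated
    and tokens in both are randomly masked, so the model can use either
    language's context to fill in the blanks."""
    tokens = sentence_lang1.split() + ["</s>"] + sentence_lang2.split()
    masked, labels = [], []
    for tok in tokens:
        if tok != "</s>" and random.random() < p:
            masked.append(mask_token)
            labels.append(tok)        # the model must recover this token
        else:
            masked.append(tok)
            labels.append("-")        # position not scored
    return " ".join(masked), labels

random.seed(0)
text, labels = tlm_example("the robots attack the city", "les robots attaquent la ville")
print(text)
print(labels)
```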
XLM-RoBERTa: uses RoBERTa tricks on the XLM approach, but does not use the translation language modeling objective; it only uses masked language modeling on sentences coming from one language.
ELECTRA: a transformer model pretrained with the use of another (small) masked language model. The inputs are corrupted by that language model, which takes an input text that is randomly masked and outputs a text in which ELECTRA has to predict which tokens are original and which have been replaced.
The library provides a version of the model for masked language modeling, token classification and sentence classification.
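A toy sketch of the replaced-token-detection targets described above (function name and token lists are mine; the real setup uses a trained generator to produce the corrupted tokens):

```python
def replaced_token_labels(original_tokens, corrupted_tokens):
    """For each position, the discriminator predicts whether the
    (generator-produced) token is original (0) or replaced (1)."""
    return [int(orig != corr) for orig, corr in zip(original_tokens, corrupted_tokens)]

original = ["the", "autobots", "defend", "the", "city"]
corrupted = ["the", "autobots", "attack", "the", "planet"]  # generator's samples
print(replaced_token_labels(original, corrupted))  # [0, 0, 1, 0, 1]
```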
Longformer: a transformer model replacing the attention matrices by sparse matrices to go faster. Often, the local context (e.g., what are the two tokens to the left and right?) is enough to take action for a given token. Some preselected input tokens are still given global attention, but the attention matrix has far fewer parameters, resulting in a speed-up.
See the local attention section for more information.
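In the spirit of such sparse-attention models, here is a minimal sketch of a local-plus-global attention pattern (window size, global positions and the 0/1 mask convention are illustrative assumptions, not the model's actual configuration):

```python
import numpy as np

def sparse_attention_mask(seq_len, window=2, global_positions=(0,)):
    """Sketch of a local + global attention pattern: each token attends to a
    small window around itself, while a few preselected tokens attend (and
    are attended to) globally. 1 = attention allowed."""
    mask = np.zeros((seq_len, seq_len), dtype=int)
    for i in range(seq_len):
        lo, hi = max(0, i - window), min(seq_len, i + window + 1)
        mask[i, lo:hi] = 1                 # local sliding window
    for g in global_positions:
        mask[g, :] = 1                     # global token sees everything
        mask[:, g] = 1                     # and everything sees it
    return mask

print(sparse_attention_mask(8, window=1, global_positions=(0,)))
```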
Transformers: Getarnte Roboter (Transformers: Robots in Disguise) - episode notes: Strongarm worries that her… Written by Dean Stefan: human scientists discover evidence of the Decepticons' presence ("Jagdsaison", 22 min). Bumblebee's Autobots must learn to work together as a team in a fight against the Sharkticon Hammerstrike and the wolf-like Steeljaw. Fixit is finally allowed to accompany the members of the Bee Team on a mission, but he soon realizes that he may have overestimated himself. Written by Stan Berkowitz: a bat-like Decepticon captures Denny and the Autobots and confronts them with their worst…
Transformers 2 Hdfilme - You might also be interested in
Before he can reveal more to Sam, he is killed by Laserbeak, as are other collaborators. Wheelie already appeared in Die Rache (Revenge of the Fallen) and was originally a Decepticon scout who transforms into a toy pick-up truck, but he switched sides over the course of the film. Megan Fox. Matthew Marsden. Megan Fox. June, in Moscow, Russia.

Transformers 2 Hdfilme - Transformers films and parts: list in order (video)
Transformers REVENGE OF THE FALLEN Movie Review (ft. KELLEN GOFF) - JobbytheHong
But soon Sam is haunted by strange visions. Ramon Rodriguez. John Benjamin Hickey. Box Office Mojo, on the 8th. Country of production: USA. There they recover four pillars of the bridge, the corresponding control pillar and Sentinel Prime himself, and bring him to Earth, where Optimus reactivates him with the help of the Matrix of Leadership.
Transformers 2 Hdfilme - Statistics
Back cover. The Motley Fool, on the 6th.

Transformers 2 Hdfilme - Transformers: Revenge of the Fallen
Accordingly, the vehicles were exhibited at Daytona in February, four months before the film's release. Two accidents occurred during filming: while shooting in East Chicago, Indiana in September, an extra was severely injured during a stunt scene, suffering permanent brain damage, lasting paralysis of the left half of her body, and an eye that had to be sewn shut. Roberto Orci and Alex Kurtzman, two of the screenwriters of the two previous installments, had already announced in March that they would no longer be available for a third part. This comes at the expense of suspense and keeps the viewer from caring about the characters as well as marveling at the spectacle. In contrast to the original Laserbeak, the film version can speak. They have their sights set on the bounty that has been placed on Bumblebee's head. Megatron is still the leader of the Decepticons, but he remains weakened by the facial injuries that Optimus Prime inflicted on him in Die Rache.
However, with this being the final chapter of the War for Cybertron storyline, Kingdom could take a longer time to produce than fans want.
The reason is that we are expecting more large-scale battles than in previous instalments, and more on-screen action means more time is needed for animation, especially with this particular show's 3D animation style.
As always, we will bring you updates on the future of Transformers: War for Cybertron as soon as information is officially announced.
It was followed by Dark of the Moon in 2011. In 17,000 B.C., the Primes roam the galaxy, harvesting energon from stars with machines called Sun Harvesters.
One of them defies the rule never to destroy a star that sustains life by establishing a Sun Harvester on Earth, earning him the name "The Fallen".
The Fallen is confronted by the other Primes, who imprison him before he can harvest the Sun using the Matrix of Leadership.
The rest of the Primes then sacrifice themselves to hide the Matrix with their bodies in an unknown location.
Sideways is killed by the Autobot Sideswipe, while Optimus Prime kills Demolishor, but not before he tells them of the Fallen's return. The Decepticon Soundwave hacks into a military satellite, overhears this information, and sends Ravage and Reedman to retrieve the shard.
Meanwhile, Sam Witwicky is preparing to attend college, leaving his girlfriend Mikaela Banes and guardian Bumblebee behind.
He finds a smaller All-Spark shard in his hoodie and picks it up, causing him to see Cybertronian symbols. As a side effect, the shard's energy brings various kitchen appliances to life, who then attack Sam and his parents.
After Bumblebee kills the living appliances, Sam gives the shard to Mikaela, who later captures the Decepticon Wheelie as he attempts to steal it.
At the Laurentian Abyss, the Constructicons resurrect Megatron using the stolen shard and parts ripped off from one of their own allies. Megatron then travels to one of Saturn's moons, where he reunites with his second-in-command, Starscream, and his master, the Fallen, who orders him to capture Sam alive and kill Optimus, as Primes are the only Transformers who can defeat the Fallen.
In college, Sam is followed and seduced by Alice, a Decepticon Pretender impersonating a college student, while Mikaela and his roommate Leo become caught up in the events.
Mikaela kills Alice, before the trio is captured by the Decepticon Grindor and taken to an abandoned factory. There, Megatron reveals that the symbols in Sam's mind will lead the Decepticons to a new Energon source, before Optimus and Bumblebee arrive to rescue them.
Optimus engages Megatron, Starscream, and Grindor, killing the last in the process, before being himself killed by Megatron. The other Autobots arrive and force Megatron and Starscream to retreat.
The Decepticons then launch devastating, simultaneous attacks around the world, while The Fallen hijacks Earth's telecommunications systems, demanding that Sam be handed over to him.
While looking for alien expert "Robo-Warrior", Sam, Mikaela, and Leo find and enlist the help of former Sector Seven agent Seymour Simmons, who reveals that the Transformers visited Earth eons ago and that the most ancient ones, known as Seekers, remained hidden on Earth.
They use their shard to revive Jetfire, who teleports the group to Egypt and explains the story of the Fallen.
Along with Jetfire, Wheelie sides with the Autobots, and Jetfire sends them to locate the Matrix of Leadership, which could be used to revive Optimus.
The group find the Matrix in Petra, but it disintegrates into dust. Nevertheless, Sam stuffs its remains into one of his socks.
During the battle, the Constructicons combine to form Devastator, who destroys one of the pyramids to reveal the Sun Harvester hidden inside, before he is killed by a U.S. Navy railgun aboard the USS Kidd. Ravage and the Decepticon Rampage attempt to spring a trap, using Sam's captured parents as bait, in order to force Sam to give them the Matrix, but Bumblebee interferes and kills them both.
Major William Lennox and Sergeant Robert Epps call in an airstrike that kills the majority of the Decepticon ground forces. However, Megatron manages to shoot Sam, mortally wounding him.
He then retreats after coming under attack by fighter jets. As Sam nears death, the Primes speak to him through a vision, saying that the Matrix must be earned, not found, and that he has now earned the right to bear it for fighting for Optimus.
They restore Sam's energy and grant him the Matrix, which he uses to revive Optimus. The Fallen teleports to their location and steals the Matrix from Optimus.
He returns to the pyramid with Megatron and activates the Sun Harvester. Jetfire, who is severely wounded after his battle with Scorponok, sacrifices himself to allow his parts to be transplanted to Optimus.
The parts give Optimus immense strength and the ability to fly, allowing him to destroy the Harvester. He then battles Megatron and The Fallen, incapacitating the former and killing the latter.
Megatron retreats with Starscream, vowing vengeance. Major hurdles for the film's initial production stages included the 2007-08 Writers Guild of America strike as well as the threat of strikes by other guilds.
Prior to a potential Directors Guild of America strike, Bay began creating animatics of action sequences featuring characters rejected for the film.
Screenwriters Roberto Orci and Alex Kurtzman, who had written the first film, originally passed on the opportunity to write a sequel due to schedule conflicts.
Orci described the film's theme as "being away from home", with the Autobots contemplating living on Earth as they cannot restore Cybertron, while Sam goes to college.
In September 2007, Paramount announced a late June 2009 release date for the sequel to Transformers. Prior to the first film's release, producer Tom DeSanto had "a very cool idea" to introduce the Dinobots, [26] while Bay was interested in including an aircraft carrier, which was dropped from the film.
Movie-wise, I mean. Once the general audience is fully on board with the whole thing, maybe Dinobots in the future. During production, Bay attempted to create a misinformation campaign to increase debate over what Transformers would be appearing in the film, as well as to try to throw fans off from the story of the film; however, Orci confessed it was generally unsuccessful.
The majority of interior scenes for the film were shot in the former Hughes Aircraft soundstages at Playa Vista; other scenes were shot at the Smithsonian's Steven F. Udvar-Hazy Center. One shot that was filmed at the University of Pennsylvania was the party scene, filmed at what students call "The Castle".
However, neither the University of Pennsylvania nor Princeton gave Bay permission to be named in the film because of a scene that both institutions felt "did not represent the school" in which Sam's mother ingests marijuana-laced brownies.
The two locations were used for Qatar in Transformers and stood in for Egypt in this film. The first unit then shot for three days in Egypt at the Giza pyramid complex and Luxor.
The shoot was highly secretive, but according to producer Lorenzo di Bonaventura, a crew of Americans and "several dozen local Egyptians" ensured a "remarkably smooth" shoot.
On this film, the final battle in Egypt was devised to make it easier to follow the action. Hasbro became more involved in the designs of the robots than the company was for Transformers.
Scott Farrar returned as visual effects supervisor and anticipated moodier use of lighting as well as deeper roles for the Decepticons.
Peter Cullen recalled, "Don Murphy mentioned to me, 'Only because of the tremendous expense to animate Optimus Prime, he'll be in just a certain amount of [Transformers].'"
Originally set to be a five-part series entitled Destiny, [79] it was split into two simultaneously published series, titled Alliance and Defiance.
After the film, and serving as a bridge between the two films, Alan Dean Foster wrote Transformers: The Veiled Threat, [82] originally titled Infiltration.
During the writing, Foster collaborated with IDW to make sure their stories did not contradict each other.
On June 10, , the comic book adaptation of the film, written by Simon Furman was released. Revenge of the Fallen premiered on June 8, in Tokyo, Japan.
Linkin Park held a special show after the premiere at the Fox Theater, Westwood Village on June 22, during which they performed " New Divide " live for the first time.
The IMAX release featured additional scenes of extended robot fighting sequences, which were not seen in the regular theater version.
Both two-disc editions are the first to include Paramount's Augmented Reality feature, which allows the user to handle a 3-D model of Optimus Prime on a computer by moving the package in front of a webcam.
The Blu-ray version had the best first-week sales of the year. On review aggregator Rotten Tomatoes, the critical consensus reads, "Transformers: Revenge of the Fallen is a noisy, underplotted, and overlong special effects extravaganza that lacks a human touch."
According to The Washington Post, Revenge of the Fallen was Bay's worst-reviewed film at the time of release, faring even worse than Pearl Harbor. One reviewer wrote, "It's easy to walk away feeling like you've spent 2 hours in the mad, wild, hydraulic embrace of a car compactor".
Roger Ebert, who had given the first film three stars, gave the sequel only one. Later in his review, Ebert discouraged movie-goers from seeing the film by saying, "If you want to save yourself the ticket price, go into the kitchen, cue up a male choir singing the music of hell, and get a kid to start banging pots and pans together.
Then close your eyes and use your imagination. It will be seen, in retrospect, as marking the end of an era. Rolling Stone critic Peter Travers did not give the film any stars, considering that "Revenge of the Fallen has a shot at the title 'Worst Movie of the Decade'".
Other reviewers, while still critical, were less damning of the film. The A.V. Club gave the film a "C-", complaining about the writing and length, but mentioning that the effects and action scenes were impressive.
And many other words beginning with B, including boneheadedly brilliant. But it knows how to feed your inner year-old's appetite for destruction.
There was considerable negative reaction to the characters Mudflap and Skids, whom some perceived as embodying racist stereotypes.
I think that would be very foolish, and if someone wants to be offended by it, it's their right. We were very surprised when we saw it, too, and it's a choice that was made.
If anything, it just shows you that we don't control every aspect of the movie. Instead of using IMAX for complete unbroken sequences, similar to director Christopher Nolan's approach for The Dark Knight, Bay chose to use the format primarily on a shot-by-shot basis, combining conventional 35mm footage and IMAX shots in the same sequence.
That approach, combined with rapid cutting, created a jarring, highly unpleasant experience for most moviegoers. Actor Shia LaBeouf was unimpressed with the film, stating "We got lost.
We tried to get bigger. It's what happens to sequels. It's like, how do you top the first one? You've got to go bigger.
Michael Bay went so big that it became too big, and I think you lost the anchor of the movie You lost a bit of the relationship.
Unless you have those relationships, then the movie doesn't matter.