Fox Renderfarm Blog
Revealing the Techniques Behind 'Jibaro' from "Love, Death & Robots", Whose Storyboard Took Two Years to Draw
Behind The Scenes
As a leading cloud rendering services provider and CPU & GPU render farm, Fox Renderfarm will share with you the techniques behind Jibaro.

It's been more than 10 days since the third season of Netflix's "Love, Death & Robots" was launched, but the lingering effects of those brilliant episodes are still reverberating in the hearts of viewers. In particular, the ninth episode, "Jibaro", has drawn cries from audiences with its surreal visuals and its story of love, death, blood and greed! But just watching the episode may not be enough to satisfy your curiosity. That's why we've prepared the most comprehensive behind-the-scenes breakdown of Jibaro for you! This article combines information and interviews with Alberto Mielgo, director of Jibaro; Remi Comtois and Jacob Gardner, animation directors; and Agora Studios.

How many people were in the core team? How long did it take in total?

Director Alberto Mielgo

Is the banshee's dance mocap or keyframe animation?
Is the water real or CG?
How was the collision of the metal decorations made?
What difficulties did the team encounter?
What was the creator's favourite part?

Everything you want to know is here!

From 'The Witness' and 'The Windshield Wiper' to 'Jibaro'

Many fans will already be familiar with the talented director Alberto from 'The Witness' in the first season of 'Love, Death & Robots'. The Witness tells the story of a dancer who unwittingly witnesses a murder in a hotel across the street and is chased by the killer. And this year, already the winner of four Emmy Awards and two Annie Awards, he took home the 94th Academy Award for Best Animated Short Film for his work The Windshield Wiper!

While some people will instantly recognise Alberto's highly personal style, the subject matter of Jibaro is not quite the same as the two short films above. From cityscapes to an enchanting siren, a vast and vibrant forest, and knights in exquisite armour, Alberto says of this: "I always thought of doing something with
nature as a backdrop. The visuals of the natural world, such as how the light hits the forest, and the incredible sounds of nature... I wanted to pay homage to incredible nature through this work. Of course, it's also very challenging, because nature is so rich that you can lose yourself in it."

What is Jibaro?

Jibaro means 'people of the forested mountains of Puerto Rico', and the character is based on the Taíno, representative of the indigenous peoples of Latin America. Columbus reached the island on his second voyage and named it Puerto Rico, and Spain ruled it for a long time afterwards. To plunder precious metals, slaves and cash crops, the colonists carried out bloody massacres and enslaved the indigenous people. In 1898, when the Spanish-American War broke out, the United States occupied Puerto Rico, which later became a self-governing U.S. territory in the Caribbean.

Giant stone sculpture of a Taíno face in Puerto Rico. /Shutterstock@eddtoro

This is probably why audiences felt that the director wasn't just showing a fantastical gender relationship in Jibaro, but was alluding to colonial history.

The story

In terms of storytelling, Alberto made "hearing, love and death" the core of Jibaro. The story opens with the hero picking up a piece of gold from the river, which awakens a banshee in the water who causes the knights to kill each other with her beautiful dance and song. The male protagonist, who escapes death because he is deaf, attracts the attention of the banshee, and tension rises between them, while a relationship of love, greed and lust ends before it even begins.

Jibaro may appear to be a fantasy story about love, but it is almost a world away from director Alberto's previous film, The Windshield Wiper, in terms of content. He explains that Jibaro describes a relationship closer to that between two predators of carnal desire.
It is as if two people are trying to communicate, but they are not listening to each other, and thus become attracted to each other through misperception.

I don't know if you experienced this while watching: when the banshee and the knight kiss in the middle of the waterfall, you think it's the start of a forbidden love affair. And then, when the male lead breaks the rules, kills the banshee and takes all the gold, I wonder how many viewers' jaws dropped.

In response, Alberto said: "Not all stories should be about saving the world, or the hero's journey. In my conception, it is not necessary or to be taken for granted that heroes grow; the real situation is often the opposite of what is expected. They may turn out to be worse people, not learn any lessons, or even lose everything.

Two people falling in love for the wrong reasons is something that happens all the time in today's society. We choose each other for the wrong reasons, and this causes us all to suffer in the end. Some people are lucky enough to find their partner at a very young age, but for most of us it is very difficult to find a partner for the rest of our lives. I love discussing this topic; it's something very personal. When you talk about that kind of relationship, you need to make the audience understand it and make them feel like it's really happening.

And in Jibaro, neither of the two main characters is a hero. In fact they are both so bad that sometimes the audience will love one and hate the other. For example, at first you think the banshee is a monster, but then you see what happens to her and you are flooded with sympathy.
I like this state of affairs where the audience doesn't know who is the good guy and who is the bad guy; it makes them feel more strongly when they watch the show."

Storyboard

When you see the banshee's beautiful dance moves and the surreal imagery, you might call the production team brilliant, but the basis of all this is director Alberto's clear and detailed storyboards. According to interviews with Remi Comtois and Jacob Gardner, the animation directors on the team behind Jibaro, the director did all the concept design and the storyboard alone, which took around two years. It's hard to believe that the director drew all the drafts himself on paper, scanned them into After Effects, coloured them and placed them in the corresponding 3D scenes. The drafts almost filled Alberto's entire office!

Concept Design

Once in animation, Alberto's designs went through a number of iterations. The first was the armour design. The storyboard the director gave the team did not include armour, so the animators had to find references online or go to museums to observe the textures of different armours and think about their shape and movement. Most of the characters, including the two main characters, were completely redesigned, and each design change meant that the modeling, rigging, animation and other steps needed significant adjustment, which is why the facial expressions were added later. (P.S.
It's hard to imagine that all of Jibaro's facial animation was done in a fortnight, and a month in total counting the later additions of details and fine-tuning.)

One artist, after viewing Jibaro, analysed the aesthetics of its art:

Character style: Klimt. Klimt had an absolute obsession with gold and would put gold leaf, shells and feathers on his paintings, while being influenced by symbolism, Egyptian mythology and Japanese ukiyo-e.

Environment style: Pre-Raphaelite. The deep, bottomless lake in the animation resembles the dark waters depicted by the Pre-Raphaelites.

Story setting: the Sirens. This is certainly the association people make most readily. In ancient Greek mythology, the Sirens had heavenly singing voices and used them to seduce passing sailors and sink their ships.

Costume and dance: Salome. The banshee not only sings but also dances, more like a combination of a Siren and Salome, and the costume in Moreau's Salome is very close to the character in the animation, although Moreau's style of drawing Salome is closer to Indian religious art.

Expression: Schiele. The vacant, unfocused, manic, frenzied, neurotic, highly expressionistic expressions and gazes are almost identical to those in Schiele's work.

Plus the medieval knight look, and echoes of "In a Bamboo Grove" by Akutagawa Ryunosuke.

Animation Techniques: 3D + 2D

Given the ultra-realistic and beautiful scenes of Jibaro, most people might overlook some of the details on first viewing, such as the fact that certain scenes with more obvious paint strokes are actually 2D. This mix-and-match approach to creation was something Alberto had in mind from the start; he believes he simply chose whatever was closest to his inner feeling. "In fact, we finished modeling an entire forest. And the look of the characters is very simplified. The reason they look realistic is actually that the lighting and rendering play the right role, but in fact they are much simpler than a true hyper-realistic 3D render.
I tend to remove details that we don't need, and this creates a better, more enjoyable visual experience for the eye."

You can see that the character has no nostrils, and the beard and eyebrows are just simple lines.

On the modeling and animation side, animation director Remi Comtois had a lot to say. There was a lot of pressure on the team members to deliver the exact picture the director had in mind. And the animators didn't actually set up any shots before the main cut, which meant that most of the animation had to hold up in a full 360-degree view. They would usually finish one side, then go around to the back, and finally check the coordination of the whole movement from all angles. Of course, the team worked out a few tricks to reduce the pressure. For example, they would guess which angle the director would set the camera at based on the storyboard, so that when close-ups came up, not too much effort was spent on the lower half of the dress and the legs.

Extremely distinctive camera movements

As the protagonist of Jibaro is deaf, the director set up the camera to emphasize this unsettling quality, with the camera moving violently and drifting out of focus, as if to give the audience a first-hand experience of the story. "I do like a camera that's alive. I like to make the audience feel like you're really there and you're holding the camera yourself. Also, in terms of the action, I really like using the camera to simulate the feeling of exhaustion. Sometimes it doesn't even have a second to focus properly on a frame. It's like when you're in some kind of action or battle in real life: it's hard to concentrate.
Those are the things I wanted to include in the film, because you can convey that sense of stress to the audience."

Mocap or Keyframe Animation?

This should be the most important question for everyone who knows about 3D animation, and here we can give you a definite answer: all the character animation in Jibaro is keyframe animation! It's not that the team didn't try mocap, but for one thing it didn't fit the artistry the director was after, and for another, the cost of motion capture was enormous given the repeated design changes, so they eventually returned to keyframe animation.

But the director shot a number of references himself so that the animators could replicate the exact image he had in mind. For example, the scene where the cavalrymen kill each other and fall into the lake was referenced from footage Alberto filmed himself in a large swimming pool. And then there's the part everyone is most curious about: the banshee's dance. Alberto had a professional dance team choreograph it, and after a two-week rehearsal period, spent another week filming the entire section in real time from a full 360-degree view as reference.

Sara Silkin, the dancer who played the banshee, thanked the director for the "unbelievable and amazing experience" on her own Instagram after the episode aired.

Dancer Sara Silkin

"My voice. My pain. My sadness. My screams. My cries. I am the Bloody Golden Woman, and I put my heart into the final scene's performance. My interpretation of Jibaro has always been from her perspective, as I relate to her: someone who destroys and is destroyed by the one she loves. The tremors in the hands in the final performance represent her shame and terror. I kept repeating to myself while acting the final scene: 'Why would you do this to me? How could you do this to me?' Alberto directed me with intense passion and brought out the darkness that lives in my soul.
Thank you for not only creating a visual masterpiece but building a beautiful world highlighting the cruelty that can exist within it."

Liquid and Metal Collisions

There is a running joke in the animation industry: you can measure an animator's ability by how he draws water. The saying is just as true in 3D animation. Realistic liquid simulation is always a major headache for the team behind the scenes, and the magnificent waterfalls and intense underwater scenes in Jibaro almost deceive the eye! Is that really not live-action footage? Answer: it is not!

Director Alberto explains: "Technically, the film is very complex, and it was very difficult to simulate the clash of armor and jewelry, the splash of water, and the effect of them all moving and colliding in the water. When the team first heard it, they all said, 'Oh no! Oh my God! Can we simplify it?' But I think that's what we were trying to achieve. It was the one thing that couldn't be simplified. I wanted to push the technology, push the visuals, push the artwork to the limit! And we ended up doing that, which is great. It's definitely not easy!"

Agora Studios, the team that worked with Alberto, also revealed how the metal accessories in the film were made: "When we did the pre-production previews, we realised that the soldier armour Alberto wanted was not impossible to achieve, but would be difficult to realise for that amount of mass movement. So we suggested adjusting the model accordingly while respecting the original design; luckily the director approved this suggestion and handed it down to the character team to implement. Good synergy between the teams was essential in pre-production.

It was a huge challenge for our rigger Alex Mann to automate all the metal parts as much as possible. We animated these individually and then automated some of the secondary movements with Ragdoll Dynamics.
Finally, we also spent a long time cleaning up intersecting meshes. Done properly, no one notices it, but it was a job that demanded a lot of passion, dedication, time and effort from the team members."

*Ragdoll Dynamics is a real-time physics solver for Maya. It allows animators to use real-time physics on character rigs: whether it's mechanical components, fabric, hair or muscles, anything in the Maya viewport can be given realistic secondary motion.

Ragdoll Dynamics demo

What were the team's favourite and most difficult parts?

The two animation directors gave the same answer to this question. Their favourite part of the film is the banshee's final dance and scream at the end. Remi Comtois, who was in charge of this section, explained that this kind of performance, which seems to use every ounce of strength, trembling and screaming, requires a great deal of detail, and that is exactly the feeling the director was after. Jacob Gardner, on the other hand, said the hardest part he was responsible for was the cheek-to-cheek dance at the waterfall, as the character designs and models were tweaked several times throughout, so getting the shapes, distances and rhythm right became a challenge. "Their intertwined hands, the shape of the armor, the shape of their arms, the change in body distance as they dance...
There were so many details to imagine."

What is it like to work with Alberto?

In an interview with animation directors Remi Comtois and Jacob Gardner, the host asked: was there a particular shot that consistently failed to meet the director's standards and made him go ballistic? The answer the two directors gave offers a real insight into how a core team of 22 people was able to complete such a crazy project in eight months. Not once, they replied.

Top (left): Remi Comtois, Director of Animation. Bottom: Jacob Gardner, Director of Animation

Alberto respects every creator's opinion. He is happy to explore all the possibilities and to see everyone's spark, and everyone gets confidence and praise from him. And as an animator himself, he knows exactly what difficulties to expect in a project and how to overcome them. On one occasion Alberto pointed out a part of the project that a team member had forgotten, and everyone was thinking: what can we do about this? But he calmly reassured them and remedied the situation in time, increasing their trust in each other.

Alberto did all the preliminary work on this project alone and had everything in his head. Even though there were many iterations along the way, as soon as he suggested a change, everyone was willing to trust that his idea would make the whole project better rather than waste effort, so everyone gave their all. Also, Alberto knew exactly what he wanted, so every instruction was quite clear, which was one of the reasons the team was able to finish on time.

There may be many talented people in the world, but only Alberto, crazy and respectful enough, could have made Jibaro.

Fox Renderfarm hopes this will be of some help to you. As you know, Fox Renderfarm is an excellent cloud render farm in the CG world, so if you need a render farm, why not try Fox Renderfarm, which is offering a free $25 trial for new users who sign up?
Thanks for reading!

From wuhuxiaojingling
Virtual Production: Past, Present, and Future (2)
Behind The Scenes
As the leading cloud rendering services provider, Fox Renderfarm continues to share virtual production with you.

Virtual Production: Past, Present, and Future (1)

The Birth of Virtual Production

In terms of timing and popularity of application, virtual production was adopted in TV earlier than in film. In 1978, Eugene L. proposed the concept of the "Electronic Studio Setting", pointing out that future programming could be completed in an empty studio with only personnel and cameras, with the sets and props generated automatically by electronic systems.

Virtual studio technology widely used by current mainstream TV media

After 1992, virtual studio technology was realized. As a new technology, the virtual studio became a hotspot in TV studio engineering, and at the IBC exhibition in 1994, virtual studio technology made its debut in various TV broadcasts.

The virtual background can change with the lens in the virtual studio

The Virtual Studio System (VSS) is a new TV program production system that has emerged in recent years with the rapid development of computer technology and chroma keying.

An F1 program presents a demonstration through virtual scenes and props

In a VSS, the position and parameters of the camera are transmitted to the virtual system in real time. The green screen (or blue screen) is removed by chroma keying and replaced with a pre-made virtual three-dimensional model. The host or actor in the picture is then merged with the three-dimensional virtual scene into a new picture, and finally the composited video can be displayed on TV in real time.

Using virtual studio technology for a remote virtual dialogue

The virtual scene does not actually exist in reality, but the technology that seamlessly integrates it with the live-shot characters makes the virtual studio a reality.

The world's first reality show using virtual production

Now please follow the best CPU & GPU render farm to our next part: Virtual Production: Past, Present, and Future (3).
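The chroma-keying step described above boils down to a per-pixel decision: where the green channel dominates, substitute the virtual background; elsewhere, keep the live-action foreground. A toy pure-Python sketch of the idea follows. The threshold and pixel values are invented for illustration; real keyers also handle soft edges, spill suppression and motion blur.

```python
# Toy chroma-key composite: pixels whose green channel dominates the
# other channels are treated as screen and replaced by the background.
# A simplified illustration of the principle, not broadcast-grade keying.

def chroma_key(fg, bg, threshold=60):
    """fg, bg: rows of (r, g, b) tuples. Returns the composited rows."""
    out = []
    for fg_row, bg_row in zip(fg, bg):
        row = []
        for (r, g, b), bg_px in zip(fg_row, bg_row):
            # How much green exceeds the stronger of the other channels.
            dominance = g - max(r, b)
            row.append(bg_px if dominance > threshold else (r, g, b))
        out.append(row)
    return out

foreground = [[(20, 240, 30), (200, 180, 170)]]   # screen pixel, skin-tone pixel
background = [[(0, 0, 255), (0, 0, 255)]]         # virtual set (flat blue)
print(chroma_key(foreground, background))
```

In a real virtual studio the same decision runs in hardware at frame rate, with the tracked camera parameters driving the render of the background before the composite.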
3D Tutorials: How to Make Dogs in Togo (1)
Behind The Scenes
As the leading cloud rendering services provider and fast render farm, Fox Renderfarm will share with you how the dogs in Togo were made.

The visual effects of Togo were produced by DNEG, Lola Visual Effects, and Soho VFX. The most impressive things in the whole film are the loyal, brave sled dogs and the local weather. DNEG began production after shooting wrapped in October 2018; the cycle lasted about one year, and a total of 872 artists from 4 different branch teams completed 778 VFX shots. Next, we will introduce the parts made by DNEG (the CG dogs and the digital environments) in three parts.

3D scanning of sled dogs

The husky sled dogs we see in the film are not necessarily real: some are full CG, and some are real dogs with a CG head replacement. The CG sled dogs needed to match the real sled dogs on set, which involved 3D scanning real sled dogs, collecting reference material, redesigning the hair tools, building muscles and bones, and reconstructing the snarling look of the sled dogs.

Live shot

1. Testing

DNEG had built a rocket raccoon with 800,000 hairs for "Avengers: Endgame", and had tested wolves in previous projects, but it had never done a dog, nor had it dealt with fur/hair production on this scale in a film. For the Togo project, they updated their internal hair tool, Furball. Through development and optimization work, they first pushed the hair count on a single dog as high as possible, and then met the demand of 11 dogs' worth of hair in a single shot. Beyond a realistic look, they also needed to simulate the state of hair interacting with water, ice and snow, and its rendering.

2. 3D scanning of sled dogs

During the shoot, DNEG performed two photogrammetric scans of 30-40 sled dogs in the studio using Clear Angle, one in summer coats and one in winter coats.
The dog handler brings a single sled dog into the studio, first lets it familiarize itself with the environment, and then takes it to the designated position. The scanning process must be completed in one go; if the dog is scared away, there is no second chance. Next to the Clear Angle shed there is a separate shed equipped with an animation reference camera, which can capture the detailed dynamics and characteristics of the sled dogs outside of the actual shooting and provide reference material for creating the CG character rigs and assets.

3. From data to sled dogs

A 3D scan of a sled dog yields feet, legs, head and rough body-volume data, but no fur/hair data. Without hair data, it is impossible to work out the muscle mass and the tissue under the fur. The solution DNEG adopted was to manually measure the fur on the sled dog's neck, back, tail and other specific locations with a measuring tape in the small shed mentioned above. However, this approach is based more on an ideal, anatomy-textbook dog than on pure data, so some specific details of the individual sled dogs had to be added on top of it.

Real shot

Final shot

The final build customizes the muscular and skeletal systems: Ziva Dynamics for the muscle and fat systems, Maya's nCloth for the skin, the fur tool Furball for grooming, and Houdini Vellum for hair-follicle dynamics. The production team found that a lot of fur movement actually comes from fat. The fur itself is quite stiff and does not twist left and right; instead, the fat and muscle move around underneath, creating the impression that the fur is moving.
Of course, they also needed to simulate the effect of the protective gear pulling on the fur once it was put on the sled dogs.

Real shot

Final shot

Now follow the TPN-accredited render farm and cloud rendering services provider, Fox Renderfarm, to the next part: 3D Tutorials: How to Make Dogs in Togo (2).
3D Tutorials: How to Make Dogs in Togo (2)
VFX
As the leading cloud rendering services provider and 3D render farm, Fox Renderfarm will continue sharing with you how the dogs in Togo were made.

3D Tutorials: How to Make Dogs in Togo (1)

The dogFACS System for Sled Dog Facial Capture

To produce the final version of the sled dog animation, DNEG also explored motion capture for dogs and spent time developing a Facial Action Coding System (FACS) specifically for the sled dog face.

1. Research and development of sled dog motion capture

In the early stages of production, DNEG used well-trained sled dogs for motion capture at Animatrix Studios. We usually see actors wearing professional suits for motion capture; this time the same process was applied to sled dogs. How to fit the equipment and how to place tracking markers on the fur was technically very challenging. Although this motion capture data was not used in the final production, it was still a good learning experience for the team.

2. Research and development: the dogFACS system

To gain a detailed understanding of sled dog facial muscles and to clarify the direction of the facial rigging work as early as possible, DNEG started dogFACS research very early. Since there is a FACS for human faces, there can be one for sled dogs too. The researchers categorized all the sled dog expressions and drew facial expressions based on the dogFACS system, including "mouth raised" and "wrinkled nose". A growl, for example, is the combination of three actions: "upper lip lift", "wrinkled nose" and "wrinkled lips".
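The compound expressions described above (a growl as the sum of several action units) can be sketched as a weighted linear blend of per-control-point offsets. The action-unit names follow the article; the control points and offset values below are invented for illustration, since DNEG's actual rig is not public.

```python
# Hedged sketch of linear FACS-style expression mixing: each action unit
# stores per-control-point 2D offsets, and a pose is a weighted sum.
# Control points and offsets are made-up numbers, not DNEG's data.

ACTION_UNITS = {
    "upper_lip_lift": {"lip_upper": (0.0, 0.8), "nose_tip": (0.0, 0.1)},
    "wrinkle_nose":   {"nose_tip": (0.0, 0.4)},
    "wrinkle_lips":   {"lip_upper": (0.2, 0.0), "lip_lower": (-0.2, 0.0)},
}

def mix_expression(weights):
    """Weighted linear blend of action-unit offsets, per control point."""
    pose = {}
    for unit, w in weights.items():
        for point, (dx, dy) in ACTION_UNITS[unit].items():
            px, py = pose.get(point, (0.0, 0.0))
            pose[point] = (px + w * dx, py + w * dy)
    return pose

# A growl, per the article: all three action units fully engaged.
growl = mix_expression({"upper_lip_lift": 1.0,
                        "wrinkle_nose": 1.0,
                        "wrinkle_lips": 1.0})
print(growl["nose_tip"], growl["lip_upper"])
```

Because the blend is linear, an animator can dial any unit partially on, or drive one negative to counteract another, which is exactly the kind of control scheme the next paragraph describes.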
Facial control schemes based on these expressions let animators activate or counteract individual expressions, creating more detailed and believable animation.

DNEG's rigging department re-developed its entire quadruped rigging system to support a higher standard of realistic performance. This included a new front-leg module with a "pinned limb" function, so that animators could imitate how most animals plant the front leg when weight is on it, and a rebuilt spine setup to improve the realism and functionality of the animation. In general, the rigging work took a lot of effort: it required working closely with the animation department to develop and improve the quadruped rigging standard, and constantly drawing inspiration and direction from research references.

3. Roaring time

There is a scene where a teammate named Ilsa in the sled team snarls at Togo, and Togo immediately subdues her. Note that this seemingly real performance is actually a CG shot. In the actual shoot, the dog handler used a "snarling device", which uses a rubber band and a prosthesis to hold the sled dog's mouth open. It is said that the sled dog playing Ilsa was very docile and cooperated with the whole process, but her eyes and constantly wagging tail revealed an extremely happy state; she didn't look angry at all.

Real shot material

The effect after replacing the CG head

The production team decided to solve it with a real shot of the body plus a CG head, and designed some concept images of the sled dogs looking angry and fierce. This process was very challenging for the riggers, modelers and animators, who needed to capture the subtle facial animation of sled dogs accurately. Ilsa's head was built in even more detail and precision than Togo's, since the shot focuses mainly on Ilsa's performance.

Real shot material

Final shot

The most useful part of the dogFACS system was this close-up shot of Ilsa's snarl.
FACS expressions and their blends are linear, while a sled dog's mouth and nose can move freely, so to get authentic micro-motion the production team added many small details around her nose. It is said that when the snarling-scene animation test was shown for the first time, played side by side with the real footage, most viewers could not tell which was fake.

Real shot material

Final shot

4. Sled dogs slipping on ice

There is a sequence in the film where the sled dog team braves the snow and cold to cross an unstable frozen lake under Togo's leadership. On the return trip the ice is breaking apart, and the sled dogs keep moving forward quickly while their feet slip. The DNEG team gathered online reference of sled dogs slipping or falling to capture the unstable feeling of sled dogs standing on ice. This material was cut into the animation rough using Time Editor, building up a suitable effect for the overall shot like building blocks, and also helped them determine the sled dogs' performance in specific shots. In terms of animation quality and detail, this method amounts to fairly advanced blocking.

5. Animating multiple sled dogs

Beyond the ice slipping mentioned above, there are 11 sled dogs in Togo, and the behavior of sled dogs in a team differs from individual performance. For this reason, DNEG developed a rig system to check the distance between the sled dogs in the team: if the distance is too far, that part of the rig is displayed in red.
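The spacing check described above is, at its core, a distance test between adjacent dogs along the team. A minimal sketch of the idea follows; the positions, threshold and return format are invented for illustration, as DNEG's actual rig is not public.

```python
# Hedged sketch of the team-spacing check: flag any adjacent pair of
# sled dogs whose gap exceeds a chosen maximum (shown "in red" in-rig).
# All values here are invented for illustration.
import math

MAX_GAP = 2.5  # assumed maximum allowed spacing between adjacent dogs

def flag_spacing(positions, max_gap=MAX_GAP):
    """Return (pair_index, gap, ok) for each adjacent pair along the team."""
    report = []
    for i in range(len(positions) - 1):
        (x1, y1), (x2, y2) = positions[i], positions[i + 1]
        gap = math.hypot(x2 - x1, y2 - y1)
        report.append((i, round(gap, 2), gap <= max_gap))
    return report

# Three dogs in a line; the third lags too far behind its neighbour.
team = [(0.0, 0.0), (2.0, 0.0), (6.0, 0.5)]
print(flag_spacing(team))
```

In a production rig the same test would drive a viewport colour on the offending segment rather than return a list, but the tolerance-band idea is the same.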
Although this system is not super accurate, it can still give the team an approximate range. The production team did some layout development work when processing the animation, setting up many different cycles at different speeds for the sled dogs' actions and feeding them into the rig system, so that layout effects could be switched and loaded freely; this is much more convenient than animating only after rigging. The layout department could set up the sled animation from the selected sled dog movement cycles and speeds, maintaining efficiency while achieving as much realism as possible.

Now follow the TPN-accredited render farm and best cloud rendering services provider, Fox Renderfarm, to the next part: 3D Tutorials: How to Make Dogs in Togo (3).
3D Tutorials: How to Make Dogs in Togo (3)
VFX
As the leading cloud rendering services provider and 3D render farm, Fox Renderfarm will continue sharing with you how the dogs in Togo were made.

3D Tutorials: How to Make Dogs in Togo (1)
3D Tutorials: How to Make Dogs in Togo (2)

The digital environments in the film

Everything above concerns the production of the CG sled dogs. There are also many natural environments in the film: some are fully synthetic digital landscapes, and some are real shots in Alberta, Canada with enhanced effects. The two main scenes are Little McKinley Mountain, shot at Fortress Mountain, Alberta, and frozen Norton Bay, shot at Abraham Lake, Alberta.

Real shot material

1. Shooting across a frozen lake in Alberta, relying on a lot of roto

The reason Abraham Lake in Alberta was chosen as the location for the unstable frozen lake is that the blue ice there is very clean and clear, and the area is large. Unfortunately the weather on the day of the shoot did not cooperate, and two days later the lake was covered with snow; luckily, the production team took a lot of reference photos within those two days.

Nearly 95% of the shots involve the sled dogs and their owner, Leonhard Seppala. To facilitate production, the entire sled team, the sled, the protective gear, fur, clothes and Leonhard Seppala's hair needed to be rotoed separately. And because the cold atmosphere around the dogs at the back of the team is stronger than around those in front, the roto had to be created in layers.

A real shot of sled dogs, with the environment replaced and the atmosphere enhanced

Under normal circumstances it is sufficient to project the roto layers along the character's or vehicle's trajectory and then composite on top, but here the hair details were very complicated: the roto had to be projected onto cards, with a sled system built as the card rig, and then each layer of particle FX, snow, atmosphere and so on added.
In addition, the snow and ice kicked up by the sled and the dogs moving forward were simulated in Houdini and supplemented with some 2D elements.
Real shot
Final shot
2. Cracking ice
The ice-breaking shots were very complicated to design. The live plates were shot on a flat ice surface, so shots where slabs of ice flip over could only be done in full CG. For the non-CG shots, the team deliberately roughened the camera track to make the ice surface feel more active, within limits that would not break the parallax: 2D techniques were used to lock the fixed camera onto a floating ice block, or to move the camera over the ice plate and add a 2D float, so the image reads as less stable.
Moreover, the plates were shot on flat ice under light that changed continuously through the day, so every version of the footage looked different. Keeping these plates consistent, and art-directing the ice as it fragments piece by piece, was a long and complicated process.
3. Simulating huge ice blocks
In the sequence where the sled team crosses the frozen lake, huge blocks of ice gradually rise as the surface breaks apart. The production team used procedural methods as much as possible to drive the shapes of the blocks: with so many of them, there was no way to rig each piece individually, let alone sculpt, texture, and look-dev every one. A manual approach would also have locked in each block's shape and size; any change would mean going back a step and rebuilding it.
So they created a new cascade system that lets the Layout and Environment departments build the huge-ice layouts on a shot-by-shot basis. The Layout department created very basic proxy shapes in Maya: an artist drew the outline of a huge ice block with Maya's curve tool, extruded it, placed it in the scene, added rig constraints, and set up a rough animation in the floating ocean. The Environment team built a very practical toolset that procedurally models the huge ice blocks from that basic geometry, generating broken-edge detail, internal bubbles, cracks, and faults in the ice layer. With the new tools, the team could essentially lay out shots during the day and farm-render overnight. If the size or shape of a block needs to change, the work goes back to Layout, the curve is redrawn, and the result is handed to the next stage. The FX stage also simulates the interaction between the huge blocks and the water, including details such as bubbles and splashes, and the final render is done in Clarisse.
4. The environments of the other mountains
In the film, Togo's growth is told through Seppala's memories, part of which takes place among the mountains. The production team shot a lot of very beautiful mountain footage and modified it from there. For example, when shooting at Fortress Mountain, Alberta, the director thought the environment was good but there were too many trees, so some cleanup was done in post.
For another example, in the shots of the settlers' houses there are no Alaskan mountains in the live plates, and they had to be added in post.
Real shot
Final shot
When designing the backgrounds, rather than painting a large number of digital matte paintings, the production team adopted a 3D approach combining high-resolution digital models, lidar scanning, and photogrammetry. In the end, the mountains seen close to camera were sculpted, textured, and taken through look-dev, and some of the trees, leaves, and rocks were rendered in Clarisse. In general, there was indeed a great deal of background work to get through.
That is Fox Renderfarm's look at how the dogs in Togo were made. Fox Renderfarm is an excellent CPU & GPU cloud render farm and 3D cloud rendering services provider, so if you need a render farm, why not try Fox Renderfarm, which offers a free $25 trial for new users.
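Looping back to the cascade system above: the Layout step that turns a drawn outline into a rough proxy ice block can be pictured as extruding a closed 2D curve into a slab. A self-contained sketch in plain Python; this illustrates the idea only and is not the studio's actual Maya toolset:

```python
def extrude_outline(outline, thickness):
    """Turn a closed 2D outline (a list of (x, y) points) into a simple
    slab mesh: a bottom ring at z=0, a top ring at z=thickness, and one
    side quad per outline edge. Returns (vertices, quads)."""
    n = len(outline)
    bottom = [(x, y, 0.0) for x, y in outline]
    top = [(x, y, thickness) for x, y in outline]
    verts = bottom + top
    # Quad i joins edge (i, i+1) on the bottom ring to the top ring.
    quads = [(i, (i + 1) % n, (i + 1) % n + n, i + n) for i in range(n)]
    return verts, quads

# A crude pentagonal "ice block" proxy, 2 units thick, standing in for
# a curve drawn by a layout artist. The coordinates are made up.
outline = [(0, 0), (4, -1), (6, 2), (3, 5), (-1, 3)]
verts, quads = extrude_outline(outline, 2.0)
```

Because the block is generated from the outline, redrawing the curve regenerates the proxy, which is exactly the change-and-hand-downstream workflow the cascade system enables.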
Virtual Production: Past, Present, and Future (3)
Behind The Scenes
As a fast and affordable cloud rendering services provider, Fox Renderfarm continues sharing about virtual production.
Virtual Production: Past, Present, and Future (1)
Virtual Production: Past, Present, and Future (2)
The Evolution of Virtual Production
Virtual production was applied in the industry early on, though with limited coverage. In the early 1990s, Hollywood used virtual production to make dynamic storyboards, which is what we now call previz.
The 360-degree frozen-moment ("bullet time") technique used in The Matrix
Titanic was released in the United States on December 19, 1997, followed by The Matrix in 1999, The Matrix Reloaded and The Matrix Revolutions in 2003, and Transformers in 2007 with its sequels. These films applied and promoted virtual production well, especially the use of motion capture and motion control.
Combination of motion capture and real-time rendering
Virtual production became well known to audiences after the release of Avatar in 2009. On December 16, 2009, Avatar and its Pandora gave audiences a beautiful experience and made virtual production familiar to the public. The director could see the composited scenes directly in the monitor instead of the traditional green screen, which was refreshing.
Avatar takes virtual production to the mainstream
When a whole movie is shot against a green or blue background, more than 2,000 shots need post-production (editing, VFX, sound, and more), and the staff must match the shots one by one against the script description.
Hence, demand has driven the development of the technology. After Avatar was completed, virtual production was consolidated into a commercial module.
The Outlook of Virtual Production
Although many technical problems still need to be solved, the direction of virtual production is established, and it will prosper along with the rising global box office.
The live-action version of The Lion King fully uses virtual production
The driving effect of Hollywood VFX films has led more and more directors and filmmakers to recognize this field and use the new technology in their own work.
A material library for virtual production will be needed in the near future. Cultural and historical monuments, natural scenery, and the like are already frequent visitors to the movie screen, yet every shoot must travel to the site at high cost, and the actors' schedules are difficult to coordinate.
Real-time composited scenes in the monitor
Virtual production can solve this problem and reduce production costs. For example, to shoot the Colosseum of ancient Rome with virtual production, the actors perform on a virtual stage, directly using the arena scene from the material library for cinematography.
Movies are an art of creation, and using photos as a background cannot meet the needs of cinematography. A three-dimensional model must be built with real textures, and lighting simulation, such as an early morning, dusk, or noon effect, can be performed as needed. The models then go into the library, are transferred to the virtual production system as the creative needs dictate, and let the camera flexibly choose whatever angle the script requires.
Building such a material library is a huge digital project: it must not only hold all kinds of assets but also support clear retrieval and loading, and it requires continuous investment and construction.
We can build a framework through dedicated groups, formulate technical standards, and incorporate conforming assets into the database. We can also commercialize it for a fee, putting the material library on a healthy development track so it can grow steadily.
Only green-screen props are needed to simulate interaction with the virtual scenes
In the future, more and more traditional studios will be transformed into virtual studios. Within the scope of human control, cinematography can be provided 24 hours a day and will not be delayed by any weather. A bit of common sense about virtual production: the sets built by a traditional studio are fixed, and swapping a physical set takes a day or several, whereas swapping a virtual scene takes only minutes. For the performance area, modules can be assembled outside the venue in advance.
The production of Disney's The Mandalorian brings virtual production into the LED era
An actor's schedule is very tight, and any accident, such as illness or injury, disrupts the normal shooting plan. Virtual production builds the scripted scenes in advance and can quickly call up different ones as plans change, coordinating the crew's process to make filming more humane and more efficient.
Virtual production has very good prospects. Its continuous improvement builds a proper platform for raising the quality of filming: creativity can be presented in a better way, bringing audiences a better viewing experience.
Fox Renderfarm hopes this is of some help to you. As you know, Fox Renderfarm is an excellent cloud render farm in the CG world, so if you need a render farm, why not try Fox Renderfarm, which offers a free $25 trial for new users? Thanks for reading!
Virtual Production: Past, Present, and Future (1)
Behind The Scenes
This sharing is collected by the TPN-accredited cloud render farm, Fox Renderfarm. We hope it helps you learn about film and TV production.
Virtual production is an emerging film and TV production system comprising a variety of computer-aided filmmaking methods designed to enhance innovation and save time. With the help of real-time software such as Unreal Engine, traditional linear processes can be transformed into parallel ones, blurring the boundaries between pre-production, production, and post-production and making the entire process more fluid and collaborative.
Virtual production on a real-time LED video wall
Since Avatar was released in 2009, virtual production has emerged as a new field from the digital revolution of the film industry, bringing a new perspective to viewers and professionals alike.
The virtual production of Avatar
The large-scale use of green-screen (or blue-screen) filmmaking has made it more and more important for directors to see the real-time composition of the performance and the virtual space. Because directors can give instant guidance on a live shoot through virtual production, the technology is gradually being applied to production.
Real-time composition in the green screen
The Definition of Virtual Production
To describe virtual production simply: when an actor performs in front of a green screen (or blue screen), the screen is replaced directly with a virtual scene made in advance, and the director's monitor then shows in real time a composited image that previously could only be seen late in post-production. This is called virtual production.
Virtual production sets and motion capture environments
The History of Virtual Production
The development of virtual production is inseparable from the progress of computer technology. In the 1940s and 1950s, computers were in the mainframe stage; in the 1960s and 1970s, they gradually shrank into minicomputers.
Owing to limitations in hardware, software, and the talent base, virtual production advanced slowly during this period.
The IBM PC launched by IBM
Microsoft was founded in 1975 and later developed the Windows operating system; Apple was established in 1976, using its own System x.xx/Mac OS operating systems; and in 1981, IBM introduced the IBM PC, which greatly simplified the hardware architecture and was oriented toward ordinary users. Because operating systems were still at the DOS stage at the time, adoption was still very slow.
Spielberg at the monitor, viewing the virtual production effects of Ready Player One
In 1982, SGI was founded in the United States; Jurassic Park, Titanic, Toy Story, The Lord of the Rings, and others all have a close connection with it.
The Lord of the Rings set & VFX compositing
In 1995, SGI acquired Alias and Wavefront, the merged company behind what became Maya. In 1991, Microsoft introduced the multi-language Windows 3.0 operating system, and a few years later Windows 95 was launched, realizing a true graphical operating interface. From there, the development of computers entered the fast lane.
Now please follow the best CPU & GPU cloud rendering farm to our next part: Virtual Production: Past, Present, and Future (2).
How does the studio directly produce the final shot? (2)
Behind The Scenes
The rising expressive power of real-time rendering
Jon Favreau's biggest concern on The Mandalorian was whether a game engine could reach the visual-effects rendering level of Star Wars on a TV budget. But when he saw a stretch of sand resembling Episode II, he abandoned that worry.
A sandy environment built from photo-scanned assets and rendered in real time
In the virtual studio, all scene elements and lights can be switched and adjusted freely while the feeling of reality is maintained. Part of The Mandalorian's virtual environment directly reuses assets Unreal Engine created for video games, saving a great deal of asset-building time and opening up the possibility of asset sharing between the two industries.
Real-time rendering can now convincingly present the highly reflective materials that science-fiction shows often involve
The virtual scenes on set sometimes deceive the eyes of the people standing there. Jon Favreau said: "There was a person who came to the studio and said, 'I thought you weren't going to build the whole set here.' I said, 'No, we didn't build it; in fact, there are only tables here.' Because the LED-wall rendering is driven by the camera, there is parallax. But even if you glance at it casually from beside the camera, you will still think you are watching a live-action scene."
Entering the virtual studio, the actor enters the virtual world and the final scene
The changes in Hollywood led by virtual production technology
Besides shortening the cycle and improving budget turnover, virtual production has brought revolutionary convenience to actors and other departments. Actors can see the environment in real time as they perform and interact. "This not only helps the cinematography but also helps the actors understand the surroundings, such as the horizon.
It also provides interactive lighting," and Jon Favreau described it as "a huge breakthrough."
Virtual production can also help fantasy characters interact with real people
Besides The Mandalorian, movie projects such as The Lion King and The Jungle Book have begun using game engines in production, though still mainly at the visual-previz stage. In addition, Steven Spielberg and Denis Villeneuve used Unity to help achieve their visual effects on Ready Player One and Blade Runner 2049 respectively. This method is gradually replacing the usual storyboard process that resembles hand-drawn comic books.
Spielberg at the monitor, viewing the real-time composite of the footage
In virtual production, VR technology is rapidly becoming a viable option for large studios and production companies. Director Jon Favreau used a great deal of virtual reality technology in the recent live-action remake of The Lion King, essentially creating the movie's entire world in virtual reality.
The Lion King uses a lot of real-time virtual previz technology
The popularization of virtual production technology
In terms of budget, virtual production and LED display equipment are still relatively expensive at this stage.
Complete real-time compositing and film production at a lower cost
Jon Favreau indicated that virtual production is a major leap forward for the film industry, allowing creators to make creative decisions before a project starts rather than redoing the process during or after completion. Nowadays, more people can see what the shot will look like directly during cinematography. More people can contribute their own ideas and learn from each other, because they can see the final idea in advance. "In the past, when you went to a set and left the green screen, you no longer cared about it. But now, we have so many talented people and more than a hundred years of accumulated filmmaking experience.
Why should we give that up just because we change the shooting system? Let's continue to inherit the skills of film artists and develop tools that adapt to the times."
Virtual production technology will gradually become a routine part of film production in the future
Behind the Scenes: Spy House Production (3)
Behind The Scenes
Next, we will share the production process of two more materials.
Making the wood material
Pay attention to the following points when making wooden materials:
• When splitting the UVs, lay the shells out as squarely as possible to ensure the later texture images will not be distorted.
• Also consider the utilization of UV space, giving priority to the parts the camera can see; this improves the quality of the material.
• The texture of a wooden structure is simple to produce, but it is necessary to gather many references matching the situation of the scene. For example, what does a wooden structure look like after a long time in a damp underground environment, and how corroded would the wood be? To achieve that feeling, good references are very important.
Substance Painter wood texture production
• Import the baked normal map and prepared materials into Substance Painter. Before creating anything, make two fill layers with only the Color channel enabled: one puts the AO map in Color and is set to Multiply, which deepens the dark areas, and the other puts the Curvature map in Color and is set to Overlay, which brightens the edges. After these operations are complete, texture painting begins.
Points to note for the wood texture: consider the actual situation of the scene. Take the scene we made (the spy house) as an example. After a long time in a dark, humid environment, corrosion will be fairly serious, so the edges of the wood should not be particularly bright; the bright edges of wear marks should also be darkened. Then come the scratches and the wear in frequently used places, which need a little brightness. Finally, there is accumulated dust and staining in places that are rarely touched.
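The two fill layers above rely on standard blend-mode math: Multiply for the AO layer and Overlay for the curvature layer. A per-channel sketch in plain Python, with values in 0 to 1; this illustrates the formulas only:

```python
def multiply(base, blend):
    """Multiply blend: always darkens, which is why it suits
    an AO map used to deepen the shadowed areas."""
    return base * blend

def overlay(base, blend):
    """Overlay blend: darkens where the base is dark and brightens
    where it is light, which suits a curvature map used to pop edges."""
    if base < 0.5:
        return 2.0 * base * blend
    return 1.0 - 2.0 * (1.0 - base) * (1.0 - blend)
```

Because Multiply can only darken while Overlay pushes values away from the midtones, the AO layer safely deepens crevices and the curvature layer lifts edge highlights without flattening the rest of the texture.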
Once these steps are done, you can basically get a good, realistic texture.
Making the paper and book materials
When making single-page paper with no thickness, note the following:
• When splitting the UVs, lay the shells out as squarely as possible so the later textures will not be distorted.
• Paper texture is simple to produce: no high-poly model is needed, just bake the normal map from the low poly.
How to make paper in Substance Painter: import the baked normal map and prepared materials into Substance Painter. Create an (unfilled) layer, select Color for the material and put the image in Base Color, and set UV Wrap to None. In the F1 3D/2D mode, place the image material in the corresponding position.
For book texture production, note one thing from the start: the cover and the pages must be kept separate. If they are made as one piece, they will not be easy to separate when texturing.
Marmoset Toolbag 3 rendering
Finally, the model is imported into Marmoset Toolbag 3 for rendering. The first task is laying out the light sources: add an HDR suitable for the scene, and adjust the child light to a position that makes the model cast a 45-degree shadow. Because the scene is larger, more lights are added, but when rendering, pay attention to the size and illumination range of the fill lights to prevent the lights from interfering with one another.
We hope our learning and sharing can help you. Thank you.
Behind the Scenes: Spy House Production (2)
Behind The Scenes
Material Production
After baking all the normal, AO, and curvature maps, import the model into Substance Painter to make the materials. It is worth noting that normals baked in Marmoset Toolbag 3 and Maya use the OpenGL convention in Substance Painter, while normals baked in 3ds Max use the DirectX convention. The first step after successfully importing the model must be to check whether the normal direction is correct. After a successful import, the remaining maps are baked from the texture set's baking panel.
Below, the sofa's leather is used to share the basic process of making a material.
Base color of the leather
Choose a suitable material map, paste it into the color channel, and adjust the other parameters appropriately to make the effect more natural.
Bump height
Create a new fill layer keeping only the height channel, paste in the height map matching the base-color layer, and add a Levels filter to adjust the height. At the same time, add anchor points so that the color changes with the height.
AO layer
Pasting the model's AO map into the AO channel deepens the dark parts of the edges and makes the bumps read as more three-dimensional.
Roughness
Adjust the roughness of the sofa surface. Places in frequent contact with objects wear more, such as the edge of the cushion, the inner and upper sides of the armrests, the bottom of the sofa, and the top of the backrest. Grunge textures can be added to make the effect more natural.
Edge fading
Create a fill layer and add a generator to produce a faded edge-wear effect. A single generator looks very stiff, so add a few more layers to break up the continuous effect and make the wear more natural; finally, add hand-painted details.
Added details
Use Grunge maps to add scratches, stains, and other details to the surface.
Dust
There are two kinds of dust: dust accumulated in corners and trace dust on surfaces.
Create a fill layer keeping the color, roughness, and height channels, then use a generator to compute the dust positions and adjust its parameters until the effect looks natural; hand-painted details can be added where appropriate.
Gap treatment
Manually add black to deal with gaps and other details.
If you're interested in animation and visual effects, you might want to check out Fox Renderfarm, the leading cloud rendering services provider and render farm in the CG industry. Fox Renderfarm has provided cloud rendering services to countless visual effects and animation studios, allowing them to get the best-quality results in the shortest time. A $25 free trial is available to help you speed up the rendering of your 3D projects.
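A footnote on the OpenGL/DirectX bake note at the top of this section: the two conventions differ only in the direction of the normal map's green (Y) channel, so converting a map between them amounts to inverting that channel. A minimal sketch in plain Python on 8-bit RGB pixels, illustrative only:

```python
def flip_normal_green(pixels):
    """Convert a tangent-space normal map between the DirectX and
    OpenGL conventions by inverting the green (Y) channel of each
    8-bit RGB pixel. The same flip works in either direction."""
    return [(r, 255 - g, b) for (r, g, b) in pixels]
```

Applying the flip twice returns the original map, so it is safe to round-trip a bake that came out of the wrong application.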
White Snake’s Special Effects Behind The Scenes And Animated Performances
White Snake
Recently, White Snake has stirred up hot discussion in the Chinese animation world. An original animated film built on folk legend and national-style aesthetics, White Snake has seen its box office stage two comebacks since its opening week. Below, we reveal the special effects and animation performances behind the scenes.
Character display
Animators' behind-the-scenes performance
Secrets of the behind-the-scenes effects production
The effects in this shot simulate the weightless environment underwater. The main difficulties are the character's high-speed movement and effects, with the fabric dynamics matching the environment. Different solutions were tested for this effect, driving the fluid by collision and emitting fluid from the character's body; in the end, the advantages of both were combined as the final solution. In addition, a special force field was created around the character so that the surrounding ink changes speed and feeds back into the character's action. Finally, a suitable physical environment was set for the ink, instead of the default vacuum, to keep the dynamics reasonably correct.
In the early stage of the character effects, a large number of tests and asset classifications were carried out. Force-field expressions, noise maps, and some self-developed tools were used to simulate and control the underwater state; after several iterative versions, the final result was achieved. In the shot where the beads are absorbed, the difficulty lies in the irregular movement and shape. To keep the effect dynamic, natural, and on-model, the effects team developed a complex force field to guide the particles and fluids as the main body, then derived ten layers of elements on top of it to enrich the effect; finally, the lighting department composited the final result.
In the pool shots, the creative team needed to convey a very strange, evil feeling.
The effects team used FLIP to achieve the dynamics of the pool water, extracting as much information as possible from the simulation: the particles, volumes, and geometry were converted into renderable layers from that data, and the final result was achieved in compositing.
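A note on the FLIP method named above: FLIP ("fluid implicit particle") solvers such as Houdini's carry velocities on particles but compute forces on a grid, then add only the grid's velocity change back to the particles, which keeps the motion lively. A toy 1D sketch of that particle-grid transfer under gravity, in plain Python; this is purely illustrative, with no pressure solve, and is not the film's actual setup:

```python
def flip_step(positions, velocities, dt=1.0 / 24, gravity=-9.8, cell=0.5):
    """One FLIP-style step in 1D: average particle velocities onto grid
    cells, apply gravity on the grid, then add each cell's velocity
    change back to its particles (pure FLIP, no PIC blending and no
    pressure solve, so this shows only the transfer idea)."""
    cells = {}
    for p, v in zip(positions, velocities):
        cells.setdefault(int(p // cell), []).append(v)
    delta = {}
    for c, vs in cells.items():
        old = sum(vs) / len(vs)      # particle-to-grid transfer
        new = old + gravity * dt     # forces are applied on the grid
        delta[c] = new - old         # only the change goes back
    new_vel = [v + delta[int(p // cell)] for p, v in zip(positions, velocities)]
    new_pos = [p + v * dt for p, v in zip(positions, new_vel)]
    return new_pos, new_vel
```

Because particles keep their own velocities and only receive the grid's change, small-scale detail survives the grid step, which is why FLIP water reads as splashier than purely grid-based methods.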