Pyro Explosion Experiment

The next experiment I want to try is creating a pyro explosion. Having already completed experiments involving fluids, particles and soft-body materials, I feel this is the appropriate next step.

https://www.youtube.com/watch?v=fTIeGob0Wuo

Although this tutorial dealt more with shaders, it was still an important step in learning how to make something look more realistic. Recreating it was difficult, as the video was hard to follow: it only displayed key words at the bottom of the screen rather than narration. Having said this, it also allowed me to be more independent with the image and create something that feels more of a reflection of my own creative style.


The first step was to add a sphere to the viewport and then add a mountain node. I've learnt from previous experiments that a mountain node makes an object look more organic, and as I will be creating a cloud of smoke, it will look more natural if it isn't perfectly round.

The following step was to apply the pre-made Explosion preset, located under the Pyro FX shelf tab (top right). Adding nodes such as a vortex makes the smoke look more organic, and changing the confinement levels adjusts the size that the explosion can reach. The main reason for this step was to change the values of some nodes in order to produce a better rendered outcome.

When recreating this scene, I noticed that my smoke did not look as thick or natural as the one in the tutorial. This may have been down to us using different versions of Houdini, but having read some of the comments below the video, there was a solution. By adding a 'Volume Visualisation' node inside the 'pyro_import' object, I was able to increase the density and ultimately create a more natural-looking explosion.

The next step was to add a camera. Unlike the tutorial, I didn't adjust the resolution, as I was comfortable with where it had been positioned. After this, the main step in creating a good explosion is to add shaders and change the lighting. I added two 'distant light' objects to the scene: one was kept white to create more shadow, and the other was changed to blue. This made the scene look more natural.

The image above shows what the simulation looked like in the rendered viewport. In comparison to the image in the tutorial, it does not look as effective. As I was not satisfied with the outcome, I decided to experiment further and managed to find a tutorial better suited to my version of Houdini.

Although this explosion looks less organic, hopefully by the end of this experiment I will have created something that I am more confident with, as it still creates the plume of smoke that I want to reflect in my work.

The first few steps were similar to previous experiments, as I had to create a sphere object and add a mountain node in order to make it look like the image above. Changing the rows and columns to 100 gives the object more edges to work with when its attributes are changed later in the experiment. Before adding the mountain node, I changed the sphere's scale values in the left-hand panel to make the sphere flatter.

The next step was to play around with the attributes in the mountain node (element size and height) in order to create the image above.

After this, I added the Pyro FX preset 'Explosion', which created the smoke you can see above. The next few steps were just adjusting the attributes in each node; below is a time-lapse of these small changes, which ultimately created the final render that follows.

Editing the pyro nodes:

Pyro Explosion Output:

Frames Rendered: 180

Render Time: 2 hours
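Those two figures can be turned into a per-frame average with some quick arithmetic. A small Python sketch, assuming Houdini's default playback rate of 24 fps (the frame rate is my assumption, not something stated above):

```python
# Average render time per frame for this output.
frames = 180
total_seconds = 2 * 60 * 60  # 2 hours

seconds_per_frame = total_seconds / frames
print(seconds_per_frame)  # 40.0 seconds per frame

# At Houdini's default 24 fps, 180 frames is 7.5 seconds of footage:
print(frames / 24)  # 7.5
```

So roughly 40 seconds of rendering per frame, for 7.5 seconds of final footage.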

Rendering Settings

Houdini's default renderer is Mantra. As Mantra is a SideFX product, it works alongside Houdini easily. That being said, after researching the best renderer for my outputs, I found that many people also use Redshift. After accumulating information from a few sources:

  • https://forums.odforce.net/topic/26640-mantra-vs-redshift/
  • https://www.reddit.com/r/vfx/comments/963bmi/what_renderer_do_you_use_for_houdini/
  • http://www.sidefx.com/docs/houdini/render/render.html

I narrowed down the best features of both renderers.

Redshift:

  • GPU-based, so it renders faster
  • Supports shaders and textures
  • Works in Maya, Houdini, Cinema 4D and Katana

Mantra:

  • Supports volumes and particles (i.e. pyro and fluid)
  • A SideFX product
  • Houdini is VEX-based, and Mantra shaders are written in VEX

Although Redshift's GPU rendering is faster and works across a few different pieces of software, Mantra is a SideFX product, which leads me to believe it is a more reliable and efficient way of rendering my outputs; therefore, I am going to continue rendering with Mantra.

The settings I used to render the disintegration effect were effective enough to show the experiment clearly – the only thing I will change is the output format, as this could have been a reason for the initial malfunctions. Keeping this in mind for future experiments, I will change the output format to PNG, as this works better on my laptop and makes it easier to render out video sequences for my work.

I created a render preset in Houdini that I feel works best for these experiments. I did this to make rendering easier and to give myself a reference to my previous render settings for other experiments. If necessary, I can adapt the settings to suit a particular experiment, but for now it provides a base setting that I am comfortable with.

Image of preset render setting:

The settings above could change depending on the simulation; however, this is a good base for my experiments to work from, as I know PNG is a more accommodating format to render the output in.

Something I have learnt from an online forum (https://forums.odforce.net/topic/26689-houdini-fx-how-to-render-out-correctly/) is that an important factor in rendering out an image sequence is to add '$F4.png' to the file name (or '.exr', depending on the type of image), as this gives you a separate file for each frame – in this case, a sequence of separate images that can be put into a video sequence in Premiere Pro. When I originally rendered out the jellies experiment, this wasn't added at the end, which meant I was overwriting the same image at the end of each frame rendered.
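To see why the '$F4' token matters: it expands to the current frame number padded to four digits, so every frame writes to its own file. A quick sketch of that expansion in plain Python (outside Houdini, using ordinary string formatting – the output name 'explosion' is just an example, not from the tutorial):

```python
# Mimic Houdini's $F4 token: the frame number, zero-padded to 4 digits.
# 'explosion' is a hypothetical output name used for illustration.
def frame_filename(name, frame, ext="png"):
    return f"{name}.{frame:04d}.{ext}"

# The first three frames of a sequence:
files = [frame_filename("explosion", f) for f in range(1, 4)]
print(files)  # ['explosion.0001.png', 'explosion.0002.png', 'explosion.0003.png']

# The last frame of a 180-frame render:
print(frame_filename("explosion", 180))  # explosion.0180.png
```

Without the $F token, the name would be the same for every frame, which is exactly why my jellies render kept overwriting a single image.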