Thursday, June 26, 2014
Quick Cycles Tip: Fiber Optic Cable Material
This is a simple yet good-looking material for you to check out. At first it wasn't super obvious to me that I could use both a Surface and a Volume socket at the same time. It turns out that a Glass surface works really well with an Emission volume. You can create fiber optic cables, illuminated plastic, LEDs, etc. Tweak the emission color and strength, or the glass Roughness and IOR, to get various looks.
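If you would rather set this up from the Python console, here is a minimal bpy sketch of the same idea. The node and socket names are the standard Cycles ones; the colors and numbers are just example values to tweak.

import bpy

# Glass BSDF on the Surface socket, Emission on the Volume socket
# of the Material Output node.
mat = bpy.data.materials.new("FiberOptic")
mat.use_nodes = True
nodes, links = mat.node_tree.nodes, mat.node_tree.links
nodes.clear()

output = nodes.new("ShaderNodeOutputMaterial")
glass = nodes.new("ShaderNodeBsdfGlass")
emission = nodes.new("ShaderNodeEmission")

glass.inputs["Roughness"].default_value = 0.05                 # example value
glass.inputs["IOR"].default_value = 1.45                       # example value
emission.inputs["Color"].default_value = (0.2, 0.8, 1.0, 1.0)  # example color
emission.inputs["Strength"].default_value = 5.0                # example value

links.new(glass.outputs["BSDF"], output.inputs["Surface"])
links.new(emission.outputs["Emission"], output.inputs["Volume"])

Append the material to your cable object (for example with bpy.context.object.data.materials.append(mat)) and render in Cycles.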
Wednesday, June 25, 2014
Striping a Curve in Blender/Cycles
File this little tutorial under "maybe obvious to everyone else but me", but here is a way to add an interesting striped texture to your Blender curves that have a bevel object.
To start we add a bezier curve and a bezier circle, then in the curve properties tab for the curve object we set the circle as the bevel object. This leaves us with a simple tube-shaped object.
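If you like working from the Python console, the same setup looks roughly like this. The object names below are just the defaults Blender assigns to new objects, so swap in your own.

import bpy

# Hypothetical/default object names; adjust to match your scene.
curve = bpy.data.objects["BezierCurve"]
circle = bpy.data.objects["BezierCircle"]
curve.data.bevel_object = circle  # same as picking the circle in the curve's bevel settings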
We can of course assign it a uniform texture, but what if we want stripes?
We could add a checker texture to start, but that does not map nicely by default.
First turn on "Use UV for texture mapping" under the "Texture Space" panel in the curve options tab. Then add a UV texture mapping node. As you can see we are already a ways there.
Right now we can control two colors, but what if we want to control two shaders instead? Instead of feeding the checker into the color input, we will use it to drive a Mix Shader. You can adjust the two colors of the checker texture to set the mixing values of the pattern: pure white and black give you purely one shader or the other on each square, while anything in between blends them. I have kept the white/grey here for no particular reason.
But we do not want a checker pattern, we want stripes. To do that we will adjust the UV vectors, breaking the UV vector apart into its components. If you are not using a version of Blender with the Separate XYZ node, you can use the Separate RGB node to do the same thing.
Next, add Math nodes on the X and Y components between the Separate and Combine nodes (or R and G if you are using the color nodes), then feed the recombined vector into the checker's Vector input.
By setting the math nodes to multiply, you can see that the X component affects the checkers along the curve and the Y component affects the checkers around the curve.
So if we set X to multiply by 5 (or whatever size we like) and the Y to 0, we get nice stripes.
Here I changed one of the Mix Shader's inputs to a Glossy shader so we can see one type of effect we can get.
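For reference, here is a bpy sketch of the whole node tree described above. The node identifiers are the standard Cycles ones; the shader choices and multiply values are just the examples from this post, so treat it as a starting point rather than the one true setup.

import bpy

mat = bpy.data.materials.new("StripedCurve")
mat.use_nodes = True
nodes, links = mat.node_tree.nodes, mat.node_tree.links
nodes.clear()

coord = nodes.new("ShaderNodeTexCoord")        # we use its UV output
separate = nodes.new("ShaderNodeSeparateXYZ")
mult_x = nodes.new("ShaderNodeMath")
mult_y = nodes.new("ShaderNodeMath")
combine = nodes.new("ShaderNodeCombineXYZ")
checker = nodes.new("ShaderNodeTexChecker")
diffuse = nodes.new("ShaderNodeBsdfDiffuse")
glossy = nodes.new("ShaderNodeBsdfGlossy")
mix = nodes.new("ShaderNodeMixShader")
output = nodes.new("ShaderNodeOutputMaterial")

mult_x.operation = 'MULTIPLY'
mult_y.operation = 'MULTIPLY'
mult_x.inputs[1].default_value = 5.0   # stripes along the curve
mult_y.inputs[1].default_value = 0.0   # flatten the pattern around the curve
checker.inputs["Scale"].default_value = 1.0

links.new(coord.outputs["UV"], separate.inputs["Vector"])
links.new(separate.outputs["X"], mult_x.inputs[0])
links.new(separate.outputs["Y"], mult_y.inputs[0])
links.new(mult_x.outputs["Value"], combine.inputs["X"])
links.new(mult_y.outputs["Value"], combine.inputs["Y"])
links.new(combine.outputs["Vector"], checker.inputs["Vector"])
links.new(checker.outputs["Fac"], mix.inputs["Fac"])   # checker drives the mix
links.new(diffuse.outputs["BSDF"], mix.inputs[1])
links.new(glossy.outputs["BSDF"], mix.inputs[2])
links.new(mix.outputs["Shader"], output.inputs["Surface"])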
This is not the only way to mess with the position of the UVs. Try some of the other vector nodes to see what you can accomplish! Try setting keyframes on them to change the pattern over time!
Go crazy!
As you can see here, mapping an image can also be good, with or without the striping! The Mapping Node can really make this easy as well.
Monday, June 23, 2014
Going from Blender to 2k Digital Cinema DCP
I recently started on a project for a friend of mine: creating an advertisement for his small chain of movie theaters to play before the features. I've done plenty of animations before, but I had never thought about what it would take to get one onto a cinema screen. It turns out that to do it right, you need more than just a QuickTime file. Here is the process I went through to make it work. I pulled tidbits from various websites to get the whole process, and I'll link to the ones I used at the bottom of this post. Here is the overall workflow:
- Sources: To start, I created my Blender scene and collected my audio files
- Render #1: Render the Blender scene into a series of 16-bit TIFF files at full resolution
  - 2048x858 with square pixels (1:1 pixel aspect ratio)
- Composite: Using my TIFF files and my audio, I created a second Blender project to composite the sources into what I wanted the final project to look and sound like. For my project I used stereo audio, so I didn't mess with the surround sound options. I will be looking into that for future projects.
- Render #2:
  - Render out the composite to a 16-bit TIFF image sequence with any additional compositing/effects/etc.
  - Create the audio mixdown. This is where you could end up with a stereo file or a series of files for surround.
- Video Path to MXF (Material Exchange Format)
  - Using openDCP, convert the TIFF files to a JPEG2000 image sequence
  - Encapsulate the JPEG2000 image sequence in an MXF file
- Audio Path to MXF
  - Process the resulting audio files as needed (compression, normalization, etc.)
  - Make sure the project will save as 24-bit uncompressed WAV at 48k or 96k
  - Save each channel as a separate mono file (e.g. mix-Left.wav, mix-Right.wav)
  - Use openDCP to encapsulate the audio in an MXF file
- Use openDCP to combine the two MXF files into a DCP (Digital Cinema Package)
- Optional: Convert the DCP into a QuickTime file with ffmpeg for testing on your local computer
Resolution:
First off, there is aspect ratio. A full 2K frame is 2048x1080, but that is not the shape a widescreen digital cinema presentation projects at. The aspect ratio is 2.39:1, so the actual render resolution is 2048x858 (2048 ÷ 2.39 ≈ 858).
Source Files:
Image:
A DCP (digital cinema package) uses the JPEG2000 format for its picture. Luckily there is a handy free utility called openDCP which can convert 16-bit TIFF files into JPEG2000. So when I rendered out my Blender scenes, I made sure that I was exporting 16-bit RGB TIFF files. Then, using openDCP, I converted the individual frames to JPEG2000 files.
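For reference, here is a sketch of those render settings as a bpy script. The resolution and TIFF settings come straight from this post; the frame rate is not discussed here, so treat it as an assumption (24 fps is the usual digital cinema rate).

import bpy

scene = bpy.context.scene
scene.render.resolution_x = 2048            # 2.39:1 width
scene.render.resolution_y = 858             # 2.39:1 height
scene.render.resolution_percentage = 100
scene.render.pixel_aspect_x = 1.0           # square pixels
scene.render.pixel_aspect_y = 1.0
scene.render.fps = 24                       # assumption: standard DCP frame rate
scene.render.image_settings.file_format = 'TIFF'
scene.render.image_settings.color_mode = 'RGB'
scene.render.image_settings.color_depth = '16'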
Audio:
I used the Blender compositor to cut up and arrange my audio and export it. The audio required for a DCP is one mono WAV file per channel, signed 24-bit PCM, at a 48k or 96k sampling rate. I used Audacity to convert the MP3 files I exported up to that format. Make sure that you have a separate file for each channel: stereo is 2 channels, 5.1 is 6 channels, etc.
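I did the splitting in Audacity, but for the curious, here is a rough sketch of what the per-channel split amounts to, using Python's standard wave module. The filenames are made up, and it assumes the stereo mix has already been converted to 24-bit at 48k.

import wave

# Split a 24-bit, 48k stereo WAV into one mono WAV per channel.
# "mix.wav", "mix-Left.wav" and "mix-Right.wav" are example filenames.
with wave.open("mix.wav", "rb") as src:
    assert src.getnchannels() == 2 and src.getsampwidth() == 3  # stereo, 24-bit
    rate = src.getframerate()
    frames = src.readframes(src.getnframes())

width = 3  # bytes per 24-bit sample
for channel, name in enumerate(["mix-Left.wav", "mix-Right.wav"]):
    with wave.open(name, "wb") as dst:
        dst.setnchannels(1)
        dst.setsampwidth(width)
        dst.setframerate(rate)
        # Stereo frames interleave left/right samples; keep every other sample.
        samples = [frames[i + channel * width : i + (channel + 1) * width]
                   for i in range(0, len(frames), width * 2)]
        dst.writeframes(b"".join(samples))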
There are other items you can do that I haven't touched yet as well.
Intermediate Files:
openDCP then lets you package your JPEG2000 files into an MXF file, and your audio source files into another MXF file. These are the two files that will be used to create your final DCP folder.
DCP Output:
One main thing I learned is that a DCP is not a single file. It is a set of files: some contain the data, and others contain definitions of that data. The last thing you do with openDCP is combine all these items into your DCP. Once finished, you are left with a folder containing six files. This folder can be ingested into a digital cinema system and played back.
There are not many options for testing your DCP on your local computer. EasyDCP Player has a trial version that will let you play the first 15 seconds of a DCP, which will at least tell you whether it is working. Another thing you can do is convert your DCP back into a QuickTime file to see what your output looks like. Recent builds of ffmpeg can convert MXF to QuickTime; the following command will do it:
ffmpeg -i videofile.mxf -i audiofile.mxf -c:v prores -c:a copy testfile.mov
where videofile.mxf and audiofile.mxf are the MXF files that were placed in your DCP folder.
Software
- http://www.blender.org
- http://opendcp.org/
- http://www.easydcp.com/
- https://www.ffmpeg.org/ (mac builds here http://www.evermeet.cx/ffmpeg/)
- http://audacity.sourceforge.net/
Thanks to the many websites that helped me piece this process together.