How to Make an AWESOME 3D Virtual Choir!
Theatre Technology
Learn how to make a 3D Virtual Choir using Blender, DaVinci Resolve, and RunwayML.
Making a Virtual Choir
Today I am going to show you how I made a 3D virtual choir show. Whether you are here to learn how to do it yourself or are just interested in the technology behind it, this post is for you! First, let's see what we will be making.
This project was a lot of fun to create, particularly because it uses some fun technologies behind the scenes and, for the most part, relies on free or open source software. A virtual choir video can be a daunting task: there is a lot of source footage that has to be aligned perfectly and then edited together so the result isn't boring, and on top of that we have to work in a 3D environment. But like any problem, it gets much easier when we break it down into smaller pieces.
I personally divided this project into three discrete sections. The first section is where all the individual processing of the source files happens. The second is where we build the 3D scene. And the third covers lighting, animation, and exporting the final video.
Let’s cover how the source footage should be filmed. When I did this project originally I did not give the students any specific instructions, because I didn’t know at the time whether this would even work. Now that I know it does, I would ask for a full-body shot of each performer singing along to a backing track. This will help us match up the clips later.
If the performers have a green or blue screen, they can use it! If they don’t, have them film in front of a blank background such as a wall. Next we need to remove the background from each video. If a performer did not film in front of a blue or green screen, we can use an online tool called RunwayML to isolate them against a green background that we can key out later.
We simply click on the Green Screen tool under the video tools section and upload our file. Next, we click on the person we want to isolate. If some areas don't get highlighted in green on the first click, keep clicking until the entire body is selected.
If there are areas highlighted in green that shouldn't be, you can use the exclude tool to remove them. That's it! We just need to export the video now. This processing takes a while since it uses machine learning, but once it's complete the file can go back into the folder with the rest of the source footage that already has a green or blue screen.
Next we are going to kill three birds with one stone: chroma key out the background, align the video to the music, and export the video in a format that the 3D application can handle easily. I used a video editor called DaVinci Resolve, which has a great free version, but feel free to use any video editing software you are comfortable with.
First, we import the music file, drop it into the timeline, and lock it so it does not move. Next we import the first source video and line it up with the music by trimming either end and dragging the clip in the timeline.
Once everything is lined up we want to remove the green background. The exact steps differ between applications, but in Resolve I head over to the Fusion tab, add a chroma key node, and adjust the settings until the background is fully transparent.
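If you are curious what a chroma key node is actually doing under the hood, the core idea can be sketched in a few lines of Python. This is a simplified illustration, not Resolve's actual algorithm: a pixel gets keyed out when its green channel dominates red and blue by some threshold.

```python
def is_green_screen(r, g, b, threshold=40):
    """Return True when a pixel is 'green enough' to key out.

    A deliberately simplified chroma key test: green must exceed
    both red and blue by the threshold (0-255 scale). Real keyers
    work in other color spaces and soften the matte edges, but the
    core per-pixel decision looks like this.
    """
    return g - max(r, b) > threshold

def key_out(pixels, threshold=40):
    """Replace keyed pixels with fully transparent black (RGBA)."""
    result = []
    for r, g, b in pixels:
        if is_green_screen(r, g, b, threshold):
            result.append((0, 0, 0, 0))      # background: transparent
        else:
            result.append((r, g, b, 255))    # foreground: opaque
    return result

# A bright green pixel is removed; a skin-tone pixel is kept.
frame = [(30, 220, 40), (200, 160, 140)]
print(key_out(frame))  # → [(0, 0, 0, 0), (200, 160, 140, 255)]
```

This is also why a plain wall works as a fallback: anything with a consistent, distinct color can be separated from the performer, it is just harder to tune than a proper green screen.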
We are finally ready to export this clip. When dealing with 3D projects, it is generally a good idea to use an export format that can scale with the project. The format we will be using is OpenEXR, which Industrial Light & Magic began developing in 1999. It was built for complex compositing and 3D workflows: each frame is stored as its own file with full alpha-channel support, so the 3D application only loads the frames it needs instead of decoding an entire video, which keeps RAM and processor demands down when we go to render the final video.
Now we just need to repeat this process for each source clip that we have. Since I knew I was going to have rows of videos in the final shot, I created a timeline with a wide resolution and placed multiple videos side by side; this reduces the number of separate elements we have to position later.
With the source footage complete, it is time to move on to the fun part: the 3D environment. I am a terrible 3D modeler, so I went online, found a 3D model of a theatre that I liked, and downloaded it.
To combine all of these elements I used an open source program called Blender. It has a lot of features, and if I covered all of them we would be here for years, but there are good introduction videos on YouTube that explain the basics. Thankfully, what we are doing only scratches the surface. We simply need to import the 3D model we downloaded earlier from the file menu.
Next, we add an image plane to the scene and select the EXR sequence we exported earlier. Now we have successfully added our performer to the virtual theatre! If we hold down Z on the keyboard we can switch to the rendered view. As we can see, it does not look very good yet.
This brings us to the third and final section: making it look better. The reason it looks so flat is that there is no lighting in the scene. Just like in a real theatre, we need lights to add ambiance. Blender offers several different types of lights; just play around with them until you like how things look.
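One thing worth knowing while you experiment: Blender's physically based lights dim with the square of the distance, which is why a light that looks great right next to a performer can leave the back row in darkness. A tiny sketch of the inverse-square law (illustrative units, not exact Blender output):

```python
import math

def relative_brightness(power_watts, distance_m):
    """Approximate light intensity at a distance (inverse-square law).

    Intensity spreads over the surface of a sphere, so it falls off
    with the square of the distance from the light.
    """
    return power_watts / (4 * math.pi * distance_m ** 2)

# Doubling the distance quarters the brightness.
near = relative_brightness(1000, 2)   # performer 2 m from the light
far = relative_brightness(1000, 4)    # performer 4 m from the light
print(round(near / far, 2))           # → 4.0
```

In practice this means choir rows farther from a light need either more light power or their own dedicated lights to read evenly on camera.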
One neat feature that surprised me: backlighting an EXR image sequence casts a shadow, just like in the real world. I think that shadow on the ground really helps with the illusion that everyone is there.
After lighting the scene we need to give the video some movement to keep things interesting. We can grab the camera, move it to a starting point, and add a keyframe; then advance the playhead, move the camera to its second position, and add a second keyframe. On playback, we now have some nice camera movement.
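Under the hood, keyframing is just interpolation: Blender computes the camera position on every frame between your two keyframes. A rough sketch of the idea in Python, using linear interpolation for clarity (Blender actually defaults to a smoother Bezier easing):

```python
def lerp(a, b, t):
    """Linearly interpolate between a and b for t in [0, 1]."""
    return a + (b - a) * t

def camera_position(frame, key1, key2):
    """Interpolate a 3D camera position between two keyframes.

    key1 and key2 are (frame_number, (x, y, z)) tuples. Linear
    interpolation is used here for simplicity.
    """
    f1, p1 = key1
    f2, p2 = key2
    t = (frame - f1) / (f2 - f1)
    return tuple(lerp(a, b, t) for a, b in zip(p1, p2))

# Camera dollies toward the stage over frames 1-120.
start = (1, (0.0, -10.0, 2.0))
end = (120, (0.0, -4.0, 2.0))
print(camera_position(60, start, end))  # roughly halfway along the move
```

This is also why two keyframes are all you need for a simple push-in: everything in between is filled in automatically.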
The very last step is to render out the video. Since we are working in a 3D world with lighting, rendering takes a long time. For reference, my video of around 3 minutes took about 10 hours to render, so make sure you leave yourself enough time for the render process.
In Blender we select the output format we want and then choose Render Animation from the menu. And with that, we have created our own 3D virtual choir!