2D Stabilize in Flame Batch

Sorry about the quality of this video; I am still trying to get the hang of screen recording. I had a lot of trouble with the 2D tracker in Flame, so I decided to figure it out and make a video. Thanks to Flame news and Quinn and Phillippe from Autodesk for helping me with this. It started out as a small tutorial, but it grew into a 23-minute tutorial. I guess I had a lot to say. Hope it helps you!


How to get video footage into Linear mode with a LUT while keeping all the colors legal.

Working in Linear can give you great results, but it is tricky. I had a job where I needed to comp a video image into a monitor. The footage was shot on Alexa LOG; the insert was a rec709 still. The problem was that when I applied the Alexa LUT, the blacks were crushed below zero and the shot did not pass QC. It also screwed up the edges on my key, but more on that in a later post. How did I fix this? I applied the Alexa LUT to the still, added a 16-bit FP video-to-linear LUT, used Jeroen Schulte's handy-dandy Matchbox filter, and added a color correct node to check. Let's look.


The above image was what I had to put in the monitor of an operating room for a show we are working on.  In the comp, all the blacks were below zero.

Now let's look at the batch setup.


Above you can see my image, which is going into the Alexa LUT. Let's look at the LUT setup for that node.


In the screen cap above, I simply loaded a 3D LUT into the LUT node. NOTE: it is only 12-bit; Flame does not support 16-bit FP LUTs. At my shop, we NEVER bake in LUTs. We always deliver un-LUTted shots, and you should as well. **Note:** if you set the Destination to 16-bit FP, the LUT you have loaded will disappear and the software will NOT tell you, so be careful.

Here is the trick of the whole tip. A viewer LUT is applied downstream of all my scopes and color corrector pickers, so with a viewer LUT we cannot measure what the LUT is doing to the image. We need to be able to measure what the LUT is doing to our footage in the color correction node, so I have "baked in" the LUT (see above). Don't worry, we will take it out at the end. The second LUT is the Video to Scene Linear conversion at 16-bit FP. Let's look at that LUT node's settings.


In the LUT node pictured above, I have selected 16-bit float and Video_to_Scene_Linear, and I have adjusted some of the curves to bring back some of the highlight detail that was lost. What is going on here so far? I started with a rec709 video image and applied the Alexa LUT. Then I converted the image to 16-bit FP scene linear, because that is what my comp is. Here is the neat part: I am now going to make sure none of my blacks are below zero. This is where you need the L_Clamp Matchbox filter.
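Flame's Video_to_Scene_Linear LUT is a black box to me, but the general idea of a video-to-linear conversion can be sketched with the inverse of the plain Rec.709 transfer function. This is only an illustration of the concept, not the actual LUT Flame applies:

```python
# Rough sketch of a video-to-linear conversion: the inverse of
# the Rec.709 OETF. This is NOT Flame's actual
# Video_to_Scene_Linear LUT, just the general shape of what
# such a conversion does to one normalized code value.

def rec709_to_linear(v):
    """Invert the Rec.709 OETF for one code value v in 0..1."""
    if v < 0.081:                      # linear toe segment
        return v / 4.5
    return ((v + 0.099) / 1.099) ** (1 / 0.45)

# Mid-grey video lands around 0.26 in linear light.
print(rec709_to_linear(0.5))
```

Notice that nothing in this math stops a crushed black from landing at or below zero, which is exactly the problem the next step fixes.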


In the still above, I have selected the Minimum box (which is black) and the color pickers pop up. I changed the color depth to 16-bit FP and entered .002 as the value in the RGB boxes. What this does is boost all the blacks to just barely above zero so they will not be crushed when we stick this video clip into a 16-bit FP Linear action setup. This will make the VFX supervisor happy. Let's now look at the color corrector, which is the final node in our pipeline.
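I do not have Jeroen's shader source, but the Minimum behavior described above amounts to a per-channel floor. A tiny sketch of the idea, using the .002 value from the post:

```python
# Sketch of what the L_Clamp Minimum setting does: lift every
# channel value below the floor up to the floor. The 0.002
# value is from the post; the function itself is my
# illustration, not the actual Matchbox shader code.

def clamp_min(pixel, floor=0.002):
    return tuple(max(c, floor) for c in pixel)

# A pixel with an illegal negative black gets lifted:
print(clamp_min((-0.013, 0.0005, 0.25)))   # -> (0.002, 0.002, 0.25)
```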


OK, above is the screen cap of our final node, the color corrector. First, note there is NO LUT on this shot in the viewer. I have already applied the Alexa LUT at the very beginning of the pipeline, so this is what the shot will look like when the LUT is applied in the viewer. If I also used the LUT in the viewer, my shot would 1) look like crap because it would be double-LUTted and 2) the color picker in the color corrector would not register the color changes introduced by the viewing LUT. I also have a Neg Clamp on the output. What this does is bring down the whites so they will not be illegal in the comp. Finally, here is the proof that the process is working. Note the Front value on the color picker. I have selected the blackest part of the image and it comes up as .002, which, if you remember, is the value we entered in the Matchbox node. The Matchbox node has clamped all the blacks at .002, and this image is now ready to go into an action node and get comped. Let's look at that.


Above is the final batch setup for our image. **NOTE** I have removed the Alexa LUT node from the tree. This is because in Action I will be using the viewing LUT, so I no longer need the Alexa LUT node in the pipeline. If I kept the Alexa LUT in, my shot would be double-LUTted and look really bad.

In retrospect, you may not need the Matchbox filter. You may be able to tweak the settings in the 16-bit FP LUT node and see if that eliminates any illegal blacks or whites you may have. If not, the Matchbox Clamp filter will make sure you are all legal. Hope this makes sense. Join the FlameUsers Logik Facebook page (Jeroen is the owner of the page and quite helpful) to download the clamp filter.

Bilinear Tracking a still frame back onto an image. Part 1.

Can someone please help me figure this out? I am trying to get a single frame of a clip to track as a bilinear in Flame using data from Mocha. I had to do this for six shots on a job I just finished, and Flame always seemed to "start off" bad and then catch up to the track, whereas Nuke nailed it every time with very little effort. Flame never seemed to import the data correctly; Nuke did it right every time. Let's start at the beginning.
First I have a shot of this brick wall which is 90 frames long. Here is what it looks like. The first thing I did was track it in Mocha.
Then I exported frame 43 to photoshop. I painted the brick blue and made an alpha channel out of it.
Then I imported the still frame into Mocha and tracked it to the moving video. Here is what that looked like.  The video below has been rendered out of Mocha.

As you can see, the still frame tracks quite nicely to the background when rendered in Mocha.  To sum this up:  I have a single still frame (frame 43) that is tracked to the background.  Mocha is twisting and turning this frame so it will track to the background.  I did NOT paint all 90 frames blue.
I then went to frame 43 in Mocha and clicked the Align Surface button. This aligns the surface to frame 43, which is the still frame I painted. I then copy the data from the track into a Nuke corner pin, as in the photo below: I select the layer, hit Export Tracking Data, select Nuke Corner Pin, and copy to the clipboard. In Nuke I simply hit paste and I get a CornerPin node.
In this simple Nuke setup above, I have a CornerPin node on my still frame of the blue brick. The blue brick is a Targa with a matte. I stick that in the CornerPin node and key it over the moving footage. Here is the Nuke render of the corner pin. As you can see, it looks exactly like the Mocha corner pin. Again, to sum up: I have taken frame 43 from this sequence, exported it to Photoshop, painted a single brick blue, and created an alpha. In Mocha and Nuke I am able to take this still frame and track it to the background. Nuke and Mocha are "twisting" the still frame with the tracking data to make the brick fit over the moving footage.

Now for the problem part. I am trying to do the same in the Flame and the track is not accurate. It seems to be off for the first 20 or so frames but after that it is OK. Here is how I got the data into flame. I click the layer, hit export tracking data and hit the stabilizer option.

In Flame, in Action, I load up the Targa of the blue brick with the matte and key it over the video layer. I make the surface a bilinear (see below).

I then load the stabilizer data from Mocha into the bilinear, as in the still below.

I then hit the stabilizer button to enter the tracker module so I can load the Mocha data.

Above, I load the Mocha data by clicking the highlighted Load button. The Mocha data gets loaded in and my Action setup looks like this.

In the above still I am on frame 43 and the Mocha data is tracked perfectly. This looks good, but let's look at the Flame render.

It may be a little hard to tell from this YouTube video, but the track is off for the first 20 frames. Take a close look. The blue brick seems to slide from left to right. Let's look at a still frame from each of the three videos so you can see what I am talking about.

Above is the first frame of the track.  Note Nuke and Mocha are exactly the same and Flame seems to be a bit lower and to the left.  Now lets look at frame 10.

The above still is frame 10.  Note Mocha and Nuke are identical.  Again, the data for Nuke was created in Mocha and loaded into Nuke via cut and paste.  Notice the Flame still is again too low and to the left.  Now lets look at frame 20.

The above still is frame 20.  Note Flame has “caught up” to Nuke and Mocha and for the rest of the track it seems to be correct.

The above still is the final frame.  Note how everybody is all lined up and the same.  I do not understand how Flame can take the same data as Nuke and be off for the first 20 or so frames, catch up and then be perfect for the rest of the track.  It should be all right or all wrong.  How can it catch up?  HOW CAN I GET FLAME TO TAKE THIS DATA AND HAVE A CORRECT RESULT?  WHAT AM I MISSING?  PLEASE HELP!  THANKS.
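I honestly do not know what is going wrong here, but one difference between the two tools is worth knowing about: Nuke's CornerPin is a perspective (projective) warp, while a bilinear surface blends linearly between its four corners, and the two only agree on certain quads. The sketch below pushes the same point through both mappings; the corner values are made up for illustration:

```python
# Map the center of the unit square (u = v = 0.5) through the
# same four corners with (a) bilinear interpolation and (b) a
# perspective corner pin. Corner order: TL, TR, BR, BL,
# corresponding to (u,v) = (0,0), (1,0), (1,1), (0,1).

def bilinear(u, v, corners):
    (x0, y0), (x1, y1), (x2, y2), (x3, y3) = corners
    x = (1-u)*(1-v)*x0 + u*(1-v)*x1 + u*v*x2 + (1-u)*v*x3
    y = (1-u)*(1-v)*y0 + u*(1-v)*y1 + u*v*y2 + (1-u)*v*y3
    return x, y

def perspective(u, v, corners):
    # Homography from the unit square to the quad (assumes a
    # non-degenerate quad so den is nonzero).
    (x0, y0), (x1, y1), (x2, y2), (x3, y3) = corners
    sx, sy = x0 - x1 + x2 - x3, y0 - y1 + y2 - y3
    dx1, dy1, dx2, dy2 = x1 - x2, y1 - y2, x3 - x2, y3 - y2
    den = dx1 * dy2 - dx2 * dy1
    g = (sx * dy2 - dx2 * sy) / den
    h = (dx1 * sy - sx * dy1) / den
    a, b = x1 - x0 + g * x1, x3 - x0 + h * x3
    d, e = y1 - y0 + g * y1, y3 - y0 + h * y3
    w = g * u + h * v + 1.0
    return (a * u + b * v + x0) / w, (d * u + e * v + y0) / w

quad = [(0, 0), (4, 0), (3, 2), (1, 2)]   # a trapezoid
print(bilinear(0.5, 0.5, quad))           # y comes out 1.0
print(perspective(0.5, 0.5, quad))        # y comes out ~1.33
```

If one application is interpreting the other's corner-pin data with the wrong warp model, it could produce exactly this kind of "almost right" offset, but that is a guess, not a diagnosis.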

Here is a slowed-down version of all four shots together so you can see a little better how the Flame catches up with the track.

Using Expressions to Make a Frame for a Still or Power Cropping!

Here is a quick and useful way to use expressions to make a border for stills in Action. Let's look at my Action setup.

Above we can see my blue background layer.



Layer 1 is a white frame that is 1920×1080.  It is a lot easier for this trick if the white border is the same resolution as the frame you want to put the border around.

And finally, we can see a still of the park that is across the street from our shop in Philly.  This is where we hang out when there are building wide fire alarms.

Now, let's have a look at our finished product in Action.

Nothing too earth-shattering, but the white border was made with an expression and it moves with the picture frame! How did I do that? Like this. Let's crack open our animation channels.


OK. What you want to do first is open up the Media tab. Note how I named my images white and park so that when I opened up the Media tab, they would be clearly labeled. Hey, wait a second, they still say media 2 and media 1! I hope Autodesk can fix this. When you have 50 images, this menu can get confusing because the media names never change; they are always media1, media2. Please feel free to educate me on how to change this, but I do not think it is possible. Note the (2) in the schematic view next to the word park. This means that this is media 2 in the media list. This is the only way I know to figure out which media is which in the list. Here is what we need to do to set up this expression:

  1. Open up the Media tab.
  2. Select the Crop menu of the picture you want the border around.
  3. Hit the Copy button. The Copy button "arms" the expression and puts it into memory.

We are now ready to link it to another channel. Here is how.


OK. Select media 1, open it up, and highlight the Crop menu. Now push the Link button. A little e should appear next to the Crop menu and you should see the expression in the field I have highlighted. If not, you have done the operation in the wrong order. What we need to do now is determine the width of the border of the white box. Click the Crop button (the one with the e next to it) and hit the Expression button. You should get a little something like this.

Above, I have selected the menu with the expression, hit the Expression button to enter the editor, and the expression editor comes up. I have another expression post a few back if you want to know a little more about them. The expression came up as media.2.crop. I added -20 to it. What this does is create all the crops for media 1 from media 2, but with a -20 offset. So if media 2 has a crop of 100, media 1 will have a crop of 80. Let's look at the Action crop menus.

Above is the crop menu.  What I did was adjust my crops on Layer 2.  Notice how each of the crops on Layer 1 is 20 less than on layer 2.  This is due to the expression.  Also note the softness.  The softness is also included in this expression.  Since we have not adjusted the softness on layer 2, the softness on layer 1 is -20.  This will have no effect on the picture since you can not have a negative softness.  If this is really bothering you or you need to adjust softness by itself, you would have to copy and link each channel of layer 2 (top, bottom, left and right) to layer 1.  This is a little cumbersome to do, so I always do it the way I showed you because I rarely have to deal with softness when I do this trick.   Just wanted to point it out there for ya.
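The linked-channel behavior is easy to reason about once you see it as plain arithmetic. Here is a sketch of what the expression is doing (the variable names are mine, not Flame's):

```python
# Sketch of the linked crop channels: the expression
# media.2.crop - 20 means the layer 1 crop is always the
# layer 2 (master) crop minus the border width.

BORDER = 20

def linked_crop(media2_crop, border=BORDER):
    return media2_crop - border

for master in (100, 60, 20):
    print(master, "->", linked_crop(master))   # 80, 40, 0
```

This is also why the softness channel lands at -20: the same offset gets applied to every channel the expression covers.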


Lets say you do all this and you start to move the crops on layer 2 and the crops on layer 1 are NOT moving as advertised.  This is a bug.  Simply tap any one of the top, bottom, left, right crops in layer 1 and the crops will “update” and begin to move by using the expression.

Let's look at the result.

Amazing. It is croptacular. Now let's say the client wants a fatter border around the crop. Here is why this trick comes in so handy. Crack open the animation channel again.

In the above still I opened up the media layer, selected the expression, hit the expression button to get into the editor and then increased the number in the expression to 60.  Note how my white border is now bigger.  I can still adjust the crops on layer 2 but now the white border will be larger.

But what if my still is a wacky size? As we all know, IFFS needs clips with the same frame size and resolution in Action or it won't work. Also, if you have an odd-size still, you will get weird results if you take an 1880×1360 @ 1.39 still and use a 1920×1080 @ 1.77 frame for the border. Here is a good workaround for that.

Above we can see my horse is a weird frame size with an even weirder aspect ratio. I want to put a white border around it in Action. Here is what you do. Go into the color corrector, load in your frame, and make the clip black. Process the frame. Then go back into the color corrector and invert the clip. Your desktop should have three clips, like this.

Above, I have a black clip that I made from the horse clip in the desktop color corrector. Just bring the gamma down to 0 and render it. Then load that clip into the color corrector and invert it. We now have a black and a white clip the exact same size as the horse. Let's load the white frame and the horse into Action and do our crop trick. Here is what we get.
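If you want to sanity-check the black/white card logic, it is just a fill and an invert at the still's own resolution. A toy sketch (a "frame" here is just a grid of float values, nothing Flame-specific):

```python
# Toy sketch of the black/white card trick: make a solid black
# frame the same size as the odd-resolution still, then invert
# it to get the matching white frame.

def solid(width, height, value):
    return [[float(value)] * width for _ in range(height)]

def invert(frame):
    return [[1.0 - v for v in row] for row in frame]

black = solid(4, 2, 0.0)   # tiny stand-in for the 1880x1360 still
white = invert(black)
print(white[0])            # -> [1.0, 1.0, 1.0, 1.0]
```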

Above, you can see I have loaded the horse white frame into layer 3 and the horse into layer 4. I have set the expression to 50, and I can crop using the layer 4 crops while layer 3 follows along 50 units less. Again, the crop on layer 3 didn't work right away, so I simply touched the top crop for layer 3 and the expression "kicked in", and multi-layer crops were happening quite nicely.


Changing Reel Names on Mac ProRes Files Via Gateway AND conforming when Smoke can't read source code.

Conforming an XML in Smoke can be one of the most annoying things you can do. Here is a neat little tip I discovered quite by accident. You can change the reel names and the clip names of your files via the gateway. You are not changing any metadata on the clip, nor are you changing the filename of the QT; you are just changing how the Smoke reads the data. First, let's look at where the media is stored on the SAN. The media path is san_001/<client_name>/Media/001. Why is this important? The media is in a folder titled 001. It turns out the DP arranged this media into folders with the right reel names, so all the media in folder 001 is from reel 001, and all the media in folder 002 is from reel 002. Final Cut Pro is getting the reel name from the folder where the media lives. Get it?

Let's look at the Smoke side now. I have gone through a Mac gateway and can see my ProRes media. Let's look.

OK, in the above still I am looking at ProRes via the Mac gateway, and we can see the Smoke reel name is 001, just like it was on the Mac. The media is all black because I do not have the rights to this footage, but I needed it to show the tip. Anyhow, here is the neat part.

In the above still I am looking at the Mac Gateway in list view. Look at the metadata options at the bottom of the screen. I have selected Tape Name From File Header. I have also selected Clip Name From File Name, but I did not do a screen cap of that menu. Here is the crux of the issue: the Smoke is not reading the correct metadata from these QT files. Note the source code; it is all wrong. This is because these QuickTimes do not have their timecode saved in a place where the Smoke can read it properly (no fix for that!), and the reel name metadata is also stored in a place where the Smoke can not read it properly, which is why Tape Name From File Header is coming up with the clip name. It should be 001. Here is how we fix it. Remember, the clips from reel 001 are in a folder named 001. We can force the Smoke to assign a reel name from the directory the clip is in. Here is how.

First, note that we are still in the Mac Gateway. These ProRes clips are all being read off the SAN through our Mac. We are about to change how the Smoke reads the metadata on the clips, NOT the actual data. Here is how we do it.

  1. First select all the clips.
  2. Now select Tape Name From Directory.
  3. Hit the Refresh Selected Button
  4. Note all the clips are from Reel 001.

The neat thing about this trick is it changes the way the Smoke is reading the data. So if you need to change the clip name to the name of the folder it is in, you can do it by selecting Clip name from Directory, select the clips you want to change and hit refresh. You can also change just the clips you have selected so if only a few clips are misbehaving, you can change only those clips.
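Under the hood, "Tape Name From Directory" is just pulling the parent folder name off each clip's path. A sketch of the idea (the path is an example modeled on the post's SAN layout; the function is mine, not Autodesk's):

```python
# Sketch of "Tape Name From Directory": the reel name becomes
# the name of the folder the clip lives in, instead of whatever
# is (or isn't) in the file header.

from pathlib import Path

def reel_from_directory(clip_path):
    return Path(clip_path).parent.name

print(reel_from_directory("san_001/client/Media/001/clip_a.mov"))
# -> 001
```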

Conforming XMLs is hard, but I have found that if I use this list view I can see what is going on a lot more easily. If I can not relink the clips in one fell swoop, I end up looking at my media in this list view so I can start to change things and see what works. For this particular job, the TC was not reading properly, but all the clips had unique names. However, relinking using clip name only wasn't grabbing the right clips. We noticed the reel names were coming up as the clip names, so we changed the reel name to the folder name and did a relink using clip names and reel names. The Smoke then found all the media.

Here is how I conform XMLs.


2.  Load the XML into a record area and note the clip names, the source timecode, and the reel name. You can do this by hovering over the timeline and holding Alt+click on the clip. You should get this.

Above, the clip name is 002_SCENE 101_012. The reel name is 002. Note the : separates the reel name from the file name. The file name is 002_SCENE 101_012.mov. The source timecode is in the lower left.

3.  I then look at the clip list view in my library. Do I have a clip in the library with the same name as in the XML? Is there a reel 002 in the library? Does the source code in the timeline match the source code in the library? All things to check before you try to relink. In this case, the source code was NOT matching the XML because the Smoke could not read the code properly. What did I do?

4.  Note the H:4129 and the O:4129 numbers. This means that 4129 frames into clip 002_SCENE 101_012, the Smoke needs to put that part of the media into the timeline. When I relinked, I only used clip name and reel name, NOT the timecode. The Smoke then grabbed the shot by matching the file name and the reel name. It then slipped the clip 4129 frames forward so all my shots lined up! If I had consolidated this timeline first, the offsets would be all messed up and this trick would NOT work. Sometimes it is better not to consolidate the timeline prior to conforming.
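The H:4129 arithmetic is worth spelling out, because it is what makes the no-timecode relink work. A sketch of the slip (the frame number is from the post; the function is my own illustration, not Smoke's internals):

```python
# Sketch of the head-offset slip: once a clip is matched by
# name and reel, the Smoke does not need source timecode. It
# just slips into the source by the stored head offset to find
# the media the event needs.

def source_in_point(source_start_frame, head_offset):
    return source_start_frame + head_offset

# The event needs media 4129 frames into 002_SCENE 101_012:
print(source_in_point(0, 4129))   # -> 4129
```

This is also why consolidating first breaks the trick: consolidation throws away the head media, so the stored offset no longer points at the right frames.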

Match a Clip out of a timeline Batch or Action Setup.

Not an earth-shattering tip here, but a very handy one. Let's say you have an Action clip with a clip history on it, or a complicated batch setup in your timeline, and you just need to pull out a clip and give it some love with desktop paint. How do you go about getting a clip out of batch? Let's look.


In the above still, I have a clip with an Action clip history on it. I used Action on the desktop and put in lots of logos. This tip will also work with batch; I just happened to have an Action clip handy for this job. Make sure your clip is selected and the little yellow pointer is on the clip (it is in the small blue box). Now hit Shift+Ctrl+F5 or select Large Clip History. Your screen should look like this.


See how you can select Large History or use the hot keys?  Now I can see all the clips I used in this action setup.  Use the bars on the left and right to zoom in and you can see your clips.

OK, all you have to do is select the clip. I selected the Enterprise car logo. See how it is highlighted in yellow? Then push the Match key. The clip now appears on the desktop.

See, the clip is now on the desktop!


Smoke to Lustre: how to "handle" it.

The Smoke to Lustre workflow is good, not great. It is amazing what it can do, and it is amazing what it can't do. There are a few things you have to look out for, and here is the biggest one: the Lustre will only see shots you have rendered. Here are the three ways you can work using Lustre. Let's look at a timeline to see what I am talking about.

Way Number One. Source Grade OFF.


In the photo above, we can see a Smoke timeline. The biggest issue we have when using Lustre is mixed-resolution media. The timeline is 1920×1080 @ 10 bit. I have labeled the shots in white so we can see their frame sizes. Note that all the shots except the 1920×1080 @ 10 bit shots have a resize on them, and the resize has been rendered. If you save this timeline into the library and the Lustre grabs it, THE LUSTRE WILL ONLY SEE MEDIA THAT HAS BEEN RENDERED. If ANY of the shots are unrendered, the Lustre will not see them. So who cares? You should. The handles of these shots are NOT RENDERED in the Smoke, and this is why it is important: even though the Lustre will grab the shot with the handles, and you have Render Handles selected in the Lustre, THE LUSTRE WILL NOT RENDER HANDLES. You will get your shots back from Lustre cut to cut. If you are done editing, this will usually not be an issue, but you have to be aware of it.

Pluses of this workflow.

1. The Lustre operator will be able to see all of your repos, batches and timeline effects as is.

Minus of this workflow.

1. You will not have any handles in your shots when you get them back.

Way Number Two. Source Grade ON.

Now let's say we want the Lustre to grade our handles. Here is how we do it. In the Lustre, select the Source Grade button BEFORE you drag over your Smoke timeline.


This will make the Lustre grab the ORIGINAL media for each shot. This way, you are grading the shots pre-blowup, pre-resize, pre-everything. When the colorist sends you back the footage, the Smoke will "automagically" place all of your Axis effects back on the shots. Let's look at an example.


In the above photo, layer 2 is the original edit. Layer 1 is the timeline the Lustre shot back to us. I didn't do anything special to the Lustre layer; it appeared in my From_Lustre library when the colorist graded it. I then stuck it on layer 1 of my timeline. As you can see, the Smoke "automagically" reapplied all of my soft effects.

Pluses of this workflow

  1. You will have handles on the media.
  2. If you have 12-bit media (RED or Alexa) in a 10-bit timeline, the Lustre will be able to grade it using the native 12-bit goodness.
  3. The Lustre operator will be able to see the shots in the timeline in the right order, BUT without any timewarps, blowups, or any other soft effects.

Minuses of this workflow

  1. The Lustre operator will not be able to see blowups, time warps or any other soft effects.

So what if we want handles on our media AND the effects that we have in the timeline? Then you gotta kick it old school, my friend.

Way Number Three. Source Grade OFF. Editing hat ON.


In the still above, I have layer 2 as my conform, with a bunch of soft effects on the media. In layer 1, I have trimmed out all my shots so there are handles, and I have also rendered out the soft effects in this layer. I will send layer 1 to Lustre (I kept layer 2 in the picture so you can see what is going on; I would normally make layer 1 a new timeline for Lustre) and make sure Source Grade is OFF. This way the Lustre will grade the shots with the effects and send them back to me. I will then have to re-edit the graded material to match the conform. Again.

Pluses of this workflow.

  1. I will have all the handles I need.

Minuses of this workflow.

  1. I will lose my 12-bit goodness. Since this is a 10-bit timeline, the 12-bit shots will all be converted to 10 bit and we will lose that color info.
  2. I have to do the conform twice. This is how we used to do it in the bad old days of linear tape-to-tape sessions. I have to conform the show, send it to Lustre, get it back, and reconform it again.
  3. My effects are baked in and I can't change them.
  4. The colorist will not be able to "see" the spot as it is in the edit.

Personal rant here. I can see Lustre not handling a big, complicated batch setup. However, simple resizes and simple X/Y/Z axis moves should NOT be a problem. If I have a 10-frame shot with a 120% blowup on it and 12 frames of handles, the Lustre should be able to "see" my simple timeline repo. The operator should be able to see my blowup but grade the original source material and then send it back to me. The fact that it can't handle the handles on a shot with an Axis on it is annoying.