Resolve Studio can run as the LTC master, slaving third-party software and hardware solutions to the Fairlight digital audio workstation's Timeline (Fairlight menu > Remote Control Settings).
Patch LTC to any physical output via the Fairlight menu > Patch IO, selecting the System Generator as the Source with the Timecode option. Remember to enable SMPTE Timecode in Remote Control Settings.
LTC is widely used by composers to slave dedicated MIDI sequencers such as Logic Pro to dedicated audio post digital audio workstations such as Fairlight. This provides a purely software-based solution if MIDI sync hardware is not available.
The slave application may run on the same computer as Resolve or on a second dedicated machine.
With the Fairlight Audio Accelerator card and Audio Interface, the Fairlight digital audio workstation can also slave to incoming LTC generated by a third-party software solution or hardware (such as a console).
Resolve Studio also includes the Fairlight Monitor AFX in the Other plugin category. The Monitor AFX allows Resolve to run as a ReWire Master (Resolve will not run as a ReWire slave device). Place the Monitor AFX on a Fairlight Track or Bus, and set the MIDI sequencer as the Source in the Monitor AFX.
Thanks so much for getting back!! I actually stumbled onto an extremely obvious oversight of mine about this subject. I was waiting for a piece of software or hardware to make this solution for me, but it's so obvious… now.
– Literally just drop an LTC-generated audio clip into any video editing software; this site generates up to a 90-minute clip:
https://elteesee.pehrhovey.net (or generate your own and record it somewhere; see the sketch after this list).
– put it on a channel of its own and output to a submix that doesn’t route to speakers (obviously)
– use this tool: https://figure53.com/lockstep/ on the chase/slave computer
– set the correct MTC input on the DAW and WHAM!!!! Super easy setup.
– Mult this submix out to as many computers or devices as you need to chase the timeline!
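If you'd rather script the LTC clip than download it, here is a rough Python sketch of a generator I'd expect to work: it writes a mono 16-bit 48 kHz WAV, assumes non-drop-frame timecode, and leaves the user bits and polarity-correction bit at zero (most decoders don't seem to mind). The filename, frame rate, and length are just placeholders; I haven't tested it against every decoder, so treat it as a starting point rather than a finished tool.

#!/usr/bin/env python3
# Minimal LTC (SMPTE linear timecode) WAV generator: a sketch, not a
# production tool. Non-drop-frame only, user bits and the polarity-correction
# bit are left at zero, and the sample rate must divide evenly by (fps * 80).
import struct
import wave

def bcd_bits(value, nbits):
    # nbits bits of value, least-significant bit first (LTC is LSB-first BCD)
    return [(value >> i) & 1 for i in range(nbits)]

def ltc_frame_bits(h, m, s, f):
    # Build the 80-bit LTC frame for one HH:MM:SS:FF (flags and user bits = 0).
    bits = []
    bits += bcd_bits(f % 10, 4) + [0, 0, 0, 0]            # frame units, user bits
    bits += bcd_bits(f // 10, 2) + [0, 0] + [0, 0, 0, 0]  # frame tens, DF/CF flags, user bits
    bits += bcd_bits(s % 10, 4) + [0, 0, 0, 0]            # seconds units, user bits
    bits += bcd_bits(s // 10, 3) + [0] + [0, 0, 0, 0]     # seconds tens, flag, user bits
    bits += bcd_bits(m % 10, 4) + [0, 0, 0, 0]            # minutes units, user bits
    bits += bcd_bits(m // 10, 3) + [0] + [0, 0, 0, 0]     # minutes tens, flag, user bits
    bits += bcd_bits(h % 10, 4) + [0, 0, 0, 0]            # hours units, user bits
    bits += bcd_bits(h // 10, 2) + [0, 0] + [0, 0, 0, 0]  # hours tens, flags, user bits
    bits += [0,0,1,1, 1,1,1,1, 1,1,1,1, 1,1,0,1]          # fixed sync word
    return bits                                           # 80 bits total

def generate_ltc(filename, fps=25, sample_rate=48000, seconds=60, start=(1, 0, 0, 0)):
    spb = sample_rate // (fps * 80)          # samples per bit (24 at 48 kHz / 25 fps)
    assert sample_rate == fps * 80 * spb, "pick a sample rate that divides evenly"
    level, amp = 1, 20000                    # biphase-mark state, 16-bit amplitude
    h, m, s, f = start
    with wave.open(filename, "wb") as w:
        w.setnchannels(1)
        w.setsampwidth(2)
        w.setframerate(sample_rate)
        for _ in range(seconds * fps):
            samples = []
            for bit in ltc_frame_bits(h, m, s, f):
                level = -level                        # transition at every bit boundary
                samples += [level * amp] * (spb // 2)
                if bit:
                    level = -level                    # extra mid-bit transition encodes a '1'
                samples += [level * amp] * (spb - spb // 2)
            w.writeframes(struct.pack("<%dh" % len(samples), *samples))
            f += 1                                    # advance the timecode one frame
            if f == fps:
                f = 0; s += 1
                if s == 60:
                    s = 0; m += 1
                    if m == 60:
                        m = 0; h = (h + 1) % 24

if __name__ == "__main__":
    # e.g. a 90-minute clip at 25 fps starting at 01:00:00:00 (placeholder name)
    generate_ltc("ltc_25fps_90min.wav", fps=25, seconds=90 * 60)

Drop the resulting WAV on its own channel and it behaves the same as the downloaded clip from the site above.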
Obviously it doesn’t let you mess with anything on the slave computers once they are externally synced. But OMG…. I have been trying to stupidly solve this issue for what feels like years. But probably just a really long week.
Thanks for your post, that's a solid setup! But since I don't actually need a nonstop generated clock, the audio file method will work perfectly for me!
SO PUMPED. I can now mix, color grade, edit, final mix, and deliver, all without bouncing a single stem or exporting an H.264 ever again.
BOO YA
https://figure53.github.io/studio/
https://resources.avid.com/SupportFiles/VENUE/Using_Lockstep_with_S6L.pdf
Mister_9_Volt
Aug 2020
If you are on a Mac you can use Lockstep (a little free app) to sync Cubase to the SMPTE track. Your recorded SMPTE must be routed to an input channel of your audio interface; use this as the input in Lockstep. Cubase must then be externally synced to the virtual Lockstep MTC output. Should work.
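For anyone curious what that virtual MTC output actually carries: MTC is just a stream of quarter-frame MIDI messages sent at four times the frame rate, with each group of eight spelling out one HH:MM:SS:FF value. Below is a rough Python sketch (using mido with the python-rtmidi backend) that opens a virtual port and sends a fake 25 fps stream. It is not what Lockstep does internally, the port name is arbitrary, and the time.sleep timing is far too sloppy for real chasing, but it is handy for watching a DAW's sync indicator come alive.

#!/usr/bin/env python3
# Illustration of the MTC quarter-frame protocol, not a replacement for
# Lockstep: sends a fake 25 fps timecode stream on a virtual MIDI port.
# Requires mido + python-rtmidi; virtual ports work on macOS and Linux.
import time
import mido

FPS = 25
RATE_CODE = 1            # MTC rate bits: 0 = 24, 1 = 25, 2 = 29.97 DF, 3 = 30 fps

def quarter_frames(h, m, s, f):
    # The eight quarter-frame messages that spell out one HH:MM:SS:FF value.
    pieces = [
        f & 0x0F, f >> 4,                        # frames, low then high nibble
        s & 0x0F, s >> 4,                        # seconds
        m & 0x0F, m >> 4,                        # minutes
        h & 0x0F, (h >> 4) | (RATE_CODE << 1),   # hours plus frame-rate code
    ]
    return [mido.Message('quarter_frame', frame_type=i, frame_value=v)
            for i, v in enumerate(pieces)]

with mido.open_output('Fake MTC', virtual=True) as port:   # arbitrary port name
    h, m, s, f = 1, 0, 0, 0                                 # start at 01:00:00:00
    while True:
        for msg in quarter_frames(h, m, s, f):
            port.send(msg)
            time.sleep(1 / (FPS * 4))      # quarter frames go out at 4x frame rate
        f += 2                             # one full 8-message cycle spans two frames
        if f >= FPS:
            f -= FPS
            s += 1
            if s == 60:
                s = 0; m += 1
                if m == 60:
                    m = 0; h = (h + 1) % 24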