and I started a 4-day +Novacut hack fest today.
First we've focused on a feasibility study for a new Novacut rendering backend, and thus far the results have been extremely promising.
It's not a huge change architecturally as we're still using +GStreamer, although we will be using it quite differently in one key area. It's also not about new features, as editing fundamentally isn't about features... it's about editing, which has actually changed very little in the past 50 years.
To clarify: go shoot a few dozen reels of film, then sit yourself down at a light table with some scissors and scotch tape. That's what I mean by editing. It's also what anyone who has worked in a Hollywood-esque production pipeline means by editing.
I want Novacut to have the clarity of a light table, scissors, and tape. We still have lots to do on many fronts, on both the design and the technology side. But I think we've done a good job staying focused, even if it takes a long time to get there.
During this hack-fest we're tightening up some technology areas that we just can't put off any longer:
1) We need to focus more on the stability and performance needed for feature length, high complexity edits
2) We need to be able to deliver flawless frame-accurate renders, without fail, over and over, every time (we've historically done pretty darn well on this, but it's never been perfect... and we need perfect here)
One nice thing about the Novacut edit description is that video cuts (what we call "slices") are specified by [start:stop] video frame indexes rather than time. This means that, for example, we know the exact total number of frames that the render of an edit should contain. So we can check that every time we render a particular edit we get the expected number of frames, not one frame more, not one frame less.
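As a quick sketch of why frame-indexed slices make this check so easy (the slice representation here is my own illustration, assuming Python-style half-open `[start:stop]` ranges as the notation suggests):

```python
# Hypothetical edit description: each slice is a (start, stop) pair of
# frame indexes into some source clip, half-open like a Python range
# (the frame at index `stop` is excluded).
edit = [(0, 120), (300, 450), (90, 91)]

# Because slices are in frames rather than time, the expected length of
# the render is exact integer arithmetic, no float rounding anywhere:
expected_frames = sum(stop - start for (start, stop) in edit)
print(expected_frames)  # 271

# After rendering, frame accuracy is a simple equality check:
#     assert rendered_frame_count == expected_frames
```

Had slices been specified in seconds instead, the expected frame count would depend on rounding behavior at every cut point, and "not one frame more, not one frame less" would be much harder to pin down.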
I've grown increasingly fond of using large, randomly generated edits to test both stability and frame accuracy. And our experimental backend took us to new heights today, allowing us to do things we couldn't do previously.
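Generating such a stress-test edit can be sketched in a few lines (this is my own toy generator, not Novacut's actual test harness; it draws random half-open slices from a single hypothetical source clip):

```python
import random

def random_edit(num_slices, source_frames):
    """Build a random edit: num_slices half-open (start, stop) slices
    drawn from a source clip that is source_frames frames long."""
    edit = []
    for _ in range(num_slices):
        start = random.randrange(source_frames)          # 0 .. source_frames - 1
        stop = random.randrange(start + 1, source_frames + 1)  # at least 1 frame
        edit.append((start, stop))
    return edit

random.seed(1)  # reproducible test edits are handy for chasing bugs
edit = random_edit(5120, 30000)
total = sum(stop - start for (start, stop) in edit)
print(len(edit), total)  # 5120 slices, with an exactly known frame total
```

The point is that the expected frame total is known before the render starts, so a multi-hour render can be verified down to the last frame.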
I just successfully rendered a randomly generated test edit that:
* Had 658,051 frames
* Contained 5,120 slices
* Was 6 hours, 5 minutes, 56 seconds long
* Produced a 20.7 GB video
Not only is the new renderer stable enough to do this after just a day of work, and not only did it do so with a very flat memory profile over the 7-plus hours this test took...
every one of those 658,051 frames was rendered exactly
as specified by the edit description. Not a frame was dropped or duplicated. And the outgoing timestamp and duration of every frame was mathematically perfect
, without fail.
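To illustrate what "mathematically perfect" timestamps mean in practice (the framerate here is my assumption, chosen because the numbers above are consistent with NTSC 30000/1001 fps; GStreamer timestamps are in nanoseconds):

```python
from fractions import Fraction

framerate = Fraction(30000, 1001)  # assumed NTSC framerate
SECOND = 10**9                     # GStreamer timestamps are nanoseconds

def frame_timestamp(i):
    """Exact timestamp of frame i, rounded to the nearest nanosecond.

    Rounding the *boundaries* (rather than accumulating a rounded
    per-frame duration) means the timeline never drifts: frame i's
    duration is simply frame_timestamp(i + 1) - frame_timestamp(i).
    """
    return round(i * SECOND / framerate)

pts = frame_timestamp(0)
dur = frame_timestamp(1) - frame_timestamp(0)
print(pts, dur)  # 0 33366667
```

Accumulating a single rounded duration (33,366,667 ns) per frame instead would drift by a fraction of a nanosecond per frame, which adds up over 658,051 frames; deriving each boundary from the frame index keeps every timestamp and duration exact.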
There is a pretty amazing visual difference between a render that is generally frame accurate, give or take a little wiggle, and a render that is, by the numbers, truly perfect. I'll do a shorter test render tomorrow and will upload the result. For now, just a screenshot of Totem playing that 20GB file, as I don't think I'll be uploading it anytime soon :P