In-camera vfx means that the final CGI is already present in the scene when it's shot. This is usually accomplished with giant LED screens. Typically the engine that runs these screens is Unreal.
One major advantage is that the cinematographer's main job, lighting design, gets easier compared to green screen workflows. The LED screens themselves are meaningful light sources (unlike traditional rear projection), so they contribute correct light rather than the green spill that would otherwise have to be cleaned up in post.
The downside, of course, is that the CGI is nailed down and mostly very hard to fix in post. I suppose that's what Gore Verbinski is criticizing: for a filmmaker, the dreaded "Unreal look" is when your LED screen set has cheesy realtime CGI backgrounds and you can't do anything about it, because those assets are already produced and you must shoot with them.
Does this happen often? Are there any examples?
For example Fireframe in Finland: https://fireframe.fi/
https://techcrunch.com/2020/02/20/how-the-mandalorian-and-il...
See https://youtu.be/7ttG90raCNo for more details.
They’ll happily settle for “looks good enough for viewers who are distracted by their phones anyway” if it means the post budget item goes away completely.
https://en.wikipedia.org/wiki/Compose_key
(Besides, an LLM would capitalise "vfx")
I didn't think they were actually using the video straight out of the Volume, though; my assumption was they'd just use it to make sure the lighting reflected onto the actors nicely, and then redo the CGI elements with something else.
Say you're making children's videos in 3D, like Cocomelon or Bluey: you don't need all these nice things.
In the end, movies are about the stories, not just pretty graphics.
The great people at Pixar and DreamWorks would be a bit offended. Over the past three or so decades they have pushed every aspect of rendering to its very limits: water, hair, atmospheric effects, reflections, subsurface scattering, and more. Watching a modern Pixar film is a visual feast. Sure, the stories are also good, but the graphics are mind-bendingly good.
People don't pay 45 eurodollars for IMAX because they like the story.
That's how it's used, though? It only runs in real time for preview, but the final product is very much not rendered in real time at all. Obviously it's a very flexible tool that works with what you need and what you can afford: Disney runs it on their compute farm and can throw resources at it to render at the fidelity required, while plenty of production houses don't have those kinds of resources and have to make do with less. But then you wouldn't expect Pixar's own pipeline to work in those circumstances either, would you?
>> Unreal does have the ability to render out video but it's not going to be the same fidelity.
I really encourage you to look into what's possible with UE nowadays. Custom-made pipelines from Pixar or DreamWorks are better still, of course, but UE can absolutely stand next to them.
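To make the preview-versus-final point concrete, here's a rough sketch of what kicking off an offline, higher-fidelity render can look like through UE's Movie Render Queue, driven from the editor's Python scripting. The asset paths and sample counts are invented for illustration, and the class names reflect my understanding of the MRQ scripting API, so treat the details as assumptions rather than a verified recipe:

    import unreal

    # Grab the Movie Render Queue and add a job for a (hypothetical) shot.
    subsystem = unreal.get_editor_subsystem(unreal.MoviePipelineQueueSubsystem)
    queue = subsystem.get_queue()
    job = queue.allocate_new_job(unreal.MoviePipelineExecutorJob)
    job.sequence = unreal.SoftObjectPath("/Game/Cinematics/Shot010")  # hypothetical asset
    job.map = unreal.SoftObjectPath("/Game/Maps/StageA")              # hypothetical asset

    config = job.get_configuration()

    # Many spatial/temporal samples per frame instead of the single sample a
    # realtime preview gets; this is where much of the extra fidelity comes from.
    aa = config.find_or_add_setting_by_class(unreal.MoviePipelineAntiAliasingSetting)
    aa.spatial_sample_count = 8
    aa.temporal_sample_count = 8

    # Write an EXR frame sequence for compositing rather than a realtime video feed.
    config.find_or_add_setting_by_class(unreal.MoviePipelineImageSequenceOutput_EXR)
    out = config.find_or_add_setting_by_class(unreal.MoviePipelineOutputSetting)
    out.output_resolution = unreal.IntPoint(3840, 2160)

    # Kick off the render in-editor; this can run as slowly as it needs to, not at 24 fps.
    subsystem.render_queue_with_executor(unreal.MoviePipelinePIEExecutor)

The point is just that the same scene can be pushed through a much slower, higher-quality path; what that path can't fix is assets that were only ever authored to realtime quality, which is the pipeline problem mentioned elsewhere in this thread.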
My issue with UE is the opposite: the engine went too far into cinema production, and making it a performant game engine requires code refactoring. At that point an open-source engine might be a better choice. It's a mix of two (three) worlds, and not the best choice for one specific use.
For what is actually hard to do, like character animation, UE is a good choice. The lighting can be replaced more easily than the animation system.
Traditionally, big films bought new computer hardware and paid for new code as part of production. It was never particularly popular, as the promise of CGI was always that it would lower costs, but it never did. However, the upside of all this spending was some amazing-looking visuals.
Unreal has been sold pretty hard to the film industry, and effects houses are still charging as if they are buying new hardware and writing new code for every production. What they worked out is that they can use Unreal to save a ton of money and make more profit. That's also why the names listed on effects-heavy films are increasingly exotic, as they outsource more and more of the work to the cheapest places they can find.
Long story short: the effects are getting worse because they are getting cheaper while being used more and more heavily, meaning the budget for them is stretched further and further.
Marvel are responsible for a lot of this with their ridiculous production schedule for making films, which are essentially animated but treated by the public as if they are live action.
Funny it says this right after mentioning Jurassic Park. I, an avid JP fan who was blown away by the movie (and the book) when I was a dino-obsessed teenager, always thought it was the non-CGI dinos that didn't look all that realistic (even if the "puppets" were fantastically done, it was more about the movement/animation). Although we have to keep in mind they used those mostly for close-up shots, where CGI would've looked even worse.
If it's Gore saying it, maybe he should talk to his producers and ask them whether they actually budgeted for the "proper" VFX talent and timelines for the show. He has creative control; the people doing the work do not.
It still doesn't explain why it's done:
• Why do directors and producers sign off on effects that are just eye-bleedingly bad?
• Using a realtime engine to develop the effects doesn't preclude using a proper render pass at the end to get a nice result instead of "game level graphics". A final render pass can't be so expensive that ruining the movie is preferable, can it? If a render farm could do it 20 years ago, it can't cost millions today (a rough arithmetic sketch follows below).
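For a sense of scale on the "it cannot cost millions" question, here is some toy arithmetic. Every number below is an assumption chosen purely for illustration (real per-frame costs vary wildly, and heavy frames can take far longer):

    # Back-of-envelope: raw compute for an offline render pass, toy numbers only.
    fps = 24
    vfx_minutes = 40                  # assumed minutes of effects-heavy footage
    frames = fps * vfx_minutes * 60   # 57,600 frames

    core_hours_per_frame = 4          # assumed average render cost per frame
    price_per_core_hour = 0.05        # assumed farm/cloud price in dollars

    compute_cost = frames * core_hours_per_frame * price_per_core_hour
    print(f"{frames} frames -> roughly ${compute_cost:,.0f} of raw compute")
    # -> 57600 frames -> roughly $11,520 of raw compute

Even if those assumptions are off by an order of magnitude, the machine time isn't where the millions go; the expensive part is more likely the modelling, lighting, and artist time needed to make assets hold up under an offline render.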
- There's an order of magnitude more CGI in films than a decade ago, so even though the budget and tech are better, it's spread way thinner
- With CGI it's easier to slip into excess, and too much stuff on the screen is just visual noise
- Practical effects and complex CGI require months of planning, as they must work or you blow the budget/miss the deadline; now you don't need to plan ahead as much, leading to sloppy writing/directing, as the attitude is that 'we can rework it'
- Movies used to have 1-2 epic scenes they spent most of the runtime building up to. Nowadays each scene feels less memorable, because there are a lot more of them and they have less buildup
- 3D people don't have the skillsets for nailing a particular look. The person who's best at making gothic castle ruins is probably not a 3D expert, and this also goes the other way
(And AFAIK they usually do a non-realtime run, but a high-end render going for maximum photorealism also requires a whole different pipeline for modelling and rendering, which would blow the budget even further.)
I feel like there's a strong rose-tinted-glasses effect happening here. The early 2000s were especially full of absolutely dreadful CGI and VFX in almost every film that used them, unless you were Pixar, DreamWorks, or Lucasfilm. I can give you almost countless examples of this.
The only thing that changed is that now it's easier than ever to make something on a cheap budget, but this absolutely used to happen 20-30 years ago too: horror CGI was the standard, not the exception.
It's a bit cheaper.
> • Using a realtime engine to develop the effects doesn't preclude using a proper render pass at the end to get a nice result instead of "game level graphics".
It's probably a bit expensive, either in effort or in processing time.
In both cases you aren't ruining the movie, you're just making it more mediocre. People rarely leave the cinema because the CGI is mediocre.
Once the slop starts at the very basics, it's only natural that it extends to the CGI as well.
He knows what he's talking about!
The requirement to recompile the engine makes this feature effectively nonexistent for a film crew.