https://github.com/vfxwiki/8bit2floatCat
Depth Map: A depth map is a representation of the distance or depth information for each pixel in a scene. It is typically a two-dimensional array where each pixel contains a value that represents the distance from the camera to the corresponding point in the scene. The depth values are usually represented in metric units, such as meters. A depth map provides a continuous representation of the scene’s depth information.
For example, Arnold produces this through its Z AOV, which collects the depth of the shading points as seen from the camera (a quick normalization sketch follows the links below).
https://help.autodesk.com/view/ARNOL/ENU/?guid=arnold_user_guide_ac_output_aovs_ac_aovs_html
https://help.autodesk.com/view/ARNOL/ENU/?guid=arnold_for_3ds_max_ax_aov_tutorials_ax_zdepth_aov_html
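As a rough illustration of what such a Z AOV contains, here is a minimal Python sketch (not Arnold-specific) that remaps a metric depth array to 0-1 for preview. The normalize_depth helper and its near/far parameters are assumptions for the example, not part of any renderer's API.

import numpy as np

def normalize_depth(z, near=None, far=None):
    """Map a metric depth map (e.g. a Z AOV in meters) to 0-1 for preview.
    z    -- 2D numpy array of per-pixel camera distances
    near -- optional near clip; defaults to the closest finite sample
    far  -- optional far clip; defaults to the farthest finite sample
    """
    finite = z[np.isfinite(z)]                    # ignore inf at empty pixels
    near = finite.min() if near is None else near
    far = finite.max() if far is None else far
    normalized = (z - near) / max(far - near, 1e-6)
    return np.clip(normalized, 0.0, 1.0)

# Example: a synthetic 4x4 depth map in meters
depth = np.array([[1.0, 2.0, 3.0, 4.0]] * 4)
print(normalize_depth(depth))                     # rows of 0.0, 0.33, 0.66, 1.0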
https://radiancefields.com/gaussian-splatting-in-nuke
https://aescripts.com/gaussian-splatting-for-nuke
It connects Nuke to a ComfyUI server, so any plugin released for ComfyUI can be used inside Nuke: rotoscoping with SAM, rescaling, image generation, inpainting, normal-map generation and more. Available nodes include IPAdapter, ControlNet, AnimateDiff, Flux, etc.
https://github.com/vinavfx/nuke_comfyui
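The nuke_comfyui plugin's own code is the reference; as a hedged sketch of the kind of request a running ComfyUI server accepts, the snippet below queues a workflow graph over ComfyUI's standard HTTP API. The server address and the workflow file name are assumptions for the example.

import json
import urllib.request

COMFYUI_URL = "http://127.0.0.1:8188"   # default local ComfyUI address (assumption)

def queue_workflow(workflow: dict) -> dict:
    """Submit a workflow graph (ComfyUI API/JSON format, e.g. exported with
    'Save (API Format)') to a running ComfyUI server and return its response."""
    payload = json.dumps({"prompt": workflow}).encode("utf-8")
    request = urllib.request.Request(
        COMFYUI_URL + "/prompt",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request) as response:
        return json.loads(response.read())

# Usage sketch: load an exported workflow and queue it
# with open("sam_roto_workflow.json") as f:       # hypothetical file name
#     print(queue_workflow(json.load(f)))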
https://blender-addons.gumroad.com/l/denoiser_comp_addon
Blender 3 updated Intel® Open Image Denoise to version 1.4.2, which fixed many render artifacts, even when denoising separate passes, but the standard mode still loses a lot of definition. DENOISER COMP separates the passes, applies the denoiser only to the selected ones, and recombines them into the final beauty pass, keeping much more definition, as shown in the videos.
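DENOISER COMP's own implementation is not shown here; the sketch below only illustrates the general idea of additive pass recombination, assuming the passes are numpy arrays and that denoise is a stand-in for an Intel Open Image Denoise call.

import numpy as np

def recombine_beauty(passes, denoise, denoise_only):
    """Rebuild a beauty image from additive light passes, denoising only some.
    passes       -- dict of pass name -> HxWx3 numpy array
    denoise      -- callable(image) -> image, stand-in for an OIDN filter
    denoise_only -- set of pass names to run through the denoiser
    """
    beauty = np.zeros_like(next(iter(passes.values())))
    for name, image in passes.items():
        beauty += denoise(image) if name in denoise_only else image
    return beauty

# Usage sketch: filter only the noisy indirect passes, keep direct lighting crisp
# beauty = recombine_beauty(render_passes, oidn_filter,
#                           {"diffuse_indirect", "glossy_indirect"})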
https://www.nukepedia.com/gizmos/time/vectorframeblend
Blend up to 11 nearby frames together, while preserving all detail
VectorFrameBlend can average/median/min/max/plus up to +/- 5 frames with full motion awareness. Compared to the previous version and other similar solutions, it is built to be as technically correct as possible and provides thorough settings to improve filtering quality and edge cases (literally).
You can also use the ‘External’ mode and connect the ‘vec’ input to another VectorFrameBlend to reuse its internally generated vectors.
This can be useful if you want to analyse a certain layer (for example a diffuse colour pass that holds a lot of clean detail) but apply the frame blending somewhere else. Beyond that, the tool can of course be used on live-action plates, utility passes or whatever comes to mind.
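The gizmo itself is built from Nuke nodes, but the core idea of motion-aware frame blending can be sketched in a few lines of Python with OpenCV: warp each neighbouring frame onto the current one using per-pixel vectors, then combine. The function below is an assumption-laden illustration, not the plugin's code.

import cv2
import numpy as np

def motion_compensated_average(frames, flows_to_current):
    """Average neighbouring frames after warping them onto the current frame.
    frames           -- list of HxWx3 float32 frames (include the current frame
                        paired with a zero flow)
    flows_to_current -- list of HxWx2 float32 flows giving, for each current-frame
                        pixel, where to sample in the corresponding neighbour
    """
    h, w = frames[0].shape[:2]
    grid_x, grid_y = np.meshgrid(np.arange(w, dtype=np.float32),
                                 np.arange(h, dtype=np.float32))
    warped = []
    for frame, flow in zip(frames, flows_to_current):
        map_x = grid_x + flow[..., 0]        # sample position in the neighbour
        map_y = grid_y + flow[..., 1]
        warped.append(cv2.remap(frame, map_x, map_y, cv2.INTER_LINEAR))
    return np.mean(warped, axis=0)           # 'average' mode; median/min/max/plus are analogous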
This project implements RIFE – Real-Time Intermediate Flow Estimation for Video Frame Interpolation for The Foundry’s Nuke.
RIFE is a powerful frame interpolation neural network, capable of high-quality retimes and optical flow estimation.
This implementation allows RIFE to be used natively inside Nuke without any external dependencies or complex installations. It wraps the network in an easy-to-use Gizmo with controls similar to those in OFlow or Kronos.
https://github.com/rafaelperez/RIFE-for-Nuke
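One handy property of a midpoint interpolator like RIFE is that arbitrary retime positions can be approximated by recursive bisection. The sketch below assumes a hypothetical interpolate(frame_a, frame_b) callable that returns the halfway frame (e.g. a wrapper around a RIFE inference call); it is not the gizmo's actual retime logic.

def frame_at(frame_a, frame_b, t, interpolate, depth=4):
    """Approximate the frame at time t in (0, 1) between two frames by
    recursively bisecting with a midpoint interpolator.
    interpolate -- hypothetical callable(a, b) -> halfway frame
    depth       -- bisection steps; 4 gives roughly 1/16-frame precision
    """
    if depth == 0:
        return frame_a if t < 0.5 else frame_b
    mid = interpolate(frame_a, frame_b)
    if t < 0.5:
        return frame_at(frame_a, mid, t * 2.0, interpolate, depth - 1)
    return frame_at(mid, frame_b, (t - 0.5) * 2.0, interpolate, depth - 1)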
A tool that detects, crops, and presents reference & cg spheres
https://www.patreon.com/posts/nuke-auto-ai-96524139
Website link: https://lnkd.in/dr7Xv5C9
Nukepedia: https://lnkd.in/dfRuVtJ8
Github: https://lnkd.in/drXeHcn
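The tool above ships its own detector; as a loose illustration of how reference/chrome balls can be found and cropped, here is a classic Hough-circle sketch in Python with OpenCV. All parameter values are guesses that would need tuning per plate.

import cv2
import numpy as np

def detect_and_crop_spheres(image_bgr, pad=10):
    """Find circular regions (e.g. grey/chrome reference balls) and crop them."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    gray = cv2.medianBlur(gray, 5)                       # suppress noise before detection
    circles = cv2.HoughCircles(gray, cv2.HOUGH_GRADIENT, dp=1.2,
                               minDist=gray.shape[0] // 8,
                               param1=100, param2=40,
                               minRadius=20, maxRadius=gray.shape[0] // 3)
    crops = []
    if circles is not None:
        for x, y, r in np.round(circles[0]).astype(int):
            x0, y0 = max(x - r - pad, 0), max(y - r - pad, 0)
            crops.append(image_bgr[y0:y + r + pad, x0:x + r + pad])
    return crops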
The Cattery is a library of free third-party machine learning models converted to .cat files to run natively in Nuke, designed to bridge the gap between academia and production, providing all communities access to different ML models that all run in Nuke. Users will have access to state-of-the-art models addressing segmentation, depth estimation, optical flow, upscaling, denoising, and style transfer, with plans to expand the models hosted in the future.
https://www.foundry.com/insights/machine-learning/the-artists-guide-to-cattery
https://community.foundry.com/cattery
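Converting a model for the Cattery is documented by Foundry; the usual first step is serialising a PyTorch model to TorchScript, which Nuke's CatFileCreator node then turns into a .cat file. The toy Gain module below is only a placeholder to show that export step.

import torch
import torch.nn as nn

class Gain(nn.Module):
    """Toy stand-in for a real ML model destined for the Cattery."""
    def __init__(self, gain: float = 2.0):
        super().__init__()
        self.gain = gain

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Cattery models typically receive images as 1 x C x H x W float tensors.
        return x * self.gain

scripted = torch.jit.script(Gain())      # serialise to TorchScript
scripted.save("gain.pt")                 # load this .pt in CatFileCreator to write the .cat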
After 12 years of developing and supporting Bokeh, we are excited to announce that the product has found a new home with Foundry.
https://peregrinelabs.com/blogs/news/bokeh-has-a-new-home