Run the Renderman-compatible render engine Aqsis on ARMv6

Reference: http://alvyray.com/Pixar/documents/Pixar_ImageComputer.pdf https://ohiostate.pressbooks.pub/app/uploads/sites/45/2017/09/pixar-image-processor.pdf

Pixar Image Computer. What inspired me to write this article is that last month, in March, I took a quick look at The Renderman Companion: A Programmer’s Guide to Realistic Computer Graphics, which mainly describes how to use the Renderman C API to render 3D images. The book was published in 1989, by which time Pixar had defined the Renderman rendering interface standard and implemented PRMan (PhotoRealistic Renderman), the implementation now widely used in the film industry. One of the interesting things is that in 1986 Pixar also built a computer graphics hardware system similar to today’s Nvidia general-purpose graphics cards: the Pixar Image Computer. ...

April 5, 2022 · 6 min · alexchen

Deep Learned Super Resolution for Feature Film Production

Paper: Deep Learned Super Resolution for Feature Film Production. Recently I have been learning about 3D modeling, rendering, and production, and I came across a paper Pixar published in 2020 about how Pixar’s internal engineers and developers use deep learning to increase the resolution of rendered images. Finely rendering a single 4K frame can be very expensive, so if the final image is instead rendered at 2K and then upscaled to 4K with a deep neural network, the result is more detailed and richer, and the studio saves a large share of the rendering cost. Unlike previous image super-resolution models, the film industry typically works with images containing high-dynamic-range lighting data and requires faithful color reproduction, which makes the current mainstream super-resolution models simply unusable in the film domain. This is an area of interest to me at the moment, as using deep learning for industrial applications is more practical than theoretical. ...

March 14, 2022 · 8 min · alexchen

Renderman History

The Art of Innovation. When we talk about Renderman, we also have to mention Pixar, the studio that makes animated films. Let’s take a look at the history of the studio, which was born out of the need to create realistic images in the digital world. The history of Pixar shows that science and art go hand in hand in the world of film creation, so let’s find out exactly what kind of tool Renderman is. Just as a painter memorizes the color, light, and shadow of the objects he observes around him, and then relies on his hands to present the three-dimensional world on a two-dimensional canvas, Renderman is such a set of tools: we tell it what objects are in three-dimensional space, where the light sources are, and the position and viewing angle of the camera observing the scene. Rendering a picture is of course not enough; we want the viewer to feel that the picture was taken with a real camera, so we must simulate real-world light and color inside the computer, and that is exactly what Renderman does. After the model is built, we can output a scene description file compatible with Renderman, feed that file to Renderman, and get back an image with a realistic texture. Renderman is in fact the specification, plus a series of implementations, of an external interface that separates rendering from modeling so the two can be developed independently: any 3D modeling program that can speak these interfaces can hand its scene description to a renderer implementing the Renderman interface and have the image rendered. ...
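As a rough illustration (my own sketch, not from the original post), the scene description file mentioned above is a plain-text RIB file; a minimal one might look like this:

```
# minimal RIB scene: a lit plastic sphere rendered to a file
Display "sphere.tiff" "file" "rgb"     # where to write the rendered image
Projection "perspective" "fov" 40      # camera projection
Translate 0 0 5                        # push the world 5 units in front of the camera
WorldBegin
  LightSource "pointlight" 1 "from" [2 2 -2] "intensity" 8
  Surface "plastic"                    # a standard Renderman shading model
  Sphere 1 -1 1 360                    # unit sphere
WorldEnd
```

Any renderer implementing the Renderman interface (PRMan, Aqsis, and so on) should accept a file like this, which is exactly the decoupling of modeling from rendering that the post describes.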

February 25, 2022 · 3 min · alexchen

CG319x EIZO

Back in 2020 I wrote a blog post, Renderman24 Will Support Blender, at the end of which I documented my intention to purchase a commercial version of Renderman. After more than a year, I realized that idea: I configured an image workstation for myself, purchased a commercially licensed Renderman, and, most importantly, also bought an EIZO CG319x monitor. The initial reason for preparing all this was mainly the rise of Web 3.0. I envisioned a one-person freelance career. The film industry creates through very large groups of people, and it is very time-consuming for one person to independently complete a live-action film, an animated feature, or even a short film, because there are many technical and artistic stumbling blocks in front of you, and you have to overcome them little by little to complete every detail of the film. Then I thought about the possibility of using algorithms to set up characters and environments, to generate life and scenery inside the computer, which in conventional film production is called modeling, animation, and scene production. Imagine if the computer could complete these creative processes: animals jumping and running, model materials, keying, humans walking. Instead of hand-drawn key frames, manual material tweaking, and manual green-screen keying, the computer would learn from the real world to discover key frames on its own, simulate real-world materials, and precisely key characters out of the picture; then it seems one person could create a wonderful movie without any problem. Of course, this is just an idea, and the technology is not yet so sci-fi. ...

February 16, 2022 · 3 min · alexchen

ACES Color Management

Reference articles: https://docs.blender.org/manual/en/latest/render/color_management.html https://rmanwiki.pixar.com/display/REN24/Color+Management https://www.arri.com/en/learn-help/learn-help-camera-system/camera-workflow/image-science/aces https://github.com/colour-science/OpenColorIO-Configs https://acescentral.com/knowledge-base-2/using-aces-reference-images/ https://opencolorio.readthedocs.io/en/latest/configurations/aces_1.0.3.html

Talk about Gamma Correction.

Table[x^2.2, {x, 0, 1, 0.1}]
{0., 0.00630957, 0.0289912, 0.0707403, 0.133209, 0.217638, 0.325037, 0.456263, 0.612066, 0.79311, 1.}

Table[x, {x, 0, 1, 0.1}]
{0., 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1.}

Table[x^(1/2.2), {x, 0, 1, 0.1}]
{0., 0.351119, 0.481157, 0.578533, 0.659353, 0.72974, 0.792793, 0.850335, 0.903545, 0.953238, 1.}

#^2.2 & /@ Table[x^(1/2.2), {x, 0, 1, 0.1}]
{0., 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1.}

To get a preliminary understanding of the gamma function: as the expressions above show, applying the gamma 2.2 transform after the inverse gamma transform 1/2.2 returns the original values, so the input and output stay in a linear relationship. ...
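The same round trip can be sketched in Python (a minimal illustration of the tables above, not from the original post; the names `encode` and `decode` are my own):

```python
# Gamma round trip with gamma = 2.2, matching the Mathematica tables above:
# decode(encode(x)) recovers the original linear value x.

GAMMA = 2.2

def encode(x: float) -> float:
    """Inverse-gamma (1/2.2) encoding: linear light -> display value."""
    return x ** (1.0 / GAMMA)

def decode(v: float) -> float:
    """Gamma (2.2) decoding: display value -> linear light."""
    return v ** GAMMA

linear = [i / 10 for i in range(11)]            # 0.0, 0.1, ..., 1.0
round_trip = [decode(encode(x)) for x in linear]

# The round trip is the identity up to floating-point error.
assert all(abs(a - b) < 1e-9 for a, b in zip(linear, round_trip))
print(round(encode(0.5), 6))  # ~0.72974, as in the Table[x^(1/2.2), ...] row
```

Mid-gray (0.5) encodes to about 0.73, which is why untagged linear images look washed out on a gamma-2.2 display: the encoding lifts dark values toward the eye's roughly logarithmic sensitivity.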

January 30, 2022 · 13 min · alexchen