What's Mr. Newton Up To?

#1

Keith Newton. It sounds like he’s taking a noteworthy, non-trivial approach to integrating the fancy rendering pipeline with I-Novae’s planetary fundamentals.

Sounds interesting. I haven’t really understood what’s going on at I-Novae beyond the conquering of precision issues and other difficult/subtle engineering, as opposed to the manifest rendering tech like the fancy rendering pipeline / planetary engine.

#2

I am working on some issues we’re having with using realistic quantities of light that have delayed the release of our new rings video.

3 Likes
#3

That’s cool.

#4

Just call it lens flares and be done with it. :stuck_out_tongue:

1 Like
#5

We don’t want to turn him into JJ Abrams now, do we?

#6

I think he’s talking about a representation problem, not a manifest rendering aspect. (I know you’re joking; I just wanted to clarify.)

#7

The issue is a combination of things. To start with, our HDR render buffer is fp16, so right off the bat the luminance of a star far exceeds what fp16 can represent. On top of that, due to the realistic (and massive) range of luminances in the scene, small discrepancies that would be unnoticeable in a regular engine can have a huge impact on our tonemapping algorithm (i.e. causing parts of the final image to be too dark or too overexposed). It gives you a real appreciation for the power and flexibility of the human eye :stuck_out_tongue:
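As a rough illustration of the fp16 ceiling described above (using NumPy's float16 as a stand-in for the GPU's half-precision format; the 1e5 figure is just the ~100,000 lux solar illuminance used later in the thread):

```python
import numpy as np

# fp16 tops out at 65504, far below physically plausible star luminances,
# and the gap between adjacent representable values grows with magnitude.
print(np.finfo(np.float16).max)         # largest finite fp16 value: 65504
print(np.float16(1e5))                  # overflows straight to inf
print(np.spacing(np.float16(32768.0)))  # adjacent fp16 values here are 32 apart
```

So even before tonemapping, values in the solar range either overflow or get quantized coarsely, which is consistent with the small-discrepancy problem described above.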

2 Likes
#8

… so no lens flares. :stuck_out_tongue:

#9

Hah! This is basically the computer rendering equivalent of blooming, which astronomers have struggled with for decades now. Unfortunately, we deal with it in post-processing (images from the SOHO observatory are frequently mined by very amateur UFO hunters who incessantly insist that bloomed-out planets and other bright objects are actually UFOs, either directly or after having been edited out of the images by government censors and disinformation agents. Seriously). It actually excites me that you’re staying true enough to the real world that you’re encountering problems similar to what we have to deal with in, you know, the real world.

I don’t know, well, anything about HDR, but I have to imagine that almost every implementation of it normalizes the output, including yours. Are you normalizing on a linear scale, or a logarithmic scale? If you’re using a linear scale, I’d suggest switching to a logarithmic one. If you’re already using logarithmic scaling, maybe consider logs of logs, or logs of a higher base. A situationally aware scaling might be a viable fix (if maximum brightness exceeds 10,000 lux, use log base 20; otherwise use log base 10 or a linear scale).
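To make the linear-versus-log point concrete, here's a toy sketch (the lux figures are illustrative round numbers, not from the engine; 0.25 lux is roughly a moonlit scene):

```python
import math

sun_lux = 1e5      # approximate solar illuminance at Earth's orbit
scene_lux = 0.25   # something dim, roughly full-moonlight levels
vega_lux = 2e-6    # Vega, per the numbers worked out later in the thread

# Linear normalization: scale so the Sun maps to 255.
linear_pixel = 255 * scene_lux / sun_lux   # ~0.0006 -> rounds to 0, invisible

# Log normalization: spread the exponents from Vega to the Sun over 0..255.
lo, hi = math.log10(vega_lux), math.log10(sun_lux)
log_pixel = 255 * (math.log10(scene_lux) - lo) / (hi - lo)   # ~120, visible
```

On a linear scale, everything much dimmer than the brightest object collapses to black; the log scale keeps the dim end visible.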

I imagine most HDR algorithms are designed for digital photographic cameras, which themselves aren’t designed to be pointed at the Sun, so the level of scaling necessary to both normalize the scene and still make dim (relative to the Sun) objects visible wouldn’t even be considered.

4 Likes
#10

We are using a logarithmic scale.

1 Like
#11

Is the Kickstarter still mid-July? The vid still hasn’t come out.

#12

Well, just thinking out loud here, since I obviously don’t know the details of the issue you’re dealing with, but allow me to go on at some length here. This is all going to be highly simplified compared to what you’re dealing with, I’m sure, but if there’s anything in here that helps even in the slightest, you’re welcome to it.

Say you wanted to be able to see the stars in the background from Earth’s orbit while looking at the Sun. The Sun has a magnitude of -26.75, and Vega, one of the brightest stars (and the reference star for the current magnitude scale) has a magnitude of 0. Because of the wonky calculation used to determine astronomical magnitudes, this means that the Sun appears to be about 5e10 (50 billion) times brighter than Vega.

From Earth’s orbit, the Sun delivers ~100,000 lux (lumens per square metre; this is equivalent to ~150 W/m^2) in the visible band, and this makes up ~11% of the Sun’s actual output (bolometric values (i.e. values integrated over all wavelengths) are ~1360 W/m^2). This means that Vega has a visible brightness of approximately 2e-6 lux (3e-9 W/m^2).
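The magnitude-to-brightness conversion above can be checked directly (the formula is the standard astronomical one; a 5-magnitude difference is a factor of 100):

```python
def brightness_ratio(m_faint, m_bright):
    # Magnitudes are reverse-logarithmic: each 5 magnitudes is a factor of 100.
    return 10 ** ((m_faint - m_bright) / 2.5)

ratio = brightness_ratio(0.0, -26.75)   # Sun vs Vega: ~5e10
vega_lux = 1e5 / ratio                  # ~2e-6 lux in the visible band
```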

On a base 10 logarithmic scale, the lux values for these two stars end up looking like:

Sun: log10(100,000) = 5
Vega: log10(2e-6) = -5.7

That’s a pretty big range to deal with, especially given that Vega is one of the brightest stars we can see in our night skies. We have a natural upper boundary, where we can set the Sun’s (monochromatic) pixels equal to 255, but we can’t set a lower cutoff, since we want Vega to be visible. I don’t know how low a pixel brightness would be needed for a star to be considered “visible” on a 0-255 scale, but let’s pull a value out of thin air to act as our lower boundary: 10. We need to decide how faint something can be and still have a pixel brightness of 10 or more.

According to Wikipedia, there are ~500 stars in the sky brighter than magnitude 4. This is also the typical visible cutoff in smaller urban centres. Let’s call this the cutoff, then. This will produce the absolutely surreal experience of seeing the typical urban night sky, and the Sun on screen at the same time.

Stars at m = 4 are 2 trillion times fainter than the Sun, and so have a value of ~5e-8 lux.
log10(5e-8) = -7.3

This maps 5 -> 255 and -7.3 -> 10. Using a linear mapping, we find a slope of ~20, and an intercept of ~155. This gives Vega a pixel brightness of ~40. Only 4 stars (other than the Sun) will appear brighter, with Sirius peaking somewhere in the neighbourhood of 55.
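The mapping above is easy to reproduce (using the Sun and a magnitude-4 floor as the two anchor points, exactly as described):

```python
import math

sun_lux = 1e5
floor_lux = sun_lux / 10 ** ((26.75 + 4) / 2.5)   # magnitude-4 star, ~5e-8 lux

hi, lo = math.log10(sun_lux), math.log10(floor_lux)  # 5 and ~-7.3
slope = (255 - 10) / (hi - lo)                       # ~20
intercept = 255 - slope * hi                         # ~155

def pixel(lux):
    return slope * math.log10(lux) + intercept

vega = pixel(2e-6)       # ~42
sirius = pixel(7.7e-6)   # ~53, the brightest star after the Sun
```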

This seems reasonable, but what does this do for nearby solar system objects? What if, say, we had a planet at Earth’s orbit with rings comparable to Saturn’s?

The particles in Saturn’s rings have an albedo of (very rough average) 0.45, so of the 100,000 lux falling on them, they would reflect 45% of it. For the sake of simplicity, let’s model the rings as a solid disk, and account for all of the empty space in them by assuming they’re ~90% empty, giving a total albedo of 0.045. This means they’re reflecting back 4500 lux. Since these are nearby, we have to worry about the inverse square law now, which means we need to pick a distance. Let’s say the rings are as large as Saturn’s, with an inner radius of ~65,000 km, and an outer radius of ~135,000 km. This gives them an area of ~4.4e10 km^2. If we want the ringspan to fit within, say, 60 degrees, we would need to be ~235,000 km away from the planet.

The resulting brightness of those rings will be ~8e-14 lux coming from each square metre of those rings. We now need to know how many square metres there are per pixel. Let’s assume the FoV is 60 degrees (so that the rings span the entire monitor; this now gets into the unrealistic, except for maybe in multiple star systems, situation where we have front-illuminated rings, with the Sun in the background, but what the hell; let’s have fun), and let’s assume standard full HD resolution with square pixels so we can measure the pixel sizes. Let’s also assume we’re looking at the ring system face on, for simplicity. The FoV is 60 degrees horizontal, so we have 1920 pixels spanning 270,000 km. This means each pixel is 140 km * 140 km, or 1.96e10 square metres.

This means each pixel is giving off ~0.001568 lux. log10(0.001568) = -2.8.

Each pixel in the ring, then, should have a monochromatic brightness of ~100. That seems reasonable.
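The per-pixel ring flux can be reproduced as follows (using the same rough flux/d^2 shortcut as the estimate above, with no Lambertian factor):

```python
# Per-pixel ring brightness, following the back-of-envelope above.
d = 2.35e8                     # distance to the planet, in metres
exitance = 1e5 * 0.45 * 0.10   # lux * particle albedo * fill factor = 4500 lm/m^2
lux_per_m2 = exitance / d**2   # ~8e-14 lux reaching us per ring square metre
pixel_side = 2.7e8 / 1920      # 270,000 km across 1920 px -> ~140 km per pixel
ring_lux = lux_per_m2 * pixel_side**2   # ~1.6e-3 lux per pixel
```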

What about the planet itself? Let’s assume the planet is actually Neptune (because blue is pretty), with an albedo of 0.29. We’ll only consider the lit pixel values, since the geometry of this fabricated scenario is starting to become slightly Lovecraftian.

Each illuminated square metre of Neptune-at-Earth’s-orbit will reflect back 29,000 lumens (lux*m^2). At our distance of 235,000 km, each square metre will deliver unto us ~5.25e-13 lux. This means each pixel will produce ~0.01 lux. log10(0.01) = -2.

This spits out a pixel brightness of 115. This is a nearly negligible difference from the brightness of the rings, even though each square metre of the planet is reflecting back well over 6 times as much light as each square metre of the rings. This is a bit wonky, but if you look at pictures of Saturn, the rings don’t appear to be that much dimmer than the planet itself.
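Feeding both results through the log mapping derived earlier in this post (slope ~20, intercept ~155 on log10 lux) reproduces the two pixel values:

```python
import math

slope = 245 / 12.3          # (255 - 10) over the log-lux range 5 to -7.3
intercept = 255 - slope * 5

def pixel(lux):
    return slope * math.log10(lux) + intercept

rings = pixel(1.6e-3)    # ~100
planet = pixel(1.0e-2)   # ~115
```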

The contrast between the planet and the rings can also be increased by taking stars of a lower magnitude as your floor, but the logarithmic scale is inherently insensitive to these changes in the mid ranges. For instance, raising the floor from magnitude 4 stars to magnitude 3 stars causes the difference in pixel brightness between the planet and the rings to increase from 15 to 17. Setting the floor at magnitude -1 (so that only Sirius is visible in the background) still only increases the difference to 19.
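The insensitivity to the floor choice can be verified numerically (same anchors as before; the 1.0e-2 and 1.6e-3 lux figures are the planet and ring per-pixel values from above):

```python
import math

# How raising the floor magnitude changes planet-vs-ring contrast.
def contrast(floor_mag):
    floor_lux = 1e5 / 10 ** ((26.75 + floor_mag) / 2.5)
    slope = (255 - 10) / (5 - math.log10(floor_lux))
    px = lambda lux: slope * (math.log10(lux) - 5) + 255
    return px(1.0e-2) - px(1.6e-3)   # planet pixel minus ring pixel

# contrast(4) ~ 16, contrast(3) ~ 16.5, contrast(-1) ~ 19:
# big changes to the floor barely move the mid-range separation.
```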

Switching all of the calculations to log20 (and keeping the floor at magnitude 4) brings the difference up to 21 (planet: 103, rings: 82), while keeping Vega’s brightness approximately the same. log30 increases the gap to ~24 (planet: 96, rings: 72), again while keeping Vega’s brightness the same.

So, switching up the base of your logarithms can help deal with the insensitivity of logarithmic scales in those middle regions. I don’t know how computationally expensive it is, though.

3 Likes
#13

Just figure at least three months after the new planetary rings video comes out. Shoot, I’m happy with Q4-Q1 but that is just me thinking out loud.

#14

No I’m afraid not.

#15

What you’re proposing wouldn’t work, because then all blending would be done in log space, which is incorrect. The light buffer must contain linear values, and fp16 is the best option for that due to the tremendous bandwidth requirements of fp32 (not to mention the lack of support on older hardware).
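A small sketch of why blending has to happen on linear values: additive blending of two overlapping lights of 100 units each should give 200.

```python
import math

a, b = 100.0, 100.0
linear_sum = a + b   # 200 -- what additive blending expects

# Adding log-encoded values instead multiplies the underlying
# luminances: 10^(2 + 2) = 10000, wrong by a factor of 50.
log_sum = 10 ** (math.log10(a) + math.log10(b))
```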

1 Like
#16

I see. The fact that the blending has to be done in linear space is bad enough; I see how being limited to fp16 on top of that is a huge hurdle. This simply isn’t an issue we usually have to deal with in astronomical data processing – it’s certainly outside the realm of my experience! – so I don’t have any tricks of the trade that would be of help.

What kind of range of luminances are you looking at (and no, I don’t expect you to answer; I know you’re developing commercial tech. My questions are basically rhetorical)? Because stars on the scale of the inner solar system are just going to be universally blinding. What’s preventing you from just defining some L_max that is low enough to allow for proper blending, and setting any value > L_max equal to L_max? That seems like the most straightforward solution. Not using a simple cutoff like that seems to suggest that you lose something significant that way. Proper reflections, maybe?
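A minimal sketch of the clamp being proposed, assuming luminance values are clamped before being written to the fp16 buffer (the L_MAX value here is made up; NumPy's float16 stands in for the GPU format):

```python
import numpy as np

L_MAX = np.float16(6.0e4)   # hypothetical cap, just under the fp16 ceiling of 65504

def clamp_to_fp16(luminance):
    """Clamp linear luminance values before writing to an fp16 buffer."""
    return np.minimum(luminance, L_MAX).astype(np.float16)

buf = clamp_to_fp16(np.array([1.6e9, 1e5, 10.0]))   # star, sun-ish value, asteroid
# The first two values collapse to the same ceiling: every relative
# intensity above L_MAX is lost, which may be exactly the catch.
```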

The only other thing that comes to mind is to consider just straight up filtering. We photograph the sun by using extremely dark filters, so that we can avoid burning holes in the CCDs. Cutting out 99.999% of the light up front is the easiest way to do this. That doesn’t really help when you’re trying to blend a scene with a 100,000 lux star and a 10 lux asteroid. Your star ends up with a comfortable 1 lux getting through, but the asteroid is completely blotted out.

#17

Okay, I’ll pipe in here too, but I have no experience, just ideas. :stuck_out_tongue: I’m not even sure I’m adding to the discussion, but I thought I’d try, so humor me for a minute:

Looking at another color space might help more – wouldn’t changing the buffer to use YUV with, say, a 4:1:1 ratio allow more brightness information to be written? Of course, you’d still have to map that to RGB at some point to finish the image for the monitor, but it would enable higher precision without using more bits (although of course it is not an exponential increase, just a simple multiple).

That just moves the problem further out, so to speak, but could that be good enough?

EDIT: Question: why can’t you blend in log? Shouldn’t you be able to transform the blending algorithm to the log space through the power of math?

#18

Blending is done by the hardware using the fixed-function pipeline; in other words, it isn’t customizable the way a shader is. It is an old dream of mine that one day hardware vendors will implement a “blend shader” that would allow one to write custom blending operations; that way we could work in non-linear spaces…

2 Likes
#19

Man, that is brutal! That clears up some questions I’ve always had about how the sun is displayed in games, though. Like, for instance, why it never really seems to be much brighter than a 100 W light bulb.

#20

Oooh, ouch. Brutal is the word for that.