Cold jets, obviously.
In one of the livestreams they said the moons, planets, station are fixed at the moment. They want them to be on the daylight side for the prototype.
My point is that I don’t want the gameplay to have to explain away what I’m doing with lore and special rules. I want the natural physics to do it, because it’s way cooler that way and it feels like real spaceflight. The best part of space sims is the immersion, and anything that breaks it sucks. Obviously not everything will be possible, and I know N-body is one of those things, but sitting still in space with your engine on while being undetectable is one of those immersion-breaking things for me.
@Seuche planets always have a daylight side
I know you meant the stations and the moons not to be eclipsed though
Hm. That’s a shame.
I mean, you could just have your “eyes” adjust to the light and pick up some nice amplified colors from light bounced off the other moons onto the dark side.
They already adjust to lower brightness.
Yeah, I wouldn’t mind if the Battlescapes ended up on the dark side. But for the streamers to see something without waiting for the moons to rotate they fixed them in place. It’s a shame.
Especially when you know how amazing the area around the terminator looks due to the light angle, atmospheric scattering and shadows
Yeah, there aren’t enough moons right now to guarantee at least one is always bouncing enough light for your eyes to adjust to, I guess?
There would always be at least one moon with a dark side regardless. Like Seuche said, it was just so all the lighting looks good on camera for streamers and the media… it’s impossible to see anything on the dark side of planets when recording. For example, look at any KSP screenshots ever: they’re a nightmare even with post-processed brightness (which is hard to do for a stream).
I’m talking about a moon that has all direct light eclipsed by the gas giant, but that gets light bounced between the planet and the moons around it.
It’s a small amount, but your eye adjusts.
Agreed, your eye would adjust. Unfortunately the cameras we make are nowhere near as good as the human eye. In a real-life situation this wouldn’t be a problem, but when trying to record it’s a huge issue.
For example, go outside during a full moon. You can see fine; now try to take a picture with the flash off and tell me what you see. This also works in a dark room that you can still see in, if you don’t want to wait. We’re dealing with cameras and recording software, not real eyes.
Unfortunately from a strictly realistic POV, that’s still probably not feasible.
I don’t see anything about cold gas thrusters, but assuming PR is as thorough as usual, that’s not a meaningful omission.
Sad but true… So let’s just have fun crafting something as realistic as (great) gameplay allows, e.g. assuming that life support etc. is ignored and only actual combat systems count (energy, propulsion, weapons, comms, etc.).
I’m talking about the artificial “eyes” that are rendering the scene and displaying it to the monitor, not your IRL ones looking at the screen.
I know, I figured there would be liberties taken to make some sort of detection mechanics work; otherwise everyone is always going to be detected. As far as cold gas thrusters go, the exhaust would need to be below the ~2.7 K of the CMB or it could be seen above the background noise level, and gas that cold makes a very poor propellant, ignoring the fact that you now have to carry fuel as well.
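The CMB point can be made concrete with a quick Stefan–Boltzmann estimate. This is my own back-of-envelope sketch, not anything from the game, and `radiant_exitance` is just a helper name I made up:

```python
# Stefan-Boltzmann law: a blackbody emits j = sigma * T^4 watts per square
# metre, so anything even slightly warmer than the ~2.7 K background glows
# brightly by comparison. Temperatures below are illustrative.
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W / (m^2 K^4)

def radiant_exitance(temp_k: float) -> float:
    """Total blackbody power emitted per square metre at temperature temp_k."""
    return SIGMA * temp_k ** 4

cmb = radiant_exitance(2.725)    # cosmic microwave background baseline
plume = radiant_exitance(100.0)  # even a very cold 100 K gas plume
hull = radiant_exitance(290.0)   # room-temperature hull

print(f"CMB background: {cmb:.3e} W/m^2")
print(f"100 K plume:    {plume:.3e} W/m^2  ({plume / cmb:.0f}x background)")
print(f"290 K hull:     {hull:.3e} W/m^2  ({hull / cmb:.0f}x background)")
```

Since the ratio goes as the fourth power of temperature, a 100 K plume is already about a million times brighter than the background, which is why cooling your way to stealth doesn’t work.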
Maybe one choice is to just craft some fun ships to fly, and then recalibrate our gameplay toolbox (sensors of all kinds) away from that cosmic ~2.7 K baseline to something roughly matching the full range of emissions for all ships/objects in the game (e.g. from the smallest/coldest object in its coldest, powered-off state up to the hottest ship running at maximum).
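That recalibration could be as simple as a log-scale remap between the game’s own floor and ceiling. A rough sketch, where `detectability` and every number in it are hypothetical:

```python
import math

def detectability(emission_w: float, floor_w: float, ceiling_w: float) -> float:
    """Map a ship's emitted power onto [0, 1] between the game's floor
    (coldest powered-off hulk) and ceiling (hottest ship at full burn).
    Log scale, since emissions would span many orders of magnitude."""
    lo, hi = math.log10(floor_w), math.log10(ceiling_w)
    x = (math.log10(emission_w) - lo) / (hi - lo)
    return min(1.0, max(0.0, x))

print(detectability(1.0, 1.0, 1e6))  # powered-off hulk -> 0.0
print(detectability(1e3, 1.0, 1e6))  # mid-range cruise -> 0.5
print(detectability(1e6, 1.0, 1e6))  # full burn -> 1.0
```

The upside of anchoring to the in-game range instead of the real CMB is that “running cold” becomes a meaningful gameplay choice rather than a physical impossibility.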
Well, maybe there’s some sort of future cloaking technology on every ship, maybe the type of propulsion, maybe warp is emissionless, maybe the materials the ships are made of are special. Maybe we could have heat mechanics too, radiators and all that.
Exactly. If realism is clearly not an option, then let’s create an awesome gameplay framework, and then map sensor abilities to that.
Space is huge. You can’t visually see a ship past a few km, and the two moons in the prototype are probably a million km apart. No “natural”, eye-simulating brightness adjustment will change that.
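For scale, here’s a rough small-angle estimate, assuming ~1 arcminute of human visual acuity (the exact cutoff depends on ship size and brightness):

```python
import math

ARCMIN_RAD = math.radians(1 / 60)  # ~2.9e-4 rad, rough human visual acuity

def max_resolve_distance_m(ship_length_m: float) -> float:
    """Distance beyond which a ship subtends less than one arcminute and
    collapses to an unresolvable point (small-angle approximation)."""
    return ship_length_m / ARCMIN_RAD

for length in (10, 100, 1000):
    print(f"{length:>5} m ship -> point past ~{max_resolve_distance_m(length) / 1000:.0f} km")
```

Even a kilometre-long ship drops to a point dot within a few thousand km, against moons roughly a million km apart, so visual detection alone can never carry the gameplay.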
More than likely you got too close to the atmosphere while going too fast and bounced off.
but N-body! What about N-body, Keith. Don’t kill my dreams
Personally I would want it to be as realistic as possible, but that’s of course not a reasonable expectation as it severely reduces accessibility.
Because planets, moons and space stations will be on rails, I think a good choice would be an influence-based + N-body hybrid: when you are close enough to one of the “rail” objects, you fly in a 1-body system plus that object’s rail data.
For example:
- in low Earth orbit you have only (2), and you move with Earth on its rail around the Sun
- halfway to the Moon you have (2)+(3), and you move with Earth on its rail around the Sun
- halfway to Mars you have (1)+(2+3)+(4)+(5+6), a 4-body simulation; Earth and Jupiter each count as one point with the mass of the planet plus its moons