Battlescape on Oculus Rift?

Hey everyone,

Just curious - would you like to see Oculus Rift support implemented in Infinity: Battlescape?

I posted about this on the Oculus VR forums under other engine integrations.

2 Likes

Yes, but no.

Yes, I would like to see it implemented if they do it right, but no, I don’t think they should spend the time doing so.

1 Like

I’m not against it being supported, but I don’t see it as a priority.

Maybe a Kickstarter stretch goal? “Give us X+1 dollars and we’ll hire someone to implement OR support.”

Of course, it all depends on whether the devs intend to go the whole nine yards and plop a “pilot” in the cockpit, though I wouldn’t be surprised if that was planned.

Maybe a stretch goal itself?

Flavien has the OVR development kit, and he did “implement” it. He showed us fellow devs a video of it running in-engine. However, I’m not sure where we will take this. It’s not out of the realm of possibility that it becomes a stretch goal, but I’m not sure we have spent the time to think about how it could affect the gameplay we desire in I:B. Our time is spent doing a lot of other things atm…

It seems to me an OVR game should be “OVR or bust” in terms of gameplay/UI? I don’t know myself; I have not had the chance to try it. I believe Keith tried EVE: Valkyrie with OVR at GDC.

1 Like

Hey @INovaeGene, thanks for the heads-up on this.

I really am excited for this game and the development tools. I would love to play this game with Dev Kit 2 or Consumer Version 1 of the Oculus Rift.

Fully agree with you on this being OVR or bust. The UI would need to be redesigned for the Rift, both for the immersion factor and to accommodate the hardware itself.

@INovaeKeith - What were your thoughts about EVE: Valkyrie? I am keen to hear.

It was pretty cool but I think it still has a long way to go for practical use. The resolution is still much too low and making it any higher requires players to have a seriously beefy machine which may not be a problem for early adopters but certainly would be a barrier for mass market adoption. Also vertigo remains a problem as I found myself getting motion sickness.

As far as Valkyrie is concerned I found it visually lackluster, a problem that is magnified by OVR’s immersiveness, and incredibly boring to play.

Overall I walked away with the feeling that we should de-emphasize OVR support as it’ll take significant resources for the UI and I don’t see it as being anything more than a niche novelty for quite some time.

1 Like

I will trust your judgement, as I do not have a Dev Kit yet and so cannot comment on the Oculus.

Thanks for your response,

Keith

As Gene said, I have a dev kit at home and have already implemented it natively in our engine.

The main issue is performance. The Rift requires rendering to an upscaled (high-res) distorted buffer so that the distortion correction the Rift applies is cancelled out. This means that for a final resolution of 1920x1080, we might have to render the scene into a much higher-res buffer (something like 2500+).

Given that at the moment we’re very much GPU/shader/fillrate limited, increasing the resolution affects the framerate… a lot :frowning: And the scene has to be rendered twice, once per eye.

A very high-end top-of-the-line system would hardly be able to maintain 30 fps with all the details.
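To put rough numbers on that, here’s a minimal sketch of the per-eye buffer math; the 1.3x oversampling factor and the helper names are purely illustrative placeholders, not official SDK values:

```cpp
#include <cstdio>

// Hypothetical helper: estimate the size of the per-eye render target needed
// so that, after the Rift's barrel-distortion pass, the centre of the image
// still maps roughly 1:1 onto the panel pixels. The 1.3x scale is an
// illustrative value, not an official SDK number.
struct EyeBufferSize { int width; int height; };

EyeBufferSize EstimateEyeBuffer(int panelWidth, int panelHeight, float renderScale)
{
    // Each eye gets half of the panel horizontally (e.g. 960x1080 on a
    // 1920x1080 display), then we oversample to counter the distortion.
    return { static_cast<int>(panelWidth / 2 * renderScale),
             static_cast<int>(panelHeight * renderScale) };
}

int main()
{
    const int panelW = 1920, panelH = 1080;
    const float scale = 1.3f;                      // assumed oversampling factor
    const EyeBufferSize eye = EstimateEyeBuffer(panelW, panelH, scale);

    // Two eyes rendered separately: total shaded pixels vs. a flat 1080p frame.
    const double vrPixels   = 2.0 * eye.width * eye.height;
    const double flatPixels = double(panelW) * panelH;
    std::printf("per-eye target: %dx%d, fill-rate cost vs 1080p: %.0f%%\n",
                eye.width, eye.height, 100.0 * vrPixels / flatPixels);
    return 0;
}
```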

There’s also the topic of GUI and controls. When you’re “inside” the Rift, you can’t see your hands on the keyboard/mouse anymore, and a game such as Battlescape may require a lot of keys. There are only so many buttons on a gamepad, and other tracking devices such as the Razer Hydra wouldn’t really fit our game type all that well (plus that’d be another device to buy…).

Solving all of these issues is non-obvious, especially when our plates are already so full :slight_smile:

4 Likes

Ah ok, fair enough.

But at least it is in the engine for other games that may use it. :smile:

I don’t think any of the above would be a major concern to a very large fraction of current OR users, but yeah, it’s a lot of why I don’t expect it to become mainstream for a long time yet.

Speaking of input methods, though: I can see why handling interplanetary travel and possible intercepts would be tricky without something like a full keyboard’s worth of buttons, but I’d personally quite appreciate it if at least the actual dogfighting could be handled with a joystick+throttle combo or dual joysticks (assuming either option has at least a dozen or so buttons between them in addition to 5-6 axes of control).

Great, I’m going to be playing with four year olds. >:(
Spellchecker has a problem with adding an “S” to old? Oh and it’s spellchecker and not spellchek.

I was under the impression that each eye was going to be 960x1080. It would still be two scenes being rendered, but each is actually half the resolution of the total. I saw someone mention it was around a 20% workload increase on the first development kit (which is lower-res), so definitely >20% on the final version depending on optimizations and improvements. I don’t think you’ll be seeing a 200% increase or anything on that order, though.
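For reference, a quick sketch of the arithmetic behind that estimate (the figures are just the ones quoted in this thread, not measurements):

```cpp
#include <cstdio>

int main()
{
    // Back-of-the-envelope numbers for the ">20%, not 200%" argument above.
    const long long perEye = 960LL * 1080;   // one eye at native panel resolution
    const long long both   = 2 * perEye;     // 2,073,600 pixels total
    const long long flat   = 1920LL * 1080;  // a normal 1080p frame: also 2,073,600

    // Pixel counts match exactly, so raw shading/fill cost is roughly the same;
    // the extra overhead comes from submitting the scene twice (CPU, draw calls)
    // and from any oversampling added to compensate for lens distortion.
    std::printf("stereo pixels: %lld, flat pixels: %lld\n", both, flat);
    return 0;
}
```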

I wouldn’t let this stop you. There are creative people doing things like SoftKinetic out there:

Programming this kind of thing as a cockpit overlay that also recognizes simple things like a keyboard, mouse, and joystick could alleviate the loss of awareness of the outside world. It’s still a ways off, I imagine, but you can see where people are going with it.

Overall, implementing any sort of VR support is probably going to be a sticking point, since it’s become quite the buzzword for a community that, while not small, isn’t completely mainstream despite recent high-profile news items. I don’t think it should be at the top of your priorities given that the hardware still has yet to come out, but I think it should definitely be somewhere in the stretch goals.

I’m not sure I agree, as the gameplay has to be designed for it. I think either a game is an OVR game from the start, or not at all. But I’m a VR newb, so…

I’d love to see OR support… but I agree it needs to be done well. The UI is a major factor in that: if you’re shooting for a minimal UI, or all the info being displayed on the cockpit interior (in large, obvious writing and pictures), then it would work great; otherwise, not so much.

Some games have hacked in VR support really well; Lunar Flight and Live for Speed spring to mind as two I’ve really enjoyed. You should definitely get around to trying the Rift out when possible.

I think you’re overestimating what needs to be done for VR in this case.

Cockpit-oriented simulators are perfect for OVR at the moment. You don’t have to simulate full body movement. You don’t even really have to simulate hand movement either. You just have to give players a wider and more natural field of view from a virtual chair.

Check out an Elite: Dangerous YouTube video where someone looks down at the hands on the joystick and throttle… they stay completely static while the player flies around. They also didn’t work in realistic physical control panels or anything that you have to interact with; it’s all holographic in nature, though tied to the cockpit and not the player’s point of view.

The bare minimum for me as a player is to have my head placed in the cockpit and not be looking at it through a window. I don’t need fancy buttons and cool-looking arms that I’m not going to be paying attention to, because frankly I’m going to be too busy trying to dodge relativistic lead on the other side of the window. It’s the technology that does this, and it will add its own dimension to playing.

Also, since it’s a hot buzzword, putting it further down the list in your stretch goals will give the VR junkies a reason to spend a bit more :slight_smile:

A year later and I’m resurrecting this topic… the Oculus Rift was just announced for release in Q1 2016.

http://www.anandtech.com/show/9238/oculus-targets-q1-2016-for-consumer-oculus-rift

Still don’t think support will make it to Infinity?

When we had this argument on the old forums (yes, years ago), I think the issues @INovaeFlavien had with the OR were that the dev version had low-quality screens/lenses (the beauty of the render was lost), that there was a need to filter the view and do additional rendering to account for the curve of the lenses and the quality (or something to that effect), and that having to render two views was problematic for the then-“current” GPUs.

That’s not a good reason to just slap it into a stretch goal IMO, if we feel it has broader implications for our budgeting/available resources/game design.

Not a lot of keys. A lot of functions. How those functions are accessed is a matter of UI design. Mapping each function to a distinct key combination is the traditional and terrible way of designing the UI.

Functions that are unrelated to the current task don’t need to be bound to keys. Functions that are secondary don’t need to be on a single key press. And so on. What’s important is that players can get to the functions that they need when they need them.

If I’m in combat, I don’t need sensor controls. I don’t need navigation controls. I don’t need inventory controls. Yet the functionality for each of those systems is undoubtedly mapped onto the keyboard in some funky way. The more peripheral the task, the more obscure the key binding. ARMA 3 is an example of a game that does this badly. Diablo 3 does this well.

If I’m in combat, then I want easy access to my combat functions. Map them onto the keyboard and mouse so that I can leave my hands at their home positions and never look down. For any reason. This is not a VR consideration, but one of enjoying the game. I don’t want to be looking at my keyboard, I want to be looking at the game.

If there are more functions than can fit into the control scheme, then change the game. Reduce the number of functions. I think that being able to play the game in a natural way is more important than having the ability to adjust the power level for the 16 shield sectors on my fighter in 1% increments. Stick to the essential functions of combat and map them onto the keys that the players can get to.

Remember that you can chord keys as well. Or have sequences of keys. Such things are appropriate to non-critical functions. I would assert that the more such things are used, the more it’s likely that the gameplay is overdone. The game is probably not “easy to learn and difficult to master”, but just complicated to control.

Break up the game’s functions by modes or activities. I don’t need my navigation controls while in combat. When I’m navigating, I don’t need the combat controls. I’m not going to suddenly launch a missile while I’m figuring out where to go. I may need to suddenly switch modes, so that should be natural and quick, but if I have mutually-exclusive activities, then they don’t need to be mapped to the keyboard at the same time. The navigation interface should use the same keys as the combat interface. Players should get used to the various key sequences or key chording that the game uses, all from one home position.
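A minimal sketch of what that per-mode mapping could look like; the types and bindings below are hypothetical, not anything from the actual engine:

```cpp
#include <functional>
#include <map>
#include <string>

// Illustrative sketch only: one binding table per activity mode, so Combat and
// Navigation can reuse the same physical keys without clashing.
enum class Mode { Combat, Navigation };

using Action       = std::function<void()>;
using BindingTable = std::map<std::string, Action>;   // key name -> action

struct InputRouter
{
    Mode active = Mode::Combat;
    std::map<Mode, BindingTable> bindings;

    void OnKey(const std::string& key)
    {
        auto& table = bindings[active];
        if (auto it = table.find(key); it != table.end())
            it->second();                              // only the active mode sees the key
    }
};

int main()
{
    InputRouter router;
    router.bindings[Mode::Combat]["Space"]     = [] { /* fire primary weapon */ };
    router.bindings[Mode::Navigation]["Space"] = [] { /* confirm waypoint    */ };

    router.OnKey("Space");            // fires the weapon
    router.active = Mode::Navigation; // quick, explicit mode switch
    router.OnKey("Space");            // sets the waypoint instead
    return 0;
}
```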

Voice might be a good way to change control modes. Making a unique color-coded HUD for each control system would help as well.

As for gesture systems, they have their uses, but a fighter cockpit is not one of them.

2 Likes

I would just like to add that a menu system can also do wonders for reducing the number of keys that need to be mapped. Elite: Dangerous does a pretty good job of this: you have half a dozen or fewer keys used for menu navigation, and with proper menu design you can have access to hundreds of “controls” within a rather shallow and easily navigable menu tree.

Or go old school and do what Klingon Academy did: number keys open up menus, subsequent number keys open the next layer, and so on. You feel like a well-oiled machine that knows every single command out of hundreds after finishing that game.
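Roughly along these lines (a sketch only, with made-up menu contents):

```cpp
#include <cstdio>
#include <string>
#include <vector>

// Rough sketch of the number-key menu idea described above: each digit either
// opens a submenu or triggers a command, so a handful of keys can reach
// hundreds of functions.
struct MenuNode
{
    std::string label;
    std::vector<MenuNode> children;   // empty => leaf command
};

// Walk the tree with a sequence of digit presses (1-based, as on the keyboard).
const MenuNode* Navigate(const MenuNode& root, const std::vector<int>& presses)
{
    const MenuNode* node = &root;
    for (int digit : presses)
    {
        if (digit < 1 || digit > static_cast<int>(node->children.size()))
            return nullptr;           // invalid key for this layer
        node = &node->children[digit - 1];
    }
    return node;
}

int main()
{
    const MenuNode root{"Root", {
        {"Weapons",    {{"Fire torpedo", {}}, {"Cycle target", {}}}},
        {"Navigation", {{"Set waypoint", {}}, {"Match velocity", {}}}},
    }};

    // Pressing 2 then 1 drills into Navigation -> Set waypoint.
    if (const MenuNode* cmd = Navigate(root, {2, 1}))
        std::printf("Selected: %s\n", cmd->label.c_str());
    return 0;
}
```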