Posts

Showing posts from 2013

Pulse Lasers

Finally I have implemented the Pulse Laser onboard system. This was an exciting build, as it leverages a number of features I've been wanting to play with.

Ship System

Each Pulse Laser attached to the ship is a ship system, and has a power capacitor which draws energy from the main power unit. When you fire, the laser fires continuously until the capacitor runs out of juice. After that, the rate of fire is limited to the capacitor's recharge rate. What this means is that if you let the lasers fully charge, you get an initial ultra-burst of pulse laser fire, followed by a slower volley that can be sustained indefinitely.

Rendering

It's important to batch objects. A GPU can draw thousands of lasers in the blink of an eye, as long as you submit them in one or two draw calls. If you draw each laser individually, you'd easily burn through your 33 ms frame budget, or more. In Pegwars, a Python pulse-laser fire event writes the laser into a single dynamic vertex buffer.
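Here's a minimal sketch of how a capacitor-fed weapon like this could tick. The names (Capacitor, recharge_rate, cost_per_shot) are my own illustration, not Pegwars' actual code:

```python
class Capacitor:
    """Energy store that drives a pulse laser's rate of fire."""

    def __init__(self, capacity, recharge_rate):
        self.capacity = capacity            # max stored energy
        self.recharge_rate = recharge_rate  # energy drawn from the power unit per second
        self.charge = capacity              # start fully charged

    def update(self, dt):
        # Trickle-charge from the main power unit each frame.
        self.charge = min(self.capacity, self.charge + self.recharge_rate * dt)

    def try_fire(self, cost_per_shot):
        # Fire only while energy is banked: a full capacitor allows an
        # initial burst, after which the sustained rate is capped at
        # recharge_rate / cost_per_shot shots per second.
        if self.charge >= cost_per_shot:
            self.charge -= cost_per_shot
            return True
        return False
```

With, say, capacity 100 and cost_per_shot 10, a full capacitor gives a ten-shot opening burst, then sustained fire settles at recharge_rate / 10 shots per second.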
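And the batching side: every fire event appends one laser to a shared CPU-side array that is uploaded to a single dynamic vertex buffer and drawn with one call. The numpy layout is mine, and the upload/draw calls stand in for whatever your DirectX wrapper actually provides:

```python
import numpy as np

MAX_LASERS = 4096
VERTS_PER_LASER = 2  # a line segment; could be expanded to a quad in the shader

# One big CPU-side staging array, re-filled each frame.
verts = np.zeros((MAX_LASERS * VERTS_PER_LASER, 3), dtype=np.float32)
count = 0

def on_pulse_laser_fired(start, end):
    """Python fire event: append one laser to the batch."""
    global count
    if count < MAX_LASERS:
        verts[count * 2] = start
        verts[count * 2 + 1] = end
        count += 1

def flush(device):
    """Upload the whole batch and issue a single draw call."""
    global count
    device.update_dynamic_vertex_buffer(verts[:count * 2])  # hypothetical API
    device.draw_lines(count * 2)                            # hypothetical API
    count = 0
```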

Oculus Rift Integration

There's still a lot of work to do to get the Rift running smoothly; specifically, I can't seem to get the thing to focus properly in fullscreen mode (but it works great in windowed!).

I found the SDK alright, but a little basic. The examples and the API itself use a RH (right-handed) coordinate system, which is not what DirectX uses, and I wish they supported both out of the box instead of me having to map between them. "Oh, here's a quaternion providing the headset rotation, but it needs its coordinate system remapped"... uh-huh. Oh yeah, some documentation on using the Rift with DirectX would be nice!! Or am I the only one using DirectX these days?

In order to render for the Rift, you need to draw your scene twice, with the camera offset by the distance between your eyes. It's nice and easy, as the geometry of the Rift lenses is such that you don't need to skew or rotate your projection matrices; it's just a translation. However, drawing the…
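For what it's worth, here's a sketch of both fixes. Remapping a right-handed rotation quaternion into a left-handed frame that differs by a z-axis flip amounts to negating two components, and the per-eye offset is just a translation along the camera's right vector by half the interpupillary distance. Names like ipd and right are my own, not the SDK's:

```python
def rh_to_lh(q):
    """Remap an (x, y, z, w) rotation quaternion from a right-handed
    frame to a left-handed one differing by a z-axis flip: negate x
    and y (equivalently negate z and w; q and -q are the same rotation)."""
    x, y, z, w = q
    return (-x, -y, z, w)

def eye_positions(camera_pos, right, ipd=0.064):
    """Per-eye camera positions: translate half the interpupillary
    distance (~64 mm on average) along the camera's right vector.
    No skewed projection needed; it's a pure translation."""
    half = ipd / 2.0
    left_eye = tuple(c - half * r for c, r in zip(camera_pos, right))
    right_eye = tuple(c + half * r for c, r in zip(camera_pos, right))
    return left_eye, right_eye
```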

Amusing Graphical Stuffups

One of the best things about working on a game is graphics bugs. No, not the kind that make you spend a week delving beneath the covers of hideously complicated interactions between CPU, API, GPU, RAM, etc. No, the serendipitous kind!

- Engage warp drive! Funnily enough, I was thinking about making some kind of FTL effect one day.
- Playing around with SSAO, I got some kind of cubist painting look.
- Oops, wrong viewport!
- Hmmm... this is not the material you are looking for.
- Wrong light map!
- Not quite the right amount of specular highlight.

Your Oculus Order is About To Ship

Hell yeah... Oculus Rift is coming my way. Lucky I've been working hard on getting Pegwars ready!

- User interface now working; it displays on consoles in the virtual cockpit. From day 1 I've tried to stay away from on-screen UI, and I think that'll be vindicated when using the Oculus. Immersion is key!
- 'Star trails' object, which is not really star trails but a detail object that helps give the impression of movement and speed in what is essentially a very large, empty space. 50,000 points wrapping and extruding and drawing in a single batch on the GPU (see the sketch after this list).
- The stars you see in this shot are now drawn by the galaxy class, which is an octree of solar systems. And although it's hard to show in a screenshot, you can fly to every single system seamlessly using a combination of your turbo drive and turning off the engine stabilisers...
- I need to start creating procedural planet textures! A planet named 'Masjou' using an Earth texture just won't cut it.
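A sketch of the wrapping trick the star-trails object relies on: keep a fixed cloud of points and wrap each one back into a cube centred on the camera using modulo arithmetic, so points left behind get recycled in front and the field appears endless. In Pegwars this runs as a single GPU batch; the numpy version below just shows the maths:

```python
import numpy as np

CUBE_SIZE = 1000.0  # side length of the cube of detail points around the camera

rng = np.random.default_rng(42)
points = rng.uniform(-CUBE_SIZE / 2, CUBE_SIZE / 2, size=(50000, 3)).astype(np.float32)

def wrap_around_camera(points, camera_pos):
    """Wrap every point into the CUBE_SIZE cube centred on the camera.
    (offset + half) % size maps each coordinate into [0, size), and
    subtracting half re-centres it on the camera."""
    offset = points - camera_pos
    wrapped = (offset + CUBE_SIZE / 2) % CUBE_SIZE - CUBE_SIZE / 2
    return camera_pos + wrapped
```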

Planets

Planets are now becoming well-formed. They have a placeholder atmosphere object, but will get a dedicated scattering shader. Atmosphere entry is detected, and the Planet Flight App Module is activated. This sets up the rendering pipeline slightly differently, and flags are set on the spaceship physics model for atmospheric flight. In future this app module will also begin the streaming of planet cities, detail objects, etc.
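A sketch of how that hand-off could look; the interface and names (PlanetFlightModule, on_enter, set_pipeline) are illustrative, not the actual Pegwars classes. Atmosphere entry is just a distance test against the atmosphere radius, and the module toggles the pipeline and the physics flags:

```python
import math

class PlanetFlightModule:
    """App module activated when the ship enters a planet's atmosphere."""

    def __init__(self, renderer, ship):
        self.renderer = renderer
        self.ship = ship

    def on_enter(self, planet):
        # Reconfigure the rendering pipeline for in-atmosphere drawing
        # and switch the ship physics into atmospheric-flight mode.
        self.renderer.set_pipeline('atmospheric')   # hypothetical API
        self.ship.physics.atmospheric_flight = True
        # Future work: kick off streaming of cities and detail objects here.

def check_atmosphere_entry(ship, planet, module):
    """Detect entry: inside the atmosphere radius this frame, but not last."""
    d = math.dist(ship.position, planet.position)
    inside = d < planet.atmosphere_radius
    if inside and not ship.in_atmosphere:
        module.on_enter(planet)
    ship.in_atmosphere = inside
```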

An Update

I have to get back into blogging this thing, so this update is just a refresher to remind me how to log in to Blogger etc. Here's a screenshot of the cityscapes these days. What we've got here is:

- A tile-based city that is streamed from a bitmap (a sketch of the idea is below). The tall building you see in the centre is an example of specific placement of buildings; it's not just a procedurally generated cityscape. This is so that players will be able to grow cities on planets, but also do specific terraforming and building, and truly customise their planets.
- A nice example of arbitrary backgrounds used as image-based lights for the scene. While the background is currently a static mesh and bitmap, it's implemented in a way that's totally dynamic. This way, when I get around to doing the atmospheric scattering simulation (for planets) or procedurally generated nebulae in space, the models will all sit in their surroundings no matter where on the planet you are and what time of day it is.
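A sketch of the bitmap-driven tile streaming described in the first point. The pixel-to-tile palette and the reserved values for hand-placed buildings are my own illustration (assuming Pillow for the bitmap read), not the Pegwars format:

```python
from PIL import Image

# Illustrative palette: greyscale value -> tile type. Values >= 200 are
# reserved for specific, hand-placed buildings rather than procedural fill.
TILE_TYPES = {0: 'empty', 50: 'road', 100: 'low_rise', 150: 'high_rise'}

def stream_city_tiles(bitmap_path):
    """Yield (x, y, tile) for each pixel of the city layout bitmap."""
    img = Image.open(bitmap_path).convert('L')
    w, h = img.size
    pixels = img.load()
    for y in range(h):
        for x in range(w):
            v = pixels[x, y]
            if v >= 200:
                # Explicitly placed landmark, e.g. the tall central tower.
                yield x, y, ('landmark', v)
            else:
                yield x, y, TILE_TYPES.get(v, 'empty')
```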