
Around the Verse: Secondary Viewports Written Thursday 3rd of August 2017 at 10:24pm by Desmarius and StormyWinters

As per usual, anything said during the show is subject to change by CIG and may not always be accurate at the time of posting. Also any mistakes you see that I may have missed, please let me know so I can correct them. Enjoy the show!

TL;DR (Too Long; Didn't Read)

Behind the Scenes: Secondary Viewport

  • The new Secondary Viewport tech uses the new Render to Texture System to allow such things as comms calls and holographic volume rendering in the PU and Squadron 42
  • The new system creates a more immersive experience as it makes UI and game holographics seem a part of the world instead of a simple overlay while also improving performance
  • The same tech will be used for many other systems, such as visors, comms calls and mirrors, further down the line
  • The Render to Texture System starts at the engine level and accounts for such things as whether objects are going to be streamed and the appropriate rendering size of objects for the “screen” they'll be displayed upon
  • The Rendering System has a fixed memory budget that uses a texture pool, similar to a shadow pool, rendering in powers of two so resolution can be adjusted progressively higher or lower as you move closer to or further from an object
  • One benefit of render texture is that it can be reused on multiple objects with the only downside being that it requires a manager for mouse pointer interactivity
  • The Camera Culling System and the Facial Animation System communicate with the render texture manager to determine the exact size and level of detail for the facial animations and character animations
  • This real time rendering reflects changes in characters' ships, costumes and locations, and allows a character to view any projection, 2D or 3D, from any angle while moving
  • This allows the rendering of any object without the need for the duplication of material set up, and with shaders allows for the fading, dissolving and tinting of objects as they reach borders without the usual clipping
  • This technology basically creates an immersive telepresence that plays well with player-to-player comms and for effective mission briefings
  • Though developed rather quickly, they still aim to optimize performance further, such as by using environment probes to avoid having to render background scenes inside of a video call
  • Beyond performance, they want to stabilize it more and improve the finalized effects such as with flickering or interlacing lines when there is a poor signal or if the display is damaged

Full Transcript

Intro With Chris Roberts (CEO, Director of Star Citizen and Squadron 42), Sandi Gardiner (VP of Marketing). Timestamped Link.

Sandi Gardiner(SG): Hello and welcome to another episode of Around the Verse, our weekly look at Star Citizen’s ongoing development, I’m Sandi Gardiner.

Chris Roberts (CR): And I’m Chris Roberts.

SG: On today’s show we take a look at the systems we’ll be using to render holograms and comms in real time.

CR: Yeah, it’s pretty cool so I can’t wait to show you guys, but first, as many of you know, the team is very focused on completing our 3.0 update for the persistent universe. 3.0 is a giant leap forward over what’s currently available in game and, thanks to the dev teams’ hard work, the majority of 3.0’s new features are almost complete and we’ve shifted into the final phase of the production process that focuses on feature and content integration, optimization and bug fixing.

Now we’re also expecting many new players, or people who have been busy playing something else, to come back in, log in and play 3.0. So we wanted to make sure the user experience is really good, and we decided to spend more time polishing and optimizing than we have in recent releases. In addition, we’re also aiming to introduce our new delta patcher so you will only need to download the files that have changed for each subsequent patch, which means no more 30 GB downloads, but of course this requires some fine tuning and a lot of testing to make sure it works as intended.

Now we know that 3.0 is a big release and you’re all eager to play, and we’re excited for you to play too and can’t wait to get it done, but we want to make sure that it’s ready. So if you’ve read the list of caveats we gave when we first started sharing our internal unpadded schedules, our very first point was that quality would always trump schedule, and the second and third points, about task estimates being unpredictable due to the nature of developing something that hasn’t been done before and the difficulty of estimating bug fixing and polish time, are also important to remember as we go forward with our schedules on finishing 3.0.

So that’s why we’ve seen the constant changes to the production schedule over the past few weeks; as new issues or advancements cross our path we’ve worked hard to communicate those to you, no matter how good or bad the news may be. By its very nature game development can be an exhilarating, frustrating and unpredictable process, so if our 3.0 schedule wasn’t that then you wouldn't be getting the true development experience.

SG: No, you wouldn’t… for our new backers who may not know, with each of our major releases we’ve done different things to help you track our progress. For the 0.8 patch that launched Arena Commander and the 2.0 patch which introduced the PU, we had our weekly development updates that listed current blockers and resolved bugs, and for the march to 3.0 we’ve been tracking the major tasks we're doing with our weekly production schedule reports.

CR: So, now that we’ve reached this latest stage in the process, we’re planning to adjust the format of AtV to highlight exactly what we’re working on to get 3.0 out the door. As all our studios are working hard to get 3.0 out, we’ve decided to suspend the Studio Update portion of the show so as not to distract developers by having them provide footage of their work, and instead, starting next week, we’ll be launching a new segment called Burn Down. With this segment you’ll be able to be a fly on the wall for some of our production meetings and hear directly from the developers and QA testers about the week’s biggest bugs, blockers and challenges that we’ve been battling. It will be another great way for you to follow 3.0’s progress, and the weekly production report on the website will also be adjusting its focus to match.

SG: Alongside the new Burn Down segment, AtV will bring you a weekly deep dive into a feature we’re working on for the game. That way you’ll still be getting the same great detail about what we have planned alongside the most current information on exactly where we are on the path to releasing 3.0.       

CR: Yup and once 3.0’s out we’ll resume the normal AtV cadence with the weekly studio reports and all that lovely eye candy that you guys like to see every week.

SG: Now let’s shift gears to focus on two systems we’ve recently got working together, the secondary viewport and render to texture system. When combined these systems can do a wide variety of things from dynamically creating comm calls from other locations to rendering holograms in real time.

CR: Yeah, I’m pretty excited about the potential of this technology as it’s going to allow us to do some really cool things in Squadron 42 and Star Citizen which we’re going… you’ll see maybe a little hint of it to come, let’s take a look.

Secondary Viewport With Hannes Appell (Director of Cinematics), Alistair Brown (Director of Graphics Engineering). Timestamped Link.

Hannes Appell (HA): We've been working with the Graphics Engineering team in the UK to develop and make use of the new Secondary Viewport tech, which itself makes use of the new Render to Texture System. It allows us to do some really cool things for our in verse narrative in both Squadron 42 and the PU. So far we have used it for comms calls and holographic volume rendering, and we have been syncing very closely with the engineers that write all of the new rendering code to make this happen, and we're slowly homing in on a final feature set.

Alistair Brown (AB): The first of which is Secondary Viewports, which allows us to get a second view onto the world, or many different views onto the world. This is built on top of some new tech we have called the Render to Texture System. Prior to the Render to Texture System, if we wanted to render some user interfaces or screens or visors we'd have to render them directly into the game world, and this happened after all of the rest of the scene had been rendered. What that meant is that the UI would always sit on top of the game world. It would never truly fit in, and therefore it would never correctly be obscured by things like glass or fog or bloom in the same way as everything else in the scene, and this has always bothered our UI artists.

So, the new system… the idea is we render all of this content into textures first, and then we use those textures in the actual main rendering pass of the scene and composite them in with whatever effects we need, whether we need them to look holographic or like they're on glass or whatever it might be, and it lets them bed themselves into the game world much better and have much better lighting and sorting with the rest of the scene. We also get a few other benefits from this. We get better antialiasing ... better sorting. We actually get better performance from the fact that we can reuse the same screen on many different displays in the game world just by rendering it once, and we can even reuse the same screen on the next frame of the game to avoid rendering costs, for example if you've got a screen which doesn't need to animate or doesn't animate very quickly.
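
That render-once, reuse-everywhere idea is easy to picture as a small cache keyed by screen. The sketch below is purely illustrative (the class, names and structure are assumptions, not CIG's actual code): each unique screen gets one render texture, every display showing it samples the same texture, and a screen whose content hasn't changed skips re-rendering entirely.

    #include <string>
    #include <unordered_map>

    struct RenderTexture { /* GPU texture handle would live here */ };

    class ScreenTextureCache
    {
    public:
        RenderTexture* GetOrRender(const std::string& screenId, bool contentChanged)
        {
            auto it = m_textures.find(screenId);
            if (it != m_textures.end() && !contentChanged)
                return &it->second;                     // reuse last frame's result

            RenderTexture& tex = m_textures[screenId];  // render once...
            RenderUiIntoTexture(screenId, tex);
            return &tex;                                // ...and share it with every display
        }

    private:
        void RenderUiIntoTexture(const std::string&, RenderTexture&) { /* draw the UI here */ }

        std::unordered_map<std::string, RenderTexture> m_textures;
    };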

We've been using these new pieces of tech in many different systems, so we've got all of our UI screens and our visors, all of our holograms and video comms calls, and there'll be several other uses we're hoping to fit in further down the line, like mirrors. Things that are typically really difficult to achieve in games.

Geoff Birch (GB): The Render to Texture System starts at the engine level when we're gathering all of the objects. Really, at this point all we need to know is whether the objects are going to be streamed, so the streaming system needs to be informed, and we also need the max and min screen space size. We use the maximum screen space size along with the UV texel density to calculate how much resolution is required for that texture. The minimum screen space size is required because that texture may be used on multiple different objects. As it's used on multiple different objects, we then need to get the largest size and use mip mapping down to the smallest size. When you have a screen within a screen we need to know the ordering of the RTTs, so that as one RTT is rendered before another it can then be used as a texture within the second one. We also need to handle an RTT within an RTT within the main pass: if the first RTT is half the res of its parent RTT, and that RTT is half the res of the main pass, the first RTT must be a quarter of the main pass res at rendering size.
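
As a simple worked example of that nested sizing rule (the numbers and function names are hypothetical, purely for illustration): a main pass at 2048, a screen rendered at half that, and a screen-within-a-screen at half again comes out at a quarter of the main pass resolution.

    #include <cstdio>

    // Each render-to-texture target's resolution is expressed relative to its parent.
    unsigned ResolveRenderRes(unsigned parentRes, float scaleRelativeToParent)
    {
        return static_cast<unsigned>(parentRes * scaleRelativeToParent);
    }

    int main()
    {
        const unsigned mainPass = 2048;                              // main scene render
        const unsigned outerRtt = ResolveRenderRes(mainPass, 0.5f);  // a screen in the scene
        const unsigned innerRtt = ResolveRenderRes(outerRtt, 0.5f);  // a screen within that screen
        std::printf("%u -> %u -> %u\n", mainPass, outerRtt, innerRtt);  // 2048 -> 1024 -> 512
        return 0;
    }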

The Rendering System has a fixed memory budget. To do this we allocate one large texture ahead of time. This texture is called the texture pool, or in this case the render texture pool. It's very similar to a standard shadow pool system. We recently rewrote our shadow pool packing system to be a power of two quad tree allocator, and we use the same power of two quad tree allocator for the Render to Texture System. We render textures as a power of two, i.e. 128, 256, 512, 1k. We use the smallest size that the texture needs and that we can fit it in, so if you needed a rendered texture object that's 800 x 800 we'd use a 1024 x 1024, and as you move closer to and further away from the object it'll require lower or higher resolution, and we progressively move up and down.
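
A minimal sketch of that power-of-two sizing, with assumed pool limits rather than the engine's real budget: the required size rounds up to the next power of two (so an 800 x 800 request is served from a 1024 x 1024 slot), and moving closer or further simply steps the allocation up or down one power of two.

    #include <cstdint>

    // Round a requested size up to the next power of two.
    uint32_t NextPowerOfTwo(uint32_t v)
    {
        uint32_t p = 1;
        while (p < v)
            p <<= 1;
        return p;
    }

    // Pick the pool slot for a render texture, clamped to an assumed budget range.
    uint32_t ChoosePoolSlotSize(uint32_t requiredPixels, uint32_t minSlot = 128, uint32_t maxSlot = 1024)
    {
        uint32_t slot = NextPowerOfTwo(requiredPixels);
        if (slot < minSlot) slot = minSlot;
        if (slot > maxSlot) slot = maxSlot;  // stay inside the fixed memory budget
        return slot;                         // e.g. an 800 x 800 request gets a 1024 x 1024 slot
    }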

One of the benefits of render texture is we can reuse those textures for multiple objects. So, if you have a scene with many different billboards in it (say, 12 billboards), we would render that texture for the billboard once and then reuse that texture across 12 different billboards. The original system, the UI System for instance, wouldn't do that. It would render the UI on that billboard 12 times. Because the Render to Texture System now produces a texture, rather than Flash rendered straight into the world, it means that we can render onto any curved screen, anything like that. The only downside was we had to implement a new system to manage the mouse pointer interactivity that we already had. We had to make some modifications to bring in a mouse pointer system which takes the screen space position and remaps it into object UV coordinates, and we can then pass those object UV coordinates to the UI system. The UI system is then able to work out where on the object that mouse pointer is, and from there where on the Flash that is, and therefore you can start selecting things.
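
The pointer remapping can be thought of as two small steps, roughly like the sketch below (the names and interface are hypothetical): the mouse ray's hit on the in-world screen yields the object's UV coordinates, and those UVs are then scaled into the UI's own pixel space so the UI system knows what the cursor is over.

    struct Vec2    { float x, y; };
    struct UiPixel { int x, y; };

    // 'hitUV' would come from intersecting the mouse ray with the in-world screen geometry;
    // the UI surface itself is uiWidth x uiHeight pixels.
    UiPixel RemapPointerToUi(Vec2 hitUV, int uiWidth, int uiHeight)
    {
        UiPixel p;
        p.x = static_cast<int>(hitUV.x * uiWidth);
        p.y = static_cast<int>(hitUV.y * uiHeight);
        return p;
    }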

With character animation, it normally goes through the camera system to decide whether it needs to be animated and whether any of the facial animations need to be run, and as we were running this through the Render to Texture it wasn't in view of the main camera. What would happen is it just wouldn't render when it was in the Render to Texture System. We resolved this by having the usual Camera Culling System and the Facial Animation System, stuff like that, communicating with the render texture manager. The render texture manager will allow it to go through all of the different cameras and work out exactly how big it is on the screen, or how big it is inside our render to texture manager, and that allows it to decide on the level of detail of the facial animations and the level of detail in the character animations.
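
Conceptually the fix boils down to choosing the animation level of detail from the largest view of the character across every camera, not just the main one. The thresholds and names in this sketch are invented for illustration only.

    #include <algorithm>
    #include <vector>

    enum class AnimLod { Full, Reduced, Minimal };

    // One coverage value per camera that can currently see the character:
    // the main camera plus every active render-to-texture camera.
    AnimLod PickAnimationLod(const std::vector<float>& coveragePerCamera)
    {
        float maxCoverage = 0.0f;
        for (float c : coveragePerCamera)
            maxCoverage = std::max(maxCoverage, c);

        if (maxCoverage > 0.25f) return AnimLod::Full;     // large in at least one view
        if (maxCoverage > 0.05f) return AnimLod::Reduced;
        return AnimLod::Minimal;                           // tiny everywhere, or not visible
    }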

HA: If you do prerendered comms you can't really acknowledge characters changing costumes or ships or locations. So, real time rendering makes a big difference for immersion for us. Comms calls reflect what's going on in the verse, and for the Persistent Universe it opens up customized player avatars calling each other, all rendered live. Other possibilities are, for example, CCTV or other room-view style puzzles, or live recordings of views to be featured somewhere else in the verse. There are some remarkable consequences of these advances in the tech. Our capital ships feature big holographic volumes on bridges or in briefing rooms, and it means the player can walk around them freely. For those we wanted to not just render a second viewport using 2D display screens but actual 3D holograms, and you can view them from all angles.

[Demo of Ingame Holograms]

Muhammad Ahmad (MA): The secondary viewport camera is updated dynamically to match the relative viewing angle from the main player's camera to the projection volume, and then as you move around, the secondary viewport camera moves and therefore you can essentially move around the holographic projection. Using the existing rendering pipeline means that we can render essentially any object into the holographic projection, with no need for material duplicates or any duplicating of material set up. It just basically works with existing shaders. As well as the existing shaders, we also have developed dedicated shaders for various things, for example abstract user interface objects, or if in a mission briefing you wanted to go to a waypoint. The waypoint could be displayed as holographic, and it would be using one of the dedicated shaders that we developed. The cool thing about this is that we can automatically fade the objects that are in the source volume. As they get closer to the boundary of the volume we can automatically fade them out, so it doesn't clip as it goes through the boundary. We also exposed two new artistic features where we basically allow the objects to dissolve and tint independent of the material setup.
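
Here's a rough sketch of the two pieces of maths described above, with assumed vector helpers and no claim to match the actual implementation: mirroring the player's position relative to the projection volume into the source volume, and fading an object as it nears the source volume's boundary instead of clipping hard against it.

    #include <algorithm>

    struct Vec3 { float x, y, z; };

    Vec3 Sub(Vec3 a, Vec3 b)  { return { a.x - b.x, a.y - b.y, a.z - b.z }; }
    Vec3 Add(Vec3 a, Vec3 b)  { return { a.x + b.x, a.y + b.y, a.z + b.z }; }
    Vec3 Mul(Vec3 a, float s) { return { a.x * s, a.y * s, a.z * s }; }

    // Place the secondary viewport camera so its angle onto the source volume matches
    // the player's current angle onto the holographic projection volume.
    Vec3 SecondaryCameraPos(Vec3 playerCamPos, Vec3 projectionCentre, Vec3 sourceCentre, float sourceScale)
    {
        Vec3 relative = Sub(playerCamPos, projectionCentre);
        return Add(sourceCentre, Mul(relative, sourceScale));
    }

    // Fade factor for an object near the source volume's boundary: 1 well inside, 0 at the edge.
    float BoundaryFade(float distanceToBoundary, float fadeRange)
    {
        return std::clamp(distanceToBoundary / fadeRange, 0.0f, 1.0f);
    }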

HA: Being able to light these 2D and holo-RTT presences in real time at the source location and then seeing the results in the corresponding 2D display screen or in the 3D holo target area immediately felt exciting to me. With this new tech we can have a character calling another ship or location and the caller appears on either a 2D display screen or inside a 3D volume, with the calling partner essentially being a telepresence. What's really cool about the holo-comms or telepresence is we can arbitrarily scale the source volume up or down and easily create larger than life representations of characters without having to resort to cheats like scaling up the scene or anything, and this makes it possible to have something like a grand admiral appearing as a looming figure inside the Bengal carrier holo-globe, rather than him just being a small, life-sized presence. We also added the ability to tint or dissolve any object in our scenes at will, which helps us stage something like mission briefings where waypoints would need to flash green or enemy presence is marked as red.

AB: So, this tech progressed really, really fast, and we got some really great results, but there's more that we want to do with it. The next thing for us is to optimize it further. We really want to make sure that there's no performance impact when you have these secondary renders or these holograms and such in the scene. We're doing a bunch of exciting things to try and combat the performance issues. For example, if you're in a video call with someone and you can only see a slight part of the background behind them, we'll use what we call the environment probe, which is normally used for reflections of the scene, and render that directly behind the player to avoid having to render the entire background scene, and in most situations you won't be able to tell the difference. So, that's one example of the optimizations we're going to be making, but there'll be many more to make sure that we can really use this tech in as many situations as possible, so you get to see the fun gameplay level results from it.
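
That optimisation is essentially a cheap-versus-expensive choice per call. The threshold and names below are made up for illustration; the point is simply that a mostly-occluded background can come from an environment probe capture instead of a full secondary scene render.

    enum class CommsBackground { FullSceneRender, EnvironmentProbe };

    // Assumed threshold, purely for illustration: if the caller hides most of the room,
    // a cheap environment probe capture is composited behind them instead of a full render.
    CommsBackground ChooseCommsBackground(float visibleBackgroundFraction)
    {
        const float kProbeThreshold = 0.35f;
        return (visibleBackgroundFraction < kProbeThreshold)
                   ? CommsBackground::EnvironmentProbe
                   : CommsBackground::FullSceneRender;
    }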

HA: So now that we have the basics of our holo tech in place, we want to stabilize it more and spend more time finalizing the look of these holograms. That means all the good post effect goodies you can think of: interlacing lines, and I want to have flickering when there's poor signal quality or when the holographic display is damaged. Yeah, and we can't wait to show you more of this when it comes to life later in the PU and of course in our Squadron 42 narrative. Thanks for watching.

Outro With Chris Roberts (CEO, Director of Star Citizen and Squadron 42), Sandi Gardiner (VP of Marketing). Timestamped Link.

CR: Pretty awesome, eh? The eagle-eyed amongst you will have spotted the first appearance of Ben Mendelsohn’s character in Squadron 42 and Liam ‘Onion Knight’ Cunningham in the work in progress holo briefing test scene that we’ve been doing. Our graphics team have really created something I haven’t seen in any other engine, and it allows us to actually do proper holographic telepresence. None of it’s pre-rendered or faked, it’s all live, and the possibilities for longer term gameplay are pretty exciting.

SG: That’s all for today’s episode, as always thanks to all of our subscribers for making it possible for us to produce all of our video content. We’ve just announced that August’s ship of the month is the Aopoa Khartu-Al, so that means subscribers can test out this Xi'An ship all month long. Just log into the game to take it out for a spin.

CR: Thanks to all of our backers who have supported the game over the years, opening the development process to you all has been both challenging and extremely rewarding, and I would say the emphasis would mostly be on rewarding. So I can’t thank you enough for making it all possible.

SG: Finally if you want to know what all our offices did over the past month then check out the July monthly report which goes live tomorrow.  

CR: Til next week we’ll see you…

CR/SG: Around the Verse.    

Desmarius

Transcriber

When he's not pun-ishing his patients or moderating Twitch streams, he's at Relay pun-ishing their patience as a member of the transcription staff. Otherwise, he can be found under a rock somewhere in deep East Texas listening to the dulcet tones of a banjo and pondering the meaning of life.

"If you can't do something smart, do something right." - Sheperd Book

StormyWinters

Director of Fiction

Moonlighting as a writer in her spare time StormyWinters combines her passion for the written word and love of science fiction resulting in innumerable works of fiction. As the Director of Fiction, she works with a fantastic team of writers to bring you amazing stories that transport you to new places week after week.