
Around the Verse: Universal Interaction Written Thursday 11th of May 2017 at 08:00pm by StormyWinters, Sunjammer and Desmarius

As per usual, anything said during the show is subject to change by CIG and may not always be accurate at the time of posting. Also, if you spot any mistakes I may have missed, please let me know so I can correct them. Enjoy the show!

TL;DR (Too Long; Didn't Read)

Studio Update 

  • Gameplay

    • Player Interaction system has moved along quickly and the Inner Thought system has seen further improvements

    • Air Traffic Controller initial implementation underway, working on the underlying structure

    • Finishing up the Character Status system: final implementation of procedural breathing and suit punctures

    • Working on "pick up and carry" combining Player Interaction and Useables

    • Initial development of Conversation tech into Subsumption; Design implementing all conversations to test

  • Audio

    • R&D and planning for systems to map and modify audio automatically

    • Work continues on Audio Propagation system, Breathing system audio and "Word Up" (a new dialogue tool)

    • Also Ship Weapons Toolkit, Gallant reload audio, weapon tell refactor, multi-positional code support

  • Music

    • Working on Dynamically Looping Cinematic Ambient Music system

    • Also dogfighting music logic clean up, Tension system prototyping and planetside procedural music

    • Added more music to the Launcher

  • Graphics

    • Initial integration of real time lit volumetric fog from Lumberyard

    • Render To Texture is progressing quickly; an initial 2D version is with the UI team to update UIs (3D holographic projections soon)

    • Real time Environment Probe tech nearing completion (allows fully dynamic bounce light and reflections on a planet)

  • VFX

    • Atmospheric flight effect: planetary entry VFX and engine trails are now being merged, optimised and fixed

    • Lightning entity effect improvements: attempting to create realistic lightning and electrical effects

    • Completed first pass on the MISC Prospector thrusters and damage

    • Initial work on APR Scourge railgun “charging” and “charged” effects

  • Concept Art

    • Origin 600i concept is now in its final stages

  • Weapons

    • Completed Preacher Distortion Scattergun and Apocalypse Arms Scattershot

    • Working on Klaus and Werner LMG

  • Ship

    • Reclaimer has made a lot of progress: hull exterior (and claw) completed; interior habitation, tech and salvage completed

    • Work has also begun on the derelict ships: ships are being broken into structural elements, aged/deteriorated materials are being created, and a wreckage component is being worked on

    • Razor hull is complete and damage pass in progress; LODs are being finalised, working to get it flight ready

    • Hull C exterior is largely complete: maneuvering thrusters are being incorporated and polish work done to match the interior and exterior

  • Environment

    • Exploring ways to create volumetric forms in space

    • Surface Outposts are finishing their interior visual benchmarks for engineering, habitation and hydroponics

    • Truck Stop space stations are in final art phase: working on shader library and bringing pieces up to final art quality

  • Animation

    • Working on the cover AI to improve all animations beyond "functional"

    • Breathing state improvements: migrating curve data from Maya to DataForge

    • Implemented multi-directional takedowns: killing enemies within close proximity

    • Improvements to weapon set ups and reloads: Devastator shotgun, Arrowhead sniper rifle, Gallant laser rifle, and P8SC ballistic SMG

    • Melee improvements for pistol and stock weapon archetypes

  • Derby

    • Busy with face and body animations for 3.0 mission givers

    • Delivered 500 facial animation files for implementation into S42

    • Tracked and solved almost 1000 body animations for the PU

    • New facial animations for shooting guns

Behind the Scenes: Player Interaction System

  • The Player Interaction System touches everything in game, opening up numerous opportunities to interact with contextual feedback

  • This is the third version of the interaction system since Alpha 2.5; it builds on Item 2.0's raycast and collision geometry, with new features that make it more contextually aware of the player's interactions

  • The original Use System was limited to one action and wasn't very descriptive

  • A fundamental rewrite was needed to bring the interaction system to the wanted level of immersive detail

  • During the earliest cargo prototypes it became clear that a unifying core was needed for the rapidly growing number of input systems

  • This completely new system uses a cursor that changes with context and highlights the targeted object, providing more focus, clarity and feedback for the player's interaction

  • The philosophy is that the object, in tandem with the player's state, dictates how interaction functions, allowing for more natural interactions

  • The player's interaction is only limited by the animation and the physicality of the character

  • The system uses a proximity query, checking interactable objects around the character, and cursor raycasts to browse the available interaction options

  • Covering all the various player and environmental options while providing the expected level of polish required close coordination between the LA and UK offices

  • A difficult aspect of getting this system together was that various systems of the game had to be built in isolation, so they would function regardless of the fact the rest of the game wasn’t there yet

    • All these specific behaviours tailored to individual systems had to be taken and turned into a generic interaction object that syncs with all the other generic aspects

    • These interactions also need to coexist with a wide variety of gameplay

  • A type of intelligence needed to be added to the game to create a more intuitive experience where interacting with objects is concerned

  • Anything designated as interactable uses the Item 2.0 component system

    • The new system adds the ability to have interactions on particular bones/subregions of an entity

  • Generic components give them building blocks to make things ultra bespoke, but the downside to bespoke content is that it requires constant maintenance due to changing technology, changes to the environments you’ve placed items in, etc.

  • Making behaviours more separated/modular/generic allows for things to be built out more conceptually

  • They use subsystems such as zone system queries to figure out which interactable objects are in proximity of the player, plus standard techniques like ray casting to determine which object is the best candidate

  • There is now a consistent through line of input with which you can interact with the whole game

  • The new interaction system is more about fluidly browsing what you can clearly see, options are more clearly defined

    • The new system also requires you to move around less compared to the old one

  • They’ve been focused on consolidating their behaviours/tech to get to the point where when they build things they can just stitch them together conceptually

  • In conjunction with the interaction system they’re working on something they call Render To Texture, which in a UI sense allows the UI to render properly within the rendering pipeline

    • Render to texture tech will also allow them to project onto curved surfaces

  • Continuously adding new things and thinking of new possibilities for the system including unifying systems to gain the benefits from the new interaction system

Full Transcript

Intro With Eric Kieron Davis (Senior Producer), Kirk Tome (Lead Technical Designer). Timestamped Link.

Eric Kieron Davis (EKD): Hello and welcome to Around the Verse, our weekly look at the development of Star Citizen, I’m Eric Kieron Davis.

Kirk Tome (KT): And I’m Kirk Tome. Tomorrow the April monthly report will be shared with the community; as you probably know by now, the new monthly report is a collection of each studio’s updates over the past month.

EKD: Yeah and we really look forward to sharing all of our progress with everyone and starting the cycle all over again here in Los Angeles next week. First let’s go to Wilmslow and see what they’ve been up to.  

Studio Update With Erin Roberts (Studio Director). Timestamped Link.

Erin Roberts (ER): Hi and welcome back again to the UK studios for our latest update on our progress over the last four weeks.

Everyone is focused and busy working through all the tasks and bugs for Squadron 42 and of course the anticipated 3.0 update for Star Citizen which we are very excited to get out to you all as soon as possible.

So let’s kick off with the gameplay feature sprints which we have been working on:

The Player Interaction System has moved along quickly over the last few weeks. Further improvements of the personal “Inner Thoughts” system will allow you to select functionality that is not directly tied to a particular object. Examples of this would be selecting an emote or exiting your seat - although there will of course be quicker ways of doing this through default actions for experienced players.

Next up is the Air Traffic Controller sprint which deals with managing the flow of traffic to a location for both takeoff and landing. In particular it is responsible for signing out and reserving a landing pad when a player wants to land, as well as freeing up that landing pad once they’ve landed and cleared the area. Similarly it will deal with reserving a landing pad and spawning a ship when a player wants to take off. The initial stages of the implementation are now underway and we’re working on the underlying structure of how the system works.
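To make that reservation flow concrete, here is a minimal C++ sketch of a pad allocator along the lines described. It is purely illustrative: all names and types are hypothetical, not CIG's actual implementation.

    #include <cstdint>
    #include <optional>
    #include <vector>

    enum class PadState { Free, Reserved, Occupied };

    struct LandingPad {
        std::uint32_t id;
        PadState state = PadState::Free;
        std::uint64_t reservedFor = 0;  // ship entity id, 0 = none
    };

    class AirTrafficController {
    public:
        // Sign out the first free pad for a landing or takeoff request.
        std::optional<std::uint32_t> ReservePad(std::uint64_t shipId) {
            for (auto& pad : m_pads) {
                if (pad.state == PadState::Free) {
                    pad.state = PadState::Reserved;
                    pad.reservedFor = shipId;
                    return pad.id;
                }
            }
            return std::nullopt;  // all pads busy: the request would be queued
        }

        // Free the pad once the ship has landed/taken off and cleared the area.
        void ReleasePad(std::uint32_t padId) {
            for (auto& pad : m_pads) {
                if (pad.id == padId) {
                    pad.state = PadState::Free;
                    pad.reservedFor = 0;
                }
            }
        }

    private:
        std::vector<LandingPad> m_pads;
    };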

We’ve been finishing up functionality on the Character Status system which includes bringing the procedural breathing and suit punctures to final implementation. Once this is done we’ll focus on getting the system switched on by default in the game.

We’re also working on “pick up and carry” which is a bit of a mash up between the Player Interaction system and our Useables sprint. The Useables sprint was more concerned with getting the AI to interact with objects in the environment, whereas the Player Interaction system is more about the player using UI to interact with the environment. We’re now bringing these two systems together to enable the player to pick up, carry and then place objects in our universe.

The conversation tech has now completed initial development in the Subsumption tool, making it much easier to create conversations with NPCs. It’s been handed over to designers to prove it out by setting up all the different conversations. They’ll provide feedback to code on any further improvements needed.

The Audio team has been working on procedural planet audio processes including R&D and planning for systems to map and modify audio automatically. Also work is continuing on the Audio Propagation system, the Breathing System audio for the Character Status system, and also a dialogue tool that’s been called “Word Up”. For weapon sound effects the Ship Weapon Toolkit is in progress, which includes reload sound effects for the Gallant, the weapon tell refactor and multi-positional code support for weapons, which will handle summing up the audio for many of the same weapons mounted to a single ship. For ships the Prospector audio is done, and work on the Greycat and Cutlass Black is continuing.
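That "multi-positional code support" is about summing identical weapon sounds into one event rather than playing a separate voice per gun. A hedged sketch of the batching idea, with entirely hypothetical names and a stand-in for the engine API:

    #include <vector>

    struct Vec3 { float x, y, z; };

    // Stand-in for the real audio engine interface; assumed, not an actual API.
    struct AudioSystem {
        void PlayMultiPositional(const char* /*event*/,
                                 const std::vector<Vec3>& /*positions*/) {}
    };

    struct MultiPositionalEmitter {
        std::vector<Vec3> firePositions;

        // Each identical gun reports its muzzle position instead of
        // triggering its own one-shot sound.
        void OnWeaponFired(const Vec3& muzzle) { firePositions.push_back(muzzle); }

        // Once per audio frame: one event carries all positions, so the
        // engine can spatialise and sum the copies in a single voice.
        void Flush(AudioSystem& audio) {
            if (firePositions.empty()) return;
            audio.PlayMultiPositional("ship_gun_fire", firePositions);
            firePositions.clear();
        }
    };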

The Music department continues to work on the Dynamically Looping Cinematic Ambient Music system, clean up dogfighting music logic, the addition of Tension system prototyping, and the planetside procedural music; and have also added more music to the Launcher.

Meanwhile the Graphics team have been working on many separate pieces of tech this month:

The first is the integration of real time lit volumetric fog from Lumberyard which is going to be a huge boost for the Lighting and Environment Art.

The Render To Texture feature is progressing quickly and an initial version is in the UI team’s hands to upgrade our 2D UIs. We’ll also soon have the feature usable for 3D holographic projections to power our various holographic displays.

The real time Environment Probe tech, which allows fully dynamic bounce light and reflections on a planet where traditional light baking techniques are not possible, is nearing completion.

The Visual Effects team has been working on several sprints:

Atmospheric flight effects have completed the first sprint with a pass at planetary entry VFX. The effect is controlled by speed and atmospheric density values. With the core functionality in place for this as well as engine trails, we’re now merging these two sprints as we further implement design and art feedback while optimising and bug fixing.
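Since the effect is said to be driven by speed and atmospheric density, a tiny illustrative helper shows how those two inputs might be combined; the thresholds here are invented for the example, not taken from the game:

    #include <algorithm>

    // Hypothetical: returns 0 for no entry effect, up to `density` at maxSpeed.
    float EntryEffectIntensity(float speed, float density,
                               float minSpeed = 800.f, float maxSpeed = 3000.f) {
        if (density <= 0.f || speed <= minSpeed) return 0.f;  // vacuum or too slow
        float t = std::clamp((speed - minSpeed) / (maxSpeed - minSpeed), 0.f, 1.f);
        return t * density;  // denser atmosphere, stronger heating/trail effect
    }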

We have also been working on lightning entity effect improvements. This is where we are attempting to create realistic lightning and other electrical type effects.

In other areas we’ve completed the first pass for the MISC Prospector including thruster improvements and damage.

For weapons we’ve continued the initial work on the APR Scourge railgun including the “charging” and “charged” effects.

Since finishing off the Banu Defender the Concept team has been busily developing the Origin 600i which is now in its final stages.

The Weapons team has completed the Preacher Distortion Scattergun and the Apocalypse Arms Scattershot; and made good headway on the Klaus and Werner LMG.

The UK Ship team has been hard at work since bringing you the Javelin for Ship Shape:

The Reclaimer has made a lot of progress since our last update. On the exterior we’ve completed work on the hull and the team was excited to see the huge claw come together. We’ve now moved into the damage phase of development - splitting the mesh up and getting it ready to use the damage tech. On the interior we’ve fully fleshed out habitation and tech decks, as well as an enormous salvage processing room, and now the team is working on finishing the drone room, engineering deck and cockpit.

Work has also begun on the derelict ships so that Design can start laying the groundwork for mission-specific scenarios encompassing ships and wreckage. An initial batch of ships that includes the Connie, Caterpillar, Starfarer, and the Freelancer are being broken down to their structural elements and being made to look destroyed. Material work is being pushed alongside to give the ships a far more deteriorated and aged look. A wreckage component is also being worked on so that we have a library of nondescript ship parts that will be used to help embed and integrate derelict scenarios into the environments.

The Razor artwork is now complete with the ship … and it’s going through a full damage pass. Some core work has been done on breaking this one into pieces and we’re really excited to show you the results. Currently LODs are being finalised on the hull and Art are working closely with Tech Design to get this one flight ready.  

The Hull C is also moving along nicely. The hull mesh is now largely complete with maneuvering thrusters being incorporated and polish work matching up the interior and exterior. A detail pass is also ongoing adding all the finer detail you’ve come to expect from our ships. The interior’s going through block out phase and is now well into art production. By utilising assets from other MISC ships we’ve been able to create spaces quickly and efficiently with the intention to use these across the Hull series.

The Environment team are continuing to explore ways to create volumetric forms in space with the Graphics team. We’ve been baking out simulations and doing some initial renders. The Surface Outposts are finishing their interior visual benchmarks for engineering, habitation and hydroponics. These will then be distributed to the various outpost layouts and configurations. The team is continuing to “set dress”, light and polish these interior spaces to build character whilst also exploring options for navigation and branding based on the lore and fiction. The Truck Stop space stations have moved into the final art phase. So the team is busy building out the shader library and working up some example pieces to final art quality. As it’s a modular system we’re also continuing to refine the building set to explore potential build configurations which will make sure the set is as flexible as possible.

The Animation team has been working on the cover AI work with the aim to improve all animation assets beyond functional. Breathing state improvements are now online with backend code improvements. This involves getting curve data out of Maya and into DataForge which will allow for more refined procedural breathing curves. In other areas the team started implementing multi-directional takedowns for killing enemies that are within close proximity to the player characters. Also there have been further improvements to weapon set ups and reloads across the board including the Devastator shotgun, the Arrowhead sniper rifle, the Gallant laser rifle, and the P8SC ballistic SMG; as well as melee improvements for pistol and stock weapons.

Finally the Derby Foundry team has been busy with face and body animations for our 3.0 mission givers and has handed over 500 facial animation files that are now ready to be implemented into Squadron 42. The motion capture team has tracked and solved almost 1000 new body animations for various characters within the Persistent Universe. The team has also been working on new facial animations for shooting guns - Steve Bender, our Animation Director, has been a great source of inspiration so expect to see new, improved faces soon.

Well that’s our update for this month. Once again I hope you enjoyed seeing what we’ve been up to here in the UK supported, as always, by all the other studios around the globe. Thanks again for all your support and encouragement; and for joining us on our journey to make Star Citizen become this incredible reality. Everyone on the team really appreciates the trust our community places with us allowing us to create this amazing universe: without you all Star Citizen would not be the reality it has become. Thank you, take care, and I look forward to seeing you in the ‘verse.  

Back to Studio With Eric Kieron Davis (Senior Producer), Kirk Tome (Lead Technical Designer). Timestamped Link.

EKD: You know it’s really great to see the progress made on the procedural breathing. When stamina is introduced in 3.0, players are really going to be able to experience the consequences of their actions like puncturing someone’s suit or running for long periods of time.

KT: Yes, it’s part of the detailed universe building that we’re implementing. First with the roll out of 3.0, then testing the expanding universe from there.

EKD: Yeah, and speaking of the 3.0 roll outs, up next we see how the new Interaction System influences every aspect of gameplay. Take a look.  

Behind the Scenes With Zane Bien (Global UI Creative Director), Calix Reneau (Tech Designer), Chad McKinney (Gameplay Engineer). Timestamped Link.

Player Interaction System

Calix Reneau (CR): The Player Interaction System touches everything. It's a unified interaction with … across first person experience of shooting, of shopping, of looting, when you go up to screens and interact with those terminals and also when you get into your cockpit you fly your ship around and being able to point at things with reckless abandon actually opens up a lot of opportunity for interactions of “I want to find out more about that.”, and we can give back contextual clues of the things that you can do.

Chad McKinney (CM): So the Player Interaction System that we're starting to show here is the third version of the interaction system that we've added to the game. The original interaction system is what was in the game in Alpha 2.5 and previous, and it was this kind of mishmash of different approaches to try to figure out what the player was looking at and what they were trying to interact with in the game. After that we made an Item 2.0 Interaction System that tried to be more accurate about what you were interacting with, and so it would use raycast and collision geometry to figure out the results. That had some issues as far as usability, and so we're adding some new features onto the current Item 2.0 Interaction System that makes it more contextually aware about what you're interacting with.

Zane Bien (ZB): Our old system, especially the Credit Use System that basically … the bounding boxes had to be inside and then we could only have one action tied to that, and it wasn't very descriptive. It was just “Use”, right? That's why it's kind of deemed as the Use System.

Robert Johnson (RJ): To get the Player Interaction System going it really required a pretty fundamental rewrite of the interface we would use to interact with objects within the game. For example, for the first couple of years on the project we were so used to just having a big horrible “Use” prompt on everything, and really, not only did that not look good, but it also always felt clunky. It was very hard to actually get in the right spot to use stuff. So, we had to really come up with a design that gave the kind of level of detail that we wanted to put into the game, for the users to actually be able to interact with things present in the levels to the level of detail that would immerse them in the gaming experience. So, to really make all that happen the easiest way was to just basically write a lot of this from the ground up. Write it as a fresh new system as opposed to just going with the old “Use” stuff.

CR: I did a prototype of cargo. Some of the earliest, jankiest bits of that, that we've done, and as part of that I did a little section in a Freelancer that would detect that you brought a thing in and then showed up on a cargo manifest screen, which was the first time that I made a thing with the cursor on it. Eventually that turned into MFDs and how you interact with that stuff and the screens in your cockpit, and then eventually became … well that's should … we should unify all these exponentially growing input systems to something that actually has a core to it.

ZB: This is a whole completely new system that's coming in, because before you didn't have a cursor or anything, so it was very hard to tell what you're actually focusing on. Okay. So, this text is popping up on this door, but is it for this door that's to the left or to the right, so by having the cursor and then being able to highlight the objects, which is another thing in the interaction system that we do is we highlight the actual object that's going to be engaged. So, when you bring up the cursor and you have your cursor over the particular item that you're focusing on you'll get a highlight on the object, so that's additional feedback that is very useful for you to know as a player when you're using this interaction system. It's like okay. I'm … here's the door that I'm focusing on, and here's the actions that are tied to that. I mean the whole idea is to make all interactions that you can have in the game extremely consistent. So for instance, if you want to walk up and interact with a terminal screen, that's the interaction system. You can bring up a cursor, click certain buttons. That can be applied to something where it's a kiosk where it's very … it has very in-depth interaction where you have multiple buttons and filters and all sorts of things, or it can be used for an elevator panel where you're selecting between floor one and floor two and something simple like that, but also not just … that's just screens but we can also use that system for picking up objects in the world or interacting with a physical control panel.

CR: As we've gone through iterations of the interaction system it hasn't changed too much philosophically. It's always been about having the objects in the world dictate how you interact with them in tandem with your current state. We've demonstrated that a little bit with the battery demo that I think we showed some of the previous footage where having the battery in hand gives you contextually the interaction to put it into the radar where you could also manipulate the radar directly and open up the panel and all those things, but because your current state interacts with the world you get to have these more … more natural interactions.

RJ: So the way the system would work on a regular basis is the user would press and hold F to go into the interaction mode. They'd then be presented with a cursor that could kind of lead the choice. Choice may also depend on the proximity of the player to the various objects and the way they're facing and such, but really the cursor would kind of lead the action, and then clicking the left mouse button would actually trigger the action, so that would lead into an animation of picking something up, or could be they're inspecting a particular object. We've also added various different sub-modes such as, rather than a left-click, you right-click and we can zoom in on an object to focus on it a bit more to get that extra detail, so you can see what we're doing with it. So, it's very much cursor led to give us the precision, but then the result that comes out would tend to be something that would be animated.
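As a rough sketch of the mode flow Robert describes (hold F to enter the mode, cursor-led choice, left-click to act, right-click to zoom), in hypothetical C++; the real system is of course far richer:

    enum class InteractionMode { None, Browsing, Zoomed };

    struct InteractionInput {
        InteractionMode mode = InteractionMode::None;

        void Update(bool fHeld, bool leftClick, bool rightClick) {
            if (!fHeld) { mode = InteractionMode::None; return; }  // releasing F exits
            if (mode == InteractionMode::None) mode = InteractionMode::Browsing;
            if (rightClick) mode = InteractionMode::Zoomed;  // inspect in more detail
            if (leftClick) TriggerSelectedAction();          // leads into an animation
        }

        // Resolve the current cursor target and fire its action (stubbed here).
        void TriggerSelectedAction() { /* pick up / inspect / use, animated */ }
    };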

CR: It was important to me that you'd be able to interact anywhere on the screen for a couple of reasons. One, I just really liked the way it felt, and I got sold on that really early and through iteration there were versions of it that were more and less successful, but I'm really quite fond of it myself. But also again with the animation, your ability to look around the world is limited by your animation, by the physicality of your character. And so, if there is something that you need to interact with you need to be able to look at it, and traditionally shooters manipulate everything through a dot in the center of the screen which is you really. All of your interaction capability is a dot in the center of the screen. And that means that if you're going to interact with something, you have to be able to get it to the center of the screen, which isn't so hard when you can fudge the animations a bit, but when you're being really faithful to what's actually happening it's important to be able to look with your eyes to the edges, and you can reach anything that you can see. So, if there's interactions on your body of where you would put items to stow or to access things about the seat that you're sitting in. These things should always be available to you as well.

CM: The way it works is it uses what we call a proximity query, so it checks what's around the player in the local area to see what is interactable, but then also uses some raycasts to figure out what the cursor is currently pointing at, and then using that it can figure out its best estimation for what you're looking at right now, what you were previously looking at and what is the best result for what you should interact with if you press the interact button immediately. So, it's basically this cursor that's kind of browsing the options that are available to you.
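A hedged sketch of the selection pass Chad describes, combining a proximity query result with a cursor raycast and scoring the candidates; all types, helpers and weights here are invented for illustration:

    #include <cmath>
    #include <vector>

    struct Vec3 { float x, y, z; };
    struct Entity { Vec3 position; /* ... */ };
    struct Ray { Vec3 origin, dir; };  // dir assumed normalised

    static float Dot(const Vec3& a, const Vec3& b) { return a.x*b.x + a.y*b.y + a.z*b.z; }
    static Vec3 Sub(const Vec3& a, const Vec3& b) { return {a.x-b.x, a.y-b.y, a.z-b.z}; }
    static float Len(const Vec3& v) { return std::sqrt(Dot(v, v)); }

    // Perpendicular distance from an entity to the cursor ray.
    static float DistanceToRay(const Entity* e, const Ray& r) {
        Vec3 toE = Sub(e->position, r.origin);
        float along = Dot(toE, r.dir);  // projection onto the ray
        Vec3 closest = { r.origin.x + r.dir.x * along,
                         r.origin.y + r.dir.y * along,
                         r.origin.z + r.dir.z * along };
        return Len(Sub(e->position, closest));
    }

    Entity* PickInteractable(const std::vector<Entity*>& nearby,  // from proximity query
                             const Ray& cursorRay, const Vec3& playerPos) {
        Entity* best = nullptr;
        float bestScore = -1.f;
        for (Entity* e : nearby) {
            // The cursor leads the choice; player proximity breaks near-ties.
            float score = 1.f / (1.f + DistanceToRay(e, cursorRay))
                        + 0.25f / (1.f + Len(Sub(e->position, playerPos)));
            if (score > bestScore) { bestScore = score; best = e; }
        }
        return best;  // highlighted in-world with its interaction options listed
    }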

ZB: You know depending on what type of item or what type of action that you're going to invoke on the thing that you're focusing on, you'll get different cursors for example. So, you'll get very specific feedback on “okay this is a dial”, so your cursor might be a sort of dial-like icon that indicates that you can rotate it, or it's an onscreen cursor showing that you're over a button, or things like that. So, you get that sort of feedback that makes it much more intuitive than if, let's say, we just used one cursor for everything and there was no feedback.

CM: Building it was a challenge, because it required some collaboration for one between the studios. We had design here in LA, the main engineering was spearheaded in the UK with support here in the LA studio, and one big part of it was how to get the sophistication of that behavior to work with just screen coordinates. You have this cursor. You know the player's position. How do we figure out given this information in 3D space what is the best solution? So, you have to do a little bit of math. You have to do a little bit of testing to test it out and prototype and iterate to see what actually works. And the other issue is performance. How do you do all of these checks? How do you make it so smart and intelligent about what you want to do and what the right result is and still be performant?

RJ: The kind of challenges you get with this come from it being a system that's going to be regularly used for all manner of different objects and all manner of different scenarios. You really need to get that level of polish added to the system to make it feel good so the player isn't there getting annoyed every time they want to use an object, and also we need to kind of cover all of these various different scenarios and situations the player finds themselves in so that the player can actually do what they want to do with these objects, as opposed to being limited by a system, like the one previously in CryEngine, that was just very flat in what you could do with an object. It's really the kind of situations you'd find yourself in where, whether it's just getting in a ship, often you'd just be presented with the option to use the thing as opposed to like opening the door, deploy the ladder, choose to climb up the ladder, actually start the engine ... These were all the things that we really wanted to add, but we just didn't really have the system there, so it was a real challenge to get that system in place that allowed us to do all these things that we want to do, but also to a nice polish level where it wouldn't feel frustrating or tricky to do.

CR: Part of what’s been so difficult in getting this system together has been as we built the various systems of the game, we had to make them in isolation so that they would function. You know, regardless of the fact that the rest of the game wasn’t there yet. So when we.. now that we have those things and we want to bring them all together, consolidate them into something that’s a bit more robust and sensical. Building that is difficult because you have to take all these really specific behaviours that are tailored to all these systems and create a generic interaction object that needs to sync with a generic useable object which needs to sync with a generic animation object, and all these things have to... they have to be that generic because they need to touch so many parts of the game.  

On top of that, there’s the fact that we have these interactions that we want to do that need to coexist with a wide variety of gameplay. You could be in a seat chilling, looking around, browsing MobiGlas, you could be in the middle of a firefight… you could be frantically repairing something, you could be cautiously exploring a derelict. There’s all these wildly different experiences that this needs to accommodate, and it took a lot of experimentation to sort of feel out what allowed you that degree of expression and nuance without impinging too much on the other systems that were so far afield from what you were doing now.

For example, the grabby hands system as it’s been so notoriously been labeled, was an exploration in how do we create a system that accommodates all these different types of carried objects and what does it mean for something to be in your possession and in your inventory when we don't really have an inventory, we have physical places on your body. How do you access those things now, how do you put them into the world, how can you get close enough to take them back, and so a lot of those questions have also extended into this system. Which has been a hub of all these things like the terminals as well.    

RJ: There’s some real intelligence added to the system to create that nice feel of an intuitive kinda selection of the objects that you’re close by, what you’re trying to interact with. If you don’t add in some sort of intelligence there it can lead to kinds of frustrations: clearly the player’s trying to interact with a certain object sat in a particular position, but without the intelligence it might require a very specific alignment of player to object, and there might be more cool objects in one scene or close by. So we have to sort of get some kind of intelligence in to try and figure out which object the player wants to actually interact with.

CM: Whenever you interact with something in the world, the way that thing is designated as being interactable is using the Item 2.0 component system. So a designer would create some kinda record for something they want to be interactable, it could be an elevator, it could be a door, and then they’ll give it the interactable component, and on the interactable component they define the sets of interactions that can be used with it, then the interaction points that are in that entity, where they want those interactions to be shown. One thing the new interaction system adds is the ability to have interactions on particular bones and subregions of an entity, whereas in the old system you had these large bounding boxes and you kinda would get lost whenever you tried to find things, it wasn’t clear.
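Chad's description suggests a data-driven schema along these lines: a designer-authored record gets an interactable component listing the interactions and the bones/subregions they attach to. A minimal hypothetical sketch, not CIG's actual Item 2.0 format:

    #include <string>
    #include <vector>

    struct InteractionPoint {
        std::string boneName;     // e.g. "handle_door_left" - a subregion, not a big bounding box
        std::string interaction;  // e.g. "Open", "Lock", "CallElevator"
    };

    struct InteractableComponent {
        std::vector<InteractionPoint> points;  // authored per designer record
    };

An elevator record, say, would attach one interaction per button bone, so the cursor can distinguish "floor one" from "floor two" instead of offering a single vague "Use".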

CR: So the important thing about having these generic components is it gives us the building blocks to make things that are ultra bespoke. The problem with bespoke content is that it requires a lot of painstaking maintenance because as you move forward, things about your technology change, things about the environment in which you’ve placed this content change, and in order to keep it all working… lock, step and sync, you have to be really vigilant going through all those things. When you make your behaviours separated and modular and generic, it makes it so that you can build things out more conceptually. This thing is heavy but it can be picked up and it takes two hands. So that’s going to affect what you can be holding at the time you try and interact with this thing, it’s going to affect what’s going to be the result of throwing this thing, what’s going to be the result in zero-G of collisions with other things. Those are fairly simple examples and already it’s started to spiral out into all these possibilities: when you can get really specific in the content that you’re making, it’s usually because the rest of it has been nailed down.
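Calix's "heavy, two-handed" example maps naturally onto a generic descriptor that other systems read instead of bespoke per-object logic; an illustrative sketch with invented names:

    struct CarryProperties {
        float mass      = 1.f;    // feeds throw velocity and zero-G collision impulses
        bool  twoHanded = false;  // e.g. blocks carrying while a weapon is drawn
    };

Because throwing, zero-G physics and the interaction prompts would all consult the same few fields, the authored content stays generic while the resulting behaviour can still be very specific.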

RJ: We’d use similar subsystems within the game such as the zone system - zone system queries, for example - to figure out which objects you can interact with were in the proximity of the player. We’d also use sort of standard engine techniques such as ray casting and other such things to actually determine which of these objects were best used, but really there was a whole other layer of logic added for this system, to give us the feel and depth that we wanted to add for interactions.

CR: So now there is a through line of input that you can interact with the whole game and it remains consistent. It adds quite a lot of exciting possibilities, it gives you the opportunity to have essentially a point and click adventure game in your shooter. So all those wild bespoke interactions that adventure games are built out of are suddenly available for something that is so systemic such as Star Citizen.  

CM: I think the biggest change in player experience is kinda a change in mindset when you’re using it. When you used the old interaction system you’re kinda fishing around and you’re not sure what you’re looking for. When you use the new interaction system it’s more about fluidly browsing what you can clearly see and it’s a really big difference, I think, because you’ll walk into a room, you enter into this interaction mode and immediately you can tell, based on this highlighting that’s happening, here’s the options on what I can work with right now. What do I want to do given these options, and you can kinda float the cursor around, see what’s close, see what the different available interactions for those different objects are. It might be, you can turn on the engines, or I can turn off the power, do I want to do that. So, I think it’s just a more enjoyable experience, you also don’t have to move so much. Previously with the old interaction system, if you wanted to go interact with this thing over here, you’d have to go over there, you’d have to position yourself, you’d have to look. It just takes more work to look at the different options. Whereas this, you could kinda stay put a little bit more and kinda more fluidly guide yourself through the different options. You don’t have to do as much work to see what is all there.

CR: The focus up to now has been on consolidating all of our behaviours and tech to get to the point where as we’re building things we can just sort of stitch them together conceptually and be able to get really specific with things and that frees us up to have all sorts of creative ideas and this is just the start.           

ZB: What we’re also working on in conjunction with this interaction system is something that we’re calling render to texture, and in the UI sense that’ll allow the UI to render properly within the rendering pipeline. It’ll pick up all the post effects and it’ll actually look like it’s in the world, so that’s another thing that’s going to really make your interaction feel much more in-world, like you’re part of the game experience. The other thing it’ll allow us to do is project onto curved surfaces, which in terms of sci-fi settings is probably one thing we want to have. So wherever your cursor is, it’ll allow us to easily negotiate where… what you’re actually over in terms of UI, so if you’re over a button that’s on a curved surface, the position of your cursor is easier to map out on a curved surface with this render to texture tech. So, that’s kinda an example of all these different pieces that are going to be coming together to make a fulfilling game experience.
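The curved-surface point is the neat part: once the UI lives in a texture, a cursor hit on the screen mesh reduces to reading the UV coordinates at the raycast hit point, so the curvature stops mattering. A hypothetical sketch of that last step:

    struct Vec2 { float u, v; };

    // hitUV comes from the physics raycast against the (possibly curved)
    // screen mesh; the UI itself only ever sees flat texture-space coordinates.
    Vec2 CursorToUiPixels(const Vec2& hitUV, int uiWidth, int uiHeight) {
        return { hitUV.u * uiWidth, hitUV.v * uiHeight };
    }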

CM: I think the one thing I would add is that this new interaction system is starting to come online, we’re starting to add new things but there’s still a lot that can be added. Like every day we’re thinking of new possibilities for how this interaction system can be used, where have we used other solutions in the game that maybe can be replaced by this. One thing for instance is the item port system in the hangars, it’s kinda its own thing right now but it shouldn’t be… it’s almost identical to what we’re doing with the interaction system so it doesn’t have the same benefits. It’s an opportunity for us to unify the systems and gain the benefits from the new interaction system, but beyond that there’s even new wild things we could possibly do, try and bring more of the ship HUD for instance into the interaction system in some parts. To where maybe you could look at something, say, that’s displaying the ships that are in the area, it might then be possible to select one of them and find some information about it or communicate with that particular player. Using the new interaction system we can generate new interaction points and new interactions at run time. So it gives us a lot of flexibility with deciding what behavior do we want to open up to the player, with the benefit that it’s all using one system to do it. This game has been very iterative and I think it’s one of the great things about it, but certainly the interaction system is a great example of that. We kinda started with this stuff we have in CryEngine and it works, it’s not like it was broken but it didn’t have everything that we needed, so we started adding stuff to it but it was buggy, it wasn’t performant, it didn’t do quite what we needed, so we kept having to iterate and change and invent and make new things. Now we’re getting to this place where eventually, after a lot of work, we’re bringing it all together and I’m really excited once we get it into the players’ hands and they can start to see all these things coming together.

Outro With Eric Kieron Davis (Senior Producer), Kirk Tome (Lead Technical Designer). Timestamped Link.

EKD: You know as the guys said the new system is just the groundwork for more realistic experiences here in Star Citizen. It’ll continue to grow just like our universe.

KT: Yes and it’ll affect every aspect of the universe to create a more immersive player experience.

EKD: Before we go I just want to remind the subscribers that they can fly the Drake Buccaneer as part of our ship of the month. Subscribers will also get an Icarus I holomodel as part of their flair this week and if you’re interested in learning about our subscriber program, check out the link in the description.

KT: That’s all for this episode of AtV, we want to thank all our backers for your continued support…

EKD: Yeah.

KT: You’re the reason we’re able to create the best damned space sim ever.

EKD: And we’re also very grateful to all our subscribers who make shows like this possible, thank you.

KT: And thanks for watching, we’ll see you…

KT/EKD: Around the Verse.  

StormyWinters

Director of Fiction

Moonlighting as a writer in her spare time StormyWinters combines her passion for the written word and love of science fiction resulting in innumerable works of fiction. As the Director of Fiction, she works with a fantastic team of writers to bring you amazing stories that transport you to new places week after week.

Sunjammer

Contributor

For whatever reason, this author doesn't have a bio yet. Maybe they're a mystery. Maybe they're an ALIEN. Maybe they're about to get a paddlin' for not filling one out. Who knows, it's a myyyyyyystery!

Desmarius

Transcriber

When he's not pun-ishing his patients or moderating Twitch streams, he's at Relay pun-ishing their patience as a member of the transcription staff. Otherwise, he can be found under a rock somewhere in deep East Texas listening to the dulcet tones of a banjo and pondering the meaning of life.

"If you can't do something smart, do something right." - Sheperd Book