As per usual, anything said during the show is subject to change by CIG and may not always be accurate at the time of posting. Also any mistakes you see that I may have missed, please let me know so I can correct them. Enjoy the show!
Contributed to the Coil volumetric cloud, exterior effects for slaver base on Gainey, and bespoke effects for cinematic sequences
Refactored cover-movement transitions in the Movement system to include improved path analysis enabling better transition smoothing
Revealed first version of updated spaceship behaviours: AI ships are now fully controlled by the NPCs inside them
Introduced new Subsumption functionality: new event dispatcher, master graph system and dynamic TrackView scenes
Lots of effort showcasing moments and areas in Vertical Slice: establishing shots, framing points of interest, and improving readability
Acquired lots of ideas for better tools and workflows going forward to ultimately improve player experience
Finalising legacy lighting conversion to volumetric fog technology
Building a lighting language for the upcoming modular Truckstop features
First art pass for the Gemini F55 LMG and the Torral Aggregate Kahix missile launcher
First art pass for A&R laser cannons (s1-6) and the Gallenson Tactical gatlings (s1-3)
Blocking out Trident Attack Systems laser beam weapons (s4-6) and Associated Science & Development distortion repeaters (s1-3)
Working on scripts and tools to speed up the art pipeline, e.g. a modular weapons creation script
Responsible for quite a bit of content in the December reveal and pushing tech to make it all happen for the livestream
Scaling up Vat Tagaca to be more intimidating required careful adjustments to the performance capture data so it lined up with the environment and other actors; some changes to the Argo mesh were needed as well
Custom lighting is used to get the right mood for a scene.
Launch sequence of the Gladius fighters required a lot of new tech and code improvements, helped drive feature development - including AI, TrackView nav spline, and a new cinematic camera.
For big cinematic scenes, gold standard sequences are created utilizing the most likely responses to create a linear cinematic master take that informs the departments what the final product should look, feel, and behave like in-game with all systems active. These scenes have also been used to develop the workflow switching to and from subsumption and to and from TrackView.
Other work for the livestream included several cutaway scenes - some of these will be optional in the final game.
A lot of back and forth between the cinematics team and the game designers about camera control during cinematics. Agreed upon solution is to cutaway the camera for important scenes, but allow the player to easily regain camera control if they’d prefer to see the scene from their character’s perspective.
Players can walk/move around during a conversation and the effects related to conversations (reduced walking speed, camera focus) remain in effect.
Cheating to get a risky shot is okay to prevent actor injuries, but even small scenes can require lots of logistical planning
The DE Tech Art Team is extending the FPS weapons pipeline with a new tool that will make production more efficient and give finer control over the overall aesthetic
The DE Engine Team focused on improving performance and addressing issues for the current live release via a new lighter weight memory tracking tool and new scripting for automatic analysis related to memory leaks and allocations
Nikko, a new engineer, started working on a more accurate ambient occlusion solution
Shader optimization continues including a rework of the cloth variant providing more consistency in shading through the game
The System Design Team focused primarily on tightening up the first major mission givers, improved FPS combat AI believability, and a few other items that should make the game feel much more alive
The build engineers and the Austin studio ran virtualized and nonvirtualized testing comparisons to reduce the layers of redundant variables and to improve the overall incremental building process
They also added an extra layer for bug checks for riskier code changes
The DE Team recently had its first taste of mission testing regarding the mission givers, Eckhart and Ruto, via shelved Perforce changes from system designers and QA
The Subsumption and Lumberyard editors continue to be used every day for anything related to procedural planet tech
John Lang is the main point of contact and QA tester for anything S42 related out of the Frankfurt office
Towards the end of last year the DE QA Team was included in review meetings and tightened up the whole issue-catching process, decreasing turnaround time with a rinse-and-repeat approach, not unlike shampoo
The Environmental Art Team spent most of last month prepping for 2018 to update planet tech and tools to give more diversity and control of colors and materials as well as shaders to improve the visual palette and quality
Work on Lorville, the Hurston main landing zone, has begun; it will include a big hub area like Levski and Area 18, with the possibility of trading in the Hurston Dynamics flagship store
Sandi Gardiner (SG): Hello and welcome to another episode of Around the Verse. I’m Sandi Gardiner.
Chris Roberts (CR): And I’m Chris Roberts.
SG: This week we have an in-depth studio update from Foundry 42 in Frankfurt, Germany.
CR: Yes. They’ll be giving a closer look at some of the work they did on the Squadron 42 Vertical Slice, and we’ll see some of the new optimisations and backend tech they’ve been working on for both Squadron and Star Citizen.
SG: Let’s check in with Brian Chambers now and see what the Frankfurt team has been up to.
Brian Chambers (BC): Hello everyone. My name is Brian Chambers, Development Director of the Foundry 42 Frankfurt office. Since our last studio update the team's grown by a few people, putting us currently at 79 people strong.
The end of last year was a big, coordinated, global effort to get 3.0 out to everyone, as well as wrapping up the end-of-year Squadron 42 Vertical Slice playthrough. The team then took a couple of weeks off for the holidays, and everyone came back recharged and ready to focus on their next tasks.
Once back, the start of the year involved a large amount of scheduling and planning across nearly all disciplines: laying out plans to distribute work amongst the various offices in the most efficient way possible.
Since January was a lighter-than-usual month in terms of work time - due to the holidays - this update will include some items from the previous update, but with a bit more detail and visuals to accompany them.
So let’s first check in with the VFX team.
The most recent work for the DE VFX team was focused on the S42 Vertical Slice. They contributed to the Coil volumetric cloud, exterior effects for the slaver base on Gainey, and bespoke effects for the cinematic sequences.
The slaver base’s exterior was dressed in suitable effects for the dusty planet surface and low technological style. And the cinematic effects were bespoke for the various cutaways in the Vertical Slice.
The work on the Coil gas cloud tech went through a good amount of R&D with the team: they were focussed on developing new methods to create an interior volume while keeping in line with the refined and approved concepts and art direction. The process they used for the Coil involved software called Houdini - which is not natively supported in Lumberyard - so a portion of their time was spent on developing, optimising, and integrating it into the new pipeline. The entire process went through numerous iterations, and for the final look they paid really close attention to how the interior created interesting compositions and visuals for players while flying.
The past couple of months have been very productive for the AI team, with a split focus on both Squadron 42 and the Star Citizen PU. During the Squadron 42 holiday livestream we were able to show a first version of the human combat on foot. To prep for this the team spent some time refactoring all the cover-movement transitions inside the Movement system. The system now includes improved path analysis by the AI - so that NPCs know further in advance which type of animation should be played - enabling us to better smooth the transitions from locomotion to special transition animations if and when needed.
Ship AI was also given some attention during the past month. During the S42 reveal we were able to show our first version of the updated spaceship behaviours. Our Movement system is now in control of both on-foot and flying movement requests, and Subsumption is in charge of controlling the pilot and seat operator behaviours. The AI ships are now actually fully controlled by the NPCs sitting in the operator seats, and each operator has specific behaviours to use the items they have control of.
Regarding Subsumption, they introduced several new functionalities: the new Subsumption Event Dispatcher, the Master Graph, and the dynamic TrackView implementation.
The Subsumption Event Dispatcher is a way to fully support Subsumption events without relying on any other external systems. It's composed of a central system that allows user code to create and send an event. The events are created in a pool to allow the AI system to efficiently handle and reuse the allocated memory, and each signal's lifetime is automatically managed by specific structures called "handlers" that also allow fast access to the signal itself. User code can send an event directly to an entity, or send an event to all the entities within a range of a specific location. Sending events in a range is now "zone safe", and all the code efficiently uses the Zone system for spatial queries and entity filtering.
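The shape of a dispatcher like the one described above can be sketched in a few lines of Python. This is purely illustrative - every name and structure here is an assumption, not CIG's actual API - but it shows the core ideas: events drawn from a reusable pool, handler-managed lifetimes, direct sends, and ranged sends with distance filtering.

```python
# Illustrative sketch of a pooled event dispatcher (assumed names, not CIG's API).
import math
from collections import deque

class Event:
    __slots__ = ("name", "payload", "alive")
    def __init__(self):
        self.name, self.payload, self.alive = None, None, False

class Dispatcher:
    def __init__(self, pool_size=64):
        # events live in a pool so memory is reused, not reallocated per send
        self._pool = deque(Event() for _ in range(pool_size))
        self._entities = {}  # entity id -> (position, callback)

    def register(self, eid, pos, callback):
        self._entities[eid] = (pos, callback)

    def _acquire(self, name, payload):
        ev = self._pool.popleft() if self._pool else Event()
        ev.name, ev.payload, ev.alive = name, payload, True
        return ev

    def _release(self, ev):
        # stands in for the "handler" that manages the event's lifetime
        ev.alive = False
        self._pool.append(ev)

    def send_to(self, eid, name, payload=None):
        """Send an event directly to one entity."""
        ev = self._acquire(name, payload)
        self._entities[eid][1](ev)
        self._release(ev)

    def send_in_range(self, origin, radius, name, payload=None):
        """Send an event to every entity within radius of origin.
        A real implementation would use the Zone system's spatial queries
        instead of a brute-force distance check."""
        ev = self._acquire(name, payload)
        for pos, cb in self._entities.values():
            if math.dist(origin, pos) <= radius:
                cb(ev)
        self._release(ev)
```

For example, registering two entities and sending an event with a 10 m radius would reach only the nearby one, while a direct send always reaches its target.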
The Master Graph is a way of building relationships between multiple Subsumption activities. We want a general way of handling specific assignments requested by designers, as well as combat and regular activities. The Master Graph allows us to specify transitions between scheduled activities and other logic that should be executed when specific events are received.
They also provided support for dynamic TrackView scenes. A dynamic TrackView scene is a cinematic scene where the participants can be replaced at runtime with actual AI NPCs. This allows cinematic designers to work on their scenes in a controlled environment, but also allows level designers to populate their levels with NPCs that might have or need different attributes, such as specific clothing or other customisations. Any such changes to a character carry over into the NPC and become part of the cinematic scene. Also, dynamic TrackView scenes can potentially be interrupted, so the AI system needs to know how to take over if, and when, that happens.
Closing out 2017 the Lighting team was focussed entirely on the Squadron 42 Vertical Slice for the holiday livestream. With lighting generally coming in at the end of an art pipeline a lot of effort needs to be placed on showcasing key moments and specific areas: establishing shots, framing points of interest, and improving readability for gameplay. With some things being worked on right up to the deadline this is no easy task, but the experience was extremely useful and provided the team with ideas for better tools and workflows going forward to ultimately improve the player experience.
Entering 2018, while continuing to support ongoing work for Squadron 42, they began training a new member of the lighting team who will help finalise our legacy lighting conversion process to the new volumetric fog technology, and also polish and build on new content in the PU. Initially this will involve building a lighting language for the upcoming modular Truckstop features. Since these truckstops won't always be hand built, we need processes to automatically place lighting in environments that feel believable, are light on performance and, obviously, bug free.
Last month the FPS Weapons art team continued to work on the Gemini F55 LMG and the Torral Aggregate Kahix missile launcher, both of which have now finished their first art pass.
Similarly, the Ship Weapons art team completed their first art pass for the A&R laser cannons (size 1-6) and the Gallenson Tactical gatlings (size 1-3). They started blocking out the laser beam weapons (size 4-6) for Trident Attack Systems and the distortion repeaters (size 1-3) for the manufacturer Associated Science & Development.
The team's also been working on scripts and tools to help further speed up their art pipeline. One such script is a ship weapon tool made to assist in the creation of modular weapons. This new tool lets us create a larger number of weapons in a smaller amount of time by allowing artists to preview different component setups in real time and automate the export process directly to the game engine. All animation exports and engine-related meta files are handled by the script, leaving the artist to focus mainly on making the actual art itself.
For the Cinematics team, based here in Frankfurt, it's rare that I can actually go into detail on their progress, since we're doing our best to keep the full Squadron 42 story a surprise. Since we recently showed portions of Squadron 42 on our holiday livestream, I thought we would take the opportunity to go into some detail on what was done with the cinematics that were shown. So here's Hannes Appell, our Cinematics Director, to give you an update.
Hannes Appell (HA): With Squadron 42 being such a narrative and cinematic heavy game the Cinematic team was responsible for quite a bit of content in the December reveal, as well as pushing tech to make it all happen for the livestream. Today we will go into more detail about what it takes to create some of these scenes.
Before players get to walk to their ship and launch, we're witness to the arrival of the OMC prisoners on board the UEE Stanton. For that we actually scaled one of our characters, Vat Tagaca - who's played by Craig Fairbrass, the brawn to Khan's brains so to speak - up to be more in line with his actual, real-life intimidating self. That meant careful adjustments to the performance capture solve so it all lines up with the environment and other scene members.
Owen Robertson (OR): I’m Owen Robertson. I’m one of the senior cinematic animators here at Foundry 42 in Frankfurt. I was responsible for working on a couple of the scenes from the demo that we showed in December. I want to talk about one of the issues we had with scaling one of the prisoner characters.
One of the issues is that it's not just the character's model that gets scaled, but also the skeleton, and therefore the animations get scaled too. That means when a character moves on set - and that animation gets scaled - they're no longer in their correct position, which means they might not interact with another character properly, or they can't interact with the environment properly. So we need to adjust for that in the animation.
So the way we resolved this issue was to first scale up the character and then apply the inverse of that scale to anything that would affect their translations - or movement - through the scene. And I have an example here to show you.
Okay, so here I have a portion of the scene that I was working on with Tagaca and the two other prisoners. Tagaca walks up and stands directly between the two other prisoners. And this is before he’s been scaled. So we can see that his height is the same as the other characters and he doesn’t look quite as imposing as we would like him to.
So the first step was to scale up the character: we decided we wanted to scale him by ten percent. When I do that he’s now larger but also his position has shifted and he’s not where he should be on set. So the way we resolve this issue is we apply the inverse of that scale to anything that affects his translation through the scene.
So in this case we’d select his root controller, his pelvis and both feet. In the Curve Editor we need to select the translation tracks in the X and Z planes - I’m not worried about the Y plane I can adjust that later - for all those controllers. And now the question is how much scale do we need to apply to bring that back to the original performance.
So the amount of scale we need to apply is the reciprocal of the original scale factor of 1.1. We can work that out quite easily by doing 1 divided by our original scale, 1.1. That gives us a value of 0.90 recurring. So I’m going to copy that. Select my animations curves and apply a value scale of that amount. And when we scale those keys Tagaca is now back in the correct position but he’s also at the correct scale that we wanted and everything looks much better.
And then it's ready at that point for a final polish pass before being handed off to the cinematic designers.
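The counter-scaling fix Owen walks through above can be sketched in a few lines of Python. This is purely illustrative - the keyframe layout and function name are assumptions, not the actual Maya data or workflow - but the arithmetic is the same: scale the character by 1.1, then multiply the X and Z translation keys by the reciprocal, 1/1.1 ≈ 0.909.

```python
# Illustrative only: counter-scaling translation keys after a character
# is scaled up, so the performance stays lined up with the set.
SCALE = 1.1                 # character scaled up by ten percent
INVERSE = 1.0 / SCALE       # reciprocal: 0.9090... ("0.90 recurring")

def fix_translation_keys(keys, scale=SCALE):
    """keys: list of (frame, x, y, z) translation keyframes.
    X and Z are counter-scaled; Y (height) is left for manual polish."""
    inv = 1.0 / scale
    return [(frame, x * inv, y, z * inv) for frame, x, y, z in keys]

# A walk whose translations were implicitly scaled along with the skeleton:
scaled_keys = [(0, 0.0, 1.0, 0.0), (30, 1.1, 1.0, 2.2)]
fixed = fix_translation_keys(scaled_keys)
# X/Z land back at (approximately) the original 1.0 and 2.0
```

The net effect is what Owen describes in the Curve Editor: the character keeps its 10% larger scale, but its root, pelvis, and feet travel along the original captured path.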
HA: It also meant a couple of mesh adjustments on the Argo pod door and landing gear, which the ship team did so the performance capture would work. To get the most out of this foreboding scene we also wanted darker base lighting for the aft section of the Idris hangar. We spoke to the ship art director and introduced a lower light state for that section of the ship that is a lot moodier than the default base lighting. Once this light state was done we took it as our starting point and set up additive cinematic lighting to finalize the scene.
The launch sequence of the Gladius fighters was a big set piece that required a lot of new tech and code improvements to make it all happen. We aimed for elements like a real AI Old Man inside his real AI ship, an AI deck crew, and the signaling between the player and the air traffic control officer to be done in a proper way that would push feature development on the game further along. For that we worked together with AI engineering and our cinematic tools engineers, as the launch sequence also meant more work for our TrackView nav spline. That spline allows us to puppeteer AI ships precisely, and we had to make it work going from the interior zone of the hangar to the outside Shubin space seamlessly. We also worked together with engineering on several features that help enrich our game cinematically. We prototyped and then enabled an always-on, out-of-focus, f-stop based depth of field - mostly for non-combat situations - that adds a lot of filmic realism to walking around and exploring interiors like the UEE Stanton's corridors. We are currently in the process of dialing in a rule set for this and how strongly the depth of field will kick in under certain circumstances.
Another bit of tech that came online late last year was the ability to use light groups as cinematic light layers for our scenes. Any scene can be tagged up, and a corresponding light layer can be triggered to fade in with a default or custom individual timing per light coming online. This can be a light rig per scene, but it can also be per location - both are viable. We called it our Cinelighting rig, and it helps push hero scenes further than what the environmental base lighting would give us.

For big scenes like the briefing with Captain White or Trejo's rescue on Gainey base we are doing what we call gold standard sequences. This is a linear cinematic master take on the sequence, with the most likely dialogue bits pre-chosen and linked together, to show all the departments working on that part what we want to achieve with the actual in-game version of that scene running with all the AI characters online and all bells and whistles active. It helps animation dial in all the transitional clips and the facial pose matches, it is integral for lighting and visual effects for the scene, and it shows how everything should behave when it all comes together. It also informs level design about potential needs for environment funneling or staging of the player.
Once the gold standard is done we shift the scene over to systemic AI - our conversation system or Subsumption behaviors - and compare that to the linear gold standard to see if we're happy with the end results. Scenes like the briefing with Captain White were also used to prototype and develop our workflow for switching to and from Subsumption AI control and to and from TrackView.
Further work for the livestream included cutaways that will feature during mission gameplay - some of them will be optional in the final game. This included, for example, the intro sequence showing the scale of the Stanton being dwarfed by the even more gigantic Shubin Archon station looming in the background. It included the Reclaimer cutaway, which actually features an AI character, Donna Atar: when she's talking to the player in the cinematic cutaway and the subsequent comms call on the cockpit display, she's actually sitting in her seat in her ship as an AI. This is made possible by our render-to-texture feature for comms calls. Additional cutaways were the Gainey base exterior, the turrets firing, interior base power-up cutaways, slaver scenes, and of course agent Trejo's rescue.

Early on during development, before the first performance capture shoot, we had discussions about when and where to cut to third-person and when to stay in first-person during our narrative. Of course this is the endless story of cinematic guys like me trying to convince everybody else that filmic, close-up on the characters is the way to go to give your scenes impact, so that characters convey as much of the intimacy and urgency of the drama as possible. On the other end of the spectrum you have game designers who hold player agency and immersion up as the holy grail and pretty much hate taking away control from the player at any point in time. So during development of Squadron it became clear to us that our story and the game experience we are aiming for is not served well by going to either extreme.
As most of our cinematics, especially the ones where the player is present, are real-time, we are going for an approach that will trigger cinematic cutaways with filmic cameras for important scenes but the player will be able to wiggle him or herself free of them to regain control and watch the cinematic from a more detached player perspective. So we will try to have our cake and eat it too, and serve both sides where possible.
Our conversation system already allows players to roam and circle-strafe around characters they are engaged with, and keeps them in a sticky filter with a special conversational field-of-view, depth of field, adjusted walking speed, and other effects on. This, together with the ability to wriggle free out of a cutaway, is our tool for solving the conundrum of cinematic impact versus player agency.
Last but not least, in addition to the work for the stream we also worked on a scene where our player is introduced to the character of Old Man played by Mark Hamill during an earlier chapter before the two go out flying the first patrol together and this was used for an IGN trailer.
A scene like this means a lot of planning before anything is shot, as it involves a lot of staging: who is placed where, and when? What would this mean for the camera axes, and how do we achieve that on set? The idea was to have Old Man start on the wing, then do a dramatic pause in dialogue delivery while he shows mechanic Yuri who's boss by nonchalantly sliding down the maintenance ladder leaning on the Gladius wing, and then being right in Yuri's face for the final bit.
This was pitched to Chris by me, with previz animation done in the engine. Once the staging was signed off we started building the maintenance ladder props so we would have finalized metrics for it on set. We carefully measured the wing, the thruster housing, height differences, and the outer wing cannon, and went on to planning the actual shoot of the scene with the on-set supervisor. A stunt like this is pretty much a no-go for any actor to do him or herself, but at the same time we wanted the whole scene as one fluid take, so we cheated the buildup of the Gladius wing, including everyone's eyelines, and had Mark pretend to slide down and end up in his final position next to Yuri. Our lead cinematic animator, Jason, then took the scene, gave Old Man a hand-keyframed slide down, and tweaked positioning and eyelines to get the final result.

So, even relatively small scenes like this sometimes mean a lot of logistical planning to meet the real-world requirements of a real set. I hope you enjoyed this closer look into what the Cinematic Team is up to. Thank you all for watching.
BC: Thanks Hannes. I'm glad we're actually at a point where we can start showing off some of your team's work. It's really cool to see the progress.

The DE Tech Art Team has been evenly splitting their time between work on both the PU and S42. They're currently extending the FPS weapons pipeline with a new tool. This tool will enable weapons artists to transfer skin weights to different meshes from one source skinned object. They started work on real-time cloth and flesh sim R&D, and started development of a live link between Maya and the game engine, so that animators can tweak animations - in particular facial animations - while enjoying the advanced shading quality of the in-game real-time renders versus Maya's own lower-quality viewport. This tech will be particularly useful for integrating and tweaking the pcap facial animations in S42's cinematic cutscenes. Due to the huge impact artistically controlled lighting has on the look and feel of an animation performance, iterating in real time and in-game will make things much more efficient for the team, with finer control over the overall aesthetic.
The DE Engine Team spent a large majority of the past month focused on improving performance and addressing issues for the current live release. In the process they ended up working with nearly all disciplines to help profile items and make recommendations and/or fix issues when needed. Some cases are straightforward and addressed immediately, while in other cases they might define the fix but find it too risky to address right away.

Existing memory tracking tools had become too heavy for recording and processing extended sessions of both the server and client, so they started working on a new, more lightweight memory tracking system that supports the most important features in order to track memory leaks and invalid allocations without generating gigabytes of logs to analyze. The new system has already been used to optimize memory usage on the server for 3.0 and to track leaks. They also implemented Python scripts to analyze the resulting log files, as well as to compute a difference between two of them to see how memory allocation behavior changes over time in various parts of the code base, which allows us to find leaks as well as trim excessive memory usage.

The team also fixed several issues related to how data is collected in our crash database, Sentry, so we can see again how many different clients are affected by a certain crash. They also fixed an issue that caused bugs to be incorrectly categorized as GPU crashes, due to stale files being left on the client after previously submitting a real GPU crash.
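As a hedged sketch of the kind of log analysis described above: a small Python script can total allocations per call site and diff two snapshots, surfacing the call sites whose usage grew between them - which is where leaks and excessive usage tend to show up. The log format here is invented for illustration; the real tools and formats are internal to CIG.

```python
# Illustrative only: diff two allocation logs to find growing call sites.
from collections import Counter

def parse_log(lines):
    """Each line: '<callsite> <bytes>'. Returns total bytes per call site."""
    totals = Counter()
    for line in lines:
        site, size = line.rsplit(None, 1)
        totals[site] += int(size)
    return totals

def diff_logs(earlier, later):
    """Keep positive deltas only: allocations that grew between snapshots."""
    a, b = parse_log(earlier), parse_log(later)
    return {site: b[site] - a.get(site, 0)
            for site in b if b[site] - a.get(site, 0) > 0}

earlier = ["Renderer::CreateTexture 4096", "AI::SpawnNPC 1024"]
later   = ["Renderer::CreateTexture 4096", "AI::SpawnNPC 8192"]
print(diff_logs(earlier, later))  # {'AI::SpawnNPC': 7168}
```

Running the same diff against snapshots taken minutes apart during a long server session is a cheap way to separate steady-state allocations from ones that keep growing.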
One of our newest engineers, Nikko, started working on an ambient occlusion solution which is closer to ground truth, meaning it's more accurate. It's horizon-based SSDO. Previously the shader only sampled at the end of each occlusion test ray across the surface hemisphere. With the rework it also takes samples along each ray, in order to better detect occlusion in between and prevent undersampling. We're in the final process of optimizing the shader, as the first pass is fairly expensive due to the number of samples required. They also reworked our cloth shader to use a more physically based shading model. It provides more consistency with the rest of the shading throughout the entire game.
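To illustrate why sampling along each ray helps - this is a toy 1D heightfield example, not the actual SSDO shader - an occluder that sits between the shaded point and the ray's endpoint is invisible to endpoint-only sampling, but is caught when the ray is marched in steps:

```python
# Toy 1D heightfield example: endpoint-only vs marched occlusion sampling.
def occluded_endpoint_only(height, x, direction, ray_len):
    # test only the far end of the occlusion ray
    return height(x + direction * ray_len) > height(x)

def occluded_marched(height, x, direction, ray_len, steps=8):
    # take samples along the ray; occluders in between are now detected
    for i in range(1, steps + 1):
        if height(x + direction * ray_len * i / steps) > height(x):
            return True
    return False

# A thin spike at x = 0.5 that the ray from x = 0 to x = 1 steps over:
spike = lambda x: 1.0 if abs(x - 0.5) < 0.05 else 0.0
print(occluded_endpoint_only(spike, 0.0, 1.0, 1.0))  # False: misses the spike
print(occluded_marched(spike, 0.0, 1.0, 1.0))        # True: the march finds it
```

The trade-off is exactly the one mentioned above: each extra sample per ray improves accuracy but multiplies the shader's cost, hence the optimization pass.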
In December the System Design Team was busy finalizing work on both 3.0 and the Squadron 42 Vertical Slice. A lot of the work went into the first major mission givers, Eckhart and Ruto, making sure there are no edge cases where either of them gets stuck or players can abuse them. Each of the two presented different challenges, as each has their own conversation flow and their own way of being found and triggered. Similar to the mission givers, they completed the first implementation of the admin officer for all of our major locations. Those also had a lot of edge cases that had to be considered, and a lot of work went into making sure that players can't mess with them by blocking them for too long or giving them items that they don't know how to properly handle. Some smaller issues still exist, but they have solutions in mind which should sort them out fairly easily. A big focus was also put on our FPS combat AI, and the current pass is starting to show a lot of promise: they are beginning to act in a more believable manner in combat. Behaviors will continue to be added and the timing of combat will be adjusted, all in an effort to get AI fights to feel both challenging and fun to play. They also completed a few other things - doors now have override pump upgrades, hatches received cuttable locks, Idris beds got upgraded with shutters, and more and more complex usables are getting added - which should make the game feel much more alive.
The build engineers recently worked with our Austin studio during low-traffic times to prepare and run tests on our virtualized TryBuild cluster. A non-virtualized hardware TryBuild test cluster was set up to identify common issues and differences in behaviour between the virtualized and real-hardware groups. Focus then shifted to the virtualized cluster to run standardized tests, to eliminate any variables introducing issues that would result in the loss of truly incremental builds, and to identify areas that could be optimized to increase the flow of code changes being validated through TryBuild back into the Star Citizen code base. After a series of variables had been reduced through this testing, they worked closely with Mike Pickett on solutions to eradicate any steps in the code validation process that are destructive to incremental building, such as files being touched in a reversal process that would then require recompilation in the next code validation, unrelated to the change being tested.

This past month they also added a QA test request option within TryBuild, which has already proven beneficial for the team. They then developed a feature to offload those test-request TryBuilds to dedicated agents in an effort to reduce compile times. With this new feature, engineers now have the possibility to have their changelist compiled against another specified changelist. Within the TryBuild UI the user can select an archive option and will receive a notice with a link to a zip archive containing their binaries. At this point a programmer can send a request to QA asking them to thoroughly test the binaries, making sure that their fix and/or new feature works as expected before checking in their work. This puts an extra layer of bug checks in place, primarily to be used with code changes that are deemed riskier than others.
Mission testing for the QA Team is primarily done out of our UK office, but the DE Team recently had their first real taste of mission testing with both Eckhart and Ruto. Issues were entered for anything that was not working as intended, as well as for anything that would hinder accessibility to these two mission givers. The system designers worked with QA to quickly shelve changes for these NPCs so that QA could pull the shelved changes down from Perforce and test how things were working before the changes were checked into a build. Any issues encountered were brought up, identified, and addressed, and the process would repeat until Ruto and Eckhart were at their intended level of quality for 3.0. Additional multiplayer ad-hoc testing was also completed, and QA provided feedback on how these two mission givers function when more than one player is involved. The QA team also focused on testing and regression for the most recent 3.0 build, continuing to provide support for the Frankfurt development team as needed. The Subsumption editor continues to be part of their everyday testing, as does the Lumberyard editor and anything related to procedural planet tech. John Lang is our resident S42 QA tester in Frankfurt, being the main point of contact for anything S42-related out of the DE office.
Close to the end of last year the rest of the DE QA Team was brought into the S42 testing loop, attending regular review meetings to discuss the progress of S42. This ensured that a tester was available to support the dev team working on any specific feature at any time, which made the turnaround for catching feature-relevant issues much quicker. This structure enabled QA to quickly test changes the moment they were in a build, write up Jira reports for issues where needed, and send them to the appropriate devs to be fixed. The whole process is rinse and repeat until we eventually arrive at a build in the best possible playable and visual state.
Last month the DE Environment Art Team spent the majority of their time prepping the work needed for 2018. It's always good to look back at the progress made during the year and realign goals for the new year where necessary. A lot of preparation and R&D effort has gone into updating our planet tech and tools. As we move from moons to full planets with more visually diverse ecosystems, it became clear we wanted more control and diversity of colors and materials, so work is being done to update the planet tech as well as the shaders we use on the scattered assets. Not only will this give us the visual palette we want in the upcoming locations, it will also provide a nice visual update and boost in quality for existing moons.
We all started work on Lorville, the flagship landing zone on Hurston. The team's been focused on taking the level from a level design block-out to a first pass in terms of artwork and visuals. Preliminary work on the transit system for Lorville has also started. Just like Levski and Area 18, Lorville's going to be another big hub area where players can expect new quest givers, shops, trading, and the opportunity to buy ship weapons directly from the Hurston Dynamics flagship store, located in the massive Hurston Dynamics building overshadowing the city of Lorville.
That's the update from Frankfurt. I appreciate everybody watching. The entire team appreciates the support we receive to make things like this possible and we'll see you in the verse.
CR: Alright so that was pretty cool. It was actually pretty interesting to see a lot of the little details that we worked on to deliver the fluid, cinematic, first person experience that we’re going for for Squadron. And obviously in future Squadron-specific ATVs we’ll be giving you more behind-the-scenes updates on what we’re working on. So check out those in the future.
SG: And for a “heads up” on those monthly updates on Squadron 42 development - from Frankfurt and all of our studios - head to the game’s web page and enlist for the newsletter.
CR: Yes. And we just relaunched the RSI website last week. It’s been redesigned and has some new features so make sure to check it out if you haven’t already.
SG: That’s all for this week. Reverse the Verse live airs tomorrow at a special Europe-friendly time of 8 AM PST with guests Brian Chambers and Todd Papy.
CR: Yes, and you can watch episodes of Calling All Devs every Monday, where your questions could be answered by a member of the team, and they may get the answer right, and they may not cause a forum storm. Or they may. Who knows?
Thanks of course to our subscribers for sponsoring all of our shows.
SG: Yes, and the first stage of 2018 Subscriber Perks is out, including “first wave” PTU access and some cool in-game item perks. So check those out.
And thank you to our backers: your dedication is what makes all of this possible.
CR: Yes, thank you very much everybody. And until next week, we’ll see you …
Both: Around the ‘verse!