As per usual, anything said during the show is subject to change by CIG and may not always be accurate at the time of posting. Also any mistakes you see that I may have missed, please let me know so I can correct them. Enjoy the show!
Squadron 42 Project Update
Developed a tool to highlight a number of issues with skeletons that were updating but not animating (or not needing to animate)
Feature team has been able to make items smarter about when/how they are updated
Working towards Object Container Streaming by ensuring Entity system code was thread safe so entities can spawn via background threads
Dialing in the look of the military version of the mobiGlas UI: referencing real world military and cinematic sci-fi
Continued working on animations for Duncan Chakma (Master of Arms) and polishing to release quality
Continued to develop the player's grab and inspect animations
Added several unique walk cycles for AI characters based on loops extracted from performance capture data
Focused on Subsumption stability updates for crew activities and setting up more scenes in other chapters
August Beck (Quartermaster) has her activities set up and is in-game; all AI useables in the cargo bay are going in-game now
More ambient life moments are being implemented into other chapters and locations
The Living Idris
The AI within Squadron 42 will react to players differently between playthroughs, based on a player's actions during the campaign.
However, it's more than picking choice A or B. With tech called SPOIL (Systemic Point Of Interest Look), the AI behave more like regular people on a ship, each with a personality reflected in how they talk, what they do, and how they interact with players during the campaign.
Every AI on the Idris has a routine, but within that routine they can deviate depending on a number of circumstances such as emergencies, conflict, basic necessities such as eating, sleeping, etc.
When a player leaves the ship for a mission, the ship doesn't stop working; it continues to function as if the player never existed in the first place. It's a living, breathing world inside the Idris itself.
With SPOIL the AI are aware of a player's presence and, depending on the circumstances, may react positively or negatively towards them. The AI also interact with each other not by script, but by what their sub-activities dictate.
A good example of SPOIL tech is an NPC tracking objects outside of the immediate area, such as watching a ship through an unobstructed window
Sandi Gardiner (SG): Hello and welcome to another episode of Around the Verse. I’m Sandi Gardiner.
Josh Herman (JH): And I’m Josh Herman.
SG: This week's episode features our Squadron 42 project update for February.
JH: We'll get a look at the complex AI systems being implemented in the game, using the massive crew of the Idris frigate as an example. But first we'll get to check in with the developers.
SG: That’s right let’s go to Phil Mellor now for a rundown on what the Squadron 42 dev teams have been up to.
Phil Mellor (PM): Hi everyone I’m Phil Mellor, Lead Designer of Squadron 42, with this month’s project update. So I’ll kick off with the techie stuff while you’re all still fresh ...
The Programming team's been looking into issues that affect the performance of the game, looking into various ways we can optimise the code to make the game run better. The driving force behind this is to help speed things up for the PU release, but because we have a shared codebase it means almost all those optimisations often help S42 at the same time. For S42 the biggest wins were the animation and skeleton updates. Some of the best optimisations involved discovering where the code is updating items when it really doesn't need to.
To help with our detective work our Lead Animation Programmer, Evo Hertzig, added some debug rendering that enabled us to more easily visualise objects in the world that were having their skeletons updated every frame. That enabled us to see what was being updated but not actually animating. The debug feature drew boxes around anything with a skeleton with different colours indicating different issues. This allowed the feature team to put in changes that made the items a bit smarter about how they tell their system about when their items need to get updated. We also put in additional debug information to show us when skeletons were getting updated but weren’t even visible: basically doing a lot of additional work that you’ll never see.
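The per-frame audit described above can be sketched roughly like this. The class, flags, and colour choices are illustrative assumptions, not CIG's actual debug code; the idea is just to flag skeletons that are paying an update cost with no visible payoff.

```python
# Hypothetical sketch of the skeleton-update audit described above.
from dataclasses import dataclass

@dataclass
class SkeletonState:
    name: str
    updated_this_frame: bool
    is_animating: bool
    is_visible: bool

def classify(skel: SkeletonState):
    """Return a debug colour for problem skeletons, or None if healthy."""
    if not skel.updated_this_frame:
        return None                      # not costing us anything
    if not skel.is_visible:
        return "red"                     # updated but never seen by the player
    if not skel.is_animating:
        return "yellow"                  # updated every frame but static
    return None                          # visible and animating: fine

def audit(skeletons):
    # Collect (name, colour) pairs a debug renderer could draw boxes around.
    return [(s.name, c) for s in skeletons if (c := classify(s))]
```

A feature team could then use the "yellow" and "red" lists to make items smarter about when they request updates.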
Armed with this information, the teams would be made aware of when this was happening and could change the code to be more aggressive about when things need to update. The downside is that it can cause some knock-ons, especially when something offscreen actually needs to animate but, because you can't see it, it doesn't. It's not common, but these situations end up as a job for our QA team: playing the levels, finding all the situations where the animations are broken, and getting the teams to fix them up one by one.
As always, we constantly look at all component updates and make them more batch-friendly, as well as weeding out other sections of slow code with a view to making them more efficient.
We’ve also been working on Object Container Streaming. The eventual aim for Object Container Streaming is to allow us to seamlessly load and unload parts of the solar system as players move around. That will allow us to greatly increase the amount of content within the solar system without increasing memory or CPU usage as only the parts that are relevant to you will be present in memory on your machine. Obviously this still needs to occur without stalls or loading screens.
The focus is currently on making the Entity system code thread safe so we can safely spawn entities - including creating components and loading resources such as the geometry and textures - from background threads. This is super important as it will remove the stalls that currently occur when entities spawn into the game.
For example, there's currently a stall when you request a vehicle spawn. At present we have over 450 individual entity component implementations: everything from game code - such as weapons and vehicles - to the more behind-the-scenes things - such as the network component. We're working through the list, checking and fixing every single one to make sure they are all fully thread safe. Once this pass is complete we can move on from single entity spawns to entire object containers loading in the background.
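The thread-safety goal described here can be sketched in miniature. The class and method names are assumptions and the component "loading" is simulated, but the shape is the point: do the expensive work (geometry, textures) off the main thread, and take a lock only for the brief shared registration step.

```python
# Minimal sketch of spawning entities from background threads.
# EntityManager and its API are invented for illustration.
import threading

class EntityManager:
    def __init__(self):
        self._lock = threading.Lock()
        self.entities = []

    def spawn(self, name, components):
        # "Load" components (geometry, textures, network, ...) off-thread;
        # only the final registration touches shared state under the lock.
        loaded = [f"{name}:{c}" for c in components]
        with self._lock:
            self.entities.append((name, loaded))

mgr = EntityManager()
threads = [
    threading.Thread(target=mgr.spawn, args=(f"vehicle{i}", ["geom", "net"]))
    for i in range(8)
]
for t in threads: t.start()
for t in threads: t.join()
```

Without the lock, concurrent appends to shared engine state are exactly the kind of hazard that forces spawning back onto the main thread and causes stalls.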
Now onto the mobiGlas …
As you'll all know, S42 is going to have its own military version of the device, and we've been dialing in the look of the UI. The visual elements of the reskin reference real world military HUDs and cinematic sci-fi to further immerse the player in the S42 universe. Lastly, vital data is being visually prioritised, and improvements to the navigation system add up to a better user experience.
Going into the Idris - where we do spend a lot of time in S42 - we've continued working on the animation setup for Duncan Chakma, the Master of Arms in the armoury area of the ship. With the primary focus of S42 being that fluid, cinematic, tactile feel, Chris Roberts shot a whole variety of performance capture footage for the armoury, so when you get or return a weapon - or attachment - you feel like you're interacting with a real character.
Our job here included identifying all the common poses that the actor was hitting during the performance, then matching those with the state machine while focusing on maintaining the maximum amount of performance from the actor. Some of this performance has already been seen in the livestream event, but we're polishing up the assets to full shippable release quality.
In tandem with Chakma's animations, we're continuing to develop the grab and inspect animations for the player. These are more heavily key-framed assets, as we need to come at these from a different angle and really take the player's view into consideration - more than we need to do on an AI character. There's still some way to go to get these finalised, but once they're in there they'll really connect the player to the weapon and make the armoury a believable, tactile environment.
Now for an animation update …
Our Animation department on S42 has been continuing to implement more of the performance capture Chris Roberts shot into fully-functional game-ready assets. This now means we have unique walk cycles for a lot of the people in the game, which greatly helps to add personality to each character we see moving around. We take performance capture data and analyse it for the best game-functional loop. We always aim for the maximum number of cycles possible, to keep as much variation in the walk as we can. Once the animator has cleaned up the motion capture data and got it seamlessly looping, we export it into the game engine and start to plug the assets into the character-specific blend space to see how it looks in situ.
Staying on the Idris another area of focus has been on Subsumption stability updates for crew activities. We’ve taken what was learned during the Vertical Slice and continued to optimise the primary and secondary activity set up, wild lines, etc. along with further tweaks and updates to all those logic graphs. Using the refined system we are now setting up more scenes for other chapters in the game such as conversations between AI that the player can witness as seen in the VS. These are more complicated scenes that involve multiple AI characters for longer periods.
There's been a lot of work going into improving the pipeline for getting these scenes into the game via the Subsumption system. For example, August Beck - the Quartermaster on the Stanton - now has her activities set up and is in the game. The Quartermaster generally takes charge of stocks and distribution, and oversees the work done by the support crew characters that do a similar job. All the AI useables in the cargo bay are now going into the game for the AI to interact with, so we will see her moving around the area much more believably, checking stock and generally chatting to other AI characters.
Finally as part of the mission set up more of the game’s interstitial moments have been worked on using tech developed as part of the Vertical Slice, meaning more of the ambient life moments are being implemented into the other chapters and locations within the game.
So that’s it from me this week. Hope you enjoyed the update. Thanks for watching. See you next month.
JH: Thanks Phil.
SG: Yes thanks Phil. Back in December the Vertical Slice that premiered right here on ATV gave us a look at a UEE Idris frigate and its substantial crew. The developers have been working on some impressive tech that will help bring that crew to life.
JH: Now let’s take a closer look at the AI system working together with story and player interaction to turn the Idris into a fully immersive experience in this month’s feature.
Declan Troughton (DT): The Living Idris to me means that as a player you can walk around the ship and soak in whatever everyone else is doing. It's this sense that everyone's got their own life, their own routine, their own personalities as well. So for example: if an AI walks by a guy he doesn't like, he might not say hello, but if he walks by a player he does like, he'll say hi and go out of his way to greet him or talk to him. It's all this minutia of minor details that makes up the whole story of what an AI can do.
Francesco Roccucci (FR): What the Idris shows is that every NPC has a meaning, has a background story. You can actually talk to them, they can tell you some rumors, and each of them has a different personality.
Daniel Baker (DB): So the goal of the Idris was always to create a ship for its people, and we want you to walk around the ship and see these people going about their lives, doing whatever their role is and generally just doing whatever they want or have to do. In doing that, every character on the ship has their own name, their own rank, their own role on the ship, and that spreads among all the different disciplines.
So we have lots of bridge crew, engineers, gunners, deck crew, even janitors, and medical personnel, and they’re all spread out quite well given whatever their role might be and then we have to spread that out to different shifts as well. So you might see somebody doing a certain role or in a certain shift on one chapter of the game, but then on the next chapter of the game they might be off duty and you’ll see them in the mess hall talking to people, and everybody is doing their role on the ship regardless of where you are. So you could be on the complete other end of the ship and they’ll still be doing their thing. It’s not like they despawn as soon as you turn the corner.
DT: The depth of an AI today on the Idris is essentially governed by this concept of a schedule. So on a Monday somebody might be a marine, going about their business as a marine might do. He might go to the shooting range, he might go to the Idris bridge and stand guard, do all this sort of stuff. Then on a Tuesday he might be off duty, which means he'll go to the mess hall if he's hungry, he'll go lay down if he's tired, he might go to the shooting range again if he feels like he wants to shoot some stuff. There are also things like arcade machines if he's feeling bored. It's this idea that everything is governed by what an AI wants to do: if he's hungry he's going to go eat, if he's tired he's going to lay down. We just give them options and then they go do it, depending on what they want to do.
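The needs-driven scheduling described here can be sketched as a simple utility pick: while on shift the schedule wins, and off duty the AI serves its strongest need. The need names, activities, and threshold are illustrative assumptions, not the actual Subsumption setup.

```python
# Toy sketch of need-driven off-duty activity selection.
ACTIVITY_FOR_NEED = {
    "hunger": "go to mess hall",
    "fatigue": "lay down in bunk",
    "boredom": "play arcade machine",
}

def choose_activity(needs, on_duty_activity="stand guard",
                    off_duty=True, threshold=0.5):
    """Pick what the AI does next from its need levels (0.0 to 1.0)."""
    if not off_duty:
        return on_duty_activity          # the schedule wins while on shift
    # The strongest need above the threshold drives behaviour.
    need, level = max(needs.items(), key=lambda kv: kv[1])
    if level >= threshold:
        return ACTIVITY_FOR_NEED[need]
    return "visit shooting range"        # idle default when nothing presses
```

For example, a hungry off-duty marine heads to the mess hall; the same marine on guard duty stays at his post regardless.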
FR: Our game has this huge opportunity to be a simulation first. So you want it to feel real, and having a character that just stays there forever doesn't feel real.
Ross Wilding (RW): For example, we might have a marine who by default is supposed to be on a guard post, and he will go to the guard post. But if somebody else is already guarding it, he might say, well, I can't do that, so he'll decide whether he wants to go do some training, go over to the shooting range, or just simply run around the ship, that sort of thing.
FR: We really want the AI to tell a story to the player. To achieve this there is no single tool or single feature, let's say, that can do it. We try to embed a lot of different functionalities in the Subsumption tool so that designers can create behavior that tells a story. We want the systemic behavior to feel real - characters talking in the mess and not just eating, three guys walking together, or two joggers running around, keeping their pacing while they talk about something - but you also want a mission to tell its own story. So on top of their regular systemic behavior you might need the story to follow a specific storyboard. If we look at the prisoner scene, which is particularly fantastic in the livestream, that achieves this very well. You arrive at the pilots' room, you start to talk to Old Man, and then there's a moment where, if you don't interact with the game, the game takes control and you have this really cinematic moment.
You can look at the face of the character; you're not just listening from very far away. This brings more immersion, and the immersion is consistent with the fact that you are still in control: you can move around, interrupt the cinematic camera, go close to the prisoner and look at them, and they're actually performing all their actions. The mission, on the other hand, can also give the characters suggestions on what to perform next. So after you take off from the Idris, you want Old Man to fly towards the next position for the next mission.
DT: The activity setup we've been doing has been really interesting over the last few months. We've done a big refactor of all the guys on the Idris: marines, engineers, off-duty characters, bridge crew, all the deck crew guys, even janitors. The way I've been approaching it is I roleplay it a bit. When I'm setting up an engineer I'll think, what's an engineer going to do today, what's his job? Is he going to inspect all the ship parts to see if they're broken? If they are broken, what's he going to do then - is he going to call up a repair guy, ask him to repair it, and then move on, and all this sort of stuff? Once I've got this big clear picture of the day-to-day of what an AI can do, I'll start the setup, and I'll start from the ground up: he gets out of bed, puts his clothes on if he's not dressed already, and goes about his job.
FR: Designers can actually give assignments. An assignment can be "I suggest you" or "I request you to fly to this position", or "I request you to defend the player in this fight". This is very similar to the wingman commands you also have in a systemic situation, right? So the system is going to be very similar whether it's the player or a designer asking the AI to do something.
An assignment doesn't necessarily need to be a full behavior like defending an area or defending the player; it might be a suggestion. It might be: I really want you to fly as close to the player as possible - so it can go beyond the normal behavior while telling the story that the mission wants. It might be that in this moment Old Man has to fly next to you, because you are in a very low health state and you need protection, and we can communicate these things to the systemic behaviors through assignments.
RW: In the prisoner arrival scene, Palas, who's on the elevator doing his guard duty, is informed that a group of prisoners is about to arrive. At that point he says, right, well, I've got to go down to the hangar to guard them as they arrive. So he will leave his behavior and be told that his priority is to make sure these prisoners arrive safely. He'll walk down to the hangar and wait there until the ship with the prisoners arrives, and it'll be the same for the other guards who are there as well. Then likewise, when the cinematic scene is finished, we obviously need to seamlessly blend them out of this cinematic sequence and put them back into their natural behaviors. So at this point Palas is given a completely new objective, which is to go off and prepare the holding cells for the prisoners that have arrived. After the cinematic sequence takes place, Palas is popped back into a new behavior and told: you need to prioritize sorting out the holding cells. He'll then wander up to the brig and start doing a completely different behavior, which is again all a part of his natural life.
FR: All the things that you see happening in the Idris are real; they are actual behaviors running. So from our perspective it's really interesting to create the tools that allow you to create a lot of content that is not fake. For example, take repairing an instrument on the Idris. That process evolved during development. It started out a bit more like an engineer who goes around, searches for objects to repair, and then just repairs them, but that felt a bit too robotic. So with the designers we looked at this and went, you know what would be interesting: to have a lead engineer that goes around and inspects objects, and then calls in a repair engineer to do the work. So you will see him going around, looking at stuff, talking, and calling out another NPC; this NPC will come and repair it, and maybe the lead engineer will be there to inspect that the work has been done correctly. This is really what is described in the behaviors and in the Subsumption activities and sub-activities. You can imagine the Subsumption activities to be their regular job, which the schedule assigns the different NPCs to do, and the sub-activities are the actual actions: an engineer can repair something, can inspect something, or just go around because he gets called. This is the way we describe the flow and the content in Subsumption.
DT: One bit of tech that we made as part of the secondary sub-activity sprint is the ability to trigger something based off what the player is looking at. For example, as you're roaming around a place like the Idris, as a player you might be interested in things like the engines, the coolant underneath, or the piping that carries power all across the ship. What this tech does is say: if the player's looking at these things, other AI nearby are going to notice that, pick up on it, and then talk to the player about it. So anything the player might be interested in, they can get information about just by looking at it, in the same way that if you were looking at something in real life and some other guy knew all about it, he might pick up on that fact and talk to you about it.
There's the ability for it to feed into missions and gameplay and anywhere we want, really. Say a big explosion happens, or there's some fire going off as you're running down a corridor; an AI might comment on that - look, this is on fire because of this - and that might have gameplay implications. The way the POI tech works is it's basically a raycast from the player, which just means a line drawn from the player's eyes to any object right in front of them. So for example, the engines the player might be interested in because he's looking at them: this raycast would happen, a little timer would go off, and any nearby AI that's interested in or knows about the thing the player's looking at will wait for that timer to run down. Once it has, they'll get an event that triggers some logic to say: the player is interested in that, so I'm going to try and talk to the player about it and give them some information, because it seems like they're interested.
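The raycast-plus-timer flow described here could look something like the following sketch. The dwell time, names, and one-shot event model are assumptions for illustration; each frame the raycast result is fed in, and an "interested" event fires once the gaze has lingered long enough.

```python
# Sketch of the look-dwell logic: a ray from the player's eyes picks an
# object, and sustained gaze past a dwell time emits an event that nearby
# AI who know about that object could react to.
class PoiTracker:
    DWELL = 2.0                          # seconds of sustained gaze (assumed)

    def __init__(self):
        self.current = None
        self.timer = 0.0

    def update(self, looked_at, dt):
        """Call once per frame with the raycast hit; returns an event or None."""
        if looked_at != self.current:
            self.current, self.timer = looked_at, 0.0   # gaze moved: reset
            return None
        if looked_at is None:
            return None
        self.timer += dt
        if self.timer >= self.DWELL:
            self.timer = float("-inf")   # fire only once per dwell
            return ("player_interested", looked_at)
        return None
```

An AI listening for that event would then check whether it "knows about" the object before chiming in, which is where the per-role interests described later come in.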
DB: The way we're building the tech, we want the AI to be more aware of one another and to react to one another in more intelligent and systemic ways. Similar to the SPOIL tech, we don't necessarily want to know what they're going to say or do, which makes it more interesting for everybody. One of those ways might be to react to one of your own attributes: say you hadn't taken a shower and you're a bit smelly, then the AI might comment on that.
One of the things that we're very aware of is how repetitive and jarring something can be if it's said again and again every couple of moments, like "Lieutenant". So yeah, we're trying to avoid that.
DT: So SPOIL stands for Systemic Point Of Interest Look tech, and it's basically our way of giving an AI the ability to look at interesting things. For example, an engineer might be interested in things like wall panels that you can fix, or monitors that display the status of the ship, so he's going to look at those things; he's going to be super interested in them. Whereas a marine doesn't care about that stuff - he's all about weapons and the armoury and the brig - so he's not going to look at the stuff an engineer might be interested in; he'll look at the stuff that he's interested in. Similarly, in places like the bridge, the pilot of an Idris might look at approaching ships, because that's something he's actively interested in - he doesn't want ships to collide with him - so SPOIL is going to allow him to take a look at those ships and maybe watch out for them.
DB: It's all part and parcel of the layered animation systems that we have. The very basics of that kind of system would be an AI character walking down one of the passageways, and that's all well and good, but then you might want to have them walking down the passageway whilst also scratching their arm, and that would be layered on top of the initial walk. SPOIL is a way for us to mark up anything in the game as a point of interest, so then the character would be walking down the corridor, scratching their arm, but could also be looking up at a screen as they walk past it. That's really great because you don't know what the character is going to be looking at at any given time, and that makes it more interesting for us because we don't know either.
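The layering idea can be caricatured as a base locomotion clip plus weighted additive layers; the layer names and the blend model here are invented for illustration, not the engine's actual animation API.

```python
# Toy sketch of layered animation: a base clip (walk) with optional
# additive layers (fidget, SPOIL look-at) blended on top by weight.
def compose_pose(base, layers):
    """Describe the final pose as the base plus any layer with weight > 0."""
    pose = [base]
    for name, weight in layers:
        if weight > 0.0:
            pose.append(f"{name}@{weight:.1f}")
    return pose
```

Dropping a layer's weight to zero (the character stops looking at the screen) leaves the base walk untouched, which is what lets SPOIL glances stack on top of any behavior.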
DT: Every AI has their own personality, and part of what that means is that if the player interacts with AI in a negative or positive way, the AI are going to pick up on that and treat the player differently. So for example, if the player just straight up punches an AI, that AI isn't going to say any more greetings to that guy, because he punched him. Whereas if the player maybe fetched something on a mission and gave it to the AI on request, the AI is going to think more positively of that character, and therefore is not only going to say a greeting, he might say a really positive greeting, or might stop in his tracks and try to talk with the player. So it's this idea that every player's journey throughout Squadron is going to be slightly different. You might have one player whose relationship with Morrow is really positive - Morrow gives him loads of gossip and stuff like that - whereas another player's journey might be super negative towards Morrow, like Morrow essentially hates the player. That's always reflected through things like wild lines, SPOIL tech, and the way AI interact with the player in general.
DB: A really good example of SPOIL tech came when Marster was designing the executive officer behavior on the bridge. Now, she's in charge of everybody, and she's got to walk around and observe people - just generally walking around making sure everything's going well. That was all nice, but it felt a little bit flat and wooden at times. You might want her looking at something, but she tended to look straight ahead. Whereas now, with SPOIL tech, we could have it so that if she was looking out a window she could be looking at a ship as it's flying past. She'll actually follow the ship, and it just makes it so much more dynamic and … just more real and tangible. It really … just opens it up.
If you look at the Idris documentation online right now you'll see that the crew complement is supposed to be somewhere between 30 and 50, but as we were testing out the Idris and filling it up with crew, 30 to 50 people just wasn't enough. We were walking the passageways and it just felt too empty. We didn't feel like we were getting that sense of vibrancy and life that we wanted, so we added more and more people until it felt right, and now the Idris is at about 81 people including the player. To put that into perspective, if you look at the livestream demo we had 12 people just on the bridge, and that was without the captain or a co-helmsman in there. So you can see how it would quickly fill up everywhere else as well.
FR: From a designer perspective, of course, it's interesting to create a universe and an environment that's very complex, where interaction between NPCs can happen and interaction with the player can happen. From a developer perspective - for us as AI programmers - the interesting part is to create the tools that allow designers, and ourselves, to define this world: to define the restrictions, define the rules for where things happen, and the rules of the game, pretty much.
RW: One of the big things we have around the Idris, and throughout the game, is what are essentially conversations. You can walk past several AI and they might wave or nod to you and do something, but there's nothing more - it's just what we call a wild line. Whereas there are certain AI on the ship that have specific things they want to say to the player, or even to other characters. Say you're the player and you walk past a certain character that has something they want to say to you: they'll actually break out of their behavior, their activity, and call you over and have a conversation with you. These things can actually be seen by the change in the depth of field. For example, in the VS demo that we showed, you can see Lars at the bottom of the elevator, in his guard post, and when you approach him he actually reaches out to you and starts a conversation. One of the things you'll notice is the camera kind of zooms in, almost like you're then focusing on the character, and at that point we have a conversation. At the start of Lars' conversation he'll talk to you, the player will respond, and then he'll finish up talking. But part way through a conversation there might be multiple options for the player - actual player choice - and in this case we'll pause the conversation timeline and wait for the player to make a decision. Based on which decision they make, we'll branch the conversation, so it can go down one of two paths, or three or four, depending on how many options there are.
DT: Wild line is this all-encompassing term that we use for anything an AI can do as a one-shot dialogue line to the player or to each other. Things like greetings; busies - what we call a busy is essentially an AI saying "I'm busy, I can't talk"; similarly, "be's" is another term we use for an AI going off somewhere to be, as in "why are you lingering around? I'm doing something". Also things like mission comments: when a player gets back from a mission, say he did really well, the AI is going to comment on that. If the player didn't do too great, then the AI is going to say, you know, you kind of sucked on that last mission, do better next time.
Wild lines add a lot to AI behavior. They add a lot of life. It's the difference between walking by an AI who's looking straight forward, and one who looks at the player and maybe greets them and asks them how they are. So it adds a lot of life to the Idris. The way it interacts with all the primary setup is very easy - it's essentially just layered on top. The idea is that once you've set up one greeting - the logic for that greeting - it can be applied to any AI who has a greeting line. So for example, Captain White has greetings, Kelly has greetings, all our cast characters, even some of what we call red-shirt characters, who are minor characters - they have greetings. The good thing about the secondary logic is that once you've set it up once, it can be applied to everyone. You don't have to do that setup over and over and over again.
FR: In the perfect scenario, if the player wants to play the story as we intended, then they'll have the perfect playthrough. If they want to influence the scene and the story, they can still do it, and they will enjoy the fact that no scene is a baked scene. It's really something that uses the realtime AI, so if an AI changes loadout or changes outfit, you will see that impact the cinematic scene. Our big challenge is to make this system as robust and stable as possible, so that designers and content creators are not worried about bringing in more content or, let's say, experimenting with the gameplay, because a cool game comes from experimentation and iteration. If you don't allow people to iterate, then you pretty much get stuck with the first test, and then people don't want to change it: if I change this script then everything's going to break, and I don't want to do it, and we don't have time.
It's really interesting for me to try to speak a lot with writers to level designers to system designs to other high programmers, because we want to make sure that every we do doesn't have big ripple effect on the other systems, but they just build on top. They expand functionality. It's sort of composition math, so you wind up every system can be composed by multiple elements, and you can create more without breaking the others, and I think this is the big interesting challenge of Star Citizen.
JH: As you can see the cutting edge AI tech integrating with cinematic moments aims to give players an experience that’s story driven but also extremely personal and adaptable.
SG: We’ll have another Squadron 42 project update for you next month.
JH: That’s right. And in the meantime you can head to the game’s webpage to sign up for the dedicated newsletter, check out past updates, and stay on the front line of development.
SG: This weekend will be your last chance to supersize your Star Citizen starter package and add Squadron 42 for just $15. The price for the upgrade will increase on Monday so grab both games and take advantage of this special while you still can.
JH: In Star Citizen news this week the PU teams branched to Alpha 3.1 as planned and continue to stabilise and gear up for deployment to Evocati as we get closer to release.
SG: You can always stay up to date on what we’re planning for that release and beyond with the live Roadmap on our website.
JH: Last week’s installment of Ship Shape introduced the first new concept ship of 2018, the Aegis Vulcan. This versatile utility ship will be an entry point for Star Citizen players interested in support roles with the ability to refuel, repair, and rearm ships in need of assistance. You can pledge for the Vulcan now so make sure you head to the store and check it out.
SG: Also released today the Aegis Wrecking Crew ship pack gives you an instant, self-sufficient mini-fleet with five ships hand picked to work together in formidable harmony.
JH: You can learn more about the Vulcan and the mechanics involved in its repair, refuel and rearm functions tomorrow in the episode of Reverse the Verse airing live at 9am PST.
SG: Thanks to our subscribers for sponsoring Reverse the Verse, Around the Verse, and all of our shows. We look forward to seeing some of you at our upcoming subscriber events in LA and Derby.
JH: And thank you to all our backers and supporters for the development of Star Citizen and Squadron 42. That’s it from us today.
SG: Until next week, we will see you …
Both: Around the ‘verse.