
STELLANTIS VEHICLE PROJECT

OVERVIEW


Engine:

  • Unreal 5.1

Duration:

  • 8 Months

Genre:

  • Learning

Features:

  • Reactive AI Companion

  • Interactive Vehicle UI

  • Interior/Exterior Movement

  • Quests & Missions

  • Concept Showcase

Position:

  • Visual Scripter

  • Communication Manager

  • Documentation

Design Team:

FINAL VIDEO TRAILER

EARLY CONCEPT

Project First Steps:

This project was developed while I was studying at the Academy of Art University. It was a multi-semester project that worked in tandem with the school's Industrial Design Department; both sides were given feedback and project direction by individuals from Stellantis. It involved vehicle concept development (the ID team's side) and the creation of gamified experiences using those concepts, which fell under my team's jurisdiction. We were not in direct side-by-side development, but worked through bi-weekly updates, with the majority of the theming coming from the ID team.

At first, our team of five developers was split between three industrial design teams with finalized products and another that was still designing after requiring a major rework. My team of two was assigned to combine the completed concept and the initial concept of those two ID teams. During production I also assisted the other teams with [Unreal Blueprint] issues and design.

Seeing as the vehicle designs and storyboard experiences had been finalized over the previous semester, we immediately set out to complete a prototype using the initial themes planned out by our Game Director, concepts we were to expand upon using the ID storyboards.

 

My project "Somnium-Long Road Trip," was a VR learning experience

simulating how an AR game final project would appear in the real

world. This may have been where further confusion down the

development pipeline arose as you will read later. Many attempts

were made to define this separation and convey these differences.

Our development was overseen weekly by our Game Director Mark Girouard, and two visiting members of Stellantis who would additionally review our work and give us feedback on a bi/tri-weekly basis. 

Despite numerous misunderstandings, redesigns, and prototype reconstructions, we managed to complete a final prototype of what the experience would look like.

ConceptBoard.png
Environment_(Engine)_06.PNG

Final Story Board. Recompiled.

EARLY DEVELOPMENT

The Concept:

A VR game simulating the AR experience of a road trip across the country, stopping at various locations. The focus was to showcase the ID teams' vehicle and UI elements in a 3D environment; overall, a gamified road trip.

Environment_(Engine)_04.PNG

Prototyping:

We started with the previous semester's Industrial Design vehicle, a hybrid all-terrain vehicle. Graciously, it was fully rigged and ready for import, barring a minor mesh misalignment at the bottom.

 

This allowed us to start iterating quickly on movement and UI interaction systems. We used Perforce for version control.

With a [High Poly] & [Low Poly] version of the vehicle, we were able to keep initial development optimized, which helped later when using VR. (At the time I didn't know about Soft References, but I now make it a priority to use them after later spending a long period trying to optimize this project without them.)
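For reference, here is a minimal sketch of the soft-reference approach I now favor (class and property names are assumptions, not code from this project): the high-poly mesh is stored as a path and only streamed in when it's actually needed.

```cpp
#include "Engine/AssetManager.h"
#include "Engine/StreamableManager.h"
#include "Components/StaticMeshComponent.h"

// Header side: a soft pointer keeps only an asset path, not the asset itself.
UPROPERTY(EditAnywhere, Category = "Vehicle")
TSoftObjectPtr<UStaticMesh> HighPolyMesh;

// Cpp side: stream the mesh in on demand and swap it onto the component.
void AConceptVehicle::LoadHighPolyMesh()
{
    FStreamableManager& Streamable = UAssetManager::GetStreamableManager();
    Streamable.RequestAsyncLoad(HighPolyMesh.ToSoftObjectPath(),
        FStreamableDelegate::CreateLambda([this]()
        {
            if (UStaticMesh* Loaded = HighPolyMesh.Get())
            {
                MeshComponent->SetStaticMesh(Loaded); // MeshComponent is assumed
            }
        }));
}
```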

Somnium_EarlyVehicle_02.png
Somnium_EarlyVehicle_01.png
Somnium_EarlyVehicle_04.png

Seeing as this was our first time working in VR, we spent an initial period modifying a hybridized [VR Pawn] into a character controllable with keyboard & mouse until we received our VR equipment.

 

The VR Character suffered from an issue where looking up or down caused it to spin rapidly, which was luckily fixed with a simple angular clamp. However, later in development it broke again when we implemented external vehicle movement, something we hadn't planned on initially.

 

I believe this occurred because of the built-in C++ code of the Unreal Character class; I may not have applied the clamp to the overridden function. Luckily, it revealed ways of getting past some built-in Unreal features.
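For illustration, an angular clamp of this kind might look like the following in C++ (the project's fix was in Blueprint, and the pawn name and pitch limits here are assumptions):

```cpp
// Clamp the look pitch before applying it so the rotation never wraps past the
// poles and sends the pawn spinning. AVRExplorerPawn and the limits are illustrative.
void AVRExplorerPawn::AddLookPitch(float AxisValue)
{
    FRotator ControlRot = GetControlRotation();
    ControlRot.Pitch = FMath::ClampAngle(ControlRot.Pitch + AxisValue, -80.f, 80.f);

    if (AController* PC = GetController())
    {
        PC->SetControlRotation(ControlRot);
    }
}
```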

The Track System:

Within two weeks of testing and reiterating I managed to set up a basic [Spline] movement system (a rough sketch of the core follow logic appears after this list). It could:

  • React to going up/down steep inclines

  • Adjust speed automatically

  • Sway naturally with the set speed

  • Switch smoothly between nearby tracks

  • Turn around when switching tracks
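As a rough illustration of the follower at the heart of this (the project's version was built in Blueprint; the class and variable names below are assumptions), the vehicle advances a distance value each tick and samples the spline for its new transform:

```cpp
// Minimal spline-follower sketch; ATrackVehicle, CurrentTrack, CurrentSpeed and
// DistanceAlongTrack are illustrative names, not the project's actual ones.
#include "Components/SplineComponent.h"

void ATrackVehicle::Tick(float DeltaSeconds)
{
    Super::Tick(DeltaSeconds);
    if (!CurrentTrack) { return; } // USplineComponent* chosen when a track is selected

    // Advance along the track and wrap at the end so the loop can repeat.
    DistanceAlongTrack = FMath::Fmod(DistanceAlongTrack + CurrentSpeed * DeltaSeconds,
                                     CurrentTrack->GetSplineLength());

    const FVector NewLocation = CurrentTrack->GetLocationAtDistanceAlongSpline(
        DistanceAlongTrack, ESplineCoordinateSpace::World);
    const FRotator NewRotation = CurrentTrack->GetRotationAtDistanceAlongSpline(
        DistanceAlongTrack, ESplineCoordinateSpace::World);

    SetActorLocationAndRotation(NewLocation, NewRotation);
}
```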

InteractionUI_EarlyBuild.gif

INITIAL CONCEPT NOTES:

Working on this bit in particular certainly made me want to finish my C++ studies with Unreal.

Combining 2 ID Team Concepts. Time Intervenes.

Almost all of the system's components were deprecated for the VR experience over concerns of player nausea, which realism entails; my own motion sickness kicking in was a joyful sign of having captured a realistic effect and an ominous sign that my prototype was about to be butchered. By my own hand, no less! A true tragedy, I'm sure.

 

And from what was left, two problems arose.

 

The [Spline] track itself revealed the first issue: its use created a rigid, roller-coaster-like riding effect that had to be solved to accurately emulate a vehicle, make it believable, and prevent literal visual whiplash for the VR occupant. The option to use a free-movement vehicle was tempting, but since it wasn't part of the design it was left on the design board.

 

I solved the roller coaster by using a rotational lerp and a turning-radius value; doing this softened the rigid [Spline] track movements and simulated an accurate turning radius in the process. It created a preemptive turning radius.

At insane speeds it wouldn't work too well, but at usual traveling speeds (0-60 mph) it created an effect similar to an actual vehicle turning, as the lerp between the current and future values was delayed appropriately. The value I settled on was 0.035, as anything below 0.025 started causing the vehicle to not move at all.
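A hedged sketch of that rotational lerp (the project's version was Blueprint; names here are illustrative): a small constant alpha each tick delays the turn just enough to read like a real turning radius.

```cpp
void ATrackVehicle::ApplyTurningRadius()
{
    const FRotator TargetRotation = CurrentTrack->GetRotationAtDistanceAlongSpline(
        DistanceAlongTrack, ESplineCoordinateSpace::World);

    // TurnAlpha of ~0.035 worked at 0-60 mph; much lower and the vehicle barely rotates.
    const FRotator Smoothed = FMath::Lerp(GetActorRotation(), TargetRotation, TurnAlpha);
    SetActorRotation(Smoothed);
}
```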

Turning Radius_Track.png

Track System | Problem One | The Roller Coaster: 

Combining this with a [Spring-Arm] on the vehicle mesh itself, at a short [10 Unreal Unit] distance, smoothed out most of the ride perfectly. No more roller coaster. Which left only one problem.

 

Acceleration, Deceleration, and Braking; Technically, from a design standpoint, a single issue.

SpeedCalculations_Track.png
VerticalityAdjustment_Track.png

Track System | Problem Two | Stuttering Stop/Start:

Using a [Timeline] in Unreal seems to come with an abrupt stop/start whenever it's paused or un-paused. Seeking to prevent this, I turned to my automatic speed modifier, which would modulate the total time of the [Timeline] as it moved in order to adjust the vehicle's speed on inclines. It was simple, and by using relatively easy bool logic with speed calculations it could determine when to do so.
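One way this could look in C++ (the project did it in Blueprint; ATrackVehicle, TrackTimeline, and the speed variables are illustrative): ease the current speed toward a target, then scale the Timeline's play rate so a stop never happens abruptly.

```cpp
#include "Components/TimelineComponent.h"

void ATrackVehicle::UpdateTrackSpeed(float DeltaSeconds)
{
    // Ease toward the target speed (zero when stopping) over an acceleration time.
    CurrentSpeed = FMath::FInterpTo(CurrentSpeed, TargetSpeed, DeltaSeconds, AccelerationRate);

    // Scaling the play rate stretches or compresses the Timeline's total time,
    // which is what adjusts the vehicle's speed on inclines and during stops.
    const float PlayRate = FMath::Max(CurrentSpeed / BaseSpeed, KINDA_SMALL_NUMBER);
    TrackTimeline->SetPlayRate(PlayRate);
}
```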

 

So, I repurposed the effect when stopping and starting to slow down or speed up the vehicle in accordance with an acceleration time. This had mixed results on longer tracks, as we used [Trigger Boxes] and child variants of them to create interactions between the vehicle and the level, which Dane had started constructing on his side after we met with the ID team and settled on a location.

On smaller tracks, small changes to acceleration produced easy-to-maintain results, as the system was designed to use small [Spline] sections: select a new route (thus allowing exploration), move to the closest point on the new route, then continue. However, this reasoning didn't come across, and in the final level segment a single [Spline] was used throughout the entire level, preventing the use of these additional features. With my time consumed by optimization and UI it was a low-priority task, so the single spline was kept.

As such, a patch-job of values prevented some stops and starts from jittering. Looking back in post-mortem, I realize I could have faked braking by extending the [Spring-Arm] length so that the built-in physics could fake a smoother stop.

If I were to reconstruct the system, I would try using a [Timer] that increments along the [Spline] at a set rate/speed rather than interpolating between its start and end. I will add that I did briefly spend some time refining and creating an optimized vehicle mover, but it couldn't be finished or implemented because the work schedule moved up.
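A minimal sketch of that post-mortem idea, with assumed names: a looping Timer steps a distance counter at a fixed rate instead of relying on the Timeline's start-to-end playback.

```cpp
// Post-mortem sketch only; MoveTimerHandle, MoveInterval and the rest are assumed names.
void ATrackVehicle::BeginFollowing()
{
    // Fire StepAlongTrack repeatedly at a fixed interval.
    GetWorldTimerManager().SetTimer(MoveTimerHandle, this,
        &ATrackVehicle::StepAlongTrack, MoveInterval, /*bLoop=*/true);
}

void ATrackVehicle::StepAlongTrack()
{
    // Distance grows at a set rate, so speed is independent of track length.
    DistanceAlongTrack += CurrentSpeed * MoveInterval;
    SetActorLocation(CurrentTrack->GetLocationAtDistanceAlongSpline(
        DistanceAlongTrack, ESplineCoordinateSpace::World));
}
```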

 

We had received our VR headsets (Oculus). At this point, I was pushed to work on VR implementation, UI elements, and the eventual AI companion, since the initial vehicle movement features were functional.

Progress seeks no perfection.

Early UI:

Before I talk about VR, I should go over the core of the experience: the UI. Dane set up the first variant, attempting to mimic the original ID team's child-oriented designs, with a general theme of white and pink.

ORIGINAL ID TEAM UI DESIGN:

Once we received the full presentation, we pretty much jumped into getting everything in-engine so that we could build our game around it.

Dane's initial design lacked functionality, but created the visual scope and scale we had to work with.

 

The original vehicle's circular cabin design gave us quite a bit of space to work with, but also a challenge for designing diegetic UI in a 3D space, especially considering how often it would block the environment outside or feel cumbersome when scaled large enough for interaction and legibility at a distance; more so with the moving background and VR headset creating a sensory bombardment.

InteriorDev.jpg
Somnium_EarlyVehicle_04.png

Early UI | Information Data Tabs:

Between our two members, Dane was more confident in his level design capabilities than in systems, so I took over the UI elements alongside developing the IDTs, or "Information Data Tabs." It didn't occur to me until later that the name shared similarities with our descriptions of the ID team, but by the time it did, everything had been named and was using a [Structure] system, which is prone to crashing when renamed. (At the time of this writing I am learning to use [Data Assets] in conjunction with Unreal C++ implementation.)

The overall concept was to create nodes in the world which, when interacted with, would update the vehicle UI with information contained in a central information [Data Table] stored in a Library_BP, with every interaction node capable of spawning the Library_BP should it be missing.

Each stored value used an updatable list of [Enumerators]. I had previously used a [String]/[Integer] retrieval system on another project and found it difficult for designers to use, which led to this method. On a larger database I would use a Key instead, as it has more benefits for per-instance data retrieval. (Combined with Soft References and Data Assets, I feel memory usage would drop significantly.)
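A hedged sketch of how that enum-keyed retrieval could look in Unreal C++ (the project's version was Blueprint; the enum entries, struct fields, and UInfoLibrary are illustrative): designers pick an enum value, and the library converts it into a Data Table row for the UI.

```cpp
#include "Engine/DataTable.h"

// Designers pick from this list instead of typing strings or integers.
UENUM(BlueprintType)
enum class EInfoSubject : uint8
{
    RedwoodTree,        // example entries only
    CoastalOverlook,
    ParkHistory
};

// One row of the central information table stored in the Library_BP.
USTRUCT(BlueprintType)
struct FInfoTabRow : public FTableRowBase
{
    GENERATED_BODY()

    UPROPERTY(EditAnywhere, BlueprintReadOnly) FText SubjectName;
    UPROPERTY(EditAnywhere, BlueprintReadOnly) FText Classification;
    UPROPERTY(EditAnywhere, BlueprintReadOnly) TArray<TSoftObjectPtr<UTexture2D>> Images;
};

FInfoTabRow* UInfoLibrary::FindInfoRow(UDataTable* Table, EInfoSubject Subject)
{
    // Convert the enum entry into the matching row name, e.g. "RedwoodTree".
    const FName RowName(*StaticEnum<EInfoSubject>()->GetNameStringByValue(
        static_cast<int64>(Subject)));
    return Table->FindRow<FInfoTabRow>(RowName, TEXT("IDT lookup"));
}
```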

 

For now, I used a spinning box on the path and the vehicle's collision to trigger the effect.

InteractionUI_EarlyBuild.gif
Depreciated Popup.PNG

Early UI | The Redesign

Progress on the prototype was going well when we received the first redesigns for the UI, with meetings revealing that a redesign of the vehicle was in the works.

 

The ID team's focus had shifted to more detailed information and less vocal interaction from the assistive AI. Under the reformatted design, the collection of images would now cycle on a [Timer], while the Information Tab [Widget] would fill itself out depending on the Library Table's information:

  • The Subject Name

  • The Subject Classification

  • Images (<4)

  • Information & Info Title (<4)

Additional UI elements, such as the button selection and location, were left out because VR can be difficult to work with on smaller buttons, and because the button functionality was unnecessary with the presented information already incorporated into the design. Including the clip art would have required multiple premade formats with images, so it was left on the backburner. (Probably for the best, considering there were more redesigns on the way.)

 

More UI shifts occurred as the lead ID designer kept trying to rebuild the project around fewer physical controls and more abstract functions such as voice control and automatic gesture recognition. While these drained development time, I always made attempts to capture the core of the design for proof of concept, since we all assumed we were making a simulated game specifically for the vehicle, as per company request.

VehicleInternal_InitialUI_Concept.png
VehicleInternal_InitialUI.png

NEW ID TEAM UI DESIGN:

Updating The Information Data Tabs:

Considering a box spinning on the path to be a bit cumbersome, I updated it using a Niagara effect so that its color would change to indicate progress.

The hand you see in the images that follow is meant to be a placeholder for the Bunny AI, a feature present in both ID teams' designs, prompting its implementation at least in concept. To make it more of an active entity, whenever the user selected an IDT node a holographic version of the AI companion would retrieve it.

VehicleInternal_EarlyBuild.png
InteractionInitial_EarlyBuild.gif

This caused a bit of a delay, so plans for a screen that showed the AI Hand Moving through the world were developed, which eventually turned into the Top-Down GPS view using a [Texture Renderer UI Material].

 

Visible in the left image are the VR headset and a collapsed central HUD, which would now open whenever one of the selected buttons was interacted with. This allowed for a simulated presence of the various vehicle elements. Once selected, the vehicle HUD would show the information in a semi-permanent pop-up.

A difficult time-stopper during this section was getting the VR player to interact with both the HUD elements and the IDT pop-ups in the world, which required two interaction methods: one using the [Widget Interactor] provided by Unreal, and the other a [Line Ray-cast] that would pass through the widget but still interact with the IDT. The biggest problems were registering collision channels and the size and direction of interactions; though the plan was to use simulated working fingers and hands, it ended up staying as a pointer format that interacted when using the trigger.

A major issue was with the [3D Widgets] themselves: depending on their size, the widget's actual collision would be misaligned, causing the [Ray-casts] to completely miss or to trigger the buttons only when pointing at an unrelated spot. This was fixed by increasing the registered size of the [3D Widget] to fit the [Widget] being spawned. Visually it didn't make a difference, but the collision seemed to register based on the scaled bounds.

It took a while, but I managed to get a [Ray-cast] linked to the hand alongside Unreal's built-in [Widget Interaction Component]: one for UI, the other for IDTs. Though overlap did occur, it wasn't a major problem, as activation required multiple inputs, like a charge-up. This also sought to capture some game-like essence through the player's reaction capabilities.
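As a rough reconstruction of that dual setup (the actual work was Blueprint; AVRHand, InteractionRange, AInfoDataTabNode, and the custom trace channel are assumptions), a trigger press drives both the widget interactor and a parallel trace toward the IDTs:

```cpp
#include "Components/WidgetInteractionComponent.h"

void AVRHand::OnTriggerPressed()
{
    // Path one: Unreal's widget interactor presses whatever 3D widget it points at.
    WidgetInteraction->PressPointerKey(EKeys::LeftMouseButton);

    // Path two: a trace on a dedicated channel so IDT nodes still register even
    // when a widget sits in front of them.
    const FVector Start = WidgetInteraction->GetComponentLocation();
    const FVector End = Start + WidgetInteraction->GetForwardVector() * InteractionRange;

    FHitResult Hit;
    if (GetWorld()->LineTraceSingleByChannel(Hit, Start, End, ECC_GameTraceChannel1))
    {
        if (AInfoDataTabNode* Node = Cast<AInfoDataTabNode>(Hit.GetActor()))
        {
            Node->AddChargeInput(); // the charge-up activation mentioned above
        }
    }
}
```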

During this process I also designed, in tandem, a look-at activation feature to try to capture the expected ID elements. It worked, but it ultimately removed a huge amount of physical control and felt quite lackluster from a design standpoint.

Updating The IDTs | Problem One | Interaction:

GPS & HUD & VEHICLE HOLOGRAM:

The full UI redesign (seen right) was completed. Using a holographic version of the vehicle to indicate its upcoming directional movements was more difficult than anticipated because of a lack of imagery from the ID team lead, whom I had to grill for details; the hologram transformed from a simple compass into the latter future-direction indicator.

I would say this marked the end of the early development period, as it seemed like all of the functionality was coming together, which would give us more time to work on mini-games, interaction features, and aesthetic design. Additional trackers of temperature, time, and weather conditions rounded out the vehicle functionality.


It was determined, due to performance and through our Game Director's meetings with Stellantis, that we were creating a visual representation of a game with these vehicles, not a game itself. As such, removing VR was made a focus so we would have an operational demo that could easily be viewed. This was contested multiple times and, by the end of development, was confirmed to be a miscommunication.

It also was not within the expertise of our department, which could not render out simulated videos the way the ID teams could.

MAJOR CHANGE | No VR | Project Shift:

MIDPOINT DEVELOPMENT

A New Vehicle:

It was around here that we were introduced to the ID team's "new" vehicle, a complete redesign inspired by the team's takeover of the project that transformed it from a street vehicle into an off-roader; we were told to showcase these elements in our design.

image_edited.jpg

The plan was to incorporate an off-road section into a Redwood National Park experience, following the road trip theme and the new ID team's choice. The second ID team was pushed to the backburner as their concepts were diverging again, but the AI companion was kept.

The difficulty with this new vehicle came from multiple factors: there was no [Low Poly] model, it did not have a Skeletal Mesh, and my attempts to apply the appropriate functions to the model didn't seem to work with how they had modeled the structure, my experience with modeling software being brief.

Included in the change was a switch in car company, from Chrysler to Jeep, both of which are subsidiaries of Stellantis. That was honestly probably the most shocking part, giving me a reason to research the car industry and its various ownerships and connections.

Additionally, the buttons used to control the vehicle were to be completely removed in favor of a self-driving car, removing more of the possible interactive features and transforming the experience into more of a first-person rail shooter, which had previously been a sub-point we weren't trying to highlight.

New AI & External Movement:

With a new vehicle came a redesign of the AI companion. Though it wasn't as fleshed out as the other ID team's in prototype form, I set up a copy of the vehicle IDT UI and movement capabilities.

Originally the AI companion would move along the dashboard, but with the new changes it seemed more obtrusive, so it was pushed further out of view while in the vehicle.

Design Change: Following the storyboard, which showcased the user leaving the vehicle, we developed external movement and had the AI companion follow the user.

The AI Buddy will always try to move to the user's left/right, near the edge of their vision, unless the user is looking toward it within a 45-degree range; this prevents the AI Buddy from moving while the user is trying to read the information. Additionally, when the user moves out of range, the pop-up closes completely.
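A small sketch of that gaze check (assumed names; the project's logic lived in Blueprint): compare the headset's forward vector against the direction to the Buddy and hold still inside a 45-degree cone.

```cpp
// Returns true while the player is looking roughly at the Buddy, so it stops
// repositioning and the pop-up stays readable. Names are illustrative.
bool ABuddyCompanion::IsPlayerLookingAtMe(const FVector& CameraLocation,
                                          const FVector& CameraForward) const
{
    const FVector ToBuddy = (GetActorLocation() - CameraLocation).GetSafeNormal();
    const float CosAngle = FVector::DotProduct(CameraForward.GetSafeNormal(), ToBuddy);

    // cos(45 degrees) ~= 0.707; a larger dot product means the Buddy is inside the cone.
    return CosAngle >= FMath::Cos(FMath::DegreesToRadians(45.f));
}
```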

An issue with this was prompting the user to interact with the AI Buddy and indicating that doing so required a hand gesture. A hand pop-up, like a UI element, seemed out of place, but it was necessary for conveying the interaction. For time's sake, and lacking a hand image, I used a small targeting reticle.

Quests & Clue Zones:

Included with the AI Buddy and external movement was a Clue Zone, which would prompt the user to explore and discover within a safe area of the experience. Originally this was kept as an in-vehicle I-Spy mini-game, one of three that were scrapped after the redesign.

Though it required a lot of redesigning, since I had used direct references during the initial stages, I managed to get the AI Buddy to show the Indicator, while the Clue Zone managed the tracking collection to allow/prevent vehicle entry.

A notification system was also implemented with it, which was used to convey additional information outside of a simulated chat box. Production moved fast here to get everything in for an early deadline.

Due to the rushed schedule, everything ran through the UI elements. In post-mortem, I would use Interfaces for all of these interactions and data transfers, since there is a lot of repetition, or single integer values being sent, that doesn't require a hard reference to everything else.
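A minimal sketch of what that interface approach could look like (UNotificationReceiver, its header name, and its message signature are assumptions, not part of the project):

```cpp
#include "UObject/Interface.h"
#include "NotificationReceiver.generated.h" // assumed header name for this sketch

UINTERFACE(BlueprintType)
class UNotificationReceiver : public UInterface
{
    GENERATED_BODY()
};

class INotificationReceiver
{
    GENERATED_BODY()
public:
    // Event boxes, the AI Buddy, or the HUD can all receive this without the
    // sender holding a hard reference to any specific class.
    UFUNCTION(BlueprintNativeEvent, BlueprintCallable, Category = "Notifications")
    void ReceiveNotification(const FText& Message, int32 Priority);
};

// Caller side, e.g. inside an event collision box:
// if (TargetActor->Implements<UNotificationReceiver>())
// {
//     INotificationReceiver::Execute_ReceiveNotification(TargetActor, Message, 1);
// }
```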

image.png
image.png

Advancing Design:

It was around this time that I took over the level design features, with the various interactive collision boxes I had designed for Dane now at my disposal, along with the large single-track level; some of it required whole redesigns that either didn't get implemented or were only done in small sections.

This period is the Long Crawl.

It was full of testing and retesting, AI voice renderings, and the development of more event collision boxes to curate an experience the team expected executive-type characters to see. So, we made strides in using audio and clear visual transitions with prompts to convey the full scope of what was happening, the level transforming into a curated experience where nothing could be missed.

BEHIND THE SCENES | COLLISION BOX EVENTS:

Optimization & World Partition:

A major problem around this point was performance: the prebuilt foliage acquired from the workshop, initially intended only for prototyping, was not the most memory friendly. Multiple times during development I found myself optimizing more than anything else.

END DEVELOPMENT

Final Product:

Finalizing the project consisted of laying out all of the event experiences, modifying the overall pace of the experience, bug-fixing the roadway and the movement components outside the vehicle, fixing lighting, and finally updating the level map to better capture the theme of a Redwood National Park.
