10/21 - 12/9

    A lot of time has passed since I last explained what I've been working on. Many things have happened over the last month and a half: I've worked on the monster AI, the environment, VR physics, optimization, and just about every other piece of gameplay for Trespasser.

1) AI
    The major challenge here was simply being able to see what the monster is actually doing. The VR display cuts off important AI variable values, so many print strings and debug draws were used to visually understand what the monster was thinking. There weren't any game-breaking issues with the monster other than the animation being jittery and frantic at times.
    I was also able to completely flesh out the AI's different stages and its abilities at each one. This took a lot of Switch on Enum nodes that dictate what the monster can do at each stage.
Edit 12/12: Much of the AI system will have to be refactored in order to have events that will sync behavior with animation and sounds.
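The Switch on Enum idea translates to a plain switch statement. Here is a minimal C++ sketch of the pattern; the stage names and the charge ability are illustrative assumptions, not the project's actual enum values or Blueprint logic:

```cpp
#include <cassert>

// Hypothetical stages for the monster; the real project's enum values differ.
enum class EMonsterStage { Stalking, Hunting, Enraged };

// One ability gate, mirroring a Switch on Enum node: each stage dictates
// whether the monster is allowed to charge the player.
bool CanCharge(EMonsterStage Stage)
{
    switch (Stage)
    {
    case EMonsterStage::Stalking: return false; // observe only
    case EMonsterStage::Hunting:  return false; // approach, but no charge
    case EMonsterStage::Enraged:  return true;  // full aggression
    }
    return false;
}
```

In Blueprint terms, each `case` is one execution pin of the Switch on Enum node, and you'd have one such gate per ability.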

2) Environment
    The biggest issue I ran into with the environment was the snow system. The falling snow is supposed to indicate to the player which direction the wind is going, so it blows at an angle. My first method was to attach some GPU particle emitters to the player that would spew snow particles in the wind direction, always relative to the player; this also keeps processing cost down, since snow only exists around the player. The problem arose with buildings. Because the emitters always follow the player, they would also follow the player inside buildings, which is not supposed to happen. I ended up scrapping this method for that reason. I instead opted for a particle emitter, still a child of the player, but one that fires down from enough height that the snow is not able to penetrate structures.
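The second approach boils down to two pieces of math: where the emitter sits relative to the player, and what initial velocity each flake gets. A minimal C++ sketch, where the struct, function names, and height value are illustrative assumptions rather than the project's actual settings:

```cpp
#include <cassert>

struct FVec3 { float X, Y, Z; };

// The emitter stays parented to the player but sits a fixed height above
// them, so flakes spawn above any roof and are blocked by structures on
// the way down instead of appearing indoors.
FVec3 SnowEmitterLocation(const FVec3& PlayerPos, float EmitterHeight)
{
    return { PlayerPos.X, PlayerPos.Y, PlayerPos.Z + EmitterHeight };
}

// Flakes fall at a fixed speed plus a horizontal wind component, so the
// drift direction tells the player which way the wind is blowing.
FVec3 SnowInitialVelocity(const FVec3& Wind, float FallSpeed)
{
    return { Wind.X, Wind.Y, -FallSpeed };
}
```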
    Another part of the environment, but also an attempt at optimization, was my custom fog. My theory was that with a map this big, fog that gets exponentially thicker with distance would limit the player's effective render distance and increase frame rate. The post process effect I made did do this, but again it came with unintended consequences. This time, if you held or positioned a particle emitter so that part of it was below the horizon and part above, the part above the horizon looked as if it was behind the post process effect and the part below in front of it. Because of this I have put custom fog on hold and am using Unreal's built-in Exponential Height Fog as a stand-in.
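The "exponentially thicker with distance" idea is the standard exponential fog curve. A small sketch of the math (the density value in the test is just an illustrative constant, not one I've tuned for the map):

```cpp
#include <cassert>
#include <cmath>

// Exponential fog factor: 0 = no fog, 1 = fully fogged. As distance grows
// the result approaches 1, so geometry past a certain range is effectively
// hidden behind the fog and doesn't need to be drawn.
float FogAmount(float Distance, float Density)
{
    return 1.0f - std::exp(-Distance * Density);
}
```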

3) Physics
    Working with physics in VR was a roller coaster. Some things worked nearly perfectly and were very fun to play around with (the car trunk being one example), but others were so incredibly broken that we had to find another way to do what we were trying to do (the bunker door keypad).
    First off, I love physics. Combined with VR, where you can interact with objects in a more realistic way, it is a dream come true for me. As a result of this fascination I tried to use physics wherever possible. The first instance was the keypad buttons on the bunker door. My theory was that we could put small cubes on the door with physics constraints, letting each one be pushed in one direction until a threshold fired an event that the button had been pressed. This... did not work. The buttons would fly forward and backward; we were unable to give a button a maximum travel distance, so the player could move it infinitely along one axis and even rotate it slightly before it popped back to its intended rotation.
    Despite these problems, the door worked in every way except aesthetics: you could input a code that printed to a text box, and the correct code opened the door. To fix the flying buttons we made a system where, when your hand collides with a button, it inputs that number, depresses the button visually into the door, then pops it back out after a delay. This worked much better than simulated physics, and it is the system I will use going forward for interactable objects (e.g. cabin doors).
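Stripped of the visuals, the collision-based keypad is a tiny state machine: overlaps append digits, and the entered string is checked against the combination. A hedged C++ sketch of that logic; the class name and code value are made up for illustration and are not the shipped Blueprint:

```cpp
#include <cassert>
#include <string>

// Minimal keypad model: a hand overlap calls PressButton (which is also
// where the visual depress/pop-out and the text-box print would happen),
// and IsUnlocked compares what's been entered against the combination.
class FKeypad
{
public:
    explicit FKeypad(std::string Code) : Correct(std::move(Code)) {}

    // Called when the player's hand overlaps a button cube.
    void PressButton(char Digit)
    {
        Entered += Digit; // the text box displays Entered here
    }

    bool IsUnlocked() const { return Entered == Correct; }

private:
    std::string Correct;
    std::string Entered;
};
```

The important difference from the physics version is that nothing here simulates: the button's motion is purely cosmetic, so there is nothing to fly off or over-rotate.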

4) Optimization
    It was clear as soon as David started working in the main map that it would need optimizing. The large map, the props for a forest, and the constant snow particles all combined to produce a level that took a minimum of a minute to get into play mode from the editor and showed constant graphical glitches when you moved the headset quickly.
    The first problem to solve was the size of the map. Constantly rendering the entire map would waste resources when the player can only see one side of it at a time (thanks to the fog). The solution was Level Streaming Volumes, which let us "cut the map in half" and only load one side at a time.
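The decision a streaming volume makes can be sketched as a simple position check: whichever half contains the player stays loaded. The boundary axis, boundary value, and half names below are illustrative assumptions, not the map's actual volume layout:

```cpp
#include <cassert>

// The map is split into two streamed halves along one axis.
enum class EMapHalf { West, East };

// Whichever half the player stands in is the one that stays loaded;
// the streaming system unloads the other.
EMapHalf HalfToLoad(float PlayerX, float BoundaryX)
{
    return PlayerX < BoundaryX ? EMapHalf::West : EMapHalf::East;
}
```

In practice Unreal's volumes handle this automatically once each half is a streaming sub-level with a volume around it.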
    The next piece of optimization was converting the snow to GPU particles. This immediately reduced the cost of the snow and allowed for more particles, as the Unreal documentation states: "The GPUSprite type data module supports simulating particles on the GPU allowing for hundreds of thousands of particles to be simulated and rendered efficiently".
    The last problem was the prop count. Hundreds of trees, rocks, pieces of grass, and more made it hard for the computer to keep up. The solution was Cull Distance Volumes (Unreal has a built-in solution for everything!). These let us set the distances at which objects of different sizes stop rendering. Pretty simple.
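Conceptually a Cull Distance Volume is just a size-to-distance lookup: small props like grass stop rendering close by, large props like trees much farther out. A sketch of that mapping, where the size/distance pairs in the test are illustrative, not the values tuned for our map:

```cpp
#include <cassert>
#include <vector>

// One row of the volume's size-to-distance table.
struct FCullRule { float MaxSize; float CullDistance; };

// Returns the distance at which an object of the given bounds size stops
// rendering. Rules are ordered smallest size first; 0 means "never cull",
// matching Unreal's convention for sizes bigger than every rule.
float CullDistanceFor(float ObjectSize, const std::vector<FCullRule>& Rules)
{
    for (const FCullRule& Rule : Rules)
        if (ObjectSize <= Rule.MaxSize)
            return Rule.CullDistance;
    return 0.0f; // too big for any rule: never culled
}
```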
    I have yet to get numbers on exactly how much each of these solutions contributes, individually and combined, to a more fluid VR experience.
