Saturday, January 7, 2017

Stylized Water for Epitasis Breakdown

Ever since I first showed it off, I've gotten quite a lot of requests to explain how I accomplished the water found in Epitasis.

It's gone through quite a few different variations and improvements to get where it is now, and honestly it could still be refined. But its current state works quite well and is close to where I want it to be, so I figured I'd do a detailed breakdown of how you can accomplish a similar water system.



The water is essentially broken down into two parts: The material and the blueprint.

The material is exactly what you'd expect - everything related to rendering the water. In addition, you'll also need a lightly tessellated mesh to get the wave effect you see in many of the GIFs I've shown of it.

The blueprint is essentially something you can just drag and drop into a level, change its size, and change various material parameters on the fly. This allows maximum flexibility and you don't need to use material instances to get desired effects for each water instance you use in levels (although, it is certainly possible to do it that way if you so choose).

Let's start with the material, as this is the most complex bit.

First and foremost, let's create a master material for our water.

We have quite a few settings to check and mess with inside the material - take a look:


Mostly just your basic settings, but we need to enable things like tessellation and make sure we have the correct shading model, blend mode, etc.


Here is an overview of the material; as you can see, it's big and complex, with lots of nodes going everywhere. I'm going to go over each section, starting with the Base Color comment block, and work our way down.


Base Color



The base color section has two subsections: the water color and the foam. These two get blended together using depth fade, as we can find the water's edges and use that as an alpha to lerp the water color and foam together.

The water color has two vector variables - Water Light and Water Dark - which blend together using a fresnel (whose normal comes from our normal section). This creates a nice blend of two different colors for a deeper ripple effect.

The foam section has a vector variable to control the foam color, along with two float variables for the scaling of the foam in UV space. The foam textures also pan at 0.02 x -0.02 to get a nice effect of the foam moving. The texture samples you see are just a nice foam texture, and they get added together.
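
If it helps to see the node math as plain code, here's a rough C++ sketch of the base color logic, assuming the depth fade and fresnel results arrive as simple 0-1 inputs (the parameter names mirror the ones above):

```cpp
// Illustrative sketch of the Base Color node math, not the actual graph.
struct Color { float r, g, b; };

Color Lerp(const Color& a, const Color& b, float t)
{
    return { a.r + (b.r - a.r) * t,
             a.g + (b.g - a.g) * t,
             a.b + (b.b - a.b) * t };
}

Color BaseColor(const Color& WaterLight, const Color& WaterDark,
                const Color& FoamColor, float FoamTexture,
                float Fresnel,       // fresnel term built from the wave normal
                float FoamDepthFade) // depth fade alpha: 1 near the edges, 0 in deep water
{
    // Blend the two water colors with the fresnel term for the ripple effect.
    Color Water = Lerp(WaterDark, WaterLight, Fresnel);

    // Tint the panning foam texture and fade it in at the water's edges.
    Color Foam = { FoamColor.r * FoamTexture,
                   FoamColor.g * FoamTexture,
                   FoamColor.b * FoamTexture };
    return Lerp(Water, Foam, FoamDepthFade);
}
```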

Metallic, Specular, and Roughness



Nothing too strange or complex here; the only difference is we use the depth fade from the foam fade scaling in the water color block to make sure it's not too shiny where the foam is.

Opacity



Nothing too crazy here. We essentially use depth fade to control our min and max opacity and multiply it against a sphere mask that uses pixel depth, which helps fade out the water when it's right next to the player / camera. This makes it so the water does not just clip into the camera when you're halfway above and below it. It's a bit expensive, and that's why there is a quality switch there as well.
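
Here's a minimal sketch of that opacity logic in plain C++, with MinOpacity, MaxOpacity, and FadeRadius standing in for whatever parameter names you use; the depth fade and pixel depth come from the engine in the real material:

```cpp
#include <algorithm>

float Opacity(float DepthFade,      // 0 at the water's edge, 1 over deep water
              float PixelDistance,  // distance from the camera to this pixel
              float MinOpacity, float MaxOpacity, float FadeRadius)
{
    // Depth fade drives a lerp between the min and max opacity.
    float Base = MinOpacity + (MaxOpacity - MinOpacity) * DepthFade;

    // Sphere-mask style falloff: fully transparent right at the camera,
    // fully opaque beyond FadeRadius, so the surface never hard-clips the view.
    float CameraMask = std::clamp(PixelDistance / FadeRadius, 0.0f, 1.0f);
    return Base * CameraMask;
}
```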


Normal



We divide a world position offset input by the wave size, which is essentially the scaling of the normal. We mask R and G, use two different panners at 0.02 x 0.03 and -0.04 x 0.03 (which gives the wave effect as they animate against each other), then use two different float variables (Small Ripples and Large Ripples) to control scaling. This then gets plugged into two texture samples of our water normal texture, which are then added together. I also have a vector parameter called Normal Intensity that the result gets multiplied against, which at times helps control how strong the normal looks, but this is definitely an area that can be improved because it does not always work well.

There is also a blank normal map, which is lerped in using the foam depth fade from the base color. This is because we don't want to use the wave normal for the foam too.
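
As a plain-code approximation of the normal section (the actual graph isn't reproduced exactly - the ripple scaling operation is an assumption, and Normal Intensity is simplified to a scalar here):

```cpp
struct Vec2 { float x, y; };
struct Vec3 { float x, y, z; };

// Stub for the water normal texture lookup; a real shader samples the map here.
static Vec3 SampleNormal(Vec2 /*UV*/) { return { 0.0f, 0.0f, 1.0f }; }

static Vec3 Lerp(Vec3 a, Vec3 b, float t)
{
    return { a.x + (b.x - a.x) * t, a.y + (b.y - a.y) * t, a.z + (b.z - a.z) * t };
}

// WorldXY comes from masking R and G of the world position; Time drives the panners.
Vec3 WaterNormal(Vec2 WorldXY, float Time, float WaveSize,
                 float SmallRipples, float LargeRipples,
                 float NormalIntensity, float FoamDepthFade)
{
    // Two panners moving against each other give the rippling motion.
    Vec2 UV1 = { WorldXY.x / (WaveSize * SmallRipples) + 0.02f * Time,
                 WorldXY.y / (WaveSize * SmallRipples) + 0.03f * Time };
    Vec2 UV2 = { WorldXY.x / (WaveSize * LargeRipples) - 0.04f * Time,
                 WorldXY.y / (WaveSize * LargeRipples) + 0.03f * Time };

    // Sample twice, add, and scale by the intensity parameter.
    Vec3 N1 = SampleNormal(UV1);
    Vec3 N2 = SampleNormal(UV2);
    Vec3 Combined = { (N1.x + N2.x) * NormalIntensity,
                      (N1.y + N2.y) * NormalIntensity,
                       N1.z + N2.z };

    // Lerp toward a flat normal where the foam depth fade kicks in.
    return Lerp(Combined, Vec3{ 0.0f, 0.0f, 1.0f }, FoamDepthFade);
}
```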

Tessellation Amount



A simple little graph here that plugs into the tessellation multiplier input.

Wave Movement



To get the wave movement, we take the absolute world position, divide it by a float variable for how frequent we want the waves (Wave Frequency), mask the red channel from it, and add it to our wave speed. We then use a sine node and do some more math, and eventually multiply it against the variable Water Height Surge, which controls how high the waves are.

We then plug this into a quality switch, since it's fairly taxing, and plug that into both the world position offset and world displacement inputs.
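
The displacement math boils down to something like this sketch, with Wave Frequency, Wave Speed, and Water Height Surge matching the variables above and the extra shaping math omitted:

```cpp
#include <cmath>

// Minimal version of the wave offset: world X over the wave frequency,
// offset by time * speed, through a sine, scaled by the surge height.
float WaveOffsetZ(float WorldX, float Time,
                  float WaveFrequency, float WaveSpeed, float WaterHeightSurge)
{
    float Phase = WorldX / WaveFrequency + Time * WaveSpeed;
    return std::sin(Phase) * WaterHeightSurge;
}
```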

We also plug the wave output into the normal input of a new material function we need to create, which controls the subsurface scattering of the material - we'll go into that next.

Subsurface Water Material Function



This function is very complex. I also need to mention I did not actually create this function; I copied it off the web from TK's Dev Blog. Huge props to him, as this adds a super awesome effect we wouldn't have otherwise. I'll go over some of the variables here and what they do.

The Light Vector is the vector from your sun position, so we can get the correct lighting for the subsurface. The Scattering Color is just what you expect, the subsurface scattering color, which is multiplied against everything right before the final input.

We then have three float variables: Scattering Color Intensity, Scattering Fade Distance, and Scattering Scale. These are all fairly self-explanatory and simply control how much scattering there is to see and how intense it is.
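
To be clear, this is not the actual function from TK's Dev Blog (go look at his post for the real thing) - but purely as an illustration, parameters like these typically combine along the following lines:

```cpp
#include <algorithm>
#include <cmath>

struct Vec3 { float x, y, z; };

static float Dot(const Vec3& a, const Vec3& b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Illustrative only: a view/light alignment term raised to a scale, scaled by
// intensity, faded out with distance, and tinted by the scattering color.
Vec3 Scattering(const Vec3& LightVector, const Vec3& ViewVector,
                const Vec3& ScatteringColor, float ScatteringColorIntensity,
                float ScatteringScale, float ScatteringFadeDistance,
                float DistanceToCamera)
{
    // Strongest when the view lines up with the light through the surface
    // (the sign convention depends on how the light vector is supplied).
    float Alignment = std::max(0.0f, Dot(ViewVector, LightVector));
    float Term = std::pow(Alignment, ScatteringScale) * ScatteringColorIntensity;

    // Fade the effect out with distance from the camera.
    float Fade = std::clamp(1.0f - DistanceToCamera / ScatteringFadeDistance, 0.0f, 1.0f);

    float S = Term * Fade;
    return { ScatteringColor.x * S, ScatteringColor.y * S, ScatteringColor.z * S };
}
```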

Refraction



This is fairly simple as well, but we use depth fade again so we don't refract right at the water's edges. You'll get some bad artifacts in certain cases if you do.
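
A minimal sketch of that idea, assuming the material lerps from no refraction (1.0) toward the water's refraction value using the depth fade:

```cpp
// Shallow edges (DepthFade near 0) stay at 1.0 so they don't refract at all.
float Refraction(float DepthFade, float WaterRefraction /* e.g. ~1.33 */)
{
    return 1.0f + (WaterRefraction - 1.0f) * DepthFade;
}
```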

--

Now that the material is out of the way and created, let's move on to the blueprint. It's fairly simple, mostly just consisting of a few things we have to do on tick and on construction of the blueprint.

Let's create our water blueprint, add our tessellated mesh to it, and apply the master water material to the mesh. We then have to add variables to the blueprint that match the parameters in our material, and make them editable so we can tweak them in the editor. It's essentially up to you at this point which of the material's parameters you want to expose. I've personally added them all, for maximum flexibility.



In addition, you will also need a reference to your directional light, or whatever vector you want to use for the Light Vector variable found in the subsurface water material function. Since I have an entire sky blueprint, I use that and grab the sun location.

Construction Script
This thing is huge and I'm not going to go over it all; it's fairly self-explanatory from the following image:



We essentially use a sequence to update all the parameters in the material that's located on the static mesh plane of the water. Every time we update something in the editor, this will run and the material will be updated. We just use the "Set Scalar / Vector Parameter Value on Materials" nodes to control the parameters (just specify the name of the parameter).
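
If you were doing the same thing in C++ instead of a construction script, it would look roughly like this - the class name, component, and parameter names here are placeholders, so match them to your own material and blueprint:

```cpp
#include "Components/StaticMeshComponent.h"
#include "Materials/MaterialInstanceDynamic.h"

void AWaterActor::OnConstruction(const FTransform& Transform)
{
    Super::OnConstruction(Transform);

    // WaterMesh is the tessellated plane with the master water material applied;
    // WaterMID is a member so the tick update below can reuse it.
    WaterMID = WaterMesh->CreateDynamicMaterialInstance(0);
    if (!WaterMID)
    {
        return;
    }

    // Push each editable blueprint variable into its matching material parameter,
    // so tweaking the actor in the editor updates the water immediately.
    WaterMID->SetVectorParameterValue(TEXT("WaterLight"), WaterLightColor);
    WaterMID->SetVectorParameterValue(TEXT("WaterDark"), WaterDarkColor);
    WaterMID->SetVectorParameterValue(TEXT("FoamColor"), FoamColor);
    WaterMID->SetScalarParameterValue(TEXT("WaveFrequency"), WaveFrequency);
    WaterMID->SetScalarParameterValue(TEXT("WaterHeightSurge"), WaterHeightSurge);
}
```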

Event Graph



The last thing to do here is to update the light vector parameter on tick. In my case I grab the sky reference, get the sun, get its world location, and feed that in as the vector.
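
In C++ form, the tick update is roughly this, where SkyActor and GetSunLocation() are placeholders for however you reference your sun:

```cpp
// Continuing in the same water actor .cpp as the construction sketch above.
void AWaterActor::Tick(float DeltaSeconds)
{
    Super::Tick(DeltaSeconds);

    if (WaterMID && SkyActor)
    {
        // Feed the sun's world location into the parameter used by the
        // subsurface function (parameter name assumed).
        const FVector SunLocation = SkyActor->GetSunLocation();
        WaterMID->SetVectorParameterValue(TEXT("LightVector"), FLinearColor(SunLocation));
    }
}
```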

--

Now that we're done with the material and blueprint, the only thing left to do is to throw it into a level, start messing with its parameters, and see the results! This is a big, complex system that still has room for improvement. If you've ever seen the water plugin on the Unreal forums, you'll know that water can get very complex.

Some last notes: At one point I used a planar reflection component in the water blueprint, but I eventually removed it because it was incredibly taxing (although the results looked awesome!). If that were ever improved to the point where it was not so taxing, I would probably add it in again, but I don't see that happening given how that method of reflection works. Another note: usually in my environments, wherever I have sand and water, I'll use a slightly more reflective sand material and paint it around the edges and under the water, which gives the effect of wet sand. Lastly, one thing I already do in levels, but need to add to the blueprint so it's universal, is a sound dampening and post process volume, so you get a better looking underwater effect.

Cheers! Let me know if you have any questions or comments.

Saturday, December 31, 2016

2016: Starting Epitasis and what lies beyond

I've never actually taken the time to talk about how I started Epitasis or how it came to be, so I figured the end of the year would be a perfect time to talk about that, and where exactly Epitasis is heading next year.



Epitasis started as a small art project, based on some code I had created for a previous project that never really got off the ground. The code included things such as basic player movement and a linked actor system, allowing players to interact with and manipulate objects around the world. This gave birth to the puzzle system now found in Epitasis, and most of the code, while transferred from blueprints to C++, remains largely the same.

In terms of art, the project started with a small vision I had in my school's art gallery back in March of this year. While no particular piece caught my attention, something about the room made my brain click, and I finally had what I was looking for in my next project: a colorful world, a green field, and an orange planet. I immediately began working on this exact area, and within a week I had a small prototype, which looked like this:


Not long after that, I took a month-long trip to Costa Rica. Traveling around to a few places throughout the country brought long-needed inspiration for colorful vistas and mountainside landscapes, which, compared to the last few projects I had been creating, was entirely different. My previous projects were very monochromatic, dark, or realistic. With Epitasis I wanted a style focused on colorful imagery rather than on being super realistic looking - something along the lines of The Witness or RIME, essentially - and so far this style has worked incredibly well.

In Costa Rica I also had the opportunity to stay at an animal rescue center for a couple of weeks, and got to take care of beautiful animals such as scarlet macaws, sloths, and howler monkeys, to name just a few. Here are just a couple pictures from my month-long trip:


Once I got back, I finished up some summer classes and began working on the project with most of the free time I had. It's been like this ever since, though my free time dwindles with the amount of college work I have for my major (computer science). I also began experimenting with photography, as I took it as one of my classes this past semester, which has influenced my thinking on things such as lighting and composition. I spend a lot of time flying around my levels trying to get cool shots at various times of the day.



I also created a demo for Epitasis and presented it at a local UE4 meetup, and in just a couple weeks in 2017 I'll be presenting it again at an IGDA meeting in DC. I may also bring it to MAGFest in Baltimore for a hot minute in January as well (next week!), but since I didn't have a demo in time to apply for a table a while back, my time there would be limited to a small rotating booth shared between developers.


I also released a small teaser trailer and a website so people can get a look at Epitasis. Check them out if you haven't seen them:





This brings me to the new year, and where I see Epitasis going in 2017.

First of all, I should state that I see Epitasis having a "when it's finished" release date, but having done something like that in the past, I know I'll be able to figure out a release date before it's actually finished, depending on the amount of work that's left to be completed. At the moment, the game is in a state where most basic and core systems are completed, and now I'm just creating content, levels, and more designs to be added in and extended. It's a hard process, and to be completely honest, it's only going to get harder as more time goes on. There are some promising designs I've already started concepting for the game, though, such as a language based on real-world binary numbers and how one might communicate through binary with extraterrestrial intelligent life (see: the Voyager probe plaque, and also prime number communication), plus a bunch of other cool stuff I'll talk about in the coming year. Aside from a notebook covered in level designs, the wall near my desk is also currently covered in new and old designs - take a look:


I hope to continually improve the prototype demo for Epitasis in the next couple of months as well, which would allow me to show it off to more people and at other events, and hopefully gather some more attention for the project. I don't think I'll be done with Epitasis in 2017, and it would be foolish to think otherwise. However, I do see a lot of the game being completed by the end of the year, and hopefully a 2018 release. But like I said, it'll be done when it's done, and I don't plan to rush it.

It's going to be an interesting year for game development all around, as there are so many interesting projects happening by independent developers and developments being made in technology such as UE4. Overall, 2016 was a great year to start a game development project like Epitasis among so many other great projects, and I honestly can't wait to see what 2017 brings to the table. :)

Sunday, October 9, 2016

Building a Demo for a Puzzle Based Game

Building a demo is hard. Building a demo for a puzzle game is even harder.



Epitasis is a very interconnected game. Many levels tie in together, have hubs, areas to explore, and so forth. I sat down and started thinking about a demo about a month or so ago and really tried to hammer down what it'd be like. Since players who'd be playing it wouldn't have much storyline context or build up on previous puzzle mechanics, I simply wanted to focus on different puzzle elements and what the game could offer there.

I also intended the demo for people at events / meetups / showing friends, so I wanted to keep it relatively short so people who play it get a quick understanding of what the game is like. It's by no means a straight-up vertical slice, but it does showcase important basic puzzle elements and truly shows off the visual style the game is aiming for. In addition, it's also a great staging area to test performance, which I talk about a couple paragraphs down.



The design I came up with is something like what you'd see in The Talos Principle, where puzzle "areas" are fairly clearly designated. However, I made each puzzle somewhat interconnected, in that you need to solve each one to power up a central structure. Along with that, minor exploration is introduced, with some puzzle elements (such as crates for pressure pads) spread out. Terminal machines offer a small backstory about the ancient ruins and landscape, and there's even a secret puzzle area if you can figure out how to get to it.



One of the great things about building a demo also is all the stuff it forces you to think about and do. It's a great exercise in building an entire standalone level and also really makes you get stuff done. For example, this demo really made me focus on optimizations and performance, since this upcoming week I'll be showing the game at a UE4 meetup, and pretty much need it to run at a reasonable frame rate on my laptop. I had to implement a bunch of new features and optimizations, but the payoff on this was about 10 FPS saved on my laptop (a Lenovo y410p with an i7 and 765m NVIDIA GPU) so it runs stably at 30 frames locked at all high settings. Previously it would only run at about 15-20 in that area. Tweaking these settings slightly I can get it to run even higher and closer to 60, and of course on my main desktop I also saw performance gains at a 2560x1440 monitor resolution with a GTX 970. I still need to get around to testing even lower end hardware again, but I'm hopeful that I'll see big gains there as well.



I ended up having to implement new features like screen resolution (previously I only had screen percentage), detail modes (to use with material quality), and even a Max FPS setting to help smooth out FPS on other hardware like my laptop.
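
This isn't my actual settings code, but as a sketch, one straightforward way to wire up a Max FPS option is through the stock UGameUserSettings:

```cpp
#include "GameFramework/GameUserSettings.h"

// Minimal example: cap the frame rate via the engine's user settings.
void ApplyMaxFPSSetting(float MaxFPS)
{
    if (UGameUserSettings* Settings = UGameUserSettings::GetGameUserSettings())
    {
        Settings->SetFrameRateLimit(MaxFPS); // 0 means uncapped
        Settings->ApplySettings(false);
    }
}
```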

Along with that, I also ended up redoing my entire puzzle and usable-actor systems in C++, which most certainly helped, as it made me take a harder look at some of the things I was previously doing. In the end, this code refactor, in addition to the new quality settings and material optimizations (I'm looking at you, water material!), ended up making the game much more performant.



I still have a couple things left to do before Tuesday, when I show the game, and after that I'll probably be doing more puzzle and gameplay fixes based on feedback and other ideas I have lingering. Cheers!


Saturday, August 27, 2016

Music Management in Epitasis

While working in Unreal, a need arose for a music manager system for my project, Epitasis.

Coming from the Halo: CE engine originally, where music management was quite simple (just controlled via commands in a script), I wanted to create something similar using blueprints and easy macros.

Trying to play songs in a sequence using the regular audio mixer did not go well, as small delays started popping up during the transitions. Additionally, that audio mixer did not easily allow the user to access and play different sounds on command. So the simplest solution was to create a new blueprint class for music tracks.

This included:
  • Easy-to-use commands to call a child of the base music blueprint and start playing it.
  • Playing tracks correctly, without delays.
  • Support for intros, transitional intros, loops, alternate loops, transitional outros, and outros.
  • Functions to start playing the alternate loop, stop music, and stop music using a fade out with a timed duration.
  • A macro library that allows the commands to be called anywhere (more on this later).
So the base music class simply contains some simple variables the user can modify, which are the different audio cues for the tracks.
  1. Cue_MusicIn
  2. Cue_MusicTransLoop
  3. Cue_MusicLoop
  4. Cue_MusicAltLoop
  5. Cue_MusicTransOut
  6. Cue_MusicOut
The only ones you really need to include in a child of this blueprint are an intro and a loop. Anything less could simply use a Play Sound 2D node instead.

Let's look at some code:

This is the main event graph in the BP_BaseMusic class.




Some notes: I'm using Rama's Victory plugin to allow me to fade out my custom music sound class. This made the fade out function really simple, which is really the only reason it's used. If you don't want a fade out function then you don't need to include it, but I found this to be the quickest / easiest way to implement a fade without using an audio component, with just simple Play Sound 2D nodes. This method also assumes all your cues are assigned to that sound class. The only limitation is that when you fade out this track you fade out ALL music, but in my case that's not a big deal.

The timeline for the fade out runs over one second. We use the Set Play Rate function and an input to control the speed of the fade out.

Otherwise, the code is quite straightforward. We get the audio cue's duration, delay by that amount, and simply play the next cue (or loop). We also check whether the transitional cues are valid, since these are not always necessary and are mostly there to help the composer if needed.

We also have three functions: StopMusic, PlayMusic, and PlayAltLoop.

- StopMusic simply checks to see if the loop has even fired yet (in case it was called too early), and if it has, it then clears the loop music function and sets the "MusicFinished" boolean to true.

- PlayMusic simply calls the custom event BeginPlayMusic.

- PlayAltLoop simply sets the "PlayAltLoop" boolean variable to true or false based on a bool input when called. This is so it can be called again to set it back to false.
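
For reference, here's a simplified C++ sketch of that flow (intro, then a loop chained on a timer until StopMusic is called); it skips the transitional cues and the Victory-plugin fade, and the class and member names are placeholders for the blueprint variables listed above:

```cpp
#include "Kismet/GameplayStatics.h"
#include "Sound/SoundBase.h"
#include "TimerManager.h"

void ABaseMusic::PlayMusic()
{
    if (!Cue_MusicIn) { return; }

    UGameplayStatics::PlaySound2D(this, Cue_MusicIn);

    // When the intro finishes, start looping.
    GetWorldTimerManager().SetTimer(
        LoopTimerHandle, this, &ABaseMusic::PlayLoop, Cue_MusicIn->GetDuration(), false);
}

void ABaseMusic::PlayLoop()
{
    // Pick the alternate loop if it has been requested and exists.
    USoundBase* Loop = (bPlayAltLoop && Cue_MusicAltLoop) ? Cue_MusicAltLoop : Cue_MusicLoop;
    if (!Loop || bMusicFinished) { return; }

    UGameplayStatics::PlaySound2D(this, Loop);

    // Queue the next pass exactly when this one ends, so there are no gaps.
    GetWorldTimerManager().SetTimer(
        LoopTimerHandle, this, &ABaseMusic::PlayLoop, Loop->GetDuration(), false);
}

void ABaseMusic::StopMusic()
{
    bMusicFinished = true;
    GetWorldTimerManager().ClearTimer(LoopTimerHandle);
}

void ABaseMusic::PlayAltLoop(bool bEnable)
{
    bPlayAltLoop = bEnable;
}
```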

I then created a simple macro library with simple functions and inputs that can be called from any actor or level blueprint (there's another macro library specifically for calling from UMG if needed).


Start Music - the only input it requires is the child blueprint CLASS that contains all the sound cues we want to play. It returns the created instance.


Stop Music and Play Alternate - these simply get the instance and call the respective functions. Honestly, they aren't necessary since you have to get the instance anyway, but I added them for consistency.


Fade out macro - we get an input for the play rate and then find all currently playing music actors and fade them out.

From here, we can just call the Start Music macro, plug in a child of the base music class, and let it play!



Lastly, I've been working with Andrzej Ojczenasz for the music of Epitasis. He's been creating some really great stuff, and I can't wait to share it with you all quite soon.

Sunday, August 21, 2016

Portal Rendering in UE4

Here’s a new breakdown for portal rendering, as requested by MetricZero.

A few words though:

This breakdown won’t cover everything, but hopefully it should get people started on the right direction to building their own portals and rendering them correctly. This same rendering method could also be used to build non-euclidian worlds (eg, a hallway that looks long on the inside, but outside looks short!). It could probably (and has been, although not really shared) be done better in C++, but this has been currently accomplished completely in blueprints.
Here is an example of what you can expect:



The first thing we should do is create the following assets:
- A new blueprint actor class for our portal (BP_Portal)
- A new blueprint actor class for the portal's exit (BP_PortalExitNode)
- A new material for the portal’s mesh (MAT_Portal_Master). In addition, you’ll need a simple plane mesh (rotated upwards to stand vertically) to apply the portal material to.

Let's begin by going over how the material will work.



We plug a Texture 2D parameter into the emissive color. In our BP_Portal class, we will inject a dynamically created render target into this parameter, which is why we don't need an actual render target asset. We also plug screen-aligned UVs into the UV input of the parameter, as we want the texture to align to the screen.
This material is fairly simple. I also do some effects with the opacity mask to create more of a wormhole effect, which just uses some basic masks and noise textures that rotate and blend together.

The last thing I do, which is still a WIP, is fade out the portal material (lerping it with the render target parameter). I don't like this and will most likely get rid of it. In most cases it's not necessary either, since in our BP_Portal class we will stop updating our render target if the player is far enough away.
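
As a plain-code stand-in for the core of the material (ignoring the wormhole opacity effects), the emissive is just the injected render target sampled with screen-space UVs:

```cpp
struct Vec2 { float x, y; };
struct Vec3 { float x, y, z; };

// Stub for sampling the PortalRenderTarget parameter; a real shader reads
// the render target texture here.
static Vec3 SampleTexture(Vec2 /*UV*/) { return { 0.0f, 0.0f, 0.0f }; }

// ScreenPosition-style UVs: the pixel's position normalized by the viewport
// size, so the captured scene lines up with the view through the portal plane.
Vec3 PortalEmissive(Vec2 PixelPosition, Vec2 ViewportSize)
{
    Vec2 ScreenUV = { PixelPosition.x / ViewportSize.x,
                      PixelPosition.y / ViewportSize.y };
    return SampleTexture(ScreenUV);
}
```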

Next up is the BP_PortalExitNode class. This class serves as the "exit" for the player and is also where the portal's view is captured from, as it contains our scene capture component. There isn't much going on here except for components, variables, and a function to update our scene capture:



Lastly is the meat of it all, the BP_Portal class. There is a lot going on in this class and I'm not going to cover it all, but I'll try to cover the gist of it.



On Begin Play: The big thing here is creating the dynamic render target 2D. I'm using the Victory Plugin, courtesy of Rama, to create the dynamic render target, but in 4.11 or 4.12 a new node was added that makes this possible without the plugin. The important part when creating it is getting the correct screen aspect ratio; if it's off, our portal will be distorted and look incorrect. To maintain a decent FPS, we get the player controller's viewport resolution, divide both X and Y by 2 (so half of the current screen resolution), and use those numbers for our render target. Something to add in the future would be a scalability setting for this. Obviously, if we do not divide the resolution we get a clearer result, but it tends to murder FPS.

We then take the returned render target and plug it into a reference to our BP_PortalExitNode's scene capture component so it has something to render onto. Next we create a material instance of MAT_Portal_Master and plug the render target texture into the PortalRenderTarget parameter we created inside of it.
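
Translated roughly to C++ (assuming an engine version where UKismetRenderingLibrary::CreateRenderTarget2D is available, so no plugin is needed), the Begin Play setup looks something like this - APortalActor, ExitNode, PortalMesh, and PortalMaterial are placeholder names:

```cpp
#include "Kismet/KismetRenderingLibrary.h"
#include "Components/SceneCaptureComponent2D.h"
#include "Components/StaticMeshComponent.h"
#include "Engine/TextureRenderTarget2D.h"
#include "Materials/MaterialInstanceDynamic.h"

void APortalActor::BeginPlay()
{
    Super::BeginPlay();

    int32 ViewX = 0, ViewY = 0;
    GetWorld()->GetFirstPlayerController()->GetViewportSize(ViewX, ViewY);

    // Half the viewport resolution keeps the correct aspect ratio while
    // keeping the scene capture from murdering the frame rate.
    UTextureRenderTarget2D* RenderTarget =
        UKismetRenderingLibrary::CreateRenderTarget2D(this, ViewX / 2, ViewY / 2);

    // The exit node's scene capture renders into our target...
    ExitNode->SceneCapture->TextureTarget = RenderTarget;

    // ...and the portal plane's dynamic material instance displays it.
    UMaterialInstanceDynamic* PortalMID =
        PortalMesh->CreateDynamicMaterialInstance(0, PortalMaterial);
    PortalMID->SetTextureParameterValue(TEXT("PortalRenderTarget"), RenderTarget);
}
```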
Next is the camera math. With everything else set up correctly, we don't have to do much to get something realistic now.



Essentially, all we need to do is set the scene capture component's rotation (from our BP_PortalExitNode reference) to the player camera's rotation. It's slightly hacky, but I actually have to modify the rotation by 180 degrees to get the correct results. This is probably an area that could receive some slight improvements in the future.

This is also where I call the Update function from the BP_PortalExitNode reference. If this function isn't called (which it isn't if we are more than 1000uu away), then the portal doesn't update. The UpdatePortalCapture event is called on tick every 0.03 seconds. Reducing this number gives more realistic rendering at the sacrifice of performance. Really, scene captures murder FPS, at least for me.
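
A rough C++ sketch of that per-update logic, with the class and member names again being placeholders and the 180-degree offset subject to tweaking for your setup:

```cpp
// Continuing in the same portal actor .cpp as the Begin Play sketch above.
void APortalActor::UpdatePortalCapture()
{
    APlayerCameraManager* Camera = GetWorld()->GetFirstPlayerController()->PlayerCameraManager;

    // Skip the (expensive) capture entirely when the player is far away.
    if (FVector::Dist(Camera->GetCameraLocation(), GetActorLocation()) > 1000.0f)
    {
        return;
    }

    // Mirror the player camera's rotation onto the exit node's scene capture,
    // flipped 180 degrees in yaw so the view comes back out the right way.
    FRotator CaptureRotation = Camera->GetCameraRotation();
    CaptureRotation.Yaw += 180.0f;
    ExitNode->SceneCapture->SetWorldRotation(CaptureRotation);

    ExitNode->UpdateCapture(); // the exit node's own update function (assumed name)
}
```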

The last thing we need to do is throw a BP_Portal into our scene and link a BP_PortalExitNode to the Exit Node Reference!

Final Result:



Other things to do are teleporting the player to the exit node. The easiest way to do this is to set up a trigger volume on the portal which simply transports the player to the exit node reference's location. This is easier said than done, which is why I didn't show my code, as it's still a WIP and quite messy. The same thing can be done to transport other objects. Another area to improve is using custom stencil to give the effect of objects passing into the portal (the same thing should be done if you have first person objects and don't want them to clip into the portal mesh).

I hope this helps people who are looking to build portals. I know it's not a complete guide, but hopefully it starts people in the right direction toward building their own. Even these systems have a ton of room for improvement, and hopefully in the future I can expand on this same tutorial with more knowledge gained on the subject.

Creating Stylized Grass in UE4 for Epitasis


Hey guys, here's the breakdown for the grass that some people have been asking about.

This same method is used for other foliage too, like the trees, plants, bushes, etc.
There are a few key things you must do to get this right:
•    Make sure the grass normals are all pointing up.
•    Create a grass texture used on both the grass and landscape materials.
•    When painting the mesh onto the landscape, align the mesh to the landscape normals.

Here is a small example of what you can expect of your grass or other foliage:



Notice the nice blending and shading with the landscape. This is due to the actual vertex normals of the grass. If we modify the vertex normals to point upwards, it helps with this blending effect, which in the end is key to getting the grass as smooth as possible.



In 3ds Max, the grass normals are represented by green lines; they should look like the image above when you have all of them pointing upwards.

In order to create the proper blending between the grass and the actual landscape material/component, they both need to use the same texture and base diffuse material setup. This method was actually first shown by ZacD, I believe, as I found it in a reddit post of his a while back, but I'll cover it again in some more detail. Overall, this is a very easy effect to create.

Here is the default green grass texture I use. You can make this simply by painting it in Photoshop, or just by finding a grass texture online and applying a Gaussian blur filter to it. In my case I did both to find something suitable.



Create two new materials: one for your landscape, and one to be applied to the grass mesh.
In both, create the following material node setup and connect it to the diffuse input of the material:



The texture samples are samples of the new diffuse texture we imported. By making them parameters, you can easily change the colors or textures and create new-looking landscapes using material instances. We also need to use the material function AbsoluteWorldPosition(), which lines the textures up to the same spots for the blending.
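
The exact node setup is shown in the image, but the idea reduces to something like this sketch: both materials compute their UVs from the absolute world position divided by a tiling scale, so the textures line up no matter where the grass mesh sits (TileSize is an assumed parameter name):

```cpp
struct Vec2 { float x, y; };
struct Vec3 { float x, y, z; };

// World-aligned UVs: the same world-space XY, over the same tile size,
// gives identical tiling on the grass mesh and the landscape underneath it.
Vec2 WorldAlignedUV(const Vec3& AbsoluteWorldPosition, float TileSize)
{
    return { AbsoluteWorldPosition.x / TileSize,
             AbsoluteWorldPosition.y / TileSize };
}
```
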
In the grass mesh material, we next plug a simple (0,0,1) vector into the normal input, which ensures that all texture normals point upwards as well.



Again, this helps blending between the grass mesh and the landscape component.

In your material settings for the grass mesh material, make sure to disable "Tangent Space Normal" (see last pic).

Lastly, apply your new landscape material to your landscape, and the grass mesh material to the grass mesh. I also recommend disabling dynamic shadows on the grass mesh to improve performance and the overall stylized look. I do this on some other foliage as well, largely depending on its size.
The last step is to paint the grass onto the landscape.
A few things I make sure to do here are align the grass to the landscape normals, disable dynamic shadows, and keep its density between 150-200. The nice part about this grass mesh and material is that it seamlessly blends between itself and the landscape, so you can fade it out relatively close to the player without them noticing. You also don't need it very dense, as again, it's not super noticeable.

Final outcome:



The biggest things to remember when creating this sort of effect are to use normals that all point up and to use the AbsoluteWorldPosition() function for your texture UVs, so the grass and landscape textures tile the same way across your levels.
Lastly, I didn't show the entire material inside the actual breakdown, but here it is anyway if you want to see some other bits of it (basically just wind, opacity, etc.).

Welcome

Hello, and welcome to my game development blog.

I'll mostly be posting long ramblings, breakdowns, and progress on my game "Epitasis".

If you don't already know me, my name is Lucas Govatos, and I'm an indie game developer working in Unreal Engine 4. I've worked on AAA games in the past in a QA position, but have now shifted my focus to going back to school and creating my own game projects.

So welcome to my humble blog... hopefully we'll both learn something here.

- Lucas