Minecraft subtle shader pack

Justin Donn, Kevin Yuan, Matin Kassaian

~ enjoying the sunset ~

Abstract

For our project, we set out to create a Minecraft shader that enhances the appearance of in-game water. To achieve this, we implemented realistic, physically based lighting effects such as specular reflection, along with some less physically accurate (but visually appealing) effects like horizon lighting. Throughout the project, we reinforced our understanding of the mathematical models behind these lighting effects, and we learned how to write and debug code in the OptiFine pipeline.

Technical Approach

We built our project on top of a bare-bones shader pack called "Sildur's basic shaders," which contained the project structure needed for the shader program to compile but provided little functionality beyond that. The specific files we edited were gbuffers_water, composite, and final (both the .vsh and .fsh files for each). The following are the features we implemented, accompanied by short technical descriptions; illustrative GLSL sketches for each feature follow the list.

  1. Specular Reflection: We implemented specular reflection on water based on the Blinn-Phong reflection model we learned in class. Getting this to work was a challenge at first because there were different view spaces to work in (eye space vs. world space), and transforming between the two required matrices that were difficult to track down.
  2. Waves: To implement waves, we used displacement mapping to offset the water surface and perturb the vertex normals, using a TBN matrix similar to the one we learned in class. The wave pattern consists of two cosine functions traveling in the positive and negative XZ directions.
  3. Horizon Lighting: We implemented the horizon lighting calculation in the vertex shader by modifying the water "brightness" based on the viewing angle. The more grazing the view direction is relative to the vertex normal, the more brightness is added to that fragment of the water in the fragment shader.
  4. Underwater "refraction" effect: We used a function similar to the waves, except that instead of displacing the actual vertices, we offset the UV coordinates at which the texture is sampled. Thus, this was a form of bump mapping, driven by sine/cosine functions of the frame time counter.
  5. Antialiasing: We implemented MSAA 4x using methods similar to Project 1. Since final.fsh is the last shader in the pipeline, we used it to our advantage: we sampled the rendered frame multiple times at sub-pixel offsets and averaged the results to produce the final fragment color. In a game like Minecraft, with its many straight lines and hard edges, smoothing jaggies noticeably improved visual clarity.
  6. Convolution filters: We used the example kernels from Lecture 24 to create a sharpening filter, a Gaussian blur, and edge detection. This was more for our own enjoyment and experimentation, and it isn't one of the primary features of the shader pack.
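
The sketches below are illustrative only: the uniform and varying names follow common OptiFine conventions but are assumptions rather than the exact names in our files, and all constants are placeholder tuning values rather than the ones we shipped.

Specular reflection sketch (item 1): a minimal Blinn-Phong term evaluated in eye space, assuming a light position, fragment position, and water normal are available.

    // Blinn-Phong specular sketch (fragment shader). Names and constants are illustrative.
    uniform vec3 sunPosition;      // sun position in eye space (provided by OptiFine)

    varying vec3 viewPos;          // fragment position in eye space (passed from the vertex shader)
    varying vec3 waterNormal;      // perturbed water normal in eye space

    vec3 blinnPhongSpecular(vec3 lightColor, float shininess) {
        vec3 N = normalize(waterNormal);
        vec3 L = normalize(sunPosition);   // direction toward the sun
        vec3 V = normalize(-viewPos);      // direction toward the camera
        vec3 H = normalize(L + V);         // half vector
        float spec = pow(max(dot(N, H), 0.0), shininess);
        return lightColor * spec;
    }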
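Wave sketch (item 2): a height field built from two traveling cosine waves, plus a finite-difference normal brought out of tangent space with a TBN matrix. frameTimeCounter is OptiFine's time uniform; the amplitude and frequencies are made up.

    // Wave displacement sketch (vertex shader). The vertex itself is offset by
    // waveHeight(xz) along the up axis before the normal is recomputed.
    uniform float frameTimeCounter;    // in-game time, provided by OptiFine

    float waveHeight(vec2 xz) {
        // Two cosine waves traveling in opposite directions along XZ.
        float w1 = cos(dot(xz, vec2( 1.0,  1.0)) * 0.8 + frameTimeCounter * 1.5);
        float w2 = cos(dot(xz, vec2(-1.0, -1.0)) * 1.3 + frameTimeCounter * 2.0);
        return 0.05 * (w1 + w2);
    }

    // Approximate the perturbed normal from the height field's partial derivatives,
    // then transform it out of tangent space with a TBN (tangent, bitangent, normal) matrix.
    vec3 waveNormal(vec2 xz, mat3 tbn) {
        float eps = 0.1;
        float dx = (waveHeight(xz + vec2(eps, 0.0)) - waveHeight(xz - vec2(eps, 0.0))) / (2.0 * eps);
        float dz = (waveHeight(xz + vec2(0.0, eps)) - waveHeight(xz - vec2(0.0, eps))) / (2.0 * eps);
        return normalize(tbn * normalize(vec3(-dx, -dz, 1.0)));
    }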
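Horizon lighting sketch (item 3): a Fresnel-like brightness boost computed in the vertex shader and passed to the fragment shader; the exponent and weight are arbitrary tuning values.

    // Horizon lighting sketch (vertex shader). Brightness grows as the view
    // direction becomes more grazing relative to the water normal.
    varying float horizonBrightness;

    void applyHorizonLighting(vec3 viewPos, vec3 normal) {
        vec3 V = normalize(-viewPos);                       // vertex-to-camera direction (eye space)
        float facing = clamp(dot(V, normalize(normal)), 0.0, 1.0);
        // 0.0 when looking straight down at the water, approaching 1.0 at grazing angles.
        horizonBrightness = pow(1.0 - facing, 3.0) * 0.4;   // added to the water color in the fragment shader
    }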
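Underwater "refraction" sketch (item 4): instead of moving geometry, the texture coordinates are wobbled over time before sampling. The sampler and varying names stand in for whatever gbuffers_water actually uses.

    // Fake-refraction sketch (fragment shader): offset the UVs with a time-varying
    // sine/cosine wobble. Constants are placeholder tuning values.
    uniform sampler2D texture;         // block texture atlas
    uniform float frameTimeCounter;
    varying vec2 texcoord;

    vec4 sampleWithWobble() {
        vec2 offset = 0.003 * vec2(sin(texcoord.y * 40.0 + frameTimeCounter * 2.0),
                                   cos(texcoord.x * 40.0 + frameTimeCounter * 2.0));
        return texture2D(texture, texcoord + offset);
    }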
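Antialiasing sketch (item 5): in final.fsh, the finished frame is sampled at four sub-pixel offsets and averaged. colortex0, viewWidth, and viewHeight follow common OptiFine naming but are assumptions here.

    // 4-sample antialiasing sketch (final.fsh). The offsets form a small rotated grid.
    uniform sampler2D colortex0;       // the rendered frame
    uniform float viewWidth;
    uniform float viewHeight;
    varying vec2 texcoord;

    vec3 antialiasedColor() {
        vec2 pixel = vec2(1.0 / viewWidth, 1.0 / viewHeight);
        vec2 offsets[4];
        offsets[0] = vec2( 0.25,  0.25);
        offsets[1] = vec2(-0.25,  0.25);
        offsets[2] = vec2( 0.25, -0.25);
        offsets[3] = vec2(-0.25, -0.25);

        vec3 color = vec3(0.0);
        for (int i = 0; i < 4; i++) {
            color += texture2D(colortex0, texcoord + offsets[i] * pixel).rgb;
        }
        return color * 0.25;           // average of the four samples
    }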
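Convolution sketch (item 6): a 3x3 sharpen kernel applied over the screen texture; the Gaussian blur and edge-detection filters follow the same pattern with different weights.

    // Sharpening filter sketch (final.fsh): weight 5 at the center, -1 on the four neighbors.
    uniform sampler2D colortex0;
    uniform float viewWidth;
    uniform float viewHeight;
    varying vec2 texcoord;

    vec3 sharpen() {
        vec2 pixel = vec2(1.0 / viewWidth, 1.0 / viewHeight);
        vec3 center = texture2D(colortex0, texcoord).rgb;
        vec3 up     = texture2D(colortex0, texcoord + vec2( 0.0,  1.0) * pixel).rgb;
        vec3 down   = texture2D(colortex0, texcoord + vec2( 0.0, -1.0) * pixel).rgb;
        vec3 left   = texture2D(colortex0, texcoord + vec2(-1.0,  0.0) * pixel).rgb;
        vec3 right  = texture2D(colortex0, texcoord + vec2( 1.0,  0.0) * pixel).rgb;
        return 5.0 * center - (up + down + left + right);
    }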

One of the greatest challenges we faced was how difficult and complex it is to create Minecraft shaders that match your vision. First, there was very little documentation on how Minecraft interacts with OptiFine; the official OptiFine documentation that does exist is fairly basic and at times confusing. There was also the difficulty of translating our ideas into code that actually compiled and behaved as we intended. One example of both challenges was our attempt to implement raytracing. While we understood the basic algorithm thanks to the lectures and projects, we had no idea how Minecraft (or OptiFine) defines and structures its objects (in this case, blocks) and their corresponding textures. We weren't sure how to access the positions and textures of other blocks from within the vertex or fragment shader of the block currently being rendered, so we couldn't really implement raytracing.

Another obstacle arose when we tried to implement refraction (Snell's law) in the water. While we had a reasonable idea of how to implement it in theory, we weren't sure how to access the texture of a block sitting underneath another block (for example, how do we access the texture of a dirt block sitting under a water block?). In the end, we settled on a sort of fake refraction, as mentioned above, via bump mapping, using sine/cosine functions to create an effect similar to refraction underwater.


Nonetheless, we still had a lot of fun working on this project, and we all learned firsthand how difficult developing a Minecraft shader pack can be. One big takeaway we're all grateful for is the opportunity to dive deep into a technology we weren't very familiar with and still create something we're proud of with minimal online help. Thanks to this experience, we can better appreciate the shader packs currently available to us and admire the beauty that is non-vanilla Minecraft.

Results

Demo video:

Additional Screenshots:


Gaussian blur
Sharpen
Edge Detection
k e l p

References

We relied almost exclusively on two types of resources for this project: other people's shader packs used as references, and various online forums, since official documentation for developing Minecraft shaders is extremely limited. Here are some of the shader packs we took a look at:

Here are some of the webpages we used for information:

Contributions

We all worked together closely throughout the project, since we believed that was the most efficient way to learn about OptiFine, Minecraft shader development, and implementation-specific GLSL. In the initial stages, we combed through different shader packs together and dissected the components found in each one. When it came to implementing our own features, we split off in different directions to try out our own ideas, but we frequently shared our intermediate results and any confusions or insights with each other. There isn't really any feature we added that we didn't all work on, or at least experiment with, at some point. That said, here's a quick rundown of our individual contributions:

Kevin - experimented with specular lighting and waves, experimented with and (sort of) implemented pathtracing, experimented with adding a noise texture to waves, set up and published the website

Matin - came up with and implemented the final specular lighting, waves, and MSAA 4x code, experimented with convolutions, and developed the initial ideas for horizon lighting

Justin - experimented with specular lighting; implemented underwater effects, horizon lighting (water brightness based on viewing angle), and water color, texture, and transparency; and implemented several convolutions, such as sharpening, Gaussian blur, and edge detection (as a bonus)

Links

Demo Video