Aperture Tutorial (for legacy developers)

Aperture is a new take on shader packs utilizing TypeScript to create an extensible pipeline.

pack.ts

The core of the pipeline; pack.ts is where you start registering programs. Unlike in OptiFine/Iris, there is no “default” setup; you must create one.

pack.ts requires two separate functions: configureRenderer and configurePipeline.

configureRenderer

configureRenderer is where you configure how Minecraft’s default rendering should change to accommodate your pipeline. This is where you define:

  • If you will use shadow maps, and if so, their settings
  • If you want sun tilt, ambient occlusion, or directional shade
  • If you want point light shadows (documentation not ready)

An example of configureRenderer:

function configureRenderer(config: RendererConfig): void {
    config.disableShade = true;
    config.ambientOcclusionLevel = 0.0;
    config.render.sun = false;
    config.shadow.enabled = true;
    config.shadow.resolution = 1024;
}

configurePipeline

The actual meat of the pipeline; this is where you will configure the following:

  • Textures
  • Buffers
  • Object shaders (previously known as gbuffer programs)
  • Command lists (containing composite/compute shaders)

There is no required order for configuring these, but we will cover them in the order listed above.

First, let’s define the function itself.

function configurePipeline(pipeline: PipelineConfig): void {}

Textures

In Aperture, the only textures provided to you by default are depth textures; that being mainDepthTex (previously depthtex0) and solidDepthTex (previously depthtex1).

All other textures, including color textures, must be created and provided by you.

For the following, we will use the predefined variables screenWidth and screenHeight.

Let’s create two basic RGBA8 textures. (The reason for creating two will be explained later.)

let mainTex = pipeline
    .createTexture("mainTexture") // The string provided here will be what is used to access this texture in shader code.
    .format(Format.RGBA8)
    .width(screenWidth)
    .height(screenHeight)
    .build();
let finalTex = pipeline
    .createTexture("finalTexture")
    .format(Format.RGBA8)
    .width(screenWidth)
    .height(screenHeight)
    .build();

Remember these textures; we will use them later.

In Aperture, there are many other types of textures, including array textures, PNG textures, and raw data textures. These will not be covered here.

Combination pass

Let’s get this out of the way first. The “combination pass” is the final stage of rendering, and it is required; it determines what is sent to the screen as the final image.

Let’s make a dummy one quickly.

pipeline.createCombinationPass("programs/combination.fsh").build();

Create programs/combination.fsh, and put this in.

#version 460 core
uniform sampler2D finalTexture;
in vec2 uv;
layout(location = 0) out vec4 finalColor;
void main() {
    finalColor = texture(finalTexture, uv);
}

Object shaders

Previously known as gbuffer_ shaders, these are the shaders run on world geometry. They differ in many ways from their OptiFine/Iris counterparts.

Time to define our first shader.

pipeline
    .createObjectShader("basic", Usage.BASIC)
    .vertex("programs/basic.vsh")
    .fragment("programs/basic.fsh")
    .target(0, mainTex)
    .compile();

We have just defined an object shader for the BASIC program usage; this will be the shader all others fall back to when there isn’t a more specialized alternative.

We have defined that output 0 will write to the mainTex we specified earlier.

Vertex shader

Time to create the vertex shader. This is quite different from what you are used to.

#version 460 core
out vec2 uv;
out vec2 light;
out vec4 color;
void iris_emitVertex(inout VertexData data) {
    data.clipPos = iris_projectionMatrix * iris_modelViewMatrix * data.modelPos;
}
void iris_sendParameters(VertexData data) {
    uv = data.uv;
    light = data.light;
    color = vec4(data.color.rgb * data.ao, data.color.a); // Ambient occlusion is split out by default in Aperture.
}

So, what’s going on here? Let’s break this down.

First, note the two separate functions. iris_emitVertex takes the VertexData as an inout parameter, hinting that it is meant to be edited, while iris_sendParameters does not.

Indeed, the job of iris_emitVertex is to set the vertex position. It has two jobs: modifying the position however the shader wishes, and converting that position to clip space.

Why are these two separate functions? Simply put, because iris_emitVertex can be called more than once per vertex, while iris_sendParameters is guaranteed to only be called once. As the names hint, you should send vertex parameters within sendParameters for this reason. However, this is not a hard limit; if needed, you can mix and match these. Just beware the unexpected.
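For example, a purely illustrative vertex effect that inflates geometry slightly along its normal would live in iris_emitVertex (the 0.01 offset is an arbitrary value for this sketch):

```glsl
// Hypothetical sketch: nudge each vertex along its model-space normal,
// then project to clip space as usual.
void iris_emitVertex(inout VertexData data) {
    data.modelPos.xyz += data.normal * 0.01;
    data.clipPos = iris_projectionMatrix * iris_modelViewMatrix * data.modelPos;
}
```

Because iris_emitVertex may run more than once per vertex, an offset like this should be a pure function of the input data, as it is here.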

VertexData

The full set of data available is as follows:

struct VertexData {
    vec4 modelPos; // model space position you are given
    vec4 clipPos; // clip space position you must set
    vec2 uv; // texture coordinate in the atlas
    vec2 light; // lightmap texture coordinate - x is blocklight, y is skylight
    vec4 color; // vertex color, equivalent to gl_Color
    vec3 normal; // model space normal
    vec4 tangent; // model space tangent
    vec4 overlayColor; // stuff like the red damage flash, equivalent to entityColor
    vec3 midBlock; // equivalent to at_midBlock
    uint blockId;
    uint textureId;
    float ao; // vanilla ambient occlusion
};

Block IDs

The VertexData object passed to iris_sendParameters has a blockId attribute. The following functions can be used on the ID:

vec4 iris_getLightColor
uint iris_getMetadata // Raw bitmask, only use if nothing else is enough
bool iris_isFullBlock
bool iris_hasFluid
int iris_getEmission
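As a sketch, these could be used inside iris_sendParameters to pass a normalized emission value to the fragment shader (the `emission` varying here is our own, not part of Aperture):

```glsl
out float emission;

void iris_sendParameters(VertexData data) {
    // iris_getEmission returns the vanilla 0-15 emission value;
    // normalize it to 0-1 for convenience (illustrative only).
    emission = float(iris_getEmission(data.blockId)) / 15.0;
}
```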

The metadata comes in the form of a packed uint using the low 19 bits, with the following structure:

0-5: IS_SIDE_SOLID (6 bits) // DOWN, UP, NORTH, SOUTH, WEST, EAST
6-9: EMISSION (4 bits)
10-13: LIGHT_BLOCK (4 bits)
14-16: DIRECTION (3 bits)
17: IS_LOWER (1 bit)
18: IS_FLUID (1 bit)
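To make the bit layout concrete, here is a hypothetical stand-alone decoder written in TypeScript (the interface and function names are our own illustration; in shader code you would normally use the dedicated iris_* helpers rather than picking apart iris_getMetadata yourself):

```typescript
// Field names are illustrative; they mirror the bit layout documented above.
interface BlockMetadata {
  sideSolid: number;  // bits 0-5: DOWN, UP, NORTH, SOUTH, WEST, EAST
  emission: number;   // bits 6-9
  lightBlock: number; // bits 10-13
  direction: number;  // bits 14-16
  isLower: boolean;   // bit 17
  isFluid: boolean;   // bit 18
}

function decodeMetadata(raw: number): BlockMetadata {
  return {
    sideSolid: raw & 0x3f,
    emission: (raw >>> 6) & 0xf,
    lightBlock: (raw >>> 10) & 0xf,
    direction: (raw >>> 14) & 0x7,
    isLower: ((raw >>> 17) & 0x1) === 1,
    isFluid: ((raw >>> 18) & 0x1) === 1,
  };
}
```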

Fragment shader

Time for the other side.

#version 460 core
in vec2 uv;
in vec2 light;
in vec4 color;
layout(location = 0) out vec4 outColor; // Remember when we put location 0 in `pack.ts`? This is where it's used.
void iris_emitFragment() {
    vec2 mUV = uv, mLight = light;
    vec4 mColor = color;
    iris_modifyBase(mUV, mColor, mLight);
    outColor = iris_sampleBaseTex(mUV) * iris_sampleLightmap(mLight) * mColor;
    if (iris_discardFragment(outColor)) discard;
}

Less going on here, but still notable. Let’s go through it.

First of all, main is now iris_emitFragment. Second of all, you must define your outputs; no more gl_FragData.

Now, why do we need these local copies of the inputs? Simply put, to allow greater mod support. iris_modifyBase is a default hook that lets mods modify per-fragment data without significantly impacting the final image.

(No, you cannot edit the in values directly; they’re read only.)

It is recommended, but not required, to include this hook whenever possible.

Second, iris_discardFragment. Alpha testing (discarding fragments whose alpha falls below a threshold, which is what makes cutout blocks like leaves transparent) is not implicit in Aperture; this simple if statement takes care of all those situations.

Third, notice the lack of uniforms for sampling the base texture and lightmap. These are handled using built-in functions instead; examples include:

  • iris_sampleBaseTex
  • iris_sampleLightmap
  • iris_sampleNormalMap
  • iris_sampleSpecularMap

Command lists (composite/compute)

The functions of composite and compute shaders have been merged into a single “idea”, known as a command list. These “lists” run at certain points in the pipeline, similar to composite, deferred, and prepare stages.

Command lists are also capable of having sub-lists, which is helpful for debugging.

Let’s create a basic command list with a single composite. Computes will not be covered in this page.

let postRenderList = pipeline.forStage(Stage.POST_RENDER);
postRenderList
    .createComposite("brighten")
    .fragment("programs/brighten.fsh")
    .target(0, finalTex)
    .compile();
postRenderList.end();

(You are not required to have a vertex shader for composites; although it is allowed, we will not cover them.)

Let’s write it! Unlike with object shaders, these don’t contain much special syntax.

#version 460 core
uniform sampler2D mainTexture;
in vec2 uv;
layout(location = 0) out vec4 finalColor;
void main() {
    vec4 col = texture(mainTexture, uv);
    finalColor = vec4(pow(col.rgb, vec3(2.2)), col.a); // pow requires a vec3 exponent here
}

Computes and buffers will be covered in a separate page. Unlike with composites, they differ greatly in functionality, with buffers being significantly more powerful.

Uniforms

A list of available uniforms is printed to the console on startup in the form of the structs they are stored in. As an example of how to access these values, cameraPosition is now the pos member of the CameraData struct, and to access it, we would use ap.camera.pos. The same pattern can be applied to any struct.
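For example (a minimal sketch, assuming the CameraData struct appears in your startup printout as described):

```glsl
// Old style: uniform vec3 cameraPosition;
// New style: read the member off the ap.camera struct.
vec3 camPos = ap.camera.pos;
```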

Shadows

Aperture makes use of cascaded shadow maps, which means things are a bit more complicated.

Setting Up the Shadow Pass

The shadow pass is created as an object shader with Usage.SHADOW.

You do not need to declare the shadow map, as it is automatically defined. If you want any shadowcolor-style buffers, declare them as an ArrayTexture.

Sampling the Shadow Map

Aperture defines two shadow maps.

  • shadowMap (equivalent to shadowtex0)
  • solidShadowMap (equivalent to shadowtex1)

Both of these should be declared as sampler2DArrays. To sample them, you pass the cascade as the z component of your coordinate, i.e.:

float shadow = step(shadowScreenPos.z, texture(shadowMap, vec3(shadowScreenPos.xy, cascade)).r);

Cascade 0 is the smallest, containing only a small area around the player, and cascade 3 is the largest.

Bear in mind that the shadow projection is also an array, because each cascade has a different projection. As such, to sample the shadow map, you must loop through each projection until your position is inside the frustum.

vec4 shadowClipPos;
int cascade;
for (cascade = 0; cascade < 4; cascade++) {
    shadowClipPos = shadowProjection[cascade] * shadowViewPos;
    if (clamp(shadowClipPos.xy, vec2(-1.0), vec2(1.0)) == shadowClipPos.xy) break;
}

Hardware Filtering

If you want hardware filtering, you can instead use a sampler2DArrayShadow and suffix filtered to the sampler names (e.g. shadowMapFiltered). Due to reasons known only to the developers over at Khronos, the cascade is still the z component of the coordinate you pass in, so it becomes

float shadow = texture(shadowMapFiltered, vec4(shadowScreenPos.xy, cascade, shadowScreenPos.z));
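Building on that, a simple softening filter might average several hardware-filtered taps. This is a hedged sketch, not an Aperture API; it assumes the `cascade` and `shadowScreenPos` variables from the snippets above:

```glsl
// Hypothetical 3x3 PCF average over the hardware-filtered shadow map.
vec2 texel = 1.0 / vec2(textureSize(shadowMapFiltered, 0).xy);
float shadow = 0.0;
for (int x = -1; x <= 1; x++) {
    for (int y = -1; y <= 1; y++) {
        shadow += texture(shadowMapFiltered,
            vec4(shadowScreenPos.xy + vec2(x, y) * texel, cascade, shadowScreenPos.z));
    }
}
shadow /= 9.0;
```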