collision detection – How to transform CollisionPolygon2D according to the Vertex Shader applied to a Sprite?

Consider this:

* KinematicBody2D
  * Sprite
  * CollisionPolygon2D

And a vertex shader:

shader_type canvas_item;

void vertex() {
    VERTEX.y += sin(2.0 * TIME) * 10.0;
}

When the vertex shader is assigned to the CanvasItem.material of both the Sprite and the CollisionPolygon2D, they move at the same time.


But if the shader is modified like this:

void vertex() {
  VERTEX.y += sin(UV.y * 2.0 * TIME) * 10.0;
}

It produces a different result: the Sprite and the CollisionPolygon2D no longer deform in the same way.


How can the CollisionPolygon2D be transformed according to the vertex shader applied to the Sprite?

unity – How to create a standard PBR material with Shader Graph?

I want to create a standard PBR material that is editable with Shader Graph, but when I install the package and go to the Create -> Shader menu, I only have two graph options: Empty Shader Graph and Sub Graph. Isn't there a shader graph that starts me off with a basic PBR material with color and reflections? I'm using Unity 2020 and the default render pipeline. Do I need to switch to URP or HDRP?

LibGDX Shader that converts sprite to single colour with blurred edges

I made a shader which turns all the colours of the texture to black but I also want it to be blurred at the edges where the opaque sprite turns transparent.

Original texture with desired output.


#ifdef GL_ES
    precision mediump float;
#endif

varying vec4 v_color;
varying vec2 v_texCoords;

uniform sampler2D u_texture;

void main() {
  vec4 c = v_color * texture2D(u_texture, v_texCoords);
  gl_FragColor = vec4(0,0,0, c.a);
}

A simple gaussian blur with good performance is what I want.

This shadertoy shader looks ideal: https://www.shadertoy.com/view/Xltfzj

I managed to convert the above-mentioned shadertoy to GLSL, which works fine, but I don't know how to make it turn the texture black first, before blurring.

#ifdef GL_ES
    #define PRECISION mediump
    precision PRECISION float;
    precision PRECISION int;
#endif

varying vec4 v_color;
varying vec2 v_texCoords;

uniform sampler2D u_texture;
uniform vec2 resolution;

void main() {
    float Pi = 6.28318530718; // Pi*2
    
    // GAUSSIAN BLUR SETTINGS {{{
    float Directions = 16.0; // BLUR DIRECTIONS (Default 16.0 - More is better but slower)
    float Quality = 4.0; // BLUR QUALITY (Default 4.0 - More is better but slower)
    float Size = 8.0; // BLUR SIZE (Radius)
    // GAUSSIAN BLUR SETTINGS }}}
   
    vec2 Radius = Size / resolution.xy;
    
    // Normalized pixel coordinates (from 0 to 1)
    vec2 uv = v_texCoords;
    // Pixel colour
    vec4 Color = texture2D(u_texture, uv);
    
    // Blur calculations
    for( float d=0.0; d<Pi; d+=Pi/Directions) {
        for(float i=1.0/Quality; i<=1.0; i+=1.0/Quality) {
            Color += texture2D( u_texture, uv+vec2(cos(d),sin(d))*Radius*i);        
        }
    }
    
    // Output to screen
    Color /= Quality * Directions - 15.0;
    gl_FragColor =  Color;
}
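For clarity, this is the kind of change I have in mind (an untested sketch of my own, not working code from anywhere): since every pixel should end up black anyway, it seems like only the alpha channel needs to be blurred, and the colour can be forced to black at the end.

#ifdef GL_ES
    precision mediump float;
#endif

varying vec4 v_color;
varying vec2 v_texCoords;

uniform sampler2D u_texture;
uniform vec2 resolution;

void main() {
    float Pi = 6.28318530718; // Pi*2
    float Directions = 16.0;
    float Quality = 4.0;
    float Size = 8.0;
    vec2 Radius = Size / resolution.xy;
    vec2 uv = v_texCoords;

    // Accumulate only the alpha channel; the colour is discarded anyway.
    float alpha = texture2D(u_texture, uv).a;
    for (float d = 0.0; d < Pi; d += Pi / Directions) {
        for (float i = 1.0 / Quality; i <= 1.0; i += 1.0 / Quality) {
            alpha += texture2D(u_texture, uv + vec2(cos(d), sin(d)) * Radius * i).a;
        }
    }
    alpha /= Quality * Directions - 15.0; // same normalisation as above

    // Single colour (black) with the blurred alpha as the edge falloff.
    gl_FragColor = vec4(0.0, 0.0, 0.0, alpha * v_color.a);
}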

LibGDX Shader that converts image to single colour with blur

I made a shader which turns all the colours of the texture to black but I also want it to be blurred.

#ifdef GL_ES
    precision mediump float;
#endif

varying vec4 v_color;
varying vec2 v_texCoords;

uniform sampler2D u_texture;

void main() {
  vec4 c = v_color * texture2D(u_texture, v_texCoords);
  gl_FragColor = vec4(0,0,0, c.a);
}

A simple gaussian blur with good performance is what I want. I tried various blur shaders but haven’t managed to get any to work well.

This shadertoy shader looks ideal, but I am not sure how to convert it or how to make it output only one colour: https://www.shadertoy.com/view/Xltfzj

EDIT: I managed to convert the above-mentioned shadertoy, but the output doesn't really seem to be blurred, just almost transparent. I'm not sure whether I converted it correctly though:

#ifdef GL_ES
    #define PRECISION mediump
    precision PRECISION float;
    precision PRECISION int;
#endif

varying vec4 v_color;
varying vec2 v_texCoords;

uniform sampler2D u_texture;
uniform vec2 resolution;

void main() {
    float Pi = 6.28318530718; // Pi*2
    
    // GAUSSIAN BLUR SETTINGS {{{
    float Directions = 16.0; // BLUR DIRECTIONS (Default 16.0 - More is better but slower)
    float Quality = 4.0; // BLUR QUALITY (Default 4.0 - More is better but slower)
    float Size = 8.0; // BLUR SIZE (Radius)
    // GAUSSIAN BLUR SETTINGS }}}
   
    vec2 Radius = Size / resolution.xy;
    
    // Normalised pixel coordinates (from 0 to 1)
    vec2 uv = gl_FragCoord.xy / resolution.xy;
    // Pixel colour
    vec4 Color = texture2D(u_texture, v_texCoords);
    
    // Blur calculations
    for( float d=0.0; d<Pi; d+=Pi/Directions) {
        for(float i=1.0/Quality; i<=1.0; i+=1.0/Quality) {
            Color += texture2D( u_texture, uv+vec2(cos(d),sin(d))*Radius*i);        
        }
    }
    
    // // Output to screen
    Color /= Quality * Directions - 15.0;
    gl_FragColor =  Color;
}

Original texture with desired output.

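One thing I noticed while re-reading my port (I don't know whether it matters): the shadertoy uses the same normalised coordinate for the centre sample and for the offset samples, whereas here the centre sample uses v_texCoords but the offsets are built around gl_FragCoord.xy / resolution.xy, which is a different coordinate space when the sprite does not cover the whole screen. If that is the problem, the loop would presumably have to stay in texture space throughout. A sketch of the variant I mean, with the same declarations and settings as above:

    // Keep everything in the sprite's texture space instead of mixing in screen coordinates.
    vec2 uv = v_texCoords;
    vec4 Color = texture2D(u_texture, uv);

    for (float d = 0.0; d < Pi; d += Pi / Directions) {
        for (float i = 1.0 / Quality; i <= 1.0; i += 1.0 / Quality) {
            Color += texture2D(u_texture, uv + vec2(cos(d), sin(d)) * Radius * i);
        }
    }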

Fragment shader not working as expected

I am porting the following shader to Godot. The results do not seem correct, though: apart from the wrong colours, there seems to be no change over time. What is the problem with it?

shader_type canvas_item;

const float DELTA = 0.00001;
const float TAU = 6.28318530718;
const float NOISE_TEXTURE_SIZE = 256.0;
const float NOISE_TEXTURE_PIXEL_COUNT = (NOISE_TEXTURE_SIZE * NOISE_TEXTURE_SIZE);

uniform vec4 vColor: color;

// MAIN CONTROLLER UNIFORMS
uniform float intensity = 1.;       // overall effect intensity, 0-1 (no upper limit)
uniform float rngSeed = .0;         // seed offset (changes configuration around)
uniform sampler2D noiseTexture;     // noise texture sampler

//TUNING
uniform float lineSpeed = .01;          // line speed
uniform float lineDrift = .1;           // horizontal line drifting
uniform float lineResolution = 1.;      // line resolution
uniform float lineVertShift = .0;       // wave phase offset of horizontal lines
uniform float lineShift = .004;         // horizontal shift
uniform float jumbleness = .2;          // amount of "block" glitchiness
uniform float jumbleResolution = .2;    // resolution of blocks
uniform float jumbleShift = .15;        // texture shift by blocks  
uniform float jumbleSpeed = 1.;         // speed of block variation
uniform float dispersion = .0025;       // color channel horizontal dispersion
uniform float channelShift = .004;      // horizontal RGB shift
uniform float noiseLevel = .5;          // level of noise
uniform float shakiness = .5;           // horizontal shakiness
//

float rand(vec2 co){
    return fract(sin(dot(co.xy ,vec2(12.9898,78.233))) * 43758.5453);
}

vec4 extractRed(vec4 col){
    return vec4(col.r, 0., 0., col.a);
}

vec4 extractGreen(vec4 col){
    return vec4(0., col.g, 0., col.a);
}

vec4 extractBlue(vec4 col){
    return vec4(0., 0., col.b, col.a);
}

// Replacement for the mirror address mode, hopefully nobody needs filtering.
vec2 mirror(vec2 v) {
    return abs((fract((v * 0.5) + 0.5) * 2.0) - 1.0);
}

vec2 downsample(vec2 v, vec2 res) {    
    // Division by zero protected by uniform getters.
    return floor(v * res) / res;
}

// Fetches four random values from an RGBA noise texture
vec4 whiteNoise(vec2 coord, vec2 texelOffset, vec2 resolution) {
    vec2 offset = downsample(vec2(rngSeed * NOISE_TEXTURE_SIZE, rngSeed) + texelOffset, vec2(NOISE_TEXTURE_SIZE));
    vec2 ratio = resolution / vec2(NOISE_TEXTURE_SIZE);
    return texture(noiseTexture, (coord * ratio) + offset); 
}

// Fetch noise texture texel based on single offset in the (0-1) range
vec4 random(float dataOffset) {
    vec2 halfTexelSize = vec2((0.5 / NOISE_TEXTURE_SIZE));
    float offset = rngSeed + dataOffset;    
    return texture(noiseTexture, vec2(offset * NOISE_TEXTURE_SIZE, offset) + halfTexelSize); 
}

// Jumble coord generation
vec2 jumble(vec2 coord, float time, vec2 resolution){
    // Static branch.
    if ((jumbleShift * jumbleness * jumbleResolution) < DELTA) {
        return vec2(0.0);
    }
        
    vec2 gridCoords = (coord * jumbleResolution) / (NOISE_TEXTURE_SIZE * 0.0245);
    float jumbleTime = mod(floor(time * 0.02 * jumbleSpeed), NOISE_TEXTURE_PIXEL_COUNT);
    vec2 offset = random(jumbleTime / NOISE_TEXTURE_PIXEL_COUNT).ga * jumbleResolution;
    vec4 cellRandomValues = whiteNoise(gridCoords, vec2(jumbleResolution * -10.0) + offset, resolution);
    return (cellRandomValues.ra - 0.5) * jumbleShift * floor(min(0.99999, cellRandomValues.b) + jumbleness);
}

// Horizontal line offset generation
float lineOffset(vec2 coord, vec2 uv, float time, vec2 resolution) {
    // Static branch.
    if (lineShift < DELTA) {
        return 0.0;
    }
    
    // Wave offsets
    vec2 waveHeights = vec2(50.0 * lineResolution, 25.0 * lineResolution);    
    vec4 lineRandom = whiteNoise(downsample(uv.yy, waveHeights), vec2(0.0), resolution);
    float driftTime = uv.y * resolution.y * 2.778;
    
    // XY: big waves, ZW: drift waves
    vec4 waveTimes = (vec4(downsample(lineRandom.ra * TAU, waveHeights) * 80.0, driftTime + 2.0, (driftTime * 0.1) + 1.0) + (time * lineSpeed)) + (lineVertShift * TAU);
    vec4 waveLineOffsets = vec4(sin(waveTimes.x), cos(waveTimes.y), sin(waveTimes.z), cos(waveTimes.w));
    waveLineOffsets.xy *= ((whiteNoise(waveTimes.xy, vec2(0.0), resolution).gb - 0.5) * shakiness) + 1.0;
    waveLineOffsets.zw *= lineDrift;
    return dot(waveLineOffsets, vec4(1.0));
}

void fragment()
{
    vec3 randomValues = vec3(rand(vec2(TIME, 0.0)), rand(vec2(TIME, 0.0)), rand(vec2(TIME, 0.0)));
    vec2 resolution = 1.0 / SCREEN_PIXEL_SIZE;
    vec2 uv = FRAGCOORD.xy / (1.0 / SCREEN_PIXEL_SIZE).xy;
    
    // Sample random high-frequency noise
    vec4 randomHiFreq = whiteNoise(uv, randomValues.xy, resolution);
    
    // Apply line offsets
    vec2 offsetCoords = uv;
    offsetCoords.x += ((((2.0 * randomValues.z) - 1.0) * shakiness * lineSpeed) + lineOffset(offsetCoords, uv, TIME, resolution)) * lineShift * intensity;
    
    // Apply jumbles
    offsetCoords += jumble(offsetCoords, TIME, resolution) * intensity * intensity * 0.25;
        
    // Channel split
    vec2 shiftFactors = (channelShift + (randomHiFreq.rg * dispersion)) * intensity;
    vec4 outColour;
    
    // Static branch.
    if (((channelShift + dispersion) * intensity) < DELTA) {
        outColour = texture(SCREEN_TEXTURE, mirror(offsetCoords));
    } else {
        outColour = extractRed(texture(SCREEN_TEXTURE, mirror(offsetCoords + vec2(shiftFactors.r, 0.0)))) + extractBlue(texture(SCREEN_TEXTURE, mirror(offsetCoords + vec2(-shiftFactors.g, 0.0)))) + extractGreen(texture(SCREEN_TEXTURE, mirror(offsetCoords)));
    }
    
    // Add noise    
    outColour.rgb *= (vec3(.55, .5, .4) * randomHiFreq.gab * intensity * noiseLevel) + 1.0;
        
    // COLOR = vColor * outColour;
    COLOR = outColour;
}


Hardcode per-instance properties into the shader in unity

I am using one of the packages from the Asset Store and using its Polyline feature for wire rendering. There are thousands of wires, and they cost a lot of FPS and batches; the reason is the lack of support for static batching. The author of the package suggested a workaround:

if you don’t use polylines for anything other than that, and all your
shaders use the same properties, you could hardcode all per-instance
properties into the shader

[screenshot: the material's per-instance properties]

Due to a lack of shader knowledge (I am currently building up that knowledge, and it will take time), I am unable to hardcode the above properties. I tried, and this is my current attempt:

#define int _ScaleMode = 1
#define half4 _Color = (0,0,0,0)    
#define float4 _PointStart = (0,0,0,0)   
#define float4 _PointEnd = (0,0,0,0)   
#define half _Thickness = 0.5
#define int _ThicknessSpace = 1
#define half _DashSize = 1
#define half _DashOffset =1
#define int _Alignment =1

How do I correctly hardcode the above values?

How to implement dither in pixel shader?

I’ve seen this talk about INSIDE’s rendering (thanks to DMGregory).
It says that we should dither everything.
What I wonder, however, is how to dither in a simple fragment shader.

In a pixel shader I don't have access to neighbouring pixels, so how do I dither, for example, the vertex colors?

Imagine a single quad with a gradient going from white to black.
I can multiply the Vertex Colour with the noise texture, but this will only darken/brighten the actual color.
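To make it concrete, here is a rough GLSL-style sketch of what I imagine (my own guess, not taken from the talk): each fragment compares its gradient value against a per-pixel threshold derived only from its own screen position, so no neighbouring pixels are needed.

#ifdef GL_ES
    precision mediump float;
#endif

varying vec4 v_color; // the white-to-black gradient from the vertex colours

// Per-pixel pseudo-random threshold from the screen position;
// an ordered (Bayer) matrix or a blue-noise texture lookup could be used here instead.
float hash(vec2 p) {
    return fract(sin(dot(p, vec2(12.9898, 78.233))) * 43758.5453);
}

void main() {
    float value = v_color.r;                    // brightness of the gradient at this fragment
    float threshold = hash(gl_FragCoord.xy);    // depends only on this pixel, no neighbours
    float dithered = step(threshold, value);    // 0 or 1; on average it reproduces the gradient
    gl_FragColor = vec4(vec3(dithered), 1.0);
}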

mathematics – How to project a Matcap correctly in Amplify Shader Editor

Unity's Amplify Shader Editor comes with an example matcap node setup; however, the matcap projection in this example is incorrect.
As you can see, the texture becomes distorted at the edges of the screen, which shouldn't happen with a correct matcap:

[image: the distorted matcap result]

[image: how the matcap should look]

The second image shows how a matcap should look, without major distortions.

Does any shader expert know the right way to project a matcap using the nodes? I've done my research and there is absolutely no information about this on the internet, so any response would add to this shader's documentation.
I really need the matcap to work with this shader.
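In case it helps to pin down the math behind the nodes, this is my understanding of the two projections, written as GLSL-style pseudocode (viewSpaceNormal and viewSpacePosition are placeholder names, not Amplify nodes): the naive version only uses the view-space normal, while the classic sphere-map formula also takes the per-fragment view direction into account, which as far as I understand is what reduces the distortion at the screen edges.

// Naive matcap UV - skews near the screen edges under a perspective camera:
// vec2 matcapUV = normalize(viewSpaceNormal).xy * 0.5 + 0.5;

// Classic sphere-map formula, using the view direction as well (placeholder variable names):
vec3 n = normalize(viewSpaceNormal);       // normal transformed into view space
vec3 e = normalize(viewSpacePosition);     // direction from the camera to the fragment, in view space
vec3 r = reflect(e, n);                    // view-space reflection vector
float m = 2.0 * sqrt(r.x * r.x + r.y * r.y + (r.z + 1.0) * (r.z + 1.0));
vec2 matcapUV = r.xy / m + 0.5;            // UV used to sample the matcap texture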

fade/transparency for object by shader graph?

I'm trying to make a simple fade/transparency effect for my object using Shader Graph. I got a result with dithering, but I don't like it; I want something smooth, without dots. How do I get that?

directx – Specifying a root signature in the HLSL code of a DXR shader

I've noticed that I cannot specify a root signature in the HLSL code of a DXR shader. For example, if I have a ray generation shader with the following declaration:

[rootsignature(
    "RootFlags(LOCAL_ROOT_SIGNATURE),"
    "DescriptorTable("
    "UAV(u0, numDescriptors = 1),"
    "SRV(t0, numDescriptors = 1))")]
[shader("raygeneration")]
void RayGen()
{}

CreateRootSignature yields the error message

No root signature was found in the dxil library provided to CreateRootSignature. ( STATE_CREATION ERROR #696: CREATE_ROOT_SIGNATURE_BLOB_NOT_FOUND).

I’ve noticed that even when I add a typo (for example, write roosignature instead of rootsignature), the compiler doesn’t complain about this typo. So, it seems like the whole attribute declaration is simply ignored.

If I change the code to a simple rasterization shader, everything works as expected.

So, is the specification of a root signature in the HLSL code of a DXR shader not supported?