What is a shader?
- Program written in GLSL
- Sent to the GPU
- Position each vertex of a geometry
- Color each visible pixel of that geometry
- Actually, "pixel" isn't accurate: pixels belong to the screen, and each point of the render doesn't necessarily match a screen pixel, so we use the term "fragment"
- We send a lot of data to the shader (vertex coordinates, mesh transformations, information about the camera, colors, textures, lights...), and the GPU processes all of it following the shader's instructions
- Once the vertices are placed by the vertex shader, the GPU knows which fragments of the geometry are visible and can proceed to the fragment shader
Vertex Shader
- Position each vertex of a geometry
- The same vertex shader runs for every vertex; data that differs per vertex, like the vertex position, is called an attribute
- Data that is the same for every vertex, like the position of the mesh, is called a uniform
- We can also send values from the vertex shader to the fragment shader; these are called varyings, and their values get interpolated between the vertices (see the sketches below)
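- To make these terms concrete, here is a minimal vertex shader sketch in the RawShaderMaterial style (the matrix names follow three.js conventions; the aRandom attribute is a hypothetical custom one, wired up for real later in this section):
// uniforms: the same value for every vertex
uniform mat4 projectionMatrix;
uniform mat4 viewMatrix;
uniform mat4 modelMatrix;
// attributes: a different value per vertex
attribute vec3 position;
attribute float aRandom; // hypothetical custom attribute
// varying: handed on to the fragment shader
varying float vRandom;
void main() {
    gl_Position = projectionMatrix * viewMatrix * modelMatrix * vec4(position, 1.0);
    vRandom = aRandom; // interpolated between vertices on its way to the fragments
}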
Fragment Shader
- Color each visible pixel of the geometry; a minimal matching fragment shader is sketched below
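- A sketch of the fragment shader side (vRandom comes from the vertex sketch above; the color mapping is arbitrary):
precision mediump float;
// interpolated value received from the vertex shader
varying float vRandom;
void main() {
    // gl_FragColor is the built-in output color (r, g, b, a)
    gl_FragColor = vec4(vRandom, 0.0, 0.0, 1.0);
}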
Set up a basic scene
<script setup>
import * as THREE from 'three'
import {OrbitControls} from 'three/addons/controls/OrbitControls.js'
import * as dat from 'dat.gui'
/**
* scene
*/
const scene = new THREE.Scene()
/**
* test mesh
*/
const geometry = new THREE.PlaneGeometry(1, 1, 32, 32)
const material = new THREE.MeshBasicMaterial()
const mesh = new THREE.Mesh(geometry, material)
scene.add(mesh)
/**
* light
*/
const directionalLight = new THREE.DirectionalLight('#ffffff', 4)
directionalLight.position.set(3.5, 2, -1.25)
scene.add(directionalLight)
/**
* camera
*/
const camera = new THREE.PerspectiveCamera(
    35,
    window.innerWidth / window.innerHeight,
    0.1,
    100
)
camera.position.set(6, 4, 8)
/**
* renderer
*/
const renderer = new THREE.WebGLRenderer()
renderer.setSize(window.innerWidth, window.innerHeight)
renderer.setPixelRatio(Math.min(window.devicePixelRatio, 2)) // also set the pixel ratio up front, not only on resize
document.body.appendChild(renderer.domElement)
window.addEventListener('resize', () => {
    camera.aspect = window.innerWidth / window.innerHeight
    camera.updateProjectionMatrix()
    renderer.setSize(window.innerWidth, window.innerHeight)
    renderer.setPixelRatio(Math.min(window.devicePixelRatio, 2))
})
/**
* axesHelper
*/
const axesHelper = new THREE.AxesHelper(5)
scene.add(axesHelper)
/**
* control
*/
const controls = new OrbitControls(camera, renderer.domElement)
controls.enableDamping = true
/**
* render
*/
const tick = () => {
    controls.update()
    requestAnimationFrame(tick)
    renderer.render(scene, camera)
}
tick()
/**
* gui
*/
const gui = new dat.GUI()
</script>
Create our first shaders with RawShaderMaterial
- Replace the MeshBasicMaterial with RawShaderMaterial
- Use the vertexShader and fragmentShader properties to provide the shaders
/**
 * test mesh
 */
const geometry = new THREE.PlaneGeometry(1, 1, 32, 32)
const material = new THREE.RawShaderMaterial({
    vertexShader: `
        uniform mat4 projectionMatrix;
        uniform mat4 viewMatrix;
        uniform mat4 modelMatrix;

        attribute vec3 position;

        void main() {
            gl_Position = projectionMatrix * viewMatrix * modelMatrix * vec4(position, 1.0);
        }
    `,
    fragmentShader: `
        precision mediump float;

        void main() {
            gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0);
        }
    `
})
const mesh = new THREE.Mesh(geometry, material)
scene.add(mesh)
- Move the shader code into separate files and import them. import is module syntax, normally used to load module files, but here we only need the imported content as a string, so the bundler has to know how to parse .glsl files
- Useful GLSL documentation: Shaderific, Khronos, The Book of Shaders
import testVertexShader from './shaders/test/vertex.glsl'
import testFragmentShader from './shaders/test/fragment.glsl'
/**
 * test mesh
 */
...
const material = new THREE.RawShaderMaterial({
    vertexShader: `
    `,
    fragmentShader: `
    `
})
...
- We can use vite-plugin-glsl or vite-plugin-glslify. glslify is kind of the standard, but vite-plugin-glsl is easier to use and well maintained, so: npm i vite-plugin-glsl
// vite.config.js
...
import glsl from 'vite-plugin-glsl'
...
export default defineConfig({
    plugins: [
        vue(),
        glsl(),
    ],
    ...
})
import testVertexShader from './shaders/test/vertex.glsl'
import testFragmentShader from './shaders/test/fragment.glsl'
/**
 * test mesh
 */
...
const material = new THREE.RawShaderMaterial({
    vertexShader: testVertexShader,
    fragmentShader: testFragmentShader,
    // wireframe: true, // some properties still work, but things like color have to be handled inside the shaders
})
...
- Some explanations of vertex.glsl
- The clip space looks like a box: after the perspective divide by w, visible coordinates run from -1 to +1 on each axis
// a uniform is an input value shared by all threads: read-only, identical for every vertex
uniform mat4 projectionMatrix; // projection matrix, transforms into clip space
uniform mat4 viewMatrix;       // view matrix, the transformation relative to the camera
uniform mat4 modelMatrix;      // model matrix, the transformation of the mesh (position, rotation, scale)
// attribute of the BufferGeometry: each vertex has its own value, here its coordinates
attribute vec3 position;
// called automatically, returns nothing
void main() {
    // gl_Position is a built-in variable: a vec4 holding the vertex position in clip space
    // besides x, y and z it carries a fourth component w, used for the perspective
    gl_Position = projectionMatrix * viewMatrix * modelMatrix * vec4(position, 1.0);
}
- Separate each matrix part: splitting up vertex.glsl like this gives us finer control
uniform mat4 projectionMatrix; // projection matrix, transforms into clip space
uniform mat4 viewMatrix;       // view matrix, the transformation relative to the camera
uniform mat4 modelMatrix;      // model matrix, the transformation of the mesh (position, rotation, scale)
// attribute of the BufferGeometry: each vertex has its own coordinates
attribute vec3 position;
// called automatically, returns nothing
void main() {
    // gl_Position = projectionMatrix * viewMatrix * modelMatrix * vec4(position, 1.0);
    // apply the model matrix to get the position in world space
    vec4 modelPosition = modelMatrix * vec4(position, 1.0);
    modelPosition.z += sin(modelPosition.x * 10.0) * 0.1; // tweak the model position
    // view position
    vec4 viewPosition = viewMatrix * modelPosition;
    // projected position
    vec4 projectionPosition = projectionMatrix * viewPosition;
    gl_Position = projectionPosition;
}
- Add a custom attribute and send it to the vertex shader
/**
 * test mesh
 */
...
const count = geometry.attributes.position.count // number of vertices
const randoms = new Float32Array(count)          // array of random values
for (let i = 0; i < count; i++) {
    randoms[i] = Math.random()
}
// add the attribute: one random value per vertex
geometry.setAttribute('aRandom', new THREE.BufferAttribute(randoms, 1))
...
...
// attributes of the BufferGeometry: every vertex has its own value
attribute vec3 position;
attribute float aRandom;
// called automatically, returns nothing
void main() {
    ...
    // apply the model matrix to get the position in world space
    vec4 modelPosition = modelMatrix * vec4(position, 1.0);
    // modelPosition.z += sin(modelPosition.x * 10.0) * 0.1;
    modelPosition.z += aRandom * 0.1;
    ...
}
- Send data from the vertex shader to the fragment shader. Attributes can't be used in the fragment shader, so we use the varyings mentioned earlier
...
varying float vRandom;
// called automatically, returns nothing
void main() {
    ...
    vRandom = aRandom;
}
...
varying float vRandom;
void main() {
    gl_FragColor = vec4(0.0, vRandom, vRandom, 1.0);
}
- Some explanations of fragment.glsl
// precision
// highp can cause performance issues and doesn't work on all devices
// lowp can be inaccurate for lack of precision
// mediump is the usual choice; with RawShaderMaterial the declaration is mandatory
precision mediump float;
void main() {
    // built-in variable: (r, g, b, a)
    // changing the alpha here alone does nothing; the transparent property of the RawShaderMaterial must be set too
    gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0);
}

/**
 * test mesh
 */
...
const material = new THREE.RawShaderMaterial({
    ...
    transparent: true,
})
...
Uniforms
- Think of the CPU as a pipe: tasks line up and pass through one at a time (serially), and bigger tasks take longer than others. To raise throughput, modern computers have several processors; these pipes are called threads
- Video and games need far more processing power than ordinary programs: even an old 800x600 screen means 480,000 pixels to process every frame, a big problem for the CPU. Hence the GPU (Graphics Processing Unit), which uses a large pile of small microprocessors working in parallel
- When the GPU processes a task in parallel, each thread only produces data for its own part of the final image, and the threads cannot exchange data with one another. We can feed input from the CPU to every thread, but that input has to be identical across threads and read-only; this kind of input is a uniform
/**
 * test mesh
 */
...
const material = new THREE.RawShaderMaterial({
    ...
    uniforms: {
        uFrequency: { value: new THREE.Vector2(10, 5) }
    }
})
// a uniform is an input value shared by all threads, read-only
...
uniform vec2 uFrequency;
// called automatically, returns nothing
void main() {
    ...
    // apply the model matrix to get the position in world space
    vec4 modelPosition = modelMatrix * vec4(position, 1.0);
    modelPosition.z += sin(modelPosition.x * uFrequency.x) * 0.1;
    modelPosition.z += sin(modelPosition.y * uFrequency.y) * 0.1;
    ...
}
- Add gui controls to watch the values change; as we drag them, the uniform updates automatically
/**
 * gui
 */
const gui = new dat.GUI()
gui.add(material.uniforms.uFrequency.value, 'x').min(0).max(20).step(0.01).name('frequencyX')
gui.add(material.uniforms.uFrequency.value, 'y').min(0).max(20).step(0.01).name('frequencyY')
- Since uniforms can be updated dynamically, let's build an animation with one
/**
 * test mesh
 */
...
const material = new THREE.RawShaderMaterial({
    ...
    uniforms: {
        uFrequency: { value: new THREE.Vector2(10, 5) },
        uTime: { value: 0 }
    }
})
...
/**
 * render
 */
const clock = new THREE.Clock()
const tick = () => {
    const elapsedTime = clock.getElapsedTime()
    // update material
    material.uniforms.uTime.value = elapsedTime
    controls.update()
    requestAnimationFrame(tick)
    renderer.render(scene, camera)
}
tick()
...
uniform float uTime;
// called automatically, returns nothing
void main() {
    ...
    // apply the model matrix to get the position in world space
    vec4 modelPosition = modelMatrix * vec4(position, 1.0);
    modelPosition.z += sin(modelPosition.x * uFrequency.x - uTime) * 0.1;
    modelPosition.z += sin(modelPosition.y * uFrequency.y - uTime) * 0.1;
    ...
}
- Besides that, a uniform can also be sent to the fragment shader; let's use one to change the color (a gui control for it is sketched after the code)
...
const material = new THREE.RawShaderMaterial({
    ...
    uniforms: {
        uFrequency: { value: new THREE.Vector2(10, 5) },
        uTime: { value: 0 },
        uColor: { value: new THREE.Color('cyan') }
    }
})
const mesh = new THREE.Mesh(geometry, material)
mesh.scale.y = 2 / 3
scene.add(mesh)
...
uniform vec3 uColor;
void main() {
    // gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0);
    gl_FragColor = vec4(uColor, 1.0);
    ...
}
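- As mentioned above, we can also hook uColor up to the gui. A minimal sketch, assuming dat.GUI's addColor: THREE.Color stores channels in the 0..1 range, which the color picker doesn't map cleanly, so we keep a plain hex string in a hypothetical params object and copy it over on change:
const params = { color: '#00ffff' } // hypothetical holder for the picker value
gui.addColor(params, 'color').onChange(() => {
    // THREE.Color.set() accepts a CSS color string
    material.uniforms.uColor.value.set(params.color)
})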
- Next, let's change the texture. We need to pick up the texture's pixels in the fragment shader, using texture2D()
// shaders.vue
/**
 * texture
 */
const textureLoader = new THREE.TextureLoader()
const flagTexture = textureLoader.load('../public/imgs/sponge.jpg')
/**
 * test mesh
 */
...
const material = new THREE.RawShaderMaterial({
    ...
    uniforms: {
        uFrequency: { value: new THREE.Vector2(10, 5) },
        uTime: { value: 0 },
        uColor: { value: new THREE.Color('cyan') },
        uTexture: { value: flagTexture }
    }
})
...
// vertex.glsl
...
attribute vec2 uv; // uv attribute of the geometry
varying vec2 vUv;  // pass the data from the vertex shader to the fragment shader
void main() {
    ...
    vUv = uv;
}
// fragment.glsl
...
varying vec2 vUv;
...
uniform sampler2D uTexture; // sampler type for textures
void main() {
    ...
    vec4 textureColor = texture2D(uTexture, vUv); // sample the texture at the uv coordinates
    gl_FragColor = textureColor;
}
- Color variation: brighten the fragments whose vertices are high, i.e. closer to the camera
// vertex.glsl (these changes pass the elevation on to the fragment shader)
...
varying float vElevation;
void main() {
    ...
    vec4 modelPosition = modelMatrix * vec4(position, 1.0);
    float elevation = sin(modelPosition.x * uFrequency.x - uTime) * 0.1;
    elevation += sin(modelPosition.y * uFrequency.y - uTime) * 0.1;
    modelPosition.z = elevation;
    ...
    vElevation = elevation;
}
// fragment.glsl
...
varying float vElevation;
void main() {
    vec4 textureColor = texture2D(uTexture, vUv); // sample the texture
    textureColor.rg *= vElevation * 2.0 + 0.8;
    gl_FragColor = textureColor;
}
We built the shaders above with RawShaderMaterial; now that we know how it works, let's switch to the more concise ShaderMaterial
- Replace the material:
const material = new THREE.ShaderMaterial({
    ...
})
Looking at the errors, they say we are redefining these properties, meaning they already exist: ShaderMaterial injects them for us. So we delete our own declarations, giving the final vertex shader below (a note on what gets injected follows the code):
// vertex.glsl
uniform vec2 uFrequency;
uniform float uTime;
attribute float aRandom;
varying vec2 vUv;
varying float vElevation;
void main() {
    vec4 modelPosition = modelMatrix * vec4(position, 1.0);
    float elevation = sin(modelPosition.x * uFrequency.x - uTime) * 0.1;
    elevation += sin(modelPosition.y * uFrequency.y - uTime) * 0.1;
    modelPosition.z = elevation;
    vec4 viewPosition = viewMatrix * modelPosition;
    vec4 projectionPosition = projectionMatrix * viewPosition;
    gl_Position = projectionPosition;
    vUv = uv;
    vElevation = elevation;
}
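- For reference, this works because ShaderMaterial prepends built-in declarations to our code. According to the three.js WebGLProgram documentation, the vertex shader receives (among others) the following, which is why redeclaring them throws:
// injected automatically by ShaderMaterial:
// uniform mat4 modelMatrix;
// uniform mat4 modelViewMatrix;
// uniform mat4 projectionMatrix;
// uniform mat4 viewMatrix;
// uniform mat3 normalMatrix;
// uniform vec3 cameraPosition;
// attribute vec3 position;
// attribute vec3 normal;
// attribute vec2 uv;
- The precision declaration is also added automatically, so the fragment shader's precision line can be removed as well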