A shader material is rendered with a custom shader. It requires a vertex shader and a fragment shader, written in GLSL (OpenGL Shading Language), which determine the position of a vertex and its color, respectively. Since this code runs on the GPU through WebGL, a `ShaderMaterial` is rendered properly only by `WebGLRenderer`. In this post, I'll explain how to use shaders in Three.js.
A `ShaderMaterial` can be defined by:
const material = new THREE.ShaderMaterial({
uniforms: {
time: { value: 1.0 },
resolution: { value: new THREE.Vector2() }
},
vertexShader: /* glsl */ `...`,
fragmentShader: /* glsl */ `...`,
})
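For example, a minimal working pair of shaders (a sketch) positions each vertex with the built-in matrices that Three.js injects into every shader, and colors every fragment red:
const material = new THREE.ShaderMaterial({
  vertexShader: /* glsl */ `
    void main() {
      // projectionMatrix, modelViewMatrix, and position are provided by Three.js
      gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
    }
  `,
  fragmentShader: /* glsl */ `
    void main() {
      gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0); // solid red
    }
  `
});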
The properties inside `uniforms` can be accessed in both the `vertexShader` and the `fragmentShader`, and they have the same value for every vertex. The GLSL variable types are float, vec2, vec3, vec4, and sampler2D, and their corresponding JavaScript types are `Number`, `THREE.Vector2`, `THREE.Vector3` (or `THREE.Color`), `THREE.Vector4`, and `THREE.Texture`.
GLSL | JavaScript |
---|---|
float | Number |
vec2 | THREE.Vector2 |
vec3 | THREE.Vector3 or THREE.Color |
vec4 | THREE.Vector4 |
sampler2D | THREE.Texture |
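As a concrete illustration of this mapping, a fragment shader can read uniforms declared on the JavaScript side; this is a sketch in which the uniform names color and opacity are arbitrary, and the default vertex shader of `ShaderMaterial` is used:
const material = new THREE.ShaderMaterial({
  uniforms: {
    color: { value: new THREE.Color(0x3399ff) }, // vec3 in GLSL
    opacity: { value: 0.8 }                      // float in GLSL
  },
  fragmentShader: /* glsl */ `
    uniform vec3 color;
    uniform float opacity;
    void main() {
      gl_FragColor = vec4(color, opacity);
    }
  `,
  transparent: true
});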
The `vertexShader` and `fragmentShader` of `ShaderMaterial` accept the shader code as plain text. The code can be written inside `<script type="x-shader/x-vertex">` or `<script type="x-shader/x-fragment">` tags in HTML, then read with `document.getElementById('vertex').textContent`.
<!-- index.html -->
<html>
<head>
<script id="vertex" type=“x-shader/x-vertex”>
...
</script>
<script id="fragment" type=“x-shader/x-fragment”>
...
</script>
</head>
<body>
</body>
</html>
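In JavaScript, the shader sources can then be read from the DOM and passed to the material. A minimal sketch, assuming the index.html above:
const vertexShader = document.getElementById('vertex').textContent;
const fragmentShader = document.getElementById('fragment').textContent;
const material = new THREE.ShaderMaterial({ vertexShader, fragmentShader });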
Otherwise, they can be declared as multi-line strings using backticks (`) in an external file, then imported with `import { vertex } from './shader/vertex.js'`:
// vertex.js
export const vertex = `
/* glsl */
...
`
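On the importing side, a minimal sketch (assuming a fragment.js that exports fragment in the same way):
// main.js
import { vertex } from './shader/vertex.js';
import { fragment } from './shader/fragment.js';
const material = new THREE.ShaderMaterial({ vertexShader: vertex, fragmentShader: fragment });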
In the following articles, I'll introduce the basics of GLSL and create the earth using `ShaderMaterial`.
Three.js provides the material attribute for a 3D object, which determines how the object reflects light and how it is rendered by a camera. The properties of a material include base color, metalness, roughness, and so on. Moreover, we can decorate the surface of a 3D object with texture maps. A texture map is a 2D image that describes a characteristic of the material with respect to the UV map of the object surface. Thus, texture maps help us make realistic objects.
In order to concentrate on the textures of the earth, fix the position of the earth object and let the sun revolve around the earth by modifying the lines below
scene.add(light);
function updateSystem(sec) {
moon.position.set(0.4*Math.cos(w_moon*sec), 0, -0.4*Math.sin(w_moon*sec));
earth_equator.position.set(3*Math.cos(w_orbit*sec), 0, -3*Math.sin(w_orbit*sec));
earth.rotateY(w_rotate);
}
into
sun.add(light);
function updateSystem(sec) {
moon.position.set(0.4*Math.cos(w_moon*sec), 0, -0.4*Math.sin(w_moon*sec));
sun.position.set(3*Math.cos(w_orbit*sec), 0, -3*Math.sin(w_orbit*sec));
earth.rotateY(w_rotate);
}
Besides, the light is now attached to the revolving sun, which is a more intuitive hierarchy. The result is:
In this article, I'll use `MeshPhongMaterial` to create a realistic earth. Texture maps for color, normal, bump, specular, etc. can be found here or here. Of course, you can find other texture maps by searching with the keyword "earth texture map". Examples of a color map and a normal map are:
The color of a normal map represents the normalized normal vector of the surface as RGB. Indeed,
\[R = (n_x + 1) / 2\] \[G = (n_y + 1) / 2\] \[B = (n_z + 1) / 2\]
Since a texture with a smooth surface has the normal vector \(n = (0, 0, 1)\), a normal map is generally dominated by the purple color `#7F7FFF`.
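As a quick check of the formula, assuming 8-bit color channels:
// the flat-surface normal (0, 0, 1) mapped to RGB
const n = [0, 0, 1];
const rgb = n.map(c => Math.floor((c + 1) / 2 * 255)); // [127, 127, 255]
// i.e. #7F7FFF, the typical base color of a normal map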
To use a downloaded texture map, Three.js provides the `TextureLoader` class, which loads a texture map from a local file or URL. If we call `load(url: String, onLoad: Function, onProgress: Function, onError: Function)` with callback functions, the texture map is applied to the object asynchronously. In the following, we apply a color map, a normal map, and a specular map to the earth object.
const material_earth = new THREE.MeshPhongMaterial({specular: new THREE.Color(0xFFFFFF), shininess: 3});
const earth = new THREE.Mesh(geometry_sphere, material_earth);
earth.scale.set(0.2, 0.2, 0.2);
const loader = new THREE.TextureLoader();
loader.load('./assets/2k_earth_daymap.jpg', (texture)=>{
material_earth.map = texture;
material_earth.needsUpdate = true;
});
loader.load('./assets/2k_earth_normal_map.tif', (texture)=>{
material_earth.normalMap = texture;
material_earth.normalScale = new THREE.Vector2(2, 2);
material_earth.needsUpdate = true;
});
loader.load('./assets/2k_earth_specular_map.tif', (texture)=>{
material_earth.specularMap = texture;
material_earth.needsUpdate = true;
});
Notice that you have to set `material_earth.needsUpdate = true` when you update the attributes of the material after its construction. The combination of the color, specular, and normal maps is shown here:
If we apply only the color map, specular map, or normal map in order to see each influence, the results are as follows:
Color map | Specular map | Normal map |
---|---|---|
Since the reflectivity of the ocean is higher than that of the land, the ocean has brighter values in the specular map. Also, with the normal map, you can see that the shading of mountains changes depending on the sun direction.
In the above, we’ve created the earth. Next, let’s create the moon.
const material_moon = new THREE.MeshLambertMaterial();
const moon = new THREE.Mesh(geometry_sphere, material_moon);
moon.scale.set(0.05, 0.05, 0.05);
loader.load('./assets/moonmap2k.jpg', (texture)=>{
material_moon.map = texture;
material_moon.needsUpdate = true;
});
In the result, day and night are well rendered on both the earth and the moon. But a solar eclipse does not happen even though the earth and moon are in line with the sun, and a lunar eclipse does not happen either. So, let's activate the shadow effect of the Three.js renderer.
renderer.shadowMap.enabled = true;
...
light.castShadow = true;
...
earth.castShadow = true;
earth.receiveShadow = true;
...
moon.castShadow = true;
moon.receiveShadow = true;
...
`renderer.shadowMap` configures the characteristics of the shadow map; depending on the scale of a scene, you have to tune its parameters. `castShadow = true` is applied to objects that cast shadows, while `receiveShadow = true` is applied to objects on which shadow regions are drawn. Because the moon and earth interact with each other, `castShadow` and `receiveShadow` are set to true for both of them. If `castShadow` of the earth or `receiveShadow` of the moon were set to false, the lunar eclipse would not appear.
Now, we can observe the eclipses. The boundary of the shadow does not look very natural, but let's skip that for now. Then, for a realistic scale, what if we reduce the size of the moon?
This rasterized shadow boundary gets worse as the distance between the light and the earth increases. To make the shadow more realistic, we can configure the size of the shadow map and its blur as below.
light.shadow.mapSize = new THREE.Vector2(4096, 4096);
light.shadow.radius = 20;
In this way, we can raise the shadow map resolution for a more distant planet, but it wastes memory because only a few small surfaces are affected by shadows. Also, the quality of the shadow differs depending on the distance from the light source. So, it's time to use shaders.
Overall, materials and textures make an object almost realistic. But if we want to illustrate the night side of the earth, the shadows of clouds, and a high-resolution eclipse, we need a shader material, which will be covered in the following article.
import * as THREE from 'three'
import { OrbitControls } from 'three/addons/controls/OrbitControls.js'
const canvas = document.createElement("canvas");
document.body.appendChild(canvas);
const renderer = new THREE.WebGLRenderer({canvas: canvas, alpha: true, antialias: true});
renderer.setSize( window.innerWidth, window.innerHeight );
renderer.shadowMap.enabled = true;
const camera = new THREE.PerspectiveCamera(75, window.innerWidth / window.innerHeight, 0.1, 1000);
camera.position.set(0, 0, 5);
const geometry_sphere = new THREE.SphereGeometry(1, 30, 30);
const material_sun = new THREE.MeshBasicMaterial({color: 0xffaa00});
const sun = new THREE.Mesh(geometry_sphere, material_sun);
const material_earth = new THREE.MeshPhongMaterial({specular: new THREE.Color(0xFFFFFF), shininess: 3});
const earth = new THREE.Mesh(geometry_sphere, material_earth);
earth.scale.set(0.2, 0.2, 0.2);
earth.castShadow = true;
earth.receiveShadow = true;
const loader = new THREE.TextureLoader();
loader.load('./assets/2k_earth_daymap.jpg', (texture)=>{
material_earth.map = texture;
material_earth.needsUpdate = true;
});
loader.load('./assets/2k_earth_normal_map.tif', (texture)=>{
material_earth.normalMap = texture;
material_earth.normalScale = new THREE.Vector2(2, 2);
material_earth.needsUpdate = true;
});
loader.load('./assets/2k_earth_specular_map.tif', (texture)=>{
material_earth.specularMap = texture;
material_earth.needsUpdate = true;
});
const material_moon = new THREE.MeshLambertMaterial();
const moon = new THREE.Mesh(geometry_sphere, material_moon);
moon.scale.set(0.05, 0.05, 0.05);
moon.castShadow = true;
moon.receiveShadow = true;
loader.load('./assets/moonmap2k.jpg', (texture)=>{
material_moon.map = texture;
material_moon.needsUpdate = true;
});
const light = new THREE.PointLight(0xffffff, 15);
light.position.set(0, 0, 0);
light.castShadow = true;
light.shadow.mapSize = new THREE.Vector2(4096, 4096);
light.shadow.radius = 20;
const earth_orbit = new THREE.Object3D();
const earth_equator = new THREE.Object3D();
const moon_orbit = new THREE.Object3D();
// earth_equator.rotateZ(23.5*Math.PI/180);
const scene = new THREE.Scene();
scene.add(sun);
sun.add(light);
scene.add(earth_orbit);
earth_orbit.add(earth_equator);
earth_equator.add(earth);
earth_equator.add(moon_orbit);
moon_orbit.add(moon);
const controls = new OrbitControls(camera, canvas);
controls.enableDamping = true;
const w_moon = 2;
const w_orbit = 0.5;
const w_rotate = 0.1;
function updateSystem(sec) {
moon.position.set(0.4*Math.cos(w_moon*sec), 0, -0.4*Math.sin(w_moon*sec));
sun.position.set(3*Math.cos(w_orbit*sec), 0, -3*Math.sin(w_orbit*sec));
earth.rotateY(w_rotate);
}
function animate (msec) {
requestAnimationFrame(animate);
updateSystem(msec * 0.001);
controls.update();
renderer.render(scene, camera);
}
requestAnimationFrame(animate); // start via rAF so that msec is defined from the first frame
In this article, I’ll create a simple Sun-Earth system. Firstly, create an orange sphere and a blue sphere which represent the Sun and the Earth, respectively.
// create sphere geometry for Sun and Earth
const geometry_sphere = new THREE.SphereGeometry(1, 30, 30);
// object Sun uses basicMaterial since it emits orange light
const material_sun = new THREE.MeshBasicMaterial({color: 0xffaa00});
const sun = new THREE.Mesh(geometry_sphere, material_sun);
sun.position.set(0, 0, 0);
const light = new THREE.PointLight(0xffffff, 50);
light.position.set(0, 0, 0);
// object Earth uses lambertMaterial since it reflects sunlight
const material_earth = new THREE.MeshLambertMaterial({color: 0x4444ff});
const earth = new THREE.Mesh(geometry_sphere, material_earth);
earth.position.set(3, 0, 0); // distant from Sun
earth.scale.set(0.2, 0.2, 0.2); // smaller size than Sun
const scene = new THREE.Scene();
scene.add(sun);
scene.add(earth);
scene.add(light);
The sun and earth have the same shape, so they share a single geometry, `geometry_sphere`. Because the sun is the only light source in the system, its material is `MeshBasicMaterial`, which is not affected by lighting, and a `PointLight` is located at the center of the sun. The earth is placed away from the sun and is smaller than the sun. The result is below:
Let's make the earth rotate and orbit around the sun. Define the following function and call it inside the animation loop.
const w_orbit = 0.5;
const w_rotate = 0.1;
function updateSystem(sec) {
earth.position.set(3*Math.cos(w_orbit*sec), 0, -3*Math.sin(w_orbit*sec));
earth.rotateY(w_rotate);
}
By the way, the animation function bound to `requestAnimationFrame()` is passed a single argument indicating a timestamp, `msec`.
This callback function is passed a single argument: a `DOMHighResTimeStamp` indicating the end time of the previous frame's rendering (based on the number of milliseconds since time origin). 1
function animate (msec) {
requestAnimationFrame(animate);
updateSystem(msec * 0.001);
controls.update();
renderer.render(scene, camera);
}
requestAnimationFrame(animate); // start via rAF so that msec is defined from the first frame
Also, to see the rotation of the earth more easily, reduce the number of segments of `SphereGeometry` and set `flatShading: true` in the earth material.
const geometry_sphere = new THREE.SphereGeometry(1, 10, 10);
const material_earth = new THREE.MeshLambertMaterial({color: 0x4444ff, flatShading: true});
Finally, the result is below:
However, when we try to add the moon to the Sun-Earth system, we have to solve for the position of the moon explicitly.
\[x_{\rm moon} = dist_{\rm sun-earth} \times \cos(\omega_{\rm rev, earth}\times t) + dist_{\rm earth-moon}\times \cos(\omega_{\rm rev, moon}\times t)\]
\[z_{\rm moon} = -dist_{\rm sun-earth} \times \sin(\omega_{\rm rev, earth}\times t) - dist_{\rm earth-moon}\times \sin(\omega_{\rm rev, moon}\times t)\]
(Note that the second coordinate is \(z\), since the orbits in the code lie in the xz-plane.) Moreover, if we try to describe a realistic solar system, the above equations would become much more complicated because the real orbit and rotation axis of the earth are tilted. Let's revise the above code using local coordinate frames: the earth's orbit plane and equator plane, and the moon's orbit plane.
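For comparison, the explicit update would translate into code like the following sketch (using the distances 3 and 0.4 and the angular rates from the snippets above):
function updateSystemExplicitly(sec) {
  // earth's orbit around the sun plus moon's orbit around the earth
  const x = 3*Math.cos(w_orbit*sec) + 0.4*Math.cos(w_moon*sec);
  const z = -3*Math.sin(w_orbit*sec) - 0.4*Math.sin(w_moon*sec);
  moon.position.set(x, 0, z);
}
The scene-graph version below removes the need for this explicit computation.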
const earth_orbit = new THREE.Object3D();
const earth_equator = new THREE.Object3D();
const moon_orbit = new THREE.Object3D();
earth_equator.rotateZ(23.5*Math.PI/180); // tilted rotation axis
const scene = new THREE.Scene();
scene.add(sun);
scene.add(earth_orbit);
sun.add(light);
earth_orbit.add(earth_equator);
earth_equator.add(earth);
earth_equator.add(moon_orbit);
moon_orbit.add(moon);
const w_moon = 5;
const w_orbit = 0.5;
const w_rotate = 0.1;
function updateSystem(sec) {
moon.position.set(0.4*Math.cos(w_moon*sec), 0, -0.4*Math.sin(w_moon*sec));
earth_equator.position.set(3*Math.cos(w_orbit*sec), 0, -3*Math.sin(w_orbit*sec));
earth.rotateY(w_rotate);
}
In the above code, because `moon_orbit` belongs to `earth_equator`, the position of `moon` is determined in the `earth_equator` coordinates. Thus, the equation of the moon becomes much simpler than the explicit equations above.
Therefore, the scene graph of the above system looks like below. Depending on the moon, its object can be added to its mother's equator plane or orbit plane. To be precise, the scene graph also includes the relationships among Object3D, Mesh, Geometry, Material, and Texture, but here I depict Object3D only.
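For example, a second moon could be attached to the same hierarchy. This is a hypothetical sketch; moon2 would be another Mesh defined like moon, and the inclination value is arbitrary:
const moon2_orbit = new THREE.Object3D();
moon2_orbit.rotateX(5*Math.PI/180); // arbitrary orbital inclination
earth_equator.add(moon2_orbit);
moon2_orbit.add(moon2);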
import * as THREE from 'three'
import { OrbitControls } from 'three/addons/controls/OrbitControls.js'
const canvas = document.createElement("canvas");
document.body.appendChild(canvas);
const renderer = new THREE.WebGLRenderer({canvas: canvas, alpha: true, antialias: true});
renderer.setSize( window.innerWidth, window.innerHeight );
const camera = new THREE.PerspectiveCamera(75, window.innerWidth / window.innerHeight, 0.1, 1000);
camera.position.set(0, 0, 5);
const geometry_sphere = new THREE.SphereGeometry(1, 10, 10);
const material_sun = new THREE.MeshBasicMaterial({color: 0xffaa00});
const sun = new THREE.Mesh(geometry_sphere, material_sun);
const material_earth = new THREE.MeshLambertMaterial({color: 0x4444ff, flatShading: true});
const earth = new THREE.Mesh(geometry_sphere, material_earth);
earth.scale.set(0.2, 0.2, 0.2);
const material_moon = new THREE.MeshLambertMaterial({color: 0xaaaaaa, flatShading: true});
const moon = new THREE.Mesh(geometry_sphere, material_moon);
moon.scale.set(0.1, 0.1, 0.1);
const light = new THREE.PointLight(0xffffff, 50);
light.position.set(0, 0, 0);
const earth_orbit = new THREE.Object3D();
const earth_equator = new THREE.Object3D();
const moon_orbit = new THREE.Object3D();
earth_equator.rotateZ(23.5*Math.PI/180);
const scene = new THREE.Scene();
scene.add(sun);
scene.add(earth_orbit);
sun.add(light);
earth_orbit.add(earth_equator);
earth_equator.add(earth);
earth_equator.add(moon_orbit);
moon_orbit.add(moon);
const controls = new OrbitControls(camera, canvas);
controls.enableDamping = true;
const w_moon = 5;
const w_orbit = 0.5;
const w_rotate = 0.1;
function updateSystem(sec) {
moon.position.set(0.4*Math.cos(w_moon*sec), 0, -0.4*Math.sin(w_moon*sec));
earth_equator.position.set(3*Math.cos(w_orbit*sec), 0, -3*Math.sin(w_orbit*sec));
earth.rotateY(w_rotate);
}
function animate (msec) {
requestAnimationFrame(animate);
updateSystem(msec * 0.001);
controls.update();
renderer.render(scene, camera);
}
requestAnimationFrame(animate); // start via rAF so that msec is defined from the first frame
For Three.js to render a scene, it needs a scene, a camera, and a renderer. In JavaScript, you have to import Three.js according to the installation option, as mentioned in the previous article.
// main.js
import * as THREE from 'three'
In HTML, to display the rendering result of Three.js, you have to create or designate a `<canvas>` element to which Three.js renders the scene.
const canvas = document.createElement("canvas");
document.body.appendChild(canvas);
The renderer takes the `canvas` as its `domElement`, along with `options` such as alpha and antialias. If you want the app to fill the screen, you can set the size of the renderer to `window.innerWidth` and `window.innerHeight`.
const renderer = new THREE.WebGLRenderer({canvas: canvas, alpha: true, antialias: true});
renderer.setSize( window.innerWidth, window.innerHeight );
Next, the camera can be defined as a `PerspectiveCamera` or an `OrthographicCamera`. In this example, I've chosen a perspective camera, which takes the field of view, aspect ratio, and near and far depth values as arguments.
const camera = new THREE.PerspectiveCamera(75, window.innerWidth / window.innerHeight, 0.1, 1000);
camera.position.set(1, -5, 5);
camera.lookAt(new THREE.Vector3(0,0,0));
After generating an empty scene, `renderer.render(scene, camera)` renders the scene with the camera.
const scene = new THREE.Scene();
renderer.render(scene, camera);
Furthermore, in order to construct a 3D scene, we also need objects and lights. A 3D object requires a geometry, a material, and a mesh. Below, we define a cube geometry and a basic material. `BoxGeometry(1, 1, 1)` constructs a cube with side length 1. `MeshBasicMaterial` constructs the material of the object and is not affected by lighting.
const geometry = new THREE.BoxGeometry(1, 1, 1);
const material = new THREE.MeshBasicMaterial({color: 0xff0000});
const cube = new THREE.Mesh(geometry, material);
scene.add(cube);
If we use `MeshLambertMaterial` instead, we should add a light to the scene.
const geometry = new THREE.BoxGeometry(1, 1, 1);
const material = new THREE.MeshLambertMaterial({color: 0xff0000});
const cube = new THREE.Mesh(geometry, material);
scene.add(cube);
const light = new THREE.PointLight(0xffffff, 50);
light.position.set(2, -3, 5);
light.lookAt(new THREE.Vector3(0,0,0));
scene.add(light);
If you want to animate the scene, you have to run `renderer.render(scene, camera)` iteratively. As with canvas animation, it is better to use `requestAnimationFrame()` than `setInterval()`.
function animate () {
requestAnimationFrame(animate);
renderer.render(scene, camera);
}
animate();
Let's rotate the cube to see the animation. Insert `cube.rotateX(0.02)` inside the `animate()` function.
function animate () {
requestAnimationFrame(animate);
cube.rotateX(0.02);
renderer.render(scene, camera);
}
animate();
To control the pose of the camera with mouse interaction, you can use the Three.js add-ons, which provide several versatile controls. In particular, orbit controls allow the camera to move along its orbit around a target.
import { OrbitControls } from 'three/addons/controls/OrbitControls.js'
...
const controls = new OrbitControls(camera, canvas);
If you want to enable the damping effect of the orbit controls, set `controls.enableDamping = true` and revise the `animate()` function:
function animate () {
requestAnimationFrame(animate);
cube.rotateX(0.02);
controls.update();
renderer.render(scene, camera);
}
animate();
import * as THREE from 'three'
import { OrbitControls } from 'three/addons/controls/OrbitControls.js'
const canvas = document.createElement("canvas");
document.body.appendChild(canvas);
const renderer = new THREE.WebGLRenderer({canvas: canvas, alpha: true, antialias: true});
renderer.setSize( window.innerWidth, window.innerHeight );
const camera = new THREE.PerspectiveCamera(75, window.innerWidth / window.innerHeight, 0.1, 1000);
camera.position.set(1, -5, 5);
camera.lookAt(new THREE.Vector3(0,0,0));
const geometry = new THREE.BoxGeometry(1, 1, 1);
const material = new THREE.MeshLambertMaterial({color: 0xff0000});
const cube = new THREE.Mesh(geometry, material);
const light = new THREE.PointLight(0xffffff, 50);
light.position.set(2, -3, 5);
light.lookAt(new THREE.Vector3(0,0,0));
const scene = new THREE.Scene();
scene.add(cube);
scene.add(light);
const controls = new OrbitControls(camera, canvas);
controls.enableDamping = true;
function animate () {
requestAnimationFrame(animate);
cube.rotateX(0.02);
controls.update();
renderer.render(scene, camera);
}
animate();
The above image is a snapshot of a solar system simulator.1 The posts will be serialized in the order of the contents below.
Three.js is a JavaScript library for creating and animating 3D computer graphics in the browser using WebGL. 1
You can install Three.js with NPM or import it from a CDN. Option 1: install Three.js and a build tool, such as Vite, with NPM, then run the dev server:
npm install --save three
npm install --save-dev vite
npx vite
Option 2: import Three.js from a CDN by adding an import map inside the `<head></head>` tag in index.html. Please substitute `{version}` with an actual version of Three.js, such as 0.153.1.
<!-- index.html -->
<script type="importmap">
{
"imports": {
"three": "https://unpkg.com/three@{version}/build/three.module.js",
"three/addons/": "https://unpkg.com/three@{version}/examples/jsm/"
}
}
</script>
Then, run a local server to open index.html, for example:
npx serve .
Otherwise, you can run a server using the Live Server extension if you use VS Code.
To use Three.js, you have to import `THREE` in main.js. The import differs depending on the installation option. If you've installed Three.js with Option 1, import Three.js as:
// main.js
import * as THREE from 'three'
import { OrbitControls } from 'three/examples/jsm/controls/OrbitControls.js';
...
If you’ve installed Three.js with Option 2, use:
// main.js
import * as THREE from 'three'
import { OrbitControls } from 'three/addons/controls/OrbitControls.js';
...
In the following contents, I’ve imported Three.js from a CDN and used Live Server with VS Code.
In this page, I'll introduce a method to build an Electron app and automatically publish it to GitHub using electron-builder.
First, you should install `electron-builder`:
npm install --save-dev electron-builder
Then, you should add the repository and build options in `package.json` as below:
// package.json
{
"name": "{ name }",
"version": "{ version }",
...
"repository": {
"type": "git",
"url": "{ repository url }"
},
"build": {
"appID": "{ appID }",
...
"publish": {
"provider": "github",
"host": "{ github.com }",
"owner": "{ github username }",
"repo": "{ repository name }",
"releaseType": "draft",
},
"releaseInfo": {
"releaseNotesFile": "release-notes.md",
}
}
}
`releaseType` can be "draft", "prerelease", or "release". If you choose "draft", you have to publish the release manually in the remote repository.
`releaseInfo` contains the "releaseName", "releaseNotes", "releaseNotesFile", and "releaseDate" values that summarize the release update.
When you use Electron with Vue, you should configure the builder in `vue.config.js` as below:
// vue.config.js
const { defineConfig } = require('@vue/cli-service');
module.exports = defineConfig({
...
pluginOptions: {
electronBuilder: {
builderOptions: {
appId: "{ appID }",
...
"publish": {
"provider": "github",
"host": "{ github.com }",
"owner": "{ github username }",
"repo": "{ repository name }",
"releaseType": "draft",
},
"releaseInfo": {
"releaseNotesFile": "release-notes.md",
}
}
}
},
...
})
To publish a release, execute the line below with the publish argument, `-p` or `--publish`:
vue-cli-service electron:build --publish always
`always` forces publishing of the current build. The choices are "onTag", "onTagOrDraft", "always", and "never". 1
When you use an NPM script, execute:
npm run [build script] -- -p always
`--` passes the following arguments to the script.
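For instance, with a hypothetical "dist" script in package.json:
// package.json
{
  "scripts": {
    "dist": "electron-builder"
  }
}
Then npm run dist -- -p always expands to electron-builder -p always.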
The autoUpdater enables an Electron app to check for the latest version and update itself automatically. 1
You can install autoUpdater via NPM or yarn:
npm install --save-dev electron-updater
or
yarn add electron-updater
In the main.js file of Electron, the autoUpdater can be defined as follows:
// main.js
const { autoUpdater } = require("electron-updater");
autoUpdater.setFeedURL({
provider: "github",
host: "github.com",
owner: "{ username }",
repo: "{ repository }",
token: "{ token }",
});
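A minimal usage sketch follows; the options and events used here are described in the list below:
// check for updates and react to the relevant events
autoUpdater.autoDownload = false;
autoUpdater.on("update-available", (info) => {
  console.log(`Found version ${info.version} (released ${info.releaseDate})`);
  autoUpdater.downloadUpdate(); // needed because autoDownload = false
});
autoUpdater.on("update-downloaded", () => {
  autoUpdater.quitAndInstall(); // restart the app and install the update
});
autoUpdater.checkForUpdates();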
- `update-available` is emitted when an update is found. If `autoDownload = true`, the update will be downloaded automatically. The event contains the version, releaseDate, releaseNotes, etc.
- `autoDownload` defines whether to automatically download an update when it is found. If `autoDownload = false`, you should manually execute `autoUpdater.downloadUpdate()` after receiving the `update-available` event.
- `autoInstallOnAppQuit` defines whether to automatically install a downloaded update on app quit. If `autoInstallOnAppQuit = false`, you should manually execute `autoUpdater.quitAndInstall()`.
- `options` consists of provider, host, owner, repo, token, etc.
- `autoUpdater.quitAndInstall()` should be called only after `update-downloaded` has been emitted.

The first part of the dissertation focuses on asynchronously estimating optical flow streams with low latency and robustness to various scenes. Due to the fundamental difference between traditional and event cameras, most existing algorithms construct event frames by stacking the timestamp values of many events and exploit the legacy of traditional computer vision algorithms. However, this approach increases latency in proportion to the size of a time window, and the size has to be set heuristically. I estimate an optical flow stream with very low latency by enhancing the existing block matching algorithm. The locally estimated optical flow is more accurate than that of the method using a global event frame on irregularly textured scenes. To validate the latency of the optical flow, I present the result of angular velocity estimation using the proposed optical flow stream. The latency is then computed by an optimization approach comparing the estimated and ground-truth angular velocity. The evaluation results suggest that the proposed optical flow has very low latency while showing comparable accuracy to event-frame-based algorithms. Besides, the performance of angular velocity estimation is superior to the other existing algorithms in terms of accuracy and robustness, with latency consistently under 15 ms.
The second part of the dissertation proposes angular velocity estimation with motion segmentation. Unlike traditional cameras, since event cameras detect intensity changes, their event data can be dominated by a small but fast-moving object. To eliminate the influence of the movement of an undesirable object, I utilize the optical flow stream of the first work and the intra-pixel-area method, and separate an image frame into static and dynamic regions. Moreover, since event cameras do not produce events when stationary, a classification model should be addressed in the temporal domain and be able to segment motion temporally. Thus, I employ the dual-mode motion model to update the models that determine the region occupied by moving objects. Then, the angular velocity of ego-motion is estimated from the optical flows belonging to the static region. The evaluation results suggest that the proposed algorithm divides the image frame into static and dynamic parts successfully and estimates the angular velocity robustly in dynamic environments.
Abstract: Event cameras are bio-inspired sensors that capture intensity changes of pixels individually, and generate asynchronous and independent "events". Due to the fundamental difference from conventional cameras, most research on event cameras builds a global event frame by grouping events according to their timestamps or their number in order to employ traditional computer vision algorithms. However, in order to take advantage of event cameras, it makes sense to generate asynchronous output on an event-by-event basis. In this paper, we propose an optical flow estimation algorithm with low latency and robustness to various scenes that utilizes the advantage of the event camera by enhancing the existing optical flow algorithm. Furthermore, we estimate angular velocity with low latency using the proposed optical flow stream. For the validation of the algorithms, we evaluate the accuracy and latency of the optical flow with publicly available datasets. Moreover, we assess the performance of the proposed angular velocity estimation in comparison to the existing algorithms. Both validations suggest that our asynchronous optical flow shows comparable accuracy to the existing algorithms while its latency is reduced by half on average compared to the existing block matching algorithm. Also, our angular velocity estimation is superior to the existing algorithms in terms of accuracy and robustness while consistently showing low latency within 15 ms.
@article{lee2021low,
title={Low-Latency and Scene-Robust Optical Flow Stream and Angular Velocity Estimation},
author={Lee, Sangil and Kim, H Jin},
journal={IEEE Access},
volume={9},
pages={155988--155997},
year={2021},
publisher={IEEE}
}