Drawing and animation

Overview

Drawing in Use.GPU is different from most other engines: there is no global rendering loop. The app is written as if it only has to produce a single frame.

The incremental effect system handles the rest, as it will selectively re-run parts of the code when state changes somewhere.

import { Pass } from '@use-gpu/workbench';

// ...
return (
  <Pass>
  </Pass>
);

You can place a <Pass> inside your <Canvas> to set up a rendering pass to the screen. This gathers draw calls from inside and schedules them by type (opaque, transparent, picking, shadow, debug, ...). <Pass> can operate either in classic forward mode (direct rendering), or in deferred mode with a so-called GBuffer (lights rendered separately).

These will be drawn into the surrounding RenderContext, which by default comes from the surrounding <Canvas>.

Draw calls tagged with min/max bounds will be ordered front-to-back (opaque) or back-to-front (transparent) as needed for optimal performance. This is true for any data ingested via the built-in components.
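The front-to-back / back-to-front split can be sketched as follows. This is an illustrative TypeScript sketch, not the actual Use.GPU implementation; `DrawCall` and `sortDrawCalls` are hypothetical names:

```typescript
// Sketch: order draw calls by camera depth, per render bucket.
type DrawCall = {
  id: string;
  transparent: boolean;
  depth: number; // distance from the camera, derived from min/max bounds
};

function sortDrawCalls(calls: DrawCall[]): DrawCall[] {
  const opaque = calls.filter((c) => !c.transparent);
  const transparent = calls.filter((c) => c.transparent);
  // Opaque: front-to-back, so early depth testing rejects hidden fragments.
  opaque.sort((a, b) => a.depth - b.depth);
  // Transparent: back-to-front, so alpha blending composites correctly.
  transparent.sort((a, b) => b.depth - a.depth);
  return [...opaque, ...transparent];
}
```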

FrameContext

Use.GPU maintains a command queue inside <Queue> which will re-draw the entire canvas whenever any draw call is added, removed or changed. It will also trigger whenever any data source changes. This is all that is needed to produce an interactive app that can re-render on every change.

However, it's recommended to wrap visible content in a <Loop> to avoid rendering more frames than necessary.

There is also a FrameContext to provide a classic "per frame" trigger. This is provided by e.g. interactive camera controls such as <OrbitControls>. Components can use the usePerFrame hook as a short-hand to subscribe to the FrameContext. This is only used for outside events, such as viewpoint changes and ongoing animation, e.g. to allow uploading of live data.
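The idea behind such a per-frame trigger can be sketched as a simple subscription. This is a conceptual TypeScript sketch with hypothetical names; it is not the actual FrameContext implementation:

```typescript
// Sketch: components subscribe, and an outside event source (e.g. camera
// controls) fires the trigger when a new frame is needed.
type FrameListener = (time: number) => void;

class FrameTrigger {
  private listeners = new Set<FrameListener>();

  subscribe(fn: FrameListener): () => void {
    this.listeners.add(fn);
    return () => { this.listeners.delete(fn); };
  }

  // Called by e.g. camera controls when the viewpoint changes.
  fire(time: number) {
    for (const fn of this.listeners) fn(time);
  }
}
```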


Animation

You need an explicit loop if you have components that are continuously animated. In that case, the app changes by itself, on a regular schedule.

<Loop>

This component provides an equivalent to requestAnimationFrame() in the browser.

import { Loop } from '@use-gpu/workbench';

// ...
return (
  <Loop>
    <Pass>
    </Pass>
  </Loop>
);

You can wrap your content inside <Loop> to allow it to be re-rendered on demand. This must encompass all draw calls that are needed to produce the final frame.

<Loop> buffers the draw calls so they only run once per animation frame. If a state change occurs inside, mid-frame, looped components update immediately, but their commands won't be dispatched until the next frame.

<Loop> provides a FrameContext, as well as a TimeContext with a global synchronized clock for animation.

<Loop> does not run continuously by itself: it only loops while a component inside is animating. Set its live prop to true to run on every frame regardless.

LoopContext

<Loop> provides a LoopContext to allow components to request a new frame.

Components can use the useAnimationFrame hook as a short-hand.

<Animate>

Run a keyframe animation.

Keyframes

import { Animate, Keyframe } from '@use-gpu/workbench';

const keyframes: Keyframe<number> = [
  [0, 10], // 0s - value 10
  [5, 20], // 5s - value 20
];

// ...
return (
  <Animate
    prop="size"
    keyframes={keyframes}
    loop
    mirror
    delay={1}
  >
    <Component />
  </Animate>
);

This will animate the size prop of <Component /> according to the given keyframes.
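Sampling such keyframes amounts to interpolating between the two surrounding [time, value] pairs. A minimal linear sketch, assuming plain numeric values (the real <Animate> also handles the loop, mirror and delay props shown above):

```typescript
// Sketch: sample a [time, value] keyframe track at time t,
// linearly interpolating between neighboring keyframes.
type KeyframePair = [number, number];

function sample(keyframes: KeyframePair[], t: number): number {
  if (t <= keyframes[0][0]) return keyframes[0][1];
  const last = keyframes[keyframes.length - 1];
  if (t >= last[0]) return last[1];
  for (let i = 0; i < keyframes.length - 1; i++) {
    const [t0, v0] = keyframes[i];
    const [t1, v1] = keyframes[i + 1];
    if (t >= t0 && t <= t1) return v0 + ((t - t0) / (t1 - t0)) * (v1 - v0);
  }
  return last[1];
}
```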

Tracks

You can also specify multiple tracks instead of a single prop:

const tracks = {
  size: [
    [0, 10], // 0s - value 10
    [5, 20], // 5s - value 20
  ],
  color: [
    [0, [0, 0, 0, 1]], // 0s - value black
    [5, [1, 1, 1, 1]], // 5s - value white
  ],
};

// ...
return (
  <Animate
    tracks={tracks}
  >
    <Component />
  </Animate>
);

Render prop

Animate arbitrary components from a single source of truth:

return (
  <Animate
    keyframes={...}
    render={
      (value: T) => (<>
        <Component prop={value} />
        <Component prop={value} />
        <Component prop={value} />
      </>)
    }
  />
);

return (
  <Animate
    tracks={...}
    render={
      (values: Record<string, T>) => (<>
        <Component prop={values.prop} />
        <Component prop={values.prop} />
        <Component prop={values.prop} />
      </>)
    }
  />
);

Render-to-Texture

To render to an off-screen image, set up a render target with <RenderTarget>. It will inherit properties from the main screen, unless overridden:

import { RenderTarget } from '@use-gpu/workbench';

// ...
return (
  <RenderTarget
    resolution={1/2}
    format="rgba16float"
  >
    {/* RenderContext inside points to texture. */}
    <Pass>
      {/* ... */}
    </Pass>
  </RenderTarget>
);
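Assuming the fractional resolution prop scales the inherited screen dimensions, the resulting texture size could be computed like this (hypothetical helper for illustration, not Use.GPU API):

```typescript
// Sketch: derive an off-screen target size from the screen size
// and a fractional resolution factor.
function targetSize(width: number, height: number, resolution: number) {
  return {
    width: Math.max(1, Math.round(width * resolution)),
    height: Math.max(1, Math.round(height * resolution)),
  };
}
```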

For more complex arrangements, you can gather the <RenderTarget> as part of a set and pass it to a <RenderToTexture> as its target instead:

import { RenderToTexture } from '@use-gpu/workbench';

// ...
return (
  <Gather
    children={[
      <RenderTarget {...} />,
    ]}
    then={([target]) =>
      <RenderToTexture target={target}>
        {/* Contents rendered to texture */}
      </RenderToTexture>
    }
  />
);

To process the resulting texture, use a then prop, e.g. to draw it to the screen:

import type { TextureSource } from '@use-gpu/core';
import { RawFullScreen } from '@use-gpu/workbench';

// ...
return (
  <RenderToTexture
    children={<>
      {/* Contents rendered to texture */}
    </>}
    then={(texture: TextureSource) => (
      <Pass>
        <RawFullScreen texture={texture} />
      </Pass>
    )}
  />
);