This ComfyUI tutorial workflow turns a single piece of concept art into a short, stop‑motion‑style video. It uses LoadImage to bring your reference artwork into the graph, feeds it into ByteDance2ReferenceNode to synthesize motion while preserving the look of the original design, and then encodes the result with SaveVideo. The ByteDance2ReferenceNode is the engine of the pipeline: it performs reference‑to‑video generation by conditioning on your image so the character, style, and palette remain consistent across frames while introducing controlled movement.
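The three-node graph can be sketched in ComfyUI's API ("prompt") JSON format. Only the three class names (LoadImage, ByteDance2ReferenceNode, SaveVideo) come from the tutorial; the node ids, input names, and the prompt string below are illustrative assumptions, not the node's actual schema.

```python
import json

# Minimal sketch of the LoadImage -> ByteDance2ReferenceNode -> SaveVideo
# graph in ComfyUI's API (prompt) format. Input names are assumptions.
workflow = {
    "1": {
        "class_type": "LoadImage",
        # Filename of the reference artwork in ComfyUI's input folder
        "inputs": {"image": "concept_art.png"},
    },
    "2": {
        "class_type": "ByteDance2ReferenceNode",
        "inputs": {
            # Wire LoadImage's first output in as the reference image
            "image": ["1", 0],
            # Hypothetical text prompt guiding the generated motion
            "prompt": "stop-motion style character animation",
        },
    },
    "3": {
        "class_type": "SaveVideo",
        # Frames from the reference-to-video node, encoded to MP4
        "inputs": {"video": ["2", 0]},
    },
}

print(json.dumps(workflow, indent=2))
```

In this format each node is keyed by an id, and a `["node_id", output_index]` pair wires one node's output into another's input, mirroring the links you draw in the ComfyUI canvas.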

Technically, the workflow keeps things simple and focused on iteration. You supply a clean, well‑framed reference image; ByteDance2ReferenceNode generates a sequence of frames guided by the reference, and SaveVideo writes those frames to an MP4 at a frame rate you choose. To emphasize a stop‑motion aesthetic, you typically use a lower FPS (for example, 8–12) and modest motion magnitude so changes appear stepwise and tactile. Because the setup is lightweight (LoadImage → ByteDance2ReferenceNode → SaveVideo), it’s easy to re‑run with different reference images or parameter tweaks to dial in timing, motion intensity, and framing.
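Because re-running with parameter tweaks is the core loop here, it can help to generate the variants programmatically. The sketch below assumes a SaveVideo node with an `fps` input and a ByteDance2ReferenceNode with a `motion_strength` input; both input names are hypothetical stand-ins for whatever the actual nodes expose.

```python
import copy
import itertools

def make_variants(base, fps_values=(8, 10, 12), motion_values=(0.3, 0.5)):
    """Return one workflow copy per (fps, motion) combination.

    Assumes node id "2" is the ByteDance2ReferenceNode and node id "3"
    is SaveVideo; the "fps" and "motion_strength" input names are
    hypothetical, not the nodes' documented schema.
    """
    variants = []
    for fps, motion in itertools.product(fps_values, motion_values):
        wf = copy.deepcopy(base)                     # keep the base untouched
        wf["3"]["inputs"]["fps"] = fps               # lower fps -> stop-motion feel
        wf["2"]["inputs"]["motion_strength"] = motion  # modest motion magnitude
        variants.append(wf)
    return variants

base = {
    "2": {"class_type": "ByteDance2ReferenceNode", "inputs": {}},
    "3": {"class_type": "SaveVideo", "inputs": {}},
}

variants = make_variants(base)
print(len(variants))  # 3 fps values x 2 motion strengths = 6 variants
```

Each variant could then be queued against a running ComfyUI instance one at a time, making it quick to compare how timing and motion intensity interact at stop-motion frame rates.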