In the world of professional video mapping hardware, a video processor ingests a 2K (and oftentimes 4K) video stream over DisplayPort, HDMI, or similar. The processor in turn maps this content onto an arrangement of video walls or video tiles placed throughout a stage, usually over Ethernet.
Sometimes those video tiles sit in neat arrays; sometimes they are arranged at angles or in design-centric ways. When that happens, mapping video to these surfaces in a visually cohesive way gets much more difficult with traditional methods, and this is where Crescent Sun's workflow really shines.
The first step is to lay out digital tiles onto the pixel map the venue (hopefully!) provides; even when one isn't available, the mapping can be worked out through trial and error and enough time on the rig. One of the ongoing UX topics was making the Panel Mapper workflow as smooth and fast as possible, so that more could get done in less time on the tight schedules often encountered onsite, days before a show.
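As a rough illustration of what that first pass looks like, here is a minimal, standalone Python sketch of placing a row of equal-sized digital tiles onto a processor pixel map. The tile names, sizes, and the `layout_row` helper are hypothetical, not Crescent Sun's actual data model.

```python
# Hypothetical sketch: laying a row of digital tiles onto a venue pixel map.
from dataclasses import dataclass

@dataclass
class Tile:
    name: str
    x: int   # left edge on the pixel map, in pixels
    y: int   # top edge on the pixel map, in pixels
    w: int   # tile width in pixels
    h: int   # tile height in pixels

def layout_row(names, tile_w, tile_h, origin_x=0, origin_y=0, gap=0):
    """Place equal-sized tiles left to right, starting at the given origin."""
    return [
        Tile(name, origin_x + i * (tile_w + gap), origin_y, tile_w, tile_h)
        for i, name in enumerate(names)
    ]

# Example: eight illustrative 104x208 LED tiles laid out as one row at (0, 0).
for t in layout_row([f"tile_{i}" for i in range(8)], 104, 208):
    print(t.name, t.x, t.y, t.w, t.h)
```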
While this first step is crucial, it doesn't account for how the video walls and tiles are positioned or rotated in real life. That is handled in the next step; however, the layout done below tells the 3D tiles in the Panel FX system how they are initially positioned.
The next series of steps in preparing the show file is to lay out the digital panels into an arrangement that matches their real-life placement. This step is key to making video content projected across the entire stage feel cohesive, correctly oriented, and correctly scaled.
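A hedged sketch of that step, assuming TouchDesigner panel COMPs with hypothetical names like `panel_01` and hand-measured placements, might push the real-world transforms onto each panel like this:

```python
# TouchDesigner sketch (hypothetical panel names and measurements) of pushing
# real-world placement onto the 3D panel COMPs so content lands correctly.
# Each panel is assumed to be a Geometry COMP with standard transform pars.

placements = {
    # name: (tx, ty, tz, rz) -- meters and degrees measured on the rig
    'panel_01': (-2.0, 1.5, 0.0,  0.0),
    'panel_02': ( 0.0, 1.5, 0.0, 15.0),
    'panel_03': ( 2.0, 1.5, 0.0, 30.0),
}

for name, (tx, ty, tz, rz) in placements.items():
    panel = op(name)
    panel.par.tx, panel.par.ty, panel.par.tz = tx, ty, tz
    panel.par.rz = rz
```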
Since we could easily have hundreds of panels on even a medium-sized stage, we needed a fast, responsive UI for selecting them in list form. I built the outliner on the left side of the UI below with geometry instancing, so that creating lots of new panels through subdivision wouldn't stall the UI.
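The following is a minimal TouchDesigner-style sketch of that idea, not the shipped outliner: a hypothetical Table DAT named `panel_list` gets one row per panel, and a Geometry COMP with instancing enabled draws one list entry per row, so adding panels never creates new UI operators.

```python
# Sketch: drive an instanced outliner from a single Table DAT.
# 'panel_list' is an assumed Table DAT; an assumed Geometry COMP has its
# Instance OP pointed at it and renders one list row per table row.

panel_table = op('panel_list')
panel_table.clear()
panel_table.appendRow(['name', 'tx', 'ty'])

row_height = 0.08
for i, panel in enumerate(parent().findChildren(tags=['panel'])):
    # One row per panel: hundreds of panels become hundreds of instances
    # of the same geometry, not hundreds of new widgets.
    panel_table.appendRow([panel.name, 0, -i * row_height])
```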
The attribute editor on the right side was another complex challenge that required customized callbacks for different parameters: some could be nudged as integers, some as floats, others were image radio groups, and some represented rotation.
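A simplified sketch of that kind of per-parameter dispatch, with made-up parameter styles and step sizes, could look like this:

```python
# Hedged sketch of per-parameter callback dispatch for an attribute editor.
# Parameter styles, step sizes, and handler names are illustrative.

def nudge_int(value, delta):
    return int(value) + int(delta)

def nudge_float(value, delta, step=0.01):
    return float(value) + delta * step

def nudge_rotation(value, delta, step=1.0):
    # Keep rotations wrapped to 0-360 so repeated nudges never overflow.
    return (float(value) + delta * step) % 360.0

NUDGE_HANDLERS = {
    'int': nudge_int,
    'float': nudge_float,
    'rotation': nudge_rotation,
}

def on_nudge(param_style, current, delta):
    handler = NUDGE_HANDLERS.get(param_style, nudge_float)
    return handler(current, delta)

print(on_nudge('rotation', 359.0, 2))  # 1.0
print(on_nudge('int', 4, 1))           # 5
```

Image radio groups would bypass the nudge path entirely and simply store the chosen option in a callback of their own.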
Ultimately, while performance in this area mattered for the general user experience, most of it is inert during a performance and is only used for authoring and setup.
"Live" is where the performance real-time magic happens. The UI/UX here is quite involved, but ultimately breaks down into four content decks, and a plethora of controls for controlling a number of Effects and fading between decks dropping new clips on different targets in real-time.
One big challenge here was keeping the real-time UI from dragging down the show's FPS during interaction. A lot of R&D happened between Eric Mintzer and me to find a solution that performed well, looked nice, and was mappable. Ultimately we achieved a good level of visual quality by caching GPU textures for the knobs, buttons, and other UI widgets and simply fetching the appropriate image for the corresponding slider or knob value.
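The lookup itself is simple. A hedged sketch, assuming a cache of 128 pre-rendered knob frames (the real frame count and atlas layout may differ), is below.

```python
# Sketch: look up a pre-rendered knob frame from a normalized control value.
# The frame count and atlas layout are assumptions for illustration.

KNOB_FRAMES = 128  # number of pre-rendered knob images cached on the GPU

def knob_frame_index(value, frames=KNOB_FRAMES):
    """Map a 0-1 slider/knob value to the nearest cached texture frame."""
    value = min(max(value, 0.0), 1.0)
    return round(value * (frames - 1))

# In TouchDesigner this index could select one cell of a cached sprite sheet,
# so interacting with a knob costs a texture fetch instead of a UI redraw.
print(knob_frame_index(0.5))  # 64
```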
Cues was the part of Crescent Sun that let a VJ or lighting designer craft specialized looks using the Panel FX system. It featured a large preset bank at the top and a non-linear editor for sequencing and keyframing various panel effect attributes.
Users could run sweeps going left to right, in to out, in noise patterns, and in many other directions, and have those control channels affect rotation, scale, outline, and numerous other geometry attributes.
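To make the idea concrete, here is an illustrative Python sketch of a left-to-right sweep whose intensity could feed a rotation channel; the falloff width, attribute range, and function names are assumptions, not the Cues implementation.

```python
# Sketch: a left-to-right sweep as a per-panel control channel.

def sweep_left_to_right(panel_x, t, width=0.2):
    """Return 0-1 intensity for a panel at normalized x as the sweep head
    (also 0-1, driven by time or a keyframe) passes over it."""
    head = t % 1.0
    distance = abs(panel_x - head)
    return max(0.0, 1.0 - distance / width)

def apply_to_rotation(intensity, max_degrees=90.0):
    # The same channel could just as easily target scale or outline width.
    return intensity * max_degrees

# A panel slightly left of the sweep head picks up a partial rotation.
print(apply_to_rotation(sweep_left_to_right(0.45, 0.5)))  # 67.5
```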
Crescent Sun is built with TouchDesigner, lots of Python, and some strategic use of GLSL shaders to either lower GPU memory consumption or enable greater scalability in certain highly replicated areas.