Welcome to the AT&T Discovery District®, a new downtown destination where tech, culture and entertainment combine to create unique experiences.
Due to the time constraints and the limited bandwidth AT&T had for feedback, Float4 needed to be able to tweak and adjust the TouchDesigner file live, and quickly. Instead of me playing catch-up on all the complex parts of the software, we opted for a workflow where Float4 would drive virtually while I provided technical commentary and feedback alongside a live video stream that ran constantly for the evening.
Other on-site personnel from Float4's New York office were present to interact with the client directly and relay the artistic feedback to the remote Float4 programmer.
We all found a rhythm sooner rather than later, an elegant dance of technology and real-time communication that let Float4 work nearly as efficiently as they could have on-site.
The Globe had a really interesting approach to detecting human movement. Several LiDAR-style sensors were embedded around the Globe's inner circumference at roughly waist height. All of these sensors faced inward, providing enough depth slices from different angles that a partial scene reconstruction could be built on a flat 2D plane, allowing blob tracking algorithms to pick up dense clusters of points as humans.
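To make that fusion-and-blob-tracking idea concrete, here is a minimal Python sketch of the general technique, not Float4's actual implementation: each sensor's polar readings are projected into a shared 2D plane, and nearby points are greedily merged into dense clusters that stand in for people. The sensor placement, clustering radius, and density threshold are all illustrative assumptions.

```python
import math

def polar_scan_to_points(sensor_pos, sensor_heading, scan):
    """Convert one sensor's (angle, distance) readings into global XY points
    on the shared 2D plane. sensor_pos is (x, y); angles are in radians."""
    sx, sy = sensor_pos
    points = []
    for angle, dist in scan:
        theta = sensor_heading + angle
        points.append((sx + dist * math.cos(theta),
                       sy + dist * math.sin(theta)))
    return points

def blob_centroids(points, radius=0.35, min_points=5):
    """Greedy distance-based clustering: points within `radius` of a cluster's
    running centroid join that cluster. Only clusters dense enough to plausibly
    be a person (>= min_points returns) are kept, and their centroids returned."""
    clusters = []
    for p in points:
        for c in clusters:
            cx = sum(q[0] for q in c) / len(c)
            cy = sum(q[1] for q in c) / len(c)
            if math.hypot(p[0] - cx, p[1] - cy) < radius:
                c.append(p)
                break
        else:
            clusters.append([p])
    return [(sum(q[0] for q in c) / len(c), sum(q[1] for q in c) / len(c))
            for c in clusters if len(c) >= min_points]
```

In practice all of the sensors' scans would be projected first, concatenated into one point list, and clustered once per frame, so returns from different sensors reinforce each other into a single dense blob per person.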
This made it possible to extract high-level positional data about where people were standing, and allowed Float4 to designate specific areas inside the Globe as triggers for spatialized audiovisual effects.
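A hedged sketch of how such trigger zones might be represented, again purely illustrative: circular regions are defined on the same 2D plane, and each detected blob centroid is counted against the zone it falls inside. The zone names, positions, and radii here are made up for the example.

```python
import math

# Hypothetical trigger zones on the Globe's floor plane (units in meters).
ZONES = {
    "north_pad": {"center": (0.0, 3.5), "radius": 0.75},
    "south_pad": {"center": (0.0, -3.5), "radius": 0.75},
}

def occupied_zones(centroids):
    """Return {zone_name: number of people currently standing inside it}."""
    counts = {name: 0 for name in ZONES}
    for bx, by in centroids:
        for name, zone in ZONES.items():
            cx, cy = zone["center"]
            if math.hypot(bx - cx, by - cy) <= zone["radius"]:
                counts[name] += 1
    return counts
```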
In some cases, as shown below, if the right number of users stood in the correct places, this would trigger an "Easter Egg": a sequence would play back for a few moments, holding for as long as the users stayed in those places.
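That hold-while-occupied behaviour can be modelled as a tiny state machine evaluated every frame with the zone counts from the previous sketch. The start and stop callbacks are placeholders; in a TouchDesigner context they might set a parameter or fire a timer, but that wiring is not shown here.

```python
class EasterEgg:
    """Starts a sequence when enough people occupy the required zones,
    and stops it as soon as the condition is no longer met."""

    def __init__(self, required_zones, required_count, start_cb, stop_cb):
        self.required_zones = required_zones
        self.required_count = required_count
        self.start_cb = start_cb  # placeholder: begin sequence playback
        self.stop_cb = stop_cb    # placeholder: fade the sequence out
        self.active = False

    def update(self, zone_counts):
        # Condition: every required zone holds at least the required headcount.
        met = all(zone_counts.get(z, 0) >= self.required_count
                  for z in self.required_zones)
        if met and not self.active:
            self.active = True
            self.start_cb()
        elif not met and self.active:
            self.active = False
            self.stop_cb()

# Example usage, called once per frame:
# egg = EasterEgg(["north_pad", "south_pad"], 1, start_sequence, stop_sequence)
# egg.update(occupied_zones(blob_centroids(all_points)))
```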