
Interactive Lighting Control is Opening New Doors for LED Applications

This article was written by Limbic Media’s CTO Manjinder Benning and republished with permission from LED Professional Review, an Austria-based publication for innovators in LED technology.

————————————

Advancements in lighting control technology are allowing for sophisticated interactivity in LED mapping. These new technologies are bridging the gap between lighting control and AI, with the ability to analyze and map data input (such as audio) in real-time. Installations driven by interactive LED control technologies have their place in a variety of application spaces. Manjinder Benning, Founder and CTO of Limbic Media, explains how this new technology works, what its applications are, and what the future of interactive lighting control looks like – not only for end-users, but also for lighting designers and technicians.

Interactivity is a growing feature of consumer technology. Public spaces – from shopping malls to schools, hospitals, and entertainment venues – are increasingly designed with human-centric, interactive approaches. Designers are recognizing the value of interactive technology in driving traffic, educating, healing, and entertaining through platforms that engage and connect people on a multi-sensory level.

This trend has only begun to influence LED applications – and new technologies are making interactive LEDs more sophisticated and accessible than ever before. This article describes the relevance of interactive technology in various industries, reviews the existing state of interactive LED mapping, and outlines an autonomous LED mapping technology that expands the current range of interactive LED applications.

 

Interactive stage lighting

 

Interactive technologies are growing in demand

The digital age has allowed anyone to curate information. With limited resources, millions are able to publish content and connect to global networks. People expect a greater level of participation and control over their digital environments. Despite this shift, much of our non-digital experience remains unchanged. As a result, many facets of the real world struggle to stay relevant: retail centres are losing revenue, university enrolments are declining, and community-centred activities are struggling to survive in the Netflix era.

Interactive technologies are becoming more common across spheres of public and private life as organizations work to stay relevant and increase revenue:

  • Voice-controlled smart hubs are growing in popularity in private residences, creating a common interactive interface for a number of domestic devices.
  • Shopping malls are embracing interactive technologies, such as virtual try-on mirrors, interactive marketing displays, interactive LEDs on building facades (Singapore’s Illuma), virtual immersive experiences, and robotics.
  • Some theatres are testing multi-sensory experiences by manipulating temperatures, scents, and tactile experiences.
  • Education institutions of all levels are introducing more hands-on, interactive learning approaches, such as STEAM.

Implementation of these technologies through public art, entertainment, and education has uncovered many benefits. For participants, multi-sensory input elevates entertainment value or, conversely, provides calming synesthesia-like effects. It also appeals to various learning styles [5][6][7] in educational settings. Interactive technologies benefit retail-focused spaces by increasing foot traffic and brand loyalty through customer engagement. They also transform under-utilized civic space into social hubs, improving public safety and revitalizing neighbourhoods.

It is clear that interactive, multi-sensory experiences are poised for rapid growth globally. Traditional sectors such as retail, entertainment, and education are struggling to catch up to our world’s digital transformation. These sectors are utilizing interactive technologies to bridge the gap between the digital and physical world. Modern LED technologies play an important but under-utilized role in interactive experiences.

 

Existing interactive LED technologies are limited

LED technologies have been under-utilized in the interactive marketplace for a number of reasons: interactive LED technology has been limited to simplistic sound-to-light interaction – and even in this application, achieving interactions is a laborious and expensive process.

 


 

Until now, interactive LED technology has been largely realized through automatic music-to-light mapping. Devices that drive light fixtures from musical input, known as light organs, were first presented in a 1929 patent, which mechanically modeled light from audio frequencies. A 1989 patent employed electrical resonant circuits to respond to low, medium, and high frequencies. Modern, digital music-to-light mapping systems have a number of advantages over these early systems: computers can process audio digitally in real-time and extract control signals (energy in certain frequency bins, or tempo, for example) to map lighting schemes more meaningfully.

Some modern lighting control equipment, including hardware and software lighting consoles and VJ software systems, provide designer interfaces to map beat or frequency-based control signals to parameters that modulate lighting. For example, designers can map the amplitude of a 60-100 Hz frequency bin to DMX fixture brightness. This would create a visual “pumping” effect in response to bass.
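As a rough illustration of this kind of manual mapping, here is a minimal sketch in Python (hypothetical, not taken from any particular console or VJ product) that computes the energy of a 60-100 Hz band in one audio frame and scales it to an 8-bit DMX brightness value:

```python
import numpy as np

SAMPLE_RATE = 44100   # assumed audio sample rate in Hz
DMX_MAX = 255         # 8-bit DMX channel range

def bass_to_brightness(frame, low_hz=60.0, high_hz=100.0):
    """Map the energy of a low-frequency band in one mono audio frame to a DMX brightness value."""
    spectrum = np.abs(np.fft.rfft(frame))                       # magnitude spectrum
    freqs = np.fft.rfftfreq(len(frame), d=1.0 / SAMPLE_RATE)    # bin centre frequencies
    band = (freqs >= low_hz) & (freqs <= high_hz)               # select the 60-100 Hz bins
    band_energy = spectrum[band].mean() if band.any() else 0.0
    # Normalize against the whole spectrum so overall loudness changes don't saturate the fixture.
    level = min(band_energy / (spectrum.mean() + 1e-9), 1.0)
    return int(level * DMX_MAX)

# Example: a synthetic frame containing an 80 Hz "kick" drives the brightness near its maximum.
t = np.arange(2048) / SAMPLE_RATE
print(bass_to_brightness(0.8 * np.sin(2 * np.pi * 80 * t)))
```

In a live setup, the returned value would be written to the fixture’s brightness channel on every audio frame, producing the “pumping” effect described above.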

This paradigm of manually connecting simple control signals – most often derived from the incoming audio signal’s frequencies – is closely modeled after the original light organ techniques of the 20th century, and there has been little innovation in the field since its inception. In addition, mapping light interactions using this method is time-consuming for designers and, as a result, costly for consumers.

 

Potential beyond music-to-light mapping

Beyond music-to-light mapping for LED systems, there is great potential for other interactive data inputs. In recent years there has been an explosion of low-cost sensor technologies coupled with easy-to-use microcontrollers and single-board computers such as the Raspberry Pi. These technologies are capable of sensing data inputs from physical environments more cheaply, accurately, and easily than previously possible.

In terms of LED interactivity and mapping, data inputs could include:

  • Audio
  • Voice recognition
  • Motion detection
  • Data streams (from social media or other live inputs such as weather patterns)

Some commercially-available software products, such as the Isadora system, enable complex input/output system building. This allows designers to map a variety of inputs (such as sensors) to multimedia outputs, such as projections or audio effects. Again, using LEDs as output is largely unexplored.

Although very well designed and capable of dealing with complexity, existing systems still require expert designers to inform mappings between inputs and outputs, and to direct visualizations as inputs change and evolve. No existing technology has been capable of autonomously listening to data input, monitoring output, and learning to make intelligent decisions to map LED visuals over time.

 

Interactive DJ lighting

 

This article discusses a new paradigm in interactive LED control: artificially intelligent systems that eliminate the programming expertise, time, and cost required to create advanced interactive LED experiences. Such a system intuitively recognizes distinct input features (from audio or otherwise) in real-time. Input features are mapped according to human-based preferences, without direct human control. This makes interactive LED applications more accessible and less costly to a variety of industries seeking interactive solutions, while elevating user experiences.

 

LED control that maps inputs and drives output autonomously

Imagine an LED installation that intuitively “listens to” audio or other real-time data input and adapts accordingly, learning over time, with no human intervention. This new approach to interactive LED mapping uses an “intelligent” system based mainly on three key elements.

Fig. 1: Interactive lighting platform

 

The system is composed of:

  • A temporal correlation unit (110). This acts like a brain, inputting, processing, recording, retrieving, and outputting data. Data inputs can include audio (either from a microphone or line-in audio), motion detection sensor inputs from a camera, data streams (from weather patterns, the stock exchange, tallied votes, or social media, for example), or interfaces that request data input from an audience
  • An oscillator (140). This perturbs the inputs, introducing variation to the LED output. This produces light interactions that are lively, dynamic, and less predictable to the viewer
  • A signal mixer unit (150). This mixes input signals in various ways to create different outputs

The temporal correlation unit references input signals for distinct features, and determines how the oscillator and signal mixer unit behave in response. The system also determines how the output signal spans through a specific color space.
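To make the signal flow of Figure 1 concrete, the sketch below is a deliberately simplified, hypothetical reading of the three blocks described above (the class names, weighting rule, and colour ramp are assumptions for illustration, not the patented implementation): the temporal correlation unit derives mixing weights by comparing incoming features to a stored reference, the oscillator perturbs the result, and the mixer produces an RGB value.

```python
import math
import random

class TemporalCorrelationUnit:
    """Block 110: chooses mixing weights by comparing current features to a stored reference."""
    def __init__(self, reference):
        self.reference = list(reference)

    def weights(self, features):
        # Features that resemble the stored reference receive more weight in the mix.
        return [1.0 / (1.0 + abs(f - r)) for f, r in zip(features, self.reference)]

class Oscillator:
    """Block 140: a slow sinusoidal perturbation that keeps the output from looking static."""
    def __init__(self, rate_hz=0.2):
        self.rate_hz = rate_hz
        self.phase = random.uniform(0.0, 2 * math.pi)

    def value(self, t):
        return 0.5 * (1.0 + math.sin(2 * math.pi * self.rate_hz * t + self.phase))

class SignalMixer:
    """Block 150: blends weighted input channels into a single RGB triple."""
    def mix(self, features, weights, wobble):
        level = sum(w * f for w, f in zip(weights, features)) * wobble
        level = max(0.0, min(level, 1.0))
        return (int(255 * level), int(180 * level), int(60 * level))  # a warm colour ramp

# One frame of the pipeline: audio/sensor features in, RGB out.
tcu = TemporalCorrelationUnit(reference=[0.6, 0.3, 0.8])
osc = Oscillator()
mixer = SignalMixer()
features = [0.5, 0.2, 0.9]
print(mixer.mix(features, tcu.weights(features), osc.value(t=1.0)))
```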

Fig. 2: Interactive lighting platform

Figure 3 expands on potential external inputs (120). As with prior technology, the system analyzes binned frequency content (210, 235) and time domain envelopes (215). In addition, the system recognizes and classifies higher-level musical features (220).

Some examples include:

  • Percussion/other specific instruments
  • Vocal qualities
  • Musical genre
  • Key
  • Dissonance and harmony
  • Sentiment
  • Transitions (e.g. from verse to chorus)

The system also interprets nonmusical data inputs (225) in real-time. This includes non-musical audio features, such as speech recognition or environmental sounds (rain, wind, lightning, or footsteps), or the other non-audio data inputs previously described.

Features can be reflected as LED-mapped output in many ways. LED parameters such as motion, color palette, brightness, and decay adapt to reflect specific data input features. This creates LED displays that are more intuitively mapped to human preferences than previous light-mapping technology. The system’s ability to map intuitively and autonomously in real-time heightens the users’ multi-sensory experience and the potential for LED interactivity.
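As a purely illustrative example of this kind of feature-to-parameter mapping (the feature names and preset values below are assumptions, not the system’s actual presets), each recognized input feature could select a preset of motion, palette, brightness, and decay values:

```python
# Hypothetical presets: each recognized input feature class selects a set of LED parameters.
FEATURE_PRESETS = {
    "percussive":   {"motion": "pulse",   "palette": "high-contrast", "brightness": 1.0, "decay_s": 0.15},
    "vocal":        {"motion": "wash",    "palette": "warm",          "brightness": 0.7, "decay_s": 1.2},
    "chorus_start": {"motion": "sweep",   "palette": "saturated",     "brightness": 0.9, "decay_s": 0.5},
    "rainfall":     {"motion": "trickle", "palette": "cool-blue",     "brightness": 0.5, "decay_s": 2.0},
}

def parameters_for(detected_features):
    """Merge presets for all currently detected features, favouring brighter, snappier values."""
    merged = {"motion": "idle", "palette": "neutral", "brightness": 0.2, "decay_s": 3.0}
    for name in detected_features:
        preset = FEATURE_PRESETS.get(name)
        if preset is None:
            continue  # unknown features leave the current parameters untouched
        merged["motion"] = preset["motion"]
        merged["palette"] = preset["palette"]
        merged["brightness"] = max(merged["brightness"], preset["brightness"])
        merged["decay_s"] = min(merged["decay_s"], preset["decay_s"])
    return merged

print(parameters_for(["vocal", "percussive"]))
```

The point of the intelligent system described here is that such a mapping is learned and refined from data and audience feedback over time rather than hand-authored by a designer.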

 

Referenced data input determines LED mapping

An intelligent LED mapping system relies on referenced input signals. The system analyzes new data input for familiar features based on referenced input stored in the temporal correlation unit. Over time, the system optimizes database searches. This allows it to predict input features from audio or other data streams, and create a more intuitive, real-time visual LED output on its own.


When the temporal correlation unit has been adequately trained, it can predict human listeners’ preferences and map LEDs accordingly for musical, other audio, and non-audio inputs. This provides a more intuitive, engaging user experience with no need for customized LED programming knowledge or real-time human control. The temporal correlation unit trains itself to map output effectively in three ways:

  • First, the system can act as a neural network, comparing new data inputs to similar inputs stored in its database. New output features are modeled after those of referenced inputs, allowing the system to quickly retrieve previous lighting output configurations rather than create them on the fly. Previous technology requires a technician to manually choose which lighting cues to load and when, whereas this system chooses automatically. Neural networks can also be supervised: a supervised system recognizes specific data input features that indicate audience approval of the mapped LED output, such as manual switches, face recognition, or voice recognition that indicates emotional states. This further refines the system’s output choices according to human preferences.

 

  • Second, the system can utilize evolutionary algorithms. These algorithms, used widely in artificially intelligent systems, are modeled after selection mechanisms found in evolutionary biology (firefly attraction, ant pheromone trail setting, and bird flocking, for example) to optimize data searches. Evolutionary algorithms, such as a genetic algorithm, allow an LED control system to independently find and select the most effective lighting outputs without human control. As with a supervised neural network, a system governed by genetic algorithms seeks specific audience cues that suggest approval of the mapped LED output. These cues serve as a fitness function, training the temporal correlation unit to respond to real-time input signals; a minimal sketch of this approach follows this list.

 

  • Third, and similarly to evolutionary algorithms, the system can utilize interacting intelligent agents. Agents also mimic natural patterns in code by responding to specific, predefined rules (e.g. a specific frequency produces a certain color space). Each agent applies a set of rules to generate temporal sequences for LED mappings, again seeking audience cues to train the system to respond appropriately to input. Agent rules can be parametric; for example, rule parameters can be determined by the physical arrangement of LEDs in 2D or 3D installations. This suite of techniques, known as nature-inspired algorithms because they are modeled after naturally occurring patterns, is a credible source of generative content for lighting output and works particularly well with large numbers of LED pixels.
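As a concrete, heavily simplified illustration of the evolutionary approach in the second point above, the sketch below evolves a small population of candidate lighting parameter vectors. The fitness function here is a stand-in fixed target; in the system described above, the score would instead come from audience-approval cues gathered in real time.

```python
import random

# Stand-in for "what the audience approves of" (e.g. hue, motion speed, brightness).
APPROVED = [0.9, 0.4, 0.7]

def fitness(candidate):
    """Higher is better. A deployed system would derive this from audience cues, not a fixed target."""
    return -sum((c - a) ** 2 for c, a in zip(candidate, APPROVED))

def mutate(candidate, amount=0.1):
    return [min(1.0, max(0.0, c + random.uniform(-amount, amount))) for c in candidate]

def evolve(generations=60, population_size=20):
    population = [[random.random() for _ in range(len(APPROVED))] for _ in range(population_size)]
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)
        survivors = population[: population_size // 2]                       # selection
        offspring = [mutate(random.choice(survivors)) for _ in survivors]    # reproduction with mutation
        population = survivors + offspring
    return max(population, key=fitness)

print(evolve())   # converges toward the approved parameter vector
```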

 

Implications of technology for industries and end-users

An intuitive method of mapping LEDs according to human preferences means that multi-sensory, interactive lighting is more immersive and emotive than ever before. An intelligent, autonomous LED control system has many benefits and applications for end-users and various sectors.

Interactive LED lighting at a climbing gym in Victoria, BC

Benefits for end-users:

  • Educational programs can use such systems to leverage multi-sensory, interdisciplinary curriculums that address various learning styles
  • Holiday, architectural, and other lighting companies that already employ LED technologies can use the system to take a more interactive, human-centric approach to design
  • Retail centres can use the system’s interactivity, particularly live social media hashtags as data input, to attract customers and leverage brand presence online
  • Cities can incorporate the system in their efforts to revitalize public space by:
    • Investing in interactive, public art using LEDs
    • Visualizing data gathered through smart city initiatives
    • Attracting foot traffic to business areas
    • Improving public safety
    • Placemaking and creating community focal points
  • Clubs, venues, and AV teams can quickly and effectively create improved visual effects for live performances and DJs
  • Public centres and exhibition venues that adhere to redesign cycles can adapt the system with changing data input types and LED configurations to refresh displays year after year
  • Non-technical users are able to access sophisticated interactive technology without custom programming or design knowledge
  • Users can avoid the time and cost associated with creating and maintaining interactive LED displays
  • LED displays can be controlled and scaled across multiple locations at a lower cost

 

Implications for technicians

It is often assumed that technological advances, particularly using AI, have the potential to destroy jobs. The described system simplifies or removes the programming process, making the technology more accessible and affordable than previous interactive LED technologies – but this does not necessarily imply job obsolescence for lighting designers and technicians. The technology will only change and improve the state of the art in the future, providing a number of benefits for industry professionals.

Benefits for professionals:

  • Provides a sophisticated tool for lighting designers that can be used in conjunction with existing professional lighting protocols such as DMX
  • Saves lighting designers programming time
  • Allows designers to scale large projects at a lower cost
  • Opens the door to a wider variety of LED applications in industries outside the current status quo
  • Allows designers to manipulate lighting schemes with data input other than music
  • Allows designers to improve or incorporate audience interactivity

————————————

Interactive technologies are poised for global growth, allowing various industries to offer engaging, multi-sensory experiences in non-digital settings. Applying interactivity to LED technologies opens a variety of doors into a number of sectors looking to attract, engage, and educate communities in settings that struggle to stay relevant in our digital world.

Until recent advancements in LED control technology, mapping data input to lighting design has been limited to audio input using age-old light organ techniques. While low-cost, easy-to-use microcontrollers and single-board computers such as the Raspberry Pi have opened new doors in LED mapping, the process still requires skilled lighting designers and programmers. The time and expense of creating and maintaining interactive LED displays using these methods has made interactive LED applications inaccessible to a variety of industries and audiences.

interactive public art

A new technology, outlined in “System and Method for Predictive Generation of Visual Sequences,” addresses these barriers to new LED applications by controlling LED interactivity autonomously yet elegantly. The system analyzes data input, including music, non-musical audio, and non-audio data streams for distinct input features. Input features are mapped into distinct LED output parameters based on human preferences, and indexed into the system’s database. This indexing allows the system to autonomously predict upcoming data input and intelligently refine its output over time.

The system’s design avoids the need for time-consuming human programming and maintenance, creates LED mapping that looks aesthetically detailed and intuitive, and allows real-time interaction from a variety of data inputs. This has clear benefits for the LED lighting industry: it opens doors to new applications in various sectors seeking interactive solutions for consumers, creates heightened multi-sensory end-user experiences, and offers a sophisticated tool for lighting technicians and professional designers.

 

References:

[5] Johnson, Gretchen L., and Edelson, Jill R. “Integrating Music and Mathematics in the Elementary Classroom.” Teaching Children Mathematics, Vol. 9, No. 8, April 2003, pp. 474-479.
[6] Wilmes, Barbara, Harrington, Lauren, Kohler-Evans, Patty, and Sumpter, David. “Coming to Our Senses: Incorporating Brain Research Findings into Classroom Instruction.” Education, Vol. 128, No. 4, Summer 2008, pp. 659-666.
[7] Kast, Monika, Baschera, Gian-Marco, Gross, Markus, Jäncke, Lutz, and Meyer, Martin. “Computer-based learning of spelling skills in children with and without dyslexia.” Annals of Dyslexia, 12 May 2011, DOI: 10.1007/s11881-011-0052-2.

From Glowflow to Burning Man: The Evolution of Interactive Media

Want to learn more about interactive media? Contact us about Aurora.

——————

On day 3 of the 2012 Coachella Valley Music and Arts Festival, onlookers were captivated by a computer-generated recreation of Tupac Shakur performing with Dr. Dre and Snoop Dogg. The animation used projection mapping in combination with a theatrical technique called “Pepper’s Ghost” to create a 3D holographic effect. The project employed a team of 20 artists, lighting designers, and technicians to create an unexpected, immersive audience experience.

Festival season is upon us, and with it comes more opportunities to showcase and explore interactive media. From music, to performance art, to technology-based installations, the event lead-up is a full-time engagement for artists, technologists, and festival organizers seeking to stand out in what has become a multi-billion dollar industry worldwide. Technology has hugely influenced festivals’ ability to engage audiences with interactive media. Where has this attraction for interactive and technology-driven media come from, and how is it impacting other public spaces?

 


Virtual Tupac at Coachella 2012

 

Interactive Media is Not A New Concept

Technological developments of the last half-century have breathed new life into the concept of interactivity. Physically and emotionally participating in entertainment, once the norm, became less common after the relatively recent advent of “passive” entertainment like television and cinema.

 

“The reason we suddenly need such a word [as interactivity] is that during this century we have for the first time been dominated by non-interactive forms of entertainment: cinema, radio, recorded music and television.

Before they came along all entertainment was interactive: theater, music, sport — the performers and audience were together, and even a respectfully silent audience exerted a powerful shaping presence on the unfolding of whatever drama they were there for.

We didn’t need a special word for interactivity in the same way that we don’t (yet) need a special word for people with only one head.”

—Douglas Adams, How to Stop Worrying and Learn to Love the Internet

 

Technology moved us away from interactive media, and ironically, technology is orienting us back to those original values when it comes to art and leisure—perhaps in an even bigger way than before TV. As much as technology has the power to isolate us, interactive media today is also more accessible, more invigorating on a multisensory level, and more likely to establish a genuine human connection than ever before.

 

Technology Has Revitalized Interactive Media

Using technology to create new forms of interactive media goes back to the mid-20th century. In the 1950s and 60s, Morton Leonard Heilig was one of the first to create VR in response to the passive experience of cinema.

 

“Without the active participation of a spectator, there can be no transfer of consciousness, no art.”

—Morton Leonard Heilig

 

Sensorama, which was patented in 1962, was a prototype for what he imagined would become “experience theatre.” It combined a stereoscopic 3D colour display, stereo sound, fans, olfactory dispensers, and tilted, vibrational seating to provide single viewers with a multisensory experience over the course of a short film. Heilig was unable to find funding to get Sensorama to industry players, and the project dissolved.

 


Morton Heilig’s Sensorama

 

Seven years later, Myron Krueger developed one of the earliest forms of computer-based interactive art. Glowflow was first installed at the University of Wisconsin’s Memorial Union Gallery. Pressure-sensitive pads were activated by viewers’ footsteps, triggering a real-time visual response from phosphorescent tubes and an aural response from a Moog synthesizer. Glowflow was one of several such interactive environments that led to Krueger’s cornerstone project, Videoplace, in 1988. Videoplace is an artificial reality laboratory that creates reactive light art out of viewers’ motion.

Much of Krueger’s work was motivated by a desire to redesign computers by addressing features that take away from an inherent human desire to connect and interact.

 

“There were things I resented about computers. I resented the fact that I had to sit down to use them. I resented the fact that I was using a hundred-year-old device to operate them—a keyboard—and the fact…that it was denying that I had a body of any kind, and that it was all perceptual, sort of, symbolic.”

—Myron Krueger

 


Myron Krueger’s Videoplace

 

Krueger modeled Videoplace after the relationship that artists and musicians have with their tools, seeking to create a type of computer that people could experience rather than use for the sole purpose of efficiency. The first rendition of Videoplace superimposed Krueger’s hand-drawn data tablet doodles onto a screen in the Memorial Union Gallery a mile away. The doodles would appear to interact with viewers’ shadows, which were also projected onto the screen in real-time. Almost by accident, Krueger noticed that viewers were most engaged when their motion appeared to create the doodles.

 

“We discovered that there was this very natural desire to identify with the image on the screen. Their image was them, and they expected it to do things in the video world as much as it did in the physical world. It was as if evolution had prepared us for seeing ourselves on television screens combined with computer images.”

 

Suddenly, here was a real, tangible example of how technology had the potential to bring human connection full-circle—back to what interactive media had done for us prior to the age of passive media. From VR to public art, interactive media has come a long way since Videoplace.

 

Burning Man: A Lasting Example of Interactive Media’s “Rebirth”

Unlike static art, interactive media is unique in involving the viewer in its creation, forming a platform for human connection and community. Passive media offers audiences a finished piece to derive meaning from, rather than involving them in the media’s creation and building a community from that involvement. A good example of the rebirth of interactive media, especially as it relates to the growth of art festivals, is Burning Man.

On June 22, 1986, Larry Harvey and Jerry James built an 8-foot human figure out of scrap wood in their Noe Valley basement. They hauled the wooden man down to Baker Beach and quickly drew an audience of close to 40 people as flames engulfed the figure. Before you could say gasoline, the spontaneous hootenanny was singing a fire-themed tune on the fly, and a woman was literally hand-in-hand with the pyro-masterpiece.

 

“That was the first spontaneous performance…that was the first geometric increase of Burning Man. What we had instantly created was a community. And…you know if we had done it as an art event, people would have come, and come to the gallery or something, and said ‘It’s very interesting, perhaps a little derivative, what are you going to do next?’”

—Larry Harvey

 

The festival has since grown into a 70,000-person gathering based on the values of immediacy, participation, communal effort, radical self-expression and self-reliance, egalitarianism, and creativity—so unsurprisingly, the festival has become a global platform for the convergence of art and innovative interactive media, informing values within the tech industry (and perhaps vice versa). What began as a novel concept associated with underground movements became its own city with the power to impact the culture and values behind one of North America’s largest industries.

 

 

Interactive Media’s Impact

Aside from influential Burners taking those core values back to the office after Labour Day each year, the impact of cultural phenomena like Burning Man has been a driving force behind the evolution of interactive media. Interactive media has re-infiltrated mainstream society, evolving in just a few decades from what was once associated with counterculture and festivals or niche, university-affiliated galleries like Videoplace.

Interactive technology and art are increasingly incorporated into civic space and public institutions like art galleries, science centres, shopping malls, and schools. Those who design and coordinate these spaces are realizing the advantage that interactivity has over passive forms of media in building community and bringing audiences back. Growing public interest in interactive media is also expanding the tech industry, driving advances in interactive technologies like wearable tech, sound-to-light mapping, motion tracking, VR, and AI.

 


Montréal’s Impulse (photo: dezeen.com)

 

Passive media is still the norm for a culture built on Netflix. But the values behind traditional forms of interactive media have been experiencing a rebirth over the last few decades, thanks to innovators like Myron Krueger and events like Burning Man—and the technology behind our ability to realize those values is growing every day.

An Interactive Lighting Case Study with Aurora

Interactive Lighting Case Study

 

Real estate developers often invest in hoardings for big projects—on-site marketing signage that describes future developments. As opposed to online, radio, and print ads, hoardings are highly cost-effective marketing investments for developers, providing large-scale project awareness 24/7. Vancouver-based developer Belford Properties took their hoarding for Sun Towers Metrotown to the next level. Faced with the challenge of promoting Sun Towers while building long-term community relationships throughout the development, Belford partnered with a local organization and turned the hoarding into an interactive public art display.

The result was a 30×170-foot billboard combining community art with interactive technology. The billboard transformed public space across from BC’s largest shopping centre, Metropolis at Metrotown, into an interactive boulevard. The Metrotown project inspired Limbic Media’s Interactive Art Wall concept, an engaging art installation with multiple applications for civic space, retail and holiday displays, and any organization looking to increase ROI through public engagement and community placemaking. This week, we are taking you through Limbic Media’s process for this project, from the initial collaboration and concept to the final installation.

 

 

A Concept for Community Building

 

“If you want 10 years of prosperity, grow trees. If you want 100 years of prosperity, grow people.”

 

This is the proverb that initially inspired Belford to collaborate with Burnaby Neighbourhood House, a volunteer-driven social service agency. BNH supports programs and services that address local community needs. Understanding that youth have an enormous impact on community futures, the two organizations joined forces to support youth art education over the course of Belford’s 3-year development.

 

“Belford believes that youth can have a huge impact on community, helping to shape the future with new ideas through education and art. An investment in youth and education is much more rewarding than one can imagine, especially in the community that they grow up in. That type of investment is something we keenly sought out, hoping to work with an organization that places such an importance on education and art with children in the neighbourhood. We found that organization, Burnaby Neighbourhood House, and let the kids do their thing.”

—Belford Properties

 

Interactive Lighting Case Study Public Art

 

Their vision resulted in a public art concept built around the theme of rain + sunshine = growth to encourage yearly donations to BNH’s youth art programs. The 3-year project has three phases: the first, inspired by Greater Vancouver’s notoriously heavy rainfall, features umbrellas and rainbows. The second phase, scheduled for Spring 2019, features sunshine-themed drawings, and the third will display the fruits of that nourishment—growing flowers, bees, and nature. BNH and Belford commissioned art for the first phase from children currently in BNH arts programs. Their pieces were then scaled to fit the hoarding.

 


 

Interactive public art not only fosters a sense of community and placemaking, but also increases brand awareness, foot traffic, public safety in surrounding areas, and overall ROI. Hangar 18, the project’s design and branding consultant, reached out to us to create an interactive lighting component for the billboard.

 

Designing and Integrating the Aurora Platform

Limbic Media’s role was to design the lighting component of the installation and integrate an Aurora system with a coin box to allow for donations. The vision was to literally “make it rain” when coins are inserted, offering passers-by a lightshow in exchange for their donations. Limbic Media used Hangar 18’s concept drawing as a template for the lighting design.

 


 

Our team at Limbic Media was responsible for designing the layout of Minleon Pebble Light strands over the concept art, speccing out project requirements, and doing custom programming to evoke rainfall and rainbow effects. The project required 61 light strands of various lengths, totaling 2,075 pebble lights. Projects of this scale require multiple Network Distribution Boxes (NDBs) along with a network switch to effectively supply power and data from Aurora to all the lights. The next step was to parse the billboard’s light strands into 9 sections, one section of light strands for each NDB. Because all the technical components would be hidden behind the billboard, the project also required leader cables of various lengths to connect the beginning of each light strand with its respective NDB.
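To give a sense of how strands might be grouped into those 9 NDB sections, here is a hypothetical balancing sketch (the individual strand lengths below are invented; only the strand count, approximate total pixel count, and number of NDBs come from the project):

```python
import random

NUM_NDBS = 9  # one section of light strands per Network Distribution Box

def assign_strands(strand_lengths, num_ndbs=NUM_NDBS):
    """Greedily assign each strand to the section with the fewest pixels so far."""
    sections = [{"strands": [], "pixels": 0} for _ in range(num_ndbs)]
    # Placing the longest strands first keeps the greedy balance reasonably even.
    for strand_id, length in sorted(enumerate(strand_lengths), key=lambda item: -item[1]):
        target = min(sections, key=lambda s: s["pixels"])
        target["strands"].append(strand_id)
        target["pixels"] += length
    return sections

# 61 strands of varying length, totalling roughly 2,075 pebble lights (lengths invented for illustration).
random.seed(1)
lengths = [random.randint(20, 48) for _ in range(61)]
for i, section in enumerate(assign_strands(lengths), start=1):
    print(f"NDB {i}: {len(section['strands'])} strands, {section['pixels']} pixels")
```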

 


 

The Interactive Art Wall was Limbic Media’s first time integrating Aurora with a coin box. Our lead design engineer created a new Aurora pattern to achieve a rainfall effect for the pebble lights. The coin box was then integrated with its own microcontroller, programmed to speak to Aurora: in resting mode, Aurora tells the light strands to evoke a subtle version of the rainfall pattern. When coins are donated to the box, it triggers an algorithm that intensifies the rainfall pattern’s brightness and speed, which slowly diminishes until the more subtle resting pattern returns.
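The coin-triggered behaviour can be pictured as a simple intensity envelope. The sketch below is a hypothetical illustration of that logic (the constants and decay rule are assumptions, not Aurora’s actual firmware): each coin pushes the rainfall pattern to a peak, and the intensity decays exponentially back toward the resting level.

```python
RESTING = 0.2          # subtle resting rainfall intensity
PEAK = 1.0             # intensity immediately after a coin drop
DECAY_PER_STEP = 0.9   # exponential decay factor applied each time step

def step(intensity, coin_inserted):
    """Advance the rainfall intensity by one time step."""
    if coin_inserted:
        return PEAK
    # Decay toward the resting level, never dropping below it.
    return RESTING + (intensity - RESTING) * DECAY_PER_STEP

intensity = RESTING
for t, coin in enumerate([False, True, False, False, False, True, False, False]):
    intensity = step(intensity, coin)
    print(f"t={t}  coin={coin}  intensity={intensity:.2f}")
```

In the installation, this intensity value would modulate the brightness and speed of the custom rainfall pattern.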

 


 

Limbic Media’s design process was a team effort, involving sales staff, engineers, and a technical lead to spec out and price the project—all while liaising with Hangar 18 and Belford to meet the project’s vision and timeline. Once the installation was set up at Limbic Media and passed QA, we sent the equipment with our lead design engineer to oversee and support the onsite installation process alongside Belford and make final tweaks to the project’s custom programming.

 


 

Project Outcomes

The Interactive Art Wall was a huge success as an alternative to the average hoarding. Unlike a regular marketing billboard, the display’s interactivity increased community-building ROI in addition to potential monetary gains. Lighting and interactivity amplified Belford’s marketing for the Sun Towers development by encouraging public participation in the display, and also increased awareness of BNH and their impact on community initiatives. By providing an opportunity for hashtags and social media engagement, the interactive display created an additional marketing tool for both Belford and BNH. The interactive hoarding captured Belford’s vision as a developer that is mindful of its surrounding community and involved in its long-term, people-based goals.

 

“With the addition of these beautifully installed LED lights around the drawings on the wall, we are able to raise public awareness not only in the daytime but also attract lots of attention at night. Our Art Wall has soon become a popular sight visiting point in the area which gives us chances to interact with the public. The lights are one of the key elements in this charity fundraising event. On behalf of Belford Properties, we are very pleased with how the addition of the lighting has attracted a tremendous amount of attention to our Charity Art Wall project.”

—Chris Ba, Belford Properties

 

Providing a reward for donations to the initiative in the form of a light show also piqued public interest from passers-by in a way that stand-alone donation boxes can’t. The hoarding brightened the thoroughfare at Beresford Street, potentially increasing return foot traffic to the area. Overall, interactivity at the Metrotown installation played a crucial role in placemaking and fostering community development out of what would otherwise remain a typical urban development on an ordinary roadway. If you find yourself near Metrotown Station over the next few years, check out the installation, make a donation, and be sure to share your interaction on social media with #celebratebby.

 


 

The Interactive Art Wall concept has potential across multiple applications. Light fixture styles and custom patterns can be adapted for unique themes and mounted against a variety of backdrops and settings. If you are interested in combining interactivity with a similar concept or initiative, contact us today to brainstorm ideas.

 

Photos by Mandy Jin at WeTopia.

CASC 2018 Conference: LHULH’UTS’UT’EN

Representatives from Canadian science centres and museums came together last week to embody LHULH’UTS’UT’EN—working together—at this year’s Canadian Association of Science Centres’ 2018 conference. CASC attendees came to Prince George this year to seek inspiration, network, and learn about the challenges facing science centres and museums across the country.

Limbic Media’s Marketing Associate, Deanna Foster, and Lead Design Engineer, Gabrielle Odowichuk, attended the event with an Aurora Jam Tent (check out our Aurora Jam Tent video from Tectoria 2018). Here are a few highlights from their time up north.

 

CASC 2018 Conference

 

Welcome Reception

This year’s CASC attendees had a chance to be kids again in the Two Rivers Art Gallery’s Maker Space. Activities ranged from felting, to learning code, to traditional Lheidli T’enneh wood carving. Participants also enjoyed traditional drumming by the talented Khast’an drummers. Check them out for a truly mesmerizing show!

 


 

The Way-Late Play Date

Clad in their finest plaid, CASC-goers were invited to eat, drink, and wield some good ol’ saws and axes. Logger sports and a relay race kicked off the night, followed by dancing and Northern BC’s finest brews. The Exploration Place provided an interactive setting to network and learn about industry trends and challenges. Highlight of the evening? Chocolate-covered bacon.

 


 

The Exhibitors

Limbic Media’s Jam Tent, an Aurora-lit enclosure filled with musical instruments, was among a variety of science and museum exhibitors. Little Ray’s Nature Centres (aka. Average-sized Ray’s Nature Centres) provides permanent and traveling hands-on, zoological education exhibits. Sadly, Shane from Little Ray’s was unable to bring a sloth to CASC—but here’s hoping for next year.

Big shoutout to Pathfinders Designs, who was a huge help in Limbic Media’s Jam Tent setup. Pathfinders, based on Vancouver Island, designs and creates wooden science kits.

 


 

CASC 2018 was an exciting reminder of the open-mindedness and innovative thinking of Canada’s industry leaders in science centres and museums. Each attendee left with new connections and inspiring ideas for their home audiences. We had a lot of fun seeing everyone have jam time in our Aurora Jam Tent, and hope to see sound-to-light interactivity infiltrating more science centres and museums across the country in the next year!

 

CASC 2018 Aurora sound-to-light engine reacting to music

 

Contact us today to plan a Jam Tent for your next event!
