Babbling Head

The Babbling Head Robot, better known as Babbling Head, is one of The Robot Group’s iconic exhibits.

Babbling Robot Head is perhaps one of the finest examples of our group’s efforts to meld art and technology. It is certainly one of the classiest. Brooks Coleman, a genius at wood-crafting, designed and crafted the various wood elements that went into this piece. These elements range from woods as rich as purple heart and oak to ones as raw as tanzania root.

The base of the piece is purple heart wood, and the neck and collar bones are pow amavia woods. The organic brain in this robot is a piece of tanzania root that Brooks filed and fitted perfectly into place.
The rawness and natural ridges of the root give the robot a convincingly organic-looking brain.

Brooks, who also does metal forging, tailored the metal nose and half-head piece for Babbling Head, while Laurie Davis forged the eyes.

The eyeballs are hammered silver with LEDs illuminating the iris centers. The eyeballs are also servo-controlled and capable of rolling.

The neck and the lips are made of soft plastic for mobility. The lips are molded black silicone, and the neck piece is actually a dryer vent pipe.

The servos are each fitted with a winch pulley which Brooks turned from nylon. He machined a custom tool to create the spline that fits the standard Futaba servo shaft.

This robot is fitted with a Mini SSC II serial servo controller board that operates the hobby servo motors moving the lips, eyeballs, and neck.
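
A minimal sketch of how a host computer might drive servos through a Mini SSC II is shown below. It assumes the board’s documented three-byte serial protocol (a sync byte of 255, then the servo number, then a position from 0 to 254) and a pyserial connection; the port name and channel assignments are hypothetical, not the exhibit’s actual wiring.

    import serial  # pyserial

    # Hypothetical port and channel assignments -- the exhibit's real wiring
    # is not documented here.
    PORT = "/dev/ttyUSB0"
    JAW_SERVO = 0        # channel moving the lips
    EYE_SERVO = 1        # channel rolling one eyeball

    def set_servo(ser, channel, position):
        """Send one Mini SSC II command: sync byte 255, servo number, position (0-254)."""
        ser.write(bytes([255, channel, position]))

    with serial.Serial(PORT, 9600) as ser:   # the board runs at 2400 or 9600 baud
        set_servo(ser, JAW_SERVO, 200)       # open the mouth
        set_servo(ser, EYE_SERVO, 127)       # center the eye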

The DECtalk provides the computer voice that brings the Babbling Robot Head to life.

Alex Iles was instrumental in developing and programming the original exhibit controller for this piece. Bill Craig assisted with the original programming and the speech synthesis. Later, Eric Lundquist added real-time speech synchronization with the DECtalk.
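
The original controller software and the real-time synchronization code are not reproduced here. As a rough illustration of the overall architecture, the sketch below sends a phrase to a serial-connected DECtalk unit and flaps the jaw servo on a simple timer while the text is spoken; the port names, baud rates, timing, and servo positions are assumptions, and the timer-based flapping is a naive stand-in for the exhibit’s actual synchronization.

    import time
    import serial  # pyserial

    def babble(phrase, dectalk_port="/dev/ttyUSB1", ssc_port="/dev/ttyUSB0",
               jaw_channel=0, flaps=6):
        """Speak a phrase on a serial DECtalk and crudely flap the jaw servo."""
        with serial.Serial(dectalk_port, 9600) as tts, \
             serial.Serial(ssc_port, 9600) as ssc:
            # DECtalk hardware units speak ASCII text received on the serial port.
            tts.write(phrase.encode("ascii") + b"\r")
            for _ in range(flaps):
                ssc.write(bytes([255, jaw_channel, 200]))  # mouth open
                time.sleep(0.15)
                ssc.write(bytes([255, jaw_channel, 60]))   # mouth closed
                time.sleep(0.15)

    babble("Hello, I am the Babbling Head.")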

At times, Babbling Head has been integrated into the Robot Brain project.

When last it spoke, Babbling Head revealed that it was awaiting some surgery. We look forward to its reappearance and to hearing it sing again.

Did we mention Babbling Head’s popularity? Some notable appearances:

[Text and images originally from http://wiki.therobotgroup.org/wiki/BabblingHead]

Venus Project

The Venus Project was formed in the spring of 1989 as an independent cyberart coalition whose members embraced the techno-aesthetic philosophy of a positive human/computer synergy. This philosophy found expression in the use of technology to create playful, educational opportunities for interaction shared with others. The Venus Project members included Bob Nagy, Karen Pittman and John Witham.

VP-1, the first public exhibit designed by the Venus Project, appeared at RoboFest (the 1989 show at Discovery Hall in Austin, TX).

The exhibit was a computer-mediated interactive environment for producing musical sounds by body movements and for manipulating computer graphics with those sounds.

The participant donned a helmet that actuated a MIDI-controlled synthesizer sound module. Pressure-sensitive areas beneath the participant’s feet were also connected to the sound module.

The graphics display was driven by software that changed its output based on the pitches and interrelation of the audio signals.
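
The exact 1989 sensor-to-synthesizer signal path is not documented here, but the basic idea of mapping helmet and foot-pad triggers to notes on a MIDI sound module can be sketched with a modern MIDI library. The sensor names, note assignments, and use of the mido library below are illustrative assumptions rather than a description of the original hardware.

    import mido

    # Hypothetical sensor-to-note mapping; the real VP-1 assignments are unknown.
    SENSOR_NOTES = {
        "helmet_tilt": 60,   # middle C when the helmet tilts
        "left_foot": 48,
        "right_foot": 50,
    }

    def trigger(outport, sensor, velocity=100):
        """Send a note-on to the sound module when a sensor fires."""
        outport.send(mido.Message("note_on", note=SENSOR_NOTES[sensor], velocity=velocity))

    def release(outport, sensor):
        outport.send(mido.Message("note_off", note=SENSOR_NOTES[sensor]))

    with mido.open_output() as outport:   # default system MIDI output
        trigger(outport, "left_foot")
        release(outport, "left_foot")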

VP-2 (Sonic Silhouette), the second Venus Project work designed for exhibit at Discovery Hall, was featured in RoboFest 2 (February 1991).

In this installation, body movements were tracked by an overhead video camera and digitized. A grid-shaped graphical interface mapped the digitized movements to the audio synthesizers, causing musical sounds to be produced.

This system acted like a virtual instrument that was played by dancing or moving inside the digitized space.
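
One way to picture the grid interface is as a mapping from a tracked position in the camera frame to a cell, and from the cell to a pitch. The sketch below is a schematic reconstruction of that idea, not the Venus Project’s code; the grid size, frame resolution, and note layout are invented for illustration.

    # Schematic grid "virtual instrument": a tracked (x, y) position selects a
    # cell, and each cell maps to a pitch. All constants here are assumptions.
    ROWS, COLS = 4, 4
    FRAME_W, FRAME_H = 320, 240            # assumed digitizer resolution
    PENTATONIC = [60, 62, 64, 67, 69]      # C major pentatonic from middle C

    def cell_for(x, y):
        """Map a tracked (x, y) position in the frame to a grid cell."""
        col = min(int(x * COLS / FRAME_W), COLS - 1)
        row = min(int(y * ROWS / FRAME_H), ROWS - 1)
        return row, col

    def note_for(row, col):
        """Assign each cell a pitch, rising left to right and bottom to top."""
        index = (ROWS - 1 - row) * COLS + col
        return PENTATONIC[index % len(PENTATONIC)] + 12 * (index // len(PENTATONIC))

    row, col = cell_for(200, 90)
    print(f"cell ({row}, {col}) -> MIDI note {note_for(row, col)}")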

VP-3 (Musicgraphic Hyperinstrument) was designed for the Berzerkwerks installation at the Austin Children’s Museum (August 1991).

This exhibit featured several virtual worlds that the participant could “enter” through the video camera and digitizer.

These worlds, created using the Mandala System authoring software, enabled the user to create sounds, trigger animations, and paint by moving and “touching” virtual objects.

Lights and fans in the room surrounding the exhibit were also activated by these virtual objects and acted on wind sculptures and other installations in the show.

This exhibit was modified and renamed Video Playscape and remained on display at the museum after the close of the Berzerkwerks show.

Iterations of this exhibit were the top interactive attraction at RoboFests.

[Image: Mandala System interactive computer graphic scene from the Venus Project VP-3 (Musicgraphic Hyperinstrument)]

VP-4 (Living Systems Interactive Video Environment / L.I.V.E.)

In February 1992, the Austin Children’s Museum commissioned the Venus Project artists to create a virtual representation of the human digestive system.

The Mandala System was used to design scenes for the Video Playscape that made an educational game of the process of digestion. It featured an interactive tour of the digestive tract with animations, sound samples, and music.

VP-5 (Performance Interfaces) emerged in April–May 1992. The Venus Project designed and performed with several new virtual interfaces for controlling audio synthesizers, accompanied by Sainsott’s Shrinking Robot Heads Band.

The debut performance for the combined organic and inorganic groups was held at the X/XX2 Experimental Musical Festival on April 4, 1992. The performance featured an ensemble of six live (organic) musicians as well as the Shrinking Robot Heads Band members.

The interface was used to play synthesizers as the performers moved inside a digitized space.

Venus Project interfaces were also designed to be used in performance at Mayfest in Tulsa, OK and at RoboFest 3 in Austin that year. At both events, the interfaces were also to be used experimentally by the audience.