Friday, September 8, 2017

Noisebridge, a Do-ocratic Hackerspace, Celebrates 10th Anniversary

For a decade now, Noisebridge has beckoned its visitors up two flights of pastel rainbow steps from Mission Street. The self-described anarchist collective hackerspace, Noisebridge (NB), is a “do-ocracy” that encourages its members to “Be excellent to each other.” In fact, that’s the space’s only rule. This weekend, September 9th and 10th, Noisebridge will hold its 10th anniversary Exhibition + Ball. It promises to be a visual treat: a Maker Faire meets a science fair, with a side of “Afterburn.”
On a record-hot September day, I sat down with founding member Mitch Altman to gain more insight into the space, how it began, and how its mission solidified into the place where we are sitting.
In 1986, Mitch was driving through Alaska, stopped off in the Bay Area, and knew it was his home. Altman started Noisebridge because he “saw technology being used against individuals, not empowering individuals.” This is a noble goal in an era when AI technology is used simultaneously to profile criminals and to bring you cashless shopping experiences. Altman, along with another founder, Jake, attended the 2007 Chaos Communication Camp (CCC) and its How To Start Your Own Hackerspace session. Noisebridge had its first meeting at the camp. Within the first year after the 2007 CCC, four new physical hackerspaces were formed, including The Hacktory, HacDC, and NYC Resistor.
View the full article here, at The Bay City Beacon, and the photo essay here.
See the list of events on the NB Wiki, and RSVP via Eventbrite. Events are free!
Micah Morgan is a new media, art, and technology writer. You can find her work at

Monday, December 14, 2015


Nyumblies is a piece we presented at the 3D Web Fest festival in May 2015 at the Foundry in San Francisco. The festival showcased websites that best mix music, art, and technology, and exhibited what’s possible with the advent of the 3D Web.

The piece was produced in WebGL using three.js. Each Nyumbly is a Catmull-Rom surface (a higher-order surface named for Pixar co-founder Ed Catmull) implemented in a vertex shader. Its shape over time is defined by a 3D grid of control points; interpolating along the time axis gives the 2D set of control points that define its shape for the current frame. An authoring page, the Nyumbilator, allows users to design a species of Nyumbly by editing this grid of control points.
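As a rough illustration of the interpolation at the heart of this (the actual Nyumblies code runs as a GLSL vertex shader, and this Python sketch is my own, not the project's source), a uniform Catmull-Rom segment evaluates like so:

```python
def catmull_rom(p0, p1, p2, p3, t):
    """Evaluate a uniform Catmull-Rom spline segment at t in [0, 1].

    The curve passes through p1 at t=0 and p2 at t=1, with tangents
    derived from the neighbouring control points p0 and p3.
    """
    t2, t3 = t * t, t * t * t
    return 0.5 * (
        2.0 * p1
        + (-p0 + p2) * t
        + (2.0 * p0 - 5.0 * p1 + 4.0 * p2 - p3) * t2
        + (-p0 + 3.0 * p1 - 3.0 * p2 + p3) * t3
    )

# Interpolating a 3D grid over time reduces to applying this per axis:
# first along the time dimension to get the current 2D control grid,
# then across the surface parameters inside the vertex shader.
```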

The movement of the Nyumblies is based on a flocking algorithm similar to boids, the early artificial life system developed by Craig Reynolds. The system is based on three rules: separation, alignment, and cohesion. Nyumblies of the same species will flock together, and away from other species.
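A minimal sketch of those three rules (my own illustration with made-up weights, not the Nyumblies implementation) looks like this:

```python
import math

def flock_step(positions, velocities, dt=0.1,
               sep_w=0.05, align_w=0.05, coh_w=0.05, sep_radius=1.0):
    """One update of Reynolds-style flocking for 2D agents.

    positions/velocities are lists of (x, y) tuples; every agent is
    treated as a neighbour of every other (no spatial partitioning).
    """
    n = len(positions)
    new_pos, new_vel = [], []
    for i in range(n):
        px, py = positions[i]
        vx, vy = velocities[i]
        sx = sy = ax = ay = cx = cy = 0.0
        for j in range(n):
            if i == j:
                continue
            qx, qy = positions[j]
            d = math.hypot(px - qx, py - qy)
            if 0 < d < sep_radius:            # separation: steer away
                sx += (px - qx) / d
                sy += (py - qy) / d
            ax += velocities[j][0]            # alignment: match velocity
            ay += velocities[j][1]
            cx += qx                          # cohesion: move to centre
            cy += qy
        ax, ay = ax / (n - 1) - vx, ay / (n - 1) - vy
        cx, cy = cx / (n - 1) - px, cy / (n - 1) - py
        vx += sep_w * sx + align_w * ax + coh_w * cx
        vy += sep_w * sy + align_w * ay + coh_w * cy
        new_vel.append((vx, vy))
        new_pos.append((px + vx * dt, py + vy * dt))
    return new_pos, new_vel
```

Restricting the neighbour loop to agents of the same species would give the per-species flocking described above.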


Nyumblies are an experiment in artificial ecosystems. By nature, ecosystems are a combination of different lifeforms with unique designs that interact over time to form a dynamic system of life and movement. The Nyumbly ecosystem, Norgamatos, is formed by artistic intervention rather than the gradual process of evolution. Norgamatos, and the Nyumblies within it, can be viewed at any time online, and the Nyumblies are created with the Nyumbilator: an open online design tool where artists can craft their own Nyumbly lifeforms, define their behavior, and introduce them into Norgamatos.


The music used in the piece is Into The Fourth Dimension by The Orb.

Wednesday, April 8, 2015

Introduction to the gestural lexicon with Memex

Animations use gestures as tools to signify the emotions of characters in the same way as traditional theater. Gestures are powerful signifiers because we understand them kinesthetically, and education studies have shown that they increase the transfer and retention of an idea.


Duologue’s “Memex” by Marshmallow Laser Feast

September 2014,
Courtesy of the Nowness Channel, YouTube

Animations that employ VFX compositing are becoming powerful tools for understanding gestural human interaction. By repeating gesture and micro-gesture in virtual settings, basic human interactions defined by gestures are set to become an independent lexicon. Internet memes are a less complex example of this phenomenon: memes step in where words fall short of expressing the full emotional record.

Memex is a VFX short film released in September 2014. The film is a collaboration between Duologue, a London-based electronic group, and the digital artists of Marshmallow Laser Feast. The film's content is entirely computer-generated, creating an uncomfortable and emotionally moving exaggerated reality. The film opens with a close-up on skin; the camera hovers at angles impossible with conventional film. In the absence of her own movement, we as viewers can imagine how the character feels, frozen in her pose. As artists and designers continue to delve into gesture and micro-gesture, there will be room to recognize their capacity for storytelling and for fostering greater cultural understanding.

Thursday, March 12, 2015

Dance Dance (Virtual) Evolution

This week I was blown away by Asphyxia, a motion capture and dance project by Maria Takeuchi and Frederico Phillips. Using the Xbox Kinect, they create a moving virtual sculpture of dancer Shiho Tanaka's movements. She becomes a moving web of light on the dark screen. The collaborative team put equal effort into the art and technology aspects: the project required iterative studies of styles and a number of 3D tools after the motion data was captured using "inexpensive tools." Amazing and inspirational!

Wednesday, March 11, 2015

Equations We Love

Gareth Morgan Shares Kajiya at Papers We Love

Gareth Morgan, 3D NURDS organizer, gave a talk on James Kajiya's The Rendering Equation at the San Francisco chapter of "Papers We Love" (PWL). The equation was first published by Kajiya at SIGGRAPH 1986. PWL describes itself as a repository of academic computer science papers and a community who loves to read them. Comp sci not your thing? That's okay. Co-organizer Ines Sombra openly encourages any scholarly paper, and corresponding topic, you'd like to present. This was my third time attending, and her welcoming attitude is mirrored by the atmosphere: kindness and curiosity. Presenters are encouraged to be clear and simple so that everyone can engage as much as possible.

Kajiya's equation measures the light energy passing from one point to another in the world. This calculation is vitally important for 3D graphics and computer-generated images. Morgan argues it is the most important: "Every image on a screen is an attempt to solve the rendering equation," and the concept itself revolutionized 3D graphics. The mini-talk presented here is from February 19, 2015, at Yammer HQ. Morgan uses diagrams to break down each of the equation's terms and how they contribute to the solution. The full talk will happen in spring 2015. Look forward to it! And yes, there will be libations.
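For reference, here is the equation in its commonly quoted hemispherical form (Kajiya's 1986 paper actually states it as transport between surface points, but this is the formulation usually presented):

```latex
L_o(x, \omega_o) \;=\; L_e(x, \omega_o) \;+\;
\int_{\Omega} f_r(x, \omega_i, \omega_o)\, L_i(x, \omega_i)\,
(\omega_i \cdot n)\, \mathrm{d}\omega_i
```

The outgoing light $L_o$ at a point $x$ in direction $\omega_o$ is the light emitted there, $L_e$, plus all incoming light $L_i$ over the hemisphere $\Omega$, weighted by the surface's reflectance $f_r$ and the angle of incidence against the normal $n$.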

Video Caption: Gareth Morgan presents on The Rendering Equation at the 12th Papers We Love. Morgan is preceded by Veronica Ray on Experimenting at Scale with Google Chrome's SSL Warning and followed by the evening's main event: Caitie McCaffrey on Orleans: A Framework for Cloud.

Friday, January 30, 2015

Virtual mummies!

This TED talk by David Hughes mentions the virtual reality mummy project that I worked on back in the day at SGI. It's a great talk and well worth listening to, and it inspired me to write about my bit of that project.

Back in 2001, David had arranged for one of the mummies from the British Museum to be taken to University College Hospital in London and scanned using a computerized axial tomography (CAT) scanner. The mummy chosen was that of an Egyptian priest named Nesperennub, who lived in Thebes around 800 BC.

I was working for SGI's professional services group in the UK at the time and was tasked with writing the software to visualize the CAT scan data. I had been using SGI's OpenGL-based volume rendering API, Volumizer, quite a bit, so it seemed like an obvious choice for implementing the software.

The application I wrote took the CAT scan data (basically huge 3D arrays of floats) and visualized it in real time. I distinctly remember the moment when it went from a data processing problem (how do I get sensible values out of these billions of apparently random bits?) to an actual mummy, named Nesperennub, being drawn on the screen. It didn't help the sense of creepiness that I was alone in the office at the weekend when this happened.
Look, a mummy!

Volumizer worked by breaking the volume down into tetrahedra, then rendering each of these as multiple 2D slices aligned to the view direction; this technique is called volume slicing. The slices were rasterized as triangles with a 3D texture applied to them that contained the volume data. The resulting fragments were then drawn using alpha blending to create the final image.
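The blending stage can be sketched as back-to-front compositing with the "over" operator (a simplified per-pixel illustration of the technique, not the Volumizer API, which did this on the GPU):

```python
def composite_slices(slices):
    """Blend view-aligned slices back-to-front with the 'over' operator.

    Each slice is an (r, g, b, a) tuple for one fragment along a ray;
    slices[0] is farthest from the eye, slices[-1] is nearest.
    """
    r = g = b = 0.0
    for sr, sg, sb, sa in slices:
        # Each nearer slice covers the accumulated colour behind it
        # in proportion to its alpha.
        r = sr * sa + r * (1.0 - sa)
        g = sg * sa + g * (1.0 - sa)
        b = sb * sa + b * (1.0 - sa)
    return (r, g, b)
```

An opaque slice completely hides whatever was composited behind it, while semi-transparent slices let deeper material show through, which is what makes the wrappings and bowl visible at once.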
Volume slicing
3D textures were at that time a very sophisticated GPU feature that required graphics hardware like SGI's InfiniteReality, which shipped in the Onyx line of visual supercomputers my software ran on. These GPUs had hundreds of megabytes of texture memory, which for the time was an obscene amount!
Onyx2 visual supercomputers

X-rays taken of Nesperennub in the 1960s had revealed a mysterious object on his head. In my application, different materials could be highlighted using a dynamic colour look-up table. Look-up tables like this were easy to implement in Volumizer. As this was before programmable shaders were common in real-time graphics, under the hood OpenGL's color table feature was used to create this effect. In the final rendered image, different densities in the CAT data were represented as different colours. Some colours in the table could be partially, or completely, transparent, so that some materials became see-through.
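The idea can be sketched like this (the thresholds and colours below are hypothetical stand-ins, not the mapping the real application used):

```python
def classify(density, table):
    """Map a scalar CAT density to an RGBA colour via a look-up table.

    table is a list of (threshold, rgba) entries sorted by threshold;
    the first entry whose threshold exceeds the density wins.
    An alpha of 0.0 makes that material fully transparent.
    """
    for threshold, rgba in table:
        if density < threshold:
            return rgba
    return table[-1][1]

# Hypothetical mapping: air transparent, soft tissue translucent,
# bone opaque -- so bone shows through the "see-through" tissue.
TABLE = [
    (0.1, (0.0, 0.0, 0.0, 0.0)),   # air: fully transparent
    (0.5, (0.8, 0.3, 0.3, 0.2)),   # soft tissue: mostly transparent
    (1.0, (1.0, 1.0, 1.0, 1.0)),   # bone: opaque
]
```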

Using this technique it became very clear that the mysterious object was in fact a clay bowl. The most likely explanation is that during the embalming process the bowl (which was probably used to pour resin on the corpse) became stuck and the embalmers decided to simply carry on with wrapping the body with the bowl still attached. The embalmers probably thought they'd gotten away with it too. But several millennia later their shoddy workmanship was revealed to the world!
Volume rendering, exposing shoddy embalming since 2001
The application got an incredible amount of press, and I spent most of the next year showing it to everyone from Nature to CNN (and a British kids' TV show; we actually took the Onyx demo machine and the actual mummy on set for them). It was also featured on SGI's booth at SIGGRAPH 2002 in San Antonio, where it was used to promote the newly released InfiniteReality4 graphics hardware.

The software was ultimately turned into an exhibit at the British Museum called Mummy: The Inside Story. I didn't survive the next round of layoffs at SGI (which was in serious trouble as a company while all this was going on) and left in 2002, so I didn't have much to do with that process (though I did get a "special thanks" credit in the book of the exhibit). This remains the coolest project I ever worked on. In particular, Dr John Taylor, the Egyptologist at the British Museum we worked with, is a great guy (and at one point gave us a tour of the back rooms of the British Museum, which is unbelievably cool).

One historical side note is that the mummy of Nesperennub's wife has unfortunately disappeared. In the Victorian era, "mummy unwrapping parties" were popular in the West, which resulted in the destruction of many irreplaceable ancient mummies, so this may have been her fate. Her cartonnage, however, survives, so we know her name was Neskhonspakhered. Weirdly, just like me, she ended up in the San Francisco Bay Area. You can visit all that remains of her right here at the Phoebe A. Hearst Museum of Anthropology in Berkeley.
Cartonnage of Neskhonspakhered, the wife of Nesperennub.

Wednesday, January 21, 2015

Story of R32

Story of R32 is a short film from director Vladimir Vlasenko with some pretty impressive CG.

There is also an interesting VFX breakdown video: