Workshop: Extreme Detail – Part 1

From October 27-31, 2014, my studio group (Digital Transformation) and Studio MAD participated in a workshop aimed at learning how to use the school’s 5-axis CNC milling machine through the creation of a “lighting object.” Students were put into three working groups, with two subgroups in each. One subgroup was responsible for the form of the lighting object, and the other for the surface patterning.
I was placed in the form subgroup. We used Grasshopper to modify the properties of a cylindrical mesh and the Kangaroo plugin for Grasshopper to apply gravity and basic force dynamics to the form. Our aim was to create a shape made of planarized quads that would eventually be milled out of plywood sheets in the CNC mill. Kangaroo has a helpful component called “planarize quads” which tries to force the shape to be planar on each side so it can be cut out of flat sheets.
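The idea behind planarization can be checked outside Grasshopper by measuring how far a quad’s fourth corner sits from the plane through its other three corners. This is only a minimal standalone sketch of that check, not the actual Kangaroo component or our Grasshopper definition, and the coordinates are made up for illustration:

```python
# Distance of a quad's fourth corner from the plane through the first three.
# A perfectly planar quad gives a deviation of 0.

def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])

def quad_planarity_deviation(p0, p1, p2, p3):
    u = tuple(b - a for a, b in zip(p0, p1))
    v = tuple(b - a for a, b in zip(p0, p2))
    n = cross(u, v)                       # normal of the plane through p0, p1, p2
    length = sum(c * c for c in n) ** 0.5
    w = tuple(b - a for a, b in zip(p0, p3))
    # project the vector to the fourth corner onto the unit normal
    return abs(sum(a * b for a, b in zip(n, w))) / length

# A quad whose fourth corner is lifted 1 unit off the plane of the others:
print(quad_planarity_deviation((0,0,0), (1,0,0), (1,1,0), (0,1,1)))  # 1.0
```

In practice a deviation below the cutting tolerance (rather than exactly zero) is what matters for flat-sheet fabrication.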
The form we arrived at provided a simple enough shape to be cut out of three large sheets of plywood while creating an architectural typology that justified use of the CNC mill.
The next step was to prepare the form for cutting on the CNC mill. This was done by labeling each quad in the form and then placing them flat.
We decided that the structure would be held together by binding and used another Grasshopper sketch to create evenly-spaced holes around the perimeter of each sheet. The binding sketch found the perimeter of the shape, created an offset perimeter a certain distance inwards, and then placed holes at equal intervals.
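The equal-interval placement can be sketched as walking along the perimeter and dropping a point every fixed arc length. This is a simplified standalone version of the idea, assuming a closed 2D polyline; the real definition worked on the offset perimeter inside Grasshopper:

```python
# Place n equally spaced hole centres along a closed 2D polyline.
# Standalone sketch of the binding-hole logic; input geometry is illustrative.

def hole_centres(pts, n):
    # build the closed loop's segments and their lengths
    segs = [(pts[i], pts[(i + 1) % len(pts)]) for i in range(len(pts))]
    lengths = [((b[0]-a[0])**2 + (b[1]-a[1])**2) ** 0.5 for a, b in segs]
    step = sum(lengths) / n               # arc length between neighbouring holes
    centres = []
    for k in range(n):
        target = k * step                 # distance along the loop for hole k
        for (a, b), L in zip(segs, lengths):
            if target <= L:
                t = target / L            # interpolate within this segment
                centres.append((a[0] + t*(b[0]-a[0]), a[1] + t*(b[1]-a[1])))
                break
            target -= L
    return centres

# Four holes around a unit square land on its corners:
print(hole_centres([(0,0), (1,0), (1,1), (0,1)], 4))
```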
The patterning subgroup provided us with a series of contoured holes to place on the face of each quad. During this process we extruded the quads to match the thickness of the plywood. With the binding holes and contours placed, we then arranged the quads onto the plywood sheets and prepared the Alphacam files, which would be used by the CNC machine to interpret our Rhino drawing.

Sheets 1-3

Preparing the Alphacam files was a difficult process, as almost none of the students in the workshop had used the software before. We relied heavily on the workshop instructors to guide us through this process. In order to prepare a Rhino drawing for export to the CNC machine, the basic steps are to import the Rhino geometry into Alphacam, select your cutting tools and define your tool paths. The Alphacam file is then exported to the CNC machine as G-code.
Part two will detail the milling and assembly process.

Odense infographics

I spent last week researching the area around Thomas B. Thriges Street in Odense as a means of locating where I wanted to situate my project. Having a previous interest in information graphics, I focused on the data concerning how people get from the underground parkade to the street level and used it to create a series of pie charts and a usage diagram.


Location of ramps (orange), stairs (blue) and elevators (red) at the site

The data shows that one of the most common ways to move from underground to above is via stairwells located throughout the site, and that most of these stairwells are located around areas designated as “regional urban areas” – these are the Musikhuspassagen, Overgade and Albani Torv areas.


Using this information, my aim is to create two or three transition points which will connect the underground to the street level. These structures will be independent of the buildings on the site but their architectural qualities will be informed by their surroundings. Perhaps these transition points will serve as “waypoints” that guide people through the site and will somehow interact with each other. They will most likely contain stairs as the method of vertical movement, but should not be represented in the form of a traditional building staircase. The XBees will feature in the physical model as a method of creating interactive architecture.
My next step will be to create a basic material and structural catalogue of the buildings surrounding each site that will inform these transition points.

Workshop: Extreme Detail – Part 2

The school has a large 5-axis CNC machine that we used to cut out the pieces of the lighting object. The process actually took longer than expected due to the large number of binding holes. Average cut time was probably around 45 minutes – 1 hour per sheet. In the first pass, the CNC drilled the binding holes.


In the second pass, the CNC drilled out the contours on each quad. The actual edges of the quads were cut as the last step. This is important to note: once the quads are cut free of the plywood sheet, they can move around on the cutting table, which would make any precision drilling on them impossible.
We also had an incident where a stray piece cut out from one of the contours was thrown up onto the sheet and landed on top of it. The drill head happened to come down right on top of this piece; on contact the piece shattered and the CNC machine halted, and we had to reset the machine to continue.
The cut pieces required thorough hand-sanding in order to remove the rough edges that formed. The next step was assembling the structure using string to bind it. We attached rice paper to the insides of the quads in order to diffuse the interior light. As a final detail, we ran some LED strip lights inside the lighting object. Check out the time-lapse video to see how we assembled it:


Conclusions: The workshop gave students a chance to use the school’s CNC machine to bring a computer-generated form into physical space. We accomplished this in the short time frame of only five days, with a few sacrifices: the tight timeline meant that we could not test out our ideas, so we didn’t actually know how the final design would hold up in terms of assembly and fit.

The design process partially suffered from our inability to fully automate parts of the parametric design. The patterns team provided us with the concentric lines to cut the contours all flattened on the z-axis, so we had to manually move each set of rings down 2mm so that the contours would be cut correctly. We also had to move the binding holes inward on each quad once a simulated cut revealed that they were too close to the edges.

Alphacam was mentioned a few times by one of the workshop instructors as a “terrible” piece of software to handle CNC milling. There were many occasions where only the instructors knew the right settings needed. I am confident that I could not cut anything on my own, despite the CNC machine being free for all students to use.

It was very difficult to find the right combination of settings in our Grasshopper sketch that would produce truly planar quads for cutting. There was a tolerance issue of +/- 1mm all around, which greatly affected the edges of the quads. Instead of meeting along a single planar edge, most of them had a bit of “warp,” since the edge was actually a double curve. It seems that there should be a better way to ensure that the final form is planar.

The binding holes turned out to be very snug, and in some cases the string wouldn’t fit through without widening the hole with a tool. In retrospect, we should have made larger holes. The binding process as we designed it was very time-consuming during assembly; a different assembly technique would have benefited the project.

The lighting object that our team produced was an excellent example of what you can produce on a CNC machine with a week of work. Further experimentation with the machine would result in projects with a higher degree of craft, and it would be worthwhile to experiment with other materials as the plywood had a tendency to splinter on the edges where the wood grain ran perpendicular to the cutting direction.

I’m not sure if I will use the school’s CNC machine in my own work yet, but it now appears a lot more accessible than it was before.


Wireless communication with Arduino – Part 4

Recently I’ve been using Andrew Rapp’s XBee library for Arduino so that I can start to communicate with more than one radio module at a time. This is a crucial step, and as the screenshot below shows, now the code can also tell me the XBee’s address.

I found a great code example of how to display information from each XBee at this blog – it’s invaluable!

The next step is to take the data from each XBee and probably insert it into an array that gets passed to Grasshopper.
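One way to organize that data is to keep the latest reading per radio, keyed by the XBee’s address, and flatten it to a list in a fixed order before handing it to Grasshopper. This is only a sketch of that idea; the addresses, values, and function names are made-up placeholders, not part of the actual setup:

```python
# Collect the latest reading from each XBee by address, then flatten to a
# list in a fixed order for Grasshopper. All values here are illustrative.

latest = {}

def on_sample(address, reading):
    latest[address] = reading        # keep only the newest value per radio

def snapshot(order):
    # order: the addresses in the sequence Grasshopper expects;
    # radios that have not reported yet fall back to 0
    return [latest.get(addr, 0) for addr in order]

on_sample(0x1874, 512)
on_sample(0x1875, 300)
on_sample(0x1874, 520)               # newer value replaces the old one

print(snapshot([0x1874, 0x1875]))    # [520, 300]
```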


A reading of Odense and the city’s new master plan

The current layout of Thomas B. Thriges Street in Odense clearly does not work. With the advent of the automobile came the expansion and upgrade of roads – once-quiet city spaces were soon overtaken by the roar of engines and horns. In the 1960s a roadway was installed in the heart of Odense to accommodate this new city inhabitant, and it became known as Thomas B. Thriges Street. By placing convenience of privatized travel over the serenity and character of Odense’s central square, city planners bisected the area, creating a disconnect between the two sides.
My visit to Odense last week confirmed what I had learned through research. Arriving at the site, I was immediately struck by the uncomfortable feeling of the street being so close to the area where pedestrians were walking. My first view of Sct. Albani Kirke was framed by roads, concrete and street signs, which conveyed nothing of the history of the site. The extreme conditions of the site made it impossible for me to imagine what stories the place might tell, despite the fascinating architecture of the church. The street, lying directly in front of the church, stopped me from approaching it and disrupted any sense of place the structure should have lent to the site. As I walked along the street, I stepped into the middle to take the “perfect” photo of the church – only to discover that while I was free of nearby taller obstacles, the ugly directional signs for the automobiles still hung directly in front of the church.
Only recently has Odense taken steps to rectify the problem of Thomas B. Thriges Street. A new master plan for the area has been developed that proposes to remove the street in an attempt to transform the area into an inviting, pedestrian-friendly place. Starting in the north at Østre Stationsgade and moving southward towards Sct. Albani Kirke and Odense Domkirke, a gradient of buildings will occupy part of the space, with larger, modern-looking buildings progressing to smaller, traditional ones. This progression allows Odense to continue to develop as an important hub city in Denmark while preserving the historical dwellings that occupy the area. The Hans Christian Andersen Museum is a notable place surrounded by a specific atmosphere of old houses that reflects the time of the famous author. Underground parking beneath the buildings will serve to acknowledge the importance of the automobile to a city while preventing a noticeable change to the urban landscape.
My interest in the Thomas B. Thriges Street site lies in the area near the two churches. The area currently contains a small surface parking lot and a cobblestone area with a small historical ruin. The master plan states that the area will be occupied by two new commercial block buildings. I would like to remedy the dichotomy between this new built-up area and the two churches through an architectural intervention. Opportunities exist for an inhabitation of the proposed structures, or a completely new plan for the area.
The way we build architecture has undergone a major change since the times of simple brick-and-mortar construction. Architects increasingly look to digital computation in the pursuit of novel building methods. One method involves the consideration of how we create forms that are influenced by and change with the environment. Neri Oxman, a researcher at the MIT Media Lab, talks about a separation existing between “what” a building senses and “how” it does so. Often architects will simply embed sensors into a building as an afterthought rather than considering how the sensors lend themselves to the sensing elements of a building. Designing with attention to material choice can lead to exploring the role the designed material plays in the creation of a “sensing” building. Through these exercises we can tweak the formal expression of a constructed space to unite it with another – in this case, the existing churches. Through careful attention to the way buildings and people sense their environment, I want to continue to expand on the notion of creating architecture that is informed by these actions.

Wireless communication with Arduino – Part 3

One of the Firefly devs actually responded to my earlier question and said I didn’t need the Firefly firmata on the Arduino if I was only using the Firefly serial read/write components. For now, this works, but I wonder if I will need to output data from Grasshopper to the Arduino at some point in the future. Possibly not, if the Arduino at the base station is only acting as an interface to the XBee coordinator radio. I will have to look into this.

The flex sensor as a wireless sensor. The wires going to the wireless sensor are only for power, as I didn’t have a battery pack.
I used the following code to read the incoming serial data from the wireless sensor node:
void setup() {
  Serial.begin(9600);

void loop() {
  //make sure everything we need is in the buffer
  if (Serial.available() >= 21) {
    //look for the start byte
    if ( == 0x7E) {
      //discard some bytes that we’re not using
      for (int i = 1; i < 19; i++) {
        byte discardByte =;
      //grab the two bytes that make up the analog value from the wireless sensor
      int analogMSB =;
      int analogLSB =;
      //convert to decimal to create the analog reading
      int analogReading = analogLSB + (analogMSB * 256);
      Serial.println(analogReading);
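The same parsing logic can be tried out on the desktop before touching the hardware. This is a hypothetical Python sketch, assuming the same 21-byte frame layout as the Arduino code above (0x7E start byte, 18 header bytes we ignore, then the analog MSB/LSB pair):

```python
# Desktop sketch of the same frame parsing: scan a byte buffer for the
# 0x7E start byte, skip the 18 bytes we are not using, and combine the
# analog MSB/LSB pair into a reading. Frame layout is an assumption.

def parse_analog(buf):
    for i, b in enumerate(buf):
        if b == 0x7E and len(buf) - i >= 21:
            msb = buf[i + 19]            # 20th byte of the frame
            lsb = buf[i + 20]            # 21st byte of the frame
            return lsb + msb * 256
    return None                          # no complete frame in the buffer

frame = bytes([0x7E] + [0] * 18 + [0x02, 0x2A])  # MSB=0x02, LSB=0x2A
print(parse_analog(frame))  # 554
```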
The following graphic shows how the data was used to create a concept of a hallway with moving walls. By using the inverse of the sensor data, I was able to work with two sets of data so that the walls of the hallway would react with each other. Eventually, I would like to introduce a second wireless sensor as another data source.
The result is shown below. As the flex sensor is bent, one wall extends outwards while the other recedes.

Wireless communication with Arduino – Part 2

Today I learned how to use an XBee as a standalone device, without an Arduino. This will be useful when building in 1:1 scale and needing to place sensors on a model that are spaced far apart. An Arduino is only needed at the coordinator XBee.
Also, I was able to successfully import the serial data from the XBee and bring it into a Grasshopper sketch by using Firefly components. The next step is to make sense of the data – currently it is garbled and showing as multiple varying values (see screenshot below). I’ve posted to the Firefly help forum in hopes of solving this, as processing the serial data from the XBee requires some code that is usually handled by the Arduino.
Quick Grasshopper sketch showing the serial data being read

Wireless communication with Arduino – Part 1

I’ve been using part of this week to sort through the various tech that I’d like to use in my projects, as a means of keeping the troubleshooting/learning time to a minimum so that I can focus on the architecture. The electronics and tech are integral parts of creating interactive architecture – through sensing and adapting we can create surfaces and structures that create a personal connection to the visitor/user.
I was successful in using XBees (wireless radio modules) to communicate wirelessly between two Arduinos. In the example below I demonstrate how closing a circuit on one Arduino turns on an LED on another Arduino. The electronics are very straightforward and the XBee devices communicate via serial, so you only need to send/receive/process serial messages to work with the data.
When the switch circuit is open, the LED on the other Arduino is off.
When the switch is thrown (circuit closed), an LED illuminates on the other Arduino.

My next step is to learn how to work with two-way communication, so that a sensor could send data to a base Arduino, and then receive information from that Arduino to affect a local component – it could be a motor, a shape-memory alloy, or something creating a similar means of actuation.

Interfacing Arduino with Grasshopper – Day 2

I’ve now been able to make the data from the flex sensor affect 3D objects in Rhino, which completes my tutorial in understanding how to interface the Arduino with Grasshopper. In the example shown, the aperture size of an array of pyramidal shapes changes depending on how the flex sensor is bent.
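The sensor-to-aperture relationship boils down to a clamped linear remap of the sensor reading onto an opening amount. This is only an illustrative sketch; the ranges below are made-up placeholders, not the calibrated values from the actual Grasshopper definition:

```python
# Map a raw flex-sensor reading onto an aperture opening, clamped to the
# sensor's working range. Ranges are illustrative, not measured values.

def remap(value, in_lo, in_hi, out_lo, out_hi):
    value = max(in_lo, min(in_hi, value))   # clamp to the sensor range
    t = (value - in_lo) / (in_hi - in_lo)   # normalize to 0..1
    return out_lo + t * (out_hi - out_lo)

# Unbent sensor (low reading) -> aperture fully open (1.0);
# fully bent (high reading) -> aperture closed (0.0).
print(remap(300, 300, 700, 1.0, 0.0))  # 1.0
print(remap(700, 300, 700, 1.0, 0.0))  # 0.0
print(remap(500, 300, 700, 1.0, 0.0))  # 0.5
```

Inverting the output range (1.0 down to 0.0) is what makes a larger bend close the apertures rather than open them.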

When the sensor is not bent, the apertures are wide open.
When the sensor is bent, the apertures close.

I still need to educate myself on data structures and sorting data in Grasshopper, as it seems that in my previous attempts I did not have matching data sets, which prevented me from creating 3D shapes.

The Grasshopper sketch that produced the 3D shapes.

My next step will be justifying the use of this technology within the task of creating an architectural plan/intervention for the site in Odense.

Day trip to Odense

The studio group took a day trip to Odense to observe Thomas B. Thriges Street, which is the centre of a radical new plan to demolish the existing roads running through the centre of the city and replace them with a pedestrian-friendly area. The new development will be located near all of the landmarks I photographed, and I think there’s an interesting opportunity to create new architecture that embraces new technologies and ways of building but also harkens back to the traditional designs of churches and government buildings.