Final Project — Painting with Light

For the final project, I wanted to create something that encouraged movement to trigger a response in Processing. My idea was vague to begin with, which allowed me to explore and learn a variety of different concepts in Processing whilst I tried to find the response I wanted. For instance, I learnt how to pixelate the live video feed, and how to de-pixelate regions with light. Although this response was cool, the video lagged. My main priority with this project was to build something that was intuitive, fun, and worked as perfectly as I could make it. Having the video lag did not line up with these design principles. I decided to aim for having the core components of my project working perfectly, rather than having lots of components that only half work.

Consequently, I created ‘Painting with Light.’ It is a simple interactive installation where the user controls the array of circles painted on the screen with a cube of blue light.

The array of circles takes on the colour of the brightest value. As you paint, the trailing circles slowly fade in a way that replicates painting with watercolours. The background changes colour very slowly between different blue/purple values. Below is a video demonstrating only the gradual background colour change (it is difficult to see in the interaction videos).

These are simple elements of my project, but I think they are very important in creating the overall calming experience. I found it interesting that some users would try to make the program react faster by waving the box around erratically, but the calm nature of the program would make them slow their interaction.

Users could simply pick up the box and without any explanation could play with the program — which was exactly the type of intuitive interaction I wanted.

Painting with Light also tended to encourage a meditative interaction.

This program technically works without a controller and follows the brightest part of the user's body. However, this does not allow for directed interaction: the painted circles appear all over the screen and are difficult to control. Thus, I made a light box that acts as a controller and allows the user to make directed movements. It's like painting with a brush rather than a large sponge.

 

HARDWARE — CUBE CONTROLLER

The box consists of a small inner box that contains the LED circuit and battery pack. This smaller box is then suspended within the larger controller. A switch is mounted on the outside to conserve battery power.

(The controller circuit)

To keep the circuit neat, I kept most of the circuit on the breadboard except for the switch’s resistor, which I soldered and encased with heat shrink tubing. The breadboard has 6 LEDs in parallel, with three on each side so that the entire cube will be lit from within (not just one face). A 9V rechargeable battery powers the circuit.

               

The exterior of the box is transparent acrylic that I sanded until it became translucent, to better diffuse the light from the LEDs inside.

The smaller box is suspended within the controller by cylinders of acrylic mounted on three of the six faces. This keeps the circuit compact and unmoving within the smaller box, and also means the circuitry cannot be easily seen when looking in from the outer shell. I purposely chose cylinders as these mounts because I knew they would be visible and wanted a consistent aesthetic. A small hole is left in the outer shell for the switch. All faces are hot glued together along the box seams, except for the top face of the interior box. I simply taped this shut (securely) so that the circuit could be easily removed and recycled at the completion of the project.

With its blue glow and circular mounts, the box reminds me of energy sources in sci-fi movies, e.g. the tesseract in Thor. I like this reference, because it makes it feel like there is a subliminal force within the cube that is allowing you to generate a response on the screen.

 (The tesseract in Thor — source)

 

CODE

The comments within the code describe the different elements I used. The code's main components reference Processing's BrightnessTracking example (see the examples bundled with Processing) and the Array example. James showed me a technique to generate the slow change in background colour: numbers are generated randomly but in small increments, like reading values along a gently curving graph, and these numbers are then mapped to the range of colour values I want for R, G, and B. I was also having some trouble with the webcam being recognised by Processing, so I included code that lists and prints the available cameras before using the first one as the pixel source (which should be the webcam).
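Stripped down, the incremental-random idea looks something like the loop below. This is only an illustration of the technique, written here as a generic Arduino-style C++ loop rather than the actual Processing sketch, and the step size and colour ranges are made-up values:

    // Illustration of the technique only: a value drifts by small random steps,
    // then gets mapped into a blue/purple colour range.
    float drift = 0.5;                       // stays between 0.0 and 1.0

    void setup() {
      Serial.begin(9600);
    }

    void loop() {
      // take a small random step instead of picking a brand-new random number
      drift += random(-10, 11) / 1000.0;     // step of at most +/- 0.01
      drift = constrain(drift, 0.0, 1.0);

      // map the slowly drifting value into the desired colour range
      int r = 60  + drift * 40;              // roughly 60-100
      int g = 40  + drift * 30;              // roughly 40-70
      int b = 160 + drift * 80;              // roughly 160-240

      Serial.print(r); Serial.print(' ');
      Serial.print(g); Serial.print(' ');
      Serial.println(b);
      delay(50);                             // keep the change slow
    }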

Processing CODE:

One of the three most difficult parts of my project was simplifying my idea down to its core components. I was overly ambitious with my first concept, but I am very happy that I pared it back to a more manageable project. The result was simple, yet successful in the design elements I wanted.

Another area of difficulty was making the circuit stable and reliable. I enjoyed soldering, and learnt a lot about making a good circuit whilst creating this project. I had to do a lot of debugging to finally get the circuit to work, but it was fulfilling as it made me realise the importance of soldering correctly and ensuring the integrity of every element. The final controller could be thrown around by participants yet still remained functional, which was exactly the kind of integrity I wanted to ensure.

My last area of difficulty was figuring out the best way to code what I wanted to achieve. I still have a lot to learn here with respect to the coding process. Pierre gave me some really helpful advice when I asked him how to code a certain element. He basically said that I should break down what I am trying to achieve into its simplest components: for example, if I wanted to pixelate something, I could also say that I want to break an image up into individual boxes, where each box is a single pixel taken from the range of pixels that would otherwise inhabit that boxed section of the image. This type of thinking is something I hope to apply in my future coding projects.
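In skeleton form, that decomposition looks something like the standalone C++ sketch below (a dummy grayscale array stands in for the image, and the box size is arbitrary): sample one pixel per box and fill the whole box with it.

    #include <cstdio>
    #include <vector>

    // Pixelate a grayscale image by filling each box with a single sample
    // taken from that box (the top-left pixel here).
    std::vector<unsigned char> pixelate(const std::vector<unsigned char>& img,
                                        int w, int h, int box) {
      std::vector<unsigned char> out(img);
      for (int y = 0; y < h; y += box) {
        for (int x = 0; x < w; x += box) {
          unsigned char sample = img[y * w + x];        // one pixel represents the box
          for (int dy = 0; dy < box && y + dy < h; ++dy)
            for (int dx = 0; dx < box && x + dx < w; ++dx)
              out[(y + dy) * w + (x + dx)] = sample;
        }
      }
      return out;
    }

    int main() {
      const int w = 8, h = 8;
      std::vector<unsigned char> img(w * h);
      for (int i = 0; i < w * h; ++i) img[i] = i;       // dummy gradient "image"
      std::vector<unsigned char> blocky = pixelate(img, w, h, 4);
      std::printf("pixel (5,5) now holds %d\n", blocky[5 * w + 5]);
      return 0;
    }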

For a future iteration of this project, I would like to include RGB LEDs in the controller so that the colour of the light can be changed, like choosing colours from a palette. This would require an Arduino, with an XBee unit installed inside the controller so it could remain wireless. I would also like the project to work in 3D space, so that the user moving along the z axis would also create a response along Processing's z axis.

Final Project Report – Void ()

Concept 

At first, I was pretty much clueless as to what to do for the final project. The idea of making a game came to mind while the others in class were pitching their own ideas. However, since this is Interactive Media, not Computer Science, I wanted to showcase my hardware knowledge and build some level of interaction between my game and the outside world that didn't involve a keyboard. And so I decided to build an external controller too: a simple joystick in a box. My initial idea included a button as well, but since Michael asked us to produce a simple take on our original idea, I decided to scrap the button.

The inspiration behind my game was essentially my childhood memories. Having grown up in the Game Boy Color/Advance generation, I was exposed to games from franchises such as Pokemon, The Legend of Zelda, Dragon Ball, Dragon Quest and Final Fantasy. These games had very simple graphics, sounds and storylines, unlike the current generation of high-spec AAA games. However, it was these retro games that left a lasting impact on my mind, and it only felt right to pay homage to the games that made my childhood so memorable.

The structure of the game is simple. Your character, the hero, is on the moon and he fights off space bats and lizards that have invaded the moon. He ultimately discovers a dragon, a.k.a the final boss of the game, and his goal is to eliminate the dragon. 

Implementation

I implemented the game using PyProcessing, which is Processing that executes native Python code. I chose this over standard Java-based Processing because I was more confident in my programming skills in Python than in Java. Ultimately, this turned out to be a major hindrance, which I'll explain later.

The controls of the game were simple: the player moves the character left, right or up, and if the player comes into contact with any foe while the dagger is facing the right direction, the player inflicts damage on the foe. Otherwise, the player loses some health instead. In order to create the necessary software aspects of the game, I used the following:
1) PyProcessing IDE – To program the actual game logic
2) Arduino IDE – To program the hardware aspect of the project to receive input
3) Adobe Photoshop – To crop and resize images
4) Character Generator to create Hero Sprite – Can be found here 
5) Sithjester’s RXMP Resources for the enemy sprites – Can be found here.

The actual code itself is below. I used object-oriented programming to implement the vast majority of functions, considering how easy it makes duplicating elements of the same type with slightly different characteristics. Since I had learnt OOP in Intro to CS, and had implemented a game of a similar nature, it didn't prove too difficult to come up with the logic for the game.

In addition to the Processing code, I also had 4 CSV files, from which I read the contents for each stage. The contents of the CSVs are below.

———————————————————————————

Stage1.csv

Tonberry,800,400,20,585,tonberry1.png,40,40,4
Tonberry,200,400,20,585,tonberry1.png,40,40,4
Silverbat,700,100,16,585,silverbat1.png,32,32,4
Silverbat,500,100,16,585,silverbat1.png,32,32,4
Silverbat,250,75,16,585,silverbat1.png,32,32,4
Tonberry,1280,400,20,585,tonberry1.png,40,40,4
Platform,150,400,200,52
Platform,200,400,200,52
Platform,400,400,200,52
Platform,540,200,200,52
Platform,1000,350,200,52
Platform,1200,350,200,52
Music,openWorld2.mp3

——————————————————————————–

Stage2.csv

Platform,200,350,200,52
Platform,400,350,200,52
Platform,800,450,200,52
Platform,1000,350,200,52
Platform,1200,350,200,52
Platform,1400,450,200,52
Silverbat,800,200,16,585,silverbat1.png,32,32,4
Silverbat,600,200,16,585,silverbat1.png,32,32,4
Tonberry,650,350,20,585,tonberry1.png,40,40,4
Tonberry,900,150,20,585,tonberry1.png,40,40,4
Silverbat,300,150,16,585,silverbat1.png,32,32,4
Music,openWorld2.mp3

——————————————————————————-

Stage3.csv

Bahamut,740,400,47,585,bahamut1.png,94,94,4
Silverbat,150,150,16,585,silverbat1.png,32,32,4
Silverbat,200,200,16,585,silverbat1.png,32,32,4
Silverbat,300,550,16,585,silverbat1.png,32,32,4
Silverbat,1000,250,16,585,silverbat1.png,32,32,4
Silverbat,1200,350,16,585,silverbat1.png,32,32,4
Silverbat,150,350,16,585,silverbat1.png,32,32,4
Silverbat,475,450,16,585,silverbat1.png,32,32,4
Silverbat,875,275,16,585,silverbat1.png,32,32,4
Tonberry,650,50,20,585,tonberry1.png,40,40,4
Tonberry,750,75,20,585,tonberry1.png,40,40,4
Tonberry,900,450,20,585,tonberry1.png,40,40,4
Tonberry,950,300,20,585,tonberry1.png,40,40,4
Tonberry,650,375,20,585,tonberry1.png,40,40,4
Tonberry,1000,750,20,585,tonberry1.png,40,40,4
Platform,50,200,350,52
Platform,150,300,300,52
Platform,250,400,300,52
Platform,450,350,300,52
Platform,450,250,300,52
Platform,850,250,300,52
Platform,750,350,350,52
Platform,650,450,300,52
Potion,175,300,16,585,potion.png,31,31,8
Potion,425,225,16,585,potion.png,31,31,8
Music,boss.mp3

——————————————————————————————-

Stage4.csv

Splash,endingScreen.png,375,200,750,75

——————————————————————————————-
The Arduino code was simpler than I thought. It was just a matter of performing a Serial.print() each time the user moves the joystick. Since I wanted only left, right, up and the button press to be recorded, I ignored downward joystick movement.
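In skeleton form, that behaviour boils down to something like the sketch below (the pin numbers and thresholds here are just placeholders, not the exact ones used):

    // Minimal sketch of the behaviour: print a direction whenever the joystick moves.
    const int X_PIN = A0;      // joystick horizontal axis
    const int Y_PIN = A1;      // joystick vertical axis
    const int BTN_PIN = 2;     // joystick select button

    void setup() {
      Serial.begin(9600);
      pinMode(BTN_PIN, INPUT_PULLUP);
    }

    void loop() {
      int x = analogRead(X_PIN);   // 0..1023, roughly 512 at rest
      int y = analogRead(Y_PIN);

      if (x < 300)      Serial.println("LEFT");
      else if (x > 700) Serial.println("RIGHT");

      if (y < 300)      Serial.println("UP");   // downward movement is ignored

      if (digitalRead(BTN_PIN) == LOW) Serial.println("BUTTON");

      delay(50);
    }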
The code is as follows :

For the hardware part of the implementation, I used a RedBoard, a prototyping shield, and an Adafruit joystick housed inside a laser-cut acrylic box. However, since the connections to the joystick were very close together, it was extremely difficult to solder them securely, and Michael's help is what saved the joystick in the end.
The schematic is below:


The main problems I encountered

  1. Cropping the sprite sheets turned out to be quite troublesome, because I'd often end up with half-cropped frames that wouldn't animate properly. I think cropping sheets like this comes with practice, since I got it right only after dozens of unsuccessful attempts.
  2. PyProcessing does not produce meaningful error messages when code fails to compile, which meant I might have to parse through dozens of lines of code to spot a tiny error.
  3. PyProcessing does not have standard documentation for the Serial library, which meant that I had to use the Java documentation to translate the code into Python and just hope it worked.
  4. Soldering the ends of the joystick proved to be extremely challenging, since the leads were thin and close together. Michael showed me a soldering technique for such joints and it worked well.

Further Improvements.

1. I would have preferred to use Processing in Java, considering the vast amount of documentation available for the original version of Processing.

2. I would include a button and the functionality to let the player control when the Hero slashes. I skipped this due to time constraints.

3. I would also like to create a better storyline for the game itself, to make it more enriching and engaging.

 

Acknowledgements

Professor Michael Shiloh for his constant support, encouragement and advice throughout the whole journey,
James and Ume for always being there around the lab to make sure we knew the resources we had, plus advice on how to make use of them.
Alex for teaching me how to use Photoshop to crop images,
Jennifer for helping me out with getting the Serial library to work with PyProcessing,
Diego and Laine for helping me out with the joystick and designing the box using http://www.makercase.com/,
Mateo for helping me resolve errors in the logic of my code,
And lastly, the community at the IM Lab who created an amazing work environment and acted as a strong support system throughout the entire process!

Assignment due December 20

Concept:

For our final project, Russell and I were a bit lost as to what to do. Because the idea of space is so huge and encompassing, we weren't sure what we could do that would serve the purpose of the class, being interactive, as well as fit the theme, i.e. be space-y. Russell suggested the simple idea of just making a Space Invaders game. We liked the idea and so we started to work with it. Later on, however, as we prototyped, we decided we wanted to stay in the same area but drift a little to create something not entirely new, but at the same time new for us. We started to make a prototype with simple boxes falling, where the user would control another box and make sure it didn't explode. Soon afterwards I remembered watching something that asked whether we are destined to go extinct in the same way the dinosaurs did. This felt pretty similar to the prototype we already had, and I thought that maybe, if we had gone back in time, we could have protected the Earth and the dinosaurs from the asteroids that in the end killed them. Afterwards Russell and I began adding images to the boxes to distinguish between the spaceship and the asteroids.

Robert and I were at first working with just our mousepads to control the spaceship, but we thought it wasn't really as interactive as we wanted it to be. I decided we could use what we had learnt not so long ago to make the user feel like they are playing in real space, in a 3D environment, rather than on a small laptop. I suggested that we could create a controller and use brightness tracking or color tracking to track the location of the player and then map it to the screen. Robert agreed with this and we fiddled around with brightness tracking, color tracking and even an accelerometer, in case our final location turned out to be too bright to use either color or brightness tracking. In the end color tracking was the most stable, so we decided to work with that.

Russell and I got a cheap plastic toy gun, removed the insides, and added a small circuit with a button and an LED. We placed the LED at the tip of the toy gun (where the muzzle is located) so that the user could aim it just like any normal gun, and Russell attached the trigger of the gun to the button so that when the user pulled the trigger they pressed the button and fired bullets in the game. We decided on using a toy pistol as our controller because we thought about its signifiers and its applications. Guns are, sadly, everywhere, and as such are known by most if not all people, and there are very few things to interact with, which in turn makes it very simple and easy to use. We did not label the gun because, as mentioned, it is rather straightforward to use. We did, however, place an 'x' on the floor at our booth to show users where to stand to aim the pistol.

Prototype Images: 


Dec. 21st Ju Hee Noh Final Project

My project was to create a sound visualizer that shows differences in sound with lights. I wanted to create a lamp that detects sound and lights up in different colors. To do this, I used LED NeoPixels inside the lampshade. I wanted the lights to be distributed equally throughout the lamp, so I removed the original lightbulb the lamp had and inserted a PVC pipe coiled with NeoPixels.

 

 

I wanted to choose a lampshade that was a bit see-through so the light changes would show easily. I found a lamp that had a malleable lampshade, and a PVC pipe. I used the PVC pipe to stand in the middle of the lamp and coiled about 150 NeoPixels around it. The lampshade was a good choice in that it was open at the top and bottom, which made it easier to detect any sound or music. I am pretty happy with the overall materials, but it would have been easier if I could have found a more sensitive microphone or sound detector.

To create the project, I had to learn how to use NeoPixels. I had not used them before, and wanted to find a way to control them individually. I went to the Adafruit website and watched several videos to learn how to code them.

The overall electronic part consists of:

  • Arduino
  • Arduino microphone
  • Arduino shield
  • Neo-pixels
  • Power source

The software part was harder for me, and I created my code based on a tutorial: https://learn.sparkfun.com/tutorials/addressable-rgb-led-music-and-sound-visualizer

This is the code:

 

When I was coding, I wanted to use both Processing and Arduino, with Processing detecting the sound and the Arduino lighting up the LEDs, but the new code was faster at catching the pitches and showing the lights.
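As a very stripped-down sketch of that Arduino-only idea (measure the sound level from the microphone, then map it onto the NeoPixel colours), with the pin numbers, pixel count and scaling all placeholders rather than the values from the tutorial:

    #include <Adafruit_NeoPixel.h>

    // Stripped-down illustration; the full project follows the SparkFun tutorial.
    const int MIC_PIN = A0;
    const int LED_PIN = 6;
    const int NUM_PIXELS = 150;

    Adafruit_NeoPixel strip(NUM_PIXELS, LED_PIN, NEO_GRB + NEO_KHZ800);

    void setup() {
      strip.begin();
      strip.show();                          // start with all pixels off
    }

    void loop() {
      // measure a rough peak-to-peak sound level over a short window
      unsigned long start = millis();
      int lo = 1023, hi = 0;
      while (millis() - start < 30) {
        int s = analogRead(MIC_PIN);
        if (s < lo) lo = s;
        if (s > hi) hi = s;
      }
      int level = hi - lo;

      // louder sound shifts the colour from blue toward red
      int shade = constrain(map(level, 0, 300, 0, 255), 0, 255);
      for (int i = 0; i < NUM_PIXELS; i++) {
        strip.setPixelColor(i, strip.Color(shade, 0, 255 - shade));
      }
      strip.show();
    }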

 

The three most difficult parts were coding, prototyping, and making the project sturdy.

Prototyping was hard, because I only had a vague idea of what I wanted to do. However, I believe prototyping actually helped me develop my project, and it was a good place to start instead of starting with something big. Coding was the hardest part for me, because this class was the first time I ever coded, because I had never used NeoPixels before, and also because the Arduino microphone was not as sensitive as I wanted it to be. Lastly, as I built the lamp, I wished it had been a bit sturdier.

If I had the chance to improve my project, I would like to work more on the code and figure out a new version that runs the lamp better. Moreover, I would like to make sure the PVC pipe is strongly attached to the lamp.

Final project writeup

This writeup reviews Navya and Diego’s final project

Concept: What was your project about? How did you use technology to accomplish this? What design principles did you apply?

Dwelling on the theme of space, we decided to work on a game that would combine travelling across outer space with travelling an actual (but small) physical distance. For that reason, we decided to do a project that would allow the user to "travel" space by controlling a spaceship.

To that end, we settled on creating a final project comprising a) a spaceship, b) a controller and c) a playground.

The player controlled the spaceship (a) using a controller (b), and he or she moved it around a designated space (c) that represented the galaxy. On a monitor we displayed different tasks that the player had to complete by moving the spaceship to certain planets. We used sensors hidden beneath the playground to control the progress of the game.

To create these elements, we used different materials: electronic components, like motors and Arduinos, and manmade and natural materials, like acrylic and wood. In terms of technology, we used electronic components (motors, XBees, shields, Arduinos, and others) and software (Arduino, Processing) to achieve movement and communication.

We used XBees to communicate the movement of the joystick to the motors through the Arduinos. By reading the joystick movement, we would produce a "case" (1 = forward, 2 = left…) that was sent to the motors, which then acted accordingly. That way, we controlled the movement of the spaceship.
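In skeleton form, the controller side reads the joystick, picks a case number and prints it to serial, and the XBee forwards whatever is printed. The sketch below is only an illustration: 1 = forward and 2 = left are from the description above, while the pins, thresholds and remaining case numbers are placeholders.

    // Controller-side sketch (illustrative). Cases 0, 3 and 4 are placeholders.
    const int X_PIN = A0;
    const int Y_PIN = A1;

    void setup() {
      Serial.begin(9600);              // the XBee shield forwards this serial stream
    }

    void loop() {
      int x = analogRead(X_PIN);
      int y = analogRead(Y_PIN);

      int command = 0;                 // 0 = stop (placeholder)
      if (y > 700)      command = 1;   // forward
      else if (x < 300) command = 2;   // left
      else if (x > 700) command = 3;   // right (placeholder)
      else if (y < 300) command = 4;   // backward (placeholder)

      Serial.println(command);         // received wirelessly by the spaceship's Arduino
      delay(100);
    }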

Secondly, we used sensors to control the storyline. These sensors were wired beneath the plywood and sent information to the Arduino. Communication with Processing then ensured that progress was recorded and the storyline displayed on the screen changed.
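Again in skeleton form, the playground Arduino just polls the hidden sensors and reports over serial which planet the spaceship is sitting on, so Processing can advance the mission. The sensor type, pins and threshold below are placeholders:

    // Playground Arduino, sketched from the description (pins and threshold are placeholders).
    const int NUM_SENSORS = 8;                 // one sensor per planet
    const int SENSOR_PINS[NUM_SENSORS] = {A0, A1, A2, A3, A4, A5, A6, A7};   // Mega analog pins
    const int THRESHOLD = 500;                 // assumed trigger level

    void setup() {
      Serial.begin(9600);
    }

    void loop() {
      for (int i = 0; i < NUM_SENSORS; i++) {
        if (analogRead(SENSOR_PINS[i]) > THRESHOLD) {
          Serial.println(i);                   // tell Processing which planet was reached
        }
      }
      delay(100);
    }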

In short, we used electronic components and other materials, together with programs like Arduino and Processing, to allow the player to drive a spaceship through space on a mission.

 

Hand drawn sketch, computer drawing, or a photograph of the overall project

 

Discuss the materials and construction techniques. Why did you choose these? Knowing what you know now, would you have chosen different materials or techniques?

To create this final project, we used several different materials and technologies.

For the spaceship we used an Arduino, an XBee, XBee and motor shields, two 9V batteries, and two motors with wheels attached as well as a normal wheel. The components were mounted onto a piece of acrylic and, on top of it all, we had a Lego Star Wars spaceship. We mounted the components using screws and zip ties, ensuring that the motors and most of the wiring were hidden underneath the acrylic. Only the Arduino and the shields were slightly visible, but they were mostly covered by the spaceship.

For the controller, we used a joystick that was connected to an Arduino, with its corresponding XBee and XBee shield. The controller itself was made out of laser-cut acrylic, designed using the website http://www.makercase.com. We mounted the joystick with screws, and had the Arduino, XBee and batteries hidden inside the controller.

We wanted the project  to be as clean and straightforward as possible, keeping in mind that the simpler the interface, the better the user experience will be.

Lastly, the playground was a combination of hardwood, plywood and sensors. Eight sensors were hidden beneath a plywood platform that we painted black to recreate space. Along the edges of the playground, we built barriers to prevent the spaceship from falling off and to show the player where each planet was. The sensors were wired up underneath and connected to a breadboard and an Arduino Mega. The latter was directly connected to the computer and sent on the information received by the sensors.

 

What did you have to learn in order to complete your project? How did you learn this? (Include links to any useful resources)

We had to learn a lot in terms of improving the quality of the materials and the presentation of our project. With regard to hardware, we had to ask for help from both the wood and scene shops to gather the materials and build the playground. In terms of software and circuitry, we had to learn how to use wireless communication to control the spaceship remotely, and how to use a MotorShield to control the movement of our spaceship. Furthermore, we also had to understand the circuitry and design of the spaceship.

There are a host of YouTube videos and online resources that explain how to establish wireless communication using XBee modules. After browsing through a few, we managed to get a brief understanding of how they function. We then approached Professor Shiloh and he helped us set up the communication between two XBees.

Adafruit has a comprehensive tutorial on how to get started with the MotorShield. After reading through this tutorial and running the example program on a motor, we understood how to use the MotorShield to make the motors move as we needed. The library is pretty straightforward, and the tutorial gave us everything we needed to know.
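The heart of that example is only a few lines. The sketch below is a minimal version in the spirit of the Adafruit example, assuming the v2 shield and its Adafruit_MotorShield library (the motor port and speed are arbitrary):

    #include <Wire.h>
    #include <Adafruit_MotorShield.h>

    // Minimal MotorShield test: spin one DC motor, then let it coast.
    Adafruit_MotorShield AFMS = Adafruit_MotorShield();
    Adafruit_DCMotor *motor = AFMS.getMotor(1);   // DC motor on port M1

    void setup() {
      AFMS.begin();            // default I2C address and PWM frequency
      motor->setSpeed(150);    // 0 (off) to 255 (full speed)
    }

    void loop() {
      motor->run(FORWARD);
      delay(1000);
      motor->run(RELEASE);     // let the motor coast
      delay(1000);
    }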

Lastly, in terms of "hardware," working with wood was a bit difficult because none of us had the proper tool training. Using the jigsaw was challenging because we had to find materials of the right length, and finding good surfaces on which to use the tool was very hard, but nonetheless we had a lot of fun. We really enjoyed building and painting the playground because it was a very practical, rewarding activity.

 

Describe the electronic and electrical part of your project

We had 3 main components in our project and there were electrical components in each of them.

  1. The Spaceship: The spaceship consisted of an Arduino, a MotorShield, an XBee shield, an XBee, 2 DC motors, 2 wheels and 2 9V batteries. It moved according to the values sent by the joystick (b).
  2. The controller: The controller was made up of an analog joystick connected to an Arduino with its corresponding XBee and XBee shield, as well as a 9V battery.
  3. The playground: as stated before, the playground was made of wood but it had a set of sensors underneath. We had 8 sensors, a capacitor and an Arduino Mega. 

Overview, describing the general operation

The player controlled the spaceship with the joystick. He or she moved it around the playground and was asked to complete tasks (i.e. go to Mars). These tasks were completed by moving the spaceship to different locations. When a task was completed, the player was prompted with the next mission.

Schematic

Controller:

Spaceship:

Play area:

Describe the software part of your project

We had 4 sketches running at the same time, each taking care of different parts of the project:

 

  • Spaceship controller: records the movement of the joystick
  • Spaceship move: uses the information from the first sketch to move the spaceship
  • Play area Arduino: reads the position of the ship (has it been to each planet yet?) and sends it to Processing
  • Play area Processing: displays the storyline and prompts the player with each mission

 

Overview, describing the general operation

We had three different parts which required individual Arduino sketches to run. With one sketch, we recorded the movement of the joystick. The second sketch used this information to make the motors move.

A third sketch read the position of the spaceship, then transferred the information to processing. The processing sketch directed the user to different planets in the play area.

Upload your program(s)

Spaceship controller:

Spaceship move:

Play Area Arduino:

Play Area Processing:

Describe the mechanical part of your project

The playground was constructed from a 1.2 x 1.2 m plywood piece that was nailed to a frame that elevated it 2 inches. Three of the sides were protected with barriers made out of thin plywood. It was all painted black to simulate space.

What were the 3 most difficult parts of your project?

What we initially assumed would be the most difficult aspects of our project were not really that difficult. Learning how to work with XBees and the MotorShield took no time at all. The most difficult parts of the project were constructing the different components and getting them to work.

We had a lot of problems with the batteries, since they kept running out of juice. Furthermore, finding the materials for the playground was hard and we also required a lot of help from the Wood Shop people. Lastly, having the XBee inside the box of the controller made communication harder.

Knowing what you know now, what would you have done differently?

We would definitely have started the project earlier. That would have given us room for improvement.

We would have liked to have a bigger joystick, and maybe, instead of making it wireless, a bigger box fixed to the table and wired up, to ensure that there was no lag in the communication.

More pictures and videos

Available on this album

Final Project: Light installation

Concept: 

There were two parts to my project. One was a controller, which can literally be controlled by your own hand, and the other was a mini box installation with lights. My project represented the pollution and trash we create in space. I will further explain the details with the pictures. But basically, I wanted to create a space where lights change the perspective from which my art is viewed. I used flex sensors attached to a glove to construct the controller, and NeoPixels to shine different lights into the box.

Pictures: 

As shown by these pictures, I wanted to create a feeling of destruction but at the same time change that feeling of destruction into something more beautiful. The changing light is supposed to symbolize the light pollution we create in space. Seen in different satellite pictures, Earth sparkling with light may seem very pretty. However, in reality, it causes destruction. The "star-shaped" structure hanging above is supposed to represent Earth, and the scraps of paper and wires on the bottom are the trash that we leave in space.

Physical Construction of the project:  

  1. Building the controller: Challenging yet really fun. It required lots of soldering (I honestly feel like I've mastered it at this point). Before soldering, I made a prototype with a breadboard and tried out the sensors to check that they were all functioning properly. However, what I had not realized was how sensitive the flex sensors were: after a while the plastic coating on the sensor gets bent, giving different results. After the prototype, I soldered the wires onto a permanent breadboard. I really enjoyed this process because the end result looked very professional. Afterwards, I sewed the flex sensors into a glove so that "interactors" would actually be able to control something with their own hands.
  2. Building the box: This part was extremely difficult for me because I needed to think of the most convenient, neat and easy way to display my installation. First, I did not know that NeoPixels could not be controlled with separate pins. Although it is possible, the software side of it would be way too complicated for my skills. Therefore, Michael suggested ways to separate the pixels yet still have them controlled by different fingers. Honestly, this was so much easier and a lot simpler. What I realized while using NeoPixels was that they may seem complicated, but they are actually very simple. I divided them into strips, cut them, and then connected them back together with wires so that they could all be controlled through one pin. Moreover, because NeoPixels are sensitive, to make sure they weren't bent or broken I cut out some acrylic pieces to protect the pixels that would be randomly placed inside the box.

As shown by the schematic, my inputs were the flex sensors, which command the output, the NeoPixels, to shine lights.

Software: 

  • The software part was easier than expected. I used analogRead() with the A0-A4 pins I had connected the flex sensors to. James helped me write the code from scratch. I also referred back to the simple analog read test that we had done with buttons and LEDs to understand the basics. I used if statements so that the NeoPixels are commanded by the flex sensors, and I mapped the flex readings onto the scale of colour values so that I could create different shades and control the colours more effectively with my controller. Using an else statement, I turned the lights off when the flex sensors were not bent.
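The logic boils down to something like the sketch below, a rough reconstruction of the idea only: the pin numbers, thresholds, segment sizes and colour mapping are all placeholders.

    #include <Adafruit_NeoPixel.h>

    // Rough reconstruction of the mapping: each flex sensor drives its own segment.
    const int LED_PIN = 6;
    const int NUM_SENSORS = 4;            // one flex sensor per finger, on A0..A3
    const int PIXELS_PER_SEGMENT = 10;    // pixels in each sensor's strip segment

    Adafruit_NeoPixel strip(NUM_SENSORS * PIXELS_PER_SEGMENT, LED_PIN, NEO_GRB + NEO_KHZ800);

    void setup() {
      strip.begin();
      strip.show();
    }

    void loop() {
      for (int s = 0; s < NUM_SENSORS; s++) {
        int flex = analogRead(A0 + s);                      // bend reading for this finger
        for (int i = 0; i < PIXELS_PER_SEGMENT; i++) {
          int p = s * PIXELS_PER_SEGMENT + i;
          if (flex > 400) {                                 // finger is bent
            int shade = constrain(map(flex, 400, 800, 0, 255), 0, 255);
            strip.setPixelColor(p, strip.Color(shade, 0, 255 - shade));
          } else {                                          // finger straight: lights off
            strip.setPixelColor(p, 0);
          }
        }
      }
      strip.show();
      delay(20);
    }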

Reflection: 

I wish I had learned more about the materials I used beforehand. Although I had read about NeoPixels and flex sensors, because I had never had the chance to actually use them myself I still felt like a beginner (which I was). Had I known more about them, I feel like I could have done something cooler. I think I would still use the same materials if I were to recreate it, except that I would replace my box with a transparent plastic box so it would be more sustainable and could also be used as a lamp, not just an installation. I liked keeping the mechanics outside because I really enjoy looking at them. Moreover, during the exhibition, a lot of people were actually very interested in looking at the mechanics and asked a lot of questions about them. I really enjoyed having people from all disciplines come and ask me questions about how I designed and created the project, whether from an engineering perspective or an artistic point of view.

Three difficulties: 

  1. Soldering the wires onto a permanent breadboard: The process was a little confusing. Not only was I using a used breadboard that already had some components on it, it was also confusing to work out what went where in order to make it most convenient for users, location-wise, when putting it on display.
  2. Building the installation: I wasn't too sure of the best way to place the NeoPixels and wires inside the box so that when people looked into the box, the lights would create the effect I wanted. I carefully and intentionally placed the wires so that when the light shines, you can see the shadows of the wires cast against the wall.
  3. Exhibition: Because the flex sensors were bent a lot as more people tried them on, it was difficult to have each sensor control one NeoPixel strip. They were still creating cool effects, but not exactly what I had commanded them to do through the software. I wish there were sturdier flex sensors that were less sensitive.

December 20: Final Project

Final Project Documentation

AURORA BOREALIS

When I was brainstorming ideas for our space theme, I thought about the aurora borealis (also known as the northern lights), a natural light display in the sky seen in Arctic and Antarctic regions. It has always been a dream of mine to see one. I was inspired by the acrylic + NeoPixels crown project that James showed at his NeoPixels workshop, and it made me think about how light from colourful LEDs could disperse through transparent acrylic parts that copy the wavy shape of an aurora.

My project is an art installation imitating a beautiful natural phenomenon that also invites people to interact with it by controlling certain values to play around with the overall look of the installation.

(Note: since I couldn’t attend the IM show, videos included in the post were filmed while still working on the project and do not show its final look)

 

It was a true joy to work on it and combine what I've learned in this class to design and create something that would not only look good, but also involve user interaction.

The final design looks quite simple and minimalistic. Its affordances are clear, as the control box has only two knobs on top that encourage people to twist them and then see what happens.

MOST DIFFICULT PARTS OF THE PROJECT:

  1. Creating the colour patterns;
  2. Gluing parts and calculating correct measurements (even including the weight);
  3. Attaching the potentiometers to the box

Materials & Construction: Physical, Electronic and Mechanical Components.

The physical installation consists of the "Aurora" and a box with potentiometers that hides the Arduino. Components:

  • 1 Adafruit neopixel strip (20 pixels);
  • 1 transparent acrylic sheet (cut in 20 pieces);
  • acrylic base;
  • 1 Arduino;
  • acrylic box;
  • external power supply;
  • 2 potentiometers (10k ohm) and knobs.

PART 1: Aurora Borealis

I used the laser cutter to make 20 thin rectangles of different lengths (varying from 14 cm to 31 cm long). I chose transparent acrylic (3-6 mm thick) to create the light dispersion effect that I was initially aiming for. I was happy with the range of lengths they had: the difference made the colour sequence look more dynamic, and I don't think I needed to make the rectangles any longer. First, taller parts are trickier to attach to the base. Second, since the entire look of my project depends on the overall lighting in the display room, taller acrylic pieces do not light up as much when the room is bright. However, in 006 it was pretty dark, so the pieces fully lit up and I was satisfied with the way my project looked.

My Aurora stands on a white acrylic base that has a "pit" channel in the middle to create space for the 20-pixel NeoPixel strip between the base and the rectangles attached above it with super glue. This construction gave the strip some mobility, so it could be taken out whenever I wanted while I was playing around with the code and carrying the strip around with me. Eventually, I took the paper off the back of the NeoPixel strip and stuck it to the base so that it would not be misplaced by someone during the show.

Knowing what I know now, I would probably do the construction in a slightly different way. I saw Tayla using matte acrylic, which I think would also look cool with my project and might have recreated the northern lights effect even better. Also, I would make my acrylic pieces wider so they could be glued to the base more easily (though that could change the light dispersion effect, just as the length does). Alternatively, I could have cut rectangular "holes" for the strip into the rectangles themselves.

I was actually going to make the installation bigger by attaching another similar block with 10 more rectangles, for 30 rectangles and NeoPixels in total, but I gave up on the idea once Michael suggested we simplify our projects in order to save time.

PART 2: Control Box

I used the website http://www.makercase.com to create the template for my box, which hides the Arduino, breadboard etc. and has two potentiometers on top to control the brightness and speed of my colour sequence. The box also has two holes on the sides for the USB cord and the wires that connect to the NeoPixel strip.

To be honest, I got the measurements wrong for my first box, because I did not take the thickness of the acrylic into account. The second box had holes too small to attach the potentiometers without them moving along with the knobs. Michael suggested I add an extra rectangular hole for the tiny "stabiliser" tab that those potentiometers have to prevent them from moving around. I also made the round holes 1 mm bigger. (Lesson learnt: size matters! Even 1 mm can make a huge difference.) In the end, I did not have to re-cut the entire box, only the top part.

Also, I wish we had had more white acrylic left, because I had to make my box black, which created an out-of-place colour contrast with the base of the Aurora. In general, though, I am pleased with the way it looks – a nice way to hide the messy wires and so on.

What is more, my strip uses an external 5V power supply, so it does not draw power from a laptop but is connected directly to the power socket.

 

 

Schematic:

Software:

My code underwent hundreds of changes as I played around with the colour sequence. I started out with the Adafruit NeoPixel library's simple example code to understand how the strip works, and tried to customise that code for the effect I was going for. I wanted my aurora to gradually change colours, fading in and out. The challenge was also to use a small colour range of mostly green with some blue, yellow etc., because the aurora borealis' colour actually depends on many factors such as latitude and altitude, but in general it is mostly green. For this reason, the sample code was not that helpful, as it is too fast-bright-disco-rainbow.

Later, May showed me this cool website https://hohmbody.com/flickerstrip/lightwork/?id=1042# that lets you draw the pattern you want and generates the code for you. It was fun to play around with, but I felt it was wrong not to write my own code, because I wanted to use what I've learnt in this class. However, it took me a long time to figure out what I needed to do with the code for it to look like what I imagined, until James gave me the best advice ever: start over and write the code yourself from scratch. And it worked!

The code is basically a bunch of for-loops that change colour values for certain pixels. While I was creating it, I tried to sketch out the sequence, because it’s difficult to visualize colours from numbers on the screen.
What I'd like to learn: whether there is a way to make the code less long and repetitive.
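The overall shape of the sketch is roughly the following: read the two potentiometers, then run for-loops that slowly raise and lower a mostly-green colour across the 20 pixels. This is only a simplified reconstruction; the pins, step size and exact colour mix are placeholder choices.

    #include <Adafruit_NeoPixel.h>

    // Simplified reconstruction of the sketch's shape, not the final code.
    const int LED_PIN = 6;
    const int NUM_PIXELS = 20;
    const int BRIGHT_POT = A0;   // potentiometer for brightness
    const int SPEED_POT  = A1;   // potentiometer for speed

    Adafruit_NeoPixel strip(NUM_PIXELS, LED_PIN, NEO_GRB + NEO_KHZ800);

    void setup() {
      strip.begin();
      strip.show();
    }

    void showLevel(int level) {
      strip.setBrightness(map(analogRead(BRIGHT_POT), 0, 1023, 10, 255));
      int wait = map(analogRead(SPEED_POT), 0, 1023, 5, 60);    // speed knob
      for (int i = 0; i < NUM_PIXELS; i++) {
        // mostly green, with a little blue, to stay in the aurora palette
        strip.setPixelColor(i, strip.Color(0, level, level / 4));
      }
      strip.show();
      delay(wait);
    }

    void loop() {
      for (int level = 0; level <= 250; level += 5) showLevel(level);   // fade in
      for (int level = 250; level >= 0; level -= 5) showLevel(level);   // fade out
    }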