Midterm
Process
I started thinking about my physical computing midterm well in advance, probably more than a month ago. Unfortunately, I'm only getting to the midterm documentation now.
I met with Tom during office hours on September 18th to discuss my first idea. The first idea I had was to create a Pepper's Ghost illusion and use an Arduino OLED screen to display a p5.js sketch that reacts to an ultrasonic sensor. The animation would be of a person running; as you get closer, the person runs faster, and past a certain distance, the sketch says "Boo!" Tom pointed out that the interaction is minimal: at the end of the day, it is essentially just walking back and forth.
Then I started thinking of another midterm idea based on an article I came across: https://www.nationalgeographic.com/animals/article/fireflies-light-pollution-green-spaces-new-york-dc. I wanted to make an installation based on this article. But the challenge was that I didn't want to use LEDs to represent fireflies. This idea was definitely one that I pursued for a bit because it was exciting to me.
On September 24th, I asked Arjun if he wanted to work with me for the midterm. He already had a line of enquiry he wanted to pursue, but he did want to contribute to as many projects as he could. So later that week, on October 1st, we discussed my idea a little more; he helped me think through how I might want to approach it and got me thinking more deeply about the project. I realized that I wanted to represent 'resilience' through fireflies rather than just build an installation.
Then I spoke with Christina Tang on Friday, October 3rd, and she told me that LEDs, with shadows cast from a good light source, would be a good approach, and suggested doing the Week 5 lab with a bulb to get myself more familiar with LEDs.
Then later, on the 4th, I went to Princeton to visit one of my friends, and they told me that this project might seem really interesting to me, but to a viewer, it could easily be brushed off and doesn't necessarily engage them. So I thought more about my project that day and came up with the idea of my board game, 'Trash!'.
Initially, Matt helped me come up with the idea that I'd have a big bin and people would use tokens from that bin to sort trash into multiple bins. So on the 5th, I prototyped with cardboard, conductive tape, and resistors: depending on which resistor value the token closed the circuit through, the speaker would play a different note.
#include "pitches.h";
int peakValue = 0;
int threshold_1 = 750;
int threshold_2 = 820;
int note[] = {
NOTE_A4, NOTE_G4};
const int speakerPin = 8; // pin number for the speaker
const int noteDuration = 200; // play notes for 20 ms
void setup() {
// put your setup code here, to run once:
Serial.begin(9600);
}
void loop() {
// Serial.println(analogRead(A1));
int sensorValue = analogRead(A1);
// map the results from the sensor reading's range
// to the desired pitch range:
// check if it's higher than the current peak:
if (sensorValue > peakValue) {
peakValue = sensorValue;
}
if (sensorValue <= threshold_1) {
if (peakValue > threshold_1 && peakValue < threshold_2) {
// you have a peak value:
tone(speakerPin, note[0], noteDuration);
// delay as long as the note is playing:
delay(noteDuration);
// reset the peak variable:
peakValue = 0;
} else if (peakValue > threshold_2) {
tone(speakerPin, note[1], noteDuration);
// delay as long as the note is playing:
delay(noteDuration);
// reset the peak variable:
peakValue = 0;
}
}
}
Then, in the next class, we discussed our projects, and afterwards I realized I hadn't thoroughly thought through the project's concept. The way I presented was also confusing, because I get nervous during presentations, so people in the class thought I was trying to make an actual trash segregator, not the board game. Tom suggested that I collaborate with Galt and use computer vision, but I clarified that I wanted to make a board game. For a moment, I did think it would be really interesting to build an actual trash segregator. However, I didn't want to use computer vision because, to me, it didn't make sense: using a tool that generates trash to create a tool to segregate waste. So the same week, I met with Tom again.
When I met with Tom, he agreed with me that computer vision isn't necessarily the best way to detect trash, and he suggested that if I'm really interested, I could look into it for my thesis. He suggested that I use light and color sensors, which I could try out from the Shop on the floor, and use p5.js for the game's audio output.
Then I met with Phil, and he gave me ideas for fabrication. He told me to use MakerCase to make a box, and since I know a little Rhino, he said I could potentially design a trash chute, though he suggested I use Fusion 360 for it. He also suggested that I get quarter-inch-thick plywood from Home Depot or Lowe's to fabricate with.
After that, I asked Phil if there were color and light sensors available in the ER, but he said there weren't, so I emailed Tom to ask what resources I could use to find them. The next day, though, I was near the shop talking to Prisha, who told me to try working with RFID or hall-effect sensors, and also to prototype my midterm. She said a light sensor was most likely available in the Shop, so I went with her and got the color sensor.
Then, for the next couple of days, I spent some time looking into the datasheet: https://cdn-learn.adafruit.com/downloads/pdf/adafruit-as7341-10-channel-light-color-sensor-breakout.pdf. While looking online, I also came across this GitHub page: https://github.com/samthetinkerer/DIY_Light-spectrum-analyser. To start testing the sensor, I used this code as a reference and updated it. At that time, I was thinking about using the TCA9548A multiplexer because, as per my initial plan, I wanted to connect three of the sensors. At this point, I still hadn't clearly figured out how to configure a sensor for color detection. While testing the initial code, I saw orange as the highest value at some points, but I couldn't figure out why, so I emailed Tom.
// code adapted from samthetinkerer DIY_Light-spectrum-analyser
/* This example will read all channels from the AS7341 and print out reported values */
#include <Adafruit_AS7341.h>
#include <Arduino.h>
#include <Wire.h>

Adafruit_AS7341 as7341;

int violet = 0;
int blue = 0;
int teal = 0;
int green = 0;
int yellow = 0;
int Yellowgreen = 0;
int orange = 0;
int red = 0;
int farRed = 0;

void setup() {
  Serial.begin(115200);
  // Wait for communication with the host computer serial monitor
  while (!Serial) {
    delay(1);
  }
  if (!as7341.begin()) {
    Serial.println("Could not find AS7341");
    while (1) { delay(10); }
  }
  as7341.setATIME(100);
  as7341.setASTEP(999);
  as7341.setGain(AS7341_GAIN_256X);
}

void loop() {
  // Read all channels at the same time and store in as7341 object
  if (!as7341.readAllChannels()) {
    Serial.println("Error reading all channels!");
    return;
  }
  // scale each channel's 16-bit reading down to a 0-400 range:
  violet = map(as7341.getChannel(AS7341_CHANNEL_415nm_F1), 0, 65536, 0, 400);
  blue = map(as7341.getChannel(AS7341_CHANNEL_445nm_F2), 0, 65536, 0, 400);
  teal = map(as7341.getChannel(AS7341_CHANNEL_480nm_F3), 0, 65536, 0, 400);
  green = map(as7341.getChannel(AS7341_CHANNEL_515nm_F4), 0, 65536, 0, 400);
  yellow = map(as7341.getChannel(AS7341_CHANNEL_555nm_F5), 0, 65536, 0, 400);
  Yellowgreen = map(as7341.getChannel(AS7341_CHANNEL_590nm_F6), 0, 65536, 0, 400);
  orange = map(as7341.getChannel(AS7341_CHANNEL_630nm_F7), 0, 65536, 0, 400);
  red = map(as7341.getChannel(AS7341_CHANNEL_680nm_F8), 0, 65536, 0, 400);
  farRed = map(as7341.getChannel(AS7341_CHANNEL_NIR), 0, 65536, 0, 400);

  Serial.print("violet: ");
  Serial.println(violet);
  Serial.print("blue: ");
  Serial.println(blue);
  Serial.print("teal: ");
  Serial.println(teal);
  Serial.print("green: ");
  Serial.println(green);
  Serial.print("yellow: ");
  Serial.println(yellow);
  Serial.print("Yellowgreen: ");
  Serial.println(Yellowgreen);
  Serial.print("orange: ");
  Serial.println(orange);
  Serial.print("red: ");
  Serial.println(red);
  Serial.print("farRed: ");
  Serial.println(farRed);
}
I then emailed Tom to ask whether I needed a multiplexer, and he told me that, rather than using a multiplexer or multiple controllers, I should try working with just one. In retrospect, I agree: it was a better strategy to get one sensor working with one bin, get the readings sending reliably, and connect it to p5.js via asynchronous serial.
He mentioned that if this project is interesting, I should spend the extra money on additional hardware for the final.
Tom sent me his notes on the sensor: https://tigoe.github.io/LightProjects/spectrometers/. He also mentioned that if I'm using it for color detection, I will need a diffuser to cover the sensor, but not the LED on the board, since the LED provides a reference light for the object being sensed. I also asked why mainly orange was being detected, and he explained that it depends on the ambient light: I was working on the floor at night, when the lights get warm, so the longer wavelengths were more prevalent; during the day, blue wavelengths are more prevalent.
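Since the board's LED is the reference light, here is a minimal sketch of how I understand it can be turned on through the Adafruit library; the enableLED() and setLEDCurrent() calls are my assumption about the library's helpers, and the 10 mA value is just a placeholder, not something from Tom's notes.
// minimal sketch (assumed Adafruit_AS7341 LED helpers; 10 mA is a placeholder value):
#include <Adafruit_AS7341.h>

Adafruit_AS7341 as7341;

void setup() {
  Serial.begin(9600);
  if (!as7341.begin()) {
    Serial.println("Could not find AS7341");
    while (true) delay(10);
  }
  as7341.setLEDCurrent(10);  // LED drive current in mA (assumed placeholder)
  as7341.enableLED(true);    // turn on the onboard LED as the reference light
}

void loop() {
  // color readings would go here, with the LED lighting the object over the sensor
}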
So the next day, I revamped my code after referring to Tom's notes. I was able to establish serial communication between the Arduino and p5.js using p5.webserial, and I used the Spectrograph code, which was written for serial input from the AS7341.
/*
  AS7341 sensor readings. Reads and prints the results
  as a CSV string. This version uses a non-blocking
  approach, using the startReading() and checkReadingProgress()
  functions. It's about 2 seconds between readings.

  Details for this sensor can be found on the AMS product page here:
  https://ams-osram.com/products/sensor-solutions/ambient-light-color-spectral-proximity-sensors/ams-as7341-11-channel-spectral-color-sensor
  In particular, the product data sheet, user guide, and
  Application Note AS7341 AN000633, "Spectral Sensor Calibration Methods"
  are of most use.

  Library:
  http://librarymanager/All#Adafruit_AS7341

  created 18 Jun 2021
  modified 16 Feb 2025
  by Tom Igoe
*/

#include <Adafruit_AS7341.h>

// instance of the sensor library:
Adafruit_AS7341 as7341;
// array to hold the raw readings:
uint16_t readings[12];
float channels[12];
// header string for CSV:
String headers = "415nm,445nm,480nm,515nm,555nm,590nm,630nm,680nm,Clear,NIR";
// correction factors: these corrections are from Application Note AS7341 AN000633,
// "Spectral Sensor Calibration Methods", fig. 10. These are for channels F1 through F8.
// TODO: This math needs to be corrected.
// values 4 and 5 are 0 because they are not used (see datasheet)
float corrections[] = { 3.20, 3.00, 2.07, 1.30, 0.0, 0.0, 1.07, 0.93, 0.78, 0.71 };

void setup() {
  // init serial; if the serial port's not open, wait briefly:
  Serial.begin(9600);
  if (!Serial) delay(50);
  if (!as7341.begin()) {
    Serial.println("Sensor not found, check wiring");
    while (true)
      ;
  }
  // set integration time:
  as7341.setATIME(100);
  as7341.setASTEP(999);
  as7341.setGain(AS7341_GAIN_256X);
  // print column headers:
  Serial.println("setup");
  Serial.println(headers);
  // start a new reading cycle:
  as7341.startReading();
}

void loop() {
  if (!as7341.checkReadingProgress()) return;
  // if the current reading is complete:
  if (readSensor()) {
    // print the results:
    for (int r = 0; r < 12; r++) {
      // skip readings 4 and 5 as they are repeats:
      if (r == 4 || r == 5) continue;
      Serial.print(channels[r]);
      if (r < 11) Serial.print(",");
    }
    Serial.println();
    // start a new reading cycle:
    as7341.startReading();
  }
}

bool readSensor() {
  // take the current reading, do corrections,
  // and put the result into the channels array:
  // get the readings:
  if (!as7341.readAllChannels(readings)) return false;
  // there are 12 elements in the readAllChannels array,
  // but elements 4 and 5 are not used. So channel number is different
  // than array index:
  int channelNum = 0;
  // loop over the readings and put them in the channels array:
  for (int r = 0; r < 12; r++) {
    // skip readings 4 and 5 as they are repeats:
    if (r == 4 || r == 5) continue;
    channels[r] = as7341.toBasicCounts(readings[r]);
    if (r < 10) {
      channels[r] = channels[r] * corrections[r];
    }
  }
  return true;
}
After that, I was getting values from the spectrograph, but I couldn't tell from how far away it was detecting.
So, I went to the shop and talked to Nikolai and Nasif to understand exactly what a diffuser is and whether I should purchase one based on the documentation I saw for the diffuser: https://look.ams-osram.com/m/436a32f63ba06bad/original/AS7341-Details-for-Optomechanical-Design.pdf.
Instead, they showed me how to make a diffuser using acetate paper. I sanded the acetate with P3000 and P2500 sandpaper, and once it had a frosted texture, I placed it over the sensor and continued detecting, but the results remained confusing to me. At this point, though, I was still trying to detect from really far away.
Then Prisha came over and told me that I need to put the object really close to the sensor. She put her green scarf right over the sensor, and that's when I realised how close you have to be for it to detect the colors. For some reason, the sensor also detected better without the diffuser than with it, so I decided to let go of the diffuser.
In the meantime, as a backup, I took Prisha's advice and decided to try out hall-effect sensors, which could be connected to the breadboard like potentiometers. This is what I had coded for it. I decided that I'd give myself until the next Wednesday to configure the sensor, or else switch to a different one.
void setup() {
  Serial.begin(9600);
}

void loop() {
  // print the analog reading from the hall-effect sensor on pin A6:
  Serial.println(analogRead(A6));
}
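As a sketch of how the backup might have worked (the baseline and threshold numbers here are hypothetical, not measured), a magnet in a token passing over the sensor would push the analog reading away from its resting value:
// hypothetical thresholding of the analog hall-effect reading; values are placeholders
const int hallPin = A6;
const int restingValue = 512;  // assumed no-magnet baseline, roughly mid-scale
const int threshold = 100;     // assumed change needed to count as a token

void setup() {
  Serial.begin(9600);
}

void loop() {
  int reading = analogRead(hallPin);
  // a magnet pulls the reading above or below the baseline:
  if (abs(reading - restingValue) > threshold) {
    Serial.println("token detected");
  } else {
    Serial.println(reading);
  }
  delay(100);
}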
Prisha gave me the idea of printing and cutting out colored circles, putting them right over the sensor, and mapping the values.
I was then confused about which colors to map to the sensor, or even how to start mapping values, so I talked to Arjun, and he told me to try mapping the values to red, blue, and green ranges, since that would be easier.
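To make that suggestion concrete, here is a rough Arduino-side sketch of the kind of grouping I had in mind: sum the blue-ish, green-ish, and red-ish AS7341 channels and report whichever group is largest. The channel groupings and the comparison are my own simplification for illustration, not code from the project.
// rough sketch: group AS7341 channels into blue/green/red sums and pick the largest
#include <Adafruit_AS7341.h>

Adafruit_AS7341 as7341;

void setup() {
  Serial.begin(9600);
  if (!as7341.begin()) {
    Serial.println("Could not find AS7341");
    while (true) delay(10);
  }
  as7341.setATIME(100);
  as7341.setASTEP(999);
  as7341.setGain(AS7341_GAIN_256X);
}

void loop() {
  if (!as7341.readAllChannels()) return;
  // sum channels into rough blue / green / red groups (a simplification I chose):
  long blueSum = as7341.getChannel(AS7341_CHANNEL_415nm_F1)
               + as7341.getChannel(AS7341_CHANNEL_445nm_F2)
               + as7341.getChannel(AS7341_CHANNEL_480nm_F3);
  long greenSum = as7341.getChannel(AS7341_CHANNEL_515nm_F4)
                + as7341.getChannel(AS7341_CHANNEL_555nm_F5);
  long redSum = as7341.getChannel(AS7341_CHANNEL_630nm_F7)
              + as7341.getChannel(AS7341_CHANNEL_680nm_F8);
  // report whichever group dominates:
  if (redSum > greenSum && redSum > blueSum) {
    Serial.println("red");
  } else if (greenSum > blueSum) {
    Serial.println("green");
  } else {
    Serial.println("blue");
  }
}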
After talking to Arjun, I planned my project again using just one sensor. In class, after talking to Tom about my project, I also wrote down all the possible questions I had while making a diagram of what the project might look like, and roughly planned a schedule, which I didn't end up sticking to.
So the next week, I started mapping values and creating general functions in p5. I ended up hard-coding the ranges based on the readings I got. But the way I mapped them didn't hold up when I used those ranges to detect colors: the readings overlapped, so green and blue fell within red's range.
I showed my numbers to Prisha too. She was concerned about the delay in serial communication between the Arduino and p5, so she suggested I remove delay() and use millis() instead, and explained how millis() works.
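As a rough sketch of what she meant (the 100 ms interval and the analog pin are placeholders), the idea is to check millis() each time through loop() and only send when the interval has passed, instead of blocking with delay():
// timing serial sends with millis() instead of delay(); interval and pin are placeholders
const unsigned long sendInterval = 100;  // ms between serial sends (assumed value)
unsigned long lastSend = 0;

void setup() {
  Serial.begin(9600);
}

void loop() {
  // only send when the interval has passed; the rest of loop() stays responsive:
  if (millis() - lastSend >= sendInterval) {
    lastSend = millis();
    Serial.println(analogRead(A0));
  }
  // sensor reading and other work can happen here without waiting on delay()
}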
Then I spoke with Guy, who is good with mathematics, but even he wasn't able to quite make sense of the readings. Then I spoke more with William; I showed him my numbers and code, and he was baffled. Then I asked James for help, but he was also really confused by the numbers, though he told me to use LEDs to map the values. Then I was talking to Arjun and told him I had spent the whole day mapping on paper, and he told me that next time I should run my experiments by people before I actually start doing them, so I spend less time. I realised I had spent the whole weekend working on this, which is quite a long time to spend trying to do it myself, only to realise at the very end that it wasn't working. But that's alright: I learnt to manage my time better, and a bit more about the project process.