Jeff’s Status Report for 4/8

Personal Progress

This week I updated the hardware-software interface to allow blind movement to be interrupted. I also got cardboard to build the fake window frame for the final demo and began construction of the frame.

I am personally on schedule since the hardware system is fully complete, and I am waiting for the software team to finish before performing full system testing.

Plans for Next Week

There are plenty of sunny days in the coming week, so we will be doing full integrated system testing and calibration. I will also finish building the frame.

The tests planned for the hardware system include sending in test vectors that command the blinds to move to 10 cm intervals (10 cm, 20 cm, and 30 cm from the top). I also plan to test the light-detection circuit by moving it into and out of the light and making sure the response is correct.
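For reference, the sketch below shows roughly what such a test harness could look like if it drives the motor controller over a serial link; the port name, the "MOVE"/"LIGHT?" command strings, and the reply format are all placeholders rather than our actual interface.

```python
# Hypothetical test harness for the blinds hardware.
# Assumes a serial link to the motor controller and a simple text protocol
# ("MOVE <cm>", "LIGHT?") -- both are placeholders, not the actual interface.
import time
import serial  # pyserial

PORT = "/dev/ttyACM0"          # placeholder port name
TARGETS_CM = [10, 20, 30]      # positions measured from the top of the window

def send(link, command):
    """Send one command line and return the controller's reply."""
    link.write((command + "\n").encode())
    return link.readline().decode().strip()

with serial.Serial(PORT, 9600, timeout=5) as link:
    # Test vectors: ask the blinds to move to each 10 cm interval.
    for cm in TARGETS_CM:
        reply = send(link, f"MOVE {cm}")
        print(f"MOVE {cm} -> {reply}")
        time.sleep(2)  # give the blinds time to settle before measuring by hand

    # Light-detection circuit: cover/uncover the sensor and read its state.
    for expected in ("light", "dark"):
        input(f"Place the sensor in the {expected} condition, then press Enter")
        print(f"LIGHT? -> {send(link, 'LIGHT?')} (expected {expected})")
```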

Dianne’s Status Report for 4/8

Personal Progress

This week I worked on fine-tuning the LAOE portion of the algorithm and creating more rigorous tests. The changes ensure that edge cases where a person is on or near the boundary of the LAOE are treated as "in LAOE" to be safe, and I also added a few more test cases. This will need to be tested on one of the upcoming sunny days. The changes I made can be seen in our repository as usual: https://github.com/djge/18500_GroupC1
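As a minimal sketch of the "treat boundary cases as in-LAOE" idea (not the actual code in the repository), the helper below pads the projected sunlight polygon by a safety margin before the containment check; the margin value, function names, and the use of shapely are illustrative assumptions.

```python
# Illustrative boundary handling for the LAOE check (not the repo's actual code).
# Pads the projected sunlight quadrilateral by a safety margin so a person
# standing on or near the edge is still treated as "in LAOE".
from shapely.geometry import Point, Polygon  # shapely used here only for illustration

MARGIN_M = 0.15  # placeholder safety margin in metres

def in_laoe(person_xy, projection_corners, margin=MARGIN_M):
    """True if the person is inside the projected light area, padded by `margin`."""
    laoe = Polygon(projection_corners).buffer(margin)
    return laoe.contains(Point(person_xy))

# Example: a person just outside the raw quadrilateral still counts as in-LAOE.
corners = [(0.0, 0.0), (1.2, 0.0), (1.2, 0.8), (0.0, 0.8)]
print(in_laoe((1.25, 0.4), corners))  # True, thanks to the margin
```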

I also worked briefly with Elizabeth on setting up the RPi on Wednesday.

We are currently on track, and I hope to run some more integrated testing with a real person in sunlight to ensure that the adjustments I made will work in a real setting.

Comprehensive Testing Update

I have already run tests measuring the difference between the calculated and measured projection points from the window onto the floor (this determines the "area of effect"). There are 7 test cases with 4 points each, for a total of 28 measured points. We may continue to collect data on sunny days for this, but our focus going forward will be on integration testing with a real-life scenario of a person moving in and out of the LAOE, as well as testing whether or not the change in blinds mitigates the effects of sunlight.

Next Steps

Next week, we hope to run more real-situation testing, as it should be sunny. We also hope to finish setting up a way to present the blinds so they are more visually appealing, and to come up with adjustments that will make the system more user-friendly.

Elizabeth’s Status Report for 4/1

Progress

I worked on creating test files to more systematically gather and test data for the User Extraction. My files can be found in test/distance_testing in the team repository. As described in the README, you run get_data.py to gather a data point, add your real measurements to measured_data.txt, and run test_data.py to calculate the (x, y, z) values and the percent error. We gathered one data point this week, and for that point our calculated (x, y, z) had (13.0%, 2.7%, 6.4%) error. The image of the data point can be seen using the display() function in test_data.py. In the future we will strive to gather more test points, in more unique user positions.

Dianne and I also spent some time prepping the Raspberry Pi. The Pi is mostly set up and we have installed OpenCV and other packages on it, but installing pyrealsense2 has taken longer than expected, as it cannot simply be pip installed on the Pi and must be compiled from source.
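As a point of reference for the percent-error step above, the snippet below is a simplified sketch of how a per-axis error check could be computed; it is not the exact contents of test_data.py, and the numbers in the example are made up.

```python
# Simplified sketch of a per-axis percent-error check for User Extraction,
# not the exact contents of test/distance_testing/test_data.py.

def percent_error(calculated, measured):
    """Return per-axis percent error between calculated and measured (x, y, z)."""
    return tuple(abs(c - m) / abs(m) * 100.0 for c, m in zip(calculated, measured))

# Example with made-up numbers (metres): errors print as percentages per axis.
calc = (1.13, 0.75, 2.02)
meas = (1.00, 0.73, 1.90)
print(["%.1f%%" % e for e in percent_error(calc, meas)])
```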

Schedule

We are roughly on schedule.

Next Steps

By the demo, I hope to finish at least some kind of integrated version of our product (where the motor system is connected to our software side). If this cannot be finished before the demo, then it is the goal for next week at the latest. I also want to do more testing.

Jeff’s Status Report for 4/1

Personal Progress

This week I was not able to fully assemble the blinds because my lease prevents me from installing them in my bedroom window, so I looked for workarounds. I considered plywood, PVC pipes, cardboard, and other ways to build a frame, but because I am not allowed into the woodworking area and lack power tools, we settled on command strips. After placing the order for the command strips, I manually held up the blinds to test whether the setup would work and at what rate the blinds can move. I also helped Dianne and Elizabeth with software testing and integration.

I am currently still behind schedule because of the issue detailed above. There is no real way to adjust the schedule, since the arrival of the command strips is outside our control; the delay can only shorten the testing period.

Plans for Next Week

If the command strips arrive by next week, I will begin full integrated testing in a real environment. In the meantime, I will help test software accuracy and think of other temporary measures to enable earlier full system testing.

Dianne’s Status Report for 4/1

Personal Progress

This week I worked on both testing and integration with my teammates. Since Wednesday was a sunny day, we did some testing on the User Position Extraction algorithm. We were able to figure out some issues with it and make some changes, such as accounting for the height of the LIDAR camera's stand.

We also did some testing with the LAOE algorithm. One part was the projection coordinates function, where we map the coordinates of the corners of the window onto where the light is cast on the floor. We kept getting strange values for the y-value of the top left and top right coordinates, but we eventually realized that this was because the ledge of the roof outside and above the window was blocking the top portion of the window from direct light. After accounting for the length of this ledge, the algorithm works well with relatively low error (~4%).

We also manually tested the intersection algorithm, which uses both the user position and the projection coordinates. This also worked well enough for our initial standards.
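For anyone following along, the projection step boils down to similar triangles: each window corner's shadow lands (corner height) / tan(altitude) away along the direction the light travels, and the roof ledge effectively lowers the top of the lit portion of the window. The sketch below is my paraphrase of that geometry with a flat-floor assumption and illustrative names, not the repository's implementation.

```python
# Illustrative geometry for projecting window corners onto the floor
# (flat floor, corner heights measured from the floor). Names and the
# ledge handling are a paraphrase, not the repository's implementation.
import math

def project_corner(x, y, height_m, sun_altitude_deg, sun_azimuth_deg):
    """Project one window corner onto the floor along the sun direction.

    (x, y) is the corner's horizontal position; the shadow lands
    height / tan(altitude) metres away, in the direction the light travels
    (opposite the compass azimuth at which the sun is seen).
    """
    reach = height_m / math.tan(math.radians(sun_altitude_deg))
    az = math.radians(sun_azimuth_deg)
    return (x - reach * math.sin(az), y - reach * math.cos(az))

def lit_top_height(window_top_m, ledge_height_m, ledge_depth_m, sun_altitude_deg):
    """Top of the directly lit part of the window, given a roof ledge above it."""
    shade_drop = ledge_depth_m * math.tan(math.radians(sun_altitude_deg))
    return min(window_top_m, ledge_height_m - shade_drop)

# Example: a 1.5 m-high corner with the sun at 35 degrees altitude, due south,
# and the lit top of a 2.0 m window under a 0.4 m ledge when the sun is at 60 degrees.
print(project_corner(0.0, 0.0, 1.5, 35.0, 180.0))
print(lit_top_height(2.0, 2.3, 0.4, 60.0))
```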

We are currently on track. We are working on getting a construct to attach the blinds, but all testing besides integration testing is done.

Next Steps

Next week, we hope to finish the MVP and fine-tune the accuracy of the algorithms.

Team Status Report for 4/1

Risk and Risk Management

The major risk that we have is the physical installation of the blinds. The blinds need to be drilled into a window, but we cannot legally drill into the windows of any of our houses. Our current workaround is to attach the blinds to a cardboard construct and attach that construct to the window, or wherever we want to put them. We also need command strips to attach the cardboard to the window; if these do not arrive on time, we will temporarily use duct tape.

System Design Changes

As of now there hasn’t been a change to the existing design of the system.

Schedule

The schedule has not changed yet, and everyone is roughly on track for the time being.

Team Status Report for 3/25

Risk and Risk Management

The major risk that we have is related to full integration testing. Real-time testing requires sunny weather, and we also have other time commitments, so we need sunny days to line up with our availability to test for capstone. This makes rapid revision and testing very difficult. We aim to mitigate this by using data recorded on past sunny days to simulate a sunny day. We won't be able to get real-life confirmation, but we can double-check the system's response against what we think the ideal response should be.
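One simple way to do this, sketched below, is to log the inputs the system would see on a sunny day (timestamps, sun angles, user positions) and replay them through the decision logic offline; the CSV columns and decide_blind_height() are hypothetical placeholders, not something that exists in our repository yet.

```python
# Placeholder sketch for replaying a recorded sunny day through the decision
# logic offline; the CSV columns and decide_blind_height() are hypothetical.
import csv

def decide_blind_height(sun_altitude, sun_azimuth, user_x, user_y):
    """Stand-in for the real LAOE check and blind-height decision."""
    return 0.0  # the real system would compute a target blind height here

def replay(log_path):
    with open(log_path, newline="") as f:
        for row in csv.DictReader(f):
            target = decide_blind_height(
                float(row["sun_altitude"]), float(row["sun_azimuth"]),
                float(row["user_x"]), float(row["user_y"]))
            print(row["timestamp"], "->", target)

# replay("sunny_day_log.csv")  # run against a previously recorded log
```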

System Design Changes

As of now there hasn’t been a change to the existing design of the system.

Schedule

The schedule has not changed yet, and everyone is roughly on track for the time being.

Elizabeth’s Status Report for 3/25

Progress

Dianne and I worked on integration.py (the work can be found in the team repository), which essentially integrates the LAOE algorithm and the User Position Extraction. It mostly takes the work I did last week and calls Dianne's LAOE functions. In the process of integration, I realized that some of my math from last week was incorrect and had to edit my calculations. There were also some other minor changes, like turning the testsuncalc.py file into a module with a get_suncalc() function, and we ran into merge conflicts along the way.

I also set up the Raspberry Pi by inserting the microSD card, putting on the heatsinks, attaching the fan, and putting it in its case. However, I didn't realize that to interact with the Pi I would need some kind of monitor. Jeff has a monitor, so the next time we are both available, I will try working with the Pi in his room.
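At a high level, the loop ties the pieces together roughly like the sketch below; get_suncalc() is the module function mentioned above, while every other name is a stand-in stub (defined here only so the sketch runs), not the real function names in integration.py.

```python
# Rough shape of the integration loop (paraphrased, not integration.py itself).
# get_suncalc() is the module function mentioned above; everything else is a
# stand-in stub so the sketch runs on its own.
import time

def get_suncalc():                     # stand-in for the testsuncalc module function
    return 35.0, 180.0                 # (altitude, azimuth) in degrees

def find_user_position():              # stand-in for face detection + user extraction
    return (1.0, 0.5, 1.6)             # (x, y, z) of the detected user, or None

def laoe_contains(user_xyz, altitude, azimuth):   # stand-in for the LAOE check
    return True

def request_blind_adjustment(user_xyz):          # stand-in for the motor command
    print("would move blinds for user at", user_xyz)

def main_loop(iterations=3):
    for _ in range(iterations):        # the real loop would run indefinitely
        user = find_user_position()
        if user is not None:
            altitude, azimuth = get_suncalc()
            if laoe_contains(user, altitude, azimuth):
                request_blind_adjustment(user)
        time.sleep(1.0)                # poll roughly once per second

main_loop()
```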

Schedule

So far my progress is roughly on schedule.

Next Steps

Next week, I hope to upload the software end of things into the Raspberry Pi, and see if it works. I will also work on connecting the Raspberry Pi and the Arduino (likely with Jeff).

Jeff’s Status Report for 3/25

Personal Progress

This week I was able to speed the motor system up to a reasonable speed by setting Vref to the correct voltage and sending out control pulses at the fastest pace possible. I also finished setting up the interface for the software team to send the commands they want the motor to perform, and finished assembling the 3D-printed gear pieces with the motor.
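As a hedged illustration of what "control pulses at the fastest pace possible" means in software, the snippet below toggles a STEP pin on a typical STEP/DIR stepper driver; the pin numbers and pulse delay are placeholders, it assumes the pin is driven from Raspberry Pi GPIO rather than whichever board we end up using, and the current limit is set separately through the driver's Vref potentiometer.

```python
# Hedged sketch of stepping a STEP/DIR stepper driver as fast as the motor
# tolerates. Pin numbers and the pulse delay are placeholders; current limiting
# is handled separately via the driver's Vref potentiometer.
import time
import RPi.GPIO as GPIO

STEP_PIN, DIR_PIN = 20, 21        # placeholder BCM pin numbers
PULSE_DELAY_S = 0.0005            # placeholder: shortest delay the motor handles reliably

def move_steps(steps, direction_up=True):
    """Issue `steps` pulses in the requested direction."""
    GPIO.output(DIR_PIN, GPIO.HIGH if direction_up else GPIO.LOW)
    for _ in range(steps):
        GPIO.output(STEP_PIN, GPIO.HIGH)
        time.sleep(PULSE_DELAY_S)
        GPIO.output(STEP_PIN, GPIO.LOW)
        time.sleep(PULSE_DELAY_S)

GPIO.setmode(GPIO.BCM)
GPIO.setup([STEP_PIN, DIR_PIN], GPIO.OUT)
try:
    move_steps(200)               # one full revolution for a 200-step motor, no microstepping
finally:
    GPIO.cleanup()
```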

I am currently still on schedule, and no adjustment is needed.

Plans for Next Week

Next week I plan to test the motor system with actual blinds attached.

Dianne’s Status Report for 3/25

Personal Progress

This week I worked on both testing and integration with my teammates.

Elizabeth and I worked on integrating the main file that continuously runs face detection and does the calculations when a face is found. I also wrote a testing file to make sure the projection coordinates are correct, as this is the most important part for accuracy. There are a few issues right now (I might have the axis or angle mixed up relative to North, as the sign of the x-value is flipped). The full scope of the work can be found in our repository: https://github.com/djge/18500_GroupC1.
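My current guess is that the sign flip is a convention mismatch between compass azimuth (measured clockwise from North) and the math convention (counterclockwise from East); the tiny check below makes one assumed mapping explicit. The axis naming (+x = East, +y = North) is an assumption, not necessarily how our code defines it.

```python
# Quick check of the azimuth-to-xy convention (assumed axes: +x = East, +y = North).
# A convention mix-up like this flips the sign of x, matching the symptom above.
import math

def sun_direction_xy(azimuth_deg):
    """Unit vector toward the sun for a compass azimuth (clockwise from North)."""
    az = math.radians(azimuth_deg)
    return (math.sin(az), math.cos(az))   # (East component, North component)

print(sun_direction_xy(90.0))    # due East -> (1.0, ~0.0)
print(sun_direction_xy(270.0))   # due West -> (-1.0, ~0.0); a mixed-up convention would give +1
```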

Jeff and I discussed potential ways to integrate the motor system with the adjustment of the blinds, but we have yet to integrate them, as we first need to calibrate the motor's rotation against the blind size to figure out how to move the blinds up or down by a given amount. We are currently on track.
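The calibration we need boils down to a steps-per-centimetre constant; a back-of-the-envelope version is below, where the motor step count, microstepping setting, and spool diameter are made-up numbers to be replaced with measured values.

```python
# Back-of-the-envelope calibration from motor steps to blind travel.
# All three constants are placeholders to be replaced with measured values.
import math

STEPS_PER_REV = 200        # full steps per motor revolution (1.8 degree motor)
MICROSTEPS = 8             # driver microstepping setting
SPOOL_DIAMETER_CM = 2.5    # diameter of the spool/rod the blind cord wraps around

CM_PER_REV = math.pi * SPOOL_DIAMETER_CM
STEPS_PER_CM = STEPS_PER_REV * MICROSTEPS / CM_PER_REV

def steps_for(distance_cm):
    """Number of microsteps to raise or lower the blinds by distance_cm."""
    return round(distance_cm * STEPS_PER_CM)

print(STEPS_PER_CM)        # ~203.7 steps per cm with these made-up numbers
print(steps_for(10))       # e.g. a 10 cm move
```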

Next Steps

Next week, we hope to have an integrated system so we can start to debug and fine-tune the parts of the project that are not performing as well as they should. Additionally, I will work on the issues with the algorithm and write tests for the intersect function as well.