Dianne’s Status Report for 2/25

Progress:

This past week, as a group we worked on specifying what we need for our project as well as on the Design Review. I worked on the full system diagram depicting how the parts of the project fit together and how they are connected.

Individually, I got started on the code for the Light Area of Effect (LAOE) algorithm. I made a rough outline of the different functions I will need to write and implemented a few of them. I wrote out the algorithm for projecting the light coming through the window onto the floor, based on the manual testing we did last week. The code will still need to change after more testing with the data we have on hand.
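A rough sketch of the projection step (the function and parameter names here are placeholders rather than our final API, and the math bakes in our experimental finding that azimuth only shifts the lit area sideways along the wall):

```python
import math

def project_corner(corner_x, corner_h, altitude, azimuth, window_azimuth):
    """Project one window corner onto the floor, assuming light travels in
    straight lines. corner_x is the corner's position along the window wall,
    corner_h its height above the floor (meters); angles are in radians.
    Returns (x, y): offset along the wall and distance into the room."""
    # Per our experiment, the distance from the wall depends only on altitude.
    y = corner_h / math.tan(altitude)
    # The azimuth, relative to the direction the window faces, only shifts
    # the projected point sideways along the wall.
    x = corner_x + y * math.tan(azimuth - window_azimuth)
    return (x, y)

def project_window(corners, altitude, azimuth, window_azimuth):
    # The four projected corners outline the lit patch on the floor.
    return [project_corner(cx, ch, altitude, azimuth, window_azimuth)
            for (cx, ch) in corners]
```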

Additionally, I wrote the start of some code for checking the intersection between a point (a person) and the volumetric prism of light cast into the room. All my changes are reflected here: https://github.com/djge/18500_GroupC1
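A minimal sketch of that check, reduced to 2D for now: test whether the person's floor position lies inside the lit quadrilateral formed by the four projected corners (again, the names are placeholders, not the committed code):

```python
def point_in_quad(point, quad):
    """Return True if the 2D point lies inside a convex quadrilateral given
    as four (x, y) floor points in a consistent winding order."""
    px, py = point
    sign = None
    for i in range(4):
        x1, y1 = quad[i]
        x2, y2 = quad[(i + 1) % 4]
        # The cross product tells us which side of this edge the point is on.
        cross = (x2 - x1) * (py - y1) - (y2 - y1) * (px - x1)
        if cross != 0:
            if sign is None:
                sign = cross > 0
            elif (cross > 0) != sign:
                return False  # the point is outside at least one edge
    return True
```

A full 3D test against the prism would do the same check at the person's face height, by projecting the window corners onto a horizontal plane at that height instead of onto the floor.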

Next Steps

So far, I am on track with the development of the algorithm. Next week, I hope to bring the rest of the code to a point where we can start writing test cases for the individual functions and for the functions that will determine what the changes to the blinds should be. We will also be working on the Design Report as a team throughout the next week.

Team Status Report for 2/18

Risks and Risk Management

The most significant risk as of now is that our LAOE testing depends not only on our schedules but also on the weather (we can only test when it's sunny). The contingency plan for this risk is to collect as much data as possible on sunny days.

System Design Changes

So far there are no real changes to the design/requirements, besides swapping our Intel RealSense LiDAR Camera L515 for an Intel RealSense Depth Camera D455. This change was necessary because the L515 doesn't perform well in sunlight/ambient light. The main impact of the switch is cost: we have to pay significantly more for the D455 rather than simply borrowing the L515 from ECE Receiving. We've looked into possible light issues with the D455, and there should be no problem on that front.

Schedule

The schedule has not changed so far, but moving forward it will likely depend largely on how accurate our LAOE algorithm is after adjustments and more testing.

Us taking data together!

Elizabeth’s Status Report for 2/18

We met as a team to collect data, going around to each other's places to find which spot was best for testing. Jeff and I first tried to get data on Tuesday morning, but failed due to time limitations and physical ones (we discovered that many of our windows face away from the sunlight at this time of year). As a team we also worked on the design presentation.

As a team, we tested our Intel RealSense LiDAR Camera L515, but discovered that it doesn't work under sunlight/ambient light (more can be found here). We decided to switch plans and go with the Intel RealSense Depth Camera D455 instead, as its accuracy is not affected by sunlight and it has a 6-meter range. I also helped order the materials and estimate the cost of the parts we need 3D printed.

I also looked into the pros and cons of different distance sensors, to see whether an ultrasonic distance sensor might fit our needs better. Referencing multiple websites, I found that LiDAR generally measures distance more accurately, while ultrasonic sensors tend to be cheaper and use less power. Because our accuracy goals are fairly strict, LiDAR is the better choice for measuring a person's distance. I also looked at the Intel RealSense GitHub, but haven't started a small demo yet.

Progress

As a team, we are on schedule, though I feel like I could’ve accomplished more this week.

Deliverables

Over the next week, I will work on an OpenCV example using the Intel RealSense LiDAR Camera L515 (while we are still in the process of getting the D455). I will also help Dianne with the LAOE adjustments and help in general with getting more test points as needed.
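The example will likely follow the standard pyrealsense2 capture loop, something like the sketch below (the stream resolutions and settings are placeholder guesses):

```python
import numpy as np
import cv2
import pyrealsense2 as rs

# Start streaming depth and color frames from the camera.
pipeline = rs.pipeline()
config = rs.config()
config.enable_stream(rs.stream.depth, 640, 480, rs.format.z16, 30)
config.enable_stream(rs.stream.color, 640, 480, rs.format.bgr8, 30)
pipeline.start(config)

try:
    while True:
        frames = pipeline.wait_for_frames()
        depth_frame = frames.get_depth_frame()
        color_frame = frames.get_color_frame()
        if not depth_frame or not color_frame:
            continue
        depth_image = np.asanyarray(depth_frame.get_data())
        color_image = np.asanyarray(color_frame.get_data())
        # Color-map the 16-bit depth image so it can be shown next to the RGB feed.
        depth_colormap = cv2.applyColorMap(
            cv2.convertScaleAbs(depth_image, alpha=0.03), cv2.COLORMAP_JET)
        cv2.imshow('RealSense', np.hstack((color_image, depth_colormap)))
        if cv2.waitKey(1) & 0xFF == ord('q'):
            break
finally:
    pipeline.stop()
    cv2.destroyAllWindows()
```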

Courses: 18202, 15112

Dianne’s Status Report for 02/18/2023

Current Progress:
This week, I got started coding the API for the Light Area of Effect (LAOE). We first wanted to make sure our math was correct, so we took data points in one of our living rooms. We measured the horizontal distance from a corner of the window to the corresponding corner cast onto the ground (x) as well as the distance of that corner away from the window (y), and recorded the time of day so that we could look up the altitude and azimuth of the sun at that time.

I took these data points and applied the formula we had in mind in order to check it. Here is the manual math done with the data points to find the (x, y) position relative to a spot on the window:
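(Illustrative numbers rather than our actual measurements: a corner 1.2 m above the floor with the sun at a 40° altitude should land about y = 1.2/tan(40°) ≈ 1.43 m from the wall, and with the sun 15° off the window's normal, shift sideways by roughly x ≈ 1.43 · tan(15°) ≈ 0.38 m. Each measured (x, y) pair is compared against predictions like these.)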


The values did not match the data points within the 5 cm range we had originally intended. We may need to revise the formula going forward, for example to account for refraction and other factors.

Next Steps:
First, I will finish coding the formula that maps the corners of the window onto the ground. Along with that, we will write an algorithm to check whether a point intersects the area/volume created by the window's light, as well as the algorithm for calculating the change to the blinds; a rough sketch of that last calculation is below.
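The sketch uses the same straight-line light model (the function and its window/face parameters are assumptions for illustration; the person's distance will eventually come from the depth camera):

```python
import math

def blinds_drop_needed(person_y, face_height, altitude,
                       window_top, window_bottom):
    """Return how far the blinds must extend down from the top of the window
    so that direct sunlight no longer reaches a face at horizontal distance
    person_y (meters) from the window and height face_height (meters).
    The sun's altitude is in radians."""
    # Under the straight-line model, the ray that hits the face entered the
    # window plane at this height.
    entry_height = face_height + person_y * math.tan(altitude)
    if entry_height >= window_top or entry_height <= window_bottom:
        return 0.0  # that ray misses the window, so the face is already shaded
    # Lower the blinds until their bottom edge reaches the entry height.
    return window_top - entry_height
```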

ECE Courses:
Since I am working on the software track, the skills I am using come mostly from software courses. This includes 15-112, for a strong foundation in computer science, and 15-462, for help with the LAOE algorithm.

Jeff’s Status Report for 2/18

Personal Progress

Our original plan of utilizing the RealSense LiDAR Camera L515 that the school already had fell through: after some testing, we found that sunlight severely hinders the L515's depth perception range (down to a maximum depth range of 1.5 m). This led me to research new hardware that fits our needs. We settled on the RealSense Depth Camera D455 because it has a 6 m range in sunlight and comes from a vendor that we know we can trust and that has good documentation. However, the D455 is pretty pricey, so we double-checked the hardware needed to build our motorized blinds to make sure it all fits within budget. I was able to get everything to fit with about $100 still available as leeway.

We also collected data to test our Light Area of Effect (LAoE) algorithm on Wednesday. The predicted light area from our LAoE is about 10% off from the actual area the sunlight covered. This is expected, since our LAoE algorithm treats the sun as a point source whose light travels in straight lines; in reality the sun has a noticeable angular size and the window glass refracts light slightly, so the actual lit area spreads a little beyond the prediction.

While waiting for the hardware purchase to be approved and for the parts to arrive, I helped with the design presentation and design report.

We are ahead of schedule, because we originally scheduled work to begin after the design presentation, so all the work we are doing now was planned for the coming week.

Classes knowledge utilized

For this week's work, I utilized knowledge from 18220 to understand the circuit diagrams of other Arduino-controlled motorized blinds.

Plans for Next Week

If we are able to obtain the new RealSense Depth Camera D455 and the hardware for the blinds within the next week, I will work on setting them up. During the downtime, I will continue assisting Elizabeth and Dianne with their work.

Elizabeth’s Status Report 2/11

This week, Jeff and I researched the different hardware we could use in our project. We decided we wanted a unit with an integrated LiDAR and camera to keep things simple. We initially considered a unit called Astra from Orbbec, but decided against it because its user guide and documentation were sparse and hard to navigate, and it had questionable reviews on Amazon. Because we wanted something with established documentation, we decided on an Intel unit, the Intel RealSense Depth Camera D415, and found a graph of its depth RMS error, which seemed acceptable. However, later, when we were looking at the ECE Inventory, we found an even better item, the Intel RealSense LiDAR Camera L515, which also seems to satisfy our accuracy requirements. The depth accuracy of the L515 is shown below. On the list we also found a Raspberry Pi, so we filled out the request form and notified our TA.

(Figure: depth accuracy of the L515)

I looked into using pre-existing APIs to calculate the azimuth/altitude of the sun, for use in calculating the sun's area of effect, but my approach was inefficient, as I should have guessed there would be a Python library for this. At first I tried making GET requests to suncalc.org, but the values we needed weren't directly in the HTML. When I looked up suncalc APIs online, at first I only saw the JavaScript API, so I spent some time trying to run JavaScript from Python (using the js2py library), but it kept giving errors about being unable to link the node modules the suncalc API requires. In the end, I found a Python library that gives us these values based on time and location, and used another library called geopy to find latitude/longitude based on an address. The short code is below. I haven't put it on the GitHub repository yet, as it's only a very preliminary and simple venture into our project.
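A sketch of that code, using pysolar for the sun angles and geopy for the geocoding (the example address and user-agent string are illustrative):

```python
from datetime import datetime, timezone

from geopy.geocoders import Nominatim
from pysolar.solar import get_altitude, get_azimuth

# Resolve an address to latitude/longitude.
geolocator = Nominatim(user_agent="blinds-project")
location = geolocator.geocode("5000 Forbes Ave, Pittsburgh, PA")

# pysolar expects a timezone-aware datetime.
now = datetime.now(timezone.utc)
altitude = get_altitude(location.latitude, location.longitude, now)
azimuth = get_azimuth(location.latitude, location.longitude, now)
print(f"altitude: {altitude:.1f} deg, azimuth: {azimuth:.1f} deg")
```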

Our progress is roughly on schedule.

For next week's deliverables, I plan to work with Dianne on the sun-position API and to start implementing our area-of-effect API. I will also work on the design review/report and presentation, as my group members and I have decided that I will be in charge of the next presentation. Finally, I will help my group members make more in-depth schematic diagrams for our project.

Jeff’s Status Report for 2/11

Personal Progress

I researched the exact LiDAR camera we should use, with the assistance of Elizabeth, and settled on the Intel RealSense cameras. I also researched Arduino-controlled motorized blinds that meet our requirements and found a guide for one. We noticed that the ECE inventory already has a RealSense camera and a Raspberry Pi 4, so we put in a request for this hardware. As for the motorized-blind hardware, we are waiting for our design meeting on Monday before finalizing the purchase, since this component of the project is not a prerequisite for any work other than integration.

Since we don't have the hardware on hand right now, I assisted Dianne in creating the Light Area of Effect (LAoE) algorithm. During development, we were not sure how the azimuth (the sun's angle from true north) affects how far the light projects away from the window wall. We performed an experiment to test this and found that the azimuth should not affect the projection distance from the wall. Photos of the experiment are in the team status report.

 (My proposed LAoE Algorithm)

I am currently not behind schedule, since we originally expected the hardware purchase to take place after the design presentation.

Plans for Next Week

If we are able to obtain the RealSense camera and Raspberry Pi 4 within the next week, I will work on setting them up and collecting some camera feeds for Dianne and Elizabeth to use as samples during software development. If the professor and teaching assistant approve our design, we can finalize our hardware purchases, and I can help Elizabeth and Dianne with algorithm/software development in the downtime. If the design is not approved, I will research other hardware alternatives.

Dianne’s Status Report for 02/11/2023

This week, I focused on fleshing out the design of the project and looked into how we will design the APIs, specifically the light area of effect algorithm. I created the following Git repository for the code we will be writing in the next week or so: https://github.com/djge/18500_GroupC1

I worked on a few diagrams to illustrate how the calculation in the Area of Effect of the Sun API works. Our group was not sure how the sun's angle from north might affect the reach of the area of effect from the window, so we conducted a few experiments using a paper bag with cut-out windows and a flashlight. The following images clarify our vision for this function:

The results of the experiment seemed to show that the azimuth angle does not affect how far from the window the light reaches. We decided to go with this result and will change the algorithm accordingly if it causes problems during testing.

So far, we are on schedule. Next week, I hope to get a working sun-position API with Elizabeth, start implementation of the area-of-effect API, and focus on the design review/report, with fully fleshed-out schematic diagrams from the whole team covering both the software and hardware ends of the project.

Team Status Report for 2/11

Risks and Risk Management

The most significant risk to our project is being unable to obtain or set up the LiDAR/camera device in time, as not many other steps can be performed in parallel. We are managing this risk by trying to reserve the device as soon as possible, so that the rest of the project can continue smoothly. If we don't manage to get the reservation, we are ready to order the device, as it is still within our budget.

System Design Changes

We more precisely defined the scope of the project. We decided that for our MVP, we won't account for weather (cloudy/rainy days), reflected or artificial light, or objects blocking the window. We want our project to work in a rectangular room of up to 300 ft², with no side longer than 30 feet. We want our design to specifically block light from a person's face. Our project targets a room on a second floor (at a height of roughly 10 ft above ground). It will work with at most three people in the room (it would be difficult to test with more). Our design will work as long as nothing in the room moves but people (so our project will not work with pets). It assumes the window is closed, since windy weather could interfere with the blinds' function. These constraints were necessary to keep the scope manageable; although they make our project less flexible, they make it more feasible. The schedule that was made previously will stay the same.

Ethical Considerations

Our project includes considerations for personal health. It should help protect people's eyes by blocking direct sunlight, and it may also help people's mental health, as research suggests natural light improves mood.

Experiment

This is the paper-bag experiment described in Jeff's and Dianne's status reports. It was a fun group meeting that helped us better determine where the area of light hits.