Team Status Report for 2/18

Risks and Risk Management

The most significant risk right now is that our LAOE testing depends not only on our schedules but also on the weather (we can only test when it’s sunny). Our contingency plan is to collect as much data as possible on the sunny days we do get.

System Design Changes

So far the only real change to the design/requirements is swapping our Intel RealSense LiDAR Camera L515 for an Intel RealSense Depth Camera D455. This change was necessary because the L515 performs poorly in sunlight/ambient light. The main impact is cost: instead of borrowing the L515 from ECE Receiving, we now have to pay significantly more to purchase the D455. We’ve looked into whether the D455 has similar lighting issues, and there should be no problem on that front.

Schedule

The schedule has not changed so far, but the schedule moving forward will largely depend on how accurate our LAOE algorithm turns out to be after adjustments and further testing.

Us taking data together!

Elizabeth’s Status Report for 2/18

We met as a team to collect data, going around to each other’s places to determine which location was best for testing. Jeff and I first tried to collect data on Tuesday morning but failed, due to both time constraints and physical ones (we discovered that many of our windows face away from the sun at this time of year). As a team we also worked on the design presentation.

As a team, we tested our Intel RealSense LiDAR Camera L515 and discovered that it doesn’t work under sunlight/ambient light (more can be found here). We decided to switch plans and go with the Intel RealSense Depth Camera D455 instead, as its accuracy does not depend on the presence of sunlight and its range is 6 meters. I also helped order the materials and estimate the cost of the parts we need 3D printed.

I also looked into the pros and cons of different distance sensors, to see whether an ultrasonic distance sensor might fit our needs better. Referencing multiple sources, I found that LiDAR generally measures distance more accurately, while ultrasonic sensors tend to be cheaper and use less power. Because our accuracy goals are relatively high, LiDAR is the better choice for measuring the distance to the person. I also looked at the Intel RealSense GitHub repository, but haven’t started a small demo yet.

Progress

As a team, we are on schedule, though I feel like I could’ve accomplished more this week.

Deliverables

Over the next week, I will work on an OpenCV example using the Intel RealSense LiDAR Camera L515 (while we are still in the process of getting the D455). I will also help Dianne with the LAOE adjustments and help in general with collecting more test points as needed.

Courses: 18202, 15112

Elizabeth’s Status Report 2/11

This week, Jeff and I researched the different hardware we could use in our project. We decided we wanted a unit with an integrated LiDAR and camera to keep things simple. We initially considered the Astra from Orbbec, but decided against it because its user guide and documentation were sparse and hard to navigate, and it had questionable reviews on Amazon. Because we wanted something with established documentation, we settled on an Intel unit, the Intel RealSense Depth Camera D415, and found a graph of its depth RMS error that seemed acceptable. However, later, while looking through the ECE Inventory, we found an even better option, the Intel RealSense LiDAR Camera L515, which also appears to satisfy our accuracy requirements. The depth accuracy of the L515 is shown below. On the list we also found a Raspberry Pi, so we filled out the request form and notified our TA.

 

I looked into how to use pre-existing APIs to calculate the azimuth/altitude of the sun, for use in calculating the sun’s area of effect, but my approach was inefficient; I should have guessed there would be a Python library for this. At first I tried making GET requests to suncalc.org, but the values we needed weren’t directly in the HTML. When I looked up suncalc APIs online, I initially only found the JavaScript API, so I spent some time figuring out how to run JavaScript from Python (using the js2py library), but it kept giving errors about being unable to link the node modules the suncalc API requires. In the end, I found a Python library that provides these values based on time and location, and used another library called geopy to find the latitude/longitude of an address. The short code is below. I haven’t put it on the GitHub repository for now, as it’s only a very preliminary venture into our project and it’s very simple.
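For reference, the kind of calculation those libraries perform can be sketched with nothing but the standard library, using a common low-precision approximation of the solar position (accurate to roughly a degree, which is plenty for an area-of-effect estimate). The function below is an illustrative stand-in, not our actual snippet, and the geopy geocoding step is omitted since it requires network access:

```python
import math
from datetime import datetime, timezone

def sun_position(lat_deg, lon_deg, when_utc):
    """Return (azimuth, altitude) of the sun in degrees for a UTC datetime.

    Azimuth is measured clockwise from north. Uses a standard
    low-precision approximation (good to about one degree).
    """
    # Days (including fraction) since the J2000 epoch.
    j2000 = datetime(2000, 1, 1, 12, tzinfo=timezone.utc)
    n = (when_utc - j2000).total_seconds() / 86400.0

    # Mean longitude and mean anomaly of the sun (degrees).
    L = (280.460 + 0.9856474 * n) % 360.0
    g = math.radians((357.528 + 0.9856003 * n) % 360.0)

    # Ecliptic longitude of the sun and obliquity of the ecliptic.
    lam = math.radians(L + 1.915 * math.sin(g) + 0.020 * math.sin(2 * g))
    eps = math.radians(23.439 - 0.0000004 * n)

    # Right ascension and declination.
    ra = math.atan2(math.cos(eps) * math.sin(lam), math.cos(lam))
    dec = math.asin(math.sin(eps) * math.sin(lam))

    # Greenwich mean sidereal time (degrees), then the local hour angle.
    gmst = (280.46061837 + 360.98564736629 * n) % 360.0
    ha = math.radians((gmst + lon_deg - math.degrees(ra)) % 360.0)

    # Convert (hour angle, declination) to horizontal coordinates.
    lat = math.radians(lat_deg)
    alt = math.asin(math.sin(lat) * math.sin(dec) +
                    math.cos(lat) * math.cos(dec) * math.cos(ha))
    az = math.atan2(-math.sin(ha),
                    math.cos(lat) * math.tan(dec) - math.sin(lat) * math.cos(ha))
    return math.degrees(az) % 360.0, math.degrees(alt)
```

As a sanity check, at the equator around the March equinox the midday sun should be nearly overhead (altitude close to 90°) and well below the horizon at midnight.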

Our progress is roughly on schedule.

For next week, I plan to work with Dianne on the sun-position API and to start implementing our area-of-effect API. I will also work on the design review/report and presentation, as my group members and I have decided that I will be in charge of the next presentation. In addition, I will help my group members make more in-depth schematic diagrams for our project.

Team Status Report for 2/11

Risks and Risk Management

The most significant risk to our project is being unable to obtain or set up the LiDAR/camera device in time, as not many other steps can be performed in parallel. We are managing this risk by trying to reserve the device as soon as possible, so that the rest of the project can continue smoothly. If we don’t get the reservation, we are ready to order the device, as it is still within our budget.

System Design Changes

We more precisely defined the scope of the project. We decided that for our MVP, we won’t account for weather (cloudy/rainy days), reflected/artificial light, or objects blocking the window. We want our project to work in a room of up to 300 ft², but that room must be rectangular and no more than 30 feet long. We want our design to specifically block light from hitting the person’s face. Our project targets a room on a second floor (roughly 10 ft above ground). It will work with at most three people in the room (it would be difficult to test with more). Our design will work as long as nothing in the room moves except people (so our project will also not work with pets). It assumes the window is closed, since windy weather could interfere with the blinds’ function. These changes were necessary to match the scope we had in mind; although they make our project less flexible, they make it more feasible. The schedule that was made previously will stay the same.
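These MVP constraints can be captured as a small validity check, handy when deciding whether a candidate test room qualifies. The function name and structure below are our own illustration, not part of the design:

```python
# Hypothetical helper encoding the MVP scope constraints: rectangular room,
# area at most 300 sq ft, longest side at most 30 ft, at most three
# occupants, and the window closed (wind would interfere with the blinds).
def room_in_scope(length_ft, width_ft, occupants, window_closed=True):
    """Return True if a rectangular room fits the MVP assumptions."""
    if not window_closed:      # open window: wind could move the blinds
        return False
    if occupants > 3:          # hard to test with more than three people
        return False
    if max(length_ft, width_ft) > 30:
        return False
    return length_ft * width_ft <= 300
```

For example, a 20 ft × 15 ft room with two people is in scope, while a 35 ft long room or a room with four people is not.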

Ethical Considerations

Our project includes considerations for personal health. It will help protect people’s eyes by blocking harmful sun rays, and it should also benefit mental health, as research suggests natural light improves mood.

Experiment

The bag experiment is described in Jeff and Dianne’s status reports. It was a fun group meeting that helped us better determine where the area of light hits.