Team Status Report 4/27

What are the most significant risks that could jeopardize the success of the project? How are these risks being managed? What contingency plans are ready?

For this week, we got all the components fully integrated and running for our project. We were able to get 6 simultaneous streams of 240p video running at 10fps. Everything is working as expected and there is no work left to do.

The following is a list of unit tests that we have run:

  1. FPGA JPEG decoding
  2. DRAM controller timings
  3. HDMI driver consistency
  4. OctoSPI peripheral functional and data integrity tests
  5. Full FPGA end-to-end pipeline testing
  6. ESP32 Wi-Fi transmission range test
  7. ESP32 and FPGA subsystem power consumption tests
  8. ESP32 frame transmission interval consistency test
  9. ESP32 to FPGA frame interval tests
  10. ESP32 motion detection test
  11. Full end-to-end system test

Michael’s Status Report 4/27

What did you personally accomplish this week on the project? 

For this week, I wrote a system to address the streams that are sent to the FPGA. This is needed for the FPGA to identify where it should draw a picture when it gets a frame from the central ESP. With the addressing scheme in place, we are now able to run 6 streams at once and have them all show up in the right places.
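A minimal sketch of what such an addressing scheme could look like (the grid layout, names, and row-major ordering are my assumptions for illustration, not the actual firmware): a 1280x720 frame holds a 4x3 grid of 320x240 tiles, and each stream ID maps to one tile origin.

```python
# Hypothetical stream-to-tile addressing sketch: 12 possible 240p tiles
# inside a 720p frame, indexed row-major. Not the firmware's real code.
TILE_W, TILE_H = 320, 240
GRID_COLS = 1280 // TILE_W  # 4 tiles across
GRID_ROWS = 720 // TILE_H   # 3 tiles down

def tile_origin(stream_id):
    """Return the (x, y) pixel origin of the tile assigned to a stream."""
    if not 0 <= stream_id < GRID_COLS * GRID_ROWS:
        raise ValueError("stream id out of range")
    col = stream_id % GRID_COLS
    row = stream_id // GRID_COLS
    return col * TILE_W, row * TILE_H
```

With this mapping, for example, stream 5 lands at pixel origin (320, 240), the second tile of the second row.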

In addition, I added some pacing code to the central ESP. We need to pace the frames at around a 20-25ms interval since the FPGA only has one instance of the decoder and it runs sequentially; it can therefore only accept a new image to draw every 20-25ms. Since it is much easier for the ESP to buffer this data, we decided to put the pacing and buffering code on the ESP side.
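The buffering-plus-pacing idea can be sketched as follows (a simplification with assumed names, not the actual ESP code): incoming frames are queued, and a frame is released to the FPGA only once the pacing interval has elapsed since the last one.

```python
# Frame-pacing sketch: release at most one buffered frame per pacing
# interval, since the FPGA's single JPEG decoder runs sequentially.
# Names and structure are illustrative, not the real firmware.
import collections

PACING_INTERVAL = 0.020  # 20 ms between frames handed to the decoder

class FramePacer:
    def __init__(self, interval=PACING_INTERVAL):
        self.interval = interval
        self.queue = collections.deque()
        self.last_sent = 0.0

    def push(self, frame):
        """Buffer a frame as it arrives from a camera node."""
        self.queue.append(frame)

    def pop_if_ready(self, now):
        """Return the next frame only if the pacing interval has elapsed."""
        if self.queue and now - self.last_sent >= self.interval:
            self.last_sent = now
            return self.queue.popleft()
        return None
```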

The last loose end that needed to be cleaned up was blacking out the unused picture locations. At 720p, we have divided the frame into 12 individual 240p streams. Since we are not using half of them, we need the ESP to send a black frame to those locations on initialization to make the system look clean. Without the black frames, the unused locations just display random colors, which looks bad.

Finally, I also integrated the whole system with Varun. We were able to get all 6 streams working simultaneously. The video below is of the system working with all 6 streams. Note that the code to black out the unused locations is not active in the video. 

https://drive.google.com/file/d/1J4ZgzfkFmw4zAhaMK7OQOmpwzOf-8wDz/view?usp=sharing

Output with Unused Locations Blacked Out

Michael’s Status Report 4/20/2024

As you’ve designed, implemented and debugged your project, what new tools or new knowledge did you find it necessary to learn to be able to accomplish these tasks? What learning strategies did you use to acquire this new knowledge?

I had to learn the specifics of ESP32 development. The toolchain and configuration methods were all new to me. Along with that, I also had to learn the intricacies of how Wi-Fi works on the ESP32s and how to debug them when the performance wasn't where we wanted it. I mostly learned by reading the Espressif programming manual and the provided example code. It also helped that I could draw on my previous class work and experience with microcontrollers.

What did you personally accomplish this week on the project? 

For this week, I used a multimeter to check the power consumption of the ESP32 to verify the claimed values on its data sheet. I measured a maximum power draw of 1.5W for the remote camera node. This was with the camera continuously transmitting imagery, which is an absolute worst-case situation. The average power draw was closer to 1W, which is in line with our initial assumptions based on the manufacturer's datasheet. The power consumption of the central node was also measured, and it was slightly lower at only about 0.8W maximum.
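These draw figures translate directly into battery life. The arithmetic below uses a purely hypothetical pack capacity for illustration; only the 1.5W worst-case and ~1W average numbers come from the measurements.

```python
# Back-of-the-envelope runtime estimate. HYPOTHETICAL_PACK_WH is an
# assumed example capacity, not a component we have selected.
def runtime_hours(battery_wh, draw_w):
    return battery_wh / draw_w

HYPOTHETICAL_PACK_WH = 10.0  # e.g. roughly a 2700 mAh cell at 3.7 V

worst_case = runtime_hours(HYPOTHETICAL_PACK_WH, 1.5)  # continuous streaming
typical = runtime_hours(HYPOTHETICAL_PACK_WH, 1.0)     # measured average draw
```

Under those assumptions the remote node would run roughly 6.7 hours while continuously streaming and about 10 hours at the average draw.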

I got the change detector running on the ESP32. The change detector is implemented as a separate task from the main sending task so it doesn't block it. Since the ESP32 has two cores, these two tasks can run simultaneously for maximum performance. The change detector works by computing the number of pixels that have changed significantly; if that number is above a threshold, change is assumed to have happened and video streaming will commence.
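The detection rule can be sketched as below. The two thresholds are illustrative placeholders, not the values tuned in the firmware, and the frames are assumed to be raw luminance bytes of equal length.

```python
# Pixel-count change detector sketch: a pixel counts as "changed" if its
# luminance moved by more than PIXEL_DELTA since the previous frame, and
# motion is flagged once enough pixels changed. Thresholds are assumed.
PIXEL_DELTA = 25        # per-pixel luminance difference threshold (assumed)
PIXEL_COUNT_MIN = 500   # changed-pixel count that triggers streaming (assumed)

def motion_detected(prev, curr):
    """Compare two equal-length luminance frames (bytes) for motion."""
    changed = sum(1 for a, b in zip(prev, curr) if abs(a - b) > PIXEL_DELTA)
    return changed >= PIXEL_COUNT_MIN
```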

I also got all 6 camera nodes sending to the central node at the same time. I checked the SPI output with a logic analyzer to verify the data and the frame spacing. Even with 6 nodes, the system is able to reliably deliver frames at 100ms intervals with only slight variations. The SPI clock rate was also bumped up to 20MHz to account for the time needed to decode the image on the FPGA side.

Finally, I also completed a range test for the remote camera nodes sending to the central node. The current system with omnidirectional antennas is able to hit the targeted 50m while maintaining a stable 10fps stream. The range test was done in Schenley Park to simulate the system being used in an outdoor environment. For the test, the remote node was mounted on a small tree and I walked slowly away with the central node, stopping when the video stream stopped working consistently.
 

Range Test Distance
Images from Range Test

 

Is your progress on schedule or behind? If you are behind, what actions will be taken to catch up to the project schedule?

Currently on schedule

 

What deliverables do you hope to complete in the next week?

For next week, I hope to complete final integration of the new stream addressing scheme with Varun's FPGA.

Michael’s Status Report 4/6

What did you personally accomplish this week on the project? 

 

For this week, I wrote a Python script that simulates the central node's function so I could test the Wi-Fi link for stability. The script listens for an incoming packet, performs the needed decompression steps, and then displays it as a live video feed. The script also allowed me to explore the limits of the current Wi-Fi system by adjusting the stream quality to see at what level we start seeing dropped frames. I found that the Wi-Fi is robust enough to support the encoder running at nearly the highest quality without significant frame drops.
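A stripped-down sketch of that receive loop is below. The port number and function names are assumptions, and the real script's decompression and display steps (which would use an image library such as OpenCV) are reduced to a placeholder comment here.

```python
# Minimal central-node simulator sketch: receive one JPEG frame per UDP
# datagram and sanity-check it before decoding. Port and structure are
# assumptions about the real script, not a copy of it.
import socket

PORT = 5005  # hypothetical listening port

def is_complete_jpeg(buf):
    """Cheap sanity check before decoding: JPEG SOI and EOI markers."""
    return len(buf) >= 4 and buf[:2] == b"\xff\xd8" and buf[-2:] == b"\xff\xd9"

def listen_loop():
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", PORT))
    while True:
        frame, _addr = sock.recvfrom(65535)  # one frame per datagram
        if is_complete_jpeg(frame):
            pass  # decode and display here, e.g. cv2.imdecode + cv2.imshow
```

Calling `listen_loop()` would then block and service incoming frames from the ESP32.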

In addition, we also switched the antenna jumpers on the ESP over to the external IPEX connector so that I can use an external patch antenna for better transmission quality. Previously, I was just using the built-in PCB antenna, which is suboptimal since it is so small and limited by the PCB area. With this new external antenna, I was able to get about 30 meters of range while maintaining a good video feed. The test was performed indoors in a noisy RF environment and in a non-line-of-sight situation, with two brick walls between the receiver and the transmitter. I'd expect this number to increase significantly when I run this test outdoors in a line-of-sight environment. The current patch antenna is a 3dBi omnidirectional one; this can always be switched out for an antenna with higher gain if we need better performance.

The final thing I was able to get done this week was verifying that I could connect to the receiving ESP32 and have it read valid data. A few things had to be fixed on the receiving side, mainly enabling IPv4 fragment reconstruction. Fragment reconstruction is needed since the packet size exceeds the Wi-Fi MTU of around 1500 bytes.
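The rough arithmetic behind that requirement (using an assumed example frame size, not a measured one): a JPEG frame sent as a single UDP datagram gets split into IP fragments carrying at most about 1480 payload bytes each on a 1500-byte MTU link, and every fragment must arrive for the frame to be reconstructed.

```python
# Fragment-count estimate for a frame sent as one UDP datagram over a
# 1500-byte MTU link. The 20 kB frame size below is an assumed example.
import math

MTU = 1500
IP_HEADER = 20
FRAG_PAYLOAD = MTU - IP_HEADER  # ~1480 bytes of IP payload per fragment

def fragment_count(datagram_bytes):
    return math.ceil(datagram_bytes / FRAG_PAYLOAD)

frags_for_20kb_frame = fragment_count(20_000)  # 14 fragments, all required
```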

Link to video of streaming python script: https://drive.google.com/file/d/1OHjypw3lSpivNJLjFea4oVtCaHKOl_wL/view?usp=sharing
 

Is your progress on schedule or behind? If you are behind, what actions will be taken to catch up to the project schedule?

 

Currently on schedule

 

What deliverables do you hope to complete in the next week?

 

For next week, I hope to run the range test outdoors to verify my assumptions.

Michael’s Status Report 3/30

What did you personally accomplish this week on the project? 

 

For this week, I got the performance of the JPEG pipeline working at a sufficient level to enable 10fps streaming and verified it using Wireshark. Since integration has not yet happened, I used my phone's hotspot to act as an access point and had the ESP32 join it, along with my computer running Wireshark. The ESP32 was then commanded to send data to my computer via the phone's access point, and the arrival time of each packet was noted for measurement purposes. Since the maximum transmission unit (MTU) is 1500 bytes, packets must be fragmented; to measure the frame interval correctly, those packet fragments must be discarded. The picture below shows the measurement result: we are able to hit a frame interval of about 100ms, which corresponds to 10fps.
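The measurement itself reduces to simple arithmetic over the capture. Assuming a list of arrival timestamps (in seconds) for the final packet of each frame, as exported from Wireshark:

```python
# Mean inter-frame interval from per-frame arrival timestamps; ~0.100 s
# between frames corresponds to 10 fps (fps = 1 / interval).
def mean_frame_interval(timestamps):
    """Average gap between consecutive frame-completion timestamps."""
    gaps = [b - a for a, b in zip(timestamps, timestamps[1:])]
    return sum(gaps) / len(gaps)
```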

Is your progress on schedule or behind? If you are behind, what actions will be taken to catch up to the project schedule?

Currently on schedule

 

What deliverables do you hope to complete in the next week?

For next week, I will begin integration with Neelansh for the receiving node. The remote node is now functionally complete; the only missing piece is motion detection using the luminance channel data.

Team Status Report For 3/23

What are the most significant risks that could jeopardize the success of the project? How are these risks being managed? What contingency plans are ready?

The most significant risk right now is the encoder side. As mentioned in Michael's status report, there are severe constraints on what we can do on the encoder side due to the need to use the PSRAM module on the ESP32. The PSRAM module's 40 MB/s bandwidth is a hard limit that we cannot easily work around. Being one of the first stages of the pipeline means that any changes on the encoder side will trickle down and cause issues for the central receiving node and the FPGA decoder. The current contingency plan, should this PSRAM issue materialize, is to set the encoder to a lower quality, which will minimize the amount of data that needs to be processed by the LWIP and Wi-Fi stacks. The reduced workload will in turn alleviate the pressure on the PSRAM module.

 

Were any changes made to the existing design of the system (requirements, block diagram, system spec, etc)? Why was this change necessary, what costs does the change incur, and how will these costs be mitigated going forward?

No changes

Michael’s Status Report for 3/23

What did you personally accomplish this week on the project? 

 

For this week, I finally got the compression code ported over to the ESP32. This means that we now have all the steps needed on the remote camera node coded up and working. The remote camera node is now able to take a picture, compress it, and then send it over Wi-Fi to a receiver. Porting this over was significantly harder than I thought, since we kept running into issues with the camera driver co-existing with the Wi-Fi code. In integrating these two, the Wi-Fi sending code would stop transmitting at random intervals, which would then incur a packet fragment time-to-live exceedance error on the receiver. After a lot of debugging and configuration changes, I was able to solve the issue by making some changes on the encoder side and pinning the Wi-Fi task to core 1, which leaves core 0 free to handle the camera. However, performance is still on the lower side since we are now limited by the bandwidth to the PSRAM. The PSRAM lives on a QSPI bus that runs at 80MHz, so we are limited to a maximum of 40 MB/s of memory bandwidth with unknown latency. The internal data RAM is only 320KB in size, so it is not an option for storing a complete frame buffer. Keep in mind that this PSRAM is shared between LWIP, the camera, Wi-Fi, and compression.
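The 40 MB/s figure follows directly from the bus parameters: QSPI moves 4 bits per clock edge at 80MHz, so the peak transfer rate is 80M clocks x 4 bits / 8 bits-per-byte.

```python
# Peak PSRAM bandwidth over QSPI: 4 data lines at 80 MHz, before any
# refresh or bus-contention overhead.
QSPI_CLOCK_HZ = 80_000_000
QSPI_DATA_LINES = 4

bytes_per_second = QSPI_CLOCK_HZ * QSPI_DATA_LINES // 8
mb_per_second = bytes_per_second / 1_000_000  # 40.0 MB/s peak
```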

 

Image captured, encoded, and then transmitted from ESP32

 

Is your progress on schedule or behind? If you are behind, what actions will be taken to catch up to the project schedule?

 

Currently on schedule

What deliverables do you hope to complete in the next week?

 

For next week, I hope to begin integration with the FPGA. This will mostly entail me providing a bunch of test data to Varun, which he will then run through the FPGA to make sure we agree on the data format and algorithm stages.

Team Status Report 3/16/2024

What are the most significant risks that could jeopardize the success of the project? How are these risks being managed? What contingency plans are ready?

Wi-Fi was previously our greatest unknown. None of us had ever programmed a microcontroller with Wi-Fi capabilities, and we didn't really understand what the ESP's Wi-Fi capabilities were. We have now proven that the Wi-Fi capabilities are adequate for our needs and won't be a risk factor going forward. The greatest remaining risk is the JPEG runtime. All of the JPEG code written so far runs on a laptop, not a microcontroller. Even though the runtime on the laptop is an order of magnitude faster than the needed 100ms, even accounting for all the setup code, it still doesn't give us concrete data on whether the ESP can run JPEG at the speed we need. Michael is currently making the final changes to port the code over, so the actuality of this risk will soon be known.

 

Were any changes made to the existing design of the system (requirements, block diagram, system spec, etc)? Why was this change necessary, what costs does the change incur, and how will these costs be mitigated going forward?

No changes

Provide an updated schedule if changes have occurred

No changes

This is also the place to put some photos of your progress or to brag about a component you got working.

I’m super excited to have gotten some baseline work running on the FPGA for running the JPEG decoder. By next week I should be able to display an uncompressed frame on the monitor!

Michael’s Status Report 3/16/2024

What did you personally accomplish this week on the project? 

For this week, I got the Wi-Fi transmission code working on the ESP32. I have tested it and we are able to reach about 17 Mbit/s, which is more than what is needed to stream 240p video, our project's initial requirement. However, this may actually be limited by the iPhone hotspot that I am using as an access point. Since I don't have an actual Wi-Fi access point to test with, I will have to treat 17 Mbit/s as the effective speed of the ESP32. The transmission code took significantly longer than I expected. In the beginning I kept running into issues associated with the iPhone's hotspot: it turns out that you have to be on the hotspot screen for the hotspot to actually be visible when no one is actively connected. After this, I ran into more issues when sending packets at a high rate or sending large packets (>1000 bytes). The ESP would complain that it was running out of memory and return a failure code. After a long debugging session, it turned out that there was a misconfiguration in the ESP SDK and the station power-saving mode was too aggressive. Once these issues were resolved, the ESP was finally able to hit the aforementioned 17 Mbit/s.
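A rough check that 17 Mbit/s is indeed enough headroom (the per-frame size below is an assumed ballpark for a compressed 240p JPEG, not a measured value):

```python
# Bandwidth sufficiency estimate. FRAME_BYTES is an assumed ballpark
# for a compressed 240p JPEG frame, not a measurement.
FRAME_BYTES = 20_000  # assumed compressed 240p frame size
FPS = 10
STREAMS = 6

per_stream_mbit = FRAME_BYTES * 8 * FPS / 1_000_000  # 1.6 Mbit/s per stream
total_mbit = per_stream_mbit * STREAMS               # ~9.6 Mbit/s for all 6
```

Under those assumptions, even all six streams together stay well below the measured 17 Mbit/s link rate.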

Is your progress on schedule or behind? If you are behind, what actions will be taken to catch up to the project schedule?

Currently on schedule

What deliverables do you hope to complete in the next week?

For next week, I hope to finally finish porting the JPEG code over. I stopped working on it this week since the Wi-Fi portion hadn't been explored at all and we were not sure if there would be hidden issues. The Wi-Fi debugging delays ate up any time left over for JPEG porting.

Team Status Report 3/9/2024

Part A: … with consideration of global factors. Global factors are world-wide contexts and factors, rather than only local ones. They do not necessarily represent geographic concerns. Global factors do not need to concern every single person in the entire world. Rather, these factors affect people outside of Pittsburgh, or those who are not in an academic environment, or those who are not technologically savvy, etc. 

 

Being a technology product, EyeSpy can present a challenge for people who are not very tech savvy. However, our design goal of plug-and-play operation should make it easy to use even for non-technical users. EyeSpy is also fully compatible with wireless regulations worldwide, as it uses the internationally standardized 2.4 GHz ISM band. Since EyeSpy doesn't require an electric grid connection to work, varying grid voltages and frequencies aren't a hindering factor.

 

Part B: … with consideration of cultural factors. Cultural factors encompass the set of beliefs, moral values, traditions, language, and laws (or rules of behavior) held in common by a nation, a community, or other defined group of people. 

 

Our product doesn't carry a strong cultural impact and does not conflict with the rules or values of any particular culture. Our camping security system can help people who are performing religious prayers or ceremonies and need awareness of their surroundings, allowing the proceedings to continue in peace. Our product does not break any existing laws, as long as it is used for legitimate purposes and not for anything illegal.

 

 Part C: … with consideration of environmental factors. Environmental factors are concerned with the environment as it relates to living organisms and natural resources.

 

Our product does have some environmental factors. The main concern is how it will interact with wildlife: it could be harmful to animals that try to eat or play with the remote camera nodes, as the nodes contain harmful chemicals and components such as the camera and the battery. The main concern relating to natural resources is the sustainability of sourcing materials for the remote node, chiefly the battery and how long its health will last. Batteries are also expensive to manufacture and carry a significant environmental impact of their own.

 

Were any changes made to the existing design of the system (requirements, block diagram, system spec, etc)? Why was this change necessary, what costs does the change incur, and how will these costs be mitigated going forward? 

 

No major changes have been made to the design of the system yet.

 

Provide an updated schedule if changes have occurred.

 

There are no major updates to our schedule as of yet and no changes have occurred.