Erin’s Status Report for 04/27

This past week I have been working on the final steps of integration for the dirt detection subsystem. This included both shortening the interval between BLE messages, and fixing the way the data was represented on the AR application once the dirt detection data was successfully transmitted from the Jetson to the iPhone.
The BLE transmission was initially incredibly slow, but last week I managed to increase the transmission rate significantly. By this time last week, we had messages sending roughly every 5.5 seconds, and I had conducted a series of tests to determine the statistical average of this delay. Our group knew, however, that the capability of BLE transmission far exceeded the results we were getting. This week, I timed the individual components within the BLE script. The script that runs on the Jetson breaks down into two parts: 1) the dirt detection component, and 2) the actual message serialization and transmission. The dirt detection component was tricky to integrate into the BLE script because each script relies on a different Python version. Since the dependencies for these scripts did not match (and I was unable to resolve the dependency issues after two weeks of research and testing), I had resorted to running one script as a subprocess within the other.

After timing the subcomponents within the overall script, I found that dirt detection was the component causing the longest delay; sending the data over BLE to the iPhone, by contrast, took just over a millisecond. I continued by timing each separate component within the dirt detection script. At first glance there was no issue, since the script ran quickly when started from the command line, but the delay in opening the camera was what made the script run so slowly when invoked repeatedly. I tried to mitigate this by returning an object from the script to the outer process calling it, but this did not make sense: the subprocess output could only be read as serial data, and the mismatched dependencies meant the outer script could not have handled an object of that type anyway. Harshul came up with an incredibly clever solution—he proposed using the command line to pipe in an argument instead.
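For illustration, the per-stage timing can be gathered with a small helper like the one below. This is a minimal sketch — the label and the work being timed are placeholders, not our actual pipeline stages:

```python
import time

# Minimal timing helper of the kind used to find the slow stage.
# The label and the function passed in below are placeholders,
# not the real dirt-detection or BLE pipeline stages.
def timed(label, fn, *args):
    """Run fn(*args), report its wall-clock duration, and return both."""
    start = time.perf_counter()
    result = fn(*args)
    elapsed_ms = (time.perf_counter() - start) * 1000
    print(f"{label}: {elapsed_ms:.2f} ms")
    return result, elapsed_ms
```

Wrapping each stage (camera open, detection, serialization, BLE write) in a call like this is what exposed the camera-open delay as the real bottleneck.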
Since Python's subprocess module lets the parent process keep a pipe open to the child's standard input, we pipe in a newline character each time we want to query another image from the script. This took very little refactoring on my end, and we have now sped the script up enough to send images as fast as we would need. Now the bottleneck is the frame rate of the CSI camera, which is only 30 FPS, but our script can (in theory) handle sending around 250 messages per second.
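A minimal sketch of the piping trick is below. The inline child here is a stand-in for the real dirt detection script (which runs under a different Python interpreter); the point is that the child stays alive with the camera open, and each newline on its stdin triggers one capture:

```python
import subprocess
import sys

# Stand-in for the dirt detection script: it blocks on stdin, and each
# newline it receives triggers one "capture" (a dummy classification is
# printed here in place of a real camera read).
CHILD = (
    "import sys\n"
    "for line in sys.stdin:\n"
    "    print('clean', flush=True)\n"
)

# Launch the child once; the expensive setup (camera open, in the real
# script) happens a single time instead of on every query.
proc = subprocess.Popen(
    [sys.executable, "-c", CHILD],
    stdin=subprocess.PIPE,
    stdout=subprocess.PIPE,
    text=True,
)

def query_frame():
    """Send one newline trigger and read back one result line."""
    proc.stdin.write("\n")   # newline = "capture an image now"
    proc.stdin.flush()
    return proc.stdout.readline().strip()
```

In the real system the child prints its dirt/clean verdict as a line of serial output, which sidesteps the dependency mismatch entirely.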
Something else I worked on this past week was getting the dirt detection data to render successfully on the user's side. Nathalie created a basic list data structure which stores timestamps along with world coordinates. I added logic which sequentially iterates through this list, checks whether the Jetson's timestamp matches the timestamp stored in the list, and then displays the respective color on the screen depending on what the Jetson communicated to the iPhone. This iterative search is also destructive: elements are popped off the front of the list, as in a queue. Because both the timestamps in the queue and the timestamps received from the Jetson are monotonically increasing, we never have to worry about matching a timestamp against something from the past. In the state I left the system in, we were able to draw small segments based on the Jetson's data, but Harshul and I are still working together to make sure that the area displayed in the AR application correctly reflects the camera's view. As a group, we conducted experiments to find the correct transformation matrix for this situation, and it now needs to be integrated. Harshul has already written some logic for this, and I simply need to tie it into the algorithm I have been using. I do not expect this to take very long.
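The matching pass can be sketched as follows. The real logic lives in the Swift AR app; this is a Python sketch with made-up timestamps, coordinates, and a hypothetical matching tolerance:

```python
from collections import deque

# Hypothetical sample data: (timestamp, world coordinate) anchors recorded
# by the AR app, in increasing timestamp order.
anchors = deque([
    (1.0, (0.0, 0.0)),
    (1.1, (0.1, 0.0)),
    (1.2, (0.2, 0.0)),
])

def match_anchor(jetson_ts, tol=0.05):
    """Destructively search the queue for an anchor matching jetson_ts.

    Because both the queue and the Jetson's timestamps increase
    monotonically, anchors older than the incoming timestamp can be
    popped and discarded for good.
    """
    while anchors:
        ts, coord = anchors.popleft()
        if abs(ts - jetson_ts) <= tol:
            return coord
        if ts > jetson_ts + tol:
            # Anchor is still in the future for this message; keep it.
            anchors.appendleft((ts, coord))
            return None
        # Otherwise the anchor is stale and stays discarded.
    return None
```

The destructive pop keeps the search linear in the number of new anchors rather than rescanning the whole history on every message.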
I have also updated the timestamps on the Jetson's side and the iPhone side to be interpreted as the data type Double. This gives us much finer timing granularity and lets us send a higher volume of data. I have increased the rate to 10 messages per second, which is an incredible improvement over one message every 5.5 seconds from before. If we wish to increase the rate further, the refactoring process would be very short. Again, the bottleneck is now the camera's frame rate, rather than any of the software/scripts we are running.
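As a rough sketch, a Double timestamp plus a dirty/clean flag serializes into a compact fixed-size payload. The layout below (an 8-byte little-endian IEEE-754 double followed by a one-byte flag) is an assumption for illustration — the actual BLE characteristic format on our system may differ:

```python
import struct

# Assumed message layout: 8-byte little-endian double (timestamp)
# followed by a 1-byte boolean (dirty flag). Illustrative only.
def pack_message(ts: float, dirty: bool) -> bytes:
    return struct.pack("<d?", ts, dirty)

def unpack_message(payload: bytes):
    ts, dirty = struct.unpack("<d?", payload)
    return ts, dirty
```

A 9-byte payload like this fits comfortably in a single BLE write, which is consistent with the transmission itself taking only about a millisecond.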
Earlier in the week, I spent some time with Nathalie mounting the camera and the Jetson. I mounted the camera to the vacuum at the angle which I tested for, and Nathalie helped secure the rest of the system. Harshul designed and cut out the actual hardware components, and Nathalie and I worked together to mount the Jetson’s system to the vacuum. Harshul handled the image tracking side of things. We now only need to mount the active illumination in the following week.
One consideration I had thought of in order to reduce network traffic was to simply not send any Bluetooth message from the Jetson to the iPhone in the case that something was clean. This seems like a good idea at first, but it was something I ended up scrapping. Consider a location that was initially flagged as dirty. If a user runs the vacuum back over that same location and cleans it, they should see that the floor is now clean. With clean messages suppressed, the user would never know whether their floor had been cleaned properly.
As for next steps, we have just a little more to do for mounting, and the bulk of the next week will presumably be documentation and testing. I do not think we have any large blockers left, and I feel like our group is in much better shape for the final demo.