This week, Mae, Meghana, and I all worked on completing the Project Proposal Slides by solidifying our scope and laying out our requirements, tools, challenges, solution, and timeline. I rehearsed the presentation so that I could deliver it smoothly. I also worked with Mae to walk through our current software pipeline, including how to use the machine learning model on pre-labeled bottle images to find the distance and angle between the robot's current orientation and the target bottle. We also thought through some of the concerns brought up by the professor, especially the issue of non-bottle items appearing in the robot's area and how the robot should handle them.
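As a rough sketch of the kind of geometry involved (not our finished pipeline), the distance to a bottle could come from the depth reading at its bounding box, and the angle from how far the box center sits from the image center; the image width and field of view below are placeholder values, not our camera's actual specs:

```python
import math

def bottle_distance_and_angle(bbox, depth_m, image_width=640, hfov_deg=87.0):
    """Rough distance/bearing to a detected bottle (illustrative only).

    bbox: (x_min, y_min, x_max, y_max) in pixels from the detector.
    depth_m: depth (meters) sampled at the bottle, e.g. from the RealSense.
    image_width / hfov_deg are placeholder values, not our camera's specs.
    """
    x_min, _, x_max, _ = bbox
    x_center = (x_min + x_max) / 2.0
    # Pinhole model: focal length in pixels from the horizontal field of view.
    fx = (image_width / 2.0) / math.tan(math.radians(hfov_deg) / 2.0)
    # Bearing of the bottle relative to the camera's forward axis.
    angle_deg = math.degrees(math.atan2(x_center - image_width / 2.0, fx))
    return depth_m, angle_deg

# Example: a box centered slightly right of frame, bottle roughly 1.2 m away
print(bottle_distance_and_angle((300, 100, 380, 300), 1.2))
```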

Since I placed orders for and picked up the RealSense and switched our Nano to the Jetson Xavier, I started looking into how to use the RealSense SDK. After reading up on how to use the camera, I researched how to install and use the SDK locally in order to get LiDAR depth points from a live camera feed. But since there isn't a prebuilt installer for every platform, I had to find a way to build the SDK from source on my computer. Currently, I'm still working on configuring the SDK on my M1 Mac, which has been giving me issues since the officially supported macOS version for the SDK is outdated and there are logged issues between Apple Silicon and RealSense. I walked through a tutorial on building the SDK on Apple Silicon and am in the process of debugging errors that came up.
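Once the SDK builds, the Python wrapper (pyrealsense2) is what I expect to use to pull depth frames. Here's a minimal sketch of reading a live depth value, assuming the wrapper installs cleanly; the stream resolution and the sample pixel are arbitrary choices:

```python
import pyrealsense2 as rs

# Start a depth stream from the RealSense (resolution/fps are arbitrary picks).
pipeline = rs.pipeline()
config = rs.config()
config.enable_stream(rs.stream.depth, 640, 480, rs.format.z16, 30)
pipeline.start(config)

try:
    frames = pipeline.wait_for_frames()    # block until a frameset arrives
    depth_frame = frames.get_depth_frame()
    if depth_frame:
        # Depth in meters at the center pixel of the image.
        print(depth_frame.get_distance(320, 240))
finally:
    pipeline.stop()
```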

I am slightly behind schedule due to this unexpected software incompatibility. Next week, I plan to see if I can use my Windows machine for local development, figure out how to get a camera feed from the RealSense, and begin experimenting with extracting data from the depth points given the coordinates of a bounding box, which our model will be drawing.
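For that bounding-box experiment, my current thinking is to crop the depth image to the box the model draws and take a robust statistic like the median. A sketch of that idea, assuming the depth frame has already been converted to a NumPy array and the depth scale queried from the device (the function and parameter names here are placeholders, not settled code):

```python
import numpy as np

def median_depth_in_bbox(depth_image, bbox, depth_scale):
    """Median depth (meters) inside a detector bounding box.

    depth_image: HxW uint16 array, e.g. np.asanyarray(depth_frame.get_data()).
    bbox: (x_min, y_min, x_max, y_max) pixel coordinates from the model.
    depth_scale: meters per depth unit, from the device's depth sensor
                 (first_depth_sensor().get_depth_scale() in pyrealsense2).
    """
    x_min, y_min, x_max, y_max = (int(v) for v in bbox)
    crop = depth_image[y_min:y_max, x_min:x_max]
    valid = crop[crop > 0]            # zero pixels mean no depth reading
    if valid.size == 0:
        return None                   # nothing usable inside the box
    return float(np.median(valid)) * depth_scale
```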

