Heather’s Status Update for 10/17

A big risk to the project was identified this week during the design presentation. We realized we had spent a lot of time solidifying the peripheral components, and not enough time on the communication between the Jetson Nano and a computer. So this week, I spent a lot of time researching this topic and came up with what I hope is an implementable solution. I should be able to modify the Linux kernel for the Jetson Nano using the NVIDIA SDK Manager. The change uses the Linux USB gadget driver and would let the Jetson Nano be recognized as a USB Video Class (UVC) device, the same class a webcam uses. If I can get this solution working, I think it will be hands down the best one, because all current personal computers should already have UVC drivers installed and would automatically detect the Jetson as a camera. I found a website with a couple of pages on how to use the audio gadget driver and the video gadget driver (https://developer.ridgerun.com/wiki/index.php/How_to_use_the_audio_gadget_driver and https://developer.ridgerun.com/wiki/index.php?title=How_to_use_the_UVC_gadget_driver_in_Linux). I think the connection type to use between the Jetson and my computer is USB-A to USB-A.

Currently, I'm working on getting this solution implemented. Right now, I'm trying to install the SDK Manager. I keep running into issues with different libraries not being installed correctly, but hopefully I'll have the SDK Manager installed and running today, and an image for the Jetson Nano with UVC included by tomorrow. If this solution doesn't work, we have a couple of other options. The first is to establish data transfer between the Jetson Nano and a computer, most likely with a USB-to-TTY cable, and then write software on the computer to convert the data stream into video. The second would be to slap a webcam right next to the Raspberry Pi. I'm not too happy with either of those backup options, so I'm working hard to get the UVC approach working and into the design report.
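If the UVC gadget works, the computer side shouldn't need anything special: the Jetson should show up like any other webcam. Here's a minimal sketch of what I'd expect the host side to look like with OpenCV (the device index 0 is an assumption; it depends on what other cameras are plugged in):

```python
# Minimal host-side check, assuming the Jetson enumerates as a regular UVC webcam.
# Device index 0 is a guess; it depends on what other cameras are attached.
import cv2

cap = cv2.VideoCapture(0)  # open the Jetson as if it were a plain webcam
if not cap.isOpened():
    raise RuntimeError("Could not open the UVC device; check the device index")

while True:
    ok, frame = cap.read()  # grab one frame from the UVC stream
    if not ok:
        break
    cv2.imshow("Jetson over UVC", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):  # quit on 'q'
        break

cap.release()
cv2.destroyAllWindows()
```

The whole point of the UVC approach is that this same code would work with any webcam, so nothing on the computer has to know it's talking to a Jetson.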

I didn't spend time on the prototype this week like I had planned, because figuring out the communication between the Jetson Nano and my computer took most of my time. I did get the chance to figure out how to get multiple cameras set up and running. After some research, Eddy and I found that all we needed to do was specify sensor-id = 0 or 1 in the gst-launch pipeline. Then, using threading, we can run computer vision on both camera feeds simultaneously.
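Here's a rough sketch of what that two-camera threading setup could look like with OpenCV's GStreamer backend (the resolution, framerate, and the process_frame placeholder are assumptions, not our final settings):

```python
# Rough sketch: one capture thread per CSI camera, selected by sensor-id.
# Resolution, framerate, and process_frame() are placeholders.
import threading
import cv2

def csi_pipeline(sensor_id, width=1280, height=720, fps=30):
    # gst-launch-style pipeline for nvarguscamerasrc, converted to BGR for OpenCV
    return (
        f"nvarguscamerasrc sensor-id={sensor_id} ! "
        f"video/x-raw(memory:NVMM), width={width}, height={height}, framerate={fps}/1 ! "
        "nvvidconv ! video/x-raw, format=BGRx ! "
        "videoconvert ! video/x-raw, format=BGR ! appsink"
    )

def process_frame(sensor_id, frame):
    # Placeholder for the computer vision work on each feed
    pass

def camera_worker(sensor_id):
    cap = cv2.VideoCapture(csi_pipeline(sensor_id), cv2.CAP_GSTREAMER)
    if not cap.isOpened():
        raise RuntimeError(f"Could not open CSI camera {sensor_id}")
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        process_frame(sensor_id, frame)
    cap.release()

# One thread per camera so both feeds are processed at the same time
threads = [threading.Thread(target=camera_worker, args=(i,), daemon=True) for i in (0, 1)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```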

I also spent the early part of this week working on the design presentation, and the latter half working on the design report.

Goals for this upcoming week: finish up the report and get the communication working. If I have time after that, I'll return to working on the prototype and figure out how to get/set the stepper motor angle and initial angle position.
