This week we worked on selecting and ordering parts to set up a development environment so we can start prototyping our project's components. On the hardware side, drawing on my experience working with a Jetson in another capstone, we opted for a Jetson over a Raspberry Pi to leverage the Nano's CUDA-accelerated computer vision features.

I set up the Xcode + RealityKit SDK and configured my iPhone for development mode so I could flash an AR demo app onto it. I also spent some time attempting to flash the Jetson with an OS image and get it to boot, but unfortunately this wasn't working. After troubleshooting multiple SD cards, different JetPack OS images, and power cables, the final remaining step is to try powering the Jetson directly from a DC power supply with jumper cables. I also tried to boot the Jetson in headless mode over serial, but it did not create a tty or cu entry that I could use to actually access it. While the Jetson was unable to boot, I took the time to test the camera and Wi-Fi adapter peripherals on a separate working Jetson to verify that those components work.

After setting up my Xcode environment, I spent some time researching the build process and the APIs and conventions of the Swift language to better prepare myself for developing in Xcode.
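As a quick sanity check for the serial issue, something like the following sketch can list candidate serial devices on the host Mac (this assumes a macOS host, where outgoing serial connections appear as /dev/cu.* and the usbserial-style suffixes vary by USB-serial adapter chipset; a booted Jetson attached over USB-serial should show up here):

```swift
import Foundation

// List /dev entries that look like serial devices (cu.* / tty.*).
// On macOS, a Jetson attached over a USB-serial adapter typically
// appears as /dev/cu.usbserial-XXXX (exact name varies by chipset).
func serialDeviceCandidates() -> [String] {
    let devDir = "/dev"
    let entries = (try? FileManager.default.contentsOfDirectory(atPath: devDir)) ?? []
    return entries
        .filter { $0.hasPrefix("cu.") || $0.hasPrefix("tty.") }
        .map { "\(devDir)/\($0)" }
        .sorted()
}

for device in serialDeviceCandidates() {
    print(device)
}
```

If no cu.* entry appears after plugging in the adapter, the fault is upstream of the Jetson (cable, adapter driver, or the board never powering on), which matches what we observed.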
Our schedule is on track, but the dirt detection deadlines that depend on the Jetson are approaching soon, so getting the Jetson working will become the highest-priority action item once the design presentation is complete and submitted.
Next week's deliverables involve getting a working Jetson up and running, either by fixing the power solution or by trying a replacement from ECE inventory to rule out a hardware fault; working with RealityKit in Swift to create a demo app that can build a world map; and completing the dirt detection prototype on the Jetson.
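For the world-map demo, a minimal sketch of the approach (assuming an iOS app using RealityKit's ARView with ARKit world tracking; the class and file URL here are illustrative, not from our codebase) would start a world-tracking session and then capture the current ARWorldMap so it can be archived and reloaded later for relocalization:

```swift
import ARKit
import RealityKit

// Minimal sketch: run world tracking, then snapshot the ARWorldMap.
// ARWorldMap conforms to NSSecureCoding, so it can be archived to disk
// and reloaded later via ARWorldTrackingConfiguration.initialWorldMap.
final class WorldMapDemo {
    let arView = ARView(frame: .zero)

    func startTracking() {
        let config = ARWorldTrackingConfiguration()
        config.planeDetection = [.horizontal]  // detect floors and tables
        arView.session.run(config)
    }

    func saveWorldMap(to url: URL) {
        arView.session.getCurrentWorldMap { worldMap, error in
            guard let map = worldMap else {
                // The map is only available once tracking has mapped enough
                // of the space; check ARFrame.worldMappingStatus before saving.
                print("World map not available yet: \(String(describing: error))")
                return
            }
            if let data = try? NSKeyedArchiver.archivedData(withRootObject: map,
                                                           requiringSecureCoding: true) {
                try? data.write(to: url)
            }
        }
    }
}
```

This only runs on a physical device (ARKit is unavailable in the simulator), which is why the iPhone needed development mode enabled.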