Weekly Report #7 – 3/30

Karthik Natarajan:

This week I spent most of my time writing code to interact with the servos and the motion sensor. Specifically, throughout Monday and Wednesday, I worked on getting the servo libraries working so we could move our board. Initially, I thought the problem was with my code, so I kept looking for the error there. However, after examining the signal on the oscilloscope with both Jerry and Nathan, we realized that the pulse we were sending was not what we assumed: instead of a clean 1 ms pulse, we were getting a distorted spike. To fix this, we tied the ground of the Arduino to the ground of the power source so that both shared a common reference. With that fixed, the code swept both servos on the pan-tilt configuration. After a little more modification, I was able to make the servos move only when the sensor detected motion; through this we learned that the motion sensor's output is asserted low. Overall, we should be on schedule for both the demo and the final project deadline. Most of what is left is integration; specifically, I will probably work on code that ties the servo control to the neural network.
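For reference, a minimal sketch of the motion-gated servo code described above might look like the following. The pin assignments here are illustrative, not our actual wiring:

    #include <Servo.h>

    const int PIR_PIN = 2;    // motion sensor output (asserted low)
    const int PAN_PIN = 9;    // pan servo signal
    const int TILT_PIN = 10;  // tilt servo signal

    Servo pan, tilt;

    void setup() {
      pinMode(PIR_PIN, INPUT);
      pan.attach(PAN_PIN);
      tilt.attach(TILT_PIN);
    }

    void loop() {
      // The sensor is active-low: LOW means motion was detected.
      if (digitalRead(PIR_PIN) == LOW) {
        // Sweep both servos as a visible response to motion.
        for (int angle = 0; angle <= 180; angle += 5) {
          pan.write(angle);
          tilt.write(angle);
          delay(15);
        }
      }
    }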

Nathan Levin:

This week I soldered the Adafruit boards together and have started working with them. I have a basic proof of concept for UART communication working on the Feather's side (see attached image), though it is currently buggy: nothing past the first character gets displayed. I think a little more tweaking of the serial settings should fix this. I'm currently working on getting communication and control working between the Adafruit board and the Ultra96 for controlling the servos, which is the last major component needed before the demo on Monday.
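For reference, a minimal Feather-side echo test over the M0's hardware UART looks roughly like this (the 115200 baud rate is an assumption and must match whatever the Ultra96 side uses):

    void setup() {
      // Serial1 is the hardware UART on the Feather M0's RX/TX pins.
      Serial1.begin(115200);
    }

    void loop() {
      // Echo each received byte straight back.
      if (Serial1.available() > 0) {
        Serial1.write(Serial1.read());
      }
    }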

Relatedly, it looks like our Feather board has too little storage to hold all of the CircuitPython libraries we'd want, so we will use the Arduino ecosystem for now. This is a slight pain, but caught now, it should not prove much of a long-term setback.


Jerry:

I've looked into turning off the board's fan, which involves controlling one of the programmable logic I/O pins. There are some tutorials on routing the pin to a custom datapath / IP block, but the main problem is that these steps typically occur before the boot image is generated. Since we already have the final boot image, this may be more difficult. I have created a topic on Xilinx's forums and am waiting for responses; if controlling the fan isn't possible with the existing boot image, we may fall back to wiring the fan differently so that it can be turned off with a hardware switch.

I have also learned how to create a new UART device on the Ultra96, which we can use to communicate with the Feather M0. Once the Feather side is set up, everything should be ready for a demo of the camera tracking a person in view. The motion sensor also works now, so the next step will be to wake the board with the motion sensor.
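On the Linux side, talking to that UART is standard serial-port code. A minimal sketch in C, assuming the device appears as /dev/ttyUL1 (the actual name depends on the device tree) and that both sides run at 115200 baud:

    #include <fcntl.h>
    #include <termios.h>
    #include <unistd.h>

    int main(void) {
        // Device name is an assumption; check the kernel log for the real one.
        int fd = open("/dev/ttyUL1", O_RDWR | O_NOCTTY);
        if (fd < 0) return 1;

        struct termios tty;
        tcgetattr(fd, &tty);
        cfmakeraw(&tty);              // 8N1, raw mode: no echo, no line processing
        cfsetispeed(&tty, B115200);
        cfsetospeed(&tty, B115200);
        tcsetattr(fd, TCSANOW, &tty);

        const char cmd = 'w';         // placeholder command byte
        write(fd, &cmd, 1);
        close(fd);
        return 0;
    }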


Team:

As a team, we worked together on some of the UART debugging and communication. Our immediate goal for today and tomorrow (this weekend) is to have a functioning demo ready for lab on Monday. We have the individual subsystems working, but the key now is to integrate them and have the camera follow someone under the motion subsystem's control and the software's analysis and direction.

Weekly Report #6 – 3/23

Jerry:

I was able to successfully quantize and compile the trained Yolov3-tiny network so that it runs on the Ultra96 board as a demo, taking in a camera feed and drawing bounding boxes. Right now this DNNDK version of the network doesn't match the behavior of the original (so the bounding boxes are wrong), but I believe I know the reason for the discrepancy. Soon, we should be able to get a working version of this demo.

Once this is done, I will turn back to implementing the low-power modes.

Karthik:

Because we got all of the materials for the project, I started and finished assembling the pan-tilt servo mount. This mount allows us to move the camera, once mounted, both horizontally and vertically as needed to track a person. There were some complications with the screws on the servo motors, but we were eventually able to get them off, even if our methods were less than conventional, to say the least. Once Jerry finishes getting the neural network loaded, we will have to start working on the control algorithms for moving the servos.

To help this process go faster, I will start on a hello world that proves we can interface with the servo motor; a sketch of what that might look like is below. Overall, I think we are moving at a good pace, but we still need to pick it up if we want to finish on time.
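A servo hello world in the Arduino ecosystem is only a few lines; this is essentially the standard Servo library sweep example, with the signal pin (9) chosen arbitrarily:

    #include <Servo.h>

    Servo servo;

    void setup() {
      servo.attach(9);  // signal pin is arbitrary here
    }

    void loop() {
      // Sweep from 0 to 180 degrees and back.
      for (int a = 0; a <= 180; a++) { servo.write(a); delay(15); }
      for (int a = 180; a >= 0; a--) { servo.write(a); delay(15); }
    }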

Nathan:

While the latter part of my week was occupied by a Model UN conference I had to dedicate significant time to hosting, earlier in the week I got access to the servo hardware and a chance to try out some basic servo functionality. I started with some basic code from the Arduino servo library, as that is one of the most ubiquitous control schemes and serves as a good proof of concept before trying CircuitPython. Unfortunately, I discovered that there seems to be a problem or incompatibility either with our servo or with the Arduino ecosystem, as the servo would refuse to do anything other than turn counter-clockwise until it hit the limit of its range of motion. Manually controlling the analog pin output did nothing to address this. To illustrate the severity of the complication: even while not nominally transmitting a PWM signal to the servo (so it should have been at rest), it consistently moved by ~10 degrees whenever a stove-top igniter ~4 m away sparked, which suggests electrical noise is coupling into the control line. This is troubling to say the least, and we are investigating the cause as a group. I am still somewhat behind schedule, though the coming week should be lighter than past ones. I will focus on servo/motor control and finish the soldering needed to get the Feather boards functional.

Team:

In addition to individual contributions specified above, Karthik and Jerry completely assembled the servo mount, which is now ready for a camera attachment to enable full pan and tilt functionality. Additionally, towards the end of the week we worked together to try to diagnose the servo problems, with a picture of our setup shown below. It should be noted that using the battery as our power source serves as a good proof of concept for how the servos will be powered in the final system, control dynamics aside.

Overall, we are mostly on track, but could be slightly behind depending on how long it takes us to get through the issues we saw this week.

Weekly Report #5 – 3/16

Nathan:

I was not able to get as much work done this week as I intended, due to a busier-than-expected spring break schedule and a lack of convenient internet access. For now, I've continued to study the Adafruit CircuitPython libraries for servo and stepper motor control, as well as their UART support. I've downloaded the necessary libraries and will try out my example code once I get access to the hardware.

Karthik:

I have also not been able to do as much work because of spring break, but I have been researching possible control logic that we could use for the servo. I will hopefully be able to implement it once I get back to CMU after the break. Because of that, we are still a bit behind our schedule, but as mentioned in the team report, we have updated the Gantt chart to better illustrate our current situation.

Jerry:

I mostly researched ways to implement autofocus for the system. Currently, I'm thinking of measuring a lookup table of reasonable focus parameters given the size of the bounding box being tracked and the current zoom level, then using a contrast-based algorithm to fine-tune the focus around that starting point.
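As a sketch of that idea (the table values, the camera hooks, and the contrast metric below are all placeholders; contrast is simulated so the example is self-contained):

    #include <math.h>
    #include <stdio.h>

    // Hypothetical focus lookup table: rows are zoom levels, columns are
    // quantized bounding-box sizes; values are lens positions measured offline.
    static const int focusLUT[3][4] = {
        {10, 20, 30, 40},
        {15, 25, 35, 45},
        {20, 30, 40, 50},
    };

    // Placeholder hooks: a real system would command the lens driver and
    // compute a sharpness metric (e.g. variance of the Laplacian) per frame.
    // Here contrast is simulated as peaking at lens position 33.
    static int g_pos = 0;
    static void setFocus(int position) { g_pos = position; }
    static double measureContrast(void) { return -fabs(g_pos - 33.0); }

    // Coarse guess from the table, then contrast-based hill climbing.
    static int autofocus(int zoom, int boxBucket) {
        int pos = focusLUT[zoom][boxBucket];
        setFocus(pos);
        double best = measureContrast();
        for (int dir = 1; dir >= -1; dir -= 2) {
            for (;;) {
                setFocus(pos + dir);
                double c = measureContrast();
                if (c <= best) { setFocus(pos); break; }  // no improvement: stop
                best = c;
                pos += dir;
            }
        }
        return pos;
    }

    int main(void) {
        printf("focus position: %d\n", autofocus(1, 2));  // prints 33
        return 0;
    }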

Team:

A few changes were made to the Gantt chart. First, the enclosure was pushed back by about a week and a half: leading up to the April 1st demo, the most important thing is getting the mechanical control and vision systems working. Some of the power-related optimization tasks were also relabeled to better reflect their software/firmware nature, as opposed to the earlier hardware controls, and the distribution of work was changed to make motor/servo control a whole-group responsibility.

Weekly Report #4 – 3/9

Jerry:

First, I was able to set up the board to be accessible without needing a monitor. This increases our flexibility when working with the board.

I have also been researching the low-power suspend mode and other options for reducing the Ultra96's power consumption. We discovered that the Linux built-in suspend probably doesn't power-gate the FPGA fabric, as power consumption only falls by half when suspended. There is a chance that the low-power suspend requires deeper tweaking in Vivado (in particular, enabling the option), though we plan to try the remaining steps anyway in case the DNNDK image already has it enabled.

Since we as a team were able to run various demo CV algorithms with the board, the logical next step will be to try to load our own network on it.

Karthik:

Throughout this week, I helped Jerry and Nathan try to set up Deephi on the Ultra96. This took a considerable amount of effort because of extra time we needed to debug issues with transferring the correct files to the Ultra96 board. Specifically, we first tried to use USB, but that didn't work because the drive was not mounted. Because of this, we eventually used the Ultra96's wifi connection to make an SFTP connection and transfer the Deephi files that way. Eventually, we got it to work, and here is a picture of us using one of the face detection demos.

This is a picture of Jerry and me running the Deephi face detection demo on the Ultra96.

On top of that, we recently acquired the servos that we are going to use for the project, so I have started looking into ways to interface with the servo motors. Overall, I think I might be slightly behind schedule, but a simple demo with a valid control algorithm for the servos should help me get back on schedule.

Nathan:

During the earlier part of this week, I spent the majority of my time finalizing the system details for the Design Report. We decided to use an Adafruit Feather board to control the motors and servos, with commands coming from the Ultra96. I investigated the differences between CircuitPython and Arduino C for the control logic. My initial inclination is to use Arduino C for the sake of portability, and I will try to get a demo working by the end of tomorrow.

I also ordered the last of the components for the mechanical control system, including the Feather board and control board accessories. On top of that, I ordered the lens; however, the vendor is apparently shady about the shipping info, so it will take about a week longer than intended to arrive. In the meantime, I'll focus on the servo control, which points the camera at the subject.

P.S. I have been investigating machining the enclosure for the camera at the CMU makerspace instead of buying it, which would save ~$100 and about two weeks of shipping time. None of us has experience with this, but we can seek advice.

Team Status Report:

Most of our time this week was spent debugging various issues while trying to run the Deephi demos on the Ultra96. The first issues started with the wifi: the wifi software that our Arch Linux OS used had a bug that stopped it from working with networks that used WPA2 security. Our first attempt to get around this involved installing a new wifi GUI that would allow us to connect to WPA2 networks. However, after doing that, we were unable to connect to any wifi networks at all. Eventually, after trying to fix the issue multiple times, we got it working by hard-resetting, unplugging all of the peripheral devices, and reflashing the Arch Linux OS on the SD card.

We also looked into ways to profile the power usage of the board. We can get a crude figure for system power just by looking at the current draw from the power source, though we'd have to find a way to adapt this approach to work with the dedicated DC adapter. We tried installing powertop to get a more detailed breakdown, but the utility wasn't able to show any power usage statistics. Next, we plan to try installing some tools from Xilinx to get a better idea of the power usage.


Weekly Report #3 – 3/2

Jerry:

I have fine-tuned the Yolov3-tiny network to specialize solely in person detection, using data from MS-COCO (the dataset used to train the pre-trained network) and the Caltech Pedestrian Dataset, which should increase the accuracy of the network for our use case. I verified that it performs better on the Caltech dataset than the pretrained network does.

Since we've formally decided to support battery operation, I have also been researching possible batteries for the system. At first I was considering USB power banks, and found that though this is possible, a fair amount of extra equipment would be needed:

  • USB power bank
  • A way to convert the 5V USB output to a 12V 4.8mm x 2.1mm jack output
  • A way to prevent the power bank from turning itself off under low current draw (!!)
  • Courage to believe that the power bank won’t draw too much power while it’s being forced on

I breadboarded a circuit that kept one of my own power banks on, but later by chance I found a different battery that outputs 12V DC directly and is well-suited to powering low-power devices for a long time, so we decided to use that instead.

Next time, I will try to make the neural network run on the board.

Karthik:

I have been working on interfacing with the GPIO pins on the Ultra96 board. This will become more useful as the parts we ordered start to come in; since we don't have all the parts yet, I am interfacing with LEDs connected through the GPIO pins as a proof of concept. To do this, I have installed various board configuration files and gone through some Ultra96 startup guides to set up both Vivado and the Xilinx SDK on my laptop. This involved getting pin assignments from a third-party site; after that, I configured both tools to use the proper pin assignments.
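For context, the LED proof of concept from the Xilinx SDK side would look roughly like this, assuming an AXI GPIO block drives the LED (the device-ID macro name depends on the generated xparameters.h, so treat it as a placeholder):

    #include "xgpio.h"
    #include "xparameters.h"

    int main(void) {
        XGpio gpio;
        // Device ID macro name depends on the block design.
        XGpio_Initialize(&gpio, XPAR_GPIO_0_DEVICE_ID);
        XGpio_SetDataDirection(&gpio, 1, 0x0);       // channel 1, all outputs

        while (1) {
            XGpio_DiscreteWrite(&gpio, 1, 0x1);      // LED on
            for (volatile int i = 0; i < 10000000; i++) {}  // crude delay
            XGpio_DiscreteWrite(&gpio, 1, 0x0);      // LED off
            for (volatile int i = 0; i < 10000000; i++) {}
        }
        return 0;
    }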

On top of this, I have been working with Nathan and Jerry to get an Arch Linux system with the Deephi libraries we need onto our board. Overall, I think we are pretty close to on schedule: now that we have the OS on the board and the Logitech webcam, we can start testing the GPIO interfacing alongside the webcam.

Nathan:

I spent my week focused on getting the Ultra96 functional with the Deephi demo IP. As that process ended up pulling in Karthik and Jerry as well, I will describe it in detail in the team section. Additionally, I worked on nailing down the specifics of our control and low-power motion subsystem. The general design is as follows: we will use the Adafruit Feather board platform (specifically their ARM M0+ based board, specialized for CircuitPython) to monitor the motion sensor with low power consumption. Upon detecting motion, it will send a signal over UART to the Ultra96 to wake it from deep sleep. This seems to be the most economical low-power way to monitor the device's surroundings. The Feather board will also control the servos and stepper motors of the control system, as Adafruit and the Arduino ecosystem have very robust software tools for precisely that purpose, and there is no effective power difference compared to using the Ultra96 itself, despite the latter's greater complexity.
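A minimal sketch of the wake-on-motion idea on the Feather side (the pin choice and wake byte are placeholders; as report #7 above notes, the sensor's output is asserted low):

    const int PIR_PIN = 5;      // motion sensor input; pin is illustrative

    void setup() {
      pinMode(PIR_PIN, INPUT);
      Serial1.begin(115200);    // hardware UART to the Ultra96
    }

    void loop() {
      // Sensor output is asserted low on motion.
      if (digitalRead(PIR_PIN) == LOW) {
        Serial1.write('W');     // placeholder wake byte
        delay(1000);            // avoid spamming wake signals
      }
    }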

Also, I forgot to mention this in the last post, but I significantly improved the website before last week’s blog post, as some visitors may have noticed, and even included one-click links to our presentations (as opposed to embedding them in a post).

Team:

In addition to the contributions above, our team had an epic this Saturday to rival the Iliad. Nathan started by trying to get some Deephi demos working on the board. Nominally, this involved downloading a Xilinx board image and transferring over some test programs, a simple feat in theory. However, upon trying to boot that board image, nothing happened. We tried reflashing the original Linux distribution, which works just fine, and also validated the flashing utility, but the Deephi boot image just wouldn't work.

Eventually, we were able to track down a UART to USB tool, which we could use to debug the boot process instead of flying blind. However, what we saw was the following message, over and over again:

Xilinx Zynq MP First Stage Boot Loader
Release 2018.2  Dec 10 2018 – 14:03:11
Reset Mode : System Reset
Platform: Silicon (4.0), Running on A53-0 (64-bit) Processor, Device Name: XCZUG
SD0 Boot Mode
Non authenticated Bitstream download to start now
PL Configuration done successfully

We tried various solutions, including booting from a USB (got nothing over debug) and even modifying the binary file, but nothing worked.

Eventually, however, we stumbled upon the true source of the problem: the lab power supply can only provide 0.5 A, which was just enough to boot into Linux but insufficient to power the boot process for the Deephi image. By increasing the voltage, we were able to provide enough power to get to the desktop, but still not enough to run the demos. We will receive a new power supply this week to fix this problem.

Edit 3/2: Corrected typo “logic” to “Linux”.