[Irene] Controls

Good news: I got the motor integrated with the system, and I updated the Bill of Materials document. I rewired the circuit to share the 12V supply between the motor and the solenoid.

Bad news: the motor turns very slowly. I think the 12V supply is not supplying 12V.

In addition, I designed the control loop for the system with the help of Prof. Bain. Essentially, it is an automatic door, like what one would encounter at the entrance to a supermarket or in an elevator. The door opens if there is movement on either side, then does not close until there is no movement on both sides. The control loop does this by taking turns checking the sensor for each side. It is like a scheduler that context switches between two tasks, giving equal time to each task to make sure that neither one gets starved out.
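As a sketch of the idea, the loop can be simulated in Python (the sensor interface and the two-state round-robin below are my own stand-ins for illustration; the real loop runs against the hardware):

```python
def make_controller(read_a, read_b):
    """Round-robin door controller: each step polls one side's motion
    sensor, alternating sides so neither one is starved. The door opens
    if either side last showed motion, and only closes once both sides
    are clear."""
    state = {"turn": 0, "motion": [False, False]}
    sensors = [read_a, read_b]

    def step():
        i = state["turn"]
        state["motion"][i] = sensors[i]()   # check this side's sensor
        state["turn"] = 1 - i               # context switch to the other side
        return any(state["motion"])         # True = door should be open

    return step
```

Feeding it stubbed sensor reads shows the door staying open until both sides report no motion.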

Next week, we need to put together the poster and a list of last-minute things to take care of. I’ll put together a comparison of how the camera performs in lit and unlit settings, and work with Jing to fix the solenoid.

[Irene] DC Motor

I explored each of the motor options in depth. Let me explain what was peculiar about each one:

NEMA 17 & L298N (cheap from amazon):

Using the stepper motor library, the motor stalls as before, and setting the pins directly did not get the motor to work either. The same code that worked with the fake NEMA 17 and the borrowed driver did not work with this motor and driver.

NEMA 17 & L298 hbridge (expensive from sparkfun):

I could not figure out how to get this driver to work, despite trying various wiring configurations. The LEDs light up for MotorA and MotorB, which is a good sign. I tried hooking up the H-bridge and the motor to separate power supplies and to the same power supply. I tried setting different combinations of the enA and enB pins, along with the in0-in3 pins. I followed what the datasheet instructed for having a coasting motor. I researched what a stepper motor should be doing and found an article on stepping modes. I manually set the in0-in3 pins according to what the coils were expecting, and huzzah!

(animated GIF)

I could not get the full-step or half-step modes to work, but I got the wave-drive mode to work for counterclockwise rotation. Unfortunately, I could not get the stepper motor to rotate clockwise.
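For the record, wave drive energizes exactly one coil input at a time. A quick Python sketch of the stepping table (the in0-in3 ordering relative to the motor coils is an assumption here, since it depends on the wiring):

```python
# Wave drive: exactly one of the four driver inputs is high per step.
# Each tuple is the (in0, in1, in2, in3) pin state for one step.
WAVE_SEQUENCE = [
    (1, 0, 0, 0),
    (0, 1, 0, 0),
    (0, 0, 1, 0),
    (0, 0, 0, 1),
]

def wave_step(step_index, reverse=False):
    """Pin states for a given step. Walking the sequence in reverse
    order should, in principle, reverse the rotation direction."""
    seq = WAVE_SEQUENCE[::-1] if reverse else WAVE_SEQUENCE
    return seq[step_index % len(seq)]
```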

NEMA 17 & TB6600 motor driver:

The LED indicator told me the motor was asking for more current than it was allowed to have according to the datasheet. I found various wiring instructions online, each slightly different. Nathan and I tried all sorts of configurations trying to guess how the motor should have been wired up, with little luck. Furthermore, peeling off the cover of the motor driver, I found that the labels on the PCB did not match the labels on the outside of the cover. The pulse pins were mislabeled as enable pins. I gave up on this option, but in hindsight, I bet if I fed the motor more current, it might have behaved differently. On adafruit.com the NEMA 17 spec for max current says 350mA, but other sources say the motor can handle more. Weird.

Servos:

The PWM from my Arduino that worked so reliably last week no longer worked this week. The falling edge was not as fast as before, and my servos no longer cooperated with my Arduino.

Furthermore, I realized that generating PWM from the Jetson is not a good idea. The easiest solution by far is to connect an Arduino to the Jetson TK1: the main program running on the Jetson decides what the motors should do, and it sends that info to an Arduino that does the actual PWM control. To control motors or servos, you can either attach a servo/motor shield to a regular Arduino or use an Arduino clone that includes its own motor controller, so no separate shield is needed. Low-level real-time operations like PWM need consistent “hard-realtime” timing that a microcontroller is much better suited to than a big CPU / application processor like the Tegra.
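To make the hand-off concrete, here is a minimal sketch of a Jetson-side command encoder. The one-line ASCII format and the serial-port usage in the comment are made-up illustrations, not an existing protocol:

```python
def encode_servo_command(angle):
    """Encode a servo target angle (0-180 degrees) as a one-line ASCII
    command for the Arduino sketch to parse, e.g. b"S090\n".
    This command format is hypothetical, for illustration only."""
    if not 0 <= angle <= 180:
        raise ValueError("angle out of range")
    return ("S%03d\n" % angle).encode("ascii")

# On the Jetson, the encoded line would be written to the Arduino over
# a serial port, e.g. with pyserial:
#   serial.Serial("/dev/ttyACM0", 9600).write(encode_servo_command(90))
```

The Arduino then owns the timing-critical PWM loop, and the Jetson only sends occasional high-level commands.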

DC Motor:

I had actually forgotten that I had a DC motor in my possession until Jing pointed it out. The motor asks for 12 volts and has incredible holding torque when turned off – I didn’t measure exactly how much, but it’s more than even Professor Bain’s hands. Reversing the direction of the current makes the motor turn the other direction. I imagine I could build an open-loop circuit out of transistors to control the direction of the motor using GPIO pins. Fortunately, I don’t have to. I can use the L298 hbridge (expensive from sparkfun). All I need to do is swap the settings of the in1 and in2 pins to reverse direction, and I can hold both pins at 0 to stop the motor.

// Counterclockwise: drive current one way through the motor
void ccw() {
  digitalWrite(in1, HIGH);
  digitalWrite(in2, LOW);
}

// Clockwise: swap in1/in2 to reverse the current direction
void cw() {
  digitalWrite(in1, LOW);
  digitalWrite(in2, HIGH);
}

// Hold: both pins low stops the motor
void hold() {
  digitalWrite(in1, LOW);
  digitalWrite(in2, LOW);
}

The Jetson should be able to control the motor entirely through GPIO pins, since only the enA, in1, and in2 pins on the H-bridge need to be set. And I can share the 12V power supply with the solenoid bolt. The motor draws at most 0.13A.

TL;DR

I have a motor that turns and I can control it.

I’ll mount the pulley system tomorrow so we can show it off at the in-class final demo. Next week, we’ll be doing the presentation. I will revise the project paper according to the design review feedback and update its diagrams to reflect the system changes.

[Irene] Motor Drivers

I used a waveform generator to output a square wave with T = 2.0ms, Ampl = 3.3Vpp, HiLevel = 3.3V, LoLevel = 0V, Offset = 1.65Vdc, and duty cycle between 20% and 80%. The servo seems to work fine, with a few glitches when the duty cycle was low.

(animated GIF)
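For reference, the pulse width implied by those settings is just duty cycle × period (plain arithmetic, no hardware assumed):

```python
def pulse_width_ms(period_ms, duty_cycle):
    """High time of the square wave, in milliseconds."""
    return period_ms * duty_cycle

# With T = 2.0 ms, the 20%-80% duty sweep corresponds to pulse widths
# of 0.4 ms to 1.6 ms.
```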

However, using the servo library on the Arduino, the pwm output from pin 9 looks funny. Here is a comparison on the oscilloscope:

Yellow is the square wave from the waveform generator, and green is the output from the Arduino using the Servo.h library and the sample code provided by Adafruit here: https://learn.adafruit.com/adafruit-arduino-lesson-14-servo-motors/arduino-code-for-sweep. Upon inspection of the Arduino Servo library, it looks like the servo.write() function sets an interrupt. When I set my own PWM using the following code:

// Bit-banged PWM: hold the pin high for active_us, then low for the
// rest of the period. (servoPin, active_us, and period_us are
// declared elsewhere in the sketch.)
void loop()
{
  digitalWrite(servoPin, HIGH);
  delayMicroseconds(active_us);
  digitalWrite(servoPin, LOW);
  delayMicroseconds(period_us - active_us);
}

Hallelujah, the Arduino output has the correct duty cycle and period, but the amplitude voltage is lower. Here is what the output from the oscilloscope looked like:

In the context of the system, I would need to use interrupts to manage the PWM when integrated with the sensors.

I ordered parts for five alternatives at once because I don’t think we have time to order parts for a Plan B later. In order of potential, my options are:

  1. NEMA 17 & L298 hbridge (cheap from amazon): There are many resources out there using these two parts together and I (almost) had it working with the same type of hbridge last week. 
  2. NEMA 17 & L298 hbridge (expensive from sparkfun): I’ve learned from Build18 that cheap parts are not reliable, so here’s an alternative. The hbridge is the same model, but the breakout board looks more robust and the heat sink is larger.
  3. NEMA 17 & TB6600 motor driver (as recommended by Nathan Serafin). This driver is very heavy duty and reliable, but there are not many resources out there for how to use it.
  4. Continuous rotation servo: I feed it a time interval and it spins for that amount of time, but it is very imprecise.
  5. The original servo: I believe I could hack something together to make this work, but it wouldn’t be pretty. I have better options, and this is only a backup on backups.

If the option that I go with does not have enough torque (which is unlikely), I can use a counterweight to require less torque.

I got through trying options 1-3. Here are my results.

  1. NEMA 17 & L298 hbridge (cheap from amazon): (animated GIF)

    The motor stutters, but is drawing much less current than the imposter NEMA 17.

  2. NEMA 17 & L298 hbridge (expensive from sparkfun): I wired everything, but there is no current flowing. I looked at the user manual and found operating instructions that include the use of ENA and ENB, which I haven’t needed when following other tutorials. I will try more wiring configurations.
  3. NEMA 17 & TB6600 motor driver: This I have no clue how to wire up (LOL). I will meet with Nathan to figure it out.

In other news, I have mounted the camera and LED to the door. They are both taped on so we can remove or adjust them.

Here is what the camera output looks like:

And with the lights off and LED on:

And I was standing about two feet away:

I’m behind and still working to solve the same problems as last time.

[Irene] Motor Alternatives & Timing

First, I attached the solenoid to the door:

I happened to be in the right place at the right time to overhear Professor Nace say that Ryan Bates from Tech Spark has a box of stepper motors. I talked to Ryan and he let me borrow a stepper motor and L298N Dual H-Bridge Motor Controller. When I followed a tutorial using the same motor and H-bridge motor controller with an Arduino, I couldn’t get the system working.

 

I poked around with an oscilloscope and saw that there is current flowing through the H-bridge, but not a lot. Next, I asked Nathan Serafin for advice, and he said he could lend me a few motors. I tried switching out the motor, but it still did not work. I met with Sam to try to get the motor working, and he pointed out that my power supply could not source the amount of current needed at the desired voltage. He borrowed a bigger power supply from the CMR garage, but the motor still didn’t work, so we borrowed an identical H-bridge component from another team. The motor moved with the new H-bridge, but that H-bridge was overheating. We concluded that the H-bridge from Tech Spark was faulty, and that this particular H-bridge model is not strong enough to drive the stepper motor anyway. Sam suggested I investigate the servo again, which I will be doing. I also talked to Nathan Serafin again and he suggested researching this alternative H-bridge.

In between all the motor things, I found time to do a timing analysis on the integrated machine vision script. I segmented the code into the following blocks: initialization, resizing, converting to grayscale, applying Gaussian blur, calculating frame delta & applying binary threshold & computing contours, tracking where to crop, prediction, and drawing the frame. I counted the number of times each block was executed, the total time spent executing a code block, and the total time for execution of the whole script. I found that image saving, image resizing and drawing things on top of the frame took the most time. Therefore, for the integrated script, I removed the text and rectangle overlays on top of the frames and only displayed one frame. I am keeping the motion detection script separate with all the intermediate frames displaying for the purposes of fine tuning and debugging.
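The per-block instrumentation can be sketched like this (a simplified stand-in for the real script; the block names used in the example are from the list above):

```python
import time
from collections import defaultdict

class BlockTimer:
    """Accumulate call counts and total wall-clock time per named
    code block, for a timing analysis like the one described above."""
    def __init__(self):
        self.counts = defaultdict(int)
        self.totals = defaultdict(float)

    def measure(self, name, fn, *args, **kwargs):
        """Run fn, charging its wall time to the named block."""
        start = time.perf_counter()
        result = fn(*args, **kwargs)
        self.totals[name] += time.perf_counter() - start
        self.counts[name] += 1
        return result
```

Each pipeline stage is then wrapped, e.g. `timer.measure("grayscale", to_gray, frame)`, and the counts and totals are printed at exit.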

I will be placing orders for alternative motor parts and trying them out. In the next status report, I will detail the pros and cons of each alternative, and why they did or did not work. I’m busy until the start of carnival, which works out well because the new parts will not arrive until Thursday at the earliest. I can work over Friday and Saturday.

[Irene] Servo Setbacks

I ran into servo problems this week. I started by trying to get the servo to turn using an Arduino, but it kept jittering. I tried replacing Vin and GND with a power supply from the 220 lab, but still no luck. I thought it was an issue with the PWM from the Arduino, but when Ronit lent me a mini servo from 18-349, the Arduino handled it fine. I’m also confused because the servo would draw close to 1A when it was jittering. I’m not sure if it’s a problem with the way I’m wiring or controlling the motor, or if it’s a faulty motor. I asked around and someone suggested looking into stepper motors. The advantage there is continuous rotation: I don’t need to worry about the wheel diameter. However, stepper motors are power drains, especially when doing no work at all.

contour

con·tour | /ˈkänˌto͝or/

(noun) A curve joining all the continuous points (along the boundary), having same color or intensity.

I did achieve my goal to implement image cropping. The current frame is smoothed and converted into grayscale, and then the absolute difference is taken between this and the reference frame. A binary threshold is applied to the frame delta such that if a pixel’s intensity is above a set threshold, the pixel is set to maximum intensity.

For each contour in the frame delta, if the contour area is greater than the minimum contour area parameter, the contour is considered actual motion. The image is cropped around all significant contours. To track where to crop, a bounding rectangle is drawn around each contour, and the outermost corners across all the contours set the cropping region. The original and cropped frames are written to file, and the smoothed grayscale current frame is saved as the new reference frame.
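The cropping-region logic can be sketched with numpy alone. Note the assumptions: the real script uses OpenCV contours with the minimum-area filter described above, while this sketch takes the bounding box directly over the thresholded pixels, which yields the same outer corners in the simple case and skips the area filter:

```python
import numpy as np

def crop_region(reference, current, threshold=25):
    """Binary-threshold the frame delta and return the bounding box
    (x0, y0, x1, y1) around all changed pixels, or None if no motion.
    The threshold default is a placeholder, not a tuned parameter."""
    delta = np.abs(current.astype(int) - reference.astype(int))
    mask = delta > threshold                  # binary threshold
    ys, xs = np.nonzero(mask)
    if len(xs) == 0:
        return None                           # nothing moved
    return int(xs.min()), int(ys.min()), int(xs.max()) + 1, int(ys.max()) + 1
```

The returned corners are then used to slice the original frame before it is written out.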

This week was high in discouragement and low in progress. Next week, I will ask Sam and Cyrus for advice at the beginning of the week, and have new parts picked and an order in by Tuesday. I will also have the parameters for the motion detection script tuned and tested as it is integrated with Jing’s ML.

[Irene] Sliding Panel & GIFs

I started the week off by attaching the panel to the drawer slides.

Then, Prof. Nace helped me out with the band saw to create two rectangular prisms for the other side of the drawer slides. I secured the drawer slides to the rectangular prisms and attached them to the indoor side of the door.

It’s a little lopsided, but it works.

(animated GIF)

Unfortunately, devices are not mounted on the door because I was waiting on parts. However, I made up for that by spending time getting ahead on the motion detection script. I prepared testing data by downloading videos of raccoons and editing ten clips of raccoons that will be used as input to the system. For the motion detection, when a large enough change is detected between the current frame and the reference frame, the program saves a JPEG file, which is the file type that Jing is working with for the ML model. The current frame is set as the new reference frame. This allows the program to output a series of images spaced out in time. The images are noticeably different from each other such that our ML model is not wasting time computing on redundant images. This script does not do any image cropping. To demonstrate the program functionality, I have strung together the image outputs from the program as GIFs:

(animated GIF)

(animated GIF)
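The save-on-change logic amounts to: compare, save, promote. A minimal numpy-only sketch (the real script works on camera frames and writes JPEGs via OpenCV; the changed-fraction and intensity thresholds here are placeholders):

```python
import numpy as np

def process_frame(reference, current, min_changed_fraction=0.01, threshold=25):
    """Return (save, new_reference): save the current frame as a
    keyframe iff enough pixels changed versus the reference, and in
    that case promote the current frame to be the new reference."""
    delta = np.abs(current.astype(int) - reference.astype(int))
    changed = np.mean(delta > threshold)      # fraction of changed pixels
    if changed >= min_changed_fraction:
        return True, current                  # write JPEG; promote frame
    return False, reference                   # skip redundant frame
```

Running this over a video yields a series of images spaced out in time, each noticeably different from the last.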

Deliverables for next week are to have the rotating servo that lifts the panel and image cropping implemented in the motion detection script.

[Irene] Building the Pet Door

I laser cut all the door parts with the assistance of my friends, Alfred Chang and Christian Manaog. Two identical rectangles were laser cut from the 2ft x 2ft plywood. These are the two sides of the hollow door. Professor Nace helped me cut 1 inch cube spacers with the band saw and taught me how to use the power drill. We used bolts to hold the two plywood sides in alignment and the cubes to maintain spacing. I installed the pet door and here is the result:

Here is the installation guide for the drawer slides: https://images.homedepot-static.com/catalog/pdfImages/d5/d57bf3c8-71fe-4ed0-af53-a427049d4421.pdf. It will be adapted to fit our needs. A 23cm x 21cm plywood panel has already been cut. #8-32 machine screws will be used to secure the panel to the drawer members. Two 12in x 1in x 1in rectangular prisms will be cut using a band saw. Wood screws will be used to attach the cabinet members to the rectangular prisms, and #8-32 machine screws will be used to attach the prisms to the plywood. The solenoid will be secured using #4-40 machine screws and the servo will be secured using #6-32 machine screws.

Since the panel is 21cm high and the servo rotates 180 degrees, the diameter of the wheel needs to be 13cm. The bottom of the wheel needs to be 21cm above the top of the panel. There is not enough space above the pet door, so I will need to elevate the servo. I plan to laser cut and band saw a few more parts to create an extension mount. We don’t have time to order more parts, re-cut more plywood, and rebuild the door. Since this is a prototype, this is okay, and it is convenient for storage because we can take off the servo extension mount.
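The 13cm figure comes from arc length: a 180 degree turn winds half the wheel’s circumference, which has to equal the 21cm of panel travel. Checking the arithmetic:

```python
import math

panel_travel_cm = 21.0
# A 180 degree turn winds half the circumference: pi * d / 2 = travel,
# so d = 2 * travel / pi, which comes out to roughly 13.4 cm.
wheel_diameter_cm = 2 * panel_travel_cm / math.pi
```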

Parts have arrived, and progress is on schedule to complete the door by March 23, and the computer vision script by March 30. I am on track to be ready for integration on April 1.

I will be able to use the power drill for installing the drawer slides and panel, and will only require additional assistance for the band saw. Deliverable for March 23 is the completed door.

[Irene] Let’s Laser Cut

In preparation for the design review, I did a dry run for Sam. Jing and I went to Home Depot to pick up the plywood and a few other small hardware parts. I wrote the abstract, intro, requirements, architecture overview, and future work sections of the project paper. I ordered the remaining parts.

I responded to the design review feedback in the design document. After consulting a few instructors and people more knowledgeable than us in the machine vision field, I learned the following. In order of preference, our testing options are as follows: live animals, taxidermy, video feeds, printed high resolution pictures, and stuffed animals. My friend has a cat and Jing’s friend has a cat, but animals aren’t allowed on campus. Thus we will record footage of said cats interacting with the system and the system responding appropriately. For raccoons, we won’t be able to find live raccoons to test our system on. A taxidermy raccoon is as close as we could get to a live raccoon, but they are expensive. We have decided to look for videos similar to the footage the camera would actually capture of a raccoon. This is better than printed high resolution photos because video footage of an animal replicates reality more closely than a video of a printed picture. Stuffed animals are not a good test of our machine learning algorithm because they don’t represent actual animals. A machine learning model that classifies stuffed cats as real cats would be considered a poorly trained classifier.

There is no power metric because the device will be plugged into a wall outlet.

The next large milestone is to have version 1 of the door finished by March 22. I am a little anxious about finding power tools and small metal parts to construct the door. To alleviate that, I am proactively working on obtaining the right hardware from Home Depot and getting the plywood laser cut first. Whenever there is a delay in the construction of the door due to waiting on parts, I can context switch to writing the computer vision Python program for motion detection and tracking.

I will have the door parts laser cut by the end of next week.

[Irene] So Many Diagrams

This week, I put together diagrams to organize our thoughts and make sure we’re all on the same page about interconnections. First, I drew the door design. In a moment of panic, I thought that the servo would not be strong enough to lift a wooden panel, so I changed it to a flappy door design:

Then a mechanical engineering friend told me to redo the calculations, and I realized that kg-cm means kilograms of force times centimeters of distance. So our 10kg-cm servo can lift 2kg at a 5cm radius. So we’re back to the lifting door design.
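In numbers (treating kg-cm as kilogram-force times centimeters):

```python
servo_torque_kg_cm = 10.0   # rated torque from the servo spec
wheel_radius_cm = 5.0       # radius at which the panel's weight acts

# torque = load * radius, so the maximum liftable load is torque / radius
max_load_kg = servo_torque_kg_cm / wheel_radius_cm   # = 2.0 kg
```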

I learned that it is important to have consistent lighting for machine vision. I selected an LED that content creators on YouTube often use for filming. It will be connected to a power relay that is controlled by the Jetson. An alternative lighting mechanism would be IR lighting. IR wavelengths of 850nm and 940nm (also called NIR – near infrared) are commonly used in machine vision. IR reduces the effect of object color, glare, and reflections. IR has a longer wavelength than visible light, which usually results in greater transmission into materials like paper, cloth, and plastic. IR wavelengths react differently to materials and coatings than visible light, so certain defects and flaws can be detected with IR where visible light does not work. One drawback is that IR lighting changes the color of the cat’s fur in the image, and therefore our machine learning model would have to be trained on IR images. That dataset is hard to find.

Here are the interconnections between all the parts, along with the software we need to write:

And an events diagram for what happens after what, which will be converted into a flowchart of images:

I did some reading on computer vision, specifically motion detection and tracking. Basically, we can detect motion by taking the average of the past ten frames and comparing it to the current frame. Read the team status update for more details!
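That frame-averaging approach can be sketched with numpy (the window size and threshold below are placeholders; the real parameters would be tuned during integration):

```python
import numpy as np
from collections import deque

class MotionDetector:
    """Flag motion by comparing the current frame against the mean of
    the last N frames (a simple running-average background model)."""
    def __init__(self, window=10, threshold=25):
        self.frames = deque(maxlen=window)
        self.threshold = threshold

    def update(self, frame):
        """Return True if the frame differs enough from the average."""
        if not self.frames:
            self.frames.append(frame.astype(float))
            return False                      # first frame seeds the window
        background = np.mean(self.frames, axis=0)
        delta = np.abs(frame.astype(float) - background)
        self.frames.append(frame.astype(float))
        return bool(np.any(delta > self.threshold))
```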

Onwards to design review!

Irene