Fernando’s Status Report for April 30

What did you personally accomplish this week on the project?

This week I continued working on integrating the tracker via library calls for the camera's zoom functionality.  I also prepared for testing by writing code that takes different video formats and reads each frame for processing, which involved reading up on the OpenCV and skvideo documentation.

Is your progress on schedule or behind? If you are behind, what actions will be taken to catch up to the project schedule?

I am personally on schedule, but our team has yet to get the camera working, so overall we're still behind.  I think we should speak more with one of our professor's colleagues about configuring the camera, as they've worked with streaming before.

What deliverables do you hope to complete in the next week?

Next week we will have a fully functional robot 🤖.  What's left is to confirm that the streaming is adequate and to test real-time detection performance on the Jetson Nano.

Fernando’s Status Report for April 23rd

What did you personally accomplish this week on the project? Give files or photos that demonstrate your progress.

This week we collectively focused on integrating the system and on our final presentation.  Some of the larger tasks involved setting up the SSH server on the Jetson Nano, configuring the PyTorch requirements on the Jetson Nano, and running some rough tests of the KLT using images of animals.

Is your progress on schedule or behind? If you are behind, what actions will be taken to catch up to the project schedule?

My progress is behind: I would have liked to test the tracker on more videos of animals at different distances from the camera, but I could not get the ffmpeg dependencies right for skvideo, which I was using to convert regular mp4 videos into npy arrays usable by our KLT.  Catching up will require getting skvideo to work, recording videos of animals at different distances, and making sure the tracker runs at an appropriate speed.  If not, our plan B is to use the base KLT offered by OpenCV.  Perhaps there is another way of converting regular videos to npy without skvideo?

What deliverables do you hope to complete in the next week?

In the next week, I’d like to have a fully integrated tracker.

Fernando’s Status Report for April 16

What did you personally accomplish this week on the project? Give files or photos that demonstrate your progress.

This week we made further progress on the cooperation between the Arducam and the tracker, which now sends incremental updates of the target's bounding box that are translated into positive or negative motor steps in the x and y directions of the camera's pan and tilt functionality.
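The bounding-box-to-steps mapping might look something like this sketch (the gain, deadband, and step sign convention here are illustrative placeholders, not our actual motor parameters):

```python
def bbox_to_steps(bbox, frame_size, deadband=20, gain=0.1):
    """Convert a tracker bounding box (x, y, w, h) into signed motor steps.

    Steps are proportional to the offset of the box centre from the frame
    centre; a small deadband keeps the motors still when the target is
    already roughly centred, avoiding jitter.
    """
    x, y, w, h = bbox
    dx = (x + w / 2) - frame_size[0] / 2  # horizontal offset in pixels
    dy = (y + h / 2) - frame_size[1] / 2  # vertical offset in pixels
    pan = int(gain * dx) if abs(dx) > deadband else 0
    tilt = int(gain * dy) if abs(dy) > deadband else 0
    return pan, tilt
```

A proportional gain with a deadband is the simplest stable choice; if the motion still oscillates, lowering the gain or widening the deadband is the first knob to turn.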

Is your progress on schedule or behind? If you are behind, what actions will be taken to catch up to the project schedule?

Our progress is currently on schedule.

What deliverables do you hope to complete in the next week?

In the next week we hope to have a further tested tracking camera that moves with its target smoothly.

Fernando’s Status Report for April 9

What did you personally accomplish this week on the project?

This week I didn’t contribute much to the project.  #Carnival22

Is your progress on schedule or behind?

Our progress is on schedule.  In the updated schedule, we are set to focus on integration until the following week, after which we will be testing our code.

What deliverables do you hope to accomplish this week?

This week I’d like to deliver a mostly vectorized KLT that has been integrated into the Arducam’s API.  This would allow the camera to move with its target while cooperating with the second camera to snap an adequate photo of the target.  Upon testing the code on some of our videos, I will decide whether our from-scratch KLT will suffice or whether we should migrate to the KLT provided by OpenCV, which would most likely be faster.

Fernando’s Status Report for March 26

What did you personally accomplish this week on the project?

I have been falling behind on the tracking feature’s development and have just begun testing. So far the KLT has been tested on three videos: two of moving cars and one helicopter landing-platform demo. The tracker is meant to maintain the bounding box on the main cars in the first two videos and on the platform’s identification number in the third. The KLT works successfully on the landing video, but refocuses the bounding box onto unwanted targets in the car videos.

Is your progress on schedule or behind?

I am currently behind schedule. This next week I plan to test the KLT on videos of actual animals in different environments and under different conditions, such as lighting, contrast, and varying levels of occlusion (which the KLT should be resistant to, provided they aren’t too disruptive).

I have also yet to adapt the KLT to whatever format the Arducam records video in, as opposed to the npz format it is being tested on now.

What deliverables do you hope to complete in the next week?

By next week, I should have tested the KLT on videos of actual animals and have it working with the video/picture format used by the Arducam.

Fernando’s Status Report for March 3

What did you personally accomplish this week on the project?

This week I helped work on the Design Review paper.

Is your progress on schedule or behind?

I’m a little behind schedule, as I would like to make more changes to the Design Review, and I plan on finishing it in the coming days.

What deliverables do you hope to complete in the next week?

A completed Design Review and some interface between the KLT and the Nvidia Jetson Nano.

Fernando Paulino’s Status Report for February 19

What did you personally accomplish this week on your project?

This week I focused on familiarizing myself with our cameras’ SDK and their zoom, focus, and autofocus functionality.  I also worked on the tracking algorithm.  Thus far, we have a KLT for simple translation transformations of our target.

Demoing the camera’s SDK

Is your progress on schedule or behind? If you are behind, what actions will be taken to catch up to the project schedule?

After settling into our schedule and more concrete individual tasks, I believe we are on schedule.  With Sid making progress on the detection section, I shifted my focus to mostly tracking for now.  We will be ready to start testing the baseline tracking algorithm by next week as planned.

What deliverables do you hope to complete in the next week?

By next week, the baseline tracking algorithm, a KLT for full affine transformations, will be ready for testing.  I would also like to implement some code for synchronizing the camera with the tracker.

Fernando Paulino’s Status Report for February 12

What did you personally accomplish this week on the project?
This week I contributed to our project proposal and the corresponding slides. I also reached out to some nature photographers to learn more about their process for photographing animals and some of the challenges they face.

We also spent some time researching different kinds of cameras that may be compatible with the Jetson Nano and would adequately meet our needs. Lastly, I reviewed some of my previous projects in recognition, in preparation for the detection systems we will implement for this project.

Is your progress on schedule or behind?
I’m a bit behind: I would have liked to have made some solid progress on the actual code, as well as done more research on OpenCV’s libraries to determine which, if any, may be of use to us, so that we don’t waste time building something that’s already openly available.

What deliverables do you hope to complete in the next week?
In the next week, I hope to have some deliverable code for animal detection that utilizes an animal image database, as well as a more concrete log of responses from the nature photographers who have agreed to help us with their insight and experience.