Team Status Report for 3/18

Our current focus has primarily been on getting the devices to communicate properly with the Raspberry Pi. Getting the thermal printer to print the card designs correctly has been critical, since we need those designs finalized before we can begin training our model. Card recognition is a main component of our project, so we are a little behind schedule in this area. For now, we plan to print most of the designs by sending them from an Arduino Uno to the thermal printer. We made this switch because of the Arduino’s abundant documentation and its user-friendly IDE and libraries. (Previously, we were using minicom on the Raspberry Pi to send ASCII files to the printer.) The Arduino Uno is our backup plan for the demo in case we aren’t able to print from the Raspberry Pi. We aim to get all 52 cards, with suits, printing successfully this coming week.

In terms of other devices, the keyboard and LCD screen device drivers have been written, keeping us on schedule for this week. The plan is to work on interfacing with the camera next week.

 

Miya’s Status Report for 3/18

This week I focused on getting bitmaps to print on the thermal printer. We’ve been trying to finalize the card designs so we can begin training for card recognition. Since we are using a Raspberry Pi to interface with all of our devices, I’ve been trying to make manual bitmaps/ASCII images to send to the printer. At the moment, the backup plan if we stay on the Raspberry Pi is to use ASCII images. Getting the Raspberry Pi to work with the thermal printer has been pretty difficult, since there is little up-to-date documentation on sending bitmaps between the two devices.
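For context on what sending a bitmap actually involves, here is a minimal sketch of the direction we’re heading, assuming the printer understands the common ESC/POS raster command (GS v 0) and that /dev/serial0 has already been configured to the printer’s baud rate; the path, dimensions, and test pattern are placeholders rather than our final code:

    use std::fs::OpenOptions;
    use std::io::Write;

    /// Send a 1-bit raster image to an ESC/POS-style thermal printer over a serial port.
    /// `bitmap` is packed MSB-first, `width_bytes * 8` pixels wide and `height` rows tall.
    /// Assumes the port was configured beforehand, e.g. `stty -F /dev/serial0 19200 raw`.
    fn print_raster(bitmap: &[u8], width_bytes: u16, height: u16) -> std::io::Result<()> {
        let mut port = OpenOptions::new().write(true).open("/dev/serial0")?;

        // GS v 0: "print raster bit image", mode 0 (normal scale).
        let mut cmd: Vec<u8> = vec![0x1D, 0x76, 0x30, 0x00];
        cmd.extend_from_slice(&[
            (width_bytes & 0xFF) as u8, (width_bytes >> 8) as u8, // xL, xH: width in bytes
            (height & 0xFF) as u8, (height >> 8) as u8,           // yL, yH: height in dots
        ]);
        cmd.extend_from_slice(bitmap);
        cmd.extend_from_slice(b"\n\n\n"); // feed the paper past the tear bar

        port.write_all(&cmd)?;
        port.flush()
    }

    fn main() -> std::io::Result<()> {
        // An 8x8 solid black square (1 byte wide, 8 rows) just to prove the path works.
        print_raster(&[0xFF; 8], 1, 8)
    }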

In the meantime, as another backup plan for the demo, I’ve figured out how to configure an Arduino Uno to send bitmaps to our thermal printer. This took quite a bit of troubleshooting, since there were issues with library permissions and with the Arduino IDE not being compatible with macOS on my machine, so I had to switch to using Windows. I was able to get faint images of a heart suit printed, but it still needs some more fine-tuning for the finalized designs.

Next week I hope to get more of the 52 card designs done, so training can start as soon as possible.

Mason’s Status Report for 3/18

This week I was working on the device drivers for the keyboard and the LCD screen. The keyboard was a holdover from last week, which I wasn’t able to finish on time. This week I was able to catch up and make an MVP version of both drivers. Implementing the drivers also included rearchitecting the device code to support the multithreaded model outlined in the design document. The keyboard implementation (https://github.com/mloyet/joat/pull/3) was fairly straightforward but required reading some Linux documentation and source code to figure out how the key codes are formatted. The LCD screen (https://github.com/mloyet/joat/pull/4) was slightly more complicated. It needed a new device driver to be installed, and since the protocol is not on a standard bus, it had to be registered as a platform device in the device tree. Fortunately the device is pretty well supported by Linux, so the existing device driver almost works for our use case. I say almost because I discovered that our device is actually interfaced as two LCD screens that share a bus and have different enable lines. Currently our LCD peripheral only makes use of the top half of the screen. I may modify the Linux device driver in the future to utilize both the top and bottom halves.
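For reference, the key-code formatting mentioned above comes down to the kernel exposing each key press as a fixed-size struct input_event record on the event device. The sketch below is not our driver code, just an illustration of reading and decoding those records in Rust, assuming the 64-bit kernel layout and a placeholder /dev/input/event0 path:

    use std::fs::File;
    use std::io::Read;

    // struct input_event on a 64-bit kernel: struct timeval (two 64-bit ints),
    // then __u16 type, __u16 code, __s32 value, 24 bytes total.
    const EVENT_SIZE: usize = 24;
    const EV_KEY: u16 = 1;

    fn main() -> std::io::Result<()> {
        // Placeholder path; the real keyboard shows up as some /dev/input/eventN.
        let mut dev = File::open("/dev/input/event0")?;
        let mut buf = [0u8; EVENT_SIZE];

        loop {
            dev.read_exact(&mut buf)?;
            let ev_type = u16::from_ne_bytes([buf[16], buf[17]]);
            let code = u16::from_ne_bytes([buf[18], buf[19]]);
            let value = i32::from_ne_bytes([buf[20], buf[21], buf[22], buf[23]]);

            // value: 1 = key down, 0 = key up, 2 = autorepeat
            if ev_type == EV_KEY && value == 1 {
                println!("key code {} pressed", code);
            }
        }
    }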

I am pretty much done with my tasks for this week, but there is some code that prints suits to the LCD screen that I haven’t had the chance to test yet.

Next week I plan on getting the camera operational and also spending some time looking into the receipt printer. Rachel and Miya have been working with the printer interface to figure out how to make cards, but the bitmap logic hasn’t been fully working. Perhaps a fresh set of eyes can get it working.

Rachel’s Status Report for 3/18

This week I set up the software necessary to label the input data for the YOLO algorithm. The ML model takes in images along with, for each object, the coordinates of where it appears in the image and the label it is associated with. To generate these coordinates accurately, we are using the YoloLabel software. With that set up alongside the ML model, everything is ready to take in the cards, train the model effectively, and then optimize it to be faster. We also got the cards figured out: given the trouble we were facing with the bitmapping, we have two backup plans, so if we cannot figure out the bitmaps by Monday, we can use the backup card designs to start printing them out and training our model.
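For anyone unfamiliar with the format, the label files YoloLabel produces boil down to one text line per object: the class index followed by the box center and size, all normalized to the image dimensions. A small illustrative sketch of that conversion (the class numbering and the example values are made up):

    /// Convert a pixel-space bounding box (top-left corner plus size) into a YOLO
    /// label line: "<class> <x_center> <y_center> <width> <height>", normalized.
    fn yolo_label(class_id: u32, x: f32, y: f32, w: f32, h: f32, img_w: f32, img_h: f32) -> String {
        format!(
            "{} {:.6} {:.6} {:.6} {:.6}",
            class_id,
            (x + w / 2.0) / img_w,
            (y + h / 2.0) / img_h,
            w / img_w,
            h / img_h,
        )
    }

    fn main() {
        // e.g. class 0 found at (100, 200) with size 150x210 px in a 640x480 photo
        println!("{}", yolo_label(0, 100.0, 200.0, 150.0, 210.0, 640.0, 480.0));
        // prints: 0 0.273438 0.635417 0.234375 0.437500
    }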

I am a bit behind schedule because we did not account for spring break, but once we get the data printed out, I should be able to train the model quickly and then start working with the camera and transferring my model over to the system.

In the upcoming week, I hope to generate the coordinates and labels for all of the input data pictures. I then hope to feed them into the model and start performing object detection. I also hope to optimize the model by changing the logic to only look for cards that have been dealt to the user, which can be determined from the game state.
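As a rough illustration of that last idea (the types and card labels here are hypothetical placeholders, not our actual server code), the optimization is essentially dropping any detection whose card is not in the set the game state says the player could be holding:

    use std::collections::HashSet;

    /// Hypothetical detection type; the real one will come from parsing the YOLO output.
    struct Detection {
        label: String, // e.g. "7H"
    }

    /// Keep only detections for cards the game state says the player could actually hold.
    fn filter_by_game_state(detections: Vec<Detection>, expected: &HashSet<String>) -> Vec<Detection> {
        detections
            .into_iter()
            .filter(|d| expected.contains(&d.label))
            .collect()
    }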

Team Status Report for 3/11

This week we focused on working with the printer and getting it to print out cards. We have been focusing on the bitmaps and were having some trouble getting them to look like the various suits. To manage this risk, we have created a contingency plan: design cards without the bitmap suits, using just words, to start the initial training, and then update the code to include the bitmaps once we figure them out. There have been no changes to the existing design of the system, and the schedule is still on track.

(SR #4 Q) In terms of new tools we have deemed necessary to implement our project, as we mentioned in other posts, we are using the Rust programming language to implement the software that will run on the server. Compared to other languages such as Python, Rust offers better performance in terms of concurrency and memory. This will help facilitate server-device communication, since Rust will handle encoding and decoding the JSON objects.
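As a small illustration of what that encoding and decoding look like (the message shape below is made up for the example, and it assumes the serde and serde_json crates are added to the project):

    use serde::{Deserialize, Serialize};

    // Illustrative message shape only; the real schema is defined in the design document.
    #[derive(Serialize, Deserialize, Debug)]
    struct DeviceMessage {
        device: String,
        action: String,
        payload: String,
    }

    fn main() -> serde_json::Result<()> {
        let msg = DeviceMessage {
            device: "printer".into(),
            action: "print_card".into(),
            payload: "7H".into(),
        };

        // Encode for the wire...
        let wire = serde_json::to_string(&msg)?;
        // ...and decode what comes back.
        let echoed: DeviceMessage = serde_json::from_str(&wire)?;
        println!("{} -> {:?}", wire, echoed);
        Ok(())
    }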

Miya’s Status Report for 3/11

Last week, I was primarily focused on writing the Design Review report. We had most of the details figured out since we had already gone through the presentation and mentor meeting, so we just needed to narrow down the use-case and design requirements.

So far, we are able to print letters and numbers as text with the thermal printer, but we are still trying to finalize the card designs for training. This past week was spring break, but I tried to make time to figure out how to bitmap the card suits so that the Raspberry Pi can send the correct files to the printer. I’ve been trying out some bitmap converters and am working towards testing the files in person tomorrow.
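For reference, the core of that conversion is just packing pixels into bytes, eight per byte with the most significant bit first, which is how the raster formats we’ve seen expect each row to be laid out. A small sketch, with the assumption that the row width is a multiple of eight so rows land on byte boundaries:

    /// Pack monochrome pixels (row-major, true = black dot) into printer bytes:
    /// eight pixels per byte, most significant bit first.
    fn pack_bitmap(pixels: &[bool]) -> Vec<u8> {
        pixels
            .chunks(8)
            .map(|octet| {
                octet.iter().enumerate().fold(0u8, |byte, (i, &black)| {
                    if black { byte | (0x80 >> i) } else { byte }
                })
            })
            .collect()
    }

    fn main() {
        // An 8x2 test pattern: one solid row, then one row of alternating dots.
        let mut pixels = vec![true; 8];
        pixels.extend((0..8).map(|i| i % 2 == 0));
        assert_eq!(pack_bitmap(&pixels), vec![0xFF, 0xAA]);
    }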

Mason’s Status Report for 3/11

This week I was supposed to finish writing the testing framework, which was held over from last week. I was able to write the framework the way I wanted before working on the design document, which was a major team effort. I helped Rachel understand how to use the receipt printer and explained how it was configured. This week she began constructing the card designs in order to begin her ML training.

 

Over spring break I was scheduled to figure out how to make the keyboard work. I haven’t done this as of today, but I expect to be able to figure it out tomorrow. Since keyboard support is very common, I don’t expect it to be a real holdup.

 

Next week I will figure out the keyboard and begin working on getting the LCD screen operational. The LCD screen is the most obscure peripheral, but it implements a very common LCD screen protocol.

Rachel’s Status Report for 3/11

Last week, Mason got the printer working. With the thermal printer set up, I was able to start designing the cards and printing out what I want them to look like. I worked with the bitmap encoding to attempt to recreate the different suits. I have also set up the YOLO algorithm so that it is ready to start training once I feed in the photos of the cards. My progress is currently on schedule. According to the Gantt chart we originally planned, I have until the end of this week to build the computer vision locally. I hope to complete the card designs by the start of this week and have the pictures put into a dataset so that they are ready to be fed into the YOLO algorithm. The deliverables I hope to finish this week are having all 52 cards printed out and ready to photograph, and the YOLO model completely trained so that it can start detecting cards locally.

Miya’s Status Report for 2/25

Since we’ve settled on using Rust over Python to write our test harnesses, I’ve spent this week trying to learn Rust and the process of writing test harnesses. (My main programming experience has been in Python/C, but Rust’s better memory and threading performance makes it more suitable for our system’s latency specifications.) Additionally, I’ve been trying to write some simple tests we could run in order to implement our MVP of Go Fish. Once we’re able to get the cards printed to test input recognition on the Raspberry Pi, I hope to test the RPi’s interfacing with the Camera Module.
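As a toy example of the shape these tests take (the game logic below is a made-up placeholder, not our actual code), Rust’s built-in harness just needs functions marked #[test], which cargo test picks up automatically:

    // Hypothetical helper; the real game logic will live in the server crate.
    fn has_rank(hand: &[&str], rank: char) -> bool {
        hand.iter().any(|card| card.starts_with(rank))
    }

    #[cfg(test)]
    mod tests {
        use super::*;

        #[test]
        fn asking_for_a_rank_you_hold() {
            let hand = ["7H", "7S", "KD"];
            assert!(has_rank(&hand, '7'));
            assert!(!has_rank(&hand, 'Q'));
        }
    }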

 

 

Mason’s Status Report for 2/25

This week I was supposed to write testing frameworks for the device and server, which I started work on. However, I did not completely finish them because I started working ahead on the receipt printer. It came to my attention that in order for Rachel to do her ML training, we would need to construct a set of training images of the cards we will be using. In order to do that, we need to be able to print cards, so I prioritized being able to print receipts. There were a couple of hang-ups I had to address along the way:

  1. I needed to connect the Raspberry Pi to the school internet, which I was able to do with Ankita’s guidance. This required me to mess with some of the network configuration files and learn how the wpa_supplicant daemon works and how network cards are represented on Unix systems.
  2. Then, I needed to figure out how to connect the receipt printer in a productive way. I discovered that serial ports are represented as tty ports. There were some more configuration files I needed to mess with to make sure the actual pins were connected to the file, though. Then I was able to write to the file and send data to the printer for the first time (a minimal version of that write is sketched after this list).
  3. Finally, I did some research into making it possible for my partners to ssh into the Pi instead of just me. Next time we meet, I will register their public keys as authorized keys so they can do work.
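As referenced in item 2, here is a minimal sketch of that first write, assuming the pins end up mapped to /dev/ttyS0 (the actual device name depends on how the Pi is configured) and that the port’s baud rate has already been set:

    use std::fs::OpenOptions;
    use std::io::Write;

    fn main() -> std::io::Result<()> {
        // The tty path is an assumption; on the Pi the printer may appear as
        // /dev/serial0 or /dev/ttyS0 depending on configuration.
        let mut printer = OpenOptions::new().write(true).open("/dev/ttyS0")?;
        printer.write_all(b"hello from the raspberry pi\n\n\n")?;
        printer.flush()
    }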

I fell behind on the testing frameworks this week. I am confident that I can finish them, along with my other tasks, before the end of next week so that we can get back on schedule.

Next week I am going to make the printer interface I started concrete. This task is high priority so that we can start printing cards and Rachel can be unblocked.