Leia’s Status Report for 4/27/2024

Progress

I’ve been making steady progress on debugging the Bluetooth feature of the web app. The demo web app can now successfully connect to and control the Arduino Nano 33 BLE. It can also transmit data between the two, including characters of text, and display that text on the OLED screen.
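For reference, the connection flow in the demo web app follows the standard Web Bluetooth API pattern: request a device, connect to its GATT server, and look up a characteristic. Below is a minimal TypeScript sketch of that flow; the UUIDs are hypothetical placeholders (they must match whatever the Arduino sketch advertises), and it assumes the @types/web-bluetooth definitions are installed.

    // Minimal Web Bluetooth connection sketch. The UUIDs below are
    // placeholders, not our actual values.
    const SERVICE_UUID = '0000ffe0-0000-1000-8000-00805f9b34fb';
    const TEXT_CHAR_UUID = '0000ffe1-0000-1000-8000-00805f9b34fb';

    async function connectToArduino(): Promise<BluetoothRemoteGATTCharacteristic> {
      // Browsers only allow this call inside a user gesture (e.g., a button click).
      const device = await navigator.bluetooth.requestDevice({
        filters: [{ namePrefix: 'Arduino' }],
        optionalServices: [SERVICE_UUID],
      });
      // Connect to the GATT server and walk down to the text characteristic.
      const server = await device.gatt!.connect();
      const service = await server.getPrimaryService(SERVICE_UUID);
      return service.getCharacteristic(TEXT_CHAR_UUID);
    }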


Additional 3D prints of the case attachment are underway because of initial mishaps, as well as design flaws recognized only after the prototype was developed. However, the printed parts of the refined design have been inspected and are a significant improvement over the first model.

Next Steps

All that’s left on the hardware side is further debugging of the Bluetooth feature to enable string transmission from the web app to the Arduino, along with assembly of the 3D-printed parts.
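One detail that matters for the string-transmission debugging: a single BLE write only carries a characteristic-sized payload (ArduinoBLE string characteristics are declared with a fixed valueSize, and the unnegotiated ATT payload is around 20 bytes), so longer strings have to be split into chunks. Below is a sketch of that idea; the 20-byte chunk size is an assumption that would need to match the valueSize declared in our Arduino sketch.

    // Split a string into characteristic-sized chunks and write them in order.
    // CHUNK_SIZE is an assumed value; it must not exceed the valueSize
    // declared for the characteristic on the Arduino side.
    const CHUNK_SIZE = 20;

    async function sendText(
      char: BluetoothRemoteGATTCharacteristic,
      text: string,
    ): Promise<void> {
      const bytes = new TextEncoder().encode(text);
      for (let offset = 0; offset < bytes.length; offset += CHUNK_SIZE) {
        // Await each write: Web Bluetooth rejects overlapping GATT operations.
        await char.writeValue(bytes.slice(offset, offset + CHUNK_SIZE));
      }
    }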

The Bluetooth feature will then be integrated into our final web app in its production stage, and user satisfaction will be tested. Outside of the actual project development, in preparation for the final demo we will be working on the final poster, video, and report.

Leia’s Status Report for 4/20/2024

Progress

Because of our team’s shift in solution approach from a mobile app to a web app, I have been adapting and rewriting the Arduino code to implement Bluetooth functionality with a webpage. With a basic HTML, CSS, and TypeScript setup, the demo website has been able to connect to and control the Arduino unit over BLE. A textbox input has been added and tested to write sentences from the web app to the OLED screen via the Arduino. I’ve also been trying to transplant the necessary demo web app components into our actual project web app. However, I’m facing familiar difficulties running our Python scripts: the virtual environments and modules I download continue to be incompatible. Despite downgrading versions and installing the appropriate packages, it still does not work, so I will develop the Bluetooth features in a form that my team members and I can easily transfer into the web app.
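For context, the textbox feature is a small piece of DOM wiring on top of a BLE characteristic write. The TypeScript sketch below illustrates the idea; the element IDs are hypothetical stand-ins, and textChar is assumed to be the writable text characteristic obtained after connecting.

    // Hook a textbox and a send button to the BLE text characteristic.
    // '#message' and '#send' are hypothetical element IDs.
    function wireTextbox(textChar: BluetoothRemoteGATTCharacteristic): void {
      const input = document.querySelector<HTMLInputElement>('#message')!;
      const button = document.querySelector<HTMLButtonElement>('#send')!;
      button.addEventListener('click', async () => {
        // Encode the typed sentence and send it to the Arduino for the OLED.
        await textChar.writeValue(new TextEncoder().encode(input.value));
        input.value = ''; // clear the box once the sentence is sent
      });
    }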

The 3D-printed case and phone attachment are currently underway: the design is complete and printing is in progress.

Next Steps

I will be retrieving the completed 3D print and fitting the hardware parts into it to create the complete physical product of our project. I will also continue to refine the text transmission over Bluetooth between the Arduino and our web app, and assist with the frontend if necessary.

With my team members, we will be pushing into the testing stage to ensure our product meets our use-case and design requirements. We plan to run surveys to measure user satisfaction, along with other methods to verify our project.

Additional Prompt

To operate the CAD software to import and configure models, I watched guidance videos provided by the program’s company to develop the final attachment. I also read documentation for the Web Bluetooth API to understand how a web app can implement BLE functions and pair with external devices such as an Arduino, in addition to transmitting and receiving data from the Arduino. Forums such as Stack Overflow and Chrome for Developers articles about communicating with Bluetooth devices from JavaScript helped solidify my understanding.
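As a concrete takeaway from that documentation, the receiving direction uses GATT notifications rather than polling. A minimal TypeScript sketch of the pattern, assuming a notify-capable characteristic has already been obtained during connection:

    // Subscribe to notifications and decode incoming bytes as text.
    async function listenForReplies(
      char: BluetoothRemoteGATTCharacteristic,
    ): Promise<void> {
      await char.startNotifications();
      char.addEventListener('characteristicvaluechanged', (event) => {
        // The updated value arrives as a DataView on the characteristic.
        const value = (event.target as BluetoothRemoteGATTCharacteristic).value!;
        console.log('from Arduino:', new TextDecoder().decode(value));
      });
    }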

Team Status Report for 4/6/2024

Main Accomplishments for This Week

  • Porting our current ML implementation to CoreML for iOS app compatibility
  • Adapting our app’s CV input to work with the ML program
  • Successful integration of the LLM model to grammatically correct direct ASL translations into proper sentences
  • Hardware nearing completion, with app-to-screen text functionality

Risks & Risk Management

  • As we’re in the final week of our project, the main remaining risk is trouble with total integration. The CV and ML are being developed in one app and the Arduino and Bluetooth screen in another, so the two will eventually have to be merged.
    • The risk mitigation for this is careful and honest communication across all team members. We don’t anticipate a severe failure, but in the off chance that the two apps cannot be combined, we will discuss which one to prioritize.
  • Another related concern carried over from last week is the difficulty of incorporating a pose detection model, as MediaPipe lacks one for iOS. This may reduce accuracy; our fallback, if it remains unavailable, is to focus entirely on hand landmarks.

Design Changes

  • No design changes

Schedule Changes

  • We added an extra week for NLP training and for the final system integration. 

Additional Week-Specific Question

  • Our validation tests for the overall project involve:
    • Distance tests to measure how close or far the user needs to be from the app camera
      • Our use-case requirement is that the person must be between 1 and 3.9 ft from the iPhone front camera, so we will test distances inside and outside of this range to determine whether it meets this requirement.
    • Accuracy of ASL translations displayed on the OLED screen
      • Our use-case requirement is that the accuracy of gesture detection and recognition should be >= 95%, so we have to ensure the accuracy meets this requirement.
    • Latency of text appearing on screen after gestures are signed
      • We also have to ensure our latency stays within the 1-3 second requirement, covering the ML processing, the LLM processing, and the display on the OLED screen.
    • Accessibility and user experience surveys
      • We will have ASL users test the device and collect feedback through surveys in order to reach our user satisfaction requirement of > 90%.

Leia’s Status Report for 4/6/2024

Progress

I have gotten the OLED to display text from the Arduino unit, as well as a variety of images and animations for testing purposes. The demo app that connects with the Arduino via Bluetooth now has a text feature that the user can type into. Transmitting that text from the app to the Arduino to show on the OLED is currently a work in progress.

Next Steps

I will be trying to get the app-to-OLED transmission working by the end of this week so that next week I can focus on designing and 3D printing the case that will hold the hardware components and latch onto the phone. After that, I will work with my team members to integrate their CV and ML app with my demo app into a single mobile application that handles all of our project’s goals.

Once that app is settled, I will build out the frontend to make it more user-friendly and simpler to operate.

Verification

I’ve run example code provided by Arduino to test the fluidity and latency of the OLED screen. I’ve also examined the speed at which the app is able to connect with the Arduino over Bluetooth, which has been effectively immediate. Other verification tests I will be running are:

  • additional latency tests of app-to-OLED text communication (a rough timing sketch follows this list)
  • phone attachment assessments to ensure it is easy to latch and remove
  • speed and formatting of ASL translations on the OLED
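For the latency tests, one simple approach is to time how long each write takes to be acknowledged and collect the results over many messages. The sketch below shows the idea in TypeScript against the Web Bluetooth API; it is an assumed setup (the same measurement idea applies to the mobile app’s BLE stack), and it captures only the app-to-Arduino leg, not the additional time the OLED takes to render.

    // Time a single BLE write with performance.now(); repeating this over
    // many messages gives a latency distribution for the app-to-Arduino leg.
    async function timeWrite(
      char: BluetoothRemoteGATTCharacteristic,
      text: string,
    ): Promise<number> {
      const start = performance.now();
      await char.writeValue(new TextEncoder().encode(text));
      return performance.now() - start; // milliseconds until acknowledgement
    }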

Essentially, I will be handling all the hardware inspections that ensure the physical product meets the use-case and design requirements.

Leia’s Status Report for 3/30/2024

Progress

I have successfully soldered all the components together: battery, Arduino, and OLED screen. The entire hardware side is now self-sufficient and functional. The battery powers the Arduino, and the OLED draws power from the Arduino. The battery can be charged with inductive wireless charging, but as a safety precaution it will continue to be charged by plugging in the Arduino via the micro USB cable, since the Adafruit backpack enables the battery to be charged and maintained this way. The demo app can connect to and control the Arduino via Bluetooth, and the Arduino sketches uploaded to the unit drive the OLED screen properly, with low latency and clear imagery.

Next Steps

I will now be integrating features into the demo app that allow text to be transmitted over Bluetooth to the Arduino so the OLED can display it. The app will be further refined in its appearance and overall frontend to eventually prepare for its integration with the machine learning and computer vision code in Xcode. I will also be combining my separate Arduino sketches for Bluetooth and OLED text display into one program that handles all the necessary functionality.

Further down the line, I will be planning the 3D-printed case so the hardware components fit into a neat package that can eventually be attached to the phone.

Leia’s Status Report for 3/23/2024

Progress

I continued to practice American Sign Language, particularly basic greetings and the alphabet. I’ve also been tweaking the mobile app used for testing Bluetooth capabilities, working on frontend development and implementing a textbox typing function. Because of the Adafruit issues mentioned in the last status report, another voltage regulator backpack was purchased along with a 3.7V 150mAh Adafruit battery, and both arrived this week. I tried connecting the OLED screen to the Arduino, but the loose jumper wires and rigid pins that came with both items require soldering to be affixed together. The same goes for the Adafruit backpack and battery.

Next Steps

Now that all components for the hardware side have been obtained, I will be going into the 18220 lab to solder the wires. The interim demo is approaching, so my goal is to have the entire hardware side joined together by then. I hope to at least make the Arduino module self-sufficient by connecting the battery, and then hook it to the OLED. Ideally, the OLED will also be working via the mobile app by then.

Team Status Report for 3/16/2024

Main Accomplishments for This Week

  • The issues we encountered last week carried over into this week as well, but we have made progress in resolving them and continue to work on their solutions:
    • The dynamic machine learning models were not performing as expected: regardless of the gestures made, the same word(s) were being predicted. We narrowed the vulnerability down to integration and received feedback to focus on how we extract our coordinates.
      • We were advised to identify a center of mass or main focal point, such as the wrist, to subtract from, rather than using raw xyz coordinates for landmarking.
      • Hence, we updated our dynamic processing model code and have been getting improved predictions.
    • The transmission from video to database is currently questionable. We want real-time streaming from the phone camera to the cloud environment so that gestures can be processed and interpreted immediately as they happen.
      • We received many articles on this concept to study further. Given the diversity of solutions to this problem, it’s a little difficult to identify which is best suited to our situation.
      • Hence, we are considering having the iOS app and Xcode environment directly handle the machine learning and computer vision rather than outsourcing the operations to cloud database storage. We researched whether Python scripts and related code that use OpenCV, MediaPipe, TensorFlow, and Keras can be packaged with the app in Xcode and retrieved within that package. So far this shows promise, but for safety, we will maintain our database.
  • Progress on Amplify setup
  • Arduino and mobile app Bluetooth connection

Risks & Risk Management

  • With the interim demo approaching, we hope to have definitive outcomes for all our parts.
    • We are working on further accuracy and an expanded set of training data for our machine learning models. Our basic risk mitigation in case of setbacks is to remain with the static model implementation.
    • Regarding hardware, there is a safety concern with operating the LiPo battery, but that has been minimized by extremely careful and proper handling, in addition to budget available in case a part needs replacing.
  • As mentioned in the Main Accomplishments section, there is a challenge with our plans to integrate ML and CV with the mobile app. At first we thought of using the database, but because of streaming issues we shifted to giving the mobile app local script access to the ML and CV. We will steadily work toward this, but we will keep the database as a backup and could even delegate operations to a web app to be converted into a mobile app if the database-to-app transmission continues to be a risk.

Design Changes

  • No design changes

Schedule Changes

  • We are currently approaching our milestone to launch the mobile app. We will be working together to integrate it, but if we do not accomplish everything we want for the iOS app by then, we will push back the date.

Leia’s Status Report for 3/16/2024

Progress

I was able to implement the Bluetooth connection between a mobile app and the Arduino unit. The app can currently switch the Arduino’s LED on and off and show temperature data taken from the Arduino, all over Bluetooth.

I tried to connect the battery to the Adafruit battery backpack, but some smoke came out. It turns out the battery’s wire port is not compatible with Adafruit’s receiving port because the polarities are reversed: Adafruit puts the positive contact on the right side and the negative on the left, while the battery is the opposite. Although the Adafruit circuit appears fine, there is a possibility that the chip blew. I considered taking the wires out of the battery’s port to swap the contacts, but I was informed that I could accidentally short-circuit the battery if the exposed contacts touch each other.

Next Steps

I will be trying to connect the OLED screen to the Arduino. After that, I will try to display the temperature on it to ensure the connection across the app, Arduino, and screen is seamless. The next step is to display arbitrary text on the OLED, first directly from the Arduino with its uploaded sketch, and then from the app with the Arduino as the medium. I want the user to be able to type in a textbox on the app and immediately have that text shown on the screen in real time.

I will also have to purchase a new Adafruit backpack and their LiPo battery, because none of the third-party batteries I could find on Amazon are oriented in Adafruit’s polarity.

Leia’s Status Report for 3/9/2024

Progress

I tried to connect all the components together but encountered significant difficulties. The pins of the Arduino unit unfortunately do not fit into the breadboards I purchased. I tried direct coupling with jumper wires but found that this method carries risks: connecting an external power supply to an Arduino without proper voltage regulation may damage the unit, even when the battery’s output falls within the Arduino’s accepted voltage range. I also do not have a micro USB cord to test the Arduino from my computer.

I wrote the Arduino IDE sketch that connects the Arduino unit to a mobile app over Bluetooth and communicates with the OLED screen. The mobile app is a simple testing environment solely for exercising the BLE capabilities and trying to transmit text between the two; it is not reflective of the actual mobile app we will employ. I have also downloaded all the necessary CAD models and begun creating a basic casing.

Next Steps

I found I had not acquired the appropriate tools and materials for this project. I will be purchasing an Adafruit Battery Backpack, a voltage-regulating shield that protects the Arduino unit. I will also be getting an Arduino Nano 33 BLE Sense without headers; this model has additional sensors that will aid with testing, on top of the capabilities of the Nano 33 BLE. Without headers, I will solder the components together, which is the better route since a headered unit probably wouldn’t fit my breadboard either. Finally, I will purchase a micro USB cable so I can test without needing a battery source hooked to the Arduino. Everything else is still fine to use, especially the jumper wires, battery, and screens. My highest priorities are the battery-to-Arduino connection and the Bluetooth capabilities.

Leia’s Status Report for 2/24/2024

Progress

The components for our product have been ordered through the purchasing form: an Arduino Nano 33 BLE, a 2.42” OLED display module, a 2.7” E-Ink display, a 3.7V 2000mAh lithium polymer battery, and a breadboard and jumper wire kit. The two screens and the battery have been received, and hopefully the rest of the parts will arrive this coming week. I’ve been continuing to plan how I’ll connect everything and learning sign language on YouTube. I’ve also been practicing the Swift language and the Xcode environment via Apple Developer tutorials. Specifically, there are three features I’m trying to learn to integrate into the mobile app for our MVP, and also as a backup in case our future attempts at integration across the app, Arduino, machine learning, and computer vision go awry: 1. retrieving data from the internet, such as from URLs, so we can port a web app into a mobile app; 2. recognizing multi-touch screen gestures like taps, drags, and touch-and-hold; and 3. recognizing hand gestures from the phone camera with machine learning. With Ran, I am also trying to figure out how to distribute our app onto our phones for testing purposes. She raised the issue that the Xcode-simulated iPhone does not implement a camera, so we are working to get the app onto our physical phones.

Next Steps

The third feature mentioned in the Progress section needs further analysis and discussion with my team members. Its performance is still uncertain, and how it could be combined with our ASL-adapted computer vision and machine learning remains an open question. For now, its primary use is to get our app to access the phone’s camera.

My plan is to get a working mobile app with functional buttons that lead to the settings page and the ASL page, where a small corner window shows what the phone’s front-facing camera sees. This will be broken down into further steps. Additionally, once I obtain the rest of the purchased components, I will connect the Arduino to the app using the BLE feature. I’ll attach an LED to the Arduino and see if the mobile app can control it. After that, I’ll hook a screen to the Arduino, control the screen via the Arduino, and then control the screen via the app. I realized that it’s still too early to use CAD, so my priorities have shifted to working on the mobile app and operating the hardware.