Ari’s Week 11 Status Report

This week was our most productive yet. We integrated our system and got it running on the Pi, with the processing done remotely on the ECE machine in the lab. We also got the visualization and recording working on the Pi; however, we ended up destroying the Pi's sound card, which in turn caused the touch screen to stop working. As a result, we will run the whole demo on my Chromebook. Further, we confirmed that our testing setup mimics the heart, and we successfully classified 93% of the heart sounds we tested on, which was much better than we expected.
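The exact remote-processing hand-off is not shown here, but a rough sketch of the idea is below; the hostname, directory, and `classify.py` script name are placeholders, not our actual files.

```python
# Hypothetical sketch: copy a recording from the Pi to the lab machine over SSH
# and run the classifier there. Hostname, paths, and script name are placeholders.
import subprocess

REMOTE = "user@ece-lab-machine.example.edu"   # placeholder hostname
REMOTE_DIR = "/home/user/stethoscope"          # placeholder working directory

def classify_remotely(wav_path: str) -> str:
    """Copy a WAV recording to the remote machine and run the classifier there."""
    # Copy the recording over (assumes SSH keys are already set up).
    subprocess.run(["scp", wav_path, f"{REMOTE}:{REMOTE_DIR}/input.wav"], check=True)

    # Run the (hypothetical) classification script remotely and capture its output.
    result = subprocess.run(
        ["ssh", REMOTE, f"cd {REMOTE_DIR} && python3 classify.py input.wav"],
        capture_output=True, text=True, check=True,
    )
    return result.stdout.strip()  # e.g. "normal" or "murmur"

if __name__ == "__main__":
    print(classify_remotely("recording.wav"))
```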

Ari’s Week 6 Report

This week I continued with the development and integration of the combined system. I used matplotlib to plot the incoming signal and applied a Butterworth filter to remove high-frequency noise. I also helped Eri and Ryan with the ML algorithm to speed up progress, refactored the MATLAB code, and did research on improving our processing.
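For reference, a minimal sketch of that filtering and plotting step could look like the following; the cutoff frequency, filter order, and file name are illustrative placeholders rather than our final parameters.

```python
# Minimal sketch: low-pass Butterworth filter applied to a recorded heart sound,
# plotted with matplotlib. Cutoff and order are illustrative values only.
import numpy as np
import matplotlib.pyplot as plt
from scipy.io import wavfile
from scipy.signal import butter, filtfilt

fs, signal = wavfile.read("heart_sound.wav")   # placeholder file name
signal = signal.astype(float)

# 4th-order low-pass Butterworth filter to suppress high-frequency noise.
cutoff_hz = 150
b, a = butter(N=4, Wn=cutoff_hz / (fs / 2), btype="low")
filtered = filtfilt(b, a, signal)              # zero-phase filtering

t = np.arange(len(signal)) / fs
plt.plot(t, signal, alpha=0.4, label="raw")
plt.plot(t, filtered, label="filtered")
plt.xlabel("Time (s)")
plt.ylabel("Amplitude")
plt.legend()
plt.show()
```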

Week 9 Team Report

This week our team made good progress; however, we still have a decent amount to do in the upcoming weeks. We are aiming to finish the real-time system tomorrow (Sunday) in lab. We will also decide whether to pursue AR/AS versus MR/MS depending on how much time we have, and we will try to push our accuracy consistently above 85%. We have reached 85% a few times, but we want to go higher. Finally, with the remaining time, we will focus on complete testing of the finished system to make sure our device works. We do not have a lot of time left, but I think we can finish what we need to in order to have a good final project.

Ari’s Week 9 Status Report

This week we began our integration into a real-time system. I personally accomplished a lot and am excited to integrate before our demo. The Raspberry Pi that we ordered arrived along with the touch screen that we will use to run the system. I flashed the Pi and configured the touch screen to work with it. I also started a Python program that will be the basis for our entire project. This codebase will create a simple GUI that live-displays the signal received from the stethoscope and has a button that begins the analysis. The code will invoke our MATLAB code to classify a heart sound and will display the result. We aim to have this real-time integration ready for our pre-final demo and will work on it a lot tomorrow to make sure it is ready. The main things we have left to do are testing once the integration is done and squeezing a little more accuracy out of our ML model.
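Since the Python codebase is still early, here is a rough skeleton of the GUI I have in mind, assuming Tkinter for the window, sounddevice for the audio stream, and the MATLAB Engine API for Python for the classification call; the `classify_heart_sound` entry point and the sample rate are placeholders rather than our actual code.

```python
# Skeleton of the planned GUI: live waveform display plus an "Analyze" button
# that hands the current audio buffer to MATLAB. Entry point name is hypothetical.
import tkinter as tk
import numpy as np
import sounddevice as sd
import matlab.engine
from matplotlib.figure import Figure
from matplotlib.backends.backend_tkagg import FigureCanvasTkAgg

FS = 4000                     # sample rate (Hz), illustrative
WINDOW = 2 * FS               # show the last 2 seconds of audio
buffer = np.zeros(WINDOW)

def audio_callback(indata, frames, time, status):
    """Shift the rolling buffer and append the newest samples."""
    global buffer
    buffer = np.roll(buffer, -frames)
    buffer[-frames:] = indata[:, 0]

eng = matlab.engine.start_matlab()

root = tk.Tk()
root.title("Smart Stethoscope")

fig = Figure(figsize=(5, 2))
ax = fig.add_subplot(111)
(line,) = ax.plot(np.arange(WINDOW) / FS, buffer)
ax.set_ylim(-1, 1)
canvas = FigureCanvasTkAgg(fig, master=root)
canvas.get_tk_widget().pack()

def refresh_plot():
    line.set_ydata(buffer)
    canvas.draw_idle()
    root.after(100, refresh_plot)   # redraw about 10 times per second

def analyze():
    # Hand the current buffer to MATLAB; the entry point name is a placeholder.
    result = eng.classify_heart_sound(matlab.double(buffer.tolist()), float(FS))
    result_label.config(text=f"Result: {result}")

tk.Button(root, text="Analyze", command=analyze).pack()
result_label = tk.Label(root, text="Result: ...")
result_label.pack()

stream = sd.InputStream(channels=1, samplerate=FS, callback=audio_callback)
with stream:
    refresh_plot()
    root.mainloop()
```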

Ari’s Week 8 Status Report

This week was very productive for me. I accomplished a lot regarding the hardware and the structure of the physical stethoscope, and I worked on the ML algorithm with Ryan and Eri. I also worked on figuring out a way to convert our MATLAB code into C code so that we could run it on a Raspberry Pi. I placed orders for a Raspberry Pi, on which the code will run, and a screen that users will interact with. Further, this week I began writing the code that will start the processing on the Raspberry Pi, and I designed the status reporting system. I also designed the double-blind experiment and scheduled it for next week to figure out whether humans can tell the difference between the sounds from my stethoscope and the other sounds.

Ari’s Week 7 Report

  • What did you personally accomplish this week on the project? Give files or photos that demonstrate your progress. Prove to the reader that you put sufficient effort into the project over the course of the week (12+ hours).
    • This week I worked more on the hardware components, trying to make our signal look like the ones in our testing data. I also worked out a testing plan that involves a double-blind comparison between our microphone recordings and our ML's training data. The logic is that if the algorithm cannot tell the difference and a person cannot tell the difference (above an 80% threshold), then the signals are effectively the same; see the sketch after this list for how such a listening test could be scored.
    • I also helped with the integration for our demo, getting all of the separate MATLAB components to connect with each other, and worked on modularizing our code.
    • Additionally, I looked into setting up the MATLAB code to run standalone with Python so that the stethoscope could work without being restricted to a computer running MATLAB.
    • I also created a team to-do list with our highest-priority action items based on what we learned during the demo and the feedback we were given.
  • Is your progress on schedule or behind? If you are behind, what actions will be taken to catch up to the project schedule?
    • Our progress seems to be on track; the next thing we need to do is get the testing working and sort out our process for doing so.
  • What deliverables do you hope to complete in the next week?
    • For next week, I aim to have a working testing setup and want to validate our process. I also would like to connect Eri's Shannon energy denoising algorithm with Ryan's ML, which will hopefully improve the accuracy.
    • We also want to find approximately 1,000 more sound files so that we can train our ML model further.
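Below is a minimal sketch of how one blinded listening session from the testing plan above could be scored; the clip playback and guess-collection steps are stubbed out as placeholders.

```python
# Sketch of the double-blind listening test: clips from our stethoscope and clips
# from the ML training set are shuffled, the listener guesses the source of each,
# and we check whether their accuracy clears the 80% threshold.
import random

def run_listening_test(our_clips, training_clips, play_clip, get_guess, threshold=0.80):
    """Return (accuracy, distinguishable) for one blinded listener."""
    # Label each clip with its true source, then shuffle so order carries no hint.
    trials = [(c, "ours") for c in our_clips] + [(c, "training") for c in training_clips]
    random.shuffle(trials)

    correct = 0
    for clip, true_source in trials:
        play_clip(clip)                 # placeholder: play the audio for the listener
        guess = get_guess()             # placeholder: listener answers "ours" or "training"
        correct += (guess == true_source)

    accuracy = correct / len(trials)
    # If the listener cannot beat the threshold, we treat the two signal sources
    # as indistinguishable for our purposes.
    return accuracy, accuracy >= threshold
```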

Ari’s Week 5 Status Report

This week, I began developing the overall structure of our integration. I planned out a Python program that will run a script to live-view the heartbeat waveform. I also tested running our MATLAB script via Python and will be loading the entire pipeline onto a Raspberry Pi to view the live heartbeat and analyze it.
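One simple way to test the MATLAB-from-Python step is to run MATLAB in batch mode as a subprocess, roughly as sketched below; the `analyze_heart` script name is a placeholder for our actual MATLAB entry point.

```python
# Sketch: invoke a MATLAB function non-interactively from Python using MATLAB's
# -batch flag. The function name and file name are placeholders.
import subprocess

def run_matlab_analysis(wav_path: str) -> str:
    """Run a MATLAB script on a recording and return whatever it prints."""
    cmd = ["matlab", "-batch", f"analyze_heart('{wav_path}')"]
    result = subprocess.run(cmd, capture_output=True, text=True, check=True)
    return result.stdout.strip()

if __name__ == "__main__":
    print(run_matlab_analysis("recording.wav"))
```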

Ari’s Week 4 Status Report

This week I began comparing our training data with the data that we are getting from our microphone. I computed the cosine similarity of the frequency magnitude spectra of the two sound files and found that they are very similar, with a cosine similarity of 0.87. I also began to plan our testing and validation setup to try to mimic a human heart. I found that we would not need a gel and that playing the sounds through a speaker would be sufficient.
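A rough sketch of that kind of comparison is below: take the FFT magnitude spectrum of each file, put the spectra on a common frequency grid, and compute the cosine similarity. The file names and frequency range are placeholders.

```python
# Sketch: cosine similarity between the FFT magnitude spectra of two recordings.
import numpy as np
from scipy.io import wavfile

def magnitude_spectrum(path, n_bins=2048):
    fs, x = wavfile.read(path)
    x = x.astype(float)
    if x.ndim > 1:                        # collapse stereo to mono if needed
        x = x.mean(axis=1)
    mag = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(len(x), d=1 / fs)
    # Resample onto a fixed grid so two files of different lengths are comparable.
    grid = np.linspace(0, 1000, n_bins)   # compare up to 1 kHz, illustrative
    return np.interp(grid, freqs, mag)

a = magnitude_spectrum("training_clip.wav")         # placeholder file names
b = magnitude_spectrum("our_microphone_clip.wav")
cos_sim = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
print(f"cosine similarity: {cos_sim:.2f}")
```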

Ari’s Week 3 Status Report

This week I accomplished many things. First, I was able to transfer the signal from our stethoscope to a computer. I was also able to visualize the signal with an oscilloscope and to create recordings of the sounds the microphone produced.
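A small sketch of what the recording step could look like in Python, assuming the sounddevice library (the sample rate and duration are illustrative):

```python
# Sketch: capture the microphone for a fixed duration and save it as a WAV file.
import sounddevice as sd
from scipy.io import wavfile

FS = 4000          # sample rate in Hz, illustrative
SECONDS = 10       # length of each recording, illustrative

def record_heart_sound(filename="stethoscope_recording.wav"):
    audio = sd.rec(int(SECONDS * FS), samplerate=FS, channels=1)
    sd.wait()                              # block until the recording finishes
    wavfile.write(filename, FS, audio)
    return filename

if __name__ == "__main__":
    print("Saved:", record_heart_sound())
```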

My goal for next week is to apply the algorithm that Eri and Ryan are creating and see how it performs on our testing data. I also want to think more about our testing process and see how we can improve it.

Ari – Week 2 Status Report

Status Report #2

Arihant Jain

Team A4 (Smart Stethoscope)

 

My task for the week was to work on the prototype of the physical stethoscope: wire the prototype together with the parts we received and test the sound on a speaker.

 

  • What did you personally accomplish this week on the project? Give files or photos that demonstrate your progress. Prove to the reader that you put sufficient effort into the project over the course of the week (12+ hours).
    • This week, I received the parts I had ordered last week, started connecting them, and tested the audio signal with an oscilloscope. I then set up the wiring to route the signal to a 3.5mm output so that we could listen to the microphone/stethoscope system on a speaker.
  • Is your progress on schedule or behind? If you are behind, what actions will be taken to catch up to the project schedule?
    • Progress on track!
  • What deliverables do you hope to complete in the next week?
    • We have a working prototype and now just need to see how closely we can make the sound quality match our testing data.