While last week involved preliminary testing and gathering requirements for the design proposal, this week brought drastic and unexpected changes.
- Software Decisions – Last week we decided that the best way to integrate the wavelet transform with the mobile app would be to implement the transform in Python and connect it to Android Studio via Jython. However, that approach would have required a different library on the Java side, one that would not interface with the phone’s step counter data. While researching and discussing this issue, Professor Sullivan suggested using C++ for the wavelet transform. Since I will be working on the wavelet transform, I took this decision particularly personally: I have little experience with C++, and even after initially playing with the language to familiarize myself with it, I was still uncomfortable. Despite my distaste for the language, it was important to note that C++ offers well-documented DWT & IDWT (discrete and inverse discrete wavelet transform) implementations: http://wavelet2d.sourceforge.net/#toc-Chapter-2. In fact, these implementations and the example use cases I found provide more customization and flexibility with the input/output signals than Python’s libraries and examples offered. As a result, I decided to bite the bullet and go with C++ for its easier integration with the mobile app, its flexibility with signal processing, and its clear examples/documentation of wavelet transform use cases. (A rough sketch of the kind of DWT/IDWT round trip we’ll need is included at the end of this update.)
- Device Decisions – NO watch, ONLY phone. Based on the step counter data we measured and acquired, the watch’s step counter was the least accurate despite being a recent-generation device.
- Scoping Step Detection – As of last week, we decided our target audience would be runners. However, after performing our own ‘runner’s’ test, we realized that the discrepancy in our target bpm range arose because our searches targeted runners while we were actually referencing joggers. Additionally, our measured pace/desired range of 150-180 bpm matched up well with many of the songs I actually use for a fast ‘jog.’ Thus, we adjusted our target bpm/pace accordingly to match this pace.
- Music Choice – During our proposal presentation, we received feedback to narrow the scope of inputs – AKA the set of songs that could be used to run to under warped conditions. With our new target pace, we will allow only songs of 150-180 bpm. Additionally, when choosing a song from that defined set, we will apply a scoring algorithm. The score depends on how many times the song has been played and how close the song’s natural bpm is to the jogger’s current pace, and the algorithm chooses the song with the best score. This ensures one song is not constantly on repeat and that a song with a suitable bpm is played. Both factors will be weighted, and the weights adjusted, based on how the algorithm performs. (A rough sketch of this scoring idea is also included at the end of this update.)
- Wavelet Transform vs Phase Vocoder – We researched the metrics and tradeoffs between the two approaches, and they validated our expectations. Additionally, a plan to implement the wavelet transform has been made: code and test on a sine wave without changing inputs, do the same on simple music, account for tempo variations, account for pitch, and measure artifacts throughout the process. (A sketch of the sine-wave test harness is included at the end of this update.) I have also researched additional resources on campus in case I need guidance applying the wavelets to our specific use cases (e.g., Prof Aswin, Stern, Paul Heckbert, Jelena Kovačević).
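To make the C++ decision more concrete, here is a minimal, self-contained sketch of the kind of decompose/reconstruct round trip the wavelet work will need. It is a hand-rolled, single-level Haar DWT/IDWT written purely for my own intuition – it is not the wavelet2d library’s API, and the real implementation will use the library’s documented methods instead.

```cpp
// Minimal single-level Haar DWT/IDWT round trip (illustrative only, not wavelet2d).
#include <cmath>
#include <cstddef>
#include <cstdio>
#include <vector>

// Single-level Haar DWT: split the signal into approximation (low-pass)
// and detail (high-pass) coefficients.
void haarDwt(const std::vector<double>& signal,
             std::vector<double>& approx,
             std::vector<double>& detail) {
    const double s = std::sqrt(2.0);
    approx.clear();
    detail.clear();
    for (std::size_t i = 0; i + 1 < signal.size(); i += 2) {
        approx.push_back((signal[i] + signal[i + 1]) / s);  // scaled pairwise sums
        detail.push_back((signal[i] - signal[i + 1]) / s);  // scaled pairwise differences
    }
}

// Single-level inverse Haar DWT: rebuild the signal from the coefficients.
std::vector<double> haarIdwt(const std::vector<double>& approx,
                             const std::vector<double>& detail) {
    const double s = std::sqrt(2.0);
    std::vector<double> signal;
    for (std::size_t i = 0; i < approx.size(); ++i) {
        signal.push_back((approx[i] + detail[i]) / s);
        signal.push_back((approx[i] - detail[i]) / s);
    }
    return signal;
}

int main() {
    std::vector<double> signal = {4.0, 6.0, 10.0, 12.0, 14.0, 14.0, 8.0, 2.0};
    std::vector<double> approx, detail;
    haarDwt(signal, approx, detail);
    std::vector<double> rebuilt = haarIdwt(approx, detail);

    // The rebuilt signal should match the original (no processing in between yet).
    for (std::size_t i = 0; i < signal.size(); ++i) {
        std::printf("original %5.2f  rebuilt %5.2f\n", signal[i], rebuilt[i]);
    }
    return 0;
}
```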
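Here is a rough sketch of the song-scoring idea. The Song struct, the weights, and the exact formula are placeholders I made up for illustration; we expect to tune all of them once we can test against real runs and play history.

```cpp
// Rough sketch: pick the best song by penalizing bpm mismatch and repeat plays.
#include <algorithm>
#include <cmath>
#include <cstdio>
#include <string>
#include <vector>

struct Song {
    std::string title;
    double bpm;      // the song's natural tempo
    int playCount;   // how many times it has been played recently
};

// Higher score = better candidate. bpmWeight and repeatWeight are hypothetical
// tuning knobs we plan to adjust based on how the algorithm behaves in practice.
double scoreSong(const Song& song, double targetBpm,
                 double bpmWeight = 1.0, double repeatWeight = 5.0) {
    double bpmPenalty = std::fabs(song.bpm - targetBpm);        // 0 when tempos match
    double repeatPenalty = static_cast<double>(song.playCount); // discourages repeats
    return -(bpmWeight * bpmPenalty + repeatWeight * repeatPenalty);
}

int main() {
    // Only songs already inside the 150-180 bpm window would make it into this list.
    std::vector<Song> playlist = {
        {"Song A", 155.0, 3},
        {"Song B", 168.0, 0},
        {"Song C", 176.0, 1},
    };
    double joggerPace = 165.0;  // current steps per minute

    auto best = std::max_element(playlist.begin(), playlist.end(),
        [&](const Song& a, const Song& b) {
            return scoreSong(a, joggerPace) < scoreSong(b, joggerPace);
        });
    std::printf("Next song: %s (%.0f bpm)\n", best->title.c_str(), best->bpm);
    return 0;
}
```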
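Finally, a sketch of the first step of the wavelet test plan: generate a known sine wave, push it through a processing stage that is a pass-through placeholder for now (this is where the DWT → modify → IDWT pipeline will eventually live), and measure artifacts as RMS error against the original.

```cpp
// Test-harness sketch: sine wave in, placeholder processing, RMS error as an artifact metric.
#include <cmath>
#include <cstddef>
#include <cstdio>
#include <vector>

const double kPi = 3.14159265358979323846;

// Generate a pure sine wave as a known, easy-to-reason-about test input.
std::vector<double> makeSineWave(double freqHz, double sampleRate, std::size_t numSamples) {
    std::vector<double> wave(numSamples);
    for (std::size_t n = 0; n < numSamples; ++n) {
        wave[n] = std::sin(2.0 * kPi * freqHz * static_cast<double>(n) / sampleRate);
    }
    return wave;
}

// Placeholder for the DWT -> modify -> IDWT pipeline; currently a pass-through.
std::vector<double> processSignal(const std::vector<double>& input) {
    return input;
}

// RMS error between the original and processed signals: a simple artifact metric
// to track at every stage of the test plan.
double rmsError(const std::vector<double>& a, const std::vector<double>& b) {
    double sum = 0.0;
    for (std::size_t i = 0; i < a.size(); ++i) {
        double diff = a[i] - b[i];
        sum += diff * diff;
    }
    return std::sqrt(sum / static_cast<double>(a.size()));
}

int main() {
    std::vector<double> original = makeSineWave(440.0, 44100.0, 1024);
    std::vector<double> processed = processSignal(original);
    std::printf("RMS error: %f\n", rmsError(original, processed));
    return 0;
}
```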