This week, I met with the team to discuss our final ideas and direction for the project. We decided on something that differs slightly from our original idea, but we realized we were much more enthusiastic about it than the original. We talked a lot about our vision for the project and the scope of the features we wanted to incorporate. We worked together on a project introduction/summary, and we also began work on our project proposal presentation. For the upcoming week, I want to start thinking more about the exact parts we will need and begin fleshing out a timeline for the project.
Team Status Report for 2/01
The most significant risks that could jeopardize the success of this project largely involve the precision and accuracy of our sensors. In particular, we anticipate having to map out the distance to an object when navigating the user, especially when the object is outside the camera's field of vision. This requires us to estimate the location of the object relative to the user's position, which can lead to significant complications. We are concerned about what happens if the user arrives at the wrong destination, or whether the sensors can still locate the user under erratic/unpredictable movement.
To address this risk, we plan to have an update mechanism in which the location of the object is dynamically updated as the user approaches the destination. If the device loses sight of the object, we plan on navigating toward the last seen location of the target destination. More research on hardware components is also necessary: we would like accelerometers/potentiometers that emphasize precision, since the speed and distance traveled by our user will be fairly small.
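As a rough illustration of the fallback behavior described above (not our actual implementation; `TargetTracker` and its fields are hypothetical names), the update mechanism could be sketched as:

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class TargetTracker:
    """Keeps an estimate of the target's position, falling back to the
    last seen location whenever the object leaves the camera's view."""
    last_seen: Optional[Tuple[float, float]] = None

    def update(self, detection: Optional[Tuple[float, float]]) -> Optional[Tuple[float, float]]:
        # If the camera currently sees the object, refresh the estimate;
        # otherwise keep navigating toward the last known position.
        if detection is not None:
            self.last_seen = detection
        return self.last_seen
```

For example, after a detection at `(1.0, 2.0)`, a subsequent `update(None)` (object out of view) would still return `(1.0, 2.0)` so navigation can continue toward the last seen point.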
There were no major changes to the existing design of the system, but we plan to add the new navigation functionality. We anticipate this being fairly costly and more complex than the original concept. In particular, it requires multiple cameras capable of depth perception, as well as an accelerometer/linear potentiometer to track the movement/displacement of the user. It would also demand more performance from our processing unit and increase power consumption. Adding more hardware seems inevitable, but one thing we are trying to keep in mind is that this device is a wearable; selecting compact, low-profile components will be important for maintaining wearability. Power consumption can also be reduced by keeping unused components idle and by using single devices for multiple functionalities. For example, the same camera can be used for both obstacle detection and navigation.
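As a very loose sketch of the accelerometer-based displacement idea (all names hypothetical; real IMU data would need filtering and drift correction, which this omits), displacement can be estimated by twice integrating acceleration samples:

```python
def integrate_displacement(accels, dt):
    """Estimate 1-D displacement by double-integrating accelerometer
    samples with simple Euler steps (accels in m/s^2, dt in seconds)."""
    velocity = 0.0
    displacement = 0.0
    for a in accels:
        velocity += a * dt        # first integration: acceleration -> velocity
        displacement += velocity * dt  # second integration: velocity -> position
    return displacement
```

In practice this drifts quickly, which is one reason precision-oriented components matter for the short distances our user will travel.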
Talay’s Status Report for 2/01
This week I contributed to team efforts to iron out our project idea and come up with specific implementation details. We did some quick research to gauge how feasible our idea was and whether we could accomplish the MVP by the end of the semester. Our progress is on schedule, as we have nailed down a lot of the specifics of our project scope and implementation. During the next week, we hope to come up with a list of materials needed and start budgeting our costs. We would also like to look into what existing technologies are available for some of the features in our project.
Kevin’s Status Report for 2/01
This week we discussed and finalized our initial vision for our product. In particular, we decided on the specific features we want to implement: object detection as well as close-range navigation. With the rest of the team, I weighed the risks/challenges of various implementations, and we decided upon a plan for our product. I began drafting a materials list of all the hardware components we anticipate using. Our progress is on schedule, as we are happy with the concept behind this project and believe its complexity is challenging but feasible. In the next week, I would like to research and decide upon specific materials based on functionality and cost.