This week we had a lot of design discussions. Ever since our project proposal, we’ve been streamlining our idea for InFrame so that we can focus on what exactly we are trying to achieve: a camera with tracking capabilities.
We’ve narrowed down our goal to the following use cases:
- Tracking a professor during a lecture.
- Tracking either a person or a ball in a sporting scene (e.g. a skateboarder, a soccer ball, etc.)
We decided to go with these cases because InFrame provides a tremendous level of value for them. In both lectures and live sports, eliminating the need for a cameraman to follow a desired subject would save a great deal of money. In addition, when practicing a new skill in a sport, watching your technique is an incredible way to improve but is sometimes hard if you don’t have someone to record you — by having InFrame follow you as you try a new trick or perfect your swing, we make practice even more effective.
Our design discussions revolved around these specific use cases. We don’t think that streaming live video is necessary; it would be a nice feature to have, but it is outside of our scope. In addition, because we don’t need to see the camera’s video feed on the phone, Bluetooth is fast enough for the communication between the Jetson and the phone. However, this means that whenever the user wants to select the subject to track, they will have to request an image from the Jetson. We figured that this was a reasonable user experience that would greatly simplify the communication with the phone and the pairing process, without significantly hurting latency or responsiveness.
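As a sketch of what that request/response exchange could look like, here is a minimal length-prefixed message framing. The opcodes and layout below are purely illustrative assumptions on our part, not a settled protocol; length-prefixing matters because a Bluetooth serial link is a raw byte stream with no message boundaries of its own.

```python
import struct

# Hypothetical message types for the phone <-> Jetson link
# (illustrative opcodes, not InFrame's actual protocol).
REQUEST_FRAME = 0x01   # phone asks the Jetson for a still image
FRAME_DATA = 0x02      # Jetson replies with an image payload

def encode_message(msg_type: int, payload: bytes = b"") -> bytes:
    """Frame a message as 1-byte type + 4-byte big-endian length + payload."""
    return struct.pack(">BI", msg_type, len(payload)) + payload

def decode_message(data: bytes) -> tuple[int, bytes]:
    """Parse one framed message; returns (type, payload)."""
    msg_type, length = struct.unpack(">BI", data[:5])
    return msg_type, data[5:5 + length]

# Round trip: the phone's frame request, and a (fake) JPEG reply.
request = encode_message(REQUEST_FRAME)
reply = encode_message(FRAME_DATA, b"\xff\xd8...jpeg bytes...\xff\xd9")
```

With framing like this, the phone only needs to send a few bytes to ask for a fresh still, and the Jetson can answer with a single self-describing message — simple enough to sit comfortably within Bluetooth’s bandwidth.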