Robotic-assisted photography is among the most exciting technological developments in the production industry since the invention of the camera itself. It provides new and unique ways to capture dynamic scenes and unlocks new avenues of creative potential for cinematography. Current photography robots on the market range from tremendously expensive industrial robotic arms, such as Motorized Precision’s Kira camera robot, to portable devices that enable smooth motion across scenes, like the Rhino Arc II. However, most photography robots have not yet leveraged the breakthroughs made in computer vision over the last decade. For these reasons, we created InFrame: an intelligent, motorized photography assistant that uses state-of-the-art object detection models and tracking algorithms to follow user-selected targets across a 3D space, constantly keeping them InFrame.
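To give a rough sense of the detection-and-tracking idea behind InFrame, the sketch below follows a user-selected target and converts its offset from the frame center into pan/tilt commands. This is an illustrative Python/OpenCV sketch, not InFrame's actual implementation: the CSRT tracker choice (which assumes opencv-contrib-python is installed), the DEG_PER_PIXEL gain, and the send_pan_tilt() motor stub are all assumptions made for the example.

```python
# Illustrative sketch only: keep a user-selected target centered by turning
# its tracked bounding-box offset into pan/tilt commands.
import cv2

def send_pan_tilt(pan_deg, tilt_deg):
    # Hypothetical placeholder for the motor interface.
    print(f"pan {pan_deg:+.2f} deg, tilt {tilt_deg:+.2f} deg")

DEG_PER_PIXEL = 0.05  # assumed gain mapping pixel error to motor degrees

cap = cv2.VideoCapture(0)
ok, frame = cap.read()
assert ok, "camera not available"

# Let the user draw a box around the subject to follow.
roi = cv2.selectROI("Select target", frame, showCrosshair=True)
cv2.destroyWindow("Select target")

tracker = cv2.TrackerCSRT_create()  # requires opencv-contrib-python
tracker.init(frame, roi)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    found, (x, y, w, h) = tracker.update(frame)
    if found:
        # Error between the target center and the frame center.
        frame_h, frame_w = frame.shape[:2]
        err_x = (x + w / 2) - frame_w / 2
        err_y = (y + h / 2) - frame_h / 2
        send_pan_tilt(err_x * DEG_PER_PIXEL, err_y * DEG_PER_PIXEL)
        cv2.rectangle(frame, (int(x), int(y)), (int(x + w), int(y + h)),
                      (0, 255, 0), 2)
    cv2.imshow("InFrame sketch", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```

In a real system the fixed pixel-to-degree gain would be replaced by a proper feedback controller and the tracker would be re-initialized from an object detector, but the sketch captures the basic follow-the-target loop.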
InFrame was built by Diego Martinez, Ike Kilinc, and Ismael Mercier as part of our S20 Senior Capstone Project for Electrical and Computer Engineering at Carnegie Mellon University. The code and CAD files for InFrame can be found in our GitHub repository.
Weekly Updates
Follow our progress on the development of InFrame!
Weeelp, we’re finally nearing the finish line not only of this project but of our undergraduate careers. COVID-19 certainly didn’t make it easy for us, but we mustered through. It would’ve been great to have Read more…
Remote Interface! This week, in preparation for what became a rather successful Demo 2, I continued to work through the CSM-Perception software integration process and, perhaps more significantly, programmed the remote interface! After preliminary work on Read more…
Demo 2 Feedback and Wrapping Up! This week, we had a pretty successful second demo. Diego and Ike focused heavily on the software integration and showed the perception pipeline working in tandem with the rest Read more…
Demo Feedback and Speeding up the Perception Pipeline. After a successful Demo 2, in which Ike and I showed off the merged perception and CSM functionalities, I focused on speeding up the perception pipeline in Read more…
I feel pretty accomplished about this week. I spent a good amount of time testing the motors and have determined that they are fast enough to meet our requirements. Tomorrow I will dedicate time to Read more…
CSM-Perception Integration & iOS Beginnings This week, after demonstrating the independent functionality of the CSM framework to the entire team and mentors, I turned to the integration of the Perception code into the CSM together Read more…
Made with ❤️ by
Team A5: Diego Martinez, Ike Kilinc and Ismael Mercier
CMU x ECE
Hamerschlag Hall, Carnegie Mellon University, Hamerschlag Dr, Pittsburgh, PA 15213