Team Status Report for 10/21/23

One major change we thought of over fall break concerns the circuit data structure that the computer vision output sends to the frontend of the application. Previously we had settled on a netlist: the computer vision sends a netlist to the frontend, and the input to the circuit simulator is also a netlist. While constructing the netlist from the computer vision algorithm, we realized there is a discrepancy between the circuit the user drew and the orientation of the circuit the netlist represents. A properly constructed netlist (which is easy to produce) guarantees that the right components are connected at the appropriate nodes and that the relative positioning of the components to one another is correct. What a correct netlist does not preserve is the orientation of the circuit as the user drew it.

For example, say a user draws a circuit where the components, starting from the left side and going clockwise, are voltage source -> resistor -> wire -> wire (back to the voltage source). The generated netlist will guarantee this ordering, but a circuit drawn from it could look like (again from the left side, clockwise) wire -> wire -> voltage source -> resistor (back to the wire). We could accidentally throw the user off by showing them what is technically the same circuit they drew but oriented differently, which may lead to correct circuit classifications not being accepted as correct by the user.

Our solution also simplifies some of the work: the computer vision algorithm naturally produces a list of edges, where each edge is denoted by the coordinates of a pair of connecting nodes and the component connecting them. By giving the frontend the coordinates of the nodes, it can reconstruct the circuit in the relative orientation the user expects.

We also made progress on the circuit simulator by installing the required libraries in the development environment and creating the required matrices.
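To make the difference concrete, here is a minimal sketch of what the edge-list structure could look like and how it collapses into a netlist. The tuple layout, function name, and coordinates are illustrative assumptions, not our actual interface; the point is that the coordinate-to-node-id collapse is exactly where the drawn orientation gets lost.

```python
# Hypothetical edge list for the example circuit above: a unit square drawn
# clockwise from the left side. Each edge is ((x1, y1), (x2, y2), component).
edges = [
    ((0, 0), (0, 1), "voltage_source"),  # left side
    ((0, 1), (1, 1), "resistor"),        # top
    ((1, 1), (1, 0), "wire"),            # right side
    ((1, 0), (0, 0), "wire"),            # bottom, closing the loop
]

def to_netlist(edges):
    """Collapse coordinates to abstract node ids. Connectivity survives,
    but the drawn positions (and hence the orientation) are discarded."""
    node_ids = {}
    netlist = []
    for start, end, component in edges:
        for point in (start, end):
            node_ids.setdefault(point, len(node_ids))
        netlist.append((component, node_ids[start], node_ids[end]))
    return netlist

print(to_netlist(edges))
# Each entry keeps which nodes a component joins, but nothing about layout,
# which is why the frontend needs the coordinates themselves.
```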
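On the simulator side, the matrices in question can be sketched under the assumption that the solver uses standard modified nodal analysis (MNA); the component values and variable names below are illustrative, not taken from our codebase. For a single voltage source across a resistor, the unknowns are the one node voltage and the source branch current:

```python
import numpy as np

# Illustrative values: a 5 V source across a 1 kOhm resistor.
V, R = 5.0, 1000.0

# MNA system A @ [v1, i_V] = z:
#   row 1: KCL at node 1 -> v1/R + i_V = 0
#   row 2: source constraint -> v1 = V
A = np.array([[1.0 / R, 1.0],
              [1.0,     0.0]])
z = np.array([0.0, V])

v1, i_v = np.linalg.solve(A, z)
print(v1, i_v)  # v1 = 5.0; i_v = -0.005 (current flows through the source)
```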

There have been no changes to the schedule. We are on track with our work and plan to meet all of our deadlines. Next steps include testing the iOS application to make sure it integrates correctly with the computer vision algorithm. This means feeding coordinates into the application and verifying that the circuits displayed are correct. 
