Sophia’s Status Report for 03/29/2025

This week, we tested the full system: scanning, verifying the stepper motor rotates, saving the scans, and feeding the scans into the math in a Python file to make the normal map. After a bit of directory/save-name tweaking, the system works, even starting and running fully from the Blender add-on. The only thing we didn’t test is the manipulator rotating the coin/object itself; Theo rotated it manually for the testing because we were waiting on a 3D-printed part to mount the suction cup on the manipulator device. We were able to get a decently clear normal map of the coin, just a regular U.S. quarter. My personal contribution was making adjustments in the software to integrate Yon’s normal-map-making Python file.
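For context, the normal-map step can be sketched as a least-squares photometric-stereo solve. This is a generic formulation with illustrative names, not necessarily the exact math in Yon’s file; it assumes each scan of the rotated object acts like an image under a different known effective light direction:

```python
import numpy as np

def normals_from_scans(images, light_dirs):
    """Estimate per-pixel unit normals from intensity images.

    images: list of HxW float arrays (one per scan/rotation).
    light_dirs: Nx3 array of unit light-direction vectors, one per image.
    Solves L @ G = I in the least-squares sense, where G = albedo * normal.
    """
    L = np.asarray(light_dirs, dtype=float)        # (N, 3)
    I = np.stack([im.ravel() for im in images])    # (N, P) pixels as columns
    G, *_ = np.linalg.lstsq(L, I, rcond=None)      # (3, P)
    norm = np.linalg.norm(G, axis=0, keepdims=True)
    N = np.divide(G, norm, out=np.zeros_like(G), where=norm > 0)
    h, w = images[0].shape
    return N.T.reshape(h, w, 3)                    # (H, W, 3) unit normals
```

For a flat patch lit from four symmetric directions, the solve recovers the straight-up normal (0, 0, 1) at every pixel.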

We’ve decided to put Mac-compatible software on hold, as there are complications between Mac and the framework we were using to command the scanner. We’ll look into Mac-specific scanning libraries after getting the whole system, including .obj creation, working on Windows and Linux. Better to have at least one OS working completely than have it work only partially on all three.

Next step is expanding the Blender UI so the user can select the file path needed for the scanning process, the COM port for the microcontroller (though maybe we could detect that automatically), and possibly the scanning DPI. We’d also like to display the scanning process’s command-line progress messages inside Blender, if possible. We also need to make sure the suction manipulation works in a full system run and integrate .obj 3D-model creation. I’d say we’re on track and have a pretty good setup for the demos this upcoming week.

Team Status Report for 03/22/2025

Currently, work on the manipulator hardware is on schedule. The newest 3D-printed mount slides up and down the manipulator smoothly, though it remains to be seen how well it will function when accommodating objects of different thicknesses. All that’s left for the manipulator is the stepper-suction adapter, which Theo plans to finish by this week. We won’t need to cut the stepper motor’s axle if we make the adapter longer, so we should be able to avoid accidentally damaging the internals of the stepper motor.

One new risk that came up this week is that the scanner API is having trouble running on our Mac. We found sources online that claim they were able to make it work, but we’ve run into several issues so far. We did make some progress on this front and hope to get it working, but we may have to sacrifice Mac compatibility if the issue persists, which would be unfortunate. This does not impact our schedule or our block diagram, but it could affect our benchmarks by excluding compatibility with Mac systems.

Theo’s Status Report for 03/22/2025

This past week was mainly spent working on the 3D printed parts for the manipulator. The newest iteration of our mount finally got the hole size right, so it slides up and down relatively smoothly. It will be interesting to see how well it works once the suction cup is attached to the stepper motor.

The adapter for the suction cup and stepper motor is coming along well. It seems like there’s enough clearance in the area that we don’t need to cut the stepper motor axle. The next iteration needs to be longer and have a slightly larger hole for the suction cup, but it should be doable. I’ll have the STL to Yon by Monday for more testing.

The last 3D-printed piece I’ll be working on is the electronics housing. It will be a long box along one of the T-channels on the frame, housing the microcontroller and DC air pump. There will be a micro-USB port and a 12V DC adapter port. I’m leaving this for last, once we confirm full functionality of the manipulator itself.

Everything is still on schedule; hopefully the adapter doesn’t take too many iterations to finish. It prints pretty quickly (about 40 minutes), so I may spend an afternoon in our 3D printing area to prototype and get it right.

Sophia’s Status Report for 03/22/2025

This week, I started on the Blender add-on. I’ve successfully made an add-on named “Flatbed 3D Scan” appear in Blender, and it runs the “execute” function inside its Python file when clicked. Next week I’ll make it call the Main.py file from last week, which runs the scan. I need to look further into whether there’s a way to pass arguments to the add-on, or a way to pop up a UI window for entering arguments.
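For reference, the operator-style skeleton behind this looks roughly like the sketch below. The class and identifier names are illustrative, not our exact code, and it only runs inside Blender:

```python
# Minimal Blender add-on sketch: registers an operator whose execute()
# runs when the button is clicked. Names here are illustrative.
import bpy

bl_info = {"name": "Flatbed 3D Scan", "blender": (3, 0, 0), "category": "Object"}

class FLATBED_OT_scan(bpy.types.Operator):
    bl_idname = "object.flatbed_scan"   # how Blender refers to the operator
    bl_label = "Flatbed 3D Scan"        # text shown in the UI

    def execute(self, context):
        # This is where Main.py would eventually be launched (e.g. subprocess).
        self.report({'INFO'}, "Scan started")
        return {'FINISHED'}

def register():
    bpy.utils.register_class(FLATBED_OT_scan)

def unregister():
    bpy.utils.unregister_class(FLATBED_OT_scan)
```

Keeping the add-on to a thin operator like this makes it easy to hand all real work off to an external script later.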

For Main.py, I refactored it a bit, along with the scanner-controller dotnet project and serial_proto.py, to put them in the right directories and call serial_proto directly from Main.py. So the sequence looks okay. I need to implement try/except blocks to make sure the program doesn’t hit unhandled errors. The dotnet project still isn’t working on Mac, so we’re going to look into that more next week as well.
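The error handling I have in mind looks something like the sketch below: a guarded wrapper around the external scanner call. The function name and command are illustrative, not the project’s exact code:

```python
import subprocess
import sys

def run_scanner(cmd):
    """Run an external scanner command (e.g. the dotnet project).

    Returns True on success, False on any failure, so the caller in
    Main.py never sees an unhandled exception from this step.
    """
    try:
        subprocess.run(cmd, capture_output=True, text=True,
                       timeout=120, check=True)
        return True
    except (subprocess.CalledProcessError,   # nonzero exit code
            subprocess.TimeoutExpired,       # scan hung too long
            FileNotFoundError) as e:         # binary missing / not installed
        print(f"Scanner step failed: {e}", file=sys.stderr)
        return False
```

Catching `FileNotFoundError` specifically covers the case where the dotnet toolchain isn’t installed at all, which is exactly the Mac failure mode we’re seeing.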

So overall, next week will be spent on quality-of-life updates to the Blender add-on UI and on bug fixing while testing the system.

Team Status Report for 03/15/2025

Theo received the suction cup we were waiting on this week and began working on it. It can hold and let go of objects as heavy as a phone, and he was able to make a right-angled connection between the suction cup and the tubing. This is important for the last design component of the manipulator: a 3D printed part that connects the suction cup to a bearing on the stepper motor, centering it with the shaft. He’s already started working on this and will have it ready to print before Wednesday.

Sophia, with Theo’s help, solved what was wrong with the NAPS2 code so the scanner controller software works. She’s started working on the Main.py file that will connect together the scanner controller, the microcontroller manipulator device, the image processing, and the Blender add-on UI.

A risk that is coming to light in our design choices thus far is the 3D printed mount’s smoothness while sliding up and down the pillars on our manipulator. This issue is mainly due to the tolerances on the mount’s holes. We’ve needed to reprint it multiple times while making the holes larger, and this issue affects the holes the stepper motor will go into as well. If the holes are made large enough but the sliding still isn’t smooth, we are considering linear bearings on the pillars/mount to ensure smooth motion. We’ll weigh the costs and benefits of this change if it proves necessary.

The schedule has not changed significantly. Even though Yon’s subsystem is a little behind because he was sick last week, we built some padding into the implementation of the entire image-processing system, which leaves the schedule unchanged.

Theo’s Status Report for 3/15/2025

This past week I focused on implementing the suction cup for our manipulator and helping Sophia debug the NAPS2 code we’re using to talk to our scanner. We’re able to toggle the suction cup on and off and have succeeded in lifting objects as heavy as my phone. With a t-connector, scissors, and hot glue, I was able to create a right-angle connection between the suction cup and the air tubing running to the pump. From here, we just need a 3D-printed piece that connects to a bearing on the stepper motor and aligns the suction cup with the shaft. I’m already working on its design, and we should have it 3D-printed by Wednesday. The stepper motor shaft will also need to be cut to accommodate this; I’ve already talked with the TechSpark machine shop, and we are able to make this cut anytime this week.

Sophia’s Status Report for 03/15/2025

With Theo’s help, I got the scanner controller software working, still using the NAPS2 library. It required changing the scanning contexts to different ones than I expected; specifically, Windows had two different scanning contexts, and the one labeled for Windows Forms, not Windows, was the one needed in this instance.

I started writing the Main file that will call the scanner controller software, command the microcontroller manipulator device, and eventually interact with the image-processing software and the Blender add-on UI. I chose Python for this since there would be a lot of moving between file directories and some command-line calls for the dotnet project that encompasses the scanner controller software. I also updated the scanner controller software to account for the Main file creating the scanning directory, so now it just has to make sure it doesn’t repeat file names for new scans.
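The non-repeating file-name logic described above can be sketched as follows; the directory layout and the `scan_N.png` naming pattern are assumptions for illustration, not the exact code:

```python
import os

def next_scan_path(scan_dir, prefix="scan", ext=".png"):
    """Return a path in scan_dir that doesn't collide with existing scans."""
    os.makedirs(scan_dir, exist_ok=True)   # Main.py may already have made this
    n = 0
    while os.path.exists(os.path.join(scan_dir, f"{prefix}_{n}{ext}")):
        n += 1                             # bump the counter past taken names
    return os.path.join(scan_dir, f"{prefix}_{n}{ext}")
```

With this, repeated scans in one session land in `scan_0.png`, `scan_1.png`, and so on, without overwriting earlier passes.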

The next steps are implementing the manipulator-device calls in Main.py and ensuring the scan-and-rotate process works, building the Blender add-on, and integrating Yon’s image-processing software. Initially I thought to make the Blender add-on do all of the calls, but now I think it’d be better for the add-on to be almost exclusively UI, with a start button that calls Main.py. So, technically I’m a touch behind schedule, but as long as I finish the Blender add-on next week and Main.py is working, we should be good, since we left two weeks of buffer time.

Team Status Report for 3/08/2025

Since we’ve all been on schedule so far, we took Spring Break off. During the last week of February, we made progress on our prototype and have moved closer to system integration/testing.

Theo built the prototype’s structure, and we’re now just waiting on the suction cup and its mounting components to finish it. See his section for specifics and a photo of the prototype on our scanner. He also wrote some basic serial code for sending rotation and pumping commands, as well as Python code for interfacing over serial. He’ll work with Sophia to integrate a better version of this into the final control software.

Besides the serial code, Sophia ran trial tests of the flatbed scanner with the controller software and found issues with no clear solution. After many attempts to diagnose what was going wrong with the scanner-to-computer communication, and after Theo tried to set up the project but found it incompatible with newer versions of Linux, she’ll be pivoting to a more modular approach that uses distinct, more compatible libraries for each OS, keeping each OS’s scanning process in a separate file. This means a more incremental approach to ensuring each OS works well without jeopardizing the state of the others, and it won’t rely on an outdated dotnet framework (NAPS2 needed version 4.8, was supported only up to version 8, and the current version is 9).

Yon finished working out the math for normal map construction and detailed his findings in the design report. He also identified 3D scanners we can use for qualification, which gives both Yon and Theo some work next week designing and manufacturing a test object. Now that a basic Python version of the normal map code is implemented and can be used for testing Theo’s and Sophia’s subsystems, Yon will turn to implementing the normal map math from the design report. He also still has to identify a normal-map-to-depth-map pipeline, which could be a custom implementation, a C library, or an external Blender plugin/tool.

Part A was written by Theo, Part B by Yon, and Part C by Sophia.

Part A: On the global scale, our project has possible uses in the fields of archeology, history, and design. There are no limiting factors in who across the world could access/utilize flatbed 3D scanning besides a possible language barrier in the software (which is easily translatable).

People not in the academic environment would be more likely to use this as hobbyists who want to get detailed scans of their art, sculptures, or other detailed objects. There is a small subculture of photographers who use scanners for their high DPI, and such a group would likely be eager to map their hobby into a third dimension that they can interact with via Blender or other .obj-friendly software.

There is an emphasis throughout our project on making the scanning process user-friendly and hands-off. While this is mainly meant to accommodate repetitive data acquisition, less tech-savvy people would only have to deal with a few more steps than when using a flatbed scanner normally (place manipulator, plug in cables, run software).

Part B: Our project implements a cheap and accessible way to 3D scan small objects. One significant area of application for this technology is in archeology and preservation, where cheap, quick, onsite digitization of cultural artifacts can help preserve cultures and assist dialogue and discourse around them.

That said, all technology is a double-edged sword. The ability to create quick replicas of historical artifacts makes them vulnerable to pop-culture-ification, which could manifest as cultural appropriation.

Part C: Our project has a low power draw, which is an environmental boon, especially considering its competitors are larger, more complicated machines that use more energy. Our project also leverages an existing technology, reusing devices rather than requiring a larger machine that consumes much more material and energy in manufacturing and usage.

The simplicity of our project also lends itself to environmentalism, since we don’t use any cloud storage, AI features, or other massive energy consumption processes. We don’t even use a separate battery, drawing power from the computer through USB. Open source projects like ours are also generally more sustainable than completely commercialized sources.

Biological specimens and discoveries can even be captured for future research and knowledge using our project. Since archaeology is a key audience, it’s not a stretch to extend that into general biology. Scanning small bones, skeletons, feathers, footprints, or other small biological features would be possible as long as they aren’t particularly frail. This contributes to the knowledge bank of biological organisms, furthering science’s understanding. The Smithsonian, for example, has a public-access bank of 3D scans, so our project would be perfectly suited to its use for small, detailed objects.

Sophia’s Status Report for 03/08/2025

The week before spring break, I started working with Theo on the serial code to get communication between the computer and the microcontroller on the manipulator device. Since the device connects over USB, it’s a matter of opening the serial port and feeding commands to the manipulator device at the right times.
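The command side of this can be sketched as a tiny text protocol; the `ROT`/`PUMP` verbs below are illustrative, not necessarily what serial_proto.py actually uses, and sending would go through pySerial’s `serial.Serial(port, baud).write(frame)`:

```python
def make_command(verb, value=None):
    """Encode one newline-terminated ASCII command for the microcontroller.

    Illustrative verbs: ROT <degrees> rotates the stepper, PUMP ON/OFF
    toggles the air pump. Returns bytes ready to write to the serial port.
    """
    if verb not in ("ROT", "PUMP"):
        raise ValueError(f"unknown command verb: {verb}")
    body = verb if value is None else f"{verb} {value}"
    return (body + "\n").encode("ascii")
```

A newline-terminated ASCII framing like this keeps the microcontroller firmware trivial: it just reads until `\n` and parses the line.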

As for the program to automatically capture scans using the flatbed scanner, I’ve encountered a stubborn problem. The file works fine until a practical trial with the flatbed scanner: when I try to use the scanner with it, I get a “lost connection with the scanner” error in the middle of scanning. I hear the scanner go off, but then it loses connection and doesn’t save the file. Online searches weren’t helpful, only suggesting to unplug and replug/restart the scanner, which I tried a few times unsuccessfully. I guessed it was something to do with my computer’s file-access permissions for the scanner, so I tried moving the project to more accessible file locations that definitely wouldn’t require admin, running the script as admin, checking whether the scanner was missing an access permission, and double-checking that the device drivers were up to date; nothing fixed the issue. It’s extra confusing because I was able to scan and save just fine from the scanner’s native software and from the NAPS2 library software. I asked Theo to download and run the project to see whether the issue was with Windows or with my machine in particular. However, he ran into many issues setting up the dotnet project, given the incompatibility between the dotnet version the project required and his version of Linux.

So, in light of this, I believe the best approach is to pivot away from the universal scan library NAPS2, which requires a dotnet project, to a series of files that don’t rely on an existing framework. A master file would receive the command from the Blender UI, check the OS, and then call a corresponding file to make the scans for that OS. This way, each OS gets a compatible scanning library, and we can incrementally ensure each OS works, finishing one before moving on to the next. It also ensures that something works in general, even if not compatible with every OS. Currently, I’m looking at WIA (Windows Image Acquisition) for Windows, SANE (Scanner Access Now Easy) for Linux, and ImageCaptureCore (an Apple API) for Mac. Since two of these are native to their OSes and Linux is generally good at setting up libraries, I think these will work out better.
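The master file’s dispatch can be sketched like this; the backend module names mirror the libraries mentioned above but are assumptions, not existing files:

```python
import platform
import sys

def pick_scan_backend(os_name=None):
    """Map the current OS to the per-OS scanning module to use.

    Module names (scan_wia, scan_sane, scan_imagecapture) are placeholders
    for the OS-specific files the master script would call.
    """
    os_name = os_name or platform.system()
    backends = {
        "Windows": "scan_wia",          # WIA (Windows Image Acquisition)
        "Linux": "scan_sane",           # SANE
        "Darwin": "scan_imagecapture",  # ImageCaptureCore on macOS
    }
    try:
        return backends[os_name]
    except KeyError:
        sys.exit(f"Unsupported OS: {os_name}")
```

Keeping the mapping in one place means adding or dropping an OS (e.g. shelving Mac support) is a one-line change.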

Theo’s Status Report for 3/08/2025

I’ve taken the past week off for Spring Break since I’m on schedule. The week before, we received the parts for the manipulator’s structure, so I actually built it (see image below). The parts all fit together besides the 3D printed mount, which needs slightly larger holes for the stepper motor to fit inside. We’ll 3D print this when we return to campus. So far, we’ve noticed slight instability and rough motion when trying to move the 3D printed mount up and down the screws with spacers. If the next iteration’s larger holes don’t enable the smooth and stable motion we want, then we’ll look into linear bearings or something similar.

We’ve also further theorized possible solutions for picking the object up during rotation. Currently, our best idea is solenoids on the mount that push off the second layer of T-channel extrusions. This would likely work, and we would use at least two in order to deliver equal/symmetric force while sliding up the screws.

My next steps include finishing the troubleshooting on this 3D print, helping Sophia with integrating my basic serial test code with her control software, and designing + optimizing the 3D printed shaft-suction cup piece that we’ll use to connect the vacuum pump tubes and the stepper motor to the suction cup.

Our manipulator prototype on our scanner.