Microscope Focus Stacking for Part Inspection (on a Budget)

Recently I found myself on a yak-shaving detour, in dire need of looking at some very small things. In my particular case: assessing scratches after consecutive grinding and polishing steps on a lens mold. Basically, I polished a cavity and needed to know whether the scratches from the previous grain size had been removed by the current one, so I could move on to the next. For that, I needed some cheap microscope objectives with a bit of magnification (4x to 10x). The problem: the depth of field of these objectives is very, very shallow. Since I want to look top-down at the curved cavity meant for molding lenses, I needed to cover quite a bit of focus travel.

acrylic glass that should be inspected during polishing

The simple solution to this problem is focus stacking: take several images while moving the microscope lens a fraction of a millimeter at a time, then combine only the sharp areas of those images afterwards.
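The combining step can be sketched in a few lines. This is a naive per-pixel version just to illustrate the idea (real stacking tools add image alignment and multi-scale blending); images are plain 2-D lists of grayscale values here, and all names are illustrative:

```python
def local_contrast(img, x, y):
    """Absolute difference to the 4-neighbour mean -- a crude per-pixel
    proxy for 'how in focus is this spot'."""
    n = (img[y - 1][x] + img[y + 1][x] + img[y][x - 1] + img[y][x + 1]) / 4
    return abs(img[y][x] - n)

def focus_stack(slices):
    """For every interior pixel, keep the value from whichever slice
    has the highest local contrast at that position."""
    h, w = len(slices[0]), len(slices[0][0])
    out = [row[:] for row in slices[0]]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            best = max(slices, key=lambda s: local_contrast(s, x, y))
            out[y][x] = best[y][x]
    return out
```

Selecting per pixel like this produces hard seams; that is exactly what enfuse's blending masks (used later in this post) are there to avoid.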

There are a few noteworthy microscopy projects out there that work towards more accessible and open-source imaging:

The OpenFlexure project developed a 3-axis movement stage with micrometer precision, based on deforming a cheap 3d-printed part. They offer some microscope objective options based on cheap Raspberry Pi cameras, and that setup seems to work well for biology samples, i.e. looking at tiny cells on a small glass slide at 50-100x magnification.

UC2 ("you see, too") is a wide collection of building blocks (like, literally blocks) for modular microscopes. The documentation looks pretty extensive, and they seem to have a lot of options for rather fancy imaging techniques.

However, neither project really had a simple solution for my particular problem (large parts, low magnification), so I set out to build my own very simple microscope rig.

the microscope

For that a bit of electronics, hardware, and software is necessary:

Electronics & Sensor

The first version of the DIY microscope used a Sony A6000 with a 3d-printed spacer for the microscope lens. This worked fine, and the large APS-C sensor covered quite a large field of view with the microscope objective. I controlled the camera using gphoto2 from a Raspberry Pi. But moving the camera, taking a picture, and loading it from internal storage to the Pi is slow and tedious. In addition, I didn’t want to torture the camera’s mechanical shutter excessively. The easier solution is a Raspberry Pi HQ camera module with a 3d-printed spacer for the objective. The sensor is considerably smaller, so the field of view is narrower, but it’s faster, is controlled directly by the Pi, and doesn’t have a mechanical shutter.

The motor responsible for moving the microscope objective closer to the camera is controlled by a Fysetc E4 running FluidNC, which is basically a Grbl-compatible firmware for ESP32 boards instead of the good old ATmega328. The big advantage of FluidNC on the Fysetc E4 is that you can just load a config file with steps-per-mm and motor-current settings, and FluidNC takes care of configuring the board’s very silent TMC2209 stepper drivers accordingly. It’s a very versatile combination that requires minimal effort.

I am not using a G-code sender but my own camera-slider Python script, which works well for this simple job. The general procedure: use picamera to take an image, tell FluidNC to move the Z axis by a few steps, take the next image, and so on…
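A minimal sketch of such a loop, assuming FluidNC answers on a USB serial port and the picamera2 library is installed. The step size, slice count, port, and feed rate are illustrative values, not the author's actual settings:

```python
import time

# Illustrative values -- tune to your objective and part geometry.
STEP_MM = 0.05
N_SLICES = 24
FEED = 50

def slice_gcode(step_mm, feed):
    """The relative Z move sent between two captures."""
    return f"G1 Z{step_mm:.3f} F{feed}"

def run_stack():
    """Capture loop: shoot, nudge Z, repeat. Assumes FluidNC is
    reachable over USB serial (pyserial) and picamera2 is available."""
    import serial                    # pyserial, hardware-only import
    from picamera2 import Picamera2
    cnc = serial.Serial("/dev/ttyUSB0", 115200, timeout=2)
    cam = Picamera2()
    cam.configure(cam.create_still_configuration())
    cam.start()
    cnc.write(b"G91\n")              # switch FluidNC to relative positioning
    cnc.readline()                   # consume the "ok"
    for i in range(N_SLICES):
        cam.capture_file(f"slice_{i:03d}.jpg")
        cnc.write((slice_gcode(STEP_MM, FEED) + "\n").encode())
        cnc.readline()               # wait for "ok" before the next shot
        time.sleep(0.5)              # let vibrations settle
```

Waiting for FluidNC's "ok" after each move keeps the camera and the axis in lockstep without needing position feedback.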


the macrorail I use as a base

As a linear actuator, I am using components I originally intended for a macro rail (which kind of makes sense): a 270mm 40x20 aluminium extrusion with a 250mm MGN12H linear rail from Aliexpress. A 1.8-degree stepper motor with a GT2 belt spins an 8mm ACME leadscrew. The GT2 pulleys have a 3:1 gear ratio and the leadscrew has a 2mm pitch, which gives a theoretical resolution of 3.33 microns per full step.
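The arithmetic behind that number, spelled out. Note that the steps-per-mm value entered in a FluidNC config would additionally include the TMC2209 microstepping factor (16x is used as an example here, not a value from the post):

```python
FULL_STEPS_PER_REV = 200      # 1.8 degrees per full step
GEAR_RATIO = 3                # GT2 pulley reduction, 3:1
LEADSCREW_PITCH_MM = 2.0      # leadscrew advance per revolution

steps_per_mm = FULL_STEPS_PER_REV * GEAR_RATIO / LEADSCREW_PITCH_MM
microns_per_step = 1000 / steps_per_mm

print(steps_per_mm)                  # 300.0 full steps per mm
print(round(microns_per_step, 2))    # 3.33 microns per full step

# With e.g. 16x microstepping on the TMC2209, the firmware value
# would be 300 * 16 = 4800 steps per mm.
```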

The 3d-printed tube holding the microscope objective, camera module, and Raspberry Pi is clamped to the linear rail carriage with an Arca Swiss plate. Both the Arca plate and the clamp are from Mengs Photo, my preferred purveyor of low-price photo parts.

a micronstage with retrofitted stepper motors

The X and Y axes are handled by a micrometer XY-stage from Aliexpress. I got the idea and purchase information from UC2; you can find them in the UC2-MicronStage repository. However, I mostly just move the stage manually to set the position once, without making use of the stepper motors.

The microscope objectives I use are just some cheap generic ones from Amazon for about 20 Euro each (4x, 10x). They come with an RMS thread as their only mounting option, so they either need to be screwed into an RMS adapter, or an RMS thread needs to be cut into a machined holder (Thorlabs sells the correct tap). I just 3d-printed a sufficiently similar thread and used a bit of force, so the metal thread on the objective cuts its way into the plastic. Not the most elegant solution, but it works well.


RGB ringlight mounted around the microscope objective with an additional diffusor

I tried to be clever and use an RGB ringlight mounted around the microscope objective. By selectively mixing colors and controlling the direction of light you can highlight the direction of scratches.
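What "controlling the direction" means can be sketched independently of the concrete LED library: compute which sector of the ring to light for a given illumination direction, then push those colors to the strip. All names and values here are illustrative, not taken from the author's script:

```python
def ring_pattern(n_leds, direction_deg, spread_deg=45, color=(255, 255, 255)):
    """Per-LED colors for a ring light: light only the sector facing
    `direction_deg`, so scratches perpendicular to that direction
    catch the raking light. LEDs outside the sector stay off."""
    pattern = []
    for i in range(n_leds):
        angle = i * 360 / n_leds                     # LED position on the ring
        diff = (angle - direction_deg + 180) % 360 - 180  # shortest angular distance
        pattern.append(color if abs(diff) <= spread_deg else (0, 0, 0))
    return pattern
```

Sweeping `direction_deg` around the ring while watching the live preview is what makes scratch orientations pop in and out of visibility.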


The problem is that the geometry of the ring light (position and diameter) only works well in a rather narrow set of situations, and even then it is slightly dim.


While the camera sensor could just expose a bit longer and compensate for the dim LEDs, that’s a pain with the Raspberry Pi camera driver, so I tried to avoid it. The simplest solution was just to use a very bright lamp (1000 lumen) and flood the whole area with light at an angle.

IKEA lamp as flood illumination


I tested a few pieces of software in the process of finding something that works for me and ended up with the command-line tools that ship with Hugin: align_image_stack and enfuse.

Excerpts from my shell script:

Align images:

$APPDIR/align_image_stack -v -m -a aligned $1/*.jpg

Combine aligned images:

$APPDIR/enfuse --exposure-weight=0 --saturation-weight=0 --contrast-weight=1 --hard-mask --contrast-window-size=9 --output=$OUTPUT_PATH/$2$DIR_NAME.jpg $TMP/*.tif

On my Mac, APPDIR=/Applications/Hugin/tools_mac when Hugin is installed via the official installer.

That’s certainly not the most convenient workflow for people who would rather use a graphical user interface, but for me it’s perfect. When I do a dozen polishing steps and want a focus-stacked image between each one, I don’t want to manually copy images, click five buttons in a user interface, and assign a name to the resulting image. I can write a script for that once and not worry about it later. The only truly manual step: while Zerene Stacker and Helicon Focus seem to manage this fine, enfuse is a bit susceptible to images in the stack that are completely out of focus. If you have a bit of overshoot in your image stack, i.e. you move the camera beyond the object and no part of the image is in focus, you may need to delete those images manually. Otherwise, aligning and stacking might fail.
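That pruning could also be automated with a crude focus measure. A pure-Python sketch using the variance of a discrete Laplacian, a common sharpness metric (to apply it to actual JPEGs you would load the pixel data with e.g. Pillow first; the threshold and all names are illustrative):

```python
def sharpness(img):
    """Variance of a 4-neighbour Laplacian over a 2-D list of grayscale
    values. Fully out-of-focus frames score near zero and can be dropped
    before handing the stack to align_image_stack / enfuse."""
    h, w = len(img), len(img[0])
    vals = [(-4 * img[y][x]
             + img[y - 1][x] + img[y + 1][x]
             + img[y][x - 1] + img[y][x + 1])
            for y in range(1, h - 1) for x in range(1, w - 1)]
    mean = sum(vals) / len(vals)
    return sum((v - mean) ** 2 for v in vals) / len(vals)
```

Scoring every slice and discarding those far below the stack's median score would remove the overshoot frames without eyeballing each one.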


Aligning and stacking a sequence of 24 images:


The result looks quite usable:

stacked image