GNU General Public License v3.0

kTAMV - Klipper Tool Alignment (using) Machine Vision

[screenshot of UI]

This allows X and Y alignment between multiple tools on a 3D printer, from inside Klipper, using a camera that points up towards the nozzle.

It has one part that runs as a part of Klipper, adding the necessary commands and integration, and one part that runs as a web server, locally or on any other computer, doing all the I/O and CPU-intensive calculations for true multithreading.

It adds the following commands to Klipper: KTAMV_SEND_SERVER_CFG, KTAMV_START_PREVIEW, KTAMV_CALIB_CAMERA, KTAMV_FIND_NOZZLE_CENTER, KTAMV_SET_ORIGIN and KTAMV_GET_OFFSET.

!!! This software is only meant for advanced users! Only use it while supervising your printer; it may produce unexpected results, so be ready to hit 'emergency stop' at any time! !!!

How to install

Connect to your Klipper machine using SSH and run this command:

cd ~/ && git clone https://github.com/TypQxQ/kTAMV.git && bash ~/kTAMV/install.sh

This will install and configure everything.

Configuration

The installation script will add a section to printer.cfg that looks like the following:

[ktamv]
nozzle_cam_url: http://localhost/webcam2/snapshot?max_delay=0
server_url: http://localhost:8085
move_speed: 1800
send_frame_to_cloud: false
detection_tolerance: 0

If your nozzle webcam is on another stream, change nozzle_cam_url accordingly. You can find out what the stream is called in the Mainsail camera configuration. For example, here it is webcam2, so my configuration would be:

nozzle_cam_url: http://localhost/webcam2/stream

Change the server_url if you run on another machine or port.

move_speed is the toolhead speed while calibrating.

send_frame_to_cloud indicates whether you want to contribute captured frames to possible future development of AI-based detection.

detection_tolerance: if the detected nozzle position is within this many pixels when comparing frames, it is considered a match. Only whole numbers are supported.
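For illustration, a configuration for a server running on a separate machine might look like the following. The IP addresses are made up; note that nozzle_cam_url must be reachable from the machine running the server, since the server is what fetches the frames.

```
[ktamv]
nozzle_cam_url: http://192.168.1.20/webcam2/snapshot?max_delay=0
server_url: http://192.168.1.50:8085
move_speed: 1800
send_frame_to_cloud: false
detection_tolerance: 1
```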

Setting up the server image in Mainsail

Add a webcam and configure it like in the image:


How to run

  1. Run the KTAMV_SEND_SERVER_CFG command to configure the server.
  2. Home the printer and move the endstop or nozzle over the camera so that it is approximately in the middle of the image. You can run the KTAMV_START_PREVIEW command to help you orient yourself.
  3. Run the KTAMV_CALIB_CAMERA command to detect the nozzle or endstop. Note that it can have problems with endstops; it is easier to calibrate using a nozzle.
  4. If successful, run the KTAMV_FIND_NOZZLE_CENTER command to center the nozzle or endstop.
  5. Run the KTAMV_SET_ORIGIN command to set this as the origin for all other offsets. If a tool is selected, it should not have any XY offsets applied.
  6. Change to another tool and move its nozzle over the camera so that it is approximately in the middle of the image. You can run the KTAMV_START_PREVIEW command to help you orient yourself.
  7. Run the KTAMV_FIND_NOZZLE_CENTER command to center the nozzle.
  8. Run the KTAMV_GET_OFFSET command to get the offset from where the first tool or nozzle was in the middle of the image.
  9. Repeat steps 6 - 8 for every remaining tool to get its offset.
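The steps above can be sketched as a Klipper macro. This is only an outline: the tool-change commands (T0, T1) and the camera position X150 Y150 are assumptions that must be adapted to your printer.

```
[gcode_macro KTAMV_EXAMPLE]
gcode:
    KTAMV_SEND_SERVER_CFG        ; step 1: configure the server
    G28                          ; step 2: home the printer
    T0                           ; select the reference tool (assumed macro)
    G0 X150 Y150 F1800           ; assumed camera position
    KTAMV_CALIB_CAMERA           ; step 3: calibrate the camera
    KTAMV_FIND_NOZZLE_CENTER     ; step 4: center the nozzle
    KTAMV_SET_ORIGIN             ; step 5: set the origin
    T1                           ; step 6: change to the next tool
    G0 X150 Y150 F1800
    KTAMV_FIND_NOZZLE_CENTER     ; step 7: center this nozzle
    KTAMV_GET_OFFSET             ; step 8: report the offset
```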

Can it be automated?

Of course! Here is a macro you can use as a starting point: ktamv_automation_example.cfg

Debug logs

The kTAMV server logs to memory, and everything can be displayed on its root path: http://my_printer_ip_address:8085/

The client part logs to the regular Klipper log.

FAQ

How it works

This project consists of two parts: a Klipper plugin and a web server based on Flask and Waitress. The Klipper plugin runs within the environment managed by Klipper and does not require any additional components. The web server, on the other hand, depends on various specific components for image recognition, mathematics, statistics and web serving. The project is truly multithreaded because the web server operates in its own Python instance and can even run on a different machine. This is unlike running everything inside Klipper, which is multithreaded but does not use multiple CPU cores and has to prioritize real-time interaction with the printer mainboards.

The camera calibration performs small movements around the initial position, keeping the nozzle near the center so its opening does not appear oval-shaped. It tries to find the nozzle at each position and calculates the distance in pixels between detections, already knowing the requested physical distance on the printer. It uses ten positions and skips the ones where the nozzle is not detected. It then filters out values that deviate more than 20% from the average, removing false readings and keeping only true values. It finally calculates a matrix it can use to map the distance between a point and the center of the image to real-space coordinates.
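The filtering and mapping math can be sketched in Python. This is not the server's actual code, only a minimal illustration: it assumes pairs of pixel offsets (seen on camera) and physical offsets (requested on the printer), drops readings whose mm-per-pixel scale deviates more than 20% from the average, and solves a least-squares 2x2 matrix mapping pixels to millimetres.

```python
import math

def fit_px_to_mm(px_offsets, mm_offsets):
    """Sketch of the calibration math (not kTAMV's actual code).

    px_offsets: pixel offsets seen on camera for each calibration move.
    mm_offsets: the physical offsets requested on the printer (mm).
    Returns a 2x2 matrix M mapping (px_x, px_y) -> (mm_x, mm_y).
    """
    # Per-sample mm-per-pixel scale; drop readings deviating >20% from mean
    scales = [math.hypot(*m) / math.hypot(*p)
              for p, m in zip(px_offsets, mm_offsets)]
    mean = sum(scales) / len(scales)
    kept = [(p, m) for p, m, s in zip(px_offsets, mm_offsets, scales)
            if abs(s - mean) <= 0.2 * mean]

    # Least squares: M = B * inv(A), A = sum(p p^T), B = sum(m p^T),
    # with the 2x2 inverse written out by hand
    sxx = sum(p[0] * p[0] for p, _ in kept)
    sxy = sum(p[0] * p[1] for p, _ in kept)
    syy = sum(p[1] * p[1] for p, _ in kept)
    b00 = sum(m[0] * p[0] for p, m in kept)
    b01 = sum(m[0] * p[1] for p, m in kept)
    b10 = sum(m[1] * p[0] for p, m in kept)
    b11 = sum(m[1] * p[1] for p, m in kept)
    det = sxx * syy - sxy * sxy
    return [[(b00 * syy - b01 * sxy) / det, (b01 * sxx - b00 * sxy) / det],
            [(b10 * syy - b11 * sxy) / det, (b11 * sxx - b10 * sxy) / det]]

# Example: camera at 0.05 mm/px, plus one bad reading that gets filtered out
px = [(100, 0), (0, 100), (100, 100), (100, 0)]
mm = [(5.0, 0.0), (0.0, 5.0), (5.0, 5.0), (3.0, 0.0)]  # last is an outlier
M = fit_px_to_mm(px, mm)
```

With the outlier removed, the fitted matrix comes out as a pure 0.05 mm/px scaling.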

When the server needs to find the center of the nozzle, it first fetches a frame from the webcam; this is the only time it accesses the webcam feed. It then resizes the image to 640x480 pixels. After this it tries to find a circle matching the nozzle opening by going through five different detector and image-preprocessor combinations. If it finds multiple circles, it uses the one closest to the center of the image. It repeats the above until it has found the same middle point 3 consecutive times within a tolerance of one pixel, or it times out (default 20 s).
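The "same point three consecutive times, or time out" loop can be sketched like this. It is a simplification, not the server's code: detect_nozzle is a stand-in for the real fetch/resize/detect pipeline.

```python
import math
import time

def find_stable_center(detect_nozzle, tolerance=1, matches_needed=3,
                       timeout=20.0):
    """Repeat detection until the same center is seen `matches_needed`
    consecutive times within `tolerance` pixels, or `timeout` seconds pass.

    detect_nozzle() returns an (x, y) center in pixels, or None when the
    nozzle was not found in the current frame.
    """
    deadline = time.monotonic() + timeout
    last, streak = None, 0
    while time.monotonic() < deadline:
        center = detect_nozzle()  # stand-in for: fetch a frame, resize to
        if center is None:        # 640x480, run the detector combinations
            last, streak = None, 0
            continue
        if last is not None and math.dist(center, last) <= tolerance:
            streak += 1
            if streak >= matches_needed:
                return center
        else:
            streak = 1  # first detection, or the point jumped too far
        last = center
    return None  # timed out without a stable center

# Simulated detector: one miss, then three readings within one pixel
readings = iter([None, (320.0, 241.0), (320.4, 240.6), (319.8, 240.9)])
center = find_stable_center(lambda: next(readings))
```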

Special thanks