gaugo87 / PIXEstL

A program for creating color lithophanes and pixel images
MIT License

First Go Round, and some questions. #6

Open LewnWorx opened 7 months ago

LewnWorx commented 7 months ago

So I followed your instructions as best I could with the following caveats:

The STLs didn’t match the written descriptions.

1 - In the calibration discussion you mention a separate “square” at integer multiples of the print layer height (i.e. 0.10, 0.20, 0.30, and so on for a 0.10mm layer height). Yet the STL’s first “square” was 0.20 and the rest incremented by the 0.10 layer height. Is there a reason for the first one having double the layer height?

2 - While it wasn’t explicitly called out, the separate STL (full length, but a flat 0.10mm Z across the entire surface) appears to be designed to print under the stair-step gradients and should be printed in white. I gathered that from whatever slicer was shown there (I’ve never used anything but PrusaSlicer, so was unfamiliar with the UI), so I’m assuming that’s correct.

Following the other comments, I decided to go with 7 layers at a 0.07mm layer height, and it dawned on me that that might need to change, so I built a fully parameterized model in Fusion that’s set up so you specify the number of steps, the square size, the target layer height for each step, and an optional “first step additional Z height” in case what I saw in the demo calibration STL really needed to be there. It also takes a separate “first” layer height for what I believe to be the “needed” white layer under the whole thing.

It then builds the steps and the ‘white’ base.

I then set it up for 7 steps @ 0.07mm @ 25mm square, with a 0.15mm white first layer under the whole thing. I exported that and loaded it up with IIIDMax’s light blue, yellow, fuchsia, white, and black.

I then printed those, and that led to the next question:

How do you calibrate measuring the colors? I’ve got a multi-level LED panel with various white color temps (not sure of the exact values, though, as the manual is in Chinese) and variable brightness. I put the strips on that and took several photos at various color temps and brightnesses (noticing my iPhone camera auto-compensated for the various brightnesses, as all of them imported looking mostly the same).

What wasn’t clear at all was how to white balance this. I’ve done a ton of professional photography back in the SLR days and know how to pull a white balance there (I still have several neutral gray targets with specific color temp ranges), but this is a whole different beast, and being backlit makes the prospect of getting an accurate color balance a puzzle.

So what I ended up doing was importing them into Photoshop, and I noticed the “white” border around the strips was anything but: the iPhone’s auto-compensation had rendered it in various shades of gray.

So my next thought was to try to get the white border back to white without introducing a hue shift, sticking to levels and brightness/contrast adjustments. That “sorta” worked.

The next thing was measuring the colors. Even with ironing, some (particularly the whites and blacks) still showed the lines. My “sorta” solution for that was to run a Gaussian blur on the image, as it changes contrast and edge deltas but shouldn’t muck with the hue much at all and only diddles the saturation a little. It was either that or run an average calc on all the pixels in a selected area, and the Gaussian blur was way less work.
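For what it's worth, the pixel-averaging alternative is only a few lines of code outside Photoshop. A hedged sketch (the `patch` values below are made up for illustration, not measured from any real swatch): average the RGB of an artifact-free region, then convert to the H/S/L integers the palette JSON uses.

```python
import colorsys

# Hypothetical 3x3 patch of RGB pixels (0-255) sampled from a swatch photo;
# in practice you'd crop a region free of print-line artifacts.
patch = [
    (200, 140, 60), (202, 138, 62), (198, 142, 58),
    (201, 141, 61), (199, 139, 59), (203, 140, 60),
    (200, 138, 62), (198, 141, 59), (201, 139, 61),
]

def average_hsl(pixels):
    """Mean RGB over the patch, converted to (H, S, L) with
    H in degrees and S/L as percentages."""
    n = len(pixels)
    r = sum(p[0] for p in pixels) / n / 255.0
    g = sum(p[1] for p in pixels) / n / 255.0
    b = sum(p[2] for p in pixels) / n / 255.0
    h, l, s = colorsys.rgb_to_hls(r, g, b)  # note: stdlib returns H, L, S
    return round(h * 360), round(s * 100), round(l * 100)
```

Averaging before converting sidesteps the blur entirely, since the mean over the region is exactly what a heavy Gaussian blur approximates at the center pixel.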

Next I expanded the JSON color palettes to 7 layers for each color. What wasn’t super clear, though, was where you pulled your hex HSL from (at least I think that first JSON parameter is HSL; hope it’s not RGB, or I really screwed it up). So where does THAT first one come from? The middle of the spectrum of your swatches? The first? The last? Does it even matter? That part wasn’t clear.

So I transferred over all the HSL values from ColorSlurp (which, if you’re on a Mac, is great for this sort of thing, as you can create templates for the output format when you copy a value out of a palette), which gave me the entire entry, pre-formatted per your JSON layout, in one go per color. Saved a LOT of time.

So I updated the palette, added in my 5 initial IIIDMax filaments, saved that with your jar and the other files to a directory, cloned the sample.bat file, and started modifying that. What I noticed is that my hues and saturations for the IIIDMax counterparts of the CMY stuff were in the same ballpark, but my L values were generally off from yours by a good 20 points in most cases. Hence the question about best practice for both backlighting these and measuring them.

From there, it was on to the batch file modifications.

The first thing I noticed is that when you drop that on Terminal (at least in macOS Monterey), the shell expects fully qualified paths for EVERYTHING, so I needed to drop the files one at a time on Terminal to get those paths, paste them into the batch file, and then apply your suggested settings for a 0.40 nozzle. I noticed there that the default backing height (0.24mm) wasn’t an even multiple of the 0.10 layer height, and neither was the texture thickness default.

Thinking that having a non-integer multiple of the layer height (particularly for the texture file) might have an adverse impact, I settled on 0.28 for the base (at a 0.07mm layer height), giving me 4 layers of backing, and a similar integer number of layers for the texture height (can’t remember the value, but it was close to your suggested 0.4mm default, just rounded to the closest integer multiple of my 0.07mm layer height).
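That rounding is easy to automate. A small sketch (the 0.07mm default is just the layer height discussed in this thread, not anything PIXEstL prescribes):

```python
def to_layer_multiple(height_mm, layer_mm=0.07):
    """Round a target thickness to the nearest whole number of layers,
    never going below one layer; returns (layer_count, rounded_height)."""
    layers = max(1, round(height_mm / layer_mm))
    return layers, round(layers * layer_mm, 4)

# The suggested 0.4mm texture default at a 0.07mm layer height rounds
# to 6 layers = 0.42mm; the 0.28mm base is exactly 4 layers.
```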

My source image was a 1200 DPI color scan of a magazine cover, which I cleaned up in Photoshop (to ditch the offset-printing moiré effects you get on a scanned print), with some basic straightening, skew correction, and my eyeballed color correction, based on having shot some 200,000 or so posted photos over the years. I dropped the image resolution to 300 DPI, and that became the source image.

I bumped the color layers from 5 to 7, left the rest at defaults and let it rip.

I unpacked the zip and dropped all of them into PrusaSlicer (which asked if they were all a single object with multiple parts); I answered yes, and it apparently got the stacking order correct. Sliced it with 100% rectilinear infill and off to the printer it went. At 130mm on the X axis (portrait orientation), this was a 13-hour print.

So here’s my source color palette against the light pad:

[photo attachments]

These were all at the various (indecipherable Chinese) color temp values of the backlight. The sometimes-illegible squiggly marks on the swatches are due to my Fusion model baking the exact layer height of each swatch square into it, so if I have multiples of these at different layer heights I’ll be able to tell which is which.

So here’s the print, again on the same backlit pad at various color temps and brightnesses.

I was pretty happy (very, actually) with the detail; not so keen on the very white highlights on her face. The color stuff is somewhat close, but the TIME lettering and the non-subject gray background are all over the map color-wise.

[photo attachments]

So here’s some close ups of the details:

[photo attachments]

And the original source:

[photo attachment]

So, curious as to where I may have messed up, how I could better or more accurately handle the color palette sampling, and any other recommendations you might have as to how best use this thing.

Also, as I’m a programmer, I could probably cook up a GUI app for Mac and Windows that would handle creation of the JSON palette: feed the app the pix of the calibration swatches, pick the colors, and create a library of sorts for the filaments; then generate specific palette files using any subset of the filament library’s “sampled” colors, to be saved as a preset or applied to a specific image; and let the user use standard OS-hosted input/output dialogs for getting the source and destination images, feeding it all to your jar on the user’s behalf. Shouldn’t be too terribly tough to cook up, as you’ve already done all the heavy lifting.

gaugo87 commented 7 months ago

Hi @LewnWorx ,

1 - In the calibration discussion you mention a separate “square” at integer multiples of the print layer height (i.e. 0.10, 0.20, 0.30, and so on for a 0.10mm layer height). Yet the STL’s first “square” was 0.20 and the rest incremented by the 0.10 layer height. Is there a reason for the first one having double the layer height?

Each square must sit at a Z height that is a multiple of the step (so for a 0.1mm step: 0.10, 0.20, 0.30, etc.). But my main concern is to have a really uniform visible layer. For this, I use a neutral (white) base of 0.1mm on which the layers I want to calibrate rest. This gives me a true first layer of color on which I can activate ironing (Bambu Studio refuses to do ironing on a first layer). I also noticed that it doesn't change much in the final result.

So if you take the base into account, yes, the first square will be 0.2mm. But since the bottom 0.1mm is white (and your other color sits on top of it), the colored portion is actually 0.1mm.

Now, if you manage to have consistent values without this base, then it will be better.
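Put as arithmetic (a quick sketch; the numbers are just the 0.1mm example above), the total Z of each calibration square is the white base plus a whole number of steps, which is why the first square measures 0.2mm:

```python
base = 0.10   # neutral white base under every calibration square
step = 0.10   # step (layer) height being calibrated

# Total Z of squares 1 through 7; the colored portion of square 1
# is still only one 0.1mm layer.
total_heights = [round(base + n * step, 2) for n in range(1, 8)]
# total_heights -> [0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8]
```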

2 - While it wasn’t explicitly called out, the separate STL (full length, but a flat 0.10mm Z across the entire surface) appears to be designed to print under the stair-step gradients and should be printed in white. I gathered that from whatever slicer was shown there (I’ve never used anything but PrusaSlicer, so was unfamiliar with the UI), so I’m assuming that’s correct.

Yes, that's correct.

Following the other comments, I decided to go with 7 layers at a 0.07mm layer height, and it dawned on me that that might need to change, so I built a fully parameterized model in Fusion that’s set up so you specify the number of steps, the square size, the target layer height for each step, and an optional “first step additional Z height” in case what I saw in the demo calibration STL really needed to be there. It also takes a separate “first” layer height for what I believe to be the “needed” white layer under the whole thing.

Yes, with 4 colors, that seems to be the best choice. @MadMax389's results are amazing.

How do you calibrate measuring the colors? I’ve got a multi-level LED panel with various white color temps (not sure of the exact values, though, as the manual is in Chinese) and variable brightness. I put the strips on that and took several photos at various color temps and brightnesses (noticing my iPhone camera auto-compensated for the various brightnesses, as all of them imported looking mostly the same).

The ideal is to use the same temperature as the LED strip you are using. My phone also has an option to correct photos with AI. I strongly advise you to disable this option. I also recommend using the pro mode of your phone to choose the ISO, speed, and temperature yourself.

The next thing was measuring the colors. Even with ironing, some (particularly the whites and blacks) still showed the lines. My “sorta” solution for that was to run a Gaussian blur on the image, as it changes contrast and edge deltas but shouldn’t muck with the hue much at all and only diddles the saturation a little. It was either that or run an average calc on all the pixels in a selected area, and the Gaussian blur was way less work.

Before using ironing, I also tried using Gaussian blur. Thanks to ironing, it was no longer necessary for me.

Next I expanded the JSON color palettes to 7 layers for each color. What wasn’t super clear, though, was where you pulled your hex HSL from (at least I think that first JSON parameter is HSL; hope it’s not RGB, or I really screwed it up). So where does THAT first one come from? The middle of the spectrum of your swatches? The first? The last? Does it even matter? That part wasn’t clear.

In lithophane mode, the first hexadecimal value doesn't matter. It's used in "FULL" mode, which doesn't use color blending but uses the filament directly "as is": it represents the color of the filament. I personally put the color declared by the manufacturer. But again, in lithophane mode, it doesn't matter.

So I updated the palette, added in my 5 initial IIIDMax filaments, saved that with your jar and the other files to a directory, cloned the sample.bat file, and started modifying that. What I noticed is that my hues and saturations for the IIIDMax counterparts of the CMY stuff were in the same ballpark, but my L values were generally off from yours by a good 20 points in most cases. Hence the question about best practice for both backlighting these and measuring them.

To be honest, I don't have the slightest idea. Just like you, I'm also learning to master all of this.

The first thing I noticed is that when you drop that on Terminal (at least in macOS Monterey), the shell expects fully qualified paths for EVERYTHING, so I needed to drop the files one at a time on Terminal to get those paths, paste them into the batch file, and then apply your suggested settings for a 0.40 nozzle. I noticed there that the default backing height (0.24mm) wasn’t an even multiple of the 0.10 layer height, and neither was the texture thickness default. Thinking that having a non-integer multiple of the layer height (particularly for the texture file) might have an adverse impact, I settled on 0.28 for the base (at a 0.07mm layer height), giving me 4 layers of backing, and a similar integer number of layers for the texture height (can’t remember the value, but it was close to your suggested 0.4mm default, just rounded to the closest integer multiple of my 0.07mm layer height).

My suggested settings are primarily for 0.12mm layers, which is one of the default profiles on Bambu Lab for the 0.4mm nozzle. I indeed wasn't very clear. As you understood, the base height needs to equal the first-layer height plus a multiple of the other layer heights (if you print the base in multiple layers), and the height of each color layer needs to equal the layer height in the slicer.
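That compatibility rule can be written as a small check (a hypothetical helper for illustration, not something PIXEstL provides): the backing must equal the slicer's first-layer height plus a whole number of regular layers.

```python
def base_is_printable(base_mm, first_layer_mm, layer_mm, tol=1e-6):
    """True if base_mm == first_layer_mm + k * layer_mm for some k >= 0,
    within a small floating-point tolerance."""
    remainder = base_mm - first_layer_mm
    if remainder < -tol:
        return False
    k = round(remainder / layer_mm)
    return k >= 0 and abs(remainder - k * layer_mm) < tol

# A 0.24mm backing with 0.10mm first and regular layers fails (as noticed
# above), while a 0.28mm backing with 0.07mm layers throughout is fine.
```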

So, curious as to where I may have messed up, how I could better or more accurately handle the color palette sampling, and any other recommendations you might have as to how best use this thing.

I don't think you got it wrong anywhere. I actually find that image 3 is the best, in my opinion. I don't think I could have done better with 4 colors.

In reality, it's just very difficult to print skin color with only 4 colors. I admit that I mainly created this program to explore the possibilities of lithophanes using more than 4 colors: to be able to use 8, 12, or 16...

I personally use between 7 and 10 colors (8 in 90% of cases), with beige and brown to have more shades available for managing skin tones.

Also, as I’m also a programmer, I could probably cook up a gui app for mac and windows that would handle the creation of the JSON Pallette by feeding the app the pix of the calibration swatches and picking the colors and cratering a library of sorts for the filaments, and the generate specific Pallette files using any subset of the filament libraries “sampled” colors to be saved as a preset, or applied to a specific image, and allow the user to use standard OS hosted input output dialogs for getting the source images and the designation image, and feed that all to your jar on behalf of the user. Shouldn’t be too terribly tough to cook up as you’ve already done all the heavy lifting.

Well, it will be a pleasure.

That said, I am currently exploring a new way for the color palette. I'm not sure yet if it will be worth it because it adds quite a bit of complexity to the calibration. My idea is to print all possible combinations in a matrix and use these actual "obtained colors" to deduce all usable colors.

20240129_213703

I made a prototype that directly uses an image as a reference (so no need to retrieve HSL values anymore).

matrix4

I don't know how it's going to turn out. Especially since I put this project on hold a month ago.

MadMax389 commented 7 months ago

What wasn’t clear at all was how to white balance this. I’ve done a ton of professional photography back in the SLR days and know how to pull a white balance there (I still have several neutral gray targets with specific color temp ranges), but this is a whole different beast, and being backlit makes the prospect of getting an accurate color balance a puzzle.

So what I ended up doing was importing them into Photoshop, and I noticed the “white” border around the strips was anything but: the iPhone’s auto-compensation had rendered it in various shades of gray.

So my next thought was to try to get the white border back to white without introducing a hue shift, sticking to levels and brightness/contrast adjustments. That “sorta” worked.

The next thing was measuring the colors. Even with ironing, some (particularly the whites and blacks) still showed the lines. My “sorta” solution for that was to run a Gaussian blur on the image, as it changes contrast and edge deltas but shouldn’t muck with the hue much at all and only diddles the saturation a little. It was either that or run an average calc on all the pixels in a selected area, and the Gaussian blur was way less work.

I'm not a photography expert, but I snapped pictures using my Pixel, always using the same light source. I did a simple white balance on the resulting image.

I also had issues selecting a pixel to generate the hex value. Paint.net has a cool picker that averages the value over a 51x51 square.

I've had pretty good luck with 5-color 3DMax: Red, White, Yellow, Blue, and Violet.

I've become so obsessed I wound up buying a handheld colorimeter to play around with. When I receive it, hopefully I will be able to generate hex values directly from the swatch.

LewnWorx commented 7 months ago

@gaugo87 @MadMax389 :

First, thanks for the replies. Good to know I’m not completely out in the weeds.

A couple things I picked up on between your two replies:

The “quality” of the color sampling is pretty important. A white balance of the source image based on the LED source of the backlight appears to be pretty key. In the interim I did manage to translate the tech specs of the panel I’m using, which gave the color temp values of the various modes. Whether or not that’s actually accurate is a good question; I’m almost thinking of tearing it apart to get the actual chip models of the LED and driver SMDs to look those up.

But I did notice that Max’s white-balanced sources he fed to the color picker are a lot more white than mine (even just visually, to the raw eye), and that most likely has skewed my color values. Further, his 51-pixel rect that gets averaged is a whole lot better than my Gaussian-blurred single-pixel grab per tile. Gonna have to look at how they did that. I’m not 100% sure, but I think there’s support for color averaging in one of the JavaScript libraries; I’ll have to dig into the API and see if it’s there. Back in the day, when very few folks had 24-bit graphics cards, pretty much all images were 8-bit, based on a palette, and palette optimization was something of a fine science. Equilibrium’s DeBabelizer Pro had probably the best tool on the planet for creating really good palettes, and what we’re doing here is essentially that same thing.

Kinna funny how this stuff has all come full circle.

It’s been a while since I’ve cut a desktop app for stuff like this, as I’ve had a couple of job hops since then. My go-to tool for this sort of thing has been Xojo, as they handle all the platform-specific API wrappers for you; you write platform-agnostic code, pick your target platforms, and from a single code base you can spit out Windows, macOS, iOS, Linux, etc. Sadly, because I haven’t been doing that stuff for several years (my more recent development work has all been database stuff), my compiler license has expired, and thus my most recent build of the thing is several years old, although the build I can use probably can do what I need for this, as it isn’t inherently leveraging any “state of the art” OS-specific API-wrapped stuff. Dunno, will check when I get a bit of time. I’d rather not pony up $900 for a renewal.

@MadMax389, what’d you end up going with for a colorimeter?

I need to get ahold of Sergio over at IIIDMax, as he mentioned he’s got a buddy who’s working on a hardware-based colorimeter specifically for filament usage. Dunno much about it, but I get the sense it’s like a Pi or Arduino or Adafruit thing with a support board and firmware.

Also, you mentioned you were getting 5-hour print times with your stuff. The one I did (at 130mm x 170mm) rolled in at over 12 hours on my 5-toolhead XL with input shaping on. Feed parameters to the jar were -f 0.28 -b 0.07 -l 7 -w 130. Print settings were a 0.15mm first layer (white) and 0.07mm for the second and subsequent layers. Infill was 100% rectilinear. However, I just noticed on opening it back up that I still had ironing on from the color sample prints. DOH. But re-slicing with that off still shows 10 hours. Not sure why your print times are so much better than mine.

Also, you’d mentioned your output STLs from PIXEstL were huge. Mine are weighing in between 24-39 MB each, which really isn’t that bad. Are you feeding PIXEstL any values I’m not (like pixel size or some of the other ones)? My source image was 2528x3280 @ 300 DPI, downsampled from a ludicrous 1200 DPI scan that was a 402 MB TIFF coming out of Photoshop post-cleanup.

At any rate, I’ll rework the palettes based off this stuff, wait for your response as to what you’re using for command line parameters and fire off another round.

Thanks…

MadMax389 commented 7 months ago

@LewnWorx I wound up with the cheapest thing I could find on AliExpress for $120. Similar ones at Amazon were twice the price. Not sure if it will work or if it's worth it, but like I said, I'm obsessed :). It does output hex and RGB (no HSL), which I can convert to HSL in Excel.

I've talked to Sergio about some other things, but I think he's working with the HueForge guy. I could be mistaken, but I think that colorimeter they are developing is to measure transmittance on a strand of filament, or maybe swatches as well. It might be applicable to lithos and a lot cheaper.

I arrived at simple white balancing after creating profiles from non-corrected images. Colors were way off on the lithos. I used the "white" on the diffuser to set the baseline. By correcting this way, I think it standardizes all my palettes and removes or minimizes variability in lighting, temp, or camera.

My 5-hour prints are for my 115mm2 hex lithophanes. The 144x108 lithos of the same image take 9 to 10 hours. I did a lot of playing around with color width and texture resolution. I wanted to make lithos for upload that could be printed with either a .2mm or .4mm nozzle without having to create separate STLs. I tested various color widths and texture resolutions with both nozzles and compromised on the following settings. I use a .3mm line width in the slicer for the .4mm nozzle.

Regarding mesh size, they were a lot bigger before Beta 2. Now the cumulative size of all 6 STLs for 144x108mm is about 500 MB (more for 5- and 6-color lithos).

LewnWorx commented 7 months ago

Thanks for all that.

So I re-white-balanced and found a dirt-simple means of getting the average in Photoshop: you can convert any selection to an average color. So I was able to use the other tools to build up a selection that didn't have any artifacts in it and, once that was done, average the whole mess for each sample, which I then fed to ColorSlurp (which on the Mac is an awesome tool for this sort of thing, as you can define the output format for any color format; I have a preset built that generates the JSON already formatted, with all the same spacing/tabs etc., so I can just drop in the 3 lines for each entry's layers).

My output palette (sans the other colors I haven't printed swatches for yet or processed) is essentially this:

{
  "#004785": {
    "name": "PLA - IIID Max - PLA+ - Light Blue",
    "active": true,
    "layers": {
      "7": { "H": 208, "S": 100, "L": 26 },
      "6": { "H": 205, "S": 95, "L": 29 },
      "5": { "H": 208, "S": 63, "L": 36 },
      "4": { "H": 208, "S": 52, "L": 39 },
      "3": { "H": 208, "S": 35, "L": 45 },
      "2": { "H": 205, "S": 25, "L": 49 },
      "1": { "H": 207, "S": 16, "L": 51 }
    }
  },
  "#C95CA2": {
    "name": "PLA - IIID Max - PLA+ - Fushia",
    "active": true,
    "layers": {
      "7": { "H": 321, "S": 50, "L": 57 },
      "6": { "H": 323, "S": 59, "L": 65 },
      "5": { "H": 323, "S": 60, "L": 68 },
      "4": { "H": 324, "S": 59, "L": 70 },
      "3": { "H": 323, "S": 54, "L": 72 },
      "2": { "H": 325, "S": 43, "L": 72 },
      "1": { "H": 327, "S": 23, "L": 69 }
    }
  },
  "#AA8400": {
    "name": "PLA - IIID Max - PLA + - Yellow",
    "active": true,
    "layers": {
      "7": { "H": 47, "S": 100, "L": 33 },
      "6": { "H": 48, "S": 100, "L": 37 },
      "5": { "H": 50, "S": 100, "L": 38 },
      "4": { "H": 51, "S": 100, "L": 38 },
      "3": { "H": 52, "S": 100, "L": 39 },
      "2": { "H": 54, "S": 94, "L": 39 },
      "1": { "H": 53, "S": 40, "L": 52 }
    }
  },
  "#FFFFFF": {
    "name": "PLA - CC3D - PLA Max - White",
    "active": true,
    "layers": {
      "7": { "H": 35, "S": 7, "L": 56 },
      "6": { "H": 37, "S": 6, "L": 61 },
      "5": { "H": 32, "S": 6, "L": 65 },
      "4": { "H": 42, "S": 4, "L": 64 },
      "3": { "H": 48, "S": 3, "L": 65 },
      "2": { "H": 50, "S": 4, "L": 66 },
      "1": { "H": 39, "S": 2, "L": 64 }
    }
  },
  "#000000": {
    "name": "PLA - IIID MAX - PLA + - Black",
    "active": true,
    "layers": {
      "7": { "H": 358, "S": 3, "L": 15 },
      "6": { "H": 28, "S": 2, "L": 21 },
      "5": { "H": 62, "S": 1, "L": 29 },
      "4": { "H": 62, "S": 1, "L": 33 },
      "3": { "H": 152, "S": 1, "L": 39 },
      "2": { "H": 240, "S": 0, "L": 42 },
      "1": { "H": 208, "S": 1, "L": 48 }
    }
  }
}

I've also got the parameterized F360 model that creates the swatches for ya if you want that, lemme know and I'll email it to ya.

I did mod your settings a bit: I used a texture height that was evenly divisible by the 0.07 layer height, and also added a non-default entry for the minimum texture height (as the layer height is 0.07), so the parameter part of the equation ended up as:

-b 0.07 -cW .3 -f 0.15 -l 7 -w 120 -M 1.68 -m 0.07 -tW .2

I'm not sure if changing the minimum texture height from the default helped, as I haven't had enough time to change one parameter at a time and compare the differences.

Print time jumped from 12 to 16 hours on the XL as a result, and the file sizes jumped quite a bit as well:

image

However the previews look a LOT better.

The R1 previews:

image

image

The R2 Previews:

image

image

Now, the texture in R1 isn't a whole lot different than in R2, and it looks really decent on the actual print. As a result, I'm toying with the idea of doing two separate runs with different parameters: using the lower-"res" version of the texture file with the higher-"res" colors, and creating a layer modifier in the slicer to change the print width from your 0.3 to the stock 0.45 (which is what was used in the first rev) as a means of perhaps saving some of that additional print time.

Not sure why I'm getting such huge print times vs. yours, as for normal (non-litho) work the XL flat out smokes my i3 (by a factor of almost 4x) on multi-material stuff, simply because the color changes are damn near instantaneous, so any advantage the Bambu might have in raw print speed over the XL is lost with all the layer changes. But that's a whole different fish to fry at this point.

I keep forgetting to call Sergio, need to. We usually shoot the shit at least once a month (mostly about everything BUT printing).

MadMax389 commented 7 months ago

The output from ColorSlurp is amazing. I was converting each hex code to HSL using a converter website and then editing each entry of the .json in Notepad++. A true pain. Not having coding experience other than with industrial PLCs, your post inspired me to create a poor man's substitute for elegant code in Excel. I found some VBA to convert hex to RGB, then RGB to HSL. Simple concatenations generate text for the JSON. Bulky, but effective. Now I can enter hex values from the colorimeter and generate the code automatically. Thanks for the inspiration.
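For anyone without Excel handy, the same hex-to-HSL conversion is a few lines of standard-library Python (a sketch; note that `colorsys` returns H, L, S in that order). Feeding it the Light Blue key from the palette posted above reproduces that filament's layer-7 values:

```python
import colorsys

def hex_to_hsl(hex_code):
    """Convert "#RRGGBB" to the (H, S, L) integers used in the palette
    JSON: H in degrees, S and L as percentages."""
    hex_code = hex_code.lstrip("#")
    r, g, b = (int(hex_code[i:i + 2], 16) / 255.0 for i in (0, 2, 4))
    h, l, s = colorsys.rgb_to_hls(r, g, b)
    return round(h * 360), round(s * 100), round(l * 100)

# hex_to_hsl("#004785") -> (208, 100, 26)
```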

LewnWorx commented 7 months ago

So I've got an ever-growing palette in ColorSlurp as I'm cooking these 7-layer swatches (well, if my older filaments, where I'd forgotten to change out the desiccant, would stop breaking in the PTFE long enough to print, that is). Got a Sunny 4-bay dryer on order since last year and it still hasn't shipped, so doing them one at a time is taking forever.

At any rate, in ColorSlurp they look like:

image

But its real strength is those custom formats, which I use in a bunch of ways. We have a digital media server product you have to create your own layouts for, and industrial control products that we roll GUIs for, and they all have some goofy format for getting RGB stuff in and out.

So for each of these I've cooked up the format it wants to see the color in. You can mix and match (like have both HSL and RGB, or whatever, in any given format):

image

Back to the swatches: does it matter if we're using 0.10 or 0.15 or something else on the bases?

It just dawned on me that my first run with the CMYK IIIDMax stuff was 0.10 for both the base layer and the color layers:

image

And the last couple batches were 0.15 / 0.07:

image

I suppose I could update the Fusion model to show both the backing value and the color values on the swatch; currently it just shows the color:

image

And going back to @gaugo87's STLs, I'm still a tad fuzzy on that.

The first STL was a flat one (the white backing), which was 0.10 for the whole thing.

So I have that in my Fusion model as a param (First Layer Height), which yields this:

image

Now I added the color stuff at 0.10 for the first batch:

image image

So that's what I used. However, when I cracked open @gaugo87's separate stair-stepped STL, I found a "double" 0.10 for the first portion of the stair step:

image

Yielding this in your STL for the color stuff:

image

Now, we could interpret that to mean that the first 0.10 is always white (like my earlier one shows), OR that the first COLOR step has that additional 0.10 in it AND the separate STL base, as such:

image

But I interpreted it to be:

image

This raises the question of whether we should be doing these palettes off whichever of those is correct, or off what happens when (like in my and Max's case) we're using different actual values for the first layer, but tighter layer heights on the colors:

image

I.e., would the values in the palette be better if they were based on the color layers' actual print height? Or are they even used?

I guess I'm not super clear on how the values of the palette are being mapped to the values in the source images and how that drives the depth values of the color "pixels".

I did notice last night (trying to toss a quickie Valentine's card together for my wife) that it's not actually using some of the values in the palettes:

```
Last login: Tue Feb 13 21:25:48 on ttys001
marklewno@Marks-Home-CMP-Monterey ~ % /Volumes/homes/Mark\ NAS/Cad\ Work/3d\ Printing/Color\ Lithophanes/Projects/Valentines/Valentines
Palette generation... (165 colors found)
Calculating color distances with the image...
Nb color used=21
Generating previews...
Generating STL files...
Layer[0.0] :PLA - IIID MAX - PLA + - Black, PLA - IIID Max - PLA+ - Light Blue, PLA - IIID Max - PLA + - Yellow, PLA - IIID Max - PLA+ - Fushia, PLA - CC3D - PLA Max - White
GENERATION COMPLETE ! (18659 ms)

marklewno@Marks-Home-CMP-Monterey ~ % /Volumes/homes/Mark\ NAS/Cad\ Work/3d\ Printing/Color\ Lithophanes/Projects/Valentines/Valentines
Palette generation... (165 colors found)
Calculating color distances with the image...
Nb color used=22
Generating previews...
Generating STL files...
Layer[0.0] :PLA - IIID MAX - PLA + - Black, PLA - IIID Max - PLA+ - Light Blue, PLA - IIID Max - PLA + - Yellow, PLA - IIID Max - PLA+ - Fushia, PLA - CC3D - PLA Max - White
GENERATION COMPLETE ! (20111 ms)

marklewno@Marks-Home-CMP-Monterey ~ % /Volumes/homes/Mark\ NAS/Cad\ Work/3d\ Printing/Color\ Lithophanes/Projects/Valentines/Valentines
Palette generation... (165 colors found)
Calculating color distances with the image...
Nb color used=25
Generating previews...
Generating STL files...
Layer[0.0] :PLA - IIID MAX - PLA + - Black, PLA - IIID Max - PLA+ - Light Blue, PLA - IIID Max - PLA + - Yellow, PLA - IIID Max - PLA+ - Fushia, PLA - CC3D - PLA Max - White
GENERATION COMPLETE ! (21464 ms)

marklewno@Marks-Home-CMP-Monterey ~ % /Volumes/homes/Mark\ NAS/Cad\ Work/3d\ Printing/Color\ Lithophanes/Projects/Valentines/Valentines
Palette generation... (239 colors found)
Calculating color distances with the image...
Nb color used=22
Generating previews...
Generating STL files...
Layer[0.0] :PLA - IIID MAX - PLA + - Black, PLA - IIID Max - PLA+ - Light Blue, PLA - IIID Max - PLA + - Yellow, PLA - IIID Max - PLA+ - Fushia, PLA - CC3D - PLA Max - White
GENERATION COMPLETE ! (24346 ms)

marklewno@Marks-Home-CMP-Monterey ~ % /Volumes/homes/Mark\ NAS/Cad\ Work/3d\ Printing/Color\ Lithophanes/Projects/Valentines/Valentines
Palette generation... (239 colors found)
Calculating color distances with the image...
Nb color used=32
Generating previews...
Generating STL files...
Layer[0.0] :PLA - IIID MAX - PLA + - Black, PLA - IIID Max - PLA+ - Light Blue, PLA - IIID Max - PLA + - Yellow, PLA - IIID Max - PLA+ - Fushia, PLA - CC3D - PLA Max - White
```

Given how well it did on the "all over the map" Taylor pic, I'd have figured something without a lot of tonal range would have come across pretty well:

Iteration 1: Source:

image

Output:

image

Iteration 2: Source:

image

Output:

image

Now I also noted in your sample palette, you had a lot of filaments in there that weren't "active".

I'm taking that to mean you can store 'em all in a common file, and turn the actives on and off based on the colors you'll actually use in the print.

If I am correct on that, my next assumption was I could "tilt" the palette for a given print to utilize filaments that are more predominantly used in the source image. Like in this case I have a bunch of different pinks, reds, maroons, etc. filament-wise, and therefore I "COULD" load the palette to use those. (My main reason for this is that on that card run it didn't even use any of the cyan from my IIIDMax Light Blue; there wasn't even an STL in the output for that color at all.)

So still trying to get my head around this stuff to figure out best practices.

FWIW, I picked up a HueForge license and... whoa. I can't even begin to get my head around that. AFAICT, everything there is based on additive layers driven solely by luminance, so it's not really (again, AFAICT) doing a color separation. Played with it for a couple hours and couldn't get anything even vaguely close to what I was feeding it.

Looking at it, most of the examples I've seen are either source images with a very limited color range, or highly stylized stuff where the output is very limited color-range-wise as well. Lots of stuff looking like the duotone and tritone prints of days gone by.

gaugo87 commented 7 months ago

The output from ColorSlurp is amazing. I was converting each hex code to HSL using a converter website and then editing each entry of the .json in Notepad++. A true pain. Not having coding experience, other than for industrial PLCs, your post inspired me to create a poor man's substitute for elegant code in Excel. I found some VBA to convert hex to RGB, then RGB to HSL. Simple concatenations generate text for the JSON. Bulky, but effective. Now I can enter hex values from the colorimeter and generate the code automatically. Thanks for the inspiration.
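For anyone wanting the same hex → HSL conversion outside of Excel/VBA, here is a minimal Python equivalent using only the standard library (the rounding convention is mine; other converters may round differently):

```python
import colorsys

def hex_to_hsl(hexcode):
    """Convert a #RRGGBB hex code to (H in degrees, S%, L%)."""
    hexcode = hexcode.lstrip("#")
    r, g, b = (int(hexcode[i:i + 2], 16) / 255 for i in (0, 2, 4))
    # colorsys returns (hue, lightness, saturation), all in 0..1
    h, l, s = colorsys.rgb_to_hls(r, g, b)
    return round(h * 360), round(s * 100), round(l * 100)

print(hex_to_hsl("#0086D6"))  # -> (202, 100, 42)
```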

If you want, I can modify the code to accept either a hexadecimal value or an HSL value for each layer in the JSON. I chose HSL because it seemed most relevant to me, but honestly, if you find it more convenient to use hexadecimal, that's fine with me.

gaugo87 commented 7 months ago

@LewnWorx ,

You're overthinking it with this 0.1mm base.

It's just there to enable ironing from the first color layer, and during my tests I noticed its impact on the final result was negligible. So, if it bothers you, don't use it. Just print layers of 0.07, 0.14, 0.21, etc...

Second piece of advice: I suggest not calibrating white and black, and leaving the values I've set. I think your results will be better that way.

Now I also noted in your sample palette, you had a lot of filaments in there that weren't "active". I'm taking that as you can store em all in a common file, and turn on and off the actives based on the colors you'll actually use in the print. If I am correct on that my next assumption was I could "tilt" the palette for a given print to utilize filaments that were more predominantly used in the source image. Like in this case I have a bunch of different pinks, reds, maroons etc filament wise and therefor I "COULD" load the palette to use those (and my main reason for this is on that card run it didn't even use any of the cyan stuff from my IIIDMax light blue, there wasn't even a STL in the output for that color at all. So still trying to get my head around this stuff to figure out best practices.

That's exactly it; it allows me to selectively enable or disable filaments depending on the images.

gaugo87 commented 7 months ago

@LewnWorx,

One last thing. Avoid images with color gradients (like your image from iteration 2), especially when printing with 4 colors. The results will often be disappointing.

MadMax389 commented 7 months ago

If you want, I can modify the code to accept either a hexadecimal value or an HSL value for each layer in the JSON. I chose HSL because it seemed most relevant to me, but honestly, if you find it more convenient to use hexadecimal, that's fine with me.

@gaugo87 That would be fantastic. For me, it would be much easier to create profiles. I have been creating a lot of them trying different color and filament combinations.

One last thing. Avoid images with color gradients (like your image from iteration 2), especially in 4 colors. The results will often be disappointing.

I definitely find that images with contrasting colors next to each other produce the best lithophanes. Skin tones are really difficult with 4 and 5 color profiles.

gaugo87 commented 6 months ago

Hi @MadMax389,

@gaugo87 That would be fantastic. For me, it would be much easier to create profiles. I have been creating a lot of them trying different color and filament combinations.

Done in version 0.3.0:


  "#0086D6":  
  {  
    "name": "Cyan[PLA Basic]",  
    "active": true,  
    "layers": {  
    "5": {  
      "hexcode": "#018BE6"  
    },  
    "4": {  
      "hexcode": "#059EEE"  
    },  
    "3": {  
      "hexcode": "#1EAEF5"  
    },  
    "2": {  
      "hexcode": "#49C4FF"  
    },  
    "1": {  
      "hexcode": "#92D6FD"  
    }}  
  },  
...
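Side note: a palette file in this shape is easy to sanity-check with a few lines of standard-library Python, e.g. to list which filaments are currently active. The snippet below is just an illustration using the field names shown above, with a made-up second entry:

```python
import json

# A tiny two-entry palette in the format shown above (the Red entry is invented).
palette_json = """
{
  "#0086D6": {
    "name": "Cyan[PLA Basic]",
    "active": true,
    "layers": {
      "5": {"hexcode": "#018BE6"},
      "1": {"hexcode": "#92D6FD"}
    }
  },
  "#FF0000": {"name": "Red[PLA Basic]", "active": false, "layers": {}}
}
"""

palette = json.loads(palette_json)

# List only the filaments that are switched on for this print.
active = [entry["name"] for entry in palette.values() if entry["active"]]
print(active)  # -> ['Cyan[PLA Basic]']
```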
MadMax389 commented 6 months ago

@gaugo87 Thank you! I created a simple spreadsheet where I can enter the HEX value and see the expected color. It then generates the line of code for that color. I can create profiles with any color combination in a few minutes. Much appreciated.

image