Skunkynator closed this issue 2 years ago
If this issue gets accepted, I'd like to implement the config file argument and parsing. I'm also willing to implement other parts if no one else wants to.
What would loading a model from the config file look like? What's the structure? How would you specify which shaders belong to which model, and what about coordinates and scale? Could you maybe make a small mockup config? Otherwise, feel free to implement it.
I also thought about having the ability to set an image in the background instead of, or behind, the shader.
Here is a small mockup config using the TOML format:
[arguments]
speed = 1.2
quality = 0.8 # maybe allow values higher than 1 for anti-aliasing
mode = "root"
opacity = 0.9 #not used in root mode
[fragment]
path = "./relative/to/the/conf/file.fragment"
[[fragment.textures]]
name = "const_var_name"
path = "path/to/image.png"
[[fragment.textures]]
name = "another_img"
path = "yep.png"
[[models]]
path = "/path/can/also/be/absolute.obj"
vertex_path = "shader.vert"
fragment_path = "shader.frag" #if not specified, use fallback shader for either
position = [1.5, 0, -13.2] # "camera" points towards -Z by default
perspective = true #specifies if using perspective or orthographic projection
[[models.textures]]
# same format as fragment.textures
[[global_textures]]
#same as fragment.textures but given to every shader
What about multiple monitors?
Okay, I have an idea for that now: we remove the arguments section of the above example and instead add a second file with this format:
[[Instances]]
speed = 1
quality = 0.5
mode = "window"
opacity = 0.5
displays = ["DP-2","DP-1"]
wallpaper = "wallpaper.conf"
[[Instances]]
displays = ["HDMI-1"]
wallpaper = "examples/wallpaper.conf"
So the first example file is more for the effect creators, whereas the second is more for the ones using the effects.
If there are any additional improvements or ideas, I'd appreciate it if you let me know about them.
It'd be incredible if there was a way to pass arbitrary live data to a shader. For example:
That data (after minimal preprocessing) would be passed into a buffer parameter for the shader, similar to how the current mouse position is provided from the outside. It could also be useful to fill an array buffer with a limited history of values.
The possibilities opened by this are endless: out of the box a shader could react to audio volume (like some other apps allow), but also to things like CPU load or RAM usage. Similar to conky, but the rendering is done with shaders.
I have no experience with this sort of thing though, so I'd be interested to hear the experts' opinion.
It'd be incredible if there was a way to pass arbitrary live data to a shader
Yes, I also want to implement a way to pass data like audio volume, CPU load, etc. to the shaders. I'm not really an expert with shaders and OpenGL, and I don't have a lot of time right now, but I'm sure such a possibility could be really useful. I'll have to think about how to implement it in a way that lets someone easily pass custom values fetched from their own script to the shader, with "Show" as a middleman.
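One possible shape for that middleman, sketched under assumptions (the line-per-value stdout protocol, the uniform name, and the history length are all hypothetical, not anything the project has committed to): a user script prints one float per line, and the tool reads its stdout and forwards each value to a shader uniform, keeping a limited history as suggested above. Here the "user script" is a tiny inline child process emitting fake load samples:

```python
import subprocess
import sys

# Spawn a stand-in for the user's data script. In practice this could be
# anything that prints one float per line (audio volume, CPU load, ...).
child = subprocess.Popen(
    [sys.executable, "-c", "print(0.25); print(0.75)"],
    stdout=subprocess.PIPE,
    text=True,
)

HISTORY_LEN = 64  # limited history of values, for an array buffer
history = []

for line in child.stdout:
    value = float(line)
    history.append(value)
    history = history[-HISTORY_LEN:]
    # In the real tool, this is where the value would be uploaded to the
    # shader, e.g. via glUniform1f for the latest value or a buffer
    # upload for the whole history.

child.wait()
```

Reading stdout line by line keeps the user-facing side trivial: any language that can print numbers can feed the shader, and the tool never needs to know where the data comes from.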
Will be replaced with a script system and layers (multiple shaders, textures, and videos above each other); see the python branch.
Add support for config files that have arguments pre-specified and can point to textures/3D models to also load and draw.