There are not many models that support any-to-any. Currently the best seems to be Qwen3-Omni, but the audio quality is not great and it is not supported by llama.cpp: https://github.com/ggml-org/llama.cpp/issues/16186
The --rotate option (normal, inverted, left, right) does not work, but you can use the transform option to achieve the same effect.
To create the transformation matrix, you can use something like: https://angrytools.com/css-generator/transform/
For translateXY, enter half the screen resolution.
Don't copy the generated CSS code, though: it has the numbers in the wrong order (CSS matrix() is column-major). Just type out the matrix row-wise.
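In case it helps, here is a small Python sketch of what that recipe produces. It is purely illustrative: it assumes the transform option expects a 3x3 affine matrix entered row by row, and the 90 degree angle and 1920x1080 resolution are just example values.

    import math

    def screen_rotation_matrix(angle_deg, width, height):
        # Rotation part of the matrix, with the translateXY values
        # set to half the screen resolution as described above.
        a = math.radians(angle_deg)
        tx, ty = width / 2, height / 2
        return [
            [math.cos(a), -math.sin(a), tx],
            [math.sin(a),  math.cos(a), ty],
            [0.0,          0.0,         1.0],
        ]

    # Example: 90 degree rotation on a 1920x1080 screen, printed
    # row-wise so the numbers can be typed out in the right order.
    for row in screen_rotation_matrix(90, 1920, 1080):
        print(",".join(f"{round(v, 6):g}" for v in row))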
Thanks for the suggestion. I tried it, and the diff view is very good.
The setup was not exactly easy for my local models, but once I had it set up, it was really fast.
The biggest problem with the tool is that the open-source models are not that good: I tested whether it could fix a bug in my code, and it only made the bug worse.
On a more positive note, at least you do not need to copy all the text over to another window, and it generates boilerplate code nearly flawlessly every time.
It works ok for the most part.
The problem I have with it is that inline completion is more annoying than helpful, because the AI only sees the last few lines that you wrote and therefore does not know the larger context of the project.
I also found this project; it looks promising.
Has anyone tested it?
Can you separate the server from the client?