Msty supports a wide range of GPUs for faster inference. Check if your GPU is supported by Msty. Here's how you can find the API keys for various online AI services.
Msty supports AMD ROCm on Windows out of the box, and we even have a dedicated installer for it. However, sometimes things can go wrong and your GPU might not be supported or detected at all.
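If you are unsure whether the card is even visible to the operating system, a quick check with the vendor's own command-line tools can help narrow things down before digging into Msty itself. This is only a diagnostic sketch: it assumes `nvidia-smi` (ships with NVIDIA drivers) or `rocm-smi` (Linux ROCm installs) is available on your PATH, and neither tool is part of Msty.

```python
# Diagnostic sketch: confirm the OS-level driver tools can see a GPU before
# troubleshooting Msty's detection. nvidia-smi ships with NVIDIA drivers;
# rocm-smi is available on Linux ROCm installs. Neither command is part of Msty.
import shutil
import subprocess

def probe(tool: str) -> None:
    """Run a vendor GPU tool if it is on PATH and print its output."""
    path = shutil.which(tool)
    if path is None:
        print(f"{tool}: not found on PATH")
        return
    result = subprocess.run([tool], capture_output=True, text=True)
    print(f"--- {tool} ---")
    print(result.stdout or result.stderr)

for tool in ("nvidia-smi", "rocm-smi"):
    probe(tool)
```

If the vendor tool cannot see the card at all, the issue is at the driver level rather than in Msty.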
RAG (Retrieval-Augmented Generation) is the technology that powers Knowledge Stacks in Msty. Msty lets you download a wide variety of models to use offline with Local AI. You can choose to install any model from Ollama or import supported GGUF model files from Hugging Face, directly within Msty. Any model you have in Msty can be used by changing the model names in the config, as shown in the sketch below.
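As an illustration, here is a minimal sketch of calling a model you have installed in Msty purely by its name. It assumes the Local AI service exposes an Ollama-compatible HTTP API on localhost port 10000 (check your Local AI settings for the actual address and port), and the model name shown is just a placeholder for one you actually have installed.

```python
# Minimal sketch: call a locally installed model by name over HTTP.
# Assumes Msty's Local AI service exposes an Ollama-compatible API on
# localhost:10000 -- check the Local AI settings for the real address/port,
# and swap "llama3.1:8b" for a model name you actually have installed.
import json
import urllib.request

payload = {
    "model": "llama3.1:8b",  # the model name is the only thing you change
    "prompt": "In one sentence, what is a Knowledge Stack?",
    "stream": False,
}
req = urllib.request.Request(
    "http://localhost:10000/api/generate",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    print(json.loads(resp.read())["response"])
```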
You can easily toggle this feature on or off as needed. Msty makes it easy to make your Local AI service available on your network. You can enable network access to your Local AI service by following a few simple steps. If you're looking for the Msty Studio documentation instead, you can find it here.
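Once network access is enabled, other devices on your network can talk to the service directly. The sketch below assumes an Ollama-compatible endpoint and uses placeholder values for the host IP and port; substitute whatever address Msty reports for your machine.

```python
# Minimal sketch: once network access is enabled, other devices on the LAN can
# reach the Local AI service at the host machine's IP address. The IP and port
# below are placeholders -- substitute the address Msty reports for your machine.
import json
import urllib.request

MSTY_HOST = "http://192.168.1.50:10000"  # hypothetical LAN address of the Msty machine

# /api/tags is the Ollama-compatible endpoint that lists installed models.
with urllib.request.urlopen(f"{MSTY_HOST}/api/tags") as resp:
    models = json.loads(resp.read())["models"]

for model in models:
    print(model["name"])
```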
Go to Msty Studio docs → Learn more about Msty Studio at msty.ai. API keys are required to access the services provided by online AI providers.
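After generating a key in a provider's dashboard, it can be useful to confirm the key works before adding it to Msty. The sketch below checks a key against OpenAI's models endpoint as one example; it assumes the key is stored in an OPENAI_API_KEY environment variable, and other providers use different endpoints and key formats.

```python
# Quick sanity check that an API key works before pasting it into Msty.
# OpenAI is used here only as an example provider; other services have their
# own base URLs and key formats. The key is read from an environment variable
# so it never has to be hard-coded.
import json
import os
import urllib.request

api_key = os.environ["OPENAI_API_KEY"]
req = urllib.request.Request(
    "https://api.openai.com/v1/models",
    headers={"Authorization": f"Bearer {api_key}"},
)
with urllib.request.urlopen(req) as resp:
    data = json.loads(resp.read())

# A successful response lists the models the key can access.
print(f"Key is valid; {len(data['data'])} models available.")
```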