16 changes: 16 additions & 0 deletions content/en/docs/refguide/installation/system-requirements.md
@@ -316,6 +316,22 @@ Depending on your app's complexity, these minimum hardware requirements might no

Developing native mobile apps with Mendix comes with special requirements explained in [Native App Prerequisites and Troubleshooting](/refguide/mobile/getting-started-with-mobile/prerequisites/).

## LLM Providers

Mendix supports connectivity to a broad set of models. Maia is standardized on Claude Sonnet 3.7. It is possible to connect to the following models:
Comment on lines +319 to +321
Collaborator

This seems to conflate two different things: Maia, and LLMs in Mendix apps. It implies you can use all these models with Maia, but I don't think that is under user control?

Contributor Author

Could you expand on 'LLMs in Mendix apps'? It would refer to LLMs that Studio Pro can work with, not LLMs in apps, but perhaps I'm misinterpreting your feedback.
As for user control: as mentioned in the PR comment, it only exists on the Private Platform; on the public platform the model is supplied to you. We could make that part more explicit.

Collaborator

Ah, so you do mean that you can use these for Maia. Then I think we need to be very clear that this is not a choice for most customers, only for PMP.

Collaborator

I don't think this belongs in System Requirements at all. It is not a system requirement for running Studio Pro for the vast majority of people; most people will not need to install an LLM or decide which LLM Maia should be using.
The only requirement about LLMs for Maia is that they are not blocked by a firewall, and this is mentioned here: http://docs.mendix.com/refguide/mendix-ai-assistance/#maia-network-requirement.
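A basic way to check that a firewall is not blocking an endpoint is a plain TCP connection test. This is an illustrative sketch only; the hostname below is a placeholder, not the actual Maia endpoint (consult the network-requirement page linked above for the real hosts):

```python
import socket


def is_reachable(host: str, port: int = 443, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False


# Placeholder host; substitute the endpoint(s) from the Maia network requirement docs.
print(is_reachable("example.invalid"))
```

A `False` result for a real endpoint suggests a firewall or proxy is blocking the connection.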

Contributor Author

The fact that we impose an LLM on our public platform is a limitation in technical capabilities that will likely change. Not documenting this because choice is limited also does not seem like the right direction.
It is also not a PMP feature; it is a Studio Pro + Maia capability. Regardless, I fully agree that the current request needs to be updated to clearly reflect the current state before it can be released; that is on my todo list.

Collaborator

I don't agree that it is a limitation. It is a design decision.
In the same way, we don't have to tell customers that some UX features use ETO and others use Web components, or that we use particular libraries when coding C#. It's just the way they are written.
Customers don't have a choice about which UX management tool they use to see the UX. In the same way, they don't have any choice about which LLM Maia uses, what resources we use to train Maia, or how much weight we give different resources in helping Maia reach the right answer. If we find in future that a different LLM model produces better results for Maia, then we should be able to change that at the back end, in the same way as we move from ETO to Web.

This is, of course, different from deciding which LLM you will implement when using genai resources in your app.

It may also be different when it comes to having to choose and pay for your own LLM when running Maia on PMP.

Collaborator

(Per the PMP team, BYO Maia will be available for PMP 2.6 and newer.)

@grootjans we have a PMP requirements page - what do you think about putting the information there for the time being? Once this capability is enabled for the public platform, we can add the info there and only refer to it from the PMP page (in the same manner as the System Requirements link here) but while it only applies to PMP, I'd keep it within the PMP scope.


* Anthropic
    * Claude Sonnet 3.7
* [AWS Bedrock](https://docs.aws.amazon.com/bedrock/latest/userguide/models-regions.html)
    * Claude Sonnet 3.7
    * Claude Sonnet 4.0
    * Claude Sonnet 4.5
    * GPT-OSS-120B
* [Azure](https://learn.microsoft.com/en-us/azure/ai-foundry/foundry-models/concepts/models-sold-directly-by-azure)
    * o3-mini
* OpenAI
    * GPT-5-Mini
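The provider/model matrix above can be captured as a small lookup table, for example to validate a configured model name. This is purely illustrative; the names simply mirror the list above and are not an actual Mendix API:

```python
# Illustrative provider-to-model mapping, mirroring the list above.
SUPPORTED_MODELS = {
    "Anthropic": ["Claude Sonnet 3.7"],
    "AWS Bedrock": [
        "Claude Sonnet 3.7",
        "Claude Sonnet 4.0",
        "Claude Sonnet 4.5",
        "GPT-OSS-120B",
    ],
    "Azure": ["o3-mini"],
    "OpenAI": ["GPT-5-Mini"],
}


def providers_for(model: str) -> list[str]:
    """Return every provider that lists the given model."""
    return [p for p, models in SUPPORTED_MODELS.items() if model in models]


print(providers_for("Claude Sonnet 3.7"))  # ['Anthropic', 'AWS Bedrock']
```

Note that the same model can be available through more than one provider, so a connectivity choice involves both the provider and the model.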

## MxBuild {#mxbuild}

MxBuild is a Windows, Linux, and macOS command-line tool that can be used to build a Mendix Deployment Package. For more information, see [MxBuild](/refguide/mxbuild/).