Update system-requirements.md #10631
base: development
Conversation
Update to the system requirements to reflect the model support of Maia. The model is supplied for you on the public platform, but configurable in PMP. We will link to the PMP docs when they are available.
> ## LLM Providers
>
> Mendix supports connectivity to a broad set of models. Maia has been standardized on Claude Sonnet 3.7.
> It is possible to connect to the following models:
This seems to conflate two different things: Maia, and LLMs in Mendix apps. It implies you can use all these models with Maia, but I don't think that is under user control?
Could you expand on 'LLMs in Mendix apps'? That would refer to LLMs that Studio Pro can work with, not LLMs in apps, but perhaps I'm misinterpreting your feedback.
As for user control: as mentioned in the PR description, it is only available in the Private Platform; on the public platform, the model is supplied for you. We could make that part more explicit.
Ah, so you do mean that you can use these for Maia. Then I think we need to be very clear that this is not a choice for most customers, only for PMP.
I don't think this belongs in System Requirements at all. It is not a system requirement for running Studio Pro for the vast majority of people - most people will not need to install an LLM or decide which LLM Maia should be using.
The only requirement about LLMs for Maia is that they not be blocked by a firewall and this is mentioned here: http://docs.mendix.com/refguide/mendix-ai-assistance/#maia-network-requirement.
The fact that we impose an LLM for our public platform is a limitation in technical capabilities that will likely change. Not documenting this because choice is limited also does not seem like the right direction.
It is also not a PMP feature; it is a Studio Pro + Maia capability. Regardless, I fully agree that the current request needs to be updated to clearly reflect the current state before it can be released; that is on my to-do list.
I don't agree that it is a limitation; it is a design choice.
In the same way, we don't have to tell customers that some UX features use ETO and others use Web components, or that we use particular libraries when coding in C#. It's just the way they are written.
Customers don't have a choice about which UX management tool they use to see the UX. In the same way, they don't have any choice about which LLM Maia uses, what resources we use to train Maia, or how much weight we give different resources in helping Maia reach the right answer. If we find in future that a different LLM model produces better results for Maia, then we should be able to change that at the back end, in the same way as we move from ETO to Web.
This is, of course, different from deciding which LLM you will implement when using genai resources in your app.
It may also be different when it comes to having to choose and pay for your own LLM when running Maia on PMP.
(Per the PMP team, BYO Maia will be available for PMP 2.6 and newer.)
@grootjans we have a PMP requirements page - what do you think about putting the information there for the time being? Once this capability is enabled for the public platform, we can add the info there and only refer to it from the PMP page (in the same manner as the System Requirements link here), but while it only applies to PMP, I'd keep it within the PMP scope.
MarkvanMents left a comment
@katarzyna-koltun-mx
I don't think this information belongs in System Requirements at all. It is only relevant for people who are running Maia on PMP.
I would suggest that this goes with the documentation for running Maia on PMP, but I don't believe this is written yet. So I would suggest planning that documentation and coordinating with Robbert Jan so he can ensure that the appropriate information is added.
I'm setting this to draft as I don't think this should be merged in its current state.
Still to be reviewed by the AI group as well.