Create command to download AI model via Hugging Face #26
Conversation
| GitGuardian id | GitGuardian status | Secret | Commit | Filename | |
|---|---|---|---|---|---|
| - | - | Hugging Face user access token | b0f1a31 | .envs/.local/.django | View secret |
🛠 Guidelines to remediate hardcoded secrets
- Understand the implications of revoking this secret by investigating where it is used in your code.
- Replace and store your secret safely. Learn here the best practices.
- Revoke and rotate this secret.
- If possible, rewrite git history. Rewriting git history is not a trivial act. You might completely break other contributing developers' workflow and you risk accidentally deleting legitimate data.
To avoid such incidents in the future, consider:
- following these best practices for managing and storing secrets, including API keys and other credentials
- installing secret detection on pre-commit to catch secrets before they leave your machine and ease remediation.
🦉 GitGuardian detects secrets in your source code to help developers and security teams secure the modern development process. You are seeing this because you or someone else with access to this repository has authorized GitGuardian to scan your pull request.
Pull Request Overview
This PR creates a Django management command for downloading AI models from Hugging Face Hub, replacing the existing standalone download script with a more robust Django-integrated solution. The changes also migrate from wagtail-modeladmin to Wagtail's built-in snippet system for the Reference model.
- Creates a new Django management command `download_model` for downloading models from Hugging Face
- Migrates Reference admin from wagtail-modeladmin to Wagtail snippets
- Moves model configuration settings to Django settings and adds `HF_TOKEN` environment variable
Reviewed Changes
Copilot reviewed 6 out of 8 changed files in this pull request and generated 3 comments.
| File | Description |
|---|---|
| reference/management/commands/download_model.py | New Django command for downloading HF models with proper error handling |
| reference/wagtail_hooks.py | Migrates from wagtail-modeladmin to snippet system |
| reference/models.py | Adds translation label for inline panel |
| llama3/download_model.py | Removes old standalone download script |
| config/settings/base.py | Moves model configuration to proper settings section |
| .envs/.local/.django | Adds HF_TOKEN environment variable |
```python
parser.add_argument(
    "--dir",
    type=str,
    default="llama3/llama-3.2",
```
The hardcoded default path `'llama3/llama-3.2'` should reference the Django setting `LLAMA_MODEL_DIR` to maintain consistency with the configuration defined in `settings/base.py`.
```python
parser.add_argument(
    "--repo",
    type=str,
    default="hugging-quants/Llama-3.2-3B-Instruct-Q4_K_M-GGUF",
```
The hardcoded repository ID should be made configurable through Django settings to avoid duplicating model configuration across the codebase.
```python
parser.add_argument(
    "--filename",
    type=str,
    default="llama-3.2-3b-instruct-q4_k_m.gguf",
```
The hardcoded filename should reference the Django setting `MODEL_LLAMA` to maintain consistency with the configuration defined in `settings/base.py`.
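The three Copilot comments above amount to the same change: pull the argument defaults from Django settings instead of hardcoding them. A minimal sketch of the idea follows. The setting names `LLAMA_MODEL_DIR` and `MODEL_LLAMA` come from the comments themselves; `LLAMA_REPO_ID` is a hypothetical name for the repository setting (the PR does not name one). Module-level constants stand in for `django.conf.settings` so the snippet runs without a Django project:

```python
# Sketch: argument defaults drawn from settings instead of hardcoded literals.
# In the real command these values would come from django.conf.settings;
# constants stand in here so the snippet is self-contained.
from argparse import ArgumentParser

LLAMA_MODEL_DIR = "llama3/llama-3.2"                                # settings.LLAMA_MODEL_DIR
LLAMA_REPO_ID = "hugging-quants/Llama-3.2-3B-Instruct-Q4_K_M-GGUF"  # hypothetical settings.LLAMA_REPO_ID
MODEL_LLAMA = "llama-3.2-3b-instruct-q4_k_m.gguf"                   # settings.MODEL_LLAMA


def add_arguments(parser):
    """Mirrors the management command's add_arguments, with settings-backed defaults."""
    parser.add_argument("--dir", type=str, default=LLAMA_MODEL_DIR)
    parser.add_argument("--repo", type=str, default=LLAMA_REPO_ID)
    parser.add_argument("--filename", type=str, default=MODEL_LLAMA)


parser = ArgumentParser()
add_arguments(parser)
args = parser.parse_args([])  # no CLI overrides: the settings defaults apply
print(args.filename)  # → llama-3.2-3b-instruct-q4_k_m.gguf
```

With this shape, changing the model in `settings/base.py` updates the command's defaults in one place.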
@eduranm @pitangainnovare please review / test
@robertatakenaka @samuelveigarangel @eduranm
I tested the command and it worked well. I suggest that the command's introductory text (the one shown with `--help`) include a description of what must be passed as parameters.
For example, I tried downloading another model with something like:
`python manage.py download_model --repo hugging-quants/Llama-3.2-3B-Instruct-Q8_0-GGUF --filename llama-3.2-3b-instruct-q8_0.gguf`
If the filename is omitted, the command tries to download the default filename (the q4_k model) and fails (not found).
So I suggest flagging this kind of detail in the `help` variable (`help = "Download the model from HuggingFace"`), so users remember to pass both the repo_id and the filename, which depend on each other.
I also like the Copilot suggestions, which recommend replacing the `default` fields with the values defined in settings.
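The reviewer's suggestion could be addressed by expanding the command's `help` string to document the repo/filename dependency. A hedged sketch of such a string (the wording is illustrative, not taken from the PR):

```python
# Illustrative help text for the management command. The wording is an
# assumption, not from the PR: it documents the failure mode reported
# above, where changing --repo without --filename yields a "not found".
help = (
    "Download a model from the Hugging Face Hub. "
    "When downloading a non-default model, pass BOTH --repo and --filename: "
    "the filename must exist inside the given repo, otherwise the download "
    "fails with 'not found'."
)
print(help)
```

In a Django `BaseCommand`, this string would be set as the class-level `help` attribute and shown by `python manage.py download_model --help`.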
@robertatakenaka @samuelveigarangel something outside the scope of this PR concerns the Reference model. When trying to create a record, upon pressing "Save", I got the following error:
