Skill
Where to find AI models to test
The list below gathers all the sources available for free download.
The GitHub link gives access to the models.json metadata file.
Here are the direct links to the models if you want to download them:
- https://huggingface.co/nomic-ai/gpt4all-falcon-ggml/resolve/main/ggml-model-gpt4all-falcon-q4_0.bin
- https://huggingface.co/TheBloke/Nous-Hermes-13B-GGML/resolve/main/nous-hermes-13b.ggmlv3.q4_0.bin
- https://huggingface.co/TheBloke/GPT4All-13B-snoozy-GGML/resolve/main/GPT4All-13B-snoozy.ggmlv3.q4_0.bin
- https://huggingface.co/TheBloke/orca_mini_7B-GGML/resolve/main/orca-mini-7b.ggmlv3.q4_0.bin
- https://huggingface.co/TheBloke/orca_mini_3B-GGML/resolve/main/orca-mini-3b.ggmlv3.q4_0.bin
- https://huggingface.co/TheBloke/orca_mini_13B-GGML/resolve/main/orca-mini-13b.ggmlv3.q4_0.bin
- https://huggingface.co/TheBloke/WizardLM-13B-Uncensored-GGML/resolve/main/wizardLM-13B-Uncensored.ggmlv3.q4_0.bin
- https://huggingface.co/nomic-ai/ggml-replit-code-v1-3b/resolve/main/ggml-replit-code-v1-3b.bin
- https://huggingface.co/TheBloke/Llama-2-7B-Chat-GGML/resolve/main/llama-2-7b-chat.ggmlv3.q4_0.bin
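Before loading a multi-gigabyte GGML file, it is worth verifying the download against the `md5sum` field published in models.json. A minimal sketch in Python (the URL/path in the commented example is just one of the links above; the helper names are ours):

```python
import hashlib
import urllib.request

def download_model(url: str, dest: str) -> None:
    """Stream a model file to disk (these files are several GB)."""
    urllib.request.urlretrieve(url, dest)

def md5_of(path: str, chunk_size: int = 1 << 20) -> str:
    """Compute a file's MD5 checksum in 1 MiB chunks to keep memory flat."""
    h = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Example (compare the result against the entry's "md5sum" field):
# download_model(
#     "https://huggingface.co/TheBloke/Nous-Hermes-13B-GGML/resolve/main/nous-hermes-13b.ggmlv3.q4_0.bin",
#     "nous-hermes-13b.ggmlv3.q4_0.bin",
# )
# assert md5_of("nous-hermes-13b.ggmlv3.q4_0.bin") == "4acc146dd43eb02845c233c29289c7c5"
```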
https://github.com/nomic-ai/gpt4all/blob/main/gpt4all-chat/metadata/models.json
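The catalogue can also be fetched and filtered programmatically. A sketch in Python; the raw.githubusercontent.com URL is inferred from the blob link above and may change if the repository layout moves, and `models_for_ram` is our own helper:

```python
import json
import urllib.request

# Raw counterpart of the GitHub blob link above (inferred; adjust if the repo moves).
RAW_URL = ("https://raw.githubusercontent.com/nomic-ai/gpt4all/"
           "main/gpt4all-chat/metadata/models.json")

def fetch_models(url: str = RAW_URL) -> list:
    """Download and parse the model catalogue."""
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)

def models_for_ram(models: list, ram_gb: int) -> list:
    """Names of models whose declared 'ramrequired' (in GB) fits the budget."""
    return [m["name"] for m in models
            if int(m.get("ramrequired", "0")) <= ram_gb]

# Example with inline entries mirroring the catalogue:
# models_for_ram([{"name": "Mini Orca", "ramrequired": "8"},
#                 {"name": "Hermes", "ramrequired": "16"}], 8)  # -> ["Mini Orca"]
```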
[
  {
    "order": "a",
    "md5sum": "e8d47924f433bd561cb5244557147793",
    "name": "Wizard v1.1",
    "filename": "wizardlm-13b-v1.1-superhot-8k.ggmlv3.q4_0.bin",
    "filesize": "7323310848",
    "ramrequired": "16",
    "parameters": "13 billion",
    "quant": "q4_0",
    "type": "LLaMA",
    "systemPrompt": " ",
    "description": "Best overall model",
    "url": "https://huggingface.co/nomic-ai/gpt4all-falcon-ggml/resolve/main/ggml-model-gpt4all-falcon-q4_0.bin",
    "promptTemplate": "### Instruction:\n%1\n### Response:\n"
  },
  {
    "order": "c",
    "md5sum": "4acc146dd43eb02845c233c29289c7c5",
    "name": "Hermes",
    "filename": "nous-hermes-13b.ggmlv3.q4_0.bin",
    "filesize": "8136777088",
    "requires": "2.4.7",
    "ramrequired": "16",
    "parameters": "13 billion",
    "quant": "q4_0",
    "type": "LLaMA",
    "systemPrompt": " ",
    "description": "Extremely good model. Instruction based. Gives long responses. Curated with 300,000 uncensored instructions. Trained by Nous Research. Cannot be used commercially.",
    "url": "https://huggingface.co/TheBloke/Nous-Hermes-13B-GGML/resolve/main/nous-hermes-13b.ggmlv3.q4_0.bin",
    "promptTemplate": "### Instruction:\n%1\n### Response:\n"
  },
  {
    "order": "f",
    "md5sum": "11d9f060ca24575a2c303bdc39952486",
    "name": "Snoozy",
    "filename": "GPT4All-13B-snoozy.ggmlv3.q4_0.bin",
    "filesize": "8136770688",
    "requires": "2.4.7",
    "ramrequired": "16",
    "parameters": "13 billion",
    "quant": "q4_0",
    "type": "LLaMA",
    "systemPrompt": " ",
    "description": "Very good overall model. Instruction based. Based on the same dataset as Groovy. Slower than Groovy, with higher quality responses. Trained by Nomic AI. Cannot be used commercially.",
    "url": "https://huggingface.co/TheBloke/GPT4All-13B-snoozy-GGML/resolve/main/GPT4All-13B-snoozy.ggmlv3.q4_0.bin"
  },
  {
    "order": "h",
    "md5sum": "e64e74375ce9d36a3d0af3db1523fd0a",
    "name": "Mini Orca",
    "filename": "orca-mini-7b.ggmlv3.q4_0.bin",
    "filesize": "3791749248",
    "requires": "2.4.7",
    "ramrequired": "8",
    "parameters": "7 billion",
    "quant": "q4_0",
    "type": "OpenLLaMa",
    "description": "New model with novel dataset. Instruction based. Explain tuned datasets. Orca Research Paper dataset construction approaches. Licensed for commercial use.",
    "url": "https://huggingface.co/TheBloke/orca_mini_7B-GGML/resolve/main/orca-mini-7b.ggmlv3.q4_0.bin",
    "promptTemplate": "### User:\n%1\n### Response:\n",
    "systemPrompt": "### System:\nYou are an AI assistant that follows instruction extremely well. Help as much as you can.\n\n"
  },
  {
    "order": "i",
    "md5sum": "6a087f7f4598fad0bb70e6cb4023645e",
    "name": "Mini Orca (Small)",
    "filename": "orca-mini-3b.ggmlv3.q4_0.bin",
    "filesize": "1928446208",
    "requires": "2.4.7",
    "ramrequired": "4",
    "parameters": "3 billion",
    "quant": "q4_0",
    "type": "OpenLLaMa",
    "description": "Small version of new model with novel dataset. Instruction based. Explain tuned datasets. Orca Research Paper dataset construction approaches. Licensed for commercial use.",
    "url": "https://huggingface.co/TheBloke/orca_mini_3B-GGML/resolve/main/orca-mini-3b.ggmlv3.q4_0.bin",
    "promptTemplate": "### User:\n%1\n### Response:\n",
    "systemPrompt": "### System:\nYou are an AI assistant that follows instruction extremely well. Help as much as you can.\n\n"
  },
  {
    "order": "j",
    "md5sum": "959b7f65b2d12fd1e3ff99e7493c7a3a",
    "name": "Mini Orca (Large)",
    "filename": "orca-mini-13b.ggmlv3.q4_0.bin",
    "filesize": "7323329152",
    "requires": "2.4.7",
    "ramrequired": "16",
    "parameters": "13 billion",
    "quant": "q4_0",
    "type": "OpenLLaMa",
    "description": "Largest version of new model with novel dataset. Instruction based. Explain tuned datasets. Orca Research Paper dataset construction approaches. Licensed for commercial use.",
    "url": "https://huggingface.co/TheBloke/orca_mini_13B-GGML/resolve/main/orca-mini-13b.ggmlv3.q4_0.bin",
    "promptTemplate": "### User:\n%1\n### Response:\n",
    "systemPrompt": "### System:\nYou are an AI assistant that follows instruction extremely well. Help as much as you can.\n\n"
  },
  {
    "order": "r",
    "md5sum": "489d21fd48840dcb31e5f92f453f3a20",
    "name": "Wizard Uncensored",
    "filename": "wizardLM-13B-Uncensored.ggmlv3.q4_0.bin",
    "filesize": "8136777088",
    "requires": "2.4.7",
    "ramrequired": "16",
    "parameters": "13 billion",
    "quant": "q4_0",
    "type": "LLaMA",
    "systemPrompt": " ",
    "description": "Trained on uncensored assistant data and instruction data. Instruction based. Cannot be used commercially.",
    "url": "https://huggingface.co/TheBloke/WizardLM-13B-Uncensored-GGML/resolve/main/wizardLM-13B-Uncensored.ggmlv3.q4_0.bin"
  },
  {
    "order": "s",
    "md5sum": "615890cb571fcaa0f70b2f8d15ef809e",
    "disableGUI": "true",
    "name": "Replit",
    "filename": "ggml-replit-code-v1-3b.bin",
    "filesize": "5202046853",
    "requires": "2.4.7",
    "ramrequired": "4",
    "parameters": "3 billion",
    "quant": "f16",
    "type": "Replit",
    "systemPrompt": " ",
    "promptTemplate": "%1",
    "description": "Trained on subset of the Stack. Code completion based. Licensed for commercial use.",
    "url": "https://huggingface.co/nomic-ai/ggml-replit-code-v1-3b/resolve/main/ggml-replit-code-v1-3b.bin"
  },
  {
    "order": "t",
    "md5sum": "031bb5d5722c08d13e3e8eaf55c37391",
    "disableGUI": "true",
    "name": "Bert",
    "filename": "ggml-all-MiniLM-L6-v2-f16.bin",
    "filesize": "45521167",
    "requires": "2.4.14",
    "ramrequired": "1",
    "parameters": "1 million",
    "quant": "f16",
    "type": "Bert",
    "systemPrompt": " ",
    "description": "Sbert. For embeddings."
  },
  {
    "order": "u",
    "md5sum": "379ee1bab9a7a9c27c2314daa097528e",
    "disableGUI": "true",
    "name": "Starcoder (Small)",
    "filename": "starcoderbase-3b-ggml.bin",
    "filesize": "7503121552",
    "requires": "2.4.14",
    "ramrequired": "8",
    "parameters": "3 billion",
    "quant": "f16",
    "type": "Starcoder",
    "systemPrompt": " ",
    "promptTemplate": "%1",
    "description": "Trained on subset of the Stack. Code completion based."
  },
  {
    "order": "w",
    "md5sum": "f981ab8fbd1ebbe4932ddd667c108ba7",
    "disableGUI": "true",
    "name": "Starcoder",
    "filename": "starcoderbase-7b-ggml.bin",
    "filesize": "17860448016",
    "requires": "2.4.14",
    "ramrequired": "16",
    "parameters": "7 billion",
    "quant": "f16",
    "type": "Starcoder",
    "systemPrompt": " ",
    "promptTemplate": "%1",
    "description": "Trained on subset of the Stack. Code completion based."
  },
  {
    "order": "w",
    "md5sum": "c7ebc61eec1779bddae1f2bcbf2007cc",
    "name": "Llama-2-7B Chat",
    "filename": "llama-2-7b-chat.ggmlv3.q4_0.bin",
    "filesize": "3791725184",
    "requires": "2.4.14",
    "ramrequired": "8",
    "parameters": "7 billion",
    "quant": "q4_0",
    "type": "LLaMA2",
    "description": "New LLaMA2 model from Meta AI. Fine-tuned for dialogue. Static model trained on an offline dataset. RLHF dataset. Licensed for commercial use.",
    "url": "https://huggingface.co/TheBloke/Llama-2-7B-Chat-GGML/resolve/main/llama-2-7b-chat.ggmlv3.q4_0.bin",
    "promptTemplate": "[INST] %1 [/INST] ",
    "systemPrompt": "[INST]<<SYS>>You are a helpful, respectful and honest assistant. Always answer as helpfully as possible, while being safe. Your answers should not include any harmful, unethical, racist, sexist, toxic, dangerous, or illegal content. Please ensure that your responses are socially unbiased and positive in nature. If a question does not make any sense, or is not factually coherent, explain why instead of answering something not correct. If you don't know the answer to a question, please don't share false information.<</SYS>>[/INST] "
  }
]
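In each entry, `promptTemplate` uses `%1` as the placeholder for the user's message, and `systemPrompt` (when non-blank) is prepended to it. A minimal sketch of assembling a prompt from these fields (`build_prompt` is our own helper name):

```python
def build_prompt(entry: dict, user_message: str) -> str:
    """Fill an entry's promptTemplate, prepending its systemPrompt if any.

    '%1' in promptTemplate stands for the user's message; a blank
    systemPrompt (the catalogue uses " ") is treated as absent.
    """
    system = entry.get("systemPrompt", "").strip()
    template = entry.get("promptTemplate", "%1")
    body = template.replace("%1", user_message)
    return system + "\n" + body if system else body

# Example with the Hermes entry's fields:
hermes = {"systemPrompt": " ",
          "promptTemplate": "### Instruction:\n%1\n### Response:\n"}
print(build_prompt(hermes, "Summarize this file."))
# prints:
# ### Instruction:
# Summarize this file.
# ### Response:
```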

