---
tags: OpenAI
---
OpenAI ChatGPT -> ESP32
## prompts
- https://www.promptingguide.ai/
- https://cyb3rward0g.github.io/floki/home/why/#why-the-name-floki
- https://microsoft.github.io/promptflow/how-to-guides/quick-start.html
I want to test whether `floki` can use a local model (served with `ollama`) to generate responses, instead of the hosted OpenAI API.
```
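# Start the Ollama server in the background; models persist in the `ollama` volume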
docker run -d -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama
# Pull (if needed) and run a model interactively
docker exec -it ollama ollama run llama3.2:1b
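# Start a Jupyter base-notebook container on port 8888 for experiments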
docker run -p 8888:8888 jupyter/base-notebook
```
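
With the Ollama container running, a quick sanity check before wiring anything into `floki` is to call Ollama's HTTP API directly. This is just a sketch; it uses the `llama3.2:1b` model pulled above.

```
# Ask the local model for a completion via Ollama's /api/generate endpoint
curl http://localhost:11434/api/generate -d '{
  "model": "llama3.2:1b",
  "prompt": "Say hello from the local model",
  "stream": false
}'
```

Ollama also serves an OpenAI-compatible endpoint at http://localhost:11434/v1, so if `floki` allows overriding the OpenAI base URL and API key, pointing it there should be enough to swap the hosted model for the local one (still to be verified).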