AI usage examples

- Free AI chat
- Software
- Open-source models
- QwenLM

AI usage scenarios
- Google Stitch (produce the prototype) --import into--> Figma (refine the design) --auto-generate UI--> Figma MCP
- Figma -> Account -> Security -> Generate new token -> grant the "File content" and "File metadata" scopes, set to read-only (a quick token check is sketched below)
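A minimal way to confirm the token works before wiring it into Figma MCP, as a hedged sketch: both values below are placeholders, the file key is taken from the oauth2-ui design URL in the next section.

```sh
# FIGMA_TOKEN: personal access token from Figma > Account > Security (placeholder value)
# FILE_KEY: the key segment of the design URL (here the oauth2-ui file below)
export FIGMA_TOKEN="figd_xxxxxxxx"
FILE_KEY="ptjnTLAXiPoMQaczQllrAg"

# Fetch file metadata/structure via the Figma REST API; a JSON response
# means the token and the read-only "File content" scope are working.
curl -s -H "X-Figma-Token: $FIGMA_TOKEN" \
  "https://api.figma.com/v1/files/$FILE_KEY?depth=1" | head -c 300
```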
# AI prompts
# Design mockup
https://www.figma.com/design/ptjnTLAXiPoMQaczQllrAg/oauth2-ui?node-id=0-1&p=f&t=qZy9iHDV0N6sONYb-0
Tell me what designs you can see from the figma page
#
now develop the 3 screens in this ant-design project, just use mock data for now. Make sure the navigation works smoothly.
AI software
aichat
aichat: an all-in-one AI CLI tool featuring Chat-REPL, Shell Assistant, RAG, and AI tools & agents, with access to OpenAI, Claude, Gemini, Ollama, Groq, and more.
```sh
# Rust (cargo)
cargo install aichat

# macOS (Homebrew)
brew install aichat

# Linux: prebuilt release tarball
curl -Lfs -o aichat.tar.gz https://github.com/sigoden/aichat/releases/latest/download/aichat-v0.20.0-x86_64-unknown-linux-musl.tar.gz
tar -zxvf aichat.tar.gz
mv aichat /usr/local/bin/
```
```
$ aichat --help
All-in-one AI CLI Tool

Usage: aichat [OPTIONS] [TEXT]...

Arguments:
  [TEXT]...  Input text

Options:
  -m, --model <MODEL>        Select a LLM model
      --prompt <PROMPT>      Use the system prompt
  -r, --role <ROLE>          Select a role
  -s, --session [<SESSION>]  Start or join a session
      --save-session         Forces the session to be saved
  -a, --agent <AGENT>        Start a agent
  -R, --rag <RAG>            Start a RAG
      --serve [<ADDRESS>]    Serve the LLM API and WebAPP
  -e, --execute              Execute commands in natural language
  -c, --code                 Output code only
  -f, --file <FILE>          Include files with the message
  -S, --no-stream            Turn off stream mode
      --dry-run              Display the message without sending it
      --info                 Display information
      --list-models          List all available chat models
      --list-roles           List all roles
      --list-sessions        List all sessions
      --list-agents          List all agents
      --list-rags            List all RAGs
  -h, --help                 Print help
  -V, --version              Print version
```
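A few usage sketches built from the options above; the model name and serve address are assumptions, not required values:

```sh
# One-shot question with an explicit model (client:model name is an assumption)
aichat -m openai:gpt-4o-mini "Summarize the OAuth2 authorization code flow"

# Shell assistant: describe the command in natural language
aichat -e "find files larger than 100MB under the current directory"

# Code-only output, convenient for piping into a file
aichat -c "a bash one-liner that counts lines in all *.ts files"

# Include a file as context for the question
aichat -f README.md "Write a one-paragraph project description"

# Serve the LLM API and web app locally (address is an assumption)
aichat --serve 127.0.0.1:8000
```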
- Config file: /root/.config/aichat/config.yaml (minimal sketch below)
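A minimal config sketch; the model name and client fields follow aichat's example config but may differ between versions, so treat them as assumptions and check config.example.yaml in the repository:

```sh
mkdir -p ~/.config/aichat
cat > ~/.config/aichat/config.yaml <<'EOF'
# Assumed minimal aichat config -- verify field names against your installed version
model: openai:gpt-4o-mini      # default client:model to use
clients:
  - type: openai
    api_key: sk-xxxx           # replace with a real key
EOF
```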
lobe-chat
```sh
docker run -d -p 3210:3210 \
  -e OPENAI_API_KEY=sk-xxxx \
  -e QWEN_API_KEY=sk-xxxx \
  -e ACCESS_CODE=lobe66 \
  --cpu-shares=512 \
  --name lobe-chat \
  lobehub/lobe-chat
```
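Once the container is up, the web UI should answer on the mapped port; a quick sanity check (sketch):

```sh
# Follow startup logs, then probe the mapped port
docker logs -f lobe-chat                      # Ctrl-C to stop following
curl -sI http://localhost:3210 | head -n 1    # expect an HTTP status line
# Open http://localhost:3210 in a browser and enter the ACCESS_CODE (lobe66)
```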