Live Entry: Choose What To Install

These are the currently supported open-source AI install entrypoints. Copy the command for your platform, then follow the real install flow and enter your invite or worker code when prompted. Use the curl command on macOS/Linux and the PowerShell command on Windows.
Dify
AI application platform with common failures around multi-service setup, Python/Node dependencies, and ports. Best for validating the "skip Docker, still install successfully" promise.

macOS / Linux:
curl -sL https://aimaserver.com/install/dify | bash

Windows (PowerShell):
iex (irm https://aimaserver.com/install/dify)

ComfyUI
Node-based AI image tool with highly variable Python, model, and plugin-node setups.
Best for validating adaptive install decisions instead of fixed scripts.

macOS / Linux:
curl -sL https://aimaserver.com/install/comfyui | bash

Windows (PowerShell):
iex (irm https://aimaserver.com/install/comfyui)

Open WebUI
Popular Ollama frontend that often fails on runtime dependencies, ports, and service boot order.
Good for validating post-install verification instead of trusting exit code 0.

macOS / Linux:
curl -sL https://aimaserver.com/install/open-webui | bash

Windows (PowerShell):
iex (irm https://aimaserver.com/install/open-webui)

OpenClaw
Previously validated case-study flow used to compare multi-project expansion against the original path.
Useful as a regression baseline, so multi-project work does not break the original path.

macOS / Linux:
curl -sL https://aimaserver.com/install/openclaw | bash

Windows (PowerShell):
iex (irm https://aimaserver.com/install/openclaw)
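If you prefer to review an installer before executing it, any of the one-liners above can be split into a download step and an explicit run step. The sketch below shows the pattern; the filename install-dify.sh is arbitrary, and the locally created stand-in script is only there to keep the demo self-contained (it is not the real installer):

```shell
# Instead of piping curl straight to bash, save the script first:
#   curl -sL https://aimaserver.com/install/dify -o install-dify.sh
# For this self-contained demo, a local stand-in script replaces the download:
printf '#!/bin/sh\necho "installer ran"\n' > install-dify.sh

# Review the script (e.g. with `less install-dify.sh`), then run it explicitly:
sh install-dify.sh
```

The same split works for the PowerShell entrypoints: `irm` downloads the script text, so you can assign it to a variable and inspect it before passing it to `iex`.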