llama serve
Control LLAMA server processes (for example: interactive status website).
usage: llama serve [-V] [--help-env] [-h] [{gui,jupyter}] [subargs [subargs ...]]
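For example, a minimal invocation that serves the interactive status pages for the current run (a sketch of typical usage; output is not shown here):

# Serve the interactive status website for events in the current run
llama serve gui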
Named Arguments
- -V, --version
Print the version number and exit.
Default: False
- --help-env
Print a list of environment variables used by LLAMA, including whether they were loaded from the environment, whether fallback defaults were specified, and the descriptive/warning messages shown when they are not set. DO NOT RUN THIS IF RESULTS ARE BEING LOGGED, since it will include any access credentials defined in the environment. (See the example invocations after this list.)
- -h, --help
If a SUBCOMMAND is provided, run --help for that subcommand. If no SUBCOMMAND is provided, print this documentation and exit.
Default: False
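For illustration, here are typical invocations using the named arguments above (a sketch; the exact output depends on your installation):

# Print the LLAMA version number and exit
llama serve -V

# List the environment variables LLAMA reads; avoid this if output is
# being logged, since credentials defined in the environment are printed
llama serve --help-env

# Show help for the gui subcommand instead of this page
llama serve gui --help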
subcommands (call one with ``--help`` for details on each)
- subcommand
Possible choices: gui, jupyter
If a subcommand is provided, ALL remaining arguments will be passed to that command and it will be called.
- subargs
If SUBCOMMAND takes its own positional arguments, provide them here. This includes sub-subcommands (see the sketch at the end of this page).
- subcommand summaries (pick one, use --help for more info):
- gui
Serve pages for events in the current run.
- jupyter
Launch a Jupyter Notebook server.
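As a sketch of how subargs are forwarded, the following call hands everything after the subcommand name to the Jupyter Notebook server; the --no-browser and --port options belong to Jupyter itself, not to llama, and are assumed to pass through unchanged as described above:

# Everything after "jupyter" is forwarded to the Jupyter Notebook server
llama serve jupyter --no-browser --port 8890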