
Dagster CLI

dagster asset

dagster asset [OPTIONS] COMMAND [ARGS]...

wipe

Eliminate asset key indexes from event logs. Warning: Cannot be undone

Usage:

dagster asset wipe --all

dagster asset wipe <unstructured_asset_key_name>

dagster asset wipe <json_string_of_structured_asset_key>

dagster asset wipe [OPTIONS] [KEY]...

Options

--all

Eliminate all asset key indexes

Arguments

KEY

Optional argument(s)
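
For example, assuming an asset stored under the unstructured key my_asset and another whose structured key path is ["my_prefix", "my_asset"] (both names are illustrative, not part of this reference):

  dagster asset wipe my_asset

  dagster asset wipe '["my_prefix", "my_asset"]'

  dagster asset wipe --all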

dagster debug

dagster debug [OPTIONS] COMMAND [ARGS]...

export

Export the relevant artifacts for a pipeline run to a file.

dagster debug export [OPTIONS] RUN_ID OUTPUT_FILE

Arguments

RUN_ID

Required argument

OUTPUT_FILE

Required argument

dagster instance

dagster instance [OPTIONS] COMMAND [ARGS]...

info

List the information about the current instance.

dagster instance info [OPTIONS]

migrate

Automatically migrate an out-of-date instance.

dagster instance migrate [OPTIONS]

reindex

Rebuild index over historical runs for performance.

dagster instance reindex [OPTIONS]

dagster pipeline

dagster pipeline [OPTIONS] COMMAND [ARGS]...

backfill

Backfill a partitioned pipeline.

This command targets a partitioned pipeline. The pipeline and partition set must be defined in a repository, which can be specified in a number of ways:

  1. dagster pipeline backfill -p <<pipeline_name>> (works if workspace.yaml exists)

  2. dagster pipeline backfill -p <<pipeline_name>> -w path/to/workspace.yaml

  3. dagster pipeline backfill -f /path/to/file.py -a define_some_repo -p <<pipeline_name>>

  4. dagster pipeline backfill -m a_module.submodule -a define_some_repo -p <<pipeline_name>>
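
For instance, a backfill over a hypothetical daily partition set, restricted to a date range and tagged for bookkeeping (all names and values below are illustrative), might look like:

  dagster pipeline backfill -p log_daily_stats --partition-set log_daily_stats_partitions --from 20191101 --to 20191201 --tags '{"owner": "data-team"}'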

dagster pipeline backfill [OPTIONS]

Options

-p, --pipeline <pipeline>

Pipeline within the repository, necessary if more than one pipeline is present.

-l, --location <location>

RepositoryLocation within the workspace, necessary if more than one location is present.

-r, --repository <repository>

Repository within the workspace, necessary if more than one repository is present.

--use-ssl

Use a secure channel when connecting to the gRPC server

--grpc-host <grpc_host>

Host to use to connect to gRPC server, defaults to localhost

--grpc-socket <grpc_socket>

Named socket to use to connect to gRPC server

--grpc-port <grpc_port>

Port to use to connect to gRPC server

-a, --attribute <attribute>

Attribute that is either 1) a repository or pipeline or 2) a function that returns a repository or pipeline

-m, --module-name <module_name>

Specify the module where the repository or pipeline function lives

--package-name <package_name>

Specify the installed Python package where the repository or pipeline function lives

-f, --python-file <python_file>

Specify the Python file where the repository or pipeline function lives

-d, --working-directory <working_directory>

Specify the working directory to use when loading the repository or pipeline. Can only be used along with -f/--python-file

--empty-working-directory

Indicates that the working directory should be empty and should not be set to the current directory by default

-w, --workspace <workspace>

Path to workspace file. Argument can be provided multiple times.

--empty-workspace

Allow an empty workspace

--partitions <partitions>

Comma-separated list of partition names that we want to backfill

--partition-set <partition_set>

The name of the partition set over which we want to backfill.

--all <all>

Specify to select all partitions to backfill.

--from <from>

Specify a start partition for this backfill job

Example: dagster pipeline backfill log_daily_stats --from 20191101

--to <to>

Specify an end partition for this backfill job

Example: dagster pipeline backfill log_daily_stats --to 20191201

--tags <tags>

JSON string of tags to use for this pipeline run

--noprompt

execute

Execute a pipeline.

This command targets a pipeline. The pipeline can be specified in a number of ways:

  1. dagster pipeline execute -f /path/to/file.py -a define_some_pipeline

  2. dagster pipeline execute -m a_module.submodule -a define_some_pipeline

  3. dagster pipeline execute -f /path/to/file.py -a define_some_repo -p <<pipeline_name>>

  4. dagster pipeline execute -m a_module.submodule -a define_some_repo -p <<pipeline_name>>
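
For instance, a run that selects a mode, attaches tags, and executes only a subset of solids (pipeline, mode, tag, and solid names below are illustrative) might look like:

  dagster pipeline execute -f hello_world.py -p pandas_hello_world --mode dev --tags '{"owner": "data-team"}' -s "*some_solid"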

dagster pipeline execute [OPTIONS]

Options

-p, --pipeline <pipeline>

Pipeline within the repository, necessary if more than one pipeline is present.

-r, --repository <repository>

Repository name, necessary if more than one repository is present.

-a, --attribute <attribute>

Attribute that is either 1) a repository or pipeline or 2) a function that returns a repository or pipeline

-m, --module-name <module_name>

Specify the module where the repository or pipeline function lives

--package-name <package_name>

Specify the installed Python package where the repository or pipeline function lives

-f, --python-file <python_file>

Specify the Python file where the repository or pipeline function lives

-d, --working-directory <working_directory>

Specify the working directory to use when loading the repository or pipeline. Can only be used along with -f/--python-file

--empty-working-directory

Indicates that the working directory should be empty and should not be set to the current directory by default

-c, --config <config>

Specify one or more run config files. These can also be file patterns. If more than one run config file is captured, those files are merged: files listed first take precedence and override the values of subsequent files at key-level granularity. If the file is a pattern, you must enclose it in double quotes.

Example: dagster pipeline execute -f hello_world.py -p pandas_hello_world -c "pandas_hello_world/*.yaml"

You can also specify multiple files:

Example: dagster pipeline execute -f hello_world.py -p pandas_hello_world -c pandas_hello_world/solids.yaml -c pandas_hello_world/env.yaml

--preset <preset>

Specify a preset to use for this pipeline. Presets are defined on pipelines under preset_defs.

--mode <mode>

The name of the mode in which to execute the pipeline.

--tags <tags>

JSON string of tags to use for this pipeline run

-s, --solid-selection <solid_selection>

Specify the solid subselection to execute. It can be multiple clauses separated by commas. Examples:

  - "some_solid" will execute "some_solid" itself

  - "*some_solid" will execute "some_solid" and all its ancestors (upstream dependencies)

  - "*some_solid+++" will execute "some_solid", all its ancestors, and its descendants (downstream dependencies) within 3 levels down

  - "*some_solid,other_solid_a,other_solid_b+" will execute "some_solid" and all its ancestors, "other_solid_a" itself, and "other_solid_b" and its direct child solids

launch

Launch a pipeline using the run launcher configured on the Dagster instance.

This command targets a pipeline. The pipeline can be specified in a number of ways:

  1. dagster pipeline launch -p <<pipeline_name>> (works if workspace.yaml exists)

  2. dagster pipeline launch -p <<pipeline_name>> -w path/to/workspace.yaml

  3. dagster pipeline launch -f /path/to/file.py -a define_some_pipeline

  4. dagster pipeline launch -m a_module.submodule -a define_some_pipeline

  5. dagster pipeline launch -f /path/to/file.py -a define_some_repo -p <<pipeline_name>>

  6. dagster pipeline launch -m a_module.submodule -a define_some_repo -p <<pipeline_name>>
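
For instance, a launch that supplies run config inline as JSON and assigns an explicit run ID (the config keys and ID below are illustrative) might look like:

  dagster pipeline launch -w path/to/workspace.yaml -p <<pipeline_name>> --config-json '{"solids": {"my_solid": {"config": {"param": 1}}}}' --run-id my_custom_run_id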

dagster pipeline launch [OPTIONS]

Options

-p, --pipeline <pipeline>

Pipeline within the repository, necessary if more than one pipeline is present.

-l, --location <location>

RepositoryLocation within the workspace, necessary if more than one location is present.

-r, --repository <repository>

Repository within the workspace, necessary if more than one repository is present.

--use-ssl

Use a secure channel when connecting to the gRPC server

--grpc-host <grpc_host>

Host to use to connect to gRPC server, defaults to localhost

--grpc-socket <grpc_socket>

Named socket to use to connect to gRPC server

--grpc-port <grpc_port>

Port to use to connect to gRPC server

-a, --attribute <attribute>

Attribute that is either 1) a repository or pipeline or 2) a function that returns a repository or pipeline

-m, --module-name <module_name>

Specify the module where the repository or pipeline function lives

--package-name <package_name>

Specify the installed Python package where the repository or pipeline function lives

-f, --python-file <python_file>

Specify the Python file where the repository or pipeline function lives

-d, --working-directory <working_directory>

Specify the working directory to use when loading the repository or pipeline. Can only be used along with -f/--python-file

--empty-working-directory

Indicates that the working directory should be empty and should not be set to the current directory by default

-w, --workspace <workspace>

Path to workspace file. Argument can be provided multiple times.

--empty-workspace

Allow an empty workspace

-c, --config <config>

Specify one or more run config files. These can also be file patterns. If more than one run config file is captured, those files are merged: files listed first take precedence and override the values of subsequent files at key-level granularity. If the file is a pattern, you must enclose it in double quotes.

Example: dagster pipeline launch -f hello_world.py -p pandas_hello_world -c "pandas_hello_world/*.yaml"

You can also specify multiple files:

Example: dagster pipeline launch -f hello_world.py -p pandas_hello_world -c pandas_hello_world/solids.yaml -c pandas_hello_world/env.yaml

--config-json <config_json>

JSON string of run config to use for this pipeline run. Cannot be used with -c / –config.

--preset <preset>

Specify a preset to use for this pipeline. Presets are defined on pipelines under preset_defs.

--mode <mode>

The name of the mode in which to execute the pipeline.

--tags <tags>

JSON string of tags to use for this pipeline run

-s, --solid-selection <solid_selection>

Specify the solid subselection to launch. It can be multiple clauses separated by commas. Examples:

  - "some_solid" will launch "some_solid" itself

  - "*some_solid" will launch "some_solid" and all its ancestors (upstream dependencies)

  - "*some_solid+++" will launch "some_solid", all its ancestors, and its descendants (downstream dependencies) within 3 levels down

  - "*some_solid,other_solid_a,other_solid_b+" will launch "some_solid" and all its ancestors, "other_solid_a" itself, and "other_solid_b" and its direct child solids

--run-id <run_id>

The ID to give to the launched pipeline run

list

List the pipelines in a repository. Can only use ONE of --workspace/-w, --python-file/-f, --module-name/-m, --grpc-port, --grpc-socket.
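
For example (the module and attribute names are illustrative):

  dagster pipeline list -w path/to/workspace.yaml

  dagster pipeline list -m a_module.submodule -a define_some_repo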

dagster pipeline list [OPTIONS]

Options

-l, --location <location>

RepositoryLocation within the workspace, necessary if more than one location is present.

-r, --repository <repository>

Repository within the workspace, necessary if more than one repository is present.

--use-ssl

Use a secure channel when connecting to the gRPC server

--grpc-host <grpc_host>

Host to use to connect to gRPC server, defaults to localhost

--grpc-socket <grpc_socket>

Named socket to use to connect to gRPC server

--grpc-port <grpc_port>

Port to use to connect to gRPC server

-a, --attribute <attribute>

Attribute that is either 1) a repository or pipeline or 2) a function that returns a repository or pipeline

-m, --module-name <module_name>

Specify the module where the repository or pipeline function lives

--package-name <package_name>

Specify the installed Python package where the repository or pipeline function lives

-f, --python-file <python_file>

Specify the Python file where the repository or pipeline function lives

-d, --working-directory <working_directory>

Specify the working directory to use when loading the repository or pipeline. Can only be used along with -f/--python-file

--empty-working-directory

Indicates that the working directory should be empty and should not be set to the current directory by default

-w, --workspace <workspace>

Path to workspace file. Argument can be provided multiple times.

--empty-workspace

Allow an empty workspace

list_versions

Display the freshness of memoized results for the given pipeline.

This command targets a pipeline. The pipeline can be specified in a number of ways:

  1. dagster pipeline list_versions -f /path/to/file.py -a define_some_pipeline

  2. dagster pipeline list_versions -m a_module.submodule -a define_some_pipeline

  3. dagster pipeline list_versions -f /path/to/file.py -a define_some_repo -p <<pipeline_name>>

  4. dagster pipeline list_versions -m a_module.submodule -a define_some_repo -p <<pipeline_name>>

dagster pipeline list_versions [OPTIONS]

Options

-p, --pipeline <pipeline>

Pipeline within the repository, necessary if more than one pipeline is present.

-r, --repository <repository>

Repository name, necessary if more than one repository is present.

-a, --attribute <attribute>

Attribute that is either 1) a repository or pipeline or 2) a function that returns a repository or pipeline

-m, --module-name <module_name>

Specify the module where the repository or pipeline function lives

--package-name <package_name>

Specify the installed Python package where the repository or pipeline function lives

-f, --python-file <python_file>

Specify the Python file where the repository or pipeline function lives

-d, --working-directory <working_directory>

Specify the working directory to use when loading the repository or pipeline. Can only be used along with -f/--python-file

--empty-working-directory

Indicates that the working directory should be empty and should not be set to the current directory by default

-c, --config <config>

Specify one or more run config files. These can also be file patterns. If more than one run config file is captured, those files are merged: files listed first take precedence and override the values of subsequent files at key-level granularity. If the file is a pattern, you must enclose it in double quotes.

Example: dagster pipeline list_versions -f hello_world.py -p pandas_hello_world -c "pandas_hello_world/*.yaml"

You can also specify multiple files:

Example: dagster pipeline list_versions -f hello_world.py -p pandas_hello_world -c pandas_hello_world/solids.yaml -c pandas_hello_world/env.yaml

--preset <preset>

Specify a preset to use for this pipeline. Presets are defined on pipelines under preset_defs.

--mode <mode>

The name of the mode in which to execute the pipeline.

print

Print a pipeline.

This command targets a pipeline. The pipeline can be specified in a number of ways:

  1. dagster pipeline print -p <<pipeline_name>> (works if workspace.yaml exists)

  2. dagster pipeline print -p <<pipeline_name>> -w path/to/workspace.yaml

  3. dagster pipeline print -f /path/to/file.py -a define_some_pipeline

  4. dagster pipeline print -m a_module.submodule -a define_some_pipeline

  5. dagster pipeline print -f /path/to/file.py -a define_some_repo -p <<pipeline_name>>

  6. dagster pipeline print -m a_module.submodule -a define_some_repo -p <<pipeline_name>>

dagster pipeline print [OPTIONS]

Options

--verbose

-p, --pipeline <pipeline>

Pipeline within the repository, necessary if more than one pipeline is present.

-l, --location <location>

RepositoryLocation within the workspace, necessary if more than one location is present.

-r, --repository <repository>

Repository within the workspace, necessary if more than one repository is present.

--use-ssl

Use a secure channel when connecting to the gRPC server

--grpc-host <grpc_host>

Host to use to connect to gRPC server, defaults to localhost

--grpc-socket <grpc_socket>

Named socket to use to connect to gRPC server

--grpc-port <grpc_port>

Port to use to connect to gRPC server

-a, --attribute <attribute>

Attribute that is either 1) a repository or pipeline or 2) a function that returns a repository or pipeline

-m, --module-name <module_name>

Specify the module where the repository or pipeline function lives

--package-name <package_name>

Specify the installed Python package where the repository or pipeline function lives

-f, --python-file <python_file>

Specify the Python file where the repository or pipeline function lives

-d, --working-directory <working_directory>

Specify the working directory to use when loading the repository or pipeline. Can only be used along with -f/--python-file

--empty-working-directory

Indicates that the working directory should be empty and should not be set to the current directory by default

-w, --workspace <workspace>

Path to workspace file. Argument can be provided multiple times.

--empty-workspace

Allow an empty workspace

scaffold_config

Scaffold the config for a pipeline.

This command targets a pipeline. The pipeline can be specified in a number of ways:

  1. dagster pipeline scaffold_config -f /path/to/file.py -a define_some_pipeline

  2. dagster pipeline scaffold_config -m a_module.submodule -a define_some_pipeline

  3. dagster pipeline scaffold_config -f /path/to/file.py -a define_some_repo -p <<pipeline_name>>

  4. dagster pipeline scaffold_config -m a_module.submodule -a define_some_repo -p <<pipeline_name>>
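
To print only the required config fields, add the --print-only-required flag described below, e.g.:

  dagster pipeline scaffold_config -f /path/to/file.py -a define_some_pipeline --print-only-required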

dagster pipeline scaffold_config [OPTIONS]

Options

-p, --pipeline <pipeline>

Pipeline within the repository, necessary if more than one pipeline is present.

-r, --repository <repository>

Repository name, necessary if more than one repository is present.

-a, --attribute <attribute>

Attribute that is either 1) a repository or pipeline or 2) a function that returns a repository or pipeline

-m, --module-name <module_name>

Specify the module where the repository or pipeline function lives

--package-name <package_name>

Specify the installed Python package where the repository or pipeline function lives

-f, --python-file <python_file>

Specify the Python file where the repository or pipeline function lives

-d, --working-directory <working_directory>

Specify the working directory to use when loading the repository or pipeline. Can only be used along with -f/--python-file

--empty-working-directory

Indicates that the working directory should be empty and should not be set to the current directory by default

--print-only-required

dagster run

dagster run [OPTIONS] COMMAND [ARGS]...

delete

Delete a run by id and its associated event logs. Warning: Cannot be undone

dagster run delete [OPTIONS] RUN_ID

Arguments

RUN_ID

Required argument

list

List the runs in this Dagster installation.

dagster run list [OPTIONS]

Options

--limit <limit>

Only list a specified number of runs
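
For example, to list at most ten runs:

  dagster run list --limit 10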

wipe

Eliminate all run history and event logs. Warning: Cannot be undone

dagster run wipe [OPTIONS]

dagster schedule

dagster schedule [OPTIONS] COMMAND [ARGS]...

debug

Debug information about the scheduler

dagster schedule debug [OPTIONS]

list

List all schedules that correspond to a repository.

dagster schedule list [OPTIONS]

Options

-l, --location <location>

RepositoryLocation within the workspace, necessary if more than one location is present.

-r, --repository <repository>

Repository within the workspace, necessary if more than one repository is present.

--use-ssl

Use a secure channel when connecting to the gRPC server

--grpc-host <grpc_host>

Host to use to connect to gRPC server, defaults to localhost

--grpc-socket <grpc_socket>

Named socket to use to connect to gRPC server

--grpc-port <grpc_port>

Port to use to connect to gRPC server

-a, --attribute <attribute>

Attribute that is either 1) a repository or pipeline or 2) a function that returns a repository or pipeline

-m, --module-name <module_name>

Specify the module where the repository or pipeline function lives

--package-name <package_name>

Specify the installed Python package where the repository or pipeline function lives

-f, --python-file <python_file>

Specify the Python file where the repository or pipeline function lives

-d, --working-directory <working_directory>

Specify the working directory to use when loading the repository or pipeline. Can only be used along with -f/--python-file

--empty-working-directory

Indicates that the working directory should be empty and should not be set to the current directory by default

-w, --workspace <workspace>

Path to workspace file. Argument can be provided multiple times.

--empty-workspace

Allow an empty workspace

--running

Filter for running schedules

--stopped

Filter for stopped schedules

--name

Only display schedule names
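
For example, to list only the running schedules defined in a workspace (the path is illustrative):

  dagster schedule list -w path/to/workspace.yaml --running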

logs

Get logs for a schedule

dagster schedule logs [OPTIONS] [SCHEDULE_NAME]...

Options

-l, --location <location>

RepositoryLocation within the workspace, necessary if more than one location is present.

-r, --repository <repository>

Repository within the workspace, necessary if more than one repository is present.

--use-ssl

Use a secure channel when connecting to the gRPC server

--grpc-host <grpc_host>

Host to use to connect to gRPC server, defaults to localhost

--grpc-socket <grpc_socket>

Named socket to use to connect to gRPC server

--grpc-port <grpc_port>

Port to use to connect to gRPC server

-a, --attribute <attribute>

Attribute that is either 1) a repository or pipeline or 2) a function that returns a repository or pipeline

-m, --module-name <module_name>

Specify the module where the repository or pipeline function lives

--package-name <package_name>

Specify the installed Python package where the repository or pipeline function lives

-f, --python-file <python_file>

Specify the Python file where the repository or pipeline function lives

-d, --working-directory <working_directory>

Specify the working directory to use when loading the repository or pipeline. Can only be used along with -f/--python-file

--empty-working-directory

Indicates that the working directory should be empty and should not be set to the current directory by default

-w, --workspace <workspace>

Path to workspace file. Argument can be provided multiple times.

--empty-workspace

Allow an empty workspace

Arguments

SCHEDULE_NAME

Optional argument(s)

preview

Preview changes that will be performed by dagster schedule up.

dagster schedule preview [OPTIONS]

Options

-l, --location <location>

RepositoryLocation within the workspace, necessary if more than one location is present.

-r, --repository <repository>

Repository within the workspace, necessary if more than one repository is present.

--use-ssl

Use a secure channel when connecting to the gRPC server

--grpc-host <grpc_host>

Host to use to connect to gRPC server, defaults to localhost

--grpc-socket <grpc_socket>

Named socket to use to connect to gRPC server

--grpc-port <grpc_port>

Port to use to connect to gRPC server

-a, --attribute <attribute>

Attribute that is either 1) a repository or pipeline or 2) a function that returns a repository or pipeline

-m, --module-name <module_name>

Specify the module where the repository or pipeline function lives

--package-name <package_name>

Specify the installed Python package where the repository or pipeline function lives

-f, --python-file <python_file>

Specify the Python file where the repository or pipeline function lives

-d, --working-directory <working_directory>

Specify the working directory to use when loading the repository or pipeline. Can only be used along with -f/--python-file

--empty-working-directory

Indicates that the working directory should be empty and should not be set to the current directory by default

-w, --workspace <workspace>

Path to workspace file. Argument can be provided multiple times.

--empty-workspace

Allow an empty workspace

restart

Restart a running schedule

dagster schedule restart [OPTIONS] [SCHEDULE_NAME]...

Options

--restart-all-running

Restart previously running schedules

-l, --location <location>

RepositoryLocation within the workspace, necessary if more than one location is present.

-r, --repository <repository>

Repository within the workspace, necessary if more than one repository is present.

--use-ssl

Use a secure channel when connecting to the gRPC server

--grpc-host <grpc_host>

Host to use to connect to gRPC server, defaults to localhost

--grpc-socket <grpc_socket>

Named socket to use to connect to gRPC server

--grpc-port <grpc_port>

Port to use to connect to gRPC server

-a, --attribute <attribute>

Attribute that is either 1) a repository or pipeline or 2) a function that returns a repository or pipeline

-m, --module-name <module_name>

Specify the module where the repository or pipeline function lives

--package-name <package_name>

Specify the installed Python package where the repository or pipeline function lives

-f, --python-file <python_file>

Specify the Python file where the repository or pipeline function lives

-d, --working-directory <working_directory>

Specify the working directory to use when loading the repository or pipeline. Can only be used along with -f/--python-file

--empty-working-directory

Indicates that the working directory should be empty and should not be set to the current directory by default

-w, --workspace <workspace>

Path to workspace file. Argument can be provided multiple times.

--empty-workspace

Allow an empty workspace

Arguments

SCHEDULE_NAME

Optional argument(s)

start

Start an existing schedule

dagster schedule start [OPTIONS] [SCHEDULE_NAME]...

Options

--start-all

Start all schedules

-l, --location <location>

RepositoryLocation within the workspace, necessary if more than one location is present.

-r, --repository <repository>

Repository within the workspace, necessary if more than one repository is present.

--use-ssl

Use a secure channel when connecting to the gRPC server

--grpc-host <grpc_host>

Host to use to connect to gRPC server, defaults to localhost

--grpc-socket <grpc_socket>

Named socket to use to connect to gRPC server

--grpc-port <grpc_port>

Port to use to connect to gRPC server

-a, --attribute <attribute>

Attribute that is either 1) a repository or pipeline or 2) a function that returns a repository or pipeline

-m, --module-name <module_name>

Specify the module where the repository or pipeline function lives

--package-name <package_name>

Specify the installed Python package where the repository or pipeline function lives

-f, --python-file <python_file>

Specify the Python file where the repository or pipeline function lives

-d, --working-directory <working_directory>

Specify the working directory to use when loading the repository or pipeline. Can only be used along with -f/--python-file

--empty-working-directory

Indicates that the working directory should be empty and should not be set to the current directory by default

-w, --workspace <workspace>

Path to workspace file. Argument can be provided multiple times.

--empty-workspace

Allow an empty workspace

Arguments

SCHEDULE_NAME

Optional argument(s)

stop

Stop an existing schedule

dagster schedule stop [OPTIONS] [SCHEDULE_NAME]...

Options

-l, --location <location>

RepositoryLocation within the workspace, necessary if more than one location is present.

-r, --repository <repository>

Repository within the workspace, necessary if more than one repository is present.

--use-ssl

Use a secure channel when connecting to the gRPC server

--grpc-host <grpc_host>

Host to use to connect to gRPC server, defaults to localhost

--grpc-socket <grpc_socket>

Named socket to use to connect to gRPC server

--grpc-port <grpc_port>

Port to use to connect to gRPC server

-a, --attribute <attribute>

Attribute that is either 1) a repository or pipeline or 2) a function that returns a repository or pipeline

-m, --module-name <module_name>

Specify the module where the repository or pipeline function lives

--package-name <package_name>

Specify the installed Python package where the repository or pipeline function lives

-f, --python-file <python_file>

Specify the Python file where the repository or pipeline function lives

-d, --working-directory <working_directory>

Specify the working directory to use when loading the repository or pipeline. Can only be used along with -f/--python-file

--empty-working-directory

Indicates that the working directory should be empty and should not be set to the current directory by default

-w, --workspace <workspace>

Path to workspace file. Argument can be provided multiple times.

--empty-workspace

Allow an empty workspace

Arguments

SCHEDULE_NAME

Optional argument(s)

up

Updates the internal Dagster representation of schedules to match the list of ScheduleDefinitions defined in the repository. Use dagster schedule up --preview or dagster schedule preview to preview what changes will be applied. New ScheduleDefinitions will not start running by default when up is called. Use dagster schedule start and dagster schedule stop to start and stop a schedule. If a ScheduleDefinition is deleted, the corresponding running schedule will be stopped and deleted.
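
A typical workflow, using a hypothetical schedule name, might look like:

  dagster schedule up --preview

  dagster schedule up

  dagster schedule start my_daily_schedule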

dagster schedule up [OPTIONS]

Options

--preview

Preview changes

-l, --location <location>

RepositoryLocation within the workspace, necessary if more than one location is present.

-r, --repository <repository>

Repository within the workspace, necessary if more than one repository is present.

--use-ssl

Use a secure channel when connecting to the gRPC server

--grpc-host <grpc_host>

Host to use to connect to gRPC server, defaults to localhost

--grpc-socket <grpc_socket>

Named socket to use to connect to gRPC server

--grpc-port <grpc_port>

Port to use to connect to gRPC server

-a, --attribute <attribute>

Attribute that is either 1) a repository or pipeline or 2) a function that returns a repository or pipeline

-m, --module-name <module_name>

Specify the module where the repository or pipeline function lives

--package-name <package_name>

Specify the installed Python package where the repository or pipeline function lives

-f, --python-file <python_file>

Specify the Python file where the repository or pipeline function lives

-d, --working-directory <working_directory>

Specify the working directory to use when loading the repository or pipeline. Can only be used along with -f/--python-file

--empty-working-directory

Indicates that the working directory should be empty and should not be set to the current directory by default

-w, --workspace <workspace>

Path to workspace file. Argument can be provided multiple times.

--empty-workspace

Allow an empty workspace

wipe

Deletes schedule history and turns off all schedules.

dagster schedule wipe [OPTIONS]

Options

-l, --location <location>

RepositoryLocation within the workspace, necessary if more than one location is present.

-r, --repository <repository>

Repository within the workspace, necessary if more than one repository is present.

--use-ssl

Use a secure channel when connecting to the gRPC server

--grpc-host <grpc_host>

Host to use to connect to gRPC server, defaults to localhost

--grpc-socket <grpc_socket>

Named socket to use to connect to gRPC server

--grpc-port <grpc_port>

Port to use to connect to gRPC server

-a, --attribute <attribute>

Attribute that is either 1) a repository or pipeline or 2) a function that returns a repository or pipeline

-m, --module-name <module_name>

Specify the module where the repository or pipeline function lives

--package-name <package_name>

Specify the installed Python package where the repository or pipeline function lives

-f, --python-file <python_file>

Specify the Python file where the repository or pipeline function lives

-d, --working-directory <working_directory>

Specify the working directory to use when loading the repository or pipeline. Can only be used along with -f/--python-file

--empty-working-directory

Indicates that the working directory should be empty and should not be set to the current directory by default

-w, --workspace <workspace>

Path to workspace file. Argument can be provided multiple times.

--empty-workspace

Allow an empty workspace

dagster sensor

dagster sensor [OPTIONS] COMMAND [ARGS]...

list

List all sensors that correspond to a repository.

dagster sensor list [OPTIONS]

Options

-l, --location <location>

RepositoryLocation within the workspace, necessary if more than one location is present.

-r, --repository <repository>

Repository within the workspace, necessary if more than one repository is present.

--use-ssl

Use a secure channel when connecting to the gRPC server

--grpc-host <grpc_host>

Host to use to connect to gRPC server, defaults to localhost

--grpc-socket <grpc_socket>

Named socket to use to connect to gRPC server

--grpc-port <grpc_port>

Port to use to connect to gRPC server

-a, --attribute <attribute>

Attribute that is either 1) a repository or pipeline or 2) a function that returns a repository or pipeline

-m, --module-name <module_name>

Specify the module where the repository or pipeline function lives

--package-name <package_name>

Specify the installed Python package where the repository or pipeline function lives

-f, --python-file <python_file>

Specify the Python file where the repository or pipeline function lives

-d, --working-directory <working_directory>

Specify the working directory to use when loading the repository or pipeline. Can only be used along with -f/--python-file

--empty-working-directory

Indicates that the working directory should be empty and should not be set to the current directory by default

-w, --workspace <workspace>

Path to workspace file. Argument can be provided multiple times.

--empty-workspace

Allow an empty workspace

--running

Filter for running sensors

--stopped

Filter for stopped sensors

--name

Only display sensor names

preview

Preview an existing sensor execution
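
For example, to preview a hypothetical sensor named my_sensor with a simulated last completion timestamp and last run key (all values below are illustrative):

  dagster sensor preview my_sensor --since 1612137600 --last_run_key last_processed_file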

dagster sensor preview [OPTIONS] [SENSOR_NAME]...

Options

--since <since>

Set the last_completion_time value as a timestamp float for the sensor context

--last_run_key <last_run_key>

Set the last_run_key value for the sensor context

-l, --location <location>

RepositoryLocation within the workspace, necessary if more than one location is present.

-r, --repository <repository>

Repository within the workspace, necessary if more than one repository is present.

--use-ssl

Use a secure channel when connecting to the gRPC server

--grpc-host <grpc_host>

Host to use to connect to gRPC server, defaults to localhost

--grpc-socket <grpc_socket>

Named socket to use to connect to gRPC server

--grpc-port <grpc_port>

Port to use to connect to gRPC server

-a, --attribute <attribute>

Attribute that is either 1) a repository or pipeline or 2) a function that returns a repository or pipeline

-m, --module-name <module_name>

Specify the module where the repository or pipeline function lives

--package-name <package_name>

Specify the installed Python package where the repository or pipeline function lives

-f, --python-file <python_file>

Specify the Python file where the repository or pipeline function lives

-d, --working-directory <working_directory>

Specify the working directory to use when loading the repository or pipeline. Can only be used along with -f/--python-file

--empty-working-directory

Indicates that the working directory should be empty and should not be set to the current directory by default

-w, --workspace <workspace>

Path to workspace file. Argument can be provided multiple times.

--empty-workspace

Allow an empty workspace

Arguments

SENSOR_NAME

Optional argument(s)

start

Start an existing sensor

dagster sensor start [OPTIONS] [SENSOR_NAME]...

Options

--start-all

Start all sensors

-l, --location <location>

RepositoryLocation within the workspace, necessary if more than one location is present.

-r, --repository <repository>

Repository within the workspace, necessary if more than one repository is present.

--use-ssl

Use a secure channel when connecting to the gRPC server

--grpc-host <grpc_host>

Host to use to connect to gRPC server, defaults to localhost

--grpc-socket <grpc_socket>

Named socket to use to connect to gRPC server

--grpc-port <grpc_port>

Port to use to connect to gRPC server

-a, --attribute <attribute>

Attribute that is either 1) a repository or pipeline or 2) a function that returns a repository or pipeline

-m, --module-name <module_name>

Specify the module where the repository or pipeline function lives

--package-name <package_name>

Specify the installed Python package where the repository or pipeline function lives

-f, --python-file <python_file>

Specify the Python file where the repository or pipeline function lives

-d, --working-directory <working_directory>

Specify the working directory to use when loading the repository or pipeline. Can only be used along with -f/--python-file

--empty-working-directory

Indicates that the working directory should be empty and should not be set to the current directory by default

-w, --workspace <workspace>

Path to workspace file. Argument can be provided multiple times.

--empty-workspace

Allow an empty workspace

Arguments

SENSOR_NAME

Optional argument(s)

stop

Stop an existing sensor

dagster sensor stop [OPTIONS] [SENSOR_NAME]...

Options

-l, --location <location>

RepositoryLocation within the workspace, necessary if more than one location is present.

-r, --repository <repository>

Repository within the workspace, necessary if more than one repository is present.

--use-ssl

Use a secure channel when connecting to the gRPC server

--grpc-host <grpc_host>

Host to use to connect to gRPC server, defaults to localhost

--grpc-socket <grpc_socket>

Named socket to use to connect to gRPC server

--grpc-port <grpc_port>

Port to use to connect to gRPC server

-a, --attribute <attribute>

Attribute that is either 1) a repository or pipeline or 2) a function that returns a repository or pipeline

-m, --module-name <module_name>

Specify the module where the repository or pipeline function lives

--package-name <package_name>

Specify the installed Python package where the repository or pipeline function lives

-f, --python-file <python_file>

Specify the Python file where the repository or pipeline function lives

-d, --working-directory <working_directory>

Specify the working directory to use when loading the repository or pipeline. Can only be used along with -f/--python-file

--empty-working-directory

Indicates that the working directory should be empty and should not be set to the current directory by default

-w, --workspace <workspace>

Path to workspace file. Argument can be provided multiple times.

--empty-workspace

Allow an empty workspace

Arguments

SENSOR_NAME

Optional argument(s)

dagster-graphql

Run a GraphQL query against the dagster interface to a specified repository or pipeline.

Can only use ONE of --workspace/-w, --python-file/-f, --module-name/-m, --grpc-port, --grpc-socket.

Examples:

  1. dagster-graphql

  2. dagster-graphql -w path/to/workspace.yaml

  3. dagster-graphql -f path/to/file.py -a define_repo

  4. dagster-graphql -m some_module -a define_repo

  5. dagster-graphql -f path/to/file.py -a define_pipeline

  6. dagster-graphql -m some_module -a define_pipeline
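
A fuller invocation (the query file, variables, and output path below are illustrative) might read a GraphQL document from a file, pass JSON-encoded variables, and write the response to disk:

  dagster-graphql -w path/to/workspace.yaml --file path/to/query.graphql -v '{"runId": "some_run_id"}' -o response.json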

dagster-graphql [OPTIONS]

Options

--version

Show the version and exit.

-t, --text <text>

GraphQL document to execute passed as a string

-f, --file <file>

GraphQL document to execute passed as a file

-p, --predefined <predefined>

GraphQL document to execute, from a predefined set provided by dagster-graphql.

Options

launchPipelineExecution

-v, --variables <variables>

A JSON encoded string containing the variables for GraphQL execution.

-r, --remote <remote>

A URL for a remote instance running the dagit server to send the GraphQL request to.

-o, --output <output>

A file path to store the GraphQL response to. This flag is useful when making pipeline execution queries, since pipeline execution causes logs to print to stdout and stderr.

--empty-workspace

Allow an empty workspace

-w, --workspace <workspace>

Path to workspace file. Argument can be provided multiple times.

--empty-working-directory

Indicates that the working directory should be empty and should not be set to the current directory by default

-d, --working-directory <working_directory>

Specify the working directory to use when loading the repository or pipeline. Can only be used along with -f/--python-file

-f, --python-file <python_file>

Specify the Python file where the repository or pipeline function lives

--package-name <package_name>

Specify the installed Python package where the repository or pipeline function lives

-m, --module-name <module_name>

Specify the module where the repository or pipeline function lives

-a, --attribute <attribute>

Attribute that is either 1) a repository or pipeline or 2) a function that returns a repository or pipeline

--grpc-port <grpc_port>

Port to use to connect to gRPC server

--grpc-socket <grpc_socket>

Named socket to use to connect to gRPC server

--grpc-host <grpc_host>

Host to use to connect to gRPC server, defaults to localhost

--use-ssl

Use a secure channel when connecting to the gRPC server

dagit

Run dagit. Loads a repository or pipeline.

Can only use ONE of --workspace/-w, --python-file/-f, --module-name/-m, --grpc-port, --grpc-socket.

Examples:

  1. dagit (works if workspace.yaml exists)

  2. dagit -w path/to/workspace.yaml

  3. dagit -f path/to/file.py

  4. dagit -f path/to/file.py -d path/to/working_directory

  5. dagit -m some_module

  6. dagit -f path/to/file.py -a define_repo

  7. dagit -m some_module -a define_repo

  8. dagit -p 3333

Options can also be provided via environment variables prefixed with DAGIT_

For example, DAGIT_PORT=3333 dagit
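
As a further illustration (the flag values are arbitrary), Dagit can be bound to a specific host and port and served under a path prefix:

  dagit -w path/to/workspace.yaml -h 0.0.0.0 -p 3333 --path-prefix /dagit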

dagit [OPTIONS]

Options

--use-ssl

Use a secure channel when connecting to the gRPC server

--grpc-host <grpc_host>

Host to use to connect to gRPC server, defaults to localhost

--grpc-socket <grpc_socket>

Named socket to use to connect to gRPC server

--grpc-port <grpc_port>

Port to use to connect to gRPC server

-a, --attribute <attribute>

Attribute that is either 1) a repository or pipeline or 2) a function that returns a repository or pipeline

-m, --module-name <module_name>

Specify the module where the repository or pipeline function lives

--package-name <package_name>

Specify the installed Python package where the repository or pipeline function lives

-f, --python-file <python_file>

Specify the Python file where the repository or pipeline function lives

-d, --working-directory <working_directory>

Specify the working directory to use when loading the repository or pipeline. Can only be used along with -f/--python-file

--empty-working-directory

Indicates that the working directory should be empty and should not be set to the current directory by default

-w, --workspace <workspace>

Path to workspace file. Argument can be provided multiple times.

--empty-workspace

Allow an empty workspace

-h, --host <host>

Host to run server on

Default

127.0.0.1

-p, --port <port>

Port to run server on, default is 3000

-l, --path-prefix <path_prefix>

The path prefix where Dagit will be hosted (e.g., /dagit)

Default

--storage-fallback <storage_fallback>

Base directory for dagster storage if $DAGSTER_HOME is not set

--db-statement-timeout <db_statement_timeout>

The timeout in milliseconds to set on database statements sent to the DagsterInstance. Not respected in all configurations.

Default

5000

--version

Show the version and exit.