Command Line (DAP CLI)
Access your organization's Instructure data from the command line.
The Data Access Platform (DAP) CLI is a command-line tool that enables efficient access to large volumes of educational data with high fidelity and low latency. It adheres to a canonical data model and integrates with various educational products.
Built on top of the Query API, DAP CLI allows you to:
Fetch initial snapshots of data.
Track incremental changes.
Initialize and synchronize a supported database with DAP data.
Before using DAP CLI, it is recommended to familiarize yourself with the key concepts of DAP.
For developers, a Python library implementing the Query API is also available. The CLI is essentially a wrapper around this library, offering the same robust functionality in a command-line interface for ease of use.
Requirements
Supported Database Integrations: PostgreSQL 16.3+, MySQL 8.2+, Microsoft SQL Server 2019+
Supported Python Versions: Python 3.11+
Commands
The basic syntax is:
dap [arguments] [command] [flags]
Refer to the Reference section in the sidebar for a list of available commands, or run dap --help to get the same information right in the terminal.
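For example, the following invocation combines a global argument (--loglevel), a command (initdb) and command-specific flags; the namespace and table values are illustrative:
dap --loglevel info initdb --namespace canvas --table accounts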
Upgrading
To ensure optimal performance, always use the latest version of DAP CLI. Check your current version with:
dap --version
To upgrade to the latest version, run:
pip install --upgrade instructure-dap-client
If you installed the client with the optional database extras, upgrade those as well:
pip install --upgrade "instructure-dap-client[postgresql,mysql]"
Rate Limiting
DAP CLI follows the rate limiting policies of DAP. Be mindful of these limits when making requests.
Output modes
Two output modes are supported: interactive mode is meant for human consumption, while non-interactive mode is meant for use in scripts. The output in non-interactive mode is the same as in versions up to 1.4.0. The content of the output streams depends on which mode is set:
         | non-interactive            | interactive
stdout   | command output for scripts | user-friendly messages about execution status
stderr   | logs according to loglevel | -
logfile  | logs according to loglevel | logs according to loglevel
The mode and additional customizations can be set using the following command line options:
Switch from the default interactive mode to non-interactive mode with --non-interactive.
In non-interactive mode, logging to the console can be disabled with --no-log-to-console; with this switch only the command output on stdout is emitted (see the example after this list).
In interactive mode, set the color theme with --console-theme; the default is dark, which best fits the usual black terminal background.
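For example, a scripted run might combine these switches with the syncdb command shown later in this guide; the namespace and table values are illustrative:
dap --non-interactive --no-log-to-console --logfile dap.log syncdb --namespace canvas --table accounts
With console logging disabled, only the command output appears on stdout, while dap.log still receives logs according to the configured log level.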
Logging & Debugging
The default log level is info, and messages are printed to the console. To change the log level or save logs to a file, use the following parameters:
dap --loglevel debug --logfile dap.log initdb --namespace canvas --table accounts
Logs can be written in different formats; the default is plain text. To use the json format:
dap --logformat json syncdb --namespace canvas --table accounts
The log lines in the console and in the log file have exactly the same content, determined by --loglevel and --logformat. In json format, additional fields are added to every log record, such as the namespace, table and clientId. These fields can also be used for filtering when logs are ingested and queried by services such as Splunk. For unambiguous processing by such services, the timestamp format in json logs is ISO 8601 with the UTC time zone.
Tracking usage
The DAP client sends usage analytics data to pendo.io. Tracking is enabled by default; you can opt out with the --no-tracking command line switch or the DAP_TRACKING environment variable. Valid values for DAP_TRACKING are: true, false, 1, 0, yes, no, on, off. When using DAP as a library, set tracking with the tracking parameter of the DAPClient constructor. The exact list of tracked data is logged at the debug log level.
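For example, either of the following disables tracking for a single run; the command and its flags are taken from the earlier examples, and the inline environment variable assignment assumes a POSIX shell:
dap --no-tracking syncdb --namespace canvas --table accounts
DAP_TRACKING=off dap syncdb --namespace canvas --table accounts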
Using in scripts
When executing the DAP CLI from a shell or other script, the exit code can be used to determine whether there were errors during execution; only exit code 0 indicates a successful run. When executing on multiple tables, the command may fail for only some of them; in that case the exit code is non-zero even if only a single table failed. Scripts should ideally rely only on the output written to stdout and not on the content of the logs written to stderr, as the log format may change in the future.
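A minimal POSIX shell sketch that acts on the exit code might look like this; the output file name is an assumption, and the namespace and table values are illustrative:
# Run non-interactively, keep logs off stdout, and capture the command output in a file
if ! dap --non-interactive --no-log-to-console syncdb --namespace canvas --table accounts > dap-output.txt; then
    echo "DAP command failed with a non-zero exit code" >&2
    exit 1
fi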
Where To Get Help