Default connection used by aws-provider SystemsManagerParameterStoreBackend #58747
Unanswered — tobadarichard asked this question in Q&A
Hello,
I'm using Airflow 3.1.3 and apache-airflow-providers-amazon 9.17.0.
I'm trying to use AWS SSM Parameter Store as a secrets backend to retrieve secrets, but the backend doesn't seem to use the AWS connection I configured when it establishes its own connection.
The [secrets] section of my airflow.cfg is:
backend = airflow.providers.amazon.aws.secrets.systems_manager.SystemsManagerParameterStoreBackend
backend_kwargs = {"connections_prefix": "/airflow/connections","variables_prefix ": "/airflow/variables","config_prefix": "/airflow/config"}
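For context, here is my understanding of what these prefixes are used for — the backend should join a prefix and a secret id into an SSM parameter name. This is a minimal sketch of the documented lookup convention, not the provider's actual code; `build_parameter_path` is a hypothetical helper name:

```python
# Hypothetical sketch: how an SSM-backed secrets backend maps an Airflow
# conn_id onto a Parameter Store path, given the prefixes from backend_kwargs.
# This mirrors the documented lookup convention, not the provider's source.

def build_parameter_path(prefix: str, secret_id: str) -> str:
    """Join a configured prefix and a secret id into an SSM parameter name."""
    return f"{prefix.rstrip('/')}/{secret_id}"

# With connections_prefix = "/airflow/connections", a connection called
# "my_conn" should be looked up at "/airflow/connections/my_conn".
path = build_parameter_path("/airflow/connections", "my_conn")
```

So retrieving the parameters themselves works as expected; my problem is only with which AWS credentials the backend uses to reach SSM.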
I also defined an AWS connection with the id 'aws_default' and the type 'aws' through the Airflow UI.
According to the aws provider documentation, the default connection name is 'aws_default', but looking at my logs, SystemsManagerParameterStoreBackend is instead using a connection named 'SystemsManagerParameterStoreBackend__connection':
[2025-11-26 21:12:24] DEBUG - Calling 'on_starting' with {'component': <airflow.sdk.execution_time.task_runner.TaskRunnerMarker object at 0x7a3e8ed6bd90>} source=airflow.listeners.listener loc=listener.py:37
[2025-11-26 21:12:24] DEBUG - Hook impls: [] source=airflow.listeners.listener loc=listener.py:38
[2025-11-26 21:12:24] DEBUG - Result from 'on_starting': [] source=airflow.listeners.listener loc=listener.py:42
[2025-11-26 21:12:24] INFO - DAG bundles loaded: dags-folder source=airflow.dag_processing.bundles.manager.DagBundlesManager loc=manager.py:179
[2025-11-26 21:12:24] INFO - Filling up the DagBag from /home/richard/Documents/airflow_usage/dags/dags_airflow_hitlop.py source=airflow.models.dagbag.DagBag loc=dagbag.py:593
[2025-11-26 21:12:24] DEBUG - Importing /home/richard/Documents/airflow_usage/dags/dags_airflow_hitlop.py source=airflow.models.dagbag.DagBag loc=dagbag.py:391
[2025-11-26 21:12:24] DEBUG - Initializing Providers Manager[hooks] source=airflow.providers_manager loc=providers_manager.py:353
[2025-11-26 21:12:24] DEBUG - Initialization of Providers Manager[hooks] took 0.00 seconds source=airflow.providers_manager loc=providers_manager.py:356
[2025-11-26 21:12:24] DEBUG - Initializing Providers Manager[hook_lineage_writers] source=airflow.providers_manager loc=providers_manager.py:353
[2025-11-26 21:12:24] DEBUG - Initializing Providers Manager[taskflow_decorators] source=airflow.providers_manager loc=providers_manager.py:353
[2025-11-26 21:12:24] DEBUG - Initialization of Providers Manager[taskflow_decorators] took 0.00 seconds source=airflow.providers_manager loc=providers_manager.py:356
[2025-11-26 21:12:24] DEBUG - Initialization of Providers Manager[hook_lineage_writers] took 0.00 seconds source=airflow.providers_manager loc=providers_manager.py:356
[2025-11-26 21:12:24] WARNING - The 'airflow.hooks.base.BaseHook' attribute is deprecated. Please use 'airflow.sdk.bases.hook.BaseHook'. category=DeprecatedImportWarning source=py.warnings loc=/home/richard/Documents/airflow_usage/dags/dags_airflow_hitlop.py:13
[2025-11-26 21:12:24] DEBUG - Loaded DAG <DAG: example_hitl_operator> source=airflow.models.dagbag.DagBag loc=dagbag.py:568
[2025-11-26 21:12:24] DEBUG - Dag file parsed file=dags_airflow_hitlop.py source=task loc=task_runner.py:737
[2025-11-26 21:12:24] DEBUG - Plugins are already loaded. Skipping. source=airflow.plugins_manager loc=plugins_manager.py:345
[2025-11-26 21:12:24] DEBUG - Integrate Macros plugins source=airflow.plugins_manager loc=plugins_manager.py:597
[2025-11-26 21:12:24] DEBUG - Calling 'on_task_instance_running' with {'previous_state': <TaskInstanceState.QUEUED: 'queued'>, 'task_instance': RuntimeTaskInstance(id=UUID('019ac315-249a-73bf-9f29-596253803bc7'), task_id='print_vars', dag_id='example_hitl_operator', run_id='manual__2025-11-27T02:12:12+00:00', try_number=1, dag_version_id=UUID('019ab91a-0670-777a-b28c-60f63fe1b513'), map_index=-1, hostname='default-pc', context_carrier={}, task=<Task(_PythonDecoratedOperator): print_vars>, bundle_instance=LocalDagBundle(name=dags-folder), max_tries=0, start_date=datetime.datetime(2025, 11, 27, 2, 12, 15, 379781, tzinfo=datetime.timezone.utc), end_date=None, state=<TaskInstanceState.RUNNING: 'running'>, is_mapped=False, rendered_map_index=None)} source=airflow.listeners.listener loc=listener.py:37
[2025-11-26 21:12:24] DEBUG - Hook impls: [] source=airflow.listeners.listener loc=listener.py:38
[2025-11-26 21:12:24] DEBUG - Result from 'on_task_instance_running': [] source=airflow.listeners.listener loc=listener.py:42
[2025-11-26 21:12:25] DEBUG - Missing endpoint_url in extra config of AWS Connection with id SystemsManagerParameterStoreBackend__connection. Using default AWS service endpoint source=airflow.providers.amazon.aws.utils.connection_wrapper.AwsConnectionWrapper loc=connection_wrapper.py:249
[2025-11-26 21:12:27] DEBUG - Missing endpoint_url in extra config of AWS Connection with id SystemsManagerParameterStoreBackend__connection. Using default AWS service endpoint source=airflow.providers.amazon.aws.utils.connection_wrapper.AwsConnectionWrapper loc=connection_wrapper.py:249
[2025-11-26 21:12:29] DEBUG - Unable to retrieve connection from secrets backend (SystemsManagerParameterStoreBackend). Checking subsequent secrets backend. source=task loc=context.py:165
I looked at the source code of the provider and have not found a way to change the connection used by the backend. It seems to be hardcoded here.
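To illustrate what I mean by hardcoded: the conn id in my logs looks like it is derived from the backend's class name with a "__connection" suffix. The following is my own reconstruction from the log output, not a copy of the provider's source:

```python
# Reconstruction (a guess from the DEBUG logs, not the provider's code):
# the internal AWS connection id appears to be built from the class name
# plus a fixed "__connection" suffix, with no kwarg to override it.

class SystemsManagerParameterStoreBackend:
    @property
    def internal_conn_id(self) -> str:
        return f"{type(self).__name__}__connection"

conn_id = SystemsManagerParameterStoreBackend().internal_conn_id
```

If that reading is right, the 'aws_default' connection I created in the UI would never be consulted by the backend at all.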
I would like to know: is there a way to use a different connection for SystemsManagerParameterStoreBackend, or am I doing something wrong?
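For example, what I was hoping for is either the backend honouring 'aws_default', or backend_kwargs accepting session settings directly, something like the config below. This is purely hypothetical — I don't know whether keys like region_name or profile_name are actually supported by this backend:

```ini
[secrets]
backend = airflow.providers.amazon.aws.secrets.systems_manager.SystemsManagerParameterStoreBackend
# Hypothetical: pass boto3 session settings alongside the prefixes.
backend_kwargs = {"connections_prefix": "/airflow/connections", "region_name": "eu-west-1", "profile_name": "my-profile"}
```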