I'm brand new to Airflow. I've read through the documentation several times, gone through numerous S/O questions and many random articles online, but have yet to solve this issue. I have a feeling it's something really simple I'm doing wrong.
I have Docker for Windows, and I pulled the puckel/docker-airflow image and ran a container with the ports exposed so I can hit the UI from my host. I have another container running mcr.microsoft.com/mssql/server, on which I restored the WideWorldImporters sample database. From the Airflow UI, I have been able to successfully create a connection to this database and can even query it from the Data Profiling section. See the images below:
Connection Creation
Successful Query to Connection
So while that works, my DAG fails at the second task, sqlData. Here is the code:
from airflow.models import DAG
from airflow.operators.bash_operator import BashOperator
from airflow.operators.python_operator import PythonOperator
from airflow.operators.mssql_operator import MssqlOperator
from datetime import timedelta, datetime

copyData = DAG(
    dag_id='copyData',
    schedule_interval='@once',
    start_date=datetime(2019, 1, 1)
)

printHelloBash = BashOperator(
    task_id="print_hello_Bash",
    bash_command='echo "Lets copy some data"',
    dag=copyData
)

mssqlConnection = "WWI"

sqlData = MssqlOperator(
    sql="select top 100 InvoiceDate, TotalDryItems from sales.invoices",
    task_id="select_some_data",
    mssql_conn_id=mssqlConnection,
    database="WideWorldImporters",
    dag=copyData,
    depends_on_past=True
)

queryDataSuccess = BashOperator(
    task_id="confirm_data_queried",
    bash_command='echo "We queried data!"',
    dag=copyData
)

printHelloBash >> sqlData >> queryDataSuccess
Initially the error was:
[2019-02-22 16:13:09,176] {{logging_mixin.py:95}} INFO - [2019-02-22 16:13:09,176] {{base_hook.py:83}} INFO - Using connection to: 172.17.0.3
[2019-02-22 16:13:09,186] {{models.py:1760}} ERROR - Could not create Fernet object: Incorrect padding
Traceback (most recent call last):
  File "/usr/local/lib/python3.6/site-packages/airflow/models.py", line 171, in get_fernet
    _fernet = Fernet(fernet_key.encode('utf-8'))
  File "/usr/local/lib/python3.6/site-packages/cryptography/fernet.py", line 34, in __init__
    key = base64.urlsafe_b64decode(key)
  File "/usr/local/lib/python3.6/base64.py", line 133, in urlsafe_b64decode
    return b64decode(s)
  File "/usr/local/lib/python3.6/base64.py", line 87, in b64decode
    return binascii.a2b_base64(s)
binascii.Error: Incorrect padding
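As an aside, this first failure can be reproduced with just the standard library. Because airflow.cfg still contained the literal placeholder fernet_key = $FERNET_KEY, Fernet tried to base64-decode a string that isn't valid base64. A minimal sketch (the helper name is mine, not Airflow's):

```python
import base64
import binascii
import os

def try_decode(key: str) -> str:
    """Attempt to base64-decode a key the way Fernet does on construction."""
    try:
        base64.urlsafe_b64decode(key.encode("utf-8"))
        return "ok"
    except binascii.Error as exc:
        return "error: {}".format(exc)

# The unsubstituted placeholder from airflow.cfg is not valid base64:
print(try_decode("$FERNET_KEY"))  # error: Incorrect padding

# A 32-byte urlsafe-base64 key (the shape Fernet.generate_key() produces)
# decodes cleanly:
print(try_decode(base64.urlsafe_b64encode(os.urandom(32)).decode()))  # ok
```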
I noticed this had to do with encryption, so I went ahead and ran pip install cryptography and pip install airflow[crypto], both of which returned the exact same result informing me the requirement was already satisfied. Finally, I found something saying I just needed to generate a fernet_key. The default key in my airflow.cfg file was fernet_key = $FERNET_KEY. So from the CLI inside the container I ran:
python -c "from cryptography.fernet import Fernet; print(Fernet.generate_key().decode())"
and got a key, which I substituted for $FERNET_KEY. I restarted the container and reran the DAG, and now my error is:
[2019-02-22 16:22:13,641] {{models.py:1760}} ERROR -
Traceback (most recent call last):
  File "/usr/local/lib/python3.6/site-packages/cryptography/fernet.py", line 106, in _verify_signature
    h.verify(data[-32:])
  File "/usr/local/lib/python3.6/site-packages/cryptography/hazmat/primitives/hmac.py", line 69, in verify
    ctx.verify(signature)
  File "/usr/local/lib/python3.6/site-packages/cryptography/hazmat/backends/openssl/hmac.py", line 73, in verify
    raise InvalidSignature("Signature did not match digest.")
cryptography.exceptions.InvalidSignature: Signature did not match digest.
From my initial scan of the cryptography docs, this seems to have something to do with compatibility, but I'm not sure what.
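For anyone hitting this same InvalidSignature error: my reading of it (an inference on my part, not something from the docs) is that the connection created earlier was encrypted under the old auto-generated key, so after swapping in a new fernet_key the stored token no longer verifies. Fernet tokens carry an HMAC, and verifying with a key other than the one that signed them always fails. A stdlib-only sketch of that effect:

```python
import hashlib
import hmac
import os

old_key, new_key = os.urandom(32), os.urandom(32)
token = b"ciphertext of a stored connection password"

# Signed under the key that was active when the connection was saved:
tag = hmac.new(old_key, token, hashlib.sha256).digest()

# Verifying with the replacement key fails, which is what Fernet surfaces
# as InvalidSignature("Signature did not match digest."):
print(hmac.compare_digest(tag, hmac.new(new_key, token, hashlib.sha256).digest()))  # False
print(hmac.compare_digest(tag, hmac.new(old_key, token, hashlib.sha256).digest()))  # True
```

The practical implication is that after rotating the key you either keep the original key around or delete and re-create the affected connections.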
version: '2.1'
services:
    postgres:
        image: postgres:9.6
        environment:
            - POSTGRES_USER=airflow
            - POSTGRES_PASSWORD=airflow
            - POSTGRES_DB=airflow

    mssql:
        image: dw:latest
        ports:
            - "1433:1433"

    webserver:
        image: puckel/docker-airflow:1.10.2
        restart: always
        depends_on:
            - postgres
            - mssql
        environment:
            - LOAD_EX=n
            - EXECUTOR=Local
        # volumes:
        #     - ./dags:/usr/local/airflow/dags
        # Uncomment to include custom plugins
        #     - ./plugins:/usr/local/airflow/plugins
        ports:
            - "8080:8080"
        command: webserver
        healthcheck:
            test: ["CMD-SHELL", "[ -f /usr/local/airflow/airflow-webserver.pid ]"]
            interval: 30s
            timeout: 30s
            retries: 3
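One tweak worth adding on top of this (my addition, not part of the compose file above): the puckel entrypoint generates a random Fernet key at startup when none is provided, so re-creating the webserver container can silently invalidate any saved connections. Pinning the key in the service's environment avoids that; something like:

```yaml
    webserver:
        environment:
            - LOAD_EX=n
            - EXECUTOR=Local
            # generate once with Fernet.generate_key() and keep it fixed
            # (<your generated key> is a placeholder, paste your own):
            - FERNET_KEY=<your generated key>
```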
Note that dw is what I named the new image based on the mssql container. Next, I renamed the file to docker-compose.yml so that I could easily run docker-compose up (not sure if there's a command to point directly at a differently named YAML file).

Once everything was up and running, I navigated to the Airflow UI and configured my connection. Note: since you are using docker-compose, you do NOT need to know the IP addresses of the other containers, since they use DNS service discovery, which I found out about here.

Then, to test the connection, I went to Data Profiling to do an ad-hoc query, but the connection wasn't there. This is because the puckel/docker-airflow image doesn't have pymssql installed. So just bash into the container with docker exec -it airflow_webserver_container bash and install it: pip install pymssql --user. Exit the container and restart all the services with docker-compose restart.

A minute later everything was up and running. My connection showed up in Ad hoc Query and I could successfully select data. Finally, I turned my DAG on, the scheduler picked it up, and everything was successful! Super relieved after weeks of Googling.

Thanks to @y2k-shubham for the help, and a super huge thanks to @Tomasz, whose awesome and thorough post about Airflow on the r/datascience subreddit is actually what got me started with it in the first place.
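A follow-up thought of my own (not from the answer above): pip install inside a running container only survives restarts, not re-creation, so if the webserver container is ever rebuilt (for example with docker-compose up --force-recreate), pymssql would be gone again. Baking it into a small derived image makes the change permanent; a sketch:

```dockerfile
# extend the puckel image so pymssql survives container re-creation
FROM puckel/docker-airflow:1.10.2
# the base image runs as the airflow user, so switch to root to install
USER root
RUN pip install pymssql
USER airflow
```

Build it with a tag of your choosing and point the webserver service's image: at the result.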