
At work at GE Aerospace, I support data ingestion into a data lake. I'm not going to go into the details here, but I would love to use Airflow instead of the stack we have today.
I've done a few proofs of concept with Airflow in the past. It's a solid solution, and with the hype around AI these days, quick and reliable data ingestion has never been more critical.

Over the last few years, every provider seems to be scaling back their on-premises options in favor of their hosted solutions.
And I get why: for small to midsize companies, it's easier to just deploy a cloud-based solution. But in a really large and/or regulation-heavy environment, it's important to be able to self-host and manage your own data, rather than shipping it off to a third party and trusting them with it.
I am, however, going to put a few constraints in place around how I want to use Airflow.
I've just started with a repo at: https://github.com/ChrisTowles/airflow-playground
I don't usually post proofs of concept like this publicly, but I'm doing it on my own time, so let's see how this goes and where it takes me.