
At GE Aerospace, my work revolves around supporting data ingestion into a data lake. I'm not going to go into the details here, but I would love to use Airflow instead of the stack we have today.
I've done a few proofs of concept with Airflow in the past. It's a solid solution, and with the hype around AI these days, quick and reliable data ingestion has never been more critical.

Over the last few years, every provider seems to be reducing their on-premises options in favor of hosted-only solutions.
And I get why: for small to midsize companies, it's easier to just deploy a cloud-based solution. But in a really large and/or regulation-heavy environment, it's more important to be able to self-host and manage your own data rather than shipping it off to a third party and trusting them with it.
I am, however, going to put a few constraints in place on how I want to use Airflow.
I've just started with a repo at https://github.com/ChrisTowles/airflow-playground.
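
To give a sense of what will live there, here's a minimal sketch of the kind of DAG I have in mind, using Airflow's TaskFlow API. The DAG name and the extract/load tasks are placeholders rather than anything from the actual project, and it assumes Airflow 2.4 or newer.

```python
from datetime import datetime

from airflow.decorators import dag, task


@dag(
    schedule="@daily",           # run once a day
    start_date=datetime(2024, 1, 1),
    catchup=False,               # don't backfill missed runs
    tags=["playground"],
)
def ingest_example():
    @task
    def extract() -> list[dict]:
        # Placeholder for pulling records from a source system.
        return [{"id": 1, "value": "hello"}, {"id": 2, "value": "world"}]

    @task
    def load(records: list[dict]) -> None:
        # Placeholder for writing into the data lake.
        print(f"loading {len(records)} records")

    load(extract())


ingest_example()
```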
I don't usually post proofs of concept like this publicly, but I'm doing this one on my own time, so let's see how it goes and where it takes me.