Airflow is a Python package like any other: it is a collection of Python classes that interact with each other, and with external systems, to run the DAG tasks. So you can use any testing framework to implement Airflow tests (Robot, pytest, unittest, doctest, Nose2, Testify, ...).
For testing, you can check the DAG files' parsing performance and whether the parsing raises any errors (check this answer); a pytest sketch of this follows below. You can unit test an operator by preparing the task context and calling its execute method (here are some examples), and for integration tests you can use mocks (check these tests used to test Airflow's official operators); sketches of both also follow.
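For the parsing check, a minimal sketch with pytest, assuming your DAG files live in a `dags/` folder (the 30-second threshold is an arbitrary placeholder):

```python
import time

from airflow.models import DagBag


def test_dags_parse_cleanly_and_quickly():
    start = time.monotonic()
    dagbag = DagBag(dag_folder="dags/", include_examples=False)
    elapsed = time.monotonic() - start

    # No DAG file should fail to import.
    assert dagbag.import_errors == {}
    # Slow parsing also slows the scheduler; the threshold is arbitrary.
    assert elapsed < 30
```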
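For the operator unit test, a sketch of the pattern; `MultiplyOperator` is a made-up stand-in for your own operator, the point being that `execute` can be called directly with a context dict:

```python
from airflow.models import BaseOperator


# Stand-in for your own operator (hypothetical).
class MultiplyOperator(BaseOperator):
    def __init__(self, x, y, **kwargs):
        super().__init__(**kwargs)
        self.x = x
        self.y = y

    def execute(self, context):
        return self.x * self.y


def test_multiply_operator_execute():
    op = MultiplyOperator(task_id="multiply", x=3, y=4)
    # An empty context is enough when the operator does not read from it;
    # otherwise build a dict with the keys the operator uses (ti, ds, ...).
    assert op.execute(context={}) == 12
```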
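And for the mock-based integration test, a sketch that patches the hook so the operator logic runs without a real database; `CountRowsOperator` is hypothetical, and the example assumes the `apache-airflow-providers-postgres` package is installed:

```python
from unittest import mock

from airflow.models import BaseOperator
from airflow.providers.postgres.hooks.postgres import PostgresHook


# Hypothetical operator that reads from Postgres through a hook.
class CountRowsOperator(BaseOperator):
    def __init__(self, table, **kwargs):
        super().__init__(**kwargs)
        self.table = table

    def execute(self, context):
        hook = PostgresHook(postgres_conn_id="my_db")
        return hook.get_first(f"SELECT COUNT(*) FROM {self.table}")[0]


@mock.patch.object(PostgresHook, "get_first", return_value=(42,))
def test_count_rows_without_a_real_database(mock_get_first):
    op = CountRowsOperator(task_id="count_rows", table="users")
    assert op.execute(context={}) == 42
    mock_get_first.assert_called_once()
```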
If you want to implement some functional tests, you can run an Airflow server and create a separate process (local, Docker, K8S pod, ...) configured against the same metastore (the Airflow DB), then use the DagBag class to find the DAGs and create runs, and finally poll the state of the tasks every x seconds to check whether they did their job (the result is written to an external system, check the XCom, ...). I don't have public resources about this part, but the implementation is not very complicated; a rough sketch follows.
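A rough sketch of that last part against the Airflow 2 APIs; it assumes a scheduler and executor are already running against the same metastore, `my_dag_id` is a placeholder, and `create_dagrun`'s exact signature varies a bit between Airflow versions:

```python
import time

from airflow.models import DagBag
from airflow.utils import timezone
from airflow.utils.state import State
from airflow.utils.types import DagRunType

# Assumes this process shares its metastore configuration with a running
# scheduler/executor, so the run created below actually gets executed.
dagbag = DagBag(dag_folder="dags/", include_examples=False)
dag = dagbag.get_dag("my_dag_id")  # placeholder DAG id

run = dag.create_dagrun(
    run_type=DagRunType.MANUAL,
    execution_date=timezone.utcnow(),
    state=State.QUEUED,
    external_trigger=True,
)

# Poll every x seconds until the run reaches a terminal state.
while run.state not in (State.SUCCESS, State.FAILED):
    time.sleep(10)
    run.refresh_from_db()

assert run.state == State.SUCCESS
# Then check that each task did its job (external system state, XCom, ...).
for ti in run.get_task_instances():
    print(ti.task_id, ti.state)
```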