

Location of /home/airflow


I specified 3 nodes when creating a Cloud Composer environment. I tried to connect to the worker nodes via SSH, but I cannot find an airflow directory in /home. Where exactly is it located?


































  • Try echo $AIRFLOW_HOME.

    – Maroun
    Mar 28 at 6:12

















google-cloud-platform airflow google-cloud-composer






google-cloud-platform airflow google-cloud-composer
asked Mar 28 at 6:06









user11270223

1 Answer
Cloud Composer runs Airflow on GKE, so you won't find data directly on any of the host GCE instances. Instead, Airflow processes are run within Kubernetes-managed containers, which either mount or sync data to the /home/airflow directory. To find the directory you will need to look within a running container.



Since each environment stores its Airflow data in a GCS bucket, you can alternatively inspect the files using the Cloud Console or gsutil. If you really want to view /home/airflow from a shell, you can use kubectl exec, which lets you run commands in (or open a shell on) any pod/container in the Kubernetes cluster. For example:



# Obtain the name of the Composer environment's GKE cluster
$ gcloud composer environments describe $ENV_NAME --location $REGION

# Fetch Kubernetes credentials for that cluster
$ gcloud container clusters get-credentials $GKE_CLUSTER_NAME


Once you have Kubernetes credentials, you can list the running pods and open a shell in one of them:



# List running pods
$ kubectl get pods

# SSH into a pod
$ kubectl exec -it $POD_NAME -- bash
airflow-worker-a93j$ ls /home/airflow
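The GCS-bucket route mentioned above can be sketched like this (the environment name, location, and bucket path below are hypothetical placeholders, not values from the question):

```shell
# Hypothetical environment name and location for illustration.
# Print the bucket prefix that Composer syncs to /home/airflow/gcs/dags:
gcloud composer environments describe my-env --location us-central1 \
  --format="get(config.dagGcsPrefix)"

# Then browse the DAG files directly with gsutil, no pod access needed:
gsutil ls gs://us-central1-my-env-1234abcd-bucket/dags
```

This avoids touching the cluster at all, which is usually enough when you only want to see or edit the DAG files that end up under /home/airflow.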





        edited Mar 31 at 3:32

























        answered Mar 31 at 3:19









hexacyanide

60.1k rep · 24 gold badges · 130 silver badges · 130 bronze badges



