
Investigate issues with failing deployment k8s deploy #119

@ryan109


We have a scenario where kubetools deploys an app and considers the deploy successful even though the new containers never came up.

This command_deck deploy is where it was discovered: http://jenkins.edtd.net/job/EDITED%20Github/job/command_deck/job/master/164/console

According to the deploy output, all existing k8s objects were successfully pushed. However, the first container in the deployment never came up because of a dependency issue. I would have expected the "Update existing app deployments" stage to fail, but it didn't.
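For reference, the kind of check I'd expect that stage to perform is sketched below. This is a minimal sketch against the official Kubernetes Python client, not kubetools code (the `wait_for_rollout` helper, its parameters, and the timeout values are all hypothetical). It mirrors what `kubectl rollout status` does: only treat the rollout as successful once the controller has observed the new generation and every replica is both updated and available.

```python
import time

from kubernetes import client, config


def wait_for_rollout(name, namespace, timeout=300, interval=5):
    """Poll a Deployment until its latest generation is fully rolled out."""
    apps = client.AppsV1Api()
    deadline = time.time() + timeout
    while time.time() < deadline:
        dep = apps.read_namespaced_deployment(name, namespace)
        status, spec = dep.status, dep.spec
        # Rollout is complete only when the controller has seen the new
        # spec and every replica is updated and available.
        if (
            (status.observed_generation or 0) >= dep.metadata.generation
            and (status.updated_replicas or 0) == spec.replicas
            and (status.available_replicas or 0) == spec.replicas
        ):
            return True
        time.sleep(interval)
    return False  # e.g. new pods stuck in CrashLoopBackOff


if __name__ == "__main__":
    config.load_kube_config()
    if not wait_for_rollout("command-deck", "production"):
        raise SystemExit("deployment command-deck failed to roll out")
```

With something like this wired into the deploy stage, the failing container above would presumably have failed the build rather than reporting success.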

This didn't cause the app to go down, as the previous containers were never removed after the new version failed to roll out. Again, this is something I thought would have been caught by the "Update existing app deployments" stage. Here's the relevant deploy output:

--> Executing changes:
    UPDATE service command-deck
    UPDATE deployment command-deck

--> Create and/or update namespace
    Update namespace: production

--> Update existing app deployments
    Update deployment: command-deck

--> Update existing app services
    Update service: command-deck
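On the second point: keeping the old containers running is expected Kubernetes RollingUpdate behaviour, since the old ReplicaSet is only scaled down as new pods become ready. The tooling can still surface the failure, though: once `spec.progressDeadlineSeconds` elapses without progress, Kubernetes sets the Deployment's `Progressing` condition to reason `ProgressDeadlineExceeded`, which a deploy stage could check. A sketch (again using the Python client, not kubetools code):

```python
from kubernetes import client, config

config.load_kube_config()
apps = client.AppsV1Api()

dep = apps.read_namespaced_deployment("command-deck", "production")
for cond in dep.status.conditions or []:
    # Kubernetes flips this condition once progressDeadlineSeconds
    # (default 600s) passes with no rollout progress; the old
    # ReplicaSet keeps serving traffic in the meantime.
    if cond.type == "Progressing" and cond.reason == "ProgressDeadlineExceeded":
        raise SystemExit(f"rollout of command-deck stalled: {cond.message}")
print("rollout progressing or complete")
```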
