Labels: kind/bug, oadp-1.1
Description
Contact Details
No response
Describe bug
Hello,
I installed the OADP Operator, version 0.5.6, on OpenShift 4.9.0, and I have an S3 bucket in eu-north-1.
I created a DataProtectionApplication using this YAML:
apiVersion: oadp.openshift.io/v1alpha1
kind: DataProtectionApplication
metadata:
  name: velero
  namespace: openshift-adp
spec:
  backupLocations:
    - velero:
        config:
          profile: default
          region: eu-north-1
        credential:
          key: cloud
          name: cloud-credentials
        default: true
        objectStorage:
          bucket: my-bucket
          prefix: velero
        provider: aws
  configuration:
    restic:
      enable: true
    velero:
      defaultPlugins:
        - openshift
        - aws
  snapshotLocations:
    - velero:
        config:
          profile: default
          region: eu-north-1
        provider: aws
But when the operator creates the pods, I get:
# oc -n openshift-adp get pod
NAME READY STATUS RESTARTS AGE
oadp-velero-1-aws-registry-68bf57c5b9-ldpwz 0/1 CrashLoopBackOff 1 (9s ago) 12s
openshift-adp-controller-manager-56869f8855-nvx77 1/1 Running 0 16m
restic-g82jr 1/1 Running 0 12s
restic-gsrvj 1/1 Running 0 12s
restic-mr5zw 1/1 Running 0 12s
restic-x548c 1/1 Running 0 12s
velero-6998f4c769-hk8s9 1/1 Running 0 12s
What happened?
The oadp-velero-1-aws-registry pod crashed because the registry does not recognize the AWS eu-north-1 (Stockholm) region.
This is very strange, because that region launched back in 2018.
OADP Version
0.5.x (Stable)
OpenShift Version
4.9
Velero pod logs
time="2022-06-08T09:40:21.144008127Z" level=info msg="debug server listening :5001"
panic: invalid region provided: eu-north-1
goroutine 1 [running]:
github.com/docker/distribution/registry/handlers.NewApp({0xe57c20, 0xc0003b85d0}, 0xc000394380)
/go/src/github.com/docker/distribution/registry/handlers/app.go:126 +0x2269
github.com/docker/distribution/registry.NewRegistry({0xe57c20, 0xc000365380}, 0xc000394380)
/go/src/github.com/docker/distribution/registry/registry.go:106 +0x145
github.com/docker/distribution/registry.glob..func1(0x13af2a0, {0xc0002df810, 0x1, 0x1})
/go/src/github.com/docker/distribution/registry/registry.go:64 +0x194
github.com/spf13/cobra.(*Command).execute(0x13af2a0, {0xc0002df7e0, 0x1, 0x1})
/go/src/github.com/docker/distribution/vendor/github.com/spf13/cobra/command.go:766 +0x5f8
github.com/spf13/cobra.(*Command).ExecuteC(0x13af500)
/go/src/github.com/docker/distribution/vendor/github.com/spf13/cobra/command.go:852 +0x2dc
github.com/spf13/cobra.(*Command).Execute(...)
/go/src/github.com/docker/distribution/vendor/github.com/spf13/cobra/command.go:800
main.main()
/go/src/github.com/docker/distribution/cmd/registry/main.go:24 +0x25
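For context, the panic above comes out of docker/distribution's startup path, which (in older builds) validates the configured S3 region against a list compiled into the binary, so any region newer than the vendored AWS support is rejected regardless of whether it is real. A hypothetical minimal Go sketch of that pattern follows; the real driver's region list, function names, and panic site differ:

```go
package main

import "fmt"

// knownRegions stands in for the region allow-list compiled into an
// older registry binary. eu-north-1 (launched 2018) is deliberately
// absent here to reproduce the failure mode from the logs above.
var knownRegions = map[string]bool{
	"us-east-1": true,
	"eu-west-1": true,
	// eu-north-1 is missing from the compiled-in list
}

// validateRegion mimics a static lookup: a valid, existing AWS region
// still fails if the binary predates it.
func validateRegion(region string) error {
	if !knownRegions[region] {
		return fmt.Errorf("invalid region provided: %s", region)
	}
	return nil
}

func main() {
	if err := validateRegion("eu-north-1"); err != nil {
		// prints: invalid region provided: eu-north-1
		fmt.Println(err)
	}
}
```

The takeaway is that the fix belongs in the registry image (a build with an updated region list or newer AWS SDK), not in the DataProtectionApplication spec, since the region value itself is correct.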
Restic pod logs
No response
Operator pod logs
No response