
Bug: Not working in K8s community version with OS ubuntu #1047

Open
sujitkagarwal opened this issue Aug 21, 2024 · 2 comments
Labels
bug Something isn't working

Comments

@sujitkagarwal

What version of redis operator are you using?
v0.17.0

kubectl logs <_redis-operator_pod_name> -n <namespace>

{"level":"info","ts":1724242879.4357052,"logger":"controller.redis","msg":"Starting Controller","reconciler group":"redis.redis.opstreelabs.in","reconciler kind":"Redis"}
{"level":"info","ts":1724242879.435408,"logger":"controller.redissentinel","msg":"Starting Controller","reconciler group":"redis.redis.opstreelabs.in","reconciler kind":"RedisSentinel"}
{"level":"info","ts":1724242879.4354303,"logger":"controller.rediscluster","msg":"Starting Controller","reconciler group":"redis.redis.opstreelabs.in","reconciler kind":"RedisCluster"}
{"level":"info","ts":1724242879.72707,"logger":"controller.redissentinel","msg":"Starting workers","reconciler group":"redis.redis.opstreelabs.in","reconciler kind":"RedisSentinel","worker count":1}
{"level":"info","ts":1724242879.727451,"logger":"controller.rediscluster","msg":"Starting workers","reconciler group":"redis.redis.opstreelabs.in","reconciler kind":"RedisCluster","worker count":1}
{"level":"info","ts":1724242879.7271116,"logger":"controller.redisreplication","msg":"Starting workers","reconciler group":"redis.redis.opstreelabs.in","reconciler kind":"RedisReplication","worker count":1}
{"level":"info","ts":1724242879.727457,"logger":"controller.redis","msg":"Starting workers","reconciler group":"redis.redis.opstreelabs.in","reconciler kind":"Redis","worker count":1}
{"level":"info","ts":1724242985.7979865,"logger":"controllers.Redis","msg":"Reconciling opstree redis controller","Request.Namespace":"ot-operators","Request.Name":"redis-standalone"}
{"level":"info","ts":1724242985.8432214,"logger":"KubeAPIWarningLogger","msg":"unknown field \"spec.storage.volumeClaimTemplate.metadata.creationTimestamp\""}
{"level":"info","ts":1724242985.843453,"logger":"KubeAPIWarningLogger","msg":"metadata.finalizers: \"redisFinalizer\": prefer a domain-qualified finalizer name to avoid accidental conflicts with other finalizer writers"}
{"level":"info","ts":1724242985.8607802,"logger":"controller_redis","msg":"Redis statefulset get action failed","Request.StatefulSet.Namespace":"ot-operators","Request.StatefulSet.Name":"redis-standalone"}
panic: runtime error: invalid memory address or nil pointer dereference
[signal SIGSEGV: segmentation violation code=0x1 addr=0x0 pc=0x14268ea]

goroutine 387 [running]:
github.com/OT-CONTAINER-KIT/redis-operator/k8sutils.getProbeInfo(...)
	/workspace/k8sutils/statefulset.go:516
github.com/OT-CONTAINER-KIT/redis-operator/k8sutils.generateContainerDef({_, _}, {{0xc00094c0a0, 0x1c}, {0xc000920370, 0xc}, 0x0, {0x0, 0x0}, {0x0, ...}, ...}, ...)
	/workspace/k8sutils/statefulset.go:338 +0x12a
github.com/OT-CONTAINER-KIT/redis-operator/k8sutils.generateStatefulSetsDef({{0xc000920340, 0x10}, {0x0, 0x0}, {0xc000920350, 0xc}, {0x0, 0x0}, {0x0, 0x0}, ...}, ...)
	/workspace/k8sutils/statefulset.go:223 +0x1dc
github.com/OT-CONTAINER-KIT/redis-operator/k8sutils.CreateOrUpdateStateFul({_, _}, {{0xc000920340, 0x10}, {0x0, 0x0}, {0xc000920350, 0xc}, {0x0, 0x0}, ...}, ...)
	/workspace/k8sutils/statefulset.go:89 +0x1ad
github.com/OT-CONTAINER-KIT/redis-operator/k8sutils.CreateStandaloneRedis(0xc000195000)
	/workspace/k8sutils/redis-standalone.go:56 +0x3f8
github.com/OT-CONTAINER-KIT/redis-operator/controllers.(*RedisReconciler).Reconcile(0xc000528990, {0xc0007cdb30, 0x15742c0}, {{{0xc000153ad0, 0x167fea0}, {0xc000153ac0, 0x30}}})
	/workspace/controllers/redis_controller.go:66 +0x2c8
sigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).Reconcile(0xc00089c6e0, {0x195e118, 0xc0007cdb30}, {{{0xc000153ad0, 0x167fea0}, {0xc000153ac0, 0x413a34}}})
	/go/pkg/mod/sigs.k8s.io/controller-runtime@v0.11.0/pkg/internal/controller/controller.go:114 +0x26f
sigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).reconcileHandler(0xc00089c6e0, {0x195e070, 0xc0008d7600}, {0x15ca160, 0xc00071efe0})
	/go/pkg/mod/sigs.k8s.io/controller-runtime@v0.11.0/pkg/internal/controller/controller.go:311 +0x33e
sigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).processNextWorkItem(0xc00089c6e0, {0x195e070, 0xc0008d7600})
	/go/pkg/mod/sigs.k8s.io/controller-runtime@v0.11.0/pkg/internal/controller/controller.go:266 +0x205
sigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).Start.func2.2()
	/go/pkg/mod/sigs.k8s.io/controller-runtime@v0.11.0/pkg/internal/controller/controller.go:227 +0x85
created by sigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).Start.func2
	/go/pkg/mod/sigs.k8s.io/controller-runtime@v0.11.0/pkg/internal/controller/controller.go:223 +0x357





redis-operator version: v0.17.0

**Does this issue reproduce with the latest release?**

Yes, the issue reproduces with v0.17.0.

**What operating system and processor architecture are you using (`kubectl version`)?**

PRETTY_NAME="Ubuntu 22.04.4 LTS"
NAME="Ubuntu"
VERSION_ID="22.04"
VERSION="22.04.4 LTS (Jammy Jellyfish)"
VERSION_CODENAME=jammy
ID=ubuntu
ID_LIKE=debian
HOME_URL="https://www.ubuntu.com/"
SUPPORT_URL="https://help.ubuntu.com/"
BUG_REPORT_URL="https://bugs.launchpad.net/ubuntu/"
PRIVACY_POLICY_URL="https://www.ubuntu.com/legal/terms-and-policies/privacy-policy"
UBUNTU_CODENAME=jammy

<details><summary><code>kubectl version</code> Output</summary><br><pre>
$ kubectl version
WARNING: This version information is deprecated and will be replaced with the output from kubectl version --short.  Use --output=yaml|json to get the full version.
Client Version: version.Info{Major:"1", Minor:"27", GitVersion:"v1.27.3", GitCommit:"25b4e43193bcda6c7328a6d147b1fb73a33f1598", GitTreeState:"clean", BuildDate:"2023-06-14T09:53:42Z", GoVersion:"go1.20.5", Compiler:"gc", Platform:"linux/amd64"}
Kustomize Version: v5.0.1
Server Version: version.Info{Major:"1", Minor:"27", GitVersion:"v1.27.3", GitCommit:"25b4e43193bcda6c7328a6d147b1fb73a33f1598", GitTreeState:"clean", BuildDate:"2023-06-14T09:47:40Z", GoVersion:"go1.20.5", Compiler:"gc", Platform:"linux/amd64"}
</pre></details>

**What did you do?**

<!--
If possible, provide a recipe for reproducing the error.
A detailed sequence of steps describing what to do to observe the issue is good.
A complete runnable bash shell script is best.
-->
Deployed the operator, then deployed the cluster.

**What did you expect to see?**

When I try to deploy standalone, replication, and cluster setups, all of them fail.

**What did you see instead?**


The pods are not coming up.
@sujitkagarwal sujitkagarwal added the bug Something isn't working label Aug 21, 2024
@woodliu

woodliu commented Sep 9, 2024

@sujitkagarwal Can you provide the manifest content from the Kubernetes cluster?

apiVersion: redis.redis.opstreelabs.in/v1beta1
kind: Redis

@robermar23

@sujitkagarwal You probably want to upgrade to v0.18.0; you can still use your v1beta1 resource as is.
