
KServe v2 returns model version now #2973

Merged: 3 commits merged into master on Feb 28, 2024
Conversation

@agunapal (Collaborator) commented Feb 27, 2024

Description

KServe nightly tests are failing because KServe now returns the model version in its v2 responses.

This PR updates the KServe v2 tests to check the model version as well.

This behavior appears to have changed recently in kserve/kserve#3466.

Fixes #(issue)
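A minimal sketch of the kind of assertion this change adds: a KServe v2 response now carries a `model_version` field alongside `model_name`, so the test must validate both. The function name, expected version, and sample payload below are illustrative, not the repository's actual test code.

```python
import json

def check_v2_response(raw: str, expected_name: str, expected_version: str = "1.0"):
    """Parse a KServe v2 inference response and verify its identity fields.

    Hypothetical helper: KServe v2 responses include "model_name", and (as of
    kserve/kserve#3466) "model_version", which older tests did not check.
    """
    resp = json.loads(raw)
    assert resp["model_name"] == expected_name
    # New check: the model version is now part of the response.
    assert resp.get("model_version") == expected_version
    return resp["outputs"]

# Placeholder response payload in the v2 response shape.
sample = '{"model_name": "mnist", "model_version": "1.0", "id": "abc", "outputs": []}'
check_v2_response(sample, "mnist")
```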

Type of change

Please delete options that are not relevant.

  • Bug fix (non-breaking change which fixes an issue)
  • Breaking change (fix or feature that would cause existing functionality to not work as expected)
  • New feature (non-breaking change which adds functionality)
  • This change requires a documentation update

Feature/Issue validation/testing

  • Logs
MNIST KServe V2 test begin
Deploying the cluster
inferenceservice.serving.kserve.io/torchserve-mnist-v2 created
Waiting for pod to come up...
Check status of the pod
NAME                                                              READY   STATUS    RESTARTS   AGE
torchserve-mnist-v2-predictor-00001-deployment-868bcbc5c6-tv6j4   1/2     Running   0          80s
Name:             torchserve-mnist-v2-predictor-00001-deployment-868bcbc5c6-tv6j4
Namespace:        default
Priority:         0
Service Account:  default
Node:             minikube/192.168.49.2
Start Time:       Tue, 27 Feb 2024 23:46:42 +0000
Labels:           app=torchserve-mnist-v2-predictor-00001
                  component=predictor
                  pod-template-hash=868bcbc5c6
                  service.istio.io/canonical-name=torchserve-mnist-v2-predictor
                  service.istio.io/canonical-revision=torchserve-mnist-v2-predictor-00001
                  serviceEnvelope=kservev2
                  serving.knative.dev/configuration=torchserve-mnist-v2-predictor
                  serving.knative.dev/configurationGeneration=1
                  serving.knative.dev/configurationUID=792ae138-ac66-4a41-aab5-8a146bdc1d9e
                  serving.knative.dev/revision=torchserve-mnist-v2-predictor-00001
                  serving.knative.dev/revisionUID=761b97fb-7674-46cf-83a5-ba04eaf83b86
                  serving.knative.dev/service=torchserve-mnist-v2-predictor
                  serving.knative.dev/serviceUID=87836584-0a79-4559-8c5b-3bac7b342479
                  serving.kserve.io/inferenceservice=torchserve-mnist-v2
Annotations:      autoscaling.knative.dev/class: kpa.autoscaling.knative.dev
                  autoscaling.knative.dev/min-scale: 1
                  internal.serving.kserve.io/storage-initializer-sourceuri: gs://kfserving-examples/models/torchserve/image_classifier/v2
                  prometheus.kserve.io/path: /metrics
                  prometheus.kserve.io/port: 8082
                  serving.knative.dev/creator: system:serviceaccount:kserve:kserve-controller-manager
                  serving.kserve.io/enable-metric-aggregation: false
                  serving.kserve.io/enable-prometheus-scraping: false
Status:           Running
IP:               10.244.0.17
IPs:
  IP:           10.244.0.17
Controlled By:  ReplicaSet/torchserve-mnist-v2-predictor-00001-deployment-868bcbc5c6
Init Containers:
  storage-initializer:
    Container ID:  docker://24b1d230adede55565dd11962f4d4e60b512d711959946ff15d8aa475f3b91d0
    Image:         kserve/storage-initializer:v0.11.1
    Image ID:      docker-pullable://kserve/storage-initializer@sha256:5cfc8ae36cc0b894097db90a9df3b034f707efdbba24e91c8d36087f6f0dc78d
    Port:          <none>
    Host Port:     <none>
    Args:
      gs://kfserving-examples/models/torchserve/image_classifier/v2
      /mnt/models
    State:          Terminated
      Reason:       Completed
      Exit Code:    0
      Started:      Tue, 27 Feb 2024 23:46:58 +0000
      Finished:     Tue, 27 Feb 2024 23:47:04 +0000
    Ready:          True
    Restart Count:  0
    Limits:
      cpu:     1
      memory:  1Gi
    Requests:
      cpu:        100m
      memory:     100Mi
    Environment:  <none>
    Mounts:
      /mnt/models from kserve-provision-location (rw)
      /var/run/secrets/kubernetes.io/serviceaccount from kube-api-access-5prfw (ro)
Containers:
  kserve-container:
    Container ID:  docker://c6521931a4c667ac09f1165713b474bdf7d1e9c51e65c6065ab19011ac60b684
    Image:         index.docker.io/pytorch/torchserve-kfs-nightly@sha256:43bbf3d98b6549040cbb1b38e89f0656f4bde9ba4d826663e1501386e162bcbf
    Image ID:      docker-pullable://pytorch/torchserve-kfs-nightly@sha256:43bbf3d98b6549040cbb1b38e89f0656f4bde9ba4d826663e1501386e162bcbf
    Port:          8080/TCP
    Host Port:     0/TCP
    Args:
      torchserve
      --start
      --model-store=/mnt/models/model-store
      --ts-config=/mnt/models/config/config.properties
    State:          Running
      Started:      Tue, 27 Feb 2024 23:47:59 +0000
    Ready:          True
    Restart Count:  0
    Limits:
      cpu:     1
      memory:  1Gi
    Requests:
      cpu:     100m
      memory:  256Mi
    Environment:
      PROTOCOL_VERSION:     v2
      TS_SERVICE_ENVELOPE:  kservev2
      PORT:                 8080
      K_REVISION:           torchserve-mnist-v2-predictor-00001
      K_CONFIGURATION:      torchserve-mnist-v2-predictor
      K_SERVICE:            torchserve-mnist-v2-predictor
    Mounts:
      /mnt/models from kserve-provision-location (ro)
      /var/run/secrets/kubernetes.io/serviceaccount from kube-api-access-5prfw (ro)
  queue-proxy:
    Container ID:   docker://b0c2ecc9817027bf746d39a7ea0a3c1bd4cf53c6d1d015d86e48845e88c2104d
    Image:          gcr.io/knative-releases/knative.dev/serving/cmd/queue@sha256:65c427aaab3be9cea1afea32cdef26d5855c69403077d2dc3439f75c26a1e83f
    Image ID:       docker-pullable://gcr.io/knative-releases/knative.dev/serving/cmd/queue@sha256:65c427aaab3be9cea1afea32cdef26d5855c69403077d2dc3439f75c26a1e83f
    Ports:          8022/TCP, 9090/TCP, 9091/TCP, 8012/TCP, 8112/TCP
    Host Ports:     0/TCP, 0/TCP, 0/TCP, 0/TCP, 0/TCP
    State:          Running
      Started:      Tue, 27 Feb 2024 23:48:00 +0000
    Ready:          False
    Restart Count:  0
    Requests:
      cpu:      25m
    Readiness:  http-get http://:8012/ delay=0s timeout=1s period=10s #success=1 #failure=3
    Environment:
      SERVING_NAMESPACE:                        default
      SERVING_SERVICE:                          torchserve-mnist-v2-predictor
      SERVING_CONFIGURATION:                    torchserve-mnist-v2-predictor
      SERVING_REVISION:                         torchserve-mnist-v2-predictor-00001
      QUEUE_SERVING_PORT:                       8012
      QUEUE_SERVING_TLS_PORT:                   8112
      CONTAINER_CONCURRENCY:                    0
      REVISION_TIMEOUT_SECONDS:                 300
      REVISION_RESPONSE_START_TIMEOUT_SECONDS:  0
      REVISION_IDLE_TIMEOUT_SECONDS:            0
      SERVING_POD:                              torchserve-mnist-v2-predictor-00001-deployment-868bcbc5c6-tv6j4 (v1:metadata.name)
      SERVING_POD_IP:                            (v1:status.podIP)
      SERVING_LOGGING_CONFIG:                   
      SERVING_LOGGING_LEVEL:                    
      SERVING_REQUEST_LOG_TEMPLATE:             {"httpRequest": {"requestMethod": "{{.Request.Method}}", "requestUrl": "{{js .Request.RequestURI}}", "requestSize": "{{.Request.ContentLength}}", "status": {{.Response.Code}}, "responseSize": "{{.Response.Size}}", "userAgent": "{{js .Request.UserAgent}}", "remoteIp": "{{js .Request.RemoteAddr}}", "serverIp": "{{.Revision.PodIP}}", "referer": "{{js .Request.Referer}}", "latency": "{{.Response.Latency}}s", "protocol": "{{.Request.Proto}}"}, "traceId": "{{index .Request.Header "X-B3-Traceid"}}"}
      SERVING_ENABLE_REQUEST_LOG:               false
      SERVING_REQUEST_METRICS_BACKEND:          prometheus
      TRACING_CONFIG_BACKEND:                   none
      TRACING_CONFIG_ZIPKIN_ENDPOINT:           
      TRACING_CONFIG_DEBUG:                     false
      TRACING_CONFIG_SAMPLE_RATE:               0.1
      USER_PORT:                                8080
      SYSTEM_NAMESPACE:                         knative-serving
      METRICS_DOMAIN:                           knative.dev/internal/serving
      SERVING_READINESS_PROBE:                  {"tcpSocket":{"port":8080,"host":"127.0.0.1"},"successThreshold":1}
      ENABLE_PROFILING:                         false
      SERVING_ENABLE_PROBE_REQUEST_LOG:         false
      METRICS_COLLECTOR_ADDRESS:                
      HOST_IP:                                   (v1:status.hostIP)
      ENABLE_HTTP2_AUTO_DETECTION:              false
      ROOT_CA:                                  
    Mounts:
      /var/run/secrets/kubernetes.io/serviceaccount from kube-api-access-5prfw (ro)
Conditions:
  Type              Status
  Initialized       True 
  Ready             False 
  ContainersReady   False 
  PodScheduled      True 
Volumes:
  kube-api-access-5prfw:
    Type:                    Projected (a volume that contains injected data from multiple sources)
    TokenExpirationSeconds:  3607
    ConfigMapName:           kube-root-ca.crt
    ConfigMapOptional:       <nil>
    DownwardAPI:             true
  kserve-provision-location:
    Type:        EmptyDir (a temporary directory that shares a pod's lifetime)
    Medium:      
    SizeLimit:   <unset>
QoS Class:       Burstable
Node-Selectors:  <none>
Tolerations:     node.kubernetes.io/not-ready:NoExecute op=Exists for 300s
                 node.kubernetes.io/unreachable:NoExecute op=Exists for 300s
Events:
  Type    Reason     Age   From               Message
  ----    ------     ----  ----               -------
  Normal  Scheduled  80s   default-scheduler  Successfully assigned default/torchserve-mnist-v2-predictor-00001-deployment-868bcbc5c6-tv6j4 to minikube
  Normal  Pulling    79s   kubelet            Pulling image "kserve/storage-initializer:v0.11.1"
  Normal  Pulled     66s   kubelet            Successfully pulled image "kserve/storage-initializer:v0.11.1" in 13.103653547s (13.103667947s including waiting)
  Normal  Created    64s   kubelet            Created container storage-initializer
  Normal  Started    64s   kubelet            Started container storage-initializer
  Normal  Pulling    57s   kubelet            Pulling image "index.docker.io/pytorch/torchserve-kfs-nightly@sha256:43bbf3d98b6549040cbb1b38e89f0656f4bde9ba4d826663e1501386e162bcbf"
  Normal  Pulled     4s    kubelet            Successfully pulled image "index.docker.io/pytorch/torchserve-kfs-nightly@sha256:43bbf3d98b6549040cbb1b38e89f0656f4bde9ba4d826663e1501386e162bcbf" in 53.205183551s (53.205197761s including waiting)
  Normal  Created    3s    kubelet            Created container kserve-container
  Normal  Started    3s    kubelet            Started container kserve-container
  Normal  Pulling    3s    kubelet            Pulling image "gcr.io/knative-releases/knative.dev/serving/cmd/queue@sha256:65c427aaab3be9cea1afea32cdef26d5855c69403077d2dc3439f75c26a1e83f"
  Normal  Pulled     2s    kubelet            Successfully pulled image "gcr.io/knative-releases/knative.dev/serving/cmd/queue@sha256:65c427aaab3be9cea1afea32cdef26d5855c69403077d2dc3439f75c26a1e83f" in 1.609511194s (1.609515854s including waiting)
  Normal  Created    2s    kubelet            Created container queue-proxy
  Normal  Started    2s    kubelet            Started container queue-proxy

Wait for inference service to be ready
Wait for ports to be in forwarding
Forwarding from 127.0.0.1:8080 -> 8080
Forwarding from [::1]:8080 -> 8080
Make inference request
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0Handling connection for 8080
100  6889  100   197  100  6692    280   9532 --:--:-- --:--:-- --:--:--  9799
✓ SUCCESS
Clean up port forwarding
inferenceservice.serving.kserve.io "torchserve-mnist-v2" deleted
./kubernetes/kserve/tests/scripts/test_mnist.sh: line 179: 141929 Terminated              kubectl port-forward --namespace istio-system svc/${INGRESS_GATEWAY_SERVICE} 8080:80
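The "Make inference request" step in the log above posts a payload in the KServe v2 (Open Inference Protocol) request shape. A sketch of how such a body is built, with placeholder tensor name, shape, and data rather than the actual MNIST payload:

```python
import json

def build_v2_request(name, shape, data, datatype="FP32"):
    """Build a KServe v2 inference request body.

    The v2 protocol wraps tensors in an "inputs" list, each entry carrying
    a name, shape, datatype, and flat data array. Values here are
    placeholders for illustration.
    """
    return json.dumps({
        "inputs": [
            {"name": name, "shape": shape, "datatype": datatype, "data": data}
        ]
    })

body = build_v2_request("input-0", [1, 4], [0.1, 0.2, 0.3, 0.4])
```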

Checklist:

  • Did you have fun?
  • Have you added tests that prove your fix is effective or that this feature works?
  • Has code been commented, particularly in hard-to-understand areas?
  • Have you made corresponding changes to the documentation?

@agunapal agunapal added this pull request to the merge queue Feb 28, 2024
Merged via the queue into master with commit 3a406d6 Feb 28, 2024
15 checks passed