dk19y
  • AWS
  • Santa Clara


Popular repositories

  1. triton-inference-server

    Forked from triton-inference-server/server

    The Triton Inference Server provides a cloud inference solution optimized for NVIDIA GPUs.

    C++

  2. serve

    Forked from pytorch/serve

    Model Serving on PyTorch

    Java

  3. sagemaker-pytorch-inference-toolkit

    Forked from aws/sagemaker-pytorch-inference-toolkit

    Toolkit for inference and serving with PyTorch on SageMaker. Dockerfiles used for building SageMaker PyTorch containers are at https://github.com/aws/deep-learning-containers.

    Python

  4. sagemaker-inference-toolkit

    Forked from aws/sagemaker-inference-toolkit

    Serve machine learning models within a 🐳 Docker container using 🧠 Amazon SageMaker.

    Python

  5. deep-learning-containers

    Forked from aws/deep-learning-containers

    AWS Deep Learning Containers (DLCs) are a set of Docker images for training and serving models in TensorFlow, TensorFlow 2, PyTorch, and MXNet.

    Python

  6. multi-model-server

    Forked from awslabs/multi-model-server

    Multi Model Server is a tool for serving neural-net models for inference.

    Java