This repository was archived by the owner on Nov 20, 2025. It is now read-only.

Commit c06265d

doc: convert README to Markdown and add emoji (#39)

* doc: Add emojis to README section titles
* doc: Change Usage emoji
* doc: Fix title underline length
* doc: Convert README from rst to Markdown
* doc: Fix emoji markup for Contributing
* doc: Add subheaders to Usage section

1 parent 5f01578 commit c06265d

File tree

* README.rst → README.md
* setup.py

2 files changed (+49, -66 lines)
README.rst → README.md

Lines changed: 47 additions & 64 deletions
@@ -1,68 +1,48 @@
-.. image:: https://github.com/aws/sagemaker-inference-toolkit/raw/master/branding/icon/sagemaker-banner.png
-   :height: 100px
-   :alt: SageMaker
+![SageMaker](https://github.com/aws/sagemaker-inference-toolkit/raw/master/branding/icon/sagemaker-banner.png)
 
-===========================
-SageMaker Inference Toolkit
-===========================
+# SageMaker Inference Toolkit
 
-.. image:: https://img.shields.io/pypi/v/sagemaker-inference.svg
-   :target: https://pypi.python.org/pypi/sagemaker-inference
-   :alt: Latest Version
+[![Latest Version](https://img.shields.io/pypi/v/sagemaker-inference.svg)](https://pypi.python.org/pypi/sagemaker-inference) [![Supported Python Versions](https://img.shields.io/pypi/pyversions/sagemaker-inference.svg)](https://pypi.python.org/pypi/sagemaker-inference) [![Code Style: Black](https://img.shields.io/badge/code_style-black-000000.svg)](https://github.com/python/black)
 
-.. image:: https://img.shields.io/pypi/pyversions/sagemaker-inference.svg
-   :target: https://pypi.python.org/pypi/sagemaker-inference
-   :alt: Supported Python Versions
+Serve machine learning models within a Docker container using Amazon
+SageMaker.
 
-.. image:: https://img.shields.io/badge/code_style-black-000000.svg
-   :target: https://github.com/python/black
-   :alt: Code Style: Black
 
-Serve machine learning models within a Docker container using Amazon SageMaker.
+## :books: Background
 
------------------
-Table of Contents
------------------
-.. contents::
-   :local:
-
-Background
-----------
-
-`Amazon SageMaker <https://aws.amazon.com/sagemaker/>`__ is a fully managed service for data science and machine learning (ML) workflows.
+[Amazon SageMaker](https://aws.amazon.com/sagemaker/) is a fully managed service for data science and machine learning (ML) workflows.
 You can use Amazon SageMaker to simplify the process of building, training, and deploying ML models.
 
-Once you have a trained model, you can include it in a `Docker container <https://www.docker.com/resources/what-container>`__ that runs your inference code.
+Once you have a trained model, you can include it in a [Docker container](https://www.docker.com/resources/what-container) that runs your inference code.
 A container provides an effectively isolated environment, ensuring a consistent runtime regardless of where the container is deployed.
-Containerizing your model and code enables fast and reliable deployment of your model.
+Containerizing your model and code enables fast and reliable deployment of your model.
 
-The **SageMaker Inference Toolkit** implements a model serving stack and can be easily added to any Docker container, making it `deployable to SageMaker <https://aws.amazon.com/sagemaker/deploy/>`__.
-This library's serving stack is built on `Multi Model Server <https://github.com/awslabs/mxnet-model-server>`__, and it can serve your own models or those you trained on SageMaker using `machine learning frameworks with native SageMaker support <https://docs.aws.amazon.com/sagemaker/latest/dg/frameworks.html>`__.
-If you use a `prebuilt SageMaker Docker image for inference <https://docs.aws.amazon.com/sagemaker/latest/dg/pre-built-containers-frameworks-deep-learning.html>`__, this library may already be included.
+The **SageMaker Inference Toolkit** implements a model serving stack and can be easily added to any Docker container, making it [deployable to SageMaker](https://aws.amazon.com/sagemaker/deploy/).
+This library's serving stack is built on [Multi Model Server](https://github.com/awslabs/mxnet-model-server), and it can serve your own models or those you trained on SageMaker using [machine learning frameworks with native SageMaker support](https://docs.aws.amazon.com/sagemaker/latest/dg/frameworks.html).
+If you use a [prebuilt SageMaker Docker image for inference](https://docs.aws.amazon.com/sagemaker/latest/dg/pre-built-containers-frameworks-deep-learning.html), this library may already be included.
 
-For more information, see the Amazon SageMaker Developer Guide sections on `building your own container with Multi Model Server <https://docs.aws.amazon.com/sagemaker/latest/dg/build-multi-model-build-container.html>`__ and `using your own models <https://docs.aws.amazon.com/sagemaker/latest/dg/your-algorithms.html>`__.
+For more information, see the Amazon SageMaker Developer Guide sections on [building your own container with Multi Model Server](https://docs.aws.amazon.com/sagemaker/latest/dg/build-multi-model-build-container.html) and [using your own models](https://docs.aws.amazon.com/sagemaker/latest/dg/your-algorithms.html).
 
-Installation
-------------
+## :hammer_and_wrench: Installation
 
-To install this library in your Docker image, add the following line to your `Dockerfile <https://docs.docker.com/engine/reference/builder/>`__:
+To install this library in your Docker image, add the following line to your [Dockerfile](https://docs.docker.com/engine/reference/builder/):
 
-.. code:: dockerfile
+``` dockerfile
+RUN pip3 install multi-model-server sagemaker-inference-toolkit
+```
 
-   RUN pip3 install multi-model-server sagemaker-inference-toolkit
+[Here is an example](https://github.com/awslabs/amazon-sagemaker-examples/blob/master/advanced_functionality/multi_model_bring_your_own/container/Dockerfile) of a Dockerfile that installs SageMaker Inference Toolkit.
 
-`Here is an example <https://github.com/awslabs/amazon-sagemaker-examples/blob/master/advanced_functionality/multi_model_bring_your_own/container/Dockerfile>`__ of a Dockerfile that installs SageMaker Inference Toolkit.
+## :computer: Usage
 
-Usage
------
+### Implementation Steps
 
 To use the SageMaker Inference Toolkit, you need to do the following:
 
-1. Implement an inference handler, which is responsible for loading the model and providing input, predict, and output functions.
-   (`Here is an example <https://github.com/aws/sagemaker-pytorch-serving-container/blob/master/src/sagemaker_pytorch_serving_container/default_inference_handler.py>`__ of an inference handler.)
-
-   .. code:: python
+1. Implement an inference handler, which is responsible for loading the model and providing input, predict, and output functions.
+   ([Here is an example](https://github.com/aws/sagemaker-pytorch-serving-container/blob/master/src/sagemaker_pytorch_serving_container/default_inference_handler.py) of an inference handler.)
 
+   ``` python
    from sagemaker_inference import content_types, decoder, default_inference_handler, encoder, errors
 
   class DefaultPytorchInferenceHandler(default_inference_handler.DefaultInferenceHandler):
@@ -114,13 +94,13 @@ To use the SageMaker Inference Toolkit, you need to do the following:
               Returns: output data serialized
           """
           return encoder.encode(prediction, accept)
+   ```
 
-2. Implement a handler service that is executed by the model server.
-   (`Here is an example <https://github.com/aws/sagemaker-pytorch-serving-container/blob/master/src/sagemaker_pytorch_serving_container/handler_service.py>`__ of a handler service.)
-   For more information on how to define your ``HANDLER_SERVICE`` file, see `the MMS custom service documentation <https://github.com/awslabs/mxnet-model-server/blob/master/docs/custom_service.md>`__.
-
-   .. code:: python
+2. Implement a handler service that is executed by the model server.
+   ([Here is an example](https://github.com/aws/sagemaker-pytorch-serving-container/blob/master/src/sagemaker_pytorch_serving_container/handler_service.py) of a handler service.)
+   For more information on how to define your `HANDLER_SERVICE` file, see [the MMS custom service documentation](https://github.com/awslabs/mxnet-model-server/blob/master/docs/custom_service.md).
 
+   ``` python
   from sagemaker_inference.default_handler_service import DefaultHandlerService
   from sagemaker_inference.transformer import Transformer
   from sagemaker_pytorch_serving_container.default_inference_handler import DefaultPytorchInferenceHandler
@@ -137,31 +117,34 @@ To use the SageMaker Inference Toolkit, you need to do the following:
       def __init__(self):
           transformer = Transformer(default_inference_handler=DefaultPytorchInferenceHandler())
           super(HandlerService, self).__init__(transformer=transformer)
+   ```
 
-3. Implement a serving entrypoint, which starts the model server.
-   (`Here is an example <https://github.com/aws/sagemaker-pytorch-serving-container/blob/master/src/sagemaker_pytorch_serving_container/serving.py>`__ of a serving entrypoint.)
-
-   .. code:: python
+3. Implement a serving entrypoint, which starts the model server.
+   ([Here is an example](https://github.com/aws/sagemaker-pytorch-serving-container/blob/master/src/sagemaker_pytorch_serving_container/serving.py) of a serving entrypoint.)
 
+   ``` python
   from sagemaker_inference import model_server
 
   model_server.start_model_server(handler_service=HANDLER_SERVICE)
+   ```
 
-4. Define the location of the entrypoint in your Dockerfile.
-
-   .. code:: dockerfile
+4. Define the location of the entrypoint in your Dockerfile.
 
+   ``` dockerfile
   ENTRYPOINT ["python", "/usr/local/bin/entrypoint.py"]
+   ```
+
+### Complete Example
 
-`Here is a complete example <https://github.com/awslabs/amazon-sagemaker-examples/tree/master/advanced_functionality/multi_model_bring_your_own>`__ demonstrating usage of the SageMaker Inference Toolkit in your own container for deployment to a multi-model endpoint.
+[Here is a complete example](https://github.com/awslabs/amazon-sagemaker-examples/tree/master/advanced_functionality/multi_model_bring_your_own) demonstrating usage of the SageMaker Inference Toolkit in your own container for deployment to a multi-model endpoint.
 
-License
--------
+## :scroll: License
 
-This library is licensed under the `Apache 2.0 License <http://aws.amazon.com/apache2.0/>`__.
-For more details, please take a look at the `LICENSE <https://github.com/aws-samples/sagemaker-inference-toolkit/blob/master/LICENSE>`__ file.
+This library is licensed under the [Apache 2.0 License](http://aws.amazon.com/apache2.0/).
+For more details, please take a look at the [LICENSE](https://github.com/aws-samples/sagemaker-inference-toolkit/blob/master/LICENSE) file.
 
-Contributing
-------------
+## :handshake: Contributing
 
-Contributions are welcome! Please read our `contributing guidelines <https://github.com/aws/sagemaker-inference-toolkit/blob/master/CONTRIBUTING.md>`__ if you'd like to open an issue or submit a pull request.
+Contributions are welcome!
+Please read our [contributing guidelines](https://github.com/aws/sagemaker-inference-toolkit/blob/master/CONTRIBUTING.md)
+if you'd like to open an issue or submit a pull request.
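The inference handler in step 1 of the diff above is a deserialize → predict → serialize pipeline. This is a dependency-free sketch of that flow, not the toolkit's API: the function names mirror the handler's `default_*_fn` methods, and the JSON content type and the doubling "model" are made up for the example.

```python
import json

def input_fn(input_data, content_type):
    # Deserialize the request body into model input (JSON assumed here).
    assert content_type == "application/json"
    return json.loads(input_data)

def predict_fn(data, model):
    # Apply the model to each deserialized element.
    return [model(x) for x in data]

def output_fn(prediction, accept):
    # Serialize the prediction into the requested response format.
    assert accept == "application/json"
    return json.dumps(prediction)

# Wire the three stages together the way the serving stack does.
model = lambda x: x * 2  # toy stand-in for a real model
data = input_fn("[1, 2, 3]", "application/json")
response = output_fn(predict_fn(data, model), "application/json")
```

In the real toolkit, `decoder.decode` and `encoder.encode` play the roles that `json.loads`/`json.dumps` play here, dispatching on the declared content type.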

setup.py

Lines changed: 2 additions & 2 deletions

@@ -45,8 +45,8 @@ def read_version():
     package_dir={PKG_NAME: "src/sagemaker_inference"},
     package_data={PKG_NAME: ["etc/*"]},
     py_modules=[os.path.splitext(os.path.basename(path))[0] for path in glob("src/*.py")],
-    long_description=read("README.rst"),
-    long_description_content_type="text/x-rst",
+    long_description=read("README.md"),
+    long_description_content_type="text/markdown",
     author="Amazon Web Services",
     url="https://github.com/aws/sagemaker-inference-toolkit/",
     license="Apache License 2.0",
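Taken together, the `RUN pip3 install` line from the README's Installation section and the `ENTRYPOINT` from step 4 suggest a complete container build. A hypothetical minimal Dockerfile, assuming a `python:3.7` base image and an `entrypoint.py` like the one in step 3 (both assumptions, not part of this commit):

``` dockerfile
# Base image is an assumption; any image with Python 3 and pip3 works.
FROM python:3.7

# Install the model server and the inference toolkit (Installation section).
RUN pip3 install multi-model-server sagemaker-inference-toolkit

# entrypoint.py is the hypothetical serving entrypoint from step 3,
# i.e. it calls model_server.start_model_server(handler_service=...).
COPY entrypoint.py /usr/local/bin/entrypoint.py

# Step 4: SageMaker launches the container through this entrypoint.
ENTRYPOINT ["python", "/usr/local/bin/entrypoint.py"]
```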
