What Did You Say?

A small example of how to use Flask as a REST API for ML model inference.
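The core pattern is to load the model once at startup and expose an inference route. Below is a minimal generic sketch of that pattern, not the actual code from this repo (which organizes its routes under blueprints in wdys/blueprints/language); the transformers pipeline and the exact route shape are assumptions.

    # Generic sketch of Flask-for-ML-inference -- not this repo's actual code.
    from flask import Flask, jsonify, request
    from transformers import pipeline

    app = Flask(__name__)

    # Load the model once at startup, not on every request.
    ner = pipeline("ner")  # assumed: a default token-classification pipeline

    @app.route("/ner", methods=["GET", "POST"])
    def ner_route():
        if request.method == "POST":
            sentence = (request.get_json(silent=True) or {}).get("sentence", "")
        else:
            sentence = request.args.get("sentence", "")
        # Map each recognized token to its predicted entity tag.
        return jsonify([{e["word"]: e["entity"]} for e in ner(sentence)])

    if __name__ == "__main__":
        app.run(port=8000)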

This project can be run locally or in a Docker container. In either case (and before building the Docker container), download the models first. To get started, navigate to the models directory:

cd wdys/blueprints/language/models

and run:

python3 setup_models.py

This downloads the necessary model metadata and saves the files in the expected location.
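If you're curious what such a setup script typically does: a sketch like the one below fetches a pretrained tokenizer and model and saves them next to the script. The model name and save location are assumptions (that CoNLL-2003 BERT model's label set matches the I-PER/I-ORG tags in the example response further down), not the contents of the actual setup_models.py.

    # Hypothetical sketch of setup_models.py -- the model name and save
    # directory are assumptions, not taken from the repo.
    from pathlib import Path

    from transformers import AutoModelForTokenClassification, AutoTokenizer

    MODEL_NAME = "dbmdz/bert-large-cased-finetuned-conll03-english"
    SAVE_DIR = Path(__file__).resolve().parent  # save alongside this script

    tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
    model = AutoModelForTokenClassification.from_pretrained(MODEL_NAME)

    # Persist weights, config, and tokenizer files locally so the API can
    # load them at startup without hitting the network.
    tokenizer.save_pretrained(SAVE_DIR)
    model.save_pretrained(SAVE_DIR)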

To run locally in development mode:

Navigate to the wdys directory and run:
bash serve.sh

To run in a Docker container:

From the root directory, after running setup_models.py:

docker-compose up --build

This assumes you already have Docker and Docker Compose installed.

Once the app is running:

Visit http://localhost:8000 in your browser to verify the API is working.
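You can also smoke-test it from Python (assuming the default port of 8000):

    import requests  # third-party: pip install requests

    resp = requests.get("http://localhost:8000")
    print(resp.status_code)  # expect 200 if the API is up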

For Entity Recognition:

You can send either a POST request with a 'sentence' key and some text as the value, or a GET request with a query-string parameter.

Example:

http://localhost:8000/ner?sentence=My name is Will and this is hosted on Github!

Response: [ { "Will": "I-PER" }, { "##ithub": "I-ORG" } ]

(The "##ithub" token is a WordPiece subword: BERT-style tokenizers split out-of-vocabulary words like "Github" into pieces, and here the "##ithub" piece was tagged as an organization.)
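A minimal client sketch using the requests library is shown below; that the POST body is JSON (rather than form data) is an assumption.

    import requests

    URL = "http://localhost:8000/ner"
    sentence = "My name is Will and this is hosted on Github!"

    # GET with a query-string parameter
    r = requests.get(URL, params={"sentence": sentence})
    print(r.json())  # e.g. [{"Will": "I-PER"}, {"##ithub": "I-ORG"}]

    # POST with a 'sentence' key -- sent as JSON here, though the endpoint
    # may instead expect form data (data={"sentence": sentence})
    r = requests.post(URL, json={"sentence": sentence})
    print(r.json())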

Tests

This repo also includes pytest tests for the endpoints. Run py.test wdys/tests from the root directory.
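For reference, a sketch of what such an endpoint test might look like with Flask's built-in test client; the import path for the app object is an assumption about the repo layout, not taken from wdys/tests.

    # Hypothetical endpoint test -- the app import path is assumed.
    from wdys.app import app

    def test_ner_get():
        client = app.test_client()
        resp = client.get("/ner", query_string={"sentence": "My name is Will"})
        assert resp.status_code == 200
        data = resp.get_json()
        # Each item maps a token to an entity tag, e.g. {"Will": "I-PER"}
        assert isinstance(data, list)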
