Thinc is a lightweight deep learning library that offers an elegant,
type-checked, functional-programming API for composing models, with support
for layers defined in other frameworks such as PyTorch, TensorFlow or
MXNet. You can use Thinc as an interface layer, a standalone toolkit or a
flexible way to develop new models. Previous versions of Thinc have been running
quietly in production in thousands of companies, via both
spaCy and Prodigy. We wrote the new
version to let users compose, configure and deploy custom models built with
their favorite framework. The end result is a library quite different in its
design: one that’s easy to understand, plays well with others, and is a lot of
fun to use.
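For example, composing a small classifier from Thinc's combinators and layers takes only a few lines. The sketch below is illustrative rather than one of the original examples; the layer sizes and sample array shapes are assumptions.

from thinc.api import chain, Relu, Softmax
import numpy

# Compose a feed-forward classifier from reusable layers.
model = chain(Relu(nO=64, dropout=0.2), Relu(nO=64, dropout=0.2), Softmax())

# Shapes are inferred from sample data during initialization
# (784 input features and 10 classes are assumed here).
X = numpy.zeros((8, 784), dtype="f")
Y = numpy.zeros((8, 10), dtype="f")
model.initialize(X=X, Y=Y)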
from thinc.api import PyTorchWrapper, TensorFlowWrapper, chain, add, Linear, Logistic

pt_model = PyTorchWrapper(create_pytorch_model())
tf_model = TensorFlowWrapper(create_tensorflow_model())
# You can even stitch together strange hybrids
# (not efficient, but possible)
frankenmodel = chain(add(pt_model, tf_model), Linear(128), Logistic())
Wrap PyTorch, TensorFlow & MXNet models for use in your network
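Once wrapped, the hybrid behaves like any other Thinc model. The usage sketch below assumes the wrapped PyTorch and TensorFlow models take the same inputs and produce same-shaped outputs (which add requires), and that X and Y are sample arrays like the ones above:

# Infer shapes from sample data, then run a forward pass.
frankenmodel.initialize(X=X, Y=Y)
Yh = frankenmodel.predict(X)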
for i in range(10):
    for X, Y in train_batches:
        Yh, backprop = model.begin_update(X)
        loss, dYh = get_loss(Yh, Y)
        backprop(dYh)
        model.finish_update(optimizer)
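The loop above assumes some surrounding setup: a model, an optimizer, a get_loss helper that returns the loss and the gradient of the outputs, and an iterable of (X, Y) batches. One way to fill those in, sketched with Thinc's built-in optimizer and loss (the batch size and the train_X/train_Y arrays are placeholders, not part of the original snippet):

from thinc.api import Adam, CategoricalCrossentropy

optimizer = Adam(0.001)
loss_calc = CategoricalCrossentropy()

def get_loss(Yh, Y):
    # Return (loss, gradient) in the order the training loop expects.
    return loss_calc.get_loss(Yh, Y), loss_calc.get_grad(Yh, Y)

# Batch the training arrays with the model's ops backend.
train_batches = model.ops.multibatch(128, train_X, train_Y, shuffle=True)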
Intro to Thinc · Everything you need to know to get started. Composing and training a model on the MNIST data, using config files, registering custom functions and wrapping PyTorch, TensorFlow and MXNet models.
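As a taste of the config-driven workflow that walkthrough covers, a custom model-building function can be registered and then referenced from a config string. This is a minimal sketch assuming Thinc v8's Config and registry API; the name "mnist_model.v1" and the hyperparameter values are just examples:

import thinc
from thinc.api import Config, Model, Relu, Softmax, chain, registry

@thinc.registry.layers("mnist_model.v1")
def make_mnist_model(n_hidden: int, dropout: float) -> Model:
    # Build a model from values supplied by the config.
    return chain(Relu(n_hidden, dropout=dropout), Relu(n_hidden, dropout=dropout), Softmax())

CONFIG = """
[model]
@layers = "mnist_model.v1"
n_hidden = 64
dropout = 0.2
"""

config = Config().from_str(CONFIG)
model = registry.resolve(config)["model"]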