backprop.utils

backprop.utils.datasets

class ImageGroupDataset(images, groups, process_batch)[source]

Bases: Generic[torch.utils.data.dataset.T_co]

class ImagePairDataset(imgs1, imgs2, similarity_scores, process_batch)[source]

Bases: Generic[torch.utils.data.dataset.T_co]

class ImageTextGroupDataset(images, texts, groups, process_batch)[source]

Bases: Generic[torch.utils.data.dataset.T_co]

class ImageTextPairDataset(img_text_pairs1, img_text_pairs2, similarity_scores, process_batch)[source]

Bases: Generic[torch.utils.data.dataset.T_co]

class MultiLabelImageClassificationDataset(images, labels, process_batch)[source]

Bases: Generic[torch.utils.data.dataset.T_co]

class SingleLabelImageClassificationDataset(images, labels, process_batch)[source]

Bases: Generic[torch.utils.data.dataset.T_co]

class SingleLabelTextClassificationDataset(params, process_batch, length)[source]

Bases: Generic[torch.utils.data.dataset.T_co]

class TextGroupDataset(texts, groups, process_batch, max_length=None)[source]

Bases: Generic[torch.utils.data.dataset.T_co]

class TextPairDataset(texts1, texts2, similarity_scores, process_batch, max_length=None)[source]

Bases: Generic[torch.utils.data.dataset.T_co]

class TextToTextDataset(params, task, process_batch, length)[source]

Bases: Generic[torch.utils.data.dataset.T_co]
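
These classes appear to implement the standard torch.utils.data.Dataset interface, so they can be wrapped in a PyTorch DataLoader. The sketch below is illustrative only: the texts, similarity scores, and the process_batch callable are assumptions, not part of the documented API.

Example:

from torch.utils.data import DataLoader
from backprop.utils.datasets import TextPairDataset

# Hypothetical paired texts with similarity labels
texts1 = ["I liked the film", "The food was cold"]
texts2 = ["The movie was enjoyable", "Great service"]
similarity_scores = [1.0, 0.0]

# process_batch is assumed to turn raw inputs into model-ready tensors
# (e.g. tokenisation); its exact contract depends on the model being finetuned.
def process_batch(params, max_length=None):
    return params

dataset = TextPairDataset(texts1, texts2, similarity_scores, process_batch)
dataloader = DataLoader(dataset, batch_size=2)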

backprop.utils.download

download(url: str, folder: str, root: str = '/home/docs/.cache/backprop', force: bool = False)[source]

Downloads a file from url to folder
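
A minimal usage sketch, assuming the default root shown in the signature; the URL and folder name are placeholders.

Example:

from backprop.utils.download import download

# Hypothetical URL and folder name; the folder is presumably created under root
download("https://example.com/weights.bin", "my-model")

# force=True presumably re-downloads even if the file already exists
download("https://example.com/weights.bin", "my-model", force=True)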

backprop.utils.functions

cosine_similarity(vec1: Union[List[float], torch.Tensor], vec2: Union[List[float], torch.Tensor, List[List[float]], List[torch.Tensor]])[source]

Calculates the cosine similarity between two vectors, or between one vector and a list of vectors.

Parameters
  • vec1 – list of floats or corresponding tensor

  • vec2 – list of floats / list of list of floats or corresponding tensor

Example:

import backprop

# vec1, vec2 and vec3 are embedding vectors (e.g. from a vectorisation model)
backprop.cosine_similarity(vec1, vec2)
# 0.8982

backprop.cosine_similarity(vec1, [vec2, vec3])
# [0.8982, 0.3421]

backprop.utils.helpers

base64_to_img(image: Union[str, List[str]])[source]

Returns PIL Image objects for base64-encoded images

img_to_base64(image: Union[PIL.Image.Image, List[PIL.Image.Image]])[source]

Returns base64-encoded strings of PIL Image objects

path_to_img(image: Union[str, List[str]])[source]

Returns PIL Image objects loaded from image paths
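
A short sketch combining these helpers; the image path is a placeholder.

Example:

from backprop.utils.helpers import path_to_img, img_to_base64, base64_to_img

# Load a PIL Image from a (hypothetical) path
img = path_to_img("images/cat.jpg")

# Round-trip through base64, e.g. for sending images to an API
encoded = img_to_base64(img)
decoded = base64_to_img(encoded)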

backprop.utils.load

load(path)[source]

Loads a saved model and returns it.

Parameters

path – Name of the model or full path to model.

Example:

import backprop

# model_object is a previously created or finetuned Backprop model
backprop.save(model_object, "my_model")
model = backprop.load("my_model")

backprop.utils.samplers

class SameGroupSampler(dataset)[source]

Bases: Generic[torch.utils.data.sampler.T_co]
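
SameGroupSampler is undocumented here, but its base class suggests it is used as a standard PyTorch sampler. A hedged sketch, assuming dataset is one of the group datasets above:

Example:

from torch.utils.data import DataLoader
from backprop.utils.samplers import SameGroupSampler

# dataset is assumed to be a group dataset such as TextGroupDataset
sampler = SameGroupSampler(dataset)
dataloader = DataLoader(dataset, sampler=sampler, batch_size=32)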

backprop.utils.save

save(model, name: Optional[str] = None, description: Optional[str] = None, tasks: Optional[List[str]] = None, details: Optional[Dict] = None, path=None)[source]

Saves the provided model to the Backprop cache folder, using (in order of priority):
  1. the provided name

  2. model.name

  3. the provided path

The resulting folder has three files:

  • model.bin (dill pickled model instance)

  • config.json (description and task keys)

  • requirements.txt (exact Python runtime requirements)
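
For example, assuming the cache root resolves to ~/.cache/backprop, saving a model as "my_model" would produce roughly:

~/.cache/backprop/my_model/
    model.bin
    config.json
    requirements.txt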

Parameters
  • model – Model object

  • name – String identifier for the model. Lowercase letters and numbers only; no spaces or special characters except dashes.

  • description – String description of the model.

  • tasks – List of supported task strings

  • details – Valid JSON dictionary of additional details about the model

  • path – Optional path to save model

Example:

import backprop

backprop.save(model_object, "my_model")
model = backprop.load("my_model")

backprop.utils.upload

upload(model, name: Optional[str] = None, description: Optional[str] = None, tasks: Optional[List[str]] = None, details: Optional[Dict] = None, path=None, api_key: Optional[str] = None)[source]

Saves and deploys a model to Backprop.

Parameters
  • model – Model object

  • api_key – Backprop API key

  • name – String identifier for the model. Lowercase letters and numbers only; no spaces or special characters except dashes.

  • description – String description of the model.

  • tasks – List of supported task strings

  • details – Valid JSON dictionary of additional details about the model

  • path – Optional path to save model

Example:

import backprop

tg = backprop.TextGeneration("t5_small")

# Any text works as training data
inp = ["I really liked the service I received!", "Meh, it was not impressive."]
out = ["positive", "negative"]

# Finetune with a single line of code
tg.finetune({"input_text": inp, "output_text": out})

# Use your trained model
prediction = tg("I enjoyed it!")

print(prediction)
# Prints: "positive"

# Upload to Backprop for production-ready inference

model = tg.model
# Describe your model
name = "t5-sentiment"
description = "Predicts positive and negative sentiment"

backprop.upload(model, name=name, description=description, api_key="abc")