Authors: Ritambhara Singh, Jack Lanchantin, Arshdeep Sekhon, Yanjun Qi

License: MIT

Contributed by: Jack Lanchantin, Jeffrey Yoo

Cite as: https://doi.org/10.1101/329334

Type: pytorch

Postprocessing: None

Trained on: Histone modification and RNA-seq data from the Roadmap/REMC database


Gene Expression Prediction

Dependency Requirements

  • python>=3.5
  • numpy
  • pytorch-cpu
  • torchvision-cpu

Quick Start

Creating a new conda environment using Kipoi

kipoi env create AttentiveChrome

Activating the environment

conda activate kipoi-AttentiveChrome
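
As a quick sanity check that the environment works, kipoi can be imported from Python (kipoi.list_models() is part of the public kipoi API; listing the model zoo confirms the installation):

import kipoi

# the AttentiveChrome models should appear among the listed models
models = kipoi.list_models()
print(len(models), "models available")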

Command Line

Getting example input file

Replace {model_name} with the actual name of the model (e.g. E003, E005, etc.)

kipoi get-example AttentiveChrome/{model_name} -o example_file

example: kipoi get-example AttentiveChrome/E003 -o example_file

Predicting using the example file

kipoi predict AttentiveChrome/{model_name} --dataloader_args='{"input_file": "example_file/input_file", "bin_size": 100}' -o example_predict.tsv

This should produce a TSV file containing the prediction results.
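
The column layout of this file is generated by kipoi from the dataloader metadata; a minimal sketch for peeking at it from Python (assuming the example_predict.tsv path used above):

import csv

with open("example_predict.tsv") as f:
    rows = csv.reader(f, delimiter="\t")
    header = next(rows)   # column names written by kipoi predict
    first = next(rows)    # first prediction row
print(header)
print(first)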

Python

Fetching the model

First, import the kipoi package.

import kipoi

Next, get the model. Replace {model_name} with the actual name of the model (e.g. E003, E005, etc.)

model = kipoi.get_model("AttentiveChrome/{model_name}")

Predicting using the pipeline

prediction = model.pipeline.predict({"input_file": "path to input file", "bin_size": {some integer}})

This returns a numpy array containing the output from the final softmax function.

e.g. model.pipeline.predict({"input_file": "data/input_file", "bin_size": 100})
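
As a minimal sketch building on the example call above (the exact array shape depends on how many genes are in the input file), the result can be inspected and written out with numpy:

import numpy as np

prediction = model.pipeline.predict({"input_file": "data/input_file", "bin_size": 100})
print(prediction.shape)   # one prediction per gene/region in the input file (assumed)
np.savetxt("attentivechrome_predictions.tsv", prediction, delimiter="\t")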

Predicting for a single batch

First, we need to set up our dataloader dl.

dl = model.default_dataloader(input_file="path to input file", bin_size={some integer})

Next, we can use the iterator functionality of the dataloader.

it = dl.batch_iter(batch_size=32)

single_batch = next(it)

The first line gives us an iterator named it, where each batch contains 32 items. We can then call next(it) to get a single batch.

Then, we can perform prediction on this single batch.

prediction = model.predict_on_batch(single_batch['inputs'])

This also returns a numpy array containing the output from the final softmax function.
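
To score the whole input file rather than a single batch, the same iterator can be consumed in a loop and the per-batch outputs concatenated (a sketch; the path and batch size are placeholders):

import numpy as np

dl = model.default_dataloader(input_file="path to input file", bin_size=100)
batch_preds = []
for batch in dl.batch_iter(batch_size=32):
    # each call returns a numpy array, exactly as for the single batch above
    batch_preds.append(model.predict_on_batch(batch['inputs']))
predictions = np.concatenate(batch_preds)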

Create a new conda environment with all dependencies installed

kipoi env create AttentiveChrome
source activate kipoi-AttentiveChrome

Test the model

kipoi test AttentiveChrome/E098 --source=kipoi

Make a prediction

kipoi get-example AttentiveChrome/E098 -o example
kipoi predict AttentiveChrome/E098 \
  --dataloader_args='{"input_file": "example/input_file"}' \
  -o '/tmp/AttentiveChrome|E098.example_pred.tsv'
# check the results
head '/tmp/AttentiveChrome|E098.example_pred.tsv'

Create a new conda environment with all dependencies installed

kipoi env create AttentiveChrome
source activate kipoi-AttentiveChrome

Get the model

import kipoi
model = kipoi.get_model('AttentiveChrome/E098')

Make a prediction for example files

pred = model.pipeline.predict_example(batch_size=4)

Use dataloader and model separately

# Download example dataloader kwargs
dl_kwargs = model.default_dataloader.download_example('example')
# Get the dataloader and instantiate it
dl = model.default_dataloader(**dl_kwargs)
# get a batch iterator
batch_iterator = dl.batch_iter(batch_size=4)
for batch in batch_iterator:
    # predict for a batch
    batch_pred = model.predict_on_batch(batch['inputs'])

Make predictions for custom files directly

pred = model.pipeline.predict(dl_kwargs, batch_size=4)

Get the model

library(reticulate)
kipoi <- import('kipoi')
model <- kipoi$get_model('AttentiveChrome/E098')

Make a prediction for example files

predictions <- model$pipeline$predict_example()

Use dataloader and model separately

# Download example dataloader kwargs
dl_kwargs <- model$default_dataloader$download_example('example')
# Get the dataloader
dl <- model$default_dataloader(dl_kwargs)
# get a batch iterator
it <- dl$batch_iter(batch_size=4)
# predict for a batch
batch <- iter_next(it)
model$predict_on_batch(batch$inputs)

Make predictions for custom files directly

pred <- model$pipeline$predict(dl_kwargs, batch_size=4)

Get the docker image

docker pull kipoi/kipoi-docker:attentivechrome-slim

Get the full sized docker image

docker pull kipoi/kipoi-docker:attentivechrome

Get the activated conda environment inside the container

docker run -it kipoi/kipoi-docker:attentivechrome-slim

Test the model

docker run kipoi/kipoi-docker:attentivechrome-slim kipoi test AttentiveChrome/E098 --source=kipoi

Make prediction for custom files directly

# Create an example directory containing the data
mkdir -p $PWD/kipoi-example
# You can replace $PWD/kipoi-example with a different absolute path containing the data
docker run -v $PWD/kipoi-example:/app/ kipoi/kipoi-docker:attentivechrome-slim \
  kipoi get-example AttentiveChrome/E098 -o /app/example
docker run -v $PWD/kipoi-example:/app/ kipoi/kipoi-docker:attentivechrome-slim \
  kipoi predict AttentiveChrome/E098 \
  --dataloader_args='{"input_file": "/app/example/input_file"}' \
  -o '/app/AttentiveChrome_E098.example_pred.tsv'
# check the results
head $PWD/kipoi-example/AttentiveChrome_E098.example_pred.tsv
    
Install apptainer

https://apptainer.org/docs/user/main/quick_start.html#quick-installation-steps

Make prediction for custom files directly

kipoi get-example AttentiveChrome/E098 -o example
kipoi predict AttentiveChrome/E098 \
  --dataloader_args='{"input_file": "example/input_file"}' \
  -o 'AttentiveChrome_E098.example_pred.tsv' \
  --singularity
# check the results
head AttentiveChrome_E098.example_pred.tsv

Schema

Inputs

Single numpy array

Name: None

    Shape: (100, 5) 

    Doc: Histone Modification Bin Matrix


Targets

Single numpy array

Name: None

    Shape: (1,) 

    Doc: Binary Classification
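
As a sketch of what this schema implies, a dummy batch with the documented input shape can be pushed through predict_on_batch; the leading batch dimension and the float32 dtype are assumptions, not part of the schema above:

import numpy as np

dummy = np.zeros((1, 100, 5), dtype=np.float32)   # (batch, 100 bins, 5 histone modification signals); all-zero toy input
out = model.predict_on_batch(dummy)
print(out.shape)   # expected to match the (1,) target schema per sample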


Dataloader

Defined as: ..

Doc: Dataloader for Gene Expression Prediction

Authors: Ritambhara Singh, Jack Lanchantin, Arshdeep Sekhon, Yanjun Qi

Type: Dataset

License: MIT


Arguments

input_file: Path to the histone modification read count file.

bin_size (optional): Size of bin
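
A small sketch tying these arguments back to the input schema above (using the example file fetched earlier; bin_size=100 matches the (100, 5) input shape):

dl = model.default_dataloader(input_file="example_file/input_file", bin_size=100)
batch = next(dl.batch_iter(batch_size=1))
print(batch['inputs'].shape)   # expected (1, 100, 5): one gene, 100 bins, 5 histone marks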


Model dependencies
conda:
  • python=3.8
  • numpy=1.19.2
  • pytorch::pytorch
  • pip=21.0.1

pip:

Dataloader dependencies
conda:
  • python=3.8
  • pytorch::pytorch
  • numpy

pip: