Model name | Version | Authors | Contributed by | Type | Cite as | License | Training procedure
AttentiveChrome/E003 | 0.1 | Ritambhara Singh et al. | Jack Lanchantin et al. | pytorch | https://doi.org/10.1101/329334 | MIT |
AttentiveChrome/E004 | 0.1 | Ritambhara Singh et al. | Jack Lanchantin et al. | pytorch | https://doi.org/10.1101/329334 | MIT |
AttentiveChrome/E005 | 0.1 | Ritambhara Singh et al. | Jack Lanchantin et al. | pytorch | https://doi.org/10.1101/329334 | MIT |
AttentiveChrome/E006 | 0.1 | Ritambhara Singh et al. | Jack Lanchantin et al. | pytorch | https://doi.org/10.1101/329334 | MIT |
AttentiveChrome/E007 | 0.1 | Ritambhara Singh et al. | Jack Lanchantin et al. | pytorch | https://doi.org/10.1101/329334 | MIT |
AttentiveChrome/E011 | 0.1 | Ritambhara Singh et al. | Jack Lanchantin et al. | pytorch | https://doi.org/10.1101/329334 | MIT |
AttentiveChrome/E012 | 0.1 | Ritambhara Singh et al. | Jack Lanchantin et al. | pytorch | https://doi.org/10.1101/329334 | MIT |
AttentiveChrome/E013 | 0.1 | Ritambhara Singh et al. | Jack Lanchantin et al. | pytorch | https://doi.org/10.1101/329334 | MIT |
AttentiveChrome/E016 | 0.1 | Ritambhara Singh et al. | Jack Lanchantin et al. | pytorch | https://doi.org/10.1101/329334 | MIT |
AttentiveChrome/E024 | 0.1 | Ritambhara Singh et al. | Jack Lanchantin et al. | pytorch | https://doi.org/10.1101/329334 | MIT |
AttentiveChrome/E027 | 0.1 | Ritambhara Singh et al. | Jack Lanchantin et al. | pytorch | https://doi.org/10.1101/329334 | MIT |
AttentiveChrome/E028 | 0.1 | Ritambhara Singh et al. | Jack Lanchantin et al. | pytorch | https://doi.org/10.1101/329334 | MIT |
AttentiveChrome/E037 | 0.1 | Ritambhara Singh et al. | Jack Lanchantin et al. | pytorch | https://doi.org/10.1101/329334 | MIT |
AttentiveChrome/E038 | 0.1 | Ritambhara Singh et al. | Jack Lanchantin et al. | pytorch | https://doi.org/10.1101/329334 | MIT |
AttentiveChrome/E047 | 0.1 | Ritambhara Singh et al. | Jack Lanchantin et al. | pytorch | https://doi.org/10.1101/329334 | MIT |
AttentiveChrome/E050 | 0.1 | Ritambhara Singh et al. | Jack Lanchantin et al. | pytorch | https://doi.org/10.1101/329334 | MIT |
AttentiveChrome/E053 | 0.1 | Ritambhara Singh et al. | Jack Lanchantin et al. | pytorch | https://doi.org/10.1101/329334 | MIT |
AttentiveChrome/E054 | 0.1 | Ritambhara Singh et al. | Jack Lanchantin et al. | pytorch | https://doi.org/10.1101/329334 | MIT |
AttentiveChrome/E055 | 0.1 | Ritambhara Singh et al. | Jack Lanchantin et al. | pytorch | https://doi.org/10.1101/329334 | MIT |
AttentiveChrome/E056 | 0.1 | Ritambhara Singh et al. | Jack Lanchantin et al. | pytorch | https://doi.org/10.1101/329334 | MIT |
AttentiveChrome/E057 | 0.1 | Ritambhara Singh et al. | Jack Lanchantin et al. | pytorch | https://doi.org/10.1101/329334 | MIT |
AttentiveChrome/E058 | 0.1 | Ritambhara Singh et al. | Jack Lanchantin et al. | pytorch | https://doi.org/10.1101/329334 | MIT |
AttentiveChrome/E061 | 0.1 | Ritambhara Singh et al. | Jack Lanchantin et al. | pytorch | https://doi.org/10.1101/329334 | MIT |
AttentiveChrome/E062 | 0.1 | Ritambhara Singh et al. | Jack Lanchantin et al. | pytorch | https://doi.org/10.1101/329334 | MIT |
AttentiveChrome/E065 | 0.1 | Ritambhara Singh et al. | Jack Lanchantin et al. | pytorch | https://doi.org/10.1101/329334 | MIT |
AttentiveChrome/E066 | 0.1 | Ritambhara Singh et al. | Jack Lanchantin et al. | pytorch | https://doi.org/10.1101/329334 | MIT |
AttentiveChrome/E070 | 0.1 | Ritambhara Singh et al. | Jack Lanchantin et al. | pytorch | https://doi.org/10.1101/329334 | MIT |
AttentiveChrome/E071 | 0.1 | Ritambhara Singh et al. | Jack Lanchantin et al. | pytorch | https://doi.org/10.1101/329334 | MIT |
AttentiveChrome/E079 | 0.1 | Ritambhara Singh et al. | Jack Lanchantin et al. | pytorch | https://doi.org/10.1101/329334 | MIT |
AttentiveChrome/E082 | 0.1 | Ritambhara Singh et al. | Jack Lanchantin et al. | pytorch | https://doi.org/10.1101/329334 | MIT |
AttentiveChrome/E084 | 0.1 | Ritambhara Singh et al. | Jack Lanchantin et al. | pytorch | https://doi.org/10.1101/329334 | MIT |
AttentiveChrome/E085 | 0.1 | Ritambhara Singh et al. | Jack Lanchantin et al. | pytorch | https://doi.org/10.1101/329334 | MIT |
AttentiveChrome/E087 | 0.1 | Ritambhara Singh et al. | Jack Lanchantin et al. | pytorch | https://doi.org/10.1101/329334 | MIT |
AttentiveChrome/E094 | 0.1 | Ritambhara Singh et al. | Jack Lanchantin et al. | pytorch | https://doi.org/10.1101/329334 | MIT |
AttentiveChrome/E095 | 0.1 | Ritambhara Singh et al. | Jack Lanchantin et al. | pytorch | https://doi.org/10.1101/329334 | MIT |
AttentiveChrome/E096 | 0.1 | Ritambhara Singh et al. | Jack Lanchantin et al. | pytorch | https://doi.org/10.1101/329334 | MIT |
AttentiveChrome/E097 | 0.1 | Ritambhara Singh et al. | Jack Lanchantin et al. | pytorch | https://doi.org/10.1101/329334 | MIT |
AttentiveChrome/E098 | 0.1 | Ritambhara Singh et al. | Jack Lanchantin et al. | pytorch | https://doi.org/10.1101/329334 | MIT |
AttentiveChrome/E100 | 0.1 | Ritambhara Singh et al. | Jack Lanchantin et al. | pytorch | https://doi.org/10.1101/329334 | MIT |
AttentiveChrome/E104 | 0.1 | Ritambhara Singh et al. | Jack Lanchantin et al. | pytorch | https://doi.org/10.1101/329334 | MIT |
AttentiveChrome/E105 | 0.1 | Ritambhara Singh et al. | Jack Lanchantin et al. | pytorch | https://doi.org/10.1101/329334 | MIT |
AttentiveChrome/E106 | 0.1 | Ritambhara Singh et al. | Jack Lanchantin et al. | pytorch | https://doi.org/10.1101/329334 | MIT |
AttentiveChrome/E109 | 0.1 | Ritambhara Singh et al. | Jack Lanchantin et al. | pytorch | https://doi.org/10.1101/329334 | MIT |
AttentiveChrome/E112 | 0.1 | Ritambhara Singh et al. | Jack Lanchantin et al. | pytorch | https://doi.org/10.1101/329334 | MIT |
AttentiveChrome/E113 | 0.1 | Ritambhara Singh et al. | Jack Lanchantin et al. | pytorch | https://doi.org/10.1101/329334 | MIT |
AttentiveChrome/E114 | 0.1 | Ritambhara Singh et al. | Jack Lanchantin et al. | pytorch | https://doi.org/10.1101/329334 | MIT |
AttentiveChrome/E116 | 0.1 | Ritambhara Singh et al. | Jack Lanchantin et al. | pytorch | https://doi.org/10.1101/329334 | MIT |
AttentiveChrome/E117 | 0.1 | Ritambhara Singh et al. | Jack Lanchantin et al. | pytorch | https://doi.org/10.1101/329334 | MIT |
AttentiveChrome/E118 | 0.1 | Ritambhara Singh et al. | Jack Lanchantin et al. | pytorch | https://doi.org/10.1101/329334 | MIT |
AttentiveChrome/E119 | 0.1 | Ritambhara Singh et al. | Jack Lanchantin et al. | pytorch | https://doi.org/10.1101/329334 | MIT |
AttentiveChrome/E120 | 0.1 | Ritambhara Singh et al. | Jack Lanchantin et al. | pytorch | https://doi.org/10.1101/329334 | MIT |
AttentiveChrome/E122 | 0.1 | Ritambhara Singh et al. | Jack Lanchantin et al. | pytorch | https://doi.org/10.1101/329334 | MIT |
AttentiveChrome/E123 | 0.1 | Ritambhara Singh et al. | Jack Lanchantin et al. | pytorch | https://doi.org/10.1101/329334 | MIT |
AttentiveChrome/E127 | 0.1 | Ritambhara Singh et al. | Jack Lanchantin et al. | pytorch | https://doi.org/10.1101/329334 | MIT |
AttentiveChrome/E128 | 0.1 | Ritambhara Singh et al. | Jack Lanchantin et al. | pytorch | https://doi.org/10.1101/329334 | MIT |

Attentive Chrome Kipoi

Dependency Requirements

  • python>=3.5
  • numpy
  • pytorch-cpu
  • torchvision-cpu

Quick Start

Creating a new conda environment using kipoi

kipoi env create AttentiveChrome

Activating the environment

conda activate kipoi-AttentiveChrome

Command Line

Getting example input file

Replace {model_name} with the actual name of the model listed in the table above (e.g. E003, E005, etc.)

kipoi get-example AttentiveChrome/{model_name} -o example_file

example: kipoi get-example AttentiveChrome/E003 -o example_file

Predicting using the example file

kipoi predict AttentiveChrome/{model_name} --dataloader_args='{"input_file": "example_file/input_file", "bin_size": 100}' -o example_predict.tsv

This should produce a TSV file (example_predict.tsv) containing the results.
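
To quickly inspect the results in Python, the TSV file can be loaded with pandas (a minimal sketch; pandas is an assumed extra dependency not listed above, and the exact column names depend on the Kipoi version):

import pandas as pd

# Load the tab-separated predictions written by `kipoi predict`
predictions = pd.read_csv("example_predict.tsv", sep="\t")
print(predictions.head())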

Python

Fetching the model

First, import kipoi:

import kipoi

Next, get the model. Replace {model_name} with the actual name of the model (e.g. E003, E005, etc.)

model = kipoi.get_model("AttentiveChrome/{model_name}")
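
For example, fetching the E003 model listed in the table above:

model = kipoi.get_model("AttentiveChrome/E003")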

Predicting using the pipeline

prediction = model.pipeline.predict({"input_file": "path to input file", "bin_size": {some integer}})

This returns a numpy array containing the output from the final softmax function.

e.g. model.pipeline.predict({"input_file": "data/input_file", "bin_size": 100})
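
Putting these steps together for the E003 model and the example file downloaded earlier (a minimal sketch; the exact output shape depends on the contents of the input file):

import kipoi

model = kipoi.get_model("AttentiveChrome/E003")
# "example_file/input_file" is the example input fetched with `kipoi get-example` above
prediction = model.pipeline.predict({"input_file": "example_file/input_file", "bin_size": 100})
print(prediction.shape)  # numpy array holding the softmax outputs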

Predicting for a single batch

First, we need to set up our dataloader dl.

dl = model.default_dataloader(input_file="path to input file", bin_size={some integer})

Next, we can use the iterator functionality of the dataloader.

it = dl.batch_iter(batch_size=32)

single_batch = next(it)

The first line gives us an iterator named it, with each batch containing 32 items. We can then use next(it) to get a single batch.

Then, we can perform prediction on this single batch.

prediction = model.predict_on_batch(single_batch['inputs'])

This also returns a numpy array containing the output from the final softmax function.
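
To predict over the whole input rather than a single batch, the same pattern can be looped over all batches and the results stacked with numpy (a sketch, assuming the model and example file from the steps above):

import numpy as np

dl = model.default_dataloader(input_file="example_file/input_file", bin_size=100)
batch_predictions = []
for batch in dl.batch_iter(batch_size=32):
    # predict_on_batch returns a numpy array of softmax outputs for this batch
    batch_predictions.append(model.predict_on_batch(batch['inputs']))
all_predictions = np.concatenate(batch_predictions, axis=0)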