KipoiSplice/4cons

Authors: Ziga Avsec, Roman Kreuzhuber

License: MIT

Contributed by: Ziga Avsec, Roman Kreuzhuber

Cite as: https://doi.org/10.1101/375345

Type: sklearn

Postprocessing: None

Trained on: ClinVar (release 2018-04-29) variants in the range [-40, 10] nt around the splice acceptor or in the range [-10, 10] nt around the splice donor of a protein-coding gene. Only variants labelled 'Pathogenic' or 'Benign' were used. Data from all chromosomes was used for training.
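As an illustration of this selection rule, the sketch below checks whether a variant's position relative to the nearest splice site falls inside one of the two windows; the function and argument names are hypothetical and this is not the actual training code.

# Illustrative sketch of the window criterion described above (not the training pipeline).
# Positions are relative to the splice site; negative values lie upstream of it.
def in_training_window(rel_pos_acceptor=None, rel_pos_donor=None):
    near_acceptor = rel_pos_acceptor is not None and -40 <= rel_pos_acceptor <= 10
    near_donor = rel_pos_donor is not None and -10 <= rel_pos_donor <= 10
    return near_acceptor or near_donor

in_training_window(rel_pos_acceptor=-25)  # True: inside [-40, 10] around the acceptor
in_training_window(rel_pos_donor=15)      # False: outside [-10, 10] around the donor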

Source files

This model is similar to KipoiSplice/4, but is additionally based on 4 conservation features: phyloP46way_placental, phyloP46way_primate, CADD_raw and CADD_phred.
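Because these four scores must already be present in the VEP-annotated input VCF (see the vcf_file dataloader argument below), it can be worth checking for them up front. The snippet below is a rough sketch using cyvcf2, one of the dataloader dependencies; it assumes the scores are exposed as INFO fields under exactly these names, which depends on how the VCF was annotated.

# Rough sanity check that the input VCF carries the four conservation scores.
# Assumes they are stored as INFO fields with these exact names; adjust this
# if your VEP setup writes them elsewhere (e.g. into the CSQ field).
from cyvcf2 import VCF

required = ["phyloP46way_placental", "phyloP46way_primate", "CADD_raw", "CADD_phred"]
for variant in VCF("example/vcf_file"):
    missing = [k for k in required if variant.INFO.get(k) is None]
    if missing:
        print("{}:{} is missing {}".format(variant.CHROM, variant.POS, missing))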

Create a new conda environment with all dependencies installed
kipoi env create KipoiSplice/4cons
source activate kipoi-KipoiSplice__4cons
Test the model
kipoi test KipoiSplice/4cons --source=kipoi
Make a prediction
kipoi get-example KipoiSplice/4cons -o example
kipoi predict KipoiSplice/4cons \
  --dataloader_args='{"fasta_file": "example/fasta_file", "gtf_file": "example/gtf_file", "num_workers": 1, "vcf_file": "example/vcf_file"}' \
  -o '/tmp/KipoiSplice_4cons.example_pred.tsv'
# check the results
head '/tmp/KipoiSplice_4cons.example_pred.tsv'
Create a new conda environment with all dependencies installed
kipoi env create KipoiSplice/4cons
source activate kipoi-KipoiSplice__4cons
Get the model
import kipoi
model = kipoi.get_model('KipoiSplice/4cons')
Make a prediction for example files
pred = model.pipeline.predict_example(batch_size=4)
Use dataloader and model separately
# Download example dataloader kwargs
dl_kwargs = model.default_dataloader.download_example('example')
# Get the dataloader and instantiate it
dl = model.default_dataloader(**dl_kwargs)
# get a batch iterator
batch_iterator = dl.batch_iter(batch_size=4)
for batch in batch_iterator:
    # predict for a batch
    batch_pred = model.predict_on_batch(batch['inputs'])
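The loop above produces one prediction array per batch; to get a single array over the whole dataset, the per-batch results can be collected and concatenated, for example:

# Collect per-batch predictions into a single array over the whole dataset
import numpy as np

batch_preds = []
for batch in dl.batch_iter(batch_size=4):
    batch_preds.append(model.predict_on_batch(batch['inputs']))
all_preds = np.concatenate(batch_preds, axis=0)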
Make predictions for custom files directly
pred = model.pipeline.predict(dl_kwargs, batch_size=4)
Get the model
library(reticulate)
kipoi <- import('kipoi')
model <- kipoi$get_model('KipoiSplice/4cons')
Make a prediction for example files
predictions <- model$pipeline$predict_example()
Use dataloader and model separately
# Download example dataloader kwargs
dl_kwargs <- model$default_dataloader$download_example('example')
# Get the dataloader and instantiate it with the example kwargs
dl <- do.call(model$default_dataloader, dl_kwargs)
# get a batch iterator
it <- dl$batch_iter(batch_size=4)
# predict for a batch
batch <- iter_next(it)
model$predict_on_batch(batch$inputs)
Make predictions for custom files directly
pred <- model$pipeline$predict(dl_kwargs, batch_size=4)
Get the docker image
docker pull kipoi/kipoi-docker:kipoisplice-slim
Get the full sized docker image
docker pull kipoi/kipoi-docker:kipoisplice
Get the activated conda environment inside the container
docker run -it kipoi/kipoi-docker:kipoisplice-slim
Test the model
docker run kipoi/kipoi-docker:kipoisplice-slim kipoi test KipoiSplice/4cons --source=kipoi
Make predictions for custom files directly
# Create an example directory containing the data
mkdir -p $PWD/kipoi-example 
# You can replace $PWD/kipoi-example with a different absolute path containing the data 
docker run -v $PWD/kipoi-example:/app/ kipoi/kipoi-docker:kipoisplice-slim \
kipoi get-example KipoiSplice/4cons -o /app/example 
docker run -v $PWD/kipoi-example:/app/ kipoi/kipoi-docker:kipoisplice-slim \
kipoi predict KipoiSplice/4cons \
--dataloader_args='{"fasta_file": "/app/example/fasta_file", "gtf_file": "/app/example/gtf_file", "num_workers": 1, "vcf_file": "/app/example/vcf_file"}' \
-o '/app/KipoiSplice_4cons.example_pred.tsv' 
# check the results
head $PWD/kipoi-example/KipoiSplice_4cons.example_pred.tsv
Install apptainer
https://apptainer.org/docs/user/main/quick_start.html#quick-installation-steps
Make predictions for custom files directly
kipoi get-example KipoiSplice/4cons -o example
kipoi predict KipoiSplice/4cons \
--dataloader_args='{"fasta_file": "example/fasta_file", "gtf_file": "example/gtf_file", "num_workers": 1, "vcf_file": "example/vcf_file"}' \
-o 'KipoiSplice_4cons.example_pred.tsv' \
--singularity 
# check the results
head KipoiSplice_4cons.example_pred.tsv

Schema

Inputs

Single numpy array

Name: None

    Shape: (16,) 

    Doc: Model predictions of MaxEntScan, HAL and LaBranchoR + conservation


Targets

Single numpy array

Name: None

    Shape: (1,) 

    Doc: Pathogenicity score. The 0th index represents the probability of being benign and the 1st index represents the probability of being pathogenic.
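
To make the schema concrete, the sketch below feeds a dummy batch of the right shape to the model loaded with kipoi.get_model above; the random values only stand in for the 16 upstream-model and conservation features, so the resulting scores are meaningless and serve purely to illustrate the shapes.

# Illustrate the input/output schema with a dummy batch (values are random
# placeholders for the 16 features; real inputs come from the dataloader)
import numpy as np

dummy_batch = np.random.rand(4, 16)          # (batch_size, 16) inputs
preds = model.predict_on_batch(dummy_batch)  # expected shape per the schema: (4, 1)
print(preds.shape)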


Dataloader

Defined as: .

Doc: Predictions of 4 splicing models + conservation for all splice-sites in the GTF.

Authors: Ziga Avsec , Roman Kreuzhuber

Type: PreloadedDataset

License: MIT


Arguments

batch_size (optional): batch size to use with all the models

fasta_file : reference genome fasta file

gtf_file : path to the GTF file required by the models (Ensembl)

num_workers (optional): number of workers to use for each model

tmpdir (optional): path to the temporary directory in which to store the predictions

vcf_file : path to the input VCF file. The file has to be annotated with VEP. Specifically, it has to contain the following 4 scores: phyloP46way_placental, phyloP46way_primate, CADD_raw and CADD_phred (see the example kwargs below).
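
Putting these arguments together, a kwargs dictionary for your own files might look like the sketch below; the file paths are placeholders and the optional arguments can simply be omitted to use their defaults.

# Example dataloader kwargs for custom files (paths are placeholders)
dl_kwargs = {
    "fasta_file": "path/to/reference.fa",              # reference genome fasta
    "gtf_file": "path/to/annotation.gtf",              # Ensembl GTF with the annotated transcripts
    "vcf_file": "path/to/variants.vep_annotated.vcf",  # VEP-annotated VCF with the 4 conservation scores
    "num_workers": 1,                                  # optional
}
pred = model.pipeline.predict(dl_kwargs, batch_size=4)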


Model dependencies
conda:
  • pip=20.3.3

pip:
  • scikit-learn==0.22.2.post1
  • sklearn-pandas==1.8.0
  • tensorflow==1.13.1
  • numexpr==2.6.2

Dataloader dependencies
conda:
  • bioconda::pysam
  • bioconda::maxentpy
  • bioconda::pybedtools
  • bioconda::cyvcf2
  • pandas
  • numpy
  • h5py
  • attrs=17.4.0
  • python=3.5

pip:
  • pyvcf
  • intervaltree
  • joblib
  • scikit-learn
  • sklearn-pandas
  • kipoi==0.6.30
  • kipoi_utils==0.7.2
  • kipoi_veff
  • tqdm
  • tensorflow>=1.0.0
  • keras==2.2.4
  • protobuf==3.19.4