DeepFlyBrain

Authors: Ibrahim Ihsan Taskiran, Jasper Janssens, Sara Aibar, Stein Aerts

License: MIT

Contributed by: Ibrahim Ihsan Taskiran, Jasper Janssens, Sara Aibar, Stein Aerts

Cite as: TBA

Type: None

Postprocessing: None

Trained on: Accessible genomic sites.

Source files

Specialized deep learning model trained on chromatin accessibility data of Kenyon cells, T-neurons, and glia from the adult fly brain

Create a new conda environment with all dependencies installed
kipoi env create DeepFlyBrain
source activate kipoi-DeepFlyBrain
Install model dependencies into current environment
kipoi env install DeepFlyBrain
Test the model
kipoi test DeepFlyBrain --source=kipoi
Make a prediction
kipoi get-example DeepFlyBrain -o example
kipoi predict DeepFlyBrain \
  --dataloader_args='{"intervals_file": "example/intervals_file", "fasta_file": "example/fasta_file"}' \
  -o '/tmp/DeepFlyBrain.example_pred.tsv'
# check the results
head '/tmp/DeepFlyBrain.example_pred.tsv'
Get the model (Python)
import kipoi
model = kipoi.get_model('DeepFlyBrain')
Make a prediction for example files
pred = model.pipeline.predict_example()
Use dataloader and model separately
# Download example dataloader kwargs
dl_kwargs = model.default_dataloader.download_example('example')
# Get the dataloader and instantiate it
dl = model.default_dataloader(**dl_kwargs)
# get a batch iterator
it = dl.batch_iter(batch_size=4)
# predict for a batch
batch = next(it)
model.predict_on_batch(batch['inputs'])
Make predictions for custom files directly
pred = model.pipeline.predict(dl_kwargs, batch_size=4)
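For custom data, `dl_kwargs` is a plain dict pointing at your own files; the paths below are hypothetical placeholders, not files shipped with the model:

```python
# Hypothetical paths for illustration; substitute your own files.
dl_kwargs = {
    "intervals_file": "my_regions.bed",  # BED3 intervals (see dataloader arguments below)
    "fasta_file": "dm6.fa",              # reference genome FASTA
}
```

The keys match the dataloader arguments listed in the Dataloader section.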
Get the model (R)
library(reticulate)
kipoi <- import('kipoi')
model <- kipoi$get_model('DeepFlyBrain')
Make a prediction for example files
predictions <- model$pipeline$predict_example()
Use dataloader and model separately
# Download example dataloader kwargs
dl_kwargs <- model$default_dataloader$download_example('example')
# Get the dataloader and instantiate it with the kwargs list
dl <- do.call(model$default_dataloader, dl_kwargs)
# get a batch iterator
it <- dl$batch_iter(batch_size=4)
# predict for a batch
batch <- iter_next(it)
model$predict_on_batch(batch$inputs)
Make predictions for custom files directly
pred <- model$pipeline$predict(dl_kwargs, batch_size=4)
Get the docker image
docker pull haimasree/kipoi-docker:deepflybrain
Get the activated conda environment inside the container
docker run -it haimasree/kipoi-docker:deepflybrain
Test the model
docker run haimasree/kipoi-docker:deepflybrain kipoi test DeepFlyBrain --source=kipoi
Make predictions for custom files directly
# Create an example directory containing the data
mkdir -p $PWD/kipoi-example 
# You can replace $PWD/kipoi-example with a different absolute path containing the data 
docker run -v $PWD/kipoi-example:/app/ haimasree/kipoi-docker:deepflybrain \
kipoi get-example DeepFlyBrain -o /app/example 
docker run -v $PWD/kipoi-example:/app/ haimasree/kipoi-docker:deepflybrain \
kipoi predict DeepFlyBrain \
--dataloader_args='{"intervals_file": "/app/example/intervals_file", "fasta_file": "/app/example/fasta_file"}' \
-o '/app/DeepFlyBrain.example_pred.tsv' 
# check the results
head $PWD/kipoi-example/DeepFlyBrain.example_pred.tsv

Schema

Inputs

List of numpy arrays

Name: None

    Shape: (500, 4) 

    Doc: DNA sequence
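The dataloader produces the one-hot encoding internally (via kipoiseq), but the input format is easy to reproduce. A minimal sketch, assuming A/C/G/T channel order (an assumption for illustration; unknown bases become all-zero rows):

```python
import numpy as np

def one_hot_encode(seq, length=500):
    """One-hot encode a DNA sequence into a (length, 4) array.

    Channel order A, C, G, T is assumed here for illustration;
    bases outside ACGT (e.g. N) are left as all-zero rows.
    """
    mapping = {"A": 0, "C": 1, "G": 2, "T": 3}
    arr = np.zeros((length, 4), dtype=np.float32)
    for i, base in enumerate(seq[:length].upper()):
        if base in mapping:
            arr[i, mapping[base]] = 1.0
    return arr

x = one_hot_encode("ACGT" * 125)  # toy 500 bp sequence
print(x.shape)  # (500, 4)
```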


Targets

Single numpy array

Name: topic

    Shape: (81,) 

    Doc: Topic Prediction
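Each input region yields an 81-dimensional vector of topic scores. A common way to inspect a single prediction is to rank the topics; the vector below is random toy data standing in for one row of model output:

```python
import numpy as np

# Toy (81,) prediction vector standing in for one row of model output.
pred = np.random.default_rng(0).random(81)

# Indices of the five highest-scoring topics, best first.
top5 = np.argsort(pred)[::-1][:5]
for rank, t in enumerate(top5, start=1):
    print(f"rank {rank}: topic {t}, score {pred[t]:.3f}")
```

Which cell-type regulatory program each topic index corresponds to is defined by the original cisTopic analysis, not by this sketch.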


Dataloader

Defined as: .

Doc: Dataloader returning one-hot encoded sequences given genome intervals

Authors: Ibrahim Ihsan Taskiran

Type: None

License: MIT


Arguments

intervals_file: BED3 file with the genomic intervals to score.

fasta_file: Reference genome FASTA file path.

ignore_targets (optional): if True, don't return any target variables.
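A BED3 intervals file is just three tab-separated columns: chromosome, 0-based start, and end. The chromosome names and coordinates below are made up for illustration; intervals should be 500 bp to match the model's input length:

```python
# Write a minimal BED3 intervals file (chrom, 0-based start, end).
# Coordinates are hypothetical; each interval is 500 bp wide.
intervals = [
    ("chr2L", 10000, 10500),
    ("chr3R", 25000, 25500),
]
with open("intervals.bed", "w") as fh:
    for chrom, start, end in intervals:
        fh.write(f"{chrom}\t{start}\t{end}\n")
```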


Model dependencies
conda:
  • python=3.6
  • h5py=2.10.0
  • keras=2.2.4
  • tensorflow=1.14.0
  • pip=21.0.1

pip:

Dataloader dependencies
conda:
  • python=3.6
  • cython=0.29.23
  • bioconda::pybedtools=0.8.2
  • bioconda::pysam=0.16.0.1
  • bioconda::pyfaidx=0.6.1
  • numpy=1.19.5
  • pandas=1.1.5

pip:
  • kipoiseq