out of memory


New member

I am trying to blastx sequences against the NCBI nr database using DIAMOND 2.0.2. My understanding was that with the default block size of 2, roughly six times that amount of memory (about 12 GB) should be required. However, when requesting a total of 18 GB of memory (3 GB per core across six cores) on a cloud service, I run into OOM errors.
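For reference, the expectation above follows the rule of thumb that DIAMOND's peak memory is roughly six times the block size (set with -b) in gigabytes; this is an approximation, not a guarantee:

```shell
# Rule of thumb: DIAMOND peak memory ~= 6 x block size (-b), in GB.
# Actual use can exceed this under some workloads.
block_size=2                      # DIAMOND default block size
echo "$((6 * block_size)) GB"     # prints "12 GB"
```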

This is my request:

#SBATCH --account=def
#SBATCH --time=03-00:00:00
#SBATCH --cpus-per-task=6
#SBATCH --mem-per-cpu=3G

module load nixpkgs/16.09 gcc/7.3.0 diamond/2.0.2

for file in mort_temp/*_cgigas_un.fastq; do
  diamond blastx \
    -d diamond_db_20200813/nr.dmnd \
    -q "${file}" \
    --sensitive \
    --matrix BLOSUM62 \
    -o "${file%.fastq}"_diamond.txt \
    -f 6 qseqid qlen qstart qend sseqid sstart send length pident mismatch gapopen evalue bitscore score staxids stitle \
    -k 100 \
    -e 0.001
done

The seff output:

State: OUT_OF_MEMORY (exit code 0)
Nodes: 1
Cores per node: 6
CPU Utilized: 04:05:54
CPU Efficiency: 74.61% of 05:29:36 core-walltime
Job Wall-clock time: 00:54:56
Memory Utilized: 17.92 GB
Memory Efficiency: 99.54% of 18.00 GB

Am I misunderstanding something? Since I am limited by these unsuccessful runs, how can I determine the necessary memory requirements in advance?

Thank you very much for DIAMOND and the latest update; makedb ran with no issues!


Benjamin Buchfink

Staff member
Try using a smaller block size, such as -b1. It is not straightforward to predict memory use in advance, and it can increase under certain circumstances.
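For anyone landing here later, a minimal sketch of the adjusted loop with -b1, assuming the same database and file paths as in the original job script (not verified here). Halving the block size from the default of 2 should roughly halve peak memory, at the cost of a longer runtime:

```shell
# Same loop as in the question, with the block size lowered via -b1
# to reduce peak memory (rule of thumb: ~6x block size in GB, so ~6 GB here).
for file in mort_temp/*_cgigas_un.fastq; do
  diamond blastx \
    -d diamond_db_20200813/nr.dmnd \
    -q "${file}" \
    --sensitive -b1 \
    --matrix BLOSUM62 \
    -o "${file%.fastq}"_diamond.txt \
    -f 6 qseqid qlen qstart qend sseqid sstart send length pident mismatch gapopen evalue bitscore score staxids stitle \
    -k 100 -e 0.001
done
```

This is a job-script fragment, not a standalone program; it requires the diamond binary and the database file to be present.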