miga-base 0.7.26.0 → 0.7.26.1

Files changed (276)
  1. checksums.yaml +4 -4
  2. data/lib/miga/version.rb +1 -1
  3. data/utils/FastAAI/00.Libraries/01.SCG_HMMs/Archaea_SCG.hmm +41964 -0
  4. data/utils/FastAAI/00.Libraries/01.SCG_HMMs/Bacteria_SCG.hmm +32439 -0
  5. data/utils/FastAAI/00.Libraries/01.SCG_HMMs/Complete_SCG_DB.hmm +62056 -0
  6. data/utils/FastAAI/FastAAI/FastAAI +1336 -0
  7. data/utils/FastAAI/README.md +84 -0
  8. data/utils/FastAAI/kAAI_v1.0_virus.py +1296 -0
  9. data/utils/enveomics/Docs/recplot2.md +244 -0
  10. data/utils/enveomics/Examples/aai-matrix.bash +66 -0
  11. data/utils/enveomics/Examples/ani-matrix.bash +66 -0
  12. data/utils/enveomics/Examples/essential-phylogeny.bash +105 -0
  13. data/utils/enveomics/Examples/unus-genome-phylogeny.bash +100 -0
  14. data/utils/enveomics/LICENSE.txt +73 -0
  15. data/utils/enveomics/Makefile +52 -0
  16. data/utils/enveomics/Manifest/Tasks/aasubs.json +103 -0
  17. data/utils/enveomics/Manifest/Tasks/blasttab.json +786 -0
  18. data/utils/enveomics/Manifest/Tasks/distances.json +161 -0
  19. data/utils/enveomics/Manifest/Tasks/fasta.json +766 -0
  20. data/utils/enveomics/Manifest/Tasks/fastq.json +243 -0
  21. data/utils/enveomics/Manifest/Tasks/graphics.json +126 -0
  22. data/utils/enveomics/Manifest/Tasks/mapping.json +67 -0
  23. data/utils/enveomics/Manifest/Tasks/ogs.json +382 -0
  24. data/utils/enveomics/Manifest/Tasks/other.json +829 -0
  25. data/utils/enveomics/Manifest/Tasks/remote.json +355 -0
  26. data/utils/enveomics/Manifest/Tasks/sequence-identity.json +501 -0
  27. data/utils/enveomics/Manifest/Tasks/tables.json +308 -0
  28. data/utils/enveomics/Manifest/Tasks/trees.json +68 -0
  29. data/utils/enveomics/Manifest/Tasks/variants.json +111 -0
  30. data/utils/enveomics/Manifest/categories.json +156 -0
  31. data/utils/enveomics/Manifest/examples.json +154 -0
  32. data/utils/enveomics/Manifest/tasks.json +4 -0
  33. data/utils/enveomics/Pipelines/assembly.pbs/CONFIG.mock.bash +69 -0
  34. data/utils/enveomics/Pipelines/assembly.pbs/FastA.N50.pl +1 -0
  35. data/utils/enveomics/Pipelines/assembly.pbs/FastA.filterN.pl +1 -0
  36. data/utils/enveomics/Pipelines/assembly.pbs/FastA.length.pl +1 -0
  37. data/utils/enveomics/Pipelines/assembly.pbs/README.md +189 -0
  38. data/utils/enveomics/Pipelines/assembly.pbs/RUNME-2.bash +112 -0
  39. data/utils/enveomics/Pipelines/assembly.pbs/RUNME-3.bash +23 -0
  40. data/utils/enveomics/Pipelines/assembly.pbs/RUNME-4.bash +44 -0
  41. data/utils/enveomics/Pipelines/assembly.pbs/RUNME.bash +50 -0
  42. data/utils/enveomics/Pipelines/assembly.pbs/kSelector.R +37 -0
  43. data/utils/enveomics/Pipelines/assembly.pbs/newbler.pbs +68 -0
  44. data/utils/enveomics/Pipelines/assembly.pbs/newbler_preparator.pl +49 -0
  45. data/utils/enveomics/Pipelines/assembly.pbs/soap.pbs +80 -0
  46. data/utils/enveomics/Pipelines/assembly.pbs/stats.pbs +57 -0
  47. data/utils/enveomics/Pipelines/assembly.pbs/velvet.pbs +63 -0
  48. data/utils/enveomics/Pipelines/blast.pbs/01.pbs.bash +38 -0
  49. data/utils/enveomics/Pipelines/blast.pbs/02.pbs.bash +73 -0
  50. data/utils/enveomics/Pipelines/blast.pbs/03.pbs.bash +21 -0
  51. data/utils/enveomics/Pipelines/blast.pbs/BlastTab.recover_job.pl +72 -0
  52. data/utils/enveomics/Pipelines/blast.pbs/CONFIG.mock.bash +98 -0
  53. data/utils/enveomics/Pipelines/blast.pbs/FastA.split.pl +1 -0
  54. data/utils/enveomics/Pipelines/blast.pbs/README.md +127 -0
  55. data/utils/enveomics/Pipelines/blast.pbs/RUNME.bash +109 -0
  56. data/utils/enveomics/Pipelines/blast.pbs/TASK.check.bash +128 -0
  57. data/utils/enveomics/Pipelines/blast.pbs/TASK.dry.bash +16 -0
  58. data/utils/enveomics/Pipelines/blast.pbs/TASK.eo.bash +22 -0
  59. data/utils/enveomics/Pipelines/blast.pbs/TASK.pause.bash +26 -0
  60. data/utils/enveomics/Pipelines/blast.pbs/TASK.run.bash +89 -0
  61. data/utils/enveomics/Pipelines/blast.pbs/sentinel.pbs.bash +29 -0
  62. data/utils/enveomics/Pipelines/idba.pbs/README.md +49 -0
  63. data/utils/enveomics/Pipelines/idba.pbs/RUNME.bash +95 -0
  64. data/utils/enveomics/Pipelines/idba.pbs/run.pbs +56 -0
  65. data/utils/enveomics/Pipelines/trim.pbs/README.md +54 -0
  66. data/utils/enveomics/Pipelines/trim.pbs/RUNME.bash +70 -0
  67. data/utils/enveomics/Pipelines/trim.pbs/run.pbs +130 -0
  68. data/utils/enveomics/README.md +42 -0
  69. data/utils/enveomics/Scripts/AAsubs.log2ratio.rb +171 -0
  70. data/utils/enveomics/Scripts/Aln.cat.rb +163 -0
  71. data/utils/enveomics/Scripts/Aln.convert.pl +35 -0
  72. data/utils/enveomics/Scripts/AlphaDiversity.pl +152 -0
  73. data/utils/enveomics/Scripts/BedGraph.tad.rb +93 -0
  74. data/utils/enveomics/Scripts/BedGraph.window.rb +71 -0
  75. data/utils/enveomics/Scripts/BlastPairwise.AAsubs.pl +102 -0
  76. data/utils/enveomics/Scripts/BlastTab.addlen.rb +63 -0
  77. data/utils/enveomics/Scripts/BlastTab.advance.bash +48 -0
  78. data/utils/enveomics/Scripts/BlastTab.best_hit_sorted.pl +55 -0
  79. data/utils/enveomics/Scripts/BlastTab.catsbj.pl +104 -0
  80. data/utils/enveomics/Scripts/BlastTab.cogCat.rb +76 -0
  81. data/utils/enveomics/Scripts/BlastTab.filter.pl +47 -0
  82. data/utils/enveomics/Scripts/BlastTab.kegg_pep2path_rest.pl +194 -0
  83. data/utils/enveomics/Scripts/BlastTab.metaxaPrep.pl +104 -0
  84. data/utils/enveomics/Scripts/BlastTab.pairedHits.rb +157 -0
  85. data/utils/enveomics/Scripts/BlastTab.recplot2.R +48 -0
  86. data/utils/enveomics/Scripts/BlastTab.seqdepth.pl +86 -0
  87. data/utils/enveomics/Scripts/BlastTab.seqdepth_ZIP.pl +119 -0
  88. data/utils/enveomics/Scripts/BlastTab.seqdepth_nomedian.pl +86 -0
  89. data/utils/enveomics/Scripts/BlastTab.subsample.pl +47 -0
  90. data/utils/enveomics/Scripts/BlastTab.sumPerHit.pl +114 -0
  91. data/utils/enveomics/Scripts/BlastTab.taxid2taxrank.pl +90 -0
  92. data/utils/enveomics/Scripts/BlastTab.topHits_sorted.rb +101 -0
  93. data/utils/enveomics/Scripts/Chao1.pl +97 -0
  94. data/utils/enveomics/Scripts/CharTable.classify.rb +234 -0
  95. data/utils/enveomics/Scripts/EBIseq2tax.rb +83 -0
  96. data/utils/enveomics/Scripts/FastA.N50.pl +56 -0
  97. data/utils/enveomics/Scripts/FastA.extract.rb +152 -0
  98. data/utils/enveomics/Scripts/FastA.filter.pl +52 -0
  99. data/utils/enveomics/Scripts/FastA.filterLen.pl +28 -0
  100. data/utils/enveomics/Scripts/FastA.filterN.pl +60 -0
  101. data/utils/enveomics/Scripts/FastA.fragment.rb +92 -0
  102. data/utils/enveomics/Scripts/FastA.gc.pl +42 -0
  103. data/utils/enveomics/Scripts/FastA.interpose.pl +93 -0
  104. data/utils/enveomics/Scripts/FastA.length.pl +38 -0
  105. data/utils/enveomics/Scripts/FastA.mask.rb +89 -0
  106. data/utils/enveomics/Scripts/FastA.per_file.pl +36 -0
  107. data/utils/enveomics/Scripts/FastA.qlen.pl +57 -0
  108. data/utils/enveomics/Scripts/FastA.rename.pl +65 -0
  109. data/utils/enveomics/Scripts/FastA.revcom.pl +23 -0
  110. data/utils/enveomics/Scripts/FastA.sample.rb +83 -0
  111. data/utils/enveomics/Scripts/FastA.slider.pl +85 -0
  112. data/utils/enveomics/Scripts/FastA.split.pl +55 -0
  113. data/utils/enveomics/Scripts/FastA.split.rb +79 -0
  114. data/utils/enveomics/Scripts/FastA.subsample.pl +131 -0
  115. data/utils/enveomics/Scripts/FastA.tag.rb +65 -0
  116. data/utils/enveomics/Scripts/FastA.wrap.rb +48 -0
  117. data/utils/enveomics/Scripts/FastQ.filter.pl +54 -0
  118. data/utils/enveomics/Scripts/FastQ.interpose.pl +90 -0
  119. data/utils/enveomics/Scripts/FastQ.offset.pl +90 -0
  120. data/utils/enveomics/Scripts/FastQ.split.pl +53 -0
  121. data/utils/enveomics/Scripts/FastQ.tag.rb +63 -0
  122. data/utils/enveomics/Scripts/FastQ.test-error.rb +81 -0
  123. data/utils/enveomics/Scripts/FastQ.toFastA.awk +24 -0
  124. data/utils/enveomics/Scripts/GFF.catsbj.pl +127 -0
  125. data/utils/enveomics/Scripts/GenBank.add_fields.rb +84 -0
  126. data/utils/enveomics/Scripts/HMM.essential.rb +351 -0
  127. data/utils/enveomics/Scripts/HMM.haai.rb +168 -0
  128. data/utils/enveomics/Scripts/HMMsearch.extractIds.rb +83 -0
  129. data/utils/enveomics/Scripts/JPlace.distances.rb +88 -0
  130. data/utils/enveomics/Scripts/JPlace.to_iToL.rb +320 -0
  131. data/utils/enveomics/Scripts/M5nr.getSequences.rb +81 -0
  132. data/utils/enveomics/Scripts/MeTaxa.distribution.pl +198 -0
  133. data/utils/enveomics/Scripts/MyTaxa.fragsByTax.pl +35 -0
  134. data/utils/enveomics/Scripts/MyTaxa.seq-taxrank.rb +49 -0
  135. data/utils/enveomics/Scripts/NCBIacc2tax.rb +92 -0
  136. data/utils/enveomics/Scripts/Newick.autoprune.R +27 -0
  137. data/utils/enveomics/Scripts/RAxML-EPA.to_iToL.pl +228 -0
  138. data/utils/enveomics/Scripts/RecPlot2.compareIdentities.R +32 -0
  139. data/utils/enveomics/Scripts/RefSeq.download.bash +48 -0
  140. data/utils/enveomics/Scripts/SRA.download.bash +57 -0
  141. data/utils/enveomics/Scripts/TRIBS.plot-test.R +36 -0
  142. data/utils/enveomics/Scripts/TRIBS.test.R +39 -0
  143. data/utils/enveomics/Scripts/Table.barplot.R +31 -0
  144. data/utils/enveomics/Scripts/Table.df2dist.R +30 -0
  145. data/utils/enveomics/Scripts/Table.filter.pl +61 -0
  146. data/utils/enveomics/Scripts/Table.merge.pl +77 -0
  147. data/utils/enveomics/Scripts/Table.replace.rb +69 -0
  148. data/utils/enveomics/Scripts/Table.round.rb +63 -0
  149. data/utils/enveomics/Scripts/Table.split.pl +57 -0
  150. data/utils/enveomics/Scripts/Taxonomy.silva2ncbi.rb +227 -0
  151. data/utils/enveomics/Scripts/VCF.KaKs.rb +147 -0
  152. data/utils/enveomics/Scripts/VCF.SNPs.rb +88 -0
  153. data/utils/enveomics/Scripts/aai.rb +418 -0
  154. data/utils/enveomics/Scripts/ani.rb +362 -0
  155. data/utils/enveomics/Scripts/clust.rand.rb +102 -0
  156. data/utils/enveomics/Scripts/gi2tax.rb +103 -0
  157. data/utils/enveomics/Scripts/in_silico_GA_GI.pl +96 -0
  158. data/utils/enveomics/Scripts/lib/data/dupont_2012_essential.hmm.gz +0 -0
  159. data/utils/enveomics/Scripts/lib/data/lee_2019_essential.hmm.gz +0 -0
  160. data/utils/enveomics/Scripts/lib/enveomics.R +1 -0
  161. data/utils/enveomics/Scripts/lib/enveomics_rb/enveomics.rb +24 -0
  162. data/utils/enveomics/Scripts/lib/enveomics_rb/jplace.rb +253 -0
  163. data/utils/enveomics/Scripts/lib/enveomics_rb/og.rb +182 -0
  164. data/utils/enveomics/Scripts/lib/enveomics_rb/remote_data.rb +74 -0
  165. data/utils/enveomics/Scripts/lib/enveomics_rb/seq_range.rb +237 -0
  166. data/utils/enveomics/Scripts/lib/enveomics_rb/stat.rb +30 -0
  167. data/utils/enveomics/Scripts/lib/enveomics_rb/vcf.rb +135 -0
  168. data/utils/enveomics/Scripts/ogs.annotate.rb +88 -0
  169. data/utils/enveomics/Scripts/ogs.core-pan.rb +160 -0
  170. data/utils/enveomics/Scripts/ogs.extract.rb +125 -0
  171. data/utils/enveomics/Scripts/ogs.mcl.rb +186 -0
  172. data/utils/enveomics/Scripts/ogs.rb +104 -0
  173. data/utils/enveomics/Scripts/ogs.stats.rb +131 -0
  174. data/utils/enveomics/Scripts/rbm.rb +146 -0
  175. data/utils/enveomics/Tests/Makefile +10 -0
  176. data/utils/enveomics/Tests/Mgen_M2288.faa +3189 -0
  177. data/utils/enveomics/Tests/Mgen_M2288.fna +8282 -0
  178. data/utils/enveomics/Tests/Mgen_M2321.fna +8288 -0
  179. data/utils/enveomics/Tests/Nequ_Kin4M.faa +2970 -0
  180. data/utils/enveomics/Tests/Xanthomonas_oryzae-PilA.tribs.Rdata +0 -0
  181. data/utils/enveomics/Tests/Xanthomonas_oryzae-PilA.txt +7 -0
  182. data/utils/enveomics/Tests/Xanthomonas_oryzae.aai-mat.tsv +17 -0
  183. data/utils/enveomics/Tests/Xanthomonas_oryzae.aai.tsv +137 -0
  184. data/utils/enveomics/Tests/a_mg.cds-go.blast.tsv +123 -0
  185. data/utils/enveomics/Tests/a_mg.reads-cds.blast.tsv +200 -0
  186. data/utils/enveomics/Tests/a_mg.reads-cds.counts.tsv +55 -0
  187. data/utils/enveomics/Tests/alkB.nwk +1 -0
  188. data/utils/enveomics/Tests/anthrax-cansnp-data.tsv +13 -0
  189. data/utils/enveomics/Tests/anthrax-cansnp-key.tsv +17 -0
  190. data/utils/enveomics/Tests/hiv1.faa +59 -0
  191. data/utils/enveomics/Tests/hiv1.fna +134 -0
  192. data/utils/enveomics/Tests/hiv2.faa +70 -0
  193. data/utils/enveomics/Tests/hiv_mix-hiv1.blast.tsv +233 -0
  194. data/utils/enveomics/Tests/hiv_mix-hiv1.blast.tsv.lim +1 -0
  195. data/utils/enveomics/Tests/hiv_mix-hiv1.blast.tsv.rec +233 -0
  196. data/utils/enveomics/Tests/phyla_counts.tsv +10 -0
  197. data/utils/enveomics/Tests/primate_lentivirus.ogs +11 -0
  198. data/utils/enveomics/Tests/primate_lentivirus.rbm/hiv1-hiv1.rbm +9 -0
  199. data/utils/enveomics/Tests/primate_lentivirus.rbm/hiv1-hiv2.rbm +8 -0
  200. data/utils/enveomics/Tests/primate_lentivirus.rbm/hiv1-siv.rbm +6 -0
  201. data/utils/enveomics/Tests/primate_lentivirus.rbm/hiv2-hiv2.rbm +9 -0
  202. data/utils/enveomics/Tests/primate_lentivirus.rbm/hiv2-siv.rbm +6 -0
  203. data/utils/enveomics/Tests/primate_lentivirus.rbm/siv-siv.rbm +6 -0
  204. data/utils/enveomics/build_enveomics_r.bash +45 -0
  205. data/utils/enveomics/enveomics.R/DESCRIPTION +31 -0
  206. data/utils/enveomics/enveomics.R/NAMESPACE +39 -0
  207. data/utils/enveomics/enveomics.R/R/autoprune.R +155 -0
  208. data/utils/enveomics/enveomics.R/R/barplot.R +184 -0
  209. data/utils/enveomics/enveomics.R/R/cliopts.R +135 -0
  210. data/utils/enveomics/enveomics.R/R/df2dist.R +154 -0
  211. data/utils/enveomics/enveomics.R/R/growthcurve.R +331 -0
  212. data/utils/enveomics/enveomics.R/R/recplot.R +354 -0
  213. data/utils/enveomics/enveomics.R/R/recplot2.R +1631 -0
  214. data/utils/enveomics/enveomics.R/R/tribs.R +583 -0
  215. data/utils/enveomics/enveomics.R/R/utils.R +50 -0
  216. data/utils/enveomics/enveomics.R/README.md +80 -0
  217. data/utils/enveomics/enveomics.R/data/growth.curves.rda +0 -0
  218. data/utils/enveomics/enveomics.R/data/phyla.counts.rda +0 -0
  219. data/utils/enveomics/enveomics.R/man/cash-enve.GrowthCurve-method.Rd +17 -0
  220. data/utils/enveomics/enveomics.R/man/cash-enve.RecPlot2-method.Rd +17 -0
  221. data/utils/enveomics/enveomics.R/man/cash-enve.RecPlot2.Peak-method.Rd +17 -0
  222. data/utils/enveomics/enveomics.R/man/enve.GrowthCurve-class.Rd +25 -0
  223. data/utils/enveomics/enveomics.R/man/enve.TRIBS-class.Rd +46 -0
  224. data/utils/enveomics/enveomics.R/man/enve.TRIBS.merge.Rd +23 -0
  225. data/utils/enveomics/enveomics.R/man/enve.TRIBStest-class.Rd +47 -0
  226. data/utils/enveomics/enveomics.R/man/enve.__prune.iter.Rd +23 -0
  227. data/utils/enveomics/enveomics.R/man/enve.__prune.reduce.Rd +23 -0
  228. data/utils/enveomics/enveomics.R/man/enve.__tribs.Rd +32 -0
  229. data/utils/enveomics/enveomics.R/man/enve.barplot.Rd +91 -0
  230. data/utils/enveomics/enveomics.R/man/enve.cliopts.Rd +57 -0
  231. data/utils/enveomics/enveomics.R/man/enve.col.alpha.Rd +24 -0
  232. data/utils/enveomics/enveomics.R/man/enve.col2alpha.Rd +19 -0
  233. data/utils/enveomics/enveomics.R/man/enve.df2dist.Rd +39 -0
  234. data/utils/enveomics/enveomics.R/man/enve.df2dist.group.Rd +38 -0
  235. data/utils/enveomics/enveomics.R/man/enve.df2dist.list.Rd +40 -0
  236. data/utils/enveomics/enveomics.R/man/enve.growthcurve.Rd +67 -0
  237. data/utils/enveomics/enveomics.R/man/enve.prune.dist.Rd +37 -0
  238. data/utils/enveomics/enveomics.R/man/enve.recplot.Rd +122 -0
  239. data/utils/enveomics/enveomics.R/man/enve.recplot2-class.Rd +45 -0
  240. data/utils/enveomics/enveomics.R/man/enve.recplot2.ANIr.Rd +24 -0
  241. data/utils/enveomics/enveomics.R/man/enve.recplot2.Rd +68 -0
  242. data/utils/enveomics/enveomics.R/man/enve.recplot2.__counts.Rd +25 -0
  243. data/utils/enveomics/enveomics.R/man/enve.recplot2.__peakHist.Rd +21 -0
  244. data/utils/enveomics/enveomics.R/man/enve.recplot2.__whichClosestPeak.Rd +19 -0
  245. data/utils/enveomics/enveomics.R/man/enve.recplot2.changeCutoff.Rd +19 -0
  246. data/utils/enveomics/enveomics.R/man/enve.recplot2.compareIdentities.Rd +41 -0
  247. data/utils/enveomics/enveomics.R/man/enve.recplot2.coordinates.Rd +29 -0
  248. data/utils/enveomics/enveomics.R/man/enve.recplot2.corePeak.Rd +18 -0
  249. data/utils/enveomics/enveomics.R/man/enve.recplot2.extractWindows.Rd +40 -0
  250. data/utils/enveomics/enveomics.R/man/enve.recplot2.findPeaks.Rd +36 -0
  251. data/utils/enveomics/enveomics.R/man/enve.recplot2.findPeaks.__em_e.Rd +19 -0
  252. data/utils/enveomics/enveomics.R/man/enve.recplot2.findPeaks.__em_m.Rd +19 -0
  253. data/utils/enveomics/enveomics.R/man/enve.recplot2.findPeaks.__emauto_one.Rd +27 -0
  254. data/utils/enveomics/enveomics.R/man/enve.recplot2.findPeaks.__mow_one.Rd +41 -0
  255. data/utils/enveomics/enveomics.R/man/enve.recplot2.findPeaks.__mower.Rd +17 -0
  256. data/utils/enveomics/enveomics.R/man/enve.recplot2.findPeaks.em.Rd +43 -0
  257. data/utils/enveomics/enveomics.R/man/enve.recplot2.findPeaks.emauto.Rd +37 -0
  258. data/utils/enveomics/enveomics.R/man/enve.recplot2.findPeaks.mower.Rd +74 -0
  259. data/utils/enveomics/enveomics.R/man/enve.recplot2.peak-class.Rd +59 -0
  260. data/utils/enveomics/enveomics.R/man/enve.recplot2.seqdepth.Rd +27 -0
  261. data/utils/enveomics/enveomics.R/man/enve.recplot2.windowDepthThreshold.Rd +32 -0
  262. data/utils/enveomics/enveomics.R/man/enve.tribs.Rd +59 -0
  263. data/utils/enveomics/enveomics.R/man/enve.tribs.test.Rd +28 -0
  264. data/utils/enveomics/enveomics.R/man/enve.truncate.Rd +27 -0
  265. data/utils/enveomics/enveomics.R/man/growth.curves.Rd +14 -0
  266. data/utils/enveomics/enveomics.R/man/phyla.counts.Rd +13 -0
  267. data/utils/enveomics/enveomics.R/man/plot.enve.GrowthCurve.Rd +63 -0
  268. data/utils/enveomics/enveomics.R/man/plot.enve.TRIBS.Rd +38 -0
  269. data/utils/enveomics/enveomics.R/man/plot.enve.TRIBStest.Rd +38 -0
  270. data/utils/enveomics/enveomics.R/man/plot.enve.recplot2.Rd +111 -0
  271. data/utils/enveomics/enveomics.R/man/summary.enve.GrowthCurve.Rd +19 -0
  272. data/utils/enveomics/enveomics.R/man/summary.enve.TRIBS.Rd +19 -0
  273. data/utils/enveomics/enveomics.R/man/summary.enve.TRIBStest.Rd +19 -0
  274. data/utils/enveomics/globals.mk +8 -0
  275. data/utils/enveomics/manifest.json +9 -0
  276. metadata +277 -4
data/utils/enveomics/Pipelines/assembly.pbs/stats.pbs
@@ -0,0 +1,57 @@
+ #!/bin/bash
+ #PBS -q iw-shared-6
+ #PBS -l nodes=1:ppn=1
+ #PBS -l mem=1gb
+ #PBS -l walltime=3:00:00
+ #PBS -k oe
+
+ # Check mandatory variables
+ if [[ "$LIB" == "" ]]; then
+   echo "Error: LIB is mandatory" >&2
+   exit 1;
+ fi
+ if [[ "$PDIR" == "" ]]; then
+   echo "Error: PDIR is mandatory" >&2
+   exit 1;
+ fi
+
+ # Run
+ module load perl/5.14.4
+ echo "K N50 used reads " > $LIB.velvet.n50
+ echo "K N50 used reads " > $LIB.soap.n50
+ for ID in $(seq 10 31); do
+   let KMER=$ID*2+1
+   DIRV="$LIB.velvet_$KMER"
+   DIRS="$LIB.soap_$KMER"
+   echo $KMER > $LIB.velvet.n50.$KMER
+   echo $KMER > $LIB.soap.n50.$KMER
+   # N50 (>=500)
+   perl "$PDIR/FastA.N50.pl" "$DIRV/contigs.fa" 500 | grep '^N50' | sed -e 's/.*: //' >> $LIB.velvet.n50.$KMER
+   perl "$PDIR/FastA.N50.pl" "$DIRS/O.contig" 500 | grep '^N50' | sed -e 's/.*: //' >> $LIB.soap.n50.$KMER
+   # Used and Total reads
+   tail -n 1 $DIRV/Log | sed -e 's/.* using \([0-9]*\)\/\([0-9]*\) reads.*/\1\n\2/' >> $LIB.velvet.n50.$KMER
+   if [ -e "$DIRS/O.readOnContig" ] ; then
+     cat "$DIRS/O.readOnContig" | grep -vc '^read' >> $LIB.soap.n50.$KMER
+   elif [ -e "$DIRS/O.readOnContig.gz" ] ; then
+     zcat "$DIRS/O.readOnContig.gz" | grep -vc '^read' >> $LIB.soap.n50.$KMER
+   else
+     echo 0 >> $LIB.soap.n50.$KMER
+   fi
+   head -n 1 $DIRS/O.peGrads | awk '{print $3}' >> $LIB.soap.n50.$KMER
+   # Join
+   (cat $LIB.velvet.n50.$KMER | tr "\n" " "; echo) >> $LIB.velvet.n50
+   rm $LIB.velvet.n50.$KMER
+   (cat $LIB.soap.n50.$KMER | tr "\n" " "; echo) >> $LIB.soap.n50
+   rm $LIB.soap.n50.$KMER
+ done
+
+ # Create plot
+ module load R/3.1.2
+ echo "
+ source('$PDIR/kSelector.R');
+ pdf('$LIB.n50.pdf', 13, 7);
+ kSelector('$LIB.velvet.n50', '$LIB (Velvet)');
+ kSelector('$LIB.soap.n50', '$LIB (SOAP)');
+ dev.off();
+ " | R --vanilla -q
+
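The sweep above assembles with every odd k-mer between 21 and 63 (`ID` runs 10–31, `KMER = ID*2 + 1`) so the plot can be used to pick the best k. A standalone sketch of just that arithmetic, with nothing cluster-specific:

```shell
# Reproduce the k-mer sweep arithmetic from stats.pbs: ID runs 10..31,
# so KMER = ID*2 + 1 yields the 22 odd values 21, 23, ..., 63.
kmers=""
for ID in $(seq 10 31); do
  KMER=$((ID * 2 + 1))   # same arithmetic as the 'let' line in the script
  kmers="$kmers $KMER"
done
echo "kmers:$kmers"
```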
data/utils/enveomics/Pipelines/assembly.pbs/velvet.pbs
@@ -0,0 +1,63 @@
+ #!/bin/bash
+ #PBS -l nodes=1:ppn=1
+ #PBS -k oe
+
+ # Some defaults for the parameters
+ FORMAT=${FORMAT:-fasta};
+ INSLEN=${INSLEN:-300};
+ USECOUPLED=${USECOUPLED:-yes};
+ USESINGLE=${USESINGLE:-no};
+ CLEANUP=${CLEANUP:-yes}
+
+ # Check mandatory variables
+ if [[ "$LIB" == "" ]]; then
+   echo "Error: LIB is mandatory" >&2
+   exit 1;
+ fi
+ if [[ "$PDIR" == "" ]]; then
+   echo "Error: PDIR is mandatory" >&2
+   exit 1;
+ fi
+ if [[ "$DATA" == "" ]]; then
+   echo "Error: DATA is mandatory" >&2
+   exit 1;
+ fi
+
+ # Prepare input
+ KMER=$PBS_ARRAYID
+ CWD=$(pwd)
+ DIR="$CWD/$LIB.velvet_$KMER"
+
+ # Run
+ module load velvet/1.2.10
+ echo velveth > $DIR.proc
+ CMD="velveth_101_omp $DIR $KMER -$FORMAT"
+ if [[ "$USECOUPLED" == "yes" ]]; then
+   CMD="$CMD -shortPaired $DATA/$LIB.CoupledReads.fa"
+ fi
+ if [[ "$USESINGLE" == "yes" ]]; then
+   CMD="$CMD -short $DATA/$LIB.SingleReads.fa"
+ fi
+ if [[ "$VELVETH_EXTRA" != "" ]]; then
+   CMD="$CMD $VELVETH_EXTRA"
+ fi
+ $CMD &> $DIR.hlog
+ echo velvetg > $DIR.proc
+ velvetg_101_omp "$DIR" -exp_cov auto -cov_cutoff auto -ins_length "$INSLEN" $VELVETG_EXTRA &> $DIR.glog
+ if [[ -d $DIR ]] ; then
+   if [[ -s $DIR/contigs.fa ]] ; then
+     if [[ "$CLEANUP" != "no" ]] ; then
+       echo cleanup > $DIR.proc
+       rm $DIR/Sequences
+       rm $DIR/Roadmaps
+       rm $DIR/*Graph*
+     fi
+     echo done > $DIR.proc
+   else
+     echo "$0: Error: File $DIR/contigs.fa doesn't exist, something went wrong" >&2
+     exit 1
+   fi
+ else
+   echo "$0: Error: Directory $DIR doesn't exist, something went wrong" >&2
+   exit 1
+ fi
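The velveth invocation above is built up incrementally in `$CMD`, appending flags only for the inputs that are enabled. A minimal, self-contained sketch of that pattern (the `velveth` binary, k-mer, and file names here are placeholders, not the site-specific `velveth_101_omp` call):

```shell
# Conditional command assembly, as in velvet.pbs: start with the base
# command, then append optional flags depending on toggle variables.
FORMAT=fasta
USECOUPLED=yes
USESINGLE=no
CMD="velveth out_dir 31 -$FORMAT"
[ "$USECOUPLED" = "yes" ] && CMD="$CMD -shortPaired reads.fa"
[ "$USESINGLE" = "yes" ] && CMD="$CMD -short single.fa"
echo "$CMD"
```

Note the script later executes the unquoted `$CMD`, relying on word splitting; a bash array (`CMD+=(-short single.fa); "${CMD[@]}"`) is the more robust idiom when arguments may contain spaces.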
data/utils/enveomics/Pipelines/blast.pbs/01.pbs.bash
@@ -0,0 +1,38 @@
+ # blast.pbs pipeline
+ # Step 01 : Initialize input files
+
+ # 00. Read configuration
+ cd $SCRATCH ;
+ TASK="dry" ;
+ source "$PDIR/RUNME.bash" ;
+ echo "$PBS_JOBID" > "$SCRATCH/success/01.00" ;
+
+ if [[ ! -e "$SCRATCH/success/01.01" ]] ; then
+   # 01. BEGIN
+   REGISTER_JOB "01" "01" "Custom BEGIN function" \
+     && BEGIN \
+     || exit 1 ;
+   touch "$SCRATCH/success/01.01" ;
+ fi
+
+ if [[ ! -e "$SCRATCH/success/01.02" ]] ; then
+   # 02. Split
+   [[ -d "$SCRATCH/tmp/split" ]] && rm -R "$SCRATCH/tmp/split" ;
+   REGISTER_JOB "01" "02" "Splitting query files" \
+     && mkdir "$SCRATCH/tmp/split" \
+     && perl "$PDIR/FastA.split.pl" "$INPUT" "$SCRATCH/tmp/split/$PROJ" "$MAX_JOBS" \
+     || exit 1 ;
+   touch "$SCRATCH/success/01.02" ;
+ fi ;
+
+ if [[ ! -e "$SCRATCH/success/01.03" ]] ; then
+   # 03. Finalize
+   REGISTER_JOB "01" "03" "Finalizing input preparation" \
+     && mv "$SCRATCH/tmp/split" "$SCRATCH/tmp/in" \
+     || exit 1 ;
+   touch "$SCRATCH/success/01.03" ;
+ fi ;
+
+ [[ -d "$SCRATCH/tmp/out" ]] || ( mkdir "$SCRATCH/tmp/out" || exit 1 ) ;
+ JOB_DONE "01" ;
+
data/utils/enveomics/Pipelines/blast.pbs/02.pbs.bash
@@ -0,0 +1,73 @@
+ # blast.pbs pipeline
+ # Step 02 : Run BLAST
+
+ # Read configuration
+ cd $SCRATCH ;
+ TASK="dry" ;
+ source "$PDIR/RUNME.bash" ;
+
+ # 00. Initial vars
+ ID_N=$PBS_ARRAYID
+ [[ "$ID_N" == "" ]] && exit 1 ;
+ [[ -e "$SCRATCH/success/02.$ID_N" ]] && exit 0 ;
+ IN="$SCRATCH/tmp/in/$PROJ.$ID_N.fa" ;
+ OUT="$SCRATCH/tmp/out/$PROJ.blast.$ID_N" ;
+ FINAL_OUT="$SCRATCH/results/$PROJ.$ID_N.blast" ;
+ if [[ -e "$SCRATCH/success/02.$ID_N.00" ]] ; then
+   pre_job=$(cat "$SCRATCH/success/02.$ID_N.00") ;
+   state=$(qstat -f "$pre_job" 2>/dev/null | grep job_state | sed -e 's/.*= //')
+   if [[ "$state" == "R" ]] ; then
+     echo "Warning: This task is already being executed by $pre_job. Aborting." >&2 ;
+     exit 0 ;
+   elif [[ "$state" == "" ]] ; then
+     echo "Warning: This task was initialized by $pre_job, but it's currently not running. Superseding." >&2 ;
+   fi ;
+ fi
+ echo "$PBS_JOBID" > "$SCRATCH/success/02.$ID_N.00" ;
+
+ # 01. Before BLAST
+ if [[ ! -e "$SCRATCH/success/02.$ID_N.01" ]] ; then
+   BEFORE_BLAST "$IN" "$OUT" || exit 1 ;
+   touch "$SCRATCH/success/02.$ID_N.01" ;
+ fi ;
+
+ # 02. Run BLAST
+ if [[ ! -e "$SCRATCH/success/02.$ID_N.02" ]] ; then
+   # Recover previous runs, if any
+   if [[ -s "$OUT" ]] ; then
+     perl "$PDIR/BlastTab.recover_job.pl" "$IN" "$OUT" \
+       || exit 1 ;
+   fi ;
+   # Run BLAST
+   RUN_BLAST "$IN" "$OUT" \
+     && mv "$OUT" "$OUT-z" \
+     || exit 1 ;
+   touch "$SCRATCH/success/02.$ID_N.02" ;
+ fi ;
+
+ # 03. Collect BLAST parts
+ if [[ ! -e "$SCRATCH/success/02.$ID_N.03" ]] ; then
+   if [[ -e "$OUT" ]] ; then
+     echo "Warning: The file $OUT pre-exists, but the BLAST collection was incomplete." >&2 ;
+     echo "  I'm assuming that it corresponds to the first part of the result, but you should check manually." >&2 ;
+     echo "  The last lines are:" >&2 ;
+     tail -n 3 "$OUT" >&2 ;
+   else
+     touch "$OUT" || exit 1 ;
+   fi ;
+   for i in $(ls $OUT-*) ; do
+     cat "$i" >> "$OUT" ;
+     rm "$i" || exit 1 ;
+   done ;
+   mv "$OUT" "$FINAL_OUT"
+   touch "$SCRATCH/success/02.$ID_N.03" ;
+ fi ;
+
+ # 04. After BLAST
+ if [[ ! -e "$SCRATCH/success/02.$ID_N.04" ]] ; then
+   AFTER_BLAST "$IN" "$FINAL_OUT" || exit 1 ;
+   touch "$SCRATCH/success/02.$ID_N.04" ;
+ fi ;
+
+ touch "$SCRATCH/success/02.$ID_N" ;
+
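Every numbered sub-step above is wrapped in the same checkpoint idiom: skip if a marker file under `success/` exists, otherwise do the work and create the marker, so a re-submitted array task resumes where it stopped. A minimal sketch of that pattern in isolation (the temp directory and step names here are illustrative):

```shell
# Success-marker checkpointing, as used throughout the blast.pbs steps:
# a step is a no-op when its marker already exists.
SCRATCH=$(mktemp -d)
mkdir -p "$SCRATCH/success"

run_step() {
  local marker="$SCRATCH/success/$1"
  [ -e "$marker" ] && return 0   # checkpoint hit: skip the work
  echo "running $1"              # stand-in for the real work
  touch "$marker"                # record completion only on success
}

run_step "02.7.01"   # first call does the work
run_step "02.7.01"   # second call is skipped
```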
data/utils/enveomics/Pipelines/blast.pbs/03.pbs.bash
@@ -0,0 +1,21 @@
+ # blast.pbs pipeline
+ # Step 03 : Finalize
+
+ # Read configuration
+ cd $SCRATCH ;
+ TASK="dry" ;
+ source "$PDIR/RUNME.bash" ;
+ PREFIX="$SCRATCH/results/$PROJ" ;
+ OUT="$SCRATCH/$PROJ.blast" ;
+ echo "$PBS_JOBID" > "$SCRATCH/success/02.00" ;
+
+ # 01. END
+ if [[ ! -e "$SCRATCH/success/03.01" ]] ; then
+   REGISTER_JOB "03" "01" "Custom END function" \
+     && END "$PREFIX" "$OUT" \
+     || exit 1 ;
+   touch "$SCRATCH/success/03.01" ;
+ fi ;
+
+ JOB_DONE "03" ;
+
data/utils/enveomics/Pipelines/blast.pbs/BlastTab.recover_job.pl
@@ -0,0 +1,72 @@
+ #!/usr/bin/perl
+
+ use warnings;
+ use strict;
+ use File::Copy;
+
+ my($fasta, $blast) = @ARGV;
+
+ ($fasta and $blast) or die "
+ .USAGE:
+    $0 query.fa blast.txt
+
+    query.fa    Query sequences in FastA format.
+    blast.txt   Incomplete BLAST output in tabular format.
+
+ ";
+
+ print "Fixing $blast:\n";
+ my $blast_res;
+ for(my $i=0; 1; $i++){
+   $blast_res = "$blast-$i";
+   last unless -e $blast_res;
+ }
+ open BLAST, "<", $blast or die "Cannot read the file: $blast: $!\n";
+ open TMP, ">", "$blast-tmp" or die "Cannot create the file: $blast-tmp: $!\n";
+ my $last="";
+ my $last_id="";
+ my $before = "";
+ while(my $ln=<BLAST>){
+   chomp $ln;
+   last unless $ln =~ m/(.+?)\t/;
+   my $id = $1;
+   if($id eq $last_id){
+     $last.= $ln."\n";
+   }else{
+     print TMP $last if $last;
+     $before = $last_id;
+     $last = $ln."\n";
+     $last_id = $id;
+   }
+ }
+ close BLAST;
+ close TMP;
+
+ move "$blast-tmp", $blast_res or die "Cannot move file $blast-tmp into $blast_res: $!\n";
+ unlink $blast or die "Cannot delete file: $blast: $!\n";
+
+ unless($before eq ""){
+   print "[$before] ";
+   $before = ">$before";
+
+   open FASTA, "<", $fasta or die "Cannot read file: $fasta: $!\n";
+   open TMP, ">", "$fasta-tmp" or die "Cannot create file: $fasta-tmp: $!\n";
+   my $print = 0;
+   my $at = 0;
+   my $i = 0;
+   while(my $ln=<FASTA>){
+     $i++;
+     $print = 1 if $at and $ln =~ /^>/;
+     print TMP $ln if $print;
+     $ln =~ s/\s+.*//;
+     chomp $ln;
+     $at = $i if $ln eq $before;
+   }
+   close TMP;
+   close FASTA;
+   printf 'recovered at %.2f%% (%d/%d).'."\n", 100*$at/$i, $at, $i if $i;
+
+   move $fasta, "$fasta.old" or die "Cannot move file $fasta into $fasta.old: $!\n";
+   move "$fasta-tmp", $fasta or die "Cannot move file $fasta-tmp into $fasta: $!\n";
+ }
+
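The Perl script above recovers an interrupted run by keeping only the hits of queries that finished (the last query ID in the file may have been cut off mid-write), then trimming the query FastA so only unfinished queries are re-run. The first half of that idea can be sketched with awk on toy data (illustrative only; not a replacement for `BlastTab.recover_job.pl`, which also handles the FastA side):

```shell
# Drop all rows belonging to the last query ID of a sorted tabular
# BLAST file, since that query may be incomplete.
tmp=$(mktemp -d)
printf 'q1\th1\nq1\th2\nq2\th1\nq3\th1\n' > "$tmp/part.blast"
last=$(tail -n 1 "$tmp/part.blast" | cut -f 1)          # q3: possibly truncated
awk -F '\t' -v last="$last" '$1 != last' "$tmp/part.blast" > "$tmp/recovered.blast"
```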
data/utils/enveomics/Pipelines/blast.pbs/CONFIG.mock.bash
@@ -0,0 +1,98 @@
+ #!/bin/bash
+
+ ##################### VARIABLES
+ # Queue and resources.
+ QUEUE="iw-shared-6" ;
+ MAX_JOBS=500 ; # Maximum number of concurrent jobs. Never exceed 1990.
+ PPN=2 ;
+ RAM="9gb" ;
+
+ # Paths
+ SCRATCH_DIR="$HOME/scratch/pipelines/blast" ; # Where the outputs and temporary files will be created
+ INPUT="$HOME/data/my-large-file.fasta" ; # Input query file
+ DB="$HOME/data/db/nr" ; # Input database
+ PROGRAM="blastp" ;
+
+ # Pipeline
+ MAX_TRIALS=5 ; # Maximum number of automated attempts to re-start a job
+
+ ##################### FUNCTIONS
+ ## All the functions below can be edited to suit your particular job.
+ ## No function can be empty, but you can use a "dummy" function (like true).
+ ## All functions have access to any of the variables defined above.
+ ##
+ ## The functions are executed in the following order (from left to right):
+ ##
+ ##        / -----> BEFORE_BLAST --> RUN_BLAST --> AFTER_BLAST ---\
+ ##       /             ···             ···            ···         \
+ ## BEGIN --#--------> BEFORE_BLAST --> RUN_BLAST --> AFTER_BLAST -----#---> END
+ ##       \             ···             ···            ···         /
+ ##        \ -----> BEFORE_BLAST --> RUN_BLAST --> AFTER_BLAST ---/
+ ##
+
+ # Function to execute ONLY ONCE at the beginning
+ function BEGIN {
+   ### Format the database (assuming proteins, check commands):
+   # module load ncbi_blast/2.2.25 || exit 1 ;
+   # makeblastdb -in $HOME/data/some-database.faa -title $DB -dbtype prot || exit 1 ;
+   # module unload ncbi_blast/2.2.25 || exit 1 ;
+   ### Don't do anything:
+   true ;
+ }
+
+ # Function to execute BEFORE running the BLAST, for each sub-task.
+ function BEFORE_BLAST {
+   local IN=$1   # Query file
+   local OUT=$2  # Blast file (to be created)
+   ### Don't do anything:
+   true ;
+ }
+
+ # Function that executes BLAST, for each sub-task
+ function RUN_BLAST {
+   local IN=$1   # Query file
+   local OUT=$2  # Blast file (to be created)
+   ### Run BLAST+ with 13th and 14th columns (query length and subject length):
+   module load ncbi_blast/2.2.28_binary || exit 1 ;
+   $PROGRAM -query $IN -db $DB -out $OUT -num_threads $PPN \
+     -outfmt "6 qseqid sseqid pident length mismatch gapopen qstart qend sstart send evalue bitscore qlen slen" \
+     || exit 1 ;
+   module unload ncbi_blast/2.2.28_binary || exit 1 ;
+   ### Run BLAT (nucleotides)
+   # module load blat/rhel6 || exit 1 ;
+   # blat $DB $IN -out=blast8 $OUT || exit 1 ;
+   # module unload blat/rhel6 || exit 1 ;
+   ### Run BLAT (proteins)
+   # module load blat/rhel6 || exit 1 ;
+   # blat $DB $IN -out=blast8 -prot $OUT || exit 1 ;
+   # module unload blat/rhel6 || exit 1 ;
+ }
+
+ # Function to execute AFTER running the BLAST, for each sub-task
+ function AFTER_BLAST {
+   local IN=$1   # Query files
+   local OUT=$2  # Blast files
+   ### Filter by best-match:
+   # sort $OUT | perl $PDIR/../../Scripts/BlastTab.best_hit_sorted.pl > $OUT.bm
+   ### Filter by Bit-score 60:
+   # awk '$12>=60' $OUT > $OUT.bs60
+   ### Filter by corrected identity 95 (only if it has the additional 13th column):
+   # awk '$3*$4/$13 >= 95' $OUT > $OUT.ci95
+   ### Don't do anything:
+   true ;
+ }
+
+ # Function to execute ONLY ONCE at the end, to concatenate the results
+ function END {
+   local PREFIX=$1  # Prefix of all Blast files
+   local OUT=$2     # Single Blast output (to be created).
+   ### Simply concatenate files:
+   # cat $PREFIX.*.blast > $OUT
+   ### Concatenate only the filtered files (if filtering in AFTER_BLAST):
+   # cat $PREFIX.*.blast.bs60 > $OUT
+   ### Sort the BLAST by query (might require considerable RAM):
+   # sort -k 1 $PREFIX.*.blast > $OUT
+   ### Don't do anything:
+   true ;
+ }
+
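The commented filters in `AFTER_BLAST` can be tried on toy data. In the 14-column `-outfmt` used by `RUN_BLAST`, `$3` is percent identity, `$4` alignment length, `$12` bit score, and `$13` query length; the rows below are made up for illustration:

```shell
# Apply the AFTER_BLAST example filters to two fabricated tabular rows:
# q1 passes both filters, q2 fails both.
tmp=$(mktemp -d)
{
  printf 'q1\ts1\t99.0\t100\t1\t0\t1\t100\t1\t100\t1e-50\t180\t100\t100\n'
  printf 'q2\ts1\t90.0\t50\t5\t0\t1\t50\t1\t50\t1e-10\t55\t100\t100\n'
} > "$tmp/all.blast"
awk '$12 >= 60' "$tmp/all.blast" > "$tmp/bs60"             # bit-score filter
awk '$3 * $4 / $13 >= 95' "$tmp/all.blast" > "$tmp/ci95"   # corrected identity
```

Corrected identity down-weights short alignments: q2's 90% identity over half the query scores 90·50/100 = 45, well under the 95 cutoff.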
data/utils/enveomics/Pipelines/blast.pbs/FastA.split.pl
@@ -0,0 +1 @@
+ ../../Scripts/FastA.split.pl
@@ -0,0 +1,127 @@
+ @author: Luis Miguel Rodriguez-R <lmrodriguezr at gmail dot com>
+
+ @update: Feb-20-2014
+
+ @license: artistic 2.0
+
+ @status: auto
+
+ @pbs: yes
+
+ # IMPORTANT
+
+ This pipeline was developed for the [PACE cluster](http://pace.gatech.edu/). You
+ are free to use it on other platforms with adequate adjustments.
+
+ # PURPOSE
+
+ Simplifies submitting and tracking large BLAST jobs in a cluster.
+
+ # HELP
+
+ 1. File preparation:
+
+ 1.1. Obtain the enveomics package in the cluster. You can use: `git clone https://github.com/lmrodriguezr/enveomics.git`
+
+ 1.2. Prepare the query sequences and the database.
+
+ 1.3. Copy the file `CONFIG.mock.bash` to `CONFIG.<name>.bash`, where `<name>` is a
+ short name for your project (avoid characters other than alphanumeric).
+
+ 1.4. Change the variables in `CONFIG.<name>.bash`. The **Queue and resources** and the
+ **Pipeline** variables are fairly standard and can usually be kept unchanged. The **Paths**
+ variables indicate where your input files are and where the output files are to
+ be created, so check them carefully. Finally, the **FUNCTIONS** define the core
+ functionality of the pipeline and should also be reviewed. By default, the
+ pipeline simply runs BLAST+ with default parameters and tabular output with two
+ extra columns (qlen and slen). However, additional functionality can easily be
+ incorporated via these functions, such as BLAST filtering, concatenation, sorting,
+ or even execution of other programs instead of BLAST, such as BLAT. Note that
+ the output MUST be BLAST-like tabular, because this is the only format supported
+ to check completeness and recover incomplete runs.
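The preparation steps above can be sketched as follows; `myproj` is a made-up project name, and the `git clone` is commented out so that the sketch runs offline against a stand-in directory:

```shell
# 1.1 (commented out here): git clone https://github.com/lmrodriguezr/enveomics.git
mkdir -p enveomics-demo/blast.pbs && cd enveomics-demo/blast.pbs     # stand-in for the clone
printf '# Queue and resources / Paths / FUNCTIONS ...\n' > CONFIG.mock.bash  # stand-in config
# 1.3: copy the mock config under a short, alphanumeric project name:
cp CONFIG.mock.bash CONFIG.myproj.bash
# 1.4: now edit CONFIG.myproj.bash (the Paths and FUNCTIONS sections in particular).
ls CONFIG.myproj.bash
```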
+
+ 2. Pipeline execution:
+
+ 2.1. To initialize a run, execute: `./RUNME.bash <name> run`.
+
+ 2.2. To check the status of a job, execute: `./RUNME.bash <name> check`.
+
+ 2.3. To pause a run, execute: `./RUNME.bash <name> pause` (see 2.1 to resume).
+
+ 2.4. To check if your CONFIG defines all required parameters, execute: `./RUNME.bash <name> dry`.
+
+ 2.5. To review all the e/o files in the run, execute: `./RUNME.bash <name> eo`.
+
+ 3. Finalizing:
+
+ 3.1. `./RUNME.bash <name> check` will inform you if a project finished. If it finished successfully,
+ you can review your (split) results in `$SCRATCH/results`. If you concatenated the results in the
+ `END` function, you should have a file with all the results in `$SCRATCH/<name>.blast`.
+
+ 3.2. Checking the e/o files at the end is usually a good idea (`./RUNME.bash <name> eo`). However,
+ bear in mind that this pipeline can overcome several errors and is robust to most failures, so
+ don't be alarmed at the first sight of errors.
+
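One quick sanity check on a concatenated result (the file name below is hypothetical) is to count the distinct queries that got at least one hit:

```shell
# Three mock tabular hits covering two distinct queries:
printf 'q1\ts1\t98.0\nq1\ts2\t90.0\nq2\ts1\t99.0\n' > myproj.blast
cut -f1 myproj.blast | sort -u | wc -l   # number of distinct queries with hits
```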
+ # Comments
+
+ * Some scripts contained in this package are actually symlinks to files in the
+ _Scripts_ folder. Check the existence of these files when copied to
+ the cluster.
+
+ # Troubleshooting
+
+ 1. Do I really have to change directory (`cd`) to the pipeline's folder every time I want to execute
+ something?
+
+ No, not really. For simplicity, this file tells you to execute `./RUNME.bash`; however, you don't
+ have to be there, and you can execute it from any location. For example, if you saved enveomics in
+ your home directory, you can simply execute `~/enveomics/blast.pbs/RUNME.bash` from any location
+ in the head node.
+
+ 2. When I check a project, a few sub-jobs stay Active for much longer than the others. How do I know if those
+ sub-jobs are really active?
+
+ Let's review an example of a problematic run. When you run `./RUNME.bash <name> check`, you see the
+ following in the "Active jobs" section:
+ ````
+ Idle: 155829.shared-sched.pace.gatech.edu: 02: 00: Mon Mar 17 14:10:28 EDT 2014
+ Sub-jobs:500 Active:4 ( 0.8% ) Eligible:0 ( 0.0% ) Blocked:0 ( 0.0% ) Completed:496 ( 99.2% )
+ Idle: 155830.shared-sched.pace.gatech.edu: 02: 00: Mon Mar 17 14:10:28 EDT 2014
+
+ Running jobs: 0.
+ Idle jobs: 2.
+ ````
+ That means the job "155829.shared-sched.pace.gatech.edu" has four Active sub-jobs, while all the others are Completed. This is
+ a sign of something problematic. You can see the complete status of each array using
+ `checkjob -v <JOB_NAME>`. In the example above, you would run `checkjob -v 155829`. In the output
+ of checkjob, most jobs should report "Completed". In this example, four sub-jobs are not
+ complete:
+ ````
+ ...
+ 387 : 155829[387] : Completed
+ 388 : 155829[388] : Running
+ 389 : 155829[389] : Running
+ 390 : 155829[390] : Running
+ 391 : 155829[391] : Running
+ 392 : 155829[392] : Completed
+ ...
+ ````
+ You can then check these sub-jobs in more detail. For example, running `checkjob -v 155829[388]`
+ shows that the job is running on the machine `iw-k30-12.pace.gatech.edu` (Task Distribution), so you can try
+ to log in to that machine and check if the job is actually running, using `top -u $(whoami)`. However, if
+ `ssh iw-k30-12` returns a "Connection closed" error, the machine hung up. At this point,
+ you might want to try one of the following solutions:
+
+ 2.1. Pause the project using `./RUNME.bash <name> pause`, wait a few minutes, and resume using
+ `./RUNME.bash <name> run`. If you tried this a couple of times and you still have sub-jobs hanging, try:
+
+ 2.2. Check if your sub-jobs finished. Sometimes sub-jobs die too soon to return a success code, but they actually
+ finished. Just run the following command: `ls <SCRATCH>/<name>/success/02.* | wc -l`, where `<SCRATCH>` is the
+ value you set for the `SCRATCH` variable in the CONFIG file, and `<name>` is the name of your project. If the
+ output of that command is a number, and that number is exactly six times the number of jobs (`MAX_JOBS` in the
+ CONFIG file, typically 500), then your step 2 actually finished. For example, with 500 jobs and an output
+ of 3000, the job finished successfully, but the pipeline didn't notice. You can manually tell the pipeline
+ to go on running: `touch <SCRATCH>/<name>/success/02`, then pause and resume the project (see 2.1 above). If
+ the output is not the expected number (in this example 3000, which is 6*500), DON'T RUN `touch`; just try
+ solution 2.1 above once again.
+
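The check in 2.2 can be scripted. The sketch below uses made-up values (`MAX_JOBS=5` for brevity instead of the typical 500) and fakes the six per-job success marks so it runs outside the cluster:

```shell
SCRATCH=demo_scratch; NAME=myproj; MAX_JOBS=5   # made-up values for illustration
mkdir -p "$SCRATCH/$NAME/success"
# Pretend every sub-job left its six success marks for step 2:
for i in $(seq 1 $((MAX_JOBS * 6))); do : > "$SCRATCH/$NAME/success/02.$i"; done
n=$(ls "$SCRATCH/$NAME/success"/02.* | wc -l)
if [ "$n" -eq $((MAX_JOBS * 6)) ]; then
  touch "$SCRATCH/$NAME/success/02"   # manually tell the pipeline step 2 finished
fi
```

Only run the `touch` when the count matches exactly, as the text above warns.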