When I run LMAT2mult-fastsummaryTable.pl, with or without 'megan' in the output file name, the script reports the number of threads as the total number of sequences for each sample. For example, I ran each of 24 samples through run_rl.sh with 40 threads, and the resulting summary table in MEGAN format has the following headers:

@Creator LMAT
@CreationDate Tue Jul 24 21:31:18 UTC 2018
@ContentType Summary4
@Names KLF-01 KLF-02 KLF-03 KLF-04 KLF-05 KLF-06 KLF-07 KLF-08 KLF-09 KLF-10 KLF-11 KLF-12 KLF-13 KLF-14 KLF-15 KLF-16 KLF-17 KLF-18 KLF-19 KLF-20 KLF-21 KLF-22 KLF-23 KLF-24
@Sizes 40 40 40 40 40 40 40 40 40 40 40 40 40 40 40 40 40 40 40 40 40 40 40 40
@TotalReads 960
@Algorithm Taxonomy lmat

Every @Sizes value is 40, and the reported @TotalReads of 960 is exactly 24 samples × 40 threads, so the table seems to be counting threads rather than reads. The log file for the run_rl.sh command is in the same folder as each .fastsummary file, so I'm not sure why I'm not getting the actual read counts for each sample (which are far larger than 40).
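To double-check the real per-sample totals, here is a minimal sketch I would use to recompute them straight from the .fastsummary files. It assumes the files are tab-delimited with the per-taxon read count in the second column (READ_COUNT_COL is an assumption; adjust it if your layout differs), and the glob pattern is just an example:

#!/usr/bin/env python3
"""Recompute per-sample read totals directly from LMAT .fastsummary files.

Assumption (adjust if needed): each .fastsummary line is tab-delimited,
with the per-taxon read count in the second column.
"""
import glob
import sys

READ_COUNT_COL = 1  # assumed 0-based column index of the per-taxon read count

def total_reads(path):
    total = 0
    with open(path) as fh:
        for line in fh:
            fields = line.rstrip("\n").split("\t")
            if len(fields) <= READ_COUNT_COL:
                continue  # skip blank or malformed lines
            try:
                total += int(float(fields[READ_COUNT_COL]))
            except ValueError:
                continue  # skip non-numeric (e.g. header) lines
    return total

if __name__ == "__main__":
    pattern = sys.argv[1] if len(sys.argv) > 1 else "*.fastsummary"
    for path in sorted(glob.glob(pattern)):
        print(f"{path}\t{total_reads(path)}")

Running this over the 24 .fastsummary files prints the per-sample totals, which makes it easy to confirm that the 40s written into @Sizes track the thread count rather than the data.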