paleomix 1.3.7-2 source package in Ubuntu

Changelog

paleomix (1.3.7-2) unstable; urgency=medium

  [ Andreas Tille ]
  * d/watch: Proper upstream tarball name

  [ Étienne Mollier ]
  * d/t/control: remove unneeded dependency on python3-nose. (Closes: #1018433)
  * d/lintian-overrides: fix mismatched override on empty files.

 -- Étienne Mollier <email address hidden>  Sat, 10 Sep 2022 22:54:13 +0200

Upload details

Uploaded by: Debian Med
Uploaded to: Sid
Original maintainer: Debian Med
Architectures: any
Section: misc
Urgency: Medium

Downloads

File Size SHA-256 Checksum
paleomix_1.3.7-2.dsc 2.4 KiB 344dd0a6d90bd3357bbf9d842cf53b8d504247395bf6bde9a5b4af3608826b84
paleomix_1.3.7.orig.tar.gz 1.1 MiB 63c03e8b0d9938714836862c3c98763aefd18b4c13730d713713a0045e18c088
paleomix_1.3.7-2.debian.tar.xz 10.2 KiB 3cd5f44f890d86cb54b9a1d3e97d60f185a33aab589cf62ea65ac05472dba7db

No changes file available.
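
The downloaded files can be checked against the SHA-256 values listed above. A minimal Python sketch, assuming the three files have been fetched into the current working directory:

    # Verify the downloads against the SHA-256 checksums from the table above.
    # The filenames and digests are copied from the listing; the download
    # location (current directory) is an assumption.
    import hashlib

    EXPECTED = {
        "paleomix_1.3.7-2.dsc":
            "344dd0a6d90bd3357bbf9d842cf53b8d504247395bf6bde9a5b4af3608826b84",
        "paleomix_1.3.7.orig.tar.gz":
            "63c03e8b0d9938714836862c3c98763aefd18b4c13730d713713a0045e18c088",
        "paleomix_1.3.7-2.debian.tar.xz":
            "3cd5f44f890d86cb54b9a1d3e97d60f185a33aab589cf62ea65ac05472dba7db",
    }

    for name, expected in EXPECTED.items():
        with open(name, "rb") as handle:
            digest = hashlib.sha256(handle.read()).hexdigest()
        print(name, "OK" if digest == expected else "MISMATCH")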

Binary packages built by this source

paleomix: pipelines and tools for the processing of ancient and modern HTS data

 The PALEOMIX pipelines are a set of pipelines and tools designed to aid
 the rapid processing of High-Throughput Sequencing (HTS) data: The BAM
 pipeline processes de-multiplexed reads from one or more samples,
 through sequence processing and alignment, to generate BAM alignment
 files useful in downstream analyses; the Phylogenetic pipeline carries
 out genotyping and phylogenetic inference on BAM alignment files, either
 produced using the BAM pipeline or generated elsewhere; and the Zonkey
 pipeline carries out a suite of analyses on low coverage equine
 alignments, in order to detect the presence of F1-hybrids in
 archaeological assemblages. In addition, PALEOMIX aids in metagenomic
 analysis of the extracts.
 .
 The pipelines have been designed with ancient DNA (aDNA) in mind, and
 include several features especially useful for the analyses of ancient
 samples, but can all be used for the processing of modern samples, in
 order to ensure consistent data processing.