dask 2024.1.1+dfsg-1 source package in Ubuntu

Changelog

dask (2024.1.1+dfsg-1) unstable; urgency=medium

  * Team upload

  [ Alexandre Detiste ]
  * New upstream release
  * Suggests: python3-boto3 instead of python3-boto
  * Refresh use-youtube-nocookie.patch

  [ Jeremy Bícha ]
  * Refresh patch
  * Drop patch applied in new release

 -- Jeremy Bícha <email address hidden>  Wed, 01 May 2024 08:32:03 -0400

Upload details

Uploaded by:
Debian Python Team
Uploaded to:
Sid
Original maintainer:
Debian Python Team
Architectures:
all
Section:
misc
Urgency:
Medium

Publishing

See full publishing history

Builds

Oracular: [FULLYBUILT] amd64

Downloads

File Size SHA-256 Checksum
dask_2024.1.1+dfsg-1.dsc 3.1 KiB 4ad3dc23faecdf75d5bd783bf3f0b198a089e51b451a9e5433e73de5ea391f72
dask_2024.1.1+dfsg.orig.tar.xz 8.2 MiB 2e36f01c1ea90356ff760475c5570b516f7345c4af88558cd7c254fccab17517
dask_2024.1.1+dfsg-1.debian.tar.xz 44.6 KiB f568ad3fadc7c9cbf178d0ef86cb8751f5f99f4b53759dc146e6a43f3f0822eb

No changes file available.

Binary packages built by this source

python-dask-doc: Minimal task scheduling abstraction documentation

 Dask is a flexible parallel computing library for analytics,
 containing two components.
 .
 1. Dynamic task scheduling optimized for computation. This is similar
 to Airflow, Luigi, Celery, or Make, but optimized for interactive
 computational workloads.
 2. "Big Data" collections like parallel arrays, dataframes, and lists
 that extend common interfaces like NumPy, Pandas, or Python iterators
 to larger-than-memory or distributed environments. These parallel
 collections run on top of the dynamic task schedulers.
 .
 This contains the documentation.
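
The "dynamic task scheduling" component described above can be exercised directly through dask.delayed. A minimal sketch, assuming python3-dask is installed; the functions inc and add are made-up examples, not part of the package:

    import dask

    @dask.delayed
    def inc(x):
        return x + 1

    @dask.delayed
    def add(x, y):
        return x + y

    # Calling the delayed functions only builds a small task graph;
    # no work runs yet.
    a = inc(1)
    b = inc(2)
    total = add(a, b)

    # compute() hands the graph to the scheduler and returns 5.
    print(total.compute())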

python3-dask: Minimal task scheduling abstraction for Python 3

 Dask is a flexible parallel computing library for analytics,
 containing two components.
 .
 1. Dynamic task scheduling optimized for computation. This is similar
 to Airflow, Luigi, Celery, or Make, but optimized for interactive
 computational workloads.
 2. "Big Data" collections like parallel arrays, dataframes, and lists
 that extend common interfaces like NumPy, Pandas, or Python iterators
 to larger-than-memory or distributed environments. These parallel
 collections run on top of the dynamic task schedulers.
 .
 This contains the Python 3 version.
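
The "Big Data" collections described above mirror familiar NumPy/Pandas interfaces. A rough sketch of the array collection, assuming python3-dask and python3-numpy are installed; the shape and chunk sizes are arbitrary illustrative values:

    import dask.array as da

    # A 20000 x 20000 array assembled from 1000 x 1000 NumPy chunks;
    # chunks are only materialised as the scheduler needs them.
    x = da.random.random((20000, 20000), chunks=(1000, 1000))

    # Operations build a task graph over the chunks; compute() executes
    # it, by default on the threaded scheduler.
    column_means = x.mean(axis=0).compute()
    print(column_means.shape)  # (20000,)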