File: //usr/lib/python3/dist-packages/s3transfer/__pycache__/processpool.cpython-310.pyc
Speeds up S3 throughput by using processes

Getting Started
===============
The :class:`ProcessPoolDownloader` can be used to download a single file by
calling :meth:`ProcessPoolDownloader.download_file`:

.. code:: python

    from s3transfer.processpool import ProcessPoolDownloader

    with ProcessPoolDownloader() as downloader:
        downloader.download_file('mybucket', 'mykey', 'myfile')

This snippet downloads the S3 object located in the bucket ``mybucket`` at the
key ``mykey`` to the local file ``myfile``. Any errors encountered during the
transfer are not propagated. To determine if a transfer succeeded or
failed, use the `Futures`_ interface.
The :class:`ProcessPoolDownloader` can be used to download multiple files as
well:

.. code:: python

    from s3transfer.processpool import ProcessPoolDownloader

    with ProcessPoolDownloader() as downloader:
        downloader.download_file('mybucket', 'mykey', 'myfile')
        downloader.download_file('mybucket', 'myotherkey', 'myotherfile')

When running this snippet, the downloads of ``mykey`` and ``myotherkey``
happen in parallel; the first ``download_file`` call does not block the
second. Exiting the context manager blocks until both downloads are
complete.

Alternatively, the ``ProcessPoolDownloader`` can be instantiated and
explicitly shut down using :meth:`ProcessPoolDownloader.shutdown`:

.. code:: python

    from s3transfer.processpool import ProcessPoolDownloader

    downloader = ProcessPoolDownloader()
    downloader.download_file('mybucket', 'mykey', 'myfile')
    downloader.download_file('mybucket', 'myotherkey', 'myotherfile')
    downloader.shutdown()

For this code snippet, the call to ``shutdown`` blocks until both
downloads are complete.
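When managing the lifecycle manually like this, a ``try``/``finally`` block
(a minimal sketch, not something the API requires) ensures ``shutdown`` still
runs if queueing a download raises:

.. code:: python

    from s3transfer.processpool import ProcessPoolDownloader

    downloader = ProcessPoolDownloader()
    try:
        downloader.download_file('mybucket', 'mykey', 'myfile')
        downloader.download_file('mybucket', 'myotherkey', 'myotherfile')
    finally:
        # Blocks until any queued downloads complete, then stops the pool.
        downloader.shutdown()
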
Additional Parameters
=====================
Additional parameters can be provided to the ``download_file`` method:

* ``extra_args``: A dictionary containing any additional client arguments
  to include in the
  `GetObject <https://botocore.amazonaws.com/v1/documentation/api/latest/reference/services/s3.html#S3.Client.get_object>`_
  API request. For example:

  .. code:: python

      from s3transfer.processpool import ProcessPoolDownloader

      with ProcessPoolDownloader() as downloader:
          downloader.download_file(
              'mybucket', 'mykey', 'myfile',
              extra_args={'VersionId': 'myversion'})

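  Only arguments that are valid for downloads are accepted here; the full set
  is defined by ``ALLOWED_DOWNLOAD_ARGS`` in ``s3transfer.constants`` (for
  example, ``VersionId``, ``RequestPayer``, and the ``SSECustomer*``
  arguments).
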
* ``expected_size``: By default, the downloader makes a HeadObject call to
  determine the size of the object. To opt out of this additional API call,
  provide the size of the object in bytes:

  .. code:: python

      from s3transfer.processpool import ProcessPoolDownloader

      MB = 1024 * 1024

      with ProcessPoolDownloader() as downloader:
          downloader.download_file(
              'mybucket', 'mykey', 'myfile', expected_size=2 * MB)

Futures
=======
When ``download_file`` is called, it immediately returns a
:class:`ProcessPoolTransferFuture`. The future can be used to poll the state
of a particular transfer. To get the result of the download,
call :meth:`ProcessPoolTransferFuture.result`. The method blocks
until the transfer completes, whether it succeeds or fails. For example:

.. code:: python

    from s3transfer.processpool import ProcessPoolDownloader

    with ProcessPoolDownloader() as downloader:
        future = downloader.download_file('mybucket', 'mykey', 'myfile')
        print(future.result())

If the download succeeds, the future returns ``None``:

.. code:: python

    None

If the download fails, the exception that caused the failure is raised. For
example, if ``mykey`` did not exist, the following error would be raised:

.. code:: python

    botocore.exceptions.ClientError: An error occurred (404) when calling the HeadObject operation: Not Found

.. note::

    :meth:`ProcessPoolTransferFuture.result` can only be called while the
    ``ProcessPoolDownloader`` is running (e.g. before calling ``shutdown`` or
    inside the context manager).
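
Because ``result`` re-raises the transfer's exception, callers can tell
success from failure with ordinary exception handling. A minimal sketch (the
``ClientError`` handling shown is illustrative, not part of the API):

.. code:: python

    from botocore.exceptions import ClientError
    from s3transfer.processpool import ProcessPoolDownloader

    with ProcessPoolDownloader() as downloader:
        future = downloader.download_file('mybucket', 'mykey', 'myfile')
        try:
            # Must be called while the downloader is still running.
            future.result()
            print('Download succeeded')
        except ClientError as e:
            print(f'Download failed: {e}')
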
Process Pool Configuration
==========================
By default, the downloader has the following configuration options:

* ``multipart_threshold``: The threshold size for performing ranged downloads
  in bytes. By default, ranged downloads happen for S3 objects that are
  greater than or equal to 8 MB in size.

* ``multipart_chunksize``: The size of each ranged download in bytes. By
  default, the size of each ranged download is 8 MB.

* ``max_request_processes``: The maximum number of processes used to download
  S3 objects. By default, the maximum is 10 processes.

To change the default configuration, use the :class:`ProcessTransferConfig`:

.. code:: python

    from s3transfer.processpool import ProcessPoolDownloader
    from s3transfer.processpool import ProcessTransferConfig

    config = ProcessTransferConfig(
        multipart_threshold=64 * 1024 * 1024,  # 64 MB
        max_request_processes=50
    )
    downloader = ProcessPoolDownloader(config=config)
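
As a worked example under the default configuration: a 100 MB object exceeds
the 8 MB ``multipart_threshold``, so it is downloaded as
``ceil(100 MB / 8 MB) = 13`` ranged GetObject calls of up to 8 MB each,
distributed across at most 10 worker processes.
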
Client Configuration
====================

The process pool downloader creates ``botocore`` clients on your behalf. To
control how the clients are created, pass the keyword arguments that would
otherwise be provided to the :meth:`botocore.Session.create_client` call:

.. code:: python

    from s3transfer.processpool import ProcessPoolDownloader

    downloader = ProcessPoolDownloader(
        client_kwargs={'region_name': 'us-west-2'})

This snippet ensures that all clients created by the ``ProcessPoolDownloader``
use ``us-west-2`` as their region.
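
Any keyword accepted by ``create_client`` can be passed the same way. For
example, a ``botocore`` ``Config`` object can be supplied through its
``config`` keyword (the retry setting shown is illustrative, not a
recommendation):

.. code:: python

    from botocore.config import Config
    from s3transfer.processpool import ProcessPoolDownloader

    downloader = ProcessPoolDownloader(
        client_kwargs={
            'region_name': 'us-west-2',
            'config': Config(retries={'max_attempts': 5}),
        })
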
# --- Reconstructed module source (recoverable from the compiled bytecode):
# imports, module-level constants, the request/job namedtuples, the SIGINT
# helpers, and ProcessTransferConfig, where this section of the dump ends.
import collections
import contextlib
import logging
import multiprocessing
import signal
import threading
from copy import deepcopy

import botocore.session
from botocore.config import Config

from s3transfer.compat import MAXINT, BaseManager
from s3transfer.constants import ALLOWED_DOWNLOAD_ARGS, MB, PROCESS_USER_AGENT
from s3transfer.exceptions import CancelledError, RetriesExceededError
from s3transfer.futures import BaseTransferFuture, BaseTransferMeta
from s3transfer.utils import (
    S3_RETRYABLE_DOWNLOAD_ERRORS,
    CallArgs,
    OSUtils,
    calculate_num_parts,
    calculate_range_parameter,
)

logger = logging.getLogger(__name__)

SHUTDOWN_SIGNAL = 'SHUTDOWN'

# Submitted for each download_file() call; a submitter process turns each
# request into one or more GetObjectJobs.
DownloadFileRequest = collections.namedtuple(
    'DownloadFileRequest',
    ['transfer_id', 'bucket', 'key', 'filename', 'extra_args',
     'expected_size'],
)

# One GetObject call's worth of work handed to a worker process; for ranged
# downloads, each job writes its part at `offset` within `temp_filename`.
GetObjectJob = collections.namedtuple(
    'GetObjectJob',
    ['transfer_id', 'bucket', 'key', 'temp_filename', 'extra_args', 'offset',
     'filename'],
)


@contextlib.contextmanager
def ignore_ctrl_c():
    # Ignore SIGINT for the duration of the block, then restore the
    # previously installed handler.
    original_handler = _add_ignore_handler_for_interrupts()
    yield
    signal.signal(signal.SIGINT, original_handler)


def _add_ignore_handler_for_interrupts():
    return signal.signal(signal.SIGINT, signal.SIG_IGN)


class ProcessTransferConfig:
    def __init__(self,
                 multipart_threshold=8 * MB,
                 multipart_chunksize=8 * MB,
                 max_request_processes=10):
        """Configuration for the ProcessPoolDownloader

        :param multipart_threshold: The threshold for which ranged downloads
            occur.

        :param multipart_chunksize: The chunk size of each ranged download.

        :param max_request_processes: The maximum number of processes that
            will be making S3 API transfer-related requests at a time.
        """
        self.multipart_threshold = multipart_threshold
        self.multipart_chunksize = multipart_chunksize
        self.max_request_processes = max_request_processes