S3 Utilities
Utilities for reading spike train data from Amazon S3 buckets. Requires the
s3 extra (pip install spikelab[s3]).
These helpers support:
- Detecting S3 URLs (s3://… and common https://…amazonaws.com/… forms)
- Parsing bucket/key pairs from S3 URLs
- Downloading S3 objects to local temporary files for downstream processing
- Treating local paths and S3 URLs uniformly (ensure_local_file)
This module intentionally has no dependency on the MCP server implementation so it can be reused by the core analysis package and other integrations.
- spikelab.data_loaders.s3_utils.is_s3_url(url)[source]
Return True if url looks like an S3 URL (s3:// or https://…amazonaws.com).
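The documented URL forms can be sketched with a small check like the following (an illustrative reimplementation, not the library's actual code; the name looks_like_s3_url is made up for this sketch):

```python
import re

# Illustrative sketch of the kind of check is_s3_url performs; the real
# implementation may differ. Covers s3:// URLs plus path-style and
# virtual-hosted-style HTTPS URLs, with optional region subdomains.
_S3_HTTPS = re.compile(r"^https://([^/]+\.)?s3([.-][a-z0-9-]+)?\.amazonaws\.com/")

def looks_like_s3_url(url: str) -> bool:
    return url.startswith("s3://") or bool(_S3_HTTPS.match(url))
```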
- spikelab.data_loaders.s3_utils.parse_s3_url(url)[source]
Parse an S3 URL into (bucket, key).
Supported forms include s3://bucket/key, path-style HTTPS (s3.amazonaws.com/bucket/key), and virtual-hosted-style HTTPS (bucket.s3.amazonaws.com/key), with optional region subdomains.
- Parameters:
url (str) – S3 URL to parse.
- Returns:
The (bucket, key) pair extracted from the URL.
- Return type:
tuple[str, str]
- Raises:
ValueError – If the URL format is not recognised or has no object key.
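The split into (bucket, key) for the three supported URL forms can be sketched as follows (an illustrative reimplementation under the documented behaviour, not the library's actual code; parse_bucket_key is a made-up name):

```python
from urllib.parse import urlparse

def parse_bucket_key(url: str) -> tuple[str, str]:
    # Sketch of the documented parsing behaviour; illustrative only.
    parsed = urlparse(url)
    if parsed.scheme == "s3":
        # s3://bucket/key
        bucket, key = parsed.netloc, parsed.path.lstrip("/")
    elif parsed.hostname and parsed.hostname.endswith("amazonaws.com"):
        labels = parsed.hostname.split(".")
        if labels[0] == "s3":
            # Path-style: s3[.region].amazonaws.com/bucket/key
            bucket, _, key = parsed.path.lstrip("/").partition("/")
        else:
            # Virtual-hosted-style: bucket.s3[.region].amazonaws.com/key
            bucket, key = labels[0], parsed.path.lstrip("/")
    else:
        raise ValueError(f"Not a recognised S3 URL: {url}")
    if not bucket or not key:
        raise ValueError(f"S3 URL has no object key: {url}")
    return bucket, key
```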
- spikelab.data_loaders.s3_utils.download_from_s3(url, local_path=None, aws_access_key_id=None, aws_secret_access_key=None, aws_session_token=None, region_name=None)[source]
Download a single S3 object to a local file and return the local path.
- Parameters:
url (str) – S3 URL of the object to download.
local_path (str | None) – Destination file path. If None, a temporary file is created.
aws_access_key_id (str | None) – AWS access key ID.
aws_secret_access_key (str | None) – AWS secret access key.
aws_session_token (str | None) – AWS session token for temporary credentials.
region_name (str | None) – AWS region name.
- Returns:
Path to the downloaded local file.
- Return type:
str
- Raises:
ImportError – If boto3 is not installed.
ValueError – If the URL is not an S3 URL or the bucket/key is not found.
PermissionError – If access to the S3 object is denied.
RuntimeError – If the download fails for another reason.
- spikelab.data_loaders.s3_utils.upload_to_s3(local_path, s3_url, aws_access_key_id=None, aws_secret_access_key=None, aws_session_token=None, region_name=None)[source]
Upload a local file to S3 and return the S3 URL.
- Parameters:
local_path (str) – Path to the local file to upload.
s3_url (str) – Destination S3 URL (s3://bucket/key).
aws_access_key_id (str | None) – AWS access key ID.
aws_secret_access_key (str | None) – AWS secret access key.
aws_session_token (str | None) – AWS session token for temporary credentials.
region_name (str | None) – AWS region name.
- Returns:
The S3 URL the file was uploaded to.
- Return type:
str
- Raises:
ImportError – If boto3 is not installed.
FileNotFoundError – If the local file does not exist.
ValueError – If the URL is not an S3 URL or the bucket is not found.
PermissionError – If access to the S3 bucket is denied.
RuntimeError – If the upload fails for another reason.
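The FileNotFoundError and ValueError cases listed above imply up-front argument validation, which could be sketched like this (illustrative only; check_upload_args is a made-up helper, not upload_to_s3's actual code):

```python
import os

def check_upload_args(local_path: str, s3_url: str) -> None:
    # Sketch of the validation the documented exceptions imply.
    if not os.path.isfile(local_path):
        raise FileNotFoundError(f"No such file: {local_path}")
    if not s3_url.startswith("s3://"):
        raise ValueError(f"Destination must be an s3:// URL: {s3_url}")
```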
- spikelab.data_loaders.s3_utils.ensure_local_file(file_path_or_url, aws_access_key_id=None, aws_secret_access_key=None, aws_session_token=None, region_name=None)[source]
Return (local_path, is_temporary) for a local path or S3 URL.
If the input is an S3 URL, the object is downloaded to a temporary file. If it is a local path, it is returned as-is.
- Parameters:
file_path_or_url (str) – Local file path or S3 URL.
aws_access_key_id (str | None) – AWS access key ID.
aws_secret_access_key (str | None) – AWS secret access key.
aws_session_token (str | None) – AWS session token for temporary credentials.
region_name (str | None) – AWS region name.
- Returns:
A (local_path, is_temporary) pair. is_temporary is True when the file was downloaded from S3 and the caller should delete it after use.
- Return type:
tuple[str, bool]
- Raises:
FileNotFoundError – If a local path is given and the file does not exist.
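The dispatch described above, and the cleanup contract it places on the caller, can be sketched as follows (illustrative only; ensure_local is a made-up name, and the download parameter stands in for the S3 download step):

```python
import os

def ensure_local(path_or_url: str, download) -> tuple[str, bool]:
    # Sketch of ensure_local_file's dispatch; `download` is a hypothetical
    # stand-in for the S3 download step, for illustration only.
    if path_or_url.startswith("s3://"):
        # Temporary file: the caller should delete it after use.
        return download(path_or_url), True
    if not os.path.exists(path_or_url):
        raise FileNotFoundError(path_or_url)
    return path_or_url, False
```

Callers typically wrap the returned path in try/finally and remove it when is_temporary is True.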