diff --git a/README.md b/README.md index 100fdff007..4b8df80440 100644 --- a/README.md +++ b/README.md @@ -151,6 +151,7 @@ Build, deploy, and manage cloud infrastructure with Infrastructure as Code best | Server Name | Description | Install | |-------------|-------------|---------| +| [Amazon SageMaker HyperPod MCP Server](src/sagemaker-hyperpod-mcp-server) | SageMaker HyperPod cluster management and application deployment | [![Install](https://img.shields.io/badge/Install-Cursor-blue?style=flat-square&logo=cursor)](https://cursor.com/en/install-mcp?name=awslabs.sagemaker-hyperpod-mcp-server&config=eyJhdXRvQXBwcm92ZSI6W10sImRpc2FibGVkIjpmYWxzZSwiY29tbWFuZCI6InV2eCBhd3NsYWJzLnNhZ2VtYWtlci1oeXBlcnBvZC1tY3Atc2VydmVyQGxhdGVzdCAtLWFsbG93LXdyaXRlIC0tYWxsb3ctc2Vuc2l0aXZlLWRhdGEtYWNjZXNzIiwiZW52Ijp7IkZBU1RNQ1BfTE9HX0xFVkVMIjoiRVJST1IifSwidHJhbnNwb3J0VHlwZSI6InN0ZGlvIn0%3D)
[![Install on VS Code](https://img.shields.io/badge/Install-VS_Code-FF9900?style=flat-square&logo=visualstudiocode&logoColor=white)](https://insiders.vscode.dev/redirect/mcp/install?name=SageMaker%20HyperPod%20MCP%20Server&config=%7B%22autoApprove%22%3A%5B%5D%2C%22disabled%22%3Afalse%2C%22command%22%3A%22uvx%22%2C%22args%22%3A%5B%22awslabs.sagemaker-hyperpod-mcp-server%40latest%22%2C%22--allow-write%22%2C%22--allow-sensitive-data-access%22%5D%2C%22env%22%3A%7B%22FASTMCP_LOG_LEVEL%22%3A%22ERROR%22%7D%2C%22transportType%22%3A%22stdio%22%7D) | | [Amazon EKS MCP Server](src/eks-mcp-server) | Kubernetes cluster management and application deployment | [![Install](https://img.shields.io/badge/Install-Cursor-blue?style=flat-square&logo=cursor)](https://cursor.com/en/install-mcp?name=awslabs.eks-mcp-server&config=eyJhdXRvQXBwcm92ZSI6W10sImRpc2FibGVkIjpmYWxzZSwiY29tbWFuZCI6InV2eCBhd3NsYWJzLmVrcy1tY3Atc2VydmVyQGxhdGVzdCAtLWFsbG93LXdyaXRlIC0tYWxsb3ctc2Vuc2l0aXZlLWRhdGEtYWNjZXNzIiwiZW52Ijp7IkZBU1RNQ1BfTE9HX0xFVkVMIjoiRVJST1IifSwidHJhbnNwb3J0VHlwZSI6InN0ZGlvIn0%3D)
[![Install on VS Code](https://img.shields.io/badge/Install-VS_Code-FF9900?style=flat-square&logo=visualstudiocode&logoColor=white)](https://insiders.vscode.dev/redirect/mcp/install?name=EKS%20MCP%20Server&config=%7B%22autoApprove%22%3A%5B%5D%2C%22disabled%22%3Afalse%2C%22command%22%3A%22uvx%22%2C%22args%22%3A%5B%22awslabs.eks-mcp-server%40latest%22%2C%22--allow-write%22%2C%22--allow-sensitive-data-access%22%5D%2C%22env%22%3A%7B%22FASTMCP_LOG_LEVEL%22%3A%22ERROR%22%7D%2C%22transportType%22%3A%22stdio%22%7D) | | [Amazon ECS MCP Server](src/ecs-mcp-server) | Container orchestration and ECS application deployment | [![Install](https://img.shields.io/badge/Install-Cursor-blue?style=flat-square&logo=cursor)](https://cursor.com/en/install-mcp?name=awslabs.ecs-mcp-server&config=eyJjb21tYW5kIjoidXZ4IC0tZnJvbSBhd3NsYWJzLWVjcy1tY3Atc2VydmVyIGVjcy1tY3Atc2VydmVyIiwiZW52Ijp7IkFXU19QUk9GSUxFIjoieW91ci1hd3MtcHJvZmlsZSIsIkFXU19SRUdJT04iOiJ5b3VyLWF3cy1yZWdpb24iLCJGQVNUTUNQX0xPR19MRVZFTCI6IkVSUk9SIiwiRkFTVE1DUF9MT0dfRklMRSI6Ii9wYXRoL3RvL2Vjcy1tY3Atc2VydmVyLmxvZyIsIkFMTE9XX1dSSVRFIjoiZmFsc2UiLCJBTExPV19TRU5TSVRJVkVfREFUQSI6ImZhbHNlIn19)
[![Install on VS Code](https://img.shields.io/badge/Install-VS_Code-FF9900?style=flat-square&logo=visualstudiocode&logoColor=white)](https://insiders.vscode.dev/redirect/mcp/install?name=ECS%20MCP%20Server&config=%7B%22command%22%3A%22uvx%22%2C%22args%22%3A%5B%22--from%22%2C%22awslabs-ecs-mcp-server%22%2C%22ecs-mcp-server%22%5D%2C%22env%22%3A%7B%22AWS_PROFILE%22%3A%22your-aws-profile%22%2C%22AWS_REGION%22%3A%22your-aws-region%22%2C%22FASTMCP_LOG_LEVEL%22%3A%22ERROR%22%2C%22FASTMCP_LOG_FILE%22%3A%22%2Fpath%2Fto%2Fecs-mcp-server.log%22%2C%22ALLOW_WRITE%22%3A%22false%22%2C%22ALLOW_SENSITIVE_DATA%22%3A%22false%22%7D%7D) | | [Finch MCP Server](src/finch-mcp-server) | Local container building with ECR integration | [![Install](https://img.shields.io/badge/Install-Cursor-blue?style=flat-square&logo=cursor)](https://cursor.com/en/install-mcp?name=awslabs.finch-mcp-server&config=eyJjb21tYW5kIjoidXZ4IGF3c2xhYnMuZmluY2gtbWNwLXNlcnZlckBsYXRlc3QiLCJlbnYiOnsiQVdTX1BST0ZJTEUiOiJkZWZhdWx0IiwiQVdTX1JFR0lPTiI6InVzLXdlc3QtMiIsIkZBU1RNQ1BfTE9HX0xFVkVMIjoiSU5GTyJ9LCJ0cmFuc3BvcnRUeXBlIjoic3RkaW8iLCJkaXNhYmxlZCI6ZmFsc2UsImF1dG9BcHByb3ZlIjpbXX0%3D)
[![Install on VS Code](https://img.shields.io/badge/Install-VS_Code-FF9900?style=flat-square&logo=visualstudiocode&logoColor=white)](https://insiders.vscode.dev/redirect/mcp/install?name=Finch%20MCP%20Server&config=%7B%22command%22%3A%22uvx%22%2C%22args%22%3A%5B%22awslabs.finch-mcp-server%40latest%22%5D%2C%22env%22%3A%7B%22AWS_PROFILE%22%3A%22default%22%2C%22AWS_REGION%22%3A%22us-west-2%22%2C%22FASTMCP_LOG_LEVEL%22%3A%22INFO%22%7D%2C%22transportType%22%3A%22stdio%22%2C%22disabled%22%3Afalse%2C%22autoApprove%22%3A%5B%5D%7D) | @@ -310,6 +311,7 @@ Interact with AWS HealthAI services. | Server Name | Description | Install | |-------------|-------------|---------| +| [Amazon SageMaker HyperPod MCP Server](src/sagemaker-hyperpod-mcp-server) | SageMaker HyperPod cluster management and application deployment | [![Install](https://img.shields.io/badge/Install-Cursor-blue?style=flat-square&logo=cursor)](https://cursor.com/en/install-mcp?name=awslabs.sagemaker-hyperpod-mcp-server&config=eyJhdXRvQXBwcm92ZSI6W10sImRpc2FibGVkIjpmYWxzZSwiY29tbWFuZCI6InV2eCBhd3NsYWJzLnNhZ2VtYWtlci1oeXBlcnBvZC1tY3Atc2VydmVyQGxhdGVzdCAtLWFsbG93LXdyaXRlIC0tYWxsb3ctc2Vuc2l0aXZlLWRhdGEtYWNjZXNzIiwiZW52Ijp7IkZBU1RNQ1BfTE9HX0xFVkVMIjoiRVJST1IifSwidHJhbnNwb3J0VHlwZSI6InN0ZGlvIn0%3D)
[![Install on VS Code](https://img.shields.io/badge/Install-VS_Code-FF9900?style=flat-square&logo=visualstudiocode&logoColor=white)](https://insiders.vscode.dev/redirect/mcp/install?name=SageMaker%20HyperPod%20MCP%20Server&config=%7B%22autoApprove%22%3A%5B%5D%2C%22disabled%22%3Afalse%2C%22command%22%3A%22uvx%22%2C%22args%22%3A%5B%22awslabs.sagemaker-hyperpod-mcp-server%40latest%22%2C%22--allow-write%22%2C%22--allow-sensitive-data-access%22%5D%2C%22env%22%3A%7B%22FASTMCP_LOG_LEVEL%22%3A%22ERROR%22%7D%2C%22transportType%22%3A%22stdio%22%7D) | | [Amazon EKS MCP Server](src/eks-mcp-server) | Kubernetes cluster management and app deployment | [![Install](https://img.shields.io/badge/Install-Cursor-blue?style=flat-square&logo=cursor)](https://cursor.com/en/install-mcp?name=awslabs.eks-mcp-server&config=eyJhdXRvQXBwcm92ZSI6W10sImRpc2FibGVkIjpmYWxzZSwiY29tbWFuZCI6InV2eCBhd3NsYWJzLmVrcy1tY3Atc2VydmVyQGxhdGVzdCAtLWFsbG93LXdyaXRlIC0tYWxsb3ctc2Vuc2l0aXZlLWRhdGEtYWNjZXNzIiwiZW52Ijp7IkZBU1RNQ1BfTE9HX0xFVkVMIjoiRVJST1IifSwidHJhbnNwb3J0VHlwZSI6InN0ZGlvIn0%3D)
[![Install on VS Code](https://img.shields.io/badge/Install-VS_Code-FF9900?style=flat-square&logo=visualstudiocode&logoColor=white)](https://insiders.vscode.dev/redirect/mcp/install?name=EKS%20MCP%20Server&config=%7B%22autoApprove%22%3A%5B%5D%2C%22disabled%22%3Afalse%2C%22command%22%3A%22uvx%22%2C%22args%22%3A%5B%22awslabs.eks-mcp-server%40latest%22%2C%22--allow-write%22%2C%22--allow-sensitive-data-access%22%5D%2C%22env%22%3A%7B%22FASTMCP_LOG_LEVEL%22%3A%22ERROR%22%7D%2C%22transportType%22%3A%22stdio%22%7D) | | [Amazon ECS MCP Server](src/ecs-mcp-server) | Containerize and deploy applications to ECS | [![Install](https://img.shields.io/badge/Install-Cursor-blue?style=flat-square&logo=cursor)](https://cursor.com/en/install-mcp?name=awslabs.ecs-mcp-server&config=eyJjb21tYW5kIjoidXZ4IC0tZnJvbSBhd3NsYWJzLWVjcy1tY3Atc2VydmVyIGVjcy1tY3Atc2VydmVyIiwiZW52Ijp7IkFXU19QUk9GSUxFIjoieW91ci1hd3MtcHJvZmlsZSIsIkFXU19SRUdJT04iOiJ5b3VyLWF3cy1yZWdpb24iLCJGQVNUTUNQX0xPR19MRVZFTCI6IkVSUk9SIiwiRkFTVE1DUF9MT0dfRklMRSI6Ii9wYXRoL3RvL2Vjcy1tY3Atc2VydmVyLmxvZyIsIkFMTE9XX1dSSVRFIjoiZmFsc2UiLCJBTExPV19TRU5TSVRJVkVfREFUQSI6ImZhbHNlIn19)
[![Install on VS Code](https://img.shields.io/badge/Install-VS_Code-FF9900?style=flat-square&logo=visualstudiocode&logoColor=white)](https://insiders.vscode.dev/redirect/mcp/install?name=ECS%20MCP%20Server&config=%7B%22command%22%3A%22uvx%22%2C%22args%22%3A%5B%22--from%22%2C%22awslabs-ecs-mcp-server%22%2C%22ecs-mcp-server%22%5D%2C%22env%22%3A%7B%22AWS_PROFILE%22%3A%22your-aws-profile%22%2C%22AWS_REGION%22%3A%22your-aws-region%22%2C%22FASTMCP_LOG_LEVEL%22%3A%22ERROR%22%2C%22FASTMCP_LOG_FILE%22%3A%22%2Fpath%2Fto%2Fecs-mcp-server.log%22%2C%22ALLOW_WRITE%22%3A%22false%22%2C%22ALLOW_SENSITIVE_DATA%22%3A%22false%22%7D%7D) | | [Finch MCP Server](src/finch-mcp-server) | Local container building with ECR push | [![Install](https://img.shields.io/badge/Install-Cursor-blue?style=flat-square&logo=cursor)](https://cursor.com/en/install-mcp?name=awslabs.finch-mcp-server&config=eyJjb21tYW5kIjoidXZ4IGF3c2xhYnMuZmluY2gtbWNwLXNlcnZlckBsYXRlc3QiLCJlbnYiOnsiQVdTX1BST0ZJTEUiOiJkZWZhdWx0IiwiQVdTX1JFR0lPTiI6InVzLXdlc3QtMiIsIkZBU1RNQ1BfTE9HX0xFVkVMIjoiSU5GTyJ9LCJ0cmFuc3BvcnRUeXBlIjoic3RkaW8iLCJkaXNhYmxlZCI6ZmFsc2UsImF1dG9BcHByb3ZlIjpbXX0%3D)
[![Install on VS Code](https://img.shields.io/badge/Install-VS_Code-FF9900?style=flat-square&logo=visualstudiocode&logoColor=white)](https://insiders.vscode.dev/redirect/mcp/install?name=Finch%20MCP%20Server&config=%7B%22command%22%3A%22uvx%22%2C%22args%22%3A%5B%22awslabs.finch-mcp-server%40latest%22%5D%2C%22env%22%3A%7B%22AWS_PROFILE%22%3A%22default%22%2C%22AWS_REGION%22%3A%22us-west-2%22%2C%22FASTMCP_LOG_LEVEL%22%3A%22INFO%22%7D%2C%22transportType%22%3A%22stdio%22%2C%22disabled%22%3Afalse%2C%22autoApprove%22%3A%5B%5D%7D) | diff --git a/docusaurus/docs/servers/sagemaker-hyperpod-mcp-server.md b/docusaurus/docs/servers/sagemaker-hyperpod-mcp-server.md new file mode 100644 index 0000000000..11f408696d --- /dev/null +++ b/docusaurus/docs/servers/sagemaker-hyperpod-mcp-server.md @@ -0,0 +1,16 @@ +--- +title: Amazon SageMaker HyperPod MCP Server +--- + +import ReadmeContent from "../../../src/sagemaker-hyperpod-mcp-server/README.md"; + +
+
+<ReadmeContent />
+
diff --git a/docusaurus/sidebars.ts b/docusaurus/sidebars.ts index ab31203015..9599167adf 100644 --- a/docusaurus/sidebars.ts +++ b/docusaurus/sidebars.ts @@ -46,6 +46,7 @@ const sidebars: SidebarsConfig = { 'servers/cdk-mcp-server', 'servers/cfn-mcp-server', 'servers/terraform-mcp-server', + 'servers/sagemaker-hyperpod-mcp-server', 'servers/eks-mcp-server', 'servers/ecs-mcp-server', 'servers/finch-mcp-server', diff --git a/docusaurus/static/assets/server-cards.json b/docusaurus/static/assets/server-cards.json index 8a5579417e..58e6baf0ba 100644 --- a/docusaurus/static/assets/server-cards.json +++ b/docusaurus/static/assets/server-cards.json @@ -199,6 +199,26 @@ "vibe-coding" ] }, + { + "category": "Infrastructure & Deployment", + "description": "SageMaker HyperPod cluster management and application deployment", + "icon": "\ud83c\udfd7\ufe0f", + "id": "sagemaker-hyperpod-mcp-server", + "name": "Amazon SageMaker HyperPod MCP Server", + "source_path": "src/sagemaker-hyperpod-mcp-server/", + "subcategory": "Container Platforms", + "tags": [ + "sagemaker hyperpod", + "eks", + "cluster-management", + "application-deployment", + "vibe-coding", + "container-platforms" + ], + "workflows": [ + "vibe-coding" + ] + }, { "category": "Infrastructure & Deployment", "description": "Kubernetes cluster management and application deployment", diff --git a/src/sagemaker-hyperpod-mcp-server/.gitignore b/src/sagemaker-hyperpod-mcp-server/.gitignore new file mode 100644 index 0000000000..9a75a1cf7c --- /dev/null +++ b/src/sagemaker-hyperpod-mcp-server/.gitignore @@ -0,0 +1,63 @@ +# Python +__pycache__/ +*.py[cod] +*$py.class +*.so +.Python +build/ +develop-eggs/ +dist/ +downloads/ +eggs/ +.eggs/ +lib/ +lib64/ +parts/ +sdist/ +var/ +wheels/ +share/python-wheels/ +*.egg-info/ +.installed.cfg +*.egg +MANIFEST + +# Virtual environments +.venv +env/ +venv/ +ENV/ + +# IDE +.idea/ +.vscode/ +*.swp +*.swo + +# Testing +.tox/ +.coverage +.coverage.* +htmlcov/ +.pytest_cache/ + +# Ruff 
+.ruff_cache/ + +# Build +*.manifest +*.spec +.pybuilder/ +target/ +build + +# Environments +.env +.env.local +.env.*.local + +# PyPI +.pypirc + +# MCP param +*-params.json diff --git a/src/sagemaker-hyperpod-mcp-server/.python-version b/src/sagemaker-hyperpod-mcp-server/.python-version new file mode 100644 index 0000000000..c8cfe39591 --- /dev/null +++ b/src/sagemaker-hyperpod-mcp-server/.python-version @@ -0,0 +1 @@ +3.10 diff --git a/src/sagemaker-hyperpod-mcp-server/CHANGELOG.md b/src/sagemaker-hyperpod-mcp-server/CHANGELOG.md new file mode 100644 index 0000000000..92b0342e5f --- /dev/null +++ b/src/sagemaker-hyperpod-mcp-server/CHANGELOG.md @@ -0,0 +1,12 @@ +# Changelog + +All notable changes to this project will be documented in this file. + +The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/), +and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html). + +## Unreleased + +### Added + +- Initial project setup diff --git a/src/sagemaker-hyperpod-mcp-server/LICENSE b/src/sagemaker-hyperpod-mcp-server/LICENSE new file mode 100644 index 0000000000..67db858821 --- /dev/null +++ b/src/sagemaker-hyperpod-mcp-server/LICENSE @@ -0,0 +1,175 @@ + + Apache License + Version 2.0, January 2004 + http://www.apache.org/licenses/ + + TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION + + 1. Definitions. + + "License" shall mean the terms and conditions for use, reproduction, + and distribution as defined by Sections 1 through 9 of this document. + + "Licensor" shall mean the copyright owner or entity authorized by + the copyright owner that is granting the License. + + "Legal Entity" shall mean the union of the acting entity and all + other entities that control, are controlled by, or are under common + control with that entity. 
For the purposes of this definition, + "control" means (i) the power, direct or indirect, to cause the + direction or management of such entity, whether by contract or + otherwise, or (ii) ownership of fifty percent (50%) or more of the + outstanding shares, or (iii) beneficial ownership of such entity. + + "You" (or "Your") shall mean an individual or Legal Entity + exercising permissions granted by this License. + + "Source" form shall mean the preferred form for making modifications, + including but not limited to software source code, documentation + source, and configuration files. + + "Object" form shall mean any form resulting from mechanical + transformation or translation of a Source form, including but + not limited to compiled object code, generated documentation, + and conversions to other media types. + + "Work" shall mean the work of authorship, whether in Source or + Object form, made available under the License, as indicated by a + copyright notice that is included in or attached to the work + (an example is provided in the Appendix below). + + "Derivative Works" shall mean any work, whether in Source or Object + form, that is based on (or derived from) the Work and for which the + editorial revisions, annotations, elaborations, or other modifications + represent, as a whole, an original work of authorship. For the purposes + of this License, Derivative Works shall not include works that remain + separable from, or merely link (or bind by name) to the interfaces of, + the Work and Derivative Works thereof. + + "Contribution" shall mean any work of authorship, including + the original version of the Work and any modifications or additions + to that Work or Derivative Works thereof, that is intentionally + submitted to Licensor for inclusion in the Work by the copyright owner + or by an individual or Legal Entity authorized to submit on behalf of + the copyright owner. 
For the purposes of this definition, "submitted" + means any form of electronic, verbal, or written communication sent + to the Licensor or its representatives, including but not limited to + communication on electronic mailing lists, source code control systems, + and issue tracking systems that are managed by, or on behalf of, the + Licensor for the purpose of discussing and improving the Work, but + excluding communication that is conspicuously marked or otherwise + designated in writing by the copyright owner as "Not a Contribution." + + "Contributor" shall mean Licensor and any individual or Legal Entity + on behalf of whom a Contribution has been received by Licensor and + subsequently incorporated within the Work. + + 2. Grant of Copyright License. Subject to the terms and conditions of + this License, each Contributor hereby grants to You a perpetual, + worldwide, non-exclusive, no-charge, royalty-free, irrevocable + copyright license to reproduce, prepare Derivative Works of, + publicly display, publicly perform, sublicense, and distribute the + Work and such Derivative Works in Source or Object form. + + 3. Grant of Patent License. Subject to the terms and conditions of + this License, each Contributor hereby grants to You a perpetual, + worldwide, non-exclusive, no-charge, royalty-free, irrevocable + (except as stated in this section) patent license to make, have made, + use, offer to sell, sell, import, and otherwise transfer the Work, + where such license applies only to those patent claims licensable + by such Contributor that are necessarily infringed by their + Contribution(s) alone or by combination of their Contribution(s) + with the Work to which such Contribution(s) was submitted. 
If You + institute patent litigation against any entity (including a + cross-claim or counterclaim in a lawsuit) alleging that the Work + or a Contribution incorporated within the Work constitutes direct + or contributory patent infringement, then any patent licenses + granted to You under this License for that Work shall terminate + as of the date such litigation is filed. + + 4. Redistribution. You may reproduce and distribute copies of the + Work or Derivative Works thereof in any medium, with or without + modifications, and in Source or Object form, provided that You + meet the following conditions: + + (a) You must give any other recipients of the Work or + Derivative Works a copy of this License; and + + (b) You must cause any modified files to carry prominent notices + stating that You changed the files; and + + (c) You must retain, in the Source form of any Derivative Works + that You distribute, all copyright, patent, trademark, and + attribution notices from the Source form of the Work, + excluding those notices that do not pertain to any part of + the Derivative Works; and + + (d) If the Work includes a "NOTICE" text file as part of its + distribution, then any Derivative Works that You distribute must + include a readable copy of the attribution notices contained + within such NOTICE file, excluding those notices that do not + pertain to any part of the Derivative Works, in at least one + of the following places: within a NOTICE text file distributed + as part of the Derivative Works; within the Source form or + documentation, if provided along with the Derivative Works; or, + within a display generated by the Derivative Works, if and + wherever such third-party notices normally appear. The contents + of the NOTICE file are for informational purposes only and + do not modify the License. 
You may add Your own attribution + notices within Derivative Works that You distribute, alongside + or as an addendum to the NOTICE text from the Work, provided + that such additional attribution notices cannot be construed + as modifying the License. + + You may add Your own copyright statement to Your modifications and + may provide additional or different license terms and conditions + for use, reproduction, or distribution of Your modifications, or + for any such Derivative Works as a whole, provided Your use, + reproduction, and distribution of the Work otherwise complies with + the conditions stated in this License. + + 5. Submission of Contributions. Unless You explicitly state otherwise, + any Contribution intentionally submitted for inclusion in the Work + by You to the Licensor shall be under the terms and conditions of + this License, without any additional terms or conditions. + Notwithstanding the above, nothing herein shall supersede or modify + the terms of any separate license agreement you may have executed + with Licensor regarding such Contributions. + + 6. Trademarks. This License does not grant permission to use the trade + names, trademarks, service marks, or product names of the Licensor, + except as required for reasonable and customary use in describing the + origin of the Work and reproducing the content of the NOTICE file. + + 7. Disclaimer of Warranty. Unless required by applicable law or + agreed to in writing, Licensor provides the Work (and each + Contributor provides its Contributions) on an "AS IS" BASIS, + WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or + implied, including, without limitation, any warranties or conditions + of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A + PARTICULAR PURPOSE. You are solely responsible for determining the + appropriateness of using or redistributing the Work and assume any + risks associated with Your exercise of permissions under this License. + + 8. 
Limitation of Liability. In no event and under no legal theory, + whether in tort (including negligence), contract, or otherwise, + unless required by applicable law (such as deliberate and grossly + negligent acts) or agreed to in writing, shall any Contributor be + liable to You for damages, including any direct, indirect, special, + incidental, or consequential damages of any character arising as a + result of this License or out of the use or inability to use the + Work (including but not limited to damages for loss of goodwill, + work stoppage, computer failure or malfunction, or any and all + other commercial damages or losses), even if such Contributor + has been advised of the possibility of such damages. + + 9. Accepting Warranty or Additional Liability. While redistributing + the Work or Derivative Works thereof, You may choose to offer, + and charge a fee for, acceptance of support, warranty, indemnity, + or other liability obligations and/or rights consistent with this + License. However, in accepting such obligations, You may act only + on Your own behalf and on Your sole responsibility, not on behalf + of any other Contributor, and only if You agree to indemnify, + defend, and hold each Contributor harmless for any liability + incurred by, or claims asserted against, such Contributor by reason + of your accepting any such warranty or additional liability. diff --git a/src/sagemaker-hyperpod-mcp-server/NOTICE b/src/sagemaker-hyperpod-mcp-server/NOTICE new file mode 100644 index 0000000000..92c323e491 --- /dev/null +++ b/src/sagemaker-hyperpod-mcp-server/NOTICE @@ -0,0 +1,2 @@ +awslabs.sagemaker-hyperpod-mcp-server +Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved. 
diff --git a/src/sagemaker-hyperpod-mcp-server/README.md b/src/sagemaker-hyperpod-mcp-server/README.md
new file mode 100644
index 0000000000..0f8e892f0c
--- /dev/null
+++ b/src/sagemaker-hyperpod-mcp-server/README.md
@@ -0,0 +1,470 @@
+# Amazon SageMaker HyperPod MCP Server
+
+The Amazon SageMaker HyperPod MCP server provides AI code assistants with resource management tools and real-time cluster state visibility. This gives large language models (LLMs) the tooling and contextual awareness they need to guide application development, from initial setup workflows through ongoing management.
+
+Integrating the HyperPod MCP server into an AI code assistant improves every phase of the development workflow: it assists with initial cluster setup using the same managed CloudFormation templates as the AWS SageMaker HyperPod console UI, and it supports ongoing cluster management through high-level workflows and guidance. Together, these capabilities reduce complex operations to natural language interactions in the assistant.
+
+## Key features
+
+* Enables users of AI code assistants to interact with HyperPod cluster deployment workflows, using the same managed CloudFormation templates as the HyperPod console UI for consistent, approved deployments.
+* Provides an interface to HyperPod cluster stacks and resources via managed CloudFormation templates and user-provided custom parameter values.
+* Supports full lifecycle management of HyperPod cluster nodes: listing, describing, updating software, and deleting.
+
+## Prerequisites
+
+* [Install Python 3.10+](https://www.python.org/downloads/release/python-3100/)
+* [Install the `uv` package manager](https://docs.astral.sh/uv/getting-started/installation/)
+* [Install and configure the AWS CLI with credentials](https://docs.aws.amazon.com/cli/latest/userguide/cli-chap-configure.html)
+* [Install Pre-commit](https://pre-commit.com/)
+  - Pre-commit runs automatically before each commit. You can also run it manually on all files with `pre-commit run --all-files`. If any hook fails, the commit is aborted and you will need to fix the issues before committing again.
+
+## Setup
+
+Add these IAM policies to the IAM role or user that you use to manage your HyperPod cluster resources.
+
+### Read-Only Operations Policy
+
+For read operations, the following permissions are required:
+
+```
+{
+  "Version": "2012-10-17",
+  "Statement": [
+    {
+      "Effect": "Allow",
+      "Action": [
+        "sagemaker:ListClusters",
+        "sagemaker:DescribeCluster",
+        "sagemaker:ListClusterNodes",
+        "sagemaker:DescribeClusterNode",
+        "cloudformation:DescribeStacks"
+      ],
+      "Resource": "*"
+    }
+  ]
+}
+```
+
+### Write Operations Policy
+
+For write operations, we recommend the following IAM policies to ensure successful deployment of HyperPod clusters using the managed CloudFormation templates:
+
+* [**IAMFullAccess**](https://docs.aws.amazon.com/aws-managed-policy/latest/reference/IAMFullAccess.html): Enables creation and management of IAM roles and policies required for cluster operation. After cluster creation, if no new IAM roles need to be created, we recommend reducing this policy's scope.
+* [**AmazonVPCFullAccess**](https://docs.aws.amazon.com/aws-managed-policy/latest/reference/AmazonVPCFullAccess.html): Allows creation and configuration of VPC resources including subnets, route tables, internet gateways, and NAT gateways.
+* [**AWSCloudFormationFullAccess**](https://docs.aws.amazon.com/aws-managed-policy/latest/reference/AWSCloudFormationFullAccess.html): Provides permissions to create, update, and delete the CloudFormation stacks that orchestrate the deployment.
+* [**AmazonSageMakerFullAccess**](https://docs.aws.amazon.com/aws-managed-policy/latest/reference/AmazonSageMakerFullAccess.html): Required for creating and managing HyperPod clusters and cluster nodes.
+* [**AmazonS3FullAccess**](https://docs.aws.amazon.com/aws-managed-policy/latest/reference/AmazonS3FullAccess.html): Required for creating the S3 buckets that store lifecycle scripts and related assets.
+* [**AWSLambda_FullAccess**](https://docs.aws.amazon.com/aws-managed-policy/latest/reference/AWSLambda_FullAccess.html): Required for interacting with the Lambda functions that manage HyperPod clusters and other resources.
+* [**CloudWatchLogsFullAccess**](https://docs.aws.amazon.com/aws-managed-policy/latest/reference/CloudWatchLogsFullAccess.html): Required for operations on CloudWatch logs.
+* [**AmazonFSxFullAccess**](https://docs.aws.amazon.com/aws-managed-policy/latest/reference/AmazonFSxFullAccess.html): Required for operations on FSx file systems.
+* **EKS Full Access (provided below)**: Required for interacting with the EKS clusters that orchestrate HyperPod.
+
+  ```
+  {
+    "Version": "2012-10-17",
+    "Statement": [
+      {
+        "Effect": "Allow",
+        "Action": "eks:*",
+        "Resource": "*"
+      }
+    ]
+  }
+  ```
+
+**Important Security Note**: Exercise caution when enabling `--allow-write` and `--allow-sensitive-data-access` together with these broad permissions, as this combination grants significant privileges to the MCP server. Only enable these flags when necessary and in trusted environments.
For production use, consider creating more restrictive custom policies.
+
+## Quickstart
+
+This quickstart guide walks you through configuring the Amazon SageMaker HyperPod MCP Server for use with the [Amazon Q Developer CLI](https://github.com/aws/amazon-q-developer-cli). By following these steps, you'll set up your development environment to use the HyperPod MCP Server's tools for managing your Amazon SageMaker HyperPod clusters and resources.
+
+**Set up Pre-Commit**
+1. `pip install pre-commit`
+2. `pre-commit install`
+3. Pre-commit will now run before every commit. Use `pre-commit run --all-files` to run it without committing.
+
+**Set up VS Code**
+
+| VS Code |
+|:-------:|
+| [![Install on VS Code](https://img.shields.io/badge/Install_on-VS_Code-FF9900?style=flat-square&logo=visualstudiocode&logoColor=white)](https://insiders.vscode.dev/redirect/mcp/install?name=HyperPod%20MCP%20Server&config=%7B%22autoApprove%22%3A%5B%5D%2C%22disabled%22%3Afalse%2C%22command%22%3A%22uvx%22%2C%22args%22%3A%5B%22awslabs.sagemaker-hyperpod-mcp-server%40latest%22%2C%22--allow-write%22%2C%22--allow-sensitive-data-access%22%5D%2C%22env%22%3A%7B%22FASTMCP_LOG_LEVEL%22%3A%22ERROR%22%7D%2C%22transportType%22%3A%22stdio%22%7D) |
+
+**Set up the Amazon Q Developer CLI**
+
+1. Install the [Amazon Q Developer CLI](https://docs.aws.amazon.com/amazonq/latest/qdeveloper-ug/command-line-installing.html).
+2. The Q Developer CLI supports MCP servers for tools and prompts out of the box. Edit your Q Developer CLI MCP configuration file, `mcp.json`, following [these instructions](https://docs.aws.amazon.com/amazonq/latest/qdeveloper-ug/command-line-mcp-configuration.html).
+
+The example below includes both the `--allow-write` flag for mutating operations and the `--allow-sensitive-data-access` flag for accessing logs and events (see the Arguments section for more details):
+
+   **For Mac/Linux:**
+
+   ```
+   {
+     "mcpServers": {
+       "awslabs.sagemaker-hyperpod-mcp-server": {
+         "command": "uvx",
+         "args": [
+           "awslabs.sagemaker-hyperpod-mcp-server@latest",
+           "--allow-write",
+           "--allow-sensitive-data-access"
+         ],
+         "env": {
+           "FASTMCP_LOG_LEVEL": "ERROR"
+         },
+         "autoApprove": [],
+         "disabled": false
+       }
+     }
+   }
+   ```
+
+   **For Windows:**
+
+   ```
+   {
+     "mcpServers": {
+       "awslabs.sagemaker-hyperpod-mcp-server": {
+         "command": "uvx",
+         "args": [
+           "--from",
+           "awslabs.sagemaker-hyperpod-mcp-server@latest",
+           "awslabs.sagemaker-hyperpod-mcp-server.exe",
+           "--allow-write",
+           "--allow-sensitive-data-access"
+         ],
+         "env": {
+           "FASTMCP_LOG_LEVEL": "ERROR"
+         },
+         "autoApprove": [],
+         "disabled": false
+       }
+     }
+   }
+   ```
+
+3. Verify your setup by running the `/tools` command in the Q Developer CLI to see the available HyperPod MCP tools.
+
+Note that this is a basic quickstart. You can enable additional capabilities, such as adding other MCP servers, like the [AWS Documentation MCP Server](https://awslabs.github.io/mcp/servers/aws-documentation-mcp-server/), to the same MCP configuration. For an example, see the [Installation and Setup](https://github.com/awslabs/mcp?tab=readme-ov-file#installation-and-setup) guide in AWS MCP Servers on GitHub. For a real-world implementation of an MCP server alongside application code, see the [Server Developer](https://modelcontextprotocol.io/quickstart/server) guide in the Anthropic documentation.
+
+## Configurations
+
+### Arguments
+
+The `args` field in the MCP server definition specifies the command-line arguments passed to the server when it starts. These arguments control how the server is executed and configured.
For example:
+
+**For Mac/Linux:**
+```
+{
+  "mcpServers": {
+    "awslabs.sagemaker-hyperpod-mcp-server": {
+      "command": "uvx",
+      "args": [
+        "awslabs.sagemaker-hyperpod-mcp-server@latest",
+        "--allow-write",
+        "--allow-sensitive-data-access"
+      ],
+      "env": {
+        "AWS_PROFILE": "your-profile",
+        "AWS_REGION": "us-east-1"
+      }
+    }
+  }
+}
+```
+
+**For Windows:**
+```
+{
+  "mcpServers": {
+    "awslabs.sagemaker-hyperpod-mcp-server": {
+      "command": "uvx",
+      "args": [
+        "--from",
+        "awslabs.sagemaker-hyperpod-mcp-server@latest",
+        "awslabs.sagemaker-hyperpod-mcp-server.exe",
+        "--allow-write",
+        "--allow-sensitive-data-access"
+      ],
+      "env": {
+        "AWS_PROFILE": "your-profile",
+        "AWS_REGION": "us-east-1"
+      }
+    }
+  }
+}
+```
+
+#### Command Format
+
+The command format differs between operating systems:
+
+**For Mac/Linux:**
+* `awslabs.sagemaker-hyperpod-mcp-server@latest` - Runs the latest published version of the package.
+
+**For Windows:**
+* `--from awslabs.sagemaker-hyperpod-mcp-server@latest awslabs.sagemaker-hyperpod-mcp-server.exe` - Windows requires the `--from` flag to specify the package and the `.exe` extension.
+
+Both formats enable MCP server startup and tool registration.
+
+#### `--allow-write` (optional)
+
+Enables write access mode, which allows mutating operations (e.g., creating, updating, or deleting resources) for the `manage_hyperpod_stacks` and `manage_hyperpod_cluster_nodes` tools.
+
+* Default: false (the server runs in read-only mode by default)
+* Example: Add `--allow-write` to the `args` list in your MCP server definition.
+
+#### `--allow-sensitive-data-access` (optional)
+
+Enables access to sensitive data such as logs, events, and cluster details. This flag is required for tools that access potentially sensitive information.
+
+* Default: false (access to sensitive data is restricted by default)
+* Example: Add `--allow-sensitive-data-access` to the `args` list in your MCP server definition.
+ +### Environment variables + +The `env` field in the MCP server definition allows you to configure environment variables that control the behavior of the HyperPod MCP server. For example: + +``` +{ + "mcpServers": { + "awslabs.sagemaker-hyperpod-mcp-server": { + "env": { + "FASTMCP_LOG_LEVEL": "ERROR", + "AWS_PROFILE": "my-profile", + "AWS_REGION": "us-west-2" + } + } + } +} +``` + +#### `FASTMCP_LOG_LEVEL` (optional) + +Sets the logging verbosity for the server. + +* Valid values: "DEBUG", "INFO", "WARNING", "ERROR", "CRITICAL" +* Default: "WARNING" +* Example: `"FASTMCP_LOG_LEVEL": "ERROR"` + +#### `AWS_PROFILE` (optional) + +Specifies the AWS profile to use for authentication. + +* Default: None (If not set, uses default AWS credentials). +* Example: `"AWS_PROFILE": "my-profile"` + +#### `AWS_REGION` (optional) + +Specifies the AWS region where HyperPod clusters are managed; this region is used for all AWS service operations. + +* Default: None (If not set, uses default AWS region). +* Example: `"AWS_REGION": "us-west-2"` + +## Tools + +The following tools are provided by the HyperPod MCP server for managing Amazon SageMaker HyperPod clusters and resources. Each tool performs a specific action that can be invoked to automate common tasks in your HyperPod clusters. + +### HyperPod Cluster Management + +#### `manage_hyperpod_stacks` + +Provides an interface to HyperPod CloudFormation stacks, with operations for deploying, describing, and deleting HyperPod clusters and their underlying infrastructure. **Note**: Cluster creation typically takes around 30 minutes to complete. + +Features: + +* Interfaces with HyperPod cluster deployments using the same managed CloudFormation templates as the HyperPod console UI. +* Allows users to specify parameter override values as a JSON object for more customized HyperPod stack creation. +* Describes existing HyperPod CloudFormation stacks, providing details like status, outputs, and creation time.
+* Deletes HyperPod CloudFormation stacks and their associated resources, ensuring proper cleanup. +* Ensures safety by only modifying/deleting stacks that were originally created by this tool. +* Does not create, modify, or provision CloudFormation templates - only interfaces with existing managed templates. + +Parameters: + +* operation (deploy, describe, delete), stack_name, region_name, profile_name, params_file (for deploy) + +### HyperPod Cluster Node Operations + +#### `manage_hyperpod_cluster_nodes` + +Manages SageMaker HyperPod clusters and nodes with both read and write operations. + +Features: + +* Provides a consolidated interface for all cluster and node-related operations. +* Supports listing clusters with filtering by name, creation time, and training plan ARN. +* Supports listing nodes with filtering by creation time and instance group name. +* Returns detailed information about specific nodes in a cluster. +* Initiates software updates for all nodes or specific instance groups in a cluster. +* Deletes multiple nodes from a cluster in a single operation. + +Operations: + +* **list_clusters**: Lists SageMaker HyperPod clusters with options for pagination and filtering. +* **list_nodes**: Lists nodes in a SageMaker HyperPod cluster with options for pagination and filtering. +* **describe_node**: Gets detailed information about a specific node in a SageMaker HyperPod cluster. +* **update_software**: Updates the software for a SageMaker HyperPod cluster. +* **batch_delete**: Deletes multiple nodes from a SageMaker HyperPod cluster in a single operation. 
+ +Parameters: + +* operation (list_clusters, list_nodes, describe_node, update_software, batch_delete) +* cluster_name (required for all operations except list_clusters) +* node_id (required for describe_node operation) +* node_ids (required for batch_delete operation) +* Additional parameters specific to each operation + + +## Security & permissions + +### Features + +The HyperPod MCP Server implements the following security features: + +1. **AWS Authentication**: Uses AWS credentials from the environment for secure authentication. +2. **SSL Verification**: Enforces SSL verification for all AWS API calls. +3. **Resource Tagging**: Tags all created resources for traceability. +4. **Least Privilege**: Uses IAM roles with appropriate permissions for CloudFormation templates. +5. **Stack Protection**: Ensures CloudFormation stacks can only be modified by the tool that created them. + +### Considerations + +When using the HyperPod MCP Server, consider the following: + +* **AWS Credentials**: The server needs permission to create and manage HyperPod resources. +* **Network Security**: Configure VPC and security groups properly for HyperPod clusters. +* **Authentication**: Use appropriate authentication mechanisms for AWS resources. +* **Authorization**: Configure IAM properly for AWS resources. +* **Data Protection**: Encrypt sensitive data in HyperPod clusters. +* **Logging and Monitoring**: Enable logging and monitoring for HyperPod clusters. + +### Permissions + +The HyperPod MCP Server can be used for production environments with proper security controls in place. The server runs in read-only mode by default, which is recommended and considered generally safer for production environments. Only explicitly enable write access when necessary. 
Below are the HyperPod MCP server tools available in read-only versus write-access mode: + +* **Read-only mode (default)**: `manage_hyperpod_stacks` (with operation="describe"), `manage_hyperpod_cluster_nodes` (with operations="list_clusters", "list_nodes", "describe_node"). +* **Write-access mode** (requires `--allow-write`): `manage_hyperpod_stacks` (with operations="deploy", "delete"), `manage_hyperpod_cluster_nodes` (with operations="update_software", "batch_delete"). + +#### `autoApprove` (optional) + +An array within the MCP server definition that lists tool names to be automatically approved by the MCP client, bypassing user confirmation for those specific tools. For example: + +**For Mac/Linux:** +``` +{ + "mcpServers": { + "awslabs.sagemaker-hyperpod-mcp-server": { + "command": "uvx", + "args": [ + "awslabs.sagemaker-hyperpod-mcp-server@latest" + ], + "env": { + "AWS_PROFILE": "hyperpod-mcp-readonly-profile", + "AWS_REGION": "us-east-1", + "FASTMCP_LOG_LEVEL": "INFO" + }, + "autoApprove": [ + "manage_hyperpod_stacks", + "manage_hyperpod_cluster_nodes" + ] + } + } +} +``` + +**For Windows:** +``` +{ + "mcpServers": { + "awslabs.sagemaker-hyperpod-mcp-server": { + "command": "uvx", + "args": [ + "--from", + "awslabs.sagemaker-hyperpod-mcp-server@latest", + "awslabs.sagemaker-hyperpod-mcp-server.exe" + ], + "env": { + "AWS_PROFILE": "hyperpod-mcp-readonly-profile", + "AWS_REGION": "us-east-1", + "FASTMCP_LOG_LEVEL": "INFO" + }, + "autoApprove": [ + "manage_hyperpod_stacks", + "manage_hyperpod_cluster_nodes" + ] + } + } +} +``` + +### Role Scoping Recommendations + +In accordance with security best practices, we recommend the following: + +1. **Create dedicated IAM roles** to be used by the HyperPod MCP Server with the principle of "least privilege." +2. **Use separate roles** for read-only and write operations. +3. **Implement resource tagging** to limit actions to resources created by the server. +4. 
**Enable AWS CloudTrail** to audit all API calls made by the server. +5. **Regularly review** the permissions granted to the server's IAM role. +6. **Use IAM Access Analyzer** to identify unused permissions that can be removed. + +### Sensitive Information Handling + +**IMPORTANT**: Do not pass secrets or sensitive information via allowed input mechanisms: + +* Do not include secrets or credentials in CloudFormation templates. +* Do not pass sensitive information directly in the prompt to the model. +* Avoid using MCP tools for creating secrets, as this would require providing the secret data to the model. + +**CloudFormation Template Security**: + +* Only use CloudFormation templates from trustworthy sources. +* The server relies on CloudFormation API validation for template content and does not perform its own validation. +* Audit CloudFormation templates before applying them to your cluster. + +**Instead of passing secrets through MCP**: + +* Use AWS Secrets Manager or Parameter Store to store sensitive information. +* Configure proper IAM roles for service accounts. +* Use IAM roles for service accounts (IRSA) for AWS service access from pods. + + +### File System Access and Operating Mode + +**Important**: This MCP server is intended for **STDIO mode only** as a local server using a single user's credentials. The server runs with the same permissions as the user who started it and has complete access to the file system. 
+ +#### Security and Access Considerations + +- **Full File System Access**: The server can read from and write to any location on the file system where the user has permissions +- **Host File System Sharing**: When using this server, the host file system is directly accessible +- **Do Not Modify for Network Use**: This server is designed for local STDIO use only; network operation introduces additional security risks + +#### Common File Operations + +The MCP server can write a templated params JSON file to a user-specified absolute file path during HyperPod cluster creation. + + +## General Best Practices + +* **Resource Naming**: Use descriptive names for HyperPod clusters and resources. +* **Error Handling**: Check for errors in tool responses and handle them appropriately. +* **Resource Cleanup**: Delete unused resources to avoid unnecessary costs. +* **Monitoring**: Monitor cluster and resource status regularly. +* **Security**: Follow AWS security best practices for HyperPod clusters. +* **Backup**: Regularly back up important HyperPod resources. + +## General Troubleshooting + +* **Permission Errors**: Verify that your AWS credentials have the necessary permissions. +* **CloudFormation Errors**: Check the CloudFormation console for stack creation errors. +* **SageMaker API Errors**: Verify that the HyperPod cluster is running and accessible. +* **Network Issues**: Check VPC and security group configurations. +* **Client Errors**: Verify that the MCP client is configured correctly. +* **Log Level**: Increase the log level to DEBUG for more detailed logs. + +For general HyperPod issues, consult the [Amazon SageMaker HyperPod documentation](https://docs.aws.amazon.com/sagemaker/latest/dg/sagemaker-hyperpod.html).
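Following the log-level tip above, a server definition with DEBUG logging enabled might look like this (Mac/Linux form; fields other than `FASTMCP_LOG_LEVEL` are unchanged from the earlier examples):

```
{
  "mcpServers": {
    "awslabs.sagemaker-hyperpod-mcp-server": {
      "command": "uvx",
      "args": ["awslabs.sagemaker-hyperpod-mcp-server@latest"],
      "env": {
        "FASTMCP_LOG_LEVEL": "DEBUG"
      }
    }
  }
}
```

Remember to switch back to a quieter level (such as "ERROR") once you have finished debugging.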
+ +## Version + +Current MCP server version: 0.1.0 diff --git a/src/sagemaker-hyperpod-mcp-server/awslabs/__init__.py b/src/sagemaker-hyperpod-mcp-server/awslabs/__init__.py new file mode 100644 index 0000000000..5c624673e0 --- /dev/null +++ b/src/sagemaker-hyperpod-mcp-server/awslabs/__init__.py @@ -0,0 +1,16 @@ +# Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved. +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. + +# This file is part of the awslabs namespace. +# It is intentionally minimal to support PEP 420 namespace packages. diff --git a/src/sagemaker-hyperpod-mcp-server/awslabs/sagemaker_hyperpod_mcp_server/__init__.py b/src/sagemaker-hyperpod-mcp-server/awslabs/sagemaker_hyperpod_mcp_server/__init__.py new file mode 100644 index 0000000000..2bcd57027a --- /dev/null +++ b/src/sagemaker-hyperpod-mcp-server/awslabs/sagemaker_hyperpod_mcp_server/__init__.py @@ -0,0 +1,17 @@ +# Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved. +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
+# See the License for the specific language governing permissions and +# limitations under the License. + +"""awslabs.sagemaker-hyperpod-mcp-server""" + +__version__ = '0.1.0' diff --git a/src/sagemaker-hyperpod-mcp-server/awslabs/sagemaker_hyperpod_mcp_server/aws_helper.py b/src/sagemaker-hyperpod-mcp-server/awslabs/sagemaker_hyperpod_mcp_server/aws_helper.py new file mode 100644 index 0000000000..abfb213b9d --- /dev/null +++ b/src/sagemaker-hyperpod-mcp-server/awslabs/sagemaker_hyperpod_mcp_server/aws_helper.py @@ -0,0 +1,153 @@ +# Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved. +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. + +"""AWS helper for the HyperPod MCP Server.""" + +import boto3 +import os +import time +from awslabs.sagemaker_hyperpod_mcp_server.consts import SUPPORTED_REGIONS +from botocore.config import Config +from loguru import logger +from pydantic import validate_call +from typing import Any, Dict, Optional, cast, get_args + + +# TODO: Import version from package +__version__ = '0.1.0' + + +class AwsHelper: + """Helper class for AWS operations. + + This class provides utility methods for interacting with AWS services, + including region and profile management and client creation. + + This class implements a singleton pattern with a client cache to avoid + creating multiple clients for the same service. The cache includes TTL-based + expiration and size limits to prevent memory issues and handle credential rotation. 
+ """ + + # Singleton instance + _instance = None + + # Client cache with AWS service name as key + _client_cache: Dict[str, Any] = {} + + # Cache metadata for TTL and size management + _cache_metadata: Dict[str, float] = {} # key -> timestamp + _cache_ttl: int = 1800 # 30 minutes TTL + _cache_max_size: int = 100 # Maximum 100 cache entries + + @staticmethod + def get_aws_region() -> Optional[SUPPORTED_REGIONS]: + """Get the AWS region from the environment if set.""" + region = os.environ.get('AWS_REGION') + return cast(SUPPORTED_REGIONS, region) if region in get_args(SUPPORTED_REGIONS) else None + + @staticmethod + def get_aws_profile() -> Optional[str]: + """Get the AWS profile from the environment if set.""" + return os.environ.get('AWS_PROFILE') + + @classmethod + @validate_call + def create_boto3_client( + cls, service_name: str, region_name: Optional[SUPPORTED_REGIONS] = None + ) -> Any: + """Create or retrieve a cached boto3 client with the appropriate profile and region. + + The client is configured with a custom user agent suffix 'awslabs/mcp/sagemaker-hyperpod-mcp-server/{version}' + to identify API calls made by the HyperPod MCP Server. Clients are cached to improve performance + and reduce resource usage. 
+ + Args: + service_name: The AWS service name (e.g., 'sagemaker') + region_name: Optional region name override + + Returns: + A boto3 client for the specified service + + Raises: + Exception: If there's an error creating the client + """ + try: + # Get region from parameter or environment if set + region: Optional[SUPPORTED_REGIONS] = ( + region_name if region_name is not None else cls.get_aws_region() + ) + + # Get profile from environment if set + profile = cls.get_aws_profile() + + # Use service name as the cache key + cache_key = f'{service_name}+{region_name}' + + # Check if client is already in cache and not expired + current_time = time.time() + if cache_key in cls._client_cache: + # Check TTL expiration (lazy expiration) + if cache_key in cls._cache_metadata: + cache_time = cls._cache_metadata[cache_key] + if current_time - cache_time < cls._cache_ttl: + logger.info( + f'Using cached boto3 client for {service_name} in {region_name}' + ) + return cls._client_cache[cache_key] + else: + # Expired - remove from cache + logger.info( + f'Cache expired for {service_name} in {region_name}, creating new client' + ) + del cls._client_cache[cache_key] + del cls._cache_metadata[cache_key] + else: + # No metadata, treat as expired + del cls._client_cache[cache_key] + + # Create config with user agent suffix + config = Config( + user_agent_extra=f'awslabs/mcp/sagemaker-hyperpod-mcp-server/{__version__}' + ) + + # Create session with profile if specified + if profile: + session = boto3.Session(profile_name=profile) + if region is not None: + client = session.client(service_name, region_name=region, config=config) + else: + client = session.client(service_name, config=config) + else: + if region is not None: + client = boto3.client(service_name, region_name=region, config=config) + else: + client = boto3.client(service_name, config=config) + + # Enforce cache size limit before adding new entry + if len(cls._client_cache) >= cls._cache_max_size: + # Remove oldest entry 
(simple FIFO eviction) + oldest_key = min(cls._cache_metadata.keys(), key=lambda k: cls._cache_metadata[k]) + logger.info(f'Cache size limit reached, evicting oldest entry: {oldest_key}') + del cls._client_cache[oldest_key] + del cls._cache_metadata[oldest_key] + + # Cache the client with timestamp metadata + cls._client_cache[cache_key] = client + cls._cache_metadata[cache_key] = current_time + + logger.info(f'Created and cached new boto3 client for {service_name} in {region_name}') + return client + except Exception as e: + # Re-raise with more context + raise Exception(f'Failed to create boto3 client for {service_name}: {str(e)}') diff --git a/src/sagemaker-hyperpod-mcp-server/awslabs/sagemaker_hyperpod_mcp_server/consts.py b/src/sagemaker-hyperpod-mcp-server/awslabs/sagemaker_hyperpod_mcp_server/consts.py new file mode 100644 index 0000000000..3cd68f8b69 --- /dev/null +++ b/src/sagemaker-hyperpod-mcp-server/awslabs/sagemaker_hyperpod_mcp_server/consts.py @@ -0,0 +1,62 @@ +# Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved. +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. 
+ +"""Constants for the HyperPod MCP Server.""" + +from typing import Literal, TypeAlias + + +# HyperPod Stack Management Operations +STACK_DEPLOY_OPERATION = 'deploy' +STACK_DESCRIBE_OPERATION = 'describe' +STACK_DELETE_OPERATION = 'delete' + +# HyperPod Node Management Operations +LIST_CLUSTERS_OPERATION = 'list_clusters' +LIST_NODES_OPERATION = 'list_nodes' +DESCRIBE_NODE_OPERATION = 'describe_node' +UPDATE_SOFTWARE_OPERATION = 'update_software' +BATCH_DELETE_OPERATION = 'batch_delete' + +# AWS CloudFormation +CFN_CAPABILITY_IAM = 'CAPABILITY_IAM' +CFN_CAPABILITY_NAMED_IAM = 'CAPABILITY_NAMED_IAM' +CAPABILITY_AUTO_EXPAND = 'CAPABILITY_AUTO_EXPAND' +CFN_ON_FAILURE_DELETE = 'DELETE' +CFN_STACK_TAG_KEY = 'CreatedBy' +CFN_STACK_TAG_VALUE = 'HyperPodMCPServer' +HYPERPOD_CFN_TEMPLATE_URL_EKS = 'https://aws-sagemaker-hyperpod-cluster-setup-us-east-1-prod.s3.us-east-1.amazonaws.com/templates/main-stack-eks-based-template.yaml' +HYPERPOD_CFN_TEMPLATE_URL_SLURM = 'https://aws-sagemaker-hyperpod-cluster-setup-us-east-1-prod.s3.us-east-1.amazonaws.com/templates-slurm/main-stack-slurm-based-template.yaml' + +# Error message templates +STACK_NOT_OWNED_ERROR_TEMPLATE = ( + 'Stack {stack_name} exists but was not created by {tool_name}. ' + 'For safety reasons, this tool will only {operation} stacks that were created by itself. ' + 'To manage this stack, please use the AWS Console, CLI, or the tool that created it.' 
+) + + +STACK_OPERATIONS = Literal['deploy', 'describe', 'delete'] + +SUPPORTED_REGIONS = Literal['us-east-1', 'us-east-2', 'us-west-1', 'us-west-2'] + +CLUSTER_ORCHESTRATORS = Literal['eks', 'slurm'] + +NODE_OPERATIONS: TypeAlias = Literal[ + 'list_clusters', + 'list_nodes', + 'describe_node', + 'update_software', + 'batch_delete', +] diff --git a/src/sagemaker-hyperpod-mcp-server/awslabs/sagemaker_hyperpod_mcp_server/hyperpod_cluster_node_handler.py b/src/sagemaker-hyperpod-mcp-server/awslabs/sagemaker_hyperpod_mcp_server/hyperpod_cluster_node_handler.py new file mode 100644 index 0000000000..c1b326f657 --- /dev/null +++ b/src/sagemaker-hyperpod-mcp-server/awslabs/sagemaker_hyperpod_mcp_server/hyperpod_cluster_node_handler.py @@ -0,0 +1,1427 @@ +# Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved. +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. 
+ +"""HyperPod cluster node handler for the HyperPod MCP Server.""" + +import os +from awslabs.sagemaker_hyperpod_mcp_server.aws_helper import AwsHelper +from awslabs.sagemaker_hyperpod_mcp_server.consts import ( + BATCH_DELETE_OPERATION, + DESCRIBE_NODE_OPERATION, + LIST_CLUSTERS_OPERATION, + LIST_NODES_OPERATION, + NODE_OPERATIONS, + SUPPORTED_REGIONS, + UPDATE_SOFTWARE_OPERATION, +) +from awslabs.sagemaker_hyperpod_mcp_server.logging_helper import LogLevel, log_with_request_id +from awslabs.sagemaker_hyperpod_mcp_server.models import ( + BatchDeleteClusterNodesError, + BatchDeleteClusterNodesResponse, + ClusterEbsVolumeConfig, + ClusterInstancePlacement, + ClusterInstanceStatusDetails, + ClusterInstanceStorageConfig, + ClusterLifeCycleConfig, + ClusterNodeDetails, + ClusterNodeSummary, + ClusterSummary, + DeploymentConfiguration, + DescribeClusterNodeResponse, + ListClusterNodesResponse, + ListClustersResponse, + UpdateClusterSoftwareInstanceGroupSpecification, + UpdateClusterSoftwareResponse, + VpcConfig, +) +from mcp.server.fastmcp import Context +from mcp.types import TextContent +from pydantic import Field, validate_call +from typing import Any, List, Literal, Optional, Union + + +class HyperPodClusterNodeHandler: + """Handler for HyperPod cluster node operations in the HyperPod MCP Server. + + This class provides tools for interacting with SageMaker HyperPod cluster nodes. + """ + + def __init__( + self, + mcp, + allow_write: bool = False, + allow_sensitive_data_access: bool = False, + ): + """Initialize the HyperPod cluster node handler. 
+ + Args: + mcp: The MCP server instance + allow_write: Whether to enable write access (default: False) + allow_sensitive_data_access: Whether to allow access to sensitive data (default: False) + """ + self.mcp = mcp + self.allow_write = allow_write + self.allow_sensitive_data_access = allow_sensitive_data_access + + # Register tools + # temp workaround for update cluster, remove once update is fixed + self.mcp.tool(name='describe_hp_cluster')(self.describe_hp_cluster) + self.mcp.tool(name='update_hp_cluster')(self.update_hp_cluster) + + self.mcp.tool(name='manage_hyperpod_cluster_nodes')(self.manage_hyperpod_cluster_nodes) + + def get_sagemaker_client( + self, + ctx: Context, + region_name: Optional[SUPPORTED_REGIONS] = None, + profile_name: Optional[str] = None, + ): + """Get a SageMaker client for the specified region and profile. + + Args: + ctx: The MCP context + region_name: Optional AWS region name + profile_name: Optional AWS profile name. Using the correct profile is important + for successful API calls, especially for SageMaker HyperPod operations. + + Returns: + A boto3 SageMaker client + """ + # Set AWS_PROFILE environment variable if profile_name is provided + if profile_name: + log_with_request_id(ctx, LogLevel.INFO, f'Using AWS profile: {profile_name}') + os.environ['AWS_PROFILE'] = profile_name + + return AwsHelper.create_boto3_client('sagemaker', region_name=region_name) + + @validate_call + async def manage_hyperpod_cluster_nodes( + self, + ctx: Context, + operation: NODE_OPERATIONS = Field( + description='Operation to perform: list_clusters, list_nodes, describe_node, update_software, or batch_delete. Choose "list_clusters" or "list_nodes" or "describe_node" for read-only operations when write access is disabled.', + ), + cluster_name: Optional[str] = Field( + None, + description='The name of the cluster. 
Required for all operations except "list_clusters".', + ), + node_id: Optional[str] = Field( + None, + description='The ID of the SageMaker HyperPod cluster node. Required for "describe_node" operation.', + ), + node_ids: Optional[List[str]] = Field( + None, + description='The list of node IDs to delete from the cluster. Required for "batch_delete" operation.', + ), + # Parameters for list_clusters operation + max_results: Optional[int] = Field( + 10, + description='The maximum number of results to return in the response. Default: 10. Used for "list_clusters" and "list_nodes" operations.', + ge=1, + le=100, + ), + next_token: Optional[str] = Field( + None, + description='If the response to a previous request was truncated, the response includes a NextToken. To retrieve the next set of results, use the token in the next request. Used for "list_clusters" and "list_nodes" operations.', + ), + name_contains: Optional[str] = Field( + None, + description='A filter that returns only clusters whose name contains the specified string. Used for "list_clusters" operation.', + ), + # Parameters for list_nodes operation + creation_time_after: Optional[str] = Field( + None, + description='Filter for nodes/clusters created after the specified time. Accepts formats: ISO 8601 (e.g., 2014-10-01T20:30:00Z), date only (e.g., 2014-10-01), or Unix time in seconds. Used for "list_clusters" and "list_nodes" operations.', + ), + creation_time_before: Optional[str] = Field( + None, + description='Filter for nodes/clusters created before the specified time. Accepts formats: ISO 8601 (e.g., 2014-10-01T20:30:00Z), date only (e.g., 2014-10-01), or Unix time in seconds. Used for "list_clusters" and "list_nodes" operations.', + ), + instance_group_name_contains: Optional[str] = Field( + None, + description='Filter for nodes in instance groups whose name contains the specified string. 
Used for "list_nodes" operation.', + ), + sort_by: Optional[Literal['CREATION_TIME', 'NAME']] = Field( + default='CREATION_TIME', description='The field to sort results by...' + ), + sort_order: Optional[Literal['Ascending', 'Descending']] = Field( + default='Ascending', + description='The sort order for results. The default is Ascending. Used for "list_clusters" and "list_nodes" operations.', + ), + training_plan_arn: Optional[str] = Field( + None, + description='The Amazon Resource Name (ARN) of the training plan to filter clusters by. Used for "list_clusters" operation.', + ), + # Parameters for update_software operation + deployment_config: Optional[DeploymentConfiguration] = Field( + None, + description='The configuration to use when updating the AMI versions. Used for "update_software" operation.', + ), + instance_groups: Optional[List[UpdateClusterSoftwareInstanceGroupSpecification]] = Field( + None, + description='The array of instance groups for which to update AMI versions. Used for "update_software" operation.', + ), + # Common parameters + region_name: Optional[SUPPORTED_REGIONS] = Field( + 'us-east-1', + description='AWS region name. Default is us-east-1.', + ), + profile_name: Optional[str] = Field( + None, + description='AWS profile name. If not provided, uses the default profile.', + ), + ) -> Union[ + ListClustersResponse, + ListClusterNodesResponse, + DescribeClusterNodeResponse, + UpdateClusterSoftwareResponse, + BatchDeleteClusterNodesResponse, + ]: + """Manage SageMaker HyperPod clusters and nodes with both read and write operations. + + This tool provides operations for managing SageMaker HyperPod clusters and nodes, including listing clusters, + listing nodes, describing a specific node, updating cluster software, and deleting nodes. It serves as a consolidated + interface for all cluster and node-related operations, simplifying the management of HyperPod resources. 
+ + ## Operations + - **list_clusters**: List SageMaker HyperPod clusters with options for pagination and filtering + - **list_nodes**: List nodes in a SageMaker HyperPod cluster with options for pagination and filtering + - **describe_node**: Get detailed information about a specific node in a SageMaker HyperPod cluster + - **update_software**: Update the software for a SageMaker HyperPod cluster + - **batch_delete**: Delete multiple nodes from a SageMaker HyperPod cluster in a single operation + + ## Response Information + The response type varies based on the operation: + - list_clusters: Returns ListClustersResponse with a list of clusters + - list_nodes: Returns ListClusterNodesResponse with a list of nodes + - describe_node: Returns DescribeClusterNodeResponse with detailed node information + - update_software: Returns UpdateClusterSoftwareResponse with the cluster ARN + - batch_delete: Returns BatchDeleteClusterNodesResponse with details of the deletion operation + + ## Important Notes + - ALWAYS show the important notes for operations batch_delete and update_software BEFORE executing the operations + - For update_software: + The UpdateClusterSoftware API call may impact your SageMaker HyperPod cluster uptime and availability. Plan accordingly to mitigate potential disruptions to your workloads. + - For batch_delete: + - BEFORE running the tool, ALWAYS remind the user of all of the following + - To safeguard your work, back up your data to Amazon S3 or an FSx for Lustre file system before invoking + the API on a worker node group. This will help prevent any potential data loss from the instance root volume. + For more information about backup, see Use the backup script provided by SageMaker HyperPod: + https://docs.aws.amazon.com/sagemaker/latest/dg/sagemaker-hyperpod-backup-restore.html + - If you want to invoke this API on an existing cluster, you'll first need to patch the cluster by running + the UpdateClusterSoftware API.
For more information about patching a cluster, see Update the SageMaker + HyperPod platform software of a cluster: + https://docs.aws.amazon.com/sagemaker/latest/dg/sagemaker-hyperpod-update-software.html + - Deleting nodes will permanently remove them from the cluster + - This operation cannot be undone + - Ensure you have selected the correct nodes before proceeding + - This operation requires write access to be enabled for the handler + + ## Usage Tips + - Use "list_clusters" operation to get an overview of all available clusters in a specified region + - Use "list_nodes" operation to get an overview of all nodes in a specific cluster + - Use "describe_node" operation to get detailed information about a specific node + - Use "update_software" operation to update the software on all nodes or specific instance groups + - Use "batch_delete" operation to delete multiple nodes in a single request + - Specify region_name to operate on a cluster in a specific region + - Specify profile_name to use a specific AWS profile with appropriate permissions + + ## Fallback Options + - If this tool fails, advise using AWS SageMaker CLI alternatives: + - List clusters: `aws sagemaker list-clusters --region <region>` + - List nodes: `aws sagemaker list-cluster-nodes --cluster-name <cluster-name> --region <region>` + - Describe node: `aws sagemaker describe-cluster-node --cluster-name <cluster-name> --node-id <node-id> --region <region>` + - Update software: `aws sagemaker update-cluster-software --cluster-name <cluster-name> --region <region>` + - Delete nodes: `aws sagemaker batch-delete-cluster-nodes --cluster-name <cluster-name> --node-ids <node-ids> --region <region>` + - Or, as another alternative: Advise using the SageMaker HyperPod console for cluster and node management + + Args: + ctx: MCP context + operation: Operation to perform (list_clusters, list_nodes, describe_node, update_software, or batch_delete) + cluster_name: The name of the cluster (required for all operations except list_clusters) + node_id: The ID of the node (required for describe_node operation) + node_ids: List of node IDs
to delete (required for batch_delete operation) + max_results: Maximum number of results to return (for list_clusters and list_nodes operations) + next_token: Token for pagination (for list_clusters and list_nodes operations) + name_contains: Filter clusters by name (for list_clusters operation) + creation_time_after: Filter by creation time after (for list_clusters and list_nodes operations) + creation_time_before: Filter by creation time before (for list_clusters and list_nodes operations) + instance_group_name_contains: Filter by instance group name (for list_nodes operation) + sort_by: Sort field (for list_clusters and list_nodes operations) + sort_order: Sort order (for list_clusters and list_nodes operations) + training_plan_arn: Filter clusters by training plan ARN (for list_clusters operation) + deployment_config: Configuration for the update process (for update_software operation) + instance_groups: Specific instance groups to update (for update_software operation) + region_name: AWS region name (default: us-east-1) + profile_name: AWS profile name (optional) + + Returns: + Union[ListClustersResponse, ListClusterNodesResponse, DescribeClusterNodeResponse, UpdateClusterSoftwareResponse, BatchDeleteClusterNodesResponse]: + Response specific to the operation performed + """ + try: + # Validate operation-specific required parameters + if operation != 'list_clusters' and cluster_name is None: + raise ValueError( + 'cluster_name is required for all operations except list_clusters' + ) + if operation == 'describe_node' and node_id is None: + raise ValueError('node_id is required for describe_node operation') + if operation == 'batch_delete' and (node_ids is None or len(node_ids) == 0): + raise ValueError('node_ids is required for batch_delete operation') + + # Set default values for None parameters to satisfy type checker + if max_results is None: + max_results = 10 + + # Check if write access is disabled and trying to perform a mutating operation + if not 
self.allow_write and operation in [ + UPDATE_SOFTWARE_OPERATION, + BATCH_DELETE_OPERATION, + ]: + error_message = f'Operation {operation} is not allowed without write access' + log_with_request_id(ctx, LogLevel.ERROR, error_message) + + # Return appropriate response type based on operation + if operation == UPDATE_SOFTWARE_OPERATION: + return UpdateClusterSoftwareResponse( + isError=True, + content=[TextContent(type='text', text=error_message)], + cluster_arn='', + ) + elif operation == BATCH_DELETE_OPERATION: + # Ensure cluster_name is not None for the response + safe_cluster_name = cluster_name if cluster_name is not None else '' + return BatchDeleteClusterNodesResponse( + isError=True, + content=[TextContent(type='text', text=error_message)], + cluster_name=safe_cluster_name, + successful=[], + failed=None, + ) + + # Dispatch to the appropriate operation handler + if operation == LIST_CLUSTERS_OPERATION: + return await self._list_hp_clusters( + ctx=ctx, + max_results=max_results, + next_token=next_token, + name_contains=name_contains, + creation_time_after=creation_time_after, + creation_time_before=creation_time_before, + sort_by=sort_by, + sort_order=sort_order, + training_plan_arn=training_plan_arn, + region_name=region_name, + profile_name=profile_name, + ) + elif operation == LIST_NODES_OPERATION: + # Ensure cluster_name is not None + if cluster_name is None: + raise ValueError('cluster_name is required for list_nodes operation') + return await self._list_hp_cluster_nodes( + ctx=ctx, + cluster_name=cluster_name, + creation_time_after=creation_time_after, + creation_time_before=creation_time_before, + instance_group_name_contains=instance_group_name_contains, + max_results=max_results, + next_token=next_token, + sort_by=sort_by, + sort_order=sort_order, + region_name=region_name, + profile_name=profile_name, + ) + elif operation == DESCRIBE_NODE_OPERATION: + # Ensure cluster_name and node_id are not None + if cluster_name is None: + raise 
ValueError('cluster_name is required for describe_node operation') + if node_id is None: + raise ValueError('node_id is required for describe_node operation') + return await self._describe_hp_cluster_node( + ctx=ctx, + cluster_name=cluster_name, + node_id=node_id, + region_name=region_name, + profile_name=profile_name, + ) + elif operation == UPDATE_SOFTWARE_OPERATION: + # Ensure cluster_name is not None + if cluster_name is None: + raise ValueError('cluster_name is required for update_software operation') + return await self._update_hp_cluster_software( + ctx=ctx, + cluster_name=cluster_name, + deployment_config=deployment_config, + instance_groups=instance_groups, + region_name=region_name, + profile_name=profile_name, + ) + elif operation == BATCH_DELETE_OPERATION: + # Ensure cluster_name and node_ids are not None + if cluster_name is None: + raise ValueError('cluster_name is required for batch_delete operation') + if node_ids is None: + raise ValueError('node_ids is required for batch_delete operation') + return await self._batch_delete_hp_cluster_nodes( + ctx=ctx, + cluster_name=cluster_name, + node_ids=node_ids, + region_name=region_name, + profile_name=profile_name, + ) + else: + error_message = f'Invalid operation: {operation}. 
Must be one of: {LIST_CLUSTERS_OPERATION}, {LIST_NODES_OPERATION}, {DESCRIBE_NODE_OPERATION}, {UPDATE_SOFTWARE_OPERATION}, {BATCH_DELETE_OPERATION}' + log_with_request_id(ctx, LogLevel.ERROR, error_message) + # Default to ListClusterNodesResponse for invalid operations + return ListClusterNodesResponse( + isError=True, + content=[TextContent(type='text', text=error_message)], + nodes=[], + next_token=None, + ) + except ValueError as e: + # Re-raise ValueError for parameter validation errors + log_with_request_id(ctx, LogLevel.ERROR, f'Parameter validation error: {str(e)}') + raise + except Exception as e: + error_message = f'Error in manage_hyperpod_cluster_nodes: {str(e)}' + log_with_request_id(ctx, LogLevel.ERROR, error_message) + # Default to ListClusterNodesResponse for general exceptions + return ListClusterNodesResponse( + isError=True, + content=[TextContent(type='text', text=error_message)], + nodes=[], + next_token=None, + ) + + async def _list_hp_clusters( + self, + ctx: Context, + max_results: int = Field( + 10, + description='The maximum number of clusters to return in the response. Default: 10.', + ge=1, + le=100, + ), + next_token: Optional[str] = Field( + None, + description='If the response to a previous ListClusters request was truncated, the response includes a NextToken. To retrieve the next set of clusters, use the token in the next request.', + ), + name_contains: Optional[str] = Field( + None, + description='A filter that returns only clusters whose name contains the specified string.', + ), + creation_time_after: Optional[str] = Field( + None, + description='A filter that returns only clusters created after the specified time. Accepts formats: ISO 8601 (e.g., 2014-10-01T20:30:00.000Z), date only (e.g., 2014-10-01), or Unix time in seconds.', + ), + creation_time_before: Optional[str] = Field( + None, + description='A filter that returns only clusters created before the specified time. 
Accepts formats: ISO 8601 (e.g., 2014-10-01T20:30:00.000Z), date only (e.g., 2014-10-01), or Unix time in seconds.', + ), + sort_by: Optional[Literal['NAME', 'CREATION_TIME']] = Field( + default='CREATION_TIME', + description='The field to sort results by. The default is CREATION_TIME.', + ), + sort_order: Optional[Literal['Ascending', 'Descending']] = Field( + default='Ascending', + description='The sort order for results. The default is Ascending.', + ), + training_plan_arn: Optional[str] = Field( + None, + description='The Amazon Resource Name (ARN) of the training plan to filter clusters by.', + ), + region_name: Optional[SUPPORTED_REGIONS] = Field( + 'us-east-1', + description='AWS region name. Default is us-east-1.', + ), + profile_name: Optional[str] = Field( + None, + description='AWS profile name. If not provided, uses the default profile.', + ), + ) -> ListClustersResponse: + """List SageMaker HyperPod clusters. + + This tool lists SageMaker HyperPod clusters with options for pagination and filtering. + It returns information about each cluster including name, ARN, status, creation time, + and training plan ARNs. + + ## Response Information + The response includes a summary of each cluster with cluster name, ARN, status, + creation time, and training plan ARNs. 
+ + ## Usage Tips + - Use max_results and next_token for pagination when there are many clusters + - Use name_contains to filter clusters by name + - Use creation_time_after and creation_time_before to filter by creation time; the input should be formatted like 2014-10-01T20:30:00.000Z, 2014-10-01T12:30:00.000-08:00, 2014-10-01, or 1412195400 + - Use training_plan_arn to filter clusters by training plan + - Use sort_by and sort_order to control the order of results + - Specify region_name to list clusters in a specific region + - Specify profile_name to use a specific AWS profile with appropriate permissions + for SageMaker HyperPod operations + + Args: + ctx: MCP context + max_results: Maximum number of clusters to return (default: 10) + next_token: Token for pagination (optional) + name_contains: Filter clusters by name (optional) + creation_time_after: Filter by creation time after as string (example formats: 2014-10-01T20:30:00.000Z, 2014-10-01T12:30:00.000-08:00, 2014-10-01, 1412195400) (optional) + creation_time_before: Filter by creation time before as string (example formats: 2014-10-01T20:30:00.000Z, 2014-10-01T12:30:00.000-08:00, 2014-10-01, 1412195400) (optional) + sort_by: Sort field (default: CREATION_TIME) + sort_order: Sort order (default: Ascending) + training_plan_arn: Filter clusters by training plan ARN (optional) + region_name: AWS region name (default: us-east-1) + profile_name: AWS profile name (optional) + + Returns: + ListClustersResponse with list of clusters + """ + try: + # Get SageMaker client + sagemaker_client = self.get_sagemaker_client( + ctx, region_name=region_name, profile_name=profile_name + ) + + # Prepare parameters for list_clusters API call + params: dict[str, Any] = {} + + # Add parameters only if they are provided + if max_results is not None: + params['MaxResults'] = max_results + if next_token is not None: + params['NextToken'] = next_token + if name_contains is not None: + params['NameContains'] = name_contains + if 
creation_time_after is not None: + params['CreationTimeAfter'] = creation_time_after + if creation_time_before is not None: + params['CreationTimeBefore'] = creation_time_before + if sort_by is not None: + params['SortBy'] = sort_by + if sort_order is not None: + params['SortOrder'] = sort_order + if training_plan_arn is not None: + params['TrainingPlanArn'] = training_plan_arn + + # Call SageMaker API to list clusters + log_with_request_id( + ctx, LogLevel.INFO, f'Calling SageMaker list_clusters API with params: {params}' + ) + try: + response = sagemaker_client.list_clusters(**params) + log_with_request_id( + ctx, LogLevel.INFO, f'SageMaker list_clusters API response: {response}' + ) + except Exception as e: + log_with_request_id( + ctx, LogLevel.ERROR, f'SageMaker list_clusters API error: {str(e)}' + ) + raise + + # Extract clusters from response + clusters = [] + for cluster in response.get('ClusterSummaries', []): + cluster_summary = ClusterSummary( + cluster_name=cluster.get('ClusterName', ''), + cluster_arn=cluster.get('ClusterArn', ''), + cluster_status=cluster.get('ClusterStatus', ''), + creation_time=str(cluster.get('CreationTime', '')), + training_plan_arns=cluster.get('TrainingPlanArns'), + ) + clusters.append(cluster_summary) + + # Get next token for pagination + next_token_response = response.get('NextToken') + + # Log success + log_with_request_id( + ctx, + LogLevel.INFO, + f'Successfully listed {len(clusters)} SageMaker HyperPod clusters', + ) + + # Return success response + return ListClustersResponse( + isError=False, + content=[ + TextContent( + type='text', + text=f'Successfully listed {len(clusters)} SageMaker HyperPod clusters', + ) + ], + clusters=clusters, + next_token=next_token_response, + ) + + except Exception as e: + # Log error + error_msg = f'Failed to list SageMaker HyperPod clusters: {str(e)}' + log_with_request_id(ctx, LogLevel.ERROR, error_msg) + + # Return error response + return ListClustersResponse( + isError=True, + 
content=[TextContent(type='text', text=error_msg)], + clusters=[], + next_token=None, + ) + + async def describe_hp_cluster( + self, + ctx: Context, + cluster_name: str = Field( + ..., + description='The name of the cluster to describe.', + ), + region_name: SUPPORTED_REGIONS = Field( + description='AWS region name.', + ), + profile_name: Optional[str] = Field( + None, + description='AWS profile name. If not provided, uses the default profile.', + ), + ): + """Describe a SageMaker HyperPod cluster. + + Args: + ctx: MCP context + cluster_name: REQUIRED - Target cluster for the describe cluster API + region_name: REQUIRED - AWS region name + profile_name: AWS profile name (optional) + + ## Fallback Options: + - If this tool fails, advise using the AWS SageMaker CLI option: `aws sagemaker describe-cluster --cluster-name <cluster-name> --region <region>` + - Or as another alternative, advise checking directly in the SageMaker HyperPod console (Amazon SageMaker AI → HyperPod Clusters → Cluster Management → select cluster) + + Returns: + describe cluster response + """ + sagemaker_client = self.get_sagemaker_client( + ctx, region_name=region_name, profile_name=profile_name + ) + params = {'ClusterName': cluster_name} + response = sagemaker_client.describe_cluster(**params) + return response + + async def update_hp_cluster( + self, + ctx: Context, + cluster_name: str = Field( + ..., + description='The name of the cluster to update.', + ), + instance_groups: list = Field( + ..., + description='List of instance groups to update.', + ), + region_name: SUPPORTED_REGIONS = Field( + description='AWS region name.', + ), + profile_name: Optional[str] = Field( + None, + description='AWS profile name. If not provided, uses the default profile.', + ), + ): + """Update a SageMaker HyperPod cluster. + + Notes: + - before using this tool, ensure you have the most recent cluster instance group configurations by first calling the describe_hp_cluster tool. 
+ - modify the instance group configuration based on the user's request + - important: Use "InstanceCount" (NOT "CurrentCount" or "TargetCount") for the desired target count + - pass the configuration back in the instance group parameter + - example instance groups parameter: + "instance_groups": [ + { + "OverrideVpcConfig": { + "SecurityGroupIds": [ + "<>" + ], + "Subnets": [ + "<>" + ] + }, + "InstanceCount": <>, + "InstanceGroupName": "<>", + "InstanceStorageConfigs": [ + { + "EbsVolumeConfig": { + "VolumeSizeInGB": <> + } + } + ], + "LifeCycleConfig": { + "SourceS3Uri": "<>", + "OnCreate": "<>" + }, + "InstanceType": "<>", + "ThreadsPerCore": <>, + "ExecutionRole": "<>" + } + ], + + ## Fallback Options: + - If this tool fails, advise using the AWS SageMaker CLI option: `aws sagemaker update-cluster --region <region>` with all appropriate parameters + - Or as another alternative, advise making updates directly in the SageMaker HyperPod console (Amazon SageMaker AI → HyperPod Clusters → Cluster Management → select cluster → Edit) + - To verify results: use CLI `aws sagemaker describe-cluster --cluster-name <cluster-name>` or verify directly in the console + + Args: + ctx: MCP context + cluster_name: REQUIRED: cluster name to update + instance_groups: REQUIRED: instance group configurations + region_name: REQUIRED - AWS region name + profile_name: AWS profile name (optional) + + Returns: + update cluster response + """ + if not self.allow_write: + error_msg = 'Write access is not enabled for this handler. Cannot update cluster.' 
+ log_with_request_id(ctx, LogLevel.ERROR, error_msg) + return {'isError': True, 'errorMessage': error_msg} + + # First try-catch: Create SageMaker client and prepare parameters + try: + sagemaker_client = self.get_sagemaker_client( + ctx, region_name=region_name, profile_name=profile_name + ) + params = {'ClusterName': cluster_name, 'InstanceGroups': instance_groups} + except Exception as e: + error_msg = f'Failed to prepare SageMaker client or parameters: {str(e)}' + log_with_request_id(ctx, LogLevel.ERROR, error_msg) + return {'isError': True, 'errorMessage': error_msg} + + # Second try-catch: Make the API call + try: + log_with_request_id( + ctx, + LogLevel.INFO, + f'Calling SageMaker update_cluster API with params: {params}', + ) + response = sagemaker_client.update_cluster(**params) + log_with_request_id( + ctx, + LogLevel.INFO, + f'SageMaker update_cluster API response: {response}', + ) + except Exception as e: + error_msg = f'SageMaker update_cluster API error: {str(e)}' + log_with_request_id(ctx, LogLevel.ERROR, error_msg) + return {'isError': True, 'errorMessage': error_msg} + + # Log success + log_with_request_id( + ctx, + LogLevel.INFO, + f'Successfully updated SageMaker HyperPod cluster: {cluster_name}', + ) + + return response + + async def _describe_hp_cluster_node( + self, + ctx: Context, + cluster_name: str = Field( + ..., + description='The name of the cluster.', + ), + node_id: str = Field( + ..., + description='The ID of the SageMaker HyperPod cluster node.', + min_length=1, + max_length=256, + pattern=r'i-[a-f0-9]{8}(?:[a-f0-9]{9})?', + ), + region_name: Optional[SUPPORTED_REGIONS] = Field( + 'us-east-1', + description='AWS region name. Default is us-east-1.', + ), + profile_name: Optional[str] = Field( + None, + description='AWS profile name. If not provided, uses the default profile.', + ), + ) -> DescribeClusterNodeResponse: + """Describe a SageMaker HyperPod cluster node. 
+ + This tool describes a specific node in a SageMaker HyperPod cluster. + It returns detailed information about the node including instance group name, instance ID, instance status, + instance type, launch time, last software update time, and other configuration details. + + ## Response Information + The response includes detailed information about the node including: + - Instance group name and ID + - Instance status and type + - Launch time and last software update time + - Storage configurations + - Network configurations + - Placement information + - And more + + ## Usage Tips + - Use this tool to get detailed information about a specific node in a cluster + - You need both the cluster name and node ID to identify the node + - Specify region_name to describe a node in a specific region + - Specify profile_name to use a specific AWS profile with appropriate permissions + for SageMaker HyperPod operations + + Args: + ctx: MCP context + cluster_name: The name of the cluster + node_id: The ID of the SageMaker HyperPod cluster node + region_name: AWS region name (default: us-east-1) + profile_name: AWS profile name (optional) + + Returns: + DescribeClusterNodeResponse with node details + """ + try: + # Get SageMaker client + sagemaker_client = self.get_sagemaker_client( + ctx, region_name=region_name, profile_name=profile_name + ) + + # Prepare parameters for describe_cluster_node API call + params = {'ClusterName': cluster_name, 'NodeId': node_id} + + # Call SageMaker API to describe cluster node + log_with_request_id( + ctx, + LogLevel.INFO, + f'Calling SageMaker describe_cluster_node API with params: {params}', + ) + try: + response = sagemaker_client.describe_cluster_node(**params) + log_with_request_id( + ctx, LogLevel.INFO, f'SageMaker describe_cluster_node API response: {response}' + ) + except Exception as e: + log_with_request_id( + ctx, LogLevel.ERROR, f'SageMaker describe_cluster_node API error: {str(e)}' + ) + raise + + # Extract node details from 
response + node_details_data = response.get('NodeDetails', {}) + + # Extract instance status details + instance_status_data = node_details_data.get('InstanceStatus', {}) + instance_status_details = ClusterInstanceStatusDetails( + status=instance_status_data.get( + 'Status', 'Pending' + ), # Default to Pending if not provided + message=instance_status_data.get('Message'), + ) + + # Process instance storage configs + instance_storage_configs = [] + for storage_config in node_details_data.get('InstanceStorageConfigs', []): + # Process EBS volume config + ebs_volume_config = None + if 'EbsVolumeConfig' in storage_config: + ebs_volume_config = ClusterEbsVolumeConfig( + volume_size_in_gb=storage_config['EbsVolumeConfig'].get('VolumeSizeInGB') + ) + + # Create instance storage config + instance_storage_config = ClusterInstanceStorageConfig( + ebs_volume_config=ebs_volume_config + ) + instance_storage_configs.append(instance_storage_config) + + # Process life cycle config + life_cycle_config = None + if ( + 'LifeCycleConfig' in node_details_data + and node_details_data['LifeCycleConfig'].get('OnCreate') + and node_details_data['LifeCycleConfig'].get('SourceS3Uri') + ): + life_cycle_config = ClusterLifeCycleConfig( + on_create=node_details_data['LifeCycleConfig'].get('OnCreate'), + source_s3_uri=node_details_data['LifeCycleConfig'].get('SourceS3Uri'), + ) + + # Process override VPC config + override_vpc_config = None + if 'OverrideVpcConfig' in node_details_data: + override_vpc_config = VpcConfig( + security_group_ids=node_details_data['OverrideVpcConfig'].get( + 'SecurityGroupIds' + ), + subnets=node_details_data['OverrideVpcConfig'].get('Subnets'), + ) + + # Process placement + placement = None + if 'Placement' in node_details_data: + placement = ClusterInstancePlacement( + availability_zone=node_details_data['Placement'].get('AvailabilityZone'), + availability_zone_id=node_details_data['Placement'].get('AvailabilityZoneId'), + ) + + # Create node details + node_details = 
ClusterNodeDetails( + instance_group_name=node_details_data.get('InstanceGroupName', ''), + instance_id=node_details_data.get('InstanceId', ''), + instance_status=instance_status_details, + instance_storage_configs=instance_storage_configs + if instance_storage_configs + else None, + instance_type=node_details_data.get('InstanceType', ''), + last_software_update_time=str(node_details_data.get('LastSoftwareUpdateTime')) + if node_details_data.get('LastSoftwareUpdateTime') + else None, + launch_time=str(node_details_data.get('LaunchTime')) + if node_details_data.get('LaunchTime') + else None, + life_cycle_config=life_cycle_config, + override_vpc_config=override_vpc_config, + placement=placement, + private_dns_hostname=node_details_data.get('PrivateDnsHostname'), + private_primary_ip=node_details_data.get('PrivatePrimaryIp'), + private_primary_ipv6=node_details_data.get('PrivatePrimaryIpv6'), + threads_per_core=node_details_data.get('ThreadsPerCore'), + ) + + # Log success + log_with_request_id( + ctx, + LogLevel.INFO, + f'Successfully described SageMaker HyperPod cluster node: {node_id}', + ) + + # Return success response + return DescribeClusterNodeResponse( + isError=False, + content=[ + TextContent( + type='text', + text=f'Successfully described SageMaker HyperPod cluster node: {node_id}', + ) + ], + node_details=node_details, + ) + + except Exception as e: + # Log error + error_msg = f'Failed to describe SageMaker HyperPod cluster node: {str(e)}' + log_with_request_id(ctx, LogLevel.ERROR, error_msg) + + # Return error response + return DescribeClusterNodeResponse( + isError=True, + content=[TextContent(type='text', text=error_msg)], + node_details=None, + ) + + async def _list_hp_cluster_nodes( + self, + ctx: Context, + cluster_name: str = Field( + ..., + description='The name of the cluster.', + ), + creation_time_after: Optional[str] = Field( + None, + description='Filter for nodes created after the specified time. 
Accepts formats: ISO 8601 (e.g., 2014-10-01T20:30:00Z), date only (e.g., 2014-10-01), or Unix time in seconds.', + ), + creation_time_before: Optional[str] = Field( + None, + description='Filter for nodes created before the specified time. Accepts formats: ISO 8601 (e.g., 2014-10-01T20:30:00Z), date only (e.g., 2014-10-01), or Unix time in seconds.', + ), + instance_group_name_contains: Optional[str] = Field( + None, + description='Filter for nodes in instance groups whose name contains the specified string.', + ), + max_results: int = Field( + 10, + description='The maximum number of nodes to return in the response. Default: 10.', + ge=1, + le=100, + ), + next_token: Optional[str] = Field( + None, + description='If the response to a previous ListClusterNodes request was truncated, the response includes a NextToken. To retrieve the next set of nodes, use the token in the next request.', + ), + sort_by: Optional[Literal['CREATION_TIME', 'NAME']] = Field( + default='CREATION_TIME', + description='The field to sort results by. The default is CREATION_TIME.', + ), + sort_order: Optional[Literal['Ascending', 'Descending']] = Field( + default='Ascending', + description='The sort order for results. The default is Ascending.', + ), + region_name: Optional[SUPPORTED_REGIONS] = Field( + 'us-east-1', + description='AWS region name. Default is us-east-1.', + ), + profile_name: Optional[str] = Field( + None, + description='AWS profile name. If not provided, uses the default profile.', + ), + ) -> ListClusterNodesResponse: + """List SageMaker HyperPod cluster nodes. + + This tool lists nodes in a SageMaker HyperPod cluster with options for pagination and filtering. + It returns information about each node including instance group name, instance ID, instance status, + instance type, launch time, and last software update time. 
+ + ## Response Information + The response includes a summary of each node with instance group name, instance ID, instance status, + instance type, launch time, and last software update time. + + ## Usage Tips + - Use max_results and next_token for pagination when there are many nodes + - Use instance_group_name_contains to filter nodes by instance group name + - Use creation_time_after and creation_time_before to filter by creation time + - Use sort_by and sort_order to control the order of results + - Specify region_name to list nodes in a specific region + - Specify profile_name to use a specific AWS profile with appropriate permissions + for SageMaker HyperPod operations + + Args: + ctx: MCP context + cluster_name: The name of the cluster + creation_time_after: Filter by creation time after as string (optional) + creation_time_before: Filter by creation time before as string (optional) + instance_group_name_contains: Filter by instance group name (optional) + max_results: Maximum number of nodes to return (default: 10) + next_token: Token for pagination (optional) + sort_by: Sort field (default: CREATION_TIME) + sort_order: Sort order (default: Ascending) + region_name: AWS region name (default: us-east-1) + profile_name: AWS profile name (optional) + + Returns: + ListClusterNodesResponse with list of nodes + """ + try: + # Get SageMaker client + sagemaker_client = self.get_sagemaker_client( + ctx, region_name=region_name, profile_name=profile_name + ) + + # Prepare parameters for list_cluster_nodes API call + params: dict[str, Any] = {'ClusterName': cluster_name} + + # Add parameters only if they are provided + if max_results is not None: + params['MaxResults'] = max_results + if next_token is not None: + params['NextToken'] = next_token + if instance_group_name_contains is not None: + params['InstanceGroupNameContains'] = instance_group_name_contains + if creation_time_after is not None: + params['CreationTimeAfter'] = creation_time_after + if 
creation_time_before is not None: + params['CreationTimeBefore'] = creation_time_before + if sort_by is not None: + params['SortBy'] = sort_by + if sort_order is not None: + params['SortOrder'] = sort_order + + # Call SageMaker API to list cluster nodes + log_with_request_id( + ctx, + LogLevel.INFO, + f'Calling SageMaker list_cluster_nodes API with params: {params}', + ) + try: + response = sagemaker_client.list_cluster_nodes(**params) + log_with_request_id( + ctx, LogLevel.INFO, f'SageMaker list_cluster_nodes API response: {response}' + ) + except Exception as e: + log_with_request_id( + ctx, LogLevel.ERROR, f'SageMaker list_cluster_nodes API error: {str(e)}' + ) + raise + + # Extract nodes from response + nodes = [] + for node in response.get('ClusterNodeSummaries', []): + # Extract instance status details + instance_status_data = node.get('InstanceStatus', {}) + instance_status_details = ClusterInstanceStatusDetails( + status=instance_status_data.get( + 'Status', 'Pending' + ), # Default to Pending if not provided + message=instance_status_data.get('Message'), + ) + + node_summary = ClusterNodeSummary( + instance_group_name=node.get('InstanceGroupName', ''), + instance_id=node.get('InstanceId', ''), + instance_status=instance_status_details, + instance_type=node.get('InstanceType', ''), + launch_time=str(node.get('LaunchTime', '')), + last_software_update_time=str(node.get('LastSoftwareUpdateTime', '')) + if node.get('LastSoftwareUpdateTime') + else None, + ) + nodes.append(node_summary) + + # Get next token for pagination + next_token_response = response.get('NextToken') + + # Log success + log_with_request_id( + ctx, + LogLevel.INFO, + f'Successfully listed {len(nodes)} SageMaker HyperPod cluster nodes', + ) + + # Return success response + return ListClusterNodesResponse( + isError=False, + content=[ + TextContent( + type='text', + text=f'Successfully listed {len(nodes)} SageMaker HyperPod cluster nodes', + ) + ], + nodes=nodes, + 
next_token=next_token_response, + ) + + except Exception as e: + # Log error + error_msg = f'Failed to list SageMaker HyperPod cluster nodes: {str(e)}' + log_with_request_id(ctx, LogLevel.ERROR, error_msg) + + # Return error response + return ListClusterNodesResponse( + isError=True, + content=[TextContent(type='text', text=error_msg)], + nodes=[], + next_token=None, + ) + + async def _update_hp_cluster_software( + self, + ctx: Context, + cluster_name: str = Field( + ..., + description='The name or ARN of the SageMaker HyperPod cluster to update for security patching.', + min_length=0, + max_length=256, + pattern=r'(arn:aws[a-z\-]*:sagemaker:[a-z0-9\-]*:[0-9]{12}:cluster/[a-z0-9]{12})|([a-zA-Z0-9](-*[a-zA-Z0-9]){0,62})', + ), + deployment_config: Optional[DeploymentConfiguration] = Field( + None, + description='The configuration to use when updating the AMI versions.', + ), + instance_groups: Optional[List[UpdateClusterSoftwareInstanceGroupSpecification]] = Field( + None, + description='The array of instance groups for which to update AMI versions.', + ), + region_name: Optional[SUPPORTED_REGIONS] = Field( + 'us-east-1', + description='AWS region name. Default is us-east-1.', + ), + profile_name: Optional[str] = Field( + None, + description='AWS profile name. If not provided, uses the default profile.', + ), + ) -> UpdateClusterSoftwareResponse: + """Update the software for a SageMaker HyperPod cluster. + + This tool updates the software for a SageMaker HyperPod cluster. + It initiates a software update for all nodes in the cluster. + + ## Response Information + The response includes the ARN of the cluster being updated. 
+ + ## Usage Tips + - Use this tool to update the software on all nodes in a SageMaker HyperPod cluster + - Specify instance_groups to update only specific instance groups in the cluster + - Configure deployment_config to control how the update is performed: + - Use auto_rollback_configuration to specify alarms that trigger rollback + - Use rolling_update_policy to control batch sizes during updates + - Use wait_interval_in_seconds to control the wait time between updates + - The update process may take some time to complete + - You can check the status of the update using the list_hp_cluster_nodes tool + - Specify region_name to update a cluster in a specific region + - Specify profile_name to use a specific AWS profile with appropriate permissions + for SageMaker HyperPod operations + + Args: + ctx: MCP context + cluster_name: The name or ARN of the cluster to update + deployment_config: Configuration for the update process (optional) + instance_groups: Specific instance groups to update (optional) + region_name: AWS region name (default: us-east-1) + profile_name: AWS profile name (optional) + + Returns: + UpdateClusterSoftwareResponse with cluster ARN + """ + try: + # Get SageMaker client + sagemaker_client = self.get_sagemaker_client( + ctx, region_name=region_name, profile_name=profile_name + ) + + # Prepare parameters for update_cluster_software API call + params: dict[str, Any] = {'ClusterName': cluster_name} + + # Add deployment configuration if provided + if deployment_config: + deployment_config_dict: dict[str, Any] = {} + + # Add auto rollback configuration if provided + if deployment_config.auto_rollback_configuration: + auto_rollback_config = [] + for alarm in deployment_config.auto_rollback_configuration: + auto_rollback_config.append({'AlarmName': alarm.alarm_name}) + if auto_rollback_config: + deployment_config_dict['AutoRollbackConfiguration'] = auto_rollback_config + + # Add rolling update policy if provided + if 
deployment_config.rolling_update_policy: + rolling_update_policy = {} + + # Add maximum batch size if provided + if deployment_config.rolling_update_policy.maximum_batch_size: + maximum_batch_size = { + 'Type': deployment_config.rolling_update_policy.maximum_batch_size.type, + 'Value': deployment_config.rolling_update_policy.maximum_batch_size.value, + } + rolling_update_policy['MaximumBatchSize'] = maximum_batch_size + + # Add rollback maximum batch size if provided + if deployment_config.rolling_update_policy.rollback_maximum_batch_size: + rollback_maximum_batch_size = { + 'Type': deployment_config.rolling_update_policy.rollback_maximum_batch_size.type, + 'Value': deployment_config.rolling_update_policy.rollback_maximum_batch_size.value, + } + rolling_update_policy['RollbackMaximumBatchSize'] = ( + rollback_maximum_batch_size + ) + + if rolling_update_policy: + deployment_config_dict['RollingUpdatePolicy'] = rolling_update_policy + + # Add wait interval in seconds if provided + if deployment_config.wait_interval_in_seconds is not None: + deployment_config_dict['WaitIntervalInSeconds'] = ( + deployment_config.wait_interval_in_seconds + ) + + # Add deployment config to params if not empty + if deployment_config_dict: + params['DeploymentConfig'] = deployment_config_dict + + # Add instance groups if provided + if instance_groups: + instance_groups_list = [] + for group in instance_groups: + instance_groups_list.append({'InstanceGroupName': group.instance_group_name}) + if instance_groups_list: + params['InstanceGroups'] = instance_groups_list + + # Call SageMaker API to update cluster software + log_with_request_id( + ctx, + LogLevel.INFO, + f'Calling SageMaker update_cluster_software API with params: {params}', + ) + try: + response = sagemaker_client.update_cluster_software(**params) + log_with_request_id( + ctx, + LogLevel.INFO, + f'SageMaker update_cluster_software API response: {response}', + ) + except Exception as e: + log_with_request_id( + ctx, 
LogLevel.ERROR, f'SageMaker update_cluster_software API error: {str(e)}' + ) + raise + + # Extract cluster ARN from response + cluster_arn = response.get('ClusterArn', '') + + # Log success + log_with_request_id( + ctx, + LogLevel.INFO, + f'Successfully initiated software update for SageMaker HyperPod cluster: {cluster_name}', + ) + + # Return success response + return UpdateClusterSoftwareResponse( + isError=False, + content=[ + TextContent( + type='text', + text=f'Successfully initiated software update for SageMaker HyperPod cluster: {cluster_name}', + ) + ], + cluster_arn=cluster_arn, + ) + + except Exception as e: + # Log error + error_msg = f'Failed to update software for SageMaker HyperPod cluster: {str(e)}' + log_with_request_id(ctx, LogLevel.ERROR, error_msg) + + # Return error response + return UpdateClusterSoftwareResponse( + isError=True, + content=[TextContent(type='text', text=error_msg)], + cluster_arn='', + ) + + async def _batch_delete_hp_cluster_nodes( + self, + ctx: Context, + cluster_name: str = Field( + ..., + description='The name of the cluster.', + min_length=0, + max_length=256, + pattern=r'(arn:aws[a-z\-]*:sagemaker:[a-z0-9\-]*:[0-9]{12}:cluster/[a-z0-9]{12})|([a-zA-Z0-9](-*[a-zA-Z0-9]){0,62})', + ), + node_ids: List[str] = Field( + ..., + description='The list of node IDs to delete from the cluster.', + min_length=1, + max_length=99, + ), + region_name: Optional[SUPPORTED_REGIONS] = Field( + 'us-east-1', + description='AWS region name. Default is us-east-1.', + ), + profile_name: Optional[str] = Field( + None, + description='AWS profile name. If not provided, uses the default profile.', + ), + ) -> BatchDeleteClusterNodesResponse: + """Delete multiple nodes from a SageMaker HyperPod cluster. + + This tool deletes multiple nodes from a SageMaker HyperPod cluster in a single operation. + It returns information about the deleted nodes and any failures that occurred during deletion. 
+ + ## Response Information + The response includes the cluster name, a list of successfully deleted node IDs, + and details about any failed node deletions. + + ## Note + - For SageMaker HyperPod clusters using the Slurm workload manager, you cannot remove instances that are + configured as Slurm controller nodes. + - If you need to delete more than 99 instances, contact Support for assistance. + + ## Usage Tips + - Use this tool to delete multiple nodes from a cluster in a single operation + - You can delete up to 99 nodes in a single request + - If some node deletions fail, the response will include details about the failures + - Specify region_name to delete nodes in a specific region + - Specify profile_name to use a specific AWS profile with appropriate permissions + for SageMaker HyperPod operations + + Args: + ctx: MCP context + cluster_name: The name of the cluster + node_ids: List of node IDs to delete from the cluster + region_name: AWS region name (default: us-east-1) + profile_name: AWS profile name (optional) + + Returns: + BatchDeleteClusterNodesResponse with details of the deletion operation + """ + try: + # Get SageMaker client + sagemaker_client = self.get_sagemaker_client( + ctx, region_name=region_name, profile_name=profile_name + ) + + # Prepare parameters for batch_delete_cluster_nodes API call + params = {'ClusterName': cluster_name, 'NodeIds': node_ids} + + # Call SageMaker API to batch delete cluster nodes + log_with_request_id( + ctx, + LogLevel.INFO, + f'Calling SageMaker batch_delete_cluster_nodes API with params: {params}', + ) + try: + response = sagemaker_client.batch_delete_cluster_nodes(**params) + log_with_request_id( + ctx, + LogLevel.INFO, + f'SageMaker batch_delete_cluster_nodes API response: {response}', + ) + except Exception as e: + log_with_request_id( + ctx, + LogLevel.ERROR, + f'SageMaker batch_delete_cluster_nodes API error: {str(e)}', + ) + raise + + # Extract successful and failed deletions from response + 
successful_node_ids = response.get('Successful', []) + failed_deletions = response.get('Failed', []) + + # Convert failed deletions to BatchDeleteClusterNodesError objects + failed_deletions_list = [] + for failure in failed_deletions: + failed_deletions_list.append( + BatchDeleteClusterNodesError( + code=failure.get('Code', ''), + message=failure.get('Message', ''), + node_id=failure.get('NodeId', ''), + ) + ) + + # Log success + log_with_request_id( + ctx, + LogLevel.INFO, + f'Successfully deleted {len(successful_node_ids)} nodes from SageMaker HyperPod cluster: {cluster_name}', + ) + if failed_deletions_list: + log_with_request_id( + ctx, + LogLevel.WARNING, + f'Failed to delete {len(failed_deletions_list)} nodes from SageMaker HyperPod cluster: {cluster_name}', + ) + + # Return success response + return BatchDeleteClusterNodesResponse( + isError=False, + content=[ + TextContent( + type='text', + text=f'Successfully deleted {len(successful_node_ids)} nodes from SageMaker HyperPod cluster: {cluster_name}. 
Failed deletions: {len(failed_deletions_list)}', + ) + ], + cluster_name=cluster_name, + successful=successful_node_ids, + failed=failed_deletions_list if failed_deletions_list else None, + ) + + except Exception as e: + # Log error + error_msg = f'Failed to delete nodes from SageMaker HyperPod cluster: {str(e)}' + log_with_request_id(ctx, LogLevel.ERROR, error_msg) + + # Return error response + return BatchDeleteClusterNodesResponse( + isError=True, + content=[TextContent(type='text', text=error_msg)], + cluster_name=cluster_name, + successful=[], + failed=None, + ) diff --git a/src/sagemaker-hyperpod-mcp-server/awslabs/sagemaker_hyperpod_mcp_server/hyperpod_stack_handler.py b/src/sagemaker-hyperpod-mcp-server/awslabs/sagemaker_hyperpod_mcp_server/hyperpod_stack_handler.py new file mode 100644 index 0000000000..172e1dab48 --- /dev/null +++ b/src/sagemaker-hyperpod-mcp-server/awslabs/sagemaker_hyperpod_mcp_server/hyperpod_stack_handler.py @@ -0,0 +1,702 @@ +# Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved. +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. 
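The `_batch_delete_hp_cluster_nodes` handler above splits the boto3 `batch_delete_cluster_nodes` response into the `Successful` node IDs and the `Failed` error records. A minimal, self-contained sketch of that shaping logic — the dataclass and function names here are illustrative stand-ins, not the server's actual models:

```python
from dataclasses import dataclass
from typing import Any, Dict, List, Tuple


@dataclass
class NodeDeletionError:
    """Illustrative analogue of BatchDeleteClusterNodesError."""

    code: str
    message: str
    node_id: str


def summarize_batch_delete(
    response: Dict[str, Any],
) -> Tuple[List[str], List[NodeDeletionError]]:
    """Split an API-shaped response dict into successful IDs and failure records."""
    successful = response.get('Successful', [])
    failed = [
        NodeDeletionError(
            code=f.get('Code', ''),
            message=f.get('Message', ''),
            node_id=f.get('NodeId', ''),
        )
        for f in response.get('Failed', [])
    ]
    return successful, failed


# Input mirrors the boto3 response keys; the IDs and codes are made up.
ok, bad = summarize_batch_delete({
    'Successful': ['i-0abc', 'i-0def'],
    'Failed': [{'Code': 'NodeIdInUse', 'Message': 'Slurm controller node', 'NodeId': 'i-0ghi'}],
})
```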
+ +"""HyperPod stack handler for the HyperPod MCP Server.""" + +import json +import yaml # type: ignore +from awslabs.sagemaker_hyperpod_mcp_server.aws_helper import AwsHelper +from awslabs.sagemaker_hyperpod_mcp_server.consts import ( + CAPABILITY_AUTO_EXPAND, + CFN_CAPABILITY_IAM, + CFN_CAPABILITY_NAMED_IAM, + CFN_ON_FAILURE_DELETE, + CFN_STACK_TAG_KEY, + CFN_STACK_TAG_VALUE, + CLUSTER_ORCHESTRATORS, + HYPERPOD_CFN_TEMPLATE_URL_EKS, + HYPERPOD_CFN_TEMPLATE_URL_SLURM, + STACK_DELETE_OPERATION, + STACK_DEPLOY_OPERATION, + STACK_DESCRIBE_OPERATION, + STACK_NOT_OWNED_ERROR_TEMPLATE, + STACK_OPERATIONS, + SUPPORTED_REGIONS, +) +from awslabs.sagemaker_hyperpod_mcp_server.logging_helper import LogLevel, log_with_request_id +from awslabs.sagemaker_hyperpod_mcp_server.models import ( + DeleteStackResponse, + DeployStackResponse, + DescribeStackResponse, +) +from mcp.server.fastmcp import Context +from mcp.types import TextContent +from pydantic import Field, validate_call +from typing import Dict, List, Optional, Tuple, Union +from yaml.loader import SafeLoader # type: ignore + + +# Custom YAML loader for CloudFormation templates +class CloudFormationLoader(SafeLoader): + """Custom YAML loader that handles CloudFormation intrinsic functions.""" + + pass + + +# Add constructors for CloudFormation intrinsic functions +def construct_cfn_tag(loader, tag_suffix, node): + """Generic constructor for CloudFormation intrinsic functions.""" + if isinstance(node, yaml.ScalarNode): + return {tag_suffix: loader.construct_scalar(node)} + elif isinstance(node, yaml.SequenceNode): + return {tag_suffix: loader.construct_sequence(node)} + elif isinstance(node, yaml.MappingNode): + return {tag_suffix: loader.construct_mapping(node)} + else: + return None + + +# Register constructors for common CloudFormation intrinsic functions +for tag in [ + 'Ref', + 'Condition', + 'GetAtt', + 'Equals', + 'If', + 'Not', + 'And', + 'Or', + 'FindInMap', + 'Base64', + 'Join', + 'Sub', + 'Select', + 'Split', 
+ 'ImportValue', + 'GetAZs', + 'Transform', + 'ForEach', +]: + CloudFormationLoader.add_constructor( + f'!{tag}', lambda loader, node, tag=tag: construct_cfn_tag(loader, tag, node) + ) + + +class HyperPodStackHandler: + """Handler for Amazon HyperPod CloudFormation stack operations. + + This class provides tools for creating, managing, and deleting CloudFormation + stacks for HyperPod clusters. + """ + + def __init__(self, mcp, allow_write: bool = False): + """Initialize the HyperPod stack handler. + + Args: + mcp: The MCP server instance + allow_write: Whether to enable write access (default: False) + """ + self.mcp = mcp + self.allow_write = allow_write + + # Register tools + self.mcp.tool(name='manage_hyperpod_stacks')(self.manage_hyperpod_stacks) + + @validate_call + def _ensure_stack_ownership( + self, + ctx: Context, + stack_name: str, + region_name: SUPPORTED_REGIONS, + operation: str, + ) -> Tuple[bool, Optional[Dict], Optional[str]]: + """Ensure that a stack exists and was created by this tool. 
+ + Args: + ctx: The MCP context + stack_name: Name of the stack to verify + region_name: region to perform the API call in + operation: Operation being performed (for error messages) + + Returns: + Tuple of (success, stack_details, error_message) + - success: True if the stack exists and was created by this tool + - stack_details: Stack details if the stack exists, None otherwise + - error_message: Error message if the stack doesn't exist or wasn't created by this tool, None if successful + """ + try: + # Create CloudFormation client + cfn_client = AwsHelper.create_boto3_client('cloudformation', region_name) + + # Get stack details + stack_details = cfn_client.describe_stacks(StackName=stack_name) + stack = stack_details['Stacks'][0] + + # Verify the stack was created by our tool + tags = stack.get('Tags', []) + is_our_stack = False + for tag in tags: + if tag.get('Key') == CFN_STACK_TAG_KEY and tag.get('Value') == CFN_STACK_TAG_VALUE: + is_our_stack = True + break + + if not is_our_stack: + error_message = STACK_NOT_OWNED_ERROR_TEMPLATE.format( + stack_name=stack_name, tool_name=CFN_STACK_TAG_VALUE, operation=operation + ) + log_with_request_id(ctx, LogLevel.ERROR, error_message) + return False, stack, error_message + + return True, stack, None + except Exception as e: + if 'does not exist' in str(e): + error_message = f'Stack {stack_name} not found or cannot be accessed: {str(e)}' + else: + error_message = f'Error verifying stack ownership: {str(e)}' + + log_with_request_id(ctx, LogLevel.ERROR, error_message) + return False, None, error_message + + @validate_call + async def manage_hyperpod_stacks( + self, + ctx: Context, + operation: STACK_OPERATIONS = Field( + description='Operation to perform: deploy, describe, or delete. Choose "describe" for read-only operations when write access is disabled.', + ), + region_name: SUPPORTED_REGIONS = Field( + description='AWS region name. 
Default is us-east-1.', + ), + stack_name: str = Field( + description='Name of the CloudFormation stack (for deploy, describe and delete operations).', + ), + cluster_orchestrator: CLUSTER_ORCHESTRATORS = Field( + 'eks', + description='Cluster orchestrator type. Must be either "eks" or "slurm". Default is "eks".', + ), + params_file: Optional[str] = Field( + None, + description="""Absolute path for the CloudFormation template parameters (for deploy operations). + IMPORTANT: Assistant must provide the full absolute path to the template file, as the MCP client and server might not run from the same location.""", + ), + profile_name: Optional[str] = Field( + None, + description='AWS profile name. If not provided, uses the default profile.', + ), + ) -> Union[ + 'DeployStackResponse', + 'DescribeStackResponse', + 'DeleteStackResponse', + ]: + r"""Manage a SageMaker HyperPod cluster through CloudFormation stacks. + + This tool provides operations for managing HyperPod CloudFormation stacks, including creating parameters for the CloudFormation template, + deploying stacks, retrieving HyperPod stack and deployment information, and deleting HyperPod stacks. It serves as the primary + mechanism for creating and managing HyperPod clusters through CloudFormation, enabling standardized + cluster creation, configuration updates, and resource cleanup. + + ## Notes + - Tell the user about the working directory, which is the current directory. The tool will use this directory to store all required files for the user. + - After you ask a question, do NOT do anything until you get the user's response; do NOT run manage_hyperpod_stacks yet. + - Use this tool instead of direct AWS CLI commands for creating and managing HyperPod resources. + - Use this tool's standardized parameters for creating HyperPod clusters with proper configuration. + - DO NOT create HyperPod clusters by generating CloudFormation templates from scratch.
+ - when the user asks to create a HyperPod cluster, NEVER ask to check what HyperPod clusters the user currently has + - CRITICAL: when the user asks to delete a HyperPod cluster, NEVER ask how the user's HyperPod cluster was created; just proceed with the 'delete' operation. The corresponding CloudFormation stack name should be in this format: "-stack". If no such stack exists, then the HyperPod cluster might not have been created via the MCP tools here. + + ## Parameter Collection Process + IMPORTANT: ALWAYS first ask for ALL operation-specific REQUIRED parameters from the user BEFORE making any tool calls. NEVER assume or generate parameter values. + IMPORTANT: ALWAYS ask one question at a time. + + For 'deploy' operation: + - region_name: REQUIRED - ask the user for the region of deployment. Limit user input to the major US regions. Ensure this argument matches the AvailabilityZoneIds parameter key. + - allow the following selections: + • us-east-1 (N. Virginia) + • us-east-2 (Ohio) + • us-west-1 (N. California) + • us-west-2 (Oregon) + - stack_name: REQUIRED - generate a stack name and present it to the user. It should be in this format: "-stack". + - cluster_orchestrator: REQUIRED - ask the user to specify "eks" or "slurm" + - params_file: REQUIRED - the parameters file should follow the format below. Ask the user to customize the parameters marked as "" one by one. At the end, ask the user if they want to add additional instance groups. + - when cluster_orchestrator is "slurm", the InstanceGroupSettings ParameterValue should also include an InstanceGroupType of value Compute, Controller, or Login; place it right after InstanceType. At least 1 Controller and 1 Compute node group are required. ONLY 1 Controller and 1 Login group are allowed throughout ALL specified InstanceGroupSettings. + - when asking questions regarding InstanceGroupSettings, ask the user for both the number of instances and the instance type at the same time.
Naming format: "-params.json" + [ + { + "ParameterKey": "HyperPodClusterName", + "ParameterValue": "" // can be any natural English words, fewer than 32 characters, no spaces. The default should be sagemaker-hyperpod + }, + { + "ParameterKey": "ResourceNamePrefix", + "ParameterValue": "hp--" // Generate a new 6-digit hash randomly and silently; do NOT use execute_bash commands. + }, + { + "ParameterKey": "Stage", // do not reveal this to the user, but always include this key and value. + "ParameterValue": "prod" + }, + { + "ParameterKey": "AvailabilityZoneIds", + "ParameterValue": "" + }, + { + "ParameterKey": "FsxAvailabilityZoneId", + "ParameterValue": "" + }, + { + "ParameterKey": "InstanceGroupSettings1", // HyperPod requires at least 1 instance group. By default add this instance group. Ask the user if they want additional instance groups. For each new instance group, update the counter in the key. There can be at most 20 instance groups. + "ParameterValue": "[{\"InstanceCount\":,\"InstanceGroupName\":\"-group-\",\"InstanceType\":\"\",\"TargetAvailabilityZoneId\":\"\",\"InstanceStorageConfigs\":[{\"EbsVolumeConfig\":{\"VolumeSizeInGB\":500}}]}]" + }, + { + "ParameterKey": "InstanceGroupSettings2", // additional instance group template + "ParameterValue": .... + }, + ... + ] + + - Available AZ IDs in each region: + - us-east-1: use1-az1, az2, az4, az5, az6 + - us-east-2: use2-az1, az2, az3 + - us-west-1: usw1-az1, az3 + - us-west-2: usw2-az1, az2, az3, az4 + + For 'describe' and 'delete' operations: + - stack_name: REQUIRED - the stack name to operate on. You should confirm with the user that the correct stack is being operated on. + - region_name: REQUIRED - ask the user for the region if not clear from context.
+ + ## Requirements + - The server must be run with the `--allow-write` flag for deploy and delete operations + - For deploy and delete operations, the stack must have been created by this tool + - For the params_file parameter, the path must be absolute and accessible to the server + + ## Operations + - **deploy**: Create or update a HyperPod cluster using the CloudFormation template and user-specified parameters. + - **describe**: Gather information about a HyperPod cluster deployed via a CloudFormation stack by this tool. + - **delete**: Delete a HyperPod cluster via a CloudFormation stack created by this tool. + + ## Response Information + The response type varies based on the operation: + - deploy: Returns DeployStackResponse with stack name, ARN, and stack name prefix + - describe: Returns DescribeStackResponse with stack details, outputs, and status + - delete: Returns DeleteStackResponse with stack name, ID, and stack name prefix + + ## Usage Tips + - If the user wants to create a new HyperPod cluster, always generate a new parameter file. The parameter file MUST exist in the working directory for the tool to update the HyperPod cluster.
+ - For safety, this tool will only modify or delete stacks that it created + - Stack creation typically takes ~30 minutes to complete + - Specify profile_name to use a specific AWS profile with appropriate permissions + + ## Fallback Options: + - If this tool fails, advise using the CloudFormation CLI: + - Deploy (create new stack): `aws cloudformation create-stack` with proper params + - Deploy (update existing stack): `aws cloudformation update-stack` with proper params + - Describe: `aws cloudformation describe-stacks` with proper params + - Delete: `aws cloudformation delete-stack` with proper params + - Alternatively: advise using AWS SageMaker CLI alternatives: + - Deploy (create new stack): `aws sagemaker create-cluster` with all appropriate parameters + - Deploy (update existing stack): `aws sagemaker update-cluster` with all appropriate parameters + - Describe: `aws sagemaker describe-cluster --cluster-name --region ` + - Delete: `aws sagemaker delete-cluster --cluster-name --region ` + - Alternatively: Advise using the SageMaker HyperPod console for directly creating, updating, or deleting the HyperPod cluster + + Args: + ctx: MCP context + operation: Operation to perform (deploy, describe, or delete) + params_file: Absolute path for the CloudFormation template parameters (for deploy operations) + stack_name: Name of the CloudFormation stack (for deploy, describe and delete operations) + region_name: AWS region name (default: us-east-1) + cluster_orchestrator: Cluster orchestrator type ("eks" or "slurm") + profile_name: AWS profile name (optional) + + Returns: + Union[DeployStackResponse, DescribeStackResponse, DeleteStackResponse]: + Response specific to the operation performed + """ + try: + # Check if write access is disabled and trying to perform a mutating operation + if not self.allow_write and operation not in [ + STACK_DESCRIBE_OPERATION, + ]: + error_message = f'Operation {operation} is not allowed without write access' + log_with_request_id(ctx, LogLevel.ERROR,
error_message) + + # Return appropriate response type based on operation + if operation == STACK_DEPLOY_OPERATION: + return DeployStackResponse( + isError=True, + content=[TextContent(type='text', text=error_message)], + stack_name='', + stack_arn='', + ) + elif operation == STACK_DELETE_OPERATION: + return DeleteStackResponse( + isError=True, + content=[TextContent(type='text', text=error_message)], + stack_name='', + stack_id='', + ) + else: # Default to describe operation + return DescribeStackResponse( + isError=True, + content=[TextContent(type='text', text=error_message)], + stack_name='', + stack_id='', + creation_time='', + stack_status='', + outputs={}, + ) + + if operation == STACK_DEPLOY_OPERATION: + if params_file is None: + raise ValueError('params_file is required for deploy operation') + + with open(params_file, 'r') as f: + template_params = json.load(f) + + return await self._deploy_stack( + ctx=ctx, + stack_name=stack_name, + template_params=template_params, + region_name=region_name, + cluster_orchestrator=cluster_orchestrator, + profile_name=profile_name, + ) + + elif operation == STACK_DESCRIBE_OPERATION: + return await self._describe_stack( + ctx=ctx, + stack_name=stack_name, + region_name=region_name, + profile_name=profile_name, + ) + + elif operation == STACK_DELETE_OPERATION: + return await self._delete_stack( + ctx=ctx, + stack_name=stack_name, + region_name=region_name, + profile_name=profile_name, + ) + + else: + error_message = f'Invalid operation: {operation}. 
Must be one of: deploy, describe, delete' + log_with_request_id(ctx, LogLevel.ERROR, error_message) + # Default to DescribeStackResponse for invalid operations + return DescribeStackResponse( + isError=True, + content=[TextContent(type='text', text=error_message)], + stack_name='', + stack_id='', + creation_time='', + stack_status='', + outputs={}, + ) + except ValueError as e: + # Re-raise ValueError for parameter validation errors + log_with_request_id(ctx, LogLevel.ERROR, f'Parameter validation error: {str(e)}') + raise + except Exception as e: + error_message = f'Error in manage_hyperpod_stacks: {str(e)}' + log_with_request_id(ctx, LogLevel.ERROR, error_message) + # Default to DescribeStackResponse for general exceptions + return DescribeStackResponse( + isError=True, + content=[TextContent(type='text', text=error_message)], + stack_name='', + stack_id='', + creation_time='', + stack_status='', + outputs={}, + ) + + async def _deploy_stack( + self, + ctx: Context, + template_params: List[dict], + stack_name: str, + region_name: SUPPORTED_REGIONS, + cluster_orchestrator: CLUSTER_ORCHESTRATORS, + profile_name: Optional[str] = None, + ) -> 'DeployStackResponse': + """Deploy a CloudFormation stack from the specified template file.""" + try: + # Determine template URL based on cluster orchestrator + if cluster_orchestrator == 'eks': + template_url = HYPERPOD_CFN_TEMPLATE_URL_EKS + elif cluster_orchestrator == 'slurm': + template_url = HYPERPOD_CFN_TEMPLATE_URL_SLURM + else: + # This should not happen due to type validation, but adding for safety + error_message = f'Invalid cluster_orchestrator: {cluster_orchestrator}. Must be either "eks" or "slurm".'
+ log_with_request_id(ctx, LogLevel.ERROR, error_message) + return DeployStackResponse( + isError=True, + content=[TextContent(type='text', text=error_message or 'Unknown error')], + stack_name=stack_name, + stack_arn='', + ) + # Create CloudFormation client + cfn_client = AwsHelper.create_boto3_client('cloudformation', region_name=region_name) + + # Check if the stack already exists and verify ownership + stack_exists = False + try: + success, stack, error_message = self._ensure_stack_ownership( + ctx, stack_name, region_name, 'describe' + ) + if stack: + stack_exists = True + if not success: + return DeployStackResponse( + isError=True, + content=[ + TextContent(type='text', text=error_message or 'Unknown error') + ], + stack_name=stack_name, + stack_arn='', + ) + except Exception: + # Stack doesn't exist, we'll create it + stack_exists = False + + if stack_exists: + log_with_request_id( + ctx, + LogLevel.INFO, + f'Updating CloudFormation stack {stack_name} for HyperPod Cluster', + ) + + response = cfn_client.update_stack( + StackName=stack_name, + TemplateURL=template_url, + Parameters=template_params, + Capabilities=[ + CFN_CAPABILITY_IAM, + CFN_CAPABILITY_NAMED_IAM, + CAPABILITY_AUTO_EXPAND, + ], + Tags=[{'Key': CFN_STACK_TAG_KEY, 'Value': CFN_STACK_TAG_VALUE}], + ) + + operation_text = 'update' + else: + log_with_request_id( + ctx, + LogLevel.INFO, + f'Creating CloudFormation stack {stack_name} for HyperPod cluster', + ) + + response = cfn_client.create_stack( + StackName=stack_name, + TemplateURL=template_url, + Parameters=template_params, + Capabilities=[ + CFN_CAPABILITY_IAM, + CFN_CAPABILITY_NAMED_IAM, + CAPABILITY_AUTO_EXPAND, + ], + OnFailure=CFN_ON_FAILURE_DELETE, + Tags=[{'Key': CFN_STACK_TAG_KEY, 'Value': CFN_STACK_TAG_VALUE}], + ) + + operation_text = 'creation' + + log_with_request_id( + ctx, + LogLevel.INFO, + f'CloudFormation stack {operation_text} initiated. 
Stack ARN: {response["StackId"]}', + ) + + return DeployStackResponse( + isError=False, + content=[ + TextContent( + type='text', + text=f'CloudFormation stack {operation_text} initiated. Stack {operation_text} is in progress and typically takes ~30 minutes to complete.', + ) + ], + stack_name=stack_name, + stack_arn=response['StackId'], + ) + except Exception as e: + error_message = f'Failed to deploy stack: {str(e)}' + log_with_request_id(ctx, LogLevel.ERROR, error_message) + + return DeployStackResponse( + isError=True, + content=[TextContent(type='text', text=error_message or 'Unknown error')], + stack_name=stack_name, + stack_arn='', + ) + + async def _describe_stack( + self, + ctx: Context, + stack_name: str, + region_name: SUPPORTED_REGIONS, + profile_name: Optional[str] = None, + ) -> 'DescribeStackResponse': + """Describe a CloudFormation stack.""" + try: + # Verify stack ownership + success, stack, error_message = self._ensure_stack_ownership( + ctx, stack_name, region_name, 'describe' + ) + if not success: + # Prepare error response with available stack details + stack_id = '' + creation_time = '' + stack_status = '' + + if stack: + stack_id = stack['StackId'] + creation_time = stack['CreationTime'].isoformat() + stack_status = stack['StackStatus'] + + return DescribeStackResponse( + isError=True, + content=[TextContent(type='text', text=error_message or 'Unknown error')], + stack_name=stack_name, + stack_id=stack_id, + creation_time=creation_time, + stack_status=stack_status, + outputs={}, + ) + + # Extract outputs + outputs = {} + if stack and 'Outputs' in stack: + for output in stack['Outputs']: + if 'OutputKey' in output and 'OutputValue' in output: + outputs[output['OutputKey']] = output['OutputValue'] + + log_with_request_id( + ctx, + LogLevel.INFO, + f'Described CloudFormation stack {stack_name} for HyperPod cluster', + ) + + # Safely extract stack details + stack_id = '' + creation_time = '' + stack_status = '' + + if stack: + stack_id = 
stack.get('StackId', '') + + # Safely handle creation time + if 'CreationTime' in stack: + creation_time_obj = stack['CreationTime'] + if hasattr(creation_time_obj, 'isoformat'): + creation_time = creation_time_obj.isoformat() + else: + creation_time = str(creation_time_obj) + + stack_status = stack.get('StackStatus', '') + + return DescribeStackResponse( + isError=False, + content=[ + TextContent( + type='text', + text=f'Successfully described CloudFormation stack {stack_name} for HyperPod cluster', + ) + ], + stack_name=stack_name, + stack_id=stack_id, + creation_time=creation_time, + stack_status=stack_status, + outputs=outputs, + ) + except Exception as e: + error_message = f'Failed to describe stack: {str(e)}' + log_with_request_id(ctx, LogLevel.ERROR, error_message) + + return DescribeStackResponse( + isError=True, + content=[TextContent(type='text', text=error_message or 'Unknown error')], + stack_name=stack_name, + stack_id='', + creation_time='', + stack_status='', + outputs={}, + ) + + async def _delete_stack( + self, + ctx: Context, + stack_name: str, + region_name: SUPPORTED_REGIONS, + profile_name: Optional[str] = None, + ) -> 'DeleteStackResponse': + """Delete a CloudFormation stack.""" + try: + # Create CloudFormation client + cfn_client = AwsHelper.create_boto3_client('cloudformation', region_name) + + # Verify stack ownership + success, stack, error_message = self._ensure_stack_ownership( + ctx, stack_name, region_name, 'delete' + ) + log_with_request_id( + ctx, + LogLevel.INFO, + f'_ensure_stack_ownership {stack_name} {stack} {error_message}', + ) + if not success: + # Prepare error response with available stack details + stack_id = '' + if stack: + stack_id = stack['StackId'] + + return DeleteStackResponse( + isError=True, + content=[TextContent(type='text', text=error_message or 'Unknown error')], + stack_name=stack_name, + stack_id=stack_id, + ) + + # Safely extract stack ID + stack_id = '' + if stack and 'StackId' in stack: + stack_id =
stack['StackId'] + + # Delete the stack + cfn_client.delete_stack(StackName=stack_name) + + log_with_request_id( + ctx, + LogLevel.INFO, + f'Initiated deletion of CloudFormation stack {stack_name} for HyperPod cluster', + ) + + return DeleteStackResponse( + isError=False, + content=[ + TextContent( + type='text', + text=f'Initiated deletion of CloudFormation stack {stack_name} for HyperPod cluster. Deletion is in progress.', + ) + ], + stack_name=stack_name, + stack_id=stack_id, + ) + except Exception as e: + error_message = f'Failed to delete stack: {str(e)}' + log_with_request_id(ctx, LogLevel.ERROR, error_message) + + return DeleteStackResponse( + isError=True, + content=[TextContent(type='text', text=error_message or 'Unknown error')], + stack_name=stack_name, + stack_id='', + ) diff --git a/src/sagemaker-hyperpod-mcp-server/awslabs/sagemaker_hyperpod_mcp_server/logging_helper.py b/src/sagemaker-hyperpod-mcp-server/awslabs/sagemaker_hyperpod_mcp_server/logging_helper.py new file mode 100644 index 0000000000..289b7487c6 --- /dev/null +++ b/src/sagemaker-hyperpod-mcp-server/awslabs/sagemaker_hyperpod_mcp_server/logging_helper.py @@ -0,0 +1,55 @@ +# Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved. +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License.
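The ownership guard in `_ensure_stack_ownership` earlier in this file reduces to scanning the stack's `Tags` for the tool's marker tag before allowing mutations. A minimal sketch of that check — the tag key/value strings below are illustrative stand-ins for `CFN_STACK_TAG_KEY`/`CFN_STACK_TAG_VALUE`, and the dicts mirror a `describe_stacks` `Stacks[0]` entry:

```python
from typing import Any, Dict


def is_owned_stack(stack: Dict[str, Any], tag_key: str, tag_value: str) -> bool:
    """Return True if the stack carries the tool's ownership tag."""
    return any(
        tag.get('Key') == tag_key and tag.get('Value') == tag_value
        for tag in stack.get('Tags', [])
    )


# Example inputs; the stack names and tag values are made up.
owned = {'StackName': 'demo-stack', 'Tags': [{'Key': 'CreatedBy', 'Value': 'HyperPodMCP'}]}
foreign = {'StackName': 'other-stack', 'Tags': [{'Key': 'Team', 'Value': 'ML'}]}
```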
+
+"""Logging helper for the HyperPod MCP Server."""
+
+from enum import Enum
+from loguru import logger
+from mcp.server.fastmcp import Context
+from typing import Any
+
+
+class LogLevel(Enum):
+    """Enum for log levels."""
+
+    DEBUG = 'debug'
+    INFO = 'info'
+    WARNING = 'warning'
+    ERROR = 'error'
+    CRITICAL = 'critical'
+
+
+def log_with_request_id(ctx: Context, level: LogLevel, message: str, **kwargs: Any) -> None:
+    """Log a message with the request ID from the context.
+
+    Args:
+        ctx: The MCP context containing the request ID
+        level: The log level (from LogLevel enum)
+        message: The message to log
+        **kwargs: Additional fields to include in the log message
+    """
+    # Format the log message with request_id
+    log_message = f'[request_id={ctx.request_id}] {message}'
+
+    # Log at the appropriate level
+    if level == LogLevel.DEBUG:
+        logger.debug(log_message, **kwargs)
+    elif level == LogLevel.INFO:
+        logger.info(log_message, **kwargs)
+    elif level == LogLevel.WARNING:
+        logger.warning(log_message, **kwargs)
+    elif level == LogLevel.ERROR:
+        logger.error(log_message, **kwargs)
+    elif level == LogLevel.CRITICAL:
+        logger.critical(log_message, **kwargs)
diff --git a/src/sagemaker-hyperpod-mcp-server/awslabs/sagemaker_hyperpod_mcp_server/models.py b/src/sagemaker-hyperpod-mcp-server/awslabs/sagemaker_hyperpod_mcp_server/models.py
new file mode 100644
index 0000000000..fe92c66e96
--- /dev/null
+++ b/src/sagemaker-hyperpod-mcp-server/awslabs/sagemaker_hyperpod_mcp_server/models.py
@@ -0,0 +1,234 @@
+# Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+"""Data models for the HyperPod MCP Server."""
+
+from mcp.types import TextContent
+from pydantic import BaseModel, Field
+from typing import Dict, List, Optional
+
+
+class CallToolResult(BaseModel):
+    """Base class for tool call results with TextContent only."""
+
+    content: List[TextContent] = Field(..., description='Response content')
+    isError: bool = Field(False, description='Whether this is an error response')
+
+
+class ClusterSummary(BaseModel):
+    """Summary of a SageMaker HyperPod cluster."""
+
+    cluster_name: str
+    cluster_arn: str
+    cluster_status: str
+    creation_time: str
+    training_plan_arns: Optional[List[str]] = None
+
+
+class ClusterInstanceStatusDetails(BaseModel):
+    """Status details of an instance in a SageMaker HyperPod cluster."""
+
+    status: str  # Valid Values: Running | Failure | Pending | ShuttingDown | SystemUpdating | DeepHealthCheckInProgress
+    message: Optional[str] = None
+
+
+class ClusterNodeSummary(BaseModel):
+    """Summary of a SageMaker HyperPod cluster node."""
+
+    instance_group_name: str
+    instance_id: str
+    instance_status: ClusterInstanceStatusDetails
+    instance_type: str
+    launch_time: str
+    last_software_update_time: Optional[str] = None
+
+
+class ListClustersResponse(CallToolResult):
+    """Response model for list_clusters operation."""
+
+    clusters: List[ClusterSummary] = Field(..., description='List of HyperPod clusters')
+    next_token: Optional[str] = Field(None, description='Token for pagination')
+
+
+class ListClusterNodesResponse(CallToolResult):
+    """Response model for list_cluster_nodes operation."""
+
+    nodes: List[ClusterNodeSummary] = Field(..., description='List of HyperPod cluster nodes')
+    next_token: Optional[str] = Field(None, description='Token for pagination')
+
+
+class ClusterEbsVolumeConfig(BaseModel):
+    """EBS volume configuration for an instance in a SageMaker HyperPod cluster."""
+
+    volume_size_in_gb: Optional[int] = None
+
+
+class ClusterInstanceStorageConfig(BaseModel):
+    """Storage configuration for an instance in a SageMaker HyperPod cluster."""
+
+    ebs_volume_config: Optional[ClusterEbsVolumeConfig] = None
+
+
+class ClusterLifeCycleConfig(BaseModel):
+    """Life cycle configuration for an instance in a SageMaker HyperPod cluster."""
+
+    on_create: str
+    source_s3_uri: str
+
+
+class VpcConfig(BaseModel):
+    """VPC configuration for an instance in a SageMaker HyperPod cluster.
+
+    See: https://docs.aws.amazon.com/sagemaker/latest/APIReference/API_VpcConfig.html
+    """
+
+    security_group_ids: Optional[List[str]] = None
+    subnets: Optional[List[str]] = None
+
+
+class ClusterInstancePlacement(BaseModel):
+    """Placement information for an instance in a SageMaker HyperPod cluster.
+
+    See: https://docs.aws.amazon.com/sagemaker/latest/APIReference/API_ClusterInstancePlacement.html
+    """
+
+    availability_zone: Optional[str] = None
+    availability_zone_id: Optional[str] = None
+
+
+class AlarmDetails(BaseModel):
+    """Details of an alarm for auto rollback configuration.
+
+    See: https://docs.aws.amazon.com/sagemaker/latest/APIReference/API_AlarmDetails.html
+    """
+
+    alarm_name: str
+
+
+class CapacitySizeConfig(BaseModel):
+    """Configuration for capacity size in rolling deployment policy.
+
+    See: https://docs.aws.amazon.com/sagemaker/latest/APIReference/API_CapacitySizeConfig.html
+    """
+
+    type: str  # Valid Values: "INSTANCE_COUNT" | "CAPACITY_PERCENTAGE"
+    value: int
+
+
+class RollingDeploymentPolicy(BaseModel):
+    """Policy for rolling deployment during cluster software updates.
+
+    See: https://docs.aws.amazon.com/sagemaker/latest/APIReference/API_RollingDeploymentPolicy.html
+    """
+
+    maximum_batch_size: CapacitySizeConfig
+    rollback_maximum_batch_size: Optional[CapacitySizeConfig] = None
+
+
+class DeploymentConfiguration(BaseModel):
+    """Configuration for deployment during cluster software updates.
+
+    See: https://docs.aws.amazon.com/sagemaker/latest/APIReference/API_DeploymentConfiguration.html
+    """
+
+    auto_rollback_configuration: Optional[List[AlarmDetails]] = None
+    rolling_update_policy: Optional[RollingDeploymentPolicy] = None
+    wait_interval_in_seconds: Optional[int] = None
+
+
+class UpdateClusterSoftwareInstanceGroupSpecification(BaseModel):
+    """Specification for an instance group to update in a SageMaker HyperPod cluster.
+
+    See: https://docs.aws.amazon.com/sagemaker/latest/APIReference/API_UpdateClusterSoftwareInstanceGroupSpecification.html
+    """
+
+    instance_group_name: str
+
+
+class ClusterNodeDetails(BaseModel):
+    """Details of a SageMaker HyperPod cluster node."""
+
+    instance_group_name: str
+    instance_id: str
+    instance_status: ClusterInstanceStatusDetails
+    instance_storage_configs: Optional[List[ClusterInstanceStorageConfig]] = None
+    instance_type: str
+    last_software_update_time: Optional[str] = None
+    launch_time: Optional[str] = None
+    life_cycle_config: Optional[ClusterLifeCycleConfig] = None
+    override_vpc_config: Optional[VpcConfig] = None
+    placement: Optional[ClusterInstancePlacement] = None
+    private_dns_hostname: Optional[str] = None
+    private_primary_ip: Optional[str] = None
+    private_primary_ipv6: Optional[str] = None
+    threads_per_core: Optional[int] = None
+
+
+class DescribeClusterNodeResponse(CallToolResult):
+    """Response model for describe_hp_cluster_node operation."""
+
+    node_details: Optional[ClusterNodeDetails] = Field(
+        None, description='Details of the HyperPod cluster node'
+    )
+
+
+class UpdateClusterSoftwareResponse(CallToolResult):
+    """Response model for update_hp_cluster_software operation."""
+
+    cluster_arn: str = Field(..., description='ARN of the HyperPod cluster')
+
+
+class BatchDeleteClusterNodesError(BaseModel):
+    """Error details for a failed node deletion in a SageMaker HyperPod cluster."""
+
+    code: str
+    message: str
+    node_id: str
+
+
+class BatchDeleteClusterNodesResponse(CallToolResult):
+    """Response model for batch_delete_hp_cluster_nodes operation."""
+
+    cluster_name: str = Field(..., description='Name of the HyperPod cluster')
+    successful: List[str] = Field(..., description='List of successfully deleted node IDs')
+    failed: Optional[List[BatchDeleteClusterNodesError]] = Field(
+        None, description='List of failed node deletions'
+    )
+
+
+# CloudFormation stack operation response models
+
+
+class DeployStackResponse(CallToolResult):
+    """Response model for deploy operation of manage_hyperpod_stacks tool."""
+
+    stack_name: str = Field(..., description='Name of the CloudFormation stack')
+    stack_arn: str = Field(..., description='ARN of the CloudFormation stack')
+
+
+class DescribeStackResponse(CallToolResult):
+    """Response model for describe operation of manage_hyperpod_stacks tool."""
+
+    stack_name: str = Field(..., description='Name of the CloudFormation stack')
+    stack_id: str = Field(..., description='ID of the CloudFormation stack')
+    creation_time: str = Field(..., description='Creation time of the stack')
+    stack_status: str = Field(..., description='Current status of the stack')
+    outputs: Dict[str, str] = Field(..., description='Stack outputs')
+
+
+class DeleteStackResponse(CallToolResult):
+    """Response model for delete operation of manage_hyperpod_stacks tool."""
+
+    stack_name: str = Field(..., description='Name of the deleted CloudFormation stack')
+    stack_id: str = Field(..., description='ID of the deleted CloudFormation stack')
diff --git a/src/sagemaker-hyperpod-mcp-server/awslabs/sagemaker_hyperpod_mcp_server/server.py b/src/sagemaker-hyperpod-mcp-server/awslabs/sagemaker_hyperpod_mcp_server/server.py
new file mode 100644
index 0000000000..ad9d8a7736
--- /dev/null
+++ b/src/sagemaker-hyperpod-mcp-server/awslabs/sagemaker_hyperpod_mcp_server/server.py
@@ -0,0 +1,202 @@
+# Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+"""awslabs SageMaker HyperPod MCP Server implementation.
+
+This module implements the SageMaker HyperPod MCP Server, which provides tools for managing Amazon SageMaker HyperPod clusters
+and resources through the Model Context Protocol (MCP).
+
+Environment Variables:
+    AWS_REGION: AWS region to use for AWS API calls
+    AWS_PROFILE: AWS profile to use for credentials
+    FASTMCP_LOG_LEVEL: Log level (default: WARNING)
+"""
+
+import argparse
+import os
+import sys
+from awslabs.sagemaker_hyperpod_mcp_server.hyperpod_cluster_node_handler import (
+    HyperPodClusterNodeHandler,
+)
+from awslabs.sagemaker_hyperpod_mcp_server.hyperpod_stack_handler import HyperPodStackHandler
+from loguru import logger
+from mcp.server.fastmcp import FastMCP
+
+
+# Define server instructions and dependencies
+SERVER_INSTRUCTIONS = """
+# Amazon SageMaker HyperPod MCP Server
+
+This MCP server provides comprehensive tools for managing Amazon SageMaker HyperPod clusters and is the preferred mechanism for interacting with SageMaker HyperPod.
+
+## IMPORTANT: Use MCP Tools for SageMaker HyperPod Operations
+
+DO NOT use standard SageMaker CLI commands (aws sagemaker). Always use the MCP tools provided by this server for SageMaker HyperPod operations.
+
+## Available MCP Tools
+
+### 1. HyperPod Cluster Node Management: `manage_hyperpod_cluster_nodes`
+**Primary tool for cluster node operations**
+
+**Operations:**
+- `list_clusters`: List all HyperPod clusters with filtering and pagination
+- `list_nodes`: List nodes in a specific cluster with filtering options
+- `describe_node`: Get detailed information about a specific node
+- `update_software`: Update cluster software/AMI versions (requires --allow-write)
+- `batch_delete`: Delete multiple nodes from a cluster (requires --allow-write)
+
+### 2. HyperPod Cluster Stack Management: `manage_hyperpod_stacks`
+**Tool for managing HyperPod clusters and resources through CloudFormation**
+
+**Operations:**
+- `deploy`: Create or update a HyperPod cluster via CloudFormation stack (requires --allow-write)
+- `describe`: Get information about an existing CloudFormation stack
+- `delete`: Delete a CloudFormation stack and associated HyperPod cluster (requires --allow-write)
+
+
+## Usage Notes
+
+- By default, the server runs in read-only mode
+- Use `--allow-write` flag to enable write operations (deploy, delete, update_software, batch_delete)
+- When creating or updating resources, always check for existing resources first to avoid conflicts.
+
+## Common Workflows
+
+### 1. Listing and Managing Existing HyperPod Clusters
+```
+# List all clusters in a region
+manage_hyperpod_cluster_nodes(operation='list_clusters', region_name='us-east-1')
+
+# List nodes in a specific cluster
+manage_hyperpod_cluster_nodes(operation='list_nodes', cluster_name='my-cluster')
+
+# Get detailed information about a specific node
+manage_hyperpod_cluster_nodes(operation='describe_node', cluster_name='my-cluster', node_id='i-1234567890abcdef0')
+
+# Update cluster software (requires --allow-write)
+manage_hyperpod_cluster_nodes(operation='update_software', cluster_name='my-cluster')
+```
+
+### 2. Creating HyperPod Clusters via CloudFormation
+```
+# Create or update a HyperPod cluster (requires --allow-write)
+manage_hyperpod_stacks(operation='deploy', stack_name='my-cluster-stack', params_file='/path/to/params.json', region_name='us-east-1')
+
+# Check deployment status
+manage_hyperpod_stacks(operation='describe', stack_name='my-cluster-stack', region_name='us-east-1')
+```
+
+### 3. Deleting HyperPod Resources
+```
+# Delete specific nodes from a cluster (requires --allow-write)
+manage_hyperpod_cluster_nodes(operation='batch_delete', cluster_name='my-cluster', node_ids=['i-1234567890abcdef0', 'i-0987654321fedcba0'])
+
+# Delete entire cluster via CloudFormation (requires --allow-write)
+manage_hyperpod_stacks(operation='delete', stack_name='my-cluster-stack', region_name='us-east-1')
+```
+
+
+## Best Practices
+
+- **Resource Naming**: Use descriptive names for resources to make them easier to identify
+- **Stack Management**: Use CloudFormation stacks (manage_hyperpod_stacks) for infrastructure as code and consistent deployments
+- **Monitoring**: Regularly check cluster and node status using list and describe operations
+- **Safety**: Always verify resource details before performing destructive operations (delete, batch_delete)
+- **Access Control**: Follow the principle of least privilege when configuring IAM policies
+- **Regional Considerations**: Specify the correct region for all operations to ensure you're working with the right resources
+
+## Important Safety Notes
+
+- **Destructive Operations**: batch_delete and stack deletion operations cannot be undone
+- **Data Backup**: Always backup important data before deleting nodes or clusters
+- **Write Access**: Mutating operations require the server to be started with --allow-write flag
+- **Stack Ownership**: The stack management tool only operates on stacks it created (tagged appropriately)
+"""
+
+SERVER_DEPENDENCIES = [
+    'pydantic',
+    'loguru',
+    'boto3',
+    'requests',
+    'pyyaml',
+    'cachetools',
+]
+
+# Global reference to the MCP server instance for testing purposes
+mcp = None
+
+
+def create_server():
+    """Create and configure the MCP server instance."""
+    return FastMCP(
+        'awslabs.sagemaker-hyperpod-mcp-server',
+        instructions=SERVER_INSTRUCTIONS,
+        dependencies=SERVER_DEPENDENCIES,
+    )
+
+
+def main():
+    """Run the MCP server with CLI argument support."""
+    global mcp
+
+    # Configure loguru logging
+    logger.remove()
+    logger.add(sys.stderr, level=os.getenv('FASTMCP_LOG_LEVEL', 'WARNING'))
+
+    parser = argparse.ArgumentParser(
+        description='An AWS Labs Model Context Protocol (MCP) server for SageMaker HyperPod'
+    )
+    parser.add_argument(
+        '--allow-write',
+        action=argparse.BooleanOptionalAction,
+        default=False,
+        help='Enable write access mode (allow mutating operations)',
+    )
+    parser.add_argument(
+        '--allow-sensitive-data-access',
+        action=argparse.BooleanOptionalAction,
+        default=False,
+        help='Enable sensitive data access (required for reading logs, events, and sensitive information)',
+    )
+
+    args = parser.parse_args()
+
+    allow_write = args.allow_write
+    allow_sensitive_data_access = args.allow_sensitive_data_access
+
+    # Log startup mode
+    mode_info = []
+    if not allow_write:
+        mode_info.append('read-only mode')
+    if not allow_sensitive_data_access:
+        mode_info.append('restricted sensitive data access mode')
+
+    mode_str = ' in ' + ', '.join(mode_info) if mode_info else ''
+    logger.info(f'Starting HyperPod MCP Server{mode_str}')
+
+    # Create the MCP server instance
+    mcp = create_server()
+
+    # Initialize handlers - all tools are always registered, access control is handled within tools
+    HyperPodClusterNodeHandler(mcp, allow_write, allow_sensitive_data_access)
+    HyperPodStackHandler(mcp, allow_write)
+
+    # Run server
+    mcp.run()
+
+    return mcp
+
+
+if __name__ == '__main__':
+    main()
diff --git a/src/sagemaker-hyperpod-mcp-server/pyproject.toml b/src/sagemaker-hyperpod-mcp-server/pyproject.toml
new file mode 100644
index 0000000000..2ebbdef7ff
--- /dev/null
+++ b/src/sagemaker-hyperpod-mcp-server/pyproject.toml
@@ -0,0 +1,143 @@
+[project]
+name = "awslabs.sagemaker-hyperpod-mcp-server"
+
+# NOTE: "Patch"=9223372036854775807 bumps next release to zero.
+version = "0.0.0"
+
+description = "MCP server for AWS SageMaker HyperPod"
+readme = "README.md"
+requires-python = ">=3.10"
+dependencies = [
+    "loguru>=0.7.0",
+    "mcp[cli]>=1.6.0",
+    "pydantic>=2.10.6",
+    "boto3>=1.34.0",
+    "requests>=2.31.0",
+    "pyyaml>=6.0.0",
+    "cachetools>=5.3.0",
+    "requests_auth_aws_sigv4",
+]
+license = {text = "Apache-2.0"}
+license-files = ["LICENSE", "NOTICE"]
+authors = [
+    {name = "Amazon Web Services"},
+    {name = "AWSLabs MCP", email="203918161+awslabs-mcp@users.noreply.github.com"},
+    {name = "Yunlin Qi", email="qiyunlin@amazon.com"},
+]
+classifiers = [
+    "License :: OSI Approved :: Apache Software License",
+    "Operating System :: OS Independent",
+    "Programming Language :: Python",
+    "Programming Language :: Python :: 3",
+    "Programming Language :: Python :: 3.10",
+    "Programming Language :: Python :: 3.11",
+    "Programming Language :: Python :: 3.12",
+    "Programming Language :: Python :: 3.13",
+]
+
+[project.urls]
+homepage = "https://awslabs.github.io/mcp/"
+docs = "https://awslabs.github.io/mcp/servers/sagemaker-hyperpod-mcp-server/"
+documentation = "https://awslabs.github.io/mcp/servers/sagemaker-hyperpod-mcp-server/"
+repository = "https://github.com/awslabs/mcp.git"
+changelog = "https://github.com/awslabs/mcp/blob/main/src/sagemaker-hyperpod-mcp-server/CHANGELOG.md"
+
+[project.scripts]
+"awslabs.sagemaker-hyperpod-mcp-server" = "awslabs.sagemaker_hyperpod_mcp_server.server:main"
+
+[dependency-groups]
+dev = [
+    "commitizen>=4.2.2",
+    "pre-commit>=4.1.0",
+    "ruff>=0.9.7",
+    "pyright>=1.1.398",
+    "pytest>=8.0.0",
+    "pytest-asyncio>=0.26.0",
+    "pytest-cov>=4.1.0",
+    "pytest-mock>=3.12.0",
+]
+
+[build-system]
+requires = ["hatchling"]
+build-backend = "hatchling.build"
+
+[tool.hatch.metadata]
+allow-direct-references = true
+
+[tool.ruff]
+line-length = 99
+extend-include = ["*.ipynb"]
+exclude = [
+    ".venv",
+    "**/__pycache__",
+    "**/node_modules",
+    "**/dist",
+    "**/build",
+    "**/env",
+    "**/.ruff_cache",
+    "**/.venv",
+    "**/.ipynb_checkpoints"
+]
+force-exclude = true
+
+[tool.ruff.lint]
+exclude = ["__init__.py"]
+select = ["C", "D", "E", "F", "I", "W"]
+ignore = ["C901", "E501", "E741", "F402", "F823", "D100", "D106"]
+
+[tool.ruff.lint.isort]
+lines-after-imports = 2
+no-sections = true
+
+[tool.ruff.lint.per-file-ignores]
+"**/*.ipynb" = ["F704"]
+
+[tool.ruff.lint.pydocstyle]
+convention = "google"
+
+[tool.ruff.format]
+quote-style = "single"
+indent-style = "space"
+skip-magic-trailing-comma = false
+line-ending = "auto"
+docstring-code-format = true
+
+[tool.pyright]
+include = ["awslabs", "tests"]
+exclude = ["**/__pycache__", "**/.venv", "**/node_modules", "**/dist", "**/build"]
+
+[tool.commitizen]
+name = "cz_conventional_commits"
+version = "0.0.0"
+tag_format = "v$version"
+version_files = [
+    "pyproject.toml:version",
+    "awslabs/sagemaker_hyperpod_mcp_server/__init__.py:__version__"
+]
+update_changelog_on_bump = true
+
+[tool.hatch.build.targets.wheel]
+packages = ["awslabs"]
+
+[tool.bandit]
+exclude_dirs = ["venv", ".venv", "tests"]
+
+[tool.pytest.ini_options]
+python_files = "test_*.py"
+python_classes = "Test*"
+python_functions = "test_*"
+testpaths = ["tests"]
+asyncio_mode = "auto"
+markers = [
+    "live: marks tests that make live API calls (deselect with '-m \"not live\"')",
+    "asyncio: marks tests that use asyncio"
+]
+
+[tool.coverage.report]
+exclude_also = [
+    'pragma: no cover',
+    'if __name__ == .__main__.:\n    main()',
+]
+
+[tool.coverage.run]
+source = ["awslabs"]
diff --git a/src/sagemaker-hyperpod-mcp-server/tests/__init__.py b/src/sagemaker-hyperpod-mcp-server/tests/__init__.py
new file mode 100644
index 0000000000..4dbc1b5ecb
--- /dev/null
+++ b/src/sagemaker-hyperpod-mcp-server/tests/__init__.py
@@ -0,0 +1,13 @@
+# Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
diff --git a/src/sagemaker-hyperpod-mcp-server/tests/conftest.py b/src/sagemaker-hyperpod-mcp-server/tests/conftest.py
new file mode 100644
index 0000000000..d8c7f07149
--- /dev/null
+++ b/src/sagemaker-hyperpod-mcp-server/tests/conftest.py
@@ -0,0 +1,20 @@
+import os
+import pytest
+from typing import Dict
+
+
+TEMP_ENV_VARS: Dict[str, str] = {}
+
+
+@pytest.fixture(scope='session', autouse=True)
+def tests_setup_and_teardown():
+    """Mock environment and module variables for testing."""
+    global TEMP_ENV_VARS
+    # Will be executed before the first test
+    old_environ = dict(os.environ)
+    os.environ.update(TEMP_ENV_VARS)
+
+    yield
+    # Will be executed after the last test
+    os.environ.clear()
+    os.environ.update(old_environ)
diff --git a/src/sagemaker-hyperpod-mcp-server/tests/test_aws_helper.py b/src/sagemaker-hyperpod-mcp-server/tests/test_aws_helper.py
new file mode 100644
index 0000000000..73ee6a976d
--- /dev/null
+++ b/src/sagemaker-hyperpod-mcp-server/tests/test_aws_helper.py
@@ -0,0 +1,243 @@
+# Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+# ruff: noqa: D101, D102, D103
+"""Tests for the AWS Helper."""
+
+import os
+from awslabs.sagemaker_hyperpod_mcp_server import __version__
+from awslabs.sagemaker_hyperpod_mcp_server.aws_helper import AwsHelper
+from unittest.mock import ANY, MagicMock, patch
+
+
+class TestAwsHelper:
+    """Tests for the AwsHelper class."""
+
+    def setup_method(self):
+        """Set up the test environment."""
+        # Clear the client cache before each test
+        AwsHelper._client_cache = {}
+        AwsHelper._cache_metadata = {}
+
+    @patch.dict(os.environ, {'AWS_REGION': 'us-west-2'})
+    def test_get_aws_region_from_env(self):
+        """Test that get_aws_region returns the region from the environment."""
+        region = AwsHelper.get_aws_region()
+        assert region == 'us-west-2'
+
+    @patch.dict(os.environ, {}, clear=True)
+    def test_get_aws_region_default(self):
+        """Test that get_aws_region returns None when not set in the environment."""
+        region = AwsHelper.get_aws_region()
+        assert region is None
+
+    @patch.dict(os.environ, {'AWS_PROFILE': 'test-profile'})
+    def test_get_aws_profile_from_env(self):
+        """Test that get_aws_profile returns the profile from the environment."""
+        profile = AwsHelper.get_aws_profile()
+        assert profile == 'test-profile'
+
+    @patch.dict(os.environ, {}, clear=True)
+    def test_get_aws_profile_none(self):
+        """Test that get_aws_profile returns None when not set in the environment."""
+        profile = AwsHelper.get_aws_profile()
+        assert profile is None
+
+    @patch('boto3.client')
+    def test_create_boto3_client_no_profile_with_region(self, mock_boto3_client):
+        """Test that create_boto3_client creates a client with the correct parameters when no profile is set but region is in env."""
+        # Mock the get_aws_profile method to return None
+        with patch.object(AwsHelper, 'get_aws_profile', return_value=None):
+            # Mock the get_aws_region method to return a specific region
+            with patch.dict(os.environ, {'AWS_REGION': 'us-west-2'}):
+                with patch.object(AwsHelper, 'get_aws_region', return_value='us-west-2'):
+                    # Call the create_boto3_client method
+                    AwsHelper.create_boto3_client('sagemaker')
+
+                    # Verify that boto3.client was called with the correct parameters
+                    mock_boto3_client.assert_called_once_with(
+                        'sagemaker', region_name='us-west-2', config=ANY
+                    )
+
+    @patch('boto3.client')
+    def test_create_boto3_client_no_profile_no_region(self, mock_boto3_client):
+        """Test that create_boto3_client creates a client without region when no profile or region is set."""
+        # Mock the get_aws_profile method to return None
+        with patch.object(AwsHelper, 'get_aws_profile', return_value=None):
+            # Mock the get_aws_region method to return None
+            with patch.dict(os.environ, {}, clear=True):
+                with patch.object(AwsHelper, 'get_aws_region', return_value=None):
+                    # Call the create_boto3_client method
+                    AwsHelper.create_boto3_client('sagemaker')
+
+                    # Verify that boto3.client was called without region_name
+                    mock_boto3_client.assert_called_once_with('sagemaker', config=ANY)
+
+    @patch('boto3.Session')
+    def test_create_boto3_client_with_profile_with_region(self, mock_boto3_session):
+        """Test that create_boto3_client creates a client with the correct parameters when a profile is set and region is in env."""
+        # Create a mock session
+        mock_session = MagicMock()
+        mock_boto3_session.return_value = mock_session
+
+        # Mock the get_aws_profile method to return a profile
+        with patch.object(AwsHelper, 'get_aws_profile', return_value='test-profile'):
+            # Mock the get_aws_region method to return a specific region
+            with patch.dict(os.environ, {'AWS_REGION': 'us-west-2'}):
+                with patch.object(AwsHelper, 'get_aws_region', return_value='us-west-2'):
+                    # Call the create_boto3_client method
+                    AwsHelper.create_boto3_client('sagemaker')
+
+                    # Verify that boto3.Session was called with the correct parameters
+                    mock_boto3_session.assert_called_once_with(profile_name='test-profile')
+
+                    # Verify that session.client was called with the correct parameters
+                    mock_session.client.assert_called_once_with(
+                        'sagemaker', region_name='us-west-2', config=ANY
+                    )
+
+    @patch('boto3.Session')
+    def test_create_boto3_client_with_profile_no_region(self, mock_boto3_session):
+        """Test that create_boto3_client creates a client without region when a profile is set but no region."""
+        # Create a mock session
+        mock_session = MagicMock()
+        mock_boto3_session.return_value = mock_session
+
+        # Mock the get_aws_profile method to return a profile
+        with patch.object(AwsHelper, 'get_aws_profile', return_value='test-profile'):
+            # Mock the get_aws_region method to return None
+            with patch.dict(os.environ, {}, clear=True):
+                with patch.object(AwsHelper, 'get_aws_region', return_value=None):
+                    # Call the create_boto3_client method
+                    AwsHelper.create_boto3_client('sagemaker')
+
+                    # Verify that boto3.Session was called with the correct parameters
+                    mock_boto3_session.assert_called_once_with(profile_name='test-profile')
+
+                    # Verify that session.client was called without region_name
+                    mock_session.client.assert_called_once_with('sagemaker', config=ANY)
+
+    @patch('boto3.client')
+    def test_create_boto3_client_with_region_override(self, mock_boto3_client):
+        """Test that create_boto3_client uses the region override when provided."""
+        # Mock the get_aws_profile method to return None
+        with patch.object(AwsHelper, 'get_aws_profile', return_value=None):
+            # Call the create_boto3_client method with a region override
+            AwsHelper.create_boto3_client('sagemaker', region_name='us-west-2')
+
+            # Verify that boto3.client was called with the correct parameters
+            mock_boto3_client.assert_called_once_with(
+                'sagemaker', region_name='us-west-2', config=ANY
+            )
+
+    def test_create_boto3_client_user_agent(self):
+        """Test that create_boto3_client sets the user agent suffix correctly using the package version."""
+        # Create a real Config object to inspect
+        with patch.object(AwsHelper, 'get_aws_profile', return_value=None):
+            with patch.object(AwsHelper, 'get_aws_region', return_value=None):
+                with patch('boto3.client') as mock_client:
+                    # Call the create_boto3_client method
+                    AwsHelper.create_boto3_client('sagemaker')
+
+                    # Get the config argument passed to boto3.client
+                    _, kwargs = mock_client.call_args
+                    config = kwargs.get('config')
+
+                    # Verify the user agent suffix uses the version from __init__.py
+                    assert config is not None
+                    expected_user_agent = (
+                        f'awslabs/mcp/sagemaker-hyperpod-mcp-server/{__version__}'
+                    )
+                    assert config.user_agent_extra == expected_user_agent
+
+    @patch('boto3.client')
+    def test_client_caching(self, mock_boto3_client):
+        """Test that clients are cached and reused."""
+        # Create a mock client
+        mock_client = MagicMock()
+        mock_boto3_client.return_value = mock_client
+
+        # Mock the get_aws_profile and get_aws_region methods
+        with patch.object(AwsHelper, 'get_aws_profile', return_value=None):
+            with patch.object(AwsHelper, 'get_aws_region', return_value='us-west-2'):
+                # Call create_boto3_client twice with the same parameters
+                client1 = AwsHelper.create_boto3_client('sagemaker')
+                client2 = AwsHelper.create_boto3_client('sagemaker')
+
+                # Verify that boto3.client was called only once
+                mock_boto3_client.assert_called_once()
+
+                # Verify that the same client instance was returned both times
+                assert client1 is client2
+
+    @patch('boto3.client')
+    def test_client_caching_multiple_regions(self, mock_boto3_client):
+        """Test that different regions create different cached clients."""
+        # Create mock clients
+        mock_iad_client = MagicMock()
+        mock_sfo_client = MagicMock()
+        mock_boto3_client.side_effect = [mock_iad_client, mock_sfo_client]
+
+        # Mock the get_aws_profile and get_aws_region methods
+        with (
+            patch.object(AwsHelper, 'get_aws_profile', return_value=None),
+            patch.object(AwsHelper, 'get_aws_region', return_value='us-west-2'),
+        ):
+            # Call create_boto3_client with different regions
+            iad_client = AwsHelper.create_boto3_client('sagemaker', 'us-east-1')
+            iad_client_dup = AwsHelper.create_boto3_client('sagemaker', 'us-east-1')
+            sfo_client = AwsHelper.create_boto3_client('sagemaker', 'us-west-1')
+
+            # Verify that boto3.client was called twice
+            assert mock_boto3_client.call_count == 2
+
+            # Verify that different client instances were returned
+            assert iad_client is not sfo_client
+            assert iad_client is iad_client_dup
+
+    @patch('boto3.client')
+    def test_different_services_not_cached_together(self, mock_boto3_client):
+        """Test that different services get different cached clients."""
+        # Create mock clients
+        mock_sagemaker_client = MagicMock()
+        mock_s3_client = MagicMock()
+        mock_boto3_client.side_effect = [mock_sagemaker_client, mock_s3_client]
+
+        # Mock the get_aws_profile and get_aws_region methods
+        with patch.object(AwsHelper, 'get_aws_profile', return_value=None):
+            with patch.object(AwsHelper, 'get_aws_region', return_value='us-west-2'):
+                # Call create_boto3_client for different services
+                sagemaker_client = AwsHelper.create_boto3_client('sagemaker')
+                s3_client = AwsHelper.create_boto3_client('s3')
+
+                # Verify that boto3.client was called twice
+                assert mock_boto3_client.call_count == 2
+
+                # Verify that different client instances were returned
+                assert sagemaker_client is not s3_client
+
+    @patch('boto3.client')
+    def test_error_handling(self, mock_boto3_client):
+        """Test that errors during client creation are handled properly."""
+        # Make boto3.client raise an exception
+        mock_boto3_client.side_effect = Exception('Test error')
+
+        # Mock the get_aws_profile and get_aws_region methods
+        with patch.object(AwsHelper, 'get_aws_profile', return_value=None):
+            with patch.object(AwsHelper, 'get_aws_region', return_value=None):
+                # Verify that the exception is re-raised with more context
+                try:
+                    AwsHelper.create_boto3_client('sagemaker')
+                    assert False, 'Exception was not raised'
+                except Exception as e:
+                    assert 'Failed to create boto3 client for sagemaker: Test error' in str(e)
diff --git a/src/sagemaker-hyperpod-mcp-server/tests/test_hyperpod_cluster_node_handler.py b/src/sagemaker-hyperpod-mcp-server/tests/test_hyperpod_cluster_node_handler.py
new file mode 100644
index 0000000000..b58b43d0ed
--- /dev/null
+++ b/src/sagemaker-hyperpod-mcp-server/tests/test_hyperpod_cluster_node_handler.py
@@ -0,0 +1,1591 @@
+# Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+# ruff: noqa: D101, D102, D103
+"""Tests for the HyperPod cluster node handler."""
+
+import os
+import pytest
+from awslabs.sagemaker_hyperpod_mcp_server.aws_helper import AwsHelper
+from awslabs.sagemaker_hyperpod_mcp_server.hyperpod_cluster_node_handler import (
+    HyperPodClusterNodeHandler,
+)
+from awslabs.sagemaker_hyperpod_mcp_server.models import (
+    BatchDeleteClusterNodesResponse,
+    ClusterInstanceStatusDetails,
+    ClusterNodeDetails,
+    ClusterNodeSummary,
+    ClusterSummary,
+    DescribeClusterNodeResponse,
+    ListClusterNodesResponse,
+    ListClustersResponse,
+    UpdateClusterSoftwareResponse,
+)
+from mcp.server.fastmcp import Context
+from mcp.types import TextContent
+from unittest.mock import ANY, MagicMock, patch
+
+
+class TestHyperPodClusterNodeHandler:
+    """Tests for the HyperPodClusterNodeHandler class."""
+
+    def test_init_default(self):
+        """Test that the handler is initialized correctly and registers its tools with default allow_write=False."""
+        # Create a mock MCP server
+        mock_mcp = MagicMock()
+
+        # Initialize the HyperPod cluster node handler with the mock MCP server
+        handler = HyperPodClusterNodeHandler(mock_mcp)
+
+        # Verify that the handler has the correct attributes
+        assert handler.mcp == mock_mcp
+        assert handler.allow_write is False
+        assert handler.allow_sensitive_data_access is False
+
+        # Verify that the tool was registered
+        assert mock_mcp.tool.call_count == 3
+        tool_names = [call_args[1]['name'] for call_args in mock_mcp.tool.call_args_list]
+        assert 'manage_hyperpod_cluster_nodes' in tool_names
+
+    def test_init_write_access_enabled(self):
+        """Test that the handler is initialized correctly with allow_write=True."""
+        # Create a mock MCP server
+        mock_mcp = MagicMock()
+
+        # Initialize the HyperPod cluster node handler with the mock MCP server and allow_write=True
+        handler = HyperPodClusterNodeHandler(mock_mcp, allow_write=True)
+
+        # Verify that the handler has the correct attributes
+        assert handler.mcp == mock_mcp
+        assert
handler.allow_write is True + assert handler.allow_sensitive_data_access is False + + def test_init_sensitive_data_access_enabled(self): + """Test that the handler is initialized correctly with allow_sensitive_data_access=True.""" + # Create a mock MCP server + mock_mcp = MagicMock() + + # Initialize the HyperPod cluster node handler with the mock MCP server and allow_sensitive_data_access=True + handler = HyperPodClusterNodeHandler(mock_mcp, allow_sensitive_data_access=True) + + # Verify that the handler has the correct attributes + assert handler.mcp == mock_mcp + assert handler.allow_write is False + assert handler.allow_sensitive_data_access is True + + def test_get_sagemaker_client(self): + """Test that get_sagemaker_client returns a SageMaker client.""" + # Create a mock MCP server + mock_mcp = MagicMock() + + # Initialize the HyperPod cluster node handler with the mock MCP server + handler = HyperPodClusterNodeHandler(mock_mcp) + + # Create a mock context + mock_ctx = MagicMock(spec=Context) + + # Create a mock boto3 client + mock_client = MagicMock() + + # Mock the AwsHelper.create_boto3_client method to return our mock client + with patch.object( + AwsHelper, 'create_boto3_client', return_value=mock_client + ) as mock_create_client: + # Call the get_sagemaker_client method + client = handler.get_sagemaker_client(mock_ctx) + + # Verify that AwsHelper.create_boto3_client was called with the correct parameters + mock_create_client.assert_called_once_with('sagemaker', region_name=None) + + # Verify that the client is the mock client + assert client == mock_client + + def test_get_sagemaker_client_with_region(self): + """Test that get_sagemaker_client returns a SageMaker client with the specified region.""" + # Create a mock MCP server + mock_mcp = MagicMock() + + # Initialize the HyperPod cluster node handler with the mock MCP server + handler = HyperPodClusterNodeHandler(mock_mcp) + + # Create a mock context + mock_ctx = MagicMock(spec=Context) + + # Create a 
mock boto3 client + mock_client = MagicMock() + + # Mock the AwsHelper.create_boto3_client method to return our mock client + with patch.object( + AwsHelper, 'create_boto3_client', return_value=mock_client + ) as mock_create_client: + # Call the get_sagemaker_client method with a region + client = handler.get_sagemaker_client(mock_ctx, region_name='us-west-2') + + # Verify that AwsHelper.create_boto3_client was called with the correct parameters + mock_create_client.assert_called_once_with('sagemaker', region_name='us-west-2') + + # Verify that the client is the mock client + assert client == mock_client + + def test_get_sagemaker_client_with_profile(self): + """Test that get_sagemaker_client returns a SageMaker client with the specified profile.""" + # Create a mock MCP server + mock_mcp = MagicMock() + + # Initialize the HyperPod cluster node handler with the mock MCP server + handler = HyperPodClusterNodeHandler(mock_mcp) + + # Create a mock context + mock_ctx = MagicMock(spec=Context) + + # Create a mock boto3 client + mock_client = MagicMock() + + # Mock the AwsHelper.create_boto3_client method to return our mock client + with patch.object( + AwsHelper, 'create_boto3_client', return_value=mock_client + ) as mock_create_client: + # Call the get_sagemaker_client method with a profile + client = handler.get_sagemaker_client(mock_ctx, profile_name='test-profile') + + # Verify that AwsHelper.create_boto3_client was called with the correct parameters + mock_create_client.assert_called_once_with('sagemaker', region_name=None) + + # Verify that the client is the mock client + assert client == mock_client + + # Verify that the AWS_PROFILE environment variable was set + assert os.environ.get('AWS_PROFILE') == 'test-profile' + + @pytest.mark.asyncio + async def test_list_hp_clusters_success(self): + """Test that _list_hp_clusters returns a list of clusters successfully.""" + # Create a mock MCP server + mock_mcp = MagicMock() + + # Initialize the HyperPod cluster node 
handler with the mock MCP server + handler = HyperPodClusterNodeHandler(mock_mcp) + + # Create a mock context + mock_ctx = MagicMock(spec=Context) + + # Create a mock SageMaker client + mock_sagemaker_client = MagicMock() + mock_sagemaker_client.list_clusters.return_value = { + 'ClusterSummaries': [ + { + 'ClusterName': 'test-cluster', + 'ClusterArn': 'arn:aws:sagemaker:us-west-2:123456789012:cluster/test-cluster', + 'ClusterStatus': 'InService', + 'CreationTime': '2023-01-01T00:00:00Z', + 'TrainingPlanArns': [ + 'arn:aws:sagemaker:us-west-2:123456789012:training-plan/test-plan' + ], + } + ], + 'NextToken': 'next-token', + } + + # Mock the get_sagemaker_client method to return our mock client + with patch.object( + handler, 'get_sagemaker_client', return_value=mock_sagemaker_client + ) as mock_get_client: + # Call the _list_hp_clusters method + result = await handler._list_hp_clusters( + ctx=mock_ctx, + max_results=10, + next_token='token', + name_contains='test', + creation_time_after='2023-01-01T00:00:00Z', + creation_time_before='2023-01-02T00:00:00Z', + sort_by='NAME', + sort_order='Descending', + training_plan_arn='arn:aws:sagemaker:us-west-2:123456789012:training-plan/test-plan', + region_name='us-west-2', + profile_name='test-profile', + ) + + # Verify that get_sagemaker_client was called with the correct parameters + mock_get_client.assert_called_once_with( + mock_ctx, region_name='us-west-2', profile_name='test-profile' + ) + + # Verify that list_clusters was called with the correct parameters + mock_sagemaker_client.list_clusters.assert_called_once_with( + MaxResults=10, + NextToken='token', + NameContains='test', + CreationTimeAfter='2023-01-01T00:00:00Z', + CreationTimeBefore='2023-01-02T00:00:00Z', + SortBy='NAME', + SortOrder='Descending', + TrainingPlanArn='arn:aws:sagemaker:us-west-2:123456789012:training-plan/test-plan', + ) + + # Verify the result + assert not result.isError + assert len(result.clusters) == 1 + assert 
result.clusters[0].cluster_name == 'test-cluster' + assert ( + result.clusters[0].cluster_arn + == 'arn:aws:sagemaker:us-west-2:123456789012:cluster/test-cluster' + ) + assert result.clusters[0].cluster_status == 'InService' + assert result.clusters[0].creation_time == '2023-01-01T00:00:00Z' + assert result.clusters[0].training_plan_arns == [ + 'arn:aws:sagemaker:us-west-2:123456789012:training-plan/test-plan' + ] + assert result.next_token == 'next-token' + assert len(result.content) == 1 + assert result.content[0].type == 'text' + assert 'Successfully listed 1 SageMaker HyperPod clusters' in result.content[0].text + + @pytest.mark.asyncio + async def test_list_hp_clusters_error(self): + """Test that _list_hp_clusters handles errors correctly.""" + # Create a mock MCP server + mock_mcp = MagicMock() + + # Initialize the HyperPod Cluster Node handler with the mock MCP server + handler = HyperPodClusterNodeHandler(mock_mcp) + + # Create a mock context + mock_ctx = MagicMock(spec=Context) + + # Create a mock SageMaker client + mock_sagemaker_client = MagicMock() + mock_sagemaker_client.list_clusters.side_effect = Exception('Test error') + + # Mock the get_sagemaker_client method to return our mock client + with patch.object( + handler, 'get_sagemaker_client', return_value=mock_sagemaker_client + ) as mock_get_client: + # Call the _list_hp_clusters method + result = await handler._list_hp_clusters( + ctx=mock_ctx, + region_name='us-west-2', + ) + + # Verify that get_sagemaker_client was called with the correct parameters + mock_get_client.assert_called_once_with( + mock_ctx, region_name='us-west-2', profile_name=ANY + ) + + # Verify that list_clusters was called + mock_sagemaker_client.list_clusters.assert_called_once() + + # Verify the result + assert result.isError + assert len(result.clusters) == 0 + assert result.next_token is None + assert len(result.content) == 1 + assert result.content[0].type == 'text' + assert ( + 'Failed to list SageMaker HyperPod clusters: 
Test error' in result.content[0].text + ) + + @pytest.mark.asyncio + async def test_describe_hp_cluster_node_success(self): + """Test that _describe_hp_cluster_node returns node details successfully.""" + # Create a mock MCP server + mock_mcp = MagicMock() + + # Initialize the HyperPod cluster node handler with the mock MCP server + handler = HyperPodClusterNodeHandler(mock_mcp) + + # Create a mock context + mock_ctx = MagicMock(spec=Context) + + # Create a mock SageMaker client + mock_sagemaker_client = MagicMock() + mock_sagemaker_client.describe_cluster_node.return_value = { + 'NodeDetails': { + 'InstanceGroupName': 'test-group', + 'InstanceId': 'i-1234567890abcdef0', + 'InstanceStatus': { + 'Status': 'Running', + 'Message': 'Node is running', + }, + 'InstanceType': 'ml.g5.8xlarge', + 'LaunchTime': '2023-01-01T00:00:00Z', + 'LastSoftwareUpdateTime': '2023-01-02T00:00:00Z', + 'InstanceStorageConfigs': [ + { + 'EbsVolumeConfig': { + 'VolumeSizeInGb': 500, + }, + }, + ], + 'LifeCycleConfig': { + 'OnCreate': 'echo "Hello, World!"', + 'SourceS3Uri': 's3://bucket/path', + }, + 'OverrideVpcConfig': { + 'SecurityGroupIds': ['sg-1234567890abcdef0'], + 'Subnets': ['subnet-1234567890abcdef0'], + }, + 'Placement': { + 'AvailabilityZone': 'us-west-2a', + 'AvailabilityZoneId': 'usw2-az1', + }, + 'PrivateDnsHostname': 'ip-10-0-0-1.us-west-2.compute.internal', + 'PrivatePrimaryIp': '10.0.0.1', + 'PrivatePrimaryIpv6': '2001:db8::1', + 'ThreadsPerCore': 1, + }, + } + + # Mock the get_sagemaker_client method to return our mock client + with patch.object( + handler, 'get_sagemaker_client', return_value=mock_sagemaker_client + ) as mock_get_client: + # Call the _describe_hp_cluster_node method + result = await handler._describe_hp_cluster_node( + ctx=mock_ctx, + cluster_name='test-cluster', + node_id='i-1234567890abcdef0', + region_name='us-west-2', + profile_name='test-profile', + ) + + # Verify that get_sagemaker_client was called with the correct parameters + 
mock_get_client.assert_called_once_with( + mock_ctx, region_name='us-west-2', profile_name='test-profile' + ) + + # Verify that describe_cluster_node was called with the correct parameters + mock_sagemaker_client.describe_cluster_node.assert_called_once_with( + ClusterName='test-cluster', + NodeId='i-1234567890abcdef0', + ) + + # Verify the result + assert not result.isError + assert result.node_details is not None + assert result.node_details.instance_group_name == 'test-group' + assert result.node_details.instance_id == 'i-1234567890abcdef0' + assert result.node_details.instance_status.status == 'Running' + assert result.node_details.instance_status.message == 'Node is running' + assert result.node_details.instance_type == 'ml.g5.8xlarge' + assert result.node_details.launch_time == '2023-01-01T00:00:00Z' + assert result.node_details.last_software_update_time == '2023-01-02T00:00:00Z' + assert result.node_details.instance_storage_configs is not None + assert len(result.node_details.instance_storage_configs) == 1 + assert result.node_details.instance_storage_configs[0].ebs_volume_config is not None + assert ( + result.node_details.instance_storage_configs[0].ebs_volume_config.volume_size_in_gb + == 500 + ) + assert result.node_details.life_cycle_config is not None + assert result.node_details.life_cycle_config.on_create == 'echo "Hello, World!"' + assert result.node_details.life_cycle_config.source_s3_uri == 's3://bucket/path' + assert result.node_details.override_vpc_config is not None + assert result.node_details.override_vpc_config.security_group_ids == [ + 'sg-1234567890abcdef0' + ] + assert result.node_details.override_vpc_config.subnets == ['subnet-1234567890abcdef0'] + assert result.node_details.placement is not None + assert result.node_details.placement.availability_zone == 'us-west-2a' + assert result.node_details.placement.availability_zone_id == 'usw2-az1' + assert ( + result.node_details.private_dns_hostname + == 
'ip-10-0-0-1.us-west-2.compute.internal' + ) + assert result.node_details.private_primary_ip == '10.0.0.1' + assert result.node_details.private_primary_ipv6 == '2001:db8::1' + assert result.node_details.threads_per_core == 1 + assert len(result.content) == 1 + assert result.content[0].type == 'text' + assert ( + 'Successfully described SageMaker HyperPod cluster node: i-1234567890abcdef0' + in result.content[0].text + ) + + @pytest.mark.asyncio + async def test_describe_hp_cluster_node_error(self): + """Test that _describe_hp_cluster_node handles errors correctly.""" + # Create a mock MCP server + mock_mcp = MagicMock() + + # Initialize the HyperPod cluster node handler with the mock MCP server + handler = HyperPodClusterNodeHandler(mock_mcp) + + # Create a mock context + mock_ctx = MagicMock(spec=Context) + + # Create a mock SageMaker client + mock_sagemaker_client = MagicMock() + mock_sagemaker_client.describe_cluster_node.side_effect = Exception('Test error') + + # Mock the get_sagemaker_client method to return our mock client + with patch.object( + handler, 'get_sagemaker_client', return_value=mock_sagemaker_client + ) as mock_get_client: + # Call the _describe_hp_cluster_node method + result = await handler._describe_hp_cluster_node( + ctx=mock_ctx, + cluster_name='test-cluster', + node_id='i-1234567890abcdef0', + region_name='us-west-2', + ) + + # Verify that get_sagemaker_client was called with the correct parameters + mock_get_client.assert_called_once_with( + mock_ctx, region_name='us-west-2', profile_name=ANY + ) + + # Verify that describe_cluster_node was called with the correct parameters + mock_sagemaker_client.describe_cluster_node.assert_called_once_with( + ClusterName='test-cluster', + NodeId='i-1234567890abcdef0', + ) + + # Verify the result + assert result.isError + assert result.node_details is None + assert len(result.content) == 1 + assert result.content[0].type == 'text' + assert ( + 'Failed to describe SageMaker HyperPod cluster node: Test 
error' + in result.content[0].text + ) + + @pytest.mark.asyncio + async def test_list_hp_cluster_nodes_success(self): + """Test that _list_hp_cluster_nodes returns a list of nodes successfully.""" + # Create a mock MCP server + mock_mcp = MagicMock() + + # Initialize the HyperPod cluster node handler with the mock MCP server + handler = HyperPodClusterNodeHandler(mock_mcp) + + # Create a mock context + mock_ctx = MagicMock(spec=Context) + + # Create a mock SageMaker client + mock_sagemaker_client = MagicMock() + mock_sagemaker_client.list_cluster_nodes.return_value = { + 'ClusterNodeSummaries': [ + { + 'InstanceGroupName': 'test-group', + 'InstanceId': 'i-1234567890abcdef0', + 'InstanceStatus': { + 'Status': 'Running', + 'Message': 'Node is running', + }, + 'InstanceType': 'ml.g5.8xlarge', + 'LaunchTime': '2023-01-01T00:00:00Z', + 'LastSoftwareUpdateTime': '2023-01-02T00:00:00Z', + }, + ], + 'NextToken': 'next-token', + } + + # Mock the get_sagemaker_client method to return our mock client + with patch.object( + handler, 'get_sagemaker_client', return_value=mock_sagemaker_client + ) as mock_get_client: + # Call the _list_hp_cluster_nodes method + result = await handler._list_hp_cluster_nodes( + ctx=mock_ctx, + cluster_name='test-cluster', + creation_time_after='2023-01-01T00:00:00Z', + creation_time_before='2023-01-02T00:00:00Z', + instance_group_name_contains='test', + max_results=10, + next_token='token', + sort_by='NAME', + sort_order='Descending', + region_name='us-west-2', + profile_name='test-profile', + ) + + # Verify that get_sagemaker_client was called with the correct parameters + mock_get_client.assert_called_once_with( + mock_ctx, region_name='us-west-2', profile_name='test-profile' + ) + + # Verify that list_cluster_nodes was called with the correct parameters + mock_sagemaker_client.list_cluster_nodes.assert_called_once_with( + ClusterName='test-cluster', + CreationTimeAfter='2023-01-01T00:00:00Z', + CreationTimeBefore='2023-01-02T00:00:00Z', + 
InstanceGroupNameContains='test', + MaxResults=10, + NextToken='token', + SortBy='NAME', + SortOrder='Descending', + ) + + # Verify the result + assert not result.isError + assert len(result.nodes) == 1 + assert result.nodes[0].instance_group_name == 'test-group' + assert result.nodes[0].instance_id == 'i-1234567890abcdef0' + assert result.nodes[0].instance_status.status == 'Running' + assert result.nodes[0].instance_status.message == 'Node is running' + assert result.nodes[0].instance_type == 'ml.g5.8xlarge' + assert result.nodes[0].launch_time == '2023-01-01T00:00:00Z' + assert result.nodes[0].last_software_update_time == '2023-01-02T00:00:00Z' + assert result.next_token == 'next-token' + assert len(result.content) == 1 + assert result.content[0].type == 'text' + assert ( + 'Successfully listed 1 SageMaker HyperPod cluster nodes' in result.content[0].text + ) + + @pytest.mark.asyncio + async def test_list_hp_cluster_nodes_error(self): + """Test that _list_hp_cluster_nodes handles errors correctly.""" + # Create a mock MCP server + mock_mcp = MagicMock() + + # Initialize the HyperPod cluster node handler with the mock MCP server + handler = HyperPodClusterNodeHandler(mock_mcp) + + # Create a mock context + mock_ctx = MagicMock(spec=Context) + + # Create a mock SageMaker client + mock_sagemaker_client = MagicMock() + mock_sagemaker_client.list_cluster_nodes.side_effect = Exception('Test error') + + # Mock the get_sagemaker_client method to return our mock client + with patch.object( + handler, 'get_sagemaker_client', return_value=mock_sagemaker_client + ) as mock_get_client: + # Call the _list_hp_cluster_nodes method + result = await handler._list_hp_cluster_nodes( + ctx=mock_ctx, + cluster_name='test-cluster', + region_name='us-west-2', + ) + + # Verify that get_sagemaker_client was called with the correct parameters + mock_get_client.assert_called_once_with( + mock_ctx, region_name='us-west-2', profile_name=ANY + ) + + # Verify that list_cluster_nodes was called 
with the correct parameters + # Use ANY for additional parameters that might be added by default + mock_sagemaker_client.list_cluster_nodes.assert_called_once() + args, kwargs = mock_sagemaker_client.list_cluster_nodes.call_args + assert kwargs['ClusterName'] == 'test-cluster' + + # Verify the result + assert result.isError + assert len(result.nodes) == 0 + assert result.next_token is None + assert len(result.content) == 1 + assert result.content[0].type == 'text' + assert ( + 'Failed to list SageMaker HyperPod cluster nodes: Test error' + in result.content[0].text + ) + + @pytest.mark.asyncio + async def test_update_hp_cluster_software_success(self): + """Test that _update_hp_cluster_software updates cluster software successfully.""" + # Create a mock MCP server + mock_mcp = MagicMock() + + # Initialize the HyperPod cluster node handler with the mock MCP server and allow_write=True + handler = HyperPodClusterNodeHandler(mock_mcp, allow_write=True) + + # Create a mock context + mock_ctx = MagicMock(spec=Context) + + # Create a mock SageMaker client + mock_sagemaker_client = MagicMock() + mock_sagemaker_client.update_cluster_software.return_value = { + 'ClusterArn': 'arn:aws:sagemaker:us-west-2:123456789012:cluster/test-cluster', + } + + # Mock the get_sagemaker_client method to return our mock client + with patch.object(handler, 'get_sagemaker_client', return_value=mock_sagemaker_client): + # Mock the deployment_config and instance_groups attributes + with patch.object(handler, '_update_hp_cluster_software') as mock_update: + mock_update.return_value = UpdateClusterSoftwareResponse( + isError=False, + content=[ + TextContent( + type='text', + text='Successfully initiated software update for SageMaker HyperPod cluster: test-cluster', + ) + ], + cluster_arn='arn:aws:sagemaker:us-west-2:123456789012:cluster/test-cluster', + ) + + # Call the _update_hp_cluster_software method + result = await mock_update( + ctx=mock_ctx, + cluster_name='test-cluster', + 
region_name='us-west-2', + profile_name='test-profile', + ) + + # Verify the result + assert not result.isError + assert ( + result.cluster_arn + == 'arn:aws:sagemaker:us-west-2:123456789012:cluster/test-cluster' + ) + assert len(result.content) == 1 + assert result.content[0].type == 'text' + assert ( + 'Successfully initiated software update for SageMaker HyperPod cluster: test-cluster' + in result.content[0].text + ) + + @pytest.mark.asyncio + async def test_update_hp_cluster_software_with_deployment_config(self): + """Test that _update_hp_cluster_software updates cluster software with deployment config.""" + # Create a mock MCP server + mock_mcp = MagicMock() + + # Initialize the HyperPod cluster node handler with the mock MCP server and allow_write=True + handler = HyperPodClusterNodeHandler(mock_mcp, allow_write=True) + + # Create a mock context + mock_ctx = MagicMock(spec=Context) + + # Create a mock SageMaker client + mock_sagemaker_client = MagicMock() + mock_sagemaker_client.update_cluster_software.return_value = { + 'ClusterArn': 'arn:aws:sagemaker:us-west-2:123456789012:cluster/test-cluster', + } + + # Mock the get_sagemaker_client method to return our mock client + with patch.object( + handler, 'get_sagemaker_client', return_value=mock_sagemaker_client + ) as mock_get_client: + # Create deployment config + from awslabs.sagemaker_hyperpod_mcp_server.models import ( + AlarmDetails, + CapacitySizeConfig, + DeploymentConfiguration, + RollingDeploymentPolicy, + UpdateClusterSoftwareInstanceGroupSpecification, + ) + + deployment_config = DeploymentConfiguration( + auto_rollback_configuration=[AlarmDetails(alarm_name='test-alarm')], + rolling_update_policy=RollingDeploymentPolicy( + maximum_batch_size=CapacitySizeConfig(type='INSTANCE_COUNT', value=1), + rollback_maximum_batch_size=CapacitySizeConfig( + type='CAPACITY_PERCENTAGE', value=50 + ), + ), + wait_interval_in_seconds=60, + ) + + instance_groups = [ + 
UpdateClusterSoftwareInstanceGroupSpecification(instance_group_name='test-group') + ] + + # Call the _update_hp_cluster_software method with deployment config and instance groups + result = await handler._update_hp_cluster_software( + ctx=mock_ctx, + cluster_name='test-cluster', + deployment_config=deployment_config, + instance_groups=instance_groups, + region_name='us-west-2', + profile_name='test-profile', + ) + + # Verify that get_sagemaker_client was called with the correct parameters + mock_get_client.assert_called_once_with( + mock_ctx, region_name='us-west-2', profile_name='test-profile' + ) + + # Verify that update_cluster_software was called with the correct parameters + mock_sagemaker_client.update_cluster_software.assert_called_once() + args, kwargs = mock_sagemaker_client.update_cluster_software.call_args + assert kwargs['ClusterName'] == 'test-cluster' + + # Verify deployment config + assert 'DeploymentConfig' in kwargs + assert 'AutoRollbackConfiguration' in kwargs['DeploymentConfig'] + assert ( + kwargs['DeploymentConfig']['AutoRollbackConfiguration'][0]['AlarmName'] + == 'test-alarm' + ) + assert 'RollingUpdatePolicy' in kwargs['DeploymentConfig'] + assert ( + kwargs['DeploymentConfig']['RollingUpdatePolicy']['MaximumBatchSize']['Type'] + == 'INSTANCE_COUNT' + ) + assert ( + kwargs['DeploymentConfig']['RollingUpdatePolicy']['MaximumBatchSize']['Value'] == 1 + ) + assert ( + kwargs['DeploymentConfig']['RollingUpdatePolicy']['RollbackMaximumBatchSize'][ + 'Type' + ] + == 'CAPACITY_PERCENTAGE' + ) + assert ( + kwargs['DeploymentConfig']['RollingUpdatePolicy']['RollbackMaximumBatchSize'][ + 'Value' + ] + == 50 + ) + assert kwargs['DeploymentConfig']['WaitIntervalInSeconds'] == 60 + + # Verify instance groups + assert 'InstanceGroups' in kwargs + assert len(kwargs['InstanceGroups']) == 1 + assert kwargs['InstanceGroups'][0]['InstanceGroupName'] == 'test-group' + + # Verify the result + assert not result.isError + assert ( + result.cluster_arn + == 
'arn:aws:sagemaker:us-west-2:123456789012:cluster/test-cluster' + ) + assert len(result.content) == 1 + assert result.content[0].type == 'text' + assert ( + 'Successfully initiated software update for SageMaker HyperPod cluster: test-cluster' + in result.content[0].text + ) + + @pytest.mark.asyncio + async def test_update_hp_cluster_software_error(self): + """Test that _update_hp_cluster_software handles errors correctly.""" + # Create a mock MCP server + mock_mcp = MagicMock() + + # Initialize the HyperPod cluster node handler with the mock MCP server and allow_write=True + handler = HyperPodClusterNodeHandler(mock_mcp, allow_write=True) + + # Create a mock context + mock_ctx = MagicMock(spec=Context) + + # Create a mock SageMaker client + mock_sagemaker_client = MagicMock() + mock_sagemaker_client.update_cluster_software.side_effect = Exception('Test error') + + # Mock the get_sagemaker_client method to return our mock client + with patch.object( + handler, 'get_sagemaker_client', return_value=mock_sagemaker_client + ) as mock_get_client: + # Call the _update_hp_cluster_software method + result = await handler._update_hp_cluster_software( + ctx=mock_ctx, + cluster_name='test-cluster', + region_name='us-west-2', + ) + + # Verify that get_sagemaker_client was called with the correct parameters + mock_get_client.assert_called_once_with( + mock_ctx, region_name='us-west-2', profile_name=ANY + ) + + # Mock the update_cluster_software method to avoid the actual call + with patch.object(mock_sagemaker_client, 'update_cluster_software') as mock_update: + mock_update.side_effect = Exception('Test error') + + # Verify the result + assert result.isError + assert result.cluster_arn == '' + assert len(result.content) == 1 + assert result.content[0].type == 'text' + # The actual error message might be different, just check that it contains the key parts + assert ( + 'Failed to update software for SageMaker HyperPod cluster' + in result.content[0].text + ) + + 
@pytest.mark.asyncio + async def test_batch_delete_hp_cluster_nodes_success(self): + """Test that _batch_delete_hp_cluster_nodes deletes nodes successfully.""" + # Create a mock MCP server + mock_mcp = MagicMock() + + # Initialize the HyperPod cluster node handler with the mock MCP server and allow_write=True + handler = HyperPodClusterNodeHandler(mock_mcp, allow_write=True) + + # Create a mock context + mock_ctx = MagicMock(spec=Context) + + # Create a mock SageMaker client + mock_sagemaker_client = MagicMock() + mock_sagemaker_client.batch_delete_cluster_nodes.return_value = { + 'Successful': ['i-1234567890abcdef0', 'i-0987654321fedcba0'], + 'Failed': [], + } + + # Mock the get_sagemaker_client method to return our mock client + with patch.object( + handler, 'get_sagemaker_client', return_value=mock_sagemaker_client + ) as mock_get_client: + # Call the _batch_delete_hp_cluster_nodes method + result = await handler._batch_delete_hp_cluster_nodes( + ctx=mock_ctx, + cluster_name='test-cluster', + node_ids=['i-1234567890abcdef0', 'i-0987654321fedcba0'], + region_name='us-west-2', + profile_name='test-profile', + ) + + # Verify that get_sagemaker_client was called with the correct parameters + mock_get_client.assert_called_once_with( + mock_ctx, region_name='us-west-2', profile_name='test-profile' + ) + + # Verify that batch_delete_cluster_nodes was called with the correct parameters + mock_sagemaker_client.batch_delete_cluster_nodes.assert_called_once_with( + ClusterName='test-cluster', + NodeIds=['i-1234567890abcdef0', 'i-0987654321fedcba0'], + ) + + # Verify the result + assert not result.isError + assert result.cluster_name == 'test-cluster' + assert result.successful == ['i-1234567890abcdef0', 'i-0987654321fedcba0'] + assert result.failed is None or len(result.failed) == 0 + assert len(result.content) == 1 + assert result.content[0].type == 'text' + assert ( + 'Successfully deleted 2 nodes from SageMaker HyperPod cluster: test-cluster' + in result.content[0].text 
+ ) + + @pytest.mark.asyncio + async def test_batch_delete_hp_cluster_nodes_with_failures(self): + """Test that _batch_delete_hp_cluster_nodes handles partial failures correctly.""" + # Create a mock MCP server + mock_mcp = MagicMock() + + # Initialize the HyperPod cluster node handler with the mock MCP server and allow_write=True + handler = HyperPodClusterNodeHandler(mock_mcp, allow_write=True) + + # Create a mock context + mock_ctx = MagicMock(spec=Context) + + # Create a mock SageMaker client + mock_sagemaker_client = MagicMock() + mock_sagemaker_client.batch_delete_cluster_nodes.return_value = { + 'Successful': ['i-1234567890abcdef0'], + 'Failed': [ + { + 'NodeId': 'i-0987654321fedcba0', + 'Code': 'ValidationException', + 'Message': 'Node is a controller node and cannot be deleted', + } + ], + } + + # Mock the get_sagemaker_client method to return our mock client + with patch.object( + handler, 'get_sagemaker_client', return_value=mock_sagemaker_client + ) as mock_get_client: + # Call the _batch_delete_hp_cluster_nodes method + result = await handler._batch_delete_hp_cluster_nodes( + ctx=mock_ctx, + cluster_name='test-cluster', + node_ids=['i-1234567890abcdef0', 'i-0987654321fedcba0'], + region_name='us-west-2', + profile_name='test-profile', + ) + + # Verify that get_sagemaker_client was called with the correct parameters + mock_get_client.assert_called_once_with( + mock_ctx, region_name='us-west-2', profile_name='test-profile' + ) + + # Verify that batch_delete_cluster_nodes was called with the correct parameters + mock_sagemaker_client.batch_delete_cluster_nodes.assert_called_once_with( + ClusterName='test-cluster', + NodeIds=['i-1234567890abcdef0', 'i-0987654321fedcba0'], + ) + + # Verify the result + assert not result.isError + assert result.cluster_name == 'test-cluster' + assert result.successful == ['i-1234567890abcdef0'] + assert result.failed is not None + assert len(result.failed) == 1 + assert result.failed[0].node_id == 'i-0987654321fedcba0' + 
assert result.failed[0].code == 'ValidationException' + assert result.failed[0].message == 'Node is a controller node and cannot be deleted' + assert len(result.content) == 1 + assert result.content[0].type == 'text' + assert ( + 'Successfully deleted 1 nodes from SageMaker HyperPod cluster: test-cluster. Failed deletions: 1' + in result.content[0].text + ) + + @pytest.mark.asyncio + async def test_batch_delete_hp_cluster_nodes_error(self): + """Test that _batch_delete_hp_cluster_nodes handles errors correctly.""" + # Create a mock MCP server + mock_mcp = MagicMock() + + # Initialize the HyperPod cluster node handler with the mock MCP server and allow_write=True + handler = HyperPodClusterNodeHandler(mock_mcp, allow_write=True) + + # Create a mock context + mock_ctx = MagicMock(spec=Context) + + # Create a mock SageMaker client + mock_sagemaker_client = MagicMock() + mock_sagemaker_client.batch_delete_cluster_nodes.side_effect = Exception('Test error') + + # Mock the get_sagemaker_client method to return our mock client + with patch.object( + handler, 'get_sagemaker_client', return_value=mock_sagemaker_client + ) as mock_get_client: + # Call the _batch_delete_hp_cluster_nodes method + result = await handler._batch_delete_hp_cluster_nodes( + ctx=mock_ctx, + cluster_name='test-cluster', + node_ids=['i-1234567890abcdef0'], + region_name='us-west-2', + ) + + # Verify that get_sagemaker_client was called with the correct parameters + mock_get_client.assert_called_once_with( + mock_ctx, region_name='us-west-2', profile_name=ANY + ) + + # Verify that batch_delete_cluster_nodes was called with the correct parameters + mock_sagemaker_client.batch_delete_cluster_nodes.assert_called_once_with( + ClusterName='test-cluster', + NodeIds=['i-1234567890abcdef0'], + ) + + # Verify the result + assert result.isError + assert result.cluster_name == 'test-cluster' + assert result.successful == [] + assert result.failed is None or len(result.failed) == 0 + assert len(result.content) == 
1 + assert result.content[0].type == 'text' + assert ( + 'Failed to delete nodes from SageMaker HyperPod cluster: Test error' + in result.content[0].text + ) + + @pytest.mark.asyncio + async def test_manage_hyperpod_cluster_nodes_list_clusters(self): + """Test that manage_hyperpod_cluster_nodes handles the list_clusters operation correctly.""" + # Create a mock MCP server + mock_mcp = MagicMock() + + # Initialize the HyperPod cluster node handler with the mock MCP server + handler = HyperPodClusterNodeHandler(mock_mcp) + + # Create a mock context + mock_ctx = MagicMock(spec=Context) + + # Mock the _list_hp_clusters method + mock_result = ListClustersResponse( + isError=False, + content=[ + TextContent(type='text', text='Successfully listed SageMaker HyperPod clusters') + ], + clusters=[ + ClusterSummary( + cluster_name='test-cluster', + cluster_arn='arn:aws:sagemaker:us-west-2:123456789012:cluster/test-cluster', + cluster_status='InService', + creation_time='2023-01-01T00:00:00Z', + training_plan_arns=[ + 'arn:aws:sagemaker:us-west-2:123456789012:training-plan/test-plan' + ], + ) + ], + next_token='next-token', + ) + with patch.object(handler, '_list_hp_clusters', return_value=mock_result) as mock_handler: + # Call the manage_hyperpod_cluster_nodes method with list_clusters operation + result = await handler.manage_hyperpod_cluster_nodes( + ctx=mock_ctx, + operation='list_clusters', + max_results=10, + next_token='token', + name_contains='test', + sort_by='NAME', + sort_order='Descending', + training_plan_arn='arn:aws:sagemaker:us-west-2:123456789012:training-plan/test-plan', + region_name='us-west-2', + profile_name='test-profile', + ) + + # Verify that _list_hp_clusters was called with the correct parameters + mock_handler.assert_called_once() + call_args = mock_handler.call_args[1] + assert call_args['ctx'] == mock_ctx + assert call_args['max_results'] == 10 + assert call_args['next_token'] == 'token' + assert call_args['name_contains'] == 'test' + assert 
call_args['sort_by'] == 'NAME' + assert call_args['sort_order'] == 'Descending' + assert ( + call_args['training_plan_arn'] + == 'arn:aws:sagemaker:us-west-2:123456789012:training-plan/test-plan' + ) + assert call_args['region_name'] == 'us-west-2' + assert call_args['profile_name'] == 'test-profile' + + # Verify the result is the same as the mock result + assert result is mock_result + assert not result.isError + # Type assertion to help pyright understand this is ListClustersResponse + assert isinstance(result, ListClustersResponse) + assert len(result.clusters) == 1 + assert result.clusters[0].cluster_name == 'test-cluster' + assert result.next_token == 'next-token' + + @pytest.mark.asyncio + async def test_manage_hyperpod_cluster_nodes_list_nodes(self): + """Test that manage_hyperpod_cluster_nodes handles the list_nodes operation correctly.""" + # Create a mock MCP server + mock_mcp = MagicMock() + + # Initialize the HyperPod cluster node handler with the mock MCP server + handler = HyperPodClusterNodeHandler(mock_mcp) + + # Create a mock context + mock_ctx = MagicMock(spec=Context) + + # Mock the _list_hp_cluster_nodes method + mock_result = ListClusterNodesResponse( + isError=False, + content=[ + TextContent( + type='text', text='Successfully listed SageMaker HyperPod cluster nodes' + ) + ], + nodes=[ + ClusterNodeSummary( + instance_group_name='test-group', + instance_id='i-1234567890abcdef0', + instance_status=ClusterInstanceStatusDetails( + status='Running', + message='Node is running', + ), + instance_type='ml.g5.8xlarge', + launch_time='2023-01-01T00:00:00Z', + last_software_update_time='2023-01-02T00:00:00Z', + ) + ], + next_token='next-token', + ) + with patch.object( + handler, '_list_hp_cluster_nodes', return_value=mock_result + ) as mock_handler: + # Call the manage_hyperpod_cluster_nodes method with list_nodes operation + result = await handler.manage_hyperpod_cluster_nodes( + ctx=mock_ctx, + operation='list_nodes', + cluster_name='test-cluster', + 
creation_time_after='2023-01-01T00:00:00Z', + creation_time_before='2023-01-02T00:00:00Z', + instance_group_name_contains='test', + max_results=10, + next_token='token', + sort_by='NAME', + sort_order='Descending', + region_name='us-west-2', + profile_name='test-profile', + ) + + # Verify that _list_hp_cluster_nodes was called with the correct parameters + mock_handler.assert_called_once() + call_args = mock_handler.call_args[1] + assert call_args['ctx'] == mock_ctx + assert call_args['cluster_name'] == 'test-cluster' + assert call_args['creation_time_after'] == '2023-01-01T00:00:00Z' + assert call_args['creation_time_before'] == '2023-01-02T00:00:00Z' + assert call_args['instance_group_name_contains'] == 'test' + assert call_args['max_results'] == 10 + assert call_args['next_token'] == 'token' + assert call_args['sort_by'] == 'NAME' + assert call_args['sort_order'] == 'Descending' + assert call_args['region_name'] == 'us-west-2' + assert call_args['profile_name'] == 'test-profile' + + # Verify the result is the same as the mock result + assert result is mock_result + assert not result.isError + assert isinstance(result, ListClusterNodesResponse) + assert len(result.nodes) == 1 + assert result.nodes[0].instance_id == 'i-1234567890abcdef0' + assert result.next_token == 'next-token' + + @pytest.mark.asyncio + async def test_manage_hyperpod_cluster_nodes_describe_node(self): + """Test that manage_hyperpod_cluster_nodes handles the describe_node operation correctly.""" + # Create a mock MCP server + mock_mcp = MagicMock() + + # Initialize the HyperPod cluster node handler with the mock MCP server + handler = HyperPodClusterNodeHandler(mock_mcp) + + # Create a mock context + mock_ctx = MagicMock(spec=Context) + + # Mock the _describe_hp_cluster_node method + mock_result = DescribeClusterNodeResponse( + isError=False, + content=[ + TextContent( + type='text', text='Successfully described SageMaker HyperPod cluster node' + ) + ], + node_details=ClusterNodeDetails( + 
instance_group_name='test-group', + instance_id='i-1234567890abcdef0', + instance_status=ClusterInstanceStatusDetails( + status='Running', + message='Node is running', + ), + instance_type='ml.g5.8xlarge', + launch_time='2023-01-01T00:00:00Z', + last_software_update_time='2023-01-02T00:00:00Z', + instance_storage_configs=None, + life_cycle_config=None, + override_vpc_config=None, + placement=None, + private_dns_hostname='ip-10-0-0-1.us-west-2.compute.internal', + private_primary_ip='10.0.0.1', + private_primary_ipv6='2001:db8::1', + threads_per_core=1, + ), + ) + with patch.object( + handler, '_describe_hp_cluster_node', return_value=mock_result + ) as mock_handler: + # Call the manage_hyperpod_cluster_nodes method with describe_node operation + result = await handler.manage_hyperpod_cluster_nodes( + ctx=mock_ctx, + operation='describe_node', + cluster_name='test-cluster', + node_id='i-1234567890abcdef0', + region_name='us-west-2', + profile_name='test-profile', + ) + + # Verify that _describe_hp_cluster_node was called with the correct parameters + mock_handler.assert_called_once() + call_args = mock_handler.call_args[1] + assert call_args['ctx'] == mock_ctx + assert call_args['cluster_name'] == 'test-cluster' + assert call_args['node_id'] == 'i-1234567890abcdef0' + assert call_args['region_name'] == 'us-west-2' + assert call_args['profile_name'] == 'test-profile' + + # Verify the result is the same as the mock result + assert result is mock_result + assert not result.isError + # Type assertion to help pyright understand this is DescribeClusterNodeResponse + assert isinstance(result, DescribeClusterNodeResponse) + assert result.node_details is not None + assert result.node_details.instance_id == 'i-1234567890abcdef0' + assert result.node_details.instance_group_name == 'test-group' + + @pytest.mark.asyncio + async def test_manage_hyperpod_cluster_nodes_update_software(self): + """Test that manage_hyperpod_cluster_nodes handles the update_software operation 
correctly.""" + # Create a mock MCP server + mock_mcp = MagicMock() + + # Initialize the HyperPod cluster node handler with the mock MCP server and allow_write=True + handler = HyperPodClusterNodeHandler(mock_mcp, allow_write=True) + + # Create a mock context + mock_ctx = MagicMock(spec=Context) + + # Mock the _update_hp_cluster_software method + mock_result = UpdateClusterSoftwareResponse( + isError=False, + content=[ + TextContent( + type='text', + text='Successfully initiated software update for SageMaker HyperPod cluster', + ) + ], + cluster_arn='arn:aws:sagemaker:us-west-2:123456789012:cluster/test-cluster', + ) + with patch.object( + handler, '_update_hp_cluster_software', return_value=mock_result + ) as mock_handler: + # Create deployment config and instance groups + from awslabs.sagemaker_hyperpod_mcp_server.models import ( + AlarmDetails, + CapacitySizeConfig, + DeploymentConfiguration, + RollingDeploymentPolicy, + UpdateClusterSoftwareInstanceGroupSpecification, + ) + + deployment_config = DeploymentConfiguration( + auto_rollback_configuration=[AlarmDetails(alarm_name='test-alarm')], + rolling_update_policy=RollingDeploymentPolicy( + maximum_batch_size=CapacitySizeConfig(type='INSTANCE_COUNT', value=1), + rollback_maximum_batch_size=CapacitySizeConfig( + type='CAPACITY_PERCENTAGE', value=50 + ), + ), + wait_interval_in_seconds=60, + ) + + instance_groups = [ + UpdateClusterSoftwareInstanceGroupSpecification(instance_group_name='test-group') + ] + + # Call the manage_hyperpod_cluster_nodes method with update_software operation + result = await handler.manage_hyperpod_cluster_nodes( + ctx=mock_ctx, + operation='update_software', + cluster_name='test-cluster', + deployment_config=deployment_config, + instance_groups=instance_groups, + region_name='us-west-2', + profile_name='test-profile', + ) + + # Verify that _update_hp_cluster_software was called with the correct parameters + mock_handler.assert_called_once() + call_args = mock_handler.call_args[1] + 
+            assert call_args['ctx'] == mock_ctx
+            assert call_args['cluster_name'] == 'test-cluster'
+            assert call_args['deployment_config'] == deployment_config
+            assert call_args['instance_groups'] == instance_groups
+            assert call_args['region_name'] == 'us-west-2'
+            assert call_args['profile_name'] == 'test-profile'
+
+            # Verify the result is the same as the mock result
+            assert result is mock_result
+            assert not result.isError
+            # Type assertion to help pyright understand this is UpdateClusterSoftwareResponse
+            assert isinstance(result, UpdateClusterSoftwareResponse)
+            assert (
+                result.cluster_arn
+                == 'arn:aws:sagemaker:us-west-2:123456789012:cluster/test-cluster'
+            )
+
+    @pytest.mark.asyncio
+    async def test_manage_hyperpod_cluster_nodes_batch_delete(self):
+        """Test that manage_hyperpod_cluster_nodes handles the batch_delete operation correctly."""
+        # Create a mock MCP server
+        mock_mcp = MagicMock()
+
+        # Initialize the HyperPod Cluster Node handler with the mock MCP server and allow_write=True
+        handler = HyperPodClusterNodeHandler(mock_mcp, allow_write=True)
+
+        # Create a mock context
+        mock_ctx = MagicMock(spec=Context)
+
+        # Mock the _batch_delete_hp_cluster_nodes method
+        mock_result = BatchDeleteClusterNodesResponse(
+            isError=False,
+            content=[
+                TextContent(
+                    type='text', text='Successfully deleted nodes from SageMaker HyperPod cluster'
+                )
+            ],
+            cluster_name='test-cluster',
+            successful=['i-1234567890abcdef0', 'i-0987654321fedcba0'],
+            failed=None,
+        )
+        with patch.object(
+            handler, '_batch_delete_hp_cluster_nodes', return_value=mock_result
+        ) as mock_handler:
+            # Call the manage_hyperpod_cluster_nodes method with batch_delete operation
+            result = await handler.manage_hyperpod_cluster_nodes(
+                ctx=mock_ctx,
+                operation='batch_delete',
+                cluster_name='test-cluster',
+                node_ids=['i-1234567890abcdef0', 'i-0987654321fedcba0'],
+                region_name='us-west-2',
+                profile_name='test-profile',
+            )
+
+            # Verify that _batch_delete_hp_cluster_nodes was called with the correct parameters
+            mock_handler.assert_called_once()
+            call_args = mock_handler.call_args[1]
+            assert call_args['ctx'] == mock_ctx
+            assert call_args['cluster_name'] == 'test-cluster'
+            assert call_args['node_ids'] == ['i-1234567890abcdef0', 'i-0987654321fedcba0']
+            assert call_args['region_name'] == 'us-west-2'
+            assert call_args['profile_name'] == 'test-profile'
+
+            # Verify the result is the same as the mock result
+            assert result is mock_result
+            assert not result.isError
+            # Type assertion to help pyright understand this is BatchDeleteClusterNodesResponse
+            assert isinstance(result, BatchDeleteClusterNodesResponse)
+            assert result.cluster_name == 'test-cluster'
+            assert result.successful == ['i-1234567890abcdef0', 'i-0987654321fedcba0']
+            assert result.failed is None
+
+    @pytest.mark.asyncio
+    async def test_manage_hyperpod_cluster_nodes_invalid_operation(self):
+        """Test that manage_hyperpod_cluster_nodes handles invalid operations correctly."""
+        # Create a mock MCP server
+        mock_mcp = MagicMock()
+
+        # Initialize the HyperPod cluster node handler with the mock MCP server
+        handler = HyperPodClusterNodeHandler(mock_mcp)
+
+        # Create a mock context
+        mock_ctx = MagicMock(spec=Context)
+
+        # Call the manage_hyperpod_cluster_nodes method with an invalid operation
+        with pytest.raises(ValueError, match='validation error'):
+            await handler.manage_hyperpod_cluster_nodes(
+                ctx=mock_ctx,
+                operation='invalid',  # pyright: ignore[reportArgumentType]
+                cluster_name='test-cluster',
+            )
+
+    @pytest.mark.asyncio
+    async def test_manage_hyperpod_cluster_nodes_missing_parameters(self):
+        """Test that manage_hyperpod_cluster_nodes handles missing parameters correctly."""
+        # Create a mock MCP server
+        mock_mcp = MagicMock()
+
+        # Initialize the HyperPod cluster node handler with the mock MCP server
+        handler = HyperPodClusterNodeHandler(mock_mcp)
+
+        # Create a mock context
+        mock_ctx = MagicMock(spec=Context)
+
+        # Test missing cluster_name for list_nodes operation
+        with pytest.raises(
+            ValueError, match='cluster_name is required for all operations except list_clusters'
+        ):
+            await handler.manage_hyperpod_cluster_nodes(
+                ctx=mock_ctx,
+                operation='list_nodes',
+                cluster_name=None,  # Explicitly pass None
+            )
+
+        # Test missing node_id for describe_node operation
+        with pytest.raises(ValueError, match='node_id is required for describe_node operation'):
+            await handler.manage_hyperpod_cluster_nodes(
+                ctx=mock_ctx,
+                operation='describe_node',
+                cluster_name='test-cluster',
+                node_id=None,  # Explicitly pass None
+            )
+
+        # Test missing node_ids for batch_delete operation
+        with pytest.raises(ValueError, match='node_ids is required for batch_delete operation'):
+            await handler.manage_hyperpod_cluster_nodes(
+                ctx=mock_ctx,
+                operation='batch_delete',
+                cluster_name='test-cluster',
+                node_ids=None,  # Explicitly pass None
+            )
+
+    @pytest.mark.asyncio
+    async def test_manage_hyperpod_cluster_nodes_write_access_disabled(self):
+        """Test that manage_hyperpod_cluster_nodes rejects mutating operations when write access is disabled."""
+        # Create a mock MCP server
+        mock_mcp = MagicMock()
+
+        # Initialize the HyperPod cluster node handler with the mock MCP server and allow_write=False
+        handler = HyperPodClusterNodeHandler(mock_mcp, allow_write=False)
+
+        # Create a mock context
+        mock_ctx = MagicMock(spec=Context)
+
+        # Test update_software operation (should be rejected when write access is disabled)
+        result = await handler.manage_hyperpod_cluster_nodes(
+            ctx=mock_ctx,
+            operation='update_software',
+            cluster_name='test-cluster',
+        )
+
+        # Verify the result
+        assert result.isError
+        assert len(result.content) == 1
+        assert result.content[0].type == 'text'
+        assert (
+            'Operation update_software is not allowed without write access'
+            in result.content[0].text
+        )
+
+        # Test batch_delete operation (should be rejected when write access is disabled)
+        result = await handler.manage_hyperpod_cluster_nodes(
+            ctx=mock_ctx,
+            operation='batch_delete',
+            cluster_name='test-cluster',
+            node_ids=['i-1234567890abcdef0'],
+        )
+
+        # Verify the result
+        assert result.isError
+        assert len(result.content) == 1
+        assert result.content[0].type == 'text'
+        assert (
+            'Operation batch_delete is not allowed without write access' in result.content[0].text
+        )
+
+    @pytest.mark.asyncio
+    async def test_update_hp_cluster_write_access_disabled(self):
+        """Test that update_hp_cluster returns an error when write access is disabled."""
+        # Create a mock MCP server
+        mock_mcp = MagicMock()
+
+        # Initialize the HyperPod cluster node handler with the mock MCP server and allow_write=False
+        handler = HyperPodClusterNodeHandler(mock_mcp, allow_write=False)
+
+        # Create a mock context
+        mock_ctx = MagicMock(spec=Context)
+
+        # Call the update_hp_cluster method
+        result = await handler.update_hp_cluster(
+            ctx=mock_ctx,
+            cluster_name='test-cluster',
+            instance_groups=[{'InstanceGroupName': 'test-group'}],
+            region_name='us-west-2',
+            profile_name='test-profile',
+        )
+
+        # Verify the result
+        assert result['isError'] is True
+        assert 'Write access is not enabled for this handler' in result['errorMessage']
+
+    @pytest.mark.asyncio
+    async def test_update_hp_cluster_success(self):
+        """Test that update_hp_cluster updates a cluster successfully when write access is enabled."""
+        # Create a mock MCP server
+        mock_mcp = MagicMock()
+
+        # Initialize the HyperPod cluster node handler with the mock MCP server and allow_write=True
+        handler = HyperPodClusterNodeHandler(mock_mcp, allow_write=True)
+
+        # Create a mock context
+        mock_ctx = MagicMock(spec=Context)
+
+        # Create a mock SageMaker client
+        mock_sagemaker_client = MagicMock()
+        mock_sagemaker_client.update_cluster.return_value = {
+            'ClusterArn': 'arn:aws:sagemaker:us-west-2:123456789012:cluster/test-cluster',
+        }
+
+        # Mock the get_sagemaker_client method to return our mock client
+        with patch.object(
+            handler, 'get_sagemaker_client', return_value=mock_sagemaker_client
+        ) as mock_get_client:
+            # Call the update_hp_cluster method
+            result = await handler.update_hp_cluster(
+                ctx=mock_ctx,
+                cluster_name='test-cluster',
+                instance_groups=[{'InstanceGroupName': 'test-group'}],
+                region_name='us-west-2',
+                profile_name='test-profile',
+            )
+
+            # Verify that get_sagemaker_client was called with the correct parameters
+            mock_get_client.assert_called_once_with(
+                mock_ctx, region_name='us-west-2', profile_name='test-profile'
+            )
+
+            # Verify that update_cluster was called with the correct parameters
+            mock_sagemaker_client.update_cluster.assert_called_once_with(
+                ClusterName='test-cluster',
+                InstanceGroups=[{'InstanceGroupName': 'test-group'}],
+            )
+
+            # Verify the result
+            assert result == {
+                'ClusterArn': 'arn:aws:sagemaker:us-west-2:123456789012:cluster/test-cluster',
+            }
+
+    @pytest.mark.asyncio
+    async def test_update_hp_cluster_api_error(self):
+        """Test that update_hp_cluster handles API call errors correctly in the sequential try-catch structure."""
+        # Create a mock MCP server
+        mock_mcp = MagicMock()
+
+        # Initialize the HyperPod cluster node handler with the mock MCP server and allow_write=True
+        handler = HyperPodClusterNodeHandler(mock_mcp, allow_write=True)
+
+        # Create a mock context
+        mock_ctx = MagicMock(spec=Context)
+
+        # Create a mock SageMaker client
+        mock_sagemaker_client = MagicMock()
+        mock_sagemaker_client.update_cluster.side_effect = Exception('API call error')
+
+        # Mock the get_sagemaker_client method to return our mock client
+        with patch.object(
+            handler, 'get_sagemaker_client', return_value=mock_sagemaker_client
+        ) as mock_get_client:
+            # Call the update_hp_cluster method
+            result = await handler.update_hp_cluster(
+                ctx=mock_ctx,
+                cluster_name='test-cluster',
+                instance_groups=[{'InstanceGroupName': 'test-group'}],
+                region_name='us-west-2',
+                profile_name='test-profile',
+            )
+
+            # Verify that get_sagemaker_client was called with the correct parameters
+            mock_get_client.assert_called_once_with(
+                mock_ctx, region_name='us-west-2', profile_name='test-profile'
+            )
+
+            # Verify that update_cluster was called with the correct parameters
+            mock_sagemaker_client.update_cluster.assert_called_once_with(
+                ClusterName='test-cluster',
+                InstanceGroups=[{'InstanceGroupName': 'test-group'}],
+            )
+
+            # Verify the result - should have specific error message from the API call try-catch block
+            assert result['isError'] is True
+            assert 'SageMaker update_cluster API error: API call error' in result['errorMessage']
+
+    @pytest.mark.asyncio
+    async def test_update_hp_cluster_client_error(self):
+        """Test that update_hp_cluster handles client creation errors correctly in the sequential try-catch structure."""
+        # Create a mock MCP server
+        mock_mcp = MagicMock()
+
+        # Initialize the HyperPod cluster node handler with the mock MCP server and allow_write=True
+        handler = HyperPodClusterNodeHandler(mock_mcp, allow_write=True)
+
+        # Create a mock context
+        mock_ctx = MagicMock(spec=Context)
+
+        # Mock the get_sagemaker_client method to raise an exception
+        with patch.object(
+            handler, 'get_sagemaker_client', side_effect=Exception('Client creation error')
+        ) as mock_get_client:
+            # Call the update_hp_cluster method
+            result = await handler.update_hp_cluster(
+                ctx=mock_ctx,
+                cluster_name='test-cluster',
+                instance_groups=[{'InstanceGroupName': 'test-group'}],
+                region_name='us-west-2',
+                profile_name='test-profile',
+            )
+
+            # Verify that get_sagemaker_client was called with the correct parameters
+            mock_get_client.assert_called_once_with(
+                mock_ctx, region_name='us-west-2', profile_name='test-profile'
+            )
+
+            # Verify the result - should have specific error message from the client creation try-catch block
+            assert result['isError'] is True
+            assert (
+                'Failed to prepare SageMaker client or parameters: Client creation error'
+                in result['errorMessage']
+            )
diff --git a/src/sagemaker-hyperpod-mcp-server/tests/test_hyperpod_stack_handler.py b/src/sagemaker-hyperpod-mcp-server/tests/test_hyperpod_stack_handler.py
new file mode 100644
index 0000000000..4967a86ba7
--- /dev/null
+++ b/src/sagemaker-hyperpod-mcp-server/tests/test_hyperpod_stack_handler.py
@@ -0,0 +1,1253 @@
+# Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+#     http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+# ruff: noqa: D101, D102, D103
+"""Tests for the HyperPod Stack Handler."""
+
+import json
+import pytest
+import yaml  # type: ignore
+from awslabs.sagemaker_hyperpod_mcp_server.aws_helper import AwsHelper
+from awslabs.sagemaker_hyperpod_mcp_server.consts import (
+    CAPABILITY_AUTO_EXPAND,
+    CFN_CAPABILITY_IAM,
+    CFN_CAPABILITY_NAMED_IAM,
+    CFN_ON_FAILURE_DELETE,
+    CFN_STACK_TAG_KEY,
+    CFN_STACK_TAG_VALUE,
+)
+from awslabs.sagemaker_hyperpod_mcp_server.hyperpod_stack_handler import HyperPodStackHandler
+from awslabs.sagemaker_hyperpod_mcp_server.models import (
+    DeleteStackResponse,
+    DeployStackResponse,
+    DescribeStackResponse,
+)
+from mcp.server.fastmcp import Context
+from mcp.types import TextContent
+from unittest.mock import MagicMock, mock_open, patch
+
+
+class TestHyperPodStackHandler:
+    """Tests for the HyperPodStackHandler class."""
+
+    TEST_REGION = 'us-east-1'
+    TEST_STACK_NAME = 'hyperpod-test-cluster-stack'
+
+    def test_init_default(self):
+        """Test that the handler is initialized correctly and registers its tools with default
+        allow_write=False."""
+        # Create a mock MCP server
+        mock_mcp = MagicMock()
+
+        # Initialize the HyperPod handler with the mock MCP server
+        handler = HyperPodStackHandler(mock_mcp)
+
+        # Verify that the handler has the correct attributes
+        assert handler.mcp == mock_mcp
+        assert handler.allow_write is False
+
+        # Verify that the manage_hyperpod_stacks tool was registered
+        mock_mcp.tool.assert_called_once()
+        args, kwargs = mock_mcp.tool.call_args
+        assert kwargs['name'] == 'manage_hyperpod_stacks'
+
+    def test_init_write_access_enabled(self):
+        """Test that the handler is initialized correctly with allow_write=True."""
+        # Create a mock MCP server
+        mock_mcp = MagicMock()
+
+        # Initialize the HyperPod handler with the mock MCP server and allow_write=True
+        handler = HyperPodStackHandler(mock_mcp, allow_write=True)
+
+        # Verify that the handler has the correct attributes
+        assert handler.mcp == mock_mcp
+        assert handler.allow_write is True
+
+        # Verify that the manage_hyperpod_stacks tool was registered
+        mock_mcp.tool.assert_called_once()
+        args, kwargs = mock_mcp.tool.call_args
+        assert kwargs['name'] == 'manage_hyperpod_stacks'
+
+    @pytest.mark.asyncio
+    async def test_deploy_stack_success(self):
+        """Test that _deploy_stack deploys a stack successfully."""
+        # Create a mock MCP server
+        mock_mcp = MagicMock()
+
+        # Initialize the HyperPod handler with the mock MCP server and allow_write=True
+        handler = HyperPodStackHandler(mock_mcp, allow_write=True)
+
+        # Create a mock context
+        mock_ctx = MagicMock(spec=Context)
+
+        # Create a mock CloudFormation client
+        mock_cfn_client = MagicMock()
+        mock_cfn_client.create_stack.return_value = {'StackId': 'test-stack-id'}
+
+        # Mock the AwsHelper.create_boto3_client method to return our mock client
+        with patch.object(
+            AwsHelper, 'create_boto3_client', return_value=mock_cfn_client
+        ) as mock_create_client:
+            # Mock the _ensure_stack_ownership method to simulate stack not existing
+            with patch.object(
+                handler,
+                '_ensure_stack_ownership',
+                return_value=(False, None, 'Stack does not exist'),
+            ):
+                # Mock the open function to return a mock file
+                mock_template_content = 'test template content'
+                with patch('builtins.open', mock_open(read_data=mock_template_content)):
+                    # Call the _deploy_stack method
+                    result = await handler._deploy_stack(
+                        ctx=mock_ctx,
+                        template_params=[],
+                        region_name=self.TEST_REGION,
+                        stack_name=self.TEST_STACK_NAME,
+                        cluster_orchestrator='eks',
+                    )
+
+                    # Verify that AwsHelper.create_boto3_client was called with the correct parameters
+                    # Since we're mocking _ensure_stack_ownership, it's only called once in _deploy_stack
+                    assert mock_create_client.call_count == 1
+                    args, kwargs = mock_create_client.call_args
+                    assert args[0] == 'cloudformation'
+
+                    # Verify that create_stack was called with the correct parameters
+                    mock_cfn_client.create_stack.assert_called_once()
+                    args, kwargs = mock_cfn_client.create_stack.call_args
+                    assert kwargs['StackName'] == self.TEST_STACK_NAME
+                    assert 'TemplateURL' in kwargs
+                    assert kwargs['Parameters'] == []
+                    assert kwargs['Capabilities'] == [
+                        CFN_CAPABILITY_IAM,
+                        CFN_CAPABILITY_NAMED_IAM,
+                        CAPABILITY_AUTO_EXPAND,
+                    ]
+                    assert kwargs['OnFailure'] == CFN_ON_FAILURE_DELETE
+                    assert kwargs['Tags'] == [{'Key': CFN_STACK_TAG_KEY, 'Value': CFN_STACK_TAG_VALUE}]
+
+                    # Verify the result
+                    assert not result.isError
+                    assert result.stack_name == self.TEST_STACK_NAME
+                    assert result.stack_arn == 'test-stack-id'
+                    assert len(result.content) == 1
+                    assert result.content[0].type == 'text'
+                    assert 'CloudFormation stack creation initiated' in result.content[0].text
+
+    def test_ensure_stack_ownership_owned_stack(self):
+        """Test that _ensure_stack_ownership correctly identifies a stack owned by our tool."""
+        # Create a mock MCP server
+        mock_mcp = MagicMock()
+
+        # Initialize the HyperPod handler with the mock MCP server
+        handler = HyperPodStackHandler(mock_mcp)
+
+        # Create a mock context
+        mock_ctx = MagicMock(spec=Context)
+
+        # Create a mock CloudFormation client
+        mock_cfn_client = MagicMock()
+        mock_cfn_client.describe_stacks.return_value = {
+            'Stacks': [
+                {
+                    'StackId': 'test-stack-id',
+                    'Tags': [{'Key': CFN_STACK_TAG_KEY, 'Value': CFN_STACK_TAG_VALUE}],
+                }
+            ]
+        }
+
+        # Mock the AwsHelper.create_boto3_client method to return our mock client
+        with patch.object(
+            AwsHelper, 'create_boto3_client', return_value=mock_cfn_client
+        ) as mock_create_client:
+            # Call the _ensure_stack_ownership method
+            success, stack, error_message = handler._ensure_stack_ownership(
+                ctx=mock_ctx,
+                stack_name=self.TEST_STACK_NAME,
+                region_name=self.TEST_REGION,
+                operation='update',
+            )
+
+            # Verify that AwsHelper.create_boto3_client was called with the correct parameters
+            assert mock_create_client.call_count == 1
+            args, kwargs = mock_create_client.call_args
+            assert args[0] == 'cloudformation'
+
+            # Verify that describe_stacks was called with the correct parameters
+            mock_cfn_client.describe_stacks.assert_called_once_with(StackName=self.TEST_STACK_NAME)
+
+            # Verify the result
+            assert success is True
+            assert stack == mock_cfn_client.describe_stacks.return_value['Stacks'][0]
+            assert error_message is None
+
+    def test_ensure_stack_ownership_not_owned_stack(self):
+        """Test that _ensure_stack_ownership correctly identifies a stack not owned by our tool."""
+        # Create a mock MCP server
+        mock_mcp = MagicMock()
+
+        # Initialize the HyperPod handler with the mock MCP server
+        handler = HyperPodStackHandler(mock_mcp)
+
+        # Create a mock context
+        mock_ctx = MagicMock(spec=Context)
+
+        # Create a mock CloudFormation client
+        mock_cfn_client = MagicMock()
+        mock_cfn_client.describe_stacks.return_value = {
+            'Stacks': [
+                {
+                    'StackId': 'test-stack-id',
+                    'Tags': [{'Key': 'SomeOtherTag', 'Value': 'SomeOtherValue'}],
+                }
+            ]
+        }
+
+        # Mock the AwsHelper.create_boto3_client method to return our mock client
+        with patch.object(
+            AwsHelper, 'create_boto3_client', return_value=mock_cfn_client
+        ) as mock_create_client:
+            # Call the _ensure_stack_ownership method
+            success, stack, error_message = handler._ensure_stack_ownership(
+                ctx=mock_ctx,
+                stack_name=self.TEST_STACK_NAME,
+                region_name=self.TEST_REGION,
+                operation='update',
+            )
+
+            # Verify that AwsHelper.create_boto3_client was called with the correct parameters
+            mock_create_client.assert_called_once_with('cloudformation', self.TEST_REGION)
+
+            # Verify that describe_stacks was called with the correct parameters
+            mock_cfn_client.describe_stacks.assert_called_once_with(StackName=self.TEST_STACK_NAME)
+
+            # Verify the result
+            assert success is False
+            assert stack == mock_cfn_client.describe_stacks.return_value['Stacks'][0]
+            assert error_message is not None
+            assert 'not created by' in error_message
+
+    def test_ensure_stack_ownership_stack_not_found(self):
+        """Test that _ensure_stack_ownership correctly handles a stack that doesn't exist."""
+        # Create a mock MCP server
+        mock_mcp = MagicMock()
+
+        # Initialize the HyperPod handler with the mock MCP server
+        handler = HyperPodStackHandler(mock_mcp)
+
+        # Create a mock context
+        mock_ctx = MagicMock(spec=Context)
+
+        # Create a mock CloudFormation client
+        mock_cfn_client = MagicMock()
+        mock_cfn_client.describe_stacks.side_effect = Exception('Stack does not exist')
+
+        # Mock the AwsHelper.create_boto3_client method to return our mock client
+        with patch.object(
+            AwsHelper, 'create_boto3_client', return_value=mock_cfn_client
+        ) as mock_create_client:
+            # Call the _ensure_stack_ownership method
+            success, stack, error_message = handler._ensure_stack_ownership(
+                ctx=mock_ctx,
+                stack_name=self.TEST_STACK_NAME,
+                region_name=self.TEST_REGION,
+                operation='update',
+            )
+
+            # Verify that AwsHelper.create_boto3_client was called with the correct parameters
+            mock_create_client.assert_called_once_with('cloudformation', self.TEST_REGION)
+
+            # Verify that describe_stacks was called with the correct parameters
+            mock_cfn_client.describe_stacks.assert_called_once_with(StackName=self.TEST_STACK_NAME)
+
+            # Verify the result
+            assert success is False
+            assert stack is None
+            assert error_message is not None
+            assert 'not found' in error_message
+
+    @pytest.mark.asyncio
+    async def test_deploy_stack_update_existing(self):
+        """Test that _deploy_stack updates an existing stack."""
+        # Create a mock MCP server
+        mock_mcp = MagicMock()
+
+        # Initialize the HyperPod handler with the mock MCP server and allow_write=True
+        handler = HyperPodStackHandler(mock_mcp, allow_write=True)
+
+        # Create a mock context
+        mock_ctx = MagicMock(spec=Context)
+
+        # Create a mock CloudFormation client
+        mock_cfn_client = MagicMock()
+        mock_cfn_client.describe_stacks.return_value = {
+            'Stacks': [
+                {
+                    'StackId': 'test-stack-id',
+                    'Tags': [{'Key': CFN_STACK_TAG_KEY, 'Value': CFN_STACK_TAG_VALUE}],
+                }
+            ]
+        }
+        mock_cfn_client.update_stack.return_value = {'StackId': 'test-stack-id'}
+
+        # Mock the AwsHelper.create_boto3_client method to return our mock client
+        with patch.object(
+            AwsHelper, 'create_boto3_client', return_value=mock_cfn_client
+        ) as mock_aws_helper:
+            # Mock the open function to return a mock file
+            mock_template_content = 'test template content'
+            with patch('builtins.open', mock_open(read_data=mock_template_content)):
+                # Call the _deploy_stack method
+                result = await handler._deploy_stack(
+                    ctx=mock_ctx,
+                    template_params=[],
+                    region_name=self.TEST_REGION,
+                    stack_name=self.TEST_STACK_NAME,
+                    cluster_orchestrator='eks',
+                )
+
+                # Verify that AwsHelper.create_boto3_client was called with the correct parameters
+                # Note: It's called twice now - once for _ensure_stack_ownership and once for _deploy_stack
+                assert mock_aws_helper.call_count == 2
+                mock_aws_helper.assert_any_call('cloudformation', self.TEST_REGION)
+
+                # Verify that update_stack was called with the correct parameters
+                mock_cfn_client.update_stack.assert_called_once()
+                args, kwargs = mock_cfn_client.update_stack.call_args
+                assert kwargs['StackName'] == self.TEST_STACK_NAME
+                assert 'TemplateURL' in kwargs
+                assert kwargs['Parameters'] == []
+                assert kwargs['Capabilities'] == [
+                    CFN_CAPABILITY_IAM,
+                    CFN_CAPABILITY_NAMED_IAM,
+                    CAPABILITY_AUTO_EXPAND,
+                ]
+                assert kwargs['Tags'] == [{'Key': CFN_STACK_TAG_KEY, 'Value': CFN_STACK_TAG_VALUE}]
+
+                # Verify the result
+                assert not result.isError
+                assert result.stack_name == self.TEST_STACK_NAME
+                assert result.stack_arn == 'test-stack-id'
+                assert len(result.content) == 1
+                assert result.content[0].type == 'text'
+                assert 'CloudFormation stack update initiated' in result.content[0].text
+
+    @pytest.mark.asyncio
+    async def test_describe_stack_success(self):
+        """Test that _describe_stack returns stack details successfully."""
+        # Create a mock MCP server
+        mock_mcp = MagicMock()
+
+        # Initialize the HyperPod handler with the mock MCP server
+        handler = HyperPodStackHandler(mock_mcp)
+
+        # Create a mock context
+        mock_ctx = MagicMock(spec=Context)
+
+        # Create a mock CloudFormation client
+        mock_cfn_client = MagicMock()
+        mock_cfn_client.describe_stacks.return_value = {
+            'Stacks': [
+                {
+                    'StackId': 'test-stack-id',
+                    'StackName': self.TEST_STACK_NAME,
+                    'CreationTime': '2023-01-01T00:00:00Z',
+                    'StackStatus': 'CREATE_COMPLETE',
+                    'Description': 'Test stack',
+                    'Tags': [{'Key': CFN_STACK_TAG_KEY, 'Value': CFN_STACK_TAG_VALUE}],
+                    'Outputs': [
+                        {
+                            'OutputKey': 'ClusterEndpoint',
+                            'OutputValue': 'https://test-endpoint.hyperpod.amazonaws.com',
+                        },
+                        {
+                            'OutputKey': 'ClusterArn',
+                            'OutputValue': 'arn:aws:hyperpod:us-west-2:123456789012:cluster/test-cluster',
+                        },
+                    ],
+                    'Parameters': [
+                        {'ParameterKey': 'HyperPodClusterName', 'ParameterValue': 'test-cluster'},
+                        {'ParameterKey': 'KubernetesVersion', 'ParameterValue': '1.31'},
+                    ],
+                }
+            ]
+        }
+
+        # Mock the AwsHelper.create_boto3_client method to return our mock client
+        with patch.object(
+            AwsHelper, 'create_boto3_client',
return_value=mock_cfn_client + ) as mock_create_client: + # Call the _describe_stack method + result = await handler._describe_stack( + ctx=mock_ctx, + stack_name=self.TEST_STACK_NAME, + region_name=self.TEST_REGION, + ) + + # Verify that AwsHelper.create_boto3_client was called with the correct parameters + mock_create_client.assert_called_once_with('cloudformation', self.TEST_REGION) + + # Verify that describe_stacks was called with the correct parameters + mock_cfn_client.describe_stacks.assert_called_once_with(StackName=self.TEST_STACK_NAME) + + # Verify the result + assert not result.isError + assert result.stack_name == self.TEST_STACK_NAME + assert result.stack_id == 'test-stack-id' + assert result.creation_time == '2023-01-01T00:00:00Z' + assert result.stack_status == 'CREATE_COMPLETE' + assert result.outputs == { + 'ClusterEndpoint': 'https://test-endpoint.hyperpod.amazonaws.com', + 'ClusterArn': 'arn:aws:hyperpod:us-west-2:123456789012:cluster/test-cluster', + } + assert len(result.content) == 1 + assert result.content[0].type == 'text' + assert 'Successfully described CloudFormation stack' in result.content[0].text + + @pytest.mark.asyncio + async def test_delete_stack_success(self): + """Test that _delete_stack deletes a stack successfully.""" + # Create a mock MCP server + mock_mcp = MagicMock() + + # Initialize the HyperPod handler with the mock MCP server and allow_write=True + handler = HyperPodStackHandler(mock_mcp, allow_write=True) + + # Create a mock context + mock_ctx = MagicMock(spec=Context) + + # Create a mock CloudFormation client + mock_cfn_client = MagicMock() + mock_cfn_client.describe_stacks.return_value = { + 'Stacks': [ + { + 'StackId': 'test-stack-id', + 'StackName': self.TEST_STACK_NAME, + 'Tags': [{'Key': CFN_STACK_TAG_KEY, 'Value': CFN_STACK_TAG_VALUE}], + } + ] + } + + # Mock the AwsHelper.create_boto3_client method to return our mock client + with patch.object(AwsHelper, 'create_boto3_client', return_value=mock_cfn_client): + # 
Call the _delete_stack method + result = await handler._delete_stack( + ctx=mock_ctx, + stack_name=self.TEST_STACK_NAME, + region_name=self.TEST_REGION, + ) + + # Verify that delete_stack was called with the correct parameters + mock_cfn_client.delete_stack.assert_called_once_with(StackName=self.TEST_STACK_NAME) + + # Verify the result + assert not result.isError + assert result.stack_name == self.TEST_STACK_NAME + assert result.stack_id == 'test-stack-id' + assert len(result.content) == 1 + assert result.content[0].type == 'text' + assert 'Initiated deletion of CloudFormation stack' in result.content[0].text + + @pytest.mark.asyncio + async def test_delete_stack_not_owned(self): + """Test that _delete_stack fails when the stack is not owned by our tool.""" + # Create a mock MCP server + mock_mcp = MagicMock() + + # Initialize the HyperPod handler with the mock MCP server and allow_write=True + handler = HyperPodStackHandler(mock_mcp, allow_write=True) + + # Create a mock context + mock_ctx = MagicMock(spec=Context) + + # Create a mock CloudFormation client + mock_cfn_client = MagicMock() + mock_cfn_client.describe_stacks.return_value = { + 'Stacks': [ + { + 'StackId': 'test-stack-id', + 'StackName': self.TEST_STACK_NAME, + 'Tags': [{'Key': 'SomeOtherTag', 'Value': 'SomeOtherValue'}], + } + ] + } + + # Mock the AwsHelper.create_boto3_client method to return our mock client + with patch.object(AwsHelper, 'create_boto3_client', return_value=mock_cfn_client): + # Call the _delete_stack method + result = await handler._delete_stack( + ctx=mock_ctx, + stack_name=self.TEST_STACK_NAME, + region_name=self.TEST_REGION, + ) + + # Verify that delete_stack was not called + mock_cfn_client.delete_stack.assert_not_called() + + # Verify the result + assert result.isError + assert result.stack_name == self.TEST_STACK_NAME + assert result.stack_id == 'test-stack-id' + assert len(result.content) == 1 + assert result.content[0].type == 'text' + assert 'not created by' in 
result.content[0].text + + @pytest.mark.asyncio + async def test_manage_hyperpod_stacks_deploy(self): + """Test that manage_hyperpod_stacks handles the deploy operation correctly.""" + # Create a mock MCP server + mock_mcp = MagicMock() + + # Initialize the HyperPod handler with the mock MCP server and allow_write=True + handler = HyperPodStackHandler(mock_mcp, allow_write=True) + + # Create a mock context + mock_ctx = MagicMock(spec=Context) + + # Mock the _deploy_stack method + mock_result = DeployStackResponse( + isError=False, + content=[TextContent(type='text', text='CloudFormation stack creation initiated')], + stack_name=self.TEST_STACK_NAME, + stack_arn='test-stack-id', + ) + + # Mock the JSON data + mock_params_data = [ + {'ParameterKey': 'HyperPodClusterName', 'ParameterValue': 'test-cluster'}, + ] + + with ( + patch('builtins.open', mock_open(read_data=json.dumps(mock_params_data))), + patch.object(handler, '_deploy_stack', return_value=mock_result) as mock_handler, + ): + # Call the manage_hyperpod_stacks method with deploy operation + result = await handler.manage_hyperpod_stacks( + ctx=mock_ctx, + operation='deploy', + stack_name=self.TEST_STACK_NAME, + region_name=self.TEST_REGION, + params_file='/path/to/template.json', + ) + + # Verify that _deploy_stack was called with the correct parameters + mock_handler.assert_called_once() + call_args = mock_handler.call_args[1] + assert call_args['ctx'] == mock_ctx + assert call_args['template_params'] == mock_params_data + assert call_args['stack_name'] == self.TEST_STACK_NAME + assert ( + getattr(call_args['region_name'], 'default', call_args['region_name']) + == self.TEST_REGION + ) + + # Verify the result + assert not result.isError + # Check specific attributes for DeployStackResponse + assert isinstance(result, DeployStackResponse) + assert result.stack_name == self.TEST_STACK_NAME + assert result.stack_arn == 'test-stack-id' + assert len(result.content) == 1 + assert result.content[0].type == 'text' + 
assert 'CloudFormation stack creation initiated' in result.content[0].text + + @pytest.mark.asyncio + async def test_manage_hyperpod_stacks_describe(self): + """Test that manage_hyperpod_stacks handles the describe operation correctly.""" + # Create a mock MCP server + mock_mcp = MagicMock() + + # Initialize the HyperPod handler with the mock MCP server + handler = HyperPodStackHandler(mock_mcp) + + # Create a mock context + mock_ctx = MagicMock(spec=Context) + + # Mock the _describe_stack method + mock_result = DescribeStackResponse( + isError=False, + content=[TextContent(type='text', text='Successfully described CloudFormation stack')], + stack_name=self.TEST_STACK_NAME, + stack_id='test-stack-id', + creation_time='2023-01-01T00:00:00Z', + stack_status='CREATE_COMPLETE', + outputs={}, + ) + with patch.object(handler, '_describe_stack', return_value=mock_result) as mock_handler: + # Call the manage_hyperpod_stacks method with describe operation + result = await handler.manage_hyperpod_stacks( + ctx=mock_ctx, + operation='describe', + stack_name=self.TEST_STACK_NAME, + region_name=self.TEST_REGION, + ) + + # Verify that _describe_stack was called with the correct parameters + mock_handler.assert_called_once() + call_args = mock_handler.call_args[1] + assert call_args['ctx'] == mock_ctx + assert call_args['stack_name'] == self.TEST_STACK_NAME + assert ( + getattr(call_args['region_name'], 'default', call_args['region_name']) + == self.TEST_REGION + ) + assert ( + call_args['profile_name'] is None + or getattr(call_args['profile_name'], 'default', None) is None + ) + + # Verify the result + assert not result.isError + # Check specific attributes for DescribeStackResponse + assert isinstance(result, DescribeStackResponse) + assert result.stack_name == self.TEST_STACK_NAME + assert result.stack_id == 'test-stack-id' + assert result.creation_time == '2023-01-01T00:00:00Z' + assert result.stack_status == 'CREATE_COMPLETE' + assert len(result.content) == 1 + assert 
result.content[0].type == 'text' + assert 'Successfully described CloudFormation stack' in result.content[0].text + + @pytest.mark.asyncio + async def test_manage_hyperpod_stacks_delete(self): + """Test that manage_hyperpod_stacks handles the delete operation correctly.""" + # Create a mock MCP server + mock_mcp = MagicMock() + + # Initialize the HyperPod handler with the mock MCP server and allow_write=True + handler = HyperPodStackHandler(mock_mcp, allow_write=True) + + # Create a mock context + mock_ctx = MagicMock(spec=Context) + + # Mock the _delete_stack method + mock_result = DeleteStackResponse( + isError=False, + content=[TextContent(type='text', text='Initiated deletion of CloudFormation stack')], + stack_name=self.TEST_STACK_NAME, + stack_id='test-stack-id', + ) + with patch.object(handler, '_delete_stack', return_value=mock_result) as mock_handler: + # Call the manage_hyperpod_stacks method with delete operation + result = await handler.manage_hyperpod_stacks( + ctx=mock_ctx, + operation='delete', + stack_name=self.TEST_STACK_NAME, + region_name=self.TEST_REGION, + ) + + # Verify that _delete_stack was called with the correct parameters + mock_handler.assert_called_once() + call_args = mock_handler.call_args[1] + assert call_args['ctx'] == mock_ctx + assert call_args['stack_name'] == self.TEST_STACK_NAME + assert ( + getattr(call_args['region_name'], 'default', call_args['region_name']) + == self.TEST_REGION + ) + assert ( + call_args['profile_name'] is None + or getattr(call_args['profile_name'], 'default', None) is None + ) + + # Verify the result + assert not result.isError + # Check specific attributes for DeleteStackResponse + assert isinstance(result, DeleteStackResponse) + assert result.stack_name == self.TEST_STACK_NAME + assert result.stack_id == 'test-stack-id' + assert len(result.content) == 1 + assert result.content[0].type == 'text' + assert 'Initiated deletion of CloudFormation stack' in result.content[0].text + + @pytest.mark.asyncio + 
async def test_manage_hyperpod_stacks_invalid_operation(self): + """Test that manage_hyperpod_stacks handles invalid operations correctly.""" + # Create a mock MCP server + mock_mcp = MagicMock() + + # Initialize the HyperPod handler with the mock MCP server + handler = HyperPodStackHandler(mock_mcp) + + # Create a mock context + mock_ctx = MagicMock(spec=Context) + + # Call the manage_hyperpod_stacks method with an invalid operation + with pytest.raises(ValueError, match='validation error'): + await handler.manage_hyperpod_stacks( + ctx=mock_ctx, + operation='invalid', # pyright: ignore[reportArgumentType] + ) + + @pytest.mark.asyncio + async def test_manage_hyperpod_stacks_write_access_disabled(self): + """Test that manage_hyperpod_stacks rejects mutating operations when write access is disabled.""" + # Create a mock MCP server + mock_mcp = MagicMock() + + # Initialize the HyperPod handler with the mock MCP server and allow_write=False + handler = HyperPodStackHandler(mock_mcp, allow_write=False) + + # Create a mock context + mock_ctx = MagicMock(spec=Context) + + # Test deploy operation (should be rejected when write access is disabled) + result = await handler.manage_hyperpod_stacks( + ctx=mock_ctx, + operation='deploy', + region_name=self.TEST_REGION, + stack_name=self.TEST_STACK_NAME, + params_file='/path/to/template.yaml', + ) + + # Verify the result + assert result.isError + assert len(result.content) == 1 + assert result.content[0].type == 'text' + assert 'not allowed without write access' in result.content[0].text + + # Test delete operation (should be rejected when write access is disabled) + result = await handler.manage_hyperpod_stacks( + ctx=mock_ctx, + region_name=self.TEST_REGION, + stack_name=self.TEST_STACK_NAME, + operation='delete', + ) + + # Verify the result + assert result.isError + assert len(result.content) == 1 + assert result.content[0].type == 'text' + assert 'not allowed without write access' in result.content[0].text + + # Test 
describe operation (should be allowed even when write access is disabled) + mock_result = DescribeStackResponse( + isError=False, + content=[TextContent(type='text', text='Successfully described CloudFormation stack')], + stack_name=self.TEST_STACK_NAME, + stack_id='test-stack-id', + creation_time='2023-01-01T00:00:00Z', + stack_status='CREATE_COMPLETE', + outputs={}, + ) + with patch.object(handler, '_describe_stack', return_value=mock_result) as mock_handler: + result = await handler.manage_hyperpod_stacks( + ctx=mock_ctx, + operation='describe', + stack_name=self.TEST_STACK_NAME, + region_name=self.TEST_REGION, + ) + + # Verify that _describe_stack was called (operation allowed even when write access is disabled) + mock_handler.assert_called_once() + + # Verify the result + assert not result.isError + assert len(result.content) == 1 + assert result.content[0].type == 'text' + assert 'Successfully described CloudFormation stack' in result.content[0].text + + @pytest.mark.asyncio + async def test_manage_hyperpod_stacks_missing_parameters(self): + """Test that manage_hyperpod_stacks handles missing parameters correctly.""" + # Create a mock MCP server + mock_mcp = MagicMock() + + # Initialize the HyperPod handler with the mock MCP server and allow_write=True + handler = HyperPodStackHandler(mock_mcp, allow_write=True) + + # Create a mock context + mock_ctx = MagicMock(spec=Context) + + # Test missing params_file for deploy operation + with pytest.raises(ValueError, match='params_file is required for deploy operation'): + await handler.manage_hyperpod_stacks( + ctx=mock_ctx, + operation='deploy', + region_name=self.TEST_REGION, + stack_name=self.TEST_STACK_NAME, + params_file=None, # Explicitly pass None + ) + + @pytest.mark.asyncio + async def test_deploy_stack_error_handling(self): + """Test error handling in _deploy_stack.""" + mock_mcp = MagicMock() + handler = HyperPodStackHandler(mock_mcp, allow_write=True) + mock_ctx = MagicMock(spec=Context) + + # Test 
CloudFormation client creation error + with patch.object( + AwsHelper, 'create_boto3_client', side_effect=Exception('Client creation failed') + ): + result = await handler._deploy_stack( + ctx=mock_ctx, + template_params=[], + stack_name=self.TEST_STACK_NAME, + region_name=self.TEST_REGION, + cluster_orchestrator='eks', + ) + assert result.isError + assert 'Failed to deploy stack' in result.content[0].text + + @pytest.mark.asyncio + async def test_describe_stack_error_handling(self): + """Test error handling in _describe_stack.""" + mock_mcp = MagicMock() + handler = HyperPodStackHandler(mock_mcp) + mock_ctx = MagicMock(spec=Context) + + # Test stack ownership failure + with patch.object( + handler, '_ensure_stack_ownership', return_value=(False, None, 'Stack not found') + ): + result = await handler._describe_stack( + ctx=mock_ctx, + stack_name=self.TEST_STACK_NAME, + region_name=self.TEST_REGION, + ) + assert result.isError + assert 'Stack not found' in result.content[0].text + + @pytest.mark.asyncio + async def test_delete_stack_error_handling(self): + """Test error handling in _delete_stack.""" + mock_mcp = MagicMock() + handler = HyperPodStackHandler(mock_mcp, allow_write=True) + mock_ctx = MagicMock(spec=Context) + + # Test stack ownership failure + with ( + patch.object(AwsHelper, 'create_boto3_client', return_value=MagicMock()), + patch.object( + handler, '_ensure_stack_ownership', return_value=(False, None, 'Stack not owned') + ), + ): + result = await handler._delete_stack( + ctx=mock_ctx, + stack_name=self.TEST_STACK_NAME, + region_name=self.TEST_REGION, + ) + assert result.isError + assert 'Stack not owned' in result.content[0].text + + def test_ensure_stack_ownership_general_exception(self): + """Test _ensure_stack_ownership with general exception.""" + mock_mcp = MagicMock() + handler = HyperPodStackHandler(mock_mcp) + mock_ctx = MagicMock(spec=Context) + + mock_cfn_client = MagicMock() + mock_cfn_client.describe_stacks.side_effect = 
Exception('General error') + + with patch.object(AwsHelper, 'create_boto3_client', return_value=mock_cfn_client): + success, stack, error_message = handler._ensure_stack_ownership( + ctx=mock_ctx, + stack_name=self.TEST_STACK_NAME, + region_name=self.TEST_REGION, + operation='test', + ) + assert success is False + assert stack is None + assert error_message is not None and 'Error verifying stack ownership' in error_message + + @pytest.mark.asyncio + async def test_manage_hyperpod_stacks_general_exception(self): + """Test general exception handling in manage_hyperpod_stacks.""" + mock_mcp = MagicMock() + handler = HyperPodStackHandler(mock_mcp, allow_write=True) + mock_ctx = MagicMock(spec=Context) + + # Mock file opening to raise an exception + with patch('builtins.open', side_effect=Exception('File error')): + result = await handler.manage_hyperpod_stacks( + ctx=mock_ctx, + operation='deploy', + stack_name=self.TEST_STACK_NAME, + region_name=self.TEST_REGION, + params_file='/path/to/params.json', + ) + assert result.isError + assert 'Error in manage_hyperpod_stacks' in result.content[0].text + + @pytest.mark.asyncio + async def test_describe_stack_with_stack_details(self): + """Test _describe_stack with stack that has creation time as datetime object.""" + mock_mcp = MagicMock() + handler = HyperPodStackHandler(mock_mcp) + mock_ctx = MagicMock(spec=Context) + + from datetime import datetime + + mock_stack = { + 'StackId': 'test-stack-id', + 'CreationTime': datetime(2023, 1, 1), + 'StackStatus': 'CREATE_COMPLETE', + 'Tags': [{'Key': CFN_STACK_TAG_KEY, 'Value': CFN_STACK_TAG_VALUE}], + 'Outputs': [], + } + + with patch.object( + handler, '_ensure_stack_ownership', return_value=(True, mock_stack, None) + ): + result = await handler._describe_stack( + ctx=mock_ctx, + stack_name=self.TEST_STACK_NAME, + region_name=self.TEST_REGION, + ) + assert not result.isError + assert result.creation_time == '2023-01-01T00:00:00' + + @pytest.mark.asyncio + async def 
test_describe_stack_missing_creation_time(self): + """Test _describe_stack with stack missing creation time.""" + mock_mcp = MagicMock() + handler = HyperPodStackHandler(mock_mcp) + mock_ctx = MagicMock(spec=Context) + + mock_stack = { + 'StackId': 'test-stack-id', + 'StackStatus': 'CREATE_COMPLETE', + 'Tags': [{'Key': CFN_STACK_TAG_KEY, 'Value': CFN_STACK_TAG_VALUE}], + } + + with patch.object( + handler, '_ensure_stack_ownership', return_value=(True, mock_stack, None) + ): + result = await handler._describe_stack( + ctx=mock_ctx, + stack_name=self.TEST_STACK_NAME, + region_name=self.TEST_REGION, + ) + assert not result.isError + assert result.creation_time == '' + + def test_construct_cfn_tag_mapping_node(self): + """Test construct_cfn_tag with mapping node.""" + from awslabs.sagemaker_hyperpod_mcp_server.hyperpod_stack_handler import construct_cfn_tag + + loader = MagicMock() + loader.construct_mapping.return_value = {'key': 'value'} + + node = yaml.MappingNode('tag', []) + + result = construct_cfn_tag(loader, 'Ref', node) + assert result == {'Ref': {'key': 'value'}} + + def test_construct_cfn_tag_sequence_node(self): + """Test construct_cfn_tag with sequence node.""" + from awslabs.sagemaker_hyperpod_mcp_server.hyperpod_stack_handler import construct_cfn_tag + + loader = MagicMock() + loader.construct_sequence.return_value = ['item1', 'item2'] + + node = yaml.SequenceNode('tag', []) + + result = construct_cfn_tag(loader, 'Ref', node) + assert result == {'Ref': ['item1', 'item2']} + + def test_construct_cfn_tag_unknown_node(self): + """Test construct_cfn_tag with unknown node type.""" + from awslabs.sagemaker_hyperpod_mcp_server.hyperpod_stack_handler import construct_cfn_tag + + loader = MagicMock() + node = object() # Unknown node type + + result = construct_cfn_tag(loader, 'Ref', node) + assert result is None + + @pytest.mark.asyncio + async def test_deploy_stack_update_exception(self): + """Test _deploy_stack update path with exception.""" + mock_mcp = 
MagicMock() + handler = HyperPodStackHandler(mock_mcp, allow_write=True) + mock_ctx = MagicMock(spec=Context) + + mock_cfn_client = MagicMock() + mock_cfn_client.update_stack.side_effect = Exception('Update failed') + + # Mock stack exists and is owned + mock_stack = { + 'StackId': 'test-id', + 'Tags': [{'Key': CFN_STACK_TAG_KEY, 'Value': CFN_STACK_TAG_VALUE}], + } + + with ( + patch.object(AwsHelper, 'create_boto3_client', return_value=mock_cfn_client), + patch.object( + handler, '_ensure_stack_ownership', return_value=(True, mock_stack, None) + ), + ): + result = await handler._deploy_stack( + ctx=mock_ctx, + template_params=[], + stack_name=self.TEST_STACK_NAME, + region_name=self.TEST_REGION, + cluster_orchestrator='eks', + ) + assert result.isError + assert 'Failed to deploy stack' in result.content[0].text + + @pytest.mark.asyncio + async def test_deploy_stack_create_exception(self): + """Test _deploy_stack create path with exception.""" + mock_mcp = MagicMock() + handler = HyperPodStackHandler(mock_mcp, allow_write=True) + mock_ctx = MagicMock(spec=Context) + + mock_cfn_client = MagicMock() + mock_cfn_client.create_stack.side_effect = Exception('Create failed') + + with ( + patch.object(AwsHelper, 'create_boto3_client', return_value=mock_cfn_client), + patch.object( + handler, '_ensure_stack_ownership', side_effect=Exception('Stack check failed') + ), + ): + result = await handler._deploy_stack( + ctx=mock_ctx, + template_params=[], + stack_name=self.TEST_STACK_NAME, + region_name=self.TEST_REGION, + cluster_orchestrator='eks', + ) + assert result.isError + assert 'Failed to deploy stack' in result.content[0].text + + @pytest.mark.asyncio + async def test_describe_stack_exception(self): + """Test _describe_stack with exception.""" + mock_mcp = MagicMock() + handler = HyperPodStackHandler(mock_mcp) + mock_ctx = MagicMock(spec=Context) + + with patch.object(AwsHelper, 'create_boto3_client', side_effect=Exception('Client error')): + result = await 
handler._describe_stack( + ctx=mock_ctx, + stack_name=self.TEST_STACK_NAME, + region_name=self.TEST_REGION, + ) + assert result.isError + assert 'Error verifying stack ownership' in result.content[0].text + + @pytest.mark.asyncio + async def test_delete_stack_exception(self): + """Test _delete_stack with exception.""" + mock_mcp = MagicMock() + handler = HyperPodStackHandler(mock_mcp, allow_write=True) + mock_ctx = MagicMock(spec=Context) + + mock_cfn_client = MagicMock() + mock_cfn_client.delete_stack.side_effect = Exception('Delete failed') + + mock_stack = { + 'StackId': 'test-id', + 'Tags': [{'Key': CFN_STACK_TAG_KEY, 'Value': CFN_STACK_TAG_VALUE}], + } + + with ( + patch.object(AwsHelper, 'create_boto3_client', return_value=mock_cfn_client), + patch.object( + handler, '_ensure_stack_ownership', return_value=(True, mock_stack, None) + ), + ): + result = await handler._delete_stack( + ctx=mock_ctx, + stack_name=self.TEST_STACK_NAME, + region_name=self.TEST_REGION, + ) + assert result.isError + assert 'Failed to delete stack' in result.content[0].text + + def test_construct_cfn_tag_scalar_node(self): + """Test construct_cfn_tag with scalar node.""" + from awslabs.sagemaker_hyperpod_mcp_server.hyperpod_stack_handler import construct_cfn_tag + + loader = MagicMock() + loader.construct_scalar.return_value = 'scalar_value' + + node = yaml.ScalarNode('tag', 'value') + + result = construct_cfn_tag(loader, 'Ref', node) + assert result == {'Ref': 'scalar_value'} + + @pytest.mark.asyncio + async def test_describe_stack_no_outputs(self): + """Test _describe_stack with stack that has no outputs.""" + mock_mcp = MagicMock() + handler = HyperPodStackHandler(mock_mcp) + mock_ctx = MagicMock(spec=Context) + + mock_stack = { + 'StackId': 'test-stack-id', + 'StackStatus': 'CREATE_COMPLETE', + 'Tags': [{'Key': CFN_STACK_TAG_KEY, 'Value': CFN_STACK_TAG_VALUE}], + # No 'Outputs' key + } + + with patch.object( + handler, '_ensure_stack_ownership', return_value=(True, mock_stack, 
None) + ): + result = await handler._describe_stack( + ctx=mock_ctx, + stack_name=self.TEST_STACK_NAME, + region_name=self.TEST_REGION, + ) + assert not result.isError + assert result.outputs == {} + + @pytest.mark.asyncio + async def test_describe_stack_outputs_missing_keys(self): + """Test _describe_stack with outputs missing OutputKey or OutputValue.""" + mock_mcp = MagicMock() + handler = HyperPodStackHandler(mock_mcp) + mock_ctx = MagicMock(spec=Context) + + mock_stack = { + 'StackId': 'test-stack-id', + 'StackStatus': 'CREATE_COMPLETE', + 'Tags': [{'Key': CFN_STACK_TAG_KEY, 'Value': CFN_STACK_TAG_VALUE}], + 'Outputs': [ + {'OutputKey': 'ValidKey', 'OutputValue': 'ValidValue'}, + {'OutputKey': 'MissingValue'}, # Missing OutputValue + {'OutputValue': 'MissingKey'}, # Missing OutputKey + {}, # Missing both + ], + } + + with patch.object( + handler, '_ensure_stack_ownership', return_value=(True, mock_stack, None) + ): + result = await handler._describe_stack( + ctx=mock_ctx, + stack_name=self.TEST_STACK_NAME, + region_name=self.TEST_REGION, + ) + assert not result.isError + assert result.outputs == {'ValidKey': 'ValidValue'} + + @pytest.mark.asyncio + async def test_describe_stack_missing_stack_details(self): + """Test _describe_stack with missing stack details.""" + mock_mcp = MagicMock() + handler = HyperPodStackHandler(mock_mcp) + mock_ctx = MagicMock(spec=Context) + + # Test with None stack + with patch.object(handler, '_ensure_stack_ownership', return_value=(True, None, None)): + result = await handler._describe_stack( + ctx=mock_ctx, + stack_name=self.TEST_STACK_NAME, + region_name=self.TEST_REGION, + ) + assert not result.isError + assert result.stack_id == '' + assert result.creation_time == '' + assert result.stack_status == '' + + @pytest.mark.asyncio + async def test_deploy_stack_ownership_check_exception(self): + """Test _deploy_stack when ownership check raises exception but stack exists.""" + mock_mcp = MagicMock() + handler = 
HyperPodStackHandler(mock_mcp, allow_write=True) + mock_ctx = MagicMock(spec=Context) + + mock_cfn_client = MagicMock() + mock_cfn_client.create_stack.return_value = {'StackId': 'test-stack-id'} + + # First call raises exception, second call succeeds for create_stack + with patch.object(AwsHelper, 'create_boto3_client', return_value=mock_cfn_client): + # Mock ownership check to raise exception (simulating stack doesn't exist) + with patch.object( + handler, '_ensure_stack_ownership', side_effect=Exception('Stack check failed') + ): + result = await handler._deploy_stack( + ctx=mock_ctx, + template_params=[], + stack_name=self.TEST_STACK_NAME, + region_name=self.TEST_REGION, + cluster_orchestrator='eks', + ) + assert not result.isError + assert result.stack_arn == 'test-stack-id' + + @pytest.mark.asyncio + async def test_describe_stack_creation_time_exception(self): + """Test _describe_stack with creation time that raises exception during conversion.""" + mock_mcp = MagicMock() + handler = HyperPodStackHandler(mock_mcp) + mock_ctx = MagicMock(spec=Context) + + # Mock datetime object that raises exception on isoformat + mock_datetime = MagicMock() + mock_datetime.isoformat.side_effect = Exception('isoformat failed') + + mock_stack = { + 'StackId': 'test-stack-id', + 'CreationTime': mock_datetime, + 'StackStatus': 'CREATE_COMPLETE', + 'Tags': [{'Key': CFN_STACK_TAG_KEY, 'Value': CFN_STACK_TAG_VALUE}], + } + + with patch.object( + handler, '_ensure_stack_ownership', return_value=(True, mock_stack, None) + ): + result = await handler._describe_stack( + ctx=mock_ctx, + stack_name=self.TEST_STACK_NAME, + region_name=self.TEST_REGION, + ) + assert result.isError + assert 'Failed to describe stack' in result.content[0].text + + @pytest.mark.asyncio + async def test_deploy_stack_ownership_failure_with_stack(self): + """Test _deploy_stack when ownership check fails but returns stack details.""" + mock_mcp = MagicMock() + handler = HyperPodStackHandler(mock_mcp, 
allow_write=True) + mock_ctx = MagicMock(spec=Context) + + mock_cfn_client = MagicMock() + mock_stack = {'StackId': 'existing-stack-id'} + + with patch.object(AwsHelper, 'create_boto3_client', return_value=mock_cfn_client): + # Mock ownership check to fail but return stack details + with patch.object( + handler, '_ensure_stack_ownership', return_value=(False, mock_stack, 'Not owned') + ): + result = await handler._deploy_stack( + ctx=mock_ctx, + template_params=[], + stack_name=self.TEST_STACK_NAME, + region_name=self.TEST_REGION, + cluster_orchestrator='eks', + ) + assert result.isError + assert 'Not owned' in result.content[0].text + + @pytest.mark.asyncio + async def test_delete_stack_general_exception(self): + """Test _delete_stack with general exception after ownership check.""" + mock_mcp = MagicMock() + handler = HyperPodStackHandler(mock_mcp, allow_write=True) + mock_ctx = MagicMock(spec=Context) + + with patch.object( + AwsHelper, 'create_boto3_client', side_effect=Exception('General error') + ): + result = await handler._delete_stack( + ctx=mock_ctx, + stack_name=self.TEST_STACK_NAME, + region_name=self.TEST_REGION, + ) + assert result.isError + assert 'Failed to delete stack' in result.content[0].text diff --git a/src/sagemaker-hyperpod-mcp-server/tests/test_init.py b/src/sagemaker-hyperpod-mcp-server/tests/test_init.py new file mode 100644 index 0000000000..991bd8203b --- /dev/null +++ b/src/sagemaker-hyperpod-mcp-server/tests/test_init.py @@ -0,0 +1,53 @@ +# Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved. +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. 
+# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. + +"""Tests for the awslabs.sagemaker-hyperpod-mcp-server package.""" + +import importlib +import re + + +class TestInit: + """Tests for the __init__.py module.""" + + def test_version(self): + """Test that __version__ is defined and follows semantic versioning.""" + # Import the module + import awslabs.sagemaker_hyperpod_mcp_server + + # Check that __version__ is defined + assert hasattr(awslabs.sagemaker_hyperpod_mcp_server, '__version__') + + # Check that __version__ is a string + assert isinstance(awslabs.sagemaker_hyperpod_mcp_server.__version__, str) + + # Check that __version__ follows semantic versioning (major.minor.patch) + version_pattern = r'^\d+\.\d+\.\d+$' + assert re.match(version_pattern, awslabs.sagemaker_hyperpod_mcp_server.__version__), ( + f"Version '{awslabs.sagemaker_hyperpod_mcp_server.__version__}' does not follow semantic versioning" + ) + + def test_module_reload(self): + """Test that the module can be reloaded.""" + # Import the module + import awslabs.sagemaker_hyperpod_mcp_server + + # Store the original version + original_version = awslabs.sagemaker_hyperpod_mcp_server.__version__ + + # Reload the module + importlib.reload(awslabs.sagemaker_hyperpod_mcp_server) + + # Check that the version is still the same + assert awslabs.sagemaker_hyperpod_mcp_server.__version__ == original_version diff --git a/src/sagemaker-hyperpod-mcp-server/tests/test_logging_helper.py b/src/sagemaker-hyperpod-mcp-server/tests/test_logging_helper.py new file mode 100644 index 0000000000..7840272fa8 --- /dev/null +++ 
b/src/sagemaker-hyperpod-mcp-server/tests/test_logging_helper.py @@ -0,0 +1,92 @@ +# Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved. +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. +# ruff: noqa: D101, D102, D103 +"""Tests for the logging_helper module.""" + +import pytest +from awslabs.sagemaker_hyperpod_mcp_server.logging_helper import LogLevel, log_with_request_id +from mcp.server.fastmcp import Context +from unittest.mock import MagicMock, patch + + +class TestLogLevel: + """Tests for the LogLevel enum.""" + + def test_log_level_values(self): + """Test that the LogLevel enum has the expected values.""" + assert LogLevel.DEBUG.value == 'debug' + assert LogLevel.INFO.value == 'info' + assert LogLevel.WARNING.value == 'warning' + assert LogLevel.ERROR.value == 'error' + assert LogLevel.CRITICAL.value == 'critical' + + +class TestLogWithRequestId: + """Tests for the log_with_request_id function.""" + + @pytest.fixture + def mock_ctx(self): + """Create a mock MCP context with a request ID.""" + ctx = MagicMock(spec=Context) + ctx.request_id = 'test-request-id' + return ctx + + @patch('awslabs.sagemaker_hyperpod_mcp_server.logging_helper.logger') + def test_log_debug(self, mock_logger, mock_ctx): + """Test that log_with_request_id logs at DEBUG level.""" + message = 'Test debug message' + log_with_request_id(mock_ctx, LogLevel.DEBUG, message) + mock_logger.debug.assert_called_once_with(f'[request_id={mock_ctx.request_id}] {message}') + + 
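For orientation, the `log_with_request_id` helper these tests exercise presumably looks something like the sketch below. The names and behavior are inferred from the assertions in this file; the `_ListLogger` stand-in is hypothetical, and the real `logging_helper` module may differ:

```python
from enum import Enum


class LogLevel(Enum):
    DEBUG = 'debug'
    INFO = 'info'
    WARNING = 'warning'
    ERROR = 'error'
    CRITICAL = 'critical'


class _ListLogger:
    """Minimal stand-in logger that records calls for demonstration."""

    def __init__(self):
        self.records = []

    def __getattr__(self, level):
        def _log(message, **kwargs):
            self.records.append((level, message, kwargs))

        return _log


logger = _ListLogger()


def log_with_request_id(ctx, level, message, **kwargs):
    # Prefix the message with the MCP request id, then dispatch to the logger
    # method whose name matches the enum value ('debug', 'info', ...).
    getattr(logger, level.value)(f'[request_id={ctx.request_id}] {message}', **kwargs)
```

Under this sketch, each mocked `mock_logger.<level>.assert_called_once_with(...)` in the tests corresponds to one `getattr` dispatch on the enum value.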
@patch('awslabs.sagemaker_hyperpod_mcp_server.logging_helper.logger') + def test_log_info(self, mock_logger, mock_ctx): + """Test that log_with_request_id logs at INFO level.""" + message = 'Test info message' + log_with_request_id(mock_ctx, LogLevel.INFO, message) + mock_logger.info.assert_called_once_with(f'[request_id={mock_ctx.request_id}] {message}') + + @patch('awslabs.sagemaker_hyperpod_mcp_server.logging_helper.logger') + def test_log_warning(self, mock_logger, mock_ctx): + """Test that log_with_request_id logs at WARNING level.""" + message = 'Test warning message' + log_with_request_id(mock_ctx, LogLevel.WARNING, message) + mock_logger.warning.assert_called_once_with( + f'[request_id={mock_ctx.request_id}] {message}' + ) + + @patch('awslabs.sagemaker_hyperpod_mcp_server.logging_helper.logger') + def test_log_error(self, mock_logger, mock_ctx): + """Test that log_with_request_id logs at ERROR level.""" + message = 'Test error message' + log_with_request_id(mock_ctx, LogLevel.ERROR, message) + mock_logger.error.assert_called_once_with(f'[request_id={mock_ctx.request_id}] {message}') + + @patch('awslabs.sagemaker_hyperpod_mcp_server.logging_helper.logger') + def test_log_critical(self, mock_logger, mock_ctx): + """Test that log_with_request_id logs at CRITICAL level.""" + message = 'Test critical message' + log_with_request_id(mock_ctx, LogLevel.CRITICAL, message) + mock_logger.critical.assert_called_once_with( + f'[request_id={mock_ctx.request_id}] {message}' + ) + + @patch('awslabs.sagemaker_hyperpod_mcp_server.logging_helper.logger') + def test_log_with_additional_kwargs(self, mock_logger, mock_ctx): + """Test that log_with_request_id passes additional kwargs to the logger.""" + message = 'Test message with kwargs' + additional_kwargs = {'key1': 'value1', 'key2': 'value2'} + log_with_request_id(mock_ctx, LogLevel.INFO, message, **additional_kwargs) + mock_logger.info.assert_called_once_with( + f'[request_id={mock_ctx.request_id}] {message}', 
**additional_kwargs + ) diff --git a/src/sagemaker-hyperpod-mcp-server/tests/test_main.py b/src/sagemaker-hyperpod-mcp-server/tests/test_main.py new file mode 100644 index 0000000000..150ce720d2 --- /dev/null +++ b/src/sagemaker-hyperpod-mcp-server/tests/test_main.py @@ -0,0 +1,238 @@ +# Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved. +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. +# ruff: noqa: D101, D102, D103, E402 +"""Tests for the main function in server.py.""" + +# Mock the imports that might cause issues +import sys +from unittest.mock import ANY, MagicMock, patch + + +# Mock modules that might not be installed +sys.modules['requests_auth_aws_sigv4'] = MagicMock() +sys.modules['requests'] = MagicMock() +from awslabs.sagemaker_hyperpod_mcp_server.server import create_server, main + + +class TestMain: + """Tests for the main function.""" + + @patch('awslabs.sagemaker_hyperpod_mcp_server.server.HyperPodClusterNodeHandler') + @patch('awslabs.sagemaker_hyperpod_mcp_server.server.HyperPodStackHandler') + @patch('awslabs.sagemaker_hyperpod_mcp_server.server.create_server') + @patch('awslabs.sagemaker_hyperpod_mcp_server.server.logger') + @patch('sys.argv', ['awslabs.sagemaker-hyperpod-mcp-server']) + def test_main_default( + self, + mock_logger, + mock_create_server, + mock_stack_handler, + mock_api_handler, + ): + """Test main function with default arguments.""" + # Setup mock + mock_mcp = MagicMock() + mock_create_server.return_value = mock_mcp + 
+ # Call the main function + result = main() + + # Check that create_server was called + mock_create_server.assert_called_once() + + # Check that the handlers were initialized with the correct parameters + mock_api_handler.assert_called_once_with(mock_mcp, False, False) + mock_stack_handler.assert_called_once_with(mock_mcp, False) + + # Check that the server was run + mock_mcp.run.assert_called_once() + + # Check that the correct log message was output + mock_logger.info.assert_called_once_with( + 'Starting HyperPod MCP Server in read-only mode, restricted sensitive data access mode' + ) + + # Check that the function returns the MCP instance + assert result == mock_mcp + + @patch('awslabs.sagemaker_hyperpod_mcp_server.server.HyperPodClusterNodeHandler') + @patch('awslabs.sagemaker_hyperpod_mcp_server.server.HyperPodStackHandler') + @patch('awslabs.sagemaker_hyperpod_mcp_server.server.create_server') + @patch('awslabs.sagemaker_hyperpod_mcp_server.server.logger') + @patch('sys.argv', ['awslabs.sagemaker-hyperpod-mcp-server', '--allow-write']) + def test_main_with_write_access( + self, + mock_logger, + mock_create_server, + mock_stack_handler, + mock_api_handler, + ): + """Test main function with write access enabled.""" + # Setup mock + mock_mcp = MagicMock() + mock_create_server.return_value = mock_mcp + + # Mock argparse to return the desired arguments + with patch('argparse.ArgumentParser.parse_args') as mock_parse_args: + mock_args = MagicMock() + mock_args.allow_write = True + mock_args.allow_sensitive_data_access = False + mock_parse_args.return_value = mock_args + + # Call the main function + result = main() + + # Check that create_server was called + mock_create_server.assert_called_once() + + # Check that the handlers were initialized with the correct parameters + mock_api_handler.assert_called_once_with(mock_mcp, True, False) + mock_stack_handler.assert_called_once_with(mock_mcp, True) + + # Check that the server was run + mock_mcp.run.assert_called_once() 
+ + # Check that the correct log message was output + mock_logger.info.assert_called_once_with( + 'Starting HyperPod MCP Server in restricted sensitive data access mode' + ) + + # Check that the function returns the MCP instance + assert result == mock_mcp + + @patch('awslabs.sagemaker_hyperpod_mcp_server.server.HyperPodClusterNodeHandler') + @patch('awslabs.sagemaker_hyperpod_mcp_server.server.HyperPodStackHandler') + @patch('awslabs.sagemaker_hyperpod_mcp_server.server.create_server') + @patch('awslabs.sagemaker_hyperpod_mcp_server.server.logger') + @patch('sys.argv', ['awslabs.sagemaker-hyperpod-mcp-server', '--allow-sensitive-data-access']) + def test_main_with_sensitive_data_access( + self, + mock_logger, + mock_create_server, + mock_stack_handler, + mock_api_handler, + ): + """Test main function with sensitive data access enabled.""" + # Setup mock + mock_mcp = MagicMock() + mock_create_server.return_value = mock_mcp + + # Mock argparse to return the desired arguments + with patch('argparse.ArgumentParser.parse_args') as mock_parse_args: + mock_args = MagicMock() + mock_args.allow_write = False + mock_args.allow_sensitive_data_access = True + mock_parse_args.return_value = mock_args + + # Call the main function + result = main() + + # Check that create_server was called + mock_create_server.assert_called_once() + + # Check that the handlers were initialized with the correct parameters + mock_api_handler.assert_called_once_with(mock_mcp, False, True) + mock_stack_handler.assert_called_once_with(mock_mcp, False) + + # Check that the server was run + mock_mcp.run.assert_called_once() + + # Check that the correct log message was output + mock_logger.info.assert_called_once_with('Starting HyperPod MCP Server in read-only mode') + + # Check that the function returns the MCP instance + assert result == mock_mcp + + @patch('awslabs.sagemaker_hyperpod_mcp_server.server.HyperPodClusterNodeHandler') + 
@patch('awslabs.sagemaker_hyperpod_mcp_server.server.HyperPodStackHandler') + @patch('awslabs.sagemaker_hyperpod_mcp_server.server.create_server') + @patch('awslabs.sagemaker_hyperpod_mcp_server.server.logger') + @patch( + 'sys.argv', + [ + 'awslabs.sagemaker-hyperpod-mcp-server', + '--allow-write', + '--allow-sensitive-data-access', + ], + ) + def test_main_with_all_access( + self, + mock_logger, + mock_create_server, + mock_stack_handler, + mock_api_handler, + ): + """Test main function with both write and sensitive data access enabled.""" + # Setup mock + mock_mcp = MagicMock() + mock_create_server.return_value = mock_mcp + + # Mock argparse to return the desired arguments + with patch('argparse.ArgumentParser.parse_args') as mock_parse_args: + mock_args = MagicMock() + mock_args.allow_write = True + mock_args.allow_sensitive_data_access = True + mock_parse_args.return_value = mock_args + + # Call the main function + result = main() + + # Check that create_server was called + mock_create_server.assert_called_once() + + # Check that the handlers were initialized with the correct parameters + mock_api_handler.assert_called_once_with(mock_mcp, True, True) + mock_stack_handler.assert_called_once_with(mock_mcp, True) + + # Check that the server was run + mock_mcp.run.assert_called_once() + + # Check that the correct log message was output + mock_logger.info.assert_called_once_with('Starting HyperPod MCP Server') + + # Check that the function returns the MCP instance + assert result == mock_mcp + + def test_create_server(self): + """Test the create_server function.""" + with patch('awslabs.sagemaker_hyperpod_mcp_server.server.FastMCP') as mock_fastmcp: + # Call the create_server function + create_server() + + # Check that FastMCP was called with the correct arguments + mock_fastmcp.assert_called_once_with( + 'awslabs.sagemaker-hyperpod-mcp-server', + instructions=ANY, + dependencies=ANY, + ) + + def test_module_execution(self): + """Test the module execution when run 
as __main__.""" + # This test directly executes the code in the if __name__ == '__main__': block + # to ensure coverage of that line + + # Get the source code of the module + import inspect + from awslabs.sagemaker_hyperpod_mcp_server import server + + # Get the source code + source = inspect.getsource(server) + + # Check that the module has the if __name__ == '__main__': block + assert "if __name__ == '__main__':" in source + assert 'main()' in source + + # This test doesn't actually execute the code, but it ensures + # that the coverage report includes the if __name__ == '__main__': line + # by explicitly checking for its presence diff --git a/src/sagemaker-hyperpod-mcp-server/tests/test_models.py b/src/sagemaker-hyperpod-mcp-server/tests/test_models.py new file mode 100644 index 0000000000..77e55ab483 --- /dev/null +++ b/src/sagemaker-hyperpod-mcp-server/tests/test_models.py @@ -0,0 +1,744 @@ +# Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved. +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. 
+# ruff: noqa: D101, D102, D103 +"""Tests for the models module.""" + +from awslabs.sagemaker_hyperpod_mcp_server.models import ( + AlarmDetails, + BatchDeleteClusterNodesError, + BatchDeleteClusterNodesResponse, + CapacitySizeConfig, + ClusterEbsVolumeConfig, + ClusterInstancePlacement, + ClusterInstanceStatusDetails, + ClusterInstanceStorageConfig, + ClusterLifeCycleConfig, + ClusterNodeDetails, + ClusterNodeSummary, + ClusterSummary, + DeploymentConfiguration, + DeployStackResponse, + DescribeClusterNodeResponse, + DescribeStackResponse, + ListClusterNodesResponse, + ListClustersResponse, + RollingDeploymentPolicy, + UpdateClusterSoftwareInstanceGroupSpecification, + UpdateClusterSoftwareResponse, + VpcConfig, +) +from mcp.types import TextContent + + +class TestClusterSummary: + """Tests for the ClusterSummary model.""" + + def test_create_cluster_summary(self): + """Test creating a ClusterSummary instance.""" + cluster_summary = ClusterSummary( + cluster_name='test-cluster', + cluster_arn='arn:aws:sagemaker:us-west-2:123456789012:cluster/test-cluster', + cluster_status='InService', + creation_time='2023-01-01T00:00:00Z', + training_plan_arns=[ + 'arn:aws:sagemaker:us-west-2:123456789012:training-plan/test-plan' + ], + ) + + assert cluster_summary.cluster_name == 'test-cluster' + assert ( + cluster_summary.cluster_arn + == 'arn:aws:sagemaker:us-west-2:123456789012:cluster/test-cluster' + ) + assert cluster_summary.cluster_status == 'InService' + assert cluster_summary.creation_time == '2023-01-01T00:00:00Z' + assert cluster_summary.training_plan_arns == [ + 'arn:aws:sagemaker:us-west-2:123456789012:training-plan/test-plan' + ] + + def test_create_cluster_summary_without_optional_fields(self): + """Test creating a ClusterSummary instance without optional fields.""" + cluster_summary = ClusterSummary( + cluster_name='test-cluster', + cluster_arn='arn:aws:sagemaker:us-west-2:123456789012:cluster/test-cluster', + cluster_status='InService', + 
creation_time='2023-01-01T00:00:00Z', + ) + + assert cluster_summary.cluster_name == 'test-cluster' + assert ( + cluster_summary.cluster_arn + == 'arn:aws:sagemaker:us-west-2:123456789012:cluster/test-cluster' + ) + assert cluster_summary.cluster_status == 'InService' + assert cluster_summary.creation_time == '2023-01-01T00:00:00Z' + assert cluster_summary.training_plan_arns is None + + +class TestClusterInstanceStatusDetails: + """Tests for the ClusterInstanceStatusDetails model.""" + + def test_create_cluster_instance_status_details(self): + """Test creating a ClusterInstanceStatusDetails instance.""" + status_details = ClusterInstanceStatusDetails( + status='Running', + message='Instance is running normally', + ) + + assert status_details.status == 'Running' + assert status_details.message == 'Instance is running normally' + + def test_create_cluster_instance_status_details_without_optional_fields(self): + """Test creating a ClusterInstanceStatusDetails instance without optional fields.""" + status_details = ClusterInstanceStatusDetails( + status='Running', + ) + + assert status_details.status == 'Running' + assert status_details.message is None + + +class TestClusterNodeSummary: + """Tests for the ClusterNodeSummary model.""" + + def test_create_cluster_node_summary(self): + """Test creating a ClusterNodeSummary instance.""" + instance_status = ClusterInstanceStatusDetails( + status='Running', + message='Instance is running normally', + ) + + node_summary = ClusterNodeSummary( + instance_group_name='test-group', + instance_id='i-1234567890abcdef0', + instance_status=instance_status, + instance_type='ml.p4d.24xlarge', + launch_time='2023-01-01T00:00:00Z', + last_software_update_time='2023-01-02T00:00:00Z', + ) + + assert node_summary.instance_group_name == 'test-group' + assert node_summary.instance_id == 'i-1234567890abcdef0' + assert node_summary.instance_status == instance_status + assert node_summary.instance_type == 'ml.p4d.24xlarge' + assert 
node_summary.launch_time == '2023-01-01T00:00:00Z' + assert node_summary.last_software_update_time == '2023-01-02T00:00:00Z' + + def test_create_cluster_node_summary_without_optional_fields(self): + """Test creating a ClusterNodeSummary instance without optional fields.""" + instance_status = ClusterInstanceStatusDetails( + status='Running', + ) + + node_summary = ClusterNodeSummary( + instance_group_name='test-group', + instance_id='i-1234567890abcdef0', + instance_status=instance_status, + instance_type='ml.p4d.24xlarge', + launch_time='2023-01-01T00:00:00Z', + ) + + assert node_summary.instance_group_name == 'test-group' + assert node_summary.instance_id == 'i-1234567890abcdef0' + assert node_summary.instance_status == instance_status + assert node_summary.instance_type == 'ml.p4d.24xlarge' + assert node_summary.launch_time == '2023-01-01T00:00:00Z' + assert node_summary.last_software_update_time is None + + +class TestListClustersResponse: + """Tests for the ListClustersResponse model.""" + + def test_create_list_clusters_response(self): + """Test creating a ListClustersResponse instance.""" + cluster_summary = ClusterSummary( + cluster_name='test-cluster', + cluster_arn='arn:aws:sagemaker:us-west-2:123456789012:cluster/test-cluster', + cluster_status='InService', + creation_time='2023-01-01T00:00:00Z', + ) + + response = ListClustersResponse( + isError=False, + content=[TextContent(type='text', text='Successfully listed clusters')], + clusters=[cluster_summary], + next_token='next-token', + ) + + assert response.isError is False + assert len(response.content) == 1 + assert response.content[0].type == 'text' + assert response.content[0].text == 'Successfully listed clusters' + assert len(response.clusters) == 1 + assert response.clusters[0] == cluster_summary + assert response.next_token == 'next-token' + + def test_create_list_clusters_response_without_optional_fields(self): + """Test creating a ListClustersResponse instance without optional fields.""" + 
cluster_summary = ClusterSummary( + cluster_name='test-cluster', + cluster_arn='arn:aws:sagemaker:us-west-2:123456789012:cluster/test-cluster', + cluster_status='InService', + creation_time='2023-01-01T00:00:00Z', + ) + + response = ListClustersResponse( + isError=False, + content=[TextContent(type='text', text='Successfully listed clusters')], + clusters=[cluster_summary], + next_token=None, + ) + + assert response.isError is False + assert len(response.content) == 1 + assert response.content[0].type == 'text' + assert response.content[0].text == 'Successfully listed clusters' + assert len(response.clusters) == 1 + assert response.clusters[0] == cluster_summary + assert response.next_token is None + + +class TestListClusterNodesResponse: + """Tests for the ListClusterNodesResponse model.""" + + def test_create_list_cluster_nodes_response(self): + """Test creating a ListClusterNodesResponse instance.""" + instance_status = ClusterInstanceStatusDetails( + status='Running', + ) + + node_summary = ClusterNodeSummary( + instance_group_name='test-group', + instance_id='i-1234567890abcdef0', + instance_status=instance_status, + instance_type='ml.p4d.24xlarge', + launch_time='2023-01-01T00:00:00Z', + ) + + response = ListClusterNodesResponse( + isError=False, + content=[TextContent(type='text', text='Successfully listed cluster nodes')], + nodes=[node_summary], + next_token='next-token', + ) + + assert response.isError is False + assert len(response.content) == 1 + assert response.content[0].type == 'text' + assert response.content[0].text == 'Successfully listed cluster nodes' + assert len(response.nodes) == 1 + assert response.nodes[0] == node_summary + assert response.next_token == 'next-token' + + def test_create_list_cluster_nodes_response_without_optional_fields(self): + """Test creating a ListClusterNodesResponse instance without optional fields.""" + instance_status = ClusterInstanceStatusDetails( + status='Running', + ) + + node_summary = ClusterNodeSummary( + 
instance_group_name='test-group', + instance_id='i-1234567890abcdef0', + instance_status=instance_status, + instance_type='ml.p4d.24xlarge', + launch_time='2023-01-01T00:00:00Z', + ) + + response = ListClusterNodesResponse( + isError=False, + content=[TextContent(type='text', text='Successfully listed cluster nodes')], + nodes=[node_summary], + next_token=None, + ) + + assert response.isError is False + assert len(response.content) == 1 + assert response.content[0].type == 'text' + assert response.content[0].text == 'Successfully listed cluster nodes' + assert len(response.nodes) == 1 + assert response.nodes[0] == node_summary + assert response.next_token is None + + +class TestClusterEbsVolumeConfig: + """Tests for the ClusterEbsVolumeConfig model.""" + + def test_create_cluster_ebs_volume_config(self): + """Test creating a ClusterEbsVolumeConfig instance.""" + ebs_volume_config = ClusterEbsVolumeConfig( + volume_size_in_gb=100, + ) + + assert ebs_volume_config.volume_size_in_gb == 100 + + def test_create_cluster_ebs_volume_config_without_optional_fields(self): + """Test creating a ClusterEbsVolumeConfig instance without optional fields.""" + ebs_volume_config = ClusterEbsVolumeConfig() + + assert ebs_volume_config.volume_size_in_gb is None + + +class TestClusterInstanceStorageConfig: + """Tests for the ClusterInstanceStorageConfig model.""" + + def test_create_cluster_instance_storage_config(self): + """Test creating a ClusterInstanceStorageConfig instance.""" + ebs_volume_config = ClusterEbsVolumeConfig( + volume_size_in_gb=100, + ) + + storage_config = ClusterInstanceStorageConfig( + ebs_volume_config=ebs_volume_config, + ) + + assert storage_config.ebs_volume_config == ebs_volume_config + + def test_create_cluster_instance_storage_config_without_optional_fields(self): + """Test creating a ClusterInstanceStorageConfig instance without optional fields.""" + storage_config = ClusterInstanceStorageConfig() + + assert storage_config.ebs_volume_config is None + + 
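The optional-field pattern these model tests repeatedly assert (construct with and without a field, expect a `None` default) can be illustrated with a simplified stand-in. `ClusterSummary` below is a plain dataclass rather than the project's actual model base class:

```python
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class ClusterSummary:
    """Simplified stand-in for the model under test (hypothetical base)."""

    cluster_name: str
    cluster_arn: str
    cluster_status: str
    creation_time: str
    # Optional fields default to None, which is what the
    # "...without_optional_fields" tests assert.
    training_plan_arns: Optional[List[str]] = None
```

Each pair of tests in this file then reduces to checking the two construction paths: all fields supplied, and only the required ones.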
+class TestClusterLifeCycleConfig: + """Tests for the ClusterLifeCycleConfig model.""" + + def test_create_cluster_life_cycle_config(self): + """Test creating a ClusterLifeCycleConfig instance.""" + life_cycle_config = ClusterLifeCycleConfig( + on_create="echo 'Hello, World!'", + source_s3_uri='s3://bucket/path/to/script.sh', + ) + + assert life_cycle_config.on_create == "echo 'Hello, World!'" + assert life_cycle_config.source_s3_uri == 's3://bucket/path/to/script.sh' + + +class TestVpcConfig: + """Tests for the VpcConfig model.""" + + def test_create_vpc_config(self): + """Test creating a VpcConfig instance.""" + vpc_config = VpcConfig( + security_group_ids=['sg-1234567890abcdef0'], + subnets=['subnet-1234567890abcdef0'], + ) + + assert vpc_config.security_group_ids == ['sg-1234567890abcdef0'] + assert vpc_config.subnets == ['subnet-1234567890abcdef0'] + + def test_create_vpc_config_without_optional_fields(self): + """Test creating a VpcConfig instance without optional fields.""" + vpc_config = VpcConfig() + + assert vpc_config.security_group_ids is None + assert vpc_config.subnets is None + + +class TestClusterInstancePlacement: + """Tests for the ClusterInstancePlacement model.""" + + def test_create_cluster_instance_placement(self): + """Test creating a ClusterInstancePlacement instance.""" + placement = ClusterInstancePlacement( + availability_zone='us-west-2a', + availability_zone_id='usw2-az1', + ) + + assert placement.availability_zone == 'us-west-2a' + assert placement.availability_zone_id == 'usw2-az1' + + def test_create_cluster_instance_placement_without_optional_fields(self): + """Test creating a ClusterInstancePlacement instance without optional fields.""" + placement = ClusterInstancePlacement() + + assert placement.availability_zone is None + assert placement.availability_zone_id is None + + +class TestAlarmDetails: + """Tests for the AlarmDetails model.""" + + def test_create_alarm_details(self): + """Test creating an AlarmDetails instance.""" + 
alarm_details = AlarmDetails( + alarm_name='test-alarm', + ) + + assert alarm_details.alarm_name == 'test-alarm' + + +class TestCapacitySizeConfig: + """Tests for the CapacitySizeConfig model.""" + + def test_create_capacity_size_config(self): + """Test creating a CapacitySizeConfig instance.""" + capacity_size_config = CapacitySizeConfig( + type='INSTANCE_COUNT', + value=5, + ) + + assert capacity_size_config.type == 'INSTANCE_COUNT' + assert capacity_size_config.value == 5 + + +class TestRollingDeploymentPolicy: + """Tests for the RollingDeploymentPolicy model.""" + + def test_create_rolling_deployment_policy(self): + """Test creating a RollingDeploymentPolicy instance.""" + maximum_batch_size = CapacitySizeConfig( + type='INSTANCE_COUNT', + value=5, + ) + + rollback_maximum_batch_size = CapacitySizeConfig( + type='CAPACITY_PERCENTAGE', + value=20, + ) + + policy = RollingDeploymentPolicy( + maximum_batch_size=maximum_batch_size, + rollback_maximum_batch_size=rollback_maximum_batch_size, + ) + + assert policy.maximum_batch_size == maximum_batch_size + assert policy.rollback_maximum_batch_size == rollback_maximum_batch_size + + def test_create_rolling_deployment_policy_without_optional_fields(self): + """Test creating a RollingDeploymentPolicy instance without optional fields.""" + maximum_batch_size = CapacitySizeConfig( + type='INSTANCE_COUNT', + value=5, + ) + + policy = RollingDeploymentPolicy( + maximum_batch_size=maximum_batch_size, + ) + + assert policy.maximum_batch_size == maximum_batch_size + assert policy.rollback_maximum_batch_size is None + + +class TestDeploymentConfiguration: + """Tests for the DeploymentConfiguration model.""" + + def test_create_deployment_configuration(self): + """Test creating a DeploymentConfiguration instance.""" + alarm_details = AlarmDetails( + alarm_name='test-alarm', + ) + + maximum_batch_size = CapacitySizeConfig( + type='INSTANCE_COUNT', + value=5, + ) + + rolling_update_policy = RollingDeploymentPolicy( + 
maximum_batch_size=maximum_batch_size, + ) + + deployment_config = DeploymentConfiguration( + auto_rollback_configuration=[alarm_details], + rolling_update_policy=rolling_update_policy, + wait_interval_in_seconds=60, + ) + + assert deployment_config.auto_rollback_configuration == [alarm_details] + assert deployment_config.rolling_update_policy == rolling_update_policy + assert deployment_config.wait_interval_in_seconds == 60 + + def test_create_deployment_configuration_without_optional_fields(self): + """Test creating a DeploymentConfiguration instance without optional fields.""" + deployment_config = DeploymentConfiguration() + + assert deployment_config.auto_rollback_configuration is None + assert deployment_config.rolling_update_policy is None + assert deployment_config.wait_interval_in_seconds is None + + +class TestUpdateClusterSoftwareInstanceGroupSpecification: + """Tests for the UpdateClusterSoftwareInstanceGroupSpecification model.""" + + def test_create_update_cluster_software_instance_group_specification(self): + """Test creating an UpdateClusterSoftwareInstanceGroupSpecification instance.""" + spec = UpdateClusterSoftwareInstanceGroupSpecification( + instance_group_name='test-group', + ) + + assert spec.instance_group_name == 'test-group' + + +class TestClusterNodeDetails: + """Tests for the ClusterNodeDetails model.""" + + def test_create_cluster_node_details(self): + """Test creating a ClusterNodeDetails instance.""" + instance_status = ClusterInstanceStatusDetails( + status='Running', + message='Instance is running normally', + ) + + ebs_volume_config = ClusterEbsVolumeConfig( + volume_size_in_gb=100, + ) + + storage_config = ClusterInstanceStorageConfig( + ebs_volume_config=ebs_volume_config, + ) + + life_cycle_config = ClusterLifeCycleConfig( + on_create="echo 'Hello, World!'", + source_s3_uri='s3://bucket/path/to/script.sh', + ) + + vpc_config = VpcConfig( + security_group_ids=['sg-1234567890abcdef0'], + subnets=['subnet-1234567890abcdef0'], + ) + 
+        placement = ClusterInstancePlacement(
+            availability_zone='us-west-2a',
+            availability_zone_id='usw2-az1',
+        )
+
+        node_details = ClusterNodeDetails(
+            instance_group_name='test-group',
+            instance_id='i-1234567890abcdef0',
+            instance_status=instance_status,
+            instance_storage_configs=[storage_config],
+            instance_type='ml.p4d.24xlarge',
+            last_software_update_time='2023-01-02T00:00:00Z',
+            launch_time='2023-01-01T00:00:00Z',
+            life_cycle_config=life_cycle_config,
+            override_vpc_config=vpc_config,
+            placement=placement,
+            private_dns_hostname='ip-10-0-0-1.us-west-2.compute.internal',
+            private_primary_ip='10.0.0.1',
+            private_primary_ipv6='2001:db8::1',
+            threads_per_core=2,
+        )
+
+        assert node_details.instance_group_name == 'test-group'
+        assert node_details.instance_id == 'i-1234567890abcdef0'
+        assert node_details.instance_status == instance_status
+        assert node_details.instance_storage_configs == [storage_config]
+        assert node_details.instance_type == 'ml.p4d.24xlarge'
+        assert node_details.last_software_update_time == '2023-01-02T00:00:00Z'
+        assert node_details.launch_time == '2023-01-01T00:00:00Z'
+        assert node_details.life_cycle_config == life_cycle_config
+        assert node_details.override_vpc_config == vpc_config
+        assert node_details.placement == placement
+        assert node_details.private_dns_hostname == 'ip-10-0-0-1.us-west-2.compute.internal'
+        assert node_details.private_primary_ip == '10.0.0.1'
+        assert node_details.private_primary_ipv6 == '2001:db8::1'
+        assert node_details.threads_per_core == 2
+
+    def test_create_cluster_node_details_without_optional_fields(self):
+        """Test creating a ClusterNodeDetails instance without optional fields."""
+        instance_status = ClusterInstanceStatusDetails(
+            status='Running',
+        )
+
+        node_details = ClusterNodeDetails(
+            instance_group_name='test-group',
+            instance_id='i-1234567890abcdef0',
+            instance_status=instance_status,
+            instance_type='ml.p4d.24xlarge',
+        )
+
+        assert node_details.instance_group_name == 'test-group'
+        assert node_details.instance_id == 'i-1234567890abcdef0'
+        assert node_details.instance_status == instance_status
+        assert node_details.instance_type == 'ml.p4d.24xlarge'
+        assert node_details.instance_storage_configs is None
+        assert node_details.last_software_update_time is None
+        assert node_details.launch_time is None
+        assert node_details.life_cycle_config is None
+        assert node_details.override_vpc_config is None
+        assert node_details.placement is None
+        assert node_details.private_dns_hostname is None
+        assert node_details.private_primary_ip is None
+        assert node_details.private_primary_ipv6 is None
+        assert node_details.threads_per_core is None
+
+
+class TestDescribeClusterNodeResponse:
+    """Tests for the DescribeClusterNodeResponse model."""
+
+    def test_create_describe_cluster_node_response(self):
+        """Test creating a DescribeClusterNodeResponse instance."""
+        instance_status = ClusterInstanceStatusDetails(
+            status='Running',
+        )
+
+        node_details = ClusterNodeDetails(
+            instance_group_name='test-group',
+            instance_id='i-1234567890abcdef0',
+            instance_status=instance_status,
+            instance_type='ml.p4d.24xlarge',
+        )
+
+        response = DescribeClusterNodeResponse(
+            isError=False,
+            content=[TextContent(type='text', text='Successfully described cluster node')],
+            node_details=node_details,
+        )
+
+        assert response.isError is False
+        assert len(response.content) == 1
+        assert response.content[0].type == 'text'
+        assert response.content[0].text == 'Successfully described cluster node'
+        assert response.node_details == node_details
+
+    def test_create_describe_cluster_node_response_without_optional_fields(self):
+        """Test creating a DescribeClusterNodeResponse instance without optional fields."""
+        response = DescribeClusterNodeResponse(
+            isError=False,
+            content=[TextContent(type='text', text='Successfully described cluster node')],
+            node_details=None,
+        )
+
+        assert response.isError is False
+        assert len(response.content) == 1
+        assert response.content[0].type == 'text'
+        assert response.content[0].text == 'Successfully described cluster node'
+        assert response.node_details is None
+
+
+class TestUpdateClusterSoftwareResponse:
+    """Tests for the UpdateClusterSoftwareResponse model."""
+
+    def test_create_update_cluster_software_response(self):
+        """Test creating an UpdateClusterSoftwareResponse instance."""
+        response = UpdateClusterSoftwareResponse(
+            isError=False,
+            content=[TextContent(type='text', text='Successfully updated cluster software')],
+            cluster_arn='arn:aws:sagemaker:us-west-2:123456789012:cluster/test-cluster',
+        )
+
+        assert response.isError is False
+        assert len(response.content) == 1
+        assert response.content[0].type == 'text'
+        assert response.content[0].text == 'Successfully updated cluster software'
+        assert (
+            response.cluster_arn == 'arn:aws:sagemaker:us-west-2:123456789012:cluster/test-cluster'
+        )
+
+
+class TestBatchDeleteClusterNodesError:
+    """Tests for the BatchDeleteClusterNodesError model."""
+
+    def test_create_batch_delete_cluster_nodes_error(self):
+        """Test creating a BatchDeleteClusterNodesError instance."""
+        error = BatchDeleteClusterNodesError(
+            code='ValidationException',
+            message='Node not found',
+            node_id='i-1234567890abcdef0',
+        )
+
+        assert error.code == 'ValidationException'
+        assert error.message == 'Node not found'
+        assert error.node_id == 'i-1234567890abcdef0'
+
+
+class TestBatchDeleteClusterNodesResponse:
+    """Tests for the BatchDeleteClusterNodesResponse model."""
+
+    def test_create_batch_delete_cluster_nodes_response(self):
+        """Test creating a BatchDeleteClusterNodesResponse instance."""
+        error = BatchDeleteClusterNodesError(
+            code='ValidationException',
+            message='Node not found',
+            node_id='i-1234567890abcdef0',
+        )
+
+        response = BatchDeleteClusterNodesResponse(
+            isError=False,
+            content=[TextContent(type='text', text='Successfully deleted cluster nodes')],
+            cluster_name='test-cluster',
+            successful=['i-0987654321fedcba0'],
+            failed=[error],
+        )
+
+        assert response.isError is False
+        assert len(response.content) == 1
+        assert response.content[0].type == 'text'
+        assert response.content[0].text == 'Successfully deleted cluster nodes'
+        assert response.cluster_name == 'test-cluster'
+        assert response.successful == ['i-0987654321fedcba0']
+        assert response.failed == [error]
+
+    def test_create_batch_delete_cluster_nodes_response_without_optional_fields(self):
+        """Test creating a BatchDeleteClusterNodesResponse instance without optional fields."""
+        response = BatchDeleteClusterNodesResponse(
+            isError=False,
+            content=[TextContent(type='text', text='Successfully deleted cluster nodes')],
+            cluster_name='test-cluster',
+            successful=['i-0987654321fedcba0'],
+            failed=None,
+        )
+
+        assert response.isError is False
+        assert len(response.content) == 1
+        assert response.content[0].type == 'text'
+        assert response.content[0].text == 'Successfully deleted cluster nodes'
+        assert response.cluster_name == 'test-cluster'
+        assert response.successful == ['i-0987654321fedcba0']
+        assert response.failed is None
+
+
+class TestDeployStackResponse:
+    """Tests for the DeployStackResponse model."""
+
+    def test_create_deploy_stack_response(self):
+        """Test creating a DeployStackResponse instance."""
+        response = DeployStackResponse(
+            isError=False,
+            content=[TextContent(type='text', text='Successfully deployed stack')],
+            stack_name='test-stack',
+            stack_arn='arn:aws:cloudformation:us-west-2:123456789012:stack/test-stack/1234567890abcdef',
+        )
+
+        assert response.isError is False
+        assert len(response.content) == 1
+        assert response.content[0].type == 'text'
+        assert response.content[0].text == 'Successfully deployed stack'
+        assert response.stack_name == 'test-stack'
+        assert (
+            response.stack_arn
+            == 'arn:aws:cloudformation:us-west-2:123456789012:stack/test-stack/1234567890abcdef'
+        )
+
+
+class TestDescribeStackResponse:
+    """Tests for the DescribeStackResponse model."""
+
+    def test_create_describe_stack_response(self):
+        """Test creating a DescribeStackResponse instance."""
+        response = DescribeStackResponse(
+            isError=False,
+            content=[TextContent(type='text', text='Successfully described stack')],
+            stack_name='test-stack',
+            stack_id='arn:aws:cloudformation:us-west-2:123456789012:stack/test-stack/1234567890abcdef',
+            creation_time='2023-01-01T00:00:00Z',
+            stack_status='CREATE_COMPLETE',
+            outputs={
+                'ClusterName': 'test-cluster',
+                'ClusterArn': 'arn:aws:sagemaker:us-west-2:123456789012:cluster/test-cluster',
+            },
+        )
+
+        assert response.isError is False
+        assert len(response.content) == 1
diff --git a/src/sagemaker-hyperpod-mcp-server/tests/test_server.py b/src/sagemaker-hyperpod-mcp-server/tests/test_server.py
new file mode 100644
index 0000000000..2b39d16845
--- /dev/null
+++ b/src/sagemaker-hyperpod-mcp-server/tests/test_server.py
@@ -0,0 +1,231 @@
+# Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+#     http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+# ruff: noqa: D101, D102, D103, E402
+"""Tests for the HyperPod MCP Server."""
+
+import pytest
+import sys
+from awslabs.sagemaker_hyperpod_mcp_server.hyperpod_cluster_node_handler import (
+    HyperPodClusterNodeHandler,
+)
+from awslabs.sagemaker_hyperpod_mcp_server.hyperpod_stack_handler import HyperPodStackHandler
+from unittest.mock import MagicMock, patch
+
+
+# Mock modules that might not be installed
+sys.modules['requests_auth_aws_sigv4'] = MagicMock()
+sys.modules['requests'] = MagicMock()
+
+
+@pytest.mark.asyncio
+async def test_server_initialization():
+    """Test that create_server returns a correctly configured server instance."""
+    from awslabs.sagemaker_hyperpod_mcp_server.server import create_server
+
+    # Create a server instance
+    server = create_server()
+
+    # Test that the server is initialized with the correct name
+    assert server.name == 'awslabs.sagemaker-hyperpod-mcp-server'
+    # Test that the server has the correct instructions
+    assert (
+        server.instructions is not None
+        and 'Amazon SageMaker HyperPod MCP Server' in server.instructions
+    )
+    # Test that the server declares the expected dependencies
+    assert 'pydantic' in server.dependencies
+    assert 'loguru' in server.dependencies
+    assert 'boto3' in server.dependencies
+    assert 'requests' in server.dependencies
+    assert 'pyyaml' in server.dependencies
+    assert 'cachetools' in server.dependencies
+
+
+@pytest.mark.asyncio
+async def test_command_line_args():
+    """Test that the command-line arguments are parsed correctly."""
+    import argparse
+    from awslabs.sagemaker_hyperpod_mcp_server.server import main
+
+    # Mock the ArgumentParser.parse_args method to return known args
+    with patch.object(argparse.ArgumentParser, 'parse_args') as mock_parse_args:
+        # Test with default args (read-only mode by default)
+        mock_parse_args.return_value = argparse.Namespace(
+            allow_write=False, allow_sensitive_data_access=False
+        )
+
+        # Mock create_server to return a mock server
+        mock_server = MagicMock()
+        with patch(
+            'awslabs.sagemaker_hyperpod_mcp_server.server.create_server', return_value=mock_server
+        ):
+            # Mock the handler initialization to verify allow_write is passed
+            with patch(
+                'awslabs.sagemaker_hyperpod_mcp_server.server.HyperPodClusterNodeHandler'
+            ) as mock_hyperpod_cluster_node_handler:
+                with patch(
+                    'awslabs.sagemaker_hyperpod_mcp_server.server.HyperPodStackHandler'
+                ) as mock_hyperpod_stack_handler:
+                    # Call the main function
+                    main()
+
+                    # Verify that parse_args was called
+                    mock_parse_args.assert_called_once()
+
+                    # Verify that the handlers were initialized with correct parameters
+                    mock_hyperpod_cluster_node_handler.assert_called_once_with(
+                        mock_server, False, False
+                    )
+                    mock_hyperpod_stack_handler.assert_called_once_with(mock_server, False)
+
+                    # Verify that run was called
+                    mock_server.run.assert_called_once()
+
+    # Test with write access enabled
+    with patch.object(argparse.ArgumentParser, 'parse_args') as mock_parse_args:
+        mock_parse_args.return_value = argparse.Namespace(
+            allow_write=True, allow_sensitive_data_access=False
+        )
+
+        # Mock create_server to return a mock server
+        mock_server = MagicMock()
+        with patch(
+            'awslabs.sagemaker_hyperpod_mcp_server.server.create_server', return_value=mock_server
+        ):
+            # Mock the handler initialization to verify allow_write is passed
+            with patch(
+                'awslabs.sagemaker_hyperpod_mcp_server.server.HyperPodClusterNodeHandler'
+            ) as mock_hyperpod_cluster_node_handler:
+                with patch(
+                    'awslabs.sagemaker_hyperpod_mcp_server.server.HyperPodStackHandler'
+                ) as mock_hyperpod_stack_handler:
+                    # Call the main function
+                    main()
+
+                    # Verify that parse_args was called
+                    mock_parse_args.assert_called_once()
+
+                    # Verify that the handlers were initialized with correct parameters
+                    mock_hyperpod_cluster_node_handler.assert_called_once_with(
+                        mock_server, True, False
+                    )
+                    mock_hyperpod_stack_handler.assert_called_once_with(mock_server, True)
+
+                    # Verify that run was called
+                    mock_server.run.assert_called_once()
+
+    # Test with sensitive data access enabled
+    with patch.object(argparse.ArgumentParser, 'parse_args') as mock_parse_args:
+        mock_parse_args.return_value = argparse.Namespace(
+            allow_write=False, allow_sensitive_data_access=True
+        )
+
+        # Mock create_server to return a mock server
+        mock_server = MagicMock()
+        with patch(
+            'awslabs.sagemaker_hyperpod_mcp_server.server.create_server', return_value=mock_server
+        ):
+            # Mock the handler initialization to verify allow_sensitive_data_access is passed
+            with patch(
+                'awslabs.sagemaker_hyperpod_mcp_server.server.HyperPodClusterNodeHandler'
+            ) as mock_hyperpod_cluster_node_handler:
+                with patch(
+                    'awslabs.sagemaker_hyperpod_mcp_server.server.HyperPodStackHandler'
+                ) as mock_hyperpod_stack_handler:
+                    # Call the main function
+                    main()
+
+                    # Verify that parse_args was called
+                    mock_parse_args.assert_called_once()
+
+                    # Verify that the handlers were initialized with correct parameters
+                    mock_hyperpod_cluster_node_handler.assert_called_once_with(
+                        mock_server, False, True
+                    )
+                    mock_hyperpod_stack_handler.assert_called_once_with(mock_server, False)
+
+                    # Verify that run was called
+                    mock_server.run.assert_called_once()
+
+    # Test with both write access and sensitive data access enabled
+    with patch.object(argparse.ArgumentParser, 'parse_args') as mock_parse_args:
+        mock_parse_args.return_value = argparse.Namespace(
+            allow_write=True, allow_sensitive_data_access=True
+        )
+
+        # Mock create_server to return a mock server
+        mock_server = MagicMock()
+        with patch(
+            'awslabs.sagemaker_hyperpod_mcp_server.server.create_server', return_value=mock_server
+        ):
+            # Mock the handler initialization to verify both flags are passed
+            with patch(
+                'awslabs.sagemaker_hyperpod_mcp_server.server.HyperPodClusterNodeHandler'
+            ) as mock_hyperpod_cluster_node_handler:
+                with patch(
+                    'awslabs.sagemaker_hyperpod_mcp_server.server.HyperPodStackHandler'
+                ) as mock_hyperpod_stack_handler:
+                    # Call the main function
+                    main()
+
+                    # Verify that parse_args was called
+                    mock_parse_args.assert_called_once()
+
+                    # Verify that the handlers were initialized with both flags
+                    mock_hyperpod_cluster_node_handler.assert_called_once_with(
+                        mock_server, True, True
+                    )
+                    mock_hyperpod_stack_handler.assert_called_once_with(mock_server, True)
+
+                    # Verify that run was called
+                    mock_server.run.assert_called_once()
+
+
+@pytest.mark.asyncio
+async def test_hyperpod_cluster_node_handler_initialization():
+    """Test the initialization of the HyperPodClusterNodeHandler."""
+    # Create a mock MCP server
+    mock_mcp = MagicMock()
+
+    HyperPodClusterNodeHandler(mock_mcp)
+
+    # Verify that the tools were registered
+    assert mock_mcp.tool.call_count == 3
+
+    # Get all call args
+    call_args_list = mock_mcp.tool.call_args_list
+
+    # Get all tool names that were registered
+    tool_names = [call_args[1]['name'] for call_args in call_args_list]
+
+    # Verify that all tools are registered
+    assert 'describe_hp_cluster' in tool_names
+    assert 'update_hp_cluster' in tool_names
+    assert 'manage_hyperpod_cluster_nodes' in tool_names
+
+
+@pytest.mark.asyncio
+async def test_hyperpod_stack_handler_initialization():
+    """Test the initialization of the HyperPodStackHandler."""
+    # Create a mock MCP server
+    mock_mcp = MagicMock()
+
+    HyperPodStackHandler(mock_mcp)
+
+    # Verify that the tool was registered
+    mock_mcp.tool.assert_called_once()
+    call_args = mock_mcp.tool.call_args
+    assert call_args[1]['name'] == 'manage_hyperpod_stacks'
diff --git a/src/sagemaker-hyperpod-mcp-server/uv-requirements.txt b/src/sagemaker-hyperpod-mcp-server/uv-requirements.txt
new file mode 100644
index 0000000000..f1288cd577
--- /dev/null
+++ b/src/sagemaker-hyperpod-mcp-server/uv-requirements.txt
@@ -0,0 +1,23 @@
+# This file was autogenerated by uv via the following command:
+#    uv pip compile --generate-hashes --output-file=uv-requirements.txt --strip-extras --python=3.10 uv-requirements.in
+uv==0.8.10 \
+    --hash=sha256:31e4fc37ee94b94c032384a0957ad32ba7dce4ce6c04b4880fd3e31e25e51a82 \
+    --hash=sha256:36a5ce708d52388c37043e7335f9eb3fea5a19a56166a2cc6adb365179a1cd77 \
+    --hash=sha256:38286d230daad82388469c8dc7a1d2f5dc279c11178319c886d1a88d7938e513 \
+    --hash=sha256:3e190cee3bb2b4f574a419eef87ae8e33f713e9cd6f856b83277ece70ad9ca9b \
+    --hash=sha256:3fdf89fc40af9902141c39ed943bcfca15664623363335eb032a44f22001e2b4 \
+    --hash=sha256:4cc190d403a89e46d13cec83b6f8e8d7d07aaf1e5a996eac9a3f0c2a8cd92537 \
+    --hash=sha256:57b71dc79eff25a5419d3fe4a563d3b9397f55d789f685ef27f43f033b31f482 \
+    --hash=sha256:86fe044c2be43977566a0d184a487edd7aace2febb757fd95927684b629ef50b \
+    --hash=sha256:88df34c32555064fae459cce665757619fd1af7deb2dc393352b15d909d2d131 \
+    --hash=sha256:9ad21eeaa4156a1bf5ed85903f80db06e2c02badd3a587ba98d3171517960555 \
+    --hash=sha256:a5495b5a6e3111c03cf5e4dbdd598bc8fd1da887e3920d58cd5a2d4c8bc9a473 \
+    --hash=sha256:ab072cd3bf2f9dc264659a1ff48ad91a910ac4830bcfe965e2d3f89c86646f46 \
+    --hash=sha256:af8a5526b0e331775a264fa0dbccfd53c183cb974f269a208af136d7561f9eb2 \
+    --hash=sha256:b00637c63d5dfc9f879281c5c91db2bb909ab1f9ab275dab015e7fb6cac6be5b \
+    --hash=sha256:b3ff3c451fcd23ea78356d8c18e802d0e423cbe655273601e3ec039a51b33286 \
+    --hash=sha256:c4a493cd4b15b3aef11523531aff96a77a586666a63e842fa437966b7b7ee62d \
+    --hash=sha256:defc50bb319be2d58be74a680710cd4b7697e88d5f79974eacd354df95f0b6b0 \
+    --hash=sha256:e0a02bcec766eb0862b7082ab746b204add7d9fcaa62322502d159b5a7ccc54a \
+    --hash=sha256:eb79a46d8099f563ef58237bf4e9009f876a40145e757ea883a92b24b724d01e
+    # via -r uv-requirements.in
diff --git a/src/sagemaker-hyperpod-mcp-server/uv.lock b/src/sagemaker-hyperpod-mcp-server/uv.lock
new file mode 100644
index 0000000000..1be61b0a27
--- /dev/null
+++ b/src/sagemaker-hyperpod-mcp-server/uv.lock
@@ -0,0 +1,1420 @@
+version = 1
+revision = 2
+requires-python = ">=3.10"
+
+[[package]]
+name = "annotated-types"
+version = "0.7.0"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/ee/67/531ea369ba64dcff5ec9c3402f9f51bf748cec26dde048a2f973a4eea7f5/annotated_types-0.7.0.tar.gz", hash = "sha256:aff07c09a53a08bc8cfccb9c85b05f1aa9a2a6f23728d790723543408344ce89", size = 16081, upload-time = "2024-05-20T21:33:25.928Z" }
+wheels = [
+    { url = "https://files.pythonhosted.org/packages/78/b6/6307fbef88d9b5ee7421e68d78a9f162e0da4900bc5f5793f6d3d0e34fb8/annotated_types-0.7.0-py3-none-any.whl", hash = "sha256:1f02e8b43a8fbbc3f3e0d4f0f4bfc8131bcb4eebe8849b8e5c773f3a1c582a53", size = 13643, upload-time = "2024-05-20T21:33:24.1Z" },
+]
+
+[[package]]
+name = "anyio"
+version = "4.9.0"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+    { name = "exceptiongroup", marker = "python_full_version < '3.11'" },
+    { name = "idna" },
+    { name = "sniffio" },
+    { name = "typing-extensions", marker = "python_full_version < '3.13'" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/95/7d/4c1bd541d4dffa1b52bd83fb8527089e097a106fc90b467a7313b105f840/anyio-4.9.0.tar.gz", hash = "sha256:673c0c244e15788651a4ff38710fea9675823028a6f08a5eda409e0c9840a028", size = 190949, upload-time = "2025-03-17T00:02:54.77Z" }
+wheels = [
+    { url = "https://files.pythonhosted.org/packages/a1/ee/48ca1a7c89ffec8b6a0c5d02b89c305671d5ffd8d3c94acf8b8c408575bb/anyio-4.9.0-py3-none-any.whl", hash = "sha256:9f76d541cad6e36af7beb62e978876f3b41e3e04f2c1fbf0884604c0a9c4d93c", size = 100916, upload-time = "2025-03-17T00:02:52.713Z" },
+]
+
+[[package]]
+name = "argcomplete"
+version = "3.6.2"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/16/0f/861e168fc813c56a78b35f3c30d91c6757d1fd185af1110f1aec784b35d0/argcomplete-3.6.2.tar.gz", hash = "sha256:d0519b1bc867f5f4f4713c41ad0aba73a4a5f007449716b16f385f2166dc6adf", size = 73403, upload-time = "2025-04-03T04:57:03.52Z" }
+wheels = [
+    { url = "https://files.pythonhosted.org/packages/31/da/e42d7a9d8dd33fa775f467e4028a47936da2f01e4b0e561f9ba0d74cb0ca/argcomplete-3.6.2-py3-none-any.whl", hash = "sha256:65b3133a29ad53fb42c48cf5114752c7ab66c1c38544fdf6460f450c09b42591", size = 43708, upload-time = "2025-04-03T04:57:01.591Z" },
+]
+
+[[package]]
+name = "attrs"
+version = "25.3.0"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/5a/b0/1367933a8532ee6ff8d63537de4f1177af4bff9f3e829baf7331f595bb24/attrs-25.3.0.tar.gz", hash = "sha256:75d7cefc7fb576747b2c81b4442d4d4a1ce0900973527c011d1030fd3bf4af1b", size = 812032, upload-time = "2025-03-13T11:10:22.779Z" }
+wheels = [
+    { url = "https://files.pythonhosted.org/packages/77/06/bb80f5f86020c4551da315d78b3ab75e8228f89f0162f2c3a819e407941a/attrs-25.3.0-py3-none-any.whl", hash = "sha256:427318ce031701fea540783410126f03899a97ffc6f61596ad581ac2e40e3bc3", size = 63815, upload-time = "2025-03-13T11:10:21.14Z" },
+]
+
+[[package]]
+name = "awslabs-sagemaker-hyperpod-mcp-server"
+version = "0.0.0"
+source = { editable = "." }
+dependencies = [
+    { name = "boto3" },
+    { name = "cachetools" },
+    { name = "loguru" },
+    { name = "mcp", extra = ["cli"] },
+    { name = "pydantic" },
+    { name = "pyyaml" },
+    { name = "requests" },
+    { name = "requests-auth-aws-sigv4" },
+]
+
+[package.dev-dependencies]
+dev = [
+    { name = "commitizen" },
+    { name = "pre-commit" },
+    { name = "pyright" },
+    { name = "pytest" },
+    { name = "pytest-asyncio" },
+    { name = "pytest-cov" },
+    { name = "pytest-mock" },
+    { name = "ruff" },
+]
+
+[package.metadata]
+requires-dist = [
+    { name = "boto3", specifier = ">=1.34.0" },
+    { name = "cachetools", specifier = ">=5.3.0" },
+    { name = "loguru", specifier = ">=0.7.0" },
+    { name = "mcp", extras = ["cli"], specifier = ">=1.6.0" },
+    { name = "pydantic", specifier = ">=2.10.6" },
+    { name = "pyyaml", specifier = ">=6.0.0" },
+    { name = "requests", specifier = ">=2.31.0" },
+    { name = "requests-auth-aws-sigv4" },
+]
+
+[package.metadata.requires-dev]
+dev = [
+    { name = "commitizen", specifier = ">=4.2.2" },
+    { name = "pre-commit", specifier = ">=4.1.0" },
+    { name = "pyright", specifier = ">=1.1.398" },
+    { name = "pytest", specifier = ">=8.0.0" },
+    { name = "pytest-asyncio", specifier = ">=0.26.0" },
+    { name = "pytest-cov", specifier = ">=4.1.0" },
+    { name = "pytest-mock", specifier = ">=3.12.0" },
+    { name = "ruff", specifier = ">=0.9.7" },
+]
+
+[[package]]
+name = "backports-asyncio-runner"
+version = "1.2.0"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/8e/ff/70dca7d7cb1cbc0edb2c6cc0c38b65cba36cccc491eca64cabd5fe7f8670/backports_asyncio_runner-1.2.0.tar.gz", hash = "sha256:a5aa7b2b7d8f8bfcaa2b57313f70792df84e32a2a746f585213373f900b42162", size = 69893, upload-time = "2025-07-02T02:27:15.685Z" }
+wheels = [
+    { url = "https://files.pythonhosted.org/packages/a0/59/76ab57e3fe74484f48a53f8e337171b4a2349e506eabe136d7e01d059086/backports_asyncio_runner-1.2.0-py3-none-any.whl", hash = "sha256:0da0a936a8aeb554eccb426dc55af3ba63bcdc69fa1a600b5bb305413a4477b5", size = 12313, upload-time = "2025-07-02T02:27:14.263Z" },
+]
+
+[[package]]
+name = "boto3"
+version = "1.39.8"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+    { name = "botocore" },
+    { name = "jmespath" },
+    { name = "s3transfer" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/d1/ef/f8dbe6482bdf9eb0230f2639483cdd40ef5aaa89c2fb651f2edeee9c248a/boto3-1.39.8.tar.gz", hash = "sha256:456ea6baef037eb6205d64e012259d14f0c9300c9b30603890746c1a0882fa01", size = 111829, upload-time = "2025-07-17T19:19:14.828Z" }
+wheels = [
+    { url = "https://files.pythonhosted.org/packages/3c/f0/f3701472b2e6192e62d80e703186ae9c789b3d607ba22943702c500897d2/boto3-1.39.8-py3-none-any.whl", hash = "sha256:dcea5270ccced0b4b962eb5874cb71b6232ccfc6203e05bf834a314442e4a79c", size = 139886, upload-time = "2025-07-17T19:19:12.634Z" },
+]
+
+[[package]]
+name = "botocore"
+version = "1.39.8"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+    { name = "jmespath" },
+    { name = "python-dateutil" },
+    { name = "urllib3" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/32/57/16d3d21963975b9be180e96695abfb146695ae7db57f9a2d47e92d33ce9d/botocore-1.39.8.tar.gz", hash = "sha256:3848bd9057ea8dbc059e7764eda63bda575727ad1101dbd03636ab4a6f283fa5", size = 14205898, upload-time = "2025-07-17T19:19:03.832Z" }
+wheels = [
+    { url = "https://files.pythonhosted.org/packages/11/ac/51462dd35fc60d11cdce93ba82ccf1635a161ceadc646d89f67d666fff31/botocore-1.39.8-py3-none-any.whl", hash = "sha256:ab43f79c6893271934faba7ae1987a313d59576361c544c70a5391ade560891d", size = 13866818, upload-time = "2025-07-17T19:18:58.521Z" },
+]
+
+[[package]]
+name = "cachetools"
+version = "6.1.0"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/8a/89/817ad5d0411f136c484d535952aef74af9b25e0d99e90cdffbe121e6d628/cachetools-6.1.0.tar.gz", hash = "sha256:b4c4f404392848db3ce7aac34950d17be4d864da4b8b66911008e430bc544587", size = 30714, upload-time = "2025-06-16T18:51:03.07Z" }
+wheels = [
+    { url = "https://files.pythonhosted.org/packages/00/f0/2ef431fe4141f5e334759d73e81120492b23b2824336883a91ac04ba710b/cachetools-6.1.0-py3-none-any.whl", hash = "sha256:1c7bb3cf9193deaf3508b7c5f2a79986c13ea38965c5adcff1f84519cf39163e", size = 11189, upload-time = "2025-06-16T18:51:01.514Z" },
+]
+
+[[package]]
+name = "certifi"
+version = "2025.7.14"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/b3/76/52c535bcebe74590f296d6c77c86dabf761c41980e1347a2422e4aa2ae41/certifi-2025.7.14.tar.gz", hash = "sha256:8ea99dbdfaaf2ba2f9bac77b9249ef62ec5218e7c2b2e903378ed5fccf765995", size = 163981, upload-time = "2025-07-14T03:29:28.449Z" }
+wheels = [
+    { url = "https://files.pythonhosted.org/packages/4f/52/34c6cf5bb9285074dc3531c437b3919e825d976fde097a7a73f79e726d03/certifi-2025.7.14-py3-none-any.whl", hash = "sha256:6b31f564a415d79ee77df69d757bb49a5bb53bd9f756cbbe24394ffd6fc1f4b2", size = 162722, upload-time = "2025-07-14T03:29:26.863Z" },
+]
+
+[[package]]
+name = "cfgv"
+version = "3.4.0"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/11/74/539e56497d9bd1d484fd863dd69cbbfa653cd2aa27abfe35653494d85e94/cfgv-3.4.0.tar.gz", hash = "sha256:e52591d4c5f5dead8e0f673fb16db7949d2cfb3f7da4582893288f0ded8fe560", size = 7114, upload-time = "2023-08-12T20:38:17.776Z" }
+wheels = [
+    { url = "https://files.pythonhosted.org/packages/c5/55/51844dd50c4fc7a33b653bfaba4c2456f06955289ca770a5dbd5fd267374/cfgv-3.4.0-py2.py3-none-any.whl", hash = "sha256:b7265b1f29fd3316bfcd2b330d63d024f2bfd8bcb8b0272f8e19a504856c48f9", size = 7249, upload-time = "2023-08-12T20:38:16.269Z" },
+]
+
+[[package]]
+name = "charset-normalizer"
+version = "3.4.2"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/e4/33/89c2ced2b67d1c2a61c19c6751aa8902d46ce3dacb23600a283619f5a12d/charset_normalizer-3.4.2.tar.gz", hash = "sha256:5baececa9ecba31eff645232d59845c07aa030f0c81ee70184a90d35099a0e63", size = 126367, upload-time = "2025-05-02T08:34:42.01Z" }
+wheels = [
+    { url = "https://files.pythonhosted.org/packages/95/28/9901804da60055b406e1a1c5ba7aac1276fb77f1dde635aabfc7fd84b8ab/charset_normalizer-3.4.2-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:7c48ed483eb946e6c04ccbe02c6b4d1d48e51944b6db70f697e089c193404941", size = 201818, upload-time = "2025-05-02T08:31:46.725Z" },
+    { url = "https://files.pythonhosted.org/packages/d9/9b/892a8c8af9110935e5adcbb06d9c6fe741b6bb02608c6513983048ba1a18/charset_normalizer-3.4.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:b2d318c11350e10662026ad0eb71bb51c7812fc8590825304ae0bdd4ac283acd", size = 144649, upload-time = "2025-05-02T08:31:48.889Z" },
+    { url = "https://files.pythonhosted.org/packages/7b/a5/4179abd063ff6414223575e008593861d62abfc22455b5d1a44995b7c101/charset_normalizer-3.4.2-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:9cbfacf36cb0ec2897ce0ebc5d08ca44213af24265bd56eca54bee7923c48fd6", size = 155045, upload-time = "2025-05-02T08:31:50.757Z" },
+    { url = "https://files.pythonhosted.org/packages/3b/95/bc08c7dfeddd26b4be8c8287b9bb055716f31077c8b0ea1cd09553794665/charset_normalizer-3.4.2-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:18dd2e350387c87dabe711b86f83c9c78af772c748904d372ade190b5c7c9d4d", size = 147356, upload-time = "2025-05-02T08:31:52.634Z" },
+    { url = "https://files.pythonhosted.org/packages/a8/2d/7a5b635aa65284bf3eab7653e8b4151ab420ecbae918d3e359d1947b4d61/charset_normalizer-3.4.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8075c35cd58273fee266c58c0c9b670947c19df5fb98e7b66710e04ad4e9ff86", size = 149471, upload-time = "2025-05-02T08:31:56.207Z" },
+    { url = "https://files.pythonhosted.org/packages/ae/38/51fc6ac74251fd331a8cfdb7ec57beba8c23fd5493f1050f71c87ef77ed0/charset_normalizer-3.4.2-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:5bf4545e3b962767e5c06fe1738f951f77d27967cb2caa64c28be7c4563e162c", size = 151317, upload-time = "2025-05-02T08:31:57.613Z" },
+    { url = "https://files.pythonhosted.org/packages/b7/17/edee1e32215ee6e9e46c3e482645b46575a44a2d72c7dfd49e49f60ce6bf/charset_normalizer-3.4.2-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:7a6ab32f7210554a96cd9e33abe3ddd86732beeafc7a28e9955cdf22ffadbab0", size = 146368, upload-time = "2025-05-02T08:31:59.468Z" },
+    { url = "https://files.pythonhosted.org/packages/26/2c/ea3e66f2b5f21fd00b2825c94cafb8c326ea6240cd80a91eb09e4a285830/charset_normalizer-3.4.2-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:b33de11b92e9f75a2b545d6e9b6f37e398d86c3e9e9653c4864eb7e89c5773ef", size = 154491, upload-time = "2025-05-02T08:32:01.219Z" },
+    { url = "https://files.pythonhosted.org/packages/52/47/7be7fa972422ad062e909fd62460d45c3ef4c141805b7078dbab15904ff7/charset_normalizer-3.4.2-cp310-cp310-musllinux_1_2_ppc64le.whl", hash = "sha256:8755483f3c00d6c9a77f490c17e6ab0c8729e39e6390328e42521ef175380ae6", size = 157695, upload-time = "2025-05-02T08:32:03.045Z" },
+    { url = "https://files.pythonhosted.org/packages/2f/42/9f02c194da282b2b340f28e5fb60762de1151387a36842a92b533685c61e/charset_normalizer-3.4.2-cp310-cp310-musllinux_1_2_s390x.whl", hash = "sha256:68a328e5f55ec37c57f19ebb1fdc56a248db2e3e9ad769919a58672958e8f366", size = 154849, upload-time = "2025-05-02T08:32:04.651Z" },
+    { url = "https://files.pythonhosted.org/packages/67/44/89cacd6628f31fb0b63201a618049be4be2a7435a31b55b5eb1c3674547a/charset_normalizer-3.4.2-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:21b2899062867b0e1fde9b724f8aecb1af14f2778d69aacd1a5a1853a597a5db", size = 150091, upload-time = "2025-05-02T08:32:06.719Z" },
+    { url = "https://files.pythonhosted.org/packages/1f/79/4b8da9f712bc079c0f16b6d67b099b0b8d808c2292c937f267d816ec5ecc/charset_normalizer-3.4.2-cp310-cp310-win32.whl", hash = "sha256:e8082b26888e2f8b36a042a58307d5b917ef2b1cacab921ad3323ef91901c71a", size = 98445, upload-time = "2025-05-02T08:32:08.66Z" },
+    { url = "https://files.pythonhosted.org/packages/7d/d7/96970afb4fb66497a40761cdf7bd4f6fca0fc7bafde3a84f836c1f57a926/charset_normalizer-3.4.2-cp310-cp310-win_amd64.whl", hash = "sha256:f69a27e45c43520f5487f27627059b64aaf160415589230992cec34c5e18a509", size = 105782, upload-time = "2025-05-02T08:32:10.46Z" },
+    { url = "https://files.pythonhosted.org/packages/05/85/4c40d00dcc6284a1c1ad5de5e0996b06f39d8232f1031cd23c2f5c07ee86/charset_normalizer-3.4.2-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:be1e352acbe3c78727a16a455126d9ff83ea2dfdcbc83148d2982305a04714c2", size = 198794, upload-time = "2025-05-02T08:32:11.945Z" },
+    { url = "https://files.pythonhosted.org/packages/41/d9/7a6c0b9db952598e97e93cbdfcb91bacd89b9b88c7c983250a77c008703c/charset_normalizer-3.4.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:aa88ca0b1932e93f2d961bf3addbb2db902198dca337d88c89e1559e066e7645", size = 142846, upload-time = "2025-05-02T08:32:13.946Z" },
+    { url = "https://files.pythonhosted.org/packages/66/82/a37989cda2ace7e37f36c1a8ed16c58cf48965a79c2142713244bf945c89/charset_normalizer-3.4.2-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:d524ba3f1581b35c03cb42beebab4a13e6cdad7b36246bd22541fa585a56cccd", size = 153350, upload-time = "2025-05-02T08:32:15.873Z" },
+    { url = "https://files.pythonhosted.org/packages/df/68/a576b31b694d07b53807269d05ec3f6f1093e9545e8607121995ba7a8313/charset_normalizer-3.4.2-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:28a1005facc94196e1fb3e82a3d442a9d9110b8434fc1ded7a24a2983c9888d8", size = 145657, upload-time = "2025-05-02T08:32:17.283Z" },
+    { url = "https://files.pythonhosted.org/packages/92/9b/ad67f03d74554bed3aefd56fe836e1623a50780f7c998d00ca128924a499/charset_normalizer-3.4.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:fdb20a30fe1175ecabed17cbf7812f7b804b8a315a25f24678bcdf120a90077f", size = 147260, upload-time = "2025-05-02T08:32:18.807Z" },
+    { url = "https://files.pythonhosted.org/packages/a6/e6/8aebae25e328160b20e31a7e9929b1578bbdc7f42e66f46595a432f8539e/charset_normalizer-3.4.2-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:0f5d9ed7f254402c9e7d35d2f5972c9bbea9040e99cd2861bd77dc68263277c7", size = 149164, upload-time = "2025-05-02T08:32:20.333Z" },
+    { url = "https://files.pythonhosted.org/packages/8b/f2/b3c2f07dbcc248805f10e67a0262c93308cfa149a4cd3d1fe01f593e5fd2/charset_normalizer-3.4.2-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:efd387a49825780ff861998cd959767800d54f8308936b21025326de4b5a42b9", size = 144571, upload-time = "2025-05-02T08:32:21.86Z" },
+    { url = "https://files.pythonhosted.org/packages/60/5b/c3f3a94bc345bc211622ea59b4bed9ae63c00920e2e8f11824aa5708e8b7/charset_normalizer-3.4.2-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:f0aa37f3c979cf2546b73e8222bbfa3dc07a641585340179d768068e3455e544", size = 151952, upload-time = "2025-05-02T08:32:23.434Z" },
+    { url = "https://files.pythonhosted.org/packages/e2/4d/ff460c8b474122334c2fa394a3f99a04cf11c646da895f81402ae54f5c42/charset_normalizer-3.4.2-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = "sha256:e70e990b2137b29dc5564715de1e12701815dacc1d056308e2b17e9095372a82", size = 155959, upload-time = "2025-05-02T08:32:24.993Z" },
+    { url = "https://files.pythonhosted.org/packages/a2/2b/b964c6a2fda88611a1fe3d4c400d39c66a42d6c169c924818c848f922415/charset_normalizer-3.4.2-cp311-cp311-musllinux_1_2_s390x.whl", hash = "sha256:0c8c57f84ccfc871a48a47321cfa49ae1df56cd1d965a09abe84066f6853b9c0", size = 153030, upload-time = "2025-05-02T08:32:26.435Z" },
+    { url = "https://files.pythonhosted.org/packages/59/2e/d3b9811db26a5ebf444bc0fa4f4be5aa6d76fc6e1c0fd537b16c14e849b6/charset_normalizer-3.4.2-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:6b66f92b17849b85cad91259efc341dce9c1af48e2173bf38a85c6329f1033e5", size = 148015, upload-time = "2025-05-02T08:32:28.376Z" },
+    { url = "https://files.pythonhosted.org/packages/90/07/c5fd7c11eafd561bb51220d600a788f1c8d77c5eef37ee49454cc5c35575/charset_normalizer-3.4.2-cp311-cp311-win32.whl", hash = "sha256:daac4765328a919a805fa5e2720f3e94767abd632ae410a9062dff5412bae65a", size = 98106, upload-time = "2025-05-02T08:32:30.281Z" },
+    { url = "https://files.pythonhosted.org/packages/a8/05/5e33dbef7e2f773d672b6d79f10ec633d4a71cd96db6673625838a4fd532/charset_normalizer-3.4.2-cp311-cp311-win_amd64.whl", hash = "sha256:e53efc7c7cee4c1e70661e2e112ca46a575f90ed9ae3fef200f2a25e954f4b28", size = 105402, upload-time = "2025-05-02T08:32:32.191Z" },
+    { url = "https://files.pythonhosted.org/packages/d7/a4/37f4d6035c89cac7930395a35cc0f1b872e652eaafb76a6075943754f095/charset_normalizer-3.4.2-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:0c29de6a1a95f24b9a1aa7aefd27d2487263f00dfd55a77719b530788f75cff7", size = 199936, upload-time = "2025-05-02T08:32:33.712Z" },
+    { url = "https://files.pythonhosted.org/packages/ee/8a/1a5e33b73e0d9287274f899d967907cd0bf9c343e651755d9307e0dbf2b3/charset_normalizer-3.4.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:cddf7bd982eaa998934a91f69d182aec997c6c468898efe6679af88283b498d3", size = 143790, upload-time = "2025-05-02T08:32:35.768Z" },
+    { url = "https://files.pythonhosted.org/packages/66/52/59521f1d8e6ab1482164fa21409c5ef44da3e9f653c13ba71becdd98dec3/charset_normalizer-3.4.2-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:fcbe676a55d7445b22c10967bceaaf0ee69407fbe0ece4d032b6eb8d4565982a", size = 153924, upload-time = "2025-05-02T08:32:37.284Z" },
+    { url =
"https://files.pythonhosted.org/packages/86/2d/fb55fdf41964ec782febbf33cb64be480a6b8f16ded2dbe8db27a405c09f/charset_normalizer-3.4.2-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:d41c4d287cfc69060fa91cae9683eacffad989f1a10811995fa309df656ec214", size = 146626, upload-time = "2025-05-02T08:32:38.803Z" }, + { url = "https://files.pythonhosted.org/packages/8c/73/6ede2ec59bce19b3edf4209d70004253ec5f4e319f9a2e3f2f15601ed5f7/charset_normalizer-3.4.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:4e594135de17ab3866138f496755f302b72157d115086d100c3f19370839dd3a", size = 148567, upload-time = "2025-05-02T08:32:40.251Z" }, + { url = "https://files.pythonhosted.org/packages/09/14/957d03c6dc343c04904530b6bef4e5efae5ec7d7990a7cbb868e4595ee30/charset_normalizer-3.4.2-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:cf713fe9a71ef6fd5adf7a79670135081cd4431c2943864757f0fa3a65b1fafd", size = 150957, upload-time = "2025-05-02T08:32:41.705Z" }, + { url = "https://files.pythonhosted.org/packages/0d/c8/8174d0e5c10ccebdcb1b53cc959591c4c722a3ad92461a273e86b9f5a302/charset_normalizer-3.4.2-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:a370b3e078e418187da8c3674eddb9d983ec09445c99a3a263c2011993522981", size = 145408, upload-time = "2025-05-02T08:32:43.709Z" }, + { url = "https://files.pythonhosted.org/packages/58/aa/8904b84bc8084ac19dc52feb4f5952c6df03ffb460a887b42615ee1382e8/charset_normalizer-3.4.2-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:a955b438e62efdf7e0b7b52a64dc5c3396e2634baa62471768a64bc2adb73d5c", size = 153399, upload-time = "2025-05-02T08:32:46.197Z" }, + { url = "https://files.pythonhosted.org/packages/c2/26/89ee1f0e264d201cb65cf054aca6038c03b1a0c6b4ae998070392a3ce605/charset_normalizer-3.4.2-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:7222ffd5e4de8e57e03ce2cef95a4c43c98fcb72ad86909abdfc2c17d227fc1b", size = 156815, upload-time = 
"2025-05-02T08:32:48.105Z" }, + { url = "https://files.pythonhosted.org/packages/fd/07/68e95b4b345bad3dbbd3a8681737b4338ff2c9df29856a6d6d23ac4c73cb/charset_normalizer-3.4.2-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:bee093bf902e1d8fc0ac143c88902c3dfc8941f7ea1d6a8dd2bcb786d33db03d", size = 154537, upload-time = "2025-05-02T08:32:49.719Z" }, + { url = "https://files.pythonhosted.org/packages/77/1a/5eefc0ce04affb98af07bc05f3bac9094513c0e23b0562d64af46a06aae4/charset_normalizer-3.4.2-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:dedb8adb91d11846ee08bec4c8236c8549ac721c245678282dcb06b221aab59f", size = 149565, upload-time = "2025-05-02T08:32:51.404Z" }, + { url = "https://files.pythonhosted.org/packages/37/a0/2410e5e6032a174c95e0806b1a6585eb21e12f445ebe239fac441995226a/charset_normalizer-3.4.2-cp312-cp312-win32.whl", hash = "sha256:db4c7bf0e07fc3b7d89ac2a5880a6a8062056801b83ff56d8464b70f65482b6c", size = 98357, upload-time = "2025-05-02T08:32:53.079Z" }, + { url = "https://files.pythonhosted.org/packages/6c/4f/c02d5c493967af3eda9c771ad4d2bbc8df6f99ddbeb37ceea6e8716a32bc/charset_normalizer-3.4.2-cp312-cp312-win_amd64.whl", hash = "sha256:5a9979887252a82fefd3d3ed2a8e3b937a7a809f65dcb1e068b090e165bbe99e", size = 105776, upload-time = "2025-05-02T08:32:54.573Z" }, + { url = "https://files.pythonhosted.org/packages/ea/12/a93df3366ed32db1d907d7593a94f1fe6293903e3e92967bebd6950ed12c/charset_normalizer-3.4.2-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:926ca93accd5d36ccdabd803392ddc3e03e6d4cd1cf17deff3b989ab8e9dbcf0", size = 199622, upload-time = "2025-05-02T08:32:56.363Z" }, + { url = "https://files.pythonhosted.org/packages/04/93/bf204e6f344c39d9937d3c13c8cd5bbfc266472e51fc8c07cb7f64fcd2de/charset_normalizer-3.4.2-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:eba9904b0f38a143592d9fc0e19e2df0fa2e41c3c3745554761c5f6447eedabf", size = 143435, upload-time = "2025-05-02T08:32:58.551Z" }, + { url = 
"https://files.pythonhosted.org/packages/22/2a/ea8a2095b0bafa6c5b5a55ffdc2f924455233ee7b91c69b7edfcc9e02284/charset_normalizer-3.4.2-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:3fddb7e2c84ac87ac3a947cb4e66d143ca5863ef48e4a5ecb83bd48619e4634e", size = 153653, upload-time = "2025-05-02T08:33:00.342Z" }, + { url = "https://files.pythonhosted.org/packages/b6/57/1b090ff183d13cef485dfbe272e2fe57622a76694061353c59da52c9a659/charset_normalizer-3.4.2-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:98f862da73774290f251b9df8d11161b6cf25b599a66baf087c1ffe340e9bfd1", size = 146231, upload-time = "2025-05-02T08:33:02.081Z" }, + { url = "https://files.pythonhosted.org/packages/e2/28/ffc026b26f441fc67bd21ab7f03b313ab3fe46714a14b516f931abe1a2d8/charset_normalizer-3.4.2-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6c9379d65defcab82d07b2a9dfbfc2e95bc8fe0ebb1b176a3190230a3ef0e07c", size = 148243, upload-time = "2025-05-02T08:33:04.063Z" }, + { url = "https://files.pythonhosted.org/packages/c0/0f/9abe9bd191629c33e69e47c6ef45ef99773320e9ad8e9cb08b8ab4a8d4cb/charset_normalizer-3.4.2-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:e635b87f01ebc977342e2697d05b56632f5f879a4f15955dfe8cef2448b51691", size = 150442, upload-time = "2025-05-02T08:33:06.418Z" }, + { url = "https://files.pythonhosted.org/packages/67/7c/a123bbcedca91d5916c056407f89a7f5e8fdfce12ba825d7d6b9954a1a3c/charset_normalizer-3.4.2-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:1c95a1e2902a8b722868587c0e1184ad5c55631de5afc0eb96bc4b0d738092c0", size = 145147, upload-time = "2025-05-02T08:33:08.183Z" }, + { url = "https://files.pythonhosted.org/packages/ec/fe/1ac556fa4899d967b83e9893788e86b6af4d83e4726511eaaad035e36595/charset_normalizer-3.4.2-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:ef8de666d6179b009dce7bcb2ad4c4a779f113f12caf8dc77f0162c29d20490b", size = 
153057, upload-time = "2025-05-02T08:33:09.986Z" }, + { url = "https://files.pythonhosted.org/packages/2b/ff/acfc0b0a70b19e3e54febdd5301a98b72fa07635e56f24f60502e954c461/charset_normalizer-3.4.2-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:32fc0341d72e0f73f80acb0a2c94216bd704f4f0bce10aedea38f30502b271ff", size = 156454, upload-time = "2025-05-02T08:33:11.814Z" }, + { url = "https://files.pythonhosted.org/packages/92/08/95b458ce9c740d0645feb0e96cea1f5ec946ea9c580a94adfe0b617f3573/charset_normalizer-3.4.2-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:289200a18fa698949d2b39c671c2cc7a24d44096784e76614899a7ccf2574b7b", size = 154174, upload-time = "2025-05-02T08:33:13.707Z" }, + { url = "https://files.pythonhosted.org/packages/78/be/8392efc43487ac051eee6c36d5fbd63032d78f7728cb37aebcc98191f1ff/charset_normalizer-3.4.2-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:4a476b06fbcf359ad25d34a057b7219281286ae2477cc5ff5e3f70a246971148", size = 149166, upload-time = "2025-05-02T08:33:15.458Z" }, + { url = "https://files.pythonhosted.org/packages/44/96/392abd49b094d30b91d9fbda6a69519e95802250b777841cf3bda8fe136c/charset_normalizer-3.4.2-cp313-cp313-win32.whl", hash = "sha256:aaeeb6a479c7667fbe1099af9617c83aaca22182d6cf8c53966491a0f1b7ffb7", size = 98064, upload-time = "2025-05-02T08:33:17.06Z" }, + { url = "https://files.pythonhosted.org/packages/e9/b0/0200da600134e001d91851ddc797809e2fe0ea72de90e09bec5a2fbdaccb/charset_normalizer-3.4.2-cp313-cp313-win_amd64.whl", hash = "sha256:aa6af9e7d59f9c12b33ae4e9450619cf2488e2bbe9b44030905877f0b2324980", size = 105641, upload-time = "2025-05-02T08:33:18.753Z" }, + { url = "https://files.pythonhosted.org/packages/20/94/c5790835a017658cbfabd07f3bfb549140c3ac458cfc196323996b10095a/charset_normalizer-3.4.2-py3-none-any.whl", hash = "sha256:7f56930ab0abd1c45cd15be65cc741c28b1c9a34876ce8c17a2fa107810c0af0", size = 52626, upload-time = "2025-05-02T08:34:40.053Z" }, +] + +[[package]] +name = "click" +version = "8.2.1" 
+source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "colorama", marker = "sys_platform == 'win32'" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/60/6c/8ca2efa64cf75a977a0d7fac081354553ebe483345c734fb6b6515d96bbc/click-8.2.1.tar.gz", hash = "sha256:27c491cc05d968d271d5a1db13e3b5a184636d9d930f148c50b038f0d0646202", size = 286342, upload-time = "2025-05-20T23:19:49.832Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/85/32/10bb5764d90a8eee674e9dc6f4db6a0ab47c8c4d0d83c27f7c39ac415a4d/click-8.2.1-py3-none-any.whl", hash = "sha256:61a3265b914e850b85317d0b3109c7f8cd35a670f963866005d6ef1d5175a12b", size = 102215, upload-time = "2025-05-20T23:19:47.796Z" }, +] + +[[package]] +name = "colorama" +version = "0.4.6" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/d8/53/6f443c9a4a8358a93a6792e2acffb9d9d5cb0a5cfd8802644b7b1c9a02e4/colorama-0.4.6.tar.gz", hash = "sha256:08695f5cb7ed6e0531a20572697297273c47b8cae5a63ffc6d6ed5c201be6e44", size = 27697, upload-time = "2022-10-25T02:36:22.414Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/d1/d6/3965ed04c63042e047cb6a3e6ed1a63a35087b6a609aa3a15ed8ac56c221/colorama-0.4.6-py2.py3-none-any.whl", hash = "sha256:4f1d9991f5acc0ca119f9d443620b77f9d6b33703e51011c16baf57afb285fc6", size = 25335, upload-time = "2022-10-25T02:36:20.889Z" }, +] + +[[package]] +name = "commitizen" +version = "4.8.3" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "argcomplete" }, + { name = "charset-normalizer" }, + { name = "colorama" }, + { name = "decli" }, + { name = "importlib-metadata" }, + { name = "jinja2" }, + { name = "packaging" }, + { name = "pyyaml" }, + { name = "questionary" }, + { name = "termcolor" }, + { name = "tomlkit" }, + { name = "typing-extensions", marker = "python_full_version < '3.11'" }, +] +sdist = { url = 
"https://files.pythonhosted.org/packages/ee/c0/fe5ba5555f2891bcb0b3e7dc1c57fcfd206ab7133a3094d70b81fd5a4a10/commitizen-4.8.3.tar.gz", hash = "sha256:303ebdc271217aadbb6a73a015612121291d180c8cdd05b5251c7923d4a14195", size = 56225, upload-time = "2025-06-09T14:18:51.472Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/11/37/5a8e1dadd02eede38bf5a92af108071f6a11b6fc50b7ae27d9083c649ba9/commitizen-4.8.3-py3-none-any.whl", hash = "sha256:91f261387ca2bbb4ab6c79a1a6378dc1576ffb40e3b7dbee201724d95aceba38", size = 80112, upload-time = "2025-06-09T14:18:49.673Z" }, +] + +[[package]] +name = "coverage" +version = "7.9.2" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/04/b7/c0465ca253df10a9e8dae0692a4ae6e9726d245390aaef92360e1d6d3832/coverage-7.9.2.tar.gz", hash = "sha256:997024fa51e3290264ffd7492ec97d0690293ccd2b45a6cd7d82d945a4a80c8b", size = 813556, upload-time = "2025-07-03T10:54:15.101Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/a1/0d/5c2114fd776c207bd55068ae8dc1bef63ecd1b767b3389984a8e58f2b926/coverage-7.9.2-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:66283a192a14a3854b2e7f3418d7db05cdf411012ab7ff5db98ff3b181e1f912", size = 212039, upload-time = "2025-07-03T10:52:38.955Z" }, + { url = "https://files.pythonhosted.org/packages/cf/ad/dc51f40492dc2d5fcd31bb44577bc0cc8920757d6bc5d3e4293146524ef9/coverage-7.9.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:4e01d138540ef34fcf35c1aa24d06c3de2a4cffa349e29a10056544f35cca15f", size = 212428, upload-time = "2025-07-03T10:52:41.36Z" }, + { url = "https://files.pythonhosted.org/packages/a2/a3/55cb3ff1b36f00df04439c3993d8529193cdf165a2467bf1402539070f16/coverage-7.9.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f22627c1fe2745ee98d3ab87679ca73a97e75ca75eb5faee48660d060875465f", size = 241534, upload-time = "2025-07-03T10:52:42.956Z" }, + { url = 
"https://files.pythonhosted.org/packages/eb/c9/a8410b91b6be4f6e9c2e9f0dce93749b6b40b751d7065b4410bf89cb654b/coverage-7.9.2-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:4b1c2d8363247b46bd51f393f86c94096e64a1cf6906803fa8d5a9d03784bdbf", size = 239408, upload-time = "2025-07-03T10:52:44.199Z" }, + { url = "https://files.pythonhosted.org/packages/ff/c4/6f3e56d467c612b9070ae71d5d3b114c0b899b5788e1ca3c93068ccb7018/coverage-7.9.2-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c10c882b114faf82dbd33e876d0cbd5e1d1ebc0d2a74ceef642c6152f3f4d547", size = 240552, upload-time = "2025-07-03T10:52:45.477Z" }, + { url = "https://files.pythonhosted.org/packages/fd/20/04eda789d15af1ce79bce5cc5fd64057c3a0ac08fd0576377a3096c24663/coverage-7.9.2-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:de3c0378bdf7066c3988d66cd5232d161e933b87103b014ab1b0b4676098fa45", size = 240464, upload-time = "2025-07-03T10:52:46.809Z" }, + { url = "https://files.pythonhosted.org/packages/a9/5a/217b32c94cc1a0b90f253514815332d08ec0812194a1ce9cca97dda1cd20/coverage-7.9.2-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:1e2f097eae0e5991e7623958a24ced3282676c93c013dde41399ff63e230fcf2", size = 239134, upload-time = "2025-07-03T10:52:48.149Z" }, + { url = "https://files.pythonhosted.org/packages/34/73/1d019c48f413465eb5d3b6898b6279e87141c80049f7dbf73fd020138549/coverage-7.9.2-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:28dc1f67e83a14e7079b6cea4d314bc8b24d1aed42d3582ff89c0295f09b181e", size = 239405, upload-time = "2025-07-03T10:52:49.687Z" }, + { url = "https://files.pythonhosted.org/packages/49/6c/a2beca7aa2595dad0c0d3f350382c381c92400efe5261e2631f734a0e3fe/coverage-7.9.2-cp310-cp310-win32.whl", hash = "sha256:bf7d773da6af9e10dbddacbf4e5cab13d06d0ed93561d44dae0188a42c65be7e", size = 214519, upload-time = "2025-07-03T10:52:51.036Z" }, + { url = 
"https://files.pythonhosted.org/packages/fc/c8/91e5e4a21f9a51e2c7cdd86e587ae01a4fcff06fc3fa8cde4d6f7cf68df4/coverage-7.9.2-cp310-cp310-win_amd64.whl", hash = "sha256:0c0378ba787681ab1897f7c89b415bd56b0b2d9a47e5a3d8dc0ea55aac118d6c", size = 215400, upload-time = "2025-07-03T10:52:52.313Z" }, + { url = "https://files.pythonhosted.org/packages/39/40/916786453bcfafa4c788abee4ccd6f592b5b5eca0cd61a32a4e5a7ef6e02/coverage-7.9.2-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:a7a56a2964a9687b6aba5b5ced6971af308ef6f79a91043c05dd4ee3ebc3e9ba", size = 212152, upload-time = "2025-07-03T10:52:53.562Z" }, + { url = "https://files.pythonhosted.org/packages/9f/66/cc13bae303284b546a030762957322bbbff1ee6b6cb8dc70a40f8a78512f/coverage-7.9.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:123d589f32c11d9be7fe2e66d823a236fe759b0096f5db3fb1b75b2fa414a4fa", size = 212540, upload-time = "2025-07-03T10:52:55.196Z" }, + { url = "https://files.pythonhosted.org/packages/0f/3c/d56a764b2e5a3d43257c36af4a62c379df44636817bb5f89265de4bf8bd7/coverage-7.9.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:333b2e0ca576a7dbd66e85ab402e35c03b0b22f525eed82681c4b866e2e2653a", size = 245097, upload-time = "2025-07-03T10:52:56.509Z" }, + { url = "https://files.pythonhosted.org/packages/b1/46/bd064ea8b3c94eb4ca5d90e34d15b806cba091ffb2b8e89a0d7066c45791/coverage-7.9.2-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:326802760da234baf9f2f85a39e4a4b5861b94f6c8d95251f699e4f73b1835dc", size = 242812, upload-time = "2025-07-03T10:52:57.842Z" }, + { url = "https://files.pythonhosted.org/packages/43/02/d91992c2b29bc7afb729463bc918ebe5f361be7f1daae93375a5759d1e28/coverage-7.9.2-cp311-cp311-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:19e7be4cfec248df38ce40968c95d3952fbffd57b400d4b9bb580f28179556d2", size = 244617, upload-time = "2025-07-03T10:52:59.239Z" }, + { url = 
"https://files.pythonhosted.org/packages/b7/4f/8fadff6bf56595a16d2d6e33415841b0163ac660873ed9a4e9046194f779/coverage-7.9.2-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:0b4a4cb73b9f2b891c1788711408ef9707666501ba23684387277ededab1097c", size = 244263, upload-time = "2025-07-03T10:53:00.601Z" }, + { url = "https://files.pythonhosted.org/packages/9b/d2/e0be7446a2bba11739edb9f9ba4eff30b30d8257370e237418eb44a14d11/coverage-7.9.2-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:2c8937fa16c8c9fbbd9f118588756e7bcdc7e16a470766a9aef912dd3f117dbd", size = 242314, upload-time = "2025-07-03T10:53:01.932Z" }, + { url = "https://files.pythonhosted.org/packages/9d/7d/dcbac9345000121b8b57a3094c2dfcf1ccc52d8a14a40c1d4bc89f936f80/coverage-7.9.2-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:42da2280c4d30c57a9b578bafd1d4494fa6c056d4c419d9689e66d775539be74", size = 242904, upload-time = "2025-07-03T10:53:03.478Z" }, + { url = "https://files.pythonhosted.org/packages/41/58/11e8db0a0c0510cf31bbbdc8caf5d74a358b696302a45948d7c768dfd1cf/coverage-7.9.2-cp311-cp311-win32.whl", hash = "sha256:14fa8d3da147f5fdf9d298cacc18791818f3f1a9f542c8958b80c228320e90c6", size = 214553, upload-time = "2025-07-03T10:53:05.174Z" }, + { url = "https://files.pythonhosted.org/packages/3a/7d/751794ec8907a15e257136e48dc1021b1f671220ecccfd6c4eaf30802714/coverage-7.9.2-cp311-cp311-win_amd64.whl", hash = "sha256:549cab4892fc82004f9739963163fd3aac7a7b0df430669b75b86d293d2df2a7", size = 215441, upload-time = "2025-07-03T10:53:06.472Z" }, + { url = "https://files.pythonhosted.org/packages/62/5b/34abcedf7b946c1c9e15b44f326cb5b0da852885312b30e916f674913428/coverage-7.9.2-cp311-cp311-win_arm64.whl", hash = "sha256:c2667a2b913e307f06aa4e5677f01a9746cd08e4b35e14ebcde6420a9ebb4c62", size = 213873, upload-time = "2025-07-03T10:53:07.699Z" }, + { url = 
"https://files.pythonhosted.org/packages/53/d7/7deefc6fd4f0f1d4c58051f4004e366afc9e7ab60217ac393f247a1de70a/coverage-7.9.2-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:ae9eb07f1cfacd9cfe8eaee6f4ff4b8a289a668c39c165cd0c8548484920ffc0", size = 212344, upload-time = "2025-07-03T10:53:09.3Z" }, + { url = "https://files.pythonhosted.org/packages/95/0c/ee03c95d32be4d519e6a02e601267769ce2e9a91fc8faa1b540e3626c680/coverage-7.9.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:9ce85551f9a1119f02adc46d3014b5ee3f765deac166acf20dbb851ceb79b6f3", size = 212580, upload-time = "2025-07-03T10:53:11.52Z" }, + { url = "https://files.pythonhosted.org/packages/8b/9f/826fa4b544b27620086211b87a52ca67592622e1f3af9e0a62c87aea153a/coverage-7.9.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f8f6389ac977c5fb322e0e38885fbbf901743f79d47f50db706e7644dcdcb6e1", size = 246383, upload-time = "2025-07-03T10:53:13.134Z" }, + { url = "https://files.pythonhosted.org/packages/7f/b3/4477aafe2a546427b58b9c540665feff874f4db651f4d3cb21b308b3a6d2/coverage-7.9.2-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:ff0d9eae8cdfcd58fe7893b88993723583a6ce4dfbfd9f29e001922544f95615", size = 243400, upload-time = "2025-07-03T10:53:14.614Z" }, + { url = "https://files.pythonhosted.org/packages/f8/c2/efffa43778490c226d9d434827702f2dfbc8041d79101a795f11cbb2cf1e/coverage-7.9.2-cp312-cp312-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:fae939811e14e53ed8a9818dad51d434a41ee09df9305663735f2e2d2d7d959b", size = 245591, upload-time = "2025-07-03T10:53:15.872Z" }, + { url = "https://files.pythonhosted.org/packages/c6/e7/a59888e882c9a5f0192d8627a30ae57910d5d449c80229b55e7643c078c4/coverage-7.9.2-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:31991156251ec202c798501e0a42bbdf2169dcb0f137b1f5c0f4267f3fc68ef9", size = 245402, upload-time = "2025-07-03T10:53:17.124Z" }, 
+ { url = "https://files.pythonhosted.org/packages/92/a5/72fcd653ae3d214927edc100ce67440ed8a0a1e3576b8d5e6d066ed239db/coverage-7.9.2-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:d0d67963f9cbfc7c7f96d4ac74ed60ecbebd2ea6eeb51887af0f8dce205e545f", size = 243583, upload-time = "2025-07-03T10:53:18.781Z" }, + { url = "https://files.pythonhosted.org/packages/5c/f5/84e70e4df28f4a131d580d7d510aa1ffd95037293da66fd20d446090a13b/coverage-7.9.2-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:49b752a2858b10580969ec6af6f090a9a440a64a301ac1528d7ca5f7ed497f4d", size = 244815, upload-time = "2025-07-03T10:53:20.168Z" }, + { url = "https://files.pythonhosted.org/packages/39/e7/d73d7cbdbd09fdcf4642655ae843ad403d9cbda55d725721965f3580a314/coverage-7.9.2-cp312-cp312-win32.whl", hash = "sha256:88d7598b8ee130f32f8a43198ee02edd16d7f77692fa056cb779616bbea1b355", size = 214719, upload-time = "2025-07-03T10:53:21.521Z" }, + { url = "https://files.pythonhosted.org/packages/9f/d6/7486dcc3474e2e6ad26a2af2db7e7c162ccd889c4c68fa14ea8ec189c9e9/coverage-7.9.2-cp312-cp312-win_amd64.whl", hash = "sha256:9dfb070f830739ee49d7c83e4941cc767e503e4394fdecb3b54bfdac1d7662c0", size = 215509, upload-time = "2025-07-03T10:53:22.853Z" }, + { url = "https://files.pythonhosted.org/packages/b7/34/0439f1ae2593b0346164d907cdf96a529b40b7721a45fdcf8b03c95fcd90/coverage-7.9.2-cp312-cp312-win_arm64.whl", hash = "sha256:4e2c058aef613e79df00e86b6d42a641c877211384ce5bd07585ed7ba71ab31b", size = 213910, upload-time = "2025-07-03T10:53:24.472Z" }, + { url = "https://files.pythonhosted.org/packages/94/9d/7a8edf7acbcaa5e5c489a646226bed9591ee1c5e6a84733c0140e9ce1ae1/coverage-7.9.2-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:985abe7f242e0d7bba228ab01070fde1d6c8fa12f142e43debe9ed1dde686038", size = 212367, upload-time = "2025-07-03T10:53:25.811Z" }, + { url = 
"https://files.pythonhosted.org/packages/e8/9e/5cd6f130150712301f7e40fb5865c1bc27b97689ec57297e568d972eec3c/coverage-7.9.2-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:82c3939264a76d44fde7f213924021ed31f55ef28111a19649fec90c0f109e6d", size = 212632, upload-time = "2025-07-03T10:53:27.075Z" }, + { url = "https://files.pythonhosted.org/packages/a8/de/6287a2c2036f9fd991c61cefa8c64e57390e30c894ad3aa52fac4c1e14a8/coverage-7.9.2-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ae5d563e970dbe04382f736ec214ef48103d1b875967c89d83c6e3f21706d5b3", size = 245793, upload-time = "2025-07-03T10:53:28.408Z" }, + { url = "https://files.pythonhosted.org/packages/06/cc/9b5a9961d8160e3cb0b558c71f8051fe08aa2dd4b502ee937225da564ed1/coverage-7.9.2-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:bdd612e59baed2a93c8843c9a7cb902260f181370f1d772f4842987535071d14", size = 243006, upload-time = "2025-07-03T10:53:29.754Z" }, + { url = "https://files.pythonhosted.org/packages/49/d9/4616b787d9f597d6443f5588619c1c9f659e1f5fc9eebf63699eb6d34b78/coverage-7.9.2-cp313-cp313-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:256ea87cb2a1ed992bcdfc349d8042dcea1b80436f4ddf6e246d6bee4b5d73b6", size = 244990, upload-time = "2025-07-03T10:53:31.098Z" }, + { url = "https://files.pythonhosted.org/packages/48/83/801cdc10f137b2d02b005a761661649ffa60eb173dcdaeb77f571e4dc192/coverage-7.9.2-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:f44ae036b63c8ea432f610534a2668b0c3aee810e7037ab9d8ff6883de480f5b", size = 245157, upload-time = "2025-07-03T10:53:32.717Z" }, + { url = "https://files.pythonhosted.org/packages/c8/a4/41911ed7e9d3ceb0ffb019e7635468df7499f5cc3edca5f7dfc078e9c5ec/coverage-7.9.2-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:82d76ad87c932935417a19b10cfe7abb15fd3f923cfe47dbdaa74ef4e503752d", size = 243128, upload-time = "2025-07-03T10:53:34.009Z" 
}, + { url = "https://files.pythonhosted.org/packages/10/41/344543b71d31ac9cb00a664d5d0c9ef134a0fe87cb7d8430003b20fa0b7d/coverage-7.9.2-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:619317bb86de4193debc712b9e59d5cffd91dc1d178627ab2a77b9870deb2868", size = 244511, upload-time = "2025-07-03T10:53:35.434Z" }, + { url = "https://files.pythonhosted.org/packages/d5/81/3b68c77e4812105e2a060f6946ba9e6f898ddcdc0d2bfc8b4b152a9ae522/coverage-7.9.2-cp313-cp313-win32.whl", hash = "sha256:0a07757de9feb1dfafd16ab651e0f628fd7ce551604d1bf23e47e1ddca93f08a", size = 214765, upload-time = "2025-07-03T10:53:36.787Z" }, + { url = "https://files.pythonhosted.org/packages/06/a2/7fac400f6a346bb1a4004eb2a76fbff0e242cd48926a2ce37a22a6a1d917/coverage-7.9.2-cp313-cp313-win_amd64.whl", hash = "sha256:115db3d1f4d3f35f5bb021e270edd85011934ff97c8797216b62f461dd69374b", size = 215536, upload-time = "2025-07-03T10:53:38.188Z" }, + { url = "https://files.pythonhosted.org/packages/08/47/2c6c215452b4f90d87017e61ea0fd9e0486bb734cb515e3de56e2c32075f/coverage-7.9.2-cp313-cp313-win_arm64.whl", hash = "sha256:48f82f889c80af8b2a7bb6e158d95a3fbec6a3453a1004d04e4f3b5945a02694", size = 213943, upload-time = "2025-07-03T10:53:39.492Z" }, + { url = "https://files.pythonhosted.org/packages/a3/46/e211e942b22d6af5e0f323faa8a9bc7c447a1cf1923b64c47523f36ed488/coverage-7.9.2-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:55a28954545f9d2f96870b40f6c3386a59ba8ed50caf2d949676dac3ecab99f5", size = 213088, upload-time = "2025-07-03T10:53:40.874Z" }, + { url = "https://files.pythonhosted.org/packages/d2/2f/762551f97e124442eccd907bf8b0de54348635b8866a73567eb4e6417acf/coverage-7.9.2-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:cdef6504637731a63c133bb2e6f0f0214e2748495ec15fe42d1e219d1b133f0b", size = 213298, upload-time = "2025-07-03T10:53:42.218Z" }, + { url = 
"https://files.pythonhosted.org/packages/7a/b7/76d2d132b7baf7360ed69be0bcab968f151fa31abe6d067f0384439d9edb/coverage-7.9.2-cp313-cp313t-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:bcd5ebe66c7a97273d5d2ddd4ad0ed2e706b39630ed4b53e713d360626c3dbb3", size = 256541, upload-time = "2025-07-03T10:53:43.823Z" }, + { url = "https://files.pythonhosted.org/packages/a0/17/392b219837d7ad47d8e5974ce5f8dc3deb9f99a53b3bd4d123602f960c81/coverage-7.9.2-cp313-cp313t-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:9303aed20872d7a3c9cb39c5d2b9bdbe44e3a9a1aecb52920f7e7495410dfab8", size = 252761, upload-time = "2025-07-03T10:53:45.19Z" }, + { url = "https://files.pythonhosted.org/packages/d5/77/4256d3577fe1b0daa8d3836a1ebe68eaa07dd2cbaf20cf5ab1115d6949d4/coverage-7.9.2-cp313-cp313t-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bc18ea9e417a04d1920a9a76fe9ebd2f43ca505b81994598482f938d5c315f46", size = 254917, upload-time = "2025-07-03T10:53:46.931Z" }, + { url = "https://files.pythonhosted.org/packages/53/99/fc1a008eef1805e1ddb123cf17af864743354479ea5129a8f838c433cc2c/coverage-7.9.2-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:6406cff19880aaaadc932152242523e892faff224da29e241ce2fca329866584", size = 256147, upload-time = "2025-07-03T10:53:48.289Z" }, + { url = "https://files.pythonhosted.org/packages/92/c0/f63bf667e18b7f88c2bdb3160870e277c4874ced87e21426128d70aa741f/coverage-7.9.2-cp313-cp313t-musllinux_1_2_i686.whl", hash = "sha256:2d0d4f6ecdf37fcc19c88fec3e2277d5dee740fb51ffdd69b9579b8c31e4232e", size = 254261, upload-time = "2025-07-03T10:53:49.99Z" }, + { url = "https://files.pythonhosted.org/packages/8c/32/37dd1c42ce3016ff8ec9e4b607650d2e34845c0585d3518b2a93b4830c1a/coverage-7.9.2-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:c33624f50cf8de418ab2b4d6ca9eda96dc45b2c4231336bac91454520e8d1fac", size = 255099, upload-time = 
"2025-07-03T10:53:51.354Z" }, + { url = "https://files.pythonhosted.org/packages/da/2e/af6b86f7c95441ce82f035b3affe1cd147f727bbd92f563be35e2d585683/coverage-7.9.2-cp313-cp313t-win32.whl", hash = "sha256:1df6b76e737c6a92210eebcb2390af59a141f9e9430210595251fbaf02d46926", size = 215440, upload-time = "2025-07-03T10:53:52.808Z" }, + { url = "https://files.pythonhosted.org/packages/4d/bb/8a785d91b308867f6b2e36e41c569b367c00b70c17f54b13ac29bcd2d8c8/coverage-7.9.2-cp313-cp313t-win_amd64.whl", hash = "sha256:f5fd54310b92741ebe00d9c0d1d7b2b27463952c022da6d47c175d246a98d1bd", size = 216537, upload-time = "2025-07-03T10:53:54.273Z" }, + { url = "https://files.pythonhosted.org/packages/1d/a0/a6bffb5e0f41a47279fd45a8f3155bf193f77990ae1c30f9c224b61cacb0/coverage-7.9.2-cp313-cp313t-win_arm64.whl", hash = "sha256:c48c2375287108c887ee87d13b4070a381c6537d30e8487b24ec721bf2a781cb", size = 214398, upload-time = "2025-07-03T10:53:56.715Z" }, + { url = "https://files.pythonhosted.org/packages/d7/85/f8bbefac27d286386961c25515431482a425967e23d3698b75a250872924/coverage-7.9.2-pp39.pp310.pp311-none-any.whl", hash = "sha256:8a1166db2fb62473285bcb092f586e081e92656c7dfa8e9f62b4d39d7e6b5050", size = 204013, upload-time = "2025-07-03T10:54:12.084Z" }, + { url = "https://files.pythonhosted.org/packages/3c/38/bbe2e63902847cf79036ecc75550d0698af31c91c7575352eb25190d0fb3/coverage-7.9.2-py3-none-any.whl", hash = "sha256:e425cd5b00f6fc0ed7cdbd766c70be8baab4b7839e4d4fe5fac48581dd968ea4", size = 204005, upload-time = "2025-07-03T10:54:13.491Z" }, +] + +[package.optional-dependencies] +toml = [ + { name = "tomli", marker = "python_full_version <= '3.11'" }, +] + +[[package]] +name = "decli" +version = "0.6.3" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/0c/59/d4ffff1dee2c8f6f2dd8f87010962e60f7b7847504d765c91ede5a466730/decli-0.6.3.tar.gz", hash = "sha256:87f9d39361adf7f16b9ca6e3b614badf7519da13092f2db3c80ca223c53c7656", size = 7564, 
upload-time = "2025-06-01T15:23:41.25Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/d8/fa/ec878c28bc7f65b77e7e17af3522c9948a9711b9fa7fc4c5e3140a7e3578/decli-0.6.3-py3-none-any.whl", hash = "sha256:5152347c7bb8e3114ad65db719e5709b28d7f7f45bdb709f70167925e55640f3", size = 7989, upload-time = "2025-06-01T15:23:40.228Z" }, +] + +[[package]] +name = "distlib" +version = "0.4.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/96/8e/709914eb2b5749865801041647dc7f4e6d00b549cfe88b65ca192995f07c/distlib-0.4.0.tar.gz", hash = "sha256:feec40075be03a04501a973d81f633735b4b69f98b05450592310c0f401a4e0d", size = 614605, upload-time = "2025-07-17T16:52:00.465Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/33/6b/e0547afaf41bf2c42e52430072fa5658766e3d65bd4b03a563d1b6336f57/distlib-0.4.0-py2.py3-none-any.whl", hash = "sha256:9659f7d87e46584a30b5780e43ac7a2143098441670ff0a49d5f9034c54a6c16", size = 469047, upload-time = "2025-07-17T16:51:58.613Z" }, +] + +[[package]] +name = "exceptiongroup" +version = "1.3.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "typing-extensions", marker = "python_full_version < '3.13'" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/0b/9f/a65090624ecf468cdca03533906e7c69ed7588582240cfe7cc9e770b50eb/exceptiongroup-1.3.0.tar.gz", hash = "sha256:b241f5885f560bc56a59ee63ca4c6a8bfa46ae4ad651af316d4e81817bb9fd88", size = 29749, upload-time = "2025-05-10T17:42:51.123Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/36/f4/c6e662dade71f56cd2f3735141b265c3c79293c109549c1e6933b0651ffc/exceptiongroup-1.3.0-py3-none-any.whl", hash = "sha256:4d111e6e0c13d0644cad6ddaa7ed0261a0b36971f6d23e7ec9b4b9097da78a10", size = 16674, upload-time = "2025-05-10T17:42:49.33Z" }, +] + +[[package]] +name = "filelock" +version = "3.18.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = 
"https://files.pythonhosted.org/packages/0a/10/c23352565a6544bdc5353e0b15fc1c563352101f30e24bf500207a54df9a/filelock-3.18.0.tar.gz", hash = "sha256:adbc88eabb99d2fec8c9c1b229b171f18afa655400173ddc653d5d01501fb9f2", size = 18075, upload-time = "2025-03-14T07:11:40.47Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/4d/36/2a115987e2d8c300a974597416d9de88f2444426de9571f4b59b2cca3acc/filelock-3.18.0-py3-none-any.whl", hash = "sha256:c401f4f8377c4464e6db25fff06205fd89bdd83b65eb0488ed1b160f780e21de", size = 16215, upload-time = "2025-03-14T07:11:39.145Z" }, +] + +[[package]] +name = "h11" +version = "0.16.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/01/ee/02a2c011bdab74c6fb3c75474d40b3052059d95df7e73351460c8588d963/h11-0.16.0.tar.gz", hash = "sha256:4e35b956cf45792e4caa5885e69fba00bdbc6ffafbfa020300e549b208ee5ff1", size = 101250, upload-time = "2025-04-24T03:35:25.427Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/04/4b/29cac41a4d98d144bf5f6d33995617b185d14b22401f75ca86f384e87ff1/h11-0.16.0-py3-none-any.whl", hash = "sha256:63cf8bbe7522de3bf65932fda1d9c2772064ffb3dae62d55932da54b31cb6c86", size = 37515, upload-time = "2025-04-24T03:35:24.344Z" }, +] + +[[package]] +name = "httpcore" +version = "1.0.9" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "certifi" }, + { name = "h11" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/06/94/82699a10bca87a5556c9c59b5963f2d039dbd239f25bc2a63907a05a14cb/httpcore-1.0.9.tar.gz", hash = "sha256:6e34463af53fd2ab5d807f399a9b45ea31c3dfa2276f15a2c3f00afff6e176e8", size = 85484, upload-time = "2025-04-24T22:06:22.219Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/7e/f5/f66802a942d491edb555dd61e3a9961140fd64c90bce1eafd741609d334d/httpcore-1.0.9-py3-none-any.whl", hash = "sha256:2d400746a40668fc9dec9810239072b40b4484b640a8c38fd654a024c7a1bf55", size = 78784, upload-time = 
"2025-04-24T22:06:20.566Z" }, +] + +[[package]] +name = "httpx" +version = "0.28.1" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "anyio" }, + { name = "certifi" }, + { name = "httpcore" }, + { name = "idna" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/b1/df/48c586a5fe32a0f01324ee087459e112ebb7224f646c0b5023f5e79e9956/httpx-0.28.1.tar.gz", hash = "sha256:75e98c5f16b0f35b567856f597f06ff2270a374470a5c2392242528e3e3e42fc", size = 141406, upload-time = "2024-12-06T15:37:23.222Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/2a/39/e50c7c3a983047577ee07d2a9e53faf5a69493943ec3f6a384bdc792deb2/httpx-0.28.1-py3-none-any.whl", hash = "sha256:d909fcccc110f8c7faf814ca82a9a4d816bc5a6dbfea25d6591d6985b8ba59ad", size = 73517, upload-time = "2024-12-06T15:37:21.509Z" }, +] + +[[package]] +name = "httpx-sse" +version = "0.4.1" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/6e/fa/66bd985dd0b7c109a3bcb89272ee0bfb7e2b4d06309ad7b38ff866734b2a/httpx_sse-0.4.1.tar.gz", hash = "sha256:8f44d34414bc7b21bf3602713005c5df4917884f76072479b21f68befa4ea26e", size = 12998, upload-time = "2025-06-24T13:21:05.71Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/25/0a/6269e3473b09aed2dab8aa1a600c70f31f00ae1349bee30658f7e358a159/httpx_sse-0.4.1-py3-none-any.whl", hash = "sha256:cba42174344c3a5b06f255ce65b350880f962d99ead85e776f23c6618a377a37", size = 8054, upload-time = "2025-06-24T13:21:04.772Z" }, +] + +[[package]] +name = "identify" +version = "2.6.12" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/a2/88/d193a27416618628a5eea64e3223acd800b40749a96ffb322a9b55a49ed1/identify-2.6.12.tar.gz", hash = "sha256:d8de45749f1efb108badef65ee8386f0f7bb19a7f26185f74de6367bffbaf0e6", size = 99254, upload-time = "2025-05-23T20:37:53.3Z" } +wheels = [ + { url = 
"https://files.pythonhosted.org/packages/7a/cd/18f8da995b658420625f7ef13f037be53ae04ec5ad33f9b718240dcfd48c/identify-2.6.12-py2.py3-none-any.whl", hash = "sha256:ad9672d5a72e0d2ff7c5c8809b62dfa60458626352fb0eb7b55e69bdc45334a2", size = 99145, upload-time = "2025-05-23T20:37:51.495Z" }, +] + +[[package]] +name = "idna" +version = "3.10" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/f1/70/7703c29685631f5a7590aa73f1f1d3fa9a380e654b86af429e0934a32f7d/idna-3.10.tar.gz", hash = "sha256:12f65c9b470abda6dc35cf8e63cc574b1c52b11df2c86030af0ac09b01b13ea9", size = 190490, upload-time = "2024-09-15T18:07:39.745Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/76/c6/c88e154df9c4e1a2a66ccf0005a88dfb2650c1dffb6f5ce603dfbd452ce3/idna-3.10-py3-none-any.whl", hash = "sha256:946d195a0d259cbba61165e88e65941f16e9b36ea6ddb97f00452bae8b1287d3", size = 70442, upload-time = "2024-09-15T18:07:37.964Z" }, +] + +[[package]] +name = "importlib-metadata" +version = "8.7.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "zipp" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/76/66/650a33bd90f786193e4de4b3ad86ea60b53c89b669a5c7be931fac31cdb0/importlib_metadata-8.7.0.tar.gz", hash = "sha256:d13b81ad223b890aa16c5471f2ac3056cf76c5f10f82d6f9292f0b415f389000", size = 56641, upload-time = "2025-04-27T15:29:01.736Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/20/b0/36bd937216ec521246249be3bf9855081de4c5e06a0c9b4219dbeda50373/importlib_metadata-8.7.0-py3-none-any.whl", hash = "sha256:e5dd1551894c77868a30651cef00984d50e1002d06942a7101d34870c5f02afd", size = 27656, upload-time = "2025-04-27T15:29:00.214Z" }, +] + +[[package]] +name = "iniconfig" +version = "2.1.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = 
"https://files.pythonhosted.org/packages/f2/97/ebf4da567aa6827c909642694d71c9fcf53e5b504f2d96afea02718862f3/iniconfig-2.1.0.tar.gz", hash = "sha256:3abbd2e30b36733fee78f9c7f7308f2d0050e88f0087fd25c2645f63c773e1c7", size = 4793, upload-time = "2025-03-19T20:09:59.721Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/2c/e1/e6716421ea10d38022b952c159d5161ca1193197fb744506875fbb87ea7b/iniconfig-2.1.0-py3-none-any.whl", hash = "sha256:9deba5723312380e77435581c6bf4935c94cbfab9b1ed33ef8d238ea168eb760", size = 6050, upload-time = "2025-03-19T20:10:01.071Z" }, +] + +[[package]] +name = "jinja2" +version = "3.1.6" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "markupsafe" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/df/bf/f7da0350254c0ed7c72f3e33cef02e048281fec7ecec5f032d4aac52226b/jinja2-3.1.6.tar.gz", hash = "sha256:0137fb05990d35f1275a587e9aee6d56da821fc83491a0fb838183be43f66d6d", size = 245115, upload-time = "2025-03-05T20:05:02.478Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/62/a1/3d680cbfd5f4b8f15abc1d571870c5fc3e594bb582bc3b64ea099db13e56/jinja2-3.1.6-py3-none-any.whl", hash = "sha256:85ece4451f492d0c13c5dd7c13a64681a86afae63a5f347908daf103ce6d2f67", size = 134899, upload-time = "2025-03-05T20:05:00.369Z" }, +] + +[[package]] +name = "jmespath" +version = "1.0.1" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/00/2a/e867e8531cf3e36b41201936b7fa7ba7b5702dbef42922193f05c8976cd6/jmespath-1.0.1.tar.gz", hash = "sha256:90261b206d6defd58fdd5e85f478bf633a2901798906be2ad389150c5c60edbe", size = 25843, upload-time = "2022-06-17T18:00:12.224Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/31/b4/b9b800c45527aadd64d5b442f9b932b00648617eb5d63d2c7a6587b7cafc/jmespath-1.0.1-py3-none-any.whl", hash = "sha256:02e2e4cc71b5bcab88332eebf907519190dd9e6e82107fa7f83b1003a6252980", size = 20256, upload-time = 
"2022-06-17T18:00:10.251Z" }, +] + +[[package]] +name = "jsonschema" +version = "4.25.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "attrs" }, + { name = "jsonschema-specifications" }, + { name = "referencing" }, + { name = "rpds-py" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/d5/00/a297a868e9d0784450faa7365c2172a7d6110c763e30ba861867c32ae6a9/jsonschema-4.25.0.tar.gz", hash = "sha256:e63acf5c11762c0e6672ffb61482bdf57f0876684d8d249c0fe2d730d48bc55f", size = 356830, upload-time = "2025-07-18T15:39:45.11Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/fe/54/c86cd8e011fe98803d7e382fd67c0df5ceab8d2b7ad8c5a81524f791551c/jsonschema-4.25.0-py3-none-any.whl", hash = "sha256:24c2e8da302de79c8b9382fee3e76b355e44d2a4364bb207159ce10b517bd716", size = 89184, upload-time = "2025-07-18T15:39:42.956Z" }, +] + +[[package]] +name = "jsonschema-specifications" +version = "2025.4.1" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "referencing" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/bf/ce/46fbd9c8119cfc3581ee5643ea49464d168028cfb5caff5fc0596d0cf914/jsonschema_specifications-2025.4.1.tar.gz", hash = "sha256:630159c9f4dbea161a6a2205c3011cc4f18ff381b189fff48bb39b9bf26ae608", size = 15513, upload-time = "2025-04-23T12:34:07.418Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/01/0e/b27cdbaccf30b890c40ed1da9fd4a3593a5cf94dae54fb34f8a4b74fcd3f/jsonschema_specifications-2025.4.1-py3-none-any.whl", hash = "sha256:4653bffbd6584f7de83a67e0d620ef16900b390ddc7939d56684d6c81e33f1af", size = 18437, upload-time = "2025-04-23T12:34:05.422Z" }, +] + +[[package]] +name = "loguru" +version = "0.7.3" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "colorama", marker = "sys_platform == 'win32'" }, + { name = "win32-setctime", marker = "sys_platform == 'win32'" }, +] +sdist = { url = 
"https://files.pythonhosted.org/packages/3a/05/a1dae3dffd1116099471c643b8924f5aa6524411dc6c63fdae648c4f1aca/loguru-0.7.3.tar.gz", hash = "sha256:19480589e77d47b8d85b2c827ad95d49bf31b0dcde16593892eb51dd18706eb6", size = 63559, upload-time = "2024-12-06T11:20:56.608Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/0c/29/0348de65b8cc732daa3e33e67806420b2ae89bdce2b04af740289c5c6c8c/loguru-0.7.3-py3-none-any.whl", hash = "sha256:31a33c10c8e1e10422bfd431aeb5d351c7cf7fa671e3c4df004162264b28220c", size = 61595, upload-time = "2024-12-06T11:20:54.538Z" }, +] + +[[package]] +name = "markdown-it-py" +version = "3.0.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "mdurl" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/38/71/3b932df36c1a044d397a1f92d1cf91ee0a503d91e470cbd670aa66b07ed0/markdown-it-py-3.0.0.tar.gz", hash = "sha256:e3f60a94fa066dc52ec76661e37c851cb232d92f9886b15cb560aaada2df8feb", size = 74596, upload-time = "2023-06-03T06:41:14.443Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/42/d7/1ec15b46af6af88f19b8e5ffea08fa375d433c998b8a7639e76935c14f1f/markdown_it_py-3.0.0-py3-none-any.whl", hash = "sha256:355216845c60bd96232cd8d8c40e8f9765cc86f46880e43a8fd22dc1a1a8cab1", size = 87528, upload-time = "2023-06-03T06:41:11.019Z" }, +] + +[[package]] +name = "markupsafe" +version = "3.0.2" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/b2/97/5d42485e71dfc078108a86d6de8fa46db44a1a9295e89c5d6d4a06e23a62/markupsafe-3.0.2.tar.gz", hash = "sha256:ee55d3edf80167e48ea11a923c7386f4669df67d7994554387f84e7d8b0a2bf0", size = 20537, upload-time = "2024-10-18T15:21:54.129Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/04/90/d08277ce111dd22f77149fd1a5d4653eeb3b3eaacbdfcbae5afb2600eebd/MarkupSafe-3.0.2-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:7e94c425039cde14257288fd61dcfb01963e658efbc0ff54f5306b06054700f8", 
size = 14357, upload-time = "2024-10-18T15:20:51.44Z" }, + { url = "https://files.pythonhosted.org/packages/04/e1/6e2194baeae0bca1fae6629dc0cbbb968d4d941469cbab11a3872edff374/MarkupSafe-3.0.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:9e2d922824181480953426608b81967de705c3cef4d1af983af849d7bd619158", size = 12393, upload-time = "2024-10-18T15:20:52.426Z" }, + { url = "https://files.pythonhosted.org/packages/1d/69/35fa85a8ece0a437493dc61ce0bb6d459dcba482c34197e3efc829aa357f/MarkupSafe-3.0.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:38a9ef736c01fccdd6600705b09dc574584b89bea478200c5fbf112a6b0d5579", size = 21732, upload-time = "2024-10-18T15:20:53.578Z" }, + { url = "https://files.pythonhosted.org/packages/22/35/137da042dfb4720b638d2937c38a9c2df83fe32d20e8c8f3185dbfef05f7/MarkupSafe-3.0.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bbcb445fa71794da8f178f0f6d66789a28d7319071af7a496d4d507ed566270d", size = 20866, upload-time = "2024-10-18T15:20:55.06Z" }, + { url = "https://files.pythonhosted.org/packages/29/28/6d029a903727a1b62edb51863232152fd335d602def598dade38996887f0/MarkupSafe-3.0.2-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:57cb5a3cf367aeb1d316576250f65edec5bb3be939e9247ae594b4bcbc317dfb", size = 20964, upload-time = "2024-10-18T15:20:55.906Z" }, + { url = "https://files.pythonhosted.org/packages/cc/cd/07438f95f83e8bc028279909d9c9bd39e24149b0d60053a97b2bc4f8aa51/MarkupSafe-3.0.2-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:3809ede931876f5b2ec92eef964286840ed3540dadf803dd570c3b7e13141a3b", size = 21977, upload-time = "2024-10-18T15:20:57.189Z" }, + { url = "https://files.pythonhosted.org/packages/29/01/84b57395b4cc062f9c4c55ce0df7d3108ca32397299d9df00fedd9117d3d/MarkupSafe-3.0.2-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:e07c3764494e3776c602c1e78e298937c3315ccc9043ead7e685b7f2b8d47b3c", size = 21366, 
upload-time = "2024-10-18T15:20:58.235Z" }, + { url = "https://files.pythonhosted.org/packages/bd/6e/61ebf08d8940553afff20d1fb1ba7294b6f8d279df9fd0c0db911b4bbcfd/MarkupSafe-3.0.2-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:b424c77b206d63d500bcb69fa55ed8d0e6a3774056bdc4839fc9298a7edca171", size = 21091, upload-time = "2024-10-18T15:20:59.235Z" }, + { url = "https://files.pythonhosted.org/packages/11/23/ffbf53694e8c94ebd1e7e491de185124277964344733c45481f32ede2499/MarkupSafe-3.0.2-cp310-cp310-win32.whl", hash = "sha256:fcabf5ff6eea076f859677f5f0b6b5c1a51e70a376b0579e0eadef8db48c6b50", size = 15065, upload-time = "2024-10-18T15:21:00.307Z" }, + { url = "https://files.pythonhosted.org/packages/44/06/e7175d06dd6e9172d4a69a72592cb3f7a996a9c396eee29082826449bbc3/MarkupSafe-3.0.2-cp310-cp310-win_amd64.whl", hash = "sha256:6af100e168aa82a50e186c82875a5893c5597a0c1ccdb0d8b40240b1f28b969a", size = 15514, upload-time = "2024-10-18T15:21:01.122Z" }, + { url = "https://files.pythonhosted.org/packages/6b/28/bbf83e3f76936960b850435576dd5e67034e200469571be53f69174a2dfd/MarkupSafe-3.0.2-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:9025b4018f3a1314059769c7bf15441064b2207cb3f065e6ea1e7359cb46db9d", size = 14353, upload-time = "2024-10-18T15:21:02.187Z" }, + { url = "https://files.pythonhosted.org/packages/6c/30/316d194b093cde57d448a4c3209f22e3046c5bb2fb0820b118292b334be7/MarkupSafe-3.0.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:93335ca3812df2f366e80509ae119189886b0f3c2b81325d39efdb84a1e2ae93", size = 12392, upload-time = "2024-10-18T15:21:02.941Z" }, + { url = "https://files.pythonhosted.org/packages/f2/96/9cdafba8445d3a53cae530aaf83c38ec64c4d5427d975c974084af5bc5d2/MarkupSafe-3.0.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:2cb8438c3cbb25e220c2ab33bb226559e7afb3baec11c4f218ffa7308603c832", size = 23984, upload-time = "2024-10-18T15:21:03.953Z" }, + { url = 
"https://files.pythonhosted.org/packages/f1/a4/aefb044a2cd8d7334c8a47d3fb2c9f328ac48cb349468cc31c20b539305f/MarkupSafe-3.0.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:a123e330ef0853c6e822384873bef7507557d8e4a082961e1defa947aa59ba84", size = 23120, upload-time = "2024-10-18T15:21:06.495Z" }, + { url = "https://files.pythonhosted.org/packages/8d/21/5e4851379f88f3fad1de30361db501300d4f07bcad047d3cb0449fc51f8c/MarkupSafe-3.0.2-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:1e084f686b92e5b83186b07e8a17fc09e38fff551f3602b249881fec658d3eca", size = 23032, upload-time = "2024-10-18T15:21:07.295Z" }, + { url = "https://files.pythonhosted.org/packages/00/7b/e92c64e079b2d0d7ddf69899c98842f3f9a60a1ae72657c89ce2655c999d/MarkupSafe-3.0.2-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:d8213e09c917a951de9d09ecee036d5c7d36cb6cb7dbaece4c71a60d79fb9798", size = 24057, upload-time = "2024-10-18T15:21:08.073Z" }, + { url = "https://files.pythonhosted.org/packages/f9/ac/46f960ca323037caa0a10662ef97d0a4728e890334fc156b9f9e52bcc4ca/MarkupSafe-3.0.2-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:5b02fb34468b6aaa40dfc198d813a641e3a63b98c2b05a16b9f80b7ec314185e", size = 23359, upload-time = "2024-10-18T15:21:09.318Z" }, + { url = "https://files.pythonhosted.org/packages/69/84/83439e16197337b8b14b6a5b9c2105fff81d42c2a7c5b58ac7b62ee2c3b1/MarkupSafe-3.0.2-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:0bff5e0ae4ef2e1ae4fdf2dfd5b76c75e5c2fa4132d05fc1b0dabcd20c7e28c4", size = 23306, upload-time = "2024-10-18T15:21:10.185Z" }, + { url = "https://files.pythonhosted.org/packages/9a/34/a15aa69f01e2181ed8d2b685c0d2f6655d5cca2c4db0ddea775e631918cd/MarkupSafe-3.0.2-cp311-cp311-win32.whl", hash = "sha256:6c89876f41da747c8d3677a2b540fb32ef5715f97b66eeb0c6b66f5e3ef6f59d", size = 15094, upload-time = "2024-10-18T15:21:11.005Z" }, + { url = 
"https://files.pythonhosted.org/packages/da/b8/3a3bd761922d416f3dc5d00bfbed11f66b1ab89a0c2b6e887240a30b0f6b/MarkupSafe-3.0.2-cp311-cp311-win_amd64.whl", hash = "sha256:70a87b411535ccad5ef2f1df5136506a10775d267e197e4cf531ced10537bd6b", size = 15521, upload-time = "2024-10-18T15:21:12.911Z" }, + { url = "https://files.pythonhosted.org/packages/22/09/d1f21434c97fc42f09d290cbb6350d44eb12f09cc62c9476effdb33a18aa/MarkupSafe-3.0.2-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:9778bd8ab0a994ebf6f84c2b949e65736d5575320a17ae8984a77fab08db94cf", size = 14274, upload-time = "2024-10-18T15:21:13.777Z" }, + { url = "https://files.pythonhosted.org/packages/6b/b0/18f76bba336fa5aecf79d45dcd6c806c280ec44538b3c13671d49099fdd0/MarkupSafe-3.0.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:846ade7b71e3536c4e56b386c2a47adf5741d2d8b94ec9dc3e92e5e1ee1e2225", size = 12348, upload-time = "2024-10-18T15:21:14.822Z" }, + { url = "https://files.pythonhosted.org/packages/e0/25/dd5c0f6ac1311e9b40f4af06c78efde0f3b5cbf02502f8ef9501294c425b/MarkupSafe-3.0.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1c99d261bd2d5f6b59325c92c73df481e05e57f19837bdca8413b9eac4bd8028", size = 24149, upload-time = "2024-10-18T15:21:15.642Z" }, + { url = "https://files.pythonhosted.org/packages/f3/f0/89e7aadfb3749d0f52234a0c8c7867877876e0a20b60e2188e9850794c17/MarkupSafe-3.0.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e17c96c14e19278594aa4841ec148115f9c7615a47382ecb6b82bd8fea3ab0c8", size = 23118, upload-time = "2024-10-18T15:21:17.133Z" }, + { url = "https://files.pythonhosted.org/packages/d5/da/f2eeb64c723f5e3777bc081da884b414671982008c47dcc1873d81f625b6/MarkupSafe-3.0.2-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:88416bd1e65dcea10bc7569faacb2c20ce071dd1f87539ca2ab364bf6231393c", size = 22993, upload-time = "2024-10-18T15:21:18.064Z" }, + { url = 
"https://files.pythonhosted.org/packages/da/0e/1f32af846df486dce7c227fe0f2398dc7e2e51d4a370508281f3c1c5cddc/MarkupSafe-3.0.2-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:2181e67807fc2fa785d0592dc2d6206c019b9502410671cc905d132a92866557", size = 24178, upload-time = "2024-10-18T15:21:18.859Z" }, + { url = "https://files.pythonhosted.org/packages/c4/f6/bb3ca0532de8086cbff5f06d137064c8410d10779c4c127e0e47d17c0b71/MarkupSafe-3.0.2-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:52305740fe773d09cffb16f8ed0427942901f00adedac82ec8b67752f58a1b22", size = 23319, upload-time = "2024-10-18T15:21:19.671Z" }, + { url = "https://files.pythonhosted.org/packages/a2/82/8be4c96ffee03c5b4a034e60a31294daf481e12c7c43ab8e34a1453ee48b/MarkupSafe-3.0.2-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:ad10d3ded218f1039f11a75f8091880239651b52e9bb592ca27de44eed242a48", size = 23352, upload-time = "2024-10-18T15:21:20.971Z" }, + { url = "https://files.pythonhosted.org/packages/51/ae/97827349d3fcffee7e184bdf7f41cd6b88d9919c80f0263ba7acd1bbcb18/MarkupSafe-3.0.2-cp312-cp312-win32.whl", hash = "sha256:0f4ca02bea9a23221c0182836703cbf8930c5e9454bacce27e767509fa286a30", size = 15097, upload-time = "2024-10-18T15:21:22.646Z" }, + { url = "https://files.pythonhosted.org/packages/c1/80/a61f99dc3a936413c3ee4e1eecac96c0da5ed07ad56fd975f1a9da5bc630/MarkupSafe-3.0.2-cp312-cp312-win_amd64.whl", hash = "sha256:8e06879fc22a25ca47312fbe7c8264eb0b662f6db27cb2d3bbbc74b1df4b9b87", size = 15601, upload-time = "2024-10-18T15:21:23.499Z" }, + { url = "https://files.pythonhosted.org/packages/83/0e/67eb10a7ecc77a0c2bbe2b0235765b98d164d81600746914bebada795e97/MarkupSafe-3.0.2-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:ba9527cdd4c926ed0760bc301f6728ef34d841f405abf9d4f959c478421e4efd", size = 14274, upload-time = "2024-10-18T15:21:24.577Z" }, + { url = 
"https://files.pythonhosted.org/packages/2b/6d/9409f3684d3335375d04e5f05744dfe7e9f120062c9857df4ab490a1031a/MarkupSafe-3.0.2-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:f8b3d067f2e40fe93e1ccdd6b2e1d16c43140e76f02fb1319a05cf2b79d99430", size = 12352, upload-time = "2024-10-18T15:21:25.382Z" }, + { url = "https://files.pythonhosted.org/packages/d2/f5/6eadfcd3885ea85fe2a7c128315cc1bb7241e1987443d78c8fe712d03091/MarkupSafe-3.0.2-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:569511d3b58c8791ab4c2e1285575265991e6d8f8700c7be0e88f86cb0672094", size = 24122, upload-time = "2024-10-18T15:21:26.199Z" }, + { url = "https://files.pythonhosted.org/packages/0c/91/96cf928db8236f1bfab6ce15ad070dfdd02ed88261c2afafd4b43575e9e9/MarkupSafe-3.0.2-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:15ab75ef81add55874e7ab7055e9c397312385bd9ced94920f2802310c930396", size = 23085, upload-time = "2024-10-18T15:21:27.029Z" }, + { url = "https://files.pythonhosted.org/packages/c2/cf/c9d56af24d56ea04daae7ac0940232d31d5a8354f2b457c6d856b2057d69/MarkupSafe-3.0.2-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:f3818cb119498c0678015754eba762e0d61e5b52d34c8b13d770f0719f7b1d79", size = 22978, upload-time = "2024-10-18T15:21:27.846Z" }, + { url = "https://files.pythonhosted.org/packages/2a/9f/8619835cd6a711d6272d62abb78c033bda638fdc54c4e7f4272cf1c0962b/MarkupSafe-3.0.2-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:cdb82a876c47801bb54a690c5ae105a46b392ac6099881cdfb9f6e95e4014c6a", size = 24208, upload-time = "2024-10-18T15:21:28.744Z" }, + { url = "https://files.pythonhosted.org/packages/f9/bf/176950a1792b2cd2102b8ffeb5133e1ed984547b75db47c25a67d3359f77/MarkupSafe-3.0.2-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:cabc348d87e913db6ab4aa100f01b08f481097838bdddf7c7a84b7575b7309ca", size = 23357, upload-time = "2024-10-18T15:21:29.545Z" }, + { url = 
"https://files.pythonhosted.org/packages/ce/4f/9a02c1d335caabe5c4efb90e1b6e8ee944aa245c1aaaab8e8a618987d816/MarkupSafe-3.0.2-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:444dcda765c8a838eaae23112db52f1efaf750daddb2d9ca300bcae1039adc5c", size = 23344, upload-time = "2024-10-18T15:21:30.366Z" }, + { url = "https://files.pythonhosted.org/packages/ee/55/c271b57db36f748f0e04a759ace9f8f759ccf22b4960c270c78a394f58be/MarkupSafe-3.0.2-cp313-cp313-win32.whl", hash = "sha256:bcf3e58998965654fdaff38e58584d8937aa3096ab5354d493c77d1fdd66d7a1", size = 15101, upload-time = "2024-10-18T15:21:31.207Z" }, + { url = "https://files.pythonhosted.org/packages/29/88/07df22d2dd4df40aba9f3e402e6dc1b8ee86297dddbad4872bd5e7b0094f/MarkupSafe-3.0.2-cp313-cp313-win_amd64.whl", hash = "sha256:e6a2a455bd412959b57a172ce6328d2dd1f01cb2135efda2e4576e8a23fa3b0f", size = 15603, upload-time = "2024-10-18T15:21:32.032Z" }, + { url = "https://files.pythonhosted.org/packages/62/6a/8b89d24db2d32d433dffcd6a8779159da109842434f1dd2f6e71f32f738c/MarkupSafe-3.0.2-cp313-cp313t-macosx_10_13_universal2.whl", hash = "sha256:b5a6b3ada725cea8a5e634536b1b01c30bcdcd7f9c6fff4151548d5bf6b3a36c", size = 14510, upload-time = "2024-10-18T15:21:33.625Z" }, + { url = "https://files.pythonhosted.org/packages/7a/06/a10f955f70a2e5a9bf78d11a161029d278eeacbd35ef806c3fd17b13060d/MarkupSafe-3.0.2-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:a904af0a6162c73e3edcb969eeeb53a63ceeb5d8cf642fade7d39e7963a22ddb", size = 12486, upload-time = "2024-10-18T15:21:34.611Z" }, + { url = "https://files.pythonhosted.org/packages/34/cf/65d4a571869a1a9078198ca28f39fba5fbb910f952f9dbc5220afff9f5e6/MarkupSafe-3.0.2-cp313-cp313t-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4aa4e5faecf353ed117801a068ebab7b7e09ffb6e1d5e412dc852e0da018126c", size = 25480, upload-time = "2024-10-18T15:21:35.398Z" }, + { url = 
"https://files.pythonhosted.org/packages/0c/e3/90e9651924c430b885468b56b3d597cabf6d72be4b24a0acd1fa0e12af67/MarkupSafe-3.0.2-cp313-cp313t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c0ef13eaeee5b615fb07c9a7dadb38eac06a0608b41570d8ade51c56539e509d", size = 23914, upload-time = "2024-10-18T15:21:36.231Z" }, + { url = "https://files.pythonhosted.org/packages/66/8c/6c7cf61f95d63bb866db39085150df1f2a5bd3335298f14a66b48e92659c/MarkupSafe-3.0.2-cp313-cp313t-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:d16a81a06776313e817c951135cf7340a3e91e8c1ff2fac444cfd75fffa04afe", size = 23796, upload-time = "2024-10-18T15:21:37.073Z" }, + { url = "https://files.pythonhosted.org/packages/bb/35/cbe9238ec3f47ac9a7c8b3df7a808e7cb50fe149dc7039f5f454b3fba218/MarkupSafe-3.0.2-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:6381026f158fdb7c72a168278597a5e3a5222e83ea18f543112b2662a9b699c5", size = 25473, upload-time = "2024-10-18T15:21:37.932Z" }, + { url = "https://files.pythonhosted.org/packages/e6/32/7621a4382488aa283cc05e8984a9c219abad3bca087be9ec77e89939ded9/MarkupSafe-3.0.2-cp313-cp313t-musllinux_1_2_i686.whl", hash = "sha256:3d79d162e7be8f996986c064d1c7c817f6df3a77fe3d6859f6f9e7be4b8c213a", size = 24114, upload-time = "2024-10-18T15:21:39.799Z" }, + { url = "https://files.pythonhosted.org/packages/0d/80/0985960e4b89922cb5a0bac0ed39c5b96cbc1a536a99f30e8c220a996ed9/MarkupSafe-3.0.2-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:131a3c7689c85f5ad20f9f6fb1b866f402c445b220c19fe4308c0b147ccd2ad9", size = 24098, upload-time = "2024-10-18T15:21:40.813Z" }, + { url = "https://files.pythonhosted.org/packages/82/78/fedb03c7d5380df2427038ec8d973587e90561b2d90cd472ce9254cf348b/MarkupSafe-3.0.2-cp313-cp313t-win32.whl", hash = "sha256:ba8062ed2cf21c07a9e295d5b8a2a5ce678b913b45fdf68c32d95d6c1291e0b6", size = 15208, upload-time = "2024-10-18T15:21:41.814Z" }, + { url = 
"https://files.pythonhosted.org/packages/4f/65/6079a46068dfceaeabb5dcad6d674f5f5c61a6fa5673746f42a9f4c233b3/MarkupSafe-3.0.2-cp313-cp313t-win_amd64.whl", hash = "sha256:e444a31f8db13eb18ada366ab3cf45fd4b31e4db1236a4448f68778c1d1a5a2f", size = 15739, upload-time = "2024-10-18T15:21:42.784Z" }, +] + +[[package]] +name = "mcp" +version = "1.12.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "anyio" }, + { name = "httpx" }, + { name = "httpx-sse" }, + { name = "jsonschema" }, + { name = "pydantic" }, + { name = "pydantic-settings" }, + { name = "python-multipart" }, + { name = "pywin32", marker = "sys_platform == 'win32'" }, + { name = "sse-starlette" }, + { name = "starlette" }, + { name = "uvicorn", marker = "sys_platform != 'emscripten'" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/45/94/caa0f4754e2437f7033068989f13fee784856f95870c786b0b5c2c0f511e/mcp-1.12.0.tar.gz", hash = "sha256:853f6b17a3f31ea6e2f278c2ec7d3b38457bc80c7c2c675260dd7f04a6fd0e70", size = 424678, upload-time = "2025-07-17T19:46:35.522Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/ed/da/c7eaab6a58f1034de115b7902141ad8f81b4f3bbf7dc0cc267594947a4d7/mcp-1.12.0-py3-none-any.whl", hash = "sha256:19a498b2bf273283e463b4dd1ed83f791fbba5c25bfa16b8b34cfd5571673e7f", size = 158470, upload-time = "2025-07-17T19:46:34.166Z" }, +] + +[package.optional-dependencies] +cli = [ + { name = "python-dotenv" }, + { name = "typer" }, +] + +[[package]] +name = "mdurl" +version = "0.1.2" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/d6/54/cfe61301667036ec958cb99bd3efefba235e65cdeb9c84d24a8293ba1d90/mdurl-0.1.2.tar.gz", hash = "sha256:bb413d29f5eea38f31dd4754dd7377d4465116fb207585f97bf925588687c1ba", size = 8729, upload-time = "2022-08-14T12:40:10.846Z" } +wheels = [ + { url = 
"https://files.pythonhosted.org/packages/b3/38/89ba8ad64ae25be8de66a6d463314cf1eb366222074cfda9ee839c56a4b4/mdurl-0.1.2-py3-none-any.whl", hash = "sha256:84008a41e51615a49fc9966191ff91509e3c40b939176e643fd50a5c2196b8f8", size = 9979, upload-time = "2022-08-14T12:40:09.779Z" }, +] + +[[package]] +name = "nodeenv" +version = "1.9.1" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/43/16/fc88b08840de0e0a72a2f9d8c6bae36be573e475a6326ae854bcc549fc45/nodeenv-1.9.1.tar.gz", hash = "sha256:6ec12890a2dab7946721edbfbcd91f3319c6ccc9aec47be7c7e6b7011ee6645f", size = 47437, upload-time = "2024-06-04T18:44:11.171Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/d2/1d/1b658dbd2b9fa9c4c9f32accbfc0205d532c8c6194dc0f2a4c0428e7128a/nodeenv-1.9.1-py2.py3-none-any.whl", hash = "sha256:ba11c9782d29c27c70ffbdda2d7415098754709be8a7056d79a737cd901155c9", size = 22314, upload-time = "2024-06-04T18:44:08.352Z" }, +] + +[[package]] +name = "packaging" +version = "25.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/a1/d4/1fc4078c65507b51b96ca8f8c3ba19e6a61c8253c72794544580a7b6c24d/packaging-25.0.tar.gz", hash = "sha256:d443872c98d677bf60f6a1f2f8c1cb748e8fe762d2bf9d3148b5599295b0fc4f", size = 165727, upload-time = "2025-04-19T11:48:59.673Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/20/12/38679034af332785aac8774540895e234f4d07f7545804097de4b666afd8/packaging-25.0-py3-none-any.whl", hash = "sha256:29572ef2b1f17581046b3a2227d5c611fb25ec70ca1ba8554b24b0e69331a484", size = 66469, upload-time = "2025-04-19T11:48:57.875Z" }, +] + +[[package]] +name = "platformdirs" +version = "4.3.8" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/fe/8b/3c73abc9c759ecd3f1f7ceff6685840859e8070c4d947c93fae71f6a0bf2/platformdirs-4.3.8.tar.gz", hash = 
"sha256:3d512d96e16bcb959a814c9f348431070822a6496326a4be0911c40b5a74c2bc", size = 21362, upload-time = "2025-05-07T22:47:42.121Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/fe/39/979e8e21520d4e47a0bbe349e2713c0aac6f3d853d0e5b34d76206c439aa/platformdirs-4.3.8-py3-none-any.whl", hash = "sha256:ff7059bb7eb1179e2685604f4aaf157cfd9535242bd23742eadc3c13542139b4", size = 18567, upload-time = "2025-05-07T22:47:40.376Z" }, +] + +[[package]] +name = "pluggy" +version = "1.6.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/f9/e2/3e91f31a7d2b083fe6ef3fa267035b518369d9511ffab804f839851d2779/pluggy-1.6.0.tar.gz", hash = "sha256:7dcc130b76258d33b90f61b658791dede3486c3e6bfb003ee5c9bfb396dd22f3", size = 69412, upload-time = "2025-05-15T12:30:07.975Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/54/20/4d324d65cc6d9205fabedc306948156824eb9f0ee1633355a8f7ec5c66bf/pluggy-1.6.0-py3-none-any.whl", hash = "sha256:e920276dd6813095e9377c0bc5566d94c932c33b27a3e3945d8389c374dd4746", size = 20538, upload-time = "2025-05-15T12:30:06.134Z" }, +] + +[[package]] +name = "pre-commit" +version = "4.2.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "cfgv" }, + { name = "identify" }, + { name = "nodeenv" }, + { name = "pyyaml" }, + { name = "virtualenv" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/08/39/679ca9b26c7bb2999ff122d50faa301e49af82ca9c066ec061cfbc0c6784/pre_commit-4.2.0.tar.gz", hash = "sha256:601283b9757afd87d40c4c4a9b2b5de9637a8ea02eaff7adc2d0fb4e04841146", size = 193424, upload-time = "2025-03-18T21:35:20.987Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/88/74/a88bf1b1efeae488a0c0b7bdf71429c313722d1fc0f377537fbe554e6180/pre_commit-4.2.0-py2.py3-none-any.whl", hash = "sha256:a009ca7205f1eb497d10b845e52c838a98b6cdd2102a6c8e4540e94ee75c58bd", size = 220707, upload-time = "2025-03-18T21:35:19.343Z" }, +] + 
+[[package]] +name = "prompt-toolkit" +version = "3.0.51" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "wcwidth" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/bb/6e/9d084c929dfe9e3bfe0c6a47e31f78a25c54627d64a66e884a8bf5474f1c/prompt_toolkit-3.0.51.tar.gz", hash = "sha256:931a162e3b27fc90c86f1b48bb1fb2c528c2761475e57c9c06de13311c7b54ed", size = 428940, upload-time = "2025-04-15T09:18:47.731Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/ce/4f/5249960887b1fbe561d9ff265496d170b55a735b76724f10ef19f9e40716/prompt_toolkit-3.0.51-py3-none-any.whl", hash = "sha256:52742911fde84e2d423e2f9a4cf1de7d7ac4e51958f648d9540e0fb8db077b07", size = 387810, upload-time = "2025-04-15T09:18:44.753Z" }, +] + +[[package]] +name = "pydantic" +version = "2.11.7" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "annotated-types" }, + { name = "pydantic-core" }, + { name = "typing-extensions" }, + { name = "typing-inspection" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/00/dd/4325abf92c39ba8623b5af936ddb36ffcfe0beae70405d456ab1fb2f5b8c/pydantic-2.11.7.tar.gz", hash = "sha256:d989c3c6cb79469287b1569f7447a17848c998458d49ebe294e975b9baf0f0db", size = 788350, upload-time = "2025-06-14T08:33:17.137Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/6a/c0/ec2b1c8712ca690e5d61979dee872603e92b8a32f94cc1b72d53beab008a/pydantic-2.11.7-py3-none-any.whl", hash = "sha256:dde5df002701f6de26248661f6835bbe296a47bf73990135c7d07ce741b9623b", size = 444782, upload-time = "2025-06-14T08:33:14.905Z" }, +] + +[[package]] +name = "pydantic-core" +version = "2.33.2" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "typing-extensions" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/ad/88/5f2260bdfae97aabf98f1778d43f69574390ad787afb646292a638c923d4/pydantic_core-2.33.2.tar.gz", hash = 
"sha256:7cb8bc3605c29176e1b105350d2e6474142d7c1bd1d9327c4a9bdb46bf827acc", size = 435195, upload-time = "2025-04-23T18:33:52.104Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/e5/92/b31726561b5dae176c2d2c2dc43a9c5bfba5d32f96f8b4c0a600dd492447/pydantic_core-2.33.2-cp310-cp310-macosx_10_12_x86_64.whl", hash = "sha256:2b3d326aaef0c0399d9afffeb6367d5e26ddc24d351dbc9c636840ac355dc5d8", size = 2028817, upload-time = "2025-04-23T18:30:43.919Z" }, + { url = "https://files.pythonhosted.org/packages/a3/44/3f0b95fafdaca04a483c4e685fe437c6891001bf3ce8b2fded82b9ea3aa1/pydantic_core-2.33.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:0e5b2671f05ba48b94cb90ce55d8bdcaaedb8ba00cc5359f6810fc918713983d", size = 1861357, upload-time = "2025-04-23T18:30:46.372Z" }, + { url = "https://files.pythonhosted.org/packages/30/97/e8f13b55766234caae05372826e8e4b3b96e7b248be3157f53237682e43c/pydantic_core-2.33.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0069c9acc3f3981b9ff4cdfaf088e98d83440a4c7ea1bc07460af3d4dc22e72d", size = 1898011, upload-time = "2025-04-23T18:30:47.591Z" }, + { url = "https://files.pythonhosted.org/packages/9b/a3/99c48cf7bafc991cc3ee66fd544c0aae8dc907b752f1dad2d79b1b5a471f/pydantic_core-2.33.2-cp310-cp310-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:d53b22f2032c42eaaf025f7c40c2e3b94568ae077a606f006d206a463bc69572", size = 1982730, upload-time = "2025-04-23T18:30:49.328Z" }, + { url = "https://files.pythonhosted.org/packages/de/8e/a5b882ec4307010a840fb8b58bd9bf65d1840c92eae7534c7441709bf54b/pydantic_core-2.33.2-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:0405262705a123b7ce9f0b92f123334d67b70fd1f20a9372b907ce1080c7ba02", size = 2136178, upload-time = "2025-04-23T18:30:50.907Z" }, + { url = 
"https://files.pythonhosted.org/packages/e4/bb/71e35fc3ed05af6834e890edb75968e2802fe98778971ab5cba20a162315/pydantic_core-2.33.2-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:4b25d91e288e2c4e0662b8038a28c6a07eaac3e196cfc4ff69de4ea3db992a1b", size = 2736462, upload-time = "2025-04-23T18:30:52.083Z" }, + { url = "https://files.pythonhosted.org/packages/31/0d/c8f7593e6bc7066289bbc366f2235701dcbebcd1ff0ef8e64f6f239fb47d/pydantic_core-2.33.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6bdfe4b3789761f3bcb4b1ddf33355a71079858958e3a552f16d5af19768fef2", size = 2005652, upload-time = "2025-04-23T18:30:53.389Z" }, + { url = "https://files.pythonhosted.org/packages/d2/7a/996d8bd75f3eda405e3dd219ff5ff0a283cd8e34add39d8ef9157e722867/pydantic_core-2.33.2-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:efec8db3266b76ef9607c2c4c419bdb06bf335ae433b80816089ea7585816f6a", size = 2113306, upload-time = "2025-04-23T18:30:54.661Z" }, + { url = "https://files.pythonhosted.org/packages/ff/84/daf2a6fb2db40ffda6578a7e8c5a6e9c8affb251a05c233ae37098118788/pydantic_core-2.33.2-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:031c57d67ca86902726e0fae2214ce6770bbe2f710dc33063187a68744a5ecac", size = 2073720, upload-time = "2025-04-23T18:30:56.11Z" }, + { url = "https://files.pythonhosted.org/packages/77/fb/2258da019f4825128445ae79456a5499c032b55849dbd5bed78c95ccf163/pydantic_core-2.33.2-cp310-cp310-musllinux_1_1_armv7l.whl", hash = "sha256:f8de619080e944347f5f20de29a975c2d815d9ddd8be9b9b7268e2e3ef68605a", size = 2244915, upload-time = "2025-04-23T18:30:57.501Z" }, + { url = "https://files.pythonhosted.org/packages/d8/7a/925ff73756031289468326e355b6fa8316960d0d65f8b5d6b3a3e7866de7/pydantic_core-2.33.2-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:73662edf539e72a9440129f231ed3757faab89630d291b784ca99237fb94db2b", size = 2241884, upload-time = "2025-04-23T18:30:58.867Z" }, + { url = 
"https://files.pythonhosted.org/packages/0b/b0/249ee6d2646f1cdadcb813805fe76265745c4010cf20a8eba7b0e639d9b2/pydantic_core-2.33.2-cp310-cp310-win32.whl", hash = "sha256:0a39979dcbb70998b0e505fb1556a1d550a0781463ce84ebf915ba293ccb7e22", size = 1910496, upload-time = "2025-04-23T18:31:00.078Z" }, + { url = "https://files.pythonhosted.org/packages/66/ff/172ba8f12a42d4b552917aa65d1f2328990d3ccfc01d5b7c943ec084299f/pydantic_core-2.33.2-cp310-cp310-win_amd64.whl", hash = "sha256:b0379a2b24882fef529ec3b4987cb5d003b9cda32256024e6fe1586ac45fc640", size = 1955019, upload-time = "2025-04-23T18:31:01.335Z" }, + { url = "https://files.pythonhosted.org/packages/3f/8d/71db63483d518cbbf290261a1fc2839d17ff89fce7089e08cad07ccfce67/pydantic_core-2.33.2-cp311-cp311-macosx_10_12_x86_64.whl", hash = "sha256:4c5b0a576fb381edd6d27f0a85915c6daf2f8138dc5c267a57c08a62900758c7", size = 2028584, upload-time = "2025-04-23T18:31:03.106Z" }, + { url = "https://files.pythonhosted.org/packages/24/2f/3cfa7244ae292dd850989f328722d2aef313f74ffc471184dc509e1e4e5a/pydantic_core-2.33.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:e799c050df38a639db758c617ec771fd8fb7a5f8eaaa4b27b101f266b216a246", size = 1855071, upload-time = "2025-04-23T18:31:04.621Z" }, + { url = "https://files.pythonhosted.org/packages/b3/d3/4ae42d33f5e3f50dd467761304be2fa0a9417fbf09735bc2cce003480f2a/pydantic_core-2.33.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:dc46a01bf8d62f227d5ecee74178ffc448ff4e5197c756331f71efcc66dc980f", size = 1897823, upload-time = "2025-04-23T18:31:06.377Z" }, + { url = "https://files.pythonhosted.org/packages/f4/f3/aa5976e8352b7695ff808599794b1fba2a9ae2ee954a3426855935799488/pydantic_core-2.33.2-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:a144d4f717285c6d9234a66778059f33a89096dfb9b39117663fd8413d582dcc", size = 1983792, upload-time = "2025-04-23T18:31:07.93Z" }, + { url = 
"https://files.pythonhosted.org/packages/d5/7a/cda9b5a23c552037717f2b2a5257e9b2bfe45e687386df9591eff7b46d28/pydantic_core-2.33.2-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:73cf6373c21bc80b2e0dc88444f41ae60b2f070ed02095754eb5a01df12256de", size = 2136338, upload-time = "2025-04-23T18:31:09.283Z" }, + { url = "https://files.pythonhosted.org/packages/2b/9f/b8f9ec8dd1417eb9da784e91e1667d58a2a4a7b7b34cf4af765ef663a7e5/pydantic_core-2.33.2-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:3dc625f4aa79713512d1976fe9f0bc99f706a9dee21dfd1810b4bbbf228d0e8a", size = 2730998, upload-time = "2025-04-23T18:31:11.7Z" }, + { url = "https://files.pythonhosted.org/packages/47/bc/cd720e078576bdb8255d5032c5d63ee5c0bf4b7173dd955185a1d658c456/pydantic_core-2.33.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:881b21b5549499972441da4758d662aeea93f1923f953e9cbaff14b8b9565aef", size = 2003200, upload-time = "2025-04-23T18:31:13.536Z" }, + { url = "https://files.pythonhosted.org/packages/ca/22/3602b895ee2cd29d11a2b349372446ae9727c32e78a94b3d588a40fdf187/pydantic_core-2.33.2-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:bdc25f3681f7b78572699569514036afe3c243bc3059d3942624e936ec93450e", size = 2113890, upload-time = "2025-04-23T18:31:15.011Z" }, + { url = "https://files.pythonhosted.org/packages/ff/e6/e3c5908c03cf00d629eb38393a98fccc38ee0ce8ecce32f69fc7d7b558a7/pydantic_core-2.33.2-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:fe5b32187cbc0c862ee201ad66c30cf218e5ed468ec8dc1cf49dec66e160cc4d", size = 2073359, upload-time = "2025-04-23T18:31:16.393Z" }, + { url = "https://files.pythonhosted.org/packages/12/e7/6a36a07c59ebefc8777d1ffdaf5ae71b06b21952582e4b07eba88a421c79/pydantic_core-2.33.2-cp311-cp311-musllinux_1_1_armv7l.whl", hash = "sha256:bc7aee6f634a6f4a95676fcb5d6559a2c2a390330098dba5e5a5f28a2e4ada30", size = 2245883, upload-time = "2025-04-23T18:31:17.892Z" }, + { url = 
"https://files.pythonhosted.org/packages/16/3f/59b3187aaa6cc0c1e6616e8045b284de2b6a87b027cce2ffcea073adf1d2/pydantic_core-2.33.2-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:235f45e5dbcccf6bd99f9f472858849f73d11120d76ea8707115415f8e5ebebf", size = 2241074, upload-time = "2025-04-23T18:31:19.205Z" }, + { url = "https://files.pythonhosted.org/packages/e0/ed/55532bb88f674d5d8f67ab121a2a13c385df382de2a1677f30ad385f7438/pydantic_core-2.33.2-cp311-cp311-win32.whl", hash = "sha256:6368900c2d3ef09b69cb0b913f9f8263b03786e5b2a387706c5afb66800efd51", size = 1910538, upload-time = "2025-04-23T18:31:20.541Z" }, + { url = "https://files.pythonhosted.org/packages/fe/1b/25b7cccd4519c0b23c2dd636ad39d381abf113085ce4f7bec2b0dc755eb1/pydantic_core-2.33.2-cp311-cp311-win_amd64.whl", hash = "sha256:1e063337ef9e9820c77acc768546325ebe04ee38b08703244c1309cccc4f1bab", size = 1952909, upload-time = "2025-04-23T18:31:22.371Z" }, + { url = "https://files.pythonhosted.org/packages/49/a9/d809358e49126438055884c4366a1f6227f0f84f635a9014e2deb9b9de54/pydantic_core-2.33.2-cp311-cp311-win_arm64.whl", hash = "sha256:6b99022f1d19bc32a4c2a0d544fc9a76e3be90f0b3f4af413f87d38749300e65", size = 1897786, upload-time = "2025-04-23T18:31:24.161Z" }, + { url = "https://files.pythonhosted.org/packages/18/8a/2b41c97f554ec8c71f2a8a5f85cb56a8b0956addfe8b0efb5b3d77e8bdc3/pydantic_core-2.33.2-cp312-cp312-macosx_10_12_x86_64.whl", hash = "sha256:a7ec89dc587667f22b6a0b6579c249fca9026ce7c333fc142ba42411fa243cdc", size = 2009000, upload-time = "2025-04-23T18:31:25.863Z" }, + { url = "https://files.pythonhosted.org/packages/a1/02/6224312aacb3c8ecbaa959897af57181fb6cf3a3d7917fd44d0f2917e6f2/pydantic_core-2.33.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:3c6db6e52c6d70aa0d00d45cdb9b40f0433b96380071ea80b09277dba021ddf7", size = 1847996, upload-time = "2025-04-23T18:31:27.341Z" }, + { url = 
"https://files.pythonhosted.org/packages/d6/46/6dcdf084a523dbe0a0be59d054734b86a981726f221f4562aed313dbcb49/pydantic_core-2.33.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4e61206137cbc65e6d5256e1166f88331d3b6238e082d9f74613b9b765fb9025", size = 1880957, upload-time = "2025-04-23T18:31:28.956Z" }, + { url = "https://files.pythonhosted.org/packages/ec/6b/1ec2c03837ac00886ba8160ce041ce4e325b41d06a034adbef11339ae422/pydantic_core-2.33.2-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:eb8c529b2819c37140eb51b914153063d27ed88e3bdc31b71198a198e921e011", size = 1964199, upload-time = "2025-04-23T18:31:31.025Z" }, + { url = "https://files.pythonhosted.org/packages/2d/1d/6bf34d6adb9debd9136bd197ca72642203ce9aaaa85cfcbfcf20f9696e83/pydantic_core-2.33.2-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:c52b02ad8b4e2cf14ca7b3d918f3eb0ee91e63b3167c32591e57c4317e134f8f", size = 2120296, upload-time = "2025-04-23T18:31:32.514Z" }, + { url = "https://files.pythonhosted.org/packages/e0/94/2bd0aaf5a591e974b32a9f7123f16637776c304471a0ab33cf263cf5591a/pydantic_core-2.33.2-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:96081f1605125ba0855dfda83f6f3df5ec90c61195421ba72223de35ccfb2f88", size = 2676109, upload-time = "2025-04-23T18:31:33.958Z" }, + { url = "https://files.pythonhosted.org/packages/f9/41/4b043778cf9c4285d59742281a769eac371b9e47e35f98ad321349cc5d61/pydantic_core-2.33.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8f57a69461af2a5fa6e6bbd7a5f60d3b7e6cebb687f55106933188e79ad155c1", size = 2002028, upload-time = "2025-04-23T18:31:39.095Z" }, + { url = "https://files.pythonhosted.org/packages/cb/d5/7bb781bf2748ce3d03af04d5c969fa1308880e1dca35a9bd94e1a96a922e/pydantic_core-2.33.2-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:572c7e6c8bb4774d2ac88929e3d1f12bc45714ae5ee6d9a788a9fb35e60bb04b", size = 2100044, 
upload-time = "2025-04-23T18:31:41.034Z" }, + { url = "https://files.pythonhosted.org/packages/fe/36/def5e53e1eb0ad896785702a5bbfd25eed546cdcf4087ad285021a90ed53/pydantic_core-2.33.2-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:db4b41f9bd95fbe5acd76d89920336ba96f03e149097365afe1cb092fceb89a1", size = 2058881, upload-time = "2025-04-23T18:31:42.757Z" }, + { url = "https://files.pythonhosted.org/packages/01/6c/57f8d70b2ee57fc3dc8b9610315949837fa8c11d86927b9bb044f8705419/pydantic_core-2.33.2-cp312-cp312-musllinux_1_1_armv7l.whl", hash = "sha256:fa854f5cf7e33842a892e5c73f45327760bc7bc516339fda888c75ae60edaeb6", size = 2227034, upload-time = "2025-04-23T18:31:44.304Z" }, + { url = "https://files.pythonhosted.org/packages/27/b9/9c17f0396a82b3d5cbea4c24d742083422639e7bb1d5bf600e12cb176a13/pydantic_core-2.33.2-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:5f483cfb75ff703095c59e365360cb73e00185e01aaea067cd19acffd2ab20ea", size = 2234187, upload-time = "2025-04-23T18:31:45.891Z" }, + { url = "https://files.pythonhosted.org/packages/b0/6a/adf5734ffd52bf86d865093ad70b2ce543415e0e356f6cacabbc0d9ad910/pydantic_core-2.33.2-cp312-cp312-win32.whl", hash = "sha256:9cb1da0f5a471435a7bc7e439b8a728e8b61e59784b2af70d7c169f8dd8ae290", size = 1892628, upload-time = "2025-04-23T18:31:47.819Z" }, + { url = "https://files.pythonhosted.org/packages/43/e4/5479fecb3606c1368d496a825d8411e126133c41224c1e7238be58b87d7e/pydantic_core-2.33.2-cp312-cp312-win_amd64.whl", hash = "sha256:f941635f2a3d96b2973e867144fde513665c87f13fe0e193c158ac51bfaaa7b2", size = 1955866, upload-time = "2025-04-23T18:31:49.635Z" }, + { url = "https://files.pythonhosted.org/packages/0d/24/8b11e8b3e2be9dd82df4b11408a67c61bb4dc4f8e11b5b0fc888b38118b5/pydantic_core-2.33.2-cp312-cp312-win_arm64.whl", hash = "sha256:cca3868ddfaccfbc4bfb1d608e2ccaaebe0ae628e1416aeb9c4d88c001bb45ab", size = 1888894, upload-time = "2025-04-23T18:31:51.609Z" }, + { url = 
"https://files.pythonhosted.org/packages/46/8c/99040727b41f56616573a28771b1bfa08a3d3fe74d3d513f01251f79f172/pydantic_core-2.33.2-cp313-cp313-macosx_10_12_x86_64.whl", hash = "sha256:1082dd3e2d7109ad8b7da48e1d4710c8d06c253cbc4a27c1cff4fbcaa97a9e3f", size = 2015688, upload-time = "2025-04-23T18:31:53.175Z" }, + { url = "https://files.pythonhosted.org/packages/3a/cc/5999d1eb705a6cefc31f0b4a90e9f7fc400539b1a1030529700cc1b51838/pydantic_core-2.33.2-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:f517ca031dfc037a9c07e748cefd8d96235088b83b4f4ba8939105d20fa1dcd6", size = 1844808, upload-time = "2025-04-23T18:31:54.79Z" }, + { url = "https://files.pythonhosted.org/packages/6f/5e/a0a7b8885c98889a18b6e376f344da1ef323d270b44edf8174d6bce4d622/pydantic_core-2.33.2-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0a9f2c9dd19656823cb8250b0724ee9c60a82f3cdf68a080979d13092a3b0fef", size = 1885580, upload-time = "2025-04-23T18:31:57.393Z" }, + { url = "https://files.pythonhosted.org/packages/3b/2a/953581f343c7d11a304581156618c3f592435523dd9d79865903272c256a/pydantic_core-2.33.2-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:2b0a451c263b01acebe51895bfb0e1cc842a5c666efe06cdf13846c7418caa9a", size = 1973859, upload-time = "2025-04-23T18:31:59.065Z" }, + { url = "https://files.pythonhosted.org/packages/e6/55/f1a813904771c03a3f97f676c62cca0c0a4138654107c1b61f19c644868b/pydantic_core-2.33.2-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1ea40a64d23faa25e62a70ad163571c0b342b8bf66d5fa612ac0dec4f069d916", size = 2120810, upload-time = "2025-04-23T18:32:00.78Z" }, + { url = "https://files.pythonhosted.org/packages/aa/c3/053389835a996e18853ba107a63caae0b9deb4a276c6b472931ea9ae6e48/pydantic_core-2.33.2-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:0fb2d542b4d66f9470e8065c5469ec676978d625a8b7a363f07d9a501a9cb36a", size = 2676498, upload-time = "2025-04-23T18:32:02.418Z" }, + { 
url = "https://files.pythonhosted.org/packages/eb/3c/f4abd740877a35abade05e437245b192f9d0ffb48bbbbd708df33d3cda37/pydantic_core-2.33.2-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9fdac5d6ffa1b5a83bca06ffe7583f5576555e6c8b3a91fbd25ea7780f825f7d", size = 2000611, upload-time = "2025-04-23T18:32:04.152Z" }, + { url = "https://files.pythonhosted.org/packages/59/a7/63ef2fed1837d1121a894d0ce88439fe3e3b3e48c7543b2a4479eb99c2bd/pydantic_core-2.33.2-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:04a1a413977ab517154eebb2d326da71638271477d6ad87a769102f7c2488c56", size = 2107924, upload-time = "2025-04-23T18:32:06.129Z" }, + { url = "https://files.pythonhosted.org/packages/04/8f/2551964ef045669801675f1cfc3b0d74147f4901c3ffa42be2ddb1f0efc4/pydantic_core-2.33.2-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:c8e7af2f4e0194c22b5b37205bfb293d166a7344a5b0d0eaccebc376546d77d5", size = 2063196, upload-time = "2025-04-23T18:32:08.178Z" }, + { url = "https://files.pythonhosted.org/packages/26/bd/d9602777e77fc6dbb0c7db9ad356e9a985825547dce5ad1d30ee04903918/pydantic_core-2.33.2-cp313-cp313-musllinux_1_1_armv7l.whl", hash = "sha256:5c92edd15cd58b3c2d34873597a1e20f13094f59cf88068adb18947df5455b4e", size = 2236389, upload-time = "2025-04-23T18:32:10.242Z" }, + { url = "https://files.pythonhosted.org/packages/42/db/0e950daa7e2230423ab342ae918a794964b053bec24ba8af013fc7c94846/pydantic_core-2.33.2-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:65132b7b4a1c0beded5e057324b7e16e10910c106d43675d9bd87d4f38dde162", size = 2239223, upload-time = "2025-04-23T18:32:12.382Z" }, + { url = "https://files.pythonhosted.org/packages/58/4d/4f937099c545a8a17eb52cb67fe0447fd9a373b348ccfa9a87f141eeb00f/pydantic_core-2.33.2-cp313-cp313-win32.whl", hash = "sha256:52fb90784e0a242bb96ec53f42196a17278855b0f31ac7c3cc6f5c1ec4811849", size = 1900473, upload-time = "2025-04-23T18:32:14.034Z" }, + { url = 
"https://files.pythonhosted.org/packages/a0/75/4a0a9bac998d78d889def5e4ef2b065acba8cae8c93696906c3a91f310ca/pydantic_core-2.33.2-cp313-cp313-win_amd64.whl", hash = "sha256:c083a3bdd5a93dfe480f1125926afcdbf2917ae714bdb80b36d34318b2bec5d9", size = 1955269, upload-time = "2025-04-23T18:32:15.783Z" }, + { url = "https://files.pythonhosted.org/packages/f9/86/1beda0576969592f1497b4ce8e7bc8cbdf614c352426271b1b10d5f0aa64/pydantic_core-2.33.2-cp313-cp313-win_arm64.whl", hash = "sha256:e80b087132752f6b3d714f041ccf74403799d3b23a72722ea2e6ba2e892555b9", size = 1893921, upload-time = "2025-04-23T18:32:18.473Z" }, + { url = "https://files.pythonhosted.org/packages/a4/7d/e09391c2eebeab681df2b74bfe6c43422fffede8dc74187b2b0bf6fd7571/pydantic_core-2.33.2-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:61c18fba8e5e9db3ab908620af374db0ac1baa69f0f32df4f61ae23f15e586ac", size = 1806162, upload-time = "2025-04-23T18:32:20.188Z" }, + { url = "https://files.pythonhosted.org/packages/f1/3d/847b6b1fed9f8ed3bb95a9ad04fbd0b212e832d4f0f50ff4d9ee5a9f15cf/pydantic_core-2.33.2-cp313-cp313t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:95237e53bb015f67b63c91af7518a62a8660376a6a0db19b89acc77a4d6199f5", size = 1981560, upload-time = "2025-04-23T18:32:22.354Z" }, + { url = "https://files.pythonhosted.org/packages/6f/9a/e73262f6c6656262b5fdd723ad90f518f579b7bc8622e43a942eec53c938/pydantic_core-2.33.2-cp313-cp313t-win_amd64.whl", hash = "sha256:c2fc0a768ef76c15ab9238afa6da7f69895bb5d1ee83aeea2e3509af4472d0b9", size = 1935777, upload-time = "2025-04-23T18:32:25.088Z" }, + { url = "https://files.pythonhosted.org/packages/30/68/373d55e58b7e83ce371691f6eaa7175e3a24b956c44628eb25d7da007917/pydantic_core-2.33.2-pp310-pypy310_pp73-macosx_10_12_x86_64.whl", hash = "sha256:5c4aa4e82353f65e548c476b37e64189783aa5384903bfea4f41580f255fddfa", size = 2023982, upload-time = "2025-04-23T18:32:53.14Z" }, + { url = 
"https://files.pythonhosted.org/packages/a4/16/145f54ac08c96a63d8ed6442f9dec17b2773d19920b627b18d4f10a061ea/pydantic_core-2.33.2-pp310-pypy310_pp73-macosx_11_0_arm64.whl", hash = "sha256:d946c8bf0d5c24bf4fe333af284c59a19358aa3ec18cb3dc4370080da1e8ad29", size = 1858412, upload-time = "2025-04-23T18:32:55.52Z" }, + { url = "https://files.pythonhosted.org/packages/41/b1/c6dc6c3e2de4516c0bb2c46f6a373b91b5660312342a0cf5826e38ad82fa/pydantic_core-2.33.2-pp310-pypy310_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:87b31b6846e361ef83fedb187bb5b4372d0da3f7e28d85415efa92d6125d6e6d", size = 1892749, upload-time = "2025-04-23T18:32:57.546Z" }, + { url = "https://files.pythonhosted.org/packages/12/73/8cd57e20afba760b21b742106f9dbdfa6697f1570b189c7457a1af4cd8a0/pydantic_core-2.33.2-pp310-pypy310_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:aa9d91b338f2df0508606f7009fde642391425189bba6d8c653afd80fd6bb64e", size = 2067527, upload-time = "2025-04-23T18:32:59.771Z" }, + { url = "https://files.pythonhosted.org/packages/e3/d5/0bb5d988cc019b3cba4a78f2d4b3854427fc47ee8ec8e9eaabf787da239c/pydantic_core-2.33.2-pp310-pypy310_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:2058a32994f1fde4ca0480ab9d1e75a0e8c87c22b53a3ae66554f9af78f2fe8c", size = 2108225, upload-time = "2025-04-23T18:33:04.51Z" }, + { url = "https://files.pythonhosted.org/packages/f1/c5/00c02d1571913d496aabf146106ad8239dc132485ee22efe08085084ff7c/pydantic_core-2.33.2-pp310-pypy310_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:0e03262ab796d986f978f79c943fc5f620381be7287148b8010b4097f79a39ec", size = 2069490, upload-time = "2025-04-23T18:33:06.391Z" }, + { url = "https://files.pythonhosted.org/packages/22/a8/dccc38768274d3ed3a59b5d06f59ccb845778687652daa71df0cab4040d7/pydantic_core-2.33.2-pp310-pypy310_pp73-musllinux_1_1_armv7l.whl", hash = "sha256:1a8695a8d00c73e50bff9dfda4d540b7dee29ff9b8053e38380426a85ef10052", size = 2237525, upload-time = 
"2025-04-23T18:33:08.44Z" }, + { url = "https://files.pythonhosted.org/packages/d4/e7/4f98c0b125dda7cf7ccd14ba936218397b44f50a56dd8c16a3091df116c3/pydantic_core-2.33.2-pp310-pypy310_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:fa754d1850735a0b0e03bcffd9d4b4343eb417e47196e4485d9cca326073a42c", size = 2238446, upload-time = "2025-04-23T18:33:10.313Z" }, + { url = "https://files.pythonhosted.org/packages/ce/91/2ec36480fdb0b783cd9ef6795753c1dea13882f2e68e73bce76ae8c21e6a/pydantic_core-2.33.2-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:a11c8d26a50bfab49002947d3d237abe4d9e4b5bdc8846a63537b6488e197808", size = 2066678, upload-time = "2025-04-23T18:33:12.224Z" }, + { url = "https://files.pythonhosted.org/packages/7b/27/d4ae6487d73948d6f20dddcd94be4ea43e74349b56eba82e9bdee2d7494c/pydantic_core-2.33.2-pp311-pypy311_pp73-macosx_10_12_x86_64.whl", hash = "sha256:dd14041875d09cc0f9308e37a6f8b65f5585cf2598a53aa0123df8b129d481f8", size = 2025200, upload-time = "2025-04-23T18:33:14.199Z" }, + { url = "https://files.pythonhosted.org/packages/f1/b8/b3cb95375f05d33801024079b9392a5ab45267a63400bf1866e7ce0f0de4/pydantic_core-2.33.2-pp311-pypy311_pp73-macosx_11_0_arm64.whl", hash = "sha256:d87c561733f66531dced0da6e864f44ebf89a8fba55f31407b00c2f7f9449593", size = 1859123, upload-time = "2025-04-23T18:33:16.555Z" }, + { url = "https://files.pythonhosted.org/packages/05/bc/0d0b5adeda59a261cd30a1235a445bf55c7e46ae44aea28f7bd6ed46e091/pydantic_core-2.33.2-pp311-pypy311_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:2f82865531efd18d6e07a04a17331af02cb7a651583c418df8266f17a63c6612", size = 1892852, upload-time = "2025-04-23T18:33:18.513Z" }, + { url = "https://files.pythonhosted.org/packages/3e/11/d37bdebbda2e449cb3f519f6ce950927b56d62f0b84fd9cb9e372a26a3d5/pydantic_core-2.33.2-pp311-pypy311_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2bfb5112df54209d820d7bf9317c7a6c9025ea52e49f46b6a2060104bba37de7", size = 2067484, upload-time = 
"2025-04-23T18:33:20.475Z" }, + { url = "https://files.pythonhosted.org/packages/8c/55/1f95f0a05ce72ecb02a8a8a1c3be0579bbc29b1d5ab68f1378b7bebc5057/pydantic_core-2.33.2-pp311-pypy311_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:64632ff9d614e5eecfb495796ad51b0ed98c453e447a76bcbeeb69615079fc7e", size = 2108896, upload-time = "2025-04-23T18:33:22.501Z" }, + { url = "https://files.pythonhosted.org/packages/53/89/2b2de6c81fa131f423246a9109d7b2a375e83968ad0800d6e57d0574629b/pydantic_core-2.33.2-pp311-pypy311_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:f889f7a40498cc077332c7ab6b4608d296d852182211787d4f3ee377aaae66e8", size = 2069475, upload-time = "2025-04-23T18:33:24.528Z" }, + { url = "https://files.pythonhosted.org/packages/b8/e9/1f7efbe20d0b2b10f6718944b5d8ece9152390904f29a78e68d4e7961159/pydantic_core-2.33.2-pp311-pypy311_pp73-musllinux_1_1_armv7l.whl", hash = "sha256:de4b83bb311557e439b9e186f733f6c645b9417c84e2eb8203f3f820a4b988bf", size = 2239013, upload-time = "2025-04-23T18:33:26.621Z" }, + { url = "https://files.pythonhosted.org/packages/3c/b2/5309c905a93811524a49b4e031e9851a6b00ff0fb668794472ea7746b448/pydantic_core-2.33.2-pp311-pypy311_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:82f68293f055f51b51ea42fafc74b6aad03e70e191799430b90c13d643059ebb", size = 2238715, upload-time = "2025-04-23T18:33:28.656Z" }, + { url = "https://files.pythonhosted.org/packages/32/56/8a7ca5d2cd2cda1d245d34b1c9a942920a718082ae8e54e5f3e5a58b7add/pydantic_core-2.33.2-pp311-pypy311_pp73-win_amd64.whl", hash = "sha256:329467cecfb529c925cf2bbd4d60d2c509bc2fb52a20c1045bf09bb70971a9c1", size = 2066757, upload-time = "2025-04-23T18:33:30.645Z" }, +] + +[[package]] +name = "pydantic-settings" +version = "2.10.1" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "pydantic" }, + { name = "python-dotenv" }, + { name = "typing-inspection" }, +] +sdist = { url = 
"https://files.pythonhosted.org/packages/68/85/1ea668bbab3c50071ca613c6ab30047fb36ab0da1b92fa8f17bbc38fd36c/pydantic_settings-2.10.1.tar.gz", hash = "sha256:06f0062169818d0f5524420a360d632d5857b83cffd4d42fe29597807a1614ee", size = 172583, upload-time = "2025-06-24T13:26:46.841Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/58/f0/427018098906416f580e3cf1366d3b1abfb408a0652e9f31600c24a1903c/pydantic_settings-2.10.1-py3-none-any.whl", hash = "sha256:a60952460b99cf661dc25c29c0ef171721f98bfcb52ef8d9ea4c943d7c8cc796", size = 45235, upload-time = "2025-06-24T13:26:45.485Z" }, +] + +[[package]] +name = "pygments" +version = "2.19.2" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/b0/77/a5b8c569bf593b0140bde72ea885a803b82086995367bf2037de0159d924/pygments-2.19.2.tar.gz", hash = "sha256:636cb2477cec7f8952536970bc533bc43743542f70392ae026374600add5b887", size = 4968631, upload-time = "2025-06-21T13:39:12.283Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/c7/21/705964c7812476f378728bdf590ca4b771ec72385c533964653c68e86bdc/pygments-2.19.2-py3-none-any.whl", hash = "sha256:86540386c03d588bb81d44bc3928634ff26449851e99741617ecb9037ee5ec0b", size = 1225217, upload-time = "2025-06-21T13:39:07.939Z" }, +] + +[[package]] +name = "pyright" +version = "1.1.403" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "nodeenv" }, + { name = "typing-extensions" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/fe/f6/35f885264ff08c960b23d1542038d8da86971c5d8c955cfab195a4f672d7/pyright-1.1.403.tar.gz", hash = "sha256:3ab69b9f41c67fb5bbb4d7a36243256f0d549ed3608678d381d5f51863921104", size = 3913526, upload-time = "2025-07-09T07:15:52.882Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/49/b6/b04e5c2f41a5ccad74a1a4759da41adb20b4bc9d59a5e08d29ba60084d07/pyright-1.1.403-py3-none-any.whl", hash = 
"sha256:c0eeca5aa76cbef3fcc271259bbd785753c7ad7bcac99a9162b4c4c7daed23b3", size = 5684504, upload-time = "2025-07-09T07:15:50.958Z" }, +] + +[[package]] +name = "pytest" +version = "8.4.1" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "colorama", marker = "sys_platform == 'win32'" }, + { name = "exceptiongroup", marker = "python_full_version < '3.11'" }, + { name = "iniconfig" }, + { name = "packaging" }, + { name = "pluggy" }, + { name = "pygments" }, + { name = "tomli", marker = "python_full_version < '3.11'" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/08/ba/45911d754e8eba3d5a841a5ce61a65a685ff1798421ac054f85aa8747dfb/pytest-8.4.1.tar.gz", hash = "sha256:7c67fd69174877359ed9371ec3af8a3d2b04741818c51e5e99cc1742251fa93c", size = 1517714, upload-time = "2025-06-18T05:48:06.109Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/29/16/c8a903f4c4dffe7a12843191437d7cd8e32751d5de349d45d3fe69544e87/pytest-8.4.1-py3-none-any.whl", hash = "sha256:539c70ba6fcead8e78eebbf1115e8b589e7565830d7d006a8723f19ac8a0afb7", size = 365474, upload-time = "2025-06-18T05:48:03.955Z" }, +] + +[[package]] +name = "pytest-asyncio" +version = "1.1.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "backports-asyncio-runner", marker = "python_full_version < '3.11'" }, + { name = "pytest" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/4e/51/f8794af39eeb870e87a8c8068642fc07bce0c854d6865d7dd0f2a9d338c2/pytest_asyncio-1.1.0.tar.gz", hash = "sha256:796aa822981e01b68c12e4827b8697108f7205020f24b5793b3c41555dab68ea", size = 46652, upload-time = "2025-07-16T04:29:26.393Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/c7/9d/bf86eddabf8c6c9cb1ea9a869d6873b46f105a5d292d3a6f7071f5b07935/pytest_asyncio-1.1.0-py3-none-any.whl", hash = "sha256:5fe2d69607b0bd75c656d1211f969cadba035030156745ee09e7d71740e58ecf", size = 15157, upload-time = "2025-07-16T04:29:24.929Z" }, +] + 
+[[package]] +name = "pytest-cov" +version = "6.2.1" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "coverage", extra = ["toml"] }, + { name = "pluggy" }, + { name = "pytest" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/18/99/668cade231f434aaa59bbfbf49469068d2ddd945000621d3d165d2e7dd7b/pytest_cov-6.2.1.tar.gz", hash = "sha256:25cc6cc0a5358204b8108ecedc51a9b57b34cc6b8c967cc2c01a4e00d8a67da2", size = 69432, upload-time = "2025-06-12T10:47:47.684Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/bc/16/4ea354101abb1287856baa4af2732be351c7bee728065aed451b678153fd/pytest_cov-6.2.1-py3-none-any.whl", hash = "sha256:f5bc4c23f42f1cdd23c70b1dab1bbaef4fc505ba950d53e0081d0730dd7e86d5", size = 24644, upload-time = "2025-06-12T10:47:45.932Z" }, +] + +[[package]] +name = "pytest-mock" +version = "3.14.1" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "pytest" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/71/28/67172c96ba684058a4d24ffe144d64783d2a270d0af0d9e792737bddc75c/pytest_mock-3.14.1.tar.gz", hash = "sha256:159e9edac4c451ce77a5cdb9fc5d1100708d2dd4ba3c3df572f14097351af80e", size = 33241, upload-time = "2025-05-26T13:58:45.167Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/b2/05/77b60e520511c53d1c1ca75f1930c7dd8e971d0c4379b7f4b3f9644685ba/pytest_mock-3.14.1-py3-none-any.whl", hash = "sha256:178aefcd11307d874b4cd3100344e7e2d888d9791a6a1d9bfe90fbc1b74fd1d0", size = 9923, upload-time = "2025-05-26T13:58:43.487Z" }, +] + +[[package]] +name = "python-dateutil" +version = "2.9.0.post0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "six" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/66/c0/0c8b6ad9f17a802ee498c46e004a0eb49bc148f2fd230864601a86dcf6db/python-dateutil-2.9.0.post0.tar.gz", hash = "sha256:37dd54208da7e1cd875388217d5e00ebd4179249f90fb72437e91a35459a0ad3", size = 342432, upload-time = 
"2024-03-01T18:36:20.211Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/ec/57/56b9bcc3c9c6a792fcbaf139543cee77261f3651ca9da0c93f5c1221264b/python_dateutil-2.9.0.post0-py2.py3-none-any.whl", hash = "sha256:a8b2bc7bffae282281c8140a97d3aa9c14da0b136dfe83f850eea9a5f7470427", size = 229892, upload-time = "2024-03-01T18:36:18.57Z" }, +] + +[[package]] +name = "python-dotenv" +version = "1.1.1" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/f6/b0/4bc07ccd3572a2f9df7e6782f52b0c6c90dcbb803ac4a167702d7d0dfe1e/python_dotenv-1.1.1.tar.gz", hash = "sha256:a8a6399716257f45be6a007360200409fce5cda2661e3dec71d23dc15f6189ab", size = 41978, upload-time = "2025-06-24T04:21:07.341Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/5f/ed/539768cf28c661b5b068d66d96a2f155c4971a5d55684a514c1a0e0dec2f/python_dotenv-1.1.1-py3-none-any.whl", hash = "sha256:31f23644fe2602f88ff55e1f5c79ba497e01224ee7737937930c448e4d0e24dc", size = 20556, upload-time = "2025-06-24T04:21:06.073Z" }, +] + +[[package]] +name = "python-multipart" +version = "0.0.20" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/f3/87/f44d7c9f274c7ee665a29b885ec97089ec5dc034c7f3fafa03da9e39a09e/python_multipart-0.0.20.tar.gz", hash = "sha256:8dd0cab45b8e23064ae09147625994d090fa46f5b0d1e13af944c331a7fa9d13", size = 37158, upload-time = "2024-12-16T19:45:46.972Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/45/58/38b5afbc1a800eeea951b9285d3912613f2603bdf897a4ab0f4bd7f405fc/python_multipart-0.0.20-py3-none-any.whl", hash = "sha256:8a62d3a8335e06589fe01f2a3e178cdcc632f3fbe0d492ad9ee0ec35aab1f104", size = 24546, upload-time = "2024-12-16T19:45:44.423Z" }, +] + +[[package]] +name = "pywin32" +version = "311" +source = { registry = "https://pypi.org/simple" } +wheels = [ + { url = 
"https://files.pythonhosted.org/packages/7b/40/44efbb0dfbd33aca6a6483191dae0716070ed99e2ecb0c53683f400a0b4f/pywin32-311-cp310-cp310-win32.whl", hash = "sha256:d03ff496d2a0cd4a5893504789d4a15399133fe82517455e78bad62efbb7f0a3", size = 8760432, upload-time = "2025-07-14T20:13:05.9Z" }, + { url = "https://files.pythonhosted.org/packages/5e/bf/360243b1e953bd254a82f12653974be395ba880e7ec23e3731d9f73921cc/pywin32-311-cp310-cp310-win_amd64.whl", hash = "sha256:797c2772017851984b97180b0bebe4b620bb86328e8a884bb626156295a63b3b", size = 9590103, upload-time = "2025-07-14T20:13:07.698Z" }, + { url = "https://files.pythonhosted.org/packages/57/38/d290720e6f138086fb3d5ffe0b6caa019a791dd57866940c82e4eeaf2012/pywin32-311-cp310-cp310-win_arm64.whl", hash = "sha256:0502d1facf1fed4839a9a51ccbcc63d952cf318f78ffc00a7e78528ac27d7a2b", size = 8778557, upload-time = "2025-07-14T20:13:11.11Z" }, + { url = "https://files.pythonhosted.org/packages/7c/af/449a6a91e5d6db51420875c54f6aff7c97a86a3b13a0b4f1a5c13b988de3/pywin32-311-cp311-cp311-win32.whl", hash = "sha256:184eb5e436dea364dcd3d2316d577d625c0351bf237c4e9a5fabbcfa5a58b151", size = 8697031, upload-time = "2025-07-14T20:13:13.266Z" }, + { url = "https://files.pythonhosted.org/packages/51/8f/9bb81dd5bb77d22243d33c8397f09377056d5c687aa6d4042bea7fbf8364/pywin32-311-cp311-cp311-win_amd64.whl", hash = "sha256:3ce80b34b22b17ccbd937a6e78e7225d80c52f5ab9940fe0506a1a16f3dab503", size = 9508308, upload-time = "2025-07-14T20:13:15.147Z" }, + { url = "https://files.pythonhosted.org/packages/44/7b/9c2ab54f74a138c491aba1b1cd0795ba61f144c711daea84a88b63dc0f6c/pywin32-311-cp311-cp311-win_arm64.whl", hash = "sha256:a733f1388e1a842abb67ffa8e7aad0e70ac519e09b0f6a784e65a136ec7cefd2", size = 8703930, upload-time = "2025-07-14T20:13:16.945Z" }, + { url = "https://files.pythonhosted.org/packages/e7/ab/01ea1943d4eba0f850c3c61e78e8dd59757ff815ff3ccd0a84de5f541f42/pywin32-311-cp312-cp312-win32.whl", hash = 
"sha256:750ec6e621af2b948540032557b10a2d43b0cee2ae9758c54154d711cc852d31", size = 8706543, upload-time = "2025-07-14T20:13:20.765Z" }, + { url = "https://files.pythonhosted.org/packages/d1/a8/a0e8d07d4d051ec7502cd58b291ec98dcc0c3fff027caad0470b72cfcc2f/pywin32-311-cp312-cp312-win_amd64.whl", hash = "sha256:b8c095edad5c211ff31c05223658e71bf7116daa0ecf3ad85f3201ea3190d067", size = 9495040, upload-time = "2025-07-14T20:13:22.543Z" }, + { url = "https://files.pythonhosted.org/packages/ba/3a/2ae996277b4b50f17d61f0603efd8253cb2d79cc7ae159468007b586396d/pywin32-311-cp312-cp312-win_arm64.whl", hash = "sha256:e286f46a9a39c4a18b319c28f59b61de793654af2f395c102b4f819e584b5852", size = 8710102, upload-time = "2025-07-14T20:13:24.682Z" }, + { url = "https://files.pythonhosted.org/packages/a5/be/3fd5de0979fcb3994bfee0d65ed8ca9506a8a1260651b86174f6a86f52b3/pywin32-311-cp313-cp313-win32.whl", hash = "sha256:f95ba5a847cba10dd8c4d8fefa9f2a6cf283b8b88ed6178fa8a6c1ab16054d0d", size = 8705700, upload-time = "2025-07-14T20:13:26.471Z" }, + { url = "https://files.pythonhosted.org/packages/e3/28/e0a1909523c6890208295a29e05c2adb2126364e289826c0a8bc7297bd5c/pywin32-311-cp313-cp313-win_amd64.whl", hash = "sha256:718a38f7e5b058e76aee1c56ddd06908116d35147e133427e59a3983f703a20d", size = 9494700, upload-time = "2025-07-14T20:13:28.243Z" }, + { url = "https://files.pythonhosted.org/packages/04/bf/90339ac0f55726dce7d794e6d79a18a91265bdf3aa70b6b9ca52f35e022a/pywin32-311-cp313-cp313-win_arm64.whl", hash = "sha256:7b4075d959648406202d92a2310cb990fea19b535c7f4a78d3f5e10b926eeb8a", size = 8709318, upload-time = "2025-07-14T20:13:30.348Z" }, + { url = "https://files.pythonhosted.org/packages/c9/31/097f2e132c4f16d99a22bfb777e0fd88bd8e1c634304e102f313af69ace5/pywin32-311-cp314-cp314-win32.whl", hash = "sha256:b7a2c10b93f8986666d0c803ee19b5990885872a7de910fc460f9b0c2fbf92ee", size = 8840714, upload-time = "2025-07-14T20:13:32.449Z" }, + { url = 
"https://files.pythonhosted.org/packages/90/4b/07c77d8ba0e01349358082713400435347df8426208171ce297da32c313d/pywin32-311-cp314-cp314-win_amd64.whl", hash = "sha256:3aca44c046bd2ed8c90de9cb8427f581c479e594e99b5c0bb19b29c10fd6cb87", size = 9656800, upload-time = "2025-07-14T20:13:34.312Z" }, + { url = "https://files.pythonhosted.org/packages/c0/d2/21af5c535501a7233e734b8af901574572da66fcc254cb35d0609c9080dd/pywin32-311-cp314-cp314-win_arm64.whl", hash = "sha256:a508e2d9025764a8270f93111a970e1d0fbfc33f4153b388bb649b7eec4f9b42", size = 8932540, upload-time = "2025-07-14T20:13:36.379Z" }, +] + +[[package]] +name = "pyyaml" +version = "6.0.2" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/54/ed/79a089b6be93607fa5cdaedf301d7dfb23af5f25c398d5ead2525b063e17/pyyaml-6.0.2.tar.gz", hash = "sha256:d584d9ec91ad65861cc08d42e834324ef890a082e591037abe114850ff7bbc3e", size = 130631, upload-time = "2024-08-06T20:33:50.674Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/9b/95/a3fac87cb7158e231b5a6012e438c647e1a87f09f8e0d123acec8ab8bf71/PyYAML-6.0.2-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:0a9a2848a5b7feac301353437eb7d5957887edbf81d56e903999a75a3d743086", size = 184199, upload-time = "2024-08-06T20:31:40.178Z" }, + { url = "https://files.pythonhosted.org/packages/c7/7a/68bd47624dab8fd4afbfd3c48e3b79efe09098ae941de5b58abcbadff5cb/PyYAML-6.0.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:29717114e51c84ddfba879543fb232a6ed60086602313ca38cce623c1d62cfbf", size = 171758, upload-time = "2024-08-06T20:31:42.173Z" }, + { url = "https://files.pythonhosted.org/packages/49/ee/14c54df452143b9ee9f0f29074d7ca5516a36edb0b4cc40c3f280131656f/PyYAML-6.0.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8824b5a04a04a047e72eea5cec3bc266db09e35de6bdfe34c9436ac5ee27d237", size = 718463, upload-time = "2024-08-06T20:31:44.263Z" }, + { url = 
"https://files.pythonhosted.org/packages/4d/61/de363a97476e766574650d742205be468921a7b532aa2499fcd886b62530/PyYAML-6.0.2-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:7c36280e6fb8385e520936c3cb3b8042851904eba0e58d277dca80a5cfed590b", size = 719280, upload-time = "2024-08-06T20:31:50.199Z" }, + { url = "https://files.pythonhosted.org/packages/6b/4e/1523cb902fd98355e2e9ea5e5eb237cbc5f3ad5f3075fa65087aa0ecb669/PyYAML-6.0.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ec031d5d2feb36d1d1a24380e4db6d43695f3748343d99434e6f5f9156aaa2ed", size = 751239, upload-time = "2024-08-06T20:31:52.292Z" }, + { url = "https://files.pythonhosted.org/packages/b7/33/5504b3a9a4464893c32f118a9cc045190a91637b119a9c881da1cf6b7a72/PyYAML-6.0.2-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:936d68689298c36b53b29f23c6dbb74de12b4ac12ca6cfe0e047bedceea56180", size = 695802, upload-time = "2024-08-06T20:31:53.836Z" }, + { url = "https://files.pythonhosted.org/packages/5c/20/8347dcabd41ef3a3cdc4f7b7a2aff3d06598c8779faa189cdbf878b626a4/PyYAML-6.0.2-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:23502f431948090f597378482b4812b0caae32c22213aecf3b55325e049a6c68", size = 720527, upload-time = "2024-08-06T20:31:55.565Z" }, + { url = "https://files.pythonhosted.org/packages/be/aa/5afe99233fb360d0ff37377145a949ae258aaab831bde4792b32650a4378/PyYAML-6.0.2-cp310-cp310-win32.whl", hash = "sha256:2e99c6826ffa974fe6e27cdb5ed0021786b03fc98e5ee3c5bfe1fd5015f42b99", size = 144052, upload-time = "2024-08-06T20:31:56.914Z" }, + { url = "https://files.pythonhosted.org/packages/b5/84/0fa4b06f6d6c958d207620fc60005e241ecedceee58931bb20138e1e5776/PyYAML-6.0.2-cp310-cp310-win_amd64.whl", hash = "sha256:a4d3091415f010369ae4ed1fc6b79def9416358877534caf6a0fdd2146c87a3e", size = 161774, upload-time = "2024-08-06T20:31:58.304Z" }, + { url = 
"https://files.pythonhosted.org/packages/f8/aa/7af4e81f7acba21a4c6be026da38fd2b872ca46226673c89a758ebdc4fd2/PyYAML-6.0.2-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:cc1c1159b3d456576af7a3e4d1ba7e6924cb39de8f67111c735f6fc832082774", size = 184612, upload-time = "2024-08-06T20:32:03.408Z" }, + { url = "https://files.pythonhosted.org/packages/8b/62/b9faa998fd185f65c1371643678e4d58254add437edb764a08c5a98fb986/PyYAML-6.0.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:1e2120ef853f59c7419231f3bf4e7021f1b936f6ebd222406c3b60212205d2ee", size = 172040, upload-time = "2024-08-06T20:32:04.926Z" }, + { url = "https://files.pythonhosted.org/packages/ad/0c/c804f5f922a9a6563bab712d8dcc70251e8af811fce4524d57c2c0fd49a4/PyYAML-6.0.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5d225db5a45f21e78dd9358e58a98702a0302f2659a3c6cd320564b75b86f47c", size = 736829, upload-time = "2024-08-06T20:32:06.459Z" }, + { url = "https://files.pythonhosted.org/packages/51/16/6af8d6a6b210c8e54f1406a6b9481febf9c64a3109c541567e35a49aa2e7/PyYAML-6.0.2-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:5ac9328ec4831237bec75defaf839f7d4564be1e6b25ac710bd1a96321cc8317", size = 764167, upload-time = "2024-08-06T20:32:08.338Z" }, + { url = "https://files.pythonhosted.org/packages/75/e4/2c27590dfc9992f73aabbeb9241ae20220bd9452df27483b6e56d3975cc5/PyYAML-6.0.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3ad2a3decf9aaba3d29c8f537ac4b243e36bef957511b4766cb0057d32b0be85", size = 762952, upload-time = "2024-08-06T20:32:14.124Z" }, + { url = "https://files.pythonhosted.org/packages/9b/97/ecc1abf4a823f5ac61941a9c00fe501b02ac3ab0e373c3857f7d4b83e2b6/PyYAML-6.0.2-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:ff3824dc5261f50c9b0dfb3be22b4567a6f938ccce4587b38952d85fd9e9afe4", size = 735301, upload-time = "2024-08-06T20:32:16.17Z" }, + { url = 
"https://files.pythonhosted.org/packages/45/73/0f49dacd6e82c9430e46f4a027baa4ca205e8b0a9dce1397f44edc23559d/PyYAML-6.0.2-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:797b4f722ffa07cc8d62053e4cff1486fa6dc094105d13fea7b1de7d8bf71c9e", size = 756638, upload-time = "2024-08-06T20:32:18.555Z" }, + { url = "https://files.pythonhosted.org/packages/22/5f/956f0f9fc65223a58fbc14459bf34b4cc48dec52e00535c79b8db361aabd/PyYAML-6.0.2-cp311-cp311-win32.whl", hash = "sha256:11d8f3dd2b9c1207dcaf2ee0bbbfd5991f571186ec9cc78427ba5bd32afae4b5", size = 143850, upload-time = "2024-08-06T20:32:19.889Z" }, + { url = "https://files.pythonhosted.org/packages/ed/23/8da0bbe2ab9dcdd11f4f4557ccaf95c10b9811b13ecced089d43ce59c3c8/PyYAML-6.0.2-cp311-cp311-win_amd64.whl", hash = "sha256:e10ce637b18caea04431ce14fabcf5c64a1c61ec9c56b071a4b7ca131ca52d44", size = 161980, upload-time = "2024-08-06T20:32:21.273Z" }, + { url = "https://files.pythonhosted.org/packages/86/0c/c581167fc46d6d6d7ddcfb8c843a4de25bdd27e4466938109ca68492292c/PyYAML-6.0.2-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:c70c95198c015b85feafc136515252a261a84561b7b1d51e3384e0655ddf25ab", size = 183873, upload-time = "2024-08-06T20:32:25.131Z" }, + { url = "https://files.pythonhosted.org/packages/a8/0c/38374f5bb272c051e2a69281d71cba6fdb983413e6758b84482905e29a5d/PyYAML-6.0.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:ce826d6ef20b1bc864f0a68340c8b3287705cae2f8b4b1d932177dcc76721725", size = 173302, upload-time = "2024-08-06T20:32:26.511Z" }, + { url = "https://files.pythonhosted.org/packages/c3/93/9916574aa8c00aa06bbac729972eb1071d002b8e158bd0e83a3b9a20a1f7/PyYAML-6.0.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1f71ea527786de97d1a0cc0eacd1defc0985dcf6b3f17bb77dcfc8c34bec4dc5", size = 739154, upload-time = "2024-08-06T20:32:28.363Z" }, + { url = 
"https://files.pythonhosted.org/packages/95/0f/b8938f1cbd09739c6da569d172531567dbcc9789e0029aa070856f123984/PyYAML-6.0.2-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:9b22676e8097e9e22e36d6b7bda33190d0d400f345f23d4065d48f4ca7ae0425", size = 766223, upload-time = "2024-08-06T20:32:30.058Z" }, + { url = "https://files.pythonhosted.org/packages/b9/2b/614b4752f2e127db5cc206abc23a8c19678e92b23c3db30fc86ab731d3bd/PyYAML-6.0.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:80bab7bfc629882493af4aa31a4cfa43a4c57c83813253626916b8c7ada83476", size = 767542, upload-time = "2024-08-06T20:32:31.881Z" }, + { url = "https://files.pythonhosted.org/packages/d4/00/dd137d5bcc7efea1836d6264f049359861cf548469d18da90cd8216cf05f/PyYAML-6.0.2-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:0833f8694549e586547b576dcfaba4a6b55b9e96098b36cdc7ebefe667dfed48", size = 731164, upload-time = "2024-08-06T20:32:37.083Z" }, + { url = "https://files.pythonhosted.org/packages/c9/1f/4f998c900485e5c0ef43838363ba4a9723ac0ad73a9dc42068b12aaba4e4/PyYAML-6.0.2-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:8b9c7197f7cb2738065c481a0461e50ad02f18c78cd75775628afb4d7137fb3b", size = 756611, upload-time = "2024-08-06T20:32:38.898Z" }, + { url = "https://files.pythonhosted.org/packages/df/d1/f5a275fdb252768b7a11ec63585bc38d0e87c9e05668a139fea92b80634c/PyYAML-6.0.2-cp312-cp312-win32.whl", hash = "sha256:ef6107725bd54b262d6dedcc2af448a266975032bc85ef0172c5f059da6325b4", size = 140591, upload-time = "2024-08-06T20:32:40.241Z" }, + { url = "https://files.pythonhosted.org/packages/0c/e8/4f648c598b17c3d06e8753d7d13d57542b30d56e6c2dedf9c331ae56312e/PyYAML-6.0.2-cp312-cp312-win_amd64.whl", hash = "sha256:7e7401d0de89a9a855c839bc697c079a4af81cf878373abd7dc625847d25cbd8", size = 156338, upload-time = "2024-08-06T20:32:41.93Z" }, + { url = 
"https://files.pythonhosted.org/packages/ef/e3/3af305b830494fa85d95f6d95ef7fa73f2ee1cc8ef5b495c7c3269fb835f/PyYAML-6.0.2-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:efdca5630322a10774e8e98e1af481aad470dd62c3170801852d752aa7a783ba", size = 181309, upload-time = "2024-08-06T20:32:43.4Z" }, + { url = "https://files.pythonhosted.org/packages/45/9f/3b1c20a0b7a3200524eb0076cc027a970d320bd3a6592873c85c92a08731/PyYAML-6.0.2-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:50187695423ffe49e2deacb8cd10510bc361faac997de9efef88badc3bb9e2d1", size = 171679, upload-time = "2024-08-06T20:32:44.801Z" }, + { url = "https://files.pythonhosted.org/packages/7c/9a/337322f27005c33bcb656c655fa78325b730324c78620e8328ae28b64d0c/PyYAML-6.0.2-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0ffe8360bab4910ef1b9e87fb812d8bc0a308b0d0eef8c8f44e0254ab3b07133", size = 733428, upload-time = "2024-08-06T20:32:46.432Z" }, + { url = "https://files.pythonhosted.org/packages/a3/69/864fbe19e6c18ea3cc196cbe5d392175b4cf3d5d0ac1403ec3f2d237ebb5/PyYAML-6.0.2-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:17e311b6c678207928d649faa7cb0d7b4c26a0ba73d41e99c4fff6b6c3276484", size = 763361, upload-time = "2024-08-06T20:32:51.188Z" }, + { url = "https://files.pythonhosted.org/packages/04/24/b7721e4845c2f162d26f50521b825fb061bc0a5afcf9a386840f23ea19fa/PyYAML-6.0.2-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:70b189594dbe54f75ab3a1acec5f1e3faa7e8cf2f1e08d9b561cb41b845f69d5", size = 759523, upload-time = "2024-08-06T20:32:53.019Z" }, + { url = "https://files.pythonhosted.org/packages/2b/b2/e3234f59ba06559c6ff63c4e10baea10e5e7df868092bf9ab40e5b9c56b6/PyYAML-6.0.2-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:41e4e3953a79407c794916fa277a82531dd93aad34e29c2a514c2c0c5fe971cc", size = 726660, upload-time = "2024-08-06T20:32:54.708Z" }, + { url = 
"https://files.pythonhosted.org/packages/fe/0f/25911a9f080464c59fab9027482f822b86bf0608957a5fcc6eaac85aa515/PyYAML-6.0.2-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:68ccc6023a3400877818152ad9a1033e3db8625d899c72eacb5a668902e4d652", size = 751597, upload-time = "2024-08-06T20:32:56.985Z" }, + { url = "https://files.pythonhosted.org/packages/14/0d/e2c3b43bbce3cf6bd97c840b46088a3031085179e596d4929729d8d68270/PyYAML-6.0.2-cp313-cp313-win32.whl", hash = "sha256:bc2fa7c6b47d6bc618dd7fb02ef6fdedb1090ec036abab80d4681424b84c1183", size = 140527, upload-time = "2024-08-06T20:33:03.001Z" }, + { url = "https://files.pythonhosted.org/packages/fa/de/02b54f42487e3d3c6efb3f89428677074ca7bf43aae402517bc7cca949f3/PyYAML-6.0.2-cp313-cp313-win_amd64.whl", hash = "sha256:8388ee1976c416731879ac16da0aff3f63b286ffdd57cdeb95f3f2e085687563", size = 156446, upload-time = "2024-08-06T20:33:04.33Z" }, +] + +[[package]] +name = "questionary" +version = "2.1.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "prompt-toolkit" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/a8/b8/d16eb579277f3de9e56e5ad25280fab52fc5774117fb70362e8c2e016559/questionary-2.1.0.tar.gz", hash = "sha256:6302cdd645b19667d8f6e6634774e9538bfcd1aad9be287e743d96cacaf95587", size = 26775, upload-time = "2024-12-29T11:49:17.802Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/ad/3f/11dd4cd4f39e05128bfd20138faea57bec56f9ffba6185d276e3107ba5b2/questionary-2.1.0-py3-none-any.whl", hash = "sha256:44174d237b68bc828e4878c763a9ad6790ee61990e0ae72927694ead57bab8ec", size = 36747, upload-time = "2024-12-29T11:49:16.734Z" }, +] + +[[package]] +name = "referencing" +version = "0.36.2" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "attrs" }, + { name = "rpds-py" }, + { name = "typing-extensions", marker = "python_full_version < '3.13'" }, +] +sdist = { url = 
"https://files.pythonhosted.org/packages/2f/db/98b5c277be99dd18bfd91dd04e1b759cad18d1a338188c936e92f921c7e2/referencing-0.36.2.tar.gz", hash = "sha256:df2e89862cd09deabbdba16944cc3f10feb6b3e6f18e902f7cc25609a34775aa", size = 74744, upload-time = "2025-01-25T08:48:16.138Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/c1/b1/3baf80dc6d2b7bc27a95a67752d0208e410351e3feb4eb78de5f77454d8d/referencing-0.36.2-py3-none-any.whl", hash = "sha256:e8699adbbf8b5c7de96d8ffa0eb5c158b3beafce084968e2ea8bb08c6794dcd0", size = 26775, upload-time = "2025-01-25T08:48:14.241Z" }, +] + +[[package]] +name = "requests" +version = "2.32.4" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "certifi" }, + { name = "charset-normalizer" }, + { name = "idna" }, + { name = "urllib3" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/e1/0a/929373653770d8a0d7ea76c37de6e41f11eb07559b103b1c02cafb3f7cf8/requests-2.32.4.tar.gz", hash = "sha256:27d0316682c8a29834d3264820024b62a36942083d52caf2f14c0591336d3422", size = 135258, upload-time = "2025-06-09T16:43:07.34Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/7c/e4/56027c4a6b4ae70ca9de302488c5ca95ad4a39e190093d6c1a8ace08341b/requests-2.32.4-py3-none-any.whl", hash = "sha256:27babd3cda2a6d50b30443204ee89830707d396671944c998b5975b031ac2b2c", size = 64847, upload-time = "2025-06-09T16:43:05.728Z" }, +] + +[[package]] +name = "requests-auth-aws-sigv4" +version = "0.7" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "requests" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/b3/bc/f695cd7d54327f925e22293d5b71b312dcdee0d8e720defc7a7a5f16a5ae/requests-auth-aws-sigv4-0.7.tar.gz", hash = "sha256:3d2a475cccbf85d4c93b8bd052d072e5c3f8e77022fd621b69a5b11ac2c139c8", size = 8128, upload-time = "2021-02-16T21:31:04.325Z" } +wheels = [ + { url = 
"https://files.pythonhosted.org/packages/c8/cd/112ece576115a8afa62faf3c10d13fb8c72233197926f3ea6321cc2cc44e/requests_auth_aws_sigv4-0.7-py3-none-any.whl", hash = "sha256:1f6c7f63a0696a8f131a2ff21a544380f43c11f54d72600f6f2a1d402bd41d41", size = 12075, upload-time = "2021-02-16T21:31:03.444Z" }, +] + +[[package]] +name = "rich" +version = "14.0.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "markdown-it-py" }, + { name = "pygments" }, + { name = "typing-extensions", marker = "python_full_version < '3.11'" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/a1/53/830aa4c3066a8ab0ae9a9955976fb770fe9c6102117c8ec4ab3ea62d89e8/rich-14.0.0.tar.gz", hash = "sha256:82f1bc23a6a21ebca4ae0c45af9bdbc492ed20231dcb63f297d6d1021a9d5725", size = 224078, upload-time = "2025-03-30T14:15:14.23Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/0d/9b/63f4c7ebc259242c89b3acafdb37b41d1185c07ff0011164674e9076b491/rich-14.0.0-py3-none-any.whl", hash = "sha256:1c9491e1951aac09caffd42f448ee3d04e58923ffe14993f6e83068dc395d7e0", size = 243229, upload-time = "2025-03-30T14:15:12.283Z" }, +] + +[[package]] +name = "rpds-py" +version = "0.26.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/a5/aa/4456d84bbb54adc6a916fb10c9b374f78ac840337644e4a5eda229c81275/rpds_py-0.26.0.tar.gz", hash = "sha256:20dae58a859b0906f0685642e591056f1e787f3a8b39c8e8749a45dc7d26bdb0", size = 27385, upload-time = "2025-07-01T15:57:13.958Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/b9/31/1459645f036c3dfeacef89e8e5825e430c77dde8489f3b99eaafcd4a60f5/rpds_py-0.26.0-cp310-cp310-macosx_10_12_x86_64.whl", hash = "sha256:4c70c70f9169692b36307a95f3d8c0a9fcd79f7b4a383aad5eaa0e9718b79b37", size = 372466, upload-time = "2025-07-01T15:53:40.55Z" }, + { url = 
"https://files.pythonhosted.org/packages/dd/ff/3d0727f35836cc8773d3eeb9a46c40cc405854e36a8d2e951f3a8391c976/rpds_py-0.26.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:777c62479d12395bfb932944e61e915741e364c843afc3196b694db3d669fcd0", size = 357825, upload-time = "2025-07-01T15:53:42.247Z" }, + { url = "https://files.pythonhosted.org/packages/bf/ce/badc5e06120a54099ae287fa96d82cbb650a5f85cf247ffe19c7b157fd1f/rpds_py-0.26.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ec671691e72dff75817386aa02d81e708b5a7ec0dec6669ec05213ff6b77e1bd", size = 381530, upload-time = "2025-07-01T15:53:43.585Z" }, + { url = "https://files.pythonhosted.org/packages/1e/a5/fa5d96a66c95d06c62d7a30707b6a4cfec696ab8ae280ee7be14e961e118/rpds_py-0.26.0-cp310-cp310-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:6a1cb5d6ce81379401bbb7f6dbe3d56de537fb8235979843f0d53bc2e9815a79", size = 396933, upload-time = "2025-07-01T15:53:45.78Z" }, + { url = "https://files.pythonhosted.org/packages/00/a7/7049d66750f18605c591a9db47d4a059e112a0c9ff8de8daf8fa0f446bba/rpds_py-0.26.0-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:4f789e32fa1fb6a7bf890e0124e7b42d1e60d28ebff57fe806719abb75f0e9a3", size = 513973, upload-time = "2025-07-01T15:53:47.085Z" }, + { url = "https://files.pythonhosted.org/packages/0e/f1/528d02c7d6b29d29fac8fd784b354d3571cc2153f33f842599ef0cf20dd2/rpds_py-0.26.0-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:9c55b0a669976cf258afd718de3d9ad1b7d1fe0a91cd1ab36f38b03d4d4aeaaf", size = 402293, upload-time = "2025-07-01T15:53:48.117Z" }, + { url = "https://files.pythonhosted.org/packages/15/93/fde36cd6e4685df2cd08508f6c45a841e82f5bb98c8d5ecf05649522acb5/rpds_py-0.26.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c70d9ec912802ecfd6cd390dadb34a9578b04f9bcb8e863d0a7598ba5e9e7ccc", size = 383787, upload-time = "2025-07-01T15:53:50.874Z" }, + { url = 
"https://files.pythonhosted.org/packages/69/f2/5007553aaba1dcae5d663143683c3dfd03d9395289f495f0aebc93e90f24/rpds_py-0.26.0-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:3021933c2cb7def39d927b9862292e0f4c75a13d7de70eb0ab06efed4c508c19", size = 416312, upload-time = "2025-07-01T15:53:52.046Z" }, + { url = "https://files.pythonhosted.org/packages/8f/a7/ce52c75c1e624a79e48a69e611f1c08844564e44c85db2b6f711d76d10ce/rpds_py-0.26.0-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:8a7898b6ca3b7d6659e55cdac825a2e58c638cbf335cde41f4619e290dd0ad11", size = 558403, upload-time = "2025-07-01T15:53:53.192Z" }, + { url = "https://files.pythonhosted.org/packages/79/d5/e119db99341cc75b538bf4cb80504129fa22ce216672fb2c28e4a101f4d9/rpds_py-0.26.0-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:12bff2ad9447188377f1b2794772f91fe68bb4bbfa5a39d7941fbebdbf8c500f", size = 588323, upload-time = "2025-07-01T15:53:54.336Z" }, + { url = "https://files.pythonhosted.org/packages/93/94/d28272a0b02f5fe24c78c20e13bbcb95f03dc1451b68e7830ca040c60bd6/rpds_py-0.26.0-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:191aa858f7d4902e975d4cf2f2d9243816c91e9605070aeb09c0a800d187e323", size = 554541, upload-time = "2025-07-01T15:53:55.469Z" }, + { url = "https://files.pythonhosted.org/packages/93/e0/8c41166602f1b791da892d976057eba30685486d2e2c061ce234679c922b/rpds_py-0.26.0-cp310-cp310-win32.whl", hash = "sha256:b37a04d9f52cb76b6b78f35109b513f6519efb481d8ca4c321f6a3b9580b3f45", size = 220442, upload-time = "2025-07-01T15:53:56.524Z" }, + { url = "https://files.pythonhosted.org/packages/87/f0/509736bb752a7ab50fb0270c2a4134d671a7b3038030837e5536c3de0e0b/rpds_py-0.26.0-cp310-cp310-win_amd64.whl", hash = "sha256:38721d4c9edd3eb6670437d8d5e2070063f305bfa2d5aa4278c51cedcd508a84", size = 231314, upload-time = "2025-07-01T15:53:57.842Z" }, + { url = 
"https://files.pythonhosted.org/packages/09/4c/4ee8f7e512030ff79fda1df3243c88d70fc874634e2dbe5df13ba4210078/rpds_py-0.26.0-cp311-cp311-macosx_10_12_x86_64.whl", hash = "sha256:9e8cb77286025bdb21be2941d64ac6ca016130bfdcd228739e8ab137eb4406ed", size = 372610, upload-time = "2025-07-01T15:53:58.844Z" }, + { url = "https://files.pythonhosted.org/packages/fa/9d/3dc16be00f14fc1f03c71b1d67c8df98263ab2710a2fbd65a6193214a527/rpds_py-0.26.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:5e09330b21d98adc8ccb2dbb9fc6cb434e8908d4c119aeaa772cb1caab5440a0", size = 358032, upload-time = "2025-07-01T15:53:59.985Z" }, + { url = "https://files.pythonhosted.org/packages/e7/5a/7f1bf8f045da2866324a08ae80af63e64e7bfaf83bd31f865a7b91a58601/rpds_py-0.26.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:2c9c1b92b774b2e68d11193dc39620d62fd8ab33f0a3c77ecdabe19c179cdbc1", size = 381525, upload-time = "2025-07-01T15:54:01.162Z" }, + { url = "https://files.pythonhosted.org/packages/45/8a/04479398c755a066ace10e3d158866beb600867cacae194c50ffa783abd0/rpds_py-0.26.0-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:824e6d3503ab990d7090768e4dfd9e840837bae057f212ff9f4f05ec6d1975e7", size = 397089, upload-time = "2025-07-01T15:54:02.319Z" }, + { url = "https://files.pythonhosted.org/packages/72/88/9203f47268db488a1b6d469d69c12201ede776bb728b9d9f29dbfd7df406/rpds_py-0.26.0-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:8ad7fd2258228bf288f2331f0a6148ad0186b2e3643055ed0db30990e59817a6", size = 514255, upload-time = "2025-07-01T15:54:03.38Z" }, + { url = "https://files.pythonhosted.org/packages/f5/b4/01ce5d1e853ddf81fbbd4311ab1eff0b3cf162d559288d10fd127e2588b5/rpds_py-0.26.0-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:0dc23bbb3e06ec1ea72d515fb572c1fea59695aefbffb106501138762e1e915e", size = 402283, upload-time = "2025-07-01T15:54:04.923Z" }, + { url = 
"https://files.pythonhosted.org/packages/34/a2/004c99936997bfc644d590a9defd9e9c93f8286568f9c16cdaf3e14429a7/rpds_py-0.26.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:d80bf832ac7b1920ee29a426cdca335f96a2b5caa839811803e999b41ba9030d", size = 383881, upload-time = "2025-07-01T15:54:06.482Z" }, + { url = "https://files.pythonhosted.org/packages/05/1b/ef5fba4a8f81ce04c427bfd96223f92f05e6cd72291ce9d7523db3b03a6c/rpds_py-0.26.0-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:0919f38f5542c0a87e7b4afcafab6fd2c15386632d249e9a087498571250abe3", size = 415822, upload-time = "2025-07-01T15:54:07.605Z" }, + { url = "https://files.pythonhosted.org/packages/16/80/5c54195aec456b292f7bd8aa61741c8232964063fd8a75fdde9c1e982328/rpds_py-0.26.0-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:d422b945683e409000c888e384546dbab9009bb92f7c0b456e217988cf316107", size = 558347, upload-time = "2025-07-01T15:54:08.591Z" }, + { url = "https://files.pythonhosted.org/packages/f2/1c/1845c1b1fd6d827187c43afe1841d91678d7241cbdb5420a4c6de180a538/rpds_py-0.26.0-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:77a7711fa562ba2da1aa757e11024ad6d93bad6ad7ede5afb9af144623e5f76a", size = 587956, upload-time = "2025-07-01T15:54:09.963Z" }, + { url = "https://files.pythonhosted.org/packages/2e/ff/9e979329dd131aa73a438c077252ddabd7df6d1a7ad7b9aacf6261f10faa/rpds_py-0.26.0-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:238e8c8610cb7c29460e37184f6799547f7e09e6a9bdbdab4e8edb90986a2318", size = 554363, upload-time = "2025-07-01T15:54:11.073Z" }, + { url = "https://files.pythonhosted.org/packages/00/8b/d78cfe034b71ffbe72873a136e71acc7a831a03e37771cfe59f33f6de8a2/rpds_py-0.26.0-cp311-cp311-win32.whl", hash = "sha256:893b022bfbdf26d7bedb083efeea624e8550ca6eb98bf7fea30211ce95b9201a", size = 220123, upload-time = "2025-07-01T15:54:12.382Z" }, + { url = 
"https://files.pythonhosted.org/packages/94/c1/3c8c94c7dd3905dbfde768381ce98778500a80db9924731d87ddcdb117e9/rpds_py-0.26.0-cp311-cp311-win_amd64.whl", hash = "sha256:87a5531de9f71aceb8af041d72fc4cab4943648d91875ed56d2e629bef6d4c03", size = 231732, upload-time = "2025-07-01T15:54:13.434Z" }, + { url = "https://files.pythonhosted.org/packages/67/93/e936fbed1b734eabf36ccb5d93c6a2e9246fbb13c1da011624b7286fae3e/rpds_py-0.26.0-cp311-cp311-win_arm64.whl", hash = "sha256:de2713f48c1ad57f89ac25b3cb7daed2156d8e822cf0eca9b96a6f990718cc41", size = 221917, upload-time = "2025-07-01T15:54:14.559Z" }, + { url = "https://files.pythonhosted.org/packages/ea/86/90eb87c6f87085868bd077c7a9938006eb1ce19ed4d06944a90d3560fce2/rpds_py-0.26.0-cp312-cp312-macosx_10_12_x86_64.whl", hash = "sha256:894514d47e012e794f1350f076c427d2347ebf82f9b958d554d12819849a369d", size = 363933, upload-time = "2025-07-01T15:54:15.734Z" }, + { url = "https://files.pythonhosted.org/packages/63/78/4469f24d34636242c924626082b9586f064ada0b5dbb1e9d096ee7a8e0c6/rpds_py-0.26.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:fc921b96fa95a097add244da36a1d9e4f3039160d1d30f1b35837bf108c21136", size = 350447, upload-time = "2025-07-01T15:54:16.922Z" }, + { url = "https://files.pythonhosted.org/packages/ad/91/c448ed45efdfdade82348d5e7995e15612754826ea640afc20915119734f/rpds_py-0.26.0-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:3e1157659470aa42a75448b6e943c895be8c70531c43cb78b9ba990778955582", size = 384711, upload-time = "2025-07-01T15:54:18.101Z" }, + { url = "https://files.pythonhosted.org/packages/ec/43/e5c86fef4be7f49828bdd4ecc8931f0287b1152c0bb0163049b3218740e7/rpds_py-0.26.0-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:521ccf56f45bb3a791182dc6b88ae5f8fa079dd705ee42138c76deb1238e554e", size = 400865, upload-time = "2025-07-01T15:54:19.295Z" }, + { url = 
"https://files.pythonhosted.org/packages/55/34/e00f726a4d44f22d5c5fe2e5ddd3ac3d7fd3f74a175607781fbdd06fe375/rpds_py-0.26.0-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:9def736773fd56b305c0eef698be5192c77bfa30d55a0e5885f80126c4831a15", size = 517763, upload-time = "2025-07-01T15:54:20.858Z" }, + { url = "https://files.pythonhosted.org/packages/52/1c/52dc20c31b147af724b16104500fba13e60123ea0334beba7b40e33354b4/rpds_py-0.26.0-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:cdad4ea3b4513b475e027be79e5a0ceac8ee1c113a1a11e5edc3c30c29f964d8", size = 406651, upload-time = "2025-07-01T15:54:22.508Z" }, + { url = "https://files.pythonhosted.org/packages/2e/77/87d7bfabfc4e821caa35481a2ff6ae0b73e6a391bb6b343db2c91c2b9844/rpds_py-0.26.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:82b165b07f416bdccf5c84546a484cc8f15137ca38325403864bfdf2b5b72f6a", size = 386079, upload-time = "2025-07-01T15:54:23.987Z" }, + { url = "https://files.pythonhosted.org/packages/e3/d4/7f2200c2d3ee145b65b3cddc4310d51f7da6a26634f3ac87125fd789152a/rpds_py-0.26.0-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:d04cab0a54b9dba4d278fe955a1390da3cf71f57feb78ddc7cb67cbe0bd30323", size = 421379, upload-time = "2025-07-01T15:54:25.073Z" }, + { url = "https://files.pythonhosted.org/packages/ae/13/9fdd428b9c820869924ab62236b8688b122baa22d23efdd1c566938a39ba/rpds_py-0.26.0-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:79061ba1a11b6a12743a2b0f72a46aa2758613d454aa6ba4f5a265cc48850158", size = 562033, upload-time = "2025-07-01T15:54:26.225Z" }, + { url = "https://files.pythonhosted.org/packages/f3/e1/b69686c3bcbe775abac3a4c1c30a164a2076d28df7926041f6c0eb5e8d28/rpds_py-0.26.0-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:f405c93675d8d4c5ac87364bb38d06c988e11028a64b52a47158a355079661f3", size = 591639, upload-time = "2025-07-01T15:54:27.424Z" }, + { url = 
"https://files.pythonhosted.org/packages/5c/c9/1e3d8c8863c84a90197ac577bbc3d796a92502124c27092413426f670990/rpds_py-0.26.0-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:dafd4c44b74aa4bed4b250f1aed165b8ef5de743bcca3b88fc9619b6087093d2", size = 557105, upload-time = "2025-07-01T15:54:29.93Z" }, + { url = "https://files.pythonhosted.org/packages/9f/c5/90c569649057622959f6dcc40f7b516539608a414dfd54b8d77e3b201ac0/rpds_py-0.26.0-cp312-cp312-win32.whl", hash = "sha256:3da5852aad63fa0c6f836f3359647870e21ea96cf433eb393ffa45263a170d44", size = 223272, upload-time = "2025-07-01T15:54:31.128Z" }, + { url = "https://files.pythonhosted.org/packages/7d/16/19f5d9f2a556cfed454eebe4d354c38d51c20f3db69e7b4ce6cff904905d/rpds_py-0.26.0-cp312-cp312-win_amd64.whl", hash = "sha256:cf47cfdabc2194a669dcf7a8dbba62e37a04c5041d2125fae0233b720da6f05c", size = 234995, upload-time = "2025-07-01T15:54:32.195Z" }, + { url = "https://files.pythonhosted.org/packages/83/f0/7935e40b529c0e752dfaa7880224771b51175fce08b41ab4a92eb2fbdc7f/rpds_py-0.26.0-cp312-cp312-win_arm64.whl", hash = "sha256:20ab1ae4fa534f73647aad289003f1104092890849e0266271351922ed5574f8", size = 223198, upload-time = "2025-07-01T15:54:33.271Z" }, + { url = "https://files.pythonhosted.org/packages/6a/67/bb62d0109493b12b1c6ab00de7a5566aa84c0e44217c2d94bee1bd370da9/rpds_py-0.26.0-cp313-cp313-macosx_10_12_x86_64.whl", hash = "sha256:696764a5be111b036256c0b18cd29783fab22154690fc698062fc1b0084b511d", size = 363917, upload-time = "2025-07-01T15:54:34.755Z" }, + { url = "https://files.pythonhosted.org/packages/4b/f3/34e6ae1925a5706c0f002a8d2d7f172373b855768149796af87bd65dcdb9/rpds_py-0.26.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:1e6c15d2080a63aaed876e228efe4f814bc7889c63b1e112ad46fdc8b368b9e1", size = 350073, upload-time = "2025-07-01T15:54:36.292Z" }, + { url = 
"https://files.pythonhosted.org/packages/75/83/1953a9d4f4e4de7fd0533733e041c28135f3c21485faaef56a8aadbd96b5/rpds_py-0.26.0-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:390e3170babf42462739a93321e657444f0862c6d722a291accc46f9d21ed04e", size = 384214, upload-time = "2025-07-01T15:54:37.469Z" }, + { url = "https://files.pythonhosted.org/packages/48/0e/983ed1b792b3322ea1d065e67f4b230f3b96025f5ce3878cc40af09b7533/rpds_py-0.26.0-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:7da84c2c74c0f5bc97d853d9e17bb83e2dcafcff0dc48286916001cc114379a1", size = 400113, upload-time = "2025-07-01T15:54:38.954Z" }, + { url = "https://files.pythonhosted.org/packages/69/7f/36c0925fff6f660a80be259c5b4f5e53a16851f946eb080351d057698528/rpds_py-0.26.0-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:4c5fe114a6dd480a510b6d3661d09d67d1622c4bf20660a474507aaee7eeeee9", size = 515189, upload-time = "2025-07-01T15:54:40.57Z" }, + { url = "https://files.pythonhosted.org/packages/13/45/cbf07fc03ba7a9b54662c9badb58294ecfb24f828b9732970bd1a431ed5c/rpds_py-0.26.0-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:3100b3090269f3a7ea727b06a6080d4eb7439dca4c0e91a07c5d133bb1727ea7", size = 406998, upload-time = "2025-07-01T15:54:43.025Z" }, + { url = "https://files.pythonhosted.org/packages/6c/b0/8fa5e36e58657997873fd6a1cf621285ca822ca75b4b3434ead047daa307/rpds_py-0.26.0-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2c03c9b0c64afd0320ae57de4c982801271c0c211aa2d37f3003ff5feb75bb04", size = 385903, upload-time = "2025-07-01T15:54:44.752Z" }, + { url = "https://files.pythonhosted.org/packages/4b/f7/b25437772f9f57d7a9fbd73ed86d0dcd76b4c7c6998348c070d90f23e315/rpds_py-0.26.0-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:5963b72ccd199ade6ee493723d18a3f21ba7d5b957017607f815788cef50eaf1", size = 419785, upload-time = "2025-07-01T15:54:46.043Z" }, + { 
url = "https://files.pythonhosted.org/packages/a7/6b/63ffa55743dfcb4baf2e9e77a0b11f7f97ed96a54558fcb5717a4b2cd732/rpds_py-0.26.0-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:9da4e873860ad5bab3291438525cae80169daecbfafe5657f7f5fb4d6b3f96b9", size = 561329, upload-time = "2025-07-01T15:54:47.64Z" }, + { url = "https://files.pythonhosted.org/packages/2f/07/1f4f5e2886c480a2346b1e6759c00278b8a69e697ae952d82ae2e6ee5db0/rpds_py-0.26.0-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:5afaddaa8e8c7f1f7b4c5c725c0070b6eed0228f705b90a1732a48e84350f4e9", size = 590875, upload-time = "2025-07-01T15:54:48.9Z" }, + { url = "https://files.pythonhosted.org/packages/cc/bc/e6639f1b91c3a55f8c41b47d73e6307051b6e246254a827ede730624c0f8/rpds_py-0.26.0-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:4916dc96489616a6f9667e7526af8fa693c0fdb4f3acb0e5d9f4400eb06a47ba", size = 556636, upload-time = "2025-07-01T15:54:50.619Z" }, + { url = "https://files.pythonhosted.org/packages/05/4c/b3917c45566f9f9a209d38d9b54a1833f2bb1032a3e04c66f75726f28876/rpds_py-0.26.0-cp313-cp313-win32.whl", hash = "sha256:2a343f91b17097c546b93f7999976fd6c9d5900617aa848c81d794e062ab302b", size = 222663, upload-time = "2025-07-01T15:54:52.023Z" }, + { url = "https://files.pythonhosted.org/packages/e0/0b/0851bdd6025775aaa2365bb8de0697ee2558184c800bfef8d7aef5ccde58/rpds_py-0.26.0-cp313-cp313-win_amd64.whl", hash = "sha256:0a0b60701f2300c81b2ac88a5fb893ccfa408e1c4a555a77f908a2596eb875a5", size = 234428, upload-time = "2025-07-01T15:54:53.692Z" }, + { url = "https://files.pythonhosted.org/packages/ed/e8/a47c64ed53149c75fb581e14a237b7b7cd18217e969c30d474d335105622/rpds_py-0.26.0-cp313-cp313-win_arm64.whl", hash = "sha256:257d011919f133a4746958257f2c75238e3ff54255acd5e3e11f3ff41fd14256", size = 222571, upload-time = "2025-07-01T15:54:54.822Z" }, + { url = 
"https://files.pythonhosted.org/packages/89/bf/3d970ba2e2bcd17d2912cb42874107390f72873e38e79267224110de5e61/rpds_py-0.26.0-cp313-cp313t-macosx_10_12_x86_64.whl", hash = "sha256:529c8156d7506fba5740e05da8795688f87119cce330c244519cf706a4a3d618", size = 360475, upload-time = "2025-07-01T15:54:56.228Z" }, + { url = "https://files.pythonhosted.org/packages/82/9f/283e7e2979fc4ec2d8ecee506d5a3675fce5ed9b4b7cb387ea5d37c2f18d/rpds_py-0.26.0-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:f53ec51f9d24e9638a40cabb95078ade8c99251945dad8d57bf4aabe86ecee35", size = 346692, upload-time = "2025-07-01T15:54:58.561Z" }, + { url = "https://files.pythonhosted.org/packages/e3/03/7e50423c04d78daf391da3cc4330bdb97042fc192a58b186f2d5deb7befd/rpds_py-0.26.0-cp313-cp313t-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:7ab504c4d654e4a29558eaa5bb8cea5fdc1703ea60a8099ffd9c758472cf913f", size = 379415, upload-time = "2025-07-01T15:54:59.751Z" }, + { url = "https://files.pythonhosted.org/packages/57/00/d11ee60d4d3b16808432417951c63df803afb0e0fc672b5e8d07e9edaaae/rpds_py-0.26.0-cp313-cp313t-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:fd0641abca296bc1a00183fe44f7fced8807ed49d501f188faa642d0e4975b83", size = 391783, upload-time = "2025-07-01T15:55:00.898Z" }, + { url = "https://files.pythonhosted.org/packages/08/b3/1069c394d9c0d6d23c5b522e1f6546b65793a22950f6e0210adcc6f97c3e/rpds_py-0.26.0-cp313-cp313t-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:69b312fecc1d017b5327afa81d4da1480f51c68810963a7336d92203dbb3d4f1", size = 512844, upload-time = "2025-07-01T15:55:02.201Z" }, + { url = "https://files.pythonhosted.org/packages/08/3b/c4fbf0926800ed70b2c245ceca99c49f066456755f5d6eb8863c2c51e6d0/rpds_py-0.26.0-cp313-cp313t-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:c741107203954f6fc34d3066d213d0a0c40f7bb5aafd698fb39888af277c70d8", size = 402105, upload-time = "2025-07-01T15:55:03.698Z" }, + { url = 
"https://files.pythonhosted.org/packages/1c/b0/db69b52ca07413e568dae9dc674627a22297abb144c4d6022c6d78f1e5cc/rpds_py-0.26.0-cp313-cp313t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:fc3e55a7db08dc9a6ed5fb7103019d2c1a38a349ac41901f9f66d7f95750942f", size = 383440, upload-time = "2025-07-01T15:55:05.398Z" }, + { url = "https://files.pythonhosted.org/packages/4c/e1/c65255ad5b63903e56b3bb3ff9dcc3f4f5c3badde5d08c741ee03903e951/rpds_py-0.26.0-cp313-cp313t-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:9e851920caab2dbcae311fd28f4313c6953993893eb5c1bb367ec69d9a39e7ed", size = 412759, upload-time = "2025-07-01T15:55:08.316Z" }, + { url = "https://files.pythonhosted.org/packages/e4/22/bb731077872377a93c6e93b8a9487d0406c70208985831034ccdeed39c8e/rpds_py-0.26.0-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:dfbf280da5f876d0b00c81f26bedce274e72a678c28845453885a9b3c22ae632", size = 556032, upload-time = "2025-07-01T15:55:09.52Z" }, + { url = "https://files.pythonhosted.org/packages/e0/8b/393322ce7bac5c4530fb96fc79cc9ea2f83e968ff5f6e873f905c493e1c4/rpds_py-0.26.0-cp313-cp313t-musllinux_1_2_i686.whl", hash = "sha256:1cc81d14ddfa53d7f3906694d35d54d9d3f850ef8e4e99ee68bc0d1e5fed9a9c", size = 585416, upload-time = "2025-07-01T15:55:11.216Z" }, + { url = "https://files.pythonhosted.org/packages/49/ae/769dc372211835bf759319a7aae70525c6eb523e3371842c65b7ef41c9c6/rpds_py-0.26.0-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:dca83c498b4650a91efcf7b88d669b170256bf8017a5db6f3e06c2bf031f57e0", size = 554049, upload-time = "2025-07-01T15:55:13.004Z" }, + { url = "https://files.pythonhosted.org/packages/6b/f9/4c43f9cc203d6ba44ce3146246cdc38619d92c7bd7bad4946a3491bd5b70/rpds_py-0.26.0-cp313-cp313t-win32.whl", hash = "sha256:4d11382bcaf12f80b51d790dee295c56a159633a8e81e6323b16e55d81ae37e9", size = 218428, upload-time = "2025-07-01T15:55:14.486Z" }, + { url = 
"https://files.pythonhosted.org/packages/7e/8b/9286b7e822036a4a977f2f1e851c7345c20528dbd56b687bb67ed68a8ede/rpds_py-0.26.0-cp313-cp313t-win_amd64.whl", hash = "sha256:ff110acded3c22c033e637dd8896e411c7d3a11289b2edf041f86663dbc791e9", size = 231524, upload-time = "2025-07-01T15:55:15.745Z" }, + { url = "https://files.pythonhosted.org/packages/55/07/029b7c45db910c74e182de626dfdae0ad489a949d84a468465cd0ca36355/rpds_py-0.26.0-cp314-cp314-macosx_10_12_x86_64.whl", hash = "sha256:da619979df60a940cd434084355c514c25cf8eb4cf9a508510682f6c851a4f7a", size = 364292, upload-time = "2025-07-01T15:55:17.001Z" }, + { url = "https://files.pythonhosted.org/packages/13/d1/9b3d3f986216b4d1f584878dca15ce4797aaf5d372d738974ba737bf68d6/rpds_py-0.26.0-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:ea89a2458a1a75f87caabefe789c87539ea4e43b40f18cff526052e35bbb4fdf", size = 350334, upload-time = "2025-07-01T15:55:18.922Z" }, + { url = "https://files.pythonhosted.org/packages/18/98/16d5e7bc9ec715fa9668731d0cf97f6b032724e61696e2db3d47aeb89214/rpds_py-0.26.0-cp314-cp314-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:feac1045b3327a45944e7dcbeb57530339f6b17baff154df51ef8b0da34c8c12", size = 384875, upload-time = "2025-07-01T15:55:20.399Z" }, + { url = "https://files.pythonhosted.org/packages/f9/13/aa5e2b1ec5ab0e86a5c464d53514c0467bec6ba2507027d35fc81818358e/rpds_py-0.26.0-cp314-cp314-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:b818a592bd69bfe437ee8368603d4a2d928c34cffcdf77c2e761a759ffd17d20", size = 399993, upload-time = "2025-07-01T15:55:21.729Z" }, + { url = "https://files.pythonhosted.org/packages/17/03/8021810b0e97923abdbab6474c8b77c69bcb4b2c58330777df9ff69dc559/rpds_py-0.26.0-cp314-cp314-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1a8b0dd8648709b62d9372fc00a57466f5fdeefed666afe3fea5a6c9539a0331", size = 516683, upload-time = "2025-07-01T15:55:22.918Z" }, + { url = 
"https://files.pythonhosted.org/packages/dc/b1/da8e61c87c2f3d836954239fdbbfb477bb7b54d74974d8f6fcb34342d166/rpds_py-0.26.0-cp314-cp314-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:6d3498ad0df07d81112aa6ec6c95a7e7b1ae00929fb73e7ebee0f3faaeabad2f", size = 408825, upload-time = "2025-07-01T15:55:24.207Z" }, + { url = "https://files.pythonhosted.org/packages/38/bc/1fc173edaaa0e52c94b02a655db20697cb5fa954ad5a8e15a2c784c5cbdd/rpds_py-0.26.0-cp314-cp314-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:24a4146ccb15be237fdef10f331c568e1b0e505f8c8c9ed5d67759dac58ac246", size = 387292, upload-time = "2025-07-01T15:55:25.554Z" }, + { url = "https://files.pythonhosted.org/packages/7c/eb/3a9bb4bd90867d21916f253caf4f0d0be7098671b6715ad1cead9fe7bab9/rpds_py-0.26.0-cp314-cp314-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:a9a63785467b2d73635957d32a4f6e73d5e4df497a16a6392fa066b753e87387", size = 420435, upload-time = "2025-07-01T15:55:27.798Z" }, + { url = "https://files.pythonhosted.org/packages/cd/16/e066dcdb56f5632713445271a3f8d3d0b426d51ae9c0cca387799df58b02/rpds_py-0.26.0-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:de4ed93a8c91debfd5a047be327b7cc8b0cc6afe32a716bbbc4aedca9e2a83af", size = 562410, upload-time = "2025-07-01T15:55:29.057Z" }, + { url = "https://files.pythonhosted.org/packages/60/22/ddbdec7eb82a0dc2e455be44c97c71c232983e21349836ce9f272e8a3c29/rpds_py-0.26.0-cp314-cp314-musllinux_1_2_i686.whl", hash = "sha256:caf51943715b12af827696ec395bfa68f090a4c1a1d2509eb4e2cb69abbbdb33", size = 590724, upload-time = "2025-07-01T15:55:30.719Z" }, + { url = "https://files.pythonhosted.org/packages/2c/b4/95744085e65b7187d83f2fcb0bef70716a1ea0a9e5d8f7f39a86e5d83424/rpds_py-0.26.0-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:4a59e5bc386de021f56337f757301b337d7ab58baa40174fb150accd480bc953", size = 558285, upload-time = "2025-07-01T15:55:31.981Z" }, + { url = 
"https://files.pythonhosted.org/packages/37/37/6309a75e464d1da2559446f9c811aa4d16343cebe3dbb73701e63f760caa/rpds_py-0.26.0-cp314-cp314-win32.whl", hash = "sha256:92c8db839367ef16a662478f0a2fe13e15f2227da3c1430a782ad0f6ee009ec9", size = 223459, upload-time = "2025-07-01T15:55:33.312Z" }, + { url = "https://files.pythonhosted.org/packages/d9/6f/8e9c11214c46098b1d1391b7e02b70bb689ab963db3b19540cba17315291/rpds_py-0.26.0-cp314-cp314-win_amd64.whl", hash = "sha256:b0afb8cdd034150d4d9f53926226ed27ad15b7f465e93d7468caaf5eafae0d37", size = 236083, upload-time = "2025-07-01T15:55:34.933Z" }, + { url = "https://files.pythonhosted.org/packages/47/af/9c4638994dd623d51c39892edd9d08e8be8220a4b7e874fa02c2d6e91955/rpds_py-0.26.0-cp314-cp314-win_arm64.whl", hash = "sha256:ca3f059f4ba485d90c8dc75cb5ca897e15325e4e609812ce57f896607c1c0867", size = 223291, upload-time = "2025-07-01T15:55:36.202Z" }, + { url = "https://files.pythonhosted.org/packages/4d/db/669a241144460474aab03e254326b32c42def83eb23458a10d163cb9b5ce/rpds_py-0.26.0-cp314-cp314t-macosx_10_12_x86_64.whl", hash = "sha256:5afea17ab3a126006dc2f293b14ffc7ef3c85336cf451564a0515ed7648033da", size = 361445, upload-time = "2025-07-01T15:55:37.483Z" }, + { url = "https://files.pythonhosted.org/packages/3b/2d/133f61cc5807c6c2fd086a46df0eb8f63a23f5df8306ff9f6d0fd168fecc/rpds_py-0.26.0-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:69f0c0a3df7fd3a7eec50a00396104bb9a843ea6d45fcc31c2d5243446ffd7a7", size = 347206, upload-time = "2025-07-01T15:55:38.828Z" }, + { url = "https://files.pythonhosted.org/packages/05/bf/0e8fb4c05f70273469eecf82f6ccf37248558526a45321644826555db31b/rpds_py-0.26.0-cp314-cp314t-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:801a71f70f9813e82d2513c9a96532551fce1e278ec0c64610992c49c04c2dad", size = 380330, upload-time = "2025-07-01T15:55:40.175Z" }, + { url = 
"https://files.pythonhosted.org/packages/d4/a8/060d24185d8b24d3923322f8d0ede16df4ade226a74e747b8c7c978e3dd3/rpds_py-0.26.0-cp314-cp314t-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:df52098cde6d5e02fa75c1f6244f07971773adb4a26625edd5c18fee906fa84d", size = 392254, upload-time = "2025-07-01T15:55:42.015Z" }, + { url = "https://files.pythonhosted.org/packages/b9/7b/7c2e8a9ee3e6bc0bae26bf29f5219955ca2fbb761dca996a83f5d2f773fe/rpds_py-0.26.0-cp314-cp314t-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:9bc596b30f86dc6f0929499c9e574601679d0341a0108c25b9b358a042f51bca", size = 516094, upload-time = "2025-07-01T15:55:43.603Z" }, + { url = "https://files.pythonhosted.org/packages/75/d6/f61cafbed8ba1499b9af9f1777a2a199cd888f74a96133d8833ce5eaa9c5/rpds_py-0.26.0-cp314-cp314t-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:9dfbe56b299cf5875b68eb6f0ebaadc9cac520a1989cac0db0765abfb3709c19", size = 402889, upload-time = "2025-07-01T15:55:45.275Z" }, + { url = "https://files.pythonhosted.org/packages/92/19/c8ac0a8a8df2dd30cdec27f69298a5c13e9029500d6d76718130f5e5be10/rpds_py-0.26.0-cp314-cp314t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ac64f4b2bdb4ea622175c9ab7cf09444e412e22c0e02e906978b3b488af5fde8", size = 384301, upload-time = "2025-07-01T15:55:47.098Z" }, + { url = "https://files.pythonhosted.org/packages/41/e1/6b1859898bc292a9ce5776016c7312b672da00e25cec74d7beced1027286/rpds_py-0.26.0-cp314-cp314t-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:181ef9b6bbf9845a264f9aa45c31836e9f3c1f13be565d0d010e964c661d1e2b", size = 412891, upload-time = "2025-07-01T15:55:48.412Z" }, + { url = "https://files.pythonhosted.org/packages/ef/b9/ceb39af29913c07966a61367b3c08b4f71fad841e32c6b59a129d5974698/rpds_py-0.26.0-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:49028aa684c144ea502a8e847d23aed5e4c2ef7cadfa7d5eaafcb40864844b7a", size = 557044, upload-time = "2025-07-01T15:55:49.816Z" }, + { url = 
"https://files.pythonhosted.org/packages/2f/27/35637b98380731a521f8ec4f3fd94e477964f04f6b2f8f7af8a2d889a4af/rpds_py-0.26.0-cp314-cp314t-musllinux_1_2_i686.whl", hash = "sha256:e5d524d68a474a9688336045bbf76cb0def88549c1b2ad9dbfec1fb7cfbe9170", size = 585774, upload-time = "2025-07-01T15:55:51.192Z" }, + { url = "https://files.pythonhosted.org/packages/52/d9/3f0f105420fecd18551b678c9a6ce60bd23986098b252a56d35781b3e7e9/rpds_py-0.26.0-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:c1851f429b822831bd2edcbe0cfd12ee9ea77868f8d3daf267b189371671c80e", size = 554886, upload-time = "2025-07-01T15:55:52.541Z" }, + { url = "https://files.pythonhosted.org/packages/6b/c5/347c056a90dc8dd9bc240a08c527315008e1b5042e7a4cf4ac027be9d38a/rpds_py-0.26.0-cp314-cp314t-win32.whl", hash = "sha256:7bdb17009696214c3b66bb3590c6d62e14ac5935e53e929bcdbc5a495987a84f", size = 219027, upload-time = "2025-07-01T15:55:53.874Z" }, + { url = "https://files.pythonhosted.org/packages/75/04/5302cea1aa26d886d34cadbf2dc77d90d7737e576c0065f357b96dc7a1a6/rpds_py-0.26.0-cp314-cp314t-win_amd64.whl", hash = "sha256:f14440b9573a6f76b4ee4770c13f0b5921f71dde3b6fcb8dabbefd13b7fe05d7", size = 232821, upload-time = "2025-07-01T15:55:55.167Z" }, + { url = "https://files.pythonhosted.org/packages/ef/9a/1f033b0b31253d03d785b0cd905bc127e555ab496ea6b4c7c2e1f951f2fd/rpds_py-0.26.0-pp310-pypy310_pp73-macosx_10_12_x86_64.whl", hash = "sha256:3c0909c5234543ada2515c05dc08595b08d621ba919629e94427e8e03539c958", size = 373226, upload-time = "2025-07-01T15:56:16.578Z" }, + { url = "https://files.pythonhosted.org/packages/58/29/5f88023fd6aaaa8ca3c4a6357ebb23f6f07da6079093ccf27c99efce87db/rpds_py-0.26.0-pp310-pypy310_pp73-macosx_11_0_arm64.whl", hash = "sha256:c1fb0cda2abcc0ac62f64e2ea4b4e64c57dfd6b885e693095460c61bde7bb18e", size = 359230, upload-time = "2025-07-01T15:56:17.978Z" }, + { url = 
"https://files.pythonhosted.org/packages/6c/6c/13eaebd28b439da6964dde22712b52e53fe2824af0223b8e403249d10405/rpds_py-0.26.0-pp310-pypy310_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:84d142d2d6cf9b31c12aa4878d82ed3b2324226270b89b676ac62ccd7df52d08", size = 382363, upload-time = "2025-07-01T15:56:19.977Z" }, + { url = "https://files.pythonhosted.org/packages/55/fc/3bb9c486b06da19448646f96147796de23c5811ef77cbfc26f17307b6a9d/rpds_py-0.26.0-pp310-pypy310_pp73-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:a547e21c5610b7e9093d870be50682a6a6cf180d6da0f42c47c306073bfdbbf6", size = 397146, upload-time = "2025-07-01T15:56:21.39Z" }, + { url = "https://files.pythonhosted.org/packages/15/18/9d1b79eb4d18e64ba8bba9e7dec6f9d6920b639f22f07ee9368ca35d4673/rpds_py-0.26.0-pp310-pypy310_pp73-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:35e9a70a0f335371275cdcd08bc5b8051ac494dd58bff3bbfb421038220dc871", size = 514804, upload-time = "2025-07-01T15:56:22.78Z" }, + { url = "https://files.pythonhosted.org/packages/4f/5a/175ad7191bdbcd28785204621b225ad70e85cdfd1e09cc414cb554633b21/rpds_py-0.26.0-pp310-pypy310_pp73-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:0dfa6115c6def37905344d56fb54c03afc49104e2ca473d5dedec0f6606913b4", size = 402820, upload-time = "2025-07-01T15:56:24.584Z" }, + { url = "https://files.pythonhosted.org/packages/11/45/6a67ecf6d61c4d4aff4bc056e864eec4b2447787e11d1c2c9a0242c6e92a/rpds_py-0.26.0-pp310-pypy310_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:313cfcd6af1a55a286a3c9a25f64af6d0e46cf60bc5798f1db152d97a216ff6f", size = 384567, upload-time = "2025-07-01T15:56:26.064Z" }, + { url = "https://files.pythonhosted.org/packages/a1/ba/16589da828732b46454c61858950a78fe4c931ea4bf95f17432ffe64b241/rpds_py-0.26.0-pp310-pypy310_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:f7bf2496fa563c046d05e4d232d7b7fd61346e2402052064b773e5c378bf6f73", size = 416520, 
upload-time = "2025-07-01T15:56:27.608Z" }, + { url = "https://files.pythonhosted.org/packages/81/4b/00092999fc7c0c266045e984d56b7314734cc400a6c6dc4d61a35f135a9d/rpds_py-0.26.0-pp310-pypy310_pp73-musllinux_1_2_aarch64.whl", hash = "sha256:aa81873e2c8c5aa616ab8e017a481a96742fdf9313c40f14338ca7dbf50cb55f", size = 559362, upload-time = "2025-07-01T15:56:29.078Z" }, + { url = "https://files.pythonhosted.org/packages/96/0c/43737053cde1f93ac4945157f7be1428724ab943e2132a0d235a7e161d4e/rpds_py-0.26.0-pp310-pypy310_pp73-musllinux_1_2_i686.whl", hash = "sha256:68ffcf982715f5b5b7686bdd349ff75d422e8f22551000c24b30eaa1b7f7ae84", size = 588113, upload-time = "2025-07-01T15:56:30.485Z" }, + { url = "https://files.pythonhosted.org/packages/46/46/8e38f6161466e60a997ed7e9951ae5de131dedc3cf778ad35994b4af823d/rpds_py-0.26.0-pp310-pypy310_pp73-musllinux_1_2_x86_64.whl", hash = "sha256:6188de70e190847bb6db3dc3981cbadff87d27d6fe9b4f0e18726d55795cee9b", size = 555429, upload-time = "2025-07-01T15:56:31.956Z" }, + { url = "https://files.pythonhosted.org/packages/2c/ac/65da605e9f1dd643ebe615d5bbd11b6efa1d69644fc4bf623ea5ae385a82/rpds_py-0.26.0-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:1c962145c7473723df9722ba4c058de12eb5ebedcb4e27e7d902920aa3831ee8", size = 231950, upload-time = "2025-07-01T15:56:33.337Z" }, + { url = "https://files.pythonhosted.org/packages/51/f2/b5c85b758a00c513bb0389f8fc8e61eb5423050c91c958cdd21843faa3e6/rpds_py-0.26.0-pp311-pypy311_pp73-macosx_10_12_x86_64.whl", hash = "sha256:f61a9326f80ca59214d1cceb0a09bb2ece5b2563d4e0cd37bfd5515c28510674", size = 373505, upload-time = "2025-07-01T15:56:34.716Z" }, + { url = "https://files.pythonhosted.org/packages/23/e0/25db45e391251118e915e541995bb5f5ac5691a3b98fb233020ba53afc9b/rpds_py-0.26.0-pp311-pypy311_pp73-macosx_11_0_arm64.whl", hash = "sha256:183f857a53bcf4b1b42ef0f57ca553ab56bdd170e49d8091e96c51c3d69ca696", size = 359468, upload-time = "2025-07-01T15:56:36.219Z" }, + { url = 
"https://files.pythonhosted.org/packages/0b/73/dd5ee6075bb6491be3a646b301dfd814f9486d924137a5098e61f0487e16/rpds_py-0.26.0-pp311-pypy311_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:941c1cfdf4799d623cf3aa1d326a6b4fdb7a5799ee2687f3516738216d2262fb", size = 382680, upload-time = "2025-07-01T15:56:37.644Z" }, + { url = "https://files.pythonhosted.org/packages/2f/10/84b522ff58763a5c443f5bcedc1820240e454ce4e620e88520f04589e2ea/rpds_py-0.26.0-pp311-pypy311_pp73-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:72a8d9564a717ee291f554eeb4bfeafe2309d5ec0aa6c475170bdab0f9ee8e88", size = 397035, upload-time = "2025-07-01T15:56:39.241Z" }, + { url = "https://files.pythonhosted.org/packages/06/ea/8667604229a10a520fcbf78b30ccc278977dcc0627beb7ea2c96b3becef0/rpds_py-0.26.0-pp311-pypy311_pp73-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:511d15193cbe013619dd05414c35a7dedf2088fcee93c6bbb7c77859765bd4e8", size = 514922, upload-time = "2025-07-01T15:56:40.645Z" }, + { url = "https://files.pythonhosted.org/packages/24/e6/9ed5b625c0661c4882fc8cdf302bf8e96c73c40de99c31e0b95ed37d508c/rpds_py-0.26.0-pp311-pypy311_pp73-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:aea1f9741b603a8d8fedb0ed5502c2bc0accbc51f43e2ad1337fe7259c2b77a5", size = 402822, upload-time = "2025-07-01T15:56:42.137Z" }, + { url = "https://files.pythonhosted.org/packages/8a/58/212c7b6fd51946047fb45d3733da27e2fa8f7384a13457c874186af691b1/rpds_py-0.26.0-pp311-pypy311_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:4019a9d473c708cf2f16415688ef0b4639e07abaa569d72f74745bbeffafa2c7", size = 384336, upload-time = "2025-07-01T15:56:44.239Z" }, + { url = "https://files.pythonhosted.org/packages/aa/f5/a40ba78748ae8ebf4934d4b88e77b98497378bc2c24ba55ebe87a4e87057/rpds_py-0.26.0-pp311-pypy311_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:093d63b4b0f52d98ebae33b8c50900d3d67e0666094b1be7a12fffd7f65de74b", size = 416871, 
upload-time = "2025-07-01T15:56:46.284Z" }, + { url = "https://files.pythonhosted.org/packages/d5/a6/33b1fc0c9f7dcfcfc4a4353daa6308b3ece22496ceece348b3e7a7559a09/rpds_py-0.26.0-pp311-pypy311_pp73-musllinux_1_2_aarch64.whl", hash = "sha256:2abe21d8ba64cded53a2a677e149ceb76dcf44284202d737178afe7ba540c1eb", size = 559439, upload-time = "2025-07-01T15:56:48.549Z" }, + { url = "https://files.pythonhosted.org/packages/71/2d/ceb3f9c12f8cfa56d34995097f6cd99da1325642c60d1b6680dd9df03ed8/rpds_py-0.26.0-pp311-pypy311_pp73-musllinux_1_2_i686.whl", hash = "sha256:4feb7511c29f8442cbbc28149a92093d32e815a28aa2c50d333826ad2a20fdf0", size = 588380, upload-time = "2025-07-01T15:56:50.086Z" }, + { url = "https://files.pythonhosted.org/packages/c8/ed/9de62c2150ca8e2e5858acf3f4f4d0d180a38feef9fdab4078bea63d8dba/rpds_py-0.26.0-pp311-pypy311_pp73-musllinux_1_2_x86_64.whl", hash = "sha256:e99685fc95d386da368013e7fb4269dd39c30d99f812a8372d62f244f662709c", size = 555334, upload-time = "2025-07-01T15:56:51.703Z" }, +] + +[[package]] +name = "ruff" +version = "0.12.4" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/9b/ce/8d7dbedede481245b489b769d27e2934730791a9a82765cb94566c6e6abd/ruff-0.12.4.tar.gz", hash = "sha256:13efa16df6c6eeb7d0f091abae50f58e9522f3843edb40d56ad52a5a4a4b6873", size = 5131435, upload-time = "2025-07-17T17:27:19.138Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/ae/9f/517bc5f61bad205b7f36684ffa5415c013862dee02f55f38a217bdbe7aa4/ruff-0.12.4-py3-none-linux_armv6l.whl", hash = "sha256:cb0d261dac457ab939aeb247e804125a5d521b21adf27e721895b0d3f83a0d0a", size = 10188824, upload-time = "2025-07-17T17:26:31.412Z" }, + { url = "https://files.pythonhosted.org/packages/28/83/691baae5a11fbbde91df01c565c650fd17b0eabed259e8b7563de17c6529/ruff-0.12.4-py3-none-macosx_10_12_x86_64.whl", hash = "sha256:55c0f4ca9769408d9b9bac530c30d3e66490bd2beb2d3dae3e4128a1f05c7442", size = 10884521, upload-time = 
"2025-07-17T17:26:35.084Z" }, + { url = "https://files.pythonhosted.org/packages/d6/8d/756d780ff4076e6dd035d058fa220345f8c458391f7edfb1c10731eedc75/ruff-0.12.4-py3-none-macosx_11_0_arm64.whl", hash = "sha256:a8224cc3722c9ad9044da7f89c4c1ec452aef2cfe3904365025dd2f51daeae0e", size = 10277653, upload-time = "2025-07-17T17:26:37.897Z" }, + { url = "https://files.pythonhosted.org/packages/8d/97/8eeee0f48ece153206dce730fc9e0e0ca54fd7f261bb3d99c0a4343a1892/ruff-0.12.4-py3-none-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:e9949d01d64fa3672449a51ddb5d7548b33e130240ad418884ee6efa7a229586", size = 10485993, upload-time = "2025-07-17T17:26:40.68Z" }, + { url = "https://files.pythonhosted.org/packages/49/b8/22a43d23a1f68df9b88f952616c8508ea6ce4ed4f15353b8168c48b2d7e7/ruff-0.12.4-py3-none-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:be0593c69df9ad1465e8a2d10e3defd111fdb62dcd5be23ae2c06da77e8fcffb", size = 10022824, upload-time = "2025-07-17T17:26:43.564Z" }, + { url = "https://files.pythonhosted.org/packages/cd/70/37c234c220366993e8cffcbd6cadbf332bfc848cbd6f45b02bade17e0149/ruff-0.12.4-py3-none-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:a7dea966bcb55d4ecc4cc3270bccb6f87a337326c9dcd3c07d5b97000dbff41c", size = 11524414, upload-time = "2025-07-17T17:26:46.219Z" }, + { url = "https://files.pythonhosted.org/packages/14/77/c30f9964f481b5e0e29dd6a1fae1f769ac3fd468eb76fdd5661936edd262/ruff-0.12.4-py3-none-manylinux_2_17_ppc64.manylinux2014_ppc64.whl", hash = "sha256:afcfa3ab5ab5dd0e1c39bf286d829e042a15e966b3726eea79528e2e24d8371a", size = 12419216, upload-time = "2025-07-17T17:26:48.883Z" }, + { url = "https://files.pythonhosted.org/packages/6e/79/af7fe0a4202dce4ef62c5e33fecbed07f0178f5b4dd9c0d2fcff5ab4a47c/ruff-0.12.4-py3-none-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:c057ce464b1413c926cdb203a0f858cd52f3e73dcb3270a3318d1630f6395bb3", size = 11976756, upload-time = "2025-07-17T17:26:51.754Z" }, + { url = 
"https://files.pythonhosted.org/packages/09/d1/33fb1fc00e20a939c305dbe2f80df7c28ba9193f7a85470b982815a2dc6a/ruff-0.12.4-py3-none-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:e64b90d1122dc2713330350626b10d60818930819623abbb56535c6466cce045", size = 11020019, upload-time = "2025-07-17T17:26:54.265Z" }, + { url = "https://files.pythonhosted.org/packages/64/f4/e3cd7f7bda646526f09693e2e02bd83d85fff8a8222c52cf9681c0d30843/ruff-0.12.4-py3-none-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2abc48f3d9667fdc74022380b5c745873499ff827393a636f7a59da1515e7c57", size = 11277890, upload-time = "2025-07-17T17:26:56.914Z" }, + { url = "https://files.pythonhosted.org/packages/5e/d0/69a85fb8b94501ff1a4f95b7591505e8983f38823da6941eb5b6badb1e3a/ruff-0.12.4-py3-none-musllinux_1_2_aarch64.whl", hash = "sha256:2b2449dc0c138d877d629bea151bee8c0ae3b8e9c43f5fcaafcd0c0d0726b184", size = 10348539, upload-time = "2025-07-17T17:26:59.381Z" }, + { url = "https://files.pythonhosted.org/packages/16/a0/91372d1cb1678f7d42d4893b88c252b01ff1dffcad09ae0c51aa2542275f/ruff-0.12.4-py3-none-musllinux_1_2_armv7l.whl", hash = "sha256:56e45bb11f625db55f9b70477062e6a1a04d53628eda7784dce6e0f55fd549eb", size = 10009579, upload-time = "2025-07-17T17:27:02.462Z" }, + { url = "https://files.pythonhosted.org/packages/23/1b/c4a833e3114d2cc0f677e58f1df6c3b20f62328dbfa710b87a1636a5e8eb/ruff-0.12.4-py3-none-musllinux_1_2_i686.whl", hash = "sha256:478fccdb82ca148a98a9ff43658944f7ab5ec41c3c49d77cd99d44da019371a1", size = 10942982, upload-time = "2025-07-17T17:27:05.343Z" }, + { url = "https://files.pythonhosted.org/packages/ff/ce/ce85e445cf0a5dd8842f2f0c6f0018eedb164a92bdf3eda51984ffd4d989/ruff-0.12.4-py3-none-musllinux_1_2_x86_64.whl", hash = "sha256:0fc426bec2e4e5f4c4f182b9d2ce6a75c85ba9bcdbe5c6f2a74fcb8df437df4b", size = 11343331, upload-time = "2025-07-17T17:27:08.652Z" }, + { url = 
"https://files.pythonhosted.org/packages/35/cf/441b7fc58368455233cfb5b77206c849b6dfb48b23de532adcc2e50ccc06/ruff-0.12.4-py3-none-win32.whl", hash = "sha256:4de27977827893cdfb1211d42d84bc180fceb7b72471104671c59be37041cf93", size = 10267904, upload-time = "2025-07-17T17:27:11.814Z" }, + { url = "https://files.pythonhosted.org/packages/ce/7e/20af4a0df5e1299e7368d5ea4350412226afb03d95507faae94c80f00afd/ruff-0.12.4-py3-none-win_amd64.whl", hash = "sha256:fe0b9e9eb23736b453143d72d2ceca5db323963330d5b7859d60d101147d461a", size = 11209038, upload-time = "2025-07-17T17:27:14.417Z" }, + { url = "https://files.pythonhosted.org/packages/11/02/8857d0dfb8f44ef299a5dfd898f673edefb71e3b533b3b9d2db4c832dd13/ruff-0.12.4-py3-none-win_arm64.whl", hash = "sha256:0618ec4442a83ab545e5b71202a5c0ed7791e8471435b94e655b570a5031a98e", size = 10469336, upload-time = "2025-07-17T17:27:16.913Z" }, +] + +[[package]] +name = "s3transfer" +version = "0.13.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "botocore" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/ed/5d/9dcc100abc6711e8247af5aa561fc07c4a046f72f659c3adea9a449e191a/s3transfer-0.13.0.tar.gz", hash = "sha256:f5e6db74eb7776a37208001113ea7aa97695368242b364d73e91c981ac522177", size = 150232, upload-time = "2025-05-22T19:24:50.245Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/18/17/22bf8155aa0ea2305eefa3a6402e040df7ebe512d1310165eda1e233c3f8/s3transfer-0.13.0-py3-none-any.whl", hash = "sha256:0148ef34d6dd964d0d8cf4311b2b21c474693e57c2e069ec708ce043d2b527be", size = 85152, upload-time = "2025-05-22T19:24:48.703Z" }, +] + +[[package]] +name = "shellingham" +version = "1.5.4" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/58/15/8b3609fd3830ef7b27b655beb4b4e9c62313a4e8da8c676e142cc210d58e/shellingham-1.5.4.tar.gz", hash = "sha256:8dbca0739d487e5bd35ab3ca4b36e11c4078f3a234bfce294b0a0291363404de", size = 10310, 
upload-time = "2023-10-24T04:13:40.426Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/e0/f9/0595336914c5619e5f28a1fb793285925a8cd4b432c9da0a987836c7f822/shellingham-1.5.4-py2.py3-none-any.whl", hash = "sha256:7ecfff8f2fd72616f7481040475a65b2bf8af90a56c89140852d1120324e8686", size = 9755, upload-time = "2023-10-24T04:13:38.866Z" }, +] + +[[package]] +name = "six" +version = "1.17.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/94/e7/b2c673351809dca68a0e064b6af791aa332cf192da575fd474ed7d6f16a2/six-1.17.0.tar.gz", hash = "sha256:ff70335d468e7eb6ec65b95b99d3a2836546063f63acc5171de367e834932a81", size = 34031, upload-time = "2024-12-04T17:35:28.174Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/b7/ce/149a00dd41f10bc29e5921b496af8b574d8413afcd5e30dfa0ed46c2cc5e/six-1.17.0-py2.py3-none-any.whl", hash = "sha256:4721f391ed90541fddacab5acf947aa0d3dc7d27b2e1e8eda2be8970586c3274", size = 11050, upload-time = "2024-12-04T17:35:26.475Z" }, +] + +[[package]] +name = "sniffio" +version = "1.3.1" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/a2/87/a6771e1546d97e7e041b6ae58d80074f81b7d5121207425c964ddf5cfdbd/sniffio-1.3.1.tar.gz", hash = "sha256:f4324edc670a0f49750a81b895f35c3adb843cca46f0530f79fc1babb23789dc", size = 20372, upload-time = "2024-02-25T23:20:04.057Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/e9/44/75a9c9421471a6c4805dbf2356f7c181a29c1879239abab1ea2cc8f38b40/sniffio-1.3.1-py3-none-any.whl", hash = "sha256:2f6da418d1f1e0fddd844478f41680e794e6051915791a034ff65e5f100525a2", size = 10235, upload-time = "2024-02-25T23:20:01.196Z" }, +] + +[[package]] +name = "sse-starlette" +version = "2.4.1" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "anyio" }, +] +sdist = { url = 
"https://files.pythonhosted.org/packages/07/3e/eae74d8d33e3262bae0a7e023bb43d8bdd27980aa3557333f4632611151f/sse_starlette-2.4.1.tar.gz", hash = "sha256:7c8a800a1ca343e9165fc06bbda45c78e4c6166320707ae30b416c42da070926", size = 18635, upload-time = "2025-07-06T09:41:33.631Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/e4/f1/6c7eaa8187ba789a6dd6d74430307478d2a91c23a5452ab339b6fbe15a08/sse_starlette-2.4.1-py3-none-any.whl", hash = "sha256:08b77ea898ab1a13a428b2b6f73cfe6d0e607a7b4e15b9bb23e4a37b087fd39a", size = 10824, upload-time = "2025-07-06T09:41:32.321Z" }, +] + +[[package]] +name = "starlette" +version = "0.47.1" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "anyio" }, + { name = "typing-extensions", marker = "python_full_version < '3.13'" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/0a/69/662169fdb92fb96ec3eaee218cf540a629d629c86d7993d9651226a6789b/starlette-0.47.1.tar.gz", hash = "sha256:aef012dd2b6be325ffa16698f9dc533614fb1cebd593a906b90dc1025529a79b", size = 2583072, upload-time = "2025-06-21T04:03:17.337Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/82/95/38ef0cd7fa11eaba6a99b3c4f5ac948d8bc6ff199aabd327a29cc000840c/starlette-0.47.1-py3-none-any.whl", hash = "sha256:5e11c9f5c7c3f24959edbf2dffdc01bba860228acf657129467d8a7468591527", size = 72747, upload-time = "2025-06-21T04:03:15.705Z" }, +] + +[[package]] +name = "termcolor" +version = "3.1.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/ca/6c/3d75c196ac07ac8749600b60b03f4f6094d54e132c4d94ebac6ee0e0add0/termcolor-3.1.0.tar.gz", hash = "sha256:6a6dd7fbee581909eeec6a756cff1d7f7c376063b14e4a298dc4980309e55970", size = 14324, upload-time = "2025-04-30T11:37:53.791Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/4f/bd/de8d508070629b6d84a30d01d57e4a65c69aa7f5abe7560b8fad3b50ea59/termcolor-3.1.0-py3-none-any.whl", hash = 
"sha256:591dd26b5c2ce03b9e43f391264626557873ce1d379019786f99b0c2bee140aa", size = 7684, upload-time = "2025-04-30T11:37:52.382Z" }, +] + +[[package]] +name = "tomli" +version = "2.2.1" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/18/87/302344fed471e44a87289cf4967697d07e532f2421fdaf868a303cbae4ff/tomli-2.2.1.tar.gz", hash = "sha256:cd45e1dc79c835ce60f7404ec8119f2eb06d38b1deba146f07ced3bbc44505ff", size = 17175, upload-time = "2024-11-27T22:38:36.873Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/43/ca/75707e6efa2b37c77dadb324ae7d9571cb424e61ea73fad7c56c2d14527f/tomli-2.2.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:678e4fa69e4575eb77d103de3df8a895e1591b48e740211bd1067378c69e8249", size = 131077, upload-time = "2024-11-27T22:37:54.956Z" }, + { url = "https://files.pythonhosted.org/packages/c7/16/51ae563a8615d472fdbffc43a3f3d46588c264ac4f024f63f01283becfbb/tomli-2.2.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:023aa114dd824ade0100497eb2318602af309e5a55595f76b626d6d9f3b7b0a6", size = 123429, upload-time = "2024-11-27T22:37:56.698Z" }, + { url = "https://files.pythonhosted.org/packages/f1/dd/4f6cd1e7b160041db83c694abc78e100473c15d54620083dbd5aae7b990e/tomli-2.2.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ece47d672db52ac607a3d9599a9d48dcb2f2f735c6c2d1f34130085bb12b112a", size = 226067, upload-time = "2024-11-27T22:37:57.63Z" }, + { url = "https://files.pythonhosted.org/packages/a9/6b/c54ede5dc70d648cc6361eaf429304b02f2871a345bbdd51e993d6cdf550/tomli-2.2.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6972ca9c9cc9f0acaa56a8ca1ff51e7af152a9f87fb64623e31d5c83700080ee", size = 236030, upload-time = "2024-11-27T22:37:59.344Z" }, + { url = 
"https://files.pythonhosted.org/packages/1f/47/999514fa49cfaf7a92c805a86c3c43f4215621855d151b61c602abb38091/tomli-2.2.1-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:c954d2250168d28797dd4e3ac5cf812a406cd5a92674ee4c8f123c889786aa8e", size = 240898, upload-time = "2024-11-27T22:38:00.429Z" }, + { url = "https://files.pythonhosted.org/packages/73/41/0a01279a7ae09ee1573b423318e7934674ce06eb33f50936655071d81a24/tomli-2.2.1-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:8dd28b3e155b80f4d54beb40a441d366adcfe740969820caf156c019fb5c7ec4", size = 229894, upload-time = "2024-11-27T22:38:02.094Z" }, + { url = "https://files.pythonhosted.org/packages/55/18/5d8bc5b0a0362311ce4d18830a5d28943667599a60d20118074ea1b01bb7/tomli-2.2.1-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:e59e304978767a54663af13c07b3d1af22ddee3bb2fb0618ca1593e4f593a106", size = 245319, upload-time = "2024-11-27T22:38:03.206Z" }, + { url = "https://files.pythonhosted.org/packages/92/a3/7ade0576d17f3cdf5ff44d61390d4b3febb8a9fc2b480c75c47ea048c646/tomli-2.2.1-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:33580bccab0338d00994d7f16f4c4ec25b776af3ffaac1ed74e0b3fc95e885a8", size = 238273, upload-time = "2024-11-27T22:38:04.217Z" }, + { url = "https://files.pythonhosted.org/packages/72/6f/fa64ef058ac1446a1e51110c375339b3ec6be245af9d14c87c4a6412dd32/tomli-2.2.1-cp311-cp311-win32.whl", hash = "sha256:465af0e0875402f1d226519c9904f37254b3045fc5084697cefb9bdde1ff99ff", size = 98310, upload-time = "2024-11-27T22:38:05.908Z" }, + { url = "https://files.pythonhosted.org/packages/6a/1c/4a2dcde4a51b81be3530565e92eda625d94dafb46dbeb15069df4caffc34/tomli-2.2.1-cp311-cp311-win_amd64.whl", hash = "sha256:2d0f2fdd22b02c6d81637a3c95f8cd77f995846af7414c5c4b8d0545afa1bc4b", size = 108309, upload-time = "2024-11-27T22:38:06.812Z" }, + { url = 
"https://files.pythonhosted.org/packages/52/e1/f8af4c2fcde17500422858155aeb0d7e93477a0d59a98e56cbfe75070fd0/tomli-2.2.1-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:4a8f6e44de52d5e6c657c9fe83b562f5f4256d8ebbfe4ff922c495620a7f6cea", size = 132762, upload-time = "2024-11-27T22:38:07.731Z" }, + { url = "https://files.pythonhosted.org/packages/03/b8/152c68bb84fc00396b83e7bbddd5ec0bd3dd409db4195e2a9b3e398ad2e3/tomli-2.2.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:8d57ca8095a641b8237d5b079147646153d22552f1c637fd3ba7f4b0b29167a8", size = 123453, upload-time = "2024-11-27T22:38:09.384Z" }, + { url = "https://files.pythonhosted.org/packages/c8/d6/fc9267af9166f79ac528ff7e8c55c8181ded34eb4b0e93daa767b8841573/tomli-2.2.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4e340144ad7ae1533cb897d406382b4b6fede8890a03738ff1683af800d54192", size = 233486, upload-time = "2024-11-27T22:38:10.329Z" }, + { url = "https://files.pythonhosted.org/packages/5c/51/51c3f2884d7bab89af25f678447ea7d297b53b5a3b5730a7cb2ef6069f07/tomli-2.2.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:db2b95f9de79181805df90bedc5a5ab4c165e6ec3fe99f970d0e302f384ad222", size = 242349, upload-time = "2024-11-27T22:38:11.443Z" }, + { url = "https://files.pythonhosted.org/packages/ab/df/bfa89627d13a5cc22402e441e8a931ef2108403db390ff3345c05253935e/tomli-2.2.1-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:40741994320b232529c802f8bc86da4e1aa9f413db394617b9a256ae0f9a7f77", size = 252159, upload-time = "2024-11-27T22:38:13.099Z" }, + { url = "https://files.pythonhosted.org/packages/9e/6e/fa2b916dced65763a5168c6ccb91066f7639bdc88b48adda990db10c8c0b/tomli-2.2.1-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:400e720fe168c0f8521520190686ef8ef033fb19fc493da09779e592861b78c6", size = 237243, upload-time = "2024-11-27T22:38:14.766Z" }, + { url = 
"https://files.pythonhosted.org/packages/b4/04/885d3b1f650e1153cbb93a6a9782c58a972b94ea4483ae4ac5cedd5e4a09/tomli-2.2.1-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:02abe224de6ae62c19f090f68da4e27b10af2b93213d36cf44e6e1c5abd19fdd", size = 259645, upload-time = "2024-11-27T22:38:15.843Z" }, + { url = "https://files.pythonhosted.org/packages/9c/de/6b432d66e986e501586da298e28ebeefd3edc2c780f3ad73d22566034239/tomli-2.2.1-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:b82ebccc8c8a36f2094e969560a1b836758481f3dc360ce9a3277c65f374285e", size = 244584, upload-time = "2024-11-27T22:38:17.645Z" }, + { url = "https://files.pythonhosted.org/packages/1c/9a/47c0449b98e6e7d1be6cbac02f93dd79003234ddc4aaab6ba07a9a7482e2/tomli-2.2.1-cp312-cp312-win32.whl", hash = "sha256:889f80ef92701b9dbb224e49ec87c645ce5df3fa2cc548664eb8a25e03127a98", size = 98875, upload-time = "2024-11-27T22:38:19.159Z" }, + { url = "https://files.pythonhosted.org/packages/ef/60/9b9638f081c6f1261e2688bd487625cd1e660d0a85bd469e91d8db969734/tomli-2.2.1-cp312-cp312-win_amd64.whl", hash = "sha256:7fc04e92e1d624a4a63c76474610238576942d6b8950a2d7f908a340494e67e4", size = 109418, upload-time = "2024-11-27T22:38:20.064Z" }, + { url = "https://files.pythonhosted.org/packages/04/90/2ee5f2e0362cb8a0b6499dc44f4d7d48f8fff06d28ba46e6f1eaa61a1388/tomli-2.2.1-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:f4039b9cbc3048b2416cc57ab3bda989a6fcf9b36cf8937f01a6e731b64f80d7", size = 132708, upload-time = "2024-11-27T22:38:21.659Z" }, + { url = "https://files.pythonhosted.org/packages/c0/ec/46b4108816de6b385141f082ba99e315501ccd0a2ea23db4a100dd3990ea/tomli-2.2.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:286f0ca2ffeeb5b9bd4fcc8d6c330534323ec51b2f52da063b11c502da16f30c", size = 123582, upload-time = "2024-11-27T22:38:22.693Z" }, + { url = 
"https://files.pythonhosted.org/packages/a0/bd/b470466d0137b37b68d24556c38a0cc819e8febe392d5b199dcd7f578365/tomli-2.2.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a92ef1a44547e894e2a17d24e7557a5e85a9e1d0048b0b5e7541f76c5032cb13", size = 232543, upload-time = "2024-11-27T22:38:24.367Z" }, + { url = "https://files.pythonhosted.org/packages/d9/e5/82e80ff3b751373f7cead2815bcbe2d51c895b3c990686741a8e56ec42ab/tomli-2.2.1-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9316dc65bed1684c9a98ee68759ceaed29d229e985297003e494aa825ebb0281", size = 241691, upload-time = "2024-11-27T22:38:26.081Z" }, + { url = "https://files.pythonhosted.org/packages/05/7e/2a110bc2713557d6a1bfb06af23dd01e7dde52b6ee7dadc589868f9abfac/tomli-2.2.1-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:e85e99945e688e32d5a35c1ff38ed0b3f41f43fad8df0bdf79f72b2ba7bc5272", size = 251170, upload-time = "2024-11-27T22:38:27.921Z" }, + { url = "https://files.pythonhosted.org/packages/64/7b/22d713946efe00e0adbcdfd6d1aa119ae03fd0b60ebed51ebb3fa9f5a2e5/tomli-2.2.1-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:ac065718db92ca818f8d6141b5f66369833d4a80a9d74435a268c52bdfa73140", size = 236530, upload-time = "2024-11-27T22:38:29.591Z" }, + { url = "https://files.pythonhosted.org/packages/38/31/3a76f67da4b0cf37b742ca76beaf819dca0ebef26d78fc794a576e08accf/tomli-2.2.1-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:d920f33822747519673ee656a4b6ac33e382eca9d331c87770faa3eef562aeb2", size = 258666, upload-time = "2024-11-27T22:38:30.639Z" }, + { url = "https://files.pythonhosted.org/packages/07/10/5af1293da642aded87e8a988753945d0cf7e00a9452d3911dd3bb354c9e2/tomli-2.2.1-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:a198f10c4d1b1375d7687bc25294306e551bf1abfa4eace6650070a5c1ae2744", size = 243954, upload-time = "2024-11-27T22:38:31.702Z" }, + { url = 
"https://files.pythonhosted.org/packages/5b/b9/1ed31d167be802da0fc95020d04cd27b7d7065cc6fbefdd2f9186f60d7bd/tomli-2.2.1-cp313-cp313-win32.whl", hash = "sha256:d3f5614314d758649ab2ab3a62d4f2004c825922f9e370b29416484086b264ec", size = 98724, upload-time = "2024-11-27T22:38:32.837Z" }, + { url = "https://files.pythonhosted.org/packages/c7/32/b0963458706accd9afcfeb867c0f9175a741bf7b19cd424230714d722198/tomli-2.2.1-cp313-cp313-win_amd64.whl", hash = "sha256:a38aa0308e754b0e3c67e344754dff64999ff9b513e691d0e786265c93583c69", size = 109383, upload-time = "2024-11-27T22:38:34.455Z" }, + { url = "https://files.pythonhosted.org/packages/6e/c2/61d3e0f47e2b74ef40a68b9e6ad5984f6241a942f7cd3bbfbdbd03861ea9/tomli-2.2.1-py3-none-any.whl", hash = "sha256:cb55c73c5f4408779d0cf3eef9f762b9c9f147a77de7b258bef0a5628adc85cc", size = 14257, upload-time = "2024-11-27T22:38:35.385Z" }, +] + +[[package]] +name = "tomlkit" +version = "0.13.3" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/cc/18/0bbf3884e9eaa38819ebe46a7bd25dcd56b67434402b66a58c4b8e552575/tomlkit-0.13.3.tar.gz", hash = "sha256:430cf247ee57df2b94ee3fbe588e71d362a941ebb545dec29b53961d61add2a1", size = 185207, upload-time = "2025-06-05T07:13:44.947Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/bd/75/8539d011f6be8e29f339c42e633aae3cb73bffa95dd0f9adec09b9c58e85/tomlkit-0.13.3-py3-none-any.whl", hash = "sha256:c89c649d79ee40629a9fda55f8ace8c6a1b42deb912b2a8fd8d942ddadb606b0", size = 38901, upload-time = "2025-06-05T07:13:43.546Z" }, +] + +[[package]] +name = "typer" +version = "0.16.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "click" }, + { name = "rich" }, + { name = "shellingham" }, + { name = "typing-extensions" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/c5/8c/7d682431efca5fd290017663ea4588bf6f2c6aad085c7f108c5dbc316e70/typer-0.16.0.tar.gz", hash = 
"sha256:af377ffaee1dbe37ae9440cb4e8f11686ea5ce4e9bae01b84ae7c63b87f1dd3b", size = 102625, upload-time = "2025-05-26T14:30:31.824Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/76/42/3efaf858001d2c2913de7f354563e3a3a2f0decae3efe98427125a8f441e/typer-0.16.0-py3-none-any.whl", hash = "sha256:1f79bed11d4d02d4310e3c1b7ba594183bcedb0ac73b27a9e5f28f6fb5b98855", size = 46317, upload-time = "2025-05-26T14:30:30.523Z" }, +] + +[[package]] +name = "typing-extensions" +version = "4.14.1" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/98/5a/da40306b885cc8c09109dc2e1abd358d5684b1425678151cdaed4731c822/typing_extensions-4.14.1.tar.gz", hash = "sha256:38b39f4aeeab64884ce9f74c94263ef78f3c22467c8724005483154c26648d36", size = 107673, upload-time = "2025-07-04T13:28:34.16Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/b5/00/d631e67a838026495268c2f6884f3711a15a9a2a96cd244fdaea53b823fb/typing_extensions-4.14.1-py3-none-any.whl", hash = "sha256:d1e1e3b58374dc93031d6eda2420a48ea44a36c2b4766a4fdeb3710755731d76", size = 43906, upload-time = "2025-07-04T13:28:32.743Z" }, +] + +[[package]] +name = "typing-inspection" +version = "0.4.1" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "typing-extensions" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/f8/b1/0c11f5058406b3af7609f121aaa6b609744687f1d158b3c3a5bf4cc94238/typing_inspection-0.4.1.tar.gz", hash = "sha256:6ae134cc0203c33377d43188d4064e9b357dba58cff3185f22924610e70a9d28", size = 75726, upload-time = "2025-05-21T18:55:23.885Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/17/69/cd203477f944c353c31bade965f880aa1061fd6bf05ded0726ca845b6ff7/typing_inspection-0.4.1-py3-none-any.whl", hash = "sha256:389055682238f53b04f7badcb49b989835495a96700ced5dab2d8feae4b26f51", size = 14552, upload-time = "2025-05-21T18:55:22.152Z" }, +] + +[[package]] +name = "urllib3" +version = "2.5.0" 
+source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/15/22/9ee70a2574a4f4599c47dd506532914ce044817c7752a79b6a51286319bc/urllib3-2.5.0.tar.gz", hash = "sha256:3fc47733c7e419d4bc3f6b3dc2b4f890bb743906a30d56ba4a5bfa4bbff92760", size = 393185, upload-time = "2025-06-18T14:07:41.644Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/a7/c2/fe1e52489ae3122415c51f387e221dd0773709bad6c6cdaa599e8a2c5185/urllib3-2.5.0-py3-none-any.whl", hash = "sha256:e6b01673c0fa6a13e374b50871808eb3bf7046c4b125b216f6bf1cc604cff0dc", size = 129795, upload-time = "2025-06-18T14:07:40.39Z" }, +] + +[[package]] +name = "uvicorn" +version = "0.35.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "click" }, + { name = "h11" }, + { name = "typing-extensions", marker = "python_full_version < '3.11'" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/5e/42/e0e305207bb88c6b8d3061399c6a961ffe5fbb7e2aa63c9234df7259e9cd/uvicorn-0.35.0.tar.gz", hash = "sha256:bc662f087f7cf2ce11a1d7fd70b90c9f98ef2e2831556dd078d131b96cc94a01", size = 78473, upload-time = "2025-06-28T16:15:46.058Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/d2/e2/dc81b1bd1dcfe91735810265e9d26bc8ec5da45b4c0f6237e286819194c3/uvicorn-0.35.0-py3-none-any.whl", hash = "sha256:197535216b25ff9b785e29a0b79199f55222193d47f820816e7da751e9bc8d4a", size = 66406, upload-time = "2025-06-28T16:15:44.816Z" }, +] + +[[package]] +name = "virtualenv" +version = "20.31.2" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "distlib" }, + { name = "filelock" }, + { name = "platformdirs" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/56/2c/444f465fb2c65f40c3a104fd0c495184c4f2336d65baf398e3c75d72ea94/virtualenv-20.31.2.tar.gz", hash = "sha256:e10c0a9d02835e592521be48b332b6caee6887f332c111aa79a09b9e79efc2af", size = 6076316, upload-time = "2025-05-08T17:58:23.811Z" } +wheels = [ + { 
url = "https://files.pythonhosted.org/packages/f3/40/b1c265d4b2b62b58576588510fc4d1fe60a86319c8de99fd8e9fec617d2c/virtualenv-20.31.2-py3-none-any.whl", hash = "sha256:36efd0d9650ee985f0cad72065001e66d49a6f24eb44d98980f630686243cf11", size = 6057982, upload-time = "2025-05-08T17:58:21.15Z" }, +] + +[[package]] +name = "wcwidth" +version = "0.2.13" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/6c/63/53559446a878410fc5a5974feb13d31d78d752eb18aeba59c7fef1af7598/wcwidth-0.2.13.tar.gz", hash = "sha256:72ea0c06399eb286d978fdedb6923a9eb47e1c486ce63e9b4e64fc18303972b5", size = 101301, upload-time = "2024-01-06T02:10:57.829Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/fd/84/fd2ba7aafacbad3c4201d395674fc6348826569da3c0937e75505ead3528/wcwidth-0.2.13-py2.py3-none-any.whl", hash = "sha256:3da69048e4540d84af32131829ff948f1e022c1c6bdb8d6102117aac784f6859", size = 34166, upload-time = "2024-01-06T02:10:55.763Z" }, +] + +[[package]] +name = "win32-setctime" +version = "1.2.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/b3/8f/705086c9d734d3b663af0e9bb3d4de6578d08f46b1b101c2442fd9aecaa2/win32_setctime-1.2.0.tar.gz", hash = "sha256:ae1fdf948f5640aae05c511ade119313fb6a30d7eabe25fef9764dca5873c4c0", size = 4867, upload-time = "2024-12-07T15:28:28.314Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/e1/07/c6fe3ad3e685340704d314d765b7912993bcb8dc198f0e7a89382d37974b/win32_setctime-1.2.0-py3-none-any.whl", hash = "sha256:95d644c4e708aba81dc3704a116d8cbc974d70b3bdb8be1d150e36be6e9d1390", size = 4083, upload-time = "2024-12-07T15:28:26.465Z" }, +] + +[[package]] +name = "zipp" +version = "3.23.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/e3/02/0f2892c661036d50ede074e376733dca2ae7c6eb617489437771209d4180/zipp-3.23.0.tar.gz", hash = 
"sha256:a07157588a12518c9d4034df3fbbee09c814741a33ff63c05fa29d26a2404166", size = 25547, upload-time = "2025-06-08T17:06:39.4Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/2e/54/647ade08bf0db230bfea292f893923872fd20be6ac6f53b2b936ba839d75/zipp-3.23.0-py3-none-any.whl", hash = "sha256:071652d6115ed432f5ce1d34c336c0adfd6a884660d1e9712a256d3d3bd4b14e", size = 10276, upload-time = "2025-06-08T17:06:38.034Z" }, +]