mirror of
https://github.com/crewAIInc/crewAI.git
synced 2026-01-09 08:08:32 +00:00
* feat: add `apps` & `actions` attributes to Agent (#3504)
* feat: add app attributes to Agent
* feat: add actions attribute to Agent
* chore: resolve linter issues
* refactor: merge the apps and actions parameters into a single one
* fix: remove unnecessary print
* feat: logging error when CrewaiPlatformTools fails
* chore: export CrewaiPlatformTools directly from crewai_tools
* style: resolve linter issues
* test: fix broken tests
* style: solve linter issues
* fix: fix broken test
* feat: monorepo restructure and test/ci updates
- Add crewai workspace member
- Fix vcr cassette paths and restore test dirs
- Resolve ci failures and update linter/pytest rules
* chore: update python version to 3.13 and package metadata
* feat: add crewai-tools workspace and fix tests/dependencies
* feat: add crewai-tools workspace structure
* Squashed 'temp-crewai-tools/' content from commit 9bae5633
git-subtree-dir: temp-crewai-tools
git-subtree-split: 9bae56339096cb70f03873e600192bd2cd207ac9
* feat: configure crewai-tools workspace package with dependencies
* fix: apply ruff auto-formatting to crewai-tools code
* chore: update lockfile
* fix: don't allow tool tests yet
* fix: comment out extra pytest flags for now
* fix: remove conflicting conftest.py from crewai-tools tests
* fix: resolve dependency conflicts and test issues
- Pin vcrpy to 7.0.0 to fix pytest-recording compatibility
- Comment out types-requests to resolve urllib3 conflict
- Update requests requirement in crewai-tools to >=2.32.0
* chore: update CI workflows and docs for monorepo structure
* chore: update CI workflows and docs for monorepo structure
* fix: actions syntax
* chore: ci publish and pin versions
* fix: add permission to action
* chore: bump version to 1.0.0a1 across all packages
- Updated version to 1.0.0a1 in pyproject.toml for crewai and crewai-tools
- Adjusted version in __init__.py files for consistency
* WIP: v1 docs (#3626)
(cherry picked from commit d46e20fa09bcd2f5916282f5553ddeb7183bd92c)
* docs: parity for all translations
* docs: full name of acronym AMP
* docs: fix lingering unused code
* docs: expand contextual options in docs.json
* docs: add contextual action to request feature on GitHub (#3635)
* chore: apply linting fixes to crewai-tools
* feat: add required env var validation for brightdata
Co-authored-by: Greyson Lalonde <greyson.r.lalonde@gmail.com>
* fix: properly handle anyOf/oneOf/allOf schema props
Co-authored-by: Greyson Lalonde <greyson.r.lalonde@gmail.com>
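The anyOf/oneOf/allOf fix above is about collecting properties that JSON Schema hides inside composite keywords. A minimal sketch of that idea, not the actual crewai-tools implementation (the function name and recursion strategy are assumptions):

```python
from typing import Any

def merge_composite_props(schema: dict[str, Any]) -> dict[str, Any]:
    """Collect 'properties' from a schema, descending into anyOf/oneOf/allOf."""
    props: dict[str, Any] = dict(schema.get("properties", {}))
    for keyword in ("anyOf", "oneOf", "allOf"):
        for subschema in schema.get(keyword, []):
            # Recurse so nested composites are flattened too.
            props.update(merge_composite_props(subschema))
    return props

schema = {
    "allOf": [
        {"properties": {"url": {"type": "string"}}},
        {"anyOf": [{"properties": {"zipcode": {"type": "string"}}}]},
    ],
    "properties": {"format": {"type": "string", "default": "json"}},
}
print(sorted(merge_composite_props(schema)))  # ['format', 'url', 'zipcode']
```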
* feat: bump version to 1.0.0a2
* Lorenze/native inference sdks (#3619)
* ruff linted
* using native sdks with litellm fallback
* drop exa
* drop print on completion
* Refactor LLM and utility functions for type consistency
- Updated `max_tokens` parameter in `LLM` class to accept `float` in addition to `int`.
- Modified `create_llm` function to ensure consistent type hints and return types, now returning `LLM | BaseLLM | None`.
- Adjusted type hints for various parameters in `create_llm` and `_llm_via_environment_or_fallback` functions for improved clarity and type safety.
- Enhanced test cases to reflect changes in type handling and ensure proper instantiation of LLM instances.
* fix agent_tests
* fix litellm tests and usagemetrics fix
* drop print
* Refactor LLM event handling and improve test coverage
- Removed commented-out event emission for LLM call failures in `llm.py`.
- Added `from_agent` parameter to `CrewAgentExecutor` for better context in LLM responses.
- Enhanced test for LLM call failure to simulate OpenAI API failure and updated assertions for clarity.
- Updated agent and task ID assertions in tests to ensure they are consistently treated as strings.
* fix test_converter
* fixed tests/agents/test_agent.py
* Refactor LLM context length exception handling and improve provider integration
- Renamed `LLMContextLengthExceededException` to `LLMContextLengthExceededExceptionError` for clarity and consistency.
- Updated LLM class to pass the provider parameter correctly during initialization.
- Enhanced error handling in various LLM provider implementations to raise the new exception type.
- Adjusted tests to reflect the updated exception name and ensure proper error handling in context length scenarios.
* Enhance LLM context window handling across providers
- Introduced CONTEXT_WINDOW_USAGE_RATIO to adjust context window sizes dynamically for Anthropic, Azure, Gemini, and OpenAI LLMs.
- Added validation for context window sizes in Azure and Gemini providers to ensure they fall within acceptable limits.
- Updated context window size calculations to use the new ratio, improving consistency and adaptability across different models.
- Removed hardcoded context window sizes in favor of ratio-based calculations for better flexibility.
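The ratio-based sizing described above can be sketched as follows. The constant name comes from the commit; the ratio value and the window-size table are illustrative assumptions, not the library's actual numbers:

```python
# Hypothetical sketch of ratio-based context window sizing.
CONTEXT_WINDOW_USAGE_RATIO = 0.85  # leave headroom for the model's response

MODEL_CONTEXT_WINDOWS = {  # example sizes, not authoritative
    "gpt-4o": 128_000,
    "claude-3-5-sonnet": 200_000,
}

def usable_context_window(model: str, default: int = 8_192) -> int:
    """Return the portion of the model's context window the agent may fill."""
    return int(MODEL_CONTEXT_WINDOWS.get(model, default) * CONTEXT_WINDOW_USAGE_RATIO)

print(usable_context_window("gpt-4o"))  # 108800
```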
* fix test agent again
* fix test agent
* feat: add native LLM providers for Anthropic, Azure, and Gemini
- Introduced new completion implementations for Anthropic, Azure, and Gemini, integrating their respective SDKs.
- Added utility functions for tool validation and extraction to support function calling across LLM providers.
- Enhanced context window management and token usage extraction for each provider.
- Created a common utility module for shared functionality among LLM providers.
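The "native SDK with LiteLLM fallback" pattern from this series of commits can be illustrated with a small dispatch sketch. The module names in the table are assumptions for illustration; the real selection logic lives inside the LLM class:

```python
import importlib

# Sketch: prefer the provider's native SDK, fall back to litellm if absent.
NATIVE_PROVIDER_MODULES = {
    "anthropic": "anthropic",
    "azure": "azure.ai.inference",
    "gemini": "google.genai",
}

def resolve_backend(provider: str) -> str:
    """Return which backend would serve this provider."""
    module_name = NATIVE_PROVIDER_MODULES.get(provider)
    if module_name is not None:
        try:
            importlib.import_module(module_name)
            return f"native:{module_name}"
        except ImportError:
            pass  # native SDK not installed; fall through to litellm
    return "litellm"

print(resolve_backend("unknown-provider"))  # litellm
```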
* chore: update dependencies and improve context management
- Removed direct dependency on `litellm` from the main dependencies and added it under extras for better modularity.
- Updated the `litellm` dependency specification to allow for greater flexibility in versioning.
- Refactored context length exception handling across various LLM providers to use a consistent error class.
- Enhanced platform-specific dependency markers for NVIDIA packages to ensure compatibility across different systems.
* refactor(tests): update LLM instantiation to include is_litellm flag in test cases
- Modified multiple test cases in test_llm.py to set the is_litellm parameter to True when instantiating the LLM class.
- This change ensures that the tests are aligned with the latest LLM configuration requirements and improves consistency across test scenarios.
- Adjusted relevant assertions and comments to reflect the updated LLM behavior.
* linter
* linted
* revert constants
* fix(tests): correct type hint in expected model description
- Updated the expected description in the test_generate_model_description_dict_field function to use 'Dict' instead of 'dict' for consistency with type hinting conventions.
- This change ensures that the test accurately reflects the expected output format for model descriptions.
* refactor(llm): enhance LLM instantiation and error handling
- Updated the LLM class to include validation for the model parameter, ensuring it is a non-empty string.
- Improved error handling by logging warnings when the native SDK fails, allowing for a fallback to LiteLLM.
- Adjusted the instantiation of LLM in test cases to consistently include the is_litellm flag, aligning with recent changes in LLM configuration.
- Modified relevant tests to reflect these updates, ensuring better coverage and accuracy in testing scenarios.
* fixed test
* refactor(llm): enhance token usage tracking and add copy methods
- Updated the LLM class to track token usage and log callbacks in streaming mode, improving monitoring capabilities.
- Introduced shallow and deep copy methods for the LLM instance, allowing for better management of LLM configurations and parameters.
- Adjusted test cases to instantiate LLM with the is_litellm flag, ensuring alignment with recent changes in LLM configuration.
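Explicit copy methods like those the commit above introduces control whether nested configuration is shared or duplicated. A minimal sketch with an invented stand-in class (not the real LLM):

```python
import copy

class LLMConfig:  # illustrative stand-in for a config-holding LLM object
    def __init__(self, model: str, params: dict):
        self.model = model
        self.params = params

    def __copy__(self):
        # Shallow copy: the params dict is shared with the original.
        return LLMConfig(self.model, self.params)

    def __deepcopy__(self, memo):
        # Deep copy: nested parameters are duplicated as well.
        return LLMConfig(self.model, copy.deepcopy(self.params, memo))

original = LLMConfig("gpt-4o", {"temperature": 0.2})
shallow, deep = copy.copy(original), copy.deepcopy(original)
shallow.params["temperature"] = 0.9
print(original.params["temperature"], deep.params["temperature"])  # 0.9 0.2
```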
* refactor(tests): reorganize imports and enhance error messages in test cases
- Cleaned up import statements in test_crew.py for better organization and readability.
- Enhanced error messages in test cases to use `re.escape` for improved regex matching, ensuring more robust error handling.
- Adjusted comments for clarity and consistency across test scenarios.
- Ensured that all necessary modules are imported correctly to avoid potential runtime issues.
* feat: add base devtooling
* fix: ensure dep refs are updated for devtools
* fix: allow pre-release
* feat: allow release after tag
* feat: bump versions to 1.0.0a3
Co-authored-by: Greyson LaLonde <greyson.r.lalonde@gmail.com>
* fix: match tag and release title, ignore devtools build for pypi
* fix: allow failed pypi publish
* feat: introduce trigger listing and execution commands for local development (#3643)
* chore: exclude tests from ruff linting
* chore: exclude tests from GitHub Actions linter
* fix: replace print statements with logger in agent and memory handling
* chore: add noqa for intentional print in printer utility
* fix: resolve linting errors across codebase
* feat: update docs with new approach to consume Platform Actions (#3675)
* fix: remove duplicate line and add explicit env var
* feat: bump versions to 1.0.0a4 (#3686)
* Update triggers docs (#3678)
* docs: introduce triggers list & triggers run command
* docs: add KO triggers docs
* docs: ensure CREWAI_PLATFORM_INTEGRATION_TOKEN is mentioned on docs (#3687)
* Lorenze/bedrock llm (#3693)
* feat: add AWS Bedrock support and update dependencies
- Introduced BedrockCompletion class for AWS Bedrock integration in LLM.
- Added boto3 as a new dependency in both pyproject.toml and uv.lock.
- Updated LLM class to support Bedrock provider.
- Created new files for Bedrock provider implementation.
* using converse api
* converse
* linted
* refactor: update BedrockCompletion class to improve parameter handling
- Changed max_tokens from a fixed integer to an optional integer.
- Simplified model ID assignment by removing the inference profile mapping method.
- Cleaned up comments and unnecessary code related to tool specifications and model-specific parameters.
* feat: improve event bus thread safety and async support
Add thread-safe, async-compatible event bus with read–write locking and
handler dependency ordering. Remove blinker dependency and implement
direct dispatch. Improve type safety, error handling, and deterministic
event synchronization.
Refactor tests to auto-wait for async handlers, ensure clean teardown,
and add comprehensive concurrency coverage. Replace thread-local state
in AgentEvaluator with instance-based locking for correct cross-thread
access. Enhance tracing reliability and event finalization.
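The direct-dispatch, lock-protected design described above can be sketched in a few lines. This is a bare-bones illustration under stated assumptions; the real event bus adds read-write locking, async handlers, and dependency ordering:

```python
import threading

class EventBus:
    """Minimal thread-safe event bus with direct dispatch."""

    def __init__(self):
        self._lock = threading.RLock()
        self._handlers: dict[str, list] = {}

    def on(self, event: str, handler):
        with self._lock:
            self._handlers.setdefault(event, []).append(handler)

    def emit(self, event: str, payload):
        with self._lock:
            handlers = list(self._handlers.get(event, []))  # snapshot under lock
        for handler in handlers:  # dispatch outside the lock to avoid deadlocks
            handler(payload)

bus = EventBus()
seen = []
bus.on("llm_call", seen.append)
bus.emit("llm_call", {"model": "gpt-4o"})
print(seen)  # [{'model': 'gpt-4o'}]
```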
* feat: enhance OpenAICompletion class with additional client parameters (#3701)
* feat: enhance OpenAICompletion class with additional client parameters
- Added support for default_headers, default_query, and client_params in the OpenAICompletion class.
- Refactored client initialization to use a dedicated method for client parameter retrieval.
- Introduced new test cases to validate the correct usage of OpenAICompletion with various parameters.
* fix: correct test case for unsupported OpenAI model
- Updated the test_openai.py to ensure that the LLM instance is created before calling the method, maintaining proper error handling for unsupported models.
- This change ensures that the test accurately checks for the NotFoundError when an invalid model is specified.
* fix: enhance error handling in OpenAICompletion class
- Added specific exception handling for NotFoundError and APIConnectionError in the OpenAICompletion class to provide clearer error messages and improve logging.
- Updated the test case for unsupported models to ensure it raises a ValueError with the appropriate message when a non-existent model is specified.
- This change improves the robustness of the OpenAI API integration and enhances the clarity of error reporting.
* fix: improve test for unsupported OpenAI model handling
- Refactored the test case in test_openai.py to create the LLM instance after mocking the OpenAI client, ensuring proper error handling for unsupported models.
- This change enhances the clarity of the test by accurately checking for ValueError when a non-existent model is specified, aligning with recent improvements in error handling for the OpenAICompletion class.
* feat: bump versions to 1.0.0b1 (#3706)
* Lorenze/tools drop litellm (#3710)
* completely drop litellm and correctly pass config for qdrant
* feat: add support for additional embedding models in EmbeddingService
- Expanded the list of supported embedding models to include Google Vertex, Hugging Face, Jina, Ollama, OpenAI, Roboflow, Watson X, custom embeddings, Sentence Transformers, Text2Vec, OpenClip, and Instructor.
- This enhancement improves the versatility of the EmbeddingService by allowing integration with a wider range of embedding providers.
* fix: update collection parameter handling in CrewAIRagAdapter
- Changed the condition for setting vectors_config in the CrewAIRagAdapter to check for QdrantConfig instance instead of using hasattr. This improves type safety and ensures proper configuration handling for Qdrant integration.
* moved stagehand as optional dep (#3712)
* feat: bump versions to 1.0.0b2 (#3713)
* feat: enhance AnthropicCompletion class with additional client parame… (#3707)
* feat: enhance AnthropicCompletion class with additional client parameters and tool handling
- Added support for client_params in the AnthropicCompletion class to allow for additional client configuration.
- Refactored client initialization to use a dedicated method for retrieving client parameters.
- Implemented a new method to handle tool use conversation flow, ensuring proper execution and response handling.
- Introduced comprehensive test cases to validate the functionality of the AnthropicCompletion class, including tool use scenarios and parameter handling.
* drop print statements
* test: add fixture to mock ANTHROPIC_API_KEY for tests
- Introduced a pytest fixture to automatically mock the ANTHROPIC_API_KEY environment variable for all tests in the test_anthropic.py module.
- This change ensures that tests can run without requiring a real API key, improving test isolation and reliability.
* refactor: streamline streaming message handling in AnthropicCompletion class
- Removed the 'stream' parameter from the API call as it is set internally by the SDK.
- Simplified the handling of tool use events and response construction by extracting token usage from the final message.
- Enhanced the flow for managing tool use conversation, ensuring proper integration with the streaming API response.
* fix streaming here too
* fix: improve error handling in tool conversion for AnthropicCompletion class
- Enhanced exception handling during tool conversion by catching KeyError and ValueError.
- Added logging for conversion errors to aid in debugging and maintain robustness in tool integration.
* feat: enhance GeminiCompletion class with client parameter support (#3717)
* feat: enhance GeminiCompletion class with client parameter support
- Added support for client_params in the GeminiCompletion class to allow for additional client configuration.
- Refactored client initialization into a dedicated method for improved parameter handling.
- Introduced a new method to retrieve client parameters, ensuring compatibility with the base class.
- Enhanced error handling during client initialization to provide clearer messages for missing configuration.
- Updated documentation to reflect the changes in client parameter usage.
* add optional dependencies
* refactor: update test fixture to mock GOOGLE_API_KEY
- Renamed the fixture from `mock_anthropic_api_key` to `mock_google_api_key` to reflect the change in the environment variable being mocked.
- This update ensures that all tests in the module can run with a mocked GOOGLE_API_KEY, improving test isolation and reliability.
* fix tests
* feat: enhance BedrockCompletion class with advanced features
* feat: enhance BedrockCompletion class with advanced features and error handling
- Added support for guardrail configuration, additional model request fields, and custom response field paths in the BedrockCompletion class.
- Improved error handling for AWS exceptions and added token usage tracking with stop reason logging.
- Enhanced streaming response handling with comprehensive event management, including tool use and content block processing.
- Updated documentation to reflect new features and initialization parameters.
- Introduced a new test suite for BedrockCompletion to validate functionality and ensure robust integration with AWS Bedrock APIs.
* chore: add boto typing
* fix: use typing_extensions.Required for Python 3.10 compatibility
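The reason for that fix: `typing.Required` only exists on Python 3.11+, so code that supports 3.10 imports it from `typing_extensions`. A small sketch of the compatibility pattern (the TypedDict itself is invented for illustration):

```python
try:
    from typing import Required, TypedDict  # Python 3.11+
except ImportError:
    from typing_extensions import Required, TypedDict  # Python 3.10

class GuardrailConfig(TypedDict, total=False):  # illustrative only
    name: Required[str]   # must always be present
    threshold: float      # optional because total=False

cfg: GuardrailConfig = {"name": "pii-filter"}
print(cfg["name"])  # pii-filter
```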
---------
Co-authored-by: Greyson Lalonde <greyson.r.lalonde@gmail.com>
* feat: azure native tests
* feat: add Azure AI Inference support and related tests
- Introduced the `azure-ai-inference` package with version `1.0.0b9` and its dependencies in `uv.lock` and `pyproject.toml`.
- Added new test files for Azure LLM functionality, including tests for Azure completion and tool handling.
- Implemented comprehensive test cases to validate Azure-specific behavior and integration with the CrewAI framework.
- Enhanced the testing framework to mock Azure credentials and ensure proper isolation during tests.
* feat: enhance AzureCompletion class with Azure OpenAI support
- Added support for the Azure OpenAI endpoint in the AzureCompletion class, allowing for flexible endpoint configurations.
- Implemented endpoint validation and correction to ensure proper URL formats for Azure OpenAI deployments.
- Enhanced error handling to provide clearer messages for common HTTP errors, including authentication and rate limit issues.
- Updated tests to validate the new endpoint handling and error messaging, ensuring robust integration with Azure AI Inference.
- Refactored parameter preparation to conditionally include the model parameter based on the endpoint type.
* refactor: convert project module to metaclass with full typing
* Lorenze/OpenAI base url backwards support (#3723)
* fix: enhance OpenAICompletion class base URL handling
- Updated the base URL assignment in the OpenAICompletion class to prioritize the new `api_base` attribute and fallback to the environment variable `OPENAI_BASE_URL` if both are not set.
- Added `api_base` to the list of parameters in the OpenAICompletion class to ensure proper configuration and flexibility in API endpoint management.
* feat: enhance OpenAICompletion class with api_base support
- Added the `api_base` parameter to the OpenAICompletion class to allow for flexible API endpoint configuration.
- Updated the `_get_client_params` method to prioritize `base_url` over `api_base`, ensuring correct URL handling.
- Introduced comprehensive tests to validate the behavior of `api_base` and `base_url` in various scenarios, including environment variable fallback.
- Enhanced test coverage for client parameter retrieval, ensuring robust integration with the OpenAI API.
* fix: improve OpenAICompletion class configuration handling
- Added a debug print statement to log the client configuration parameters during initialization for better traceability.
- Updated the base URL assignment logic to ensure it defaults to None if no valid base URL is provided, enhancing robustness in API endpoint configuration.
- Refined the retrieval of the `api_base` environment variable to streamline the configuration process.
* drop print
* feat: improvements on import native sdk support (#3725)
* feat: add support for Anthropic provider and enhance logging
- Introduced the `anthropic` package with version `0.69.0` in `pyproject.toml` and `uv.lock`, allowing for integration with the Anthropic API.
- Updated logging in the LLM class to provide clearer error messages when importing native providers, enhancing debugging capabilities.
- Improved error handling in the AnthropicCompletion class to guide users on installation via the updated error message format.
- Refactored import error handling in other provider classes to maintain consistency in error messaging and installation instructions.
* feat: enhance LLM support with Bedrock provider and update dependencies
- Added support for the `bedrock` provider in the LLM class, allowing integration with AWS Bedrock APIs.
- Updated `uv.lock` to replace `boto3` with `bedrock` in the dependencies, reflecting the new provider structure.
- Introduced `SUPPORTED_NATIVE_PROVIDERS` to include `bedrock` and ensure proper error handling when instantiating native providers.
- Enhanced error handling in the LLM class to raise informative errors when native provider instantiation fails.
- Added tests to validate the behavior of the new Bedrock provider and ensure fallback mechanisms work correctly for unsupported providers.
* test: update native provider fallback tests to expect ImportError
* adjust the test to the expected behavior: raising ImportError
* this is expecting the litellm format; all gemini native tests are in test_google.py
---------
Co-authored-by: Greyson LaLonde <greyson.r.lalonde@gmail.com>
* fix: remove stdout prints, improve test determinism, and update trace handling
Removed `print` statements from the `LLMStreamChunkEvent` handler to prevent
LLM response chunks from being written directly to stdout. The listener now
only tracks chunks internally.
Fixes #3715
Added explicit return statements for trace-related tests.
Updated cassette for `test_failed_evaluation` to reflect new behavior where
an empty trace dict is used instead of returning early.
Ensured deterministic cleanup order in test fixtures by making
`clear_event_bus_handlers` depend on `setup_test_environment`. This guarantees
event bus shutdown and file handle cleanup occur before temporary directory
deletion, resolving intermittent “Directory not empty” errors in CI.
* chore: remove lib/crewai exclusion from pre-commit hooks
* feat: enhance task guardrail functionality and validation
* feat: enhance task guardrail functionality and validation
- Introduced support for multiple guardrails in the Task class, allowing for sequential processing of guardrails.
- Added a new `guardrails` field to the Task model to accept a list of callable guardrails or string descriptions.
- Implemented validation to ensure guardrails are processed correctly, including handling of retries and error messages.
- Enhanced the `_invoke_guardrail_function` method to manage guardrail execution and integrate with existing task output processing.
- Updated tests to cover various scenarios involving multiple guardrails, including success, failure, and retry mechanisms.
This update improves the flexibility and robustness of task execution by allowing for more complex validation scenarios.
* refactor: enhance guardrail type handling in Task model
- Updated the Task class to improve guardrail type definitions, introducing GuardrailType and GuardrailsType for better clarity and type safety.
- Simplified the validation logic for guardrails, ensuring that both single and multiple guardrails are processed correctly.
- Enhanced error messages for guardrail validation to provide clearer feedback when incorrect types are provided.
- This refactor improves the maintainability and robustness of task execution by standardizing guardrail handling.
* feat: implement per-guardrail retry tracking in Task model
- Introduced a new private attribute `_guardrail_retry_counts` to the Task class for tracking retry attempts on a per-guardrail basis.
- Updated the guardrail processing logic to utilize the new retry tracking, allowing for independent retry counts for each guardrail.
- Enhanced error handling to provide clearer feedback when guardrails fail validation after exceeding retry limits.
- Modified existing tests to validate the new retry tracking behavior, ensuring accurate assertions on guardrail retries.
This update improves the robustness and flexibility of task execution by allowing for more granular control over guardrail validation and retry mechanisms.
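The sequential-guardrails-with-per-guardrail-retries behavior described above can be sketched as a loop keyed on guardrail index. The function and its signature are invented for illustration; the real logic lives inside the Task class:

```python
def run_guardrails(output, guardrails, max_retries=2, regenerate=None):
    """Run guardrails in order; each gets its own retry budget."""
    retry_counts: dict[int, int] = {}  # guardrail index -> retries used
    i = 0
    while i < len(guardrails):
        ok, feedback = guardrails[i](output)
        if ok:
            i += 1  # this guardrail passed; move to the next one
            continue
        retry_counts[i] = retry_counts.get(i, 0) + 1
        if retry_counts[i] > max_retries:
            raise ValueError(f"guardrail {i} failed after {max_retries} retries: {feedback}")
        output = regenerate(output, feedback)  # ask the agent to try again
    return output

result = run_guardrails(
    "draft",
    [lambda o: (o == "final", "not final yet")],
    regenerate=lambda o, fb: "final",
)
print(result)  # final
```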
* chore: 1.0.0b3 bump (#3734)
* chore: full ruff and mypy
improved linting, pre-commit setup, and internal architecture. Configured Ruff to respect .gitignore, added stricter rules, and introduced a lock pre-commit hook with virtualenv activation. Fixed type shadowing in EXASearchTool using a type_ alias to avoid PEP 563 conflicts and resolved circular imports in agent executor and guardrail modules. Removed agent-ops attributes, deprecated watson alias, and dropped crewai-enterprise tools with corresponding test updates. Refactored cache and memoization for thread safety and cleaned up structured output adapters and related logic.
* New MCP DSL (#3738)
* Adding MCP implementation
* New tests for MCP implementation
* fix tests
* update docs
* Revert "New tests for MCP implementation"
This reverts commit 0bbe6dee90.
* linter
* linter
* fix
* verify mcp package exists
* adjust docs to be clear only remote servers are supported
* reverted
* ensure args schema generated properly
* properly close out
---------
Co-authored-by: lorenzejay <lorenzejaytech@gmail.com>
Co-authored-by: Greyson Lalonde <greyson.r.lalonde@gmail.com>
* feat: a2a experimental
experimental a2a support
---------
Co-authored-by: Lucas Gomide <lucaslg200@gmail.com>
Co-authored-by: Greyson LaLonde <greyson.r.lalonde@gmail.com>
Co-authored-by: Tony Kipkemboi <iamtonykipkemboi@gmail.com>
Co-authored-by: Mike Plachta <mplachta@users.noreply.github.com>
Co-authored-by: João Moura <joaomdmoura@gmail.com>
9612 lines
273 KiB
JSON
{
|
|
"tools": [
|
|
{
|
|
"description": "A wrapper around [AI-Minds](https://mindsdb.com/minds). Useful for when you need answers to questions from your data, stored in data sources including PostgreSQL, MySQL, MariaDB, ClickHouse, Snowflake and Google BigQuery. Input should be a question in natural language.",
|
|
"env_vars": [
|
|
{
|
|
"default": null,
|
|
"description": "API key for AI-Minds",
|
|
"name": "MINDS_API_KEY",
|
|
"required": true
|
|
}
|
|
],
|
|
"humanized_name": "AIMind Tool",
|
|
"init_params_schema": {
|
|
"$defs": {
|
|
"EnvVar": {
|
|
"properties": {
|
|
"default": {
|
|
"anyOf": [
|
|
{
|
|
"type": "string"
|
|
},
|
|
{
|
|
"type": "null"
|
|
}
|
|
],
|
|
"default": null,
|
|
"title": "Default"
|
|
},
|
|
"description": {
|
|
"title": "Description",
|
|
"type": "string"
|
|
},
|
|
"name": {
|
|
"title": "Name",
|
|
"type": "string"
|
|
},
|
|
"required": {
|
|
"default": true,
|
|
"title": "Required",
|
|
"type": "boolean"
|
|
}
|
|
},
|
|
"required": [
|
|
"name",
|
|
"description"
|
|
],
|
|
"title": "EnvVar",
|
|
"type": "object"
|
|
}
|
|
},
|
|
"properties": {
|
|
"api_key": {
|
|
"anyOf": [
|
|
{
|
|
"type": "string"
|
|
},
|
|
{
|
|
"type": "null"
|
|
}
|
|
],
|
|
"default": null,
|
|
"title": "Api Key"
|
|
},
|
|
"datasources": {
|
|
"anyOf": [
|
|
{
|
|
"items": {
|
|
"additionalProperties": true,
|
|
"type": "object"
|
|
},
|
|
"type": "array"
|
|
},
|
|
{
|
|
"type": "null"
|
|
}
|
|
],
|
|
"default": null,
|
|
"title": "Datasources"
|
|
},
|
|
"mind_name": {
|
|
"anyOf": [
|
|
{
|
|
"type": "string"
|
|
},
|
|
{
|
|
"type": "null"
|
|
}
|
|
],
|
|
"default": null,
|
|
"title": "Mind Name"
|
|
}
|
|
},
|
|
"title": "AIMindTool",
|
|
"type": "object"
|
|
},
|
|
"name": "AIMindTool",
|
|
"package_dependencies": [
|
|
"minds-sdk"
|
|
],
|
|
"run_params_schema": {
|
|
"description": "Input for AIMind Tool.",
|
|
"properties": {
|
|
"query": {
|
|
"description": "Question in natural language to ask the AI-Mind",
|
|
"title": "Query",
|
|
"type": "string"
|
|
}
|
|
},
|
|
"required": [
|
|
"query"
|
|
],
|
|
"title": "AIMindToolInputSchema",
|
|
"type": "object"
|
|
}
|
|
},
|
|
{
|
|
"description": "Fetches metadata from Arxiv based on a search query and optionally downloads PDFs.",
|
|
"env_vars": [],
|
|
"humanized_name": "Arxiv Paper Fetcher and Downloader",
|
|
"init_params_schema": {
|
|
"$defs": {
|
|
"EnvVar": {
|
|
"properties": {
|
|
"default": {
|
|
"anyOf": [
|
|
{
|
|
"type": "string"
|
|
},
|
|
{
|
|
"type": "null"
|
|
}
|
|
],
|
|
"default": null,
|
|
"title": "Default"
|
|
},
|
|
"description": {
|
|
"title": "Description",
|
|
"type": "string"
|
|
},
|
|
"name": {
|
|
"title": "Name",
|
|
"type": "string"
|
|
},
|
|
"required": {
|
|
"default": true,
|
|
"title": "Required",
|
|
"type": "boolean"
|
|
}
|
|
},
|
|
"required": [
|
|
"name",
|
|
"description"
|
|
],
|
|
"title": "EnvVar",
|
|
"type": "object"
|
|
}
|
|
},
|
|
"additionalProperties": true,
|
|
"properties": {},
|
|
"title": "ArxivPaperTool",
|
|
"type": "object"
|
|
},
|
|
"name": "ArxivPaperTool",
|
|
"package_dependencies": [
|
|
"pydantic"
|
|
],
|
|
"run_params_schema": {
|
|
"properties": {
|
|
"max_results": {
|
|
"default": 5,
|
|
"description": "Max results to fetch; must be between 1 and 100",
|
|
"maximum": 100,
|
|
"minimum": 1,
|
|
"title": "Max Results",
|
|
"type": "integer"
|
|
},
|
|
"search_query": {
|
|
"description": "Search query for Arxiv, e.g., 'transformer neural network'",
|
|
"title": "Search Query",
|
|
"type": "string"
|
|
}
|
|
},
|
|
"required": [
|
|
"search_query"
|
|
],
|
|
"title": "ArxivToolInput",
|
|
"type": "object"
|
|
}
|
|
},
|
|
{
|
|
"description": "A tool that can be used to search the internet with a search_query.",
|
|
"env_vars": [
|
|
{
|
|
"default": null,
|
|
"description": "API key for Brave Search",
|
|
"name": "BRAVE_API_KEY",
|
|
"required": true
|
|
}
|
|
],
|
|
"humanized_name": "Brave Web Search the internet",
|
|
"init_params_schema": {
|
|
"$defs": {
|
|
"EnvVar": {
|
|
"properties": {
|
|
"default": {
|
|
"anyOf": [
|
|
{
|
|
"type": "string"
|
|
},
|
|
{
|
|
"type": "null"
|
|
}
|
|
],
|
|
"default": null,
|
|
"title": "Default"
|
|
},
|
|
"description": {
|
|
"title": "Description",
|
|
"type": "string"
|
|
},
|
|
"name": {
|
|
"title": "Name",
|
|
"type": "string"
|
|
},
|
|
"required": {
|
|
"default": true,
|
|
"title": "Required",
|
|
"type": "boolean"
|
|
}
|
|
},
|
|
"required": [
|
|
"name",
|
|
"description"
|
|
],
|
|
"title": "EnvVar",
|
|
"type": "object"
|
|
}
|
|
},
|
|
"description": "BraveSearchTool - A tool for performing web searches using the Brave Search API.\n\nThis module provides functionality to search the internet using Brave's Search API,\nsupporting customizable result counts and country-specific searches.\n\nDependencies:\n - requests\n - pydantic\n - python-dotenv (for API key management)",
|
|
"properties": {
|
|
"country": {
|
|
"anyOf": [
|
|
{
|
|
"type": "string"
|
|
},
|
|
{
|
|
"type": "null"
|
|
}
|
|
],
|
|
"default": "",
|
|
"title": "Country"
|
|
},
|
|
"n_results": {
|
|
"default": 10,
|
|
"title": "N Results",
|
|
"type": "integer"
|
|
},
|
|
"save_file": {
|
|
"default": false,
|
|
"title": "Save File",
|
|
"type": "boolean"
|
|
},
|
|
"search_url": {
|
|
"default": "https://api.search.brave.com/res/v1/web/search",
|
|
"title": "Search Url",
|
|
"type": "string"
|
|
}
|
|
},
|
|
"title": "BraveSearchTool",
|
|
"type": "object"
|
|
},
|
|
"name": "BraveSearchTool",
|
|
"package_dependencies": [],
|
|
"run_params_schema": {
|
|
"description": "Input for BraveSearchTool.",
|
|
"properties": {
|
|
"search_query": {
|
|
"description": "Mandatory search query you want to use to search the internet",
|
|
"title": "Search Query",
|
|
"type": "string"
|
|
}
|
|
},
|
|
"required": [
|
|
"search_query"
|
|
],
|
|
"title": "BraveSearchToolSchema",
|
|
"type": "object"
|
|
}
|
|
},
|
|
{
  "description": "Scrapes structured data using Bright Data Dataset API from a URL and optional input parameters",
  "env_vars": [],
  "humanized_name": "Bright Data Dataset Tool",
  "init_params_schema": {
    "$defs": {
      "EnvVar": {
        "properties": {
          "default": {"anyOf": [{"type": "string"}, {"type": "null"}], "default": null, "title": "Default"},
          "description": {"title": "Description", "type": "string"},
          "name": {"title": "Name", "type": "string"},
          "required": {"default": true, "title": "Required", "type": "boolean"}
        },
        "required": ["name", "description"],
        "title": "EnvVar",
        "type": "object"
      }
    },
    "description": "CrewAI-compatible tool for scraping structured data using Bright Data Datasets.\n\nAttributes:\n name (str): Tool name displayed in the CrewAI environment.\n description (str): Tool description shown to agents or users.\n args_schema (Type[BaseModel]): Pydantic schema for validating input arguments.",
    "properties": {
      "additional_params": {"anyOf": [{"additionalProperties": true, "type": "object"}, {"type": "null"}], "default": null, "title": "Additional Params"},
      "dataset_type": {"anyOf": [{"type": "string"}, {"type": "null"}], "default": null, "title": "Dataset Type"},
      "format": {"default": "json", "title": "Format", "type": "string"},
      "url": {"anyOf": [{"type": "string"}, {"type": "null"}], "default": null, "title": "Url"},
      "zipcode": {"anyOf": [{"type": "string"}, {"type": "null"}], "default": null, "title": "Zipcode"}
    },
    "title": "BrightDataDatasetTool",
    "type": "object"
  },
  "name": "BrightDataDatasetTool",
  "package_dependencies": [],
  "run_params_schema": {
    "description": "Schema for validating input parameters for the BrightDataDatasetTool.\n\nAttributes:\n dataset_type (str): Required Bright Data Dataset Type used to specify which dataset to access.\n format (str): Response format (json by default). Multiple formats exist - json, ndjson, jsonl, csv\n url (str): The URL from which structured data needs to be extracted.\n zipcode (Optional[str]): An optional ZIP code to narrow down the data geographically.\n additional_params (Optional[Dict]): Extra parameters for the Bright Data API call.",
    "properties": {
      "additional_params": {"anyOf": [{"additionalProperties": true, "type": "object"}, {"type": "null"}], "default": null, "description": "Additional params if any", "title": "Additional Params"},
      "dataset_type": {"description": "The Bright Data Dataset Type", "title": "Dataset Type", "type": "string"},
      "format": {"anyOf": [{"type": "string"}, {"type": "null"}], "default": "json", "description": "Response format (json by default)", "title": "Format"},
      "url": {"description": "The URL to extract data from", "title": "Url", "type": "string"},
      "zipcode": {"anyOf": [{"type": "string"}, {"type": "null"}], "default": null, "description": "Optional zipcode", "title": "Zipcode"}
    },
    "required": ["dataset_type", "url"],
    "title": "BrightDataDatasetToolSchema",
    "type": "object"
  }
},
{
  "description": "Tool to perform web search using Bright Data SERP API.",
  "env_vars": [],
  "humanized_name": "Bright Data SERP Search",
  "init_params_schema": {
    "$defs": {
      "EnvVar": {
        "properties": {
          "default": {"anyOf": [{"type": "string"}, {"type": "null"}], "default": null, "title": "Default"},
          "description": {"title": "Description", "type": "string"},
          "name": {"title": "Name", "type": "string"},
          "required": {"default": true, "title": "Required", "type": "boolean"}
        },
        "required": ["name", "description"],
        "title": "EnvVar",
        "type": "object"
      }
    },
    "description": "A web search tool that utilizes Bright Data's SERP API to perform queries and return either structured results\nor raw page content from search engines like Google or Bing.\n\nAttributes:\n name (str): Tool name used by the agent.\n description (str): A brief explanation of what the tool does.\n args_schema (Type[BaseModel]): Schema class for validating tool arguments.\n base_url (str): The Bright Data API endpoint used for making the POST request.\n api_key (str): Bright Data API key loaded from the environment variable 'BRIGHT_DATA_API_KEY'.\n zone (str): Zone identifier from Bright Data, loaded from the environment variable 'BRIGHT_DATA_ZONE'.\n\nRaises:\n ValueError: If API key or zone environment variables are not set.",
    "properties": {
      "api_key": {"default": "", "title": "Api Key", "type": "string"},
      "base_url": {"default": "", "title": "Base Url", "type": "string"},
      "country": {"default": "us", "title": "Country", "type": "string"},
      "device_type": {"default": "desktop", "title": "Device Type", "type": "string"},
      "language": {"default": "en", "title": "Language", "type": "string"},
      "parse_results": {"default": true, "title": "Parse Results", "type": "boolean"},
      "query": {"anyOf": [{"type": "string"}, {"type": "null"}], "default": null, "title": "Query"},
      "search_engine": {"default": "google", "title": "Search Engine", "type": "string"},
      "search_type": {"anyOf": [{"type": "string"}, {"type": "null"}], "default": null, "title": "Search Type"},
      "zone": {"default": "", "title": "Zone", "type": "string"}
    },
    "title": "BrightDataSearchTool",
    "type": "object"
  },
  "name": "BrightDataSearchTool",
  "package_dependencies": [],
  "run_params_schema": {
    "description": "Schema that defines the input arguments for the BrightDataSearchToolSchema.\n\nAttributes:\n query (str): The search query to be executed (e.g., \"latest AI news\").\n search_engine (Optional[str]): The search engine to use (\"google\", \"bing\", \"yandex\"). Default is \"google\".\n country (Optional[str]): Two-letter country code for geo-targeting (e.g., \"us\", \"in\"). Default is \"us\".\n language (Optional[str]): Language code for search results (e.g., \"en\", \"es\"). Default is \"en\".\n search_type (Optional[str]): Type of search, such as \"isch\" (images), \"nws\" (news), \"jobs\", etc.\n device_type (Optional[str]): Device type to simulate (\"desktop\", \"mobile\", \"ios\", \"android\"). Default is \"desktop\".\n parse_results (Optional[bool]): If True, results will be returned in structured JSON. If False, raw HTML. Default is True.",
    "properties": {
      "country": {"anyOf": [{"type": "string"}, {"type": "null"}], "default": "us", "description": "Two-letter country code for geo-targeting (e.g., 'us', 'gb')", "title": "Country"},
      "device_type": {"anyOf": [{"type": "string"}, {"type": "null"}], "default": "desktop", "description": "Device type to simulate (e.g., 'mobile', 'desktop', 'ios')", "title": "Device Type"},
      "language": {"anyOf": [{"type": "string"}, {"type": "null"}], "default": "en", "description": "Language code (e.g., 'en', 'es') used in the query URL", "title": "Language"},
      "parse_results": {"anyOf": [{"type": "boolean"}, {"type": "null"}], "default": true, "description": "Whether to parse and return JSON (True) or raw HTML/text (False)", "title": "Parse Results"},
      "query": {"description": "Search query to perform", "title": "Query", "type": "string"},
      "search_engine": {"anyOf": [{"type": "string"}, {"type": "null"}], "default": "google", "description": "Search engine domain (e.g., 'google', 'bing', 'yandex')", "title": "Search Engine"},
      "search_type": {"anyOf": [{"type": "string"}, {"type": "null"}], "default": null, "description": "Type of search (e.g., 'isch' for images, 'nws' for news)", "title": "Search Type"}
    },
    "required": ["query"],
    "title": "BrightDataSearchToolSchema",
    "type": "object"
  }
},
{
  "description": "Tool to perform web scraping using Bright Data Web Unlocker",
  "env_vars": [],
  "humanized_name": "Bright Data Web Unlocker Scraping",
  "init_params_schema": {
    "$defs": {
      "EnvVar": {
        "properties": {
          "default": {"anyOf": [{"type": "string"}, {"type": "null"}], "default": null, "title": "Default"},
          "description": {"title": "Description", "type": "string"},
          "name": {"title": "Name", "type": "string"},
          "required": {"default": true, "title": "Required", "type": "boolean"}
        },
        "required": ["name", "description"],
        "title": "EnvVar",
        "type": "object"
      }
    },
    "description": "A tool for performing web scraping using the Bright Data Web Unlocker API.\n\nThis tool allows automated and programmatic access to web pages by routing requests\nthrough Bright Data's unlocking and proxy infrastructure, which can bypass bot\nprotection mechanisms like CAPTCHA, geo-restrictions, and anti-bot detection.\n\nAttributes:\n name (str): Name of the tool.\n description (str): Description of what the tool does.\n args_schema (Type[BaseModel]): Pydantic model schema for expected input arguments.\n base_url (str): Base URL of the Bright Data Web Unlocker API.\n api_key (str): Bright Data API key (must be set in the BRIGHT_DATA_API_KEY environment variable).\n zone (str): Bright Data zone identifier (must be set in the BRIGHT_DATA_ZONE environment variable).\n\nMethods:\n _run(**kwargs: Any) -> Any:\n Sends a scraping request to Bright Data's Web Unlocker API and returns the result.",
    "properties": {
      "api_key": {"default": "", "title": "Api Key", "type": "string"},
      "base_url": {"default": "", "title": "Base Url", "type": "string"},
      "data_format": {"default": "markdown", "title": "Data Format", "type": "string"},
      "format": {"default": "raw", "title": "Format", "type": "string"},
      "url": {"anyOf": [{"type": "string"}, {"type": "null"}], "default": null, "title": "Url"},
      "zone": {"default": "", "title": "Zone", "type": "string"}
    },
    "title": "BrightDataWebUnlockerTool",
    "type": "object"
  },
  "name": "BrightDataWebUnlockerTool",
  "package_dependencies": [],
  "run_params_schema": {
    "description": "Pydantic schema for input parameters used by the BrightDataWebUnlockerTool.\n\nThis schema defines the structure and validation for parameters passed when performing\na web scraping request using Bright Data's Web Unlocker.\n\nAttributes:\n url (str): The target URL to scrape.\n format (Optional[str]): Format of the response returned by Bright Data. Default 'raw' format.\n data_format (Optional[str]): Response data format (html by default). markdown is one more option.",
    "properties": {
      "data_format": {"anyOf": [{"type": "string"}, {"type": "null"}], "default": "markdown", "description": "Response data format (html by default)", "title": "Data Format"},
      "format": {"anyOf": [{"type": "string"}, {"type": "null"}], "default": "raw", "description": "Response format (raw is standard)", "title": "Format"},
      "url": {"description": "URL to perform the web scraping", "title": "Url", "type": "string"}
    },
    "required": ["url"],
    "title": "BrightDataUnlockerToolSchema",
    "type": "object"
  }
},
{
  "description": "Load webpages url in a headless browser using Browserbase and return the contents",
  "env_vars": [
    {"default": null, "description": "API key for Browserbase services", "name": "BROWSERBASE_API_KEY", "required": false},
    {"default": null, "description": "Project ID for Browserbase services", "name": "BROWSERBASE_PROJECT_ID", "required": false}
  ],
  "humanized_name": "Browserbase web load tool",
  "init_params_schema": {
    "$defs": {
      "EnvVar": {
        "properties": {
          "default": {"anyOf": [{"type": "string"}, {"type": "null"}], "default": null, "title": "Default"},
          "description": {"title": "Description", "type": "string"},
          "name": {"title": "Name", "type": "string"},
          "required": {"default": true, "title": "Required", "type": "boolean"}
        },
        "required": ["name", "description"],
        "title": "EnvVar",
        "type": "object"
      }
    },
    "properties": {
      "api_key": {"anyOf": [{"type": "string"}, {"type": "null"}], "default": null, "title": "Api Key"},
      "browserbase": {"anyOf": [{}, {"type": "null"}], "default": null, "title": "Browserbase"},
      "project_id": {"anyOf": [{"type": "string"}, {"type": "null"}], "default": null, "title": "Project Id"},
      "proxy": {"anyOf": [{"type": "boolean"}, {"type": "null"}], "default": null, "title": "Proxy"},
      "session_id": {"anyOf": [{"type": "string"}, {"type": "null"}], "default": null, "title": "Session Id"},
      "text_content": {"anyOf": [{"type": "boolean"}, {"type": "null"}], "default": false, "title": "Text Content"}
    },
    "title": "BrowserbaseLoadTool",
    "type": "object"
  },
  "name": "BrowserbaseLoadTool",
  "package_dependencies": [
    "browserbase"
  ],
  "run_params_schema": {
    "properties": {
      "url": {"description": "Website URL", "title": "Url", "type": "string"}
    },
    "required": ["url"],
    "title": "BrowserbaseLoadToolSchema",
    "type": "object"
  }
},
{
  "description": "A tool that can be used to semantic search a query from a CSV's content.",
  "env_vars": [],
  "humanized_name": "Search a CSV's content",
  "init_params_schema": {
    "$defs": {
      "Adapter": {"properties": {}, "title": "Adapter", "type": "object"},
      "EnvVar": {
        "properties": {
          "default": {"anyOf": [{"type": "string"}, {"type": "null"}], "default": null, "title": "Default"},
          "description": {"title": "Description", "type": "string"},
          "name": {"title": "Name", "type": "string"},
          "required": {"default": true, "title": "Required", "type": "boolean"}
        },
        "required": ["name", "description"],
        "title": "EnvVar",
        "type": "object"
      }
    },
    "properties": {
      "adapter": {"$ref": "#/$defs/Adapter"},
      "config": {"anyOf": [{"additionalProperties": true, "type": "object"}, {"type": "null"}], "default": null, "title": "Config"},
      "summarize": {"default": false, "title": "Summarize", "type": "boolean"}
    },
    "title": "CSVSearchTool",
    "type": "object"
  },
  "name": "CSVSearchTool",
  "package_dependencies": [],
  "run_params_schema": {
    "description": "Input for CSVSearchTool.",
    "properties": {
      "csv": {"description": "Mandatory csv path you want to search", "title": "Csv", "type": "string"},
      "search_query": {"description": "Mandatory search query you want to use to search the CSV's content", "title": "Search Query", "type": "string"}
    },
    "required": ["search_query", "csv"],
    "title": "CSVSearchToolSchema",
    "type": "object"
  }
},
{
  "description": "A tool that can be used to semantic search a query from a Code Docs content.",
  "env_vars": [],
  "humanized_name": "Search a Code Docs content",
  "init_params_schema": {
    "$defs": {
      "Adapter": {"properties": {}, "title": "Adapter", "type": "object"},
      "EnvVar": {
        "properties": {
          "default": {"anyOf": [{"type": "string"}, {"type": "null"}], "default": null, "title": "Default"},
          "description": {"title": "Description", "type": "string"},
          "name": {"title": "Name", "type": "string"},
          "required": {"default": true, "title": "Required", "type": "boolean"}
        },
        "required": ["name", "description"],
        "title": "EnvVar",
        "type": "object"
      }
    },
    "properties": {
      "adapter": {"$ref": "#/$defs/Adapter"},
      "config": {"anyOf": [{"additionalProperties": true, "type": "object"}, {"type": "null"}], "default": null, "title": "Config"},
      "summarize": {"default": false, "title": "Summarize", "type": "boolean"}
    },
    "title": "CodeDocsSearchTool",
    "type": "object"
  },
  "name": "CodeDocsSearchTool",
  "package_dependencies": [],
  "run_params_schema": {
    "description": "Input for CodeDocsSearchTool.",
    "properties": {
      "docs_url": {"description": "Mandatory docs_url path you want to search", "title": "Docs Url", "type": "string"},
      "search_query": {"description": "Mandatory search query you want to use to search the Code Docs content", "title": "Search Query", "type": "string"}
    },
    "required": ["search_query", "docs_url"],
    "title": "CodeDocsSearchToolSchema",
    "type": "object"
  }
},
{
  "description": "Interprets Python3 code strings with a final print statement.",
  "env_vars": [],
  "humanized_name": "Code Interpreter",
  "init_params_schema": {
    "$defs": {
      "EnvVar": {
        "properties": {
          "default": {"anyOf": [{"type": "string"}, {"type": "null"}], "default": null, "title": "Default"},
          "description": {"title": "Description", "type": "string"},
          "name": {"title": "Name", "type": "string"},
          "required": {"default": true, "title": "Required", "type": "boolean"}
        },
        "required": ["name", "description"],
        "title": "EnvVar",
        "type": "object"
      }
    },
    "description": "A tool for executing Python code in isolated environments.\n\nThis tool provides functionality to run Python code either in a Docker container\nfor safe isolation or directly in a restricted sandbox. It can handle installing\nPython packages and executing arbitrary Python code.",
    "properties": {
      "code": {"anyOf": [{"type": "string"}, {"type": "null"}], "default": null, "title": "Code"},
      "default_image_tag": {"default": "code-interpreter:latest", "title": "Default Image Tag", "type": "string"},
      "unsafe_mode": {"default": false, "title": "Unsafe Mode", "type": "boolean"},
      "user_docker_base_url": {"anyOf": [{"type": "string"}, {"type": "null"}], "default": null, "title": "User Docker Base Url"},
      "user_dockerfile_path": {"anyOf": [{"type": "string"}, {"type": "null"}], "default": null, "title": "User Dockerfile Path"}
    },
    "title": "CodeInterpreterTool",
    "type": "object"
  },
  "name": "CodeInterpreterTool",
  "package_dependencies": [],
  "run_params_schema": {
    "description": "Schema for defining inputs to the CodeInterpreterTool.\n\nThis schema defines the required parameters for code execution,\nincluding the code to run and any libraries that need to be installed.",
    "properties": {
      "code": {"description": "Python3 code used to be interpreted in the Docker container. ALWAYS PRINT the final result and the output of the code", "title": "Code", "type": "string"},
      "libraries_used": {"description": "List of libraries used in the code with proper installing names separated by commas. Example: numpy,pandas,beautifulsoup4", "items": {"type": "string"}, "title": "Libraries Used", "type": "array"}
    },
    "required": ["code", "libraries_used"],
    "title": "CodeInterpreterSchema",
    "type": "object"
  }
},
{
  "description": "",
  "env_vars": [
    {"default": null, "description": "API key for Composio services", "name": "COMPOSIO_API_KEY", "required": true}
  ],
  "humanized_name": "ComposioTool",
  "init_params_schema": {
    "$defs": {
      "EnvVar": {
        "properties": {
          "default": {"anyOf": [{"type": "string"}, {"type": "null"}], "default": null, "title": "Default"},
          "description": {"title": "Description", "type": "string"},
          "name": {"title": "Name", "type": "string"},
          "required": {"default": true, "title": "Required", "type": "boolean"}
        },
        "required": ["name", "description"],
        "title": "EnvVar",
        "type": "object"
      }
    },
    "description": "Wrapper for composio tools.",
    "properties": {},
    "required": ["name", "description"],
    "title": "ComposioTool",
    "type": "object"
  },
  "name": "ComposioTool",
  "package_dependencies": [],
  "run_params_schema": {}
},
{
  "description": "Create a new Contextual AI RAG agent with documents and datastore",
  "env_vars": [],
  "humanized_name": "Contextual AI Create Agent Tool",
  "init_params_schema": {
    "$defs": {
      "EnvVar": {
        "properties": {
          "default": {"anyOf": [{"type": "string"}, {"type": "null"}], "default": null, "title": "Default"},
          "description": {"title": "Description", "type": "string"},
          "name": {"title": "Name", "type": "string"},
          "required": {"default": true, "title": "Required", "type": "boolean"}
        },
        "required": ["name", "description"],
        "title": "EnvVar",
        "type": "object"
      }
    },
    "description": "Tool to create Contextual AI RAG agents with documents.",
    "properties": {
      "api_key": {"title": "Api Key", "type": "string"},
      "contextual_client": {"default": null, "title": "Contextual Client"}
    },
    "required": ["api_key"],
    "title": "ContextualAICreateAgentTool",
    "type": "object"
  },
  "name": "ContextualAICreateAgentTool",
  "package_dependencies": [
    "contextual-client"
  ],
  "run_params_schema": {
    "description": "Schema for contextual create agent tool.",
    "properties": {
      "agent_description": {"description": "Description for the new agent", "title": "Agent Description", "type": "string"},
      "agent_name": {"description": "Name for the new agent", "title": "Agent Name", "type": "string"},
      "datastore_name": {"description": "Name for the new datastore", "title": "Datastore Name", "type": "string"},
      "document_paths": {"description": "List of file paths to upload", "items": {"type": "string"}, "title": "Document Paths", "type": "array"}
    },
    "required": ["agent_name", "agent_description", "datastore_name", "document_paths"],
    "title": "ContextualAICreateAgentSchema",
    "type": "object"
  }
},
{
  "description": "Parse documents using Contextual AI's advanced document parser",
  "env_vars": [],
  "humanized_name": "Contextual AI Document Parser",
  "init_params_schema": {
    "$defs": {
      "EnvVar": {
        "properties": {
          "default": {"anyOf": [{"type": "string"}, {"type": "null"}], "default": null, "title": "Default"},
          "description": {"title": "Description", "type": "string"},
          "name": {"title": "Name", "type": "string"},
          "required": {"default": true, "title": "Required", "type": "boolean"}
        },
        "required": ["name", "description"],
        "title": "EnvVar",
        "type": "object"
      }
    },
    "description": "Tool to parse documents using Contextual AI's parser.",
    "properties": {
      "api_key": {"title": "Api Key", "type": "string"}
    },
    "required": ["api_key"],
    "title": "ContextualAIParseTool",
    "type": "object"
  },
  "name": "ContextualAIParseTool",
  "package_dependencies": [
    "contextual-client"
  ],
  "run_params_schema": {
    "description": "Schema for contextual parse tool.",
    "properties": {
      "enable_document_hierarchy": {"default": true, "description": "Enable document hierarchy", "title": "Enable Document Hierarchy", "type": "boolean"},
      "figure_caption_mode": {"default": "concise", "description": "Figure caption mode", "title": "Figure Caption Mode", "type": "string"},
      "file_path": {"description": "Path to the document to parse", "title": "File Path", "type": "string"},
      "output_types": {"default": ["markdown-per-page"], "description": "List of output types", "items": {"type": "string"}, "title": "Output Types", "type": "array"},
      "page_range": {"anyOf": [{"type": "string"}, {"type": "null"}], "default": null, "description": "Page range to parse (e.g., '0-5')", "title": "Page Range"},
      "parse_mode": {"default": "standard", "description": "Parsing mode", "title": "Parse Mode", "type": "string"}
    },
    "required": ["file_path"],
    "title": "ContextualAIParseSchema",
    "type": "object"
  }
},
{
  "description": "Use this tool to query a Contextual AI RAG agent with access to your documents",
  "env_vars": [],
  "humanized_name": "Contextual AI Query Tool",
  "init_params_schema": {
    "$defs": {
      "EnvVar": {
        "properties": {
          "default": {"anyOf": [{"type": "string"}, {"type": "null"}], "default": null, "title": "Default"},
          "description": {"title": "Description", "type": "string"},
          "name": {"title": "Name", "type": "string"},
          "required": {"default": true, "title": "Required", "type": "boolean"}
        },
        "required": ["name", "description"],
        "title": "EnvVar",
        "type": "object"
      }
    },
    "description": "Tool to query Contextual AI RAG agents.",
    "properties": {
      "api_key": {"title": "Api Key", "type": "string"},
      "contextual_client": {"default": null, "title": "Contextual Client"}
    },
    "required": ["api_key"],
    "title": "ContextualAIQueryTool",
    "type": "object"
  },
  "name": "ContextualAIQueryTool",
  "package_dependencies": [
    "contextual-client"
  ],
  "run_params_schema": {
    "description": "Schema for contextual query tool.",
    "properties": {
      "agent_id": {"description": "ID of the Contextual AI agent to query", "title": "Agent Id", "type": "string"},
      "datastore_id": {"anyOf": [{"type": "string"}, {"type": "null"}], "default": null, "description": "Optional datastore ID for document readiness verification", "title": "Datastore Id"},
      "query": {"description": "Query to send to the Contextual AI agent.", "title": "Query", "type": "string"}
    },
    "required": ["query", "agent_id"],
    "title": "ContextualAIQuerySchema",
    "type": "object"
  }
},
{
  "description": "Rerank documents using Contextual AI's instruction-following reranker",
  "env_vars": [],
  "humanized_name": "Contextual AI Document Reranker",
  "init_params_schema": {
    "$defs": {
      "EnvVar": {
        "properties": {
          "default": {"anyOf": [{"type": "string"}, {"type": "null"}], "default": null, "title": "Default"},
          "description": {"title": "Description", "type": "string"},
          "name": {"title": "Name", "type": "string"},
          "required": {"default": true, "title": "Required", "type": "boolean"}
        },
        "required": ["name", "description"],
        "title": "EnvVar",
        "type": "object"
      }
    },
    "description": "Tool to rerank documents using Contextual AI's instruction-following reranker.",
    "properties": {
      "api_key": {"title": "Api Key", "type": "string"}
    },
    "required": ["api_key"],
    "title": "ContextualAIRerankTool",
    "type": "object"
  },
  "name": "ContextualAIRerankTool",
  "package_dependencies": [
    "contextual-client"
  ],
  "run_params_schema": {
    "description": "Schema for contextual rerank tool.",
    "properties": {
      "documents": {"description": "List of document texts to rerank", "items": {"type": "string"}, "title": "Documents", "type": "array"},
      "instruction": {"anyOf": [{"type": "string"}, {"type": "null"}], "default": null, "description": "Optional instruction for reranking behavior", "title": "Instruction"},
      "metadata": {"anyOf": [{"items": {"type": "string"}, "type": "array"}, {"type": "null"}], "default": null, "description": "Optional metadata for each document", "title": "Metadata"},
      "model": {"default": "ctxl-rerank-en-v1-instruct", "description": "Reranker model to use", "title": "Model", "type": "string"},
      "query": {"description": "The search query to rerank documents against", "title": "Query", "type": "string"}
    },
    "required": ["query", "documents"],
    "title": "ContextualAIRerankSchema",
    "type": "object"
  }
},
{
  "description": "A tool to search the Couchbase database for relevant information on internal documents.",
  "env_vars": [],
  "humanized_name": "CouchbaseFTSVectorSearchTool",
  "init_params_schema": {
    "$defs": {
      "EnvVar": {
        "properties": {
          "default": {"anyOf": [{"type": "string"}, {"type": "null"}], "default": null, "title": "Default"},
          "description": {"title": "Description", "type": "string"},
          "name": {"title": "Name", "type": "string"},
          "required": {"default": true, "title": "Required", "type": "boolean"}
        },
        "required": ["name", "description"],
        "title": "EnvVar",
        "type": "object"
      }
    },
    "description": "Tool to search the Couchbase database",
    "properties": {
      "bucket_name": {"anyOf": [{"type": "string"}, {"type": "null"}], "default": [null], "title": "Bucket Name"},
      "cluster": {"anyOf": [{}, {"type": "null"}], "default": null, "title": "Cluster"},
      "collection_name": {"anyOf": [{"type": "string"}, {"type": "null"}], "default": [null], "title": "Collection Name"},
      "embedding_key": {"anyOf": [{"type": "string"}, {"type": "null"}], "default": "embedding", "description": "Name of the field in the search index that stores the vector", "title": "Embedding Key"},
      "index_name": {"anyOf": [{"type": "string"}, {"type": "null"}], "default": [null], "title": "Index Name"},
      "limit": {"anyOf": [{"type": "integer"}, {"type": "null"}], "default": 3, "title": "Limit"},
      "scope_name": {"anyOf": [{"type": "string"}, {"type": "null"}], "default": [null], "title": "Scope Name"},
      "scoped_index": {"anyOf": [{"type": "boolean"}, {"type": "null"}], "title": "Scoped Index"}
    },
    "title": "CouchbaseFTSVectorSearchTool",
    "type": "object"
  },
  "name": "CouchbaseFTSVectorSearchTool",
  "package_dependencies": [],
  "run_params_schema": {
    "description": "Input for CouchbaseTool.",
    "properties": {
      "query": {"description": "The query to search retrieve relevant information from the Couchbase database. Pass only the query, not the question.", "title": "Query", "type": "string"}
    },
    "required": ["query"],
    "title": "CouchbaseToolSchema",
    "type": "object"
  }
},
  {
    "description": "A tool that can be used to semantic search a query from a DOCX's content.",
    "env_vars": [],
    "humanized_name": "Search a DOCX's content",
    "init_params_schema": {
      "$defs": {
        "Adapter": {"properties": {}, "title": "Adapter", "type": "object"},
        "EnvVar": {
          "properties": {
            "default": {
              "anyOf": [{"type": "string"}, {"type": "null"}],
              "default": null,
              "title": "Default"
            },
            "description": {"title": "Description", "type": "string"},
            "name": {"title": "Name", "type": "string"},
            "required": {"default": true, "title": "Required", "type": "boolean"}
          },
          "required": ["name", "description"],
          "title": "EnvVar",
          "type": "object"
        }
      },
      "properties": {
        "adapter": {"$ref": "#/$defs/Adapter"},
        "config": {
          "anyOf": [
            {"additionalProperties": true, "type": "object"},
            {"type": "null"}
          ],
          "default": null,
          "title": "Config"
        },
        "summarize": {"default": false, "title": "Summarize", "type": "boolean"}
      },
      "title": "DOCXSearchTool",
      "type": "object"
    },
    "name": "DOCXSearchTool",
    "package_dependencies": [],
    "run_params_schema": {
      "description": "Input for DOCXSearchTool.",
      "properties": {
        "docx": {
          "anyOf": [{"type": "string"}, {"type": "null"}],
          "description": "Mandatory docx path you want to search",
          "title": "Docx"
        },
        "search_query": {
          "description": "Mandatory search query you want to use to search the DOCX's content",
          "title": "Search Query",
          "type": "string"
        }
      },
      "required": ["docx", "search_query"],
      "title": "DOCXSearchToolSchema",
      "type": "object"
    }
  },
  {
    "description": "Generates images using OpenAI's Dall-E model.",
    "env_vars": [
      {
        "default": null,
        "description": "API key for OpenAI services",
        "name": "OPENAI_API_KEY",
        "required": true
      }
    ],
    "humanized_name": "Dall-E Tool",
    "init_params_schema": {
      "$defs": {
        "EnvVar": {
          "properties": {
            "default": {
              "anyOf": [{"type": "string"}, {"type": "null"}],
              "default": null,
              "title": "Default"
            },
            "description": {"title": "Description", "type": "string"},
            "name": {"title": "Name", "type": "string"},
            "required": {"default": true, "title": "Required", "type": "boolean"}
          },
          "required": ["name", "description"],
          "title": "EnvVar",
          "type": "object"
        }
      },
      "properties": {
        "model": {"default": "dall-e-3", "title": "Model", "type": "string"},
        "n": {"default": 1, "title": "N", "type": "integer"},
        "quality": {"default": "standard", "title": "Quality", "type": "string"},
        "size": {"default": "1024x1024", "title": "Size", "type": "string"}
      },
      "title": "DallETool",
      "type": "object"
    },
    "name": "DallETool",
    "package_dependencies": [],
    "run_params_schema": {
      "description": "Input for Dall-E Tool.",
      "properties": {
        "image_description": {
          "description": "Description of the image to be generated by Dall-E.",
          "title": "Image Description",
          "type": "string"
        }
      },
      "required": ["image_description"],
      "title": "ImagePromptSchema",
      "type": "object"
    }
  },
  {
    "description": "Execute SQL queries against Databricks workspace tables and return the results. Provide a 'query' parameter with the SQL query to execute.",
    "env_vars": [],
    "humanized_name": "Databricks SQL Query",
    "init_params_schema": {
      "$defs": {
        "EnvVar": {
          "properties": {
            "default": {
              "anyOf": [{"type": "string"}, {"type": "null"}],
              "default": null,
              "title": "Default"
            },
            "description": {"title": "Description", "type": "string"},
            "name": {"title": "Name", "type": "string"},
            "required": {"default": true, "title": "Required", "type": "boolean"}
          },
          "required": ["name", "description"],
          "title": "EnvVar",
          "type": "object"
        }
      },
      "description": "A tool for querying Databricks workspace tables using SQL.\n\nThis tool executes SQL queries against Databricks tables and returns the results.\nIt requires Databricks authentication credentials to be set as environment variables.\n\nAuthentication can be provided via:\n- Databricks CLI profile: Set DATABRICKS_CONFIG_PROFILE environment variable\n- Direct credentials: Set DATABRICKS_HOST and DATABRICKS_TOKEN environment variables\n\nExample:\n    >>> tool = DatabricksQueryTool()\n    >>> results = tool.run(query=\"SELECT * FROM my_table LIMIT 10\")",
      "properties": {
        "default_catalog": {
          "anyOf": [{"type": "string"}, {"type": "null"}],
          "default": null,
          "title": "Default Catalog"
        },
        "default_schema": {
          "anyOf": [{"type": "string"}, {"type": "null"}],
          "default": null,
          "title": "Default Schema"
        },
        "default_warehouse_id": {
          "anyOf": [{"type": "string"}, {"type": "null"}],
          "default": null,
          "title": "Default Warehouse Id"
        }
      },
      "title": "DatabricksQueryTool",
      "type": "object"
    },
    "name": "DatabricksQueryTool",
    "package_dependencies": ["databricks-sdk"],
    "run_params_schema": {
      "description": "Input schema for DatabricksQueryTool.",
      "properties": {
        "catalog": {
          "anyOf": [{"type": "string"}, {"type": "null"}],
          "default": null,
          "description": "Databricks catalog name (optional, defaults to configured catalog)",
          "title": "Catalog"
        },
        "db_schema": {
          "anyOf": [{"type": "string"}, {"type": "null"}],
          "default": null,
          "description": "Databricks schema name (optional, defaults to configured schema)",
          "title": "Db Schema"
        },
        "query": {
          "description": "SQL query to execute against the Databricks workspace table",
          "title": "Query",
          "type": "string"
        },
        "row_limit": {
          "anyOf": [{"type": "integer"}, {"type": "null"}],
          "default": 1000,
          "description": "Maximum number of rows to return (default: 1000)",
          "title": "Row Limit"
        },
        "warehouse_id": {
          "anyOf": [{"type": "string"}, {"type": "null"}],
          "default": null,
          "description": "Databricks SQL warehouse ID (optional, defaults to configured warehouse)",
          "title": "Warehouse Id"
        }
      },
      "required": ["query"],
      "title": "DatabricksQueryToolSchema",
      "type": "object"
    }
  },
  {
    "description": "A tool that can be used to recursively list a directory's content.",
    "env_vars": [],
    "humanized_name": "List files in directory",
    "init_params_schema": {
      "$defs": {
        "EnvVar": {
          "properties": {
            "default": {
              "anyOf": [{"type": "string"}, {"type": "null"}],
              "default": null,
              "title": "Default"
            },
            "description": {"title": "Description", "type": "string"},
            "name": {"title": "Name", "type": "string"},
            "required": {"default": true, "title": "Required", "type": "boolean"}
          },
          "required": ["name", "description"],
          "title": "EnvVar",
          "type": "object"
        }
      },
      "properties": {
        "directory": {
          "anyOf": [{"type": "string"}, {"type": "null"}],
          "default": null,
          "title": "Directory"
        }
      },
      "title": "DirectoryReadTool",
      "type": "object"
    },
    "name": "DirectoryReadTool",
    "package_dependencies": [],
    "run_params_schema": {
      "description": "Input for DirectoryReadTool.",
      "properties": {
        "directory": {
          "description": "Mandatory directory to list content",
          "title": "Directory",
          "type": "string"
        }
      },
      "required": ["directory"],
      "title": "DirectoryReadToolSchema",
      "type": "object"
    }
  },
  {
    "description": "A tool that can be used to semantic search a query from a directory's content.",
    "env_vars": [],
    "humanized_name": "Search a directory's content",
    "init_params_schema": {
      "$defs": {
        "Adapter": {"properties": {}, "title": "Adapter", "type": "object"},
        "EnvVar": {
          "properties": {
            "default": {
              "anyOf": [{"type": "string"}, {"type": "null"}],
              "default": null,
              "title": "Default"
            },
            "description": {"title": "Description", "type": "string"},
            "name": {"title": "Name", "type": "string"},
            "required": {"default": true, "title": "Required", "type": "boolean"}
          },
          "required": ["name", "description"],
          "title": "EnvVar",
          "type": "object"
        }
      },
      "properties": {
        "adapter": {"$ref": "#/$defs/Adapter"},
        "config": {
          "anyOf": [
            {"additionalProperties": true, "type": "object"},
            {"type": "null"}
          ],
          "default": null,
          "title": "Config"
        },
        "summarize": {"default": false, "title": "Summarize", "type": "boolean"}
      },
      "title": "DirectorySearchTool",
      "type": "object"
    },
    "name": "DirectorySearchTool",
    "package_dependencies": [],
    "run_params_schema": {
      "description": "Input for DirectorySearchTool.",
      "properties": {
        "directory": {
          "description": "Mandatory directory you want to search",
          "title": "Directory",
          "type": "string"
        },
        "search_query": {
          "description": "Mandatory search query you want to use to search the directory's content",
          "title": "Search Query",
          "type": "string"
        }
      },
      "required": ["search_query", "directory"],
      "title": "DirectorySearchToolSchema",
      "type": "object"
    }
  },
  {
    "description": "Search the internet using Exa",
    "env_vars": [
      {
        "default": null,
        "description": "API key for Exa services",
        "name": "EXA_API_KEY",
        "required": false
      }
    ],
    "humanized_name": "EXASearchTool",
    "init_params_schema": {
      "$defs": {
        "EnvVar": {
          "properties": {
            "default": {
              "anyOf": [{"type": "string"}, {"type": "null"}],
              "default": null,
              "title": "Default"
            },
            "description": {"title": "Description", "type": "string"},
            "name": {"title": "Name", "type": "string"},
            "required": {"default": true, "title": "Required", "type": "boolean"}
          },
          "required": ["name", "description"],
          "title": "EnvVar",
          "type": "object"
        }
      },
      "properties": {
        "api_key": {
          "anyOf": [{"type": "string"}, {"type": "null"}],
          "description": "API key for Exa services",
          "required": false,
          "title": "Api Key"
        },
        "client": {
          "anyOf": [{}, {"type": "null"}],
          "default": null,
          "title": "Client"
        },
        "content": {
          "anyOf": [{"type": "boolean"}, {"type": "null"}],
          "default": false,
          "title": "Content"
        },
        "summary": {
          "anyOf": [{"type": "boolean"}, {"type": "null"}],
          "default": false,
          "title": "Summary"
        },
        "type": {
          "anyOf": [{"type": "string"}, {"type": "null"}],
          "default": "auto",
          "title": "Type"
        }
      },
      "title": "EXASearchTool",
      "type": "object"
    },
    "name": "EXASearchTool",
    "package_dependencies": ["exa_py"],
    "run_params_schema": {
      "properties": {
        "end_published_date": {
          "anyOf": [{"type": "string"}, {"type": "null"}],
          "default": null,
          "description": "End date for the search",
          "title": "End Published Date"
        },
        "include_domains": {
          "anyOf": [
            {"items": {"type": "string"}, "type": "array"},
            {"type": "null"}
          ],
          "default": null,
          "description": "List of domains to include in the search",
          "title": "Include Domains"
        },
        "search_query": {
          "description": "Mandatory search query you want to use to search the internet",
          "title": "Search Query",
          "type": "string"
        },
        "start_published_date": {
          "anyOf": [{"type": "string"}, {"type": "null"}],
          "default": null,
          "description": "Start date for the search",
          "title": "Start Published Date"
        }
      },
      "required": ["search_query"],
      "title": "EXABaseToolSchema",
      "type": "object"
    }
  },
  {
    "description": "Compresses a file or directory into an archive (.zip currently supported). Useful for archiving logs, documents, or backups.",
    "env_vars": [],
    "humanized_name": "File Compressor Tool",
    "init_params_schema": {
      "$defs": {
        "EnvVar": {
          "properties": {
            "default": {
              "anyOf": [{"type": "string"}, {"type": "null"}],
              "default": null,
              "title": "Default"
            },
            "description": {"title": "Description", "type": "string"},
            "name": {"title": "Name", "type": "string"},
            "required": {"default": true, "title": "Required", "type": "boolean"}
          },
          "required": ["name", "description"],
          "title": "EnvVar",
          "type": "object"
        }
      },
      "properties": {},
      "title": "FileCompressorTool",
      "type": "object"
    },
    "name": "FileCompressorTool",
    "package_dependencies": [],
    "run_params_schema": {
      "description": "Input schema for FileCompressorTool.",
      "properties": {
        "format": {
          "default": "zip",
          "description": "Compression format ('zip', 'tar', 'tar.gz', 'tar.bz2', 'tar.xz').",
          "title": "Format",
          "type": "string"
        },
        "input_path": {
          "description": "Path to the file or directory to compress.",
          "title": "Input Path",
          "type": "string"
        },
        "output_path": {
          "anyOf": [{"type": "string"}, {"type": "null"}],
          "default": null,
          "description": "Optional output archive filename.",
          "title": "Output Path"
        },
        "overwrite": {
          "default": false,
          "description": "Whether to overwrite the archive if it already exists.",
          "title": "Overwrite",
          "type": "boolean"
        }
      },
      "required": ["input_path"],
      "title": "FileCompressorToolInput",
      "type": "object"
    }
  },
  {
    "description": "A tool that reads the content of a file. To use this tool, provide a 'file_path' parameter with the path to the file you want to read. Optionally, provide 'start_line' to start reading from a specific line and 'line_count' to limit the number of lines read.",
    "env_vars": [],
    "humanized_name": "Read a file's content",
    "init_params_schema": {
      "$defs": {
        "EnvVar": {
          "properties": {
            "default": {
              "anyOf": [{"type": "string"}, {"type": "null"}],
              "default": null,
              "title": "Default"
            },
            "description": {"title": "Description", "type": "string"},
            "name": {"title": "Name", "type": "string"},
            "required": {"default": true, "title": "Required", "type": "boolean"}
          },
          "required": ["name", "description"],
          "title": "EnvVar",
          "type": "object"
        }
      },
      "description": "A tool for reading file contents.\n\nThis tool inherits its schema handling from BaseTool to avoid recursive schema\ndefinition issues. The args_schema is set to FileReadToolSchema which defines\nthe required file_path parameter. The schema should not be overridden in the\nconstructor as it would break the inheritance chain and cause infinite loops.\n\nThe tool supports two ways of specifying the file path:\n1. At construction time via the file_path parameter\n2. At runtime via the file_path parameter in the tool's input\n\nArgs:\n    file_path (Optional[str]): Path to the file to be read. If provided,\n        this becomes the default file path for the tool.\n    **kwargs: Additional keyword arguments passed to BaseTool.\n\nExample:\n    >>> tool = FileReadTool(file_path=\"/path/to/file.txt\")\n    >>> content = tool.run()  # Reads /path/to/file.txt\n    >>> content = tool.run(file_path=\"/path/to/other.txt\")  # Reads other.txt\n    >>> content = tool.run(file_path=\"/path/to/file.txt\", start_line=100, line_count=50)  # Reads lines 100-149",
      "properties": {
        "file_path": {
          "anyOf": [{"type": "string"}, {"type": "null"}],
          "default": null,
          "title": "File Path"
        }
      },
      "title": "FileReadTool",
      "type": "object"
    },
    "name": "FileReadTool",
    "package_dependencies": [],
    "run_params_schema": {
      "description": "Input for FileReadTool.",
      "properties": {
        "file_path": {
          "description": "Mandatory file full path to read the file",
          "title": "File Path",
          "type": "string"
        },
        "line_count": {
          "anyOf": [{"type": "integer"}, {"type": "null"}],
          "default": null,
          "description": "Number of lines to read. If None, reads the entire file",
          "title": "Line Count"
        },
        "start_line": {
          "anyOf": [{"type": "integer"}, {"type": "null"}],
          "default": 1,
          "description": "Line number to start reading from (1-indexed)",
          "title": "Start Line"
        }
      },
      "required": ["file_path"],
      "title": "FileReadToolSchema",
      "type": "object"
    }
  },
  {
    "description": "A tool to write content to a specified file. Accepts filename, content, and optionally a directory path and overwrite flag as input.",
    "env_vars": [],
    "humanized_name": "File Writer Tool",
    "init_params_schema": {
      "$defs": {
        "EnvVar": {
          "properties": {
            "default": {
              "anyOf": [{"type": "string"}, {"type": "null"}],
              "default": null,
              "title": "Default"
            },
            "description": {"title": "Description", "type": "string"},
            "name": {"title": "Name", "type": "string"},
            "required": {"default": true, "title": "Required", "type": "boolean"}
          },
          "required": ["name", "description"],
          "title": "EnvVar",
          "type": "object"
        }
      },
      "properties": {},
      "title": "FileWriterTool",
      "type": "object"
    },
    "name": "FileWriterTool",
    "package_dependencies": [],
    "run_params_schema": {
      "properties": {
        "content": {"title": "Content", "type": "string"},
        "directory": {
          "anyOf": [{"type": "string"}, {"type": "null"}],
          "default": "./",
          "title": "Directory"
        },
        "filename": {"title": "Filename", "type": "string"},
        "overwrite": {
          "anyOf": [{"type": "string"}, {"type": "boolean"}],
          "default": false,
          "title": "Overwrite"
        }
      },
      "required": ["filename", "content"],
      "title": "FileWriterToolInput",
      "type": "object"
    }
  },
  {
    "description": "Crawl webpages using Firecrawl and return the contents",
    "env_vars": [
      {
        "default": null,
        "description": "API key for Firecrawl services",
        "name": "FIRECRAWL_API_KEY",
        "required": true
      }
    ],
    "humanized_name": "Firecrawl web crawl tool",
    "init_params_schema": {
      "$defs": {
        "EnvVar": {
          "properties": {
            "default": {
              "anyOf": [{"type": "string"}, {"type": "null"}],
              "default": null,
              "title": "Default"
            },
            "description": {"title": "Description", "type": "string"},
            "name": {"title": "Name", "type": "string"},
            "required": {"default": true, "title": "Required", "type": "boolean"}
          },
          "required": ["name", "description"],
          "title": "EnvVar",
          "type": "object"
        }
      },
      "description": "Tool for crawling websites using Firecrawl. To run this tool, you need to have a Firecrawl API key.\n\nArgs:\n    api_key (str): Your Firecrawl API key.\n    config (dict): Optional. It contains Firecrawl API parameters.\n\nDefault configuration options:\n    max_depth (int): Maximum depth to crawl. Default: 2\n    ignore_sitemap (bool): Whether to ignore sitemap. Default: True\n    limit (int): Maximum number of pages to crawl. Default: 100\n    allow_backward_links (bool): Allow crawling backward links. Default: False\n    allow_external_links (bool): Allow crawling external links. Default: False\n    scrape_options (ScrapeOptions): Options for scraping content\n        - formats (list[str]): Content formats to return. Default: [\"markdown\", \"screenshot\", \"links\"]\n        - only_main_content (bool): Only return main content. Default: True\n        - timeout (int): Timeout in milliseconds. Default: 30000",
      "properties": {
        "api_key": {
          "anyOf": [{"type": "string"}, {"type": "null"}],
          "default": null,
          "title": "Api Key"
        },
        "config": {
          "anyOf": [
            {"additionalProperties": true, "type": "object"},
            {"type": "null"}
          ],
          "title": "Config"
        }
      },
      "title": "FirecrawlCrawlWebsiteTool",
      "type": "object"
    },
    "name": "FirecrawlCrawlWebsiteTool",
    "package_dependencies": ["firecrawl-py"],
    "run_params_schema": {
      "properties": {
        "url": {
          "description": "Website URL",
          "title": "Url",
          "type": "string"
        }
      },
      "required": ["url"],
      "title": "FirecrawlCrawlWebsiteToolSchema",
      "type": "object"
    }
  },
  {
    "description": "Scrape webpages using Firecrawl and return the contents",
    "env_vars": [
      {
        "default": null,
        "description": "API key for Firecrawl services",
        "name": "FIRECRAWL_API_KEY",
        "required": true
      }
    ],
    "humanized_name": "Firecrawl web scrape tool",
    "init_params_schema": {
      "$defs": {
        "EnvVar": {
          "properties": {
            "default": {
              "anyOf": [{"type": "string"}, {"type": "null"}],
              "default": null,
              "title": "Default"
            },
            "description": {"title": "Description", "type": "string"},
            "name": {"title": "Name", "type": "string"},
            "required": {"default": true, "title": "Required", "type": "boolean"}
          },
          "required": ["name", "description"],
          "title": "EnvVar",
          "type": "object"
        }
      },
      "description": "Tool for scraping webpages using Firecrawl. To run this tool, you need to have a Firecrawl API key.\n\nArgs:\n    api_key (str): Your Firecrawl API key.\n    config (dict): Optional. It contains Firecrawl API parameters.\n\nDefault configuration options:\n    formats (list[str]): Content formats to return. Default: [\"markdown\"]\n    onlyMainContent (bool): Only return main content. Default: True\n    includeTags (list[str]): Tags to include. Default: []\n    excludeTags (list[str]): Tags to exclude. Default: []\n    headers (dict): Headers to include. Default: {}\n    waitFor (int): Time to wait for page to load in ms. Default: 0\n    json_options (dict): Options for JSON extraction. Default: None",
      "properties": {
        "api_key": {
          "anyOf": [{"type": "string"}, {"type": "null"}],
          "default": null,
          "title": "Api Key"
        },
        "config": {
          "additionalProperties": true,
          "title": "Config",
          "type": "object"
        }
      },
      "title": "FirecrawlScrapeWebsiteTool",
      "type": "object"
    },
    "name": "FirecrawlScrapeWebsiteTool",
    "package_dependencies": ["firecrawl-py"],
    "run_params_schema": {
      "properties": {
        "url": {
          "description": "Website URL",
          "title": "Url",
          "type": "string"
        }
      },
      "required": ["url"],
      "title": "FirecrawlScrapeWebsiteToolSchema",
      "type": "object"
    }
  },
  {
    "description": "Search webpages using Firecrawl and return the results",
    "env_vars": [
      {
        "default": null,
        "description": "API key for Firecrawl services",
        "name": "FIRECRAWL_API_KEY",
        "required": true
      }
    ],
    "humanized_name": "Firecrawl web search tool",
    "init_params_schema": {
      "$defs": {
        "EnvVar": {
          "properties": {
            "default": {
              "anyOf": [{"type": "string"}, {"type": "null"}],
              "default": null,
              "title": "Default"
            },
            "description": {"title": "Description", "type": "string"},
            "name": {"title": "Name", "type": "string"},
            "required": {"default": true, "title": "Required", "type": "boolean"}
          },
          "required": ["name", "description"],
          "title": "EnvVar",
          "type": "object"
        }
      },
      "description": "Tool for searching webpages using Firecrawl. To run this tool, you need to have a Firecrawl API key.\n\nArgs:\n    api_key (str): Your Firecrawl API key.\n    config (dict): Optional. It contains Firecrawl API parameters.\n\nDefault configuration options:\n    limit (int): Maximum number of pages to crawl. Default: 5\n    tbs (str): Time before search. Default: None\n    lang (str): Language. Default: \"en\"\n    country (str): Country. Default: \"us\"\n    location (str): Location. Default: None\n    timeout (int): Timeout in milliseconds. Default: 60000",
      "properties": {
        "api_key": {
          "anyOf": [{"type": "string"}, {"type": "null"}],
          "default": null,
          "title": "Api Key"
        },
        "config": {
          "anyOf": [
            {"additionalProperties": true, "type": "object"},
            {"type": "null"}
          ],
          "title": "Config"
        }
      },
      "title": "FirecrawlSearchTool",
      "type": "object"
    },
    "name": "FirecrawlSearchTool",
    "package_dependencies": ["firecrawl-py"],
    "run_params_schema": {
      "properties": {
        "query": {
          "description": "Search query",
          "title": "Query",
          "type": "string"
        }
      },
      "required": ["query"],
      "title": "FirecrawlSearchToolSchema",
      "type": "object"
    }
  },
  {
    "description": "A tool that leverages CrewAI Studio's capabilities to automatically generate complete CrewAI automations based on natural language descriptions. It translates high-level requirements into functional CrewAI implementations.",
    "env_vars": [
      {
        "default": null,
        "description": "Personal Access Token for CrewAI AMP API",
        "name": "CREWAI_PERSONAL_ACCESS_TOKEN",
        "required": true
      },
      {
        "default": null,
        "description": "Base URL for CrewAI AMP API",
        "name": "CREWAI_PLUS_URL",
        "required": false
      }
    ],
    "humanized_name": "Generate CrewAI Automation",
    "init_params_schema": {
      "$defs": {
        "EnvVar": {
          "properties": {
            "default": {
              "anyOf": [{"type": "string"}, {"type": "null"}],
              "default": null,
              "title": "Default"
            },
            "description": {"title": "Description", "type": "string"},
            "name": {"title": "Name", "type": "string"},
            "required": {"default": true, "title": "Required", "type": "boolean"}
          },
          "required": ["name", "description"],
          "title": "EnvVar",
          "type": "object"
        }
      },
      "properties": {
        "crewai_enterprise_url": {
          "description": "The base URL of CrewAI AMP. If not provided, it will be loaded from the environment variable CREWAI_PLUS_URL with default https://app.crewai.com.",
          "title": "Crewai Enterprise Url",
          "type": "string"
        },
        "personal_access_token": {
          "anyOf": [{"type": "string"}, {"type": "null"}],
          "description": "The user's Personal Access Token to access CrewAI AMP API. If not provided, it will be loaded from the environment variable CREWAI_PERSONAL_ACCESS_TOKEN.",
          "title": "Personal Access Token"
        }
      },
      "title": "GenerateCrewaiAutomationTool",
      "type": "object"
    },
    "name": "GenerateCrewaiAutomationTool",
    "package_dependencies": [],
    "run_params_schema": {
      "properties": {
        "organization_id": {
          "anyOf": [{"type": "string"}, {"type": "null"}],
          "default": null,
          "description": "The identifier for the CrewAI AMP organization. If not specified, a default organization will be used.",
          "title": "Organization Id"
        },
        "prompt": {
          "description": "The prompt to generate the CrewAI automation, e.g. 'Generate a CrewAI automation that will scrape the website and store the data in a database.'",
          "title": "Prompt",
          "type": "string"
        }
      },
      "required": ["prompt"],
      "title": "GenerateCrewaiAutomationToolSchema",
      "type": "object"
    }
  },
{
|
|
"description": "A tool that can be used to semantic search a query from a github repo's content. This is not the GitHub API, but instead a tool that can provide semantic search capabilities.",
|
|
"env_vars": [],
|
|
"humanized_name": "Search a github repo's content",
|
|
"init_params_schema": {
|
|
"$defs": {
|
|
"Adapter": {
|
|
"properties": {},
|
|
"title": "Adapter",
|
|
"type": "object"
|
|
},
|
|
"EnvVar": {
|
|
"properties": {
|
|
"default": {
|
|
"anyOf": [
|
|
{
|
|
"type": "string"
|
|
},
|
|
{
|
|
"type": "null"
|
|
}
|
|
],
|
|
"default": null,
|
|
"title": "Default"
|
|
},
|
|
"description": {
|
|
"title": "Description",
|
|
"type": "string"
|
|
},
|
|
"name": {
|
|
"title": "Name",
|
|
"type": "string"
|
|
},
|
|
"required": {
|
|
"default": true,
|
|
"title": "Required",
|
|
"type": "boolean"
|
|
}
|
|
},
|
|
"required": [
|
|
"name",
|
|
"description"
|
|
],
|
|
"title": "EnvVar",
|
|
"type": "object"
|
|
}
|
|
},
|
|
"properties": {
|
|
"adapter": {
|
|
"$ref": "#/$defs/Adapter"
|
|
},
|
|
"config": {
|
|
"anyOf": [
|
|
{
|
|
"additionalProperties": true,
|
|
"type": "object"
|
|
},
|
|
{
|
|
"type": "null"
|
|
}
|
|
],
|
|
"default": null,
|
|
"title": "Config"
|
|
},
|
|
"content_types": {
|
|
"description": "Content types you want to be included search, options: [code, repo, pr, issue]",
|
|
"items": {
|
|
"type": "string"
|
|
},
|
|
"title": "Content Types",
|
|
"type": "array"
|
|
},
|
|
"gh_token": {
|
|
"title": "Gh Token",
|
|
"type": "string"
|
|
},
|
|
"summarize": {
|
|
"default": false,
|
|
"title": "Summarize",
|
|
"type": "boolean"
|
|
}
|
|
},
|
|
"required": [
|
|
"gh_token"
|
|
],
|
|
"title": "GithubSearchTool",
|
|
"type": "object"
|
|
},
|
|
"name": "GithubSearchTool",
|
|
"package_dependencies": [],
|
|
"run_params_schema": {
|
|
"description": "Input for GithubSearchTool.",
|
|
"properties": {
|
|
"content_types": {
|
|
"description": "Mandatory content types you want to be included search, options: [code, repo, pr, issue]",
|
|
"items": {
|
|
"type": "string"
|
|
},
|
|
"title": "Content Types",
|
|
"type": "array"
|
|
},
|
|
"github_repo": {
|
|
"description": "Mandatory github you want to search",
|
|
"title": "Github Repo",
|
|
"type": "string"
|
|
},
|
|
"search_query": {
|
|
"description": "Mandatory search query you want to use to search the github repo's content",
|
|
"title": "Search Query",
|
|
"type": "string"
|
|
}
|
|
},
|
|
"required": [
|
|
"search_query",
|
|
"github_repo",
|
|
"content_types"
|
|
],
|
|
"title": "GithubSearchToolSchema",
|
|
"type": "object"
|
|
}
|
|
},
{
  "description": "Scrape or crawl a website using Hyperbrowser and return the contents in properly formatted markdown or html",
  "env_vars": [
    {"default": null, "description": "API key for Hyperbrowser services", "name": "HYPERBROWSER_API_KEY", "required": false}
  ],
  "humanized_name": "Hyperbrowser web load tool",
  "init_params_schema": {
    "$defs": {
      "EnvVar": {
        "properties": {
          "default": {"anyOf": [{"type": "string"}, {"type": "null"}], "default": null, "title": "Default"},
          "description": {"title": "Description", "type": "string"},
          "name": {"title": "Name", "type": "string"},
          "required": {"default": true, "title": "Required", "type": "boolean"}
        },
        "required": ["name", "description"],
        "title": "EnvVar",
        "type": "object"
      }
    },
    "description": "HyperbrowserLoadTool.\n\nScrape or crawl web pages and load the contents with optional parameters for configuring content extraction.\nRequires the `hyperbrowser` package.\nGet your API Key from https://app.hyperbrowser.ai/\n\nArgs:\n    api_key: The Hyperbrowser API key, can be set as an environment variable `HYPERBROWSER_API_KEY` or passed directly",
    "properties": {
      "api_key": {"anyOf": [{"type": "string"}, {"type": "null"}], "default": null, "title": "Api Key"},
      "hyperbrowser": {"anyOf": [{}, {"type": "null"}], "default": null, "title": "Hyperbrowser"}
    },
    "title": "HyperbrowserLoadTool",
    "type": "object"
  },
  "name": "HyperbrowserLoadTool",
  "package_dependencies": ["hyperbrowser"],
  "run_params_schema": {
    "properties": {
      "operation": {"description": "Operation to perform on the website. Either 'scrape' or 'crawl'", "enum": ["scrape", "crawl"], "title": "Operation", "type": "string"},
      "params": {"anyOf": [{"additionalProperties": true, "type": "object"}, {"type": "null"}], "description": "Optional params for scrape or crawl. For more information on the supported params, visit https://docs.hyperbrowser.ai/reference/sdks/python/scrape#start-scrape-job-and-wait or https://docs.hyperbrowser.ai/reference/sdks/python/crawl#start-crawl-job-and-wait", "title": "Params"},
      "url": {"description": "Website URL", "title": "Url", "type": "string"}
    },
    "required": ["url", "operation", "params"],
    "title": "HyperbrowserLoadToolSchema",
    "type": "object"
  }
},
{
  "description": "Invokes a CrewAI Platform Automation using the API",
  "env_vars": [],
  "humanized_name": "invoke_amp_automation",
  "init_params_schema": {
    "$defs": {
      "EnvVar": {
        "properties": {
          "default": {"anyOf": [{"type": "string"}, {"type": "null"}], "default": null, "title": "Default"},
          "description": {"title": "Description", "type": "string"},
          "name": {"title": "Name", "type": "string"},
          "required": {"default": true, "title": "Required", "type": "boolean"}
        },
        "required": ["name", "description"],
        "title": "EnvVar",
        "type": "object"
      }
    },
    "description": "A CrewAI tool for invoking external crew/flows APIs.\n\nThis tool provides CrewAI Platform API integration with external crew services, supporting:\n- Dynamic input schema configuration\n- Automatic polling for task completion\n- Bearer token authentication\n- Comprehensive error handling\n\nExample:\n    Basic usage:\n    >>> tool = InvokeCrewAIAutomationTool(\n    ...     crew_api_url=\"https://api.example.com\",\n    ...     crew_bearer_token=\"your_token\",\n    ...     crew_name=\"My Crew\",\n    ...     crew_description=\"Description of what the crew does\"\n    ... )\n\n    With custom inputs:\n    >>> custom_inputs = {\n    ...     \"param1\": Field(..., description=\"Description of param1\"),\n    ...     \"param2\": Field(default=\"default_value\", description=\"Description of param2\")\n    ... }\n    >>> tool = InvokeCrewAIAutomationTool(\n    ...     crew_api_url=\"https://api.example.com\",\n    ...     crew_bearer_token=\"your_token\",\n    ...     crew_name=\"My Crew\",\n    ...     crew_description=\"Description of what the crew does\",\n    ...     crew_inputs=custom_inputs\n    ... )\n\n    Example:\n    >>> tools=[\n    ...     InvokeCrewAIAutomationTool(\n    ...         crew_api_url=\"https://canary-crew-[...].crewai.com\",\n    ...         crew_bearer_token=\"[Your token: abcdef012345]\",\n    ...         crew_name=\"State of AI Report\",\n    ...         crew_description=\"Retrieves a report on state of AI for a given year.\",\n    ...         crew_inputs={\n    ...             \"year\": Field(..., description=\"Year to retrieve the report for (integer)\")\n    ...         }\n    ...     )\n    ... ]",
    "properties": {
      "crew_api_url": {"title": "Crew Api Url", "type": "string"},
      "crew_bearer_token": {"title": "Crew Bearer Token", "type": "string"},
      "max_polling_time": {"default": 600, "title": "Max Polling Time", "type": "integer"}
    },
    "required": ["crew_api_url", "crew_bearer_token"],
    "title": "InvokeCrewAIAutomationTool",
    "type": "object"
  },
  "name": "InvokeCrewAIAutomationTool",
  "package_dependencies": [],
  "run_params_schema": {
    "description": "Input schema for InvokeCrewAIAutomationTool.",
    "properties": {
      "prompt": {"description": "The prompt or query to send to the crew", "title": "Prompt", "type": "string"}
    },
    "required": ["prompt"],
    "title": "InvokeCrewAIAutomationInput",
    "type": "object"
  }
},
{
  "description": "A tool that can be used to semantic search a query from a JSON's content.",
  "env_vars": [],
  "humanized_name": "Search a JSON's content",
  "init_params_schema": {
    "$defs": {
      "Adapter": {"properties": {}, "title": "Adapter", "type": "object"},
      "EnvVar": {
        "properties": {
          "default": {"anyOf": [{"type": "string"}, {"type": "null"}], "default": null, "title": "Default"},
          "description": {"title": "Description", "type": "string"},
          "name": {"title": "Name", "type": "string"},
          "required": {"default": true, "title": "Required", "type": "boolean"}
        },
        "required": ["name", "description"],
        "title": "EnvVar",
        "type": "object"
      }
    },
    "properties": {
      "adapter": {"$ref": "#/$defs/Adapter"},
      "config": {"anyOf": [{"additionalProperties": true, "type": "object"}, {"type": "null"}], "default": null, "title": "Config"},
      "summarize": {"default": false, "title": "Summarize", "type": "boolean"}
    },
    "title": "JSONSearchTool",
    "type": "object"
  },
  "name": "JSONSearchTool",
  "package_dependencies": [],
  "run_params_schema": {
    "description": "Input for JSONSearchTool.",
    "properties": {
      "json_path": {"description": "Mandatory json path you want to search", "title": "Json Path", "type": "string"},
      "search_query": {"description": "Mandatory search query you want to use to search the JSON's content", "title": "Search Query", "type": "string"}
    },
    "required": ["search_query", "json_path"],
    "title": "JSONSearchToolSchema",
    "type": "object"
  }
},
{
  "description": "Performs an API call to Linkup to retrieve contextual information.",
  "env_vars": [
    {"default": null, "description": "API key for Linkup", "name": "LINKUP_API_KEY", "required": true}
  ],
  "humanized_name": "Linkup Search Tool",
  "init_params_schema": {
    "$defs": {
      "EnvVar": {
        "properties": {
          "default": {"anyOf": [{"type": "string"}, {"type": "null"}], "default": null, "title": "Default"},
          "description": {"title": "Description", "type": "string"},
          "name": {"title": "Name", "type": "string"},
          "required": {"default": true, "title": "Required", "type": "boolean"}
        },
        "required": ["name", "description"],
        "title": "EnvVar",
        "type": "object"
      }
    },
    "properties": {},
    "title": "LinkupSearchTool",
    "type": "object"
  },
  "name": "LinkupSearchTool",
  "package_dependencies": ["linkup-sdk"],
  "run_params_schema": {}
},
{
  "description": "",
  "env_vars": [],
  "humanized_name": "LlamaIndexTool",
  "init_params_schema": {
    "$defs": {
      "EnvVar": {
        "properties": {
          "default": {"anyOf": [{"type": "string"}, {"type": "null"}], "default": null, "title": "Default"},
          "description": {"title": "Description", "type": "string"},
          "name": {"title": "Name", "type": "string"},
          "required": {"default": true, "title": "Required", "type": "boolean"}
        },
        "required": ["name", "description"],
        "title": "EnvVar",
        "type": "object"
      }
    },
    "description": "Tool to wrap LlamaIndex tools/query engines.",
    "properties": {
      "llama_index_tool": {"title": "Llama Index Tool"}
    },
    "required": ["name", "description", "llama_index_tool"],
    "title": "LlamaIndexTool",
    "type": "object"
  },
  "name": "LlamaIndexTool",
  "package_dependencies": [],
  "run_params_schema": {}
},
{
  "description": "A tool that can be used to semantic search a query from a MDX's content.",
  "env_vars": [],
  "humanized_name": "Search a MDX's content",
  "init_params_schema": {
    "$defs": {
      "Adapter": {"properties": {}, "title": "Adapter", "type": "object"},
      "EnvVar": {
        "properties": {
          "default": {"anyOf": [{"type": "string"}, {"type": "null"}], "default": null, "title": "Default"},
          "description": {"title": "Description", "type": "string"},
          "name": {"title": "Name", "type": "string"},
          "required": {"default": true, "title": "Required", "type": "boolean"}
        },
        "required": ["name", "description"],
        "title": "EnvVar",
        "type": "object"
      }
    },
    "properties": {
      "adapter": {"$ref": "#/$defs/Adapter"},
      "config": {"anyOf": [{"additionalProperties": true, "type": "object"}, {"type": "null"}], "default": null, "title": "Config"},
      "summarize": {"default": false, "title": "Summarize", "type": "boolean"}
    },
    "title": "MDXSearchTool",
    "type": "object"
  },
  "name": "MDXSearchTool",
  "package_dependencies": [],
  "run_params_schema": {
    "description": "Input for MDXSearchTool.",
    "properties": {
      "mdx": {"description": "Mandatory mdx path you want to search", "title": "Mdx", "type": "string"},
      "search_query": {"description": "Mandatory search query you want to use to search the MDX's content", "title": "Search Query", "type": "string"}
    },
    "required": ["search_query", "mdx"],
    "title": "MDXSearchToolSchema",
    "type": "object"
  }
},
{
  "description": "A tool to perform a vector search on a MongoDB database for relevant information on internal documents.",
  "env_vars": [
    {"default": null, "description": "API key for Browserbase services", "name": "BROWSERBASE_API_KEY", "required": false},
    {"default": null, "description": "Project ID for Browserbase services", "name": "BROWSERBASE_PROJECT_ID", "required": false}
  ],
  "humanized_name": "MongoDBVectorSearchTool",
  "init_params_schema": {
    "$defs": {
      "EnvVar": {
        "properties": {
          "default": {"anyOf": [{"type": "string"}, {"type": "null"}], "default": null, "title": "Default"},
          "description": {"title": "Description", "type": "string"},
          "name": {"title": "Name", "type": "string"},
          "required": {"default": true, "title": "Required", "type": "boolean"}
        },
        "required": ["name", "description"],
        "title": "EnvVar",
        "type": "object"
      },
      "MongoDBVectorSearchConfig": {
        "description": "Configuration for MongoDB vector search queries.",
        "properties": {
          "include_embeddings": {"default": false, "description": "Whether to include the embedding vector of each result in metadata.", "title": "Include Embeddings", "type": "boolean"},
          "limit": {"anyOf": [{"type": "integer"}, {"type": "null"}], "default": 4, "description": "Number of documents to return.", "title": "Limit"},
          "oversampling_factor": {"default": 10, "description": "Multiple of limit used when generating number of candidates at each step in the HNSW Vector Search", "title": "Oversampling Factor", "type": "integer"},
          "post_filter_pipeline": {"anyOf": [{"items": {"additionalProperties": true, "type": "object"}, "type": "array"}, {"type": "null"}], "default": null, "description": "Pipeline of MongoDB aggregation stages to filter/process results after $vectorSearch.", "title": "Post Filter Pipeline"},
          "pre_filter": {"anyOf": [{"additionalProperties": true, "type": "object"}, {"type": "null"}], "default": null, "description": "List of MQL match expressions comparing an indexed field", "title": "Pre Filter"}
        },
        "title": "MongoDBVectorSearchConfig",
        "type": "object"
      }
    },
    "description": "Tool to perform a vector search on the MongoDB database",
    "properties": {
      "collection_name": {"description": "The name of the MongoDB collection", "title": "Collection Name", "type": "string"},
      "connection_string": {"description": "The connection string of the MongoDB cluster", "title": "Connection String", "type": "string"},
      "database_name": {"description": "The name of the MongoDB database", "title": "Database Name", "type": "string"},
      "dimensions": {"default": 1536, "description": "Number of dimensions in the embedding vector", "title": "Dimensions", "type": "integer"},
      "embedding_key": {"default": "embedding", "description": "Field that will contain the embedding for each document", "title": "Embedding Key", "type": "string"},
      "embedding_model": {"default": "text-embedding-3-large", "description": "Text OpenAI embedding model to use", "title": "Embedding Model", "type": "string"},
      "query_config": {"anyOf": [{"$ref": "#/$defs/MongoDBVectorSearchConfig"}, {"type": "null"}], "default": null, "description": "MongoDB Vector Search query configuration"},
      "text_key": {"default": "text", "description": "MongoDB field that will contain the text for each document", "title": "Text Key", "type": "string"},
      "vector_index_name": {"default": "vector_index", "description": "Name of the Atlas Search vector index", "title": "Vector Index Name", "type": "string"}
    },
    "required": ["database_name", "collection_name", "connection_string"],
    "title": "MongoDBVectorSearchTool",
    "type": "object"
  },
  "name": "MongoDBVectorSearchTool",
  "package_dependencies": ["mongodb"],
  "run_params_schema": {
    "description": "Input for MongoDBTool.",
    "properties": {
      "query": {"description": "The query used to retrieve relevant information from the MongoDB database. Pass only the query, not the question.", "title": "Query", "type": "string"}
    },
    "required": ["query"],
    "title": "MongoDBToolSchema",
    "type": "object"
  }
},
{
  "description": "Multion gives the ability for LLMs to control web browsers using natural language instructions.\n    If the status is 'CONTINUE', reissue the same instruction to continue execution",
  "env_vars": [
    {"default": null, "description": "API key for Multion", "name": "MULTION_API_KEY", "required": true}
  ],
  "humanized_name": "Multion Browse Tool",
  "init_params_schema": {
    "$defs": {
      "EnvVar": {
        "properties": {
          "default": {"anyOf": [{"type": "string"}, {"type": "null"}], "default": null, "title": "Default"},
          "description": {"title": "Description", "type": "string"},
          "name": {"title": "Name", "type": "string"},
          "required": {"default": true, "title": "Required", "type": "boolean"}
        },
        "required": ["name", "description"],
        "title": "EnvVar",
        "type": "object"
      }
    },
    "description": "Tool to wrap MultiOn Browse Capabilities.",
    "properties": {
      "local": {"default": false, "title": "Local", "type": "boolean"},
      "max_steps": {"default": 3, "title": "Max Steps", "type": "integer"},
      "multion": {"anyOf": [{}, {"type": "null"}], "default": null, "title": "Multion"},
      "session_id": {"anyOf": [{"type": "string"}, {"type": "null"}], "default": null, "title": "Session Id"}
    },
    "title": "MultiOnTool",
    "type": "object"
  },
  "name": "MultiOnTool",
  "package_dependencies": ["multion"],
  "run_params_schema": {}
},
{
  "description": "A tool that can be used to semantic search a query from a database table's content.",
  "env_vars": [],
  "humanized_name": "Search a database's table content",
  "init_params_schema": {
    "$defs": {
      "Adapter": {"properties": {}, "title": "Adapter", "type": "object"},
      "EnvVar": {
        "properties": {
          "default": {"anyOf": [{"type": "string"}, {"type": "null"}], "default": null, "title": "Default"},
          "description": {"title": "Description", "type": "string"},
          "name": {"title": "Name", "type": "string"},
          "required": {"default": true, "title": "Required", "type": "boolean"}
        },
        "required": ["name", "description"],
        "title": "EnvVar",
        "type": "object"
      }
    },
    "properties": {
      "adapter": {"$ref": "#/$defs/Adapter"},
      "config": {"anyOf": [{"additionalProperties": true, "type": "object"}, {"type": "null"}], "default": null, "title": "Config"},
      "db_uri": {"description": "Mandatory database URI", "title": "Db Uri", "type": "string"},
      "summarize": {"default": false, "title": "Summarize", "type": "boolean"}
    },
    "required": ["db_uri"],
    "title": "MySQLSearchTool",
    "type": "object"
  },
  "name": "MySQLSearchTool",
  "package_dependencies": [],
  "run_params_schema": {
    "description": "Input for MySQLSearchTool.",
    "properties": {
      "search_query": {"description": "Mandatory semantic search query you want to use to search the database's content", "title": "Search Query", "type": "string"}
    },
    "required": ["search_query"],
    "title": "MySQLSearchToolSchema",
    "type": "object"
  }
},
{
  "description": "Converts natural language to SQL queries and executes them.",
  "env_vars": [],
  "humanized_name": "NL2SQLTool",
  "init_params_schema": {
    "$defs": {
      "EnvVar": {
        "properties": {
          "default": {"anyOf": [{"type": "string"}, {"type": "null"}], "default": null, "title": "Default"},
          "description": {"title": "Description", "type": "string"},
          "name": {"title": "Name", "type": "string"},
          "required": {"default": true, "title": "Required", "type": "boolean"}
        },
        "required": ["name", "description"],
        "title": "EnvVar",
        "type": "object"
      }
    },
    "properties": {
      "columns": {"additionalProperties": true, "default": {}, "title": "Columns", "type": "object"},
      "db_uri": {"description": "The URI of the database to connect to.", "title": "Database URI", "type": "string"},
      "tables": {"default": [], "items": {}, "title": "Tables", "type": "array"}
    },
    "required": ["db_uri"],
    "title": "NL2SQLTool",
    "type": "object"
  },
  "name": "NL2SQLTool",
  "package_dependencies": [],
  "run_params_schema": {
    "properties": {
      "sql_query": {"description": "The SQL query to execute.", "title": "SQL Query", "type": "string"}
    },
    "required": ["sql_query"],
    "title": "NL2SQLToolInput",
    "type": "object"
  }
},
{
  "description": "This tool uses an LLM's API to extract text from an image file.",
  "env_vars": [],
  "humanized_name": "Optical Character Recognition Tool",
  "init_params_schema": {
    "$defs": {
      "EnvVar": {
        "properties": {
          "default": {"anyOf": [{"type": "string"}, {"type": "null"}], "default": null, "title": "Default"},
          "description": {"title": "Description", "type": "string"},
          "name": {"title": "Name", "type": "string"},
          "required": {"default": true, "title": "Required", "type": "boolean"}
        },
        "required": ["name", "description"],
        "title": "EnvVar",
        "type": "object"
      }
    },
    "description": "A tool for performing Optical Character Recognition on images.\n\nThis tool leverages LLMs to extract text from images. It can process\nboth local image files and images available via URLs.\n\nAttributes:\n    name (str): Name of the tool.\n    description (str): Description of the tool's functionality.\n    args_schema (Type[BaseModel]): Pydantic schema for input validation.\n\nPrivate Attributes:\n    _llm (Optional[LLM]): Language model instance for making API calls.",
    "properties": {},
    "title": "OCRTool",
    "type": "object"
  },
  "name": "OCRTool",
  "package_dependencies": [],
  "run_params_schema": {
    "description": "Input schema for Optical Character Recognition Tool.\n\nAttributes:\n    image_path_url (str): Path to a local image file or URL of an image.\n        For local files, provide the absolute or relative path.\n        For remote images, provide the complete URL starting with 'http' or 'https'.",
    "properties": {
      "image_path_url": {"default": "The image path or URL.", "title": "Image Path Url", "type": "string"}
    },
    "title": "OCRToolSchema",
    "type": "object"
  }
},
{
  "description": "Scrape Amazon product pages with Oxylabs Amazon Product Scraper",
  "env_vars": [
    {"default": null, "description": "Username for Oxylabs", "name": "OXYLABS_USERNAME", "required": true},
    {"default": null, "description": "Password for Oxylabs", "name": "OXYLABS_PASSWORD", "required": true}
  ],
  "humanized_name": "Oxylabs Amazon Product Scraper tool",
  "init_params_schema": {
    "$defs": {
      "EnvVar": {
        "properties": {
          "default": {"anyOf": [{"type": "string"}, {"type": "null"}], "default": null, "title": "Default"},
          "description": {"title": "Description", "type": "string"},
          "name": {"title": "Name", "type": "string"},
          "required": {"default": true, "title": "Required", "type": "boolean"}
        },
        "required": ["name", "description"],
        "title": "EnvVar",
        "type": "object"
      },
      "OxylabsAmazonProductScraperConfig": {
        "description": "Amazon Product Scraper configuration options:\nhttps://developers.oxylabs.io/scraper-apis/web-scraper-api/targets/amazon/product",
        "properties": {
          "callback_url": {"anyOf": [{"type": "string"}, {"type": "null"}], "default": null, "description": "URL to your callback endpoint.", "title": "Callback Url"},
          "context": {"anyOf": [{"items": {}, "type": "array"}, {"type": "null"}], "default": null, "description": "Additional advanced settings and controls for specialized requirements.", "title": "Context"},
          "domain": {"anyOf": [{"type": "string"}, {"type": "null"}], "default": null, "description": "The domain to limit the search results to.", "title": "Domain"},
          "geo_location": {"anyOf": [{"type": "string"}, {"type": "null"}], "default": null, "description": "The Deliver to location.", "title": "Geo Location"},
          "parse": {"anyOf": [{"type": "boolean"}, {"type": "null"}], "default": null, "description": "True will return structured data.", "title": "Parse"},
          "parsing_instructions": {"anyOf": [{"additionalProperties": true, "type": "object"}, {"type": "null"}], "default": null, "description": "Instructions for parsing the results.", "title": "Parsing Instructions"},
          "render": {"anyOf": [{"type": "string"}, {"type": "null"}], "default": null, "description": "Enables JavaScript rendering.", "title": "Render"},
          "user_agent_type": {"anyOf": [{"type": "string"}, {"type": "null"}], "default": null, "description": "Device type and browser.", "title": "User Agent Type"}
        },
        "title": "OxylabsAmazonProductScraperConfig",
        "type": "object"
      }
    },
    "description": "Scrape Amazon product pages with OxylabsAmazonProductScraperTool.\n\nGet Oxylabs account:\nhttps://dashboard.oxylabs.io/en\n\nArgs:\n    username (str): Oxylabs username.\n    password (str): Oxylabs password.\n    config: Configuration options. See ``OxylabsAmazonProductScraperConfig``",
    "properties": {
      "config": {"$ref": "#/$defs/OxylabsAmazonProductScraperConfig"},
      "oxylabs_api": {"title": "Oxylabs Api"}
    },
    "required": ["oxylabs_api", "config"],
    "title": "OxylabsAmazonProductScraperTool",
    "type": "object"
  },
  "name": "OxylabsAmazonProductScraperTool",
  "package_dependencies": ["oxylabs"],
  "run_params_schema": {
    "properties": {
      "query": {"description": "Amazon product ASIN", "title": "Query", "type": "string"}
    },
    "required": ["query"],
    "title": "OxylabsAmazonProductScraperArgs",
    "type": "object"
  }
},
{
  "description": "Scrape Amazon search results with Oxylabs Amazon Search Scraper",
  "env_vars": [
    {"default": null, "description": "Username for Oxylabs", "name": "OXYLABS_USERNAME", "required": true},
    {"default": null, "description": "Password for Oxylabs", "name": "OXYLABS_PASSWORD", "required": true}
  ],
  "humanized_name": "Oxylabs Amazon Search Scraper tool",
  "init_params_schema": {
    "$defs": {
      "EnvVar": {
        "properties": {
          "default": {"anyOf": [{"type": "string"}, {"type": "null"}], "default": null, "title": "Default"},
          "description": {"title": "Description", "type": "string"},
          "name": {"title": "Name", "type": "string"},
          "required": {"default": true, "title": "Required", "type": "boolean"}
        },
        "required": ["name", "description"],
        "title": "EnvVar",
        "type": "object"
      },
      "OxylabsAmazonSearchScraperConfig": {
        "description": "Amazon Search Scraper configuration options:\nhttps://developers.oxylabs.io/scraper-apis/web-scraper-api/targets/amazon/search",
        "properties": {
          "callback_url": {"anyOf": [{"type": "string"}, {"type": "null"}], "default": null, "description": "URL to your callback endpoint.", "title": "Callback Url"},
          "context": {"anyOf": [{"items": {}, "type": "array"}, {"type": "null"}], "default": null, "description": "Additional advanced settings and controls for specialized requirements.", "title": "Context"},
          "domain": {"anyOf": [{"type": "string"}, {"type": "null"}], "default": null, "description": "The domain to limit the search results to.", "title": "Domain"},
          "geo_location": {"anyOf": [{"type": "string"}, {"type": "null"}], "default": null, "description": "The Deliver to location.", "title": "Geo Location"},
          "pages": {"anyOf": [{"type": "integer"}, {"type": "null"}], "default": null, "description": "The number of pages to scrape.", "title": "Pages"},
          "parse": {"anyOf": [{"type": "boolean"}, {"type": "null"}], "default": null, "description": "True will return structured data.", "title": "Parse"},
          "parsing_instructions": {"anyOf": [{"additionalProperties": true, "type": "object"}, {"type": "null"}], "default": null, "description": "Instructions for parsing the results.", "title": "Parsing Instructions"},
          "render": {"anyOf": [{"type": "string"}, {"type": "null"}], "default": null, "description": "Enables JavaScript rendering.", "title": "Render"},
          "start_page": {"anyOf": [{"type": "integer"}, {"type": "null"}], "default": null, "description": "The starting page number.", "title": "Start Page"},
          "user_agent_type": {"anyOf": [{"type": "string"}, {"type": "null"}], "default": null, "description": "Device type and browser.", "title": "User Agent Type"}
        },
        "title": "OxylabsAmazonSearchScraperConfig",
        "type": "object"
      }
    },
    "description": "Scrape Amazon search results with OxylabsAmazonSearchScraperTool.\n\nGet Oxylabs account:\nhttps://dashboard.oxylabs.io/en\n\nArgs:\n    username (str): Oxylabs username.\n    password (str): Oxylabs password.\n    config: Configuration options. See ``OxylabsAmazonSearchScraperConfig``",
    "properties": {
      "config": {"$ref": "#/$defs/OxylabsAmazonSearchScraperConfig"},
      "oxylabs_api": {"title": "Oxylabs Api"}
    },
    "required": ["oxylabs_api", "config"],
    "title": "OxylabsAmazonSearchScraperTool",
    "type": "object"
  },
  "name": "OxylabsAmazonSearchScraperTool",
  "package_dependencies": ["oxylabs"],
  "run_params_schema": {
    "properties": {
      "query": {"description": "Amazon search term", "title": "Query", "type": "string"}
    },
    "required": ["query"],
|
"title": "OxylabsAmazonSearchScraperArgs",
|
|
"type": "object"
|
|
}
|
|
},
|
|
{
  "description": "Scrape Google Search results with Oxylabs Google Search Scraper",
  "env_vars": [
    {"default": null, "description": "Username for Oxylabs", "name": "OXYLABS_USERNAME", "required": true},
    {"default": null, "description": "Password for Oxylabs", "name": "OXYLABS_PASSWORD", "required": true}
  ],
  "humanized_name": "Oxylabs Google Search Scraper tool",
  "init_params_schema": {
    "$defs": {
      "EnvVar": {
        "properties": {
          "default": {"anyOf": [{"type": "string"}, {"type": "null"}], "default": null, "title": "Default"},
          "description": {"title": "Description", "type": "string"},
          "name": {"title": "Name", "type": "string"},
          "required": {"default": true, "title": "Required", "type": "boolean"}
        },
        "required": ["name", "description"],
        "title": "EnvVar",
        "type": "object"
      },
      "OxylabsGoogleSearchScraperConfig": {
        "description": "Google Search Scraper configuration options:\nhttps://developers.oxylabs.io/scraper-apis/web-scraper-api/targets/google/search/search",
        "properties": {
          "callback_url": {"anyOf": [{"type": "string"}, {"type": "null"}], "default": null, "description": "URL to your callback endpoint.", "title": "Callback Url"},
          "context": {"anyOf": [{"items": {}, "type": "array"}, {"type": "null"}], "default": null, "description": "Additional advanced settings and controls for specialized requirements.", "title": "Context"},
          "domain": {"anyOf": [{"type": "string"}, {"type": "null"}], "default": null, "description": "The domain to limit the search results to.", "title": "Domain"},
          "geo_location": {"anyOf": [{"type": "string"}, {"type": "null"}], "default": null, "description": "The Deliver to location.", "title": "Geo Location"},
          "limit": {"anyOf": [{"type": "integer"}, {"type": "null"}], "default": null, "description": "Number of results to retrieve in each page.", "title": "Limit"},
          "pages": {"anyOf": [{"type": "integer"}, {"type": "null"}], "default": null, "description": "The number of pages to scrape.", "title": "Pages"},
          "parse": {"anyOf": [{"type": "boolean"}, {"type": "null"}], "default": null, "description": "True will return structured data.", "title": "Parse"},
          "parsing_instructions": {"anyOf": [{"additionalProperties": true, "type": "object"}, {"type": "null"}], "default": null, "description": "Instructions for parsing the results.", "title": "Parsing Instructions"},
          "render": {"anyOf": [{"type": "string"}, {"type": "null"}], "default": null, "description": "Enables JavaScript rendering.", "title": "Render"},
          "start_page": {"anyOf": [{"type": "integer"}, {"type": "null"}], "default": null, "description": "The starting page number.", "title": "Start Page"},
          "user_agent_type": {"anyOf": [{"type": "string"}, {"type": "null"}], "default": null, "description": "Device type and browser.", "title": "User Agent Type"}
        },
        "title": "OxylabsGoogleSearchScraperConfig",
        "type": "object"
      }
    },
    "description": "Scrape Google Search results with OxylabsGoogleSearchScraperTool.\n\nGet Oxylabs account:\nhttps://dashboard.oxylabs.io/en\n\nArgs:\n    username (str): Oxylabs username.\n    password (str): Oxylabs password.\n    config: Configuration options. See ``OxylabsGoogleSearchScraperConfig``",
    "properties": {
      "config": {"$ref": "#/$defs/OxylabsGoogleSearchScraperConfig"},
      "oxylabs_api": {"title": "Oxylabs Api"}
    },
    "required": ["oxylabs_api", "config"],
    "title": "OxylabsGoogleSearchScraperTool",
    "type": "object"
  },
  "name": "OxylabsGoogleSearchScraperTool",
  "package_dependencies": ["oxylabs"],
  "run_params_schema": {
    "properties": {
      "query": {"description": "Search query", "title": "Query", "type": "string"}
    },
    "required": ["query"],
    "title": "OxylabsGoogleSearchScraperArgs",
    "type": "object"
  }
},
{
  "description": "Scrape any url with Oxylabs Universal Scraper",
  "env_vars": [
    {"default": null, "description": "Username for Oxylabs", "name": "OXYLABS_USERNAME", "required": true},
    {"default": null, "description": "Password for Oxylabs", "name": "OXYLABS_PASSWORD", "required": true}
  ],
  "humanized_name": "Oxylabs Universal Scraper tool",
  "init_params_schema": {
    "$defs": {
      "EnvVar": {
        "properties": {
          "default": {"anyOf": [{"type": "string"}, {"type": "null"}], "default": null, "title": "Default"},
          "description": {"title": "Description", "type": "string"},
          "name": {"title": "Name", "type": "string"},
          "required": {"default": true, "title": "Required", "type": "boolean"}
        },
        "required": ["name", "description"],
        "title": "EnvVar",
        "type": "object"
      },
      "OxylabsUniversalScraperConfig": {
        "description": "Universal Scraper configuration options:\nhttps://developers.oxylabs.io/scraper-apis/web-scraper-api/other-websites",
        "properties": {
          "callback_url": {"anyOf": [{"type": "string"}, {"type": "null"}], "default": null, "description": "URL to your callback endpoint.", "title": "Callback Url"},
          "context": {"anyOf": [{"items": {}, "type": "array"}, {"type": "null"}], "default": null, "description": "Additional advanced settings and controls for specialized requirements.", "title": "Context"},
          "geo_location": {"anyOf": [{"type": "string"}, {"type": "null"}], "default": null, "description": "The Deliver to location.", "title": "Geo Location"},
          "parse": {"anyOf": [{"type": "boolean"}, {"type": "null"}], "default": null, "description": "True will return structured data.", "title": "Parse"},
          "parsing_instructions": {"anyOf": [{"additionalProperties": true, "type": "object"}, {"type": "null"}], "default": null, "description": "Instructions for parsing the results.", "title": "Parsing Instructions"},
          "render": {"anyOf": [{"type": "string"}, {"type": "null"}], "default": null, "description": "Enables JavaScript rendering.", "title": "Render"},
          "user_agent_type": {"anyOf": [{"type": "string"}, {"type": "null"}], "default": null, "description": "Device type and browser.", "title": "User Agent Type"}
        },
        "title": "OxylabsUniversalScraperConfig",
        "type": "object"
      }
    },
    "description": "Scrape any website with OxylabsUniversalScraperTool.\n\nGet Oxylabs account:\nhttps://dashboard.oxylabs.io/en\n\nArgs:\n    username (str): Oxylabs username.\n    password (str): Oxylabs password.\n    config: Configuration options. See ``OxylabsUniversalScraperConfig``",
    "properties": {
      "config": {"$ref": "#/$defs/OxylabsUniversalScraperConfig"},
      "oxylabs_api": {"title": "Oxylabs Api"}
    },
    "required": ["oxylabs_api", "config"],
    "title": "OxylabsUniversalScraperTool",
    "type": "object"
  },
  "name": "OxylabsUniversalScraperTool",
  "package_dependencies": ["oxylabs"],
  "run_params_schema": {
    "properties": {
      "url": {"description": "Website URL", "title": "Url", "type": "string"}
    },
    "required": ["url"],
    "title": "OxylabsUniversalScraperArgs",
    "type": "object"
  }
},
{
  "description": "A tool that can be used to semantic search a query from a PDF's content.",
  "env_vars": [],
  "humanized_name": "Search a PDF's content",
  "init_params_schema": {
    "$defs": {
      "Adapter": {"properties": {}, "title": "Adapter", "type": "object"},
      "EnvVar": {
        "properties": {
          "default": {"anyOf": [{"type": "string"}, {"type": "null"}], "default": null, "title": "Default"},
          "description": {"title": "Description", "type": "string"},
          "name": {"title": "Name", "type": "string"},
          "required": {"default": true, "title": "Required", "type": "boolean"}
        },
        "required": ["name", "description"],
        "title": "EnvVar",
        "type": "object"
      }
    },
    "properties": {
      "adapter": {"$ref": "#/$defs/Adapter"},
      "config": {"anyOf": [{"additionalProperties": true, "type": "object"}, {"type": "null"}], "default": null, "title": "Config"},
      "summarize": {"default": false, "title": "Summarize", "type": "boolean"}
    },
    "title": "PDFSearchTool",
    "type": "object"
  },
  "name": "PDFSearchTool",
  "package_dependencies": [],
  "run_params_schema": {
    "description": "Input for PDFSearchTool.",
    "properties": {
      "pdf": {"description": "Mandatory pdf path you want to search", "title": "Pdf", "type": "string"},
      "query": {"description": "Mandatory query you want to use to search the PDF's content", "title": "Query", "type": "string"}
    },
    "required": ["query", "pdf"],
    "title": "PDFSearchToolSchema",
    "type": "object"
  }
},
{
  "description": "Search the web using Parallel's Search API (v1beta). Returns ranked results with compressed excerpts optimized for LLMs.",
  "env_vars": [
    {"default": null, "description": "API key for Parallel", "name": "PARALLEL_API_KEY", "required": true}
  ],
  "humanized_name": "Parallel Web Search Tool",
  "init_params_schema": {
    "$defs": {
      "EnvVar": {
        "properties": {
          "default": {"anyOf": [{"type": "string"}, {"type": "null"}], "default": null, "title": "Default"},
          "description": {"title": "Description", "type": "string"},
          "name": {"title": "Name", "type": "string"},
          "required": {"default": true, "title": "Required", "type": "boolean"}
        },
        "required": ["name", "description"],
        "title": "EnvVar",
        "type": "object"
      }
    },
    "properties": {
      "search_url": {"default": "https://api.parallel.ai/v1beta/search", "title": "Search Url", "type": "string"}
    },
    "title": "ParallelSearchTool",
    "type": "object"
  },
  "name": "ParallelSearchTool",
  "package_dependencies": ["requests"],
  "run_params_schema": {
    "description": "Input schema for ParallelSearchTool using the Search API (v1beta).\n\nAt least one of objective or search_queries is required.",
    "properties": {
      "max_chars_per_result": {"default": 6000, "description": "Maximum characters per result excerpt (values >30000 not guaranteed)", "minimum": 100, "title": "Max Chars Per Result", "type": "integer"},
      "max_results": {"default": 10, "description": "Maximum number of search results to return (processor limits apply)", "maximum": 40, "minimum": 1, "title": "Max Results", "type": "integer"},
      "objective": {"anyOf": [{"maxLength": 5000, "type": "string"}, {"type": "null"}], "default": null, "description": "Natural-language goal for the web research (<=5000 chars)", "title": "Objective"},
      "processor": {"default": "base", "description": "Search processor: 'base' (fast/low cost) or 'pro' (higher quality/freshness)", "pattern": "^(base|pro)$", "title": "Processor", "type": "string"},
      "search_queries": {"anyOf": [{"items": {"maxLength": 200, "type": "string"}, "maxItems": 5, "minItems": 1, "type": "array"}, {"type": "null"}], "default": null, "description": "Optional list of keyword queries (<=5 items, each <=200 chars)", "title": "Search Queries"},
      "source_policy": {"anyOf": [{"additionalProperties": true, "type": "object"}, {"type": "null"}], "default": null, "description": "Optional source policy configuration", "title": "Source Policy"}
    },
    "title": "ParallelSearchInput",
    "type": "object"
  }
},
{
  "description": "",
  "env_vars": [
    {"default": null, "description": "API key for Patronus evaluation services", "name": "PATRONUS_API_KEY", "required": true}
  ],
  "humanized_name": "Patronus Evaluation Tool",
  "init_params_schema": {
    "$defs": {
      "EnvVar": {
        "properties": {
          "default": {"anyOf": [{"type": "string"}, {"type": "null"}], "default": null, "title": "Default"},
          "description": {"title": "Description", "type": "string"},
          "name": {"title": "Name", "type": "string"},
          "required": {"default": true, "title": "Required", "type": "boolean"}
        },
        "required": ["name", "description"],
        "title": "EnvVar",
        "type": "object"
      }
    },
    "properties": {
      "criteria": {"default": [], "items": {"additionalProperties": {"type": "string"}, "type": "object"}, "title": "Criteria", "type": "array"},
      "evaluate_url": {"default": "https://api.patronus.ai/v1/evaluate", "title": "Evaluate Url", "type": "string"},
      "evaluators": {"default": [], "items": {"additionalProperties": {"type": "string"}, "type": "object"}, "title": "Evaluators", "type": "array"}
    },
    "title": "PatronusEvalTool",
    "type": "object"
  },
  "name": "PatronusEvalTool",
  "package_dependencies": [],
  "run_params_schema": {}
},
{
  "description": "This tool calls the Patronus Evaluation API that takes the following arguments:",
  "env_vars": [],
  "humanized_name": "Call Patronus API tool for evaluation of model inputs and outputs",
  "init_params_schema": {
    "$defs": {
      "EnvVar": {
        "properties": {
          "default": {"anyOf": [{"type": "string"}, {"type": "null"}], "default": null, "title": "Default"},
          "description": {"title": "Description", "type": "string"},
          "name": {"title": "Name", "type": "string"},
          "required": {"default": true, "title": "Required", "type": "boolean"}
        },
        "required": ["name", "description"],
        "title": "EnvVar",
        "type": "object"
      }
    },
    "description": "PatronusEvalTool is a tool to automatically evaluate and score agent interactions.\n\nResults are logged to the Patronus platform at app.patronus.ai",
    "properties": {
      "evaluate_url": {"default": "https://api.patronus.ai/v1/evaluate", "title": "Evaluate Url", "type": "string"},
      "evaluators": {"default": [], "items": {"additionalProperties": {"type": "string"}, "type": "object"}, "title": "Evaluators", "type": "array"}
    },
    "title": "PatronusPredefinedCriteriaEvalTool",
    "type": "object"
  },
  "name": "PatronusPredefinedCriteriaEvalTool",
  "package_dependencies": [],
  "run_params_schema": {
    "properties": {
      "evaluated_model_gold_answer": {"additionalProperties": true, "description": "The agent's gold answer only if available", "title": "Evaluated Model Gold Answer", "type": "object"},
      "evaluated_model_input": {"additionalProperties": true, "description": "The agent's task description in simple text", "title": "Evaluated Model Input", "type": "object"},
      "evaluated_model_output": {"additionalProperties": true, "description": "The agent's output of the task", "title": "Evaluated Model Output", "type": "object"},
      "evaluated_model_retrieved_context": {"additionalProperties": true, "description": "The agent's context", "title": "Evaluated Model Retrieved Context", "type": "object"},
      "evaluators": {"description": "List of dictionaries containing the evaluator and criteria to evaluate the model input and output. An example input for this field: [{'evaluator': '[evaluator-from-user]', 'criteria': '[criteria-from-user]'}]", "items": {"additionalProperties": {"type": "string"}, "type": "object"}, "title": "Evaluators", "type": "array"}
    },
    "required": ["evaluated_model_input", "evaluated_model_output", "evaluated_model_retrieved_context", "evaluated_model_gold_answer", "evaluators"],
    "title": "FixedBaseToolSchema",
    "type": "object"
  }
},
{
  "description": "A tool to search the Qdrant database for relevant information on internal documents.",
  "env_vars": [],
  "humanized_name": "QdrantVectorSearchTool",
  "init_params_schema": {
    "$defs": {
      "EnvVar": {
        "properties": {
          "default": {"anyOf": [{"type": "string"}, {"type": "null"}], "default": null, "title": "Default"},
          "description": {"title": "Description", "type": "string"},
          "name": {"title": "Name", "type": "string"},
          "required": {"default": true, "title": "Required", "type": "boolean"}
        },
        "required": ["name", "description"],
        "title": "EnvVar",
        "type": "object"
      }
    },
    "description": "Tool to query and filter results from a Qdrant database.\n\nThis tool enables vector similarity search on internal documents stored in Qdrant,\nwith optional filtering capabilities.\n\nAttributes:\n    client: Configured QdrantClient instance\n    collection_name: Name of the Qdrant collection to search\n    limit: Maximum number of results to return\n    score_threshold: Minimum similarity score threshold\n    qdrant_url: Qdrant server URL\n    qdrant_api_key: Authentication key for Qdrant",
    "properties": {
      "collection_name": {"anyOf": [{"type": "string"}, {"type": "null"}], "default": null, "title": "Collection Name"},
      "filter_by": {"anyOf": [{"type": "string"}, {"type": "null"}], "default": null, "title": "Filter By"},
      "filter_value": {"anyOf": [{"type": "string"}, {"type": "null"}], "default": null, "title": "Filter Value"},
      "limit": {"anyOf": [{"type": "integer"}, {"type": "null"}], "default": 3, "title": "Limit"},
      "qdrant_api_key": {"anyOf": [{"type": "string"}, {"type": "null"}], "default": null, "description": "The API key for the Qdrant server", "title": "Qdrant Api Key"},
      "qdrant_url": {"description": "The URL of the Qdrant server", "title": "Qdrant Url", "type": "string"},
      "query": {"anyOf": [{"type": "string"}, {"type": "null"}], "default": null, "title": "Query"},
      "score_threshold": {"default": 0.35, "title": "Score Threshold", "type": "number"}
    },
    "required": ["qdrant_url"],
    "title": "QdrantVectorSearchTool",
    "type": "object"
  },
  "name": "QdrantVectorSearchTool",
  "package_dependencies": ["qdrant-client"],
  "run_params_schema": {
    "description": "Input for QdrantTool.",
    "properties": {
      "filter_by": {"anyOf": [{"type": "string"}, {"type": "null"}], "default": null, "description": "Filter by properties. Pass only the properties, not the question.", "title": "Filter By"},
      "filter_value": {"anyOf": [{"type": "string"}, {"type": "null"}], "default": null, "description": "Filter by value. Pass only the value, not the question.", "title": "Filter Value"},
      "query": {"description": "The query to search retrieve relevant information from the Qdrant database. Pass only the query, not the question.", "title": "Query", "type": "string"}
    },
    "required": ["query"],
    "title": "QdrantToolSchema",
    "type": "object"
  }
},
{
  "description": "A knowledge base that can be used to answer questions.",
  "env_vars": [],
  "humanized_name": "Knowledge base",
  "init_params_schema": {
    "$defs": {
      "Adapter": {"properties": {}, "title": "Adapter", "type": "object"},
      "EnvVar": {
        "properties": {
          "default": {"anyOf": [{"type": "string"}, {"type": "null"}], "default": null, "title": "Default"},
          "description": {"title": "Description", "type": "string"},
          "name": {"title": "Name", "type": "string"},
          "required": {"default": true, "title": "Required", "type": "boolean"}
        },
        "required": ["name", "description"],
        "title": "EnvVar",
        "type": "object"
      }
    },
    "properties": {
      "adapter": {"$ref": "#/$defs/Adapter"},
      "config": {"anyOf": [{"additionalProperties": true, "type": "object"}, {"type": "null"}], "default": null, "title": "Config"},
      "summarize": {"default": false, "title": "Summarize", "type": "boolean"}
    },
    "title": "RagTool",
    "type": "object"
  },
  "name": "RagTool",
  "package_dependencies": [],
  "run_params_schema": {}
},
{
  "description": "A tool that can be used to read a website content.",
  "env_vars": [],
  "humanized_name": "Read a website content",
  "init_params_schema": {
    "$defs": {
      "EnvVar": {
        "properties": {
          "default": {"anyOf": [{"type": "string"}, {"type": "null"}], "default": null, "title": "Default"},
          "description": {"title": "Description", "type": "string"},
          "name": {"title": "Name", "type": "string"},
          "required": {"default": true, "title": "Required", "type": "boolean"}
        },
        "required": ["name", "description"],
        "title": "EnvVar",
        "type": "object"
      }
    },
    "properties": {
      "cookies": {"anyOf": [{"additionalProperties": true, "type": "object"}, {"type": "null"}], "default": null, "title": "Cookies"},
      "css_element": {"anyOf": [{"type": "string"}, {"type": "null"}], "default": null, "title": "Css Element"},
      "headers": {
        "anyOf": [{"additionalProperties": true, "type": "object"}, {"type": "null"}],
        "default": {
          "Accept": "text/html,application/xhtml+xml,application/xml;q=0.9,image/avif,image/webp,image/apng,*/*;q=0.8,application/signed-exchange;v=b3;q=0.9",
          "Accept-Encoding": "gzip, deflate, br",
          "Accept-Language": "en-US,en;q=0.9",
          "Connection": "keep-alive",
          "Referer": "https://www.google.com/",
          "Upgrade-Insecure-Requests": "1",
          "User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36"
        },
        "title": "Headers"
      },
      "website_url": {"anyOf": [{"type": "string"}, {"type": "null"}], "default": null, "title": "Website Url"}
    },
    "title": "ScrapeElementFromWebsiteTool",
    "type": "object"
  },
  "name": "ScrapeElementFromWebsiteTool",
  "package_dependencies": [],
  "run_params_schema": {
    "description": "Input for ScrapeElementFromWebsiteTool.",
    "properties": {
      "css_element": {"description": "Mandatory css reference for element to scrape from the website", "title": "Css Element", "type": "string"},
      "website_url": {"description": "Mandatory website url to read the file", "title": "Website Url", "type": "string"}
    },
    "required": ["website_url", "css_element"],
    "title": "ScrapeElementFromWebsiteToolSchema",
    "type": "object"
  }
},
{
  "description": "A tool that can be used to read a website content.",
  "env_vars": [],
  "humanized_name": "Read website content",
  "init_params_schema": {
    "$defs": {
      "EnvVar": {
        "properties": {
          "default": {"anyOf": [{"type": "string"}, {"type": "null"}], "default": null, "title": "Default"},
          "description": {"title": "Description", "type": "string"},
          "name": {"title": "Name", "type": "string"},
          "required": {"default": true, "title": "Required", "type": "boolean"}
        },
        "required": ["name", "description"],
        "title": "EnvVar",
        "type": "object"
      }
    },
    "properties": {
      "cookies": {"anyOf": [{"additionalProperties": true, "type": "object"}, {"type": "null"}], "default": null, "title": "Cookies"},
      "headers": {
        "anyOf": [{"additionalProperties": true, "type": "object"}, {"type": "null"}],
        "default": {
          "Accept": "text/html,application/xhtml+xml,application/xml;q=0.9,image/avif,image/webp,image/apng,*/*;q=0.8,application/signed-exchange;v=b3;q=0.9",
          "Accept-Language": "en-US,en;q=0.9",
          "Connection": "keep-alive",
          "Referer": "https://www.google.com/",
          "Upgrade-Insecure-Requests": "1",
          "User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36"
        },
        "title": "Headers"
      },
      "website_url": {"anyOf": [{"type": "string"}, {"type": "null"}], "default": null, "title": "Website Url"}
    },
    "title": "ScrapeWebsiteTool",
    "type": "object"
  },
  "name": "ScrapeWebsiteTool",
  "package_dependencies": [],
  "run_params_schema": {
    "description": "Input for ScrapeWebsiteTool.",
    "properties": {
      "website_url": {"description": "Mandatory website url to read the file", "title": "Website Url", "type": "string"}
    },
    "required": ["website_url"],
    "title": "ScrapeWebsiteToolSchema",
    "type": "object"
  }
},
{
  "description": "A tool that uses Scrapegraph AI to intelligently scrape website content.",
  "env_vars": [
    {"default": null, "description": "API key for Scrapegraph AI services", "name": "SCRAPEGRAPH_API_KEY", "required": false}
  ],
  "humanized_name": "Scrapegraph website scraper",
  "init_params_schema": {
    "$defs": {
      "EnvVar": {
        "properties": {
          "default": {"anyOf": [{"type": "string"}, {"type": "null"}], "default": null, "title": "Default"},
          "description": {"title": "Description", "type": "string"},
          "name": {"title": "Name", "type": "string"},
          "required": {"default": true, "title": "Required", "type": "boolean"}
        },
        "required": ["name", "description"],
        "title": "EnvVar",
        "type": "object"
      }
    },
    "description": "A tool that uses Scrapegraph AI to intelligently scrape website content.\n\nRaises:\n    ValueError: If API key is missing or URL format is invalid\n    RateLimitError: If API rate limits are exceeded\n    RuntimeError: If scraping operation fails",
    "properties": {
      "api_key": {"anyOf": [{"type": "string"}, {"type": "null"}], "default": null, "title": "Api Key"},
      "enable_logging": {"default": false, "title": "Enable Logging", "type": "boolean"},
      "user_prompt": {"anyOf": [{"type": "string"}, {"type": "null"}], "default": null, "title": "User Prompt"},
      "website_url": {"anyOf": [{"type": "string"}, {"type": "null"}], "default": null, "title": "Website Url"}
    },
    "title": "ScrapegraphScrapeTool",
    "type": "object"
  },
  "name": "ScrapegraphScrapeTool",
  "package_dependencies": ["scrapegraph-py"],
  "run_params_schema": {
    "description": "Input for ScrapegraphScrapeTool.",
    "properties": {
      "user_prompt": {"default": "Extract the main content of the webpage", "description": "Prompt to guide the extraction of content", "title": "User Prompt", "type": "string"},
      "website_url": {"description": "Mandatory website url to scrape", "title": "Website Url", "type": "string"}
    },
    "required": ["website_url"],
    "title": "ScrapegraphScrapeToolSchema",
    "type": "object"
  }
},
  {
    "description": "Scrape a webpage url using Scrapfly and return its content as markdown or text",
    "env_vars": [
      { "default": null, "description": "API key for Scrapfly", "name": "SCRAPFLY_API_KEY", "required": true }
    ],
    "humanized_name": "Scrapfly web scraping API tool",
    "init_params_schema": {
      "$defs": {
        "EnvVar": {
          "properties": {
            "default": { "anyOf": [{ "type": "string" }, { "type": "null" }], "default": null, "title": "Default" },
            "description": { "title": "Description", "type": "string" },
            "name": { "title": "Name", "type": "string" },
            "required": { "default": true, "title": "Required", "type": "boolean" }
          },
          "required": ["name", "description"],
          "title": "EnvVar",
          "type": "object"
        }
      },
      "properties": {
        "api_key": { "default": null, "title": "Api Key", "type": "string" },
        "scrapfly": { "anyOf": [{}, { "type": "null" }], "default": null, "title": "Scrapfly" }
      },
      "title": "ScrapflyScrapeWebsiteTool",
      "type": "object"
    },
    "name": "ScrapflyScrapeWebsiteTool",
    "package_dependencies": ["scrapfly-sdk"],
    "run_params_schema": {
      "properties": {
        "ignore_scrape_failures": { "anyOf": [{ "type": "boolean" }, { "type": "null" }], "default": null, "description": "whether to ignore failures", "title": "Ignore Scrape Failures" },
        "scrape_config": { "anyOf": [{ "additionalProperties": true, "type": "object" }, { "type": "null" }], "default": null, "description": "Scrapfly request scrape config", "title": "Scrape Config" },
        "scrape_format": { "anyOf": [{ "enum": ["raw", "markdown", "text"], "type": "string" }, { "type": "null" }], "default": "markdown", "description": "Webpage extraction format", "title": "Scrape Format" },
        "url": { "description": "Webpage URL", "title": "Url", "type": "string" }
      },
      "required": ["url"],
      "title": "ScrapflyScrapeWebsiteToolSchema",
      "type": "object"
    }
  },
  {
    "description": "A tool that can be used to read website content.",
    "env_vars": [],
    "humanized_name": "Read website content",
    "init_params_schema": {
      "$defs": {
        "EnvVar": {
          "properties": {
            "default": { "anyOf": [{ "type": "string" }, { "type": "null" }], "default": null, "title": "Default" },
            "description": { "title": "Description", "type": "string" },
            "name": { "title": "Name", "type": "string" },
            "required": { "default": true, "title": "Required", "type": "boolean" }
          },
          "required": ["name", "description"],
          "title": "EnvVar",
          "type": "object"
        }
      },
      "properties": {
        "cookie": { "anyOf": [{ "additionalProperties": true, "type": "object" }, { "type": "null" }], "default": null, "title": "Cookie" },
        "css_element": { "anyOf": [{ "type": "string" }, { "type": "null" }], "default": null, "title": "Css Element" },
        "driver": { "anyOf": [{}, { "type": "null" }], "default": null, "title": "Driver" },
        "return_html": { "anyOf": [{ "type": "boolean" }, { "type": "null" }], "default": false, "title": "Return Html" },
        "wait_time": { "anyOf": [{ "type": "integer" }, { "type": "null" }], "default": 3, "title": "Wait Time" },
        "website_url": { "anyOf": [{ "type": "string" }, { "type": "null" }], "default": null, "title": "Website Url" }
      },
      "title": "SeleniumScrapingTool",
      "type": "object"
    },
    "name": "SeleniumScrapingTool",
    "package_dependencies": ["selenium", "webdriver-manager"],
    "run_params_schema": {
      "description": "Input for SeleniumScrapingTool.",
      "properties": {
        "css_element": { "description": "Mandatory css reference for element to scrape from the website", "title": "Css Element", "type": "string" },
        "website_url": { "description": "Mandatory website url to read the file. Must start with http:// or https://", "title": "Website Url", "type": "string" }
      },
      "required": ["website_url", "css_element"],
      "title": "SeleniumScrapingToolSchema",
      "type": "object"
    }
  },
  {
    "description": "A tool to perform a Google search with a search_query.",
    "env_vars": [
      { "default": null, "description": "API key for SerpApi searches", "name": "SERPAPI_API_KEY", "required": true }
    ],
    "humanized_name": "Google Search",
    "init_params_schema": {
      "$defs": {
        "EnvVar": {
          "properties": {
            "default": { "anyOf": [{ "type": "string" }, { "type": "null" }], "default": null, "title": "Default" },
            "description": { "title": "Description", "type": "string" },
            "name": { "title": "Name", "type": "string" },
            "required": { "default": true, "title": "Required", "type": "boolean" }
          },
          "required": ["name", "description"],
          "title": "EnvVar",
          "type": "object"
        }
      },
      "properties": {
        "client": { "anyOf": [{}, { "type": "null" }], "default": null, "title": "Client" }
      },
      "title": "SerpApiGoogleSearchTool",
      "type": "object"
    },
    "name": "SerpApiGoogleSearchTool",
    "package_dependencies": ["serpapi"],
    "run_params_schema": {
      "description": "Input for Google Search.",
      "properties": {
        "location": { "anyOf": [{ "type": "string" }, { "type": "null" }], "default": null, "description": "Location you want the search to be performed in.", "title": "Location" },
        "search_query": { "description": "Mandatory search query you want to use to Google search.", "title": "Search Query", "type": "string" }
      },
      "required": ["search_query"],
      "title": "SerpApiGoogleSearchToolSchema",
      "type": "object"
    }
  },
  {
    "description": "A tool to perform search on Google shopping with a search_query.",
    "env_vars": [
      { "default": null, "description": "API key for SerpApi searches", "name": "SERPAPI_API_KEY", "required": true }
    ],
    "humanized_name": "Google Shopping",
    "init_params_schema": {
      "$defs": {
        "EnvVar": {
          "properties": {
            "default": { "anyOf": [{ "type": "string" }, { "type": "null" }], "default": null, "title": "Default" },
            "description": { "title": "Description", "type": "string" },
            "name": { "title": "Name", "type": "string" },
            "required": { "default": true, "title": "Required", "type": "boolean" }
          },
          "required": ["name", "description"],
          "title": "EnvVar",
          "type": "object"
        }
      },
      "properties": {
        "client": { "anyOf": [{}, { "type": "null" }], "default": null, "title": "Client" }
      },
      "title": "SerpApiGoogleShoppingTool",
      "type": "object"
    },
    "name": "SerpApiGoogleShoppingTool",
    "package_dependencies": ["serpapi"],
    "run_params_schema": {
      "description": "Input for Google Shopping.",
      "properties": {
        "location": { "anyOf": [{ "type": "string" }, { "type": "null" }], "default": null, "description": "Location you want the search to be performed in.", "title": "Location" },
        "search_query": { "description": "Mandatory search query you want to use to Google shopping.", "title": "Search Query", "type": "string" }
      },
      "required": ["search_query"],
      "title": "SerpApiGoogleShoppingToolSchema",
      "type": "object"
    }
  },
  {
    "description": "A tool that can be used to search the internet with a search_query. Supports different search types: 'search' (default), 'news'",
    "env_vars": [
      { "default": null, "description": "API key for Serper", "name": "SERPER_API_KEY", "required": true }
    ],
    "humanized_name": "Search the internet with Serper",
    "init_params_schema": {
      "$defs": {
        "EnvVar": {
          "properties": {
            "default": { "anyOf": [{ "type": "string" }, { "type": "null" }], "default": null, "title": "Default" },
            "description": { "title": "Description", "type": "string" },
            "name": { "title": "Name", "type": "string" },
            "required": { "default": true, "title": "Required", "type": "boolean" }
          },
          "required": ["name", "description"],
          "title": "EnvVar",
          "type": "object"
        }
      },
      "properties": {
        "base_url": { "default": "https://google.serper.dev", "title": "Base Url", "type": "string" },
        "country": { "anyOf": [{ "type": "string" }, { "type": "null" }], "default": "", "title": "Country" },
        "locale": { "anyOf": [{ "type": "string" }, { "type": "null" }], "default": "", "title": "Locale" },
        "location": { "anyOf": [{ "type": "string" }, { "type": "null" }], "default": "", "title": "Location" },
        "n_results": { "default": 10, "title": "N Results", "type": "integer" },
        "save_file": { "default": false, "title": "Save File", "type": "boolean" },
        "search_type": { "default": "search", "title": "Search Type", "type": "string" }
      },
      "title": "SerperDevTool",
      "type": "object"
    },
    "name": "SerperDevTool",
    "package_dependencies": [],
    "run_params_schema": {
      "description": "Input for SerperDevTool.",
      "properties": {
        "search_query": { "description": "Mandatory search query you want to use to search the internet", "title": "Search Query", "type": "string" }
      },
      "required": ["search_query"],
      "title": "SerperDevToolSchema",
      "type": "object"
    }
  },
  {
    "description": "Scrapes website content using Serper's scraping API. This tool can extract clean, readable content from any website URL, optionally including markdown formatting for better structure.",
    "env_vars": [
      { "default": null, "description": "API key for Serper", "name": "SERPER_API_KEY", "required": true }
    ],
    "humanized_name": "serper_scrape_website",
    "init_params_schema": {
      "$defs": {
        "EnvVar": {
          "properties": {
            "default": { "anyOf": [{ "type": "string" }, { "type": "null" }], "default": null, "title": "Default" },
            "description": { "title": "Description", "type": "string" },
            "name": { "title": "Name", "type": "string" },
            "required": { "default": true, "title": "Required", "type": "boolean" }
          },
          "required": ["name", "description"],
          "title": "EnvVar",
          "type": "object"
        }
      },
      "properties": {},
      "title": "SerperScrapeWebsiteTool",
      "type": "object"
    },
    "name": "SerperScrapeWebsiteTool",
    "package_dependencies": [],
    "run_params_schema": {
      "description": "Input schema for SerperScrapeWebsite.",
      "properties": {
        "include_markdown": { "default": true, "description": "Whether to include markdown formatting in the scraped content", "title": "Include Markdown", "type": "boolean" },
        "url": { "description": "The URL of the website to scrape", "title": "Url", "type": "string" }
      },
      "required": ["url"],
      "title": "SerperScrapeWebsiteInput",
      "type": "object"
    }
  },
  {
    "description": "A tool to perform a job search in the US with a search_query.",
    "env_vars": [
      { "default": null, "description": "API key for Serply services", "name": "SERPLY_API_KEY", "required": true }
    ],
    "humanized_name": "Job Search",
    "init_params_schema": {
      "$defs": {
        "Adapter": { "properties": {}, "title": "Adapter", "type": "object" },
        "EnvVar": {
          "properties": {
            "default": { "anyOf": [{ "type": "string" }, { "type": "null" }], "default": null, "title": "Default" },
            "description": { "title": "Description", "type": "string" },
            "name": { "title": "Name", "type": "string" },
            "required": { "default": true, "title": "Required", "type": "boolean" }
          },
          "required": ["name", "description"],
          "title": "EnvVar",
          "type": "object"
        }
      },
      "properties": {
        "adapter": { "$ref": "#/$defs/Adapter" },
        "config": { "anyOf": [{ "additionalProperties": true, "type": "object" }, { "type": "null" }], "default": null, "title": "Config" },
        "headers": { "anyOf": [{ "additionalProperties": true, "type": "object" }, { "type": "null" }], "default": {}, "title": "Headers" },
        "proxy_location": { "anyOf": [{ "type": "string" }, { "type": "null" }], "default": "US", "title": "Proxy Location" },
        "request_url": { "default": "https://api.serply.io/v1/job/search/", "title": "Request Url", "type": "string" },
        "summarize": { "default": false, "title": "Summarize", "type": "boolean" }
      },
      "title": "SerplyJobSearchTool",
      "type": "object"
    },
    "name": "SerplyJobSearchTool",
    "package_dependencies": [],
    "run_params_schema": {
      "description": "Input for Job Search.",
      "properties": {
        "search_query": { "description": "Mandatory search query you want to use to fetch job postings.", "title": "Search Query", "type": "string" }
      },
      "required": ["search_query"],
      "title": "SerplyJobSearchToolSchema",
      "type": "object"
    }
  },
  {
    "description": "A tool to perform News article search with a search_query.",
    "env_vars": [
      { "default": null, "description": "API key for Serply services", "name": "SERPLY_API_KEY", "required": true }
    ],
    "humanized_name": "News Search",
    "init_params_schema": {
      "$defs": {
        "EnvVar": {
          "properties": {
            "default": { "anyOf": [{ "type": "string" }, { "type": "null" }], "default": null, "title": "Default" },
            "description": { "title": "Description", "type": "string" },
            "name": { "title": "Name", "type": "string" },
            "required": { "default": true, "title": "Required", "type": "boolean" }
          },
          "required": ["name", "description"],
          "title": "EnvVar",
          "type": "object"
        }
      },
      "properties": {
        "headers": { "anyOf": [{ "additionalProperties": true, "type": "object" }, { "type": "null" }], "default": {}, "title": "Headers" },
        "limit": { "anyOf": [{ "type": "integer" }, { "type": "null" }], "default": 10, "title": "Limit" },
        "proxy_location": { "anyOf": [{ "type": "string" }, { "type": "null" }], "default": "US", "title": "Proxy Location" },
        "search_url": { "default": "https://api.serply.io/v1/news/", "title": "Search Url", "type": "string" }
      },
      "title": "SerplyNewsSearchTool",
      "type": "object"
    },
    "name": "SerplyNewsSearchTool",
    "package_dependencies": [],
    "run_params_schema": {
      "description": "Input for Serply News Search.",
      "properties": {
        "search_query": { "description": "Mandatory search query you want to use to fetch news articles", "title": "Search Query", "type": "string" }
      },
      "required": ["search_query"],
      "title": "SerplyNewsSearchToolSchema",
      "type": "object"
    }
  },
  {
    "description": "A tool to perform scholarly literature search with a search_query.",
    "env_vars": [
      { "default": null, "description": "API key for Serply services", "name": "SERPLY_API_KEY", "required": true }
    ],
    "humanized_name": "Scholar Search",
    "init_params_schema": {
      "$defs": {
        "EnvVar": {
          "properties": {
            "default": { "anyOf": [{ "type": "string" }, { "type": "null" }], "default": null, "title": "Default" },
            "description": { "title": "Description", "type": "string" },
            "name": { "title": "Name", "type": "string" },
            "required": { "default": true, "title": "Required", "type": "boolean" }
          },
          "required": ["name", "description"],
          "title": "EnvVar",
          "type": "object"
        }
      },
      "properties": {
        "headers": { "anyOf": [{ "additionalProperties": true, "type": "object" }, { "type": "null" }], "default": {}, "title": "Headers" },
        "hl": { "anyOf": [{ "type": "string" }, { "type": "null" }], "default": "us", "title": "Hl" },
        "proxy_location": { "anyOf": [{ "type": "string" }, { "type": "null" }], "default": "US", "title": "Proxy Location" },
        "search_url": { "default": "https://api.serply.io/v1/scholar/", "title": "Search Url", "type": "string" }
      },
      "title": "SerplyScholarSearchTool",
      "type": "object"
    },
    "name": "SerplyScholarSearchTool",
    "package_dependencies": [],
    "run_params_schema": {
      "description": "Input for Serply Scholar Search.",
      "properties": {
        "search_query": { "description": "Mandatory search query you want to use to fetch scholarly literature", "title": "Search Query", "type": "string" }
      },
      "required": ["search_query"],
      "title": "SerplyScholarSearchToolSchema",
      "type": "object"
    }
  },
  {
    "description": "A tool to perform Google search with a search_query.",
    "env_vars": [
      { "default": null, "description": "API key for Serply services", "name": "SERPLY_API_KEY", "required": true }
    ],
    "humanized_name": "Google Search",
    "init_params_schema": {
      "$defs": {
        "EnvVar": {
          "properties": {
            "default": { "anyOf": [{ "type": "string" }, { "type": "null" }], "default": null, "title": "Default" },
            "description": { "title": "Description", "type": "string" },
            "name": { "title": "Name", "type": "string" },
            "required": { "default": true, "title": "Required", "type": "boolean" }
          },
          "required": ["name", "description"],
          "title": "EnvVar",
          "type": "object"
        }
      },
      "properties": {
        "device_type": { "anyOf": [{ "type": "string" }, { "type": "null" }], "default": "desktop", "title": "Device Type" },
        "headers": { "anyOf": [{ "additionalProperties": true, "type": "object" }, { "type": "null" }], "default": {}, "title": "Headers" },
        "hl": { "anyOf": [{ "type": "string" }, { "type": "null" }], "default": "us", "title": "Hl" },
        "limit": { "anyOf": [{ "type": "integer" }, { "type": "null" }], "default": 10, "title": "Limit" },
        "proxy_location": { "anyOf": [{ "type": "string" }, { "type": "null" }], "default": "US", "title": "Proxy Location" },
        "query_payload": { "anyOf": [{ "additionalProperties": true, "type": "object" }, { "type": "null" }], "default": {}, "title": "Query Payload" },
        "search_url": { "default": "https://api.serply.io/v1/search/", "title": "Search Url", "type": "string" }
      },
      "title": "SerplyWebSearchTool",
      "type": "object"
    },
    "name": "SerplyWebSearchTool",
    "package_dependencies": [],
    "run_params_schema": {
      "description": "Input for Serply Web Search.",
      "properties": {
        "search_query": { "description": "Mandatory search query you want to use to Google search", "title": "Search Query", "type": "string" }
      },
      "required": ["search_query"],
      "title": "SerplyWebSearchToolSchema",
      "type": "object"
    }
  },
  {
    "description": "A tool to convert a webpage to markdown to make it easier for LLMs to understand",
    "env_vars": [
      { "default": null, "description": "API key for Serply services", "name": "SERPLY_API_KEY", "required": true }
    ],
    "humanized_name": "Webpage to Markdown",
    "init_params_schema": {
      "$defs": {
        "Adapter": { "properties": {}, "title": "Adapter", "type": "object" },
        "EnvVar": {
          "properties": {
            "default": { "anyOf": [{ "type": "string" }, { "type": "null" }], "default": null, "title": "Default" },
            "description": { "title": "Description", "type": "string" },
            "name": { "title": "Name", "type": "string" },
            "required": { "default": true, "title": "Required", "type": "boolean" }
          },
          "required": ["name", "description"],
          "title": "EnvVar",
          "type": "object"
        }
      },
      "properties": {
        "adapter": { "$ref": "#/$defs/Adapter" },
        "config": { "anyOf": [{ "additionalProperties": true, "type": "object" }, { "type": "null" }], "default": null, "title": "Config" },
        "headers": { "anyOf": [{ "additionalProperties": true, "type": "object" }, { "type": "null" }], "default": {}, "title": "Headers" },
        "proxy_location": { "anyOf": [{ "type": "string" }, { "type": "null" }], "default": "US", "title": "Proxy Location" },
        "request_url": { "default": "https://api.serply.io/v1/request", "title": "Request Url", "type": "string" },
        "summarize": { "default": false, "title": "Summarize", "type": "boolean" }
      },
      "title": "SerplyWebpageToMarkdownTool",
      "type": "object"
    },
    "name": "SerplyWebpageToMarkdownTool",
    "package_dependencies": [],
    "run_params_schema": {
      "description": "Input for Serply Webpage to Markdown.",
      "properties": {
        "url": { "description": "Mandatory url you want to use to fetch and convert to markdown", "title": "Url", "type": "string" }
      },
      "required": ["url"],
      "title": "SerplyWebpageToMarkdownToolSchema",
      "type": "object"
    }
  },
  {
    "description": "A tool that can be used to semantic search a query from a database.",
    "env_vars": [
      { "default": null, "description": "A comprehensive URL string that can encapsulate host, port, username, password, and database information, often used in environments like SingleStore notebooks or specific frameworks. For example: 'me:p455w0rd@s2-host.com/my_db'", "name": "SINGLESTOREDB_URL", "required": false },
      { "default": null, "description": "Specifies the hostname, IP address, or URL of the SingleStoreDB workspace or cluster", "name": "SINGLESTOREDB_HOST", "required": false },
      { "default": null, "description": "Defines the port number on which the SingleStoreDB server is listening", "name": "SINGLESTOREDB_PORT", "required": false },
      { "default": null, "description": "Specifies the database user name", "name": "SINGLESTOREDB_USER", "required": false },
      { "default": null, "description": "Specifies the database user password", "name": "SINGLESTOREDB_PASSWORD", "required": false },
      { "default": null, "description": "Name of the database to connect to", "name": "SINGLESTOREDB_DATABASE", "required": false },
      { "default": null, "description": "File containing SSL key", "name": "SINGLESTOREDB_SSL_KEY", "required": false },
      { "default": null, "description": "File containing SSL certificate", "name": "SINGLESTOREDB_SSL_CERT", "required": false },
      { "default": null, "description": "File containing SSL certificate authority", "name": "SINGLESTOREDB_SSL_CA", "required": false },
      { "default": null, "description": "The timeout for connecting to the database in seconds", "name": "SINGLESTOREDB_CONNECT_TIMEOUT", "required": false }
    ],
    "humanized_name": "Search a database's table(s) content",
    "init_params_schema": {
      "$defs": {
        "EnvVar": {
          "properties": {
            "default": { "anyOf": [{ "type": "string" }, { "type": "null" }], "default": null, "title": "Default" },
            "description": { "title": "Description", "type": "string" },
            "name": { "title": "Name", "type": "string" },
            "required": { "default": true, "title": "Required", "type": "boolean" }
          },
          "required": ["name", "description"],
          "title": "EnvVar",
          "type": "object"
        }
      },
      "description": "A tool for performing semantic searches on SingleStore database tables.\n\nThis tool provides a safe interface for executing SELECT and SHOW queries\nagainst a SingleStore database with connection pooling for optimal performance.",
      "properties": {
        "connection_args": { "additionalProperties": true, "default": {}, "title": "Connection Args", "type": "object" },
        "connection_pool": { "anyOf": [{}, { "type": "null" }], "default": null, "title": "Connection Pool" }
      },
      "title": "SingleStoreSearchTool",
      "type": "object"
    },
    "name": "SingleStoreSearchTool",
    "package_dependencies": ["singlestoredb", "SQLAlchemy"],
    "run_params_schema": {
      "description": "Input schema for SingleStoreSearchTool.\n\nThis schema defines the expected input format for the search tool,\nensuring that only valid SELECT and SHOW queries are accepted.",
      "properties": {
        "search_query": { "description": "Mandatory semantic search query you want to use to search the database's content. Only SELECT and SHOW queries are supported.", "title": "Search Query", "type": "string" }
      },
      "required": ["search_query"],
      "title": "SingleStoreSearchToolSchema",
      "type": "object"
    }
  },
  {
    "description": "Execute SQL queries or semantic search on Snowflake data warehouse. Supports both raw SQL and natural language queries.",
    "env_vars": [],
    "humanized_name": "Snowflake Database Search",
    "init_params_schema": {
      "$defs": {
        "EnvVar": {
          "properties": {
            "default": { "anyOf": [{ "type": "string" }, { "type": "null" }], "default": null, "title": "Default" },
            "description": { "title": "Description", "type": "string" },
            "name": { "title": "Name", "type": "string" },
            "required": { "default": true, "title": "Required", "type": "boolean" }
          },
          "required": ["name", "description"],
          "title": "EnvVar",
          "type": "object"
        },
        "SnowflakeConfig": {
          "description": "Configuration for Snowflake connection.",
          "properties": {
            "account": { "description": "Snowflake account identifier", "pattern": "^[a-zA-Z0-9\\-_]+$", "title": "Account", "type": "string" },
            "database": { "anyOf": [{ "type": "string" }, { "type": "null" }], "default": null, "description": "Default database", "title": "Database" },
            "password": { "anyOf": [{ "format": "password", "type": "string", "writeOnly": true }, { "type": "null" }], "default": null, "description": "Snowflake password", "title": "Password" },
            "private_key_path": { "anyOf": [{ "type": "string" }, { "type": "null" }], "default": null, "description": "Path to private key file", "title": "Private Key Path" },
            "role": { "anyOf": [{ "type": "string" }, { "type": "null" }], "default": null, "description": "Snowflake role", "title": "Role" },
            "session_parameters": { "anyOf": [{ "additionalProperties": true, "type": "object" }, { "type": "null" }], "description": "Session parameters", "title": "Session Parameters" },
            "snowflake_schema": { "anyOf": [{ "type": "string" }, { "type": "null" }], "default": null, "description": "Default schema", "title": "Snowflake Schema" },
            "user": { "description": "Snowflake username", "title": "User", "type": "string" },
            "warehouse": { "anyOf": [{ "type": "string" }, { "type": "null" }], "default": null, "description": "Snowflake warehouse", "title": "Warehouse" }
          },
          "required": ["account", "user"],
          "title": "SnowflakeConfig",
          "type": "object"
        }
      },
      "description": "Tool for executing queries and semantic search on Snowflake.",
      "properties": {
        "config": { "$ref": "#/$defs/SnowflakeConfig", "description": "Snowflake connection configuration" },
        "enable_caching": { "default": true, "description": "Enable query result caching", "title": "Enable Caching", "type": "boolean" },
        "max_retries": { "default": 3, "description": "Maximum retry attempts", "title": "Max Retries", "type": "integer" },
        "pool_size": { "default": 5, "description": "Size of connection pool", "title": "Pool Size", "type": "integer" },
        "retry_delay": { "default": 1.0, "description": "Delay between retries in seconds", "title": "Retry Delay", "type": "number" }
      },
      "required": ["config"],
      "title": "SnowflakeSearchTool",
      "type": "object"
    },
    "name": "SnowflakeSearchTool",
    "package_dependencies": ["snowflake-connector-python", "snowflake-sqlalchemy", "cryptography"],
    "run_params_schema": {
      "description": "Input schema for SnowflakeSearchTool.",
      "properties": {
        "database": { "anyOf": [{ "type": "string" }, { "type": "null" }], "default": null, "description": "Override default database", "title": "Database" },
        "query": { "description": "SQL query or semantic search query to execute", "title": "Query", "type": "string" },
        "snowflake_schema": { "anyOf": [{ "type": "string" }, { "type": "null" }], "default": null, "description": "Override default schema", "title": "Snowflake Schema" },
        "timeout": { "anyOf": [{ "type": "integer" }, { "type": "null" }], "default": 300, "description": "Query timeout in seconds", "title": "Timeout" }
      },
      "required": ["query"],
      "title": "SnowflakeSearchToolInput",
      "type": "object"
    }
  },
{
"description": "A tool to scrape or crawl a website and return LLM-ready content.",
"env_vars": [
{
"default": null,
"description": "API key for Spider.cloud",
"name": "SPIDER_API_KEY",
"required": true
}
],
"humanized_name": "SpiderTool",
"init_params_schema": {
"$defs": {
"EnvVar": {
"properties": {
"default": {
"anyOf": [
{
"type": "string"
},
{
"type": "null"
}
],
"default": null,
"title": "Default"
},
"description": {
"title": "Description",
"type": "string"
},
"name": {
"title": "Name",
"type": "string"
},
"required": {
"default": true,
"title": "Required",
"type": "boolean"
}
},
"required": [
"name",
"description"
],
"title": "EnvVar",
"type": "object"
},
"SpiderToolConfig": {
"description": "Configuration settings for SpiderTool.\n\nContains all default values and constants used by SpiderTool.\nCentralizes configuration management for easier maintenance.",
"properties": {
"DEFAULT_CRAWL_LIMIT": {
"default": 5,
"title": "Default Crawl Limit",
"type": "integer"
},
"DEFAULT_REQUEST_MODE": {
"default": "smart",
"title": "Default Request Mode",
"type": "string"
},
"DEFAULT_RETURN_FORMAT": {
"default": "markdown",
"title": "Default Return Format",
"type": "string"
},
"FILTER_SVG": {
"default": true,
"title": "Filter Svg",
"type": "boolean"
}
},
"title": "SpiderToolConfig",
"type": "object"
}
},
"description": "Tool for scraping and crawling websites.\nThis tool provides functionality to either scrape a single webpage or crawl multiple\npages, returning content in a format suitable for LLM processing.",
"properties": {
"api_key": {
"anyOf": [
{
"type": "string"
},
{
"type": "null"
}
],
"default": null,
"title": "Api Key"
},
"config": {
"$ref": "#/$defs/SpiderToolConfig",
"default": {
"DEFAULT_CRAWL_LIMIT": 5,
"DEFAULT_REQUEST_MODE": "smart",
"DEFAULT_RETURN_FORMAT": "markdown",
"FILTER_SVG": true
}
},
"custom_params": {
"anyOf": [
{
"additionalProperties": true,
"type": "object"
},
{
"type": "null"
}
],
"default": null,
"title": "Custom Params"
},
"log_failures": {
"default": true,
"title": "Log Failures",
"type": "boolean"
},
"spider": {
"default": null,
"title": "Spider"
},
"website_url": {
"anyOf": [
{
"type": "string"
},
{
"type": "null"
}
],
"default": null,
"title": "Website Url"
}
},
"title": "SpiderTool",
"type": "object"
},
"name": "SpiderTool",
"package_dependencies": [
"spider-client"
],
"run_params_schema": {
"description": "Input schema for SpiderTool.",
"properties": {
"mode": {
"default": "scrape",
"description": "The mode of the SpiderTool. The only two allowed modes are `scrape` or `crawl`. Crawl mode will follow up to 5 links and return their content in markdown format.",
"enum": [
"scrape",
"crawl"
],
"title": "Mode",
"type": "string"
},
"website_url": {
"description": "Mandatory website URL to scrape or crawl",
"title": "Website Url",
"type": "string"
}
},
"required": [
"website_url"
],
"title": "SpiderToolSchema",
"type": "object"
}
},
|
|
{
"description": "Use this tool to control a web browser and interact with websites using natural language.\n\n    Capabilities:\n    - Navigate to websites and follow links\n    - Click buttons, links, and other elements\n    - Fill in forms and input fields\n    - Search within websites\n    - Extract information from web pages\n    - Identify and analyze elements on a page\n\n    To use this tool, provide a natural language instruction describing what you want to do.\n    For reliability on complex pages, use specific, atomic instructions with location hints:\n    - Good: \"Click the search box in the header\"\n    - Good: \"Type 'Italy' in the focused field\"\n    - Bad: \"Search for Italy and click the first result\"\n\n    For different types of tasks, specify the command_type:\n    - 'act': For performing one atomic action (default)\n    - 'navigate': For navigating to a URL\n    - 'extract': For getting data from a specific page section\n    - 'observe': For finding elements in a specific area",
"env_vars": [],
"humanized_name": "Web Automation Tool",
"init_params_schema": {
"$defs": {
"AvailableModel": {
"enum": [
"gpt-4o",
"gpt-4o-mini",
"claude-3-5-sonnet-latest",
"claude-3-7-sonnet-latest",
"computer-use-preview",
"gemini-2.0-flash"
],
"title": "AvailableModel",
"type": "string"
},
"EnvVar": {
"properties": {
"default": {
"anyOf": [
{
"type": "string"
},
{
"type": "null"
}
],
"default": null,
"title": "Default"
},
"description": {
"title": "Description",
"type": "string"
},
"name": {
"title": "Name",
"type": "string"
},
"required": {
"default": true,
"title": "Required",
"type": "boolean"
}
},
"required": [
"name",
"description"
],
"title": "EnvVar",
"type": "object"
}
},
"description": "A tool that uses Stagehand to automate web browser interactions using natural language with atomic action handling.\n\nStagehand allows AI agents to interact with websites through a browser,\nperforming actions like clicking buttons, filling forms, and extracting data.\n\nThe tool supports four main command types:\n1. act - Perform actions like clicking, typing, scrolling, or navigating\n2. navigate - Specifically navigate to a URL (shorthand for act with navigation)\n3. extract - Extract structured data from web pages\n4. observe - Identify and analyze elements on a page\n\nUsage examples:\n- Navigate to a website: instruction=\"Go to the homepage\", url=\"https://example.com\"\n- Click a button: instruction=\"Click the login button\"\n- Fill a form: instruction=\"Fill the login form with username 'user' and password 'pass'\"\n- Extract data: instruction=\"Extract all product prices and names\", command_type=\"extract\"\n- Observe elements: instruction=\"Find all navigation menu items\", command_type=\"observe\"\n- Complex tasks: instruction=\"Step 1: Navigate to https://example.com; Step 2: Scroll down to the 'Features' section; Step 3: Click 'Learn More'\", command_type=\"act\"\n\nExample of breaking down \"Search for OpenAI\" into multiple steps:\n1. First navigation: instruction=\"Go to Google\", url=\"https://google.com\", command_type=\"navigate\"\n2. Enter search term: instruction=\"Type 'OpenAI' in the search box\", command_type=\"act\"\n3. Submit search: instruction=\"Press the Enter key or click the search button\", command_type=\"act\"\n4. Click on result: instruction=\"Click on the OpenAI website link in the search results\", command_type=\"act\"",
"properties": {
"api_key": {
"anyOf": [
{
"type": "string"
},
{
"type": "null"
}
],
"default": null,
"title": "Api Key"
},
"dom_settle_timeout_ms": {
"default": 3000,
"title": "Dom Settle Timeout Ms",
"type": "integer"
},
"headless": {
"default": false,
"title": "Headless",
"type": "boolean"
},
"max_retries_on_token_limit": {
"default": 3,
"title": "Max Retries On Token Limit",
"type": "integer"
},
"model_api_key": {
"anyOf": [
{
"type": "string"
},
{
"type": "null"
}
],
"default": null,
"title": "Model Api Key"
},
"model_name": {
"anyOf": [
{
"$ref": "#/$defs/AvailableModel"
},
{
"type": "null"
}
],
"default": "claude-3-7-sonnet-latest"
},
"project_id": {
"anyOf": [
{
"type": "string"
},
{
"type": "null"
}
],
"default": null,
"title": "Project Id"
},
"self_heal": {
"default": true,
"title": "Self Heal",
"type": "boolean"
},
"server_url": {
"anyOf": [
{
"type": "string"
},
{
"type": "null"
}
],
"default": "https://api.stagehand.browserbase.com/v1",
"title": "Server Url"
},
"use_simplified_dom": {
"default": true,
"title": "Use Simplified Dom",
"type": "boolean"
},
"verbose": {
"default": 1,
"title": "Verbose",
"type": "integer"
},
"wait_for_captcha_solves": {
"default": true,
"title": "Wait For Captcha Solves",
"type": "boolean"
}
},
"title": "StagehandTool",
"type": "object"
},
"name": "StagehandTool",
"package_dependencies": [],
"run_params_schema": {
"description": "Input for StagehandTool.",
"properties": {
"command_type": {
"anyOf": [
{
"type": "string"
},
{
"type": "null"
}
],
"default": "act",
"description": "The type of command to execute (choose one):\n - 'act': Perform an action like clicking buttons, filling forms, etc. (default)\n - 'navigate': Specifically navigate to a URL\n - 'extract': Extract structured data from the page\n - 'observe': Identify and analyze elements on the page\n ",
"title": "Command Type"
},
"instruction": {
"anyOf": [
{
"type": "string"
},
{
"type": "null"
}
],
"default": null,
"description": "Single atomic action with location context. For reliability on complex pages, use ONE specific action with location hints. Good examples: 'Click the search input field in the header', 'Type Italy in the focused field', 'Press Enter', 'Click the first link in the results area'. Avoid combining multiple actions. For 'navigate' command type, this can be omitted if only URL is provided.",
"title": "Instruction"
},
"url": {
"anyOf": [
{
"type": "string"
},
{
"type": "null"
}
],
"default": null,
"description": "The URL to navigate to before executing the instruction. MUST be used with 'navigate' command. ",
"title": "Url"
}
},
"title": "StagehandToolSchema",
"type": "object"
}
},
|
|
{
"description": "A tool that can be used to semantic search a query from a txt's content.",
"env_vars": [],
"humanized_name": "Search a txt's content",
"init_params_schema": {
"$defs": {
"Adapter": {
"properties": {},
"title": "Adapter",
"type": "object"
},
"EnvVar": {
"properties": {
"default": {
"anyOf": [
{
"type": "string"
},
{
"type": "null"
}
],
"default": null,
"title": "Default"
},
"description": {
"title": "Description",
"type": "string"
},
"name": {
"title": "Name",
"type": "string"
},
"required": {
"default": true,
"title": "Required",
"type": "boolean"
}
},
"required": [
"name",
"description"
],
"title": "EnvVar",
"type": "object"
}
},
"properties": {
"adapter": {
"$ref": "#/$defs/Adapter"
},
"config": {
"anyOf": [
{
"additionalProperties": true,
"type": "object"
},
{
"type": "null"
}
],
"default": null,
"title": "Config"
},
"summarize": {
"default": false,
"title": "Summarize",
"type": "boolean"
}
},
"title": "TXTSearchTool",
"type": "object"
},
"name": "TXTSearchTool",
"package_dependencies": [],
"run_params_schema": {
"description": "Input for TXTSearchTool.",
"properties": {
"search_query": {
"description": "Mandatory search query you want to use to search the txt's content",
"title": "Search Query",
"type": "string"
},
"txt": {
"description": "Mandatory txt path you want to search",
"title": "Txt",
"type": "string"
}
},
"required": [
"search_query",
"txt"
],
"title": "TXTSearchToolSchema",
"type": "object"
}
},
|
|
{
"description": "Extracts content from one or more web pages using the Tavily API. Returns structured data.",
"env_vars": [
{
"default": null,
"description": "API key for Tavily extraction service",
"name": "TAVILY_API_KEY",
"required": true
}
],
"humanized_name": "TavilyExtractorTool",
"init_params_schema": {
"$defs": {
"EnvVar": {
"properties": {
"default": {
"anyOf": [
{
"type": "string"
},
{
"type": "null"
}
],
"default": null,
"title": "Default"
},
"description": {
"title": "Description",
"type": "string"
},
"name": {
"title": "Name",
"type": "string"
},
"required": {
"default": true,
"title": "Required",
"type": "boolean"
}
},
"required": [
"name",
"description"
],
"title": "EnvVar",
"type": "object"
}
},
"properties": {
"api_key": {
"anyOf": [
{
"type": "string"
},
{
"type": "null"
}
],
"description": "The Tavily API key. If not provided, it will be loaded from the environment variable TAVILY_API_KEY.",
"title": "Api Key"
},
"async_client": {
"anyOf": [
{},
{
"type": "null"
}
],
"default": null,
"title": "Async Client"
},
"client": {
"anyOf": [
{},
{
"type": "null"
}
],
"default": null,
"title": "Client"
},
"extract_depth": {
"default": "basic",
"description": "The depth of extraction. 'basic' for basic extraction, 'advanced' for advanced extraction.",
"enum": [
"basic",
"advanced"
],
"title": "Extract Depth",
"type": "string"
},
"include_images": {
"default": false,
"description": "Whether to include images in the extraction.",
"title": "Include Images",
"type": "boolean"
},
"proxies": {
"anyOf": [
{
"additionalProperties": {
"type": "string"
},
"type": "object"
},
{
"type": "null"
}
],
"default": null,
"description": "Optional proxies to use for the Tavily API requests.",
"title": "Proxies"
},
"timeout": {
"default": 60,
"description": "The timeout for the extraction request in seconds.",
"title": "Timeout",
"type": "integer"
}
},
"title": "TavilyExtractorTool",
"type": "object"
},
"name": "TavilyExtractorTool",
"package_dependencies": [
"tavily-python"
],
"run_params_schema": {
"description": "Input schema for TavilyExtractorTool.",
"properties": {
"urls": {
"anyOf": [
{
"items": {
"type": "string"
},
"type": "array"
},
{
"type": "string"
}
],
"description": "The URL(s) to extract data from. Can be a single URL or a list of URLs.",
"title": "Urls"
}
},
"required": [
"urls"
],
"title": "TavilyExtractorToolSchema",
"type": "object"
}
},
|
|
{
"description": "A tool that performs web searches using the Tavily Search API. It returns a JSON object containing the search results.",
"env_vars": [
{
"default": null,
"description": "API key for Tavily search service",
"name": "TAVILY_API_KEY",
"required": true
}
],
"humanized_name": "Tavily Search",
"init_params_schema": {
"$defs": {
"EnvVar": {
"properties": {
"default": {
"anyOf": [
{
"type": "string"
},
{
"type": "null"
}
],
"default": null,
"title": "Default"
},
"description": {
"title": "Description",
"type": "string"
},
"name": {
"title": "Name",
"type": "string"
},
"required": {
"default": true,
"title": "Required",
"type": "boolean"
}
},
"required": [
"name",
"description"
],
"title": "EnvVar",
"type": "object"
}
},
"description": "Tool that uses the Tavily Search API to perform web searches.\n\nAttributes:\n    client: An instance of TavilyClient.\n    async_client: An instance of AsyncTavilyClient.\n    name: The name of the tool.\n    description: A description of the tool's purpose.\n    args_schema: The schema for the tool's arguments.\n    api_key: The Tavily API key.\n    proxies: Optional proxies for the API requests.\n    search_depth: The depth of the search.\n    topic: The topic to focus the search on.\n    time_range: The time range for the search.\n    days: The number of days to search back.\n    max_results: The maximum number of results to return.\n    include_domains: A list of domains to include in the search.\n    exclude_domains: A list of domains to exclude from the search.\n    include_answer: Whether to include a direct answer to the query.\n    include_raw_content: Whether to include the raw content of the search results.\n    include_images: Whether to include images in the search results.\n    timeout: The timeout for the search request in seconds.\n    max_content_length_per_result: Maximum length for the 'content' of each search result.",
"properties": {
"api_key": {
"anyOf": [
{
"type": "string"
},
{
"type": "null"
}
],
"description": "The Tavily API key. If not provided, it will be loaded from the environment variable TAVILY_API_KEY.",
"title": "Api Key"
},
"async_client": {
"anyOf": [
{},
{
"type": "null"
}
],
"default": null,
"title": "Async Client"
},
"client": {
"anyOf": [
{},
{
"type": "null"
}
],
"default": null,
"title": "Client"
},
"days": {
"default": 7,
"description": "The number of days to search back.",
"title": "Days",
"type": "integer"
},
"exclude_domains": {
"anyOf": [
{
"items": {
"type": "string"
},
"type": "array"
},
{
"type": "null"
}
],
"default": null,
"description": "A list of domains to exclude from the search.",
"title": "Exclude Domains"
},
"include_answer": {
"anyOf": [
{
"type": "boolean"
},
{
"enum": [
"basic",
"advanced"
],
"type": "string"
}
],
"default": false,
"description": "Whether to include a direct answer to the query.",
"title": "Include Answer"
},
"include_domains": {
"anyOf": [
{
"items": {
"type": "string"
},
"type": "array"
},
{
"type": "null"
}
],
"default": null,
"description": "A list of domains to include in the search.",
"title": "Include Domains"
},
"include_images": {
"default": false,
"description": "Whether to include images in the search results.",
"title": "Include Images",
"type": "boolean"
},
"include_raw_content": {
"default": false,
"description": "Whether to include the raw content of the search results.",
"title": "Include Raw Content",
"type": "boolean"
},
"max_content_length_per_result": {
"default": 1000,
"description": "Maximum length for the 'content' of each search result to avoid context window issues.",
"title": "Max Content Length Per Result",
"type": "integer"
},
"max_results": {
"default": 5,
"description": "The maximum number of results to return.",
"title": "Max Results",
"type": "integer"
},
"proxies": {
"anyOf": [
{
"additionalProperties": {
"type": "string"
},
"type": "object"
},
{
"type": "null"
}
],
"default": null,
"description": "Optional proxies to use for the Tavily API requests.",
"title": "Proxies"
},
"search_depth": {
"default": "basic",
"description": "The depth of the search.",
"enum": [
"basic",
"advanced"
],
"title": "Search Depth",
"type": "string"
},
"time_range": {
"anyOf": [
{
"enum": [
"day",
"week",
"month",
"year"
],
"type": "string"
},
{
"type": "null"
}
],
"default": null,
"description": "The time range for the search.",
"title": "Time Range"
},
"timeout": {
"default": 60,
"description": "The timeout for the search request in seconds.",
"title": "Timeout",
"type": "integer"
},
"topic": {
"default": "general",
"description": "The topic to focus the search on.",
"enum": [
"general",
"news",
"finance"
],
"title": "Topic",
"type": "string"
}
},
"title": "TavilySearchTool",
"type": "object"
},
"name": "TavilySearchTool",
"package_dependencies": [
"tavily-python"
],
"run_params_schema": {
"description": "Input schema for TavilySearchTool.",
"properties": {
"query": {
"description": "The search query string.",
"title": "Query",
"type": "string"
}
},
"required": [
"query"
],
"title": "TavilySearchToolSchema",
"type": "object"
}
},
|
|
{
"description": "This tool uses OpenAI's Vision API to describe the contents of an image.",
"env_vars": [
{
"default": null,
"description": "API key for OpenAI services",
"name": "OPENAI_API_KEY",
"required": true
}
],
"humanized_name": "Vision Tool",
"init_params_schema": {
"$defs": {
"EnvVar": {
"properties": {
"default": {
"anyOf": [
{
"type": "string"
},
{
"type": "null"
}
],
"default": null,
"title": "Default"
},
"description": {
"title": "Description",
"type": "string"
},
"name": {
"title": "Name",
"type": "string"
},
"required": {
"default": true,
"title": "Required",
"type": "boolean"
}
},
"required": [
"name",
"description"
],
"title": "EnvVar",
"type": "object"
}
},
"description": "Tool for analyzing images using vision models.\n\nArgs:\n    llm: Optional LLM instance to use\n    model: Model identifier to use if no LLM is provided",
"properties": {},
"title": "VisionTool",
"type": "object"
},
"name": "VisionTool",
"package_dependencies": [],
"run_params_schema": {
"description": "Input for Vision Tool.",
"properties": {
"image_path_url": {
"default": "The image path or URL.",
"title": "Image Path Url",
"type": "string"
}
},
"title": "ImagePromptSchema",
"type": "object"
}
},
|
|
{
"description": "A tool to search the Weaviate database for relevant information on internal documents.",
"env_vars": [
{
"default": null,
"description": "OpenAI API key for embedding generation and retrieval",
"name": "OPENAI_API_KEY",
"required": true
}
],
"humanized_name": "WeaviateVectorSearchTool",
"init_params_schema": {
"$defs": {
"EnvVar": {
"properties": {
"default": {
"anyOf": [
{
"type": "string"
},
{
"type": "null"
}
],
"default": null,
"title": "Default"
},
"description": {
"title": "Description",
"type": "string"
},
"name": {
"title": "Name",
"type": "string"
},
"required": {
"default": true,
"title": "Required",
"type": "boolean"
}
},
"required": [
"name",
"description"
],
"title": "EnvVar",
"type": "object"
}
},
"description": "Tool to search the Weaviate database",
"properties": {
"alpha": {
"anyOf": [
{
"type": "integer"
},
{
"type": "null"
}
],
"default": 0.75,
"title": "Alpha"
},
"collection_name": {
"anyOf": [
{
"type": "string"
},
{
"type": "null"
}
],
"default": null,
"title": "Collection Name"
},
"generative_model": {
"anyOf": [
{
"type": "string"
},
{
"type": "null"
}
],
"default": null,
"title": "Generative Model"
},
"headers": {
"anyOf": [
{
"additionalProperties": true,
"type": "object"
},
{
"type": "null"
}
],
"default": null,
"title": "Headers"
},
"limit": {
"anyOf": [
{
"type": "integer"
},
{
"type": "null"
}
],
"default": 3,
"title": "Limit"
},
"query": {
"anyOf": [
{
"type": "string"
},
{
"type": "null"
}
],
"default": null,
"title": "Query"
},
"vectorizer": {
"anyOf": [
{},
{
"type": "null"
}
],
"default": null,
"title": "Vectorizer"
},
"weaviate_api_key": {
"description": "The API key for the Weaviate cluster",
"title": "Weaviate Api Key",
"type": "string"
},
"weaviate_cluster_url": {
"description": "The URL of the Weaviate cluster",
"title": "Weaviate Cluster Url",
"type": "string"
}
},
"required": [
"weaviate_cluster_url",
"weaviate_api_key"
],
"title": "WeaviateVectorSearchTool",
"type": "object"
},
"name": "WeaviateVectorSearchTool",
"package_dependencies": [
"weaviate-client"
],
"run_params_schema": {
"description": "Input for WeaviateTool.",
"properties": {
"query": {
"description": "The query to search retrieve relevant information from the Weaviate database. Pass only the query, not the question.",
"title": "Query",
"type": "string"
}
},
"required": [
"query"
],
"title": "WeaviateToolSchema",
"type": "object"
}
},
|
|
{
"description": "A tool that can be used to semantic search a query from a specific URL content.",
"env_vars": [],
"humanized_name": "Search in a specific website",
"init_params_schema": {
"$defs": {
"Adapter": {
"properties": {},
"title": "Adapter",
"type": "object"
},
"EnvVar": {
"properties": {
"default": {
"anyOf": [
{
"type": "string"
},
{
"type": "null"
}
],
"default": null,
"title": "Default"
},
"description": {
"title": "Description",
"type": "string"
},
"name": {
"title": "Name",
"type": "string"
},
"required": {
"default": true,
"title": "Required",
"type": "boolean"
}
},
"required": [
"name",
"description"
],
"title": "EnvVar",
"type": "object"
}
},
"properties": {
"adapter": {
"$ref": "#/$defs/Adapter"
},
"config": {
"anyOf": [
{
"additionalProperties": true,
"type": "object"
},
{
"type": "null"
}
],
"default": null,
"title": "Config"
},
"summarize": {
"default": false,
"title": "Summarize",
"type": "boolean"
}
},
"title": "WebsiteSearchTool",
"type": "object"
},
"name": "WebsiteSearchTool",
"package_dependencies": [],
"run_params_schema": {
"description": "Input for WebsiteSearchTool.",
"properties": {
"search_query": {
"description": "Mandatory search query you want to use to search a specific website",
"title": "Search Query",
"type": "string"
},
"website": {
"description": "Mandatory valid website URL you want to search on",
"title": "Website",
"type": "string"
}
},
"required": [
"search_query",
"website"
],
"title": "WebsiteSearchToolSchema",
"type": "object"
}
},
|
|
{
"description": "A tool that can be used to semantic search a query from a XML's content.",
"env_vars": [],
"humanized_name": "Search a XML's content",
"init_params_schema": {
"$defs": {
"Adapter": {
"properties": {},
"title": "Adapter",
"type": "object"
},
"EnvVar": {
"properties": {
"default": {
"anyOf": [
{
"type": "string"
},
{
"type": "null"
}
],
"default": null,
"title": "Default"
},
"description": {
"title": "Description",
"type": "string"
},
"name": {
"title": "Name",
"type": "string"
},
"required": {
"default": true,
"title": "Required",
"type": "boolean"
}
},
"required": [
"name",
"description"
],
"title": "EnvVar",
"type": "object"
}
},
"properties": {
"adapter": {
"$ref": "#/$defs/Adapter"
},
"config": {
"anyOf": [
{
"additionalProperties": true,
"type": "object"
},
{
"type": "null"
}
],
"default": null,
"title": "Config"
},
"summarize": {
"default": false,
"title": "Summarize",
"type": "boolean"
}
},
"title": "XMLSearchTool",
"type": "object"
},
"name": "XMLSearchTool",
"package_dependencies": [],
"run_params_schema": {
"description": "Input for XMLSearchTool.",
"properties": {
"search_query": {
"description": "Mandatory search query you want to use to search the XML's content",
"title": "Search Query",
"type": "string"
},
"xml": {
"description": "Mandatory xml path you want to search",
"title": "Xml",
"type": "string"
}
},
"required": [
"search_query",
"xml"
],
"title": "XMLSearchToolSchema",
"type": "object"
}
},
|
|
{
"description": "A tool that can be used to semantic search a query from a Youtube Channels content.",
"env_vars": [],
"humanized_name": "Search a Youtube Channels content",
"init_params_schema": {
"$defs": {
"Adapter": {
"properties": {},
"title": "Adapter",
"type": "object"
},
"EnvVar": {
"properties": {
"default": {
"anyOf": [
{
"type": "string"
},
{
"type": "null"
}
],
"default": null,
"title": "Default"
},
"description": {
"title": "Description",
"type": "string"
},
"name": {
"title": "Name",
"type": "string"
},
"required": {
"default": true,
"title": "Required",
"type": "boolean"
}
},
"required": [
"name",
"description"
],
"title": "EnvVar",
"type": "object"
}
},
"properties": {
"adapter": {
"$ref": "#/$defs/Adapter"
},
"config": {
"anyOf": [
{
"additionalProperties": true,
"type": "object"
},
{
"type": "null"
}
],
"default": null,
"title": "Config"
},
"summarize": {
"default": false,
"title": "Summarize",
"type": "boolean"
}
},
"title": "YoutubeChannelSearchTool",
"type": "object"
},
"name": "YoutubeChannelSearchTool",
"package_dependencies": [],
"run_params_schema": {
"description": "Input for YoutubeChannelSearchTool.",
"properties": {
"search_query": {
"description": "Mandatory search query you want to use to search the Youtube Channels content",
"title": "Search Query",
"type": "string"
},
"youtube_channel_handle": {
"description": "Mandatory youtube_channel_handle path you want to search",
"title": "Youtube Channel Handle",
"type": "string"
}
},
"required": [
"search_query",
"youtube_channel_handle"
],
"title": "YoutubeChannelSearchToolSchema",
"type": "object"
}
},
|
|
{
|
|
"description": "A tool that can be used to semantic search a query from a Youtube Video content.",
|
|
"env_vars": [],
|
|
"humanized_name": "Search a Youtube Video content",
|
|
"init_params_schema": {
|
|
"$defs": {
|
|
"Adapter": {
|
|
"properties": {},
|
|
"title": "Adapter",
|
|
"type": "object"
|
|
},
|
|
"EnvVar": {
|
|
"properties": {
|
|
"default": {
|
|
"anyOf": [
|
|
{
|
|
"type": "string"
|
|
},
|
|
{
|
|
"type": "null"
|
|
}
|
|
],
|
|
"default": null,
|
|
"title": "Default"
|
|
},
|
|
"description": {
|
|
"title": "Description",
|
|
"type": "string"
|
|
},
|
|
"name": {
|
|
"title": "Name",
|
|
"type": "string"
|
|
},
|
|
"required": {
|
|
"default": true,
|
|
"title": "Required",
|
|
"type": "boolean"
|
|
}
|
|
},
|
|
"required": [
|
|
"name",
|
|
"description"
|
|
],
|
|
"title": "EnvVar",
|
|
"type": "object"
|
|
}
|
|
},
|
|
"properties": {
|
|
"adapter": {
|
|
"$ref": "#/$defs/Adapter"
|
|
},
|
|
"config": {
|
|
"anyOf": [
|
|
{
|
|
"additionalProperties": true,
|
|
"type": "object"
|
|
},
|
|
{
|
|
"type": "null"
|
|
}
|
|
],
|
|
"default": null,
|
|
"title": "Config"
|
|
},
|
|
"summarize": {
|
|
"default": false,
|
|
"title": "Summarize",
|
|
"type": "boolean"
|
|
}
|
|
},
|
|
"title": "YoutubeVideoSearchTool",
|
|
"type": "object"
|
|
},
|
|
"name": "YoutubeVideoSearchTool",
|
|
"package_dependencies": [],
|
|
"run_params_schema": {
|
|
"description": "Input for YoutubeVideoSearchTool.",
|
|
"properties": {
|
|
"search_query": {
|
|
"description": "Mandatory search query you want to use to search the Youtube Video content",
|
|
"title": "Search Query",
|
|
"type": "string"
|
|
},
|
|
"youtube_video_url": {
|
|
"description": "Mandatory youtube_video_url path you want to search",
|
|
"title": "Youtube Video Url",
|
|
"type": "string"
|
|
}
|
|
},
|
|
"required": [
|
|
"search_query",
|
|
"youtube_video_url"
|
|
],
|
|
"title": "YoutubeVideoSearchToolSchema",
|
|
"type": "object"
|
|
}
|
|
}
|
|
]
|
|
} |