Mirror of https://github.com/crewAIInc/crewAI.git, synced 2026-01-10 00:28:31 +00:00
Merge branch 'main' into lg-support-set-task-context
.github/workflows/linter.yml (vendored): 25 changed lines
@@ -5,12 +5,29 @@ on: [pull_request]
 jobs:
   lint:
     runs-on: ubuntu-latest
+    env:
+      TARGET_BRANCH: ${{ github.event.pull_request.base.ref }}
     steps:
       - uses: actions/checkout@v4
+        with:
+          fetch-depth: 0
 
-      - name: Install Requirements
+      - name: Fetch Target Branch
+        run: git fetch origin $TARGET_BRANCH --depth=1
+
+      - name: Install Ruff
+        run: pip install ruff
+
+      - name: Get Changed Python Files
+        id: changed-files
         run: |
-          pip install ruff
+          merge_base=$(git merge-base origin/"$TARGET_BRANCH" HEAD)
+          changed_files=$(git diff --name-only --diff-filter=ACMRTUB "$merge_base" | grep '\.py$' || true)
+          echo "files<<EOF" >> $GITHUB_OUTPUT
+          echo "$changed_files" >> $GITHUB_OUTPUT
+          echo "EOF" >> $GITHUB_OUTPUT
 
-      - name: Run Ruff Linter
-        run: ruff check
+      - name: Run Ruff on Changed Files
+        if: ${{ steps.changed-files.outputs.files != '' }}
+        run: |
+          echo "${{ steps.changed-files.outputs.files }}" | tr " " "\n" | xargs -I{} ruff check "{}"
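The new "Get Changed Python Files" and "Run Ruff on Changed Files" steps diff against the merge-base with the PR's target branch and lint only the changed `.py` files. The sketch below is a rough local equivalent of that logic in Python; the script itself is illustrative and not part of the workflow, and the `target_branch` default is an assumption.

```python
import subprocess

def changed_python_files(target_branch: str = "main") -> list[str]:
    """Mirror the workflow: diff against the merge-base with the
    target branch and keep only the changed .py files."""
    merge_base = subprocess.run(
        ["git", "merge-base", f"origin/{target_branch}", "HEAD"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    diff = subprocess.run(
        ["git", "diff", "--name-only", "--diff-filter=ACMRTUB", merge_base],
        capture_output=True, text=True, check=True,
    ).stdout.splitlines()
    return [f for f in diff if f.endswith(".py")]

if __name__ == "__main__":
    files = changed_python_files()
    if files:
        # Same effect as the xargs invocation in the workflow step.
        subprocess.run(["ruff", "check", *files], check=True)
```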
@@ -2,8 +2,3 @@ exclude = [
     "templates",
     "__init__.py",
 ]
-
-[lint]
-select = [
-    "I", # isort rules
-]
@@ -700,4 +700,11 @@ recent_news = SpaceNewsKnowledgeSource(
 - Configure appropriate embedding models
 - Consider using local embedding providers for faster processing
 </Accordion>
+
+<Accordion title="One Time Knowledge">
+- With the typical file structure provided by CrewAI, knowledge sources are embedded every time the kickoff is triggered.
+- If the knowledge sources are large, this leads to inefficiency and increased latency, as the same data is embedded each time.
+- To avoid this, directly initialize the `knowledge` parameter instead of the `knowledge_sources` parameter.
+- See the [Github Issue](https://github.com/crewAIInc/crewAI/issues/2755) for the full context.
+</Accordion>
 </AccordionGroup>
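As a sketch of the last point in the new accordion: build the Knowledge object once and hand it to the crew via `knowledge`, rather than passing `knowledge_sources` and re-embedding on every kickoff. The agent/task definitions and the `collection_name` argument are illustrative assumptions; check your CrewAI version for the exact `Knowledge` constructor signature.

```python
from crewai import Agent, Crew, Task
from crewai.knowledge.knowledge import Knowledge
from crewai.knowledge.source.string_knowledge_source import StringKnowledgeSource

# Build (and embed) the knowledge once, outside the kickoff path.
facts = StringKnowledgeSource(content="The user's name is John. He is 30 years old.")
knowledge = Knowledge(
    collection_name="user_facts",  # assumed argument name; verify against your CrewAI version
    sources=[facts],
)

agent = Agent(
    role="Assistant",
    goal="Answer questions about the user",
    backstory="You answer strictly from the provided knowledge.",
)
task = Task(
    description="What is the user's age?",
    expected_output="A short answer.",
    agent=agent,
)

# Pass the pre-initialized knowledge instead of knowledge_sources,
# so repeated kickoffs reuse the existing embeddings.
crew = Crew(agents=[agent], tasks=[task], knowledge=knowledge)
result = crew.kickoff()
```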