On this page:
https://docs.phidata.com/tools/crawl4ai
please specify what `max_length` means: characters, words, or tokens (I suppose characters, but I'm not sure).
I also have a weird experience:
phidata: v2.7.6
crawl4ai: v0.4.246
Please note that I set `max_length` to 4000 in the example code,
but the terminal shows 10000:
- Running: web_crawler(max_length=10000, url=https://huggingface.co/blog/smolagents)
Note that I definitely saved the code before running; I even changed the scraped URL to confirm it... :)
EDIT:
Note that I also tested your hackernews.py example:
https://github.com/phidatahq/phidata/blob/main/cookbook/async/hackernews.py
When running it, the terminal showed:
Running: get_top_hackernews_stories(num_stories=100)
while 10 was set in the code, so this is a second instance of the same weird behavior.