paraglide-js experimental extract #334

Open
4 tasks
samuelstroschein opened this issue Jan 22, 2025 · 2 comments
Labels
documentation (Improvements or additions to documentation), v2.0

Comments

@samuelstroschein
Member

samuelstroschein commented Jan 22, 2025

Context

This is just waaaaaaaaaaay too good and is more or less what has been envisioned by https://github.com/orgs/opral/discussions/111

Video attachment: Extracting.Messages.with.LLMs.mp4

Proposal

Create a paraglide-js experimental extract --open-api-key <key> command, OR use DeepSeek R1 locally (much better). The command (see the sketch after this list):

  • loads a project
  • lets an LLM crawl the source code and extract messages
  • uses saveProjectToDirectory to de-couple the command from a specific plugin or syntax
  • optionally extracts links as well with localizePath()
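A minimal sketch of what the command could do internally, not a spec: it assumes loadProjectFromDirectory and saveProjectToDirectory are the inlang SDK entry points (only saveProjectToDirectory is named above, and signatures may differ), and extractMessagesWithLlm / ExtractedMessage are hypothetical helpers standing in for the OpenAI or local DeepSeek R1 call.

```ts
import fs from "node:fs";
import { loadProjectFromDirectory, saveProjectToDirectory } from "@inlang/sdk";

type ExtractedMessage = {
	id: string; // proposed message key, e.g. "home_title"
	text: string; // hardcoded string found in the source
	file: string; // file it was extracted from
};

// hypothetical helper: prompts the model with source files and parses the
// proposed messages out of its response
async function extractMessagesWithLlm(args: {
	sourceGlob: string;
	apiKey?: string; // omit when a local model (e.g. DeepSeek R1) is used
}): Promise<ExtractedMessage[]> {
	throw new Error("illustration only – not implemented");
}

export async function experimentalExtract(args: {
	projectPath: string;
	sourceGlob: string;
	apiKey?: string;
}) {
	// 1. load the project
	const project = await loadProjectFromDirectory({ fs, path: args.projectPath });

	// 2. let the LLM crawl the source code and extract messages
	const messages = await extractMessagesWithLlm({
		sourceGlob: args.sourceGlob,
		apiKey: args.apiKey,
	});

	// 3. write the proposed messages into the project
	//    (the concrete insert API is SDK-specific and omitted in this sketch)
	for (const m of messages) {
		console.log(`would add ${m.id} from ${m.file}: "${m.text}"`);
	}

	// 4. persisting via saveProjectToDirectory keeps the command de-coupled
	//    from any specific plugin or message syntax
	await saveProjectToDirectory({ fs, project, path: args.projectPath });
}
```

Invocation would then roughly be paraglide-js experimental extract --open-api-key <key>, or without the key when a local model is configured; extracting localizePath() links could hang off the same crawl step.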
@samuelstroschein added the documentation and v2.0 labels on Jan 22, 2025
@samuelstroschein changed the title from "add "extracting messages" via copilot to the docs with video tutorial" to "paraglide-js experimental extract --open-api-key <key>" on Jan 23, 2025
@oskar-gmerek

I am not sure how this works currently. But by feeding Sherlock all the messages and the source code of the page, the translations should be much better, because LLMs are pretty good at translating when they have enough context.

Extracting messages is also a perfect job for LLMs.

@samuelstroschein changed the title from "paraglide-js experimental extract --open-api-key <key>" to "paraglide-js experimental extract" on Jan 24, 2025
@samuelstroschein
Member Author

If running a local model works, this feature is a no-brainer. So amazing.

From Discord: https://discord.com/channels/897438559458430986/1298609166385942578/1332141542764445857

(image attachment)
