
soc/ethernet: enable full_memory_we by default for Quartus toolchain #2140

Merged

Conversation

piotro888
Contributor

There is a reported issue with the Quartus toolchain, which fails to infer memories that have multiple write-enable signals.
This is a problem for the liteeth module with large buffers (enjoy-digital/liteeth#63); it was fixed, then changed to a configurable full_memory_we option in enjoy-digital/liteeth#72 because of a regression on other FPGAs.

However, this option/problem is not documented, the option is not used in the existing codebase, and it leads to unexpected, difficult-to-trace problems with high device utilization.
Because of that, I think it would be a good idea to enable this option by default when the Quartus toolchain is detected.

The second place where the full_memory_we option is defined is the L2 cache, where it is enabled by default for all targets.
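
For reference, full_memory_we maps to Migen's FullMemoryWE module transformer, which splits a memory with partial (e.g. per-byte) write enables into several narrower memories that each have a single full-width write enable. A minimal sketch of the pattern (module and signal names are illustrative, not taken from liteeth):

```python
from migen import Module, Memory
from migen.fhdl.simplify import FullMemoryWE

class BufferWithByteWE(Module):
    # A memory with a per-byte write-enable port, similar to the large
    # liteeth buffers. Quartus may fail to infer block RAM for this and
    # fall back to registers, causing high device utilization.
    def __init__(self, width=32, depth=512):
        self.mem  = Memory(width, depth)
        self.port = self.mem.get_port(write_capable=True, we_granularity=8)
        self.specials += self.mem, self.port

# FullMemoryWE rewrites the design so each byte lane becomes its own
# memory with a single full write enable, which Quartus can infer.
buf = FullMemoryWE()(BufferWithByteWE())
```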

@piotro888
Contributor Author

Hello @enjoy-digital, what's your opinion on this change? Sorry to ping you.

@enjoy-digital
Owner

Thanks @piotro888, and sorry for the delay. Otherwise, maybe we could just use the same approach as on the Efinix design and always transform the memories: https://github.com/enjoy-digital/litex/blob/master/litex/build/efinix/efinity.py#L73-L74, since it always seems to be required on Altera FPGAs?
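
For readers following along: the Efinix backend applies the transform unconditionally at build time rather than exposing it as a per-core option. A rough sketch of that approach (the function name below is illustrative, not the actual LiteX toolchain API; the real hook lives in the build backend code):

```python
from migen.fhdl.simplify import FullMemoryWE

def finalize_design_for_quartus(module):
    # Sketch only: always split memories with multiple write-enable
    # signals before synthesis, as the Efinix backend does, instead of
    # relying on each core's full_memory_we parameter.
    return FullMemoryWE()(module)
```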

@piotro888 piotro888 force-pushed the ethernet-full-mem-we-quartus branch from 50e5df4 to 118bd62 on January 6, 2025 20:15
@piotro888
Contributor Author

Yes, that makes more sense. Updated and tested. Thanks!

@enjoy-digital enjoy-digital merged commit c55086f into enjoy-digital:master Jan 6, 2025
1 check passed
@enjoy-digital
Owner

Great, thanks. This is merged.
