A performance dashboard for monitoring the sustainability of a website, plus a site crawler script to fetch the data.
Includes support for dark mode and comparison of reports.
View a demo of the live site here.
Demo screenshots
Light mode
Dark mode with comparison
A summary of network requests sent by the least performant pages
A summary of requests sent by iframes loaded within the page
This is an informal project with several outstanding bugs and issues, but it is generally useful for getting an at-a-glance sense of how optimised a website is and what might be holding it back.
This is a Next.js project bootstrapped with `create-next-app`.
Modify the `startUrls` array at the top of `site-crawler/main.js` with the root URL of the website you'd like to crawl.
Optional: set `maxRequestsPerCrawl` to a higher number to crawl more pages of the site automatically.
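For example, the top of `site-crawler/main.js` might be configured like this. This is a sketch: `https://www.example.com/` is a placeholder URL, and the exact shape of the file may differ from what is shown here.

```javascript
// Root URL(s) to crawl -- replace the placeholder with your own site.
const startUrls = ['https://www.example.com/'];

// Optional: raise this limit to let the crawler visit more pages automatically.
const maxRequestsPerCrawl = 100;
```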
To run the crawler, run the following commands:

```sh
cd site-crawler
fnm use # (or nvm use - make sure you're using the Node version in .nvmrc)
npm install
npm run crawl
```
When finished, the main website will have been analysed, but not any embedded iframes. To analyse the iframes, run the following command:

```sh
npm run iframes
```
To process the results of the crawler, run the following command:

```sh
npm run export
```
To view the crawler results, run the development server. `cd` into the root directory, then:

```sh
fnm use # (or nvm use - make sure you're using the Node version in .nvmrc)
npm install
npm run dev
# or
yarn dev
# or
pnpm dev
```
Open http://localhost:3000 with your browser to see the result.