
Page de statistiques (Statistics page)

2025-11-28 01:47:10 +01:00
parent 38926267a3
commit fd27dc7fb6
47 changed files with 3278 additions and 86 deletions

.gitignore (vendored): 6 changed lines

@@ -4,3 +4,9 @@ resources/
public/
.DS_Store
tools/cache/
static/_generated
data/stats.json
content/stats/data/stats.json
content/stats/images
.venv
__pycache__


@@ -15,6 +15,10 @@ main:
title: "Classification de mes articles" title: "Classification de mes articles"
pageRef: /taxonomies/ pageRef: /taxonomies/
parent: Accueil parent: Accueil
- name: Statistiques
title: "Statistiques de publication"
pageRef: /stats/
parent: Accueil
- name: Manifeste - name: Manifeste
title: Un texte dintention sur la démarche du site title: Un texte dintention sur la démarche du site
pageRef: /manifeste/ pageRef: /manifeste/


@@ -159,17 +159,13 @@ Ils traduisent une volonté : reprendre le contrôle, respecter le lecteur, refu
I did not design this site to be modern.
I designed it to be readable, modest, and faithful to what I write on it.
### The refusal of metrics
I do not measure my visits.
I do not track my readers.
I do not look at how many people have clicked, shared, skimmed, or hovered over my pages.
Not out of negligence, but on principle.
I do not want to write while looking over my shoulder.
I do not want to calibrate my content to meet an expectation I have never agreed to.
I do not want to turn into an analyst of myself, nor become the manager of an editorial product.
### The relationship to numbers
I use neither cookies, nor audience-measurement scripts, nor third-party tracking services.
The only "tracking" on this site is what my server's logs naturally produce.
From time to time I extract a few aggregate figures from them (unique visitors, page views over a given period) to check that the site is still alive, and out of simple curiosity, the way one would glance at a weather report.
These figures remain global and anonymous, and cannot be used to follow any particular person.
This site does not seek to grow.
It seeks to exist.


@@ -0,0 +1,4 @@
#title: ""
#attribution: ""
description: "Nombre d'articles publiés par mois"
#prompt: ""


@@ -0,0 +1,4 @@
#title: ""
#attribution: ""
description: "Nombre d'articles publiés par section"
#prompt: ""


@@ -0,0 +1,4 @@
#title: ""
#attribution: ""
description: "Nombre d'articles publiés par an"
#prompt: ""


@@ -0,0 +1,4 @@
#title: ""
#attribution: ""
description: "Cumul de mots et d'articles"
#prompt: ""


@@ -0,0 +1,4 @@
#title: ""
#attribution: ""
description: "Top 10 des pages les plus vues"
#prompt: ""


@@ -0,0 +1,4 @@
#title: ""
#attribution: ""
description: "Rapport entre température, humidité et fréquence de publication"
#prompt: ""


@@ -0,0 +1,4 @@
#title: ""
#attribution: ""
description: "Activité par jour de la semaine"
#prompt: ""


@@ -0,0 +1,4 @@
#title: ""
#attribution: ""
description: "Histogramme par mots"
#prompt: ""


@@ -0,0 +1,4 @@
#title: ""
#attribution: ""
description: "Nombre de mots par article (moyenne mensuelle)"
#prompt: ""

content/stats/index.md (new file, 62 lines)

@@ -0,0 +1,62 @@
---
title: Statistiques
---
> Statistics generated on {{< stats-var key="generated_at" >}}
## Visits
<div class="stats">
<div class="panel stat">
<strong>Page views</strong>
<span>{{< stats-var key="pageviews_per_month" >}}</span>
</div>
<div class="panel stat">
<strong>Unique visitors</strong>
<span>{{< stats-var key="unique_visitors_per_month_value" >}}</span>
</div>
</div>
These statistics are extracted from the logs of [my web server](https://caddyserver.com) _only_ and fed to [GoAccess](https://goaccess.io/), which anonymizes them (`--anonymize-ip` option).
They are built from global aggregates **over a rolling 30-day window**.
I am the only person who can access these logs.
**There is no processing by third parties**.
My server's log retention policy is _seven days_, for technical purposes.
GoAccess keeps _two months_ of data, as an aggregate audience measure (`--keep-last=60` option).
**No profiling or individual tracking is performed**.
Bots known to GoAccess are ignored (`--ignore-crawlers` option).
![](images/top_requests.png)
Most of the traffic to my site therefore comes through the RSS feed.
**Thank you!**
## Writing habits
<div class="stats">
<div class="panel stat">
<strong>Record</strong>
<span>{{< stats-var key="most_prolific_month" >}}</span>
</div>
<div class="panel stat">
<strong>Articles per month</strong>
<span>{{< stats-var key="articles_avg_per_month" >}}</span>
</div>
</div>
![](images/articles_per_year.png)
![](images/articles_per_month.png)
![](images/articles_per_section.png)
![](images/weekday_activity.png)
![](images/words_per_article.png)
![](images/cumulative_articles.png)
![](images/words_histogram.png)
![](images/weather_hexbin.png)


@@ -19,6 +19,9 @@ node "$SCRIPT_DIR/tools/check_internal_links.js"
echo "==> Enrichissement météo des articles" echo "==> Enrichissement météo des articles"
node "$SCRIPT_DIR/tools/add_weather.js" node "$SCRIPT_DIR/tools/add_weather.js"
echo "==> Génération des statistiques"
npm run stats:generate
# echo "==> Application des taxonomies et mots-clés" # echo "==> Application des taxonomies et mots-clés"
# node "$SCRIPT_DIR/tools/link_taxonomy_terms.js" # node "$SCRIPT_DIR/tools/link_taxonomy_terms.js"


@@ -0,0 +1,28 @@
{{- $key := .Get "key" | default (.Get 0) -}}
{{- if not $key -}}
{{- warnf "stats-var: key manquante" -}}
{{- else -}}
{{- $resource := .Page.Resources.GetMatch "data/stats.json" -}}
{{- if not $resource -}}
{{- warnf "stats-var: data/stats.json introuvable pour %s" .Page.File.Path -}}
{{- else -}}
{{- $data := $resource | transform.Unmarshal -}}
{{- $value := "" -}}
{{- if eq $key "generated_at" -}}
{{- with $data.generated_at -}}
{{- $value = time . | time.Format "02/01/2006 à 15:04" -}}
{{- end -}}
{{- else -}}
{{- range $section := $data.sections -}}
{{- range $stat := (default (slice) $section.statistics) -}}
{{- if eq $stat.key $key -}}
{{- $value = (default "" $stat.value) -}}
{{- end -}}
{{- end -}}
{{- end -}}
{{- end -}}
{{- $value -}}
{{- end -}}
{{- end -}}
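For reference, here is a minimal sketch of the `data/stats.json` page resource this shortcode reads. The structure (top-level `generated_at` plus `sections[].statistics[]` entries with `key`, `title`, `type`, `value`) follows what `tools/generate_stats.js` below writes, and the keys are the ones used in `content/stats/index.md`; the values are illustrative only.

```json
{
  "generated_at": "2025-11-27T23:12:00.000Z",
  "sections": [
    {
      "title": "Visites",
      "statistics": [
        { "key": "pageviews_per_month", "title": "Pages vues", "type": "variable", "value": 1234 },
        { "key": "articles_avg_per_month", "title": "Articles par mois", "type": "variable", "value": "3,2" }
      ]
    }
  ]
}
```

A call such as `{{< stats-var key="articles_avg_per_month" >}}` then emits the stored value, while `{{< stats-var key="generated_at" >}}` is reformatted as a `dd/mm/yyyy à hh:mm` date.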

package-lock.json (generated, 467 changed lines)

@@ -6,6 +6,9 @@
"": { "": {
"dependencies": { "dependencies": {
"@influxdata/influxdb-client": "^1.35.0", "@influxdata/influxdb-client": "^1.35.0",
"@napi-rs/canvas": "^0.1.59",
"chart.js": "^4.4.4",
"chartjs-node-canvas": "^5.0.0",
"luxon": "^3.7.2", "luxon": "^3.7.2",
"postcss-import": "^16.1.0", "postcss-import": "^16.1.0",
"postcss-nested": "^7.0.2", "postcss-nested": "^7.0.2",
@@ -430,6 +433,12 @@
"integrity": "sha512-woWMi8PDpPQpvTsRaUw4Ig+nOGS/CWwAwS66Fa1Vr/EkW+NEwxI8YfPBsdBMn33jK2Y86/qMiiuX/ROHIkJLTw==", "integrity": "sha512-woWMi8PDpPQpvTsRaUw4Ig+nOGS/CWwAwS66Fa1Vr/EkW+NEwxI8YfPBsdBMn33jK2Y86/qMiiuX/ROHIkJLTw==",
"license": "MIT" "license": "MIT"
}, },
"node_modules/@kurkle/color": {
"version": "0.3.4",
"resolved": "https://registry.npmjs.org/@kurkle/color/-/color-0.3.4.tgz",
"integrity": "sha512-M5UknZPHRu3DEDWoipU6sE8PdkZ6Z/S+v4dD+Ke8IaNlpdSQah50lz1KtcFBa2vsdOnwbbnxJwVM4wty6udA5w==",
"license": "MIT"
},
"node_modules/@mermaid-js/mermaid-cli": { "node_modules/@mermaid-js/mermaid-cli": {
"version": "10.9.1", "version": "10.9.1",
"resolved": "https://registry.npmjs.org/@mermaid-js/mermaid-cli/-/mermaid-cli-10.9.1.tgz", "resolved": "https://registry.npmjs.org/@mermaid-js/mermaid-cli/-/mermaid-cli-10.9.1.tgz",
@@ -697,6 +706,190 @@
"node": ">=12" "node": ">=12"
} }
}, },
"node_modules/@napi-rs/canvas": {
"version": "0.1.83",
"resolved": "https://registry.npmjs.org/@napi-rs/canvas/-/canvas-0.1.83.tgz",
"integrity": "sha512-f9GVB9VNc9vn/nroc9epXRNkVpvNPZh69+qzLJIm9DfruxFqX0/jsXG46OGWAJgkO4mN0HvFHjRROMXKVmPszg==",
"license": "MIT",
"workspaces": [
"e2e/*"
],
"engines": {
"node": ">= 10"
},
"optionalDependencies": {
"@napi-rs/canvas-android-arm64": "0.1.83",
"@napi-rs/canvas-darwin-arm64": "0.1.83",
"@napi-rs/canvas-darwin-x64": "0.1.83",
"@napi-rs/canvas-linux-arm-gnueabihf": "0.1.83",
"@napi-rs/canvas-linux-arm64-gnu": "0.1.83",
"@napi-rs/canvas-linux-arm64-musl": "0.1.83",
"@napi-rs/canvas-linux-riscv64-gnu": "0.1.83",
"@napi-rs/canvas-linux-x64-gnu": "0.1.83",
"@napi-rs/canvas-linux-x64-musl": "0.1.83",
"@napi-rs/canvas-win32-x64-msvc": "0.1.83"
}
},
"node_modules/@napi-rs/canvas-android-arm64": {
"version": "0.1.83",
"resolved": "https://registry.npmjs.org/@napi-rs/canvas-android-arm64/-/canvas-android-arm64-0.1.83.tgz",
"integrity": "sha512-TbKM2fh9zXjqFIU8bgMfzG7rkrIYdLKMafgPhFoPwKrpWk1glGbWP7LEu8Y/WrMDqTGFdRqUmuX89yQEzZbkiw==",
"cpu": [
"arm64"
],
"license": "MIT",
"optional": true,
"os": [
"android"
],
"engines": {
"node": ">= 10"
}
},
"node_modules/@napi-rs/canvas-darwin-arm64": {
"version": "0.1.83",
"resolved": "https://registry.npmjs.org/@napi-rs/canvas-darwin-arm64/-/canvas-darwin-arm64-0.1.83.tgz",
"integrity": "sha512-gp8IDVUloPUmkepHly4xRUOfUJSFNvA4jR7ZRF5nk3YcGzegSFGeICiT4PnYyPgSKEhYAFe1Y2XNy0Mp6Tu8mQ==",
"cpu": [
"arm64"
],
"license": "MIT",
"optional": true,
"os": [
"darwin"
],
"engines": {
"node": ">= 10"
}
},
"node_modules/@napi-rs/canvas-darwin-x64": {
"version": "0.1.83",
"resolved": "https://registry.npmjs.org/@napi-rs/canvas-darwin-x64/-/canvas-darwin-x64-0.1.83.tgz",
"integrity": "sha512-r4ZJxiP9OgUbdGZhPDEXD3hQ0aIPcVaywtcTXvamYxTU/SWKAbKVhFNTtpRe1J30oQ25gWyxTkUKSBgUkNzdnw==",
"cpu": [
"x64"
],
"license": "MIT",
"optional": true,
"os": [
"darwin"
],
"engines": {
"node": ">= 10"
}
},
"node_modules/@napi-rs/canvas-linux-arm-gnueabihf": {
"version": "0.1.83",
"resolved": "https://registry.npmjs.org/@napi-rs/canvas-linux-arm-gnueabihf/-/canvas-linux-arm-gnueabihf-0.1.83.tgz",
"integrity": "sha512-Uc6aSB05qH1r+9GUDxIE6F5ZF7L0nTFyyzq8ublWUZhw8fEGK8iy931ff1ByGFT04+xHJad1kBcL4R1ZEV8z7Q==",
"cpu": [
"arm"
],
"license": "MIT",
"optional": true,
"os": [
"linux"
],
"engines": {
"node": ">= 10"
}
},
"node_modules/@napi-rs/canvas-linux-arm64-gnu": {
"version": "0.1.83",
"resolved": "https://registry.npmjs.org/@napi-rs/canvas-linux-arm64-gnu/-/canvas-linux-arm64-gnu-0.1.83.tgz",
"integrity": "sha512-eEeaJA7V5KOFq7W0GtoRVbd3ak8UZpK+XLkCgUiFGtlunNw+ZZW9Cr/92MXflGe7o3SqqMUg+f975LPxO/vsOQ==",
"cpu": [
"arm64"
],
"license": "MIT",
"optional": true,
"os": [
"linux"
],
"engines": {
"node": ">= 10"
}
},
"node_modules/@napi-rs/canvas-linux-arm64-musl": {
"version": "0.1.83",
"resolved": "https://registry.npmjs.org/@napi-rs/canvas-linux-arm64-musl/-/canvas-linux-arm64-musl-0.1.83.tgz",
"integrity": "sha512-cAvonp5XpbatVGegF9lMQNchs3z5RH6EtamRVnQvtoRtwbzOMcdzwuLBqDBQxQF79MFbuZNkWj3YRJjZCjHVzw==",
"cpu": [
"arm64"
],
"license": "MIT",
"optional": true,
"os": [
"linux"
],
"engines": {
"node": ">= 10"
}
},
"node_modules/@napi-rs/canvas-linux-riscv64-gnu": {
"version": "0.1.83",
"resolved": "https://registry.npmjs.org/@napi-rs/canvas-linux-riscv64-gnu/-/canvas-linux-riscv64-gnu-0.1.83.tgz",
"integrity": "sha512-WFUPQ9qZy31vmLxIJ3MfmHw+R2g/mLCgk8zmh7maJW8snV3vLPA7pZfIS65Dc61EVDp1vaBskwQ2RqPPzwkaew==",
"cpu": [
"riscv64"
],
"license": "MIT",
"optional": true,
"os": [
"linux"
],
"engines": {
"node": ">= 10"
}
},
"node_modules/@napi-rs/canvas-linux-x64-gnu": {
"version": "0.1.83",
"resolved": "https://registry.npmjs.org/@napi-rs/canvas-linux-x64-gnu/-/canvas-linux-x64-gnu-0.1.83.tgz",
"integrity": "sha512-X9YwIjsuy50WwOyYeNhEHjKHO8rrfH9M4U8vNqLuGmqsZdKua/GrUhdQGdjq7lTgdY3g4+Ta5jF8MzAa7UAs/g==",
"cpu": [
"x64"
],
"license": "MIT",
"optional": true,
"os": [
"linux"
],
"engines": {
"node": ">= 10"
}
},
"node_modules/@napi-rs/canvas-linux-x64-musl": {
"version": "0.1.83",
"resolved": "https://registry.npmjs.org/@napi-rs/canvas-linux-x64-musl/-/canvas-linux-x64-musl-0.1.83.tgz",
"integrity": "sha512-Vv2pLWQS8EnlSM1bstJ7vVhKA+mL4+my4sKUIn/bgIxB5O90dqiDhQjUDLP+5xn9ZMestRWDt3tdQEkGAmzq/A==",
"cpu": [
"x64"
],
"license": "MIT",
"optional": true,
"os": [
"linux"
],
"engines": {
"node": ">= 10"
}
},
"node_modules/@napi-rs/canvas-win32-x64-msvc": {
"version": "0.1.83",
"resolved": "https://registry.npmjs.org/@napi-rs/canvas-win32-x64-msvc/-/canvas-win32-x64-msvc-0.1.83.tgz",
"integrity": "sha512-K1TtjbScfRNYhq8dengLLufXGbtEtWdUXPV505uLFPovyGHzDUGXLFP/zUJzj6xWXwgUjHNLgEPIt7mye0zr6Q==",
"cpu": [
"x64"
],
"license": "MIT",
"optional": true,
"os": [
"win32"
],
"engines": {
"node": ">= 10"
}
},
"node_modules/@puppeteer/browsers": { "node_modules/@puppeteer/browsers": {
"version": "2.8.0", "version": "2.8.0",
"resolved": "https://registry.npmjs.org/@puppeteer/browsers/-/browsers-2.8.0.tgz", "resolved": "https://registry.npmjs.org/@puppeteer/browsers/-/browsers-2.8.0.tgz",
@@ -992,7 +1185,6 @@
"version": "1.5.1", "version": "1.5.1",
"resolved": "https://registry.npmjs.org/base64-js/-/base64-js-1.5.1.tgz", "resolved": "https://registry.npmjs.org/base64-js/-/base64-js-1.5.1.tgz",
"integrity": "sha512-AKpaYlHn8t4SVbOHCy+b5+KKgvR4vrsD8vbvrbiQJps7fKDTkjkDry6ji0rUJjC0kzbNePLwzxq8iypo41qeWA==", "integrity": "sha512-AKpaYlHn8t4SVbOHCy+b5+KKgvR4vrsD8vbvrbiQJps7fKDTkjkDry6ji0rUJjC0kzbNePLwzxq8iypo41qeWA==",
"dev": true,
"funding": [ "funding": [
{ {
"type": "github", "type": "github",
@@ -1035,7 +1227,6 @@
"version": "4.1.0", "version": "4.1.0",
"resolved": "https://registry.npmjs.org/bl/-/bl-4.1.0.tgz", "resolved": "https://registry.npmjs.org/bl/-/bl-4.1.0.tgz",
"integrity": "sha512-1W07cM9gS6DcLperZfFSj+bWLtaPGSOHWhPiGzXmvVJbRLdG82sH/Kn8EtW1VqWVA54AKf2h5k5BbnIbwF3h6w==", "integrity": "sha512-1W07cM9gS6DcLperZfFSj+bWLtaPGSOHWhPiGzXmvVJbRLdG82sH/Kn8EtW1VqWVA54AKf2h5k5BbnIbwF3h6w==",
"dev": true,
"license": "MIT", "license": "MIT",
"dependencies": { "dependencies": {
"buffer": "^5.5.0", "buffer": "^5.5.0",
@@ -1103,7 +1294,6 @@
"version": "5.7.1", "version": "5.7.1",
"resolved": "https://registry.npmjs.org/buffer/-/buffer-5.7.1.tgz", "resolved": "https://registry.npmjs.org/buffer/-/buffer-5.7.1.tgz",
"integrity": "sha512-EHcyIPBQ4BSGlvjB16k5KgAJ27CIsHY/2JBmCRReo48y9rQ3MaUzWX3KVlBa4U7MyX02HdVj0K7C3WaB3ju7FQ==", "integrity": "sha512-EHcyIPBQ4BSGlvjB16k5KgAJ27CIsHY/2JBmCRReo48y9rQ3MaUzWX3KVlBa4U7MyX02HdVj0K7C3WaB3ju7FQ==",
"dev": true,
"funding": [ "funding": [
{ {
"type": "github", "type": "github",
@@ -1163,6 +1353,20 @@
], ],
"license": "CC-BY-4.0" "license": "CC-BY-4.0"
}, },
"node_modules/canvas": {
"version": "3.2.0",
"resolved": "https://registry.npmjs.org/canvas/-/canvas-3.2.0.tgz",
"integrity": "sha512-jk0GxrLtUEmW/TmFsk2WghvgHe8B0pxGilqCL21y8lHkPUGa6FTsnCNtHPOzT8O3y+N+m3espawV80bbBlgfTA==",
"hasInstallScript": true,
"license": "MIT",
"dependencies": {
"node-addon-api": "^7.0.0",
"prebuild-install": "^7.1.3"
},
"engines": {
"node": "^18.12.0 || >= 20.9.0"
}
},
"node_modules/chalk": { "node_modules/chalk": {
"version": "5.6.2", "version": "5.6.2",
"resolved": "https://registry.npmjs.org/chalk/-/chalk-5.6.2.tgz", "resolved": "https://registry.npmjs.org/chalk/-/chalk-5.6.2.tgz",
@@ -1187,6 +1391,31 @@
"url": "https://github.com/sponsors/wooorm" "url": "https://github.com/sponsors/wooorm"
} }
}, },
"node_modules/chart.js": {
"version": "4.5.1",
"resolved": "https://registry.npmjs.org/chart.js/-/chart.js-4.5.1.tgz",
"integrity": "sha512-GIjfiT9dbmHRiYi6Nl2yFCq7kkwdkp1W/lp2J99rX0yo9tgJGn3lKQATztIjb5tVtevcBtIdICNWqlq5+E8/Pw==",
"license": "MIT",
"dependencies": {
"@kurkle/color": "^0.3.0"
},
"engines": {
"pnpm": ">=8"
}
},
"node_modules/chartjs-node-canvas": {
"version": "5.0.0",
"resolved": "https://registry.npmjs.org/chartjs-node-canvas/-/chartjs-node-canvas-5.0.0.tgz",
"integrity": "sha512-+Lc5phRWjb+UxAIiQpKgvOaG6Mw276YQx2jl2BrxoUtI3A4RYTZuGM5Dq+s4ReYmCY42WEPSR6viF3lDSTxpvw==",
"license": "MIT",
"dependencies": {
"canvas": "^3.1.0",
"tslib": "^2.8.1"
},
"peerDependencies": {
"chart.js": "^4.4.8"
}
},
"node_modules/chokidar": { "node_modules/chokidar": {
"version": "3.6.0", "version": "3.6.0",
"resolved": "https://registry.npmjs.org/chokidar/-/chokidar-3.6.0.tgz", "resolved": "https://registry.npmjs.org/chokidar/-/chokidar-3.6.0.tgz",
@@ -1216,7 +1445,6 @@
"version": "1.1.4", "version": "1.1.4",
"resolved": "https://registry.npmjs.org/chownr/-/chownr-1.1.4.tgz", "resolved": "https://registry.npmjs.org/chownr/-/chownr-1.1.4.tgz",
"integrity": "sha512-jJ0bqzaylmJtVnNgzTeSOs8DPavpbYgEr/b0YL8/2GO3xJEhInFmhKMUnEJQjZumK7KXGFhUy89PrsJWlakBVg==", "integrity": "sha512-jJ0bqzaylmJtVnNgzTeSOs8DPavpbYgEr/b0YL8/2GO3xJEhInFmhKMUnEJQjZumK7KXGFhUy89PrsJWlakBVg==",
"dev": true,
"license": "ISC" "license": "ISC"
}, },
"node_modules/chromium-bidi": { "node_modules/chromium-bidi": {
@@ -1945,6 +2173,30 @@
"url": "https://github.com/sponsors/wooorm" "url": "https://github.com/sponsors/wooorm"
} }
}, },
"node_modules/decompress-response": {
"version": "6.0.0",
"resolved": "https://registry.npmjs.org/decompress-response/-/decompress-response-6.0.0.tgz",
"integrity": "sha512-aW35yZM6Bb/4oJlZncMH2LCoZtJXTRxES17vE3hoRiowU2kWHaJKFkSBDnDR+cm9J+9QhXmREyIfv0pji9ejCQ==",
"license": "MIT",
"dependencies": {
"mimic-response": "^3.1.0"
},
"engines": {
"node": ">=10"
},
"funding": {
"url": "https://github.com/sponsors/sindresorhus"
}
},
"node_modules/deep-extend": {
"version": "0.6.0",
"resolved": "https://registry.npmjs.org/deep-extend/-/deep-extend-0.6.0.tgz",
"integrity": "sha512-LOHxIOaPYdHlJRtCQfDIVZtfw/ufM8+rVj649RIHzcm/vGwQRXFt6OPqIFWsm2XEMrNIEtWR64sY1LEKD2vAOA==",
"license": "MIT",
"engines": {
"node": ">=4.0.0"
}
},
"node_modules/deepmerge": { "node_modules/deepmerge": {
"version": "4.3.1", "version": "4.3.1",
"resolved": "https://registry.npmjs.org/deepmerge/-/deepmerge-4.3.1.tgz", "resolved": "https://registry.npmjs.org/deepmerge/-/deepmerge-4.3.1.tgz",
@@ -2138,6 +2390,15 @@
"node": ">=0.10.0" "node": ">=0.10.0"
} }
}, },
"node_modules/expand-template": {
"version": "2.0.3",
"resolved": "https://registry.npmjs.org/expand-template/-/expand-template-2.0.3.tgz",
"integrity": "sha512-XYfuKMvj4O35f/pOXLObndIRvyQ+/+6AhODh+OKWj9S9498pHHn/IMszH+gt0fBCRWMNfk1ZSp5x3AifmnI2vg==",
"license": "(MIT OR WTFPL)",
"engines": {
"node": ">=6"
}
},
"node_modules/extract-zip": { "node_modules/extract-zip": {
"version": "2.0.1", "version": "2.0.1",
"resolved": "https://registry.npmjs.org/extract-zip/-/extract-zip-2.0.1.tgz", "resolved": "https://registry.npmjs.org/extract-zip/-/extract-zip-2.0.1.tgz",
@@ -2225,7 +2486,6 @@
"version": "1.0.0", "version": "1.0.0",
"resolved": "https://registry.npmjs.org/fs-constants/-/fs-constants-1.0.0.tgz", "resolved": "https://registry.npmjs.org/fs-constants/-/fs-constants-1.0.0.tgz",
"integrity": "sha512-y6OAwoSIf7FyjMIv94u+b5rdheZEjzR63GTyZJm5qh4Bi+2YgwLCcI/fPFZkL5PSixOt6ZNKm+w+Hfp/Bciwow==", "integrity": "sha512-y6OAwoSIf7FyjMIv94u+b5rdheZEjzR63GTyZJm5qh4Bi+2YgwLCcI/fPFZkL5PSixOt6ZNKm+w+Hfp/Bciwow==",
"dev": true,
"license": "MIT" "license": "MIT"
}, },
"node_modules/fs-extra": { "node_modules/fs-extra": {
@@ -2310,6 +2570,12 @@
"node": ">= 14" "node": ">= 14"
} }
}, },
"node_modules/github-from-package": {
"version": "0.0.0",
"resolved": "https://registry.npmjs.org/github-from-package/-/github-from-package-0.0.0.tgz",
"integrity": "sha512-SyHy3T1v2NUXn29OsWdxmK6RwHD+vkj3v8en8AOBZ1wBQ/hCAQ5bAQTD02kW4W9tUp/3Qh6J8r9EvntiyCmOOw==",
"license": "MIT"
},
"node_modules/glob": { "node_modules/glob": {
"version": "7.2.3", "version": "7.2.3",
"resolved": "https://registry.npmjs.org/glob/-/glob-7.2.3.tgz", "resolved": "https://registry.npmjs.org/glob/-/glob-7.2.3.tgz",
@@ -2405,7 +2671,6 @@
"version": "1.2.1", "version": "1.2.1",
"resolved": "https://registry.npmjs.org/ieee754/-/ieee754-1.2.1.tgz", "resolved": "https://registry.npmjs.org/ieee754/-/ieee754-1.2.1.tgz",
"integrity": "sha512-dcyqhDvX1C46lXZcVqCpK+FtMRQVdIMN6/Df5js2zouUsqG7I6sFxitIC+7KYK29KdXOLHdu9zL4sFnoVQnqaA==", "integrity": "sha512-dcyqhDvX1C46lXZcVqCpK+FtMRQVdIMN6/Df5js2zouUsqG7I6sFxitIC+7KYK29KdXOLHdu9zL4sFnoVQnqaA==",
"dev": true,
"funding": [ "funding": [
{ {
"type": "github", "type": "github",
@@ -2455,6 +2720,12 @@
"integrity": "sha512-k/vGaX4/Yla3WzyMCvTQOXYeIHvqOKtnqBduzTHpzpQZzAskKMhZ2K+EnBiSM9zGSoIFeMpXKxa4dYeZIQqewQ==", "integrity": "sha512-k/vGaX4/Yla3WzyMCvTQOXYeIHvqOKtnqBduzTHpzpQZzAskKMhZ2K+EnBiSM9zGSoIFeMpXKxa4dYeZIQqewQ==",
"license": "ISC" "license": "ISC"
}, },
"node_modules/ini": {
"version": "1.3.8",
"resolved": "https://registry.npmjs.org/ini/-/ini-1.3.8.tgz",
"integrity": "sha512-JV/yugV2uzW5iMRSiZAyDtQd+nxtUnjeLt0acNdw98kKLrvuRVyB80tsREOE7yvGVgalhZ6RNXCmEHkUKBKxew==",
"license": "ISC"
},
"node_modules/internmap": { "node_modules/internmap": {
"version": "2.0.3", "version": "2.0.3",
"resolved": "https://registry.npmjs.org/internmap/-/internmap-2.0.3.tgz", "resolved": "https://registry.npmjs.org/internmap/-/internmap-2.0.3.tgz",
@@ -3298,6 +3569,18 @@
], ],
"license": "MIT" "license": "MIT"
}, },
"node_modules/mimic-response": {
"version": "3.1.0",
"resolved": "https://registry.npmjs.org/mimic-response/-/mimic-response-3.1.0.tgz",
"integrity": "sha512-z0yWI+4FDrrweS8Zmt4Ej5HdJmky15+L2e6Wgn3+iK5fWzb6T3fhNFq2+MeTRb064c6Wr4N/wv0DzQTjNzHNGQ==",
"license": "MIT",
"engines": {
"node": ">=10"
},
"funding": {
"url": "https://github.com/sponsors/sindresorhus"
}
},
"node_modules/minimatch": { "node_modules/minimatch": {
"version": "3.1.2", "version": "3.1.2",
"resolved": "https://registry.npmjs.org/minimatch/-/minimatch-3.1.2.tgz", "resolved": "https://registry.npmjs.org/minimatch/-/minimatch-3.1.2.tgz",
@@ -3310,6 +3593,15 @@
"node": "*" "node": "*"
} }
}, },
"node_modules/minimist": {
"version": "1.2.8",
"resolved": "https://registry.npmjs.org/minimist/-/minimist-1.2.8.tgz",
"integrity": "sha512-2yyAR8qBkN3YuheJanUpWC5U3bb5osDywNB8RzDVlDwDHbocAJveqqj1u8+SVD7jkWT4yvsHCpWqqWqAxb0zCA==",
"license": "MIT",
"funding": {
"url": "https://github.com/sponsors/ljharb"
}
},
"node_modules/mitt": { "node_modules/mitt": {
"version": "3.0.1", "version": "3.0.1",
"resolved": "https://registry.npmjs.org/mitt/-/mitt-3.0.1.tgz", "resolved": "https://registry.npmjs.org/mitt/-/mitt-3.0.1.tgz",
@@ -3342,7 +3634,6 @@
"version": "0.5.3", "version": "0.5.3",
"resolved": "https://registry.npmjs.org/mkdirp-classic/-/mkdirp-classic-0.5.3.tgz", "resolved": "https://registry.npmjs.org/mkdirp-classic/-/mkdirp-classic-0.5.3.tgz",
"integrity": "sha512-gKLcREMhtuZRwRAfqP3RFW+TK4JqApVBtOIftVgjuABpAtpxhPGaDcfvbhNvD0B8iD1oUr/txX35NjcaY6Ns/A==", "integrity": "sha512-gKLcREMhtuZRwRAfqP3RFW+TK4JqApVBtOIftVgjuABpAtpxhPGaDcfvbhNvD0B8iD1oUr/txX35NjcaY6Ns/A==",
"dev": true,
"license": "MIT" "license": "MIT"
}, },
"node_modules/mri": { "node_modules/mri": {
@@ -3379,6 +3670,12 @@
"node": "^10 || ^12 || ^13.7 || ^14 || >=15.0.1" "node": "^10 || ^12 || ^13.7 || ^14 || >=15.0.1"
} }
}, },
"node_modules/napi-build-utils": {
"version": "2.0.0",
"resolved": "https://registry.npmjs.org/napi-build-utils/-/napi-build-utils-2.0.0.tgz",
"integrity": "sha512-GEbrYkbfF7MoNaoh2iGG84Mnf/WZfB0GdGEsM8wz7Expx/LlWf5U8t9nvJKXSp3qr5IsEbK04cBGhol/KwOsWA==",
"license": "MIT"
},
"node_modules/netmask": { "node_modules/netmask": {
"version": "2.0.2", "version": "2.0.2",
"resolved": "https://registry.npmjs.org/netmask/-/netmask-2.0.2.tgz", "resolved": "https://registry.npmjs.org/netmask/-/netmask-2.0.2.tgz",
@@ -3388,6 +3685,24 @@
"node": ">= 0.4.0" "node": ">= 0.4.0"
} }
}, },
"node_modules/node-abi": {
"version": "3.85.0",
"resolved": "https://registry.npmjs.org/node-abi/-/node-abi-3.85.0.tgz",
"integrity": "sha512-zsFhmbkAzwhTft6nd3VxcG0cvJsT70rL+BIGHWVq5fi6MwGrHwzqKaxXE+Hl2GmnGItnDKPPkO5/LQqjVkIdFg==",
"license": "MIT",
"dependencies": {
"semver": "^7.3.5"
},
"engines": {
"node": ">=10"
}
},
"node_modules/node-addon-api": {
"version": "7.1.1",
"resolved": "https://registry.npmjs.org/node-addon-api/-/node-addon-api-7.1.1.tgz",
"integrity": "sha512-5m3bsyrjFWE1xf7nz7YXdN4udnVtXK6/Yfgn5qnahL6bCkf2yKt4k3nuTKAtT4r3IG8JNR2ncsIMdZuAzJjHQQ==",
"license": "MIT"
},
"node_modules/node-fetch": { "node_modules/node-fetch": {
"version": "2.6.7", "version": "2.6.7",
"resolved": "https://registry.npmjs.org/node-fetch/-/node-fetch-2.6.7.tgz", "resolved": "https://registry.npmjs.org/node-fetch/-/node-fetch-2.6.7.tgz",
@@ -3773,6 +4088,60 @@
"integrity": "sha512-1NNCs6uurfkVbeXG4S8JFT9t19m45ICnif8zWLd5oPSZ50QnwMfK+H3jv408d4jw/7Bttv5axS5IiHoLaVNHeQ==", "integrity": "sha512-1NNCs6uurfkVbeXG4S8JFT9t19m45ICnif8zWLd5oPSZ50QnwMfK+H3jv408d4jw/7Bttv5axS5IiHoLaVNHeQ==",
"license": "MIT" "license": "MIT"
}, },
"node_modules/prebuild-install": {
"version": "7.1.3",
"resolved": "https://registry.npmjs.org/prebuild-install/-/prebuild-install-7.1.3.tgz",
"integrity": "sha512-8Mf2cbV7x1cXPUILADGI3wuhfqWvtiLA1iclTDbFRZkgRQS0NqsPZphna9V+HyTEadheuPmjaJMsbzKQFOzLug==",
"license": "MIT",
"dependencies": {
"detect-libc": "^2.0.0",
"expand-template": "^2.0.3",
"github-from-package": "0.0.0",
"minimist": "^1.2.3",
"mkdirp-classic": "^0.5.3",
"napi-build-utils": "^2.0.0",
"node-abi": "^3.3.0",
"pump": "^3.0.0",
"rc": "^1.2.7",
"simple-get": "^4.0.0",
"tar-fs": "^2.0.0",
"tunnel-agent": "^0.6.0"
},
"bin": {
"prebuild-install": "bin.js"
},
"engines": {
"node": ">=10"
}
},
"node_modules/prebuild-install/node_modules/tar-fs": {
"version": "2.1.4",
"resolved": "https://registry.npmjs.org/tar-fs/-/tar-fs-2.1.4.tgz",
"integrity": "sha512-mDAjwmZdh7LTT6pNleZ05Yt65HC3E+NiQzl672vQG38jIrehtJk/J3mNwIg+vShQPcLF/LV7CMnDW6vjj6sfYQ==",
"license": "MIT",
"dependencies": {
"chownr": "^1.1.1",
"mkdirp-classic": "^0.5.2",
"pump": "^3.0.0",
"tar-stream": "^2.1.4"
}
},
"node_modules/prebuild-install/node_modules/tar-stream": {
"version": "2.2.0",
"resolved": "https://registry.npmjs.org/tar-stream/-/tar-stream-2.2.0.tgz",
"integrity": "sha512-ujeqbceABgwMZxEJnk2HDY2DlnUZ+9oEcb1KzTVfYHio0UE6dG71n60d8D2I4qNvleWrrXpmjpt7vZeF1LnMZQ==",
"license": "MIT",
"dependencies": {
"bl": "^4.0.3",
"end-of-stream": "^1.4.1",
"fs-constants": "^1.0.0",
"inherits": "^2.0.3",
"readable-stream": "^3.1.1"
},
"engines": {
"node": ">=6"
}
},
"node_modules/pretty-hrtime": { "node_modules/pretty-hrtime": {
"version": "1.0.3", "version": "1.0.3",
"resolved": "https://registry.npmjs.org/pretty-hrtime/-/pretty-hrtime-1.0.3.tgz", "resolved": "https://registry.npmjs.org/pretty-hrtime/-/pretty-hrtime-1.0.3.tgz",
@@ -4001,6 +4370,21 @@
} }
} }
}, },
"node_modules/rc": {
"version": "1.2.8",
"resolved": "https://registry.npmjs.org/rc/-/rc-1.2.8.tgz",
"integrity": "sha512-y3bGgqKj3QBdxLbLkomlohkvsA8gdAiUQlSBJnBhfn+BPxg4bc62d8TcBW15wavDfgexCgccckhcZvywyQYPOw==",
"license": "(BSD-2-Clause OR MIT OR Apache-2.0)",
"dependencies": {
"deep-extend": "^0.6.0",
"ini": "~1.3.0",
"minimist": "^1.2.0",
"strip-json-comments": "~2.0.1"
},
"bin": {
"rc": "cli.js"
}
},
"node_modules/read-cache": { "node_modules/read-cache": {
"version": "1.0.0", "version": "1.0.0",
"resolved": "https://registry.npmjs.org/read-cache/-/read-cache-1.0.0.tgz", "resolved": "https://registry.npmjs.org/read-cache/-/read-cache-1.0.0.tgz",
@@ -4014,7 +4398,6 @@
"version": "3.6.2", "version": "3.6.2",
"resolved": "https://registry.npmjs.org/readable-stream/-/readable-stream-3.6.2.tgz", "resolved": "https://registry.npmjs.org/readable-stream/-/readable-stream-3.6.2.tgz",
"integrity": "sha512-9u/sniCrY3D5WdsERHzHE4G2YCXqoG5FTHUiCC4SIbr6XcLZBY05ya9EKjYek9O5xOAwjGq+1JdGBAS7Q9ScoA==", "integrity": "sha512-9u/sniCrY3D5WdsERHzHE4G2YCXqoG5FTHUiCC4SIbr6XcLZBY05ya9EKjYek9O5xOAwjGq+1JdGBAS7Q9ScoA==",
"dev": true,
"license": "MIT", "license": "MIT",
"dependencies": { "dependencies": {
"inherits": "^2.0.3", "inherits": "^2.0.3",
@@ -4123,7 +4506,6 @@
"version": "5.2.1", "version": "5.2.1",
"resolved": "https://registry.npmjs.org/safe-buffer/-/safe-buffer-5.2.1.tgz", "resolved": "https://registry.npmjs.org/safe-buffer/-/safe-buffer-5.2.1.tgz",
"integrity": "sha512-rp3So07KcdmmKbGvgaNxQSJr7bGVSVk5S9Eq1F+ppbRo70+YeaDxkw5Dd8NPN+GD6bjnYm2VuPuCXmpuYvmCXQ==", "integrity": "sha512-rp3So07KcdmmKbGvgaNxQSJr7bGVSVk5S9Eq1F+ppbRo70+YeaDxkw5Dd8NPN+GD6bjnYm2VuPuCXmpuYvmCXQ==",
"dev": true,
"funding": [ "funding": [
{ {
"type": "github", "type": "github",
@@ -4234,6 +4616,51 @@
"@img/sharp-win32-x64": "0.33.5" "@img/sharp-win32-x64": "0.33.5"
} }
}, },
"node_modules/simple-concat": {
"version": "1.0.1",
"resolved": "https://registry.npmjs.org/simple-concat/-/simple-concat-1.0.1.tgz",
"integrity": "sha512-cSFtAPtRhljv69IK0hTVZQ+OfE9nePi/rtJmw5UjHeVyVroEqJXP1sFztKUy1qU+xvz3u/sfYJLa947b7nAN2Q==",
"funding": [
{
"type": "github",
"url": "https://github.com/sponsors/feross"
},
{
"type": "patreon",
"url": "https://www.patreon.com/feross"
},
{
"type": "consulting",
"url": "https://feross.org/support"
}
],
"license": "MIT"
},
"node_modules/simple-get": {
"version": "4.0.1",
"resolved": "https://registry.npmjs.org/simple-get/-/simple-get-4.0.1.tgz",
"integrity": "sha512-brv7p5WgH0jmQJr1ZDDfKDOSeWWg+OVypG99A/5vYGPqJ6pxiaHLy8nxtFjBA7oMa01ebA9gfh1uMCFqOuXxvA==",
"funding": [
{
"type": "github",
"url": "https://github.com/sponsors/feross"
},
{
"type": "patreon",
"url": "https://www.patreon.com/feross"
},
{
"type": "consulting",
"url": "https://feross.org/support"
}
],
"license": "MIT",
"dependencies": {
"decompress-response": "^6.0.0",
"once": "^1.3.1",
"simple-concat": "^1.0.0"
}
},
"node_modules/simple-swizzle": { "node_modules/simple-swizzle": {
"version": "0.2.2", "version": "0.2.2",
"resolved": "https://registry.npmjs.org/simple-swizzle/-/simple-swizzle-0.2.2.tgz", "resolved": "https://registry.npmjs.org/simple-swizzle/-/simple-swizzle-0.2.2.tgz",
@@ -4342,7 +4769,6 @@
"version": "1.3.0", "version": "1.3.0",
"resolved": "https://registry.npmjs.org/string_decoder/-/string_decoder-1.3.0.tgz", "resolved": "https://registry.npmjs.org/string_decoder/-/string_decoder-1.3.0.tgz",
"integrity": "sha512-hkRX8U1WjJFd8LsDJ2yQ/wWWxaopEsABU1XfkM8A+j0+85JAGppt16cr1Whg6KIbb4okU6Mql6BOj+uup/wKeA==", "integrity": "sha512-hkRX8U1WjJFd8LsDJ2yQ/wWWxaopEsABU1XfkM8A+j0+85JAGppt16cr1Whg6KIbb4okU6Mql6BOj+uup/wKeA==",
"dev": true,
"license": "MIT", "license": "MIT",
"dependencies": { "dependencies": {
"safe-buffer": "~5.2.0" "safe-buffer": "~5.2.0"
@@ -4374,6 +4800,15 @@
"node": ">=8" "node": ">=8"
} }
}, },
"node_modules/strip-json-comments": {
"version": "2.0.1",
"resolved": "https://registry.npmjs.org/strip-json-comments/-/strip-json-comments-2.0.1.tgz",
"integrity": "sha512-4gB8na07fecVVkOI6Rs4e7T6NOTki5EmL7TUduTs6bu3EdnSycntVJ4re8kgZA+wx9IueI2Y11bfbgwtzuE0KQ==",
"license": "MIT",
"engines": {
"node": ">=0.10.0"
}
},
"node_modules/stylis": { "node_modules/stylis": {
"version": "4.3.6", "version": "4.3.6",
"resolved": "https://registry.npmjs.org/stylis/-/stylis-4.3.6.tgz", "resolved": "https://registry.npmjs.org/stylis/-/stylis-4.3.6.tgz",
@@ -4522,6 +4957,18 @@
"integrity": "sha512-oJFu94HQb+KVduSUQL7wnpmqnfmLsOA/nAh6b6EH0wCEoK0/mPeXU6c3wKDV83MkOuHPRHtSXKKU99IBazS/2w==", "integrity": "sha512-oJFu94HQb+KVduSUQL7wnpmqnfmLsOA/nAh6b6EH0wCEoK0/mPeXU6c3wKDV83MkOuHPRHtSXKKU99IBazS/2w==",
"license": "0BSD" "license": "0BSD"
}, },
"node_modules/tunnel-agent": {
"version": "0.6.0",
"resolved": "https://registry.npmjs.org/tunnel-agent/-/tunnel-agent-0.6.0.tgz",
"integrity": "sha512-McnNiV1l8RYeY8tBgEpuodCC1mLUdbSN+CYBL7kJsJNInOP8UjDDEwdk6Mw60vdLLrr5NHKZhMAOSrR2NZuQ+w==",
"license": "Apache-2.0",
"dependencies": {
"safe-buffer": "^5.0.1"
},
"engines": {
"node": "*"
}
},
"node_modules/typed-query-selector": { "node_modules/typed-query-selector": {
"version": "2.12.0", "version": "2.12.0",
"resolved": "https://registry.npmjs.org/typed-query-selector/-/typed-query-selector-2.12.0.tgz", "resolved": "https://registry.npmjs.org/typed-query-selector/-/typed-query-selector-2.12.0.tgz",


@@ -1,9 +1,13 @@
{
"scripts": {
"links:refresh": "node tools/check_external_links.js",
"stats:generate": "node tools/generate_stats.js"
},
"dependencies": {
"@napi-rs/canvas": "^0.1.59",
"@influxdata/influxdb-client": "^1.35.0",
"chart.js": "^4.4.4",
"chartjs-node-canvas": "^5.0.0",
"luxon": "^3.7.2",
"postcss-import": "^16.1.0",
"postcss-nested": "^7.0.2",

requirements.txt (new file, 3 lines)

@@ -0,0 +1,3 @@
matplotlib
numpy
pyyaml


@@ -70,13 +70,14 @@ article.article-body {
dl {
margin: var(--margin) auto;
max-width: 100%;
min-width: var(--width-content-min);
max-width: var(--width-content-max);
display: grid;
grid-template-columns: 40% 60%;
grid-template-columns: auto minmax(0, 1fr);
/* column-gap: var(--gap-half); */
row-gap: 0;
border: var(--border-panel-outer);
overflow: hidden;
margin: var(--margin) auto;
font-size: 1rem;
}
@@ -89,6 +90,7 @@
/* Colonne de gauche */
dt {
font-weight: bold;
white-space: nowrap;
}
/* Colonne de droite */ /* Colonne de droite */


@@ -1,7 +1,7 @@
#!/usr/bin/env node
const fs = require("fs/promises");
const path = require("path");
const { resolveMarkdownTargets } = require("./lib/content");
const { extractRawDate, readFrontmatter, writeFrontmatter } = require("./lib/weather/frontmatter");
const { resolveArticleDate } = require("./lib/weather/time");
const { fetchWeather, hasConfiguredProvider, mergeWeather } = require("./lib/weather/providers");
@@ -9,62 +9,6 @@ const { loadWeatherConfig } = require("./lib/weather/config");
const CONTENT_ROOT = path.resolve("content");
async function collectMarkdownFiles(rootDir) {
const entries = await fs.readdir(rootDir, { withFileTypes: true });
const files = [];
for (const entry of entries) {
const fullPath = path.join(rootDir, entry.name);
if (entry.isDirectory()) {
const nested = await collectMarkdownFiles(fullPath);
files.push(...nested);
continue;
}
if (!entry.isFile()) continue;
if (!entry.name.endsWith(".md")) continue;
if (entry.name === "_index.md") continue;
files.push(fullPath);
}
return files;
}
async function resolveTargets(args) {
if (args.length === 0) {
return collectMarkdownFiles(CONTENT_ROOT);
}
const targets = new Set();
for (const input of args) {
const resolved = path.resolve(input);
try {
const stat = await fs.stat(resolved);
if (stat.isDirectory()) {
const nested = await collectMarkdownFiles(resolved);
nested.forEach((file) => targets.add(file));
continue;
}
if (stat.isFile()) {
if (!resolved.endsWith(".md")) continue;
if (path.basename(resolved) === "_index.md") continue;
targets.add(resolved);
}
} catch (error) {
console.error(`Skipping ${input}: ${error.message}`);
}
}
return Array.from(targets);
}
async function processFile(filePath, config, { force = false } = {}) {
const frontmatter = await readFrontmatter(filePath);
@@ -128,7 +72,7 @@ async function main() {
console.error("No weather provider configured. Update tools/config.json (weather.providers) before running this script.");
process.exit(1);
}
const files = await resolveTargets(pathArgs);
const files = await resolveMarkdownTargets(pathArgs, { rootDir: CONTENT_ROOT });
if (files.length === 0) {
console.log("No matching markdown files found.");


@@ -93,6 +93,9 @@
"windowMinutes": 90, "windowMinutes": 90,
"precipitationThreshold": 0.1 "precipitationThreshold": 0.1
} }
} },
"goaccess": {
"url": ""
} }
} }
}

tools/generate_stats.js (new file, 357 lines)

@@ -0,0 +1,357 @@
#!/usr/bin/env node
const fs = require("fs/promises");
const path = require("path");
const DEFAULT_CONFIG_PATH = "tools/stats.json";
const DEFAULT_DATA_OUTPUT = "content/stats/data/stats.json";
const DEFAULT_IMAGE_DIR = "content/stats/images";
function parseArgs(argv) {
const args = {};
for (let index = 0; index < argv.length; index += 1) {
const current = argv[index];
const next = argv[index + 1];
switch (current) {
case "--config":
case "-c":
args.config = next;
index += 1;
break;
case "--data":
case "-d":
args.data = next;
index += 1;
break;
case "--only":
case "-o":
args.only = next;
index += 1;
break;
default:
break;
}
}
return args;
}
async function loadDefinition(configPath) {
const raw = await fs.readFile(configPath, "utf8");
try {
return JSON.parse(raw);
} catch (error) {
throw new Error(`Impossible de parser ${configPath}: ${error.message}`);
}
}
async function loadModule(scriptPath) {
const resolved = path.resolve(scriptPath);
await fs.access(resolved);
// allow re-run without cache
delete require.cache[resolved];
const mod = require(resolved);
if (!mod || typeof mod.run !== "function") {
throw new Error(`Le script ${scriptPath} doit exporter une fonction run(context)`);
}
return mod;
}
function resolvePythonInterpreter() {
const envPython = process.env.VIRTUAL_ENV
? path.join(process.env.VIRTUAL_ENV, "bin", "python")
: null;
const candidates = [
envPython,
path.join(process.cwd(), ".venv", "bin", "python"),
path.join(process.cwd(), ".venv", "bin", "python3"),
"python3",
].filter(Boolean);
return candidates.find((candidate) => {
try {
const stat = require("fs").statSync(candidate);
return stat.isFile() || stat.isSymbolicLink();
} catch (_error) {
return false;
}
}) || "python3";
}
async function runPython(scriptPath, payload) {
const resolved = path.resolve(scriptPath);
await fs.access(resolved);
const interpreter = resolvePythonInterpreter();
return new Promise((resolve, reject) => {
const child = require("child_process").spawn(interpreter, [resolved], {
stdio: ["pipe", "pipe", "pipe"],
});
let stdout = "";
let stderr = "";
child.stdout.on("data", (data) => {
stdout += data.toString();
});
child.stderr.on("data", (data) => {
stderr += data.toString();
});
child.on("error", (error) => {
reject(error);
});
child.on("close", (code) => {
if (code !== 0) {
const err = new Error(stderr || `Python exited with code ${code}`);
err.code = code;
return reject(err);
}
const trimmed = stdout.trim();
if (!trimmed) return resolve({});
try {
resolve(JSON.parse(trimmed));
} catch (error) {
reject(new Error(`Invalid JSON from ${scriptPath}: ${error.message}`));
}
});
child.stdin.write(JSON.stringify(payload));
child.stdin.end();
});
}
function toPublicPath(target, { rootDir = process.cwd(), staticDir = path.resolve("static"), absOutput } = {}) {
if (!target) return "";
const normalized = target.replace(/\\/g, "/");
if (/^https?:\/\//i.test(normalized)) return normalized;
if (normalized.startsWith("/")) {
return normalized.replace(/\/{2,}/g, "/");
}
const absolute = absOutput || path.resolve(rootDir, target);
if (absolute.startsWith(staticDir)) {
const rel = path.relative(staticDir, absolute).replace(/\\/g, "/");
return `/${rel}`;
}
if (!path.isAbsolute(target) && !target.startsWith("/")) {
return normalized;
}
const relRoot = path.relative(rootDir, absolute).replace(/\\/g, "/");
return `/${relRoot}`;
}
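A quick illustration of how `toPublicPath` maps output targets to the URLs stored in stats.json; the paths below are hypothetical and assume `rootDir = /repo` and `staticDir = /repo/static`.

```js
toPublicPath("static/_generated/chart.png", { rootDir: "/repo", staticDir: "/repo/static" });
// => "/_generated/chart.png"  (files under static/ are served from the site root)

toPublicPath("content/stats/images/articles_per_year.png", { rootDir: "/repo", staticDir: "/repo/static" });
// => "content/stats/images/articles_per_year.png"  (kept relative: a page resource, not a static file)

toPublicPath("https://example.org/chart.png");
// => returned unchanged (absolute URLs pass through)
```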
function resolveGraphicPaths(stat, defaultImageDir, { rootDir, staticDir }) {
const target = stat.image || (defaultImageDir ? path.join(defaultImageDir, `${stat.key}.png`) : null);
if (!target) {
throw new Error("Chemin d'image manquant (image ou defaultImageDir)");
}
const absOutput = path.isAbsolute(target) ? target : path.resolve(rootDir, target);
const publicPath = toPublicPath(target, { rootDir, staticDir, absOutput });
return { publicPath, outputPath: absOutput };
}
function mergeResult(base, result, { publicPath } = {}) {
const entry = { ...base };
if (publicPath && base.type === "graphic") {
entry.image = publicPath;
}
if (result === undefined || result === null) {
return entry;
}
if (typeof result === "object" && !Array.isArray(result)) {
if (Object.prototype.hasOwnProperty.call(result, "value")) {
entry.value = result.value;
}
if (result.image) {
entry.image = toPublicPath(result.image);
}
if (result.meta) {
entry.meta = result.meta;
}
if (result.data) {
entry.data = result.data;
}
return entry;
}
entry.value = result;
return entry;
}
async function runStat(stat, context) {
const base = {
key: stat.key,
title: stat.title,
type: stat.type,
};
if (!stat.key) {
throw new Error("Cle manquante pour cette statistique");
}
if (!stat.script) {
throw new Error(`Script manquant pour ${stat.key}`);
}
if (!stat.type) {
throw new Error(`Type manquant pour ${stat.key}`);
}
const isPython = stat.script.endsWith(".py");
if (isPython) {
if (stat.type === "graphic") {
const { publicPath, outputPath } = resolveGraphicPaths(stat, context.defaultImageDir, context);
await fs.mkdir(path.dirname(outputPath), { recursive: true });
const result = await runPython(stat.script, {
...context,
stat,
outputPath,
publicPath,
});
return mergeResult(base, result, { publicPath });
}
if (stat.type === "variable") {
const result = await runPython(stat.script, {
...context,
stat,
});
return mergeResult(base, result);
}
throw new Error(`Type inconnu pour ${stat.key}: ${stat.type}`);
} else {
const mod = await loadModule(stat.script);
if (stat.type === "graphic") {
const { publicPath, outputPath } = resolveGraphicPaths(stat, context.defaultImageDir, context);
await fs.mkdir(path.dirname(outputPath), { recursive: true });
const result = await mod.run({
...context,
stat,
outputPath,
publicPath,
});
return mergeResult(base, result, { publicPath });
}
if (stat.type === "variable") {
const result = await mod.run({
...context,
stat,
});
return mergeResult(base, result);
}
throw new Error(`Type inconnu pour ${stat.key}: ${stat.type}`);
}
}
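The contract for JavaScript stat scripts follows from `loadModule` and `mergeResult`: the module must export `run(context)` and may return either a plain value or an object with `value`, `image`, `meta` or `data`. A minimal sketch of a hypothetical "variable" stat script (the file path and the metric computed here are illustrative, not part of the commit):

```js
// tools/stats/articles_avg_per_month.js (hypothetical path)
const { loadArticles } = require("../lib/stats/articles");

module.exports = {
  async run(context) {
    // context carries rootDir, contentDir, staticDir, the stat definition, etc.
    const articles = await loadArticles(context.contentDir);
    const dated = articles.filter((article) => article.date);
    const months = new Set(dated.map((article) => article.date.toFormat("yyyy-LL")));
    const average = months.size ? dated.length / months.size : 0;
    return { value: average.toFixed(1), meta: { articles: dated.length, months: months.size } };
  },
};
```

Python stat scripts follow the companion contract in `runPython`: they receive the same context serialized as JSON on stdin and must print a JSON result to stdout.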
function buildOnlyFilter(onlyRaw) {
if (!onlyRaw) return null;
const parts = onlyRaw
.split(",")
.map((part) => part.trim())
.filter(Boolean);
return new Set(parts);
}
async function main() {
const cliArgs = parseArgs(process.argv.slice(2));
const definitionPath = path.resolve(cliArgs.config || DEFAULT_CONFIG_PATH);
const definition = await loadDefinition(definitionPath);
const statsConfig = definition.config || {};
const dataOutput = path.resolve(cliArgs.data || statsConfig.dataOutput || DEFAULT_DATA_OUTPUT);
const defaultImageDir = statsConfig.defaultImageDir || DEFAULT_IMAGE_DIR;
const onlyFilter = buildOnlyFilter(cliArgs.only);
const context = {
rootDir: process.cwd(),
contentDir: path.resolve("content"),
staticDir: path.resolve("static"),
definitionPath,
defaultImageDir,
config: statsConfig,
};
const output = {
generated_at: new Date().toISOString(),
sections: [],
};
const errors = [];
for (const section of definition.sections || []) {
const results = [];
for (const stat of section.statistics || []) {
if (onlyFilter && !onlyFilter.has(stat.key)) {
continue;
}
try {
const entry = await runStat(stat, context);
results.push(entry);
console.log(`[ok] ${stat.key}`);
} catch (error) {
errors.push({ key: stat.key, message: error.message });
console.error(`[err] ${stat.key}: ${error.message}`);
results.push({
key: stat.key,
title: stat.title,
type: stat.type,
error: error.message,
image: stat.image ? toPublicPath(stat.image) : undefined,
});
}
}
if (results.length > 0) {
output.sections.push({
title: section.title,
statistics: results,
});
}
}
if (errors.length > 0) {
output.errors = errors;
}
await fs.mkdir(path.dirname(dataOutput), { recursive: true });
await fs.writeFile(dataOutput, `${JSON.stringify(output, null, 2)}\n`, "utf8");
const relativeOutput = path.relative(process.cwd(), dataOutput);
console.log(`\nFichier de donnees genere: ${relativeOutput}`);
if (errors.length > 0) {
console.log(`Statistiques en erreur: ${errors.length}. Les entrees concernees contiennent le message d'erreur.`);
}
}
main().catch((error) => {
console.error(error.message);
process.exit(1);
});
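The definition file `tools/stats.json` itself is not part of this diff view; judging from how `loadDefinition` and `runStat` consume it, it presumably looks something like the sketch below (keys, titles and script paths are illustrative only).

```json
{
  "config": {
    "dataOutput": "content/stats/data/stats.json",
    "defaultImageDir": "content/stats/images"
  },
  "sections": [
    {
      "title": "Habitudes d'écriture",
      "statistics": [
        { "key": "articles_per_year", "title": "Articles par an", "type": "graphic", "script": "tools/stats/articles_per_year.py" },
        { "key": "articles_avg_per_month", "title": "Articles par mois", "type": "variable", "script": "tools/stats/articles_avg_per_month.js" }
      ]
    }
  ]
}
```

The generator is run through `npm run stats:generate`; the `--only key1,key2` flag regenerates a subset of entries and `--data` overrides the output path.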

tools/lib/config.js (new file, 20 lines)

@@ -0,0 +1,20 @@
const fs = require("fs/promises");
const path = require("path");
let cached = null;
async function loadToolsConfig(configPath = "tools/config.json") {
const resolved = path.resolve(configPath);
if (cached && cached.path === resolved) {
return cached.data;
}
const raw = await fs.readFile(resolved, "utf8");
const data = JSON.parse(raw);
cached = { path: resolved, data };
return data;
}
module.exports = {
loadToolsConfig,
};
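A usage sketch, assuming a caller somewhere under tools/; the `goaccess.url` key matches the entry added to tools/config.json above (left empty in this commit).

```js
const { loadToolsConfig } = require("./lib/config");

(async () => {
  const config = await loadToolsConfig(); // reads tools/config.json, cached per resolved path
  if (!config.goaccess || !config.goaccess.url) {
    console.warn("goaccess.url is not configured; visit statistics will be skipped");
    return;
  }
  console.log("GoAccess report URL:", config.goaccess.url);
})();
```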

tools/lib/content.js (new file, 99 lines)

@@ -0,0 +1,99 @@
const fs = require("fs/promises");
const path = require("path");
async function collectMarkdownFiles(rootDir, { skipIndex = true } = {}) {
const entries = await fs.readdir(rootDir, { withFileTypes: true });
const files = [];
for (const entry of entries) {
const fullPath = path.join(rootDir, entry.name);
if (entry.isDirectory()) {
const nested = await collectMarkdownFiles(fullPath, { skipIndex });
files.push(...nested);
continue;
}
if (!entry.isFile()) continue;
if (!entry.name.toLowerCase().endsWith(".md")) continue;
if (skipIndex && entry.name === "_index.md") continue;
files.push(fullPath);
}
return files;
}
async function collectSectionIndexDirs(rootDir) {
const sections = new Set();
async function walk(dir) {
let entries;
try {
entries = await fs.readdir(dir, { withFileTypes: true });
} catch (error) {
console.error(`Skipping section scan for ${dir}: ${error.message}`);
return;
}
let hasIndex = false;
for (const entry of entries) {
if (entry.isFile() && entry.name.toLowerCase() === "_index.md") {
hasIndex = true;
break;
}
}
if (hasIndex) {
sections.add(path.resolve(dir));
}
for (const entry of entries) {
if (entry.isDirectory()) {
await walk(path.join(dir, entry.name));
}
}
}
await walk(rootDir);
return sections;
}
async function resolveMarkdownTargets(inputs, { rootDir = process.cwd(), skipIndex = true } = {}) {
if (!inputs || inputs.length === 0) {
return collectMarkdownFiles(rootDir, { skipIndex });
}
const targets = new Set();
for (const input of inputs) {
const resolved = path.resolve(input);
try {
const stat = await fs.stat(resolved);
if (stat.isDirectory()) {
const nested = await collectMarkdownFiles(resolved, { skipIndex });
nested.forEach((file) => targets.add(file));
continue;
}
if (stat.isFile()) {
const lower = resolved.toLowerCase();
if (!lower.endsWith(".md")) continue;
if (skipIndex && path.basename(resolved) === "_index.md") continue;
targets.add(resolved);
}
} catch (error) {
console.error(`Skipping ${input}: ${error.message}`);
}
}
return Array.from(targets);
}
module.exports = {
collectMarkdownFiles,
collectSectionIndexDirs,
resolveMarkdownTargets,
};
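Usage mirrors the call now made in tools/add_weather.js: with no arguments every article under content/ is collected, otherwise only the requested files or directories. A small sketch (the target paths are illustrative):

```js
const path = require("path");
const { resolveMarkdownTargets } = require("./lib/content");

(async () => {
  // No CLI arguments: walk all of content/, skipping _index.md files.
  const all = await resolveMarkdownTargets([], { rootDir: path.resolve("content") });

  // Explicit targets: a section directory plus a single article.
  const some = await resolveMarkdownTargets(["content/journal", "content/journal/2025-11-27.md"]);

  console.log(all.length, some.length);
})();
```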


@@ -0,0 +1,91 @@
const path = require("path");
const { DateTime } = require("luxon");
const { collectMarkdownFiles, collectSectionIndexDirs } = require("../content");
const { readFrontmatter } = require("../weather/frontmatter");
function parseDate(value) {
if (!value) return null;
if (value instanceof Date) {
return DateTime.fromJSDate(value);
}
if (typeof value === "string") {
let parsed = DateTime.fromISO(value);
if (!parsed.isValid) {
parsed = DateTime.fromRFC2822(value);
}
return parsed.isValid ? parsed : null;
}
return null;
}
function countWords(body) {
if (!body) return 0;
const cleaned = body
.replace(/```[\s\S]*?```/g, " ") // fenced code blocks
.replace(/`[^`]*`/g, " ") // inline code
.replace(/<[^>]+>/g, " "); // html tags
const words = cleaned.match(/[\p{L}\p{N}'-]+/gu);
return words ? words.length : 0;
}
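`countWords` strips fenced code, inline code and HTML tags before matching word characters, so markup does not inflate the totals. A small check on an illustrative string:

```js
// Code spans and tags are ignored; accented letters and digits count as word characters.
countWords("Un *petit* test avec du `code` et <em>une balise</em>.");
// => 8  ("Un", "petit", "test", "avec", "du", "et", "une", "balise")
```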
async function loadArticles(contentDir) {
const files = await collectMarkdownFiles(contentDir);
const sectionDirs = await collectSectionIndexDirs(contentDir);
const rootDir = path.resolve(contentDir);
const articles = [];
function resolveSection(filePath) {
const absolute = path.resolve(filePath);
let current = path.dirname(absolute);
while (current.startsWith(rootDir)) {
if (sectionDirs.has(current)) {
return path.relative(rootDir, current).replace(/\\/g, "/") || ".";
}
const parent = path.dirname(current);
if (parent === current) break;
current = parent;
}
return null;
}
for (const file of files) {
const frontmatter = await readFrontmatter(file);
if (!frontmatter) continue;
const date = parseDate(frontmatter.doc.get("date"));
const title = frontmatter.doc.get("title") || path.basename(file, ".md");
const body = frontmatter.body.trim();
const wordCount = countWords(body);
const relativePath = path.relative(contentDir, file);
const section = resolveSection(file);
articles.push({
path: file,
relativePath,
title,
date,
body,
wordCount,
section,
frontmatter: frontmatter.doc.toJS ? frontmatter.doc.toJS() : frontmatter.doc.toJSON(),
});
}
return articles;
}
module.exports = {
collectMarkdownFiles,
countWords,
loadArticles,
parseDate,
};
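A sketch of how a stat script might consume `loadArticles`; each returned article exposes `date` (a Luxon DateTime or null), `wordCount`, `section` and the raw frontmatter. The aggregation below is illustrative.

```js
const { loadArticles } = require("./lib/stats/articles");

(async () => {
  const articles = await loadArticles("content");
  const perYear = new Map();
  for (const article of articles) {
    if (!article.date) continue; // undated entries are skipped
    const year = article.date.toFormat("yyyy");
    const entry = perYear.get(year) || { articles: 0, words: 0 };
    entry.articles += 1;
    entry.words += article.wordCount;
    perYear.set(year, entry);
  }
  console.log(Object.fromEntries(perYear));
})();
```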

tools/lib/stats/goaccess.js (new file, 131 lines)

@@ -0,0 +1,131 @@
const { request } = require("undici");
const { DateTime } = require("luxon");
async function fetchGoAccessJson(url) {
const res = await request(url, { method: "GET" });
if (res.statusCode < 200 || res.statusCode >= 300) {
throw new Error(`HTTP ${res.statusCode}`);
}
return res.body.json();
}
function crawlerRatios(data) {
const browsers = data.browsers?.data || [];
const crawler = browsers.find((entry) => entry.data === "Crawlers");
if (!crawler) return { hits: 0, visitors: 0 };
const totalHits = (browsers.reduce((sum, entry) => sum + (entry.hits?.count || 0), 0)) || 0;
const totalVisitors = (browsers.reduce((sum, entry) => sum + (entry.visitors?.count || 0), 0)) || 0;
const hitRatio = totalHits > 0 ? Math.min(1, (crawler.hits?.count || 0) / totalHits) : 0;
const visitorRatio = totalVisitors > 0 ? Math.min(1, (crawler.visitors?.count || 0) / totalVisitors) : 0;
return { hits: hitRatio, visitors: visitorRatio };
}
function groupVisitsByMonth(data, { adjustCrawlers = true } = {}) {
const entries = data.visitors?.data || [];
const ratios = adjustCrawlers ? crawlerRatios(data) : { hits: 0, visitors: 0 };
const months = new Map();
for (const entry of entries) {
const dateStr = entry.data;
if (!/^[0-9]{8}$/.test(dateStr)) continue;
const year = dateStr.slice(0, 4);
const month = dateStr.slice(4, 6);
const day = dateStr.slice(6, 8);
const key = `${year}-${month}`;
const hits = entry.hits?.count || 0;
const visitors = entry.visitors?.count || 0;
const current = months.get(key) || { hits: 0, visitors: 0, from: null, to: null };
const isoDate = `${year}-${month}-${day}`;
current.hits += hits;
current.visitors += visitors;
if (!current.from || isoDate < current.from) current.from = isoDate;
if (!current.to || isoDate > current.to) current.to = isoDate;
months.set(key, current);
}
const adjust = (value, ratio) => {
if (!adjustCrawlers) return value;
const scaled = value * (1 - ratio);
return Math.max(0, Math.round(scaled));
};
const sorted = Array.from(months.entries())
.sort((a, b) => a[0].localeCompare(b[0]))
.map(([key, value]) => ({
month: key,
from: value.from,
to: value.to,
hits: adjust(value.hits, ratios.hits),
visitors: adjust(value.visitors, ratios.visitors),
}));
return sorted;
}
function aggregateLastNDays(data, days = 30, { adjustCrawlers = true } = {}) {
const entries = data.visitors?.data || [];
if (!entries.length || days <= 0) {
return { from: null, to: null, hits: 0, visitors: 0 };
}
const valid = entries.filter((entry) => /^[0-9]{8}$/.test(entry.data));
if (valid.length === 0) {
return { from: null, to: null, hits: 0, visitors: 0 };
}
const sorted = valid.slice().sort((a, b) => a.data.localeCompare(b.data));
const last = sorted[sorted.length - 1];
const end = DateTime.fromFormat(last.data, "yyyyLLdd", { zone: "UTC" });
if (!end.isValid) {
return { from: null, to: null, hits: 0, visitors: 0 };
}
const start = end.minus({ days: days - 1 });
let from = null;
let to = null;
let hits = 0;
let visitors = 0;
for (const entry of sorted) {
const current = DateTime.fromFormat(entry.data, "yyyyLLdd", { zone: "UTC" });
if (!current.isValid) continue;
if (current < start || current > end) continue;
const iso = current.toISODate();
if (!from || iso < from) from = iso;
if (!to || iso > to) to = iso;
hits += entry.hits?.count || 0;
visitors += entry.visitors?.count || 0;
}
const ratios = adjustCrawlers ? crawlerRatios(data) : { hits: 0, visitors: 0 };
const adjust = (value, ratio) => {
if (!adjustCrawlers) return value;
const scaled = value * (1 - ratio);
return Math.max(0, Math.round(scaled));
};
return {
from,
to,
hits: adjust(hits, ratios.hits),
visitors: adjust(visitors, ratios.visitors),
};
}
module.exports = {
fetchGoAccessJson,
groupVisitsByMonth,
aggregateLastNDays,
crawlerRatios,
};
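A sketch of how these helpers fit together, matching the rolling 30-day window mentioned on the stats page; it assumes `goaccess.url` in tools/config.json points at a GoAccess JSON report (the value committed here is empty).

```js
const { loadToolsConfig } = require("../config");
const { fetchGoAccessJson, aggregateLastNDays } = require("./goaccess");

(async () => {
  const config = await loadToolsConfig();
  const report = await fetchGoAccessJson(config.goaccess.url); // GoAccess JSON report
  // Sum hits and unique visitors over the last 30 days, discounting known crawlers.
  const { from, to, hits, visitors } = aggregateLastNDays(report, 30);
  console.log(`${from} to ${to}: ${hits} page views, ${visitors} unique visitors`);
})();
```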

tools/lib/stats/python.js (new file, 32 lines)

@@ -0,0 +1,32 @@
const { spawn } = require("child_process");
const path = require("path");
async function renderWithPython({ type, data, outputPath }) {
return new Promise((resolve, reject) => {
const scriptPath = path.resolve(__dirname, "../../render_stats_charts.py");
const child = spawn("python3", [scriptPath, "--type", type, "--output", outputPath], {
stdio: ["pipe", "inherit", "inherit"],
});
const payload = JSON.stringify(data);
child.stdin.write(payload);
child.stdin.end();
child.on("error", (error) => {
reject(error);
});
child.on("exit", (code) => {
if (code === 0) {
resolve();
} else {
reject(new Error(`Python renderer exited with code ${code}`));
}
});
});
}
module.exports = {
renderWithPython,
};
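A usage sketch; the `type` string is assumed to map onto one of the renderers in tools/render_stats_charts.py (the dispatch code sits further down in that file and is not fully shown here), and the `data` shape follows what `render_articles_per_year` expects. Labels and values are illustrative.

```js
const path = require("path");
const { renderWithPython } = require("./lib/stats/python");

(async () => {
  await renderWithPython({
    type: "articles_per_year", // assumed to select render_articles_per_year
    data: {
      title: "Articles par an",
      labels: ["2023", "2024", "2025"],
      values: [12, 18, 25],
    },
    outputPath: path.resolve("content/stats/images/articles_per_year.png"),
  });
})();
```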


@@ -0,0 +1,528 @@
#!/usr/bin/env python3
import argparse
import json
import math
import sys
import matplotlib
matplotlib.use("Agg")
import matplotlib.pyplot as plt # noqa: E402
import matplotlib.colors as mcolors # noqa: E402
import numpy as np # noqa: E402
PALETTE = [
"#467FFF", # blue-500
"#40C474", # green-500
"#FF4D5A", # red-500
"#FFA93D", # amber-500
"#9E63E9", # purple-500
"#2FC4FF", # cyan-500
"#98C0FF", # blue-300
"#8FE4A2", # green-300
"#FF939B", # red-300
"#FFD08C", # amber-300
"#D2AAF7", # purple-300
"#8EE8FF", # cyan-300
]
BACKGROUND = "#0F1114" # gray-900
TEXT = "#D9E0E8" # gray-300
GRID = (1.0, 1.0, 1.0, 0.16) # soft white grid
FIG_WIDTH = 20.0 # ~1920px at DPI=96
FIG_HEIGHT = 10.8 # 16:9 ratio
DPI = 96
BASE_FONT_SIZE = 16
TICK_FONT_SIZE = 15
LEGEND_FONT_SIZE = 14
TITLE_FONT_SIZE = 18
def setup_rcparams():
matplotlib.rcParams.update(
{
"figure.figsize": (FIG_WIDTH, FIG_HEIGHT),
"figure.dpi": DPI,
"axes.facecolor": BACKGROUND,
"figure.facecolor": BACKGROUND,
"axes.edgecolor": TEXT,
"axes.labelcolor": TEXT,
"xtick.color": TEXT,
"ytick.color": TEXT,
"text.color": TEXT,
"font.size": BASE_FONT_SIZE,
}
)
def new_axes():
fig, ax = plt.subplots()
fig.set_facecolor(BACKGROUND)
ax.set_facecolor(BACKGROUND)
ax.grid(True, axis="y", color=GRID, linestyle="--", linewidth=0.7)
return fig, ax
def render_articles_per_month(data, output):
labels = data.get("labels") or []
series = data.get("series") or []
title = data.get("title") or "Articles par mois"
if not labels or not series:
fig, ax = new_axes()
ax.text(
0.5,
0.5,
"Aucune donnees",
ha="center",
va="center",
fontsize=BASE_FONT_SIZE,
)
fig.savefig(output, bbox_inches="tight")
plt.close(fig)
return
x = np.arange(len(labels))
fig, ax = new_axes()
bottoms = np.zeros(len(labels))
for index, serie in enumerate(series):
values = np.array(serie.get("values") or [0] * len(labels), dtype=float)
color = PALETTE[index % len(PALETTE)]
ax.bar(x, values, bottom=bottoms, label=str(serie.get("label", "")), color=color, linewidth=0)
bottoms += values
ax.set_xticks(x)
ax.set_xticklabels(labels, rotation=45, ha="right", fontsize=TICK_FONT_SIZE)
ax.tick_params(axis="y", labelsize=TICK_FONT_SIZE)
ax.set_ylabel("Articles")
ax.set_title(title, fontsize=TITLE_FONT_SIZE, color=TEXT)
ax.legend(fontsize=LEGEND_FONT_SIZE)
fig.tight_layout()
fig.savefig(output, bbox_inches="tight")
plt.close(fig)
def render_articles_per_year(data, output):
labels = data.get("labels") or []
values = data.get("values") or []
title = data.get("title") or "Articles par an"
if not labels or not values:
fig, ax = new_axes()
ax.text(
0.5,
0.5,
"Aucune donnees",
ha="center",
va="center",
fontsize=BASE_FONT_SIZE,
)
fig.savefig(output, bbox_inches="tight")
plt.close(fig)
return
x = np.arange(len(labels))
fig, ax = new_axes()
ax.bar(x, values, color=PALETTE[0])
ax.set_xticks(x)
ax.set_xticklabels(labels, rotation=0, fontsize=TICK_FONT_SIZE)
ax.tick_params(axis="y", labelsize=TICK_FONT_SIZE)
ax.set_ylabel("Articles")
ax.set_title(title, fontsize=TITLE_FONT_SIZE, color=TEXT)
fig.tight_layout()
fig.savefig(output, bbox_inches="tight")
plt.close(fig)
def render_articles_per_section(data, output):
labels = data.get("labels") or []
values = data.get("values") or []
title = data.get("title") or "Articles par section"
if not labels or not values:
fig, ax = new_axes()
ax.text(
0.5,
0.5,
"Aucune donnees",
ha="center",
va="center",
fontsize=BASE_FONT_SIZE,
)
fig.savefig(output, bbox_inches="tight")
plt.close(fig)
return
fig, ax = new_axes()
# Donut chart
wedges, _ = ax.pie(
values,
labels=None,
colors=[PALETTE[i % len(PALETTE)] for i in range(len(values))],
startangle=90,
counterclock=False,
)
centre_circle = plt.Circle((0, 0), 0.60, fc=BACKGROUND)
ax.add_artist(centre_circle)
ax.set_title(title, fontsize=TITLE_FONT_SIZE, color=TEXT)
ax.legend(
wedges,
labels,
title="Sections",
loc="center left",
bbox_to_anchor=(1.0, 0.5),
fontsize=LEGEND_FONT_SIZE,
title_fontsize=LEGEND_FONT_SIZE,
)
fig.tight_layout()
fig.savefig(output, bbox_inches="tight")
plt.close(fig)
def render_cumulative(data, output):
labels = data.get("labels") or []
articles = data.get("articles") or []
words = data.get("words") or []
title = data.get("title") or "Cumul articles / mots"
if not labels or (not articles and not words):
fig, ax = new_axes()
ax.text(
0.5,
0.5,
"Aucune donnees",
ha="center",
va="center",
fontsize=BASE_FONT_SIZE,
)
fig.savefig(output, bbox_inches="tight")
plt.close(fig)
return
x = np.arange(len(labels))
fig, ax_words = new_axes()
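# Two y axes: cumulative words on the left, cumulative articles on the right (twinx).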
ax_articles = ax_words.twinx()
lines = []
labels_for_legend = []
if words:
lw = ax_words.plot(
x,
words,
label="Mots cumulés",
color=PALETTE[1],
linewidth=2.2,
marker="o",
markersize=4,
)
lines += lw
labels_for_legend += ["Mots cumulés"]
if articles:
la = ax_articles.plot(
x,
articles,
label="Articles cumulés",
color=PALETTE[0],
linewidth=2.2,
marker="o",
markersize=4,
)
lines += la
labels_for_legend += ["Articles cumulés"]
ax_words.set_xticks(x)
ax_words.set_xticklabels(labels, rotation=45, ha="right", fontsize=TICK_FONT_SIZE)
ax_words.tick_params(axis="y", labelsize=TICK_FONT_SIZE, colors=PALETTE[1])
ax_articles.tick_params(axis="y", labelsize=TICK_FONT_SIZE, colors=PALETTE[0])
ax_words.set_ylabel("Mots cumulés", color=PALETTE[1])
ax_articles.set_ylabel("Articles cumulés", color=PALETTE[0])
ax_words.set_title(title, fontsize=TITLE_FONT_SIZE, color=TEXT)
ax_articles.grid(False)
ax_words.grid(True, axis="y", color=GRID, linestyle="--", linewidth=0.7)
fig.legend(lines, labels_for_legend, loc="upper left", fontsize=LEGEND_FONT_SIZE)
fig.tight_layout()
fig.savefig(output, bbox_inches="tight")
plt.close(fig)
def render_words_histogram(data, output):
values = data.get("values") or []
title = data.get("title") or "Distribution des longueurs d'article"
bins = data.get("bins") or 20
fig, ax = new_axes()
if not values:
ax.text(
0.5,
0.5,
"Aucune donnees",
ha="center",
va="center",
fontsize=BASE_FONT_SIZE,
)
else:
ax.hist(values, bins=bins, color=PALETTE[0], edgecolor=TEXT, alpha=0.9)
ax.set_xlabel("Nombre de mots")
ax.set_ylabel("Articles")
ax.set_title(title, fontsize=TITLE_FONT_SIZE, color=TEXT)
fig.tight_layout()
fig.savefig(output, bbox_inches="tight")
plt.close(fig)
def render_top_requests(data, output):
labels = data.get("labels") or []
values = data.get("values") or []
title = data.get("title") or "Top requêtes"
fig, ax = new_axes()
if not labels or not values:
ax.text(
0.5,
0.5,
"Aucune donnees",
ha="center",
va="center",
fontsize=BASE_FONT_SIZE,
)
else:
y_pos = np.arange(len(labels))
ax.barh(y_pos, values, color=PALETTE[0])
ax.set_yticks(y_pos)
ax.set_yticklabels(labels, fontsize=TICK_FONT_SIZE)
ax.invert_yaxis()
ax.set_xlabel("Hits")
ax.set_title(title, fontsize=TITLE_FONT_SIZE, color=TEXT)
fig.tight_layout()
fig.savefig(output, bbox_inches="tight")
plt.close(fig)
def render_weather_hexbin(data, output):
temps = data.get("temps") or []
hums = data.get("hums") or []
presses = data.get("presses") or []
title = data.get("title") or "Météo à la publication"
fig, ax = new_axes()
if not temps or not hums:
ax.text(
0.5,
0.5,
"Aucune donnees",
ha="center",
va="center",
fontsize=BASE_FONT_SIZE,
)
else:
# If pressures are provided, use them for color; otherwise density
if presses and len(presses) == len(temps):
hb = ax.scatter(temps, hums, c=presses, cmap="viridis", alpha=0.75, s=50, edgecolors="none")
cbar = fig.colorbar(hb, ax=ax)
cbar.set_label("Pression (hPa)", color=TEXT)
cbar.ax.yaxis.set_tick_params(color=TEXT, labelsize=LEGEND_FONT_SIZE)
plt.setp(plt.getp(cbar.ax.axes, "yticklabels"), color=TEXT)
else:
norm = mcolors.LogNorm() if len(temps) > 0 else None
hb = ax.hexbin(
temps,
hums,
gridsize=28,
cmap="plasma",
mincnt=1,
linewidths=0.2,
edgecolors="none",
alpha=0.9,
norm=norm,
)
cbar = fig.colorbar(hb, ax=ax)
cbar.set_label("Densité", color=TEXT)
cbar.ax.yaxis.set_tick_params(color=TEXT, labelsize=LEGEND_FONT_SIZE)
plt.setp(plt.getp(cbar.ax.axes, "yticklabels"), color=TEXT)
ax.set_xlabel("Température (°C)")
ax.set_ylabel("Humidité (%)")
ax.tick_params(axis="x", labelsize=TICK_FONT_SIZE)
ax.tick_params(axis="y", labelsize=TICK_FONT_SIZE)
ax.set_title(title, fontsize=TITLE_FONT_SIZE, color=TEXT)
fig.tight_layout()
fig.savefig(output, bbox_inches="tight")
plt.close(fig)
def render_weekday_activity(data, output):
labels = data.get("labels") or []
articles = data.get("articles") or []
words = data.get("words") or []
title = data.get("title") or "Activité par jour"
fig, ax_left = new_axes()
ax_right = ax_left.twinx()
if not labels or (not articles and not words):
ax_left.text(
0.5,
0.5,
"Aucune donnees",
ha="center",
va="center",
fontsize=BASE_FONT_SIZE,
)
else:
x = np.arange(len(labels))
width = 0.38
bars_articles = ax_left.bar(
x - width / 2,
articles,
width=width,
label="Articles",
color=PALETTE[0],
)
bars_words = ax_right.bar(
x + width / 2,
words,
width=width,
label="Mots",
color=PALETTE[1],
)
ax_left.set_xticks(x)
ax_left.set_xticklabels(labels, rotation=0, fontsize=TICK_FONT_SIZE)
ax_left.tick_params(axis="y", labelsize=TICK_FONT_SIZE, colors=PALETTE[0])
ax_right.tick_params(axis="y", labelsize=TICK_FONT_SIZE, colors=PALETTE[1])
ax_left.set_ylabel("Articles", color=PALETTE[0])
ax_right.set_ylabel("Mots", color=PALETTE[1])
lines = [bars_articles, bars_words]
labels_for_legend = ["Articles", "Mots"]
fig.legend(lines, labels_for_legend, loc="upper right", fontsize=LEGEND_FONT_SIZE)
ax_left.set_title(title, fontsize=TITLE_FONT_SIZE, color=TEXT)
fig.tight_layout()
fig.savefig(output, bbox_inches="tight")
plt.close(fig)
def render_words_per_article(data, output):
labels = data.get("labels") or []
series = data.get("series") or []
title = data.get("title") or "Moyenne de mots par article (par mois)"
if not labels or not series:
fig, ax = new_axes()
ax.text(
0.5,
0.5,
"Aucune donnees",
ha="center",
va="center",
fontsize=BASE_FONT_SIZE,
)
fig.savefig(output, bbox_inches="tight")
plt.close(fig)
return
x = np.arange(len(labels))
n_series = len(series)
width = 0.8 / max(n_series, 1)
fig, ax = new_axes()
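# Grouped bars (not stacked): one bar per year, centred around each month tick.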
for index, serie in enumerate(series):
values = np.array(serie.get("values") or [0] * len(labels), dtype=float)
color = PALETTE[index % len(PALETTE)]
offset = (index - (n_series - 1) / 2) * width
ax.bar(x + offset, values, width=width, label=str(serie.get("label", "")), color=color, linewidth=0)
ax.set_xticks(x)
ax.set_xticklabels(labels, rotation=45, ha="right", fontsize=TICK_FONT_SIZE)
ax.tick_params(axis="y", labelsize=TICK_FONT_SIZE)
ax.set_ylabel("Mots par article (moyenne)")
ax.set_title(title, fontsize=TITLE_FONT_SIZE, color=TEXT)
ax.legend(fontsize=LEGEND_FONT_SIZE)
fig.tight_layout()
fig.savefig(output, bbox_inches="tight")
plt.close(fig)
def main():
parser = argparse.ArgumentParser(description="Render stats charts from JSON data.")
parser.add_argument(
"--type",
required=True,
choices=[
"articles_per_month",
"articles_per_year",
"articles_per_section",
"words_per_article",
"cumulative",
"words_histogram",
"top_requests",
"weather_hexbin",
"weekday_activity",
],
)
parser.add_argument("--output", required=True)
args = parser.parse_args()
try:
payload = json.load(sys.stdin)
except Exception as exc: # noqa: BLE001
print(f"Failed to read JSON from stdin: {exc}", file=sys.stderr)
sys.exit(1)
setup_rcparams()
chart_type = args.type
if chart_type == "articles_per_month":
render_articles_per_month(payload, args.output)
elif chart_type == "articles_per_year":
render_articles_per_year(payload, args.output)
elif chart_type == "articles_per_section":
render_articles_per_section(payload, args.output)
elif chart_type == "words_per_article":
render_words_per_article(payload, args.output)
elif chart_type == "cumulative":
render_cumulative(payload, args.output)
elif chart_type == "words_histogram":
render_words_histogram(payload, args.output)
elif chart_type == "top_requests":
render_top_requests(payload, args.output)
elif chart_type == "weather_hexbin":
render_weather_hexbin(payload, args.output)
elif chart_type == "weekday_activity":
render_weekday_activity(payload, args.output)
else:
print(f"Unknown chart type: {chart_type}", file=sys.stderr)
sys.exit(1)
if __name__ == "__main__":
main()
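How this renderer is driven, as a minimal sketch that is not part of the commit (it assumes the file above is saved as tools/stats/render_stats_charts.py, the module name the companion scripts import): the chart type and output path are passed as flags, the data as JSON on stdin.
import json
import subprocess

# Hypothetical payload for the "articles_per_year" chart; the labels/values/title
# keys mirror what render_articles_per_year() reads from stdin.
payload = {"labels": ["2023", "2024"], "values": [12, 31], "title": "Articles par an"}
subprocess.run(
    [
        "python3",
        "tools/stats/render_stats_charts.py",  # assumed path of the renderer shown above
        "--type", "articles_per_year",
        "--output", "/tmp/articles_per_year.png",
    ],
    input=json.dumps(payload).encode("utf-8"),
    check=True,
)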

107
tools/stats.json Normal file
View File

@@ -0,0 +1,107 @@
{
"config": {
"dataOutput": "content/stats/data/stats.json",
"defaultImageDir": "content/stats/images"
},
"sections": [
{
"title": "Habitudes d'écriture",
"statistics": [
{
"key": "most_prolific_month",
"title": "Mois le plus prolifique",
"type": "variable",
"script": "tools/stats/most_prolific_month.js"
},
{
"key": "weekday_activity",
"title": "Articles et mots par jour",
"type": "graphic",
"script": "tools/stats/weekday_activity.py",
"image": "content/stats/images/weekday_activity.png"
},
{
"key": "articles_avg_per_month",
"title": "Moyenne d'articles par mois",
"type": "variable",
"script": "tools/stats/articles_avg_per_month.js"
},
{
"key": "articles_per_month",
"title": "Articles par mois",
"type": "graphic",
"script": "tools/stats/articles_per_month.py",
"image": "content/stats/images/articles_per_month.png"
},
{
"key": "articles_per_year",
"title": "Articles par an",
"type": "graphic",
"script": "tools/stats/articles_per_year.py",
"image": "content/stats/images/articles_per_year.png"
},
{
"key": "cumulative_articles",
"title": "Cumul articles / mots",
"type": "graphic",
"script": "tools/stats/cumulative_articles.py",
"image": "content/stats/images/cumulative_articles.png"
},
{
"key": "articles_per_section",
"title": "Articles par section",
"type": "graphic",
"script": "tools/stats/articles_per_section.py",
"image": "content/stats/images/articles_per_section.png"
},
{
"key": "words_per_article",
"title": "Nombre de mots par article",
"type": "graphic",
"script": "tools/stats/words_per_article.py",
"image": "content/stats/images/words_per_article.png"
},
{
"key": "words_histogram",
"title": "Distribution des longueurs",
"type": "graphic",
"script": "tools/stats/words_histogram.py",
"image": "content/stats/images/words_histogram.png"
},
{
"key": "weather_hexbin",
"title": "Conditions météo à la publication",
"type": "graphic",
"script": "tools/stats/weather_hexbin.py",
"image": "content/stats/images/weather_hexbin.png"
}
]
},
{
"title": "Visites",
"statistics": [
{
"key": "pageviews_per_month",
"title": "Pages vues (mois courant)",
"type": "variable",
"script": "tools/stats/goaccess_monthly.js",
"metric": "hits"
},
{
"key": "unique_visitors_per_month_value",
"title": "Visiteurs uniques (mois courant)",
"type": "variable",
"script": "tools/stats/goaccess_monthly.js",
"metric": "visitors"
},
{
"key": "top_requests",
"title": "Top requêtes (30 jours)",
"type": "graphic",
"script": "tools/stats/top_requests.py",
"image": "content/stats/images/top_requests.png"
}
]
}
]
}

View File

@@ -0,0 +1,32 @@
#!/usr/bin/env node
const { loadArticles } = require("../lib/stats/articles");
function computeAveragePerMonth(articles) {
let first = null;
let last = null;
for (const article of articles) {
if (!article.date) continue;
if (!first || article.date < first) first = article.date;
if (!last || article.date > last) last = article.date;
}
if (!first || !last) {
return { average: 0, months: 0 };
}
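// Calendar months spanned, inclusive: e.g. first in 2023-11 and last in 2024-02 give 4 months.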
const monthSpan = Math.max(1, Math.round(last.startOf("month").diff(first.startOf("month"), "months").months) + 1);
const total = articles.filter((a) => a.date).length;
const average = total / monthSpan;
return { average, months: monthSpan };
}
async function run({ contentDir }) {
const articles = await loadArticles(contentDir || "content");
const { average, months } = computeAveragePerMonth(articles);
return { value: average.toFixed(2), meta: { months } };
}
module.exports = { run };

View File

@@ -0,0 +1,67 @@
#!/usr/bin/env node
const { loadArticles } = require("../lib/stats/articles");
const { renderWithPython } = require("../lib/stats/python");
const MONTH_LABELS = ["Jan", "Fev", "Mar", "Avr", "Mai", "Jun", "Jul", "Aou", "Sep", "Oct", "Nov", "Dec"];
function groupByMonthAndYear(articles) {
const counts = new Map();
const years = new Set();
let first = null;
let last = null;
for (const article of articles) {
if (!article.date) continue;
const year = article.date.year;
const month = article.date.month; // 1..12
years.add(year);
const key = `${year}-${month}`;
counts.set(key, (counts.get(key) || 0) + 1);
if (!first || article.date < first) first = article.date;
if (!last || article.date > last) last = article.date;
}
const monthNumbers = Array.from({ length: 12 }, (_, index) => index + 1);
const labels = monthNumbers.map((month) => MONTH_LABELS[month - 1]);
const sortedYears = Array.from(years).sort((a, b) => a - b);
const series = sortedYears.map((year) => {
return {
label: String(year),
values: monthNumbers.map((month) => counts.get(`${year}-${month}`) || 0),
};
});
return { labels, series, first, last };
}
async function run({ contentDir, outputPath, publicPath }) {
if (!outputPath) {
throw new Error("outputPath manquant pour articles_per_month");
}
const articles = await loadArticles(contentDir || "content");
const { labels, series, first, last } = groupByMonthAndYear(articles);
await renderWithPython({
type: "articles_per_month",
outputPath,
data: {
labels,
series,
title: "Articles par mois",
},
});
return {
image: publicPath,
meta: {
from: first ? first.toISODate() : null,
to: last ? last.toISODate() : null,
months: labels.length,
},
};
}
module.exports = { run };

View File

@@ -0,0 +1,81 @@
#!/usr/bin/env python3
import sys
import json
import os
from collections import defaultdict
CURRENT_DIR = os.path.dirname(os.path.abspath(__file__))
PARENT_DIR = os.path.abspath(os.path.join(CURRENT_DIR, os.pardir))
if CURRENT_DIR not in sys.path:
sys.path.append(CURRENT_DIR)
if PARENT_DIR not in sys.path:
sys.path.append(PARENT_DIR)
from common import load_articles, MONTH_LABELS, write_result # noqa: E402
def main():
try:
payload = json.load(sys.stdin)
except Exception as exc: # noqa: BLE001
print(f"Failed to read JSON: {exc}", file=sys.stderr)
sys.exit(1)
content_dir = payload.get("contentDir") or "content"
output_path = payload.get("outputPath")
public_path = payload.get("publicPath")
articles = load_articles(content_dir)
counts = defaultdict(int)
years = set()
first = None
last = None
for article in articles:
date = article.get("date")
if not date:
continue
year = date.year
month = date.month
years.add(year)
counts[(year, month)] += 1
if not first or date < first:
first = date
if not last or date > last:
last = date
month_numbers = list(range(1, 13))
labels = [MONTH_LABELS[m - 1] for m in month_numbers]
sorted_years = sorted(years)
series = []
for year in sorted_years:
values = [counts.get((year, m), 0) for m in month_numbers]
series.append({"label": str(year), "values": values})
# Render via shared renderer
try:
from render_stats_charts import render_articles_per_month, setup_rcparams
except ImportError as exc: # noqa: BLE001
print(f"Failed to import renderer: {exc}", file=sys.stderr)
sys.exit(1)
setup_rcparams()
render_articles_per_month({"labels": labels, "series": series, "title": "Articles par mois"}, output_path)
write_result(
{
"image": public_path,
"meta": {
"from": first.isoformat() if first else None,
"to": last.isoformat() if last else None,
"months": len(labels),
},
}
)
if __name__ == "__main__":
main()

View File

@@ -0,0 +1,55 @@
#!/usr/bin/env node
const { loadArticles } = require("../lib/stats/articles");
const { renderWithPython } = require("../lib/stats/python");
function groupBySection(articles) {
const counts = new Map();
for (const article of articles) {
const key = article.section || "root";
counts.set(key, (counts.get(key) || 0) + 1);
}
const entries = Array.from(counts.entries()).sort((a, b) => b[1] - a[1]);
return entries;
}
async function run({ contentDir, outputPath, publicPath }) {
if (!outputPath) {
throw new Error("outputPath manquant pour articles_per_section");
}
const articles = await loadArticles(contentDir || "content");
const entries = groupBySection(articles);
const maxSlices = 21;
const top = entries.slice(0, maxSlices);
const rest = entries.slice(maxSlices);
if (rest.length > 0) {
const restSum = rest.reduce((sum, [, value]) => sum + value, 0);
top.push(["Others", restSum]);
}
const labels = top.map(([key]) => key);
const values = top.map(([, value]) => value);
await renderWithPython({
type: "articles_per_section",
outputPath,
data: {
labels,
values,
title: "Articles par section",
},
});
return {
image: publicPath,
meta: {
sections: entries.length,
},
};
}
module.exports = { run };

View File

@@ -0,0 +1,69 @@
#!/usr/bin/env python3
import sys
import json
import os
from collections import defaultdict
CURRENT_DIR = os.path.dirname(os.path.abspath(__file__))
PARENT_DIR = os.path.abspath(os.path.join(CURRENT_DIR, os.pardir))
if CURRENT_DIR not in sys.path:
sys.path.append(CURRENT_DIR)
if PARENT_DIR not in sys.path:
sys.path.append(PARENT_DIR)
from common import load_articles, write_result # noqa: E402
def main():
try:
payload = json.load(sys.stdin)
except Exception as exc: # noqa: BLE001
print(f"Failed to read JSON: {exc}", file=sys.stderr)
sys.exit(1)
content_dir = payload.get("contentDir") or "content"
output_path = payload.get("outputPath")
public_path = payload.get("publicPath")
articles = load_articles(content_dir)
counts = defaultdict(int)
for article in articles:
section = article.get("section") or "root"
counts[section] += 1
entries = sorted(counts.items(), key=lambda item: item[1], reverse=True)
max_slices = 21
top = entries[:max_slices]
rest = entries[max_slices:]
if rest:
rest_sum = sum(v for _, v in rest)
top.append(("Others", rest_sum))
labels = [label for label, _ in top]
values = [value for _, value in top]
try:
from render_stats_charts import render_articles_per_section, setup_rcparams
except ImportError as exc: # noqa: BLE001
print(f"Failed to import renderer: {exc}", file=sys.stderr)
sys.exit(1)
setup_rcparams()
render_articles_per_section({"labels": labels, "values": values, "title": "Articles par section"}, output_path)
write_result(
{
"image": public_path,
"meta": {
"sections": len(entries),
},
}
)
if __name__ == "__main__":
main()

View File

@@ -0,0 +1,54 @@
#!/usr/bin/env node
const { loadArticles } = require("../lib/stats/articles");
const { renderWithPython } = require("../lib/stats/python");
function groupByYear(articles) {
const counts = new Map();
let first = null;
let last = null;
for (const article of articles) {
if (!article.date) continue;
const year = article.date.year;
counts.set(year, (counts.get(year) || 0) + 1);
if (!first || article.date < first) first = article.date;
if (!last || article.date > last) last = article.date;
}
const entries = Array.from(counts.entries()).sort((a, b) => a[0] - b[0]);
const labels = entries.map(([year]) => `${year}`);
const values = entries.map(([, value]) => value);
return { labels, values, first, last };
}
async function run({ contentDir, outputPath, publicPath }) {
if (!outputPath) {
throw new Error("outputPath manquant pour articles_per_year");
}
const articles = await loadArticles(contentDir || "content");
const { labels, values, first, last } = groupByYear(articles);
await renderWithPython({
type: "articles_per_year",
outputPath,
data: {
labels,
values,
title: "Articles par an",
},
});
return {
image: publicPath,
meta: {
from: first ? first.toISODate() : null,
to: last ? last.toISODate() : null,
years: labels.length,
},
};
}
module.exports = { run };

View File

@@ -0,0 +1,73 @@
#!/usr/bin/env python3
import sys
import json
import os
from collections import defaultdict
CURRENT_DIR = os.path.dirname(os.path.abspath(__file__))
PARENT_DIR = os.path.abspath(os.path.join(CURRENT_DIR, os.pardir))
if CURRENT_DIR not in sys.path:
sys.path.append(CURRENT_DIR)
if PARENT_DIR not in sys.path:
sys.path.append(PARENT_DIR)
from common import load_articles, write_result # noqa: E402
def main():
try:
payload = json.load(sys.stdin)
except Exception as exc: # noqa: BLE001
print(f"Failed to read JSON: {exc}", file=sys.stderr)
sys.exit(1)
content_dir = payload.get("contentDir") or "content"
output_path = payload.get("outputPath")
public_path = payload.get("publicPath")
articles = load_articles(content_dir)
counts = defaultdict(int)
first = None
last = None
for article in articles:
date = article.get("date")
if not date:
continue
year = date.year
counts[year] += 1
if not first or date < first:
first = date
if not last or date > last:
last = date
sorted_years = sorted(counts.keys())
labels = [str(y) for y in sorted_years]
values = [counts[y] for y in sorted_years]
try:
from render_stats_charts import render_articles_per_year, setup_rcparams
except ImportError as exc: # noqa: BLE001
print(f"Failed to import renderer: {exc}", file=sys.stderr)
sys.exit(1)
setup_rcparams()
render_articles_per_year({"labels": labels, "values": values, "title": "Articles par an"}, output_path)
write_result(
{
"image": public_path,
"meta": {
"from": first.isoformat() if first else None,
"to": last.isoformat() if last else None,
"years": len(labels),
},
}
)
if __name__ == "__main__":
main()

166
tools/stats/common.py Normal file
View File

@@ -0,0 +1,166 @@
#!/usr/bin/env python3
import os
import re
import json
import yaml
from datetime import datetime, date, timezone
MONTH_LABELS = ["Jan", "Fev", "Mar", "Avr", "Mai", "Jun", "Jul", "Aou", "Sep", "Oct", "Nov", "Dec"]
def find_markdown_files(root):
files = []
for dirpath, dirnames, filenames in os.walk(root):
for filename in filenames:
if not filename.lower().endswith(".md"):
continue
if filename == "_index.md":
continue
files.append(os.path.join(dirpath, filename))
return files
def collect_section_dirs(root):
section_dirs = set()
for dirpath, dirnames, filenames in os.walk(root):
if "_index.md" in filenames:
section_dirs.add(os.path.abspath(dirpath))
return section_dirs
def leaf_sections(section_dirs):
leaves = set()
for section in section_dirs:
is_leaf = True
for other in section_dirs:
if other == section:
continue
if other.startswith(section + os.sep):
is_leaf = False
break
if is_leaf:
leaves.add(section)
return leaves
def parse_frontmatter(path):
with open(path, "r", encoding="utf-8") as handle:
content = handle.read()
if content.startswith("---"):
parts = content.split("---", 2)
if len(parts) >= 3:
fm_text = parts[1]
body = parts[2]
else:
return {}, content
else:
return {}, content
try:
data = yaml.safe_load(fm_text) or {}
except Exception:
data = {}
return data, body
def parse_date(value):
if not value:
return None
dt = None
if isinstance(value, datetime):
dt = value
elif isinstance(value, date):
dt = datetime.combine(value, datetime.min.time())
elif isinstance(value, (int, float)):
try:
dt = datetime.fromtimestamp(value)
except Exception:
dt = None
elif isinstance(value, str):
# try ISO-like formats
for fmt in ("%Y-%m-%dT%H:%M:%S%z", "%Y-%m-%dT%H:%M:%S", "%Y-%m-%d", "%Y/%m/%d", "%d/%m/%Y"):
try:
dt = datetime.strptime(value, fmt)
break
except Exception:
continue
if dt is None:
try:
dt = datetime.fromisoformat(value)
except Exception:
dt = None
if dt is None:
return None
if dt.tzinfo is not None:
dt = dt.astimezone(timezone.utc).replace(tzinfo=None)
return dt
WORD_RE = re.compile(r"[\w'-]+", re.UNICODE)
def count_words(text):
if not text:
return 0
words = WORD_RE.findall(text)
return len(words)
def resolve_section(file_path, content_root, leaf_dirs):
content_root = os.path.abspath(content_root)
current = os.path.abspath(os.path.dirname(file_path))
best = None
while current.startswith(content_root):
if current in leaf_dirs:
best = current
break
parent = os.path.dirname(current)
if parent == current:
break
current = parent
if not best:
return None
rel = os.path.relpath(best, content_root)
return rel.replace(os.sep, "/") if rel != "." else "."
def load_articles(content_root):
files = find_markdown_files(content_root)
section_dirs = collect_section_dirs(content_root)
leaf_dirs = leaf_sections(section_dirs)
articles = []
for file_path in files:
fm, body = parse_frontmatter(file_path)
date = parse_date(fm.get("date"))
title = fm.get("title") or os.path.splitext(os.path.basename(file_path))[0]
word_count = count_words(body)
rel_path = os.path.relpath(file_path, content_root)
section = resolve_section(file_path, content_root, leaf_dirs)
weather = fm.get("weather") if isinstance(fm, dict) else None
articles.append(
{
"path": file_path,
"relativePath": rel_path,
"title": title,
"date": date,
"wordCount": word_count,
"section": section,
"weather": weather,
}
)
return articles
def write_result(data):
import sys
json.dump(data, sys.stdout)
sys.stdout.flush()
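A quick sanity check of these helpers, sketched outside the commit (it assumes the snippet runs with tools/stats/ on the path, for instance from that directory, and that the Hugo content lives in content/): load_articles() returns plain dicts with date, wordCount, section and weather keys.
from common import load_articles

articles = load_articles("content")
dated = [a for a in articles if a["date"]]
total_words = sum(a["wordCount"] for a in articles)
print(f"{len(articles)} articles, dont {len(dated)} datés, {total_words} mots")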

View File

@@ -0,0 +1,95 @@
#!/usr/bin/env python3
import sys
import json
import os
from collections import defaultdict
CURRENT_DIR = os.path.dirname(os.path.abspath(__file__))
PARENT_DIR = os.path.abspath(os.path.join(CURRENT_DIR, os.pardir))
if CURRENT_DIR not in sys.path:
sys.path.append(CURRENT_DIR)
if PARENT_DIR not in sys.path:
sys.path.append(PARENT_DIR)
from common import load_articles, write_result # noqa: E402
def month_key(dt):
return f"{dt.year}-{dt.month:02d}"
def main():
try:
payload = json.load(sys.stdin)
except Exception as exc: # noqa: BLE001
print(f"Failed to read JSON: {exc}", file=sys.stderr)
sys.exit(1)
content_dir = payload.get("contentDir") or "content"
output_path = payload.get("outputPath")
public_path = payload.get("publicPath")
articles = load_articles(content_dir)
monthly_articles = defaultdict(int)
monthly_words = defaultdict(int)
months_set = set()
for article in articles:
date = article.get("date")
if not date:
continue
key = month_key(date)
monthly_articles[key] += 1
monthly_words[key] += article.get("wordCount") or 0
months_set.add(key)
sorted_months = sorted(months_set)
cum_articles = []
cum_words = []
labels = []
total_a = 0
total_w = 0
for key in sorted_months:
total_a += monthly_articles.get(key, 0)
total_w += monthly_words.get(key, 0)
labels.append(key)
cum_articles.append(total_a)
cum_words.append(total_w)
try:
from render_stats_charts import render_cumulative, setup_rcparams
except ImportError as exc: # noqa: BLE001
print(f"Failed to import renderer: {exc}", file=sys.stderr)
sys.exit(1)
setup_rcparams()
render_cumulative(
{
"labels": labels,
"articles": cum_articles,
"words": cum_words,
"title": "Cumul articles / mots",
},
output_path,
)
write_result(
{
"image": public_path,
"meta": {
"months": len(labels),
"articles": total_a,
"words": total_w,
},
}
)
if __name__ == "__main__":
main()

View File

@@ -0,0 +1,42 @@
#!/usr/bin/env node
const { fetchGoAccessJson, aggregateLastNDays } = require("../lib/stats/goaccess");
const { loadToolsConfig } = require("../lib/config");
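// The GoAccess report is fetched once per run and memoised, so the hits and visitors variables share a single download.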
let cache = null;
async function loadData(url) {
if (!cache) {
cache = await fetchGoAccessJson(url);
}
return cache;
}
function latestMonthEntry(months) {
if (!months || months.length === 0) return null;
return months[months.length - 1];
}
async function run({ stat }) {
const toolsConfig = await loadToolsConfig();
const url = stat.url || toolsConfig.goaccess?.url || "";
const metric = stat.metric || "hits";
const windowDays = Number.isFinite(stat.days) ? stat.days : 30;
const data = await loadData(url);
const window = aggregateLastNDays(data, windowDays, { adjustCrawlers: true });
if (!window || !window.to) return { value: 0 };
const value = metric === "visitors" ? window.visitors : window.hits;
return {
value,
meta: {
from: window.from || null,
to: window.to || null,
days: windowDays,
metric,
raw: value,
},
};
}
module.exports = { run };

View File

@@ -0,0 +1,36 @@
#!/usr/bin/env node
const { DateTime } = require("luxon");
const { loadArticles } = require("../lib/stats/articles");
async function computeMostProlificMonth(contentDir) {
const articles = await loadArticles(contentDir);
const counts = new Map();
for (const article of articles) {
if (!article.date) continue;
const monthKey = article.date.toFormat("yyyy-MM");
counts.set(monthKey, (counts.get(monthKey) || 0) + 1);
}
if (counts.size === 0) {
return null;
}
const sorted = Array.from(counts.entries()).sort((a, b) => {
if (b[1] !== a[1]) return b[1] - a[1];
return a[0] < b[0] ? -1 : 1;
});
const [monthKey, count] = sorted[0];
const label = DateTime.fromISO(`${monthKey}-01`).setLocale("fr").toFormat("LLLL yyyy");
return { value: `${label} (${count})`, month: monthKey, count };
}
async function run({ contentDir }) {
const result = await computeMostProlificMonth(contentDir || "content");
return result || { value: null };
}
module.exports = { run };

107
tools/stats/top_requests.py Normal file
View File

@@ -0,0 +1,107 @@
#!/usr/bin/env python3
import sys
import json
import os
import urllib.request
CURRENT_DIR = os.path.dirname(os.path.abspath(__file__))
TOOLS_DIR = os.path.abspath(os.path.join(CURRENT_DIR, os.pardir))
ROOT_DIR = os.path.abspath(os.path.join(TOOLS_DIR, os.pardir))
if CURRENT_DIR not in sys.path:
sys.path.append(CURRENT_DIR)
if TOOLS_DIR not in sys.path:
sys.path.append(TOOLS_DIR)
def load_config():
cfg_path = os.path.join(ROOT_DIR, "tools", "config.json")
try:
with open(cfg_path, "r", encoding="utf-8") as handle:
return json.load(handle)
except Exception:
return {}
def fetch_goaccess(url, timeout=10):
with urllib.request.urlopen(url, timeout=timeout) as resp:
data = resp.read().decode("utf-8")
return json.loads(data)
def crawler_ratios(data):
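# Share of hits/visitors attributed to the "Crawlers" browser class in the GoAccess report;
# used by adjust() below to scale raw counts down to human traffic.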
browsers = (data.get("browsers") or {}).get("data") or []
crawler = next((entry for entry in browsers if entry.get("data") == "Crawlers"), None)
if not crawler:
return {"hits": 0.0, "visitors": 0.0}
def total(field):
return sum((entry.get(field, {}) or {}).get("count", 0) for entry in browsers)
total_hits = total("hits")
total_visitors = total("visitors")
return {
"hits": min(1.0, (crawler.get("hits", {}) or {}).get("count", 0) / total_hits) if total_hits else 0.0,
"visitors": min(1.0, (crawler.get("visitors", {}) or {}).get("count", 0) / total_visitors)
if total_visitors
else 0.0,
}
def adjust(value, ratio):
return max(0, round(value * (1 - ratio)))
def main():
try:
payload = json.load(sys.stdin)
except Exception as exc: # noqa: BLE001
print(f"Failed to read JSON: {exc}", file=sys.stderr)
sys.exit(1)
output_path = payload.get("outputPath")
public_path = payload.get("publicPath")
url = payload.get("stat", {}).get("url")
cfg = load_config()
goaccess_url = url or (cfg.get("goaccess") or {}).get("url") or ""
try:
data = fetch_goaccess(goaccess_url)
except Exception as exc: # noqa: BLE001
print(f"Failed to fetch GoAccess JSON: {exc}", file=sys.stderr)
sys.exit(1)
ratios = crawler_ratios(data)
reqs = (data.get("requests") or {}).get("data") or []
# each entry exposes "data" (the request path) plus hits.count and visitors.count
cleaned = []
for entry in reqs:
path = entry.get("data") or ""
hits = (entry.get("hits") or {}).get("count", 0)
if not path or hits <= 0:
continue
cleaned.append((path, adjust(hits, ratios["hits"])))
cleaned.sort(key=lambda item: item[1], reverse=True)
top = cleaned[:10]
labels = [item[0] for item in top]
values = [item[1] for item in top]
try:
from render_stats_charts import render_top_requests, setup_rcparams
except ImportError as exc: # noqa: BLE001
print(f"Failed to import renderer: {exc}", file=sys.stderr)
sys.exit(1)
setup_rcparams()
render_top_requests({"labels": labels, "values": values, "title": "Top 10 requêtes (hors crawlers)"}, output_path)
json.dump({"image": public_path}, sys.stdout)
sys.stdout.flush()
if __name__ == "__main__":
main()

View File

@@ -0,0 +1,43 @@
#!/usr/bin/env node
const fs = require("fs/promises");
const { renderChart, makeBarConfig } = require("../lib/stats/charts");
const { fetchGoAccessJson, groupVisitsByMonth } = require("../lib/stats/goaccess");
const { loadToolsConfig } = require("../lib/config");
async function run({ stat, outputPath, publicPath }) {
if (!outputPath) {
throw new Error("outputPath manquant pour unique_visitors_per_month");
}
const toolsConfig = await loadToolsConfig();
const url = stat.url || toolsConfig.goaccess?.url || "";
const data = await fetchGoAccessJson(url);
const months = groupVisitsByMonth(data, { adjustCrawlers: true });
const labels = months.map((entry) => entry.month);
const values = months.map((entry) => entry.visitors);
const buffer = await renderChart(
makeBarConfig({
labels,
data: values,
title: "Visiteurs uniques par mois (hors crawlers)",
color: "rgba(239, 68, 68, 0.8)",
}),
);
await fs.writeFile(outputPath, buffer);
const latest = months[months.length - 1];
return {
image: publicPath,
meta: {
month: latest?.month || null,
visitors: latest?.visitors || 0,
},
};
}
module.exports = { run };

View File

@@ -0,0 +1,78 @@
#!/usr/bin/env python3
import sys
import json
import os
CURRENT_DIR = os.path.dirname(os.path.abspath(__file__))
PARENT_DIR = os.path.abspath(os.path.join(CURRENT_DIR, os.pardir))
if CURRENT_DIR not in sys.path:
sys.path.append(CURRENT_DIR)
if PARENT_DIR not in sys.path:
sys.path.append(PARENT_DIR)
from common import load_articles, write_result # noqa: E402
def main():
try:
payload = json.load(sys.stdin)
except Exception as exc: # noqa: BLE001
print(f"Failed to read JSON: {exc}", file=sys.stderr)
sys.exit(1)
content_dir = payload.get("contentDir") or "content"
output_path = payload.get("outputPath")
public_path = payload.get("publicPath")
articles = load_articles(content_dir)
temps = []
hums = []
presses = []
for article in articles:
weather = article.get("weather") or {}
try:
t = float(weather.get("temperature"))
h = float(weather.get("humidity"))
except Exception:
continue
temps.append(t)
hums.append(h)
try:
p = float(weather.get("pressure"))
presses.append(p)
except Exception:
presses.append(None)
# Pressure is only usable as a colour scale when every point has one;
# otherwise drop it so the renderer falls back to the density hexbin.
if any(p is None for p in presses):
presses = []
try:
from render_stats_charts import render_weather_hexbin, setup_rcparams
except ImportError as exc: # noqa: BLE001
print(f"Failed to import renderer: {exc}", file=sys.stderr)
sys.exit(1)
setup_rcparams()
render_weather_hexbin(
{
"temps": temps,
"hums": hums,
"presses": presses if len(presses) == len(temps) else [],
"title": "Conditions météo à la publication",
},
output_path,
)
write_result({"image": public_path, "meta": {"points": len(temps)}})
if __name__ == "__main__":
main()

View File

@@ -0,0 +1,65 @@
#!/usr/bin/env python3
import sys
import json
import os
CURRENT_DIR = os.path.dirname(os.path.abspath(__file__))
PARENT_DIR = os.path.abspath(os.path.join(CURRENT_DIR, os.pardir))
if CURRENT_DIR not in sys.path:
sys.path.append(CURRENT_DIR)
if PARENT_DIR not in sys.path:
sys.path.append(PARENT_DIR)
from common import load_articles, write_result # noqa: E402
WEEKDAY_LABELS = ["Lun", "Mar", "Mer", "Jeu", "Ven", "Sam", "Dim"]
def main():
try:
payload = json.load(sys.stdin)
except Exception as exc: # noqa: BLE001
print(f"Failed to read JSON: {exc}", file=sys.stderr)
sys.exit(1)
content_dir = payload.get("contentDir") or "content"
output_path = payload.get("outputPath")
public_path = payload.get("publicPath")
articles = load_articles(content_dir)
counts = [0] * 7
words = [0] * 7
for article in articles:
dt = article.get("date")
if not dt:
continue
weekday = dt.weekday() # Monday=0
counts[weekday] += 1
words[weekday] += article.get("wordCount") or 0
try:
from render_stats_charts import render_weekday_activity, setup_rcparams
except ImportError as exc: # noqa: BLE001
print(f"Failed to import renderer: {exc}", file=sys.stderr)
sys.exit(1)
setup_rcparams()
render_weekday_activity(
{
"labels": WEEKDAY_LABELS,
"articles": counts,
"words": words,
"title": "Articles et mots par jour de semaine",
},
output_path,
)
write_result({"image": public_path})
if __name__ == "__main__":
main()

View File

@@ -0,0 +1,59 @@
#!/usr/bin/env python3
import sys
import json
import os
CURRENT_DIR = os.path.dirname(os.path.abspath(__file__))
PARENT_DIR = os.path.abspath(os.path.join(CURRENT_DIR, os.pardir))
if CURRENT_DIR not in sys.path:
sys.path.append(CURRENT_DIR)
if PARENT_DIR not in sys.path:
sys.path.append(PARENT_DIR)
from common import load_articles, write_result # noqa: E402
def main():
try:
payload = json.load(sys.stdin)
except Exception as exc: # noqa: BLE001
print(f"Failed to read JSON: {exc}", file=sys.stderr)
sys.exit(1)
content_dir = payload.get("contentDir") or "content"
output_path = payload.get("outputPath")
public_path = payload.get("publicPath")
articles = load_articles(content_dir)
values = [a.get("wordCount") or 0 for a in articles if a.get("wordCount")]
try:
from render_stats_charts import render_words_histogram, setup_rcparams
except ImportError as exc: # noqa: BLE001
print(f"Failed to import renderer: {exc}", file=sys.stderr)
sys.exit(1)
setup_rcparams()
render_words_histogram(
{
"values": values,
"title": "Distribution des longueurs d'article",
"bins": 20,
},
output_path,
)
write_result(
{
"image": public_path,
"meta": {
"articles": len(values),
},
}
)
if __name__ == "__main__":
main()

View File

@@ -0,0 +1,79 @@
#!/usr/bin/env node
const { loadArticles } = require("../lib/stats/articles");
const { renderWithPython } = require("../lib/stats/python");
const MONTH_LABELS = ["Jan", "Fev", "Mar", "Avr", "Mai", "Jun", "Jul", "Aou", "Sep", "Oct", "Nov", "Dec"];
function groupAverageWordsByMonth(articles) {
const buckets = new Map();
const years = new Set();
let totalWords = 0;
let totalArticles = 0;
for (const article of articles) {
if (!article.date) continue;
const year = article.date.year;
const month = article.date.month;
const key = `${year}-${month}`;
years.add(year);
const current = buckets.get(key) || { words: 0, count: 0 };
current.words += article.wordCount || 0;
current.count += 1;
buckets.set(key, current);
totalWords += article.wordCount || 0;
totalArticles += 1;
}
const monthNumbers = Array.from({ length: 12 }, (_, index) => index + 1);
const labels = monthNumbers.map((month) => MONTH_LABELS[month - 1]);
const sortedYears = Array.from(years).sort((a, b) => a - b);
const series = sortedYears.map((year) => {
const values = monthNumbers.map((month) => {
const entry = buckets.get(`${year}-${month}`);
if (!entry || entry.count === 0) return 0;
return Math.round(entry.words / entry.count);
});
return {
label: String(year),
values,
};
});
const average = totalArticles > 0 ? Math.round(totalWords / totalArticles) : 0;
return { labels, series, average, articles: totalArticles };
}
async function run({ contentDir, outputPath, publicPath }) {
if (!outputPath) {
throw new Error("outputPath manquant pour le graphique words_per_article");
}
const articles = await loadArticles(contentDir || "content");
const { labels, series, average, articles: totalArticles } = groupAverageWordsByMonth(articles);
await renderWithPython({
type: "words_per_article",
outputPath,
data: {
labels,
series,
title: "Moyenne de mots par article (par mois)",
},
});
return {
image: publicPath,
meta: {
average,
articles: totalArticles,
},
};
}
module.exports = { run };

View File

@@ -0,0 +1,88 @@
#!/usr/bin/env python3
import sys
import json
import os
from collections import defaultdict
CURRENT_DIR = os.path.dirname(os.path.abspath(__file__))
PARENT_DIR = os.path.abspath(os.path.join(CURRENT_DIR, os.pardir))
if CURRENT_DIR not in sys.path:
sys.path.append(CURRENT_DIR)
if PARENT_DIR not in sys.path:
sys.path.append(PARENT_DIR)
from common import load_articles, MONTH_LABELS, write_result # noqa: E402
def main():
try:
payload = json.load(sys.stdin)
except Exception as exc: # noqa: BLE001
print(f"Failed to read JSON: {exc}", file=sys.stderr)
sys.exit(1)
content_dir = payload.get("contentDir") or "content"
output_path = payload.get("outputPath")
public_path = payload.get("publicPath")
articles = load_articles(content_dir)
buckets = defaultdict(lambda: {"words": 0, "count": 0})
years = set()
total_words = 0
total_articles = 0
for article in articles:
date = article.get("date")
if not date:
continue
year = date.year
month = date.month
key = (year, month)
years.add(year)
buckets[key]["words"] += article.get("wordCount") or 0
buckets[key]["count"] += 1
total_words += article.get("wordCount") or 0
total_articles += 1
month_numbers = list(range(1, 13))
labels = [MONTH_LABELS[m - 1] for m in month_numbers]
sorted_years = sorted(years)
series = []
for year in sorted_years:
values = []
for month in month_numbers:
entry = buckets.get((year, month))
if not entry or entry["count"] == 0:
values.append(0)
else:
values.append(round(entry["words"] / entry["count"]))
series.append({"label": str(year), "values": values})
average = round(total_words / total_articles) if total_articles > 0 else 0
try:
from render_stats_charts import render_words_per_article, setup_rcparams
except ImportError as exc: # noqa: BLE001
print(f"Failed to import renderer: {exc}", file=sys.stderr)
sys.exit(1)
setup_rcparams()
render_words_per_article({"labels": labels, "series": series, "title": "Moyenne de mots par article (par mois)"}, output_path)
write_result(
{
"image": public_path,
"meta": {
"average": average,
"articles": total_articles,
},
}
)
if __name__ == "__main__":
main()