2026-03-10-knowledge-base-migration
Knowledge Base Migration (ReadMe → Docusaurus) Implementation Plan
For agentic workers: REQUIRED: Use superpowers:subagent-driven-development (if subagents available) or superpowers:executing-plans to implement this plan. Steps use checkbox (`- [ ]`) syntax for tracking.
Goal: Migrate Vianova's ~108 docs + 6 recipes from ReadMe.io to a self-hosted Docusaurus 3 site with Crowdin i18n and AWS S3/CloudFront deployment.
Architecture: Docusaurus 3 static site with custom Vianova purple theme, Crowdin for translation (24+ EU languages), S3+CloudFront for hosting. A Node.js migration script converts ReadMe markdown to Docusaurus format, downloads images from ReadMe CDN, and generates sidebar config from _order.yaml files.
Tech Stack: Docusaurus 3, React, Node.js (migration script), Crowdin CLI, AWS CLI (S3/CloudFront), GitHub Actions (CI/CD)
Note: The reference/ directory (167 auto-generated API endpoint files from hawkeye.json) is NOT migrated — API reference stays at api.vianova.dev/docs.
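Based on the paths used throughout this plan, the intended repository layout is roughly as follows (directory names are taken from the steps below; the tree is illustrative, not prescriptive):

```
knowledge-base/                 # repo root (current ReadMe content)
├── docs/                       # source docs (~108 files)
├── recipes/                    # source recipes (6 files)
├── scripts/
│   └── migrate.mjs             # migration script (Chunk 2)
└── docusaurus-site/            # new Docusaurus 3 project (Chunk 1)
    ├── docs/                   # migrated docs
    ├── recipes/                # migrated recipes
    ├── i18n/                   # translations (Chunk 4)
    └── static/img/docs/        # downloaded images
```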
Chunk 1: Docusaurus Project Scaffold & Vianova Theme
Task 1: Initialize Docusaurus project
Files:
- Create: `docusaurus-site/` (new project root, sibling to current knowledge-base content)

The Docusaurus site will be initialized as a new directory structure. The current knowledge-base content stays in place until migration is complete.
- Step 1: Create Docusaurus project
cd /Users/fred/Projects/Vianova/plateform/knowledge-base
npx create-docusaurus@latest docusaurus-site classic --typescript

Expected: Docusaurus project scaffolded with default content.
- Step 2: Verify it runs locally
cd docusaurus-site
npm start

Expected: Dev server at http://localhost:3000 showing default Docusaurus site.
- Step 3: Clean default content
Remove default docs, blog, and src/pages content that ships with the template:
rm -rf docs/
rm -rf blog/
rm -rf src/pages/index.module.css

- Step 4: Commit
git add docusaurus-site/
git commit -m "feat: scaffold Docusaurus 3 project"

Task 2: Configure Vianova branding & theme
Files:
- Modify: `docusaurus-site/docusaurus.config.ts`
- Modify: `docusaurus-site/src/css/custom.css`
- Create: `docusaurus-site/static/img/vianova-logo.png` (download from current site)

- Step 1: Download Vianova logo

curl -o docusaurus-site/static/img/vianova-logo.png "https://help.vianova.io/img/small-Vianova_logo_B_copy.png"

If the URL doesn't work, extract the logo URL from the current site and download it.
- Step 2: Configure `docusaurus.config.ts`

Replace the config with Vianova-specific settings:
import {themes as prismThemes} from 'prism-react-renderer';
import type {Config} from '@docusaurus/types';
import type * as Preset from '@docusaurus/preset-classic';
const config: Config = {
title: 'Vianova Help Center',
tagline: 'Mobility data intelligence documentation',
favicon: 'img/favicon.ico',
url: 'https://help.vianova.io',
baseUrl: '/',
onBrokenLinks: 'throw',
onBrokenMarkdownLinks: 'throw',
i18n: {
defaultLocale: 'en',
locales: ['en'],
// Other locales added after Crowdin setup
},
presets: [
[
'classic',
{
docs: {
sidebarPath: './sidebars.ts',
routeBasePath: 'docs',
},
blog: {
showReadingTime: true,
routeBasePath: 'changelog',
blogTitle: 'Changelog',
blogDescription: 'Vianova platform changelog',
blogSidebarTitle: 'Recent changes',
},
theme: {
customCss: './src/css/custom.css',
},
} satisfies Preset.Options,
],
],
plugins: [
[
'@docusaurus/plugin-content-docs',
{
id: 'recipes',
path: 'recipes',
routeBasePath: 'recipes',
sidebarPath: './sidebarsRecipes.ts',
},
],
],
themeConfig: {
navbar: {
logo: {
alt: 'Vianova',
src: 'img/vianova-logo.png',
},
items: [
{type: 'docSidebar', sidebarId: 'docs', position: 'left', label: 'Docs'},
{to: '/recipes', label: 'Recipes', position: 'left'},
{href: 'https://api.vianova.dev/docs', label: 'API Reference', position: 'left'},
{to: '/changelog', label: 'Changelog', position: 'left'},
{to: '/roadmap', label: 'Roadmap', position: 'left'},
{type: 'localeDropdown', position: 'right'},
],
},
footer: {
style: 'dark',
links: [
{
title: 'Documentation',
items: [
{label: 'Getting Started', to: '/docs/getting-started-with-vianova'},
{label: 'API Reference', href: 'https://api.vianova.dev/docs'},
],
},
{
title: 'Company',
items: [
{label: 'Vianova', href: 'https://www.vianova.io'},
],
},
],
copyright: `Copyright © ${new Date().getFullYear()} Vianova.`,
},
prism: {
theme: prismThemes.github,
darkTheme: prismThemes.dracula,
additionalLanguages: ['python', 'json', 'bash'],
},
} satisfies Preset.ThemeConfig,
};
export default config;

- Step 3: Configure custom CSS with Vianova purple palette
Replace docusaurus-site/src/css/custom.css:
:root {
--ifm-color-primary: #6B0070;
--ifm-color-primary-dark: #5a005e;
--ifm-color-primary-darker: #530058;
--ifm-color-primary-darkest: #440048;
--ifm-color-primary-light: #7c0082;
--ifm-color-primary-lighter: #830088;
--ifm-color-primary-lightest: #940099;
--ifm-link-color: #6B0070;
--ifm-navbar-background-color: #ffffff;
--ifm-footer-background-color: #220023;
--ifm-footer-color: #ffffff;
}
[data-theme='dark'] {
--ifm-color-primary: #c56bca;
--ifm-color-primary-dark: #bb4fc1;
--ifm-color-primary-darker: #b641bc;
--ifm-color-primary-darkest: #96349b;
--ifm-color-primary-light: #cf87d3;
--ifm-color-primary-lighter: #d495d8;
--ifm-color-primary-lightest: #e3b7e6;
--ifm-footer-background-color: #1a001b;
}
.navbar__logo img {
height: 32px;
}

- Step 4: Create recipes sidebar
Create docusaurus-site/sidebarsRecipes.ts:
import type {SidebarsConfig} from '@docusaurus/plugin-content-docs';
const sidebars: SidebarsConfig = {
recipes: [{type: 'autogenerated', dirName: '.'}],
};
export default sidebars;

- Step 5: Verify branding locally
cd docusaurus-site
npm start

Expected: Purple-themed Docusaurus site with Vianova logo and correct navbar items.
- Step 6: Commit
git add docusaurus-site/
git commit -m "feat: configure Vianova branding and theme"

Task 3: Create landing page
Files:
- Modify: `docusaurus-site/src/pages/index.tsx`

- Step 1: Create a simple landing page

Replace `docusaurus-site/src/pages/index.tsx`:
import React from 'react';
import Layout from '@theme/Layout';
import Link from '@docusaurus/Link';
const categories = [
{title: 'Getting Started', description: 'Set up your account and learn the basics.', link: '/docs/getting-started-with-vianova'},
{title: 'VIP Platform', description: 'Vianova Intelligent Platform guides.', link: '/docs/vip'},
{title: 'Analyzing Data', description: 'Analyze mobility data with Cityscope.', link: '/docs/analyzing-mobility-data'},
{title: 'Policy Management', description: 'Create and manage mobility policies.', link: '/docs/managing-and-implementing-policy'},
{title: 'Recipes', description: 'Step-by-step API tutorials.', link: '/recipes'},
{title: 'API Reference', description: 'Full API documentation.', link: 'https://api.vianova.dev/docs'},
];
export default function Home(): React.JSX.Element {
return (
<Layout title="Help Center" description="Vianova Help Center">
<main style={{padding: '4rem 2rem', maxWidth: '1200px', margin: '0 auto'}}>
<div style={{textAlign: 'center', marginBottom: '3rem'}}>
<h1>Vianova Help Center</h1>
<p style={{fontSize: '1.2rem', color: 'var(--ifm-color-secondary-darkest)'}}>
Everything you need to get started with Vianova
</p>
</div>
<div style={{display: 'grid', gridTemplateColumns: 'repeat(auto-fit, minmax(300px, 1fr))', gap: '1.5rem'}}>
{categories.map((cat) => (
<Link key={cat.title} to={cat.link} style={{textDecoration: 'none', color: 'inherit'}}>
<div style={{
border: '1px solid var(--ifm-color-emphasis-300)',
borderRadius: '8px',
padding: '1.5rem',
height: '100%',
transition: 'border-color 0.2s',
}}>
<h3>{cat.title}</h3>
<p>{cat.description}</p>
</div>
</Link>
))}
</div>
</main>
</Layout>
);
}

- Step 2: Verify landing page
cd docusaurus-site
npm start

Expected: Landing page with category cards and Vianova branding.
- Step 3: Commit
git add docusaurus-site/src/pages/index.tsx
git commit -m "feat: add landing page with category cards"

Chunk 2: Migration Script
Task 4: Write migration script — docs conversion
Files:
- Create: `scripts/migrate.mjs`
This Node.js script converts ReadMe markdown files to Docusaurus format. It handles:
- Frontmatter transformation (strip ReadMe fields, keep title/description)
- `<Image>` component → standard markdown image (both self-closing and with children/captions)
- Image download from ReadMe CDN to local `static/img/` (preserving original filenames)
- `_order.yaml` → `_category_.json` + `sidebar_position` in frontmatter (fuzzy slug matching)
- Recipe `<!-- python@X-Y -->` → code block line highlights
- `<Embed>` components → iframe or link
- `(doc:slug)` ReadMe internal links → relative Docusaurus links
- Absolute `help.vianova.io` links → relative paths
- `dash.readme.com` editorial links → removed with warning
- Exclusion of `superpowers/` internal directory
- Detection of `index.md` in categories to avoid conflicting `_category_.json` link types
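As a concrete illustration, the internal-link rewrites in the list above behave like this standalone sketch (it mirrors the regexes used in the script below):

```javascript
// Rewrite ReadMe-style internal links to Docusaurus-relative paths.
// Mirrors transforms 5 and 6 in scripts/migrate.mjs.
function rewriteLinks(md) {
  return md
    // [text](doc:slug) -> [text](/docs/slug)
    .replace(/\]\(doc:([a-z0-9-]+)\)/g, '](/docs/$1)')
    // absolute help.vianova.io links -> relative paths
    .replace(/https:\/\/help\.vianova\.io\/(docs|recipes|changelog)\//g, '/$1/');
}

const out = rewriteLinks(
  'See [Zones](doc:zones) and https://help.vianova.io/docs/getting-started.'
);
console.log(out); // "See [Zones](/docs/zones) and /docs/getting-started."
```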
- Step 1: Create migration script
Create scripts/migrate.mjs:
import fs from 'fs/promises';
import path from 'path';
import https from 'https';
import http from 'http';
import { fileURLToPath } from 'url';
import { parse as parseYaml } from 'yaml';
const __dirname = path.dirname(fileURLToPath(import.meta.url));
const KB_ROOT = path.resolve(__dirname, '..');
const DOCUSAURUS_ROOT = path.join(KB_ROOT, 'docusaurus-site');
const DOCS_SRC = path.join(KB_ROOT, 'docs');
const RECIPES_SRC = path.join(KB_ROOT, 'recipes');
const DOCS_DEST = path.join(DOCUSAURUS_ROOT, 'docs');
const RECIPES_DEST = path.join(DOCUSAURUS_ROOT, 'recipes');
const IMG_DEST = path.join(DOCUSAURUS_ROOT, 'static', 'img', 'docs');
// Directories to exclude from migration
const EXCLUDED_DIRS = ['superpowers'];
let imageCounter = 0;
const imageMap = new Map(); // URL -> local path
const warnings = []; // Collect warnings for summary
// --- Helpers ---
async function downloadImage(url) {
if (imageMap.has(url)) return imageMap.get(url);
// Preserve original filename for debuggability
const urlPath = new URL(url).pathname;
const originalName = path.basename(urlPath).split('?')[0];
const ext = path.extname(originalName) || '.png';
const baseName = originalName.replace(ext, '').replace(/[^a-zA-Z0-9_-]/g, '_');
const filename = `${baseName}-${++imageCounter}${ext}`;
const localPath = path.join(IMG_DEST, filename);
const relativePath = `/img/docs/${filename}`;
try {
const data = await fetchBuffer(url);
await fs.mkdir(path.dirname(localPath), { recursive: true });
await fs.writeFile(localPath, data);
imageMap.set(url, relativePath);
console.log(` Downloaded: ${filename}`);
return relativePath;
} catch (err) {
const msg = `Failed to download ${url}: ${err.message}`;
console.warn(` WARN: ${msg}`);
warnings.push(msg);
imageMap.set(url, url); // Keep original URL as fallback
return url;
}
}
function fetchBuffer(url) {
return new Promise((resolve, reject) => {
const client = url.startsWith('https') ? https : http;
client.get(url, { headers: { 'User-Agent': 'Vianova-Migration/1.0' } }, (res) => {
if (res.statusCode >= 300 && res.statusCode < 400 && res.headers.location) {
return fetchBuffer(res.headers.location).then(resolve).catch(reject);
}
if (res.statusCode !== 200) {
return reject(new Error(`HTTP ${res.statusCode}`));
}
const chunks = [];
res.on('data', (chunk) => chunks.push(chunk));
res.on('end', () => resolve(Buffer.concat(chunks)));
res.on('error', reject);
}).on('error', reject);
});
}
function slugify(name) {
return name.toLowerCase().replace(/\s+/g, '-').replace(/[^a-z0-9-]/g, '');
}
// --- Frontmatter ---
function parseReadMeFrontmatter(content) {
const match = content.match(/^---\n([\s\S]*?)\n---\n([\s\S]*)$/);
if (!match) return { frontmatter: {}, body: content };
const yamlStr = match[1];
let frontmatter;
try {
frontmatter = parseYaml(yamlStr);
} catch {
frontmatter = {};
}
return { frontmatter, body: match[2] };
}
function buildDocusaurusFrontmatter(fm, sidebarPosition) {
const result = { title: fm.title || 'Untitled' };
if (fm.excerpt) result.description = fm.excerpt;
if (fm.description) result.description = fm.description;
if (sidebarPosition !== undefined) result.sidebar_position = sidebarPosition;
if (fm.hidden === true) result.draft = true;
// Convert recipe tags if present
if (fm.recipe) {
result.tags = ['recipe'];
}
const lines = ['---'];
for (const [key, value] of Object.entries(result)) {
if (Array.isArray(value)) {
lines.push(`${key}: [${value.map(v => `"${v}"`).join(', ')}]`);
} else if (typeof value === 'string') {
lines.push(`${key}: "${value.replace(/"/g, '\\"')}"`);
} else {
lines.push(`${key}: ${value}`);
}
}
lines.push('---');
return lines.join('\n');
}
// --- Content transforms ---
async function transformBody(body, srcFile = '') {
let result = body;
// 1. Convert <Image ...>caption</Image> (multiline with children)
const imageWithChildrenRegex = /<Image[^>]*src=["']([^"']+)["'][^>]*(?:alt=["']([^"']*?)["'])?[^>]*>([\s\S]*?)<\/Image>/gi;
const imageChildMatches = [...result.matchAll(imageWithChildrenRegex)];
for (const match of imageChildMatches) {
const url = match[1];
const alt = match[2] || match[3]?.trim() || '';
const localPath = await downloadImage(url);
result = result.replace(match[0], `![${alt}](${localPath})`);
}
// 2. Convert self-closing <Image ... /> components
const imageComponentRegex = /<Image[^>]*src=["']([^"']+)["'][^>]*(?:alt=["']([^"']*?)["'])?[^>]*\/>/gi;
const imageMatches = [...result.matchAll(imageComponentRegex)];
for (const match of imageMatches) {
const url = match[1];
const alt = match[2] || '';
const localPath = await downloadImage(url);
result = result.replace(match[0], `![${alt}](${localPath})`);
}
// 3. Convert markdown images with ReadMe CDN URLs
const mdImageRegex = /!\[([^\]]*)\]\((https:\/\/files\.readme\.io\/[^)]+)\)/g;
const mdImageMatches = [...result.matchAll(mdImageRegex)];
for (const match of mdImageMatches) {
const alt = match[1];
const url = match[2];
const localPath = await downloadImage(url);
result = result.replace(match[0], `![${alt}](${localPath})`);
}
// 4. Convert <Embed> components to iframes or links
const embedRegex = /<Embed\s+url=["']([^"']+)["'][^>]*(?:title=["']([^"']*?)["'])?[^>]*\/?>/gi;
result = result.replace(embedRegex, (match, url, title) => {
if (url.includes('loom.com') || url.includes('youtube.com') || url.includes('vimeo.com')) {
// Convert Loom/YouTube/Vimeo to iframe
const embedUrl = url.includes('loom.com') ? url.replace('/share/', '/embed/') : url;
return `<iframe src="${embedUrl}" width="100%" height="400" frameBorder="0" allowFullScreen title="${title || 'Video'}"></iframe>`;
}
return `[${title || url}](${url})`;
});
// 5. Convert ReadMe (doc:slug) internal links to relative links
result = result.replace(/\(doc:([a-z0-9-]+)\)/g, (match, slug) => {
return `(/docs/${slug})`;
});
// Also handle [text](doc:slug) pattern
result = result.replace(/\]\(doc:([a-z0-9-]+)\)/g, (match, slug) => {
return `](/docs/${slug})`;
});
// 6. Convert absolute help.vianova.io links to relative
result = result.replace(/https:\/\/help\.vianova\.io\/(docs|recipes|changelog)\//g, '/$1/');
result = result.replace(/https:\/\/help\.vianova\.io\/(docs|recipes|changelog)/g, '/$1');
// 7. Remove/warn about dash.readme.com editorial links
const dashLinks = [...result.matchAll(/https:\/\/dash\.readme\.com[^\s)]+/g)];
if (dashLinks.length > 0) {
for (const match of dashLinks) {
warnings.push(`${srcFile}: Found ReadMe dashboard link: ${match[0]}`);
}
result = result.replace(/\[([^\]]*)\]\(https:\/\/dash\.readme\.com[^)]+\)/g, '$1');
}
// 8. Convert ReadMe callout blocks to Docusaurus admonitions
// Handle both > emoji **Title**\n> body and standalone > emoji patterns
// Use a function to handle multi-line callouts properly
const calloutMap = {
'\u{1F4D8}': 'info', // 📘
'\u{1F44D}': 'tip', // 👍
'\u2757': 'danger', // ❗
'\u{FE0F}': '', // variation selector (ignore)
'\u26A0': 'warning', // ⚠️
'\u{1F6A7}': 'caution', // 🚧
};
// Match callout blocks: > emoji **Title** followed by > continuation lines
result = result.replace(
/^(> (?:📘|👍|❗️?|⚠️|🚧)\s*\*\*([^*]+)\*\*.*)\n((?:^>.*\n?)*)/gm,
(match, firstLine, title, bodyLines) => {
let type = 'info';
if (firstLine.includes('📘')) type = 'info';
else if (firstLine.includes('👍')) type = 'tip';
else if (firstLine.includes('❗')) type = 'danger';
else if (firstLine.includes('⚠')) type = 'warning';
else if (firstLine.includes('🚧')) type = 'caution';
// Strip the > prefix from body lines
const body = bodyLines
.split('\n')
.map(line => line.replace(/^>\s?/, ''))
.join('\n')
.trim();
return `:::${type}[${title}]\n${body}\n:::\n`;
}
);
return result;
}
function transformRecipeBody(body) {
let result = body;
result = result.replace(/<!--\s*(\w+)@(\d+(?:-\d+)?)\s*-->/g,
(match, lang, lines) => `<!-- Code reference: ${lang} lines ${lines} -->`);
return result;
}
// --- Order / sidebar ---
async function readOrder(dir) {
const orderFile = path.join(dir, '_order.yaml');
try {
const content = await fs.readFile(orderFile, 'utf-8');
return parseYaml(content) || [];
} catch {
return [];
}
}
// Find position by comparing slugified versions of both the entry name and order entries
function findPosition(name, order) {
const nameSlug = slugify(name);
for (let i = 0; i < order.length; i++) {
if (slugify(String(order[i])) === nameSlug || String(order[i]) === name) {
return i + 1;
}
}
return undefined;
}
async function createCategoryJson(destDir, label, position, hasIndexMd) {
const categoryJson = {
label,
position,
};
// Only set generated-index if there is no index.md (which serves as the landing page)
if (!hasIndexMd) {
categoryJson.link = { type: 'generated-index' };
}
await fs.writeFile(
path.join(destDir, '_category_.json'),
JSON.stringify(categoryJson, null, 2)
);
}
// --- Main migration ---
async function migrateDir(srcDir, destDir, parentOrder, depth = 0) {
const entries = await fs.readdir(srcDir, { withFileTypes: true });
const order = await readOrder(srcDir);
for (const entry of entries) {
if (entry.name === '_order.yaml' || entry.name === '.DS_Store') continue;
// Skip excluded directories
if (entry.isDirectory() && EXCLUDED_DIRS.includes(entry.name.toLowerCase())) {
console.log(` Skipping excluded directory: ${entry.name}`);
continue;
}
const srcPath = path.join(srcDir, entry.name);
const destName = entry.isDirectory() ? slugify(entry.name) : entry.name;
const destPath = path.join(destDir, destName);
if (entry.isDirectory()) {
await fs.mkdir(destPath, { recursive: true });
const position = findPosition(entry.name, order);
// Check if this directory has an index.md
const dirEntries = await fs.readdir(srcPath);
const hasIndexMd = dirEntries.includes('index.md');
await createCategoryJson(destPath, entry.name, position, hasIndexMd);
await migrateDir(srcPath, destPath, order, depth + 1);
} else if (entry.name.endsWith('.md')) {
const content = await fs.readFile(srcPath, 'utf-8');
const { frontmatter, body } = parseReadMeFrontmatter(content);
const slug = entry.name.replace('.md', '');
const position = findPosition(slug, order);
const newFrontmatter = buildDocusaurusFrontmatter(frontmatter, position);
const relSrc = srcPath.replace(KB_ROOT, '');
const newBody = await transformBody(body, relSrc);
await fs.mkdir(path.dirname(destPath), { recursive: true });
await fs.writeFile(destPath, `${newFrontmatter}\n${newBody}`);
console.log(`Migrated: ${relSrc}`);
}
}
}
async function migrateRecipes(srcDir, destDir) {
const entries = await fs.readdir(srcDir, { withFileTypes: true });
const order = await readOrder(srcDir);
for (const entry of entries) {
if (entry.name === '_order.yaml' || entry.name === '.DS_Store') continue;
if (!entry.name.endsWith('.md')) continue;
const srcPath = path.join(srcDir, entry.name);
const destPath = path.join(destDir, entry.name);
const content = await fs.readFile(srcPath, 'utf-8');
const { frontmatter, body } = parseReadMeFrontmatter(content);
const slug = entry.name.replace('.md', '');
const position = findPosition(slug, order);
const newFrontmatter = buildDocusaurusFrontmatter(frontmatter, position);
const relSrc = srcPath.replace(KB_ROOT, '');
const newBody = transformRecipeBody(await transformBody(body, relSrc));
await fs.mkdir(destDir, { recursive: true });
await fs.writeFile(destPath, `${newFrontmatter}\n${newBody}`);
console.log(`Migrated recipe: ${entry.name}`);
}
}
// --- Entry point ---
async function main() {
console.log('Starting migration...\n');
// Create destination dirs
await fs.mkdir(DOCS_DEST, { recursive: true });
await fs.mkdir(RECIPES_DEST, { recursive: true });
await fs.mkdir(IMG_DEST, { recursive: true });
// Migrate docs (excludes superpowers/ and reference/)
console.log('--- Migrating docs ---');
await migrateDir(DOCS_SRC, DOCS_DEST, []);
// Migrate recipes
console.log('\n--- Migrating recipes ---');
await migrateRecipes(RECIPES_SRC, RECIPES_DEST);
// Summary
console.log(`\n--- Done ---`);
console.log(`Images downloaded: ${imageCounter}`);
console.log(`Image map entries: ${imageMap.size}`);
if (warnings.length > 0) {
console.log(`\n--- Warnings (${warnings.length}) ---`);
warnings.forEach(w => console.log(` ! ${w}`));
}
}
main().catch(console.error);

- Step 2: Install yaml dependency at repo root
The script runs from the repo root, so the dependency must be available there:
cd /Users/fred/Projects/Vianova/plateform/knowledge-base
npm init -y
npm install yaml

This creates a minimal `package.json` at the repo root for the migration script.
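As a dependency-free illustration of the frontmatter handling in `migrate.mjs`, the split step alone looks like this (YAML parsing omitted so the sketch runs standalone):

```javascript
// Split a markdown file into raw frontmatter and body,
// using the same regex as parseReadMeFrontmatter in migrate.mjs.
function splitFrontmatter(content) {
  const match = content.match(/^---\n([\s\S]*?)\n---\n([\s\S]*)$/);
  if (!match) return { rawYaml: '', body: content };
  return { rawYaml: match[1], body: match[2] };
}

const doc = '---\ntitle: Zones\nhidden: true\n---\n# Zones\nBody text.';
const { rawYaml, body } = splitFrontmatter(doc);
console.log(rawYaml); // "title: Zones\nhidden: true"
console.log(body);    // "# Zones\nBody text."
```

The real script then feeds the raw YAML to the `yaml` package installed above.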
- Step 3: Commit migration script
git add scripts/migrate.mjs package.json package-lock.json
git commit -m "feat: add ReadMe to Docusaurus migration script"

Task 5: Run migration and verify
- Step 1: Run the migration
cd /Users/fred/Projects/Vianova/plateform/knowledge-base
node scripts/migrate.mjs

Expected: All ~107 docs and 6 recipes migrated, images downloaded to `docusaurus-site/static/img/docs/`.
- Step 2: Generate sidebars from migrated content
Update docusaurus-site/sidebars.ts to use autogenerated sidebars:
import type {SidebarsConfig} from '@docusaurus/plugin-content-docs';
const sidebars: SidebarsConfig = {
docs: [{type: 'autogenerated', dirName: '.'}],
};
export default sidebars;

- Step 3: Start dev server and verify
cd docusaurus-site
npm start

Expected: All docs render, images load, sidebar structure matches original.
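Beyond spot-checking pages in the browser, leftover ReadMe-hosted URLs can be caught programmatically. This is a hypothetical helper (not one of the plan's scripts) showing the idea on a string of migrated markdown:

```javascript
// Find ReadMe-hosted URLs that the migration should have rewritten or removed.
function findLeftoverReadmeUrls(markdown) {
  const pattern = /https:\/\/(?:files\.readme\.io|dash\.readme\.com)[^\s)]*/g;
  return markdown.match(pattern) ?? [];
}

const sample =
  'Image: ![x](https://files.readme.io/abc.png) ' +
  'Edit: [here](https://dash.readme.com/project/x)';
const leftovers = findLeftoverReadmeUrls(sample);
console.log(leftovers.length); // 2
```

Run it over each migrated file (e.g. via `fs.readFile`) and treat any match as a migration warning.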
- Step 4: Fix any migration issues
Review the output for warnings. Common issues:
- Broken image downloads → manually download or fix URLs
- Malformed frontmatter → fix in source markdown
- Missing sidebar entries → check `_category_.json` files

- Step 5: Commit migrated content
git add docusaurus-site/docs/ docusaurus-site/recipes/ docusaurus-site/static/img/docs/
git commit -m "feat: migrate all docs and recipes from ReadMe"

Chunk 3: Roadmap Page & Algolia Search
Task 6: Create roadmap custom page
Files:
- Create: `docusaurus-site/src/pages/roadmap.tsx`

- Step 1: Read current roadmap content

cat /Users/fred/Projects/Vianova/plateform/knowledge-base/custom_pages/roadmap.md

- Step 2: Create roadmap page
Create docusaurus-site/src/pages/roadmap.tsx with the roadmap content converted to a React page. The exact content depends on what's in the current roadmap.md.
import React from 'react';
import Layout from '@theme/Layout';
export default function Roadmap(): React.JSX.Element {
return (
<Layout title="Roadmap" description="Vianova product roadmap">
<main style={{padding: '2rem', maxWidth: '900px', margin: '0 auto'}}>
<h1>Product Roadmap</h1>
{/* Convert roadmap.md content here */}
</main>
</Layout>
);
}

- Step 3: Commit
git add docusaurus-site/src/pages/roadmap.tsx
git commit -m "feat: add roadmap page"

Task 7: Configure Algolia DocSearch
Files:
- Modify: `docusaurus-site/docusaurus.config.ts`

- Step 1: Apply for Algolia DocSearch
Go to https://docsearch.algolia.com/apply/ and submit help.vianova.io. This is a free service for open documentation sites. Alternatively, use Algolia's self-hosted crawler if the site is private.
- Step 2: Add Algolia config to `docusaurus.config.ts`

Add to `themeConfig`:
algolia: {
appId: 'YOUR_APP_ID', // Replace after Algolia approval
apiKey: 'YOUR_SEARCH_KEY', // Public search-only key
indexName: 'vianova-help',
contextualSearch: true,
searchPagePath: 'search',
},

- Step 3: Commit
git add docusaurus-site/docusaurus.config.ts
git commit -m "feat: add Algolia search config (placeholder keys)"

Chunk 4: Crowdin i18n Integration
Task 8: Configure Docusaurus i18n
Files:
- Modify: `docusaurus-site/docusaurus.config.ts`

- Step 1: Add all EU locales to config

Update the `i18n` section in `docusaurus.config.ts`:
i18n: {
defaultLocale: 'en',
locales: [
'en', 'fr', 'de', 'es', 'it', 'pt', 'nl', 'pl', 'ro', 'cs',
'sk', 'hu', 'bg', 'hr', 'sl', 'et', 'lv', 'lt', 'fi', 'sv',
'da', 'el', 'ga', 'mt',
],
localeConfigs: {
en: { label: 'English' },
fr: { label: 'Français' },
de: { label: 'Deutsch' },
es: { label: 'Español' },
it: { label: 'Italiano' },
pt: { label: 'Português' },
nl: { label: 'Nederlands' },
pl: { label: 'Polski' },
ro: { label: 'Română' },
cs: { label: 'Čeština' },
sk: { label: 'Slovenčina' },
hu: { label: 'Magyar' },
bg: { label: 'Български' },
hr: { label: 'Hrvatski' },
sl: { label: 'Slovenščina' },
et: { label: 'Eesti' },
lv: { label: 'Latviešu' },
lt: { label: 'Lietuvių' },
fi: { label: 'Suomi' },
sv: { label: 'Svenska' },
da: { label: 'Dansk' },
el: { label: 'Ελληνικά' },
ga: { label: 'Gaeilge' },
mt: { label: 'Malti' },
},
},

- Step 2: Generate translation files for English (source of truth)
cd docusaurus-site
npm run write-translations -- --locale en

Expected: `i18n/en/` directory created with translation JSON files. These are the source strings that Crowdin will translate.
- Step 3: Generate French translations for local testing
npm run write-translations -- --locale fr

Expected: `i18n/fr/` directory created with translation JSON files.
- Step 4: Verify locale switching works locally
npm run start -- --locale fr

Expected: French locale loads (with English content as fallback).
- Step 5: Commit
git add docusaurus-site/docusaurus.config.ts docusaurus-site/i18n/
git commit -m "feat: configure i18n with all EU locales"

Task 9: Configure Crowdin integration
Files:
- Create: `docusaurus-site/crowdin.yml`

- Step 1: Install Crowdin CLI

npm install -g @crowdin/cli

Or use brew: brew install crowdin
- Step 2: Create Crowdin config
Create docusaurus-site/crowdin.yml:
project_id_env: CROWDIN_PROJECT_ID
api_token_env: CROWDIN_PERSONAL_TOKEN
preserve_hierarchy: true
files:
# Documentation markdown files
- source: /docs/**/*.md
translation: /i18n/%two_letters_code%/docusaurus-plugin-content-docs/current/**/%original_file_name%
# Recipe markdown files
- source: /recipes/**/*.md
translation: /i18n/%two_letters_code%/docusaurus-plugin-content-docs-recipes/current/**/%original_file_name%
# Theme translations (navbar, footer, etc.)
- source: /i18n/en/**/*.json
translation: /i18n/%two_letters_code%/**/%original_file_name%

- Step 3: Test Crowdin upload
cd docusaurus-site
crowdin upload sources --config crowdin.yml

Expected: Source files uploaded to Crowdin project.
- Step 4: Test Crowdin download
crowdin download --config crowdin.yml

Expected: Translated files (if any) downloaded into `i18n/<locale>/` directories.
- Step 5: Verify Crowdin round-trip
Build the French locale to verify translations land in the right paths:
npm run build -- --locale fr
npm run serve

Expected: French site builds and serves. Navigate to a few pages to verify translated content (or English fallback) appears correctly. Check that `i18n/fr/docusaurus-plugin-content-docs/current/` contains the expected directory structure.
- Step 6: Commit
git add docusaurus-site/crowdin.yml
git commit -m "feat: configure Crowdin integration for translations"

Chunk 5: AWS Deployment & CI/CD
Task 10: Build and test production build locally
- Step 1: Build the site
cd docusaurus-site
npm run build

Expected: build/ directory created with static files for all locales.
- Step 2: Serve production build locally
npm run serve

Expected: Production site running locally, all pages accessible, images loading.
- Step 3: Commit any build fixes
git add -A
git commit -m "fix: resolve production build issues"

Task 11: AWS S3 + CloudFront setup
This task requires AWS CLI configured with appropriate permissions.
Required IAM permissions for deployment role:
- `s3:PutObject`, `s3:DeleteObject`, `s3:ListBucket` on `arn:aws:s3:::vianova-help-center` and `arn:aws:s3:::vianova-help-center/*`
- `cloudfront:CreateInvalidation` on the specific distribution
- `acm:RequestCertificate`, `acm:DescribeCertificate` (one-time setup only)

- Step 1: Request ACM certificate (must be in us-east-1 for CloudFront)
aws acm request-certificate \
--domain-name help.vianova.io \
--validation-method DNS \
--region us-east-1 \
--query 'CertificateArn' --output text

Save the output ARN. Then get the DNS validation record:
aws acm describe-certificate \
--certificate-arn <ARN_FROM_ABOVE> \
--region us-east-1 \
--query 'Certificate.DomainValidationOptions[0].ResourceRecord'

Add the CNAME validation record to your DNS provider. Wait for validation:
aws acm wait certificate-validated \
--certificate-arn <ARN_FROM_ABOVE> \
--region us-east-1

- Step 2: Create S3 bucket
aws s3 mb s3://vianova-help-center --region eu-west-1

- Step 3: Block public access (CloudFront OAC will handle access)
aws s3api put-public-access-block \
--bucket vianova-help-center \
--public-access-block-configuration BlockPublicAcls=true,IgnorePublicAcls=true,BlockPublicPolicy=true,RestrictPublicBuckets=true

- Step 4: Create CloudFront distribution with full config
Create scripts/cloudfront-config.json:
{
"CallerReference": "vianova-help-center-init",
"Aliases": {
"Quantity": 1,
"Items": ["help.vianova.io"]
},
"DefaultRootObject": "index.html",
"Origins": {
"Quantity": 1,
"Items": [
{
"Id": "S3-vianova-help-center",
"DomainName": "vianova-help-center.s3.eu-west-1.amazonaws.com",
"OriginAccessControlId": "OAC_ID_PLACEHOLDER",
"S3OriginConfig": {
"OriginAccessIdentity": ""
}
}
]
},
"DefaultCacheBehavior": {
"TargetOriginId": "S3-vianova-help-center",
"ViewerProtocolPolicy": "redirect-to-https",
"CachePolicyId": "658327ea-f89d-4fab-a63d-7e88639e58f6",
"Compress": true,
"AllowedMethods": {
"Quantity": 2,
"Items": ["GET", "HEAD"]
}
},
"CustomErrorResponses": {
"Quantity": 1,
"Items": [
{
"ErrorCode": 404,
"ResponsePagePath": "/404.html",
"ResponseCode": "404",
"ErrorCachingMinTTL": 60
}
]
},
"ViewerCertificate": {
"ACMCertificateArn": "ACM_ARN_PLACEHOLDER",
"SSLSupportMethod": "sni-only",
"MinimumProtocolVersion": "TLSv1.2_2021"
},
"Enabled": true,
"Comment": "Vianova Help Center"
}

Replace ACM_ARN_PLACEHOLDER with the ARN from Step 1. Then create the Origin Access Control and distribution:
# Create OAC
OAC_ID=$(aws cloudfront create-origin-access-control \
--origin-access-control-config Name=vianova-help-center-oac,SigningProtocol=sigv4,SigningBehavior=always,OriginAccessControlOriginType=s3 \
--query 'OriginAccessControl.Id' --output text)
# Update config with OAC ID, then create distribution
sed -i '' "s/OAC_ID_PLACEHOLDER/$OAC_ID/" scripts/cloudfront-config.json
aws cloudfront create-distribution \
--distribution-config file://scripts/cloudfront-config.json \
--query 'Distribution.{Id:Id,DomainName:DomainName}'

Save the Distribution ID.
- Step 5: Apply S3 bucket policy for CloudFront OAC
Update scripts/s3-policy.json with your actual ACCOUNT_ID and DISTRIBUTION_ID, then apply:
{
"Version": "2012-10-17",
"Statement": [
{
"Sid": "AllowCloudFrontOAC",
"Effect": "Allow",
"Principal": {
"Service": "cloudfront.amazonaws.com"
},
"Action": "s3:GetObject",
"Resource": "arn:aws:s3:::vianova-help-center/*",
"Condition": {
"StringEquals": {
"AWS:SourceArn": "arn:aws:cloudfront::ACCOUNT_ID:distribution/DISTRIBUTION_ID"
}
}
}
]
}

aws s3api put-bucket-policy --bucket vianova-help-center --policy file://scripts/s3-policy.json

- Step 6: DNS — point `help.vianova.io` to CloudFront
Important: Do NOT switch DNS until the new site is verified. Keep ReadMe.io running in parallel during transition.
If using Route53 (recommended for CloudFront):
# Create an ALIAS record (A record) pointing to CloudFront
aws route53 change-resource-record-sets --hosted-zone-id YOUR_ZONE_ID --change-batch '{
"Changes": [{
"Action": "UPSERT",
"ResourceRecordSet": {
"Name": "help.vianova.io",
"Type": "A",
"AliasTarget": {
"HostedZoneId": "Z2FDTNDATAQYW2",
"DNSName": "DIST_ID.cloudfront.net",
"EvaluateTargetHealth": false
}
}
}]
}'
The AliasTarget DNSName must be the distribution's DomainName saved in Step 4 (e.g. dXXXXXXXXXXXX.cloudfront.net), not the distribution ID; Z2FDTNDATAQYW2 is CloudFront's fixed hosted zone ID. If using another DNS provider, create a CNAME: help.vianova.io → the distribution's DomainName.
Set the TTL low (60s) during the transition for fast rollback.
- Step 7: Deploy to S3 and verify
cd docusaurus-site
npm run build
aws s3 sync build/ s3://vianova-help-center --delete
aws cloudfront create-invalidation --distribution-id YOUR_DIST_ID --paths "/*"Expected: Site live at help.vianova.io. Verify all pages, images, and search work before decommissioning ReadMe.io.
Task 12: CI/CD pipeline (GitHub Actions)
Files:
- Create: .github/workflows/deploy-docs.yml
- Create: .github/workflows/preview-docs.yml
- Step 1: Add Crowdin CLI as a dev dependency (avoid global install in CI)
cd docusaurus-site
npm install --save-dev @crowdin/cli@4
- Step 2: Create production deployment workflow
Create .github/workflows/deploy-docs.yml:
name: Deploy Knowledge Base
on:
push:
branches: [v1.0]
paths:
- 'docusaurus-site/**'
- '.github/workflows/deploy-docs.yml'
concurrency:
group: deploy-docs
cancel-in-progress: true
jobs:
deploy:
runs-on: ubuntu-latest
permissions:
id-token: write
contents: read
steps:
- uses: actions/checkout@v4
- uses: actions/setup-node@v4
with:
node-version: 20
cache: npm
cache-dependency-path: docusaurus-site/package-lock.json
- name: Install dependencies
working-directory: docusaurus-site
run: npm ci
- name: Sync translations with Crowdin
working-directory: docusaurus-site
env:
CROWDIN_PROJECT_ID: ${{ secrets.CROWDIN_PROJECT_ID }}
CROWDIN_PERSONAL_TOKEN: ${{ secrets.CROWDIN_PERSONAL_TOKEN }}
run: |
npm run write-translations -- --locale en
npx crowdin upload sources --config crowdin.yml
npx crowdin download --config crowdin.yml
- name: Build (strict broken link checking)
working-directory: docusaurus-site
run: npm run build
- name: Configure AWS credentials
uses: aws-actions/configure-aws-credentials@v4
with:
role-to-assume: ${{ secrets.AWS_ROLE_ARN }}
aws-region: eu-west-1
- name: Deploy to S3
working-directory: docusaurus-site
run: aws s3 sync build/ s3://vianova-help-center --delete
- name: Invalidate CloudFront cache
run: |
aws cloudfront create-invalidation \
--distribution-id ${{ secrets.CLOUDFRONT_DISTRIBUTION_ID }} \
--paths "/*"- Step 3: Create PR preview workflow
Create .github/workflows/preview-docs.yml:
name: Preview Knowledge Base
on:
pull_request:
paths:
- 'docusaurus-site/**'
concurrency:
group: preview-docs-${{ github.event.pull_request.number }}
cancel-in-progress: true
jobs:
preview:
runs-on: ubuntu-latest
permissions:
contents: read
pull-requests: write
steps:
- uses: actions/checkout@v4
- uses: actions/setup-node@v4
with:
node-version: 20
cache: npm
cache-dependency-path: docusaurus-site/package-lock.json
- name: Install dependencies
working-directory: docusaurus-site
run: npm ci
- name: Build (English only for preview speed)
working-directory: docusaurus-site
run: npm run build -- --locale en
- name: Configure AWS credentials
uses: aws-actions/configure-aws-credentials@v4
with:
role-to-assume: ${{ secrets.AWS_ROLE_ARN }}
aws-region: eu-west-1
- name: Deploy preview to S3
run: |
aws s3 sync docusaurus-site/build/ s3://vianova-help-center-preview/pr-${{ github.event.pull_request.number }}/ --delete
- name: Comment preview URL on PR
uses: actions/github-script@v7
with:
script: |
github.rest.issues.createComment({
issue_number: context.issue.number,
owner: context.repo.owner,
repo: context.repo.repo,
body: `📖 Preview: https://preview.help.vianova.io/pr-${context.issue.number}/`
})
Note: The preview workflow requires a separate S3 bucket (vianova-help-center-preview) and CloudFront distribution for preview URLs. This can be simplified by using a subdirectory in the main bucket if preferred.
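Preview deployments accumulate in the bucket unless cleaned up. A sketch of a companion workflow (filename assumed; the bucket name matches the preview setup above, and the IAM role would also need delete permissions on the preview bucket) that removes a PR's prefix when it closes:

```yaml
# .github/workflows/cleanup-preview.yml (sketch)
name: Cleanup Preview
on:
  pull_request:
    types: [closed]
    paths:
      - 'docusaurus-site/**'
jobs:
  cleanup:
    runs-on: ubuntu-latest
    permissions:
      id-token: write
      contents: read
    steps:
      - uses: aws-actions/configure-aws-credentials@v4
        with:
          role-to-assume: ${{ secrets.AWS_ROLE_ARN }}
          aws-region: eu-west-1
      - name: Delete preview files for this PR
        run: |
          aws s3 rm s3://vianova-help-center-preview/pr-${{ github.event.pull_request.number }}/ --recursive
```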
- Step 4: Add GitHub secrets
In the GitHub repo settings, add:
- AWS_ROLE_ARN — IAM role ARN for OIDC federation (scoped to s3:PutObject, s3:DeleteObject, s3:ListBucket, cloudfront:CreateInvalidation)
- CLOUDFRONT_DISTRIBUTION_ID — CloudFront distribution ID
- CROWDIN_PROJECT_ID — Crowdin project ID
- CROWDIN_PERSONAL_TOKEN — Crowdin API token (use a project-scoped token, not a personal token with full access)
Also recommended: enable branch protection on v1.0 requiring PR reviews and status checks before merge.
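For the workflows' `id-token: write` permission to assume `AWS_ROLE_ARN`, the IAM role needs a trust policy federating GitHub's OIDC provider. A sketch — ACCOUNT_ID and the `repo:YOUR_ORG/YOUR_REPO` subject are placeholders to adapt (and the subject can be narrowed to specific branches):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "Federated": "arn:aws:iam::ACCOUNT_ID:oidc-provider/token.actions.githubusercontent.com"
      },
      "Action": "sts:AssumeRoleWithWebIdentity",
      "Condition": {
        "StringEquals": {
          "token.actions.githubusercontent.com:aud": "sts.amazonaws.com"
        },
        "StringLike": {
          "token.actions.githubusercontent.com:sub": "repo:YOUR_ORG/YOUR_REPO:*"
        }
      }
    }
  ]
}
```

This assumes the GitHub OIDC identity provider (token.actions.githubusercontent.com) has already been created in the AWS account.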
- Step 5: Commit
git add .github/workflows/deploy-docs.yml .github/workflows/preview-docs.yml docusaurus-site/package.json docusaurus-site/package-lock.json
git commit -m "feat: add CI/CD pipelines for docs deployment and PR preview"Execution Summary
| Chunk | Tasks | Description |
|---|---|---|
| 1 | Tasks 1-3 | Docusaurus scaffold, Vianova theme, landing page |
| 2 | Tasks 4-5 | Migration script (handles all ReadMe patterns), run migration, verify |
| 3 | Tasks 6-7 | Roadmap page, Algolia search |
| 4 | Tasks 8-9 | i18n config, Crowdin integration with round-trip verification |
| 5 | Tasks 10-12 | Production build, AWS deployment (ACM + CloudFront + OAC), CI/CD with PR previews |
Dependencies: Chunk 1 must complete first. Chunks 2-4 can be parallelized after Chunk 1. Chunk 5 depends on all previous chunks.
Rollback plan: Keep ReadMe.io running in parallel until the new site is verified. DNS TTL set low (60s) during transition. If issues arise, revert DNS to ReadMe.io.