This script automates syncing the production database to your local Docker environment. It replaces the manual process of exporting via phpMyAdmin, downloading, and importing through the terminal.
## What does the script do?

1. Checks that Docker, SSH, and MySQL are available
2. Connects to the production server via SSH
3. Runs `mysqldump` on the production server (read-only; production is never modified)
4. Downloads the compressed file to your local machine
5. Asks for confirmation before local tables are dropped
6. Imports the export into your local Docker database
7. Verifies the result and cleans up temporary files

## Tables without data

The following tables are exported as structure only (no data), because they are large log tables that are not needed locally:
- `hero_contactmanager_log`
- `hero_emailsender_emaillog`
- `hero_freighthero_calculator_data_audit_log`

## Configuration

### Setting credentials

Open `update-local-database.sh` and fill in the following variables:
```bash
# Production server (SSH + database)
REMOTE_SSH_HOST="thefreighthero.com"
REMOTE_SSH_USER="freighther"
REMOTE_DB_USER="<fill in>"   # Database username on production
REMOTE_DB_PASS="<fill in>"   # Database password on production

# Local Docker database
LOCAL_HOST="127.0.0.1"
LOCAL_PORT="3307"
LOCAL_USER="freighther_nl"
LOCAL_PASS="freighther_nl"
LOCAL_DB="freighther_nl"
```

> ⚠️ **Note:** This script contains passwords in plain text. Never share this file via Git or other public channels. Add it to `.gitignore`.

### Setting up an SSH key (recommended)
To avoid having to enter your SSH password every time:
```bash
ssh-copy-id freighther@thefreighthero.com
```

### Optional: install `pv`

For a progress bar during the import:
```bash
# macOS
brew install pv

# Ubuntu/Debian
sudo apt install pv
```

## Usage

### Running the script

```bash
cd ~/Desktop/tfh/helpers
bash update-local-database.sh
```

### Expected output

```
==> Preparing environment
    ✔ Docker is running
    ✔ SSH client available
    ✔ MySQL client available

==> Exporting database from production via SSH
    Connecting to freighther@thefreighthero.com...
    [remote] Dumping tables with data...
    [remote] Dumping structure-only tables...
    [remote] Compressing...
    [remote] Done: 45M
    ✔ Remote export complete

==> Downloading export from production
    ...progress...
    ✔ Download complete
    ✔ Decompressed: freighther_nl_20260212_143022.sql (180MB)

==> Truncating local database
    Found 47 tables to drop
    Target: mysql -h 127.0.0.1 -P 3307 -u freighther_nl freighther_nl
      DROP TABLE `hero_contactmanager_contacts`
      DROP TABLE `hero_contactmanager_log`
      DROP TABLE `hero_emailsender_emaillog`
      ...
    ⚠️  This will DROP all 47 tables in LOCAL database 'freighther_nl'
    ⚠️  Connection: 127.0.0.1:3307
    Continue? (yes/no): yes
    ✔ Dropped 47 tables

==> Importing into local database
    ✔ Import complete

==> Verifying import
    ✔ 47 tables imported
    hero_contactmanager_log: 0 rows (structure only ✔)
    hero_emailsender_emaillog: 0 rows (structure only ✔)
    hero_freighthero_calculator_data_audit_log: 0 rows (structure only ✔)

==> Cleaning up
    ✔ Removed remote dump
    Local file kept: freighther_nl_20260212_143022.sql

=============================================
 🐳 Database sync complete!
=============================================
```

### Confirmation prompt

Before the script drops any tables, it shows:
- The exact database connection (`127.0.0.1:3307`)
- All tables that will be dropped
- A confirmation prompt where you must type `yes`

Type anything other than `yes` to cancel. The script then cleans up the remote file and stops.
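The cancel path boils down to a small gate around `read`: only the literal answer `yes` lets the destructive step run. A minimal, standalone sketch (the table count `47` and database name here are illustrative placeholders):

```shell
#!/bin/bash
# Minimal sketch of the confirmation gate: only the literal answer "yes"
# lets the destructive step run; anything else aborts.
confirm_drop() {
    local table_count="$1" db_name="$2" answer
    echo "⚠️  This will DROP all ${table_count} tables in LOCAL database '${db_name}'"
    read -r -p "Continue? (yes/no): " answer
    # Function's exit status is the assertion: 0 only for "yes"
    [[ "$answer" == "yes" ]]
}

# Simulate a user typing "no" (in the real script, input comes from the terminal)
if confirm_drop 47 "freighther_nl" <<< "no"; then
    echo "proceeding with drop"
else
    echo "aborted, cleaning up remote dump"
fi
```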
## Safety

### The production database is NOT modified

| Step | Where | Action |
|------|-------|--------|
| Export | Production server | `mysqldump` — read-only |
| Download | Production → local | File transfer |
| Drop tables | Local Docker only | `DROP TABLE` on `127.0.0.1:3307` |
| Import | Local Docker only | `mysql < file` on `127.0.0.1:3307` |
| Cleanup | Production server | Removes only the generated `*.sql.gz` dump file |

### Protecting passwords

The script contains database passwords. Make sure that:
- The file is not in version control (Git). Add it to `.gitignore`:

  ```
  update-local-database.sh
  ```

- The file permissions are restrictive:

  ```bash
  chmod 700 update-local-database.sh
  ```

  This ensures that only you can read and execute the file.

## Troubleshooting

### "Access denied" during remote export

- Check `REMOTE_DB_USER` and `REMOTE_DB_PASS`
- Make sure the password contains no special characters that could be misinterpreted
- Test manually:

  ```bash
  ssh freighther@thefreighthero.com "mysqldump -u USER -pPASSWORD freighther_nl --no-data | head -5"
  ```

### "Docker is not running"

- Start Docker Desktop or the Docker daemon

### Import fails

- Check that your local Docker MySQL container is running
- Test the connection:

  ```bash
  mysql -h 127.0.0.1 -P 3307 -u freighther_nl -p freighther_nl -e "SELECT 1;"
  ```

### No progress bar during import

- Install `pv` (see the configuration section above)

## Adjusting the structure-only tables

To add or remove tables that are exported without data, edit the array in the script:
```bash
STRUCTURE_ONLY_TABLES=(
    "hero_contactmanager_log"
    "hero_emailsender_emaillog"
    "hero_freighthero_calculator_data_audit_log"
    # Add extra tables here
)
```

## Script

```bash
#!/bin/bash
# ============================================================
# 🐳 FreightHero Local DB Sync
# Exports from production, downloads, and imports locally
# ============================================================

# -- Remote (Production) Configuration -----------------------
REMOTE_SSH_HOST="thefreighthero.com"
REMOTE_SSH_USER="freighther"
REMOTE_DB_NAME="freighther_nl"
REMOTE_DB_USER="freighther"
REMOTE_DB_PASS="DIRECT ADMIN PASS OF USER"
REMOTE_DUMP_PATH="/home/freighther/dump/freighther_nl_export.sql"

# -- Local (Docker) Configuration ----------------------------
LOCAL_HOST="127.0.0.1"
LOCAL_PORT="3307"
LOCAL_DB="LOCAL_DB_NAME"
LOCAL_USER="LOCAL_DB_USER"
LOCAL_PASS="LOCAL_DB_PASS"

# -- Export password for mysql client ------------------------
export MYSQL_PWD="$LOCAL_PASS"

# -- Tables to export structure only (no data) ---------------
STRUCTURE_ONLY_TABLES=(
    "hero_contactmanager_log"
    "hero_emailsender_emaillog"
    "hero_freighthero_calculator_data_audit_log"
)

# -- Local SQL file with timestamp ---------------------------
LOCAL_SQL_FILE="freighther_nl_$(date +%Y%m%d_%H%M%S).sql"

# -- Helpers -------------------------------------------------
log_step()  { echo ""; echo "==> $1"; }
log_info()  { echo "    $1"; }
log_done()  { echo "    ✔ $1"; }
log_error() { echo "    ✘ $1"; exit 1; }

# ============================================================
# Step 1: Check prerequisites
# ============================================================
log_step "Preparing environment"

if ! docker info &>/dev/null; then
    log_error "Docker is not running. Start Docker first."
fi
log_done "Docker is running"

if ! command -v ssh &>/dev/null; then
    log_error "SSH client not found."
fi
log_done "SSH client available"

if ! command -v mysql &>/dev/null; then
    log_error "mysql client not found."
fi
log_done "MySQL client available"

# ============================================================
# Step 2: Run mysqldump on production server via SSH
# ============================================================
log_step "Exporting database from production via SSH"
log_info "Connecting to ${REMOTE_SSH_USER}@${REMOTE_SSH_HOST}..."

# Build --ignore-table flags for structure-only tables
IGNORE_FLAGS=""
for TABLE in "${STRUCTURE_ONLY_TABLES[@]}"; do
    IGNORE_FLAGS+=" --ignore-table=${REMOTE_DB_NAME}.${TABLE}"
done

# Space-separated list for the structure-only dump
STRUCTURE_TABLES="${STRUCTURE_ONLY_TABLES[*]}"

# Write the remote script to a temp file to avoid quoting hell
REMOTE_SCRIPT_FILE=$(mktemp)
cat > "$REMOTE_SCRIPT_FILE" <<EOF
#!/bin/bash
set -e

echo "[remote] Dumping tables with data..."
mysqldump -u ${REMOTE_DB_USER} -p${REMOTE_DB_PASS} \
    ${REMOTE_DB_NAME} \
    ${IGNORE_FLAGS} \
    --single-transaction \
    --routines \
    --triggers \
    > ${REMOTE_DUMP_PATH}

echo "[remote] Dumping structure-only tables..."
mysqldump -u ${REMOTE_DB_USER} -p${REMOTE_DB_PASS} \
    ${REMOTE_DB_NAME} \
    ${STRUCTURE_TABLES} \
    --no-data \
    --single-transaction \
    >> ${REMOTE_DUMP_PATH}

echo "[remote] Compressing..."
gzip -f ${REMOTE_DUMP_PATH}

FILE_SIZE=\$(ls -lh ${REMOTE_DUMP_PATH}.gz | awk '{print \$5}')
echo "[remote] Done: \${FILE_SIZE}"
EOF

ssh "${REMOTE_SSH_USER}@${REMOTE_SSH_HOST}" bash < "$REMOTE_SCRIPT_FILE" 2>&1 | while IFS= read -r line; do
    log_info "$line"
done

if [[ ${PIPESTATUS[0]} -ne 0 ]]; then
    rm -f "$REMOTE_SCRIPT_FILE"
    log_error "Remote export failed. Check SSH connection and DB credentials."
fi

rm -f "$REMOTE_SCRIPT_FILE"
log_done "Remote export complete"

# ============================================================
# Step 3: Download with progress
# ============================================================
log_step "Downloading export from production"

if command -v rsync &>/dev/null; then
    rsync -avz --progress \
        "${REMOTE_SSH_USER}@${REMOTE_SSH_HOST}:${REMOTE_DUMP_PATH}.gz" \
        "${LOCAL_SQL_FILE}.gz" \
        2>&1 | while IFS= read -r line; do
            echo "    ${line}"
        done
else
    scp \
        "${REMOTE_SSH_USER}@${REMOTE_SSH_HOST}:${REMOTE_DUMP_PATH}.gz" \
        "${LOCAL_SQL_FILE}.gz"
fi

if [[ ! -f "${LOCAL_SQL_FILE}.gz" ]]; then
    log_error "Download failed."
fi
log_done "Download complete"

log_info "Decompressing..."
gunzip -f "${LOCAL_SQL_FILE}.gz"

if [[ ! -f "${LOCAL_SQL_FILE}" ]]; then
    log_error "Decompression failed."
fi

FILE_SIZE_MB=$(du -m "${LOCAL_SQL_FILE}" | awk '{print $1}')
log_done "Decompressed: ${LOCAL_SQL_FILE} (${FILE_SIZE_MB}MB)"

# ============================================================
# Step 4: Drop all tables in local database (with confirmation)
# ============================================================
log_step "Truncating local database"

# Re-export password for local commands
export MYSQL_PWD="$LOCAL_PASS"

TABLES=$(mysql -h "$LOCAL_HOST" -P "$LOCAL_PORT" -u "$LOCAL_USER" "$LOCAL_DB" \
    -N -e "SHOW TABLES;" 2>/dev/null)

if [[ -z "$TABLES" ]]; then
    log_info "Local database is empty, skipping."
else
    TABLE_COUNT=$(echo "$TABLES" | wc -l | tr -d ' ')
    log_info "Found ${TABLE_COUNT} tables to drop"
    log_info "Target: mysql -h ${LOCAL_HOST} -P ${LOCAL_PORT} -u ${LOCAL_USER} ${LOCAL_DB}"

    # Show exactly what will be dropped, then ask for confirmation
    while IFS= read -r TABLE; do
        log_info "  DROP TABLE \`${TABLE}\`"
    done <<< "$TABLES"
    echo ""
    echo "    ⚠️  This will DROP all ${TABLE_COUNT} tables in LOCAL database '${LOCAL_DB}'"
    echo "    ⚠️  Connection: ${LOCAL_HOST}:${LOCAL_PORT}"
    read -r -p "    Continue? (yes/no): " CONFIRM
    if [[ "$CONFIRM" != "yes" ]]; then
        # On cancel: remove the remote dump and stop without touching local data
        ssh "${REMOTE_SSH_USER}@${REMOTE_SSH_HOST}" "rm -f ${REMOTE_DUMP_PATH}.gz" 2>/dev/null
        log_error "Aborted. Remote dump removed; local database untouched."
    fi

    # Disable FK checks, drop all, re-enable
    DROP_SQL="SET FOREIGN_KEY_CHECKS = 0;"
    while IFS= read -r TABLE; do
        DROP_SQL+=" DROP TABLE IF EXISTS \`${TABLE}\`;"
    done <<< "$TABLES"
    DROP_SQL+=" SET FOREIGN_KEY_CHECKS = 1;"

    if ! mysql -h "$LOCAL_HOST" -P "$LOCAL_PORT" -u "$LOCAL_USER" "$LOCAL_DB" \
        -e "${DROP_SQL}" 2>/dev/null; then
        log_error "Failed to drop tables."
    fi
    log_done "Dropped ${TABLE_COUNT} tables"
fi

# ============================================================
# Step 5: Import into local database
# ============================================================
log_step "Importing into local database"

if command -v pv &>/dev/null; then
    pv "${LOCAL_SQL_FILE}" | mysql -h "$LOCAL_HOST" -P "$LOCAL_PORT" -u "$LOCAL_USER" "$LOCAL_DB" 2>/dev/null
    IMPORT_STATUS=$?
else
    log_info "(tip: install 'pv' for a progress bar)"
    mysql -h "$LOCAL_HOST" -P "$LOCAL_PORT" -u "$LOCAL_USER" "$LOCAL_DB" < "$LOCAL_SQL_FILE"
    IMPORT_STATUS=$?
fi

if [[ $IMPORT_STATUS -ne 0 ]]; then
    log_error "Import failed."
fi
log_done "Import complete"

# ============================================================
# Step 6: Verify
# ============================================================
log_step "Verifying import"

TOTAL_TABLES=$(mysql -h "$LOCAL_HOST" -P "$LOCAL_PORT" -u "$LOCAL_USER" "$LOCAL_DB" \
    -N -e "SELECT COUNT(*) FROM information_schema.TABLES WHERE TABLE_SCHEMA='${LOCAL_DB}';" 2>/dev/null)
log_done "${TOTAL_TABLES} tables imported"

for TABLE in "${STRUCTURE_ONLY_TABLES[@]}"; do
    ROW_COUNT=$(mysql -h "$LOCAL_HOST" -P "$LOCAL_PORT" -u "$LOCAL_USER" "$LOCAL_DB" \
        -N -e "SELECT COUNT(*) FROM \`${TABLE}\`;" 2>/dev/null)
    log_info "${TABLE}: ${ROW_COUNT:-0} rows (structure only ✔)"
done

# ============================================================
# Step 7: Cleanup
# ============================================================
log_step "Cleaning up"

ssh "${REMOTE_SSH_USER}@${REMOTE_SSH_HOST}" "rm -f ${REMOTE_DUMP_PATH}.gz" 2>/dev/null
log_done "Removed remote dump"
log_info "Local file kept: ${LOCAL_SQL_FILE}"

unset MYSQL_PWD

echo ""
echo "============================================="
echo " 🐳 Database sync complete!"
echo "============================================="
echo ""
```
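To sanity-check an edited `STRUCTURE_ONLY_TABLES` list without touching any server, the flag-building loop from Step 2 can be run on its own:

```shell
#!/bin/bash
# Standalone reproduction of the --ignore-table flag construction from Step 2.
REMOTE_DB_NAME="freighther_nl"
STRUCTURE_ONLY_TABLES=(
    "hero_contactmanager_log"
    "hero_emailsender_emaillog"
    "hero_freighthero_calculator_data_audit_log"
)

IGNORE_FLAGS=""
for TABLE in "${STRUCTURE_ONLY_TABLES[@]}"; do
    # Each structure-only table becomes one --ignore-table=db.table flag
    IGNORE_FLAGS+=" --ignore-table=${REMOTE_DB_NAME}.${TABLE}"
done

echo "${IGNORE_FLAGS}"
```

Every table you add to the array should show up in the printed flags; mysqldump then skips its rows in the main dump, and the second `--no-data` pass picks up only its structure.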