43 Commits

Author SHA1 Message Date
4fa3515e32 Bump version to 0.1.3
🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-01 20:21:37 +01:00
c5c30fd4c4 Fix Vite 7.x TDZ error in __vite__mapDeps with post-build script
- Add post-build script to fix Temporal Dead Zone error in generated code
- Remove debug logging from stores and composables
- Simplify init-logger plugin to essential error handling
- Fix circular store dependency in useUiStore

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-01 20:21:12 +01:00
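The post-build script referenced above targets the dependency-preload helper that Vite 7 emits, which refers to itself in its own parameter defaults. A simplified, hypothetical sketch of the pattern and of what scripts/fix-vite-mapdeps.js rewrites it to (chunk names are placeholders):

// Before (roughly what Vite emits): the `const` binding can be read while it is
// still in its temporal dead zone under some chunk-loading orders.
//   const __vite__mapDeps = (i, m = __vite__mapDeps, d = (m.f || (m.f = ['assets/chunk-a.js']))) => i.map(i => d[i])

// After the post-build patch: a hoisted `let` that is assigned immediately,
// so the self-reference in the parameter defaults can no longer throw.
let __vite__mapDeps: any
__vite__mapDeps = (i: number[], m: any = __vite__mapDeps, d: string[] = m.f || (m.f = ['assets/chunk-a.js'])) =>
  i.map((idx) => d[idx])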
8c7a02a019 Sync version numbers across all package files
Update Cargo.toml and tauri.conf.json to version 0.1.2 to match package.json
2025-11-01 19:33:42 +01:00
465fe19542 Clean up unused code and dependencies
- Remove commented-out code in Rust and TypeScript files
- Remove unused npm dependencies (@tauri-apps/plugin-http, @tauri-apps/plugin-sql, fuse.js)
- Remove commented imports in nuxt.config.ts
- Remove commented dependencies in Cargo.toml
2025-11-01 19:32:34 +01:00
d2d0f8996b Fix runtime CSP error by allowing inline scripts
Added 'unsafe-inline' to script-src CSP directive to fix JavaScript
initialization errors in production builds. Nuxt's generated modules
require inline script execution.

- Fixes: "Cannot access uninitialized variable" error
- Fixes: CSP script execution blocking
- Version bump to 0.1.2
2025-11-01 19:00:36 +01:00
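For reference, the directive described above lives in the Tauri security configuration. A minimal sketch, assuming the usual app.security.csp location in tauri.conf.json and shown as a TypeScript object literal for illustration (the repository's exact directive string may differ):

// Hypothetical fragment; the real file is JSON.
const tauriConfSketch = {
  app: {
    security: {
      // 'unsafe-inline' in script-src lets Nuxt's inlined init modules execute
      // under the production CSP; other sources stay limited to 'self'.
      csp: "default-src 'self'; script-src 'self' 'unsafe-inline'",
    },
  },
}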
f727d00639 Bump version to 0.1.1 2025-11-01 17:21:10 +01:00
a946b14f69 Fix Android assets upload to correct release
Use gh CLI to upload Android APK and AAB to the tagged release.
2025-11-01 17:20:13 +01:00
471baec284 Simplify Android build: use default command for APK and AAB
tauri android build creates both APK and AAB by default.
2025-11-01 16:44:32 +01:00
8298d807f3 Fix Android build commands: use --apk and --aab flags
Changed from incorrect --bundle aab to correct --aab flag.
2025-11-01 16:34:15 +01:00
42e6459fbf Prevent duplicate builds on tag pushes
Build workflow now ignores all tags to avoid running alongside release workflow.
2025-11-01 16:06:35 +01:00
6ae87fc694 Fix Android OpenSSL build by adding NDK toolchain to PATH
Set proper CC, AR, and RANLIB environment variables for all Android targets
to enable OpenSSL cross-compilation with SQLCipher encryption.
2025-11-01 16:03:46 +01:00
f7867a5bde Restore SQLCipher encryption for Android and fix CI build
- Re-enable bundled-sqlcipher-vendored-openssl for Android
- Add NDK environment variables for OpenSSL compilation
- Install perl and make for OpenSSL build in CI
- Ensures encryption works on all platforms including Android
2025-11-01 15:39:44 +01:00
d82599f588 Fix Android build by using platform-specific rusqlite features
- Use bundled-sqlcipher-vendored-openssl for non-Android platforms
- Use bundled (standard SQLite) for Android to avoid OpenSSL compilation issues
- Resolves OpenSSL build errors on Android targets
2025-11-01 15:36:20 +01:00
72bb211a76 Fix secrets access in workflow conditional
- Move secrets to env block instead of if condition
- Use bash conditional to check if keystore is available
- Provide clear logging for signed vs unsigned builds
2025-11-01 15:28:06 +01:00
f14ce0d6ad Add Android signing configuration to Gradle
- Configure signingConfigs to read from environment variables
- Apply signing to release builds when keystore is available
- Support both signed and unsigned builds
2025-11-01 15:26:21 +01:00
af09f4524d Remove iOS builds from CI/CD workflows 2025-11-01 15:21:58 +01:00
102832675d Fix Android build commands syntax
- Change from --apk to default build (produces APK)
- Change from --aab to --bundle aab for AAB generation
2025-11-01 15:20:49 +01:00
3490de2f51 Configure Android signing and disable iOS builds
- Add optional Android signing for build workflow (unsigned for testing)
- Require Android signing for release workflow
- Disable iOS builds (commented out) until Apple Developer Account is available
2025-11-01 15:06:56 +01:00
7c3af10938 Add Android and iOS builds to CI/CD pipelines 2025-11-01 15:00:33 +01:00
5c5d0785b9 Fix pnpm version conflict in CI workflows 2025-11-01 14:48:58 +01:00
121dd9dd00 Add GitHub Actions CI/CD pipelines
- Add build pipeline for Windows, macOS, and Linux
- Add release pipeline for automated releases
- Remove CLAUDE.md from git tracking
2025-11-01 14:46:01 +01:00
4ff6aee4d8 Fix Vue i18n warnings and component root node issues
- Set useScope: 'global' in UI store to prevent i18n scope conflicts
- Add wrapper div to vault page to ensure single root node for transitions
- Fixes 'Duplicate useI18n calling by local scope' warning
- Fixes 'Component inside <Transition> renders non-element root node' warning
2025-10-31 23:24:20 +01:00
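A sketch of the i18n side of the fix above (composable and store names are illustrative, not the repository's exact code): requesting the global scope from useI18n inside the Pinia store reuses the app-wide i18n instance instead of spawning a second, component-local scope, which is what triggered the duplicate-scope warning.

import { useI18n } from 'vue-i18n'

// Illustrative helper as a UI store might use it.
export function useGlobalTranslations() {
  // useScope: 'global' reuses the app-wide i18n instance rather than creating a local scope.
  const { t, locale } = useI18n({ useScope: 'global' })
  return { t, locale }
}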
dceb49ae90 Add context menu for vault actions and trash functionality
- Add UiButtonContext component for context menu support on buttons
- Implement vault trash functionality using trash crate
- Move vaults to system trash on desktop (with fallback to permanent delete on mobile)
- Add context menu to vault list items for better mobile UX
- Keep hover delete button for desktop users
2025-10-31 22:57:56 +01:00
5ea04a80e0 Fix Android safe-area handling and window maximization
- Fix extension signature verification on Android by canonicalizing paths (symlink compatibility)
- Implement proper safe-area-inset handling for mobile devices
- Add reactive header height measurement to UI store
- Fix maximized window positioning to respect safe-areas and header
- Create reusable HaexDebugOverlay component for mobile debugging
- Fix Swiper navigation by using absolute positioning instead of flex-1
- Remove debug logging after Android compatibility confirmed

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-10-31 02:18:59 +01:00
65cf2e2c3c adjust gitignore 2025-10-30 22:01:31 +01:00
68d542b4d7 Update extension system and database migrations
Changes:
- Added CLAUDE.md with project instructions
- Updated extension manifest bindings (TypeScript)
- Regenerated database migrations (consolidated into single migration)
- Updated haex schema with table name handling
- Enhanced extension manager and manifest handling in Rust
- Updated extension store in frontend
- Updated vault.db

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-10-30 21:59:13 +01:00
f97cd4ad97 adjust drizzle backend.
return array of arrays
handle table names with quotes
2025-10-30 04:57:01 +01:00
ef225b281f refactored design 2025-10-28 14:16:17 +01:00
16b71d9ea8 fix: Snap Dropzones 2025-10-27 11:26:12 +01:00
5ee5ced8c0 desktop icons now with foreign key to extensions 2025-10-26 00:19:15 +02:00
86b65f117d Cleanup; renamed postMessages 2025-10-25 23:17:28 +02:00
5fdea155d1 removed logs 2025-10-25 08:14:59 +02:00
cb0c8d71f4 fix window on workspace rendering 2025-10-25 08:09:15 +02:00
9281a85deb fix linting 2025-10-24 14:37:20 +02:00
8f8bbb5558 fix window overview 2025-10-24 14:33:56 +02:00
252b8711de feature: window overview 2025-10-24 13:17:29 +02:00
4f839aa856 fixed trigger 2025-10-23 13:17:58 +02:00
99ccadce00 removed pk fk mapping 2025-10-23 10:24:19 +02:00
922ae539ba no more soft delete => we do it hard now 2025-10-23 09:26:36 +02:00
3d020e7dcf refactored workspace table 2025-10-22 15:52:56 +02:00
f70e924cc3 refactored rust sql and drizzle 2025-10-22 15:05:36 +02:00
9ea057e943 fixed drizzle rust logic 2025-10-21 16:29:13 +02:00
e268947593 reorganized window 2025-10-21 13:49:29 +02:00
111 changed files with 8031 additions and 12431 deletions

View File

@ -1,227 +0,0 @@
{
"session_date": "2025-10-20",
"project": "haex-hub System Windows Architecture + Drizzle CRDT RETURNING Fix + PK-Remapping Refactor",
"status": "system_windows_ui_integration_completed",
"context": {
"main_work_today": [
"Fixed Drizzle CRDT integration with RETURNING support",
"Implemented System Windows architecture (Settings, Marketplace as Desktop Windows)",
"Refactored executor functions: Split execute/query paths for cleaner code",
"Integrated System Windows UI: Launcher, Window Component, Placeholder Components"
],
"completed_today": [
"Added AST-based statement_has_returning() for safe RETURNING detection",
"Created SqlExecutor::query_internal() for INSERT/UPDATE/DELETE with RETURNING",
"Simplified execute_internal to use execute_internal_typed (now with PK-Remapping!)",
"Added sql_query_with_crdt Tauri command",
"Updated drizzleCallback in index.ts to route correctly",
"Extended IWindow interface: type ('system' | 'extension'), sourceId",
"Added SystemWindowDefinition interface",
"Created system windows registry in windowManager store",
"Extended openWindow() to support both system and extension windows",
"Added singleton support for system windows",
"Split execute_internal_typed_with_context into two functions (execute vs query)",
"Created query_internal_typed_with_context with full PK-Remapping support",
"Updated query_internal to use new typed function with PK-Remapping",
"Fixed manager.rs to use query_internal_typed_with_context for INSERT RETURNING",
"Extended Launcher to show System Windows + Extensions alphabetically",
"Adapted Window Component to render System Windows as Vue Components, Extensions as iFrames",
"Created placeholder components: Settings.vue and Marketplace.vue"
],
"tech_stack": "Vue 3, TypeScript, Pinia, Nuxt UI, Tauri, Rust, Drizzle ORM, SQLite"
},
"drizzle_crdt_implementation": {
"problem": "Drizzle .insert().returning() executed SQL twice and lost RETURNING data",
"solution": "Separate execute and query paths based on RETURNING clause",
"typescript_side": {
"file": "src/stores/vault/index.ts",
"drizzleCallback_logic": {
"select": "sql_select (unchanged)",
"with_returning": "sql_query_with_crdt (NEW)",
"without_returning": "sql_execute_with_crdt (unchanged)"
},
"hasReturning_check": "String-based RETURNING regex (safe enough for generated SQL)"
},
"rust_side": {
"files": [
"src-tauri/src/database/core.rs",
"src-tauri/src/extension/database/executor.rs",
"src-tauri/src/database/mod.rs"
],
"core_rs_changes": {
"statement_has_returning": {
"line": 84,
"purpose": "AST-based RETURNING check (INSERT, UPDATE, DELETE)",
"safety": "Checks actual AST, not string matching"
},
"convert_value_ref_to_json": {
"line": 333,
"visibility": "Made public for reuse"
},
"removed": "query_with_crdt function (replaced by SqlExecutor::query_internal)"
},
"executor_rs_changes": {
"execute_internal_typed_with_context": {
"line": 100,
"purpose": "Execute SQL WITHOUT RETURNING (with CRDT and FK-Remapping)",
"returns": "Result<HashSet<String>, DatabaseError>",
"behavior": "Handles INSERTs with FK-Remapping, uses execute()"
},
"query_internal_typed_with_context": {
"line": 186,
"purpose": "Execute SQL WITH RETURNING (with CRDT, PK-Remapping, FK-Remapping)",
"returns": "Result<(HashSet<String>, Vec<Vec<JsonValue>>), DatabaseError>",
"behavior": "Handles INSERTs with full PK-Remapping + FK-Remapping, returns all RETURNING columns"
},
"query_internal": {
"line": 454,
"purpose": "Execute with CRDT + return full RETURNING results (JsonValue params)",
"behavior": "Wrapper around query_internal_typed_with_context"
},
"execute_internal_refactor": {
"line": 345,
"change": "Now wrapper around execute_internal_typed",
"benefit": "Drizzle now gets PK-Remapping for ON CONFLICT!"
}
},
"mod_rs_changes": {
"sql_query_with_crdt": {
"line": 59,
"calls": "SqlExecutor::query_internal",
"returns": "Vec<Vec<JsonValue>>"
}
},
"lib_rs_changes": {
"registered_command": "sql_query_with_crdt added to invoke_handler"
}
},
"benefits": [
"SQL executed only once (not twice)",
"Full RETURNING results available to Drizzle",
"PK-Remapping now works for Drizzle (both execute and query paths)",
"AST-based RETURNING detection (safe)",
"Less code duplication",
"Cleaner code separation: execute vs query functions",
"FK-Remapping works across transactions with PkRemappingContext"
]
},
"system_windows_architecture": {
"concept": "ALL UI (Settings, Marketplace, etc.) as DesktopWindows, same as Extensions",
"status": "Store completed, UI integration pending",
"window_manager_store": {
"file": "src/stores/desktop/windowManager.ts",
"changes": {
"IWindow_interface": {
"added_fields": [
"type: 'system' | 'extension'",
"sourceId: string (replaces extensionId)"
],
"removed_fields": ["extensionId"]
},
"SystemWindowDefinition": {
"fields": "id, name, icon, component, defaultWidth, defaultHeight, resizable, singleton"
},
"system_windows_registry": {
"line": 46,
"entries": ["settings", "marketplace"],
"structure": "Record<string, SystemWindowDefinition>"
},
"openWindow_function": {
"line": 101,
"signature": "(type, sourceId, title?, icon?, width?, height?, sourcePosition?)",
"features": [
"Type-based handling (system vs extension)",
"Singleton check for system windows",
"Auto-loads defaults from registry",
"Activates existing singleton if already open"
]
},
"new_exports": ["getAllSystemWindows", "getSystemWindow"]
}
},
"ui_integration": {
"launcher": {
"file": "src/components/haex/extension/launcher.vue",
"changes": [
"Combined system windows and extensions in unified launcherItems computed",
"Alphabetically sorted by name",
"openItem() function handles both types with correct openWindow() signature",
"Uses windowManagerStore.getAllSystemWindows()"
]
},
"desktop_window_component": {
"file": "src/components/haex/desktop/index.vue",
"changes": [
"Dynamic component rendering: <component :is='getSystemWindowComponent()'> for system windows",
"HaexDesktopExtensionFrame for extensions (iFrame)",
"getSystemWindowComponent() function to retrieve Vue component from registry",
"Applied to both normal mode and overview mode"
]
},
"placeholder_components": {
"created": [
"src/components/haex/system/settings.vue",
"src/components/haex/system/marketplace.vue"
],
"description": "Simple placeholder UI with sections and styling"
}
},
"next_steps": {
"priority": [
"Desktop Icons: Support system window icons (alongside extension icons)",
"Drag & Drop: Launcher → Desktop for all types (system + extension)"
]
}
},
"workspace_overview_context": {
"still_active": "GNOME-style workspace overview with UDrawer, implemented yesterday",
"file": "src/components/haex/desktop/index.vue",
"status": "Working, workspace switching functional"
},
"file_changes_today": {
"modified": [
"src/stores/vault/index.ts (drizzleCallback with hasReturning)",
"src-tauri/src/database/core.rs (statement_has_returning, removed query_with_crdt)",
"src-tauri/src/extension/database/executor.rs (split execute/query functions, PK-Remapping)",
"src-tauri/src/database/mod.rs (sql_query_with_crdt command)",
"src-tauri/src/lib.rs (registered sql_query_with_crdt)",
"src/stores/desktop/windowManager.ts (system windows support)",
"src-tauri/src/extension/core/manager.rs (updated to use query_internal_typed_with_context)",
"src/components/haex/extension/launcher.vue (unified launcher for system + extensions)",
"src/components/haex/desktop/index.vue (dynamic component rendering)"
],
"created": [
"src/components/haex/system/settings.vue",
"src/components/haex/system/marketplace.vue"
],
"deleted": []
},
"important_notes": [
"Drizzle RETURNING now fully functional with CRDT",
"System windows use Vue components, Extensions use iFrame",
"sourceId is generic: extensionId for extensions, systemWindowId for system windows",
"Singleton system windows auto-activate if already open",
"PK-Remapping now works for both execute and query paths",
"executor.rs: Two separate functions for execute (no RETURNING) vs query (with RETURNING)",
"query_internal_typed_with_context returns full RETURNING results as Vec<Vec<JsonValue>>",
"FK-Remapping works across transaction using PkRemappingContext",
"Next session: Implement Launcher UI integration for system windows"
],
"todos_remaining": [
"Desktop Icons für System Windows unterstützen (neben Extension Icons)",
"Drag & Drop vom Launcher zum Desktop implementieren (für beide Typen)"
]
}
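To make the RETURNING routing described in the session notes above concrete, here is an illustrative sketch (not the repository's exact code) of a drizzleCallback that dispatches to the three Tauri commands named in the notes. The invoke payload shape and the regex checks are assumptions; per the notes, the Rust side re-validates RETURNING against the parsed AST.

import { invoke } from '@tauri-apps/api/core'

type SqlValue = string | number | boolean | null

async function drizzleCallback(sql: string, params: SqlValue[]): Promise<{ rows: SqlValue[][] }> {
  const isSelect = /^\s*select\b/i.test(sql)
  const hasReturning = /\breturning\b/i.test(sql) // coarse string check, good enough for generated SQL

  if (isSelect) {
    // Plain reads bypass the CRDT bookkeeping entirely.
    return { rows: await invoke<SqlValue[][]>('sql_select', { sql, params }) }
  }
  if (hasReturning) {
    // INSERT/UPDATE/DELETE ... RETURNING: executed once, with CRDT handling and
    // PK remapping, and the full RETURNING rows flow back to Drizzle.
    return { rows: await invoke<SqlValue[][]>('sql_query_with_crdt', { sql, params }) }
  }
  // Writes without RETURNING keep using the existing execute path.
  await invoke('sql_execute_with_crdt', { sql, params })
  return { rows: [] }
}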

228
.github/workflows/build.yml vendored Normal file
View File

@ -0,0 +1,228 @@
name: Build
on:
push:
branches:
- main
- develop
tags-ignore:
- '**'
pull_request:
branches:
- main
- develop
workflow_dispatch:
jobs:
build-desktop:
strategy:
fail-fast: false
matrix:
include:
- platform: 'macos-latest'
args: '--target aarch64-apple-darwin'
- platform: 'macos-latest'
args: '--target x86_64-apple-darwin'
- platform: 'ubuntu-22.04'
args: ''
- platform: 'windows-latest'
args: ''
runs-on: ${{ matrix.platform }}
steps:
- name: Checkout repository
uses: actions/checkout@v4
- name: Setup Node.js
uses: actions/setup-node@v4
with:
node-version: '20'
- name: Install pnpm
uses: pnpm/action-setup@v4
- name: Setup Rust
uses: dtolnay/rust-toolchain@stable
with:
targets: ${{ matrix.platform == 'macos-latest' && 'aarch64-apple-darwin,x86_64-apple-darwin' || '' }}
- name: Install dependencies (Ubuntu)
if: matrix.platform == 'ubuntu-22.04'
run: |
sudo apt-get update
sudo apt-get install -y libwebkit2gtk-4.1-dev libappindicator3-dev librsvg2-dev patchelf libssl-dev
- name: Get pnpm store directory
shell: bash
run: |
echo "STORE_PATH=$(pnpm store path --silent)" >> $GITHUB_ENV
- name: Setup pnpm cache
uses: actions/cache@v4
with:
path: ${{ env.STORE_PATH }}
key: ${{ runner.os }}-pnpm-store-${{ hashFiles('**/pnpm-lock.yaml') }}
restore-keys: |
${{ runner.os }}-pnpm-store-
- name: Setup Rust cache
uses: Swatinem/rust-cache@v2
with:
workspaces: src-tauri
- name: Install frontend dependencies
run: pnpm install --frozen-lockfile
- name: Build Tauri app
uses: tauri-apps/tauri-action@v0
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
args: ${{ matrix.args }}
- name: Upload artifacts (macOS)
if: matrix.platform == 'macos-latest'
uses: actions/upload-artifact@v4
with:
name: macos-${{ contains(matrix.args, 'aarch64') && 'aarch64' || 'x86_64' }}
path: |
src-tauri/target/*/release/bundle/dmg/*.dmg
src-tauri/target/*/release/bundle/macos/*.app
- name: Upload artifacts (Ubuntu)
if: matrix.platform == 'ubuntu-22.04'
uses: actions/upload-artifact@v4
with:
name: linux
path: |
src-tauri/target/release/bundle/deb/*.deb
src-tauri/target/release/bundle/appimage/*.AppImage
src-tauri/target/release/bundle/rpm/*.rpm
- name: Upload artifacts (Windows)
if: matrix.platform == 'windows-latest'
uses: actions/upload-artifact@v4
with:
name: windows
path: |
src-tauri/target/release/bundle/msi/*.msi
src-tauri/target/release/bundle/nsis/*.exe
build-android:
runs-on: ubuntu-22.04
steps:
- name: Checkout repository
uses: actions/checkout@v4
- name: Setup Node.js
uses: actions/setup-node@v4
with:
node-version: '20'
- name: Install pnpm
uses: pnpm/action-setup@v4
- name: Setup Java
uses: actions/setup-java@v4
with:
distribution: 'temurin'
java-version: '17'
- name: Setup Android SDK
uses: android-actions/setup-android@v3
- name: Setup Rust
uses: dtolnay/rust-toolchain@stable
- name: Install Rust Android targets
run: |
rustup target add aarch64-linux-android
rustup target add armv7-linux-androideabi
rustup target add i686-linux-android
rustup target add x86_64-linux-android
- name: Setup NDK
uses: nttld/setup-ndk@v1
with:
ndk-version: r26d
id: setup-ndk
- name: Setup Android NDK environment for OpenSSL
run: |
echo "ANDROID_NDK_HOME=${{ steps.setup-ndk.outputs.ndk-path }}" >> $GITHUB_ENV
echo "NDK_HOME=${{ steps.setup-ndk.outputs.ndk-path }}" >> $GITHUB_ENV
# Add all Android toolchains to PATH for OpenSSL cross-compilation
echo "${{ steps.setup-ndk.outputs.ndk-path }}/toolchains/llvm/prebuilt/linux-x86_64/bin" >> $GITHUB_PATH
# Set CC, AR, RANLIB for each target
echo "CC_aarch64_linux_android=aarch64-linux-android24-clang" >> $GITHUB_ENV
echo "AR_aarch64_linux_android=llvm-ar" >> $GITHUB_ENV
echo "RANLIB_aarch64_linux_android=llvm-ranlib" >> $GITHUB_ENV
echo "CC_armv7_linux_androideabi=armv7a-linux-androideabi24-clang" >> $GITHUB_ENV
echo "AR_armv7_linux_androideabi=llvm-ar" >> $GITHUB_ENV
echo "RANLIB_armv7_linux_androideabi=llvm-ranlib" >> $GITHUB_ENV
echo "CC_i686_linux_android=i686-linux-android24-clang" >> $GITHUB_ENV
echo "AR_i686_linux_android=llvm-ar" >> $GITHUB_ENV
echo "RANLIB_i686_linux_android=llvm-ranlib" >> $GITHUB_ENV
echo "CC_x86_64_linux_android=x86_64-linux-android24-clang" >> $GITHUB_ENV
echo "AR_x86_64_linux_android=llvm-ar" >> $GITHUB_ENV
echo "RANLIB_x86_64_linux_android=llvm-ranlib" >> $GITHUB_ENV
- name: Install build dependencies for OpenSSL
run: |
sudo apt-get update
sudo apt-get install -y perl make
- name: Get pnpm store directory
shell: bash
run: |
echo "STORE_PATH=$(pnpm store path --silent)" >> $GITHUB_ENV
- name: Setup pnpm cache
uses: actions/cache@v4
with:
path: ${{ env.STORE_PATH }}
key: ${{ runner.os }}-pnpm-store-${{ hashFiles('**/pnpm-lock.yaml') }}
restore-keys: |
${{ runner.os }}-pnpm-store-
- name: Setup Rust cache
uses: Swatinem/rust-cache@v2
with:
workspaces: src-tauri
- name: Install frontend dependencies
run: pnpm install --frozen-lockfile
- name: Setup Keystore (if secrets available)
env:
ANDROID_KEYSTORE: ${{ secrets.ANDROID_KEYSTORE }}
ANDROID_KEYSTORE_PASSWORD: ${{ secrets.ANDROID_KEYSTORE_PASSWORD }}
ANDROID_KEY_ALIAS: ${{ secrets.ANDROID_KEY_ALIAS }}
ANDROID_KEY_PASSWORD: ${{ secrets.ANDROID_KEY_PASSWORD }}
run: |
if [ -n "$ANDROID_KEYSTORE" ]; then
echo "$ANDROID_KEYSTORE" | base64 -d > $HOME/keystore.jks
echo "ANDROID_KEYSTORE_PATH=$HOME/keystore.jks" >> $GITHUB_ENV
echo "ANDROID_KEYSTORE_PASSWORD=$ANDROID_KEYSTORE_PASSWORD" >> $GITHUB_ENV
echo "ANDROID_KEY_ALIAS=$ANDROID_KEY_ALIAS" >> $GITHUB_ENV
echo "ANDROID_KEY_PASSWORD=$ANDROID_KEY_PASSWORD" >> $GITHUB_ENV
echo "Keystore configured for signing"
else
echo "No keystore configured, building unsigned APK"
fi
- name: Build Android APK and AAB (unsigned if no keystore)
run: pnpm tauri android build
- name: Upload Android artifacts
uses: actions/upload-artifact@v4
with:
name: android
path: |
src-tauri/gen/android/app/build/outputs/apk/**/*.apk
src-tauri/gen/android/app/build/outputs/bundle/**/*.aab

251
.github/workflows/release.yml vendored Normal file
View File

@ -0,0 +1,251 @@
name: Release
on:
push:
tags:
- 'v*'
workflow_dispatch:
jobs:
create-release:
permissions:
contents: write
runs-on: ubuntu-22.04
outputs:
release_id: ${{ steps.create-release.outputs.release_id }}
upload_url: ${{ steps.create-release.outputs.upload_url }}
steps:
- name: Checkout repository
uses: actions/checkout@v4
- name: Setup Node.js
uses: actions/setup-node@v4
with:
node-version: '20'
- name: Get version
run: echo "PACKAGE_VERSION=$(node -p "require('./package.json').version")" >> $GITHUB_ENV
- name: Create release
id: create-release
uses: actions/github-script@v7
with:
script: |
const { data } = await github.rest.repos.createRelease({
owner: context.repo.owner,
repo: context.repo.repo,
tag_name: `v${process.env.PACKAGE_VERSION}`,
name: `haex-hub v${process.env.PACKAGE_VERSION}`,
body: 'Take a look at the assets to download and install this app.',
draft: true,
prerelease: false
})
core.setOutput('release_id', data.id)
core.setOutput('upload_url', data.upload_url)
return data.id
build-desktop:
needs: create-release
permissions:
contents: write
strategy:
fail-fast: false
matrix:
include:
- platform: 'macos-latest'
args: '--target aarch64-apple-darwin'
- platform: 'macos-latest'
args: '--target x86_64-apple-darwin'
- platform: 'ubuntu-22.04'
args: ''
- platform: 'windows-latest'
args: ''
runs-on: ${{ matrix.platform }}
steps:
- name: Checkout repository
uses: actions/checkout@v4
- name: Setup Node.js
uses: actions/setup-node@v4
with:
node-version: '20'
- name: Install pnpm
uses: pnpm/action-setup@v4
- name: Setup Rust
uses: dtolnay/rust-toolchain@stable
with:
targets: ${{ matrix.platform == 'macos-latest' && 'aarch64-apple-darwin,x86_64-apple-darwin' || '' }}
- name: Install dependencies (Ubuntu)
if: matrix.platform == 'ubuntu-22.04'
run: |
sudo apt-get update
sudo apt-get install -y libwebkit2gtk-4.1-dev libappindicator3-dev librsvg2-dev patchelf libssl-dev
- name: Get pnpm store directory
shell: bash
run: |
echo "STORE_PATH=$(pnpm store path --silent)" >> $GITHUB_ENV
- name: Setup pnpm cache
uses: actions/cache@v4
with:
path: ${{ env.STORE_PATH }}
key: ${{ runner.os }}-pnpm-store-${{ hashFiles('**/pnpm-lock.yaml') }}
restore-keys: |
${{ runner.os }}-pnpm-store-
- name: Setup Rust cache
uses: Swatinem/rust-cache@v2
with:
workspaces: src-tauri
- name: Install frontend dependencies
run: pnpm install --frozen-lockfile
- name: Build and release Tauri app
uses: tauri-apps/tauri-action@v0
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
releaseId: ${{ needs.create-release.outputs.release_id }}
args: ${{ matrix.args }}
build-android:
needs: create-release
permissions:
contents: write
runs-on: ubuntu-22.04
steps:
- name: Checkout repository
uses: actions/checkout@v4
- name: Setup Node.js
uses: actions/setup-node@v4
with:
node-version: '20'
- name: Install pnpm
uses: pnpm/action-setup@v4
- name: Setup Java
uses: actions/setup-java@v4
with:
distribution: 'temurin'
java-version: '17'
- name: Setup Android SDK
uses: android-actions/setup-android@v3
- name: Setup Rust
uses: dtolnay/rust-toolchain@stable
- name: Install Rust Android targets
run: |
rustup target add aarch64-linux-android
rustup target add armv7-linux-androideabi
rustup target add i686-linux-android
rustup target add x86_64-linux-android
- name: Setup NDK
uses: nttld/setup-ndk@v1
with:
ndk-version: r26d
id: setup-ndk
- name: Setup Android NDK environment for OpenSSL
run: |
echo "ANDROID_NDK_HOME=${{ steps.setup-ndk.outputs.ndk-path }}" >> $GITHUB_ENV
echo "NDK_HOME=${{ steps.setup-ndk.outputs.ndk-path }}" >> $GITHUB_ENV
# Add all Android toolchains to PATH for OpenSSL cross-compilation
echo "${{ steps.setup-ndk.outputs.ndk-path }}/toolchains/llvm/prebuilt/linux-x86_64/bin" >> $GITHUB_PATH
# Set CC, AR, RANLIB for each target
echo "CC_aarch64_linux_android=aarch64-linux-android24-clang" >> $GITHUB_ENV
echo "AR_aarch64_linux_android=llvm-ar" >> $GITHUB_ENV
echo "RANLIB_aarch64_linux_android=llvm-ranlib" >> $GITHUB_ENV
echo "CC_armv7_linux_androideabi=armv7a-linux-androideabi24-clang" >> $GITHUB_ENV
echo "AR_armv7_linux_androideabi=llvm-ar" >> $GITHUB_ENV
echo "RANLIB_armv7_linux_androideabi=llvm-ranlib" >> $GITHUB_ENV
echo "CC_i686_linux_android=i686-linux-android24-clang" >> $GITHUB_ENV
echo "AR_i686_linux_android=llvm-ar" >> $GITHUB_ENV
echo "RANLIB_i686_linux_android=llvm-ranlib" >> $GITHUB_ENV
echo "CC_x86_64_linux_android=x86_64-linux-android24-clang" >> $GITHUB_ENV
echo "AR_x86_64_linux_android=llvm-ar" >> $GITHUB_ENV
echo "RANLIB_x86_64_linux_android=llvm-ranlib" >> $GITHUB_ENV
- name: Install build dependencies for OpenSSL
run: |
sudo apt-get update
sudo apt-get install -y perl make
- name: Get pnpm store directory
shell: bash
run: |
echo "STORE_PATH=$(pnpm store path --silent)" >> $GITHUB_ENV
- name: Setup pnpm cache
uses: actions/cache@v4
with:
path: ${{ env.STORE_PATH }}
key: ${{ runner.os }}-pnpm-store-${{ hashFiles('**/pnpm-lock.yaml') }}
restore-keys: |
${{ runner.os }}-pnpm-store-
- name: Setup Rust cache
uses: Swatinem/rust-cache@v2
with:
workspaces: src-tauri
- name: Install frontend dependencies
run: pnpm install --frozen-lockfile
- name: Setup Keystore (required for release)
run: |
echo "${{ secrets.ANDROID_KEYSTORE }}" | base64 -d > $HOME/keystore.jks
echo "ANDROID_KEYSTORE_PATH=$HOME/keystore.jks" >> $GITHUB_ENV
echo "ANDROID_KEYSTORE_PASSWORD=${{ secrets.ANDROID_KEYSTORE_PASSWORD }}" >> $GITHUB_ENV
echo "ANDROID_KEY_ALIAS=${{ secrets.ANDROID_KEY_ALIAS }}" >> $GITHUB_ENV
echo "ANDROID_KEY_PASSWORD=${{ secrets.ANDROID_KEY_PASSWORD }}" >> $GITHUB_ENV
- name: Build Android APK and AAB (signed)
run: pnpm tauri android build
- name: Upload Android artifacts to Release
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
run: |
gh release upload ${{ github.ref_name }} \
src-tauri/gen/android/app/build/outputs/apk/universal/release/app-universal-release.apk \
src-tauri/gen/android/app/build/outputs/bundle/universalRelease/app-universal-release.aab \
--clobber
publish-release:
permissions:
contents: write
runs-on: ubuntu-22.04
needs: [create-release, build-desktop, build-android]
steps:
- name: Publish release
id: publish-release
uses: actions/github-script@v7
env:
release_id: ${{ needs.create-release.outputs.release_id }}
with:
script: |
github.rest.repos.updateRelease({
owner: context.repo.owner,
repo: context.repo.repo,
release_id: process.env.release_id,
draft: false,
prerelease: false
})

2
.gitignore vendored
View File

@ -27,3 +27,5 @@ src-tauri/target
 nogit*
 .claude
 .output
+target
+CLAUDE.md

nuxt.config.ts
View File

@ -1,5 +1,3 @@
-//import tailwindcss from '@tailwindcss/vite'
 import { fileURLToPath } from 'node:url'
 // https://nuxt.com/docs/api/configuration/nuxt-config
@ -16,6 +14,9 @@ export default defineNuxtConfig({
   },
   app: {
+    head: {
+      viewport: 'width=device-width, initial-scale=1.0, viewport-fit=cover',
+    },
     pageTransition: {
       name: 'fade',
     },
@ -28,7 +29,6 @@ export default defineNuxtConfig({
     '@vueuse/nuxt',
     '@nuxt/icon',
     '@nuxt/eslint',
-    //"@nuxt/image",
     '@nuxt/fonts',
     '@nuxt/ui',
   ],
@ -108,8 +108,7 @@ export default defineNuxtConfig({
   runtimeConfig: {
     public: {
       haexVault: {
-        lastVaultFileName: 'lastVaults.json',
-        instanceFileName: 'instance.json',
+        deviceFileName: 'device.json',
         defaultVaultName: 'HaexHub',
       },
     },
@ -123,7 +122,6 @@ export default defineNuxtConfig({
   },
   vite: {
-    //plugins: [tailwindcss()],
     // Better support for Tauri CLI output
     clearScreen: false,
     // Enable environment variables

package.json
View File

@ -1,10 +1,10 @@
 {
   "name": "haex-hub",
   "private": true,
-  "version": "0.1.0",
+  "version": "0.1.3",
   "type": "module",
   "scripts": {
-    "build": "nuxt build",
+    "build": "nuxt build && node scripts/fix-vite-mapdeps.js",
     "dev": "nuxt dev",
     "drizzle:generate": "drizzle-kit generate",
     "drizzle:migrate": "drizzle-kit migrate",
@ -21,48 +21,47 @@
     "@nuxt/eslint": "1.9.0",
     "@nuxt/fonts": "0.11.4",
     "@nuxt/icon": "2.0.0",
-    "@nuxt/ui": "4.0.0",
+    "@nuxt/ui": "4.1.0",
     "@nuxtjs/i18n": "10.0.6",
-    "@pinia/nuxt": "^0.11.1",
-    "@tailwindcss/vite": "^4.1.10",
-    "@tauri-apps/api": "^2.5.0",
-    "@tauri-apps/plugin-dialog": "^2.2.2",
-    "@tauri-apps/plugin-fs": "^2.3.0",
-    "@tauri-apps/plugin-http": "2.5.2",
+    "@pinia/nuxt": "^0.11.2",
+    "@tailwindcss/vite": "^4.1.16",
+    "@tauri-apps/api": "^2.9.0",
+    "@tauri-apps/plugin-dialog": "^2.4.2",
+    "@tauri-apps/plugin-fs": "^2.4.4",
     "@tauri-apps/plugin-notification": "2.3.1",
-    "@tauri-apps/plugin-opener": "^2.3.0",
-    "@tauri-apps/plugin-os": "^2.2.2",
-    "@tauri-apps/plugin-sql": "2.3.0",
-    "@tauri-apps/plugin-store": "^2.2.1",
+    "@tauri-apps/plugin-opener": "^2.5.2",
+    "@tauri-apps/plugin-os": "^2.3.2",
+    "@tauri-apps/plugin-store": "^2.4.1",
     "@vueuse/components": "^13.9.0",
-    "@vueuse/core": "^13.4.0",
+    "@vueuse/core": "^13.9.0",
     "@vueuse/gesture": "^2.0.0",
-    "@vueuse/nuxt": "^13.4.0",
-    "drizzle-orm": "^0.44.2",
-    "eslint": "^9.34.0",
-    "fuse.js": "^7.1.0",
-    "nuxt": "^4.0.3",
-    "nuxt-zod-i18n": "^1.12.0",
-    "swiper": "^12.0.2",
-    "tailwindcss": "^4.1.10",
-    "vue": "^3.5.20",
-    "vue-router": "^4.5.1",
-    "zod": "4.1.5"
+    "@vueuse/nuxt": "^13.9.0",
+    "drizzle-orm": "^0.44.7",
+    "eslint": "^9.38.0",
+    "nuxt-zod-i18n": "^1.12.1",
+    "swiper": "^12.0.3",
+    "tailwindcss": "^4.1.16",
+    "vue": "^3.5.22",
+    "vue-router": "^4.6.3",
+    "zod": "^3.25.76"
   },
   "devDependencies": {
-    "@iconify/json": "^2.2.351",
+    "@iconify-json/hugeicons": "^1.2.17",
+    "@iconify-json/lucide": "^1.2.71",
+    "@iconify/json": "^2.2.401",
     "@iconify/tailwind4": "^1.0.6",
     "@libsql/client": "^0.15.15",
-    "@tauri-apps/cli": "^2.5.0",
-    "@types/node": "^24.6.2",
+    "@tauri-apps/cli": "^2.9.1",
+    "@types/node": "^24.9.1",
     "@vitejs/plugin-vue": "6.0.1",
-    "@vue/compiler-sfc": "^3.5.17",
-    "drizzle-kit": "^0.31.2",
-    "globals": "^16.2.0",
+    "@vue/compiler-sfc": "^3.5.22",
+    "drizzle-kit": "^0.31.5",
+    "globals": "^16.4.0",
+    "nuxt": "^4.2.0",
     "prettier": "3.6.2",
     "tsx": "^4.20.6",
-    "tw-animate-css": "^1.3.8",
-    "typescript": "^5.8.3",
+    "tw-animate-css": "^1.4.0",
+    "typescript": "^5.9.3",
     "vite": "7.1.3",
     "vue-tsc": "3.0.6"
   },

4563
pnpm-lock.yaml generated

File diff suppressed because it is too large.

scripts/fix-vite-mapdeps.js
View File

@ -0,0 +1,52 @@
#!/usr/bin/env node
/**
* Post-build script to fix the Vite 7.x TDZ error in __vite__mapDeps
* This script patches the generated JavaScript files after the build
*/
import { readdir, readFile, writeFile } from 'node:fs/promises'
import { join } from 'node:path'
const NUXT_DIR = join(process.cwd(), '.output/public/_nuxt')
async function fixFile(filePath) {
const content = await readFile(filePath, 'utf-8')
const fixedContent = content.replace(
/const __vite__mapDeps=\(i,m=__vite__mapDeps,/g,
'let __vite__mapDeps;__vite__mapDeps=(i,m=__vite__mapDeps,'
)
if (fixedContent !== content) {
await writeFile(filePath, fixedContent, 'utf-8')
console.log(`✓ Fixed TDZ error in ${filePath.split('/').pop()}`)
return true
}
return false
}
async function main() {
try {
const files = await readdir(NUXT_DIR)
const jsFiles = files.filter((f) => f.endsWith('.js'))
let fixedCount = 0
for (const file of jsFiles) {
const filePath = join(NUXT_DIR, file)
const fixed = await fixFile(filePath)
if (fixed) fixedCount++
}
if (fixedCount > 0) {
console.log(`\n✓ Fixed __vite__mapDeps TDZ error in ${fixedCount} file(s)`)
} else {
console.log('\n✓ No __vite__mapDeps TDZ errors found')
}
} catch (error) {
console.error('Error fixing __vite__mapDeps:', error)
process.exit(1)
}
}
main()

1301
src-tauri/Cargo.lock generated

File diff suppressed because it is too large.

src-tauri/Cargo.toml
View File

@ -1,6 +1,6 @@
 [package]
 name = "haex-hub"
-version = "0.1.0"
+version = "0.1.3"
 description = "A Tauri App"
 authors = ["you"]
 edition = "2021"
@ -20,14 +20,7 @@ tauri-build = { version = "2.2", features = [] }
 serde = { version = "1.0.228", features = ["derive"] }

 [dependencies]
-rusqlite = { version = "0.37.0", features = [
-    "load_extension",
-    "bundled-sqlcipher-vendored-openssl",
-    "functions",
-] }
-#tauri-plugin-sql = { version = "2", features = ["sqlite"] }
-tokio = { version = "1.47.1", features = ["macros", "rt-multi-thread"] }
-#libsqlite3-sys = { version = "0.31", features = ["bundled-sqlcipher"] }
+tokio = { version = "1.47.1", features = ["macros", "rt-multi-thread"] }
 base64 = "0.22"
 ed25519-dalek = "2.1"
 fs_extra = "1.3.0"
@ -39,18 +32,25 @@ serde = { version = "1", features = ["derive"] }
 serde_json = "1.0.143"
 sha2 = "0.10.9"
 sqlparser = { version = "0.59.0", features = ["visitor"] }
-tauri = { version = "2.8.5", features = ["protocol-asset", "devtools"] }
-tauri-plugin-dialog = "2.4.0"
+tauri = { version = "2.9.1", features = ["protocol-asset", "devtools"] }
+tauri-plugin-dialog = "2.4.2"
 tauri-plugin-fs = "2.4.0"
-tauri-plugin-http = "2.5.2"
-tauri-plugin-notification = "2.3.1"
-tauri-plugin-opener = "2.5.0"
-tauri-plugin-os = "2.3"
-tauri-plugin-persisted-scope = "2.3.2"
-tauri-plugin-store = "2.4.0"
+tauri-plugin-http = "2.5.4"
+tauri-plugin-notification = "2.3.3"
+tauri-plugin-opener = "2.5.2"
+tauri-plugin-os = "2.3.2"
+tauri-plugin-persisted-scope = "2.3.4"
+tauri-plugin-store = "2.4.1"
 thiserror = "2.0.17"
 ts-rs = { version = "11.1.0", features = ["serde-compat"] }
 uhlc = "0.8.2"
+url = "2.5.7"
 uuid = { version = "1.18.1", features = ["v4"] }
 zip = "6.0.0"
-url = "2.5.7"
+
+[target.'cfg(not(target_os = "android"))'.dependencies]
+trash = "5.2.0"
+rusqlite = { version = "0.37.0", features = ["load_extension", "bundled-sqlcipher-vendored-openssl", "functions"] }
+
+[target.'cfg(target_os = "android")'.dependencies]
+rusqlite = { version = "0.37.0", features = ["load_extension", "bundled-sqlcipher-vendored-openssl", "functions"] }

View File

@ -1,3 +1,3 @@
 // This file was generated by [ts-rs](https://github.com/Aleph-Alpha/ts-rs). Do not edit this file manually.
-export type ExtensionInfoResponse = { id: string, publicKey: string, name: string, version: string, author: string | null, enabled: boolean, description: string | null, homepage: string | null, icon: string | null, devServerUrl: string | null, };
+export type ExtensionInfoResponse = { id: string, publicKey: string, name: string, version: string, author: string | null, enabled: boolean, description: string | null, homepage: string | null, icon: string | null, entry: string | null, singleInstance: boolean | null, devServerUrl: string | null, };

View File

@ -1,4 +1,4 @@
 // This file was generated by [ts-rs](https://github.com/Aleph-Alpha/ts-rs). Do not edit this file manually.
 import type { ExtensionPermissions } from "./ExtensionPermissions";
-export type ExtensionManifest = { name: string, version: string, author: string | null, entry: string, icon: string | null, public_key: string, signature: string, permissions: ExtensionPermissions, homepage: string | null, description: string | null, };
+export type ExtensionManifest = { name: string, version: string, author: string | null, entry: string | null, icon: string | null, public_key: string, signature: string, permissions: ExtensionPermissions, homepage: string | null, description: string | null, single_instance: boolean | null, };

View File

@ -170,6 +170,14 @@ use serde::{Deserialize, Serialize};
       table: schema.haexCrdtSnapshots,
     },
     { name: tablesNames.haex.crdt.configs.name, table: schema.haexCrdtConfigs },
+    {
+      name: tablesNames.haex.desktop_items.name,
+      table: schema.haexDesktopItems,
+    },
+    {
+      name: tablesNames.haex.workspaces.name,
+      table: schema.haexWorkspaces,
+    },
   ]

   for (const { name, table } of schemas) {

View File

@ -19,5 +19,3 @@ const dummyExecutor = async (
 // Create the Drizzle instance for the SQLite dialect
 // Pass in the dummyExecutor and the imported schema
 export const db = drizzle(dummyExecutor, { schema })
-// Also re-export all schema definitions so everything can be imported from a single file

View File

@ -24,9 +24,23 @@ CREATE TABLE `haex_crdt_snapshots` (
 `file_size_bytes` integer
 );
 --> statement-breakpoint
+CREATE TABLE `haex_desktop_items` (
+`id` text PRIMARY KEY NOT NULL,
+`workspace_id` text NOT NULL,
+`item_type` text NOT NULL,
+`extension_id` text,
+`system_window_id` text,
+`position_x` integer DEFAULT 0 NOT NULL,
+`position_y` integer DEFAULT 0 NOT NULL,
+`haex_timestamp` text,
+FOREIGN KEY (`workspace_id`) REFERENCES `haex_workspaces`(`id`) ON UPDATE no action ON DELETE cascade,
+FOREIGN KEY (`extension_id`) REFERENCES `haex_extensions`(`id`) ON UPDATE no action ON DELETE cascade,
+CONSTRAINT "item_reference" CHECK(("haex_desktop_items"."item_type" = 'extension' AND "haex_desktop_items"."extension_id" IS NOT NULL AND "haex_desktop_items"."system_window_id" IS NULL) OR ("haex_desktop_items"."item_type" = 'system' AND "haex_desktop_items"."system_window_id" IS NOT NULL AND "haex_desktop_items"."extension_id" IS NULL) OR ("haex_desktop_items"."item_type" = 'file' AND "haex_desktop_items"."system_window_id" IS NOT NULL AND "haex_desktop_items"."extension_id" IS NULL) OR ("haex_desktop_items"."item_type" = 'folder' AND "haex_desktop_items"."system_window_id" IS NOT NULL AND "haex_desktop_items"."extension_id" IS NULL))
+);
+--> statement-breakpoint
 CREATE TABLE `haex_extension_permissions` (
 `id` text PRIMARY KEY NOT NULL,
-`extension_id` text,
+`extension_id` text NOT NULL,
 `resource_type` text,
 `action` text,
 `target` text,
@ -34,38 +48,28 @@ CREATE TABLE `haex_extension_permissions` (
 `status` text DEFAULT 'denied' NOT NULL,
 `created_at` text DEFAULT (CURRENT_TIMESTAMP),
 `updated_at` integer,
-`haex_tombstone` integer,
 `haex_timestamp` text,
-FOREIGN KEY (`extension_id`) REFERENCES `haex_extensions`(`id`) ON UPDATE no action ON DELETE no action
+FOREIGN KEY (`extension_id`) REFERENCES `haex_extensions`(`id`) ON UPDATE no action ON DELETE cascade
 );
 --> statement-breakpoint
 CREATE UNIQUE INDEX `haex_extension_permissions_extension_id_resource_type_action_target_unique` ON `haex_extension_permissions` (`extension_id`,`resource_type`,`action`,`target`);--> statement-breakpoint
 CREATE TABLE `haex_extensions` (
 `id` text PRIMARY KEY NOT NULL,
+`public_key` text NOT NULL,
+`name` text NOT NULL,
+`version` text NOT NULL,
 `author` text,
 `description` text,
-`entry` text,
+`entry` text DEFAULT 'index.html',
 `homepage` text,
-`enabled` integer,
+`enabled` integer DEFAULT true,
 `icon` text,
-`name` text,
-`public_key` text,
-`signature` text,
-`url` text,
-`version` text,
-`haex_tombstone` integer,
+`signature` text NOT NULL,
+`single_instance` integer DEFAULT false,
 `haex_timestamp` text
 );
 --> statement-breakpoint
-CREATE TABLE `haex_settings` (
-`id` text PRIMARY KEY NOT NULL,
-`key` text,
-`type` text,
-`value` text,
-`haex_tombstone` integer,
-`haex_timestamp` text
-);
---> statement-breakpoint
+CREATE UNIQUE INDEX `haex_extensions_public_key_name_unique` ON `haex_extensions` (`public_key`,`name`);--> statement-breakpoint
 CREATE TABLE `haex_notifications` (
 `id` text PRIMARY KEY NOT NULL,
 `alt` text,
@ -77,63 +81,24 @@ CREATE TABLE `haex_notifications` (
 `text` text,
 `title` text,
 `type` text NOT NULL,
-`haex_tombstone` integer
+`haex_timestamp` text
 );
 --> statement-breakpoint
-CREATE TABLE `haex_passwords_group_items` (
-`group_id` text,
-`item_id` text,
-`haex_tombstone` integer,
-PRIMARY KEY(`item_id`, `group_id`),
-FOREIGN KEY (`group_id`) REFERENCES `haex_passwords_groups`(`id`) ON UPDATE no action ON DELETE no action,
-FOREIGN KEY (`item_id`) REFERENCES `haex_passwords_item_details`(`id`) ON UPDATE no action ON DELETE no action
-);
---> statement-breakpoint
-CREATE TABLE `haex_passwords_groups` (
+CREATE TABLE `haex_settings` (
 `id` text PRIMARY KEY NOT NULL,
-`name` text,
-`description` text,
-`icon` text,
-`order` integer,
-`color` text,
-`parent_id` text,
-`created_at` text DEFAULT (CURRENT_TIMESTAMP),
-`updated_at` integer,
-`haex_tombstone` integer,
-FOREIGN KEY (`parent_id`) REFERENCES `haex_passwords_groups`(`id`) ON UPDATE no action ON DELETE no action
-);
---> statement-breakpoint
-CREATE TABLE `haex_passwords_item_details` (
-`id` text PRIMARY KEY NOT NULL,
-`title` text,
-`username` text,
-`password` text,
-`note` text,
-`icon` text,
-`tags` text,
-`url` text,
-`created_at` text DEFAULT (CURRENT_TIMESTAMP),
-`updated_at` integer,
-`haex_tombstone` integer
-);
---> statement-breakpoint
-CREATE TABLE `haex_passwords_item_history` (
-`id` text PRIMARY KEY NOT NULL,
-`item_id` text,
-`changed_property` text,
-`old_value` text,
-`new_value` text,
-`created_at` text DEFAULT (CURRENT_TIMESTAMP),
-`haex_tombstone` integer,
-FOREIGN KEY (`item_id`) REFERENCES `haex_passwords_item_details`(`id`) ON UPDATE no action ON DELETE no action
-);
---> statement-breakpoint
-CREATE TABLE `haex_passwords_item_key_values` (
-`id` text PRIMARY KEY NOT NULL,
-`item_id` text,
 `key` text,
+`type` text,
 `value` text,
-`updated_at` integer,
-`haex_tombstone` integer,
-FOREIGN KEY (`item_id`) REFERENCES `haex_passwords_item_details`(`id`) ON UPDATE no action ON DELETE no action
+`haex_timestamp` text
 );
+--> statement-breakpoint
+CREATE UNIQUE INDEX `haex_settings_key_type_value_unique` ON `haex_settings` (`key`,`type`,`value`);--> statement-breakpoint
+CREATE TABLE `haex_workspaces` (
+`id` text PRIMARY KEY NOT NULL,
+`device_id` text NOT NULL,
+`name` text NOT NULL,
+`position` integer DEFAULT 0 NOT NULL,
+`haex_timestamp` text
+);
+--> statement-breakpoint
+CREATE UNIQUE INDEX `haex_workspaces_position_unique` ON `haex_workspaces` (`position`);

View File

@ -1 +0,0 @@
ALTER TABLE `haex_notifications` ADD `haex_timestamp` text;

View File

@ -1,22 +0,0 @@
PRAGMA foreign_keys=OFF;--> statement-breakpoint
CREATE TABLE `__new_haex_extensions` (
`id` text PRIMARY KEY NOT NULL,
`public_key` text NOT NULL,
`name` text NOT NULL,
`version` text NOT NULL,
`author` text,
`description` text,
`entry` text DEFAULT 'index.html' NOT NULL,
`homepage` text,
`enabled` integer DEFAULT true,
`icon` text,
`signature` text NOT NULL,
`haex_tombstone` integer,
`haex_timestamp` text
);
--> statement-breakpoint
INSERT INTO `__new_haex_extensions`("id", "public_key", "name", "version", "author", "description", "entry", "homepage", "enabled", "icon", "signature", "haex_tombstone", "haex_timestamp") SELECT "id", "public_key", "name", "version", "author", "description", "entry", "homepage", "enabled", "icon", "signature", "haex_tombstone", "haex_timestamp" FROM `haex_extensions`;--> statement-breakpoint
DROP TABLE `haex_extensions`;--> statement-breakpoint
ALTER TABLE `__new_haex_extensions` RENAME TO `haex_extensions`;--> statement-breakpoint
PRAGMA foreign_keys=ON;--> statement-breakpoint
CREATE UNIQUE INDEX `haex_extensions_public_key_name_unique` ON `haex_extensions` (`public_key`,`name`);

View File

@ -1,9 +0,0 @@
CREATE TABLE `haex_desktop_items` (
`id` text PRIMARY KEY NOT NULL,
`item_type` text NOT NULL,
`reference_id` text NOT NULL,
`position_x` integer DEFAULT 0 NOT NULL,
`position_y` integer DEFAULT 0 NOT NULL,
`haex_tombstone` integer,
`haex_timestamp` text
);

View File

@ -1,10 +0,0 @@
CREATE TABLE `haex_workspaces` (
`id` text PRIMARY KEY NOT NULL,
`name` text NOT NULL,
`position` integer DEFAULT 0 NOT NULL,
`created_at` integer NOT NULL,
`haex_tombstone` integer,
`haex_timestamp` text
);
--> statement-breakpoint
ALTER TABLE `haex_desktop_items` ADD `workspace_id` text NOT NULL REFERENCES haex_workspaces(id);

View File

@ -1 +0,0 @@
CREATE UNIQUE INDEX `haex_workspaces_name_unique` ON `haex_workspaces` (`name`);

View File

@ -1,7 +1,7 @@
{ {
"version": "6", "version": "6",
"dialect": "sqlite", "dialect": "sqlite",
"id": "3bbe52b8-5933-4b21-8b24-de3927a2f9b0", "id": "8dc25226-70f9-4d2e-89d4-f3a6b2bdf58d",
"prevId": "00000000-0000-0000-0000-000000000000", "prevId": "00000000-0000-0000-0000-000000000000",
"tables": { "tables": {
"haex_crdt_configs": { "haex_crdt_configs": {
@ -155,6 +155,106 @@
"uniqueConstraints": {}, "uniqueConstraints": {},
"checkConstraints": {} "checkConstraints": {}
}, },
"haex_desktop_items": {
"name": "haex_desktop_items",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"workspace_id": {
"name": "workspace_id",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"item_type": {
"name": "item_type",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"extension_id": {
"name": "extension_id",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"system_window_id": {
"name": "system_window_id",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"position_x": {
"name": "position_x",
"type": "integer",
"primaryKey": false,
"notNull": true,
"autoincrement": false,
"default": 0
},
"position_y": {
"name": "position_y",
"type": "integer",
"primaryKey": false,
"notNull": true,
"autoincrement": false,
"default": 0
},
"haex_timestamp": {
"name": "haex_timestamp",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {},
"foreignKeys": {
"haex_desktop_items_workspace_id_haex_workspaces_id_fk": {
"name": "haex_desktop_items_workspace_id_haex_workspaces_id_fk",
"tableFrom": "haex_desktop_items",
"tableTo": "haex_workspaces",
"columnsFrom": [
"workspace_id"
],
"columnsTo": [
"id"
],
"onDelete": "cascade",
"onUpdate": "no action"
},
"haex_desktop_items_extension_id_haex_extensions_id_fk": {
"name": "haex_desktop_items_extension_id_haex_extensions_id_fk",
"tableFrom": "haex_desktop_items",
"tableTo": "haex_extensions",
"columnsFrom": [
"extension_id"
],
"columnsTo": [
"id"
],
"onDelete": "cascade",
"onUpdate": "no action"
}
},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {
"item_reference": {
"name": "item_reference",
"value": "(\"haex_desktop_items\".\"item_type\" = 'extension' AND \"haex_desktop_items\".\"extension_id\" IS NOT NULL AND \"haex_desktop_items\".\"system_window_id\" IS NULL) OR (\"haex_desktop_items\".\"item_type\" = 'system' AND \"haex_desktop_items\".\"system_window_id\" IS NOT NULL AND \"haex_desktop_items\".\"extension_id\" IS NULL) OR (\"haex_desktop_items\".\"item_type\" = 'file' AND \"haex_desktop_items\".\"system_window_id\" IS NOT NULL AND \"haex_desktop_items\".\"extension_id\" IS NULL) OR (\"haex_desktop_items\".\"item_type\" = 'folder' AND \"haex_desktop_items\".\"system_window_id\" IS NOT NULL AND \"haex_desktop_items\".\"extension_id\" IS NULL)"
}
}
},
"haex_extension_permissions": { "haex_extension_permissions": {
"name": "haex_extension_permissions", "name": "haex_extension_permissions",
"columns": { "columns": {
@ -169,7 +269,7 @@
"name": "extension_id", "name": "extension_id",
"type": "text", "type": "text",
"primaryKey": false, "primaryKey": false,
"notNull": false, "notNull": true,
"autoincrement": false "autoincrement": false
}, },
"resource_type": { "resource_type": {
@ -223,13 +323,6 @@
"notNull": false, "notNull": false,
"autoincrement": false "autoincrement": false
}, },
"haex_tombstone": {
"name": "haex_tombstone",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"haex_timestamp": { "haex_timestamp": {
"name": "haex_timestamp", "name": "haex_timestamp",
"type": "text", "type": "text",
@ -261,7 +354,7 @@
"columnsTo": [ "columnsTo": [
"id" "id"
], ],
"onDelete": "no action", "onDelete": "cascade",
"onUpdate": "no action" "onUpdate": "no action"
} }
}, },
@ -279,6 +372,27 @@
"notNull": true, "notNull": true,
"autoincrement": false "autoincrement": false
}, },
"public_key": {
"name": "public_key",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"name": {
"name": "name",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"version": {
"name": "version",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"author": { "author": {
"name": "author", "name": "author",
"type": "text", "type": "text",
@ -298,7 +412,8 @@
"type": "text", "type": "text",
"primaryKey": false, "primaryKey": false,
"notNull": false, "notNull": false,
"autoincrement": false "autoincrement": false,
"default": "'index.html'"
}, },
"homepage": { "homepage": {
"name": "homepage", "name": "homepage",
@ -312,7 +427,8 @@
"type": "integer", "type": "integer",
"primaryKey": false, "primaryKey": false,
"notNull": false, "notNull": false,
"autoincrement": false "autoincrement": false,
"default": true
}, },
"icon": { "icon": {
"name": "icon", "name": "icon",
@ -321,99 +437,20 @@
"notNull": false, "notNull": false,
"autoincrement": false "autoincrement": false
}, },
"name": {
"name": "name",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"public_key": {
"name": "public_key",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"signature": { "signature": {
"name": "signature", "name": "signature",
"type": "text", "type": "text",
"primaryKey": false, "primaryKey": false,
"notNull": false,
"autoincrement": false
},
"url": {
"name": "url",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"version": {
"name": "version",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"haex_tombstone": {
"name": "haex_tombstone",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"haex_timestamp": {
"name": "haex_timestamp",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {},
"foreignKeys": {},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_settings": {
"name": "haex_settings",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true, "notNull": true,
"autoincrement": false "autoincrement": false
}, },
"key": { "single_instance": {
"name": "key", "name": "single_instance",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"type": {
"name": "type",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"value": {
"name": "value",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"haex_tombstone": {
"name": "haex_tombstone",
"type": "integer", "type": "integer",
"primaryKey": false, "primaryKey": false,
"notNull": false, "notNull": false,
"autoincrement": false "autoincrement": false,
"default": false
}, },
"haex_timestamp": { "haex_timestamp": {
"name": "haex_timestamp", "name": "haex_timestamp",
@ -423,7 +460,16 @@
"autoincrement": false "autoincrement": false
} }
}, },
"indexes": {}, "indexes": {
"haex_extensions_public_key_name_unique": {
"name": "haex_extensions_public_key_name_unique",
"columns": [
"public_key",
"name"
],
"isUnique": true
}
},
"foreignKeys": {}, "foreignKeys": {},
"compositePrimaryKeys": {}, "compositePrimaryKeys": {},
"uniqueConstraints": {}, "uniqueConstraints": {},
@ -502,9 +548,9 @@
"notNull": true, "notNull": true,
"autoincrement": false "autoincrement": false
}, },
"haex_tombstone": { "haex_timestamp": {
"name": "haex_tombstone", "name": "haex_timestamp",
"type": "integer", "type": "text",
"primaryKey": false, "primaryKey": false,
"notNull": false, "notNull": false,
"autoincrement": false "autoincrement": false
@ -516,74 +562,8 @@
"uniqueConstraints": {}, "uniqueConstraints": {},
"checkConstraints": {} "checkConstraints": {}
}, },
"haex_passwords_group_items": { "haex_settings": {
"name": "haex_passwords_group_items", "name": "haex_settings",
"columns": {
"group_id": {
"name": "group_id",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"item_id": {
"name": "item_id",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"haex_tombstone": {
"name": "haex_tombstone",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {},
"foreignKeys": {
"haex_passwords_group_items_group_id_haex_passwords_groups_id_fk": {
"name": "haex_passwords_group_items_group_id_haex_passwords_groups_id_fk",
"tableFrom": "haex_passwords_group_items",
"tableTo": "haex_passwords_groups",
"columnsFrom": [
"group_id"
],
"columnsTo": [
"id"
],
"onDelete": "no action",
"onUpdate": "no action"
},
"haex_passwords_group_items_item_id_haex_passwords_item_details_id_fk": {
"name": "haex_passwords_group_items_item_id_haex_passwords_item_details_id_fk",
"tableFrom": "haex_passwords_group_items",
"tableTo": "haex_passwords_item_details",
"columnsFrom": [
"item_id"
],
"columnsTo": [
"id"
],
"onDelete": "no action",
"onUpdate": "no action"
}
},
"compositePrimaryKeys": {
"haex_passwords_group_items_item_id_group_id_pk": {
"columns": [
"item_id",
"group_id"
],
"name": "haex_passwords_group_items_item_id_group_id_pk"
}
},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_passwords_groups": {
"name": "haex_passwords_groups",
"columns": { "columns": {
"id": { "id": {
"name": "id", "name": "id",
@ -592,270 +572,6 @@
"notNull": true, "notNull": true,
"autoincrement": false "autoincrement": false
}, },
"name": {
"name": "name",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"description": {
"name": "description",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"icon": {
"name": "icon",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"order": {
"name": "order",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"color": {
"name": "color",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"parent_id": {
"name": "parent_id",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"created_at": {
"name": "created_at",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false,
"default": "(CURRENT_TIMESTAMP)"
},
"updated_at": {
"name": "updated_at",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"haex_tombstone": {
"name": "haex_tombstone",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {},
"foreignKeys": {
"haex_passwords_groups_parent_id_haex_passwords_groups_id_fk": {
"name": "haex_passwords_groups_parent_id_haex_passwords_groups_id_fk",
"tableFrom": "haex_passwords_groups",
"tableTo": "haex_passwords_groups",
"columnsFrom": [
"parent_id"
],
"columnsTo": [
"id"
],
"onDelete": "no action",
"onUpdate": "no action"
}
},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_passwords_item_details": {
"name": "haex_passwords_item_details",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"title": {
"name": "title",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"username": {
"name": "username",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"password": {
"name": "password",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"note": {
"name": "note",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"icon": {
"name": "icon",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"tags": {
"name": "tags",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"url": {
"name": "url",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"created_at": {
"name": "created_at",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false,
"default": "(CURRENT_TIMESTAMP)"
},
"updated_at": {
"name": "updated_at",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"haex_tombstone": {
"name": "haex_tombstone",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {},
"foreignKeys": {},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_passwords_item_history": {
"name": "haex_passwords_item_history",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"item_id": {
"name": "item_id",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"changed_property": {
"name": "changed_property",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"old_value": {
"name": "old_value",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"new_value": {
"name": "new_value",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"created_at": {
"name": "created_at",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false,
"default": "(CURRENT_TIMESTAMP)"
},
"haex_tombstone": {
"name": "haex_tombstone",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {},
"foreignKeys": {
"haex_passwords_item_history_item_id_haex_passwords_item_details_id_fk": {
"name": "haex_passwords_item_history_item_id_haex_passwords_item_details_id_fk",
"tableFrom": "haex_passwords_item_history",
"tableTo": "haex_passwords_item_details",
"columnsFrom": [
"item_id"
],
"columnsTo": [
"id"
],
"onDelete": "no action",
"onUpdate": "no action"
}
},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_passwords_item_key_values": {
"name": "haex_passwords_item_key_values",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"item_id": {
"name": "item_id",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"key": { "key": {
"name": "key", "name": "key",
"type": "text", "type": "text",
@ -863,6 +579,13 @@
"notNull": false, "notNull": false,
"autoincrement": false "autoincrement": false
}, },
"type": {
"name": "type",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"value": { "value": {
"name": "value", "name": "value",
"type": "text", "type": "text",
@ -870,37 +593,80 @@
"notNull": false, "notNull": false,
"autoincrement": false "autoincrement": false
}, },
"updated_at": { "haex_timestamp": {
"name": "updated_at", "name": "haex_timestamp",
"type": "integer", "type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"haex_tombstone": {
"name": "haex_tombstone",
"type": "integer",
"primaryKey": false, "primaryKey": false,
"notNull": false, "notNull": false,
"autoincrement": false "autoincrement": false
} }
}, },
"indexes": {}, "indexes": {
"foreignKeys": { "haex_settings_key_type_value_unique": {
"haex_passwords_item_key_values_item_id_haex_passwords_item_details_id_fk": { "name": "haex_settings_key_type_value_unique",
"name": "haex_passwords_item_key_values_item_id_haex_passwords_item_details_id_fk", "columns": [
"tableFrom": "haex_passwords_item_key_values", "key",
"tableTo": "haex_passwords_item_details", "type",
"columnsFrom": [ "value"
"item_id"
], ],
"columnsTo": [ "isUnique": true
"id"
],
"onDelete": "no action",
"onUpdate": "no action"
} }
}, },
"foreignKeys": {},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_workspaces": {
"name": "haex_workspaces",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"device_id": {
"name": "device_id",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"name": {
"name": "name",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"position": {
"name": "position",
"type": "integer",
"primaryKey": false,
"notNull": true,
"autoincrement": false,
"default": 0
},
"haex_timestamp": {
"name": "haex_timestamp",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {
"haex_workspaces_position_unique": {
"name": "haex_workspaces_position_unique",
"columns": [
"position"
],
"isUnique": true
}
},
"foreignKeys": {},
"compositePrimaryKeys": {}, "compositePrimaryKeys": {},
"uniqueConstraints": {}, "uniqueConstraints": {},
"checkConstraints": {} "checkConstraints": {}

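The modified snapshot above adds a haex_workspaces table (id, device_id, name, position with a unique index, haex_timestamp), and the haex_tombstone columns no longer appear in the new column lists shown. As a rough orientation only, here is a minimal Drizzle sketch of that table as the snapshot JSON describes it; the TypeScript identifiers are assumptions and this is not the project's actual schema module:

import { integer, sqliteTable, text, unique } from 'drizzle-orm/sqlite-core'

// Illustrative reconstruction from the snapshot JSON above, not the real schema file.
export const haexWorkspaces = sqliteTable(
  'haex_workspaces',
  {
    id: text('id').primaryKey(),
    deviceId: text('device_id').notNull(),
    name: text('name').notNull(),
    position: integer('position').notNull().default(0),
    haexTimestamp: text('haex_timestamp'),
  },
  // Drizzle derives the constraint name haex_workspaces_position_unique from table and column.
  (table) => [unique().on(table.position)],
)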

@@ -1,926 +0,0 @@
{
"version": "6",
"dialect": "sqlite",
"id": "862ac1d5-3065-4244-8652-2b6782254862",
"prevId": "3bbe52b8-5933-4b21-8b24-de3927a2f9b0",
"tables": {
"haex_crdt_configs": {
"name": "haex_crdt_configs",
"columns": {
"key": {
"name": "key",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"value": {
"name": "value",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {},
"foreignKeys": {},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_crdt_logs": {
"name": "haex_crdt_logs",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"haex_timestamp": {
"name": "haex_timestamp",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"table_name": {
"name": "table_name",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"row_pks": {
"name": "row_pks",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"op_type": {
"name": "op_type",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"column_name": {
"name": "column_name",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"new_value": {
"name": "new_value",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"old_value": {
"name": "old_value",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {
"idx_haex_timestamp": {
"name": "idx_haex_timestamp",
"columns": [
"haex_timestamp"
],
"isUnique": false
},
"idx_table_row": {
"name": "idx_table_row",
"columns": [
"table_name",
"row_pks"
],
"isUnique": false
}
},
"foreignKeys": {},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_crdt_snapshots": {
"name": "haex_crdt_snapshots",
"columns": {
"snapshot_id": {
"name": "snapshot_id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"created": {
"name": "created",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"epoch_hlc": {
"name": "epoch_hlc",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"location_url": {
"name": "location_url",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"file_size_bytes": {
"name": "file_size_bytes",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {},
"foreignKeys": {},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_extension_permissions": {
"name": "haex_extension_permissions",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"extension_id": {
"name": "extension_id",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"resource_type": {
"name": "resource_type",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"action": {
"name": "action",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"target": {
"name": "target",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"constraints": {
"name": "constraints",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"status": {
"name": "status",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false,
"default": "'denied'"
},
"created_at": {
"name": "created_at",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false,
"default": "(CURRENT_TIMESTAMP)"
},
"updated_at": {
"name": "updated_at",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"haex_tombstone": {
"name": "haex_tombstone",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"haex_timestamp": {
"name": "haex_timestamp",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {
"haex_extension_permissions_extension_id_resource_type_action_target_unique": {
"name": "haex_extension_permissions_extension_id_resource_type_action_target_unique",
"columns": [
"extension_id",
"resource_type",
"action",
"target"
],
"isUnique": true
}
},
"foreignKeys": {
"haex_extension_permissions_extension_id_haex_extensions_id_fk": {
"name": "haex_extension_permissions_extension_id_haex_extensions_id_fk",
"tableFrom": "haex_extension_permissions",
"tableTo": "haex_extensions",
"columnsFrom": [
"extension_id"
],
"columnsTo": [
"id"
],
"onDelete": "no action",
"onUpdate": "no action"
}
},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_extensions": {
"name": "haex_extensions",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"author": {
"name": "author",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"description": {
"name": "description",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"entry": {
"name": "entry",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"homepage": {
"name": "homepage",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"enabled": {
"name": "enabled",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"icon": {
"name": "icon",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"name": {
"name": "name",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"public_key": {
"name": "public_key",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"signature": {
"name": "signature",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"url": {
"name": "url",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"version": {
"name": "version",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"haex_tombstone": {
"name": "haex_tombstone",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"haex_timestamp": {
"name": "haex_timestamp",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {},
"foreignKeys": {},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_notifications": {
"name": "haex_notifications",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"alt": {
"name": "alt",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"date": {
"name": "date",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"icon": {
"name": "icon",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"image": {
"name": "image",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"read": {
"name": "read",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"source": {
"name": "source",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"text": {
"name": "text",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"title": {
"name": "title",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"type": {
"name": "type",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"haex_tombstone": {
"name": "haex_tombstone",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"haex_timestamp": {
"name": "haex_timestamp",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {},
"foreignKeys": {},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_settings": {
"name": "haex_settings",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"key": {
"name": "key",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"type": {
"name": "type",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"value": {
"name": "value",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"haex_tombstone": {
"name": "haex_tombstone",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"haex_timestamp": {
"name": "haex_timestamp",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {},
"foreignKeys": {},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_passwords_group_items": {
"name": "haex_passwords_group_items",
"columns": {
"group_id": {
"name": "group_id",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"item_id": {
"name": "item_id",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"haex_tombstone": {
"name": "haex_tombstone",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {},
"foreignKeys": {
"haex_passwords_group_items_group_id_haex_passwords_groups_id_fk": {
"name": "haex_passwords_group_items_group_id_haex_passwords_groups_id_fk",
"tableFrom": "haex_passwords_group_items",
"tableTo": "haex_passwords_groups",
"columnsFrom": [
"group_id"
],
"columnsTo": [
"id"
],
"onDelete": "no action",
"onUpdate": "no action"
},
"haex_passwords_group_items_item_id_haex_passwords_item_details_id_fk": {
"name": "haex_passwords_group_items_item_id_haex_passwords_item_details_id_fk",
"tableFrom": "haex_passwords_group_items",
"tableTo": "haex_passwords_item_details",
"columnsFrom": [
"item_id"
],
"columnsTo": [
"id"
],
"onDelete": "no action",
"onUpdate": "no action"
}
},
"compositePrimaryKeys": {
"haex_passwords_group_items_item_id_group_id_pk": {
"columns": [
"item_id",
"group_id"
],
"name": "haex_passwords_group_items_item_id_group_id_pk"
}
},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_passwords_groups": {
"name": "haex_passwords_groups",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"name": {
"name": "name",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"description": {
"name": "description",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"icon": {
"name": "icon",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"order": {
"name": "order",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"color": {
"name": "color",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"parent_id": {
"name": "parent_id",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"created_at": {
"name": "created_at",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false,
"default": "(CURRENT_TIMESTAMP)"
},
"updated_at": {
"name": "updated_at",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"haex_tombstone": {
"name": "haex_tombstone",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {},
"foreignKeys": {
"haex_passwords_groups_parent_id_haex_passwords_groups_id_fk": {
"name": "haex_passwords_groups_parent_id_haex_passwords_groups_id_fk",
"tableFrom": "haex_passwords_groups",
"tableTo": "haex_passwords_groups",
"columnsFrom": [
"parent_id"
],
"columnsTo": [
"id"
],
"onDelete": "no action",
"onUpdate": "no action"
}
},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_passwords_item_details": {
"name": "haex_passwords_item_details",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"title": {
"name": "title",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"username": {
"name": "username",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"password": {
"name": "password",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"note": {
"name": "note",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"icon": {
"name": "icon",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"tags": {
"name": "tags",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"url": {
"name": "url",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"created_at": {
"name": "created_at",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false,
"default": "(CURRENT_TIMESTAMP)"
},
"updated_at": {
"name": "updated_at",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"haex_tombstone": {
"name": "haex_tombstone",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {},
"foreignKeys": {},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_passwords_item_history": {
"name": "haex_passwords_item_history",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"item_id": {
"name": "item_id",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"changed_property": {
"name": "changed_property",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"old_value": {
"name": "old_value",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"new_value": {
"name": "new_value",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"created_at": {
"name": "created_at",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false,
"default": "(CURRENT_TIMESTAMP)"
},
"haex_tombstone": {
"name": "haex_tombstone",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {},
"foreignKeys": {
"haex_passwords_item_history_item_id_haex_passwords_item_details_id_fk": {
"name": "haex_passwords_item_history_item_id_haex_passwords_item_details_id_fk",
"tableFrom": "haex_passwords_item_history",
"tableTo": "haex_passwords_item_details",
"columnsFrom": [
"item_id"
],
"columnsTo": [
"id"
],
"onDelete": "no action",
"onUpdate": "no action"
}
},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_passwords_item_key_values": {
"name": "haex_passwords_item_key_values",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"item_id": {
"name": "item_id",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"key": {
"name": "key",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"value": {
"name": "value",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"updated_at": {
"name": "updated_at",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"haex_tombstone": {
"name": "haex_tombstone",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {},
"foreignKeys": {
"haex_passwords_item_key_values_item_id_haex_passwords_item_details_id_fk": {
"name": "haex_passwords_item_key_values_item_id_haex_passwords_item_details_id_fk",
"tableFrom": "haex_passwords_item_key_values",
"tableTo": "haex_passwords_item_details",
"columnsFrom": [
"item_id"
],
"columnsTo": [
"id"
],
"onDelete": "no action",
"onUpdate": "no action"
}
},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
}
},
"views": {},
"enums": {},
"_meta": {
"schemas": {},
"tables": {},
"columns": {}
},
"internal": {
"indexes": {}
}
}


@@ -1,930 +0,0 @@
{
"version": "6",
"dialect": "sqlite",
"id": "5387568f-75b3-4a85-86c5-67f539c3fedf",
"prevId": "862ac1d5-3065-4244-8652-2b6782254862",
"tables": {
"haex_crdt_configs": {
"name": "haex_crdt_configs",
"columns": {
"key": {
"name": "key",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"value": {
"name": "value",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {},
"foreignKeys": {},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_crdt_logs": {
"name": "haex_crdt_logs",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"haex_timestamp": {
"name": "haex_timestamp",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"table_name": {
"name": "table_name",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"row_pks": {
"name": "row_pks",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"op_type": {
"name": "op_type",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"column_name": {
"name": "column_name",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"new_value": {
"name": "new_value",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"old_value": {
"name": "old_value",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {
"idx_haex_timestamp": {
"name": "idx_haex_timestamp",
"columns": [
"haex_timestamp"
],
"isUnique": false
},
"idx_table_row": {
"name": "idx_table_row",
"columns": [
"table_name",
"row_pks"
],
"isUnique": false
}
},
"foreignKeys": {},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_crdt_snapshots": {
"name": "haex_crdt_snapshots",
"columns": {
"snapshot_id": {
"name": "snapshot_id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"created": {
"name": "created",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"epoch_hlc": {
"name": "epoch_hlc",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"location_url": {
"name": "location_url",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"file_size_bytes": {
"name": "file_size_bytes",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {},
"foreignKeys": {},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_extension_permissions": {
"name": "haex_extension_permissions",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"extension_id": {
"name": "extension_id",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"resource_type": {
"name": "resource_type",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"action": {
"name": "action",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"target": {
"name": "target",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"constraints": {
"name": "constraints",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"status": {
"name": "status",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false,
"default": "'denied'"
},
"created_at": {
"name": "created_at",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false,
"default": "(CURRENT_TIMESTAMP)"
},
"updated_at": {
"name": "updated_at",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"haex_tombstone": {
"name": "haex_tombstone",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"haex_timestamp": {
"name": "haex_timestamp",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {
"haex_extension_permissions_extension_id_resource_type_action_target_unique": {
"name": "haex_extension_permissions_extension_id_resource_type_action_target_unique",
"columns": [
"extension_id",
"resource_type",
"action",
"target"
],
"isUnique": true
}
},
"foreignKeys": {
"haex_extension_permissions_extension_id_haex_extensions_id_fk": {
"name": "haex_extension_permissions_extension_id_haex_extensions_id_fk",
"tableFrom": "haex_extension_permissions",
"tableTo": "haex_extensions",
"columnsFrom": [
"extension_id"
],
"columnsTo": [
"id"
],
"onDelete": "no action",
"onUpdate": "no action"
}
},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_extensions": {
"name": "haex_extensions",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"public_key": {
"name": "public_key",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"name": {
"name": "name",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"version": {
"name": "version",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"author": {
"name": "author",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"description": {
"name": "description",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"entry": {
"name": "entry",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false,
"default": "'index.html'"
},
"homepage": {
"name": "homepage",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"enabled": {
"name": "enabled",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false,
"default": true
},
"icon": {
"name": "icon",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"signature": {
"name": "signature",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"haex_tombstone": {
"name": "haex_tombstone",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"haex_timestamp": {
"name": "haex_timestamp",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {
"haex_extensions_public_key_name_unique": {
"name": "haex_extensions_public_key_name_unique",
"columns": [
"public_key",
"name"
],
"isUnique": true
}
},
"foreignKeys": {},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_notifications": {
"name": "haex_notifications",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"alt": {
"name": "alt",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"date": {
"name": "date",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"icon": {
"name": "icon",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"image": {
"name": "image",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"read": {
"name": "read",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"source": {
"name": "source",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"text": {
"name": "text",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"title": {
"name": "title",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"type": {
"name": "type",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"haex_tombstone": {
"name": "haex_tombstone",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"haex_timestamp": {
"name": "haex_timestamp",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {},
"foreignKeys": {},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_settings": {
"name": "haex_settings",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"key": {
"name": "key",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"type": {
"name": "type",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"value": {
"name": "value",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"haex_tombstone": {
"name": "haex_tombstone",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"haex_timestamp": {
"name": "haex_timestamp",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {},
"foreignKeys": {},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_passwords_group_items": {
"name": "haex_passwords_group_items",
"columns": {
"group_id": {
"name": "group_id",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"item_id": {
"name": "item_id",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"haex_tombstone": {
"name": "haex_tombstone",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {},
"foreignKeys": {
"haex_passwords_group_items_group_id_haex_passwords_groups_id_fk": {
"name": "haex_passwords_group_items_group_id_haex_passwords_groups_id_fk",
"tableFrom": "haex_passwords_group_items",
"tableTo": "haex_passwords_groups",
"columnsFrom": [
"group_id"
],
"columnsTo": [
"id"
],
"onDelete": "no action",
"onUpdate": "no action"
},
"haex_passwords_group_items_item_id_haex_passwords_item_details_id_fk": {
"name": "haex_passwords_group_items_item_id_haex_passwords_item_details_id_fk",
"tableFrom": "haex_passwords_group_items",
"tableTo": "haex_passwords_item_details",
"columnsFrom": [
"item_id"
],
"columnsTo": [
"id"
],
"onDelete": "no action",
"onUpdate": "no action"
}
},
"compositePrimaryKeys": {
"haex_passwords_group_items_item_id_group_id_pk": {
"columns": [
"item_id",
"group_id"
],
"name": "haex_passwords_group_items_item_id_group_id_pk"
}
},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_passwords_groups": {
"name": "haex_passwords_groups",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"name": {
"name": "name",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"description": {
"name": "description",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"icon": {
"name": "icon",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"order": {
"name": "order",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"color": {
"name": "color",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"parent_id": {
"name": "parent_id",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"created_at": {
"name": "created_at",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false,
"default": "(CURRENT_TIMESTAMP)"
},
"updated_at": {
"name": "updated_at",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"haex_tombstone": {
"name": "haex_tombstone",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {},
"foreignKeys": {
"haex_passwords_groups_parent_id_haex_passwords_groups_id_fk": {
"name": "haex_passwords_groups_parent_id_haex_passwords_groups_id_fk",
"tableFrom": "haex_passwords_groups",
"tableTo": "haex_passwords_groups",
"columnsFrom": [
"parent_id"
],
"columnsTo": [
"id"
],
"onDelete": "no action",
"onUpdate": "no action"
}
},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_passwords_item_details": {
"name": "haex_passwords_item_details",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"title": {
"name": "title",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"username": {
"name": "username",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"password": {
"name": "password",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"note": {
"name": "note",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"icon": {
"name": "icon",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"tags": {
"name": "tags",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"url": {
"name": "url",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"created_at": {
"name": "created_at",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false,
"default": "(CURRENT_TIMESTAMP)"
},
"updated_at": {
"name": "updated_at",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"haex_tombstone": {
"name": "haex_tombstone",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {},
"foreignKeys": {},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_passwords_item_history": {
"name": "haex_passwords_item_history",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"item_id": {
"name": "item_id",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"changed_property": {
"name": "changed_property",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"old_value": {
"name": "old_value",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"new_value": {
"name": "new_value",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"created_at": {
"name": "created_at",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false,
"default": "(CURRENT_TIMESTAMP)"
},
"haex_tombstone": {
"name": "haex_tombstone",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {},
"foreignKeys": {
"haex_passwords_item_history_item_id_haex_passwords_item_details_id_fk": {
"name": "haex_passwords_item_history_item_id_haex_passwords_item_details_id_fk",
"tableFrom": "haex_passwords_item_history",
"tableTo": "haex_passwords_item_details",
"columnsFrom": [
"item_id"
],
"columnsTo": [
"id"
],
"onDelete": "no action",
"onUpdate": "no action"
}
},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_passwords_item_key_values": {
"name": "haex_passwords_item_key_values",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"item_id": {
"name": "item_id",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"key": {
"name": "key",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"value": {
"name": "value",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"updated_at": {
"name": "updated_at",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"haex_tombstone": {
"name": "haex_tombstone",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {},
"foreignKeys": {
"haex_passwords_item_key_values_item_id_haex_passwords_item_details_id_fk": {
"name": "haex_passwords_item_key_values_item_id_haex_passwords_item_details_id_fk",
"tableFrom": "haex_passwords_item_key_values",
"tableTo": "haex_passwords_item_details",
"columnsFrom": [
"item_id"
],
"columnsTo": [
"id"
],
"onDelete": "no action",
"onUpdate": "no action"
}
},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
}
},
"views": {},
"enums": {},
"_meta": {
"schemas": {},
"tables": {},
"columns": {}
},
"internal": {
"indexes": {}
}
}


@@ -1,991 +0,0 @@
{
"version": "6",
"dialect": "sqlite",
"id": "2f40a42e-9b3f-42be-8951-8e94baadcd65",
"prevId": "5387568f-75b3-4a85-86c5-67f539c3fedf",
"tables": {
"haex_crdt_configs": {
"name": "haex_crdt_configs",
"columns": {
"key": {
"name": "key",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"value": {
"name": "value",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {},
"foreignKeys": {},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_crdt_logs": {
"name": "haex_crdt_logs",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"haex_timestamp": {
"name": "haex_timestamp",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"table_name": {
"name": "table_name",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"row_pks": {
"name": "row_pks",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"op_type": {
"name": "op_type",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"column_name": {
"name": "column_name",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"new_value": {
"name": "new_value",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"old_value": {
"name": "old_value",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {
"idx_haex_timestamp": {
"name": "idx_haex_timestamp",
"columns": [
"haex_timestamp"
],
"isUnique": false
},
"idx_table_row": {
"name": "idx_table_row",
"columns": [
"table_name",
"row_pks"
],
"isUnique": false
}
},
"foreignKeys": {},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_crdt_snapshots": {
"name": "haex_crdt_snapshots",
"columns": {
"snapshot_id": {
"name": "snapshot_id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"created": {
"name": "created",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"epoch_hlc": {
"name": "epoch_hlc",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"location_url": {
"name": "location_url",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"file_size_bytes": {
"name": "file_size_bytes",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {},
"foreignKeys": {},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_desktop_items": {
"name": "haex_desktop_items",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"item_type": {
"name": "item_type",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"reference_id": {
"name": "reference_id",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"position_x": {
"name": "position_x",
"type": "integer",
"primaryKey": false,
"notNull": true,
"autoincrement": false,
"default": 0
},
"position_y": {
"name": "position_y",
"type": "integer",
"primaryKey": false,
"notNull": true,
"autoincrement": false,
"default": 0
},
"haex_tombstone": {
"name": "haex_tombstone",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"haex_timestamp": {
"name": "haex_timestamp",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {},
"foreignKeys": {},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_extension_permissions": {
"name": "haex_extension_permissions",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"extension_id": {
"name": "extension_id",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"resource_type": {
"name": "resource_type",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"action": {
"name": "action",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"target": {
"name": "target",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"constraints": {
"name": "constraints",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"status": {
"name": "status",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false,
"default": "'denied'"
},
"created_at": {
"name": "created_at",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false,
"default": "(CURRENT_TIMESTAMP)"
},
"updated_at": {
"name": "updated_at",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"haex_tombstone": {
"name": "haex_tombstone",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"haex_timestamp": {
"name": "haex_timestamp",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {
"haex_extension_permissions_extension_id_resource_type_action_target_unique": {
"name": "haex_extension_permissions_extension_id_resource_type_action_target_unique",
"columns": [
"extension_id",
"resource_type",
"action",
"target"
],
"isUnique": true
}
},
"foreignKeys": {
"haex_extension_permissions_extension_id_haex_extensions_id_fk": {
"name": "haex_extension_permissions_extension_id_haex_extensions_id_fk",
"tableFrom": "haex_extension_permissions",
"tableTo": "haex_extensions",
"columnsFrom": [
"extension_id"
],
"columnsTo": [
"id"
],
"onDelete": "no action",
"onUpdate": "no action"
}
},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_extensions": {
"name": "haex_extensions",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"public_key": {
"name": "public_key",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"name": {
"name": "name",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"version": {
"name": "version",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"author": {
"name": "author",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"description": {
"name": "description",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"entry": {
"name": "entry",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false,
"default": "'index.html'"
},
"homepage": {
"name": "homepage",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"enabled": {
"name": "enabled",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false,
"default": true
},
"icon": {
"name": "icon",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"signature": {
"name": "signature",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"haex_tombstone": {
"name": "haex_tombstone",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"haex_timestamp": {
"name": "haex_timestamp",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {
"haex_extensions_public_key_name_unique": {
"name": "haex_extensions_public_key_name_unique",
"columns": [
"public_key",
"name"
],
"isUnique": true
}
},
"foreignKeys": {},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_notifications": {
"name": "haex_notifications",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"alt": {
"name": "alt",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"date": {
"name": "date",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"icon": {
"name": "icon",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"image": {
"name": "image",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"read": {
"name": "read",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"source": {
"name": "source",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"text": {
"name": "text",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"title": {
"name": "title",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"type": {
"name": "type",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"haex_tombstone": {
"name": "haex_tombstone",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"haex_timestamp": {
"name": "haex_timestamp",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {},
"foreignKeys": {},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_settings": {
"name": "haex_settings",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"key": {
"name": "key",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"type": {
"name": "type",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"value": {
"name": "value",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"haex_tombstone": {
"name": "haex_tombstone",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"haex_timestamp": {
"name": "haex_timestamp",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {},
"foreignKeys": {},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_passwords_group_items": {
"name": "haex_passwords_group_items",
"columns": {
"group_id": {
"name": "group_id",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"item_id": {
"name": "item_id",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"haex_tombstone": {
"name": "haex_tombstone",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {},
"foreignKeys": {
"haex_passwords_group_items_group_id_haex_passwords_groups_id_fk": {
"name": "haex_passwords_group_items_group_id_haex_passwords_groups_id_fk",
"tableFrom": "haex_passwords_group_items",
"tableTo": "haex_passwords_groups",
"columnsFrom": [
"group_id"
],
"columnsTo": [
"id"
],
"onDelete": "no action",
"onUpdate": "no action"
},
"haex_passwords_group_items_item_id_haex_passwords_item_details_id_fk": {
"name": "haex_passwords_group_items_item_id_haex_passwords_item_details_id_fk",
"tableFrom": "haex_passwords_group_items",
"tableTo": "haex_passwords_item_details",
"columnsFrom": [
"item_id"
],
"columnsTo": [
"id"
],
"onDelete": "no action",
"onUpdate": "no action"
}
},
"compositePrimaryKeys": {
"haex_passwords_group_items_item_id_group_id_pk": {
"columns": [
"item_id",
"group_id"
],
"name": "haex_passwords_group_items_item_id_group_id_pk"
}
},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_passwords_groups": {
"name": "haex_passwords_groups",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"name": {
"name": "name",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"description": {
"name": "description",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"icon": {
"name": "icon",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"order": {
"name": "order",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"color": {
"name": "color",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"parent_id": {
"name": "parent_id",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"created_at": {
"name": "created_at",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false,
"default": "(CURRENT_TIMESTAMP)"
},
"updated_at": {
"name": "updated_at",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"haex_tombstone": {
"name": "haex_tombstone",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {},
"foreignKeys": {
"haex_passwords_groups_parent_id_haex_passwords_groups_id_fk": {
"name": "haex_passwords_groups_parent_id_haex_passwords_groups_id_fk",
"tableFrom": "haex_passwords_groups",
"tableTo": "haex_passwords_groups",
"columnsFrom": [
"parent_id"
],
"columnsTo": [
"id"
],
"onDelete": "no action",
"onUpdate": "no action"
}
},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_passwords_item_details": {
"name": "haex_passwords_item_details",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"title": {
"name": "title",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"username": {
"name": "username",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"password": {
"name": "password",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"note": {
"name": "note",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"icon": {
"name": "icon",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"tags": {
"name": "tags",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"url": {
"name": "url",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"created_at": {
"name": "created_at",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false,
"default": "(CURRENT_TIMESTAMP)"
},
"updated_at": {
"name": "updated_at",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"haex_tombstone": {
"name": "haex_tombstone",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {},
"foreignKeys": {},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_passwords_item_history": {
"name": "haex_passwords_item_history",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"item_id": {
"name": "item_id",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"changed_property": {
"name": "changed_property",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"old_value": {
"name": "old_value",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"new_value": {
"name": "new_value",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"created_at": {
"name": "created_at",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false,
"default": "(CURRENT_TIMESTAMP)"
},
"haex_tombstone": {
"name": "haex_tombstone",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {},
"foreignKeys": {
"haex_passwords_item_history_item_id_haex_passwords_item_details_id_fk": {
"name": "haex_passwords_item_history_item_id_haex_passwords_item_details_id_fk",
"tableFrom": "haex_passwords_item_history",
"tableTo": "haex_passwords_item_details",
"columnsFrom": [
"item_id"
],
"columnsTo": [
"id"
],
"onDelete": "no action",
"onUpdate": "no action"
}
},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_passwords_item_key_values": {
"name": "haex_passwords_item_key_values",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"item_id": {
"name": "item_id",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"key": {
"name": "key",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"value": {
"name": "value",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"updated_at": {
"name": "updated_at",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"haex_tombstone": {
"name": "haex_tombstone",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {},
"foreignKeys": {
"haex_passwords_item_key_values_item_id_haex_passwords_item_details_id_fk": {
"name": "haex_passwords_item_key_values_item_id_haex_passwords_item_details_id_fk",
"tableFrom": "haex_passwords_item_key_values",
"tableTo": "haex_passwords_item_details",
"columnsFrom": [
"item_id"
],
"columnsTo": [
"id"
],
"onDelete": "no action",
"onUpdate": "no action"
}
},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
}
},
"views": {},
"enums": {},
"_meta": {
"schemas": {},
"tables": {},
"columns": {}
},
"internal": {
"indexes": {}
}
}

File diff suppressed because it is too large

File diff suppressed because it is too large

View File

@@ -5,43 +5,8 @@
     {
       "idx": 0,
       "version": "6",
-      "when": 1759402321133,
-      "tag": "0000_glamorous_hulk",
-      "breakpoints": true
-    },
-    {
-      "idx": 1,
-      "version": "6",
-      "when": 1759418087677,
-      "tag": "0001_green_stark_industries",
-      "breakpoints": true
-    },
-    {
-      "idx": 2,
-      "version": "6",
-      "when": 1760272083150,
-      "tag": "0002_amazing_iron_fist",
-      "breakpoints": true
-    },
-    {
-      "idx": 3,
-      "version": "6",
-      "when": 1760611690801,
-      "tag": "0003_daily_polaris",
-      "breakpoints": true
-    },
-    {
-      "idx": 4,
-      "version": "6",
-      "when": 1760817142340,
-      "tag": "0004_mature_viper",
-      "breakpoints": true
-    },
-    {
-      "idx": 5,
-      "version": "6",
-      "when": 1760964548034,
-      "tag": "0005_tidy_yellowjacket",
+      "when": 1761821821609,
+      "tag": "0000_dashing_night_nurse",
       "breakpoints": true
     }
   ]
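
For orientation, the journal above is drizzle-kit's migration ledger: the six incremental migrations are squashed into a single `0000_dashing_night_nurse` entry, so the history restarts at one row. A minimal sketch of the entry shape, using only the fields visible in this file (the interface and sample constant are illustrative, not part of the repo):

```ts
// Shape of one entry in drizzle-kit's _journal.json, as seen in the diff above.
// The interface name is illustrative; it is not exported anywhere in this repo.
interface JournalEntry {
  idx: number // position in the migration sequence, 0-based
  version: string // journal format version ("6" in this repo)
  when: number // millisecond Unix timestamp of when the migration was generated
  tag: string // migration tag, e.g. "0000_dashing_night_nurse"
  breakpoints: boolean // whether statement breakpoints are emitted in the SQL file
}

// After the squash shown above, the journal holds exactly one entry.
const entries: JournalEntry[] = [
  {
    idx: 0,
    version: '6',
    when: 1761821821609,
    tag: '0000_dashing_night_nurse',
    breakpoints: true,
  },
]
```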

View File

@@ -1,5 +1,6 @@
 import { sql } from 'drizzle-orm'
 import {
+  check,
   integer,
   sqliteTable,
   text,
@@ -8,37 +9,36 @@ import {
   type SQLiteColumnBuilderBase,
 } from 'drizzle-orm/sqlite-core'
 import tableNames from '../tableNames.json'
+import { crdtColumnNames } from '.'

-// Helper function to add common CRDT columns (haexTombstone and haexTimestamp)
+// Helper function to add common CRDT columns (haexTimestamp)
 export const withCrdtColumns = <
   T extends Record<string, SQLiteColumnBuilderBase>,
 >(
   columns: T,
-  columnNames: { haexTombstone: string; haexTimestamp: string },
 ) => ({
   ...columns,
-  haexTombstone: integer(columnNames.haexTombstone, { mode: 'boolean' }),
-  haexTimestamp: text(columnNames.haexTimestamp),
+  haexTimestamp: text(crdtColumnNames.haexTimestamp),
 })

-export const haexSettings = sqliteTable(tableNames.haex.settings.name, {
+export const haexSettings = sqliteTable(
+  tableNames.haex.settings.name,
+  withCrdtColumns({
     id: text()
       .primaryKey()
       .$defaultFn(() => crypto.randomUUID()),
     key: text(),
     type: text(),
     value: text(),
-    haexTombstone: integer(tableNames.haex.settings.columns.haexTombstone, {
-      mode: 'boolean',
   }),
-  haexTimestamp: text(tableNames.haex.settings.columns.haexTimestamp),
-})
+  (table) => [unique().on(table.key, table.type, table.value)],
+)

 export type InsertHaexSettings = typeof haexSettings.$inferInsert
 export type SelectHaexSettings = typeof haexSettings.$inferSelect

 export const haexExtensions = sqliteTable(
   tableNames.haex.extensions.name,
-  {
+  withCrdtColumns({
     id: text()
       .primaryKey()
       .$defaultFn(() => crypto.randomUUID()),
@@ -47,16 +47,13 @@ export const haexExtensions = sqliteTable(
     version: text().notNull(),
     author: text(),
     description: text(),
-    entry: text().notNull().default('index.html'),
+    entry: text().default('index.html'),
     homepage: text(),
     enabled: integer({ mode: 'boolean' }).default(true),
     icon: text(),
     signature: text().notNull(),
-    haexTombstone: integer(tableNames.haex.extensions.columns.haexTombstone, {
-      mode: 'boolean',
+    single_instance: integer({ mode: 'boolean' }).default(false),
   }),
-    haexTimestamp: text(tableNames.haex.extensions.columns.haexTimestamp),
-  },
   (table) => [
     // UNIQUE constraint: per developer (public_key) only one extension with this name may exist
     unique().on(table.public_key, table.name),
@@ -67,13 +64,15 @@ export type SelectHaexExtensions = typeof haexExtensions.$inferSelect
 export const haexExtensionPermissions = sqliteTable(
   tableNames.haex.extension_permissions.name,
-  {
+  withCrdtColumns({
     id: text()
       .primaryKey()
       .$defaultFn(() => crypto.randomUUID()),
-    extensionId: text(
-      tableNames.haex.extension_permissions.columns.extensionId,
-    ).references((): AnySQLiteColumn => haexExtensions.id),
+    extensionId: text(tableNames.haex.extension_permissions.columns.extensionId)
+      .notNull()
+      .references((): AnySQLiteColumn => haexExtensions.id, {
+        onDelete: 'cascade',
+      }),
     resourceType: text('resource_type', {
       enum: ['fs', 'http', 'db', 'shell'],
     }),
@@ -87,14 +86,7 @@ export const haexExtensionPermissions = sqliteTable(
     updateAt: integer('updated_at', { mode: 'timestamp' }).$onUpdate(
       () => new Date(),
     ),
-    haexTombstone: integer(
-      tableNames.haex.extension_permissions.columns.haexTombstone,
-      { mode: 'boolean' },
-    ),
-    haexTimestamp: text(
-      tableNames.haex.extension_permissions.columns.haexTimestamp,
-    ),
-  },
+  }),
   (table) => [
     unique().on(
       table.extensionId,
@@ -111,7 +103,7 @@ export type SelecthaexExtensionPermissions =
 export const haexNotifications = sqliteTable(
   tableNames.haex.notifications.name,
-  {
+  withCrdtColumns({
     id: text().primaryKey(),
     alt: text(),
     date: text(),
@@ -124,65 +116,61 @@ export const haexNotifications = sqliteTable(
     type: text({
       enum: ['error', 'success', 'warning', 'info', 'log'],
     }).notNull(),
-    haexTombstone: integer(
-      tableNames.haex.notifications.columns.haexTombstone,
-      { mode: 'boolean' },
-    ),
-    haexTimestamp: text(tableNames.haex.notifications.columns.haexTimestamp),
-  },
+  }),
 )

 export type InsertHaexNotifications = typeof haexNotifications.$inferInsert
 export type SelectHaexNotifications = typeof haexNotifications.$inferSelect

 export const haexWorkspaces = sqliteTable(
   tableNames.haex.workspaces.name,
-  withCrdtColumns(
-    {
+  withCrdtColumns({
     id: text(tableNames.haex.workspaces.columns.id)
       .primaryKey()
       .$defaultFn(() => crypto.randomUUID()),
+    deviceId: text(tableNames.haex.workspaces.columns.deviceId).notNull(),
     name: text(tableNames.haex.workspaces.columns.name).notNull(),
     position: integer(tableNames.haex.workspaces.columns.position)
       .notNull()
       .default(0),
-    createdAt: integer(tableNames.haex.workspaces.columns.createdAt, {
-      mode: 'timestamp',
-    })
-      .notNull()
-      .$defaultFn(() => new Date()),
-    },
-    tableNames.haex.workspaces.columns,
-  ),
-  (table) => [unique().on(table.name)],
+  }),
+  (table) => [unique().on(table.position)],
 )

 export type InsertHaexWorkspaces = typeof haexWorkspaces.$inferInsert
 export type SelectHaexWorkspaces = typeof haexWorkspaces.$inferSelect

 export const haexDesktopItems = sqliteTable(
   tableNames.haex.desktop_items.name,
-  withCrdtColumns(
-    {
+  withCrdtColumns({
     id: text(tableNames.haex.desktop_items.columns.id)
       .primaryKey()
       .$defaultFn(() => crypto.randomUUID()),
     workspaceId: text(tableNames.haex.desktop_items.columns.workspaceId)
       .notNull()
-      .references(() => haexWorkspaces.id),
+      .references(() => haexWorkspaces.id, { onDelete: 'cascade' }),
     itemType: text(tableNames.haex.desktop_items.columns.itemType, {
-      enum: ['extension', 'file', 'folder'],
+      enum: ['system', 'extension', 'file', 'folder'],
     }).notNull(),
-    referenceId: text(
-      tableNames.haex.desktop_items.columns.referenceId,
-    ).notNull(), // extensionId for extensions, filePath for files/folders
+    // For extensions (when itemType = 'extension')
+    extensionId: text(
+      tableNames.haex.desktop_items.columns.extensionId,
+    ).references((): AnySQLiteColumn => haexExtensions.id, {
+      onDelete: 'cascade',
+    }),
+    // For system windows (when itemType = 'system')
+    systemWindowId: text(tableNames.haex.desktop_items.columns.systemWindowId),
     positionX: integer(tableNames.haex.desktop_items.columns.positionX)
       .notNull()
       .default(0),
     positionY: integer(tableNames.haex.desktop_items.columns.positionY)
       .notNull()
       .default(0),
-  },
-  tableNames.haex.desktop_items.columns,
+  }),
+  (table) => [
+    check(
+      'item_reference',
+      sql`(${table.itemType} = 'extension' AND ${table.extensionId} IS NOT NULL AND ${table.systemWindowId} IS NULL) OR (${table.itemType} = 'system' AND ${table.systemWindowId} IS NOT NULL AND ${table.extensionId} IS NULL) OR (${table.itemType} = 'file' AND ${table.systemWindowId} IS NOT NULL AND ${table.extensionId} IS NULL) OR (${table.itemType} = 'folder' AND ${table.systemWindowId} IS NOT NULL AND ${table.extensionId} IS NULL)`,
     ),
+  ],
 )

 export type InsertHaexDesktopItems = typeof haexDesktopItems.$inferInsert
 export type SelectHaexDesktopItems = typeof haexDesktopItems.$inferSelect
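
The refactor above centralizes the CRDT bookkeeping: withCrdtColumns no longer takes a per-table columnNames argument and no longer adds haexTombstone; it only appends the shared haexTimestamp column via crdtColumnNames, and every table is wrapped in it. A minimal sketch of how a table would be declared against the new helper; the haexNotes table, its columns, and the import path are hypothetical and only illustrate the call shape:

```ts
import { sqliteTable, text, unique } from 'drizzle-orm/sqlite-core'
// Assumed import path: the helper is defined in the schema file diffed above.
import { withCrdtColumns } from './schema'

// Hypothetical table, not part of this commit; it only demonstrates the new call shape.
export const haexNotes = sqliteTable(
  'haex_notes',
  withCrdtColumns({
    id: text()
      .primaryKey()
      .$defaultFn(() => crypto.randomUUID()),
    title: text(),
    body: text(),
  }),
  // Table-level constraints are passed the same way haexSettings does it above.
  (table) => [unique().on(table.title)],
)

// Every table wrapped this way gets the shared haex_timestamp column appended by
// withCrdtColumns, so the sync layer can rely on a single column name everywhere.
export type InsertHaexNotes = typeof haexNotes.$inferInsert
```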

View File

@@ -1,2 +1,5 @@
+export const crdtColumnNames = {
+  haexTimestamp: 'haex_timestamp',
+}
 export * from './crdt'
 export * from './haex'

View File

@ -1,112 +0,0 @@
import { sql } from 'drizzle-orm'
import {
integer,
primaryKey,
sqliteTable,
text,
type AnySQLiteColumn,
} from 'drizzle-orm/sqlite-core'
import tableNames from '../tableNames.json'
export const haexPasswordsItemDetails = sqliteTable(
tableNames.haex.passwords.item_details,
{
id: text().primaryKey(),
title: text(),
username: text(),
password: text(),
note: text(),
icon: text(),
tags: text(),
url: text(),
createdAt: text('created_at').default(sql`(CURRENT_TIMESTAMP)`),
updateAt: integer('updated_at', { mode: 'timestamp' }).$onUpdate(
() => new Date(),
),
haex_tombstone: integer({ mode: 'boolean' }),
},
)
export type InsertHaexPasswordsItemDetails =
typeof haexPasswordsItemDetails.$inferInsert
export type SelectHaexPasswordsItemDetails =
typeof haexPasswordsItemDetails.$inferSelect
export const haexPasswordsItemKeyValues = sqliteTable(
tableNames.haex.passwords.item_key_values,
{
id: text().primaryKey(),
itemId: text('item_id').references(
(): AnySQLiteColumn => haexPasswordsItemDetails.id,
),
key: text(),
value: text(),
updateAt: integer('updated_at', { mode: 'timestamp' }).$onUpdate(
() => new Date(),
),
haex_tombstone: integer({ mode: 'boolean' }),
},
)
export type InserthaexPasswordsItemKeyValues =
typeof haexPasswordsItemKeyValues.$inferInsert
export type SelectHaexPasswordsItemKeyValues =
typeof haexPasswordsItemKeyValues.$inferSelect
export const haexPasswordsItemHistory = sqliteTable(
tableNames.haex.passwords.item_histories,
{
id: text().primaryKey(),
itemId: text('item_id').references(
(): AnySQLiteColumn => haexPasswordsItemDetails.id,
),
changedProperty:
text('changed_property').$type<keyof typeof haexPasswordsItemDetails>(),
oldValue: text('old_value'),
newValue: text('new_value'),
createdAt: text('created_at').default(sql`(CURRENT_TIMESTAMP)`),
haex_tombstone: integer({ mode: 'boolean' }),
},
)
export type InserthaexPasswordsItemHistory =
typeof haexPasswordsItemHistory.$inferInsert
export type SelectHaexPasswordsItemHistory =
typeof haexPasswordsItemHistory.$inferSelect
export const haexPasswordsGroups = sqliteTable(
tableNames.haex.passwords.groups,
{
id: text().primaryKey(),
name: text(),
description: text(),
icon: text(),
order: integer(),
color: text(),
parentId: text('parent_id').references(
(): AnySQLiteColumn => haexPasswordsGroups.id,
),
createdAt: text('created_at').default(sql`(CURRENT_TIMESTAMP)`),
updateAt: integer('updated_at', { mode: 'timestamp' }).$onUpdate(
() => new Date(),
),
haex_tombstone: integer({ mode: 'boolean' }),
},
)
export type InsertHaexPasswordsGroups = typeof haexPasswordsGroups.$inferInsert
export type SelectHaexPasswordsGroups = typeof haexPasswordsGroups.$inferSelect
export const haexPasswordsGroupItems = sqliteTable(
tableNames.haex.passwords.group_items,
{
groupId: text('group_id').references(
(): AnySQLiteColumn => haexPasswordsGroups.id,
),
itemId: text('item_id').references(
(): AnySQLiteColumn => haexPasswordsItemDetails.id,
),
haex_tombstone: integer({ mode: 'boolean' }),
},
(table) => [primaryKey({ columns: [table.itemId, table.groupId] })],
)
export type InsertHaexPasswordsGroupItems =
typeof haexPasswordsGroupItems.$inferInsert
export type SelectHaexPasswordsGroupItems =
typeof haexPasswordsGroupItems.$inferSelect

View File

@@ -7,7 +7,7 @@
       "key": "key",
       "type": "type",
       "value": "value",
-      "haexTombstone": "haex_tombstone",
       "haexTimestamp": "haex_timestamp"
     }
   },
@@ -26,7 +26,7 @@
       "signature": "signature",
       "url": "url",
       "version": "version",
-      "haexTombstone": "haex_tombstone",
       "haexTimestamp": "haex_timestamp"
     }
   },
@@ -42,7 +42,7 @@
       "status": "status",
       "createdAt": "created_at",
       "updateAt": "updated_at",
-      "haexTombstone": "haex_tombstone",
       "haexTimestamp": "haex_timestamp"
     }
   },
@@ -59,7 +59,7 @@
       "text": "text",
       "title": "title",
       "type": "type",
-      "haexTombstone": "haex_tombstone",
       "haexTimestamp": "haex_timestamp"
     }
   },
@@ -67,10 +67,11 @@
     "name": "haex_workspaces",
     "columns": {
       "id": "id",
+      "deviceId": "device_id",
       "name": "name",
       "position": "position",
       "createdAt": "created_at",
-      "haexTombstone": "haex_tombstone",
       "haexTimestamp": "haex_timestamp"
     }
   },
@@ -80,20 +81,15 @@
       "id": "id",
       "workspaceId": "workspace_id",
       "itemType": "item_type",
-      "referenceId": "reference_id",
+      "extensionId": "extension_id",
+      "systemWindowId": "system_window_id",
       "positionX": "position_x",
       "positionY": "position_y",
-      "haexTombstone": "haex_tombstone",
       "haexTimestamp": "haex_timestamp"
     }
   },
-  "passwords": {
-    "groups": "haex_passwords_groups",
-    "group_items": "haex_passwords_group_items",
-    "item_details": "haex_passwords_item_details",
-    "item_key_values": "haex_passwords_item_key_values",
-    "item_histories": "haex_passwords_item_history"
-  },
   "crdt": {
     "logs": {
       "name": "haex_crdt_logs",

Binary file not shown.

View File

@@ -24,6 +24,23 @@ android {
         versionCode = tauriProperties.getProperty("tauri.android.versionCode", "1").toInt()
         versionName = tauriProperties.getProperty("tauri.android.versionName", "1.0")
     }
+    signingConfigs {
+        create("release") {
+            val keystorePath = System.getenv("ANDROID_KEYSTORE_PATH")
+            val keystorePassword = System.getenv("ANDROID_KEYSTORE_PASSWORD")
+            val keyAlias = System.getenv("ANDROID_KEY_ALIAS")
+            val keyPassword = System.getenv("ANDROID_KEY_PASSWORD")
+
+            if (keystorePath != null && keystorePassword != null && keyAlias != null && keyPassword != null) {
+                storeFile = file(keystorePath)
+                storePassword = keystorePassword
+                this.keyAlias = keyAlias
+                this.keyPassword = keyPassword
+            }
+        }
+    }
     buildTypes {
         getByName("debug") {
             manifestPlaceholders["usesCleartextTraffic"] = "true"
@@ -43,6 +60,12 @@ android {
                     .plus(getDefaultProguardFile("proguard-android-optimize.txt"))
                     .toList().toTypedArray()
             )
+            // Sign with release config if available
+            val releaseSigningConfig = signingConfigs.getByName("release")
+            if (releaseSigningConfig.storeFile != null) {
+                signingConfig = releaseSigningConfig
+            }
         }
     }
     kotlinOptions {

File diff suppressed because one or more lines are too long

View File

@@ -1400,10 +1400,10 @@
       "markdownDescription": "This enables all index or metadata related commands without any pre-configured accessible paths."
     },
     {
-      "description": "An empty permission you can use to modify the global scope.",
+      "description": "An empty permission you can use to modify the global scope.\n\n## Example\n\n```json\n{\n \"identifier\": \"read-documents\",\n \"windows\": [\"main\"],\n \"permissions\": [\n \"fs:allow-read\",\n {\n \"identifier\": \"fs:scope\",\n \"allow\": [\n \"$APPDATA/documents/**/*\"\n ],\n \"deny\": [\n \"$APPDATA/documents/secret.txt\"\n ]\n }\n ]\n}\n```\n",
       "type": "string",
       "const": "fs:scope",
-      "markdownDescription": "An empty permission you can use to modify the global scope."
+      "markdownDescription": "An empty permission you can use to modify the global scope.\n\n## Example\n\n```json\n{\n \"identifier\": \"read-documents\",\n \"windows\": [\"main\"],\n \"permissions\": [\n \"fs:allow-read\",\n {\n \"identifier\": \"fs:scope\",\n \"allow\": [\n \"$APPDATA/documents/**/*\"\n ],\n \"deny\": [\n \"$APPDATA/documents/secret.txt\"\n ]\n }\n ]\n}\n```\n"
     },
     {
       "description": "This scope permits access to all files and list content of top level directories in the application folders.",
@@ -2277,10 +2277,10 @@
       "markdownDescription": "Default core plugins set.\n#### This default permission set includes:\n\n- `core:path:default`\n- `core:event:default`\n- `core:window:default`\n- `core:webview:default`\n- `core:app:default`\n- `core:image:default`\n- `core:resources:default`\n- `core:menu:default`\n- `core:tray:default`"
     },
     {
-      "description": "Default permissions for the plugin.\n#### This default permission set includes:\n\n- `allow-version`\n- `allow-name`\n- `allow-tauri-version`\n- `allow-identifier`\n- `allow-bundle-type`",
+      "description": "Default permissions for the plugin.\n#### This default permission set includes:\n\n- `allow-version`\n- `allow-name`\n- `allow-tauri-version`\n- `allow-identifier`\n- `allow-bundle-type`\n- `allow-register-listener`\n- `allow-remove-listener`",
       "type": "string",
       "const": "core:app:default",
-      "markdownDescription": "Default permissions for the plugin.\n#### This default permission set includes:\n\n- `allow-version`\n- `allow-name`\n- `allow-tauri-version`\n- `allow-identifier`\n- `allow-bundle-type`"
+      "markdownDescription": "Default permissions for the plugin.\n#### This default permission set includes:\n\n- `allow-version`\n- `allow-name`\n- `allow-tauri-version`\n- `allow-identifier`\n- `allow-bundle-type`\n- `allow-register-listener`\n- `allow-remove-listener`"
     },
     {
       "description": "Enables the app_hide command without any pre-configured scope.",
@@ -2324,12 +2324,24 @@
       "const": "core:app:allow-name",
       "markdownDescription": "Enables the name command without any pre-configured scope."
     },
+    {
+      "description": "Enables the register_listener command without any pre-configured scope.",
+      "type": "string",
+      "const": "core:app:allow-register-listener",
+      "markdownDescription": "Enables the register_listener command without any pre-configured scope."
+    },
     {
       "description": "Enables the remove_data_store command without any pre-configured scope.",
       "type": "string",
       "const": "core:app:allow-remove-data-store",
       "markdownDescription": "Enables the remove_data_store command without any pre-configured scope."
     },
+    {
+      "description": "Enables the remove_listener command without any pre-configured scope.",
+      "type": "string",
+      "const": "core:app:allow-remove-listener",
+      "markdownDescription": "Enables the remove_listener command without any pre-configured scope."
+    },
     {
       "description": "Enables the set_app_theme command without any pre-configured scope.",
       "type": "string",
@@ -2396,12 +2408,24 @@
       "const": "core:app:deny-name",
       "markdownDescription": "Denies the name command without any pre-configured scope."
     },
+    {
+      "description": "Denies the register_listener command without any pre-configured scope.",
+      "type": "string",
+      "const": "core:app:deny-register-listener",
+      "markdownDescription": "Denies the register_listener command without any pre-configured scope."
+    },
     {
       "description": "Denies the remove_data_store command without any pre-configured scope.",
       "type": "string",
       "const": "core:app:deny-remove-data-store",
       "markdownDescription": "Denies the remove_data_store command without any pre-configured scope."
     },
+    {
+      "description": "Denies the remove_listener command without any pre-configured scope.",
+      "type": "string",
+      "const": "core:app:deny-remove-listener",
+      "markdownDescription": "Denies the remove_listener command without any pre-configured scope."
+    },
     {
       "description": "Denies the set_app_theme command without any pre-configured scope.",
       "type": "string",
@@ -5541,10 +5565,10 @@
       "markdownDescription": "This enables all index or metadata related commands without any pre-configured accessible paths."
     },
     {
-      "description": "An empty permission you can use to modify the global scope.",
+      "description": "An empty permission you can use to modify the global scope.\n\n## Example\n\n```json\n{\n \"identifier\": \"read-documents\",\n \"windows\": [\"main\"],\n \"permissions\": [\n \"fs:allow-read\",\n {\n \"identifier\": \"fs:scope\",\n \"allow\": [\n \"$APPDATA/documents/**/*\"\n ],\n \"deny\": [\n \"$APPDATA/documents/secret.txt\"\n ]\n }\n ]\n}\n```\n",
       "type": "string",
       "const": "fs:scope",
-      "markdownDescription": "An empty permission you can use to modify the global scope."
+      "markdownDescription": "An empty permission you can use to modify the global scope.\n\n## Example\n\n```json\n{\n \"identifier\": \"read-documents\",\n \"windows\": [\"main\"],\n \"permissions\": [\n \"fs:allow-read\",\n {\n \"identifier\": \"fs:scope\",\n \"allow\": [\n \"$APPDATA/documents/**/*\"\n ],\n \"deny\": [\n \"$APPDATA/documents/secret.txt\"\n ]\n }\n ]\n}\n```\n"
     },
     {
       "description": "This scope permits access to all files and list content of top level directories in the application folders.",

View File

@ -1,5 +1,6 @@
// src-tarui/src/build/table_names.rs // src-tarui/src/build/table_names.rs
use serde::Deserialize; use serde::Deserialize;
use serde_json::Value;
use std::collections::HashMap; use std::collections::HashMap;
use std::env; use std::env;
use std::fs::File; use std::fs::File;
@ -8,25 +9,7 @@ use std::path::Path;
#[derive(Debug, Deserialize)] #[derive(Debug, Deserialize)]
struct Schema { struct Schema {
haex: Haex, haex: HashMap<String, Value>,
}
#[derive(Debug, Deserialize)]
#[allow(non_snake_case)]
struct Haex {
settings: TableDefinition,
extensions: TableDefinition,
extension_permissions: TableDefinition,
notifications: TableDefinition,
desktop_items: TableDefinition,
crdt: Crdt,
}
#[derive(Debug, Deserialize)]
struct Crdt {
logs: TableDefinition,
snapshots: TableDefinition,
configs: TableDefinition,
} }
#[derive(Debug, Deserialize)] #[derive(Debug, Deserialize)]
@ -45,185 +28,39 @@ pub fn generate_table_names() {
let reader = BufReader::new(file); let reader = BufReader::new(file);
let schema: Schema = let schema: Schema =
serde_json::from_reader(reader).expect("Konnte tableNames.json nicht parsen"); serde_json::from_reader(reader).expect("Konnte tableNames.json nicht parsen");
let haex = schema.haex;
let code = format!( let mut code = String::from(
r#" r#"
// ================================================================== // ==================================================================
// HINWEIS: Diese Datei wurde automatisch von build.rs generiert. // HINWEIS: Diese Datei wurde automatisch von build.rs generiert.
// Manuelle Änderungen werden bei der nächsten Kompilierung überschrieben! // Manuelle Änderungen werden bei der nächsten Kompilierung überschrieben!
// ================================================================== // ==================================================================
// --- Table: haex_settings ---
pub const TABLE_SETTINGS: &str = "{t_settings}";
pub const COL_SETTINGS_ID: &str = "{c_settings_id}";
pub const COL_SETTINGS_KEY: &str = "{c_settings_key}";
pub const COL_SETTINGS_TYPE: &str = "{c_settings_type}";
pub const COL_SETTINGS_VALUE: &str = "{c_settings_value}";
pub const COL_SETTINGS_HAEX_TOMBSTONE: &str = "{c_settings_tombstone}";
pub const COL_SETTINGS_HAEX_TIMESTAMP: &str = "{c_settings_timestamp}";
// --- Table: haex_extensions ---
pub const TABLE_EXTENSIONS: &str = "{t_extensions}";
pub const COL_EXTENSIONS_ID: &str = "{c_ext_id}";
pub const COL_EXTENSIONS_AUTHOR: &str = "{c_ext_author}";
pub const COL_EXTENSIONS_DESCRIPTION: &str = "{c_ext_description}";
pub const COL_EXTENSIONS_ENTRY: &str = "{c_ext_entry}";
pub const COL_EXTENSIONS_HOMEPAGE: &str = "{c_ext_homepage}";
pub const COL_EXTENSIONS_ENABLED: &str = "{c_ext_enabled}";
pub const COL_EXTENSIONS_ICON: &str = "{c_ext_icon}";
pub const COL_EXTENSIONS_NAME: &str = "{c_ext_name}";
pub const COL_EXTENSIONS_PUBLIC_KEY: &str = "{c_ext_public_key}";
pub const COL_EXTENSIONS_SIGNATURE: &str = "{c_ext_signature}";
pub const COL_EXTENSIONS_URL: &str = "{c_ext_url}";
pub const COL_EXTENSIONS_VERSION: &str = "{c_ext_version}";
pub const COL_EXTENSIONS_HAEX_TOMBSTONE: &str = "{c_ext_tombstone}";
pub const COL_EXTENSIONS_HAEX_TIMESTAMP: &str = "{c_ext_timestamp}";
// --- Table: haex_extension_permissions ---
pub const TABLE_EXTENSION_PERMISSIONS: &str = "{t_ext_perms}";
pub const COL_EXT_PERMS_ID: &str = "{c_extp_id}";
pub const COL_EXT_PERMS_EXTENSION_ID: &str = "{c_extp_extensionId}";
pub const COL_EXT_PERMS_RESOURCE_TYPE: &str = "{c_extp_resourceType}";
pub const COL_EXT_PERMS_ACTION: &str = "{c_extp_action}";
pub const COL_EXT_PERMS_TARGET: &str = "{c_extp_target}";
pub const COL_EXT_PERMS_CONSTRAINTS: &str = "{c_extp_constraints}";
pub const COL_EXT_PERMS_STATUS: &str = "{c_extp_status}";
pub const COL_EXT_PERMS_CREATED_AT: &str = "{c_extp_createdAt}";
pub const COL_EXT_PERMS_UPDATE_AT: &str = "{c_extp_updateAt}";
pub const COL_EXT_PERMS_HAEX_TOMBSTONE: &str = "{c_extp_tombstone}";
pub const COL_EXT_PERMS_HAEX_TIMESTAMP: &str = "{c_extp_timestamp}";
// --- Table: haex_notifications ---
pub const TABLE_NOTIFICATIONS: &str = "{t_notifications}";
pub const COL_NOTIFICATIONS_ID: &str = "{c_notif_id}";
pub const COL_NOTIFICATIONS_ALT: &str = "{c_notif_alt}";
pub const COL_NOTIFICATIONS_DATE: &str = "{c_notif_date}";
pub const COL_NOTIFICATIONS_ICON: &str = "{c_notif_icon}";
pub const COL_NOTIFICATIONS_IMAGE: &str = "{c_notif_image}";
pub const COL_NOTIFICATIONS_READ: &str = "{c_notif_read}";
pub const COL_NOTIFICATIONS_SOURCE: &str = "{c_notif_source}";
pub const COL_NOTIFICATIONS_TEXT: &str = "{c_notif_text}";
pub const COL_NOTIFICATIONS_TITLE: &str = "{c_notif_title}";
pub const COL_NOTIFICATIONS_TYPE: &str = "{c_notif_type}";
pub const COL_NOTIFICATIONS_HAEX_TOMBSTONE: &str = "{c_notif_tombstone}";
// --- Table: haex_desktop_items ---
pub const TABLE_DESKTOP_ITEMS: &str = "{t_desktop_items}";
pub const COL_DESKTOP_ITEMS_ID: &str = "{c_desktop_id}";
pub const COL_DESKTOP_ITEMS_ITEM_TYPE: &str = "{c_desktop_itemType}";
pub const COL_DESKTOP_ITEMS_REFERENCE_ID: &str = "{c_desktop_referenceId}";
pub const COL_DESKTOP_ITEMS_POSITION_X: &str = "{c_desktop_positionX}";
pub const COL_DESKTOP_ITEMS_POSITION_Y: &str = "{c_desktop_positionY}";
pub const COL_DESKTOP_ITEMS_HAEX_TOMBSTONE: &str = "{c_desktop_tombstone}";
pub const COL_DESKTOP_ITEMS_HAEX_TIMESTAMP: &str = "{c_desktop_timestamp}";
// --- Table: haex_crdt_logs ---
pub const TABLE_CRDT_LOGS: &str = "{t_crdt_logs}";
pub const COL_CRDT_LOGS_ID: &str = "{c_crdt_logs_id}";
pub const COL_CRDT_LOGS_HAEX_TIMESTAMP: &str = "{c_crdt_logs_timestamp}";
pub const COL_CRDT_LOGS_TABLE_NAME: &str = "{c_crdt_logs_tableName}";
pub const COL_CRDT_LOGS_ROW_PKS: &str = "{c_crdt_logs_rowPks}";
pub const COL_CRDT_LOGS_OP_TYPE: &str = "{c_crdt_logs_opType}";
pub const COL_CRDT_LOGS_COLUMN_NAME: &str = "{c_crdt_logs_columnName}";
pub const COL_CRDT_LOGS_NEW_VALUE: &str = "{c_crdt_logs_newValue}";
pub const COL_CRDT_LOGS_OLD_VALUE: &str = "{c_crdt_logs_oldValue}";
// --- Table: haex_crdt_snapshots ---
pub const TABLE_CRDT_SNAPSHOTS: &str = "{t_crdt_snapshots}";
pub const COL_CRDT_SNAPSHOTS_ID: &str = "{c_crdt_snap_id}";
pub const COL_CRDT_SNAPSHOTS_CREATED: &str = "{c_crdt_snap_created}";
pub const COL_CRDT_SNAPSHOTS_EPOCH_HLC: &str = "{c_crdt_snap_epoch}";
pub const COL_CRDT_SNAPSHOTS_LOCATION_URL: &str = "{c_crdt_snap_location}";
pub const COL_CRDT_SNAPSHOTS_FILE_SIZE: &str = "{c_crdt_snap_size}";
// --- Table: haex_crdt_configs ---
pub const TABLE_CRDT_CONFIGS: &str = "{t_crdt_configs}";
pub const COL_CRDT_CONFIGS_KEY: &str = "{c_crdt_configs_key}";
pub const COL_CRDT_CONFIGS_VALUE: &str = "{c_crdt_configs_value}";
"#,
// Settings
t_settings = haex.settings.name,
c_settings_id = haex.settings.columns["id"],
c_settings_key = haex.settings.columns["key"],
c_settings_type = haex.settings.columns["type"],
c_settings_value = haex.settings.columns["value"],
c_settings_tombstone = haex.settings.columns["haexTombstone"],
c_settings_timestamp = haex.settings.columns["haexTimestamp"],
// Extensions
t_extensions = haex.extensions.name,
c_ext_id = haex.extensions.columns["id"],
c_ext_author = haex.extensions.columns["author"],
c_ext_description = haex.extensions.columns["description"],
c_ext_entry = haex.extensions.columns["entry"],
c_ext_homepage = haex.extensions.columns["homepage"],
c_ext_enabled = haex.extensions.columns["enabled"],
c_ext_icon = haex.extensions.columns["icon"],
c_ext_name = haex.extensions.columns["name"],
c_ext_public_key = haex.extensions.columns["public_key"],
c_ext_signature = haex.extensions.columns["signature"],
c_ext_url = haex.extensions.columns["url"],
c_ext_version = haex.extensions.columns["version"],
c_ext_tombstone = haex.extensions.columns["haexTombstone"],
c_ext_timestamp = haex.extensions.columns["haexTimestamp"],
// Extension Permissions
t_ext_perms = haex.extension_permissions.name,
c_extp_id = haex.extension_permissions.columns["id"],
c_extp_extensionId = haex.extension_permissions.columns["extensionId"],
c_extp_resourceType = haex.extension_permissions.columns["resourceType"],
c_extp_action = haex.extension_permissions.columns["action"],
c_extp_target = haex.extension_permissions.columns["target"],
c_extp_constraints = haex.extension_permissions.columns["constraints"],
c_extp_status = haex.extension_permissions.columns["status"],
c_extp_createdAt = haex.extension_permissions.columns["createdAt"],
c_extp_updateAt = haex.extension_permissions.columns["updateAt"],
c_extp_tombstone = haex.extension_permissions.columns["haexTombstone"],
c_extp_timestamp = haex.extension_permissions.columns["haexTimestamp"],
// Notifications
t_notifications = haex.notifications.name,
c_notif_id = haex.notifications.columns["id"],
c_notif_alt = haex.notifications.columns["alt"],
c_notif_date = haex.notifications.columns["date"],
c_notif_icon = haex.notifications.columns["icon"],
c_notif_image = haex.notifications.columns["image"],
c_notif_read = haex.notifications.columns["read"],
c_notif_source = haex.notifications.columns["source"],
c_notif_text = haex.notifications.columns["text"],
c_notif_title = haex.notifications.columns["title"],
c_notif_type = haex.notifications.columns["type"],
c_notif_tombstone = haex.notifications.columns["haexTombstone"],
// Desktop Items
t_desktop_items = haex.desktop_items.name,
c_desktop_id = haex.desktop_items.columns["id"],
c_desktop_itemType = haex.desktop_items.columns["itemType"],
c_desktop_referenceId = haex.desktop_items.columns["referenceId"],
c_desktop_positionX = haex.desktop_items.columns["positionX"],
c_desktop_positionY = haex.desktop_items.columns["positionY"],
c_desktop_tombstone = haex.desktop_items.columns["haexTombstone"],
c_desktop_timestamp = haex.desktop_items.columns["haexTimestamp"],
// CRDT Logs
t_crdt_logs = haex.crdt.logs.name,
c_crdt_logs_id = haex.crdt.logs.columns["id"],
c_crdt_logs_timestamp = haex.crdt.logs.columns["haexTimestamp"],
c_crdt_logs_tableName = haex.crdt.logs.columns["tableName"],
c_crdt_logs_rowPks = haex.crdt.logs.columns["rowPks"],
c_crdt_logs_opType = haex.crdt.logs.columns["opType"],
c_crdt_logs_columnName = haex.crdt.logs.columns["columnName"],
c_crdt_logs_newValue = haex.crdt.logs.columns["newValue"],
c_crdt_logs_oldValue = haex.crdt.logs.columns["oldValue"],
// CRDT Snapshots
t_crdt_snapshots = haex.crdt.snapshots.name,
c_crdt_snap_id = haex.crdt.snapshots.columns["snapshotId"],
c_crdt_snap_created = haex.crdt.snapshots.columns["created"],
c_crdt_snap_epoch = haex.crdt.snapshots.columns["epochHlc"],
c_crdt_snap_location = haex.crdt.snapshots.columns["locationUrl"],
c_crdt_snap_size = haex.crdt.snapshots.columns["fileSizeBytes"],
// CRDT Configs
t_crdt_configs = haex.crdt.configs.name,
c_crdt_configs_key = haex.crdt.configs.columns["key"],
c_crdt_configs_value = haex.crdt.configs.columns["value"]
);
// Iterate dynamically over all entries in haex
for (key, value) in &schema.haex {
// Special handling for nested structures such as "crdt"
if key == "crdt" {
if let Some(crdt_obj) = value.as_object() {
for (crdt_key, crdt_value) in crdt_obj {
if let Ok(table) = serde_json::from_value::<TableDefinition>(crdt_value.clone())
{
let const_prefix = format!("CRDT_{}", to_screaming_snake_case(crdt_key));
code.push_str(&generate_table_constants(&table, &const_prefix));
}
}
}
} else {
// Regular table (settings, extensions, notifications, workspaces, desktop_items, etc.)
if let Ok(table) = serde_json::from_value::<TableDefinition>(value.clone()) {
let const_prefix = to_screaming_snake_case(key);
code.push_str(&generate_table_constants(&table, &const_prefix));
}
}
}
// --- Write the file ---
let mut f = File::create(&dest_path).expect("Konnte Zieldatei nicht erstellen");
f.write_all(code.as_bytes())
@@ -231,3 +68,51 @@ pub const COL_CRDT_CONFIGS_VALUE: &str = "{c_crdt_configs_value}";
println!("cargo:rerun-if-changed=database/tableNames.json");
}
/// Converts a string to SCREAMING_SNAKE_CASE
fn to_screaming_snake_case(s: &str) -> String {
let mut result = String::new();
let mut prev_is_lower = false;
for (i, ch) in s.chars().enumerate() {
if ch == '_' {
result.push('_');
prev_is_lower = false;
} else if ch.is_uppercase() {
if i > 0 && prev_is_lower {
result.push('_');
}
result.push(ch);
prev_is_lower = false;
} else {
result.push(ch.to_ascii_uppercase());
prev_is_lower = true;
}
}
result
}
/// Generates the constants for a table
fn generate_table_constants(table: &TableDefinition, const_prefix: &str) -> String {
let mut code = String::new();
// Table name
code.push_str(&format!("// --- Table: {} ---\n", table.name));
code.push_str(&format!(
"pub const TABLE_{}: &str = \"{}\";\n",
const_prefix, table.name
));
// Columns
for (col_key, col_value) in &table.columns {
let col_const_name = format!("COL_{}_{}", const_prefix, to_screaming_snake_case(col_key));
code.push_str(&format!(
"pub const {}: &str = \"{}\";\n",
col_const_name, col_value
));
}
code.push('\n');
code
}
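As a quick sanity check on the casing rules above, a hypothetical test (not part of this commit) would behave like this:

#[cfg(test)]
mod casing_tests {
    use super::to_screaming_snake_case;

    #[test]
    fn converts_mixed_and_snake_case_keys() {
        // camelCase keys get an underscore inserted before each upper-case letter
        assert_eq!(to_screaming_snake_case("extensionPermissions"), "EXTENSION_PERMISSIONS");
        // snake_case keys are simply upper-cased
        assert_eq!(to_screaming_snake_case("desktop_items"), "DESKTOP_ITEMS");
        // single words map to their plain upper-case form; with the "CRDT_" prefix
        // applied by the caller, "logs" becomes CRDT_LOGS
        assert_eq!(to_screaming_snake_case("logs"), "LOGS");
    }
}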


@@ -1,144 +1,84 @@
// src-tauri/src/crdt/insert_transformer.rs
// INSERT-specific CRDT transformations (ON CONFLICT, RETURNING)
use crate::crdt::trigger::{HLC_TIMESTAMP_COLUMN, TOMBSTONE_COLUMN};
use crate::crdt::trigger::HLC_TIMESTAMP_COLUMN;
use crate::database::error::DatabaseError;
use sqlparser::ast::{
Assignment, AssignmentTarget, BinaryOperator, Expr, Ident, Insert, ObjectNamePart,
OnConflict, OnConflictAction, OnInsert, SelectItem, SetExpr, Value,
};
use sqlparser::ast::{Expr, Ident, Insert, SelectItem, SetExpr, Value};
use uhlc::Timestamp;
/// Helper struct for INSERT transformations
pub struct InsertTransformer {
tombstone_column: &'static str,
hlc_timestamp_column: &'static str,
}
impl InsertTransformer {
pub fn new() -> Self {
Self {
tombstone_column: TOMBSTONE_COLUMN,
hlc_timestamp_column: HLC_TIMESTAMP_COLUMN,
}
}
/// Transforms INSERT statements (adds the HLC timestamp and handles tombstone conflicts)
/// Automatically adds RETURNING for primary keys so the executor knows the actual PKs
fn find_or_add_column(columns: &mut Vec<Ident>, col_name: &'static str) -> usize {
match columns.iter().position(|c| c.value == col_name) {
Some(index) => index, // Found: return the index.
None => {
// Not found: add it.
columns.push(Ident::new(col_name));
columns.len() - 1 // Index of the element we just pushed
}
}
}
/// If the index equals the length, the value is pushed instead.
fn set_or_push_value(row: &mut Vec<Expr>, index: usize, value: Expr) {
if index < row.len() {
// Column was present: replace the value (probably `?` or NULL)
row[index] = value;
} else {
// Column was not present: append the value
row.push(value);
}
}
fn set_or_push_projection(projection: &mut Vec<SelectItem>, index: usize, value: Expr) {
let item = SelectItem::UnnamedExpr(value);
if index < projection.len() {
projection[index] = item;
} else {
projection.push(item);
}
}
/// Transforms INSERT statements (adds the HLC timestamp)
/// Hard delete: no ON CONFLICT needed any more - deleted rows are really gone
pub fn transform_insert(
&self,
insert_stmt: &mut Insert,
timestamp: &Timestamp,
primary_keys: &[String],
foreign_keys: &[String],
) -> Result<(), DatabaseError> {
// Add haex_timestamp column if not exists
let hlc_col_index =
Self::find_or_add_column(&mut insert_stmt.columns, self.hlc_timestamp_column);
// ON CONFLICT logic removed entirely!
// With hard deletes there are no tombstone entries left to reactivate
// UNIQUE constraint violations are real errors
// Add both haex_timestamp and haex_tombstone columns
insert_stmt
.columns
.push(Ident::new(self.hlc_timestamp_column));
insert_stmt.columns.push(Ident::new(self.tombstone_column));
// Add RETURNING for all primary keys (if not already present)
// This lets us know the actual PK values after ON CONFLICT
if insert_stmt.returning.is_none() && !primary_keys.is_empty() {
insert_stmt.returning = Some(
primary_keys
.iter()
.map(|pk| SelectItem::UnnamedExpr(Expr::Identifier(Ident::new(pk))))
.collect(),
);
}
// Setze ON CONFLICT für UPSERT-Verhalten bei Tombstone-Einträgen
// Dies ermöglicht das Wiederverwenden von gelöschten Einträgen
if insert_stmt.on.is_none() {
// ON CONFLICT DO UPDATE SET ...
// Aktualisiere alle Spalten außer CRDT-Spalten, wenn ein Konflikt auftritt
// Erstelle UPDATE-Assignments für alle Spalten außer CRDT-Spalten, Primary Keys und Foreign Keys
let mut assignments = Vec::new();
for column in insert_stmt.columns.iter() {
let col_name = &column.value;
// Überspringe CRDT-Spalten
if col_name == self.hlc_timestamp_column || col_name == self.tombstone_column {
continue;
}
// Überspringe Primary Key Spalten um FOREIGN KEY Konflikte zu vermeiden
if primary_keys.contains(col_name) {
continue;
}
// Überspringe Foreign Key Spalten um FOREIGN KEY Konflikte zu vermeiden
// Wenn eine FK auf eine neue ID verweist, die noch nicht existiert, schlägt der Constraint fehl
if foreign_keys.contains(col_name) {
continue;
}
// excluded.column_name referenziert die neuen Werte aus dem INSERT
assignments.push(Assignment {
target: AssignmentTarget::ColumnName(sqlparser::ast::ObjectName(vec![
ObjectNamePart::Identifier(column.clone()),
])),
value: Expr::CompoundIdentifier(vec![Ident::new("excluded"), column.clone()]),
});
}
// Füge HLC-Timestamp Update hinzu (mit dem übergebenen timestamp)
assignments.push(Assignment {
target: AssignmentTarget::ColumnName(sqlparser::ast::ObjectName(vec![ObjectNamePart::Identifier(
Ident::new(self.hlc_timestamp_column),
)])),
value: Expr::Value(Value::SingleQuotedString(timestamp.to_string()).into()),
});
// Setze Tombstone auf 0 (reaktiviere den Eintrag)
assignments.push(Assignment {
target: AssignmentTarget::ColumnName(sqlparser::ast::ObjectName(vec![ObjectNamePart::Identifier(
Ident::new(self.tombstone_column),
)])),
value: Expr::Value(Value::Number("0".to_string(), false).into()),
});
// ON CONFLICT nur wenn Tombstone = 1 (Eintrag wurde gelöscht)
// Ansonsten soll der INSERT fehlschlagen (UNIQUE constraint error)
let tombstone_condition = Expr::BinaryOp {
left: Box::new(Expr::Identifier(Ident::new(self.tombstone_column))),
op: BinaryOperator::Eq,
right: Box::new(Expr::Value(Value::Number("1".to_string(), false).into())),
};
insert_stmt.on = Some(OnInsert::OnConflict(OnConflict {
conflict_target: None, // Wird auf alle UNIQUE Constraints angewendet
action: OnConflictAction::DoUpdate(sqlparser::ast::DoUpdate {
assignments,
selection: Some(tombstone_condition),
}),
}));
}
match insert_stmt.source.as_mut() {
Some(query) => match &mut *query.body {
SetExpr::Values(values) => {
for row in &mut values.rows {
// Add haex_timestamp value
row.push(Expr::Value(
Value::SingleQuotedString(timestamp.to_string()).into(),
));
// Add haex_tombstone value (0 = not deleted)
row.push(Expr::Value(Value::Number("0".to_string(), false).into()));
let hlc_value =
Expr::Value(Value::SingleQuotedString(timestamp.to_string()).into());
Self::set_or_push_value(row, hlc_col_index, hlc_value);
}
}
SetExpr::Select(select) => {
let hlc_expr =
Expr::Value(Value::SingleQuotedString(timestamp.to_string()).into());
select.projection.push(SelectItem::UnnamedExpr(hlc_expr));
// Add haex_tombstone value (0 = not deleted)
let tombstone_expr = Expr::Value(Value::Number("0".to_string(), false).into());
select
.projection
.push(SelectItem::UnnamedExpr(tombstone_expr));
let hlc_value =
Expr::Value(Value::SingleQuotedString(timestamp.to_string()).into());
Self::set_or_push_projection(&mut select.projection, hlc_col_index, hlc_value);
}
_ => {
return Err(DatabaseError::UnsupportedStatement {
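The helper pair find_or_add_column / set_or_push_value follows an index-or-append contract; a stripped-down sketch with plain strings (hypothetical, independent of the sqlparser types used above) illustrates it:

fn find_or_add(columns: &mut Vec<String>, name: &str) -> usize {
    match columns.iter().position(|c| c == name) {
        Some(i) => i,
        None => {
            columns.push(name.to_string());
            columns.len() - 1
        }
    }
}

fn set_or_push(row: &mut Vec<String>, index: usize, value: String) {
    if index < row.len() {
        row[index] = value; // column already existed: overwrite the placeholder
    } else {
        row.push(value); // column was appended: append the matching value
    }
}

fn main() {
    let mut cols = vec!["id".to_string(), "haex_timestamp".to_string()];
    let mut row = vec!["?".to_string(), "NULL".to_string()];
    // The column already exists, so the caller's placeholder is replaced in place
    // (the timestamp literal below is a dummy value).
    let i = find_or_add(&mut cols, "haex_timestamp");
    set_or_push(&mut row, i, "'hlc-dummy-timestamp'".to_string());
    assert_eq!(row[1], "'hlc-dummy-timestamp'");
}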


@@ -1,5 +1,5 @@
pub mod hlc;
pub mod insert_transformer;
pub mod query_transformer;
//pub mod query_transformer;
pub mod transformer;
pub mod trigger;


@@ -1,515 +0,0 @@
// src-tauri/src/crdt/query_transformer.rs
// SELECT-spezifische CRDT-Transformationen (Tombstone-Filterung)
use crate::crdt::trigger::{TOMBSTONE_COLUMN};
use crate::database::error::DatabaseError;
use sqlparser::ast::{
BinaryOperator, Expr, Ident, ObjectName, SelectItem, SetExpr, TableFactor, Value,
};
use std::collections::HashSet;
/// Helper-Struct für SELECT-Transformationen
pub struct QueryTransformer {
tombstone_column: &'static str,
}
impl QueryTransformer {
pub fn new() -> Self {
Self {
tombstone_column: TOMBSTONE_COLUMN,
}
}
/// Transformiert Query-Statements (fügt Tombstone-Filter hinzu)
pub fn transform_query_recursive(
&self,
query: &mut sqlparser::ast::Query,
excluded_tables: &std::collections::HashSet<&str>,
) -> Result<(), DatabaseError> {
self.add_tombstone_filters_recursive(&mut query.body, excluded_tables)
}
/// Rekursive Behandlung aller SetExpr-Typen mit vollständiger Subquery-Unterstützung
fn add_tombstone_filters_recursive(
&self,
set_expr: &mut SetExpr,
excluded_tables: &std::collections::HashSet<&str>,
) -> Result<(), DatabaseError> {
match set_expr {
SetExpr::Select(select) => {
self.add_tombstone_filters_to_select(select, excluded_tables)?;
// Transformiere auch Subqueries in Projektionen
for projection in &mut select.projection {
match projection {
SelectItem::UnnamedExpr(expr) | SelectItem::ExprWithAlias { expr, .. } => {
self.transform_expression_subqueries(expr, excluded_tables)?;
}
_ => {} // Wildcard projections ignorieren
}
}
// Transformiere Subqueries in WHERE
if let Some(where_clause) = &mut select.selection {
self.transform_expression_subqueries(where_clause, excluded_tables)?;
}
// Transformiere Subqueries in GROUP BY
match &mut select.group_by {
sqlparser::ast::GroupByExpr::All(_) => {
// GROUP BY ALL - keine Expressions zu transformieren
}
sqlparser::ast::GroupByExpr::Expressions(exprs, _) => {
for group_expr in exprs {
self.transform_expression_subqueries(group_expr, excluded_tables)?;
}
}
}
// Transformiere Subqueries in HAVING
if let Some(having) = &mut select.having {
self.transform_expression_subqueries(having, excluded_tables)?;
}
}
SetExpr::SetOperation { left, right, .. } => {
self.add_tombstone_filters_recursive(left, excluded_tables)?;
self.add_tombstone_filters_recursive(right, excluded_tables)?;
}
SetExpr::Query(query) => {
self.add_tombstone_filters_recursive(&mut query.body, excluded_tables)?;
}
SetExpr::Values(values) => {
// Transformiere auch Subqueries in Values-Listen
for row in &mut values.rows {
for expr in row {
self.transform_expression_subqueries(expr, excluded_tables)?;
}
}
}
_ => {} // Andere Fälle
}
Ok(())
}
/// Transformiert Subqueries innerhalb von Expressions
fn transform_expression_subqueries(
&self,
expr: &mut Expr,
excluded_tables: &std::collections::HashSet<&str>,
) -> Result<(), DatabaseError> {
match expr {
// Einfache Subqueries
Expr::Subquery(query) => {
self.add_tombstone_filters_recursive(&mut query.body, excluded_tables)?;
}
// EXISTS Subqueries
Expr::Exists { subquery, .. } => {
self.add_tombstone_filters_recursive(&mut subquery.body, excluded_tables)?;
}
// IN Subqueries
Expr::InSubquery {
expr: left_expr,
subquery,
..
} => {
self.transform_expression_subqueries(left_expr, excluded_tables)?;
self.add_tombstone_filters_recursive(&mut subquery.body, excluded_tables)?;
}
// ANY/ALL Subqueries
Expr::AnyOp { left, right, .. } | Expr::AllOp { left, right, .. } => {
self.transform_expression_subqueries(left, excluded_tables)?;
self.transform_expression_subqueries(right, excluded_tables)?;
}
// Binäre Operationen
Expr::BinaryOp { left, right, .. } => {
self.transform_expression_subqueries(left, excluded_tables)?;
self.transform_expression_subqueries(right, excluded_tables)?;
}
// Unäre Operationen
Expr::UnaryOp {
expr: inner_expr, ..
} => {
self.transform_expression_subqueries(inner_expr, excluded_tables)?;
}
// Verschachtelte Ausdrücke
Expr::Nested(nested) => {
self.transform_expression_subqueries(nested, excluded_tables)?;
}
// CASE-Ausdrücke
Expr::Case {
operand,
conditions,
else_result,
..
} => {
if let Some(op) = operand {
self.transform_expression_subqueries(op, excluded_tables)?;
}
for case_when in conditions {
self.transform_expression_subqueries(&mut case_when.condition, excluded_tables)?;
self.transform_expression_subqueries(&mut case_when.result, excluded_tables)?;
}
if let Some(else_res) = else_result {
self.transform_expression_subqueries(else_res, excluded_tables)?;
}
}
// Funktionsaufrufe
Expr::Function(func) => match &mut func.args {
sqlparser::ast::FunctionArguments::List(sqlparser::ast::FunctionArgumentList {
args,
..
}) => {
for arg in args {
if let sqlparser::ast::FunctionArg::Unnamed(
sqlparser::ast::FunctionArgExpr::Expr(expr),
) = arg
{
self.transform_expression_subqueries(expr, excluded_tables)?;
}
}
}
_ => {}
},
// BETWEEN
Expr::Between {
expr: main_expr,
low,
high,
..
} => {
self.transform_expression_subqueries(main_expr, excluded_tables)?;
self.transform_expression_subqueries(low, excluded_tables)?;
self.transform_expression_subqueries(high, excluded_tables)?;
}
// IN Liste
Expr::InList {
expr: main_expr,
list,
..
} => {
self.transform_expression_subqueries(main_expr, excluded_tables)?;
for list_expr in list {
self.transform_expression_subqueries(list_expr, excluded_tables)?;
}
}
// IS NULL/IS NOT NULL
Expr::IsNull(inner) | Expr::IsNotNull(inner) => {
self.transform_expression_subqueries(inner, excluded_tables)?;
}
// Andere Expression-Typen benötigen keine Transformation
_ => {}
}
Ok(())
}
/// Erstellt einen Tombstone-Filter für eine Tabelle
pub fn create_tombstone_filter(&self, table_alias: Option<&str>) -> Expr {
let column_expr = match table_alias {
Some(alias) => {
Expr::CompoundIdentifier(vec![Ident::new(alias), Ident::new(self.tombstone_column)])
}
None => {
Expr::Identifier(Ident::new(self.tombstone_column))
}
};
Expr::BinaryOp {
left: Box::new(column_expr),
op: BinaryOperator::NotEq,
right: Box::new(Expr::Value(Value::Number("1".to_string(), false).into())),
}
}
/// Normalisiert Tabellennamen (entfernt Anführungszeichen)
pub fn normalize_table_name(&self, name: &ObjectName) -> String {
let name_str = name.to_string().to_lowercase();
name_str.trim_matches('`').trim_matches('"').to_string()
}
/// Fügt Tombstone-Filter zu SELECT-Statements hinzu
pub fn add_tombstone_filters_to_select(
&self,
select: &mut sqlparser::ast::Select,
excluded_tables: &HashSet<&str>,
) -> Result<(), DatabaseError> {
// Sammle alle CRDT-Tabellen mit ihren Aliasen
let mut crdt_tables = Vec::new();
for twj in &select.from {
if let TableFactor::Table { name, alias, .. } = &twj.relation {
let table_name_str = self.normalize_table_name(name);
if !excluded_tables.contains(table_name_str.as_str()) {
let table_alias = alias.as_ref().map(|a| a.name.value.as_str());
crdt_tables.push((name.clone(), table_alias));
}
}
}
if crdt_tables.is_empty() {
return Ok(());
}
// Prüfe, welche Tombstone-Spalten bereits in der WHERE-Klausel referenziert werden
let explicitly_filtered_tables = if let Some(where_clause) = &select.selection {
self.find_explicitly_filtered_tombstone_tables(where_clause, &crdt_tables)
} else {
HashSet::new()
};
// Erstelle Filter nur für Tabellen, die noch nicht explizit gefiltert werden
let mut tombstone_filters = Vec::new();
for (table_name, table_alias) in crdt_tables {
let table_name_string = table_name.to_string();
let table_key = table_alias.unwrap_or(&table_name_string);
if !explicitly_filtered_tables.contains(table_key) {
tombstone_filters.push(self.create_tombstone_filter(table_alias));
}
}
// Füge die automatischen Filter hinzu
if !tombstone_filters.is_empty() {
let combined_filter = tombstone_filters
.into_iter()
.reduce(|acc, expr| Expr::BinaryOp {
left: Box::new(acc),
op: BinaryOperator::And,
right: Box::new(expr),
})
.unwrap();
match &mut select.selection {
Some(existing) => {
*existing = Expr::BinaryOp {
left: Box::new(existing.clone()),
op: BinaryOperator::And,
right: Box::new(combined_filter),
};
}
None => {
select.selection = Some(combined_filter);
}
}
}
Ok(())
}
/// Findet alle Tabellen, die bereits explizit Tombstone-Filter in der WHERE-Klausel haben
fn find_explicitly_filtered_tombstone_tables(
&self,
where_expr: &Expr,
crdt_tables: &[(ObjectName, Option<&str>)],
) -> HashSet<String> {
let mut filtered_tables = HashSet::new();
self.scan_expression_for_tombstone_references(
where_expr,
crdt_tables,
&mut filtered_tables,
);
filtered_tables
}
/// Rekursiv durchsucht einen Expression-Baum nach Tombstone-Spalten-Referenzen
fn scan_expression_for_tombstone_references(
&self,
expr: &Expr,
crdt_tables: &[(ObjectName, Option<&str>)],
filtered_tables: &mut HashSet<String>,
) {
match expr {
Expr::Identifier(ident) => {
if ident.value == self.tombstone_column && crdt_tables.len() == 1 {
let table_name_str = crdt_tables[0].0.to_string();
let table_key = crdt_tables[0].1.unwrap_or(&table_name_str);
filtered_tables.insert(table_key.to_string());
}
}
Expr::CompoundIdentifier(idents) => {
if idents.len() == 2 && idents[1].value == self.tombstone_column {
let table_ref = &idents[0].value;
for (table_name, alias) in crdt_tables {
let table_name_str = table_name.to_string();
if table_ref == &table_name_str || alias.map_or(false, |a| a == table_ref) {
filtered_tables.insert(table_ref.clone());
break;
}
}
}
}
Expr::BinaryOp { left, right, .. } => {
self.scan_expression_for_tombstone_references(left, crdt_tables, filtered_tables);
self.scan_expression_for_tombstone_references(right, crdt_tables, filtered_tables);
}
Expr::UnaryOp { expr, .. } => {
self.scan_expression_for_tombstone_references(expr, crdt_tables, filtered_tables);
}
Expr::Nested(nested) => {
self.scan_expression_for_tombstone_references(nested, crdt_tables, filtered_tables);
}
Expr::InList { expr, .. } => {
self.scan_expression_for_tombstone_references(expr, crdt_tables, filtered_tables);
}
Expr::Between { expr, .. } => {
self.scan_expression_for_tombstone_references(expr, crdt_tables, filtered_tables);
}
Expr::IsNull(expr) | Expr::IsNotNull(expr) => {
self.scan_expression_for_tombstone_references(expr, crdt_tables, filtered_tables);
}
Expr::Function(func) => {
if let sqlparser::ast::FunctionArguments::List(
sqlparser::ast::FunctionArgumentList { args, .. },
) = &func.args
{
for arg in args {
if let sqlparser::ast::FunctionArg::Unnamed(
sqlparser::ast::FunctionArgExpr::Expr(expr),
) = arg
{
self.scan_expression_for_tombstone_references(
expr,
crdt_tables,
filtered_tables,
);
}
}
}
}
Expr::Case {
operand,
conditions,
else_result,
..
} => {
if let Some(op) = operand {
self.scan_expression_for_tombstone_references(op, crdt_tables, filtered_tables);
}
for case_when in conditions {
self.scan_expression_for_tombstone_references(
&case_when.condition,
crdt_tables,
filtered_tables,
);
self.scan_expression_for_tombstone_references(
&case_when.result,
crdt_tables,
filtered_tables,
);
}
if let Some(else_res) = else_result {
self.scan_expression_for_tombstone_references(
else_res,
crdt_tables,
filtered_tables,
);
}
}
Expr::Subquery(query) => {
self.analyze_query_for_tombstone_references(query, crdt_tables, filtered_tables)
.ok();
}
Expr::Exists { subquery, .. } => {
self.analyze_query_for_tombstone_references(subquery, crdt_tables, filtered_tables)
.ok();
}
Expr::InSubquery { expr, subquery, .. } => {
self.scan_expression_for_tombstone_references(expr, crdt_tables, filtered_tables);
self.analyze_query_for_tombstone_references(subquery, crdt_tables, filtered_tables)
.ok();
}
Expr::AnyOp { left, right, .. } | Expr::AllOp { left, right, .. } => {
self.scan_expression_for_tombstone_references(left, crdt_tables, filtered_tables);
self.scan_expression_for_tombstone_references(right, crdt_tables, filtered_tables);
}
_ => {}
}
}
fn analyze_query_for_tombstone_references(
&self,
query: &sqlparser::ast::Query,
crdt_tables: &[(ObjectName, Option<&str>)],
filtered_tables: &mut HashSet<String>,
) -> Result<(), DatabaseError> {
self.analyze_set_expr_for_tombstone_references(&query.body, crdt_tables, filtered_tables)
}
fn analyze_set_expr_for_tombstone_references(
&self,
set_expr: &SetExpr,
crdt_tables: &[(ObjectName, Option<&str>)],
filtered_tables: &mut HashSet<String>,
) -> Result<(), DatabaseError> {
match set_expr {
SetExpr::Select(select) => {
if let Some(where_clause) = &select.selection {
self.scan_expression_for_tombstone_references(
where_clause,
crdt_tables,
filtered_tables,
);
}
for projection in &select.projection {
match projection {
SelectItem::UnnamedExpr(expr) | SelectItem::ExprWithAlias { expr, .. } => {
self.scan_expression_for_tombstone_references(
expr,
crdt_tables,
filtered_tables,
);
}
_ => {}
}
}
match &select.group_by {
sqlparser::ast::GroupByExpr::All(_) => {}
sqlparser::ast::GroupByExpr::Expressions(exprs, _) => {
for group_expr in exprs {
self.scan_expression_for_tombstone_references(
group_expr,
crdt_tables,
filtered_tables,
);
}
}
}
if let Some(having) = &select.having {
self.scan_expression_for_tombstone_references(
having,
crdt_tables,
filtered_tables,
);
}
}
SetExpr::SetOperation { left, right, .. } => {
self.analyze_set_expr_for_tombstone_references(left, crdt_tables, filtered_tables)?;
self.analyze_set_expr_for_tombstone_references(
right,
crdt_tables,
filtered_tables,
)?;
}
SetExpr::Query(query) => {
self.analyze_set_expr_for_tombstone_references(
&query.body,
crdt_tables,
filtered_tables,
)?;
}
SetExpr::Values(values) => {
for row in &values.rows {
for expr in row {
self.scan_expression_for_tombstone_references(
expr,
crdt_tables,
filtered_tables,
);
}
}
}
_ => {}
}
Ok(())
}
}
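For context, the filter this removed QueryTransformer used to inject (via create_tombstone_filter) corresponds to the following rewrite; an illustrative sketch only, where `notes` is a hypothetical table and the rendered not-equals operator may differ:

// Old soft-delete filtering, now removed together with this module.
// Input SELECT issued by a caller:
let original = "SELECT id, title FROM notes";
// What the transformer effectively produced for CRDT-synced tables:
let rewritten = "SELECT id, title FROM notes WHERE haex_tombstone <> 1";
// With hard deletes, SELECT statements now pass through unchanged.
let _ = (original, rewritten);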


@@ -1,12 +1,12 @@
// src-tauri/src/crdt/transformer.rs
use crate::crdt::insert_transformer::InsertTransformer;
use crate::crdt::query_transformer::QueryTransformer;
use crate::crdt::trigger::{HLC_TIMESTAMP_COLUMN, TOMBSTONE_COLUMN};
use crate::crdt::trigger::HLC_TIMESTAMP_COLUMN;
use crate::database::error::DatabaseError;
use crate::table_names::{TABLE_CRDT_CONFIGS, TABLE_CRDT_LOGS};
use sqlparser::ast::{
Assignment, AssignmentTarget, BinaryOperator, ColumnDef, DataType, Expr, Ident,
ObjectName, ObjectNamePart, Statement, TableFactor, TableObject,
Value,
};
use sqlparser::ast::{
Assignment, AssignmentTarget, ColumnDef, DataType, Expr, Ident, ObjectName, ObjectNamePart,
Statement, TableFactor, TableObject, Value,
};
use std::borrow::Cow;
use std::collections::HashSet;
@@ -15,46 +15,14 @@ use uhlc::Timestamp;
/// Configuration for CRDT columns
#[derive(Clone)]
struct CrdtColumns {
tombstone: &'static str,
hlc_timestamp: &'static str,
}
impl CrdtColumns {
const DEFAULT: Self = Self {
tombstone: TOMBSTONE_COLUMN,
hlc_timestamp: HLC_TIMESTAMP_COLUMN,
};
/// Erstellt einen Tombstone-Filter für eine Tabelle
fn create_tombstone_filter(&self, table_alias: Option<&str>) -> Expr {
let column_expr = match table_alias {
Some(alias) => {
// Qualifizierte Referenz: alias.tombstone
Expr::CompoundIdentifier(vec![Ident::new(alias), Ident::new(self.tombstone)])
}
None => {
// Einfache Referenz: tombstone
Expr::Identifier(Ident::new(self.tombstone))
}
};
Expr::BinaryOp {
left: Box::new(column_expr),
op: BinaryOperator::NotEq,
right: Box::new(Expr::Value(Value::Number("1".to_string(), false).into())),
}
}
/// Erstellt eine Tombstone-Zuweisung für UPDATE/DELETE
fn create_tombstone_assignment(&self) -> Assignment {
Assignment {
target: AssignmentTarget::ColumnName(ObjectName(vec![ObjectNamePart::Identifier(
Ident::new(self.tombstone),
)])),
value: Expr::Value(Value::Number("1".to_string(), false).into()),
}
}
/// Creates an HLC assignment for UPDATE/DELETE
fn create_hlc_assignment(&self, timestamp: &Timestamp) -> Assignment {
Assignment {
@@ -67,13 +35,6 @@ impl CrdtColumns {
/// Adds CRDT columns to a table definition
fn add_to_table_definition(&self, columns: &mut Vec<ColumnDef>) {
if !columns.iter().any(|c| c.name.value == self.tombstone) {
columns.push(ColumnDef {
name: Ident::new(self.tombstone),
data_type: DataType::Integer(None),
options: vec![],
});
}
if !columns.iter().any(|c| c.name.value == self.hlc_timestamp) {
columns.push(ColumnDef {
name: Ident::new(self.hlc_timestamp),
@@ -113,26 +74,14 @@ impl CrdtTransformer {
Cow::Owned(name_str.trim_matches('`').trim_matches('"').to_string())
}
// =================================================================
// PUBLIC API METHODS
// =================================================================
pub fn transform_select_statement(&self, stmt: &mut Statement) -> Result<(), DatabaseError> {
match stmt {
Statement::Query(query) => {
let query_transformer = QueryTransformer::new();
query_transformer.transform_query_recursive(query, &self.excluded_tables)
}
// Fange alle anderen Fälle ab und gib einen Fehler zurück
_ => Err(DatabaseError::UnsupportedStatement {
sql: stmt.to_string(),
reason: "This operation only accepts SELECT statements.".to_string(),
}),
}
}
/// Transforms statements WITH access to table information (recommended)
pub fn transform_execute_statement_with_table_info(
&self,
stmt: &mut Statement,
hlc_timestamp: &Timestamp,
tx: &rusqlite::Transaction,
) -> Result<Option<String>, DatabaseError> {
match stmt {
Statement::CreateTable(create_table) => {
@@ -149,31 +98,9 @@ impl CrdtTransformer {
Statement::Insert(insert_stmt) => {
if let TableObject::TableName(name) = &insert_stmt.table {
if self.is_crdt_sync_table(name) {
// Hard delete: no schema lookup needed any more (no ON CONFLICT)
// Fetch the table information to identify PKs and FKs
let table_name_str = self.normalize_table_name(name);
let columns = crate::crdt::trigger::get_table_schema(tx, &table_name_str)
.map_err(|e| DatabaseError::ExecutionError {
sql: format!("PRAGMA table_info('{}')", table_name_str),
reason: e.to_string(),
table: Some(table_name_str.to_string()),
})?;
let primary_keys: Vec<String> = columns
.iter()
.filter(|c| c.is_pk)
.map(|c| c.name.clone())
.collect();
let foreign_keys = crate::crdt::trigger::get_foreign_key_columns(tx, &table_name_str)
.map_err(|e| DatabaseError::ExecutionError {
sql: format!("PRAGMA foreign_key_list('{}')", table_name_str),
reason: e.to_string(),
table: Some(table_name_str.to_string()),
})?;
let insert_transformer = InsertTransformer::new();
insert_transformer.transform_insert(insert_stmt, hlc_timestamp, &primary_keys, &foreign_keys)?;
insert_transformer.transform_insert(insert_stmt, hlc_timestamp)?;
}
}
Ok(None)
@@ -188,23 +115,11 @@ }
}
Ok(None)
}
Statement::Delete(_del_stmt) => {
// Hard delete - no transformation!
// DELETE stays DELETE
// The BEFORE DELETE triggers write the logs
Statement::Delete(del_stmt) => {
if let Some(table_name) = self.extract_table_name_from_delete(del_stmt) {
let table_name_str = self.normalize_table_name(&table_name);
let is_crdt = self.is_crdt_sync_table(&table_name);
eprintln!("DEBUG DELETE (with_table_info): table='{}', is_crdt_sync={}, normalized='{}'",
table_name, is_crdt, table_name_str);
if is_crdt {
eprintln!("DEBUG: Transforming DELETE to UPDATE for table '{}'", table_name_str);
self.transform_delete_to_update(stmt, hlc_timestamp)?;
}
Ok(None)
} else {
Err(DatabaseError::UnsupportedStatement {
sql: del_stmt.to_string(),
reason: "DELETE from non-table source or multiple tables".to_string(),
})
}
}
Statement::AlterTable { name, .. } => {
if self.is_crdt_sync_table(name) {
@@ -222,9 +137,6 @@ impl CrdtTransformer {
stmt: &mut Statement,
hlc_timestamp: &Timestamp,
) -> Result<Option<String>, DatabaseError> {
// For INSERT statements without a connection we use an empty PK list
// That means ALL columns are set in the ON CONFLICT UPDATE
// This is a fallback for when no connection is available
match stmt {
Statement::CreateTable(create_table) => {
if self.is_crdt_sync_table(&create_table.name) {
@@ -240,9 +152,9 @@ impl CrdtTransformer {
Statement::Insert(insert_stmt) => {
if let TableObject::TableName(name) = &insert_stmt.table {
if self.is_crdt_sync_table(name) {
// Without a connection: empty PK and FK lists (all columns get updated)
// Hard delete: no ON CONFLICT logic needed any more
let insert_transformer = InsertTransformer::new();
insert_transformer.transform_insert(insert_stmt, hlc_timestamp, &[], &[])?;
insert_transformer.transform_insert(insert_stmt, hlc_timestamp)?;
}
}
Ok(None)
@@ -257,18 +169,10 @@ impl CrdtTransformer {
}
Ok(None)
}
Statement::Delete(_del_stmt) => {
// Hard delete - no transformation!
// DELETE stays DELETE
Statement::Delete(del_stmt) => {
if let Some(table_name) = self.extract_table_name_from_delete(del_stmt) {
if self.is_crdt_sync_table(&table_name) {
self.transform_delete_to_update(stmt, hlc_timestamp)?;
}
Ok(None)
} else {
Err(DatabaseError::UnsupportedStatement {
sql: del_stmt.to_string(),
reason: "DELETE from non-table source or multiple tables".to_string(),
})
}
}
Statement::AlterTable { name, .. } => {
if self.is_crdt_sync_table(name) {
@@ -280,65 +184,4 @@ impl CrdtTransformer {
_ => Ok(None),
}
}
/// Transformiert DELETE zu UPDATE (soft delete)
fn transform_delete_to_update(
&self,
stmt: &mut Statement,
timestamp: &Timestamp,
) -> Result<(), DatabaseError> {
if let Statement::Delete(del_stmt) = stmt {
let table_to_update = match &del_stmt.from {
sqlparser::ast::FromTable::WithFromKeyword(from)
| sqlparser::ast::FromTable::WithoutKeyword(from) => {
if from.len() == 1 {
from[0].clone()
} else {
return Err(DatabaseError::UnsupportedStatement {
reason: "DELETE with multiple tables not supported".to_string(),
sql: stmt.to_string(),
});
}
}
};
let assignments = vec![
self.columns.create_tombstone_assignment(),
self.columns.create_hlc_assignment(timestamp),
];
*stmt = Statement::Update {
table: table_to_update,
assignments,
from: None,
selection: del_stmt.selection.clone(),
returning: None,
or: None,
limit: None,
};
}
Ok(())
}
/// Extrahiert Tabellennamen aus DELETE-Statement
fn extract_table_name_from_delete(
&self,
del_stmt: &sqlparser::ast::Delete,
) -> Option<ObjectName> {
let tables = match &del_stmt.from {
sqlparser::ast::FromTable::WithFromKeyword(from)
| sqlparser::ast::FromTable::WithoutKeyword(from) => from,
};
if tables.len() == 1 {
if let TableFactor::Table { name, .. } = &tables[0].relation {
Some(name.clone())
} else {
None
}
} else {
None
}
}
}
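To illustrate the hard-delete path described in the comments above, a small sketch using the sqlparser crate already imported here (hypothetical table name `notes`, not taken from the repository):

use sqlparser::dialect::SQLiteDialect;
use sqlparser::parser::Parser;

fn main() {
    // A DELETE against a hypothetical CRDT-synced table.
    let sql = "DELETE FROM notes WHERE id = '42'";
    let stmts = Parser::parse_sql(&SQLiteDialect {}, sql).expect("parse");
    // With hard deletes the transformer leaves Statement::Delete untouched,
    // so re-rendering the AST yields an ordinary DELETE; the BEFORE DELETE
    // trigger is what records the change in the CRDT log.
    println!("{}", stmts[0]);
}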


@@ -9,20 +9,17 @@ use ts_rs::TS;
// The "z_" prefix is meant to ensure that these triggers run last
const INSERT_TRIGGER_TPL: &str = "z_crdt_{TABLE_NAME}_insert";
const UPDATE_TRIGGER_TPL: &str = "z_crdt_{TABLE_NAME}_update";
const DELETE_TRIGGER_TPL: &str = "z_crdt_{TABLE_NAME}_delete";
//const SYNC_ACTIVE_KEY: &str = "sync_active";
pub const TOMBSTONE_COLUMN: &str = "haex_tombstone";
pub const HLC_TIMESTAMP_COLUMN: &str = "haex_timestamp";
/// Name of the custom UUID generation function (registered in database::core::open_and_init_db)
pub const UUID_FUNCTION_NAME: &str = "gen_uuid";
#[derive(Debug)]
pub enum CrdtSetupError {
/// Wraps an error coming from the rusqlite library.
DatabaseError(rusqlite::Error),
/// Die Tabelle hat keine Tombstone-Spalte, was eine CRDT-Voraussetzung ist.
TombstoneColumnMissing {
table_name: String,
column_name: String,
},
HlcColumnMissing {
table_name: String,
column_name: String,
@@ -36,14 +33,6 @@ impl Display for CrdtSetupError {
fn fmt(&self, f: &mut Formatter<'_>) -> fmt::Result {
match self {
CrdtSetupError::DatabaseError(e) => write!(f, "Database error: {}", e),
CrdtSetupError::TombstoneColumnMissing {
table_name,
column_name,
} => write!(
f,
"Table '{}' is missing the required tombstone column '{}'",
table_name, column_name
),
CrdtSetupError::HlcColumnMissing {
table_name,
column_name,
@@ -94,7 +83,8 @@ impl ColumnInfo {
}
fn is_safe_identifier(name: &str) -> bool {
!name.is_empty() && name.chars().all(|c| c.is_alphanumeric() || c == '_')
// Allow alphanumeric characters, underscores, and hyphens (for extension names like "nuxt-app")
!name.is_empty() && name.chars().all(|c| c.is_alphanumeric() || c == '_' || c == '-')
}
/// Sets up CRDT triggers for a single table.
@@ -109,13 +99,6 @@ pub fn setup_triggers_for_table(
return Ok(TriggerSetupResult::TableNotFound);
}
if !columns.iter().any(|c| c.name == TOMBSTONE_COLUMN) {
return Err(CrdtSetupError::TombstoneColumnMissing {
table_name: table_name.to_string(),
column_name: TOMBSTONE_COLUMN.to_string(),
});
}
if !columns.iter().any(|c| c.name == HLC_TIMESTAMP_COLUMN) {
return Err(CrdtSetupError::HlcColumnMissing {
table_name: table_name.to_string(),
@@ -137,12 +120,13 @@ pub fn setup_triggers_for_table(
let cols_to_track: Vec<String> = columns
.iter()
.filter(|c| !c.is_pk) //&& c.name != TOMBSTONE_COLUMN && c.name != HLC_TIMESTAMP_COLUMN
.filter(|c| !c.is_pk)
.map(|c| c.name.clone())
.collect();
let insert_trigger_sql = generate_insert_trigger_sql(table_name, &pks, &cols_to_track);
let update_trigger_sql = generate_update_trigger_sql(table_name, &pks, &cols_to_track);
let delete_trigger_sql = generate_delete_trigger_sql(table_name, &pks, &cols_to_track);
if recreate {
drop_triggers_for_table(&tx, table_name)?;
@@ -150,6 +134,7 @@ pub fn setup_triggers_for_table(
tx.execute_batch(&insert_trigger_sql)?;
tx.execute_batch(&update_trigger_sql)?;
tx.execute_batch(&delete_trigger_sql)?;
Ok(TriggerSetupResult::Success)
}
@@ -170,28 +155,7 @@ pub fn get_table_schema(conn: &Connection, table_name: &str) -> RusqliteResult<V
rows.collect()
}
// get_foreign_key_columns() removed - not needed with hard deletes (no ON CONFLICT logic)
/// Fetches all foreign key columns of a table.
/// Gibt eine Liste der Spaltennamen zurück, die Foreign Keys sind.
pub fn get_foreign_key_columns(conn: &Connection, table_name: &str) -> RusqliteResult<Vec<String>> {
if !is_safe_identifier(table_name) {
return Err(rusqlite::Error::InvalidParameterName(format!(
"Invalid or unsafe table name provided: {}",
table_name
))
.into());
}
let sql = format!("PRAGMA foreign_key_list(\"{}\");", table_name);
let mut stmt = conn.prepare(&sql)?;
// foreign_key_list gibt Spalten zurück: id, seq, table, from, to, on_update, on_delete, match
// Wir brauchen die "from" Spalte, die den Namen der FK-Spalte in der aktuellen Tabelle enthält
let rows = stmt.query_map([], |row| {
row.get::<_, String>("from")
})?;
rows.collect()
}
pub fn drop_triggers_for_table(
tx: &Transaction, // operates directly on a transaction
@@ -209,8 +173,13 @@ pub fn drop_triggers_for_table(
drop_trigger_sql(INSERT_TRIGGER_TPL.replace("{TABLE_NAME}", table_name));
let drop_update_trigger_sql =
drop_trigger_sql(UPDATE_TRIGGER_TPL.replace("{TABLE_NAME}", table_name));
let drop_delete_trigger_sql =
drop_trigger_sql(DELETE_TRIGGER_TPL.replace("{TABLE_NAME}", table_name));
let sql_batch = format!("{}\n{}", drop_insert_trigger_sql, drop_update_trigger_sql);
let sql_batch = format!(
"{}\n{}\n{}",
drop_insert_trigger_sql, drop_update_trigger_sql, drop_delete_trigger_sql
);
tx.execute_batch(&sql_batch)?;
Ok(())
@@ -282,9 +251,10 @@ fn generate_insert_trigger_sql(table_name: &str, pks: &[String], cols: &[String]
let column_inserts = if cols.is_empty() {
// Only PKs -> simple insert into the log
format!(
"INSERT INTO {log_table} (haex_timestamp, op_type, table_name, row_pks)
VALUES (NEW.\"{hlc_col}\", 'INSERT', '{table}', json_object({pk_payload}));",
"INSERT INTO {log_table} (id, haex_timestamp, op_type, table_name, row_pks)
VALUES ({uuid_fn}(), NEW.\"{hlc_col}\", 'INSERT', '{table}', json_object({pk_payload}));",
log_table = TABLE_CRDT_LOGS,
uuid_fn = UUID_FUNCTION_NAME,
hlc_col = HLC_TIMESTAMP_COLUMN,
table = table_name,
pk_payload = pk_json_payload
@@ -293,9 +263,10 @@ fn generate_insert_trigger_sql(table_name: &str, pks: &[String], cols: &[String]
cols.iter().fold(String::new(), |mut acc, col| {
writeln!(
&mut acc,
"INSERT INTO {log_table} (haex_timestamp, op_type, table_name, row_pks, column_name, new_value)
VALUES (NEW.\"{hlc_col}\", 'INSERT', '{table}', json_object({pk_payload}), '{column}', json_object('value', NEW.\"{column}\"));",
"INSERT INTO {log_table} (id, haex_timestamp, op_type, table_name, row_pks, column_name, new_value)
VALUES ({uuid_fn}(), NEW.\"{hlc_col}\", 'INSERT', '{table}', json_object({pk_payload}), '{column}', json_object('value', NEW.\"{column}\"));",
log_table = TABLE_CRDT_LOGS,
uuid_fn = UUID_FUNCTION_NAME,
hlc_col = HLC_TIMESTAMP_COLUMN,
table = table_name,
pk_payload = pk_json_payload,
@@ -337,11 +308,12 @@ fn generate_update_trigger_sql(table_name: &str, pks: &[String], cols: &[String]
for col in cols {
writeln!(
&mut body,
"INSERT INTO {log_table} (haex_timestamp, op_type, table_name, row_pks, column_name, new_value, old_value)
SELECT NEW.\"{hlc_col}\", 'UPDATE', '{table}', json_object({pk_payload}), '{column}',
"INSERT INTO {log_table} (id, haex_timestamp, op_type, table_name, row_pks, column_name, new_value, old_value)
SELECT {uuid_fn}(), NEW.\"{hlc_col}\", 'UPDATE', '{table}', json_object({pk_payload}), '{column}',
json_object('value', NEW.\"{column}\"), json_object('value', OLD.\"{column}\") json_object('value', NEW.\"{column}\"), json_object('value', OLD.\"{column}\")
WHERE NEW.\"{column}\" IS NOT OLD.\"{column}\";",
log_table = TABLE_CRDT_LOGS,
uuid_fn = UUID_FUNCTION_NAME,
hlc_col = HLC_TIMESTAMP_COLUMN,
table = table_name,
pk_payload = pk_json_payload,
@@ -350,19 +322,7 @@ fn generate_update_trigger_sql(table_name: &str, pks: &[String], cols: &[String]
}
}
// Soft-delete logging removed - we now use hard deletes with a dedicated BEFORE DELETE trigger
// Log the soft delete
writeln!(
&mut body,
"INSERT INTO {log_table} (haex_timestamp, op_type, table_name, row_pks)
SELECT NEW.\"{hlc_col}\", 'DELETE', '{table}', json_object({pk_payload})
WHERE NEW.\"{tombstone_col}\" = 1 AND OLD.\"{tombstone_col}\" = 0;",
log_table = TABLE_CRDT_LOGS,
hlc_col = HLC_TIMESTAMP_COLUMN,
table = table_name,
pk_payload = pk_json_payload,
tombstone_col = TOMBSTONE_COLUMN
)
.unwrap();
let trigger_name = UPDATE_TRIGGER_TPL.replace("{TABLE_NAME}", table_name);
@@ -375,3 +335,57 @@ fn generate_update_trigger_sql(table_name: &str, pks: &[String], cols: &[String]
END;"
)
}
/// Generates the SQL for the BEFORE DELETE trigger.
/// IMPORTANT: BEFORE DELETE so that the data is still available!
fn generate_delete_trigger_sql(table_name: &str, pks: &[String], cols: &[String]) -> String {
let pk_json_payload = pks
.iter()
.map(|pk| format!("'{}', OLD.\"{}\"", pk, pk))
.collect::<Vec<_>>()
.join(", ");
let mut body = String::new();
// Store all column values for possible restoration
if !cols.is_empty() {
for col in cols {
writeln!(
&mut body,
"INSERT INTO {log_table} (id, haex_timestamp, op_type, table_name, row_pks, column_name, old_value)
VALUES ({uuid_fn}(), OLD.\"{hlc_col}\", 'DELETE', '{table}', json_object({pk_payload}), '{column}',
json_object('value', OLD.\"{column}\"));",
log_table = TABLE_CRDT_LOGS,
uuid_fn = UUID_FUNCTION_NAME,
hlc_col = HLC_TIMESTAMP_COLUMN,
table = table_name,
pk_payload = pk_json_payload,
column = col
).unwrap();
}
} else {
// Only PKs -> minimal delete log
writeln!(
&mut body,
"INSERT INTO {log_table} (id, haex_timestamp, op_type, table_name, row_pks)
VALUES ({uuid_fn}(), OLD.\"{hlc_col}\", 'DELETE', '{table}', json_object({pk_payload}));",
log_table = TABLE_CRDT_LOGS,
uuid_fn = UUID_FUNCTION_NAME,
hlc_col = HLC_TIMESTAMP_COLUMN,
table = table_name,
pk_payload = pk_json_payload
)
.unwrap();
}
let trigger_name = DELETE_TRIGGER_TPL.replace("{TABLE_NAME}", table_name);
format!(
"CREATE TRIGGER IF NOT EXISTS \"{trigger_name}\"
BEFORE DELETE ON \"{table_name}\"
FOR EACH ROW
BEGIN
{body}
END;"
)
}
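For orientation, the trigger generated by generate_delete_trigger_sql for a hypothetical table `notes` (primary key `id`, one tracked column `title`) would look roughly like this, assuming TABLE_CRDT_LOGS resolves to haex_crdt_logs as in the generated constants earlier in this diff:

// Hypothetical sketch of the expected output (layout approximated):
let _expected = r#"
CREATE TRIGGER IF NOT EXISTS "z_crdt_notes_delete"
BEFORE DELETE ON "notes"
FOR EACH ROW
BEGIN
INSERT INTO haex_crdt_logs (id, haex_timestamp, op_type, table_name, row_pks, column_name, old_value)
VALUES (gen_uuid(), OLD."haex_timestamp", 'DELETE', 'notes', json_object('id', OLD."id"), 'title',
json_object('value', OLD."title"));
END;
"#;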


@@ -1,9 +1,11 @@
// src-tauri/src/database/core.rs
use crate::crdt::trigger::UUID_FUNCTION_NAME;
use crate::database::error::DatabaseError;
use crate::database::DbConnection;
use crate::extension::database::executor::SqlExecutor;
use base64::{engine::general_purpose::STANDARD, Engine as _};
use rusqlite::functions::FunctionFlags;
use rusqlite::types::Value as SqlValue;
use rusqlite::{
types::{Value as RusqliteValue, ValueRef},
@@ -13,9 +15,9 @@ use serde_json::Value as JsonValue;
use sqlparser::ast::{Expr, Query, Select, SetExpr, Statement, TableFactor, TableObject};
use sqlparser::dialect::SQLiteDialect;
use sqlparser::parser::Parser;
use uuid::Uuid;
/// Opens and initializes a database with encryption
///
pub fn open_and_init_db(path: &str, key: &str, create: bool) -> Result<Connection, DatabaseError> {
let flags = if create {
OpenFlags::SQLITE_OPEN_READ_WRITE | OpenFlags::SQLITE_OPEN_CREATE
@@ -35,6 +37,19 @@ pub fn open_and_init_db(path: &str, key: &str, create: bool) -> Result<Connectio
reason: e.to_string(),
})?;
// Register custom UUID function for SQLite triggers
conn.create_scalar_function(
UUID_FUNCTION_NAME,
0,
FunctionFlags::SQLITE_UTF8 | FunctionFlags::SQLITE_DETERMINISTIC,
|_ctx| {
Ok(Uuid::new_v4().to_string())
},
)
.map_err(|e| DatabaseError::DatabaseError {
reason: format!("Failed to register {} function: {}", UUID_FUNCTION_NAME, e),
})?;
let journal_mode: String = conn
.query_row("PRAGMA journal_mode=WAL;", [], |row| row.get(0))
.map_err(|e| DatabaseError::PragmaError {
@@ -74,8 +89,15 @@ pub fn parse_single_statement(sql: &str) -> Result<Statement, DatabaseError> {
/// Utility for SQL parsing - parses multiple SQL statements
pub fn parse_sql_statements(sql: &str) -> Result<Vec<Statement>, DatabaseError> {
let dialect = SQLiteDialect {};
Parser::parse_sql(&dialect, sql).map_err(|e| DatabaseError::ParseError {
reason: e.to_string(),
// Normalize whitespace: replace multiple whitespaces (including newlines, tabs) with single space
let normalized_sql = sql
.split_whitespace()
.collect::<Vec<&str>>()
.join(" ");
Parser::parse_sql(&dialect, &normalized_sql).map_err(|e| DatabaseError::ParseError {
reason: format!("Failed to parse SQL: {}", e),
sql: sql.to_string(),
})
}
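A quick sketch of the whitespace normalization used above (hypothetical input; note that split_whitespace also collapses runs of whitespace inside quoted string literals):

// Multi-line SQL is flattened to a single space-separated line before parsing.
let sql = "SELECT id,\n       title\nFROM   notes;";
let normalized = sql.split_whitespace().collect::<Vec<&str>>().join(" ");
assert_eq!(normalized, "SELECT id, title FROM notes;");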
@@ -159,45 +181,23 @@ pub fn execute(
let params_sql: Vec<&dyn ToSql> = params_converted.iter().map(|v| v as &dyn ToSql).collect();
with_connection(connection, |conn| {
if sql.to_uppercase().contains("RETURNING") {
let mut stmt = conn.prepare(&sql)?;
// Check if the SQL contains RETURNING clause
let has_returning = sql.to_uppercase().contains("RETURNING");
if has_returning {
// Use prepare + query for RETURNING statements
let mut stmt = conn.prepare(&sql).map_err(|e| DatabaseError::PrepareError {
reason: e.to_string(),
})?;
let num_columns = stmt.column_count();
let mut rows = stmt.query(&params_sql[..])?;
let mut rows = stmt
.query(&params_sql[..])
.map_err(|e| DatabaseError::QueryError {
reason: e.to_string(),
})?;
let mut result_vec: Vec<Vec<JsonValue>> = Vec::new();
while let Some(row) = rows.next()? {
while let Some(row) = rows.next().map_err(|e| DatabaseError::RowProcessingError {
reason: format!("Row iteration error: {}", e),
})? {
let mut row_values: Vec<JsonValue> = Vec::with_capacity(num_columns);
for i in 0..num_columns {
let value_ref = row.get_ref(i)?;
let value_ref = row.get_ref(i).map_err(|e| {
DatabaseError::RowProcessingError {
reason: format!("Failed to get column {}: {}", i, e),
}
})?;
let json_val = convert_value_ref_to_json(value_ref)?;
row_values.push(json_val);
}
result_vec.push(row_values);
}
Ok(result_vec)
} else {
// For non-RETURNING statements, just execute and return empty array
conn.execute(&sql, &params_sql[..]).map_err(|e| {
let table_name = extract_primary_table_name_from_sql(&sql).unwrap_or(None);
DatabaseError::ExecutionError {
@@ -206,7 +206,6 @@ pub fn execute(
table: table_name,
}
})?;
Ok(vec![])
}
})
@@ -236,44 +235,34 @@ pub fn select(
let params_sql: Vec<&dyn ToSql> = params_converted.iter().map(|v| v as &dyn ToSql).collect();
with_connection(connection, |conn| {
let mut stmt = conn.prepare(&sql)?;
let mut stmt = conn
.prepare(&sql)
.map_err(|e| DatabaseError::PrepareError {
reason: e.to_string(),
})?;
let num_columns = stmt.column_count();
let mut rows = stmt.query(&params_sql[..])?;
let mut rows = stmt
.query(&params_sql[..])
.map_err(|e| DatabaseError::QueryError {
reason: e.to_string(),
})?;
let mut result_vec: Vec<Vec<JsonValue>> = Vec::new();
while let Some(row) = rows.next()? {
while let Some(row) = rows.next().map_err(|e| DatabaseError::RowProcessingError {
reason: format!("Row iteration error: {}", e),
})? {
let mut row_values: Vec<JsonValue> = Vec::with_capacity(num_columns);
for i in 0..num_columns {
let value_ref = row.get_ref(i)?;
let value_ref = row
.get_ref(i)
.map_err(|e| DatabaseError::RowProcessingError {
reason: format!("Failed to get column {}: {}", i, e),
})?;
let json_val = convert_value_ref_to_json(value_ref)?; let json_val = convert_value_ref_to_json(value_ref)?;
row_values.push(json_val); row_values.push(json_val);
} }
result_vec.push(row_values); result_vec.push(row_values);
} }
Ok(result_vec) Ok(result_vec)
}) })
} }
pub fn select_with_crdt(
sql: String,
params: Vec<JsonValue>,
connection: &DbConnection,
) -> Result<Vec<Vec<JsonValue>>, DatabaseError> {
with_connection(&connection, |conn| {
SqlExecutor::query_select(conn, &sql, &params)
})
}
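A minimal sketch of how the new helper might be exercised from other backend code, assuming `db: &DbConnection` wraps an already opened vault; the table name `haex_workspaces` is only an assumed example:
// Sketch only, not part of the diff.
let rows = core::select_with_crdt(
    "SELECT id, name FROM haex_workspaces ORDER BY position".to_string(),
    vec![],
    db,
)?;
for row in rows {
    // Each row is a Vec<JsonValue> in SELECT column order.
    println!("{} -> {}", row[0], row[1]);
}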
/// Converts a rusqlite ValueRef to JSON /// Converts a rusqlite ValueRef to JSON
pub fn convert_value_ref_to_json(value_ref: ValueRef) -> Result<JsonValue, DatabaseError> { pub fn convert_value_ref_to_json(value_ref: ValueRef) -> Result<JsonValue, DatabaseError> {
let json_val = match value_ref { let json_val = match value_ref {

View File

@ -16,8 +16,6 @@ pub struct HaexSettings {
#[serde(skip_serializing_if = "Option::is_none")] #[serde(skip_serializing_if = "Option::is_none")]
pub value: Option<String>, pub value: Option<String>,
#[serde(skip_serializing_if = "Option::is_none")] #[serde(skip_serializing_if = "Option::is_none")]
pub haex_tombstone: Option<bool>,
#[serde(skip_serializing_if = "Option::is_none")]
pub haex_timestamp: Option<String>, pub haex_timestamp: Option<String>,
} }
@ -28,8 +26,7 @@ impl HaexSettings {
key: row.get(1)?, key: row.get(1)?,
r#type: row.get(2)?, r#type: row.get(2)?,
value: row.get(3)?, value: row.get(3)?,
haex_tombstone: row.get(4)?, haex_timestamp: row.get(4)?,
haex_timestamp: row.get(5)?,
}) })
} }
} }
@ -54,8 +51,6 @@ pub struct HaexExtensions {
pub icon: Option<String>, pub icon: Option<String>,
pub signature: String, pub signature: String,
#[serde(skip_serializing_if = "Option::is_none")] #[serde(skip_serializing_if = "Option::is_none")]
pub haex_tombstone: Option<bool>,
#[serde(skip_serializing_if = "Option::is_none")]
pub haex_timestamp: Option<String>, pub haex_timestamp: Option<String>,
} }
@ -73,8 +68,7 @@ impl HaexExtensions {
enabled: row.get(8)?, enabled: row.get(8)?,
icon: row.get(9)?, icon: row.get(9)?,
signature: row.get(10)?, signature: row.get(10)?,
haex_tombstone: row.get(11)?, haex_timestamp: row.get(11)?,
haex_timestamp: row.get(12)?,
}) })
} }
} }
@ -83,8 +77,7 @@ impl HaexExtensions {
#[serde(rename_all = "camelCase")] #[serde(rename_all = "camelCase")]
pub struct HaexExtensionPermissions { pub struct HaexExtensionPermissions {
pub id: String, pub id: String,
#[serde(skip_serializing_if = "Option::is_none")] pub extension_id: String,
pub extension_id: Option<String>,
#[serde(skip_serializing_if = "Option::is_none")] #[serde(skip_serializing_if = "Option::is_none")]
pub resource_type: Option<String>, pub resource_type: Option<String>,
#[serde(skip_serializing_if = "Option::is_none")] #[serde(skip_serializing_if = "Option::is_none")]
@ -99,8 +92,6 @@ pub struct HaexExtensionPermissions {
#[serde(skip_serializing_if = "Option::is_none")] #[serde(skip_serializing_if = "Option::is_none")]
pub updated_at: Option<String>, pub updated_at: Option<String>,
#[serde(skip_serializing_if = "Option::is_none")] #[serde(skip_serializing_if = "Option::is_none")]
pub haex_tombstone: Option<bool>,
#[serde(skip_serializing_if = "Option::is_none")]
pub haex_timestamp: Option<String>, pub haex_timestamp: Option<String>,
} }
@ -116,8 +107,7 @@ impl HaexExtensionPermissions {
status: row.get(6)?, status: row.get(6)?,
created_at: row.get(7)?, created_at: row.get(7)?,
updated_at: row.get(8)?, updated_at: row.get(8)?,
haex_tombstone: row.get(9)?, haex_timestamp: row.get(9)?,
haex_timestamp: row.get(10)?,
}) })
} }
} }
@ -200,3 +190,51 @@ impl HaexCrdtConfigs {
} }
} }
#[derive(Debug, Clone, Serialize, Deserialize)]
#[serde(rename_all = "camelCase")]
pub struct HaexDesktopItems {
pub id: String,
pub workspace_id: String,
pub item_type: String,
pub reference_id: String,
pub position_x: i64,
pub position_y: i64,
#[serde(skip_serializing_if = "Option::is_none")]
pub haex_timestamp: Option<String>,
}
impl HaexDesktopItems {
pub fn from_row(row: &rusqlite::Row) -> rusqlite::Result<Self> {
Ok(Self {
id: row.get(0)?,
workspace_id: row.get(1)?,
item_type: row.get(2)?,
reference_id: row.get(3)?,
position_x: row.get(4)?,
position_y: row.get(5)?,
haex_timestamp: row.get(6)?,
})
}
}
#[derive(Debug, Clone, Serialize, Deserialize)]
#[serde(rename_all = "camelCase")]
pub struct HaexWorkspaces {
pub id: String,
pub name: String,
pub position: i64,
#[serde(skip_serializing_if = "Option::is_none")]
pub haex_timestamp: Option<String>,
}
impl HaexWorkspaces {
pub fn from_row(row: &rusqlite::Row) -> rusqlite::Result<Self> {
Ok(Self {
id: row.get(0)?,
name: row.get(1)?,
position: row.get(2)?,
haex_timestamp: row.get(3)?,
})
}
}
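The generated `from_row` mappers plug straight into rusqlite's row APIs. A small sketch (the table and column names are assumed to mirror the struct; only the mapper signature comes from the generated code above):
fn load_workspaces(conn: &rusqlite::Connection) -> rusqlite::Result<Vec<HaexWorkspaces>> {
    let mut stmt = conn.prepare(
        "SELECT id, name, position, haex_timestamp FROM haex_workspaces ORDER BY position",
    )?;
    // from_row matches the FnMut(&Row) -> rusqlite::Result<T> shape expected by query_map.
    let rows = stmt.query_map([], HaexWorkspaces::from_row)?;
    rows.collect()
}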

View File

@ -0,0 +1,67 @@
// src-tauri/src/database/init.rs
// Database initialization utilities (trigger setup, etc.)
use crate::crdt::trigger;
use crate::database::error::DatabaseError;
use crate::table_names::{
TABLE_DESKTOP_ITEMS,
TABLE_EXTENSIONS,
TABLE_EXTENSION_PERMISSIONS,
TABLE_NOTIFICATIONS,
TABLE_SETTINGS,
TABLE_WORKSPACES,
};
use rusqlite::{params, Connection};
/// List of all CRDT tables that need triggers (password tables excluded; those live in the extension)
const CRDT_TABLES: &[&str] = &[
TABLE_SETTINGS,
TABLE_EXTENSIONS,
TABLE_EXTENSION_PERMISSIONS,
TABLE_NOTIFICATIONS,
TABLE_WORKSPACES,
TABLE_DESKTOP_ITEMS,
];
/// Checks whether the CRDT triggers have already been initialized and creates them if needed
///
/// This function is called the first time a template DB is opened.
/// It creates all CRDT triggers for the defined tables and records the
/// initialization in haex_settings.
///
/// On migrations (ALTER TABLE) the triggers are recreated automatically,
/// so no versioning is needed.
pub fn ensure_triggers_initialized(conn: &mut Connection) -> Result<bool, DatabaseError> {
let tx = conn.transaction()?;
// Check if triggers already initialized
let check_sql = format!(
"SELECT value FROM {} WHERE key = ? AND type = ?",
TABLE_SETTINGS
);
let initialized: Option<String> = tx
.query_row(
&check_sql,
params!["triggers_initialized", "system"],
|row| row.get(0),
)
.ok();
if initialized.is_some() {
eprintln!("DEBUG: Triggers already initialized, skipping");
tx.commit()?; // Important: still commit the transaction
return Ok(true); // true = was already initialized
}
eprintln!("INFO: Initializing CRDT triggers for database...");
// Create triggers for all CRDT tables
for table_name in CRDT_TABLES {
eprintln!(" - Setting up triggers for: {}", table_name);
trigger::setup_triggers_for_table(&tx, table_name, false)?;
}
tx.commit()?;
eprintln!("INFO: ✓ CRDT triggers created successfully (flag pending)");
Ok(false) // false = was just initialized now
}
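A hedged sketch of the intended call site: the guard runs right after a vault connection is opened, and the returned boolean tells the caller whether the settings flag still has to be written. The function name and error handling here are simplified; `open_and_init_db` and its arguments are taken from elsewhere in this diff.
fn open_vault_with_triggers(path: &str, key: &str) -> Result<Connection, DatabaseError> {
    let mut conn = crate::database::core::open_and_init_db(path, key, false)?;
    // false = triggers were just created, the haex_settings flag is still pending.
    let already_initialized = ensure_triggers_initialized(&mut conn)?;
    if !already_initialized {
        eprintln!("INFO: first open of this vault, CRDT triggers created");
    }
    Ok(conn)
}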

View File

@ -3,11 +3,13 @@
pub mod core; pub mod core;
pub mod error; pub mod error;
pub mod generated; pub mod generated;
pub mod init;
use crate::crdt::hlc::HlcService; use crate::crdt::hlc::HlcService;
use crate::database::core::execute_with_crdt;
use crate::database::error::DatabaseError; use crate::database::error::DatabaseError;
use crate::extension::database::executor::SqlExecutor; use crate::extension::database::executor::SqlExecutor;
use crate::table_names::TABLE_CRDT_CONFIGS; use crate::table_names::{TABLE_CRDT_CONFIGS, TABLE_SETTINGS};
use crate::AppState; use crate::AppState;
use rusqlite::Connection; use rusqlite::Connection;
use serde::{Deserialize, Serialize}; use serde::{Deserialize, Serialize};
@ -18,6 +20,8 @@ use std::time::UNIX_EPOCH;
use std::{fs, sync::Arc}; use std::{fs, sync::Arc};
use tauri::{path::BaseDirectory, AppHandle, Manager, State}; use tauri::{path::BaseDirectory, AppHandle, Manager, State};
use tauri_plugin_fs::FsExt; use tauri_plugin_fs::FsExt;
#[cfg(not(target_os = "android"))]
use trash;
use ts_rs::TS; use ts_rs::TS;
pub struct DbConnection(pub Arc<Mutex<Option<Connection>>>); pub struct DbConnection(pub Arc<Mutex<Option<Connection>>>);
@ -43,6 +47,15 @@ pub fn sql_execute(
core::execute(sql, params, &state.db) core::execute(sql, params, &state.db)
} }
#[tauri::command]
pub fn sql_select_with_crdt(
sql: String,
params: Vec<JsonValue>,
state: State<'_, AppState>,
) -> Result<Vec<Vec<JsonValue>>, DatabaseError> {
core::select_with_crdt(sql, params, &state.db)
}
#[tauri::command] #[tauri::command]
pub fn sql_execute_with_crdt( pub fn sql_execute_with_crdt(
sql: String, sql: String,
@ -67,7 +80,8 @@ pub fn sql_query_with_crdt(
core::with_connection(&state.db, |conn| { core::with_connection(&state.db, |conn| {
let tx = conn.transaction().map_err(DatabaseError::from)?; let tx = conn.transaction().map_err(DatabaseError::from)?;
let result = SqlExecutor::query_internal(&tx, &hlc_service, &sql, &params)?; let (_modified_tables, result) =
SqlExecutor::query_internal(&tx, &hlc_service, &sql, &params)?;
tx.commit().map_err(DatabaseError::from)?; tx.commit().map_err(DatabaseError::from)?;
Ok(result) Ok(result)
}) })
@ -121,7 +135,6 @@ pub fn get_vaults_directory(app_handle: &AppHandle) -> Result<String, DatabaseEr
Ok(vaults_dir.to_string_lossy().to_string()) Ok(vaults_dir.to_string_lossy().to_string())
} }
//#[serde(tag = "type", content = "details")]
#[derive(Debug, Serialize, Deserialize, TS)] #[derive(Debug, Serialize, Deserialize, TS)]
#[ts(export)] #[ts(export)]
#[serde(rename_all = "camelCase")] #[serde(rename_all = "camelCase")]
@ -195,15 +208,33 @@ pub fn list_vaults(app_handle: AppHandle) -> Result<Vec<VaultInfo>, DatabaseErro
/// Checks if a vault with the given name exists /// Checks if a vault with the given name exists
#[tauri::command] #[tauri::command]
pub fn vault_exists(app_handle: AppHandle, db_name: String) -> Result<bool, DatabaseError> { pub fn vault_exists(app_handle: AppHandle, vault_name: String) -> Result<bool, DatabaseError> {
let vault_path = get_vault_path(&app_handle, &db_name)?; let vault_path = get_vault_path(&app_handle, &vault_name)?;
Ok(Path::new(&vault_path).exists()) Ok(Path::new(&vault_path).exists())
} }
/// Deletes a vault database file /// Moves a vault database file to trash (or deletes permanently if trash is unavailable)
#[tauri::command] #[tauri::command]
pub fn delete_vault(app_handle: AppHandle, db_name: String) -> Result<String, DatabaseError> { pub fn move_vault_to_trash(
let vault_path = get_vault_path(&app_handle, &db_name)?; app_handle: AppHandle,
vault_name: String,
) -> Result<String, DatabaseError> {
// On Android, trash is not available, so delete permanently
#[cfg(target_os = "android")]
{
println!(
"Android platform detected, permanently deleting vault '{}'",
vault_name
);
return delete_vault(app_handle, vault_name);
}
// On non-Android platforms, try to use trash
#[cfg(not(target_os = "android"))]
{
let vault_path = get_vault_path(&app_handle, &vault_name)?;
let vault_shm_path = format!("{}-shm", vault_path);
let vault_wal_path = format!("{}-wal", vault_path);
if !Path::new(&vault_path).exists() { if !Path::new(&vault_path).exists() {
return Err(DatabaseError::IoError { return Err(DatabaseError::IoError {
@ -212,12 +243,63 @@ pub fn delete_vault(app_handle: AppHandle, db_name: String) -> Result<String, Da
}); });
} }
// Try to move to trash first (works on desktop systems)
let moved_to_trash = trash::delete(&vault_path).is_ok();
if moved_to_trash {
// Also try to move auxiliary files to trash (ignore errors as they might not exist)
let _ = trash::delete(&vault_shm_path);
let _ = trash::delete(&vault_wal_path);
Ok(format!(
"Vault '{}' successfully moved to trash",
vault_name
))
} else {
// Fallback: Permanent deletion if trash fails
println!(
"Trash not available, falling back to permanent deletion for vault '{}'",
vault_name
);
delete_vault(app_handle, vault_name)
}
}
}
/// Deletes a vault database file permanently (bypasses trash)
#[tauri::command]
pub fn delete_vault(app_handle: AppHandle, vault_name: String) -> Result<String, DatabaseError> {
let vault_path = get_vault_path(&app_handle, &vault_name)?;
let vault_shm_path = format!("{}-shm", vault_path);
let vault_wal_path = format!("{}-wal", vault_path);
if !Path::new(&vault_path).exists() {
return Err(DatabaseError::IoError {
path: vault_path,
reason: "Vault does not exist".to_string(),
});
}
if Path::new(&vault_shm_path).exists() {
fs::remove_file(&vault_shm_path).map_err(|e| DatabaseError::IoError {
path: vault_shm_path.clone(),
reason: format!("Failed to delete vault: {}", e),
})?;
}
if Path::new(&vault_wal_path).exists() {
fs::remove_file(&vault_wal_path).map_err(|e| DatabaseError::IoError {
path: vault_wal_path.clone(),
reason: format!("Failed to delete vault: {}", e),
})?;
}
fs::remove_file(&vault_path).map_err(|e| DatabaseError::IoError { fs::remove_file(&vault_path).map_err(|e| DatabaseError::IoError {
path: vault_path.clone(), path: vault_path.clone(),
reason: format!("Failed to delete vault: {}", e), reason: format!("Failed to delete vault: {}", e),
})?; })?;
Ok(format!("Vault '{}' successfully deleted", db_name)) Ok(format!("Vault '{}' successfully deleted", vault_name))
} }
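The trash-first behaviour above boils down to a small pattern; a reduced sketch for illustration (`trash::delete` is the real crate API, the helper name and error type are placeholders):
fn remove_with_trash_fallback(path: &std::path::Path) -> std::io::Result<()> {
    if trash::delete(path).is_ok() {
        // Moved to the OS trash; the -shm/-wal files are handled the same way above.
        return Ok(());
    }
    // Trash unavailable (e.g. Android or headless systems): delete permanently.
    std::fs::remove_file(path)
}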
#[tauri::command] #[tauri::command]
@ -367,9 +449,6 @@ pub fn open_encrypted_database(
state: State<'_, AppState>, state: State<'_, AppState>,
) -> Result<String, DatabaseError> { ) -> Result<String, DatabaseError> {
println!("Opening encrypted database vault_path: {}", vault_path); println!("Opening encrypted database vault_path: {}", vault_path);
// Derive the vault path from the name
//let vault_path = get_vault_path(&app_handle, &vault_name)?;
println!("Resolved vault path: {}", vault_path); println!("Resolved vault path: {}", vault_path);
if !Path::new(&vault_path).exists() { if !Path::new(&vault_path).exists() {
@ -392,9 +471,12 @@ fn initialize_session(
state: &State<'_, AppState>, state: &State<'_, AppState>,
) -> Result<(), DatabaseError> { ) -> Result<(), DatabaseError> {
// 1. Establish the raw database connection // 1. Establish the raw database connection
let conn = core::open_and_init_db(path, key, false)?; let mut conn = core::open_and_init_db(path, key, false)?;
// 2. Initialize the HLC service // 2. Ensure CRDT triggers are initialized (for template DB)
let triggers_were_already_initialized = init::ensure_triggers_initialized(&mut conn)?;
// 3. Initialize the HLC service
let hlc_service = HlcService::try_initialize(&conn, app_handle).map_err(|e| { let hlc_service = HlcService::try_initialize(&conn, app_handle).map_err(|e| {
// We convert the HlcError into a DatabaseError // We convert the HlcError into a DatabaseError
DatabaseError::ExecutionError { DatabaseError::ExecutionError {
@ -404,16 +486,53 @@ fn initialize_session(
} }
})?; })?;
// 3. Store everything in the global AppState // 4. Store everything in the global AppState
let mut db_guard = state.db.0.lock().map_err(|e| DatabaseError::LockError { let mut db_guard = state.db.0.lock().map_err(|e| DatabaseError::LockError {
reason: e.to_string(), reason: e.to_string(),
})?; })?;
// Important: we no longer need db_guard after this point, because
// 'execute_with_crdt' calls 'with_connection', which has to lock
// 'state.db' itself.
// We must release the guard *before* calling 'execute_with_crdt'
// to avoid a deadlock.
// But we first have to move 'conn' into it.
*db_guard = Some(conn); *db_guard = Some(conn);
drop(db_guard);
let mut hlc_guard = state.hlc.lock().map_err(|e| DatabaseError::LockError { let mut hlc_guard = state.hlc.lock().map_err(|e| DatabaseError::LockError {
reason: e.to_string(), reason: e.to_string(),
})?; })?;
*hlc_guard = hlc_service; *hlc_guard = hlc_service;
// IMPORTANT: do *not* release hlc_guard here; 'execute_with_crdt'
// expects a reference to the held guard.
// 5. NEW STEP: set the flag via CRDT if needed
if !triggers_were_already_initialized {
eprintln!("INFO: Setting 'triggers_initialized' flag via CRDT...");
let insert_sql = format!(
"INSERT INTO {} (id, key, type, value) VALUES (?, ?, ?, ?)",
TABLE_SETTINGS
);
// execute_with_crdt expects Vec<JsonValue>, not the params! macro
let params_vec: Vec<JsonValue> = vec![
JsonValue::String(uuid::Uuid::new_v4().to_string()),
JsonValue::String("triggers_initialized".to_string()),
JsonValue::String("system".to_string()),
JsonValue::String("1".to_string()),
];
// Now we can safely call 'execute_with_crdt',
// because the AppState is initialized.
execute_with_crdt(
insert_sql, params_vec, &state.db, // the &DbConnection (the mutex)
&hlc_guard, // the held MutexGuard
)?;
eprintln!("INFO: ✓ 'triggers_initialized' flag set.");
}
Ok(()) Ok(())
} }

View File

@ -4,13 +4,14 @@ use crate::extension::core::manifest::{EditablePermissions, ExtensionManifest, E
use crate::extension::core::types::{copy_directory, Extension, ExtensionSource}; use crate::extension::core::types::{copy_directory, Extension, ExtensionSource};
use crate::extension::core::ExtensionPermissions; use crate::extension::core::ExtensionPermissions;
use crate::extension::crypto::ExtensionCrypto; use crate::extension::crypto::ExtensionCrypto;
use crate::extension::database::executor::{PkRemappingContext, SqlExecutor}; use crate::extension::database::executor::SqlExecutor;
use crate::extension::error::ExtensionError; use crate::extension::error::ExtensionError;
use crate::extension::permissions::manager::PermissionManager; use crate::extension::permissions::manager::PermissionManager;
use crate::extension::permissions::types::ExtensionPermission; use crate::extension::permissions::types::ExtensionPermission;
use crate::table_names::{TABLE_EXTENSIONS, TABLE_EXTENSION_PERMISSIONS}; use crate::table_names::{TABLE_EXTENSIONS, TABLE_EXTENSION_PERMISSIONS};
use crate::AppState; use crate::AppState;
use std::collections::HashMap; use serde_json::Value as JsonValue;
use std::collections::{HashMap, HashSet};
use std::fs; use std::fs;
use std::io::Cursor; use std::io::Cursor;
use std::path::PathBuf; use std::path::PathBuf;
@ -65,17 +66,124 @@ impl ExtensionManager {
Self::default() Self::default()
} }
/// Helper function to validate path and check for path traversal
/// Returns the cleaned path if valid, or None if invalid/not found
/// If require_exists is true, returns None if path doesn't exist
pub fn validate_path_in_directory(
base_dir: &PathBuf,
relative_path: &str,
require_exists: bool,
) -> Result<Option<PathBuf>, ExtensionError> {
// Check for path traversal patterns
if relative_path.contains("..") {
return Err(ExtensionError::SecurityViolation {
reason: format!("Path traversal attempt: {}", relative_path),
});
}
// Clean the path (same logic as in protocol.rs)
let clean_path = relative_path
.replace('\\', "/")
.trim_start_matches('/')
.split('/')
.filter(|&part| !part.is_empty() && part != "." && part != "..")
.collect::<PathBuf>();
let full_path = base_dir.join(&clean_path);
// Check if file/directory exists (if required)
if require_exists && !full_path.exists() {
return Ok(None);
}
// Verify path is within base directory
let canonical_base = base_dir
.canonicalize()
.map_err(|e| ExtensionError::Filesystem { source: e })?;
if let Ok(canonical_path) = full_path.canonicalize() {
if !canonical_path.starts_with(&canonical_base) {
return Err(ExtensionError::SecurityViolation {
reason: format!("Path outside base directory: {}", relative_path),
});
}
Ok(Some(canonical_path))
} else {
// Path doesn't exist yet - still validate it would be within base
if full_path.starts_with(&canonical_base) {
Ok(Some(full_path))
} else {
Err(ExtensionError::SecurityViolation {
reason: format!("Path outside base directory: {}", relative_path),
})
}
}
}
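Illustrative only, how the helper is expected to behave; the paths below are made up:
let base = std::path::PathBuf::from("/tmp/extensions/acme_demo_1.0.0");
// Any ".." component is rejected before the filesystem is touched:
assert!(ExtensionManager::validate_path_in_directory(&base, "../secrets", false).is_err());
// An existing file inside the base dir resolves to Ok(Some(canonical_path)),
// a missing file with require_exists = true resolves to Ok(None).
let icon = ExtensionManager::validate_path_in_directory(&base, "haextension/favicon.ico", true);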
/// Validates icon path and falls back to favicon.ico if not specified
fn validate_and_resolve_icon_path(
extension_dir: &PathBuf,
haextension_dir: &str,
icon_path: Option<&str>,
) -> Result<Option<String>, ExtensionError> {
// If icon is specified in manifest, validate it
if let Some(icon) = icon_path {
if let Some(clean_path) = Self::validate_path_in_directory(extension_dir, icon, true)? {
return Ok(Some(clean_path.to_string_lossy().to_string()));
} else {
eprintln!("WARNING: Icon path specified in manifest not found: {}", icon);
// Continue to fallback logic
}
}
// Fallback 1: Check haextension/favicon.ico
let haextension_favicon = format!("{}/favicon.ico", haextension_dir);
if let Some(clean_path) = Self::validate_path_in_directory(extension_dir, &haextension_favicon, true)? {
return Ok(Some(clean_path.to_string_lossy().to_string()));
}
// Fallback 2: Check public/favicon.ico
if let Some(clean_path) = Self::validate_path_in_directory(extension_dir, "public/favicon.ico", true)? {
return Ok(Some(clean_path.to_string_lossy().to_string()));
}
// No icon found
Ok(None)
}
/// Extracts an extension ZIP archive and validates the manifest /// Extracts an extension ZIP archive and validates the manifest
fn extract_and_validate_extension( fn extract_and_validate_extension(
bytes: Vec<u8>, bytes: Vec<u8>,
temp_prefix: &str, temp_prefix: &str,
app_handle: &AppHandle,
) -> Result<ExtractedExtension, ExtensionError> { ) -> Result<ExtractedExtension, ExtensionError> {
let temp = std::env::temp_dir().join(format!("{}_{}", temp_prefix, uuid::Uuid::new_v4())); // Use app_cache_dir for better Android compatibility
let cache_dir = app_handle
.path()
.app_cache_dir()
.map_err(|e| ExtensionError::InstallationFailed {
reason: format!("Cannot get app cache dir: {}", e),
})?;
let temp_id = uuid::Uuid::new_v4();
let temp = cache_dir.join(format!("{}_{}", temp_prefix, temp_id));
let zip_file_path = cache_dir.join(format!("{}_{}_{}.haextension", temp_prefix, temp_id, "temp"));
// Write bytes to a temporary ZIP file first (important for Android file system)
fs::write(&zip_file_path, &bytes).map_err(|e| {
ExtensionError::filesystem_with_path(zip_file_path.display().to_string(), e)
})?;
// Create extraction directory
fs::create_dir_all(&temp) fs::create_dir_all(&temp)
.map_err(|e| ExtensionError::filesystem_with_path(temp.display().to_string(), e))?; .map_err(|e| ExtensionError::filesystem_with_path(temp.display().to_string(), e))?;
let mut archive = ZipArchive::new(Cursor::new(bytes)).map_err(|e| { // Open ZIP file from disk (more reliable on Android than from memory)
let zip_file = fs::File::open(&zip_file_path).map_err(|e| {
ExtensionError::filesystem_with_path(zip_file_path.display().to_string(), e)
})?;
let mut archive = ZipArchive::new(zip_file).map_err(|e| {
ExtensionError::InstallationFailed { ExtensionError::InstallationFailed {
reason: format!("Invalid ZIP: {}", e), reason: format!("Invalid ZIP: {}", e),
} }
@ -87,38 +195,54 @@ impl ExtensionManager {
reason: format!("Cannot extract ZIP: {}", e), reason: format!("Cannot extract ZIP: {}", e),
})?; })?;
// Check if manifest.json is directly in temp or in a subdirectory // Clean up temporary ZIP file
let manifest_path = temp.join("manifest.json"); let _ = fs::remove_file(&zip_file_path);
let actual_dir = if manifest_path.exists() {
temp.clone()
} else {
// manifest.json is in a subdirectory - find it
let mut found_dir = None;
for entry in fs::read_dir(&temp)
.map_err(|e| ExtensionError::filesystem_with_path(temp.display().to_string(), e))?
{
let entry = entry.map_err(|e| ExtensionError::Filesystem { source: e })?;
let path = entry.path();
if path.is_dir() && path.join("manifest.json").exists() {
found_dir = Some(path);
break;
}
}
found_dir.ok_or_else(|| ExtensionError::ManifestError { // Read haextension_dir from config if it exists, otherwise use default
reason: "manifest.json not found in extension archive".to_string(), let config_path = temp.join("haextension.config.json");
})? let haextension_dir = if config_path.exists() {
let config_content = std::fs::read_to_string(&config_path)
.map_err(|e| ExtensionError::ManifestError {
reason: format!("Cannot read haextension.config.json: {}", e),
})?;
let config: serde_json::Value = serde_json::from_str(&config_content)
.map_err(|e| ExtensionError::ManifestError {
reason: format!("Invalid haextension.config.json: {}", e),
})?;
let dir = config
.get("dev")
.and_then(|dev| dev.get("haextension_dir"))
.and_then(|dir| dir.as_str())
.unwrap_or("haextension")
.to_string();
dir
} else {
"haextension".to_string()
}; };
let manifest_path = actual_dir.join("manifest.json"); // Validate manifest path using helper function
let manifest_relative_path = format!("{}/manifest.json", haextension_dir);
let manifest_path = Self::validate_path_in_directory(&temp, &manifest_relative_path, true)?
.ok_or_else(|| ExtensionError::ManifestError {
reason: format!("manifest.json not found at {}/manifest.json", haextension_dir),
})?;
let actual_dir = temp.clone();
let manifest_content = let manifest_content =
std::fs::read_to_string(&manifest_path).map_err(|e| ExtensionError::ManifestError { std::fs::read_to_string(&manifest_path).map_err(|e| ExtensionError::ManifestError {
reason: format!("Cannot read manifest: {}", e), reason: format!("Cannot read manifest: {}", e),
})?; })?;
let manifest: ExtensionManifest = serde_json::from_str(&manifest_content)?; let mut manifest: ExtensionManifest = serde_json::from_str(&manifest_content)?;
let content_hash = ExtensionCrypto::hash_directory(&actual_dir).map_err(|e| { // Validate and resolve icon path with fallback logic
let validated_icon = Self::validate_and_resolve_icon_path(&actual_dir, &haextension_dir, manifest.icon.as_deref())?;
manifest.icon = validated_icon;
let content_hash = ExtensionCrypto::hash_directory(&actual_dir, &manifest_path).map_err(|e| {
ExtensionError::SignatureVerificationFailed { ExtensionError::SignatureVerificationFailed {
reason: e.to_string(), reason: e.to_string(),
} }
@ -167,7 +291,6 @@ impl ExtensionManager {
Ok(specific_extension_dir) Ok(specific_extension_dir)
} }
pub fn add_production_extension(&self, extension: Extension) -> Result<(), ExtensionError> { pub fn add_production_extension(&self, extension: Extension) -> Result<(), ExtensionError> {
if extension.id.is_empty() { if extension.id.is_empty() {
return Err(ExtensionError::ValidationError { return Err(ExtensionError::ValidationError {
@ -223,10 +346,11 @@ impl ExtensionManager {
name: &str, name: &str,
) -> Result<Option<(String, Extension)>, ExtensionError> { ) -> Result<Option<(String, Extension)>, ExtensionError> {
// 1. Check dev extensions first (higher priority) // 1. Check dev extensions first (higher priority)
let dev_extensions = self.dev_extensions.lock().map_err(|e| { let dev_extensions =
ExtensionError::MutexPoisoned { self.dev_extensions
.lock()
.map_err(|e| ExtensionError::MutexPoisoned {
reason: e.to_string(), reason: e.to_string(),
}
})?; })?;
for (id, ext) in dev_extensions.iter() { for (id, ext) in dev_extensions.iter() {
@ -236,10 +360,11 @@ impl ExtensionManager {
} }
// 2. Check production extensions // 2. Check production extensions
let prod_extensions = self.production_extensions.lock().map_err(|e| { let prod_extensions =
ExtensionError::MutexPoisoned { self.production_extensions
.lock()
.map_err(|e| ExtensionError::MutexPoisoned {
reason: e.to_string(), reason: e.to_string(),
}
})?; })?;
for (id, ext) in prod_extensions.iter() { for (id, ext) in prod_extensions.iter() {
@ -262,11 +387,7 @@ impl ExtensionManager {
.map(|(_, ext)| ext)) .map(|(_, ext)| ext))
} }
pub fn remove_extension( pub fn remove_extension(&self, public_key: &str, name: &str) -> Result<(), ExtensionError> {
&self,
public_key: &str,
name: &str,
) -> Result<(), ExtensionError> {
let (id, _) = self let (id, _) = self
.find_extension_id_by_public_key_and_name(public_key, name)? .find_extension_id_by_public_key_and_name(public_key, name)?
.ok_or_else(|| ExtensionError::NotFound { .ok_or_else(|| ExtensionError::NotFound {
@ -276,10 +397,11 @@ impl ExtensionManager {
// Remove from dev extensions first // Remove from dev extensions first
{ {
let mut dev_extensions = self.dev_extensions.lock().map_err(|e| { let mut dev_extensions =
ExtensionError::MutexPoisoned { self.dev_extensions
.lock()
.map_err(|e| ExtensionError::MutexPoisoned {
reason: e.to_string(), reason: e.to_string(),
}
})?; })?;
if dev_extensions.remove(&id).is_some() { if dev_extensions.remove(&id).is_some() {
return Ok(()); return Ok(());
@ -288,10 +410,11 @@ impl ExtensionManager {
// Remove from production extensions // Remove from production extensions
{ {
let mut prod_extensions = self.production_extensions.lock().map_err(|e| { let mut prod_extensions =
ExtensionError::MutexPoisoned { self.production_extensions
.lock()
.map_err(|e| ExtensionError::MutexPoisoned {
reason: e.to_string(), reason: e.to_string(),
}
})?; })?;
prod_extensions.remove(&id); prod_extensions.remove(&id);
} }
@ -316,7 +439,10 @@ impl ExtensionManager {
})?; })?;
eprintln!("DEBUG: Removing extension with ID: {}", extension.id); eprintln!("DEBUG: Removing extension with ID: {}", extension.id);
eprintln!("DEBUG: Extension name: {}, version: {}", extension_name, extension_version); eprintln!(
"DEBUG: Extension name: {}, version: {}",
extension_name, extension_version
);
// Delete permissions and the extension entry in one transaction // Delete permissions and the extension entry in one transaction
with_connection(&state.db, |conn| { with_connection(&state.db, |conn| {
@ -327,12 +453,11 @@ impl ExtensionManager {
})?; })?;
// Delete all permissions for the extension_id // Delete all permissions for the extension_id
eprintln!("DEBUG: Deleting permissions for extension_id: {}", extension.id); eprintln!(
PermissionManager::delete_permissions_in_transaction( "DEBUG: Deleting permissions for extension_id: {}",
&tx, extension.id
&hlc_service, );
&extension.id, PermissionManager::delete_permissions_in_transaction(&tx, &hlc_service, &extension.id)?;
)?;
// Delete the extension entry by extension_id // Delete the extension entry by extension_id
let sql = format!("DELETE FROM {} WHERE id = ?", TABLE_EXTENSIONS); let sql = format!("DELETE FROM {} WHERE id = ?", TABLE_EXTENSIONS);
@ -391,9 +516,10 @@ impl ExtensionManager {
pub async fn preview_extension_internal( pub async fn preview_extension_internal(
&self, &self,
app_handle: &AppHandle,
file_bytes: Vec<u8>, file_bytes: Vec<u8>,
) -> Result<ExtensionPreview, ExtensionError> { ) -> Result<ExtensionPreview, ExtensionError> {
let extracted = Self::extract_and_validate_extension(file_bytes, "haexhub_preview")?; let extracted = Self::extract_and_validate_extension(file_bytes, "haexhub_preview", app_handle)?;
let is_valid_signature = ExtensionCrypto::verify_signature( let is_valid_signature = ExtensionCrypto::verify_signature(
&extracted.manifest.public_key, &extracted.manifest.public_key,
@ -418,7 +544,7 @@ impl ExtensionManager {
custom_permissions: EditablePermissions, custom_permissions: EditablePermissions,
state: &State<'_, AppState>, state: &State<'_, AppState>,
) -> Result<String, ExtensionError> { ) -> Result<String, ExtensionError> {
let extracted = Self::extract_and_validate_extension(file_bytes, "haexhub_ext")?; let extracted = Self::extract_and_validate_extension(file_bytes, "haexhub_ext", &app_handle)?;
// Verify the signature (during installation an error is thrown, not just checked) // Verify the signature (during installation an error is thrown, not just checked)
ExtensionCrypto::verify_signature( ExtensionCrypto::verify_signature(
@ -435,6 +561,17 @@ impl ExtensionManager {
&extracted.manifest.version, &extracted.manifest.version,
)?; )?;
// If extension version already exists, remove it completely before installing
if extensions_dir.exists() {
eprintln!(
"Extension version already exists at {}, removing old version",
extensions_dir.display()
);
std::fs::remove_dir_all(&extensions_dir).map_err(|e| {
ExtensionError::filesystem_with_path(extensions_dir.display().to_string(), e)
})?;
}
std::fs::create_dir_all(&extensions_dir).map_err(|e| { std::fs::create_dir_all(&extensions_dir).map_err(|e| {
ExtensionError::filesystem_with_path(extensions_dir.display().to_string(), e) ExtensionError::filesystem_with_path(extensions_dir.display().to_string(), e)
})?; })?;
@ -469,22 +606,20 @@ impl ExtensionManager {
let actual_extension_id = with_connection(&state.db, |conn| { let actual_extension_id = with_connection(&state.db, |conn| {
let tx = conn.transaction().map_err(DatabaseError::from)?; let tx = conn.transaction().map_err(DatabaseError::from)?;
let hlc_service = state.hlc.lock().map_err(|_| DatabaseError::MutexPoisoned { let hlc_service_guard = state.hlc.lock().map_err(|_| DatabaseError::MutexPoisoned {
reason: "Failed to lock HLC service".to_string(), reason: "Failed to lock HLC service".to_string(),
})?; })?;
// Clone so the MutexGuard is released before potentially long DB operations
// Create a PK remapping context for the whole transaction let hlc_service = hlc_service_guard.clone();
// This enables automatic FK remapping when the extension INSERT hits ON CONFLICT drop(hlc_service_guard);
let mut pk_context = PkRemappingContext::new();
// 1. Create the extension entry with a generated UUID // 1. Create the extension entry with a generated UUID
// IMPORTANT: RETURNING is added automatically by the CRDT transformer
let insert_ext_sql = format!( let insert_ext_sql = format!(
"INSERT INTO {} (id, name, version, author, entry, icon, public_key, signature, homepage, description, enabled) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) RETURNING id", "INSERT INTO {} (id, name, version, author, entry, icon, public_key, signature, homepage, description, enabled, single_instance) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)",
TABLE_EXTENSIONS TABLE_EXTENSIONS
); );
let (_tables, returning_results) = SqlExecutor::query_internal_typed_with_context( SqlExecutor::execute_internal_typed(
&tx, &tx,
&hlc_service, &hlc_service,
&insert_ext_sql, &insert_ext_sql,
@ -500,27 +635,11 @@ impl ExtensionManager {
extracted.manifest.homepage, extracted.manifest.homepage,
extracted.manifest.description, extracted.manifest.description,
true, // enabled true, // enabled
extracted.manifest.single_instance.unwrap_or(false),
], ],
&mut pk_context,
)?; )?;
// Use the actual ID from the database (important with ON CONFLICT) // 2. Store permissions
// The haex_extensions table has a single-column PK named "id"
let actual_extension_id = returning_results
.first()
.and_then(|row| row.first())
.and_then(|val| val.as_str())
.map(|s| s.to_string())
.unwrap_or_else(|| extension_id.clone());
eprintln!(
"DEBUG: Extension UUID - Generated: {}, Actual from DB: {}",
extension_id, actual_extension_id
);
// 2. Store permissions (or update them if they already exist)
// Use a plain INSERT - the CRDT transformation adds ON CONFLICT automatically
// FK values (extension_id) are remapped automatically if the extension INSERT hit ON CONFLICT
let insert_perm_sql = format!( let insert_perm_sql = format!(
"INSERT INTO {} (id, extension_id, resource_type, action, target, constraints, status) VALUES (?, ?, ?, ?, ?, ?, ?)", "INSERT INTO {} (id, extension_id, resource_type, action, target, constraints, status) VALUES (?, ?, ?, ?, ?, ?, ?)",
TABLE_EXTENSION_PERMISSIONS TABLE_EXTENSION_PERMISSIONS
@ -530,7 +649,7 @@ impl ExtensionManager {
use crate::database::generated::HaexExtensionPermissions; use crate::database::generated::HaexExtensionPermissions;
let db_perm: HaexExtensionPermissions = perm.into(); let db_perm: HaexExtensionPermissions = perm.into();
SqlExecutor::execute_internal_typed_with_context( SqlExecutor::execute_internal_typed(
&tx, &tx,
&hlc_service, &hlc_service,
&insert_perm_sql, &insert_perm_sql,
@ -543,16 +662,15 @@ impl ExtensionManager {
db_perm.constraints, db_perm.constraints,
db_perm.status, db_perm.status,
], ],
&mut pk_context,
)?; )?;
} }
tx.commit().map_err(DatabaseError::from)?; tx.commit().map_err(DatabaseError::from)?;
Ok(actual_extension_id.clone()) Ok(extension_id.clone())
})?; })?;
let extension = Extension { let extension = Extension {
id: actual_extension_id.clone(), // use the actual_extension_id from the transaction id: extension_id.clone(),
source: ExtensionSource::Production { source: ExtensionSource::Production {
path: extensions_dir.clone(), path: extensions_dir.clone(),
version: extracted.manifest.version.clone(), version: extracted.manifest.version.clone(),
@ -573,6 +691,7 @@ impl ExtensionManager {
app_handle: &AppHandle, app_handle: &AppHandle,
state: &State<'_, AppState>, state: &State<'_, AppState>,
) -> Result<Vec<String>, ExtensionError> { ) -> Result<Vec<String>, ExtensionError> {
// Clear existing data
self.production_extensions self.production_extensions
.lock() .lock()
.map_err(|e| ExtensionError::MutexPoisoned { .map_err(|e| ExtensionError::MutexPoisoned {
@ -592,19 +711,21 @@ impl ExtensionManager {
})? })?
.clear(); .clear();
// Step 1: Load all data from the database in one go. // Load all data from the database
let extensions = with_connection(&state.db, |conn| { let extensions = with_connection(&state.db, |conn| {
let sql = format!( let sql = format!(
"SELECT id, name, version, author, entry, icon, public_key, signature, homepage, description, enabled FROM {}", "SELECT id, name, version, author, entry, icon, public_key, signature, homepage, description, enabled, single_instance FROM {}",
TABLE_EXTENSIONS TABLE_EXTENSIONS
); );
eprintln!("DEBUG: SQL Query before transformation: {}", sql); eprintln!("DEBUG: SQL Query before transformation: {}", sql);
let results = SqlExecutor::select_internal(conn, &sql, &[])?;
let results = SqlExecutor::query_select(conn, &sql, &[])?;
eprintln!("DEBUG: Query returned {} results", results.len()); eprintln!("DEBUG: Query returned {} results", results.len());
let mut data = Vec::new(); let mut data = Vec::new();
for result in results { for row in results {
let id = result["id"] // We expect the values in the order of the SELECT statement
let id = row[0]
.as_str() .as_str()
.ok_or_else(|| DatabaseError::SerializationError { .ok_or_else(|| DatabaseError::SerializationError {
reason: "Missing id field".to_string(), reason: "Missing id field".to_string(),
@ -612,31 +733,34 @@ impl ExtensionManager {
.to_string(); .to_string();
let manifest = ExtensionManifest { let manifest = ExtensionManifest {
name: result["name"] name: row[1]
.as_str() .as_str()
.ok_or_else(|| DatabaseError::SerializationError { .ok_or_else(|| DatabaseError::SerializationError {
reason: "Missing name field".to_string(), reason: "Missing name field".to_string(),
})? })?
.to_string(), .to_string(),
version: result["version"] version: row[2]
.as_str() .as_str()
.ok_or_else(|| DatabaseError::SerializationError { .ok_or_else(|| DatabaseError::SerializationError {
reason: "Missing version field".to_string(), reason: "Missing version field".to_string(),
})? })?
.to_string(), .to_string(),
author: result["author"].as_str().map(String::from), author: row[3].as_str().map(String::from),
entry: result["entry"].as_str().unwrap_or("index.html").to_string(), entry: row[4].as_str().map(String::from),
icon: result["icon"].as_str().map(String::from), icon: row[5].as_str().map(String::from),
public_key: result["public_key"].as_str().unwrap_or("").to_string(), public_key: row[6].as_str().unwrap_or("").to_string(),
signature: result["signature"].as_str().unwrap_or("").to_string(), signature: row[7].as_str().unwrap_or("").to_string(),
permissions: ExtensionPermissions::default(), permissions: ExtensionPermissions::default(),
homepage: result["homepage"].as_str().map(String::from), homepage: row[8].as_str().map(String::from),
description: result["description"].as_str().map(String::from), description: row[9].as_str().map(String::from),
single_instance: row[11]
.as_bool()
.or_else(|| row[11].as_i64().map(|v| v != 0)),
}; };
let enabled = result["enabled"] let enabled = row[10]
.as_bool() .as_bool()
.or_else(|| result["enabled"].as_i64().map(|v| v != 0)) .or_else(|| row[10].as_i64().map(|v| v != 0))
.unwrap_or(false); .unwrap_or(false);
data.push(ExtensionDataFromDb { data.push(ExtensionDataFromDb {
@ -665,9 +789,10 @@ impl ExtensionManager {
&extension_data.manifest.version, &extension_data.manifest.version,
)?; )?;
if !extension_path.exists() || !extension_path.join("manifest.json").exists() { // Check if extension directory exists
if !extension_path.exists() {
eprintln!( eprintln!(
"DEBUG: Extension files missing for: {} at {:?}", "DEBUG: Extension directory missing for: {} at {:?}",
extension_id, extension_path extension_id, extension_path
); );
self.missing_extensions self.missing_extensions
@ -684,10 +809,53 @@ impl ExtensionManager {
continue; continue;
} }
// Read haextension_dir from config if it exists, otherwise use default
let config_path = extension_path.join("haextension.config.json");
let haextension_dir = if config_path.exists() {
match std::fs::read_to_string(&config_path) {
Ok(config_content) => {
match serde_json::from_str::<serde_json::Value>(&config_content) {
Ok(config) => {
config
.get("dev")
.and_then(|dev| dev.get("haextension_dir"))
.and_then(|dir| dir.as_str())
.unwrap_or("haextension")
.to_string()
}
Err(_) => "haextension".to_string(),
}
}
Err(_) => "haextension".to_string(),
}
} else {
"haextension".to_string()
};
// Validate manifest.json path using helper function
let manifest_relative_path = format!("{}/manifest.json", haextension_dir);
if Self::validate_path_in_directory(&extension_path, &manifest_relative_path, true)?
.is_none()
{
eprintln!( eprintln!(
"DEBUG: Extension loaded successfully: {}", "DEBUG: manifest.json missing or invalid for: {} at {}/manifest.json",
extension_id extension_id, haextension_dir
); );
self.missing_extensions
.lock()
.map_err(|e| ExtensionError::MutexPoisoned {
reason: e.to_string(),
})?
.push(MissingExtension {
id: extension_id.clone(),
public_key: extension_data.manifest.public_key.clone(),
name: extension_data.manifest.name.clone(),
version: extension_data.manifest.version.clone(),
});
continue;
}
eprintln!("DEBUG: Extension loaded successfully: {}", extension_id);
let extension = Extension { let extension = Extension {
id: extension_id.clone(), id: extension_id.clone(),

View File

@ -57,13 +57,20 @@ pub struct ExtensionManifest {
pub name: String, pub name: String,
pub version: String, pub version: String,
pub author: Option<String>, pub author: Option<String>,
pub entry: String, #[serde(default = "default_entry_value")]
pub entry: Option<String>,
pub icon: Option<String>, pub icon: Option<String>,
pub public_key: String, pub public_key: String,
pub signature: String, pub signature: String,
pub permissions: ExtensionPermissions, pub permissions: ExtensionPermissions,
pub homepage: Option<String>, pub homepage: Option<String>,
pub description: Option<String>, pub description: Option<String>,
#[serde(default)]
pub single_instance: Option<bool>,
}
fn default_entry_value() -> Option<String> {
Some("index.html".to_string())
} }
impl ExtensionManifest { impl ExtensionManifest {
@ -155,7 +162,6 @@ impl ExtensionPermissions {
.and_then(|c| serde_json::from_value::<PermissionConstraints>(c.clone()).ok()), .and_then(|c| serde_json::from_value::<PermissionConstraints>(c.clone()).ok()),
status: p.status.clone().unwrap_or(PermissionStatus::Ask), status: p.status.clone().unwrap_or(PermissionStatus::Ask),
haex_timestamp: None, haex_timestamp: None,
haex_tombstone: None,
}) })
} }
} }
@ -173,6 +179,8 @@ pub struct ExtensionInfoResponse {
pub description: Option<String>, pub description: Option<String>,
pub homepage: Option<String>, pub homepage: Option<String>,
pub icon: Option<String>, pub icon: Option<String>,
pub entry: Option<String>,
pub single_instance: Option<bool>,
#[serde(skip_serializing_if = "Option::is_none")] #[serde(skip_serializing_if = "Option::is_none")]
pub dev_server_url: Option<String>, pub dev_server_url: Option<String>,
} }
@ -198,6 +206,8 @@ impl ExtensionInfoResponse {
description: extension.manifest.description.clone(), description: extension.manifest.description.clone(),
homepage: extension.manifest.homepage.clone(), homepage: extension.manifest.homepage.clone(),
icon: extension.manifest.icon.clone(), icon: extension.manifest.icon.clone(),
entry: extension.manifest.entry.clone(),
single_instance: extension.manifest.single_instance,
dev_server_url, dev_server_url,
}) })
} }

View File

@ -4,28 +4,13 @@ use std::{
}; };
// src-tauri/src/extension/crypto.rs // src-tauri/src/extension/crypto.rs
use crate::extension::error::ExtensionError;
use ed25519_dalek::{Signature, Verifier, VerifyingKey}; use ed25519_dalek::{Signature, Verifier, VerifyingKey};
use sha2::{Digest, Sha256}; use sha2::{Digest, Sha256};
pub struct ExtensionCrypto; pub struct ExtensionCrypto;
impl ExtensionCrypto { impl ExtensionCrypto {
/// Computes the hash of the public key (as in the SDK)
pub fn calculate_key_hash(public_key_hex: &str) -> Result<String, String> {
let public_key_bytes =
hex::decode(public_key_hex).map_err(|e| format!("Invalid public key hex: {}", e))?;
let public_key = VerifyingKey::from_bytes(&public_key_bytes.try_into().unwrap())
.map_err(|e| format!("Invalid public key: {}", e))?;
let mut hasher = Sha256::new();
hasher.update(public_key.as_bytes());
let result = hasher.finalize();
// First 20 hex characters (10 bytes) - as in the SDK
Ok(hex::encode(&result[..10]))
}
/// Verifies the extension signature /// Verifies the extension signature
pub fn verify_signature( pub fn verify_signature(
public_key_hex: &str, public_key_hex: &str,
@ -50,26 +35,64 @@ impl ExtensionCrypto {
} }
/// Computes the hash of a directory (for verification) /// Computes the hash of a directory (for verification)
pub fn hash_directory(dir: &Path) -> Result<String, String> { pub fn hash_directory(dir: &Path, manifest_path: &Path) -> Result<String, ExtensionError> {
// 1. Collect all file paths recursively // 1. Collect all file paths recursively
let mut all_files = Vec::new(); let mut all_files = Vec::new();
Self::collect_files_recursively(dir, &mut all_files) Self::collect_files_recursively(dir, &mut all_files)
.map_err(|e| format!("Failed to collect files: {}", e))?; .map_err(|e| ExtensionError::Filesystem { source: e })?;
all_files.sort();
// 2. Convert to relative paths for consistent sorting (as in the SDK)
let mut relative_files: Vec<(String, PathBuf)> = all_files
.into_iter()
.map(|path| {
let relative = path.strip_prefix(dir)
.unwrap_or(&path)
.to_string_lossy()
.to_string()
// Normalize path separators to Unix style (/) for cross-platform consistency
.replace('\\', "/");
(relative, path)
})
.collect();
// 3. Sort by relative path
relative_files.sort_by(|a, b| a.0.cmp(&b.0));
let mut hasher = Sha256::new(); let mut hasher = Sha256::new();
let manifest_path = dir.join("manifest.json");
// 2. Hash the contents of the sorted files // Canonicalize manifest path for comparison (important on Android where symlinks may differ)
for file_path in all_files { // Also ensure the canonical path is still within the allowed directory (security check)
if file_path == manifest_path { let canonical_manifest_path = manifest_path.canonicalize()
.unwrap_or_else(|_| manifest_path.to_path_buf());
// Security: Verify canonical manifest path is still within dir
let canonical_dir = dir.canonicalize()
.unwrap_or_else(|_| dir.to_path_buf());
if !canonical_manifest_path.starts_with(&canonical_dir) {
return Err(ExtensionError::ManifestError {
reason: format!("Manifest path resolves outside of extension directory (potential path traversal)"),
});
}
// 4. Hash the contents of the sorted files
for (_relative, file_path) in relative_files {
// Canonicalize file_path for comparison
let canonical_file_path = file_path.canonicalize()
.unwrap_or_else(|_| file_path.clone());
if canonical_file_path == canonical_manifest_path {
// FOR MANIFEST.JSON: // FOR MANIFEST.JSON:
let content_str = fs::read_to_string(&file_path) let content_str = fs::read_to_string(&file_path)
.map_err(|e| format!("Cannot read manifest file: {}", e))?; .map_err(|e| ExtensionError::Filesystem { source: e })?;
// Parse into a generic JSON value // Parse into a generic JSON value
let mut manifest: serde_json::Value = serde_json::from_str(&content_str) let mut manifest: serde_json::Value =
.map_err(|e| format!("Cannot parse manifest JSON: {}", e))?; serde_json::from_str(&content_str).map_err(|e| {
ExtensionError::ManifestError {
reason: format!("Cannot parse manifest JSON: {}", e),
}
})?;
// Remove or clear the signature field to obtain the "canonical content" // Remove or clear the signature field to obtain the "canonical content"
if let Some(obj) = manifest.as_object_mut() { if let Some(obj) = manifest.as_object_mut() {
@ -80,13 +103,23 @@ impl ExtensionCrypto {
} }
// Serialize the modified manifest back (with 2 spaces, as in JS) // serde_json automatically sorts the keys alphabetically
let canonical_manifest_content = serde_json::to_string_pretty(&manifest).unwrap(); // serde_json sortiert die Keys automatisch alphabetisch
println!("canonical_manifest_content: {}", canonical_manifest_content); let canonical_manifest_content =
hasher.update(canonical_manifest_content.as_bytes()); serde_json::to_string_pretty(&manifest).map_err(|e| {
ExtensionError::ManifestError {
reason: format!("Failed to serialize manifest: {}", e),
}
})?;
// Normalize line endings to Unix style (\n), as Node.js JSON.stringify does
// This is important for cross-platform consistency (desktop vs Android)
let normalized_content = canonical_manifest_content.replace("\r\n", "\n");
hasher.update(normalized_content.as_bytes());
} else { } else {
// FOR ALL OTHER FILES: // FOR ALL OTHER FILES:
let content = fs::read(&file_path) let content =
.map_err(|e| format!("Cannot read file {}: {}", file_path.display(), e))?; fs::read(&file_path).map_err(|e| ExtensionError::Filesystem { source: e })?;
hasher.update(&content); hasher.update(&content);
} }
} }
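The cross-platform ordering relies on comparing relative paths with forward slashes; a tiny sketch of just that normalization step (the helper name is made up):
fn relative_hash_key(base: &std::path::Path, file: &std::path::Path) -> String {
    file.strip_prefix(base)
        .unwrap_or(file)
        .to_string_lossy()
        .replace('\\', "/")
}
// "src\\app.js" on Windows and "src/app.js" on Linux/Android both sort as "src/app.js".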

View File

@ -1,4 +1,4 @@
// src-tauri/src/extension/database/executor.rs (new) // src-tauri/src/extension/database/executor.rs
use crate::crdt::hlc::HlcService; use crate::crdt::hlc::HlcService;
use crate::crdt::transformer::CrdtTransformer; use crate::crdt::transformer::CrdtTransformer;
@ -7,114 +7,27 @@ use crate::database::core::{convert_value_ref_to_json, parse_sql_statements, Val
use crate::database::error::DatabaseError; use crate::database::error::DatabaseError;
use rusqlite::{params_from_iter, types::Value as SqliteValue, ToSql, Transaction}; use rusqlite::{params_from_iter, types::Value as SqliteValue, ToSql, Transaction};
use serde_json::Value as JsonValue; use serde_json::Value as JsonValue;
use sqlparser::ast::{Insert, Statement, TableObject}; use sqlparser::ast::Statement;
use std::collections::{HashMap, HashSet}; use std::collections::HashSet;
/// Represents the PK values for one row (can be a single or composite key)
#[derive(Debug, Clone, PartialEq, Eq)]
struct PkValues {
/// column_name -> value
values: HashMap<String, String>,
}
impl PkValues {
fn new() -> Self {
Self {
values: HashMap::new(),
}
}
fn insert(&mut self, column: String, value: String) {
self.values.insert(column, value);
}
fn get(&self, column: &str) -> Option<&String> {
self.values.get(column)
}
}
/// Context for PK remapping during a transaction
/// Tracks per table: which PKs were supposed to be inserted vs. which actually ended up in the DB
#[derive(Debug, Default)]
pub struct PkRemappingContext {
/// Per table: list of (original_pk_values, actual_pk_values) mappings
/// Only stored when original != actual (i.e. ON CONFLICT changed the PK)
mappings: HashMap<String, Vec<(PkValues, PkValues)>>,
}
impl PkRemappingContext {
pub fn new() -> Self {
Self::default()
}
/// Adds a mapping for a table, but only when original != actual
/// original and actual are the PK values before and after the INSERT
fn add_mapping(&mut self, table: String, original: PkValues, actual: PkValues) {
// Nur speichern wenn tatsächlich unterschiedlich (ON CONFLICT hat stattgefunden)
if original != actual {
eprintln!(
"DEBUG: PK Remapping for table '{}': {:?} -> {:?}",
table, original.values, actual.values
);
self.mappings
.entry(table)
.or_insert_with(Vec::new)
.push((original, actual));
}
}
/// Tries to remap an FK value
/// referenced_table: the table the FK points to
/// referenced_column: the PK column in the referenced_table
/// value: the FK value that should be replaced
fn remap_fk_value(
&self,
referenced_table: &str,
referenced_column: &str,
value: &str,
) -> String {
self.mappings
.get(referenced_table)
.and_then(|mappings| {
mappings.iter().find_map(|(original, actual)| {
if original.get(referenced_column)? == value {
let actual_val = actual.get(referenced_column)?.clone();
eprintln!(
"DEBUG: FK Remapping for {}.{}: {} -> {}",
referenced_table, referenced_column, value, actual_val
);
Some(actual_val)
} else {
None
}
})
})
.unwrap_or_else(|| value.to_string())
}
}
/// SQL executor WITHOUT permission checks - for internal use /// SQL executor WITHOUT permission checks - for internal use
pub struct SqlExecutor; pub struct SqlExecutor;
impl SqlExecutor { impl SqlExecutor {
/// Executes a SQL statement WITHOUT RETURNING (with CRDT and PK remapping) /// Executes a SQL statement WITHOUT RETURNING (with CRDT)
/// Supports automatic FK remapping when previous INSERTs triggered ON CONFLICT
///
/// This variant accepts &[&dyn ToSql] directly (as produced by rusqlite::params![])
/// Returns: modified_schema_tables /// Returns: modified_schema_tables
pub fn execute_internal_typed_with_context( pub fn execute_internal_typed(
tx: &Transaction, tx: &Transaction,
hlc_service: &HlcService, hlc_service: &HlcService,
sql: &str, sql: &str,
params: &[&dyn ToSql], params: &[&dyn ToSql],
pk_context: &mut PkRemappingContext,
) -> Result<HashSet<String>, DatabaseError> { ) -> Result<HashSet<String>, DatabaseError> {
let mut ast_vec = parse_sql_statements(sql)?; let mut ast_vec = parse_sql_statements(sql)?;
if ast_vec.len() != 1 { if ast_vec.len() != 1 {
return Err(DatabaseError::ExecutionError { return Err(DatabaseError::ExecutionError {
sql: sql.to_string(), sql: sql.to_string(),
reason: "execute_internal_typed_with_context sollte nur ein einzelnes SQL-Statement erhalten" reason: "execute_internal_typed should only receive a single SQL statement"
.to_string(), .to_string(),
table: None, table: None,
}); });
@ -134,96 +47,50 @@ impl SqlExecutor {
if let Some(table_name) = transformer.transform_execute_statement_with_table_info( if let Some(table_name) = transformer.transform_execute_statement_with_table_info(
&mut statement, &mut statement,
&hlc_timestamp, &hlc_timestamp,
tx,
)? { )? {
modified_schema_tables.insert(table_name); modified_schema_tables.insert(table_name);
} }
let sql_str = statement.to_string(); let sql_str = statement.to_string();
eprintln!("DEBUG: Transformed SQL: {}", sql_str); eprintln!("DEBUG: Transformed execute SQL: {}", sql_str);
// Special handling for INSERT statements (with FK remapping, WITHOUT RETURNING) // Execute the statement
if let Statement::Insert(ref insert_stmt) = statement {
if let TableObject::TableName(ref table_name) = insert_stmt.table {
let table_name_str = table_name
.to_string()
.trim_matches('`')
.trim_matches('"')
.to_string();
// Convert params to a Vec for manipulation
let mut param_vec = params_to_vec(params, tx)?;
// Fetch foreign key information
let fk_info = get_fk_info(tx, &table_name_str)?;
// Remap FK values in params (if mappings exist)
remap_fk_params(insert_stmt, &mut param_vec, &fk_info, pk_context)?;
// Execute the INSERT with execute()
let param_refs: Vec<&dyn ToSql> =
param_vec.iter().map(|v| v as &dyn ToSql).collect();
let mut stmt = tx
.prepare(&sql_str)
.map_err(|e| DatabaseError::ExecutionError {
sql: sql_str.clone(),
table: Some(table_name_str.clone()),
reason: format!("Prepare failed: {}", e),
})?;
let _ = stmt
.query(params_from_iter(param_refs.iter()))
.map_err(|e| DatabaseError::ExecutionError {
sql: sql_str.clone(),
table: Some(table_name_str.clone()),
reason: format!("Query execution failed: {}", e),
})?;
/* tx.execute(&sql_str, params_from_iter(param_refs.iter()))
.map_err(|e| DatabaseError::ExecutionError {
sql: sql_str.clone(),
table: Some(table_name_str.clone()),
reason: e.to_string(),
})?; */
}
} else {
// Execute non-INSERT statements normally
tx.execute(&sql_str, params) tx.execute(&sql_str, params)
.map_err(|e| DatabaseError::ExecutionError { .map_err(|e| DatabaseError::ExecutionError {
sql: sql_str.clone(), sql: sql_str.clone(),
table: None, table: None,
reason: e.to_string(), reason: format!("Execute failed: {}", e),
})?; })?;
}
// Trigger logic for CREATE TABLE // Trigger logic for CREATE TABLE
if let Statement::CreateTable(create_table_details) = statement { if let Statement::CreateTable(create_table_details) = statement {
let table_name_str = create_table_details.name.to_string(); let raw_name = create_table_details.name.to_string();
// Remove quotes from table name
let table_name_str = raw_name
.trim_matches('"')
.trim_matches('`')
.to_string();
eprintln!("DEBUG: Setting up triggers for table: {}", table_name_str);
trigger::setup_triggers_for_table(tx, &table_name_str, false)?; trigger::setup_triggers_for_table(tx, &table_name_str, false)?;
} }
Ok(modified_schema_tables) Ok(modified_schema_tables)
} }
/// Executes a SQL statement WITH RETURNING (with CRDT and PK remapping) /// Executes a SQL statement WITH RETURNING (with CRDT)
/// Supports automatic FK remapping when previous INSERTs triggered ON CONFLICT
///
/// This variant accepts &[&dyn ToSql] directly (as produced by rusqlite::params![])
/// Returns: (modified_schema_tables, returning_results) /// Returns: (modified_schema_tables, returning_results)
/// returning_results contains ALL RETURNING columns for INSERT/UPDATE/DELETE with RETURNING pub fn query_internal_typed(
pub fn query_internal_typed_with_context(
tx: &Transaction, tx: &Transaction,
hlc_service: &HlcService, hlc_service: &HlcService,
sql: &str, sql: &str,
params: &[&dyn ToSql], params: &[&dyn ToSql],
pk_context: &mut PkRemappingContext,
) -> Result<(HashSet<String>, Vec<Vec<JsonValue>>), DatabaseError> { ) -> Result<(HashSet<String>, Vec<Vec<JsonValue>>), DatabaseError> {
let mut ast_vec = parse_sql_statements(sql)?; let mut ast_vec = parse_sql_statements(sql)?;
if ast_vec.len() != 1 { if ast_vec.len() != 1 {
return Err(DatabaseError::ExecutionError { return Err(DatabaseError::ExecutionError {
sql: sql.to_string(), sql: sql.to_string(),
reason: "query_internal_typed_with_context sollte nur ein einzelnes SQL-Statement erhalten" reason: "query_internal_typed should only receive a single SQL statement"
.to_string(), .to_string(),
table: None, table: None,
}); });
@ -243,7 +110,6 @@ impl SqlExecutor {
if let Some(table_name) = transformer.transform_execute_statement_with_table_info( if let Some(table_name) = transformer.transform_execute_statement_with_table_info(
&mut statement, &mut statement,
&hlc_timestamp, &hlc_timestamp,
tx,
)? { )? {
modified_schema_tables.insert(table_name); modified_schema_tables.insert(table_name);
} }
@ -251,483 +117,168 @@ impl SqlExecutor {
let sql_str = statement.to_string(); let sql_str = statement.to_string();
eprintln!("DEBUG: Transformed SQL (with RETURNING): {}", sql_str); eprintln!("DEBUG: Transformed SQL (with RETURNING): {}", sql_str);
// Special handling for INSERT statements (with PK remapping + RETURNING) // Prepare and run the query
if let Statement::Insert(ref insert_stmt) = statement {
if let TableObject::TableName(ref table_name) = insert_stmt.table {
let table_name_str = table_name
.to_string()
.trim_matches('`')
.trim_matches('"')
.to_string();
// Convert params to a Vec for manipulation
let mut param_vec = params_to_vec(params, tx)?;
// Fetch the table schema to identify PKs and FKs
let table_columns =
trigger::get_table_schema(tx, &table_name_str).map_err(|e| {
DatabaseError::ExecutionError {
sql: format!("PRAGMA table_info('{}')", table_name_str),
reason: e.to_string(),
table: Some(table_name_str.clone()),
}
})?;
let pk_columns: Vec<String> = table_columns
.iter()
.filter(|c| c.is_pk)
.map(|c| c.name.clone())
.collect();
// Fetch foreign key information
let fk_info = get_fk_info(tx, &table_name_str)?;
// 1. Extract the original PK values from params (before FK remapping)
let original_pk =
extract_pk_values_from_params(insert_stmt, &param_vec, &pk_columns)?;
// 2. Remap FK values in params (if mappings exist)
remap_fk_params(insert_stmt, &mut param_vec, &fk_info, pk_context)?;
// 3. Execute the INSERT with query() to read RETURNING let mut stmt = tx
let mut stmt = tx let mut stmt = tx
.prepare(&sql_str) .prepare(&sql_str)
.map_err(|e| DatabaseError::ExecutionError { .map_err(|e| DatabaseError::ExecutionError {
sql: sql_str.clone(), sql: sql_str.clone(),
table: Some(table_name_str.clone()), table: None,
reason: e.to_string(), reason: e.to_string(),
})?; })?;
let num_columns = stmt.column_count(); let column_names: Vec<String> = stmt
let param_refs: Vec<&dyn ToSql> = .column_names()
param_vec.iter().map(|v| v as &dyn ToSql).collect(); .into_iter()
.map(|s| s.to_string())
.collect();
let num_columns = column_names.len();
let mut rows = stmt let mut rows = stmt
.query(params_from_iter(param_refs.iter())) .query(params_from_iter(params.iter()))
.map_err(|e| DatabaseError::ExecutionError { .map_err(|e| DatabaseError::ExecutionError {
sql: sql_str.clone(), sql: sql_str.clone(),
table: Some(table_name_str.clone()), table: None,
reason: e.to_string(), reason: e.to_string(),
})?; })?;
let mut result_vec: Vec<Vec<JsonValue>> = Vec::new(); let mut result_vec: Vec<Vec<JsonValue>> = Vec::new();
// 4. Read ALL RETURNING values and store the PK mapping // Read all RETURNING rows
while let Some(row) = rows.next().map_err(|e| DatabaseError::ExecutionError { while let Some(row) = rows.next().map_err(|e| DatabaseError::ExecutionError {
sql: sql_str.clone(), sql: sql_str.clone(),
table: Some(table_name_str.clone()), table: None,
reason: e.to_string(), reason: e.to_string(),
})? { })? {
// Extract PK values for PK remapping let mut row_values: Vec<JsonValue> = Vec::new();
let actual_pk = extract_pk_values_from_row(&row, &pk_columns)?;
pk_context.add_mapping(
table_name_str.clone(),
original_pk.clone(),
actual_pk.clone(),
);
// Extract ALL columns for the RETURNING result
let mut row_values: Vec<JsonValue> = Vec::with_capacity(num_columns);
for i in 0..num_columns { for i in 0..num_columns {
let value_ref = let value_ref = row.get_ref(i).map_err(|e| DatabaseError::ExecutionError {
row.get_ref(i) sql: sql_str.clone(),
.map_err(|e| DatabaseError::RowProcessingError { table: None,
reason: format!("Failed to get column {}: {}", i, e), reason: e.to_string(),
})?; })?;
let json_val = convert_value_ref_to_json(value_ref)?; let json_value = convert_value_ref_to_json(value_ref)?;
row_values.push(json_val); row_values.push(json_value);
} }
result_vec.push(row_values); result_vec.push(row_values);
} }
return Ok((modified_schema_tables, result_vec)); // Trigger logic for CREATE TABLE
} if let Statement::CreateTable(create_table_details) = statement {
} let raw_name = create_table_details.name.to_string();
// Remove quotes from table name
// For UPDATE/DELETE with RETURNING: use query() (no PK remapping needed) let table_name_str = raw_name
let mut stmt = tx .trim_matches('"')
.prepare(&sql_str) .trim_matches('`')
.map_err(|e| DatabaseError::PrepareError { .to_string();
reason: e.to_string(), eprintln!("DEBUG: Setting up triggers for table (RETURNING): {}", table_name_str);
})?; trigger::setup_triggers_for_table(tx, &table_name_str, false)?;
let num_columns = stmt.column_count();
let mut rows = stmt.query(params).map_err(|e| DatabaseError::QueryError {
reason: e.to_string(),
})?;
let mut result_vec: Vec<Vec<JsonValue>> = Vec::new();
while let Some(row) = rows.next().map_err(|e| DatabaseError::RowProcessingError {
reason: format!("Row iteration error: {}", e),
})? {
let mut row_values: Vec<JsonValue> = Vec::with_capacity(num_columns);
for i in 0..num_columns {
let value_ref = row
.get_ref(i)
.map_err(|e| DatabaseError::RowProcessingError {
reason: format!("Failed to get column {}: {}", i, e),
})?;
let json_val = convert_value_ref_to_json(value_ref)?;
row_values.push(json_val);
}
result_vec.push(row_values);
} }
Ok((modified_schema_tables, result_vec)) Ok((modified_schema_tables, result_vec))
} }
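As a usage illustration for the typed entry point above, here is a minimal sketch of driving query_internal_typed from an open transaction. It is only a sketch: the notes table and its columns are made up, and the crate-local HlcService, DatabaseError and SqlExecutor types are assumed to be in scope; rusqlite's params! macro supplies the &[&dyn ToSql] slice the signature expects.

use rusqlite::{params, Connection};

// Sketch only: `notes` and its columns are hypothetical; HlcService, DatabaseError
// and SqlExecutor are the crate-local types used throughout this module.
fn insert_note(conn: &mut Connection, hlc_service: &HlcService) -> Result<(), DatabaseError> {
    let tx = conn.transaction().map_err(DatabaseError::from)?;
    let (modified_tables, rows) = SqlExecutor::query_internal_typed(
        &tx,
        hlc_service,
        "INSERT INTO notes (id, title) VALUES (?, ?) RETURNING id, title",
        params!["note-1", "Hello"],
    )?;
    // `rows` is Vec<Vec<JsonValue>>: one inner Vec per RETURNING row.
    println!("touched: {:?}, returned: {:?}", modified_tables, rows);
    tx.commit().map_err(DatabaseError::from)?;
    Ok(())
}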
/// Legacy method without PK remapping context /// Executes a single SQL statement WITHOUT type information (JSON params)
pub fn execute_internal_typed(
tx: &Transaction,
hlc_service: &HlcService,
sql: &str,
params: &[&dyn ToSql],
) -> Result<HashSet<String>, DatabaseError> {
let mut context = PkRemappingContext::new();
Self::execute_internal_typed_with_context(tx, hlc_service, sql, params, &mut context)
}
/// Executes SQL (with CRDT transformation) - WITHOUT permission check
/// Wrapper around execute_internal_typed for JsonValue parameters
/// Uses PK remapping logic for INSERT with ON CONFLICT
pub fn execute_internal( pub fn execute_internal(
tx: &Transaction, tx: &Transaction,
hlc_service: &HlcService, hlc_service: &HlcService,
sql: &str, sql: &str,
params: &[JsonValue], params: &[JsonValue],
) -> Result<HashSet<String>, DatabaseError> { ) -> Result<HashSet<String>, DatabaseError> {
// Parameter validation let sql_params: Vec<SqliteValue> = params
let total_placeholders = sql.matches('?').count();
if total_placeholders != params.len() {
return Err(DatabaseError::ParameterMismatchError {
expected: total_placeholders,
provided: params.len(),
sql: sql.to_string(),
});
}
// Convert JsonValue params to SqliteValue
let params_converted: Vec<SqliteValue> = params
.iter() .iter()
.map(ValueConverter::json_to_rusqlite_value) .map(|v| crate::database::core::ValueConverter::json_to_rusqlite_value(v))
.collect::<Result<Vec<_>, _>>()?; .collect::<Result<Vec<_>, _>>()?;
let param_refs: Vec<&dyn ToSql> = sql_params.iter().map(|p| p as &dyn ToSql).collect();
// Convert to &dyn ToSql references
let param_refs: Vec<&dyn ToSql> =
params_converted.iter().map(|v| v as &dyn ToSql).collect();
// Call execute_internal_typed (with PK remapping!)
Self::execute_internal_typed(tx, hlc_service, sql, &param_refs) Self::execute_internal_typed(tx, hlc_service, sql, &param_refs)
} }
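For the JSON wrapper just above, a hedged calling sketch follows; the UPDATE statement and values are illustrative only, and the crate-local types are again assumed to be in scope.

use serde_json::json;

// Sketch only: assumes an open transaction plus the crate-local HlcService,
// SqlExecutor and DatabaseError types; the statement and values are hypothetical.
fn rename_note(
    tx: &rusqlite::Transaction,
    hlc_service: &HlcService,
) -> Result<(), DatabaseError> {
    let params = vec![json!("Updated title"), json!("note-1")];
    let modified_tables = SqlExecutor::execute_internal(
        tx,
        hlc_service,
        "UPDATE notes SET title = ? WHERE id = ?",
        &params,
    )?;
    // The returned HashSet lists the schema tables reported by the transformer.
    eprintln!("modified tables: {:?}", modified_tables);
    Ok(())
}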
/// Executes SELECT (with CRDT transformation) - WITHOUT permission check /// Query variant (with RETURNING) WITHOUT type information (JSON params)
pub fn select_internal(
conn: &rusqlite::Connection,
sql: &str,
params: &[JsonValue],
) -> Result<Vec<JsonValue>, DatabaseError> {
// Parameter validation
let total_placeholders = sql.matches('?').count();
if total_placeholders != params.len() {
return Err(DatabaseError::ParameterMismatchError {
expected: total_placeholders,
provided: params.len(),
sql: sql.to_string(),
});
}
let mut ast_vec = parse_sql_statements(sql)?;
if ast_vec.is_empty() {
return Ok(vec![]);
}
// Validate that all statements are queries
for stmt in &ast_vec {
if !matches!(stmt, Statement::Query(_)) {
return Err(DatabaseError::ExecutionError {
sql: sql.to_string(),
reason: "Only SELECT statements are allowed".to_string(),
table: None,
});
}
}
let sql_params = ValueConverter::convert_params(params)?;
let transformer = CrdtTransformer::new();
let last_statement = ast_vec.pop().unwrap();
let mut stmt_to_execute = last_statement;
transformer.transform_select_statement(&mut stmt_to_execute)?;
let transformed_sql = stmt_to_execute.to_string();
let mut prepared_stmt =
conn.prepare(&transformed_sql)
.map_err(|e| DatabaseError::ExecutionError {
sql: transformed_sql.clone(),
reason: e.to_string(),
table: None,
})?;
let column_names: Vec<String> = prepared_stmt
.column_names()
.into_iter()
.map(|s| s.to_string())
.collect();
let rows = prepared_stmt
.query_map(params_from_iter(sql_params.iter()), |row| {
crate::extension::database::row_to_json_value(row, &column_names)
})
.map_err(|e| DatabaseError::QueryError {
reason: e.to_string(),
})?;
let mut results = Vec::new();
for row_result in rows {
results.push(row_result.map_err(|e| DatabaseError::RowProcessingError {
reason: e.to_string(),
})?);
}
Ok(results)
}
/// Executes SQL with CRDT transformation and returns the RETURNING results
/// Specifically for INSERT/UPDATE/DELETE with RETURNING (Drizzle integration)
/// Uses PK remapping for INSERT operations
pub fn query_internal( pub fn query_internal(
tx: &Transaction, tx: &Transaction,
hlc_service: &HlcService, hlc_service: &HlcService,
sql: &str, sql: &str,
params: &[JsonValue], params: &[JsonValue],
) -> Result<Vec<Vec<JsonValue>>, DatabaseError> { ) -> Result<(HashSet<String>, Vec<Vec<JsonValue>>), DatabaseError> {
// Parameter validation let sql_params: Vec<SqliteValue> = params
let total_placeholders = sql.matches('?').count(); .iter()
if total_placeholders != params.len() { .map(|v| crate::database::core::ValueConverter::json_to_rusqlite_value(v))
return Err(DatabaseError::ParameterMismatchError { .collect::<Result<Vec<_>, _>>()?;
expected: total_placeholders, let param_refs: Vec<&dyn ToSql> = sql_params.iter().map(|p| p as &dyn ToSql).collect();
provided: params.len(), Self::query_internal_typed(tx, hlc_service, sql, &param_refs)
sql: sql.to_string(), }
/// Executes multiple SQL statements as a batch
pub fn execute_batch_internal(
tx: &Transaction,
hlc_service: &HlcService,
sqls: &[String],
params: &[Vec<JsonValue>],
) -> Result<HashSet<String>, DatabaseError> {
if sqls.len() != params.len() {
return Err(DatabaseError::ExecutionError {
sql: format!("{} statements but {} param sets", sqls.len(), params.len()),
reason: "Statement count and parameter count mismatch".to_string(),
table: None,
}); });
} }
// Convert parameters let mut all_modified_tables = HashSet::new();
let params_converted: Vec<SqliteValue> = params
for (sql, param_set) in sqls.iter().zip(params.iter()) {
let modified_tables = Self::execute_internal(tx, hlc_service, sql, param_set)?;
all_modified_tables.extend(modified_tables);
}
Ok(all_modified_tables)
}
/// Query for SELECT statements (read-only, no CRDT needed apart from filters)
pub fn query_select(
conn: &rusqlite::Connection,
sql: &str,
params: &[JsonValue],
) -> Result<Vec<Vec<JsonValue>>, DatabaseError> {
let mut ast_vec = parse_sql_statements(sql)?;
if ast_vec.len() != 1 {
return Err(DatabaseError::ExecutionError {
sql: sql.to_string(),
reason: "query_select should only receive a single SELECT statement".to_string(),
table: None,
});
}
// Hard delete: no SELECT transformation needed anymore
let stmt_to_execute = ast_vec.pop().unwrap();
let transformed_sql = stmt_to_execute.to_string();
eprintln!("DEBUG: SELECT (no transformation): {}", transformed_sql);
// Convert JSON params to SQLite values
let sql_params: Vec<SqliteValue> = params
.iter() .iter()
.map(ValueConverter::json_to_rusqlite_value) .map(|v| crate::database::core::ValueConverter::json_to_rusqlite_value(v))
.collect::<Result<Vec<_>, _>>()?; .collect::<Result<Vec<_>, _>>()?;
// Convert to &dyn ToSql references let mut prepared_stmt = conn.prepare(&transformed_sql)?;
let param_refs: Vec<&dyn ToSql> =
params_converted.iter().map(|v| v as &dyn ToSql).collect();
// Call query_internal_typed_with_context (with PK remapping!) let num_columns = prepared_stmt.column_count();
let mut context = PkRemappingContext::new();
let (_tables, results) = Self::query_internal_typed_with_context(
tx,
hlc_service,
sql,
&param_refs,
&mut context,
)?;
Ok(results) let param_refs: Vec<&dyn ToSql> = sql_params.iter().map(|p| p as &dyn ToSql).collect();
let mut rows = prepared_stmt.query(params_from_iter(param_refs.iter()))?;
let mut result: Vec<Vec<JsonValue>> = Vec::new();
while let Some(row) = rows.next()? {
let mut row_values: Vec<JsonValue> = Vec::new();
for i in 0..num_columns {
let value_ref = row.get_ref(i)?;
let json_value = convert_value_ref_to_json(value_ref)?;
row_values.push(json_value);
} }
result.push(row_values);
} }
// ========================= Ok(result)
// Helper functions for FK remapping
// =========================
/// Structures FK information for simple lookups
#[derive(Debug)]
struct FkInfo {
/// column_name -> (referenced_table, referenced_column)
mappings: HashMap<String, (String, String)>,
}
/// Fetches foreign key information for a table
fn get_fk_info(tx: &Transaction, table_name: &str) -> Result<FkInfo, DatabaseError> {
// Use PRAGMA foreign_key_list to fetch FK relationships
let sql = format!("PRAGMA foreign_key_list('{}');", table_name);
let mut stmt = tx
.prepare(&sql)
.map_err(|e| DatabaseError::ExecutionError {
sql: sql.clone(),
reason: e.to_string(),
table: Some(table_name.to_string()),
})?;
let mut mappings = HashMap::new();
let rows = stmt
.query_map([], |row| {
Ok((
row.get::<_, String>("from")?, // FK column in this table
row.get::<_, String>("table")?, // referenced table
row.get::<_, String>("to")?, // referenced column
))
})
.map_err(|e| DatabaseError::ExecutionError {
sql,
reason: e.to_string(),
table: Some(table_name.to_string()),
})?;
for row in rows {
let (from_col, ref_table, ref_col) = row.map_err(|e| DatabaseError::ExecutionError {
sql: format!("PRAGMA foreign_key_list('{}')", table_name),
reason: e.to_string(),
table: Some(table_name.to_string()),
})?;
mappings.insert(from_col, (ref_table, ref_col));
}
Ok(FkInfo { mappings })
}
/// Converts &[&dyn ToSql] to Vec<SqliteValue> for manipulation
/// Uses a dummy query to extract the parameter values
fn params_to_vec(
params: &[&dyn ToSql],
tx: &Transaction,
) -> Result<Vec<SqliteValue>, DatabaseError> {
let mut values = Vec::new();
// Build a dummy query with exactly as many placeholders as we have parameters
// e.g. "SELECT ?, ?, ?"
if params.is_empty() {
return Ok(values);
}
let placeholders = vec!["?"; params.len()].join(", ");
let dummy_sql = format!("SELECT {}", placeholders);
let mut stmt = tx
.prepare(&dummy_sql)
.map_err(|e| DatabaseError::ExecutionError {
sql: dummy_sql.clone(),
reason: format!("Failed to prepare dummy query: {}", e),
table: None,
})?;
// Execute the query and extract the values from the row
let mut rows = stmt
.query(params)
.map_err(|e| DatabaseError::ExecutionError {
sql: dummy_sql.clone(),
reason: format!("Failed to execute dummy query: {}", e),
table: None,
})?;
if let Some(row) = rows.next().map_err(|e| DatabaseError::ExecutionError {
sql: dummy_sql,
reason: format!("Failed to read dummy query result: {}", e),
table: None,
})? {
// Extract all column values
for i in 0..params.len() {
let value: SqliteValue = row.get(i).map_err(|e| DatabaseError::ExecutionError {
sql: format!("SELECT ..."),
reason: format!("Failed to extract value at index {}: {}", i, e),
table: None,
})?;
values.push(value);
} }
} }
Ok(values)
}
/// Extracts the PK values from the INSERT parameters
fn extract_pk_values_from_params(
insert_stmt: &Insert,
params: &[SqliteValue],
pk_columns: &[String],
) -> Result<PkValues, DatabaseError> {
let mut pk_values = PkValues::new();
// Find the positions of the PK columns in the INSERT column list
for pk_col in pk_columns {
if let Some(pos) = insert_stmt.columns.iter().position(|c| &c.value == pk_col) {
// Fetch the parameter value at this position
if pos < params.len() {
// Convert SqliteValue to String
let value_str = value_to_string(&params[pos]);
pk_values.insert(pk_col.clone(), value_str);
}
}
}
Ok(pk_values)
}
/// Remaps FK values in the parameters based on the PK remapping context
fn remap_fk_params(
insert_stmt: &Insert,
params: &mut Vec<SqliteValue>,
fk_info: &FkInfo,
pk_context: &PkRemappingContext,
) -> Result<(), DatabaseError> {
// For each FK column: check whether remapping is needed
for (col_name, (ref_table, ref_col)) in &fk_info.mappings {
// Find the position of the FK column in the INSERT column list
if let Some(pos) = insert_stmt
.columns
.iter()
.position(|c| &c.value == col_name)
{
if pos < params.len() {
// Fetch the current FK value (as a String)
let current_value = value_to_string(&params[pos]);
// Try to remap
let new_value = pk_context.remap_fk_value(ref_table, ref_col, &current_value);
if new_value != current_value {
// Replace the parameter value
params[pos] = SqliteValue::Text(new_value);
eprintln!(
"DEBUG: Remapped FK {}={} to {:?}",
col_name, current_value, params[pos]
);
}
}
}
}
Ok(())
}
/// Helper function: converts SqliteValue to String for comparisons
fn value_to_string(value: &SqliteValue) -> String {
match value {
SqliteValue::Null => "NULL".to_string(),
SqliteValue::Integer(i) => i.to_string(),
SqliteValue::Real(r) => r.to_string(),
SqliteValue::Text(s) => s.clone(),
SqliteValue::Blob(b) => format!("BLOB({} bytes)", b.len()),
}
}
/// Extracts PK values from a RETURNING row
fn extract_pk_values_from_row(
row: &rusqlite::Row,
pk_columns: &[String],
) -> Result<PkValues, DatabaseError> {
let mut pk_values = PkValues::new();
for (idx, pk_col) in pk_columns.iter().enumerate() {
// RETURNING returns PKs in the order in which they appear in the RETURNING clause
let value: String = row.get(idx).map_err(|e| DatabaseError::ExecutionError {
sql: "RETURNING clause".to_string(),
reason: format!("Failed to extract PK column '{}': {}", pk_col, e),
table: None,
})?;
pk_values.insert(pk_col.clone(), value);
}
Ok(pk_values)
}
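The removed get_fk_info helper above builds its lookup map from SQLite's PRAGMA foreign_key_list. For reference, a standalone sketch of reading that pragma with rusqlite (using a plain Connection instead of a Transaction; the table name is illustrative) could look like this:

use rusqlite::Connection;
use std::collections::HashMap;

// Returns fk_column -> (referenced_table, referenced_column), mirroring FkInfo above.
// Note: the "to" column of foreign_key_list is NULL when the FK references an
// implicit primary key; like the original helper, this sketch expects it to be set.
fn foreign_keys_of(
    conn: &Connection,
    table: &str,
) -> rusqlite::Result<HashMap<String, (String, String)>> {
    let mut stmt = conn.prepare(&format!("PRAGMA foreign_key_list('{}')", table))?;
    let rows = stmt.query_map([], |row| {
        Ok((
            row.get::<_, String>("from")?,  // FK column in this table
            row.get::<_, String>("table")?, // referenced table
            row.get::<_, String>("to")?,    // referenced column
        ))
    })?;
    let mut map = HashMap::new();
    for row in rows {
        let (from_col, ref_table, ref_col) = row?;
        map.insert(from_col, (ref_table, ref_col));
    }
    Ok(map)
}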


@ -5,6 +5,7 @@ use crate::crdt::transformer::CrdtTransformer;
use crate::crdt::trigger; use crate::crdt::trigger;
use crate::database::core::{parse_sql_statements, with_connection, ValueConverter}; use crate::database::core::{parse_sql_statements, with_connection, ValueConverter};
use crate::database::error::DatabaseError; use crate::database::error::DatabaseError;
use crate::extension::database::executor::SqlExecutor;
use crate::extension::error::ExtensionError; use crate::extension::error::ExtensionError;
use crate::extension::permissions::validator::SqlPermissionValidator; use crate::extension::permissions::validator::SqlPermissionValidator;
use crate::AppState; use crate::AppState;
@ -110,7 +111,7 @@ pub async fn extension_sql_execute(
public_key: String, public_key: String,
name: String, name: String,
state: State<'_, AppState>, state: State<'_, AppState>,
) -> Result<Vec<String>, ExtensionError> { ) -> Result<Vec<Vec<JsonValue>>, ExtensionError> {
// Get extension to retrieve its ID // Get extension to retrieve its ID
let extension = state let extension = state
.extension_manager .extension_manager
@ -129,42 +130,72 @@ pub async fn extension_sql_execute(
// SQL parsing // SQL parsing
let mut ast_vec = parse_sql_statements(sql)?; let mut ast_vec = parse_sql_statements(sql)?;
if ast_vec.len() != 1 {
return Err(ExtensionError::Database {
source: DatabaseError::ExecutionError {
sql: sql.to_string(),
reason: "extension_sql_execute should only receive a single SQL statement"
.to_string(),
table: None,
},
});
}
let mut statement = ast_vec.pop().unwrap();
// Check if statement has RETURNING clause
let has_returning = crate::database::core::statement_has_returning(&statement);
// Database operation // Database operation
with_connection(&state.db, |conn| { with_connection(&state.db, |conn| {
let tx = conn.transaction().map_err(DatabaseError::from)?; let tx = conn.transaction().map_err(DatabaseError::from)?;
let transformer = CrdtTransformer::new(); let transformer = CrdtTransformer::new();
let executor = StatementExecutor::new(&tx);
// Get HLC service reference
let hlc_service = state.hlc.lock().map_err(|_| DatabaseError::MutexPoisoned {
reason: "Failed to lock HLC service".to_string(),
})?;
// Generate HLC timestamp // Generate HLC timestamp
let hlc_timestamp = state let hlc_timestamp = hlc_service
.hlc
.lock()
.unwrap()
.new_timestamp_and_persist(&tx) .new_timestamp_and_persist(&tx)
.map_err(|e| DatabaseError::HlcError { .map_err(|e| DatabaseError::HlcError {
reason: e.to_string(), reason: e.to_string(),
})?; })?;
// Transform statements // Transform statement
let mut modified_schema_tables = HashSet::new(); transformer.transform_execute_statement(&mut statement, &hlc_timestamp)?;
for statement in &mut ast_vec {
if let Some(table_name) =
transformer.transform_execute_statement(statement, &hlc_timestamp)?
{
modified_schema_tables.insert(table_name);
}
}
// Convert parameters // Convert parameters to references
let sql_values = ValueConverter::convert_params(&params)?; let sql_values = ValueConverter::convert_params(&params)?;
let param_refs: Vec<&dyn rusqlite::ToSql> = sql_values.iter().map(|v| v as &dyn rusqlite::ToSql).collect();
// Execute statements let result = if has_returning {
for statement in ast_vec { // Use query_internal for statements with RETURNING
executor.execute_statement_with_params(&statement, &sql_values)?; let (_, rows) = SqlExecutor::query_internal_typed(&tx, &hlc_service, &statement.to_string(), &param_refs)?;
rows
} else {
// Use execute_internal for statements without RETURNING
SqlExecutor::execute_internal_typed(&tx, &hlc_service, &statement.to_string(), &param_refs)?;
vec![]
};
// Handle CREATE TABLE trigger setup
if let Statement::CreateTable(ref create_table_details) = statement {
// Extract table name and remove quotes (both " and `)
let raw_name = create_table_details.name.to_string();
println!("DEBUG: Raw table name from AST: {:?}", raw_name);
println!("DEBUG: Raw table name chars: {:?}", raw_name.chars().collect::<Vec<_>>());
let table_name_str = raw_name
.trim_matches('"')
.trim_matches('`')
.to_string();
println!("DEBUG: Cleaned table name: {:?}", table_name_str);
println!("DEBUG: Cleaned table name chars: {:?}", table_name_str.chars().collect::<Vec<_>>());
if let Statement::CreateTable(create_table_details) = statement {
let table_name_str = create_table_details.name.to_string();
println!( println!(
"Table '{}' created by extension, setting up CRDT triggers...", "Table '{}' created by extension, setting up CRDT triggers...",
table_name_str table_name_str
@ -175,12 +206,11 @@ pub async fn extension_sql_execute(
table_name_str table_name_str
); );
} }
}
// Commit transaction // Commit transaction
tx.commit().map_err(DatabaseError::from)?; tx.commit().map_err(DatabaseError::from)?;
Ok(modified_schema_tables.into_iter().collect()) Ok(result)
}) })
.map_err(ExtensionError::from) .map_err(ExtensionError::from)
} }
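The command above branches on statement_has_returning from crate::database::core, which is not shown here. Purely as an illustration of the idea, a RETURNING check over the sqlparser AST might look roughly like the sketch below; the exact variant shapes depend on the sqlparser version in use, so treat this as an assumption rather than the project's actual implementation.

use sqlparser::ast::Statement;

// Hypothetical sketch only; the real statement_has_returning lives in
// crate::database::core and may be implemented differently.
fn has_returning_clause(stmt: &Statement) -> bool {
    match stmt {
        Statement::Insert(insert) => insert.returning.is_some(),
        Statement::Update { returning, .. } => returning.is_some(),
        Statement::Delete(delete) => delete.returning.is_some(),
        _ => false,
    }
}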
@ -192,7 +222,7 @@ pub async fn extension_sql_select(
public_key: String, public_key: String,
name: String, name: String,
state: State<'_, AppState>, state: State<'_, AppState>,
) -> Result<Vec<JsonValue>, ExtensionError> { ) -> Result<Vec<Vec<JsonValue>>, ExtensionError> {
// Get extension to retrieve its ID // Get extension to retrieve its ID
let extension = state let extension = state
.extension_manager .extension_manager
@ -229,17 +259,10 @@ pub async fn extension_sql_select(
} }
} }
// Database operation // Database operation - return Vec<Vec<JsonValue>> like sql_select_with_crdt
with_connection(&state.db, |conn| { with_connection(&state.db, |conn| {
let sql_params = ValueConverter::convert_params(&params)?; let sql_params = ValueConverter::convert_params(&params)?;
let transformer = CrdtTransformer::new(); let stmt_to_execute = ast_vec.pop().unwrap();
// Use the last statement for result set
let last_statement = ast_vec.pop().unwrap();
let mut stmt_to_execute = last_statement;
// Transform the statement
transformer.transform_select_statement(&mut stmt_to_execute)?;
let transformed_sql = stmt_to_execute.to_string(); let transformed_sql = stmt_to_execute.to_string();
// Prepare and execute query // Prepare and execute query
@ -251,51 +274,34 @@ pub async fn extension_sql_select(
table: None, table: None,
})?; })?;
let column_names: Vec<String> = prepared_stmt let num_columns = prepared_stmt.column_count();
.column_names() let mut rows = prepared_stmt
.into_iter() .query(params_from_iter(sql_params.iter()))
.map(|s| s.to_string())
.collect();
let rows = prepared_stmt
.query_map(params_from_iter(sql_params.iter()), |row| {
row_to_json_value(row, &column_names)
})
.map_err(|e| DatabaseError::QueryError { .map_err(|e| DatabaseError::QueryError {
reason: e.to_string(), reason: e.to_string(),
})?; })?;
let mut results = Vec::new(); let mut result_vec: Vec<Vec<JsonValue>> = Vec::new();
for row_result in rows {
results.push(row_result.map_err(|e| DatabaseError::RowProcessingError { while let Some(row) = rows.next().map_err(|e| DatabaseError::QueryError {
reason: e.to_string(), reason: e.to_string(),
})?); })? {
let mut row_values: Vec<JsonValue> = Vec::new();
for i in 0..num_columns {
let value_ref = row.get_ref(i).map_err(|e| DatabaseError::QueryError {
reason: e.to_string(),
})?;
let json_value = crate::database::core::convert_value_ref_to_json(value_ref)?;
row_values.push(json_value);
}
result_vec.push(row_values);
} }
Ok(results) Ok(result_vec)
}) })
.map_err(ExtensionError::from) .map_err(ExtensionError::from)
} }
/// Converts a SQLite row to JSON
fn row_to_json_value(
row: &rusqlite::Row,
columns: &[String],
) -> Result<JsonValue, rusqlite::Error> {
let mut map = serde_json::Map::new();
for (i, col_name) in columns.iter().enumerate() {
let value = row.get::<usize, rusqlite::types::Value>(i)?;
let json_value = match value {
rusqlite::types::Value::Null => JsonValue::Null,
rusqlite::types::Value::Integer(i) => json!(i),
rusqlite::types::Value::Real(f) => json!(f),
rusqlite::types::Value::Text(s) => json!(s),
rusqlite::types::Value::Blob(blob) => json!(blob.to_vec()),
};
map.insert(col_name.clone(), json_value);
}
Ok(JsonValue::Object(map))
}
/// Validates parameters against SQL placeholders /// Validates parameters against SQL placeholders
fn validate_params(sql: &str, params: &[JsonValue]) -> Result<(), DatabaseError> { fn validate_params(sql: &str, params: &[JsonValue]) -> Result<(), DatabaseError> {
@ -317,15 +323,6 @@ fn count_sql_placeholders(sql: &str) -> usize {
sql.matches('?').count() sql.matches('?').count()
} }
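As the hunk above shows, parameter validation simply counts '?' characters in the SQL text. A small sketch of that check in isolation (note that this naive count also matches '?' inside quoted string literals, so it assumes placeholders never appear inside literals):

// Mirrors count_sql_placeholders above.
fn count_placeholders(sql: &str) -> usize {
    sql.matches('?').count()
}

fn params_match(sql: &str, provided: usize) -> bool {
    count_placeholders(sql) == provided
}

// params_match("SELECT * FROM notes WHERE id = ? AND title = ?", 2) == true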
/// Truncates SQL for error messages
/* fn truncate_sql(sql: &str, max_length: usize) -> String {
if sql.len() <= max_length {
sql.to_string()
} else {
format!("{}...", &sql[..max_length])
}
} */
#[cfg(test)] #[cfg(test)]
mod tests { mod tests {
use super::*; use super::*;


@ -1,7 +1,7 @@
/// src-tauri/src/extension/mod.rs /// src-tauri/src/extension/mod.rs
use crate::{ use crate::{
extension::{ extension::{
core::{EditablePermissions, ExtensionInfoResponse, ExtensionPreview}, core::{manager::ExtensionManager, EditablePermissions, ExtensionInfoResponse, ExtensionPreview},
error::ExtensionError, error::ExtensionError,
}, },
AppState, AppState,
@ -37,7 +37,7 @@ pub async fn get_all_extensions(
state: State<'_, AppState>, state: State<'_, AppState>,
) -> Result<Vec<ExtensionInfoResponse>, String> { ) -> Result<Vec<ExtensionInfoResponse>, String> {
// Check if extensions are loaded, if not load them first // Check if extensions are loaded, if not load them first
let needs_loading = { /* let needs_loading = {
let prod_exts = state let prod_exts = state
.extension_manager .extension_manager
.production_extensions .production_extensions
@ -45,15 +45,15 @@ pub async fn get_all_extensions(
.unwrap(); .unwrap();
let dev_exts = state.extension_manager.dev_extensions.lock().unwrap(); let dev_exts = state.extension_manager.dev_extensions.lock().unwrap();
prod_exts.is_empty() && dev_exts.is_empty() prod_exts.is_empty() && dev_exts.is_empty()
}; }; */
if needs_loading { /* if needs_loading { */
state state
.extension_manager .extension_manager
.load_installed_extensions(&app_handle, &state) .load_installed_extensions(&app_handle, &state)
.await .await
.map_err(|e| format!("Failed to load extensions: {:?}", e))?; .map_err(|e| format!("Failed to load extensions: {:?}", e))?;
} /* } */
let mut extensions = Vec::new(); let mut extensions = Vec::new();
@ -82,12 +82,13 @@ pub async fn get_all_extensions(
#[tauri::command] #[tauri::command]
pub async fn preview_extension( pub async fn preview_extension(
app_handle: AppHandle,
state: State<'_, AppState>, state: State<'_, AppState>,
file_bytes: Vec<u8>, file_bytes: Vec<u8>,
) -> Result<ExtensionPreview, ExtensionError> { ) -> Result<ExtensionPreview, ExtensionError> {
state state
.extension_manager .extension_manager
.preview_extension_internal(file_bytes) .preview_extension_internal(&app_handle, file_bytes)
.await .await
} }
@ -193,13 +194,7 @@ pub async fn remove_extension(
) -> Result<(), ExtensionError> { ) -> Result<(), ExtensionError> {
state state
.extension_manager .extension_manager
.remove_extension_internal( .remove_extension_internal(&app_handle, &public_key, &name, &version, &state)
&app_handle,
&public_key,
&name,
&version,
&state,
)
.await .await
} }
@ -223,6 +218,16 @@ pub fn is_extension_installed(
#[derive(serde::Deserialize, Debug)] #[derive(serde::Deserialize, Debug)]
struct HaextensionConfig { struct HaextensionConfig {
dev: DevConfig, dev: DevConfig,
#[serde(default)]
keys: KeysConfig,
}
#[derive(serde::Deserialize, Debug, Default)]
struct KeysConfig {
#[serde(default)]
public_key_path: Option<String>,
#[serde(default)]
private_key_path: Option<String>,
} }
#[derive(serde::Deserialize, Debug)] #[derive(serde::Deserialize, Debug)]
@ -231,6 +236,8 @@ struct DevConfig {
port: u16, port: u16,
#[serde(default = "default_host")] #[serde(default = "default_host")]
host: String, host: String,
#[serde(default = "default_haextension_dir")]
haextension_dir: String,
} }
fn default_port() -> u16 { fn default_port() -> u16 {
@ -241,10 +248,14 @@ fn default_host() -> String {
"localhost".to_string() "localhost".to_string()
} }
fn default_haextension_dir() -> String {
"haextension".to_string()
}
/// Check if a dev server is reachable by making a simple HTTP request /// Check if a dev server is reachable by making a simple HTTP request
async fn check_dev_server_health(url: &str) -> bool { async fn check_dev_server_health(url: &str) -> bool {
use tauri_plugin_http::reqwest;
use std::time::Duration; use std::time::Duration;
use tauri_plugin_http::reqwest;
// Try to connect with a short timeout // Try to connect with a short timeout
let client = reqwest::Client::builder() let client = reqwest::Client::builder()
@ -276,29 +287,28 @@ pub async fn load_dev_extension(
let extension_path_buf = PathBuf::from(&extension_path); let extension_path_buf = PathBuf::from(&extension_path);
// 1. Read haextension.json to get dev server config // 1. Read haextension.config.json to get dev server config and haextension directory
let config_path = extension_path_buf.join("haextension.json"); let config_path = extension_path_buf.join("haextension.config.json");
let (host, port) = if config_path.exists() { let (host, port, haextension_dir) = if config_path.exists() {
let config_content = std::fs::read_to_string(&config_path).map_err(|e| { let config_content =
ExtensionError::ValidationError { std::fs::read_to_string(&config_path).map_err(|e| ExtensionError::ValidationError {
reason: format!("Failed to read haextension.json: {}", e), reason: format!("Failed to read haextension.config.json: {}", e),
}
})?; })?;
let config: HaextensionConfig = serde_json::from_str(&config_content).map_err(|e| { let config: HaextensionConfig =
ExtensionError::ValidationError { serde_json::from_str(&config_content).map_err(|e| ExtensionError::ValidationError {
reason: format!("Failed to parse haextension.json: {}", e), reason: format!("Failed to parse haextension.config.json: {}", e),
}
})?; })?;
(config.dev.host, config.dev.port) (config.dev.host, config.dev.port, config.dev.haextension_dir)
} else { } else {
// Default values if config doesn't exist // Default values if config doesn't exist
(default_host(), default_port()) (default_host(), default_port(), default_haextension_dir())
}; };
let dev_server_url = format!("http://{}:{}", host, port); let dev_server_url = format!("http://{}:{}", host, port);
eprintln!("📡 Dev server URL: {}", dev_server_url); eprintln!("📡 Dev server URL: {}", dev_server_url);
eprintln!("📁 Haextension directory: {}", haextension_dir);
// 1.5. Check if dev server is running // 1.5. Check if dev server is running
if !check_dev_server_health(&dev_server_url).await { if !check_dev_server_health(&dev_server_url).await {
@ -311,35 +321,30 @@ pub async fn load_dev_extension(
} }
eprintln!("✅ Dev server is reachable"); eprintln!("✅ Dev server is reachable");
// 2. Build path to manifest: <extension_path>/haextension/manifest.json // 2. Validate and build path to manifest: <extension_path>/<haextension_dir>/manifest.json
let manifest_path = extension_path_buf.join("haextension").join("manifest.json"); let manifest_relative_path = format!("{}/manifest.json", haextension_dir);
let manifest_path = ExtensionManager::validate_path_in_directory(
// Check if manifest exists &extension_path_buf,
if !manifest_path.exists() { &manifest_relative_path,
return Err(ExtensionError::ManifestError { true,
)?
.ok_or_else(|| ExtensionError::ManifestError {
reason: format!( reason: format!(
"Manifest not found at: {}. Make sure you run 'npx @haexhub/sdk init' first.", "Manifest not found at: {}/manifest.json. Make sure you run 'npx @haexhub/sdk init' first.",
manifest_path.display() haextension_dir
), ),
}); })?;
}
// 3. Read and parse manifest // 3. Read and parse manifest
let manifest_content = std::fs::read_to_string(&manifest_path).map_err(|e| { let manifest_content =
ExtensionError::ManifestError { std::fs::read_to_string(&manifest_path).map_err(|e| ExtensionError::ManifestError {
reason: format!("Failed to read manifest: {}", e), reason: format!("Failed to read manifest: {}", e),
}
})?; })?;
let manifest: ExtensionManifest = serde_json::from_str(&manifest_content)?; let manifest: ExtensionManifest = serde_json::from_str(&manifest_content)?;
// 4. Generate a unique ID for dev extension: dev_<public_key_first_8>_<name> // 4. Generate a unique ID for dev extension: dev_<public_key>_<name>
let key_prefix = manifest let extension_id = format!("dev_{}_{}", manifest.public_key, manifest.name);
.public_key
.chars()
.take(8)
.collect::<String>();
let extension_id = format!("dev_{}_{}", key_prefix, manifest.name);
// 5. Check if dev extension already exists (allow reload) // 5. Check if dev extension already exists (allow reload)
if let Some(existing) = state if let Some(existing) = state
@ -387,12 +392,10 @@ pub fn remove_dev_extension(
state: State<'_, AppState>, state: State<'_, AppState>,
) -> Result<(), ExtensionError> { ) -> Result<(), ExtensionError> {
// Only remove from dev_extensions, not production_extensions // Only remove from dev_extensions, not production_extensions
let mut dev_exts = state let mut dev_exts = state.extension_manager.dev_extensions.lock().map_err(|e| {
.extension_manager ExtensionError::MutexPoisoned {
.dev_extensions
.lock()
.map_err(|e| ExtensionError::MutexPoisoned {
reason: e.to_string(), reason: e.to_string(),
}
})?; })?;
// Find and remove by public_key and name // Find and remove by public_key and name
@ -406,10 +409,7 @@ pub fn remove_dev_extension(
eprintln!("✅ Dev extension removed: {}", name); eprintln!("✅ Dev extension removed: {}", name);
Ok(()) Ok(())
} else { } else {
Err(ExtensionError::NotFound { Err(ExtensionError::NotFound { public_key, name })
public_key,
name,
})
} }
} }
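The load_dev_extension flow above reads haextension.config.json and falls back to defaults for any missing field. A minimal hedged sketch of that deserialization step, reusing the serde structs defined in this file (the inline JSON is only an illustrative input; real configs may carry additional keys):

// Sketch only: HaextensionConfig / DevConfig are the serde structs shown above.
fn parse_dev_config(config_content: &str) -> Result<(String, u16, String), serde_json::Error> {
    let config: HaextensionConfig = serde_json::from_str(config_content)?;
    Ok((config.dev.host, config.dev.port, config.dev.haextension_dir))
}

// Illustrative usage; key names follow the struct fields shown above:
// let (host, port, dir) = parse_dev_config(
//     r#"{ "dev": { "port": 3000, "host": "localhost", "haextension_dir": "haextension" } }"#,
// )?;
// assert_eq!(format!("http://{}:{}", host, port), "http://localhost:3000");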


@ -197,6 +197,30 @@ impl PermissionManager {
action: Action, action: Action,
table_name: &str, table_name: &str,
) -> Result<(), ExtensionError> { ) -> Result<(), ExtensionError> {
// Remove quotes from table name if present (from SDK's getTableName())
let clean_table_name = table_name.trim_matches('"');
// Auto-allow: Extensions have full access to their own tables
// Table format: {publicKey}__{extensionName}__{tableName}
// Extension ID format: dev_{publicKey}_{extensionName} or {publicKey}_{extensionName}
// Get the extension to check if this is its own table
let extension = app_state
.extension_manager
.get_extension(extension_id)
.ok_or_else(|| ExtensionError::ValidationError {
reason: format!("Extension with ID {} not found", extension_id),
})?;
// Build expected table prefix: {publicKey}__{extensionName}__
let expected_prefix = format!("{}__{}__", extension.manifest.public_key, extension.manifest.name);
if clean_table_name.starts_with(&expected_prefix) {
// This is the extension's own table - auto-allow
return Ok(());
}
// Not own table - check explicit permissions
let permissions = Self::get_permissions(app_state, extension_id).await?; let permissions = Self::get_permissions(app_state, extension_id).await?;
let has_permission = permissions let has_permission = permissions
@ -205,7 +229,7 @@ impl PermissionManager {
.filter(|perm| perm.resource_type == ResourceType::Db) .filter(|perm| perm.resource_type == ResourceType::Db)
.filter(|perm| perm.action == action) // action is no longer an Option .filter(|perm| perm.action == action) // action is no longer an Option
.any(|perm| { .any(|perm| {
if perm.target != "*" && perm.target != table_name { if perm.target != "*" && perm.target != clean_table_name {
return false; return false;
} }
true true
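The auto-allow rule above relies on the table-name convention {publicKey}__{extensionName}__{tableName}. A small hedged sketch of that prefix check in isolation (the key and extension name in the comments are made up):

// Mirrors the auto-allow check above; values in the comments are illustrative.
fn is_own_table(public_key: &str, extension_name: &str, table_name: &str) -> bool {
    // Strip quotes that the SDK's getTableName() may include.
    let clean = table_name.trim_matches('"');
    let expected_prefix = format!("{}__{}__", public_key, extension_name);
    clean.starts_with(&expected_prefix)
}

// is_own_table("abc123", "notes", "\"abc123__notes__entries\"") -> true
// is_own_table("abc123", "notes", "haex_extensions")            -> false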


@ -165,8 +165,6 @@ pub struct ExtensionPermission {
pub constraints: Option<PermissionConstraints>, pub constraints: Option<PermissionConstraints>,
pub status: PermissionStatus, pub status: PermissionStatus,
#[serde(skip_serializing_if = "Option::is_none")] #[serde(skip_serializing_if = "Option::is_none")]
pub haex_tombstone: Option<bool>,
#[serde(skip_serializing_if = "Option::is_none")]
pub haex_timestamp: Option<String>, pub haex_timestamp: Option<String>,
} }
@ -341,9 +339,9 @@ impl From<&ExtensionPermission> for crate::database::generated::HaexExtensionPer
fn from(perm: &ExtensionPermission) -> Self { fn from(perm: &ExtensionPermission) -> Self {
Self { Self {
id: perm.id.clone(), id: perm.id.clone(),
extension_id: Some(perm.extension_id.clone()), extension_id: perm.extension_id.clone(),
resource_type: Some(perm.resource_type.as_str().to_string()), resource_type: Some(perm.resource_type.as_str().to_string()),
action: Some(perm.action.as_str()), action: Some(perm.action.as_str().to_string()),
target: Some(perm.target.clone()), target: Some(perm.target.clone()),
constraints: perm constraints: perm
.constraints .constraints
@ -352,7 +350,6 @@ impl From<&ExtensionPermission> for crate::database::generated::HaexExtensionPer
status: perm.status.as_str().to_string(), status: perm.status.as_str().to_string(),
created_at: None, created_at: None,
updated_at: None, updated_at: None,
haex_tombstone: perm.haex_tombstone,
haex_timestamp: perm.haex_timestamp.clone(), haex_timestamp: perm.haex_timestamp.clone(),
} }
} }
@ -382,13 +379,12 @@ impl From<crate::database::generated::HaexExtensionPermissions> for ExtensionPer
Self { Self {
id: db_perm.id, id: db_perm.id,
extension_id: db_perm.extension_id.unwrap_or_default(), extension_id: db_perm.extension_id,
resource_type, resource_type,
action, action,
target: db_perm.target.unwrap_or_default(), target: db_perm.target.unwrap_or_default(),
constraints, constraints,
status, status,
haex_tombstone: db_perm.haex_tombstone,
haex_timestamp: db_perm.haex_timestamp, haex_timestamp: db_perm.haex_timestamp,
} }
} }


@ -68,17 +68,19 @@ pub fn run() {
.invoke_handler(tauri::generate_handler![ .invoke_handler(tauri::generate_handler![
database::create_encrypted_database, database::create_encrypted_database,
database::delete_vault, database::delete_vault,
database::move_vault_to_trash,
database::list_vaults, database::list_vaults,
database::open_encrypted_database, database::open_encrypted_database,
database::sql_execute,
database::sql_execute_with_crdt, database::sql_execute_with_crdt,
database::sql_execute,
database::sql_query_with_crdt, database::sql_query_with_crdt,
database::sql_select_with_crdt,
database::sql_select, database::sql_select,
database::vault_exists, database::vault_exists,
extension::database::extension_sql_execute, extension::database::extension_sql_execute,
extension::database::extension_sql_select, extension::database::extension_sql_select,
extension::get_all_extensions,
extension::get_all_dev_extensions, extension::get_all_dev_extensions,
extension::get_all_extensions,
extension::get_extension_info, extension::get_extension_info,
extension::install_extension_with_permissions, extension::install_extension_with_permissions,
extension::is_extension_installed, extension::is_extension_installed,


@ -1,7 +1,7 @@
{ {
"$schema": "https://schema.tauri.app/config/2", "$schema": "https://schema.tauri.app/config/2",
"productName": "haex-hub", "productName": "haex-hub",
"version": "0.1.0", "version": "0.1.3",
"identifier": "space.haex.hub", "identifier": "space.haex.hub",
"build": { "build": {
"beforeDevCommand": "pnpm dev", "beforeDevCommand": "pnpm dev",
@ -25,7 +25,8 @@
"'self'", "'self'",
"http://tauri.localhost", "http://tauri.localhost",
"haex-extension:", "haex-extension:",
"'wasm-unsafe-eval'" "'wasm-unsafe-eval'",
"'unsafe-inline'"
], ],
"style-src": [ "style-src": [
"'self'", "'self'",


@ -3,6 +3,8 @@ export default defineAppConfig({
colors: { colors: {
primary: 'sky', primary: 'sky',
secondary: 'fuchsia', secondary: 'fuchsia',
warning: 'yellow',
danger: 'red',
}, },
}, },
}) })


@ -1,9 +1,7 @@
<template> <template>
<UApp :locale="locales[locale]"> <UApp :locale="locales[locale]">
<div data-vaul-drawer-wrapper> <div data-vaul-drawer-wrapper>
<NuxtLayout>
<NuxtPage /> <NuxtPage />
</NuxtLayout>
</div> </div>
</UApp> </UApp>
</template> </template>


@ -13,8 +13,48 @@
[disabled] { [disabled] {
@apply cursor-not-allowed; @apply cursor-not-allowed;
} }
/* Define safe-area-insets as CSS custom properties for JavaScript access */
:root {
--safe-area-inset-top: env(safe-area-inset-top, 0px);
--safe-area-inset-bottom: env(safe-area-inset-bottom, 0px);
--safe-area-inset-left: env(safe-area-inset-left, 0px);
--safe-area-inset-right: env(safe-area-inset-right, 0px);
} }
:root { /* Prevent scrolling on html and body */
--ui-header-height: 74px; html {
overflow: hidden;
margin: 0;
padding: 0;
height: 100dvh;
height: 100vh; /* Fallback */
width: 100%;
}
body {
overflow: hidden;
margin: 0;
height: 100%;
width: 100%;
padding: 0;
}
#__nuxt {
/* Full height of the body */
height: 100%;
width: 100%;
/* Safe-area paddings on the root element - so EVERYTHING benefits from them */
padding-top: var(--safe-area-inset-top);
padding-bottom: var(--safe-area-inset-bottom);
padding-left: var(--safe-area-inset-left);
padding-right: var(--safe-area-inset-right);
box-sizing: border-box;
}
}
@theme {
--spacing-header: 3.5rem; /* 56px - or your preferred value */
} }


@ -0,0 +1,61 @@
<template>
<div
v-if="data"
class="fixed top-2 right-2 bg-black/90 text-white text-xs p-3 rounded-lg shadow-2xl max-w-sm z-[9999] backdrop-blur-sm"
>
<div class="flex justify-between items-start gap-3 mb-2">
<span class="font-bold text-sm">{{ title }}</span>
<div class="flex gap-1">
<button
class="bg-white/20 hover:bg-white/30 px-2 py-1 rounded text-xs transition-colors"
@click="copyToClipboardAsync"
>
Copy
</button>
<button
v-if="dismissible"
class="bg-white/20 hover:bg-white/30 px-2 py-1 rounded text-xs transition-colors"
@click="handleDismiss"
>
</button>
</div>
</div>
<pre class="text-xs whitespace-pre-wrap font-mono overflow-auto max-h-96">{{ formattedData }}</pre>
</div>
</template>
<script setup lang="ts">
const props = withDefaults(
defineProps<{
data: Record<string, any> | null
title?: string
dismissible?: boolean
}>(),
{
title: 'Debug Info',
dismissible: false,
},
)
const emit = defineEmits<{
dismiss: []
}>()
const formattedData = computed(() => {
if (!props.data) return ''
return JSON.stringify(props.data, null, 2)
})
const copyToClipboardAsync = async () => {
try {
await navigator.clipboard.writeText(formattedData.value)
} catch (err) {
console.error('Failed to copy debug info:', err)
}
}
const handleDismiss = () => {
emit('dismiss')
}
</script>


@ -36,7 +36,7 @@
<div class="flex flex-col items-center gap-4"> <div class="flex flex-col items-center gap-4">
<div <div
class="animate-spin rounded-full h-12 w-12 border-b-2 border-blue-500" class="animate-spin rounded-full h-12 w-12 border-b-2 border-blue-500"
></div> />
<p class="text-sm text-gray-600 dark:text-gray-400"> <p class="text-sm text-gray-600 dark:text-gray-400">
Loading extension... Loading extension...
</p> </p>


@ -69,7 +69,7 @@
<script setup lang="ts"> <script setup lang="ts">
const props = defineProps<{ const props = defineProps<{
id: string id: string
itemType: 'extension' | 'file' | 'folder' itemType: DesktopItemType
referenceId: string referenceId: string
initialX: number initialX: number
initialY: number initialY: number


@ -1,8 +1,7 @@
<template> <template>
<div <div
ref="desktopEl" ref="desktopEl"
class="w-full h-full relative overflow-hidden" class="absolute inset-0 overflow-hidden"
@click.self.stop="handleDesktopClick"
> >
<Swiper <Swiper
:modules="[SwiperNavigation]" :modules="[SwiperNavigation]"
@ -11,11 +10,11 @@
:initial-slide="currentWorkspaceIndex" :initial-slide="currentWorkspaceIndex"
:speed="300" :speed="300"
:touch-angle="45" :touch-angle="45"
:threshold="10"
:no-swiping="true" :no-swiping="true"
no-swiping-class="no-swipe" no-swiping-class="no-swipe"
:allow-touch-move="allowSwipe" :allow-touch-move="allowSwipe"
class="w-full h-full" class="h-full w-full"
direction="vertical"
@swiper="onSwiperInit" @swiper="onSwiperInit"
@slide-change="onSlideChange" @slide-change="onSlideChange"
> >
@ -25,9 +24,11 @@
class="w-full h-full" class="w-full h-full"
> >
<div <div
class="w-full h-full relative bg-gradient-to-br from-gray-50 via-gray-100 to-gray-200 dark:from-gray-900 dark:via-gray-800 dark:to-gray-700" class="w-full h-full relative"
@click.self.stop="handleDesktopClick" @click.self.stop="handleDesktopClick"
@mousedown.left.self="handleAreaSelectStart" @mousedown.left.self="handleAreaSelectStart"
@dragover.prevent="handleDragOver"
@drop.prevent="handleDrop($event, workspace.id)"
> >
<!-- Grid Pattern Background --> <!-- Grid Pattern Background -->
<div <div
@ -40,18 +41,16 @@
/> />
<!-- Snap Dropzones (only visible when window drag near edge) --> <!-- Snap Dropzones (only visible when window drag near edge) -->
<Transition name="fade">
<div <div
v-if="showLeftSnapZone" class="absolute left-0 top-0 bottom-0 border-blue-500 pointer-events-none backdrop-blur-sm z-50 transition-all duration-500 ease-in-out"
class="absolute left-0 top-0 bottom-0 w-1/2 bg-blue-500/20 border-2 border-blue-500 pointer-events-none backdrop-blur-sm z-40" :class="showLeftSnapZone ? 'w-1/2 bg-blue-500/20 border-2' : 'w-0'"
/> />
</Transition>
<Transition name="fade">
<div <div
v-if="showRightSnapZone" class="absolute right-0 top-0 bottom-0 border-blue-500 pointer-events-none backdrop-blur-sm z-50 transition-all duration-500 ease-in-out"
class="absolute right-0 top-0 bottom-0 w-1/2 bg-blue-500/20 border-2 border-blue-500 pointer-events-none backdrop-blur-sm z-40" :class="showRightSnapZone ? 'w-1/2 bg-blue-500/20 border-2' : 'w-0'"
/> />
</Transition>
<!-- Area Selection Box --> <!-- Area Selection Box -->
<div <div
@ -79,37 +78,36 @@
<!-- Windows for this workspace --> <!-- Windows for this workspace -->
<template <template
v-for="(window, index) in getWorkspaceWindows(workspace.id)" v-for="window in getWorkspaceWindows(workspace.id)"
:key="window.id" :key="window.id"
> >
<!-- Wrapper for Overview Mode Click/Drag --> <!-- Overview Mode: Teleport to window preview -->
<div <Teleport
v-if="false" v-if="
:style=" windowManager.showWindowOverview &&
getOverviewWindowGridStyle( overviewWindowState.has(window.id)
index,
getWorkspaceWindows(workspace.id).length,
)
" "
class="absolute cursor-pointer group" :to="`#window-preview-${window.id}`"
:draggable="true"
@dragstart="handleOverviewWindowDragStart($event, window.id)"
@dragend="handleOverviewWindowDragEnd"
@click="handleOverviewWindowClick(window.id)"
> >
<!-- Overlay for click/drag events (prevents interaction with window content) -->
<div <div
class="absolute inset-0 z-[100] bg-transparent group-hover:ring-4 group-hover:ring-purple-500 rounded-xl transition-all" class="absolute origin-top-left"
/> :style="{
transform: `scale(${overviewWindowState.get(window.id)!.scale})`,
<HaexDesktopWindow width: `${overviewWindowState.get(window.id)!.width}px`,
height: `${overviewWindowState.get(window.id)!.height}px`,
}"
>
<HaexWindow
v-show="
windowManager.showWindowOverview || !window.isMinimized
"
:id="window.id" :id="window.id"
v-model:x="overviewWindowState.get(window.id)!.x"
v-model:y="overviewWindowState.get(window.id)!.y"
v-model:width="overviewWindowState.get(window.id)!.width"
v-model:height="overviewWindowState.get(window.id)!.height"
:title="window.title" :title="window.title"
:icon="window.icon" :icon="window.icon"
:initial-x="window.x"
:initial-y="window.y"
:initial-width="window.width"
:initial-height="window.height"
:is-active="windowManager.isWindowActive(window.id)" :is-active="windowManager.isWindowActive(window.id)"
:source-x="window.sourceX" :source-x="window.sourceX"
:source-y="window.sourceY" :source-y="window.sourceY"
@ -117,12 +115,21 @@
:source-height="window.sourceHeight" :source-height="window.sourceHeight"
:is-opening="window.isOpening" :is-opening="window.isOpening"
:is-closing="window.isClosing" :is-closing="window.isClosing"
class="no-swipe pointer-events-none" :warning-level="
window.type === 'extension' &&
availableExtensions.find(
(ext) => ext.id === window.sourceId,
)?.devServerUrl
? 'warning'
: undefined
"
class="no-swipe"
@close="windowManager.closeWindow(window.id)" @close="windowManager.closeWindow(window.id)"
@minimize="windowManager.minimizeWindow(window.id)" @minimize="windowManager.minimizeWindow(window.id)"
@activate="windowManager.activateWindow(window.id)" @activate="windowManager.activateWindow(window.id)"
@position-changed=" @position-changed="
(x, y) => windowManager.updateWindowPosition(window.id, x, y) (x, y) =>
windowManager.updateWindowPosition(window.id, x, y)
" "
@size-changed=" @size-changed="
(width, height) => (width, height) =>
@ -131,7 +138,6 @@
@drag-start="handleWindowDragStart(window.id)" @drag-start="handleWindowDragStart(window.id)"
@drag-end="handleWindowDragEnd" @drag-end="handleWindowDragEnd"
> >
{{ window }}
<!-- System Window: Render Vue Component --> <!-- System Window: Render Vue Component -->
<component <component
:is="getSystemWindowComponent(window.sourceId)" :is="getSystemWindowComponent(window.sourceId)"
@ -144,18 +150,21 @@
:extension-id="window.sourceId" :extension-id="window.sourceId"
:window-id="window.id" :window-id="window.id"
/> />
</HaexDesktopWindow> </HaexWindow>
</div> </div>
</Teleport>
<!-- Normal Mode (non-overview) --> <!-- Desktop Mode: Render directly in workspace -->
<HaexDesktopWindow <HaexWindow
v-else
v-show="windowManager.showWindowOverview || !window.isMinimized"
:id="window.id" :id="window.id"
v-model:x="window.x"
v-model:y="window.y"
v-model:width="window.width"
v-model:height="window.height"
:title="window.title" :title="window.title"
:icon="window.icon" :icon="window.icon"
:initial-x="window.x"
:initial-y="window.y"
:initial-width="window.width"
:initial-height="window.height"
:is-active="windowManager.isWindowActive(window.id)" :is-active="windowManager.isWindowActive(window.id)"
:source-x="window.sourceX" :source-x="window.sourceX"
:source-y="window.sourceY" :source-y="window.sourceY"
@ -163,6 +172,13 @@
:source-height="window.sourceHeight" :source-height="window.sourceHeight"
:is-opening="window.isOpening" :is-opening="window.isOpening"
:is-closing="window.isClosing" :is-closing="window.isClosing"
:warning-level="
window.type === 'extension' &&
availableExtensions.find((ext) => ext.id === window.sourceId)
?.devServerUrl
? 'warning'
: undefined
"
class="no-swipe" class="no-swipe"
@close="windowManager.closeWindow(window.id)" @close="windowManager.closeWindow(window.id)"
@minimize="windowManager.minimizeWindow(window.id)" @minimize="windowManager.minimizeWindow(window.id)"
@ -189,79 +205,30 @@
:extension-id="window.sourceId" :extension-id="window.sourceId"
:window-id="window.id" :window-id="window.id"
/> />
</HaexDesktopWindow> </HaexWindow>
</template> </template>
</div> </div>
</SwiperSlide> </SwiperSlide>
</Swiper> </Swiper>
<!-- Workspace Drawer --> <!-- Window Overview Modal -->
<UDrawer <HaexWindowOverview />
v-model:open="isOverviewMode"
direction="left"
:dismissible="false"
:overlay="false"
:modal="false"
should-scale-background
set-background-color-on-scale
title="Workspaces"
description="Workspaces"
>
<template #content>
<div class="p-6 h-full overflow-y-auto">
<UButton
block
trailing-icon="mdi-close"
class="text-2xl font-bold ext-gray-900 dark:text-white mb-4"
@click="isOverviewMode = false"
>
Workspaces
</UButton>
<!-- Workspace Cards -->
<div class="flex flex-col gap-3">
<HaexWorkspaceCard
v-for="workspace in workspaces"
:key="workspace.id"
:workspace
/>
</div>
<!-- Add New Workspace Button -->
<UButton
block
variant="outline"
class="mt-6"
@click="handleAddWorkspace"
>
<template #leading>
<UIcon name="i-heroicons-plus" />
</template>
New Workspace
</UButton>
</div>
</template>
</UDrawer>
</div>
</template>
<script setup lang="ts">
import { Swiper, SwiperSlide } from 'swiper/vue'
import { Navigation } from 'swiper/modules'
import type { Swiper as SwiperType } from 'swiper'
import 'swiper/css'
import 'swiper/css/navigation'
import { eq } from 'drizzle-orm'
import { haexDesktopItems } from '~~/src-tauri/database/schemas'
const SwiperNavigation = Navigation
const desktopStore = useDesktopStore()
const extensionsStore = useExtensionsStore()
const windowManager = useWindowManagerStore()
const workspaceStore = useWorkspaceStore()
const { currentVault } = storeToRefs(useVaultStore())
const { desktopItems } = storeToRefs(desktopStore)
const { availableExtensions } = storeToRefs(extensionsStore)
const {
@ -273,14 +240,8 @@ const {
  isOverviewMode,
} = storeToRefs(workspaceStore)
// Swiper instance
// Control Swiper touch behavior (disable during icon/window drag)
// Mouse position tracking
const { x: mouseX } = useMouse()
// Desktop element ref
const desktopEl = useTemplateRef('desktopEl')
// Track desktop viewport size reactively
@ -320,7 +281,6 @@ const currentDraggedReferenceId = ref<string>()
// Window drag state for snap zones
const isWindowDragging = ref(false)
const currentDraggingWindowId = ref<string | null>(null)
const snapEdgeThreshold = 50 // pixels from edge to show snap zone
// Computed visibility for snap zones (uses mouseX from above)
@ -334,37 +294,29 @@ const showRightSnapZone = computed(() => {
  return mouseX.value >= viewportWidth - snapEdgeThreshold
})
// Dropzone refs
/* const removeDropzoneEl = ref<HTMLElement>()
const uninstallDropzoneEl = ref<HTMLElement>() */
// Setup dropzones with VueUse
/* const { isOverDropZone: isOverRemoveZone } = useDropZone(removeDropzoneEl, {
onDrop: () => {
if (currentDraggedItemId.value) {
handleRemoveFromDesktop(currentDraggedItemId.value)
}
},
}) */
/* const { isOverDropZone: isOverUninstallZone } = useDropZone(uninstallDropzoneEl, {
onDrop: () => {
if (currentDraggedItemType.value && currentDraggedReferenceId.value) {
handleUninstall(currentDraggedItemType.value, currentDraggedReferenceId.value)
}
},
}) */
// Get icons for a specific workspace
const getWorkspaceIcons = (workspaceId: string) => {
  return desktopItems.value
    .filter((item) => item.workspaceId === workspaceId)
    .map((item) => {
if (item.itemType === 'system') {
const systemWindow = windowManager
.getAllSystemWindows()
.find((win) => win.id === item.referenceId)
return {
...item,
label: systemWindow?.name || 'Unknown',
icon: systemWindow?.icon || '',
}
}
      if (item.itemType === 'extension') {
        const extension = availableExtensions.value.find(
          (ext) => ext.id === item.referenceId,
        )
        console.log('found ext', extension)
        return {
          ...item,
          label: extension?.name || 'Unknown',
@ -398,11 +350,9 @@ const getWorkspaceIcons = (workspaceId: string) => {
    })
}
// Get windows for a specific workspace
const getWorkspaceWindows = (workspaceId: string) => {
  return windowManager.windows.filter(
    (w) => w.workspaceId === workspaceId && !w.isMinimized,
  )
}
// Get windows for a specific workspace (including minimized for teleport)
const getWorkspaceWindows = (workspaceId: string) => {
  return windowManager.windows.filter((w) => w.workspaceId === workspaceId)
}
// Get Vue Component for system window
@ -436,26 +386,50 @@ const handleDragEnd = async () => {
  allowSwipe.value = true // Re-enable Swiper after drag
}
// Move desktop item to different workspace
const moveItemToWorkspace = async (
  itemId: string,
  targetWorkspaceId: string,
) => {
  const item = desktopItems.value.find((i) => i.id === itemId)
  if (!item) return
  try {
    if (!currentVault.value?.drizzle) return
    await currentVault.value.drizzle
      .update(haexDesktopItems)
      .set({ workspaceId: targetWorkspaceId })
      .where(eq(haexDesktopItems.id, itemId))
    // Update local state
    item.workspaceId = targetWorkspaceId
  } catch (error) {
    console.error('Fehler beim Verschieben des Items:', error)
  }
}
// Handle drag over for launcher items
const handleDragOver = (event: DragEvent) => {
  if (!event.dataTransfer) return
  // Check if this is a launcher item
  if (event.dataTransfer.types.includes('application/haex-launcher-item')) {
    event.dataTransfer.dropEffect = 'copy'
  }
}
// Handle drop for launcher items
const handleDrop = async (event: DragEvent, workspaceId: string) => {
  if (!event.dataTransfer) return
  const launcherItemData = event.dataTransfer.getData(
    'application/haex-launcher-item',
  )
  if (!launcherItemData) return
  try {
    const item = JSON.parse(launcherItemData) as {
      id: string
      name: string
      icon: string
      type: 'system' | 'extension'
    }
    // Get drop position relative to desktop
    const desktopRect = (
      event.currentTarget as HTMLElement
    ).getBoundingClientRect()
    const x = Math.max(0, event.clientX - desktopRect.left - 32) // Center icon (64px / 2)
    const y = Math.max(0, event.clientY - desktopRect.top - 32)
    // Create desktop icon on the specific workspace
    await desktopStore.addDesktopItemAsync(
      item.type as DesktopItemType,
      item.id,
      x,
      y,
      workspaceId,
    )
  } catch (error) {
    console.error('Failed to create desktop icon:', error)
  }
}
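// A minimal standalone sketch (not part of the diff above) of the drag payload
// round-trip between the launcher (drag source) and the desktop drop handler,
// assuming the 'application/haex-launcher-item' data type used in handleDrop.
// The LauncherDragPayload name and the helper functions are illustrative only.
interface LauncherDragPayload {
  id: string
  name: string
  icon: string
  type: 'system' | 'extension'
}

const LAUNCHER_ITEM_MIME = 'application/haex-launcher-item'

// Drag source side: attach the payload as JSON to the DataTransfer object.
function writeLauncherPayload(dt: DataTransfer, item: LauncherDragPayload) {
  dt.effectAllowed = 'copy'
  dt.setData(LAUNCHER_ITEM_MIME, JSON.stringify(item))
}

// Drop target side: read the payload back; returns undefined for foreign drags
// or malformed JSON, so callers can simply ignore unrelated drops.
function readLauncherPayload(
  dt: DataTransfer,
): LauncherDragPayload | undefined {
  const raw = dt.getData(LAUNCHER_ITEM_MIME)
  if (!raw) return undefined
  try {
    return JSON.parse(raw) as LauncherDragPayload
  } catch {
    return undefined
  }
}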
@ -471,33 +445,53 @@ const handleDesktopClick = () => {
  }
  desktopStore.clearSelection()
  isOverviewMode.value = false
}
const handleWindowDragStart = (windowId: string) => {
  isWindowDragging.value = true
  currentDraggingWindowId.value = windowId
  allowSwipe.value = false // Disable Swiper during window drag
}
const handleWindowDragStart = (windowId: string) => {
  console.log('[Desktop] handleWindowDragStart:', windowId)
  isWindowDragging.value = true
  windowManager.draggingWindowId = windowId // Set in store for workspace cards
  console.log(
    '[Desktop] draggingWindowId set to:',
    windowManager.draggingWindowId,
  )
  allowSwipe.value = false // Disable Swiper during window drag
}
const handleWindowDragEnd = async () => {
  // Window handles snapping itself, we just need to cleanup state
  isWindowDragging.value = false
  currentDraggingWindowId.value = null
  allowSwipe.value = true // Re-enable Swiper after drag
}
// Move window to different workspace
const moveWindowToWorkspace = async (
  windowId: string,
  targetWorkspaceId: string,
) => {
  const window = windowManager.windows.find((w) => w.id === windowId)
  if (!window) return
  // Update window's workspaceId
  window.workspaceId = targetWorkspaceId
}
const handleWindowDragEnd = async () => {
  console.log('[Desktop] handleWindowDragEnd')
  // Check if window should snap to left or right
  const draggingWindowId = windowManager.draggingWindowId
  if (draggingWindowId) {
    if (showLeftSnapZone.value) {
      // Snap to left half
      windowManager.updateWindowPosition(draggingWindowId, 0, 0)
      windowManager.updateWindowSize(
        draggingWindowId,
        viewportWidth.value / 2,
        viewportHeight.value,
      )
    } else if (showRightSnapZone.value) {
      // Snap to right half
      windowManager.updateWindowPosition(
        draggingWindowId,
        viewportWidth.value / 2,
        0,
      )
      windowManager.updateWindowSize(
        draggingWindowId,
        viewportWidth.value / 2,
        viewportHeight.value,
      )
    }
  }
  isWindowDragging.value = false
  windowManager.draggingWindowId = null // Clear from store
  allowSwipe.value = true // Re-enable Swiper after drag
}
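// A standalone sketch (not from the repository) of the half-screen snap
// geometry that handleWindowDragEnd applies above: given the side the pointer
// ended on and the viewport size, it returns the target window rectangle.
type SnapSide = 'left' | 'right'

interface SnapRect {
  x: number
  y: number
  width: number
  height: number
}

function snapRectFor(
  side: SnapSide,
  viewportWidth: number,
  viewportHeight: number,
): SnapRect {
  const width = viewportWidth / 2
  return {
    x: side === 'left' ? 0 : width, // the right half starts at the horizontal midpoint
    y: 0,
    width,
    height: viewportHeight,
  }
}

// Example: snapRectFor('right', 1920, 1080) -> { x: 960, y: 0, width: 960, height: 1080 }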
// Area selection handlers
@ -569,27 +563,12 @@ const onSwiperInit = (swiper: SwiperType) => {
}
const onSlideChange = (swiper: SwiperType) => {
  workspaceStore.switchToWorkspace(swiper.activeIndex)
}
const onSlideChange = (swiper: SwiperType) => {
  workspaceStore.switchToWorkspace(
    workspaceStore.workspaces.at(swiper.activeIndex)?.id,
  )
}
// Workspace control handlers
/* const handleRemoveWorkspace = async () => {
const handleAddWorkspace = async () => {
await workspaceStore.addWorkspaceAsync()
// Swiper will auto-slide to new workspace because we switch in addWorkspaceAsync
nextTick(() => {
if (swiperInstance.value) {
swiperInstance.value.slideTo(workspaces.value.length - 1)
}
})
}
const handleSwitchToWorkspace = (index: number) => {
if (swiperInstance.value) {
swiperInstance.value.slideTo(index)
}
}
const handleRemoveWorkspace = async () => {
  if (!currentWorkspace.value || workspaces.value.length <= 1) return
  const currentIndex = currentWorkspaceIndex.value
@ -604,13 +583,6 @@ const handleRemoveWorkspace = async () => {
  })
}
// Drawer handlers
const handleSwitchToWorkspaceFromDrawer = (index: number) => {
handleSwitchToWorkspace(index)
// Close drawer after switch
isOverviewMode.value = false
}
const handleDropWindowOnWorkspace = async (
  event: DragEvent,
  targetWorkspaceId: string,
@ -620,116 +592,65 @@ const handleDropWindowOnWorkspace = async (
  if (windowId) {
    await moveWindowToWorkspace(windowId, targetWorkspaceId)
  }
}
} */
// Overview Mode: Calculate grid positions and scale for windows
const getOverviewWindowGridStyle = (index: number, totalWindows: number) => {
  if (!viewportWidth.value || !viewportHeight.value) {
    return {}
  }
  // Determine grid layout based on number of windows
  let cols = 1
  let rows = 1
  if (totalWindows === 1) {
    cols = 1
    rows = 1
  } else if (totalWindows === 2) {
    cols = 2
    rows = 1
  } else if (totalWindows <= 4) {
    cols = 2
    rows = 2
  } else if (totalWindows <= 6) {
    cols = 3
    rows = 2
  } else if (totalWindows <= 9) {
    cols = 3
    rows = 3
  } else {
    cols = 4
    rows = Math.ceil(totalWindows / 4)
  }
  // Calculate grid cell position
  const col = index % cols
  const row = Math.floor(index / cols)
  // Padding and gap
  const padding = 40 // px from viewport edges
  const gap = 30 // px between windows
  // Available space
  const availableWidth = viewportWidth.value - padding * 2 - gap * (cols - 1)
  const availableHeight = viewportHeight.value - padding * 2 - gap * (rows - 1)
  // Cell dimensions
  const cellWidth = availableWidth / cols
  const cellHeight = availableHeight / rows
  // Window aspect ratio (assume 16:9 or use actual window dimensions)
  const windowAspectRatio = 16 / 9
  // Calculate scale to fit window in cell
  const targetWidth = cellWidth
  const targetHeight = cellHeight
  const targetAspect = targetWidth / targetHeight
  let scale = 0.25 // Default scale
  let scaledWidth = 800 * scale
  let scaledHeight = 600 * scale
  if (targetAspect > windowAspectRatio) {
    // Cell is wider than window aspect ratio - fit by height
    scaledHeight = Math.min(targetHeight, 600 * 0.4)
    scale = scaledHeight / 600
    scaledWidth = 800 * scale
  } else {
    // Cell is taller than window aspect ratio - fit by width
    scaledWidth = Math.min(targetWidth, 800 * 0.4)
    scale = scaledWidth / 800
    scaledHeight = 600 * scale
  }
  // Calculate position to center window in cell
  const cellX = padding + col * (cellWidth + gap)
  const cellY = padding + row * (cellHeight + gap)
  // Center window in cell
  const x = cellX + (cellWidth - scaledWidth) / 2
  const y = cellY + (cellHeight - scaledHeight) / 2
  return {
    transform: `scale(${scale})`,
    transformOrigin: 'top left',
    left: `${x / scale}px`,
    top: `${y / scale}px`,
    width: '800px',
    height: '600px',
    zIndex: 91,
    transition: 'all 0.3s ease',
  }
}
// Overview Mode handlers
const handleOverviewWindowClick = (windowId: string) => {
  // Activate the window
  windowManager.activateWindow(windowId)
  // Close overview mode
  isOverviewMode.value = false
}
const handleOverviewWindowDragStart = (event: DragEvent, windowId: string) => {
  if (event.dataTransfer) {
    event.dataTransfer.effectAllowed = 'move'
    event.dataTransfer.setData('windowId', windowId)
  }
}
const handleOverviewWindowDragEnd = () => {
  // Cleanup after drag
}
// Calculate preview dimensions for window overview
const MIN_PREVIEW_WIDTH = 300 // 50% increase from 200
const MAX_PREVIEW_WIDTH = 600 // 50% increase from 400
const MIN_PREVIEW_HEIGHT = 225 // 50% increase from 150
const MAX_PREVIEW_HEIGHT = 450 // 50% increase from 300
// Store window state for overview (position only, size stays original)
const overviewWindowState = ref(
  new Map<
    string,
    { x: number; y: number; width: number; height: number; scale: number }
  >(),
)
// Calculate scale and card dimensions for each window
watch(
  () => windowManager.showWindowOverview,
  (isOpen) => {
    if (isOpen) {
      // Wait for the Overview modal to mount and create the teleport targets
      nextTick(() => {
        windowManager.windows.forEach((window) => {
          const scaleX = MAX_PREVIEW_WIDTH / window.width
          const scaleY = MAX_PREVIEW_HEIGHT / window.height
          const scale = Math.min(scaleX, scaleY, 1)
          // Ensure minimum card size
          const scaledWidth = window.width * scale
          const scaledHeight = window.height * scale
          let finalScale = scale
          if (scaledWidth < MIN_PREVIEW_WIDTH) {
            finalScale = MIN_PREVIEW_WIDTH / window.width
          }
          if (scaledHeight < MIN_PREVIEW_HEIGHT) {
            finalScale = Math.max(
              finalScale,
              MIN_PREVIEW_HEIGHT / window.height,
            )
          }
          overviewWindowState.value.set(window.id, {
            x: 0,
            y: 0,
            width: window.width,
            height: window.height,
            scale: finalScale,
          })
        })
      })
    } else {
      // Clear state when overview is closed
      overviewWindowState.value.clear()
    }
  },
)
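// A standalone sketch (not part of the diff) of the preview scaling rule used
// in the watch above: shrink the window to fit the maximum card size without
// ever upscaling, then raise the scale again if either side would drop below
// the minimum card size. The default limits mirror the constants above.
function previewScale(
  width: number,
  height: number,
  limits = { minW: 300, maxW: 600, minH: 225, maxH: 450 },
): number {
  // Fit within the maximum card, but never enlarge small windows (cap at 1).
  const fit = Math.min(limits.maxW / width, limits.maxH / height, 1)
  let finalScale = fit
  // Enforce the minimum card size on either axis.
  if (width * fit < limits.minW) finalScale = limits.minW / width
  if (height * fit < limits.minH) {
    finalScale = Math.max(finalScale, limits.minH / height)
  }
  return finalScale
}

// Example: previewScale(1600, 900) === 0.375, i.e. a 600 x 337.5 preview card.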
// Disable Swiper in overview mode
watch(isOverviewMode, (newValue) => {
@ -89,7 +89,11 @@ const removeExtensionAsync = async () => {
  }
  try {
    await extensionStore.removeExtensionAsync(extension.id, extension.version)
    await extensionStore.removeExtensionAsync(
      extension.publicKey,
      extension.name,
      extension.version,
    )
    await extensionStore.loadExtensionsAsync()
    add({
@ -15,7 +15,7 @@
<div class="flex items-start gap-4">
<div
v-if="preview?.manifest.icon"
class="w-16 h-16 flex-shrink-0"
class="w-16 h-16 shrink-0"
>
<UIcon
:name="preview.manifest.icon"
@ -184,7 +184,6 @@ const shellPermissions = computed({
},
})
const permissionAccordionItems = computed(() => {
  const items = []
@ -1,32 +1,48 @@
<template> <template>
<UPopover v-model:open="open"> <UDrawer
v-model:open="open"
direction="right"
:title="t('launcher.title')"
:description="t('launcher.description')"
:ui="{
content: 'w-dvw max-w-md sm:max-w-fit',
}"
>
<UButton <UButton
icon="material-symbols:apps" icon="material-symbols:apps"
color="neutral" color="neutral"
variant="outline" variant="outline"
v-bind="$attrs" v-bind="$attrs"
size="xl" size="lg"
/> />
<template #content> <template #content>
<ul class="p-4 max-h-96 grid grid-cols-3 gap-2 overflow-scroll"> <div class="p-4 h-full overflow-y-auto">
<div class="flex flex-wrap">
<!-- All launcher items (system windows + enabled extensions, alphabetically sorted) --> <!-- All launcher items (system windows + enabled extensions, alphabetically sorted) -->
<UiButton <UContextMenu
v-for="item in launcherItems" v-for="item in launcherItems"
:key="item.id" :key="item.id"
:items="getContextMenuItems(item)"
>
<UiButton
square square
size="xl" size="lg"
variant="ghost" variant="ghost"
:ui="{ :ui="{
base: 'size-24 flex flex-wrap text-sm items-center justify-center overflow-visible', base: 'size-24 flex flex-wrap text-sm items-center justify-center overflow-visible cursor-grab active:cursor-grabbing',
leadingIcon: 'size-10', leadingIcon: 'size-10',
label: 'w-full', label: 'w-full',
}" }"
:icon="item.icon" :icon="item.icon"
:label="item.name" :label="item.name"
:tooltip="item.name" :tooltip="item.name"
draggable="true"
@click="openItem(item)" @click="openItem(item)"
@dragstart="handleDragStart($event, item)"
@dragend="handleDragEnd"
/> />
</UContextMenu>
<!-- Disabled Extensions (grayed out) --> <!-- Disabled Extensions (grayed out) -->
<UiButton <UiButton
@ -45,12 +61,31 @@
:label="extension.name" :label="extension.name"
:tooltip="`${extension.name} (${t('disabled')})`" :tooltip="`${extension.name} (${t('disabled')})`"
/> />
</ul> </div>
</div>
</template> </template>
</UPopover> </UDrawer>
<!-- Uninstall Confirmation Dialog -->
<UiDialogConfirm
v-model:open="showUninstallDialog"
:title="t('uninstall.confirm.title')"
:description="
t('uninstall.confirm.description', {
name: extensionToUninstall?.name || '',
})
"
:confirm-label="t('uninstall.confirm.button')"
confirm-icon="i-heroicons-trash"
@confirm="confirmUninstall"
/>
</template>
<script setup lang="ts">
defineOptions({
inheritAttrs: false,
})
const extensionStore = useExtensionsStore()
const windowManagerStore = useWindowManagerStore()
@ -58,6 +93,10 @@ const { t } = useI18n()
const open = ref(false)
// Uninstall dialog state
const showUninstallDialog = ref(false)
const extensionToUninstall = ref<LauncherItem | null>(null)
// Unified launcher item type
interface LauncherItem {
  id: string
@ -119,14 +158,123 @@ const openItem = async (item: LauncherItem) => {
    console.log(error)
  }
}
// Uninstall extension - shows confirmation dialog first
const uninstallExtension = async (item: LauncherItem) => {
extensionToUninstall.value = item
showUninstallDialog.value = true
}
// Confirm uninstall - actually removes the extension
const confirmUninstall = async () => {
if (!extensionToUninstall.value) return
try {
const extension = extensionStore.availableExtensions.find(
(ext) => ext.id === extensionToUninstall.value!.id,
)
if (!extension) return
// Close all windows of this extension first
const extensionWindows = windowManagerStore.windows.filter(
(win) => win.type === 'extension' && win.sourceId === extension.id,
)
for (const win of extensionWindows) {
windowManagerStore.closeWindow(win.id)
}
// Uninstall the extension
await extensionStore.removeExtensionAsync(
extension.publicKey,
extension.name,
extension.version,
)
// Refresh available extensions list
await extensionStore.loadExtensionsAsync()
// Close dialog and reset state
showUninstallDialog.value = false
extensionToUninstall.value = null
} catch (error) {
console.error('Failed to uninstall extension:', error)
}
}
// Get context menu items for launcher item
const getContextMenuItems = (item: LauncherItem) => {
const items = [
{
label: t('contextMenu.open'),
icon: 'i-heroicons-arrow-top-right-on-square',
onSelect: () => openItem(item),
},
]
// Add uninstall option for extensions
if (item.type === 'extension') {
items.push({
label: t('contextMenu.uninstall'),
icon: 'i-heroicons-trash',
onSelect: () => uninstallExtension(item),
})
}
return items
}
// Drag & Drop handling
const handleDragStart = (event: DragEvent, item: LauncherItem) => {
if (!event.dataTransfer) return
// Store the launcher item data
event.dataTransfer.effectAllowed = 'copy'
event.dataTransfer.setData(
'application/haex-launcher-item',
JSON.stringify(item),
)
// Set drag image (optional - uses default if not set)
const dragImage = event.target as HTMLElement
if (dragImage) {
event.dataTransfer.setDragImage(dragImage, 20, 20)
}
}
const handleDragEnd = () => {
// Cleanup if needed
}
</script>
<i18n lang="yaml">
de:
disabled: Deaktiviert
marketplace: Marketplace
launcher:
title: App Launcher
description: Wähle eine App zum Öffnen
contextMenu:
open: Öffnen
uninstall: Deinstallieren
uninstall:
confirm:
title: Erweiterung deinstallieren
description: Möchtest du wirklich "{name}" deinstallieren? Diese Aktion kann nicht rückgängig gemacht werden.
button: Deinstallieren
en:
disabled: Disabled
marketplace: Marketplace
launcher:
title: App Launcher
description: Select an app to open
contextMenu:
open: Open
uninstall: Uninstall
uninstall:
confirm:
title: Uninstall Extension
description: Do you really want to uninstall "{name}"? This action cannot be undone.
button: Uninstall
</i18n>
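// A standalone sketch (not part of the diff) of the launcher context-menu
// builder shown above, assuming only the minimal item shape (label / icon /
// onSelect) that the diff itself uses; MenuItem, LauncherEntry and the handler
// parameters are illustrative names, not repository APIs.
interface MenuItem {
  label: string
  icon: string
  onSelect: () => void
}

interface LauncherEntry {
  id: string
  name: string
  type: 'system' | 'extension'
}

function buildContextMenu(
  item: LauncherEntry,
  openItem: (item: LauncherEntry) => void,
  uninstallExtension: (item: LauncherEntry) => void,
): MenuItem[] {
  const items: MenuItem[] = [
    {
      label: 'Open',
      icon: 'i-heroicons-arrow-top-right-on-square',
      onSelect: () => openItem(item),
    },
  ]
  // Only extensions get an uninstall entry; system windows cannot be removed.
  if (item.type === 'extension') {
    items.push({
      label: 'Uninstall',
      icon: 'i-heroicons-trash',
      onSelect: () => uninstallExtension(item),
    })
  }
  return items
}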
@ -15,7 +15,7 @@
type="checkbox"
class="checkbox"
:checked="Object.values(read).at(0)"
/>
>
<label
class="label-text text-base"
:for="Object.keys(read).at(0)"
@ -42,7 +42,7 @@
type="checkbox"
class="checkbox"
:checked="Object.values(write).at(0)"
/>
>
<label
class="label-text text-base"
:for="Object.keys(write).at(0)"
@ -69,7 +69,7 @@
type="checkbox"
class="checkbox"
:checked="Object.values(create).at(0)"
/>
>
<label
class="label-text text-base"
:for="Object.keys(create).at(0)"
@ -14,7 +14,7 @@
type="checkbox"
class="checkbox"
:checked="Object.values(read).at(0)"
/>
>
<label
class="label-text text-base"
:for="Object.keys(read).at(0)"
@ -41,7 +41,7 @@
type="checkbox"
class="checkbox"
:checked="Object.values(write).at(0)"
/>
>
<label
class="label-text text-base"
:for="Object.keys(write).at(0)"
@ -15,7 +15,7 @@
type="checkbox"
class="checkbox"
:checked="Object.values(access).at(0)"
/>
>
<label
class="label-text text-base"
:for="Object.keys(access).at(0)"
@ -8,7 +8,7 @@
> >
<div class="flex items-start gap-4"> <div class="flex items-start gap-4">
<!-- Icon --> <!-- Icon -->
<div class="flex-shrink-0"> <div class="shrink-0">
<div <div
v-if="extension.icon" v-if="extension.icon"
class="w-16 h-16 rounded-lg bg-primary/10 flex items-center justify-center" class="w-16 h-16 rounded-lg bg-primary/10 flex items-center justify-center"
@ -52,7 +52,7 @@
<p <p
v-if="extension.description" v-if="extension.description"
class="text-sm text-gray-600 dark:text-gray-300 mt-2 line-clamp-2" class="hidden @lg:flex text-sm text-gray-600 dark:text-gray-300 mt-2 line-clamp-2"
> >
{{ extension.description }} {{ extension.description }}
</p> </p>
@ -67,7 +67,9 @@
> >
<UIcon name="i-heroicons-check-circle-solid" /> <UIcon name="i-heroicons-check-circle-solid" />
<span v-if="!extension.installedVersion">{{ t('installed') }}</span> <span v-if="!extension.installedVersion">{{ t('installed') }}</span>
<span v-else>{{ t('installedVersion', { version: extension.installedVersion }) }}</span> <span v-else>{{
t('installedVersion', { version: extension.installedVersion })
}}</span>
</div> </div>
<div <div
v-if="extension.downloads" v-if="extension.downloads"
@ -114,10 +116,16 @@
<div class="flex items-center justify-between gap-2"> <div class="flex items-center justify-between gap-2">
<UButton <UButton
:label="getInstallButtonLabel()" :label="getInstallButtonLabel()"
:color="extension.isInstalled && !extension.installedVersion ? 'neutral' : 'primary'" :color="
extension.isInstalled && !extension.installedVersion
? 'neutral'
: 'primary'
"
:disabled="extension.isInstalled && !extension.installedVersion" :disabled="extension.isInstalled && !extension.installedVersion"
:icon=" :icon="
extension.isInstalled && !extension.installedVersion ? 'i-heroicons-check' : 'i-heroicons-arrow-down-tray' extension.isInstalled && !extension.installedVersion
? 'i-heroicons-check'
: 'i-heroicons-arrow-down-tray'
" "
size="sm" size="sm"
@click.stop="$emit('install')" @click.stop="$emit('install')"
@ -1,5 +1,5 @@
<template> <template>
<div class="p-4 max-w-4xl mx-auto space-y-6"> <div class="p-4 mx-auto space-y-6 bg-default/90 backdrop-blur-2xl">
<div class="space-y-2"> <div class="space-y-2">
<h1 class="text-2xl font-bold">{{ t('title') }}</h1> <h1 class="text-2xl font-bold">{{ t('title') }}</h1>
<p class="text-sm opacity-70">{{ t('description') }}</p> <p class="text-sm opacity-70">{{ t('description') }}</p>
@ -85,28 +85,16 @@
<script setup lang="ts"> <script setup lang="ts">
import { invoke } from '@tauri-apps/api/core' import { invoke } from '@tauri-apps/api/core'
import { open } from '@tauri-apps/plugin-dialog' import { open } from '@tauri-apps/plugin-dialog'
import type { ExtensionInfoResponse } from '~~/src-tauri/bindings/ExtensionInfoResponse'
definePageMeta({
name: 'settings-developer',
})
const { t } = useI18n() const { t } = useI18n()
const { add } = useToast() const { add } = useToast()
const { loadExtensionsAsync } = useExtensionsStore() const { loadExtensionsAsync } = useExtensionsStore()
// State // State
const extensionPath = ref('') const extensionPath = ref('')
const isLoading = ref(false) const isLoading = ref(false)
const devExtensions = ref< const devExtensions = ref<Array<ExtensionInfoResponse>>([])
Array<{
id: string
publicKey: string
name: string
version: string
enabled: boolean
}>
>([])
// Load dev extensions on mount // Load dev extensions on mount
onMounted(async () => { onMounted(async () => {
@ -140,7 +128,7 @@ const loadDevExtensionAsync = async () => {
isLoading.value = true isLoading.value = true
try { try {
const extensionId = await invoke<string>('load_dev_extension', { await invoke<string>('load_dev_extension', {
extensionPath: extensionPath.value, extensionPath: extensionPath.value,
}) })
@ -157,10 +145,10 @@ const loadDevExtensionAsync = async () => {
// Clear input // Clear input
extensionPath.value = '' extensionPath.value = ''
} catch (error: any) { } catch (error) {
console.error('Failed to load dev extension:', error) console.error('Failed to load dev extension:', error)
add({ add({
description: error || t('add.errors.loadFailed'), description: t('add.errors.loadFailed') + error,
color: 'error', color: 'error',
}) })
} finally { } finally {
@ -171,7 +159,9 @@ const loadDevExtensionAsync = async () => {
// Load all dev extensions (for the list on this page) // Load all dev extensions (for the list on this page)
const loadDevExtensionListAsync = async () => { const loadDevExtensionListAsync = async () => {
try { try {
const extensions = await invoke<Array<any>>('get_all_dev_extensions') const extensions = await invoke<Array<ExtensionInfoResponse>>(
'get_all_dev_extensions',
)
devExtensions.value = extensions devExtensions.value = extensions
} catch (error) { } catch (error) {
console.error('Failed to load dev extensions:', error) console.error('Failed to load dev extensions:', error)
@ -179,29 +169,30 @@ const loadDevExtensionListAsync = async () => {
} }
// Reload a dev extension (removes and re-adds) // Reload a dev extension (removes and re-adds)
const reloadDevExtensionAsync = async (ext: any) => { const reloadDevExtensionAsync = async (extension: ExtensionInfoResponse) => {
try { try {
console.log('reloadDevExtensionAsync', extension)
// Get the extension path from somewhere (we need to store this) // Get the extension path from somewhere (we need to store this)
// For now, just show a message // For now, just show a message
add({ add({
description: t('list.reloadInfo'), description: t('list.reloadInfo'),
color: 'info', color: 'info',
}) })
} catch (error: any) { } catch (error) {
console.error('Failed to reload dev extension:', error) console.error('Failed to reload dev extension:', error)
add({ add({
description: error || t('list.errors.reloadFailed'), description: t('list.errors.reloadFailed') + error,
color: 'error', color: 'error',
}) })
} }
} }
// Remove a dev extension // Remove a dev extension
const removeDevExtensionAsync = async (ext: any) => { const removeDevExtensionAsync = async (extension: ExtensionInfoResponse) => {
try { try {
await invoke('remove_dev_extension', { await invoke('remove_dev_extension', {
publicKey: ext.publicKey, publicKey: extension.publicKey,
name: ext.name, name: extension.name,
}) })
add({ add({
@ -214,10 +205,10 @@ const removeDevExtensionAsync = async (ext: any) => {
// Reload all extensions store // Reload all extensions store
await loadExtensionsAsync() await loadExtensionsAsync()
} catch (error: any) { } catch (error) {
console.error('Failed to remove dev extension:', error) console.error('Failed to remove dev extension:', error)
add({ add({
description: error || t('list.errors.removeFailed'), description: t('list.errors.removeFailed') + error,
color: 'error', color: 'error',
}) })
} }
@ -1,145 +1,607 @@
<template> <template>
<div class="w-full h-full flex flex-col bg-white dark:bg-gray-900"> <div class="flex flex-col h-full bg-default">
<!-- Marketplace Header --> <!-- Header with Actions -->
<div <div
class="flex-shrink-0 border-b border-gray-200 dark:border-gray-700 p-6" class="flex flex-col @lg:flex-row @lg:items-center justify-between gap-4 p-6 border-b border-gray-200 dark:border-gray-800"
> >
<h1 class="text-2xl font-bold text-gray-900 dark:text-white"> <div>
Extension Marketplace <h1 class="text-2xl font-bold">
{{ t('title') }}
</h1> </h1>
<p class="text-sm text-gray-600 dark:text-gray-400 mt-1"> <p class="text-sm text-gray-500 dark:text-gray-400 mt-1">
Discover and install extensions for HaexHub {{ t('subtitle') }}
</p> </p>
</div> </div>
<!-- Search Bar -->
<div <div
class="flex-shrink-0 border-b border-gray-200 dark:border-gray-700 p-4" class="flex flex-col @lg:flex-row items-stretch @lg:items-center gap-3"
>
<!-- Marketplace Selector -->
<USelectMenu
v-model="selectedMarketplace"
:items="marketplaces"
value-key="id"
class="w-full @lg:w-48"
>
<template #leading>
<UIcon name="i-heroicons-building-storefront" />
</template>
</USelectMenu>
<!-- Install from File Button -->
<UiButton
:label="t('extension.installFromFile')"
icon="i-heroicons-arrow-up-tray"
color="neutral"
block
@click="onSelectExtensionAsync"
/>
</div>
</div>
<!-- Search and Filters -->
<div
class="flex flex-col @lg:flex-row items-stretch @lg:items-center gap-4 p-6 border-b border-gray-200 dark:border-gray-800"
> >
<UInput <UInput
v-model="searchQuery"
:placeholder="t('search.placeholder')"
icon="i-heroicons-magnifying-glass" icon="i-heroicons-magnifying-glass"
size="lg" class="flex-1"
placeholder="Search extensions..." />
class="w-full" <USelectMenu
v-model="selectedCategory"
:items="categories"
:placeholder="t('filter.category')"
value-key="id"
class="w-full @lg:w-48"
>
<template #leading>
<UIcon name="i-heroicons-tag" />
</template>
</USelectMenu>
</div>
<!-- Extensions Grid -->
<div class="flex-1 overflow-auto p-6">
<div
v-if="filteredExtensions.length"
class="grid grid-cols-1 @md:grid-cols-2 @2xl:grid-cols-3 gap-4"
>
<!-- Marketplace Extension Card -->
<HaexExtensionMarketplaceCard
v-for="ext in filteredExtensions"
:key="ext.id"
:extension="ext"
@install="onInstallFromMarketplace(ext)"
@details="onShowExtensionDetails(ext)"
/> />
</div> </div>
<!-- Marketplace Content --> <!-- Empty State -->
<div class="flex-1 overflow-y-auto p-6">
<div class="max-w-4xl space-y-6">
<!-- Featured Extensions -->
<section>
<h2 class="text-lg font-semibold text-gray-900 dark:text-white mb-4">
Featured Extensions
</h2>
<div class="grid grid-cols-1 md:grid-cols-2 gap-4">
<!-- Example Extension Card -->
<div <div
class="bg-gray-50 dark:bg-gray-800 rounded-lg p-4 hover:shadow-lg transition-shadow cursor-pointer" v-else
> class="flex flex-col items-center justify-center h-full text-center"
<div class="flex items-start gap-4">
<div
class="w-12 h-12 rounded-lg bg-primary-500 flex items-center justify-center flex-shrink-0"
> >
<UIcon <UIcon
name="i-heroicons-puzzle-piece" name="i-heroicons-magnifying-glass"
class="w-6 h-6 text-white" class="w-16 h-16 text-gray-400 mb-4"
/> />
</div> <h3 class="text-lg font-semibold text-gray-900 dark:text-white">
<div class="flex-1 min-w-0"> {{ t('empty.title') }}
<h3
class="font-semibold text-gray-900 dark:text-white truncate"
>
Example Extension
</h3> </h3>
<p <p class="text-gray-500 dark:text-gray-400 mt-2">
class="text-sm text-gray-600 dark:text-gray-400 line-clamp-2 mt-1" {{ t('empty.description') }}
>
A powerful extension for HaexHub
</p>
<div class="flex items-center gap-2 mt-2">
<span class="text-xs text-gray-500 dark:text-gray-500"
>v1.0.0</span
>
<span class="text-xs text-gray-500 dark:text-gray-500"
></span
>
<span class="text-xs text-gray-500 dark:text-gray-500"
>1.2k downloads</span
>
</div>
</div>
<UButton
label="Install"
size="sm"
color="primary"
/>
</div>
</div>
<!-- Placeholder for more extensions -->
<div class="bg-gray-50 dark:bg-gray-800 rounded-lg p-4 opacity-50">
<div class="flex items-start gap-4">
<div
class="w-12 h-12 rounded-lg bg-gray-400 flex items-center justify-center flex-shrink-0"
>
<UIcon
name="i-heroicons-puzzle-piece"
class="w-6 h-6 text-white"
/>
</div>
<div class="flex-1 min-w-0">
<h3
class="font-semibold text-gray-900 dark:text-white truncate"
>
More extensions coming soon...
</h3>
<p class="text-sm text-gray-600 dark:text-gray-400 mt-1">
Check back later for more extensions
</p> </p>
</div> </div>
</div> </div>
</div>
</div>
</section>
<!-- Categories --> <HaexExtensionDialogReinstall
<section> v-model:open="openOverwriteDialog"
<h2 class="text-lg font-semibold text-gray-900 dark:text-white mb-4"> v-model:preview="preview"
Categories @confirm="reinstallExtensionAsync"
</h2>
<div class="flex flex-wrap gap-2">
<UBadge
label="Productivity"
color="primary"
variant="soft"
size="lg"
/> />
<UBadge
label="Development" <HaexExtensionDialogInstall
color="secondary" v-model:open="showConfirmation"
variant="soft" :preview="preview"
size="lg" @confirm="addExtensionAsync"
/> />
<UBadge
label="Security" <HaexExtensionDialogRemove
color="error" v-model:open="showRemoveDialog"
variant="soft" :extension="extensionToBeRemoved"
size="lg" @confirm="removeExtensionAsync"
/> />
<UBadge
label="Utilities"
color="secondary"
variant="soft"
size="lg"
/>
</div>
</section>
</div>
</div>
</div>
</template>
<script setup lang="ts">
// Marketplace component - placeholder implementation
import type {
IHaexHubExtension,
IHaexHubExtensionManifest,
IMarketplaceExtension,
} from '~/types/haexhub'
import { open } from '@tauri-apps/plugin-dialog'
import type { ExtensionPreview } from '~~/src-tauri/bindings/ExtensionPreview'
const { t } = useI18n()
const extensionStore = useExtensionsStore()
const showConfirmation = ref(false)
const openOverwriteDialog = ref(false)
const extension = reactive<{
manifest: IHaexHubExtensionManifest | null | undefined
path: string | null
}>({
manifest: null,
path: '',
})
/* const loadExtensionManifestAsync = async () => {
try {
extension.path = await open({ directory: true, recursive: true })
if (!extension.path) return
const manifestFile = JSON.parse(
await readTextFile(await join(extension.path, 'manifest.json')),
)
if (!extensionStore.checkManifest(manifestFile))
throw new Error(`Manifest fehlerhaft ${JSON.stringify(manifestFile)}`)
return manifestFile
} catch (error) {
console.error('Fehler loadExtensionManifestAsync:', error)
add({ color: 'error', description: JSON.stringify(error) })
await addNotificationAsync({ text: JSON.stringify(error), type: 'error' })
}
} */
const { add } = useToast()
const { addNotificationAsync } = useNotificationStore()
const preview = ref<ExtensionPreview>()
// Marketplace State
const selectedMarketplace = ref('official')
const searchQuery = ref('')
const selectedCategory = ref('all')
// Marketplaces (load from the API later)
const marketplaces = [
{
id: 'official',
label: t('marketplace.official'),
icon: 'i-heroicons-building-storefront',
},
{
id: 'community',
label: t('marketplace.community'),
icon: 'i-heroicons-users',
},
]
// Categories
const categories = computed(() => [
{ id: 'all', label: t('category.all') },
{ id: 'productivity', label: t('category.productivity') },
{ id: 'security', label: t('category.security') },
{ id: 'utilities', label: t('category.utilities') },
{ id: 'integration', label: t('category.integration') },
])
// Dummy marketplace extensions (load from the API later)
const marketplaceExtensions = ref<IMarketplaceExtension[]>([
{
id: 'haex-passy',
name: 'HaexPassDummy',
version: '1.0.0',
author: 'HaexHub Team',
public_key:
'a1b2c3d4e5f6a7b8c9d0e1f2a3b4c5d6e7f8a9b0c1d2e3f4a5b6c7d8e9f0a1b2',
description:
'Sicherer Passwort-Manager mit Ende-zu-Ende-Verschlüsselung und Autofill-Funktion.',
icon: 'i-heroicons-lock-closed',
homepage: null,
downloads: 15420,
rating: 4.8,
verified: true,
tags: ['security', 'password', 'productivity'],
category: 'security',
downloadUrl: '/extensions/haex-pass-1.0.0.haextension',
isInstalled: false,
},
{
id: 'haex-notes',
name: 'HaexNotes',
version: '2.1.0',
author: 'HaexHub Team',
public_key:
'b2c3d4e5f6a7b8c9d0e1f2a3b4c5d6e7f8a9b0c1d2e3f4a5b6c7d8e9f0a1b2c3',
description:
'Markdown-basierter Notizen-Editor mit Syntax-Highlighting und Live-Preview.',
icon: 'i-heroicons-document-text',
homepage: null,
downloads: 8930,
rating: 4.5,
verified: true,
tags: ['productivity', 'notes', 'markdown'],
category: 'productivity',
downloadUrl: '/extensions/haex-notes-2.1.0.haextension',
isInstalled: false,
},
{
id: 'haex-backup',
name: 'HaexBackup',
version: '1.5.2',
author: 'Community',
public_key:
'c3d4e5f6a7b8c9d0e1f2a3b4c5d6e7f8a9b0c1d2e3f4a5b6c7d8e9f0a1b2c3d4',
description:
'Automatische Backups deiner Daten mit Cloud-Sync-Unterstützung.',
icon: 'i-heroicons-cloud-arrow-up',
homepage: null,
downloads: 5240,
rating: 4.6,
verified: false,
tags: ['backup', 'cloud', 'utilities'],
category: 'utilities',
downloadUrl: '/extensions/haex-backup-1.5.2.haextension',
isInstalled: false,
},
{
id: 'haex-calendar',
name: 'HaexCalendar',
version: '3.0.1',
author: 'HaexHub Team',
public_key:
'd4e5f6a7b8c9d0e1f2a3b4c5d6e7f8a9b0c1d2e3f4a5b6c7d8e9f0a1b2c3d4e5',
description:
'Integrierter Kalender mit Event-Management und Synchronisation.',
icon: 'i-heroicons-calendar',
homepage: null,
downloads: 12100,
rating: 4.7,
verified: true,
tags: ['productivity', 'calendar', 'events'],
category: 'productivity',
downloadUrl: '/extensions/haex-calendar-3.0.1.haextension',
isInstalled: false,
},
{
id: 'haex-2fa',
name: 'Haex2FA',
version: '1.2.0',
author: 'Security Team',
public_key:
'e5f6a7b8c9d0e1f2a3b4c5d6e7f8a9b0c1d2e3f4a5b6c7d8e9f0a1b2c3d4e5f6',
description:
'2-Faktor-Authentifizierung Manager mit TOTP und Backup-Codes.',
icon: 'i-heroicons-shield-check',
homepage: null,
downloads: 7800,
rating: 4.9,
verified: true,
tags: ['security', '2fa', 'authentication'],
category: 'security',
downloadUrl: '/extensions/haex-2fa-1.2.0.haextension',
isInstalled: false,
},
{
id: 'haex-github',
name: 'GitHub Integration',
version: '1.0.5',
author: 'Community',
public_key:
'f6a7b8c9d0e1f2a3b4c5d6e7f8a9b0c1d2e3f4a5b6c7d8e9f0a1b2c3d4e5f6a7',
description:
'Direkter Zugriff auf GitHub Repositories, Issues und Pull Requests.',
icon: 'i-heroicons-code-bracket',
homepage: null,
downloads: 4120,
rating: 4.3,
verified: false,
tags: ['integration', 'github', 'development'],
category: 'integration',
downloadUrl: '/extensions/haex-github-1.0.5.haextension',
isInstalled: false,
},
])
// Mark marketplace extensions as installed if they exist in availableExtensions
const allExtensions = computed((): IMarketplaceExtension[] => {
return marketplaceExtensions.value.map((ext) => {
// Extensions are uniquely identified by public_key + name
const installedExt = extensionStore.availableExtensions.find(
(installed) => {
return (
installed.publicKey === ext.publicKey && installed.name === ext.name
)
},
)
if (installedExt) {
return {
...ext,
isInstalled: true,
// Show installed version if it differs from marketplace version
installedVersion:
installedExt.version !== ext.version
? installedExt.version
: undefined,
}
}
return {
...ext,
isInstalled: false,
installedVersion: undefined,
}
})
})
// Filtered Extensions
const filteredExtensions = computed(() => {
return allExtensions.value.filter((ext) => {
const matchesSearch =
!searchQuery.value ||
ext.name.toLowerCase().includes(searchQuery.value.toLowerCase()) ||
ext.description?.toLowerCase().includes(searchQuery.value.toLowerCase())
const matchesCategory =
selectedCategory.value === 'all' ||
ext.category === selectedCategory.value
return matchesSearch && matchesCategory
})
})
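// A standalone sketch (not part of the diff) of the filtering rule implemented
// by filteredExtensions above: a case-insensitive match against name or
// description, combined with an exact category match ('all' disables the
// category filter). MarketplaceEntry is an illustrative, trimmed-down type.
interface MarketplaceEntry {
  name: string
  description?: string | null
  category: string
}

function matchesFilters(
  ext: MarketplaceEntry,
  searchQuery: string,
  selectedCategory: string,
): boolean {
  const query = searchQuery.toLowerCase()
  const matchesSearch =
    !searchQuery ||
    ext.name.toLowerCase().includes(query) ||
    (ext.description?.toLowerCase().includes(query) ?? false)
  const matchesCategory =
    selectedCategory === 'all' || ext.category === selectedCategory
  return matchesSearch && matchesCategory
}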
// Install from marketplace
const onInstallFromMarketplace = async (ext: unknown) => {
console.log('Install from marketplace:', ext)
// TODO: Download extension from marketplace and install
add({ color: 'info', description: t('extension.marketplace.comingSoon') })
}
// Show extension details
const onShowExtensionDetails = (ext: unknown) => {
console.log('Show details:', ext)
// TODO: Show extension details modal
}
const onSelectExtensionAsync = async () => {
try {
extension.path = await open({ directory: false, recursive: true })
if (!extension.path) return
preview.value = await extensionStore.previewManifestAsync(extension.path)
if (!preview.value?.manifest) return
// Check if already installed using public_key + name
const isAlreadyInstalled = extensionStore.availableExtensions.some(
(ext) =>
ext.publicKey === preview.value!.manifest.public_key &&
ext.name === preview.value!.manifest.name,
)
if (isAlreadyInstalled) {
openOverwriteDialog.value = true
} else {
showConfirmation.value = true
}
} catch (error) {
add({ color: 'error', description: JSON.stringify(error) })
await addNotificationAsync({ text: JSON.stringify(error), type: 'error' })
}
}
const addExtensionAsync = async () => {
try {
console.log(
'preview.value?.editable_permissions',
preview.value?.editable_permissions,
)
await extensionStore.installAsync(
extension.path,
preview.value?.editable_permissions,
)
await extensionStore.loadExtensionsAsync()
add({
color: 'success',
title: t('extension.success.title', {
extension: extension.manifest?.name,
}),
description: t('extension.success.text'),
})
await addNotificationAsync({
text: t('extension.success.text'),
type: 'success',
title: t('extension.success.title', {
extension: extension.manifest?.name,
}),
})
} catch (error) {
console.error('Fehler addExtensionAsync:', error)
add({ color: 'error', description: JSON.stringify(error) })
await addNotificationAsync({ text: JSON.stringify(error), type: 'error' })
}
}
const reinstallExtensionAsync = async () => {
try {
if (!preview.value?.manifest) return
// Find the installed extension to get its current version
const installedExt = extensionStore.availableExtensions.find(
(ext) =>
ext.publicKey === preview.value!.manifest.public_key &&
ext.name === preview.value!.manifest.name,
)
if (installedExt) {
// Remove old extension first
await extensionStore.removeExtensionAsync(
installedExt.publicKey,
installedExt.name,
installedExt.version,
)
}
// Then install new version
await addExtensionAsync()
} catch (error) {
console.error('Fehler reinstallExtensionAsync:', error)
add({ color: 'error', description: JSON.stringify(error) })
await addNotificationAsync({ text: JSON.stringify(error), type: 'error' })
}
}
const extensionToBeRemoved = ref<IHaexHubExtension>()
const showRemoveDialog = ref(false)
// Load extensions on mount
onMounted(async () => {
try {
await extensionStore.loadExtensionsAsync()
console.log('Loaded extensions:', extensionStore.availableExtensions)
} catch (error) {
console.error('Failed to load extensions:', error)
add({ color: 'error', description: 'Failed to load installed extensions' })
}
})
/* const onShowRemoveDialog = (extension: IHaexHubExtension) => {
extensionToBeRemoved.value = extension
showRemoveDialog.value = true
} */
const removeExtensionAsync = async () => {
if (
!extensionToBeRemoved.value?.publicKey ||
!extensionToBeRemoved.value?.name ||
!extensionToBeRemoved.value?.version
) {
add({
color: 'error',
description: 'Erweiterung kann nicht gelöscht werden',
})
return
}
try {
await extensionStore.removeExtensionAsync(
extensionToBeRemoved.value.publicKey,
extensionToBeRemoved.value.name,
extensionToBeRemoved.value.version,
)
await extensionStore.loadExtensionsAsync()
add({
color: 'success',
title: t('extension.remove.success.title', {
extensionName: extensionToBeRemoved.value.name,
}),
description: t('extension.remove.success.text', {
extensionName: extensionToBeRemoved.value.name,
}),
})
await addNotificationAsync({
text: t('extension.remove.success.text', {
extensionName: extensionToBeRemoved.value.name,
}),
type: 'success',
title: t('extension.remove.success.title', {
extensionName: extensionToBeRemoved.value.name,
}),
})
} catch (error) {
add({
color: 'error',
title: t('extension.remove.error.title'),
description: t('extension.remove.error.text', {
error: JSON.stringify(error),
}),
})
await addNotificationAsync({
type: 'error',
title: t('extension.remove.error.title'),
text: t('extension.remove.error.text', { error: JSON.stringify(error) }),
})
}
}
</script>
<i18n lang="yaml">
de:
title: Erweiterungen
subtitle: Entdecke und installiere Erweiterungen für HaexHub
extension:
installFromFile: Von Datei installieren
add: Erweiterung hinzufügen
success:
title: '{extension} hinzugefügt'
text: Die Erweiterung wurde erfolgreich hinzugefügt
remove:
success:
text: 'Erweiterung {extensionName} wurde erfolgreich entfernt'
title: '{extensionName} entfernt'
error:
text: "Erweiterung {extensionName} konnte nicht entfernt werden. \n {error}"
title: 'Fehler beim Entfernen von {extensionName}'
marketplace:
comingSoon: Marketplace-Installation kommt bald!
marketplace:
official: Offizieller Marketplace
community: Community Marketplace
category:
all: Alle
productivity: Produktivität
security: Sicherheit
utilities: Werkzeuge
integration: Integration
search:
placeholder: Erweiterungen durchsuchen...
filter:
category: Kategorie auswählen
empty:
title: Keine Erweiterungen gefunden
description: Versuche einen anderen Suchbegriff oder eine andere Kategorie
en:
title: Extensions
subtitle: Discover and install extensions for HaexHub
extension:
installFromFile: Install from file
add: Add Extension
success:
title: '{extension} added'
text: Extension was added successfully
remove:
success:
text: 'Extension {extensionName} was removed'
title: '{extensionName} removed'
error:
text: "Extension {extensionName} couldn't be removed. \n {error}"
title: 'Exception during uninstall {extensionName}'
marketplace:
comingSoon: Marketplace installation coming soon!
marketplace:
official: Official Marketplace
community: Community Marketplace
category:
all: All
productivity: Productivity
security: Security
utilities: Utilities
integration: Integration
search:
placeholder: Search extensions...
filter:
category: Select category
empty:
title: No extensions found
description: Try a different search term or category
</i18n>
@ -1,101 +1,132 @@
<template> <template>
<div class="w-full h-full flex flex-col bg-white dark:bg-gray-900"> <div class="w-full h-full bg-default">
<!-- Settings Header --> <div class="grid grid-cols-2 p-2">
<div <div class="p-2">{{ t('language') }}</div>
class="flex-shrink-0 border-b border-gray-200 dark:border-gray-700 p-6" <div><UiDropdownLocale @select="onSelectLocaleAsync" /></div>
>
<h1 class="text-2xl font-bold text-gray-900 dark:text-white">Settings</h1>
<p class="text-sm text-gray-600 dark:text-gray-400 mt-1">
Manage your HaexHub preferences and configuration
</p>
</div>
<!-- Settings Content --> <div class="p-2">{{ t('design') }}</div>
<div class="flex-1 overflow-y-auto p-6"> <div><UiDropdownTheme @select="onSelectThemeAsync" /></div>
<div class="max-w-2xl space-y-6">
<!-- General Section --> <div class="p-2">{{ t('vaultName.label') }}</div>
<section>
<h2 class="text-lg font-semibold text-gray-900 dark:text-white mb-4">
General
</h2>
<div class="space-y-4 bg-gray-50 dark:bg-gray-800 rounded-lg p-4">
<div class="flex items-center justify-between">
<div> <div>
<p class="font-medium text-gray-900 dark:text-white">Theme</p> <UiInput
<p class="text-sm text-gray-600 dark:text-gray-400"> v-model="currentVaultName"
Choose your preferred theme :placeholder="t('vaultName.label')"
</p> @change="onSetVaultNameAsync"
</div>
<UButton
label="Auto"
variant="outline"
/> />
</div> </div>
<div class="flex items-center justify-between">
<div class="p-2">{{ t('notifications.label') }}</div>
<div> <div>
<p class="font-medium text-gray-900 dark:text-white"> <UiButton
Language :label="t('notifications.requestPermission')"
</p> @click="requestNotificationPermissionAsync"
<p class="text-sm text-gray-600 dark:text-gray-400">
Select your language
</p>
</div>
<UButton
label="English"
variant="outline"
/> />
</div> </div>
</div>
</section>
<!-- Privacy Section --> <div class="p-2">{{ t('deviceName.label') }}</div>
<section>
<h2 class="text-lg font-semibold text-gray-900 dark:text-white mb-4">
Privacy & Security
</h2>
<div class="space-y-4 bg-gray-50 dark:bg-gray-800 rounded-lg p-4">
<div class="flex items-center justify-between">
<div> <div>
<p class="font-medium text-gray-900 dark:text-white"> <UiInput
Auto-lock v-model="deviceName"
</p> :placeholder="t('deviceName.label')"
<p class="text-sm text-gray-600 dark:text-gray-400"> @change="onUpdateDeviceNameAsync"
Lock vault after inactivity />
</p>
</div> </div>
</div>
</div>
</section>
<!-- About Section --> <div class="h-full"/>
<section>
<h2 class="text-lg font-semibold text-gray-900 dark:text-white mb-4">
About
</h2>
<div class="space-y-2 bg-gray-50 dark:bg-gray-800 rounded-lg p-4">
<div class="flex justify-between">
<span class="text-sm text-gray-600 dark:text-gray-400"
>Version</span
>
<span class="text-sm font-medium text-gray-900 dark:text-white"
>0.1.0</span
>
</div>
<div class="flex justify-between">
<span class="text-sm text-gray-600 dark:text-gray-400"
>Platform</span
>
<span class="text-sm font-medium text-gray-900 dark:text-white"
>Tauri + Vue</span
>
</div>
</div>
</section>
</div>
</div> </div>
</div>
</template>
<script setup lang="ts">
// Settings component - placeholder implementation
import type { Locale } from 'vue-i18n'
const { t, setLocale } = useI18n()
const { currentVaultName } = storeToRefs(useVaultStore())
const { updateVaultNameAsync, updateLocaleAsync, updateThemeAsync } =
useVaultSettingsStore()
const onSelectLocaleAsync = async (locale: Locale) => {
await updateLocaleAsync(locale)
await setLocale(locale)
}
const { currentThemeName } = storeToRefs(useUiStore())
const onSelectThemeAsync = async (theme: string) => {
currentThemeName.value = theme
console.log('onSelectThemeAsync', currentThemeName.value)
await updateThemeAsync(theme)
}
const { add } = useToast()
const onSetVaultNameAsync = async () => {
try {
await updateVaultNameAsync(currentVaultName.value)
add({ description: t('vaultName.update.success'), color: 'success' })
} catch (error) {
console.error(error)
add({ description: t('vaultName.update.error'), color: 'error' })
}
}
const { requestNotificationPermissionAsync } = useNotificationStore()
const { deviceName } = storeToRefs(useDeviceStore())
const { updateDeviceNameAsync, readDeviceNameAsync } = useDeviceStore()
onMounted(async () => {
await readDeviceNameAsync()
})
const onUpdateDeviceNameAsync = async () => {
const check = vaultDeviceNameSchema.safeParse(deviceName.value)
if (!check.success) return
try {
await updateDeviceNameAsync({ name: deviceName.value })
add({ description: t('deviceName.update.success'), color: 'success' })
} catch (error) {
console.log(error)
add({ description: t('deviceName.update.error'), color: 'error' })
}
}
</script>
<i18n lang="yaml">
de:
language: Sprache
design: Design
save: Änderung speichern
notifications:
label: Benachrichtigungen
requestPermission: Benachrichtigung erlauben
vaultName:
label: Vaultname
update:
success: Vaultname erfolgreich aktualisiert
error: Vaultname konnte nicht aktualisiert werden
deviceName:
label: Gerätename
update:
success: Gerätename wurde erfolgreich aktualisiert
error: Gerätename konnte nicht aktualisiert werden
en:
language: Language
design: Design
save: save changes
notifications:
label: Notifications
requestPermission: Grant Permission
vaultName:
label: Vault Name
update:
success: Vault Name successfully updated
error: Vault name could not be updated
deviceName:
label: Device name
update:
success: Device name has been successfully updated
error: Device name could not be updated
</i18n>
@ -2,6 +2,7 @@
<UiDialogConfirm <UiDialogConfirm
:confirm-label="t('create')" :confirm-label="t('create')"
@confirm="onCreateAsync" @confirm="onCreateAsync"
:description="t('description')"
> >
<UiButton <UiButton
:label="t('vault.create')" :label="t('vault.create')"
@ -55,7 +56,9 @@
<script setup lang="ts"> <script setup lang="ts">
import { vaultSchema } from './schema' import { vaultSchema } from './schema'
const { t } = useI18n() const { t } = useI18n({
useScope: 'local',
})
const vault = reactive<{ const vault = reactive<{
name: string name: string
@ -118,6 +121,7 @@ de:
name: HaexVault name: HaexVault
title: Neue {haexvault} erstellen title: Neue {haexvault} erstellen
create: Erstellen create: Erstellen
description: Erstelle eine neue Vault für deine Daten
en: en:
vault: vault:
@ -127,4 +131,5 @@ en:
name: HaexVault name: HaexVault
title: Create new {haexvault} title: Create new {haexvault}
create: Create create: Create
description: Create a new vault for your data
</i18n>
@ -5,7 +5,7 @@
:description="vault.path || path" :description="vault.path || path"
@confirm="onOpenDatabase" @confirm="onOpenDatabase"
> >
<!-- <UiButton <UiButton
:label="t('vault.open')" :label="t('vault.open')"
:ui="{ :ui="{
base: 'px-3 py-2', base: 'px-3 py-2',
@ -14,8 +14,7 @@
size="xl" size="xl"
variant="outline" variant="outline"
block block
@click.stop="onLoadDatabase" />
/> -->
<template #title> <template #title>
<i18n-t <i18n-t
@ -59,7 +58,9 @@ const props = defineProps<{
path?: string path?: string
}>() }>()
const { t } = useI18n() const { t } = useI18n({
useScope: 'local',
})
const vault = reactive<{ const vault = reactive<{
name: string name: string
@ -100,9 +101,6 @@ const vault = reactive<{
} }
} */ } */
const { syncLocaleAsync, syncThemeAsync, syncVaultNameAsync } =
useVaultSettingsStore()
const check = ref(false) const check = ref(false)
const initVault = () => { const initVault = () => {
@ -156,15 +154,17 @@ const onOpenDatabase = async () => {
}, },
}), }),
) )
await Promise.allSettled([
syncLocaleAsync(),
syncThemeAsync(),
syncVaultNameAsync(),
])
} catch (error) { } catch (error) {
open.value = false open.value = false
console.error('handleError', error, typeof error) if (error?.details?.reason === 'file is not a database') {
add({ color: 'error', description: `${error}` }) add({
color: 'error',
title: t('error.password.title'),
description: t('error.password.description'),
})
} else {
add({ color: 'error', description: JSON.stringify(error) })
}
} }
} }
</script> </script>
@ -178,7 +178,9 @@ de:
open: Vault öffnen open: Vault öffnen
description: Öffne eine vorhandene Vault description: Öffne eine vorhandene Vault
error: error:
open: Vault konnte nicht geöffnet werden. \n Vermutlich ist das Passwort falsch password:
title: Vault konnte nicht geöffnet werden
description: Bitte überprüfe das Passwort
en: en:
open: Unlock open: Unlock
@ -188,5 +190,7 @@ en:
vault: vault:
open: Open Vault open: Open Vault
error: error:
open: Vault couldn't be opened. \n The password is probably wrong password:
title: Vault couldn't be opened
description: Please check your password
</i18n> </i18n>
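// A standalone sketch (not part of the diff) of a type guard for the error
// handling in onOpenDatabase above. The exact error shape returned by the
// backend is an assumption here, inferred only from the
// error?.details?.reason === 'file is not a database' check in the diff.
interface VaultOpenError {
  details?: {
    reason?: string
  }
}

function isWrongPasswordError(error: unknown): boolean {
  const candidate = error as VaultOpenError | null | undefined
  return candidate?.details?.reason === 'file is not a database'
}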
@ -0,0 +1,83 @@
<template>
<UTooltip :text="tooltip">
<button
class="size-8 shrink-0 rounded-lg flex justify-center transition-colors group"
:class="variantClasses.buttonClass"
@click="(e) => $emit('click', e)"
>
<UIcon
:name="icon"
class="size-4 text-gray-600 dark:text-gray-400"
:class="variantClasses.iconClass"
/>
</button>
</UTooltip>
</template>
<script setup lang="ts">
const props = defineProps<{
variant: 'close' | 'maximize' | 'minimize'
isMaximized?: boolean
}>()
defineEmits(['click'])
const icon = computed(() => {
switch (props.variant) {
case 'close':
return 'i-heroicons-x-mark'
case 'maximize':
return props.isMaximized
? 'i-heroicons-arrows-pointing-in'
: 'i-heroicons-arrows-pointing-out'
default:
return 'i-heroicons-minus'
}
})
const variantClasses = computed(() => {
if (props.variant === 'close') {
return {
iconClass: 'group-hover:text-error',
buttonClass: 'hover:bg-error/30 items-center',
}
} else if (props.variant === 'maximize') {
return {
iconClass: 'group-hover:text-warning',
buttonClass: 'hover:bg-warning/30 items-center',
}
} else {
return {
iconClass: 'group-hover:text-success',
buttonClass: 'hover:bg-success/30 items-end pb-1',
}
}
})
const { t } = useI18n()
const tooltip = computed(() => {
switch (props.variant) {
case 'close':
return t('close')
case 'maximize':
return props.isMaximized ? t('shrink') : t('maximize')
default:
return t('minimize')
}
})
</script>
<i18n lang="yaml">
de:
close: Schließen
maximize: Maximieren
shrink: Verkleinern
minimize: Minimieren
en:
close: Close
maximize: Maximize
shrink: Shrink
minimize: Minimize
</i18n>

View File

@ -3,10 +3,17 @@
ref="windowEl" ref="windowEl"
:style="windowStyle" :style="windowStyle"
:class="[ :class="[
'absolute bg-white/80 dark:bg-gray-900/80 backdrop-blur-xl rounded-xl shadow-2xl overflow-hidden', 'absolute bg-default/80 backdrop-blur-xl rounded-lg shadow-xl overflow-hidden',
'border border-gray-200 dark:border-gray-700 transition-all ease-out duration-600', 'transition-all ease-out duration-600',
'flex flex-col', 'flex flex-col @container',
isActive ? 'z-50' : 'z-10', { 'select-none': isResizingOrDragging },
isActive ? 'z-20' : 'z-10',
// Border colors based on warning level
warningLevel === 'warning'
? 'border-2 border-warning-500'
: warningLevel === 'danger'
? 'border-2 border-danger-500'
: 'border border-gray-200 dark:border-gray-700',
]" ]"
@mousedown="handleActivate" @mousedown="handleActivate"
> >
@ -22,7 +29,7 @@
v-if="icon" v-if="icon"
:src="icon" :src="icon"
:alt="title" :alt="title"
class="w-5 h-5 object-contain flex-shrink-0" class="w-5 h-5 object-contain shrink-0"
/> />
</div> </div>
@ -37,85 +44,39 @@
<!-- Right: Window Controls --> <!-- Right: Window Controls -->
<div class="flex items-center gap-1 justify-end"> <div class="flex items-center gap-1 justify-end">
<button <HaexWindowButton
class="w-8 h-8 rounded-lg hover:bg-gray-200 dark:hover:bg-gray-700 flex items-center justify-center transition-colors" variant="minimize"
@click.stop="handleMinimize" @click.stop="handleMinimize"
>
<UIcon
name="i-heroicons-minus"
class="w-4 h-4 text-gray-600 dark:text-gray-400"
/> />
</button>
<button <HaexWindowButton
class="w-8 h-8 rounded-lg hover:bg-gray-200 dark:hover:bg-gray-700 flex items-center justify-center transition-colors" :is-maximized
variant="maximize"
@click.stop="handleMaximize" @click.stop="handleMaximize"
>
<UIcon
:name="
isMaximized
? 'i-heroicons-arrows-pointing-in'
: 'i-heroicons-arrows-pointing-out'
"
class="w-4 h-4 text-gray-600 dark:text-gray-400"
/> />
</button>
<button <HaexWindowButton
class="w-8 h-8 rounded-lg hover:bg-red-100 dark:hover:bg-red-900/30 flex items-center justify-center transition-colors group" variant="close"
@click.stop="handleClose" @click.stop="handleClose"
>
<UIcon
name="i-heroicons-x-mark"
class="w-4 h-4 text-gray-600 dark:text-gray-400 group-hover:text-red-600 dark:group-hover:text-red-400"
/> />
</button>
</div> </div>
</div> </div>
<!-- Window Content --> <!-- Window Content -->
<div <div
:class="[ :class="[
'flex-1 overflow-hidden relative', 'flex-1 overflow-auto relative ',
isDragging || isResizing ? 'pointer-events-none' : '', isResizingOrDragging ? 'pointer-events-none' : '',
]" ]"
> >
<slot /> <slot />
</div> </div>
<!-- Resize Handles --> <!-- Resize Handles -->
<template v-if="!isMaximized"> <HaexWindowResizeHandles
<div :disabled="isMaximized"
class="absolute top-0 left-0 w-2 h-2 cursor-nw-resize" @resize-start="handleResizeStart"
@mousedown.left.stop="handleResizeStart('nw', $event)"
/> />
<div
class="absolute top-0 right-0 w-2 h-2 cursor-ne-resize"
@mousedown.left.stop="handleResizeStart('ne', $event)"
/>
<div
class="absolute bottom-0 left-0 w-2 h-2 cursor-sw-resize"
@mousedown.left.stop="handleResizeStart('sw', $event)"
/>
<div
class="absolute bottom-0 right-0 w-2 h-2 cursor-se-resize"
@mousedown.left.stop="handleResizeStart('se', $event)"
/>
<div
class="absolute top-0 left-2 right-2 h-1 cursor-n-resize"
@mousedown.left.stop="handleResizeStart('n', $event)"
/>
<div
class="absolute bottom-0 left-2 right-2 h-1 cursor-s-resize"
@mousedown.left.stop="handleResizeStart('s', $event)"
/>
<div
class="absolute left-0 top-2 bottom-2 w-1 cursor-w-resize"
@mousedown.left.stop="handleResizeStart('w', $event)"
/>
<div
class="absolute right-0 top-2 bottom-2 w-1 cursor-e-resize"
@mousedown.left.stop="handleResizeStart('e', $event)"
/>
</template>
</div> </div>
</template> </template>
@ -124,10 +85,6 @@ const props = defineProps<{
id: string id: string
title: string title: string
icon?: string | null icon?: string | null
initialX?: number
initialY?: number
initialWidth?: number
initialHeight?: number
isActive?: boolean isActive?: boolean
sourceX?: number sourceX?: number
sourceY?: number sourceY?: number
@ -135,6 +92,7 @@ const props = defineProps<{
sourceHeight?: number sourceHeight?: number
isOpening?: boolean isOpening?: boolean
isClosing?: boolean isClosing?: boolean
warningLevel?: 'warning' | 'danger' // Warning indicator (e.g., dev extension, dangerous permissions)
}>() }>()
const emit = defineEmits<{ const emit = defineEmits<{
@ -147,7 +105,13 @@ const emit = defineEmits<{
dragEnd: [] dragEnd: []
}>() }>()
const windowEl = ref<HTMLElement>() // Use defineModel for x, y, width, height
const x = defineModel<number>('x', { default: 100 })
const y = defineModel<number>('y', { default: 100 })
const width = defineModel<number>('width', { default: 800 })
const height = defineModel<number>('height', { default: 600 })
const windowEl = useTemplateRef('windowEl')
const titlebarEl = useTemplateRef('titlebarEl') const titlebarEl = useTemplateRef('titlebarEl')
// Inject viewport size from parent desktop // Inject viewport size from parent desktop
@ -155,20 +119,14 @@ const viewportSize = inject<{
width: Ref<number> width: Ref<number>
height: Ref<number> height: Ref<number>
}>('viewportSize') }>('viewportSize')
// Window state
const x = ref(props.initialX ?? 100)
const y = ref(props.initialY ?? 100)
const width = ref(props.initialWidth ?? 800)
const height = ref(props.initialHeight ?? 600)
const isMaximized = ref(false) // Don't start maximized const isMaximized = ref(false) // Don't start maximized
// Store initial position/size for restore // Store initial position/size for restore
const preMaximizeState = ref({ const preMaximizeState = ref({
x: props.initialX ?? 100, x: x.value,
y: props.initialY ?? 100, y: y.value,
width: props.initialWidth ?? 800, width: width.value,
height: props.initialHeight ?? 600, height: height.value,
}) })
// Dragging state // Dragging state
@ -186,9 +144,9 @@ const resizeStartHeight = ref(0)
const resizeStartPosX = ref(0) const resizeStartPosX = ref(0)
const resizeStartPosY = ref(0) const resizeStartPosY = ref(0)
// Snap settings const isResizingOrDragging = computed(
const snapEdgeThreshold = 50 // pixels from edge to trigger snap () => isResizing.value || isDragging.value,
const { x: mouseX } = useMouse() )
// Setup drag with useDrag composable (supports mouse + touch) // Setup drag with useDrag composable (supports mouse + touch)
useDrag( useDrag(
@ -205,34 +163,9 @@ useDrag(
} }
if (last) { if (last) {
// Drag ended - apply snapping // Drag ended
isDragging.value = false isDragging.value = false
globalThis.getSelection()?.removeAllRanges()
const viewportBounds = getViewportBounds()
if (viewportBounds) {
const viewportWidth = viewportBounds.width
const viewportHeight = viewportBounds.height
if (mouseX.value <= snapEdgeThreshold) {
// Snap to left half
x.value = 0
y.value = 0
width.value = viewportWidth / 2
height.value = viewportHeight
isMaximized.value = false
} else if (mouseX.value >= viewportWidth - snapEdgeThreshold) {
// Snap to right half
x.value = viewportWidth / 2
y.value = 0
width.value = viewportWidth / 2
height.value = viewportHeight
isMaximized.value = false
} else {
// Normal snap back to viewport
snapToViewport()
}
}
emit('positionChanged', x.value, y.value) emit('positionChanged', x.value, y.value)
emit('sizeChanged', width.value, height.value) emit('sizeChanged', width.value, height.value)
emit('dragEnd') emit('dragEnd')
@ -253,7 +186,6 @@ useDrag(
eventOptions: { passive: false }, eventOptions: { passive: false },
pointer: { touch: true }, pointer: { touch: true },
drag: { drag: {
threshold: 10, // 10px threshold prevents accidental drags and improves performance
filterTaps: true, // Filter out taps (clicks) vs drags filterTaps: true, // Filter out taps (clicks) vs drags
delay: 0, // No delay for immediate response delay: 0, // No delay for immediate response
}, },
@ -289,27 +221,24 @@ const windowStyle = computed(() => {
baseStyle.opacity = '0' baseStyle.opacity = '0'
baseStyle.transform = 'scale(0.3)' baseStyle.transform = 'scale(0.3)'
} }
// Normal state // Normal state (maximized windows now use actual pixel dimensions)
else if (isMaximized.value) { else {
baseStyle.left = '0px'
baseStyle.top = '0px'
baseStyle.width = '100%'
baseStyle.height = '100%'
baseStyle.borderRadius = '0'
baseStyle.opacity = '1'
baseStyle.transform = 'scale(1)'
} else {
baseStyle.left = `${x.value}px` baseStyle.left = `${x.value}px`
baseStyle.top = `${y.value}px` baseStyle.top = `${y.value}px`
baseStyle.width = `${width.value}px` baseStyle.width = `${width.value}px`
baseStyle.height = `${height.value}px` baseStyle.height = `${height.value}px`
baseStyle.opacity = '1' baseStyle.opacity = '1'
baseStyle.transform = 'scale(1)'
// Remove border-radius when maximized
if (isMaximized.value) {
baseStyle.borderRadius = '0'
}
} }
// Performance optimization: hint browser about transforms // Performance optimization: hint browser about transforms
if (isDragging.value || isResizing.value) { if (isDragging.value || isResizing.value) {
baseStyle.willChange = 'transform, width, height' baseStyle.willChange = 'transform, width, height'
baseStyle.transform = 'translateZ(0)'
} }
return baseStyle return baseStyle
@ -341,38 +270,18 @@ const constrainToViewportDuringDrag = (newX: number, newY: number) => {
const windowWidth = width.value const windowWidth = width.value
const windowHeight = height.value const windowHeight = height.value
// Allow max 1/3 of window to go outside viewport during drag // Allow sides and bottom to go out more
const maxOffscreenX = windowWidth / 3 const maxOffscreenX = windowWidth / 3
const maxOffscreenY = windowHeight / 3 const maxOffscreenBottom = windowHeight / 3
// For X axis: allow 1/3 to go outside on both sides
const maxX = bounds.width - windowWidth + maxOffscreenX const maxX = bounds.width - windowWidth + maxOffscreenX
const minX = -maxOffscreenX const minX = -maxOffscreenX
const maxY = bounds.height - windowHeight + maxOffscreenY
const minY = -maxOffscreenY
const constrainedX = Math.max(minX, Math.min(maxX, newX)) // For Y axis: HARD constraint at top (y=0), never allow window to go above header
const constrainedY = Math.max(minY, Math.min(maxY, newY))
return { x: constrainedX, y: constrainedY }
}
const constrainToViewportFully = (
newX: number,
newY: number,
newWidth?: number,
newHeight?: number,
) => {
const bounds = getViewportBounds()
if (!bounds) return { x: newX, y: newY }
const windowWidth = newWidth ?? width.value
const windowHeight = newHeight ?? height.value
// Keep entire window within viewport
const maxX = bounds.width - windowWidth
const minX = 0
const maxY = bounds.height - windowHeight
const minY = 0 const minY = 0
// Bottom: allow 1/3 to go outside
const maxY = bounds.height - windowHeight + maxOffscreenBottom
const constrainedX = Math.max(minX, Math.min(maxX, newX)) const constrainedX = Math.max(minX, Math.min(maxX, newX))
const constrainedY = Math.max(minY, Math.min(maxY, newY)) const constrainedY = Math.max(minY, Math.min(maxY, newY))
@ -380,15 +289,6 @@ const constrainToViewportFully = (
return { x: constrainedX, y: constrainedY } return { x: constrainedX, y: constrainedY }
} }
const snapToViewport = () => {
const bounds = getViewportBounds()
if (!bounds) return
const constrained = constrainToViewportFully(x.value, y.value)
x.value = constrained.x
y.value = constrained.y
}
const handleActivate = () => { const handleActivate = () => {
emit('activate') emit('activate')
} }
@ -410,23 +310,76 @@ const handleMaximize = () => {
height.value = preMaximizeState.value.height height.value = preMaximizeState.value.height
isMaximized.value = false isMaximized.value = false
} else { } else {
// Maximize // Maximize - set position and size to viewport dimensions
preMaximizeState.value = { preMaximizeState.value = {
x: x.value, x: x.value,
y: y.value, y: y.value,
width: width.value, width: width.value,
height: height.value, height: height.value,
} }
// Get viewport bounds (desktop container, already excludes header)
const bounds = getViewportBounds()
if (bounds && bounds.width > 0 && bounds.height > 0) {
// Get safe-area-insets from CSS variables for debug
const safeAreaTop = parseFloat(
getComputedStyle(document.documentElement).getPropertyValue(
'--safe-area-inset-top',
) || '0',
)
const safeAreaBottom = parseFloat(
getComputedStyle(document.documentElement).getPropertyValue(
'--safe-area-inset-bottom',
) || '0',
)
// Desktop container uses 'absolute inset-0' which stretches over full viewport
// bounds.height = full viewport height (includes header area + safe-areas)
// We need to calculate available space properly
// Get header height from UI store (measured reactively in layout)
const uiStore = useUiStore()
const headerHeight = uiStore.headerHeight
x.value = 0
y.value = 0 // Start below header and status bar
width.value = bounds.width
// Height: viewport - header - both safe-areas
height.value = bounds.height - headerHeight - safeAreaTop - safeAreaBottom
isMaximized.value = true isMaximized.value = true
} }
} }
}
// Window resizing // Window resizing
const handleResizeStart = (direction: string, e: MouseEvent) => { const handleResizeStart = (direction: string, e: MouseEvent | TouchEvent) => {
isResizing.value = true isResizing.value = true
resizeDirection.value = direction resizeDirection.value = direction
resizeStartX.value = e.clientX let clientX: number
resizeStartY.value = e.clientY let clientY: number
if ('touches' in e) {
// It's a TouchEvent
const touch = e.touches[0] // grab the first touch point
// Check that 'touch' exists (it's undefined if e.touches is empty)
if (touch) {
clientX = touch.clientX
clientY = touch.clientY
} else {
// Invalid start event (no touch point). Abort.
isResizing.value = false
return
}
} else {
// It's a MouseEvent
clientX = e.clientX
clientY = e.clientY
}
resizeStartX.value = clientX
resizeStartY.value = clientY
resizeStartWidth.value = width.value resizeStartWidth.value = width.value
resizeStartHeight.value = height.value resizeStartHeight.value = height.value
resizeStartPosX.value = x.value resizeStartPosX.value = x.value
@ -466,11 +419,9 @@ useEventListener(window, 'mousemove', (e: MouseEvent) => {
// Global mouse up handler (for resizing only, dragging handled by useDrag) // Global mouse up handler (for resizing only, dragging handled by useDrag)
useEventListener(window, 'mouseup', () => { useEventListener(window, 'mouseup', () => {
if (isResizing.value) { if (isResizing.value) {
globalThis.getSelection()?.removeAllRanges()
isResizing.value = false isResizing.value = false
// Snap back to viewport after resize ends
snapToViewport()
emit('positionChanged', x.value, y.value) emit('positionChanged', x.value, y.value)
emit('sizeChanged', width.value, height.value) emit('sizeChanged', width.value, height.value)
} }
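Since the window geometry now comes from defineModel instead of the removed initialX/initialY/initialWidth/initialHeight props, the parent desktop component (its wiring is not part of this diff, so local names here are assumed) would bind each window roughly like this:

<!-- Sketch: two-way bind the geometry models; win and activeWindowId are assumed names. -->
<HaexWindow
  v-model:x="win.x"
  v-model:y="win.y"
  v-model:width="win.width"
  v-model:height="win.height"
  :id="win.id"
  :title="win.title"
  :is-active="win.id === activeWindowId"
  @position-changed="(x, y) => windowManager.updateWindowPosition(win.id, x, y)"
  @size-changed="(w, h) => windowManager.updateWindowSize(win.id, w, h)"
/>

updateWindowPosition and updateWindowSize do exist on the window manager store (the window overview component further down calls them); only the win/activeWindowId names are hypothetical here.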

View File

@ -0,0 +1,222 @@
<template>
<UDrawer
v-model:open="localShowWindowOverview"
direction="bottom"
:title="t('modal.title')"
:description="t('modal.description')"
>
<template #content>
<div class="h-full overflow-y-auto p-6 justify-center flex">
<!-- Window Thumbnails Flex Layout -->
<div
v-if="windows.length > 0"
class="flex flex-wrap gap-6 justify-center-safe items-start"
>
<div
v-for="window in windows"
:key="window.id"
class="relative group cursor-pointer"
>
<!-- Window Title Bar -->
<div class="flex items-center gap-3 mb-2 px-2">
<UIcon
v-if="window.icon"
:name="window.icon"
class="size-5 shrink-0"
/>
<div class="flex-1 min-w-0">
<p class="font-semibold text-sm truncate">
{{ window.title }}
</p>
</div>
<!-- Minimized Badge -->
<UBadge
v-if="window.isMinimized"
color="info"
size="xs"
:title="t('minimized')"
/>
</div>
<!-- Scaled Window Preview Container / Teleport Target -->
<div
:id="`window-preview-${window.id}`"
class="relative bg-gray-100 dark:bg-gray-900 rounded-xl overflow-hidden border-2 border-gray-200 dark:border-gray-700 group-hover:border-primary-500 transition-all shadow-lg"
:style="getCardStyle(window)"
@click="handleRestoreAndActivateWindow(window.id)"
>
<!-- Hover Overlay -->
<div
class="absolute inset-0 bg-primary-500/10 opacity-0 group-hover:opacity-100 transition-opacity pointer-events-none z-40"
/>
</div>
</div>
</div>
<!-- Empty State -->
<div
v-else
class="flex flex-col items-center justify-center py-12 text-gray-500 dark:text-gray-400"
>
<UIcon
name="i-heroicons-window"
class="size-16 mb-4 shrink-0"
/>
<p class="text-lg font-medium">No windows open</p>
<p class="text-sm">
Open an extension or system window to see it here
</p>
</div>
</div>
</template>
</UDrawer>
</template>
<script setup lang="ts">
const { t } = useI18n()
const windowManager = useWindowManagerStore()
const workspaceStore = useWorkspaceStore()
const { showWindowOverview, windows } = storeToRefs(windowManager)
// Local computed for two-way binding with UModal
const localShowWindowOverview = computed({
get: () => showWindowOverview.value,
set: (value) => {
showWindowOverview.value = value
},
})
const handleRestoreAndActivateWindow = (windowId: string) => {
const window = windowManager.windows.find((w) => w.id === windowId)
if (!window) return
// Switch to the workspace where this window is located
if (window.workspaceId) {
workspaceStore.slideToWorkspace(window.workspaceId)
}
// If window is minimized, restore it first
if (window.isMinimized) {
windowManager.restoreWindow(windowId)
} else {
// If not minimized, just activate it
windowManager.activateWindow(windowId)
}
// Close the overview
localShowWindowOverview.value = false
}
// Store original window sizes and positions to restore after overview closes
const originalWindowState = ref<
Map<string, { width: number; height: number; x: number; y: number }>
>(new Map())
// Min/Max dimensions for preview cards
const MIN_PREVIEW_WIDTH = 300
const MAX_PREVIEW_WIDTH = 600
const MIN_PREVIEW_HEIGHT = 225
const MAX_PREVIEW_HEIGHT = 450
// Calculate card size and scale based on window dimensions
const getCardStyle = (window: (typeof windows.value)[0]) => {
const scaleX = MAX_PREVIEW_WIDTH / window.width
const scaleY = MAX_PREVIEW_HEIGHT / window.height
const scale = Math.min(scaleX, scaleY, 1) // Never scale up, only down
// Calculate scaled dimensions
const scaledWidth = window.width * scale
const scaledHeight = window.height * scale
// Ensure minimum card size
let finalScale = scale
if (scaledWidth < MIN_PREVIEW_WIDTH) {
finalScale = MIN_PREVIEW_WIDTH / window.width
}
if (scaledHeight < MIN_PREVIEW_HEIGHT) {
finalScale = Math.max(finalScale, MIN_PREVIEW_HEIGHT / window.height)
}
const cardWidth = window.width * finalScale
const cardHeight = window.height * finalScale
return {
width: `${cardWidth}px`,
height: `${cardHeight}px`,
'--window-scale': finalScale, // CSS variable for scale
}
}
// Watch for overview closing to restore windows
watch(localShowWindowOverview, async (isOpen, wasOpen) => {
if (!isOpen && wasOpen) {
console.log('[WindowOverview] Overview closed, restoring windows...')
// Restore original window state
for (const window of windows.value) {
const originalState = originalWindowState.value.get(window.id)
if (originalState) {
console.log(
`[WindowOverview] Restoring window ${window.id} to:`,
originalState,
)
windowManager.updateWindowSize(
window.id,
originalState.width,
originalState.height,
)
windowManager.updateWindowPosition(
window.id,
originalState.x,
originalState.y,
)
}
}
originalWindowState.value.clear()
}
})
// Watch for overview opening to store original state
watch(
() => localShowWindowOverview.value && windows.value.length,
(shouldStore) => {
if (shouldStore && originalWindowState.value.size === 0) {
console.log('[WindowOverview] Storing original window states...')
for (const window of windows.value) {
console.log(`[WindowOverview] Window ${window.id}:`, {
originalSize: { width: window.width, height: window.height },
originalPos: { x: window.x, y: window.y },
})
originalWindowState.value.set(window.id, {
width: window.width,
height: window.height,
x: window.x,
y: window.y,
})
}
}
},
)
</script>
<i18n lang="yaml">
de:
modal:
title: Fensterübersicht
description: Übersicht aller offenen Fenster auf allen Workspaces
minimized: Minimiert
en:
modal:
title: Window Overview
description: Overview of all open windows on all workspaces
minimized: Minimized
</i18n>
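Working getCardStyle through the numbers: a 1600×900 window gets scale = min(600/1600, 450/900, 1) = 0.375, and since the resulting 600×337.5 preview already clears the 300×225 minimum, that is the card size; a narrow 400×1200 window instead trips the minimum-width branch (scaledWidth = 150 < 300, so finalScale = 300/400 = 0.75) and renders as a 300×900 card.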

View File

@ -0,0 +1,61 @@
<template>
<template v-if="!disabled">
<div
class="absolute top-0 left-0 size-2 cursor-nw-resize z-10"
@mousedown.left.stop="emitResizeStart('nw', $event)"
@touchstart.passive.stop="emitResizeStart('nw', $event)"
/>
<div
class="absolute top-0 right-0 size-2 cursor-ne-resize z-10"
@mousedown.left.stop="emitResizeStart('ne', $event)"
@touchstart.passive.stop="emitResizeStart('ne', $event)"
/>
<div
class="absolute bottom-0 left-0 size-2 cursor-sw-resize z-10"
@mousedown.left.stop="emitResizeStart('sw', $event)"
@touchstart.passive.stop="emitResizeStart('sw', $event)"
/>
<div
class="absolute bottom-0 right-0 w-2 h-2 cursor-se-resize z-10"
@mousedown.left.stop="emitResizeStart('se', $event)"
@touchstart.passive.stop="emitResizeStart('se', $event)"
/>
<div
class="absolute top-0 left-2 right-2 h-2 cursor-n-resize z-10"
@mousedown.left.stop="emitResizeStart('n', $event)"
@touchstart.passive.stop="emitResizeStart('n', $event)"
/>
<div
class="absolute bottom-0 left-2 right-2 h-2 cursor-s-resize z-10"
@mousedown.left.stop="emitResizeStart('s', $event)"
@touchstart.passive.stop="emitResizeStart('s', $event)"
/>
<div
class="absolute left-0 top-2 bottom-2 w-2 cursor-w-resize z-10"
@mousedown.left.stop="emitResizeStart('w', $event)"
@touchstart.passive.stop="emitResizeStart('w', $event)"
/>
<div
class="absolute right-0 top-2 bottom-2 w-2 cursor-e-resize z-10"
@mousedown.left.stop="emitResizeStart('e', $event)"
@touchstart.passive.stop="emitResizeStart('e', $event)"
/>
</template>
</template>
<script setup lang="ts">
// Props: only indicates whether the handles should be shown
defineProps<{
disabled?: boolean // True if window is maximized
}>()
// Emits: signals the start of resizing with direction and event
const emit = defineEmits<{
resizeStart: [direction: string, event: MouseEvent | TouchEvent]
}>()
// Forward the event to the parent
const emitResizeStart = (direction: string, event: MouseEvent | TouchEvent) => {
emit('resizeStart', direction, event)
}
</script>
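The handles only report a direction string and the raw event; the geometry math stays in HaexWindow. The mousemove part of that handler is not included in this diff, but a direction-based update consistent with the state captured in handleResizeStart would look roughly like this (the 200×150 minimum size is an assumption):

// Sketch only — not the actual handler from HaexWindow, which this diff does not show.
const applyResize = (direction: string, clientX: number, clientY: number) => {
  const dx = clientX - resizeStartX.value
  const dy = clientY - resizeStartY.value
  const minW = 200 // assumed minimum width
  const minH = 150 // assumed minimum height
  if (direction.includes('e')) width.value = Math.max(minW, resizeStartWidth.value + dx)
  if (direction.includes('s')) height.value = Math.max(minH, resizeStartHeight.value + dy)
  if (direction.includes('w')) {
    width.value = Math.max(minW, resizeStartWidth.value - dx)
    x.value = resizeStartPosX.value + (resizeStartWidth.value - width.value) // keep right edge fixed
  }
  if (direction.includes('n')) {
    height.value = Math.max(minH, resizeStartHeight.value - dy)
    y.value = resizeStartPosY.value + (resizeStartHeight.value - height.value) // keep bottom edge fixed
  }
}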

View File

@ -1,12 +1,14 @@
<template> <template>
<UCard <UCard
class="cursor-pointer transition-all h-32 w-72 shrink-0 group duration-500" ref="cardEl"
class="cursor-pointer transition-all h-32 w-72 shrink-0 group duration-500 rounded-lg"
:class="[ :class="[
workspace.position === currentWorkspaceIndex workspace.id === currentWorkspace?.id
? 'ring-2 ring-secondary bg-secondary/10' ? 'ring-2 ring-secondary bg-secondary/10'
: 'hover:ring-2 hover:ring-gray-300', : 'hover:ring-2 hover:ring-gray-300',
isDragOver ? 'ring-4 ring-primary bg-primary/20 scale-105' : '',
]" ]"
@click="workspaceStore.slideToWorkspace(workspace.position)" @click="workspaceStore.slideToWorkspace(workspace.id)"
> >
<template #header> <template #header>
<div class="flex justify-between"> <div class="flex justify-between">
@ -27,9 +29,70 @@
</template> </template>
<script setup lang="ts"> <script setup lang="ts">
defineProps<{ workspace: IWorkspace }>() const props = defineProps<{ workspace: IWorkspace }>()
const workspaceStore = useWorkspaceStore() const workspaceStore = useWorkspaceStore()
const windowManager = useWindowManagerStore()
const { currentWorkspaceIndex } = storeToRefs(workspaceStore) const { currentWorkspace } = storeToRefs(workspaceStore)
const cardEl = useTemplateRef('cardEl')
const isDragOver = ref(false)
// Use mouse position to detect if over card
const { x: mouseX, y: mouseY } = useMouse()
// Check if mouse is over this card while dragging
watchEffect(() => {
if (!windowManager.draggingWindowId || !cardEl.value?.$el) {
isDragOver.value = false
return
}
// Get card bounding box
const rect = cardEl.value.$el.getBoundingClientRect()
// Check if mouse is within card bounds
const isOver =
mouseX.value >= rect.left &&
mouseX.value <= rect.right &&
mouseY.value >= rect.top &&
mouseY.value <= rect.bottom
isDragOver.value = isOver
})
// Handle drop when drag ends - check BEFORE draggingWindowId is cleared
let wasOverThisCard = false
watchEffect(() => {
if (isDragOver.value && windowManager.draggingWindowId) {
wasOverThisCard = true
}
})
watch(
() => windowManager.draggingWindowId,
(newValue, oldValue) => {
// Drag ended (from something to null)
if (oldValue && !newValue && wasOverThisCard) {
console.log(
'[WorkspaceCard] Drop detected! Moving window to workspace:',
props.workspace.name,
)
const window = windowManager.windows.find((w) => w.id === oldValue)
if (window) {
window.workspaceId = props.workspace.id
window.x = 0
window.y = 0
// Switch to the workspace after dropping
//workspaceStore.slideToWorkspace(props.workspace.id)
}
wasOverThisCard = false
} else if (!newValue) {
// Drag ended but not over this card
wasOverThisCard = false
}
},
)
</script> </script>

View File

@ -0,0 +1,28 @@
<template>
<UContextMenu :items="contextMenuItems">
<UiButton
v-bind="$attrs"
@click="$emit('click', $event)"
>
<template
v-for="(_, slotName) in $slots"
#[slotName]="slotProps"
>
<slot
:name="slotName"
v-bind="slotProps"
/>
</template>
</UiButton>
</UContextMenu>
</template>
<script setup lang="ts">
import type { ContextMenuItem } from '@nuxt/ui'
defineProps<{
contextMenuItems: ContextMenuItem[]
}>()
defineEmits<{ click: [Event] }>()
</script>

View File

@ -4,11 +4,10 @@
<UButton <UButton
class="pointer-events-auto" class="pointer-events-auto"
v-bind="{ v-bind="{
...{ size: isSmallScreen ? 'lg' : 'md' },
...buttonProps, ...buttonProps,
...$attrs, ...$attrs,
}" }"
@click="(e) => $emit('click', e)" @click="$emit('click', $event)"
> >
<template <template
v-for="(_, slotName) in $slots" v-for="(_, slotName) in $slots"

View File

@ -11,10 +11,6 @@ const { availableThemes, currentTheme } = storeToRefs(useUiStore())
const emit = defineEmits<{ select: [string] }>() const emit = defineEmits<{ select: [string] }>()
watchImmediate(availableThemes, () =>
console.log('availableThemes', availableThemes),
)
const items = computed<DropdownMenuItem[]>(() => const items = computed<DropdownMenuItem[]>(() =>
availableThemes?.value.map((theme) => ({ availableThemes?.value.map((theme) => ({
...theme, ...theme,

View File

@ -17,7 +17,7 @@
:title="t('pick')" :title="t('pick')"
class="top-0 left-0 absolute size-0" class="top-0 left-0 absolute size-0"
type="color" type="color"
/> >
<UiTooltip :tooltip="t('reset')"> <UiTooltip :tooltip="t('reset')">
<UiButton <UiButton

View File

@ -2,8 +2,8 @@
<UDropdownMenu <UDropdownMenu
:items="icons" :items="icons"
class="btn" class="btn"
@select="(newIcon) => (iconName = newIcon)"
:read_only :read_only
@select="(newIcon) => (iconName = newIcon)"
> >
<template #activator> <template #activator>
<Icon :name="iconName ? iconName : defaultIcon || icons.at(0)" /> <Icon :name="iconName ? iconName : defaultIcon || icons.at(0)" />
@ -12,8 +12,8 @@
<template #items="{ items }"> <template #items="{ items }">
<div class="grid grid-cols-6 -ml-2"> <div class="grid grid-cols-6 -ml-2">
<li <li
class="dropdown-item"
v-for="item in items" v-for="item in items"
class="dropdown-item"
@click="read_only ? '' : (iconName = item)" @click="read_only ? '' : (iconName = item)"
> >
<Icon <Icon

View File

@ -6,8 +6,8 @@
<button <button
:id :id
class="advance-select-toogle flex justify-between grow p-3" class="advance-select-toogle flex justify-between grow p-3"
@click.prevent="toogleMenu"
:disabled="read_only" :disabled="read_only"
@click.prevent="toogleMenu"
> >
<slot <slot
name="value" name="value"
@ -18,9 +18,9 @@
</slot> </slot>
</button> </button>
<button <button
@click.prevent="toogleMenu"
class="flex items-center p-2 hover:shadow rounded-md hover:bg-primary hover:text-base-content" class="flex items-center p-2 hover:shadow rounded-md hover:bg-primary hover:text-base-content"
:disabled="read_only" :disabled="read_only"
@click.prevent="toogleMenu"
> >
<i class="i-[material-symbols--keyboard-arrow-down] size-4" /> <i class="i-[material-symbols--keyboard-arrow-down] size-4" />
</button> </button>

View File

@ -1,65 +0,0 @@
// composables/extensionContextBroadcast.ts
// NOTE: This composable is deprecated. Use tabsStore.broadcastToAllTabs() instead.
// Keeping for backwards compatibility.
import { getExtensionWindow } from './extensionMessageHandler'
export const useExtensionContextBroadcast = () => {
// Global state for extension IDs instead of iframes
const extensionIds = useState<Set<string>>(
'extension-ids',
() => new Set(),
)
const registerExtensionIframe = (_iframe: HTMLIFrameElement, extensionId: string) => {
extensionIds.value.add(extensionId)
}
const unregisterExtensionIframe = (_iframe: HTMLIFrameElement, extensionId: string) => {
extensionIds.value.delete(extensionId)
}
const broadcastContextChange = (context: {
theme: string
locale: string
platform: string
}) => {
const message = {
type: 'context.changed',
data: { context },
timestamp: Date.now(),
}
extensionIds.value.forEach((extensionId) => {
const win = getExtensionWindow(extensionId)
if (win) {
win.postMessage(message, '*')
}
})
}
const broadcastSearchRequest = (query: string, requestId: string) => {
const message = {
type: 'search.request',
data: {
query: { query, limit: 10 },
requestId,
},
timestamp: Date.now(),
}
extensionIds.value.forEach((extensionId) => {
const win = getExtensionWindow(extensionId)
if (win) {
win.postMessage(message, '*')
}
})
}
return {
registerExtensionIframe,
unregisterExtensionIframe,
broadcastContextChange,
broadcastSearchRequest,
}
}

View File

@ -166,20 +166,18 @@ const registerGlobalMessageHandler = () => {
try { try {
let result: unknown let result: unknown
if (request.method.startsWith('extension.')) { if (request.method.startsWith('haextension.context.')) {
result = await handleExtensionMethodAsync(request, instance.extension)
} else if (request.method.startsWith('db.')) {
result = await handleDatabaseMethodAsync(request, instance.extension)
} else if (request.method.startsWith('fs.')) {
result = await handleFilesystemMethodAsync(request, instance.extension)
} else if (request.method.startsWith('http.')) {
result = await handleHttpMethodAsync(request, instance.extension)
} else if (request.method.startsWith('permissions.')) {
result = await handlePermissionsMethodAsync(request, instance.extension)
} else if (request.method.startsWith('context.')) {
result = await handleContextMethodAsync(request) result = await handleContextMethodAsync(request)
} else if (request.method.startsWith('storage.')) { } else if (request.method.startsWith('haextension.storage.')) {
result = await handleStorageMethodAsync(request, instance) result = await handleStorageMethodAsync(request, instance)
} else if (request.method.startsWith('haextension.db.')) {
result = await handleDatabaseMethodAsync(request, instance.extension)
} else if (request.method.startsWith('haextension.fs.')) {
result = await handleFilesystemMethodAsync(request, instance.extension)
} else if (request.method.startsWith('haextension.http.')) {
result = await handleHttpMethodAsync(request, instance.extension)
} else if (request.method.startsWith('haextension.permissions.')) {
result = await handlePermissionsMethodAsync(request, instance.extension)
} else { } else {
throw new Error(`Unknown method: ${request.method}`) throw new Error(`Unknown method: ${request.method}`)
} }
@ -328,31 +326,28 @@ export const getExtensionWindow = (extensionId: string): Window | undefined => {
return getAllInstanceWindows(extensionId)[0] return getAllInstanceWindows(extensionId)[0]
} }
// ========================================== // Broadcast context changes to all extension instances
// Extension Methods export const broadcastContextToAllExtensions = (context: {
// ========================================== theme: string
locale: string
platform?: string
}) => {
const message = {
type: 'haextension.context.changed',
data: { context },
timestamp: Date.now(),
}
async function handleExtensionMethodAsync( console.log('[ExtensionHandler] Broadcasting context to all extensions:', context)
request: ExtensionRequest,
extension: IHaexHubExtension, // Direct type, no longer a ComputedRef // Send to all registered extension windows
) { for (const [_, instance] of iframeRegistry.entries()) {
switch (request.method) { const win = windowIdToWindowMap.get(instance.windowId)
case 'extension.getInfo': { if (win) {
const info = (await invoke('get_extension_info', { console.log('[ExtensionHandler] Sending context to:', instance.extension.name, instance.windowId)
publicKey: extension.publicKey, win.postMessage(message, '*')
name: extension.name,
})) as Record<string, unknown>
// Override allowedOrigin with the actual window origin
// This fixes the dev-mode issue where Rust returns "tauri://localhost"
// but the actual origin is "http://localhost:3003"
return {
...info,
allowedOrigin: window.location.origin,
} }
} }
default:
throw new Error(`Unknown extension method: ${request.method}`)
}
} }
// ========================================== // ==========================================
@ -369,11 +364,12 @@ async function handleDatabaseMethodAsync(
} }
switch (request.method) { switch (request.method) {
case 'db.query': { case 'haextension.db.query': {
const rows = await invoke<unknown[]>('extension_sql_select', { const rows = await invoke<unknown[]>('extension_sql_select', {
sql: params.query || '', sql: params.query || '',
params: params.params || [], params: params.params || [],
extensionId: extension.id, publicKey: extension.publicKey,
name: extension.name,
}) })
return { return {
@ -383,21 +379,22 @@ async function handleDatabaseMethodAsync(
} }
} }
case 'db.execute': { case 'haextension.db.execute': {
await invoke<string[]>('extension_sql_execute', { const rows = await invoke<unknown[]>('extension_sql_execute', {
sql: params.query || '', sql: params.query || '',
params: params.params || [], params: params.params || [],
extensionId: extension.id, publicKey: extension.publicKey,
name: extension.name,
}) })
return { return {
rows: [], rows,
rowsAffected: 1, rowsAffected: 1,
lastInsertId: undefined, lastInsertId: undefined,
} }
} }
case 'db.transaction': { case 'haextension.db.transaction': {
const statements = const statements =
(request.params as { statements?: string[] }).statements || [] (request.params as { statements?: string[] }).statements || []
@ -405,7 +402,8 @@ async function handleDatabaseMethodAsync(
await invoke('extension_sql_execute', { await invoke('extension_sql_execute', {
sql: stmt, sql: stmt,
params: [], params: [],
extensionId: extension.id, publicKey: extension.publicKey,
name: extension.name,
}) })
} }
@ -467,7 +465,7 @@ async function handlePermissionsMethodAsync(
async function handleContextMethodAsync(request: ExtensionRequest) { async function handleContextMethodAsync(request: ExtensionRequest) {
switch (request.method) { switch (request.method) {
case 'context.get': case 'haextension.context.get':
if (!contextGetters) { if (!contextGetters) {
throw new Error( throw new Error(
'Context not initialized. Make sure useExtensionMessageHandler is called in a component.', 'Context not initialized. Make sure useExtensionMessageHandler is called in a component.',
@ -499,25 +497,25 @@ async function handleStorageMethodAsync(
) )
switch (request.method) { switch (request.method) {
case 'storage.getItem': { case 'haextension.storage.getItem': {
const key = request.params.key as string const key = request.params.key as string
return localStorage.getItem(storageKey + key) return localStorage.getItem(storageKey + key)
} }
case 'storage.setItem': { case 'haextension.storage.setItem': {
const key = request.params.key as string const key = request.params.key as string
const value = request.params.value as string const value = request.params.value as string
localStorage.setItem(storageKey + key, value) localStorage.setItem(storageKey + key, value)
return null return null
} }
case 'storage.removeItem': { case 'haextension.storage.removeItem': {
const key = request.params.key as string const key = request.params.key as string
localStorage.removeItem(storageKey + key) localStorage.removeItem(storageKey + key)
return null return null
} }
case 'storage.clear': { case 'haextension.storage.clear': {
// Remove only instance-specific keys // Remove only instance-specific keys
const keys = Object.keys(localStorage).filter((k) => const keys = Object.keys(localStorage).filter((k) =>
k.startsWith(storageKey), k.startsWith(storageKey),
@ -526,7 +524,7 @@ async function handleStorageMethodAsync(
return null return null
} }
case 'storage.keys': { case 'haextension.storage.keys': {
// Return only instance-specific keys (without prefix) // Return only instance-specific keys (without prefix)
const keys = Object.keys(localStorage) const keys = Object.keys(localStorage)
.filter((k) => k.startsWith(storageKey)) .filter((k) => k.startsWith(storageKey))
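On the extension side, every call now has to use the haextension.* prefix. Only method and params appear in this handler; the request id and the response envelope below are assumptions made for illustration, not a documented protocol:

// Sketch of an extension-side helper for the renamed methods.
// `id` and the shape of the response message are assumed, not taken from this diff.
const callHaexHub = <T>(method: string, params: Record<string, unknown>): Promise<T> =>
  new Promise((resolve) => {
    const id = crypto.randomUUID()
    const onMessage = (event: MessageEvent) => {
      if (event.data?.id !== id) return
      window.removeEventListener('message', onMessage)
      resolve(event.data.result as T)
    }
    window.addEventListener('message', onMessage)
    window.parent.postMessage({ id, method, params }, '*')
  })

// e.g. the renamed database call:
// const { rows } = await callHaexHub('haextension.db.query', { query: 'SELECT 1', params: [] })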

View File

@ -14,12 +14,20 @@ export function useAndroidBackButton() {
// Track navigation history manually // Track navigation history manually
router.afterEach((to, from) => { router.afterEach((to, from) => {
console.log('[AndroidBack] Navigation:', { to: to.path, from: from.path, stackSize: historyStack.value.length }) console.log('[AndroidBack] Navigation:', {
to: to.path,
from: from.path,
stackSize: historyStack.value.length,
})
// If navigating forward (new page) // If navigating forward (new page)
if (from.path && to.path !== from.path && !historyStack.value.includes(to.path)) { if (
from.path &&
to.path !== from.path &&
!historyStack.value.includes(to.path)
) {
historyStack.value.push(from.path) historyStack.value.push(from.path)
console.log('[AndroidBack] Added to stack:', from.path, 'Stack:', historyStack.value) //console.log('[AndroidBack] Added to stack:', from.path, 'Stack:', historyStack.value)
} }
}) })
@ -31,7 +39,10 @@ export function useAndroidBackButton() {
// Listen to close requested event (triggered by Android back button) // Listen to close requested event (triggered by Android back button)
unlisten = await appWindow.onCloseRequested(async (event) => { unlisten = await appWindow.onCloseRequested(async (event) => {
console.log('[AndroidBack] Back button pressed, stack size:', historyStack.value.length) console.log(
'[AndroidBack] Back button pressed, stack size:',
historyStack.value.length,
)
// Check if we have history // Check if we have history
if (historyStack.value.length > 0) { if (historyStack.value.length > 0) {
@ -40,7 +51,10 @@ export function useAndroidBackButton() {
// Remove current page from stack // Remove current page from stack
historyStack.value.pop() historyStack.value.pop()
console.log('[AndroidBack] Going back, new stack size:', historyStack.value.length) console.log(
'[AndroidBack] Going back, new stack size:',
historyStack.value.length,
)
// Navigate back in router // Navigate back in router
router.back() router.back()

View File

@ -1,80 +0,0 @@
<template>
<div class="flex flex-col w-full h-full overflow-hidden">
<UPageHeader
as="header"
:ui="{
root: [
'bg-default border-b border-accented sticky top-0 z-50 py-0 px-8',
],
wrapper: [
'pt-6 flex flex-col sm:flex-row sm:items-center sm:justify-between gap-4',
],
}"
>
<template #title>
<div class="flex items-center">
<UiLogoHaexhub class="size-12 shrink-0" />
<NuxtLinkLocale
class="link text-base-content link-neutral text-xl font-semibold no-underline flex items-center"
:to="{ name: 'desktop' }"
>
<UiTextGradient class="text-nowrap">
{{ currentVaultName }}
</UiTextGradient>
</NuxtLinkLocale>
</div>
</template>
<template #links>
<UButton
color="neutral"
variant="outline"
:block="isSmallScreen"
@click="isOverviewMode = !isOverviewMode"
>
<template #leading>
<UIcon name="i-heroicons-squares-2x2" />
</template>
Workspaces
</UButton>
<HaexExtensionLauncher :block="isSmallScreen" />
<UiDropdownVault :block="isSmallScreen" />
</template>
</UPageHeader>
<main class="flex-1 overflow-hidden bg-elevated">
<NuxtPage />
</main>
</div>
</template>
<script setup lang="ts">
const { currentVaultName } = storeToRefs(useVaultStore())
const { isSmallScreen } = storeToRefs(useUiStore())
const { isOverviewMode } = storeToRefs(useWorkspaceStore())
</script>
<i18n lang="yaml">
de:
vault:
close: Vault schließen
sidebar:
close: Sidebar ausblenden
show: Sidebar anzeigen
search:
label: Suche
en:
vault:
close: Close vault
sidebar:
close: close sidebar
show: show sidebar
search:
label: Search
</i18n>

View File

@ -1,5 +1,155 @@
<template> <template>
<div class="bg-default isolate w-dvw h-dvh flex flex-col"> <div class="w-full h-dvh flex flex-col">
<slot /> <UPageHeader
ref="headerEl"
as="header"
:ui="{
root: ['px-8 py-0'],
wrapper: ['flex flex-row items-center justify-between gap-4'],
}"
>
<template #default>
<div class="flex justify-between items-center py-1">
<div>
<!-- <NuxtLinkLocale
class="link text-base-content link-neutral text-xl font-semibold no-underline flex items-center"
:to="{ name: 'desktop' }"
>
<UiTextGradient class="text-nowrap">
{{ currentVaultName }}
</UiTextGradient>
</NuxtLinkLocale> -->
<UiButton
v-if="currentVaultId"
color="neutral"
variant="outline"
icon="i-bi-person-workspace"
size="lg"
:tooltip="t('workspaces.label')"
@click="isOverviewMode = !isOverviewMode"
/>
</div>
<div>
<div v-if="!currentVaultId">
<UiDropdownLocale @select="onSelectLocale" />
</div>
<div
v-else
class="flex flex-row gap-2"
>
<UButton
v-if="openWindowsCount > 0"
color="primary"
variant="outline"
size="lg"
@click="showWindowOverview = !showWindowOverview"
>
{{ openWindowsCount }}
</UButton>
<HaexExtensionLauncher />
</div>
</div>
</div> </div>
</template> </template>
</UPageHeader>
<main class="overflow-hidden relative bg-elevated h-full">
<slot />
</main>
<!-- Workspace Drawer -->
<UDrawer
v-model:open="isOverviewMode"
direction="left"
:dismissible="false"
:overlay="false"
:modal="false"
title="Workspaces"
description="Workspaces"
>
<template #content>
<div class="p-6 h-full overflow-y-auto">
<UButton
block
trailing-icon="mdi-close"
class="text-2xl font-bold ext-gray-900 dark:text-white mb-4"
@click="isOverviewMode = false"
>
Workspaces
</UButton>
<!-- Workspace Cards -->
<div class="flex flex-col gap-3">
<HaexWorkspaceCard
v-for="workspace in workspaces"
:key="workspace.id"
:workspace
/>
</div>
<!-- Add New Workspace Button -->
<UButton
block
variant="outline"
class="mt-6"
@click="handleAddWorkspace"
icon="i-heroicons-plus"
:label="t('workspaces.add')"
>
</UButton>
</div>
</template>
</UDrawer>
</div>
</template>
<script setup lang="ts">
import type { Locale } from 'vue-i18n'
const { t, setLocale } = useI18n()
const onSelectLocale = async (locale: Locale) => {
await setLocale(locale)
}
const { currentVaultId } = storeToRefs(useVaultStore())
const { showWindowOverview, openWindowsCount } = storeToRefs(
useWindowManagerStore(),
)
const workspaceStore = useWorkspaceStore()
const { workspaces, isOverviewMode } = storeToRefs(workspaceStore)
const handleAddWorkspace = async () => {
const workspace = await workspaceStore.addWorkspaceAsync()
nextTick(() => {
workspaceStore.slideToWorkspace(workspace?.id)
})
}
// Measure header height and store it in UI store
const headerEl = useTemplateRef('headerEl')
const { height } = useElementSize(headerEl)
const uiStore = useUiStore()
watch(height, (newHeight) => {
uiStore.headerHeight = newHeight
})
</script>
<i18n lang="yaml">
de:
search:
label: Suche
workspaces:
label: Workspaces
add: Workspace hinzufügen
en:
search:
label: Search
workspaces:
label: Workspaces
add: Add Workspace
</i18n>
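The layout writes the measured header height into useUiStore, and HaexWindow reads it back when maximizing. The store itself is not part of this diff; the relevant slice, sketched as a minimal Pinia setup store (the real store also holds theme and screen-size state referenced elsewhere in this diff), would be no more than:

// Sketch of the headerHeight slice of useUiStore (assumed, relying on Nuxt auto-imports).
export const useUiStore = defineStore('ui', () => {
  const headerHeight = ref(0) // measured by layouts/default.vue via useElementSize
  return { headerHeight }
})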

View File

@ -3,7 +3,6 @@ export default defineNuxtRouteMiddleware(async (to) => {
const toVaultId = getSingleRouteParam(to.params.vaultId) const toVaultId = getSingleRouteParam(to.params.vaultId)
console.log('middleware', openVaults.value?.[toVaultId])
if (!openVaults.value?.[toVaultId]) { if (!openVaults.value?.[toVaultId]) {
return await navigateTo(useLocalePath()({ name: 'vaultOpen' })) return await navigateTo(useLocalePath()({ name: 'vaultOpen' }))
} }

View File

@ -1,10 +1,9 @@
<template> <template>
<div class="items-center justify-center flex w-full h-full relative"> <div class="h-full">
<div class="absolute top-8 right-8 sm:top-4 sm:right-4"> <NuxtLayout>
<UiDropdownLocale @select="onSelectLocale" /> <div
</div> class="flex flex-col justify-center items-center gap-5 mx-auto h-full overflow-scroll"
>
<div class="flex flex-col justify-center items-center gap-5 max-w-3xl">
<UiLogoHaexhub class="bg-primary p-3 size-16 rounded-full shrink-0" /> <UiLogoHaexhub class="bg-primary p-3 size-16 rounded-full shrink-0" />
<span <span
class="flex flex-wrap font-bold text-pretty text-xl gap-2 justify-center" class="flex flex-wrap font-bold text-pretty text-xl gap-2 justify-center"
@ -15,7 +14,7 @@
<UiTextGradient>Haex Hub</UiTextGradient> <UiTextGradient>Haex Hub</UiTextGradient>
</span> </span>
<div class="flex flex-col md:flex-row gap-4 w-full h-24 md:h-auto"> <div class="flex flex-col gap-4 h-24 items-stretch justify-center">
<HaexVaultCreate /> <HaexVaultCreate />
<HaexVaultOpen <HaexVaultOpen
@ -26,9 +25,9 @@
<div <div
v-show="lastVaults.length" v-show="lastVaults.length"
class="w-full" class="max-w-md w-full sm:px-5"
> >
<div class="font-thin text-sm justify-start px-2 pb-1"> <div class="font-thin text-sm pb-1 w-full">
{{ t('lastUsed') }} {{ t('lastUsed') }}
</div> </div>
@ -40,10 +39,19 @@
:key="vault.name" :key="vault.name"
class="flex items-center justify-between group overflow-x-scroll" class="flex items-center justify-between group overflow-x-scroll"
> >
<UButton <UiButtonContext
variant="ghost" variant="ghost"
color="neutral" color="neutral"
class="flex items-center no-underline justify-between text-nowrap text-sm md:text-base shrink w-full px-3" size="xl"
class="flex items-center no-underline justify-between text-nowrap text-sm md:text-base shrink w-full hover:bg-default"
:context-menu-items="[
{
icon: 'mdi:trash-can-outline',
label: t('remove.button'),
onSelect: () => prepareRemoveVault(vault.name),
color: 'error',
},
]"
@click=" @click="
() => { () => {
passwordPromptOpen = true passwordPromptOpen = true
@ -54,7 +62,7 @@
<span class="block"> <span class="block">
{{ vault.name }} {{ vault.name }}
</span> </span>
</UButton> </UiButtonContext>
<UButton <UButton
color="error" color="error"
square square
@ -62,7 +70,7 @@
> >
<Icon <Icon
name="mdi:trash-can-outline" name="mdi:trash-can-outline"
@click="removeVaultAsync(vault.name)" @click="prepareRemoveVault(vault.name)"
/> />
</UButton> </UButton>
</div> </div>
@ -81,45 +89,84 @@
</div> </div>
</div> </div>
</div> </div>
<UiDialogConfirm
v-model:open="showRemoveDialog"
:title="t('remove.title')"
:description="t('remove.description', { vaultName: vaultToBeRemoved })"
@confirm="onConfirmRemoveAsync"
/>
</NuxtLayout>
</div> </div>
</template> </template>
<script setup lang="ts"> <script setup lang="ts">
import { openUrl } from '@tauri-apps/plugin-opener' import { openUrl } from '@tauri-apps/plugin-opener'
import type { Locale } from 'vue-i18n'
import type { VaultInfo } from '@bindings/VaultInfo'
definePageMeta({ definePageMeta({
name: 'vaultOpen', name: 'vaultOpen',
}) })
const { t, setLocale } = useI18n()
const { t } = useI18n()
const passwordPromptOpen = ref(false) const passwordPromptOpen = ref(false)
const selectedVault = ref<IVaultInfo>() const selectedVault = ref<VaultInfo>()
const showRemoveDialog = ref(false)
const { syncLastVaultsAsync, removeVaultAsync } = useLastVaultStore()
const { lastVaults } = storeToRefs(useLastVaultStore()) const { lastVaults } = storeToRefs(useLastVaultStore())
const { syncLastVaultsAsync, moveVaultToTrashAsync } = useLastVaultStore()
const { syncDeviceIdAsync } = useDeviceStore()
const vaultToBeRemoved = ref('')
const prepareRemoveVault = (vaultName: string) => {
vaultToBeRemoved.value = vaultName
showRemoveDialog.value = true
}
const toast = useToast()
const onConfirmRemoveAsync = async () => {
try {
await moveVaultToTrashAsync(vaultToBeRemoved.value)
showRemoveDialog.value = false
await syncLastVaultsAsync()
} catch (error) {
toast.add({
color: 'error',
description: JSON.stringify(error),
})
}
}
onMounted(async () => { onMounted(async () => {
try { try {
await syncLastVaultsAsync() await syncLastVaultsAsync()
await syncDeviceIdAsync()
} catch (error) { } catch (error) {
console.error('ERROR: ', error) console.error('ERROR: ', error)
} }
}) })
const onSelectLocale = async (locale: Locale) => {
await setLocale(locale)
}
</script> </script>
<i18n lang="yaml"> <i18n lang="yaml">
de: de:
welcome: 'Viel Spass mit' welcome: 'Viel Spass mit'
lastUsed: 'Zuletzt verwendete Vaults' lastUsed: 'Zuletzt verwendete Vaults'
sponsors: 'Supported by' sponsors: Supported by
remove:
button: Löschen
title: Vault löschen
description: Möchtest du die Vault {vaultName} wirklich löschen?
en: en:
welcome: 'Have fun with' welcome: 'Have fun with'
lastUsed: 'Last used Vaults' lastUsed: 'Last used Vaults'
sponsors: 'Supported by' sponsors: 'Supported by'
remove:
button: Delete
title: Delete Vault
description: Are you sure you really want to delete {vaultName}?
</i18n> </i18n>

View File

@ -1,6 +1,6 @@
<template> <template>
<div class="w-full h-full overflow-y-auto"> <div>
<NuxtLayout name="app"> <NuxtLayout>
<NuxtPage /> <NuxtPage />
</NuxtLayout> </NuxtLayout>
@ -9,6 +9,7 @@
v-model:open="showNewDeviceDialog" v-model:open="showNewDeviceDialog"
:confirm-label="t('newDevice.save')" :confirm-label="t('newDevice.save')"
:title="t('newDevice.title')" :title="t('newDevice.title')"
:description="t('newDevice.setName')"
confirm-icon="mdi:content-save-outline" confirm-icon="mdi:content-save-outline"
@abort="showNewDeviceDialog = false" @abort="showNewDeviceDialog = false"
@confirm="onSetDeviceNameAsync" @confirm="onSetDeviceNameAsync"
@ -48,16 +49,26 @@ const newDeviceName = ref<string>('unknown')
const { readNotificationsAsync } = useNotificationStore() const { readNotificationsAsync } = useNotificationStore()
const { isKnownDeviceAsync } = useDeviceStore() const { isKnownDeviceAsync } = useDeviceStore()
const { loadExtensionsAsync } = useExtensionsStore() const { loadExtensionsAsync } = useExtensionsStore()
const { setDeviceIdIfNotExistsAsync, addDeviceNameAsync } = useDeviceStore() const { addDeviceNameAsync } = useDeviceStore()
const { deviceId } = storeToRefs(useDeviceStore()) const { deviceId } = storeToRefs(useDeviceStore())
const { syncLocaleAsync, syncThemeAsync, syncVaultNameAsync } =
useVaultSettingsStore()
onMounted(async () => { onMounted(async () => {
try { try {
await setDeviceIdIfNotExistsAsync() // Sync settings first before other initialization
await loadExtensionsAsync()
await readNotificationsAsync()
if (!(await isKnownDeviceAsync())) { await Promise.allSettled([
syncLocaleAsync(),
syncThemeAsync(),
syncVaultNameAsync(),
loadExtensionsAsync(),
readNotificationsAsync(),
])
const knownDevice = await isKnownDeviceAsync()
if (!knownDevice) {
console.log('not known device') console.log('not known device')
newDeviceName.value = hostname.value ?? 'unknown' newDeviceName.value = hostname.value ?? 'unknown'
showNewDeviceDialog.value = true showNewDeviceDialog.value = true

View File

@ -1,6 +1,8 @@
<template> <template>
<div class="w-full h-full flex items-center justify-center"> <div>
<UDashboardPanel resizable>
<HaexDesktop /> <HaexDesktop />
</UDashboardPanel>
</div> </div>
</template> </template>

View File

@ -1,139 +0,0 @@
<template>
<div>
<div
class="grid grid-rows-2 sm:grid-cols-2 sm:gap-2 p-2 max-w-2xl w-full h-fit"
>
<div class="p-2">{{ t('language') }}</div>
<div><UiDropdownLocale @select="onSelectLocaleAsync" /></div>
<div class="p-2">{{ t('design') }}</div>
<div><UiDropdownTheme @select="onSelectThemeAsync" /></div>
<div class="p-2">{{ t('vaultName.label') }}</div>
<div>
<UiInput
v-model="currentVaultName"
:placeholder="t('vaultName.label')"
@change="onSetVaultNameAsync"
/>
</div>
<div class="p-2">{{ t('notifications.label') }}</div>
<div>
<UiButton
:label="t('notifications.requestPermission')"
@click="requestNotificationPermissionAsync"
/>
</div>
<div class="p-2">{{ t('deviceName.label') }}</div>
<div>
<UiInput
v-model="deviceName"
:placeholder="t('deviceName.label')"
@change="onUpdateDeviceNameAsync"
/>
</div>
</div>
<!-- Child routes (like developer.vue) will be rendered here -->
<NuxtPage />
</div>
</template>
<script setup lang="ts">
import type { Locale } from 'vue-i18n'
definePageMeta({
name: 'settings',
})
const { t, setLocale } = useI18n()
const { currentVaultName } = storeToRefs(useVaultStore())
const { updateVaultNameAsync, updateLocaleAsync, updateThemeAsync } =
useVaultSettingsStore()
const onSelectLocaleAsync = async (locale: Locale) => {
await updateLocaleAsync(locale)
await setLocale(locale)
}
const { currentThemeName } = storeToRefs(useUiStore())
const onSelectThemeAsync = async (theme: string) => {
currentThemeName.value = theme
console.log('onSelectThemeAsync', currentThemeName.value)
await updateThemeAsync(theme)
}
const { add } = useToast()
const onSetVaultNameAsync = async () => {
try {
await updateVaultNameAsync(currentVaultName.value)
add({ description: t('vaultName.update.success'), color: 'success' })
} catch (error) {
console.error(error)
add({ description: t('vaultName.update.error'), color: 'error' })
}
}
const { requestNotificationPermissionAsync } = useNotificationStore()
const { deviceName } = storeToRefs(useDeviceStore())
const { updateDeviceNameAsync, readDeviceNameAsync } = useDeviceStore()
onMounted(async () => {
await readDeviceNameAsync()
})
const onUpdateDeviceNameAsync = async () => {
const check = vaultDeviceNameSchema.safeParse(deviceName.value)
if (!check.success) return
try {
await updateDeviceNameAsync({ name: deviceName.value })
add({ description: t('deviceName.update.success'), color: 'success' })
} catch (error) {
console.log(error)
add({ description: t('deviceName.update.error'), color: 'error' })
}
}
</script>
<i18n lang="yaml">
de:
language: Sprache
design: Design
save: Änderung speichern
notifications:
label: Benachrichtigungen
requestPermission: Benachrichtigung erlauben
vaultName:
label: Vaultname
update:
success: Vaultname erfolgreich aktualisiert
error: Vaultname konnte nicht aktualisiert werden
deviceName:
label: Gerätename
update:
success: Gerätename wurde erfolgreich aktualisiert
error: Gerätename konnte nicht aktualisiert werden
en:
language: Language
design: Design
save: save changes
notifications:
label: Notifications
requestPermission: Grant Permission
vaultName:
label: Vault Name
update:
success: Vault Name successfully updated
error: Vault name could not be updated
deviceName:
label: Device name
update:
success: Device name has been successfully updated
error: Device name could not be updated
</i18n>

View File

@ -0,0 +1,25 @@
export default defineNuxtPlugin({
name: 'init-logger',
enforce: 'pre',
parallel: false,
setup() {
// Add global error handler for better debugging
window.addEventListener('error', (event) => {
console.error('[HaexHub] Global error caught:', {
message: event.message,
filename: event.filename,
lineno: event.lineno,
colno: event.colno,
error: event.error,
stack: event.error?.stack,
})
})
window.addEventListener('unhandledrejection', (event) => {
console.error('[HaexHub] Unhandled rejection:', {
reason: event.reason,
promise: event.promise,
})
})
},
})

Some files were not shown because too many files have changed in this diff.