61 Commits

Author SHA1 Message Date
c71b8468df Fix workspace background feature for Android
- Add missing filesystem permissions in capabilities
  - fs:allow-applocaldata-read-recursive
  - fs:allow-applocaldata-write-recursive
  - fs:allow-write-file
  - fs:allow-mkdir
  - fs:allow-exists
  - fs:allow-remove

- Fix Android photo picker URI handling
  - Detect file type from binary signature (PNG, JPEG, WebP)
  - Use manual path construction to avoid path joining issues
  - Works with Android photo picker content:// URIs

- Improve error handling with detailed toast messages
  - Show specific error at each step (read, mkdir, write, db)
  - Better debugging on Android where console is unavailable

- Fix window activation behavior
  - Restore minimized windows when activated

- Remove unused imports in launcher component
2025-11-03 02:03:34 +01:00
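A minimal sketch of the magic-byte detection described in the commit above (the helper name and exact shape are assumptions, not the repository's actual code): PNG files start with 89 50 4E 47, JPEG with FF D8 FF, and WebP is a RIFF container whose bytes 8–11 read "WEBP".

// Hypothetical helper: infer the image type from the first bytes of the picked
// file, since Android photo-picker content:// URIs carry no usable extension.
function detectImageExtension(bytes: Uint8Array): 'png' | 'jpg' | 'webp' | null {
  const ascii = (start: number, end: number) => String.fromCharCode(...bytes.slice(start, end))
  if (bytes.length >= 4 && bytes[0] === 0x89 && ascii(1, 4) === 'PNG') return 'png'
  if (bytes.length >= 3 && bytes[0] === 0xff && bytes[1] === 0xd8 && bytes[2] === 0xff) return 'jpg'
  if (bytes.length >= 12 && ascii(0, 4) === 'RIFF' && ascii(8, 12) === 'WEBP') return 'webp'
  return null // unknown signature: reject instead of guessing
}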
3a4f482021 Add database migrations for workspace background feature
- Add migration 0001 for background column in haex_workspaces table
- Update vault.db with new schema
- Sync Android assets database
2025-11-03 01:32:00 +01:00
88507410ed Refactor code formatting and imports
- Reformat Rust code in extension database module
  - Improve line breaks and indentation
  - Remove commented-out test code
  - Clean up debug print statements formatting

- Update import path in CRDT schema (use @ alias)

- Fix UButton closing tag formatting in default layout
2025-11-03 01:30:46 +01:00
f38cecc84b Add workspace background customization and fix launcher drawer drag
- Add workspace background image support with file-based storage
  - Store background images in $APPLOCALDATA/files directory
  - Save file paths in database (text column in haex_workspaces)
  - Use convertFileSrc for secure asset:// URL conversion
  - Add context menu to workspaces with a "Hintergrund ändern" ("Change background") option

- Implement background management in settings
  - File selection dialog for PNG, JPG, JPEG, WebP images
  - Copy selected images to app data directory
  - Remove background with file cleanup
  - Multilingual UI (German/English)

- Fix launcher drawer drag interference
  - Add :handle-only="true" to UDrawer to restrict drag to handle
  - Simplify drag handlers (removed complex state tracking)
  - Items can now be dragged to desktop without drawer interference

- Extend Tauri asset protocol scope to include $APPLOCALDATA/**
  for background image loading
2025-11-03 01:29:08 +01:00
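A rough sketch of the background flow described above, using the Tauri v2 dialog and fs plugins (the function name and error handling are illustrative; the real implementation also stores the resulting path in haex_workspaces):

import { open } from '@tauri-apps/plugin-dialog'
import { copyFile, mkdir } from '@tauri-apps/plugin-fs'
import { convertFileSrc } from '@tauri-apps/api/core'
import { appLocalDataDir, join } from '@tauri-apps/api/path'

// Hypothetical flow: let the user pick an image, copy it into
// $APPLOCALDATA/files, and turn the stored path into an asset:// URL.
async function pickWorkspaceBackground(): Promise<string | null> {
  const selected = await open({
    multiple: false,
    filters: [{ name: 'Images', extensions: ['png', 'jpg', 'jpeg', 'webp'] }],
  })
  if (!selected) return null

  const dir = await join(await appLocalDataDir(), 'files')
  await mkdir(dir, { recursive: true })

  const fileName = selected.split(/[\\/]/).pop()!
  const target = await join(dir, fileName)
  await copyFile(selected, target)

  // The path saved in haex_workspaces.background; render it via convertFileSrc,
  // which requires the asset protocol scope to cover $APPLOCALDATA/**.
  return convertFileSrc(target)
}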
931d51a1e1 Remove unused function parameters
Removed unused parameters:
- allowed_origin from parse_extension_info_from_path in protocol.rs
- app_handle from resolve_path_pattern in filesystem/core.rs
2025-11-02 15:07:44 +01:00
c97afdee18 Restore trash import for move_vault_to_trash functionality
The trash crate is needed for the move_vault_to_trash function which
moves vault files to the system trash instead of permanently deleting
them. Clippy incorrectly marked it as unused because it's only used
within a cfg(not(target_os = "android")) block.
2025-11-02 15:06:02 +01:00
65d2770df3 Fix Android build by unconditionally importing ts_rs::TS
When cargo clippy removed the unused trash import, the cfg attribute
accidentally applied to the ts_rs::TS import below it, making it
conditional for Android. This caused the Android build to fail with
"cannot find derive macro TS in this scope".

Moved the TS import out of the cfg block to make it available for all
platforms including Android.
2025-11-02 15:02:45 +01:00
a52e1b43fa Remove unused code and modernize Rust format strings
Applied cargo clippy fixes to clean up codebase:
- Removed unused imports (serde_json::json, std::collections::HashSet)
- Removed unused function encode_hex_for_log
- Modernized format strings to use inline variables
- Fixed clippy warnings for better code quality

All changes applied automatically by cargo clippy --fix
2025-11-02 14:48:01 +01:00
6ceb22f014 Bundle Iconify icons locally and enhance CSP for Tauri protocols
- Add lucide and hugeicons to serverBundle collections for local bundling
- Add https://tauri.localhost and asset: protocol to CSP directives
- Prevents CSP errors and eliminates dependency on Iconify API
2025-11-02 14:28:06 +01:00
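The corresponding nuxt.config.ts excerpt looks roughly like this (trimmed to the icon module options; the full diff appears further down in this change set):

export default defineNuxtConfig({
  icon: {
    serverBundle: {
      // Bundling these collections locally avoids runtime requests to the Iconify API.
      collections: ['mdi', 'line-md', 'solar', 'gg', 'emojione', 'lucide', 'hugeicons'],
    },
  },
})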
4833dee89a Fix bundle targets to build for all platforms 2025-11-02 13:52:29 +01:00
a80c783576 Restore CSP settings in tauri.conf.json 2025-11-02 13:41:18 +01:00
4e1e4ae601 Bump version to 0.1.4 2025-11-02 00:58:02 +01:00
6a7f58a513 Fix production build crash by resolving circular import dependency
Moved database schemas from src-tauri/database/schemas/ to src/database/schemas/
to fix bundling issues and resolved circular import dependency that caused
"Cannot access uninitialized variable" error in production builds.

Key changes:
- Moved crdtColumnNames definition into haex.ts to break circular dependency
- Restored .$defaultFn(() => crypto.randomUUID()) calls
- Kept AnySQLiteColumn type annotations
- Removed obsolete TDZ fix script (no longer needed)
- Updated all import paths across stores and configuration files
2025-11-02 00:57:03 +01:00
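The restored defaults look roughly like the following Drizzle sketch (the table and columns are illustrative, not the actual haex schema): .$defaultFn() generates UUIDs at insert time, and the AnySQLiteColumn annotation keeps self-referencing columns type-checkable without circular inference errors.

import { sqliteTable, text, type AnySQLiteColumn } from 'drizzle-orm/sqlite-core'

export const haexExampleTable = sqliteTable('haex_example', {
  // UUID primary key generated in JS at insert time
  id: text('id').primaryKey().$defaultFn(() => crypto.randomUUID()),
  // Self-reference; the explicit return type breaks the circular type dependency
  parentId: text('parent_id').references((): AnySQLiteColumn => haexExampleTable.id),
})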
3ed8d6bc05 Fix frontendDist path for nuxt generate output 2025-11-01 21:54:24 +01:00
81a72da26c Add post-build fix to generate script 2025-11-01 21:34:44 +01:00
4fa3515e32 Bump version to 0.1.3
🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-01 20:21:37 +01:00
c5c30fd4c4 Fix Vite 7.x TDZ error in __vite__mapDeps with post-build script
- Add post-build script to fix Temporal Dead Zone error in generated code
- Remove debug logging from stores and composables
- Simplify init-logger plugin to essential error handling
- Fix circular store dependency in useUiStore

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-01 20:21:12 +01:00
8c7a02a019 Sync version numbers across all package files
Update Cargo.toml and tauri.conf.json to version 0.1.2 to match package.json
2025-11-01 19:33:42 +01:00
465fe19542 Clean up unused code and dependencies
- Remove commented-out code in Rust and TypeScript files
- Remove unused npm dependencies (@tauri-apps/plugin-http, @tauri-apps/plugin-sql, fuse.js)
- Remove commented imports in nuxt.config.ts
- Remove commented dependencies in Cargo.toml
2025-11-01 19:32:34 +01:00
d2d0f8996b Fix runtime CSP error by allowing inline scripts
Added 'unsafe-inline' to script-src CSP directive to fix JavaScript
initialization errors in production builds. Nuxt's generated modules
require inline script execution.

- Fixes: "Cannot access uninitialized variable" error
- Fixes: CSP script execution blocking
- Version bump to 0.1.2
2025-11-01 19:00:36 +01:00
f727d00639 Bump version to 0.1.1 2025-11-01 17:21:10 +01:00
a946b14f69 Fix Android assets upload to correct release
Use gh CLI to upload Android APK and AAB to the tagged release.
2025-11-01 17:20:13 +01:00
471baec284 Simplify Android build: use default command for APK and AAB
tauri android build creates both APK and AAB by default.
2025-11-01 16:44:32 +01:00
8298d807f3 Fix Android build commands: use --apk and --aab flags
Changed from incorrect --bundle aab to correct --aab flag.
2025-11-01 16:34:15 +01:00
42e6459fbf Prevent duplicate builds on tag pushes
Build workflow now ignores all tags to avoid running alongside release workflow.
2025-11-01 16:06:35 +01:00
6ae87fc694 Fix Android OpenSSL build by adding NDK toolchain to PATH
Set proper CC, AR, and RANLIB environment variables for all Android targets
to enable OpenSSL cross-compilation with SQLCipher encryption.
2025-11-01 16:03:46 +01:00
f7867a5bde Restore SQLCipher encryption for Android and fix CI build
- Re-enable bundled-sqlcipher-vendored-openssl for Android
- Add NDK environment variables for OpenSSL compilation
- Install perl and make for OpenSSL build in CI
- Ensures encryption works on all platforms including Android
2025-11-01 15:39:44 +01:00
d82599f588 Fix Android build by using platform-specific rusqlite features
- Use bundled-sqlcipher-vendored-openssl for non-Android platforms
- Use bundled (standard SQLite) for Android to avoid OpenSSL compilation issues
- Resolves OpenSSL build errors on Android targets
2025-11-01 15:36:20 +01:00
72bb211a76 Fix secrets access in workflow conditional
- Move secrets to env block instead of if condition
- Use bash conditional to check if keystore is available
- Provide clear logging for signed vs unsigned builds
2025-11-01 15:28:06 +01:00
f14ce0d6ad Add Android signing configuration to Gradle
- Configure signingConfigs to read from environment variables
- Apply signing to release builds when keystore is available
- Support both signed and unsigned builds
2025-11-01 15:26:21 +01:00
af09f4524d Remove iOS builds from CI/CD workflows 2025-11-01 15:21:58 +01:00
102832675d Fix Android build commands syntax
- Change from --apk to default build (produces APK)
- Change from --aab to --bundle aab for AAB generation
2025-11-01 15:20:49 +01:00
3490de2f51 Configure Android signing and disable iOS builds
- Add optional Android signing for build workflow (unsigned for testing)
- Require Android signing for release workflow
- Disable iOS builds (commented out) until Apple Developer Account is available
2025-11-01 15:06:56 +01:00
7c3af10938 Add Android and iOS builds to CI/CD pipelines 2025-11-01 15:00:33 +01:00
5c5d0785b9 Fix pnpm version conflict in CI workflows 2025-11-01 14:48:58 +01:00
121dd9dd00 Add GitHub Actions CI/CD pipelines
- Add build pipeline for Windows, macOS, and Linux
- Add release pipeline for automated releases
- Remove CLAUDE.md from git tracking
2025-11-01 14:46:01 +01:00
4ff6aee4d8 Fix Vue i18n warnings and component root node issues
- Set useScope: 'global' in UI store to prevent i18n scope conflicts
- Add wrapper div to vault page to ensure single root node for transitions
- Fixes 'Duplicate useI18n calling by local scope' warning
- Fixes 'Component inside <Transition> renders non-element root node' warning
2025-10-31 23:24:20 +01:00
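The scope fix boils down to requesting the global i18n scope when translating outside a component's own setup scope. A minimal sketch (the store body is illustrative):

import { defineStore } from 'pinia'
import { useI18n } from 'vue-i18n'

export const useUiStore = defineStore('ui', () => {
  // 'global' reuses the app-wide i18n instance instead of creating a local scope,
  // which avoids the "Duplicate useI18n calling by local scope" warning.
  const { t, locale } = useI18n({ useScope: 'global' })
  return { t, locale }
})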
dceb49ae90 Add context menu for vault actions and trash functionality
- Add UiButtonContext component for context menu support on buttons
- Implement vault trash functionality using trash crate
- Move vaults to system trash on desktop (with fallback to permanent delete on mobile)
- Add context menu to vault list items for better mobile UX
- Keep hover delete button for desktop users
2025-10-31 22:57:56 +01:00
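On the frontend, deleting a vault amounts to invoking the move_vault_to_trash command named earlier in this log; a hedged sketch, where the argument name is an assumption:

import { invoke } from '@tauri-apps/api/core'

async function deleteVault(vaultPath: string) {
  try {
    // Command exists in src-tauri; the parameter name here is assumed.
    await invoke('move_vault_to_trash', { path: vaultPath })
  } catch (error) {
    // On mobile the backend falls back to permanent deletion; surface failures either way.
    console.error('Moving vault to trash failed', error)
  }
}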
5ea04a80e0 Fix Android safe-area handling and window maximization
- Fix extension signature verification on Android by canonicalizing paths (symlink compatibility)
- Implement proper safe-area-inset handling for mobile devices
- Add reactive header height measurement to UI store
- Fix maximized window positioning to respect safe-areas and header
- Create reusable HaexDebugOverlay component for mobile debugging
- Fix Swiper navigation by using absolute positioning instead of flex-1
- Remove debug logging after Android compatibility confirmed

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-10-31 02:18:59 +01:00
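The reactive header measurement can be done with VueUse's useElementSize; together with the CSS safe-area insets (viewport-fit=cover is set in nuxt.config.ts further down), maximized windows can then be offset below the header. A sketch with illustrative names:

import { ref } from 'vue'
import { useElementSize } from '@vueuse/core'

// Template ref on the app header; height updates reactively via ResizeObserver.
const headerRef = ref<HTMLElement | null>(null)
const { height: headerHeight } = useElementSize(headerRef)

// A maximized window is then positioned below the header and the top safe-area,
// e.g. top: calc(env(safe-area-inset-top) + <headerHeight>px).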
65cf2e2c3c adjust gitignore 2025-10-30 22:01:31 +01:00
68d542b4d7 Update extension system and database migrations
Changes:
- Added CLAUDE.md with project instructions
- Updated extension manifest bindings (TypeScript)
- Regenerated database migrations (consolidated into single migration)
- Updated haex schema with table name handling
- Enhanced extension manager and manifest handling in Rust
- Updated extension store in frontend
- Updated vault.db

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-10-30 21:59:13 +01:00
f97cd4ad97 adjust drizzle backend.
return array of arrays
handle table names with quotes
2025-10-30 04:57:01 +01:00
ef225b281f refactored design 2025-10-28 14:16:17 +01:00
16b71d9ea8 fix: Snap Dropzones 2025-10-27 11:26:12 +01:00
5ee5ced8c0 desktopicons now with foreign key to extensions 2025-10-26 00:19:15 +02:00
86b65f117d cleanup. renamed postMessages 2025-10-25 23:17:28 +02:00
5fdea155d1 removed logs 2025-10-25 08:14:59 +02:00
cb0c8d71f4 fix window on workspace rendering 2025-10-25 08:09:15 +02:00
9281a85deb fix linting 2025-10-24 14:37:20 +02:00
8f8bbb5558 fix window overview 2025-10-24 14:33:56 +02:00
252b8711de feature: window overview 2025-10-24 13:17:29 +02:00
4f839aa856 fixed trigger 2025-10-23 13:17:58 +02:00
99ccadce00 removed pk fk mapping 2025-10-23 10:24:19 +02:00
922ae539ba no more soft delete => we do it hard now 2025-10-23 09:26:36 +02:00
3d020e7dcf refactored workspace table 2025-10-22 15:52:56 +02:00
f70e924cc3 refactored rust sql and drizzle 2025-10-22 15:05:36 +02:00
9ea057e943 fixed drizzle rust logic 2025-10-21 16:29:13 +02:00
e268947593 reorganized window 2025-10-21 13:49:29 +02:00
df97a3cb8b fix launcher 2025-10-20 22:44:35 +02:00
57fb496fca changed openWindow signature 2025-10-20 20:03:39 +02:00
2b8f1781f3 use window system 2025-10-20 19:14:05 +02:00
125 changed files with 10502 additions and 9481 deletions


@ -1,9 +0,0 @@
{
"lastUpdated": "2025-10-16T00:00:00.000Z",
"todos": [],
"context": {
"description": "Session context file for Claude Code. This file is automatically updated to persist state across sessions.",
"currentFocus": null,
"notes": []
}
}

.github/workflows/build.yml (new file, 228 lines)

@ -0,0 +1,228 @@
name: Build
on:
push:
branches:
- main
- develop
tags-ignore:
- '**'
pull_request:
branches:
- main
- develop
workflow_dispatch:
jobs:
build-desktop:
strategy:
fail-fast: false
matrix:
include:
- platform: 'macos-latest'
args: '--target aarch64-apple-darwin'
- platform: 'macos-latest'
args: '--target x86_64-apple-darwin'
- platform: 'ubuntu-22.04'
args: ''
- platform: 'windows-latest'
args: ''
runs-on: ${{ matrix.platform }}
steps:
- name: Checkout repository
uses: actions/checkout@v4
- name: Setup Node.js
uses: actions/setup-node@v4
with:
node-version: '20'
- name: Install pnpm
uses: pnpm/action-setup@v4
- name: Setup Rust
uses: dtolnay/rust-toolchain@stable
with:
targets: ${{ matrix.platform == 'macos-latest' && 'aarch64-apple-darwin,x86_64-apple-darwin' || '' }}
- name: Install dependencies (Ubuntu)
if: matrix.platform == 'ubuntu-22.04'
run: |
sudo apt-get update
sudo apt-get install -y libwebkit2gtk-4.1-dev libappindicator3-dev librsvg2-dev patchelf libssl-dev
- name: Get pnpm store directory
shell: bash
run: |
echo "STORE_PATH=$(pnpm store path --silent)" >> $GITHUB_ENV
- name: Setup pnpm cache
uses: actions/cache@v4
with:
path: ${{ env.STORE_PATH }}
key: ${{ runner.os }}-pnpm-store-${{ hashFiles('**/pnpm-lock.yaml') }}
restore-keys: |
${{ runner.os }}-pnpm-store-
- name: Setup Rust cache
uses: Swatinem/rust-cache@v2
with:
workspaces: src-tauri
- name: Install frontend dependencies
run: pnpm install --frozen-lockfile
- name: Build Tauri app
uses: tauri-apps/tauri-action@v0
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
args: ${{ matrix.args }}
- name: Upload artifacts (macOS)
if: matrix.platform == 'macos-latest'
uses: actions/upload-artifact@v4
with:
name: macos-${{ contains(matrix.args, 'aarch64') && 'aarch64' || 'x86_64' }}
path: |
src-tauri/target/*/release/bundle/dmg/*.dmg
src-tauri/target/*/release/bundle/macos/*.app
- name: Upload artifacts (Ubuntu)
if: matrix.platform == 'ubuntu-22.04'
uses: actions/upload-artifact@v4
with:
name: linux
path: |
src-tauri/target/release/bundle/deb/*.deb
src-tauri/target/release/bundle/appimage/*.AppImage
src-tauri/target/release/bundle/rpm/*.rpm
- name: Upload artifacts (Windows)
if: matrix.platform == 'windows-latest'
uses: actions/upload-artifact@v4
with:
name: windows
path: |
src-tauri/target/release/bundle/msi/*.msi
src-tauri/target/release/bundle/nsis/*.exe
build-android:
runs-on: ubuntu-22.04
steps:
- name: Checkout repository
uses: actions/checkout@v4
- name: Setup Node.js
uses: actions/setup-node@v4
with:
node-version: '20'
- name: Install pnpm
uses: pnpm/action-setup@v4
- name: Setup Java
uses: actions/setup-java@v4
with:
distribution: 'temurin'
java-version: '17'
- name: Setup Android SDK
uses: android-actions/setup-android@v3
- name: Setup Rust
uses: dtolnay/rust-toolchain@stable
- name: Install Rust Android targets
run: |
rustup target add aarch64-linux-android
rustup target add armv7-linux-androideabi
rustup target add i686-linux-android
rustup target add x86_64-linux-android
- name: Setup NDK
uses: nttld/setup-ndk@v1
with:
ndk-version: r26d
id: setup-ndk
- name: Setup Android NDK environment for OpenSSL
run: |
echo "ANDROID_NDK_HOME=${{ steps.setup-ndk.outputs.ndk-path }}" >> $GITHUB_ENV
echo "NDK_HOME=${{ steps.setup-ndk.outputs.ndk-path }}" >> $GITHUB_ENV
# Add all Android toolchains to PATH for OpenSSL cross-compilation
echo "${{ steps.setup-ndk.outputs.ndk-path }}/toolchains/llvm/prebuilt/linux-x86_64/bin" >> $GITHUB_PATH
# Set CC, AR, RANLIB for each target
echo "CC_aarch64_linux_android=aarch64-linux-android24-clang" >> $GITHUB_ENV
echo "AR_aarch64_linux_android=llvm-ar" >> $GITHUB_ENV
echo "RANLIB_aarch64_linux_android=llvm-ranlib" >> $GITHUB_ENV
echo "CC_armv7_linux_androideabi=armv7a-linux-androideabi24-clang" >> $GITHUB_ENV
echo "AR_armv7_linux_androideabi=llvm-ar" >> $GITHUB_ENV
echo "RANLIB_armv7_linux_androideabi=llvm-ranlib" >> $GITHUB_ENV
echo "CC_i686_linux_android=i686-linux-android24-clang" >> $GITHUB_ENV
echo "AR_i686_linux_android=llvm-ar" >> $GITHUB_ENV
echo "RANLIB_i686_linux_android=llvm-ranlib" >> $GITHUB_ENV
echo "CC_x86_64_linux_android=x86_64-linux-android24-clang" >> $GITHUB_ENV
echo "AR_x86_64_linux_android=llvm-ar" >> $GITHUB_ENV
echo "RANLIB_x86_64_linux_android=llvm-ranlib" >> $GITHUB_ENV
- name: Install build dependencies for OpenSSL
run: |
sudo apt-get update
sudo apt-get install -y perl make
- name: Get pnpm store directory
shell: bash
run: |
echo "STORE_PATH=$(pnpm store path --silent)" >> $GITHUB_ENV
- name: Setup pnpm cache
uses: actions/cache@v4
with:
path: ${{ env.STORE_PATH }}
key: ${{ runner.os }}-pnpm-store-${{ hashFiles('**/pnpm-lock.yaml') }}
restore-keys: |
${{ runner.os }}-pnpm-store-
- name: Setup Rust cache
uses: Swatinem/rust-cache@v2
with:
workspaces: src-tauri
- name: Install frontend dependencies
run: pnpm install --frozen-lockfile
- name: Setup Keystore (if secrets available)
env:
ANDROID_KEYSTORE: ${{ secrets.ANDROID_KEYSTORE }}
ANDROID_KEYSTORE_PASSWORD: ${{ secrets.ANDROID_KEYSTORE_PASSWORD }}
ANDROID_KEY_ALIAS: ${{ secrets.ANDROID_KEY_ALIAS }}
ANDROID_KEY_PASSWORD: ${{ secrets.ANDROID_KEY_PASSWORD }}
run: |
if [ -n "$ANDROID_KEYSTORE" ]; then
echo "$ANDROID_KEYSTORE" | base64 -d > $HOME/keystore.jks
echo "ANDROID_KEYSTORE_PATH=$HOME/keystore.jks" >> $GITHUB_ENV
echo "ANDROID_KEYSTORE_PASSWORD=$ANDROID_KEYSTORE_PASSWORD" >> $GITHUB_ENV
echo "ANDROID_KEY_ALIAS=$ANDROID_KEY_ALIAS" >> $GITHUB_ENV
echo "ANDROID_KEY_PASSWORD=$ANDROID_KEY_PASSWORD" >> $GITHUB_ENV
echo "Keystore configured for signing"
else
echo "No keystore configured, building unsigned APK"
fi
- name: Build Android APK and AAB (unsigned if no keystore)
run: pnpm tauri android build
- name: Upload Android artifacts
uses: actions/upload-artifact@v4
with:
name: android
path: |
src-tauri/gen/android/app/build/outputs/apk/**/*.apk
src-tauri/gen/android/app/build/outputs/bundle/**/*.aab

.github/workflows/release.yml (new file, 251 lines)

@ -0,0 +1,251 @@
name: Release
on:
push:
tags:
- 'v*'
workflow_dispatch:
jobs:
create-release:
permissions:
contents: write
runs-on: ubuntu-22.04
outputs:
release_id: ${{ steps.create-release.outputs.release_id }}
upload_url: ${{ steps.create-release.outputs.upload_url }}
steps:
- name: Checkout repository
uses: actions/checkout@v4
- name: Setup Node.js
uses: actions/setup-node@v4
with:
node-version: '20'
- name: Get version
run: echo "PACKAGE_VERSION=$(node -p "require('./package.json').version")" >> $GITHUB_ENV
- name: Create release
id: create-release
uses: actions/github-script@v7
with:
script: |
const { data } = await github.rest.repos.createRelease({
owner: context.repo.owner,
repo: context.repo.repo,
tag_name: `v${process.env.PACKAGE_VERSION}`,
name: `haex-hub v${process.env.PACKAGE_VERSION}`,
body: 'Take a look at the assets to download and install this app.',
draft: true,
prerelease: false
})
core.setOutput('release_id', data.id)
core.setOutput('upload_url', data.upload_url)
return data.id
build-desktop:
needs: create-release
permissions:
contents: write
strategy:
fail-fast: false
matrix:
include:
- platform: 'macos-latest'
args: '--target aarch64-apple-darwin'
- platform: 'macos-latest'
args: '--target x86_64-apple-darwin'
- platform: 'ubuntu-22.04'
args: ''
- platform: 'windows-latest'
args: ''
runs-on: ${{ matrix.platform }}
steps:
- name: Checkout repository
uses: actions/checkout@v4
- name: Setup Node.js
uses: actions/setup-node@v4
with:
node-version: '20'
- name: Install pnpm
uses: pnpm/action-setup@v4
- name: Setup Rust
uses: dtolnay/rust-toolchain@stable
with:
targets: ${{ matrix.platform == 'macos-latest' && 'aarch64-apple-darwin,x86_64-apple-darwin' || '' }}
- name: Install dependencies (Ubuntu)
if: matrix.platform == 'ubuntu-22.04'
run: |
sudo apt-get update
sudo apt-get install -y libwebkit2gtk-4.1-dev libappindicator3-dev librsvg2-dev patchelf libssl-dev
- name: Get pnpm store directory
shell: bash
run: |
echo "STORE_PATH=$(pnpm store path --silent)" >> $GITHUB_ENV
- name: Setup pnpm cache
uses: actions/cache@v4
with:
path: ${{ env.STORE_PATH }}
key: ${{ runner.os }}-pnpm-store-${{ hashFiles('**/pnpm-lock.yaml') }}
restore-keys: |
${{ runner.os }}-pnpm-store-
- name: Setup Rust cache
uses: Swatinem/rust-cache@v2
with:
workspaces: src-tauri
- name: Install frontend dependencies
run: pnpm install --frozen-lockfile
- name: Build and release Tauri app
uses: tauri-apps/tauri-action@v0
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
releaseId: ${{ needs.create-release.outputs.release_id }}
args: ${{ matrix.args }}
build-android:
needs: create-release
permissions:
contents: write
runs-on: ubuntu-22.04
steps:
- name: Checkout repository
uses: actions/checkout@v4
- name: Setup Node.js
uses: actions/setup-node@v4
with:
node-version: '20'
- name: Install pnpm
uses: pnpm/action-setup@v4
- name: Setup Java
uses: actions/setup-java@v4
with:
distribution: 'temurin'
java-version: '17'
- name: Setup Android SDK
uses: android-actions/setup-android@v3
- name: Setup Rust
uses: dtolnay/rust-toolchain@stable
- name: Install Rust Android targets
run: |
rustup target add aarch64-linux-android
rustup target add armv7-linux-androideabi
rustup target add i686-linux-android
rustup target add x86_64-linux-android
- name: Setup NDK
uses: nttld/setup-ndk@v1
with:
ndk-version: r26d
id: setup-ndk
- name: Setup Android NDK environment for OpenSSL
run: |
echo "ANDROID_NDK_HOME=${{ steps.setup-ndk.outputs.ndk-path }}" >> $GITHUB_ENV
echo "NDK_HOME=${{ steps.setup-ndk.outputs.ndk-path }}" >> $GITHUB_ENV
# Add all Android toolchains to PATH for OpenSSL cross-compilation
echo "${{ steps.setup-ndk.outputs.ndk-path }}/toolchains/llvm/prebuilt/linux-x86_64/bin" >> $GITHUB_PATH
# Set CC, AR, RANLIB for each target
echo "CC_aarch64_linux_android=aarch64-linux-android24-clang" >> $GITHUB_ENV
echo "AR_aarch64_linux_android=llvm-ar" >> $GITHUB_ENV
echo "RANLIB_aarch64_linux_android=llvm-ranlib" >> $GITHUB_ENV
echo "CC_armv7_linux_androideabi=armv7a-linux-androideabi24-clang" >> $GITHUB_ENV
echo "AR_armv7_linux_androideabi=llvm-ar" >> $GITHUB_ENV
echo "RANLIB_armv7_linux_androideabi=llvm-ranlib" >> $GITHUB_ENV
echo "CC_i686_linux_android=i686-linux-android24-clang" >> $GITHUB_ENV
echo "AR_i686_linux_android=llvm-ar" >> $GITHUB_ENV
echo "RANLIB_i686_linux_android=llvm-ranlib" >> $GITHUB_ENV
echo "CC_x86_64_linux_android=x86_64-linux-android24-clang" >> $GITHUB_ENV
echo "AR_x86_64_linux_android=llvm-ar" >> $GITHUB_ENV
echo "RANLIB_x86_64_linux_android=llvm-ranlib" >> $GITHUB_ENV
- name: Install build dependencies for OpenSSL
run: |
sudo apt-get update
sudo apt-get install -y perl make
- name: Get pnpm store directory
shell: bash
run: |
echo "STORE_PATH=$(pnpm store path --silent)" >> $GITHUB_ENV
- name: Setup pnpm cache
uses: actions/cache@v4
with:
path: ${{ env.STORE_PATH }}
key: ${{ runner.os }}-pnpm-store-${{ hashFiles('**/pnpm-lock.yaml') }}
restore-keys: |
${{ runner.os }}-pnpm-store-
- name: Setup Rust cache
uses: Swatinem/rust-cache@v2
with:
workspaces: src-tauri
- name: Install frontend dependencies
run: pnpm install --frozen-lockfile
- name: Setup Keystore (required for release)
run: |
echo "${{ secrets.ANDROID_KEYSTORE }}" | base64 -d > $HOME/keystore.jks
echo "ANDROID_KEYSTORE_PATH=$HOME/keystore.jks" >> $GITHUB_ENV
echo "ANDROID_KEYSTORE_PASSWORD=${{ secrets.ANDROID_KEYSTORE_PASSWORD }}" >> $GITHUB_ENV
echo "ANDROID_KEY_ALIAS=${{ secrets.ANDROID_KEY_ALIAS }}" >> $GITHUB_ENV
echo "ANDROID_KEY_PASSWORD=${{ secrets.ANDROID_KEY_PASSWORD }}" >> $GITHUB_ENV
- name: Build Android APK and AAB (signed)
run: pnpm tauri android build
- name: Upload Android artifacts to Release
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
run: |
gh release upload ${{ github.ref_name }} \
src-tauri/gen/android/app/build/outputs/apk/universal/release/app-universal-release.apk \
src-tauri/gen/android/app/build/outputs/bundle/universalRelease/app-universal-release.aab \
--clobber
publish-release:
permissions:
contents: write
runs-on: ubuntu-22.04
needs: [create-release, build-desktop, build-android]
steps:
- name: Publish release
id: publish-release
uses: actions/github-script@v7
env:
release_id: ${{ needs.create-release.outputs.release_id }}
with:
script: |
github.rest.repos.updateRelease({
owner: context.repo.owner,
repo: context.repo.repo,
release_id: process.env.release_id,
draft: false,
prerelease: false
})

.gitignore (4 lines changed)

@@ -26,4 +26,6 @@ dist-ssr
 src-tauri/target
 nogit*
 .claude
 .output
+target
+CLAUDE.md

drizzle.config.ts

@@ -1,7 +1,7 @@
 import { defineConfig } from 'drizzle-kit'
 export default defineConfig({
-  schema: './src-tauri/database/schemas/**.ts',
+  schema: './src/database/schemas/**.ts',
   out: './src-tauri/database/migrations',
   dialect: 'sqlite',
   dbCredentials: {

nuxt.config.ts

@@ -1,5 +1,3 @@
-//import tailwindcss from '@tailwindcss/vite'
 import { fileURLToPath } from 'node:url'
 // https://nuxt.com/docs/api/configuration/nuxt-config
@@ -16,6 +14,9 @@ export default defineNuxtConfig({
   },
   app: {
+    head: {
+      viewport: 'width=device-width, initial-scale=1.0, viewport-fit=cover',
+    },
     pageTransition: {
       name: 'fade',
     },
@@ -28,7 +29,6 @@
     '@vueuse/nuxt',
     '@nuxt/icon',
     '@nuxt/eslint',
-    //"@nuxt/image",
     '@nuxt/fonts',
     '@nuxt/ui',
   ],
@@ -41,6 +41,20 @@
      'pages/**',
      'types/**',
    ],
+    presets: [
+      {
+        from: '@vueuse/gesture',
+        imports: [
+          'useDrag',
+          'useGesture',
+          'useHover',
+          'useMove',
+          'usePinch',
+          'useScroll',
+          'useWheel',
+        ],
+      },
+    ],
   },
   css: ['./assets/css/main.css'],
@@ -54,7 +68,7 @@
       includeCustomCollections: true,
     },
     serverBundle: {
-      collections: ['mdi', 'line-md', 'solar', 'gg', 'emojione'],
+      collections: ['mdi', 'line-md', 'solar', 'gg', 'emojione', 'lucide', 'hugeicons'],
     },
     customCollections: [
@@ -94,8 +108,7 @@
   runtimeConfig: {
     public: {
       haexVault: {
-        lastVaultFileName: 'lastVaults.json',
-        instanceFileName: 'instance.json',
+        deviceFileName: 'device.json',
         defaultVaultName: 'HaexHub',
       },
     },
@@ -109,7 +122,6 @@
   },
   vite: {
-    //plugins: [tailwindcss()],
     // Better support for Tauri CLI output
     clearScreen: false,
     // Enable environment variables

package.json

@@ -1,7 +1,7 @@
 {
   "name": "haex-hub",
   "private": true,
-  "version": "0.1.0",
+  "version": "0.1.4",
   "type": "module",
   "scripts": {
     "build": "nuxt build",
@@ -21,47 +21,48 @@
     "@nuxt/eslint": "1.9.0",
     "@nuxt/fonts": "0.11.4",
     "@nuxt/icon": "2.0.0",
-    "@nuxt/ui": "4.0.0",
+    "@nuxt/ui": "4.1.0",
     "@nuxtjs/i18n": "10.0.6",
-    "@pinia/nuxt": "^0.11.1",
+    "@pinia/nuxt": "^0.11.2",
-    "@tailwindcss/vite": "^4.1.10",
+    "@tailwindcss/vite": "^4.1.16",
-    "@tauri-apps/api": "^2.5.0",
+    "@tauri-apps/api": "^2.9.0",
-    "@tauri-apps/plugin-dialog": "^2.2.2",
+    "@tauri-apps/plugin-dialog": "^2.4.2",
-    "@tauri-apps/plugin-fs": "^2.3.0",
+    "@tauri-apps/plugin-fs": "^2.4.4",
-    "@tauri-apps/plugin-http": "2.5.2",
     "@tauri-apps/plugin-notification": "2.3.1",
-    "@tauri-apps/plugin-opener": "^2.3.0",
+    "@tauri-apps/plugin-opener": "^2.5.2",
-    "@tauri-apps/plugin-os": "^2.2.2",
+    "@tauri-apps/plugin-os": "^2.3.2",
-    "@tauri-apps/plugin-sql": "2.3.0",
-    "@tauri-apps/plugin-store": "^2.2.1",
+    "@tauri-apps/plugin-store": "^2.4.1",
     "@vueuse/components": "^13.9.0",
-    "@vueuse/core": "^13.4.0",
+    "@vueuse/core": "^13.9.0",
-    "@vueuse/nuxt": "^13.4.0",
+    "@vueuse/gesture": "^2.0.0",
+    "@vueuse/nuxt": "^13.9.0",
-    "drizzle-orm": "^0.44.2",
+    "drizzle-orm": "^0.44.7",
-    "eslint": "^9.34.0",
+    "eslint": "^9.38.0",
-    "fuse.js": "^7.1.0",
-    "nuxt": "^4.0.3",
-    "nuxt-zod-i18n": "^1.12.0",
+    "nuxt-zod-i18n": "^1.12.1",
+    "swiper": "^12.0.3",
-    "tailwindcss": "^4.1.10",
+    "tailwindcss": "^4.1.16",
-    "vue": "^3.5.20",
+    "vue": "^3.5.22",
-    "vue-router": "^4.5.1",
+    "vue-router": "^4.6.3",
-    "zod": "4.1.5"
+    "zod": "^3.25.76"
   },
   "devDependencies": {
-    "@iconify/json": "^2.2.351",
+    "@iconify-json/hugeicons": "^1.2.17",
+    "@iconify-json/lucide": "^1.2.71",
+    "@iconify/json": "^2.2.401",
     "@iconify/tailwind4": "^1.0.6",
     "@libsql/client": "^0.15.15",
-    "@tauri-apps/cli": "^2.5.0",
+    "@tauri-apps/cli": "^2.9.1",
-    "@types/node": "^24.6.2",
+    "@types/node": "^24.9.1",
     "@vitejs/plugin-vue": "6.0.1",
-    "@vue/compiler-sfc": "^3.5.17",
+    "@vue/compiler-sfc": "^3.5.22",
-    "drizzle-kit": "^0.31.2",
+    "drizzle-kit": "^0.31.5",
-    "globals": "^16.2.0",
+    "globals": "^16.4.0",
+    "nuxt": "^4.2.0",
     "prettier": "3.6.2",
     "tsx": "^4.20.6",
-    "tw-animate-css": "^1.3.8",
+    "tw-animate-css": "^1.4.0",
-    "typescript": "^5.8.3",
+    "typescript": "^5.9.3",
-    "vite": "7.1.3",
+    "vite": "^7.1.3",
     "vue-tsc": "3.0.6"
   },
   "prettier": {

pnpm-lock.yaml (generated, 4563 lines changed; diff suppressed because it is too large)

src-tauri/Cargo.lock (generated, 1303 lines changed; diff suppressed because it is too large)

src-tauri/Cargo.toml

@@ -1,6 +1,6 @@
 [package]
 name = "haex-hub"
-version = "0.1.0"
+version = "0.1.4"
 description = "A Tauri App"
 authors = ["you"]
 edition = "2021"
@@ -20,14 +20,7 @@ tauri-build = { version = "2.2", features = [] }
 serde = { version = "1.0.228", features = ["derive"] }
 [dependencies]
-rusqlite = { version = "0.37.0", features = [
-    "load_extension",
-    "bundled-sqlcipher-vendored-openssl",
-    "functions",
-] }
-#tauri-plugin-sql = { version = "2", features = ["sqlite"] }
-tokio = { version = "1.47.1", features = ["macros", "rt-multi-thread"] }
-#libsqlite3-sys = { version = "0.31", features = ["bundled-sqlcipher"] }
-#sqlx = { version = "0.8", features = ["runtime-tokio-rustls", "sqlite"] }
+tokio = { version = "1.47.1", features = ["macros", "rt-multi-thread"] }
 base64 = "0.22"
 ed25519-dalek = "2.1"
 fs_extra = "1.3.0"
@@ -39,18 +32,26 @@ serde = { version = "1", features = ["derive"] }
 serde_json = "1.0.143"
 sha2 = "0.10.9"
 sqlparser = { version = "0.59.0", features = ["visitor"] }
-tauri = { version = "2.8.5", features = ["protocol-asset", "devtools"] }
+tauri = { version = "2.9.1", features = ["protocol-asset", "devtools"] }
-tauri-plugin-dialog = "2.4.0"
+tauri-plugin-dialog = "2.4.2"
 tauri-plugin-fs = "2.4.0"
-tauri-plugin-http = "2.5.2"
+tauri-plugin-http = "2.5.4"
-tauri-plugin-notification = "2.3.1"
+tauri-plugin-notification = "2.3.3"
-tauri-plugin-opener = "2.5.0"
+tauri-plugin-opener = "2.5.2"
-tauri-plugin-os = "2.3"
+tauri-plugin-os = "2.3.2"
-tauri-plugin-persisted-scope = "2.3.2"
+tauri-plugin-persisted-scope = "2.3.4"
-tauri-plugin-store = "2.4.0"
+tauri-plugin-store = "2.4.1"
 thiserror = "2.0.17"
 ts-rs = { version = "11.1.0", features = ["serde-compat"] }
 uhlc = "0.8.2"
+url = "2.5.7"
 uuid = { version = "1.18.1", features = ["v4"] }
 zip = "6.0.0"
-url = "2.5.7"
+rusqlite = { version = "0.37.0", features = [
+    "load_extension",
+    "bundled-sqlcipher-vendored-openssl",
+    "functions",
+] }
+[target.'cfg(not(target_os = "android"))'.dependencies]
+trash = "5.2.5"

ExtensionInfoResponse.ts (generated by ts-rs)

@@ -1,3 +1,3 @@
 // This file was generated by [ts-rs](https://github.com/Aleph-Alpha/ts-rs). Do not edit this file manually.
-export type ExtensionInfoResponse = { id: string, publicKey: string, name: string, version: string, author: string | null, enabled: boolean, description: string | null, homepage: string | null, icon: string | null, devServerUrl: string | null, };
+export type ExtensionInfoResponse = { id: string, publicKey: string, name: string, version: string, author: string | null, enabled: boolean, description: string | null, homepage: string | null, icon: string | null, entry: string | null, singleInstance: boolean | null, devServerUrl: string | null, };

ExtensionManifest.ts (generated by ts-rs)

@@ -1,4 +1,4 @@
 // This file was generated by [ts-rs](https://github.com/Aleph-Alpha/ts-rs). Do not edit this file manually.
 import type { ExtensionPermissions } from "./ExtensionPermissions";
-export type ExtensionManifest = { name: string, version: string, author: string | null, entry: string, icon: string | null, public_key: string, signature: string, permissions: ExtensionPermissions, homepage: string | null, description: string | null, };
+export type ExtensionManifest = { name: string, version: string, author: string | null, entry: string | null, icon: string | null, public_key: string, signature: string, permissions: ExtensionPermissions, homepage: string | null, description: string | null, single_instance: boolean | null, };


@@ -18,8 +18,14 @@
   "fs:allow-appconfig-write-recursive",
   "fs:allow-appdata-read-recursive",
   "fs:allow-appdata-write-recursive",
+  "fs:allow-applocaldata-read-recursive",
+  "fs:allow-applocaldata-write-recursive",
   "fs:allow-read-file",
+  "fs:allow-write-file",
   "fs:allow-read-dir",
+  "fs:allow-mkdir",
+  "fs:allow-exists",
+  "fs:allow-remove",
   "fs:allow-resource-read-recursive",
   "fs:allow-resource-write-recursive",
   "fs:allow-download-read-recursive",
@@ -35,6 +41,7 @@
   "notification:allow-create-channel",
   "notification:allow-list-channels",
   "notification:allow-notify",
+  "notification:allow-is-permission-granted",
   "notification:default",
   "opener:allow-open-url",
   "opener:default",


@@ -1,8 +1,8 @@
 import { writeFileSync, mkdirSync } from 'node:fs'
 import { join, dirname } from 'node:path'
 import { fileURLToPath } from 'node:url'
-import tablesNames from './tableNames.json'
-import { schema } from './index'
+import tablesNames from '../../src/database/tableNames.json'
+import { schema } from '../../src/database/index'
 import { getTableColumns } from 'drizzle-orm'
 import type { AnySQLiteColumn, SQLiteTable } from 'drizzle-orm/sqlite-core'
@@ -170,6 +170,14 @@ use serde::{Deserialize, Serialize};
     table: schema.haexCrdtSnapshots,
   },
   { name: tablesNames.haex.crdt.configs.name, table: schema.haexCrdtConfigs },
+  {
+    name: tablesNames.haex.desktop_items.name,
+    table: schema.haexDesktopItems,
+  },
+  {
+    name: tablesNames.haex.workspaces.name,
+    table: schema.haexWorkspaces,
+  },
 ]
for (const { name, table } of schemas) { for (const { name, table } of schemas) {


@ -1,23 +0,0 @@
import { drizzle } from 'drizzle-orm/sqlite-proxy' // adapter for query building without a direct connection
import * as schema from './schemas' // import everything from the schema file
export * as schema from './schemas'
// sqlite-proxy requires a (dummy) executor function as an argument.
// It is never called in our Tauri workflow, since we only use .toSQL().
// It has to exist, though, so that drizzle() can be called at all.
const dummyExecutor = async (
  sql: string,
  params: unknown[],
  method: 'all' | 'run' | 'get' | 'values',
) => {
  console.warn(
    `Frontend Drizzle executor was called (method: ${method}). This should not happen in the Tauri invoke workflow!`,
  )
  // Return empty results to satisfy the types in case it does get called.
  return { rows: [] } // for 'run' (e.g. INSERT/UPDATE)
}
// Create the Drizzle instance for the SQLite dialect,
// passing the dummyExecutor and the imported schema.
export const db = drizzle(dummyExecutor, { schema })
// Re-export all schema definitions so everything can be imported from a single file.


@ -24,9 +24,23 @@ CREATE TABLE `haex_crdt_snapshots` (
`file_size_bytes` integer `file_size_bytes` integer
); );
--> statement-breakpoint --> statement-breakpoint
CREATE TABLE `haex_desktop_items` (
`id` text PRIMARY KEY NOT NULL,
`workspace_id` text NOT NULL,
`item_type` text NOT NULL,
`extension_id` text,
`system_window_id` text,
`position_x` integer DEFAULT 0 NOT NULL,
`position_y` integer DEFAULT 0 NOT NULL,
`haex_timestamp` text,
FOREIGN KEY (`workspace_id`) REFERENCES `haex_workspaces`(`id`) ON UPDATE no action ON DELETE cascade,
FOREIGN KEY (`extension_id`) REFERENCES `haex_extensions`(`id`) ON UPDATE no action ON DELETE cascade,
CONSTRAINT "item_reference" CHECK(("haex_desktop_items"."item_type" = 'extension' AND "haex_desktop_items"."extension_id" IS NOT NULL AND "haex_desktop_items"."system_window_id" IS NULL) OR ("haex_desktop_items"."item_type" = 'system' AND "haex_desktop_items"."system_window_id" IS NOT NULL AND "haex_desktop_items"."extension_id" IS NULL) OR ("haex_desktop_items"."item_type" = 'file' AND "haex_desktop_items"."system_window_id" IS NOT NULL AND "haex_desktop_items"."extension_id" IS NULL) OR ("haex_desktop_items"."item_type" = 'folder' AND "haex_desktop_items"."system_window_id" IS NOT NULL AND "haex_desktop_items"."extension_id" IS NULL))
);
--> statement-breakpoint
CREATE TABLE `haex_extension_permissions` ( CREATE TABLE `haex_extension_permissions` (
`id` text PRIMARY KEY NOT NULL, `id` text PRIMARY KEY NOT NULL,
`extension_id` text, `extension_id` text NOT NULL,
`resource_type` text, `resource_type` text,
`action` text, `action` text,
`target` text, `target` text,
@ -34,38 +48,28 @@ CREATE TABLE `haex_extension_permissions` (
`status` text DEFAULT 'denied' NOT NULL, `status` text DEFAULT 'denied' NOT NULL,
`created_at` text DEFAULT (CURRENT_TIMESTAMP), `created_at` text DEFAULT (CURRENT_TIMESTAMP),
`updated_at` integer, `updated_at` integer,
`haex_tombstone` integer,
`haex_timestamp` text, `haex_timestamp` text,
FOREIGN KEY (`extension_id`) REFERENCES `haex_extensions`(`id`) ON UPDATE no action ON DELETE no action FOREIGN KEY (`extension_id`) REFERENCES `haex_extensions`(`id`) ON UPDATE no action ON DELETE cascade
); );
--> statement-breakpoint --> statement-breakpoint
CREATE UNIQUE INDEX `haex_extension_permissions_extension_id_resource_type_action_target_unique` ON `haex_extension_permissions` (`extension_id`,`resource_type`,`action`,`target`);--> statement-breakpoint CREATE UNIQUE INDEX `haex_extension_permissions_extension_id_resource_type_action_target_unique` ON `haex_extension_permissions` (`extension_id`,`resource_type`,`action`,`target`);--> statement-breakpoint
CREATE TABLE `haex_extensions` ( CREATE TABLE `haex_extensions` (
`id` text PRIMARY KEY NOT NULL, `id` text PRIMARY KEY NOT NULL,
`public_key` text NOT NULL,
`name` text NOT NULL,
`version` text NOT NULL,
`author` text, `author` text,
`description` text, `description` text,
`entry` text, `entry` text DEFAULT 'index.html',
`homepage` text, `homepage` text,
`enabled` integer, `enabled` integer DEFAULT true,
`icon` text, `icon` text,
`name` text, `signature` text NOT NULL,
`public_key` text, `single_instance` integer DEFAULT false,
`signature` text,
`url` text,
`version` text,
`haex_tombstone` integer,
`haex_timestamp` text
);
--> statement-breakpoint
CREATE TABLE `haex_settings` (
`id` text PRIMARY KEY NOT NULL,
`key` text,
`type` text,
`value` text,
`haex_tombstone` integer,
`haex_timestamp` text `haex_timestamp` text
); );
--> statement-breakpoint --> statement-breakpoint
CREATE UNIQUE INDEX `haex_extensions_public_key_name_unique` ON `haex_extensions` (`public_key`,`name`);--> statement-breakpoint
CREATE TABLE `haex_notifications` ( CREATE TABLE `haex_notifications` (
`id` text PRIMARY KEY NOT NULL, `id` text PRIMARY KEY NOT NULL,
`alt` text, `alt` text,
@ -77,63 +81,25 @@ CREATE TABLE `haex_notifications` (
`text` text, `text` text,
`title` text, `title` text,
`type` text NOT NULL, `type` text NOT NULL,
`haex_tombstone` integer `haex_timestamp` text
); );
--> statement-breakpoint --> statement-breakpoint
CREATE TABLE `haex_passwords_group_items` ( CREATE TABLE `haex_settings` (
`group_id` text,
`item_id` text,
`haex_tombstone` integer,
PRIMARY KEY(`item_id`, `group_id`),
FOREIGN KEY (`group_id`) REFERENCES `haex_passwords_groups`(`id`) ON UPDATE no action ON DELETE no action,
FOREIGN KEY (`item_id`) REFERENCES `haex_passwords_item_details`(`id`) ON UPDATE no action ON DELETE no action
);
--> statement-breakpoint
CREATE TABLE `haex_passwords_groups` (
`id` text PRIMARY KEY NOT NULL, `id` text PRIMARY KEY NOT NULL,
`name` text,
`description` text,
`icon` text,
`order` integer,
`color` text,
`parent_id` text,
`created_at` text DEFAULT (CURRENT_TIMESTAMP),
`updated_at` integer,
`haex_tombstone` integer,
FOREIGN KEY (`parent_id`) REFERENCES `haex_passwords_groups`(`id`) ON UPDATE no action ON DELETE no action
);
--> statement-breakpoint
CREATE TABLE `haex_passwords_item_details` (
`id` text PRIMARY KEY NOT NULL,
`title` text,
`username` text,
`password` text,
`note` text,
`icon` text,
`tags` text,
`url` text,
`created_at` text DEFAULT (CURRENT_TIMESTAMP),
`updated_at` integer,
`haex_tombstone` integer
);
--> statement-breakpoint
CREATE TABLE `haex_passwords_item_history` (
`id` text PRIMARY KEY NOT NULL,
`item_id` text,
`changed_property` text,
`old_value` text,
`new_value` text,
`created_at` text DEFAULT (CURRENT_TIMESTAMP),
`haex_tombstone` integer,
FOREIGN KEY (`item_id`) REFERENCES `haex_passwords_item_details`(`id`) ON UPDATE no action ON DELETE no action
);
--> statement-breakpoint
CREATE TABLE `haex_passwords_item_key_values` (
`id` text PRIMARY KEY NOT NULL,
`item_id` text,
`key` text, `key` text,
`type` text,
`value` text, `value` text,
`updated_at` integer, `haex_timestamp` text
`haex_tombstone` integer,
FOREIGN KEY (`item_id`) REFERENCES `haex_passwords_item_details`(`id`) ON UPDATE no action ON DELETE no action
); );
--> statement-breakpoint
CREATE UNIQUE INDEX `haex_settings_key_type_value_unique` ON `haex_settings` (`key`,`type`,`value`);--> statement-breakpoint
CREATE TABLE `haex_workspaces` (
`id` text PRIMARY KEY NOT NULL,
`device_id` text NOT NULL,
`name` text NOT NULL,
`position` integer DEFAULT 0 NOT NULL,
`background` blob,
`haex_timestamp` text
);
--> statement-breakpoint
CREATE UNIQUE INDEX `haex_workspaces_position_unique` ON `haex_workspaces` (`position`);


@ -0,0 +1,15 @@
PRAGMA foreign_keys=OFF;--> statement-breakpoint
CREATE TABLE `__new_haex_workspaces` (
`id` text PRIMARY KEY NOT NULL,
`device_id` text NOT NULL,
`name` text NOT NULL,
`position` integer DEFAULT 0 NOT NULL,
`background` text,
`haex_timestamp` text
);
--> statement-breakpoint
INSERT INTO `__new_haex_workspaces`("id", "device_id", "name", "position", "background", "haex_timestamp") SELECT "id", "device_id", "name", "position", "background", "haex_timestamp" FROM `haex_workspaces`;--> statement-breakpoint
DROP TABLE `haex_workspaces`;--> statement-breakpoint
ALTER TABLE `__new_haex_workspaces` RENAME TO `haex_workspaces`;--> statement-breakpoint
PRAGMA foreign_keys=ON;--> statement-breakpoint
CREATE UNIQUE INDEX `haex_workspaces_position_unique` ON `haex_workspaces` (`position`);


@ -1 +0,0 @@
ALTER TABLE `haex_notifications` ADD `haex_timestamp` text;


@ -1,22 +0,0 @@
PRAGMA foreign_keys=OFF;--> statement-breakpoint
CREATE TABLE `__new_haex_extensions` (
`id` text PRIMARY KEY NOT NULL,
`public_key` text NOT NULL,
`name` text NOT NULL,
`version` text NOT NULL,
`author` text,
`description` text,
`entry` text DEFAULT 'index.html' NOT NULL,
`homepage` text,
`enabled` integer DEFAULT true,
`icon` text,
`signature` text NOT NULL,
`haex_tombstone` integer,
`haex_timestamp` text
);
--> statement-breakpoint
INSERT INTO `__new_haex_extensions`("id", "public_key", "name", "version", "author", "description", "entry", "homepage", "enabled", "icon", "signature", "haex_tombstone", "haex_timestamp") SELECT "id", "public_key", "name", "version", "author", "description", "entry", "homepage", "enabled", "icon", "signature", "haex_tombstone", "haex_timestamp" FROM `haex_extensions`;--> statement-breakpoint
DROP TABLE `haex_extensions`;--> statement-breakpoint
ALTER TABLE `__new_haex_extensions` RENAME TO `haex_extensions`;--> statement-breakpoint
PRAGMA foreign_keys=ON;--> statement-breakpoint
CREATE UNIQUE INDEX `haex_extensions_public_key_name_unique` ON `haex_extensions` (`public_key`,`name`);


@ -1,9 +0,0 @@
CREATE TABLE `haex_desktop_items` (
`id` text PRIMARY KEY NOT NULL,
`item_type` text NOT NULL,
`reference_id` text NOT NULL,
`position_x` integer DEFAULT 0 NOT NULL,
`position_y` integer DEFAULT 0 NOT NULL,
`haex_tombstone` integer,
`haex_timestamp` text
);


@ -1,7 +1,7 @@
{ {
"version": "6", "version": "6",
"dialect": "sqlite", "dialect": "sqlite",
"id": "3bbe52b8-5933-4b21-8b24-de3927a2f9b0", "id": "e3d61ad1-63be-41be-9243-41144e215f98",
"prevId": "00000000-0000-0000-0000-000000000000", "prevId": "00000000-0000-0000-0000-000000000000",
"tables": { "tables": {
"haex_crdt_configs": { "haex_crdt_configs": {
@ -155,6 +155,106 @@
"uniqueConstraints": {}, "uniqueConstraints": {},
"checkConstraints": {} "checkConstraints": {}
}, },
"haex_desktop_items": {
"name": "haex_desktop_items",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"workspace_id": {
"name": "workspace_id",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"item_type": {
"name": "item_type",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"extension_id": {
"name": "extension_id",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"system_window_id": {
"name": "system_window_id",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"position_x": {
"name": "position_x",
"type": "integer",
"primaryKey": false,
"notNull": true,
"autoincrement": false,
"default": 0
},
"position_y": {
"name": "position_y",
"type": "integer",
"primaryKey": false,
"notNull": true,
"autoincrement": false,
"default": 0
},
"haex_timestamp": {
"name": "haex_timestamp",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {},
"foreignKeys": {
"haex_desktop_items_workspace_id_haex_workspaces_id_fk": {
"name": "haex_desktop_items_workspace_id_haex_workspaces_id_fk",
"tableFrom": "haex_desktop_items",
"tableTo": "haex_workspaces",
"columnsFrom": [
"workspace_id"
],
"columnsTo": [
"id"
],
"onDelete": "cascade",
"onUpdate": "no action"
},
"haex_desktop_items_extension_id_haex_extensions_id_fk": {
"name": "haex_desktop_items_extension_id_haex_extensions_id_fk",
"tableFrom": "haex_desktop_items",
"tableTo": "haex_extensions",
"columnsFrom": [
"extension_id"
],
"columnsTo": [
"id"
],
"onDelete": "cascade",
"onUpdate": "no action"
}
},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {
"item_reference": {
"name": "item_reference",
"value": "(\"haex_desktop_items\".\"item_type\" = 'extension' AND \"haex_desktop_items\".\"extension_id\" IS NOT NULL AND \"haex_desktop_items\".\"system_window_id\" IS NULL) OR (\"haex_desktop_items\".\"item_type\" = 'system' AND \"haex_desktop_items\".\"system_window_id\" IS NOT NULL AND \"haex_desktop_items\".\"extension_id\" IS NULL) OR (\"haex_desktop_items\".\"item_type\" = 'file' AND \"haex_desktop_items\".\"system_window_id\" IS NOT NULL AND \"haex_desktop_items\".\"extension_id\" IS NULL) OR (\"haex_desktop_items\".\"item_type\" = 'folder' AND \"haex_desktop_items\".\"system_window_id\" IS NOT NULL AND \"haex_desktop_items\".\"extension_id\" IS NULL)"
}
}
},
"haex_extension_permissions": { "haex_extension_permissions": {
"name": "haex_extension_permissions", "name": "haex_extension_permissions",
"columns": { "columns": {
@ -169,7 +269,7 @@
"name": "extension_id", "name": "extension_id",
"type": "text", "type": "text",
"primaryKey": false, "primaryKey": false,
"notNull": false, "notNull": true,
"autoincrement": false "autoincrement": false
}, },
"resource_type": { "resource_type": {
@ -223,13 +323,6 @@
"notNull": false, "notNull": false,
"autoincrement": false "autoincrement": false
}, },
"haex_tombstone": {
"name": "haex_tombstone",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"haex_timestamp": { "haex_timestamp": {
"name": "haex_timestamp", "name": "haex_timestamp",
"type": "text", "type": "text",
@ -261,7 +354,7 @@
"columnsTo": [ "columnsTo": [
"id" "id"
], ],
"onDelete": "no action", "onDelete": "cascade",
"onUpdate": "no action" "onUpdate": "no action"
} }
}, },
@ -279,6 +372,27 @@
"notNull": true, "notNull": true,
"autoincrement": false "autoincrement": false
}, },
"public_key": {
"name": "public_key",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"name": {
"name": "name",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"version": {
"name": "version",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"author": { "author": {
"name": "author", "name": "author",
"type": "text", "type": "text",
@ -298,7 +412,8 @@
"type": "text", "type": "text",
"primaryKey": false, "primaryKey": false,
"notNull": false, "notNull": false,
"autoincrement": false "autoincrement": false,
"default": "'index.html'"
}, },
"homepage": { "homepage": {
"name": "homepage", "name": "homepage",
@ -312,7 +427,8 @@
"type": "integer", "type": "integer",
"primaryKey": false, "primaryKey": false,
"notNull": false, "notNull": false,
"autoincrement": false "autoincrement": false,
"default": true
}, },
"icon": { "icon": {
"name": "icon", "name": "icon",
@ -321,99 +437,20 @@
"notNull": false, "notNull": false,
"autoincrement": false "autoincrement": false
}, },
"name": {
"name": "name",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"public_key": {
"name": "public_key",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"signature": { "signature": {
"name": "signature", "name": "signature",
"type": "text", "type": "text",
"primaryKey": false, "primaryKey": false,
"notNull": false,
"autoincrement": false
},
"url": {
"name": "url",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"version": {
"name": "version",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"haex_tombstone": {
"name": "haex_tombstone",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"haex_timestamp": {
"name": "haex_timestamp",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {},
"foreignKeys": {},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_settings": {
"name": "haex_settings",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true, "notNull": true,
"autoincrement": false "autoincrement": false
}, },
"key": { "single_instance": {
"name": "key", "name": "single_instance",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"type": {
"name": "type",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"value": {
"name": "value",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"haex_tombstone": {
"name": "haex_tombstone",
"type": "integer", "type": "integer",
"primaryKey": false, "primaryKey": false,
"notNull": false, "notNull": false,
"autoincrement": false "autoincrement": false,
"default": false
}, },
"haex_timestamp": { "haex_timestamp": {
"name": "haex_timestamp", "name": "haex_timestamp",
@ -423,7 +460,16 @@
"autoincrement": false "autoincrement": false
} }
}, },
"indexes": {}, "indexes": {
"haex_extensions_public_key_name_unique": {
"name": "haex_extensions_public_key_name_unique",
"columns": [
"public_key",
"name"
],
"isUnique": true
}
},
"foreignKeys": {}, "foreignKeys": {},
"compositePrimaryKeys": {}, "compositePrimaryKeys": {},
"uniqueConstraints": {}, "uniqueConstraints": {},
@ -502,9 +548,9 @@
"notNull": true, "notNull": true,
"autoincrement": false "autoincrement": false
}, },
"haex_tombstone": { "haex_timestamp": {
"name": "haex_tombstone", "name": "haex_timestamp",
"type": "integer", "type": "text",
"primaryKey": false, "primaryKey": false,
"notNull": false, "notNull": false,
"autoincrement": false "autoincrement": false
@ -516,74 +562,8 @@
"uniqueConstraints": {}, "uniqueConstraints": {},
"checkConstraints": {} "checkConstraints": {}
}, },
"haex_passwords_group_items": { "haex_settings": {
"name": "haex_passwords_group_items", "name": "haex_settings",
"columns": {
"group_id": {
"name": "group_id",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"item_id": {
"name": "item_id",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"haex_tombstone": {
"name": "haex_tombstone",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {},
"foreignKeys": {
"haex_passwords_group_items_group_id_haex_passwords_groups_id_fk": {
"name": "haex_passwords_group_items_group_id_haex_passwords_groups_id_fk",
"tableFrom": "haex_passwords_group_items",
"tableTo": "haex_passwords_groups",
"columnsFrom": [
"group_id"
],
"columnsTo": [
"id"
],
"onDelete": "no action",
"onUpdate": "no action"
},
"haex_passwords_group_items_item_id_haex_passwords_item_details_id_fk": {
"name": "haex_passwords_group_items_item_id_haex_passwords_item_details_id_fk",
"tableFrom": "haex_passwords_group_items",
"tableTo": "haex_passwords_item_details",
"columnsFrom": [
"item_id"
],
"columnsTo": [
"id"
],
"onDelete": "no action",
"onUpdate": "no action"
}
},
"compositePrimaryKeys": {
"haex_passwords_group_items_item_id_group_id_pk": {
"columns": [
"item_id",
"group_id"
],
"name": "haex_passwords_group_items_item_id_group_id_pk"
}
},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_passwords_groups": {
"name": "haex_passwords_groups",
"columns": { "columns": {
"id": { "id": {
"name": "id", "name": "id",
@@ -592,270 +572,6 @@
"notNull": true, "notNull": true,
"autoincrement": false "autoincrement": false
}, },
"name": {
"name": "name",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"description": {
"name": "description",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"icon": {
"name": "icon",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"order": {
"name": "order",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"color": {
"name": "color",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"parent_id": {
"name": "parent_id",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"created_at": {
"name": "created_at",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false,
"default": "(CURRENT_TIMESTAMP)"
},
"updated_at": {
"name": "updated_at",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"haex_tombstone": {
"name": "haex_tombstone",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {},
"foreignKeys": {
"haex_passwords_groups_parent_id_haex_passwords_groups_id_fk": {
"name": "haex_passwords_groups_parent_id_haex_passwords_groups_id_fk",
"tableFrom": "haex_passwords_groups",
"tableTo": "haex_passwords_groups",
"columnsFrom": [
"parent_id"
],
"columnsTo": [
"id"
],
"onDelete": "no action",
"onUpdate": "no action"
}
},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_passwords_item_details": {
"name": "haex_passwords_item_details",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"title": {
"name": "title",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"username": {
"name": "username",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"password": {
"name": "password",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"note": {
"name": "note",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"icon": {
"name": "icon",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"tags": {
"name": "tags",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"url": {
"name": "url",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"created_at": {
"name": "created_at",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false,
"default": "(CURRENT_TIMESTAMP)"
},
"updated_at": {
"name": "updated_at",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"haex_tombstone": {
"name": "haex_tombstone",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {},
"foreignKeys": {},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_passwords_item_history": {
"name": "haex_passwords_item_history",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"item_id": {
"name": "item_id",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"changed_property": {
"name": "changed_property",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"old_value": {
"name": "old_value",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"new_value": {
"name": "new_value",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"created_at": {
"name": "created_at",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false,
"default": "(CURRENT_TIMESTAMP)"
},
"haex_tombstone": {
"name": "haex_tombstone",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {},
"foreignKeys": {
"haex_passwords_item_history_item_id_haex_passwords_item_details_id_fk": {
"name": "haex_passwords_item_history_item_id_haex_passwords_item_details_id_fk",
"tableFrom": "haex_passwords_item_history",
"tableTo": "haex_passwords_item_details",
"columnsFrom": [
"item_id"
],
"columnsTo": [
"id"
],
"onDelete": "no action",
"onUpdate": "no action"
}
},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_passwords_item_key_values": {
"name": "haex_passwords_item_key_values",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"item_id": {
"name": "item_id",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"key": { "key": {
"name": "key", "name": "key",
"type": "text", "type": "text",
@@ -863,6 +579,13 @@
"notNull": false, "notNull": false,
"autoincrement": false "autoincrement": false
}, },
"type": {
"name": "type",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"value": { "value": {
"name": "value", "name": "value",
"type": "text", "type": "text",
@@ -870,37 +593,87 @@
"notNull": false, "notNull": false,
"autoincrement": false "autoincrement": false
}, },
"updated_at": { "haex_timestamp": {
"name": "updated_at", "name": "haex_timestamp",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {
"haex_settings_key_type_value_unique": {
"name": "haex_settings_key_type_value_unique",
"columns": [
"key",
"type",
"value"
],
"isUnique": true
}
},
"foreignKeys": {},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_workspaces": {
"name": "haex_workspaces",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"device_id": {
"name": "device_id",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"name": {
"name": "name",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"position": {
"name": "position",
"type": "integer", "type": "integer",
"primaryKey": false, "primaryKey": false,
"notNull": true,
"autoincrement": false,
"default": 0
},
"background": {
"name": "background",
"type": "blob",
"primaryKey": false,
"notNull": false, "notNull": false,
"autoincrement": false "autoincrement": false
}, },
"haex_tombstone": { "haex_timestamp": {
"name": "haex_tombstone", "name": "haex_timestamp",
"type": "integer", "type": "text",
"primaryKey": false, "primaryKey": false,
"notNull": false, "notNull": false,
"autoincrement": false "autoincrement": false
} }
}, },
"indexes": {}, "indexes": {
"foreignKeys": { "haex_workspaces_position_unique": {
"haex_passwords_item_key_values_item_id_haex_passwords_item_details_id_fk": { "name": "haex_workspaces_position_unique",
"name": "haex_passwords_item_key_values_item_id_haex_passwords_item_details_id_fk", "columns": [
"tableFrom": "haex_passwords_item_key_values", "position"
"tableTo": "haex_passwords_item_details",
"columnsFrom": [
"item_id"
], ],
"columnsTo": [ "isUnique": true
"id"
],
"onDelete": "no action",
"onUpdate": "no action"
} }
}, },
"foreignKeys": {},
"compositePrimaryKeys": {}, "compositePrimaryKeys": {},
"uniqueConstraints": {}, "uniqueConstraints": {},
"checkConstraints": {} "checkConstraints": {}

View File

@@ -1,8 +1,8 @@
{ {
"version": "6", "version": "6",
"dialect": "sqlite", "dialect": "sqlite",
"id": "862ac1d5-3065-4244-8652-2b6782254862", "id": "10bec43a-4227-483e-b1c1-fd50ae32bb96",
"prevId": "3bbe52b8-5933-4b21-8b24-de3927a2f9b0", "prevId": "e3d61ad1-63be-41be-9243-41144e215f98",
"tables": { "tables": {
"haex_crdt_configs": { "haex_crdt_configs": {
"name": "haex_crdt_configs", "name": "haex_crdt_configs",
@@ -155,6 +155,106 @@
"uniqueConstraints": {}, "uniqueConstraints": {},
"checkConstraints": {} "checkConstraints": {}
}, },
"haex_desktop_items": {
"name": "haex_desktop_items",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"workspace_id": {
"name": "workspace_id",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"item_type": {
"name": "item_type",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"extension_id": {
"name": "extension_id",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"system_window_id": {
"name": "system_window_id",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"position_x": {
"name": "position_x",
"type": "integer",
"primaryKey": false,
"notNull": true,
"autoincrement": false,
"default": 0
},
"position_y": {
"name": "position_y",
"type": "integer",
"primaryKey": false,
"notNull": true,
"autoincrement": false,
"default": 0
},
"haex_timestamp": {
"name": "haex_timestamp",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {},
"foreignKeys": {
"haex_desktop_items_workspace_id_haex_workspaces_id_fk": {
"name": "haex_desktop_items_workspace_id_haex_workspaces_id_fk",
"tableFrom": "haex_desktop_items",
"tableTo": "haex_workspaces",
"columnsFrom": [
"workspace_id"
],
"columnsTo": [
"id"
],
"onDelete": "cascade",
"onUpdate": "no action"
},
"haex_desktop_items_extension_id_haex_extensions_id_fk": {
"name": "haex_desktop_items_extension_id_haex_extensions_id_fk",
"tableFrom": "haex_desktop_items",
"tableTo": "haex_extensions",
"columnsFrom": [
"extension_id"
],
"columnsTo": [
"id"
],
"onDelete": "cascade",
"onUpdate": "no action"
}
},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {
"item_reference": {
"name": "item_reference",
"value": "(\"haex_desktop_items\".\"item_type\" = 'extension' AND \"haex_desktop_items\".\"extension_id\" IS NOT NULL AND \"haex_desktop_items\".\"system_window_id\" IS NULL) OR (\"haex_desktop_items\".\"item_type\" = 'system' AND \"haex_desktop_items\".\"system_window_id\" IS NOT NULL AND \"haex_desktop_items\".\"extension_id\" IS NULL) OR (\"haex_desktop_items\".\"item_type\" = 'file' AND \"haex_desktop_items\".\"system_window_id\" IS NOT NULL AND \"haex_desktop_items\".\"extension_id\" IS NULL) OR (\"haex_desktop_items\".\"item_type\" = 'folder' AND \"haex_desktop_items\".\"system_window_id\" IS NOT NULL AND \"haex_desktop_items\".\"extension_id\" IS NULL)"
}
}
},
"haex_extension_permissions": { "haex_extension_permissions": {
"name": "haex_extension_permissions", "name": "haex_extension_permissions",
"columns": { "columns": {
@@ -169,7 +269,7 @@
"name": "extension_id", "name": "extension_id",
"type": "text", "type": "text",
"primaryKey": false, "primaryKey": false,
"notNull": false, "notNull": true,
"autoincrement": false "autoincrement": false
}, },
"resource_type": { "resource_type": {
@@ -223,13 +323,6 @@
"notNull": false, "notNull": false,
"autoincrement": false "autoincrement": false
}, },
"haex_tombstone": {
"name": "haex_tombstone",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"haex_timestamp": { "haex_timestamp": {
"name": "haex_timestamp", "name": "haex_timestamp",
"type": "text", "type": "text",
@@ -261,7 +354,7 @@
"columnsTo": [ "columnsTo": [
"id" "id"
], ],
"onDelete": "no action", "onDelete": "cascade",
"onUpdate": "no action" "onUpdate": "no action"
} }
}, },
@@ -279,6 +372,27 @@
"notNull": true, "notNull": true,
"autoincrement": false "autoincrement": false
}, },
"public_key": {
"name": "public_key",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"name": {
"name": "name",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"version": {
"name": "version",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"author": { "author": {
"name": "author", "name": "author",
"type": "text", "type": "text",
@@ -298,7 +412,8 @@
"type": "text", "type": "text",
"primaryKey": false, "primaryKey": false,
"notNull": false, "notNull": false,
"autoincrement": false "autoincrement": false,
"default": "'index.html'"
}, },
"homepage": { "homepage": {
"name": "homepage", "name": "homepage",
@@ -312,7 +427,8 @@
"type": "integer", "type": "integer",
"primaryKey": false, "primaryKey": false,
"notNull": false, "notNull": false,
"autoincrement": false "autoincrement": false,
"default": true
}, },
"icon": { "icon": {
"name": "icon", "name": "icon",
@@ -321,47 +437,20 @@
"notNull": false, "notNull": false,
"autoincrement": false "autoincrement": false
}, },
"name": {
"name": "name",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"public_key": {
"name": "public_key",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"signature": { "signature": {
"name": "signature", "name": "signature",
"type": "text", "type": "text",
"primaryKey": false, "primaryKey": false,
"notNull": false, "notNull": true,
"autoincrement": false "autoincrement": false
}, },
"url": { "single_instance": {
"name": "url", "name": "single_instance",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"version": {
"name": "version",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"haex_tombstone": {
"name": "haex_tombstone",
"type": "integer", "type": "integer",
"primaryKey": false, "primaryKey": false,
"notNull": false, "notNull": false,
"autoincrement": false "autoincrement": false,
"default": false
}, },
"haex_timestamp": { "haex_timestamp": {
"name": "haex_timestamp", "name": "haex_timestamp",
@@ -371,7 +460,16 @@
"autoincrement": false "autoincrement": false
} }
}, },
"indexes": {}, "indexes": {
"haex_extensions_public_key_name_unique": {
"name": "haex_extensions_public_key_name_unique",
"columns": [
"public_key",
"name"
],
"isUnique": true
}
},
"foreignKeys": {}, "foreignKeys": {},
"compositePrimaryKeys": {}, "compositePrimaryKeys": {},
"uniqueConstraints": {}, "uniqueConstraints": {},
@@ -450,13 +548,6 @@
"notNull": true, "notNull": true,
"autoincrement": false "autoincrement": false
}, },
"haex_tombstone": {
"name": "haex_tombstone",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"haex_timestamp": { "haex_timestamp": {
"name": "haex_timestamp", "name": "haex_timestamp",
"type": "text", "type": "text",
@@ -502,10 +593,66 @@
"notNull": false, "notNull": false,
"autoincrement": false "autoincrement": false
}, },
"haex_tombstone": { "haex_timestamp": {
"name": "haex_tombstone", "name": "haex_timestamp",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {
"haex_settings_key_type_value_unique": {
"name": "haex_settings_key_type_value_unique",
"columns": [
"key",
"type",
"value"
],
"isUnique": true
}
},
"foreignKeys": {},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_workspaces": {
"name": "haex_workspaces",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"device_id": {
"name": "device_id",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"name": {
"name": "name",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"position": {
"name": "position",
"type": "integer", "type": "integer",
"primaryKey": false, "primaryKey": false,
"notNull": true,
"autoincrement": false,
"default": 0
},
"background": {
"name": "background",
"type": "text",
"primaryKey": false,
"notNull": false, "notNull": false,
"autoincrement": false "autoincrement": false
}, },
@@ -517,400 +664,19 @@
"autoincrement": false "autoincrement": false
} }
}, },
"indexes": {}, "indexes": {
"foreignKeys": {}, "haex_workspaces_position_unique": {
"compositePrimaryKeys": {}, "name": "haex_workspaces_position_unique",
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_passwords_group_items": {
"name": "haex_passwords_group_items",
"columns": {
"group_id": {
"name": "group_id",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"item_id": {
"name": "item_id",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"haex_tombstone": {
"name": "haex_tombstone",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {},
"foreignKeys": {
"haex_passwords_group_items_group_id_haex_passwords_groups_id_fk": {
"name": "haex_passwords_group_items_group_id_haex_passwords_groups_id_fk",
"tableFrom": "haex_passwords_group_items",
"tableTo": "haex_passwords_groups",
"columnsFrom": [
"group_id"
],
"columnsTo": [
"id"
],
"onDelete": "no action",
"onUpdate": "no action"
},
"haex_passwords_group_items_item_id_haex_passwords_item_details_id_fk": {
"name": "haex_passwords_group_items_item_id_haex_passwords_item_details_id_fk",
"tableFrom": "haex_passwords_group_items",
"tableTo": "haex_passwords_item_details",
"columnsFrom": [
"item_id"
],
"columnsTo": [
"id"
],
"onDelete": "no action",
"onUpdate": "no action"
}
},
"compositePrimaryKeys": {
"haex_passwords_group_items_item_id_group_id_pk": {
"columns": [ "columns": [
"item_id", "position"
"group_id"
], ],
"name": "haex_passwords_group_items_item_id_group_id_pk" "isUnique": true
} }
}, },
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_passwords_groups": {
"name": "haex_passwords_groups",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"name": {
"name": "name",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"description": {
"name": "description",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"icon": {
"name": "icon",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"order": {
"name": "order",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"color": {
"name": "color",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"parent_id": {
"name": "parent_id",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"created_at": {
"name": "created_at",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false,
"default": "(CURRENT_TIMESTAMP)"
},
"updated_at": {
"name": "updated_at",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"haex_tombstone": {
"name": "haex_tombstone",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {},
"foreignKeys": {
"haex_passwords_groups_parent_id_haex_passwords_groups_id_fk": {
"name": "haex_passwords_groups_parent_id_haex_passwords_groups_id_fk",
"tableFrom": "haex_passwords_groups",
"tableTo": "haex_passwords_groups",
"columnsFrom": [
"parent_id"
],
"columnsTo": [
"id"
],
"onDelete": "no action",
"onUpdate": "no action"
}
},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_passwords_item_details": {
"name": "haex_passwords_item_details",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"title": {
"name": "title",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"username": {
"name": "username",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"password": {
"name": "password",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"note": {
"name": "note",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"icon": {
"name": "icon",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"tags": {
"name": "tags",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"url": {
"name": "url",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"created_at": {
"name": "created_at",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false,
"default": "(CURRENT_TIMESTAMP)"
},
"updated_at": {
"name": "updated_at",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"haex_tombstone": {
"name": "haex_tombstone",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {},
"foreignKeys": {}, "foreignKeys": {},
"compositePrimaryKeys": {}, "compositePrimaryKeys": {},
"uniqueConstraints": {}, "uniqueConstraints": {},
"checkConstraints": {} "checkConstraints": {}
},
"haex_passwords_item_history": {
"name": "haex_passwords_item_history",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"item_id": {
"name": "item_id",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"changed_property": {
"name": "changed_property",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"old_value": {
"name": "old_value",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"new_value": {
"name": "new_value",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"created_at": {
"name": "created_at",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false,
"default": "(CURRENT_TIMESTAMP)"
},
"haex_tombstone": {
"name": "haex_tombstone",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {},
"foreignKeys": {
"haex_passwords_item_history_item_id_haex_passwords_item_details_id_fk": {
"name": "haex_passwords_item_history_item_id_haex_passwords_item_details_id_fk",
"tableFrom": "haex_passwords_item_history",
"tableTo": "haex_passwords_item_details",
"columnsFrom": [
"item_id"
],
"columnsTo": [
"id"
],
"onDelete": "no action",
"onUpdate": "no action"
}
},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_passwords_item_key_values": {
"name": "haex_passwords_item_key_values",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"item_id": {
"name": "item_id",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"key": {
"name": "key",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"value": {
"name": "value",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"updated_at": {
"name": "updated_at",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"haex_tombstone": {
"name": "haex_tombstone",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {},
"foreignKeys": {
"haex_passwords_item_key_values_item_id_haex_passwords_item_details_id_fk": {
"name": "haex_passwords_item_key_values_item_id_haex_passwords_item_details_id_fk",
"tableFrom": "haex_passwords_item_key_values",
"tableTo": "haex_passwords_item_details",
"columnsFrom": [
"item_id"
],
"columnsTo": [
"id"
],
"onDelete": "no action",
"onUpdate": "no action"
}
},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
} }
}, },
"views": {}, "views": {},

View File

@@ -1,930 +0,0 @@
{
"version": "6",
"dialect": "sqlite",
"id": "5387568f-75b3-4a85-86c5-67f539c3fedf",
"prevId": "862ac1d5-3065-4244-8652-2b6782254862",
"tables": {
"haex_crdt_configs": {
"name": "haex_crdt_configs",
"columns": {
"key": {
"name": "key",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"value": {
"name": "value",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {},
"foreignKeys": {},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_crdt_logs": {
"name": "haex_crdt_logs",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"haex_timestamp": {
"name": "haex_timestamp",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"table_name": {
"name": "table_name",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"row_pks": {
"name": "row_pks",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"op_type": {
"name": "op_type",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"column_name": {
"name": "column_name",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"new_value": {
"name": "new_value",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"old_value": {
"name": "old_value",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {
"idx_haex_timestamp": {
"name": "idx_haex_timestamp",
"columns": [
"haex_timestamp"
],
"isUnique": false
},
"idx_table_row": {
"name": "idx_table_row",
"columns": [
"table_name",
"row_pks"
],
"isUnique": false
}
},
"foreignKeys": {},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_crdt_snapshots": {
"name": "haex_crdt_snapshots",
"columns": {
"snapshot_id": {
"name": "snapshot_id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"created": {
"name": "created",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"epoch_hlc": {
"name": "epoch_hlc",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"location_url": {
"name": "location_url",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"file_size_bytes": {
"name": "file_size_bytes",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {},
"foreignKeys": {},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_extension_permissions": {
"name": "haex_extension_permissions",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"extension_id": {
"name": "extension_id",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"resource_type": {
"name": "resource_type",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"action": {
"name": "action",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"target": {
"name": "target",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"constraints": {
"name": "constraints",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"status": {
"name": "status",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false,
"default": "'denied'"
},
"created_at": {
"name": "created_at",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false,
"default": "(CURRENT_TIMESTAMP)"
},
"updated_at": {
"name": "updated_at",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"haex_tombstone": {
"name": "haex_tombstone",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"haex_timestamp": {
"name": "haex_timestamp",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {
"haex_extension_permissions_extension_id_resource_type_action_target_unique": {
"name": "haex_extension_permissions_extension_id_resource_type_action_target_unique",
"columns": [
"extension_id",
"resource_type",
"action",
"target"
],
"isUnique": true
}
},
"foreignKeys": {
"haex_extension_permissions_extension_id_haex_extensions_id_fk": {
"name": "haex_extension_permissions_extension_id_haex_extensions_id_fk",
"tableFrom": "haex_extension_permissions",
"tableTo": "haex_extensions",
"columnsFrom": [
"extension_id"
],
"columnsTo": [
"id"
],
"onDelete": "no action",
"onUpdate": "no action"
}
},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_extensions": {
"name": "haex_extensions",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"public_key": {
"name": "public_key",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"name": {
"name": "name",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"version": {
"name": "version",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"author": {
"name": "author",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"description": {
"name": "description",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"entry": {
"name": "entry",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false,
"default": "'index.html'"
},
"homepage": {
"name": "homepage",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"enabled": {
"name": "enabled",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false,
"default": true
},
"icon": {
"name": "icon",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"signature": {
"name": "signature",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"haex_tombstone": {
"name": "haex_tombstone",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"haex_timestamp": {
"name": "haex_timestamp",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {
"haex_extensions_public_key_name_unique": {
"name": "haex_extensions_public_key_name_unique",
"columns": [
"public_key",
"name"
],
"isUnique": true
}
},
"foreignKeys": {},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_notifications": {
"name": "haex_notifications",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"alt": {
"name": "alt",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"date": {
"name": "date",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"icon": {
"name": "icon",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"image": {
"name": "image",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"read": {
"name": "read",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"source": {
"name": "source",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"text": {
"name": "text",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"title": {
"name": "title",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"type": {
"name": "type",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"haex_tombstone": {
"name": "haex_tombstone",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"haex_timestamp": {
"name": "haex_timestamp",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {},
"foreignKeys": {},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_settings": {
"name": "haex_settings",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"key": {
"name": "key",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"type": {
"name": "type",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"value": {
"name": "value",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"haex_tombstone": {
"name": "haex_tombstone",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"haex_timestamp": {
"name": "haex_timestamp",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {},
"foreignKeys": {},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_passwords_group_items": {
"name": "haex_passwords_group_items",
"columns": {
"group_id": {
"name": "group_id",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"item_id": {
"name": "item_id",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"haex_tombstone": {
"name": "haex_tombstone",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {},
"foreignKeys": {
"haex_passwords_group_items_group_id_haex_passwords_groups_id_fk": {
"name": "haex_passwords_group_items_group_id_haex_passwords_groups_id_fk",
"tableFrom": "haex_passwords_group_items",
"tableTo": "haex_passwords_groups",
"columnsFrom": [
"group_id"
],
"columnsTo": [
"id"
],
"onDelete": "no action",
"onUpdate": "no action"
},
"haex_passwords_group_items_item_id_haex_passwords_item_details_id_fk": {
"name": "haex_passwords_group_items_item_id_haex_passwords_item_details_id_fk",
"tableFrom": "haex_passwords_group_items",
"tableTo": "haex_passwords_item_details",
"columnsFrom": [
"item_id"
],
"columnsTo": [
"id"
],
"onDelete": "no action",
"onUpdate": "no action"
}
},
"compositePrimaryKeys": {
"haex_passwords_group_items_item_id_group_id_pk": {
"columns": [
"item_id",
"group_id"
],
"name": "haex_passwords_group_items_item_id_group_id_pk"
}
},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_passwords_groups": {
"name": "haex_passwords_groups",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"name": {
"name": "name",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"description": {
"name": "description",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"icon": {
"name": "icon",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"order": {
"name": "order",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"color": {
"name": "color",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"parent_id": {
"name": "parent_id",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"created_at": {
"name": "created_at",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false,
"default": "(CURRENT_TIMESTAMP)"
},
"updated_at": {
"name": "updated_at",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"haex_tombstone": {
"name": "haex_tombstone",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {},
"foreignKeys": {
"haex_passwords_groups_parent_id_haex_passwords_groups_id_fk": {
"name": "haex_passwords_groups_parent_id_haex_passwords_groups_id_fk",
"tableFrom": "haex_passwords_groups",
"tableTo": "haex_passwords_groups",
"columnsFrom": [
"parent_id"
],
"columnsTo": [
"id"
],
"onDelete": "no action",
"onUpdate": "no action"
}
},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_passwords_item_details": {
"name": "haex_passwords_item_details",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"title": {
"name": "title",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"username": {
"name": "username",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"password": {
"name": "password",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"note": {
"name": "note",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"icon": {
"name": "icon",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"tags": {
"name": "tags",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"url": {
"name": "url",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"created_at": {
"name": "created_at",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false,
"default": "(CURRENT_TIMESTAMP)"
},
"updated_at": {
"name": "updated_at",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"haex_tombstone": {
"name": "haex_tombstone",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {},
"foreignKeys": {},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_passwords_item_history": {
"name": "haex_passwords_item_history",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"item_id": {
"name": "item_id",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"changed_property": {
"name": "changed_property",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"old_value": {
"name": "old_value",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"new_value": {
"name": "new_value",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"created_at": {
"name": "created_at",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false,
"default": "(CURRENT_TIMESTAMP)"
},
"haex_tombstone": {
"name": "haex_tombstone",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {},
"foreignKeys": {
"haex_passwords_item_history_item_id_haex_passwords_item_details_id_fk": {
"name": "haex_passwords_item_history_item_id_haex_passwords_item_details_id_fk",
"tableFrom": "haex_passwords_item_history",
"tableTo": "haex_passwords_item_details",
"columnsFrom": [
"item_id"
],
"columnsTo": [
"id"
],
"onDelete": "no action",
"onUpdate": "no action"
}
},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_passwords_item_key_values": {
"name": "haex_passwords_item_key_values",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"item_id": {
"name": "item_id",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"key": {
"name": "key",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"value": {
"name": "value",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"updated_at": {
"name": "updated_at",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"haex_tombstone": {
"name": "haex_tombstone",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {},
"foreignKeys": {
"haex_passwords_item_key_values_item_id_haex_passwords_item_details_id_fk": {
"name": "haex_passwords_item_key_values_item_id_haex_passwords_item_details_id_fk",
"tableFrom": "haex_passwords_item_key_values",
"tableTo": "haex_passwords_item_details",
"columnsFrom": [
"item_id"
],
"columnsTo": [
"id"
],
"onDelete": "no action",
"onUpdate": "no action"
}
},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
}
},
"views": {},
"enums": {},
"_meta": {
"schemas": {},
"tables": {},
"columns": {}
},
"internal": {
"indexes": {}
}
}
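Aside on the snapshot removed above: its haex_crdt_logs table keeps two non-unique lookup indexes, idx_haex_timestamp and idx_table_row on table_name plus row_pks, presumably for timestamp-ordered replay and per-row lookups. A trimmed Drizzle sketch of those index declarations, with names taken from the snapshot and everything else assumed:

import { index, sqliteTable, text } from 'drizzle-orm/sqlite-core'

// Trimmed sketch of haex_crdt_logs; only the indexed columns are shown here.
export const haexCrdtLogs = sqliteTable(
  'haex_crdt_logs',
  {
    id: text().primaryKey(),
    haexTimestamp: text('haex_timestamp'),
    tableName: text('table_name'),
    rowPks: text('row_pks'),
  },
  (table) => [
    // Non-unique indexes exactly as listed in the snapshot's "indexes" section.
    index('idx_haex_timestamp').on(table.haexTimestamp),
    index('idx_table_row').on(table.tableName, table.rowPks),
  ],
)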

View File

@@ -1,991 +0,0 @@
{
"version": "6",
"dialect": "sqlite",
"id": "2f40a42e-9b3f-42be-8951-8e94baadcd65",
"prevId": "5387568f-75b3-4a85-86c5-67f539c3fedf",
"tables": {
"haex_crdt_configs": {
"name": "haex_crdt_configs",
"columns": {
"key": {
"name": "key",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"value": {
"name": "value",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {},
"foreignKeys": {},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_crdt_logs": {
"name": "haex_crdt_logs",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"haex_timestamp": {
"name": "haex_timestamp",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"table_name": {
"name": "table_name",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"row_pks": {
"name": "row_pks",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"op_type": {
"name": "op_type",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"column_name": {
"name": "column_name",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"new_value": {
"name": "new_value",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"old_value": {
"name": "old_value",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {
"idx_haex_timestamp": {
"name": "idx_haex_timestamp",
"columns": [
"haex_timestamp"
],
"isUnique": false
},
"idx_table_row": {
"name": "idx_table_row",
"columns": [
"table_name",
"row_pks"
],
"isUnique": false
}
},
"foreignKeys": {},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_crdt_snapshots": {
"name": "haex_crdt_snapshots",
"columns": {
"snapshot_id": {
"name": "snapshot_id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"created": {
"name": "created",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"epoch_hlc": {
"name": "epoch_hlc",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"location_url": {
"name": "location_url",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"file_size_bytes": {
"name": "file_size_bytes",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {},
"foreignKeys": {},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_desktop_items": {
"name": "haex_desktop_items",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"item_type": {
"name": "item_type",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"reference_id": {
"name": "reference_id",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"position_x": {
"name": "position_x",
"type": "integer",
"primaryKey": false,
"notNull": true,
"autoincrement": false,
"default": 0
},
"position_y": {
"name": "position_y",
"type": "integer",
"primaryKey": false,
"notNull": true,
"autoincrement": false,
"default": 0
},
"haex_tombstone": {
"name": "haex_tombstone",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"haex_timestamp": {
"name": "haex_timestamp",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {},
"foreignKeys": {},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_extension_permissions": {
"name": "haex_extension_permissions",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"extension_id": {
"name": "extension_id",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"resource_type": {
"name": "resource_type",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"action": {
"name": "action",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"target": {
"name": "target",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"constraints": {
"name": "constraints",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"status": {
"name": "status",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false,
"default": "'denied'"
},
"created_at": {
"name": "created_at",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false,
"default": "(CURRENT_TIMESTAMP)"
},
"updated_at": {
"name": "updated_at",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"haex_tombstone": {
"name": "haex_tombstone",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"haex_timestamp": {
"name": "haex_timestamp",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {
"haex_extension_permissions_extension_id_resource_type_action_target_unique": {
"name": "haex_extension_permissions_extension_id_resource_type_action_target_unique",
"columns": [
"extension_id",
"resource_type",
"action",
"target"
],
"isUnique": true
}
},
"foreignKeys": {
"haex_extension_permissions_extension_id_haex_extensions_id_fk": {
"name": "haex_extension_permissions_extension_id_haex_extensions_id_fk",
"tableFrom": "haex_extension_permissions",
"tableTo": "haex_extensions",
"columnsFrom": [
"extension_id"
],
"columnsTo": [
"id"
],
"onDelete": "no action",
"onUpdate": "no action"
}
},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_extensions": {
"name": "haex_extensions",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"public_key": {
"name": "public_key",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"name": {
"name": "name",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"version": {
"name": "version",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"author": {
"name": "author",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"description": {
"name": "description",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"entry": {
"name": "entry",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false,
"default": "'index.html'"
},
"homepage": {
"name": "homepage",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"enabled": {
"name": "enabled",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false,
"default": true
},
"icon": {
"name": "icon",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"signature": {
"name": "signature",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"haex_tombstone": {
"name": "haex_tombstone",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"haex_timestamp": {
"name": "haex_timestamp",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {
"haex_extensions_public_key_name_unique": {
"name": "haex_extensions_public_key_name_unique",
"columns": [
"public_key",
"name"
],
"isUnique": true
}
},
"foreignKeys": {},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_notifications": {
"name": "haex_notifications",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"alt": {
"name": "alt",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"date": {
"name": "date",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"icon": {
"name": "icon",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"image": {
"name": "image",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"read": {
"name": "read",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"source": {
"name": "source",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"text": {
"name": "text",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"title": {
"name": "title",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"type": {
"name": "type",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"haex_tombstone": {
"name": "haex_tombstone",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"haex_timestamp": {
"name": "haex_timestamp",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {},
"foreignKeys": {},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_settings": {
"name": "haex_settings",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"key": {
"name": "key",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"type": {
"name": "type",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"value": {
"name": "value",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"haex_tombstone": {
"name": "haex_tombstone",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"haex_timestamp": {
"name": "haex_timestamp",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {},
"foreignKeys": {},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_passwords_group_items": {
"name": "haex_passwords_group_items",
"columns": {
"group_id": {
"name": "group_id",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"item_id": {
"name": "item_id",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"haex_tombstone": {
"name": "haex_tombstone",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {},
"foreignKeys": {
"haex_passwords_group_items_group_id_haex_passwords_groups_id_fk": {
"name": "haex_passwords_group_items_group_id_haex_passwords_groups_id_fk",
"tableFrom": "haex_passwords_group_items",
"tableTo": "haex_passwords_groups",
"columnsFrom": [
"group_id"
],
"columnsTo": [
"id"
],
"onDelete": "no action",
"onUpdate": "no action"
},
"haex_passwords_group_items_item_id_haex_passwords_item_details_id_fk": {
"name": "haex_passwords_group_items_item_id_haex_passwords_item_details_id_fk",
"tableFrom": "haex_passwords_group_items",
"tableTo": "haex_passwords_item_details",
"columnsFrom": [
"item_id"
],
"columnsTo": [
"id"
],
"onDelete": "no action",
"onUpdate": "no action"
}
},
"compositePrimaryKeys": {
"haex_passwords_group_items_item_id_group_id_pk": {
"columns": [
"item_id",
"group_id"
],
"name": "haex_passwords_group_items_item_id_group_id_pk"
}
},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_passwords_groups": {
"name": "haex_passwords_groups",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"name": {
"name": "name",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"description": {
"name": "description",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"icon": {
"name": "icon",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"order": {
"name": "order",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"color": {
"name": "color",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"parent_id": {
"name": "parent_id",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"created_at": {
"name": "created_at",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false,
"default": "(CURRENT_TIMESTAMP)"
},
"updated_at": {
"name": "updated_at",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"haex_tombstone": {
"name": "haex_tombstone",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {},
"foreignKeys": {
"haex_passwords_groups_parent_id_haex_passwords_groups_id_fk": {
"name": "haex_passwords_groups_parent_id_haex_passwords_groups_id_fk",
"tableFrom": "haex_passwords_groups",
"tableTo": "haex_passwords_groups",
"columnsFrom": [
"parent_id"
],
"columnsTo": [
"id"
],
"onDelete": "no action",
"onUpdate": "no action"
}
},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_passwords_item_details": {
"name": "haex_passwords_item_details",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"title": {
"name": "title",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"username": {
"name": "username",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"password": {
"name": "password",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"note": {
"name": "note",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"icon": {
"name": "icon",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"tags": {
"name": "tags",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"url": {
"name": "url",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"created_at": {
"name": "created_at",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false,
"default": "(CURRENT_TIMESTAMP)"
},
"updated_at": {
"name": "updated_at",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"haex_tombstone": {
"name": "haex_tombstone",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {},
"foreignKeys": {},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_passwords_item_history": {
"name": "haex_passwords_item_history",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"item_id": {
"name": "item_id",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"changed_property": {
"name": "changed_property",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"old_value": {
"name": "old_value",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"new_value": {
"name": "new_value",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"created_at": {
"name": "created_at",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false,
"default": "(CURRENT_TIMESTAMP)"
},
"haex_tombstone": {
"name": "haex_tombstone",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {},
"foreignKeys": {
"haex_passwords_item_history_item_id_haex_passwords_item_details_id_fk": {
"name": "haex_passwords_item_history_item_id_haex_passwords_item_details_id_fk",
"tableFrom": "haex_passwords_item_history",
"tableTo": "haex_passwords_item_details",
"columnsFrom": [
"item_id"
],
"columnsTo": [
"id"
],
"onDelete": "no action",
"onUpdate": "no action"
}
},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_passwords_item_key_values": {
"name": "haex_passwords_item_key_values",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"item_id": {
"name": "item_id",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"key": {
"name": "key",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"value": {
"name": "value",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"updated_at": {
"name": "updated_at",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"haex_tombstone": {
"name": "haex_tombstone",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {},
"foreignKeys": {
"haex_passwords_item_key_values_item_id_haex_passwords_item_details_id_fk": {
"name": "haex_passwords_item_key_values_item_id_haex_passwords_item_details_id_fk",
"tableFrom": "haex_passwords_item_key_values",
"tableTo": "haex_passwords_item_details",
"columnsFrom": [
"item_id"
],
"columnsTo": [
"id"
],
"onDelete": "no action",
"onUpdate": "no action"
}
},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
}
},
"views": {},
"enums": {},
"_meta": {
"schemas": {},
"tables": {},
"columns": {}
},
"internal": {
"indexes": {}
}
}
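
The two tables above spell out the password-item audit trail: haex_passwords_item_history keeps one row per changed property (old and new value, keyed back to haex_passwords_item_details via item_id), while haex_passwords_item_key_values holds free-form extra fields per item. A minimal TypeScript sketch of that write pattern, assuming the drizzle-orm table objects named elsewhere in this changeset and a hypothetical `db` instance (error handling omitted):

```ts
import { eq } from 'drizzle-orm'
// Hypothetical import path; the tables are the drizzle objects named in this changeset.
import {
  haexPasswordsItemDetails,
  haexPasswordsItemHistory,
} from './schemas/passwords'

// Record one history row per changed property, then apply the change itself.
// `db` stands in for a drizzle database instance; error handling is omitted.
export async function updateItemTitle(db: any, itemId: string, newTitle: string) {
  const [item] = await db
    .select()
    .from(haexPasswordsItemDetails)
    .where(eq(haexPasswordsItemDetails.id, itemId))
  if (!item) return

  await db.insert(haexPasswordsItemHistory).values({
    id: crypto.randomUUID(),
    itemId,
    changedProperty: 'title',
    oldValue: item.title,
    newValue: newTitle,
  })

  await db
    .update(haexPasswordsItemDetails)
    .set({ title: newTitle })
    .where(eq(haexPasswordsItemDetails.id, itemId))
}
```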

View File

@ -5,29 +5,15 @@
  {
    "idx": 0,
    "version": "6",
-   "when": 1759402321133,
+   "when": 1762119713008,
-   "tag": "0000_glamorous_hulk",
+   "tag": "0000_cynical_nicolaos",
    "breakpoints": true
  },
  {
    "idx": 1,
    "version": "6",
-   "when": 1759418087677,
+   "when": 1762122405562,
-   "tag": "0001_green_stark_industries",
+   "tag": "0001_furry_brother_voodoo",
-   "breakpoints": true
- },
- {
-   "idx": 2,
-   "version": "6",
-   "when": 1760272083150,
-   "tag": "0002_amazing_iron_fist",
-   "breakpoints": true
- },
- {
-   "idx": 3,
-   "version": "6",
-   "when": 1760611690801,
-   "tag": "0003_daily_polaris",
    "breakpoints": true
  }
]
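
For reference, each entry in drizzle-kit's meta/_journal.json rewritten above has the following shape (reconstructed from the fields visible in the hunk); `when` is a millisecond Unix timestamp and `tag` is the base name of the matching generated SQL migration file:

```ts
// Shape of one entry in drizzle-kit's meta/_journal.json, as visible in the hunk above.
interface JournalEntry {
  idx: number        // position in the migration sequence
  version: string    // journal format version, "6" here
  when: number       // creation time in milliseconds since the Unix epoch
  tag: string        // base name of the generated SQL file, e.g. "0000_cynical_nicolaos"
  breakpoints: boolean
}

const current: JournalEntry[] = [
  { idx: 0, version: '6', when: 1762119713008, tag: '0000_cynical_nicolaos', breakpoints: true },
  { idx: 1, version: '6', when: 1762122405562, tag: '0001_furry_brother_voodoo', breakpoints: true },
]
```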

View File

@ -1,157 +0,0 @@
import { sql } from 'drizzle-orm'
import {
integer,
sqliteTable,
text,
unique,
type AnySQLiteColumn,
type SQLiteColumnBuilderBase,
} from 'drizzle-orm/sqlite-core'
import tableNames from '../tableNames.json'
// Helper function to add common CRDT columns (haexTombstone and haexTimestamp)
export const withCrdtColumns = <T extends Record<string, SQLiteColumnBuilderBase>>(
columns: T,
columnNames: { haexTombstone: string; haexTimestamp: string },
) => ({
...columns,
haexTombstone: integer(columnNames.haexTombstone, { mode: 'boolean' }),
haexTimestamp: text(columnNames.haexTimestamp),
})
export const haexSettings = sqliteTable(tableNames.haex.settings.name, {
id: text()
.primaryKey()
.$defaultFn(() => crypto.randomUUID()),
key: text(),
type: text(),
value: text(),
haexTombstone: integer(tableNames.haex.settings.columns.haexTombstone, {
mode: 'boolean',
}),
haexTimestamp: text(tableNames.haex.settings.columns.haexTimestamp),
})
export type InsertHaexSettings = typeof haexSettings.$inferInsert
export type SelectHaexSettings = typeof haexSettings.$inferSelect
export const haexExtensions = sqliteTable(
tableNames.haex.extensions.name,
{
id: text()
.primaryKey()
.$defaultFn(() => crypto.randomUUID()),
public_key: text().notNull(),
name: text().notNull(),
version: text().notNull(),
author: text(),
description: text(),
entry: text().notNull().default('index.html'),
homepage: text(),
enabled: integer({ mode: 'boolean' }).default(true),
icon: text(),
signature: text().notNull(),
haexTombstone: integer(tableNames.haex.extensions.columns.haexTombstone, {
mode: 'boolean',
}),
haexTimestamp: text(tableNames.haex.extensions.columns.haexTimestamp),
},
(table) => [
// UNIQUE constraint: Pro Developer (public_key) kann nur eine Extension mit diesem Namen existieren
unique().on(table.public_key, table.name),
],
)
export type InsertHaexExtensions = typeof haexExtensions.$inferInsert
export type SelectHaexExtensions = typeof haexExtensions.$inferSelect
export const haexExtensionPermissions = sqliteTable(
tableNames.haex.extension_permissions.name,
{
id: text()
.primaryKey()
.$defaultFn(() => crypto.randomUUID()),
extensionId: text(
tableNames.haex.extension_permissions.columns.extensionId,
).references((): AnySQLiteColumn => haexExtensions.id),
resourceType: text('resource_type', {
enum: ['fs', 'http', 'db', 'shell'],
}),
action: text({ enum: ['read', 'write'] }),
target: text(),
constraints: text({ mode: 'json' }),
status: text({ enum: ['ask', 'granted', 'denied'] })
.notNull()
.default('denied'),
createdAt: text('created_at').default(sql`(CURRENT_TIMESTAMP)`),
updateAt: integer('updated_at', { mode: 'timestamp' }).$onUpdate(
() => new Date(),
),
haexTombstone: integer(
tableNames.haex.extension_permissions.columns.haexTombstone,
{ mode: 'boolean' },
),
haexTimestamp: text(
tableNames.haex.extension_permissions.columns.haexTimestamp,
),
},
(table) => [
unique().on(
table.extensionId,
table.resourceType,
table.action,
table.target,
),
],
)
export type InserthaexExtensionPermissions =
typeof haexExtensionPermissions.$inferInsert
export type SelecthaexExtensionPermissions =
typeof haexExtensionPermissions.$inferSelect
export const haexNotifications = sqliteTable(
tableNames.haex.notifications.name,
{
id: text().primaryKey(),
alt: text(),
date: text(),
icon: text(),
image: text(),
read: integer({ mode: 'boolean' }),
source: text(),
text: text(),
title: text(),
type: text({
enum: ['error', 'success', 'warning', 'info', 'log'],
}).notNull(),
haexTombstone: integer(
tableNames.haex.notifications.columns.haexTombstone,
{ mode: 'boolean' },
),
haexTimestamp: text(tableNames.haex.notifications.columns.haexTimestamp),
},
)
export type InsertHaexNotifications = typeof haexNotifications.$inferInsert
export type SelectHaexNotifications = typeof haexNotifications.$inferSelect
export const haexDesktopItems = sqliteTable(
tableNames.haex.desktop_items.name,
withCrdtColumns(
{
id: text(tableNames.haex.desktop_items.columns.id)
.primaryKey()
.$defaultFn(() => crypto.randomUUID()),
itemType: text(tableNames.haex.desktop_items.columns.itemType, {
enum: ['extension', 'file', 'folder'],
}).notNull(),
referenceId: text(tableNames.haex.desktop_items.columns.referenceId).notNull(), // extensionId für extensions, filePath für files/folders
positionX: integer(tableNames.haex.desktop_items.columns.positionX)
.notNull()
.default(0),
positionY: integer(tableNames.haex.desktop_items.columns.positionY)
.notNull()
.default(0),
},
tableNames.haex.desktop_items.columns,
),
)
export type InsertHaexDesktopItems = typeof haexDesktopItems.$inferInsert
export type SelectHaexDesktopItems = typeof haexDesktopItems.$inferSelect
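
The withCrdtColumns helper removed above spreads the caller's columns and appends the two CRDT bookkeeping columns, exactly as the haexDesktopItems table demonstrates. A small sketch with a hypothetical table (the import path and the table itself are illustrative, not part of the repository):

```ts
import { integer, sqliteTable, text } from 'drizzle-orm/sqlite-core'
import { withCrdtColumns } from './schemas/haex' // hypothetical import path

// Hypothetical table: the spread keeps the caller's columns and appends the two
// CRDT bookkeeping columns under the physical names passed in the second argument.
export const haexBookmarks = sqliteTable(
  'haex_bookmarks',
  withCrdtColumns(
    {
      id: text('id')
        .primaryKey()
        .$defaultFn(() => crypto.randomUUID()),
      url: text('url').notNull(),
      position: integer('position').notNull().default(0),
    },
    { haexTombstone: 'haex_tombstone', haexTimestamp: 'haex_timestamp' },
  ),
)
```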

View File

@ -1,112 +0,0 @@
import { sql } from 'drizzle-orm'
import {
integer,
primaryKey,
sqliteTable,
text,
type AnySQLiteColumn,
} from 'drizzle-orm/sqlite-core'
import tableNames from '../tableNames.json'
export const haexPasswordsItemDetails = sqliteTable(
tableNames.haex.passwords.item_details,
{
id: text().primaryKey(),
title: text(),
username: text(),
password: text(),
note: text(),
icon: text(),
tags: text(),
url: text(),
createdAt: text('created_at').default(sql`(CURRENT_TIMESTAMP)`),
updateAt: integer('updated_at', { mode: 'timestamp' }).$onUpdate(
() => new Date(),
),
haex_tombstone: integer({ mode: 'boolean' }),
},
)
export type InsertHaexPasswordsItemDetails =
typeof haexPasswordsItemDetails.$inferInsert
export type SelectHaexPasswordsItemDetails =
typeof haexPasswordsItemDetails.$inferSelect
export const haexPasswordsItemKeyValues = sqliteTable(
tableNames.haex.passwords.item_key_values,
{
id: text().primaryKey(),
itemId: text('item_id').references(
(): AnySQLiteColumn => haexPasswordsItemDetails.id,
),
key: text(),
value: text(),
updateAt: integer('updated_at', { mode: 'timestamp' }).$onUpdate(
() => new Date(),
),
haex_tombstone: integer({ mode: 'boolean' }),
},
)
export type InserthaexPasswordsItemKeyValues =
typeof haexPasswordsItemKeyValues.$inferInsert
export type SelectHaexPasswordsItemKeyValues =
typeof haexPasswordsItemKeyValues.$inferSelect
export const haexPasswordsItemHistory = sqliteTable(
tableNames.haex.passwords.item_histories,
{
id: text().primaryKey(),
itemId: text('item_id').references(
(): AnySQLiteColumn => haexPasswordsItemDetails.id,
),
changedProperty:
text('changed_property').$type<keyof typeof haexPasswordsItemDetails>(),
oldValue: text('old_value'),
newValue: text('new_value'),
createdAt: text('created_at').default(sql`(CURRENT_TIMESTAMP)`),
haex_tombstone: integer({ mode: 'boolean' }),
},
)
export type InserthaexPasswordsItemHistory =
typeof haexPasswordsItemHistory.$inferInsert
export type SelectHaexPasswordsItemHistory =
typeof haexPasswordsItemHistory.$inferSelect
export const haexPasswordsGroups = sqliteTable(
tableNames.haex.passwords.groups,
{
id: text().primaryKey(),
name: text(),
description: text(),
icon: text(),
order: integer(),
color: text(),
parentId: text('parent_id').references(
(): AnySQLiteColumn => haexPasswordsGroups.id,
),
createdAt: text('created_at').default(sql`(CURRENT_TIMESTAMP)`),
updateAt: integer('updated_at', { mode: 'timestamp' }).$onUpdate(
() => new Date(),
),
haex_tombstone: integer({ mode: 'boolean' }),
},
)
export type InsertHaexPasswordsGroups = typeof haexPasswordsGroups.$inferInsert
export type SelectHaexPasswordsGroups = typeof haexPasswordsGroups.$inferSelect
export const haexPasswordsGroupItems = sqliteTable(
tableNames.haex.passwords.group_items,
{
groupId: text('group_id').references(
(): AnySQLiteColumn => haexPasswordsGroups.id,
),
itemId: text('item_id').references(
(): AnySQLiteColumn => haexPasswordsItemDetails.id,
),
haex_tombstone: integer({ mode: 'boolean' }),
},
(table) => [primaryKey({ columns: [table.itemId, table.groupId] })],
)
export type InsertHaexPasswordsGroupItems =
typeof haexPasswordsGroupItems.$inferInsert
export type SelectHaexPasswordsGroupItems =
typeof haexPasswordsGroupItems.$inferSelect
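
haexPasswordsGroupItems above models group membership with a composite primary key over (itemId, groupId) and no surrogate id. A short sketch of how such a table is typically used with drizzle-orm, again assuming a hypothetical `db` instance:

```ts
import { eq } from 'drizzle-orm'
import {
  haexPasswordsGroupItems,
  haexPasswordsItemDetails,
} from './schemas/passwords' // hypothetical import path

// The (itemId, groupId) primary key makes the insert idempotent when combined
// with onConflictDoNothing(); `db` stands in for a drizzle database instance.
export async function addItemToGroup(db: any, itemId: string, groupId: string) {
  await db
    .insert(haexPasswordsGroupItems)
    .values({ itemId, groupId })
    .onConflictDoNothing()
}

export async function listGroupItems(db: any, groupId: string) {
  return db
    .select({ item: haexPasswordsItemDetails })
    .from(haexPasswordsGroupItems)
    .innerJoin(
      haexPasswordsItemDetails,
      eq(haexPasswordsGroupItems.itemId, haexPasswordsItemDetails.id),
    )
    .where(eq(haexPasswordsGroupItems.groupId, groupId))
}
```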

Binary file not shown.

View File

@ -24,6 +24,23 @@ android {
        versionCode = tauriProperties.getProperty("tauri.android.versionCode", "1").toInt()
        versionName = tauriProperties.getProperty("tauri.android.versionName", "1.0")
    }
+   signingConfigs {
+       create("release") {
+           val keystorePath = System.getenv("ANDROID_KEYSTORE_PATH")
+           val keystorePassword = System.getenv("ANDROID_KEYSTORE_PASSWORD")
+           val keyAlias = System.getenv("ANDROID_KEY_ALIAS")
+           val keyPassword = System.getenv("ANDROID_KEY_PASSWORD")
+           if (keystorePath != null && keystorePassword != null && keyAlias != null && keyPassword != null) {
+               storeFile = file(keystorePath)
+               storePassword = keystorePassword
+               this.keyAlias = keyAlias
+               this.keyPassword = keyPassword
+           }
+       }
+   }
    buildTypes {
        getByName("debug") {
            manifestPlaceholders["usesCleartextTraffic"] = "true"
@ -43,6 +60,12 @@ android {
                .plus(getDefaultProguardFile("proguard-android-optimize.txt"))
                .toList().toTypedArray()
            )
+           // Sign with release config if available
+           val releaseSigningConfig = signingConfigs.getByName("release")
+           if (releaseSigningConfig.storeFile != null) {
+               signingConfig = releaseSigningConfig
+           }
        }
    }
    kotlinOptions {

File diff suppressed because one or more lines are too long

View File

@ -1400,10 +1400,10 @@
"markdownDescription": "This enables all index or metadata related commands without any pre-configured accessible paths." "markdownDescription": "This enables all index or metadata related commands without any pre-configured accessible paths."
}, },
{ {
"description": "An empty permission you can use to modify the global scope.", "description": "An empty permission you can use to modify the global scope.\n\n## Example\n\n```json\n{\n \"identifier\": \"read-documents\",\n \"windows\": [\"main\"],\n \"permissions\": [\n \"fs:allow-read\",\n {\n \"identifier\": \"fs:scope\",\n \"allow\": [\n \"$APPDATA/documents/**/*\"\n ],\n \"deny\": [\n \"$APPDATA/documents/secret.txt\"\n ]\n }\n ]\n}\n```\n",
"type": "string", "type": "string",
"const": "fs:scope", "const": "fs:scope",
"markdownDescription": "An empty permission you can use to modify the global scope." "markdownDescription": "An empty permission you can use to modify the global scope.\n\n## Example\n\n```json\n{\n \"identifier\": \"read-documents\",\n \"windows\": [\"main\"],\n \"permissions\": [\n \"fs:allow-read\",\n {\n \"identifier\": \"fs:scope\",\n \"allow\": [\n \"$APPDATA/documents/**/*\"\n ],\n \"deny\": [\n \"$APPDATA/documents/secret.txt\"\n ]\n }\n ]\n}\n```\n"
}, },
{ {
"description": "This scope permits access to all files and list content of top level directories in the application folders.", "description": "This scope permits access to all files and list content of top level directories in the application folders.",
@ -2277,10 +2277,10 @@
"markdownDescription": "Default core plugins set.\n#### This default permission set includes:\n\n- `core:path:default`\n- `core:event:default`\n- `core:window:default`\n- `core:webview:default`\n- `core:app:default`\n- `core:image:default`\n- `core:resources:default`\n- `core:menu:default`\n- `core:tray:default`" "markdownDescription": "Default core plugins set.\n#### This default permission set includes:\n\n- `core:path:default`\n- `core:event:default`\n- `core:window:default`\n- `core:webview:default`\n- `core:app:default`\n- `core:image:default`\n- `core:resources:default`\n- `core:menu:default`\n- `core:tray:default`"
}, },
{ {
"description": "Default permissions for the plugin.\n#### This default permission set includes:\n\n- `allow-version`\n- `allow-name`\n- `allow-tauri-version`\n- `allow-identifier`\n- `allow-bundle-type`", "description": "Default permissions for the plugin.\n#### This default permission set includes:\n\n- `allow-version`\n- `allow-name`\n- `allow-tauri-version`\n- `allow-identifier`\n- `allow-bundle-type`\n- `allow-register-listener`\n- `allow-remove-listener`",
"type": "string", "type": "string",
"const": "core:app:default", "const": "core:app:default",
"markdownDescription": "Default permissions for the plugin.\n#### This default permission set includes:\n\n- `allow-version`\n- `allow-name`\n- `allow-tauri-version`\n- `allow-identifier`\n- `allow-bundle-type`" "markdownDescription": "Default permissions for the plugin.\n#### This default permission set includes:\n\n- `allow-version`\n- `allow-name`\n- `allow-tauri-version`\n- `allow-identifier`\n- `allow-bundle-type`\n- `allow-register-listener`\n- `allow-remove-listener`"
}, },
{ {
"description": "Enables the app_hide command without any pre-configured scope.", "description": "Enables the app_hide command without any pre-configured scope.",
@ -2324,12 +2324,24 @@
"const": "core:app:allow-name", "const": "core:app:allow-name",
"markdownDescription": "Enables the name command without any pre-configured scope." "markdownDescription": "Enables the name command without any pre-configured scope."
}, },
{
"description": "Enables the register_listener command without any pre-configured scope.",
"type": "string",
"const": "core:app:allow-register-listener",
"markdownDescription": "Enables the register_listener command without any pre-configured scope."
},
{ {
"description": "Enables the remove_data_store command without any pre-configured scope.", "description": "Enables the remove_data_store command without any pre-configured scope.",
"type": "string", "type": "string",
"const": "core:app:allow-remove-data-store", "const": "core:app:allow-remove-data-store",
"markdownDescription": "Enables the remove_data_store command without any pre-configured scope." "markdownDescription": "Enables the remove_data_store command without any pre-configured scope."
}, },
{
"description": "Enables the remove_listener command without any pre-configured scope.",
"type": "string",
"const": "core:app:allow-remove-listener",
"markdownDescription": "Enables the remove_listener command without any pre-configured scope."
},
{ {
"description": "Enables the set_app_theme command without any pre-configured scope.", "description": "Enables the set_app_theme command without any pre-configured scope.",
"type": "string", "type": "string",
@ -2396,12 +2408,24 @@
"const": "core:app:deny-name", "const": "core:app:deny-name",
"markdownDescription": "Denies the name command without any pre-configured scope." "markdownDescription": "Denies the name command without any pre-configured scope."
}, },
{
"description": "Denies the register_listener command without any pre-configured scope.",
"type": "string",
"const": "core:app:deny-register-listener",
"markdownDescription": "Denies the register_listener command without any pre-configured scope."
},
{ {
"description": "Denies the remove_data_store command without any pre-configured scope.", "description": "Denies the remove_data_store command without any pre-configured scope.",
"type": "string", "type": "string",
"const": "core:app:deny-remove-data-store", "const": "core:app:deny-remove-data-store",
"markdownDescription": "Denies the remove_data_store command without any pre-configured scope." "markdownDescription": "Denies the remove_data_store command without any pre-configured scope."
}, },
{
"description": "Denies the remove_listener command without any pre-configured scope.",
"type": "string",
"const": "core:app:deny-remove-listener",
"markdownDescription": "Denies the remove_listener command without any pre-configured scope."
},
{ {
"description": "Denies the set_app_theme command without any pre-configured scope.", "description": "Denies the set_app_theme command without any pre-configured scope.",
"type": "string", "type": "string",
@ -5541,10 +5565,10 @@
"markdownDescription": "This enables all index or metadata related commands without any pre-configured accessible paths." "markdownDescription": "This enables all index or metadata related commands without any pre-configured accessible paths."
}, },
{ {
"description": "An empty permission you can use to modify the global scope.", "description": "An empty permission you can use to modify the global scope.\n\n## Example\n\n```json\n{\n \"identifier\": \"read-documents\",\n \"windows\": [\"main\"],\n \"permissions\": [\n \"fs:allow-read\",\n {\n \"identifier\": \"fs:scope\",\n \"allow\": [\n \"$APPDATA/documents/**/*\"\n ],\n \"deny\": [\n \"$APPDATA/documents/secret.txt\"\n ]\n }\n ]\n}\n```\n",
"type": "string", "type": "string",
"const": "fs:scope", "const": "fs:scope",
"markdownDescription": "An empty permission you can use to modify the global scope." "markdownDescription": "An empty permission you can use to modify the global scope.\n\n## Example\n\n```json\n{\n \"identifier\": \"read-documents\",\n \"windows\": [\"main\"],\n \"permissions\": [\n \"fs:allow-read\",\n {\n \"identifier\": \"fs:scope\",\n \"allow\": [\n \"$APPDATA/documents/**/*\"\n ],\n \"deny\": [\n \"$APPDATA/documents/secret.txt\"\n ]\n }\n ]\n}\n```\n"
}, },
{ {
"description": "This scope permits access to all files and list content of top level directories in the application folders.", "description": "This scope permits access to all files and list content of top level directories in the application folders.",

View File

@ -1 +1 @@
{"default":{"identifier":"default","description":"Capability for the main window","local":true,"windows":["main"],"permissions":["core:default","core:webview:allow-create-webview-window","core:webview:allow-create-webview","core:webview:allow-webview-show","core:webview:default","core:window:allow-create","core:window:allow-get-all-windows","core:window:allow-show","core:window:default","dialog:default","fs:allow-appconfig-read-recursive","fs:allow-appconfig-write-recursive","fs:allow-appdata-read-recursive","fs:allow-appdata-write-recursive","fs:allow-read-file","fs:allow-read-dir","fs:allow-resource-read-recursive","fs:allow-resource-write-recursive","fs:allow-download-read-recursive","fs:allow-download-write-recursive","fs:default",{"identifier":"fs:scope","allow":[{"path":"**"}]},"http:allow-fetch-send","http:allow-fetch","http:default","notification:allow-create-channel","notification:allow-list-channels","notification:allow-notify","notification:default","opener:allow-open-url","opener:default","os:allow-hostname","os:default","store:default"]}} {"default":{"identifier":"default","description":"Capability for the main window","local":true,"windows":["main"],"permissions":["core:default","core:webview:allow-create-webview-window","core:webview:allow-create-webview","core:webview:allow-webview-show","core:webview:default","core:window:allow-create","core:window:allow-get-all-windows","core:window:allow-show","core:window:default","dialog:default","fs:allow-appconfig-read-recursive","fs:allow-appconfig-write-recursive","fs:allow-appdata-read-recursive","fs:allow-appdata-write-recursive","fs:allow-applocaldata-read-recursive","fs:allow-applocaldata-write-recursive","fs:allow-read-file","fs:allow-write-file","fs:allow-read-dir","fs:allow-mkdir","fs:allow-exists","fs:allow-remove","fs:allow-resource-read-recursive","fs:allow-resource-write-recursive","fs:allow-download-read-recursive","fs:allow-download-write-recursive","fs:default",{"identifier":"fs:scope","allow":[{"path":"**"}]},"http:allow-fetch-send","http:allow-fetch","http:default","notification:allow-create-channel","notification:allow-list-channels","notification:allow-notify","notification:allow-is-permission-granted","notification:default","opener:allow-open-url","opener:default","os:allow-hostname","os:default","store:default"]}}

View File

@ -1400,10 +1400,10 @@
"markdownDescription": "This enables all index or metadata related commands without any pre-configured accessible paths." "markdownDescription": "This enables all index or metadata related commands without any pre-configured accessible paths."
}, },
{ {
"description": "An empty permission you can use to modify the global scope.", "description": "An empty permission you can use to modify the global scope.\n\n## Example\n\n```json\n{\n \"identifier\": \"read-documents\",\n \"windows\": [\"main\"],\n \"permissions\": [\n \"fs:allow-read\",\n {\n \"identifier\": \"fs:scope\",\n \"allow\": [\n \"$APPDATA/documents/**/*\"\n ],\n \"deny\": [\n \"$APPDATA/documents/secret.txt\"\n ]\n }\n ]\n}\n```\n",
"type": "string", "type": "string",
"const": "fs:scope", "const": "fs:scope",
"markdownDescription": "An empty permission you can use to modify the global scope." "markdownDescription": "An empty permission you can use to modify the global scope.\n\n## Example\n\n```json\n{\n \"identifier\": \"read-documents\",\n \"windows\": [\"main\"],\n \"permissions\": [\n \"fs:allow-read\",\n {\n \"identifier\": \"fs:scope\",\n \"allow\": [\n \"$APPDATA/documents/**/*\"\n ],\n \"deny\": [\n \"$APPDATA/documents/secret.txt\"\n ]\n }\n ]\n}\n```\n"
}, },
{ {
"description": "This scope permits access to all files and list content of top level directories in the application folders.", "description": "This scope permits access to all files and list content of top level directories in the application folders.",
@ -2277,10 +2277,10 @@
"markdownDescription": "Default core plugins set.\n#### This default permission set includes:\n\n- `core:path:default`\n- `core:event:default`\n- `core:window:default`\n- `core:webview:default`\n- `core:app:default`\n- `core:image:default`\n- `core:resources:default`\n- `core:menu:default`\n- `core:tray:default`" "markdownDescription": "Default core plugins set.\n#### This default permission set includes:\n\n- `core:path:default`\n- `core:event:default`\n- `core:window:default`\n- `core:webview:default`\n- `core:app:default`\n- `core:image:default`\n- `core:resources:default`\n- `core:menu:default`\n- `core:tray:default`"
}, },
{ {
"description": "Default permissions for the plugin.\n#### This default permission set includes:\n\n- `allow-version`\n- `allow-name`\n- `allow-tauri-version`\n- `allow-identifier`\n- `allow-bundle-type`", "description": "Default permissions for the plugin.\n#### This default permission set includes:\n\n- `allow-version`\n- `allow-name`\n- `allow-tauri-version`\n- `allow-identifier`\n- `allow-bundle-type`\n- `allow-register-listener`\n- `allow-remove-listener`",
"type": "string", "type": "string",
"const": "core:app:default", "const": "core:app:default",
"markdownDescription": "Default permissions for the plugin.\n#### This default permission set includes:\n\n- `allow-version`\n- `allow-name`\n- `allow-tauri-version`\n- `allow-identifier`\n- `allow-bundle-type`" "markdownDescription": "Default permissions for the plugin.\n#### This default permission set includes:\n\n- `allow-version`\n- `allow-name`\n- `allow-tauri-version`\n- `allow-identifier`\n- `allow-bundle-type`\n- `allow-register-listener`\n- `allow-remove-listener`"
}, },
{ {
"description": "Enables the app_hide command without any pre-configured scope.", "description": "Enables the app_hide command without any pre-configured scope.",
@ -2324,12 +2324,24 @@
"const": "core:app:allow-name", "const": "core:app:allow-name",
"markdownDescription": "Enables the name command without any pre-configured scope." "markdownDescription": "Enables the name command without any pre-configured scope."
}, },
{
"description": "Enables the register_listener command without any pre-configured scope.",
"type": "string",
"const": "core:app:allow-register-listener",
"markdownDescription": "Enables the register_listener command without any pre-configured scope."
},
{ {
"description": "Enables the remove_data_store command without any pre-configured scope.", "description": "Enables the remove_data_store command without any pre-configured scope.",
"type": "string", "type": "string",
"const": "core:app:allow-remove-data-store", "const": "core:app:allow-remove-data-store",
"markdownDescription": "Enables the remove_data_store command without any pre-configured scope." "markdownDescription": "Enables the remove_data_store command without any pre-configured scope."
}, },
{
"description": "Enables the remove_listener command without any pre-configured scope.",
"type": "string",
"const": "core:app:allow-remove-listener",
"markdownDescription": "Enables the remove_listener command without any pre-configured scope."
},
{ {
"description": "Enables the set_app_theme command without any pre-configured scope.", "description": "Enables the set_app_theme command without any pre-configured scope.",
"type": "string", "type": "string",
@ -2396,12 +2408,24 @@
"const": "core:app:deny-name", "const": "core:app:deny-name",
"markdownDescription": "Denies the name command without any pre-configured scope." "markdownDescription": "Denies the name command without any pre-configured scope."
}, },
{
"description": "Denies the register_listener command without any pre-configured scope.",
"type": "string",
"const": "core:app:deny-register-listener",
"markdownDescription": "Denies the register_listener command without any pre-configured scope."
},
{ {
"description": "Denies the remove_data_store command without any pre-configured scope.", "description": "Denies the remove_data_store command without any pre-configured scope.",
"type": "string", "type": "string",
"const": "core:app:deny-remove-data-store", "const": "core:app:deny-remove-data-store",
"markdownDescription": "Denies the remove_data_store command without any pre-configured scope." "markdownDescription": "Denies the remove_data_store command without any pre-configured scope."
}, },
{
"description": "Denies the remove_listener command without any pre-configured scope.",
"type": "string",
"const": "core:app:deny-remove-listener",
"markdownDescription": "Denies the remove_listener command without any pre-configured scope."
},
{ {
"description": "Denies the set_app_theme command without any pre-configured scope.", "description": "Denies the set_app_theme command without any pre-configured scope.",
"type": "string", "type": "string",
@ -5541,10 +5565,10 @@
"markdownDescription": "This enables all index or metadata related commands without any pre-configured accessible paths." "markdownDescription": "This enables all index or metadata related commands without any pre-configured accessible paths."
}, },
{ {
"description": "An empty permission you can use to modify the global scope.", "description": "An empty permission you can use to modify the global scope.\n\n## Example\n\n```json\n{\n \"identifier\": \"read-documents\",\n \"windows\": [\"main\"],\n \"permissions\": [\n \"fs:allow-read\",\n {\n \"identifier\": \"fs:scope\",\n \"allow\": [\n \"$APPDATA/documents/**/*\"\n ],\n \"deny\": [\n \"$APPDATA/documents/secret.txt\"\n ]\n }\n ]\n}\n```\n",
"type": "string", "type": "string",
"const": "fs:scope", "const": "fs:scope",
"markdownDescription": "An empty permission you can use to modify the global scope." "markdownDescription": "An empty permission you can use to modify the global scope.\n\n## Example\n\n```json\n{\n \"identifier\": \"read-documents\",\n \"windows\": [\"main\"],\n \"permissions\": [\n \"fs:allow-read\",\n {\n \"identifier\": \"fs:scope\",\n \"allow\": [\n \"$APPDATA/documents/**/*\"\n ],\n \"deny\": [\n \"$APPDATA/documents/secret.txt\"\n ]\n }\n ]\n}\n```\n"
}, },
{ {
"description": "This scope permits access to all files and list content of top level directories in the application folders.", "description": "This scope permits access to all files and list content of top level directories in the application folders.",

View File

@ -1400,10 +1400,10 @@
"markdownDescription": "This enables all index or metadata related commands without any pre-configured accessible paths." "markdownDescription": "This enables all index or metadata related commands without any pre-configured accessible paths."
}, },
{ {
"description": "An empty permission you can use to modify the global scope.", "description": "An empty permission you can use to modify the global scope.\n\n## Example\n\n```json\n{\n \"identifier\": \"read-documents\",\n \"windows\": [\"main\"],\n \"permissions\": [\n \"fs:allow-read\",\n {\n \"identifier\": \"fs:scope\",\n \"allow\": [\n \"$APPDATA/documents/**/*\"\n ],\n \"deny\": [\n \"$APPDATA/documents/secret.txt\"\n ]\n }\n ]\n}\n```\n",
"type": "string", "type": "string",
"const": "fs:scope", "const": "fs:scope",
"markdownDescription": "An empty permission you can use to modify the global scope." "markdownDescription": "An empty permission you can use to modify the global scope.\n\n## Example\n\n```json\n{\n \"identifier\": \"read-documents\",\n \"windows\": [\"main\"],\n \"permissions\": [\n \"fs:allow-read\",\n {\n \"identifier\": \"fs:scope\",\n \"allow\": [\n \"$APPDATA/documents/**/*\"\n ],\n \"deny\": [\n \"$APPDATA/documents/secret.txt\"\n ]\n }\n ]\n}\n```\n"
}, },
{ {
"description": "This scope permits access to all files and list content of top level directories in the application folders.", "description": "This scope permits access to all files and list content of top level directories in the application folders.",
@ -2277,10 +2277,10 @@
"markdownDescription": "Default core plugins set.\n#### This default permission set includes:\n\n- `core:path:default`\n- `core:event:default`\n- `core:window:default`\n- `core:webview:default`\n- `core:app:default`\n- `core:image:default`\n- `core:resources:default`\n- `core:menu:default`\n- `core:tray:default`" "markdownDescription": "Default core plugins set.\n#### This default permission set includes:\n\n- `core:path:default`\n- `core:event:default`\n- `core:window:default`\n- `core:webview:default`\n- `core:app:default`\n- `core:image:default`\n- `core:resources:default`\n- `core:menu:default`\n- `core:tray:default`"
}, },
{ {
"description": "Default permissions for the plugin.\n#### This default permission set includes:\n\n- `allow-version`\n- `allow-name`\n- `allow-tauri-version`\n- `allow-identifier`\n- `allow-bundle-type`", "description": "Default permissions for the plugin.\n#### This default permission set includes:\n\n- `allow-version`\n- `allow-name`\n- `allow-tauri-version`\n- `allow-identifier`\n- `allow-bundle-type`\n- `allow-register-listener`\n- `allow-remove-listener`",
"type": "string", "type": "string",
"const": "core:app:default", "const": "core:app:default",
"markdownDescription": "Default permissions for the plugin.\n#### This default permission set includes:\n\n- `allow-version`\n- `allow-name`\n- `allow-tauri-version`\n- `allow-identifier`\n- `allow-bundle-type`" "markdownDescription": "Default permissions for the plugin.\n#### This default permission set includes:\n\n- `allow-version`\n- `allow-name`\n- `allow-tauri-version`\n- `allow-identifier`\n- `allow-bundle-type`\n- `allow-register-listener`\n- `allow-remove-listener`"
}, },
{ {
"description": "Enables the app_hide command without any pre-configured scope.", "description": "Enables the app_hide command without any pre-configured scope.",
@ -2324,12 +2324,24 @@
"const": "core:app:allow-name", "const": "core:app:allow-name",
"markdownDescription": "Enables the name command without any pre-configured scope." "markdownDescription": "Enables the name command without any pre-configured scope."
}, },
{
"description": "Enables the register_listener command without any pre-configured scope.",
"type": "string",
"const": "core:app:allow-register-listener",
"markdownDescription": "Enables the register_listener command without any pre-configured scope."
},
{ {
"description": "Enables the remove_data_store command without any pre-configured scope.", "description": "Enables the remove_data_store command without any pre-configured scope.",
"type": "string", "type": "string",
"const": "core:app:allow-remove-data-store", "const": "core:app:allow-remove-data-store",
"markdownDescription": "Enables the remove_data_store command without any pre-configured scope." "markdownDescription": "Enables the remove_data_store command without any pre-configured scope."
}, },
{
"description": "Enables the remove_listener command without any pre-configured scope.",
"type": "string",
"const": "core:app:allow-remove-listener",
"markdownDescription": "Enables the remove_listener command without any pre-configured scope."
},
{ {
"description": "Enables the set_app_theme command without any pre-configured scope.", "description": "Enables the set_app_theme command without any pre-configured scope.",
"type": "string", "type": "string",
@ -2396,12 +2408,24 @@
"const": "core:app:deny-name", "const": "core:app:deny-name",
"markdownDescription": "Denies the name command without any pre-configured scope." "markdownDescription": "Denies the name command without any pre-configured scope."
}, },
{
"description": "Denies the register_listener command without any pre-configured scope.",
"type": "string",
"const": "core:app:deny-register-listener",
"markdownDescription": "Denies the register_listener command without any pre-configured scope."
},
{ {
"description": "Denies the remove_data_store command without any pre-configured scope.", "description": "Denies the remove_data_store command without any pre-configured scope.",
"type": "string", "type": "string",
"const": "core:app:deny-remove-data-store", "const": "core:app:deny-remove-data-store",
"markdownDescription": "Denies the remove_data_store command without any pre-configured scope." "markdownDescription": "Denies the remove_data_store command without any pre-configured scope."
}, },
{
"description": "Denies the remove_listener command without any pre-configured scope.",
"type": "string",
"const": "core:app:deny-remove-listener",
"markdownDescription": "Denies the remove_listener command without any pre-configured scope."
},
{ {
"description": "Denies the set_app_theme command without any pre-configured scope.", "description": "Denies the set_app_theme command without any pre-configured scope.",
"type": "string", "type": "string",
@ -5541,10 +5565,10 @@
"markdownDescription": "This enables all index or metadata related commands without any pre-configured accessible paths." "markdownDescription": "This enables all index or metadata related commands without any pre-configured accessible paths."
}, },
{ {
"description": "An empty permission you can use to modify the global scope.", "description": "An empty permission you can use to modify the global scope.\n\n## Example\n\n```json\n{\n \"identifier\": \"read-documents\",\n \"windows\": [\"main\"],\n \"permissions\": [\n \"fs:allow-read\",\n {\n \"identifier\": \"fs:scope\",\n \"allow\": [\n \"$APPDATA/documents/**/*\"\n ],\n \"deny\": [\n \"$APPDATA/documents/secret.txt\"\n ]\n }\n ]\n}\n```\n",
"type": "string", "type": "string",
"const": "fs:scope", "const": "fs:scope",
"markdownDescription": "An empty permission you can use to modify the global scope." "markdownDescription": "An empty permission you can use to modify the global scope.\n\n## Example\n\n```json\n{\n \"identifier\": \"read-documents\",\n \"windows\": [\"main\"],\n \"permissions\": [\n \"fs:allow-read\",\n {\n \"identifier\": \"fs:scope\",\n \"allow\": [\n \"$APPDATA/documents/**/*\"\n ],\n \"deny\": [\n \"$APPDATA/documents/secret.txt\"\n ]\n }\n ]\n}\n```\n"
}, },
{ {
"description": "This scope permits access to all files and list content of top level directories in the application folders.", "description": "This scope permits access to all files and list content of top level directories in the application folders.",

View File

@ -1400,10 +1400,10 @@
"markdownDescription": "This enables all index or metadata related commands without any pre-configured accessible paths." "markdownDescription": "This enables all index or metadata related commands without any pre-configured accessible paths."
}, },
{ {
"description": "An empty permission you can use to modify the global scope.", "description": "An empty permission you can use to modify the global scope.\n\n## Example\n\n```json\n{\n \"identifier\": \"read-documents\",\n \"windows\": [\"main\"],\n \"permissions\": [\n \"fs:allow-read\",\n {\n \"identifier\": \"fs:scope\",\n \"allow\": [\n \"$APPDATA/documents/**/*\"\n ],\n \"deny\": [\n \"$APPDATA/documents/secret.txt\"\n ]\n }\n ]\n}\n```\n",
"type": "string", "type": "string",
"const": "fs:scope", "const": "fs:scope",
"markdownDescription": "An empty permission you can use to modify the global scope." "markdownDescription": "An empty permission you can use to modify the global scope.\n\n## Example\n\n```json\n{\n \"identifier\": \"read-documents\",\n \"windows\": [\"main\"],\n \"permissions\": [\n \"fs:allow-read\",\n {\n \"identifier\": \"fs:scope\",\n \"allow\": [\n \"$APPDATA/documents/**/*\"\n ],\n \"deny\": [\n \"$APPDATA/documents/secret.txt\"\n ]\n }\n ]\n}\n```\n"
}, },
{ {
"description": "This scope permits access to all files and list content of top level directories in the application folders.", "description": "This scope permits access to all files and list content of top level directories in the application folders.",
@ -2277,10 +2277,10 @@
"markdownDescription": "Default core plugins set.\n#### This default permission set includes:\n\n- `core:path:default`\n- `core:event:default`\n- `core:window:default`\n- `core:webview:default`\n- `core:app:default`\n- `core:image:default`\n- `core:resources:default`\n- `core:menu:default`\n- `core:tray:default`" "markdownDescription": "Default core plugins set.\n#### This default permission set includes:\n\n- `core:path:default`\n- `core:event:default`\n- `core:window:default`\n- `core:webview:default`\n- `core:app:default`\n- `core:image:default`\n- `core:resources:default`\n- `core:menu:default`\n- `core:tray:default`"
}, },
{ {
"description": "Default permissions for the plugin.\n#### This default permission set includes:\n\n- `allow-version`\n- `allow-name`\n- `allow-tauri-version`\n- `allow-identifier`\n- `allow-bundle-type`", "description": "Default permissions for the plugin.\n#### This default permission set includes:\n\n- `allow-version`\n- `allow-name`\n- `allow-tauri-version`\n- `allow-identifier`\n- `allow-bundle-type`\n- `allow-register-listener`\n- `allow-remove-listener`",
"type": "string", "type": "string",
"const": "core:app:default", "const": "core:app:default",
"markdownDescription": "Default permissions for the plugin.\n#### This default permission set includes:\n\n- `allow-version`\n- `allow-name`\n- `allow-tauri-version`\n- `allow-identifier`\n- `allow-bundle-type`" "markdownDescription": "Default permissions for the plugin.\n#### This default permission set includes:\n\n- `allow-version`\n- `allow-name`\n- `allow-tauri-version`\n- `allow-identifier`\n- `allow-bundle-type`\n- `allow-register-listener`\n- `allow-remove-listener`"
}, },
{ {
"description": "Enables the app_hide command without any pre-configured scope.", "description": "Enables the app_hide command without any pre-configured scope.",
@ -2324,12 +2324,24 @@
"const": "core:app:allow-name", "const": "core:app:allow-name",
"markdownDescription": "Enables the name command without any pre-configured scope." "markdownDescription": "Enables the name command without any pre-configured scope."
}, },
{
"description": "Enables the register_listener command without any pre-configured scope.",
"type": "string",
"const": "core:app:allow-register-listener",
"markdownDescription": "Enables the register_listener command without any pre-configured scope."
},
{ {
"description": "Enables the remove_data_store command without any pre-configured scope.", "description": "Enables the remove_data_store command without any pre-configured scope.",
"type": "string", "type": "string",
"const": "core:app:allow-remove-data-store", "const": "core:app:allow-remove-data-store",
"markdownDescription": "Enables the remove_data_store command without any pre-configured scope." "markdownDescription": "Enables the remove_data_store command without any pre-configured scope."
}, },
{
"description": "Enables the remove_listener command without any pre-configured scope.",
"type": "string",
"const": "core:app:allow-remove-listener",
"markdownDescription": "Enables the remove_listener command without any pre-configured scope."
},
{ {
"description": "Enables the set_app_theme command without any pre-configured scope.", "description": "Enables the set_app_theme command without any pre-configured scope.",
"type": "string", "type": "string",
@ -2396,12 +2408,24 @@
"const": "core:app:deny-name", "const": "core:app:deny-name",
"markdownDescription": "Denies the name command without any pre-configured scope." "markdownDescription": "Denies the name command without any pre-configured scope."
}, },
{
"description": "Denies the register_listener command without any pre-configured scope.",
"type": "string",
"const": "core:app:deny-register-listener",
"markdownDescription": "Denies the register_listener command without any pre-configured scope."
},
{ {
"description": "Denies the remove_data_store command without any pre-configured scope.", "description": "Denies the remove_data_store command without any pre-configured scope.",
"type": "string", "type": "string",
"const": "core:app:deny-remove-data-store", "const": "core:app:deny-remove-data-store",
"markdownDescription": "Denies the remove_data_store command without any pre-configured scope." "markdownDescription": "Denies the remove_data_store command without any pre-configured scope."
}, },
{
"description": "Denies the remove_listener command without any pre-configured scope.",
"type": "string",
"const": "core:app:deny-remove-listener",
"markdownDescription": "Denies the remove_listener command without any pre-configured scope."
},
{ {
"description": "Denies the set_app_theme command without any pre-configured scope.", "description": "Denies the set_app_theme command without any pre-configured scope.",
"type": "string", "type": "string",
@ -5541,10 +5565,10 @@
"markdownDescription": "This enables all index or metadata related commands without any pre-configured accessible paths." "markdownDescription": "This enables all index or metadata related commands without any pre-configured accessible paths."
}, },
{ {
"description": "An empty permission you can use to modify the global scope.", "description": "An empty permission you can use to modify the global scope.\n\n## Example\n\n```json\n{\n \"identifier\": \"read-documents\",\n \"windows\": [\"main\"],\n \"permissions\": [\n \"fs:allow-read\",\n {\n \"identifier\": \"fs:scope\",\n \"allow\": [\n \"$APPDATA/documents/**/*\"\n ],\n \"deny\": [\n \"$APPDATA/documents/secret.txt\"\n ]\n }\n ]\n}\n```\n",
"type": "string", "type": "string",
"const": "fs:scope", "const": "fs:scope",
"markdownDescription": "An empty permission you can use to modify the global scope." "markdownDescription": "An empty permission you can use to modify the global scope.\n\n## Example\n\n```json\n{\n \"identifier\": \"read-documents\",\n \"windows\": [\"main\"],\n \"permissions\": [\n \"fs:allow-read\",\n {\n \"identifier\": \"fs:scope\",\n \"allow\": [\n \"$APPDATA/documents/**/*\"\n ],\n \"deny\": [\n \"$APPDATA/documents/secret.txt\"\n ]\n }\n ]\n}\n```\n"
}, },
{ {
"description": "This scope permits access to all files and list content of top level directories in the application folders.", "description": "This scope permits access to all files and list content of top level directories in the application folders.",

View File

@ -1,5 +1,6 @@
// src-tarui/src/build/table_names.rs
use serde::Deserialize;
+use serde_json::Value;
use std::collections::HashMap;
use std::env;
use std::fs::File;
@ -8,24 +9,7 @@ use std::path::Path;
#[derive(Debug, Deserialize)]
struct Schema {
-    haex: Haex,
+    haex: HashMap<String, Value>,
}
#[derive(Debug, Deserialize)]
#[allow(non_snake_case)]
struct Haex {
settings: TableDefinition,
extensions: TableDefinition,
extension_permissions: TableDefinition,
notifications: TableDefinition,
crdt: Crdt,
}
#[derive(Debug, Deserialize)]
struct Crdt {
logs: TableDefinition,
snapshots: TableDefinition,
configs: TableDefinition,
}
#[derive(Debug, Deserialize)]
@ -36,178 +20,98 @@ struct TableDefinition {
pub fn generate_table_names() {
    let out_dir = env::var("OUT_DIR").expect("OUT_DIR ist nicht gesetzt.");
-   println!("Generiere Tabellennamen nach {}", out_dir);
+   println!("Generiere Tabellennamen nach {out_dir}");
-   let schema_path = Path::new("database/tableNames.json");
+   let schema_path = Path::new("../src/database/tableNames.json");
    let dest_path = Path::new(&out_dir).join("tableNames.rs");
-   let file = File::open(&schema_path).expect("Konnte tableNames.json nicht öffnen");
+   let file = File::open(schema_path).expect("Konnte tableNames.json nicht öffnen");
    let reader = BufReader::new(file);
    let schema: Schema =
        serde_json::from_reader(reader).expect("Konnte tableNames.json nicht parsen");
-   let haex = schema.haex;
-   let code = format!(
+   let mut code = String::from(
        r#"
// ==================================================================
// HINWEIS: Diese Datei wurde automatisch von build.rs generiert.
// Manuelle Änderungen werden bei der nächsten Kompilierung überschrieben!
// ==================================================================
// --- Table: haex_settings ---
pub const TABLE_SETTINGS: &str = "{t_settings}";
pub const COL_SETTINGS_ID: &str = "{c_settings_id}";
pub const COL_SETTINGS_KEY: &str = "{c_settings_key}";
pub const COL_SETTINGS_TYPE: &str = "{c_settings_type}";
pub const COL_SETTINGS_VALUE: &str = "{c_settings_value}";
pub const COL_SETTINGS_HAEX_TOMBSTONE: &str = "{c_settings_tombstone}";
pub const COL_SETTINGS_HAEX_TIMESTAMP: &str = "{c_settings_timestamp}";
// --- Table: haex_extensions ---
pub const TABLE_EXTENSIONS: &str = "{t_extensions}";
pub const COL_EXTENSIONS_ID: &str = "{c_ext_id}";
pub const COL_EXTENSIONS_AUTHOR: &str = "{c_ext_author}";
pub const COL_EXTENSIONS_DESCRIPTION: &str = "{c_ext_description}";
pub const COL_EXTENSIONS_ENTRY: &str = "{c_ext_entry}";
pub const COL_EXTENSIONS_HOMEPAGE: &str = "{c_ext_homepage}";
pub const COL_EXTENSIONS_ENABLED: &str = "{c_ext_enabled}";
pub const COL_EXTENSIONS_ICON: &str = "{c_ext_icon}";
pub const COL_EXTENSIONS_NAME: &str = "{c_ext_name}";
pub const COL_EXTENSIONS_PUBLIC_KEY: &str = "{c_ext_public_key}";
pub const COL_EXTENSIONS_SIGNATURE: &str = "{c_ext_signature}";
pub const COL_EXTENSIONS_URL: &str = "{c_ext_url}";
pub const COL_EXTENSIONS_VERSION: &str = "{c_ext_version}";
pub const COL_EXTENSIONS_HAEX_TOMBSTONE: &str = "{c_ext_tombstone}";
pub const COL_EXTENSIONS_HAEX_TIMESTAMP: &str = "{c_ext_timestamp}";
// --- Table: haex_extension_permissions ---
pub const TABLE_EXTENSION_PERMISSIONS: &str = "{t_ext_perms}";
pub const COL_EXT_PERMS_ID: &str = "{c_extp_id}";
pub const COL_EXT_PERMS_EXTENSION_ID: &str = "{c_extp_extensionId}";
pub const COL_EXT_PERMS_RESOURCE_TYPE: &str = "{c_extp_resourceType}";
pub const COL_EXT_PERMS_ACTION: &str = "{c_extp_action}";
pub const COL_EXT_PERMS_TARGET: &str = "{c_extp_target}";
pub const COL_EXT_PERMS_CONSTRAINTS: &str = "{c_extp_constraints}";
pub const COL_EXT_PERMS_STATUS: &str = "{c_extp_status}";
pub const COL_EXT_PERMS_CREATED_AT: &str = "{c_extp_createdAt}";
pub const COL_EXT_PERMS_UPDATE_AT: &str = "{c_extp_updateAt}";
pub const COL_EXT_PERMS_HAEX_TOMBSTONE: &str = "{c_extp_tombstone}";
pub const COL_EXT_PERMS_HAEX_TIMESTAMP: &str = "{c_extp_timestamp}";
// --- Table: haex_notifications ---
pub const TABLE_NOTIFICATIONS: &str = "{t_notifications}";
pub const COL_NOTIFICATIONS_ID: &str = "{c_notif_id}";
pub const COL_NOTIFICATIONS_ALT: &str = "{c_notif_alt}";
pub const COL_NOTIFICATIONS_DATE: &str = "{c_notif_date}";
pub const COL_NOTIFICATIONS_ICON: &str = "{c_notif_icon}";
pub const COL_NOTIFICATIONS_IMAGE: &str = "{c_notif_image}";
pub const COL_NOTIFICATIONS_READ: &str = "{c_notif_read}";
pub const COL_NOTIFICATIONS_SOURCE: &str = "{c_notif_source}";
pub const COL_NOTIFICATIONS_TEXT: &str = "{c_notif_text}";
pub const COL_NOTIFICATIONS_TITLE: &str = "{c_notif_title}";
pub const COL_NOTIFICATIONS_TYPE: &str = "{c_notif_type}";
pub const COL_NOTIFICATIONS_HAEX_TOMBSTONE: &str = "{c_notif_tombstone}";
// --- Table: haex_crdt_logs ---
pub const TABLE_CRDT_LOGS: &str = "{t_crdt_logs}";
pub const COL_CRDT_LOGS_ID: &str = "{c_crdt_logs_id}";
pub const COL_CRDT_LOGS_HAEX_TIMESTAMP: &str = "{c_crdt_logs_timestamp}";
pub const COL_CRDT_LOGS_TABLE_NAME: &str = "{c_crdt_logs_tableName}";
pub const COL_CRDT_LOGS_ROW_PKS: &str = "{c_crdt_logs_rowPks}";
pub const COL_CRDT_LOGS_OP_TYPE: &str = "{c_crdt_logs_opType}";
pub const COL_CRDT_LOGS_COLUMN_NAME: &str = "{c_crdt_logs_columnName}";
pub const COL_CRDT_LOGS_NEW_VALUE: &str = "{c_crdt_logs_newValue}";
pub const COL_CRDT_LOGS_OLD_VALUE: &str = "{c_crdt_logs_oldValue}";
// --- Table: haex_crdt_snapshots ---
pub const TABLE_CRDT_SNAPSHOTS: &str = "{t_crdt_snapshots}";
pub const COL_CRDT_SNAPSHOTS_ID: &str = "{c_crdt_snap_id}";
pub const COL_CRDT_SNAPSHOTS_CREATED: &str = "{c_crdt_snap_created}";
pub const COL_CRDT_SNAPSHOTS_EPOCH_HLC: &str = "{c_crdt_snap_epoch}";
pub const COL_CRDT_SNAPSHOTS_LOCATION_URL: &str = "{c_crdt_snap_location}";
pub const COL_CRDT_SNAPSHOTS_FILE_SIZE: &str = "{c_crdt_snap_size}";
// --- Table: haex_crdt_configs ---
pub const TABLE_CRDT_CONFIGS: &str = "{t_crdt_configs}";
pub const COL_CRDT_CONFIGS_KEY: &str = "{c_crdt_configs_key}";
pub const COL_CRDT_CONFIGS_VALUE: &str = "{c_crdt_configs_value}";
"#, "#,
// Settings
t_settings = haex.settings.name,
c_settings_id = haex.settings.columns["id"],
c_settings_key = haex.settings.columns["key"],
c_settings_type = haex.settings.columns["type"],
c_settings_value = haex.settings.columns["value"],
c_settings_tombstone = haex.settings.columns["haexTombstone"],
c_settings_timestamp = haex.settings.columns["haexTimestamp"],
// Extensions
t_extensions = haex.extensions.name,
c_ext_id = haex.extensions.columns["id"],
c_ext_author = haex.extensions.columns["author"],
c_ext_description = haex.extensions.columns["description"],
c_ext_entry = haex.extensions.columns["entry"],
c_ext_homepage = haex.extensions.columns["homepage"],
c_ext_enabled = haex.extensions.columns["enabled"],
c_ext_icon = haex.extensions.columns["icon"],
c_ext_name = haex.extensions.columns["name"],
c_ext_public_key = haex.extensions.columns["public_key"],
c_ext_signature = haex.extensions.columns["signature"],
c_ext_url = haex.extensions.columns["url"],
c_ext_version = haex.extensions.columns["version"],
c_ext_tombstone = haex.extensions.columns["haexTombstone"],
c_ext_timestamp = haex.extensions.columns["haexTimestamp"],
// Extension Permissions
t_ext_perms = haex.extension_permissions.name,
c_extp_id = haex.extension_permissions.columns["id"],
c_extp_extensionId = haex.extension_permissions.columns["extensionId"],
c_extp_resourceType = haex.extension_permissions.columns["resourceType"],
c_extp_action = haex.extension_permissions.columns["action"],
c_extp_target = haex.extension_permissions.columns["target"],
c_extp_constraints = haex.extension_permissions.columns["constraints"],
c_extp_status = haex.extension_permissions.columns["status"],
c_extp_createdAt = haex.extension_permissions.columns["createdAt"],
c_extp_updateAt = haex.extension_permissions.columns["updateAt"],
c_extp_tombstone = haex.extension_permissions.columns["haexTombstone"],
c_extp_timestamp = haex.extension_permissions.columns["haexTimestamp"],
// Notifications
t_notifications = haex.notifications.name,
c_notif_id = haex.notifications.columns["id"],
c_notif_alt = haex.notifications.columns["alt"],
c_notif_date = haex.notifications.columns["date"],
c_notif_icon = haex.notifications.columns["icon"],
c_notif_image = haex.notifications.columns["image"],
c_notif_read = haex.notifications.columns["read"],
c_notif_source = haex.notifications.columns["source"],
c_notif_text = haex.notifications.columns["text"],
c_notif_title = haex.notifications.columns["title"],
c_notif_type = haex.notifications.columns["type"],
c_notif_tombstone = haex.notifications.columns["haexTombstone"],
// CRDT Logs
t_crdt_logs = haex.crdt.logs.name,
c_crdt_logs_id = haex.crdt.logs.columns["id"],
c_crdt_logs_timestamp = haex.crdt.logs.columns["haexTimestamp"],
c_crdt_logs_tableName = haex.crdt.logs.columns["tableName"],
c_crdt_logs_rowPks = haex.crdt.logs.columns["rowPks"],
c_crdt_logs_opType = haex.crdt.logs.columns["opType"],
c_crdt_logs_columnName = haex.crdt.logs.columns["columnName"],
c_crdt_logs_newValue = haex.crdt.logs.columns["newValue"],
c_crdt_logs_oldValue = haex.crdt.logs.columns["oldValue"],
// CRDT Snapshots
t_crdt_snapshots = haex.crdt.snapshots.name,
c_crdt_snap_id = haex.crdt.snapshots.columns["snapshotId"],
c_crdt_snap_created = haex.crdt.snapshots.columns["created"],
c_crdt_snap_epoch = haex.crdt.snapshots.columns["epochHlc"],
c_crdt_snap_location = haex.crdt.snapshots.columns["locationUrl"],
c_crdt_snap_size = haex.crdt.snapshots.columns["fileSizeBytes"],
// CRDT Configs
t_crdt_configs = haex.crdt.configs.name,
c_crdt_configs_key = haex.crdt.configs.columns["key"],
c_crdt_configs_value = haex.crdt.configs.columns["value"]
    );
// Dynamisch über alle Einträge in haex iterieren
for (key, value) in &schema.haex {
// Spezialbehandlung für nested structures wie "crdt"
if key == "crdt" {
if let Some(crdt_obj) = value.as_object() {
for (crdt_key, crdt_value) in crdt_obj {
if let Ok(table) = serde_json::from_value::<TableDefinition>(crdt_value.clone())
{
let const_prefix = format!("CRDT_{}", to_screaming_snake_case(crdt_key));
code.push_str(&generate_table_constants(&table, &const_prefix));
}
}
}
} else {
// Normale Tabelle (settings, extensions, notifications, workspaces, desktop_items, etc.)
if let Ok(table) = serde_json::from_value::<TableDefinition>(value.clone()) {
let const_prefix = to_screaming_snake_case(key);
code.push_str(&generate_table_constants(&table, &const_prefix));
}
}
}
    // --- Datei schreiben ---
    let mut f = File::create(&dest_path).expect("Konnte Zieldatei nicht erstellen");
    f.write_all(code.as_bytes())
        .expect("Konnte nicht in Zieldatei schreiben");
-   println!("cargo:rerun-if-changed=database/tableNames.json");
+   println!("cargo:rerun-if-changed=../src/database/tableNames.json");
}
/// Konvertiert einen String zu SCREAMING_SNAKE_CASE
fn to_screaming_snake_case(s: &str) -> String {
let mut result = String::new();
let mut prev_is_lower = false;
for (i, ch) in s.chars().enumerate() {
if ch == '_' {
result.push('_');
prev_is_lower = false;
} else if ch.is_uppercase() {
if i > 0 && prev_is_lower {
result.push('_');
}
result.push(ch);
prev_is_lower = false;
} else {
result.push(ch.to_ascii_uppercase());
prev_is_lower = true;
}
}
result
}
/// Generiert die Konstanten für eine Tabelle
fn generate_table_constants(table: &TableDefinition, const_prefix: &str) -> String {
let mut code = String::new();
// Tabellenname
code.push_str(&format!("// --- Table: {} ---\n", table.name));
code.push_str(&format!(
"pub const TABLE_{}: &str = \"{}\";\n",
const_prefix, table.name
));
// Spalten
for (col_key, col_value) in &table.columns {
let col_const_name = format!("COL_{}_{}", const_prefix, to_screaming_snake_case(col_key));
code.push_str(&format!(
"pub const {col_const_name}: &str = \"{col_value}\";\n"
));
}
code.push('\n');
code
}
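For orientation, a hedged sketch of what these helpers produce; the inputs below are made-up examples that only illustrate the casing rules and the shape of the emitted constants, not actual entries from tableNames.json.
#[cfg(test)]
mod naming_sketch {
    use super::to_screaming_snake_case;

    #[test]
    fn casing_rules() {
        // camelCase keys get an underscore before each upper-case letter,
        // snake_case keys are only upper-cased.
        assert_eq!(to_screaming_snake_case("desktopItems"), "DESKTOP_ITEMS");
        assert_eq!(to_screaming_snake_case("crdt_logs"), "CRDT_LOGS");
        assert_eq!(to_screaming_snake_case("settings"), "SETTINGS");
    }
}
// For a hypothetical entry { "name": "haex_workspaces", "columns": { "id": "id" } } under the
// key "workspaces", generate_table_constants would emit roughly:
//   pub const TABLE_WORKSPACES: &str = "haex_workspaces";
//   pub const COL_WORKSPACES_ID: &str = "id";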

View File

@ -74,15 +74,14 @@ impl HlcService {
// Parse den String in ein Uuid-Objekt.
let uuid = Uuid::parse_str(&node_id_str).map_err(|e| {
HlcError::ParseNodeId(format!(
"Stored device ID is not a valid UUID: {node_id_str}. Error: {e}"
))
})?;
// Hol dir die rohen 16 Bytes und erstelle daraus die uhlc::ID.
// Das `*` dereferenziert den `&[u8; 16]` zu `[u8; 16]`, was `try_from` erwartet.
let node_id = ID::try_from(*uuid.as_bytes()).map_err(|e| {
HlcError::ParseNodeId(format!("Invalid node ID format from device store: {e:?}"))
})?;
// 2. Erstelle eine HLC-Instanz mit stabiler Identität
@ -95,8 +94,7 @@ impl HlcService {
if let Some(last_timestamp) = Self::load_last_timestamp(conn)? {
hlc.update_with_timestamp(&last_timestamp).map_err(|e| {
HlcError::Parse(format!(
"Failed to update HLC with persisted timestamp: {e:?}"
))
})?;
}
@ -119,7 +117,7 @@ impl HlcService {
if let Some(s) = value.as_str() {
// Das ist unser Erfolgsfall. Wir haben einen &str und können
// eine Kopie davon zurückgeben.
println!("Gefundene und validierte Geräte-ID: {s}");
if Uuid::parse_str(s).is_ok() {
// Erfolgsfall: Der Wert ist ein String UND eine gültige UUID.
// Wir können die Funktion direkt mit dem Wert verlassen.
@ -183,19 +181,19 @@ impl HlcService {
let hlc = hlc_guard.as_mut().ok_or(HlcError::NotInitialized)?;
hlc.update_with_timestamp(timestamp)
.map_err(|e| HlcError::Parse(format!("Failed to update HLC: {e:?}")))
}
/// Lädt den letzten persistierten Zeitstempel aus der Datenbank.
fn load_last_timestamp(conn: &Connection) -> Result<Option<Timestamp>, HlcError> {
let query = format!("SELECT value FROM {TABLE_CRDT_CONFIGS} WHERE key = ?1");
match conn.query_row(&query, params![HLC_TIMESTAMP_TYPE], |row| {
row.get::<_, String>(0)
}) {
Ok(state_str) => {
let timestamp = Timestamp::from_str(&state_str).map_err(|e| {
HlcError::ParseTimestamp(format!("Invalid timestamp format: {e:?}"))
})?;
Ok(Some(timestamp))
}
@ -209,9 +207,8 @@ impl HlcService {
let timestamp_str = timestamp.to_string();
tx.execute(
&format!(
"INSERT INTO {TABLE_CRDT_CONFIGS} (key, value) VALUES (?1, ?2)
ON CONFLICT(key) DO UPDATE SET value = excluded.value"
),
params![HLC_TIMESTAMP_TYPE, timestamp_str],
)?;
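A hedged aside on the persistence shown above: the HLC state survives restarts only because uhlc::Timestamp round-trips through its string form. The helper below is illustrative and not part of the diff.
use std::str::FromStr;
use uhlc::Timestamp;

fn roundtrip_example(ts: &Timestamp) -> Timestamp {
    let stored = ts.to_string(); // what the code above writes into the configs table as TEXT
    Timestamp::from_str(&stored).expect("a timestamp we just serialized should parse back")
}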

View File

@ -0,0 +1,99 @@
// src-tauri/src/crdt/insert_transformer.rs
// INSERT-spezifische CRDT-Transformationen (ON CONFLICT, RETURNING)
use crate::crdt::trigger::HLC_TIMESTAMP_COLUMN;
use crate::database::error::DatabaseError;
use sqlparser::ast::{Expr, Ident, Insert, SelectItem, SetExpr, Value};
use uhlc::Timestamp;
/// Helper-Struct für INSERT-Transformationen
pub struct InsertTransformer {
hlc_timestamp_column: &'static str,
}
impl InsertTransformer {
pub fn new() -> Self {
Self {
hlc_timestamp_column: HLC_TIMESTAMP_COLUMN,
}
}
fn find_or_add_column(columns: &mut Vec<Ident>, col_name: &'static str) -> usize {
match columns.iter().position(|c| c.value == col_name) {
Some(index) => index, // Gefunden! Gib Index zurück.
None => {
// Nicht gefunden! Hinzufügen.
columns.push(Ident::new(col_name));
columns.len() - 1 // Der Index des gerade hinzugefügten Elements
}
}
}
/// Wenn der Index == der Länge ist, wird der Wert stattdessen gepusht.
fn set_or_push_value(row: &mut Vec<Expr>, index: usize, value: Expr) {
if index < row.len() {
// Spalte war vorhanden, Wert (wahrscheinlich `?` oder NULL) ersetzen
row[index] = value;
} else {
// Spalte war nicht vorhanden, Wert hinzufügen
row.push(value);
}
}
fn set_or_push_projection(projection: &mut Vec<SelectItem>, index: usize, value: Expr) {
let item = SelectItem::UnnamedExpr(value);
if index < projection.len() {
projection[index] = item;
} else {
projection.push(item);
}
}
/// Transformiert INSERT-Statements (fügt HLC-Timestamp hinzu)
/// Hard Delete: Kein ON CONFLICT mehr nötig - gelöschte Einträge sind wirklich weg
pub fn transform_insert(
&self,
insert_stmt: &mut Insert,
timestamp: &Timestamp,
) -> Result<(), DatabaseError> {
// Add haex_timestamp column if not exists
let hlc_col_index =
Self::find_or_add_column(&mut insert_stmt.columns, self.hlc_timestamp_column);
// ON CONFLICT Logik komplett entfernt!
// Bei Hard Deletes gibt es keine Tombstone-Einträge mehr zu reaktivieren
// UNIQUE Constraint Violations sind echte Fehler
match insert_stmt.source.as_mut() {
Some(query) => match &mut *query.body {
SetExpr::Values(values) => {
for row in &mut values.rows {
let hlc_value =
Expr::Value(Value::SingleQuotedString(timestamp.to_string()).into());
Self::set_or_push_value(row, hlc_col_index, hlc_value);
}
}
SetExpr::Select(select) => {
let hlc_value =
Expr::Value(Value::SingleQuotedString(timestamp.to_string()).into());
Self::set_or_push_projection(&mut select.projection, hlc_col_index, hlc_value);
}
_ => {
return Err(DatabaseError::UnsupportedStatement {
sql: insert_stmt.to_string(),
reason: "INSERT with unsupported source type".to_string(),
});
}
},
None => {
return Err(DatabaseError::UnsupportedStatement {
reason: "INSERT statement has no source".to_string(),
sql: insert_stmt.to_string(),
});
}
}
Ok(())
}
}
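A hedged usage sketch (the table name "notes" and the calling context are made up; only the transformer API is taken from the code above): parse an INSERT, run it through the transformer, and the haex_timestamp column plus its value get appended.
use sqlparser::ast::Statement;
use sqlparser::dialect::SQLiteDialect;
use sqlparser::parser::Parser;

fn example(timestamp: &uhlc::Timestamp) -> Result<String, crate::database::error::DatabaseError> {
    let dialect = SQLiteDialect {};
    let mut stmts = Parser::parse_sql(&dialect, "INSERT INTO notes (id, title) VALUES (?, ?)")
        .expect("static SQL parses");
    if let Statement::Insert(insert) = &mut stmts[0] {
        InsertTransformer::new().transform_insert(insert, timestamp)?;
        // Roughly: INSERT INTO notes (id, title, haex_timestamp) VALUES (?, ?, '<hlc timestamp>')
        return Ok(insert.to_string());
    }
    unreachable!("the statement above is a single INSERT")
}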

View File

@ -1,3 +1,5 @@
pub mod hlc;
pub mod insert_transformer;
//pub mod query_transformer;
pub mod transformer;
pub mod trigger;

View File

@ -1,9 +1,12 @@
// src-tauri/src/crdt/transformer.rs
use crate::crdt::insert_transformer::InsertTransformer;
use crate::crdt::trigger::HLC_TIMESTAMP_COLUMN;
use crate::database::error::DatabaseError;
use crate::table_names::{TABLE_CRDT_CONFIGS, TABLE_CRDT_LOGS};
use sqlparser::ast::{
Assignment, AssignmentTarget, ColumnDef, DataType, Expr, Ident, ObjectName, ObjectNamePart,
Statement, TableFactor, TableObject, Value,
};
use std::borrow::Cow;
use std::collections::HashSet;
@ -12,46 +15,14 @@ use uhlc::Timestamp;
/// Konfiguration für CRDT-Spalten
#[derive(Clone)]
struct CrdtColumns {
hlc_timestamp: &'static str,
}
impl CrdtColumns {
const DEFAULT: Self = Self {
hlc_timestamp: HLC_TIMESTAMP_COLUMN,
};
/// Erstellt einen Tombstone-Filter für eine Tabelle
fn create_tombstone_filter(&self, table_alias: Option<&str>) -> Expr {
let column_expr = match table_alias {
Some(alias) => {
// Qualifizierte Referenz: alias.tombstone
Expr::CompoundIdentifier(vec![Ident::new(alias), Ident::new(self.tombstone)])
}
None => {
// Einfache Referenz: tombstone
Expr::Identifier(Ident::new(self.tombstone))
}
};
Expr::BinaryOp {
left: Box::new(column_expr),
op: BinaryOperator::NotEq,
right: Box::new(Expr::Value(Value::Number("1".to_string(), false).into())),
}
}
/// Erstellt eine Tombstone-Zuweisung für UPDATE/DELETE
fn create_tombstone_assignment(&self) -> Assignment {
Assignment {
target: AssignmentTarget::ColumnName(ObjectName(vec![ObjectNamePart::Identifier(
Ident::new(self.tombstone),
)])),
value: Expr::Value(Value::Number("1".to_string(), false).into()),
}
}
/// Erstellt eine HLC-Zuweisung für UPDATE/DELETE
fn create_hlc_assignment(&self, timestamp: &Timestamp) -> Assignment {
Assignment {
@ -64,13 +35,6 @@ impl CrdtColumns {
/// Fügt CRDT-Spalten zu einer Tabellendefinition hinzu
fn add_to_table_definition(&self, columns: &mut Vec<ColumnDef>) {
if !columns.iter().any(|c| c.name.value == self.tombstone) {
columns.push(ColumnDef {
name: Ident::new(self.tombstone),
data_type: DataType::Integer(None),
options: vec![],
});
}
if !columns.iter().any(|c| c.name.value == self.hlc_timestamp) {
columns.push(ColumnDef {
name: Ident::new(self.hlc_timestamp),
@ -110,14 +74,61 @@ impl CrdtTransformer {
Cow::Owned(name_str.trim_matches('`').trim_matches('"').to_string())
}
// =================================================================
// ÖFFENTLICHE API-METHODEN
// =================================================================
pub fn transform_execute_statement_with_table_info(
&self,
stmt: &mut Statement,
hlc_timestamp: &Timestamp,
) -> Result<Option<String>, DatabaseError> {
match stmt {
Statement::CreateTable(create_table) => {
if self.is_crdt_sync_table(&create_table.name) {
self.columns
.add_to_table_definition(&mut create_table.columns);
Ok(Some(
self.normalize_table_name(&create_table.name).into_owned(),
))
} else {
Ok(None)
}
}
Statement::Insert(insert_stmt) => {
if let TableObject::TableName(name) = &insert_stmt.table {
if self.is_crdt_sync_table(name) {
// Hard Delete: Kein Schema-Lookup mehr nötig (kein ON CONFLICT)
let insert_transformer = InsertTransformer::new();
insert_transformer.transform_insert(insert_stmt, hlc_timestamp)?;
}
}
Ok(None)
}
Statement::Update {
table, assignments, ..
} => {
if let TableFactor::Table { name, .. } = &table.relation {
if self.is_crdt_sync_table(name) {
assignments.push(self.columns.create_hlc_assignment(hlc_timestamp));
}
}
Ok(None)
}
Statement::Delete(_del_stmt) => {
// Hard Delete - keine Transformation!
// DELETE bleibt DELETE
// BEFORE DELETE Trigger schreiben die Logs
Ok(None)
}
Statement::AlterTable { name, .. } => {
if self.is_crdt_sync_table(name) {
Ok(Some(self.normalize_table_name(name).into_owned()))
} else {
Ok(None)
}
}
_ => Ok(None),
}
}
@ -141,7 +152,9 @@ impl CrdtTransformer {
Statement::Insert(insert_stmt) => {
if let TableObject::TableName(name) = &insert_stmt.table {
if self.is_crdt_sync_table(name) {
// Hard Delete: Keine ON CONFLICT Logik mehr nötig
let insert_transformer = InsertTransformer::new();
insert_transformer.transform_insert(insert_stmt, hlc_timestamp)?;
}
}
Ok(None)
@ -156,18 +169,10 @@ impl CrdtTransformer {
}
Ok(None)
}
Statement::Delete(_del_stmt) => {
// Hard Delete - keine Transformation!
// DELETE bleibt DELETE
Ok(None)
}
Statement::AlterTable { name, .. } => {
if self.is_crdt_sync_table(name) {
@ -179,619 +184,4 @@ impl CrdtTransformer {
_ => Ok(None),
}
}
/// Transformiert Query-Statements (fügt Tombstone-Filter hinzu)
fn transform_query_recursive(
&self,
query: &mut sqlparser::ast::Query,
) -> Result<(), DatabaseError> {
self.add_tombstone_filters_recursive(&mut query.body)
}
/// Rekursive Behandlung aller SetExpr-Typen mit vollständiger Subquery-Unterstützung
fn add_tombstone_filters_recursive(&self, set_expr: &mut SetExpr) -> Result<(), DatabaseError> {
match set_expr {
SetExpr::Select(select) => {
self.add_tombstone_filters_to_select(select)?;
// Transformiere auch Subqueries in Projektionen
for projection in &mut select.projection {
match projection {
SelectItem::UnnamedExpr(expr) | SelectItem::ExprWithAlias { expr, .. } => {
self.transform_expression_subqueries(expr)?;
}
_ => {} // Wildcard projections ignorieren
}
}
// Transformiere Subqueries in WHERE
if let Some(where_clause) = &mut select.selection {
self.transform_expression_subqueries(where_clause)?;
}
// Transformiere Subqueries in GROUP BY
match &mut select.group_by {
sqlparser::ast::GroupByExpr::All(_) => {
// GROUP BY ALL - keine Expressions zu transformieren
}
sqlparser::ast::GroupByExpr::Expressions(exprs, _) => {
for group_expr in exprs {
self.transform_expression_subqueries(group_expr)?;
}
}
}
// Transformiere Subqueries in HAVING
if let Some(having) = &mut select.having {
self.transform_expression_subqueries(having)?;
}
}
SetExpr::SetOperation { left, right, .. } => {
self.add_tombstone_filters_recursive(left)?;
self.add_tombstone_filters_recursive(right)?;
}
SetExpr::Query(query) => {
self.add_tombstone_filters_recursive(&mut query.body)?;
}
SetExpr::Values(values) => {
// Transformiere auch Subqueries in Values-Listen
for row in &mut values.rows {
for expr in row {
self.transform_expression_subqueries(expr)?;
}
}
}
_ => {} // Andere Fälle
}
Ok(())
}
/// Transformiert Subqueries innerhalb von Expressions
fn transform_expression_subqueries(&self, expr: &mut Expr) -> Result<(), DatabaseError> {
match expr {
// Einfache Subqueries
Expr::Subquery(query) => {
self.add_tombstone_filters_recursive(&mut query.body)?;
}
// EXISTS Subqueries
Expr::Exists { subquery, .. } => {
self.add_tombstone_filters_recursive(&mut subquery.body)?;
}
// IN Subqueries
Expr::InSubquery {
expr: left_expr,
subquery,
..
} => {
self.transform_expression_subqueries(left_expr)?;
self.add_tombstone_filters_recursive(&mut subquery.body)?;
}
// ANY/ALL Subqueries
Expr::AnyOp { left, right, .. } | Expr::AllOp { left, right, .. } => {
self.transform_expression_subqueries(left)?;
self.transform_expression_subqueries(right)?;
}
// Binäre Operationen
Expr::BinaryOp { left, right, .. } => {
self.transform_expression_subqueries(left)?;
self.transform_expression_subqueries(right)?;
}
// Unäre Operationen
Expr::UnaryOp {
expr: inner_expr, ..
} => {
self.transform_expression_subqueries(inner_expr)?;
}
// Verschachtelte Ausdrücke
Expr::Nested(nested) => {
self.transform_expression_subqueries(nested)?;
}
// CASE-Ausdrücke
Expr::Case {
operand,
conditions,
else_result,
..
} => {
if let Some(op) = operand {
self.transform_expression_subqueries(op)?;
}
for case_when in conditions {
self.transform_expression_subqueries(&mut case_when.condition)?;
self.transform_expression_subqueries(&mut case_when.result)?;
}
if let Some(else_res) = else_result {
self.transform_expression_subqueries(else_res)?;
}
}
// Funktionsaufrufe
Expr::Function(func) => match &mut func.args {
sqlparser::ast::FunctionArguments::List(sqlparser::ast::FunctionArgumentList {
args,
..
}) => {
for arg in args {
if let sqlparser::ast::FunctionArg::Unnamed(
sqlparser::ast::FunctionArgExpr::Expr(expr),
) = arg
{
self.transform_expression_subqueries(expr)?;
}
}
}
_ => {}
},
// BETWEEN
Expr::Between {
expr: main_expr,
low,
high,
..
} => {
self.transform_expression_subqueries(main_expr)?;
self.transform_expression_subqueries(low)?;
self.transform_expression_subqueries(high)?;
}
// IN Liste
Expr::InList {
expr: main_expr,
list,
..
} => {
self.transform_expression_subqueries(main_expr)?;
for list_expr in list {
self.transform_expression_subqueries(list_expr)?;
}
}
// IS NULL/IS NOT NULL
Expr::IsNull(inner) | Expr::IsNotNull(inner) => {
self.transform_expression_subqueries(inner)?;
}
// Andere Expression-Typen benötigen keine Transformation
_ => {}
}
Ok(())
}
/// Fügt Tombstone-Filter zu SELECT-Statements hinzu (nur wenn nicht explizit in WHERE gesetzt)
fn add_tombstone_filters_to_select(
&self,
select: &mut sqlparser::ast::Select,
) -> Result<(), DatabaseError> {
// Sammle alle CRDT-Tabellen mit ihren Aliasen
let mut crdt_tables = Vec::new();
for twj in &select.from {
if let TableFactor::Table { name, alias, .. } = &twj.relation {
if self.is_crdt_sync_table(name) {
let table_alias = alias.as_ref().map(|a| a.name.value.as_str());
crdt_tables.push((name.clone(), table_alias));
}
}
}
if crdt_tables.is_empty() {
return Ok(());
}
// Prüfe, welche Tombstone-Spalten bereits in der WHERE-Klausel referenziert werden
let explicitly_filtered_tables = if let Some(where_clause) = &select.selection {
self.find_explicitly_filtered_tombstone_tables(where_clause, &crdt_tables)
} else {
HashSet::new()
};
// Erstelle Filter nur für Tabellen, die noch nicht explizit gefiltert werden
let mut tombstone_filters = Vec::new();
for (table_name, table_alias) in crdt_tables {
let table_name_string = table_name.to_string();
let table_key = table_alias.unwrap_or(&table_name_string);
if !explicitly_filtered_tables.contains(table_key) {
tombstone_filters.push(self.columns.create_tombstone_filter(table_alias));
}
}
// Füge die automatischen Filter hinzu
if !tombstone_filters.is_empty() {
let combined_filter = tombstone_filters
.into_iter()
.reduce(|acc, expr| Expr::BinaryOp {
left: Box::new(acc),
op: BinaryOperator::And,
right: Box::new(expr),
})
.unwrap();
match &mut select.selection {
Some(existing) => {
*existing = Expr::BinaryOp {
left: Box::new(existing.clone()),
op: BinaryOperator::And,
right: Box::new(combined_filter),
};
}
None => {
select.selection = Some(combined_filter);
}
}
}
Ok(())
}
/// Findet alle Tabellen, die bereits explizit Tombstone-Filter in der WHERE-Klausel haben
fn find_explicitly_filtered_tombstone_tables(
&self,
where_expr: &Expr,
crdt_tables: &[(ObjectName, Option<&str>)],
) -> HashSet<String> {
let mut filtered_tables = HashSet::new();
self.scan_expression_for_tombstone_references(
where_expr,
crdt_tables,
&mut filtered_tables,
);
filtered_tables
}
/// Rekursiv durchsucht einen Expression-Baum nach Tombstone-Spalten-Referenzen
fn scan_expression_for_tombstone_references(
&self,
expr: &Expr,
crdt_tables: &[(ObjectName, Option<&str>)],
filtered_tables: &mut HashSet<String>,
) {
match expr {
// Einfache Spaltenreferenz: tombstone = ?
Expr::Identifier(ident) => {
if ident.value == self.columns.tombstone {
// Wenn keine Tabelle spezifiziert ist und es nur eine CRDT-Tabelle gibt
if crdt_tables.len() == 1 {
let table_name_str = crdt_tables[0].0.to_string();
let table_key = crdt_tables[0].1.unwrap_or(&table_name_str);
filtered_tables.insert(table_key.to_string());
}
}
}
// Qualifizierte Spaltenreferenz: table.tombstone = ? oder alias.tombstone = ?
Expr::CompoundIdentifier(idents) => {
if idents.len() == 2 && idents[1].value == self.columns.tombstone {
let table_ref = &idents[0].value;
// Prüfe, ob es eine unserer CRDT-Tabellen ist (nach Name oder Alias)
for (table_name, alias) in crdt_tables {
let table_name_str = table_name.to_string();
if table_ref == &table_name_str || alias.map_or(false, |a| a == table_ref) {
filtered_tables.insert(table_ref.clone());
break;
}
}
}
}
// Binäre Operationen: AND, OR, etc.
Expr::BinaryOp { left, right, .. } => {
self.scan_expression_for_tombstone_references(left, crdt_tables, filtered_tables);
self.scan_expression_for_tombstone_references(right, crdt_tables, filtered_tables);
}
// Unäre Operationen: NOT, etc.
Expr::UnaryOp { expr, .. } => {
self.scan_expression_for_tombstone_references(expr, crdt_tables, filtered_tables);
}
// Verschachtelte Ausdrücke
Expr::Nested(nested) => {
self.scan_expression_for_tombstone_references(nested, crdt_tables, filtered_tables);
}
// IN-Klauseln
Expr::InList { expr, .. } => {
self.scan_expression_for_tombstone_references(expr, crdt_tables, filtered_tables);
}
// BETWEEN-Klauseln
Expr::Between { expr, .. } => {
self.scan_expression_for_tombstone_references(expr, crdt_tables, filtered_tables);
}
// IS NULL/IS NOT NULL
Expr::IsNull(expr) | Expr::IsNotNull(expr) => {
self.scan_expression_for_tombstone_references(expr, crdt_tables, filtered_tables);
}
// Funktionsaufrufe - KORRIGIERT
Expr::Function(func) => {
match &func.args {
sqlparser::ast::FunctionArguments::List(
sqlparser::ast::FunctionArgumentList { args, .. },
) => {
for arg in args {
if let sqlparser::ast::FunctionArg::Unnamed(
sqlparser::ast::FunctionArgExpr::Expr(expr),
) = arg
{
self.scan_expression_for_tombstone_references(
expr,
crdt_tables,
filtered_tables,
);
}
}
}
_ => {} // Andere FunctionArguments-Varianten ignorieren
}
}
// CASE-Ausdrücke - KORRIGIERT
Expr::Case {
operand,
conditions,
else_result,
..
} => {
if let Some(op) = operand {
self.scan_expression_for_tombstone_references(op, crdt_tables, filtered_tables);
}
for case_when in conditions {
self.scan_expression_for_tombstone_references(
&case_when.condition,
crdt_tables,
filtered_tables,
);
self.scan_expression_for_tombstone_references(
&case_when.result,
crdt_tables,
filtered_tables,
);
}
if let Some(else_res) = else_result {
self.scan_expression_for_tombstone_references(
else_res,
crdt_tables,
filtered_tables,
);
}
}
// Subqueries mit vollständiger Unterstützung
Expr::Subquery(query) => {
self.transform_query_recursive_for_tombstone_analysis(
query,
crdt_tables,
filtered_tables,
)
.ok();
}
// EXISTS/NOT EXISTS Subqueries
Expr::Exists { subquery, .. } => {
self.transform_query_recursive_for_tombstone_analysis(
subquery,
crdt_tables,
filtered_tables,
)
.ok();
}
// IN/NOT IN Subqueries
Expr::InSubquery { expr, subquery, .. } => {
self.scan_expression_for_tombstone_references(expr, crdt_tables, filtered_tables);
self.transform_query_recursive_for_tombstone_analysis(
subquery,
crdt_tables,
filtered_tables,
)
.ok();
}
// ANY/ALL Subqueries
Expr::AnyOp { left, right, .. } | Expr::AllOp { left, right, .. } => {
self.scan_expression_for_tombstone_references(left, crdt_tables, filtered_tables);
self.scan_expression_for_tombstone_references(right, crdt_tables, filtered_tables);
}
// Andere Expression-Typen ignorieren wir für jetzt
_ => {}
}
}
/// Analysiert eine Subquery und sammelt Tombstone-Referenzen
fn transform_query_recursive_for_tombstone_analysis(
&self,
query: &sqlparser::ast::Query,
crdt_tables: &[(ObjectName, Option<&str>)],
filtered_tables: &mut HashSet<String>,
) -> Result<(), DatabaseError> {
self.analyze_set_expr_for_tombstone_references(&query.body, crdt_tables, filtered_tables)
}
/// Rekursiv analysiert SetExpr für Tombstone-Referenzen
fn analyze_set_expr_for_tombstone_references(
&self,
set_expr: &SetExpr,
crdt_tables: &[(ObjectName, Option<&str>)],
filtered_tables: &mut HashSet<String>,
) -> Result<(), DatabaseError> {
match set_expr {
SetExpr::Select(select) => {
// Analysiere WHERE-Klausel
if let Some(where_clause) = &select.selection {
self.scan_expression_for_tombstone_references(
where_clause,
crdt_tables,
filtered_tables,
);
}
// Analysiere alle Projektionen (können auch Subqueries enthalten)
for projection in &select.projection {
match projection {
SelectItem::UnnamedExpr(expr) | SelectItem::ExprWithAlias { expr, .. } => {
self.scan_expression_for_tombstone_references(
expr,
crdt_tables,
filtered_tables,
);
}
_ => {} // Wildcard projections ignorieren
}
}
// Analysiere GROUP BY
match &select.group_by {
sqlparser::ast::GroupByExpr::All(_) => {
// GROUP BY ALL - keine Expressions zu analysieren
}
sqlparser::ast::GroupByExpr::Expressions(exprs, _) => {
for group_expr in exprs {
self.scan_expression_for_tombstone_references(
group_expr,
crdt_tables,
filtered_tables,
);
}
}
}
// Analysiere HAVING
if let Some(having) = &select.having {
self.scan_expression_for_tombstone_references(
having,
crdt_tables,
filtered_tables,
);
}
}
SetExpr::SetOperation { left, right, .. } => {
self.analyze_set_expr_for_tombstone_references(left, crdt_tables, filtered_tables)?;
self.analyze_set_expr_for_tombstone_references(
right,
crdt_tables,
filtered_tables,
)?;
}
SetExpr::Query(query) => {
self.analyze_set_expr_for_tombstone_references(
&query.body,
crdt_tables,
filtered_tables,
)?;
}
SetExpr::Values(values) => {
// Analysiere Values-Listen
for row in &values.rows {
for expr in row {
self.scan_expression_for_tombstone_references(
expr,
crdt_tables,
filtered_tables,
);
}
}
}
_ => {} // Andere Varianten
}
Ok(())
}
/// Transformiert INSERT-Statements (fügt HLC-Timestamp hinzu)
fn transform_insert(
&self,
insert_stmt: &mut Insert,
timestamp: &Timestamp,
) -> Result<(), DatabaseError> {
// Add both haex_timestamp and haex_tombstone columns
insert_stmt
.columns
.push(Ident::new(self.columns.hlc_timestamp));
insert_stmt
.columns
.push(Ident::new(self.columns.tombstone));
match insert_stmt.source.as_mut() {
Some(query) => match &mut *query.body {
SetExpr::Values(values) => {
for row in &mut values.rows {
// Add haex_timestamp value
row.push(Expr::Value(
Value::SingleQuotedString(timestamp.to_string()).into(),
));
// Add haex_tombstone value (0 = not deleted)
row.push(Expr::Value(
Value::Number("0".to_string(), false).into(),
));
}
}
SetExpr::Select(select) => {
let hlc_expr =
Expr::Value(Value::SingleQuotedString(timestamp.to_string()).into());
select.projection.push(SelectItem::UnnamedExpr(hlc_expr));
// Add haex_tombstone value (0 = not deleted)
let tombstone_expr =
Expr::Value(Value::Number("0".to_string(), false).into());
select.projection.push(SelectItem::UnnamedExpr(tombstone_expr));
}
_ => {
return Err(DatabaseError::UnsupportedStatement {
sql: insert_stmt.to_string(),
reason: "INSERT with unsupported source type".to_string(),
});
}
},
None => {
return Err(DatabaseError::UnsupportedStatement {
reason: "INSERT statement has no source".to_string(),
sql: insert_stmt.to_string(),
});
}
}
Ok(())
}
/// Transformiert DELETE zu UPDATE (soft delete)
fn transform_delete_to_update(
&self,
stmt: &mut Statement,
timestamp: &Timestamp,
) -> Result<(), DatabaseError> {
if let Statement::Delete(del_stmt) = stmt {
let table_to_update = match &del_stmt.from {
sqlparser::ast::FromTable::WithFromKeyword(from)
| sqlparser::ast::FromTable::WithoutKeyword(from) => {
if from.len() == 1 {
from[0].clone()
} else {
return Err(DatabaseError::UnsupportedStatement {
reason: "DELETE with multiple tables not supported".to_string(),
sql: stmt.to_string(),
});
}
}
};
let assignments = vec![
self.columns.create_tombstone_assignment(),
self.columns.create_hlc_assignment(timestamp),
];
*stmt = Statement::Update {
table: table_to_update,
assignments,
from: None,
selection: del_stmt.selection.clone(),
returning: None,
or: None,
limit: None,
};
}
Ok(())
}
/// Extrahiert Tabellennamen aus DELETE-Statement
fn extract_table_name_from_delete(
&self,
del_stmt: &sqlparser::ast::Delete,
) -> Option<ObjectName> {
let tables = match &del_stmt.from {
sqlparser::ast::FromTable::WithFromKeyword(from)
| sqlparser::ast::FromTable::WithoutKeyword(from) => from,
};
if tables.len() == 1 {
if let TableFactor::Table { name, .. } = &tables[0].relation {
Some(name.clone())
} else {
None
}
} else {
None
}
}
}
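A hedged usage sketch of the new entry point (the statement text and the table "notes" are invented; whether a table counts as CRDT-synced is decided by is_crdt_sync_table, which is not shown here): the transformer mutates the statement in place and only reports a table name back for schema-changing statements.
use sqlparser::dialect::SQLiteDialect;
use sqlparser::parser::Parser;
use uhlc::Timestamp;

fn example(
    transformer: &CrdtTransformer,
    hlc_timestamp: &Timestamp,
) -> Result<(), DatabaseError> {
    let dialect = SQLiteDialect {};
    let mut stmts = Parser::parse_sql(&dialect, "UPDATE notes SET title = ? WHERE id = ?")
        .expect("static SQL parses");
    let changed_table =
        transformer.transform_execute_statement_with_table_info(&mut stmts[0], hlc_timestamp)?;
    // For UPDATE/INSERT/DELETE the result is Ok(None); the statement now carries the
    // haex_timestamp assignment (if "notes" is a CRDT-synced table).
    // CREATE TABLE and ALTER TABLE on synced tables return Ok(Some(table_name)).
    assert!(changed_table.is_none());
    Ok(())
}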

View File

@ -9,20 +9,17 @@ use ts_rs::TS;
// Der "z_"-Präfix soll sicherstellen, dass diese Trigger als Letzte ausgeführt werden // Der "z_"-Präfix soll sicherstellen, dass diese Trigger als Letzte ausgeführt werden
const INSERT_TRIGGER_TPL: &str = "z_crdt_{TABLE_NAME}_insert"; const INSERT_TRIGGER_TPL: &str = "z_crdt_{TABLE_NAME}_insert";
const UPDATE_TRIGGER_TPL: &str = "z_crdt_{TABLE_NAME}_update"; const UPDATE_TRIGGER_TPL: &str = "z_crdt_{TABLE_NAME}_update";
const DELETE_TRIGGER_TPL: &str = "z_crdt_{TABLE_NAME}_delete";
//const SYNC_ACTIVE_KEY: &str = "sync_active";
pub const TOMBSTONE_COLUMN: &str = "haex_tombstone";
pub const HLC_TIMESTAMP_COLUMN: &str = "haex_timestamp";
/// Name der custom UUID-Generierungs-Funktion (registriert in database::core::open_and_init_db)
pub const UUID_FUNCTION_NAME: &str = "gen_uuid";
#[derive(Debug)]
pub enum CrdtSetupError {
/// Kapselt einen Fehler, der von der rusqlite-Bibliothek kommt.
DatabaseError(rusqlite::Error),
/// Die Tabelle hat keine Tombstone-Spalte, was eine CRDT-Voraussetzung ist.
TombstoneColumnMissing {
table_name: String,
column_name: String,
},
HlcColumnMissing {
table_name: String,
column_name: String,
@ -35,25 +32,16 @@ pub enum CrdtSetupError {
impl Display for CrdtSetupError {
fn fmt(&self, f: &mut Formatter<'_>) -> fmt::Result {
match self {
CrdtSetupError::DatabaseError(e) => write!(f, "Database error: {e}"),
CrdtSetupError::TombstoneColumnMissing {
table_name,
column_name,
} => write!(
f,
"Table '{}' is missing the required tombstone column '{}'",
table_name, column_name
),
CrdtSetupError::HlcColumnMissing {
table_name,
column_name,
} => write!(
f,
"Table '{table_name}' is missing the required hlc column '{column_name}'"
),
CrdtSetupError::PrimaryKeyMissing { table_name } => {
write!(f, "Table '{table_name}' has no primary key")
}
}
}
@ -78,14 +66,14 @@ pub enum TriggerSetupResult {
TableNotFound,
}
#[derive(Debug, Clone)]
pub struct ColumnInfo {
pub name: String,
pub is_pk: bool,
}
impl ColumnInfo {
pub fn from_row(row: &Row) -> RusqliteResult<Self> {
Ok(ColumnInfo {
name: row.get("name")?,
is_pk: row.get::<_, i64>("pk")? > 0,
@ -94,7 +82,8 @@ impl ColumnInfo {
}
fn is_safe_identifier(name: &str) -> bool {
// Allow alphanumeric characters, underscores, and hyphens (for extension names like "nuxt-app")
!name.is_empty() && name.chars().all(|c| c.is_alphanumeric() || c == '_' || c == '-')
}
/// Richtet CRDT-Trigger für eine einzelne Tabelle ein.
@ -109,13 +98,6 @@ pub fn setup_triggers_for_table(
return Ok(TriggerSetupResult::TableNotFound);
}
if !columns.iter().any(|c| c.name == TOMBSTONE_COLUMN) {
return Err(CrdtSetupError::TombstoneColumnMissing {
table_name: table_name.to_string(),
column_name: TOMBSTONE_COLUMN.to_string(),
});
}
if !columns.iter().any(|c| c.name == HLC_TIMESTAMP_COLUMN) {
return Err(CrdtSetupError::HlcColumnMissing {
table_name: table_name.to_string(),
@ -137,47 +119,48 @@ pub fn setup_triggers_for_table(
let cols_to_track: Vec<String> = columns
.iter()
.filter(|c| !c.is_pk)
.map(|c| c.name.clone())
.collect();
let insert_trigger_sql = generate_insert_trigger_sql(table_name, &pks, &cols_to_track);
let update_trigger_sql = generate_update_trigger_sql(table_name, &pks, &cols_to_track);
let delete_trigger_sql = generate_delete_trigger_sql(table_name, &pks, &cols_to_track);
if recreate {
drop_triggers_for_table(tx, table_name)?;
}
tx.execute_batch(&insert_trigger_sql)?;
tx.execute_batch(&update_trigger_sql)?;
tx.execute_batch(&delete_trigger_sql)?;
Ok(TriggerSetupResult::Success)
}
/// Holt das Schema für eine gegebene Tabelle.
pub fn get_table_schema(conn: &Connection, table_name: &str) -> RusqliteResult<Vec<ColumnInfo>> {
if !is_safe_identifier(table_name) {
return Err(rusqlite::Error::InvalidParameterName(format!(
"Invalid or unsafe table name provided: {table_name}"
)));
}
let sql = format!("PRAGMA table_info(\"{table_name}\");");
let mut stmt = conn.prepare(&sql)?;
let rows = stmt.query_map([], ColumnInfo::from_row)?;
rows.collect()
}
// get_foreign_key_columns() removed - not needed with hard deletes (no ON CONFLICT logic)
pub fn drop_triggers_for_table(
tx: &Transaction, // Arbeitet direkt auf einer Transaktion
table_name: &str,
) -> Result<(), CrdtSetupError> {
if !is_safe_identifier(table_name) {
return Err(rusqlite::Error::InvalidParameterName(format!(
"Invalid or unsafe table name provided: {table_name}"
))
.into());
}
@ -186,8 +169,12 @@ pub fn drop_triggers_for_table(
drop_trigger_sql(INSERT_TRIGGER_TPL.replace("{TABLE_NAME}", table_name));
let drop_update_trigger_sql =
drop_trigger_sql(UPDATE_TRIGGER_TPL.replace("{TABLE_NAME}", table_name));
let drop_delete_trigger_sql =
drop_trigger_sql(DELETE_TRIGGER_TPL.replace("{TABLE_NAME}", table_name));
let sql_batch = format!(
"{drop_insert_trigger_sql}\n{drop_update_trigger_sql}\n{drop_delete_trigger_sql}"
);
tx.execute_batch(&sql_batch)?;
Ok(())
@ -252,31 +239,22 @@ pub fn drop_triggers_for_table(
fn generate_insert_trigger_sql(table_name: &str, pks: &[String], cols: &[String]) -> String {
let pk_json_payload = pks
.iter()
.map(|pk| format!("'{pk}', NEW.\"{pk}\""))
.collect::<Vec<_>>()
.join(", ");
let column_inserts = if cols.is_empty() {
// Nur PKs -> einfacher Insert ins Log
format!(
"INSERT INTO {TABLE_CRDT_LOGS} (id, haex_timestamp, op_type, table_name, row_pks)
VALUES ({UUID_FUNCTION_NAME}(), NEW.\"{HLC_TIMESTAMP_COLUMN}\", 'INSERT', '{table_name}', json_object({pk_json_payload}));"
)
} else {
cols.iter().fold(String::new(), |mut acc, col| {
writeln!(
&mut acc,
"INSERT INTO {TABLE_CRDT_LOGS} (id, haex_timestamp, op_type, table_name, row_pks, column_name, new_value)
VALUES ({UUID_FUNCTION_NAME}(), NEW.\"{HLC_TIMESTAMP_COLUMN}\", 'INSERT', '{table_name}', json_object({pk_json_payload}), '{col}', json_object('value', NEW.\"{col}\"));"
).unwrap();
acc
})
@ -296,14 +274,14 @@ fn generate_insert_trigger_sql(table_name: &str, pks: &[String], cols: &[String]
/// Generiert das SQL zum Löschen eines Triggers.
fn drop_trigger_sql(trigger_name: String) -> String {
format!("DROP TRIGGER IF EXISTS \"{trigger_name}\";")
}
/// Generiert das SQL für den UPDATE-Trigger.
fn generate_update_trigger_sql(table_name: &str, pks: &[String], cols: &[String]) -> String {
let pk_json_payload = pks
.iter()
.map(|pk| format!("'{pk}', NEW.\"{pk}\""))
.collect::<Vec<_>>()
.join(", ");
@ -314,32 +292,15 @@ fn generate_update_trigger_sql(table_name: &str, pks: &[String], cols: &[String]
for col in cols {
writeln!(
&mut body,
"INSERT INTO {TABLE_CRDT_LOGS} (id, haex_timestamp, op_type, table_name, row_pks, column_name, new_value, old_value)
SELECT {UUID_FUNCTION_NAME}(), NEW.\"{HLC_TIMESTAMP_COLUMN}\", 'UPDATE', '{table_name}', json_object({pk_json_payload}), '{col}',
json_object('value', NEW.\"{col}\"), json_object('value', OLD.\"{col}\")
WHERE NEW.\"{col}\" IS NOT OLD.\"{col}\";"
).unwrap();
}
}
// Soft-delete Logging entfernt - wir nutzen jetzt Hard Deletes mit eigenem BEFORE DELETE Trigger
writeln!(
&mut body,
"INSERT INTO {log_table} (haex_timestamp, op_type, table_name, row_pks)
SELECT NEW.\"{hlc_col}\", 'DELETE', '{table}', json_object({pk_payload})
WHERE NEW.\"{tombstone_col}\" = 1 AND OLD.\"{tombstone_col}\" = 0;",
log_table = TABLE_CRDT_LOGS,
hlc_col = HLC_TIMESTAMP_COLUMN,
table = table_name,
pk_payload = pk_json_payload,
tombstone_col = TOMBSTONE_COLUMN
)
.unwrap();
let trigger_name = UPDATE_TRIGGER_TPL.replace("{TABLE_NAME}", table_name);
@ -352,3 +313,46 @@ fn generate_update_trigger_sql(table_name: &str, pks: &[String], cols: &[String]
END;" END;"
) )
} }
/// Generiert das SQL für den BEFORE DELETE-Trigger.
/// WICHTIG: BEFORE DELETE damit die Daten noch verfügbar sind!
fn generate_delete_trigger_sql(table_name: &str, pks: &[String], cols: &[String]) -> String {
let pk_json_payload = pks
.iter()
.map(|pk| format!("'{pk}', OLD.\"{pk}\""))
.collect::<Vec<_>>()
.join(", ");
let mut body = String::new();
// Alle Spaltenwerte speichern für mögliche Wiederherstellung
if !cols.is_empty() {
for col in cols {
writeln!(
&mut body,
"INSERT INTO {TABLE_CRDT_LOGS} (id, haex_timestamp, op_type, table_name, row_pks, column_name, old_value)
VALUES ({UUID_FUNCTION_NAME}(), OLD.\"{HLC_TIMESTAMP_COLUMN}\", 'DELETE', '{table_name}', json_object({pk_json_payload}), '{col}',
json_object('value', OLD.\"{col}\"));"
).unwrap();
}
} else {
// Nur PKs -> minimales Delete Log
writeln!(
&mut body,
"INSERT INTO {TABLE_CRDT_LOGS} (id, haex_timestamp, op_type, table_name, row_pks)
VALUES ({UUID_FUNCTION_NAME}(), OLD.\"{HLC_TIMESTAMP_COLUMN}\", 'DELETE', '{table_name}', json_object({pk_json_payload}));"
)
.unwrap();
}
let trigger_name = DELETE_TRIGGER_TPL.replace("{TABLE_NAME}", table_name);
format!(
"CREATE TRIGGER IF NOT EXISTS \"{trigger_name}\"
BEFORE DELETE ON \"{table_name}\"
FOR EACH ROW
BEGIN
{body}
END;"
)
}
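For orientation, a hedged sketch of the statement this generates for a hypothetical table "notes" with primary key "id" and one tracked column "title"; the log table name depends on what TABLE_CRDT_LOGS resolves to (assumed here to be haex_crdt_logs).
#[cfg(test)]
mod delete_trigger_sketch {
    use super::generate_delete_trigger_sql;

    #[test]
    fn shows_expected_shape() {
        let sql =
            generate_delete_trigger_sql("notes", &["id".to_string()], &["title".to_string()]);
        // Expected shape (whitespace aside):
        //   CREATE TRIGGER IF NOT EXISTS "z_crdt_notes_delete"
        //   BEFORE DELETE ON "notes"
        //   FOR EACH ROW
        //   BEGIN
        //     INSERT INTO haex_crdt_logs (id, haex_timestamp, op_type, table_name, row_pks, column_name, old_value)
        //     VALUES (gen_uuid(), OLD."haex_timestamp", 'DELETE', 'notes', json_object('id', OLD."id"), 'title',
        //       json_object('value', OLD."title"));
        //   END;
        assert!(sql.contains("BEFORE DELETE ON \"notes\""));
        assert!(sql.contains("'DELETE'"));
    }
}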

View File

@ -1,8 +1,11 @@
// src-tauri/src/database/core.rs
use crate::crdt::trigger::UUID_FUNCTION_NAME;
use crate::database::error::DatabaseError;
use crate::database::DbConnection;
use crate::extension::database::executor::SqlExecutor;
use base64::{engine::general_purpose::STANDARD, Engine as _};
use rusqlite::functions::FunctionFlags;
use rusqlite::types::Value as SqlValue;
use rusqlite::{
types::{Value as RusqliteValue, ValueRef},
@ -12,9 +15,9 @@ use serde_json::Value as JsonValue;
use sqlparser::ast::{Expr, Query, Select, SetExpr, Statement, TableFactor, TableObject};
use sqlparser::dialect::SQLiteDialect;
use sqlparser::parser::Parser;
use uuid::Uuid;
/// Öffnet und initialisiert eine Datenbank mit Verschlüsselung
///
pub fn open_and_init_db(path: &str, key: &str, create: bool) -> Result<Connection, DatabaseError> {
let flags = if create {
OpenFlags::SQLITE_OPEN_READ_WRITE | OpenFlags::SQLITE_OPEN_CREATE
@ -34,6 +37,19 @@ pub fn open_and_init_db(path: &str, key: &str, create: bool) -> Result<Connectio
reason: e.to_string(),
})?;
// Register custom UUID function for SQLite triggers
conn.create_scalar_function(
UUID_FUNCTION_NAME,
0,
FunctionFlags::SQLITE_UTF8 | FunctionFlags::SQLITE_DETERMINISTIC,
|_ctx| {
Ok(Uuid::new_v4().to_string())
},
)
.map_err(|e| DatabaseError::DatabaseError {
reason: format!("Failed to register {UUID_FUNCTION_NAME} function: {e}"),
})?;
let journal_mode: String = conn
.query_row("PRAGMA journal_mode=WAL;", [], |row| row.get(0))
.map_err(|e| DatabaseError::PragmaError {
@ -45,8 +61,7 @@ pub fn open_and_init_db(path: &str, key: &str, create: bool) -> Result<Connectio
println!("WAL mode successfully enabled.");
} else {
eprintln!(
"Failed to enable WAL mode, journal_mode is '{journal_mode}'."
);
}
@ -73,12 +88,29 @@ pub fn parse_single_statement(sql: &str) -> Result<Statement, DatabaseError> {
/// Utility für SQL-Parsing - parst mehrere SQL-Statements
pub fn parse_sql_statements(sql: &str) -> Result<Vec<Statement>, DatabaseError> {
let dialect = SQLiteDialect {};
// Normalize whitespace: replace multiple whitespaces (including newlines, tabs) with single space
let normalized_sql = sql
.split_whitespace()
.collect::<Vec<&str>>()
.join(" ");
Parser::parse_sql(&dialect, &normalized_sql).map_err(|e| DatabaseError::ParseError {
reason: format!("Failed to parse SQL: {e}"),
sql: sql.to_string(),
})
}
/// Prüft ob ein Statement ein RETURNING Clause hat (AST-basiert, sicher)
pub fn statement_has_returning(statement: &Statement) -> bool {
match statement {
Statement::Insert(insert) => insert.returning.is_some(),
Statement::Update { returning, .. } => returning.is_some(),
Statement::Delete(delete) => delete.returning.is_some(),
_ => false,
}
}
pub struct ValueConverter;
impl ValueConverter {
@ -105,7 +137,7 @@ impl ValueConverter {
serde_json::to_string(json_val)
.map(SqlValue::Text)
.map_err(|e| DatabaseError::SerializationError {
reason: format!("Failed to serialize JSON param: {e}"),
})
}
} }
@ -116,6 +148,25 @@ impl ValueConverter {
} }
} }
/// Execute SQL mit CRDT-Transformation (für Drizzle-Integration)
/// Diese Funktion sollte von Drizzle verwendet werden, um CRDT-Support zu erhalten
pub fn execute_with_crdt(
sql: String,
params: Vec<JsonValue>,
connection: &DbConnection,
hlc_service: &std::sync::MutexGuard<crate::crdt::hlc::HlcService>,
) -> Result<Vec<Vec<JsonValue>>, DatabaseError> {
with_connection(connection, |conn| {
let tx = conn.transaction().map_err(DatabaseError::from)?;
let _modified_tables = SqlExecutor::execute_internal(&tx, hlc_service, &sql, &params)?;
tx.commit().map_err(DatabaseError::from)?;
// Für Drizzle: gebe leeres Array zurück (wie bei execute ohne RETURNING)
Ok(vec![])
})
}
/// Execute SQL OHNE CRDT-Transformation (für spezielle Fälle)
pub fn execute(
sql: String,
params: Vec<JsonValue>,
@ -129,45 +180,23 @@ pub fn execute(
let params_sql: Vec<&dyn ToSql> = params_converted.iter().map(|v| v as &dyn ToSql).collect();
with_connection(connection, |conn| {
if sql.to_uppercase().contains("RETURNING") {
let mut stmt = conn.prepare(&sql)?;
let num_columns = stmt.column_count();
let mut rows = stmt.query(&params_sql[..])?;
let mut result_vec: Vec<Vec<JsonValue>> = Vec::new();
while let Some(row) = rows.next()? {
let mut row_values: Vec<JsonValue> = Vec::with_capacity(num_columns);
for i in 0..num_columns {
let value_ref = row.get_ref(i)?;
let json_val = convert_value_ref_to_json(value_ref)?;
row_values.push(json_val);
}
result_vec.push(row_values);
}
Ok(result_vec)
} else {
// For non-RETURNING statements, just execute and return empty array
conn.execute(&sql, &params_sql[..]).map_err(|e| {
let table_name = extract_primary_table_name_from_sql(&sql).unwrap_or(None);
DatabaseError::ExecutionError {
@ -176,7 +205,6 @@ pub fn execute(
table: table_name,
}
})?;
Ok(vec![])
}
})
@ -206,46 +234,36 @@ pub fn select(
let params_sql: Vec<&dyn ToSql> = params_converted.iter().map(|v| v as &dyn ToSql).collect();
with_connection(connection, |conn| {
let mut stmt = conn.prepare(&sql)?;
let num_columns = stmt.column_count();
let mut rows = stmt.query(&params_sql[..])?;
let mut result_vec: Vec<Vec<JsonValue>> = Vec::new();
while let Some(row) = rows.next()? {
let mut row_values: Vec<JsonValue> = Vec::with_capacity(num_columns);
for i in 0..num_columns {
let value_ref = row.get_ref(i)?;
let json_val = convert_value_ref_to_json(value_ref)?;
row_values.push(json_val);
}
result_vec.push(row_values);
}
Ok(result_vec)
})
}
pub fn select_with_crdt(
sql: String,
params: Vec<JsonValue>,
connection: &DbConnection,
) -> Result<Vec<Vec<JsonValue>>, DatabaseError> {
with_connection(connection, |conn| {
SqlExecutor::query_select(conn, &sql, &params)
})
}
/// Konvertiert rusqlite ValueRef zu JSON
pub fn convert_value_ref_to_json(value_ref: ValueRef) -> Result<JsonValue, DatabaseError> {
let json_val = match value_ref {
ValueRef::Null => JsonValue::Null,
ValueRef::Integer(i) => JsonValue::Number(i.into()),
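A hedged sketch of how the AST-based check above can replace the substring test on "RETURNING"; the SQL strings and the table "notes" are illustrative only.
fn wants_rows(sql: &str) -> Result<bool, DatabaseError> {
    // parse_single_statement and statement_has_returning are defined in this module.
    let stmt = parse_single_statement(sql)?;
    Ok(statement_has_returning(&stmt))
}
// wants_rows("DELETE FROM notes WHERE id = ?")              -> Ok(false)
// wants_rows("DELETE FROM notes WHERE id = ? RETURNING id") -> Ok(true)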

View File

@ -16,8 +16,6 @@ pub struct HaexSettings {
#[serde(skip_serializing_if = "Option::is_none")]
pub value: Option<String>,
#[serde(skip_serializing_if = "Option::is_none")]
pub haex_timestamp: Option<String>,
}
@ -28,8 +26,7 @@ impl HaexSettings {
key: row.get(1)?,
r#type: row.get(2)?,
value: row.get(3)?,
haex_timestamp: row.get(4)?,
})
}
}
@ -54,8 +51,6 @@ pub struct HaexExtensions {
pub icon: Option<String>,
pub signature: String,
#[serde(skip_serializing_if = "Option::is_none")]
pub haex_timestamp: Option<String>,
}
@ -73,8 +68,7 @@ impl HaexExtensions {
enabled: row.get(8)?,
icon: row.get(9)?,
signature: row.get(10)?,
haex_timestamp: row.get(11)?,
})
}
}
@ -83,8 +77,7 @@ impl HaexExtensions {
#[serde(rename_all = "camelCase")]
pub struct HaexExtensionPermissions {
pub id: String,
pub extension_id: String,
#[serde(skip_serializing_if = "Option::is_none")]
pub resource_type: Option<String>,
#[serde(skip_serializing_if = "Option::is_none")]
@ -99,8 +92,6 @@ pub struct HaexExtensionPermissions {
#[serde(skip_serializing_if = "Option::is_none")]
pub updated_at: Option<String>,
#[serde(skip_serializing_if = "Option::is_none")]
pub haex_timestamp: Option<String>,
}
@ -116,8 +107,7 @@ impl HaexExtensionPermissions {
status: row.get(6)?,
created_at: row.get(7)?,
updated_at: row.get(8)?,
haex_timestamp: row.get(9)?,
})
}
}
@ -200,3 +190,51 @@ impl HaexCrdtConfigs {
}
}
#[derive(Debug, Clone, Serialize, Deserialize)]
#[serde(rename_all = "camelCase")]
pub struct HaexDesktopItems {
pub id: String,
pub workspace_id: String,
pub item_type: String,
pub reference_id: String,
pub position_x: i64,
pub position_y: i64,
#[serde(skip_serializing_if = "Option::is_none")]
pub haex_timestamp: Option<String>,
}
impl HaexDesktopItems {
pub fn from_row(row: &rusqlite::Row) -> rusqlite::Result<Self> {
Ok(Self {
id: row.get(0)?,
workspace_id: row.get(1)?,
item_type: row.get(2)?,
reference_id: row.get(3)?,
position_x: row.get(4)?,
position_y: row.get(5)?,
haex_timestamp: row.get(6)?,
})
}
}
#[derive(Debug, Clone, Serialize, Deserialize)]
#[serde(rename_all = "camelCase")]
pub struct HaexWorkspaces {
pub id: String,
pub name: String,
pub position: i64,
#[serde(skip_serializing_if = "Option::is_none")]
pub haex_timestamp: Option<String>,
}
impl HaexWorkspaces {
pub fn from_row(row: &rusqlite::Row) -> rusqlite::Result<Self> {
Ok(Self {
id: row.get(0)?,
name: row.get(1)?,
position: row.get(2)?,
haex_timestamp: row.get(3)?,
})
}
}
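
These generated structs map result rows strictly by column position, so any SELECT feeding from_row has to list the columns in the same order as the struct fields. A minimal usage sketch, assuming the table is named haex_workspaces and exposes exactly the columns above:

use rusqlite::Connection;

fn load_workspaces(conn: &Connection) -> rusqlite::Result<Vec<HaexWorkspaces>> {
    // Column order must match the positional row.get(n) calls in from_row.
    let mut stmt = conn.prepare(
        "SELECT id, name, position, haex_timestamp FROM haex_workspaces",
    )?;
    let rows = stmt.query_map([], |row| HaexWorkspaces::from_row(row))?;
    rows.collect()
}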

View File

@ -0,0 +1,66 @@
// src-tauri/src/database/init.rs
// Database initialization utilities (trigger setup, etc.)
use crate::crdt::trigger;
use crate::database::error::DatabaseError;
use crate::table_names::{
TABLE_DESKTOP_ITEMS,
TABLE_EXTENSIONS,
TABLE_EXTENSION_PERMISSIONS,
TABLE_NOTIFICATIONS,
TABLE_SETTINGS,
TABLE_WORKSPACES,
};
use rusqlite::{params, Connection};
/// List of all CRDT tables that need triggers (excluding password tables - those come with the extension)
const CRDT_TABLES: &[&str] = &[
TABLE_SETTINGS,
TABLE_EXTENSIONS,
TABLE_EXTENSION_PERMISSIONS,
TABLE_NOTIFICATIONS,
TABLE_WORKSPACES,
TABLE_DESKTOP_ITEMS,
];
/// Checks whether the triggers have already been initialized and creates them if necessary
///
/// This function is called the first time a template DB is opened.
/// It creates all CRDT triggers for the defined tables and records
/// the initialization in haex_settings.
///
/// During migrations (ALTER TABLE), triggers are recreated automatically,
/// so no versioning is needed.
pub fn ensure_triggers_initialized(conn: &mut Connection) -> Result<bool, DatabaseError> {
let tx = conn.transaction()?;
// Check if triggers already initialized
let check_sql = format!(
"SELECT value FROM {TABLE_SETTINGS} WHERE key = ? AND type = ?"
);
let initialized: Option<String> = tx
.query_row(
&check_sql,
params!["triggers_initialized", "system"],
|row| row.get(0),
)
.ok();
if initialized.is_some() {
eprintln!("DEBUG: Triggers already initialized, skipping");
tx.commit()?; // Important: still commit the transaction
return Ok(true); // true = was already initialized
}
eprintln!("INFO: Initializing CRDT triggers for database...");
// Create triggers for all CRDT tables
for table_name in CRDT_TABLES {
eprintln!(" - Setting up triggers for: {table_name}");
trigger::setup_triggers_for_table(&tx, table_name, false)?;
}
tx.commit()?;
eprintln!("INFO: ✓ CRDT triggers created successfully (flag pending)");
Ok(false) // false = triggers were just created
}
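
Note that the function deliberately does not write the triggers_initialized flag itself; it only reports whether the flag was already present, so the caller can record it through the CRDT layer once the HLC service is available (see initialize_session further down in this diff). A minimal sketch of the assumed call site, reusing core::open_and_init_db and DatabaseError from the surrounding database module:

use rusqlite::Connection;

fn open_vault_connection(path: &str, key: &str) -> Result<(Connection, bool), DatabaseError> {
    // Open the decrypted connection first, then make sure the CRDT triggers exist
    // before any write goes through the CRDT layer.
    let mut conn = crate::database::core::open_and_init_db(path, key, false)?;
    let already_initialized = ensure_triggers_initialized(&mut conn)?;
    // The bool tells the caller whether the 'triggers_initialized' flag
    // still has to be written via execute_with_crdt.
    Ok((conn, already_initialized))
}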

View File

@ -3,10 +3,13 @@
pub mod core; pub mod core;
pub mod error; pub mod error;
pub mod generated; pub mod generated;
pub mod init;
use crate::crdt::hlc::HlcService; use crate::crdt::hlc::HlcService;
use crate::database::core::execute_with_crdt;
use crate::database::error::DatabaseError; use crate::database::error::DatabaseError;
use crate::table_names::TABLE_CRDT_CONFIGS; use crate::extension::database::executor::SqlExecutor;
use crate::table_names::{TABLE_CRDT_CONFIGS, TABLE_SETTINGS};
use crate::AppState; use crate::AppState;
use rusqlite::Connection; use rusqlite::Connection;
use serde::{Deserialize, Serialize}; use serde::{Deserialize, Serialize};
@ -17,6 +20,8 @@ use std::time::UNIX_EPOCH;
use std::{fs, sync::Arc}; use std::{fs, sync::Arc};
use tauri::{path::BaseDirectory, AppHandle, Manager, State}; use tauri::{path::BaseDirectory, AppHandle, Manager, State};
use tauri_plugin_fs::FsExt; use tauri_plugin_fs::FsExt;
#[cfg(not(target_os = "android"))]
use trash;
use ts_rs::TS; use ts_rs::TS;
pub struct DbConnection(pub Arc<Mutex<Option<Connection>>>); pub struct DbConnection(pub Arc<Mutex<Option<Connection>>>);
@ -42,13 +47,53 @@ pub fn sql_execute(
core::execute(sql, params, &state.db) core::execute(sql, params, &state.db)
} }
#[tauri::command]
pub fn sql_select_with_crdt(
sql: String,
params: Vec<JsonValue>,
state: State<'_, AppState>,
) -> Result<Vec<Vec<JsonValue>>, DatabaseError> {
core::select_with_crdt(sql, params, &state.db)
}
#[tauri::command]
pub fn sql_execute_with_crdt(
sql: String,
params: Vec<JsonValue>,
state: State<'_, AppState>,
) -> Result<Vec<Vec<JsonValue>>, DatabaseError> {
let hlc_service = state.hlc.lock().map_err(|_| DatabaseError::MutexPoisoned {
reason: "Failed to lock HLC service".to_string(),
})?;
core::execute_with_crdt(sql, params, &state.db, &hlc_service)
}
#[tauri::command]
pub fn sql_query_with_crdt(
sql: String,
params: Vec<JsonValue>,
state: State<'_, AppState>,
) -> Result<Vec<Vec<JsonValue>>, DatabaseError> {
let hlc_service = state.hlc.lock().map_err(|_| DatabaseError::MutexPoisoned {
reason: "Failed to lock HLC service".to_string(),
})?;
core::with_connection(&state.db, |conn| {
let tx = conn.transaction().map_err(DatabaseError::from)?;
let (_modified_tables, result) =
SqlExecutor::query_internal(&tx, &hlc_service, &sql, &params)?;
tx.commit().map_err(DatabaseError::from)?;
Ok(result)
})
}
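
These commands only become callable from the frontend once they are registered with the Tauri builder; that wiring is not part of this diff. A hedged sketch of what the registration could look like (builder excerpt only, command names as defined above; the real setup also manages AppState, plugins, etc.):

tauri::Builder::default()
    .invoke_handler(tauri::generate_handler![
        sql_execute,
        sql_select_with_crdt,
        sql_execute_with_crdt,
        sql_query_with_crdt,
    ])
    .run(tauri::generate_context!())
    .expect("error while running tauri application");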
/// Resolves a database name to the full vault path /// Resolves a database name to the full vault path
fn get_vault_path(app_handle: &AppHandle, vault_name: &str) -> Result<String, DatabaseError> { fn get_vault_path(app_handle: &AppHandle, vault_name: &str) -> Result<String, DatabaseError> {
// Sicherstellen, dass der Name eine .db Endung hat // Sicherstellen, dass der Name eine .db Endung hat
let vault_file_name = if vault_name.ends_with(VAULT_EXTENSION) { let vault_file_name = if vault_name.ends_with(VAULT_EXTENSION) {
vault_name.to_string() vault_name.to_string()
} else { } else {
format!("{}{VAULT_EXTENSION}", vault_name) format!("{vault_name}{VAULT_EXTENSION}")
}; };
let vault_directory = get_vaults_directory(app_handle)?; let vault_directory = get_vaults_directory(app_handle)?;
@ -56,13 +101,12 @@ fn get_vault_path(app_handle: &AppHandle, vault_name: &str) -> Result<String, Da
let vault_path = app_handle let vault_path = app_handle
.path() .path()
.resolve( .resolve(
format!("{vault_directory}/{}", vault_file_name), format!("{vault_directory}/{vault_file_name}"),
BaseDirectory::AppLocalData, BaseDirectory::AppLocalData,
) )
.map_err(|e| DatabaseError::PathResolutionError { .map_err(|e| DatabaseError::PathResolutionError {
reason: format!( reason: format!(
"Failed to resolve vault path for '{}': {}", "Failed to resolve vault path for '{vault_file_name}': {e}"
vault_file_name, e
), ),
})?; })?;
@ -70,7 +114,7 @@ fn get_vault_path(app_handle: &AppHandle, vault_name: &str) -> Result<String, Da
if let Some(parent) = vault_path.parent() { if let Some(parent) = vault_path.parent() {
fs::create_dir_all(parent).map_err(|e| DatabaseError::IoError { fs::create_dir_all(parent).map_err(|e| DatabaseError::IoError {
path: parent.display().to_string(), path: parent.display().to_string(),
reason: format!("Failed to create vaults directory: {}", e), reason: format!("Failed to create vaults directory: {e}"),
})?; })?;
} }
@ -90,7 +134,6 @@ pub fn get_vaults_directory(app_handle: &AppHandle) -> Result<String, DatabaseEr
Ok(vaults_dir.to_string_lossy().to_string()) Ok(vaults_dir.to_string_lossy().to_string())
} }
//#[serde(tag = "type", content = "details")]
#[derive(Debug, Serialize, Deserialize, TS)] #[derive(Debug, Serialize, Deserialize, TS)]
#[ts(export)] #[ts(export)]
#[serde(rename_all = "camelCase")] #[serde(rename_all = "camelCase")]
@ -130,18 +173,18 @@ pub fn list_vaults(app_handle: AppHandle) -> Result<Vec<VaultInfo>, DatabaseErro
if let Some(filename) = path.file_name().and_then(|n| n.to_str()) { if let Some(filename) = path.file_name().and_then(|n| n.to_str()) {
if filename.ends_with(VAULT_EXTENSION) { if filename.ends_with(VAULT_EXTENSION) {
// Entferne .db Endung für die Rückgabe // Entferne .db Endung für die Rückgabe
println!("Vault gefunden {}", filename.to_string()); println!("Vault gefunden {filename}");
let metadata = fs::metadata(&path).map_err(|e| DatabaseError::IoError { let metadata = fs::metadata(&path).map_err(|e| DatabaseError::IoError {
path: path.to_string_lossy().to_string(), path: path.to_string_lossy().to_string(),
reason: format!("Metadaten konnten nicht gelesen werden: {}", e), reason: format!("Metadaten konnten nicht gelesen werden: {e}"),
})?; })?;
let last_access_timestamp = metadata let last_access_timestamp = metadata
.accessed() .accessed()
.map_err(|e| DatabaseError::IoError { .map_err(|e| DatabaseError::IoError {
path: path.to_string_lossy().to_string(), path: path.to_string_lossy().to_string(),
reason: format!("Zugriffszeit konnte nicht gelesen werden: {}", e), reason: format!("Zugriffszeit konnte nicht gelesen werden: {e}"),
})? })?
.duration_since(UNIX_EPOCH) .duration_since(UNIX_EPOCH)
.unwrap_or_default() // Fallback für den seltenen Fall einer Zeit vor 1970 .unwrap_or_default() // Fallback für den seltenen Fall einer Zeit vor 1970
@ -164,15 +207,68 @@ pub fn list_vaults(app_handle: AppHandle) -> Result<Vec<VaultInfo>, DatabaseErro
/// Checks if a vault with the given name exists /// Checks if a vault with the given name exists
#[tauri::command] #[tauri::command]
pub fn vault_exists(app_handle: AppHandle, db_name: String) -> Result<bool, DatabaseError> { pub fn vault_exists(app_handle: AppHandle, vault_name: String) -> Result<bool, DatabaseError> {
let vault_path = get_vault_path(&app_handle, &db_name)?; let vault_path = get_vault_path(&app_handle, &vault_name)?;
Ok(Path::new(&vault_path).exists()) Ok(Path::new(&vault_path).exists())
} }
/// Deletes a vault database file /// Moves a vault database file to trash (or deletes permanently if trash is unavailable)
#[tauri::command] #[tauri::command]
pub fn delete_vault(app_handle: AppHandle, db_name: String) -> Result<String, DatabaseError> { pub fn move_vault_to_trash(
let vault_path = get_vault_path(&app_handle, &db_name)?; app_handle: AppHandle,
vault_name: String,
) -> Result<String, DatabaseError> {
// On Android, trash is not available, so delete permanently
#[cfg(target_os = "android")]
{
println!(
"Android platform detected, permanently deleting vault '{}'",
vault_name
);
return delete_vault(app_handle, vault_name);
}
// On non-Android platforms, try to use trash
#[cfg(not(target_os = "android"))]
{
let vault_path = get_vault_path(&app_handle, &vault_name)?;
let vault_shm_path = format!("{vault_path}-shm");
let vault_wal_path = format!("{vault_path}-wal");
if !Path::new(&vault_path).exists() {
return Err(DatabaseError::IoError {
path: vault_path,
reason: "Vault does not exist".to_string(),
});
}
// Try to move to trash first (works on desktop systems)
let moved_to_trash = trash::delete(&vault_path).is_ok();
if moved_to_trash {
// Also try to move auxiliary files to trash (ignore errors as they might not exist)
let _ = trash::delete(&vault_shm_path);
let _ = trash::delete(&vault_wal_path);
Ok(format!(
"Vault '{vault_name}' successfully moved to trash"
))
} else {
// Fallback: Permanent deletion if trash fails
println!(
"Trash not available, falling back to permanent deletion for vault '{vault_name}'"
);
delete_vault(app_handle, vault_name)
}
}
}
/// Deletes a vault database file permanently (bypasses trash)
#[tauri::command]
pub fn delete_vault(app_handle: AppHandle, vault_name: String) -> Result<String, DatabaseError> {
let vault_path = get_vault_path(&app_handle, &vault_name)?;
let vault_shm_path = format!("{vault_path}-shm");
let vault_wal_path = format!("{vault_path}-wal");
if !Path::new(&vault_path).exists() { if !Path::new(&vault_path).exists() {
return Err(DatabaseError::IoError { return Err(DatabaseError::IoError {
@ -181,12 +277,26 @@ pub fn delete_vault(app_handle: AppHandle, db_name: String) -> Result<String, Da
}); });
} }
if Path::new(&vault_shm_path).exists() {
fs::remove_file(&vault_shm_path).map_err(|e| DatabaseError::IoError {
path: vault_shm_path.clone(),
reason: format!("Failed to delete vault: {e}"),
})?;
}
if Path::new(&vault_wal_path).exists() {
fs::remove_file(&vault_wal_path).map_err(|e| DatabaseError::IoError {
path: vault_wal_path.clone(),
reason: format!("Failed to delete vault: {e}"),
})?;
}
fs::remove_file(&vault_path).map_err(|e| DatabaseError::IoError { fs::remove_file(&vault_path).map_err(|e| DatabaseError::IoError {
path: vault_path.clone(), path: vault_path.clone(),
reason: format!("Failed to delete vault: {}", e), reason: format!("Failed to delete vault: {e}"),
})?; })?;
Ok(format!("Vault '{}' successfully deleted", db_name)) Ok(format!("Vault '{vault_name}' successfully deleted"))
} }
#[tauri::command] #[tauri::command]
@ -196,16 +306,16 @@ pub fn create_encrypted_database(
key: String, key: String,
state: State<'_, AppState>, state: State<'_, AppState>,
) -> Result<String, DatabaseError> { ) -> Result<String, DatabaseError> {
println!("Creating encrypted vault with name: {}", vault_name); println!("Creating encrypted vault with name: {vault_name}");
let vault_path = get_vault_path(&app_handle, &vault_name)?; let vault_path = get_vault_path(&app_handle, &vault_name)?;
println!("Resolved vault path: {}", vault_path); println!("Resolved vault path: {vault_path}");
// Prüfen, ob bereits eine Vault mit diesem Namen existiert // Prüfen, ob bereits eine Vault mit diesem Namen existiert
if Path::new(&vault_path).exists() { if Path::new(&vault_path).exists() {
return Err(DatabaseError::IoError { return Err(DatabaseError::IoError {
path: vault_path, path: vault_path,
reason: format!("A vault with the name '{}' already exists", vault_name), reason: format!("A vault with the name '{vault_name}' already exists"),
}); });
} }
/* let resource_path = app_handle /* let resource_path = app_handle
@ -217,7 +327,7 @@ pub fn create_encrypted_database(
.path() .path()
.resolve("database/vault.db", BaseDirectory::Resource) .resolve("database/vault.db", BaseDirectory::Resource)
.map_err(|e| DatabaseError::PathResolutionError { .map_err(|e| DatabaseError::PathResolutionError {
reason: format!("Failed to resolve template database: {}", e), reason: format!("Failed to resolve template database: {e}"),
})?; })?;
let template_content = let template_content =
@ -226,20 +336,20 @@ pub fn create_encrypted_database(
.read(&template_path) .read(&template_path)
.map_err(|e| DatabaseError::IoError { .map_err(|e| DatabaseError::IoError {
path: template_path.display().to_string(), path: template_path.display().to_string(),
reason: format!("Failed to read template database from resources: {}", e), reason: format!("Failed to read template database from resources: {e}"),
})?; })?;
let temp_path = app_handle let temp_path = app_handle
.path() .path()
.resolve("temp_vault.db", BaseDirectory::AppLocalData) .resolve("temp_vault.db", BaseDirectory::AppLocalData)
.map_err(|e| DatabaseError::PathResolutionError { .map_err(|e| DatabaseError::PathResolutionError {
reason: format!("Failed to resolve temp database: {}", e), reason: format!("Failed to resolve temp database: {e}"),
})?; })?;
let temp_path_clone = temp_path.to_owned(); let temp_path_clone = temp_path.to_owned();
fs::write(temp_path, template_content).map_err(|e| DatabaseError::IoError { fs::write(temp_path, template_content).map_err(|e| DatabaseError::IoError {
path: vault_path.to_string(), path: vault_path.to_string(),
reason: format!("Failed to write temporary template database: {}", e), reason: format!("Failed to write temporary template database: {e}"),
})?; })?;
/* if !template_path.exists() { /* if !template_path.exists() {
return Err(DatabaseError::IoError { return Err(DatabaseError::IoError {
@ -252,8 +362,7 @@ pub fn create_encrypted_database(
let conn = Connection::open(&temp_path_clone).map_err(|e| DatabaseError::ConnectionFailed { let conn = Connection::open(&temp_path_clone).map_err(|e| DatabaseError::ConnectionFailed {
path: temp_path_clone.display().to_string(), path: temp_path_clone.display().to_string(),
reason: format!( reason: format!(
"Fehler beim Öffnen der unverschlüsselten Quelldatenbank: {}", "Fehler beim Öffnen der unverschlüsselten Quelldatenbank: {e}"
e
), ),
})?; })?;
@ -281,7 +390,7 @@ pub fn create_encrypted_database(
let _ = fs::remove_file(&vault_path); let _ = fs::remove_file(&vault_path);
let _ = fs::remove_file(&temp_path_clone); let _ = fs::remove_file(&temp_path_clone);
return Err(DatabaseError::QueryError { return Err(DatabaseError::QueryError {
reason: format!("Fehler während sqlcipher_export: {}", e), reason: format!("Fehler während sqlcipher_export: {e}"),
}); });
} }
@ -306,11 +415,11 @@ pub fn create_encrypted_database(
Ok(version) Ok(version)
}) { }) {
Ok(version) => { Ok(version) => {
println!("SQLCipher ist aktiv! Version: {}", version); println!("SQLCipher ist aktiv! Version: {version}");
} }
Err(e) => { Err(e) => {
eprintln!("FEHLER: SQLCipher scheint NICHT aktiv zu sein!"); eprintln!("FEHLER: SQLCipher scheint NICHT aktiv zu sein!");
eprintln!("Der Befehl 'PRAGMA cipher_version;' schlug fehl: {}", e); eprintln!("Der Befehl 'PRAGMA cipher_version;' schlug fehl: {e}");
eprintln!("Die Datenbank wurde wahrscheinlich NICHT verschlüsselt."); eprintln!("Die Datenbank wurde wahrscheinlich NICHT verschlüsselt.");
} }
} }
@ -318,7 +427,7 @@ pub fn create_encrypted_database(
conn.close() conn.close()
.map_err(|(_, e)| DatabaseError::ConnectionFailed { .map_err(|(_, e)| DatabaseError::ConnectionFailed {
path: template_path.display().to_string(), path: template_path.display().to_string(),
reason: format!("Fehler beim Schließen der Quelldatenbank: {}", e), reason: format!("Fehler beim Schließen der Quelldatenbank: {e}"),
})?; })?;
let _ = fs::remove_file(&temp_path_clone); let _ = fs::remove_file(&temp_path_clone);
@ -335,22 +444,19 @@ pub fn open_encrypted_database(
key: String, key: String,
state: State<'_, AppState>, state: State<'_, AppState>,
) -> Result<String, DatabaseError> { ) -> Result<String, DatabaseError> {
println!("Opening encrypted database vault_path: {}", vault_path); println!("Opening encrypted database vault_path: {vault_path}");
println!("Resolved vault path: {vault_path}");
// Vault-Pfad aus dem Namen ableiten
//let vault_path = get_vault_path(&app_handle, &vault_name)?;
println!("Resolved vault path: {}", vault_path);
if !Path::new(&vault_path).exists() { if !Path::new(&vault_path).exists() {
return Err(DatabaseError::IoError { return Err(DatabaseError::IoError {
path: vault_path.to_string(), path: vault_path.to_string(),
reason: format!("Vault '{}' does not exist", vault_path), reason: format!("Vault '{vault_path}' does not exist"),
}); });
} }
initialize_session(&app_handle, &vault_path, &key, &state)?; initialize_session(&app_handle, &vault_path, &key, &state)?;
Ok(format!("Vault '{}' opened successfully", vault_path)) Ok(format!("Vault '{vault_path}' opened successfully"))
} }
/// Opens the DB, initializes the HLC service, and stores both in the AppState. /// Opens the DB, initializes the HLC service, and stores both in the AppState.
@ -361,9 +467,12 @@ fn initialize_session(
state: &State<'_, AppState>, state: &State<'_, AppState>,
) -> Result<(), DatabaseError> { ) -> Result<(), DatabaseError> {
// 1. Establish the raw database connection // 1. Establish the raw database connection
let conn = core::open_and_init_db(path, key, false)?; let mut conn = core::open_and_init_db(path, key, false)?;
// 2. Initialize the HLC service // 2. Ensure CRDT triggers are initialized (for template DB)
let triggers_were_already_initialized = init::ensure_triggers_initialized(&mut conn)?;
// 3. Initialize the HLC service
let hlc_service = HlcService::try_initialize(&conn, app_handle).map_err(|e| { let hlc_service = HlcService::try_initialize(&conn, app_handle).map_err(|e| {
// We convert the HlcError into a DatabaseError // We convert the HlcError into a DatabaseError
DatabaseError::ExecutionError { DatabaseError::ExecutionError {
@ -373,16 +482,52 @@ fn initialize_session(
} }
})?; })?;
// 3. Store everything in the global AppState // 4. Store everything in the global AppState
let mut db_guard = state.db.0.lock().map_err(|e| DatabaseError::LockError { let mut db_guard = state.db.0.lock().map_err(|e| DatabaseError::LockError {
reason: e.to_string(), reason: e.to_string(),
})?; })?;
// Important: we will not need db_guard for much longer,
// because 'execute_with_crdt' calls 'with_connection', which
// has to lock 'state.db' itself.
// We must release the guard *before* calling 'execute_with_crdt'
// to avoid a deadlock.
// But first we have to move 'conn' into it.
*db_guard = Some(conn); *db_guard = Some(conn);
drop(db_guard);
let mut hlc_guard = state.hlc.lock().map_err(|e| DatabaseError::LockError { let mut hlc_guard = state.hlc.lock().map_err(|e| DatabaseError::LockError {
reason: e.to_string(), reason: e.to_string(),
})?; })?;
*hlc_guard = hlc_service; *hlc_guard = hlc_service;
// IMPORTANT: do *not* release hlc_guard, because 'execute_with_crdt'
// expects a reference to the guard.
// 5. NEW STEP: set the flag via CRDT if needed
if !triggers_were_already_initialized {
eprintln!("INFO: Setting 'triggers_initialized' flag via CRDT...");
let insert_sql = format!(
"INSERT INTO {TABLE_SETTINGS} (id, key, type, value) VALUES (?, ?, ?, ?)"
);
// execute_with_crdt expects Vec<JsonValue>, not the params! macro
let params_vec: Vec<JsonValue> = vec![
JsonValue::String(uuid::Uuid::new_v4().to_string()),
JsonValue::String("triggers_initialized".to_string()),
JsonValue::String("system".to_string()),
JsonValue::String("1".to_string()),
];
// Now we can safely call 'execute_with_crdt',
// because the AppState has been initialized.
execute_with_crdt(
insert_sql, params_vec, &state.db, // the &DbConnection (the mutex)
&hlc_guard, // the held MutexGuard
)?;
eprintln!("INFO: ✓ 'triggers_initialized' flag set.");
}
Ok(()) Ok(())
} }

View File

@ -12,7 +12,6 @@ use crate::table_names::{TABLE_EXTENSIONS, TABLE_EXTENSION_PERMISSIONS};
use crate::AppState; use crate::AppState;
use std::collections::HashMap; use std::collections::HashMap;
use std::fs; use std::fs;
use std::io::Cursor;
use std::path::PathBuf; use std::path::PathBuf;
use std::sync::Mutex; use std::sync::Mutex;
use std::time::{Duration, SystemTime}; use std::time::{Duration, SystemTime};
@ -65,60 +64,183 @@ impl ExtensionManager {
Self::default() Self::default()
} }
/// Helper function to validate path and check for path traversal
/// Returns the cleaned path if valid, or None if invalid/not found
/// If require_exists is true, returns None if path doesn't exist
pub fn validate_path_in_directory(
base_dir: &PathBuf,
relative_path: &str,
require_exists: bool,
) -> Result<Option<PathBuf>, ExtensionError> {
// Check for path traversal patterns
if relative_path.contains("..") {
return Err(ExtensionError::SecurityViolation {
reason: format!("Path traversal attempt: {relative_path}"),
});
}
// Clean the path (same logic as in protocol.rs)
let clean_path = relative_path
.replace('\\', "/")
.trim_start_matches('/')
.split('/')
.filter(|&part| !part.is_empty() && part != "." && part != "..")
.collect::<PathBuf>();
let full_path = base_dir.join(&clean_path);
// Check if file/directory exists (if required)
if require_exists && !full_path.exists() {
return Ok(None);
}
// Verify path is within base directory
let canonical_base = base_dir
.canonicalize()
.map_err(|e| ExtensionError::Filesystem { source: e })?;
if let Ok(canonical_path) = full_path.canonicalize() {
if !canonical_path.starts_with(&canonical_base) {
return Err(ExtensionError::SecurityViolation {
reason: format!("Path outside base directory: {relative_path}"),
});
}
Ok(Some(canonical_path))
} else {
// Path doesn't exist yet - still validate it would be within base
if full_path.starts_with(&canonical_base) {
Ok(Some(full_path))
} else {
Err(ExtensionError::SecurityViolation {
reason: format!("Path outside base directory: {relative_path}"),
})
}
}
}
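
A short usage sketch for this helper, assuming base points at an existing extension directory: a traversal attempt fails hard, while a clean relative path that simply does not exist comes back as Ok(None) when require_exists is true.

use std::path::PathBuf;

fn icon_lookup_example(base: &PathBuf) -> Result<(), ExtensionError> {
    // Any path containing ".." is rejected with a SecurityViolation before touching the disk.
    assert!(ExtensionManager::validate_path_in_directory(base, "../outside.txt", true).is_err());

    // A well-formed relative path is cleaned, joined onto base and canonicalized;
    // with require_exists = true a missing file yields Ok(None) instead of an error.
    match ExtensionManager::validate_path_in_directory(base, "public/favicon.ico", true)? {
        Some(path) => println!("icon found at {}", path.display()),
        None => println!("no favicon shipped with this extension"),
    }
    Ok(())
}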
/// Validates icon path and falls back to favicon.ico if not specified
fn validate_and_resolve_icon_path(
extension_dir: &PathBuf,
haextension_dir: &str,
icon_path: Option<&str>,
) -> Result<Option<String>, ExtensionError> {
// If icon is specified in manifest, validate it
if let Some(icon) = icon_path {
if let Some(clean_path) = Self::validate_path_in_directory(extension_dir, icon, true)? {
return Ok(Some(clean_path.to_string_lossy().to_string()));
} else {
eprintln!("WARNING: Icon path specified in manifest not found: {icon}");
// Continue to fallback logic
}
}
// Fallback 1: Check haextension/favicon.ico
let haextension_favicon = format!("{haextension_dir}/favicon.ico");
if let Some(clean_path) = Self::validate_path_in_directory(extension_dir, &haextension_favicon, true)? {
return Ok(Some(clean_path.to_string_lossy().to_string()));
}
// Fallback 2: Check public/favicon.ico
if let Some(clean_path) = Self::validate_path_in_directory(extension_dir, "public/favicon.ico", true)? {
return Ok(Some(clean_path.to_string_lossy().to_string()));
}
// No icon found
Ok(None)
}
/// Extrahiert eine Extension-ZIP-Datei und validiert das Manifest /// Extrahiert eine Extension-ZIP-Datei und validiert das Manifest
fn extract_and_validate_extension( fn extract_and_validate_extension(
bytes: Vec<u8>, bytes: Vec<u8>,
temp_prefix: &str, temp_prefix: &str,
app_handle: &AppHandle,
) -> Result<ExtractedExtension, ExtensionError> { ) -> Result<ExtractedExtension, ExtensionError> {
let temp = std::env::temp_dir().join(format!("{}_{}", temp_prefix, uuid::Uuid::new_v4())); // Use app_cache_dir for better Android compatibility
let cache_dir = app_handle
.path()
.app_cache_dir()
.map_err(|e| ExtensionError::InstallationFailed {
reason: format!("Cannot get app cache dir: {e}"),
})?;
let temp_id = uuid::Uuid::new_v4();
let temp = cache_dir.join(format!("{temp_prefix}_{temp_id}"));
let zip_file_path = cache_dir.join(format!("{}_{}_{}.haextension", temp_prefix, temp_id, "temp"));
// Write bytes to a temporary ZIP file first (important for Android file system)
fs::write(&zip_file_path, &bytes).map_err(|e| {
ExtensionError::filesystem_with_path(zip_file_path.display().to_string(), e)
})?;
// Create extraction directory
fs::create_dir_all(&temp) fs::create_dir_all(&temp)
.map_err(|e| ExtensionError::filesystem_with_path(temp.display().to_string(), e))?; .map_err(|e| ExtensionError::filesystem_with_path(temp.display().to_string(), e))?;
let mut archive = ZipArchive::new(Cursor::new(bytes)).map_err(|e| { // Open ZIP file from disk (more reliable on Android than from memory)
let zip_file = fs::File::open(&zip_file_path).map_err(|e| {
ExtensionError::filesystem_with_path(zip_file_path.display().to_string(), e)
})?;
let mut archive = ZipArchive::new(zip_file).map_err(|e| {
ExtensionError::InstallationFailed { ExtensionError::InstallationFailed {
reason: format!("Invalid ZIP: {}", e), reason: format!("Invalid ZIP: {e}"),
} }
})?; })?;
archive archive
.extract(&temp) .extract(&temp)
.map_err(|e| ExtensionError::InstallationFailed { .map_err(|e| ExtensionError::InstallationFailed {
reason: format!("Cannot extract ZIP: {}", e), reason: format!("Cannot extract ZIP: {e}"),
})?; })?;
// Check if manifest.json is directly in temp or in a subdirectory // Clean up temporary ZIP file
let manifest_path = temp.join("manifest.json"); let _ = fs::remove_file(&zip_file_path);
let actual_dir = if manifest_path.exists() {
temp.clone()
} else {
// manifest.json is in a subdirectory - find it
let mut found_dir = None;
for entry in fs::read_dir(&temp)
.map_err(|e| ExtensionError::filesystem_with_path(temp.display().to_string(), e))?
{
let entry = entry.map_err(|e| ExtensionError::Filesystem { source: e })?;
let path = entry.path();
if path.is_dir() && path.join("manifest.json").exists() {
found_dir = Some(path);
break;
}
}
found_dir.ok_or_else(|| ExtensionError::ManifestError { // Read haextension_dir from config if it exists, otherwise use default
reason: "manifest.json not found in extension archive".to_string(), let config_path = temp.join("haextension.config.json");
})? let haextension_dir = if config_path.exists() {
let config_content = std::fs::read_to_string(&config_path)
.map_err(|e| ExtensionError::ManifestError {
reason: format!("Cannot read haextension.config.json: {e}"),
})?;
let config: serde_json::Value = serde_json::from_str(&config_content)
.map_err(|e| ExtensionError::ManifestError {
reason: format!("Invalid haextension.config.json: {e}"),
})?;
let dir = config
.get("dev")
.and_then(|dev| dev.get("haextension_dir"))
.and_then(|dir| dir.as_str())
.unwrap_or("haextension")
.to_string();
dir
} else {
"haextension".to_string()
}; };
let manifest_path = actual_dir.join("manifest.json"); // Validate manifest path using helper function
let manifest_content = let manifest_relative_path = format!("{haextension_dir}/manifest.json");
std::fs::read_to_string(&manifest_path).map_err(|e| ExtensionError::ManifestError { let manifest_path = Self::validate_path_in_directory(&temp, &manifest_relative_path, true)?
reason: format!("Cannot read manifest: {}", e), .ok_or_else(|| ExtensionError::ManifestError {
reason: format!("manifest.json not found at {haextension_dir}/manifest.json"),
})?; })?;
let manifest: ExtensionManifest = serde_json::from_str(&manifest_content)?; let actual_dir = temp.clone();
let manifest_content =
std::fs::read_to_string(&manifest_path).map_err(|e| ExtensionError::ManifestError {
reason: format!("Cannot read manifest: {e}"),
})?;
let content_hash = ExtensionCrypto::hash_directory(&actual_dir).map_err(|e| { let mut manifest: ExtensionManifest = serde_json::from_str(&manifest_content)?;
// Validate and resolve icon path with fallback logic
let validated_icon = Self::validate_and_resolve_icon_path(&actual_dir, &haextension_dir, manifest.icon.as_deref())?;
manifest.icon = validated_icon;
let content_hash = ExtensionCrypto::hash_directory(&actual_dir, &manifest_path).map_err(|e| {
ExtensionError::SignatureVerificationFailed { ExtensionError::SignatureVerificationFailed {
reason: e.to_string(), reason: e.to_string(),
} }
@ -167,7 +289,6 @@ impl ExtensionManager {
Ok(specific_extension_dir) Ok(specific_extension_dir)
} }
pub fn add_production_extension(&self, extension: Extension) -> Result<(), ExtensionError> { pub fn add_production_extension(&self, extension: Extension) -> Result<(), ExtensionError> {
if extension.id.is_empty() { if extension.id.is_empty() {
return Err(ExtensionError::ValidationError { return Err(ExtensionError::ValidationError {
@ -223,11 +344,12 @@ impl ExtensionManager {
name: &str, name: &str,
) -> Result<Option<(String, Extension)>, ExtensionError> { ) -> Result<Option<(String, Extension)>, ExtensionError> {
// 1. Check dev extensions first (higher priority) // 1. Check dev extensions first (higher priority)
let dev_extensions = self.dev_extensions.lock().map_err(|e| { let dev_extensions =
ExtensionError::MutexPoisoned { self.dev_extensions
reason: e.to_string(), .lock()
} .map_err(|e| ExtensionError::MutexPoisoned {
})?; reason: e.to_string(),
})?;
for (id, ext) in dev_extensions.iter() { for (id, ext) in dev_extensions.iter() {
if ext.manifest.public_key == public_key && ext.manifest.name == name { if ext.manifest.public_key == public_key && ext.manifest.name == name {
@ -236,11 +358,12 @@ impl ExtensionManager {
} }
// 2. Check production extensions // 2. Check production extensions
let prod_extensions = self.production_extensions.lock().map_err(|e| { let prod_extensions =
ExtensionError::MutexPoisoned { self.production_extensions
reason: e.to_string(), .lock()
} .map_err(|e| ExtensionError::MutexPoisoned {
})?; reason: e.to_string(),
})?;
for (id, ext) in prod_extensions.iter() { for (id, ext) in prod_extensions.iter() {
if ext.manifest.public_key == public_key && ext.manifest.name == name { if ext.manifest.public_key == public_key && ext.manifest.name == name {
@ -262,11 +385,7 @@ impl ExtensionManager {
.map(|(_, ext)| ext)) .map(|(_, ext)| ext))
} }
pub fn remove_extension( pub fn remove_extension(&self, public_key: &str, name: &str) -> Result<(), ExtensionError> {
&self,
public_key: &str,
name: &str,
) -> Result<(), ExtensionError> {
let (id, _) = self let (id, _) = self
.find_extension_id_by_public_key_and_name(public_key, name)? .find_extension_id_by_public_key_and_name(public_key, name)?
.ok_or_else(|| ExtensionError::NotFound { .ok_or_else(|| ExtensionError::NotFound {
@ -276,11 +395,12 @@ impl ExtensionManager {
// Remove from dev extensions first // Remove from dev extensions first
{ {
let mut dev_extensions = self.dev_extensions.lock().map_err(|e| { let mut dev_extensions =
ExtensionError::MutexPoisoned { self.dev_extensions
reason: e.to_string(), .lock()
} .map_err(|e| ExtensionError::MutexPoisoned {
})?; reason: e.to_string(),
})?;
if dev_extensions.remove(&id).is_some() { if dev_extensions.remove(&id).is_some() {
return Ok(()); return Ok(());
} }
@ -288,11 +408,12 @@ impl ExtensionManager {
// Remove from production extensions // Remove from production extensions
{ {
let mut prod_extensions = self.production_extensions.lock().map_err(|e| { let mut prod_extensions =
ExtensionError::MutexPoisoned { self.production_extensions
reason: e.to_string(), .lock()
} .map_err(|e| ExtensionError::MutexPoisoned {
})?; reason: e.to_string(),
})?;
prod_extensions.remove(&id); prod_extensions.remove(&id);
} }
@ -315,7 +436,10 @@ impl ExtensionManager {
name: extension_name.to_string(), name: extension_name.to_string(),
})?; })?;
eprintln!("DEBUG: Removing extension with ID: {}", extension.id);
eprintln!(
"DEBUG: Extension name: {extension_name}, version: {extension_version}"
);
// Lösche Permissions und Extension-Eintrag in einer Transaktion // Lösche Permissions und Extension-Eintrag in einer Transaktion
with_connection(&state.db, |conn| { with_connection(&state.db, |conn| {
@ -326,14 +450,15 @@ impl ExtensionManager {
})?; })?;
// Lösche alle Permissions mit extension_id // Lösche alle Permissions mit extension_id
PermissionManager::delete_permissions_in_transaction( eprintln!(
&tx, "DEBUG: Deleting permissions for extension_id: {}",
&hlc_service, extension.id
&extension.id, );
)?; PermissionManager::delete_permissions_in_transaction(&tx, &hlc_service, &extension.id)?;
// Lösche Extension-Eintrag mit extension_id // Lösche Extension-Eintrag mit extension_id
let sql = format!("DELETE FROM {} WHERE id = ?", TABLE_EXTENSIONS); let sql = format!("DELETE FROM {TABLE_EXTENSIONS} WHERE id = ?");
eprintln!("DEBUG: Executing SQL: {} with id = {}", sql, extension.id);
SqlExecutor::execute_internal_typed( SqlExecutor::execute_internal_typed(
&tx, &tx,
&hlc_service, &hlc_service,
@ -341,9 +466,12 @@ impl ExtensionManager {
rusqlite::params![&extension.id], rusqlite::params![&extension.id],
)?; )?;
eprintln!("DEBUG: Committing transaction");
tx.commit().map_err(DatabaseError::from) tx.commit().map_err(DatabaseError::from)
})?; })?;
eprintln!("DEBUG: Transaction committed successfully");
// Entferne aus dem In-Memory-Manager // Entferne aus dem In-Memory-Manager
self.remove_extension(public_key, extension_name)?; self.remove_extension(public_key, extension_name)?;
@ -385,9 +513,10 @@ impl ExtensionManager {
pub async fn preview_extension_internal( pub async fn preview_extension_internal(
&self, &self,
app_handle: &AppHandle,
file_bytes: Vec<u8>, file_bytes: Vec<u8>,
) -> Result<ExtensionPreview, ExtensionError> { ) -> Result<ExtensionPreview, ExtensionError> {
let extracted = Self::extract_and_validate_extension(file_bytes, "haexhub_preview")?; let extracted = Self::extract_and_validate_extension(file_bytes, "haexhub_preview", app_handle)?;
let is_valid_signature = ExtensionCrypto::verify_signature( let is_valid_signature = ExtensionCrypto::verify_signature(
&extracted.manifest.public_key, &extracted.manifest.public_key,
@ -412,7 +541,7 @@ impl ExtensionManager {
custom_permissions: EditablePermissions, custom_permissions: EditablePermissions,
state: &State<'_, AppState>, state: &State<'_, AppState>,
) -> Result<String, ExtensionError> { ) -> Result<String, ExtensionError> {
let extracted = Self::extract_and_validate_extension(file_bytes, "haexhub_ext")?; let extracted = Self::extract_and_validate_extension(file_bytes, "haexhub_ext", &app_handle)?;
// Signatur verifizieren (bei Installation wird ein Fehler geworfen, nicht nur geprüft) // Signatur verifizieren (bei Installation wird ein Fehler geworfen, nicht nur geprüft)
ExtensionCrypto::verify_signature( ExtensionCrypto::verify_signature(
@ -429,6 +558,17 @@ impl ExtensionManager {
&extracted.manifest.version, &extracted.manifest.version,
)?; )?;
// If extension version already exists, remove it completely before installing
if extensions_dir.exists() {
eprintln!(
"Extension version already exists at {}, removing old version",
extensions_dir.display()
);
std::fs::remove_dir_all(&extensions_dir).map_err(|e| {
ExtensionError::filesystem_with_path(extensions_dir.display().to_string(), e)
})?;
}
std::fs::create_dir_all(&extensions_dir).map_err(|e| { std::fs::create_dir_all(&extensions_dir).map_err(|e| {
ExtensionError::filesystem_with_path(extensions_dir.display().to_string(), e) ExtensionError::filesystem_with_path(extensions_dir.display().to_string(), e)
})?; })?;
@ -460,42 +600,44 @@ impl ExtensionManager {
let permissions = custom_permissions.to_internal_permissions(&extension_id); let permissions = custom_permissions.to_internal_permissions(&extension_id);
// Extension-Eintrag und Permissions in einer Transaktion speichern // Extension-Eintrag und Permissions in einer Transaktion speichern
with_connection(&state.db, |conn| { let actual_extension_id = with_connection(&state.db, |conn| {
let tx = conn.transaction().map_err(DatabaseError::from)?; let tx = conn.transaction().map_err(DatabaseError::from)?;
let hlc_service = state.hlc.lock().map_err(|_| DatabaseError::MutexPoisoned { let hlc_service_guard = state.hlc.lock().map_err(|_| DatabaseError::MutexPoisoned {
reason: "Failed to lock HLC service".to_string(), reason: "Failed to lock HLC service".to_string(),
})?; })?;
// Clone so the MutexGuard can be released before potentially long DB operations
let hlc_service = hlc_service_guard.clone();
drop(hlc_service_guard);
// 1. Extension-Eintrag erstellen mit generierter UUID // 1. Extension-Eintrag erstellen mit generierter UUID
let insert_ext_sql = format!( let insert_ext_sql = format!(
"INSERT INTO {} (id, name, version, author, entry, icon, public_key, signature, homepage, description, enabled) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)", "INSERT INTO {TABLE_EXTENSIONS} (id, name, version, author, entry, icon, public_key, signature, homepage, description, enabled, single_instance) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)"
TABLE_EXTENSIONS
); );
SqlExecutor::execute_internal_typed( SqlExecutor::execute_internal_typed(
&tx, &tx,
&hlc_service, &hlc_service,
&insert_ext_sql, &insert_ext_sql,
rusqlite::params![ rusqlite::params![
extension_id, extension_id,
extracted.manifest.name, extracted.manifest.name,
extracted.manifest.version, extracted.manifest.version,
extracted.manifest.author, extracted.manifest.author,
extracted.manifest.entry, extracted.manifest.entry,
extracted.manifest.icon, extracted.manifest.icon,
extracted.manifest.public_key, extracted.manifest.public_key,
extracted.manifest.signature, extracted.manifest.signature,
extracted.manifest.homepage, extracted.manifest.homepage,
extracted.manifest.description, extracted.manifest.description,
true, // enabled true, // enabled
], extracted.manifest.single_instance.unwrap_or(false),
)?; ],
)?;
// 2. Permissions speichern (oder aktualisieren falls schon vorhanden) // 2. Permissions speichern
let insert_perm_sql = format!( let insert_perm_sql = format!(
"INSERT OR REPLACE INTO {} (id, extension_id, resource_type, action, target, constraints, status) VALUES (?, ?, ?, ?, ?, ?, ?)", "INSERT INTO {TABLE_EXTENSION_PERMISSIONS} (id, extension_id, resource_type, action, target, constraints, status) VALUES (?, ?, ?, ?, ?, ?, ?)"
TABLE_EXTENSION_PERMISSIONS
); );
for perm in &permissions { for perm in &permissions {
@ -535,7 +677,7 @@ impl ExtensionManager {
self.add_production_extension(extension)?; self.add_production_extension(extension)?;
Ok(extension_id) Ok(actual_extension_id) // return the actual_extension_id to the caller
} }
/// Scannt das Dateisystem beim Start und lädt alle installierten Erweiterungen. /// Scannt das Dateisystem beim Start und lädt alle installierten Erweiterungen.
@ -544,6 +686,7 @@ impl ExtensionManager {
app_handle: &AppHandle, app_handle: &AppHandle,
state: &State<'_, AppState>, state: &State<'_, AppState>,
) -> Result<Vec<String>, ExtensionError> { ) -> Result<Vec<String>, ExtensionError> {
// Clear existing data
self.production_extensions self.production_extensions
.lock() .lock()
.map_err(|e| ExtensionError::MutexPoisoned { .map_err(|e| ExtensionError::MutexPoisoned {
@ -563,19 +706,20 @@ impl ExtensionManager {
})? })?
.clear(); .clear();
// Step 1: Load all data from the database in one pass. // Load all data from the database
let extensions = with_connection(&state.db, |conn| { let extensions = with_connection(&state.db, |conn| {
let sql = format!( let sql = format!(
"SELECT id, name, version, author, entry, icon, public_key, signature, homepage, description, enabled FROM {}", "SELECT id, name, version, author, entry, icon, public_key, signature, homepage, description, enabled, single_instance FROM {TABLE_EXTENSIONS}"
TABLE_EXTENSIONS );
); eprintln!("DEBUG: SQL Query before transformation: {sql}");
eprintln!("DEBUG: SQL Query before transformation: {}", sql);
let results = SqlExecutor::select_internal(conn, &sql, &[])?; let results = SqlExecutor::query_select(conn, &sql, &[])?;
eprintln!("DEBUG: Query returned {} results", results.len()); eprintln!("DEBUG: Query returned {} results", results.len());
let mut data = Vec::new(); let mut data = Vec::new();
for result in results { for row in results {
let id = result["id"] // We expect the values in the order of the SELECT statement
let id = row[0]
.as_str() .as_str()
.ok_or_else(|| DatabaseError::SerializationError { .ok_or_else(|| DatabaseError::SerializationError {
reason: "Missing id field".to_string(), reason: "Missing id field".to_string(),
@ -583,31 +727,34 @@ impl ExtensionManager {
.to_string(); .to_string();
let manifest = ExtensionManifest { let manifest = ExtensionManifest {
name: result["name"] name: row[1]
.as_str() .as_str()
.ok_or_else(|| DatabaseError::SerializationError { .ok_or_else(|| DatabaseError::SerializationError {
reason: "Missing name field".to_string(), reason: "Missing name field".to_string(),
})? })?
.to_string(), .to_string(),
version: result["version"] version: row[2]
.as_str() .as_str()
.ok_or_else(|| DatabaseError::SerializationError { .ok_or_else(|| DatabaseError::SerializationError {
reason: "Missing version field".to_string(), reason: "Missing version field".to_string(),
})? })?
.to_string(), .to_string(),
author: result["author"].as_str().map(String::from), author: row[3].as_str().map(String::from),
entry: result["entry"].as_str().unwrap_or("index.html").to_string(), entry: row[4].as_str().map(String::from),
icon: result["icon"].as_str().map(String::from), icon: row[5].as_str().map(String::from),
public_key: result["public_key"].as_str().unwrap_or("").to_string(), public_key: row[6].as_str().unwrap_or("").to_string(),
signature: result["signature"].as_str().unwrap_or("").to_string(), signature: row[7].as_str().unwrap_or("").to_string(),
permissions: ExtensionPermissions::default(), permissions: ExtensionPermissions::default(),
homepage: result["homepage"].as_str().map(String::from), homepage: row[8].as_str().map(String::from),
description: result["description"].as_str().map(String::from), description: row[9].as_str().map(String::from),
single_instance: row[11]
.as_bool()
.or_else(|| row[11].as_i64().map(|v| v != 0)),
}; };
let enabled = result["enabled"] let enabled = row[10]
.as_bool() .as_bool()
.or_else(|| result["enabled"].as_i64().map(|v| v != 0)) .or_else(|| row[10].as_i64().map(|v| v != 0))
.unwrap_or(false); .unwrap_or(false);
data.push(ExtensionDataFromDb { data.push(ExtensionDataFromDb {
@ -626,7 +773,7 @@ impl ExtensionManager {
for extension_data in extensions { for extension_data in extensions {
let extension_id = extension_data.id; let extension_id = extension_data.id;
eprintln!("DEBUG: Processing extension: {}", extension_id); eprintln!("DEBUG: Processing extension: {extension_id}");
// Use public_key/name/version path structure // Use public_key/name/version path structure
let extension_path = self.get_extension_dir( let extension_path = self.get_extension_dir(
@ -636,10 +783,10 @@ impl ExtensionManager {
&extension_data.manifest.version, &extension_data.manifest.version,
)?; )?;
if !extension_path.exists() || !extension_path.join("manifest.json").exists() { // Check if extension directory exists
if !extension_path.exists() {
eprintln!( eprintln!(
"DEBUG: Extension files missing for: {} at {:?}", "DEBUG: Extension directory missing for: {extension_id} at {extension_path:?}"
extension_id, extension_path
); );
self.missing_extensions self.missing_extensions
.lock() .lock()
@ -655,10 +802,52 @@ impl ExtensionManager {
continue; continue;
} }
eprintln!( // Read haextension_dir from config if it exists, otherwise use default
"DEBUG: Extension loaded successfully: {}", let config_path = extension_path.join("haextension.config.json");
extension_id let haextension_dir = if config_path.exists() {
); match std::fs::read_to_string(&config_path) {
Ok(config_content) => {
match serde_json::from_str::<serde_json::Value>(&config_content) {
Ok(config) => {
config
.get("dev")
.and_then(|dev| dev.get("haextension_dir"))
.and_then(|dir| dir.as_str())
.unwrap_or("haextension")
.to_string()
}
Err(_) => "haextension".to_string(),
}
}
Err(_) => "haextension".to_string(),
}
} else {
"haextension".to_string()
};
// Validate manifest.json path using helper function
let manifest_relative_path = format!("{haextension_dir}/manifest.json");
if Self::validate_path_in_directory(&extension_path, &manifest_relative_path, true)?
.is_none()
{
eprintln!(
"DEBUG: manifest.json missing or invalid for: {extension_id} at {haextension_dir}/manifest.json"
);
self.missing_extensions
.lock()
.map_err(|e| ExtensionError::MutexPoisoned {
reason: e.to_string(),
})?
.push(MissingExtension {
id: extension_id.clone(),
public_key: extension_data.manifest.public_key.clone(),
name: extension_data.manifest.name.clone(),
version: extension_data.manifest.version.clone(),
});
continue;
}
eprintln!("DEBUG: Extension loaded successfully: {extension_id}");
let extension = Extension { let extension = Extension {
id: extension_id.clone(), id: extension_id.clone(),

View File

@ -57,13 +57,20 @@ pub struct ExtensionManifest {
pub name: String, pub name: String,
pub version: String, pub version: String,
pub author: Option<String>, pub author: Option<String>,
pub entry: String, #[serde(default = "default_entry_value")]
pub entry: Option<String>,
pub icon: Option<String>, pub icon: Option<String>,
pub public_key: String, pub public_key: String,
pub signature: String, pub signature: String,
pub permissions: ExtensionPermissions, pub permissions: ExtensionPermissions,
pub homepage: Option<String>, pub homepage: Option<String>,
pub description: Option<String>, pub description: Option<String>,
#[serde(default)]
pub single_instance: Option<bool>,
}
fn default_entry_value() -> Option<String> {
Some("index.html".to_string())
} }
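
The two serde attributes behave differently: default = "default_entry_value" falls back to Some("index.html") when entry is missing from the manifest, while the plain #[serde(default)] leaves single_instance as None. A stand-alone sketch of that behavior on a hypothetical mirror struct (the real ExtensionManifest has more required fields):

use serde::Deserialize;

fn default_entry_value() -> Option<String> {
    Some("index.html".to_string())
}

#[derive(Debug, Deserialize)]
struct ManifestDefaultsDemo {
    #[serde(default = "default_entry_value")]
    entry: Option<String>,
    #[serde(default)]
    single_instance: Option<bool>,
}

fn main() {
    // Neither field appears in the JSON, so both defaults apply.
    let demo: ManifestDefaultsDemo = serde_json::from_str("{}").unwrap();
    assert_eq!(demo.entry.as_deref(), Some("index.html"));
    assert_eq!(demo.single_instance, None);
}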
impl ExtensionManifest { impl ExtensionManifest {
@ -155,7 +162,6 @@ impl ExtensionPermissions {
.and_then(|c| serde_json::from_value::<PermissionConstraints>(c.clone()).ok()), .and_then(|c| serde_json::from_value::<PermissionConstraints>(c.clone()).ok()),
status: p.status.clone().unwrap_or(PermissionStatus::Ask), status: p.status.clone().unwrap_or(PermissionStatus::Ask),
haex_timestamp: None, haex_timestamp: None,
haex_tombstone: None,
}) })
} }
} }
@ -173,6 +179,8 @@ pub struct ExtensionInfoResponse {
pub description: Option<String>, pub description: Option<String>,
pub homepage: Option<String>, pub homepage: Option<String>,
pub icon: Option<String>, pub icon: Option<String>,
pub entry: Option<String>,
pub single_instance: Option<bool>,
#[serde(skip_serializing_if = "Option::is_none")] #[serde(skip_serializing_if = "Option::is_none")]
pub dev_server_url: Option<String>, pub dev_server_url: Option<String>,
} }
@ -198,6 +206,8 @@ impl ExtensionInfoResponse {
description: extension.manifest.description.clone(), description: extension.manifest.description.clone(),
homepage: extension.manifest.homepage.clone(), homepage: extension.manifest.homepage.clone(),
icon: extension.manifest.icon.clone(), icon: extension.manifest.icon.clone(),
entry: extension.manifest.entry.clone(),
single_instance: extension.manifest.single_instance,
dev_server_url, dev_server_url,
}) })
} }

View File

@ -42,12 +42,12 @@ enum DataProcessingError {
impl fmt::Display for DataProcessingError { impl fmt::Display for DataProcessingError {
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result { fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
match self { match self {
DataProcessingError::HexDecoding(e) => write!(f, "Hex-Dekodierungsfehler: {}", e), DataProcessingError::HexDecoding(e) => write!(f, "Hex-Dekodierungsfehler: {e}"),
DataProcessingError::Utf8Conversion(e) => { DataProcessingError::Utf8Conversion(e) => {
write!(f, "UTF-8-Konvertierungsfehler: {}", e) write!(f, "UTF-8-Konvertierungsfehler: {e}")
} }
DataProcessingError::JsonParsing(e) => write!(f, "JSON-Parsing-Fehler: {}", e), DataProcessingError::JsonParsing(e) => write!(f, "JSON-Parsing-Fehler: {e}"),
DataProcessingError::Custom(msg) => write!(f, "Datenverarbeitungsfehler: {}", msg), DataProcessingError::Custom(msg) => write!(f, "Datenverarbeitungsfehler: {msg}"),
} }
} }
} }
@ -101,7 +101,7 @@ pub fn resolve_secure_extension_asset_path(
.all(|c| c.is_ascii_alphanumeric() || c == '-') .all(|c| c.is_ascii_alphanumeric() || c == '-')
{ {
return Err(ExtensionError::ValidationError { return Err(ExtensionError::ValidationError {
reason: format!("Invalid extension name: {}", extension_name), reason: format!("Invalid extension name: {extension_name}"),
}); });
} }
@ -111,7 +111,7 @@ pub fn resolve_secure_extension_asset_path(
.all(|c| c.is_ascii_alphanumeric() || c == '-' || c == '.') .all(|c| c.is_ascii_alphanumeric() || c == '-' || c == '.')
{ {
return Err(ExtensionError::ValidationError { return Err(ExtensionError::ValidationError {
reason: format!("Invalid extension version: {}", extension_version), reason: format!("Invalid extension version: {extension_version}"),
}); });
} }
@ -146,11 +146,10 @@ pub fn resolve_secure_extension_asset_path(
Ok(canonical_path) Ok(canonical_path)
} else { } else {
eprintln!( eprintln!(
"SECURITY WARNING: Path traversal attempt blocked: {}", "SECURITY WARNING: Path traversal attempt blocked: {requested_asset_path}"
requested_asset_path
); );
Err(ExtensionError::SecurityViolation { Err(ExtensionError::SecurityViolation {
reason: format!("Path traversal attempt: {}", requested_asset_path), reason: format!("Path traversal attempt: {requested_asset_path}"),
}) })
} }
} }
@ -159,11 +158,10 @@ pub fn resolve_secure_extension_asset_path(
Ok(final_path) Ok(final_path)
} else { } else {
eprintln!( eprintln!(
"SECURITY WARNING: Invalid asset path: {}", "SECURITY WARNING: Invalid asset path: {requested_asset_path}"
requested_asset_path
); );
Err(ExtensionError::SecurityViolation { Err(ExtensionError::SecurityViolation {
reason: format!("Invalid asset path: {}", requested_asset_path), reason: format!("Invalid asset path: {requested_asset_path}"),
}) })
} }
} }
@ -184,7 +182,7 @@ pub fn extension_protocol_handler(
// Only allow same-protocol requests or tauri origin // Only allow same-protocol requests or tauri origin
// For null/empty origin (initial load), use wildcard // For null/empty origin (initial load), use wildcard
let protocol_prefix = format!("{}://", EXTENSION_PROTOCOL_NAME); let protocol_prefix = format!("{EXTENSION_PROTOCOL_NAME}://");
let allowed_origin = if origin.starts_with(&protocol_prefix) || origin == get_tauri_origin() { let allowed_origin = if origin.starts_with(&protocol_prefix) || origin == get_tauri_origin() {
origin origin
} else if origin.is_empty() || origin == "null" { } else if origin.is_empty() || origin == "null" {
@ -216,9 +214,9 @@ pub fn extension_protocol_handler(
.and_then(|v| v.to_str().ok()) .and_then(|v| v.to_str().ok())
.unwrap_or(""); .unwrap_or("");
println!("Protokoll Handler für: {}", uri_ref); println!("Protokoll Handler für: {uri_ref}");
println!("Origin: {}", origin); println!("Origin: {origin}");
println!("Referer: {}", referer); println!("Referer: {referer}");
let path_str = uri_ref.path(); let path_str = uri_ref.path();
@ -227,16 +225,16 @@ pub fn extension_protocol_handler(
// - Desktop: haex-extension://<base64>/{assetPath} // - Desktop: haex-extension://<base64>/{assetPath}
// - Android: http://localhost/{base64}/{assetPath} // - Android: http://localhost/{base64}/{assetPath}
let host = uri_ref.host().unwrap_or(""); let host = uri_ref.host().unwrap_or("");
println!("URI Host: {}", host); println!("URI Host: {host}");
let (info, segments_after_version) = if host == "localhost" || host == format!("{}.localhost", EXTENSION_PROTOCOL_NAME).as_str() { let (info, segments_after_version) = if host == "localhost" || host == format!("{EXTENSION_PROTOCOL_NAME}.localhost").as_str() {
// Android format: http://haex-extension.localhost/{base64}/{assetPath} // Android format: http://haex-extension.localhost/{base64}/{assetPath}
// Extract base64 from first path segment // Extract base64 from first path segment
println!("Android format detected: http://{}/...", host); println!("Android format detected: http://{host}/...");
let mut segments_iter = path_str.split('/').filter(|s| !s.is_empty()); let mut segments_iter = path_str.split('/').filter(|s| !s.is_empty());
if let Some(first_segment) = segments_iter.next() { if let Some(first_segment) = segments_iter.next() {
println!("First path segment (base64): {}", first_segment); println!("First path segment (base64): {first_segment}");
match BASE64_STANDARD.decode(first_segment) { match BASE64_STANDARD.decode(first_segment) {
Ok(decoded_bytes) => match String::from_utf8(decoded_bytes) { Ok(decoded_bytes) => match String::from_utf8(decoded_bytes) {
Ok(json_str) => match serde_json::from_str::<ExtensionInfo>(&json_str) { Ok(json_str) => match serde_json::from_str::<ExtensionInfo>(&json_str) {
@ -252,29 +250,29 @@ pub fn extension_protocol_handler(
(info, remaining) (info, remaining)
} }
Err(e) => { Err(e) => {
eprintln!("Failed to parse JSON from base64 path: {}", e); eprintln!("Failed to parse JSON from base64 path: {e}");
return Response::builder() return Response::builder()
.status(400) .status(400)
.header("Access-Control-Allow-Origin", allowed_origin) .header("Access-Control-Allow-Origin", allowed_origin)
.body(Vec::from(format!("Invalid extension info in base64 path: {}", e))) .body(Vec::from(format!("Invalid extension info in base64 path: {e}")))
.map_err(|e| e.into()); .map_err(|e| e.into());
} }
}, },
Err(e) => { Err(e) => {
eprintln!("Failed to decode UTF-8 from base64 path: {}", e); eprintln!("Failed to decode UTF-8 from base64 path: {e}");
return Response::builder() return Response::builder()
.status(400) .status(400)
.header("Access-Control-Allow-Origin", allowed_origin) .header("Access-Control-Allow-Origin", allowed_origin)
.body(Vec::from(format!("Invalid UTF-8 in base64 path: {}", e))) .body(Vec::from(format!("Invalid UTF-8 in base64 path: {e}")))
.map_err(|e| e.into()); .map_err(|e| e.into());
} }
}, },
Err(e) => { Err(e) => {
eprintln!("Failed to decode base64 from path: {}", e); eprintln!("Failed to decode base64 from path: {e}");
return Response::builder() return Response::builder()
.status(400) .status(400)
.header("Access-Control-Allow-Origin", allowed_origin) .header("Access-Control-Allow-Origin", allowed_origin)
.body(Vec::from(format!("Invalid base64 in path: {}", e))) .body(Vec::from(format!("Invalid base64 in path: {e}")))
.map_err(|e| e.into()); .map_err(|e| e.into());
} }
} }
@ -311,35 +309,35 @@ pub fn extension_protocol_handler(
(info, segments) (info, segments)
} }
Err(e) => { Err(e) => {
eprintln!("Failed to parse JSON from base64 host: {}", e); eprintln!("Failed to parse JSON from base64 host: {e}");
return Response::builder() return Response::builder()
.status(400) .status(400)
.header("Access-Control-Allow-Origin", allowed_origin) .header("Access-Control-Allow-Origin", allowed_origin)
.body(Vec::from(format!("Invalid extension info in base64 host: {}", e))) .body(Vec::from(format!("Invalid extension info in base64 host: {e}")))
.map_err(|e| e.into()); .map_err(|e| e.into());
} }
}, },
Err(e) => { Err(e) => {
eprintln!("Failed to decode UTF-8 from base64 host: {}", e); eprintln!("Failed to decode UTF-8 from base64 host: {e}");
return Response::builder() return Response::builder()
.status(400) .status(400)
.header("Access-Control-Allow-Origin", allowed_origin) .header("Access-Control-Allow-Origin", allowed_origin)
.body(Vec::from(format!("Invalid UTF-8 in base64 host: {}", e))) .body(Vec::from(format!("Invalid UTF-8 in base64 host: {e}")))
.map_err(|e| e.into()); .map_err(|e| e.into());
} }
}, },
Err(e) => { Err(e) => {
eprintln!("Failed to decode base64 host: {}", e); eprintln!("Failed to decode base64 host: {e}");
return Response::builder() return Response::builder()
.status(400) .status(400)
.header("Access-Control-Allow-Origin", allowed_origin) .header("Access-Control-Allow-Origin", allowed_origin)
.body(Vec::from(format!("Invalid base64 in host: {}", e))) .body(Vec::from(format!("Invalid base64 in host: {e}")))
.map_err(|e| e.into()); .map_err(|e| e.into());
} }
} }
} else { } else {
// No base64 host - use path-based parsing (for localhost/Android/Windows) // No base64 host - use path-based parsing (for localhost/Android/Windows)
parse_extension_info_from_path(path_str, origin, uri_ref, referer, &allowed_origin)? parse_extension_info_from_path(path_str, origin, uri_ref, referer)?
}; };
// Construct asset path from remaining segments // Construct asset path from remaining segments
@ -353,8 +351,8 @@ pub fn extension_protocol_handler(
&raw_asset_path &raw_asset_path
}; };
println!("Path: {}", path_str); println!("Path: {path_str}");
println!("Asset to load: {}", asset_to_load); println!("Asset to load: {asset_to_load}");
let absolute_secure_path = resolve_secure_extension_asset_path( let absolute_secure_path = resolve_secure_extension_asset_path(
app_handle, app_handle,
@ -362,7 +360,7 @@ pub fn extension_protocol_handler(
&info.public_key, &info.public_key,
&info.name, &info.name,
&info.version, &info.version,
&asset_to_load, asset_to_load,
)?; )?;
println!("Resolved path: {}", absolute_secure_path.display()); println!("Resolved path: {}", absolute_secure_path.display());
@ -497,7 +495,7 @@ fn parse_encoded_info_from_origin_or_uri_or_referer_or_cache(
if let Ok(hex) = parse_from_origin(origin) { if let Ok(hex) = parse_from_origin(origin) {
if let Ok(info) = process_hex_encoded_json(&hex) { if let Ok(info) = process_hex_encoded_json(&hex) {
cache_extension_info(&info); // Cache setzen cache_extension_info(&info); // Cache setzen
println!("Parsed und gecached aus Origin: {}", hex); println!("Parsed und gecached aus Origin: {hex}");
return Ok(info); return Ok(info);
} }
} }
@ -507,17 +505,17 @@ fn parse_encoded_info_from_origin_or_uri_or_referer_or_cache(
if let Ok(hex) = parse_from_uri_path(uri_ref) { if let Ok(hex) = parse_from_uri_path(uri_ref) {
if let Ok(info) = process_hex_encoded_json(&hex) { if let Ok(info) = process_hex_encoded_json(&hex) {
cache_extension_info(&info); // Cache setzen cache_extension_info(&info); // Cache setzen
println!("Parsed und gecached aus URI: {}", hex); println!("Parsed und gecached aus URI: {hex}");
return Ok(info); return Ok(info);
} }
} }
println!("Fallback zu Referer-Parsing: {}", referer); println!("Fallback zu Referer-Parsing: {referer}");
if !referer.is_empty() && referer != "null" { if !referer.is_empty() && referer != "null" {
if let Ok(hex) = parse_from_uri_string(referer) { if let Ok(hex) = parse_from_uri_string(referer) {
if let Ok(info) = process_hex_encoded_json(&hex) { if let Ok(info) = process_hex_encoded_json(&hex) {
cache_extension_info(&info); // Cache setzen cache_extension_info(&info); // Cache setzen
println!("Parsed und gecached aus Referer: {}", hex); println!("Parsed und gecached aus Referer: {hex}");
return Ok(info); return Ok(info);
} }
} }
@ -609,29 +607,23 @@ fn validate_and_return_hex(segment: &str) -> Result<String, DataProcessingError>
Ok(segment.to_string()) Ok(segment.to_string())
} }
fn encode_hex_for_log(info: &ExtensionInfo) -> String {
let json_str = serde_json::to_string(info).unwrap_or_default();
hex::encode(json_str.as_bytes())
}
// Helper function to parse extension info from path segments // Helper function to parse extension info from path segments
fn parse_extension_info_from_path( fn parse_extension_info_from_path(
path_str: &str, path_str: &str,
origin: &str, origin: &str,
uri_ref: &Uri, uri_ref: &Uri,
referer: &str, referer: &str,
allowed_origin: &str,
) -> Result<(ExtensionInfo, Vec<String>), Box<dyn std::error::Error>> { ) -> Result<(ExtensionInfo, Vec<String>), Box<dyn std::error::Error>> {
let mut segments_iter = path_str.split('/').filter(|s| !s.is_empty()); let mut segments_iter = path_str.split('/').filter(|s| !s.is_empty());
match (segments_iter.next(), segments_iter.next(), segments_iter.next()) { match (segments_iter.next(), segments_iter.next(), segments_iter.next()) {
(Some(public_key), Some(name), Some(version)) => { (Some(public_key), Some(name), Some(version)) => {
println!("=== Extension Protocol Handler (path-based) ==="); println!("=== Extension Protocol Handler (path-based) ===");
println!("Full URI: {}", uri_ref); println!("Full URI: {uri_ref}");
println!("Parsed from path segments:"); println!("Parsed from path segments:");
println!(" PublicKey: {}", public_key); println!(" PublicKey: {public_key}");
println!(" Name: {}", name); println!(" Name: {name}");
println!(" Version: {}", version); println!(" Version: {version}");
let info = ExtensionInfo { let info = ExtensionInfo {
public_key: public_key.to_string(), public_key: public_key.to_string(),
@ -653,7 +645,7 @@ fn parse_extension_info_from_path(
) { ) {
Ok(decoded) => { Ok(decoded) => {
println!("=== Extension Protocol Handler (legacy hex format) ==="); println!("=== Extension Protocol Handler (legacy hex format) ===");
println!("Full URI: {}", uri_ref); println!("Full URI: {uri_ref}");
println!("Decoded info:"); println!("Decoded info:");
println!(" PublicKey: {}", decoded.public_key); println!(" PublicKey: {}", decoded.public_key);
println!(" Name: {}", decoded.name); println!(" Name: {}", decoded.name);
@ -670,8 +662,8 @@ fn parse_extension_info_from_path(
Ok((decoded, segments)) Ok((decoded, segments))
} }
Err(e) => { Err(e) => {
eprintln!("Fehler beim Parsen (alle Fallbacks): {}", e); eprintln!("Fehler beim Parsen (alle Fallbacks): {e}");
Err(format!("Ungültige Anfrage: {}", e).into()) Err(format!("Ungültige Anfrage: {e}").into())
} }
} }
} }
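The handler above unpacks extension metadata that the webview encodes into the URL as base64(JSON), answering with a 400 at whichever decode layer fails. A minimal sketch of that three-step chain, assuming an ExtensionInfo struct with the public_key, name and version fields visible in this diff (struct and error handling are simplified, not the crate's real types):

use base64::prelude::*;
use serde::Deserialize;

// Assumed shape; the real struct lives in the extension module.
#[derive(Deserialize, Debug)]
struct ExtensionInfo {
    public_key: String,
    name: String,
    version: String,
}

fn decode_extension_info(segment: &str) -> Result<ExtensionInfo, String> {
    // base64 -> bytes
    let bytes = BASE64_STANDARD
        .decode(segment)
        .map_err(|e| format!("Invalid base64 in path: {e}"))?;
    // bytes -> UTF-8 string
    let json = String::from_utf8(bytes)
        .map_err(|e| format!("Invalid UTF-8 in base64 path: {e}"))?;
    // string -> JSON struct
    serde_json::from_str(&json)
        .map_err(|e| format!("Invalid extension info in base64 path: {e}"))
}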

View File

@ -70,8 +70,7 @@ pub fn copy_directory(
use std::path::PathBuf; use std::path::PathBuf;
println!( println!(
"Kopiere Verzeichnis von '{}' nach '{}'", "Kopiere Verzeichnis von '{source}' nach '{destination}'"
source, destination
); );
let source_path = PathBuf::from(&source); let source_path = PathBuf::from(&source);
@ -81,7 +80,7 @@ pub fn copy_directory(
return Err(ExtensionError::Filesystem { return Err(ExtensionError::Filesystem {
source: std::io::Error::new( source: std::io::Error::new(
std::io::ErrorKind::NotFound, std::io::ErrorKind::NotFound,
format!("Source directory '{}' not found", source), format!("Source directory '{source}' not found"),
), ),
}); });
} }
@ -93,7 +92,7 @@ pub fn copy_directory(
fs_extra::dir::copy(&source_path, &destination_path, &options).map_err(|e| { fs_extra::dir::copy(&source_path, &destination_path, &options).map_err(|e| {
ExtensionError::Filesystem { ExtensionError::Filesystem {
source: std::io::Error::new(std::io::ErrorKind::Other, e.to_string()), source: std::io::Error::other(e.to_string()),
} }
})?; })?;
Ok(()) Ok(())
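The hunk above swaps io::Error::new(ErrorKind::Other, ...) for io::Error::other(...), available since Rust 1.74. A rough sketch of the same helper shape using fs_extra's dir::copy, with the crate's ExtensionError replaced by plain io::Error for brevity:

use fs_extra::dir::{copy, CopyOptions};
use std::{io, path::Path};

fn copy_directory(source: &str, destination: &str) -> io::Result<u64> {
    let source_path = Path::new(source);
    if !source_path.exists() {
        return Err(io::Error::new(
            io::ErrorKind::NotFound,
            format!("Source directory '{source}' not found"),
        ));
    }
    // Default options; tweak overwrite/copy_inside as needed for the real use case.
    let options = CopyOptions::new();
    // Map fs_extra's error into io::Error, mirroring the io::Error::other call above.
    copy(source_path, destination, &options).map_err(|e| io::Error::other(e.to_string()))
}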

View File

@ -4,28 +4,13 @@ use std::{
}; };
// src-tauri/src/extension/crypto.rs // src-tauri/src/extension/crypto.rs
use crate::extension::error::ExtensionError;
use ed25519_dalek::{Signature, Verifier, VerifyingKey}; use ed25519_dalek::{Signature, Verifier, VerifyingKey};
use sha2::{Digest, Sha256}; use sha2::{Digest, Sha256};
pub struct ExtensionCrypto; pub struct ExtensionCrypto;
impl ExtensionCrypto { impl ExtensionCrypto {
/// Berechnet Hash vom Public Key (wie im SDK)
pub fn calculate_key_hash(public_key_hex: &str) -> Result<String, String> {
let public_key_bytes =
hex::decode(public_key_hex).map_err(|e| format!("Invalid public key hex: {}", e))?;
let public_key = VerifyingKey::from_bytes(&public_key_bytes.try_into().unwrap())
.map_err(|e| format!("Invalid public key: {}", e))?;
let mut hasher = Sha256::new();
hasher.update(public_key.as_bytes());
let result = hasher.finalize();
// Ersten 20 Hex-Zeichen (10 Bytes) - wie im SDK
Ok(hex::encode(&result[..10]))
}
/// Verifiziert Extension-Signatur /// Verifiziert Extension-Signatur
pub fn verify_signature( pub fn verify_signature(
public_key_hex: &str, public_key_hex: &str,
@ -33,43 +18,81 @@ impl ExtensionCrypto {
signature_hex: &str, signature_hex: &str,
) -> Result<(), String> { ) -> Result<(), String> {
let public_key_bytes = let public_key_bytes =
hex::decode(public_key_hex).map_err(|e| format!("Invalid public key: {}", e))?; hex::decode(public_key_hex).map_err(|e| format!("Invalid public key: {e}"))?;
let public_key = VerifyingKey::from_bytes(&public_key_bytes.try_into().unwrap()) let public_key = VerifyingKey::from_bytes(&public_key_bytes.try_into().unwrap())
.map_err(|e| format!("Invalid public key: {}", e))?; .map_err(|e| format!("Invalid public key: {e}"))?;
let signature_bytes = let signature_bytes =
hex::decode(signature_hex).map_err(|e| format!("Invalid signature: {}", e))?; hex::decode(signature_hex).map_err(|e| format!("Invalid signature: {e}"))?;
let signature = Signature::from_bytes(&signature_bytes.try_into().unwrap()); let signature = Signature::from_bytes(&signature_bytes.try_into().unwrap());
let content_hash = let content_hash =
hex::decode(content_hash_hex).map_err(|e| format!("Invalid content hash: {}", e))?; hex::decode(content_hash_hex).map_err(|e| format!("Invalid content hash: {e}"))?;
public_key public_key
.verify(&content_hash, &signature) .verify(&content_hash, &signature)
.map_err(|e| format!("Signature verification failed: {}", e)) .map_err(|e| format!("Signature verification failed: {e}"))
} }
/// Berechnet Hash eines Verzeichnisses (für Verifikation) /// Berechnet Hash eines Verzeichnisses (für Verifikation)
pub fn hash_directory(dir: &Path) -> Result<String, String> { pub fn hash_directory(dir: &Path, manifest_path: &Path) -> Result<String, ExtensionError> {
// 1. Alle Dateipfade rekursiv sammeln // 1. Alle Dateipfade rekursiv sammeln
let mut all_files = Vec::new(); let mut all_files = Vec::new();
Self::collect_files_recursively(dir, &mut all_files) Self::collect_files_recursively(dir, &mut all_files)
.map_err(|e| format!("Failed to collect files: {}", e))?; .map_err(|e| ExtensionError::Filesystem { source: e })?;
all_files.sort();
// 2. Konvertiere zu relativen Pfaden für konsistente Sortierung (wie im SDK)
let mut relative_files: Vec<(String, PathBuf)> = all_files
.into_iter()
.map(|path| {
let relative = path.strip_prefix(dir)
.unwrap_or(&path)
.to_string_lossy()
.to_string()
// Normalisiere Pfad-Separatoren zu Unix-Style (/) für plattformübergreifende Konsistenz
.replace('\\', "/");
(relative, path)
})
.collect();
// 3. Sortiere nach relativen Pfaden
relative_files.sort_by(|a, b| a.0.cmp(&b.0));
let mut hasher = Sha256::new(); let mut hasher = Sha256::new();
let manifest_path = dir.join("manifest.json");
// 2. Inhalte der sortierten Dateien hashen // Canonicalize manifest path for comparison (important on Android where symlinks may differ)
for file_path in all_files { // Also ensure the canonical path is still within the allowed directory (security check)
if file_path == manifest_path { let canonical_manifest_path = manifest_path.canonicalize()
.unwrap_or_else(|_| manifest_path.to_path_buf());
// Security: Verify canonical manifest path is still within dir
let canonical_dir = dir.canonicalize()
.unwrap_or_else(|_| dir.to_path_buf());
if !canonical_manifest_path.starts_with(&canonical_dir) {
return Err(ExtensionError::ManifestError {
reason: "Manifest path resolves outside of extension directory (potential path traversal)".to_string(),
});
}
// 4. Inhalte der sortierten Dateien hashen
for (_relative, file_path) in relative_files {
// Canonicalize file_path for comparison
let canonical_file_path = file_path.canonicalize()
.unwrap_or_else(|_| file_path.clone());
if canonical_file_path == canonical_manifest_path {
// FÜR DIE MANIFEST.JSON: // FÜR DIE MANIFEST.JSON:
let content_str = fs::read_to_string(&file_path) let content_str = fs::read_to_string(&file_path)
.map_err(|e| format!("Cannot read manifest file: {}", e))?; .map_err(|e| ExtensionError::Filesystem { source: e })?;
// Parse zu einem generischen JSON-Wert // Parse zu einem generischen JSON-Wert
let mut manifest: serde_json::Value = serde_json::from_str(&content_str) let mut manifest: serde_json::Value =
.map_err(|e| format!("Cannot parse manifest JSON: {}", e))?; serde_json::from_str(&content_str).map_err(|e| {
ExtensionError::ManifestError {
reason: format!("Cannot parse manifest JSON: {e}"),
}
})?;
// Entferne oder leere das Signaturfeld, um den "kanonischen Inhalt" zu erhalten // Entferne oder leere das Signaturfeld, um den "kanonischen Inhalt" zu erhalten
if let Some(obj) = manifest.as_object_mut() { if let Some(obj) = manifest.as_object_mut() {
@ -80,13 +103,23 @@ impl ExtensionCrypto {
} }
// Serialisiere das modifizierte Manifest zurück (mit 2 Spaces, wie in JS) // Serialisiere das modifizierte Manifest zurück (mit 2 Spaces, wie in JS)
let canonical_manifest_content = serde_json::to_string_pretty(&manifest).unwrap(); // serde_json sortiert die Keys automatisch alphabetisch
println!("canonical_manifest_content: {}", canonical_manifest_content); let canonical_manifest_content =
hasher.update(canonical_manifest_content.as_bytes()); serde_json::to_string_pretty(&manifest).map_err(|e| {
ExtensionError::ManifestError {
reason: format!("Failed to serialize manifest: {e}"),
}
})?;
// Normalisiere Zeilenenden zu Unix-Style (\n), wie Node.js JSON.stringify es macht
// Dies ist wichtig für plattformübergreifende Konsistenz (Desktop vs Android)
let normalized_content = canonical_manifest_content.replace("\r\n", "\n");
hasher.update(normalized_content.as_bytes());
} else { } else {
// FÜR ALLE ANDEREN DATEIEN: // FÜR ALLE ANDEREN DATEIEN:
let content = fs::read(&file_path) let content =
.map_err(|e| format!("Cannot read file {}: {}", file_path.display(), e))?; fs::read(&file_path).map_err(|e| ExtensionError::Filesystem { source: e })?;
hasher.update(&content); hasher.update(&content);
} }
} }
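The reworked hash_directory sorts files by their relative, slash-normalized paths and normalizes the manifest's line endings so that desktop and Android agree on the digest. A stripped-down sketch of the sorting-plus-hashing core, without the manifest special-casing and canonicalization checks shown above:

use sha2::{Digest, Sha256};
use std::{fs, io, path::{Path, PathBuf}};

fn collect_files(dir: &Path, out: &mut Vec<PathBuf>) -> io::Result<()> {
    for entry in fs::read_dir(dir)? {
        let path = entry?.path();
        if path.is_dir() {
            collect_files(&path, out)?;
        } else {
            out.push(path);
        }
    }
    Ok(())
}

fn hash_directory_simple(dir: &Path) -> io::Result<String> {
    let mut files = Vec::new();
    collect_files(dir, &mut files)?;
    // Sort by relative path with Unix-style separators so the order is platform-independent.
    files.sort_by_key(|p| {
        p.strip_prefix(dir)
            .unwrap_or(p)
            .to_string_lossy()
            .replace('\\', "/")
    });
    let mut hasher = Sha256::new();
    for file in &files {
        hasher.update(fs::read(file)?);
    }
    Ok(hex::encode(hasher.finalize()))
}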

View File

@ -1,11 +1,11 @@
// src-tauri/src/extension/database/executor.rs (neu) // src-tauri/src/extension/database/executor.rs
use crate::crdt::hlc::HlcService; use crate::crdt::hlc::HlcService;
use crate::crdt::transformer::CrdtTransformer; use crate::crdt::transformer::CrdtTransformer;
use crate::crdt::trigger; use crate::crdt::trigger;
use crate::database::core::{parse_sql_statements, ValueConverter}; use crate::database::core::{convert_value_ref_to_json, parse_sql_statements};
use crate::database::error::DatabaseError; use crate::database::error::DatabaseError;
use rusqlite::{params_from_iter, Params, Transaction}; use rusqlite::{params_from_iter, types::Value as SqliteValue, ToSql, Transaction};
use serde_json::Value as JsonValue; use serde_json::Value as JsonValue;
use sqlparser::ast::Statement; use sqlparser::ast::Statement;
use std::collections::HashSet; use std::collections::HashSet;
@ -14,27 +14,25 @@ use std::collections::HashSet;
pub struct SqlExecutor; pub struct SqlExecutor;
impl SqlExecutor { impl SqlExecutor {
pub fn execute_internal_typed<P>( /// Führt ein SQL Statement OHNE RETURNING aus (mit CRDT)
/// Returns: modified_schema_tables
pub fn execute_internal_typed(
tx: &Transaction, tx: &Transaction,
hlc_service: &HlcService, hlc_service: &HlcService,
sql: &str, sql: &str,
params: P, // Akzeptiert jetzt alles, was rusqlite als Parameter versteht params: &[&dyn ToSql],
) -> Result<HashSet<String>, DatabaseError> ) -> Result<HashSet<String>, DatabaseError> {
where
P: Params,
{
let mut ast_vec = parse_sql_statements(sql)?; let mut ast_vec = parse_sql_statements(sql)?;
// Wir stellen sicher, dass wir nur EIN Statement verarbeiten. Das ist sicherer.
if ast_vec.len() != 1 { if ast_vec.len() != 1 {
return Err(DatabaseError::ExecutionError { return Err(DatabaseError::ExecutionError {
sql: sql.to_string(), sql: sql.to_string(),
reason: "execute_internal_typed sollte nur ein einzelnes SQL-Statement erhalten" reason: "execute_internal_typed should only receive a single SQL statement"
.to_string(), .to_string(),
table: None, table: None,
}); });
} }
// Wir nehmen das einzige Statement aus dem Vektor.
let mut statement = ast_vec.pop().unwrap(); let mut statement = ast_vec.pop().unwrap();
let transformer = CrdtTransformer::new(); let transformer = CrdtTransformer::new();
@ -46,53 +44,61 @@ impl SqlExecutor {
})?; })?;
let mut modified_schema_tables = HashSet::new(); let mut modified_schema_tables = HashSet::new();
if let Some(table_name) = if let Some(table_name) = transformer.transform_execute_statement_with_table_info(
transformer.transform_execute_statement(&mut statement, &hlc_timestamp)? &mut statement,
{ &hlc_timestamp,
)? {
modified_schema_tables.insert(table_name); modified_schema_tables.insert(table_name);
} }
// Führe das transformierte Statement aus.
// `params` wird jetzt nur noch einmal hierher bewegt, was korrekt ist.
let sql_str = statement.to_string(); let sql_str = statement.to_string();
eprintln!("DEBUG: Transformed execute SQL: {sql_str}");
// Führe Statement aus
tx.execute(&sql_str, params) tx.execute(&sql_str, params)
.map_err(|e| DatabaseError::ExecutionError { .map_err(|e| DatabaseError::ExecutionError {
sql: sql_str.clone(), sql: sql_str.clone(),
table: None, table: None,
reason: e.to_string(), reason: format!("Execute failed: {e}"),
})?; })?;
// Die Trigger-Logik für CREATE TABLE bleibt erhalten. // Trigger-Logik für CREATE TABLE
if let Statement::CreateTable(create_table_details) = statement { if let Statement::CreateTable(create_table_details) = statement {
let table_name_str = create_table_details.name.to_string(); let raw_name = create_table_details.name.to_string();
// Remove quotes from table name
let table_name_str = raw_name
.trim_matches('"')
.trim_matches('`')
.to_string();
eprintln!("DEBUG: Setting up triggers for table: {table_name_str}");
trigger::setup_triggers_for_table(tx, &table_name_str, false)?; trigger::setup_triggers_for_table(tx, &table_name_str, false)?;
} }
Ok(modified_schema_tables) Ok(modified_schema_tables)
} }
/// Führt SQL aus (mit CRDT-Transformation) - OHNE Permission-Check
pub fn execute_internal( /// Führt ein SQL Statement MIT RETURNING aus (mit CRDT)
/// Returns: (modified_schema_tables, returning_results)
pub fn query_internal_typed(
tx: &Transaction, tx: &Transaction,
hlc_service: &HlcService, hlc_service: &HlcService,
sql: &str, sql: &str,
params: &[JsonValue], params: &[&dyn ToSql],
) -> Result<HashSet<String>, DatabaseError> { ) -> Result<(HashSet<String>, Vec<Vec<JsonValue>>), DatabaseError> {
// Parameter validation let mut ast_vec = parse_sql_statements(sql)?;
let total_placeholders = sql.matches('?').count();
if total_placeholders != params.len() { if ast_vec.len() != 1 {
return Err(DatabaseError::ParameterMismatchError { return Err(DatabaseError::ExecutionError {
expected: total_placeholders,
provided: params.len(),
sql: sql.to_string(), sql: sql.to_string(),
reason: "query_internal_typed should only receive a single SQL statement"
.to_string(),
table: None,
}); });
} }
// SQL parsing let mut statement = ast_vec.pop().unwrap();
let mut ast_vec = parse_sql_statements(sql)?;
let transformer = CrdtTransformer::new(); let transformer = CrdtTransformer::new();
// Generate HLC timestamp
let hlc_timestamp = let hlc_timestamp =
hlc_service hlc_service
.new_timestamp_and_persist(tx) .new_timestamp_and_persist(tx)
@ -100,110 +106,179 @@ impl SqlExecutor {
reason: e.to_string(), reason: e.to_string(),
})?; })?;
// Transform statements
let mut modified_schema_tables = HashSet::new(); let mut modified_schema_tables = HashSet::new();
for statement in &mut ast_vec { if let Some(table_name) = transformer.transform_execute_statement_with_table_info(
if let Some(table_name) = &mut statement,
transformer.transform_execute_statement(statement, &hlc_timestamp)? &hlc_timestamp,
{ )? {
modified_schema_tables.insert(table_name); modified_schema_tables.insert(table_name);
}
} }
// Convert parameters let sql_str = statement.to_string();
let sql_values = ValueConverter::convert_params(params)?; eprintln!("DEBUG: Transformed SQL (with RETURNING): {sql_str}");
// Execute statements // Prepare und query ausführen
for statement in ast_vec { let mut stmt = tx
let sql_str = statement.to_string(); .prepare(&sql_str)
.map_err(|e| DatabaseError::ExecutionError {
sql: sql_str.clone(),
table: None,
reason: e.to_string(),
})?;
tx.execute(&sql_str, params_from_iter(sql_values.iter())) let column_names: Vec<String> = stmt
.map_err(|e| DatabaseError::ExecutionError {
sql: sql_str.clone(),
table: None,
reason: e.to_string(),
})?;
if let Statement::CreateTable(create_table_details) = statement {
let table_name_str = create_table_details.name.to_string();
trigger::setup_triggers_for_table(tx, &table_name_str, false)?;
}
}
Ok(modified_schema_tables)
}
/// Führt SELECT aus (mit CRDT-Transformation) - OHNE Permission-Check
pub fn select_internal(
conn: &rusqlite::Connection,
sql: &str,
params: &[JsonValue],
) -> Result<Vec<JsonValue>, DatabaseError> {
// Parameter validation
let total_placeholders = sql.matches('?').count();
if total_placeholders != params.len() {
return Err(DatabaseError::ParameterMismatchError {
expected: total_placeholders,
provided: params.len(),
sql: sql.to_string(),
});
}
let mut ast_vec = parse_sql_statements(sql)?;
if ast_vec.is_empty() {
return Ok(vec![]);
}
// Validate that all statements are queries
for stmt in &ast_vec {
if !matches!(stmt, Statement::Query(_)) {
return Err(DatabaseError::ExecutionError {
sql: sql.to_string(),
reason: "Only SELECT statements are allowed".to_string(),
table: None,
});
}
}
let sql_params = ValueConverter::convert_params(params)?;
let transformer = CrdtTransformer::new();
let last_statement = ast_vec.pop().unwrap();
let mut stmt_to_execute = last_statement;
transformer.transform_select_statement(&mut stmt_to_execute)?;
let transformed_sql = stmt_to_execute.to_string();
let mut prepared_stmt =
conn.prepare(&transformed_sql)
.map_err(|e| DatabaseError::ExecutionError {
sql: transformed_sql.clone(),
reason: e.to_string(),
table: None,
})?;
let column_names: Vec<String> = prepared_stmt
.column_names() .column_names()
.into_iter() .into_iter()
.map(|s| s.to_string()) .map(|s| s.to_string())
.collect(); .collect();
let num_columns = column_names.len();
let rows = prepared_stmt let mut rows = stmt
.query_map(params_from_iter(sql_params.iter()), |row| { .query(params_from_iter(params.iter()))
crate::extension::database::row_to_json_value(row, &column_names) .map_err(|e| DatabaseError::ExecutionError {
}) sql: sql_str.clone(),
.map_err(|e| DatabaseError::QueryError { table: None,
reason: e.to_string(), reason: e.to_string(),
})?; })?;
let mut results = Vec::new(); let mut result_vec: Vec<Vec<JsonValue>> = Vec::new();
for row_result in rows {
results.push(row_result.map_err(|e| DatabaseError::RowProcessingError { // Lese alle RETURNING Zeilen
reason: e.to_string(), while let Some(row) = rows.next().map_err(|e| DatabaseError::ExecutionError {
})?); sql: sql_str.clone(),
table: None,
reason: e.to_string(),
})? {
let mut row_values: Vec<JsonValue> = Vec::new();
for i in 0..num_columns {
let value_ref = row.get_ref(i).map_err(|e| DatabaseError::ExecutionError {
sql: sql_str.clone(),
table: None,
reason: e.to_string(),
})?;
let json_value = convert_value_ref_to_json(value_ref)?;
row_values.push(json_value);
}
result_vec.push(row_values);
} }
Ok(results) // Trigger-Logik für CREATE TABLE
if let Statement::CreateTable(create_table_details) = statement {
let raw_name = create_table_details.name.to_string();
// Remove quotes from table name
let table_name_str = raw_name
.trim_matches('"')
.trim_matches('`')
.to_string();
eprintln!("DEBUG: Setting up triggers for table (RETURNING): {table_name_str}");
trigger::setup_triggers_for_table(tx, &table_name_str, false)?;
}
Ok((modified_schema_tables, result_vec))
}
/// Führt ein einzelnes SQL Statement OHNE Typinformationen aus (JSON params)
pub fn execute_internal(
tx: &Transaction,
hlc_service: &HlcService,
sql: &str,
params: &[JsonValue],
) -> Result<HashSet<String>, DatabaseError> {
let sql_params: Vec<SqliteValue> = params
.iter()
.map(crate::database::core::ValueConverter::json_to_rusqlite_value)
.collect::<Result<Vec<_>, _>>()?;
let param_refs: Vec<&dyn ToSql> = sql_params.iter().map(|p| p as &dyn ToSql).collect();
Self::execute_internal_typed(tx, hlc_service, sql, &param_refs)
}
/// Query-Variante (mit RETURNING) OHNE Typinformationen (JSON params)
pub fn query_internal(
tx: &Transaction,
hlc_service: &HlcService,
sql: &str,
params: &[JsonValue],
) -> Result<(HashSet<String>, Vec<Vec<JsonValue>>), DatabaseError> {
let sql_params: Vec<SqliteValue> = params
.iter()
.map(crate::database::core::ValueConverter::json_to_rusqlite_value)
.collect::<Result<Vec<_>, _>>()?;
let param_refs: Vec<&dyn ToSql> = sql_params.iter().map(|p| p as &dyn ToSql).collect();
Self::query_internal_typed(tx, hlc_service, sql, &param_refs)
}
/// Führt mehrere SQL Statements als Batch aus
pub fn execute_batch_internal(
tx: &Transaction,
hlc_service: &HlcService,
sqls: &[String],
params: &[Vec<JsonValue>],
) -> Result<HashSet<String>, DatabaseError> {
if sqls.len() != params.len() {
return Err(DatabaseError::ExecutionError {
sql: format!("{} statements but {} param sets", sqls.len(), params.len()),
reason: "Statement count and parameter count mismatch".to_string(),
table: None,
});
}
let mut all_modified_tables = HashSet::new();
for (sql, param_set) in sqls.iter().zip(params.iter()) {
let modified_tables = Self::execute_internal(tx, hlc_service, sql, param_set)?;
all_modified_tables.extend(modified_tables);
}
Ok(all_modified_tables)
}
/// Query für SELECT-Statements (read-only, kein CRDT nötig außer Filter)
pub fn query_select(
conn: &rusqlite::Connection,
sql: &str,
params: &[JsonValue],
) -> Result<Vec<Vec<JsonValue>>, DatabaseError> {
let mut ast_vec = parse_sql_statements(sql)?;
if ast_vec.len() != 1 {
return Err(DatabaseError::ExecutionError {
sql: sql.to_string(),
reason: "query_select should only receive a single SELECT statement".to_string(),
table: None,
});
}
// Hard Delete: Keine SELECT-Transformation mehr nötig
let stmt_to_execute = ast_vec.pop().unwrap();
let transformed_sql = stmt_to_execute.to_string();
eprintln!("DEBUG: SELECT (no transformation): {transformed_sql}");
// Convert JSON params to SQLite values
let sql_params: Vec<SqliteValue> = params
.iter()
.map(crate::database::core::ValueConverter::json_to_rusqlite_value)
.collect::<Result<Vec<_>, _>>()?;
let mut prepared_stmt = conn.prepare(&transformed_sql)?;
let num_columns = prepared_stmt.column_count();
let param_refs: Vec<&dyn ToSql> = sql_params.iter().map(|p| p as &dyn ToSql).collect();
let mut rows = prepared_stmt.query(params_from_iter(param_refs.iter()))?;
let mut result: Vec<Vec<JsonValue>> = Vec::new();
while let Some(row) = rows.next()? {
let mut row_values: Vec<JsonValue> = Vec::new();
for i in 0..num_columns {
let value_ref = row.get_ref(i)?;
let json_value = convert_value_ref_to_json(value_ref)?;
row_values.push(json_value);
}
result.push(row_values);
}
Ok(result)
} }
} }
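The executor now accepts parameters as &[&dyn ToSql] and returns result rows positionally as Vec<Vec<JsonValue>> instead of keyed JSON objects. A condensed sketch of both halves, with a stand-in for ValueConverter::json_to_rusqlite_value (the real converter and its error handling live in database::core):

use rusqlite::{params_from_iter, types::{Value as SqliteValue, ValueRef}, Connection, ToSql};
use serde_json::{json, Value as JsonValue};

// Rough stand-in for ValueConverter::json_to_rusqlite_value.
fn json_to_sqlite(v: &JsonValue) -> SqliteValue {
    match v {
        JsonValue::Null => SqliteValue::Null,
        JsonValue::Bool(b) => SqliteValue::Integer(*b as i64),
        JsonValue::Number(n) if n.is_i64() => SqliteValue::Integer(n.as_i64().unwrap()),
        JsonValue::Number(n) => SqliteValue::Real(n.as_f64().unwrap_or(0.0)),
        JsonValue::String(s) => SqliteValue::Text(s.clone()),
        other => SqliteValue::Text(other.to_string()),
    }
}

fn query_positional(
    conn: &Connection,
    sql: &str,
    params: &[JsonValue],
) -> rusqlite::Result<Vec<Vec<JsonValue>>> {
    let sql_params: Vec<SqliteValue> = params.iter().map(json_to_sqlite).collect();
    let param_refs: Vec<&dyn ToSql> = sql_params.iter().map(|p| p as &dyn ToSql).collect();

    let mut stmt = conn.prepare(sql)?;
    let num_columns = stmt.column_count();
    let mut rows = stmt.query(params_from_iter(param_refs.iter()))?;

    let mut result = Vec::new();
    while let Some(row) = rows.next()? {
        let mut row_values = Vec::with_capacity(num_columns);
        for i in 0..num_columns {
            row_values.push(match row.get_ref(i)? {
                ValueRef::Null => JsonValue::Null,
                ValueRef::Integer(i) => json!(i),
                ValueRef::Real(f) => json!(f),
                ValueRef::Text(t) => json!(String::from_utf8_lossy(t)),
                ValueRef::Blob(b) => json!(b.to_vec()),
            });
        }
        result.push(row_values);
    }
    Ok(result)
}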

View File

@ -5,6 +5,7 @@ use crate::crdt::transformer::CrdtTransformer;
use crate::crdt::trigger; use crate::crdt::trigger;
use crate::database::core::{parse_sql_statements, with_connection, ValueConverter}; use crate::database::core::{parse_sql_statements, with_connection, ValueConverter};
use crate::database::error::DatabaseError; use crate::database::error::DatabaseError;
use crate::extension::database::executor::SqlExecutor;
use crate::extension::error::ExtensionError; use crate::extension::error::ExtensionError;
use crate::extension::permissions::validator::SqlPermissionValidator; use crate::extension::permissions::validator::SqlPermissionValidator;
use crate::AppState; use crate::AppState;
@ -12,10 +13,8 @@ use crate::AppState;
use rusqlite::params_from_iter; use rusqlite::params_from_iter;
use rusqlite::types::Value as SqlValue; use rusqlite::types::Value as SqlValue;
use rusqlite::Transaction; use rusqlite::Transaction;
use serde_json::json;
use serde_json::Value as JsonValue; use serde_json::Value as JsonValue;
use sqlparser::ast::{Statement, TableFactor, TableObject}; use sqlparser::ast::{Statement, TableFactor, TableObject};
use std::collections::HashSet;
use tauri::State; use tauri::State;
/// Führt Statements mit korrekter Parameter-Bindung aus /// Führt Statements mit korrekter Parameter-Bindung aus
@ -110,7 +109,7 @@ pub async fn extension_sql_execute(
public_key: String, public_key: String,
name: String, name: String,
state: State<'_, AppState>, state: State<'_, AppState>,
) -> Result<Vec<String>, ExtensionError> { ) -> Result<Vec<Vec<JsonValue>>, ExtensionError> {
// Get extension to retrieve its ID // Get extension to retrieve its ID
let extension = state let extension = state
.extension_manager .extension_manager
@ -129,58 +128,98 @@ pub async fn extension_sql_execute(
// SQL parsing // SQL parsing
let mut ast_vec = parse_sql_statements(sql)?; let mut ast_vec = parse_sql_statements(sql)?;
if ast_vec.len() != 1 {
return Err(ExtensionError::Database {
source: DatabaseError::ExecutionError {
sql: sql.to_string(),
reason: "extension_sql_execute should only receive a single SQL statement"
.to_string(),
table: None,
},
});
}
let mut statement = ast_vec.pop().unwrap();
// Check if statement has RETURNING clause
let has_returning = crate::database::core::statement_has_returning(&statement);
// Database operation // Database operation
with_connection(&state.db, |conn| { with_connection(&state.db, |conn| {
let tx = conn.transaction().map_err(DatabaseError::from)?; let tx = conn.transaction().map_err(DatabaseError::from)?;
let transformer = CrdtTransformer::new(); let transformer = CrdtTransformer::new();
let executor = StatementExecutor::new(&tx);
// Get HLC service reference
let hlc_service = state.hlc.lock().map_err(|_| DatabaseError::MutexPoisoned {
reason: "Failed to lock HLC service".to_string(),
})?;
// Generate HLC timestamp // Generate HLC timestamp
let hlc_timestamp = state let hlc_timestamp =
.hlc hlc_service
.lock() .new_timestamp_and_persist(&tx)
.unwrap() .map_err(|e| DatabaseError::HlcError {
.new_timestamp_and_persist(&tx) reason: e.to_string(),
.map_err(|e| DatabaseError::HlcError { })?;
reason: e.to_string(),
})?;
// Transform statements // Transform statement
let mut modified_schema_tables = HashSet::new(); transformer.transform_execute_statement(&mut statement, &hlc_timestamp)?;
for statement in &mut ast_vec {
if let Some(table_name) =
transformer.transform_execute_statement(statement, &hlc_timestamp)?
{
modified_schema_tables.insert(table_name);
}
}
// Convert parameters // Convert parameters to references
let sql_values = ValueConverter::convert_params(&params)?; let sql_values = ValueConverter::convert_params(&params)?;
let param_refs: Vec<&dyn rusqlite::ToSql> = sql_values
.iter()
.map(|v| v as &dyn rusqlite::ToSql)
.collect();
// Execute statements let result = if has_returning {
for statement in ast_vec { // Use query_internal for statements with RETURNING
executor.execute_statement_with_params(&statement, &sql_values)?; let (_, rows) = SqlExecutor::query_internal_typed(
&tx,
&hlc_service,
&statement.to_string(),
&param_refs,
)?;
rows
} else {
// Use execute_internal for statements without RETURNING
SqlExecutor::execute_internal_typed(
&tx,
&hlc_service,
&statement.to_string(),
&param_refs,
)?;
vec![]
};
if let Statement::CreateTable(create_table_details) = statement { // Handle CREATE TABLE trigger setup
let table_name_str = create_table_details.name.to_string(); if let Statement::CreateTable(ref create_table_details) = statement {
println!( // Extract table name and remove quotes (both " and `)
"Table '{}' created by extension, setting up CRDT triggers...", let raw_name = create_table_details.name.to_string();
table_name_str println!("DEBUG: Raw table name from AST: {raw_name:?}");
); println!(
trigger::setup_triggers_for_table(&tx, &table_name_str, false)?; "DEBUG: Raw table name chars: {:?}",
println!( raw_name.chars().collect::<Vec<_>>()
"Triggers for table '{}' successfully created.", );
table_name_str
); let table_name_str = raw_name.trim_matches('"').trim_matches('`').to_string();
}
println!("DEBUG: Cleaned table name: {table_name_str:?}");
println!(
"DEBUG: Cleaned table name chars: {:?}",
table_name_str.chars().collect::<Vec<_>>()
);
println!("Table '{table_name_str}' created by extension, setting up CRDT triggers...");
trigger::setup_triggers_for_table(&tx, &table_name_str, false)?;
println!("Triggers for table '{table_name_str}' successfully created.");
} }
// Commit transaction // Commit transaction
tx.commit().map_err(DatabaseError::from)?; tx.commit().map_err(DatabaseError::from)?;
Ok(modified_schema_tables.into_iter().collect()) Ok(result)
}) })
.map_err(ExtensionError::from) .map_err(ExtensionError::from)
} }
@ -192,7 +231,7 @@ pub async fn extension_sql_select(
public_key: String, public_key: String,
name: String, name: String,
state: State<'_, AppState>, state: State<'_, AppState>,
) -> Result<Vec<JsonValue>, ExtensionError> { ) -> Result<Vec<Vec<JsonValue>>, ExtensionError> {
// Get extension to retrieve its ID // Get extension to retrieve its ID
let extension = state let extension = state
.extension_manager .extension_manager
@ -229,17 +268,10 @@ pub async fn extension_sql_select(
} }
} }
// Database operation // Database operation - return Vec<Vec<JsonValue>> like sql_select_with_crdt
with_connection(&state.db, |conn| { with_connection(&state.db, |conn| {
let sql_params = ValueConverter::convert_params(&params)?; let sql_params = ValueConverter::convert_params(&params)?;
let transformer = CrdtTransformer::new(); let stmt_to_execute = ast_vec.pop().unwrap();
// Use the last statement for result set
let last_statement = ast_vec.pop().unwrap();
let mut stmt_to_execute = last_statement;
// Transform the statement
transformer.transform_select_statement(&mut stmt_to_execute)?;
let transformed_sql = stmt_to_execute.to_string(); let transformed_sql = stmt_to_execute.to_string();
// Prepare and execute query // Prepare and execute query
@ -251,52 +283,34 @@ pub async fn extension_sql_select(
table: None, table: None,
})?; })?;
let column_names: Vec<String> = prepared_stmt let num_columns = prepared_stmt.column_count();
.column_names() let mut rows = prepared_stmt
.into_iter() .query(params_from_iter(sql_params.iter()))
.map(|s| s.to_string())
.collect();
let rows = prepared_stmt
.query_map(params_from_iter(sql_params.iter()), |row| {
row_to_json_value(row, &column_names)
})
.map_err(|e| DatabaseError::QueryError { .map_err(|e| DatabaseError::QueryError {
reason: e.to_string(), reason: e.to_string(),
})?; })?;
let mut results = Vec::new(); let mut result_vec: Vec<Vec<JsonValue>> = Vec::new();
for row_result in rows {
results.push(row_result.map_err(|e| DatabaseError::RowProcessingError { while let Some(row) = rows.next().map_err(|e| DatabaseError::QueryError {
reason: e.to_string(), reason: e.to_string(),
})?); })? {
let mut row_values: Vec<JsonValue> = Vec::new();
for i in 0..num_columns {
let value_ref = row.get_ref(i).map_err(|e| DatabaseError::QueryError {
reason: e.to_string(),
})?;
let json_value = crate::database::core::convert_value_ref_to_json(value_ref)?;
row_values.push(json_value);
}
result_vec.push(row_values);
} }
Ok(results) Ok(result_vec)
}) })
.map_err(ExtensionError::from) .map_err(ExtensionError::from)
} }
/// Konvertiert eine SQLite-Zeile zu JSON
fn row_to_json_value(
row: &rusqlite::Row,
columns: &[String],
) -> Result<JsonValue, rusqlite::Error> {
let mut map = serde_json::Map::new();
for (i, col_name) in columns.iter().enumerate() {
let value = row.get::<usize, rusqlite::types::Value>(i)?;
let json_value = match value {
rusqlite::types::Value::Null => JsonValue::Null,
rusqlite::types::Value::Integer(i) => json!(i),
rusqlite::types::Value::Real(f) => json!(f),
rusqlite::types::Value::Text(s) => json!(s),
rusqlite::types::Value::Blob(blob) => json!(blob.to_vec()),
};
map.insert(col_name.clone(), json_value);
}
Ok(JsonValue::Object(map))
}
/// Validiert Parameter gegen SQL-Platzhalter /// Validiert Parameter gegen SQL-Platzhalter
fn validate_params(sql: &str, params: &[JsonValue]) -> Result<(), DatabaseError> { fn validate_params(sql: &str, params: &[JsonValue]) -> Result<(), DatabaseError> {
let total_placeholders = count_sql_placeholders(sql); let total_placeholders = count_sql_placeholders(sql);
@ -317,15 +331,6 @@ fn count_sql_placeholders(sql: &str) -> usize {
sql.matches('?').count() sql.matches('?').count()
} }
/// Kürzt SQL für Fehlermeldungen
/* fn truncate_sql(sql: &str, max_length: usize) -> String {
if sql.len() <= max_length {
sql.to_string()
} else {
format!("{}...", &sql[..max_length])
}
} */
#[cfg(test)] #[cfg(test)]
mod tests { mod tests {
use super::*; use super::*;
@ -342,20 +347,4 @@ mod tests {
); );
assert_eq!(count_sql_placeholders("SELECT * FROM users"), 0); assert_eq!(count_sql_placeholders("SELECT * FROM users"), 0);
} }
/* #[test]
fn test_truncate_sql() {
let sql = "SELECT * FROM very_long_table_name";
assert_eq!(truncate_sql(sql, 10), "SELECT * F...");
assert_eq!(truncate_sql(sql, 50), sql);
} */
#[test]
fn test_validate_params() {
let params = vec![json!(1), json!("test")];
assert!(validate_params("SELECT * FROM users WHERE id = ? AND name = ?", &params).is_ok());
assert!(validate_params("SELECT * FROM users WHERE id = ?", &params).is_err());
assert!(validate_params("SELECT * FROM users", &params).is_err());
}
} }
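validate_params and count_sql_placeholders remain above as a cheap guard that the number of bound parameters matches the '?' placeholders in the SQL. A minimal usage sketch, with the error reduced to a String instead of DatabaseError::ParameterMismatchError:

use serde_json::{json, Value as JsonValue};

fn count_sql_placeholders(sql: &str) -> usize {
    sql.matches('?').count()
}

fn validate_params(sql: &str, params: &[JsonValue]) -> Result<(), String> {
    let expected = count_sql_placeholders(sql);
    if expected != params.len() {
        return Err(format!("expected {expected} parameters, got {}", params.len()));
    }
    Ok(())
}

fn main() {
    let params = vec![json!(1), json!("test")];
    assert!(validate_params("SELECT * FROM users WHERE id = ? AND name = ?", &params).is_ok());
    assert!(validate_params("SELECT * FROM users WHERE id = ?", &params).is_err());
}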

View File

@ -174,7 +174,7 @@ impl serde::Serialize for ExtensionError {
let mut state = serializer.serialize_struct("ExtensionError", 4)?; let mut state = serializer.serialize_struct("ExtensionError", 4)?;
state.serialize_field("code", &self.code())?; state.serialize_field("code", &self.code())?;
state.serialize_field("type", &format!("{:?}", self))?; state.serialize_field("type", &format!("{self:?}"))?;
state.serialize_field("message", &self.to_string())?; state.serialize_field("message", &self.to_string())?;
if let Some(ext_id) = self.extension_id() { if let Some(ext_id) = self.extension_id() {

View File

@ -133,7 +133,7 @@ fn validate_path_pattern(pattern: &str) -> Result<(), ExtensionError> {
// Check for path traversal attempts // Check for path traversal attempts
if pattern.contains("../") || pattern.contains("..\\") { if pattern.contains("../") || pattern.contains("..\\") {
return Err(ExtensionError::SecurityViolation { return Err(ExtensionError::SecurityViolation {
reason: format!("Path traversal detected in pattern: {}", pattern), reason: format!("Path traversal detected in pattern: {pattern}"),
}); });
} }
@ -143,7 +143,6 @@ fn validate_path_pattern(pattern: &str) -> Result<(), ExtensionError> {
/// Resolves a path pattern to actual filesystem paths using Tauri's BaseDirectory /// Resolves a path pattern to actual filesystem paths using Tauri's BaseDirectory
pub fn resolve_path_pattern( pub fn resolve_path_pattern(
pattern: &str, pattern: &str,
app_handle: &tauri::AppHandle,
) -> Result<(String, String), ExtensionError> { ) -> Result<(String, String), ExtensionError> {
let (base_var, relative_path) = if let Some(slash_pos) = pattern.find('/') { let (base_var, relative_path) = if let Some(slash_pos) = pattern.find('/') {
(&pattern[..slash_pos], &pattern[slash_pos + 1..]) (&pattern[..slash_pos], &pattern[slash_pos + 1..])
@ -177,7 +176,7 @@ pub fn resolve_path_pattern(
"$TEMP" => "Temp", "$TEMP" => "Temp",
_ => { _ => {
return Err(ExtensionError::ValidationError { return Err(ExtensionError::ValidationError {
reason: format!("Unknown base directory variable: {}", base_var), reason: format!("Unknown base directory variable: {base_var}"),
}); });
} }
}; };
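resolve_path_pattern no longer needs an AppHandle: it splits the pattern at the first slash and maps the leading $VARIABLE onto a Tauri base-directory name. A compact sketch of that mapping; only the $TEMP arm is visible in this hunk, so the other two arms are assumptions:

fn resolve_path_pattern(pattern: &str) -> Result<(String, String), String> {
    // Split "$APPLOCALDATA/files/backgrounds/*" into ("$APPLOCALDATA", "files/backgrounds/*").
    let (base_var, relative_path) = match pattern.find('/') {
        Some(pos) => (&pattern[..pos], &pattern[pos + 1..]),
        None => (pattern, ""),
    };
    let base_dir = match base_var {
        "$APPDATA" => "AppData",           // assumed arm
        "$APPLOCALDATA" => "AppLocalData", // assumed arm
        "$TEMP" => "Temp",
        other => return Err(format!("Unknown base directory variable: {other}")),
    };
    Ok((base_dir.to_string(), relative_path.to_string()))
}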

View File

@ -1,7 +1,7 @@
/// src-tauri/src/extension/mod.rs /// src-tauri/src/extension/mod.rs
use crate::{ use crate::{
extension::{ extension::{
core::{EditablePermissions, ExtensionInfoResponse, ExtensionPreview}, core::{manager::ExtensionManager, EditablePermissions, ExtensionInfoResponse, ExtensionPreview},
error::ExtensionError, error::ExtensionError,
}, },
AppState, AppState,
@ -37,7 +37,7 @@ pub async fn get_all_extensions(
state: State<'_, AppState>, state: State<'_, AppState>,
) -> Result<Vec<ExtensionInfoResponse>, String> { ) -> Result<Vec<ExtensionInfoResponse>, String> {
// Check if extensions are loaded, if not load them first // Check if extensions are loaded, if not load them first
let needs_loading = { /* let needs_loading = {
let prod_exts = state let prod_exts = state
.extension_manager .extension_manager
.production_extensions .production_extensions
@ -45,15 +45,15 @@ pub async fn get_all_extensions(
.unwrap(); .unwrap();
let dev_exts = state.extension_manager.dev_extensions.lock().unwrap(); let dev_exts = state.extension_manager.dev_extensions.lock().unwrap();
prod_exts.is_empty() && dev_exts.is_empty() prod_exts.is_empty() && dev_exts.is_empty()
}; }; */
if needs_loading { /* if needs_loading { */
state state
.extension_manager .extension_manager
.load_installed_extensions(&app_handle, &state) .load_installed_extensions(&app_handle, &state)
.await .await
.map_err(|e| format!("Failed to load extensions: {:?}", e))?; .map_err(|e| format!("Failed to load extensions: {e:?}"))?;
} /* } */
let mut extensions = Vec::new(); let mut extensions = Vec::new();
@ -82,12 +82,13 @@ pub async fn get_all_extensions(
#[tauri::command] #[tauri::command]
pub async fn preview_extension( pub async fn preview_extension(
app_handle: AppHandle,
state: State<'_, AppState>, state: State<'_, AppState>,
file_bytes: Vec<u8>, file_bytes: Vec<u8>,
) -> Result<ExtensionPreview, ExtensionError> { ) -> Result<ExtensionPreview, ExtensionError> {
state state
.extension_manager .extension_manager
.preview_extension_internal(file_bytes) .preview_extension_internal(&app_handle, file_bytes)
.await .await
} }
@ -193,13 +194,7 @@ pub async fn remove_extension(
) -> Result<(), ExtensionError> { ) -> Result<(), ExtensionError> {
state state
.extension_manager .extension_manager
.remove_extension_internal( .remove_extension_internal(&app_handle, &public_key, &name, &version, &state)
&app_handle,
&public_key,
&name,
&version,
&state,
)
.await .await
} }
@ -223,6 +218,16 @@ pub fn is_extension_installed(
#[derive(serde::Deserialize, Debug)] #[derive(serde::Deserialize, Debug)]
struct HaextensionConfig { struct HaextensionConfig {
dev: DevConfig, dev: DevConfig,
#[serde(default)]
keys: KeysConfig,
}
#[derive(serde::Deserialize, Debug, Default)]
struct KeysConfig {
#[serde(default)]
public_key_path: Option<String>,
#[serde(default)]
private_key_path: Option<String>,
} }
#[derive(serde::Deserialize, Debug)] #[derive(serde::Deserialize, Debug)]
@ -231,6 +236,8 @@ struct DevConfig {
port: u16, port: u16,
#[serde(default = "default_host")] #[serde(default = "default_host")]
host: String, host: String,
#[serde(default = "default_haextension_dir")]
haextension_dir: String,
} }
fn default_port() -> u16 { fn default_port() -> u16 {
@ -241,10 +248,14 @@ fn default_host() -> String {
"localhost".to_string() "localhost".to_string()
} }
fn default_haextension_dir() -> String {
"haextension".to_string()
}
/// Check if a dev server is reachable by making a simple HTTP request /// Check if a dev server is reachable by making a simple HTTP request
async fn check_dev_server_health(url: &str) -> bool { async fn check_dev_server_health(url: &str) -> bool {
use tauri_plugin_http::reqwest;
use std::time::Duration; use std::time::Duration;
use tauri_plugin_http::reqwest;
// Try to connect with a short timeout // Try to connect with a short timeout
let client = reqwest::Client::builder() let client = reqwest::Client::builder()
@ -276,70 +287,62 @@ pub async fn load_dev_extension(
let extension_path_buf = PathBuf::from(&extension_path); let extension_path_buf = PathBuf::from(&extension_path);
// 1. Read haextension.json to get dev server config // 1. Read haextension.config.json to get dev server config and haextension directory
let config_path = extension_path_buf.join("haextension.json"); let config_path = extension_path_buf.join("haextension.config.json");
let (host, port) = if config_path.exists() { let (host, port, haextension_dir) = if config_path.exists() {
let config_content = std::fs::read_to_string(&config_path).map_err(|e| { let config_content =
ExtensionError::ValidationError { std::fs::read_to_string(&config_path).map_err(|e| ExtensionError::ValidationError {
reason: format!("Failed to read haextension.json: {}", e), reason: format!("Failed to read haextension.config.json: {e}"),
} })?;
})?;
let config: HaextensionConfig = serde_json::from_str(&config_content).map_err(|e| { let config: HaextensionConfig =
ExtensionError::ValidationError { serde_json::from_str(&config_content).map_err(|e| ExtensionError::ValidationError {
reason: format!("Failed to parse haextension.json: {}", e), reason: format!("Failed to parse haextension.config.json: {e}"),
} })?;
})?;
(config.dev.host, config.dev.port) (config.dev.host, config.dev.port, config.dev.haextension_dir)
} else { } else {
// Default values if config doesn't exist // Default values if config doesn't exist
(default_host(), default_port()) (default_host(), default_port(), default_haextension_dir())
}; };
let dev_server_url = format!("http://{}:{}", host, port); let dev_server_url = format!("http://{host}:{port}");
eprintln!("📡 Dev server URL: {}", dev_server_url); eprintln!("📡 Dev server URL: {dev_server_url}");
eprintln!("📁 Haextension directory: {haextension_dir}");
// 1.5. Check if dev server is running // 1.5. Check if dev server is running
if !check_dev_server_health(&dev_server_url).await { if !check_dev_server_health(&dev_server_url).await {
return Err(ExtensionError::ValidationError { return Err(ExtensionError::ValidationError {
reason: format!( reason: format!(
"Dev server at {} is not reachable. Please start your dev server first (e.g., 'npm run dev')", "Dev server at {dev_server_url} is not reachable. Please start your dev server first (e.g., 'npm run dev')"
dev_server_url
), ),
}); });
} }
eprintln!("✅ Dev server is reachable"); eprintln!("✅ Dev server is reachable");
// 2. Build path to manifest: <extension_path>/haextension/manifest.json // 2. Validate and build path to manifest: <extension_path>/<haextension_dir>/manifest.json
let manifest_path = extension_path_buf.join("haextension").join("manifest.json"); let manifest_relative_path = format!("{haextension_dir}/manifest.json");
let manifest_path = ExtensionManager::validate_path_in_directory(
// Check if manifest exists &extension_path_buf,
if !manifest_path.exists() { &manifest_relative_path,
return Err(ExtensionError::ManifestError { true,
reason: format!( )?
"Manifest not found at: {}. Make sure you run 'npx @haexhub/sdk init' first.", .ok_or_else(|| ExtensionError::ManifestError {
manifest_path.display() reason: format!(
), "Manifest not found at: {haextension_dir}/manifest.json. Make sure you run 'npx @haexhub/sdk init' first."
}); ),
} })?;
// 3. Read and parse manifest // 3. Read and parse manifest
let manifest_content = std::fs::read_to_string(&manifest_path).map_err(|e| { let manifest_content =
ExtensionError::ManifestError { std::fs::read_to_string(&manifest_path).map_err(|e| ExtensionError::ManifestError {
reason: format!("Failed to read manifest: {}", e), reason: format!("Failed to read manifest: {e}"),
} })?;
})?;
let manifest: ExtensionManifest = serde_json::from_str(&manifest_content)?; let manifest: ExtensionManifest = serde_json::from_str(&manifest_content)?;
// 4. Generate a unique ID for dev extension: dev_<public_key_first_8>_<name> // 4. Generate a unique ID for dev extension: dev_<public_key>_<name>
let key_prefix = manifest let extension_id = format!("dev_{}_{}", manifest.public_key, manifest.name);
.public_key
.chars()
.take(8)
.collect::<String>();
let extension_id = format!("dev_{}_{}", key_prefix, manifest.name);
// 5. Check if dev extension already exists (allow reload) // 5. Check if dev extension already exists (allow reload)
if let Some(existing) = state if let Some(existing) = state
@ -387,13 +390,11 @@ pub fn remove_dev_extension(
state: State<'_, AppState>, state: State<'_, AppState>,
) -> Result<(), ExtensionError> { ) -> Result<(), ExtensionError> {
// Only remove from dev_extensions, not production_extensions // Only remove from dev_extensions, not production_extensions
let mut dev_exts = state let mut dev_exts = state.extension_manager.dev_extensions.lock().map_err(|e| {
.extension_manager ExtensionError::MutexPoisoned {
.dev_extensions
.lock()
.map_err(|e| ExtensionError::MutexPoisoned {
reason: e.to_string(), reason: e.to_string(),
})?; }
})?;
// Find and remove by public_key and name // Find and remove by public_key and name
let to_remove = dev_exts let to_remove = dev_exts
@ -403,13 +404,10 @@ pub fn remove_dev_extension(
if let Some(id) = to_remove { if let Some(id) = to_remove {
dev_exts.remove(&id); dev_exts.remove(&id);
eprintln!("✅ Dev extension removed: {}", name); eprintln!("✅ Dev extension removed: {name}");
Ok(()) Ok(())
} else { } else {
Err(ExtensionError::NotFound { Err(ExtensionError::NotFound { public_key, name })
public_key,
name,
})
} }
} }
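check_dev_server_health treats the dev server as reachable if any HTTP response arrives within a short timeout. A sketch of such a probe with plain reqwest (the app itself goes through tauri_plugin_http's re-export; the 2-second timeout is an illustrative choice, the real value is not shown in this hunk):

use std::time::Duration;

async fn check_dev_server_health(url: &str) -> bool {
    let client = match reqwest::Client::builder()
        .timeout(Duration::from_secs(2))
        .build()
    {
        Ok(client) => client,
        Err(_) => return false,
    };
    // Any response, even a 404, means something is listening on the dev port.
    client.get(url).send().await.is_ok()
}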

View File

@ -28,8 +28,7 @@ impl PermissionManager {
})?; })?;
let sql = format!( let sql = format!(
"INSERT INTO {} (id, extension_id, resource_type, action, target, constraints, status) VALUES (?, ?, ?, ?, ?, ?, ?)", "INSERT INTO {TABLE_EXTENSION_PERMISSIONS} (id, extension_id, resource_type, action, target, constraints, status) VALUES (?, ?, ?, ?, ?, ?, ?)"
TABLE_EXTENSION_PERMISSIONS
); );
for perm in permissions { for perm in permissions {
@ -76,8 +75,7 @@ impl PermissionManager {
let db_perm: HaexExtensionPermissions = permission.into(); let db_perm: HaexExtensionPermissions = permission.into();
let sql = format!( let sql = format!(
"UPDATE {} SET resource_type = ?, action = ?, target = ?, constraints = ?, status = ? WHERE id = ?", "UPDATE {TABLE_EXTENSION_PERMISSIONS} SET resource_type = ?, action = ?, target = ?, constraints = ?, status = ? WHERE id = ?"
TABLE_EXTENSION_PERMISSIONS
); );
let params = params![ let params = params![
@ -111,7 +109,7 @@ impl PermissionManager {
reason: "Failed to lock HLC service".to_string(), reason: "Failed to lock HLC service".to_string(),
})?; })?;
let sql = format!("UPDATE {} SET status = ? WHERE id = ?", TABLE_EXTENSION_PERMISSIONS); let sql = format!("UPDATE {TABLE_EXTENSION_PERMISSIONS} SET status = ? WHERE id = ?");
let params = params![new_status.as_str(), permission_id]; let params = params![new_status.as_str(), permission_id];
SqlExecutor::execute_internal_typed(&tx, &hlc_service, &sql, params)?; SqlExecutor::execute_internal_typed(&tx, &hlc_service, &sql, params)?;
tx.commit().map_err(DatabaseError::from) tx.commit().map_err(DatabaseError::from)
@ -133,7 +131,7 @@ impl PermissionManager {
})?; })?;
// Echtes DELETE - wird vom CrdtTransformer zu UPDATE umgewandelt // Echtes DELETE - wird vom CrdtTransformer zu UPDATE umgewandelt
let sql = format!("DELETE FROM {} WHERE id = ?", TABLE_EXTENSION_PERMISSIONS); let sql = format!("DELETE FROM {TABLE_EXTENSION_PERMISSIONS} WHERE id = ?");
SqlExecutor::execute_internal_typed(&tx, &hlc_service, &sql, params![permission_id])?; SqlExecutor::execute_internal_typed(&tx, &hlc_service, &sql, params![permission_id])?;
tx.commit().map_err(DatabaseError::from) tx.commit().map_err(DatabaseError::from)
}).map_err(ExtensionError::from) }).map_err(ExtensionError::from)
@ -152,7 +150,7 @@ impl PermissionManager {
reason: "Failed to lock HLC service".to_string(), reason: "Failed to lock HLC service".to_string(),
})?; })?;
let sql = format!("DELETE FROM {} WHERE extension_id = ?", TABLE_EXTENSION_PERMISSIONS); let sql = format!("DELETE FROM {TABLE_EXTENSION_PERMISSIONS} WHERE extension_id = ?");
SqlExecutor::execute_internal_typed(&tx, &hlc_service, &sql, params![extension_id])?; SqlExecutor::execute_internal_typed(&tx, &hlc_service, &sql, params![extension_id])?;
tx.commit().map_err(DatabaseError::from) tx.commit().map_err(DatabaseError::from)
}).map_err(ExtensionError::from) }).map_err(ExtensionError::from)
@ -164,7 +162,7 @@ impl PermissionManager {
hlc_service: &crate::crdt::hlc::HlcService, hlc_service: &crate::crdt::hlc::HlcService,
extension_id: &str, extension_id: &str,
) -> Result<(), DatabaseError> { ) -> Result<(), DatabaseError> {
let sql = format!("DELETE FROM {} WHERE extension_id = ?", TABLE_EXTENSION_PERMISSIONS); let sql = format!("DELETE FROM {TABLE_EXTENSION_PERMISSIONS} WHERE extension_id = ?");
SqlExecutor::execute_internal_typed(tx, hlc_service, &sql, params![extension_id])?; SqlExecutor::execute_internal_typed(tx, hlc_service, &sql, params![extension_id])?;
Ok(()) Ok(())
} }
@ -174,7 +172,7 @@ impl PermissionManager {
extension_id: &str, extension_id: &str,
) -> Result<Vec<ExtensionPermission>, ExtensionError> { ) -> Result<Vec<ExtensionPermission>, ExtensionError> {
with_connection(&app_state.db, |conn| { with_connection(&app_state.db, |conn| {
let sql = format!("SELECT * FROM {} WHERE extension_id = ?", TABLE_EXTENSION_PERMISSIONS); let sql = format!("SELECT * FROM {TABLE_EXTENSION_PERMISSIONS} WHERE extension_id = ?");
let mut stmt = conn.prepare(&sql).map_err(DatabaseError::from)?; let mut stmt = conn.prepare(&sql).map_err(DatabaseError::from)?;
let perms_iter = stmt.query_map(params![extension_id], |row| { let perms_iter = stmt.query_map(params![extension_id], |row| {
@ -197,6 +195,30 @@ impl PermissionManager {
action: Action, action: Action,
table_name: &str, table_name: &str,
) -> Result<(), ExtensionError> { ) -> Result<(), ExtensionError> {
// Remove quotes from table name if present (from SDK's getTableName())
let clean_table_name = table_name.trim_matches('"');
// Auto-allow: Extensions have full access to their own tables
// Table format: {publicKey}__{extensionName}__{tableName}
// Extension ID format: dev_{publicKey}_{extensionName} or {publicKey}_{extensionName}
// Get the extension to check if this is its own table
let extension = app_state
.extension_manager
.get_extension(extension_id)
.ok_or_else(|| ExtensionError::ValidationError {
reason: format!("Extension with ID {extension_id} not found"),
})?;
// Build expected table prefix: {publicKey}__{extensionName}__
let expected_prefix = format!("{}__{}__", extension.manifest.public_key, extension.manifest.name);
if clean_table_name.starts_with(&expected_prefix) {
// This is the extension's own table - auto-allow
return Ok(());
}
// Not own table - check explicit permissions
let permissions = Self::get_permissions(app_state, extension_id).await?; let permissions = Self::get_permissions(app_state, extension_id).await?;
let has_permission = permissions let has_permission = permissions
@ -205,7 +227,7 @@ impl PermissionManager {
.filter(|perm| perm.resource_type == ResourceType::Db) .filter(|perm| perm.resource_type == ResourceType::Db)
.filter(|perm| perm.action == action) // action ist nicht mehr Option .filter(|perm| perm.action == action) // action ist nicht mehr Option
.any(|perm| { .any(|perm| {
if perm.target != "*" && perm.target != table_name { if perm.target != "*" && perm.target != clean_table_name {
return false; return false;
} }
true true
@ -214,8 +236,8 @@ impl PermissionManager {
if !has_permission { if !has_permission {
return Err(ExtensionError::permission_denied( return Err(ExtensionError::permission_denied(
extension_id, extension_id,
&format!("{:?}", action), &format!("{action:?}"),
&format!("database table '{}'", table_name), &format!("database table '{table_name}'"),
)); ));
} }
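The auto-allow path strips quotes from the table name and checks it against the extension's own {publicKey}__{extensionName}__ prefix before consulting explicit permissions. The check in isolation, with example values that are purely illustrative:

fn is_own_table(public_key: &str, extension_name: &str, table_name: &str) -> bool {
    // SDK-generated names may arrive quoted, e.g. "\"abc123__notes__items\"".
    let clean = table_name.trim_matches('"');
    let expected_prefix = format!("{public_key}__{extension_name}__");
    clean.starts_with(&expected_prefix)
}

// is_own_table("abc123", "notes", "\"abc123__notes__items\"") == true
// is_own_table("abc123", "notes", "haex_workspaces") == false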
@ -391,7 +413,7 @@ impl PermissionManager {
"db" => Ok(ResourceType::Db), "db" => Ok(ResourceType::Db),
"shell" => Ok(ResourceType::Shell), "shell" => Ok(ResourceType::Shell),
_ => Err(DatabaseError::SerializationError { _ => Err(DatabaseError::SerializationError {
reason: format!("Unknown resource type: {}", s), reason: format!("Unknown resource type: {s}"),
}), }),
} }
} }
@ -399,8 +421,7 @@ impl PermissionManager {
fn matches_path_pattern(pattern: &str, path: &str) -> bool { fn matches_path_pattern(pattern: &str, path: &str) -> bool {
if pattern.ends_with("/*") { if let Some(prefix) = pattern.strip_suffix("/*") {
let prefix = &pattern[..pattern.len() - 2];
return path.starts_with(prefix); return path.starts_with(prefix);
} }
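matches_path_pattern now uses strip_suffix instead of slicing with pattern.len() - 2. A sketch, assuming the non-glob fallback is an exact comparison (the remainder of the function lies outside this hunk):

fn matches_path_pattern(pattern: &str, path: &str) -> bool {
    if let Some(prefix) = pattern.strip_suffix("/*") {
        // "$APPLOCALDATA/files/*" matches anything under that prefix.
        return path.starts_with(prefix);
    }
    // Assumed fallback: exact match for patterns without a trailing "/*".
    pattern == path
}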

View File

@ -165,8 +165,6 @@ pub struct ExtensionPermission {
pub constraints: Option<PermissionConstraints>, pub constraints: Option<PermissionConstraints>,
pub status: PermissionStatus, pub status: PermissionStatus,
#[serde(skip_serializing_if = "Option::is_none")] #[serde(skip_serializing_if = "Option::is_none")]
pub haex_tombstone: Option<bool>,
#[serde(skip_serializing_if = "Option::is_none")]
pub haex_timestamp: Option<String>, pub haex_timestamp: Option<String>,
} }
@ -269,7 +267,7 @@ impl ResourceType {
"db" => Ok(ResourceType::Db), "db" => Ok(ResourceType::Db),
"shell" => Ok(ResourceType::Shell), "shell" => Ok(ResourceType::Shell),
_ => Err(ExtensionError::ValidationError { _ => Err(ExtensionError::ValidationError {
reason: format!("Unknown resource type: {}", s), reason: format!("Unknown resource type: {s}"),
}), }),
} }
} }
@@ -303,7 +301,7 @@ impl Action {
            ResourceType::Fs => Ok(Action::Filesystem(FsAction::from_str(s)?)),
            ResourceType::Http => {
                let action: HttpAction =
-                   serde_json::from_str(&format!("\"{}\"", s)).map_err(|_| {
+                   serde_json::from_str(&format!("\"{s}\"")).map_err(|_| {
                        ExtensionError::InvalidActionString {
                            input: s.to_string(),
                            resource_type: "http".to_string(),
@@ -331,7 +329,7 @@ impl PermissionStatus {
            "granted" => Ok(PermissionStatus::Granted),
            "denied" => Ok(PermissionStatus::Denied),
            _ => Err(ExtensionError::ValidationError {
-               reason: format!("Unknown permission status: {}", s),
+               reason: format!("Unknown permission status: {s}"),
            }),
        }
    }
@@ -341,9 +339,9 @@ impl From<&ExtensionPermission> for crate::database::generated::HaexExtensionPermissions {
    fn from(perm: &ExtensionPermission) -> Self {
        Self {
            id: perm.id.clone(),
-           extension_id: Some(perm.extension_id.clone()),
+           extension_id: perm.extension_id.clone(),
            resource_type: Some(perm.resource_type.as_str().to_string()),
-           action: Some(perm.action.as_str()),
+           action: Some(perm.action.as_str().to_string()),
            target: Some(perm.target.clone()),
            constraints: perm
                .constraints
@@ -352,7 +350,6 @@ impl From<&ExtensionPermission> for crate::database::generated::HaexExtensionPermissions {
            status: perm.status.as_str().to_string(),
            created_at: None,
            updated_at: None,
-           haex_tombstone: perm.haex_tombstone,
            haex_timestamp: perm.haex_timestamp.clone(),
        }
    }
@@ -382,13 +379,12 @@ impl From<crate::database::generated::HaexExtensionPermissions> for ExtensionPermission {
        Self {
            id: db_perm.id,
-           extension_id: db_perm.extension_id.unwrap_or_default(),
+           extension_id: db_perm.extension_id,
            resource_type,
            action,
            target: db_perm.target.unwrap_or_default(),
            constraints,
            status,
-           haex_tombstone: db_perm.haex_tombstone,
            haex_timestamp: db_perm.haex_timestamp,
        }
    }


@@ -17,7 +17,7 @@ impl SqlPermissionValidator {
    fn is_own_table(extension_id: &str, table_name: &str) -> bool {
        // Table names use the format: {keyHash}_{extensionName}_{tableName}
        // extension_id is the extension's keyHash
-       table_name.starts_with(&format!("{}_", extension_id))
+       table_name.starts_with(&format!("{extension_id}_"))
    }

    /// Validates an SQL statement against an extension's permissions
@@ -45,7 +45,7 @@ impl SqlPermissionValidator {
                Self::validate_schema_statement(app_state, extension_id, &statement).await
            }
            _ => Err(ExtensionError::ValidationError {
-               reason: format!("Statement type not allowed: {}", sql),
+               reason: format!("Statement type not allowed: {sql}"),
            }),
        }
    }


@@ -26,7 +26,7 @@ pub fn run() {
                let state = app_handle.state::<AppState>();
                // Call the handler with all required parameters
-               match extension::core::extension_protocol_handler(state, &app_handle, &request) {
+               match extension::core::extension_protocol_handler(state, app_handle, &request) {
                    Ok(response) => response,
                    Err(e) => {
                        eprintln!(
@@ -38,11 +38,10 @@ pub fn run() {
                            .status(500)
                            .header("Content-Type", "text/plain")
                            .body(Vec::from(format!(
-                               "Interner Serverfehler im Protokollhandler: {}",
-                               e
+                               "Interner Serverfehler im Protokollhandler: {e}"
                            )))
                            .unwrap_or_else(|build_err| {
-                               eprintln!("Konnte Fehler-Response nicht erstellen: {}", build_err);
+                               eprintln!("Konnte Fehler-Response nicht erstellen: {build_err}");
                                tauri::http::Response::builder()
                                    .status(500)
                                    .body(Vec::new())
@@ -68,15 +67,19 @@ pub fn run() {
        .invoke_handler(tauri::generate_handler![
            database::create_encrypted_database,
            database::delete_vault,
+           database::move_vault_to_trash,
            database::list_vaults,
            database::open_encrypted_database,
+           database::sql_execute_with_crdt,
            database::sql_execute,
+           database::sql_query_with_crdt,
+           database::sql_select_with_crdt,
            database::sql_select,
            database::vault_exists,
            extension::database::extension_sql_execute,
            extension::database::extension_sql_select,
-           extension::get_all_extensions,
            extension::get_all_dev_extensions,
+           extension::get_all_extensions,
            extension::get_extension_info,
            extension::install_extension_with_permissions,
            extension::is_extension_installed,


@@ -1,13 +1,13 @@
{
  "$schema": "https://schema.tauri.app/config/2",
  "productName": "haex-hub",
- "version": "0.1.0",
+ "version": "0.1.4",
  "identifier": "space.haex.hub",
  "build": {
    "beforeDevCommand": "pnpm dev",
    "devUrl": "http://localhost:3003",
    "beforeBuildCommand": "pnpm generate",
-   "frontendDist": "../dist"
+   "frontendDist": "../.output/public"
  },
  "app": {
@@ -20,16 +20,21 @@
    ],
    "security": {
      "csp": {
-       "default-src": ["'self'", "http://tauri.localhost", "haex-extension:"],
+       "default-src": ["'self'", "http://tauri.localhost", "https://tauri.localhost", "asset:", "haex-extension:"],
        "script-src": [
          "'self'",
          "http://tauri.localhost",
+         "https://tauri.localhost",
+         "asset:",
          "haex-extension:",
-         "'wasm-unsafe-eval'"
+         "'wasm-unsafe-eval'",
+         "'unsafe-inline'"
        ],
        "style-src": [
          "'self'",
          "http://tauri.localhost",
+         "https://tauri.localhost",
+         "asset:",
          "haex-extension:",
          "'unsafe-inline'"
        ],
@@ -44,20 +49,22 @@
        "img-src": [
          "'self'",
          "http://tauri.localhost",
+         "https://tauri.localhost",
+         "asset:",
          "haex-extension:",
          "data:",
          "blob:"
        ],
-       "font-src": ["'self'", "http://tauri.localhost", "haex-extension:"],
+       "font-src": ["'self'", "http://tauri.localhost", "https://tauri.localhost", "asset:", "haex-extension:"],
        "object-src": ["'none'"],
-       "media-src": ["'self'", "http://tauri.localhost", "haex-extension:"],
+       "media-src": ["'self'", "http://tauri.localhost", "https://tauri.localhost", "asset:", "haex-extension:"],
        "frame-src": ["haex-extension:"],
        "frame-ancestors": ["'none'"],
        "base-uri": ["'self'"]
      },
      "assetProtocol": {
        "enable": true,
-       "scope": ["$APPDATA", "$RESOURCE"]
+       "scope": ["$APPDATA", "$RESOURCE", "$APPLOCALDATA/**"]
      }
    }
  },
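Side note (illustrative, not part of this diff): with the widened asset protocol scope and the added asset: CSP sources above, a file stored under $APPLOCALDATA can be handed to the webview roughly as in the following TypeScript sketch; the files/background.png path is purely hypothetical.

import { convertFileSrc } from '@tauri-apps/api/core'
import { appLocalDataDir, join } from '@tauri-apps/api/path'

// Resolve a file inside $APPLOCALDATA and convert it to an asset URL
// (asset://localhost/... or http://asset.localhost/..., depending on platform)
// that the "$APPLOCALDATA/**" scope entry allows the webview to load.
const dir = await appLocalDataDir()
const filePath = await join(dir, 'files', 'background.png') // hypothetical file
const backgroundUrl = convertFileSrc(filePath)
// e.g. apply it as an inline style: { backgroundImage: `url('${backgroundUrl}')` }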


@@ -2,7 +2,9 @@ export default defineAppConfig({
  ui: {
    colors: {
      primary: 'sky',
-     secondary: 'purple',
+     secondary: 'fuchsia',
+     warning: 'yellow',
+     danger: 'red',
    },
  },
})


@@ -1,8 +1,8 @@
<template>
  <UApp :locale="locales[locale]">
-   <NuxtLayout>
+   <div data-vaul-drawer-wrapper>
      <NuxtPage />
-   </NuxtLayout>
+   </div>
  </UApp>
</template>


@@ -13,8 +13,48 @@
  [disabled] {
    @apply cursor-not-allowed;
  }
/* Define safe-area-insets as CSS custom properties for JavaScript access */
:root {
--safe-area-inset-top: env(safe-area-inset-top, 0px);
--safe-area-inset-bottom: env(safe-area-inset-bottom, 0px);
--safe-area-inset-left: env(safe-area-inset-left, 0px);
--safe-area-inset-right: env(safe-area-inset-right, 0px);
}
/* Prevent scrolling on html and body */
html {
overflow: hidden;
margin: 0;
padding: 0;
height: 100dvh;
height: 100vh; /* Fallback */
width: 100%;
}
body {
overflow: hidden;
margin: 0;
height: 100%;
width: 100%;
padding: 0;
}
#__nuxt {
/* Full height of the body */
height: 100%;
width: 100%;
/* Safe-area paddings on the root element so that EVERYTHING benefits from them */
padding-top: var(--safe-area-inset-top);
padding-bottom: var(--safe-area-inset-bottom);
padding-left: var(--safe-area-inset-left);
padding-right: var(--safe-area-inset-right);
box-sizing: border-box;
}
  }
}

-:root {
-  --ui-header-height: 74px;
+@theme {
+  --spacing-header: 3.5rem; /* 72px - or your preferred value */
}
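Side note (illustrative, not from the commit): the --safe-area-inset-* custom properties defined above can be read back from JavaScript; the helper below is a hypothetical TypeScript sketch.

// Read one of the :root --safe-area-inset-* custom properties as a number.
function safeAreaInset(side: 'top' | 'bottom' | 'left' | 'right'): number {
  const raw = getComputedStyle(document.documentElement).getPropertyValue(
    `--safe-area-inset-${side}`,
  )
  return Number.parseFloat(raw) || 0 // "34px" -> 34, empty env() fallback -> 0
}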


@ -0,0 +1,61 @@
<template>
<div
v-if="data"
class="fixed top-2 right-2 bg-black/90 text-white text-xs p-3 rounded-lg shadow-2xl max-w-sm z-[9999] backdrop-blur-sm"
>
<div class="flex justify-between items-start gap-3 mb-2">
<span class="font-bold text-sm">{{ title }}</span>
<div class="flex gap-1">
<button
class="bg-white/20 hover:bg-white/30 px-2 py-1 rounded text-xs transition-colors"
@click="copyToClipboardAsync"
>
Copy
</button>
<button
v-if="dismissible"
class="bg-white/20 hover:bg-white/30 px-2 py-1 rounded text-xs transition-colors"
@click="handleDismiss"
>
</button>
</div>
</div>
<pre class="text-xs whitespace-pre-wrap font-mono overflow-auto max-h-96">{{ formattedData }}</pre>
</div>
</template>
<script setup lang="ts">
const props = withDefaults(
defineProps<{
data: Record<string, any> | null
title?: string
dismissible?: boolean
}>(),
{
title: 'Debug Info',
dismissible: false,
},
)
const emit = defineEmits<{
dismiss: []
}>()
const formattedData = computed(() => {
if (!props.data) return ''
return JSON.stringify(props.data, null, 2)
})
const copyToClipboardAsync = async () => {
try {
await navigator.clipboard.writeText(formattedData.value)
} catch (err) {
console.error('Failed to copy debug info:', err)
}
}
const handleDismiss = () => {
emit('dismiss')
}
</script>


@ -0,0 +1,169 @@
<template>
<div class="w-full h-full relative">
<!-- Error overlay for dev extensions when server is not reachable -->
<div
v-if="extension?.devServerUrl && hasError"
class="absolute inset-0 bg-white dark:bg-gray-900 flex items-center justify-center p-8"
>
<div class="max-w-md space-y-4 text-center">
<UIcon
name="i-heroicons-exclamation-circle"
class="w-16 h-16 mx-auto text-yellow-500"
/>
<h3 class="text-lg font-semibold">Dev Server Not Reachable</h3>
<p class="text-sm opacity-70">
The dev server at {{ extension.devServerUrl }} is not reachable.
</p>
<div
class="bg-gray-100 dark:bg-gray-800 p-4 rounded text-left text-xs font-mono"
>
<p class="opacity-70 mb-2">To start the dev server:</p>
<code class="block">cd /path/to/extension</code>
<code class="block">npm run dev</code>
</div>
<UButton
label="Retry"
@click="retryLoad"
/>
</div>
</div>
<!-- Loading Spinner -->
<div
v-if="isLoading"
class="absolute inset-0 bg-white dark:bg-gray-900 flex items-center justify-center"
>
<div class="flex flex-col items-center gap-4">
<div
class="animate-spin rounded-full h-12 w-12 border-b-2 border-blue-500"
/>
<p class="text-sm text-gray-600 dark:text-gray-400">
Loading extension...
</p>
</div>
</div>
<iframe
ref="iframeRef"
:class="[
'w-full h-full border-0 transition-all duration-1000 ease-out',
isLoading ? 'opacity-0 scale-0' : 'opacity-100 scale-100',
]"
:src="extensionUrl"
:sandbox="sandboxAttributes"
allow="autoplay; speaker-selection; encrypted-media;"
@load="handleIframeLoad"
@error="hasError = true"
/>
</div>
</template>
<script setup lang="ts">
import {
EXTENSION_PROTOCOL_PREFIX,
EXTENSION_PROTOCOL_NAME,
} from '~/config/constants'
const props = defineProps<{
extensionId: string
windowId: string
}>()
const extensionsStore = useExtensionsStore()
const { platform } = useDeviceStore()
const iframeRef = useTemplateRef('iframeRef')
const hasError = ref(false)
const isLoading = ref(true)
// Convert windowId to ref for reactive tracking
const windowIdRef = toRef(props, 'windowId')
const extension = computed(() => {
return extensionsStore.availableExtensions.find(
(ext) => ext.id === props.extensionId,
)
})
const handleIframeLoad = () => {
// Delay the fade-in slightly to allow window animation to mostly complete
setTimeout(() => {
isLoading.value = false
}, 200)
}
const sandboxDefault = ['allow-scripts'] as const
const sandboxAttributes = computed(() => {
return extension.value?.devServerUrl
? [...sandboxDefault, 'allow-same-origin'].join(' ')
: sandboxDefault.join(' ')
})
// Generate extension URL
const extensionUrl = computed(() => {
if (!extension.value) return ''
const { publicKey, name, version, devServerUrl } = extension.value
const assetPath = 'index.html'
if (!publicKey || !name || !version) {
console.error('Missing required extension fields')
return ''
}
// If dev server URL is provided, load directly from dev server
if (devServerUrl) {
const cleanUrl = devServerUrl.replace(/\/$/, '')
const cleanPath = assetPath.replace(/^\//, '')
return cleanPath ? `${cleanUrl}/${cleanPath}` : cleanUrl
}
const extensionInfo = {
name,
publicKey,
version,
}
const encodedInfo = btoa(JSON.stringify(extensionInfo))
if (platform === 'android' || platform === 'windows') {
// Android: Tauri uses http://{scheme}.localhost format
return `http://${EXTENSION_PROTOCOL_NAME}.localhost/${encodedInfo}/${assetPath}`
} else {
// Desktop: Use custom protocol with base64 as host
return `${EXTENSION_PROTOCOL_PREFIX}${encodedInfo}/${assetPath}`
}
})
const retryLoad = () => {
hasError.value = false
if (iframeRef.value) {
//iframeRef.value.src = iframeRef.value.src // Force reload
}
}
// Initialize extension message handler to set up context
useExtensionMessageHandler(iframeRef, extension, windowIdRef)
// Additional explicit registration on mount to ensure iframe is registered
onMounted(() => {
// Wait for iframe to be ready
if (iframeRef.value && extension.value) {
console.log(
'[ExtensionFrame] Manually registering iframe on mount',
extension.value.name,
'windowId:',
props.windowId,
)
registerExtensionIFrame(iframeRef.value, extension.value, props.windowId)
}
})
// Explicit cleanup before unmount
onBeforeUnmount(() => {
if (iframeRef.value) {
console.log('[ExtensionFrame] Unregistering iframe on unmount')
unregisterExtensionIFrame(iframeRef.value)
}
})
</script>


@ -1,35 +1,75 @@
<template> <template>
<UContextMenu :items="contextMenuItems"> <div>
<div <UiDialogConfirm
ref="draggableEl" v-model:open="showUninstallDialog"
:style="style" :title="t('confirmUninstall.title')"
class="select-none cursor-grab active:cursor-grabbing" :description="t('confirmUninstall.message', { name: label })"
@pointerdown="handlePointerDown" :confirm-label="t('confirmUninstall.confirm')"
@pointermove="handlePointerMove" :abort-label="t('confirmUninstall.cancel')"
@pointerup="handlePointerUp" confirm-icon="i-heroicons-trash"
@dblclick="handleOpen" @confirm="handleConfirmUninstall"
> />
<div class="flex flex-col items-center gap-1 p-2">
<div <UContextMenu :items="contextMenuItems">
class="w-16 h-16 flex items-center justify-center bg-white/90 dark:bg-gray-800/90 rounded-lg shadow-lg hover:shadow-xl transition-shadow" <div
> ref="draggableEl"
<img v-if="icon" :src="icon" :alt="label" class="w-12 h-12 object-contain" /> :style="style"
<Icon v-else name="i-heroicons-puzzle-piece-solid" class="w-12 h-12 text-gray-500" /> class="select-none cursor-grab active:cursor-grabbing"
@pointerdown.left="handlePointerDown"
@pointermove="handlePointerMove"
@pointerup="handlePointerUp"
@click.left="handleClick"
@dblclick="handleDoubleClick"
>
<div class="flex flex-col items-center gap-2 p-3 group">
<div
:class="[
'w-20 h-20 flex items-center justify-center rounded-2xl transition-all duration-200 ease-out',
'backdrop-blur-sm border',
isSelected
? 'bg-white/95 dark:bg-gray-800/95 border-blue-500 dark:border-blue-400 shadow-lg scale-105'
: 'bg-white/80 dark:bg-gray-800/80 border-gray-200/50 dark:border-gray-700/50 hover:bg-white/90 dark:hover:bg-gray-800/90 hover:border-gray-300 dark:hover:border-gray-600 hover:shadow-md hover:scale-105',
]"
>
<img
v-if="icon"
:src="icon"
:alt="label"
class="w-14 h-14 object-contain transition-transform duration-200"
:class="{ 'scale-110': isSelected }"
/>
<UIcon
v-else
name="i-heroicons-puzzle-piece-solid"
:class="[
'w-14 h-14 transition-all duration-200',
isSelected
? 'text-blue-500 dark:text-blue-400 scale-110'
: 'text-gray-400 dark:text-gray-500 group-hover:text-gray-500 dark:group-hover:text-gray-400',
]"
/>
</div>
<span
:class="[
'text-xs text-center max-w-24 truncate px-3 py-1.5 rounded-lg transition-all duration-200',
'backdrop-blur-sm',
isSelected
? 'bg-white/95 dark:bg-gray-800/95 text-gray-900 dark:text-gray-100 font-medium shadow-md'
: 'bg-white/70 dark:bg-gray-800/70 text-gray-700 dark:text-gray-300 group-hover:bg-white/85 dark:group-hover:bg-gray-800/85',
]"
>
{{ label }}
</span>
</div> </div>
<span
class="text-xs text-center max-w-20 truncate bg-white/80 dark:bg-gray-800/80 px-2 py-1 rounded shadow"
>
{{ label }}
</span>
</div> </div>
</div> </UContextMenu>
</UContextMenu> </div>
</template> </template>
<script setup lang="ts"> <script setup lang="ts">
const props = defineProps<{ const props = defineProps<{
id: string id: string
itemType: 'extension' | 'file' | 'folder' itemType: DesktopItemType
referenceId: string referenceId: string
initialX: number initialX: number
initialY: number initialY: number
@ -39,24 +79,51 @@ const props = defineProps<{
const emit = defineEmits<{ const emit = defineEmits<{
positionChanged: [id: string, x: number, y: number] positionChanged: [id: string, x: number, y: number]
open: [itemType: string, referenceId: string]
uninstall: [itemType: string, referenceId: string]
dragStart: [id: string, itemType: string, referenceId: string] dragStart: [id: string, itemType: string, referenceId: string]
dragEnd: [] dragEnd: []
}>() }>()
const desktopStore = useDesktopStore() const desktopStore = useDesktopStore()
const showUninstallDialog = ref(false)
const { t } = useI18n()
const isSelected = computed(() => desktopStore.isItemSelected(props.id))
const handleClick = (e: MouseEvent) => {
// Prevent selection during drag
if (isDragging.value) return
desktopStore.toggleSelection(props.id, e.ctrlKey || e.metaKey)
}
const handleUninstallClick = () => {
showUninstallDialog.value = true
}
const handleConfirmUninstall = async () => {
showUninstallDialog.value = false
await desktopStore.uninstallDesktopItem(
props.id,
props.itemType,
props.referenceId,
)
}
const contextMenuItems = computed(() => const contextMenuItems = computed(() =>
desktopStore.getContextMenuItems( desktopStore.getContextMenuItems(
props.id, props.id,
props.itemType, props.itemType,
props.referenceId, props.referenceId,
handleOpen, handleUninstallClick,
handleUninstall,
), ),
) )
// Inject viewport size from parent desktop
const viewportSize = inject<{
width: Ref<number>
height: Ref<number>
}>('viewportSize')
const draggableEl = ref<HTMLElement>() const draggableEl = ref<HTMLElement>()
const x = ref(props.initialX) const x = ref(props.initialX)
const y = ref(props.initialY) const y = ref(props.initialY)
@ -64,6 +131,10 @@ const isDragging = ref(false)
const offsetX = ref(0) const offsetX = ref(0)
const offsetY = ref(0) const offsetY = ref(0)
// Icon dimensions (approximate)
const iconWidth = 120 // Matches design in template
const iconHeight = 140
const style = computed(() => ({ const style = computed(() => ({
position: 'absolute' as const, position: 'absolute' as const,
left: `${x.value}px`, left: `${x.value}px`,
@ -105,15 +176,52 @@ const handlePointerUp = (e: PointerEvent) => {
if (draggableEl.value) { if (draggableEl.value) {
draggableEl.value.releasePointerCapture(e.pointerId) draggableEl.value.releasePointerCapture(e.pointerId)
} }
// Snap icon to viewport bounds if outside
if (viewportSize) {
const maxX = Math.max(0, viewportSize.width.value - iconWidth)
const maxY = Math.max(0, viewportSize.height.value - iconHeight)
x.value = Math.max(0, Math.min(maxX, x.value))
y.value = Math.max(0, Math.min(maxY, y.value))
}
emit('dragEnd') emit('dragEnd')
emit('positionChanged', props.id, x.value, y.value) emit('positionChanged', props.id, x.value, y.value)
} }
const handleOpen = () => { const handleDoubleClick = () => {
emit('open', props.itemType, props.referenceId) // Get icon position and size for animation
} if (draggableEl.value) {
const rect = draggableEl.value.getBoundingClientRect()
const handleUninstall = () => { const sourcePosition = {
emit('uninstall', props.itemType, props.referenceId) x: rect.left,
y: rect.top,
width: rect.width,
height: rect.height,
}
desktopStore.openDesktopItem(
props.itemType,
props.referenceId,
sourcePosition,
)
} else {
desktopStore.openDesktopItem(props.itemType, props.referenceId)
}
} }
</script> </script>
<i18n lang="yaml">
de:
confirmUninstall:
title: Erweiterung deinstallieren
message: Möchten Sie die Erweiterung '{name}' wirklich deinstallieren? Diese Aktion kann nicht rückgängig gemacht werden.
confirm: Deinstallieren
cancel: Abbrechen
en:
confirmUninstall:
title: Uninstall Extension
message: Do you really want to uninstall the extension '{name}'? This action cannot be undone.
confirm: Uninstall
cancel: Cancel
</i18n>


@ -1,175 +1,374 @@
<template> <template>
<div <div
class="w-full h-full relative overflow-hidden bg-gradient-to-br from-blue-50 to-blue-100 dark:from-gray-900 dark:to-gray-800" ref="desktopEl"
class="absolute inset-0 overflow-hidden"
> >
<!-- Dropzones (only visible during drag) --> <Swiper
<Transition name="slide-down"> :modules="[SwiperNavigation]"
<div :slides-per-view="1"
v-if="isDragging" :space-between="0"
class="absolute top-0 left-0 right-0 flex gap-2 p-4 z-50" :initial-slide="currentWorkspaceIndex"
:speed="300"
:touch-angle="45"
:no-swiping="true"
no-swiping-class="no-swipe"
:allow-touch-move="allowSwipe"
class="h-full w-full"
direction="vertical"
@swiper="onSwiperInit"
@slide-change="onSlideChange"
>
<SwiperSlide
v-for="workspace in workspaces"
:key="workspace.id"
class="w-full h-full"
> >
<!-- Remove from Desktop Dropzone --> <UContextMenu :items="getWorkspaceContextMenuItems(workspace.id)">
<div <div
ref="removeDropzoneEl" class="w-full h-full relative"
class="flex-1 h-20 flex items-center justify-center gap-2 rounded-lg border-2 border-dashed transition-all" :style="getWorkspaceBackgroundStyle(workspace)"
:class=" @click.self.stop="handleDesktopClick"
isOverRemoveZone @mousedown.left.self="handleAreaSelectStart"
? 'bg-orange-500/20 border-orange-500 dark:bg-orange-400/20 dark:border-orange-400' @dragover.prevent="handleDragOver"
: 'border-orange-500/50 dark:border-orange-400/50' @drop.prevent="handleDrop($event, workspace.id)"
"
>
<Icon
name="i-heroicons-x-mark"
class="w-6 h-6"
:class="
isOverRemoveZone
? 'text-orange-700 dark:text-orange-300'
: 'text-orange-600 dark:text-orange-400'
"
/>
<span
class="font-semibold"
:class="
isOverRemoveZone
? 'text-orange-700 dark:text-orange-300'
: 'text-orange-600 dark:text-orange-400'
"
> >
Von Desktop entfernen <!-- Grid Pattern Background -->
</span> <div
</div> class="absolute inset-0 pointer-events-none opacity-30"
:style="{
backgroundImage:
'linear-gradient(rgba(0, 0, 0, 0.1) 1px, transparent 1px), linear-gradient(90deg, rgba(0, 0, 0, 0.1) 1px, transparent 1px)',
backgroundSize: '32px 32px',
}"
/>
<!-- Uninstall Dropzone --> <!-- Snap Dropzones (only visible when window drag near edge) -->
<div
ref="uninstallDropzoneEl"
class="flex-1 h-20 flex items-center justify-center gap-2 rounded-lg border-2 border-dashed transition-all"
:class="
isOverUninstallZone
? 'bg-red-500/20 border-red-500 dark:bg-red-400/20 dark:border-red-400'
: 'border-red-500/50 dark:border-red-400/50'
"
>
<Icon
name="i-heroicons-trash"
class="w-6 h-6"
:class="
isOverUninstallZone
? 'text-red-700 dark:text-red-300'
: 'text-red-600 dark:text-red-400'
"
/>
<span
class="font-semibold"
:class="
isOverUninstallZone
? 'text-red-700 dark:text-red-300'
: 'text-red-600 dark:text-red-400'
"
>
Deinstallieren
</span>
</div>
</div>
</Transition>
<HaexDesktopIcon <div
v-for="item in desktopItemIcons" class="absolute left-0 top-0 bottom-0 border-blue-500 pointer-events-none backdrop-blur-sm z-50 transition-all duration-500 ease-in-out"
:key="item.id" :class="
:id="item.id" showLeftSnapZone ? 'w-1/2 bg-blue-500/20 border-2' : 'w-0'
:item-type="item.itemType" "
:reference-id="item.referenceId" />
:initial-x="item.positionX"
:initial-y="item.positionY" <div
:label="item.label" class="absolute right-0 top-0 bottom-0 border-blue-500 pointer-events-none backdrop-blur-sm z-50 transition-all duration-500 ease-in-out"
:icon="item.icon" :class="
@position-changed="handlePositionChanged" showRightSnapZone ? 'w-1/2 bg-blue-500/20 border-2' : 'w-0'
@open="handleOpen" "
@drag-start="handleDragStart" />
@drag-end="handleDragEnd"
@uninstall="handleUninstall" <!-- Area Selection Box -->
/> <div
v-if="isAreaSelecting"
class="absolute bg-blue-500/20 border-2 border-blue-500 pointer-events-none z-30"
:style="selectionBoxStyle"
/>
<!-- Icons for this workspace -->
<HaexDesktopIcon
v-for="item in getWorkspaceIcons(workspace.id)"
:id="item.id"
:key="item.id"
:item-type="item.itemType"
:reference-id="item.referenceId"
:initial-x="item.positionX"
:initial-y="item.positionY"
:label="item.label"
:icon="item.icon"
class="no-swipe"
@position-changed="handlePositionChanged"
@drag-start="handleDragStart"
@drag-end="handleDragEnd"
/>
<!-- Windows for this workspace -->
<template
v-for="window in getWorkspaceWindows(workspace.id)"
:key="window.id"
>
<!-- Overview Mode: Teleport to window preview -->
<Teleport
v-if="
windowManager.showWindowOverview &&
overviewWindowState.has(window.id)
"
:to="`#window-preview-${window.id}`"
>
<div
class="absolute origin-top-left"
:style="{
transform: `scale(${overviewWindowState.get(window.id)!.scale})`,
width: `${overviewWindowState.get(window.id)!.width}px`,
height: `${overviewWindowState.get(window.id)!.height}px`,
}"
>
<HaexWindow
v-show="
windowManager.showWindowOverview || !window.isMinimized
"
:id="window.id"
v-model:x="overviewWindowState.get(window.id)!.x"
v-model:y="overviewWindowState.get(window.id)!.y"
v-model:width="overviewWindowState.get(window.id)!.width"
v-model:height="overviewWindowState.get(window.id)!.height"
:title="window.title"
:icon="window.icon"
:is-active="windowManager.isWindowActive(window.id)"
:source-x="window.sourceX"
:source-y="window.sourceY"
:source-width="window.sourceWidth"
:source-height="window.sourceHeight"
:is-opening="window.isOpening"
:is-closing="window.isClosing"
:warning-level="
window.type === 'extension' &&
availableExtensions.find(
(ext) => ext.id === window.sourceId,
)?.devServerUrl
? 'warning'
: undefined
"
class="no-swipe"
@close="windowManager.closeWindow(window.id)"
@minimize="windowManager.minimizeWindow(window.id)"
@activate="windowManager.activateWindow(window.id)"
@position-changed="
(x, y) =>
windowManager.updateWindowPosition(window.id, x, y)
"
@size-changed="
(width, height) =>
windowManager.updateWindowSize(window.id, width, height)
"
@drag-start="handleWindowDragStart(window.id)"
@drag-end="handleWindowDragEnd"
>
<!-- System Window: Render Vue Component -->
<component
:is="getSystemWindowComponent(window.sourceId)"
v-if="window.type === 'system'"
/>
<!-- Extension Window: Render iFrame -->
<HaexDesktopExtensionFrame
v-else
:extension-id="window.sourceId"
:window-id="window.id"
/>
</HaexWindow>
</div>
</Teleport>
<!-- Desktop Mode: Render directly in workspace -->
<HaexWindow
v-else
v-show="windowManager.showWindowOverview || !window.isMinimized"
:id="window.id"
v-model:x="window.x"
v-model:y="window.y"
v-model:width="window.width"
v-model:height="window.height"
:title="window.title"
:icon="window.icon"
:is-active="windowManager.isWindowActive(window.id)"
:source-x="window.sourceX"
:source-y="window.sourceY"
:source-width="window.sourceWidth"
:source-height="window.sourceHeight"
:is-opening="window.isOpening"
:is-closing="window.isClosing"
:warning-level="
window.type === 'extension' &&
availableExtensions.find((ext) => ext.id === window.sourceId)
?.devServerUrl
? 'warning'
: undefined
"
class="no-swipe"
@close="windowManager.closeWindow(window.id)"
@minimize="windowManager.minimizeWindow(window.id)"
@activate="windowManager.activateWindow(window.id)"
@position-changed="
(x, y) => windowManager.updateWindowPosition(window.id, x, y)
"
@size-changed="
(width, height) =>
windowManager.updateWindowSize(window.id, width, height)
"
@drag-start="handleWindowDragStart(window.id)"
@drag-end="handleWindowDragEnd"
>
<!-- System Window: Render Vue Component -->
<component
:is="getSystemWindowComponent(window.sourceId)"
v-if="window.type === 'system'"
/>
<!-- Extension Window: Render iFrame -->
<HaexDesktopExtensionFrame
v-else
:extension-id="window.sourceId"
:window-id="window.id"
/>
</HaexWindow>
</template>
</div>
</UContextMenu>
</SwiperSlide>
</Swiper>
<!-- Window Overview Modal -->
<HaexWindowOverview />
</div> </div>
</template> </template>
<script setup lang="ts"> <script setup lang="ts">
import { Swiper, SwiperSlide } from 'swiper/vue'
import { Navigation } from 'swiper/modules'
import type { Swiper as SwiperType } from 'swiper'
import 'swiper/css'
import 'swiper/css/navigation'
const SwiperNavigation = Navigation
const desktopStore = useDesktopStore() const desktopStore = useDesktopStore()
const extensionsStore = useExtensionsStore() const extensionsStore = useExtensionsStore()
const router = useRouter() const windowManager = useWindowManagerStore()
const workspaceStore = useWorkspaceStore()
const { desktopItems } = storeToRefs(desktopStore) const { desktopItems } = storeToRefs(desktopStore)
const { availableExtensions } = storeToRefs(extensionsStore) const { availableExtensions } = storeToRefs(extensionsStore)
const {
currentWorkspace,
currentWorkspaceIndex,
workspaces,
swiperInstance,
allowSwipe,
isOverviewMode,
} = storeToRefs(workspaceStore)
const { getWorkspaceBackgroundStyle, getWorkspaceContextMenuItems } =
workspaceStore
// Drag state const { x: mouseX } = useMouse()
const desktopEl = useTemplateRef('desktopEl')
// Track desktop viewport size reactively
const { width: viewportWidth, height: viewportHeight } =
useElementSize(desktopEl)
// Provide viewport size to child windows
provide('viewportSize', {
width: viewportWidth,
height: viewportHeight,
})
// Area selection state
const isAreaSelecting = ref(false)
const selectionStart = ref({ x: 0, y: 0 })
const selectionEnd = ref({ x: 0, y: 0 })
const selectionBoxStyle = computed(() => {
const x1 = Math.min(selectionStart.value.x, selectionEnd.value.x)
const y1 = Math.min(selectionStart.value.y, selectionEnd.value.y)
const x2 = Math.max(selectionStart.value.x, selectionEnd.value.x)
const y2 = Math.max(selectionStart.value.y, selectionEnd.value.y)
return {
left: `${x1}px`,
top: `${y1}px`,
width: `${x2 - x1}px`,
height: `${y2 - y1}px`,
}
})
// Drag state for desktop icons
const isDragging = ref(false) const isDragging = ref(false)
const currentDraggedItemId = ref<string>() const currentDraggedItemId = ref<string>()
const currentDraggedItemType = ref<string>() const currentDraggedItemType = ref<string>()
const currentDraggedReferenceId = ref<string>() const currentDraggedReferenceId = ref<string>()
// Dropzone refs // Window drag state for snap zones
const removeDropzoneEl = ref<HTMLElement>() const isWindowDragging = ref(false)
const uninstallDropzoneEl = ref<HTMLElement>() const snapEdgeThreshold = 50 // pixels from edge to show snap zone
// Setup dropzones with VueUse // Computed visibility for snap zones (uses mouseX from above)
const { isOverDropZone: isOverRemoveZone } = useDropZone(removeDropzoneEl, { const showLeftSnapZone = computed(() => {
onDrop: () => { return isWindowDragging.value && mouseX.value <= snapEdgeThreshold
if (currentDraggedItemId.value) {
handleRemoveFromDesktop(currentDraggedItemId.value)
}
},
}) })
const { isOverDropZone: isOverUninstallZone } = useDropZone(uninstallDropzoneEl, { const showRightSnapZone = computed(() => {
onDrop: () => { if (!isWindowDragging.value) return false
if (currentDraggedItemType.value && currentDraggedReferenceId.value) { const viewportWidth = window.innerWidth
handleUninstall(currentDraggedItemType.value, currentDraggedReferenceId.value) return mouseX.value >= viewportWidth - snapEdgeThreshold
}
},
}) })
interface DesktopItemIcon extends IDesktopItem { // Get icons for a specific workspace
label: string const getWorkspaceIcons = (workspaceId: string) => {
icon?: string return desktopItems.value
.filter((item) => item.workspaceId === workspaceId)
.map((item) => {
if (item.itemType === 'system') {
const systemWindow = windowManager
.getAllSystemWindows()
.find((win) => win.id === item.referenceId)
return {
...item,
label: systemWindow?.name || 'Unknown',
icon: systemWindow?.icon || '',
}
}
if (item.itemType === 'extension') {
const extension = availableExtensions.value.find(
(ext) => ext.id === item.referenceId,
)
console.log('found ext', extension)
return {
...item,
label: extension?.name || 'Unknown',
icon: extension?.icon || '',
}
}
if (item.itemType === 'file') {
// Für später: file handling
return {
...item,
label: item.referenceId,
icon: undefined,
}
}
if (item.itemType === 'folder') {
// Für später: folder handling
return {
...item,
label: item.referenceId,
icon: undefined,
}
}
return {
...item,
label: item.referenceId,
icon: undefined,
}
})
} }
const desktopItemIcons = computed<DesktopItemIcon[]>(() => { // Get windows for a specific workspace (including minimized for teleport)
return desktopItems.value.map((item) => { const getWorkspaceWindows = (workspaceId: string) => {
if (item.itemType === 'extension') { return windowManager.windows.filter((w) => w.workspaceId === workspaceId)
const extension = availableExtensions.value.find( }
(ext) => ext.id === item.referenceId,
)
return { // Get Vue Component for system window
...item, const getSystemWindowComponent = (sourceId: string) => {
label: extension?.name || 'Unknown', const systemWindow = windowManager.getSystemWindow(sourceId)
icon: extension?.icon || '', return systemWindow?.component
} }
}
if (item.itemType === 'file') {
// Für später: file handling
return {
...item,
label: item.referenceId,
icon: undefined,
}
}
if (item.itemType === 'folder') {
// Für später: folder handling
return {
...item,
label: item.referenceId,
icon: undefined,
}
}
return {
...item,
label: item.referenceId,
icon: undefined,
}
})
})
const handlePositionChanged = async (id: string, x: number, y: number) => { const handlePositionChanged = async (id: string, x: number, y: number) => {
try { try {
@ -179,55 +378,306 @@ const handlePositionChanged = async (id: string, x: number, y: number) => {
} }
} }
const localePath = useLocalePath()
const handleOpen = (itemType: string, referenceId: string) => {
if (itemType === 'extension') {
router.push(
localePath({
name: 'extension',
params: { extensionId: referenceId },
})
)
}
// Für später: file und folder handling
}
const handleDragStart = (id: string, itemType: string, referenceId: string) => { const handleDragStart = (id: string, itemType: string, referenceId: string) => {
isDragging.value = true isDragging.value = true
currentDraggedItemId.value = id currentDraggedItemId.value = id
currentDraggedItemType.value = itemType currentDraggedItemType.value = itemType
currentDraggedReferenceId.value = referenceId currentDraggedReferenceId.value = referenceId
allowSwipe.value = false // Disable Swiper during icon drag
} }
const handleDragEnd = () => { const handleDragEnd = async () => {
// Cleanup drag state
isDragging.value = false isDragging.value = false
currentDraggedItemId.value = undefined currentDraggedItemId.value = undefined
currentDraggedItemType.value = undefined currentDraggedItemType.value = undefined
currentDraggedReferenceId.value = undefined currentDraggedReferenceId.value = undefined
allowSwipe.value = true // Re-enable Swiper after drag
} }
const handleUninstall = async (itemType: string, referenceId: string) => { // Handle drag over for launcher items
if (itemType === 'extension') { const handleDragOver = (event: DragEvent) => {
try { if (!event.dataTransfer) return
const extension = availableExtensions.value.find((ext) => ext.id === referenceId)
if (extension) { // Check if this is a launcher item
await extensionsStore.removeExtensionAsync( if (event.dataTransfer.types.includes('application/haex-launcher-item')) {
extension.publicKey, event.dataTransfer.dropEffect = 'copy'
extension.name, }
extension.version, }
)
// Reload extensions after uninstall // Handle drop for launcher items
await extensionsStore.loadExtensionsAsync() const handleDrop = async (event: DragEvent, workspaceId: string) => {
} if (!event.dataTransfer) return
} catch (error) {
console.error('Fehler beim Deinstallieren:', error) const launcherItemData = event.dataTransfer.getData(
'application/haex-launcher-item',
)
if (!launcherItemData) return
try {
const item = JSON.parse(launcherItemData) as {
id: string
name: string
icon: string
type: 'system' | 'extension'
}
// Get drop position relative to desktop
const desktopRect = (
event.currentTarget as HTMLElement
).getBoundingClientRect()
const x = Math.max(0, event.clientX - desktopRect.left - 32) // Center icon (64px / 2)
const y = Math.max(0, event.clientY - desktopRect.top - 32)
// Create desktop icon on the specific workspace
await desktopStore.addDesktopItemAsync(
item.type as DesktopItemType,
item.id,
x,
y,
workspaceId,
)
} catch (error) {
console.error('Failed to create desktop icon:', error)
}
}
const handleDesktopClick = () => {
// Only clear selection if it was a simple click, not an area selection
// Check if we just finished an area selection (box size > threshold)
const boxWidth = Math.abs(selectionEnd.value.x - selectionStart.value.x)
const boxHeight = Math.abs(selectionEnd.value.y - selectionStart.value.y)
// If box is larger than 5px in any direction, it was an area select, not a click
if (boxWidth > 5 || boxHeight > 5) {
return
}
desktopStore.clearSelection()
isOverviewMode.value = false
}
const handleWindowDragStart = (windowId: string) => {
console.log('[Desktop] handleWindowDragStart:', windowId)
isWindowDragging.value = true
windowManager.draggingWindowId = windowId // Set in store for workspace cards
console.log(
'[Desktop] draggingWindowId set to:',
windowManager.draggingWindowId,
)
allowSwipe.value = false // Disable Swiper during window drag
}
const handleWindowDragEnd = async () => {
console.log('[Desktop] handleWindowDragEnd')
// Check if window should snap to left or right
const draggingWindowId = windowManager.draggingWindowId
if (draggingWindowId) {
if (showLeftSnapZone.value) {
// Snap to left half
windowManager.updateWindowPosition(draggingWindowId, 0, 0)
windowManager.updateWindowSize(
draggingWindowId,
viewportWidth.value / 2,
viewportHeight.value,
)
} else if (showRightSnapZone.value) {
// Snap to right half
windowManager.updateWindowPosition(
draggingWindowId,
viewportWidth.value / 2,
0,
)
windowManager.updateWindowSize(
draggingWindowId,
viewportWidth.value / 2,
viewportHeight.value,
)
} }
} }
// Für später: file und folder handling
isWindowDragging.value = false
windowManager.draggingWindowId = null // Clear from store
allowSwipe.value = true // Re-enable Swiper after drag
} }
// Area selection handlers
const handleAreaSelectStart = (e: MouseEvent) => {
if (!desktopEl.value) return
const rect = desktopEl.value.getBoundingClientRect()
const x = e.clientX - rect.left
const y = e.clientY - rect.top
isAreaSelecting.value = true
selectionStart.value = { x, y }
selectionEnd.value = { x, y }
// Clear current selection
desktopStore.clearSelection()
}
// Track mouse movement for area selection
useEventListener(window, 'mousemove', (e: MouseEvent) => {
if (isAreaSelecting.value && desktopEl.value) {
const rect = desktopEl.value.getBoundingClientRect()
const x = e.clientX - rect.left
const y = e.clientY - rect.top
selectionEnd.value = { x, y }
// Find all items within selection box
selectItemsInBox()
}
})
// End area selection
useEventListener(window, 'mouseup', () => {
if (isAreaSelecting.value) {
isAreaSelecting.value = false
// Reset selection coordinates after a short delay
// This allows handleDesktopClick to still check the box size
setTimeout(() => {
selectionStart.value = { x: 0, y: 0 }
selectionEnd.value = { x: 0, y: 0 }
}, 100)
}
})
const selectItemsInBox = () => {
const x1 = Math.min(selectionStart.value.x, selectionEnd.value.x)
const y1 = Math.min(selectionStart.value.y, selectionEnd.value.y)
const x2 = Math.max(selectionStart.value.x, selectionEnd.value.x)
const y2 = Math.max(selectionStart.value.y, selectionEnd.value.y)
desktopStore.clearSelection()
desktopItems.value.forEach((item) => {
// Check if item position is within selection box
const itemX = item.positionX + 60 // Icon center (approx)
const itemY = item.positionY + 60
if (itemX >= x1 && itemX <= x2 && itemY >= y1 && itemY <= y2) {
desktopStore.toggleSelection(item.id, true) // true = add to selection
}
})
}
// Swiper event handlers
const onSwiperInit = (swiper: SwiperType) => {
swiperInstance.value = swiper
}
const onSlideChange = (swiper: SwiperType) => {
workspaceStore.switchToWorkspace(
workspaceStore.workspaces.at(swiper.activeIndex)?.id,
)
}
/* const handleRemoveWorkspace = async () => {
if (!currentWorkspace.value || workspaces.value.length <= 1) return
const currentIndex = currentWorkspaceIndex.value
await workspaceStore.removeWorkspaceAsync(currentWorkspace.value.id)
// Slide to adjusted index
nextTick(() => {
if (swiperInstance.value) {
const newIndex = Math.min(currentIndex, workspaces.value.length - 1)
swiperInstance.value.slideTo(newIndex)
}
})
}
const handleDropWindowOnWorkspace = async (
event: DragEvent,
targetWorkspaceId: string,
) => {
// Get the window ID from drag data (will be set when we implement window dragging)
const windowId = event.dataTransfer?.getData('windowId')
if (windowId) {
await moveWindowToWorkspace(windowId, targetWorkspaceId)
}
} */
// Overview Mode: Calculate grid positions and scale for windows
// Calculate preview dimensions for window overview
const MIN_PREVIEW_WIDTH = 300 // 50% increase from 200
const MAX_PREVIEW_WIDTH = 600 // 50% increase from 400
const MIN_PREVIEW_HEIGHT = 225 // 50% increase from 150
const MAX_PREVIEW_HEIGHT = 450 // 50% increase from 300
// Store window state for overview (position only, size stays original)
const overviewWindowState = ref(
new Map<
string,
{ x: number; y: number; width: number; height: number; scale: number }
>(),
)
// Calculate scale and card dimensions for each window
watch(
() => windowManager.showWindowOverview,
(isOpen) => {
if (isOpen) {
// Wait for the Overview modal to mount and create the teleport targets
nextTick(() => {
windowManager.windows.forEach((window) => {
const scaleX = MAX_PREVIEW_WIDTH / window.width
const scaleY = MAX_PREVIEW_HEIGHT / window.height
const scale = Math.min(scaleX, scaleY, 1)
// Ensure minimum card size
const scaledWidth = window.width * scale
const scaledHeight = window.height * scale
let finalScale = scale
if (scaledWidth < MIN_PREVIEW_WIDTH) {
finalScale = MIN_PREVIEW_WIDTH / window.width
}
if (scaledHeight < MIN_PREVIEW_HEIGHT) {
finalScale = Math.max(
finalScale,
MIN_PREVIEW_HEIGHT / window.height,
)
}
overviewWindowState.value.set(window.id, {
x: 0,
y: 0,
width: window.width,
height: window.height,
scale: finalScale,
})
})
})
} else {
// Clear state when overview is closed
overviewWindowState.value.clear()
}
},
)
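// Worked example (made-up numbers) of the preview clamping above: a 2400x600
// window gives scaleX = 600/2400 = 0.25 and scaleY = 450/600 = 0.75, so
// scale = 0.25 and the card would be 600x150; since 150 < MIN_PREVIEW_HEIGHT,
// finalScale becomes Math.max(0.25, 225/600) = 0.375, i.e. a 900x225 preview.
// Standalone sketch of the same logic, for illustration only:
// const previewScale = (w: number, h: number) => {
//   const scale = Math.min(MAX_PREVIEW_WIDTH / w, MAX_PREVIEW_HEIGHT / h, 1)
//   let finalScale = scale
//   if (w * scale < MIN_PREVIEW_WIDTH) finalScale = MIN_PREVIEW_WIDTH / w
//   if (h * scale < MIN_PREVIEW_HEIGHT)
//     finalScale = Math.max(finalScale, MIN_PREVIEW_HEIGHT / h)
//   return finalScale
// }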
// Disable Swiper in overview mode
watch(isOverviewMode, (newValue) => {
allowSwipe.value = !newValue
})
// Watch for workspace changes to reload desktop items
watch(currentWorkspace, async () => {
if (currentWorkspace.value) {
await desktopStore.loadDesktopItemsAsync()
}
})
onMounted(async () => { onMounted(async () => {
// Load workspaces first
await workspaceStore.loadWorkspacesAsync()
// Then load desktop items for current workspace
await desktopStore.loadDesktopItemsAsync() await desktopStore.loadDesktopItemsAsync()
}) })
</script> </script>
@ -247,4 +697,14 @@ onMounted(async () => {
opacity: 0; opacity: 0;
transform: translateY(-100%); transform: translateY(-100%);
} }
.fade-enter-active,
.fade-leave-active {
transition: opacity 0.2s ease;
}
.fade-enter-from,
.fade-leave-to {
opacity: 0;
}
</style> </style>


@@ -89,7 +89,11 @@ const removeExtensionAsync = async () => {
  }

  try {
-   await extensionStore.removeExtensionAsync(extension.id, extension.version)
+   await extensionStore.removeExtensionAsync(
+     extension.publicKey,
+     extension.name,
+     extension.version,
+   )

    await extensionStore.loadExtensionsAsync()
    add({

@@ -15,7 +15,7 @@
    <div class="flex items-start gap-4">
      <div
        v-if="preview?.manifest.icon"
-       class="w-16 h-16 flex-shrink-0"
+       class="w-16 h-16 shrink-0"
      >
        <UIcon
          :name="preview.manifest.icon"
@@ -184,7 +184,6 @@ const shellPermissions = computed({
  },
})

const permissionAccordionItems = computed(() => {
  const items = []


@ -1,114 +1,247 @@
<template> <template>
<UPopover v-model:open="open"> <UDrawer
v-model:open="open"
direction="right"
:title="t('launcher.title')"
:description="t('launcher.description')"
:overlay="false"
:modal="false"
:handle-only="true"
:ui="{
content: 'w-dvw max-w-md sm:max-w-fit',
}"
>
<UButton <UButton
icon="material-symbols:apps" icon="material-symbols:apps"
color="neutral" color="neutral"
variant="outline" variant="outline"
v-bind="$attrs" v-bind="$attrs"
size="xl" size="lg"
/> />
<template #content> <template #content>
<ul class="p-4 max-h-96 grid grid-cols-3 gap-2 overflow-scroll"> <div class="p-4 h-full overflow-y-auto">
<!-- Enabled Extensions --> <div class="flex flex-wrap">
<UiButton <!-- All launcher items (system windows + enabled extensions, alphabetically sorted) -->
v-for="extension in enabledExtensions" <UContextMenu
:key="extension.id" v-for="item in launcherItems"
square :key="item.id"
size="xl" :items="getContextMenuItems(item)"
variant="ghost" >
:ui="{ <UiButton
base: 'size-24 flex flex-wrap text-sm items-center justify-center overflow-visible', square
leadingIcon: 'size-10', size="lg"
label: 'w-full', variant="ghost"
}" :ui="{
:icon="extension.icon || 'i-heroicons-puzzle-piece-solid'" base: 'size-24 flex flex-wrap text-sm items-center justify-center overflow-visible cursor-grab',
:label="extension.name" leadingIcon: 'size-10',
:tooltip="extension.name" label: 'w-full',
@click="openExtension(extension.id)" }"
/> :icon="item.icon"
:label="item.name"
:tooltip="item.name"
draggable="true"
@click="openItem(item)"
@dragstart="handleDragStart($event, item)"
/>
</UContextMenu>
<!-- Disabled Extensions (grayed out) --> <!-- Disabled Extensions (grayed out) -->
<UiButton <UiButton
v-for="extension in disabledExtensions" v-for="extension in disabledExtensions"
:key="extension.id" :key="extension.id"
square square
size="xl" size="xl"
variant="ghost" variant="ghost"
:disabled="true" :disabled="true"
:ui="{ :ui="{
base: 'size-24 flex flex-wrap text-sm items-center justify-center overflow-visible opacity-40', base: 'size-24 flex flex-wrap text-sm items-center justify-center overflow-visible opacity-40',
leadingIcon: 'size-10', leadingIcon: 'size-10',
label: 'w-full', label: 'w-full',
}" }"
:icon="extension.icon || 'i-heroicons-puzzle-piece-solid'" :icon="extension.icon || 'i-heroicons-puzzle-piece-solid'"
:label="extension.name" :label="extension.name"
:tooltip="`${extension.name} (${t('disabled')})`" :tooltip="`${extension.name} (${t('disabled')})`"
/> />
</div>
<!-- Marketplace Button (always at the end) --> </div>
<UiButton
square
size="xl"
variant="soft"
color="primary"
:ui="{
base: 'size-24 flex flex-wrap text-sm items-center justify-center overflow-visible',
leadingIcon: 'size-10',
label: 'w-full',
}"
icon="i-heroicons-plus-circle"
:label="t('marketplace')"
:tooltip="t('marketplace')"
@click="openMarketplace"
/>
</ul>
</template> </template>
</UPopover> </UDrawer>
<!-- Uninstall Confirmation Dialog -->
<UiDialogConfirm
v-model:open="showUninstallDialog"
:title="t('uninstall.confirm.title')"
:description="
t('uninstall.confirm.description', {
name: extensionToUninstall?.name || '',
})
"
:confirm-label="t('uninstall.confirm.button')"
confirm-icon="i-heroicons-trash"
@confirm="confirmUninstall"
/>
</template> </template>
<script setup lang="ts"> <script setup lang="ts">
defineOptions({
inheritAttrs: false,
})
const extensionStore = useExtensionsStore() const extensionStore = useExtensionsStore()
const router = useRouter() const windowManagerStore = useWindowManagerStore()
const route = useRoute()
const localePath = useLocalePath()
const { t } = useI18n() const { t } = useI18n()
const open = ref(false) const open = ref(false)
// Enabled extensions first // Uninstall dialog state
const enabledExtensions = computed(() => { const showUninstallDialog = ref(false)
return extensionStore.availableExtensions.filter((ext) => ext.enabled) const extensionToUninstall = ref<LauncherItem | null>(null)
// Unified launcher item type
interface LauncherItem {
id: string
name: string
icon: string
type: 'system' | 'extension'
}
// Combine system windows and enabled extensions, sorted alphabetically
const launcherItems = computed(() => {
const items: LauncherItem[] = []
// Add system windows
const systemWindows = windowManagerStore.getAllSystemWindows()
systemWindows.forEach((sysWin: SystemWindowDefinition) => {
items.push({
id: sysWin.id,
name: sysWin.name,
icon: sysWin.icon,
type: 'system',
})
})
// Add enabled extensions
const enabledExtensions = extensionStore.availableExtensions.filter(
(ext) => ext.enabled,
)
enabledExtensions.forEach((ext) => {
items.push({
id: ext.id,
name: ext.name,
icon: ext.icon || 'i-heroicons-puzzle-piece-solid',
type: 'extension',
})
})
// Sort alphabetically by name
return items.sort((a, b) => a.name.localeCompare(b.name))
}) })
// Disabled extensions last // Disabled extensions (shown grayed out at the end)
const disabledExtensions = computed(() => { const disabledExtensions = computed(() => {
return extensionStore.availableExtensions.filter((ext) => !ext.enabled) return extensionStore.availableExtensions.filter((ext) => !ext.enabled)
}) })
const openExtension = (extensionId: string) => { // Open launcher item (system window or extension)
router.push( const openItem = async (item: LauncherItem) => {
localePath({ try {
name: 'haexExtension', // Open the window with correct type and sourceId
params: { await windowManagerStore.openWindowAsync({
vaultId: route.params.vaultId, sourceId: item.id,
extensionId, type: item.type,
}, icon: item.icon,
}), title: item.name,
) })
open.value = false
open.value = false
} catch (error) {
console.log(error)
}
} }
const openMarketplace = () => { // Uninstall extension - shows confirmation dialog first
router.push( const uninstallExtension = async (item: LauncherItem) => {
localePath({ extensionToUninstall.value = item
name: 'extensionOverview', showUninstallDialog.value = true
params: { }
vaultId: route.params.vaultId,
}, // Confirm uninstall - actually removes the extension
}), const confirmUninstall = async () => {
if (!extensionToUninstall.value) return
try {
const extension = extensionStore.availableExtensions.find(
(ext) => ext.id === extensionToUninstall.value!.id,
)
if (!extension) return
// Close all windows of this extension first
const extensionWindows = windowManagerStore.windows.filter(
(win) => win.type === 'extension' && win.sourceId === extension.id,
)
for (const win of extensionWindows) {
windowManagerStore.closeWindow(win.id)
}
// Uninstall the extension
await extensionStore.removeExtensionAsync(
extension.publicKey,
extension.name,
extension.version,
)
// Refresh available extensions list
await extensionStore.loadExtensionsAsync()
// Close dialog and reset state
showUninstallDialog.value = false
extensionToUninstall.value = null
} catch (error) {
console.error('Failed to uninstall extension:', error)
}
}
// Get context menu items for launcher item
const getContextMenuItems = (item: LauncherItem) => {
const items = [
{
label: t('contextMenu.open'),
icon: 'i-heroicons-arrow-top-right-on-square',
onSelect: () => openItem(item),
},
]
// Add uninstall option for extensions
if (item.type === 'extension') {
items.push({
label: t('contextMenu.uninstall'),
icon: 'i-heroicons-trash',
onSelect: () => uninstallExtension(item),
})
}
return items
}
// Drag & Drop handling
const handleDragStart = (event: DragEvent, item: LauncherItem) => {
if (!event.dataTransfer) return
// Store the launcher item data
event.dataTransfer.effectAllowed = 'copy'
event.dataTransfer.setData(
'application/haex-launcher-item',
JSON.stringify(item),
) )
open.value = false
// Set drag image (optional - uses default if not set)
const dragImage = event.target as HTMLElement
if (dragImage) {
event.dataTransfer.setDragImage(dragImage, 20, 20)
}
} }
</script> </script>
@ -116,8 +249,30 @@ const openMarketplace = () => {
de: de:
disabled: Deaktiviert disabled: Deaktiviert
marketplace: Marketplace marketplace: Marketplace
launcher:
title: App Launcher
description: Wähle eine App zum Öffnen
contextMenu:
open: Öffnen
uninstall: Deinstallieren
uninstall:
confirm:
title: Erweiterung deinstallieren
description: Möchtest du wirklich "{name}" deinstallieren? Diese Aktion kann nicht rückgängig gemacht werden.
button: Deinstallieren
en: en:
disabled: Disabled disabled: Disabled
marketplace: Marketplace marketplace: Marketplace
launcher:
title: App Launcher
description: Select an app to open
contextMenu:
open: Open
uninstall: Uninstall
uninstall:
confirm:
title: Uninstall Extension
description: Do you really want to uninstall "{name}"? This action cannot be undone.
button: Uninstall
</i18n> </i18n>

View File

@@ -15,7 +15,7 @@
          type="checkbox"
          class="checkbox"
          :checked="Object.values(read).at(0)"
-       />
+       >
        <label
          class="label-text text-base"
          :for="Object.keys(read).at(0)"
@@ -42,7 +42,7 @@
          type="checkbox"
          class="checkbox"
          :checked="Object.values(write).at(0)"
-       />
+       >
        <label
          class="label-text text-base"
          :for="Object.keys(write).at(0)"
@@ -69,7 +69,7 @@
          type="checkbox"
          class="checkbox"
          :checked="Object.values(create).at(0)"
-       />
+       >
        <label
          class="label-text text-base"
          :for="Object.keys(create).at(0)"


@@ -14,7 +14,7 @@
          type="checkbox"
          class="checkbox"
          :checked="Object.values(read).at(0)"
-       />
+       >
        <label
          class="label-text text-base"
          :for="Object.keys(read).at(0)"
@@ -41,7 +41,7 @@
          type="checkbox"
          class="checkbox"
          :checked="Object.values(write).at(0)"
-       />
+       >
        <label
          class="label-text text-base"
          :for="Object.keys(write).at(0)"


@@ -15,7 +15,7 @@
          type="checkbox"
          class="checkbox"
          :checked="Object.values(access).at(0)"
-       />
+       >
        <label
          class="label-text text-base"
          :for="Object.keys(access).at(0)"


@@ -8,7 +8,7 @@
  >
    <div class="flex items-start gap-4">
      <!-- Icon -->
-     <div class="flex-shrink-0">
+     <div class="shrink-0">
        <div
          v-if="extension.icon"
          class="w-16 h-16 rounded-lg bg-primary/10 flex items-center justify-center"
@@ -52,7 +52,7 @@
        <p
          v-if="extension.description"
-         class="text-sm text-gray-600 dark:text-gray-300 mt-2 line-clamp-2"
+         class="hidden @lg:flex text-sm text-gray-600 dark:text-gray-300 mt-2 line-clamp-2"
        >
          {{ extension.description }}
        </p>
@@ -67,7 +67,9 @@
        >
          <UIcon name="i-heroicons-check-circle-solid" />
          <span v-if="!extension.installedVersion">{{ t('installed') }}</span>
-         <span v-else>{{ t('installedVersion', { version: extension.installedVersion }) }}</span>
+         <span v-else>{{
+           t('installedVersion', { version: extension.installedVersion })
+         }}</span>
        </div>
        <div
          v-if="extension.downloads"
@@ -114,10 +116,16 @@
      <div class="flex items-center justify-between gap-2">
        <UButton
          :label="getInstallButtonLabel()"
-         :color="extension.isInstalled && !extension.installedVersion ? 'neutral' : 'primary'"
+         :color="
+           extension.isInstalled && !extension.installedVersion
+             ? 'neutral'
+             : 'primary'
+         "
          :disabled="extension.isInstalled && !extension.installedVersion"
          :icon="
-           extension.isInstalled && !extension.installedVersion ? 'i-heroicons-check' : 'i-heroicons-arrow-down-tray'
+           extension.isInstalled && !extension.installedVersion
+             ? 'i-heroicons-check'
+             : 'i-heroicons-arrow-down-tray'
          "
          size="sm"
          @click.stop="$emit('install')"

View File

@ -1,5 +1,5 @@
<template> <template>
<div class="p-4 max-w-4xl mx-auto space-y-6"> <div class="p-4 mx-auto space-y-6 bg-default/90 backdrop-blur-2xl">
<div class="space-y-2"> <div class="space-y-2">
<h1 class="text-2xl font-bold">{{ t('title') }}</h1> <h1 class="text-2xl font-bold">{{ t('title') }}</h1>
<p class="text-sm opacity-70">{{ t('description') }}</p> <p class="text-sm opacity-70">{{ t('description') }}</p>
@ -85,28 +85,16 @@
<script setup lang="ts"> <script setup lang="ts">
import { invoke } from '@tauri-apps/api/core' import { invoke } from '@tauri-apps/api/core'
import { open } from '@tauri-apps/plugin-dialog' import { open } from '@tauri-apps/plugin-dialog'
import type { ExtensionInfoResponse } from '~~/src-tauri/bindings/ExtensionInfoResponse'
definePageMeta({
name: 'settings-developer',
})
const { t } = useI18n() const { t } = useI18n()
const { add } = useToast() const { add } = useToast()
const { loadExtensionsAsync } = useExtensionsStore() const { loadExtensionsAsync } = useExtensionsStore()
// State // State
const extensionPath = ref('') const extensionPath = ref('')
const isLoading = ref(false) const isLoading = ref(false)
const devExtensions = ref< const devExtensions = ref<Array<ExtensionInfoResponse>>([])
Array<{
id: string
publicKey: string
name: string
version: string
enabled: boolean
}>
>([])
// Load dev extensions on mount // Load dev extensions on mount
onMounted(async () => { onMounted(async () => {
@ -140,7 +128,7 @@ const loadDevExtensionAsync = async () => {
isLoading.value = true isLoading.value = true
try { try {
const extensionId = await invoke<string>('load_dev_extension', { await invoke<string>('load_dev_extension', {
extensionPath: extensionPath.value, extensionPath: extensionPath.value,
}) })
@ -157,10 +145,10 @@ const loadDevExtensionAsync = async () => {
// Clear input // Clear input
extensionPath.value = '' extensionPath.value = ''
} catch (error: any) { } catch (error) {
console.error('Failed to load dev extension:', error) console.error('Failed to load dev extension:', error)
add({ add({
description: error || t('add.errors.loadFailed'), description: t('add.errors.loadFailed') + error,
color: 'error', color: 'error',
}) })
} finally { } finally {
@ -171,7 +159,9 @@ const loadDevExtensionAsync = async () => {
// Load all dev extensions (for the list on this page) // Load all dev extensions (for the list on this page)
const loadDevExtensionListAsync = async () => { const loadDevExtensionListAsync = async () => {
try { try {
const extensions = await invoke<Array<any>>('get_all_dev_extensions') const extensions = await invoke<Array<ExtensionInfoResponse>>(
'get_all_dev_extensions',
)
devExtensions.value = extensions devExtensions.value = extensions
} catch (error) { } catch (error) {
console.error('Failed to load dev extensions:', error) console.error('Failed to load dev extensions:', error)
@ -179,29 +169,30 @@ const loadDevExtensionListAsync = async () => {
} }
// Reload a dev extension (removes and re-adds) // Reload a dev extension (removes and re-adds)
const reloadDevExtensionAsync = async (ext: any) => { const reloadDevExtensionAsync = async (extension: ExtensionInfoResponse) => {
try { try {
console.log('reloadDevExtensionAsync', extension)
// Get the extension path from somewhere (we need to store this) // Get the extension path from somewhere (we need to store this)
// For now, just show a message // For now, just show a message
add({ add({
description: t('list.reloadInfo'), description: t('list.reloadInfo'),
color: 'info', color: 'info',
}) })
} catch (error: any) { } catch (error) {
console.error('Failed to reload dev extension:', error) console.error('Failed to reload dev extension:', error)
add({ add({
description: error || t('list.errors.reloadFailed'), description: t('list.errors.reloadFailed') + error,
color: 'error', color: 'error',
}) })
} }
} }
// Remove a dev extension // Remove a dev extension
const removeDevExtensionAsync = async (ext: any) => { const removeDevExtensionAsync = async (extension: ExtensionInfoResponse) => {
try { try {
await invoke('remove_dev_extension', { await invoke('remove_dev_extension', {
publicKey: ext.publicKey, publicKey: extension.publicKey,
name: ext.name, name: extension.name,
}) })
add({ add({
@ -214,10 +205,10 @@ const removeDevExtensionAsync = async (ext: any) => {
// Reload all extensions store // Reload all extensions store
await loadExtensionsAsync() await loadExtensionsAsync()
} catch (error: any) { } catch (error) {
console.error('Failed to remove dev extension:', error) console.error('Failed to remove dev extension:', error)
add({ add({
description: error || t('list.errors.removeFailed'), description: t('list.errors.removeFailed') + error,
color: 'error', color: 'error',
}) })
} }

View File

@ -1,8 +1,8 @@
<template> <template>
<div class="flex flex-col h-full"> <div class="flex flex-col h-full bg-default">
<!-- Header with Actions --> <!-- Header with Actions -->
<div <div
class="flex flex-col sm:flex-row sm:items-center justify-between gap-4 p-6 border-b border-gray-200 dark:border-gray-800" class="flex flex-col @lg:flex-row @lg:items-center justify-between gap-4 p-6 border-b border-gray-200 dark:border-gray-800"
> >
<div> <div>
<h1 class="text-2xl font-bold"> <h1 class="text-2xl font-bold">
@ -14,14 +14,14 @@
</div> </div>
<div <div
class="flex flex-col sm:flex-row items-stretch sm:items-center gap-3" class="flex flex-col @lg:flex-row items-stretch @lg:items-center gap-3"
> >
<!-- Marketplace Selector --> <!-- Marketplace Selector -->
<USelectMenu <USelectMenu
v-model="selectedMarketplace" v-model="selectedMarketplace"
:items="marketplaces" :items="marketplaces"
value-key="id" value-key="id"
class="w-full sm:w-48" class="w-full @lg:w-48"
> >
<template #leading> <template #leading>
<UIcon name="i-heroicons-building-storefront" /> <UIcon name="i-heroicons-building-storefront" />
@ -33,6 +33,7 @@
:label="t('extension.installFromFile')" :label="t('extension.installFromFile')"
icon="i-heroicons-arrow-up-tray" icon="i-heroicons-arrow-up-tray"
color="neutral" color="neutral"
block
@click="onSelectExtensionAsync" @click="onSelectExtensionAsync"
/> />
</div> </div>
@ -40,7 +41,7 @@
<!-- Search and Filters --> <!-- Search and Filters -->
<div <div
class="flex flex-col sm:flex-row items-stretch sm:items-center gap-4 p-6 border-b border-gray-200 dark:border-gray-800" class="flex flex-col @lg:flex-row items-stretch @lg:items-center gap-4 p-6 border-b border-gray-200 dark:border-gray-800"
> >
<UInput <UInput
v-model="searchQuery" v-model="searchQuery"
@ -53,7 +54,7 @@
:items="categories" :items="categories"
:placeholder="t('filter.category')" :placeholder="t('filter.category')"
value-key="id" value-key="id"
class="w-full sm:w-48" class="w-full @lg:w-48"
> >
<template #leading> <template #leading>
<UIcon name="i-heroicons-tag" /> <UIcon name="i-heroicons-tag" />
@ -65,7 +66,7 @@
<div class="flex-1 overflow-auto p-6"> <div class="flex-1 overflow-auto p-6">
<div <div
v-if="filteredExtensions.length" v-if="filteredExtensions.length"
class="grid grid-cols-1 md:grid-cols-2 lg:grid-cols-3 gap-4" class="grid grid-cols-1 @md:grid-cols-2 @2xl:grid-cols-3 gap-4"
> >
<!-- Marketplace Extension Card --> <!-- Marketplace Extension Card -->
<HaexExtensionMarketplaceCard <HaexExtensionMarketplaceCard
@ -104,7 +105,7 @@
<HaexExtensionDialogInstall <HaexExtensionDialogInstall
v-model:open="showConfirmation" v-model:open="showConfirmation"
:preview="preview" :preview="preview"
@confirm="(addToDesktop) => addExtensionAsync(addToDesktop)" @confirm="addExtensionAsync"
/> />
<HaexExtensionDialogRemove <HaexExtensionDialogRemove
@ -124,13 +125,8 @@ import type {
import { open } from '@tauri-apps/plugin-dialog' import { open } from '@tauri-apps/plugin-dialog'
import type { ExtensionPreview } from '~~/src-tauri/bindings/ExtensionPreview' import type { ExtensionPreview } from '~~/src-tauri/bindings/ExtensionPreview'
definePageMeta({
name: 'extensionOverview',
})
const { t } = useI18n() const { t } = useI18n()
const extensionStore = useExtensionsStore() const extensionStore = useExtensionsStore()
const desktopStore = useDesktopStore()
const showConfirmation = ref(false) const showConfirmation = ref(false)
const openOverwriteDialog = ref(false) const openOverwriteDialog = ref(false)
@ -203,7 +199,8 @@ const marketplaceExtensions = ref<IMarketplaceExtension[]>([
name: 'HaexPassDummy', name: 'HaexPassDummy',
version: '1.0.0', version: '1.0.0',
author: 'HaexHub Team', author: 'HaexHub Team',
public_key: 'a1b2c3d4e5f6a7b8c9d0e1f2a3b4c5d6e7f8a9b0c1d2e3f4a5b6c7d8e9f0a1b2', public_key:
'a1b2c3d4e5f6a7b8c9d0e1f2a3b4c5d6e7f8a9b0c1d2e3f4a5b6c7d8e9f0a1b2',
description: description:
'Sicherer Passwort-Manager mit Ende-zu-Ende-Verschlüsselung und Autofill-Funktion.', 'Sicherer Passwort-Manager mit Ende-zu-Ende-Verschlüsselung und Autofill-Funktion.',
icon: 'i-heroicons-lock-closed', icon: 'i-heroicons-lock-closed',
@ -221,7 +218,8 @@ const marketplaceExtensions = ref<IMarketplaceExtension[]>([
name: 'HaexNotes', name: 'HaexNotes',
version: '2.1.0', version: '2.1.0',
author: 'HaexHub Team', author: 'HaexHub Team',
public_key: 'b2c3d4e5f6a7b8c9d0e1f2a3b4c5d6e7f8a9b0c1d2e3f4a5b6c7d8e9f0a1b2c3', public_key:
'b2c3d4e5f6a7b8c9d0e1f2a3b4c5d6e7f8a9b0c1d2e3f4a5b6c7d8e9f0a1b2c3',
description: description:
'Markdown-basierter Notizen-Editor mit Syntax-Highlighting und Live-Preview.', 'Markdown-basierter Notizen-Editor mit Syntax-Highlighting und Live-Preview.',
icon: 'i-heroicons-document-text', icon: 'i-heroicons-document-text',
@ -239,7 +237,8 @@ const marketplaceExtensions = ref<IMarketplaceExtension[]>([
name: 'HaexBackup', name: 'HaexBackup',
version: '1.5.2', version: '1.5.2',
author: 'Community', author: 'Community',
public_key: 'c3d4e5f6a7b8c9d0e1f2a3b4c5d6e7f8a9b0c1d2e3f4a5b6c7d8e9f0a1b2c3d4', public_key:
'c3d4e5f6a7b8c9d0e1f2a3b4c5d6e7f8a9b0c1d2e3f4a5b6c7d8e9f0a1b2c3d4',
description: description:
'Automatische Backups deiner Daten mit Cloud-Sync-Unterstützung.', 'Automatische Backups deiner Daten mit Cloud-Sync-Unterstützung.',
icon: 'i-heroicons-cloud-arrow-up', icon: 'i-heroicons-cloud-arrow-up',
@ -257,7 +256,8 @@ const marketplaceExtensions = ref<IMarketplaceExtension[]>([
name: 'HaexCalendar', name: 'HaexCalendar',
version: '3.0.1', version: '3.0.1',
author: 'HaexHub Team', author: 'HaexHub Team',
public_key: 'd4e5f6a7b8c9d0e1f2a3b4c5d6e7f8a9b0c1d2e3f4a5b6c7d8e9f0a1b2c3d4e5', public_key:
'd4e5f6a7b8c9d0e1f2a3b4c5d6e7f8a9b0c1d2e3f4a5b6c7d8e9f0a1b2c3d4e5',
description: description:
'Integrierter Kalender mit Event-Management und Synchronisation.', 'Integrierter Kalender mit Event-Management und Synchronisation.',
icon: 'i-heroicons-calendar', icon: 'i-heroicons-calendar',
@ -275,7 +275,8 @@ const marketplaceExtensions = ref<IMarketplaceExtension[]>([
name: 'Haex2FA', name: 'Haex2FA',
version: '1.2.0', version: '1.2.0',
author: 'Security Team', author: 'Security Team',
public_key: 'e5f6a7b8c9d0e1f2a3b4c5d6e7f8a9b0c1d2e3f4a5b6c7d8e9f0a1b2c3d4e5f6', public_key:
'e5f6a7b8c9d0e1f2a3b4c5d6e7f8a9b0c1d2e3f4a5b6c7d8e9f0a1b2c3d4e5f6',
description: description:
'2-Faktor-Authentifizierung Manager mit TOTP und Backup-Codes.', '2-Faktor-Authentifizierung Manager mit TOTP und Backup-Codes.',
icon: 'i-heroicons-shield-check', icon: 'i-heroicons-shield-check',
@ -293,7 +294,8 @@ const marketplaceExtensions = ref<IMarketplaceExtension[]>([
name: 'GitHub Integration', name: 'GitHub Integration',
version: '1.0.5', version: '1.0.5',
author: 'Community', author: 'Community',
public_key: 'f6a7b8c9d0e1f2a3b4c5d6e7f8a9b0c1d2e3f4a5b6c7d8e9f0a1b2c3d4e5f6a7', public_key:
'f6a7b8c9d0e1f2a3b4c5d6e7f8a9b0c1d2e3f4a5b6c7d8e9f0a1b2c3d4e5f6a7',
description: description:
'Direkter Zugriff auf GitHub Repositories, Issues und Pull Requests.', 'Direkter Zugriff auf GitHub Repositories, Issues und Pull Requests.',
icon: 'i-heroicons-code-bracket', icon: 'i-heroicons-code-bracket',
@ -312,16 +314,23 @@ const marketplaceExtensions = ref<IMarketplaceExtension[]>([
const allExtensions = computed((): IMarketplaceExtension[] => { const allExtensions = computed((): IMarketplaceExtension[] => {
return marketplaceExtensions.value.map((ext) => { return marketplaceExtensions.value.map((ext) => {
// Extensions are uniquely identified by public_key + name // Extensions are uniquely identified by public_key + name
const installedExt = extensionStore.availableExtensions.find((installed) => { const installedExt = extensionStore.availableExtensions.find(
return installed.publicKey === ext.publicKey && installed.name === ext.name (installed) => {
}) return (
installed.publicKey === ext.publicKey && installed.name === ext.name
)
},
)
if (installedExt) { if (installedExt) {
return { return {
...ext, ...ext,
isInstalled: true, isInstalled: true,
// Show installed version if it differs from marketplace version // Show installed version if it differs from marketplace version
installedVersion: installedExt.version !== ext.version ? installedExt.version : undefined, installedVersion:
installedExt.version !== ext.version
? installedExt.version
: undefined,
} }
} }
@ -375,7 +384,7 @@ const onSelectExtensionAsync = async () => {
const isAlreadyInstalled = extensionStore.availableExtensions.some( const isAlreadyInstalled = extensionStore.availableExtensions.some(
(ext) => (ext) =>
ext.publicKey === preview.value!.manifest.public_key && ext.publicKey === preview.value!.manifest.public_key &&
ext.name === preview.value!.manifest.name ext.name === preview.value!.manifest.name,
) )
if (isAlreadyInstalled) { if (isAlreadyInstalled) {
@ -389,23 +398,18 @@ const onSelectExtensionAsync = async () => {
} }
} }
const addExtensionAsync = async (addToDesktop: boolean = false) => { const addExtensionAsync = async () => {
try { try {
console.log( console.log(
'preview.value?.editable_permissions', 'preview.value?.editable_permissions',
preview.value?.editable_permissions, preview.value?.editable_permissions,
) )
const extensionId = await extensionStore.installAsync( await extensionStore.installAsync(
extension.path, extension.path,
preview.value?.editable_permissions, preview.value?.editable_permissions,
) )
await extensionStore.loadExtensionsAsync() await extensionStore.loadExtensionsAsync()
// Add to desktop if requested
if (addToDesktop && extensionId) {
await desktopStore.addDesktopItemAsync('extension', extensionId, 50, 50)
}
add({ add({
color: 'success', color: 'success',
title: t('extension.success.title', { title: t('extension.success.title', {
@ -435,7 +439,7 @@ const reinstallExtensionAsync = async () => {
const installedExt = extensionStore.availableExtensions.find( const installedExt = extensionStore.availableExtensions.find(
(ext) => (ext) =>
ext.publicKey === preview.value!.manifest.public_key && ext.publicKey === preview.value!.manifest.public_key &&
ext.name === preview.value!.manifest.name ext.name === preview.value!.manifest.name,
) )
if (installedExt) { if (installedExt) {
@ -443,7 +447,7 @@ const reinstallExtensionAsync = async () => {
await extensionStore.removeExtensionAsync( await extensionStore.removeExtensionAsync(
installedExt.publicKey, installedExt.publicKey,
installedExt.name, installedExt.name,
installedExt.version installedExt.version,
) )
} }
@ -476,7 +480,11 @@ onMounted(async () => {
} */ } */
const removeExtensionAsync = async () => { const removeExtensionAsync = async () => {
if (!extensionToBeRemoved.value?.publicKey || !extensionToBeRemoved.value?.name || !extensionToBeRemoved.value?.version) { if (
!extensionToBeRemoved.value?.publicKey ||
!extensionToBeRemoved.value?.name ||
!extensionToBeRemoved.value?.version
) {
add({ add({
color: 'error', color: 'error',
description: 'Erweiterung kann nicht gelöscht werden', description: 'Erweiterung kann nicht gelöscht werden',

View File

@ -0,0 +1,325 @@
<template>
<div class="w-full h-full bg-default">
<div class="grid grid-cols-2 p-2">
<div class="p-2">{{ t('language') }}</div>
<div><UiDropdownLocale @select="onSelectLocaleAsync" /></div>
<div class="p-2">{{ t('design') }}</div>
<div><UiDropdownTheme @select="onSelectThemeAsync" /></div>
<div class="p-2">{{ t('vaultName.label') }}</div>
<div>
<UiInput
v-model="currentVaultName"
:placeholder="t('vaultName.label')"
@change="onSetVaultNameAsync"
/>
</div>
<div class="p-2">{{ t('notifications.label') }}</div>
<div>
<UiButton
:label="t('notifications.requestPermission')"
@click="requestNotificationPermissionAsync"
/>
</div>
<div class="p-2">{{ t('deviceName.label') }}</div>
<div>
<UiInput
v-model="deviceName"
:placeholder="t('deviceName.label')"
@change="onUpdateDeviceNameAsync"
/>
</div>
<div class="p-2">{{ t('workspaceBackground.label') }}</div>
<div class="flex gap-2">
<UiButton
:label="t('workspaceBackground.choose')"
@click="selectBackgroundImage"
/>
<UiButton
v-if="currentWorkspace?.background"
:label="t('workspaceBackground.remove.label')"
color="error"
@click="removeBackgroundImage"
/>
</div>
<div class="h-full" />
</div>
</div>
</template>
<script setup lang="ts">
import type { Locale } from 'vue-i18n'
import { open } from '@tauri-apps/plugin-dialog'
import {
readFile,
writeFile,
mkdir,
exists,
remove,
} from '@tauri-apps/plugin-fs'
import { appLocalDataDir } from '@tauri-apps/api/path'
const { t, setLocale } = useI18n()
const { currentVaultName } = storeToRefs(useVaultStore())
const { updateVaultNameAsync, updateLocaleAsync, updateThemeAsync } =
useVaultSettingsStore()
const onSelectLocaleAsync = async (locale: Locale) => {
await updateLocaleAsync(locale)
await setLocale(locale)
}
const { currentThemeName } = storeToRefs(useUiStore())
const onSelectThemeAsync = async (theme: string) => {
currentThemeName.value = theme
console.log('onSelectThemeAsync', currentThemeName.value)
await updateThemeAsync(theme)
}
const { add } = useToast()
const onSetVaultNameAsync = async () => {
try {
await updateVaultNameAsync(currentVaultName.value)
add({ description: t('vaultName.update.success'), color: 'success' })
} catch (error) {
console.error(error)
add({ description: t('vaultName.update.error'), color: 'error' })
}
}
const { requestNotificationPermissionAsync } = useNotificationStore()
const { deviceName } = storeToRefs(useDeviceStore())
const { updateDeviceNameAsync, readDeviceNameAsync } = useDeviceStore()
const workspaceStore = useWorkspaceStore()
const { currentWorkspace } = storeToRefs(workspaceStore)
const { updateWorkspaceBackgroundAsync } = workspaceStore
onMounted(async () => {
await readDeviceNameAsync()
})
const onUpdateDeviceNameAsync = async () => {
const check = vaultDeviceNameSchema.safeParse(deviceName.value)
if (!check.success) return
try {
await updateDeviceNameAsync({ name: deviceName.value })
add({ description: t('deviceName.update.success'), color: 'success' })
} catch (error) {
console.log(error)
add({ description: t('deviceName.update.error'), color: 'error' })
}
}
const selectBackgroundImage = async () => {
if (!currentWorkspace.value) return
try {
const selected = await open({
multiple: false,
filters: [
{
name: 'Images',
extensions: ['png', 'jpg', 'jpeg', 'webp'],
},
],
})
if (!selected || typeof selected !== 'string') {
return
}
// Read the selected file (works with Android photo picker URIs)
let fileData: Uint8Array
try {
fileData = await readFile(selected)
} catch (readError) {
add({
description: `Fehler beim Lesen: ${readError instanceof Error ? readError.message : String(readError)}`,
color: 'error',
})
return
}
// Detect file type from file signature
let ext = 'jpg' // default
if (fileData.length > 4) {
// PNG signature: 89 50 4E 47
if (
fileData[0] === 0x89 &&
fileData[1] === 0x50 &&
fileData[2] === 0x4e &&
fileData[3] === 0x47
) {
ext = 'png'
}
// JPEG signature: FF D8 FF
else if (
fileData[0] === 0xff &&
fileData[1] === 0xd8 &&
fileData[2] === 0xff
) {
ext = 'jpg'
}
// WebP signature: RIFF xxxx WEBP
else if (
fileData[0] === 0x52 &&
fileData[1] === 0x49 &&
fileData[2] === 0x46 &&
fileData[3] === 0x46
) {
ext = 'webp'
}
}
// Get app local data directory
const appDataPath = await appLocalDataDir()
// Construct target path manually to avoid path joining issues
const fileName = `workspace-${currentWorkspace.value.id}-background.${ext}`
const targetPath = `${appDataPath}/files/${fileName}`
// Create parent directory if it doesn't exist
const parentDir = `${appDataPath}/files`
try {
if (!(await exists(parentDir))) {
await mkdir(parentDir, { recursive: true })
}
} catch (mkdirError) {
add({
description: `Fehler beim Erstellen des Ordners: ${mkdirError instanceof Error ? mkdirError.message : String(mkdirError)}`,
color: 'error',
})
return
}
// Write file to app data directory
try {
await writeFile(targetPath, fileData)
} catch (writeError) {
add({
description: `Fehler beim Schreiben: ${writeError instanceof Error ? writeError.message : String(writeError)}`,
color: 'error',
})
return
}
// Store the absolute file path in database
try {
await updateWorkspaceBackgroundAsync(
currentWorkspace.value.id,
targetPath,
)
add({
description: t('workspaceBackground.update.success'),
color: 'success',
})
} catch (dbError) {
add({
description: `Fehler beim DB-Update: ${dbError instanceof Error ? dbError.message : String(dbError)}`,
color: 'error',
})
}
} catch (error) {
console.error('Error selecting background:', error)
add({
description: `${t('workspaceBackground.update.error')}: ${error instanceof Error ? error.message : String(error)}`,
color: 'error',
})
}
}
const removeBackgroundImage = async () => {
if (!currentWorkspace.value) return
try {
// Delete the background file if it exists
if (currentWorkspace.value.background) {
try {
// The background field contains the absolute file path
if (await exists(currentWorkspace.value.background)) {
await remove(currentWorkspace.value.background)
}
} catch (err) {
console.warn('Could not delete background file:', err)
// Continue anyway to clear the database entry
}
}
await updateWorkspaceBackgroundAsync(currentWorkspace.value.id, null)
add({
description: t('workspaceBackground.remove.success'),
color: 'success',
})
} catch (error) {
console.error('Error removing background:', error)
add({ description: t('workspaceBackground.remove.error'), color: 'error' })
}
}
</script>
<i18n lang="yaml">
de:
language: Sprache
design: Design
save: Änderung speichern
notifications:
label: Benachrichtigungen
requestPermission: Benachrichtigung erlauben
vaultName:
label: Vaultname
update:
success: Vaultname erfolgreich aktualisiert
error: Vaultname konnte nicht aktualisiert werden
deviceName:
label: Gerätename
update:
success: Gerätename wurde erfolgreich aktualisiert
error: Gerätename konnte nicht aktualisiert werden
workspaceBackground:
label: Workspace-Hintergrund
choose: Bild auswählen
update:
success: Hintergrund erfolgreich aktualisiert
error: Fehler beim Aktualisieren des Hintergrunds
remove:
label: Hintergrund entfernen
success: Hintergrund erfolgreich entfernt
error: Fehler beim Entfernen des Hintergrunds
en:
language: Language
design: Design
save: save changes
notifications:
label: Notifications
requestPermission: Grant Permission
vaultName:
label: Vault Name
update:
success: Vault Name successfully updated
error: Vault name could not be updated
deviceName:
label: Device name
update:
success: Device name has been successfully updated
error: Device name could not be updated
workspaceBackground:
label: Workspace Background
choose: Choose Image
update:
success: Background successfully updated
error: Error updating background
remove:
label: Remove Background
success: Background successfully removed
error: Error removing background
</i18n>
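The page above persists only the absolute file path; rendering the background is left to the workspace view. A minimal sketch of the consuming side, assuming the path is read back from currentWorkspace and converted with Tauri's convertFileSrc (the standard API for loading local files through the asset protocol; the extended $APPLOCALDATA/** scope from this changeset is what permits it):

import { convertFileSrc } from '@tauri-apps/api/core'

// Hedged sketch: turn the stored absolute path into a CSS background.
const backgroundStyle = computed(() => {
  const path = currentWorkspace.value?.background
  if (!path) return {}
  // convertFileSrc maps the path to an asset URL the webview is allowed to load.
  return {
    backgroundImage: `url('${convertFileSrc(path)}')`,
    backgroundSize: 'cover',
  }
})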

View File

@ -2,6 +2,7 @@
<UiDialogConfirm <UiDialogConfirm
:confirm-label="t('create')" :confirm-label="t('create')"
@confirm="onCreateAsync" @confirm="onCreateAsync"
:description="t('description')"
> >
<UiButton <UiButton
:label="t('vault.create')" :label="t('vault.create')"
@ -55,7 +56,9 @@
<script setup lang="ts"> <script setup lang="ts">
import { vaultSchema } from './schema' import { vaultSchema } from './schema'
const { t } = useI18n() const { t } = useI18n({
useScope: 'local',
})
const vault = reactive<{ const vault = reactive<{
name: string name: string
@ -98,7 +101,7 @@ const onCreateAsync = async () => {
if (vaultId) { if (vaultId) {
initVault() initVault()
await navigateTo( await navigateTo(
useLocaleRoute()({ name: 'vaultOverview', params: { vaultId } }), useLocaleRoute()({ name: 'desktop', params: { vaultId } }),
) )
} }
} }
@ -118,6 +121,7 @@ de:
name: HaexVault name: HaexVault
title: Neue {haexvault} erstellen title: Neue {haexvault} erstellen
create: Erstellen create: Erstellen
description: Erstelle eine neue Vault für deine Daten
en: en:
vault: vault:
@ -127,4 +131,5 @@ en:
name: HaexVault name: HaexVault
title: Create new {haexvault} title: Create new {haexvault}
create: Create create: Create
description: Create a new vault for your data
</i18n> </i18n>

View File

@ -5,7 +5,7 @@
:description="vault.path || path" :description="vault.path || path"
@confirm="onOpenDatabase" @confirm="onOpenDatabase"
> >
<!-- <UiButton <UiButton
:label="t('vault.open')" :label="t('vault.open')"
:ui="{ :ui="{
base: 'px-3 py-2', base: 'px-3 py-2',
@ -14,8 +14,7 @@
size="xl" size="xl"
variant="outline" variant="outline"
block block
@click.stop="onLoadDatabase" />
/> -->
<template #title> <template #title>
<i18n-t <i18n-t
@ -59,7 +58,9 @@ const props = defineProps<{
path?: string path?: string
}>() }>()
const { t } = useI18n() const { t } = useI18n({
useScope: 'local',
})
const vault = reactive<{ const vault = reactive<{
name: string name: string
@ -100,9 +101,6 @@ const vault = reactive<{
} }
} */ } */
const { syncLocaleAsync, syncThemeAsync, syncVaultNameAsync } =
useVaultSettingsStore()
const check = ref(false) const check = ref(false)
const initVault = () => { const initVault = () => {
@ -150,21 +148,23 @@ const onOpenDatabase = async () => {
await navigateTo( await navigateTo(
localePath({ localePath({
name: 'vaultOverview', name: 'desktop',
params: { params: {
vaultId, vaultId,
}, },
}), }),
) )
await Promise.allSettled([
syncLocaleAsync(),
syncThemeAsync(),
syncVaultNameAsync(),
])
} catch (error) { } catch (error) {
open.value = false open.value = false
console.error('handleError', error, typeof error) if (error?.details?.reason === 'file is not a database') {
add({ color: 'error', description: `${error}` }) add({
color: 'error',
title: t('error.password.title'),
description: t('error.password.description'),
})
} else {
add({ color: 'error', description: JSON.stringify(error) })
}
} }
} }
</script> </script>
@ -178,7 +178,9 @@ de:
open: Vault öffnen open: Vault öffnen
description: Öffne eine vorhandene Vault description: Öffne eine vorhandene Vault
error: error:
open: Vault konnte nicht geöffnet werden. \n Vermutlich ist das Passwort falsch password:
title: Vault konnte nicht geöffnet werden
description: Bitte überprüfe das Passwort
en: en:
open: Unlock open: Unlock
@ -188,5 +190,7 @@ en:
vault: vault:
open: Open Vault open: Open Vault
error: error:
open: Vault couldn't be opened. \n The password is probably wrong password:
title: Vault couldn't be opened
description: Please check your password
</i18n> </i18n>

View File

@ -0,0 +1,83 @@
<template>
<UTooltip :text="tooltip">
<button
class="size-8 shrink-0 rounded-lg flex justify-center transition-colors group"
:class="variantClasses.buttonClass"
@click="(e) => $emit('click', e)"
>
<UIcon
:name="icon"
class="size-4 text-gray-600 dark:text-gray-400"
:class="variantClasses.iconClass"
/>
</button>
</UTooltip>
</template>
<script setup lang="ts">
const props = defineProps<{
variant: 'close' | 'maximize' | 'minimize'
isMaximized?: boolean
}>()
defineEmits(['click'])
const icon = computed(() => {
switch (props.variant) {
case 'close':
return 'i-heroicons-x-mark'
case 'maximize':
return props.isMaximized
? 'i-heroicons-arrows-pointing-in'
: 'i-heroicons-arrows-pointing-out'
default:
return 'i-heroicons-minus'
}
})
const variantClasses = computed(() => {
if (props.variant === 'close') {
return {
iconClass: 'group-hover:text-error',
buttonClass: 'hover:bg-error/30 items-center',
}
} else if (props.variant === 'maximize') {
return {
iconClass: 'group-hover:text-warning',
buttonClass: 'hover:bg-warning/30 items-center',
}
} else {
return {
iconClass: 'group-hover:text-success',
buttonClass: 'hover:bg-success/30 items-end pb-1',
}
}
})
const { t } = useI18n()
const tooltip = computed(() => {
switch (props.variant) {
case 'close':
return t('close')
case 'maximize':
return props.isMaximized ? t('shrink') : t('maximize')
default:
return t('minimize')
}
})
</script>
<i18n lang="yaml">
de:
close: Schließen
maximize: Maximieren
shrink: Verkleinern
minimize: Minimieren
en:
close: Close
maximize: Maximize
shrink: Shrink
minimize: Minimize
</i18n>

View File

@ -0,0 +1,429 @@
<template>
<div
ref="windowEl"
:style="windowStyle"
:class="[
'absolute bg-default/80 backdrop-blur-xl rounded-lg shadow-xl overflow-hidden',
'transition-all ease-out duration-600',
'flex flex-col @container',
{ 'select-none': isResizingOrDragging },
isActive ? 'z-20' : 'z-10',
// Border colors based on warning level
warningLevel === 'warning'
? 'border-2 border-warning-500'
: warningLevel === 'danger'
? 'border-2 border-danger-500'
: 'border border-gray-200 dark:border-gray-700',
]"
@mousedown="handleActivate"
>
<!-- Window Titlebar -->
<div
ref="titlebarEl"
class="grid grid-cols-3 items-center px-3 py-1 bg-white/80 dark:bg-gray-800/80 border-b border-gray-200/50 dark:border-gray-700/50 cursor-move select-none touch-none"
@dblclick="handleMaximize"
>
<!-- Left: Icon -->
<div class="flex items-center gap-2">
<img
v-if="icon"
:src="icon"
:alt="title"
class="w-5 h-5 object-contain shrink-0"
/>
</div>
<!-- Center: Title -->
<div class="flex items-center justify-center">
<span
class="text-sm font-medium text-gray-900 dark:text-gray-100 truncate max-w-full"
>
{{ title }}
</span>
</div>
<!-- Right: Window Controls -->
<div class="flex items-center gap-1 justify-end">
<HaexWindowButton
variant="minimize"
@click.stop="handleMinimize"
/>
<HaexWindowButton
:is-maximized
variant="maximize"
@click.stop="handleMaximize"
/>
<HaexWindowButton
variant="close"
@click.stop="handleClose"
/>
</div>
</div>
<!-- Window Content -->
<div
:class="[
'flex-1 overflow-auto relative ',
isResizingOrDragging ? 'pointer-events-none' : '',
]"
>
<slot />
</div>
<!-- Resize Handles -->
<HaexWindowResizeHandles
:disabled="isMaximized"
@resize-start="handleResizeStart"
/>
</div>
</template>
<script setup lang="ts">
const props = defineProps<{
id: string
title: string
icon?: string | null
isActive?: boolean
sourceX?: number
sourceY?: number
sourceWidth?: number
sourceHeight?: number
isOpening?: boolean
isClosing?: boolean
warningLevel?: 'warning' | 'danger' // Warning indicator (e.g., dev extension, dangerous permissions)
}>()
const emit = defineEmits<{
close: []
minimize: []
activate: []
positionChanged: [x: number, y: number]
sizeChanged: [width: number, height: number]
dragStart: []
dragEnd: []
}>()
// Use defineModel for x, y, width, height
const x = defineModel<number>('x', { default: 100 })
const y = defineModel<number>('y', { default: 100 })
const width = defineModel<number>('width', { default: 800 })
const height = defineModel<number>('height', { default: 600 })
const windowEl = useTemplateRef('windowEl')
const titlebarEl = useTemplateRef('titlebarEl')
// Inject viewport size from parent desktop
const viewportSize = inject<{
width: Ref<number>
height: Ref<number>
}>('viewportSize')
const isMaximized = ref(false) // Don't start maximized
// Store initial position/size for restore
const preMaximizeState = ref({
x: x.value,
y: y.value,
width: width.value,
height: height.value,
})
// Dragging state
const isDragging = ref(false)
const dragStartX = ref(0)
const dragStartY = ref(0)
// Resizing state
const isResizing = ref(false)
const resizeDirection = ref<string>('')
const resizeStartX = ref(0)
const resizeStartY = ref(0)
const resizeStartWidth = ref(0)
const resizeStartHeight = ref(0)
const resizeStartPosX = ref(0)
const resizeStartPosY = ref(0)
const isResizingOrDragging = computed(
() => isResizing.value || isDragging.value,
)
// Setup drag with useDrag composable (supports mouse + touch)
useDrag(
({ movement: [mx, my], first, last }) => {
if (isMaximized.value) return
if (first) {
// Drag started - save initial position
isDragging.value = true
dragStartX.value = x.value
dragStartY.value = y.value
emit('dragStart')
return // Don't update position on first event
}
if (last) {
// Drag ended
isDragging.value = false
globalThis.getSelection()?.removeAllRanges()
emit('positionChanged', x.value, y.value)
emit('sizeChanged', width.value, height.value)
emit('dragEnd')
return
}
// Dragging (not first, not last)
const newX = dragStartX.value + mx
const newY = dragStartY.value + my
// Apply constraints during drag
const constrained = constrainToViewportDuringDrag(newX, newY)
x.value = constrained.x
y.value = constrained.y
},
{
domTarget: titlebarEl,
eventOptions: { passive: false },
pointer: { touch: true },
drag: {
filterTaps: true, // Filter out taps (clicks) vs drags
delay: 0, // No delay for immediate response
},
},
)
const windowStyle = computed(() => {
const baseStyle: Record<string, string> = {}
// Opening animation: start from icon position
if (
props.isOpening &&
props.sourceX !== undefined &&
props.sourceY !== undefined
) {
baseStyle.left = `${props.sourceX}px`
baseStyle.top = `${props.sourceY}px`
baseStyle.width = `${props.sourceWidth || 100}px`
baseStyle.height = `${props.sourceHeight || 100}px`
baseStyle.opacity = '0'
baseStyle.transform = 'scale(0.3)'
}
// Closing animation: shrink to icon position
else if (
props.isClosing &&
props.sourceX !== undefined &&
props.sourceY !== undefined
) {
baseStyle.left = `${props.sourceX}px`
baseStyle.top = `${props.sourceY}px`
baseStyle.width = `${props.sourceWidth || 100}px`
baseStyle.height = `${props.sourceHeight || 100}px`
baseStyle.opacity = '0'
baseStyle.transform = 'scale(0.3)'
}
// Normal state (maximized windows now use actual pixel dimensions)
else {
baseStyle.left = `${x.value}px`
baseStyle.top = `${y.value}px`
baseStyle.width = `${width.value}px`
baseStyle.height = `${height.value}px`
baseStyle.opacity = '1'
// Remove border-radius when maximized
if (isMaximized.value) {
baseStyle.borderRadius = '0'
}
}
// Performance optimization: hint browser about transforms
if (isDragging.value || isResizing.value) {
baseStyle.willChange = 'transform, width, height'
baseStyle.transform = 'translateZ(0)'
}
return baseStyle
})
const getViewportBounds = () => {
// Use reactive viewport size from parent if available
if (viewportSize) {
return {
width: viewportSize.width.value,
height: viewportSize.height.value,
}
}
// Fallback to parent element measurement
if (!windowEl.value?.parentElement) return null
const parent = windowEl.value.parentElement
return {
width: parent.clientWidth,
height: parent.clientHeight,
}
}
const constrainToViewportDuringDrag = (newX: number, newY: number) => {
const bounds = getViewportBounds()
if (!bounds) return { x: newX, y: newY }
const windowWidth = width.value
const windowHeight = height.value
// Allow sides and bottom to go out more
const maxOffscreenX = windowWidth / 3
const maxOffscreenBottom = windowHeight / 3
// For X axis: allow 1/3 to go outside on both sides
const maxX = bounds.width - windowWidth + maxOffscreenX
const minX = -maxOffscreenX
// For Y axis: HARD constraint at top (y=0), never allow window to go above header
const minY = 0
// Bottom: allow 1/3 to go outside
const maxY = bounds.height - windowHeight + maxOffscreenBottom
const constrainedX = Math.max(minX, Math.min(maxX, newX))
const constrainedY = Math.max(minY, Math.min(maxY, newY))
return { x: constrainedX, y: constrainedY }
}
const handleActivate = () => {
emit('activate')
}
const handleClose = () => {
emit('close')
}
const handleMinimize = () => {
emit('minimize')
}
const handleMaximize = () => {
if (isMaximized.value) {
// Restore
x.value = preMaximizeState.value.x
y.value = preMaximizeState.value.y
width.value = preMaximizeState.value.width
height.value = preMaximizeState.value.height
isMaximized.value = false
} else {
// Maximize - set position and size to viewport dimensions
preMaximizeState.value = {
x: x.value,
y: y.value,
width: width.value,
height: height.value,
}
// Get viewport bounds (desktop container, already excludes header)
const bounds = getViewportBounds()
if (bounds && bounds.width > 0 && bounds.height > 0) {
// Get safe-area-insets from CSS variables for debug
const safeAreaTop = parseFloat(
getComputedStyle(document.documentElement).getPropertyValue(
'--safe-area-inset-top',
) || '0',
)
const safeAreaBottom = parseFloat(
getComputedStyle(document.documentElement).getPropertyValue(
'--safe-area-inset-bottom',
) || '0',
)
// Desktop container uses 'absolute inset-0' which stretches over full viewport
// bounds.height = full viewport height (includes header area + safe-areas)
// We need to calculate available space properly
// Get header height from UI store (measured reactively in layout)
const uiStore = useUiStore()
const headerHeight = uiStore.headerHeight
x.value = 0
y.value = 0 // Start below header and status bar
width.value = bounds.width
// Height: viewport - header - both safe-areas
height.value = bounds.height - headerHeight - safeAreaTop - safeAreaBottom
isMaximized.value = true
}
}
}
// Window resizing
const handleResizeStart = (direction: string, e: MouseEvent | TouchEvent) => {
isResizing.value = true
resizeDirection.value = direction
let clientX: number
let clientY: number
if ('touches' in e) {
// It's a TouchEvent
const touch = e.touches[0] // get the first touch point
// Check that 'touch' exists (it is undefined when e.touches is empty)
if (touch) {
clientX = touch.clientX
clientY = touch.clientY
} else {
// Invalid start event (no touch point). Abort.
isResizing.value = false
return
}
} else {
// It's a MouseEvent
clientX = e.clientX
clientY = e.clientY
}
resizeStartX.value = clientX
resizeStartY.value = clientY
resizeStartWidth.value = width.value
resizeStartHeight.value = height.value
resizeStartPosX.value = x.value
resizeStartPosY.value = y.value
}
// Global mouse move handler (for resizing only, dragging handled by useDrag)
useEventListener(window, 'mousemove', (e: MouseEvent) => {
if (isResizing.value) {
const deltaX = e.clientX - resizeStartX.value
const deltaY = e.clientY - resizeStartY.value
const dir = resizeDirection.value
// Handle width changes
if (dir.includes('e')) {
width.value = Math.max(300, resizeStartWidth.value + deltaX)
} else if (dir.includes('w')) {
const newWidth = Math.max(300, resizeStartWidth.value - deltaX)
const widthDiff = resizeStartWidth.value - newWidth
x.value = resizeStartPosX.value + widthDiff
width.value = newWidth
}
// Handle height changes
if (dir.includes('s')) {
height.value = Math.max(200, resizeStartHeight.value + deltaY)
} else if (dir.includes('n')) {
const newHeight = Math.max(200, resizeStartHeight.value - deltaY)
const heightDiff = resizeStartHeight.value - newHeight
y.value = resizeStartPosY.value + heightDiff
height.value = newHeight
}
}
})
// Global mouse up handler (for resizing only, dragging handled by useDrag)
useEventListener(window, 'mouseup', () => {
if (isResizing.value) {
globalThis.getSelection()?.removeAllRanges()
isResizing.value = false
emit('positionChanged', x.value, y.value)
emit('sizeChanged', width.value, height.value)
}
})
</script>
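One gap worth noting: handleResizeStart accepts a TouchEvent, but the global listeners above only cover mousemove and mouseup, so resizing currently works with a mouse only. A hedged sketch of the corresponding touch listeners, reusing the same state and minimum sizes (not part of this commit):

// Hedged sketch: touch counterparts to the mouse-based resize listeners above.
useEventListener(window, 'touchmove', (e: TouchEvent) => {
  if (!isResizing.value) return
  const touch = e.touches[0]
  if (!touch) return
  const deltaX = touch.clientX - resizeStartX.value
  const deltaY = touch.clientY - resizeStartY.value
  const dir = resizeDirection.value
  if (dir.includes('e')) width.value = Math.max(300, resizeStartWidth.value + deltaX)
  if (dir.includes('s')) height.value = Math.max(200, resizeStartHeight.value + deltaY)
  // The west/north branches would mirror the mouse handler (adjust x/y together with size).
})
useEventListener(window, 'touchend', () => {
  if (!isResizing.value) return
  isResizing.value = false
  emit('positionChanged', x.value, y.value)
  emit('sizeChanged', width.value, height.value)
})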

View File

@ -0,0 +1,222 @@
<template>
<UDrawer
v-model:open="localShowWindowOverview"
direction="bottom"
:title="t('modal.title')"
:description="t('modal.description')"
>
<template #content>
<div class="h-full overflow-y-auto p-6 justify-center flex">
<!-- Window Thumbnails Flex Layout -->
<div
v-if="windows.length > 0"
class="flex flex-wrap gap-6 justify-center-safe items-start"
>
<div
v-for="window in windows"
:key="window.id"
class="relative group cursor-pointer"
>
<!-- Window Title Bar -->
<div class="flex items-center gap-3 mb-2 px-2">
<UIcon
v-if="window.icon"
:name="window.icon"
class="size-5 shrink-0"
/>
<div class="flex-1 min-w-0">
<p class="font-semibold text-sm truncate">
{{ window.title }}
</p>
</div>
<!-- Minimized Badge -->
<UBadge
v-if="window.isMinimized"
color="info"
size="xs"
:title="t('minimized')"
/>
</div>
<!-- Scaled Window Preview Container / Teleport Target -->
<div
:id="`window-preview-${window.id}`"
class="relative bg-gray-100 dark:bg-gray-900 rounded-xl overflow-hidden border-2 border-gray-200 dark:border-gray-700 group-hover:border-primary-500 transition-all shadow-lg"
:style="getCardStyle(window)"
@click="handleRestoreAndActivateWindow(window.id)"
>
<!-- Hover Overlay -->
<div
class="absolute inset-0 bg-primary-500/10 opacity-0 group-hover:opacity-100 transition-opacity pointer-events-none z-40"
/>
</div>
</div>
</div>
<!-- Empty State -->
<div
v-else
class="flex flex-col items-center justify-center py-12 text-gray-500 dark:text-gray-400"
>
<UIcon
name="i-heroicons-window"
class="size-16 mb-4 shrink-0"
/>
<p class="text-lg font-medium">No windows open</p>
<p class="text-sm">
Open an extension or system window to see it here
</p>
</div>
</div>
</template>
</UDrawer>
</template>
<script setup lang="ts">
const { t } = useI18n()
const windowManager = useWindowManagerStore()
const workspaceStore = useWorkspaceStore()
const { showWindowOverview, windows } = storeToRefs(windowManager)
// Local computed for two-way binding with UModal
const localShowWindowOverview = computed({
get: () => showWindowOverview.value,
set: (value) => {
showWindowOverview.value = value
},
})
const handleRestoreAndActivateWindow = (windowId: string) => {
const window = windowManager.windows.find((w) => w.id === windowId)
if (!window) return
// Switch to the workspace where this window is located
if (window.workspaceId) {
workspaceStore.slideToWorkspace(window.workspaceId)
}
// If window is minimized, restore it first
if (window.isMinimized) {
windowManager.restoreWindow(windowId)
} else {
// If not minimized, just activate it
windowManager.activateWindow(windowId)
}
// Close the overview
localShowWindowOverview.value = false
}
// Store original window sizes and positions to restore after overview closes
const originalWindowState = ref<
Map<string, { width: number; height: number; x: number; y: number }>
>(new Map())
// Min/Max dimensions for preview cards
const MIN_PREVIEW_WIDTH = 300
const MAX_PREVIEW_WIDTH = 600
const MIN_PREVIEW_HEIGHT = 225
const MAX_PREVIEW_HEIGHT = 450
// Calculate card size and scale based on window dimensions
const getCardStyle = (window: (typeof windows.value)[0]) => {
const scaleX = MAX_PREVIEW_WIDTH / window.width
const scaleY = MAX_PREVIEW_HEIGHT / window.height
const scale = Math.min(scaleX, scaleY, 1) // Never scale up, only down
// Calculate scaled dimensions
const scaledWidth = window.width * scale
const scaledHeight = window.height * scale
// Ensure minimum card size
let finalScale = scale
if (scaledWidth < MIN_PREVIEW_WIDTH) {
finalScale = MIN_PREVIEW_WIDTH / window.width
}
if (scaledHeight < MIN_PREVIEW_HEIGHT) {
finalScale = Math.max(finalScale, MIN_PREVIEW_HEIGHT / window.height)
}
const cardWidth = window.width * finalScale
const cardHeight = window.height * finalScale
return {
width: `${cardWidth}px`,
height: `${cardHeight}px`,
'--window-scale': finalScale, // CSS variable for scale
}
}
// Watch for overview closing to restore windows
watch(localShowWindowOverview, async (isOpen, wasOpen) => {
if (!isOpen && wasOpen) {
console.log('[WindowOverview] Overview closed, restoring windows...')
// Restore original window state
for (const window of windows.value) {
const originalState = originalWindowState.value.get(window.id)
if (originalState) {
console.log(
`[WindowOverview] Restoring window ${window.id} to:`,
originalState,
)
windowManager.updateWindowSize(
window.id,
originalState.width,
originalState.height,
)
windowManager.updateWindowPosition(
window.id,
originalState.x,
originalState.y,
)
}
}
originalWindowState.value.clear()
}
})
// Watch for overview opening to store original state
watch(
() => localShowWindowOverview.value && windows.value.length,
(shouldStore) => {
if (shouldStore && originalWindowState.value.size === 0) {
console.log('[WindowOverview] Storing original window states...')
for (const window of windows.value) {
console.log(`[WindowOverview] Window ${window.id}:`, {
originalSize: { width: window.width, height: window.height },
originalPos: { x: window.x, y: window.y },
})
originalWindowState.value.set(window.id, {
width: window.width,
height: window.height,
x: window.x,
y: window.y,
})
}
}
},
)
</script>
<i18n lang="yaml">
de:
modal:
title: Fensterübersicht
description: Übersicht aller offenen Fenster auf allen Workspaces
minimized: Minimiert
en:
modal:
title: Window Overview
description: Overview of all open windows on all workspaces
minimized: Minimized
</i18n>

View File

@ -0,0 +1,61 @@
<template>
<template v-if="!disabled">
<div
class="absolute top-0 left-0 size-2 cursor-nw-resize z-10"
@mousedown.left.stop="emitResizeStart('nw', $event)"
@touchstart.passive.stop="emitResizeStart('nw', $event)"
/>
<div
class="absolute top-0 right-0 size-2 cursor-ne-resize z-10"
@mousedown.left.stop="emitResizeStart('ne', $event)"
@touchstart.passive.stop="emitResizeStart('ne', $event)"
/>
<div
class="absolute bottom-0 left-0 size-2 cursor-sw-resize z-10"
@mousedown.left.stop="emitResizeStart('sw', $event)"
@touchstart.passive.stop="emitResizeStart('sw', $event)"
/>
<div
class="absolute bottom-0 right-0 w-2 h-2 cursor-se-resize z-10"
@mousedown.left.stop="emitResizeStart('se', $event)"
@touchstart.passive.stop="emitResizeStart('se', $event)"
/>
<div
class="absolute top-0 left-2 right-2 h-2 cursor-n-resize z-10"
@mousedown.left.stop="emitResizeStart('n', $event)"
@touchstart.passive.stop="emitResizeStart('n', $event)"
/>
<div
class="absolute bottom-0 left-2 right-2 h-2 cursor-s-resize z-10"
@mousedown.left.stop="emitResizeStart('s', $event)"
@touchstart.passive.stop="emitResizeStart('s', $event)"
/>
<div
class="absolute left-0 top-2 bottom-2 w-2 cursor-w-resize z-10"
@mousedown.left.stop="emitResizeStart('w', $event)"
@touchstart.passive.stop="emitResizeStart('w', $event)"
/>
<div
class="absolute right-0 top-2 bottom-2 w-2 cursor-e-resize z-10"
@mousedown.left.stop="emitResizeStart('e', $event)"
@touchstart.passive.stop="emitResizeStart('e', $event)"
/>
</template>
</template>
<script setup lang="ts">
// Props: only indicates whether the resize handles should be shown
defineProps<{
disabled?: boolean // True if window is maximized
}>()
// Emits: signals the start of a resize with its direction and the originating event
const emit = defineEmits<{
resizeStart: [direction: string, event: MouseEvent | TouchEvent]
}>()
// Forward the event up to the parent window component
const emitResizeStart = (direction: string, event: MouseEvent | TouchEvent) => {
emit('resizeStart', direction, event)
}
</script>

View File

@ -0,0 +1,98 @@
<template>
<UCard
ref="cardEl"
class="cursor-pointer transition-all h-32 w-72 shrink-0 group duration-500 rounded-lg"
:class="[
workspace.id === currentWorkspace?.id
? 'ring-2 ring-secondary bg-secondary/10'
: 'hover:ring-2 hover:ring-gray-300',
isDragOver ? 'ring-4 ring-primary bg-primary/20 scale-105' : '',
]"
@click="workspaceStore.slideToWorkspace(workspace.id)"
>
<template #header>
<div class="flex justify-between">
<h3 class="font-semibold text-gray-900 dark:text-white text-lg">
{{ workspace.name }}
</h3>
<UButton
v-if="workspaceStore.workspaces.length > 1"
icon="mdi-close"
variant="ghost"
class="group-hover:opacity-100 opacity-0 transition-opacity duration-300"
@click.stop="workspaceStore.closeWorkspaceAsync(workspace.id)"
/>
</div>
</template>
</UCard>
</template>
<script setup lang="ts">
const props = defineProps<{ workspace: IWorkspace }>()
const workspaceStore = useWorkspaceStore()
const windowManager = useWindowManagerStore()
const { currentWorkspace } = storeToRefs(workspaceStore)
const cardEl = useTemplateRef('cardEl')
const isDragOver = ref(false)
// Use mouse position to detect if over card
const { x: mouseX, y: mouseY } = useMouse()
// Check if mouse is over this card while dragging
watchEffect(() => {
if (!windowManager.draggingWindowId || !cardEl.value?.$el) {
isDragOver.value = false
return
}
// Get card bounding box
const rect = cardEl.value.$el.getBoundingClientRect()
// Check if mouse is within card bounds
const isOver =
mouseX.value >= rect.left &&
mouseX.value <= rect.right &&
mouseY.value >= rect.top &&
mouseY.value <= rect.bottom
isDragOver.value = isOver
})
// Handle drop when drag ends - check BEFORE draggingWindowId is cleared
let wasOverThisCard = false
watchEffect(() => {
if (isDragOver.value && windowManager.draggingWindowId) {
wasOverThisCard = true
}
})
watch(
() => windowManager.draggingWindowId,
(newValue, oldValue) => {
// Drag ended (from something to null)
if (oldValue && !newValue && wasOverThisCard) {
console.log(
'[WorkspaceCard] Drop detected! Moving window to workspace:',
props.workspace.name,
)
const window = windowManager.windows.find((w) => w.id === oldValue)
if (window) {
window.workspaceId = props.workspace.id
window.x = 0
window.y = 0
// Switch to the workspace after dropping
//workspaceStore.slideToWorkspace(props.workspace.id)
}
wasOverThisCard = false
} else if (!newValue) {
// Drag ended but not over this card
wasOverThisCard = false
}
},
)
</script>

View File

@ -0,0 +1,28 @@
<template>
<UContextMenu :items="contextMenuItems">
<UiButton
v-bind="$attrs"
@click="$emit('click', $event)"
>
<template
v-for="(_, slotName) in $slots"
#[slotName]="slotProps"
>
<slot
:name="slotName"
v-bind="slotProps"
/>
</template>
</UiButton>
</UContextMenu>
</template>
<script setup lang="ts">
import type { ContextMenuItem } from '@nuxt/ui'
defineProps<{
contextMenuItems: ContextMenuItem[]
}>()
defineEmits<{ click: [Event] }>()
</script>

View File

@ -4,11 +4,10 @@
<UButton <UButton
class="pointer-events-auto" class="pointer-events-auto"
v-bind="{ v-bind="{
...{ size: isSmallScreen ? 'lg' : 'md' },
...buttonProps, ...buttonProps,
...$attrs, ...$attrs,
}" }"
@click="(e) => $emit('click', e)" @click="$emit('click', $event)"
> >
<template <template
v-for="(_, slotName) in $slots" v-for="(_, slotName) in $slots"

View File

@ -4,23 +4,15 @@
:title :title
:description :description
> >
<slot> <template
<!-- <UiButton v-for="(_, name) in $slots"
color="primary" :key="name"
variant="outline" #[name]="slotData"
icon="mdi:menu" >
:ui="{ <slot
base: '', :name="name"
}" v-bind="slotData"
/> --> />
</slot>
<template #title>
<slot name="title" />
</template>
<template #body>
<slot name="body" />
</template> </template>
<template #footer> <template #footer>
@ -38,7 +30,7 @@
:label="confirmLabel || t('confirm')" :label="confirmLabel || t('confirm')"
block block
color="primary" color="primary"
varaint="solid" variant="solid"
@click="$emit('confirm')" @click="$emit('confirm')"
/> />
</div> </div>

View File

@ -11,10 +11,6 @@ const { availableThemes, currentTheme } = storeToRefs(useUiStore())
const emit = defineEmits<{ select: [string] }>() const emit = defineEmits<{ select: [string] }>()
watchImmediate(availableThemes, () =>
console.log('availableThemes', availableThemes),
)
const items = computed<DropdownMenuItem[]>(() => const items = computed<DropdownMenuItem[]>(() =>
availableThemes?.value.map((theme) => ({ availableThemes?.value.map((theme) => ({
...theme, ...theme,

View File

@ -17,7 +17,7 @@
:title="t('pick')" :title="t('pick')"
class="top-0 left-0 absolute size-0" class="top-0 left-0 absolute size-0"
type="color" type="color"
/> >
<UiTooltip :tooltip="t('reset')"> <UiTooltip :tooltip="t('reset')">
<UiButton <UiButton

View File

@ -2,8 +2,8 @@
<UDropdownMenu <UDropdownMenu
:items="icons" :items="icons"
class="btn" class="btn"
@select="(newIcon) => (iconName = newIcon)"
:read_only :read_only
@select="(newIcon) => (iconName = newIcon)"
> >
<template #activator> <template #activator>
<Icon :name="iconName ? iconName : defaultIcon || icons.at(0)" /> <Icon :name="iconName ? iconName : defaultIcon || icons.at(0)" />
@ -12,8 +12,8 @@
<template #items="{ items }"> <template #items="{ items }">
<div class="grid grid-cols-6 -ml-2"> <div class="grid grid-cols-6 -ml-2">
<li <li
class="dropdown-item"
v-for="item in items" v-for="item in items"
class="dropdown-item"
@click="read_only ? '' : (iconName = item)" @click="read_only ? '' : (iconName = item)"
> >
<Icon <Icon

View File

@ -6,8 +6,8 @@
<button <button
:id :id
class="advance-select-toogle flex justify-between grow p-3" class="advance-select-toogle flex justify-between grow p-3"
@click.prevent="toogleMenu"
:disabled="read_only" :disabled="read_only"
@click.prevent="toogleMenu"
> >
<slot <slot
name="value" name="value"
@ -18,9 +18,9 @@
</slot> </slot>
</button> </button>
<button <button
@click.prevent="toogleMenu"
class="flex items-center p-2 hover:shadow rounded-md hover:bg-primary hover:text-base-content" class="flex items-center p-2 hover:shadow rounded-md hover:bg-primary hover:text-base-content"
:disabled="read_only" :disabled="read_only"
@click.prevent="toogleMenu"
> >
<i class="i-[material-symbols--keyboard-arrow-down] size-4" /> <i class="i-[material-symbols--keyboard-arrow-down] size-4" />
</button> </button>

View File

@ -1,65 +0,0 @@
// composables/extensionContextBroadcast.ts
// NOTE: This composable is deprecated. Use tabsStore.broadcastToAllTabs() instead.
// Keeping for backwards compatibility.
import { getExtensionWindow } from './extensionMessageHandler'
export const useExtensionContextBroadcast = () => {
// Global state for extension IDs instead of iframes
const extensionIds = useState<Set<string>>(
'extension-ids',
() => new Set(),
)
const registerExtensionIframe = (_iframe: HTMLIFrameElement, extensionId: string) => {
extensionIds.value.add(extensionId)
}
const unregisterExtensionIframe = (_iframe: HTMLIFrameElement, extensionId: string) => {
extensionIds.value.delete(extensionId)
}
const broadcastContextChange = (context: {
theme: string
locale: string
platform: string
}) => {
const message = {
type: 'context.changed',
data: { context },
timestamp: Date.now(),
}
extensionIds.value.forEach((extensionId) => {
const win = getExtensionWindow(extensionId)
if (win) {
win.postMessage(message, '*')
}
})
}
const broadcastSearchRequest = (query: string, requestId: string) => {
const message = {
type: 'search.request',
data: {
query: { query, limit: 10 },
requestId,
},
timestamp: Date.now(),
}
extensionIds.value.forEach((extensionId) => {
const win = getExtensionWindow(extensionId)
if (win) {
win.postMessage(message, '*')
}
})
}
return {
registerExtensionIframe,
unregisterExtensionIframe,
broadcastContextChange,
broadcastSearchRequest,
}
}
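The note above points to tabsStore.broadcastToAllTabs(), whose signature is not shown in this changeset, so it is not reproduced here. The refactored message handler in the next file keeps a windowIdToWindowMap that can serve the same purpose; a rough sketch of broadcasting the identical context.changed message over that map, under that assumption:

// Hedged sketch: broadcast a context change to every registered extension window.
const broadcastContextChange = (context: {
  theme: string
  locale: string
  platform: string
}) => {
  const message = { type: 'context.changed', data: { context }, timestamp: Date.now() }
  for (const win of windowIdToWindowMap.values()) {
    win.postMessage(message, '*')
  }
}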

View File

@ -16,11 +16,15 @@ interface ExtensionRequest {
// Global handler - registered only once
let globalHandlerRegistered = false let globalHandlerRegistered = false
const iframeRegistry = new Map<HTMLIFrameElement, IHaexHubExtension>() interface ExtensionInstance {
// Map event.source (WindowProxy) to extension for sandbox-compatible matching extension: IHaexHubExtension
const sourceRegistry = new Map<Window, IHaexHubExtension>() windowId: string
// Reverse map: extension ID to Window for broadcasting }
const extensionToWindowMap = new Map<string, Window>() const iframeRegistry = new Map<HTMLIFrameElement, ExtensionInstance>()
// Map event.source (WindowProxy) to extension instance for sandbox-compatible matching
const sourceRegistry = new Map<Window, ExtensionInstance>()
// Reverse map: window ID to Window for broadcasting (supports multiple windows per extension)
const windowIdToWindowMap = new Map<string, Window>()
// Store context values that need to be accessed outside setup // Store context values that need to be accessed outside setup
let contextGetters: { let contextGetters: {
@@ -40,15 +44,25 @@ const registerGlobalMessageHandler = () => {
     const request = event.data as ExtensionRequest

-    // Find extension by decoding event.origin (works with sandboxed iframes)
+    // Find extension instance by decoding event.origin (works with sandboxed iframes)
     // Origin formats:
     // - Desktop: haex-extension://<base64>
     // - Android: http://haex-extension.localhost (need to check request URL for base64)
-    let extension: IHaexHubExtension | undefined
+    let instance: ExtensionInstance | undefined
+
+    // Debug: Find which extension sent this message
+    let sourceInfo = 'unknown source'
+    for (const [iframe, inst] of iframeRegistry.entries()) {
+      if (iframe.contentWindow === event.source) {
+        sourceInfo = `${inst.extension.name} (${inst.windowId})`
+        break
+      }
+    }

     console.log(
-      '[ExtensionHandler] Received message from origin:',
-      event.origin,
+      '[ExtensionHandler] Received message from:',
+      sourceInfo,
+      'Method:',
+      request.method,
     )

     // Try to decode extension info from origin
@@ -74,10 +88,10 @@ const registerGlobalMessageHandler = () => {
         if (iframeRegistry.size === 1) {
           const entry = Array.from(iframeRegistry.entries())[0]
           if (entry) {
-            const [_, ext] = entry
-            extension = ext
-            sourceRegistry.set(event.source as Window, ext)
-            extensionToWindowMap.set(ext.id, event.source as Window)
+            const [_, inst] = entry
+            instance = inst
+            sourceRegistry.set(event.source as Window, inst)
+            windowIdToWindowMap.set(inst.windowId, event.source as Window)
           }
         }
       }
@@ -91,16 +105,16 @@ const registerGlobalMessageHandler = () => {
         }

         // Find matching extension in registry
-        for (const [_, ext] of iframeRegistry.entries()) {
+        for (const [_, inst] of iframeRegistry.entries()) {
           if (
-            ext.name === decodedInfo.name &&
-            ext.publicKey === decodedInfo.publicKey &&
-            ext.version === decodedInfo.version
+            inst.extension.name === decodedInfo.name &&
+            inst.extension.publicKey === decodedInfo.publicKey &&
+            inst.extension.version === decodedInfo.version
           ) {
-            extension = ext
+            instance = inst
             // Register for future lookups
-            sourceRegistry.set(event.source as Window, ext)
-            extensionToWindowMap.set(ext.id, event.source as Window)
+            sourceRegistry.set(event.source as Window, inst)
+            windowIdToWindowMap.set(inst.windowId, event.source as Window)
             break
           }
         }
@@ -110,31 +124,36 @@ const registerGlobalMessageHandler = () => {
       }
     }

-    // Fallback: Try to find extension by event.source (for localhost origin or legacy)
-    if (!extension) {
-      extension = sourceRegistry.get(event.source as Window)
-      // If not registered yet, register on first message from this source
-      if (!extension && iframeRegistry.size === 1) {
-        // If we only have one iframe, assume this message is from it
-        const entry = Array.from(iframeRegistry.entries())[0]
-        if (entry) {
-          const [_, ext] = entry
-          const windowSource = event.source as Window
-          sourceRegistry.set(windowSource, ext)
-          extensionToWindowMap.set(ext.id, windowSource)
-          extension = ext
+    // Fallback: Try to find extension instance by event.source (for localhost origin or legacy)
+    if (!instance) {
+      instance = sourceRegistry.get(event.source as Window)
+      // If not registered yet, find by matching iframe.contentWindow to event.source
+      if (!instance) {
+        for (const [iframe, inst] of iframeRegistry.entries()) {
+          if (iframe.contentWindow === event.source) {
+            instance = inst
+            // Register for future lookups
+            sourceRegistry.set(event.source as Window, inst)
+            windowIdToWindowMap.set(inst.windowId, event.source as Window)
+            console.log(
+              '[ExtensionHandler] Registered instance via contentWindow match:',
+              inst.windowId,
+            )
+            break
+          }
         }
-      } else if (extension && !extensionToWindowMap.has(extension.id)) {
+      } else if (instance && !windowIdToWindowMap.has(instance.windowId)) {
         // Also register in reverse map for broadcasting
-        extensionToWindowMap.set(extension.id, event.source as Window)
+        windowIdToWindowMap.set(instance.windowId, event.source as Window)
       }
     }

-    if (!extension) {
+    if (!instance) {
       console.warn(
-        '[ExtensionHandler] Could not identify extension for message:',
-        event.origin,
+        '[ExtensionHandler] Could not identify extension instance from event.source.',
+        'Registered iframes:',
+        iframeRegistry.size,
       )
       return // Message is not from a registered iframe
     }
@@ -147,20 +166,18 @@ const registerGlobalMessageHandler = () => {
     try {
       let result: unknown

-      if (request.method.startsWith('extension.')) {
-        result = await handleExtensionMethodAsync(request, extension)
-      } else if (request.method.startsWith('db.')) {
-        result = await handleDatabaseMethodAsync(request, extension)
-      } else if (request.method.startsWith('fs.')) {
-        result = await handleFilesystemMethodAsync(request, extension)
-      } else if (request.method.startsWith('http.')) {
-        result = await handleHttpMethodAsync(request, extension)
-      } else if (request.method.startsWith('permissions.')) {
-        result = await handlePermissionsMethodAsync(request, extension)
-      } else if (request.method.startsWith('context.')) {
+      if (request.method.startsWith('haextension.context.')) {
         result = await handleContextMethodAsync(request)
-      } else if (request.method.startsWith('storage.')) {
-        result = await handleStorageMethodAsync(request, extension)
+      } else if (request.method.startsWith('haextension.storage.')) {
+        result = await handleStorageMethodAsync(request, instance)
+      } else if (request.method.startsWith('haextension.db.')) {
+        result = await handleDatabaseMethodAsync(request, instance.extension)
+      } else if (request.method.startsWith('haextension.fs.')) {
+        result = await handleFilesystemMethodAsync(request, instance.extension)
+      } else if (request.method.startsWith('haextension.http.')) {
+        result = await handleHttpMethodAsync(request, instance.extension)
+      } else if (request.method.startsWith('haextension.permissions.')) {
+        result = await handlePermissionsMethodAsync(request, instance.extension)
       } else {
         throw new Error(`Unknown method: ${request.method}`)
       }
@@ -203,6 +220,7 @@
 export const useExtensionMessageHandler = (
   iframeRef: Ref<HTMLIFrameElement | undefined | null>,
   extension: ComputedRef<IHaexHubExtension | undefined | null>,
+  windowId: Ref<string>,
 ) => {
   // Initialize context getters (can use composables here because we're in setup)
   const { currentTheme } = storeToRefs(useUiStore())
@@ -223,13 +241,26 @@ export const useExtensionMessageHandler = (
   // Register this iframe
   watchEffect(() => {
     if (iframeRef.value && extension.value) {
-      iframeRegistry.set(iframeRef.value, extension.value)
+      iframeRegistry.set(iframeRef.value, {
+        extension: extension.value,
+        windowId: windowId.value,
+      })
     }
   })

   // Cleanup on unmount
   onUnmounted(() => {
     if (iframeRef.value) {
+      const instance = iframeRegistry.get(iframeRef.value)
+      if (instance) {
+        // Remove from all maps
+        windowIdToWindowMap.delete(instance.windowId)
+        for (const [source, inst] of sourceRegistry.entries()) {
+          if (inst.windowId === instance.windowId) {
+            sourceRegistry.delete(source)
+          }
+        }
+      }
       iframeRegistry.delete(iframeRef.value)
     }
   })
@@ -239,6 +270,7 @@ export const useExtensionMessageHandler = (
 export const registerExtensionIFrame = (
   iframe: HTMLIFrameElement,
   extension: IHaexHubExtension,
+  windowId: string,
 ) => {
   // Make sure the global handler is registered
   registerGlobalMessageHandler()
@@ -250,54 +282,71 @@
     )
   }

-  iframeRegistry.set(iframe, extension)
+  iframeRegistry.set(iframe, { extension, windowId })
 }

 export const unregisterExtensionIFrame = (iframe: HTMLIFrameElement) => {
-  // Also remove from source registry
-  const ext = iframeRegistry.get(iframe)
-  if (ext) {
-    // Find and remove all sources pointing to this extension
-    for (const [source, extension] of sourceRegistry.entries()) {
-      if (extension === ext) {
+  // Also remove from source registry and instance map
+  const instance = iframeRegistry.get(iframe)
+  if (instance) {
+    // Find and remove all sources pointing to this instance
+    for (const [source, inst] of sourceRegistry.entries()) {
+      if (inst.windowId === instance.windowId) {
         sourceRegistry.delete(source)
       }
     }
-    // Remove from extension-to-window map
-    extensionToWindowMap.delete(ext.id)
+    // Remove from instance-to-window map
+    windowIdToWindowMap.delete(instance.windowId)
   }
   iframeRegistry.delete(iframe)
 }

-// Export function to get Window for an extension (for broadcasting)
-export const getExtensionWindow = (extensionId: string): Window | undefined => {
-  return extensionToWindowMap.get(extensionId)
-}
-
-// ==========================================
-// Extension Methods
-// ==========================================
-
-async function handleExtensionMethodAsync(
-  request: ExtensionRequest,
-  extension: IHaexHubExtension, // Direct type, no longer a ComputedRef
-) {
-  switch (request.method) {
-    case 'extension.getInfo': {
-      const info = (await invoke('get_extension_info', {
-        publicKey: extension.publicKey,
-        name: extension.name,
-      })) as Record<string, unknown>
-
-      // Override allowedOrigin with the actual window origin
-      // This fixes the dev-mode issue where Rust returns "tauri://localhost"
-      // but the actual origin is "http://localhost:3003"
-      return {
-        ...info,
-        allowedOrigin: window.location.origin,
-      }
-    }
-    default:
-      throw new Error(`Unknown extension method: ${request.method}`)
-  }
-}
+// Export function to get Window for a specific instance (for broadcasting)
+export const getInstanceWindow = (windowId: string): Window | undefined => {
+  return windowIdToWindowMap.get(windowId)
+}
+
+// Get all windows for an extension (all instances)
+export const getAllInstanceWindows = (extensionId: string): Window[] => {
+  const windows: Window[] = []
+  for (const [_, instance] of iframeRegistry.entries()) {
+    if (instance.extension.id === extensionId) {
+      const win = windowIdToWindowMap.get(instance.windowId)
+      if (win) {
+        windows.push(win)
+      }
+    }
+  }
+  return windows
+}
+
+// Deprecated - kept for backwards compatibility
+export const getExtensionWindow = (extensionId: string): Window | undefined => {
+  // Return first window for this extension
+  return getAllInstanceWindows(extensionId)[0]
+}
+
+// Broadcast context changes to all extension instances
+export const broadcastContextToAllExtensions = (context: {
+  theme: string
+  locale: string
+  platform?: string
+}) => {
+  const message = {
+    type: 'haextension.context.changed',
+    data: { context },
+    timestamp: Date.now(),
+  }
+
+  console.log('[ExtensionHandler] Broadcasting context to all extensions:', context)
+
+  // Send to all registered extension windows
+  for (const [_, instance] of iframeRegistry.entries()) {
+    const win = windowIdToWindowMap.get(instance.windowId)
+    if (win) {
+      console.log('[ExtensionHandler] Sending context to:', instance.extension.name, instance.windowId)
+      win.postMessage(message, '*')
+    }
+  }
+}
@@ -315,11 +364,12 @@ async function handleDatabaseMethodAsync(
   }

   switch (request.method) {
-    case 'db.query': {
+    case 'haextension.db.query': {
       const rows = await invoke<unknown[]>('extension_sql_select', {
         sql: params.query || '',
         params: params.params || [],
-        extensionId: extension.id,
+        publicKey: extension.publicKey,
+        name: extension.name,
       })

       return {
@@ -329,21 +379,22 @@
       }
     }

-    case 'db.execute': {
-      await invoke<string[]>('extension_sql_execute', {
+    case 'haextension.db.execute': {
+      const rows = await invoke<unknown[]>('extension_sql_execute', {
         sql: params.query || '',
         params: params.params || [],
-        extensionId: extension.id,
+        publicKey: extension.publicKey,
+        name: extension.name,
       })

       return {
-        rows: [],
+        rows,
         rowsAffected: 1,
         lastInsertId: undefined,
       }
     }

-    case 'db.transaction': {
+    case 'haextension.db.transaction': {
       const statements =
         (request.params as { statements?: string[] }).statements || []
@@ -351,7 +402,8 @@ async function handleDatabaseMethodAsync(
         await invoke('extension_sql_execute', {
           sql: stmt,
           params: [],
-          extensionId: extension.id,
+          publicKey: extension.publicKey,
+          name: extension.name,
         })
       }
@@ -413,7 +465,7 @@ async function handlePermissionsMethodAsync(
 async function handleContextMethodAsync(request: ExtensionRequest) {
   switch (request.method) {
-    case 'context.get':
+    case 'haextension.context.get':
       if (!contextGetters) {
         throw new Error(
           'Context not initialized. Make sure useExtensionMessageHandler is called in a component.',
@@ -436,34 +488,35 @@ async function handleContextMethodAsync(request: ExtensionRequest) {
 async function handleStorageMethodAsync(
   request: ExtensionRequest,
-  extension: IHaexHubExtension,
+  instance: ExtensionInstance,
 ) {
-  const storageKey = `ext_${extension.id}_`
+  // Storage is now per-window, not per-extension
+  const storageKey = `ext_${instance.extension.id}_${instance.windowId}_`

   console.log(
-    `[HaexHub Storage] ${request.method} for extension ${extension.id}`,
+    `[HaexHub Storage] ${request.method} for window ${instance.windowId}`,
   )

   switch (request.method) {
-    case 'storage.getItem': {
+    case 'haextension.storage.getItem': {
       const key = request.params.key as string
       return localStorage.getItem(storageKey + key)
     }

-    case 'storage.setItem': {
+    case 'haextension.storage.setItem': {
       const key = request.params.key as string
       const value = request.params.value as string
       localStorage.setItem(storageKey + key, value)
       return null
     }

-    case 'storage.removeItem': {
+    case 'haextension.storage.removeItem': {
       const key = request.params.key as string
       localStorage.removeItem(storageKey + key)
       return null
     }

-    case 'storage.clear': {
-      // Remove only extension-specific keys
+    case 'haextension.storage.clear': {
+      // Remove only instance-specific keys
       const keys = Object.keys(localStorage).filter((k) =>
         k.startsWith(storageKey),
       )
@@ -471,8 +524,8 @@ async function handleStorageMethodAsync(
       return null
     }

-    case 'storage.keys': {
-      // Return only extension-specific keys (without prefix)
+    case 'haextension.storage.keys': {
+      // Return only instance-specific keys (without prefix)
       const keys = Object.keys(localStorage)
         .filter((k) => k.startsWith(storageKey))
         .map((k) => k.substring(storageKey.length))
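The registry functions above are exported from this module; a short host-side sketch of the per-window flow, assuming an iframe element and a resolved IHaexHubExtension record are already in hand (the windowId value is illustrative):

// One registration per extension window; the new third parameter keys all maps.
const windowId = crypto.randomUUID()
registerExtensionIFrame(iframe, extension, windowId)

// Push a context change to every registered window of every extension.
broadcastContextToAllExtensions({ theme: 'dark', locale: 'en-US', platform: 'desktop' })

// Target a single window, or all windows of one extension.
const single = getInstanceWindow(windowId)
const all = getAllInstanceWindows(extension.id)

// Tear down: clears the sourceRegistry and windowIdToWindowMap entries for this window.
unregisterExtensionIFrame(iframe)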

View File

@@ -14,12 +14,20 @@ export function useAndroidBackButton() {
   // Track navigation history manually
   router.afterEach((to, from) => {
-    console.log('[AndroidBack] Navigation:', { to: to.path, from: from.path, stackSize: historyStack.value.length })
+    console.log('[AndroidBack] Navigation:', {
+      to: to.path,
+      from: from.path,
+      stackSize: historyStack.value.length,
+    })

     // If navigating forward (new page)
-    if (from.path && to.path !== from.path && !historyStack.value.includes(to.path)) {
+    if (
+      from.path &&
+      to.path !== from.path &&
+      !historyStack.value.includes(to.path)
+    ) {
       historyStack.value.push(from.path)
-      console.log('[AndroidBack] Added to stack:', from.path, 'Stack:', historyStack.value)
+      //console.log('[AndroidBack] Added to stack:', from.path, 'Stack:', historyStack.value)
     }
   })
@@ -31,7 +39,10 @@
     // Listen to close requested event (triggered by Android back button)
     unlisten = await appWindow.onCloseRequested(async (event) => {
-      console.log('[AndroidBack] Back button pressed, stack size:', historyStack.value.length)
+      console.log(
+        '[AndroidBack] Back button pressed, stack size:',
+        historyStack.value.length,
+      )

       // Check if we have history
       if (historyStack.value.length > 0) {
@@ -40,7 +51,10 @@
         // Remove current page from stack
         historyStack.value.pop()
-        console.log('[AndroidBack] Going back, new stack size:', historyStack.value.length)
+        console.log(
+          '[AndroidBack] Going back, new stack size:',
+          historyStack.value.length,
+        )

         // Navigate back in router
         router.back()

src/database/index.ts (new file, 1 line)
View File

@ -0,0 +1 @@
export * as schema from './schemas'
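The barrel export makes the whole Drizzle schema importable as one namespace. A minimal sketch of wiring it into drizzle-orm/sqlite-proxy backed by a Tauri command (the vault_sql command name and invoke payload are assumptions; only the schema import comes from this diff):

import { drizzle } from 'drizzle-orm/sqlite-proxy'
import { invoke } from '@tauri-apps/api/core'
import { schema } from '@/database'

// Forward every query to the Rust side; the proxy driver expects { rows } back.
export const db = drizzle(
  async (sql, params, method) => {
    const rows = await invoke<unknown[][]>('vault_sql', { sql, params, method }) // hypothetical command
    return { rows }
  },
  { schema },
)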

View File

@@ -1,12 +1,12 @@
 import { integer, sqliteTable, text, index } from 'drizzle-orm/sqlite-core'
-import tableNames from '../tableNames.json'
+import tableNames from '@/database/tableNames.json'

 export const haexCrdtLogs = sqliteTable(
   tableNames.haex.crdt.logs.name,
   {
     id: text()
-      .primaryKey()
-      .$defaultFn(() => crypto.randomUUID()),
+      .$defaultFn(() => crypto.randomUUID())
+      .primaryKey(),
     haexTimestamp: text(tableNames.haex.crdt.logs.columns.haexTimestamp),
     tableName: text(tableNames.haex.crdt.logs.columns.tableName),
     rowPks: text(tableNames.haex.crdt.logs.columns.rowPks, { mode: 'json' }),
@@ -33,8 +33,8 @@ export const haexCrdtSnapshots = sqliteTable(
   tableNames.haex.crdt.snapshots.name,
   {
     snapshotId: text(tableNames.haex.crdt.snapshots.columns.snapshotId)
-      .primaryKey()
-      .$defaultFn(() => crypto.randomUUID()),
+      .$defaultFn(() => crypto.randomUUID())
+      .primaryKey(),
     created: text(),
     epochHlc: text(tableNames.haex.crdt.snapshots.columns.epochHlc),
     locationUrl: text(tableNames.haex.crdt.snapshots.columns.locationUrl),
@@ -45,8 +45,6 @@
 )

 export const haexCrdtConfigs = sqliteTable(tableNames.haex.crdt.configs.name, {
-  key: text()
-    .primaryKey()
-    .$defaultFn(() => crypto.randomUUID()),
+  key: text().primaryKey(),
   value: text(),
 })

View File

@ -0,0 +1,182 @@
import { sql } from 'drizzle-orm'
import {
check,
integer,
sqliteTable,
text,
unique,
type AnySQLiteColumn,
type SQLiteColumnBuilderBase,
} from 'drizzle-orm/sqlite-core'
import tableNames from '@/database/tableNames.json'
const crdtColumnNames = {
haexTimestamp: 'haex_timestamp',
}
// Helper function to add common CRDT columns (haexTimestamp)
export const withCrdtColumns = <
T extends Record<string, SQLiteColumnBuilderBase>,
>(
columns: T,
) => ({
...columns,
haexTimestamp: text(crdtColumnNames.haexTimestamp),
})
export const haexSettings = sqliteTable(
tableNames.haex.settings.name,
withCrdtColumns({
id: text()
.$defaultFn(() => crypto.randomUUID())
.primaryKey(),
key: text(),
type: text(),
value: text(),
}),
(table) => [unique().on(table.key, table.type, table.value)],
)
export type InsertHaexSettings = typeof haexSettings.$inferInsert
export type SelectHaexSettings = typeof haexSettings.$inferSelect
export const haexExtensions = sqliteTable(
tableNames.haex.extensions.name,
withCrdtColumns({
id: text()
.$defaultFn(() => crypto.randomUUID())
.primaryKey(),
public_key: text().notNull(),
name: text().notNull(),
version: text().notNull(),
author: text(),
description: text(),
entry: text().default('index.html'),
homepage: text(),
enabled: integer({ mode: 'boolean' }).default(true),
icon: text(),
signature: text().notNull(),
single_instance: integer({ mode: 'boolean' }).default(false),
}),
(table) => [
// UNIQUE constraint: per developer (public_key), only one extension with this name may exist
unique().on(table.public_key, table.name),
],
)
export type InsertHaexExtensions = typeof haexExtensions.$inferInsert
export type SelectHaexExtensions = typeof haexExtensions.$inferSelect
export const haexExtensionPermissions = sqliteTable(
tableNames.haex.extension_permissions.name,
withCrdtColumns({
id: text()
.$defaultFn(() => crypto.randomUUID())
.primaryKey(),
extensionId: text(tableNames.haex.extension_permissions.columns.extensionId)
.notNull()
.references((): AnySQLiteColumn => haexExtensions.id, {
onDelete: 'cascade',
}),
resourceType: text('resource_type', {
enum: ['fs', 'http', 'db', 'shell'],
}),
action: text({ enum: ['read', 'write'] }),
target: text(),
constraints: text({ mode: 'json' }),
status: text({ enum: ['ask', 'granted', 'denied'] })
.notNull()
.default('denied'),
createdAt: text('created_at').default(sql`(CURRENT_TIMESTAMP)`),
updateAt: integer('updated_at', { mode: 'timestamp' }).$onUpdate(
() => new Date(),
),
}),
(table) => [
unique().on(
table.extensionId,
table.resourceType,
table.action,
table.target,
),
],
)
export type InserthaexExtensionPermissions =
typeof haexExtensionPermissions.$inferInsert
export type SelecthaexExtensionPermissions =
typeof haexExtensionPermissions.$inferSelect
export const haexNotifications = sqliteTable(
tableNames.haex.notifications.name,
withCrdtColumns({
id: text()
.$defaultFn(() => crypto.randomUUID())
.primaryKey(),
alt: text(),
date: text(),
icon: text(),
image: text(),
read: integer({ mode: 'boolean' }),
source: text(),
text: text(),
title: text(),
type: text({
enum: ['error', 'success', 'warning', 'info', 'log'],
}).notNull(),
}),
)
export type InsertHaexNotifications = typeof haexNotifications.$inferInsert
export type SelectHaexNotifications = typeof haexNotifications.$inferSelect
export const haexWorkspaces = sqliteTable(
tableNames.haex.workspaces.name,
withCrdtColumns({
id: text(tableNames.haex.workspaces.columns.id)
.$defaultFn(() => crypto.randomUUID())
.primaryKey(),
deviceId: text(tableNames.haex.workspaces.columns.deviceId).notNull(),
name: text(tableNames.haex.workspaces.columns.name).notNull(),
position: integer(tableNames.haex.workspaces.columns.position)
.notNull()
.default(0),
background: text(),
}),
(table) => [unique().on(table.position)],
)
export type InsertHaexWorkspaces = typeof haexWorkspaces.$inferInsert
export type SelectHaexWorkspaces = typeof haexWorkspaces.$inferSelect
export const haexDesktopItems = sqliteTable(
tableNames.haex.desktop_items.name,
withCrdtColumns({
id: text(tableNames.haex.desktop_items.columns.id)
.$defaultFn(() => crypto.randomUUID())
.primaryKey(),
workspaceId: text(tableNames.haex.desktop_items.columns.workspaceId)
.notNull()
.references(() => haexWorkspaces.id, { onDelete: 'cascade' }),
itemType: text(tableNames.haex.desktop_items.columns.itemType, {
enum: ['system', 'extension', 'file', 'folder'],
}).notNull(),
// For extensions (when itemType = 'extension')
extensionId: text(
tableNames.haex.desktop_items.columns.extensionId,
).references((): AnySQLiteColumn => haexExtensions.id, {
onDelete: 'cascade',
}),
// For system windows (when itemType = 'system')
systemWindowId: text(tableNames.haex.desktop_items.columns.systemWindowId),
positionX: integer(tableNames.haex.desktop_items.columns.positionX)
.notNull()
.default(0),
positionY: integer(tableNames.haex.desktop_items.columns.positionY)
.notNull()
.default(0),
}),
(table) => [
check(
'item_reference',
sql`(${table.itemType} = 'extension' AND ${table.extensionId} IS NOT NULL AND ${table.systemWindowId} IS NULL) OR (${table.itemType} = 'system' AND ${table.systemWindowId} IS NOT NULL AND ${table.extensionId} IS NULL) OR (${table.itemType} = 'file' AND ${table.systemWindowId} IS NOT NULL AND ${table.extensionId} IS NULL) OR (${table.itemType} = 'folder' AND ${table.systemWindowId} IS NOT NULL AND ${table.extensionId} IS NULL)`,
),
],
)
export type InsertHaexDesktopItems = typeof haexDesktopItems.$inferInsert
export type SelectHaexDesktopItems = typeof haexDesktopItems.$inferSelect
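With the nullable background column on haexWorkspaces, a workspace row can carry a stored image path. A small usage sketch, assuming a Drizzle db instance for the vault database and that this schema file is importable as @/database/schemas (all values are illustrative):

import { haexWorkspaces, type InsertHaexWorkspaces } from '@/database/schemas'

const workspace: InsertHaexWorkspaces = {
  deviceId: 'device-123',             // illustrative device id
  name: 'Main',
  position: 0,
  background: 'files/background.png', // stored as text; leave undefined for no background
}

await db.insert(haexWorkspaces).values(workspace)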

View File

@@ -7,7 +7,7 @@
         "key": "key",
         "type": "type",
         "value": "value",
-        "haexTombstone": "haex_tombstone",
         "haexTimestamp": "haex_timestamp"
       }
     },
@@ -26,7 +26,7 @@
         "signature": "signature",
         "url": "url",
         "version": "version",
-        "haexTombstone": "haex_tombstone",
         "haexTimestamp": "haex_timestamp"
       }
     },
@@ -42,7 +42,7 @@
         "status": "status",
         "createdAt": "created_at",
         "updateAt": "updated_at",
-        "haexTombstone": "haex_tombstone",
         "haexTimestamp": "haex_timestamp"
       }
     },
@@ -59,7 +59,19 @@
         "text": "text",
         "title": "title",
         "type": "type",
-        "haexTombstone": "haex_tombstone",
+        "haexTimestamp": "haex_timestamp"
+      }
+    },
+    "workspaces": {
+      "name": "haex_workspaces",
+      "columns": {
+        "id": "id",
+        "deviceId": "device_id",
+        "name": "name",
+        "position": "position",
+        "createdAt": "created_at",
         "haexTimestamp": "haex_timestamp"
       }
     },
@@ -67,21 +79,17 @@
       "name": "haex_desktop_items",
       "columns": {
         "id": "id",
+        "workspaceId": "workspace_id",
         "itemType": "item_type",
-        "referenceId": "reference_id",
+        "extensionId": "extension_id",
+        "systemWindowId": "system_window_id",
         "positionX": "position_x",
         "positionY": "position_y",
-        "haexTombstone": "haex_tombstone",
         "haexTimestamp": "haex_timestamp"
       }
     },
-    "passwords": {
-      "groups": "haex_passwords_groups",
-      "group_items": "haex_passwords_group_items",
-      "item_details": "haex_passwords_item_details",
-      "item_key_values": "haex_passwords_item_key_values",
-      "item_histories": "haex_passwords_item_history"
-    },
     "crdt": {
       "logs": {
         "name": "haex_crdt_logs",

Some files were not shown because too many files have changed in this diff Show More