72 Commits

3897a33565 feat: Add event broadcasting to all webview extensions
- Added emit_to_all_extensions method to ExtensionWebviewManager
- Implemented webview_extension_emit_to_all Tauri command
- Updated UI store to broadcast context changes to webview extensions
- Centralized event and method names using SDK constants
- Updated all handlers to use HAEXTENSION_METHODS constants

This enables proper context propagation (theme, locale, platform) to webview extensions.
2025-11-14 10:47:23 +01:00
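For illustration, a minimal sketch of what the frontend side of such a broadcast might look like when invoked from the UI store. The command name comes from the commit above; the payload shape and event string are assumptions standing in for the SDK constants.

```ts
import { invoke } from '@tauri-apps/api/core';

// Hypothetical context shape; the real object lives in the UI store.
interface ExtensionContext {
  theme: 'light' | 'dark';
  locale: string;
  platform: string;
}

// Broadcast a context change to every open webview extension.
// 'webview_extension_emit_to_all' is the command named in the commit;
// the event string is a placeholder for HAEXTENSION_EVENTS.CONTEXT_CHANGED.
export async function broadcastContextChange(context: ExtensionContext): Promise<void> {
  await invoke('webview_extension_emit_to_all', {
    event: 'haextension:context:changed', // assumed event name
    payload: context,
  });
}
```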
7487696af4 Fix context change propagation to webview extensions
- Added emit_to_all_extensions method to ExtensionWebviewManager
- Created webview_extension_emit_to_all Tauri command
- Updated UI store to use new command instead of emit()
- Broadcasts CONTEXT_CHANGED event to all webview windows
- Fixes dynamic context updates not reaching webview extensions
2025-11-14 10:30:56 +01:00
c1ee8e6bc0 Update to SDK v1.9.10 with centralized method names
- Updated @haexhub/sdk dependency from 1.9.7 to 1.9.10
- Imported HAEXTENSION_METHODS and HAEXTENSION_EVENTS from SDK
- Updated all handler files to use new nested method constants
- Updated extensionMessageHandler to route using constants
- Changed application.open routing in web handler
- All method names now use haextension:subject:action schema
2025-11-14 10:22:52 +01:00
2202415441 Add permission check handlers for extensions
- Add check.rs with Tauri commands for checking web, database, and filesystem permissions
- Implement handlePermissionsMethodAsync in frontend to route permission checks
- Register permission check commands in lib.rs
- Add toast notification for permission denied errors in web requests
- Extensions can now check permissions before operations via SDK
2025-11-11 15:40:01 +01:00
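A sketch of how an extension-side pre-check could be routed through Tauri IPC before a web request. The command and argument names below are hypothetical, since the commit only names check.rs and handlePermissionsMethodAsync.

```ts
import { invoke } from '@tauri-apps/api/core';

// Ask the backend whether an extension may perform a web request before
// attempting it. Command and argument names are hypothetical placeholders.
export async function canFetchUrl(extensionId: string, url: string): Promise<boolean> {
  try {
    return await invoke<boolean>('extension_check_web_permission', {
      extensionId,
      url,
      method: 'GET',
    });
  } catch {
    // Treat IPC errors as a denied permission; the caller can surface a toast.
    return false;
  }
}
```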
9583e2f44b Rename Http to Web and implement permission checks
- Rename ResourceType::Http to ResourceType::Web
- Rename HttpAction to WebAction
- Rename HttpConstraints to WebConstraints
- Rename Action::Http to Action::Web
- Add check_web_permission method to PermissionManager
- Optimize permission loading (only fetch web permissions)
- Add permission checks to extension_web_fetch and extension_web_open
- Update manifest.rs to use Web instead of Http
2025-11-11 14:37:47 +01:00
d886fbd8bd Add web.openAsync method to open URLs in browser
- Add extension_web_open Tauri command
- Validate URL format and allow only http/https
- Use tauri-plugin-opener to open URL in default browser
- Add handleWebOpenAsync handler in frontend
2025-11-11 14:02:41 +01:00
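The same validation can be sketched in TypeScript using the opener plugin's JS binding; the actual implementation lives in the extension_web_open Rust command, so this is only an illustrative equivalent.

```ts
import { openUrl } from '@tauri-apps/plugin-opener';

// Accept only http/https URLs before handing them to the default browser,
// mirroring the checks described in the commit.
export async function openExternalUrl(raw: string): Promise<void> {
  let url: URL;
  try {
    url = new URL(raw);
  } catch {
    throw new Error(`Invalid URL: ${raw}`);
  }
  if (url.protocol !== 'http:' && url.protocol !== 'https:') {
    throw new Error(`Only http/https URLs are allowed, got ${url.protocol}`);
  }
  await openUrl(url.toString());
}
```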
9bad4008f2 Implement web requests on Rust backend to avoid CORS
- Add web module in src-tauri/src/extension/web/mod.rs
- Implement extension_web_fetch Tauri command using reqwest
- Add WebError variant to ExtensionError enum
- Update frontend handler to call Rust backend via Tauri IPC
- Web requests now run in native context without CORS restrictions
2025-11-11 13:54:55 +01:00
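From the frontend's perspective the handler now just forwards the request over IPC. A minimal sketch of that call, with assumed request/response shapes (bodies crossing the boundary as base64):

```ts
import { invoke } from '@tauri-apps/api/core';

// Assumed shapes; the real extension_web_fetch command may use different fields.
interface WebFetchRequest {
  url: string;
  method: string;
  headers: Record<string, string>;
  bodyBase64?: string;
}

interface WebFetchResponse {
  status: number;
  headers: Record<string, string>;
  bodyBase64: string;
}

// Forward an extension's web request to the Rust side, where reqwest
// executes it outside the webview and therefore without CORS restrictions.
export async function webFetchViaBackend(request: WebFetchRequest): Promise<WebFetchResponse> {
  return await invoke<WebFetchResponse>('extension_web_fetch', { request });
}
```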
203f81e775 Add WebAPI handler for extensions
- Rename http.ts to web.ts handler
- Implement handleWebMethodAsync with haextension.web.fetch support
- Add base64 body encoding/decoding
- Add timeout support with AbortController
- Convert response headers and body to proper format
- Update message handler to route haextension.web.* methods
- Add TODO for permission checks

This enables extensions to make web requests through the host app,
bypassing iframe CORS restrictions.
2025-11-11 13:27:53 +01:00
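A compact sketch of the two techniques named above, timeout via AbortController and base64 body handling. Names and the default timeout are illustrative, not the project's handler.

```ts
// Fetch with a timeout enforced by AbortController and return the body as base64.
export async function fetchWithTimeout(
  url: string,
  init: RequestInit = {},
  timeoutMs = 30_000,
): Promise<{ status: number; headers: Record<string, string>; bodyBase64: string }> {
  const controller = new AbortController();
  const timer = setTimeout(() => controller.abort(), timeoutMs);
  try {
    const response = await fetch(url, { ...init, signal: controller.signal });
    const bytes = new Uint8Array(await response.arrayBuffer());
    // btoa expects a binary string, so convert the raw bytes first.
    let binary = '';
    for (const b of bytes) binary += String.fromCharCode(b);
    return {
      status: response.status,
      headers: Object.fromEntries(response.headers.entries()),
      bodyBase64: btoa(binary),
    };
  } finally {
    clearTimeout(timer);
  }
}
```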
554cb7762d Document automated release process in README 2025-11-10 11:58:13 +01:00
5856a73e5b Add automated release scripts for version management 2025-11-10 10:44:53 +01:00
38cc6f36d4 Bump version to 0.1.13 2025-11-10 10:22:43 +01:00
0d4059e518 Add TypeScript types for ExtensionError and improve error handling
- Add SerializedExtensionError TypeScript bindings from Rust
- Add ExtensionErrorCode enum export with ts-rs
- Create useExtensionError composable with type guards and error message extraction
- Fix developer page toast messages to show proper error messages instead of [object Object]
- Add getErrorMessage helper function for robust error handling across different error types

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-10 02:31:32 +01:00
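The SerializedExtensionError shape appears in the bindings diff further down; a sketch of a type guard and message extractor along the lines described above (the composable's real API may differ):

```ts
// Shape taken from the generated SerializedExtensionError binding.
export interface SerializedExtensionError {
  code: number;
  type: string;
  message: string;
  extension_id: string | null;
}

export function isSerializedExtensionError(value: unknown): value is SerializedExtensionError {
  return (
    typeof value === 'object' &&
    value !== null &&
    'code' in value &&
    typeof (value as Record<string, unknown>).message === 'string'
  );
}

// Always return a readable string so toasts never show "[object Object]".
export function getErrorMessage(error: unknown): string {
  if (isSerializedExtensionError(error)) return error.message;
  if (error instanceof Error) return error.message;
  if (typeof error === 'string') return error;
  return JSON.stringify(error);
}
```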
c551641737 Bump version to 0.1.13 and remove unused mobile.rs
- Update version in tauri.conf.json to 0.1.13
- Remove incomplete and unused mobile.rs file

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-10 02:14:31 +01:00
75093485bd Add showImage handler stub and mobile file provider foundation
- Add haextension.fs.showImage handler that delegates to frontend PhotoSwipe
- Add mobile.rs with open_file_with_provider command for future Android FileProvider integration
- Keep showImage as backwards-compatible no-op since image viewing is now handled client-side

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-10 02:13:44 +01:00
e1be08cb76 Add openFile support for opening files with system viewer
Added new filesystem handler for opening files with the system's default viewer:
- Implemented haextension.fs.openFile handler in filesystem.ts
- Writes files to temp directory and opens with openPath from opener plugin
- Added Tauri permissions: opener:allow-open-path with $TEMP/** scope
- Added filesystem permissions for temp directory access

This enables extensions to open files (like images) in the native system viewer where users can zoom and interact with them naturally.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-09 23:58:40 +01:00
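A sketch of the temp-file flow described above using the fs and opener plugin bindings; the handler's real signature and naming are not shown in the commit, so these are assumptions.

```ts
import { tempDir, join } from '@tauri-apps/api/path';
import { writeFile } from '@tauri-apps/plugin-fs';
import { openPath } from '@tauri-apps/plugin-opener';

// Write the extension-provided bytes into the temp directory (covered by the
// $TEMP/** scopes added in this commit) and open the file with the system viewer.
export async function openBytesInSystemViewer(fileName: string, data: Uint8Array): Promise<void> {
  const target = await join(await tempDir(), fileName);
  await writeFile(target, data);
  await openPath(target);
}
```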
7d1f346c4b Improve UiDrawer styling and viewport calculations 2025-11-08 23:14:12 +01:00
af61972342 Fix ui prop reference in UiDrawer component 2025-11-08 20:28:39 +01:00
6187e32f89 Fix lockfile mismatch for zod dependency 2025-11-08 00:21:54 +01:00
43ba246174 Refactor extension handlers and improve mobile UX
- Split extensionMessageHandler into separate handler files
  - Created handlers directory with individual files for database, filesystem, http, permissions, context, and storage
  - Reduced main handler file from 602 to 342 lines
  - Improved code organization and maintainability

- Add viewport utilities for safe area handling
  - New viewport.ts utility with helpers for fullscreen dimensions
  - Proper safe area inset calculations for mobile devices
  - Fixed window positioning on small screens to start at 0,0

- Create UiDrawer wrapper component
  - Automatically applies safe area insets
  - Uses TypeScript DrawerProps interface for code completion
  - Replaced all UDrawer instances with UiDrawer

- Improve window management
  - Windows on small screens now use full viewport with safe areas
  - Fixed maximize functionality to respect safe areas
  - Consolidated safe area logic in reusable utilities
2025-11-08 00:14:53 +01:00
2b739b9e79 Improve database query handling with automatic fallback for RETURNING clauses 2025-11-07 01:39:44 +01:00
63849d86e1 Add sync backend infrastructure and improve grid snapping
- Implement crypto utilities for vault key management (hybrid approach)
  - PBKDF2 key derivation with 600k iterations
  - AES-GCM encryption for vault keys and CRDT data
  - Optimized Base64 conversion with Buffer/btoa fallback
- Add Sync Engine Store for server communication
  - Vault key storage and retrieval
  - CRDT log push/pull operations
  - Supabase client integration
- Add Sync Orchestrator Store with realtime subscriptions
  - Event-driven sync (push after writes)
  - Supabase Realtime for instant sync
  - Sync status tracking per backend
- Add haex_sync_status table for reliable sync tracking
2025-11-05 17:08:49 +01:00
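A minimal sketch of the key-derivation and encryption steps listed above using the Web Crypto API; salt handling, key length, and function names are assumptions rather than the project's crypto utilities.

```ts
// Derive an AES-GCM vault key from a password with PBKDF2 (600k iterations).
export async function deriveVaultKey(password: string, salt: Uint8Array): Promise<CryptoKey> {
  const material = await crypto.subtle.importKey(
    'raw',
    new TextEncoder().encode(password),
    'PBKDF2',
    false,
    ['deriveKey'],
  );
  return crypto.subtle.deriveKey(
    { name: 'PBKDF2', salt, iterations: 600_000, hash: 'SHA-256' },
    material,
    { name: 'AES-GCM', length: 256 },
    false,
    ['encrypt', 'decrypt'],
  );
}

// Encrypt CRDT data or a vault key with AES-GCM using a fresh 96-bit nonce.
export async function encryptWithKey(key: CryptoKey, plaintext: Uint8Array) {
  const iv = crypto.getRandomValues(new Uint8Array(12));
  const ciphertext = await crypto.subtle.encrypt({ name: 'AES-GCM', iv }, key, plaintext);
  return { iv, ciphertext: new Uint8Array(ciphertext) };
}
```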
9adee46166 Bump version to 0.1.11 2025-11-05 01:08:33 +01:00
be7dff72dd Add sync backend infrastructure and improve grid snapping
- Add haexSyncBackends table with CRDT support for multi-backend sync
- Implement useSyncBackendsStore for managing sync server configurations
- Fix desktop icon grid snapping for all icon sizes (small to extra-large)
- Add Supabase client dependency for future sync implementation
- Generate database migration for sync_backends table
2025-11-05 01:08:09 +01:00
b465c117b0 Fix browser text selection during icon drag
- Add e.preventDefault() in handlePointerDown to prevent text selection
- Add @dragstart.prevent to prevent native browser drag
- Add select-none and @selectstart.prevent to workspace
- Add mouseleave event listener to reset drag state when leaving window
- Refactor grid positioning to use consistent iconPadding constant
2025-11-04 22:36:17 +01:00
731ae7cc47 Improve desktop grid positioning and spacing
- Increase icon spacing from 20px to 30px padding
- Add vertical grid offset (-30px) to start grid higher
- Remove screen-size dependent grid columns/rows (now fully dynamic)
- Fix dropzone visualization to use consistent snapToGrid function
- Clean up unused UI store dependencies
2025-11-04 16:39:08 +01:00
26ec4e2a89 Fix icon drag bounds on x-axis
Prevent icons from being dragged beyond viewport boundaries on the x-axis.
Icons are now clamped to valid positions during drag, not just on drop.

- Added viewport bounds checking for both x and y axes during drag
- Icons stay within [0, viewport.width - iconWidth] horizontally
- Icons stay within [0, viewport.height - iconHeight] vertically
- Eliminates snap-back behavior when dragging near edges

Bump version to 0.1.8
2025-11-04 16:11:30 +01:00
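The clamping described above amounts to keeping each axis inside [0, viewport - icon]. A small illustrative helper (dimension sources are assumed):

```ts
const clamp = (value: number, min: number, max: number) =>
  Math.min(Math.max(value, min), max);

// Keep a dragged icon fully inside the viewport on both axes during drag,
// so positions never need to snap back on drop.
export function clampIconPosition(
  x: number,
  y: number,
  icon: { width: number; height: number },
  viewport: { width: number; height: number },
) {
  return {
    x: clamp(x, 0, viewport.width - icon.width),
    y: clamp(y, 0, viewport.height - icon.height),
  };
}
```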
279468eddc Add device management and database-backed desktop settings
This update migrates desktop grid settings from localStorage to the database
and introduces a comprehensive device management system.

Features:
- New haex_devices table for device identification and naming
- Device-specific settings with foreign key relationships
- Preset-based icon sizes (Small, Medium, Large, Extra Large)
- Grid positioning improvements to prevent dragging behind PageHeader
- Dynamic icon sizing based on database settings

Database Changes:
- Created haex_devices table with deviceId (UUID) and name columns
- Modified haex_settings to include device_id FK and updated unique constraint
- Migration 0002_loose_quasimodo.sql for schema changes

Technical Improvements:
- Replaced arbitrary icon size slider (60-200px) with preset system
- Icons use actual measured dimensions for proper grid centering
- Settings sync on vault mount for cross-device consistency
- Proper bounds checking during icon drag operations

Bump version to 0.1.7
2025-11-04 16:04:38 +01:00
cffb129e4f Auto-open dev extensions after loading
- Dev extensions are now automatically opened in a window after successful load
- Simplified extension finding logic by using devExtensions directly
- Fixed table name handling to support both double quotes and backticks in permission manager
2025-11-04 00:46:46 +01:00
405cf25aab Bump version to 0.1.6 2025-11-03 11:10:11 +01:00
b097bf211d Make windows fullscreen on small screens
- Update window components to use fullscreen layout on small screens
- Adjust UI components styling for better mobile display
- Update desktop store for small screen handling
2025-11-03 11:08:26 +01:00
c71b8468df Fix workspace background feature for Android
- Add missing filesystem permissions in capabilities
  - fs:allow-applocaldata-read-recursive
  - fs:allow-applocaldata-write-recursive
  - fs:allow-write-file
  - fs:allow-mkdir
  - fs:allow-exists
  - fs:allow-remove

- Fix Android photo picker URI handling
  - Detect file type from binary signature (PNG, JPEG, WebP)
  - Use manual path construction to avoid path joining issues
  - Works with Android photo picker content:// URIs

- Improve error handling with detailed toast messages
  - Show specific error at each step (read, mkdir, write, db)
  - Better debugging on Android where console is unavailable

- Fix window activation behavior
  - Restore minimized windows when activated

- Remove unused imports in launcher component
2025-11-03 02:03:34 +01:00
3a4f482021 Add database migrations for workspace background feature
- Add migration 0001 for background column in haex_workspaces table
- Update vault.db with new schema
- Sync Android assets database
2025-11-03 01:32:00 +01:00
88507410ed Refactor code formatting and imports
- Reformat Rust code in extension database module
  - Improve line breaks and indentation
  - Remove commented-out test code
  - Clean up debug print statements formatting

- Update import path in CRDT schema (use @ alias)

- Fix UButton closing tag formatting in default layout
2025-11-03 01:30:46 +01:00
f38cecc84b Add workspace background customization and fix launcher drawer drag
- Add workspace background image support with file-based storage
  - Store background images in $APPLOCALDATA/files directory
  - Save file paths in database (text column in haex_workspaces)
  - Use convertFileSrc for secure asset:// URL conversion
  - Add context menu to workspaces with "Hintergrund ändern" (change background) option

- Implement background management in settings
  - File selection dialog for PNG, JPG, JPEG, WebP images
  - Copy selected images to app data directory
  - Remove background with file cleanup
  - Multilingual UI (German/English)

- Fix launcher drawer drag interference
  - Add :handle-only="true" to UDrawer to restrict drag to handle
  - Simplify drag handlers (removed complex state tracking)
  - Items can now be dragged to desktop without drawer interference

- Extend Tauri asset protocol scope to include $APPLOCALDATA/**
  for background image loading
2025-11-03 01:29:08 +01:00
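A sketch of the file-based background flow described above, using the fs plugin and convertFileSrc; the directory layout follows the commit, while function and file names are illustrative.

```ts
import { appLocalDataDir, join } from '@tauri-apps/api/path';
import { copyFile, exists, mkdir } from '@tauri-apps/plugin-fs';
import { convertFileSrc } from '@tauri-apps/api/core';

// Copy a selected image into $APPLOCALDATA/files and return an asset:// URL
// that the workspace background can load via the extended asset protocol scope.
export async function storeBackgroundImage(sourcePath: string, fileName: string): Promise<string> {
  const filesDir = await join(await appLocalDataDir(), 'files');
  if (!(await exists(filesDir))) {
    await mkdir(filesDir, { recursive: true });
  }
  const target = await join(filesDir, fileName);
  await copyFile(sourcePath, target);
  return convertFileSrc(target);
}
```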
931d51a1e1 Remove unused function parameters
Removed unused parameters:
- allowed_origin from parse_extension_info_from_path in protocol.rs
- app_handle from resolve_path_pattern in filesystem/core.rs
2025-11-02 15:07:44 +01:00
c97afdee18 Restore trash import for move_vault_to_trash functionality
The trash crate is needed for the move_vault_to_trash function which
moves vault files to the system trash instead of permanently deleting
them. Clippy incorrectly marked it as unused because it's only used
within a cfg(not(target_os = "android")) block.
2025-11-02 15:06:02 +01:00
65d2770df3 Fix Android build by unconditionally importing ts_rs::TS
When cargo clippy removed the unused trash import, the cfg attribute
accidentally applied to the ts_rs::TS import below it, making it
conditional for Android. This caused the Android build to fail with
"cannot find derive macro TS in this scope".

Moved the TS import out of the cfg block to make it available for all
platforms including Android.
2025-11-02 15:02:45 +01:00
a52e1b43fa Remove unused code and modernize Rust format strings
Applied cargo clippy fixes to clean up codebase:
- Removed unused imports (serde_json::json, std::collections::HashSet)
- Removed unused function encode_hex_for_log
- Modernized format strings to use inline variables
- Fixed clippy warnings for better code quality

All changes applied automatically by cargo clippy --fix
2025-11-02 14:48:01 +01:00
6ceb22f014 Bundle Iconify icons locally and enhance CSP for Tauri protocols
- Add lucide and hugeicons to serverBundle collections for local bundling
- Add https://tauri.localhost and asset: protocol to CSP directives
- Prevents CSP errors and eliminates dependency on Iconify API
2025-11-02 14:28:06 +01:00
4833dee89a Fix bundle targets to build for all platforms 2025-11-02 13:52:29 +01:00
a80c783576 Restore CSP settings in tauri.conf.json 2025-11-02 13:41:18 +01:00
4e1e4ae601 Bump version to 0.1.4 2025-11-02 00:58:02 +01:00
6a7f58a513 Fix production build crash by resolving circular import dependency
Moved database schemas from src-tauri/database/schemas/ to src/database/schemas/
to fix bundling issues and resolved circular import dependency that caused
"Cannot access uninitialized variable" error in production builds.

Key changes:
- Moved crdtColumnNames definition into haex.ts to break circular dependency
- Restored .$defaultFn(() => crypto.randomUUID()) calls
- Kept AnySQLiteColumn type annotations
- Removed obsolete TDZ fix script (no longer needed)
- Updated all import paths across stores and configuration files
2025-11-02 00:57:03 +01:00
3ed8d6bc05 Fix frontendDist path for nuxt generate output 2025-11-01 21:54:24 +01:00
81a72da26c Add post-build fix to generate script 2025-11-01 21:34:44 +01:00
4fa3515e32 Bump version to 0.1.3
🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-01 20:21:37 +01:00
c5c30fd4c4 Fix Vite 7.x TDZ error in __vite__mapDeps with post-build script
- Add post-build script to fix Temporal Dead Zone error in generated code
- Remove debug logging from stores and composables
- Simplify init-logger plugin to essential error handling
- Fix circular store dependency in useUiStore

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-01 20:21:12 +01:00
8c7a02a019 Sync version numbers across all package files
Update Cargo.toml and tauri.conf.json to version 0.1.2 to match package.json
2025-11-01 19:33:42 +01:00
465fe19542 Clean up unused code and dependencies
- Remove commented-out code in Rust and TypeScript files
- Remove unused npm dependencies (@tauri-apps/plugin-http, @tauri-apps/plugin-sql, fuse.js)
- Remove commented imports in nuxt.config.ts
- Remove commented dependencies in Cargo.toml
2025-11-01 19:32:34 +01:00
d2d0f8996b Fix runtime CSP error by allowing inline scripts
Added 'unsafe-inline' to script-src CSP directive to fix JavaScript
initialization errors in production builds. Nuxt's generated modules
require inline script execution.

- Fixes: "Cannot access uninitialized variable" error
- Fixes: CSP script execution blocking
- Version bump to 0.1.2
2025-11-01 19:00:36 +01:00
f727d00639 Bump version to 0.1.1 2025-11-01 17:21:10 +01:00
a946b14f69 Fix Android assets upload to correct release
Use gh CLI to upload Android APK and AAB to the tagged release.
2025-11-01 17:20:13 +01:00
471baec284 Simplify Android build: use default command for APK and AAB
tauri android build creates both APK and AAB by default.
2025-11-01 16:44:32 +01:00
8298d807f3 Fix Android build commands: use --apk and --aab flags
Changed from incorrect --bundle aab to correct --aab flag.
2025-11-01 16:34:15 +01:00
42e6459fbf Prevent duplicate builds on tag pushes
Build workflow now ignores all tags to avoid running alongside release workflow.
2025-11-01 16:06:35 +01:00
6ae87fc694 Fix Android OpenSSL build by adding NDK toolchain to PATH
Set proper CC, AR, and RANLIB environment variables for all Android targets
to enable OpenSSL cross-compilation with SQLCipher encryption.
2025-11-01 16:03:46 +01:00
f7867a5bde Restore SQLCipher encryption for Android and fix CI build
- Re-enable bundled-sqlcipher-vendored-openssl for Android
- Add NDK environment variables for OpenSSL compilation
- Install perl and make for OpenSSL build in CI
- Ensures encryption works on all platforms including Android
2025-11-01 15:39:44 +01:00
d82599f588 Fix Android build by using platform-specific rusqlite features
- Use bundled-sqlcipher-vendored-openssl for non-Android platforms
- Use bundled (standard SQLite) for Android to avoid OpenSSL compilation issues
- Resolves OpenSSL build errors on Android targets
2025-11-01 15:36:20 +01:00
72bb211a76 Fix secrets access in workflow conditional
- Move secrets to env block instead of if condition
- Use bash conditional to check if keystore is available
- Provide clear logging for signed vs unsigned builds
2025-11-01 15:28:06 +01:00
f14ce0d6ad Add Android signing configuration to Gradle
- Configure signingConfigs to read from environment variables
- Apply signing to release builds when keystore is available
- Support both signed and unsigned builds
2025-11-01 15:26:21 +01:00
af09f4524d Remove iOS builds from CI/CD workflows 2025-11-01 15:21:58 +01:00
102832675d Fix Android build commands syntax
- Change from --apk to default build (produces APK)
- Change from --aab to --bundle aab for AAB generation
2025-11-01 15:20:49 +01:00
3490de2f51 Configure Android signing and disable iOS builds
- Add optional Android signing for build workflow (unsigned for testing)
- Require Android signing for release workflow
- Disable iOS builds (commented out) until Apple Developer Account is available
2025-11-01 15:06:56 +01:00
7c3af10938 Add Android and iOS builds to CI/CD pipelines 2025-11-01 15:00:33 +01:00
5c5d0785b9 Fix pnpm version conflict in CI workflows 2025-11-01 14:48:58 +01:00
121dd9dd00 Add GitHub Actions CI/CD pipelines
- Add build pipeline for Windows, macOS, and Linux
- Add release pipeline for automated releases
- Remove CLAUDE.md from git tracking
2025-11-01 14:46:01 +01:00
4ff6aee4d8 Fix Vue i18n warnings and component root node issues
- Set useScope: 'global' in UI store to prevent i18n scope conflicts
- Add wrapper div to vault page to ensure single root node for transitions
- Fixes 'Duplicate useI18n calling by local scope' warning
- Fixes 'Component inside <Transition> renders non-element root node' warning
2025-10-31 23:24:20 +01:00
dceb49ae90 Add context menu for vault actions and trash functionality
- Add UiButtonContext component for context menu support on buttons
- Implement vault trash functionality using trash crate
- Move vaults to system trash on desktop (with fallback to permanent delete on mobile)
- Add context menu to vault list items for better mobile UX
- Keep hover delete button for desktop users
2025-10-31 22:57:56 +01:00
5ea04a80e0 Fix Android safe-area handling and window maximization
- Fix extension signature verification on Android by canonicalizing paths (symlink compatibility)
- Implement proper safe-area-inset handling for mobile devices
- Add reactive header height measurement to UI store
- Fix maximized window positioning to respect safe-areas and header
- Create reusable HaexDebugOverlay component for mobile debugging
- Fix Swiper navigation by using absolute positioning instead of flex-1
- Remove debug logging after Android compatibility confirmed

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-10-31 02:18:59 +01:00
65cf2e2c3c Adjust gitignore 2025-10-30 22:01:31 +01:00
68d542b4d7 Update extension system and database migrations
Changes:
- Added CLAUDE.md with project instructions
- Updated extension manifest bindings (TypeScript)
- Regenerated database migrations (consolidated into single migration)
- Updated haex schema with table name handling
- Enhanced extension manager and manifest handling in Rust
- Updated extension store in frontend
- Updated vault.db

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-10-30 21:59:13 +01:00
f97cd4ad97 Adjust Drizzle backend
- Return array of arrays
- Handle table names with quotes
2025-10-30 04:57:01 +01:00
140 changed files with 12974 additions and 4570 deletions

.github/workflows/build.yml (new file, 228 lines)

@ -0,0 +1,228 @@
name: Build
on:
push:
branches:
- main
- develop
tags-ignore:
- '**'
pull_request:
branches:
- main
- develop
workflow_dispatch:
jobs:
build-desktop:
strategy:
fail-fast: false
matrix:
include:
- platform: 'macos-latest'
args: '--target aarch64-apple-darwin'
- platform: 'macos-latest'
args: '--target x86_64-apple-darwin'
- platform: 'ubuntu-22.04'
args: ''
- platform: 'windows-latest'
args: ''
runs-on: ${{ matrix.platform }}
steps:
- name: Checkout repository
uses: actions/checkout@v4
- name: Setup Node.js
uses: actions/setup-node@v4
with:
node-version: '20'
- name: Install pnpm
uses: pnpm/action-setup@v4
- name: Setup Rust
uses: dtolnay/rust-toolchain@stable
with:
targets: ${{ matrix.platform == 'macos-latest' && 'aarch64-apple-darwin,x86_64-apple-darwin' || '' }}
- name: Install dependencies (Ubuntu)
if: matrix.platform == 'ubuntu-22.04'
run: |
sudo apt-get update
sudo apt-get install -y libwebkit2gtk-4.1-dev libappindicator3-dev librsvg2-dev patchelf libssl-dev
- name: Get pnpm store directory
shell: bash
run: |
echo "STORE_PATH=$(pnpm store path --silent)" >> $GITHUB_ENV
- name: Setup pnpm cache
uses: actions/cache@v4
with:
path: ${{ env.STORE_PATH }}
key: ${{ runner.os }}-pnpm-store-${{ hashFiles('**/pnpm-lock.yaml') }}
restore-keys: |
${{ runner.os }}-pnpm-store-
- name: Setup Rust cache
uses: Swatinem/rust-cache@v2
with:
workspaces: src-tauri
- name: Install frontend dependencies
run: pnpm install --frozen-lockfile
- name: Build Tauri app
uses: tauri-apps/tauri-action@v0
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
args: ${{ matrix.args }}
- name: Upload artifacts (macOS)
if: matrix.platform == 'macos-latest'
uses: actions/upload-artifact@v4
with:
name: macos-${{ contains(matrix.args, 'aarch64') && 'aarch64' || 'x86_64' }}
path: |
src-tauri/target/*/release/bundle/dmg/*.dmg
src-tauri/target/*/release/bundle/macos/*.app
- name: Upload artifacts (Ubuntu)
if: matrix.platform == 'ubuntu-22.04'
uses: actions/upload-artifact@v4
with:
name: linux
path: |
src-tauri/target/release/bundle/deb/*.deb
src-tauri/target/release/bundle/appimage/*.AppImage
src-tauri/target/release/bundle/rpm/*.rpm
- name: Upload artifacts (Windows)
if: matrix.platform == 'windows-latest'
uses: actions/upload-artifact@v4
with:
name: windows
path: |
src-tauri/target/release/bundle/msi/*.msi
src-tauri/target/release/bundle/nsis/*.exe
build-android:
runs-on: ubuntu-22.04
steps:
- name: Checkout repository
uses: actions/checkout@v4
- name: Setup Node.js
uses: actions/setup-node@v4
with:
node-version: '20'
- name: Install pnpm
uses: pnpm/action-setup@v4
- name: Setup Java
uses: actions/setup-java@v4
with:
distribution: 'temurin'
java-version: '17'
- name: Setup Android SDK
uses: android-actions/setup-android@v3
- name: Setup Rust
uses: dtolnay/rust-toolchain@stable
- name: Install Rust Android targets
run: |
rustup target add aarch64-linux-android
rustup target add armv7-linux-androideabi
rustup target add i686-linux-android
rustup target add x86_64-linux-android
- name: Setup NDK
uses: nttld/setup-ndk@v1
with:
ndk-version: r26d
id: setup-ndk
- name: Setup Android NDK environment for OpenSSL
run: |
echo "ANDROID_NDK_HOME=${{ steps.setup-ndk.outputs.ndk-path }}" >> $GITHUB_ENV
echo "NDK_HOME=${{ steps.setup-ndk.outputs.ndk-path }}" >> $GITHUB_ENV
# Add all Android toolchains to PATH for OpenSSL cross-compilation
echo "${{ steps.setup-ndk.outputs.ndk-path }}/toolchains/llvm/prebuilt/linux-x86_64/bin" >> $GITHUB_PATH
# Set CC, AR, RANLIB for each target
echo "CC_aarch64_linux_android=aarch64-linux-android24-clang" >> $GITHUB_ENV
echo "AR_aarch64_linux_android=llvm-ar" >> $GITHUB_ENV
echo "RANLIB_aarch64_linux_android=llvm-ranlib" >> $GITHUB_ENV
echo "CC_armv7_linux_androideabi=armv7a-linux-androideabi24-clang" >> $GITHUB_ENV
echo "AR_armv7_linux_androideabi=llvm-ar" >> $GITHUB_ENV
echo "RANLIB_armv7_linux_androideabi=llvm-ranlib" >> $GITHUB_ENV
echo "CC_i686_linux_android=i686-linux-android24-clang" >> $GITHUB_ENV
echo "AR_i686_linux_android=llvm-ar" >> $GITHUB_ENV
echo "RANLIB_i686_linux_android=llvm-ranlib" >> $GITHUB_ENV
echo "CC_x86_64_linux_android=x86_64-linux-android24-clang" >> $GITHUB_ENV
echo "AR_x86_64_linux_android=llvm-ar" >> $GITHUB_ENV
echo "RANLIB_x86_64_linux_android=llvm-ranlib" >> $GITHUB_ENV
- name: Install build dependencies for OpenSSL
run: |
sudo apt-get update
sudo apt-get install -y perl make
- name: Get pnpm store directory
shell: bash
run: |
echo "STORE_PATH=$(pnpm store path --silent)" >> $GITHUB_ENV
- name: Setup pnpm cache
uses: actions/cache@v4
with:
path: ${{ env.STORE_PATH }}
key: ${{ runner.os }}-pnpm-store-${{ hashFiles('**/pnpm-lock.yaml') }}
restore-keys: |
${{ runner.os }}-pnpm-store-
- name: Setup Rust cache
uses: Swatinem/rust-cache@v2
with:
workspaces: src-tauri
- name: Install frontend dependencies
run: pnpm install --frozen-lockfile
- name: Setup Keystore (if secrets available)
env:
ANDROID_KEYSTORE: ${{ secrets.ANDROID_KEYSTORE }}
ANDROID_KEYSTORE_PASSWORD: ${{ secrets.ANDROID_KEYSTORE_PASSWORD }}
ANDROID_KEY_ALIAS: ${{ secrets.ANDROID_KEY_ALIAS }}
ANDROID_KEY_PASSWORD: ${{ secrets.ANDROID_KEY_PASSWORD }}
run: |
if [ -n "$ANDROID_KEYSTORE" ]; then
echo "$ANDROID_KEYSTORE" | base64 -d > $HOME/keystore.jks
echo "ANDROID_KEYSTORE_PATH=$HOME/keystore.jks" >> $GITHUB_ENV
echo "ANDROID_KEYSTORE_PASSWORD=$ANDROID_KEYSTORE_PASSWORD" >> $GITHUB_ENV
echo "ANDROID_KEY_ALIAS=$ANDROID_KEY_ALIAS" >> $GITHUB_ENV
echo "ANDROID_KEY_PASSWORD=$ANDROID_KEY_PASSWORD" >> $GITHUB_ENV
echo "Keystore configured for signing"
else
echo "No keystore configured, building unsigned APK"
fi
- name: Build Android APK and AAB (unsigned if no keystore)
run: pnpm tauri android build
- name: Upload Android artifacts
uses: actions/upload-artifact@v4
with:
name: android
path: |
src-tauri/gen/android/app/build/outputs/apk/**/*.apk
src-tauri/gen/android/app/build/outputs/bundle/**/*.aab

.github/workflows/release.yml (new file, 251 lines)

@ -0,0 +1,251 @@
name: Release
on:
push:
tags:
- 'v*'
workflow_dispatch:
jobs:
create-release:
permissions:
contents: write
runs-on: ubuntu-22.04
outputs:
release_id: ${{ steps.create-release.outputs.release_id }}
upload_url: ${{ steps.create-release.outputs.upload_url }}
steps:
- name: Checkout repository
uses: actions/checkout@v4
- name: Setup Node.js
uses: actions/setup-node@v4
with:
node-version: '20'
- name: Get version
run: echo "PACKAGE_VERSION=$(node -p "require('./package.json').version")" >> $GITHUB_ENV
- name: Create release
id: create-release
uses: actions/github-script@v7
with:
script: |
const { data } = await github.rest.repos.createRelease({
owner: context.repo.owner,
repo: context.repo.repo,
tag_name: `v${process.env.PACKAGE_VERSION}`,
name: `haex-hub v${process.env.PACKAGE_VERSION}`,
body: 'Take a look at the assets to download and install this app.',
draft: true,
prerelease: false
})
core.setOutput('release_id', data.id)
core.setOutput('upload_url', data.upload_url)
return data.id
build-desktop:
needs: create-release
permissions:
contents: write
strategy:
fail-fast: false
matrix:
include:
- platform: 'macos-latest'
args: '--target aarch64-apple-darwin'
- platform: 'macos-latest'
args: '--target x86_64-apple-darwin'
- platform: 'ubuntu-22.04'
args: ''
- platform: 'windows-latest'
args: ''
runs-on: ${{ matrix.platform }}
steps:
- name: Checkout repository
uses: actions/checkout@v4
- name: Setup Node.js
uses: actions/setup-node@v4
with:
node-version: '20'
- name: Install pnpm
uses: pnpm/action-setup@v4
- name: Setup Rust
uses: dtolnay/rust-toolchain@stable
with:
targets: ${{ matrix.platform == 'macos-latest' && 'aarch64-apple-darwin,x86_64-apple-darwin' || '' }}
- name: Install dependencies (Ubuntu)
if: matrix.platform == 'ubuntu-22.04'
run: |
sudo apt-get update
sudo apt-get install -y libwebkit2gtk-4.1-dev libappindicator3-dev librsvg2-dev patchelf libssl-dev
- name: Get pnpm store directory
shell: bash
run: |
echo "STORE_PATH=$(pnpm store path --silent)" >> $GITHUB_ENV
- name: Setup pnpm cache
uses: actions/cache@v4
with:
path: ${{ env.STORE_PATH }}
key: ${{ runner.os }}-pnpm-store-${{ hashFiles('**/pnpm-lock.yaml') }}
restore-keys: |
${{ runner.os }}-pnpm-store-
- name: Setup Rust cache
uses: Swatinem/rust-cache@v2
with:
workspaces: src-tauri
- name: Install frontend dependencies
run: pnpm install --frozen-lockfile
- name: Build and release Tauri app
uses: tauri-apps/tauri-action@v0
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
releaseId: ${{ needs.create-release.outputs.release_id }}
args: ${{ matrix.args }}
build-android:
needs: create-release
permissions:
contents: write
runs-on: ubuntu-22.04
steps:
- name: Checkout repository
uses: actions/checkout@v4
- name: Setup Node.js
uses: actions/setup-node@v4
with:
node-version: '20'
- name: Install pnpm
uses: pnpm/action-setup@v4
- name: Setup Java
uses: actions/setup-java@v4
with:
distribution: 'temurin'
java-version: '17'
- name: Setup Android SDK
uses: android-actions/setup-android@v3
- name: Setup Rust
uses: dtolnay/rust-toolchain@stable
- name: Install Rust Android targets
run: |
rustup target add aarch64-linux-android
rustup target add armv7-linux-androideabi
rustup target add i686-linux-android
rustup target add x86_64-linux-android
- name: Setup NDK
uses: nttld/setup-ndk@v1
with:
ndk-version: r26d
id: setup-ndk
- name: Setup Android NDK environment for OpenSSL
run: |
echo "ANDROID_NDK_HOME=${{ steps.setup-ndk.outputs.ndk-path }}" >> $GITHUB_ENV
echo "NDK_HOME=${{ steps.setup-ndk.outputs.ndk-path }}" >> $GITHUB_ENV
# Add all Android toolchains to PATH for OpenSSL cross-compilation
echo "${{ steps.setup-ndk.outputs.ndk-path }}/toolchains/llvm/prebuilt/linux-x86_64/bin" >> $GITHUB_PATH
# Set CC, AR, RANLIB for each target
echo "CC_aarch64_linux_android=aarch64-linux-android24-clang" >> $GITHUB_ENV
echo "AR_aarch64_linux_android=llvm-ar" >> $GITHUB_ENV
echo "RANLIB_aarch64_linux_android=llvm-ranlib" >> $GITHUB_ENV
echo "CC_armv7_linux_androideabi=armv7a-linux-androideabi24-clang" >> $GITHUB_ENV
echo "AR_armv7_linux_androideabi=llvm-ar" >> $GITHUB_ENV
echo "RANLIB_armv7_linux_androideabi=llvm-ranlib" >> $GITHUB_ENV
echo "CC_i686_linux_android=i686-linux-android24-clang" >> $GITHUB_ENV
echo "AR_i686_linux_android=llvm-ar" >> $GITHUB_ENV
echo "RANLIB_i686_linux_android=llvm-ranlib" >> $GITHUB_ENV
echo "CC_x86_64_linux_android=x86_64-linux-android24-clang" >> $GITHUB_ENV
echo "AR_x86_64_linux_android=llvm-ar" >> $GITHUB_ENV
echo "RANLIB_x86_64_linux_android=llvm-ranlib" >> $GITHUB_ENV
- name: Install build dependencies for OpenSSL
run: |
sudo apt-get update
sudo apt-get install -y perl make
- name: Get pnpm store directory
shell: bash
run: |
echo "STORE_PATH=$(pnpm store path --silent)" >> $GITHUB_ENV
- name: Setup pnpm cache
uses: actions/cache@v4
with:
path: ${{ env.STORE_PATH }}
key: ${{ runner.os }}-pnpm-store-${{ hashFiles('**/pnpm-lock.yaml') }}
restore-keys: |
${{ runner.os }}-pnpm-store-
- name: Setup Rust cache
uses: Swatinem/rust-cache@v2
with:
workspaces: src-tauri
- name: Install frontend dependencies
run: pnpm install --frozen-lockfile
- name: Setup Keystore (required for release)
run: |
echo "${{ secrets.ANDROID_KEYSTORE }}" | base64 -d > $HOME/keystore.jks
echo "ANDROID_KEYSTORE_PATH=$HOME/keystore.jks" >> $GITHUB_ENV
echo "ANDROID_KEYSTORE_PASSWORD=${{ secrets.ANDROID_KEYSTORE_PASSWORD }}" >> $GITHUB_ENV
echo "ANDROID_KEY_ALIAS=${{ secrets.ANDROID_KEY_ALIAS }}" >> $GITHUB_ENV
echo "ANDROID_KEY_PASSWORD=${{ secrets.ANDROID_KEY_PASSWORD }}" >> $GITHUB_ENV
- name: Build Android APK and AAB (signed)
run: pnpm tauri android build
- name: Upload Android artifacts to Release
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
run: |
gh release upload ${{ github.ref_name }} \
src-tauri/gen/android/app/build/outputs/apk/universal/release/app-universal-release.apk \
src-tauri/gen/android/app/build/outputs/bundle/universalRelease/app-universal-release.aab \
--clobber
publish-release:
permissions:
contents: write
runs-on: ubuntu-22.04
needs: [create-release, build-desktop, build-android]
steps:
- name: Publish release
id: publish-release
uses: actions/github-script@v7
env:
release_id: ${{ needs.create-release.outputs.release_id }}
with:
script: |
github.rest.repos.updateRelease({
owner: context.repo.owner,
repo: context.repo.repo,
release_id: process.env.release_id,
draft: false,
prerelease: false
})

.gitignore (4 changed lines)

@ -26,4 +26,6 @@ dist-ssr
src-tauri/target
nogit*
.claude
.output
.output
target
CLAUDE.md


@ -168,6 +168,32 @@ pnpm install
pnpm tauri dev
```
#### 📦 Release Process
Create a new release using the automated scripts:
```bash
# Patch release (0.1.13 → 0.1.14)
pnpm release:patch
# Minor release (0.1.13 → 0.2.0)
pnpm release:minor
# Major release (0.1.13 → 1.0.0)
pnpm release:major
```
The script automatically:
1. Updates version in `package.json`
2. Creates a git commit
3. Creates a git tag
4. Pushes to remote
GitHub Actions will then automatically:
- Build desktop apps (macOS, Linux, Windows)
- Build Android apps (APK and AAB)
- Create and publish a GitHub release
#### 🧭 Summary
HaexHub aims to:


@ -1,7 +1,7 @@
import { defineConfig } from 'drizzle-kit'
export default defineConfig({
schema: './src-tauri/database/schemas/**.ts',
schema: './src/database/schemas/**.ts',
out: './src-tauri/database/migrations',
dialect: 'sqlite',
dbCredentials: {


@ -1,5 +1,3 @@
//import tailwindcss from '@tailwindcss/vite'
import { fileURLToPath } from 'node:url'
// https://nuxt.com/docs/api/configuration/nuxt-config
@ -31,7 +29,6 @@ export default defineNuxtConfig({
'@vueuse/nuxt',
'@nuxt/icon',
'@nuxt/eslint',
//"@nuxt/image",
'@nuxt/fonts',
'@nuxt/ui',
],
@ -71,7 +68,7 @@ export default defineNuxtConfig({
includeCustomCollections: true,
},
serverBundle: {
collections: ['mdi', 'line-md', 'solar', 'gg', 'emojione'],
collections: ['mdi', 'line-md', 'solar', 'gg', 'emojione', 'lucide', 'hugeicons'],
},
customCollections: [
@ -125,7 +122,6 @@ export default defineNuxtConfig({
},
vite: {
//plugins: [tailwindcss()],
// Better support for Tauri CLI output
clearScreen: false,
// Enable environment variables


@ -1,7 +1,7 @@
{
"name": "haex-hub",
"private": true,
"version": "0.1.0",
"version": "0.1.13",
"type": "module",
"scripts": {
"build": "nuxt build",
@ -14,58 +14,60 @@
"generate": "nuxt generate",
"postinstall": "nuxt prepare",
"preview": "nuxt preview",
"release:patch": "node scripts/release.js patch",
"release:minor": "node scripts/release.js minor",
"release:major": "node scripts/release.js major",
"tauri:build:debug": "tauri build --debug",
"tauri": "tauri"
},
"dependencies": {
"@haexhub/sdk": "^1.9.10",
"@nuxt/eslint": "1.9.0",
"@nuxt/fonts": "0.11.4",
"@nuxt/icon": "2.0.0",
"@nuxt/ui": "4.1.0",
"@nuxtjs/i18n": "10.0.6",
"@pinia/nuxt": "^0.11.2",
"@tailwindcss/vite": "^4.1.16",
"@pinia/nuxt": "^0.11.3",
"@supabase/supabase-js": "^2.80.0",
"@tailwindcss/vite": "^4.1.17",
"@tauri-apps/api": "^2.9.0",
"@tauri-apps/plugin-dialog": "^2.4.2",
"@tauri-apps/plugin-fs": "^2.4.4",
"@tauri-apps/plugin-http": "2.5.2",
"@tauri-apps/plugin-notification": "2.3.1",
"@tauri-apps/plugin-opener": "^2.5.2",
"@tauri-apps/plugin-os": "^2.3.2",
"@tauri-apps/plugin-sql": "2.3.0",
"@tauri-apps/plugin-store": "^2.4.1",
"@vueuse/components": "^13.9.0",
"@vueuse/core": "^13.9.0",
"@vueuse/gesture": "^2.0.0",
"@vueuse/nuxt": "^13.9.0",
"drizzle-orm": "^0.44.7",
"eslint": "^9.38.0",
"fuse.js": "^7.1.0",
"eslint": "^9.39.1",
"nuxt-zod-i18n": "^1.12.1",
"swiper": "^12.0.3",
"tailwindcss": "^4.1.16",
"vue": "^3.5.22",
"tailwindcss": "^4.1.17",
"vue": "^3.5.24",
"vue-router": "^4.6.3",
"zod": "^3.25.76"
},
"devDependencies": {
"@iconify-json/hugeicons": "^1.2.17",
"@iconify-json/lucide": "^1.2.71",
"@iconify/json": "^2.2.401",
"@iconify/tailwind4": "^1.0.6",
"@iconify-json/lucide": "^1.2.72",
"@iconify/json": "^2.2.404",
"@iconify/tailwind4": "^1.1.0",
"@libsql/client": "^0.15.15",
"@tauri-apps/cli": "^2.9.1",
"@types/node": "^24.9.1",
"@tauri-apps/cli": "^2.9.3",
"@types/node": "^24.10.0",
"@vitejs/plugin-vue": "6.0.1",
"@vue/compiler-sfc": "^3.5.22",
"drizzle-kit": "^0.31.5",
"globals": "^16.4.0",
"nuxt": "^4.2.0",
"@vue/compiler-sfc": "^3.5.24",
"drizzle-kit": "^0.31.6",
"globals": "^16.5.0",
"nuxt": "^4.2.1",
"prettier": "3.6.2",
"tsx": "^4.20.6",
"tw-animate-css": "^1.4.0",
"typescript": "^5.9.3",
"vite": "7.1.3",
"vite": "^7.2.2",
"vue-tsc": "3.0.6"
},
"prettier": {

pnpm-lock.yaml (generated, 3667 changed lines; diff suppressed because it is too large)

scripts/release.js (new executable file, 91 lines)

@ -0,0 +1,91 @@
#!/usr/bin/env node
import { readFileSync, writeFileSync } from 'fs';
import { execSync } from 'child_process';
import { fileURLToPath } from 'url';
import { dirname, join } from 'path';
const __filename = fileURLToPath(import.meta.url);
const __dirname = dirname(__filename);
const rootDir = join(__dirname, '..');
const versionType = process.argv[2];
if (!['patch', 'minor', 'major'].includes(versionType)) {
console.error('Usage: pnpm release <patch|minor|major>');
process.exit(1);
}
// Read current package.json
const packageJsonPath = join(rootDir, 'package.json');
const packageJson = JSON.parse(readFileSync(packageJsonPath, 'utf8'));
const currentVersion = packageJson.version;
if (!currentVersion) {
console.error('No version found in package.json');
process.exit(1);
}
// Parse version
const [major, minor, patch] = currentVersion.split('.').map(Number);
// Calculate new version
let newVersion;
switch (versionType) {
case 'major':
newVersion = `${major + 1}.0.0`;
break;
case 'minor':
newVersion = `${major}.${minor + 1}.0`;
break;
case 'patch':
newVersion = `${major}.${minor}.${patch + 1}`;
break;
}
console.log(`📦 Bumping version from ${currentVersion} to ${newVersion}`);
// Update package.json
packageJson.version = newVersion;
writeFileSync(packageJsonPath, JSON.stringify(packageJson, null, 2) + '\n');
console.log('✅ Updated package.json');
// Git operations
try {
// Check if there are uncommitted changes
const status = execSync('git status --porcelain', { encoding: 'utf8' });
const hasOtherChanges = status
.split('\n')
.filter(line => line && !line.includes('package.json'))
.length > 0;
if (hasOtherChanges) {
console.error('❌ There are uncommitted changes besides package.json. Please commit or stash them first.');
process.exit(1);
}
// Add and commit package.json
execSync('git add package.json', { stdio: 'inherit' });
execSync(`git commit -m "Bump version to ${newVersion}"`, { stdio: 'inherit' });
console.log('✅ Committed version bump');
// Create tag
execSync(`git tag v${newVersion}`, { stdio: 'inherit' });
console.log(`✅ Created tag v${newVersion}`);
// Push changes and tag
console.log('📤 Pushing to remote...');
execSync('git push', { stdio: 'inherit' });
execSync(`git push origin v${newVersion}`, { stdio: 'inherit' });
console.log('✅ Pushed changes and tag');
console.log('\n🎉 Release v' + newVersion + ' created successfully!');
console.log('📋 GitHub Actions will now build and publish the release.');
} catch (error) {
console.error('❌ Git operation failed:', error.message);
// Rollback package.json changes
packageJson.version = currentVersion;
writeFileSync(packageJsonPath, JSON.stringify(packageJson, null, 2) + '\n');
console.log('↩️ Rolled back package.json changes');
process.exit(1);
}

src-tauri/Cargo.lock (generated, 1238 changed lines; diff suppressed because it is too large)


@ -1,6 +1,6 @@
[package]
name = "haex-hub"
version = "0.1.0"
version = "0.1.4"
description = "A Tauri App"
authors = ["you"]
edition = "2021"
@ -20,14 +20,7 @@ tauri-build = { version = "2.2", features = [] }
serde = { version = "1.0.228", features = ["derive"] }
[dependencies]
rusqlite = { version = "0.37.0", features = [
"load_extension",
"bundled-sqlcipher-vendored-openssl",
"functions",
] }
#tauri-plugin-sql = { version = "2", features = ["sqlite"] }
tokio = { version = "1.47.1", features = ["macros", "rt-multi-thread"] }
#libsqlite3-sys = { version = "0.31", features = ["bundled-sqlcipher"] }
#sqlx = { version = "0.8", features = ["runtime-tokio-rustls", "sqlite"] }
tokio = { version = "1.47.1", features = ["macros", "rt-multi-thread"] }
base64 = "0.22"
ed25519-dalek = "2.1"
fs_extra = "1.3.0"
@ -54,3 +47,11 @@ uhlc = "0.8.2"
url = "2.5.7"
uuid = { version = "1.18.1", features = ["v4"] }
zip = "6.0.0"
rusqlite = { version = "0.37.0", features = [
"load_extension",
"bundled-sqlcipher-vendored-openssl",
"functions",
] }
[target.'cfg(not(target_os = "android"))'.dependencies]
trash = "5.2.5"


@ -1,10 +1,10 @@
// This file was generated by [ts-rs](https://github.com/Aleph-Alpha/ts-rs). Do not edit this file manually.
import type { DbAction } from "./DbAction";
import type { FsAction } from "./FsAction";
import type { HttpAction } from "./HttpAction";
import type { ShellAction } from "./ShellAction";
import type { WebAction } from "./WebAction";
/**
* Ein typsicherer Container, der die spezifische Aktion für einen Ressourcentyp enthält.
*/
export type Action = { "Database": DbAction } | { "Filesystem": FsAction } | { "Http": HttpAction } | { "Shell": ShellAction };
export type Action = { "Database": DbAction } | { "Filesystem": FsAction } | { "Web": WebAction } | { "Shell": ShellAction };


@ -0,0 +1,3 @@
// This file was generated by [ts-rs](https://github.com/Aleph-Alpha/ts-rs). Do not edit this file manually.
export type DisplayMode = "auto" | "window" | "iframe";


@ -0,0 +1,6 @@
// This file was generated by [ts-rs](https://github.com/Aleph-Alpha/ts-rs). Do not edit this file manually.
/**
* Error codes for frontend handling
*/
export type ExtensionErrorCode = "SecurityViolation" | "NotFound" | "PermissionDenied" | "MutexPoisoned" | "Database" | "Filesystem" | "FilesystemWithPath" | "Http" | "Web" | "Shell" | "Manifest" | "Validation" | "InvalidPublicKey" | "InvalidSignature" | "InvalidActionString" | "SignatureVerificationFailed" | "CalculateHash" | "Installation";


@ -1,3 +1,4 @@
// This file was generated by [ts-rs](https://github.com/Aleph-Alpha/ts-rs). Do not edit this file manually.
import type { DisplayMode } from "./DisplayMode";
export type ExtensionInfoResponse = { id: string, publicKey: string, name: string, version: string, author: string | null, enabled: boolean, description: string | null, homepage: string | null, icon: string | null, devServerUrl: string | null, };
export type ExtensionInfoResponse = { id: string, publicKey: string, name: string, version: string, author: string | null, enabled: boolean, description: string | null, homepage: string | null, icon: string | null, entry: string | null, singleInstance: boolean | null, displayMode: DisplayMode | null, devServerUrl: string | null, };


@ -1,4 +1,5 @@
// This file was generated by [ts-rs](https://github.com/Aleph-Alpha/ts-rs). Do not edit this file manually.
import type { DisplayMode } from "./DisplayMode";
import type { ExtensionPermissions } from "./ExtensionPermissions";
export type ExtensionManifest = { name: string, version: string, author: string | null, entry: string, icon: string | null, public_key: string, signature: string, permissions: ExtensionPermissions, homepage: string | null, description: string | null, };
export type ExtensionManifest = { name: string, version: string, author: string | null, entry: string | null, icon: string | null, public_key: string, signature: string, permissions: ExtensionPermissions, homepage: string | null, description: string | null, single_instance: boolean | null, display_mode: DisplayMode | null, };


@ -1,7 +1,7 @@
// This file was generated by [ts-rs](https://github.com/Aleph-Alpha/ts-rs). Do not edit this file manually.
import type { DbConstraints } from "./DbConstraints";
import type { FsConstraints } from "./FsConstraints";
import type { HttpConstraints } from "./HttpConstraints";
import type { ShellConstraints } from "./ShellConstraints";
import type { WebConstraints } from "./WebConstraints";
export type PermissionConstraints = DbConstraints | FsConstraints | HttpConstraints | ShellConstraints;
export type PermissionConstraints = DbConstraints | FsConstraints | WebConstraints | ShellConstraints;


@ -1,3 +1,3 @@
// This file was generated by [ts-rs](https://github.com/Aleph-Alpha/ts-rs). Do not edit this file manually.
export type ResourceType = "fs" | "http" | "db" | "shell";
export type ResourceType = "fs" | "web" | "db" | "shell";


@ -0,0 +1,6 @@
// This file was generated by [ts-rs](https://github.com/Aleph-Alpha/ts-rs). Do not edit this file manually.
/**
* Serialized representation of ExtensionError for TypeScript
*/
export type SerializedExtensionError = { code: number, type: string, message: string, extension_id: string | null, };


@ -0,0 +1,6 @@
// This file was generated by [ts-rs](https://github.com/Aleph-Alpha/ts-rs). Do not edit this file manually.
/**
* Definiert Aktionen (HTTP-Methoden), die auf Web-Anfragen angewendet werden können.
*/
export type WebAction = "GET" | "POST" | "PUT" | "PATCH" | "DELETE" | "*";


@ -0,0 +1,4 @@
// This file was generated by [ts-rs](https://github.com/Aleph-Alpha/ts-rs). Do not edit this file manually.
import type { RateLimit } from "./RateLimit";
export type WebConstraints = { methods: Array<string> | null, rate_limit: RateLimit | null, };


@ -1,6 +1,7 @@
mod generator;
fn main() {
generator::event_names::generate_event_names();
generator::table_names::generate_table_names();
generator::rust_types::generate_rust_types();
tauri_build::build();


@ -18,16 +18,27 @@
"fs:allow-appconfig-write-recursive",
"fs:allow-appdata-read-recursive",
"fs:allow-appdata-write-recursive",
"fs:allow-applocaldata-read-recursive",
"fs:allow-applocaldata-write-recursive",
"fs:allow-read-file",
"fs:allow-write-file",
"fs:allow-read-dir",
"fs:allow-mkdir",
"fs:allow-exists",
"fs:allow-remove",
"fs:allow-resource-read-recursive",
"fs:allow-resource-write-recursive",
"fs:allow-download-read-recursive",
"fs:allow-download-write-recursive",
"fs:allow-temp-read-recursive",
"fs:allow-temp-write-recursive",
"fs:default",
{
"identifier": "fs:scope",
"allow": [{ "path": "**" }]
"allow": [
{ "path": "**" },
{ "path": "$TEMP/**" }
]
},
"http:allow-fetch-send",
"http:allow-fetch",
@ -35,8 +46,15 @@
"notification:allow-create-channel",
"notification:allow-list-channels",
"notification:allow-notify",
"notification:allow-is-permission-granted",
"notification:default",
"opener:allow-open-url",
{
"identifier": "opener:allow-open-path",
"allow": [
{ "path": "$TEMP/**" }
]
},
"opener:default",
"os:allow-hostname",
"os:default",


@ -0,0 +1,16 @@
{
"$schema": "../gen/schemas/desktop-schema.json",
"identifier": "extensions",
"description": "Minimal capability for extension webviews - extensions have NO direct system access",
"local": true,
"webviews": ["ext_*"],
"permissions": [
"core:default",
"core:webview:default",
"notification:default",
"notification:allow-is-permission-granted"
],
"remote": {
"urls": ["http://localhost:*", "haex-extension://*"]
}
}


@ -1,8 +1,8 @@
import { writeFileSync, mkdirSync } from 'node:fs'
import { join, dirname } from 'node:path'
import { fileURLToPath } from 'node:url'
import tablesNames from './tableNames.json'
import { schema } from './index'
import tablesNames from '../../src/database/tableNames.json'
import { schema } from '../../src/database/index'
import { getTableColumns } from 'drizzle-orm'
import type { AnySQLiteColumn, SQLiteTable } from 'drizzle-orm/sqlite-core'


@ -1,21 +0,0 @@
import { drizzle } from 'drizzle-orm/sqlite-proxy' // Adapter für Query Building ohne direkte Verbindung
import * as schema from './schemas' // Importiere alles aus deiner Schema-Datei
export * as schema from './schemas'
// sqlite-proxy benötigt eine (dummy) Ausführungsfunktion als Argument.
// Diese wird in unserem Tauri-Workflow nie aufgerufen, da wir nur .toSQL() verwenden.
// Sie muss aber vorhanden sein, um drizzle() aufrufen zu können.
const dummyExecutor = async (
sql: string,
params: unknown[],
method: 'all' | 'run' | 'get' | 'values',
) => {
console.warn(
`Frontend Drizzle Executor wurde aufgerufen (Methode: ${method}). Das sollte im Tauri-Invoke-Workflow nicht passieren!`,
)
// Wir geben leere Ergebnisse zurück, um die Typen zufriedenzustellen, falls es doch aufgerufen wird.
return { rows: [] } // Für 'run' (z.B. bei INSERT/UPDATE)
}
// Erstelle die Drizzle-Instanz für den SQLite-Dialekt
// Übergib den dummyExecutor und das importierte Schema
export const db = drizzle(dummyExecutor, { schema })


@ -60,11 +60,12 @@ CREATE TABLE `haex_extensions` (
`version` text NOT NULL,
`author` text,
`description` text,
`entry` text DEFAULT 'index.html' NOT NULL,
`entry` text DEFAULT 'index.html',
`homepage` text,
`enabled` integer DEFAULT true,
`icon` text,
`signature` text NOT NULL,
`single_instance` integer DEFAULT false,
`haex_timestamp` text
);
--> statement-breakpoint
@ -94,8 +95,10 @@ CREATE TABLE `haex_settings` (
CREATE UNIQUE INDEX `haex_settings_key_type_value_unique` ON `haex_settings` (`key`,`type`,`value`);--> statement-breakpoint
CREATE TABLE `haex_workspaces` (
`id` text PRIMARY KEY NOT NULL,
`device_id` text NOT NULL,
`name` text NOT NULL,
`position` integer DEFAULT 0 NOT NULL,
`background` blob,
`haex_timestamp` text
);
--> statement-breakpoint


@ -0,0 +1,15 @@
PRAGMA foreign_keys=OFF;--> statement-breakpoint
CREATE TABLE `__new_haex_workspaces` (
`id` text PRIMARY KEY NOT NULL,
`device_id` text NOT NULL,
`name` text NOT NULL,
`position` integer DEFAULT 0 NOT NULL,
`background` text,
`haex_timestamp` text
);
--> statement-breakpoint
INSERT INTO `__new_haex_workspaces`("id", "device_id", "name", "position", "background", "haex_timestamp") SELECT "id", "device_id", "name", "position", "background", "haex_timestamp" FROM `haex_workspaces`;--> statement-breakpoint
DROP TABLE `haex_workspaces`;--> statement-breakpoint
ALTER TABLE `__new_haex_workspaces` RENAME TO `haex_workspaces`;--> statement-breakpoint
PRAGMA foreign_keys=ON;--> statement-breakpoint
CREATE UNIQUE INDEX `haex_workspaces_position_unique` ON `haex_workspaces` (`position`);


@ -1 +0,0 @@
ALTER TABLE `haex_workspaces` ADD `device_id` text NOT NULL;


@ -0,0 +1,13 @@
CREATE TABLE `haex_devices` (
`id` text PRIMARY KEY NOT NULL,
`device_id` text NOT NULL,
`name` text NOT NULL,
`created_at` text DEFAULT (CURRENT_TIMESTAMP),
`updated_at` integer,
`haex_timestamp` text
);
--> statement-breakpoint
CREATE UNIQUE INDEX `haex_devices_device_id_unique` ON `haex_devices` (`device_id`);--> statement-breakpoint
DROP INDEX `haex_settings_key_type_value_unique`;--> statement-breakpoint
ALTER TABLE `haex_settings` ADD `device_id` text REFERENCES haex_devices(id);--> statement-breakpoint
CREATE UNIQUE INDEX `haex_settings_device_id_key_type_unique` ON `haex_settings` (`device_id`,`key`,`type`);


@ -0,0 +1,10 @@
CREATE TABLE `haex_sync_backends` (
`id` text PRIMARY KEY NOT NULL,
`name` text NOT NULL,
`server_url` text NOT NULL,
`enabled` integer DEFAULT true NOT NULL,
`priority` integer DEFAULT 0 NOT NULL,
`created_at` text DEFAULT (CURRENT_TIMESTAMP),
`updated_at` integer,
`haex_timestamp` text
);


@ -0,0 +1,10 @@
CREATE TABLE `haex_sync_status` (
`id` text PRIMARY KEY NOT NULL,
`backend_id` text NOT NULL,
`last_pull_sequence` integer,
`last_push_hlc_timestamp` text,
`last_sync_at` text,
`error` text
);
--> statement-breakpoint
ALTER TABLE `haex_extensions` ADD `display_mode` text DEFAULT 'auto';


@ -1,7 +1,7 @@
{
"version": "6",
"dialect": "sqlite",
"id": "bcdd9ad3-a87a-4a43-9eba-673f94b10287",
"id": "e3d61ad1-63be-41be-9243-41144e215f98",
"prevId": "00000000-0000-0000-0000-000000000000",
"tables": {
"haex_crdt_configs": {
@ -411,7 +411,7 @@
"name": "entry",
"type": "text",
"primaryKey": false,
"notNull": true,
"notNull": false,
"autoincrement": false,
"default": "'index.html'"
},
@ -444,6 +444,14 @@
"notNull": true,
"autoincrement": false
},
"single_instance": {
"name": "single_instance",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false,
"default": false
},
"haex_timestamp": {
"name": "haex_timestamp",
"type": "text",
@ -619,6 +627,13 @@
"notNull": true,
"autoincrement": false
},
"device_id": {
"name": "device_id",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"name": {
"name": "name",
"type": "text",
@ -634,6 +649,13 @@
"autoincrement": false,
"default": 0
},
"background": {
"name": "background",
"type": "blob",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"haex_timestamp": {
"name": "haex_timestamp",
"type": "text",


@ -1,8 +1,8 @@
{
"version": "6",
"dialect": "sqlite",
"id": "27735348-b9c5-4bc6-9cc5-dd707ad689b9",
"prevId": "bcdd9ad3-a87a-4a43-9eba-673f94b10287",
"id": "10bec43a-4227-483e-b1c1-fd50ae32bb96",
"prevId": "e3d61ad1-63be-41be-9243-41144e215f98",
"tables": {
"haex_crdt_configs": {
"name": "haex_crdt_configs",
@ -411,7 +411,7 @@
"name": "entry",
"type": "text",
"primaryKey": false,
"notNull": true,
"notNull": false,
"autoincrement": false,
"default": "'index.html'"
},
@ -444,6 +444,14 @@
"notNull": true,
"autoincrement": false
},
"single_instance": {
"name": "single_instance",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false,
"default": false
},
"haex_timestamp": {
"name": "haex_timestamp",
"type": "text",
@ -641,6 +649,13 @@
"autoincrement": false,
"default": 0
},
"background": {
"name": "background",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"haex_timestamp": {
"name": "haex_timestamp",
"type": "text",


@ -0,0 +1,774 @@
{
"version": "6",
"dialect": "sqlite",
"id": "3aedf10c-2266-40f4-8549-0ff8b0588853",
"prevId": "10bec43a-4227-483e-b1c1-fd50ae32bb96",
"tables": {
"haex_crdt_configs": {
"name": "haex_crdt_configs",
"columns": {
"key": {
"name": "key",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"value": {
"name": "value",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {},
"foreignKeys": {},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_crdt_logs": {
"name": "haex_crdt_logs",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"haex_timestamp": {
"name": "haex_timestamp",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"table_name": {
"name": "table_name",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"row_pks": {
"name": "row_pks",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"op_type": {
"name": "op_type",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"column_name": {
"name": "column_name",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"new_value": {
"name": "new_value",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"old_value": {
"name": "old_value",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {
"idx_haex_timestamp": {
"name": "idx_haex_timestamp",
"columns": [
"haex_timestamp"
],
"isUnique": false
},
"idx_table_row": {
"name": "idx_table_row",
"columns": [
"table_name",
"row_pks"
],
"isUnique": false
}
},
"foreignKeys": {},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_crdt_snapshots": {
"name": "haex_crdt_snapshots",
"columns": {
"snapshot_id": {
"name": "snapshot_id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"created": {
"name": "created",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"epoch_hlc": {
"name": "epoch_hlc",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"location_url": {
"name": "location_url",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"file_size_bytes": {
"name": "file_size_bytes",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {},
"foreignKeys": {},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_desktop_items": {
"name": "haex_desktop_items",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"workspace_id": {
"name": "workspace_id",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"item_type": {
"name": "item_type",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"extension_id": {
"name": "extension_id",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"system_window_id": {
"name": "system_window_id",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"position_x": {
"name": "position_x",
"type": "integer",
"primaryKey": false,
"notNull": true,
"autoincrement": false,
"default": 0
},
"position_y": {
"name": "position_y",
"type": "integer",
"primaryKey": false,
"notNull": true,
"autoincrement": false,
"default": 0
},
"haex_timestamp": {
"name": "haex_timestamp",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {},
"foreignKeys": {
"haex_desktop_items_workspace_id_haex_workspaces_id_fk": {
"name": "haex_desktop_items_workspace_id_haex_workspaces_id_fk",
"tableFrom": "haex_desktop_items",
"tableTo": "haex_workspaces",
"columnsFrom": [
"workspace_id"
],
"columnsTo": [
"id"
],
"onDelete": "cascade",
"onUpdate": "no action"
},
"haex_desktop_items_extension_id_haex_extensions_id_fk": {
"name": "haex_desktop_items_extension_id_haex_extensions_id_fk",
"tableFrom": "haex_desktop_items",
"tableTo": "haex_extensions",
"columnsFrom": [
"extension_id"
],
"columnsTo": [
"id"
],
"onDelete": "cascade",
"onUpdate": "no action"
}
},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {
"item_reference": {
"name": "item_reference",
"value": "(\"haex_desktop_items\".\"item_type\" = 'extension' AND \"haex_desktop_items\".\"extension_id\" IS NOT NULL AND \"haex_desktop_items\".\"system_window_id\" IS NULL) OR (\"haex_desktop_items\".\"item_type\" = 'system' AND \"haex_desktop_items\".\"system_window_id\" IS NOT NULL AND \"haex_desktop_items\".\"extension_id\" IS NULL) OR (\"haex_desktop_items\".\"item_type\" = 'file' AND \"haex_desktop_items\".\"system_window_id\" IS NOT NULL AND \"haex_desktop_items\".\"extension_id\" IS NULL) OR (\"haex_desktop_items\".\"item_type\" = 'folder' AND \"haex_desktop_items\".\"system_window_id\" IS NOT NULL AND \"haex_desktop_items\".\"extension_id\" IS NULL)"
}
}
},
"haex_devices": {
"name": "haex_devices",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"device_id": {
"name": "device_id",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"name": {
"name": "name",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"created_at": {
"name": "created_at",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false,
"default": "(CURRENT_TIMESTAMP)"
},
"updated_at": {
"name": "updated_at",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"haex_timestamp": {
"name": "haex_timestamp",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {
"haex_devices_device_id_unique": {
"name": "haex_devices_device_id_unique",
"columns": [
"device_id"
],
"isUnique": true
}
},
"foreignKeys": {},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_extension_permissions": {
"name": "haex_extension_permissions",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"extension_id": {
"name": "extension_id",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"resource_type": {
"name": "resource_type",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"action": {
"name": "action",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"target": {
"name": "target",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"constraints": {
"name": "constraints",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"status": {
"name": "status",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false,
"default": "'denied'"
},
"created_at": {
"name": "created_at",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false,
"default": "(CURRENT_TIMESTAMP)"
},
"updated_at": {
"name": "updated_at",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"haex_timestamp": {
"name": "haex_timestamp",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {
"haex_extension_permissions_extension_id_resource_type_action_target_unique": {
"name": "haex_extension_permissions_extension_id_resource_type_action_target_unique",
"columns": [
"extension_id",
"resource_type",
"action",
"target"
],
"isUnique": true
}
},
"foreignKeys": {
"haex_extension_permissions_extension_id_haex_extensions_id_fk": {
"name": "haex_extension_permissions_extension_id_haex_extensions_id_fk",
"tableFrom": "haex_extension_permissions",
"tableTo": "haex_extensions",
"columnsFrom": [
"extension_id"
],
"columnsTo": [
"id"
],
"onDelete": "cascade",
"onUpdate": "no action"
}
},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_extensions": {
"name": "haex_extensions",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"public_key": {
"name": "public_key",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"name": {
"name": "name",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"version": {
"name": "version",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"author": {
"name": "author",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"description": {
"name": "description",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"entry": {
"name": "entry",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false,
"default": "'index.html'"
},
"homepage": {
"name": "homepage",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"enabled": {
"name": "enabled",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false,
"default": true
},
"icon": {
"name": "icon",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"signature": {
"name": "signature",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"single_instance": {
"name": "single_instance",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false,
"default": false
},
"haex_timestamp": {
"name": "haex_timestamp",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {
"haex_extensions_public_key_name_unique": {
"name": "haex_extensions_public_key_name_unique",
"columns": [
"public_key",
"name"
],
"isUnique": true
}
},
"foreignKeys": {},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_notifications": {
"name": "haex_notifications",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"alt": {
"name": "alt",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"date": {
"name": "date",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"icon": {
"name": "icon",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"image": {
"name": "image",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"read": {
"name": "read",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"source": {
"name": "source",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"text": {
"name": "text",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"title": {
"name": "title",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"type": {
"name": "type",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"haex_timestamp": {
"name": "haex_timestamp",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {},
"foreignKeys": {},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_settings": {
"name": "haex_settings",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"device_id": {
"name": "device_id",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"key": {
"name": "key",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"type": {
"name": "type",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"value": {
"name": "value",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"haex_timestamp": {
"name": "haex_timestamp",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {
"haex_settings_device_id_key_type_unique": {
"name": "haex_settings_device_id_key_type_unique",
"columns": [
"device_id",
"key",
"type"
],
"isUnique": true
}
},
"foreignKeys": {
"haex_settings_device_id_haex_devices_id_fk": {
"name": "haex_settings_device_id_haex_devices_id_fk",
"tableFrom": "haex_settings",
"tableTo": "haex_devices",
"columnsFrom": [
"device_id"
],
"columnsTo": [
"id"
],
"onDelete": "cascade",
"onUpdate": "no action"
}
},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_workspaces": {
"name": "haex_workspaces",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"device_id": {
"name": "device_id",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"name": {
"name": "name",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"position": {
"name": "position",
"type": "integer",
"primaryKey": false,
"notNull": true,
"autoincrement": false,
"default": 0
},
"background": {
"name": "background",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"haex_timestamp": {
"name": "haex_timestamp",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {
"haex_workspaces_position_unique": {
"name": "haex_workspaces_position_unique",
"columns": [
"position"
],
"isUnique": true
}
},
"foreignKeys": {},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
}
},
"views": {},
"enums": {},
"_meta": {
"schemas": {},
"tables": {},
"columns": {}
},
"internal": {
"indexes": {}
}
}

View File

@ -0,0 +1,843 @@
{
"version": "6",
"dialect": "sqlite",
"id": "bf82259e-9264-44e7-a60f-8cc14a1f22e2",
"prevId": "3aedf10c-2266-40f4-8549-0ff8b0588853",
"tables": {
"haex_crdt_configs": {
"name": "haex_crdt_configs",
"columns": {
"key": {
"name": "key",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"value": {
"name": "value",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {},
"foreignKeys": {},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_crdt_logs": {
"name": "haex_crdt_logs",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"haex_timestamp": {
"name": "haex_timestamp",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"table_name": {
"name": "table_name",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"row_pks": {
"name": "row_pks",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"op_type": {
"name": "op_type",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"column_name": {
"name": "column_name",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"new_value": {
"name": "new_value",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"old_value": {
"name": "old_value",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {
"idx_haex_timestamp": {
"name": "idx_haex_timestamp",
"columns": [
"haex_timestamp"
],
"isUnique": false
},
"idx_table_row": {
"name": "idx_table_row",
"columns": [
"table_name",
"row_pks"
],
"isUnique": false
}
},
"foreignKeys": {},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_crdt_snapshots": {
"name": "haex_crdt_snapshots",
"columns": {
"snapshot_id": {
"name": "snapshot_id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"created": {
"name": "created",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"epoch_hlc": {
"name": "epoch_hlc",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"location_url": {
"name": "location_url",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"file_size_bytes": {
"name": "file_size_bytes",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {},
"foreignKeys": {},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_desktop_items": {
"name": "haex_desktop_items",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"workspace_id": {
"name": "workspace_id",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"item_type": {
"name": "item_type",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"extension_id": {
"name": "extension_id",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"system_window_id": {
"name": "system_window_id",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"position_x": {
"name": "position_x",
"type": "integer",
"primaryKey": false,
"notNull": true,
"autoincrement": false,
"default": 0
},
"position_y": {
"name": "position_y",
"type": "integer",
"primaryKey": false,
"notNull": true,
"autoincrement": false,
"default": 0
},
"haex_timestamp": {
"name": "haex_timestamp",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {},
"foreignKeys": {
"haex_desktop_items_workspace_id_haex_workspaces_id_fk": {
"name": "haex_desktop_items_workspace_id_haex_workspaces_id_fk",
"tableFrom": "haex_desktop_items",
"tableTo": "haex_workspaces",
"columnsFrom": [
"workspace_id"
],
"columnsTo": [
"id"
],
"onDelete": "cascade",
"onUpdate": "no action"
},
"haex_desktop_items_extension_id_haex_extensions_id_fk": {
"name": "haex_desktop_items_extension_id_haex_extensions_id_fk",
"tableFrom": "haex_desktop_items",
"tableTo": "haex_extensions",
"columnsFrom": [
"extension_id"
],
"columnsTo": [
"id"
],
"onDelete": "cascade",
"onUpdate": "no action"
}
},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {
"item_reference": {
"name": "item_reference",
"value": "(\"haex_desktop_items\".\"item_type\" = 'extension' AND \"haex_desktop_items\".\"extension_id\" IS NOT NULL AND \"haex_desktop_items\".\"system_window_id\" IS NULL) OR (\"haex_desktop_items\".\"item_type\" = 'system' AND \"haex_desktop_items\".\"system_window_id\" IS NOT NULL AND \"haex_desktop_items\".\"extension_id\" IS NULL) OR (\"haex_desktop_items\".\"item_type\" = 'file' AND \"haex_desktop_items\".\"system_window_id\" IS NOT NULL AND \"haex_desktop_items\".\"extension_id\" IS NULL) OR (\"haex_desktop_items\".\"item_type\" = 'folder' AND \"haex_desktop_items\".\"system_window_id\" IS NOT NULL AND \"haex_desktop_items\".\"extension_id\" IS NULL)"
}
}
},
"haex_devices": {
"name": "haex_devices",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"device_id": {
"name": "device_id",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"name": {
"name": "name",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"created_at": {
"name": "created_at",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false,
"default": "(CURRENT_TIMESTAMP)"
},
"updated_at": {
"name": "updated_at",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"haex_timestamp": {
"name": "haex_timestamp",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {
"haex_devices_device_id_unique": {
"name": "haex_devices_device_id_unique",
"columns": [
"device_id"
],
"isUnique": true
}
},
"foreignKeys": {},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_extension_permissions": {
"name": "haex_extension_permissions",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"extension_id": {
"name": "extension_id",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"resource_type": {
"name": "resource_type",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"action": {
"name": "action",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"target": {
"name": "target",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"constraints": {
"name": "constraints",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"status": {
"name": "status",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false,
"default": "'denied'"
},
"created_at": {
"name": "created_at",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false,
"default": "(CURRENT_TIMESTAMP)"
},
"updated_at": {
"name": "updated_at",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"haex_timestamp": {
"name": "haex_timestamp",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {
"haex_extension_permissions_extension_id_resource_type_action_target_unique": {
"name": "haex_extension_permissions_extension_id_resource_type_action_target_unique",
"columns": [
"extension_id",
"resource_type",
"action",
"target"
],
"isUnique": true
}
},
"foreignKeys": {
"haex_extension_permissions_extension_id_haex_extensions_id_fk": {
"name": "haex_extension_permissions_extension_id_haex_extensions_id_fk",
"tableFrom": "haex_extension_permissions",
"tableTo": "haex_extensions",
"columnsFrom": [
"extension_id"
],
"columnsTo": [
"id"
],
"onDelete": "cascade",
"onUpdate": "no action"
}
},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_extensions": {
"name": "haex_extensions",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"public_key": {
"name": "public_key",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"name": {
"name": "name",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"version": {
"name": "version",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"author": {
"name": "author",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"description": {
"name": "description",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"entry": {
"name": "entry",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false,
"default": "'index.html'"
},
"homepage": {
"name": "homepage",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"enabled": {
"name": "enabled",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false,
"default": true
},
"icon": {
"name": "icon",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"signature": {
"name": "signature",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"single_instance": {
"name": "single_instance",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false,
"default": false
},
"haex_timestamp": {
"name": "haex_timestamp",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {
"haex_extensions_public_key_name_unique": {
"name": "haex_extensions_public_key_name_unique",
"columns": [
"public_key",
"name"
],
"isUnique": true
}
},
"foreignKeys": {},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_notifications": {
"name": "haex_notifications",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"alt": {
"name": "alt",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"date": {
"name": "date",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"icon": {
"name": "icon",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"image": {
"name": "image",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"read": {
"name": "read",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"source": {
"name": "source",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"text": {
"name": "text",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"title": {
"name": "title",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"type": {
"name": "type",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"haex_timestamp": {
"name": "haex_timestamp",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {},
"foreignKeys": {},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_settings": {
"name": "haex_settings",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"device_id": {
"name": "device_id",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"key": {
"name": "key",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"type": {
"name": "type",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"value": {
"name": "value",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"haex_timestamp": {
"name": "haex_timestamp",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {
"haex_settings_device_id_key_type_unique": {
"name": "haex_settings_device_id_key_type_unique",
"columns": [
"device_id",
"key",
"type"
],
"isUnique": true
}
},
"foreignKeys": {
"haex_settings_device_id_haex_devices_id_fk": {
"name": "haex_settings_device_id_haex_devices_id_fk",
"tableFrom": "haex_settings",
"tableTo": "haex_devices",
"columnsFrom": [
"device_id"
],
"columnsTo": [
"id"
],
"onDelete": "cascade",
"onUpdate": "no action"
}
},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_sync_backends": {
"name": "haex_sync_backends",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"name": {
"name": "name",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"server_url": {
"name": "server_url",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"enabled": {
"name": "enabled",
"type": "integer",
"primaryKey": false,
"notNull": true,
"autoincrement": false,
"default": true
},
"priority": {
"name": "priority",
"type": "integer",
"primaryKey": false,
"notNull": true,
"autoincrement": false,
"default": 0
},
"created_at": {
"name": "created_at",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false,
"default": "(CURRENT_TIMESTAMP)"
},
"updated_at": {
"name": "updated_at",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"haex_timestamp": {
"name": "haex_timestamp",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {},
"foreignKeys": {},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_workspaces": {
"name": "haex_workspaces",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"device_id": {
"name": "device_id",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"name": {
"name": "name",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"position": {
"name": "position",
"type": "integer",
"primaryKey": false,
"notNull": true,
"autoincrement": false,
"default": 0
},
"background": {
"name": "background",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"haex_timestamp": {
"name": "haex_timestamp",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {
"haex_workspaces_position_unique": {
"name": "haex_workspaces_position_unique",
"columns": [
"position"
],
"isUnique": true
}
},
"foreignKeys": {},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
}
},
"views": {},
"enums": {},
"_meta": {
"schemas": {},
"tables": {},
"columns": {}
},
"internal": {
"indexes": {}
}
}

View File

@ -0,0 +1,903 @@
{
"version": "6",
"dialect": "sqlite",
"id": "7ae230a2-4488-4214-9163-602018852676",
"prevId": "bf82259e-9264-44e7-a60f-8cc14a1f22e2",
"tables": {
"haex_crdt_configs": {
"name": "haex_crdt_configs",
"columns": {
"key": {
"name": "key",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"value": {
"name": "value",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {},
"foreignKeys": {},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_crdt_logs": {
"name": "haex_crdt_logs",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"haex_timestamp": {
"name": "haex_timestamp",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"table_name": {
"name": "table_name",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"row_pks": {
"name": "row_pks",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"op_type": {
"name": "op_type",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"column_name": {
"name": "column_name",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"new_value": {
"name": "new_value",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"old_value": {
"name": "old_value",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {
"idx_haex_timestamp": {
"name": "idx_haex_timestamp",
"columns": [
"haex_timestamp"
],
"isUnique": false
},
"idx_table_row": {
"name": "idx_table_row",
"columns": [
"table_name",
"row_pks"
],
"isUnique": false
}
},
"foreignKeys": {},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_crdt_snapshots": {
"name": "haex_crdt_snapshots",
"columns": {
"snapshot_id": {
"name": "snapshot_id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"created": {
"name": "created",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"epoch_hlc": {
"name": "epoch_hlc",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"location_url": {
"name": "location_url",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"file_size_bytes": {
"name": "file_size_bytes",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {},
"foreignKeys": {},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_sync_status": {
"name": "haex_sync_status",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"backend_id": {
"name": "backend_id",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"last_pull_sequence": {
"name": "last_pull_sequence",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"last_push_hlc_timestamp": {
"name": "last_push_hlc_timestamp",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"last_sync_at": {
"name": "last_sync_at",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"error": {
"name": "error",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {},
"foreignKeys": {},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_desktop_items": {
"name": "haex_desktop_items",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"workspace_id": {
"name": "workspace_id",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"item_type": {
"name": "item_type",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"extension_id": {
"name": "extension_id",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"system_window_id": {
"name": "system_window_id",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"position_x": {
"name": "position_x",
"type": "integer",
"primaryKey": false,
"notNull": true,
"autoincrement": false,
"default": 0
},
"position_y": {
"name": "position_y",
"type": "integer",
"primaryKey": false,
"notNull": true,
"autoincrement": false,
"default": 0
},
"haex_timestamp": {
"name": "haex_timestamp",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {},
"foreignKeys": {
"haex_desktop_items_workspace_id_haex_workspaces_id_fk": {
"name": "haex_desktop_items_workspace_id_haex_workspaces_id_fk",
"tableFrom": "haex_desktop_items",
"tableTo": "haex_workspaces",
"columnsFrom": [
"workspace_id"
],
"columnsTo": [
"id"
],
"onDelete": "cascade",
"onUpdate": "no action"
},
"haex_desktop_items_extension_id_haex_extensions_id_fk": {
"name": "haex_desktop_items_extension_id_haex_extensions_id_fk",
"tableFrom": "haex_desktop_items",
"tableTo": "haex_extensions",
"columnsFrom": [
"extension_id"
],
"columnsTo": [
"id"
],
"onDelete": "cascade",
"onUpdate": "no action"
}
},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {
"item_reference": {
"name": "item_reference",
"value": "(\"haex_desktop_items\".\"item_type\" = 'extension' AND \"haex_desktop_items\".\"extension_id\" IS NOT NULL AND \"haex_desktop_items\".\"system_window_id\" IS NULL) OR (\"haex_desktop_items\".\"item_type\" = 'system' AND \"haex_desktop_items\".\"system_window_id\" IS NOT NULL AND \"haex_desktop_items\".\"extension_id\" IS NULL) OR (\"haex_desktop_items\".\"item_type\" = 'file' AND \"haex_desktop_items\".\"system_window_id\" IS NOT NULL AND \"haex_desktop_items\".\"extension_id\" IS NULL) OR (\"haex_desktop_items\".\"item_type\" = 'folder' AND \"haex_desktop_items\".\"system_window_id\" IS NOT NULL AND \"haex_desktop_items\".\"extension_id\" IS NULL)"
}
}
},
"haex_devices": {
"name": "haex_devices",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"device_id": {
"name": "device_id",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"name": {
"name": "name",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"created_at": {
"name": "created_at",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false,
"default": "(CURRENT_TIMESTAMP)"
},
"updated_at": {
"name": "updated_at",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"haex_timestamp": {
"name": "haex_timestamp",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {
"haex_devices_device_id_unique": {
"name": "haex_devices_device_id_unique",
"columns": [
"device_id"
],
"isUnique": true
}
},
"foreignKeys": {},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_extension_permissions": {
"name": "haex_extension_permissions",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"extension_id": {
"name": "extension_id",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"resource_type": {
"name": "resource_type",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"action": {
"name": "action",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"target": {
"name": "target",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"constraints": {
"name": "constraints",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"status": {
"name": "status",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false,
"default": "'denied'"
},
"created_at": {
"name": "created_at",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false,
"default": "(CURRENT_TIMESTAMP)"
},
"updated_at": {
"name": "updated_at",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"haex_timestamp": {
"name": "haex_timestamp",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {
"haex_extension_permissions_extension_id_resource_type_action_target_unique": {
"name": "haex_extension_permissions_extension_id_resource_type_action_target_unique",
"columns": [
"extension_id",
"resource_type",
"action",
"target"
],
"isUnique": true
}
},
"foreignKeys": {
"haex_extension_permissions_extension_id_haex_extensions_id_fk": {
"name": "haex_extension_permissions_extension_id_haex_extensions_id_fk",
"tableFrom": "haex_extension_permissions",
"tableTo": "haex_extensions",
"columnsFrom": [
"extension_id"
],
"columnsTo": [
"id"
],
"onDelete": "cascade",
"onUpdate": "no action"
}
},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_extensions": {
"name": "haex_extensions",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"public_key": {
"name": "public_key",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"name": {
"name": "name",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"version": {
"name": "version",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"author": {
"name": "author",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"description": {
"name": "description",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"entry": {
"name": "entry",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false,
"default": "'index.html'"
},
"homepage": {
"name": "homepage",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"enabled": {
"name": "enabled",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false,
"default": true
},
"icon": {
"name": "icon",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"signature": {
"name": "signature",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"single_instance": {
"name": "single_instance",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false,
"default": false
},
"display_mode": {
"name": "display_mode",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false,
"default": "'auto'"
},
"haex_timestamp": {
"name": "haex_timestamp",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {
"haex_extensions_public_key_name_unique": {
"name": "haex_extensions_public_key_name_unique",
"columns": [
"public_key",
"name"
],
"isUnique": true
}
},
"foreignKeys": {},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_notifications": {
"name": "haex_notifications",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"alt": {
"name": "alt",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"date": {
"name": "date",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"icon": {
"name": "icon",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"image": {
"name": "image",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"read": {
"name": "read",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"source": {
"name": "source",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"text": {
"name": "text",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"title": {
"name": "title",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"type": {
"name": "type",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"haex_timestamp": {
"name": "haex_timestamp",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {},
"foreignKeys": {},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_settings": {
"name": "haex_settings",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"device_id": {
"name": "device_id",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"key": {
"name": "key",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"type": {
"name": "type",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"value": {
"name": "value",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"haex_timestamp": {
"name": "haex_timestamp",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {
"haex_settings_device_id_key_type_unique": {
"name": "haex_settings_device_id_key_type_unique",
"columns": [
"device_id",
"key",
"type"
],
"isUnique": true
}
},
"foreignKeys": {
"haex_settings_device_id_haex_devices_id_fk": {
"name": "haex_settings_device_id_haex_devices_id_fk",
"tableFrom": "haex_settings",
"tableTo": "haex_devices",
"columnsFrom": [
"device_id"
],
"columnsTo": [
"id"
],
"onDelete": "cascade",
"onUpdate": "no action"
}
},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_sync_backends": {
"name": "haex_sync_backends",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"name": {
"name": "name",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"server_url": {
"name": "server_url",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"enabled": {
"name": "enabled",
"type": "integer",
"primaryKey": false,
"notNull": true,
"autoincrement": false,
"default": true
},
"priority": {
"name": "priority",
"type": "integer",
"primaryKey": false,
"notNull": true,
"autoincrement": false,
"default": 0
},
"created_at": {
"name": "created_at",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false,
"default": "(CURRENT_TIMESTAMP)"
},
"updated_at": {
"name": "updated_at",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"haex_timestamp": {
"name": "haex_timestamp",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {},
"foreignKeys": {},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_workspaces": {
"name": "haex_workspaces",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"device_id": {
"name": "device_id",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"name": {
"name": "name",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"position": {
"name": "position",
"type": "integer",
"primaryKey": false,
"notNull": true,
"autoincrement": false,
"default": 0
},
"background": {
"name": "background",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"haex_timestamp": {
"name": "haex_timestamp",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {
"haex_workspaces_position_unique": {
"name": "haex_workspaces_position_unique",
"columns": [
"position"
],
"isUnique": true
}
},
"foreignKeys": {},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
}
},
"views": {},
"enums": {},
"_meta": {
"schemas": {},
"tables": {},
"columns": {}
},
"internal": {
"indexes": {}
}
}

View File

@ -5,15 +5,36 @@
{
"idx": 0,
"version": "6",
"when": 1761430560028,
"tag": "0000_secret_ender_wiggin",
"when": 1762119713008,
"tag": "0000_cynical_nicolaos",
"breakpoints": true
},
{
"idx": 1,
"version": "6",
"when": 1761581351395,
"tag": "0001_late_the_renegades",
"when": 1762122405562,
"tag": "0001_furry_brother_voodoo",
"breakpoints": true
},
{
"idx": 2,
"version": "6",
"when": 1762263814375,
"tag": "0002_loose_quasimodo",
"breakpoints": true
},
{
"idx": 3,
"version": "6",
"when": 1762300795436,
"tag": "0003_luxuriant_deathstrike",
"breakpoints": true
},
{
"idx": 4,
"version": "6",
"when": 1762894662424,
"tag": "0004_fast_epoch",
"breakpoints": true
}
]

View File

@ -1,5 +0,0 @@
export const crdtColumnNames = {
haexTimestamp: 'haex_timestamp',
}
export * from './crdt'
export * from './haex'

Binary file not shown.

View File

@ -24,6 +24,23 @@ android {
versionCode = tauriProperties.getProperty("tauri.android.versionCode", "1").toInt()
versionName = tauriProperties.getProperty("tauri.android.versionName", "1.0")
}
signingConfigs {
create("release") {
val keystorePath = System.getenv("ANDROID_KEYSTORE_PATH")
val keystorePassword = System.getenv("ANDROID_KEYSTORE_PASSWORD")
val keyAlias = System.getenv("ANDROID_KEY_ALIAS")
val keyPassword = System.getenv("ANDROID_KEY_PASSWORD")
if (keystorePath != null && keystorePassword != null && keyAlias != null && keyPassword != null) {
storeFile = file(keystorePath)
storePassword = keystorePassword
this.keyAlias = keyAlias
this.keyPassword = keyPassword
}
}
}
buildTypes {
getByName("debug") {
manifestPlaceholders["usesCleartextTraffic"] = "true"
@ -43,6 +60,12 @@ android {
.plus(getDefaultProguardFile("proguard-android-optimize.txt"))
.toList().toTypedArray()
)
// Sign with release config if available
val releaseSigningConfig = signingConfigs.getByName("release")
if (releaseSigningConfig.storeFile != null) {
signingConfig = releaseSigningConfig
}
}
}
kotlinOptions {

View File

@ -1 +1 @@
{"default":{"identifier":"default","description":"Capability for the main window","local":true,"windows":["main"],"permissions":["core:default","core:webview:allow-create-webview-window","core:webview:allow-create-webview","core:webview:allow-webview-show","core:webview:default","core:window:allow-create","core:window:allow-get-all-windows","core:window:allow-show","core:window:default","dialog:default","fs:allow-appconfig-read-recursive","fs:allow-appconfig-write-recursive","fs:allow-appdata-read-recursive","fs:allow-appdata-write-recursive","fs:allow-read-file","fs:allow-read-dir","fs:allow-resource-read-recursive","fs:allow-resource-write-recursive","fs:allow-download-read-recursive","fs:allow-download-write-recursive","fs:default",{"identifier":"fs:scope","allow":[{"path":"**"}]},"http:allow-fetch-send","http:allow-fetch","http:default","notification:allow-create-channel","notification:allow-list-channels","notification:allow-notify","notification:default","opener:allow-open-url","opener:default","os:allow-hostname","os:default","store:default"]}}
{"default":{"identifier":"default","description":"Capability for the main window","local":true,"windows":["main"],"permissions":["core:default","core:webview:allow-create-webview-window","core:webview:allow-create-webview","core:webview:allow-webview-show","core:webview:default","core:window:allow-create","core:window:allow-get-all-windows","core:window:allow-show","core:window:default","dialog:default","fs:allow-appconfig-read-recursive","fs:allow-appconfig-write-recursive","fs:allow-appdata-read-recursive","fs:allow-appdata-write-recursive","fs:allow-applocaldata-read-recursive","fs:allow-applocaldata-write-recursive","fs:allow-read-file","fs:allow-write-file","fs:allow-read-dir","fs:allow-mkdir","fs:allow-exists","fs:allow-remove","fs:allow-resource-read-recursive","fs:allow-resource-write-recursive","fs:allow-download-read-recursive","fs:allow-download-write-recursive","fs:allow-temp-read-recursive","fs:allow-temp-write-recursive","fs:default",{"identifier":"fs:scope","allow":[{"path":"**"},{"path":"$TEMP/**"}]},"http:allow-fetch-send","http:allow-fetch","http:default","notification:allow-create-channel","notification:allow-list-channels","notification:allow-notify","notification:allow-is-permission-granted","notification:default","opener:allow-open-url",{"identifier":"opener:allow-open-path","allow":[{"path":"$TEMP/**"}]},"opener:default","os:allow-hostname","os:default","store:default"]},"extensions":{"identifier":"extensions","description":"Minimal capability for extension webviews - extensions have NO direct system access","remote":{"urls":["http://localhost:*","haex-extension://*"]},"local":true,"webviews":["ext_*"],"permissions":["core:default","core:webview:default","notification:default","notification:allow-is-permission-granted"]}}

View File

@ -0,0 +1,76 @@
// src-tauri/generator/event_names.rs
use serde::Deserialize;
use std::collections::HashMap;
use std::env;
use std::fs::File;
use std::io::{BufReader, Write};
use std::path::Path;
#[derive(Debug, Deserialize)]
struct EventNames {
extension: HashMap<String, String>,
}
pub fn generate_event_names() {
let out_dir = env::var("OUT_DIR").expect("OUT_DIR ist nicht gesetzt.");
println!("Generiere Event-Namen nach {out_dir}");
let events_path = Path::new("../src/constants/eventNames.json");
let dest_path = Path::new(&out_dir).join("eventNames.rs");
let file = File::open(events_path).expect("Could not open eventNames.json");
let reader = BufReader::new(file);
let events: EventNames =
serde_json::from_reader(reader).expect("Konnte eventNames.json nicht parsen");
let mut code = String::from(
r#"
// ==================================================================
// NOTE: This file is generated automatically by build.rs.
// Manual changes will be overwritten on the next compilation!
// ==================================================================
"#,
);
// Extension Events
code.push_str("// --- Extension Events ---\n");
for (key, value) in &events.extension {
let const_name = format!("EVENT_EXTENSION_{}", to_screaming_snake_case(key));
code.push_str(&format!(
"pub const {}: &str = \"{}\";\n",
const_name, value
));
}
code.push('\n');
// --- Write the file ---
let mut f = File::create(&dest_path).expect("Could not create destination file");
f.write_all(code.as_bytes())
.expect("Could not write to destination file");
println!("cargo:rerun-if-changed=../src/constants/eventNames.json");
}
/// Converts a string to SCREAMING_SNAKE_CASE
fn to_screaming_snake_case(s: &str) -> String {
let mut result = String::new();
let mut prev_is_lower = false;
for (i, ch) in s.chars().enumerate() {
if ch == '_' {
result.push('_');
prev_is_lower = false;
} else if ch.is_uppercase() {
if i > 0 && prev_is_lower {
result.push('_');
}
result.push(ch);
prev_is_lower = false;
} else {
result.push(ch.to_ascii_uppercase());
prev_is_lower = true;
}
}
result
}
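
For reference, this is the constant the generator above would emit for a hypothetical entry in eventNames.json; only the naming transformation is taken from the code, the key and value are invented for illustration:

// Hypothetical input in ../src/constants/eventNames.json:
//   { "extension": { "contextChanged": "haextension://context-changed" } }
//
// Resulting line in $OUT_DIR/eventNames.rs ("contextChanged" -> "CONTEXT_CHANGED"):
pub const EVENT_EXTENSION_CONTEXT_CHANGED: &str = "haextension://context-changed";

// The generated file is usually pulled into the crate with an include!
// (assumed wiring; build.rs itself is not shown in this diff):
// include!(concat!(env!("OUT_DIR"), "/eventNames.rs"));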

View File

@ -1,3 +1,4 @@
// build/mod.rs
pub mod event_names;
pub mod rust_types;
pub mod table_names;

View File

@ -20,11 +20,11 @@ struct TableDefinition {
pub fn generate_table_names() {
let out_dir = env::var("OUT_DIR").expect("OUT_DIR ist nicht gesetzt.");
println!("Generiere Tabellennamen nach {}", out_dir);
let schema_path = Path::new("database/tableNames.json");
println!("Generiere Tabellennamen nach {out_dir}");
let schema_path = Path::new("../src/database/tableNames.json");
let dest_path = Path::new(&out_dir).join("tableNames.rs");
let file = File::open(&schema_path).expect("Konnte tableNames.json nicht öffnen");
let file = File::open(schema_path).expect("Konnte tableNames.json nicht öffnen");
let reader = BufReader::new(file);
let schema: Schema =
serde_json::from_reader(reader).expect("Konnte tableNames.json nicht parsen");
@ -66,7 +66,7 @@ pub fn generate_table_names() {
f.write_all(code.as_bytes())
.expect("Konnte nicht in Zieldatei schreiben");
println!("cargo:rerun-if-changed=database/tableNames.json");
println!("cargo:rerun-if-changed=../src/database/tableNames.json");
}
/// Converts a string to SCREAMING_SNAKE_CASE
@ -108,8 +108,7 @@ fn generate_table_constants(table: &TableDefinition, const_prefix: &str) -> Stri
for (col_key, col_value) in &table.columns {
let col_const_name = format!("COL_{}_{}", const_prefix, to_screaming_snake_case(col_key));
code.push_str(&format!(
"pub const {}: &str = \"{}\";\n",
col_const_name, col_value
"pub const {col_const_name}: &str = \"{col_value}\";\n"
));
}

View File

@ -74,15 +74,14 @@ impl HlcService {
// Parse the string into a Uuid object.
let uuid = Uuid::parse_str(&node_id_str).map_err(|e| {
HlcError::ParseNodeId(format!(
"Stored device ID is not a valid UUID: {}. Error: {}",
node_id_str, e
"Stored device ID is not a valid UUID: {node_id_str}. Error: {e}"
))
})?;
// Take the raw 16 bytes and build the uhlc::ID from them.
// The `*` dereferences the `&[u8; 16]` to `[u8; 16]`, which is what `try_from` expects.
let node_id = ID::try_from(*uuid.as_bytes()).map_err(|e| {
HlcError::ParseNodeId(format!("Invalid node ID format from device store: {:?}", e))
HlcError::ParseNodeId(format!("Invalid node ID format from device store: {e:?}"))
})?;
// 2. Create an HLC instance with a stable identity
@ -95,8 +94,7 @@ impl HlcService {
if let Some(last_timestamp) = Self::load_last_timestamp(conn)? {
hlc.update_with_timestamp(&last_timestamp).map_err(|e| {
HlcError::Parse(format!(
"Failed to update HLC with persisted timestamp: {:?}",
e
"Failed to update HLC with persisted timestamp: {e:?}"
))
})?;
}
@ -119,7 +117,7 @@ impl HlcService {
if let Some(s) = value.as_str() {
// This is our success case. We have a &str and can
// return a copy of it.
println!("Found and validated device ID: {}", s);
println!("Found and validated device ID: {s}");
if Uuid::parse_str(s).is_ok() {
// Success case: the value is a string AND a valid UUID.
// We can return from the function directly with this value.
@ -183,19 +181,19 @@ impl HlcService {
let hlc = hlc_guard.as_mut().ok_or(HlcError::NotInitialized)?;
hlc.update_with_timestamp(timestamp)
.map_err(|e| HlcError::Parse(format!("Failed to update HLC: {:?}", e)))
.map_err(|e| HlcError::Parse(format!("Failed to update HLC: {e:?}")))
}
/// Loads the last persisted timestamp from the database.
fn load_last_timestamp(conn: &Connection) -> Result<Option<Timestamp>, HlcError> {
let query = format!("SELECT value FROM {} WHERE key = ?1", TABLE_CRDT_CONFIGS);
let query = format!("SELECT value FROM {TABLE_CRDT_CONFIGS} WHERE key = ?1");
match conn.query_row(&query, params![HLC_TIMESTAMP_TYPE], |row| {
row.get::<_, String>(0)
}) {
Ok(state_str) => {
let timestamp = Timestamp::from_str(&state_str).map_err(|e| {
HlcError::ParseTimestamp(format!("Invalid timestamp format: {:?}", e))
HlcError::ParseTimestamp(format!("Invalid timestamp format: {e:?}"))
})?;
Ok(Some(timestamp))
}
@ -209,9 +207,8 @@ impl HlcService {
let timestamp_str = timestamp.to_string();
tx.execute(
&format!(
"INSERT INTO {} (key, value) VALUES (?1, ?2)
ON CONFLICT(key) DO UPDATE SET value = excluded.value",
TABLE_CRDT_CONFIGS
"INSERT INTO {TABLE_CRDT_CONFIGS} (key, value) VALUES (?1, ?2)
ON CONFLICT(key) DO UPDATE SET value = excluded.value"
),
params![HLC_TIMESTAMP_TYPE, timestamp_str],
)?;

View File

@ -11,8 +11,6 @@ const INSERT_TRIGGER_TPL: &str = "z_crdt_{TABLE_NAME}_insert";
const UPDATE_TRIGGER_TPL: &str = "z_crdt_{TABLE_NAME}_update";
const DELETE_TRIGGER_TPL: &str = "z_crdt_{TABLE_NAME}_delete";
//const SYNC_ACTIVE_KEY: &str = "sync_active";
pub const HLC_TIMESTAMP_COLUMN: &str = "haex_timestamp";
/// Name of the custom UUID generation function (registered in database::core::open_and_init_db)
@ -34,17 +32,16 @@ pub enum CrdtSetupError {
impl Display for CrdtSetupError {
fn fmt(&self, f: &mut Formatter<'_>) -> fmt::Result {
match self {
CrdtSetupError::DatabaseError(e) => write!(f, "Database error: {}", e),
CrdtSetupError::DatabaseError(e) => write!(f, "Database error: {e}"),
CrdtSetupError::HlcColumnMissing {
table_name,
column_name,
} => write!(
f,
"Table '{}' is missing the required hlc column '{}'",
table_name, column_name
"Table '{table_name}' is missing the required hlc column '{column_name}'"
),
CrdtSetupError::PrimaryKeyMissing { table_name } => {
write!(f, "Table '{}' has no primary key", table_name)
write!(f, "Table '{table_name}' has no primary key")
}
}
}
@ -85,7 +82,8 @@ impl ColumnInfo {
}
fn is_safe_identifier(name: &str) -> bool {
!name.is_empty() && name.chars().all(|c| c.is_alphanumeric() || c == '_')
// Allow alphanumeric characters, underscores, and hyphens (for extension names like "nuxt-app")
!name.is_empty() && name.chars().all(|c| c.is_alphanumeric() || c == '_' || c == '-')
}
/// Sets up CRDT triggers for a single table.
@ -130,7 +128,7 @@ pub fn setup_triggers_for_table(
let delete_trigger_sql = generate_delete_trigger_sql(table_name, &pks, &cols_to_track);
if recreate {
drop_triggers_for_table(&tx, table_name)?;
drop_triggers_for_table(tx, table_name)?;
}
tx.execute_batch(&insert_trigger_sql)?;
@ -144,13 +142,11 @@ pub fn setup_triggers_for_table(
pub fn get_table_schema(conn: &Connection, table_name: &str) -> RusqliteResult<Vec<ColumnInfo>> {
if !is_safe_identifier(table_name) {
return Err(rusqlite::Error::InvalidParameterName(format!(
"Invalid or unsafe table name provided: {}",
table_name
))
.into());
"Invalid or unsafe table name provided: {table_name}"
)));
}
let sql = format!("PRAGMA table_info(\"{}\");", table_name);
let sql = format!("PRAGMA table_info(\"{table_name}\");");
let mut stmt = conn.prepare(&sql)?;
let rows = stmt.query_map([], ColumnInfo::from_row)?;
rows.collect()
@ -164,8 +160,7 @@ pub fn drop_triggers_for_table(
) -> Result<(), CrdtSetupError> {
if !is_safe_identifier(table_name) {
return Err(rusqlite::Error::InvalidParameterName(format!(
"Invalid or unsafe table name provided: {}",
table_name
"Invalid or unsafe table name provided: {table_name}"
))
.into());
}
@ -178,8 +173,7 @@ pub fn drop_triggers_for_table(
drop_trigger_sql(DELETE_TRIGGER_TPL.replace("{TABLE_NAME}", table_name));
let sql_batch = format!(
"{}\n{}\n{}",
drop_insert_trigger_sql, drop_update_trigger_sql, drop_delete_trigger_sql
"{drop_insert_trigger_sql}\n{drop_update_trigger_sql}\n{drop_delete_trigger_sql}"
);
tx.execute_batch(&sql_batch)?;
@ -245,33 +239,22 @@ pub fn drop_triggers_for_table(
fn generate_insert_trigger_sql(table_name: &str, pks: &[String], cols: &[String]) -> String {
let pk_json_payload = pks
.iter()
.map(|pk| format!("'{}', NEW.\"{}\"", pk, pk))
.map(|pk| format!("'{pk}', NEW.\"{pk}\""))
.collect::<Vec<_>>()
.join(", ");
let column_inserts = if cols.is_empty() {
// Only PKs -> simple insert into the log
format!(
"INSERT INTO {log_table} (id, haex_timestamp, op_type, table_name, row_pks)
VALUES ({uuid_fn}(), NEW.\"{hlc_col}\", 'INSERT', '{table}', json_object({pk_payload}));",
log_table = TABLE_CRDT_LOGS,
uuid_fn = UUID_FUNCTION_NAME,
hlc_col = HLC_TIMESTAMP_COLUMN,
table = table_name,
pk_payload = pk_json_payload
"INSERT INTO {TABLE_CRDT_LOGS} (id, haex_timestamp, op_type, table_name, row_pks)
VALUES ({UUID_FUNCTION_NAME}(), NEW.\"{HLC_TIMESTAMP_COLUMN}\", 'INSERT', '{table_name}', json_object({pk_json_payload}));"
)
} else {
cols.iter().fold(String::new(), |mut acc, col| {
writeln!(
&mut acc,
"INSERT INTO {log_table} (id, haex_timestamp, op_type, table_name, row_pks, column_name, new_value)
VALUES ({uuid_fn}(), NEW.\"{hlc_col}\", 'INSERT', '{table}', json_object({pk_payload}), '{column}', json_object('value', NEW.\"{column}\"));",
log_table = TABLE_CRDT_LOGS,
uuid_fn = UUID_FUNCTION_NAME,
hlc_col = HLC_TIMESTAMP_COLUMN,
table = table_name,
pk_payload = pk_json_payload,
column = col
"INSERT INTO {TABLE_CRDT_LOGS} (id, haex_timestamp, op_type, table_name, row_pks, column_name, new_value)
VALUES ({UUID_FUNCTION_NAME}(), NEW.\"{HLC_TIMESTAMP_COLUMN}\", 'INSERT', '{table_name}', json_object({pk_json_payload}), '{col}', json_object('value', NEW.\"{col}\"));"
).unwrap();
acc
})
@ -291,14 +274,14 @@ fn generate_insert_trigger_sql(table_name: &str, pks: &[String], cols: &[String]
/// Generates the SQL for dropping a trigger.
fn drop_trigger_sql(trigger_name: String) -> String {
format!("DROP TRIGGER IF EXISTS \"{}\";", trigger_name)
format!("DROP TRIGGER IF EXISTS \"{trigger_name}\";")
}
/// Generates the SQL for the UPDATE trigger.
fn generate_update_trigger_sql(table_name: &str, pks: &[String], cols: &[String]) -> String {
let pk_json_payload = pks
.iter()
.map(|pk| format!("'{}', NEW.\"{}\"", pk, pk))
.map(|pk| format!("'{pk}', NEW.\"{pk}\""))
.collect::<Vec<_>>()
.join(", ");
@ -309,16 +292,10 @@ fn generate_update_trigger_sql(table_name: &str, pks: &[String], cols: &[String]
for col in cols {
writeln!(
&mut body,
"INSERT INTO {log_table} (id, haex_timestamp, op_type, table_name, row_pks, column_name, new_value, old_value)
SELECT {uuid_fn}(), NEW.\"{hlc_col}\", 'UPDATE', '{table}', json_object({pk_payload}), '{column}',
json_object('value', NEW.\"{column}\"), json_object('value', OLD.\"{column}\")
WHERE NEW.\"{column}\" IS NOT OLD.\"{column}\";",
log_table = TABLE_CRDT_LOGS,
uuid_fn = UUID_FUNCTION_NAME,
hlc_col = HLC_TIMESTAMP_COLUMN,
table = table_name,
pk_payload = pk_json_payload,
column = col
"INSERT INTO {TABLE_CRDT_LOGS} (id, haex_timestamp, op_type, table_name, row_pks, column_name, new_value, old_value)
SELECT {UUID_FUNCTION_NAME}(), NEW.\"{HLC_TIMESTAMP_COLUMN}\", 'UPDATE', '{table_name}', json_object({pk_json_payload}), '{col}',
json_object('value', NEW.\"{col}\"), json_object('value', OLD.\"{col}\")
WHERE NEW.\"{col}\" IS NOT OLD.\"{col}\";"
).unwrap();
}
}
@ -342,7 +319,7 @@ fn generate_update_trigger_sql(table_name: &str, pks: &[String], cols: &[String]
fn generate_delete_trigger_sql(table_name: &str, pks: &[String], cols: &[String]) -> String {
let pk_json_payload = pks
.iter()
.map(|pk| format!("'{}', OLD.\"{}\"", pk, pk))
.map(|pk| format!("'{pk}', OLD.\"{pk}\""))
.collect::<Vec<_>>()
.join(", ");
@ -353,28 +330,17 @@ fn generate_delete_trigger_sql(table_name: &str, pks: &[String], cols: &[String]
for col in cols {
writeln!(
&mut body,
"INSERT INTO {log_table} (id, haex_timestamp, op_type, table_name, row_pks, column_name, old_value)
VALUES ({uuid_fn}(), OLD.\"{hlc_col}\", 'DELETE', '{table}', json_object({pk_payload}), '{column}',
json_object('value', OLD.\"{column}\"));",
log_table = TABLE_CRDT_LOGS,
uuid_fn = UUID_FUNCTION_NAME,
hlc_col = HLC_TIMESTAMP_COLUMN,
table = table_name,
pk_payload = pk_json_payload,
column = col
"INSERT INTO {TABLE_CRDT_LOGS} (id, haex_timestamp, op_type, table_name, row_pks, column_name, old_value)
VALUES ({UUID_FUNCTION_NAME}(), OLD.\"{HLC_TIMESTAMP_COLUMN}\", 'DELETE', '{table_name}', json_object({pk_json_payload}), '{col}',
json_object('value', OLD.\"{col}\"));"
).unwrap();
}
} else {
// Only PKs -> minimal delete log entry
writeln!(
&mut body,
"INSERT INTO {log_table} (id, haex_timestamp, op_type, table_name, row_pks)
VALUES ({uuid_fn}(), OLD.\"{hlc_col}\", 'DELETE', '{table}', json_object({pk_payload}));",
log_table = TABLE_CRDT_LOGS,
uuid_fn = UUID_FUNCTION_NAME,
hlc_col = HLC_TIMESTAMP_COLUMN,
table = table_name,
pk_payload = pk_json_payload
"INSERT INTO {TABLE_CRDT_LOGS} (id, haex_timestamp, op_type, table_name, row_pks)
VALUES ({UUID_FUNCTION_NAME}(), OLD.\"{HLC_TIMESTAMP_COLUMN}\", 'DELETE', '{table_name}', json_object({pk_json_payload}));"
)
.unwrap();
}
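To make the generated SQL easier to picture, here is a rough sketch of what the INSERT trigger produced for a hypothetical table notes(id, title) might look like. The CREATE TRIGGER wrapper, the log table name haex_crdt_logs, and the UUID function haex_uuid() are assumptions for illustration; only the trigger name template, the haex_timestamp column, and the json_object payloads are taken from the code above.

// Illustrative output only; actual names come from the crate's constants.
const EXAMPLE_INSERT_TRIGGER: &str = r#"
CREATE TRIGGER IF NOT EXISTS "z_crdt_notes_insert"
AFTER INSERT ON "notes"
BEGIN
    INSERT INTO haex_crdt_logs (id, haex_timestamp, op_type, table_name, row_pks, column_name, new_value)
    VALUES (haex_uuid(), NEW."haex_timestamp", 'INSERT', 'notes', json_object('id', NEW."id"), 'title', json_object('value', NEW."title"));
END;
"#;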

View File

@ -47,7 +47,7 @@ pub fn open_and_init_db(path: &str, key: &str, create: bool) -> Result<Connectio
},
)
.map_err(|e| DatabaseError::DatabaseError {
reason: format!("Failed to register {} function: {}", UUID_FUNCTION_NAME, e),
reason: format!("Failed to register {UUID_FUNCTION_NAME} function: {e}"),
})?;
let journal_mode: String = conn
@ -61,8 +61,7 @@ pub fn open_and_init_db(path: &str, key: &str, create: bool) -> Result<Connectio
println!("WAL mode successfully enabled.");
} else {
eprintln!(
"Failed to enable WAL mode, journal_mode is '{}'.",
journal_mode
"Failed to enable WAL mode, journal_mode is '{journal_mode}'."
);
}
@ -89,8 +88,15 @@ pub fn parse_single_statement(sql: &str) -> Result<Statement, DatabaseError> {
/// Utility for SQL parsing - parses multiple SQL statements
pub fn parse_sql_statements(sql: &str) -> Result<Vec<Statement>, DatabaseError> {
let dialect = SQLiteDialect {};
Parser::parse_sql(&dialect, sql).map_err(|e| DatabaseError::ParseError {
reason: e.to_string(),
// Normalize whitespace: replace multiple whitespaces (including newlines, tabs) with single space
let normalized_sql = sql
.split_whitespace()
.collect::<Vec<&str>>()
.join(" ");
Parser::parse_sql(&dialect, &normalized_sql).map_err(|e| DatabaseError::ParseError {
reason: format!("Failed to parse SQL: {e}"),
sql: sql.to_string(),
})
}
@ -131,7 +137,7 @@ impl ValueConverter {
serde_json::to_string(json_val)
.map(SqlValue::Text)
.map_err(|e| DatabaseError::SerializationError {
reason: format!("Failed to serialize JSON param: {}", e),
reason: format!("Failed to serialize JSON param: {e}"),
})
}
}
@ -251,7 +257,7 @@ pub fn select_with_crdt(
params: Vec<JsonValue>,
connection: &DbConnection,
) -> Result<Vec<Vec<JsonValue>>, DatabaseError> {
with_connection(&connection, |conn| {
with_connection(connection, |conn| {
SqlExecutor::query_select(conn, &sql, &params)
})
}
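A minimal sketch of the whitespace normalization that parse_sql_statements now applies before handing the SQL to sqlparser. Note that split_whitespace also collapses runs of whitespace inside quoted literals, so this assumes the statements do not depend on significant whitespace in literal values.

use sqlparser::ast::Statement;
use sqlparser::dialect::SQLiteDialect;
use sqlparser::parser::{Parser, ParserError};

fn parse_normalized(sql: &str) -> Result<Vec<Statement>, ParserError> {
    // Collapse newlines, tabs and repeated spaces into single spaces.
    let normalized = sql.split_whitespace().collect::<Vec<_>>().join(" ");
    Parser::parse_sql(&SQLiteDialect {}, &normalized)
}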

View File

@ -36,8 +36,7 @@ pub fn ensure_triggers_initialized(conn: &mut Connection) -> Result<bool, Databa
// Check if triggers already initialized
let check_sql = format!(
"SELECT value FROM {} WHERE key = ? AND type = ?",
TABLE_SETTINGS
"SELECT value FROM {TABLE_SETTINGS} WHERE key = ? AND type = ?"
);
let initialized: Option<String> = tx
.query_row(
@ -57,7 +56,7 @@ pub fn ensure_triggers_initialized(conn: &mut Connection) -> Result<bool, Databa
// Create triggers for all CRDT tables
for table_name in CRDT_TABLES {
eprintln!(" - Setting up triggers for: {}", table_name);
eprintln!(" - Setting up triggers for: {table_name}");
trigger::setup_triggers_for_table(&tx, table_name, false)?;
}

View File

@ -20,6 +20,8 @@ use std::time::UNIX_EPOCH;
use std::{fs, sync::Arc};
use tauri::{path::BaseDirectory, AppHandle, Manager, State};
use tauri_plugin_fs::FsExt;
#[cfg(not(target_os = "android"))]
use trash;
use ts_rs::TS;
pub struct DbConnection(pub Arc<Mutex<Option<Connection>>>);
@ -91,7 +93,7 @@ fn get_vault_path(app_handle: &AppHandle, vault_name: &str) -> Result<String, Da
let vault_file_name = if vault_name.ends_with(VAULT_EXTENSION) {
vault_name.to_string()
} else {
format!("{}{VAULT_EXTENSION}", vault_name)
format!("{vault_name}{VAULT_EXTENSION}")
};
let vault_directory = get_vaults_directory(app_handle)?;
@ -99,13 +101,12 @@ fn get_vault_path(app_handle: &AppHandle, vault_name: &str) -> Result<String, Da
let vault_path = app_handle
.path()
.resolve(
format!("{vault_directory}/{}", vault_file_name),
format!("{vault_directory}/{vault_file_name}"),
BaseDirectory::AppLocalData,
)
.map_err(|e| DatabaseError::PathResolutionError {
reason: format!(
"Failed to resolve vault path for '{}': {}",
vault_file_name, e
"Failed to resolve vault path for '{vault_file_name}': {e}"
),
})?;
@ -113,7 +114,7 @@ fn get_vault_path(app_handle: &AppHandle, vault_name: &str) -> Result<String, Da
if let Some(parent) = vault_path.parent() {
fs::create_dir_all(parent).map_err(|e| DatabaseError::IoError {
path: parent.display().to_string(),
reason: format!("Failed to create vaults directory: {}", e),
reason: format!("Failed to create vaults directory: {e}"),
})?;
}
@ -133,7 +134,6 @@ pub fn get_vaults_directory(app_handle: &AppHandle) -> Result<String, DatabaseEr
Ok(vaults_dir.to_string_lossy().to_string())
}
//#[serde(tag = "type", content = "details")]
#[derive(Debug, Serialize, Deserialize, TS)]
#[ts(export)]
#[serde(rename_all = "camelCase")]
@ -173,18 +173,18 @@ pub fn list_vaults(app_handle: AppHandle) -> Result<Vec<VaultInfo>, DatabaseErro
if let Some(filename) = path.file_name().and_then(|n| n.to_str()) {
if filename.ends_with(VAULT_EXTENSION) {
// Strip the .db extension for the return value
println!("Vault gefunden {}", filename.to_string());
println!("Vault gefunden {filename}");
let metadata = fs::metadata(&path).map_err(|e| DatabaseError::IoError {
path: path.to_string_lossy().to_string(),
reason: format!("Metadaten konnten nicht gelesen werden: {}", e),
reason: format!("Metadaten konnten nicht gelesen werden: {e}"),
})?;
let last_access_timestamp = metadata
.accessed()
.map_err(|e| DatabaseError::IoError {
path: path.to_string_lossy().to_string(),
reason: format!("Zugriffszeit konnte nicht gelesen werden: {}", e),
reason: format!("Zugriffszeit konnte nicht gelesen werden: {e}"),
})?
.duration_since(UNIX_EPOCH)
.unwrap_or_default() // Fallback for the rare case of a time before 1970
@ -212,12 +212,63 @@ pub fn vault_exists(app_handle: AppHandle, vault_name: String) -> Result<bool, D
Ok(Path::new(&vault_path).exists())
}
/// Deletes a vault database file
/// Moves a vault database file to trash (or deletes permanently if trash is unavailable)
#[tauri::command]
pub fn move_vault_to_trash(
app_handle: AppHandle,
vault_name: String,
) -> Result<String, DatabaseError> {
// On Android, trash is not available, so delete permanently
#[cfg(target_os = "android")]
{
println!(
"Android platform detected, permanently deleting vault '{}'",
vault_name
);
return delete_vault(app_handle, vault_name);
}
// On non-Android platforms, try to use trash
#[cfg(not(target_os = "android"))]
{
let vault_path = get_vault_path(&app_handle, &vault_name)?;
let vault_shm_path = format!("{vault_path}-shm");
let vault_wal_path = format!("{vault_path}-wal");
if !Path::new(&vault_path).exists() {
return Err(DatabaseError::IoError {
path: vault_path,
reason: "Vault does not exist".to_string(),
});
}
// Try to move to trash first (works on desktop systems)
let moved_to_trash = trash::delete(&vault_path).is_ok();
if moved_to_trash {
// Also try to move auxiliary files to trash (ignore errors as they might not exist)
let _ = trash::delete(&vault_shm_path);
let _ = trash::delete(&vault_wal_path);
Ok(format!(
"Vault '{vault_name}' successfully moved to trash"
))
} else {
// Fallback: Permanent deletion if trash fails
println!(
"Trash not available, falling back to permanent deletion for vault '{vault_name}'"
);
delete_vault(app_handle, vault_name)
}
}
}
/// Deletes a vault database file permanently (bypasses trash)
#[tauri::command]
pub fn delete_vault(app_handle: AppHandle, vault_name: String) -> Result<String, DatabaseError> {
let vault_path = get_vault_path(&app_handle, &vault_name)?;
let vault_shm_path = format!("{}-shm", vault_path);
let vault_wal_path = format!("{}-wal", vault_path);
let vault_shm_path = format!("{vault_path}-shm");
let vault_wal_path = format!("{vault_path}-wal");
if !Path::new(&vault_path).exists() {
return Err(DatabaseError::IoError {
@ -229,23 +280,23 @@ pub fn delete_vault(app_handle: AppHandle, vault_name: String) -> Result<String,
if Path::new(&vault_shm_path).exists() {
fs::remove_file(&vault_shm_path).map_err(|e| DatabaseError::IoError {
path: vault_shm_path.clone(),
reason: format!("Failed to delete vault: {}", e),
reason: format!("Failed to delete vault: {e}"),
})?;
}
if Path::new(&vault_wal_path).exists() {
fs::remove_file(&vault_wal_path).map_err(|e| DatabaseError::IoError {
path: vault_wal_path.clone(),
reason: format!("Failed to delete vault: {}", e),
reason: format!("Failed to delete vault: {e}"),
})?;
}
fs::remove_file(&vault_path).map_err(|e| DatabaseError::IoError {
path: vault_path.clone(),
reason: format!("Failed to delete vault: {}", e),
reason: format!("Failed to delete vault: {e}"),
})?;
Ok(format!("Vault '{}' successfully deleted", vault_name))
Ok(format!("Vault '{vault_name}' successfully deleted"))
}
#[tauri::command]
@ -255,16 +306,16 @@ pub fn create_encrypted_database(
key: String,
state: State<'_, AppState>,
) -> Result<String, DatabaseError> {
println!("Creating encrypted vault with name: {}", vault_name);
println!("Creating encrypted vault with name: {vault_name}");
let vault_path = get_vault_path(&app_handle, &vault_name)?;
println!("Resolved vault path: {}", vault_path);
println!("Resolved vault path: {vault_path}");
// Check whether a vault with this name already exists
if Path::new(&vault_path).exists() {
return Err(DatabaseError::IoError {
path: vault_path,
reason: format!("A vault with the name '{}' already exists", vault_name),
reason: format!("A vault with the name '{vault_name}' already exists"),
});
}
/* let resource_path = app_handle
@ -276,7 +327,7 @@ pub fn create_encrypted_database(
.path()
.resolve("database/vault.db", BaseDirectory::Resource)
.map_err(|e| DatabaseError::PathResolutionError {
reason: format!("Failed to resolve template database: {}", e),
reason: format!("Failed to resolve template database: {e}"),
})?;
let template_content =
@ -285,20 +336,20 @@ pub fn create_encrypted_database(
.read(&template_path)
.map_err(|e| DatabaseError::IoError {
path: template_path.display().to_string(),
reason: format!("Failed to read template database from resources: {}", e),
reason: format!("Failed to read template database from resources: {e}"),
})?;
let temp_path = app_handle
.path()
.resolve("temp_vault.db", BaseDirectory::AppLocalData)
.map_err(|e| DatabaseError::PathResolutionError {
reason: format!("Failed to resolve temp database: {}", e),
reason: format!("Failed to resolve temp database: {e}"),
})?;
let temp_path_clone = temp_path.to_owned();
fs::write(temp_path, template_content).map_err(|e| DatabaseError::IoError {
path: vault_path.to_string(),
reason: format!("Failed to write temporary template database: {}", e),
reason: format!("Failed to write temporary template database: {e}"),
})?;
/* if !template_path.exists() {
return Err(DatabaseError::IoError {
@ -311,8 +362,7 @@ pub fn create_encrypted_database(
let conn = Connection::open(&temp_path_clone).map_err(|e| DatabaseError::ConnectionFailed {
path: temp_path_clone.display().to_string(),
reason: format!(
"Fehler beim Öffnen der unverschlüsselten Quelldatenbank: {}",
e
"Fehler beim Öffnen der unverschlüsselten Quelldatenbank: {e}"
),
})?;
@ -340,7 +390,7 @@ pub fn create_encrypted_database(
let _ = fs::remove_file(&vault_path);
let _ = fs::remove_file(&temp_path_clone);
return Err(DatabaseError::QueryError {
reason: format!("Fehler während sqlcipher_export: {}", e),
reason: format!("Fehler während sqlcipher_export: {e}"),
});
}
@ -365,11 +415,11 @@ pub fn create_encrypted_database(
Ok(version)
}) {
Ok(version) => {
println!("SQLCipher ist aktiv! Version: {}", version);
println!("SQLCipher ist aktiv! Version: {version}");
}
Err(e) => {
eprintln!("FEHLER: SQLCipher scheint NICHT aktiv zu sein!");
eprintln!("Der Befehl 'PRAGMA cipher_version;' schlug fehl: {}", e);
eprintln!("Der Befehl 'PRAGMA cipher_version;' schlug fehl: {e}");
eprintln!("Die Datenbank wurde wahrscheinlich NICHT verschlüsselt.");
}
}
@ -377,7 +427,7 @@ pub fn create_encrypted_database(
conn.close()
.map_err(|(_, e)| DatabaseError::ConnectionFailed {
path: template_path.display().to_string(),
reason: format!("Fehler beim Schließen der Quelldatenbank: {}", e),
reason: format!("Fehler beim Schließen der Quelldatenbank: {e}"),
})?;
let _ = fs::remove_file(&temp_path_clone);
@ -394,22 +444,19 @@ pub fn open_encrypted_database(
key: String,
state: State<'_, AppState>,
) -> Result<String, DatabaseError> {
println!("Opening encrypted database vault_path: {}", vault_path);
// Derive the vault path from the name
//let vault_path = get_vault_path(&app_handle, &vault_name)?;
println!("Resolved vault path: {}", vault_path);
println!("Opening encrypted database vault_path: {vault_path}");
println!("Resolved vault path: {vault_path}");
if !Path::new(&vault_path).exists() {
return Err(DatabaseError::IoError {
path: vault_path.to_string(),
reason: format!("Vault '{}' does not exist", vault_path),
reason: format!("Vault '{vault_path}' does not exist"),
});
}
initialize_session(&app_handle, &vault_path, &key, &state)?;
Ok(format!("Vault '{}' opened successfully", vault_path))
Ok(format!("Vault '{vault_path}' opened successfully"))
}
/// Opens the DB, initializes the HLC service, and stores both in the AppState.
@ -461,8 +508,7 @@ fn initialize_session(
eprintln!("INFO: Setting 'triggers_initialized' flag via CRDT...");
let insert_sql = format!(
"INSERT INTO {} (id, key, type, value) VALUES (?, ?, ?, ?)",
TABLE_SETTINGS
"INSERT INTO {TABLE_SETTINGS} (id, key, type, value) VALUES (?, ?, ?, ?)"
);
// execute_with_crdt expects Vec<JsonValue>, not the params! macro
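As a side note on the vault deletion flow above, a condensed sketch of the "trash first, permanent deletion as fallback" pattern. delete_permanently is a hypothetical stand-in for the existing delete_vault logic, and the -shm/-wal sidecar handling mirrors the code above.

use std::fs;
use std::path::Path;

fn remove_vault_file(vault_path: &str) -> std::io::Result<String> {
    #[cfg(not(target_os = "android"))]
    if trash::delete(vault_path).is_ok() {
        // Sidecar files may not exist; ignore errors for them.
        let _ = trash::delete(format!("{vault_path}-shm"));
        let _ = trash::delete(format!("{vault_path}-wal"));
        return Ok(format!("'{vault_path}' moved to trash"));
    }
    // Android, or trash unavailable: fall back to permanent deletion.
    delete_permanently(vault_path)
}

fn delete_permanently(vault_path: &str) -> std::io::Result<String> {
    for sidecar in [format!("{vault_path}-shm"), format!("{vault_path}-wal")] {
        if Path::new(&sidecar).exists() {
            fs::remove_file(&sidecar)?;
        }
    }
    fs::remove_file(vault_path)?;
    Ok(format!("'{vault_path}' deleted permanently"))
}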

View File

@ -2,7 +2,7 @@ use crate::database::core::with_connection;
use crate::database::error::DatabaseError;
use crate::extension::core::manifest::{EditablePermissions, ExtensionManifest, ExtensionPreview};
use crate::extension::core::types::{copy_directory, Extension, ExtensionSource};
use crate::extension::core::ExtensionPermissions;
use crate::extension::core::{DisplayMode, ExtensionPermissions};
use crate::extension::crypto::ExtensionCrypto;
use crate::extension::database::executor::SqlExecutor;
use crate::extension::error::ExtensionError;
@ -10,10 +10,8 @@ use crate::extension::permissions::manager::PermissionManager;
use crate::extension::permissions::types::ExtensionPermission;
use crate::table_names::{TABLE_EXTENSIONS, TABLE_EXTENSION_PERMISSIONS};
use crate::AppState;
use serde_json::Value as JsonValue;
use std::collections::{HashMap, HashSet};
use std::collections::HashMap;
use std::fs;
use std::io::Cursor;
use std::path::PathBuf;
use std::sync::Mutex;
use std::time::{Duration, SystemTime};
@ -66,40 +64,159 @@ impl ExtensionManager {
Self::default()
}
/// Helper function to validate path and check for path traversal
/// Returns the cleaned path if valid, or None if invalid/not found
/// If require_exists is true, returns None if path doesn't exist
pub fn validate_path_in_directory(
base_dir: &PathBuf,
relative_path: &str,
require_exists: bool,
) -> Result<Option<PathBuf>, ExtensionError> {
// Check for path traversal patterns
if relative_path.contains("..") {
return Err(ExtensionError::SecurityViolation {
reason: format!("Path traversal attempt: {relative_path}"),
});
}
// Clean the path (same logic as in protocol.rs)
let clean_path = relative_path
.replace('\\', "/")
.trim_start_matches('/')
.split('/')
.filter(|&part| !part.is_empty() && part != "." && part != "..")
.collect::<PathBuf>();
let full_path = base_dir.join(&clean_path);
// Check if file/directory exists (if required)
if require_exists && !full_path.exists() {
return Ok(None);
}
// Verify path is within base directory
let canonical_base = base_dir
.canonicalize()
.map_err(|e| ExtensionError::Filesystem { source: e })?;
if let Ok(canonical_path) = full_path.canonicalize() {
if !canonical_path.starts_with(&canonical_base) {
return Err(ExtensionError::SecurityViolation {
reason: format!("Path outside base directory: {relative_path}"),
});
}
Ok(Some(canonical_path))
} else {
// Path doesn't exist yet - still validate it would be within base
if full_path.starts_with(&canonical_base) {
Ok(Some(full_path))
} else {
Err(ExtensionError::SecurityViolation {
reason: format!("Path outside base directory: {relative_path}"),
})
}
}
}
/// Validates icon path and falls back to favicon.ico if not specified
fn validate_and_resolve_icon_path(
extension_dir: &PathBuf,
haextension_dir: &str,
icon_path: Option<&str>,
) -> Result<Option<String>, ExtensionError> {
// If icon is specified in manifest, validate it
if let Some(icon) = icon_path {
if let Some(clean_path) = Self::validate_path_in_directory(extension_dir, icon, true)? {
return Ok(Some(clean_path.to_string_lossy().to_string()));
} else {
eprintln!("WARNING: Icon path specified in manifest not found: {icon}");
// Continue to fallback logic
}
}
// Fallback 1: Check haextension/favicon.ico
let haextension_favicon = format!("{haextension_dir}/favicon.ico");
if let Some(clean_path) =
Self::validate_path_in_directory(extension_dir, &haextension_favicon, true)?
{
return Ok(Some(clean_path.to_string_lossy().to_string()));
}
// Fallback 2: Check public/favicon.ico
if let Some(clean_path) =
Self::validate_path_in_directory(extension_dir, "public/favicon.ico", true)?
{
return Ok(Some(clean_path.to_string_lossy().to_string()));
}
// No icon found
Ok(None)
}
/// Extracts an extension ZIP file and validates the manifest
fn extract_and_validate_extension(
bytes: Vec<u8>,
temp_prefix: &str,
app_handle: &AppHandle,
) -> Result<ExtractedExtension, ExtensionError> {
let temp = std::env::temp_dir().join(format!("{}_{}", temp_prefix, uuid::Uuid::new_v4()));
// Use app_cache_dir for better Android compatibility
let cache_dir =
app_handle
.path()
.app_cache_dir()
.map_err(|e| ExtensionError::InstallationFailed {
reason: format!("Cannot get app cache dir: {e}"),
})?;
let temp_id = uuid::Uuid::new_v4();
let temp = cache_dir.join(format!("{temp_prefix}_{temp_id}"));
let zip_file_path = cache_dir.join(format!(
"{}_{}_{}.haextension",
temp_prefix, temp_id, "temp"
));
// Write bytes to a temporary ZIP file first (important for Android file system)
fs::write(&zip_file_path, &bytes).map_err(|e| {
ExtensionError::filesystem_with_path(zip_file_path.display().to_string(), e)
})?;
// Create extraction directory
fs::create_dir_all(&temp)
.map_err(|e| ExtensionError::filesystem_with_path(temp.display().to_string(), e))?;
let mut archive = ZipArchive::new(Cursor::new(bytes)).map_err(|e| {
ExtensionError::InstallationFailed {
reason: format!("Invalid ZIP: {}", e),
}
// Open ZIP file from disk (more reliable on Android than from memory)
let zip_file = fs::File::open(&zip_file_path).map_err(|e| {
ExtensionError::filesystem_with_path(zip_file_path.display().to_string(), e)
})?;
let mut archive =
ZipArchive::new(zip_file).map_err(|e| ExtensionError::InstallationFailed {
reason: format!("Invalid ZIP: {e}"),
})?;
archive
.extract(&temp)
.map_err(|e| ExtensionError::InstallationFailed {
reason: format!("Cannot extract ZIP: {}", e),
reason: format!("Cannot extract ZIP: {e}"),
})?;
// Clean up temporary ZIP file
let _ = fs::remove_file(&zip_file_path);
// Read haextension_dir from config if it exists, otherwise use default
let config_path = temp.join("haextension.config.json");
let haextension_dir = if config_path.exists() {
let config_content = std::fs::read_to_string(&config_path)
.map_err(|e| ExtensionError::ManifestError {
reason: format!("Cannot read haextension.config.json: {}", e),
})?;
let config_content = std::fs::read_to_string(&config_path).map_err(|e| {
ExtensionError::ManifestError {
reason: format!("Cannot read haextension.config.json: {e}"),
}
})?;
let config: serde_json::Value = serde_json::from_str(&config_content)
.map_err(|e| ExtensionError::ManifestError {
reason: format!("Invalid haextension.config.json: {}", e),
})?;
let config: serde_json::Value = serde_json::from_str(&config_content).map_err(|e| {
ExtensionError::ManifestError {
reason: format!("Invalid haextension.config.json: {e}"),
}
})?;
let dir = config
.get("dev")
@ -108,56 +225,40 @@ impl ExtensionManager {
.unwrap_or("haextension")
.to_string();
// Security: Validate that haextension_dir doesn't contain ".." for path traversal
if dir.contains("..") {
return Err(ExtensionError::ManifestError {
reason: "Invalid haextension_dir: path traversal with '..' not allowed".to_string(),
});
}
dir
} else {
"haextension".to_string()
};
// Build the manifest path
let manifest_path = temp.join(&haextension_dir).join("manifest.json");
// Ensure the resolved path is still within temp directory (safety check against path traversal)
let canonical_temp = temp.canonicalize()
.map_err(|e| ExtensionError::Filesystem { source: e })?;
// Only check if manifest_path parent exists to avoid errors
if let Some(parent) = manifest_path.parent() {
if let Ok(canonical_manifest_dir) = parent.canonicalize() {
if !canonical_manifest_dir.starts_with(&canonical_temp) {
return Err(ExtensionError::ManifestError {
reason: "Security violation: manifest path outside extension directory".to_string(),
});
}
}
}
// Check if manifest exists
if !manifest_path.exists() {
return Err(ExtensionError::ManifestError {
reason: format!("manifest.json not found at {}/manifest.json", haextension_dir),
});
}
// Validate manifest path using helper function
let manifest_relative_path = format!("{haextension_dir}/manifest.json");
let manifest_path = Self::validate_path_in_directory(&temp, &manifest_relative_path, true)?
.ok_or_else(|| ExtensionError::ManifestError {
reason: format!("manifest.json not found at {haextension_dir}/manifest.json"),
})?;
let actual_dir = temp.clone();
let manifest_content =
std::fs::read_to_string(&manifest_path).map_err(|e| ExtensionError::ManifestError {
reason: format!("Cannot read manifest: {}", e),
reason: format!("Cannot read manifest: {e}"),
})?;
let manifest: ExtensionManifest = serde_json::from_str(&manifest_content)?;
let mut manifest: ExtensionManifest = serde_json::from_str(&manifest_content)?;
let content_hash = ExtensionCrypto::hash_directory(&actual_dir, &manifest_path).map_err(|e| {
ExtensionError::SignatureVerificationFailed {
reason: e.to_string(),
}
})?;
// Validate and resolve icon path with fallback logic
let validated_icon = Self::validate_and_resolve_icon_path(
&actual_dir,
&haextension_dir,
manifest.icon.as_deref(),
)?;
manifest.icon = validated_icon;
let content_hash =
ExtensionCrypto::hash_directory(&actual_dir, &manifest_path).map_err(|e| {
ExtensionError::SignatureVerificationFailed {
reason: e.to_string(),
}
})?;
Ok(ExtractedExtension {
temp_dir: actual_dir,
@ -350,10 +451,7 @@ impl ExtensionManager {
})?;
eprintln!("DEBUG: Removing extension with ID: {}", extension.id);
eprintln!(
"DEBUG: Extension name: {}, version: {}",
extension_name, extension_version
);
eprintln!("DEBUG: Extension name: {extension_name}, version: {extension_version}");
// Delete permissions and the extension entry in a single transaction
with_connection(&state.db, |conn| {
@ -371,7 +469,7 @@ impl ExtensionManager {
PermissionManager::delete_permissions_in_transaction(&tx, &hlc_service, &extension.id)?;
// Delete the extension entry by extension_id
let sql = format!("DELETE FROM {} WHERE id = ?", TABLE_EXTENSIONS);
let sql = format!("DELETE FROM {TABLE_EXTENSIONS} WHERE id = ?");
eprintln!("DEBUG: Executing SQL: {} with id = {}", sql, extension.id);
SqlExecutor::execute_internal_typed(
&tx,
@ -427,9 +525,11 @@ impl ExtensionManager {
pub async fn preview_extension_internal(
&self,
app_handle: &AppHandle,
file_bytes: Vec<u8>,
) -> Result<ExtensionPreview, ExtensionError> {
let extracted = Self::extract_and_validate_extension(file_bytes, "haexhub_preview")?;
let extracted =
Self::extract_and_validate_extension(file_bytes, "haexhub_preview", app_handle)?;
let is_valid_signature = ExtensionCrypto::verify_signature(
&extracted.manifest.public_key,
@ -454,7 +554,8 @@ impl ExtensionManager {
custom_permissions: EditablePermissions,
state: &State<'_, AppState>,
) -> Result<String, ExtensionError> {
let extracted = Self::extract_and_validate_extension(file_bytes, "haexhub_ext")?;
let extracted =
Self::extract_and_validate_extension(file_bytes, "haexhub_ext", &app_handle)?;
// Verify the signature (installation raises an error instead of only checking)
ExtensionCrypto::verify_signature(
@ -525,33 +626,33 @@ impl ExtensionManager {
// 1. Create the extension entry with a generated UUID
let insert_ext_sql = format!(
"INSERT INTO {} (id, name, version, author, entry, icon, public_key, signature, homepage, description, enabled) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)",
TABLE_EXTENSIONS
"INSERT INTO {TABLE_EXTENSIONS} (id, name, version, author, entry, icon, public_key, signature, homepage, description, enabled, single_instance, display_mode) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)"
);
SqlExecutor::execute_internal_typed(
&tx,
&hlc_service,
&insert_ext_sql,
rusqlite::params![
extension_id,
extracted.manifest.name,
extracted.manifest.version,
extracted.manifest.author,
extracted.manifest.entry,
extracted.manifest.icon,
extracted.manifest.public_key,
extracted.manifest.signature,
extracted.manifest.homepage,
extracted.manifest.description,
true, // enabled
],
)?;
&tx,
&hlc_service,
&insert_ext_sql,
rusqlite::params![
extension_id,
extracted.manifest.name,
extracted.manifest.version,
extracted.manifest.author,
extracted.manifest.entry,
extracted.manifest.icon,
extracted.manifest.public_key,
extracted.manifest.signature,
extracted.manifest.homepage,
extracted.manifest.description,
true, // enabled
extracted.manifest.single_instance.unwrap_or(false),
extracted.manifest.display_mode.as_ref().map(|dm| format!("{:?}", dm).to_lowercase()).unwrap_or_else(|| "auto".to_string()),
],
)?;
// 2. Store the permissions
let insert_perm_sql = format!(
"INSERT INTO {} (id, extension_id, resource_type, action, target, constraints, status) VALUES (?, ?, ?, ?, ?, ?, ?)",
TABLE_EXTENSION_PERMISSIONS
"INSERT INTO {TABLE_EXTENSION_PERMISSIONS} (id, extension_id, resource_type, action, target, constraints, status) VALUES (?, ?, ?, ?, ?, ?, ?)"
);
for perm in &permissions {
@ -623,10 +724,9 @@ impl ExtensionManager {
// Load all data from the database
let extensions = with_connection(&state.db, |conn| {
let sql = format!(
"SELECT id, name, version, author, entry, icon, public_key, signature, homepage, description, enabled FROM {}",
TABLE_EXTENSIONS
"SELECT id, name, version, author, entry, icon, public_key, signature, homepage, description, enabled, single_instance, display_mode FROM {TABLE_EXTENSIONS}"
);
eprintln!("DEBUG: SQL Query before transformation: {}", sql);
eprintln!("DEBUG: SQL Query before transformation: {sql}");
let results = SqlExecutor::query_select(conn, &sql, &[])?;
eprintln!("DEBUG: Query returned {} results", results.len());
@ -655,13 +755,21 @@ impl ExtensionManager {
})?
.to_string(),
author: row[3].as_str().map(String::from),
entry: row[4].as_str().unwrap_or("index.html").to_string(),
entry: row[4].as_str().map(String::from),
icon: row[5].as_str().map(String::from),
public_key: row[6].as_str().unwrap_or("").to_string(),
signature: row[7].as_str().unwrap_or("").to_string(),
permissions: ExtensionPermissions::default(),
homepage: row[8].as_str().map(String::from),
description: row[9].as_str().map(String::from),
single_instance: row[11]
.as_bool()
.or_else(|| row[11].as_i64().map(|v| v != 0)),
display_mode: row[12].as_str().and_then(|s| match s {
"window" => Some(DisplayMode::Window),
"iframe" => Some(DisplayMode::Iframe),
"auto" | _ => Some(DisplayMode::Auto),
}),
};
let enabled = row[10]
@ -685,7 +793,7 @@ impl ExtensionManager {
for extension_data in extensions {
let extension_id = extension_data.id;
eprintln!("DEBUG: Processing extension: {}", extension_id);
eprintln!("DEBUG: Processing extension: {extension_id}");
// Use public_key/name/version path structure
let extension_path = self.get_extension_dir(
@ -698,8 +806,7 @@ impl ExtensionManager {
// Check if extension directory exists
if !extension_path.exists() {
eprintln!(
"DEBUG: Extension directory missing for: {} at {:?}",
extension_id, extension_path
"DEBUG: Extension directory missing for: {extension_id} at {extension_path:?}"
);
self.missing_extensions
.lock()
@ -721,35 +828,12 @@ impl ExtensionManager {
match std::fs::read_to_string(&config_path) {
Ok(config_content) => {
match serde_json::from_str::<serde_json::Value>(&config_content) {
Ok(config) => {
let dir = config
.get("dev")
.and_then(|dev| dev.get("haextension_dir"))
.and_then(|dir| dir.as_str())
.unwrap_or("haextension")
.to_string();
// Security: Validate that haextension_dir doesn't contain ".."
if dir.contains("..") {
eprintln!(
"DEBUG: Invalid haextension_dir for: {}, contains '..'",
extension_id
);
self.missing_extensions
.lock()
.map_err(|e| ExtensionError::MutexPoisoned {
reason: e.to_string(),
})?
.push(MissingExtension {
id: extension_id.clone(),
public_key: extension_data.manifest.public_key.clone(),
name: extension_data.manifest.name.clone(),
version: extension_data.manifest.version.clone(),
});
continue;
}
dir
}
Ok(config) => config
.get("dev")
.and_then(|dev| dev.get("haextension_dir"))
.and_then(|dir| dir.as_str())
.unwrap_or("haextension")
.to_string(),
Err(_) => "haextension".to_string(),
}
}
@ -759,12 +843,13 @@ impl ExtensionManager {
"haextension".to_string()
};
// Check if manifest.json exists in the haextension_dir
let manifest_path = extension_path.join(&haextension_dir).join("manifest.json");
if !manifest_path.exists() {
// Validate manifest.json path using helper function
let manifest_relative_path = format!("{haextension_dir}/manifest.json");
if Self::validate_path_in_directory(&extension_path, &manifest_relative_path, true)?
.is_none()
{
eprintln!(
"DEBUG: manifest.json missing for: {} at {:?}",
extension_id, manifest_path
"DEBUG: manifest.json missing or invalid for: {extension_id} at {haextension_dir}/manifest.json"
);
self.missing_extensions
.lock()
@ -780,7 +865,7 @@ impl ExtensionManager {
continue;
}
eprintln!("DEBUG: Extension loaded successfully: {}", extension_id);
eprintln!("DEBUG: Extension loaded successfully: {extension_id}");
let extension = Extension {
id: extension_id.clone(),
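The new validate_path_in_directory helper above combines three defenses: reject "..", rebuild the path from safe segments only, then confirm the canonicalized result still lives under the base directory. A condensed sketch of the same idea, simplified to paths that already exist (rejections are reported as None here rather than as a security error):

use std::path::{Path, PathBuf};

fn safe_join(base: &Path, relative: &str) -> std::io::Result<Option<PathBuf>> {
    // Reject explicit parent-directory components outright.
    if relative.contains("..") {
        return Ok(None);
    }
    // Rebuild the path from safe segments only (mirrors the cleaning above).
    let clean: PathBuf = relative
        .replace('\\', "/")
        .trim_start_matches('/')
        .split('/')
        .filter(|part| !part.is_empty() && *part != "." && *part != "..")
        .collect();
    let candidate = base.join(clean);
    if !candidate.exists() {
        return Ok(None);
    }
    // Canonicalize both sides and confirm containment.
    let canonical_base = base.canonicalize()?;
    let canonical = candidate.canonicalize()?;
    Ok(canonical.starts_with(&canonical_base).then_some(canonical))
}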

View File

@ -1,6 +1,6 @@
use crate::extension::error::ExtensionError;
use crate::extension::permissions::types::{
Action, DbAction, ExtensionPermission, FsAction, HttpAction, PermissionConstraints,
Action, DbAction, ExtensionPermission, FsAction, WebAction, PermissionConstraints,
PermissionStatus, ResourceType, ShellAction,
};
use serde::{Deserialize, Serialize};
@ -13,7 +13,8 @@ use ts_rs::TS;
pub struct PermissionEntry {
pub target: String,
/// The action to perform (e.g. "read", "read_write", "GET", "execute").
/// The action to perform (e.g. "read", "read_write", "execute").
/// For web permissions this is optional and is ignored.
#[serde(default, skip_serializing_if = "Option::is_none")]
pub operation: Option<String>,
@ -51,19 +52,51 @@ pub struct ExtensionPermissions {
/// Type alias for readability when the struct is used as a UI model.
pub type EditablePermissions = ExtensionPermissions;
#[derive(Serialize, Deserialize, Clone, Debug, TS)]
#[ts(export)]
#[serde(rename_all = "lowercase")]
pub enum DisplayMode {
/// Platform decides: Desktop = window, Mobile/Web = iframe (default)
Auto,
/// Always open in native window (if available, falls back to iframe)
Window,
/// Always open in iframe (embedded in main app)
Iframe,
}
impl Default for DisplayMode {
fn default() -> Self {
Self::Auto
}
}
#[derive(Serialize, Deserialize, Clone, Debug, TS)]
#[ts(export)]
pub struct ExtensionManifest {
pub name: String,
#[serde(default = "default_version_value")]
pub version: String,
pub author: Option<String>,
pub entry: String,
#[serde(default = "default_entry_value")]
pub entry: Option<String>,
pub icon: Option<String>,
pub public_key: String,
pub signature: String,
pub permissions: ExtensionPermissions,
pub homepage: Option<String>,
pub description: Option<String>,
#[serde(default)]
pub single_instance: Option<bool>,
#[serde(default)]
pub display_mode: Option<DisplayMode>,
}
fn default_entry_value() -> Option<String> {
Some("index.html".to_string())
}
fn default_version_value() -> String {
"0.0.0-dev".to_string()
}
impl ExtensionManifest {
@ -110,7 +143,7 @@ impl ExtensionPermissions {
}
if let Some(entries) = &self.http {
for p in entries {
if let Some(perm) = Self::create_internal(extension_id, ResourceType::Http, p) {
if let Some(perm) = Self::create_internal(extension_id, ResourceType::Web, p) {
permissions.push(perm);
}
}
@ -139,7 +172,14 @@ impl ExtensionPermissions {
ResourceType::Fs => FsAction::from_str(operation_str)
.ok()
.map(Action::Filesystem),
ResourceType::Http => HttpAction::from_str(operation_str).ok().map(Action::Http),
ResourceType::Web => {
// For web permissions, operation is optional - default to All
if operation_str.is_empty() {
Some(Action::Web(WebAction::All))
} else {
WebAction::from_str(operation_str).ok().map(Action::Web)
}
}
ResourceType::Shell => ShellAction::from_str(operation_str).ok().map(Action::Shell),
};
@ -172,6 +212,9 @@ pub struct ExtensionInfoResponse {
pub description: Option<String>,
pub homepage: Option<String>,
pub icon: Option<String>,
pub entry: Option<String>,
pub single_instance: Option<bool>,
pub display_mode: Option<DisplayMode>,
#[serde(skip_serializing_if = "Option::is_none")]
pub dev_server_url: Option<String>,
}
@ -197,6 +240,9 @@ impl ExtensionInfoResponse {
description: extension.manifest.description.clone(),
homepage: extension.manifest.homepage.clone(),
icon: extension.manifest.icon.clone(),
entry: extension.manifest.entry.clone(),
single_instance: extension.manifest.single_instance,
display_mode: extension.manifest.display_mode.clone(),
dev_server_url,
})
}
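With the serde defaults above, a manifest that omits entry, single_instance, and display_mode still deserializes. A trimmed-down sketch of that behavior; the field set is reduced for illustration, and the real ExtensionManifest also requires public_key, signature, permissions, and more.

use serde::Deserialize;

#[derive(Deserialize, Debug)]
#[serde(rename_all = "lowercase")]
enum DisplayMode {
    Auto,
    Window,
    Iframe,
}

#[derive(Deserialize, Debug)]
struct ManifestSketch {
    name: String,
    #[serde(default = "default_entry")]
    entry: Option<String>,
    #[serde(default)]
    single_instance: Option<bool>,
    #[serde(default)]
    display_mode: Option<DisplayMode>,
}

fn default_entry() -> Option<String> {
    Some("index.html".to_string())
}

fn main() {
    let m: ManifestSketch = serde_json::from_str(r#"{ "name": "nuxt-app" }"#).unwrap();
    assert_eq!(m.name, "nuxt-app");
    assert_eq!(m.entry.as_deref(), Some("index.html"));
    assert!(m.single_instance.is_none());
    assert!(m.display_mode.is_none()); // callers fall back to Auto when unset
}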

View File

@ -25,10 +25,10 @@ lazy_static::lazy_static! {
#[derive(Deserialize, Serialize, Debug, Clone)]
#[serde(rename_all = "camelCase")]
struct ExtensionInfo {
public_key: String,
name: String,
version: String,
pub struct ExtensionInfo {
pub public_key: String,
pub name: String,
pub version: String,
}
#[derive(Debug)]
@ -42,12 +42,12 @@ enum DataProcessingError {
impl fmt::Display for DataProcessingError {
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
match self {
DataProcessingError::HexDecoding(e) => write!(f, "Hex-Dekodierungsfehler: {}", e),
DataProcessingError::HexDecoding(e) => write!(f, "Hex-Dekodierungsfehler: {e}"),
DataProcessingError::Utf8Conversion(e) => {
write!(f, "UTF-8-Konvertierungsfehler: {}", e)
write!(f, "UTF-8-Konvertierungsfehler: {e}")
}
DataProcessingError::JsonParsing(e) => write!(f, "JSON-Parsing-Fehler: {}", e),
DataProcessingError::Custom(msg) => write!(f, "Datenverarbeitungsfehler: {}", msg),
DataProcessingError::JsonParsing(e) => write!(f, "JSON-Parsing-Fehler: {e}"),
DataProcessingError::Custom(msg) => write!(f, "Datenverarbeitungsfehler: {msg}"),
}
}
}
@ -101,7 +101,7 @@ pub fn resolve_secure_extension_asset_path(
.all(|c| c.is_ascii_alphanumeric() || c == '-')
{
return Err(ExtensionError::ValidationError {
reason: format!("Invalid extension name: {}", extension_name),
reason: format!("Invalid extension name: {extension_name}"),
});
}
@ -111,7 +111,7 @@ pub fn resolve_secure_extension_asset_path(
.all(|c| c.is_ascii_alphanumeric() || c == '-' || c == '.')
{
return Err(ExtensionError::ValidationError {
reason: format!("Invalid extension version: {}", extension_version),
reason: format!("Invalid extension version: {extension_version}"),
});
}
@ -146,11 +146,10 @@ pub fn resolve_secure_extension_asset_path(
Ok(canonical_path)
} else {
eprintln!(
"SECURITY WARNING: Path traversal attempt blocked: {}",
requested_asset_path
"SECURITY WARNING: Path traversal attempt blocked: {requested_asset_path}"
);
Err(ExtensionError::SecurityViolation {
reason: format!("Path traversal attempt: {}", requested_asset_path),
reason: format!("Path traversal attempt: {requested_asset_path}"),
})
}
}
@ -159,11 +158,10 @@ pub fn resolve_secure_extension_asset_path(
Ok(final_path)
} else {
eprintln!(
"SECURITY WARNING: Invalid asset path: {}",
requested_asset_path
"SECURITY WARNING: Invalid asset path: {requested_asset_path}"
);
Err(ExtensionError::SecurityViolation {
reason: format!("Invalid asset path: {}", requested_asset_path),
reason: format!("Invalid asset path: {requested_asset_path}"),
})
}
}
@ -184,7 +182,7 @@ pub fn extension_protocol_handler(
// Only allow same-protocol requests or tauri origin
// For null/empty origin (initial load), use wildcard
let protocol_prefix = format!("{}://", EXTENSION_PROTOCOL_NAME);
let protocol_prefix = format!("{EXTENSION_PROTOCOL_NAME}://");
let allowed_origin = if origin.starts_with(&protocol_prefix) || origin == get_tauri_origin() {
origin
} else if origin.is_empty() || origin == "null" {
@ -216,9 +214,9 @@ pub fn extension_protocol_handler(
.and_then(|v| v.to_str().ok())
.unwrap_or("");
println!("Protokoll Handler für: {}", uri_ref);
println!("Origin: {}", origin);
println!("Referer: {}", referer);
println!("Protokoll Handler für: {uri_ref}");
println!("Origin: {origin}");
println!("Referer: {referer}");
let path_str = uri_ref.path();
@ -227,16 +225,16 @@ pub fn extension_protocol_handler(
// - Desktop: haex-extension://<base64>/{assetPath}
// - Android: http://localhost/{base64}/{assetPath}
let host = uri_ref.host().unwrap_or("");
println!("URI Host: {}", host);
println!("URI Host: {host}");
let (info, segments_after_version) = if host == "localhost" || host == format!("{}.localhost", EXTENSION_PROTOCOL_NAME).as_str() {
let (info, segments_after_version) = if host == "localhost" || host == format!("{EXTENSION_PROTOCOL_NAME}.localhost").as_str() {
// Android format: http://haex-extension.localhost/{base64}/{assetPath}
// Extract base64 from first path segment
println!("Android format detected: http://{}/...", host);
println!("Android format detected: http://{host}/...");
let mut segments_iter = path_str.split('/').filter(|s| !s.is_empty());
if let Some(first_segment) = segments_iter.next() {
println!("First path segment (base64): {}", first_segment);
println!("First path segment (base64): {first_segment}");
match BASE64_STANDARD.decode(first_segment) {
Ok(decoded_bytes) => match String::from_utf8(decoded_bytes) {
Ok(json_str) => match serde_json::from_str::<ExtensionInfo>(&json_str) {
@ -252,29 +250,29 @@ pub fn extension_protocol_handler(
(info, remaining)
}
Err(e) => {
eprintln!("Failed to parse JSON from base64 path: {}", e);
eprintln!("Failed to parse JSON from base64 path: {e}");
return Response::builder()
.status(400)
.header("Access-Control-Allow-Origin", allowed_origin)
.body(Vec::from(format!("Invalid extension info in base64 path: {}", e)))
.body(Vec::from(format!("Invalid extension info in base64 path: {e}")))
.map_err(|e| e.into());
}
},
Err(e) => {
eprintln!("Failed to decode UTF-8 from base64 path: {}", e);
eprintln!("Failed to decode UTF-8 from base64 path: {e}");
return Response::builder()
.status(400)
.header("Access-Control-Allow-Origin", allowed_origin)
.body(Vec::from(format!("Invalid UTF-8 in base64 path: {}", e)))
.body(Vec::from(format!("Invalid UTF-8 in base64 path: {e}")))
.map_err(|e| e.into());
}
},
Err(e) => {
eprintln!("Failed to decode base64 from path: {}", e);
eprintln!("Failed to decode base64 from path: {e}");
return Response::builder()
.status(400)
.header("Access-Control-Allow-Origin", allowed_origin)
.body(Vec::from(format!("Invalid base64 in path: {}", e)))
.body(Vec::from(format!("Invalid base64 in path: {e}")))
.map_err(|e| e.into());
}
}
@ -311,35 +309,35 @@ pub fn extension_protocol_handler(
(info, segments)
}
Err(e) => {
eprintln!("Failed to parse JSON from base64 host: {}", e);
eprintln!("Failed to parse JSON from base64 host: {e}");
return Response::builder()
.status(400)
.header("Access-Control-Allow-Origin", allowed_origin)
.body(Vec::from(format!("Invalid extension info in base64 host: {}", e)))
.body(Vec::from(format!("Invalid extension info in base64 host: {e}")))
.map_err(|e| e.into());
}
},
Err(e) => {
eprintln!("Failed to decode UTF-8 from base64 host: {}", e);
eprintln!("Failed to decode UTF-8 from base64 host: {e}");
return Response::builder()
.status(400)
.header("Access-Control-Allow-Origin", allowed_origin)
.body(Vec::from(format!("Invalid UTF-8 in base64 host: {}", e)))
.body(Vec::from(format!("Invalid UTF-8 in base64 host: {e}")))
.map_err(|e| e.into());
}
},
Err(e) => {
eprintln!("Failed to decode base64 host: {}", e);
eprintln!("Failed to decode base64 host: {e}");
return Response::builder()
.status(400)
.header("Access-Control-Allow-Origin", allowed_origin)
.body(Vec::from(format!("Invalid base64 in host: {}", e)))
.body(Vec::from(format!("Invalid base64 in host: {e}")))
.map_err(|e| e.into());
}
}
} else {
// No base64 host - use path-based parsing (for localhost/Android/Windows)
parse_extension_info_from_path(path_str, origin, uri_ref, referer, &allowed_origin)?
parse_extension_info_from_path(path_str, origin, uri_ref, referer)?
};
// Construct asset path from remaining segments
@ -353,8 +351,8 @@ pub fn extension_protocol_handler(
&raw_asset_path
};
println!("Path: {}", path_str);
println!("Asset to load: {}", asset_to_load);
println!("Path: {path_str}");
println!("Asset to load: {asset_to_load}");
let absolute_secure_path = resolve_secure_extension_asset_path(
app_handle,
@ -362,7 +360,7 @@ pub fn extension_protocol_handler(
&info.public_key,
&info.name,
&info.version,
&asset_to_load,
asset_to_load,
)?;
println!("Resolved path: {}", absolute_secure_path.display());
@ -497,7 +495,7 @@ fn parse_encoded_info_from_origin_or_uri_or_referer_or_cache(
if let Ok(hex) = parse_from_origin(origin) {
if let Ok(info) = process_hex_encoded_json(&hex) {
cache_extension_info(&info); // Cache setzen
println!("Parsed und gecached aus Origin: {}", hex);
println!("Parsed und gecached aus Origin: {hex}");
return Ok(info);
}
}
@ -507,17 +505,17 @@ fn parse_encoded_info_from_origin_or_uri_or_referer_or_cache(
if let Ok(hex) = parse_from_uri_path(uri_ref) {
if let Ok(info) = process_hex_encoded_json(&hex) {
cache_extension_info(&info); // Cache setzen
println!("Parsed und gecached aus URI: {}", hex);
println!("Parsed und gecached aus URI: {hex}");
return Ok(info);
}
}
println!("Fallback zu Referer-Parsing: {}", referer);
println!("Fallback zu Referer-Parsing: {referer}");
if !referer.is_empty() && referer != "null" {
if let Ok(hex) = parse_from_uri_string(referer) {
if let Ok(info) = process_hex_encoded_json(&hex) {
cache_extension_info(&info); // Cache setzen
println!("Parsed und gecached aus Referer: {}", hex);
println!("Parsed und gecached aus Referer: {hex}");
return Ok(info);
}
}
@ -609,29 +607,23 @@ fn validate_and_return_hex(segment: &str) -> Result<String, DataProcessingError>
Ok(segment.to_string())
}
fn encode_hex_for_log(info: &ExtensionInfo) -> String {
let json_str = serde_json::to_string(info).unwrap_or_default();
hex::encode(json_str.as_bytes())
}
// Helper function to parse extension info from path segments
fn parse_extension_info_from_path(
path_str: &str,
origin: &str,
uri_ref: &Uri,
referer: &str,
allowed_origin: &str,
) -> Result<(ExtensionInfo, Vec<String>), Box<dyn std::error::Error>> {
let mut segments_iter = path_str.split('/').filter(|s| !s.is_empty());
match (segments_iter.next(), segments_iter.next(), segments_iter.next()) {
(Some(public_key), Some(name), Some(version)) => {
println!("=== Extension Protocol Handler (path-based) ===");
println!("Full URI: {}", uri_ref);
println!("Full URI: {uri_ref}");
println!("Parsed from path segments:");
println!(" PublicKey: {}", public_key);
println!(" Name: {}", name);
println!(" Version: {}", version);
println!(" PublicKey: {public_key}");
println!(" Name: {name}");
println!(" Version: {version}");
let info = ExtensionInfo {
public_key: public_key.to_string(),
@ -653,7 +645,7 @@ fn parse_extension_info_from_path(
) {
Ok(decoded) => {
println!("=== Extension Protocol Handler (legacy hex format) ===");
println!("Full URI: {}", uri_ref);
println!("Full URI: {uri_ref}");
println!("Decoded info:");
println!(" PublicKey: {}", decoded.public_key);
println!(" Name: {}", decoded.name);
@ -670,8 +662,8 @@ fn parse_extension_info_from_path(
Ok((decoded, segments))
}
Err(e) => {
eprintln!("Fehler beim Parsen (alle Fallbacks): {}", e);
Err(format!("Ungültige Anfrage: {}", e).into())
eprintln!("Fehler beim Parsen (alle Fallbacks): {e}");
Err(format!("Ungültige Anfrage: {e}").into())
}
}
}
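Both the Android (path-based) and desktop (host-based) branches above decode the same base64-encoded JSON blob into an ExtensionInfo. A minimal sketch of that decoding step, with the error type simplified to String:

use base64::prelude::*;
use serde::Deserialize;

#[derive(Deserialize, Debug)]
#[serde(rename_all = "camelCase")]
struct ExtensionInfo {
    public_key: String,
    name: String,
    version: String,
}

fn decode_extension_info(segment: &str) -> Result<ExtensionInfo, String> {
    let bytes = BASE64_STANDARD
        .decode(segment)
        .map_err(|e| format!("Invalid base64: {e}"))?;
    let json = String::from_utf8(bytes).map_err(|e| format!("Invalid UTF-8: {e}"))?;
    serde_json::from_str(&json).map_err(|e| format!("Invalid extension info: {e}"))
}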

View File

@ -70,8 +70,7 @@ pub fn copy_directory(
use std::path::PathBuf;
println!(
"Kopiere Verzeichnis von '{}' nach '{}'",
source, destination
"Kopiere Verzeichnis von '{source}' nach '{destination}'"
);
let source_path = PathBuf::from(&source);
@ -81,7 +80,7 @@ pub fn copy_directory(
return Err(ExtensionError::Filesystem {
source: std::io::Error::new(
std::io::ErrorKind::NotFound,
format!("Source directory '{}' not found", source),
format!("Source directory '{source}' not found"),
),
});
}
@ -93,7 +92,7 @@ pub fn copy_directory(
fs_extra::dir::copy(&source_path, &destination_path, &options).map_err(|e| {
ExtensionError::Filesystem {
source: std::io::Error::new(std::io::ErrorKind::Other, e.to_string()),
source: std::io::Error::other(e.to_string()),
}
})?;
Ok(())

View File

@ -18,20 +18,20 @@ impl ExtensionCrypto {
signature_hex: &str,
) -> Result<(), String> {
let public_key_bytes =
hex::decode(public_key_hex).map_err(|e| format!("Invalid public key: {}", e))?;
hex::decode(public_key_hex).map_err(|e| format!("Invalid public key: {e}"))?;
let public_key = VerifyingKey::from_bytes(&public_key_bytes.try_into().unwrap())
.map_err(|e| format!("Invalid public key: {}", e))?;
.map_err(|e| format!("Invalid public key: {e}"))?;
let signature_bytes =
hex::decode(signature_hex).map_err(|e| format!("Invalid signature: {}", e))?;
hex::decode(signature_hex).map_err(|e| format!("Invalid signature: {e}"))?;
let signature = Signature::from_bytes(&signature_bytes.try_into().unwrap());
let content_hash =
hex::decode(content_hash_hex).map_err(|e| format!("Invalid content hash: {}", e))?;
hex::decode(content_hash_hex).map_err(|e| format!("Invalid content hash: {e}"))?;
public_key
.verify(&content_hash, &signature)
.map_err(|e| format!("Signature verification failed: {}", e))
.map_err(|e| format!("Signature verification failed: {e}"))
}
/// Computes the hash of a directory (for verification)
@ -48,7 +48,9 @@ impl ExtensionCrypto {
let relative = path.strip_prefix(dir)
.unwrap_or(&path)
.to_string_lossy()
.to_string();
.to_string()
// Normalize path separators to Unix style (/) for cross-platform consistency
.replace('\\', "/");
(relative, path)
})
.collect();
@ -56,16 +58,30 @@ impl ExtensionCrypto {
// 3. Sort by relative paths
relative_files.sort_by(|a, b| a.0.cmp(&b.0));
println!("=== Files to hash ({}): ===", relative_files.len());
for (rel, _) in &relative_files {
println!(" - {}", rel);
}
let mut hasher = Sha256::new();
// Canonicalize manifest path for comparison (important on Android where symlinks may differ)
// Also ensure the canonical path is still within the allowed directory (security check)
let canonical_manifest_path = manifest_path.canonicalize()
.unwrap_or_else(|_| manifest_path.to_path_buf());
// Security: Verify canonical manifest path is still within dir
let canonical_dir = dir.canonicalize()
.unwrap_or_else(|_| dir.to_path_buf());
if !canonical_manifest_path.starts_with(&canonical_dir) {
return Err(ExtensionError::ManifestError {
reason: "Manifest path resolves outside of extension directory (potential path traversal)".to_string(),
});
}
// 4. Hash the contents of the sorted files
for (_relative, file_path) in relative_files {
if file_path == manifest_path {
// Canonicalize file_path for comparison
let canonical_file_path = file_path.canonicalize()
.unwrap_or_else(|_| file_path.clone());
if canonical_file_path == canonical_manifest_path {
// FOR THE MANIFEST.JSON:
let content_str = fs::read_to_string(&file_path)
.map_err(|e| ExtensionError::Filesystem { source: e })?;
@ -74,7 +90,7 @@ impl ExtensionCrypto {
let mut manifest: serde_json::Value =
serde_json::from_str(&content_str).map_err(|e| {
ExtensionError::ManifestError {
reason: format!("Cannot parse manifest JSON: {}", e),
reason: format!("Cannot parse manifest JSON: {e}"),
}
})?;
@ -91,11 +107,15 @@ impl ExtensionCrypto {
let canonical_manifest_content =
serde_json::to_string_pretty(&manifest).map_err(|e| {
ExtensionError::ManifestError {
reason: format!("Failed to serialize manifest: {}", e),
reason: format!("Failed to serialize manifest: {e}"),
}
})?;
println!("canonical_manifest_content: {}", canonical_manifest_content);
hasher.update(canonical_manifest_content.as_bytes());
// Normalize line endings to Unix style (\n), as Node.js JSON.stringify does
// This matters for cross-platform consistency (desktop vs Android)
let normalized_content = canonical_manifest_content.replace("\r\n", "\n");
hasher.update(normalized_content.as_bytes());
} else {
// FOR ALL OTHER FILES:
let content =
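The hashing changes above normalize path separators and line endings so that desktop and Android compute the same content hash. A small sketch of the two normalization steps in isolation, using sha2 and hex as in the surrounding code:

use sha2::{Digest, Sha256};

// Path separators: sort and compare with Unix-style separators on every platform.
fn normalize_relative_path(path: &str) -> String {
    path.replace('\\', "/")
}

// Line endings: hash the canonicalized manifest with Unix line endings only.
fn hash_manifest_content(canonical_manifest_json: &str) -> String {
    let normalized = canonical_manifest_json.replace("\r\n", "\n");
    let mut hasher = Sha256::new();
    hasher.update(normalized.as_bytes());
    hex::encode(hasher.finalize())
}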

View File

@ -3,7 +3,7 @@
use crate::crdt::hlc::HlcService;
use crate::crdt::transformer::CrdtTransformer;
use crate::crdt::trigger;
use crate::database::core::{convert_value_ref_to_json, parse_sql_statements, ValueConverter};
use crate::database::core::{convert_value_ref_to_json, parse_sql_statements};
use crate::database::error::DatabaseError;
use rusqlite::{params_from_iter, types::Value as SqliteValue, ToSql, Transaction};
use serde_json::Value as JsonValue;
@ -52,19 +52,25 @@ impl SqlExecutor {
}
let sql_str = statement.to_string();
eprintln!("DEBUG: Transformed execute SQL: {}", sql_str);
eprintln!("DEBUG: Transformed execute SQL: {sql_str}");
// Execute the statement
tx.execute(&sql_str, params)
.map_err(|e| DatabaseError::ExecutionError {
sql: sql_str.clone(),
table: None,
reason: format!("Execute failed: {}", e),
reason: format!("Execute failed: {e}"),
})?;
// Trigger logic for CREATE TABLE
if let Statement::CreateTable(create_table_details) = statement {
let table_name_str = create_table_details.name.to_string();
let raw_name = create_table_details.name.to_string();
// Remove quotes from table name
let table_name_str = raw_name
.trim_matches('"')
.trim_matches('`')
.to_string();
eprintln!("DEBUG: Setting up triggers for table: {table_name_str}");
trigger::setup_triggers_for_table(tx, &table_name_str, false)?;
}
@ -109,7 +115,7 @@ impl SqlExecutor {
}
let sql_str = statement.to_string();
eprintln!("DEBUG: Transformed SQL (with RETURNING): {}", sql_str);
eprintln!("DEBUG: Transformed SQL (with RETURNING): {sql_str}");
// Prepare and run the query
let mut stmt = tx
@ -158,7 +164,13 @@ impl SqlExecutor {
// Trigger logic for CREATE TABLE
if let Statement::CreateTable(create_table_details) = statement {
let table_name_str = create_table_details.name.to_string();
let raw_name = create_table_details.name.to_string();
// Remove quotes from table name
let table_name_str = raw_name
.trim_matches('"')
.trim_matches('`')
.to_string();
eprintln!("DEBUG: Setting up triggers for table (RETURNING): {table_name_str}");
trigger::setup_triggers_for_table(tx, &table_name_str, false)?;
}
@ -174,7 +186,7 @@ impl SqlExecutor {
) -> Result<HashSet<String>, DatabaseError> {
let sql_params: Vec<SqliteValue> = params
.iter()
.map(|v| crate::database::core::ValueConverter::json_to_rusqlite_value(v))
.map(crate::database::core::ValueConverter::json_to_rusqlite_value)
.collect::<Result<Vec<_>, _>>()?;
let param_refs: Vec<&dyn ToSql> = sql_params.iter().map(|p| p as &dyn ToSql).collect();
Self::execute_internal_typed(tx, hlc_service, sql, &param_refs)
@ -189,7 +201,7 @@ impl SqlExecutor {
) -> Result<(HashSet<String>, Vec<Vec<JsonValue>>), DatabaseError> {
let sql_params: Vec<SqliteValue> = params
.iter()
.map(|v| crate::database::core::ValueConverter::json_to_rusqlite_value(v))
.map(crate::database::core::ValueConverter::json_to_rusqlite_value)
.collect::<Result<Vec<_>, _>>()?;
let param_refs: Vec<&dyn ToSql> = sql_params.iter().map(|p| p as &dyn ToSql).collect();
Self::query_internal_typed(tx, hlc_service, sql, &param_refs)
@ -240,12 +252,12 @@ impl SqlExecutor {
let stmt_to_execute = ast_vec.pop().unwrap();
let transformed_sql = stmt_to_execute.to_string();
eprintln!("DEBUG: SELECT (no transformation): {}", transformed_sql);
eprintln!("DEBUG: SELECT (no transformation): {transformed_sql}");
// Convert JSON params to SQLite values
let sql_params: Vec<SqliteValue> = params
.iter()
.map(|v| crate::database::core::ValueConverter::json_to_rusqlite_value(v))
.map(crate::database::core::ValueConverter::json_to_rusqlite_value)
.collect::<Result<Vec<_>, _>>()?;
let mut prepared_stmt = conn.prepare(&transformed_sql)?;

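The quote-trimming added above can be illustrated with a tiny standalone sketch (assumption: sqlparser may render identifiers quoted with " or `, the latter being Drizzle's default):

// Strip surrounding double quotes or backticks from an identifier rendered by the SQL AST.
fn clean_table_name(raw: &str) -> String {
    raw.trim_matches('"').trim_matches('`').to_string()
}

fn main() {
    assert_eq!(clean_table_name("\"haex_notes\""), "haex_notes");
    assert_eq!(clean_table_name("`haex_notes`"), "haex_notes");
    assert_eq!(clean_table_name("haex_notes"), "haex_notes");
}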
View File

@ -5,6 +5,7 @@ use crate::crdt::transformer::CrdtTransformer;
use crate::crdt::trigger;
use crate::database::core::{parse_sql_statements, with_connection, ValueConverter};
use crate::database::error::DatabaseError;
use crate::extension::database::executor::SqlExecutor;
use crate::extension::error::ExtensionError;
use crate::extension::permissions::validator::SqlPermissionValidator;
use crate::AppState;
@ -12,10 +13,8 @@ use crate::AppState;
use rusqlite::params_from_iter;
use rusqlite::types::Value as SqlValue;
use rusqlite::Transaction;
use serde_json::json;
use serde_json::Value as JsonValue;
use sqlparser::ast::{Statement, TableFactor, TableObject};
use std::collections::HashSet;
use tauri::State;
/// Executes statements with correct parameter binding
@ -110,7 +109,7 @@ pub async fn extension_sql_execute(
public_key: String,
name: String,
state: State<'_, AppState>,
) -> Result<Vec<String>, ExtensionError> {
) -> Result<Vec<Vec<JsonValue>>, ExtensionError> {
// Get extension to retrieve its ID
let extension = state
.extension_manager
@ -129,58 +128,103 @@ pub async fn extension_sql_execute(
// SQL parsing
let mut ast_vec = parse_sql_statements(sql)?;
if ast_vec.len() != 1 {
return Err(ExtensionError::Database {
source: DatabaseError::ExecutionError {
sql: sql.to_string(),
reason: "extension_sql_execute should only receive a single SQL statement"
.to_string(),
table: None,
},
});
}
let mut statement = ast_vec.pop().unwrap();
// If this is a SELECT statement, delegate to extension_sql_select
if matches!(statement, Statement::Query(_)) {
return extension_sql_select(sql, params, public_key, name, state).await;
}
// Check if statement has RETURNING clause
let has_returning = crate::database::core::statement_has_returning(&statement);
// Database operation
with_connection(&state.db, |conn| {
let tx = conn.transaction().map_err(DatabaseError::from)?;
let transformer = CrdtTransformer::new();
let executor = StatementExecutor::new(&tx);
// Get HLC service reference
let hlc_service = state.hlc.lock().map_err(|_| DatabaseError::MutexPoisoned {
reason: "Failed to lock HLC service".to_string(),
})?;
// Generate HLC timestamp
let hlc_timestamp = state
.hlc
.lock()
.unwrap()
.new_timestamp_and_persist(&tx)
.map_err(|e| DatabaseError::HlcError {
reason: e.to_string(),
})?;
let hlc_timestamp =
hlc_service
.new_timestamp_and_persist(&tx)
.map_err(|e| DatabaseError::HlcError {
reason: e.to_string(),
})?;
// Transform statements
let mut modified_schema_tables = HashSet::new();
for statement in &mut ast_vec {
if let Some(table_name) =
transformer.transform_execute_statement(statement, &hlc_timestamp)?
{
modified_schema_tables.insert(table_name);
}
}
// Transform statement
transformer.transform_execute_statement(&mut statement, &hlc_timestamp)?;
// Convert parameters
// Convert parameters to references
let sql_values = ValueConverter::convert_params(&params)?;
let param_refs: Vec<&dyn rusqlite::ToSql> = sql_values
.iter()
.map(|v| v as &dyn rusqlite::ToSql)
.collect();
// Execute statements
for statement in ast_vec {
executor.execute_statement_with_params(&statement, &sql_values)?;
let result = if has_returning {
// Use query_internal for statements with RETURNING
let (_, rows) = SqlExecutor::query_internal_typed(
&tx,
&hlc_service,
&statement.to_string(),
&param_refs,
)?;
rows
} else {
// Use execute_internal for statements without RETURNING
SqlExecutor::execute_internal_typed(
&tx,
&hlc_service,
&statement.to_string(),
&param_refs,
)?;
vec![]
};
if let Statement::CreateTable(create_table_details) = statement {
let table_name_str = create_table_details.name.to_string();
println!(
"Table '{}' created by extension, setting up CRDT triggers...",
table_name_str
);
trigger::setup_triggers_for_table(&tx, &table_name_str, false)?;
println!(
"Triggers for table '{}' successfully created.",
table_name_str
);
}
// Handle CREATE TABLE trigger setup
if let Statement::CreateTable(ref create_table_details) = statement {
// Extract table name and remove quotes (both " and `)
let raw_name = create_table_details.name.to_string();
println!("DEBUG: Raw table name from AST: {raw_name:?}");
println!(
"DEBUG: Raw table name chars: {:?}",
raw_name.chars().collect::<Vec<_>>()
);
let table_name_str = raw_name.trim_matches('"').trim_matches('`').to_string();
println!("DEBUG: Cleaned table name: {table_name_str:?}");
println!(
"DEBUG: Cleaned table name chars: {:?}",
table_name_str.chars().collect::<Vec<_>>()
);
println!("Table '{table_name_str}' created by extension, setting up CRDT triggers...");
trigger::setup_triggers_for_table(&tx, &table_name_str, false)?;
println!("Triggers for table '{table_name_str}' successfully created.");
}
// Commit transaction
tx.commit().map_err(DatabaseError::from)?;
Ok(modified_schema_tables.into_iter().collect())
Ok(result)
})
.map_err(ExtensionError::from)
}
@ -192,7 +236,7 @@ pub async fn extension_sql_select(
public_key: String,
name: String,
state: State<'_, AppState>,
) -> Result<Vec<JsonValue>, ExtensionError> {
) -> Result<Vec<Vec<JsonValue>>, ExtensionError> {
// Get extension to retrieve its ID
let extension = state
.extension_manager
@ -229,10 +273,9 @@ pub async fn extension_sql_select(
}
}
// Database operation
// Database operation - return Vec<Vec<JsonValue>> like sql_select_with_crdt
with_connection(&state.db, |conn| {
let sql_params = ValueConverter::convert_params(&params)?;
// Hard delete: no SELECT transformation needed anymore
let stmt_to_execute = ast_vec.pop().unwrap();
let transformed_sql = stmt_to_execute.to_string();
@ -245,52 +288,34 @@ pub async fn extension_sql_select(
table: None,
})?;
let column_names: Vec<String> = prepared_stmt
.column_names()
.into_iter()
.map(|s| s.to_string())
.collect();
let rows = prepared_stmt
.query_map(params_from_iter(sql_params.iter()), |row| {
row_to_json_value(row, &column_names)
})
let num_columns = prepared_stmt.column_count();
let mut rows = prepared_stmt
.query(params_from_iter(sql_params.iter()))
.map_err(|e| DatabaseError::QueryError {
reason: e.to_string(),
})?;
let mut results = Vec::new();
for row_result in rows {
results.push(row_result.map_err(|e| DatabaseError::RowProcessingError {
reason: e.to_string(),
})?);
let mut result_vec: Vec<Vec<JsonValue>> = Vec::new();
while let Some(row) = rows.next().map_err(|e| DatabaseError::QueryError {
reason: e.to_string(),
})? {
let mut row_values: Vec<JsonValue> = Vec::new();
for i in 0..num_columns {
let value_ref = row.get_ref(i).map_err(|e| DatabaseError::QueryError {
reason: e.to_string(),
})?;
let json_value = crate::database::core::convert_value_ref_to_json(value_ref)?;
row_values.push(json_value);
}
result_vec.push(row_values);
}
Ok(results)
Ok(result_vec)
})
.map_err(ExtensionError::from)
}
/// Converts a SQLite row to JSON
fn row_to_json_value(
row: &rusqlite::Row,
columns: &[String],
) -> Result<JsonValue, rusqlite::Error> {
let mut map = serde_json::Map::new();
for (i, col_name) in columns.iter().enumerate() {
let value = row.get::<usize, rusqlite::types::Value>(i)?;
let json_value = match value {
rusqlite::types::Value::Null => JsonValue::Null,
rusqlite::types::Value::Integer(i) => json!(i),
rusqlite::types::Value::Real(f) => json!(f),
rusqlite::types::Value::Text(s) => json!(s),
rusqlite::types::Value::Blob(blob) => json!(blob.to_vec()),
};
map.insert(col_name.clone(), json_value);
}
Ok(JsonValue::Object(map))
}
/// Validates parameters against SQL placeholders
fn validate_params(sql: &str, params: &[JsonValue]) -> Result<(), DatabaseError> {
let total_placeholders = count_sql_placeholders(sql);
@ -327,20 +352,4 @@ mod tests {
);
assert_eq!(count_sql_placeholders("SELECT * FROM users"), 0);
}
/* #[test]
fn test_truncate_sql() {
let sql = "SELECT * FROM very_long_table_name";
assert_eq!(truncate_sql(sql, 10), "SELECT * F...");
assert_eq!(truncate_sql(sql, 50), sql);
} */
#[test]
fn test_validate_params() {
let params = vec![json!(1), json!("test")];
assert!(validate_params("SELECT * FROM users WHERE id = ? AND name = ?", &params).is_ok());
assert!(validate_params("SELECT * FROM users WHERE id = ?", &params).is_err());
assert!(validate_params("SELECT * FROM users", &params).is_err());
}
}

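A hedged sketch of the placeholder check the tests above exercise; count_sql_placeholders here is a naive stand-in (it counts every `?`, ignoring string literals), not the project's implementation:

fn count_sql_placeholders(sql: &str) -> usize {
    // Naive: counts every '?' character.
    sql.chars().filter(|c| *c == '?').count()
}

fn validate_params(sql: &str, param_count: usize) -> Result<(), String> {
    let placeholders = count_sql_placeholders(sql);
    if placeholders == param_count {
        Ok(())
    } else {
        Err(format!("expected {placeholders} parameters, got {param_count}"))
    }
}

fn main() {
    assert!(validate_params("SELECT * FROM users WHERE id = ? AND name = ?", 2).is_ok());
    assert!(validate_params("SELECT * FROM users WHERE id = ?", 2).is_err());
    assert!(validate_params("SELECT * FROM users", 2).is_err());
}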
View File

@ -1,10 +1,12 @@
// src-tauri/src/extension/error.rs
use thiserror::Error;
use ts_rs::TS;
use crate::database::error::DatabaseError;
/// Error codes for frontend handling
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
#[derive(Debug, Clone, Copy, PartialEq, Eq, TS)]
#[ts(export)]
pub enum ExtensionErrorCode {
SecurityViolation = 1000,
NotFound = 1001,
@ -14,6 +16,7 @@ pub enum ExtensionErrorCode {
Filesystem = 2001,
FilesystemWithPath = 2004,
Http = 2002,
Web = 2005,
Shell = 2003,
Manifest = 3000,
Validation = 3001,
@ -25,6 +28,17 @@ pub enum ExtensionErrorCode {
Installation = 5000,
}
/// Serialized representation of ExtensionError for TypeScript
#[derive(Debug, Clone, serde::Serialize, TS)]
#[ts(export)]
pub struct SerializedExtensionError {
pub code: u16,
#[serde(rename = "type")]
pub error_type: String,
pub message: String,
pub extension_id: Option<String>,
}
impl serde::Serialize for ExtensionErrorCode {
fn serialize<S>(&self, serializer: S) -> Result<S::Ok, S::Error>
where
@ -70,6 +84,9 @@ pub enum ExtensionError {
#[error("HTTP request failed: {reason}")]
Http { reason: String },
#[error("Web request failed: {reason}")]
WebError { reason: String },
#[error("Shell command failed: {reason}")]
Shell {
reason: String,
@ -118,6 +135,7 @@ impl ExtensionError {
ExtensionError::Filesystem { .. } => ExtensionErrorCode::Filesystem,
ExtensionError::FilesystemWithPath { .. } => ExtensionErrorCode::FilesystemWithPath,
ExtensionError::Http { .. } => ExtensionErrorCode::Http,
ExtensionError::WebError { .. } => ExtensionErrorCode::Web,
ExtensionError::Shell { .. } => ExtensionErrorCode::Shell,
ExtensionError::ManifestError { .. } => ExtensionErrorCode::Manifest,
ExtensionError::ValidationError { .. } => ExtensionErrorCode::Validation,
@ -174,7 +192,7 @@ impl serde::Serialize for ExtensionError {
let mut state = serializer.serialize_struct("ExtensionError", 4)?;
state.serialize_field("code", &self.code())?;
state.serialize_field("type", &format!("{:?}", self))?;
state.serialize_field("type", &format!("{self:?}"))?;
state.serialize_field("message", &self.to_string())?;
if let Some(ext_id) = self.extension_id() {

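For illustration, a standalone sketch of the error payload shape the Serialize impl above emits (field names taken from the diff; the struct name and example values are made up):

use serde::Serialize;

#[derive(Serialize)]
struct SerializedExtensionErrorExample {
    code: u16,
    #[serde(rename = "type")]
    error_type: String,
    message: String,
    extension_id: Option<String>,
}

fn main() {
    let err = SerializedExtensionErrorExample {
        code: 2005, // ExtensionErrorCode::Web
        error_type: "WebError { reason: \"timeout\" }".to_string(),
        message: "Web request failed: timeout".to_string(),
        extension_id: Some("dev_example_notes".to_string()),
    };
    println!("{}", serde_json::to_string_pretty(&err).unwrap());
}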
View File

@ -133,7 +133,7 @@ fn validate_path_pattern(pattern: &str) -> Result<(), ExtensionError> {
// Check for path traversal attempts
if pattern.contains("../") || pattern.contains("..\\") {
return Err(ExtensionError::SecurityViolation {
reason: format!("Path traversal detected in pattern: {}", pattern),
reason: format!("Path traversal detected in pattern: {pattern}"),
});
}
@ -143,7 +143,6 @@ fn validate_path_pattern(pattern: &str) -> Result<(), ExtensionError> {
/// Resolves a path pattern to actual filesystem paths using Tauri's BaseDirectory
pub fn resolve_path_pattern(
pattern: &str,
app_handle: &tauri::AppHandle,
) -> Result<(String, String), ExtensionError> {
let (base_var, relative_path) = if let Some(slash_pos) = pattern.find('/') {
(&pattern[..slash_pos], &pattern[slash_pos + 1..])
@ -177,7 +176,7 @@ pub fn resolve_path_pattern(
"$TEMP" => "Temp",
_ => {
return Err(ExtensionError::ValidationError {
reason: format!("Unknown base directory variable: {}", base_var),
reason: format!("Unknown base directory variable: {base_var}"),
});
}
};

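A minimal sketch of the $VARIABLE/relative/path split used by resolve_path_pattern above (the variable-to-BaseDirectory table is only partially visible in the diff, and the no-slash fallback here is an assumption):

// Split "$TEMP/cache/report.pdf" into the base-directory variable and the relative path.
fn split_pattern(pattern: &str) -> (&str, &str) {
    match pattern.find('/') {
        Some(pos) => (&pattern[..pos], &pattern[pos + 1..]),
        None => (pattern, ""),
    }
}

fn main() {
    assert_eq!(split_pattern("$TEMP/cache/report.pdf"), ("$TEMP", "cache/report.pdf"));
    assert_eq!(split_pattern("$TEMP"), ("$TEMP", ""));
}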
View File

@ -1,7 +1,7 @@
/// src-tauri/src/extension/mod.rs
use crate::{
extension::{
core::{EditablePermissions, ExtensionInfoResponse, ExtensionPreview},
core::{manager::ExtensionManager, EditablePermissions, ExtensionInfoResponse, ExtensionPreview},
error::ExtensionError,
},
AppState,
@ -13,6 +13,10 @@ pub mod database;
pub mod error;
pub mod filesystem;
pub mod permissions;
pub mod web;
#[cfg(not(any(target_os = "android", target_os = "ios")))]
pub mod webview;
#[tauri::command]
pub fn get_extension_info(
@ -52,7 +56,7 @@ pub async fn get_all_extensions(
.extension_manager
.load_installed_extensions(&app_handle, &state)
.await
.map_err(|e| format!("Failed to load extensions: {:?}", e))?;
.map_err(|e| format!("Failed to load extensions: {e:?}"))?;
/* } */
let mut extensions = Vec::new();
@ -82,12 +86,13 @@ pub async fn get_all_extensions(
#[tauri::command]
pub async fn preview_extension(
app_handle: AppHandle,
state: State<'_, AppState>,
file_bytes: Vec<u8>,
) -> Result<ExtensionPreview, ExtensionError> {
state
.extension_manager
.preview_extension_internal(file_bytes)
.preview_extension_internal(&app_handle, file_bytes)
.await
}
@ -291,12 +296,12 @@ pub async fn load_dev_extension(
let (host, port, haextension_dir) = if config_path.exists() {
let config_content =
std::fs::read_to_string(&config_path).map_err(|e| ExtensionError::ValidationError {
reason: format!("Failed to read haextension.config.json: {}", e),
reason: format!("Failed to read haextension.config.json: {e}"),
})?;
let config: HaextensionConfig =
serde_json::from_str(&config_content).map_err(|e| ExtensionError::ValidationError {
reason: format!("Failed to parse haextension.config.json: {}", e),
reason: format!("Failed to parse haextension.config.json: {e}"),
})?;
(config.dev.host, config.dev.port, config.dev.haextension_dir)
@ -305,47 +310,43 @@ pub async fn load_dev_extension(
(default_host(), default_port(), default_haextension_dir())
};
let dev_server_url = format!("http://{}:{}", host, port);
eprintln!("📡 Dev server URL: {}", dev_server_url);
eprintln!("📁 Haextension directory: {}", haextension_dir);
let dev_server_url = format!("http://{host}:{port}");
eprintln!("📡 Dev server URL: {dev_server_url}");
eprintln!("📁 Haextension directory: {haextension_dir}");
// 1.5. Check if dev server is running
if !check_dev_server_health(&dev_server_url).await {
return Err(ExtensionError::ValidationError {
reason: format!(
"Dev server at {} is not reachable. Please start your dev server first (e.g., 'npm run dev')",
dev_server_url
"Dev server at {dev_server_url} is not reachable. Please start your dev server first (e.g., 'npm run dev')"
),
});
}
eprintln!("✅ Dev server is reachable");
// 2. Build path to manifest: <extension_path>/<haextension_dir>/manifest.json
let manifest_path = extension_path_buf
.join(&haextension_dir)
.join("manifest.json");
// Check if manifest exists
if !manifest_path.exists() {
return Err(ExtensionError::ManifestError {
reason: format!(
"Manifest not found at: {}. Make sure you run 'npx @haexhub/sdk init' first.",
manifest_path.display()
),
});
}
// 2. Validate and build path to manifest: <extension_path>/<haextension_dir>/manifest.json
let manifest_relative_path = format!("{haextension_dir}/manifest.json");
let manifest_path = ExtensionManager::validate_path_in_directory(
&extension_path_buf,
&manifest_relative_path,
true,
)?
.ok_or_else(|| ExtensionError::ManifestError {
reason: format!(
"Manifest not found at: {haextension_dir}/manifest.json. Make sure you run 'npx @haexhub/sdk init' first."
),
})?;
// 3. Read and parse manifest
let manifest_content =
std::fs::read_to_string(&manifest_path).map_err(|e| ExtensionError::ManifestError {
reason: format!("Failed to read manifest: {}", e),
reason: format!("Failed to read manifest: {e}"),
})?;
let manifest: ExtensionManifest = serde_json::from_str(&manifest_content)?;
// 4. Generate a unique ID for dev extension: dev_<public_key_first_8>_<name>
let key_prefix = manifest.public_key.chars().take(8).collect::<String>();
let extension_id = format!("dev_{}_{}", key_prefix, manifest.name);
// 4. Generate a unique ID for dev extension: dev_<public_key>_<name>
let extension_id = format!("dev_{}_{}", manifest.public_key, manifest.name);
// 5. Check if dev extension already exists (allow reload)
if let Some(existing) = state
@ -407,7 +408,7 @@ pub fn remove_dev_extension(
if let Some(id) = to_remove {
dev_exts.remove(&id);
eprintln!("✅ Dev extension removed: {}", name);
eprintln!("✅ Dev extension removed: {name}");
Ok(())
} else {
Err(ExtensionError::NotFound { public_key, name })
@ -431,3 +432,85 @@ pub fn get_all_dev_extensions(
Ok(extensions)
}
// ============================================================================
// WebviewWindow Commands (Desktop only)
// ============================================================================
#[cfg(not(any(target_os = "android", target_os = "ios")))]
#[tauri::command]
pub fn open_extension_webview_window(
app_handle: AppHandle,
state: State<'_, AppState>,
extension_id: String,
title: String,
width: f64,
height: f64,
x: Option<f64>,
y: Option<f64>,
) -> Result<String, ExtensionError> {
eprintln!("[open_extension_webview_window] Received extension_id: {}", extension_id);
// Returns the window_id (generated UUID without dashes)
state.extension_webview_manager.open_extension_window(
&app_handle,
&state.extension_manager,
extension_id,
title,
width,
height,
x,
y,
)
}
#[cfg(not(any(target_os = "android", target_os = "ios")))]
#[tauri::command]
pub fn close_extension_webview_window(
app_handle: AppHandle,
state: State<'_, AppState>,
window_id: String,
) -> Result<(), ExtensionError> {
state
.extension_webview_manager
.close_extension_window(&app_handle, &window_id)
}
#[cfg(not(any(target_os = "android", target_os = "ios")))]
#[tauri::command]
pub fn focus_extension_webview_window(
app_handle: AppHandle,
state: State<'_, AppState>,
window_id: String,
) -> Result<(), ExtensionError> {
state
.extension_webview_manager
.focus_extension_window(&app_handle, &window_id)
}
#[cfg(not(any(target_os = "android", target_os = "ios")))]
#[tauri::command]
pub fn update_extension_webview_window_position(
app_handle: AppHandle,
state: State<'_, AppState>,
window_id: String,
x: f64,
y: f64,
) -> Result<(), ExtensionError> {
state
.extension_webview_manager
.update_extension_window_position(&app_handle, &window_id, x, y)
}
#[cfg(not(any(target_os = "android", target_os = "ios")))]
#[tauri::command]
pub fn update_extension_webview_window_size(
app_handle: AppHandle,
state: State<'_, AppState>,
window_id: String,
width: f64,
height: f64,
) -> Result<(), ExtensionError> {
state
.extension_webview_manager
.update_extension_window_size(&app_handle, &window_id, width, height)
}

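A tiny sketch of the dev-extension ID scheme described above (the full public key is now used instead of only its first 8 characters; the key value is made up):

fn dev_extension_id(public_key: &str, name: &str) -> String {
    format!("dev_{public_key}_{name}")
}

fn main() {
    assert_eq!(dev_extension_id("a1b2c3d4e5f6", "notes"), "dev_a1b2c3d4e5f6_notes");
}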
View File

@ -0,0 +1,65 @@
// src-tauri/src/extension/permissions/check.rs
use crate::extension::error::ExtensionError;
use crate::extension::permissions::manager::PermissionManager;
use crate::AppState;
use std::path::Path;
use tauri::State;
#[tauri::command]
pub async fn check_web_permission(
extension_id: String,
url: String,
state: State<'_, AppState>,
) -> Result<(), ExtensionError> {
PermissionManager::check_web_permission(&state, &extension_id, &url).await
}
#[tauri::command]
pub async fn check_database_permission(
extension_id: String,
resource: String,
operation: String,
state: State<'_, AppState>,
) -> Result<(), ExtensionError> {
let action = match operation.as_str() {
"read" => crate::extension::permissions::types::Action::Database(
crate::extension::permissions::types::DbAction::Read,
),
"write" => crate::extension::permissions::types::Action::Database(
crate::extension::permissions::types::DbAction::ReadWrite,
),
_ => {
return Err(ExtensionError::ValidationError {
reason: format!("Invalid database operation: {}", operation),
})
}
};
PermissionManager::check_database_permission(&state, &extension_id, action, &resource).await
}
#[tauri::command]
pub async fn check_filesystem_permission(
extension_id: String,
path: String,
operation: String,
state: State<'_, AppState>,
) -> Result<(), ExtensionError> {
let action = match operation.as_str() {
"read" => crate::extension::permissions::types::Action::Filesystem(
crate::extension::permissions::types::FsAction::Read,
),
"write" => crate::extension::permissions::types::Action::Filesystem(
crate::extension::permissions::types::FsAction::ReadWrite,
),
_ => {
return Err(ExtensionError::ValidationError {
reason: format!("Invalid filesystem operation: {}", operation),
})
}
};
let file_path = Path::new(&path);
PermissionManager::check_filesystem_permission(&state, &extension_id, action, file_path).await
}

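A hedged sketch of the "read"/"write" mapping used by the permission-check commands above; the enum here is a stand-in for the project's Action/DbAction types:

#[derive(Debug, PartialEq)]
enum DbAction {
    Read,
    ReadWrite,
}

fn parse_db_operation(operation: &str) -> Result<DbAction, String> {
    match operation {
        "read" => Ok(DbAction::Read),
        "write" => Ok(DbAction::ReadWrite),
        other => Err(format!("Invalid database operation: {other}")),
    }
}

fn main() {
    assert_eq!(parse_db_operation("read"), Ok(DbAction::Read));
    assert!(parse_db_operation("delete").is_err());
}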
View File

@ -2,12 +2,14 @@ use crate::table_names::TABLE_EXTENSION_PERMISSIONS;
use crate::AppState;
use crate::database::core::with_connection;
use crate::database::error::DatabaseError;
use crate::extension::core::types::ExtensionSource;
use crate::extension::database::executor::SqlExecutor;
use crate::extension::error::ExtensionError;
use crate::extension::permissions::types::{Action, ExtensionPermission, PermissionStatus, ResourceType};
use crate::extension::permissions::types::{Action, ExtensionPermission, PermissionConstraints, PermissionStatus, ResourceType};
use tauri::State;
use crate::database::generated::HaexExtensionPermissions;
use rusqlite::params;
use std::path::Path;
pub struct PermissionManager;
@ -28,8 +30,7 @@ impl PermissionManager {
})?;
let sql = format!(
"INSERT INTO {} (id, extension_id, resource_type, action, target, constraints, status) VALUES (?, ?, ?, ?, ?, ?, ?)",
TABLE_EXTENSION_PERMISSIONS
"INSERT INTO {TABLE_EXTENSION_PERMISSIONS} (id, extension_id, resource_type, action, target, constraints, status) VALUES (?, ?, ?, ?, ?, ?, ?)"
);
for perm in permissions {
@ -76,8 +77,7 @@ impl PermissionManager {
let db_perm: HaexExtensionPermissions = permission.into();
let sql = format!(
"UPDATE {} SET resource_type = ?, action = ?, target = ?, constraints = ?, status = ? WHERE id = ?",
TABLE_EXTENSION_PERMISSIONS
"UPDATE {TABLE_EXTENSION_PERMISSIONS} SET resource_type = ?, action = ?, target = ?, constraints = ?, status = ? WHERE id = ?"
);
let params = params![
@ -111,7 +111,7 @@ impl PermissionManager {
reason: "Failed to lock HLC service".to_string(),
})?;
let sql = format!("UPDATE {} SET status = ? WHERE id = ?", TABLE_EXTENSION_PERMISSIONS);
let sql = format!("UPDATE {TABLE_EXTENSION_PERMISSIONS} SET status = ? WHERE id = ?");
let params = params![new_status.as_str(), permission_id];
SqlExecutor::execute_internal_typed(&tx, &hlc_service, &sql, params)?;
tx.commit().map_err(DatabaseError::from)
@ -133,7 +133,7 @@ impl PermissionManager {
})?;
// Real DELETE - rewritten to an UPDATE by the CrdtTransformer
let sql = format!("DELETE FROM {} WHERE id = ?", TABLE_EXTENSION_PERMISSIONS);
let sql = format!("DELETE FROM {TABLE_EXTENSION_PERMISSIONS} WHERE id = ?");
SqlExecutor::execute_internal_typed(&tx, &hlc_service, &sql, params![permission_id])?;
tx.commit().map_err(DatabaseError::from)
}).map_err(ExtensionError::from)
@ -152,7 +152,7 @@ impl PermissionManager {
reason: "Failed to lock HLC service".to_string(),
})?;
let sql = format!("DELETE FROM {} WHERE extension_id = ?", TABLE_EXTENSION_PERMISSIONS);
let sql = format!("DELETE FROM {TABLE_EXTENSION_PERMISSIONS} WHERE extension_id = ?");
SqlExecutor::execute_internal_typed(&tx, &hlc_service, &sql, params![extension_id])?;
tx.commit().map_err(DatabaseError::from)
}).map_err(ExtensionError::from)
@ -164,7 +164,7 @@ impl PermissionManager {
hlc_service: &crate::crdt::hlc::HlcService,
extension_id: &str,
) -> Result<(), DatabaseError> {
let sql = format!("DELETE FROM {} WHERE extension_id = ?", TABLE_EXTENSION_PERMISSIONS);
let sql = format!("DELETE FROM {TABLE_EXTENSION_PERMISSIONS} WHERE extension_id = ?");
SqlExecutor::execute_internal_typed(tx, hlc_service, &sql, params![extension_id])?;
Ok(())
}
@ -174,7 +174,7 @@ impl PermissionManager {
extension_id: &str,
) -> Result<Vec<ExtensionPermission>, ExtensionError> {
with_connection(&app_state.db, |conn| {
let sql = format!("SELECT * FROM {} WHERE extension_id = ?", TABLE_EXTENSION_PERMISSIONS);
let sql = format!("SELECT * FROM {TABLE_EXTENSION_PERMISSIONS} WHERE extension_id = ?");
let mut stmt = conn.prepare(&sql).map_err(DatabaseError::from)?;
let perms_iter = stmt.query_map(params![extension_id], |row| {
@ -197,6 +197,31 @@ impl PermissionManager {
action: Action,
table_name: &str,
) -> Result<(), ExtensionError> {
// Remove quotes from table name if present (from SDK's getTableName())
// Support both double quotes and backticks (Drizzle uses backticks by default)
let clean_table_name = table_name.trim_matches('"').trim_matches('`');
// Auto-allow: Extensions have full access to their own tables
// Table format: {publicKey}__{extensionName}__{tableName}
// Extension ID format: dev_{publicKey}_{extensionName} or {publicKey}_{extensionName}
// Get the extension to check if this is its own table
let extension = app_state
.extension_manager
.get_extension(extension_id)
.ok_or_else(|| ExtensionError::ValidationError {
reason: format!("Extension with ID {extension_id} not found"),
})?;
// Build expected table prefix: {publicKey}__{extensionName}__
let expected_prefix = format!("{}__{}__", extension.manifest.public_key, extension.manifest.name);
if clean_table_name.starts_with(&expected_prefix) {
// This is the extension's own table - auto-allow
return Ok(());
}
// Not own table - check explicit permissions
let permissions = Self::get_permissions(app_state, extension_id).await?;
let has_permission = permissions
@ -205,7 +230,7 @@ impl PermissionManager {
.filter(|perm| perm.resource_type == ResourceType::Db)
.filter(|perm| perm.action == action) // action is no longer an Option
.any(|perm| {
if perm.target != "*" && perm.target != table_name {
if perm.target != "*" && perm.target != clean_table_name {
return false;
}
true
@ -214,15 +239,105 @@ impl PermissionManager {
if !has_permission {
return Err(ExtensionError::permission_denied(
extension_id,
&format!("{:?}", action),
&format!("database table '{}'", table_name),
&format!("{action:?}"),
&format!("database table '{table_name}'"),
));
}
Ok(())
}
/* /// Checks filesystem permissions
/// Checks web permissions for requests
/// Method/operation is not checked - only protocol, domain, port, and path
pub async fn check_web_permission(
app_state: &State<'_, AppState>,
extension_id: &str,
url: &str,
) -> Result<(), ExtensionError> {
// Load permissions - for dev extensions, get from manifest; for production, from database
let permissions = if let Some(extension) = app_state.extension_manager.get_extension(extension_id) {
match &extension.source {
ExtensionSource::Development { .. } => {
// Dev extension - get web permissions from manifest
extension.manifest.permissions
.to_internal_permissions(extension_id)
.into_iter()
.filter(|p| p.resource_type == ResourceType::Web)
.map(|mut p| {
// Dev extensions have all permissions granted by default
p.status = PermissionStatus::Granted;
p
})
.collect()
}
ExtensionSource::Production { .. } => {
// Production extension - load from database
with_connection(&app_state.db, |conn| {
let sql = format!(
"SELECT * FROM {TABLE_EXTENSION_PERMISSIONS} WHERE extension_id = ? AND resource_type = 'web'"
);
let mut stmt = conn.prepare(&sql).map_err(DatabaseError::from)?;
let perms_iter = stmt.query_map(params![extension_id], |row| {
crate::database::generated::HaexExtensionPermissions::from_row(row)
})?;
let permissions: Vec<ExtensionPermission> = perms_iter
.filter_map(Result::ok)
.map(Into::into)
.collect();
Ok(permissions)
})?
}
}
} else {
// Extension not found - deny
return Err(ExtensionError::ValidationError {
reason: format!("Extension not found: {}", extension_id),
});
};
let url_parsed = url::Url::parse(url).map_err(|e| ExtensionError::ValidationError {
reason: format!("Invalid URL: {}", e),
})?;
let domain = url_parsed.host_str().ok_or_else(|| ExtensionError::ValidationError {
reason: "URL does not contain a valid host".to_string(),
})?;
let has_permission = permissions
.iter()
.filter(|perm| perm.status == PermissionStatus::Granted)
.any(|perm| {
// Check if target matches the URL
let url_matches = if perm.target == "*" {
// Wildcard matches everything
true
} else if perm.target.contains("://") {
// URL pattern matching (with protocol and optional path)
Self::matches_url_pattern(&perm.target, url)
} else {
// Domain-only matching (legacy behavior)
perm.target == domain || domain.ends_with(&format!(".{}", perm.target))
};
// Return the URL match result (no method checking)
url_matches
});
if !has_permission {
return Err(ExtensionError::permission_denied(
extension_id,
"web request",
url,
));
}
Ok(())
}
/// Checks filesystem permissions
pub async fn check_filesystem_permission(
app_state: &State<'_, AppState>,
extension_id: &str,
@ -270,56 +385,6 @@ impl PermissionManager {
Ok(())
}
/// Checks HTTP permissions
pub async fn check_http_permission(
app_state: &State<'_, AppState>,
extension_id: &str,
method: &str,
url: &str,
) -> Result<(), ExtensionError> {
let permissions = Self::get_permissions(app_state, extension_id).await?;
let url_parsed = Url::parse(url).map_err(|e| ExtensionError::ValidationError {
reason: format!("Invalid URL: {}", e),
})?;
let domain = url_parsed.host_str().unwrap_or("");
let has_permission = permissions
.iter()
.filter(|perm| perm.status == PermissionStatus::Granted)
.filter(|perm| perm.resource_type == ResourceType::Http)
.any(|perm| {
let domain_matches = perm.target == "*"
|| perm.target == domain
|| domain.ends_with(&format!(".{}", perm.target));
if !domain_matches {
return false;
}
if let Some(PermissionConstraints::Http(constraints)) = &perm.constraints {
if let Some(methods) = &constraints.methods {
if !methods.iter().any(|m| m.eq_ignore_ascii_case(method)) {
return false;
}
}
}
true
});
if !has_permission {
return Err(ExtensionError::permission_denied(
extension_id,
method,
&format!("HTTP request to '{}'", url),
));
}
Ok(())
}
/// Checks shell permissions
pub async fn check_shell_permission(
app_state: &State<'_, AppState>,
@ -382,16 +447,16 @@ impl PermissionManager {
Ok(())
}
*/
// Helper methods - must return DatabaseError instead of ExtensionError
pub fn parse_resource_type(s: &str) -> Result<ResourceType, DatabaseError> {
match s {
"fs" => Ok(ResourceType::Fs),
"http" => Ok(ResourceType::Http),
"web" => Ok(ResourceType::Web),
"db" => Ok(ResourceType::Db),
"shell" => Ok(ResourceType::Shell),
_ => Err(DatabaseError::SerializationError {
reason: format!("Unknown resource type: {}", s),
reason: format!("Unknown resource type: {s}"),
}),
}
}
@ -399,8 +464,7 @@ impl PermissionManager {
fn matches_path_pattern(pattern: &str, path: &str) -> bool {
if pattern.ends_with("/*") {
let prefix = &pattern[..pattern.len() - 2];
if let Some(prefix) = pattern.strip_suffix("/*") {
return path.starts_with(prefix);
}
@ -419,6 +483,114 @@ impl PermissionManager {
pattern == path || pattern == "*"
}
/// Matches a URL against a URL pattern
/// Supports:
/// - Path wildcards: "https://domain.com/*"
/// - Subdomain wildcards: "https://*.domain.com/*"
fn matches_url_pattern(pattern: &str, url: &str) -> bool {
// Parse the actual URL
let Ok(url_parsed) = url::Url::parse(url) else {
return false;
};
// Check if pattern contains subdomain wildcard
let has_subdomain_wildcard = pattern.contains("://*.") || pattern.starts_with("*.");
if has_subdomain_wildcard {
// Extract components for wildcard matching
// Pattern: "https://*.example.com/*"
// Get protocol from pattern
let protocol_end = pattern.find("://").unwrap_or(0);
let pattern_protocol = if protocol_end > 0 {
&pattern[..protocol_end]
} else {
""
};
// Protocol must match if specified
if !pattern_protocol.is_empty() && pattern_protocol != url_parsed.scheme() {
return false;
}
// Extract the domain pattern (after *. )
let domain_start = if pattern.contains("://*.") {
pattern.find("://*.").unwrap() + 5 // length of "://.*"
} else if pattern.starts_with("*.") {
2 // length of "*."
} else {
return false;
};
// Find where the domain pattern ends (at / or end of string)
let domain_pattern_end = pattern[domain_start..].find('/').map(|i| domain_start + i).unwrap_or(pattern.len());
let domain_pattern = &pattern[domain_start..domain_pattern_end];
// Check if the URL's host ends with the domain pattern
let Some(url_host) = url_parsed.host_str() else {
return false;
};
// Match: *.example.com should match subdomain.example.com but not example.com
// Also match: exact domain if no subdomain wildcard prefix
if !url_host.ends_with(domain_pattern) && url_host != domain_pattern {
return false;
}
// For subdomain wildcard, ensure there's actually a subdomain
if pattern.contains("*.") && url_host == domain_pattern {
return false; // *.example.com should NOT match example.com
}
// Check path wildcard if present
if pattern.contains("/*") {
// Any path is allowed
return true;
}
// Check exact path if no wildcard
let pattern_path_start = domain_pattern_end;
if pattern_path_start < pattern.len() {
let pattern_path = &pattern[pattern_path_start..];
return url_parsed.path() == pattern_path;
}
return true;
}
// No subdomain wildcard - parse as full URL
let Ok(pattern_url) = url::Url::parse(pattern) else {
return false;
};
// Protocol must match
if pattern_url.scheme() != url_parsed.scheme() {
return false;
}
// Host must match
if pattern_url.host_str() != url_parsed.host_str() {
return false;
}
// Port must match (if specified)
if pattern_url.port() != url_parsed.port() {
return false;
}
// Path matching with wildcard support
if pattern.contains("/*") {
// Extract the base path before the wildcard
if let Some(wildcard_pos) = pattern.find("/*") {
let pattern_before_wildcard = &pattern[..wildcard_pos];
return url.starts_with(pattern_before_wildcard);
}
}
// Exact path match (no wildcard)
pattern_url.path() == url_parsed.path()
}
}

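A minimal sketch of the own-table auto-allow rule above (table prefix format {publicKey}__{extensionName}__, as stated in the comments; the key and table names are made up):

fn is_own_table(public_key: &str, extension_name: &str, table_name: &str) -> bool {
    let clean = table_name.trim_matches('"').trim_matches('`');
    clean.starts_with(&format!("{public_key}__{extension_name}__"))
}

fn main() {
    assert!(is_own_table("abc123", "notes", "abc123__notes__entries"));
    assert!(is_own_table("abc123", "notes", "\"abc123__notes__entries\""));
    assert!(!is_own_table("abc123", "notes", "other__ext__entries"));
}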
View File

@ -1,3 +1,4 @@
pub mod check;
pub mod manager;
pub mod types;
pub mod validator;

View File

@ -86,11 +86,11 @@ impl FromStr for FsAction {
}
}
/// Defines actions (HTTP methods) that can be applied to HTTP requests.
/// Defines actions (HTTP methods) that can be applied to web requests.
#[derive(Debug, Clone, PartialEq, Serialize, Deserialize, TS)]
#[serde(rename_all = "UPPERCASE")]
#[ts(export)]
pub enum HttpAction {
pub enum WebAction {
Get,
Post,
Put,
@ -100,20 +100,20 @@ pub enum HttpAction {
All,
}
impl FromStr for HttpAction {
impl FromStr for WebAction {
type Err = ExtensionError;
fn from_str(s: &str) -> Result<Self, Self::Err> {
match s.to_uppercase().as_str() {
"GET" => Ok(HttpAction::Get),
"POST" => Ok(HttpAction::Post),
"PUT" => Ok(HttpAction::Put),
"PATCH" => Ok(HttpAction::Patch),
"DELETE" => Ok(HttpAction::Delete),
"*" => Ok(HttpAction::All),
"GET" => Ok(WebAction::Get),
"POST" => Ok(WebAction::Post),
"PUT" => Ok(WebAction::Put),
"PATCH" => Ok(WebAction::Patch),
"DELETE" => Ok(WebAction::Delete),
"*" => Ok(WebAction::All),
_ => Err(ExtensionError::InvalidActionString {
input: s.to_string(),
resource_type: "http".to_string(),
resource_type: "web".to_string(),
}),
}
}
@ -149,7 +149,7 @@ impl FromStr for ShellAction {
pub enum Action {
Database(DbAction),
Filesystem(FsAction),
Http(HttpAction),
Web(WebAction),
Shell(ShellAction),
}
@ -173,7 +173,7 @@ pub struct ExtensionPermission {
#[ts(export)]
pub enum ResourceType {
Fs,
Http,
Web,
Db,
Shell,
}
@ -195,7 +195,7 @@ pub enum PermissionStatus {
pub enum PermissionConstraints {
Database(DbConstraints),
Filesystem(FsConstraints),
Http(HttpConstraints),
Web(WebConstraints),
Shell(ShellConstraints),
}
@ -223,7 +223,7 @@ pub struct FsConstraints {
#[derive(Serialize, Deserialize, Clone, Debug, Default, TS)]
#[ts(export)]
pub struct HttpConstraints {
pub struct WebConstraints {
#[serde(skip_serializing_if = "Option::is_none")]
pub methods: Option<Vec<String>>,
#[serde(skip_serializing_if = "Option::is_none")]
@ -254,7 +254,7 @@ impl ResourceType {
pub fn as_str(&self) -> &str {
match self {
ResourceType::Fs => "fs",
ResourceType::Http => "http",
ResourceType::Web => "web",
ResourceType::Db => "db",
ResourceType::Shell => "shell",
}
@ -263,11 +263,11 @@ impl ResourceType {
pub fn from_str(s: &str) -> Result<Self, ExtensionError> {
match s {
"fs" => Ok(ResourceType::Fs),
"http" => Ok(ResourceType::Http),
"web" => Ok(ResourceType::Web),
"db" => Ok(ResourceType::Db),
"shell" => Ok(ResourceType::Shell),
_ => Err(ExtensionError::ValidationError {
reason: format!("Unknown resource type: {}", s),
reason: format!("Unknown resource type: {s}"),
}),
}
}
@ -284,7 +284,7 @@ impl Action {
.unwrap_or_default()
.trim_matches('"')
.to_string(),
Action::Http(action) => serde_json::to_string(action)
Action::Web(action) => serde_json::to_string(action)
.unwrap_or_default()
.trim_matches('"')
.to_string(),
@ -299,15 +299,15 @@ impl Action {
match resource_type {
ResourceType::Db => Ok(Action::Database(DbAction::from_str(s)?)),
ResourceType::Fs => Ok(Action::Filesystem(FsAction::from_str(s)?)),
ResourceType::Http => {
let action: HttpAction =
serde_json::from_str(&format!("\"{}\"", s)).map_err(|_| {
ResourceType::Web => {
let action: WebAction =
serde_json::from_str(&format!("\"{s}\"")).map_err(|_| {
ExtensionError::InvalidActionString {
input: s.to_string(),
resource_type: "http".to_string(),
resource_type: "web".to_string(),
}
})?;
Ok(Action::Http(action))
Ok(Action::Web(action))
}
ResourceType::Shell => Ok(Action::Shell(ShellAction::from_str(s)?)),
}
@ -329,7 +329,7 @@ impl PermissionStatus {
"granted" => Ok(PermissionStatus::Granted),
"denied" => Ok(PermissionStatus::Denied),
_ => Err(ExtensionError::ValidationError {
reason: format!("Unknown permission status: {}", s),
reason: format!("Unknown permission status: {s}"),
}),
}
}

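For reference, a standalone sketch of the lowercase resource-type strings after the Http-to-Web rename (the enum is illustrative, not the project's ResourceType):

#[derive(Debug, PartialEq)]
enum ResourceTypeExample {
    Fs,
    Web,
    Db,
    Shell,
}

fn resource_type_from_str(s: &str) -> Option<ResourceTypeExample> {
    match s {
        "fs" => Some(ResourceTypeExample::Fs),
        "web" => Some(ResourceTypeExample::Web),
        "db" => Some(ResourceTypeExample::Db),
        "shell" => Some(ResourceTypeExample::Shell),
        _ => None,
    }
}

fn main() {
    assert_eq!(resource_type_from_str("web"), Some(ResourceTypeExample::Web));
    assert_eq!(resource_type_from_str("http"), None); // "http" is no longer a valid resource type
}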
View File

@ -17,7 +17,7 @@ impl SqlPermissionValidator {
fn is_own_table(extension_id: &str, table_name: &str) -> bool {
// Table names use the format: {keyHash}_{extensionName}_{tableName}
// extension_id is the extension's keyHash
table_name.starts_with(&format!("{}_", extension_id))
table_name.starts_with(&format!("{extension_id}_"))
}
/// Validates a SQL statement against an extension's permissions
@ -45,7 +45,7 @@ impl SqlPermissionValidator {
Self::validate_schema_statement(app_state, extension_id, &statement).await
}
_ => Err(ExtensionError::ValidationError {
reason: format!("Statement type not allowed: {}", sql),
reason: format!("Statement type not allowed: {sql}"),
}),
}
}

View File

@ -0,0 +1,208 @@
// src-tauri/src/extension/web/mod.rs
use crate::extension::error::ExtensionError;
use crate::AppState;
use base64::{engine::general_purpose::STANDARD, Engine as _};
use serde::{Deserialize, Serialize};
use std::collections::HashMap;
use std::time::Duration;
use tauri::State;
use tauri_plugin_http::reqwest;
/// Request structure matching the SDK's WebRequestOptions
#[derive(Debug, Deserialize)]
pub struct WebFetchRequest {
pub url: String,
#[serde(default)]
pub method: Option<String>,
#[serde(default)]
pub headers: Option<HashMap<String, String>>,
#[serde(default)]
pub body: Option<String>, // Base64 encoded
#[serde(default)]
pub timeout: Option<u64>, // milliseconds
}
/// Response structure matching the SDK's WebResponse
#[derive(Debug, Serialize)]
pub struct WebFetchResponse {
pub status: u16,
pub status_text: String,
pub headers: HashMap<String, String>,
pub body: String, // Base64 encoded
pub url: String,
}
#[tauri::command]
pub async fn extension_web_open(
url: String,
public_key: String,
name: String,
state: State<'_, AppState>,
) -> Result<(), ExtensionError> {
// Get extension to validate it exists
let extension = state
.extension_manager
.get_extension_by_public_key_and_name(&public_key, &name)?
.ok_or_else(|| ExtensionError::NotFound {
public_key: public_key.clone(),
name: name.clone(),
})?;
// Validate URL format
let parsed_url = url::Url::parse(&url).map_err(|e| ExtensionError::WebError {
reason: format!("Invalid URL: {}", e),
})?;
// Only allow http and https URLs
let scheme = parsed_url.scheme();
if scheme != "http" && scheme != "https" {
return Err(ExtensionError::WebError {
reason: format!("Unsupported URL scheme: {}. Only http and https are allowed.", scheme),
});
}
// Check web permissions
crate::extension::permissions::manager::PermissionManager::check_web_permission(
&state,
&extension.id,
&url,
)
.await?;
// Open URL in default browser using tauri-plugin-opener
tauri_plugin_opener::open_url(&url, None::<&str>).map_err(|e| ExtensionError::WebError {
reason: format!("Failed to open URL in browser: {}", e),
})?;
Ok(())
}
#[tauri::command]
pub async fn extension_web_fetch(
url: String,
method: Option<String>,
headers: Option<HashMap<String, String>>,
body: Option<String>,
timeout: Option<u64>,
public_key: String,
name: String,
state: State<'_, AppState>,
) -> Result<WebFetchResponse, ExtensionError> {
// Get extension to validate it exists
let extension = state
.extension_manager
.get_extension_by_public_key_and_name(&public_key, &name)?
.ok_or_else(|| ExtensionError::NotFound {
public_key: public_key.clone(),
name: name.clone(),
})?;
let method_str = method.as_deref().unwrap_or("GET");
// Check web permissions before making request
crate::extension::permissions::manager::PermissionManager::check_web_permission(
&state,
&extension.id,
&url,
)
.await?;
let request = WebFetchRequest {
url,
method: Some(method_str.to_string()),
headers,
body,
timeout,
};
fetch_web_request(request).await
}
/// Performs the actual HTTP request without CORS restrictions
async fn fetch_web_request(request: WebFetchRequest) -> Result<WebFetchResponse, ExtensionError> {
let method_str = request.method.as_deref().unwrap_or("GET");
let timeout_ms = request.timeout.unwrap_or(30000);
// Build reqwest client with timeout
let client = reqwest::Client::builder()
.timeout(Duration::from_millis(timeout_ms))
.build()
.map_err(|e| ExtensionError::WebError {
reason: format!("Failed to create HTTP client: {}", e),
})?;
// Build request
let mut req_builder = match method_str.to_uppercase().as_str() {
"GET" => client.get(&request.url),
"POST" => client.post(&request.url),
"PUT" => client.put(&request.url),
"DELETE" => client.delete(&request.url),
"PATCH" => client.patch(&request.url),
"HEAD" => client.head(&request.url),
"OPTIONS" => client.request(reqwest::Method::OPTIONS, &request.url),
_ => {
return Err(ExtensionError::WebError {
reason: format!("Unsupported HTTP method: {}", method_str),
})
}
};
// Add headers
if let Some(headers) = request.headers {
for (key, value) in headers {
req_builder = req_builder.header(key, value);
}
}
// Add body if present (decode from base64)
if let Some(body_base64) = request.body {
let body_bytes = STANDARD.decode(&body_base64).map_err(|e| {
ExtensionError::WebError {
reason: format!("Failed to decode request body from base64: {}", e),
}
})?;
req_builder = req_builder.body(body_bytes);
}
// Execute request
let response = req_builder.send().await.map_err(|e| {
if e.is_timeout() {
ExtensionError::WebError {
reason: format!("Request timeout after {}ms", timeout_ms),
}
} else {
ExtensionError::WebError {
reason: format!("Request failed: {}", e),
}
}
})?;
// Extract response data
let status = response.status().as_u16();
let status_text = response.status().canonical_reason().unwrap_or("").to_string();
let final_url = response.url().to_string();
// Extract headers
let mut response_headers = HashMap::new();
for (key, value) in response.headers() {
if let Ok(value_str) = value.to_str() {
response_headers.insert(key.to_string(), value_str.to_string());
}
}
// Read body and encode to base64
let body_bytes = response.bytes().await.map_err(|e| ExtensionError::WebError {
reason: format!("Failed to read response body: {}", e),
})?;
let body_base64 = STANDARD.encode(&body_bytes);
Ok(WebFetchResponse {
status,
status_text,
headers: response_headers,
body: body_base64,
url: final_url,
})
}

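A minimal sketch of the base64 body convention extension_web_fetch relies on (same base64 engine as the import above; the JSON payload is made up):

use base64::{engine::general_purpose::STANDARD, Engine as _};

fn main() {
    // What an SDK caller would send as the request body:
    let body = br#"{"query":"ping"}"#;
    let encoded = STANDARD.encode(body);

    // What the command does before attaching the body to the request builder:
    let decoded = STANDARD.decode(&encoded).expect("valid base64");
    assert_eq!(decoded.as_slice(), body.as_slice());
    println!("round-trip ok: {encoded}");
}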
View File

@ -0,0 +1,66 @@
use crate::extension::database::{extension_sql_execute, extension_sql_select};
use crate::extension::error::ExtensionError;
use crate::AppState;
use tauri::{State, WebviewWindow};
use super::helpers::get_extension_id;
#[tauri::command]
pub async fn webview_extension_db_query(
window: WebviewWindow,
state: State<'_, AppState>,
query: String,
params: Vec<serde_json::Value>,
) -> Result<serde_json::Value, ExtensionError> {
let extension_id = get_extension_id(&window, &state)?;
// Get extension to retrieve public_key and name for existing database functions
let extension = state
.extension_manager
.get_extension(&extension_id)
.ok_or_else(|| ExtensionError::ValidationError {
reason: format!("Extension with ID {} not found", extension_id),
})?;
let rows = extension_sql_select(&query, params, extension.manifest.public_key.clone(), extension.manifest.name.clone(), state)
.await
.map_err(|e| ExtensionError::ValidationError {
reason: format!("Database query failed: {}", e),
})?;
Ok(serde_json::json!({
"rows": rows,
"rowsAffected": 0,
"lastInsertId": null
}))
}
#[tauri::command]
pub async fn webview_extension_db_execute(
window: WebviewWindow,
state: State<'_, AppState>,
query: String,
params: Vec<serde_json::Value>,
) -> Result<serde_json::Value, ExtensionError> {
let extension_id = get_extension_id(&window, &state)?;
// Get extension to retrieve public_key and name for existing database functions
let extension = state
.extension_manager
.get_extension(&extension_id)
.ok_or_else(|| ExtensionError::ValidationError {
reason: format!("Extension with ID {} not found", extension_id),
})?;
let rows = extension_sql_execute(&query, params, extension.manifest.public_key.clone(), extension.manifest.name.clone(), state)
.await
.map_err(|e| ExtensionError::ValidationError {
reason: format!("Database execute failed: {}", e),
})?;
Ok(serde_json::json!({
"rows": rows,
"rowsAffected": rows.len(),
"lastInsertId": null
}))
}

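A hedged sketch of the result envelope the webview DB commands above hand back to the extension (positional rows; in the diff, rowsAffected is 0 for queries and rows.len() for executes):

use serde_json::{json, Value};

fn result_envelope(rows: Vec<Vec<Value>>, rows_affected: usize) -> Value {
    json!({
        "rows": rows,
        "rowsAffected": rows_affected,
        "lastInsertId": null
    })
}

fn main() {
    let rows = vec![vec![json!(1), json!("alice")]];
    let affected = rows.len();
    println!("{}", result_envelope(rows, affected));
}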
View File

@ -0,0 +1,113 @@
use crate::extension::error::ExtensionError;
use crate::AppState;
use serde::{Deserialize, Serialize};
use tauri::{State, WebviewWindow};
use tauri_plugin_dialog::DialogExt;
use tauri_plugin_opener::OpenerExt;
#[derive(Debug, Clone, Deserialize)]
pub struct FileFilter {
pub name: String,
pub extensions: Vec<String>,
}
#[derive(Debug, Clone, Serialize)]
pub struct SaveFileResult {
pub path: String,
pub success: bool,
}
#[tauri::command]
pub async fn webview_extension_fs_save_file(
window: WebviewWindow,
_state: State<'_, AppState>,
data: Vec<u8>,
default_path: Option<String>,
title: Option<String>,
filters: Option<Vec<FileFilter>>,
) -> Result<Option<SaveFileResult>, ExtensionError> {
eprintln!("[Filesystem] save_file called with {} bytes", data.len());
eprintln!("[Filesystem] save_file default_path: {:?}", default_path);
eprintln!("[Filesystem] save_file first 10 bytes: {:?}", &data[..data.len().min(10)]);
// Build save dialog
let mut dialog = window.dialog().file();
if let Some(path) = default_path {
dialog = dialog.set_file_name(&path);
}
if let Some(t) = title {
dialog = dialog.set_title(&t);
}
if let Some(f) = filters {
for filter in f {
let ext_refs: Vec<&str> = filter.extensions.iter().map(|s| s.as_str()).collect();
dialog = dialog.add_filter(&filter.name, &ext_refs);
}
}
// Show dialog (blocking_save_file is safe in async commands)
eprintln!("[Filesystem] Showing save dialog...");
let file_path = dialog.blocking_save_file();
if let Some(file_path) = file_path {
// Convert FilePath to PathBuf
let path_buf = file_path.as_path().ok_or_else(|| ExtensionError::ValidationError {
reason: "Failed to get file path".to_string(),
})?;
eprintln!("[Filesystem] User selected path: {}", path_buf.display());
eprintln!("[Filesystem] Writing {} bytes to file...", data.len());
// Write file using std::fs
std::fs::write(path_buf, &data)
.map_err(|e| {
eprintln!("[Filesystem] ERROR writing file: {}", e);
ExtensionError::ValidationError {
reason: format!("Failed to write file: {}", e),
}
})?;
eprintln!("[Filesystem] File written successfully");
Ok(Some(SaveFileResult {
path: path_buf.to_string_lossy().to_string(),
success: true,
}))
} else {
eprintln!("[Filesystem] User cancelled");
// User cancelled
Ok(None)
}
}
#[tauri::command]
pub async fn webview_extension_fs_open_file(
window: WebviewWindow,
_state: State<'_, AppState>,
data: Vec<u8>,
file_name: String,
) -> Result<serde_json::Value, ExtensionError> {
// Get temp directory
let temp_dir = std::env::temp_dir();
let temp_file_path = temp_dir.join(&file_name);
// Write file to temp directory using std::fs
std::fs::write(&temp_file_path, data)
.map_err(|e| ExtensionError::ValidationError {
reason: format!("Failed to write temp file: {}", e),
})?;
// Open file with system's default viewer
let path_str = temp_file_path.to_string_lossy().to_string();
window.opener().open_path(path_str, None::<String>)
.map_err(|e| ExtensionError::ValidationError {
reason: format!("Failed to open file: {}", e),
})?;
Ok(serde_json::json!({
"success": true
}))
}

View File

@ -0,0 +1,57 @@
use crate::extension::core::protocol::ExtensionInfo;
use crate::extension::error::ExtensionError;
use crate::AppState;
use tauri::{State, WebviewWindow};
/// Get extension_id from window (SECURITY: window_id from Tauri, cannot be spoofed)
pub fn get_extension_id(window: &WebviewWindow, state: &State<AppState>) -> Result<String, ExtensionError> {
let window_id = window.label();
eprintln!("[webview_api] Looking up extension_id for window: {}", window_id);
let windows = state
.extension_webview_manager
.windows
.lock()
.map_err(|e| ExtensionError::MutexPoisoned {
reason: e.to_string(),
})?;
eprintln!("[webview_api] HashMap contents: {:?}", *windows);
let extension_id = windows
.get(window_id)
.cloned()
.ok_or_else(|| ExtensionError::ValidationError {
reason: format!("Window {} is not registered as an extension window", window_id),
})?;
eprintln!("[webview_api] Found extension_id: {}", extension_id);
Ok(extension_id)
}
/// Get full extension info (public_key, name, version) from window
pub fn get_extension_info_from_window(
window: &WebviewWindow,
state: &State<AppState>,
) -> Result<ExtensionInfo, ExtensionError> {
let extension_id = get_extension_id(window, state)?;
// Get extension from ExtensionManager using the database UUID
let extension = state
.extension_manager
.get_extension(&extension_id)
.ok_or_else(|| ExtensionError::ValidationError {
reason: format!("Extension with ID {} not found", extension_id),
})?;
let version = match &extension.source {
crate::extension::core::types::ExtensionSource::Production { version, .. } => version.clone(),
crate::extension::core::types::ExtensionSource::Development { .. } => "dev".to_string(),
};
Ok(ExtensionInfo {
public_key: extension.manifest.public_key,
name: extension.manifest.name,
version,
})
}

View File

@ -0,0 +1,333 @@
use crate::event_names::EVENT_EXTENSION_WINDOW_CLOSED;
use crate::extension::error::ExtensionError;
use crate::extension::ExtensionManager;
use std::collections::HashMap;
use std::sync::{Arc, Mutex};
use tauri::{AppHandle, Emitter, Manager, WebviewUrl, WebviewWindowBuilder};
/// Manages native WebviewWindows for extensions (desktop platforms only)
pub struct ExtensionWebviewManager {
/// Map: window_id -> extension_id
/// The window_id is a unique identifier (Tauri-compatible, no dashes)
/// and is also used as the Tauri WebviewWindow label
pub windows: Arc<Mutex<HashMap<String, String>>>,
}
impl ExtensionWebviewManager {
pub fn new() -> Self {
Self {
windows: Arc::new(Mutex::new(HashMap::new())),
}
}
/// Opens an extension in a native WebviewWindow
///
/// # Arguments
/// * `app_handle` - Tauri AppHandle
/// * `extension_manager` - Extension manager for access to extension data
/// * `extension_id` - ID of the extension to open
/// * `title` - Window title
/// * `width` - Window width
/// * `height` - Window height
/// * `x` - X position (optional)
/// * `y` - Y position (optional)
///
/// # Returns
/// The window_id of the created window
pub fn open_extension_window(
&self,
app_handle: &AppHandle,
extension_manager: &ExtensionManager,
extension_id: String,
title: String,
width: f64,
height: f64,
x: Option<f64>,
y: Option<f64>,
) -> Result<String, ExtensionError> {
// Get the extension from the manager
let extension = extension_manager
.get_extension(&extension_id)
.ok_or_else(|| ExtensionError::NotFound {
public_key: "".to_string(),
name: extension_id.clone(),
})?;
// Generate the URL for the extension (analogous to the frontend)
use crate::extension::core::types::ExtensionSource;
let url = match &extension.source {
ExtensionSource::Production { .. } => {
// For production extensions: custom protocol
#[cfg(target_os = "android")]
let protocol = "http";
#[cfg(not(target_os = "android"))]
let protocol = "haex-extension";
// Base64-encode the extension info (as in the frontend)
let extension_info = serde_json::json!({
"publicKey": extension.manifest.public_key,
"name": extension.manifest.name,
"version": match &extension.source {
ExtensionSource::Production { version, .. } => version,
_ => "",
}
});
let extension_info_str = serde_json::to_string(&extension_info)
.map_err(|e| ExtensionError::ValidationError {
reason: format!("Failed to serialize extension info: {}", e),
})?;
let extension_info_base64 =
base64::Engine::encode(&base64::engine::general_purpose::STANDARD, extension_info_str.as_bytes());
#[cfg(target_os = "android")]
let host = "haex-extension.localhost";
#[cfg(not(target_os = "android"))]
let host = "localhost";
let entry = extension.manifest.entry.as_deref().unwrap_or("index.html");
format!("{}://{}/{}/{}", protocol, host, extension_info_base64, entry)
}
ExtensionSource::Development { dev_server_url, .. } => {
// For dev extensions: use the dev server URL directly
dev_server_url.clone()
}
};
// Generate a unique window ID (also used as the Tauri label; dashes are not allowed)
let window_id = format!("ext_{}", uuid::Uuid::new_v4().simple());
eprintln!("Opening extension window: {} with URL: {}", window_id, url);
// Create the WebviewWindow
let webview_url = WebviewUrl::External(url.parse().map_err(|e| {
ExtensionError::ValidationError {
reason: format!("Invalid URL: {}", e),
}
})?);
#[cfg(not(any(target_os = "android", target_os = "ios")))]
let mut builder = WebviewWindowBuilder::new(app_handle, &window_id, webview_url)
.title(&title)
.inner_size(width, height)
.decorations(true) // Native decorations (title bar, etc.)
.resizable(true)
.skip_taskbar(false) // Show in taskbar
.center(); // Center the window
#[cfg(any(target_os = "android", target_os = "ios"))]
let mut builder = WebviewWindowBuilder::new(app_handle, &window_id, webview_url)
.inner_size(width, height);
// Set the position if provided (desktop only)
#[cfg(not(any(target_os = "android", target_os = "ios")))]
if let (Some(x_pos), Some(y_pos)) = (x, y) {
builder = builder.position(x_pos, y_pos);
}
// Create the window
let webview_window = builder.build().map_err(|e| ExtensionError::ValidationError {
reason: format!("Failed to create webview window: {}", e),
})?;
// Register an event listener for the window being closed
let window_id_for_event = window_id.clone();
let app_handle_for_event = app_handle.clone();
let windows_for_event = self.windows.clone();
webview_window.on_window_event(move |event| {
if let tauri::WindowEvent::Destroyed = event {
eprintln!("WebviewWindow destroyed: {}", window_id_for_event);
// Registry cleanup
if let Ok(mut windows) = windows_for_event.lock() {
windows.remove(&window_id_for_event);
}
// Emit an event to the frontend so window tracking stays in sync
let _ = app_handle_for_event.emit(EVENT_EXTENSION_WINDOW_CLOSED, &window_id_for_event);
}
});
// Store in the registry
let mut windows = self.windows.lock().map_err(|e| ExtensionError::MutexPoisoned {
reason: e.to_string(),
})?;
windows.insert(window_id.clone(), extension_id.clone());
eprintln!("Extension window opened successfully: {}", window_id);
Ok(window_id)
}
/// Closes an extension window
pub fn close_extension_window(
&self,
app_handle: &AppHandle,
window_id: &str,
) -> Result<(), ExtensionError> {
let mut windows = self.windows.lock().map_err(|e| ExtensionError::MutexPoisoned {
reason: e.to_string(),
})?;
if windows.remove(window_id).is_some() {
drop(windows); // Release lock before potentially blocking operation
// Close the webview window (desktop only)
#[cfg(not(any(target_os = "android", target_os = "ios")))]
if let Some(window) = app_handle.get_webview_window(window_id) {
window.close().map_err(|e| ExtensionError::ValidationError {
reason: format!("Failed to close window: {}", e),
})?;
}
eprintln!("Extension window closed: {}", window_id);
Ok(())
} else {
Err(ExtensionError::NotFound {
public_key: "".to_string(),
name: window_id.to_string(),
})
}
}
/// Focuses an extension window
pub fn focus_extension_window(
&self,
app_handle: &AppHandle,
window_id: &str,
) -> Result<(), ExtensionError> {
let windows = self.windows.lock().map_err(|e| ExtensionError::MutexPoisoned {
reason: e.to_string(),
})?;
let exists = windows.contains_key(window_id);
drop(windows); // Release lock
if exists {
#[cfg(not(any(target_os = "android", target_os = "ios")))]
if let Some(window) = app_handle.get_webview_window(window_id) {
window.set_focus().map_err(|e| ExtensionError::ValidationError {
reason: format!("Failed to focus window: {}", e),
})?;
// Additionally bring the window to the front
window.set_always_on_top(true).ok();
window.set_always_on_top(false).ok();
}
Ok(())
} else {
Err(ExtensionError::NotFound {
public_key: "".to_string(),
name: window_id.to_string(),
})
}
}
/// Updates the position of an extension window
pub fn update_extension_window_position(
&self,
app_handle: &AppHandle,
window_id: &str,
x: f64,
y: f64,
) -> Result<(), ExtensionError> {
let windows = self.windows.lock().map_err(|e| ExtensionError::MutexPoisoned {
reason: e.to_string(),
})?;
let exists = windows.contains_key(window_id);
drop(windows); // Release lock
if exists {
#[cfg(not(any(target_os = "android", target_os = "ios")))]
if let Some(window) = app_handle.get_webview_window(window_id) {
use tauri::Position;
window
.set_position(Position::Physical(tauri::PhysicalPosition {
x: x as i32,
y: y as i32,
}))
.map_err(|e| ExtensionError::ValidationError {
reason: format!("Failed to set window position: {}", e),
})?;
}
Ok(())
} else {
Err(ExtensionError::NotFound {
public_key: "".to_string(),
name: window_id.to_string(),
})
}
}
/// Updates the size of an extension window
pub fn update_extension_window_size(
&self,
app_handle: &AppHandle,
window_id: &str,
width: f64,
height: f64,
) -> Result<(), ExtensionError> {
let windows = self.windows.lock().map_err(|e| ExtensionError::MutexPoisoned {
reason: e.to_string(),
})?;
let exists = windows.contains_key(window_id);
drop(windows); // Release lock
if exists {
#[cfg(not(any(target_os = "android", target_os = "ios")))]
if let Some(window) = app_handle.get_webview_window(window_id) {
use tauri::Size;
window
.set_size(Size::Physical(tauri::PhysicalSize {
width: width as u32,
height: height as u32,
}))
.map_err(|e| ExtensionError::ValidationError {
reason: format!("Failed to set window size: {}", e),
})?;
}
Ok(())
} else {
Err(ExtensionError::NotFound {
public_key: "".to_string(),
name: window_id.to_string(),
})
}
}
/// Emits an event to all extension webview windows
pub fn emit_to_all_extensions<S: serde::Serialize + Clone>(
&self,
app_handle: &AppHandle,
event: &str,
payload: S,
) -> Result<(), ExtensionError> {
let windows = self.windows.lock().map_err(|e| ExtensionError::MutexPoisoned {
reason: e.to_string(),
})?;
eprintln!("[Manager] Emitting event '{}' to {} webview windows", event, windows.len());
// Iterate over all window IDs
for window_id in windows.keys() {
eprintln!("[Manager] Trying to emit to window: {}", window_id);
#[cfg(not(any(target_os = "android", target_os = "ios")))]
if let Some(window) = app_handle.get_webview_window(window_id) {
// Emit event to this specific webview window
match window.emit(event, payload.clone()) {
Ok(_) => eprintln!("[Manager] Successfully emitted event '{}' to window {}", event, window_id),
Err(e) => eprintln!("[Manager] Failed to emit event {} to window {}: {}", event, window_id, e),
}
} else {
eprintln!("[Manager] Window not found: {}", window_id);
}
}
Ok(())
}
}
impl Default for ExtensionWebviewManager {
fn default() -> Self {
Self::new()
}
}
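For orientation, here is a minimal TypeScript sketch of how the host frontend might drive these desktop window commands through Tauri's invoke API. The command names match the invoke_handler registration shown further below; the argument names are assumptions, since the #[tauri::command] wrappers themselves are not part of this diff.

// Sketch only — argument names (extensionId, windowId, title, width, height) are assumed.
import { invoke } from '@tauri-apps/api/core'

export async function openExtensionWindow(extensionId: string): Promise<string> {
  // Resolves to the window_id returned by open_extension_window on the Rust side
  return invoke<string>('open_extension_webview_window', {
    extensionId,
    title: 'My Extension',
    width: 800,
    height: 600,
  })
}

export async function focusExtensionWindow(windowId: string): Promise<void> {
  await invoke('focus_extension_webview_window', { windowId })
}

export async function closeExtensionWindow(windowId: string): Promise<void> {
  await invoke('close_extension_webview_window', { windowId })
}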

View File

@ -0,0 +1,8 @@
pub mod database;
pub mod filesystem;
pub mod helpers;
pub mod manager;
pub mod web;
// Re-export manager types
pub use manager::ExtensionWebviewManager;

View File

@ -0,0 +1,266 @@
use crate::extension::core::protocol::ExtensionInfo;
use crate::extension::error::ExtensionError;
use crate::extension::permissions::manager::PermissionManager;
use crate::extension::permissions::types::{Action, DbAction, FsAction};
use crate::AppState;
use base64::Engine;
use serde::{Deserialize, Serialize};
use tauri::{State, WebviewWindow};
use tauri_plugin_http::reqwest;
use super::helpers::{get_extension_id, get_extension_info_from_window};
// ============================================================================
// Types for SDK communication
// ============================================================================
#[derive(Debug, Clone, Serialize, Deserialize)]
#[serde(rename_all = "camelCase")]
pub struct ApplicationContext {
pub theme: String,
pub locale: String,
pub platform: String,
}
// ============================================================================
// Extension Info Command
// ============================================================================
#[tauri::command]
pub fn webview_extension_get_info(
window: WebviewWindow,
state: State<'_, AppState>,
) -> Result<ExtensionInfo, ExtensionError> {
get_extension_info_from_window(&window, &state)
}
// ============================================================================
// Context API Commands
// ============================================================================
#[tauri::command]
pub fn webview_extension_context_get(
state: State<'_, AppState>,
) -> Result<ApplicationContext, ExtensionError> {
let context = state.context.lock().map_err(|e| ExtensionError::ValidationError {
reason: format!("Failed to lock context: {}", e),
})?;
Ok(context.clone())
}
#[tauri::command]
pub fn webview_extension_context_set(
state: State<'_, AppState>,
context: ApplicationContext,
) -> Result<(), ExtensionError> {
let mut current_context = state.context.lock().map_err(|e| ExtensionError::ValidationError {
reason: format!("Failed to lock context: {}", e),
})?;
*current_context = context;
Ok(())
}
// ============================================================================
// Permission API Commands
// ============================================================================
#[tauri::command]
pub async fn webview_extension_check_web_permission(
window: WebviewWindow,
state: State<'_, AppState>,
url: String,
) -> Result<bool, ExtensionError> {
let extension_id = get_extension_id(&window, &state)?;
match PermissionManager::check_web_permission(&state, &extension_id, &url).await {
Ok(_) => Ok(true),
Err(_) => Ok(false),
}
}
#[tauri::command]
pub async fn webview_extension_check_database_permission(
window: WebviewWindow,
state: State<'_, AppState>,
resource: String,
operation: String,
) -> Result<bool, ExtensionError> {
let extension_id = get_extension_id(&window, &state)?;
let action = match operation.as_str() {
"read" => Action::Database(DbAction::Read),
"write" => Action::Database(DbAction::ReadWrite),
_ => return Ok(false),
};
match PermissionManager::check_database_permission(&state, &extension_id, action, &resource).await {
Ok(_) => Ok(true),
Err(_) => Ok(false),
}
}
#[tauri::command]
pub async fn webview_extension_check_filesystem_permission(
window: WebviewWindow,
state: State<'_, AppState>,
path: String,
action_str: String,
) -> Result<bool, ExtensionError> {
let extension_id = get_extension_id(&window, &state)?;
let action = match action_str.as_str() {
"read" => Action::Filesystem(FsAction::Read),
"write" => Action::Filesystem(FsAction::ReadWrite),
_ => return Ok(false),
};
let path_buf = std::path::Path::new(&path);
match PermissionManager::check_filesystem_permission(&state, &extension_id, action, path_buf).await {
Ok(_) => Ok(true),
Err(_) => Ok(false),
}
}
// ============================================================================
// Web API Commands
// ============================================================================
#[tauri::command]
pub async fn webview_extension_web_open(
window: WebviewWindow,
state: State<'_, AppState>,
url: String,
) -> Result<(), ExtensionError> {
let extension_id = get_extension_id(&window, &state)?;
// Validate URL format
let parsed_url = url::Url::parse(&url).map_err(|e| ExtensionError::WebError {
reason: format!("Invalid URL: {}", e),
})?;
// Only allow http and https URLs
let scheme = parsed_url.scheme();
if scheme != "http" && scheme != "https" {
return Err(ExtensionError::WebError {
reason: format!("Unsupported URL scheme: {}. Only http and https are allowed.", scheme),
});
}
// Check web permissions
PermissionManager::check_web_permission(&state, &extension_id, &url).await?;
// Open URL in default browser using tauri-plugin-opener
tauri_plugin_opener::open_url(&url, None::<&str>).map_err(|e| ExtensionError::WebError {
reason: format!("Failed to open URL in browser: {}", e),
})?;
Ok(())
}
#[tauri::command]
pub async fn webview_extension_web_request(
window: WebviewWindow,
state: State<'_, AppState>,
url: String,
method: Option<String>,
headers: Option<serde_json::Value>,
body: Option<String>,
) -> Result<serde_json::Value, ExtensionError> {
let extension_id = get_extension_id(&window, &state)?;
// Check permission first
PermissionManager::check_web_permission(&state, &extension_id, &url).await?;
// Build request
let method = method.unwrap_or_else(|| "GET".to_string());
let client = reqwest::Client::new();
let mut request = match method.to_uppercase().as_str() {
"GET" => client.get(&url),
"POST" => client.post(&url),
"PUT" => client.put(&url),
"DELETE" => client.delete(&url),
"PATCH" => client.patch(&url),
_ => {
return Err(ExtensionError::ValidationError {
reason: format!("Unsupported HTTP method: {}", method),
})
}
};
// Add headers
if let Some(headers) = headers {
if let Some(headers_obj) = headers.as_object() {
for (key, value) in headers_obj {
if let Some(value_str) = value.as_str() {
request = request.header(key, value_str);
}
}
}
}
// Add body
if let Some(body) = body {
request = request.body(body);
}
// Execute request
let response = request
.send()
.await
.map_err(|e| ExtensionError::ValidationError {
reason: format!("HTTP request failed: {}", e),
})?;
let status = response.status().as_u16();
let headers_map = response.headers().clone();
// Get response body as bytes
let body_bytes = response
.bytes()
.await
.map_err(|e| ExtensionError::ValidationError {
reason: format!("Failed to read response body: {}", e),
})?;
// Encode body as base64
let body_base64 = base64::Engine::encode(&base64::engine::general_purpose::STANDARD, &body_bytes);
// Convert headers to JSON
let mut headers_json = serde_json::Map::new();
for (key, value) in headers_map.iter() {
if let Ok(value_str) = value.to_str() {
headers_json.insert(
key.to_string(),
serde_json::Value::String(value_str.to_string()),
);
}
}
Ok(serde_json::json!({
"status": status,
"headers": headers_json,
"body": body_base64,
"ok": status >= 200 && status < 300
}))
}
/// Broadcasts an event to all extension webview windows
#[tauri::command]
pub async fn webview_extension_emit_to_all(
app_handle: tauri::AppHandle,
state: State<'_, AppState>,
event: String,
payload: serde_json::Value,
) -> Result<(), ExtensionError> {
#[cfg(not(any(target_os = "android", target_os = "ios")))]
{
state.extension_webview_manager.emit_to_all_extensions(
&app_handle,
&event,
payload,
)?;
}
Ok(())
}
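A rough TypeScript sketch of how an extension running in a native webview window could use the commands above: check the web permission first, then perform the request and decode the base64 body. The argument and response field names (url, method, status, headers, body, ok) are taken from the command signatures; the decoding helper itself is illustrative and not part of the SDK shown here.

// Sketch only — minimal error handling; atob is fine for text, while a TextDecoder over
// a Uint8Array would be more robust for binary or non-Latin-1 payloads.
import { invoke } from '@tauri-apps/api/core'

interface WebResponse {
  status: number
  ok: boolean
  headers: Record<string, string>
  body: string // base64-encoded response body
}

export async function fetchViaHost(url: string): Promise<string> {
  const allowed = await invoke<boolean>('webview_extension_check_web_permission', { url })
  if (!allowed) {
    throw new Error('Web permission denied for ' + url)
  }
  const res = await invoke<WebResponse>('webview_extension_web_request', {
    url,
    method: 'GET',
  })
  if (!res.ok) {
    throw new Error('Request failed with status ' + res.status)
  }
  return atob(res.body)
}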

View File

@ -1,7 +1,14 @@
mod crdt;
mod database;
mod extension;
use crate::{crdt::hlc::HlcService, database::DbConnection, extension::core::ExtensionManager};
use crate::{
crdt::hlc::HlcService,
database::DbConnection,
extension::core::ExtensionManager,
};
#[cfg(not(any(target_os = "android", target_os = "ios")))]
use crate::extension::webview::ExtensionWebviewManager;
use std::sync::{Arc, Mutex};
use tauri::Manager;
@ -9,10 +16,17 @@ pub mod table_names {
include!(concat!(env!("OUT_DIR"), "/tableNames.rs"));
}
pub mod event_names {
include!(concat!(env!("OUT_DIR"), "/eventNames.rs"));
}
pub struct AppState {
pub db: DbConnection,
pub hlc: Mutex<HlcService>,
pub extension_manager: ExtensionManager,
#[cfg(not(any(target_os = "android", target_os = "ios")))]
pub extension_webview_manager: ExtensionWebviewManager,
pub context: Arc<Mutex<extension::webview::web::ApplicationContext>>,
}
#[cfg_attr(mobile, tauri::mobile_entry_point)]
@ -26,7 +40,7 @@ pub fn run() {
let state = app_handle.state::<AppState>();
// Call the handler with all required parameters
match extension::core::extension_protocol_handler(state, &app_handle, &request) {
match extension::core::extension_protocol_handler(state, app_handle, &request) {
Ok(response) => response,
Err(e) => {
eprintln!(
@ -38,11 +52,10 @@ pub fn run() {
.status(500)
.header("Content-Type", "text/plain")
.body(Vec::from(format!(
"Interner Serverfehler im Protokollhandler: {}",
e
"Interner Serverfehler im Protokollhandler: {e}"
)))
.unwrap_or_else(|build_err| {
eprintln!("Konnte Fehler-Response nicht erstellen: {}", build_err);
eprintln!("Konnte Fehler-Response nicht erstellen: {build_err}");
tauri::http::Response::builder()
.status(500)
.body(Vec::new())
@ -55,6 +68,13 @@ pub fn run() {
db: DbConnection(Arc::new(Mutex::new(None))),
hlc: Mutex::new(HlcService::new()),
extension_manager: ExtensionManager::new(),
#[cfg(not(any(target_os = "android", target_os = "ios")))]
extension_webview_manager: ExtensionWebviewManager::new(),
context: Arc::new(Mutex::new(extension::webview::web::ApplicationContext {
theme: "dark".to_string(),
locale: "en".to_string(),
platform: std::env::consts::OS.to_string(),
})),
})
//.manage(ExtensionState::default())
.plugin(tauri_plugin_dialog::init())
@ -68,6 +88,7 @@ pub fn run() {
.invoke_handler(tauri::generate_handler![
database::create_encrypted_database,
database::delete_vault,
database::move_vault_to_trash,
database::list_vaults,
database::open_encrypted_database,
database::sql_execute_with_crdt,
@ -78,6 +99,11 @@ pub fn run() {
database::vault_exists,
extension::database::extension_sql_execute,
extension::database::extension_sql_select,
extension::web::extension_web_fetch,
extension::web::extension_web_open,
extension::permissions::check::check_web_permission,
extension::permissions::check::check_database_permission,
extension::permissions::check::check_filesystem_permission,
extension::get_all_dev_extensions,
extension::get_all_extensions,
extension::get_extension_info,
@ -87,6 +113,41 @@ pub fn run() {
extension::preview_extension,
extension::remove_dev_extension,
extension::remove_extension,
#[cfg(not(any(target_os = "android", target_os = "ios")))]
extension::open_extension_webview_window,
#[cfg(not(any(target_os = "android", target_os = "ios")))]
extension::close_extension_webview_window,
#[cfg(not(any(target_os = "android", target_os = "ios")))]
extension::focus_extension_webview_window,
#[cfg(not(any(target_os = "android", target_os = "ios")))]
extension::update_extension_webview_window_position,
#[cfg(not(any(target_os = "android", target_os = "ios")))]
extension::update_extension_webview_window_size,
// WebView API commands (for native window extensions, desktop only)
#[cfg(not(any(target_os = "android", target_os = "ios")))]
extension::webview::web::webview_extension_get_info,
extension::webview::web::webview_extension_context_get,
extension::webview::web::webview_extension_context_set,
#[cfg(not(any(target_os = "android", target_os = "ios")))]
extension::webview::database::webview_extension_db_query,
#[cfg(not(any(target_os = "android", target_os = "ios")))]
extension::webview::database::webview_extension_db_execute,
#[cfg(not(any(target_os = "android", target_os = "ios")))]
extension::webview::web::webview_extension_check_web_permission,
#[cfg(not(any(target_os = "android", target_os = "ios")))]
extension::webview::web::webview_extension_check_database_permission,
#[cfg(not(any(target_os = "android", target_os = "ios")))]
extension::webview::web::webview_extension_check_filesystem_permission,
#[cfg(not(any(target_os = "android", target_os = "ios")))]
extension::webview::web::webview_extension_web_open,
#[cfg(not(any(target_os = "android", target_os = "ios")))]
extension::webview::web::webview_extension_web_request,
#[cfg(not(any(target_os = "android", target_os = "ios")))]
extension::webview::web::webview_extension_emit_to_all,
#[cfg(not(any(target_os = "android", target_os = "ios")))]
extension::webview::filesystem::webview_extension_fs_save_file,
#[cfg(not(any(target_os = "android", target_os = "ios")))]
extension::webview::filesystem::webview_extension_fs_open_file,
])
.run(tauri::generate_context!())
.expect("error while running tauri application");
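Given the ApplicationContext managed state registered above (theme, locale, platform, serialized in camelCase), a webview extension could read and update it roughly as follows. This is a sketch against the registered webview_extension_context_get / webview_extension_context_set commands, not code from this repository.

// Sketch only — mirrors the ApplicationContext struct from webview/web.rs.
import { invoke } from '@tauri-apps/api/core'

interface ApplicationContext {
  theme: string
  locale: string
  platform: string
}

export async function getContext(): Promise<ApplicationContext> {
  return invoke<ApplicationContext>('webview_extension_context_get')
}

export async function setTheme(theme: string): Promise<void> {
  const context = await getContext()
  await invoke('webview_extension_context_set', { context: { ...context, theme } })
}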

View File

@ -1,16 +1,17 @@
{
"$schema": "https://schema.tauri.app/config/2",
"productName": "haex-hub",
"version": "0.1.0",
"version": "0.1.13",
"identifier": "space.haex.hub",
"build": {
"beforeDevCommand": "pnpm dev",
"devUrl": "http://localhost:3003",
"beforeBuildCommand": "pnpm generate",
"frontendDist": "../dist"
"frontendDist": "../.output/public"
},
"app": {
"withGlobalTauri": true,
"windows": [
{
"title": "haex-hub",
@ -20,16 +21,21 @@
],
"security": {
"csp": {
"default-src": ["'self'", "http://tauri.localhost", "haex-extension:"],
"default-src": ["'self'", "http://tauri.localhost", "https://tauri.localhost", "asset:", "haex-extension:"],
"script-src": [
"'self'",
"http://tauri.localhost",
"https://tauri.localhost",
"asset:",
"haex-extension:",
"'wasm-unsafe-eval'"
"'wasm-unsafe-eval'",
"'unsafe-inline'"
],
"style-src": [
"'self'",
"http://tauri.localhost",
"https://tauri.localhost",
"asset:",
"haex-extension:",
"'unsafe-inline'"
],
@ -44,20 +50,22 @@
"img-src": [
"'self'",
"http://tauri.localhost",
"https://tauri.localhost",
"asset:",
"haex-extension:",
"data:",
"blob:"
],
"font-src": ["'self'", "http://tauri.localhost", "haex-extension:"],
"font-src": ["'self'", "http://tauri.localhost", "https://tauri.localhost", "asset:", "haex-extension:"],
"object-src": ["'none'"],
"media-src": ["'self'", "http://tauri.localhost", "haex-extension:"],
"media-src": ["'self'", "http://tauri.localhost", "https://tauri.localhost", "asset:", "haex-extension:"],
"frame-src": ["haex-extension:"],
"frame-ancestors": ["'none'"],
"base-uri": ["'self'"]
},
"assetProtocol": {
"enable": true,
"scope": ["$APPDATA", "$RESOURCE"]
"scope": ["$APPDATA", "$RESOURCE", "$APPLOCALDATA/**"]
}
}
},
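With the asset protocol scope now including $APPLOCALDATA/**, files under the app's local data directory can be turned into asset: URLs that the img-src rule above permits. A small sketch, where the file name is purely hypothetical:

// Sketch only — 'icon.png' is an invented file name used for illustration.
import { convertFileSrc } from '@tauri-apps/api/core'
import { appLocalDataDir, join } from '@tauri-apps/api/path'

export async function localAssetUrl(relativePath = 'icon.png'): Promise<string> {
  const base = await appLocalDataDir()
  const absolute = await join(base, relativePath)
  return convertFileSrc(absolute) // yields an asset: URL usable in <img src>
}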

View File

@ -3,6 +3,8 @@ export default defineAppConfig({
colors: {
primary: 'sky',
secondary: 'fuchsia',
warning: 'yellow',
danger: 'red',
},
},
})

View File

@ -14,18 +14,44 @@
@apply cursor-not-allowed;
}
/* Define safe-area-insets as CSS custom properties for JavaScript access */
:root {
--safe-area-inset-top: env(safe-area-inset-top, 0px);
--safe-area-inset-bottom: env(safe-area-inset-bottom, 0px);
--safe-area-inset-left: env(safe-area-inset-left, 0px);
--safe-area-inset-right: env(safe-area-inset-right, 0px);
}
/* Prevent scrolling on html and body */
html {
overflow: hidden;
margin: 0;
padding: 0;
height: 100dvh;
height: 100vh; /* Fallback */
width: 100%;
}
body {
overflow: hidden;
margin: 0;
height: 100%;
width: 100%;
padding: 0;
}
#__nuxt {
/* Ensures the app always takes the full height */
min-height: 100vh;
/* Full height of the body */
height: 100%;
width: 100%;
/* Ensures padding does not overflow the height */
@apply box-border;
/* Safe-area paddings on the root element so EVERYTHING benefits from them */
padding-top: var(--safe-area-inset-top);
padding-bottom: var(--safe-area-inset-bottom);
padding-left: var(--safe-area-inset-left);
padding-right: var(--safe-area-inset-right);
/* The safe-area paddings */
padding-top: env(safe-area-inset-top);
padding-bottom: env(safe-area-inset-bottom);
padding-left: env(safe-area-inset-left);
padding-right: env(safe-area-inset-right);
box-sizing: border-box;
}
}
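Since the safe-area insets are exposed as CSS custom properties specifically for JavaScript access, reading them from script is a one-liner per side; a minimal sketch:

// Reads the --safe-area-inset-* custom properties defined on :root above.
export function getSafeAreaInsets() {
  const style = getComputedStyle(document.documentElement)
  const px = (name: string) => parseFloat(style.getPropertyValue(name)) || 0
  return {
    top: px('--safe-area-inset-top'),
    bottom: px('--safe-area-inset-bottom'),
    left: px('--safe-area-inset-left'),
    right: px('--safe-area-inset-right'),
  }
}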

View File

@ -0,0 +1,61 @@
<template>
<div
v-if="data"
class="fixed top-2 right-2 bg-black/90 text-white text-xs p-3 rounded-lg shadow-2xl max-w-sm z-[9999] backdrop-blur-sm"
>
<div class="flex justify-between items-start gap-3 mb-2">
<span class="font-bold text-sm">{{ title }}</span>
<div class="flex gap-1">
<button
class="bg-white/20 hover:bg-white/30 px-2 py-1 rounded text-xs transition-colors"
@click="copyToClipboardAsync"
>
Copy
</button>
<button
v-if="dismissible"
class="bg-white/20 hover:bg-white/30 px-2 py-1 rounded text-xs transition-colors"
@click="handleDismiss"
>
✕
</button>
</div>
</div>
<pre class="text-xs whitespace-pre-wrap font-mono overflow-auto max-h-96">{{ formattedData }}</pre>
</div>
</template>
<script setup lang="ts">
const props = withDefaults(
defineProps<{
data: Record<string, any> | null
title?: string
dismissible?: boolean
}>(),
{
title: 'Debug Info',
dismissible: false,
},
)
const emit = defineEmits<{
dismiss: []
}>()
const formattedData = computed(() => {
if (!props.data) return ''
return JSON.stringify(props.data, null, 2)
})
const copyToClipboardAsync = async () => {
try {
await navigator.clipboard.writeText(formattedData.value)
} catch (err) {
console.error('Failed to copy debug info:', err)
}
}
const handleDismiss = () => {
emit('dismiss')
}
</script>

View File

@ -86,6 +86,21 @@ const extension = computed(() => {
})
const handleIframeLoad = () => {
console.log('[ExtensionFrame] Iframe loaded successfully for:', extension.value?.name)
// Try to inject a test script to see if JavaScript execution works
try {
if (iframeRef.value?.contentWindow) {
console.log('[ExtensionFrame] Iframe has contentWindow access')
// This will fail with sandboxed iframes without allow-same-origin
console.log('[ExtensionFrame] Iframe origin:', iframeRef.value.contentWindow.location.href)
} else {
console.warn('[ExtensionFrame] Iframe contentWindow is null/undefined')
}
} catch (e) {
console.warn('[ExtensionFrame] Cannot access iframe content (expected with sandbox):', e)
}
// Delay the fade-in slightly to allow window animation to mostly complete
setTimeout(() => {
isLoading.value = false
@ -102,13 +117,28 @@ const sandboxAttributes = computed(() => {
// Generate extension URL
const extensionUrl = computed(() => {
if (!extension.value) return ''
if (!extension.value) {
console.log('[ExtensionFrame] No extension found')
return ''
}
const { publicKey, name, version, devServerUrl } = extension.value
const assetPath = 'index.html'
console.log('[ExtensionFrame] Generating URL for extension:', {
name,
publicKey: publicKey?.substring(0, 10) + '...',
version,
devServerUrl,
platform,
})
if (!publicKey || !name || !version) {
console.error('Missing required extension fields')
console.error('[ExtensionFrame] Missing required extension fields:', {
hasPublicKey: !!publicKey,
hasName: !!name,
hasVersion: !!version,
})
return ''
}
@ -116,7 +146,9 @@ const extensionUrl = computed(() => {
if (devServerUrl) {
const cleanUrl = devServerUrl.replace(/\/$/, '')
const cleanPath = assetPath.replace(/^\//, '')
return cleanPath ? `${cleanUrl}/${cleanPath}` : cleanUrl
const url = cleanPath ? `${cleanUrl}/${cleanPath}` : cleanUrl
console.log('[ExtensionFrame] Using dev server URL:', url)
return url
}
const extensionInfo = {
@ -126,13 +158,18 @@ const extensionUrl = computed(() => {
}
const encodedInfo = btoa(JSON.stringify(extensionInfo))
let url = ''
if (platform === 'android' || platform === 'windows') {
// Android/Windows: Tauri uses the http://{scheme}.localhost format
return `http://${EXTENSION_PROTOCOL_NAME}.localhost/${encodedInfo}/${assetPath}`
url = `http://${EXTENSION_PROTOCOL_NAME}.localhost/${encodedInfo}/${assetPath}`
console.log('[ExtensionFrame] Generated Android/Windows URL:', url)
} else {
// Desktop: Use custom protocol with base64 as host
return `${EXTENSION_PROTOCOL_PREFIX}${encodedInfo}/${assetPath}`
url = `${EXTENSION_PROTOCOL_PREFIX}${encodedInfo}/${assetPath}`
console.log('[ExtensionFrame] Generated Desktop URL:', url)
}
return url
})
const retryLoad = () => {
@ -150,19 +187,28 @@ onMounted(() => {
// Wait for iframe to be ready
if (iframeRef.value && extension.value) {
console.log(
'[ExtensionFrame] Manually registering iframe on mount',
'[ExtensionFrame] Component MOUNTED',
extension.value.name,
'windowId:',
props.windowId,
)
registerExtensionIFrame(iframeRef.value, extension.value, props.windowId)
} else {
console.warn('[ExtensionFrame] Component mounted but missing iframe or extension:', {
hasIframe: !!iframeRef.value,
hasExtension: !!extension.value,
})
}
})
// Explicit cleanup before unmount
onBeforeUnmount(() => {
console.log('[ExtensionFrame] Component UNMOUNTING', {
extensionId: props.extensionId,
windowId: props.windowId,
hasIframe: !!iframeRef.value,
})
if (iframeRef.value) {
console.log('[ExtensionFrame] Unregistering iframe on unmount')
unregisterExtensionIFrame(iframeRef.value)
}
})

View File

@ -18,35 +18,32 @@
@pointerdown.left="handlePointerDown"
@pointermove="handlePointerMove"
@pointerup="handlePointerUp"
@dragstart.prevent
@click.left="handleClick"
@dblclick="handleDoubleClick"
>
<div class="flex flex-col items-center gap-2 p-3 group">
<div
:class="[
'w-20 h-20 flex items-center justify-center rounded-2xl transition-all duration-200 ease-out',
'flex items-center justify-center rounded-2xl transition-all duration-200 ease-out',
'backdrop-blur-sm border',
isSelected
? 'bg-white/95 dark:bg-gray-800/95 border-blue-500 dark:border-blue-400 shadow-lg scale-105'
: 'bg-white/80 dark:bg-gray-800/80 border-gray-200/50 dark:border-gray-700/50 hover:bg-white/90 dark:hover:bg-gray-800/90 hover:border-gray-300 dark:hover:border-gray-600 hover:shadow-md hover:scale-105',
]"
:style="{ width: `${containerSize}px`, height: `${containerSize}px` }"
>
<img
v-if="icon"
:src="icon"
:alt="label"
class="w-14 h-14 object-contain transition-transform duration-200"
:class="{ 'scale-110': isSelected }"
/>
<UIcon
v-else
name="i-heroicons-puzzle-piece-solid"
<HaexIcon
:name="icon || 'i-heroicons-puzzle-piece-solid'"
:class="[
'w-14 h-14 transition-all duration-200',
isSelected
? 'text-blue-500 dark:text-blue-400 scale-110'
: 'text-gray-400 dark:text-gray-500 group-hover:text-gray-500 dark:group-hover:text-gray-400',
'object-contain transition-all duration-200',
isSelected && 'scale-110',
!icon &&
(isSelected
? 'text-blue-500 dark:text-blue-400'
: 'text-gray-400 dark:text-gray-500 group-hover:text-gray-500 dark:group-hover:text-gray-400'),
]"
:style="{ width: `${innerIconSize}px`, height: `${innerIconSize}px` }"
/>
</div>
<span
@ -79,15 +76,19 @@ const props = defineProps<{
const emit = defineEmits<{
positionChanged: [id: string, x: number, y: number]
dragStart: [id: string, itemType: string, referenceId: string]
dragStart: [id: string, itemType: string, referenceId: string, width: number, height: number, x: number, y: number]
dragging: [id: string, x: number, y: number]
dragEnd: []
}>()
const desktopStore = useDesktopStore()
const { effectiveIconSize } = storeToRefs(desktopStore)
const showUninstallDialog = ref(false)
const { t } = useI18n()
const isSelected = computed(() => desktopStore.isItemSelected(props.id))
const containerSize = computed(() => effectiveIconSize.value) // Container size
const innerIconSize = computed(() => effectiveIconSize.value * 0.7) // Inner icon is 70% of container
const handleClick = (e: MouseEvent) => {
// Prevent selection during drag
@ -131,9 +132,40 @@ const isDragging = ref(false)
const offsetX = ref(0)
const offsetY = ref(0)
// Icon dimensions (approximate)
const iconWidth = 120 // Matches design in template
const iconHeight = 140
// Track actual icon dimensions dynamically
const { width: iconWidth, height: iconHeight } = useElementSize(draggableEl)
// Re-center icon position when dimensions are measured
watch([iconWidth, iconHeight], async ([width, height]) => {
if (width > 0 && height > 0) {
console.log('📐 Icon dimensions measured:', {
label: props.label,
width,
height,
currentPosition: { x: x.value, y: y.value },
gridCellSize: desktopStore.gridCellSize,
})
// Re-snap to grid with actual dimensions to ensure proper centering
const snapped = desktopStore.snapToGrid(x.value, y.value, width, height)
console.log('📍 Snapped position:', {
label: props.label,
oldPosition: { x: x.value, y: y.value },
newPosition: snapped,
})
const oldX = x.value
const oldY = y.value
x.value = snapped.x
y.value = snapped.y
// Save corrected position to database if it changed
if (oldX !== snapped.x || oldY !== snapped.y) {
emit('positionChanged', props.id, snapped.x, snapped.y)
}
}
}, { once: true }) // Only run once when dimensions are first measured
const style = computed(() => ({
position: 'absolute' as const,
@ -145,8 +177,11 @@ const style = computed(() => ({
const handlePointerDown = (e: PointerEvent) => {
if (!draggableEl.value || !draggableEl.value.parentElement) return
// Prevent any text selection during drag
e.preventDefault()
isDragging.value = true
emit('dragStart', props.id, props.itemType, props.referenceId)
emit('dragStart', props.id, props.itemType, props.referenceId, iconWidth.value, iconHeight.value, x.value, y.value)
// Get parent offset to convert from viewport coordinates to parent-relative coordinates
const parentRect = draggableEl.value.parentElement.getBoundingClientRect()
@ -165,8 +200,15 @@ const handlePointerMove = (e: PointerEvent) => {
const newX = e.clientX - parentRect.left - offsetX.value
const newY = e.clientY - parentRect.top - offsetY.value
x.value = newX
y.value = newY
// Clamp position to viewport bounds during drag
const maxX = viewportSize ? Math.max(0, viewportSize.width.value - iconWidth.value) : Number.MAX_SAFE_INTEGER
const maxY = viewportSize ? Math.max(0, viewportSize.height.value - iconHeight.value) : Number.MAX_SAFE_INTEGER
x.value = Math.max(0, Math.min(maxX, newX))
y.value = Math.max(0, Math.min(maxY, newY))
// Emit current position during drag
emit('dragging', props.id, x.value, y.value)
}
const handlePointerUp = (e: PointerEvent) => {
@ -177,10 +219,15 @@ const handlePointerUp = (e: PointerEvent) => {
draggableEl.value.releasePointerCapture(e.pointerId)
}
// Snap to grid with icon dimensions
const snapped = desktopStore.snapToGrid(x.value, y.value, iconWidth.value, iconHeight.value)
x.value = snapped.x
y.value = snapped.y
// Snap icon to viewport bounds if outside
if (viewportSize) {
const maxX = Math.max(0, viewportSize.width.value - iconWidth)
const maxY = Math.max(0, viewportSize.height.value - iconHeight)
const maxX = Math.max(0, viewportSize.width.value - iconWidth.value)
const maxY = Math.max(0, viewportSize.height.value - iconHeight.value)
x.value = Math.max(0, Math.min(maxX, x.value))
y.value = Math.max(0, Math.min(maxY, y.value))
}

View File

@ -1,7 +1,7 @@
<template>
<div
ref="desktopEl"
class="w-full h-full relative overflow-hidden"
class="absolute inset-0 overflow-hidden"
>
<Swiper
:modules="[SwiperNavigation]"
@ -13,7 +13,7 @@
:no-swiping="true"
no-swiping-class="no-swipe"
:allow-touch-move="allowSwipe"
class="w-full h-full"
class="h-full w-full"
direction="vertical"
@swiper="onSwiperInit"
@slide-change="onSlideChange"
@ -23,89 +23,165 @@
:key="workspace.id"
class="w-full h-full"
>
<div
class="w-full h-full relative"
@click.self.stop="handleDesktopClick"
@mousedown.left.self="handleAreaSelectStart"
@dragover.prevent="handleDragOver"
@drop.prevent="handleDrop($event, workspace.id)"
>
<!-- Grid Pattern Background -->
<UContextMenu :items="getWorkspaceContextMenuItems(workspace.id)">
<div
class="absolute inset-0 pointer-events-none opacity-30"
:style="{
backgroundImage:
'linear-gradient(rgba(0, 0, 0, 0.1) 1px, transparent 1px), linear-gradient(90deg, rgba(0, 0, 0, 0.1) 1px, transparent 1px)',
backgroundSize: '32px 32px',
}"
/>
<!-- Snap Dropzones (only visible when window drag near edge) -->
<div
class="absolute left-0 top-0 bottom-0 border-blue-500 pointer-events-none backdrop-blur-sm z-50 transition-all duration-500 ease-in-out"
:class="showLeftSnapZone ? 'w-1/2 bg-blue-500/20 border-2' : 'w-0'"
/>
<div
class="absolute right-0 top-0 bottom-0 border-blue-500 pointer-events-none backdrop-blur-sm z-50 transition-all duration-500 ease-in-out"
:class="showRightSnapZone ? 'w-1/2 bg-blue-500/20 border-2' : 'w-0'"
/>
<!-- Area Selection Box -->
<div
v-if="isAreaSelecting"
class="absolute bg-blue-500/20 border-2 border-blue-500 pointer-events-none z-30"
:style="selectionBoxStyle"
/>
<!-- Icons for this workspace -->
<HaexDesktopIcon
v-for="item in getWorkspaceIcons(workspace.id)"
:id="item.id"
:key="item.id"
:item-type="item.itemType"
:reference-id="item.referenceId"
:initial-x="item.positionX"
:initial-y="item.positionY"
:label="item.label"
:icon="item.icon"
class="no-swipe"
@position-changed="handlePositionChanged"
@drag-start="handleDragStart"
@drag-end="handleDragEnd"
/>
<!-- Windows for this workspace -->
<template
v-for="window in getWorkspaceWindows(workspace.id)"
:key="window.id"
class="w-full h-full relative select-none"
:style="getWorkspaceBackgroundStyle(workspace)"
@click.self.stop="handleDesktopClick"
@mousedown.left.self="handleAreaSelectStart"
@dragover.prevent="handleDragOver"
@drop.prevent="handleDrop($event, workspace.id)"
@selectstart.prevent
>
<!-- Overview Mode: Teleport to window preview -->
<Teleport
v-if="
windowManager.showWindowOverview &&
overviewWindowState.has(window.id)
<!-- Drop Target Zone (visible during drag) -->
<div
v-if="dropTargetZone"
class="absolute border-2 border-blue-500 bg-blue-500/10 rounded-lg pointer-events-none z-10 transition-all duration-75"
:style="{
left: `${dropTargetZone.x}px`,
top: `${dropTargetZone.y}px`,
width: `${dropTargetZone.width}px`,
height: `${dropTargetZone.height}px`,
}"
/>
<!-- Snap Dropzones (only visible when window drag near edge) -->
<div
class="absolute left-0 top-0 bottom-0 border-blue-500 pointer-events-none backdrop-blur-sm z-50 transition-all duration-500 ease-in-out"
:class="
showLeftSnapZone ? 'w-1/2 bg-blue-500/20 border-2' : 'w-0'
"
:to="`#window-preview-${window.id}`"
/>
<div
class="absolute right-0 top-0 bottom-0 border-blue-500 pointer-events-none backdrop-blur-sm z-50 transition-all duration-500 ease-in-out"
:class="
showRightSnapZone ? 'w-1/2 bg-blue-500/20 border-2' : 'w-0'
"
/>
<!-- Area Selection Box -->
<div
v-if="isAreaSelecting"
class="absolute bg-blue-500/20 border-2 border-blue-500 pointer-events-none z-30"
:style="selectionBoxStyle"
/>
<!-- Icons for this workspace -->
<HaexDesktopIcon
v-for="item in getWorkspaceIcons(workspace.id)"
:id="item.id"
:key="item.id"
:item-type="item.itemType"
:reference-id="item.referenceId"
:initial-x="item.positionX"
:initial-y="item.positionY"
:label="item.label"
:icon="item.icon"
class="no-swipe"
@position-changed="handlePositionChanged"
@drag-start="handleDragStart"
@dragging="handleDragging"
@drag-end="handleDragEnd"
/>
<!-- Windows for this workspace -->
<template
v-for="window in getWorkspaceWindows(workspace.id)"
:key="window.id"
>
<div
class="absolute origin-top-left"
:style="{
transform: `scale(${overviewWindowState.get(window.id)!.scale})`,
width: `${overviewWindowState.get(window.id)!.width}px`,
height: `${overviewWindowState.get(window.id)!.height}px`,
}"
<!-- Overview Mode: Teleport to window preview -->
<template
v-if="
windowManager.showWindowOverview &&
overviewWindowState.has(window.id)
"
>
<Teleport :to="`#window-preview-${window.id}`">
<div
class="absolute origin-top-left"
:style="{
transform: `scale(${overviewWindowState.get(window.id)!.scale})`,
width: `${overviewWindowState.get(window.id)!.width}px`,
height: `${overviewWindowState.get(window.id)!.height}px`,
}"
>
<HaexWindow
v-show="
windowManager.showWindowOverview || !window.isMinimized
"
:id="window.id"
v-model:x="overviewWindowState.get(window.id)!.x"
v-model:y="overviewWindowState.get(window.id)!.y"
v-model:width="overviewWindowState.get(window.id)!.width"
v-model:height="
overviewWindowState.get(window.id)!.height
"
:title="window.title"
:icon="window.icon"
:is-active="windowManager.isWindowActive(window.id)"
:source-x="window.sourceX"
:source-y="window.sourceY"
:source-width="window.sourceWidth"
:source-height="window.sourceHeight"
:is-opening="window.isOpening"
:is-closing="window.isClosing"
:warning-level="
window.type === 'extension' &&
availableExtensions.find(
(ext) => ext.id === window.sourceId,
)?.devServerUrl
? 'warning'
: undefined
"
class="no-swipe"
@close="windowManager.closeWindow(window.id)"
@minimize="windowManager.minimizeWindow(window.id)"
@activate="windowManager.activateWindow(window.id)"
@position-changed="
(x, y) =>
windowManager.updateWindowPosition(window.id, x, y)
"
@size-changed="
(width, height) =>
windowManager.updateWindowSize(
window.id,
width,
height,
)
"
@drag-start="handleWindowDragStart(window.id)"
@drag-end="handleWindowDragEnd"
>
<!-- System Window: Render Vue Component -->
<component
:is="getSystemWindowComponent(window.sourceId)"
v-if="window.type === 'system'"
/>
<!-- Extension Window: Render iFrame -->
<HaexDesktopExtensionFrame
v-else
:extension-id="window.sourceId"
:window-id="window.id"
/>
</HaexWindow>
</div>
</Teleport>
</template>
<!-- Desktop Mode: Render directly in workspace -->
<template v-else>
<HaexWindow
v-show="
windowManager.showWindowOverview || !window.isMinimized
"
:id="window.id"
v-model:x="overviewWindowState.get(window.id)!.x"
v-model:y="overviewWindowState.get(window.id)!.y"
v-model:width="overviewWindowState.get(window.id)!.width"
v-model:height="overviewWindowState.get(window.id)!.height"
v-model:x="window.x"
v-model:y="window.y"
v-model:width="window.width"
v-model:height="window.height"
:title="window.title"
:icon="window.icon"
:is-active="windowManager.isWindowActive(window.id)"
@ -115,6 +191,14 @@
:source-height="window.sourceHeight"
:is-opening="window.isOpening"
:is-closing="window.isClosing"
:warning-level="
window.type === 'extension' &&
availableExtensions.find(
(ext) => ext.id === window.sourceId,
)?.devServerUrl
? 'warning'
: undefined
"
class="no-swipe"
@close="windowManager.closeWindow(window.id)"
@minimize="windowManager.minimizeWindow(window.id)"
@ -143,56 +227,10 @@
:window-id="window.id"
/>
</HaexWindow>
</div>
</Teleport>
<!-- Desktop Mode: Render directly in workspace -->
<HaexWindow
v-else
v-show="windowManager.showWindowOverview || !window.isMinimized"
:id="window.id"
v-model:x="window.x"
v-model:y="window.y"
v-model:width="window.width"
v-model:height="window.height"
:title="window.title"
:icon="window.icon"
:is-active="windowManager.isWindowActive(window.id)"
:source-x="window.sourceX"
:source-y="window.sourceY"
:source-width="window.sourceWidth"
:source-height="window.sourceHeight"
:is-opening="window.isOpening"
:is-closing="window.isClosing"
class="no-swipe"
@close="windowManager.closeWindow(window.id)"
@minimize="windowManager.minimizeWindow(window.id)"
@activate="windowManager.activateWindow(window.id)"
@position-changed="
(x, y) => windowManager.updateWindowPosition(window.id, x, y)
"
@size-changed="
(width, height) =>
windowManager.updateWindowSize(window.id, width, height)
"
@drag-start="handleWindowDragStart(window.id)"
@drag-end="handleWindowDragEnd"
>
<!-- System Window: Render Vue Component -->
<component
:is="getSystemWindowComponent(window.sourceId)"
v-if="window.type === 'system'"
/>
<!-- Extension Window: Render iFrame -->
<HaexDesktopExtensionFrame
v-else
:extension-id="window.sourceId"
:window-id="window.id"
/>
</HaexWindow>
</template>
</div>
</template>
</template>
</div>
</UContextMenu>
</SwiperSlide>
</Swiper>
@ -224,8 +262,8 @@ const {
allowSwipe,
isOverviewMode,
} = storeToRefs(workspaceStore)
const { x: mouseX } = useMouse()
const { getWorkspaceBackgroundStyle, getWorkspaceContextMenuItems } =
workspaceStore
const desktopEl = useTemplateRef('desktopEl')
@ -260,9 +298,44 @@ const selectionBoxStyle = computed(() => {
// Drag state for desktop icons
const isDragging = ref(false)
const currentDraggedItemId = ref<string>()
const currentDraggedItemType = ref<string>()
const currentDraggedReferenceId = ref<string>()
const currentDraggedItem = reactive({
id: '',
itemType: '',
referenceId: '',
width: 0,
height: 0,
x: 0,
y: 0,
})
// Track mouse position for showing drop target
const { x: mouseX } = useMouse()
const dropTargetZone = computed(() => {
if (!isDragging.value) return null
// Use the actual icon position during drag
const iconX = currentDraggedItem.x
const iconY = currentDraggedItem.y
// Use snapToGrid to get the exact position where the icon will land
const snapped = desktopStore.snapToGrid(
iconX,
iconY,
currentDraggedItem.width || undefined,
currentDraggedItem.height || undefined,
)
// Show dropzone at snapped position with grid cell size
const cellSize = desktopStore.gridCellSize
return {
x: snapped.x,
y: snapped.y,
width: currentDraggedItem.width || cellSize,
height: currentDraggedItem.height || cellSize,
}
})
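// Note (illustrative, not from this diff): desktopStore.snapToGrid is defined elsewhere.
// Assuming it centers an item inside the nearest grid cell, a minimal sketch could be:
//
//   const snapToGrid = (x: number, y: number, width = gridCellSize, height = gridCellSize) => {
//     const col = Math.round(x / gridCellSize)
//     const row = Math.round(y / gridCellSize)
//     return {
//       x: col * gridCellSize + (gridCellSize - width) / 2,
//       y: row * gridCellSize + (gridCellSize - height) / 2,
//     }
//   }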
// Window drag state for snap zones
const isWindowDragging = ref(false)
@ -354,20 +427,43 @@ const handlePositionChanged = async (id: string, x: number, y: number) => {
}
}
const handleDragStart = (id: string, itemType: string, referenceId: string) => {
const handleDragStart = (
id: string,
itemType: string,
referenceId: string,
width: number,
height: number,
x: number,
y: number,
) => {
isDragging.value = true
currentDraggedItemId.value = id
currentDraggedItemType.value = itemType
currentDraggedReferenceId.value = referenceId
currentDraggedItem.id = id
currentDraggedItem.itemType = itemType
currentDraggedItem.referenceId = referenceId
currentDraggedItem.width = width
currentDraggedItem.height = height
currentDraggedItem.x = x
currentDraggedItem.y = y
allowSwipe.value = false // Disable Swiper during icon drag
}
const handleDragging = (id: string, x: number, y: number) => {
if (currentDraggedItem.id === id) {
currentDraggedItem.x = x
currentDraggedItem.y = y
}
}
const handleDragEnd = async () => {
// Cleanup drag state
isDragging.value = false
currentDraggedItemId.value = undefined
currentDraggedItemType.value = undefined
currentDraggedReferenceId.value = undefined
currentDraggedItem.id = ''
currentDraggedItem.itemType = ''
currentDraggedItem.referenceId = ''
currentDraggedItem.width = 0
currentDraggedItem.height = 0
currentDraggedItem.x = 0
currentDraggedItem.y = 0
allowSwipe.value = true // Re-enable Swiper after drag
}
@ -402,15 +498,18 @@ const handleDrop = async (event: DragEvent, workspaceId: string) => {
const desktopRect = (
event.currentTarget as HTMLElement
).getBoundingClientRect()
const x = Math.max(0, event.clientX - desktopRect.left - 32) // Center icon (64px / 2)
const y = Math.max(0, event.clientY - desktopRect.top - 32)
const rawX = Math.max(0, event.clientX - desktopRect.left - 32) // Center icon (64px / 2)
const rawY = Math.max(0, event.clientY - desktopRect.top - 32)
// Snap to grid
const snapped = desktopStore.snapToGrid(rawX, rawY)
// Create desktop icon on the specific workspace
await desktopStore.addDesktopItemAsync(
item.type as DesktopItemType,
item.id,
x,
y,
snapped.x,
snapped.y,
workspaceId,
)
} catch (error) {
@ -649,6 +748,21 @@ watch(currentWorkspace, async () => {
}
})
// Reset drag state when mouse leaves the document (fixes stuck dropzone)
useEventListener(document, 'mouseleave', () => {
if (isDragging.value) {
isDragging.value = false
currentDraggedItem.id = ''
currentDraggedItem.itemType = ''
currentDraggedItem.referenceId = ''
currentDraggedItem.width = 0
currentDraggedItem.height = 0
currentDraggedItem.x = 0
currentDraggedItem.y = 0
allowSwipe.value = true
}
})
onMounted(async () => {
// Load workspaces first
await workspaceStore.loadWorkspacesAsync()

View File

@ -0,0 +1,167 @@
<template>
<UiDrawer
v-model:open="open"
:title="t('title')"
:description="t('description')"
>
<UiButton
:label="t('button.label')"
:ui="{
base: 'px-3 py-2',
}"
icon="mdi:plus"
size="xl"
variant="outline"
block
/>
<template #content>
<div class="p-6 flex flex-col min-h-[50vh]">
<div class="flex-1 flex items-center justify-center px-4">
<UForm
:state="vault"
class="w-full max-w-md space-y-6"
>
<UFormField
:label="t('vault.label')"
name="name"
>
<UInput
v-model="vault.name"
icon="mdi:safe"
:placeholder="t('vault.placeholder')"
autofocus
size="xl"
class="w-full"
/>
</UFormField>
<UFormField
:label="t('password.label')"
name="password"
>
<UiInput
v-model="vault.password"
type="password"
icon="i-heroicons-key"
:placeholder="t('password.placeholder')"
size="xl"
class="w-full"
/>
</UFormField>
</UForm>
</div>
<div class="flex gap-3 mt-auto pt-6">
<UButton
color="neutral"
variant="outline"
block
size="xl"
@click="open = false"
>
{{ t('cancel') }}
</UButton>
<UButton
color="primary"
block
size="xl"
@click="onCreateAsync"
>
{{ t('create') }}
</UButton>
</div>
</div>
</template>
</UiDrawer>
</template>
<script setup lang="ts">
import { vaultSchema } from './schema'
const open = defineModel<boolean>('open', { default: false })
const { t } = useI18n({
useScope: 'local',
})
const vault = reactive<{
name: string
password: string
type: 'password' | 'text'
}>({
name: 'HaexVault',
password: '',
type: 'password',
})
const initVault = () => {
vault.name = 'HaexVault'
vault.password = ''
vault.type = 'password'
}
const { createAsync } = useVaultStore()
const { add } = useToast()
const check = ref(false)
const onCreateAsync = async () => {
check.value = true
const nameCheck = vaultSchema.name.safeParse(vault.name)
const passwordCheck = vaultSchema.password.safeParse(vault.password)
if (!nameCheck.success || !passwordCheck.success) return
open.value = false
try {
if (vault.name && vault.password) {
const vaultId = await createAsync({
vaultName: vault.name,
password: vault.password,
})
if (vaultId) {
initVault()
await navigateTo(
useLocaleRoute()({ name: 'desktop', params: { vaultId } }),
)
}
}
} catch (error) {
console.error(error)
add({ color: 'error', description: JSON.stringify(error) })
}
}
</script>
<i18n lang="yaml">
de:
button:
label: Vault erstellen
vault:
label: Vaultname
placeholder: Vaultname
password:
label: Passwort
placeholder: Passwort eingeben
title: Neue HaexVault erstellen
create: Erstellen
cancel: Abbrechen
description: Erstelle eine neue Vault für deine Daten
en:
button:
label: Create vault
vault:
label: Vault name
placeholder: Vault name
password:
label: Password
placeholder: Enter password
title: Create new HaexVault
create: Create
cancel: Cancel
description: Create a new vault for your data
</i18n>
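The vaultSchema imported from './schema' is not included in this diff; for orientation, a hypothetical zod-based shape that would satisfy the .name.safeParse / .password.safeParse calls above could look like this (the concrete rules are invented):

// Hypothetical sketch of ./schema — the real validation rules are not shown in this diff.
import { z } from 'zod'

export const vaultSchema = {
  name: z.string().min(1, 'Vault name is required'),
  password: z.string().min(8, 'Password must be at least 8 characters'),
}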

View File

@ -1,12 +1,11 @@
<template>
<UiDialogConfirm
<UiDrawer
v-model:open="open"
:confirm-label="t('open')"
:description="vault.path || path"
@confirm="onOpenDatabase"
:title="t('title')"
:description="path || t('description')"
>
<UiButton
:label="t('vault.open')"
:label="t('button.label')"
:ui="{
base: 'px-3 py-2',
}"
@ -16,37 +15,63 @@
block
/>
<template #title>
<i18n-t
keypath="title"
tag="p"
class="flex gap-x-2 text-wrap"
>
<template #haexvault>
<UiTextGradient>HaexVault</UiTextGradient>
</template>
</i18n-t>
</template>
<template #content>
<div class="p-6 flex flex-col min-h-[50vh]">
<div class="flex-1 flex items-center justify-center px-4">
<div class="w-full max-w-md space-y-4">
<div
v-if="path"
class="text-sm text-gray-500 dark:text-gray-400"
>
<span class="font-medium">{{ t('path.label') }}:</span>
{{ path }}
</div>
<template #body>
<UForm
:state="vault"
class="flex flex-col gap-4 w-full h-full justify-center"
>
<UiInputPassword
v-model="vault.password"
class="w-full"
autofocus
/>
<UForm
:state="vault"
class="w-full"
>
<UFormField
:label="t('password.label')"
name="password"
>
<UInput
v-model="vault.password"
type="password"
icon="i-heroicons-key"
:placeholder="t('password.placeholder')"
autofocus
size="xl"
class="w-full"
@keyup.enter="onOpenDatabase"
/>
</UFormField>
</UForm>
</div>
</div>
<UButton
hidden
type="submit"
@click="onOpenDatabase"
/>
</UForm>
<div class="flex gap-3 mt-auto pt-6">
<UButton
color="neutral"
variant="outline"
block
size="xl"
@click="open = false"
>
{{ t('cancel') }}
</UButton>
<UButton
color="primary"
block
size="xl"
@click="onOpenDatabase"
>
{{ t('open') }}
</UButton>
</div>
</div>
</template>
</UiDialogConfirm>
</UiDrawer>
</template>
<script setup lang="ts">
@ -58,7 +83,9 @@ const props = defineProps<{
path?: string
}>()
const { t } = useI18n()
const { t } = useI18n({
useScope: 'local',
})
const vault = reactive<{
name: string
@ -154,7 +181,12 @@ const onOpenDatabase = async () => {
)
} catch (error) {
open.value = false
if (error?.details?.reason === 'file is not a database') {
const errorDetails =
error && typeof error === 'object' && 'details' in error
? (error as { details?: { reason?: string } }).details
: undefined
if (errorDetails?.reason === 'file is not a database') {
add({
color: 'error',
title: t('error.password.title'),
@ -169,25 +201,37 @@ const onOpenDatabase = async () => {
<i18n lang="yaml">
de:
button:
label: Vault öffnen
open: Entsperren
title: '{haexvault} entsperren'
password: Passwort
vault:
open: Vault öffnen
cancel: Abbrechen
title: HaexVault entsperren
path:
label: Pfad
password:
label: Passwort
placeholder: Passwort eingeben
description: Öffne eine vorhandene Vault
error:
open: Vault konnte nicht geöffnet werden
password:
title: Vault konnte nicht geöffnet werden
description: Bitte üperprüfe das Passwort
description: Bitte überprüfe das Passwort
en:
button:
label: Open Vault
open: Unlock
title: Unlock {haexvault}
password: Passwort
cancel: Cancel
title: Unlock HaexVault
path:
label: Path
password:
label: Password
placeholder: Enter password
description: Open your existing vault
vault:
open: Open Vault
error:
open: Vault couldn't be opened
password:
title: Vault couldn't be opened
description: Please check your password

View File

@ -15,7 +15,7 @@
<div class="flex items-start gap-4">
<div
v-if="preview?.manifest.icon"
class="w-16 h-16 flex-shrink-0"
class="w-16 h-16 shrink-0"
>
<UIcon
:name="preview.manifest.icon"
@ -184,7 +184,6 @@ const shellPermissions = computed({
},
})
const permissionAccordionItems = computed(() => {
const items = []

View File

@ -1,5 +1,13 @@
<template>
<UPopover v-model:open="open">
<UiDrawer
v-model:open="open"
direction="right"
:title="t('launcher.title')"
:description="t('launcher.description')"
:overlay="false"
:modal="false"
:handle-only="true"
>
<UButton
icon="material-symbols:apps"
color="neutral"
@ -9,58 +17,63 @@
/>
<template #content>
<ul class="p-4 max-h-96 grid grid-cols-3 gap-2 overflow-scroll">
<!-- All launcher items (system windows + enabled extensions, alphabetically sorted) -->
<UContextMenu
v-for="item in launcherItems"
:key="item.id"
:items="getContextMenuItems(item)"
>
<div class="p-4 h-full overflow-y-auto">
<div class="flex flex-wrap">
<!-- All launcher items (system windows + enabled extensions, alphabetically sorted) -->
<UContextMenu
v-for="item in launcherItems"
:key="item.id"
:items="getContextMenuItems(item)"
>
<UiButton
square
size="lg"
variant="ghost"
:ui="{
base: 'size-24 flex flex-wrap text-sm items-center justify-center overflow-visible cursor-grab',
leadingIcon: 'size-10',
label: 'w-full',
}"
:icon="item.icon"
:label="item.name"
:tooltip="item.name"
draggable="true"
@click="openItem(item)"
@dragstart="handleDragStart($event, item)"
/>
</UContextMenu>
<!-- Disabled Extensions (grayed out) -->
<UiButton
v-for="extension in disabledExtensions"
:key="extension.id"
square
size="lg"
size="xl"
variant="ghost"
:disabled="true"
:ui="{
base: 'size-24 flex flex-wrap text-sm items-center justify-center overflow-visible cursor-grab active:cursor-grabbing',
base: 'size-24 flex flex-wrap text-sm items-center justify-center overflow-visible opacity-40',
leadingIcon: 'size-10',
label: 'w-full',
}"
:icon="item.icon"
:label="item.name"
:tooltip="item.name"
draggable="true"
@click="openItem(item)"
@dragstart="handleDragStart($event, item)"
@dragend="handleDragEnd"
:icon="extension.icon || 'i-heroicons-puzzle-piece-solid'"
:label="extension.name"
:tooltip="`${extension.name} (${t('disabled')})`"
/>
</UContextMenu>
<!-- Disabled Extensions (grayed out) -->
<UiButton
v-for="extension in disabledExtensions"
:key="extension.id"
square
size="xl"
variant="ghost"
:disabled="true"
:ui="{
base: 'size-24 flex flex-wrap text-sm items-center justify-center overflow-visible opacity-40',
leadingIcon: 'size-10',
label: 'w-full',
}"
:icon="extension.icon || 'i-heroicons-puzzle-piece-solid'"
:label="extension.name"
:tooltip="`${extension.name} (${t('disabled')})`"
/>
</ul>
</div>
</div>
</template>
</UPopover>
</UiDrawer>
<!-- Uninstall Confirmation Dialog -->
<UiDialogConfirm
v-model:open="showUninstallDialog"
:title="t('uninstall.confirm.title')"
:description="t('uninstall.confirm.description', { name: extensionToUninstall?.name || '' })"
:description="
t('uninstall.confirm.description', {
name: extensionToUninstall?.name || '',
})
"
:confirm-label="t('uninstall.confirm.button')"
confirm-icon="i-heroicons-trash"
@confirm="confirmUninstall"
@ -74,11 +87,14 @@ defineOptions({
const extensionStore = useExtensionsStore()
const windowManagerStore = useWindowManagerStore()
const uiStore = useUiStore()
const { t } = useI18n()
const open = ref(false)
const { isSmallScreen } = storeToRefs(uiStore)
// Uninstall dialog state
const showUninstallDialog = ref(false)
const extensionToUninstall = ref<LauncherItem | null>(null)
@ -226,10 +242,11 @@ const handleDragStart = (event: DragEvent, item: LauncherItem) => {
if (dragImage) {
event.dataTransfer.setDragImage(dragImage, 20, 20)
}
}
const handleDragEnd = () => {
// Cleanup if needed
// Close drawer on small screens to reveal workspace for drop
if (isSmallScreen.value) {
open.value = false
}
}
</script>
@ -237,6 +254,9 @@ const handleDragEnd = () => {
de:
disabled: Deaktiviert
marketplace: Marketplace
launcher:
title: App Launcher
description: Wähle eine App zum Öffnen
contextMenu:
open: Öffnen
uninstall: Deinstallieren
@ -249,6 +269,9 @@ de:
en:
disabled: Disabled
marketplace: Marketplace
launcher:
title: App Launcher
description: Select an app to open
contextMenu:
open: Open
uninstall: Uninstall

View File

@ -0,0 +1,65 @@
<template>
<div class="inline-flex">
<UTooltip :text="tooltip">
<!-- Bundled Icon (iconify) -->
<UIcon
v-if="isBundledIcon"
:name="name"
v-bind="$attrs"
/>
<!-- External Image (Extension icon) -->
<img
v-else
:src="imageUrl"
v-bind="$attrs"
@error="handleImageError"
/>
</UTooltip>
</div>
</template>
<script setup lang="ts">
import { convertFileSrc } from '@tauri-apps/api/core'
defineOptions({
inheritAttrs: false,
})
const props = defineProps<{
name: string
tooltip?: string
}>()
// Check if it's a bundled icon (no file extension)
const isBundledIcon = computed(() => {
return !props.name.match(/\.(png|jpg|jpeg|svg|gif|webp|ico)$/i)
})
// Convert file path to Tauri URL for images
const imageUrl = ref('')
const showFallback = ref(false)
// Default fallback icon
const FALLBACK_ICON = 'i-heroicons-puzzle-piece-solid'
watchEffect(() => {
if (!isBundledIcon.value && !showFallback.value) {
// Convert local file path to Tauri asset URL
imageUrl.value = convertFileSrc(props.name)
}
})
const handleImageError = () => {
console.warn(`Failed to load icon: ${props.name}`)
showFallback.value = true
}
// Use fallback icon if image failed to load
const name = computed(() => {
if (showFallback.value) {
return FALLBACK_ICON
}
return props.name
})
</script>

View File

@ -0,0 +1,185 @@
<template>
<div class="w-full h-full bg-default flex flex-col">
<!-- Header with controls -->
<div class="flex items-center justify-between p-4 border-b border-gray-200 dark:border-gray-700">
<div class="flex items-center gap-2">
<UIcon
name="i-heroicons-bug-ant"
class="w-5 h-5"
/>
<h2 class="text-lg font-semibold">
Debug Logs
</h2>
<span class="text-xs text-gray-500">
{{ logs.length }} logs
</span>
</div>
<div class="flex gap-2">
<UButton
:label="allCopied ? 'Copied!' : 'Copy All'"
:color="allCopied ? 'success' : 'primary'"
size="sm"
@click="copyAllLogs"
/>
<UButton
label="Clear Logs"
color="error"
size="sm"
@click="clearLogs"
/>
</div>
</div>
<!-- Filter Buttons -->
<div class="flex gap-2 p-4 border-b border-gray-200 dark:border-gray-700 overflow-x-auto">
<UButton
v-for="level in ['all', 'log', 'info', 'warn', 'error', 'debug']"
:key="level"
:label="level"
:color="filter === level ? 'primary' : 'neutral'"
size="sm"
@click="filter = level as any"
/>
</div>
<!-- Logs Container -->
<div
ref="logsContainer"
class="flex-1 overflow-y-auto p-4 space-y-2 font-mono text-xs"
>
<div
v-for="(log, index) in filteredLogs"
:key="index"
:class="[
'p-3 rounded-lg border-l-4 relative group',
log.level === 'error'
? 'bg-red-50 dark:bg-red-950/30 border-red-500'
: log.level === 'warn'
? 'bg-yellow-50 dark:bg-yellow-950/30 border-yellow-500'
: log.level === 'info'
? 'bg-blue-50 dark:bg-blue-950/30 border-blue-500'
: log.level === 'debug'
? 'bg-purple-50 dark:bg-purple-950/30 border-purple-500'
: 'bg-gray-50 dark:bg-gray-800 border-gray-400',
]"
>
<!-- Copy Button -->
<button
class="absolute top-2 right-2 p-1.5 rounded bg-white dark:bg-gray-700 shadow-sm hover:bg-gray-100 dark:hover:bg-gray-600 active:scale-95 transition-all"
@click="copyLogToClipboard(log)"
>
<UIcon
:name="copiedIndex === index ? 'i-heroicons-check' : 'i-heroicons-clipboard-document'"
:class="[
'w-4 h-4',
copiedIndex === index ? 'text-green-500' : ''
]"
/>
</button>
<div class="flex items-start gap-2 mb-1">
<span class="text-gray-500 dark:text-gray-400 text-[10px] shrink-0">
{{ log.timestamp }}
</span>
<span
:class="[
'font-semibold text-[10px] uppercase shrink-0',
log.level === 'error'
? 'text-red-600 dark:text-red-400'
: log.level === 'warn'
? 'text-yellow-600 dark:text-yellow-400'
: log.level === 'info'
? 'text-blue-600 dark:text-blue-400'
: log.level === 'debug'
? 'text-purple-600 dark:text-purple-400'
: 'text-gray-600 dark:text-gray-400',
]"
>
{{ log.level }}
</span>
</div>
<pre class="whitespace-pre-wrap break-words text-gray-900 dark:text-gray-100 pr-8">{{ log.message }}</pre>
</div>
<div
v-if="filteredLogs.length === 0"
class="text-center text-gray-500 py-8"
>
<UIcon
name="i-heroicons-document-text"
class="w-12 h-12 mx-auto mb-2 opacity-50"
/>
<p>No logs to display</p>
</div>
</div>
</div>
</template>
<script setup lang="ts">
import { globalConsoleLogs } from '~/plugins/console-interceptor'
import type { ConsoleLog } from '~/plugins/console-interceptor'
const filter = ref<'all' | 'log' | 'info' | 'warn' | 'error' | 'debug'>('all')
const logsContainer = ref<HTMLDivElement>()
const copiedIndex = ref<number | null>(null)
const allCopied = ref(false)
const { $clearConsoleLogs } = useNuxtApp()
const { copy } = useClipboard()
const logs = computed(() => globalConsoleLogs.value)
const filteredLogs = computed(() => {
if (filter.value === 'all') {
return logs.value
}
return logs.value.filter((log) => log.level === filter.value)
})
const clearLogs = () => {
if ($clearConsoleLogs) {
$clearConsoleLogs()
}
}
const copyLogToClipboard = async (log: ConsoleLog) => {
const text = `[${log.timestamp}] [${log.level.toUpperCase()}] ${log.message}`
await copy(text)
// Find the index in filteredLogs for visual feedback
const index = filteredLogs.value.indexOf(log)
copiedIndex.value = index
// Reset after 2 seconds
setTimeout(() => {
copiedIndex.value = null
}, 2000)
}
const copyAllLogs = async () => {
const allLogsText = filteredLogs.value
.map((log) => `[${log.timestamp}] [${log.level.toUpperCase()}] ${log.message}`)
.join('\n')
await copy(allLogsText)
allCopied.value = true
// Reset after 2 seconds
setTimeout(() => {
allCopied.value = false
}, 2000)
}
// Auto-scroll to bottom when new logs arrive
watch(
() => logs.value.length,
() => {
nextTick(() => {
if (logsContainer.value) {
logsContainer.value.scrollTop = logsContainer.value.scrollHeight
}
})
},
{ immediate: true }
)
</script>
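This component reads from ~/plugins/console-interceptor, which is not part of this diff. A minimal sketch of what such a plugin could look like, assuming it patches the console methods, exposes a shared ref, and provides the $clearConsoleLogs helper used above (all implementation details below are assumptions):

import { ref } from 'vue'

export interface ConsoleLog {
  level: 'log' | 'info' | 'warn' | 'error' | 'debug'
  message: string
  timestamp: string
}

// Shared across the app; the debug view above renders this ref directly.
export const globalConsoleLogs = ref<ConsoleLog[]>([])

export default defineNuxtPlugin(() => {
  const levels = ['log', 'info', 'warn', 'error', 'debug'] as const
  for (const level of levels) {
    const original = console[level].bind(console)
    console[level] = (...args: unknown[]) => {
      globalConsoleLogs.value.push({
        level,
        message: args
          .map((a) => (typeof a === 'string' ? a : JSON.stringify(a)))
          .join(' '),
        timestamp: new Date().toLocaleTimeString(),
      })
      original(...args)
    }
  }
  return {
    provide: {
      // Exposed to components as $clearConsoleLogs via useNuxtApp()
      clearConsoleLogs: () => {
        globalConsoleLogs.value = []
      },
    },
  }
})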

View File

@ -1,5 +1,5 @@
<template>
<div class="p-4 mx-auto space-y-6 bg-default/90 backdrop-blur-2xl">
<div class="p-4 mx-auto space-y-6 bg-default">
<div class="space-y-2">
<h1 class="text-2xl font-bold">{{ t('title') }}</h1>
<p class="text-sm opacity-70">{{ t('description') }}</p>
@ -122,6 +122,7 @@ const browseExtensionPathAsync = async () => {
}
}
const windowManagerStore = useWindowManagerStore()
// Load a dev extension
const loadDevExtensionAsync = async () => {
if (!extensionPath.value) return
@ -140,15 +141,31 @@ const loadDevExtensionAsync = async () => {
// Reload list
await loadDevExtensionListAsync()
// Get the newly loaded extension info from devExtensions
const newlyLoadedExtension = devExtensions.value.find((ext) =>
extensionPath.value.includes(ext.name),
)
// Reload all extensions in the main extension store so they appear in the launcher
await loadExtensionsAsync()
// Open the newly loaded extension
if (newlyLoadedExtension) {
await windowManagerStore.openWindowAsync({
sourceId: newlyLoadedExtension.id,
type: 'extension',
icon: newlyLoadedExtension.icon || 'i-heroicons-puzzle-piece-solid',
title: newlyLoadedExtension.name,
})
}
// Clear input
extensionPath.value = ''
} catch (error) {
console.error('Failed to load dev extension:', error)
const { getErrorMessage } = useExtensionError()
add({
description: t('add.errors.loadFailed') + error,
description: `${t('add.errors.loadFailed')}: ${getErrorMessage(error)}`,
color: 'error',
})
} finally {
@ -180,8 +197,9 @@ const reloadDevExtensionAsync = async (extension: ExtensionInfoResponse) => {
})
} catch (error) {
console.error('Failed to reload dev extension:', error)
const { getErrorMessage } = useExtensionError()
add({
description: t('list.errors.reloadFailed') + error,
description: `${t('list.errors.reloadFailed')}: ${getErrorMessage(error)}`,
color: 'error',
})
}
@ -207,8 +225,9 @@ const removeDevExtensionAsync = async (extension: ExtensionInfoResponse) => {
await loadExtensionsAsync()
} catch (error) {
console.error('Failed to remove dev extension:', error)
const { getErrorMessage } = useExtensionError()
add({
description: t('list.errors.removeFailed') + error,
description: `${t('list.errors.removeFailed')}: ${getErrorMessage(error)}`,
color: 'error',
})
}
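The getErrorMessage helper comes from a useExtensionError composable that is not shown in this diff; a plausible minimal shape, assuming it only normalizes unknown errors into a readable string:

export const useExtensionError = () => {
  const getErrorMessage = (error: unknown): string => {
    if (error instanceof Error) return error.message
    if (typeof error === 'string') return error
    try {
      return JSON.stringify(error)
    } catch {
      return String(error)
    }
  }
  return { getErrorMessage }
}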

View File

@ -1,5 +1,5 @@
<template>
<div class="w-full h-full bg-default">
<div class="w-full h-full bg-default overflow-scroll">
<div class="grid grid-cols-2 p-2">
<div class="p-2">{{ t('language') }}</div>
<div><UiDropdownLocale @select="onSelectLocaleAsync" /></div>
@ -33,13 +33,52 @@
/>
</div>
<div class="h-full"/>
<div class="p-2">{{ t('workspaceBackground.label') }}</div>
<div class="flex gap-2">
<UiButton
:label="t('workspaceBackground.choose')"
@click="selectBackgroundImage"
/>
<UiButton
v-if="currentWorkspace?.background"
:label="t('workspaceBackground.remove.label')"
color="error"
@click="removeBackgroundImage"
/>
</div>
<!-- Desktop Grid Settings -->
<div
class="col-span-2 mt-4 border-t border-gray-200 dark:border-gray-700 pt-4"
>
<h3 class="text-lg font-semibold mb-4">{{ t('desktopGrid.title') }}</h3>
</div>
<div class="p-2">{{ t('desktopGrid.iconSize.label') }}</div>
<div>
<USelect
v-model="iconSizePreset"
:items="iconSizePresetOptions"
/>
</div>
<div class="h-full" />
</div>
</div>
</template>
<script setup lang="ts">
import type { Locale } from 'vue-i18n'
import { open } from '@tauri-apps/plugin-dialog'
import {
readFile,
writeFile,
mkdir,
exists,
remove,
} from '@tauri-apps/plugin-fs'
import { appLocalDataDir } from '@tauri-apps/api/path'
import { DesktopIconSizePreset } from '~/stores/vault/settings'
const { t, setLocale } = useI18n()
@ -77,8 +116,44 @@ const { requestNotificationPermissionAsync } = useNotificationStore()
const { deviceName } = storeToRefs(useDeviceStore())
const { updateDeviceNameAsync, readDeviceNameAsync } = useDeviceStore()
const workspaceStore = useWorkspaceStore()
const { currentWorkspace } = storeToRefs(workspaceStore)
const { updateWorkspaceBackgroundAsync } = workspaceStore
const desktopStore = useDesktopStore()
const { iconSizePreset } = storeToRefs(desktopStore)
const { syncDesktopIconSizeAsync, updateDesktopIconSizeAsync } = desktopStore
// Icon size preset options
const iconSizePresetOptions = [
{
label: t('desktopGrid.iconSize.presets.small'),
value: DesktopIconSizePreset.small,
},
{
label: t('desktopGrid.iconSize.presets.medium'),
value: DesktopIconSizePreset.medium,
},
{
label: t('desktopGrid.iconSize.presets.large'),
value: DesktopIconSizePreset.large,
},
{
label: t('desktopGrid.iconSize.presets.extraLarge'),
value: DesktopIconSizePreset.extraLarge,
},
]
// Watch for icon size preset changes and update DB
watch(iconSizePreset, async (newPreset) => {
if (newPreset) {
await updateDesktopIconSizeAsync(newPreset)
}
})
onMounted(async () => {
await readDeviceNameAsync()
await syncDesktopIconSizeAsync()
})
const onUpdateDeviceNameAsync = async () => {
@ -92,6 +167,152 @@ const onUpdateDeviceNameAsync = async () => {
add({ description: t('deviceName.update.error'), color: 'error' })
}
}
const selectBackgroundImage = async () => {
if (!currentWorkspace.value) return
try {
const selected = await open({
multiple: false,
filters: [
{
name: 'Images',
extensions: ['png', 'jpg', 'jpeg', 'webp'],
},
],
})
if (!selected || typeof selected !== 'string') {
return
}
// Read the selected file (works with Android photo picker URIs)
let fileData: Uint8Array
try {
fileData = await readFile(selected)
} catch (readError) {
add({
description: `Fehler beim Lesen: ${readError instanceof Error ? readError.message : String(readError)}`,
color: 'error',
})
return
}
// Detect file type from file signature
let ext = 'jpg' // default
if (fileData.length > 4) {
// PNG signature: 89 50 4E 47
if (
fileData[0] === 0x89 &&
fileData[1] === 0x50 &&
fileData[2] === 0x4e &&
fileData[3] === 0x47
) {
ext = 'png'
}
// JPEG signature: FF D8 FF
else if (
fileData[0] === 0xff &&
fileData[1] === 0xd8 &&
fileData[2] === 0xff
) {
ext = 'jpg'
}
// WebP signature: RIFF xxxx WEBP
else if (
fileData[0] === 0x52 &&
fileData[1] === 0x49 &&
fileData[2] === 0x46 &&
fileData[3] === 0x46
) {
ext = 'webp'
}
}
// Get app local data directory
const appDataPath = await appLocalDataDir()
// Construct target path manually to avoid path joining issues
const fileName = `workspace-${currentWorkspace.value.id}-background.${ext}`
const targetPath = `${appDataPath}/files/${fileName}`
// Create parent directory if it doesn't exist
const parentDir = `${appDataPath}/files`
try {
if (!(await exists(parentDir))) {
await mkdir(parentDir, { recursive: true })
}
} catch (mkdirError) {
add({
description: `Fehler beim Erstellen des Ordners: ${mkdirError instanceof Error ? mkdirError.message : String(mkdirError)}`,
color: 'error',
})
return
}
// Write file to app data directory
try {
await writeFile(targetPath, fileData)
} catch (writeError) {
add({
description: `Fehler beim Schreiben: ${writeError instanceof Error ? writeError.message : String(writeError)}`,
color: 'error',
})
return
}
// Store the absolute file path in database
try {
await updateWorkspaceBackgroundAsync(
currentWorkspace.value.id,
targetPath,
)
add({
description: t('workspaceBackground.update.success'),
color: 'success',
})
} catch (dbError) {
add({
description: `Fehler beim DB-Update: ${dbError instanceof Error ? dbError.message : String(dbError)}`,
color: 'error',
})
}
} catch (error) {
console.error('Error selecting background:', error)
add({
description: `${t('workspaceBackground.update.error')}: ${error instanceof Error ? error.message : String(error)}`,
color: 'error',
})
}
}
const removeBackgroundImage = async () => {
if (!currentWorkspace.value) return
try {
// Delete the background file if it exists
if (currentWorkspace.value.background) {
try {
// The background field contains the absolute file path
if (await exists(currentWorkspace.value.background)) {
await remove(currentWorkspace.value.background)
}
} catch (err) {
console.warn('Could not delete background file:', err)
// Continue anyway to clear the database entry
}
}
await updateWorkspaceBackgroundAsync(currentWorkspace.value.id, null)
add({
description: t('workspaceBackground.remove.success'),
color: 'success',
})
} catch (error) {
console.error('Error removing background:', error)
add({ description: t('workspaceBackground.remove.error'), color: 'error' })
}
}
</script>
<i18n lang="yaml">
@ -112,6 +333,32 @@ de:
update:
success: Gerätename wurde erfolgreich aktualisiert
error: Gerätename konnte nicht aktualisiert werden
workspaceBackground:
label: Workspace-Hintergrund
choose: Bild auswählen
update:
success: Hintergrund erfolgreich aktualisiert
error: Fehler beim Aktualisieren des Hintergrunds
remove:
label: Hintergrund entfernen
success: Hintergrund erfolgreich entfernt
error: Fehler beim Entfernen des Hintergrunds
desktopGrid:
title: Desktop-Raster
columns:
label: Spalten
unit: Spalten
rows:
label: Zeilen
unit: Zeilen
iconSize:
label: Icon-Größe
presets:
small: Klein
medium: Mittel
large: Groß
extraLarge: Sehr groß
unit: px
en:
language: Language
design: Design
@ -129,4 +376,30 @@ en:
update:
success: Device name has been successfully updated
error: Device name could not be updated
workspaceBackground:
label: Workspace Background
choose: Choose Image
update:
success: Background successfully updated
error: Error updating background
remove:
label: Remove Background
success: Background successfully removed
error: Error removing background
desktopGrid:
title: Desktop Grid
columns:
label: Columns
unit: columns
rows:
label: Rows
unit: rows
iconSize:
label: Icon Size
presets:
small: Small
medium: Medium
large: Large
extraLarge: Extra Large
unit: px
</i18n>
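The inline magic-byte checks in selectBackgroundImage could be factored into a small helper; a sketch (detectImageExtension is a hypothetical name, behavior mirrors the PNG/JPEG/WebP checks above):

export const detectImageExtension = (data: Uint8Array): 'png' | 'jpg' | 'webp' => {
  if (data.length > 4) {
    // PNG signature: 89 50 4E 47
    if (data[0] === 0x89 && data[1] === 0x50 && data[2] === 0x4e && data[3] === 0x47) return 'png'
    // JPEG signature: FF D8 FF
    if (data[0] === 0xff && data[1] === 0xd8 && data[2] === 0xff) return 'jpg'
    // WebP container: RIFF .... (only the RIFF prefix is checked, as in the original)
    if (data[0] === 0x52 && data[1] === 0x49 && data[2] === 0x46 && data[3] === 0x46) return 'webp'
  }
  return 'jpg' // default, matching the original code
}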

View File

@ -1,130 +0,0 @@
<template>
<UiDialogConfirm
:confirm-label="t('create')"
@confirm="onCreateAsync"
>
<UiButton
:label="t('vault.create')"
:ui="{
base: 'px-3 py-2',
}"
icon="mdi:plus"
size="xl"
variant="outline"
block
/>
<template #title>
<i18n-t
keypath="title"
tag="p"
class="flex gap-x-2 flex-wrap"
>
<template #haexvault>
<UiTextGradient>HaexVault</UiTextGradient>
</template>
</i18n-t>
</template>
<template #body>
<UForm
:state="vault"
class="flex flex-col gap-4 w-full h-full justify-center"
>
<UiInput
v-model="vault.name"
leading-icon="mdi:safe"
:label="t('vault.label')"
:placeholder="t('vault.placeholder')"
/>
<UiInputPassword
v-model="vault.password"
leading-icon="mdi:key-outline"
/>
<UButton
hidden
type="submit"
@click="onCreateAsync"
/>
</UForm>
</template>
</UiDialogConfirm>
</template>
<script setup lang="ts">
import { vaultSchema } from './schema'
const { t } = useI18n()
const vault = reactive<{
name: string
password: string
type: 'password' | 'text'
}>({
name: 'HaexVault',
password: '',
type: 'password',
})
const initVault = () => {
vault.name = 'HaexVault'
vault.password = ''
vault.type = 'password'
}
const { createAsync } = useVaultStore()
const { add } = useToast()
const check = ref(false)
const open = ref()
const onCreateAsync = async () => {
check.value = true
const nameCheck = vaultSchema.name.safeParse(vault.name)
const passwordCheck = vaultSchema.password.safeParse(vault.password)
if (!nameCheck.success || !passwordCheck.success) return
open.value = false
try {
if (vault.name && vault.password) {
const vaultId = await createAsync({
vaultName: vault.name,
password: vault.password,
})
if (vaultId) {
initVault()
await navigateTo(
useLocaleRoute()({ name: 'desktop', params: { vaultId } }),
)
}
}
} catch (error) {
console.error(error)
add({ color: 'error', description: JSON.stringify(error) })
}
}
</script>
<i18n lang="yaml">
de:
vault:
create: Neue Vault erstellen
label: Vaultname
placeholder: Vaultname
name: HaexVault
title: Neue {haexvault} erstellen
create: Erstellen
en:
vault:
create: Create new vault
label: Vaultname
placeholder: Vaultname
name: HaexVault
title: Create new {haexvault}
create: Create
</i18n>

View File

@ -3,13 +3,20 @@
ref="windowEl"
:style="windowStyle"
:class="[
'absolute bg-default/80 backdrop-blur-xl rounded-lg shadow-xl overflow-hidden isolate',
'border border-gray-200 dark:border-gray-700 transition-all ease-out duration-600 ',
'absolute bg-default/80 backdrop-blur-xl rounded-lg shadow-xl overflow-hidden',
'transition-all ease-out duration-600',
'flex flex-col @container',
{ 'select-none': isResizingOrDragging },
isActive ? 'z-20' : 'z-10',
// Border colors based on warning level
warningLevel === 'warning'
? 'border-2 border-warning-500'
: warningLevel === 'danger'
? 'border-2 border-danger-500'
: 'border border-gray-200 dark:border-gray-700',
]"
@mousedown="handleActivate"
@contextmenu.stop.prevent
>
<!-- Window Titlebar -->
<div
@ -19,10 +26,10 @@
>
<!-- Left: Icon -->
<div class="flex items-center gap-2">
<img
<HaexIcon
v-if="icon"
:src="icon"
:alt="title"
:name="icon"
:tooltip="title"
class="w-5 h-5 object-contain shrink-0"
/>
</div>
@ -44,6 +51,7 @@
/>
<HaexWindowButton
v-if="!isSmallScreen"
:is-maximized
variant="maximize"
@click.stop="handleMaximize"
@ -68,13 +76,14 @@
<!-- Resize Handles -->
<HaexWindowResizeHandles
:disabled="isMaximized"
:disabled="isMaximized || isSmallScreen"
@resize-start="handleResizeStart"
/>
</div>
</template>
<script setup lang="ts">
import { getAvailableContentHeight } from '~/utils/viewport'
const props = defineProps<{
id: string
title: string
@ -86,6 +95,7 @@ const props = defineProps<{
sourceHeight?: number
isOpening?: boolean
isClosing?: boolean
warningLevel?: 'warning' | 'danger' // Warning indicator (e.g., dev extension, dangerous permissions)
}>()
const emit = defineEmits<{
@ -107,12 +117,16 @@ const height = defineModel<number>('height', { default: 600 })
const windowEl = useTemplateRef('windowEl')
const titlebarEl = useTemplateRef('titlebarEl')
const uiStore = useUiStore()
const { isSmallScreen } = storeToRefs(uiStore)
// Inject viewport size from parent desktop
const viewportSize = inject<{
width: Ref<number>
height: Ref<number>
}>('viewportSize')
const isMaximized = ref(false) // Don't start maximized
// Start maximized on small screens
const isMaximized = ref(isSmallScreen.value)
// Store initial position/size for restore
const preMaximizeState = ref({
@ -144,7 +158,8 @@ const isResizingOrDragging = computed(
// Setup drag with useDrag composable (supports mouse + touch)
useDrag(
({ movement: [mx, my], first, last }) => {
if (isMaximized.value) return
// Disable dragging on small screens (always fullscreen)
if (isMaximized.value || isSmallScreen.value) return
if (first) {
// Drag started - save initial position
@ -318,10 +333,10 @@ const handleMaximize = () => {
x.value = 0
y.value = 0
width.value = bounds.width
height.value = bounds.height
// Use helper function to calculate correct height with safe areas
height.value = getAvailableContentHeight()
isMaximized.value = true
}
console.log('handleMaximize', preMaximizeState, bounds)
}
}
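getAvailableContentHeight is imported from ~/utils/viewport but not included in this diff. A rough sketch of what it presumably does, based on its use in handleMaximize (viewport height minus mobile safe-area insets); the CSS custom property names are assumptions:

export const getAvailableContentHeight = (): number => {
  const styles = getComputedStyle(document.documentElement)
  // Assumes env(safe-area-inset-*) has been mirrored into CSS custom properties;
  // the real helper may obtain these values differently.
  const top = parseFloat(styles.getPropertyValue('--safe-area-inset-top')) || 0
  const bottom = parseFloat(styles.getPropertyValue('--safe-area-inset-bottom')) || 0
  return window.innerHeight - top - bottom
}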

View File

@ -1,90 +1,76 @@
<template>
<UModal
<UiDrawer
v-model:open="localShowWindowOverview"
direction="bottom"
:title="t('modal.title')"
:description="t('modal.description')"
fullscreen
>
<template #content>
<div class="flex flex-col h-full">
<!-- Header -->
<div class="h-full overflow-y-auto p-6 justify-center flex">
<!-- Window Thumbnails Flex Layout -->
<div
class="flex items-center justify-end border-b p-2 border-gray-200 dark:border-gray-700"
v-if="windows.length > 0"
class="flex flex-wrap gap-6 justify-center-safe items-start"
>
<UButton
icon="i-heroicons-x-mark"
color="error"
variant="soft"
@click="localShowWindowOverview = false"
/>
</div>
<!-- Scrollable Content -->
<div class="flex-1 overflow-y-auto p-6 justify-center flex">
<!-- Window Thumbnails Flex Layout -->
<div
v-if="windows.length > 0"
class="flex flex-wrap gap-6 justify-center-safe items-start"
v-for="window in windows"
:key="window.id"
class="relative group cursor-pointer"
>
<div
v-for="window in windows"
:key="window.id"
class="relative group cursor-pointer"
>
<!-- Window Title Bar -->
<div class="flex items-center gap-3 mb-2 px-2">
<UIcon
v-if="window.icon"
:name="window.icon"
class="size-5 shrink-0"
/>
<div class="flex-1 min-w-0">
<p class="font-semibold text-sm truncate">
{{ window.title }}
</p>
</div>
<!-- Minimized Badge -->
<UBadge
v-if="window.isMinimized"
color="info"
size="xs"
:title="t('minimized')"
/>
<!-- Window Title Bar -->
<div class="flex items-center gap-3 mb-2 px-2">
<UIcon
v-if="window.icon"
:name="window.icon"
class="size-5 shrink-0"
/>
<div class="flex-1 min-w-0">
<p class="font-semibold text-sm truncate">
{{ window.title }}
</p>
</div>
<!-- Minimized Badge -->
<UBadge
v-if="window.isMinimized"
color="info"
size="xs"
:title="t('minimized')"
/>
</div>
<!-- Scaled Window Preview Container / Teleport Target -->
<!-- Scaled Window Preview Container / Teleport Target -->
<div
:id="`window-preview-${window.id}`"
class="relative bg-gray-100 dark:bg-gray-900 rounded-xl overflow-hidden border-2 border-gray-200 dark:border-gray-700 group-hover:border-primary-500 transition-all shadow-lg"
:style="getCardStyle(window)"
@click="handleRestoreAndActivateWindow(window.id)"
>
<!-- Hover Overlay -->
<div
:id="`window-preview-${window.id}`"
class="relative bg-gray-100 dark:bg-gray-900 rounded-xl overflow-hidden border-2 border-gray-200 dark:border-gray-700 group-hover:border-primary-500 transition-all shadow-lg"
:style="getCardStyle(window)"
@click="handleRestoreAndActivateWindow(window.id)"
>
<!-- Hover Overlay -->
<div
class="absolute inset-0 bg-primary-500/10 opacity-0 group-hover:opacity-100 transition-opacity pointer-events-none z-40"
/>
</div>
class="absolute inset-0 bg-primary-500/10 opacity-0 group-hover:opacity-100 transition-opacity pointer-events-none z-40"
/>
</div>
</div>
</div>
<!-- Empty State -->
<div
v-else
class="flex flex-col items-center justify-center py-12 text-gray-500 dark:text-gray-400"
>
<UIcon
name="i-heroicons-window"
class="size-16 mb-4 shrink-0"
/>
<p class="text-lg font-medium">No windows open</p>
<p class="text-sm">
Open an extension or system window to see it here
</p>
</div>
<!-- Empty State -->
<div
v-else
class="flex flex-col items-center justify-center py-12 text-gray-500 dark:text-gray-400"
>
<UIcon
name="i-heroicons-window"
class="size-16 mb-4 shrink-0"
/>
<p class="text-lg font-medium">No windows open</p>
<p class="text-sm">
Open an extension or system window to see it here
</p>
</div>
</div>
</template>
</UModal>
</UiDrawer>
</template>
<script setup lang="ts">

View File

@ -25,17 +25,70 @@
/>
</div>
</template>
<!-- Window Icons Preview -->
<div
v-if="workspaceWindows.length > 0"
class="flex flex-wrap gap-2 items-center"
>
<!-- Show first 8 window icons -->
<HaexIcon
v-for="window in visibleWindows"
:key="window.id"
:name="window.icon || 'i-heroicons-window'"
:tooltip="window.title"
class="size-6 opacity-70"
/>
<!-- Show remaining count badge if more than 8 windows -->
<UBadge
v-if="remainingCount > 0"
color="neutral"
variant="subtle"
size="sm"
>
+{{ remainingCount }}
</UBadge>
</div>
<!-- Empty state when no windows -->
<div
v-else
class="text-sm text-gray-400 dark:text-gray-600 italic"
>
{{ t('noWindows') }}
</div>
</UCard>
</template>
<script setup lang="ts">
const props = defineProps<{ workspace: IWorkspace }>()
const { t } = useI18n()
const workspaceStore = useWorkspaceStore()
const windowManager = useWindowManagerStore()
const { currentWorkspace } = storeToRefs(workspaceStore)
// Get all windows for this workspace
const workspaceWindows = computed(() => {
return windowManager.windows.filter(
(window) => window.workspaceId === props.workspace.id,
)
})
// Limit to 8 visible icons
const MAX_VISIBLE_ICONS = 8
const visibleWindows = computed(() => {
return workspaceWindows.value.slice(0, MAX_VISIBLE_ICONS)
})
// Count remaining windows
const remainingCount = computed(() => {
const remaining = workspaceWindows.value.length - MAX_VISIBLE_ICONS
return remaining > 0 ? remaining : 0
})
const cardEl = useTemplateRef('cardEl')
const isDragOver = ref(false)
@ -96,3 +149,10 @@ watch(
},
)
</script>
<i18n lang="yaml">
de:
noWindows: Keine Fenster geöffnet
en:
noWindows: No windows open
</i18n>

View File

@ -0,0 +1,54 @@
<template>
<UiDrawer
v-model:open="isOverviewMode"
direction="left"
:overlay="false"
:modal="false"
title="Workspaces"
description="Workspaces"
>
<template #content>
<div class="pl-8 pr-4 overflow-y-auto py-8">
<!-- Workspace Cards -->
<div class="flex flex-col gap-3">
<HaexWorkspaceCard
v-for="workspace in workspaces"
:key="workspace.id"
:workspace
/>
</div>
<!-- Add New Workspace Button -->
<UButton
block
variant="outline"
class="mt-6"
icon="i-heroicons-plus"
:label="t('add')"
@click="handleAddWorkspaceAsync"
/>
</div>
</template>
</UiDrawer>
</template>
<script setup lang="ts">
const { t } = useI18n()
const workspaceStore = useWorkspaceStore()
const { workspaces, isOverviewMode } = storeToRefs(workspaceStore)
const handleAddWorkspaceAsync = async () => {
const workspace = await workspaceStore.addWorkspaceAsync()
nextTick(() => {
workspaceStore.slideToWorkspace(workspace?.id)
})
}
</script>
<i18n lang="yaml">
de:
add: Workspace hinzufügen
en:
add: Add Workspace
</i18n>

View File

@ -0,0 +1,32 @@
<template>
<UDrawer
v-bind="$attrs"
:ui="{
content:
'pb-[env(safe-area-inset-bottom)] pt-[env(safe-area-inset-top)] ',
...(ui || {}),
}"
>
<template
v-for="(_, name) in $slots"
#[name]="slotData"
>
<slot
:name="name"
v-bind="slotData"
/>
</template>
</UDrawer>
</template>
<script setup lang="ts">
import type { DrawerProps } from '@nuxt/ui'
/**
* Wrapper around UDrawer that automatically applies safe area insets for mobile devices.
* Passes through all props and slots to UDrawer.
*/
const props = defineProps</* @vue-ignore */ DrawerProps>()
const { ui } = toRefs(props)
</script>
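Usage is identical to UDrawer, since every prop and slot is forwarded; a hypothetical example:

<UiDrawer v-model:open="showDetails" direction="bottom" title="Details" description="Details drawer">
  <template #content>
    <div class="p-4">Drawer body goes here</div>
  </template>
</UiDrawer>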

View File

@ -0,0 +1,28 @@
<template>
<UContextMenu :items="contextMenuItems">
<UiButton
v-bind="$attrs"
@click="$emit('click', $event)"
>
<template
v-for="(_, slotName) in $slots"
#[slotName]="slotProps"
>
<slot
:name="slotName"
v-bind="slotProps"
/>
</template>
</UiButton>
</UContextMenu>
</template>
<script setup lang="ts">
import type { ContextMenuItem } from '@nuxt/ui'
defineProps<{
contextMenuItems: ContextMenuItem[]
}>()
defineEmits<{ click: [Event] }>()
</script>

View File

@ -4,11 +4,11 @@
<UButton
class="pointer-events-auto"
v-bind="{
...{ size: isSmallScreen ? 'lg' : 'md' },
...buttonProps,
...$attrs,
}"
@click="(e) => $emit('click', e)"
size="lg"
@click="$emit('click', $event)"
>
<template
v-for="(_, slotName) in $slots"

View File

@ -5,7 +5,6 @@
:readonly="props.readOnly"
:leading-icon="props.leadingIcon"
:ui="{ base: 'peer' }"
:size="isSmallScreen ? 'lg' : 'md'"
@change="(e) => $emit('change', e)"
@blur="(e) => $emit('blur', e)"
@keyup="(e: KeyboardEvent) => $emit('keyup', e)"
@ -83,8 +82,6 @@ const filteredSlots = computed(() => {
Object.entries(useSlots()).filter(([name]) => name !== 'trailing'),
)
})
const { isSmallScreen } = storeToRefs(useUiStore())
</script>
<i18n lang="yaml">

View File

@ -1,49 +1,64 @@
// composables/extensionMessageHandler.ts
import { invoke } from '@tauri-apps/api/core'
import type { IHaexHubExtension } from '~/types/haexhub'
import { HAEXTENSION_METHODS, HAEXTENSION_EVENTS } from '@haexhub/sdk'
import {
EXTENSION_PROTOCOL_NAME,
EXTENSION_PROTOCOL_PREFIX,
} from '~/config/constants'
import type { Platform } from '@tauri-apps/plugin-os'
interface ExtensionRequest {
id: string
method: string
params: Record<string, unknown>
timestamp: number
}
import {
handleDatabaseMethodAsync,
handleFilesystemMethodAsync,
handleWebMethodAsync,
handlePermissionsMethodAsync,
handleContextMethodAsync,
handleStorageMethodAsync,
setContextGetters,
type ExtensionRequest,
type ExtensionInstance,
} from './handlers'
// Global handler - registered only once
let globalHandlerRegistered = false
interface ExtensionInstance {
extension: IHaexHubExtension
windowId: string
}
const iframeRegistry = new Map<HTMLIFrameElement, ExtensionInstance>()
// Map event.source (WindowProxy) to extension instance for sandbox-compatible matching
const sourceRegistry = new Map<Window, ExtensionInstance>()
// Reverse map: window ID to Window for broadcasting (supports multiple windows per extension)
const windowIdToWindowMap = new Map<string, Window>()
// Store context values that need to be accessed outside setup
let contextGetters: {
getTheme: () => string
getLocale: () => string
getPlatform: () => Platform | undefined
} | null = null
const registerGlobalMessageHandler = () => {
if (globalHandlerRegistered) return
console.log('[ExtensionHandler] Registering global message handler')
window.addEventListener('message', async (event: MessageEvent) => {
// Log ALL messages first for debugging
console.log('[ExtensionHandler] Raw message received:', {
origin: event.origin,
dataType: typeof event.data,
data: event.data,
hasSource: !!event.source,
})
// Ignore console.forward messages - they're handled elsewhere
if (event.data?.type === 'console.forward') {
return
}
// Handle debug messages for Android debugging
if (event.data?.type === 'haexhub:debug') {
console.log('[ExtensionHandler] DEBUG MESSAGE FROM EXTENSION:', event.data.data)
return
}
const request = event.data as ExtensionRequest
console.log('[ExtensionHandler] Processing extension message:', {
origin: event.origin,
method: request?.method,
id: request?.id,
hasSource: !!event.source,
})
// Find extension instance by decoding event.origin (works with sandboxed iframes)
// Origin formats:
// - Desktop: haex-extension://<base64>
@ -166,17 +181,36 @@ const registerGlobalMessageHandler = () => {
try {
let result: unknown
if (request.method.startsWith('haextension.context.')) {
// Check specific methods first, then use direct routing to handlers
if (request.method === HAEXTENSION_METHODS.context.get) {
result = await handleContextMethodAsync(request)
} else if (request.method.startsWith('haextension.storage.')) {
} else if (
request.method === HAEXTENSION_METHODS.storage.getItem ||
request.method === HAEXTENSION_METHODS.storage.setItem ||
request.method === HAEXTENSION_METHODS.storage.removeItem ||
request.method === HAEXTENSION_METHODS.storage.clear ||
request.method === HAEXTENSION_METHODS.storage.keys
) {
result = await handleStorageMethodAsync(request, instance)
} else if (request.method.startsWith('haextension.db.')) {
} else if (
request.method === HAEXTENSION_METHODS.database.query ||
request.method === HAEXTENSION_METHODS.database.execute ||
request.method === HAEXTENSION_METHODS.database.transaction
) {
result = await handleDatabaseMethodAsync(request, instance.extension)
} else if (request.method.startsWith('haextension.fs.')) {
} else if (
request.method === HAEXTENSION_METHODS.filesystem.saveFile ||
request.method === HAEXTENSION_METHODS.filesystem.openFile ||
request.method === HAEXTENSION_METHODS.filesystem.showImage
) {
result = await handleFilesystemMethodAsync(request, instance.extension)
} else if (request.method.startsWith('haextension.http.')) {
result = await handleHttpMethodAsync(request, instance.extension)
} else if (request.method.startsWith('haextension.permissions.')) {
} else if (
request.method === HAEXTENSION_METHODS.web.fetch ||
request.method === HAEXTENSION_METHODS.application.open
) {
result = await handleWebMethodAsync(request, instance.extension)
} else if (request.method.startsWith('haextension:permissions:')) {
// Permissions not yet migrated
result = await handlePermissionsMethodAsync(request, instance.extension)
} else {
throw new Error(`Unknown method: ${request.method}`)
@ -227,13 +261,11 @@ export const useExtensionMessageHandler = (
const { locale } = useI18n()
const { platform } = useDeviceStore()
// Store getters for use outside setup context
if (!contextGetters) {
contextGetters = {
getTheme: () => currentTheme.value?.value || 'system',
getLocale: () => locale.value,
getPlatform: () => platform,
}
}
setContextGetters({
getTheme: () => currentTheme.value?.value || 'system',
getLocale: () => locale.value,
getPlatform: () => platform,
})
// Register the global handler on first call
registerGlobalMessageHandler()
@ -275,12 +307,7 @@ export const registerExtensionIFrame = (
// Make sure the global handler is registered
registerGlobalMessageHandler()
// Warn if the context getters were not initialized
if (!contextGetters) {
console.warn(
'Context getters not initialized. Make sure useExtensionMessageHandler was called in setup context first.',
)
}
// Note: Context getters should be initialized via useExtensionMessageHandler first
iframeRegistry.set(iframe, { extension, windowId })
}
@ -333,203 +360,26 @@ export const broadcastContextToAllExtensions = (context: {
platform?: string
}) => {
const message = {
type: 'haextension.context.changed',
type: HAEXTENSION_EVENTS.CONTEXT_CHANGED,
data: { context },
timestamp: Date.now(),
}
console.log('[ExtensionHandler] Broadcasting context to all extensions:', context)
console.log(
'[ExtensionHandler] Broadcasting context to all extensions:',
context,
)
// Send to all registered extension windows
for (const [_, instance] of iframeRegistry.entries()) {
const win = windowIdToWindowMap.get(instance.windowId)
if (win) {
console.log('[ExtensionHandler] Sending context to:', instance.extension.name, instance.windowId)
console.log(
'[ExtensionHandler] Sending context to:',
instance.extension.name,
instance.windowId,
)
win.postMessage(message, '*')
}
}
}
// ==========================================
// Database Methods
// ==========================================
async function handleDatabaseMethodAsync(
request: ExtensionRequest,
extension: IHaexHubExtension, // direct type
) {
const params = request.params as {
query?: string
params?: unknown[]
}
switch (request.method) {
case 'haextension.db.query': {
const rows = await invoke<unknown[]>('extension_sql_select', {
sql: params.query || '',
params: params.params || [],
extensionId: extension.id,
})
return {
rows,
rowsAffected: 0,
lastInsertId: undefined,
}
}
case 'haextension.db.execute': {
await invoke<string[]>('extension_sql_execute', {
sql: params.query || '',
params: params.params || [],
extensionId: extension.id,
})
return {
rows: [],
rowsAffected: 1,
lastInsertId: undefined,
}
}
case 'haextension.db.transaction': {
const statements =
(request.params as { statements?: string[] }).statements || []
for (const stmt of statements) {
await invoke('extension_sql_execute', {
sql: stmt,
params: [],
extensionId: extension.id,
})
}
return { success: true }
}
default:
throw new Error(`Unknown database method: ${request.method}`)
}
}
// ==========================================
// Filesystem Methods (TODO)
// ==========================================
async function handleFilesystemMethodAsync(
request: ExtensionRequest,
extension: IHaexHubExtension,
) {
if (!request || !extension) return
// TODO: Implement filesystem commands in the backend
throw new Error('Filesystem methods not yet implemented')
}
// ==========================================
// HTTP Methods (TODO)
// ==========================================
async function handleHttpMethodAsync(
request: ExtensionRequest,
extension: IHaexHubExtension,
) {
if (!extension || !request) {
throw new Error('Extension not found')
}
// TODO: Implement HTTP commands in the backend
throw new Error('HTTP methods not yet implemented')
}
// ==========================================
// Permission Methods (TODO)
// ==========================================
async function handlePermissionsMethodAsync(
request: ExtensionRequest,
extension: IHaexHubExtension,
) {
if (!extension || !request) {
throw new Error('Extension not found')
}
// TODO: Implement the permission request UI
throw new Error('Permission methods not yet implemented')
}
// ==========================================
// Context Methods
// ==========================================
async function handleContextMethodAsync(request: ExtensionRequest) {
switch (request.method) {
case 'haextension.context.get':
if (!contextGetters) {
throw new Error(
'Context not initialized. Make sure useExtensionMessageHandler is called in a component.',
)
}
return {
theme: contextGetters.getTheme(),
locale: contextGetters.getLocale(),
platform: contextGetters.getPlatform(),
}
default:
throw new Error(`Unknown context method: ${request.method}`)
}
}
// ==========================================
// Storage Methods
// ==========================================
async function handleStorageMethodAsync(
request: ExtensionRequest,
instance: ExtensionInstance,
) {
// Storage is now per-window, not per-extension
const storageKey = `ext_${instance.extension.id}_${instance.windowId}_`
console.log(
`[HaexHub Storage] ${request.method} for window ${instance.windowId}`,
)
switch (request.method) {
case 'haextension.storage.getItem': {
const key = request.params.key as string
return localStorage.getItem(storageKey + key)
}
case 'haextension.storage.setItem': {
const key = request.params.key as string
const value = request.params.value as string
localStorage.setItem(storageKey + key, value)
return null
}
case 'haextension.storage.removeItem': {
const key = request.params.key as string
localStorage.removeItem(storageKey + key)
return null
}
case 'haextension.storage.clear': {
// Remove only instance-specific keys
const keys = Object.keys(localStorage).filter((k) =>
k.startsWith(storageKey),
)
keys.forEach((k) => localStorage.removeItem(k))
return null
}
case 'haextension.storage.keys': {
// Return only instance-specific keys (without prefix)
const keys = Object.keys(localStorage)
.filter((k) => k.startsWith(storageKey))
.map((k) => k.substring(storageKey.length))
return keys
}
default:
throw new Error(`Unknown storage method: ${request.method}`)
}
}
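For orientation, a minimal sketch of the extension-side counterpart of this handler: a request posted from the sandboxed iframe and resolved when the host replies. The request fields follow the ExtensionRequest interface above; the response shape ({ id, result, error }) is an assumption.

import { HAEXTENSION_METHODS } from '@haexhub/sdk'

const callHost = <T>(method: string, params: Record<string, unknown>): Promise<T> => {
  const id = crypto.randomUUID()
  return new Promise<T>((resolve, reject) => {
    const onMessage = (event: MessageEvent) => {
      if (event.data?.id !== id) return
      window.removeEventListener('message', onMessage)
      if (event.data.error) reject(new Error(String(event.data.error)))
      else resolve(event.data.result as T)
    }
    window.addEventListener('message', onMessage)
    // '*' because the host window runs on a different origin than the sandboxed iframe
    window.parent.postMessage({ id, method, params, timestamp: Date.now() }, '*')
  })
}

// Example: read a value from the per-window storage handled above
export const getStoredItem = (key: string) =>
  callHost<string | null>(HAEXTENSION_METHODS.storage.getItem, { key })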

Some files were not shown because too many files have changed in this diff.