25 Commits

Author SHA1 Message Date
e1be08cb76 Add openFile support for opening files with system viewer
Added new filesystem handler for opening files with the system's default viewer:
- Implemented haextension.fs.openFile handler in filesystem.ts
- Writes files to temp directory and opens with openPath from opener plugin
- Added Tauri permissions: opener:allow-open-path with $TEMP/** scope
- Added filesystem permissions for temp directory access

This enables extensions to open files (like images) in the native system viewer where users can zoom and interact with them naturally.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-09 23:58:40 +01:00
7d1f346c4b Improve UiDrawer styling and viewport calculations 2025-11-08 23:14:12 +01:00
af61972342 Fix ui prop reference in UiDrawer component 2025-11-08 20:28:39 +01:00
6187e32f89 Fix lockfile mismatch for zod dependency 2025-11-08 00:21:54 +01:00
43ba246174 Refactor extension handlers and improve mobile UX
- Split extensionMessageHandler into separate handler files
  - Created handlers directory with individual files for database, filesystem, http, permissions, context, and storage
  - Reduced main handler file from 602 to 342 lines
  - Improved code organization and maintainability

- Add viewport utilities for safe area handling
  - New viewport.ts utility with helpers for fullscreen dimensions
  - Proper safe area inset calculations for mobile devices
  - Fixed window positioning on small screens to start at 0,0

- Create UiDrawer wrapper component
  - Automatically applies safe area insets
  - Uses TypeScript DrawerProps interface for code completion
  - Replaced all UDrawer instances with UiDrawer

- Improve window management
  - Windows on small screens now use full viewport with safe areas
  - Fixed maximize functionality to respect safe areas
  - Consolidated safe area logic in reusable utilities
2025-11-08 00:14:53 +01:00
2b739b9e79 Improve database query handling with automatic fallback for RETURNING clauses 2025-11-07 01:39:44 +01:00
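The commit above doesn't show the implementation; one plausible shape of such a fallback, sketched with hypothetical names (`prepareWithReturningFallback` is not from the codebase): strip a `RETURNING` clause for drivers that reject it and flag the caller to re-read the affected row instead.

```typescript
// Hypothetical sketch: some SQLite drivers reject RETURNING clauses, so we
// strip the clause and signal the caller to re-select the affected row.
interface PreparedQuery {
  sql: string
  needsFallbackSelect: boolean
}

function prepareWithReturningFallback(sql: string): PreparedQuery {
  // Match a trailing RETURNING clause (case-insensitive, deliberately simple)
  const match = sql.match(/\s+RETURNING\s+[\s\S]+$/i)
  if (!match) {
    return { sql, needsFallbackSelect: false }
  }
  return {
    sql: sql.slice(0, match.index),
    needsFallbackSelect: true,
  }
}
```

A caller would run the rewritten statement and, when `needsFallbackSelect` is set, issue a follow-up `SELECT` by primary key to emulate the `RETURNING` result.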
63849d86e1 Add sync backend infrastructure and improve grid snapping
- Implement crypto utilities for vault key management (hybrid approach)
  - PBKDF2 key derivation with 600k iterations
  - AES-GCM encryption for vault keys and CRDT data
  - Optimized Base64 conversion with Buffer/btoa fallback
- Add Sync Engine Store for server communication
  - Vault key storage and retrieval
  - CRDT log push/pull operations
  - Supabase client integration
- Add Sync Orchestrator Store with realtime subscriptions
  - Event-driven sync (push after writes)
  - Supabase Realtime for instant sync
  - Sync status tracking per backend
- Add haex_sync_status table for reliable sync tracking
2025-11-05 17:08:49 +01:00
9adee46166 Bump version to 0.1.11 2025-11-05 01:08:33 +01:00
be7dff72dd Add sync backend infrastructure and improve grid snapping
- Add haexSyncBackends table with CRDT support for multi-backend sync
- Implement useSyncBackendsStore for managing sync server configurations
- Fix desktop icon grid snapping for all icon sizes (small to extra-large)
- Add Supabase client dependency for future sync implementation
- Generate database migration for sync_backends table
2025-11-05 01:08:09 +01:00
b465c117b0 Fix browser text selection during icon drag
- Add e.preventDefault() in handlePointerDown to prevent text selection
- Add @dragstart.prevent to prevent native browser drag
- Add select-none and @selectstart.prevent to workspace
- Add mouseleave event listener to reset drag state when leaving window
- Refactor grid positioning to use consistent iconPadding constant
2025-11-04 22:36:17 +01:00
731ae7cc47 Improve desktop grid positioning and spacing
- Increase icon spacing from 20px to 30px padding
- Add vertical grid offset (-30px) to start grid higher
- Remove screen-size dependent grid columns/rows (now fully dynamic)
- Fix dropzone visualization to use consistent snapToGrid function
- Clean up unused UI store dependencies
2025-11-04 16:39:08 +01:00
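The snapping behavior described above might look roughly like this (a sketch; the constant names `iconPadding` and `gridOffsetY` and the exact values are taken from the commit text, the function shape is an assumption):

```typescript
// Sketch of consistent grid snapping using one shared padding constant
// and a vertical offset so the grid starts higher on the workspace.
const iconPadding = 30   // spacing between icons (commit: 20px -> 30px)
const gridOffsetY = -30  // vertical grid offset from the commit message

function snapToGrid(x: number, y: number, cellSize: number) {
  const pitch = cellSize + iconPadding
  return {
    x: Math.round(x / pitch) * pitch,
    y: Math.round((y - gridOffsetY) / pitch) * pitch + gridOffsetY,
  }
}
```

Using the same `snapToGrid` for both the drop position and the dropzone visualization is what keeps the two from disagreeing.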
26ec4e2a89 Fix icon drag bounds on x-axis
Prevent icons from being dragged beyond viewport boundaries on the x-axis.
Icons are now clamped to valid positions during drag, not just on drop.

- Added viewport bounds checking for both x and y axes during drag
- Icons stay within [0, viewport.width - iconWidth] horizontally
- Icons stay within [0, viewport.height - iconHeight] vertically
- Eliminates snap-back behavior when dragging near edges

Bump version to 0.1.8
2025-11-04 16:11:30 +01:00
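The clamping described in the commit above reduces to a small pure function (a sketch; names are assumptions):

```typescript
// Clamp an icon's position to the viewport during drag, so the icon never
// leaves [0, viewport - icon] on either axis and never snaps back on drop.
function clampToViewport(
  x: number,
  y: number,
  viewport: { width: number; height: number },
  icon: { width: number; height: number },
) {
  return {
    x: Math.min(Math.max(x, 0), viewport.width - icon.width),
    y: Math.min(Math.max(y, 0), viewport.height - icon.height),
  }
}
```

Applying this on every pointer-move event, rather than only on drop, is what eliminates the snap-back near edges.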
279468eddc Add device management and database-backed desktop settings
This update migrates desktop grid settings from localStorage to the database
and introduces a comprehensive device management system.

Features:
- New haex_devices table for device identification and naming
- Device-specific settings with foreign key relationships
- Preset-based icon sizes (Small, Medium, Large, Extra Large)
- Grid positioning improvements to prevent dragging behind PageHeader
- Dynamic icon sizing based on database settings

Database Changes:
- Created haex_devices table with deviceId (UUID) and name columns
- Modified haex_settings to include device_id FK and updated unique constraint
- Migration 0002_loose_quasimodo.sql for schema changes

Technical Improvements:
- Replaced arbitrary icon size slider (60-200px) with preset system
- Icons use actual measured dimensions for proper grid centering
- Settings sync on vault mount for cross-device consistency
- Proper bounds checking during icon drag operations

Bump version to 0.1.7
2025-11-04 16:04:38 +01:00
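The preset system that replaced the 60–200px slider could be as simple as a lookup table (a sketch; the pixel values are illustrative assumptions, not the project's actual sizes):

```typescript
// Sketch of preset-based icon sizes (Small, Medium, Large, Extra Large)
// replacing the old free-form 60-200px slider. Pixel values are assumed.
type IconSizePreset = 'small' | 'medium' | 'large' | 'extra-large'

const presetPx: Record<IconSizePreset, number> = {
  small: 48,
  medium: 64,
  large: 96,
  'extra-large': 128,
}

function iconSizeFor(preset: IconSizePreset): number {
  return presetPx[preset]
}
```

Storing the preset name (not raw pixels) in `haex_settings` keeps the value meaningful across devices with different densities.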
cffb129e4f Auto-open dev extensions after loading
- Dev extensions are now automatically opened in a window after successful load
- Simplified extension finding logic by using devExtensions directly
- Fixed table name handling to support both double quotes and backticks in permission manager
2025-11-04 00:46:46 +01:00
405cf25aab Bump version to 0.1.6 2025-11-03 11:10:11 +01:00
b097bf211d Make windows fullscreen on small screens
- Update window components to use fullscreen layout on small screens
- Adjust UI components styling for better mobile display
- Update desktop store for small screen handling
2025-11-03 11:08:26 +01:00
c71b8468df Fix workspace background feature for Android
- Add missing filesystem permissions in capabilities
  - fs:allow-applocaldata-read-recursive
  - fs:allow-applocaldata-write-recursive
  - fs:allow-write-file
  - fs:allow-mkdir
  - fs:allow-exists
  - fs:allow-remove

- Fix Android photo picker URI handling
  - Detect file type from binary signature (PNG, JPEG, WebP)
  - Use manual path construction to avoid path joining issues
  - Works with Android photo picker content:// URIs

- Improve error handling with detailed toast messages
  - Show specific error at each step (read, mkdir, write, db)
  - Better debugging on Android where console is unavailable

- Fix window activation behavior
  - Restore minimized windows when activated

- Remove unused imports in launcher component
2025-11-03 02:03:34 +01:00
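Detecting the file type from the binary signature, as the commit above describes for Android `content://` URIs (where no file extension is available), amounts to checking a few magic bytes. A sketch:

```typescript
// Identify PNG, JPEG, or WebP from the first bytes of the file,
// since Android photo picker URIs carry no usable extension.
function detectImageType(bytes: Uint8Array): 'png' | 'jpeg' | 'webp' | null {
  // PNG signature: 89 50 4E 47
  if (bytes.length >= 4 && bytes[0] === 0x89 && bytes[1] === 0x50
      && bytes[2] === 0x4e && bytes[3] === 0x47) return 'png'
  // JPEG signature: FF D8 FF
  if (bytes.length >= 3 && bytes[0] === 0xff && bytes[1] === 0xd8
      && bytes[2] === 0xff) return 'jpeg'
  // WebP: "RIFF" <size> "WEBP"
  if (bytes.length >= 12
      && bytes[0] === 0x52 && bytes[1] === 0x49 && bytes[2] === 0x46 && bytes[3] === 0x46
      && bytes[8] === 0x57 && bytes[9] === 0x45 && bytes[10] === 0x42 && bytes[11] === 0x50)
    return 'webp'
  return null
}
```

The detected type can then pick the file extension when copying the image into app storage.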
3a4f482021 Add database migrations for workspace background feature
- Add migration 0001 for background column in haex_workspaces table
- Update vault.db with new schema
- Sync Android assets database
2025-11-03 01:32:00 +01:00
88507410ed Refactor code formatting and imports
- Reformat Rust code in extension database module
  - Improve line breaks and indentation
  - Remove commented-out test code
  - Clean up debug print statements formatting

- Update import path in CRDT schema (use @ alias)

- Fix UButton closing tag formatting in default layout
2025-11-03 01:30:46 +01:00
f38cecc84b Add workspace background customization and fix launcher drawer drag
- Add workspace background image support with file-based storage
  - Store background images in $APPLOCALDATA/files directory
  - Save file paths in database (text column in haex_workspaces)
  - Use convertFileSrc for secure asset:// URL conversion
  - Add context menu to workspaces with "Hintergrund ändern" ("Change background") option

- Implement background management in settings
  - File selection dialog for PNG, JPG, JPEG, WebP images
  - Copy selected images to app data directory
  - Remove background with file cleanup
  - Multilingual UI (German/English)

- Fix launcher drawer drag interference
  - Add :handle-only="true" to UDrawer to restrict drag to handle
  - Simplify drag handlers (removed complex state tracking)
  - Items can now be dragged to desktop without drawer interference

- Extend Tauri asset protocol scope to include $APPLOCALDATA/**
  for background image loading
2025-11-03 01:29:08 +01:00
931d51a1e1 Remove unused function parameters
Removed unused parameters:
- allowed_origin from parse_extension_info_from_path in protocol.rs
- app_handle from resolve_path_pattern in filesystem/core.rs
2025-11-02 15:07:44 +01:00
c97afdee18 Restore trash import for move_vault_to_trash functionality
The trash crate is needed for the move_vault_to_trash function which
moves vault files to the system trash instead of permanently deleting
them. Clippy incorrectly marked it as unused because it's only used
within a cfg(not(target_os = "android")) block.
2025-11-02 15:06:02 +01:00
65d2770df3 Fix Android build by unconditionally importing ts_rs::TS
When cargo clippy removed the unused trash import, the cfg attribute
accidentally applied to the ts_rs::TS import below it, making it
conditional for Android. This caused the Android build to fail with
"cannot find derive macro TS in this scope".

Moved the TS import out of the cfg block to make it available for all
platforms including Android.
2025-11-02 15:02:45 +01:00
a52e1b43fa Remove unused code and modernize Rust format strings
Applied cargo clippy fixes to clean up codebase:
- Removed unused imports (serde_json::json, std::collections::HashSet)
- Removed unused function encode_hex_for_log
- Modernized format strings to use inline variables
- Fixed clippy warnings for better code quality

All changes applied automatically by cargo clippy --fix
2025-11-02 14:48:01 +01:00
6ceb22f014 Bundle Iconify icons locally and enhance CSP for Tauri protocols
- Add lucide and hugeicons to serverBundle collections for local bundling
- Add https://tauri.localhost and asset: protocol to CSP directives
- Prevents CSP errors and eliminates dependency on Iconify API
2025-11-02 14:28:06 +01:00
74 changed files with 8354 additions and 2353 deletions

View File

@@ -68,7 +68,7 @@ export default defineNuxtConfig({
     includeCustomCollections: true,
   },
   serverBundle: {
-    collections: ['mdi', 'line-md', 'solar', 'gg', 'emojione'],
+    collections: ['mdi', 'line-md', 'solar', 'gg', 'emojione', 'lucide', 'hugeicons'],
   },
   customCollections: [

View File

@@ -1,7 +1,7 @@
 {
   "name": "haex-hub",
   "private": true,
-  "version": "0.1.4",
+  "version": "0.1.12",
   "type": "module",
   "scripts": {
     "build": "nuxt build",
@@ -23,8 +23,9 @@
     "@nuxt/icon": "2.0.0",
     "@nuxt/ui": "4.1.0",
     "@nuxtjs/i18n": "10.0.6",
-    "@pinia/nuxt": "^0.11.2",
-    "@tailwindcss/vite": "^4.1.16",
+    "@pinia/nuxt": "^0.11.3",
+    "@supabase/supabase-js": "^2.80.0",
+    "@tailwindcss/vite": "^4.1.17",
     "@tauri-apps/api": "^2.9.0",
     "@tauri-apps/plugin-dialog": "^2.4.2",
     "@tauri-apps/plugin-fs": "^2.4.4",
@@ -37,32 +38,32 @@
     "@vueuse/gesture": "^2.0.0",
     "@vueuse/nuxt": "^13.9.0",
     "drizzle-orm": "^0.44.7",
-    "eslint": "^9.38.0",
+    "eslint": "^9.39.1",
     "nuxt-zod-i18n": "^1.12.1",
     "swiper": "^12.0.3",
-    "tailwindcss": "^4.1.16",
-    "vue": "^3.5.22",
+    "tailwindcss": "^4.1.17",
+    "vue": "^3.5.24",
     "vue-router": "^4.6.3",
     "zod": "^3.25.76"
   },
   "devDependencies": {
     "@iconify-json/hugeicons": "^1.2.17",
-    "@iconify-json/lucide": "^1.2.71",
-    "@iconify/json": "^2.2.401",
-    "@iconify/tailwind4": "^1.0.6",
+    "@iconify-json/lucide": "^1.2.72",
+    "@iconify/json": "^2.2.404",
+    "@iconify/tailwind4": "^1.1.0",
     "@libsql/client": "^0.15.15",
-    "@tauri-apps/cli": "^2.9.1",
-    "@types/node": "^24.9.1",
+    "@tauri-apps/cli": "^2.9.3",
+    "@types/node": "^24.10.0",
     "@vitejs/plugin-vue": "6.0.1",
-    "@vue/compiler-sfc": "^3.5.22",
-    "drizzle-kit": "^0.31.5",
-    "globals": "^16.4.0",
-    "nuxt": "^4.2.0",
+    "@vue/compiler-sfc": "^3.5.24",
+    "drizzle-kit": "^0.31.6",
+    "globals": "^16.5.0",
+    "nuxt": "^4.2.1",
     "prettier": "3.6.2",
     "tsx": "^4.20.6",
     "tw-animate-css": "^1.4.0",
     "typescript": "^5.9.3",
-    "vite": "^7.1.3",
+    "vite": "^7.2.2",
     "vue-tsc": "3.0.6"
   },
   "prettier": {
pnpm-lock.yaml (generated) — 3239 lines; file diff suppressed because it is too large.

View File

@@ -18,16 +18,27 @@
   "fs:allow-appconfig-write-recursive",
   "fs:allow-appdata-read-recursive",
   "fs:allow-appdata-write-recursive",
+  "fs:allow-applocaldata-read-recursive",
+  "fs:allow-applocaldata-write-recursive",
   "fs:allow-read-file",
+  "fs:allow-write-file",
   "fs:allow-read-dir",
+  "fs:allow-mkdir",
+  "fs:allow-exists",
+  "fs:allow-remove",
   "fs:allow-resource-read-recursive",
   "fs:allow-resource-write-recursive",
   "fs:allow-download-read-recursive",
   "fs:allow-download-write-recursive",
+  "fs:allow-temp-read-recursive",
+  "fs:allow-temp-write-recursive",
   "fs:default",
   {
     "identifier": "fs:scope",
-    "allow": [{ "path": "**" }]
+    "allow": [
+      { "path": "**" },
+      { "path": "$TEMP/**" }
+    ]
   },
   "http:allow-fetch-send",
   "http:allow-fetch",
@@ -38,6 +49,12 @@
   "notification:allow-is-permission-granted",
   "notification:default",
   "opener:allow-open-url",
+  {
+    "identifier": "opener:allow-open-path",
+    "allow": [
+      { "path": "$TEMP/**" }
+    ]
+  },
   "opener:default",
   "os:allow-hostname",
   "os:default",

View File

@@ -0,0 +1,105 @@
CREATE TABLE `haex_crdt_configs` (
`key` text PRIMARY KEY NOT NULL,
`value` text
);
--> statement-breakpoint
CREATE TABLE `haex_crdt_logs` (
`id` text PRIMARY KEY NOT NULL,
`haex_timestamp` text,
`table_name` text,
`row_pks` text,
`op_type` text,
`column_name` text,
`new_value` text,
`old_value` text
);
--> statement-breakpoint
CREATE INDEX `idx_haex_timestamp` ON `haex_crdt_logs` (`haex_timestamp`);--> statement-breakpoint
CREATE INDEX `idx_table_row` ON `haex_crdt_logs` (`table_name`,`row_pks`);--> statement-breakpoint
CREATE TABLE `haex_crdt_snapshots` (
`snapshot_id` text PRIMARY KEY NOT NULL,
`created` text,
`epoch_hlc` text,
`location_url` text,
`file_size_bytes` integer
);
--> statement-breakpoint
CREATE TABLE `haex_desktop_items` (
`id` text PRIMARY KEY NOT NULL,
`workspace_id` text NOT NULL,
`item_type` text NOT NULL,
`extension_id` text,
`system_window_id` text,
`position_x` integer DEFAULT 0 NOT NULL,
`position_y` integer DEFAULT 0 NOT NULL,
`haex_timestamp` text,
FOREIGN KEY (`workspace_id`) REFERENCES `haex_workspaces`(`id`) ON UPDATE no action ON DELETE cascade,
FOREIGN KEY (`extension_id`) REFERENCES `haex_extensions`(`id`) ON UPDATE no action ON DELETE cascade,
CONSTRAINT "item_reference" CHECK(("haex_desktop_items"."item_type" = 'extension' AND "haex_desktop_items"."extension_id" IS NOT NULL AND "haex_desktop_items"."system_window_id" IS NULL) OR ("haex_desktop_items"."item_type" = 'system' AND "haex_desktop_items"."system_window_id" IS NOT NULL AND "haex_desktop_items"."extension_id" IS NULL) OR ("haex_desktop_items"."item_type" = 'file' AND "haex_desktop_items"."system_window_id" IS NOT NULL AND "haex_desktop_items"."extension_id" IS NULL) OR ("haex_desktop_items"."item_type" = 'folder' AND "haex_desktop_items"."system_window_id" IS NOT NULL AND "haex_desktop_items"."extension_id" IS NULL))
);
--> statement-breakpoint
CREATE TABLE `haex_extension_permissions` (
`id` text PRIMARY KEY NOT NULL,
`extension_id` text NOT NULL,
`resource_type` text,
`action` text,
`target` text,
`constraints` text,
`status` text DEFAULT 'denied' NOT NULL,
`created_at` text DEFAULT (CURRENT_TIMESTAMP),
`updated_at` integer,
`haex_timestamp` text,
FOREIGN KEY (`extension_id`) REFERENCES `haex_extensions`(`id`) ON UPDATE no action ON DELETE cascade
);
--> statement-breakpoint
CREATE UNIQUE INDEX `haex_extension_permissions_extension_id_resource_type_action_target_unique` ON `haex_extension_permissions` (`extension_id`,`resource_type`,`action`,`target`);--> statement-breakpoint
CREATE TABLE `haex_extensions` (
`id` text PRIMARY KEY NOT NULL,
`public_key` text NOT NULL,
`name` text NOT NULL,
`version` text NOT NULL,
`author` text,
`description` text,
`entry` text DEFAULT 'index.html',
`homepage` text,
`enabled` integer DEFAULT true,
`icon` text,
`signature` text NOT NULL,
`single_instance` integer DEFAULT false,
`haex_timestamp` text
);
--> statement-breakpoint
CREATE UNIQUE INDEX `haex_extensions_public_key_name_unique` ON `haex_extensions` (`public_key`,`name`);--> statement-breakpoint
CREATE TABLE `haex_notifications` (
`id` text PRIMARY KEY NOT NULL,
`alt` text,
`date` text,
`icon` text,
`image` text,
`read` integer,
`source` text,
`text` text,
`title` text,
`type` text NOT NULL,
`haex_timestamp` text
);
--> statement-breakpoint
CREATE TABLE `haex_settings` (
`id` text PRIMARY KEY NOT NULL,
`key` text,
`type` text,
`value` text,
`haex_timestamp` text
);
--> statement-breakpoint
CREATE UNIQUE INDEX `haex_settings_key_type_value_unique` ON `haex_settings` (`key`,`type`,`value`);--> statement-breakpoint
CREATE TABLE `haex_workspaces` (
`id` text PRIMARY KEY NOT NULL,
`device_id` text NOT NULL,
`name` text NOT NULL,
`position` integer DEFAULT 0 NOT NULL,
`background` blob,
`haex_timestamp` text
);
--> statement-breakpoint
CREATE UNIQUE INDEX `haex_workspaces_position_unique` ON `haex_workspaces` (`position`);

View File

@@ -0,0 +1,15 @@
PRAGMA foreign_keys=OFF;--> statement-breakpoint
CREATE TABLE `__new_haex_workspaces` (
`id` text PRIMARY KEY NOT NULL,
`device_id` text NOT NULL,
`name` text NOT NULL,
`position` integer DEFAULT 0 NOT NULL,
`background` text,
`haex_timestamp` text
);
--> statement-breakpoint
INSERT INTO `__new_haex_workspaces`("id", "device_id", "name", "position", "background", "haex_timestamp") SELECT "id", "device_id", "name", "position", "background", "haex_timestamp" FROM `haex_workspaces`;--> statement-breakpoint
DROP TABLE `haex_workspaces`;--> statement-breakpoint
ALTER TABLE `__new_haex_workspaces` RENAME TO `haex_workspaces`;--> statement-breakpoint
PRAGMA foreign_keys=ON;--> statement-breakpoint
CREATE UNIQUE INDEX `haex_workspaces_position_unique` ON `haex_workspaces` (`position`);

View File

@@ -0,0 +1,13 @@
CREATE TABLE `haex_devices` (
`id` text PRIMARY KEY NOT NULL,
`device_id` text NOT NULL,
`name` text NOT NULL,
`created_at` text DEFAULT (CURRENT_TIMESTAMP),
`updated_at` integer,
`haex_timestamp` text
);
--> statement-breakpoint
CREATE UNIQUE INDEX `haex_devices_device_id_unique` ON `haex_devices` (`device_id`);--> statement-breakpoint
DROP INDEX `haex_settings_key_type_value_unique`;--> statement-breakpoint
ALTER TABLE `haex_settings` ADD `device_id` text REFERENCES haex_devices(id);--> statement-breakpoint
CREATE UNIQUE INDEX `haex_settings_device_id_key_type_unique` ON `haex_settings` (`device_id`,`key`,`type`);

View File

@@ -0,0 +1,10 @@
CREATE TABLE `haex_sync_backends` (
`id` text PRIMARY KEY NOT NULL,
`name` text NOT NULL,
`server_url` text NOT NULL,
`enabled` integer DEFAULT true NOT NULL,
`priority` integer DEFAULT 0 NOT NULL,
`created_at` text DEFAULT (CURRENT_TIMESTAMP),
`updated_at` integer,
`haex_timestamp` text
);

View File

@@ -0,0 +1,692 @@
{
"version": "6",
"dialect": "sqlite",
"id": "e3d61ad1-63be-41be-9243-41144e215f98",
"prevId": "00000000-0000-0000-0000-000000000000",
"tables": {
"haex_crdt_configs": {
"name": "haex_crdt_configs",
"columns": {
"key": {
"name": "key",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"value": {
"name": "value",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {},
"foreignKeys": {},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_crdt_logs": {
"name": "haex_crdt_logs",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"haex_timestamp": {
"name": "haex_timestamp",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"table_name": {
"name": "table_name",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"row_pks": {
"name": "row_pks",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"op_type": {
"name": "op_type",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"column_name": {
"name": "column_name",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"new_value": {
"name": "new_value",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"old_value": {
"name": "old_value",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {
"idx_haex_timestamp": {
"name": "idx_haex_timestamp",
"columns": [
"haex_timestamp"
],
"isUnique": false
},
"idx_table_row": {
"name": "idx_table_row",
"columns": [
"table_name",
"row_pks"
],
"isUnique": false
}
},
"foreignKeys": {},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_crdt_snapshots": {
"name": "haex_crdt_snapshots",
"columns": {
"snapshot_id": {
"name": "snapshot_id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"created": {
"name": "created",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"epoch_hlc": {
"name": "epoch_hlc",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"location_url": {
"name": "location_url",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"file_size_bytes": {
"name": "file_size_bytes",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {},
"foreignKeys": {},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_desktop_items": {
"name": "haex_desktop_items",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"workspace_id": {
"name": "workspace_id",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"item_type": {
"name": "item_type",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"extension_id": {
"name": "extension_id",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"system_window_id": {
"name": "system_window_id",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"position_x": {
"name": "position_x",
"type": "integer",
"primaryKey": false,
"notNull": true,
"autoincrement": false,
"default": 0
},
"position_y": {
"name": "position_y",
"type": "integer",
"primaryKey": false,
"notNull": true,
"autoincrement": false,
"default": 0
},
"haex_timestamp": {
"name": "haex_timestamp",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {},
"foreignKeys": {
"haex_desktop_items_workspace_id_haex_workspaces_id_fk": {
"name": "haex_desktop_items_workspace_id_haex_workspaces_id_fk",
"tableFrom": "haex_desktop_items",
"tableTo": "haex_workspaces",
"columnsFrom": [
"workspace_id"
],
"columnsTo": [
"id"
],
"onDelete": "cascade",
"onUpdate": "no action"
},
"haex_desktop_items_extension_id_haex_extensions_id_fk": {
"name": "haex_desktop_items_extension_id_haex_extensions_id_fk",
"tableFrom": "haex_desktop_items",
"tableTo": "haex_extensions",
"columnsFrom": [
"extension_id"
],
"columnsTo": [
"id"
],
"onDelete": "cascade",
"onUpdate": "no action"
}
},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {
"item_reference": {
"name": "item_reference",
"value": "(\"haex_desktop_items\".\"item_type\" = 'extension' AND \"haex_desktop_items\".\"extension_id\" IS NOT NULL AND \"haex_desktop_items\".\"system_window_id\" IS NULL) OR (\"haex_desktop_items\".\"item_type\" = 'system' AND \"haex_desktop_items\".\"system_window_id\" IS NOT NULL AND \"haex_desktop_items\".\"extension_id\" IS NULL) OR (\"haex_desktop_items\".\"item_type\" = 'file' AND \"haex_desktop_items\".\"system_window_id\" IS NOT NULL AND \"haex_desktop_items\".\"extension_id\" IS NULL) OR (\"haex_desktop_items\".\"item_type\" = 'folder' AND \"haex_desktop_items\".\"system_window_id\" IS NOT NULL AND \"haex_desktop_items\".\"extension_id\" IS NULL)"
}
}
},
"haex_extension_permissions": {
"name": "haex_extension_permissions",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"extension_id": {
"name": "extension_id",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"resource_type": {
"name": "resource_type",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"action": {
"name": "action",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"target": {
"name": "target",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"constraints": {
"name": "constraints",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"status": {
"name": "status",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false,
"default": "'denied'"
},
"created_at": {
"name": "created_at",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false,
"default": "(CURRENT_TIMESTAMP)"
},
"updated_at": {
"name": "updated_at",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"haex_timestamp": {
"name": "haex_timestamp",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {
"haex_extension_permissions_extension_id_resource_type_action_target_unique": {
"name": "haex_extension_permissions_extension_id_resource_type_action_target_unique",
"columns": [
"extension_id",
"resource_type",
"action",
"target"
],
"isUnique": true
}
},
"foreignKeys": {
"haex_extension_permissions_extension_id_haex_extensions_id_fk": {
"name": "haex_extension_permissions_extension_id_haex_extensions_id_fk",
"tableFrom": "haex_extension_permissions",
"tableTo": "haex_extensions",
"columnsFrom": [
"extension_id"
],
"columnsTo": [
"id"
],
"onDelete": "cascade",
"onUpdate": "no action"
}
},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_extensions": {
"name": "haex_extensions",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"public_key": {
"name": "public_key",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"name": {
"name": "name",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"version": {
"name": "version",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"author": {
"name": "author",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"description": {
"name": "description",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"entry": {
"name": "entry",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false,
"default": "'index.html'"
},
"homepage": {
"name": "homepage",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"enabled": {
"name": "enabled",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false,
"default": true
},
"icon": {
"name": "icon",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"signature": {
"name": "signature",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"single_instance": {
"name": "single_instance",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false,
"default": false
},
"haex_timestamp": {
"name": "haex_timestamp",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {
"haex_extensions_public_key_name_unique": {
"name": "haex_extensions_public_key_name_unique",
"columns": [
"public_key",
"name"
],
"isUnique": true
}
},
"foreignKeys": {},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_notifications": {
"name": "haex_notifications",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"alt": {
"name": "alt",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"date": {
"name": "date",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"icon": {
"name": "icon",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"image": {
"name": "image",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"read": {
"name": "read",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"source": {
"name": "source",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"text": {
"name": "text",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"title": {
"name": "title",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"type": {
"name": "type",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"haex_timestamp": {
"name": "haex_timestamp",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {},
"foreignKeys": {},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_settings": {
"name": "haex_settings",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"key": {
"name": "key",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"type": {
"name": "type",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"value": {
"name": "value",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"haex_timestamp": {
"name": "haex_timestamp",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {
"haex_settings_key_type_value_unique": {
"name": "haex_settings_key_type_value_unique",
"columns": [
"key",
"type",
"value"
],
"isUnique": true
}
},
"foreignKeys": {},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_workspaces": {
"name": "haex_workspaces",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"device_id": {
"name": "device_id",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"name": {
"name": "name",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"position": {
"name": "position",
"type": "integer",
"primaryKey": false,
"notNull": true,
"autoincrement": false,
"default": 0
},
"background": {
"name": "background",
"type": "blob",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"haex_timestamp": {
"name": "haex_timestamp",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {
"haex_workspaces_position_unique": {
"name": "haex_workspaces_position_unique",
"columns": [
"position"
],
"isUnique": true
}
},
"foreignKeys": {},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
}
},
"views": {},
"enums": {},
"_meta": {
"schemas": {},
"tables": {},
"columns": {}
},
"internal": {
"indexes": {}
}
}


@ -0,0 +1,692 @@
{
"version": "6",
"dialect": "sqlite",
"id": "10bec43a-4227-483e-b1c1-fd50ae32bb96",
"prevId": "e3d61ad1-63be-41be-9243-41144e215f98",
"tables": {
"haex_crdt_configs": {
"name": "haex_crdt_configs",
"columns": {
"key": {
"name": "key",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"value": {
"name": "value",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {},
"foreignKeys": {},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_crdt_logs": {
"name": "haex_crdt_logs",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"haex_timestamp": {
"name": "haex_timestamp",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"table_name": {
"name": "table_name",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"row_pks": {
"name": "row_pks",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"op_type": {
"name": "op_type",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"column_name": {
"name": "column_name",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"new_value": {
"name": "new_value",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"old_value": {
"name": "old_value",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {
"idx_haex_timestamp": {
"name": "idx_haex_timestamp",
"columns": [
"haex_timestamp"
],
"isUnique": false
},
"idx_table_row": {
"name": "idx_table_row",
"columns": [
"table_name",
"row_pks"
],
"isUnique": false
}
},
"foreignKeys": {},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_crdt_snapshots": {
"name": "haex_crdt_snapshots",
"columns": {
"snapshot_id": {
"name": "snapshot_id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"created": {
"name": "created",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"epoch_hlc": {
"name": "epoch_hlc",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"location_url": {
"name": "location_url",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"file_size_bytes": {
"name": "file_size_bytes",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {},
"foreignKeys": {},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_desktop_items": {
"name": "haex_desktop_items",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"workspace_id": {
"name": "workspace_id",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"item_type": {
"name": "item_type",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"extension_id": {
"name": "extension_id",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"system_window_id": {
"name": "system_window_id",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"position_x": {
"name": "position_x",
"type": "integer",
"primaryKey": false,
"notNull": true,
"autoincrement": false,
"default": 0
},
"position_y": {
"name": "position_y",
"type": "integer",
"primaryKey": false,
"notNull": true,
"autoincrement": false,
"default": 0
},
"haex_timestamp": {
"name": "haex_timestamp",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {},
"foreignKeys": {
"haex_desktop_items_workspace_id_haex_workspaces_id_fk": {
"name": "haex_desktop_items_workspace_id_haex_workspaces_id_fk",
"tableFrom": "haex_desktop_items",
"tableTo": "haex_workspaces",
"columnsFrom": [
"workspace_id"
],
"columnsTo": [
"id"
],
"onDelete": "cascade",
"onUpdate": "no action"
},
"haex_desktop_items_extension_id_haex_extensions_id_fk": {
"name": "haex_desktop_items_extension_id_haex_extensions_id_fk",
"tableFrom": "haex_desktop_items",
"tableTo": "haex_extensions",
"columnsFrom": [
"extension_id"
],
"columnsTo": [
"id"
],
"onDelete": "cascade",
"onUpdate": "no action"
}
},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {
"item_reference": {
"name": "item_reference",
"value": "(\"haex_desktop_items\".\"item_type\" = 'extension' AND \"haex_desktop_items\".\"extension_id\" IS NOT NULL AND \"haex_desktop_items\".\"system_window_id\" IS NULL) OR (\"haex_desktop_items\".\"item_type\" = 'system' AND \"haex_desktop_items\".\"system_window_id\" IS NOT NULL AND \"haex_desktop_items\".\"extension_id\" IS NULL) OR (\"haex_desktop_items\".\"item_type\" = 'file' AND \"haex_desktop_items\".\"system_window_id\" IS NOT NULL AND \"haex_desktop_items\".\"extension_id\" IS NULL) OR (\"haex_desktop_items\".\"item_type\" = 'folder' AND \"haex_desktop_items\".\"system_window_id\" IS NOT NULL AND \"haex_desktop_items\".\"extension_id\" IS NULL)"
}
}
},
"haex_extension_permissions": {
"name": "haex_extension_permissions",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"extension_id": {
"name": "extension_id",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"resource_type": {
"name": "resource_type",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"action": {
"name": "action",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"target": {
"name": "target",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"constraints": {
"name": "constraints",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"status": {
"name": "status",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false,
"default": "'denied'"
},
"created_at": {
"name": "created_at",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false,
"default": "(CURRENT_TIMESTAMP)"
},
"updated_at": {
"name": "updated_at",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"haex_timestamp": {
"name": "haex_timestamp",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {
"haex_extension_permissions_extension_id_resource_type_action_target_unique": {
"name": "haex_extension_permissions_extension_id_resource_type_action_target_unique",
"columns": [
"extension_id",
"resource_type",
"action",
"target"
],
"isUnique": true
}
},
"foreignKeys": {
"haex_extension_permissions_extension_id_haex_extensions_id_fk": {
"name": "haex_extension_permissions_extension_id_haex_extensions_id_fk",
"tableFrom": "haex_extension_permissions",
"tableTo": "haex_extensions",
"columnsFrom": [
"extension_id"
],
"columnsTo": [
"id"
],
"onDelete": "cascade",
"onUpdate": "no action"
}
},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_extensions": {
"name": "haex_extensions",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"public_key": {
"name": "public_key",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"name": {
"name": "name",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"version": {
"name": "version",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"author": {
"name": "author",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"description": {
"name": "description",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"entry": {
"name": "entry",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false,
"default": "'index.html'"
},
"homepage": {
"name": "homepage",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"enabled": {
"name": "enabled",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false,
"default": true
},
"icon": {
"name": "icon",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"signature": {
"name": "signature",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"single_instance": {
"name": "single_instance",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false,
"default": false
},
"haex_timestamp": {
"name": "haex_timestamp",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {
"haex_extensions_public_key_name_unique": {
"name": "haex_extensions_public_key_name_unique",
"columns": [
"public_key",
"name"
],
"isUnique": true
}
},
"foreignKeys": {},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_notifications": {
"name": "haex_notifications",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"alt": {
"name": "alt",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"date": {
"name": "date",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"icon": {
"name": "icon",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"image": {
"name": "image",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"read": {
"name": "read",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"source": {
"name": "source",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"text": {
"name": "text",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"title": {
"name": "title",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"type": {
"name": "type",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"haex_timestamp": {
"name": "haex_timestamp",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {},
"foreignKeys": {},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_settings": {
"name": "haex_settings",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"key": {
"name": "key",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"type": {
"name": "type",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"value": {
"name": "value",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"haex_timestamp": {
"name": "haex_timestamp",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {
"haex_settings_key_type_value_unique": {
"name": "haex_settings_key_type_value_unique",
"columns": [
"key",
"type",
"value"
],
"isUnique": true
}
},
"foreignKeys": {},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_workspaces": {
"name": "haex_workspaces",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"device_id": {
"name": "device_id",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"name": {
"name": "name",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"position": {
"name": "position",
"type": "integer",
"primaryKey": false,
"notNull": true,
"autoincrement": false,
"default": 0
},
"background": {
"name": "background",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"haex_timestamp": {
"name": "haex_timestamp",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {
"haex_workspaces_position_unique": {
"name": "haex_workspaces_position_unique",
"columns": [
"position"
],
"isUnique": true
}
},
"foreignKeys": {},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
}
},
"views": {},
"enums": {},
"_meta": {
"schemas": {},
"tables": {},
"columns": {}
},
"internal": {
"indexes": {}
}
}


@ -0,0 +1,774 @@
{
"version": "6",
"dialect": "sqlite",
"id": "3aedf10c-2266-40f4-8549-0ff8b0588853",
"prevId": "10bec43a-4227-483e-b1c1-fd50ae32bb96",
"tables": {
"haex_crdt_configs": {
"name": "haex_crdt_configs",
"columns": {
"key": {
"name": "key",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"value": {
"name": "value",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {},
"foreignKeys": {},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_crdt_logs": {
"name": "haex_crdt_logs",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"haex_timestamp": {
"name": "haex_timestamp",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"table_name": {
"name": "table_name",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"row_pks": {
"name": "row_pks",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"op_type": {
"name": "op_type",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"column_name": {
"name": "column_name",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"new_value": {
"name": "new_value",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"old_value": {
"name": "old_value",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {
"idx_haex_timestamp": {
"name": "idx_haex_timestamp",
"columns": [
"haex_timestamp"
],
"isUnique": false
},
"idx_table_row": {
"name": "idx_table_row",
"columns": [
"table_name",
"row_pks"
],
"isUnique": false
}
},
"foreignKeys": {},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_crdt_snapshots": {
"name": "haex_crdt_snapshots",
"columns": {
"snapshot_id": {
"name": "snapshot_id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"created": {
"name": "created",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"epoch_hlc": {
"name": "epoch_hlc",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"location_url": {
"name": "location_url",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"file_size_bytes": {
"name": "file_size_bytes",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {},
"foreignKeys": {},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_desktop_items": {
"name": "haex_desktop_items",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"workspace_id": {
"name": "workspace_id",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"item_type": {
"name": "item_type",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"extension_id": {
"name": "extension_id",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"system_window_id": {
"name": "system_window_id",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"position_x": {
"name": "position_x",
"type": "integer",
"primaryKey": false,
"notNull": true,
"autoincrement": false,
"default": 0
},
"position_y": {
"name": "position_y",
"type": "integer",
"primaryKey": false,
"notNull": true,
"autoincrement": false,
"default": 0
},
"haex_timestamp": {
"name": "haex_timestamp",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {},
"foreignKeys": {
"haex_desktop_items_workspace_id_haex_workspaces_id_fk": {
"name": "haex_desktop_items_workspace_id_haex_workspaces_id_fk",
"tableFrom": "haex_desktop_items",
"tableTo": "haex_workspaces",
"columnsFrom": [
"workspace_id"
],
"columnsTo": [
"id"
],
"onDelete": "cascade",
"onUpdate": "no action"
},
"haex_desktop_items_extension_id_haex_extensions_id_fk": {
"name": "haex_desktop_items_extension_id_haex_extensions_id_fk",
"tableFrom": "haex_desktop_items",
"tableTo": "haex_extensions",
"columnsFrom": [
"extension_id"
],
"columnsTo": [
"id"
],
"onDelete": "cascade",
"onUpdate": "no action"
}
},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {
"item_reference": {
"name": "item_reference",
"value": "(\"haex_desktop_items\".\"item_type\" = 'extension' AND \"haex_desktop_items\".\"extension_id\" IS NOT NULL AND \"haex_desktop_items\".\"system_window_id\" IS NULL) OR (\"haex_desktop_items\".\"item_type\" = 'system' AND \"haex_desktop_items\".\"system_window_id\" IS NOT NULL AND \"haex_desktop_items\".\"extension_id\" IS NULL) OR (\"haex_desktop_items\".\"item_type\" = 'file' AND \"haex_desktop_items\".\"system_window_id\" IS NOT NULL AND \"haex_desktop_items\".\"extension_id\" IS NULL) OR (\"haex_desktop_items\".\"item_type\" = 'folder' AND \"haex_desktop_items\".\"system_window_id\" IS NOT NULL AND \"haex_desktop_items\".\"extension_id\" IS NULL)"
}
}
},
"haex_devices": {
"name": "haex_devices",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"device_id": {
"name": "device_id",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"name": {
"name": "name",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"created_at": {
"name": "created_at",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false,
"default": "(CURRENT_TIMESTAMP)"
},
"updated_at": {
"name": "updated_at",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"haex_timestamp": {
"name": "haex_timestamp",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {
"haex_devices_device_id_unique": {
"name": "haex_devices_device_id_unique",
"columns": [
"device_id"
],
"isUnique": true
}
},
"foreignKeys": {},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_extension_permissions": {
"name": "haex_extension_permissions",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"extension_id": {
"name": "extension_id",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"resource_type": {
"name": "resource_type",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"action": {
"name": "action",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"target": {
"name": "target",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"constraints": {
"name": "constraints",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"status": {
"name": "status",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false,
"default": "'denied'"
},
"created_at": {
"name": "created_at",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false,
"default": "(CURRENT_TIMESTAMP)"
},
"updated_at": {
"name": "updated_at",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"haex_timestamp": {
"name": "haex_timestamp",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {
"haex_extension_permissions_extension_id_resource_type_action_target_unique": {
"name": "haex_extension_permissions_extension_id_resource_type_action_target_unique",
"columns": [
"extension_id",
"resource_type",
"action",
"target"
],
"isUnique": true
}
},
"foreignKeys": {
"haex_extension_permissions_extension_id_haex_extensions_id_fk": {
"name": "haex_extension_permissions_extension_id_haex_extensions_id_fk",
"tableFrom": "haex_extension_permissions",
"tableTo": "haex_extensions",
"columnsFrom": [
"extension_id"
],
"columnsTo": [
"id"
],
"onDelete": "cascade",
"onUpdate": "no action"
}
},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_extensions": {
"name": "haex_extensions",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"public_key": {
"name": "public_key",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"name": {
"name": "name",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"version": {
"name": "version",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"author": {
"name": "author",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"description": {
"name": "description",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"entry": {
"name": "entry",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false,
"default": "'index.html'"
},
"homepage": {
"name": "homepage",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"enabled": {
"name": "enabled",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false,
"default": true
},
"icon": {
"name": "icon",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"signature": {
"name": "signature",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"single_instance": {
"name": "single_instance",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false,
"default": false
},
"haex_timestamp": {
"name": "haex_timestamp",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {
"haex_extensions_public_key_name_unique": {
"name": "haex_extensions_public_key_name_unique",
"columns": [
"public_key",
"name"
],
"isUnique": true
}
},
"foreignKeys": {},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_notifications": {
"name": "haex_notifications",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"alt": {
"name": "alt",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"date": {
"name": "date",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"icon": {
"name": "icon",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"image": {
"name": "image",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"read": {
"name": "read",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"source": {
"name": "source",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"text": {
"name": "text",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"title": {
"name": "title",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"type": {
"name": "type",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"haex_timestamp": {
"name": "haex_timestamp",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {},
"foreignKeys": {},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_settings": {
"name": "haex_settings",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"device_id": {
"name": "device_id",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"key": {
"name": "key",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"type": {
"name": "type",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"value": {
"name": "value",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"haex_timestamp": {
"name": "haex_timestamp",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {
"haex_settings_device_id_key_type_unique": {
"name": "haex_settings_device_id_key_type_unique",
"columns": [
"device_id",
"key",
"type"
],
"isUnique": true
}
},
"foreignKeys": {
"haex_settings_device_id_haex_devices_id_fk": {
"name": "haex_settings_device_id_haex_devices_id_fk",
"tableFrom": "haex_settings",
"tableTo": "haex_devices",
"columnsFrom": [
"device_id"
],
"columnsTo": [
"id"
],
"onDelete": "cascade",
"onUpdate": "no action"
}
},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_workspaces": {
"name": "haex_workspaces",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"device_id": {
"name": "device_id",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"name": {
"name": "name",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"position": {
"name": "position",
"type": "integer",
"primaryKey": false,
"notNull": true,
"autoincrement": false,
"default": 0
},
"background": {
"name": "background",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"haex_timestamp": {
"name": "haex_timestamp",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {
"haex_workspaces_position_unique": {
"name": "haex_workspaces_position_unique",
"columns": [
"position"
],
"isUnique": true
}
},
"foreignKeys": {},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
}
},
"views": {},
"enums": {},
"_meta": {
"schemas": {},
"tables": {},
"columns": {}
},
"internal": {
"indexes": {}
}
}


@ -0,0 +1,843 @@
{
"version": "6",
"dialect": "sqlite",
"id": "bf82259e-9264-44e7-a60f-8cc14a1f22e2",
"prevId": "3aedf10c-2266-40f4-8549-0ff8b0588853",
"tables": {
"haex_crdt_configs": {
"name": "haex_crdt_configs",
"columns": {
"key": {
"name": "key",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"value": {
"name": "value",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {},
"foreignKeys": {},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_crdt_logs": {
"name": "haex_crdt_logs",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"haex_timestamp": {
"name": "haex_timestamp",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"table_name": {
"name": "table_name",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"row_pks": {
"name": "row_pks",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"op_type": {
"name": "op_type",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"column_name": {
"name": "column_name",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"new_value": {
"name": "new_value",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"old_value": {
"name": "old_value",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {
"idx_haex_timestamp": {
"name": "idx_haex_timestamp",
"columns": [
"haex_timestamp"
],
"isUnique": false
},
"idx_table_row": {
"name": "idx_table_row",
"columns": [
"table_name",
"row_pks"
],
"isUnique": false
}
},
"foreignKeys": {},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_crdt_snapshots": {
"name": "haex_crdt_snapshots",
"columns": {
"snapshot_id": {
"name": "snapshot_id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"created": {
"name": "created",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"epoch_hlc": {
"name": "epoch_hlc",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"location_url": {
"name": "location_url",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"file_size_bytes": {
"name": "file_size_bytes",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {},
"foreignKeys": {},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_desktop_items": {
"name": "haex_desktop_items",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"workspace_id": {
"name": "workspace_id",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"item_type": {
"name": "item_type",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"extension_id": {
"name": "extension_id",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"system_window_id": {
"name": "system_window_id",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"position_x": {
"name": "position_x",
"type": "integer",
"primaryKey": false,
"notNull": true,
"autoincrement": false,
"default": 0
},
"position_y": {
"name": "position_y",
"type": "integer",
"primaryKey": false,
"notNull": true,
"autoincrement": false,
"default": 0
},
"haex_timestamp": {
"name": "haex_timestamp",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {},
"foreignKeys": {
"haex_desktop_items_workspace_id_haex_workspaces_id_fk": {
"name": "haex_desktop_items_workspace_id_haex_workspaces_id_fk",
"tableFrom": "haex_desktop_items",
"tableTo": "haex_workspaces",
"columnsFrom": [
"workspace_id"
],
"columnsTo": [
"id"
],
"onDelete": "cascade",
"onUpdate": "no action"
},
"haex_desktop_items_extension_id_haex_extensions_id_fk": {
"name": "haex_desktop_items_extension_id_haex_extensions_id_fk",
"tableFrom": "haex_desktop_items",
"tableTo": "haex_extensions",
"columnsFrom": [
"extension_id"
],
"columnsTo": [
"id"
],
"onDelete": "cascade",
"onUpdate": "no action"
}
},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {
"item_reference": {
"name": "item_reference",
"value": "(\"haex_desktop_items\".\"item_type\" = 'extension' AND \"haex_desktop_items\".\"extension_id\" IS NOT NULL AND \"haex_desktop_items\".\"system_window_id\" IS NULL) OR (\"haex_desktop_items\".\"item_type\" = 'system' AND \"haex_desktop_items\".\"system_window_id\" IS NOT NULL AND \"haex_desktop_items\".\"extension_id\" IS NULL) OR (\"haex_desktop_items\".\"item_type\" = 'file' AND \"haex_desktop_items\".\"system_window_id\" IS NOT NULL AND \"haex_desktop_items\".\"extension_id\" IS NULL) OR (\"haex_desktop_items\".\"item_type\" = 'folder' AND \"haex_desktop_items\".\"system_window_id\" IS NOT NULL AND \"haex_desktop_items\".\"extension_id\" IS NULL)"
}
}
},
"haex_devices": {
"name": "haex_devices",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"device_id": {
"name": "device_id",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"name": {
"name": "name",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"created_at": {
"name": "created_at",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false,
"default": "(CURRENT_TIMESTAMP)"
},
"updated_at": {
"name": "updated_at",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"haex_timestamp": {
"name": "haex_timestamp",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {
"haex_devices_device_id_unique": {
"name": "haex_devices_device_id_unique",
"columns": [
"device_id"
],
"isUnique": true
}
},
"foreignKeys": {},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_extension_permissions": {
"name": "haex_extension_permissions",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"extension_id": {
"name": "extension_id",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"resource_type": {
"name": "resource_type",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"action": {
"name": "action",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"target": {
"name": "target",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"constraints": {
"name": "constraints",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"status": {
"name": "status",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false,
"default": "'denied'"
},
"created_at": {
"name": "created_at",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false,
"default": "(CURRENT_TIMESTAMP)"
},
"updated_at": {
"name": "updated_at",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"haex_timestamp": {
"name": "haex_timestamp",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {
"haex_extension_permissions_extension_id_resource_type_action_target_unique": {
"name": "haex_extension_permissions_extension_id_resource_type_action_target_unique",
"columns": [
"extension_id",
"resource_type",
"action",
"target"
],
"isUnique": true
}
},
"foreignKeys": {
"haex_extension_permissions_extension_id_haex_extensions_id_fk": {
"name": "haex_extension_permissions_extension_id_haex_extensions_id_fk",
"tableFrom": "haex_extension_permissions",
"tableTo": "haex_extensions",
"columnsFrom": [
"extension_id"
],
"columnsTo": [
"id"
],
"onDelete": "cascade",
"onUpdate": "no action"
}
},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_extensions": {
"name": "haex_extensions",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"public_key": {
"name": "public_key",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"name": {
"name": "name",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"version": {
"name": "version",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"author": {
"name": "author",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"description": {
"name": "description",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"entry": {
"name": "entry",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false,
"default": "'index.html'"
},
"homepage": {
"name": "homepage",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"enabled": {
"name": "enabled",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false,
"default": true
},
"icon": {
"name": "icon",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"signature": {
"name": "signature",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"single_instance": {
"name": "single_instance",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false,
"default": false
},
"haex_timestamp": {
"name": "haex_timestamp",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {
"haex_extensions_public_key_name_unique": {
"name": "haex_extensions_public_key_name_unique",
"columns": [
"public_key",
"name"
],
"isUnique": true
}
},
"foreignKeys": {},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_notifications": {
"name": "haex_notifications",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"alt": {
"name": "alt",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"date": {
"name": "date",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"icon": {
"name": "icon",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"image": {
"name": "image",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"read": {
"name": "read",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"source": {
"name": "source",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"text": {
"name": "text",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"title": {
"name": "title",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"type": {
"name": "type",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"haex_timestamp": {
"name": "haex_timestamp",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {},
"foreignKeys": {},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_settings": {
"name": "haex_settings",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"device_id": {
"name": "device_id",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"key": {
"name": "key",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"type": {
"name": "type",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"value": {
"name": "value",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"haex_timestamp": {
"name": "haex_timestamp",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {
"haex_settings_device_id_key_type_unique": {
"name": "haex_settings_device_id_key_type_unique",
"columns": [
"device_id",
"key",
"type"
],
"isUnique": true
}
},
"foreignKeys": {
"haex_settings_device_id_haex_devices_id_fk": {
"name": "haex_settings_device_id_haex_devices_id_fk",
"tableFrom": "haex_settings",
"tableTo": "haex_devices",
"columnsFrom": [
"device_id"
],
"columnsTo": [
"id"
],
"onDelete": "cascade",
"onUpdate": "no action"
}
},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_sync_backends": {
"name": "haex_sync_backends",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"name": {
"name": "name",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"server_url": {
"name": "server_url",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"enabled": {
"name": "enabled",
"type": "integer",
"primaryKey": false,
"notNull": true,
"autoincrement": false,
"default": true
},
"priority": {
"name": "priority",
"type": "integer",
"primaryKey": false,
"notNull": true,
"autoincrement": false,
"default": 0
},
"created_at": {
"name": "created_at",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false,
"default": "(CURRENT_TIMESTAMP)"
},
"updated_at": {
"name": "updated_at",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"haex_timestamp": {
"name": "haex_timestamp",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {},
"foreignKeys": {},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
},
"haex_workspaces": {
"name": "haex_workspaces",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"device_id": {
"name": "device_id",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"name": {
"name": "name",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"position": {
"name": "position",
"type": "integer",
"primaryKey": false,
"notNull": true,
"autoincrement": false,
"default": 0
},
"background": {
"name": "background",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"haex_timestamp": {
"name": "haex_timestamp",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {
"haex_workspaces_position_unique": {
"name": "haex_workspaces_position_unique",
"columns": [
"position"
],
"isUnique": true
}
},
"foreignKeys": {},
"compositePrimaryKeys": {},
"uniqueConstraints": {},
"checkConstraints": {}
}
},
"views": {},
"enums": {},
"_meta": {
"schemas": {},
"tables": {},
"columns": {}
},
"internal": {
"indexes": {}
}
}
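The `item_reference` check constraint on `haex_desktop_items` above encodes a mutual-exclusion rule: extension items must carry an `extension_id` and no `system_window_id`, while system, file, and folder items must carry a `system_window_id` and no `extension_id`. A sketch of the same invariant as a plain Rust predicate (the function name and signature are illustrative, not part of the codebase):

```rust
// Sketch: the item_reference CHECK constraint restated as a Rust predicate.
// Names here are illustrative only.
fn item_reference_ok(
    item_type: &str,
    extension_id: Option<&str>,
    system_window_id: Option<&str>,
) -> bool {
    match item_type {
        // Extension items must reference an extension, never a system window.
        "extension" => extension_id.is_some() && system_window_id.is_none(),
        // System, file, and folder items must reference a system window,
        // never an extension.
        "system" | "file" | "folder" => {
            system_window_id.is_some() && extension_id.is_none()
        }
        _ => false,
    }
}

fn main() {
    assert!(item_reference_ok("extension", Some("ext-1"), None));
    assert!(item_reference_ok("file", None, Some("win-1")));
    assert!(!item_reference_ok("extension", None, Some("win-1")));
    println!("item_reference invariant holds for the sample rows");
}
```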

View File

@@ -0,0 +1,34 @@

{
"version": "7",
"dialect": "sqlite",
"entries": [
{
"idx": 0,
"version": "6",
"when": 1762119713008,
"tag": "0000_cynical_nicolaos",
"breakpoints": true
},
{
"idx": 1,
"version": "6",
"when": 1762122405562,
"tag": "0001_furry_brother_voodoo",
"breakpoints": true
},
{
"idx": 2,
"version": "6",
"when": 1762263814375,
"tag": "0002_loose_quasimodo",
"breakpoints": true
},
{
"idx": 3,
"version": "6",
"when": 1762300795436,
"tag": "0003_luxuriant_deathstrike",
"breakpoints": true
}
]
}

Binary file not shown.

View File

@@ -1 +1 @@
-{"default":{"identifier":"default","description":"Capability for the main window","local":true,"windows":["main"],"permissions":["core:default","core:webview:allow-create-webview-window","core:webview:allow-create-webview","core:webview:allow-webview-show","core:webview:default","core:window:allow-create","core:window:allow-get-all-windows","core:window:allow-show","core:window:default","dialog:default","fs:allow-appconfig-read-recursive","fs:allow-appconfig-write-recursive","fs:allow-appdata-read-recursive","fs:allow-appdata-write-recursive","fs:allow-read-file","fs:allow-read-dir","fs:allow-resource-read-recursive","fs:allow-resource-write-recursive","fs:allow-download-read-recursive","fs:allow-download-write-recursive","fs:default",{"identifier":"fs:scope","allow":[{"path":"**"}]},"http:allow-fetch-send","http:allow-fetch","http:default","notification:allow-create-channel","notification:allow-list-channels","notification:allow-notify","notification:allow-is-permission-granted","notification:default","opener:allow-open-url","opener:default","os:allow-hostname","os:default","store:default"]}}
+{"default":{"identifier":"default","description":"Capability for the main window","local":true,"windows":["main"],"permissions":["core:default","core:webview:allow-create-webview-window","core:webview:allow-create-webview","core:webview:allow-webview-show","core:webview:default","core:window:allow-create","core:window:allow-get-all-windows","core:window:allow-show","core:window:default","dialog:default","fs:allow-appconfig-read-recursive","fs:allow-appconfig-write-recursive","fs:allow-appdata-read-recursive","fs:allow-appdata-write-recursive","fs:allow-applocaldata-read-recursive","fs:allow-applocaldata-write-recursive","fs:allow-read-file","fs:allow-write-file","fs:allow-read-dir","fs:allow-mkdir","fs:allow-exists","fs:allow-remove","fs:allow-resource-read-recursive","fs:allow-resource-write-recursive","fs:allow-download-read-recursive","fs:allow-download-write-recursive","fs:allow-temp-read-recursive","fs:allow-temp-write-recursive","fs:default",{"identifier":"fs:scope","allow":[{"path":"**"},{"path":"$TEMP/**"}]},"http:allow-fetch-send","http:allow-fetch","http:default","notification:allow-create-channel","notification:allow-list-channels","notification:allow-notify","notification:allow-is-permission-granted","notification:default","opener:allow-open-url",{"identifier":"opener:allow-open-path","allow":[{"path":"$TEMP/**"}]},"opener:default","os:allow-hostname","os:default","store:default"]}}
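Pretty-printed, the entries this commit adds to the one-line capability JSON above are the temp-directory filesystem grants, the extended `fs:scope`, and the scoped opener entry (excerpt; the `fs:scope` object is modified rather than newly added):

```json
[
  "fs:allow-applocaldata-read-recursive",
  "fs:allow-applocaldata-write-recursive",
  "fs:allow-write-file",
  "fs:allow-mkdir",
  "fs:allow-exists",
  "fs:allow-remove",
  "fs:allow-temp-read-recursive",
  "fs:allow-temp-write-recursive",
  { "identifier": "fs:scope", "allow": [{ "path": "**" }, { "path": "$TEMP/**" }] },
  { "identifier": "opener:allow-open-path", "allow": [{ "path": "$TEMP/**" }] }
]
```

The `$TEMP/**`-scoped `opener:allow-open-path` entry is what lets the new `openFile` handler hand temp files to the system viewer without granting opener access to arbitrary paths.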

View File

@@ -20,11 +20,11 @@ struct TableDefinition {
 pub fn generate_table_names() {
     let out_dir = env::var("OUT_DIR").expect("OUT_DIR ist nicht gesetzt.");
-    println!("Generiere Tabellennamen nach {}", out_dir);
+    println!("Generiere Tabellennamen nach {out_dir}");
     let schema_path = Path::new("../src/database/tableNames.json");
     let dest_path = Path::new(&out_dir).join("tableNames.rs");
-    let file = File::open(&schema_path).expect("Konnte tableNames.json nicht öffnen");
+    let file = File::open(schema_path).expect("Konnte tableNames.json nicht öffnen");
     let reader = BufReader::new(file);
     let schema: Schema =
         serde_json::from_reader(reader).expect("Konnte tableNames.json nicht parsen");
@@ -108,8 +108,7 @@ fn generate_table_constants(table: &TableDefinition, const_prefix: &str) -> Stri
     for (col_key, col_value) in &table.columns {
         let col_const_name = format!("COL_{}_{}", const_prefix, to_screaming_snake_case(col_key));
         code.push_str(&format!(
-            "pub const {}: &str = \"{}\";\n",
-            col_const_name, col_value
+            "pub const {col_const_name}: &str = \"{col_value}\";\n"
         ));
     }

View File

@@ -74,15 +74,14 @@ impl HlcService {
         // Parse den String in ein Uuid-Objekt.
         let uuid = Uuid::parse_str(&node_id_str).map_err(|e| {
             HlcError::ParseNodeId(format!(
-                "Stored device ID is not a valid UUID: {}. Error: {}",
-                node_id_str, e
+                "Stored device ID is not a valid UUID: {node_id_str}. Error: {e}"
             ))
         })?;
         // Hol dir die rohen 16 Bytes und erstelle daraus die uhlc::ID.
         // Das `*` dereferenziert den `&[u8; 16]` zu `[u8; 16]`, was `try_from` erwartet.
         let node_id = ID::try_from(*uuid.as_bytes()).map_err(|e| {
-            HlcError::ParseNodeId(format!("Invalid node ID format from device store: {:?}", e))
+            HlcError::ParseNodeId(format!("Invalid node ID format from device store: {e:?}"))
         })?;
         // 2. Erstelle eine HLC-Instanz mit stabiler Identität
@@ -95,8 +94,7 @@ impl HlcService {
         if let Some(last_timestamp) = Self::load_last_timestamp(conn)? {
             hlc.update_with_timestamp(&last_timestamp).map_err(|e| {
                 HlcError::Parse(format!(
-                    "Failed to update HLC with persisted timestamp: {:?}",
-                    e
+                    "Failed to update HLC with persisted timestamp: {e:?}"
                 ))
             })?;
         }
@@ -119,7 +117,7 @@ impl HlcService {
         if let Some(s) = value.as_str() {
             // Das ist unser Erfolgsfall. Wir haben einen &str und können
             // eine Kopie davon zurückgeben.
-            println!("Gefundene und validierte Geräte-ID: {}", s);
+            println!("Gefundene und validierte Geräte-ID: {s}");
             if Uuid::parse_str(s).is_ok() {
                 // Erfolgsfall: Der Wert ist ein String UND eine gültige UUID.
                 // Wir können die Funktion direkt mit dem Wert verlassen.
@@ -183,19 +181,19 @@ impl HlcService {
         let hlc = hlc_guard.as_mut().ok_or(HlcError::NotInitialized)?;
         hlc.update_with_timestamp(timestamp)
-            .map_err(|e| HlcError::Parse(format!("Failed to update HLC: {:?}", e)))
+            .map_err(|e| HlcError::Parse(format!("Failed to update HLC: {e:?}")))
     }
     /// Lädt den letzten persistierten Zeitstempel aus der Datenbank.
     fn load_last_timestamp(conn: &Connection) -> Result<Option<Timestamp>, HlcError> {
-        let query = format!("SELECT value FROM {} WHERE key = ?1", TABLE_CRDT_CONFIGS);
+        let query = format!("SELECT value FROM {TABLE_CRDT_CONFIGS} WHERE key = ?1");
         match conn.query_row(&query, params![HLC_TIMESTAMP_TYPE], |row| {
             row.get::<_, String>(0)
         }) {
             Ok(state_str) => {
                 let timestamp = Timestamp::from_str(&state_str).map_err(|e| {
-                    HlcError::ParseTimestamp(format!("Invalid timestamp format: {:?}", e))
+                    HlcError::ParseTimestamp(format!("Invalid timestamp format: {e:?}"))
                 })?;
                 Ok(Some(timestamp))
             }
@@ -209,9 +207,8 @@ impl HlcService {
         let timestamp_str = timestamp.to_string();
         tx.execute(
             &format!(
-                "INSERT INTO {} (key, value) VALUES (?1, ?2)
-                 ON CONFLICT(key) DO UPDATE SET value = excluded.value",
-                TABLE_CRDT_CONFIGS
+                "INSERT INTO {TABLE_CRDT_CONFIGS} (key, value) VALUES (?1, ?2)
+                 ON CONFLICT(key) DO UPDATE SET value = excluded.value"
             ),
             params![HLC_TIMESTAMP_TYPE, timestamp_str],
         )?;

View File

@@ -32,17 +32,16 @@ pub enum CrdtSetupError {
 impl Display for CrdtSetupError {
     fn fmt(&self, f: &mut Formatter<'_>) -> fmt::Result {
         match self {
-            CrdtSetupError::DatabaseError(e) => write!(f, "Database error: {}", e),
+            CrdtSetupError::DatabaseError(e) => write!(f, "Database error: {e}"),
             CrdtSetupError::HlcColumnMissing {
                 table_name,
                 column_name,
             } => write!(
                 f,
-                "Table '{}' is missing the required hlc column '{}'",
-                table_name, column_name
+                "Table '{table_name}' is missing the required hlc column '{column_name}'"
             ),
             CrdtSetupError::PrimaryKeyMissing { table_name } => {
-                write!(f, "Table '{}' has no primary key", table_name)
+                write!(f, "Table '{table_name}' has no primary key")
             }
         }
     }
@@ -129,7 +128,7 @@ pub fn setup_triggers_for_table(
     let delete_trigger_sql = generate_delete_trigger_sql(table_name, &pks, &cols_to_track);
     if recreate {
-        drop_triggers_for_table(&tx, table_name)?;
+        drop_triggers_for_table(tx, table_name)?;
     }
     tx.execute_batch(&insert_trigger_sql)?;
@@ -143,13 +142,11 @@ pub fn setup_triggers_for_table(
 pub fn get_table_schema(conn: &Connection, table_name: &str) -> RusqliteResult<Vec<ColumnInfo>> {
     if !is_safe_identifier(table_name) {
         return Err(rusqlite::Error::InvalidParameterName(format!(
-            "Invalid or unsafe table name provided: {}",
-            table_name
-        ))
-        .into());
+            "Invalid or unsafe table name provided: {table_name}"
+        )));
     }
-    let sql = format!("PRAGMA table_info(\"{}\");", table_name);
+    let sql = format!("PRAGMA table_info(\"{table_name}\");");
     let mut stmt = conn.prepare(&sql)?;
     let rows = stmt.query_map([], ColumnInfo::from_row)?;
     rows.collect()
@@ -163,8 +160,7 @@ pub fn drop_triggers_for_table(
 ) -> Result<(), CrdtSetupError> {
     if !is_safe_identifier(table_name) {
         return Err(rusqlite::Error::InvalidParameterName(format!(
-            "Invalid or unsafe table name provided: {}",
-            table_name
+            "Invalid or unsafe table name provided: {table_name}"
         ))
         .into());
     }
@@ -177,8 +173,7 @@ pub fn drop_triggers_for_table(
         drop_trigger_sql(DELETE_TRIGGER_TPL.replace("{TABLE_NAME}", table_name));
     let sql_batch = format!(
-        "{}\n{}\n{}",
-        drop_insert_trigger_sql, drop_update_trigger_sql, drop_delete_trigger_sql
+        "{drop_insert_trigger_sql}\n{drop_update_trigger_sql}\n{drop_delete_trigger_sql}"
     );
     tx.execute_batch(&sql_batch)?;
@@ -244,33 +239,22 @@ fn generate_insert_trigger_sql(table_name: &str, pks: &[String], cols: &[String]
     let pk_json_payload = pks
         .iter()
-        .map(|pk| format!("'{}', NEW.\"{}\"", pk, pk))
+        .map(|pk| format!("'{pk}', NEW.\"{pk}\""))
         .collect::<Vec<_>>()
         .join(", ");
     let column_inserts = if cols.is_empty() {
         // Nur PKs -> einfacher Insert ins Log
         format!(
-            "INSERT INTO {log_table} (id, haex_timestamp, op_type, table_name, row_pks)
-             VALUES ({uuid_fn}(), NEW.\"{hlc_col}\", 'INSERT', '{table}', json_object({pk_payload}));",
-            log_table = TABLE_CRDT_LOGS,
-            uuid_fn = UUID_FUNCTION_NAME,
-            hlc_col = HLC_TIMESTAMP_COLUMN,
-            table = table_name,
-            pk_payload = pk_json_payload
+            "INSERT INTO {TABLE_CRDT_LOGS} (id, haex_timestamp, op_type, table_name, row_pks)
+             VALUES ({UUID_FUNCTION_NAME}(), NEW.\"{HLC_TIMESTAMP_COLUMN}\", 'INSERT', '{table_name}', json_object({pk_json_payload}));"
        )
    } else {
        cols.iter().fold(String::new(), |mut acc, col| {
            writeln!(
                &mut acc,
-                "INSERT INTO {log_table} (id, haex_timestamp, op_type, table_name, row_pks, column_name, new_value)
-                 VALUES ({uuid_fn}(), NEW.\"{hlc_col}\", 'INSERT', '{table}', json_object({pk_payload}), '{column}', json_object('value', NEW.\"{column}\"));",
-                log_table = TABLE_CRDT_LOGS,
-                uuid_fn = UUID_FUNCTION_NAME,
-                hlc_col = HLC_TIMESTAMP_COLUMN,
-                table = table_name,
-                pk_payload = pk_json_payload,
-                column = col
+                "INSERT INTO {TABLE_CRDT_LOGS} (id, haex_timestamp, op_type, table_name, row_pks, column_name, new_value)
+                 VALUES ({UUID_FUNCTION_NAME}(), NEW.\"{HLC_TIMESTAMP_COLUMN}\", 'INSERT', '{table_name}', json_object({pk_json_payload}), '{col}', json_object('value', NEW.\"{col}\"));"
            ).unwrap();
            acc
        })
@@ -290,14 +274,14 @@ fn generate_insert_trigger_sql(table_name: &str, pks: &[String], cols: &[String]
 /// Generiert das SQL zum Löschen eines Triggers.
 fn drop_trigger_sql(trigger_name: String) -> String {
-    format!("DROP TRIGGER IF EXISTS \"{}\";", trigger_name)
+    format!("DROP TRIGGER IF EXISTS \"{trigger_name}\";")
 }
 /// Generiert das SQL für den UPDATE-Trigger.
 fn generate_update_trigger_sql(table_name: &str, pks: &[String], cols: &[String]) -> String {
     let pk_json_payload = pks
         .iter()
-        .map(|pk| format!("'{}', NEW.\"{}\"", pk, pk))
+        .map(|pk| format!("'{pk}', NEW.\"{pk}\""))
         .collect::<Vec<_>>()
         .join(", ");
@@ -308,16 +292,10 @@ fn generate_update_trigger_sql(table_name: &str, pks: &[String], cols: &[String]
     for col in cols {
         writeln!(
             &mut body,
-            "INSERT INTO {log_table} (id, haex_timestamp, op_type, table_name, row_pks, column_name, new_value, old_value)
-             SELECT {uuid_fn}(), NEW.\"{hlc_col}\", 'UPDATE', '{table}', json_object({pk_payload}), '{column}',
-             json_object('value', NEW.\"{column}\"), json_object('value', OLD.\"{column}\")
-             WHERE NEW.\"{column}\" IS NOT OLD.\"{column}\";",
-            log_table = TABLE_CRDT_LOGS,
-            uuid_fn = UUID_FUNCTION_NAME,
-            hlc_col = HLC_TIMESTAMP_COLUMN,
-            table = table_name,
-            pk_payload = pk_json_payload,
-            column = col
+            "INSERT INTO {TABLE_CRDT_LOGS} (id, haex_timestamp, op_type, table_name, row_pks, column_name, new_value, old_value)
+             SELECT {UUID_FUNCTION_NAME}(), NEW.\"{HLC_TIMESTAMP_COLUMN}\", 'UPDATE', '{table_name}', json_object({pk_json_payload}), '{col}',
+             json_object('value', NEW.\"{col}\"), json_object('value', OLD.\"{col}\")
+             WHERE NEW.\"{col}\" IS NOT OLD.\"{col}\";"
        ).unwrap();
    }
 }
@@ -341,7 +319,7 @@ fn generate_update_trigger_sql(table_name: &str, pks: &[String], cols: &[String]
 fn generate_delete_trigger_sql(table_name: &str, pks: &[String], cols: &[String]) -> String {
     let pk_json_payload = pks
         .iter()
-        .map(|pk| format!("'{}', OLD.\"{}\"", pk, pk))
+        .map(|pk| format!("'{pk}', OLD.\"{pk}\""))
         .collect::<Vec<_>>()
         .join(", ");
@@ -352,28 +330,17 @@ fn generate_delete_trigger_sql(table_name: &str, pks: &[String], cols: &[String]
     for col in cols {
         writeln!(
             &mut body,
-            "INSERT INTO {log_table} (id, haex_timestamp, op_type, table_name, row_pks, column_name, old_value)
-             VALUES ({uuid_fn}(), OLD.\"{hlc_col}\", 'DELETE', '{table}', json_object({pk_payload}), '{column}',
-             json_object('value', OLD.\"{column}\"));",
-            log_table = TABLE_CRDT_LOGS,
-            uuid_fn = UUID_FUNCTION_NAME,
-            hlc_col = HLC_TIMESTAMP_COLUMN,
-            table = table_name,
-            pk_payload = pk_json_payload,
-            column = col
+            "INSERT INTO {TABLE_CRDT_LOGS} (id, haex_timestamp, op_type, table_name, row_pks, column_name, old_value)
+             VALUES ({UUID_FUNCTION_NAME}(), OLD.\"{HLC_TIMESTAMP_COLUMN}\", 'DELETE', '{table_name}', json_object({pk_json_payload}), '{col}',
+             json_object('value', OLD.\"{col}\"));"
        ).unwrap();
    }
    } else {
        // Nur PKs -> minimales Delete Log
        writeln!(
            &mut body,
-            "INSERT INTO {log_table} (id, haex_timestamp, op_type, table_name, row_pks)
-             VALUES ({uuid_fn}(), OLD.\"{hlc_col}\", 'DELETE', '{table}', json_object({pk_payload}));",
-            log_table = TABLE_CRDT_LOGS,
-            uuid_fn = UUID_FUNCTION_NAME,
-            hlc_col = HLC_TIMESTAMP_COLUMN,
-            table = table_name,
-            pk_payload = pk_json_payload
+            "INSERT INTO {TABLE_CRDT_LOGS} (id, haex_timestamp, op_type, table_name, row_pks)
+             VALUES ({UUID_FUNCTION_NAME}(), OLD.\"{HLC_TIMESTAMP_COLUMN}\", 'DELETE', '{table_name}', json_object({pk_json_payload}));"
        )
        .unwrap();
    }
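The refactor above replaces named `format!` arguments with Rust's inline format-arg captures. A minimal, self-contained sketch of the same pattern as used by the trigger generators; the constant values and table names below are illustrative stand-ins, not the project's real ones:

```rust
use std::fmt::Write;

// Illustrative stand-ins for TABLE_CRDT_LOGS, UUID_FUNCTION_NAME,
// and HLC_TIMESTAMP_COLUMN from the actual module.
const TABLE_CRDT_LOGS: &str = "haex_crdt_logs";
const UUID_FUNCTION_NAME: &str = "haex_uuid";
const HLC_TIMESTAMP_COLUMN: &str = "haex_timestamp";

/// Builds one DELETE-log INSERT per tracked column, mirroring the
/// trigger-body generation: consts and locals are captured inline
/// in the format string instead of being passed as named arguments.
fn delete_log_statements(table_name: &str, pk_json_payload: &str, cols: &[&str]) -> String {
    let mut body = String::new();
    for col in cols {
        writeln!(
            &mut body,
            "INSERT INTO {TABLE_CRDT_LOGS} (id, haex_timestamp, op_type, table_name, row_pks, column_name, old_value)
             VALUES ({UUID_FUNCTION_NAME}(), OLD.\"{HLC_TIMESTAMP_COLUMN}\", 'DELETE', '{table_name}', json_object({pk_json_payload}), '{col}', json_object('value', OLD.\"{col}\"));"
        )
        .unwrap();
    }
    body
}

fn main() {
    let sql = delete_log_statements("notes", "'id', OLD.\"id\"", &["title"]);
    assert!(sql.contains("INSERT INTO haex_crdt_logs"));
    assert!(sql.contains("'DELETE', 'notes'"));
    println!("{sql}");
}
```

Inline captures only accept plain identifiers in scope (locals and consts), which is why expressions such as `to_screaming_snake_case(col_key)` elsewhere in the diff still use positional arguments.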

View File

@@ -47,7 +47,7 @@ pub fn open_and_init_db(path: &str, key: &str, create: bool) -> Result<Connectio
         },
     )
     .map_err(|e| DatabaseError::DatabaseError {
-        reason: format!("Failed to register {} function: {}", UUID_FUNCTION_NAME, e),
+        reason: format!("Failed to register {UUID_FUNCTION_NAME} function: {e}"),
     })?;
     let journal_mode: String = conn
@@ -61,8 +61,7 @@ pub fn open_and_init_db(path: &str, key: &str, create: bool) -> Result<Connectio
         println!("WAL mode successfully enabled.");
     } else {
         eprintln!(
-            "Failed to enable WAL mode, journal_mode is '{}'.",
-            journal_mode
+            "Failed to enable WAL mode, journal_mode is '{journal_mode}'."
         );
     }
@@ -97,7 +96,7 @@ pub fn parse_sql_statements(sql: &str) -> Result<Vec<Statement>, DatabaseError>
         .join(" ");
     Parser::parse_sql(&dialect, &normalized_sql).map_err(|e| DatabaseError::ParseError {
-        reason: format!("Failed to parse SQL: {}", e),
+        reason: format!("Failed to parse SQL: {e}"),
         sql: sql.to_string(),
     })
 }
@@ -138,7 +137,7 @@ impl ValueConverter {
         serde_json::to_string(json_val)
             .map(SqlValue::Text)
             .map_err(|e| DatabaseError::SerializationError {
-                reason: format!("Failed to serialize JSON param: {}", e),
+                reason: format!("Failed to serialize JSON param: {e}"),
             })
     }
 }
@@ -258,7 +257,7 @@ pub fn select_with_crdt(
     params: Vec<JsonValue>,
     connection: &DbConnection,
 ) -> Result<Vec<Vec<JsonValue>>, DatabaseError> {
-    with_connection(&connection, |conn| {
+    with_connection(connection, |conn| {
         SqlExecutor::query_select(conn, &sql, &params)
     })
 }

View File

@@ -36,8 +36,7 @@ pub fn ensure_triggers_initialized(conn: &mut Connection) -> Result<bool, Databa
     // Check if triggers already initialized
     let check_sql = format!(
-        "SELECT value FROM {} WHERE key = ? AND type = ?",
-        TABLE_SETTINGS
+        "SELECT value FROM {TABLE_SETTINGS} WHERE key = ? AND type = ?"
     );
     let initialized: Option<String> = tx
         .query_row(
@@ -57,7 +56,7 @@ pub fn ensure_triggers_initialized(conn: &mut Connection) -> Result<bool, Databa
     // Create triggers for all CRDT tables
     for table_name in CRDT_TABLES {
-        eprintln!("  - Setting up triggers for: {}", table_name);
+        eprintln!("  - Setting up triggers for: {table_name}");
         trigger::setup_triggers_for_table(&tx, table_name, false)?;
     }

View File

@ -93,7 +93,7 @@ fn get_vault_path(app_handle: &AppHandle, vault_name: &str) -> Result<String, Da
let vault_file_name = if vault_name.ends_with(VAULT_EXTENSION) { let vault_file_name = if vault_name.ends_with(VAULT_EXTENSION) {
vault_name.to_string() vault_name.to_string()
} else { } else {
format!("{}{VAULT_EXTENSION}", vault_name) format!("{vault_name}{VAULT_EXTENSION}")
}; };
let vault_directory = get_vaults_directory(app_handle)?; let vault_directory = get_vaults_directory(app_handle)?;
@@ -101,13 +101,12 @@ fn get_vault_path(app_handle: &AppHandle, vault_name: &str) -> Result<String, Da
     let vault_path = app_handle
         .path()
         .resolve(
-            format!("{vault_directory}/{}", vault_file_name),
+            format!("{vault_directory}/{vault_file_name}"),
             BaseDirectory::AppLocalData,
         )
         .map_err(|e| DatabaseError::PathResolutionError {
             reason: format!(
-                "Failed to resolve vault path for '{}': {}",
-                vault_file_name, e
+                "Failed to resolve vault path for '{vault_file_name}': {e}"
             ),
         })?;
@@ -115,7 +114,7 @@ fn get_vault_path(app_handle: &AppHandle, vault_name: &str) -> Result<String, Da
     if let Some(parent) = vault_path.parent() {
         fs::create_dir_all(parent).map_err(|e| DatabaseError::IoError {
             path: parent.display().to_string(),
-            reason: format!("Failed to create vaults directory: {}", e),
+            reason: format!("Failed to create vaults directory: {e}"),
         })?;
     }
@@ -174,18 +173,18 @@ pub fn list_vaults(app_handle: AppHandle) -> Result<Vec<VaultInfo>, DatabaseErro
         if let Some(filename) = path.file_name().and_then(|n| n.to_str()) {
             if filename.ends_with(VAULT_EXTENSION) {
                 // Strip the .db extension for the return value
-                println!("Vault gefunden {}", filename.to_string());
+                println!("Vault gefunden {filename}");
                 let metadata = fs::metadata(&path).map_err(|e| DatabaseError::IoError {
                     path: path.to_string_lossy().to_string(),
-                    reason: format!("Metadaten konnten nicht gelesen werden: {}", e),
+                    reason: format!("Metadaten konnten nicht gelesen werden: {e}"),
                 })?;
                 let last_access_timestamp = metadata
                     .accessed()
                     .map_err(|e| DatabaseError::IoError {
                         path: path.to_string_lossy().to_string(),
-                        reason: format!("Zugriffszeit konnte nicht gelesen werden: {}", e),
+                        reason: format!("Zugriffszeit konnte nicht gelesen werden: {e}"),
                     })?
                     .duration_since(UNIX_EPOCH)
                     .unwrap_or_default() // Fallback for the rare case of a timestamp before 1970
@@ -233,8 +232,8 @@ pub fn move_vault_to_trash(
     #[cfg(not(target_os = "android"))]
     {
         let vault_path = get_vault_path(&app_handle, &vault_name)?;
-        let vault_shm_path = format!("{}-shm", vault_path);
-        let vault_wal_path = format!("{}-wal", vault_path);
+        let vault_shm_path = format!("{vault_path}-shm");
+        let vault_wal_path = format!("{vault_path}-wal");
         if !Path::new(&vault_path).exists() {
             return Err(DatabaseError::IoError {
@@ -252,14 +251,12 @@ pub fn move_vault_to_trash(
             let _ = trash::delete(&vault_wal_path);
             Ok(format!(
-                "Vault '{}' successfully moved to trash",
-                vault_name
+                "Vault '{vault_name}' successfully moved to trash"
             ))
         } else {
             // Fallback: Permanent deletion if trash fails
             println!(
-                "Trash not available, falling back to permanent deletion for vault '{}'",
-                vault_name
+                "Trash not available, falling back to permanent deletion for vault '{vault_name}'"
             );
             delete_vault(app_handle, vault_name)
         }
@@ -270,8 +267,8 @@ pub fn move_vault_to_trash(
 #[tauri::command]
 pub fn delete_vault(app_handle: AppHandle, vault_name: String) -> Result<String, DatabaseError> {
     let vault_path = get_vault_path(&app_handle, &vault_name)?;
-    let vault_shm_path = format!("{}-shm", vault_path);
-    let vault_wal_path = format!("{}-wal", vault_path);
+    let vault_shm_path = format!("{vault_path}-shm");
+    let vault_wal_path = format!("{vault_path}-wal");
     if !Path::new(&vault_path).exists() {
         return Err(DatabaseError::IoError {
@@ -283,23 +280,23 @@ pub fn delete_vault(app_handle: AppHandle, vault_name: String) -> Result<String,
     if Path::new(&vault_shm_path).exists() {
         fs::remove_file(&vault_shm_path).map_err(|e| DatabaseError::IoError {
             path: vault_shm_path.clone(),
-            reason: format!("Failed to delete vault: {}", e),
+            reason: format!("Failed to delete vault: {e}"),
         })?;
     }
     if Path::new(&vault_wal_path).exists() {
         fs::remove_file(&vault_wal_path).map_err(|e| DatabaseError::IoError {
             path: vault_wal_path.clone(),
-            reason: format!("Failed to delete vault: {}", e),
+            reason: format!("Failed to delete vault: {e}"),
         })?;
     }
     fs::remove_file(&vault_path).map_err(|e| DatabaseError::IoError {
         path: vault_path.clone(),
-        reason: format!("Failed to delete vault: {}", e),
+        reason: format!("Failed to delete vault: {e}"),
     })?;
-    Ok(format!("Vault '{}' successfully deleted", vault_name))
+    Ok(format!("Vault '{vault_name}' successfully deleted"))
 }
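The deletion paths above remove the SQLite sidecar files together with the database: a database in WAL mode keeps `<db>-shm` and `<db>-wal` next to the main file, and leaving them behind would orphan data. A simplified std-only sketch of that pattern (the helper name and error handling are illustrative, not the app's API):

```rust
use std::fs;
use std::path::Path;

// Remove a database file together with its "-shm" and "-wal" sidecars,
// silently skipping sidecars that do not exist.
fn remove_with_sidecars(db_path: &str) -> std::io::Result<()> {
    for candidate in [
        db_path.to_string(),
        format!("{db_path}-shm"),
        format!("{db_path}-wal"),
    ] {
        if Path::new(&candidate).exists() {
            fs::remove_file(&candidate)?;
        }
    }
    Ok(())
}

fn main() -> std::io::Result<()> {
    // Demonstrate on throwaway files in the temp directory.
    let dir = std::env::temp_dir();
    let db = dir.join("example_vault.db");
    fs::write(&db, b"")?;
    fs::write(dir.join("example_vault.db-wal"), b"")?;
    remove_with_sidecars(db.to_str().unwrap())?;
    assert!(!db.exists());
    Ok(())
}
```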
 #[tauri::command]
@@ -309,16 +306,16 @@ pub fn create_encrypted_database(
     key: String,
     state: State<'_, AppState>,
 ) -> Result<String, DatabaseError> {
-    println!("Creating encrypted vault with name: {}", vault_name);
+    println!("Creating encrypted vault with name: {vault_name}");
     let vault_path = get_vault_path(&app_handle, &vault_name)?;
-    println!("Resolved vault path: {}", vault_path);
+    println!("Resolved vault path: {vault_path}");
     // Check whether a vault with this name already exists
     if Path::new(&vault_path).exists() {
         return Err(DatabaseError::IoError {
             path: vault_path,
-            reason: format!("A vault with the name '{}' already exists", vault_name),
+            reason: format!("A vault with the name '{vault_name}' already exists"),
         });
     }
     /* let resource_path = app_handle
@@ -330,7 +327,7 @@ pub fn create_encrypted_database(
         .path()
         .resolve("database/vault.db", BaseDirectory::Resource)
         .map_err(|e| DatabaseError::PathResolutionError {
-            reason: format!("Failed to resolve template database: {}", e),
+            reason: format!("Failed to resolve template database: {e}"),
         })?;
     let template_content =
@@ -339,20 +336,20 @@ pub fn create_encrypted_database(
         .read(&template_path)
         .map_err(|e| DatabaseError::IoError {
             path: template_path.display().to_string(),
-            reason: format!("Failed to read template database from resources: {}", e),
+            reason: format!("Failed to read template database from resources: {e}"),
         })?;
     let temp_path = app_handle
         .path()
         .resolve("temp_vault.db", BaseDirectory::AppLocalData)
         .map_err(|e| DatabaseError::PathResolutionError {
-            reason: format!("Failed to resolve temp database: {}", e),
+            reason: format!("Failed to resolve temp database: {e}"),
         })?;
     let temp_path_clone = temp_path.to_owned();
     fs::write(temp_path, template_content).map_err(|e| DatabaseError::IoError {
         path: vault_path.to_string(),
-        reason: format!("Failed to write temporary template database: {}", e),
+        reason: format!("Failed to write temporary template database: {e}"),
     })?;
     /* if !template_path.exists() {
         return Err(DatabaseError::IoError {
@@ -365,8 +362,7 @@ pub fn create_encrypted_database(
     let conn = Connection::open(&temp_path_clone).map_err(|e| DatabaseError::ConnectionFailed {
         path: temp_path_clone.display().to_string(),
         reason: format!(
-            "Fehler beim Öffnen der unverschlüsselten Quelldatenbank: {}",
-            e
+            "Fehler beim Öffnen der unverschlüsselten Quelldatenbank: {e}"
         ),
     })?;
@@ -394,7 +390,7 @@ pub fn create_encrypted_database(
         let _ = fs::remove_file(&vault_path);
         let _ = fs::remove_file(&temp_path_clone);
         return Err(DatabaseError::QueryError {
-            reason: format!("Fehler während sqlcipher_export: {}", e),
+            reason: format!("Fehler während sqlcipher_export: {e}"),
         });
     }
@@ -419,11 +415,11 @@ pub fn create_encrypted_database(
         Ok(version)
     }) {
         Ok(version) => {
-            println!("SQLCipher ist aktiv! Version: {}", version);
+            println!("SQLCipher ist aktiv! Version: {version}");
         }
         Err(e) => {
             eprintln!("FEHLER: SQLCipher scheint NICHT aktiv zu sein!");
-            eprintln!("Der Befehl 'PRAGMA cipher_version;' schlug fehl: {}", e);
+            eprintln!("Der Befehl 'PRAGMA cipher_version;' schlug fehl: {e}");
             eprintln!("Die Datenbank wurde wahrscheinlich NICHT verschlüsselt.");
         }
     }
@@ -431,7 +427,7 @@ pub fn create_encrypted_database(
     conn.close()
         .map_err(|(_, e)| DatabaseError::ConnectionFailed {
             path: template_path.display().to_string(),
-            reason: format!("Fehler beim Schließen der Quelldatenbank: {}", e),
+            reason: format!("Fehler beim Schließen der Quelldatenbank: {e}"),
         })?;
     let _ = fs::remove_file(&temp_path_clone);
@@ -448,19 +444,19 @@ pub fn open_encrypted_database(
     key: String,
     state: State<'_, AppState>,
 ) -> Result<String, DatabaseError> {
-    println!("Opening encrypted database vault_path: {}", vault_path);
-    println!("Resolved vault path: {}", vault_path);
+    println!("Opening encrypted database vault_path: {vault_path}");
+    println!("Resolved vault path: {vault_path}");
     if !Path::new(&vault_path).exists() {
         return Err(DatabaseError::IoError {
             path: vault_path.to_string(),
-            reason: format!("Vault '{}' does not exist", vault_path),
+            reason: format!("Vault '{vault_path}' does not exist"),
         });
     }
     initialize_session(&app_handle, &vault_path, &key, &state)?;
-    Ok(format!("Vault '{}' opened successfully", vault_path))
+    Ok(format!("Vault '{vault_path}' opened successfully"))
 }
 /// Opens the DB, initializes the HLC service, and stores both in the AppState.
@@ -512,8 +508,7 @@ fn initialize_session(
     eprintln!("INFO: Setting 'triggers_initialized' flag via CRDT...");
     let insert_sql = format!(
-        "INSERT INTO {} (id, key, type, value) VALUES (?, ?, ?, ?)",
-        TABLE_SETTINGS
+        "INSERT INTO {TABLE_SETTINGS} (id, key, type, value) VALUES (?, ?, ?, ?)"
     );
     // execute_with_crdt expects Vec<JsonValue>, not the params! macro

View File

@@ -10,10 +10,8 @@ use crate::extension::permissions::manager::PermissionManager;
 use crate::extension::permissions::types::ExtensionPermission;
 use crate::table_names::{TABLE_EXTENSIONS, TABLE_EXTENSION_PERMISSIONS};
 use crate::AppState;
-use serde_json::Value as JsonValue;
-use std::collections::{HashMap, HashSet};
+use std::collections::HashMap;
 use std::fs;
-use std::io::Cursor;
 use std::path::PathBuf;
 use std::sync::Mutex;
 use std::time::{Duration, SystemTime};
@@ -77,7 +75,7 @@ impl ExtensionManager {
         // Check for path traversal patterns
         if relative_path.contains("..") {
             return Err(ExtensionError::SecurityViolation {
-                reason: format!("Path traversal attempt: {}", relative_path),
+                reason: format!("Path traversal attempt: {relative_path}"),
             });
         }
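The check above rejects any path containing ".." before touching the filesystem; the next hunk adds a second layer that canonicalizes and verifies the base-directory prefix. A std-only sketch of the lexical part, using a component-wise variant of the string check (the function name is illustrative):

```rust
use std::path::{Component, Path};

// Lexical pre-check: reject any relative path with a ".." component.
// Runs before canonicalization, so no filesystem access is required.
fn has_no_parent_components(relative_path: &str) -> bool {
    Path::new(relative_path)
        .components()
        .all(|c| !matches!(c, Component::ParentDir))
}

fn main() {
    assert!(has_no_parent_components("icons/favicon.ico"));
    assert!(!has_no_parent_components("../outside/secret"));
    assert!(!has_no_parent_components("assets/../../etc/passwd"));
}
```

Unlike `contains("..")`, the component-wise form accepts legitimate names such as `my..file.txt`; the canonicalize-and-`starts_with` step in the following hunk then catches anything symlinks might smuggle past the lexical check.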
@@ -104,7 +102,7 @@ impl ExtensionManager {
         if let Ok(canonical_path) = full_path.canonicalize() {
             if !canonical_path.starts_with(&canonical_base) {
                 return Err(ExtensionError::SecurityViolation {
-                    reason: format!("Path outside base directory: {}", relative_path),
+                    reason: format!("Path outside base directory: {relative_path}"),
                 });
             }
             Ok(Some(canonical_path))
@@ -114,7 +112,7 @@ impl ExtensionManager {
             Ok(Some(full_path))
         } else {
             Err(ExtensionError::SecurityViolation {
-                reason: format!("Path outside base directory: {}", relative_path),
+                reason: format!("Path outside base directory: {relative_path}"),
             })
         }
     }
@@ -131,13 +129,13 @@ impl ExtensionManager {
         if let Some(clean_path) = Self::validate_path_in_directory(extension_dir, icon, true)? {
             return Ok(Some(clean_path.to_string_lossy().to_string()));
         } else {
-            eprintln!("WARNING: Icon path specified in manifest not found: {}", icon);
+            eprintln!("WARNING: Icon path specified in manifest not found: {icon}");
             // Continue to fallback logic
         }
     }
     // Fallback 1: Check haextension/favicon.ico
-    let haextension_favicon = format!("{}/favicon.ico", haextension_dir);
+    let haextension_favicon = format!("{haextension_dir}/favicon.ico");
     if let Some(clean_path) = Self::validate_path_in_directory(extension_dir, &haextension_favicon, true)? {
         return Ok(Some(clean_path.to_string_lossy().to_string()));
     }
@@ -162,11 +160,11 @@ impl ExtensionManager {
             .path()
             .app_cache_dir()
             .map_err(|e| ExtensionError::InstallationFailed {
-                reason: format!("Cannot get app cache dir: {}", e),
+                reason: format!("Cannot get app cache dir: {e}"),
             })?;
         let temp_id = uuid::Uuid::new_v4();
-        let temp = cache_dir.join(format!("{}_{}", temp_prefix, temp_id));
+        let temp = cache_dir.join(format!("{temp_prefix}_{temp_id}"));
         let zip_file_path = cache_dir.join(format!("{}_{}_{}.haextension", temp_prefix, temp_id, "temp"));
         // Write bytes to a temporary ZIP file first (important for Android file system)
@@ -185,14 +183,14 @@ impl ExtensionManager {
         let mut archive = ZipArchive::new(zip_file).map_err(|e| {
             ExtensionError::InstallationFailed {
-                reason: format!("Invalid ZIP: {}", e),
+                reason: format!("Invalid ZIP: {e}"),
             }
         })?;
         archive
             .extract(&temp)
             .map_err(|e| ExtensionError::InstallationFailed {
-                reason: format!("Cannot extract ZIP: {}", e),
+                reason: format!("Cannot extract ZIP: {e}"),
             })?;
         // Clean up temporary ZIP file
@@ -203,12 +201,12 @@ impl ExtensionManager {
         let haextension_dir = if config_path.exists() {
             let config_content = std::fs::read_to_string(&config_path)
                 .map_err(|e| ExtensionError::ManifestError {
-                    reason: format!("Cannot read haextension.config.json: {}", e),
+                    reason: format!("Cannot read haextension.config.json: {e}"),
                 })?;
             let config: serde_json::Value = serde_json::from_str(&config_content)
                 .map_err(|e| ExtensionError::ManifestError {
-                    reason: format!("Invalid haextension.config.json: {}", e),
+                    reason: format!("Invalid haextension.config.json: {e}"),
                 })?;
             let dir = config
@@ -224,16 +222,16 @@
         };
         // Validate manifest path using helper function
-        let manifest_relative_path = format!("{}/manifest.json", haextension_dir);
+        let manifest_relative_path = format!("{haextension_dir}/manifest.json");
         let manifest_path = Self::validate_path_in_directory(&temp, &manifest_relative_path, true)?
             .ok_or_else(|| ExtensionError::ManifestError {
-                reason: format!("manifest.json not found at {}/manifest.json", haextension_dir),
+                reason: format!("manifest.json not found at {haextension_dir}/manifest.json"),
             })?;
         let actual_dir = temp.clone();
         let manifest_content =
             std::fs::read_to_string(&manifest_path).map_err(|e| ExtensionError::ManifestError {
-                reason: format!("Cannot read manifest: {}", e),
+                reason: format!("Cannot read manifest: {e}"),
             })?;
         let mut manifest: ExtensionManifest = serde_json::from_str(&manifest_content)?;
@@ -440,8 +438,7 @@ impl ExtensionManager {
         eprintln!("DEBUG: Removing extension with ID: {}", extension.id);
         eprintln!(
-            "DEBUG: Extension name: {}, version: {}",
-            extension_name, extension_version
+            "DEBUG: Extension name: {extension_name}, version: {extension_version}"
         );
         // Delete permissions and the extension entry in one transaction
@@ -460,7 +457,7 @@ impl ExtensionManager {
         PermissionManager::delete_permissions_in_transaction(&tx, &hlc_service, &extension.id)?;
         // Delete the extension entry by extension_id
-        let sql = format!("DELETE FROM {} WHERE id = ?", TABLE_EXTENSIONS);
+        let sql = format!("DELETE FROM {TABLE_EXTENSIONS} WHERE id = ?");
         eprintln!("DEBUG: Executing SQL: {} with id = {}", sql, extension.id);
         SqlExecutor::execute_internal_typed(
             &tx,
@@ -615,8 +612,7 @@ impl ExtensionManager {
         // 1. Create the extension entry with a generated UUID
         let insert_ext_sql = format!(
-            "INSERT INTO {} (id, name, version, author, entry, icon, public_key, signature, homepage, description, enabled, single_instance) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)",
-            TABLE_EXTENSIONS
+            "INSERT INTO {TABLE_EXTENSIONS} (id, name, version, author, entry, icon, public_key, signature, homepage, description, enabled, single_instance) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)"
        );
        SqlExecutor::execute_internal_typed(
@@ -641,8 +637,7 @@ impl ExtensionManager {
         // 2. Save the permissions
         let insert_perm_sql = format!(
-            "INSERT INTO {} (id, extension_id, resource_type, action, target, constraints, status) VALUES (?, ?, ?, ?, ?, ?, ?)",
-            TABLE_EXTENSION_PERMISSIONS
+            "INSERT INTO {TABLE_EXTENSION_PERMISSIONS} (id, extension_id, resource_type, action, target, constraints, status) VALUES (?, ?, ?, ?, ?, ?, ?)"
        );
        for perm in &permissions {
@@ -714,10 +709,9 @@ impl ExtensionManager {
         // Load all data from the database
         let extensions = with_connection(&state.db, |conn| {
             let sql = format!(
-                "SELECT id, name, version, author, entry, icon, public_key, signature, homepage, description, enabled, single_instance FROM {}",
-                TABLE_EXTENSIONS
+                "SELECT id, name, version, author, entry, icon, public_key, signature, homepage, description, enabled, single_instance FROM {TABLE_EXTENSIONS}"
             );
-            eprintln!("DEBUG: SQL Query before transformation: {}", sql);
+            eprintln!("DEBUG: SQL Query before transformation: {sql}");
             let results = SqlExecutor::query_select(conn, &sql, &[])?;
             eprintln!("DEBUG: Query returned {} results", results.len());
@@ -779,7 +773,7 @@ impl ExtensionManager {
         for extension_data in extensions {
             let extension_id = extension_data.id;
-            eprintln!("DEBUG: Processing extension: {}", extension_id);
+            eprintln!("DEBUG: Processing extension: {extension_id}");
             // Use public_key/name/version path structure
             let extension_path = self.get_extension_dir(
@@ -792,8 +786,7 @@ impl ExtensionManager {
             // Check if extension directory exists
             if !extension_path.exists() {
                 eprintln!(
-                    "DEBUG: Extension directory missing for: {} at {:?}",
-                    extension_id, extension_path
+                    "DEBUG: Extension directory missing for: {extension_id} at {extension_path:?}"
                 );
                 self.missing_extensions
                     .lock()
@@ -833,13 +826,12 @@
             };
             // Validate manifest.json path using helper function
-            let manifest_relative_path = format!("{}/manifest.json", haextension_dir);
+            let manifest_relative_path = format!("{haextension_dir}/manifest.json");
             if Self::validate_path_in_directory(&extension_path, &manifest_relative_path, true)?
                 .is_none()
             {
                 eprintln!(
-                    "DEBUG: manifest.json missing or invalid for: {} at {}/manifest.json",
-                    extension_id, haextension_dir
+                    "DEBUG: manifest.json missing or invalid for: {extension_id} at {haextension_dir}/manifest.json"
                 );
                 self.missing_extensions
                     .lock()
@@ -855,7 +847,7 @@ impl ExtensionManager {
                 continue;
             }
-            eprintln!("DEBUG: Extension loaded successfully: {}", extension_id);
+            eprintln!("DEBUG: Extension loaded successfully: {extension_id}");
             let extension = Extension {
                 id: extension_id.clone(),

View File

@@ -42,12 +42,12 @@ enum DataProcessingError {
 impl fmt::Display for DataProcessingError {
     fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
         match self {
-            DataProcessingError::HexDecoding(e) => write!(f, "Hex-Dekodierungsfehler: {}", e),
+            DataProcessingError::HexDecoding(e) => write!(f, "Hex-Dekodierungsfehler: {e}"),
             DataProcessingError::Utf8Conversion(e) => {
-                write!(f, "UTF-8-Konvertierungsfehler: {}", e)
+                write!(f, "UTF-8-Konvertierungsfehler: {e}")
             }
-            DataProcessingError::JsonParsing(e) => write!(f, "JSON-Parsing-Fehler: {}", e),
-            DataProcessingError::Custom(msg) => write!(f, "Datenverarbeitungsfehler: {}", msg),
+            DataProcessingError::JsonParsing(e) => write!(f, "JSON-Parsing-Fehler: {e}"),
+            DataProcessingError::Custom(msg) => write!(f, "Datenverarbeitungsfehler: {msg}"),
         }
     }
 }
@@ -101,7 +101,7 @@ pub fn resolve_secure_extension_asset_path(
         .all(|c| c.is_ascii_alphanumeric() || c == '-')
     {
         return Err(ExtensionError::ValidationError {
-            reason: format!("Invalid extension name: {}", extension_name),
+            reason: format!("Invalid extension name: {extension_name}"),
         });
     }
@@ -111,7 +111,7 @@ pub fn resolve_secure_extension_asset_path(
         .all(|c| c.is_ascii_alphanumeric() || c == '-' || c == '.')
     {
         return Err(ExtensionError::ValidationError {
-            reason: format!("Invalid extension version: {}", extension_version),
+            reason: format!("Invalid extension version: {extension_version}"),
        });
    }
@@ -146,11 +146,10 @@ pub fn resolve_secure_extension_asset_path(
         Ok(canonical_path)
     } else {
         eprintln!(
-            "SECURITY WARNING: Path traversal attempt blocked: {}",
-            requested_asset_path
+            "SECURITY WARNING: Path traversal attempt blocked: {requested_asset_path}"
         );
         Err(ExtensionError::SecurityViolation {
-            reason: format!("Path traversal attempt: {}", requested_asset_path),
+            reason: format!("Path traversal attempt: {requested_asset_path}"),
         })
     }
 }
@@ -159,11 +158,10 @@ pub fn resolve_secure_extension_asset_path(
         Ok(final_path)
     } else {
         eprintln!(
-            "SECURITY WARNING: Invalid asset path: {}",
-            requested_asset_path
+            "SECURITY WARNING: Invalid asset path: {requested_asset_path}"
         );
         Err(ExtensionError::SecurityViolation {
-            reason: format!("Invalid asset path: {}", requested_asset_path),
+            reason: format!("Invalid asset path: {requested_asset_path}"),
         })
     }
 }
@@ -184,7 +182,7 @@ pub fn extension_protocol_handler(
     // Only allow same-protocol requests or tauri origin
     // For null/empty origin (initial load), use wildcard
-    let protocol_prefix = format!("{}://", EXTENSION_PROTOCOL_NAME);
+    let protocol_prefix = format!("{EXTENSION_PROTOCOL_NAME}://");
     let allowed_origin = if origin.starts_with(&protocol_prefix) || origin == get_tauri_origin() {
         origin
     } else if origin.is_empty() || origin == "null" {
@@ -216,9 +214,9 @@ pub fn extension_protocol_handler(
         .and_then(|v| v.to_str().ok())
         .unwrap_or("");
-    println!("Protokoll Handler für: {}", uri_ref);
-    println!("Origin: {}", origin);
-    println!("Referer: {}", referer);
+    println!("Protokoll Handler für: {uri_ref}");
+    println!("Origin: {origin}");
+    println!("Referer: {referer}");
     let path_str = uri_ref.path();
@@ -227,16 +225,16 @@ pub fn extension_protocol_handler(
     // - Desktop: haex-extension://<base64>/{assetPath}
     // - Android: http://localhost/{base64}/{assetPath}
     let host = uri_ref.host().unwrap_or("");
-    println!("URI Host: {}", host);
+    println!("URI Host: {host}");
-    let (info, segments_after_version) = if host == "localhost" || host == format!("{}.localhost", EXTENSION_PROTOCOL_NAME).as_str() {
+    let (info, segments_after_version) = if host == "localhost" || host == format!("{EXTENSION_PROTOCOL_NAME}.localhost").as_str() {
         // Android format: http://haex-extension.localhost/{base64}/{assetPath}
         // Extract base64 from first path segment
-        println!("Android format detected: http://{}/...", host);
+        println!("Android format detected: http://{host}/...");
         let mut segments_iter = path_str.split('/').filter(|s| !s.is_empty());
         if let Some(first_segment) = segments_iter.next() {
-            println!("First path segment (base64): {}", first_segment);
+            println!("First path segment (base64): {first_segment}");
             match BASE64_STANDARD.decode(first_segment) {
                 Ok(decoded_bytes) => match String::from_utf8(decoded_bytes) {
                     Ok(json_str) => match serde_json::from_str::<ExtensionInfo>(&json_str) {
@@ -252,29 +250,29 @@ pub fn extension_protocol_handler(
                         (info, remaining)
                     }
                     Err(e) => {
-                        eprintln!("Failed to parse JSON from base64 path: {}", e);
+                        eprintln!("Failed to parse JSON from base64 path: {e}");
                         return Response::builder()
                             .status(400)
                             .header("Access-Control-Allow-Origin", allowed_origin)
-                            .body(Vec::from(format!("Invalid extension info in base64 path: {}", e)))
+                            .body(Vec::from(format!("Invalid extension info in base64 path: {e}")))
                             .map_err(|e| e.into());
                     }
                 },
                 Err(e) => {
-                    eprintln!("Failed to decode UTF-8 from base64 path: {}", e);
+                    eprintln!("Failed to decode UTF-8 from base64 path: {e}");
                     return Response::builder()
                         .status(400)
                         .header("Access-Control-Allow-Origin", allowed_origin)
-                        .body(Vec::from(format!("Invalid UTF-8 in base64 path: {}", e)))
+                        .body(Vec::from(format!("Invalid UTF-8 in base64 path: {e}")))
                        .map_err(|e| e.into());
                }
            },
            Err(e) => {
-                eprintln!("Failed to decode base64 from path: {}", e);
+                eprintln!("Failed to decode base64 from path: {e}");
                return Response::builder()
                    .status(400)
                    .header("Access-Control-Allow-Origin", allowed_origin)
-                    .body(Vec::from(format!("Invalid base64 in path: {}", e)))
+                    .body(Vec::from(format!("Invalid base64 in path: {e}")))
                    .map_err(|e| e.into());
            }
        }
@@ -311,35 +309,35 @@ pub fn extension_protocol_handler(
                 (info, segments)
             }
             Err(e) => {
-                eprintln!("Failed to parse JSON from base64 host: {}", e);
+                eprintln!("Failed to parse JSON from base64 host: {e}");
                 return Response::builder()
                     .status(400)
                     .header("Access-Control-Allow-Origin", allowed_origin)
-                    .body(Vec::from(format!("Invalid extension info in base64 host: {}", e)))
+                    .body(Vec::from(format!("Invalid extension info in base64 host: {e}")))
                     .map_err(|e| e.into());
             }
         },
         Err(e) => {
-            eprintln!("Failed to decode UTF-8 from base64 host: {}", e);
+            eprintln!("Failed to decode UTF-8 from base64 host: {e}");
             return Response::builder()
                 .status(400)
                 .header("Access-Control-Allow-Origin", allowed_origin)
-                .body(Vec::from(format!("Invalid UTF-8 in base64 host: {}", e)))
+                .body(Vec::from(format!("Invalid UTF-8 in base64 host: {e}")))
                 .map_err(|e| e.into());
         }
        },
        Err(e) => {
-            eprintln!("Failed to decode base64 host: {}", e);
+            eprintln!("Failed to decode base64 host: {e}");
            return Response::builder()
                .status(400)
                .header("Access-Control-Allow-Origin", allowed_origin)
-                .body(Vec::from(format!("Invalid base64 in host: {}", e)))
+                .body(Vec::from(format!("Invalid base64 in host: {e}")))
                .map_err(|e| e.into());
        }
        }
    } else {
        // No base64 host - use path-based parsing (for localhost/Android/Windows)
-        parse_extension_info_from_path(path_str, origin, uri_ref, referer, &allowed_origin)?
+        parse_extension_info_from_path(path_str, origin, uri_ref, referer)?
    };
// Construct asset path from remaining segments // Construct asset path from remaining segments
@ -353,8 +351,8 @@ pub fn extension_protocol_handler(
&raw_asset_path &raw_asset_path
}; };
println!("Path: {}", path_str); println!("Path: {path_str}");
println!("Asset to load: {}", asset_to_load); println!("Asset to load: {asset_to_load}");
let absolute_secure_path = resolve_secure_extension_asset_path( let absolute_secure_path = resolve_secure_extension_asset_path(
app_handle, app_handle,
@ -362,7 +360,7 @@ pub fn extension_protocol_handler(
&info.public_key, &info.public_key,
&info.name, &info.name,
&info.version, &info.version,
&asset_to_load, asset_to_load,
)?; )?;
println!("Resolved path: {}", absolute_secure_path.display()); println!("Resolved path: {}", absolute_secure_path.display());
@ -497,7 +495,7 @@ fn parse_encoded_info_from_origin_or_uri_or_referer_or_cache(
if let Ok(hex) = parse_from_origin(origin) { if let Ok(hex) = parse_from_origin(origin) {
if let Ok(info) = process_hex_encoded_json(&hex) { if let Ok(info) = process_hex_encoded_json(&hex) {
cache_extension_info(&info); // Cache setzen cache_extension_info(&info); // Cache setzen
println!("Parsed und gecached aus Origin: {}", hex); println!("Parsed und gecached aus Origin: {hex}");
return Ok(info); return Ok(info);
} }
} }
@ -507,17 +505,17 @@ fn parse_encoded_info_from_origin_or_uri_or_referer_or_cache(
if let Ok(hex) = parse_from_uri_path(uri_ref) { if let Ok(hex) = parse_from_uri_path(uri_ref) {
if let Ok(info) = process_hex_encoded_json(&hex) { if let Ok(info) = process_hex_encoded_json(&hex) {
cache_extension_info(&info); // Cache setzen cache_extension_info(&info); // Cache setzen
println!("Parsed und gecached aus URI: {}", hex); println!("Parsed und gecached aus URI: {hex}");
return Ok(info); return Ok(info);
} }
} }
println!("Fallback zu Referer-Parsing: {}", referer); println!("Fallback zu Referer-Parsing: {referer}");
if !referer.is_empty() && referer != "null" { if !referer.is_empty() && referer != "null" {
if let Ok(hex) = parse_from_uri_string(referer) { if let Ok(hex) = parse_from_uri_string(referer) {
if let Ok(info) = process_hex_encoded_json(&hex) { if let Ok(info) = process_hex_encoded_json(&hex) {
cache_extension_info(&info); // Cache setzen cache_extension_info(&info); // Cache setzen
println!("Parsed und gecached aus Referer: {}", hex); println!("Parsed und gecached aus Referer: {hex}");
return Ok(info); return Ok(info);
} }
} }
@ -609,29 +607,23 @@ fn validate_and_return_hex(segment: &str) -> Result<String, DataProcessingError>
Ok(segment.to_string()) Ok(segment.to_string())
} }
fn encode_hex_for_log(info: &ExtensionInfo) -> String {
let json_str = serde_json::to_string(info).unwrap_or_default();
hex::encode(json_str.as_bytes())
}
// Helper function to parse extension info from path segments // Helper function to parse extension info from path segments
fn parse_extension_info_from_path( fn parse_extension_info_from_path(
path_str: &str, path_str: &str,
origin: &str, origin: &str,
uri_ref: &Uri, uri_ref: &Uri,
referer: &str, referer: &str,
allowed_origin: &str,
) -> Result<(ExtensionInfo, Vec<String>), Box<dyn std::error::Error>> { ) -> Result<(ExtensionInfo, Vec<String>), Box<dyn std::error::Error>> {
let mut segments_iter = path_str.split('/').filter(|s| !s.is_empty()); let mut segments_iter = path_str.split('/').filter(|s| !s.is_empty());
match (segments_iter.next(), segments_iter.next(), segments_iter.next()) { match (segments_iter.next(), segments_iter.next(), segments_iter.next()) {
(Some(public_key), Some(name), Some(version)) => { (Some(public_key), Some(name), Some(version)) => {
println!("=== Extension Protocol Handler (path-based) ==="); println!("=== Extension Protocol Handler (path-based) ===");
println!("Full URI: {}", uri_ref); println!("Full URI: {uri_ref}");
println!("Parsed from path segments:"); println!("Parsed from path segments:");
println!(" PublicKey: {}", public_key); println!(" PublicKey: {public_key}");
println!(" Name: {}", name); println!(" Name: {name}");
println!(" Version: {}", version); println!(" Version: {version}");
let info = ExtensionInfo { let info = ExtensionInfo {
public_key: public_key.to_string(), public_key: public_key.to_string(),
@ -653,7 +645,7 @@ fn parse_extension_info_from_path(
) { ) {
Ok(decoded) => { Ok(decoded) => {
println!("=== Extension Protocol Handler (legacy hex format) ==="); println!("=== Extension Protocol Handler (legacy hex format) ===");
println!("Full URI: {}", uri_ref); println!("Full URI: {uri_ref}");
println!("Decoded info:"); println!("Decoded info:");
println!(" PublicKey: {}", decoded.public_key); println!(" PublicKey: {}", decoded.public_key);
println!(" Name: {}", decoded.name); println!(" Name: {}", decoded.name);
@ -670,8 +662,8 @@ fn parse_extension_info_from_path(
Ok((decoded, segments)) Ok((decoded, segments))
} }
Err(e) => { Err(e) => {
eprintln!("Fehler beim Parsen (alle Fallbacks): {}", e); eprintln!("Fehler beim Parsen (alle Fallbacks): {e}");
Err(format!("Ungültige Anfrage: {}", e).into()) Err(format!("Ungültige Anfrage: {e}").into())
} }
} }
} }
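The bulk of this diff inlines format arguments: since Rust 2021, `format!`, `println!`, and friends can capture identifiers from the surrounding scope, so `format!("{}", e)` and `format!("{e}")` are equivalent, and Clippy's `uninlined_format_args` lint suggests the shorter form. A minimal sketch of the before/after (the `describe_error` helper is hypothetical, not from this repo):

```rust
// Captured identifiers in format strings (Rust 2021 and later):
// `{path}` and `{err}` read the local variables directly.
fn describe_error(path: &str, err: &str) -> String {
    format!("Invalid extension info in base64 path '{path}': {err}")
}

fn main() {
    let msg = describe_error("/ext/a", "unexpected token");
    // The positional form produces the identical string.
    assert_eq!(msg, format!("Invalid extension info in base64 path '{}': {}", "/ext/a", "unexpected token"));
    println!("{msg}");
}
```

Note that only plain identifiers can be captured; expressions like `absolute_secure_path.display()` or `decoded.public_key` must stay positional, which is why a few `println!` calls in this diff are unchanged.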


@@ -70,8 +70,7 @@ pub fn copy_directory(
 use std::path::PathBuf;
 println!(
-"Kopiere Verzeichnis von '{}' nach '{}'",
-source, destination
+"Kopiere Verzeichnis von '{source}' nach '{destination}'"
 );
 let source_path = PathBuf::from(&source);
@@ -81,7+80,7 @@ pub fn copy_directory(
 return Err(ExtensionError::Filesystem {
 source: std::io::Error::new(
 std::io::ErrorKind::NotFound,
-format!("Source directory '{}' not found", source),
+format!("Source directory '{source}' not found"),
 ),
 });
 }
@@ -93,7 +92,7 @@ pub fn copy_directory(
 fs_extra::dir::copy(&source_path, &destination_path, &options).map_err(|e| {
 ExtensionError::Filesystem {
-source: std::io::Error::new(std::io::ErrorKind::Other, e.to_string()),
+source: std::io::Error::other(e.to_string()),
 }
 })?;
 Ok(())
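The `fs_extra` error mapping above switches to `std::io::Error::other(...)`, stable since Rust 1.74, which is shorthand for `io::Error::new(io::ErrorKind::Other, ...)` and is what Clippy's `io_other_error` lint recommends. A small std-only sketch (the `wrap_copy_error` helper is illustrative):

```rust
use std::io;

// Shorthand construction of an ErrorKind::Other error from any message.
fn wrap_copy_error(msg: &str) -> io::Error {
    io::Error::other(msg.to_string())
}

fn main() {
    let err = wrap_copy_error("copy failed");
    assert_eq!(err.kind(), io::ErrorKind::Other);
    assert_eq!(err.to_string(), "copy failed");
    println!("wrapped: {err}");
}
```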


@@ -18,20 +18,20 @@ impl ExtensionCrypto {
 signature_hex: &str,
 ) -> Result<(), String> {
 let public_key_bytes =
-hex::decode(public_key_hex).map_err(|e| format!("Invalid public key: {}", e))?;
+hex::decode(public_key_hex).map_err(|e| format!("Invalid public key: {e}"))?;
 let public_key = VerifyingKey::from_bytes(&public_key_bytes.try_into().unwrap())
-.map_err(|e| format!("Invalid public key: {}", e))?;
+.map_err(|e| format!("Invalid public key: {e}"))?;
 let signature_bytes =
-hex::decode(signature_hex).map_err(|e| format!("Invalid signature: {}", e))?;
+hex::decode(signature_hex).map_err(|e| format!("Invalid signature: {e}"))?;
 let signature = Signature::from_bytes(&signature_bytes.try_into().unwrap());
 let content_hash =
-hex::decode(content_hash_hex).map_err(|e| format!("Invalid content hash: {}", e))?;
+hex::decode(content_hash_hex).map_err(|e| format!("Invalid content hash: {e}"))?;
 public_key
 .verify(&content_hash, &signature)
-.map_err(|e| format!("Signature verification failed: {}", e))
+.map_err(|e| format!("Signature verification failed: {e}"))
 }
 /// Berechnet Hash eines Verzeichnisses (für Verifikation)
@@ -71,7 +71,7 @@ impl ExtensionCrypto {
 if !canonical_manifest_path.starts_with(&canonical_dir) {
 return Err(ExtensionError::ManifestError {
-reason: format!("Manifest path resolves outside of extension directory (potential path traversal)"),
+reason: "Manifest path resolves outside of extension directory (potential path traversal)".to_string(),
 });
 }
@@ -90,7 +90,7 @@ impl ExtensionCrypto {
 let mut manifest: serde_json::Value =
 serde_json::from_str(&content_str).map_err(|e| {
 ExtensionError::ManifestError {
-reason: format!("Cannot parse manifest JSON: {}", e),
+reason: format!("Cannot parse manifest JSON: {e}"),
 }
 })?;
@@ -107,7 +107,7 @@ impl ExtensionCrypto {
 let canonical_manifest_content =
 serde_json::to_string_pretty(&manifest).map_err(|e| {
 ExtensionError::ManifestError {
-reason: format!("Failed to serialize manifest: {}", e),
+reason: format!("Failed to serialize manifest: {e}"),
 }
 })?;
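The manifest check above guards against path traversal by canonicalizing both paths and requiring that the manifest still sits under the extension directory. The containment test itself is `Path::starts_with`, which compares whole components rather than raw string prefixes. A std-only sketch of that check (the real code canonicalizes with `fs::canonicalize` first; plain `Path` values stand in here):

```rust
use std::path::Path;

// Component-wise containment: "/ext/ab" does NOT count as inside "/ext/a",
// which a naive string prefix test would get wrong.
fn is_contained(dir: &Path, candidate: &Path) -> bool {
    candidate.starts_with(dir)
}

fn main() {
    assert!(is_contained(Path::new("/ext/a"), Path::new("/ext/a/manifest.json")));
    assert!(!is_contained(Path::new("/ext/a"), Path::new("/etc/passwd")));
    assert!(!is_contained(Path::new("/ext/a"), Path::new("/ext/ab/manifest.json")));
    println!("containment checks passed");
}
```

Canonicalizing before the check matters because `..` segments and symlinks are only resolved then; comparing the raw input paths would leave the traversal hole open.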


@@ -3,7 +3,7 @@
 use crate::crdt::hlc::HlcService;
 use crate::crdt::transformer::CrdtTransformer;
 use crate::crdt::trigger;
-use crate::database::core::{convert_value_ref_to_json, parse_sql_statements, ValueConverter};
+use crate::database::core::{convert_value_ref_to_json, parse_sql_statements};
 use crate::database::error::DatabaseError;
 use rusqlite::{params_from_iter, types::Value as SqliteValue, ToSql, Transaction};
 use serde_json::Value as JsonValue;
@@ -52,14 +52,14 @@ impl SqlExecutor {
 }
 let sql_str = statement.to_string();
-eprintln!("DEBUG: Transformed execute SQL: {}", sql_str);
+eprintln!("DEBUG: Transformed execute SQL: {sql_str}");
 // Führe Statement aus
 tx.execute(&sql_str, params)
 .map_err(|e| DatabaseError::ExecutionError {
 sql: sql_str.clone(),
 table: None,
-reason: format!("Execute failed: {}", e),
+reason: format!("Execute failed: {e}"),
 })?;
 // Trigger-Logik für CREATE TABLE
@@ -70,7 +70,7 @@ impl SqlExecutor {
 .trim_matches('"')
 .trim_matches('`')
 .to_string();
-eprintln!("DEBUG: Setting up triggers for table: {}", table_name_str);
+eprintln!("DEBUG: Setting up triggers for table: {table_name_str}");
 trigger::setup_triggers_for_table(tx, &table_name_str, false)?;
 }
@@ -115,7 +115,7 @@ impl SqlExecutor {
 }
 let sql_str = statement.to_string();
-eprintln!("DEBUG: Transformed SQL (with RETURNING): {}", sql_str);
+eprintln!("DEBUG: Transformed SQL (with RETURNING): {sql_str}");
 // Prepare und query ausführen
 let mut stmt = tx
@@ -170,7 +170,7 @@ impl SqlExecutor {
 .trim_matches('"')
 .trim_matches('`')
 .to_string();
-eprintln!("DEBUG: Setting up triggers for table (RETURNING): {}", table_name_str);
+eprintln!("DEBUG: Setting up triggers for table (RETURNING): {table_name_str}");
 trigger::setup_triggers_for_table(tx, &table_name_str, false)?;
 }
@@ -186,7 +186,7 @@ impl SqlExecutor {
 ) -> Result<HashSet<String>, DatabaseError> {
 let sql_params: Vec<SqliteValue> = params
 .iter()
-.map(|v| crate::database::core::ValueConverter::json_to_rusqlite_value(v))
+.map(crate::database::core::ValueConverter::json_to_rusqlite_value)
 .collect::<Result<Vec<_>, _>>()?;
 let param_refs: Vec<&dyn ToSql> = sql_params.iter().map(|p| p as &dyn ToSql).collect();
 Self::execute_internal_typed(tx, hlc_service, sql, &param_refs)
@@ -201,7 +201,7 @@ impl SqlExecutor {
 ) -> Result<(HashSet<String>, Vec<Vec<JsonValue>>), DatabaseError> {
 let sql_params: Vec<SqliteValue> = params
 .iter()
-.map(|v| crate::database::core::ValueConverter::json_to_rusqlite_value(v))
+.map(crate::database::core::ValueConverter::json_to_rusqlite_value)
 .collect::<Result<Vec<_>, _>>()?;
 let param_refs: Vec<&dyn ToSql> = sql_params.iter().map(|p| p as &dyn ToSql).collect();
 Self::query_internal_typed(tx, hlc_service, sql, &param_refs)
@@ -252,12 +252,12 @@ impl SqlExecutor {
 let stmt_to_execute = ast_vec.pop().unwrap();
 let transformed_sql = stmt_to_execute.to_string();
-eprintln!("DEBUG: SELECT (no transformation): {}", transformed_sql);
+eprintln!("DEBUG: SELECT (no transformation): {transformed_sql}");
 // Convert JSON params to SQLite values
 let sql_params: Vec<SqliteValue> = params
 .iter()
-.map(|v| crate::database::core::ValueConverter::json_to_rusqlite_value(v))
+.map(crate::database::core::ValueConverter::json_to_rusqlite_value)
 .collect::<Result<Vec<_>, _>>()?;
 let mut prepared_stmt = conn.prepare(&transformed_sql)?;
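The `.map(|v| f(v))` → `.map(f)` rewrites above are Clippy's `redundant_closure` cleanup: when the closure only forwards its argument, the function path can be passed directly. A sketch with a stand-in function (`parse_i64` is hypothetical; the real code passes `ValueConverter::json_to_rusqlite_value`):

```rust
use std::num::ParseIntError;

// Stand-in for a fallible per-element converter.
fn parse_i64(s: &str) -> Result<i64, ParseIntError> {
    s.parse::<i64>()
}

fn main() {
    // With a redundant closure:
    let with_closure: Result<Vec<i64>, _> =
        ["1", "2", "3"].iter().map(|s| parse_i64(s)).collect();
    // Equivalent, passing the function path directly:
    let with_fn_path: Result<Vec<i64>, _> =
        ["1", "2", "3"].iter().copied().map(parse_i64).collect();
    assert_eq!(with_closure.unwrap(), vec![1, 2, 3]);
    assert_eq!(with_fn_path.unwrap(), vec![1, 2, 3]);
}
```

One subtlety: the closure form benefits from deref coercion at the call site (here `&&str` → `&str`), so dropping the closure sometimes needs a small adjustment such as the `.copied()` above.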


@@ -13,10 +13,8 @@ use crate::AppState;
 use rusqlite::params_from_iter;
 use rusqlite::types::Value as SqlValue;
 use rusqlite::Transaction;
-use serde_json::json;
 use serde_json::Value as JsonValue;
 use sqlparser::ast::{Statement, TableFactor, TableObject};
-use std::collections::HashSet;
 use tauri::State;
 /// Führt Statements mit korrekter Parameter-Bindung aus
@@ -158,7 +156,8 @@ pub async fn extension_sql_execute(
 })?;
 // Generate HLC timestamp
-let hlc_timestamp = hlc_service
+let hlc_timestamp =
+hlc_service
 .new_timestamp_and_persist(&tx)
 .map_err(|e| DatabaseError::HlcError {
 reason: e.to_string(),
@@ -169,15 +168,28 @@ pub async fn extension_sql_execute(
 // Convert parameters to references
 let sql_values = ValueConverter::convert_params(&params)?;
-let param_refs: Vec<&dyn rusqlite::ToSql> = sql_values.iter().map(|v| v as &dyn rusqlite::ToSql).collect();
+let param_refs: Vec<&dyn rusqlite::ToSql> = sql_values
+.iter()
+.map(|v| v as &dyn rusqlite::ToSql)
+.collect();
 let result = if has_returning {
 // Use query_internal for statements with RETURNING
-let (_, rows) = SqlExecutor::query_internal_typed(&tx, &hlc_service, &statement.to_string(), &param_refs)?;
+let (_, rows) = SqlExecutor::query_internal_typed(
+&tx,
+&hlc_service,
+&statement.to_string(),
+&param_refs,
+)?;
 rows
 } else {
 // Use execute_internal for statements without RETURNING
-SqlExecutor::execute_internal_typed(&tx, &hlc_service, &statement.to_string(), &param_refs)?;
+SqlExecutor::execute_internal_typed(
+&tx,
+&hlc_service,
+&statement.to_string(),
+&param_refs,
+)?;
 vec![]
 };
@@ -185,26 +197,23 @@ pub async fn extension_sql_execute(
 if let Statement::CreateTable(ref create_table_details) = statement {
 // Extract table name and remove quotes (both " and `)
 let raw_name = create_table_details.name.to_string();
-println!("DEBUG: Raw table name from AST: {:?}", raw_name);
-println!("DEBUG: Raw table name chars: {:?}", raw_name.chars().collect::<Vec<_>>());
-let table_name_str = raw_name
-.trim_matches('"')
-.trim_matches('`')
-.to_string();
-println!("DEBUG: Cleaned table name: {:?}", table_name_str);
-println!("DEBUG: Cleaned table name chars: {:?}", table_name_str.chars().collect::<Vec<_>>());
-println!(
-"Table '{}' created by extension, setting up CRDT triggers...",
-table_name_str
-);
+println!("DEBUG: Raw table name from AST: {raw_name:?}");
+println!(
+"DEBUG: Raw table name chars: {:?}",
+raw_name.chars().collect::<Vec<_>>()
+);
+let table_name_str = raw_name.trim_matches('"').trim_matches('`').to_string();
+println!("DEBUG: Cleaned table name: {table_name_str:?}");
+println!(
+"DEBUG: Cleaned table name chars: {:?}",
+table_name_str.chars().collect::<Vec<_>>()
+);
+println!("Table '{table_name_str}' created by extension, setting up CRDT triggers...");
 trigger::setup_triggers_for_table(&tx, &table_name_str, false)?;
-println!(
-"Triggers for table '{}' successfully created.",
-table_name_str
-);
+println!("Triggers for table '{table_name_str}' successfully created.");
 }
 // Commit transaction
@@ -302,7 +311,6 @@ pub async fn extension_sql_select(
 .map_err(ExtensionError::from)
 }
-
 /// Validiert Parameter gegen SQL-Platzhalter
 fn validate_params(sql: &str, params: &[JsonValue]) -> Result<(), DatabaseError> {
 let total_placeholders = count_sql_placeholders(sql);
@@ -339,20 +347,4 @@ mod tests {
 );
 assert_eq!(count_sql_placeholders("SELECT * FROM users"), 0);
 }
-/* #[test]
-fn test_truncate_sql() {
-let sql = "SELECT * FROM very_long_table_name";
-assert_eq!(truncate_sql(sql, 10), "SELECT * F...");
-assert_eq!(truncate_sql(sql, 50), sql);
-} */
-#[test]
-fn test_validate_params() {
-let params = vec![json!(1), json!("test")];
-assert!(validate_params("SELECT * FROM users WHERE id = ? AND name = ?", &params).is_ok());
-assert!(validate_params("SELECT * FROM users WHERE id = ?", &params).is_err());
-assert!(validate_params("SELECT * FROM users", &params).is_err());
-}
 }
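The handler above routes statements with a RETURNING clause through the query path (rows come back) and everything else through the execute path (only side effects). A minimal sketch of that branch, with a naive string check standing in for the sqlparser AST inspection the real code performs (`has_returning`/`route` are illustrative names):

```rust
// Naive stand-in for AST-based detection of a RETURNING clause.
fn has_returning(sql: &str) -> bool {
    sql.to_ascii_uppercase().contains(" RETURNING ")
}

// Route to the query path when rows come back, else to the execute path.
fn route(sql: &str) -> &'static str {
    if has_returning(sql) { "query" } else { "execute" }
}

fn main() {
    assert_eq!(route("INSERT INTO t (a) VALUES (?) RETURNING id"), "query");
    assert_eq!(route("UPDATE t SET a = ? WHERE id = ?"), "execute");
    println!("routing checks passed");
}
```

The string check is only a sketch: it would misfire on a literal containing " RETURNING ", which is exactly why the production code decides from the parsed AST instead.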


@@ -174,7 +174,7 @@ impl serde::Serialize for ExtensionError {
 let mut state = serializer.serialize_struct("ExtensionError", 4)?;
 state.serialize_field("code", &self.code())?;
-state.serialize_field("type", &format!("{:?}", self))?;
+state.serialize_field("type", &format!("{self:?}"))?;
 state.serialize_field("message", &self.to_string())?;
 if let Some(ext_id) = self.extension_id() {


@@ -133,7 +133,7 @@ fn validate_path_pattern(pattern: &str) -> Result<(), ExtensionError> {
 // Check for path traversal attempts
 if pattern.contains("../") || pattern.contains("..\\") {
 return Err(ExtensionError::SecurityViolation {
-reason: format!("Path traversal detected in pattern: {}", pattern),
+reason: format!("Path traversal detected in pattern: {pattern}"),
 });
 }
@@ -143,7 +143,6 @@ fn validate_path_pattern(pattern: &str) -> Result<(), ExtensionError> {
 /// Resolves a path pattern to actual filesystem paths using Tauri's BaseDirectory
 pub fn resolve_path_pattern(
 pattern: &str,
-app_handle: &tauri::AppHandle,
 ) -> Result<(String, String), ExtensionError> {
 let (base_var, relative_path) = if let Some(slash_pos) = pattern.find('/') {
 (&pattern[..slash_pos], &pattern[slash_pos + 1..])
@@ -177,7 +176,7 @@ pub fn resolve_path_pattern(
 "$TEMP" => "Temp",
 _ => {
 return Err(ExtensionError::ValidationError {
-reason: format!("Unknown base directory variable: {}", base_var),
+reason: format!("Unknown base directory variable: {base_var}"),
 });
 }
 };
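`resolve_path_pattern` above splits a pattern like `$TEMP/haex/**` into a base-directory variable and a relative part, then maps the variable to a Tauri `BaseDirectory` name, rejecting anything unknown. A std-only sketch of that split-and-match (only the `$TEMP` → `Temp` mapping appears in this diff; the `$APPDATA` arm is an assumed extra for illustration):

```rust
// Split "$TEMP/haex/**" into ("$TEMP", "haex/**"); a pattern with no
// slash is all base variable and an empty relative part.
fn split_pattern(pattern: &str) -> (&str, &str) {
    match pattern.find('/') {
        Some(pos) => (&pattern[..pos], &pattern[pos + 1..]),
        None => (pattern, ""),
    }
}

// Map the variable to a BaseDirectory name, erroring on unknown input.
fn base_dir_name(base_var: &str) -> Result<&'static str, String> {
    match base_var {
        "$TEMP" => Ok("Temp"),
        "$APPDATA" => Ok("AppData"), // assumed mapping, not shown in the diff
        _ => Err(format!("Unknown base directory variable: {base_var}")),
    }
}

fn main() {
    assert_eq!(split_pattern("$TEMP/haex/**"), ("$TEMP", "haex/**"));
    assert_eq!(base_dir_name("$TEMP").unwrap(), "Temp");
    assert!(base_dir_name("$BOGUS").is_err());
    println!("pattern resolution checks passed");
}
```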


@@ -52,7 +52,7 @@ pub async fn get_all_extensions(
 .extension_manager
 .load_installed_extensions(&app_handle, &state)
 .await
-.map_err(|e| format!("Failed to load extensions: {:?}", e))?;
+.map_err(|e| format!("Failed to load extensions: {e:?}"))?;
 /* } */
 let mut extensions = Vec::new();
@@ -292,12 +292,12 @@ pub async fn load_dev_extension(
 let (host, port, haextension_dir) = if config_path.exists() {
 let config_content =
 std::fs::read_to_string(&config_path).map_err(|e| ExtensionError::ValidationError {
-reason: format!("Failed to read haextension.config.json: {}", e),
+reason: format!("Failed to read haextension.config.json: {e}"),
 })?;
 let config: HaextensionConfig =
 serde_json::from_str(&config_content).map_err(|e| ExtensionError::ValidationError {
-reason: format!("Failed to parse haextension.config.json: {}", e),
+reason: format!("Failed to parse haextension.config.json: {e}"),
 })?;
 (config.dev.host, config.dev.port, config.dev.haextension_dir)
@@ -306,23 +306,22 @@ pub async fn load_dev_extension(
 (default_host(), default_port(), default_haextension_dir())
 };
-let dev_server_url = format!("http://{}:{}", host, port);
-eprintln!("📡 Dev server URL: {}", dev_server_url);
-eprintln!("📁 Haextension directory: {}", haextension_dir);
+let dev_server_url = format!("http://{host}:{port}");
+eprintln!("📡 Dev server URL: {dev_server_url}");
+eprintln!("📁 Haextension directory: {haextension_dir}");
 // 1.5. Check if dev server is running
 if !check_dev_server_health(&dev_server_url).await {
 return Err(ExtensionError::ValidationError {
 reason: format!(
-"Dev server at {} is not reachable. Please start your dev server first (e.g., 'npm run dev')",
-dev_server_url
+"Dev server at {dev_server_url} is not reachable. Please start your dev server first (e.g., 'npm run dev')"
 ),
 });
 }
 eprintln!("✅ Dev server is reachable");
 // 2. Validate and build path to manifest: <extension_path>/<haextension_dir>/manifest.json
-let manifest_relative_path = format!("{}/manifest.json", haextension_dir);
+let manifest_relative_path = format!("{haextension_dir}/manifest.json");
 let manifest_path = ExtensionManager::validate_path_in_directory(
 &extension_path_buf,
 &manifest_relative_path,
@@ -330,15 +329,14 @@ pub async fn load_dev_extension(
 )?
 .ok_or_else(|| ExtensionError::ManifestError {
 reason: format!(
-"Manifest not found at: {}/manifest.json. Make sure you run 'npx @haexhub/sdk init' first.",
-haextension_dir
+"Manifest not found at: {haextension_dir}/manifest.json. Make sure you run 'npx @haexhub/sdk init' first."
 ),
 })?;
 // 3. Read and parse manifest
 let manifest_content =
 std::fs::read_to_string(&manifest_path).map_err(|e| ExtensionError::ManifestError {
-reason: format!("Failed to read manifest: {}", e),
+reason: format!("Failed to read manifest: {e}"),
 })?;
 let manifest: ExtensionManifest = serde_json::from_str(&manifest_content)?;
@@ -406,7 +404,7 @@ pub fn remove_dev_extension(
 if let Some(id) = to_remove {
 dev_exts.remove(&id);
-eprintln!("✅ Dev extension removed: {}", name);
+eprintln!("✅ Dev extension removed: {name}");
 Ok(())
 } else {
 Err(ExtensionError::NotFound { public_key, name })
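`load_dev_extension` above reads `haextension.config.json` when it exists and otherwise falls back to `default_host()`/`default_port()`/`default_haextension_dir()`, then builds the dev-server URL from the result. A std-only sketch of that config-or-defaults shape (the default values `localhost`/`3000` and the JSON parsing step are assumptions, not taken from this diff):

```rust
use std::path::Path;

// Hypothetical defaults mirroring the default_host()/default_port() helpers.
fn default_host() -> String { "localhost".to_string() }
fn default_port() -> u16 { 3000 }

// Use the config file when present, otherwise fall back to defaults,
// then format the dev-server URL with captured identifiers.
fn resolve_dev_server(config_path: &Path) -> String {
    let (host, port) = if config_path.exists() {
        // The real code deserializes HaextensionConfig here; omitted in this sketch.
        (default_host(), default_port())
    } else {
        (default_host(), default_port())
    };
    format!("http://{host}:{port}")
}

fn main() {
    let url = resolve_dev_server(Path::new("/definitely/nonexistent/path"));
    assert_eq!(url, "http://localhost:3000");
    println!("dev server URL: {url}");
}
```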


@ -28,8 +28,7 @@ impl PermissionManager {
})?; })?;
let sql = format!( let sql = format!(
"INSERT INTO {} (id, extension_id, resource_type, action, target, constraints, status) VALUES (?, ?, ?, ?, ?, ?, ?)", "INSERT INTO {TABLE_EXTENSION_PERMISSIONS} (id, extension_id, resource_type, action, target, constraints, status) VALUES (?, ?, ?, ?, ?, ?, ?)"
TABLE_EXTENSION_PERMISSIONS
); );
for perm in permissions { for perm in permissions {
@ -76,8 +75,7 @@ impl PermissionManager {
let db_perm: HaexExtensionPermissions = permission.into(); let db_perm: HaexExtensionPermissions = permission.into();
let sql = format!( let sql = format!(
"UPDATE {} SET resource_type = ?, action = ?, target = ?, constraints = ?, status = ? WHERE id = ?", "UPDATE {TABLE_EXTENSION_PERMISSIONS} SET resource_type = ?, action = ?, target = ?, constraints = ?, status = ? WHERE id = ?"
TABLE_EXTENSION_PERMISSIONS
); );
let params = params![ let params = params![
@ -111,7 +109,7 @@ impl PermissionManager {
reason: "Failed to lock HLC service".to_string(), reason: "Failed to lock HLC service".to_string(),
})?; })?;
let sql = format!("UPDATE {} SET status = ? WHERE id = ?", TABLE_EXTENSION_PERMISSIONS); let sql = format!("UPDATE {TABLE_EXTENSION_PERMISSIONS} SET status = ? WHERE id = ?");
let params = params![new_status.as_str(), permission_id]; let params = params![new_status.as_str(), permission_id];
SqlExecutor::execute_internal_typed(&tx, &hlc_service, &sql, params)?; SqlExecutor::execute_internal_typed(&tx, &hlc_service, &sql, params)?;
tx.commit().map_err(DatabaseError::from) tx.commit().map_err(DatabaseError::from)
@ -133,7 +131,7 @@ impl PermissionManager {
})?; })?;
// Echtes DELETE - wird vom CrdtTransformer zu UPDATE umgewandelt // Echtes DELETE - wird vom CrdtTransformer zu UPDATE umgewandelt
let sql = format!("DELETE FROM {} WHERE id = ?", TABLE_EXTENSION_PERMISSIONS); let sql = format!("DELETE FROM {TABLE_EXTENSION_PERMISSIONS} WHERE id = ?");
SqlExecutor::execute_internal_typed(&tx, &hlc_service, &sql, params![permission_id])?; SqlExecutor::execute_internal_typed(&tx, &hlc_service, &sql, params![permission_id])?;
tx.commit().map_err(DatabaseError::from) tx.commit().map_err(DatabaseError::from)
}).map_err(ExtensionError::from) }).map_err(ExtensionError::from)
@ -152,7 +150,7 @@ impl PermissionManager {
reason: "Failed to lock HLC service".to_string(), reason: "Failed to lock HLC service".to_string(),
})?; })?;
let sql = format!("DELETE FROM {} WHERE extension_id = ?", TABLE_EXTENSION_PERMISSIONS); let sql = format!("DELETE FROM {TABLE_EXTENSION_PERMISSIONS} WHERE extension_id = ?");
SqlExecutor::execute_internal_typed(&tx, &hlc_service, &sql, params![extension_id])?; SqlExecutor::execute_internal_typed(&tx, &hlc_service, &sql, params![extension_id])?;
tx.commit().map_err(DatabaseError::from) tx.commit().map_err(DatabaseError::from)
 }).map_err(ExtensionError::from)
@@ -164,7 +162,7 @@ impl PermissionManager {
         hlc_service: &crate::crdt::hlc::HlcService,
         extension_id: &str,
     ) -> Result<(), DatabaseError> {
-        let sql = format!("DELETE FROM {} WHERE extension_id = ?", TABLE_EXTENSION_PERMISSIONS);
+        let sql = format!("DELETE FROM {TABLE_EXTENSION_PERMISSIONS} WHERE extension_id = ?");
         SqlExecutor::execute_internal_typed(tx, hlc_service, &sql, params![extension_id])?;
         Ok(())
     }
@@ -174,7 +172,7 @@ impl PermissionManager {
         extension_id: &str,
     ) -> Result<Vec<ExtensionPermission>, ExtensionError> {
         with_connection(&app_state.db, |conn| {
-            let sql = format!("SELECT * FROM {} WHERE extension_id = ?", TABLE_EXTENSION_PERMISSIONS);
+            let sql = format!("SELECT * FROM {TABLE_EXTENSION_PERMISSIONS} WHERE extension_id = ?");
             let mut stmt = conn.prepare(&sql).map_err(DatabaseError::from)?;
             let perms_iter = stmt.query_map(params![extension_id], |row| {
@@ -198,7 +196,8 @@ impl PermissionManager {
         table_name: &str,
     ) -> Result<(), ExtensionError> {
         // Remove quotes from table name if present (from SDK's getTableName())
-        let clean_table_name = table_name.trim_matches('"');
+        // Support both double quotes and backticks (Drizzle uses backticks by default)
+        let clean_table_name = table_name.trim_matches('"').trim_matches('`');

         // Auto-allow: Extensions have full access to their own tables
         // Table format: {publicKey}__{extensionName}__{tableName}
@@ -209,7 +208,7 @@ impl PermissionManager {
             .extension_manager
             .get_extension(extension_id)
             .ok_or_else(|| ExtensionError::ValidationError {
-                reason: format!("Extension with ID {} not found", extension_id),
+                reason: format!("Extension with ID {extension_id} not found"),
             })?;

         // Build expected table prefix: {publicKey}__{extensionName}__
@@ -238,8 +237,8 @@ impl PermissionManager {
         if !has_permission {
             return Err(ExtensionError::permission_denied(
                 extension_id,
-                &format!("{:?}", action),
-                &format!("database table '{}'", table_name),
+                &format!("{action:?}"),
+                &format!("database table '{table_name}'"),
             ));
         }
@@ -415,7 +414,7 @@ impl PermissionManager {
             "db" => Ok(ResourceType::Db),
             "shell" => Ok(ResourceType::Shell),
             _ => Err(DatabaseError::SerializationError {
-                reason: format!("Unknown resource type: {}", s),
+                reason: format!("Unknown resource type: {s}"),
             }),
         }
     }
@@ -423,8 +422,7 @@ impl PermissionManager {
     fn matches_path_pattern(pattern: &str, path: &str) -> bool {
-        if pattern.ends_with("/*") {
-            let prefix = &pattern[..pattern.len() - 2];
+        if let Some(prefix) = pattern.strip_suffix("/*") {
             return path.starts_with(prefix);
         }
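
The `strip_suffix` refactor in the hunk above keeps the original semantics: a permission pattern ending in `/*` matches every path under its prefix. A small TypeScript sketch of that rule (a hypothetical port for illustration; the exact-match fallback is an assumption, since the non-wildcard branch is outside this diff):

```typescript
// Hypothetical TypeScript port of matches_path_pattern, for illustration.
// A pattern ending in "/*" grants access to everything under its prefix.
// The exact-match fallback is an assumption; that branch is not in the diff.
function matchesPathPattern(pattern: string, path: string): boolean {
  if (pattern.endsWith('/*')) {
    const prefix = pattern.slice(0, -2) // same as Rust's strip_suffix("/*")
    return path.startsWith(prefix)
  }
  return pattern === path
}
```

With this rule, a scope like `$TEMP/**` in the Tauri permissions (once resolved) covers any file the handler writes to the temp directory.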

View File

@@ -267,7 +267,7 @@ impl ResourceType {
             "db" => Ok(ResourceType::Db),
             "shell" => Ok(ResourceType::Shell),
             _ => Err(ExtensionError::ValidationError {
-                reason: format!("Unknown resource type: {}", s),
+                reason: format!("Unknown resource type: {s}"),
             }),
         }
     }
@@ -301,7 +301,7 @@ impl Action {
             ResourceType::Fs => Ok(Action::Filesystem(FsAction::from_str(s)?)),
             ResourceType::Http => {
                 let action: HttpAction =
-                    serde_json::from_str(&format!("\"{}\"", s)).map_err(|_| {
+                    serde_json::from_str(&format!("\"{s}\"")).map_err(|_| {
                         ExtensionError::InvalidActionString {
                             input: s.to_string(),
                             resource_type: "http".to_string(),
@@ -329,7 +329,7 @@ impl PermissionStatus {
             "granted" => Ok(PermissionStatus::Granted),
             "denied" => Ok(PermissionStatus::Denied),
             _ => Err(ExtensionError::ValidationError {
-                reason: format!("Unknown permission status: {}", s),
+                reason: format!("Unknown permission status: {s}"),
             }),
         }
     }
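
These `from_str` helpers all follow the same pattern: map a stored string back to a typed enum and turn anything unknown into an explicit validation error. A hedged TypeScript sketch of the same idea (names are illustrative, not the project's actual API):

```typescript
type ResourceType = 'fs' | 'http' | 'db' | 'shell'

// Mirrors the Rust from_str helpers: known strings become a typed value,
// anything else is an explicit error (illustrative names only).
function parseResourceType(s: string): ResourceType {
  const known: readonly string[] = ['fs', 'http', 'db', 'shell']
  if (known.includes(s)) return s as ResourceType
  throw new Error(`Unknown resource type: ${s}`)
}
```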

View File

@@ -17,7 +17,7 @@ impl SqlPermissionValidator {
     fn is_own_table(extension_id: &str, table_name: &str) -> bool {
         // Table names use the format: {keyHash}_{extensionName}_{tableName}
        // extension_id is the extension's keyHash
-        table_name.starts_with(&format!("{}_", extension_id))
+        table_name.starts_with(&format!("{extension_id}_"))
     }

     /// Validates a SQL statement against an extension's permissions
@@ -45,7 +45,7 @@ impl SqlPermissionValidator {
                 Self::validate_schema_statement(app_state, extension_id, &statement).await
             }
             _ => Err(ExtensionError::ValidationError {
-                reason: format!("Statement type not allowed: {}", sql),
+                reason: format!("Statement type not allowed: {sql}"),
             }),
         }
     }
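
`is_own_table` reduces ownership to a string-prefix test, because table names embed the owning extension's key hash. The same check in TypeScript, as an isolated sketch:

```typescript
// Table names follow {keyHash}_{extensionName}_{tableName}, and
// extension_id is the extension's keyHash, so ownership is a prefix test.
function isOwnTable(extensionId: string, tableName: string): boolean {
  return tableName.startsWith(`${extensionId}_`)
}
```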

View File

@@ -26,7 +26,7 @@ pub fn run() {
             let state = app_handle.state::<AppState>();

             // Call the handler with all required parameters
-            match extension::core::extension_protocol_handler(state, &app_handle, &request) {
+            match extension::core::extension_protocol_handler(state, app_handle, &request) {
                 Ok(response) => response,
                 Err(e) => {
                     eprintln!(
@@ -38,11 +38,10 @@ pub fn run() {
                         .status(500)
                         .header("Content-Type", "text/plain")
                         .body(Vec::from(format!(
-                            "Internal server error in protocol handler: {}",
-                            e
+                            "Internal server error in protocol handler: {e}"
                         )))
                         .unwrap_or_else(|build_err| {
-                            eprintln!("Could not build error response: {}", build_err);
+                            eprintln!("Could not build error response: {build_err}");
                             tauri::http::Response::builder()
                                 .status(500)
                                 .body(Vec::new())

View File

@@ -20,10 +20,12 @@
     ],
     "security": {
       "csp": {
-        "default-src": ["'self'", "http://tauri.localhost", "haex-extension:"],
+        "default-src": ["'self'", "http://tauri.localhost", "https://tauri.localhost", "asset:", "haex-extension:"],
         "script-src": [
           "'self'",
           "http://tauri.localhost",
+          "https://tauri.localhost",
+          "asset:",
           "haex-extension:",
           "'wasm-unsafe-eval'",
           "'unsafe-inline'"
@@ -31,6 +33,8 @@
         "style-src": [
           "'self'",
           "http://tauri.localhost",
+          "https://tauri.localhost",
+          "asset:",
           "haex-extension:",
           "'unsafe-inline'"
         ],
@@ -45,20 +49,22 @@
         "img-src": [
           "'self'",
           "http://tauri.localhost",
+          "https://tauri.localhost",
+          "asset:",
           "haex-extension:",
           "data:",
           "blob:"
         ],
-        "font-src": ["'self'", "http://tauri.localhost", "haex-extension:"],
+        "font-src": ["'self'", "http://tauri.localhost", "https://tauri.localhost", "asset:", "haex-extension:"],
         "object-src": ["'none'"],
-        "media-src": ["'self'", "http://tauri.localhost", "haex-extension:"],
+        "media-src": ["'self'", "http://tauri.localhost", "https://tauri.localhost", "asset:", "haex-extension:"],
         "frame-src": ["haex-extension:"],
         "frame-ancestors": ["'none'"],
         "base-uri": ["'self'"]
       },
       "assetProtocol": {
         "enable": true,
-        "scope": ["$APPDATA", "$RESOURCE"]
+        "scope": ["$APPDATA", "$RESOURCE", "$APPLOCALDATA/**"]
       }
     }
   },
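
The widened `assetProtocol` scope (`$APPLOCALDATA/**`) is what lets the webview load files the app writes under its local data directory, such as the workspace background images added later in this diff. A simplified TypeScript sketch of how such a scope entry gates a concrete path (Tauri's real glob matcher is more involved; this is an illustration only):

```typescript
// Simplified illustration of an asset-protocol scope check: once
// "$APPLOCALDATA" is resolved to a concrete directory, the "/**" suffix
// allows any path below it. This is not Tauri's actual implementation.
function isInAssetScope(appLocalData: string, path: string): boolean {
  return path.startsWith(appLocalData + '/')
}
```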

View File

@@ -18,35 +18,32 @@
     @pointerdown.left="handlePointerDown"
     @pointermove="handlePointerMove"
     @pointerup="handlePointerUp"
+    @dragstart.prevent
     @click.left="handleClick"
     @dblclick="handleDoubleClick"
   >
     <div class="flex flex-col items-center gap-2 p-3 group">
       <div
         :class="[
-          'w-20 h-20 flex items-center justify-center rounded-2xl transition-all duration-200 ease-out',
+          'flex items-center justify-center rounded-2xl transition-all duration-200 ease-out',
           'backdrop-blur-sm border',
           isSelected
             ? 'bg-white/95 dark:bg-gray-800/95 border-blue-500 dark:border-blue-400 shadow-lg scale-105'
             : 'bg-white/80 dark:bg-gray-800/80 border-gray-200/50 dark:border-gray-700/50 hover:bg-white/90 dark:hover:bg-gray-800/90 hover:border-gray-300 dark:hover:border-gray-600 hover:shadow-md hover:scale-105',
         ]"
+        :style="{ width: `${containerSize}px`, height: `${containerSize}px` }"
       >
-        <img
-          v-if="icon"
-          :src="icon"
-          :alt="label"
-          class="w-14 h-14 object-contain transition-transform duration-200"
-          :class="{ 'scale-110': isSelected }"
-        />
-        <UIcon
-          v-else
-          name="i-heroicons-puzzle-piece-solid"
+        <HaexIcon
+          :name="icon || 'i-heroicons-puzzle-piece-solid'"
           :class="[
-            'w-14 h-14 transition-all duration-200',
-            isSelected
-              ? 'text-blue-500 dark:text-blue-400 scale-110'
-              : 'text-gray-400 dark:text-gray-500 group-hover:text-gray-500 dark:group-hover:text-gray-400',
+            'object-contain transition-all duration-200',
+            isSelected && 'scale-110',
+            !icon &&
+              (isSelected
+                ? 'text-blue-500 dark:text-blue-400'
+                : 'text-gray-400 dark:text-gray-500 group-hover:text-gray-500 dark:group-hover:text-gray-400'),
           ]"
+          :style="{ width: `${innerIconSize}px`, height: `${innerIconSize}px` }"
         />
       </div>
       <span
@@ -79,15 +76,19 @@ const props = defineProps<{
 const emit = defineEmits<{
   positionChanged: [id: string, x: number, y: number]
-  dragStart: [id: string, itemType: string, referenceId: string]
+  dragStart: [id: string, itemType: string, referenceId: string, width: number, height: number, x: number, y: number]
+  dragging: [id: string, x: number, y: number]
   dragEnd: []
 }>()

 const desktopStore = useDesktopStore()
+const { effectiveIconSize } = storeToRefs(desktopStore)
 const showUninstallDialog = ref(false)
 const { t } = useI18n()

 const isSelected = computed(() => desktopStore.isItemSelected(props.id))
+const containerSize = computed(() => effectiveIconSize.value) // Container size
+const innerIconSize = computed(() => effectiveIconSize.value * 0.7) // Inner icon is 70% of container

 const handleClick = (e: MouseEvent) => {
   // Prevent selection during drag
@@ -131,9 +132,40 @@ const isDragging = ref(false)
 const offsetX = ref(0)
 const offsetY = ref(0)

-// Icon dimensions (approximate)
-const iconWidth = 120 // Matches design in template
-const iconHeight = 140
+// Track actual icon dimensions dynamically
+const { width: iconWidth, height: iconHeight } = useElementSize(draggableEl)
+
+// Re-center icon position when dimensions are measured
+watch([iconWidth, iconHeight], async ([width, height]) => {
+  if (width > 0 && height > 0) {
+    console.log('📐 Icon dimensions measured:', {
+      label: props.label,
+      width,
+      height,
+      currentPosition: { x: x.value, y: y.value },
+      gridCellSize: desktopStore.gridCellSize,
+    })
+
+    // Re-snap to grid with actual dimensions to ensure proper centering
+    const snapped = desktopStore.snapToGrid(x.value, y.value, width, height)
+    console.log('📍 Snapped position:', {
+      label: props.label,
+      oldPosition: { x: x.value, y: y.value },
+      newPosition: snapped,
+    })
+
+    const oldX = x.value
+    const oldY = y.value
+    x.value = snapped.x
+    y.value = snapped.y
+
+    // Save corrected position to database if it changed
+    if (oldX !== snapped.x || oldY !== snapped.y) {
+      emit('positionChanged', props.id, snapped.x, snapped.y)
+    }
+  }
+}, { once: true }) // Only run once when dimensions are first measured

 const style = computed(() => ({
   position: 'absolute' as const,
@@ -145,8 +177,11 @@ const style = computed(() => ({

 const handlePointerDown = (e: PointerEvent) => {
   if (!draggableEl.value || !draggableEl.value.parentElement) return

+  // Prevent any text selection during drag
+  e.preventDefault()
+
   isDragging.value = true
-  emit('dragStart', props.id, props.itemType, props.referenceId)
+  emit('dragStart', props.id, props.itemType, props.referenceId, iconWidth.value, iconHeight.value, x.value, y.value)

   // Get parent offset to convert from viewport coordinates to parent-relative coordinates
   const parentRect = draggableEl.value.parentElement.getBoundingClientRect()
@@ -165,8 +200,15 @@ const handlePointerMove = (e: PointerEvent) => {
   const newX = e.clientX - parentRect.left - offsetX.value
   const newY = e.clientY - parentRect.top - offsetY.value

-  x.value = newX
-  y.value = newY
+  // Clamp position to viewport bounds during drag
+  const maxX = viewportSize ? Math.max(0, viewportSize.width.value - iconWidth.value) : Number.MAX_SAFE_INTEGER
+  const maxY = viewportSize ? Math.max(0, viewportSize.height.value - iconHeight.value) : Number.MAX_SAFE_INTEGER
+  x.value = Math.max(0, Math.min(maxX, newX))
+  y.value = Math.max(0, Math.min(maxY, newY))
+
+  // Emit current position during drag
+  emit('dragging', props.id, x.value, y.value)
 }

 const handlePointerUp = (e: PointerEvent) => {
@@ -177,10 +219,15 @@ const handlePointerUp = (e: PointerEvent) => {
     draggableEl.value.releasePointerCapture(e.pointerId)
   }

+  // Snap to grid with icon dimensions
+  const snapped = desktopStore.snapToGrid(x.value, y.value, iconWidth.value, iconHeight.value)
+  x.value = snapped.x
+  y.value = snapped.y
+
   // Snap icon to viewport bounds if outside
   if (viewportSize) {
-    const maxX = Math.max(0, viewportSize.width.value - iconWidth)
-    const maxY = Math.max(0, viewportSize.height.value - iconHeight)
+    const maxX = Math.max(0, viewportSize.width.value - iconWidth.value)
+    const maxY = Math.max(0, viewportSize.height.value - iconHeight.value)
     x.value = Math.max(0, Math.min(maxX, x.value))
     y.value = Math.max(0, Math.min(maxY, y.value))
   }
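
The component now measures its real rendered size with `useElementSize` and re-snaps once dimensions arrive, instead of hardcoding 120x140. The store's `snapToGrid` itself is not shown in this diff; the sketch below is a plausible, hypothetical version of what such a function might do, centering the item in the nearest grid cell:

```typescript
// Hypothetical grid snap (the real desktopStore.snapToGrid may differ):
// find the nearest cell for the item's center, then position the item
// so it is centered within that cell.
function snapToGrid(
  x: number,
  y: number,
  width: number,
  height: number,
  cellSize = 96, // assumed cell size for illustration
): { x: number; y: number } {
  const centerX = x + width / 2
  const centerY = y + height / 2
  const col = Math.max(0, Math.round(centerX / cellSize - 0.5))
  const row = Math.max(0, Math.round(centerY / cellSize - 0.5))
  return {
    x: col * cellSize + (cellSize - width) / 2,
    y: row * cellSize + (cellSize - height) / 2,
  }
}
```

Measuring first and snapping second matters here: snapping with the old hardcoded 120x140 would mis-center icons whose rendered size changed with the new `effectiveIconSize` setting.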

View File

@@ -23,20 +23,25 @@
       :key="workspace.id"
       class="w-full h-full"
     >
+      <UContextMenu :items="getWorkspaceContextMenuItems(workspace.id)">
         <div
-          class="w-full h-full relative"
+          class="w-full h-full relative select-none"
+          :style="getWorkspaceBackgroundStyle(workspace)"
           @click.self.stop="handleDesktopClick"
           @mousedown.left.self="handleAreaSelectStart"
           @dragover.prevent="handleDragOver"
           @drop.prevent="handleDrop($event, workspace.id)"
+          @selectstart.prevent
         >
-          <!-- Grid Pattern Background -->
+          <!-- Drop Target Zone (visible during drag) -->
           <div
-            class="absolute inset-0 pointer-events-none opacity-30"
+            v-if="dropTargetZone"
+            class="absolute border-2 border-blue-500 bg-blue-500/10 rounded-lg pointer-events-none z-10 transition-all duration-75"
             :style="{
-              backgroundImage:
-                'linear-gradient(rgba(0, 0, 0, 0.1) 1px, transparent 1px), linear-gradient(90deg, rgba(0, 0, 0, 0.1) 1px, transparent 1px)',
-              backgroundSize: '32px 32px',
+              left: `${dropTargetZone.x}px`,
+              top: `${dropTargetZone.y}px`,
+              width: `${dropTargetZone.width}px`,
+              height: `${dropTargetZone.height}px`,
             }"
           />
@@ -44,12 +49,16 @@
           <div
             class="absolute left-0 top-0 bottom-0 border-blue-500 pointer-events-none backdrop-blur-sm z-50 transition-all duration-500 ease-in-out"
-            :class="showLeftSnapZone ? 'w-1/2 bg-blue-500/20 border-2' : 'w-0'"
+            :class="
+              showLeftSnapZone ? 'w-1/2 bg-blue-500/20 border-2' : 'w-0'
+            "
           />
           <div
             class="absolute right-0 top-0 bottom-0 border-blue-500 pointer-events-none backdrop-blur-sm z-50 transition-all duration-500 ease-in-out"
-            :class="showRightSnapZone ? 'w-1/2 bg-blue-500/20 border-2' : 'w-0'"
+            :class="
+              showRightSnapZone ? 'w-1/2 bg-blue-500/20 border-2' : 'w-0'
+            "
           />

           <!-- Area Selection Box -->
@@ -73,6 +82,7 @@
             class="no-swipe"
             @position-changed="handlePositionChanged"
             @drag-start="handleDragStart"
+            @dragging="handleDragging"
             @drag-end="handleDragEnd"
           />
@@ -208,6 +218,7 @@
           </HaexWindow>
         </template>
         </div>
+      </UContextMenu>
     </SwiperSlide>
   </Swiper>
@@ -239,8 +250,8 @@ const {
   allowSwipe,
   isOverviewMode,
 } = storeToRefs(workspaceStore)
-
-const { x: mouseX } = useMouse()
+const { getWorkspaceBackgroundStyle, getWorkspaceContextMenuItems } =
+  workspaceStore

 const desktopEl = useTemplateRef('desktopEl')
@@ -275,9 +286,44 @@ const selectionBoxStyle = computed(() => {

 // Drag state for desktop icons
 const isDragging = ref(false)
-const currentDraggedItemId = ref<string>()
-const currentDraggedItemType = ref<string>()
-const currentDraggedReferenceId = ref<string>()
+const currentDraggedItem = reactive({
+  id: '',
+  itemType: '',
+  referenceId: '',
+  width: 0,
+  height: 0,
+  x: 0,
+  y: 0,
+})
+
+// Track mouse position for showing drop target
+const { x: mouseX, y: mouseY } = useMouse()
+
+const dropTargetZone = computed(() => {
+  if (!isDragging.value) return null
+
+  // Use the actual icon position during drag
+  const iconX = currentDraggedItem.x
+  const iconY = currentDraggedItem.y
+
+  // Use snapToGrid to get the exact position where the icon will land
+  const snapped = desktopStore.snapToGrid(
+    iconX,
+    iconY,
+    currentDraggedItem.width || undefined,
+    currentDraggedItem.height || undefined,
+  )
+
+  // Show dropzone at snapped position with grid cell size
+  const cellSize = desktopStore.gridCellSize
+  return {
+    x: snapped.x,
+    y: snapped.y,
+    width: currentDraggedItem.width || cellSize,
+    height: currentDraggedItem.height || cellSize,
+  }
+})

 // Window drag state for snap zones
 const isWindowDragging = ref(false)
@@ -369,20 +415,43 @@ const handlePositionChanged = async (id: string, x: number, y: number) => {
   }
 }

-const handleDragStart = (id: string, itemType: string, referenceId: string) => {
+const handleDragStart = (
+  id: string,
+  itemType: string,
+  referenceId: string,
+  width: number,
+  height: number,
+  x: number,
+  y: number,
+) => {
   isDragging.value = true
-  currentDraggedItemId.value = id
-  currentDraggedItemType.value = itemType
-  currentDraggedReferenceId.value = referenceId
+  currentDraggedItem.id = id
+  currentDraggedItem.itemType = itemType
+  currentDraggedItem.referenceId = referenceId
+  currentDraggedItem.width = width
+  currentDraggedItem.height = height
+  currentDraggedItem.x = x
+  currentDraggedItem.y = y
   allowSwipe.value = false // Disable Swiper during icon drag
 }

+const handleDragging = (id: string, x: number, y: number) => {
+  if (currentDraggedItem.id === id) {
+    currentDraggedItem.x = x
+    currentDraggedItem.y = y
+  }
+}
+
 const handleDragEnd = async () => {
   // Cleanup drag state
   isDragging.value = false
-  currentDraggedItemId.value = undefined
-  currentDraggedItemType.value = undefined
-  currentDraggedReferenceId.value = undefined
+  currentDraggedItem.id = ''
+  currentDraggedItem.itemType = ''
+  currentDraggedItem.referenceId = ''
+  currentDraggedItem.width = 0
+  currentDraggedItem.height = 0
+  currentDraggedItem.x = 0
+  currentDraggedItem.y = 0
   allowSwipe.value = true // Re-enable Swiper after drag
 }
@@ -417,15 +486,18 @@ const handleDrop = async (event: DragEvent, workspaceId: string) => {
     const desktopRect = (
       event.currentTarget as HTMLElement
     ).getBoundingClientRect()
-    const x = Math.max(0, event.clientX - desktopRect.left - 32) // Center icon (64px / 2)
-    const y = Math.max(0, event.clientY - desktopRect.top - 32)
+    const rawX = Math.max(0, event.clientX - desktopRect.left - 32) // Center icon (64px / 2)
+    const rawY = Math.max(0, event.clientY - desktopRect.top - 32)
+
+    // Snap to grid
+    const snapped = desktopStore.snapToGrid(rawX, rawY)

     // Create desktop icon on the specific workspace
     await desktopStore.addDesktopItemAsync(
       item.type as DesktopItemType,
       item.id,
-      x,
-      y,
+      snapped.x,
+      snapped.y,
       workspaceId,
     )
   } catch (error) {
@@ -664,6 +736,21 @@ watch(currentWorkspace, async () => {
   }
 })

+// Reset drag state when mouse leaves the document (fixes stuck dropzone)
+useEventListener(document, 'mouseleave', () => {
+  if (isDragging.value) {
+    isDragging.value = false
+    currentDraggedItem.id = ''
+    currentDraggedItem.itemType = ''
+    currentDraggedItem.referenceId = ''
+    currentDraggedItem.width = 0
+    currentDraggedItem.height = 0
+    currentDraggedItem.x = 0
+    currentDraggedItem.y = 0
+    allowSwipe.value = true
+  }
+})
+
 onMounted(async () => {
   // Load workspaces first
   await workspaceStore.loadWorkspacesAsync()

View File

@@ -1,12 +1,12 @@
 <template>
-  <UDrawer
+  <UiDrawer
     v-model:open="open"
     direction="right"
     :title="t('launcher.title')"
     :description="t('launcher.description')"
-    :ui="{
-      content: 'w-dvw max-w-md sm:max-w-fit',
-    }"
+    :overlay="false"
+    :modal="false"
+    :handle-only="true"
   >
     <UButton
       icon="material-symbols:apps"
@@ -30,7 +30,7 @@
           size="lg"
           variant="ghost"
           :ui="{
-            base: 'size-24 flex flex-wrap text-sm items-center justify-center overflow-visible cursor-grab active:cursor-grabbing',
+            base: 'size-24 flex flex-wrap text-sm items-center justify-center overflow-visible cursor-grab',
             leadingIcon: 'size-10',
             label: 'w-full',
           }"
@@ -40,7 +40,6 @@
           draggable="true"
           @click="openItem(item)"
           @dragstart="handleDragStart($event, item)"
-          @dragend="handleDragEnd"
         />
       </UContextMenu>
@@ -64,7 +63,7 @@
         </div>
       </div>
     </template>
-  </UDrawer>
+  </UiDrawer>

   <!-- Uninstall Confirmation Dialog -->
   <UiDialogConfirm
@@ -88,11 +87,14 @@ defineOptions({

 const extensionStore = useExtensionsStore()
 const windowManagerStore = useWindowManagerStore()
+const uiStore = useUiStore()

 const { t } = useI18n()
 const open = ref(false)

+const { isSmallScreen } = storeToRefs(uiStore)
+
 // Uninstall dialog state
 const showUninstallDialog = ref(false)
 const extensionToUninstall = ref<LauncherItem | null>(null)
@@ -240,10 +242,11 @@ const handleDragStart = (event: DragEvent, item: LauncherItem) => {
   if (dragImage) {
     event.dataTransfer.setDragImage(dragImage, 20, 20)
   }
-}
-
-const handleDragEnd = () => {
-  // Cleanup if needed
+
+  // Close drawer on small screens to reveal workspace for drop
+  if (isSmallScreen.value) {
+    open.value = false
+  }
 }
</script>
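
The launcher starts a native HTML drag here, and the desktop's `handleDrop` later reads an item with `type` and `id` fields back out of the drop event. The payload round-trip below is a hypothetical sketch of that hand-off; the field names and JSON encoding are assumptions, not the components' confirmed wire format:

```typescript
// Hypothetical payload for the launcher-to-desktop drag hand-off.
// Field names and the JSON encoding are assumptions for illustration.
interface DragPayload {
  type: string // e.g. 'extension'
  id: string
}

function encodeDragPayload(item: DragPayload): string {
  return JSON.stringify(item)
}

function decodeDragPayload(raw: string): DragPayload {
  const parsed = JSON.parse(raw) as Partial<DragPayload>
  if (typeof parsed.type !== 'string' || typeof parsed.id !== 'string') {
    throw new Error('Invalid drag payload')
  }
  return { type: parsed.type, id: parsed.id }
}
```

Validating on decode matters because the drop handler creates a database row from whatever it receives.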

View File

@@ -0,0 +1,65 @@
<template>
<div class="inline-flex">
<UTooltip :text="tooltip">
<!-- Bundled Icon (iconify) -->
<UIcon
v-if="isBundledIcon"
:name="name"
v-bind="$attrs"
/>
<!-- External Image (Extension icon) -->
<img
v-else
:src="imageUrl"
v-bind="$attrs"
@error="handleImageError"
/>
</UTooltip>
</div>
</template>
<script setup lang="ts">
import { convertFileSrc } from '@tauri-apps/api/core'
defineOptions({
inheritAttrs: false,
})
const props = defineProps<{
name: string
tooltip?: string
}>()
// Check if it's a bundled icon (no file extension)
const isBundledIcon = computed(() => {
return !props.name.match(/\.(png|jpg|jpeg|svg|gif|webp|ico)$/i)
})
// Convert file path to Tauri URL for images
const imageUrl = ref('')
const showFallback = ref(false)
// Default fallback icon
const FALLBACK_ICON = 'i-heroicons-puzzle-piece-solid'
watchEffect(() => {
if (!isBundledIcon.value && !showFallback.value) {
// Convert local file path to Tauri asset URL
imageUrl.value = convertFileSrc(props.name)
}
})
const handleImageError = () => {
console.warn(`Failed to load icon: ${props.name}`)
showFallback.value = true
}
// Use fallback icon if image failed to load
const name = computed(() => {
if (showFallback.value) {
return FALLBACK_ICON
}
return props.name
})
</script>
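
`HaexIcon` decides between a bundled iconify name and an on-disk image purely by file extension. The same classification rule in isolation:

```typescript
// Same rule as HaexIcon's isBundledIcon computed: names without a known
// image extension are treated as bundled iconify icons; anything ending
// in an image extension is loaded as a file via convertFileSrc.
function isBundledIcon(name: string): boolean {
  return !/\.(png|jpg|jpeg|svg|gif|webp|ico)$/i.test(name)
}
```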

View File

@@ -1,5 +1,5 @@
 <template>
-  <div class="p-4 mx-auto space-y-6 bg-default/90 backdrop-blur-2xl">
+  <div class="p-4 mx-auto space-y-6 bg-default">
     <div class="space-y-2">
       <h1 class="text-2xl font-bold">{{ t('title') }}</h1>
       <p class="text-sm opacity-70">{{ t('description') }}</p>
@@ -122,6 +122,7 @@ const browseExtensionPathAsync = async () => {
   }
 }

+const windowManagerStore = useWindowManagerStore()

 // Load a dev extension
 const loadDevExtensionAsync = async () => {
   if (!extensionPath.value) return
@@ -140,9 +141,24 @@ const loadDevExtensionAsync = async () => {
     // Reload list
     await loadDevExtensionListAsync()

+    // Get the newly loaded extension info from devExtensions
+    const newlyLoadedExtension = devExtensions.value.find((ext) =>
+      extensionPath.value.includes(ext.name),
+    )
+
     // Reload all extensions in the main extension store so they appear in the launcher
     await loadExtensionsAsync()

+    // Open the newly loaded extension
+    if (newlyLoadedExtension) {
+      await windowManagerStore.openWindowAsync({
+        sourceId: newlyLoadedExtension.id,
+        type: 'extension',
+        icon: newlyLoadedExtension.icon || 'i-heroicons-puzzle-piece-solid',
+        title: newlyLoadedExtension.name,
+      })
+    }
+
     // Clear input
     extensionPath.value = ''
   } catch (error) {

View File

@@ -1,5 +1,5 @@
 <template>
-  <div class="w-full h-full bg-default">
+  <div class="w-full h-full bg-default overflow-scroll">
     <div class="grid grid-cols-2 p-2">
       <div class="p-2">{{ t('language') }}</div>
       <div><UiDropdownLocale @select="onSelectLocaleAsync" /></div>
@@ -33,6 +33,35 @@
         />
       </div>

+      <div class="p-2">{{ t('workspaceBackground.label') }}</div>
+      <div class="flex gap-2">
+        <UiButton
+          :label="t('workspaceBackground.choose')"
+          @click="selectBackgroundImage"
+        />
+        <UiButton
+          v-if="currentWorkspace?.background"
+          :label="t('workspaceBackground.remove.label')"
+          color="error"
+          @click="removeBackgroundImage"
+        />
+      </div>
+
+      <!-- Desktop Grid Settings -->
+      <div
+        class="col-span-2 mt-4 border-t border-gray-200 dark:border-gray-700 pt-4"
+      >
+        <h3 class="text-lg font-semibold mb-4">{{ t('desktopGrid.title') }}</h3>
+      </div>
+
+      <div class="p-2">{{ t('desktopGrid.iconSize.label') }}</div>
+      <div>
+        <USelect
+          v-model="iconSizePreset"
+          :items="iconSizePresetOptions"
+        />
+      </div>
+
       <div class="h-full" />
     </div>
   </div>
@@ -40,6 +69,16 @@
 <script setup lang="ts">
 import type { Locale } from 'vue-i18n'
+import { open } from '@tauri-apps/plugin-dialog'
+import {
+  readFile,
+  writeFile,
+  mkdir,
+  exists,
+  remove,
+} from '@tauri-apps/plugin-fs'
+import { appLocalDataDir } from '@tauri-apps/api/path'
+import { DesktopIconSizePreset } from '~/stores/vault/settings'

 const { t, setLocale } = useI18n()
@@ -77,8 +116,44 @@ const { requestNotificationPermissionAsync } = useNotificationStore()
 const { deviceName } = storeToRefs(useDeviceStore())
 const { updateDeviceNameAsync, readDeviceNameAsync } = useDeviceStore()

+const workspaceStore = useWorkspaceStore()
+const { currentWorkspace } = storeToRefs(workspaceStore)
+const { updateWorkspaceBackgroundAsync } = workspaceStore
+
+const desktopStore = useDesktopStore()
+const { iconSizePreset } = storeToRefs(desktopStore)
+const { syncDesktopIconSizeAsync, updateDesktopIconSizeAsync } = desktopStore
+
+// Icon size preset options
+const iconSizePresetOptions = [
+  {
+    label: t('desktopGrid.iconSize.presets.small'),
+    value: DesktopIconSizePreset.small,
+  },
+  {
+    label: t('desktopGrid.iconSize.presets.medium'),
+    value: DesktopIconSizePreset.medium,
+  },
+  {
+    label: t('desktopGrid.iconSize.presets.large'),
+    value: DesktopIconSizePreset.large,
+  },
+  {
+    label: t('desktopGrid.iconSize.presets.extraLarge'),
+    value: DesktopIconSizePreset.extraLarge,
+  },
+]
+
+// Watch for icon size preset changes and update DB
+watch(iconSizePreset, async (newPreset) => {
+  if (newPreset) {
+    await updateDesktopIconSizeAsync(newPreset)
+  }
+})
+
 onMounted(async () => {
   await readDeviceNameAsync()
+  await syncDesktopIconSizeAsync()
 })

 const onUpdateDeviceNameAsync = async () => {
@ -92,6 +167,152 @@ const onUpdateDeviceNameAsync = async () => {
add({ description: t('deviceName.update.error'), color: 'error' }) add({ description: t('deviceName.update.error'), color: 'error' })
} }
} }
const selectBackgroundImage = async () => {
if (!currentWorkspace.value) return
try {
const selected = await open({
multiple: false,
filters: [
{
name: 'Images',
extensions: ['png', 'jpg', 'jpeg', 'webp'],
},
],
})
if (!selected || typeof selected !== 'string') {
return
}
// Read the selected file (works with Android photo picker URIs)
let fileData: Uint8Array
try {
fileData = await readFile(selected)
} catch (readError) {
add({
description: `Fehler beim Lesen: ${readError instanceof Error ? readError.message : String(readError)}`,
color: 'error',
})
return
}
// Detect file type from file signature
let ext = 'jpg' // default
if (fileData.length > 4) {
// PNG signature: 89 50 4E 47
if (
fileData[0] === 0x89 &&
fileData[1] === 0x50 &&
fileData[2] === 0x4e &&
fileData[3] === 0x47
) {
ext = 'png'
}
// JPEG signature: FF D8 FF
else if (
fileData[0] === 0xff &&
fileData[1] === 0xd8 &&
fileData[2] === 0xff
) {
ext = 'jpg'
}
// WebP signature: RIFF xxxx WEBP
else if (
fileData[0] === 0x52 &&
fileData[1] === 0x49 &&
fileData[2] === 0x46 &&
fileData[3] === 0x46
) {
ext = 'webp'
}
}
// Get app local data directory
const appDataPath = await appLocalDataDir()
// Construct target path manually to avoid path joining issues
const fileName = `workspace-${currentWorkspace.value.id}-background.${ext}`
const targetPath = `${appDataPath}/files/${fileName}`
// Create parent directory if it doesn't exist
const parentDir = `${appDataPath}/files`
try {
if (!(await exists(parentDir))) {
await mkdir(parentDir, { recursive: true })
}
} catch (mkdirError) {
add({
description: `Error creating directory: ${mkdirError instanceof Error ? mkdirError.message : String(mkdirError)}`,
color: 'error',
})
return
}
// Write file to app data directory
try {
await writeFile(targetPath, fileData)
} catch (writeError) {
add({
description: `Error writing file: ${writeError instanceof Error ? writeError.message : String(writeError)}`,
color: 'error',
})
return
}
// Store the absolute file path in database
try {
await updateWorkspaceBackgroundAsync(
currentWorkspace.value.id,
targetPath,
)
add({
description: t('workspaceBackground.update.success'),
color: 'success',
})
} catch (dbError) {
add({
description: `Error updating database: ${dbError instanceof Error ? dbError.message : String(dbError)}`,
color: 'error',
})
}
} catch (error) {
console.error('Error selecting background:', error)
add({
description: `${t('workspaceBackground.update.error')}: ${error instanceof Error ? error.message : String(error)}`,
color: 'error',
})
}
}
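The signature checks above can be factored into a small pure helper (hypothetical name `detectImageExt`, not part of this diff), which keeps the magic-byte logic testable in isolation:

```typescript
// Hypothetical helper mirroring the signature checks in selectBackgroundImage.
// Infers the image extension from the file's magic bytes, defaulting to 'jpg'.
const detectImageExt = (data: Uint8Array): 'png' | 'jpg' | 'webp' => {
  if (data.length > 4) {
    // PNG signature: 89 50 4E 47
    if (data[0] === 0x89 && data[1] === 0x50 && data[2] === 0x4e && data[3] === 0x47)
      return 'png'
    // JPEG signature: FF D8 FF
    if (data[0] === 0xff && data[1] === 0xd8 && data[2] === 0xff) return 'jpg'
    // WebP: RIFF container, 'RIFF' at offset 0 ('WEBP' follows at offset 8)
    if (data[0] === 0x52 && data[1] === 0x49 && data[2] === 0x46 && data[3] === 0x46)
      return 'webp'
  }
  return 'jpg' // fallback, as in the original code
}
```

Note the `data.length > 4` guard comes straight from the diff; it requires at least five bytes, so a bare four-byte header falls through to the fallback.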
const removeBackgroundImage = async () => {
if (!currentWorkspace.value) return
try {
// Delete the background file if it exists
if (currentWorkspace.value.background) {
try {
// The background field contains the absolute file path
if (await exists(currentWorkspace.value.background)) {
await remove(currentWorkspace.value.background)
}
} catch (err) {
console.warn('Could not delete background file:', err)
// Continue anyway to clear the database entry
}
}
await updateWorkspaceBackgroundAsync(currentWorkspace.value.id, null)
add({
description: t('workspaceBackground.remove.success'),
color: 'success',
})
} catch (error) {
console.error('Error removing background:', error)
add({ description: t('workspaceBackground.remove.error'), color: 'error' })
}
}
</script>
<i18n lang="yaml">
@ -112,6 +333,32 @@ de:
update:
success: Gerätename wurde erfolgreich aktualisiert
error: Gerätename konnte nicht aktualisiert werden
workspaceBackground:
label: Workspace-Hintergrund
choose: Bild auswählen
update:
success: Hintergrund erfolgreich aktualisiert
error: Fehler beim Aktualisieren des Hintergrunds
remove:
label: Hintergrund entfernen
success: Hintergrund erfolgreich entfernt
error: Fehler beim Entfernen des Hintergrunds
desktopGrid:
title: Desktop-Raster
columns:
label: Spalten
unit: Spalten
rows:
label: Zeilen
unit: Zeilen
iconSize:
label: Icon-Größe
presets:
small: Klein
medium: Mittel
large: Groß
extraLarge: Sehr groß
unit: px
en:
language: Language
design: Design
@ -129,4 +376,30 @@ en:
update:
success: Device name has been successfully updated
error: Device name could not be updated
workspaceBackground:
label: Workspace Background
choose: Choose Image
update:
success: Background successfully updated
error: Error updating background
remove:
label: Remove Background
success: Background successfully removed
error: Error removing background
desktopGrid:
title: Desktop Grid
columns:
label: Columns
unit: columns
rows:
label: Rows
unit: rows
iconSize:
label: Icon Size
presets:
small: Small
medium: Medium
large: Large
extraLarge: Extra Large
unit: px
</i18n>

View File

@ -16,6 +16,7 @@
: 'border border-gray-200 dark:border-gray-700',
]"
@mousedown="handleActivate"
@contextmenu.stop.prevent
>
<!-- Window Titlebar -->
<div
@ -25,10 +26,10 @@
>
<!-- Left: Icon -->
<div class="flex items-center gap-2">
-<img
+<HaexIcon
v-if="icon"
-:src="icon"
+:name="icon"
-:alt="title"
+:tooltip="title"
class="w-5 h-5 object-contain shrink-0"
/>
</div>
@ -50,6 +51,7 @@
/>
<HaexWindowButton
v-if="!isSmallScreen"
:is-maximized
variant="maximize"
@click.stop="handleMaximize"
@ -74,13 +76,14 @@
<!-- Resize Handles -->
<HaexWindowResizeHandles
-:disabled="isMaximized"
+:disabled="isMaximized || isSmallScreen"
@resize-start="handleResizeStart"
/>
</div>
</template>
<script setup lang="ts">
import { getAvailableContentHeight } from '~/utils/viewport'
const props = defineProps<{
id: string
title: string
@ -114,12 +117,16 @@ const height = defineModel<number>('height', { default: 600 })
const windowEl = useTemplateRef('windowEl')
const titlebarEl = useTemplateRef('titlebarEl')
const uiStore = useUiStore()
const { isSmallScreen } = storeToRefs(uiStore)
// Inject viewport size from parent desktop
const viewportSize = inject<{
width: Ref<number>
height: Ref<number>
}>('viewportSize')
-const isMaximized = ref(false) // Don't start maximized
+// Start maximized on small screens
const isMaximized = ref(isSmallScreen.value)
// Store initial position/size for restore
const preMaximizeState = ref({
@ -151,7 +158,8 @@ const isResizingOrDragging = computed(
// Setup drag with useDrag composable (supports mouse + touch)
useDrag(
({ movement: [mx, my], first, last }) => {
-if (isMaximized.value) return
+// Disable dragging on small screens (always fullscreen)
if (isMaximized.value || isSmallScreen.value) return
if (first) {
// Drag started - save initial position
@ -322,31 +330,11 @@ const handleMaximize = () => {
const bounds = getViewportBounds()
if (bounds && bounds.width > 0 && bounds.height > 0) {
// Get safe-area-insets from CSS variables for debug
const safeAreaTop = parseFloat(
getComputedStyle(document.documentElement).getPropertyValue(
'--safe-area-inset-top',
) || '0',
)
const safeAreaBottom = parseFloat(
getComputedStyle(document.documentElement).getPropertyValue(
'--safe-area-inset-bottom',
) || '0',
)
// Desktop container uses 'absolute inset-0' which stretches over full viewport
// bounds.height = full viewport height (includes header area + safe-areas)
// We need to calculate available space properly
// Get header height from UI store (measured reactively in layout)
const uiStore = useUiStore()
const headerHeight = uiStore.headerHeight
x.value = 0
-y.value = 0 // Start below header and status bar
+y.value = 0
width.value = bounds.width
-// Height: viewport - header - both safe-areas
-height.value = bounds.height - headerHeight - safeAreaTop - safeAreaBottom
+// Use helper function to calculate correct height with safe areas
+height.value = getAvailableContentHeight()
isMaximized.value = true
}
}
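The actual `getAvailableContentHeight` util lives in `~/utils/viewport` and is not shown in this diff; reconstructed from the inline code it replaces, its core arithmetic is just viewport height minus header and both safe-area insets (a sketch under that assumption, with the values passed in rather than read from CSS variables and the UI store):

```typescript
// Sketch of the calculation the removed inline code performed.
// The real util presumably reads viewportHeight/headerHeight/insets itself.
const availableContentHeight = (
  viewportHeight: number,
  headerHeight: number,
  safeAreaTop: number,
  safeAreaBottom: number,
): number => viewportHeight - headerHeight - safeAreaTop - safeAreaBottom
```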

View File

@ -1,5 +1,5 @@
<template>
-<UDrawer
+<UiDrawer
v-model:open="localShowWindowOverview"
direction="bottom"
:title="t('modal.title')"
@ -70,7 +70,7 @@
</div>
</div>
</template>
-</UDrawer>
+</UiDrawer>
</template>
<script setup lang="ts">

View File

@ -25,17 +25,70 @@
/>
</div>
</template>
<!-- Window Icons Preview -->
<div
v-if="workspaceWindows.length > 0"
class="flex flex-wrap gap-2 items-center"
>
<!-- Show first 8 window icons -->
<HaexIcon
v-for="window in visibleWindows"
:key="window.id"
:name="window.icon || 'i-heroicons-window'"
:tooltip="window.title"
class="size-6 opacity-70"
/>
<!-- Show remaining count badge if more than 8 windows -->
<UBadge
v-if="remainingCount > 0"
color="neutral"
variant="subtle"
size="sm"
>
+{{ remainingCount }}
</UBadge>
</div>
<!-- Empty state when no windows -->
<div
v-else
class="text-sm text-gray-400 dark:text-gray-600 italic"
>
{{ t('noWindows') }}
</div>
</UCard>
</template>
<script setup lang="ts">
const props = defineProps<{ workspace: IWorkspace }>()
const { t } = useI18n()
const workspaceStore = useWorkspaceStore()
const windowManager = useWindowManagerStore()
const { currentWorkspace } = storeToRefs(workspaceStore)
// Get all windows for this workspace
const workspaceWindows = computed(() => {
return windowManager.windows.filter(
(window) => window.workspaceId === props.workspace.id,
)
})
// Limit to 8 visible icons
const MAX_VISIBLE_ICONS = 8
const visibleWindows = computed(() => {
return workspaceWindows.value.slice(0, MAX_VISIBLE_ICONS)
})
// Count remaining windows
const remainingCount = computed(() => {
const remaining = workspaceWindows.value.length - MAX_VISIBLE_ICONS
return remaining > 0 ? remaining : 0
})
const cardEl = useTemplateRef('cardEl')
const isDragOver = ref(false)
@ -96,3 +149,10 @@ watch(
},
)
</script>
<i18n lang="yaml">
de:
noWindows: Keine Fenster geöffnet
en:
noWindows: No windows open
</i18n>

View File

@ -0,0 +1,54 @@
<template>
<UiDrawer
v-model:open="isOverviewMode"
direction="left"
:overlay="false"
:modal="false"
title="Workspaces"
description="Workspaces"
>
<template #content>
<div class="pl-8 pr-4 overflow-y-auto py-8">
<!-- Workspace Cards -->
<div class="flex flex-col gap-3">
<HaexWorkspaceCard
v-for="workspace in workspaces"
:key="workspace.id"
:workspace
/>
</div>
<!-- Add New Workspace Button -->
<UButton
block
variant="outline"
class="mt-6"
icon="i-heroicons-plus"
:label="t('add')"
@click="handleAddWorkspaceAsync"
/>
</div>
</template>
</UiDrawer>
</template>
<script setup lang="ts">
const { t } = useI18n()
const workspaceStore = useWorkspaceStore()
const { workspaces, isOverviewMode } = storeToRefs(workspaceStore)
const handleAddWorkspaceAsync = async () => {
const workspace = await workspaceStore.addWorkspaceAsync()
nextTick(() => {
workspaceStore.slideToWorkspace(workspace?.id)
})
}
</script>
<i18n lang="yaml">
de:
add: Workspace hinzufügen
en:
add: Add Workspace
</i18n>

View File

@ -0,0 +1,32 @@
<template>
<UDrawer
v-bind="$attrs"
:ui="{
content:
'pb-[env(safe-area-inset-bottom)] pt-[env(safe-area-inset-top)] ',
...(ui || {}),
}"
>
<template
v-for="(_, name) in $slots"
#[name]="slotData"
>
<slot
:name="name"
v-bind="slotData"
/>
</template>
</UDrawer>
</template>
<script setup lang="ts">
import type { DrawerProps } from '@nuxt/ui'
/**
* Wrapper around UDrawer that automatically applies safe area insets for mobile devices.
* Passes through all props and slots to UDrawer.
*/
const props = defineProps</* @vue-ignore */ DrawerProps>()
const { ui } = toRefs(props)
</script>

View File

@ -7,6 +7,7 @@
...buttonProps,
...$attrs,
}"
size="lg"
@click="$emit('click', $event)"
>
<template

View File

@ -5,7 +5,7 @@
:readonly="props.readOnly"
:leading-icon="props.leadingIcon"
:ui="{ base: 'peer' }"
-:size="isSmallScreen ? 'lg' : 'md'"
+size="lg"
@change="(e) => $emit('change', e)"
@blur="(e) => $emit('blur', e)"
@keyup="(e: KeyboardEvent) => $emit('keyup', e)"
@ -83,8 +83,6 @@ const filteredSlots = computed(() => {
Object.entries(useSlots()).filter(([name]) => name !== 'trailing'),
)
})
-const { isSmallScreen } = storeToRefs(useUiStore())
</script>
<i18n lang="yaml">

View File

@ -1,38 +1,29 @@
// composables/extensionMessageHandler.ts
-import { invoke } from '@tauri-apps/api/core'
import type { IHaexHubExtension } from '~/types/haexhub'
import {
EXTENSION_PROTOCOL_NAME,
EXTENSION_PROTOCOL_PREFIX,
} from '~/config/constants'
-import type { Platform } from '@tauri-apps/plugin-os'
-interface ExtensionRequest {
-id: string
-method: string
-params: Record<string, unknown>
-timestamp: number
-}
+import {
+handleDatabaseMethodAsync,
+handleFilesystemMethodAsync,
+handleHttpMethodAsync,
+handlePermissionsMethodAsync,
+handleContextMethodAsync,
+handleStorageMethodAsync,
+setContextGetters,
+type ExtensionRequest,
+type ExtensionInstance,
+} from './handlers'
// Global handler - registered only once
let globalHandlerRegistered = false
interface ExtensionInstance {
extension: IHaexHubExtension
windowId: string
}
const iframeRegistry = new Map<HTMLIFrameElement, ExtensionInstance>()
// Map event.source (WindowProxy) to extension instance for sandbox-compatible matching
const sourceRegistry = new Map<Window, ExtensionInstance>()
// Reverse map: window ID to Window for broadcasting (supports multiple windows per extension)
const windowIdToWindowMap = new Map<string, Window>()
// Store context values that need to be accessed outside setup
let contextGetters: {
getTheme: () => string
getLocale: () => string
getPlatform: () => Platform | undefined
} | null = null
const registerGlobalMessageHandler = () => {
if (globalHandlerRegistered) return
@ -227,13 +218,11 @@ export const useExtensionMessageHandler = (
const { locale } = useI18n()
const { platform } = useDeviceStore()
// Store getters for use outside setup context
-if (!contextGetters) {
-contextGetters = {
-getTheme: () => currentTheme.value?.value || 'system',
-getLocale: () => locale.value,
-getPlatform: () => platform,
-}
-}
+setContextGetters({
+getTheme: () => currentTheme.value?.value || 'system',
+getLocale: () => locale.value,
+getPlatform: () => platform,
+})
// Register global handler on first call
registerGlobalMessageHandler()
@ -275,12 +264,7 @@ export const registerExtensionIFrame = (
// Ensure the global handler is registered
registerGlobalMessageHandler()
-// Warn if context getters were not initialized
+// Note: Context getters should be initialized via useExtensionMessageHandler first
if (!contextGetters) {
console.warn(
'Context getters not initialized. Make sure useExtensionMessageHandler was called in setup context first.',
)
}
iframeRegistry.set(iframe, { extension, windowId })
}
@ -338,201 +322,21 @@ export const broadcastContextToAllExtensions = (context: {
timestamp: Date.now(),
}
-console.log('[ExtensionHandler] Broadcasting context to all extensions:', context)
+console.log(
+'[ExtensionHandler] Broadcasting context to all extensions:',
+context,
+)
// Send to all registered extension windows
for (const [_, instance] of iframeRegistry.entries()) {
const win = windowIdToWindowMap.get(instance.windowId)
if (win) {
-console.log('[ExtensionHandler] Sending context to:', instance.extension.name, instance.windowId)
+console.log(
+'[ExtensionHandler] Sending context to:',
+instance.extension.name,
+instance.windowId,
+)
win.postMessage(message, '*')
}
}
}
// ==========================================
// Database Methods
// ==========================================
async function handleDatabaseMethodAsync(
request: ExtensionRequest,
extension: IHaexHubExtension, // direct type
) {
const params = request.params as {
query?: string
params?: unknown[]
}
switch (request.method) {
case 'haextension.db.query': {
const rows = await invoke<unknown[]>('extension_sql_select', {
sql: params.query || '',
params: params.params || [],
publicKey: extension.publicKey,
name: extension.name,
})
return {
rows,
rowsAffected: 0,
lastInsertId: undefined,
}
}
case 'haextension.db.execute': {
const rows = await invoke<unknown[]>('extension_sql_execute', {
sql: params.query || '',
params: params.params || [],
publicKey: extension.publicKey,
name: extension.name,
})
return {
rows,
rowsAffected: 1,
lastInsertId: undefined,
}
}
case 'haextension.db.transaction': {
const statements =
(request.params as { statements?: string[] }).statements || []
for (const stmt of statements) {
await invoke('extension_sql_execute', {
sql: stmt,
params: [],
publicKey: extension.publicKey,
name: extension.name,
})
}
return { success: true }
}
default:
throw new Error(`Unknown database method: ${request.method}`)
}
}
// ==========================================
// Filesystem Methods (TODO)
// ==========================================
async function handleFilesystemMethodAsync(
request: ExtensionRequest,
extension: IHaexHubExtension,
) {
if (!request || !extension) return
// TODO: Implement filesystem commands in the backend
throw new Error('Filesystem methods not yet implemented')
}
// ==========================================
// HTTP Methods (TODO)
// ==========================================
async function handleHttpMethodAsync(
request: ExtensionRequest,
extension: IHaexHubExtension,
) {
if (!extension || !request) {
throw new Error('Extension not found')
}
// TODO: Implement HTTP commands in the backend
throw new Error('HTTP methods not yet implemented')
}
// ==========================================
// Permission Methods (TODO)
// ==========================================
async function handlePermissionsMethodAsync(
request: ExtensionRequest,
extension: IHaexHubExtension,
) {
if (!extension || !request) {
throw new Error('Extension not found')
}
// TODO: Implement permission request UI
throw new Error('Permission methods not yet implemented')
}
// ==========================================
// Context Methods
// ==========================================
async function handleContextMethodAsync(request: ExtensionRequest) {
switch (request.method) {
case 'haextension.context.get':
if (!contextGetters) {
throw new Error(
'Context not initialized. Make sure useExtensionMessageHandler is called in a component.',
)
}
return {
theme: contextGetters.getTheme(),
locale: contextGetters.getLocale(),
platform: contextGetters.getPlatform(),
}
default:
throw new Error(`Unknown context method: ${request.method}`)
}
}
// ==========================================
// Storage Methods
// ==========================================
async function handleStorageMethodAsync(
request: ExtensionRequest,
instance: ExtensionInstance,
) {
// Storage is now per-window, not per-extension
const storageKey = `ext_${instance.extension.id}_${instance.windowId}_`
console.log(
`[HaexHub Storage] ${request.method} for window ${instance.windowId}`,
)
switch (request.method) {
case 'haextension.storage.getItem': {
const key = request.params.key as string
return localStorage.getItem(storageKey + key)
}
case 'haextension.storage.setItem': {
const key = request.params.key as string
const value = request.params.value as string
localStorage.setItem(storageKey + key, value)
return null
}
case 'haextension.storage.removeItem': {
const key = request.params.key as string
localStorage.removeItem(storageKey + key)
return null
}
case 'haextension.storage.clear': {
// Remove only instance-specific keys
const keys = Object.keys(localStorage).filter((k) =>
k.startsWith(storageKey),
)
keys.forEach((k) => localStorage.removeItem(k))
return null
}
case 'haextension.storage.keys': {
// Return only instance-specific keys (without prefix)
const keys = Object.keys(localStorage)
.filter((k) => k.startsWith(storageKey))
.map((k) => k.substring(storageKey.length))
return keys
}
default:
throw new Error(`Unknown storage method: ${request.method}`)
}
}

View File

@ -0,0 +1,36 @@
import type { Platform } from '@tauri-apps/plugin-os'
import type { ExtensionRequest } from './types'
// Context getters are set from the main handler during initialization
let contextGetters: {
getTheme: () => string
getLocale: () => string
getPlatform: () => Platform | undefined
} | null = null
export function setContextGetters(getters: {
getTheme: () => string
getLocale: () => string
getPlatform: () => Platform | undefined
}) {
contextGetters = getters
}
export async function handleContextMethodAsync(request: ExtensionRequest) {
switch (request.method) {
case 'haextension.context.get':
if (!contextGetters) {
throw new Error(
'Context not initialized. Make sure useExtensionMessageHandler is called in a component.',
)
}
return {
theme: contextGetters.getTheme(),
locale: contextGetters.getLocale(),
platform: contextGetters.getPlatform(),
}
default:
throw new Error(`Unknown context method: ${request.method}`)
}
}
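Because `contextGetters` holds plain closures rather than Vue refs, the module can be read outside any component's setup context. A standalone sketch of the same getter-indirection pattern (hypothetical names, simplified to two getters):

```typescript
// Closures are registered once from setup context, then read lazily later -
// the registered values stay current because the closures capture live state.
type Getters = { getTheme: () => string; getLocale: () => string }

let registered: Getters | null = null

const register = (g: Getters): void => {
  registered = g
}

const readContext = (): { theme: string; locale: string } => {
  if (!registered) throw new Error('Context not initialized')
  return { theme: registered.getTheme(), locale: registered.getLocale() }
}
```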

View File

@ -0,0 +1,84 @@
import { invoke } from '@tauri-apps/api/core'
import type { IHaexHubExtension } from '~/types/haexhub'
import type { ExtensionRequest } from './types'
export async function handleDatabaseMethodAsync(
request: ExtensionRequest,
extension: IHaexHubExtension,
) {
const params = request.params as {
query?: string
params?: unknown[]
}
switch (request.method) {
case 'haextension.db.query': {
try {
const rows = await invoke<unknown[]>('extension_sql_select', {
sql: params.query || '',
params: params.params || [],
publicKey: extension.publicKey,
name: extension.name,
})
return {
rows,
rowsAffected: 0,
lastInsertId: undefined,
}
} catch (error) {
// If the error indicates a non-SELECT statement (INSERT/UPDATE/DELETE with RETURNING),
// automatically retry with execute
if (error instanceof Error && error.message.includes('Only SELECT statements are allowed')) {
const rows = await invoke<unknown[]>('extension_sql_execute', {
sql: params.query || '',
params: params.params || [],
publicKey: extension.publicKey,
name: extension.name,
})
return {
rows,
rowsAffected: rows.length,
lastInsertId: undefined,
}
}
throw error
}
}
case 'haextension.db.execute': {
const rows = await invoke<unknown[]>('extension_sql_execute', {
sql: params.query || '',
params: params.params || [],
publicKey: extension.publicKey,
name: extension.name,
})
return {
rows,
rowsAffected: 1,
lastInsertId: undefined,
}
}
case 'haextension.db.transaction': {
const statements =
(request.params as { statements?: string[] }).statements || []
for (const stmt of statements) {
await invoke('extension_sql_execute', {
sql: stmt,
params: [],
publicKey: extension.publicKey,
name: extension.name,
})
}
return { success: true }
}
default:
throw new Error(`Unknown database method: ${request.method}`)
}
}
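The fallback control flow above can be isolated from Tauri by injecting the invoke function; a sketch (hypothetical wrapper, reusing the same error string the backend check matches on):

```typescript
type Invoke = (cmd: string, args: { sql: string }) => Promise<unknown[]>

// Try the read-only path first; if the backend rejects a non-SELECT statement
// (e.g. INSERT ... RETURNING), retry through the execute path instead.
const queryWithReturningFallback = async (
  invoke: Invoke,
  sql: string,
): Promise<unknown[]> => {
  try {
    return await invoke('extension_sql_select', { sql })
  } catch (error) {
    if (
      error instanceof Error &&
      error.message.includes('Only SELECT statements are allowed')
    ) {
      return await invoke('extension_sql_execute', { sql })
    }
    throw error // unrelated errors still propagate
  }
}
```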

View File

@ -0,0 +1,83 @@
import { save } from '@tauri-apps/plugin-dialog'
import { writeFile } from '@tauri-apps/plugin-fs'
import { openPath } from '@tauri-apps/plugin-opener'
import { tempDir, join } from '@tauri-apps/api/path'
import type { IHaexHubExtension } from '~/types/haexhub'
import type { ExtensionRequest } from './types'
export async function handleFilesystemMethodAsync(
request: ExtensionRequest,
extension: IHaexHubExtension,
) {
if (!request || !extension) return
switch (request.method) {
case 'haextension.fs.saveFile': {
const params = request.params as {
data: number[]
defaultPath?: string
title?: string
filters?: Array<{ name: string; extensions: string[] }>
}
// Convert number array back to Uint8Array
const data = new Uint8Array(params.data)
// Open save dialog
const filePath = await save({
defaultPath: params.defaultPath,
title: params.title || 'Save File',
filters: params.filters,
})
// User cancelled
if (!filePath) {
return null
}
// Write file
await writeFile(filePath, data)
return {
path: filePath,
success: true,
}
}
case 'haextension.fs.openFile': {
const params = request.params as {
data: number[]
fileName: string
mimeType?: string
}
try {
// Convert number array back to Uint8Array
const data = new Uint8Array(params.data)
// Get temp directory and create file path
const tempDirPath = await tempDir()
const tempFilePath = await join(tempDirPath, params.fileName)
// Write file to temp directory
await writeFile(tempFilePath, data)
// Open file with system's default viewer
await openPath(tempFilePath)
return {
success: true,
}
} catch (error) {
console.error('[Filesystem] Error opening file:', error)
return {
success: false,
}
}
}
default:
throw new Error(`Unknown filesystem method: ${request.method}`)
}
}
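Both handlers above receive the file bytes as a plain `number[]` and rebuild a `Uint8Array` before writing - presumably because the request params are JSON-serialized on the way in, which flattens typed arrays. The round-trip is trivial but worth pinning down:

```typescript
// Wire conversion matching the handlers above (assumption: params travel
// as JSON, so Uint8Array payloads arrive as plain number arrays).
const toWire = (data: Uint8Array): number[] => Array.from(data)
const fromWire = (wire: number[]): Uint8Array => new Uint8Array(wire)
```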

View File

@ -0,0 +1,14 @@
import type { IHaexHubExtension } from '~/types/haexhub'
import type { ExtensionRequest } from './types'
export async function handleHttpMethodAsync(
request: ExtensionRequest,
extension: IHaexHubExtension,
) {
if (!extension || !request) {
throw new Error('Extension not found')
}
// TODO: Implement HTTP commands in the backend
throw new Error('HTTP methods not yet implemented')
}

View File

@ -0,0 +1,10 @@
// Export all handler functions
export { handleDatabaseMethodAsync } from './database'
export { handleFilesystemMethodAsync } from './filesystem'
export { handleHttpMethodAsync } from './http'
export { handlePermissionsMethodAsync } from './permissions'
export { handleContextMethodAsync, setContextGetters } from './context'
export { handleStorageMethodAsync } from './storage'
// Export shared types
export type { ExtensionRequest, ExtensionInstance } from './types'

View File

@ -0,0 +1,14 @@
import type { IHaexHubExtension } from '~/types/haexhub'
import type { ExtensionRequest } from './types'
export async function handlePermissionsMethodAsync(
request: ExtensionRequest,
extension: IHaexHubExtension,
) {
if (!extension || !request) {
throw new Error('Extension not found')
}
// TODO: Implement permission request UI
throw new Error('Permission methods not yet implemented')
}

View File

@ -0,0 +1,52 @@
import type { ExtensionRequest, ExtensionInstance } from './types'
export async function handleStorageMethodAsync(
request: ExtensionRequest,
instance: ExtensionInstance,
) {
// Storage is now per-window, not per-extension
const storageKey = `ext_${instance.extension.id}_${instance.windowId}_`
console.log(
`[HaexHub Storage] ${request.method} for window ${instance.windowId}`,
)
switch (request.method) {
case 'haextension.storage.getItem': {
const key = request.params.key as string
return localStorage.getItem(storageKey + key)
}
case 'haextension.storage.setItem': {
const key = request.params.key as string
const value = request.params.value as string
localStorage.setItem(storageKey + key, value)
return null
}
case 'haextension.storage.removeItem': {
const key = request.params.key as string
localStorage.removeItem(storageKey + key)
return null
}
case 'haextension.storage.clear': {
// Remove only instance-specific keys
const keys = Object.keys(localStorage).filter((k) =>
k.startsWith(storageKey),
)
keys.forEach((k) => localStorage.removeItem(k))
return null
}
case 'haextension.storage.keys': {
// Return only instance-specific keys (without prefix)
const keys = Object.keys(localStorage)
.filter((k) => k.startsWith(storageKey))
.map((k) => k.substring(storageKey.length))
return keys
}
default:
throw new Error(`Unknown storage method: ${request.method}`)
}
}
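The per-window isolation above comes entirely from the key prefix: two windows of the same extension get disjoint localStorage namespaces. The scheme, extracted into standalone helpers (hypothetical names):

```typescript
// Key layout used by handleStorageMethodAsync: ext_<extensionId>_<windowId>_<key>
const storagePrefix = (extensionId: string, windowId: string): string =>
  `ext_${extensionId}_${windowId}_`

const scopedKey = (extensionId: string, windowId: string, key: string): string =>
  storagePrefix(extensionId, windowId) + key

// Strip the prefix again, as haextension.storage.keys does
const unscope = (prefix: string, fullKey: string): string =>
  fullKey.substring(prefix.length)
```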

View File

@ -0,0 +1,14 @@
// Shared types for extension message handlers
import type { IHaexHubExtension } from '~/types/haexhub'
export interface ExtensionRequest {
id: string
method: string
params: Record<string, unknown>
timestamp: number
}
export interface ExtensionInstance {
extension: IHaexHubExtension
windowId: string
}

View File

@ -1,5 +1,5 @@
import { integer, sqliteTable, text, index } from 'drizzle-orm/sqlite-core'
-import tableNames from '~/database/tableNames.json'
+import tableNames from '@/database/tableNames.json'
export const haexCrdtLogs = sqliteTable(
tableNames.haex.crdt.logs.name,
@ -48,3 +48,27 @@ export const haexCrdtConfigs = sqliteTable(tableNames.haex.crdt.configs.name, {
key: text().primaryKey(),
value: text(),
})
/**
* Sync Status Table (WITHOUT CRDT - local-only metadata)
* Tracks sync progress for each backend
*/
export const haexSyncStatus = sqliteTable(
'haex_sync_status',
{
id: text('id')
.$defaultFn(() => crypto.randomUUID())
.primaryKey(),
backendId: text('backend_id').notNull(),
// Last server sequence number received from pull
lastPullSequence: integer('last_pull_sequence'),
// Last HLC timestamp pushed to server
lastPushHlcTimestamp: text('last_push_hlc_timestamp'),
// Last successful sync timestamp
lastSyncAt: text('last_sync_at'),
// Sync error message if any
error: text('error'),
},
)
export type InsertHaexSyncStatus = typeof haexSyncStatus.$inferInsert
export type SelectHaexSyncStatus = typeof haexSyncStatus.$inferSelect

View File

@ -8,7 +8,7 @@ import {
type AnySQLiteColumn,
type SQLiteColumnBuilderBase,
} from 'drizzle-orm/sqlite-core'
-import tableNames from '~/database/tableNames.json'
+import tableNames from '@/database/tableNames.json'
const crdtColumnNames = {
haexTimestamp: 'haex_timestamp',
@ -24,17 +24,42 @@ export const withCrdtColumns = <
haexTimestamp: text(crdtColumnNames.haexTimestamp),
})
export const haexDevices = sqliteTable(
tableNames.haex.devices.name,
withCrdtColumns({
id: text(tableNames.haex.devices.columns.id)
.$defaultFn(() => crypto.randomUUID())
.primaryKey(),
deviceId: text(tableNames.haex.devices.columns.deviceId)
.notNull()
.unique(),
name: text(tableNames.haex.devices.columns.name).notNull(),
createdAt: text(tableNames.haex.devices.columns.createdAt).default(
sql`(CURRENT_TIMESTAMP)`,
),
updatedAt: integer(tableNames.haex.devices.columns.updatedAt, {
mode: 'timestamp',
}).$onUpdate(() => new Date()),
}),
)
export type InsertHaexDevices = typeof haexDevices.$inferInsert
export type SelectHaexDevices = typeof haexDevices.$inferSelect
export const haexSettings = sqliteTable(
tableNames.haex.settings.name,
withCrdtColumns({
-id: text()
+id: text(tableNames.haex.settings.columns.id)
.$defaultFn(() => crypto.randomUUID())
.primaryKey(),
-key: text(),
-type: text(),
-value: text(),
+deviceId: text(tableNames.haex.settings.columns.deviceId).references(
+(): AnySQLiteColumn => haexDevices.id,
+{ onDelete: 'cascade' },
+),
+key: text(tableNames.haex.settings.columns.key),
+type: text(tableNames.haex.settings.columns.type),
+value: text(tableNames.haex.settings.columns.value),
}),
-(table) => [unique().on(table.key, table.type, table.value)],
+(table) => [unique().on(table.deviceId, table.key, table.type)],
)
export type InsertHaexSettings = typeof haexSettings.$inferInsert export type InsertHaexSettings = typeof haexSettings.$inferInsert
export type SelectHaexSettings = typeof haexSettings.$inferSelect export type SelectHaexSettings = typeof haexSettings.$inferSelect
@@ -137,6 +162,7 @@ export const haexWorkspaces = sqliteTable(
     position: integer(tableNames.haex.workspaces.columns.position)
       .notNull()
       .default(0),
+    background: text(),
   }),
   (table) => [unique().on(table.position)],
 )
@@ -179,3 +205,30 @@ export const haexDesktopItems = sqliteTable(
 )
 export type InsertHaexDesktopItems = typeof haexDesktopItems.$inferInsert
 export type SelectHaexDesktopItems = typeof haexDesktopItems.$inferSelect
+
+export const haexSyncBackends = sqliteTable(
+  tableNames.haex.sync_backends.name,
+  withCrdtColumns({
+    id: text(tableNames.haex.sync_backends.columns.id)
+      .$defaultFn(() => crypto.randomUUID())
+      .primaryKey(),
+    name: text(tableNames.haex.sync_backends.columns.name).notNull(),
+    serverUrl: text(tableNames.haex.sync_backends.columns.serverUrl).notNull(),
+    enabled: integer(tableNames.haex.sync_backends.columns.enabled, {
+      mode: 'boolean',
+    })
+      .default(true)
+      .notNull(),
+    priority: integer(tableNames.haex.sync_backends.columns.priority)
+      .default(0)
+      .notNull(),
+    createdAt: text(tableNames.haex.sync_backends.columns.createdAt).default(
+      sql`(CURRENT_TIMESTAMP)`,
+    ),
+    updatedAt: integer(tableNames.haex.sync_backends.columns.updatedAt, {
+      mode: 'timestamp',
+    }).$onUpdate(() => new Date()),
+  }),
+)
+export type InsertHaexSyncBackends = typeof haexSyncBackends.$inferInsert
+export type SelectHaexSyncBackends = typeof haexSyncBackends.$inferSelect

View File

@@ -4,6 +4,7 @@
     "name": "haex_settings",
     "columns": {
       "id": "id",
+      "deviceId": "device_id",
       "key": "key",
       "type": "type",
       "value": "value",
@@ -89,6 +90,32 @@
       "haexTimestamp": "haex_timestamp"
     }
   },
+  "devices": {
+    "name": "haex_devices",
+    "columns": {
+      "id": "id",
+      "deviceId": "device_id",
+      "name": "name",
+      "createdAt": "created_at",
+      "updatedAt": "updated_at",
+      "haexTimestamp": "haex_timestamp"
+    }
+  },
+  "sync_backends": {
+    "name": "haex_sync_backends",
+    "columns": {
+      "id": "id",
+      "name": "name",
+      "serverUrl": "server_url",
+      "enabled": "enabled",
+      "priority": "priority",
+      "createdAt": "created_at",
+      "updatedAt": "updated_at",
+      "haexTimestamp": "haex_timestamp"
+    }
+  },
   "crdt": {
     "logs": {

View File

@@ -59,48 +59,7 @@
     </main>

     <!-- Workspace Drawer -->
-    <UDrawer
-      v-model:open="isOverviewMode"
-      direction="left"
-      :dismissible="false"
-      :overlay="false"
-      :modal="false"
-      title="Workspaces"
-      description="Workspaces"
-    >
-      <template #content>
-        <div class="p-6 h-full overflow-y-auto">
-          <UButton
-            block
-            trailing-icon="mdi-close"
-            class="text-2xl font-bold ext-gray-900 dark:text-white mb-4"
-            @click="isOverviewMode = false"
-          >
-            Workspaces
-          </UButton>
-
-          <!-- Workspace Cards -->
-          <div class="flex flex-col gap-3">
-            <HaexWorkspaceCard
-              v-for="workspace in workspaces"
-              :key="workspace.id"
-              :workspace
-            />
-          </div>
-
-          <!-- Add New Workspace Button -->
-          <UButton
-            block
-            variant="outline"
-            class="mt-6"
-            @click="handleAddWorkspace"
-            icon="i-heroicons-plus"
-            :label="t('workspaces.add')"
-          >
-          </UButton>
-        </div>
-      </template>
-    </UDrawer>
+    <HaexWorkspaceDrawer />
   </div>
 </template>
@@ -117,15 +76,7 @@ const { showWindowOverview, openWindowsCount } = storeToRefs(
   useWindowManagerStore(),
 )

-const workspaceStore = useWorkspaceStore()
-const { workspaces, isOverviewMode } = storeToRefs(workspaceStore)
-
-const handleAddWorkspace = async () => {
-  const workspace = await workspaceStore.addWorkspaceAsync()
-  nextTick(() => {
-    workspaceStore.slideToWorkspace(workspace?.id)
-  })
-}
+const { isOverviewMode } = storeToRefs(useWorkspaceStore())

 // Measure header height and store it in UI store
 const headerEl = useTemplateRef('headerEl')
@@ -141,15 +92,11 @@ watch(height, (newHeight) => {
 de:
   search:
     label: Suche
   workspaces:
     label: Workspaces
-    add: Workspace hinzufügen
 en:
   search:
     label: Search
   workspaces:
     label: Workspaces
-    add: Add Workspace
 </i18n>

View File

@@ -25,7 +25,7 @@
     <div
       v-show="lastVaults.length"
-      class="max-w-md w-full sm:px-5"
+      class="w-56"
     >
       <div class="font-thin text-sm pb-1 w-full">
         {{ t('lastUsed') }}

View File

@@ -53,6 +53,7 @@ const { addDeviceNameAsync } = useDeviceStore()
 const { deviceId } = storeToRefs(useDeviceStore())
 const { syncLocaleAsync, syncThemeAsync, syncVaultNameAsync } =
   useVaultSettingsStore()
+const { syncDesktopIconSizeAsync } = useDesktopStore()

 onMounted(async () => {
   try {
@@ -62,6 +63,7 @@ onMounted(async () => {
       syncLocaleAsync(),
       syncThemeAsync(),
       syncVaultNameAsync(),
+      syncDesktopIconSizeAsync(),
       loadExtensionsAsync(),
       readNotificationsAsync(),
     ])

View File

@@ -1,9 +1,13 @@
 import { eq } from 'drizzle-orm'
-import { haexDesktopItems } from '~/database/schemas'
+import { haexDesktopItems, haexDevices } from '~/database/schemas'
 import type {
   InsertHaexDesktopItems,
   SelectHaexDesktopItems,
 } from '~/database/schemas'
+import {
+  DesktopIconSizePreset,
+  iconSizePresetValues,
+} from '~/stores/vault/settings'

 import de from './de.json'
 import en from './en.json'
@@ -20,15 +24,104 @@ export const useDesktopStore = defineStore('desktopStore', () => {
   const workspaceStore = useWorkspaceStore()
   const { currentWorkspace } = storeToRefs(workspaceStore)
   const { $i18n } = useNuxtApp()
+  const deviceStore = useDeviceStore()
+  const settingsStore = useVaultSettingsStore()

-  $i18n.setLocaleMessage('de', {
-    desktop: de,
-  })
+  $i18n.setLocaleMessage('de', { desktop: de })
   $i18n.setLocaleMessage('en', { desktop: en })

   const desktopItems = ref<IDesktopItem[]>([])
   const selectedItemIds = ref<Set<string>>(new Set())

+  // Desktop Grid Settings (stored in DB per device)
+  const iconSizePreset = ref<DesktopIconSizePreset>(
+    DesktopIconSizePreset.medium,
+  )
+
+  // Get device internal ID from DB
+  const getDeviceInternalIdAsync = async () => {
+    if (!deviceStore.deviceId || !currentVault.value?.drizzle) return undefined
+
+    const device = await currentVault.value.drizzle.query.haexDevices.findFirst(
+      {
+        where: eq(haexDevices.deviceId, deviceStore.deviceId),
+      },
+    )
+
+    return device?.id ? device.id : undefined
+  }
+
+  // Sync icon size from DB
+  const syncDesktopIconSizeAsync = async () => {
+    const deviceInternalId = await getDeviceInternalIdAsync()
+    if (!deviceInternalId) return
+
+    const preset =
+      await settingsStore.syncDesktopIconSizeAsync(deviceInternalId)
+    iconSizePreset.value = preset
+  }
+
+  // Update icon size in DB
+  const updateDesktopIconSizeAsync = async (preset: DesktopIconSizePreset) => {
+    const deviceInternalId = await getDeviceInternalIdAsync()
+    if (!deviceInternalId) return
+
+    await settingsStore.updateDesktopIconSizeAsync(deviceInternalId, preset)
+    iconSizePreset.value = preset
+  }
+
+  const effectiveIconSize = computed(() => {
+    return iconSizePresetValues[iconSizePreset.value]
+  })
+
+  const iconPadding = 30
+
+  // Calculate grid cell size based on icon size
+  const gridCellSize = computed(() => {
+    // Add padding around icon (30px extra for spacing)
+    return effectiveIconSize.value + iconPadding
+  })
+
+  // Snap position to grid (centers icon in cell)
+  // iconWidth and iconHeight are optional - if provided, they're used for centering
+  const snapToGrid = (
+    x: number,
+    y: number,
+    iconWidth?: number,
+    iconHeight?: number,
+  ) => {
+    const cellSize = gridCellSize.value
+    const halfCell = cellSize / 2
+
+    // Use provided dimensions or fall back to the effective icon size (not cell size!)
+    const actualIconWidth = iconWidth || effectiveIconSize.value
+    const actualIconHeight = iconHeight || effectiveIconSize.value
+
+    // Calculate which grid cell the position falls into
+    // Add half the icon size to x/y to get the center point for snapping
+    const centerX = x + actualIconWidth / 2
+    const centerY = y + actualIconHeight / 2
+
+    // Find nearest grid cell center
+    // Grid cells are centered at: halfCell, halfCell + cellSize, halfCell + 2*cellSize, ...
+    // Which is: halfCell + (n * cellSize) for n = 0, 1, 2, ...
+    const col = Math.round((centerX - halfCell) / cellSize)
+    const row = Math.round((centerY - halfCell) / cellSize)
+
+    // Calculate the center of the target grid cell
+    const gridCenterX = halfCell + col * cellSize
+    const gridCenterY = halfCell + row * cellSize
+
+    // Calculate the top-left position that centers the icon in the cell
+    const snappedX = gridCenterX - actualIconWidth / 2
+    const snappedY = gridCenterY - actualIconHeight / 2
+
+    return {
+      x: snappedX,
+      y: snappedY,
+    }
+  }
+
   const loadDesktopItemsAsync = async () => {
     if (!currentVault.value?.drizzle) {
       console.error('Kein Vault geöffnet')
@@ -46,9 +139,12 @@ export const useDesktopStore = defineStore('desktopStore', () => {
       .from(haexDesktopItems)
       .where(eq(haexDesktopItems.workspaceId, currentWorkspace.value.id))

-    desktopItems.value = items.map(item => ({
+    desktopItems.value = items.map((item) => ({
       ...item,
-      referenceId: item.itemType === 'extension' ? item.extensionId! : item.systemWindowId!,
+      referenceId:
+        item.itemType === 'extension'
+          ? item.extensionId!
+          : item.systemWindowId!,
     }))
   } catch (error) {
     console.error('Fehler beim Laden der Desktop-Items:', error)
@@ -77,7 +173,10 @@ export const useDesktopStore = defineStore('desktopStore', () => {
       workspaceId: targetWorkspaceId,
       itemType: itemType,
       extensionId: itemType === 'extension' ? referenceId : null,
-      systemWindowId: itemType === 'system' || itemType === 'file' || itemType === 'folder' ? referenceId : null,
+      systemWindowId:
+        itemType === 'system' || itemType === 'file' || itemType === 'folder'
+          ? referenceId
+          : null,
       positionX: positionX,
       positionY: positionY,
     }
@@ -90,7 +189,10 @@ export const useDesktopStore = defineStore('desktopStore', () => {
     if (result.length > 0 && result[0]) {
       const itemWithRef = {
         ...result[0],
-        referenceId: itemType === 'extension' ? result[0].extensionId! : result[0].systemWindowId!,
+        referenceId:
+          itemType === 'extension'
+            ? result[0].extensionId!
+            : result[0].systemWindowId!,
       }
       desktopItems.value.push(itemWithRef)
       return itemWithRef
@@ -101,7 +203,7 @@ export const useDesktopStore = defineStore('desktopStore', () => {
       itemType,
       referenceId,
       workspaceId: targetWorkspaceId,
-      position: { x: positionX, y: positionY }
+      position: { x: positionX, y: positionY },
     })

     // Log full error details
@@ -138,7 +240,10 @@ export const useDesktopStore = defineStore('desktopStore', () => {
       const item = result[0]
       desktopItems.value[index] = {
         ...item,
-        referenceId: item.itemType === 'extension' ? item.extensionId! : item.systemWindowId!,
+        referenceId:
+          item.itemType === 'extension'
+            ? item.extensionId!
+            : item.systemWindowId!,
       }
     }
   }
@@ -171,16 +276,14 @@ export const useDesktopStore = defineStore('desktopStore', () => {
     itemType: DesktopItemType,
     referenceId: string,
   ) => {
-    return desktopItems.value.find(
-      (item) => {
-        if (item.itemType !== itemType) return false
-        if (itemType === 'extension') {
-          return item.extensionId === referenceId
-        } else {
-          return item.systemWindowId === referenceId
-        }
-      },
-    )
+    return desktopItems.value.find((item) => {
+      if (item.itemType !== itemType) return false
+      if (itemType === 'extension') {
+        return item.extensionId === referenceId
+      } else {
+        return item.systemWindowId === referenceId
+      }
+    })
   }

   const openDesktopItem = (
@@ -191,9 +294,9 @@ export const useDesktopStore = defineStore('desktopStore', () => {
     const windowManager = useWindowManagerStore()

     if (itemType === 'system') {
-      const systemWindow = windowManager.getAllSystemWindows().find(
-        (win) => win.id === referenceId,
-      )
+      const systemWindow = windowManager
+        .getAllSystemWindows()
+        .find((win) => win.id === referenceId)

       if (systemWindow) {
         windowManager.openWindowAsync({
@@ -298,6 +401,28 @@ export const useDesktopStore = defineStore('desktopStore', () => {
       openDesktopItem(itemType, referenceId)
     }

+    // Build second menu group based on item type
+    const secondGroup = [
+      {
+        label: $i18n.t('desktop.contextMenu.removeFromDesktop'),
+        icon: 'i-heroicons-x-mark',
+        onSelect: async () => {
+          await removeDesktopItemAsync(id)
+        },
+      },
+    ]
+
+    // Only show uninstall option for extensions
+    if (itemType === 'extension') {
+      secondGroup.push({
+        label: $i18n.t('desktop.contextMenu.uninstall'),
+        icon: 'i-heroicons-trash',
+        onSelect: async () => {
+          onUninstall()
+        },
+      })
+    }
+
     return [
       [
         {
@@ -306,20 +431,7 @@ export const useDesktopStore = defineStore('desktopStore', () => {
           onSelect: handleOpen,
         },
       ],
-      [
-        {
-          label: $i18n.t('desktop.contextMenu.removeFromDesktop'),
-          icon: 'i-heroicons-x-mark',
-          onSelect: async () => {
-            await removeDesktopItemAsync(id)
-          },
-        },
-        {
-          label: $i18n.t('desktop.contextMenu.uninstall'),
-          icon: 'i-heroicons-trash',
-          onSelect: onUninstall,
-        },
-      ],
+      secondGroup,
     ]
   }
@@ -338,5 +450,12 @@ export const useDesktopStore = defineStore('desktopStore', () => {
     toggleSelection,
     clearSelection,
     isItemSelected,
+    // Grid settings
+    iconSizePreset,
+    syncDesktopIconSizeAsync,
+    updateDesktopIconSizeAsync,
+    effectiveIconSize,
+    gridCellSize,
+    snapToGrid,
   }
 })
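The snap-to-grid arithmetic in this commit can be exercised in isolation. Below is a minimal standalone sketch of the same math, with concrete values assumed for illustration (effective icon size 64px and the store's 30px padding, giving a 94px cell); it is not the store code itself:

```typescript
// Standalone sketch of the commit's snap-to-grid math.
// Assumed values: effective icon size 64px, padding 30px -> cell size 94px.
const iconSize = 64
const cellSize = iconSize + 30 // mirrors gridCellSize
const halfCell = cellSize / 2

function snapToGrid(x: number, y: number, w = iconSize, h = iconSize) {
  // Snap the icon's center to the nearest cell center,
  // then convert back to a top-left coordinate.
  const col = Math.round((x + w / 2 - halfCell) / cellSize)
  const row = Math.round((y + h / 2 - halfCell) / cellSize)
  return {
    x: halfCell + col * cellSize - w / 2,
    y: halfCell + row * cellSize - h / 2,
  }
}

// A drop at (10, 10) lands in the first cell (center 47, 47):
console.log(snapToGrid(10, 10)) // { x: 15, y: 15 }
```

Note how the half-icon offset means the function centers icons of any size within a cell, which is why the optional `iconWidth`/`iconHeight` parameters only affect the final top-left conversion, not the chosen cell.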

View File

@@ -1,4 +1,5 @@
 import { defineAsyncComponent, type Component } from 'vue'
+import { getFullscreenDimensions } from '~/utils/viewport'

 export interface IWindow {
   id: string
@@ -191,11 +192,30 @@ export const useWindowManagerStore = defineStore('windowManager', () => {
     const viewportHeight = window.innerHeight - 60
     console.log('viewportHeight', window.innerHeight, viewportHeight)

-    const windowHeight = Math.min(height, viewportHeight)
+    // Check if we're on a small screen
+    const { isSmallScreen } = useUiStore()
+
+    let windowWidth: number
+    let windowHeight: number
+    let x: number
+    let y: number
+
+    if (isSmallScreen) {
+      // On small screens, make window fullscreen starting at 0,0
+      // Use helper function to calculate correct dimensions with safe areas
+      const fullscreen = getFullscreenDimensions()
+      x = fullscreen.x
+      y = fullscreen.y
+      windowWidth = fullscreen.width
+      windowHeight = fullscreen.height
+    } else {
+      // On larger screens, use normal sizing and positioning
+      windowHeight = Math.min(height, viewportHeight)

       // Adjust width proportionally if needed (optional)
       const aspectRatio = width / height
-      const windowWidth = Math.min(
+      windowWidth = Math.min(
         width,
         viewportWidth,
         windowHeight * aspectRatio,
@@ -205,8 +225,9 @@ export const useWindowManagerStore = defineStore('windowManager', () => {
       const offset = currentWorkspaceWindows.value.length * 30
       const centerX = Math.max(0, (viewportWidth - windowWidth) / 1 / 3)
       const centerY = Math.max(0, (viewportHeight - windowHeight) / 1 / 3)
-      const x = Math.min(centerX + offset, viewportWidth - windowWidth)
-      const y = Math.min(centerY + offset, viewportHeight - windowHeight)
+      x = Math.min(centerX + offset, viewportWidth - windowWidth)
+      y = Math.min(centerY + offset, viewportHeight - windowHeight)
+    }

     const newWindow: IWindow = {
       id: windowId,
@@ -300,6 +321,7 @@ export const useWindowManagerStore = defineStore('windowManager', () => {
     const window = windows.value.find((w) => w.id === windowId)
     if (window) {
       window.zIndex = nextZIndex.value++
+      window.isMinimized = false
       activeWindowId.value = windowId
     }
   }
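The large-screen branch of the sizing logic clamps a requested window size to the viewport while preserving the source aspect ratio. A standalone sketch of that arithmetic, with made-up viewport numbers (the store's safe-area and offset handling is omitted):

```typescript
// Clamp a requested window size to the viewport, keeping aspect ratio.
// Mirrors the else-branch of the commit's sizing logic; values are examples.
function fitWindow(width: number, height: number, vw: number, vh: number) {
  const windowHeight = Math.min(height, vh)
  const aspectRatio = width / height
  // Width is limited by the request, the viewport, and the scaled height.
  const windowWidth = Math.min(width, vw, windowHeight * aspectRatio)
  return { windowWidth, windowHeight }
}

console.log(fitWindow(1600, 1200, 1280, 720))
// { windowWidth: 960, windowHeight: 720 } — width clamped to 720 * (4/3)
```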

View File

@@ -4,6 +4,7 @@ import {
   type SelectHaexWorkspaces,
 } from '~/database/schemas'
 import type { Swiper } from 'swiper/types'
+import { convertFileSrc } from '@tauri-apps/api/core'

 export type IWorkspace = SelectHaexWorkspaces
@@ -203,12 +204,86 @@ export const useWorkspaceStore = defineStore('workspaceStore', () => {
     isOverviewMode.value = false
   }

+  const updateWorkspaceBackgroundAsync = async (
+    workspaceId: string,
+    base64Image: string | null,
+  ) => {
+    if (!currentVault.value?.drizzle) {
+      throw new Error('Kein Vault geöffnet')
+    }
+
+    try {
+      const result = await currentVault.value.drizzle
+        .update(haexWorkspaces)
+        .set({ background: base64Image })
+        .where(eq(haexWorkspaces.id, workspaceId))
+        .returning()
+
+      if (result.length > 0 && result[0]) {
+        const index = workspaces.value.findIndex((ws) => ws.id === workspaceId)
+        if (index !== -1) {
+          workspaces.value[index] = result[0]
+        }
+      }
+    } catch (error) {
+      console.error('Fehler beim Aktualisieren des Workspace-Hintergrunds:', error)
+      throw error
+    }
+  }
+
+  const getWorkspaceBackgroundStyle = (workspace: IWorkspace) => {
+    if (!workspace.background) return {}
+
+    // The background field contains the absolute file path
+    // Convert it to an asset URL
+    const assetUrl = convertFileSrc(workspace.background)
+
+    return {
+      backgroundImage: `url(${assetUrl})`,
+      backgroundSize: 'cover',
+      backgroundPosition: 'center',
+      backgroundRepeat: 'no-repeat',
+    }
+  }
+
+  const getWorkspaceContextMenuItems = (workspaceId: string) => {
+    const windowManager = useWindowManagerStore()
+
+    return [[
+      {
+        label: 'Hintergrund ändern',
+        icon: 'i-mdi-image',
+        onSelect: async () => {
+          // Store the workspace ID for settings to use
+          currentWorkspaceIndex.value = workspaces.value.findIndex(
+            (ws) => ws.id === workspaceId,
+          )
+
+          // Get settings window info
+          const settingsWindow = windowManager.getAllSystemWindows()
+            .find((win) => win.id === 'settings')
+
+          if (settingsWindow) {
+            await windowManager.openWindowAsync({
+              type: 'system',
+              sourceId: settingsWindow.id,
+              title: settingsWindow.name,
+              icon: settingsWindow.icon || undefined,
+              workspaceId,
+            })
+          }
+        },
+      },
+    ]]
+  }
+
   return {
     addWorkspaceAsync,
     allowSwipe,
     closeWorkspaceAsync,
     currentWorkspace,
     currentWorkspaceIndex,
+    getWorkspaceBackgroundStyle,
+    getWorkspaceContextMenuItems,
     isOverviewMode,
     slideToWorkspace,
     loadWorkspacesAsync,
@@ -218,6 +293,7 @@ export const useWorkspaceStore = defineStore('workspaceStore', () => {
     switchToNext,
     switchToPrevious,
     switchToWorkspace,
+    updateWorkspaceBackgroundAsync,
     workspaces,
   }
 })

src/stores/sync/backends.ts Normal file
View File

@@ -0,0 +1,130 @@
import { eq } from 'drizzle-orm'
import {
haexSyncBackends,
type InsertHaexSyncBackends,
type SelectHaexSyncBackends,
} from '~/database/schemas'
export const useSyncBackendsStore = defineStore('syncBackendsStore', () => {
const { currentVault } = storeToRefs(useVaultStore())
const backends = ref<SelectHaexSyncBackends[]>([])
const enabledBackends = computed(() =>
backends.value.filter((b) => b.enabled),
)
const sortedBackends = computed(() =>
[...backends.value].sort((a, b) => (b.priority || 0) - (a.priority || 0)),
)
// Load all sync backends from database
const loadBackendsAsync = async () => {
if (!currentVault.value?.drizzle) {
console.error('No vault opened')
return
}
try {
const result = await currentVault.value.drizzle
.select()
.from(haexSyncBackends)
backends.value = result
} catch (error) {
console.error('Failed to load sync backends:', error)
throw error
}
}
// Add a new sync backend
const addBackendAsync = async (backend: InsertHaexSyncBackends) => {
if (!currentVault.value?.drizzle) {
throw new Error('No vault opened')
}
try {
const result = await currentVault.value.drizzle
.insert(haexSyncBackends)
.values(backend)
.returning()
if (result.length > 0 && result[0]) {
backends.value.push(result[0])
return result[0]
}
} catch (error) {
console.error('Failed to add sync backend:', error)
throw error
}
}
// Update an existing sync backend
const updateBackendAsync = async (
id: string,
updates: Partial<InsertHaexSyncBackends>,
) => {
if (!currentVault.value?.drizzle) {
throw new Error('No vault opened')
}
try {
const result = await currentVault.value.drizzle
.update(haexSyncBackends)
.set(updates)
.where(eq(haexSyncBackends.id, id))
.returning()
if (result.length > 0 && result[0]) {
const index = backends.value.findIndex((b) => b.id === id)
if (index !== -1) {
backends.value[index] = result[0]
}
return result[0]
}
} catch (error) {
console.error('Failed to update sync backend:', error)
throw error
}
}
// Delete a sync backend
const deleteBackendAsync = async (id: string) => {
if (!currentVault.value?.drizzle) {
throw new Error('No vault opened')
}
try {
await currentVault.value.drizzle
.delete(haexSyncBackends)
.where(eq(haexSyncBackends.id, id))
backends.value = backends.value.filter((b) => b.id !== id)
} catch (error) {
console.error('Failed to delete sync backend:', error)
throw error
}
}
// Enable/disable a backend
const toggleBackendAsync = async (id: string, enabled: boolean) => {
return updateBackendAsync(id, { enabled })
}
// Update backend priority (for sync order)
const updatePriorityAsync = async (id: string, priority: number) => {
return updateBackendAsync(id, { priority })
}
return {
backends,
enabledBackends,
sortedBackends,
loadBackendsAsync,
addBackendAsync,
updateBackendAsync,
deleteBackendAsync,
toggleBackendAsync,
updatePriorityAsync,
}
})
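The `enabledBackends` and `sortedBackends` computeds above define the sync order: disabled backends are skipped, and higher `priority` syncs first. A minimal standalone sketch of that ordering contract, using made-up backend records:

```typescript
// Hypothetical records mirroring the SelectHaexSyncBackends fields
// relevant to ordering; ids and priorities are made up.
interface BackendLike {
  id: string
  enabled: boolean
  priority: number | null
}

const backends: BackendLike[] = [
  { id: 'a', enabled: true, priority: 0 },
  { id: 'b', enabled: false, priority: 5 },
  { id: 'c', enabled: true, priority: 2 },
]

// Same expressions as the store's computeds.
const enabled = backends.filter((b) => b.enabled)
const sorted = [...backends].sort(
  (x, y) => (y.priority || 0) - (x.priority || 0),
)

console.log(enabled.map((b) => b.id)) // [ 'a', 'c' ]
console.log(sorted.map((b) => b.id)) // [ 'b', 'c', 'a' ]
```

Note that `sortedBackends` sorts all backends, including disabled ones; a caller that wants "enabled, in priority order" would need to combine both filters.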

src/stores/sync/engine.ts Normal file
View File

@@ -0,0 +1,390 @@
/**
* Sync Engine Store - Executes sync operations with haex-sync-server backends
* Handles vault key storage and CRDT log synchronization
*/
import { createClient } from '@supabase/supabase-js'
import type { SelectHaexCrdtLogs } from '~/database/schemas'
import {
encryptVaultKeyAsync,
decryptVaultKeyAsync,
encryptCrdtDataAsync,
decryptCrdtDataAsync,
generateVaultKey,
} from '~/utils/crypto/vaultKey'
interface VaultKeyCache {
[vaultId: string]: {
vaultKey: Uint8Array
timestamp: number
}
}
interface SyncLogData {
vaultId: string
encryptedData: string
nonce: string
haexTimestamp: string
sequence: number
}
interface PullLogsResponse {
logs: Array<{
id: string
userId: string
vaultId: string
encryptedData: string
nonce: string
haexTimestamp: string
sequence: number
createdAt: string
}>
hasMore: boolean
}
export const useSyncEngineStore = defineStore('syncEngineStore', () => {
const { currentVault, currentVaultId } = storeToRefs(useVaultStore())
const syncBackendsStore = useSyncBackendsStore()
// In-memory cache for decrypted vault keys (cleared on logout/vault close)
const vaultKeyCache = ref<VaultKeyCache>({})
// Supabase client (initialized with config from backend)
const supabaseClient = ref<ReturnType<typeof createClient> | null>(null)
/**
* Initializes Supabase client for a specific backend
*/
const initSupabaseClientAsync = async (backendId: string) => {
const backend = syncBackendsStore.backends.find((b) => b.id === backendId)
if (!backend) {
throw new Error('Backend not found')
}
// Get Supabase URL and anon key from server health check
const response = await fetch(backend.serverUrl)
if (!response.ok) {
throw new Error('Failed to connect to sync server')
}
const serverInfo = await response.json()
const supabaseUrl = serverInfo.supabaseUrl
// For now, we need to configure the anon key somewhere
// TODO: Store this in backend config or fetch from somewhere secure
const supabaseAnonKey = 'YOUR_SUPABASE_ANON_KEY'
supabaseClient.value = createClient(supabaseUrl, supabaseAnonKey)
}
/**
* Gets the current Supabase auth token
*/
const getAuthTokenAsync = async (): Promise<string | null> => {
if (!supabaseClient.value) {
return null
}
const {
data: { session },
} = await supabaseClient.value.auth.getSession()
return session?.access_token ?? null
}
/**
* Stores encrypted vault key on the server
*/
const storeVaultKeyAsync = async (
backendId: string,
vaultId: string,
password: string,
): Promise<void> => {
const backend = syncBackendsStore.backends.find((b) => b.id === backendId)
if (!backend) {
throw new Error('Backend not found')
}
// Generate new vault key
const vaultKey = generateVaultKey()
// Encrypt vault key with password
const encryptedData = await encryptVaultKeyAsync(vaultKey, password)
// Get auth token
const token = await getAuthTokenAsync()
if (!token) {
throw new Error('Not authenticated')
}
// Send to server
const response = await fetch(`${backend.serverUrl}/sync/vault-key`, {
method: 'POST',
headers: {
'Content-Type': 'application/json',
'Authorization': `Bearer ${token}`,
},
body: JSON.stringify({
vaultId,
...encryptedData,
}),
})
if (!response.ok) {
const error = await response.json().catch(() => ({}))
throw new Error(
`Failed to store vault key: ${error.error || response.statusText}`,
)
}
// Cache decrypted vault key
vaultKeyCache.value[vaultId] = {
vaultKey,
timestamp: Date.now(),
}
}
/**
* Retrieves and decrypts vault key from the server
*/
const getVaultKeyAsync = async (
backendId: string,
vaultId: string,
password: string,
): Promise<Uint8Array> => {
// Check cache first
const cached = vaultKeyCache.value[vaultId]
if (cached) {
return cached.vaultKey
}
const backend = syncBackendsStore.backends.find((b) => b.id === backendId)
if (!backend) {
throw new Error('Backend not found')
}
// Get auth token
const token = await getAuthTokenAsync()
if (!token) {
throw new Error('Not authenticated')
}
// Fetch from server
const response = await fetch(
`${backend.serverUrl}/sync/vault-key/${vaultId}`,
{
method: 'GET',
headers: {
'Authorization': `Bearer ${token}`,
},
},
)
if (response.status === 404) {
throw new Error('Vault key not found on server')
}
if (!response.ok) {
const error = await response.json().catch(() => ({}))
throw new Error(
`Failed to get vault key: ${error.error || response.statusText}`,
)
}
const data = await response.json()
// Decrypt vault key
const vaultKey = await decryptVaultKeyAsync(
data.encryptedVaultKey,
data.salt,
data.nonce,
password,
)
// Cache decrypted vault key
vaultKeyCache.value[vaultId] = {
vaultKey,
timestamp: Date.now(),
}
return vaultKey
}
/**
* Pushes CRDT logs to the server
*/
const pushLogsAsync = async (
backendId: string,
vaultId: string,
logs: SelectHaexCrdtLogs[],
): Promise<void> => {
const backend = syncBackendsStore.backends.find((b) => b.id === backendId)
if (!backend) {
throw new Error('Backend not found')
}
// Get vault key from cache
const cached = vaultKeyCache.value[vaultId]
if (!cached) {
throw new Error('Vault key not available. Please unlock vault first.')
}
const vaultKey = cached.vaultKey
// Get auth token
const token = await getAuthTokenAsync()
if (!token) {
throw new Error('Not authenticated')
}
// Encrypt each log entry
const encryptedLogs: SyncLogData[] = []
for (const log of logs) {
const { encryptedData, nonce } = await encryptCrdtDataAsync(
log,
vaultKey,
)
// Generate sequence number based on timestamp
const sequence = Date.now()
encryptedLogs.push({
vaultId,
encryptedData,
nonce,
haexTimestamp: log.haexTimestamp!,
sequence,
})
}
// Send to server
const response = await fetch(`${backend.serverUrl}/sync/push`, {
method: 'POST',
headers: {
'Content-Type': 'application/json',
'Authorization': `Bearer ${token}`,
},
body: JSON.stringify({
vaultId,
logs: encryptedLogs,
}),
})
if (!response.ok) {
const error = await response.json().catch(() => ({}))
throw new Error(
`Failed to push logs: ${error.error || response.statusText}`,
)
}
}
/**
* Pulls CRDT logs from the server
*/
const pullLogsAsync = async (
backendId: string,
vaultId: string,
afterSequence?: number,
limit?: number,
): Promise<SelectHaexCrdtLogs[]> => {
const backend = syncBackendsStore.backends.find((b) => b.id === backendId)
if (!backend) {
throw new Error('Backend not found')
}
// Get vault key from cache
const cached = vaultKeyCache.value[vaultId]
if (!cached) {
throw new Error('Vault key not available. Please unlock vault first.')
}
const vaultKey = cached.vaultKey
// Get auth token
const token = await getAuthTokenAsync()
if (!token) {
throw new Error('Not authenticated')
}
// Fetch from server
const response = await fetch(`${backend.serverUrl}/sync/pull`, {
method: 'POST',
headers: {
'Content-Type': 'application/json',
'Authorization': `Bearer ${token}`,
},
body: JSON.stringify({
vaultId,
afterSequence,
limit: limit ?? 100,
}),
})
if (!response.ok) {
const error = await response.json().catch(() => ({}))
throw new Error(
`Failed to pull logs: ${error.error || response.statusText}`,
)
}
const data: PullLogsResponse = await response.json()
// Decrypt each log entry
const decryptedLogs: SelectHaexCrdtLogs[] = []
for (const log of data.logs) {
try {
const decrypted = await decryptCrdtDataAsync<SelectHaexCrdtLogs>(
log.encryptedData,
log.nonce,
vaultKey,
)
decryptedLogs.push(decrypted)
} catch (error) {
console.error('Failed to decrypt log entry:', log.id, error)
// Skip corrupted entries
}
}
return decryptedLogs
}
/**
* Clears vault key from cache
*/
const clearVaultKeyCache = (vaultId?: string) => {
if (vaultId) {
delete vaultKeyCache.value[vaultId]
} else {
vaultKeyCache.value = {}
}
}
/**
* Health check - verifies server is reachable
*/
const healthCheckAsync = async (backendId: string): Promise<boolean> => {
const backend = syncBackendsStore.backends.find((b) => b.id === backendId)
if (!backend) {
return false
}
try {
const response = await fetch(backend.serverUrl)
return response.ok
} catch {
return false
}
}
return {
vaultKeyCache,
supabaseClient,
initSupabaseClientAsync,
getAuthTokenAsync,
storeVaultKeyAsync,
getVaultKeyAsync,
pushLogsAsync,
pullLogsAsync,
clearVaultKeyCache,
healthCheckAsync,
}
})
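The pull path in `pullLogsAsync` decrypts each entry individually and skips corrupted ones rather than failing the whole batch. A minimal self-contained sketch of that pattern — the `decryptFn` parameter stands in for `decryptCrdtDataAsync`, and the types are simplified for illustration:

```typescript
// Sketch of the decrypt-and-skip loop from pullLogsAsync: each entry is
// decrypted independently, and a failure only drops that one entry.
interface EncryptedEntry {
  id: string
  encryptedData: string
  nonce: string
}

export async function decryptAllEntriesAsync<T>(
  entries: EncryptedEntry[],
  decryptFn: (encryptedData: string, nonce: string) => Promise<T>,
): Promise<T[]> {
  const decrypted: T[] = []
  for (const entry of entries) {
    try {
      decrypted.push(await decryptFn(entry.encryptedData, entry.nonce))
    } catch (error) {
      // Skip corrupted entries, mirroring the store's behaviour
      console.error('Failed to decrypt log entry:', entry.id, error)
    }
  }
  return decrypted
}
```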

View File

@ -0,0 +1,525 @@
/**
* Sync Orchestrator Store - Orchestrates sync operations across all backends
* Uses Supabase Realtime subscriptions for instant sync
*/
import { eq, gt } from 'drizzle-orm'
import type { RealtimeChannel } from '@supabase/supabase-js'
import {
haexCrdtLogs,
haexSyncStatus,
type SelectHaexCrdtLogs,
type SelectHaexSyncStatus,
} from '~/database/schemas'
interface SyncState {
isConnected: boolean
isSyncing: boolean
error: string | null
subscription: RealtimeChannel | null
status: SelectHaexSyncStatus | null
}
interface BackendSyncState {
[backendId: string]: SyncState
}
export const useSyncOrchestratorStore = defineStore(
'syncOrchestratorStore',
() => {
const { currentVault, currentVaultId } = storeToRefs(useVaultStore())
const syncBackendsStore = useSyncBackendsStore()
const syncEngineStore = useSyncEngineStore()
// Sync state per backend
const syncStates = ref<BackendSyncState>({})
// Track if we're currently processing a local write
const isProcessingLocalWrite = ref(false)
/**
* Loads sync status from database for a backend
*/
const loadSyncStatusAsync = async (
backendId: string,
): Promise<SelectHaexSyncStatus | null> => {
if (!currentVault.value?.drizzle) {
throw new Error('No vault opened')
}
try {
const results = await currentVault.value.drizzle
.select()
.from(haexSyncStatus)
.where(eq(haexSyncStatus.backendId, backendId))
.limit(1)
return results[0] ?? null
} catch (error) {
console.error('Failed to load sync status:', error)
return null
}
}
/**
* Updates sync status in database
*/
const updateSyncStatusAsync = async (
backendId: string,
updates: Partial<SelectHaexSyncStatus>,
): Promise<void> => {
if (!currentVault.value?.drizzle) {
throw new Error('No vault opened')
}
try {
const existing = await loadSyncStatusAsync(backendId)
if (existing) {
// Update existing
await currentVault.value.drizzle
.update(haexSyncStatus)
.set({
...updates,
lastSyncAt: new Date().toISOString(),
})
.where(eq(haexSyncStatus.backendId, backendId))
} else {
// Insert new
await currentVault.value.drizzle.insert(haexSyncStatus).values({
backendId,
...updates,
lastSyncAt: new Date().toISOString(),
})
}
// Update local state
if (syncStates.value[backendId]) {
syncStates.value[backendId].status = await loadSyncStatusAsync(
backendId,
)
}
} catch (error) {
console.error('Failed to update sync status:', error)
throw error
}
}
/**
* Gets logs that need to be pushed to server (after last push HLC)
*/
const getLogsToPushAsync = async (
backendId: string,
): Promise<SelectHaexCrdtLogs[]> => {
if (!currentVault.value?.drizzle) {
throw new Error('No vault opened')
}
try {
const status = await loadSyncStatusAsync(backendId)
const lastPushHlc = status?.lastPushHlcTimestamp
const query = currentVault.value.drizzle
.select()
.from(haexCrdtLogs)
.orderBy(haexCrdtLogs.haexTimestamp)
if (lastPushHlc) {
return await query.where(
gt(haexCrdtLogs.haexTimestamp, lastPushHlc),
)
}
return await query
} catch (error) {
console.error('Failed to get logs to push:', error)
throw error
}
}
/**
* Applies remote logs to local database
*/
const applyRemoteLogsAsync = async (
logs: SelectHaexCrdtLogs[],
): Promise<void> => {
if (!currentVault.value?.drizzle) {
throw new Error('No vault opened')
}
try {
// Insert logs into local CRDT log table
for (const log of logs) {
await currentVault.value.drizzle
.insert(haexCrdtLogs)
.values(log)
.onConflictDoNothing() // Skip if already exists
}
// TODO: Apply CRDT log entries to actual data tables
// This requires replaying the operations from the log
console.log(`Applied ${logs.length} remote logs to local database`)
} catch (error) {
console.error('Failed to apply remote logs:', error)
throw error
}
}
/**
* Pushes local changes to a specific backend
*/
const pushToBackendAsync = async (backendId: string): Promise<void> => {
if (!currentVaultId.value) {
throw new Error('No vault opened')
}
const state = syncStates.value[backendId]
if (!state) {
throw new Error('Backend not initialized')
}
if (state.isSyncing) {
console.log(`Already syncing with backend ${backendId}`)
return
}
state.isSyncing = true
state.error = null
try {
// Get logs that need to be pushed
const logs = await getLogsToPushAsync(backendId)
if (logs.length === 0) {
console.log(`No logs to push to backend ${backendId}`)
return
}
await syncEngineStore.pushLogsAsync(
backendId,
currentVaultId.value,
logs,
)
// Update sync status with last pushed HLC timestamp
const lastHlc = logs[logs.length - 1]?.haexTimestamp
if (lastHlc) {
await updateSyncStatusAsync(backendId, {
lastPushHlcTimestamp: lastHlc,
})
}
console.log(`Pushed ${logs.length} logs to backend ${backendId}`)
} catch (error) {
console.error(`Failed to push to backend ${backendId}:`, error)
state.error = error instanceof Error ? error.message : 'Unknown error'
await updateSyncStatusAsync(backendId, {
error: state.error,
})
throw error
} finally {
state.isSyncing = false
}
}
/**
* Pulls changes from a specific backend
*/
const pullFromBackendAsync = async (backendId: string): Promise<void> => {
if (!currentVaultId.value) {
throw new Error('No vault opened')
}
const state = syncStates.value[backendId]
if (!state) {
throw new Error('Backend not initialized')
}
if (state.isSyncing) {
console.log(`Already syncing with backend ${backendId}`)
return
}
state.isSyncing = true
state.error = null
try {
const status = await loadSyncStatusAsync(backendId)
const afterSequence = status?.lastPullSequence ?? undefined
const remoteLogs = await syncEngineStore.pullLogsAsync(
backendId,
currentVaultId.value,
afterSequence,
100,
)
if (remoteLogs.length > 0) {
await applyRemoteLogsAsync(remoteLogs)
// Update sync status with last pulled sequence
// TODO: Get actual sequence from server response
const lastSequence = Date.now()
await updateSyncStatusAsync(backendId, {
lastPullSequence: lastSequence,
})
console.log(
`Pulled ${remoteLogs.length} logs from backend ${backendId}`,
)
}
} catch (error) {
console.error(`Failed to pull from backend ${backendId}:`, error)
state.error = error instanceof Error ? error.message : 'Unknown error'
await updateSyncStatusAsync(backendId, {
error: state.error,
})
throw error
} finally {
state.isSyncing = false
}
}
/**
* Handles incoming realtime changes from Supabase
*/
const handleRealtimeChangeAsync = async (
backendId: string,
payload: unknown,
) => {
console.log(`Realtime change from backend ${backendId}:`, payload)
// Don't process if we're currently writing locally to avoid loops
if (isProcessingLocalWrite.value) {
console.log('Skipping realtime change - local write in progress')
return
}
// Pull latest changes from this backend
try {
await pullFromBackendAsync(backendId)
} catch (error) {
console.error('Failed to handle realtime change:', error)
}
}
/**
* Subscribes to realtime changes from a backend
*/
const subscribeToBackendAsync = async (backendId: string): Promise<void> => {
if (!currentVaultId.value) {
throw new Error('No vault opened')
}
const state = syncStates.value[backendId]
if (!state) {
throw new Error('Backend not initialized')
}
if (state.subscription) {
console.log(`Already subscribed to backend ${backendId}`)
return
}
const client = syncEngineStore.supabaseClient
if (!client) {
throw new Error('Supabase client not initialized')
}
try {
// Subscribe to sync_logs table for this vault
const channel = client
.channel(`sync_logs:${currentVaultId.value}`)
.on(
'postgres_changes',
{
event: 'INSERT',
schema: 'public',
table: 'sync_logs',
filter: `vault_id=eq.${currentVaultId.value}`,
},
(payload) => {
handleRealtimeChangeAsync(backendId, payload).catch(console.error)
},
)
.subscribe((status) => {
if (status === 'SUBSCRIBED') {
state.isConnected = true
console.log(`Subscribed to backend ${backendId}`)
} else if (status === 'CHANNEL_ERROR' || status === 'TIMED_OUT') {
state.isConnected = false
state.error = `Subscription error: ${status}`
console.error(
`Subscription to backend ${backendId} failed: ${status}`,
)
}
})
state.subscription = channel
} catch (error) {
console.error(`Failed to subscribe to backend ${backendId}:`, error)
state.error = error instanceof Error ? error.message : 'Unknown error'
throw error
}
}
/**
* Unsubscribes from realtime changes
*/
const unsubscribeFromBackendAsync = async (
backendId: string,
): Promise<void> => {
const state = syncStates.value[backendId]
if (!state || !state.subscription) {
return
}
try {
await state.subscription.unsubscribe()
state.subscription = null
state.isConnected = false
console.log(`Unsubscribed from backend ${backendId}`)
} catch (error) {
console.error(`Failed to unsubscribe from backend ${backendId}:`, error)
}
}
/**
* Initializes sync for a backend
*/
const initBackendAsync = async (backendId: string): Promise<void> => {
if (syncStates.value[backendId]) {
console.log(`Backend ${backendId} already initialized`)
return
}
// Load sync status from database
const status = await loadSyncStatusAsync(backendId)
// Initialize state
syncStates.value[backendId] = {
isConnected: false,
isSyncing: false,
error: null,
subscription: null,
status,
}
try {
// Initial pull to get all existing data
await pullFromBackendAsync(backendId)
// Subscribe to realtime changes
await subscribeToBackendAsync(backendId)
} catch (error) {
console.error(`Failed to initialize backend ${backendId}:`, error)
throw error
}
}
/**
* Called after local write operations to push changes
*/
const onLocalWriteAsync = async (): Promise<void> => {
isProcessingLocalWrite.value = true
try {
// Push to all enabled backends in parallel
const enabledBackends = syncBackendsStore.enabledBackends
await Promise.allSettled(
enabledBackends.map((backend) => pushToBackendAsync(backend.id)),
)
} catch (error) {
console.error('Failed to push local changes:', error)
} finally {
isProcessingLocalWrite.value = false
}
}
/**
* Starts sync for all enabled backends
*/
const startSyncAsync = async (): Promise<void> => {
const enabledBackends = syncBackendsStore.enabledBackends
if (enabledBackends.length === 0) {
console.log('No enabled backends to sync with')
return
}
console.log(`Starting sync with ${enabledBackends.length} backends`)
for (const backend of enabledBackends) {
try {
await initBackendAsync(backend.id)
} catch (error) {
console.error(
`Failed to start sync with backend ${backend.id}:`,
error,
)
}
}
}
/**
* Stops sync for all backends
*/
const stopSyncAsync = async (): Promise<void> => {
console.log('Stopping sync for all backends')
for (const backendId of Object.keys(syncStates.value)) {
await unsubscribeFromBackendAsync(backendId)
}
syncStates.value = {}
}
/**
* Gets sync state for a specific backend
*/
const getSyncState = (backendId: string): SyncState | null => {
return syncStates.value[backendId] ?? null
}
/**
* Checks if any backend is currently syncing
*/
const isAnySyncing = computed(() => {
return Object.values(syncStates.value).some((state) => state.isSyncing)
})
/**
* Checks if all backends are connected
*/
const areAllConnected = computed(() => {
const enabledBackends = syncBackendsStore.enabledBackends
if (enabledBackends.length === 0) return false
return enabledBackends.every((backend) => {
const state = syncStates.value[backend.id]
return state?.isConnected ?? false
})
})
return {
syncStates,
isProcessingLocalWrite,
isAnySyncing,
areAllConnected,
loadSyncStatusAsync,
updateSyncStatusAsync,
getLogsToPushAsync,
applyRemoteLogsAsync,
pushToBackendAsync,
pullFromBackendAsync,
subscribeToBackendAsync,
unsubscribeFromBackendAsync,
initBackendAsync,
onLocalWriteAsync,
startSyncAsync,
stopSyncAsync,
getSyncState,
}
},
)
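The HLC push cursor used by `getLogsToPushAsync` reduces to a filter over ordered timestamps. A minimal sketch under simplified types — `selectLogsToPush` is a hypothetical helper, not part of the store, and it assumes HLC timestamp strings compare correctly with `>`:

```typescript
interface CrdtLogEntry {
  id: string
  haexTimestamp: string // HLC timestamp, assumed lexicographically ordered
}

export function selectLogsToPush(
  logs: CrdtLogEntry[],
  lastPushHlc?: string,
): CrdtLogEntry[] {
  // Order by HLC, then keep only entries after the last pushed timestamp
  const ordered = [...logs].sort((a, b) =>
    a.haexTimestamp < b.haexTimestamp ? -1 : a.haexTimestamp > b.haexTimestamp ? 1 : 0,
  )
  if (!lastPushHlc) return ordered
  return ordered.filter((log) => log.haexTimestamp > lastPushHlc)
}
```

After a successful push, the store records the last entry's `haexTimestamp` as the new cursor, so the next push starts from there.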

View File

@ -45,7 +45,9 @@ export const useDeviceStore = defineStore('vaultDeviceStore', () => {
   const isKnownDeviceAsync = async () => {
     const { readDeviceNameAsync } = useVaultSettingsStore()
-    return !!(await readDeviceNameAsync(deviceId.value))
+    const device = await readDeviceNameAsync(deviceId.value)
+    console.log('device', device)
+    return !!device
   }
 
   const readDeviceNameAsync = async (id?: string) => {
@ -54,7 +56,8 @@ export const useDeviceStore = defineStore('vaultDeviceStore', () => {
     if (!_id) return
 
-    deviceName.value = (await readDeviceNameAsync(_id))?.value ?? ''
+    const device = await readDeviceNameAsync(_id)
+    deviceName.value = device?.name ?? ''
     return deviceName.value
   }

View File

@ -4,14 +4,29 @@ import * as schema from '~/database/schemas/haex'
 import type { Locale } from 'vue-i18n'
 
 export enum VaultSettingsTypeEnum {
-  deviceName = 'deviceName',
   settings = 'settings',
+  system = 'system',
 }
 
 export enum VaultSettingsKeyEnum {
   locale = 'locale',
   theme = 'theme',
   vaultName = 'vaultName',
+  desktopIconSize = 'desktopIconSize',
+}
+
+export enum DesktopIconSizePreset {
+  small = 'small',
+  medium = 'medium',
+  large = 'large',
+  extraLarge = 'extra-large',
+}
+
+export const iconSizePresetValues: Record<DesktopIconSizePreset, number> = {
+  [DesktopIconSizePreset.small]: 60,
+  [DesktopIconSizePreset.medium]: 80,
+  [DesktopIconSizePreset.large]: 120,
+  [DesktopIconSizePreset.extraLarge]: 160,
 }
 
 export const vaultDeviceNameSchema = z.string().min(3).max(255)
@ -118,20 +133,22 @@ export const useVaultSettingsStore = defineStore('vaultSettingsStore', () => {
       .where(eq(schema.haexSettings.key, 'vaultName'))
   }
 
-  const readDeviceNameAsync = async (id?: string) => {
+  const readDeviceNameAsync = async (deviceId?: string) => {
     const { currentVault } = useVaultStore()
-    if (!id) return undefined
+    if (!deviceId) return undefined
 
-    const deviceName =
-      await currentVault?.drizzle?.query.haexSettings.findFirst({
-        where: and(
-          eq(schema.haexSettings.type, VaultSettingsTypeEnum.deviceName),
-          eq(schema.haexSettings.key, id),
-        ),
+    const device =
+      await currentVault?.drizzle?.query.haexDevices.findFirst({
+        where: eq(schema.haexDevices.deviceId, deviceId),
       })
 
-    return deviceName?.id ? deviceName : undefined
+    // Workaround for a Drizzle bug: findFirst sometimes returns an object with all-undefined values
+    // https://github.com/drizzle-team/drizzle-orm/issues/3872
+    // Check that the device really exists (id must be set, since it is NOT NULL)
+    if (!device?.id) return undefined
+    return device
   }
 
   const addDeviceNameAsync = async ({
@ -149,10 +166,9 @@ export const useVaultSettingsStore = defineStore('vaultSettingsStore', () => {
       return
     }
 
-    return currentVault?.drizzle?.insert(schema.haexSettings).values({
-      type: VaultSettingsTypeEnum.deviceName,
-      key: deviceId,
-      value: deviceName,
+    return currentVault?.drizzle?.insert(schema.haexDevices).values({
+      deviceId,
+      name: deviceName,
     })
   }
@ -169,14 +185,49 @@ export const useVaultSettingsStore = defineStore('vaultSettingsStore', () => {
     if (!isNameOk.success) return
 
     return currentVault?.drizzle
-      ?.update(schema.haexSettings)
+      ?.update(schema.haexDevices)
       .set({
-        value: deviceName,
+        name: deviceName,
       })
+      .where(eq(schema.haexDevices.deviceId, deviceId))
+  }
+
+  const syncDesktopIconSizeAsync = async (deviceInternalId: string) => {
+    const iconSizeRow =
+      await currentVault.value?.drizzle.query.haexSettings.findFirst({
+        where: and(
+          eq(schema.haexSettings.deviceId, deviceInternalId),
+          eq(schema.haexSettings.key, VaultSettingsKeyEnum.desktopIconSize),
+          eq(schema.haexSettings.type, VaultSettingsTypeEnum.system),
+        ),
+      })
+
+    if (!iconSizeRow?.id) {
+      // No entry yet, create one with the default (medium)
+      await currentVault.value?.drizzle.insert(schema.haexSettings).values({
+        deviceId: deviceInternalId,
+        key: VaultSettingsKeyEnum.desktopIconSize,
+        type: VaultSettingsTypeEnum.system,
+        value: DesktopIconSizePreset.medium,
+      })
+      return DesktopIconSizePreset.medium
+    }
+
+    return iconSizeRow.value as DesktopIconSizePreset
+  }
+
+  const updateDesktopIconSizeAsync = async (
+    deviceInternalId: string,
+    preset: DesktopIconSizePreset,
+  ) => {
+    return await currentVault.value?.drizzle
+      .update(schema.haexSettings)
+      .set({ value: preset })
       .where(
         and(
-          eq(schema.haexSettings.key, deviceId),
-          eq(schema.haexSettings.type, VaultSettingsTypeEnum.deviceName),
+          eq(schema.haexSettings.deviceId, deviceInternalId),
+          eq(schema.haexSettings.key, VaultSettingsKeyEnum.desktopIconSize),
+          eq(schema.haexSettings.type, VaultSettingsTypeEnum.system),
        ),
       )
   }
@ -191,5 +242,7 @@ export const useVaultSettingsStore = defineStore('vaultSettingsStore', () => {
     updateLocaleAsync,
     updateThemeAsync,
     updateVaultNameAsync,
+    syncDesktopIconSizeAsync,
+    updateDesktopIconSizeAsync,
   }
 })
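The Drizzle `findFirst` workaround in `readDeviceNameAsync` (linked issue #3872) boils down to treating a row whose NOT NULL column came back `undefined` as missing. A minimal sketch with a hypothetical, simplified row type:

```typescript
interface DeviceRow {
  id?: string // NOT NULL in the schema, so undefined means "no real row"
  deviceId?: string
  name?: string
}

export function toExistingDevice(
  row: DeviceRow | undefined,
): DeviceRow | undefined {
  // Drizzle's findFirst can return an object with all-undefined columns;
  // a set id is the reliable existence signal.
  return row?.id ? row : undefined
}
```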

View File

@ -0,0 +1,250 @@
/**
* Crypto utilities for Vault Key Management
* Implements the "Hybrid-Ansatz" for vault key encryption
*/
const PBKDF2_ITERATIONS = 600_000
const KEY_LENGTH = 256
const ALGORITHM = 'AES-GCM'
/**
* Derives a cryptographic key from a password using PBKDF2
*/
export async function deriveKeyFromPasswordAsync(
password: string,
salt: Uint8Array,
): Promise<CryptoKey> {
const encoder = new TextEncoder()
const passwordBuffer = encoder.encode(password)
// Ensure salt has a proper ArrayBuffer (not SharedArrayBuffer)
const saltBuffer = new Uint8Array(salt)
// Import password as key material
const keyMaterial = await crypto.subtle.importKey(
'raw',
passwordBuffer,
'PBKDF2',
false,
['deriveKey'],
)
// Derive key using PBKDF2
return await crypto.subtle.deriveKey(
{
name: 'PBKDF2',
salt: saltBuffer,
iterations: PBKDF2_ITERATIONS,
hash: 'SHA-256',
},
keyMaterial,
{ name: ALGORITHM, length: KEY_LENGTH },
false, // not extractable
['encrypt', 'decrypt'],
)
}
/**
* Generates a random vault key (32 bytes)
*/
export function generateVaultKey(): Uint8Array {
return crypto.getRandomValues(new Uint8Array(32))
}
/**
* Encrypts the vault key with a password-derived key
* Returns: { encryptedVaultKey, salt, nonce } all as Base64 strings
*/
export async function encryptVaultKeyAsync(
vaultKey: Uint8Array,
password: string,
): Promise<{
encryptedVaultKey: string
salt: string
nonce: string
}> {
// Generate random salt for PBKDF2
const salt = crypto.getRandomValues(new Uint8Array(32))
// Derive encryption key from password
const derivedKey = await deriveKeyFromPasswordAsync(password, salt)
// Generate random nonce for AES-GCM
const nonce = crypto.getRandomValues(new Uint8Array(12))
// Ensure vaultKey has proper ArrayBuffer
const vaultKeyBuffer = new Uint8Array(vaultKey)
// Encrypt vault key
const encryptedBuffer = await crypto.subtle.encrypt(
{
name: ALGORITHM,
iv: nonce,
},
derivedKey,
vaultKeyBuffer,
)
// Convert to Base64 for storage
return {
encryptedVaultKey: arrayBufferToBase64(encryptedBuffer),
salt: arrayBufferToBase64(salt),
nonce: arrayBufferToBase64(nonce),
}
}
/**
* Decrypts the vault key using the password
*/
export async function decryptVaultKeyAsync(
encryptedVaultKey: string,
salt: string,
nonce: string,
password: string,
): Promise<Uint8Array> {
// Convert Base64 to Uint8Array
const encryptedBuffer = base64ToArrayBuffer(encryptedVaultKey)
const saltBuffer = base64ToArrayBuffer(salt)
const nonceBuffer = base64ToArrayBuffer(nonce)
// Derive decryption key from password
const derivedKey = await deriveKeyFromPasswordAsync(password, saltBuffer)
// Ensure buffers have proper ArrayBuffer
const encryptedData = new Uint8Array(encryptedBuffer)
const iv = new Uint8Array(nonceBuffer)
// Decrypt vault key
const decryptedBuffer = await crypto.subtle.decrypt(
{
name: ALGORITHM,
iv,
},
derivedKey,
encryptedData,
)
return new Uint8Array(decryptedBuffer)
}
/**
* Encrypts CRDT log data with the vault key
*/
export async function encryptCrdtDataAsync(
data: object,
vaultKey: Uint8Array,
): Promise<{
encryptedData: string
nonce: string
}> {
// Ensure vaultKey has proper ArrayBuffer
const vaultKeyBuffer = new Uint8Array(vaultKey)
// Import vault key for encryption
const cryptoKey = await crypto.subtle.importKey(
'raw',
vaultKeyBuffer,
{ name: ALGORITHM },
false,
['encrypt'],
)
// Generate random nonce
const nonce = crypto.getRandomValues(new Uint8Array(12))
// Serialize data to JSON
const encoder = new TextEncoder()
const dataBuffer = encoder.encode(JSON.stringify(data))
// Encrypt data
const encryptedBuffer = await crypto.subtle.encrypt(
{
name: ALGORITHM,
iv: nonce,
},
cryptoKey,
dataBuffer,
)
return {
encryptedData: arrayBufferToBase64(encryptedBuffer),
nonce: arrayBufferToBase64(nonce),
}
}
/**
* Decrypts CRDT log data with the vault key
*/
export async function decryptCrdtDataAsync<T = object>(
encryptedData: string,
nonce: string,
vaultKey: Uint8Array,
): Promise<T> {
// Ensure vaultKey has proper ArrayBuffer
const vaultKeyBuffer = new Uint8Array(vaultKey)
// Import vault key for decryption
const cryptoKey = await crypto.subtle.importKey(
'raw',
vaultKeyBuffer,
{ name: ALGORITHM },
false,
['decrypt'],
)
// Convert Base64 to buffers
const encryptedBuffer = base64ToArrayBuffer(encryptedData)
const nonceBuffer = base64ToArrayBuffer(nonce)
// Ensure buffers have proper ArrayBuffer
const encryptedDataBuffer = new Uint8Array(encryptedBuffer)
const iv = new Uint8Array(nonceBuffer)
// Decrypt data
const decryptedBuffer = await crypto.subtle.decrypt(
{
name: ALGORITHM,
iv,
},
cryptoKey,
encryptedDataBuffer,
)
// Parse JSON
const decoder = new TextDecoder()
const jsonString = decoder.decode(decryptedBuffer)
return JSON.parse(jsonString) as T
}
// Utility functions for Base64 conversion
function arrayBufferToBase64(buffer: ArrayBuffer | Uint8Array): string {
const bytes = buffer instanceof Uint8Array ? buffer : new Uint8Array(buffer)
// Use Buffer for efficient base64 encoding (works in Node/Bun)
if (typeof Buffer !== 'undefined') {
return Buffer.from(bytes).toString('base64')
}
// Fallback to btoa for browser environments
let binary = ''
for (let i = 0; i < bytes.length; i++) {
const byte = bytes[i]
if (byte !== undefined) {
binary += String.fromCharCode(byte)
}
}
return btoa(binary)
}
function base64ToArrayBuffer(base64: string): Uint8Array {
// Use Buffer for efficient base64 decoding (works in Node/Bun)
if (typeof Buffer !== 'undefined') {
return new Uint8Array(Buffer.from(base64, 'base64'))
}
// Fallback to atob for browser environments
const binary = atob(base64)
const bytes = new Uint8Array(binary.length)
for (let i = 0; i < binary.length; i++) {
bytes[i] = binary.charCodeAt(i)
}
return bytes
}

src/utils/viewport.ts (new file, 63 lines)
View File

@ -0,0 +1,63 @@
// Viewport and safe area utilities
export interface ViewportDimensions {
width: number
height: number
safeAreaTop: number
safeAreaBottom: number
headerHeight: number
}
/**
* Get viewport dimensions with safe areas and header height
*/
export function getViewportDimensions(): ViewportDimensions {
const viewportWidth = window.innerWidth
const viewportHeight = window.innerHeight - 40 // Subtract header height
// Get safe-area-insets from CSS variables
const safeAreaTop = parseFloat(
getComputedStyle(document.documentElement).getPropertyValue(
'--safe-area-inset-top',
) || '0',
)
const safeAreaBottom = parseFloat(
getComputedStyle(document.documentElement).getPropertyValue(
'--safe-area-inset-bottom',
) || '0',
)
// Get header height from UI store
const { headerHeight } = useUiStore()
return {
width: viewportWidth,
height: viewportHeight,
safeAreaTop,
safeAreaBottom,
headerHeight,
}
}
/**
* Calculate available content height (viewport height minus safe areas)
* Note: viewport height already excludes header, so we only subtract safe areas
*/
export function getAvailableContentHeight(): number {
const dimensions = getViewportDimensions()
return dimensions.height - dimensions.safeAreaTop - dimensions.safeAreaBottom
}
/**
* Calculate fullscreen window dimensions (for small screens)
*/
export function getFullscreenDimensions() {
const dimensions = getViewportDimensions()
return {
x: 0,
y: 0,
width: dimensions.width,
height: getAvailableContentHeight(),
}
}
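The safe-area arithmetic above reduces to subtracting both insets from the header-adjusted viewport height. A pure-function sketch — `availableContentHeight` is a hypothetical helper extracted here so the calculation is testable without `window`:

```typescript
export function availableContentHeight(
  viewportHeight: number,
  safeAreaTop: number,
  safeAreaBottom: number,
): number {
  // viewportHeight is already header-adjusted, so only the insets remain
  return viewportHeight - safeAreaTop - safeAreaBottom
}
```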