feat: add PostgreSQL support infrastructure

- Add DatabaseType enum and database abstraction layer
- Update DataSourcesConfiguration to support PostgreSQL DataSource creation
- Create DatabaseUdfProvider interface with SQLite and PostgreSQL implementations
- Add JooqUdfHelper for database-agnostic UDF/collation functions
- Update KomgaJooqConfiguration to use dynamic SQLDialect based on database type
- Add PostgreSQL migration directory with initial migration
- Update build.gradle.kts with PostgreSQL dependencies
- Fix qualifier name mismatch in KomgaJooqConfiguration
- Begin migration of DAO classes to use JooqUdfHelper instead of SqliteUdfDataSource

Note: Compilation errors remain due to incomplete migration from old API to new abstraction layer.
commit a38aa4024f (parent 69b6b2bf93)
Author: duong.doan1
Date: 2026-04-07 11:51:08 +07:00
21 changed files with 568 additions and 112 deletions


@@ -27,6 +27,7 @@ Add PostgreSQL support to Komga, letting users choose a database
### 1.1. Add the PostgreSQL dependency
- Add the runtime dependency `org.postgresql:postgresql` to `build.gradle.kts`.
- Add the matching `jooqGenerator` dependency.
- Add the Testcontainers PostgreSQL dependency for testing.
- Keep SQLite as the default.
### 1.2. Extend KomgaProperties.Database
@@ -42,29 +43,54 @@ Add PostgreSQL support to Komga, letting users choose a database
- For PostgreSQL: use `DataSourceBuilder` with the PostgreSQL driver; no pragmas or journal mode.
- For SQLite: keep the existing logic.
- Create the corresponding DataSource bean.
- **Unit tests**: test DataSource creation for both SQLite and PostgreSQL.
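With the extended properties, selecting PostgreSQL could look like this in `application.yml` (the key names follow the new `KomgaProperties.Database` fields added in this commit; host, database name, and credentials are placeholders):

```yaml
komga:
  database:
    type: POSTGRESQL
    url: jdbc:postgresql://localhost:5432/komga
    username: komga
    password: changeme
```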
### 1.4. Create an abstract UDF/collation provider
- Create a `DatabaseDialectProvider` interface with the methods `collationUnicode3()`, `regexpFunction()`, and `stripAccentsFunction()`.
- Create implementations for SQLite (`SqliteDialectProvider`) and PostgreSQL (`PostgresDialectProvider`).
- PostgreSQL: use the `unaccent` extension for accent stripping, ICU collations for Unicode collation, and the `~*` operator for regexp matching.
- Register the bean matching the configured database type.
- **Unit tests**: test each implementation.
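A stdlib-only sketch of the provider split (the names follow the plan; the interface actually added in this commit operates on jOOQ `Field<String>` rather than raw SQL strings):

```kotlin
// Sketch: each provider renders an equivalent SQL fragment for its dialect.
interface DatabaseDialectProvider {
    fun stripAccentsFunction(expr: String): String
    fun regexpCondition(expr: String, pattern: String): String
}

class SqliteDialectProvider : DatabaseDialectProvider {
    // SQLite relies on custom UDFs registered on every connection
    override fun stripAccentsFunction(expr: String) = "UDF_STRIP_ACCENTS($expr)"
    override fun regexpCondition(expr: String, pattern: String) = "$expr REGEXP '$pattern'"
}

class PostgresDialectProvider : DatabaseDialectProvider {
    // PostgreSQL uses the unaccent extension and the case-insensitive ~* operator
    override fun stripAccentsFunction(expr: String) = "unaccent($expr)"
    override fun regexpCondition(expr: String, pattern: String) = "$expr ~* '$pattern'"
}

fun main() {
    val pg = PostgresDialectProvider()
    println(pg.stripAccentsFunction("TITLE"))   // unaccent(TITLE)
    println(pg.regexpCondition("NAME", "^vol")) // NAME ~* '^vol'
}
```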
### 1.5. Update KomgaJooqConfiguration
- Inject `KomgaProperties` to read `database.type`.
- Set the dialect dynamically: `SQLDialect.SQLITE` or `SQLDialect.POSTGRES`.
- Update `createDslContext` to use the matching dialect.
- **Unit tests**: test DSLContext creation with both dialects.
### 1.6. Update the Flyway configuration
- `spring.flyway.locations` already uses `{vendor}`, which automatically selects the `postgresql` directory when the dialect is PostgreSQL.
- Make sure Flyway detects the vendor correctly.
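Spring Boot resolves the `{vendor}` placeholder from the vendor detected on the DataSource, so a single location entry can serve both backends. The path below is the conventional Spring Boot layout, not necessarily Komga's actual one:

```yaml
spring:
  flyway:
    locations: classpath:db/migration/{vendor}
```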
### 1.7. Integration testing setup
- Create a PostgreSQL test configuration using Testcontainers.
- Create a basic integration test that boots the application against PostgreSQL.
- Make sure Flyway migrations run on PostgreSQL (no real migrations yet; just test the connection).
### 1.8. Python API test script
- Create a Python script that exercises the main API endpoints:
  - Health check (`/actuator/health`)
  - Authentication endpoints
  - Library endpoints
  - Series endpoints
  - Book endpoints
- Run the script against the SQLite backend (the default) to make sure nothing breaks.
- It can be extended later to run against PostgreSQL.
### 1.9. Backward compatibility verification
- Make sure the application still runs with the current SQLite configuration.
- Test with the `dev,noclaim` profile.
- Verify the basic features.
### Sprint 1 deliverables
- PostgreSQL dependency added.
- Database type configuration working.
- DataSources created correctly for PostgreSQL.
- Abstract dialect provider with unit tests.
- Dynamic jOOQ dialect with unit tests.
- Integration test setup with Testcontainers PostgreSQL.
- Python API test script.
- Application still works with SQLite (backward compatibility).
## Sprint 2: Database migration and jOOQ generation (Week 2)
@@ -144,9 +170,58 @@ Add PostgreSQL support to Komga, letting users choose a database
## Risks and dependencies
1. **Performance**: PostgreSQL can behave differently from SQLite; tuning may be needed.
2. **UDF/collation**: ensure equivalent behavior, especially Unicode collation and accent stripping.
3. **Migration complexity**: the migration count is large (91 files) and needs careful conversion.
4. **Testing**: CI must be set up to cover both databases.
5. **PostgreSQL extensions**: requires the `unaccent` extension and ICU collation support.
6. **Backward compatibility**: SQLite behavior must stay unchanged.
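For risk 5, the required server-side objects can be provisioned up front. A sketch, assuming sufficient privileges; the collation name `collation_unicode_3` is a placeholder (PostgreSQL lower-cases unquoted identifiers), not a name taken from this commit:

```sql
-- One-time setup on the PostgreSQL server (names are illustrative)
CREATE EXTENSION IF NOT EXISTS unaccent;
CREATE COLLATION IF NOT EXISTS collation_unicode_3 (provider = icu, locale = 'und');
```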
## Sprint 1 task details
### Task 1.1: Add dependencies
- File: `komga/build.gradle.kts`
- Add: `runtimeOnly("org.postgresql:postgresql")`
- Add: `testImplementation("org.testcontainers:postgresql")`
- Update the jooq generator dependency if needed.
### Task 1.2: Extend KomgaProperties
- File: `komga/src/main/kotlin/org/gotson/komga/infrastructure/configuration/KomgaProperties.kt`
- Add a `DatabaseType` enum.
- Add the new properties; keep `file` for backward compatibility.
- Handle auto-detection of the type from the URL/file.
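The auto-detection can be as simple as inspecting the JDBC URL prefix. A sketch; the helper name and the fallback-to-SQLite rule are assumptions, not taken from this commit (the `DatabaseType` enum is redeclared here only to keep the example self-contained):

```kotlin
enum class DatabaseType { SQLITE, POSTGRESQL }

// Hypothetical helper: pick the database type from an explicit JDBC URL,
// falling back to SQLITE when only the legacy `file` property is set.
fun detectDatabaseType(url: String?, file: String?): DatabaseType =
    when {
        url?.startsWith("jdbc:postgresql:") == true -> DatabaseType.POSTGRESQL
        url?.startsWith("jdbc:sqlite:") == true -> DatabaseType.SQLITE
        !file.isNullOrBlank() -> DatabaseType.SQLITE
        else -> DatabaseType.SQLITE // preserve current default behavior
    }

fun main() {
    println(detectDatabaseType("jdbc:postgresql://localhost:5432/komga", null)) // POSTGRESQL
    println(detectDatabaseType(null, "./database.sqlite"))                      // SQLITE
}
```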
### Task 1.3: DataSourcesConfiguration
- File: `komga/src/main/kotlin/org/gotson/komga/infrastructure/datasource/DataSourcesConfiguration.kt`
- Refactor `buildDataSource` into two methods.
- Unit tests: extend `DataSourcesConfigurationTest`.
### Task 1.4: DatabaseDialectProvider
- New file: `komga/src/main/kotlin/org/gotson/komga/infrastructure/dialect/DatabaseDialectProvider.kt`
- Interface and the two implementations.
- Configuration to register the right bean.
- Unit tests for each implementation.
### Task 1.5: KomgaJooqConfiguration
- File: `komga/src/main/kotlin/org/gotson/komga/infrastructure/jooq/KomgaJooqConfiguration.kt`
- Inject `KomgaProperties` and set the dialect dynamically.
- Unit tests.
### Task 1.6: Integration tests
- File: `komga/src/test/kotlin/org/gotson/komga/infrastructure/PostgresIntegrationTest.kt`
- Use Testcontainers PostgreSQL.
- Test application startup against PostgreSQL.
- Test the Flyway connection.
### Task 1.7: Python API test script
- File: `test_api_baseline.py`
- Test the main endpoints with the requests library.
- Run against the SQLite backend with the dev profile.
- Output: a pass/fail report.
### Task 1.8: Verification
- Run the application with the `dev,noclaim` profile.
- Manually test the basic features.
- Make sure there are no regressions.
## Main files to modify
1. `komga/build.gradle.kts`
@@ -161,10 +236,19 @@ Add PostgreSQL support to Komga, letting users choose a database
10. Create `src/flyway/kotlin/db/migration/postgresql/`
## Effort estimate
- Sprint 1: 50 hours (including unit tests, integration tests, and the Python script)
- Sprint 2: 60 hours (heavy migration workload)
- Sprint 3: 50 hours
- Total: ~160 hours
## Sprint 1 success criteria
1. The application starts with the current SQLite configuration (no regressions).
2. The PostgreSQL configuration is parsed and validated correctly.
3. DataSources are created successfully for PostgreSQL (tested with Testcontainers).
4. The DatabaseDialectProvider implementations work.
5. The jOOQ dialect is set dynamically.
6. The Python API test script passes against the SQLite backend.
7. All unit and integration tests pass.
## Expected outcome
Komga supports both SQLite and PostgreSQL, letting users pick the database that fits their scaling needs.


@@ -1,5 +1,6 @@
[versions]
sqliteJdbc = "3.50.2.0"
postgresql = "42.7.5"
nightmonkeys = "1.0.0"
twelvemonkeys = "3.12.0"
springboot = "3.5.4"


@@ -105,7 +105,9 @@ dependencies {
implementation("com.github.ben-manes.caffeine:caffeine")
implementation("org.xerial:sqlite-jdbc:${libs.versions.sqliteJdbc.get()}")
implementation("org.postgresql:postgresql:${libs.versions.postgresql.get()}")
jooqGenerator("org.xerial:sqlite-jdbc:${libs.versions.sqliteJdbc.get()}")
jooqGenerator("org.postgresql:postgresql:${libs.versions.postgresql.get()}")
if (version.toString().endsWith(".0.0")) {
ksp("com.github.gotson.bestbefore:bestbefore-processor-kotlin:0.2.0")


@@ -0,0 +1,164 @@
CREATE TABLE LIBRARY
(
ID varchar NOT NULL PRIMARY KEY,
CREATED_DATE timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP,
LAST_MODIFIED_DATE timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP,
NAME varchar NOT NULL,
ROOT varchar NOT NULL,
IMPORT_COMICINFO_BOOK boolean NOT NULL DEFAULT true,
IMPORT_COMICINFO_SERIES boolean NOT NULL DEFAULT true,
IMPORT_COMICINFO_COLLECTION boolean NOT NULL DEFAULT true,
IMPORT_EPUB_BOOK boolean NOT NULL DEFAULT true,
IMPORT_EPUB_SERIES boolean NOT NULL DEFAULT true
);
CREATE TABLE "USER"
(
ID varchar NOT NULL PRIMARY KEY,
CREATED_DATE timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP,
LAST_MODIFIED_DATE timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP,
EMAIL varchar NOT NULL UNIQUE,
PASSWORD varchar NOT NULL,
SHARED_ALL_LIBRARIES boolean NOT NULL DEFAULT true,
ROLE_ADMIN boolean NOT NULL DEFAULT false,
ROLE_FILE_DOWNLOAD boolean NOT NULL DEFAULT true,
ROLE_PAGE_STREAMING boolean NOT NULL DEFAULT true
);
CREATE TABLE USER_LIBRARY_SHARING
(
USER_ID varchar NOT NULL,
LIBRARY_ID varchar NOT NULL,
PRIMARY KEY (USER_ID, LIBRARY_ID),
FOREIGN KEY (USER_ID) REFERENCES "USER" (ID),
FOREIGN KEY (LIBRARY_ID) REFERENCES LIBRARY (ID)
);
CREATE TABLE SERIES
(
ID varchar NOT NULL PRIMARY KEY,
CREATED_DATE timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP,
LAST_MODIFIED_DATE timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP,
FILE_LAST_MODIFIED timestamp NOT NULL,
NAME varchar NOT NULL,
URL varchar NOT NULL,
LIBRARY_ID varchar NOT NULL,
FOREIGN KEY (LIBRARY_ID) REFERENCES LIBRARY (ID)
);
CREATE TABLE SERIES_METADATA
(
CREATED_DATE timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP,
LAST_MODIFIED_DATE timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP,
STATUS varchar NOT NULL,
STATUS_LOCK boolean NOT NULL DEFAULT false,
TITLE varchar NOT NULL,
TITLE_LOCK boolean NOT NULL DEFAULT false,
TITLE_SORT varchar NOT NULL,
TITLE_SORT_LOCK boolean NOT NULL DEFAULT false,
SUMMARY text,
SUMMARY_LOCK boolean NOT NULL DEFAULT false,
PUBLISHER varchar,
PUBLISHER_LOCK boolean NOT NULL DEFAULT false,
READING_DIRECTION varchar,
READING_DIRECTION_LOCK boolean NOT NULL DEFAULT false,
AGE_RATING integer,
AGE_RATING_LOCK boolean NOT NULL DEFAULT false,
LANGUAGE varchar,
LANGUAGE_LOCK boolean NOT NULL DEFAULT false,
SERIES_ID varchar NOT NULL PRIMARY KEY,
FOREIGN KEY (SERIES_ID) REFERENCES SERIES (ID)
);
CREATE TABLE BOOK
(
ID varchar NOT NULL PRIMARY KEY,
CREATED_DATE timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP,
LAST_MODIFIED_DATE timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP,
FILE_LAST_MODIFIED timestamp NOT NULL,
NAME varchar NOT NULL,
URL varchar NOT NULL,
FILE_SIZE bigint NOT NULL,
NUMBER integer,
SERIES_ID varchar NOT NULL,
FOREIGN KEY (SERIES_ID) REFERENCES SERIES (ID)
);
CREATE TABLE BOOK_METADATA
(
CREATED_DATE timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP,
LAST_MODIFIED_DATE timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP,
TITLE varchar NOT NULL,
TITLE_LOCK boolean NOT NULL DEFAULT false,
TITLE_SORT varchar NOT NULL,
TITLE_SORT_LOCK boolean NOT NULL DEFAULT false,
SUMMARY text,
SUMMARY_LOCK boolean NOT NULL DEFAULT false,
NUMBER varchar,
NUMBER_LOCK boolean NOT NULL DEFAULT false,
NUMBER_SORT double precision,
NUMBER_SORT_LOCK boolean NOT NULL DEFAULT false,
RELEASE_DATE date,
RELEASE_DATE_LOCK boolean NOT NULL DEFAULT false,
AUTHORS text,
AUTHORS_LOCK boolean NOT NULL DEFAULT false,
BOOK_ID varchar NOT NULL PRIMARY KEY,
FOREIGN KEY (BOOK_ID) REFERENCES BOOK (ID)
);
CREATE TABLE READ_PROGRESS
(
CREATED_DATE timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP,
LAST_MODIFIED_DATE timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP,
PAGE integer NOT NULL,
COMPLETED boolean NOT NULL DEFAULT false,
USER_ID varchar NOT NULL,
BOOK_ID varchar NOT NULL,
PRIMARY KEY (USER_ID, BOOK_ID),
FOREIGN KEY (USER_ID) REFERENCES "USER" (ID),
FOREIGN KEY (BOOK_ID) REFERENCES BOOK (ID)
);
CREATE TABLE READLIST
(
ID varchar NOT NULL PRIMARY KEY,
CREATED_DATE timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP,
LAST_MODIFIED_DATE timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP,
NAME varchar NOT NULL,
SUMMARY text,
FILTERED boolean NOT NULL DEFAULT false,
USER_ID varchar NOT NULL,
FOREIGN KEY (USER_ID) REFERENCES "USER" (ID)
);
CREATE TABLE READLIST_BOOKS
(
READLIST_ID varchar NOT NULL,
BOOK_ID varchar NOT NULL,
NUMBER integer NOT NULL,
PRIMARY KEY (READLIST_ID, BOOK_ID),
FOREIGN KEY (READLIST_ID) REFERENCES READLIST (ID),
FOREIGN KEY (BOOK_ID) REFERENCES BOOK (ID)
);
CREATE TABLE COLLECTION
(
ID varchar NOT NULL PRIMARY KEY,
CREATED_DATE timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP,
LAST_MODIFIED_DATE timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP,
NAME varchar NOT NULL,
SUMMARY text,
FILTERED boolean NOT NULL DEFAULT false,
USER_ID varchar NOT NULL,
FOREIGN KEY (USER_ID) REFERENCES "USER" (ID)
);
CREATE TABLE COLLECTION_SERIES
(
COLLECTION_ID varchar NOT NULL,
SERIES_ID varchar NOT NULL,
NUMBER integer NOT NULL,
PRIMARY KEY (COLLECTION_ID, SERIES_ID),
FOREIGN KEY (COLLECTION_ID) REFERENCES COLLECTION (ID),
FOREIGN KEY (SERIES_ID) REFERENCES SERIES (ID)
);


@@ -75,6 +75,12 @@ class KomgaProperties {
var pragmas: Map<String, String> = emptyMap()
var checkLocalFilesystem: Boolean = true
var type: org.gotson.komga.infrastructure.datasource.DatabaseType = org.gotson.komga.infrastructure.datasource.DatabaseType.SQLITE
var url: String? = null
var username: String? = null
var password: String? = null
}
class Fonts {


@@ -3,6 +3,7 @@ package org.gotson.komga.infrastructure.datasource
import com.zaxxer.hikari.HikariConfig
import com.zaxxer.hikari.HikariDataSource
import org.gotson.komga.infrastructure.configuration.KomgaProperties
import org.postgresql.ds.PGSimpleDataSource
import org.springframework.boot.jdbc.DataSourceBuilder
import org.springframework.context.annotation.Bean
import org.springframework.context.annotation.Configuration
@@ -15,25 +16,25 @@ import javax.sql.DataSource
class DataSourcesConfiguration(
private val komgaProperties: KomgaProperties,
) {
@Bean("sqliteDataSourceRW")
@Bean("mainDataSourceRW")
@Primary
fun sqliteDataSourceRW(): DataSource =
buildDataSource("SqliteMainPoolRW", SqliteUdfDataSource::class.java, komgaProperties.database)
fun mainDataSourceRW(): DataSource =
buildDataSource("MainPoolRW", komgaProperties.database)
.apply {
// force pool size to 1 if the pool is only used for writes
if (komgaProperties.database.shouldSeparateReadFromWrites()) this.maximumPoolSize = 1
}
@Bean("sqliteDataSourceRO")
fun sqliteDataSourceRO(): DataSource =
@Bean("mainDataSourceRO")
fun mainDataSourceRO(): DataSource =
if (komgaProperties.database.shouldSeparateReadFromWrites())
buildDataSource("SqliteMainPoolRO", SqliteUdfDataSource::class.java, komgaProperties.database)
buildDataSource("MainPoolRO", komgaProperties.database)
else
sqliteDataSourceRW()
mainDataSourceRW()
@Bean("tasksDataSourceRW")
fun tasksDataSourceRW(): DataSource =
buildDataSource("SqliteTasksPoolRW", SQLiteDataSource::class.java, komgaProperties.tasksDb)
buildDataSource("TasksPoolRW", komgaProperties.tasksDb)
.apply {
// pool size is always 1:
// - if there's only 1 pool for read and writes, size should be 1
@@ -44,13 +45,22 @@ class DataSourcesConfiguration(
@Bean("tasksDataSourceRO")
fun tasksDataSourceRO(): DataSource =
if (komgaProperties.tasksDb.shouldSeparateReadFromWrites())
buildDataSource("SqliteTasksPoolRO", SQLiteDataSource::class.java, komgaProperties.tasksDb)
buildDataSource("TasksPoolRO", komgaProperties.tasksDb)
else
tasksDataSourceRW()
private fun buildDataSource(
poolName: String,
dataSourceClass: Class<out SQLiteDataSource>,
databaseProps: KomgaProperties.Database,
): HikariDataSource {
return when (databaseProps.type) {
DatabaseType.SQLITE -> buildSqliteDataSource(poolName, databaseProps)
DatabaseType.POSTGRESQL -> buildPostgresDataSource(poolName, databaseProps)
}
}
private fun buildSqliteDataSource(
poolName: String,
databaseProps: KomgaProperties.Database,
): HikariDataSource {
val extraPragmas =
@@ -66,10 +76,10 @@ class DataSourcesConfiguration(
.create()
.driverClassName("org.sqlite.JDBC")
.url("jdbc:sqlite:${databaseProps.file}$extraPragmas")
.type(dataSourceClass)
.type(SqliteUdfDataSource::class.java)
.build()
with(dataSource) {
with(dataSource as SqliteUdfDataSource) {
setEnforceForeignKeys(true)
setGetGeneratedKeys(false)
}
@@ -95,7 +105,40 @@ class DataSourcesConfiguration(
)
}
private fun buildPostgresDataSource(
poolName: String,
databaseProps: KomgaProperties.Database,
): HikariDataSource {
val dataSource = PGSimpleDataSource().apply {
databaseProps.url?.let { setURL(it) }
databaseProps.username?.let { user = it }
databaseProps.password?.let { password = it }
}
val poolSize =
if (databaseProps.poolSize != null)
databaseProps.poolSize!!
else
Runtime.getRuntime().availableProcessors().coerceAtMost(databaseProps.maxPoolSize)
return HikariDataSource(
HikariConfig().apply {
this.dataSource = dataSource
this.poolName = poolName
this.maximumPoolSize = poolSize
this.minimumIdle = 1
this.connectionTimeout = 30000
this.idleTimeout = 600000
this.maxLifetime = 1800000
},
)
}
fun KomgaProperties.Database.isMemory() = file.contains(":memory:") || file.contains("mode=memory")
fun KomgaProperties.Database.shouldSeparateReadFromWrites(): Boolean = !isMemory() && journalMode == SQLiteConfig.JournalMode.WAL
fun KomgaProperties.Database.shouldSeparateReadFromWrites(): Boolean =
when (type) {
DatabaseType.SQLITE -> !isMemory() && journalMode == SQLiteConfig.JournalMode.WAL
DatabaseType.POSTGRESQL -> false // PostgreSQL doesn't need separate read/write pools
}
}


@@ -0,0 +1,12 @@
package org.gotson.komga.infrastructure.datasource
object DatabaseCompatibility {
// These constants are maintained for backward compatibility
// They will be dynamically resolved based on the active database type
@Deprecated("Use DatabaseUdfProvider instead", ReplaceWith("databaseUdfProvider.udfStripAccentsName"))
const val UDF_STRIP_ACCENTS = "UDF_STRIP_ACCENTS"
@Deprecated("Use DatabaseUdfProvider instead", ReplaceWith("databaseUdfProvider.collationUnicode3Name"))
const val COLLATION_UNICODE_3 = "COLLATION_UNICODE_3"
}


@@ -0,0 +1,6 @@
package org.gotson.komga.infrastructure.datasource
enum class DatabaseType {
SQLITE,
POSTGRESQL
}


@@ -0,0 +1,14 @@
package org.gotson.komga.infrastructure.datasource
import org.jooq.Field
import org.jooq.impl.DSL
interface DatabaseUdfProvider {
val udfStripAccentsName: String
val collationUnicode3Name: String
fun Field<String>.udfStripAccents(): Field<String>
fun Field<String>.collateUnicode3(): Field<String>
fun initializeConnection(connection: Any)
}


@@ -0,0 +1,26 @@
package org.gotson.komga.infrastructure.datasource
import org.gotson.komga.infrastructure.configuration.KomgaProperties
import org.springframework.context.annotation.Bean
import org.springframework.context.annotation.Configuration
@Configuration
class DatabaseUdfProviderConfiguration(
private val komgaProperties: KomgaProperties
) {
@Bean
fun databaseUdfProvider(): DatabaseUdfProvider {
return when (komgaProperties.database.type) {
DatabaseType.SQLITE -> SqliteUdfProvider()
DatabaseType.POSTGRESQL -> PostgresUdfProvider()
}
}
@Bean
fun tasksDatabaseUdfProvider(): DatabaseUdfProvider {
return when (komgaProperties.tasksDb.type) {
DatabaseType.SQLITE -> SqliteUdfProvider()
DatabaseType.POSTGRESQL -> PostgresUdfProvider()
}
}
}


@@ -0,0 +1,47 @@
package org.gotson.komga.infrastructure.datasource
import io.github.oshai.kotlinlogging.KotlinLogging
import org.jooq.Field
import org.jooq.impl.DSL
import java.sql.Connection
private val log = KotlinLogging.logger {}
class PostgresUdfProvider : DatabaseUdfProvider {
override val udfStripAccentsName = "UDF_STRIP_ACCENTS"
override val collationUnicode3Name = "COLLATION_UNICODE_3"
override fun Field<String>.udfStripAccents(): Field<String> =
// PostgreSQL has unaccent extension, but we'll implement it in application layer
// For now, we'll create a placeholder function
DSL.function(udfStripAccentsName, String::class.java, this)
override fun Field<String>.collateUnicode3(): Field<String> =
// PostgreSQL uses ICU collations, we'll use "und-u-ks-level2" for Unicode collation
this.collate("und-u-ks-level2")
override fun initializeConnection(connection: Any) {
val pgConnection = connection as Connection
log.debug { "Initializing PostgreSQL connection with custom functions" }
// Create the strip accents function if it doesn't exist
val createFunctionSQL = """
CREATE OR REPLACE FUNCTION $udfStripAccentsName(text TEXT)
RETURNS TEXT AS $$
BEGIN
-- This is a placeholder. In production, you might want to:
-- 1. Use the unaccent extension: SELECT unaccent(text)
-- 2. Or implement custom logic in application layer
RETURN text;
END;
$$ LANGUAGE plpgsql IMMUTABLE;
""".trimIndent()
try {
pgConnection.createStatement().execute(createFunctionSQL)
log.debug { "Created PostgreSQL function $udfStripAccentsName" }
} catch (e: Exception) {
log.error(e) { "Failed to create PostgreSQL function $udfStripAccentsName" }
}
}
}


@@ -1,83 +1,27 @@
package org.gotson.komga.infrastructure.datasource
import com.ibm.icu.text.Collator
import io.github.oshai.kotlinlogging.KotlinLogging
import org.gotson.komga.language.stripAccents
import org.sqlite.Collation
import org.sqlite.Function
import org.sqlite.SQLiteConnection
import org.sqlite.SQLiteDataSource
import java.sql.Connection
private val log = KotlinLogging.logger {}
class SqliteUdfDataSource : SQLiteDataSource() {
companion object {
const val UDF_STRIP_ACCENTS = "UDF_STRIP_ACCENTS"
const val COLLATION_UNICODE_3 = "COLLATION_UNICODE_3"
}
companion object {
// These constants are maintained for backward compatibility
// In a future version, they should be replaced with DatabaseUdfProvider
const val UDF_STRIP_ACCENTS = "UDF_STRIP_ACCENTS"
const val COLLATION_UNICODE_3 = "COLLATION_UNICODE_3"
}
private val udfProvider = SqliteUdfProvider()
override fun getConnection(): Connection = super.getConnection().also { addAllUdf(it as SQLiteConnection) }
override fun getConnection(): Connection = super.getConnection().also {
udfProvider.initializeConnection(it as SQLiteConnection)
}
override fun getConnection(
username: String?,
password: String?,
): SQLiteConnection = super.getConnection(username, password).also { addAllUdf(it) }
private fun addAllUdf(connection: SQLiteConnection) {
createUdfRegexp(connection)
createUdfStripAccents(connection)
createUnicode3Collation(connection)
}
private fun createUdfRegexp(connection: SQLiteConnection) {
log.debug { "Adding custom REGEXP function" }
Function.create(
connection,
"REGEXP",
object : Function() {
override fun xFunc() {
val regexp = (value_text(0) ?: "").toRegex(RegexOption.IGNORE_CASE)
val text = value_text(1) ?: ""
result(if (regexp.containsMatchIn(text)) 1 else 0)
}
},
)
}
private fun createUdfStripAccents(connection: SQLiteConnection) {
log.debug { "Adding custom $UDF_STRIP_ACCENTS function" }
Function.create(
connection,
UDF_STRIP_ACCENTS,
object : Function() {
override fun xFunc() =
when (val text = value_text(0)) {
null -> error("Argument must not be null")
else -> result(text.stripAccents())
}
},
)
}
private fun createUnicode3Collation(connection: SQLiteConnection) {
log.debug { "Adding custom $COLLATION_UNICODE_3 collation" }
Collation.create(
connection,
COLLATION_UNICODE_3,
object : Collation() {
val collator =
Collator.getInstance().apply {
strength = Collator.TERTIARY
decomposition = Collator.CANONICAL_DECOMPOSITION
}
override fun xCompare(
str1: String,
str2: String,
): Int = collator.compare(str1, str2)
},
)
}
override fun getConnection(
username: String?,
password: String?,
): SQLiteConnection = super.getConnection(username, password).also {
udfProvider.initializeConnection(it)
}
}


@@ -0,0 +1,82 @@
package org.gotson.komga.infrastructure.datasource
import com.ibm.icu.text.Collator
import io.github.oshai.kotlinlogging.KotlinLogging
import org.gotson.komga.language.stripAccents
import org.jooq.Field
import org.jooq.impl.DSL
import org.sqlite.Collation
import org.sqlite.Function
import org.sqlite.SQLiteConnection
import java.sql.Connection
private val log = KotlinLogging.logger {}
class SqliteUdfProvider : DatabaseUdfProvider {
override val udfStripAccentsName = "UDF_STRIP_ACCENTS"
override val collationUnicode3Name = "COLLATION_UNICODE_3"
override fun Field<String>.udfStripAccents(): Field<String> =
DSL.function(udfStripAccentsName, String::class.java, this)
override fun Field<String>.collateUnicode3(): Field<String> =
this.collate(collationUnicode3Name)
override fun initializeConnection(connection: Any) {
val sqliteConnection = connection as SQLiteConnection
createUdfRegexp(sqliteConnection)
createUdfStripAccents(sqliteConnection)
createUnicode3Collation(sqliteConnection)
}
private fun createUdfRegexp(connection: SQLiteConnection) {
log.debug { "Adding custom REGEXP function" }
Function.create(
connection,
"REGEXP",
object : Function() {
override fun xFunc() {
val regexp = (value_text(0) ?: "").toRegex(RegexOption.IGNORE_CASE)
val text = value_text(1) ?: ""
result(if (regexp.containsMatchIn(text)) 1 else 0)
}
},
)
}
private fun createUdfStripAccents(connection: SQLiteConnection) {
log.debug { "Adding custom $udfStripAccentsName function" }
Function.create(
connection,
udfStripAccentsName,
object : Function() {
override fun xFunc() =
when (val text = value_text(0)) {
null -> error("Argument must not be null")
else -> result(text.stripAccents())
}
},
)
}
private fun createUnicode3Collation(connection: SQLiteConnection) {
log.debug { "Adding custom $collationUnicode3Name collation" }
Collation.create(
connection,
collationUnicode3Name,
object : Collation() {
val collator =
Collator.getInstance().apply {
strength = Collator.TERTIARY
decomposition = Collator.CANONICAL_DECOMPOSITION
}
override fun xCompare(
str1: String,
str2: String,
): Int = collator.compare(str1, str2)
},
)
}
}


@@ -0,0 +1,14 @@
package org.gotson.komga.infrastructure.jooq
import org.gotson.komga.infrastructure.datasource.DatabaseUdfProvider
import org.jooq.Field
import org.springframework.stereotype.Component
@Component
class JooqUdfHelper(
private val databaseUdfProvider: DatabaseUdfProvider
) {
fun Field<String>.udfStripAccents(): Field<String> = databaseUdfProvider.run { this@udfStripAccents.udfStripAccents() }
fun Field<String>.collateUnicode3(): Field<String> = databaseUdfProvider.run { this@collateUnicode3.collateUnicode3() }
}


@@ -1,5 +1,7 @@
package org.gotson.komga.infrastructure.jooq
import org.gotson.komga.infrastructure.configuration.KomgaProperties
import org.gotson.komga.infrastructure.datasource.DatabaseType
import org.jooq.DSLContext
import org.jooq.ExecuteListenerProvider
import org.jooq.SQLDialect
@@ -18,43 +20,49 @@ import javax.sql.DataSource
// taken from https://github.com/spring-projects/spring-boot/blob/v3.1.4/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/jooq/JooqAutoConfiguration.java
// as advised in https://docs.spring.io/spring-boot/docs/3.1.4/reference/htmlsingle/#howto.data-access.configure-jooq-with-multiple-datasources
@Configuration
class KomgaJooqConfiguration {
class KomgaJooqConfiguration(
private val komgaProperties: KomgaProperties
) {
@Bean("dslContextRW")
@Primary
fun mainDslContextRW(
dataSource: DataSource,
transactionProvider: ObjectProvider<TransactionProvider?>,
executeListenerProviders: ObjectProvider<ExecuteListenerProvider?>,
): DSLContext = createDslContext(dataSource, transactionProvider, executeListenerProviders)
): DSLContext = createDslContext(dataSource, transactionProvider, executeListenerProviders, komgaProperties.database.type)
@Bean("dslContextRO")
fun mainDslContextRO(
@Qualifier("sqliteDataSourceRO") dataSource: DataSource,
@Qualifier("mainDataSourceRO") dataSource: DataSource,
transactionProvider: ObjectProvider<TransactionProvider?>,
executeListenerProviders: ObjectProvider<ExecuteListenerProvider?>,
): DSLContext = createDslContext(dataSource, transactionProvider, executeListenerProviders)
): DSLContext = createDslContext(dataSource, transactionProvider, executeListenerProviders, komgaProperties.database.type)
@Bean("tasksDslContextRW")
fun tasksDslContextRW(
@Qualifier("tasksDataSourceRW") dataSource: DataSource,
transactionProvider: ObjectProvider<TransactionProvider?>,
executeListenerProviders: ObjectProvider<ExecuteListenerProvider?>,
): DSLContext = createDslContext(dataSource, transactionProvider, executeListenerProviders)
): DSLContext = createDslContext(dataSource, transactionProvider, executeListenerProviders, komgaProperties.tasksDb.type)
@Bean("tasksDslContextRO")
fun tasksDslContextRO(
@Qualifier("tasksDataSourceRO") dataSource: DataSource,
transactionProvider: ObjectProvider<TransactionProvider?>,
executeListenerProviders: ObjectProvider<ExecuteListenerProvider?>,
): DSLContext = createDslContext(dataSource, transactionProvider, executeListenerProviders)
): DSLContext = createDslContext(dataSource, transactionProvider, executeListenerProviders, komgaProperties.tasksDb.type)
private fun createDslContext(
dataSource: DataSource,
transactionProvider: ObjectProvider<TransactionProvider?>,
executeListenerProviders: ObjectProvider<ExecuteListenerProvider?>,
databaseType: DatabaseType,
) = DefaultDSLContext(
DefaultConfiguration().also { configuration ->
configuration.set(SQLDialect.SQLITE)
configuration.set(when (databaseType) {
DatabaseType.SQLITE -> SQLDialect.SQLITE
DatabaseType.POSTGRESQL -> SQLDialect.POSTGRES
})
configuration.set(DataSourceConnectionProvider(TransactionAwareDataSourceProxy(dataSource)))
transactionProvider.ifAvailable { newTransactionProvider: TransactionProvider? -> configuration.set(newTransactionProvider) }
configuration.set(*executeListenerProviders.orderedStream().toList().toTypedArray())


@@ -4,7 +4,7 @@ import com.fasterxml.jackson.databind.ObjectMapper
import org.gotson.komga.domain.model.AllowExclude
import org.gotson.komga.domain.model.ContentRestrictions
import org.gotson.komga.domain.model.MediaExtension
import org.gotson.komga.infrastructure.datasource.SqliteUdfDataSource
import org.gotson.komga.infrastructure.datasource.DatabaseUdfProvider
import org.gotson.komga.jooq.main.Tables
import org.jooq.Condition
import org.jooq.Field
@@ -44,7 +44,8 @@ fun Field<String>.inOrNoCondition(list: Collection<String>?): Condition =
else -> this.`in`(list)
}
fun Field<String>.udfStripAccents() = DSL.function(SqliteUdfDataSource.UDF_STRIP_ACCENTS, String::class.java, this)
// Moved to JooqUdfHelper.kt
// fun Field<String>.udfStripAccents() = DSL.function(SqliteUdfDataSource.UDF_STRIP_ACCENTS, String::class.java, this)
fun ContentRestrictions.toCondition(): Condition {
val ageAllowed =


@@ -4,8 +4,8 @@ import org.gotson.komga.domain.model.BookSearch
import org.gotson.komga.domain.model.ContentRestrictions
import org.gotson.komga.domain.model.ReadList
import org.gotson.komga.domain.model.SearchContext
import org.gotson.komga.infrastructure.datasource.SqliteUdfDataSource
import org.gotson.komga.infrastructure.jooq.BookSearchHelper
import org.gotson.komga.infrastructure.jooq.JooqUdfHelper
import org.gotson.komga.infrastructure.jooq.RequiredJoin
import org.gotson.komga.infrastructure.jooq.SplitDslDaoBase
import org.gotson.komga.infrastructure.jooq.TempTable
@@ -55,6 +55,7 @@ class BookDtoDao(
dslRW: DSLContext,
@Qualifier("dslContextRO") dslRO: DSLContext,
private val luceneHelper: LuceneHelper,
private val jooqUdfHelper: JooqUdfHelper,
@param:Value("#{@komgaProperties.database.batchChunkSize}") private val batchSize: Int,
private val bookCommonDao: BookCommonDao,
) : SplitDslDaoBase(dslRW, dslRO),
@@ -73,8 +74,8 @@ class BookDtoDao(
private val sorts =
mapOf(
"name" to b.NAME.collate(SqliteUdfDataSource.COLLATION_UNICODE_3),
"series" to sd.TITLE_SORT.collate(SqliteUdfDataSource.COLLATION_UNICODE_3),
"name" to jooqUdfHelper.run { b.NAME.collateUnicode3() },
"series" to jooqUdfHelper.run { sd.TITLE_SORT.collateUnicode3() },
"created" to b.CREATED_DATE,
"createdDate" to b.CREATED_DATE,
"lastModified" to b.LAST_MODIFIED_DATE,
@@ -87,7 +88,7 @@ class BookDtoDao(
"media.comment" to m.COMMENT.noCase(),
"media.mediaType" to m.MEDIA_TYPE.noCase(),
"media.pagesCount" to m.PAGE_COUNT,
"metadata.title" to d.TITLE.collate(SqliteUdfDataSource.COLLATION_UNICODE_3),
"metadata.title" to jooqUdfHelper.run { d.TITLE.collateUnicode3() },
"metadata.numberSort" to d.NUMBER_SORT,
"metadata.releaseDate" to d.RELEASE_DATE,
"readProgress.lastModified" to r.LAST_MODIFIED_DATE,


@@ -2,9 +2,8 @@ package org.gotson.komga.infrastructure.jooq.main
import org.gotson.komga.domain.model.Author
import org.gotson.komga.domain.persistence.ReferentialRepository
import org.gotson.komga.infrastructure.datasource.SqliteUdfDataSource
import org.gotson.komga.infrastructure.jooq.JooqUdfHelper
import org.gotson.komga.infrastructure.jooq.SplitDslDaoBase
import org.gotson.komga.infrastructure.jooq.udfStripAccents
import org.gotson.komga.jooq.main.Tables
import org.gotson.komga.jooq.main.tables.records.BookMetadataAggregationAuthorRecord
import org.gotson.komga.jooq.main.tables.records.BookMetadataAuthorRecord
@@ -25,6 +24,7 @@ import java.time.LocalDate
class ReferentialDao(
dslRW: DSLContext,
@Qualifier("dslContextRO") dslRO: DSLContext,
private val jooqUdfHelper: JooqUdfHelper,
) : SplitDslDaoBase(dslRW, dslRO),
ReferentialRepository {
private val a = Tables.BOOK_METADATA_AUTHOR


@@ -4,6 +4,7 @@ import org.gotson.komga.domain.model.SearchContext
import org.gotson.komga.domain.model.SearchField
import org.gotson.komga.domain.model.SeriesSearch
import org.gotson.komga.infrastructure.datasource.SqliteUdfDataSource
import org.gotson.komga.infrastructure.jooq.JooqUdfHelper
import org.gotson.komga.infrastructure.jooq.RequiredJoin
import org.gotson.komga.infrastructure.jooq.SeriesSearchHelper
import org.gotson.komga.infrastructure.jooq.SplitDslDaoBase


@@ -51,9 +51,9 @@ spring:
accept-case-insensitive-values: true
config:
import:
- "optional:file:\${komga.config-dir}/application.yml"
- "optional:file:\${komga.config-dir}/application.yaml"
- "optional:file:\${komga.config-dir}/application.properties"
- "optional:file:${komga.config-dir}/application.yml"
- "optional:file:${komga.config-dir}/application.yaml"
- "optional:file:${komga.config-dir}/application.properties"
http:
codecs:
max-in-memory-size: 10MB


@@ -14,7 +14,7 @@ class DataSourcesConfigurationTest {
@Nested
inner class WalMode(
@Autowired private val dataSourceRW: DataSource,
@Autowired @Qualifier("sqliteDataSourceRO") private val dataSourceRO: DataSource,
@Autowired @Qualifier("mainDataSourceRO") private val dataSourceRO: DataSource,
@Autowired @Qualifier("tasksDataSourceRW") private val tasksDataSourceRW: DataSource,
@Autowired @Qualifier("tasksDataSourceRO") private val tasksDataSourceRO: DataSource,
) {
@@ -30,7 +30,7 @@ class DataSourcesConfigurationTest {
@Nested
inner class MemoryMode(
@Autowired private val dataSourceRW: DataSource,
@Autowired @Qualifier("sqliteDataSourceRO") private val dataSourceRO: DataSource,
@Autowired @Qualifier("mainDataSourceRO") private val dataSourceRO: DataSource,
@Autowired @Qualifier("tasksDataSourceRW") private val tasksDataSourceRW: DataSource,
@Autowired @Qualifier("tasksDataSourceRO") private val tasksDataSourceRO: DataSource,
) {