Merge branch 'master' into feat/viewport-fix

Angry Toenail 2026-05-06 19:16:15 +01:00
commit dcfb704256
No known key found for this signature in database
900 changed files with 71225 additions and 26276 deletions

@@ -1,40 +0,0 @@
---
name: Bug report
about: Create a report to help us improve
title: "[Bug Report] Short Form Subject (50 Chars or less)"
labels: bug report
assignees: ''
---
**Describe the bug**
A clear and concise description of what the bug is.
**To Reproduce**
Steps to reproduce the behavior:
1. Go to '...'
2. Click on '....'
3. Scroll down to '....'
4. See error
**Expected behavior**
A clear and concise description of what you expected to happen.
**Screenshots**
If applicable, add screenshots to help explain your problem. Please ensure that your screenshots are SFW or at least appropriately censored.
**Stash Version: (from Settings -> About):**
**Desktop (please complete the following information):**
- OS: [e.g. iOS]
- Browser [e.g. chrome, safari]
- Version [e.g. 22]
**Smartphone (please complete the following information):**
- Device: [e.g. iPhone6]
- OS: [e.g. iOS8.1]
- Browser [e.g. stock browser, safari]
- Version [e.g. 22]
**Additional context**
Add any other context about the problem here.

.github/ISSUE_TEMPLATE/bug_report.yml

@@ -0,0 +1,64 @@
name: Bug Report
description: Create a report to help us fix the bug
labels: ["bug report"]
body:
- type: markdown
attributes:
value: |
Thanks for taking the time to fill out this bug report!
- type: textarea
id: description
attributes:
label: Describe the bug
description: Provide a clear and concise description of what the bug is.
validations:
required: true
- type: textarea
id: reproduction
attributes:
label: Steps to reproduce
description: Detail the steps that would replicate this issue.
placeholder: |
1. Go to '...'
2. Click on '....'
3. Scroll down to '....'
4. See error
validations:
required: true
- type: textarea
id: expected
attributes:
label: Expected behaviour
description: Provide a clear and concise description of what you expected to happen.
validations:
required: true
- type: textarea
id: context
attributes:
label: Screenshots or additional context
description: Provide any additional context and SFW screenshots here to help us solve this issue.
validations:
required: false
- type: input
id: stashversion
attributes:
label: Stash version
description: This can be found in Settings > About.
placeholder: (e.g. v0.28.1)
validations:
required: true
- type: input
id: devicedetails
attributes:
label: Device details
description: |
If this is an issue that occurs when using the Stash interface, please provide details of the device/browser used which presents the reported issue.
placeholder: (e.g. Firefox 97 (64-bit) on Windows 11)
validations:
required: false
- type: textarea
id: logs
attributes:
label: Relevant log output
description: Please copy and paste any relevant log output from Settings > Logs. This will be automatically formatted into code, so no need for backticks.
render: shell

.github/ISSUE_TEMPLATE/config.yml

@@ -0,0 +1,11 @@
blank_issues_enabled: false
contact_links:
- name: Community forum
url: https://discourse.stashapp.cc
about: Start a discussion on the community forum.
- name: Community Discord
url: https://discord.gg/Y8MNsvQBvZ
about: Chat with the community on Discord.
- name: Documentation
url: https://docs.stashapp.cc
about: Check out documentation for help and information.

@@ -1,24 +0,0 @@
---
name: Discussion / Request for Commentary [RFC]
about: This is for issues that will be discussed and won't necessarily result directly
in commits or pull requests.
title: "[RFC] Short Form Title"
labels: help wanted
assignees: ''
---
<!-- Update or delete the title if you need to delegate your title gore to something -->
# Title
### Scope
<!-- Describe the scope of your topic and your goals, ideally within a single paragraph or TL;DR-style summary, so it's easier for people to determine at a glance whether they can contribute. -->
## Long Form
<!-- Only required if your scope and titles can't cover everything. -->
## Examples
<!-- If you can show picture or video examples, post them here. Please respect people's time and attention; contributors are volunteering their time, so concision is ideal and considerate. -->
## Reference Reading
<!-- if there is any reference reading or documentation, please refer to it here. -->

@@ -1,20 +0,0 @@
---
name: Feature request
about: Suggest an idea for this project
title: "[Feature] Short Form Title (50 chars or less.)"
labels: feature request
assignees: ''
---
**Is your feature request related to a problem? Please describe.**
A clear and concise description of what the problem is. Ex. I'm always frustrated when [...]
**Describe the solution you'd like**
A clear and concise description of what you want to happen.
**Describe alternatives you've considered**
A clear and concise description of any alternative solutions or features you've considered.
**Additional context**
Add any other context or screenshots about the feature request here.

@@ -0,0 +1,44 @@
name: Feature Request
description: Request a new feature or idea to be added to Stash
labels: ["feature request"]
body:
- type: textarea
id: description
attributes:
label: Describe the feature you'd like
description: Provide a clear description of the feature you'd like implemented
validations:
required: true
- type: textarea
id: benefits
attributes:
label: Describe the benefits this would bring to existing users
description: |
Explain the measurable benefits this feature would achieve for existing users.
The benefits should be described in terms of outcomes for users, not specific implementations.
validations:
required: true
- type: textarea
id: already_possible
attributes:
label: Is there an existing way to achieve this goal?
description: |
Yes/No. If Yes, describe how your proposed feature differs from or improves upon the current method.
validations:
required: true
- type: checkboxes
id: confirm-search
attributes:
label: Have you searched for an existing open/closed issue?
description: |
To help us keep these issues under control, please ensure you have first [searched our issue list](https://github.com/stashapp/stash/issues?q=is%3Aissue) for any existing issues that cover the core request or benefit of your proposal.
options:
- label: I have searched for existing issues and none cover the core request of my proposal
required: true
- type: textarea
id: context
attributes:
label: Additional context
description: Add any other context or screenshots about the feature request here.
validations:
required: false

.github/workflows/build-compiler.yml

@@ -0,0 +1,28 @@
name: Compiler Build
on:
workflow_dispatch:
env:
COMPILER_IMAGE: ghcr.io/stashapp/compiler:13
jobs:
build-compiler:
runs-on: ubuntu-24.04
steps:
- uses: actions/checkout@v6
- uses: docker/login-action@v3
with:
registry: ghcr.io
username: ${{ github.repository_owner }}
password: ${{ secrets.GITHUB_TOKEN }}
- uses: docker/setup-buildx-action@v3
- uses: docker/build-push-action@v6
with:
push: true
context: "{{defaultContext}}:docker/compiler"
tags: |
${{ env.COMPILER_IMAGE }}
ghcr.io/stashapp/compiler:latest
cache-from: type=gha,scope=all
cache-to: type=gha,scope=all,mode=max
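For local debugging, the build-push-action step above corresponds to a single `docker buildx build` invocation. A runnable sketch follows; the `docker` CLI is stubbed with a shell function so the composed command line can be inspected on any machine — remove the stub to perform a real build, which assumes a prior `docker login ghcr.io`:

```shell
# Stub docker so the sketch runs without a daemon; delete this function
# to actually build and push (requires `docker login ghcr.io`).
docker() { echo "docker $*"; }

# Equivalent of the workflow step: build docker/compiler and apply
# both the pinned and the floating tag.
cmdline=$(docker buildx build docker/compiler \
  -t ghcr.io/stashapp/compiler:13 \
  -t ghcr.io/stashapp/compiler:latest \
  --push)
echo "$cmdline"
```

The GHA cache backend (`type=gha`) only works inside Actions runners, so it is omitted from the local sketch.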

@@ -2,7 +2,10 @@ name: Build
on:
push:
branches: [ develop, master ]
branches:
- develop
- master
- 'releases/**'
pull_request:
release:
types: [ published ]
@@ -12,50 +15,163 @@ concurrency:
cancel-in-progress: true
env:
COMPILER_IMAGE: stashapp/compiler:11
COMPILER_IMAGE: ghcr.io/stashapp/compiler:13
jobs:
build:
runs-on: ubuntu-22.04
# Job 1: Generate code and build UI
# Runs natively (no Docker) — go generate/gqlgen and node don't need cross-compilers.
# Produces artifacts (generated Go files + UI build) consumed by test and build jobs.
generate:
runs-on: ubuntu-24.04
steps:
- uses: actions/checkout@v2
- uses: actions/checkout@v6
with:
fetch-depth: 0
fetch-tags: true
- name: Setup Go
uses: actions/setup-go@v6
- name: Checkout
run: git fetch --prune --unshallow --tags
# pnpm version is read from the packageManager field in package.json
# very broken (4.3, 4.4)
- name: Install pnpm
uses: pnpm/action-setup@41ff72655975bd51cab0327fa583b6e92b6d3061
with:
package_json_file: ui/v2.5/package.json
- name: Setup Node.js
uses: actions/setup-node@v6
with:
node-version: '20'
cache: 'pnpm'
cache-dependency-path: ui/v2.5/pnpm-lock.yaml
- name: Install UI dependencies
run: cd ui/v2.5 && pnpm install --frozen-lockfile
- name: Generate
run: make generate
- name: Cache UI build
uses: actions/cache@v5
id: cache-ui
with:
path: ui/v2.5/build
key: ${{ runner.os }}-ui-build-${{ hashFiles('ui/v2.5/pnpm-lock.yaml', 'ui/v2.5/public/**', 'ui/v2.5/src/**', 'graphql/**/*.graphql') }}
- name: Validate UI
# skip UI validation for pull requests if UI is unchanged
if: ${{ github.event_name != 'pull_request' || steps.cache-ui.outputs.cache-hit != 'true' }}
run: make validate-ui
- name: Build UI
# skip UI build for pull requests if UI is unchanged (UI was cached)
if: ${{ github.event_name != 'pull_request' || steps.cache-ui.outputs.cache-hit != 'true' }}
run: make ui
# Bundle generated Go files + UI build for downstream jobs (test + build)
- name: Upload generated artifacts
uses: actions/upload-artifact@v7
with:
name: generated
retention-days: 1
path: |
internal/api/generated_exec.go
internal/api/generated_models.go
ui/v2.5/build/
ui/login/locales/
# Job 2: Integration tests
# Runs natively (no Docker) — only needs Go + GCC (for CGO/SQLite), both on ubuntu-24.04.
# Runs in parallel with the build matrix jobs.
test:
needs: generate
runs-on: ubuntu-24.04
steps:
- uses: actions/checkout@v6
- name: Setup Go
uses: actions/setup-go@v5
uses: actions/setup-go@v6
with:
go-version-file: 'go.mod'
- name: Pull compiler image
run: docker pull $COMPILER_IMAGE
- name: Cache node modules
uses: actions/cache@v3
env:
cache-name: cache-node_modules
# Places generated Go files + UI build into the working tree so the build compiles
- name: Download generated artifacts
uses: actions/download-artifact@v8
with:
path: ui/v2.5/node_modules
key: ${{ runner.os }}-build-${{ env.cache-name }}-${{ hashFiles('ui/v2.5/yarn.lock') }}
name: generated
- name: Cache UI build
uses: actions/cache@v3
id: cache-ui
env:
cache-name: cache-ui
- name: Test Backend
run: make it
# Job 3: Cross-compile for all platforms
# Each platform gets its own runner and Docker container (ghcr.io/stashapp/compiler:13).
# Each build-cc-* make target is self-contained (sets its own GOOS/GOARCH/CC),
# so running them in separate containers is functionally identical to one container.
# Runs in parallel with the test job.
build:
needs: generate
runs-on: ubuntu-24.04
strategy:
fail-fast: false
matrix:
include:
- platform: windows
make-target: build-cc-windows
artifact-paths: |
dist/stash-win.exe
tag: win
- platform: macos
make-target: build-cc-macos
artifact-paths: |
dist/stash-macos
dist/Stash.app.zip
tag: osx
- platform: linux
make-target: build-cc-linux
artifact-paths: |
dist/stash-linux
tag: linux
- platform: linux-arm64v8
make-target: build-cc-linux-arm64v8
artifact-paths: |
dist/stash-linux-arm64v8
tag: arm
- platform: linux-arm32v7
make-target: build-cc-linux-arm32v7
artifact-paths: |
dist/stash-linux-arm32v7
tag: arm
- platform: linux-arm32v6
make-target: build-cc-linux-arm32v6
artifact-paths: |
dist/stash-linux-arm32v6
tag: arm
- platform: freebsd
make-target: build-cc-freebsd
artifact-paths: |
dist/stash-freebsd
tag: freebsd
steps:
- uses: actions/checkout@v6
with:
path: ui/v2.5/build
key: ${{ runner.os }}-build-${{ env.cache-name }}-${{ hashFiles('ui/v2.5/yarn.lock', 'ui/v2.5/public/**', 'ui/v2.5/src/**', 'graphql/**/*.graphql') }}
fetch-depth: 0
fetch-tags: true
- name: Cache go build
uses: actions/cache@v3
env:
# increment the number suffix to bump the cache
cache-name: cache-go-cache-1
- name: Download generated artifacts
uses: actions/download-artifact@v8
with:
name: generated
- name: Cache Go build
uses: actions/cache@v5
with:
path: .go-cache
key: ${{ runner.os }}-build-${{ env.cache-name }}-${{ hashFiles('go.mod', '**/go.sum') }}
key: ${{ runner.os }}-go-cache-${{ matrix.platform }}-${{ hashFiles('go.mod', '**/go.sum') }}
# kept separate to test timings
- name: pull compiler image
run: docker pull $COMPILER_IMAGE
- name: Start build container
env:
@@ -64,45 +180,50 @@ jobs:
mkdir -p .go-cache
docker run -d --name build --mount type=bind,source="$(pwd)",target=/stash,consistency=delegated --mount type=bind,source="$(pwd)/.go-cache",target=/root/.cache/go-build,consistency=delegated --env OFFICIAL_BUILD=${{ env.official-build }} -w /stash $COMPILER_IMAGE tail -f /dev/null
- name: Pre-install
run: docker exec -t build /bin/bash -c "make pre-ui"
- name: Generate
run: docker exec -t build /bin/bash -c "make generate"
- name: Validate UI
# skip UI validation for pull requests if UI is unchanged
if: ${{ github.event_name != 'pull_request' || steps.cache-ui.outputs.cache-hit != 'true' }}
run: docker exec -t build /bin/bash -c "make validate-ui"
# Static validation happens in the linter workflow in parallel to this workflow
# Run Dynamic validation here, to make sure we pass all the projects integration tests
- name: Test Backend
run: docker exec -t build /bin/bash -c "make it"
- name: Build UI
# skip UI build for pull requests if UI is unchanged (UI was cached)
# this means that the build version/time may be incorrect if the UI is
# not changed in a pull request
if: ${{ github.event_name != 'pull_request' || steps.cache-ui.outputs.cache-hit != 'true' }}
run: docker exec -t build /bin/bash -c "make ui"
- name: Compile for all supported platforms
run: |
docker exec -t build /bin/bash -c "make build-cc-windows"
docker exec -t build /bin/bash -c "make build-cc-macos"
docker exec -t build /bin/bash -c "make build-cc-linux"
docker exec -t build /bin/bash -c "make build-cc-linux-arm64v8"
docker exec -t build /bin/bash -c "make build-cc-linux-arm32v7"
docker exec -t build /bin/bash -c "make build-cc-linux-arm32v6"
docker exec -t build /bin/bash -c "make build-cc-freebsd"
- name: Zip UI
run: docker exec -t build /bin/bash -c "make zip-ui"
- name: Build (${{ matrix.platform }})
run: docker exec -t build /bin/bash -c "make ${{ matrix.make-target }}"
- name: Cleanup build container
run: docker rm -f -v build
- name: Upload build artifact
uses: actions/upload-artifact@v7
with:
name: build-${{ matrix.platform }}
retention-days: 1
path: ${{ matrix.artifact-paths }}
# Job 4: Release
# Waits for both test and build to pass, then collects all platform artifacts
# into dist/ for checksums, GitHub releases, and multi-arch Docker push.
release:
needs: [test, build]
runs-on: ubuntu-24.04
steps:
- uses: actions/checkout@v6
with:
fetch-depth: 0
fetch-tags: true
# Downloads all artifacts (generated + 7 platform builds) into artifacts/ subdirectories
- name: Download all build artifacts
uses: actions/download-artifact@v8
with:
path: artifacts
# Reassemble platform binaries from matrix job artifacts into a single dist/ directory
# make sure that artifacts have executable bit set
# upload-artifact v4 and later strip the common path prefix (dist/), so files are at the artifact root
- name: Collect binaries
run: |
mkdir -p dist
cp artifacts/build-*/* dist/
chmod +x dist/*
- name: Zip UI
run: |
cd artifacts/generated/ui/v2.5/build && zip -r ../../../../../dist/stash-ui.zip .
- name: Generate checksums
run: |
git describe --tags --exclude latest_develop | tee CHECKSUMS_SHA1
@@ -113,7 +234,7 @@ jobs:
- name: Upload Windows binary
# only upload binaries for pull requests
if: ${{ github.event_name == 'pull_request' && github.base_ref != 'refs/heads/develop' && github.base_ref != 'refs/heads/master'}}
uses: actions/upload-artifact@v4
uses: actions/upload-artifact@v7
with:
name: stash-win.exe
path: dist/stash-win.exe
@@ -121,15 +242,23 @@
- name: Upload macOS binary
# only upload binaries for pull requests
if: ${{ github.event_name == 'pull_request' && github.base_ref != 'refs/heads/develop' && github.base_ref != 'refs/heads/master'}}
uses: actions/upload-artifact@v4
uses: actions/upload-artifact@v7
with:
name: stash-macos
path: dist/stash-macos
- name: Upload macOS bundle
# only upload binaries for pull requests
if: ${{ github.event_name == 'pull_request' && github.base_ref != 'refs/heads/develop' && github.base_ref != 'refs/heads/master'}}
uses: actions/upload-artifact@v7
with:
name: Stash.app.zip
path: dist/Stash.app.zip
- name: Upload Linux binary
# only upload binaries for pull requests
if: ${{ github.event_name == 'pull_request' && github.base_ref != 'refs/heads/develop' && github.base_ref != 'refs/heads/master'}}
uses: actions/upload-artifact@v4
uses: actions/upload-artifact@v7
with:
name: stash-linux
path: dist/stash-linux
@@ -137,14 +266,14 @@
- name: Upload UI
# only upload for pull requests
if: ${{ github.event_name == 'pull_request' && github.base_ref != 'refs/heads/develop' && github.base_ref != 'refs/heads/master'}}
uses: actions/upload-artifact@v4
uses: actions/upload-artifact@v7
with:
name: stash-ui.zip
path: dist/stash-ui.zip
- name: Update latest_develop tag
if: ${{ github.event_name == 'push' && github.ref == 'refs/heads/develop' }}
run : git tag -f latest_develop; git push -f --tags
run: git tag -f latest_develop; git push -f --tags
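The step above force-moves the `latest_develop` tag on every develop push. A minimal sketch of what `git tag -f` does, using a hypothetical throwaway repository (the user name/email are placeholders for a clean environment):

```shell
tmp=$(mktemp -d) && cd "$tmp"
git init -q
git -c user.email=ci@example.invalid -c user.name=ci commit -q --allow-empty -m "first"
git tag latest_develop                 # tag points at the first commit
git -c user.email=ci@example.invalid -c user.name=ci commit -q --allow-empty -m "second"
git tag -f latest_develop              # -f re-points the existing tag at HEAD
[ "$(git rev-parse latest_develop)" = "$(git rev-parse HEAD)" ] && echo "tag moved"
```

On the remote side, `git push -f --tags` is then required, because moving an existing tag is a non-fast-forward update that a plain push rejects.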
- name: Development Release
if: ${{ github.event_name == 'push' && github.ref == 'refs/heads/develop' }}
@@ -194,7 +323,7 @@ jobs:
DOCKER_USERNAME: ${{ secrets.DOCKER_USERNAME }}
DOCKER_PASSWORD: ${{ secrets.DOCKER_PASSWORD }}
run: |
docker run --rm --privileged docker/binfmt:a7996909642ee92942dcd6cff44b9b95f08dad64
docker run --rm --privileged tonistiigi/binfmt
docker info
docker buildx create --name builder --use
docker buildx inspect --bootstrap
@@ -210,7 +339,7 @@ jobs:
DOCKER_USERNAME: ${{ secrets.DOCKER_USERNAME }}
DOCKER_PASSWORD: ${{ secrets.DOCKER_PASSWORD }}
run: |
docker run --rm --privileged docker/binfmt:a7996909642ee92942dcd6cff44b9b95f08dad64
docker run --rm --privileged tonistiigi/binfmt
docker info
docker buildx create --name builder --use
docker buildx inspect --bootstrap
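The `hashFiles(...)` cache keys used earlier in this workflow make caches content-addressed: identical inputs always produce the same key, so unchanged sources hit the cache and changed sources miss it. A rough local analogue of that idea — the file name and the use of `sha1sum` are illustrative only; the real `hashFiles()` computes SHA-256 over the matched file set:

```shell
tmp=$(mktemp -d)
printf 'lockfile-v1\n' > "$tmp/pnpm-lock.yaml"

# Hash the input file into a short digest and embed it in the key.
key1="Linux-ui-build-$(sha1sum < "$tmp/pnpm-lock.yaml" | cut -c1-12)"
key2="Linux-ui-build-$(sha1sum < "$tmp/pnpm-lock.yaml" | cut -c1-12)"
echo "$key1"

# Unchanged inputs -> identical key -> cache hit.
[ "$key1" = "$key2" ] && echo "stable key"

# Changing an input changes the key -> cache miss -> rebuild.
printf 'lockfile-v2\n' > "$tmp/pnpm-lock.yaml"
key3="Linux-ui-build-$(sha1sum < "$tmp/pnpm-lock.yaml" | cut -c1-12)"
[ "$key1" != "$key3" ] && echo "key changed"
```

This is why the workflow can safely skip `make ui` on cache hits: a hit proves the hashed inputs are byte-identical to a previous build.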

@@ -6,67 +6,23 @@ on:
branches:
- master
- develop
- 'releases/**'
pull_request:
env:
COMPILER_IMAGE: stashapp/compiler:11
jobs:
golangci:
name: lint
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- name: Checkout
run: git fetch --prune --unshallow --tags
- name: Setup Go
uses: actions/setup-go@v5
with:
go-version-file: 'go.mod'
- name: Pull compiler image
run: docker pull $COMPILER_IMAGE
- name: Start build container
run: |
mkdir -p .go-cache
docker run -d --name build --mount type=bind,source="$(pwd)",target=/stash,consistency=delegated --mount type=bind,source="$(pwd)/.go-cache",target=/root/.cache/go-build,consistency=delegated -w /stash $COMPILER_IMAGE tail -f /dev/null
# no tags or depth needed for lint
- uses: actions/checkout@v6
- uses: actions/setup-go@v6
# generate-backend runs natively (just go generate + touch-ui) — no Docker needed
- name: Generate Backend
run: docker exec -t build /bin/bash -c "make generate-backend"
run: make generate-backend
## WARN
## using v1, update in a later PR
- name: Run golangci-lint
uses: golangci/golangci-lint-action@v6
with:
# Optional: version of golangci-lint to use in form of v1.2 or v1.2.3 or `latest` to use the latest version
version: latest
# Optional: working directory, useful for monorepos
# working-directory: somedir
# Optional: golangci-lint command line arguments.
#
# Note: By default, the `.golangci.yml` file should be at the root of the repository.
# The location of the configuration file can be changed by using `--config=`
args: --timeout=5m
# Optional: show only new issues if it's a pull request. The default value is `false`.
# only-new-issues: true
# Optional: if set to true, then all caching functionality will be completely disabled,
# takes precedence over all other caching options.
# skip-cache: true
# Optional: if set to true, then the action won't cache or restore ~/go/pkg.
# skip-pkg-cache: true
# Optional: if set to true, then the action won't cache or restore ~/.cache/go-build.
# skip-build-cache: true
# Optional: The mode to install golangci-lint. It can be 'binary' or 'goinstall'.
# install-mode: "goinstall"
- name: Cleanup build container
run: docker rm -f -v build
uses: golangci/golangci-lint-action@v6

@@ -1,5 +1,6 @@
<?xml version="1.0" encoding="UTF-8"?>
<module type="WEB_MODULE" version="4">
<component name="Go" enabled="true" />
<component name="NewModuleRootManager">
<content url="file://$MODULE_DIR$">
<excludeFolder url="file://$MODULE_DIR$/certs" />
@@ -10,4 +11,4 @@
<orderEntry type="inheritedJdk" />
<orderEntry type="sourceFolder" forTests="false" />
</component>
</module>
</module>

@@ -50,7 +50,7 @@ export CGO_ENABLED := 1
# define COMPILER_IMAGE for cross-compilation docker container
ifndef COMPILER_IMAGE
COMPILER_IMAGE := stashapp/compiler:latest
COMPILER_IMAGE := ghcr.io/stashapp/compiler:latest
endif
.PHONY: release
@@ -129,7 +129,7 @@ phasher: build-flags
# builds dynamically-linked debug binaries
.PHONY: build
build: stash phasher
build: stash
# builds dynamically-linked PIE release binaries
.PHONY: build-release
@@ -187,8 +187,6 @@ build-cc-macos:
# Combine into universal binaries
lipo -create -output dist/stash-macos dist/stash-macos-intel dist/stash-macos-arm
rm dist/stash-macos-intel dist/stash-macos-arm
lipo -create -output dist/phasher-macos dist/phasher-macos-intel dist/phasher-macos-arm
rm dist/phasher-macos-intel dist/phasher-macos-arm
# Place into bundle and zip up
rm -rf dist/Stash.app
@@ -198,6 +196,16 @@ build-cc-macos:
cd dist && rm -f Stash.app.zip && zip -r Stash.app.zip Stash.app
rm -rf dist/Stash.app
.PHONY: build-cc-macos-phasher
build-cc-macos-phasher:
make build-cc-macos-arm
make build-cc-macos-intel
# Combine into universal binaries
lipo -create -output dist/phasher-macos dist/phasher-macos-intel dist/phasher-macos-arm
rm dist/phasher-macos-intel dist/phasher-macos-arm
# do not bundle phasher
.PHONY: build-cc-freebsd
build-cc-freebsd: export GOOS := freebsd
build-cc-freebsd: export GOARCH := amd64
@@ -275,7 +283,7 @@ generate: generate-backend generate-ui
.PHONY: generate-ui
generate-ui:
cd ui/v2.5 && yarn run gqlgen
cd ui/v2.5 && npm run gqlgen
.PHONY: generate-backend
generate-backend: touch-ui
@@ -338,9 +346,19 @@ server-clean:
# installs UI dependencies. Run when first cloning repository, or if UI
# dependencies have changed
# If CI is set, configures pnpm to use a local store to avoid
# putting .pnpm-store in /stash
# NOTE: to run in the docker build container, using the existing
# node_modules folder, rename the .modules.yaml to .modules.yaml.bak
# and a new one will be generated. This will need to be reversed after
# building.
.PHONY: pre-ui
pre-ui:
cd ui/v2.5 && yarn install --frozen-lockfile
ifdef CI
cd ui/v2.5 && pnpm config set store-dir ~/.pnpm-store && pnpm install --frozen-lockfile
else
cd ui/v2.5 && pnpm install --frozen-lockfile
endif
.PHONY: ui-env
ui-env: build-info
@@ -359,7 +377,7 @@ ui: ui-only generate-login-locale
.PHONY: ui-only
ui-only: ui-env
cd ui/v2.5 && yarn build
cd ui/v2.5 && npm run build
.PHONY: zip-ui
zip-ui:
@@ -368,20 +386,24 @@ zip-ui:
.PHONY: ui-start
ui-start: ui-env
cd ui/v2.5 && yarn start --host
cd ui/v2.5 && npm run start -- --host
.PHONY: fmt-ui
fmt-ui:
cd ui/v2.5 && yarn format
cd ui/v2.5 && npm run format
# runs all of the frontend PR-acceptance steps
.PHONY: validate-ui
validate-ui:
cd ui/v2.5 && yarn run validate
cd ui/v2.5 && npm run validate
# these targets run the same steps as fmt-ui and validate-ui, but only on files that have changed
fmt-ui-quick:
cd ui/v2.5 && yarn run prettier --write $$(git diff --name-only --relative --diff-filter d . ../../graphql)
cd ui/v2.5 && \
files=$$(git diff --name-only --relative --diff-filter d . ../../graphql); \
if [ -n "$$files" ]; then \
npm run prettier -- --write $$files; \
fi
# does not run tsc checks, as they are slow
validate-ui-quick:
@@ -389,9 +411,9 @@ validate-ui-quick:
tsfiles=$$(git diff --name-only --relative --diff-filter d src | grep -e "\.tsx\?\$$"); \
scssfiles=$$(git diff --name-only --relative --diff-filter d src | grep "\.scss"); \
prettyfiles=$$(git diff --name-only --relative --diff-filter d . ../../graphql); \
if [ -n "$$tsfiles" ]; then yarn run eslint $$tsfiles; fi && \
if [ -n "$$scssfiles" ]; then yarn run stylelint $$scssfiles; fi && \
if [ -n "$$prettyfiles" ]; then yarn run prettier --check $$prettyfiles; fi
if [ -n "$$tsfiles" ]; then npm run eslint -- $$tsfiles; fi && \
if [ -n "$$scssfiles" ]; then npm run stylelint -- $$scssfiles; fi && \
if [ -n "$$prettyfiles" ]; then npm run prettier -- --check $$prettyfiles; fi
# runs all of the backend PR-acceptance steps
.PHONY: validate-backend
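The `fmt-ui-quick` rewrite above guards against an empty changed-file list before invoking `npm run prettier -- --write`, since running the formatter with no file arguments would fail. The guard pattern in isolation, runnable anywhere (`format_changed` and the file names are illustrative):

```shell
# Run a command only when the (possibly empty) file list is non-empty.
format_changed() {
  files=$1
  if [ -n "$files" ]; then
    echo "formatting: $files"
  else
    echo "nothing to format"
  fi
}

out_empty=$(format_changed "")
out_some=$(format_changed "a.ts b.scss")
echo "$out_empty"   # nothing to format
echo "$out_some"    # formatting: a.ts b.scss
```

The `validate-ui-quick` target applies the same `if [ -n "$files" ]` test once per tool (eslint, stylelint, prettier), each with its own filtered list.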

@@ -9,90 +9,101 @@
[![GitHub release (latest by date)](https://img.shields.io/github/v/release/stashapp/stash?logo=github)](https://github.com/stashapp/stash/releases/latest)
[![GitHub issues by-label](https://img.shields.io/github/issues-raw/stashapp/stash/bounty)](https://github.com/stashapp/stash/labels/bounty)
### **Stash is a self-hosted webapp written in Go which organizes and serves your porn.**
![demo image](docs/readme_assets/demo_image.png)
### **Stash is a self-hosted webapp written in Go which organizes and serves your diverse content collection, catering to both your SFW and NSFW needs.**
* Stash gathers information about videos in your collection from the internet, and is extensible through the use of community-built plugins for a large number of content producers and sites.
* Stash supports a wide variety of both video and image formats.
* You can tag videos and find them later.
* Stash provides statistics about performers, tags, studios and more.
![Screenshot of Stash web application interface](docs/readme_assets/demo_image.png)
- Stash gathers information about videos in your collection from the internet, and is extensible through the use of community-built plugins for a large number of content producers and sites.
- Stash supports a wide variety of both video and image formats.
- You can tag videos and find them later.
- Stash provides statistics about performers, tags, studios and more.
You can [watch a SFW demo video](https://vimeo.com/545323354) to see it in action.
For further information you can consult the [documentation](https://docs.stashapp.cc) or [read the in-app manual](ui/v2.5/src/docs/en).
For further information you can consult the [documentation](https://docs.stashapp.cc) or access the in-app manual from within the application (also available at [docs.stashapp.cc/in-app-manual](https://docs.stashapp.cc/in-app-manual)).
# Installing Stash
#### Windows Users:
> [!tip]
Step-by-step instructions are available at [docs.stashapp.cc/installation](https://docs.stashapp.cc/installation/).
As of version 0.27.0, Stash doesn't support anymore _Windows 7, 8, Server 2008 and Server 2012._
Windows 10 or Server 2016 are at least required.
#### Mac Users:
As of version 0.29.0, Stash requires at least _macOS 11 Big Sur._
Stash can still be ran through docker on older versions of macOS
> [!important]
>**Windows Users**
>
>As of version 0.27.0, Stash no longer supports _Windows 7, 8, Server 2008 and Server 2012._
>At least Windows 10 or Server 2016 is required.
>
>**macOS Users**
>
> As of version 0.29.0, Stash requires _macOS 11 Big Sur_ or later.
> Stash can still be run through docker on older versions of macOS.
<img src="docs/readme_assets/windows_logo.svg" width="100%" height="75"> Windows | <img src="docs/readme_assets/mac_logo.svg" width="100%" height="75"> macOS | <img src="docs/readme_assets/linux_logo.svg" width="100%" height="75"> Linux | <img src="docs/readme_assets/docker_logo.svg" width="100%" height="75"> Docker
:---:|:---:|:---:|:---:
[Latest Release](https://github.com/stashapp/stash/releases/latest/download/stash-win.exe) <br /> <sup><sub>[Development Preview](https://github.com/stashapp/stash/releases/download/latest_develop/stash-win.exe)</sub></sup> | [Latest Release](https://github.com/stashapp/stash/releases/latest/download/Stash.app.zip) <br /> <sup><sub>[Development Preview](https://github.com/stashapp/stash/releases/download/latest_develop/Stash.app.zip)</sub></sup> | [Latest Release (amd64)](https://github.com/stashapp/stash/releases/latest/download/stash-linux) <br /> <sup><sub>[Development Preview (amd64)](https://github.com/stashapp/stash/releases/download/latest_develop/stash-linux)</sub></sup> <br /> [More Architectures...](https://github.com/stashapp/stash/releases/latest) | [Instructions](docker/production/README.md) <br /> <sup><sub>[Sample docker-compose.yml](docker/production/docker-compose.yml)</sub></sup>
Download links for other platforms and architectures are available on the [Releases page](https://github.com/stashapp/stash/releases).
Download links for other platforms and architectures are available on the [Releases](https://github.com/stashapp/stash/releases) page.
## First Run
#### Windows/macOS Users: Security Prompt
On Windows or macOS, running the app might present a security prompt since the binary isn't yet signed.
On Windows or macOS, running the app might present a security prompt since the application binary isn't yet signed.
On Windows, bypass this by clicking "more info" and then the "run anyway" button. On macOS, Control+Click the app, click "Open", and then "Open" again.
- On Windows, bypass this by clicking "more info" and then the "run anyway" button.
- On macOS, Control+Click the app, click "Open", and then "Open" again.
#### FFmpeg
Stash requires FFmpeg. If you don't have it installed, Stash will download a copy for you. It is recommended that Linux users install `ffmpeg` from their distro's package manager.
#### FFmpeg
Stash requires FFmpeg. If you don't have it installed, Stash will prompt you to download a copy during setup. It is recommended that Linux users install `ffmpeg` from their distro's package manager.
# Usage
## Quickstart Guide
Stash is a web-based application. Once the application is running, the interface is available (by default) from http://localhost:9999.
Stash is a web-based application. Once the application is running, the interface is available (by default) from `http://localhost:9999`.
On first run, Stash will prompt you for some configuration options and media directories to index, called "Scanning" in Stash. After scanning, your media will be available for browsing, curating, editing, and tagging.
Stash can pull metadata (performers, tags, descriptions, studios, and more) directly from many sites through the use of [scrapers](https://github.com/stashapp/stash/blob/develop/ui/v2.5/src/docs/en/Manual/Scraping.md), which integrate directly into Stash. Identifying an entire collection will typically require a mix of multiple sources:
- The project maintains [StashDB](https://stashdb.org/), a crowd-sourced repository of scene, studio, and performer information. Connecting it to Stash will allow you to automatically identify much of a typical media collection. It runs on our stash-box software and is primarily focused on mainstream digital scenes and studios. Instructions, invite codes, and more can be found in this guide to [Accessing StashDB](https://guidelines.stashdb.org/docs/faq_getting-started/stashdb/accessing-stashdb/).
- The stashapp team maintains [StashDB](https://stashdb.org/), a crowd-sourced repository of scene, studio, and performer information. Connecting it to Stash will allow you to automatically identify much of a typical media collection. It runs on our stash-box software and is primarily focused on mainstream digital scenes and studios. Instructions, invite codes, and more can be found in this guide to [Accessing StashDB](https://guidelines.stashdb.org/docs/faq_getting-started/stashdb/accessing-stashdb/).
- Several community-managed stash-box databases can also be connected to Stash in a similar manner. Each one serves a slightly different niche and follows their own methodology. A rundown of each stash-box, their differences, and the information you need to sign up can be found in this guide to [Accessing Stash-Boxes](https://guidelines.stashdb.org/docs/faq_getting-started/stashdb/accessing-stash-boxes/).
- Many community-maintained scrapers can also be downloaded, installed, and updated from within Stash, allowing you to pull data from a wide range of other websites and databases. They can be found by navigating to `Settings → Metadata Providers → Available Scrapers → Community (stable)`. These can be trickier to use than a stash-box because every scraper works a little differently. For more information, please visit the [CommunityScrapers repository](https://github.com/stashapp/CommunityScrapers).
- All of the above methods of scraping data into Stash are also covered in more detail in our [Guide to Scraping](https://docs.stashapp.cc/beginner-guides/guide-to-scraping/).
<sub>[StashDB](http://stashdb.org) is the canonical instance of our open source metadata API, [stash-box](https://github.com/stashapp/stash-box).</sub>
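Beyond the UI, Stash can also be driven programmatically: the server exposes a GraphQL API on the same port as the web interface (the schema lives in this repository's `graphql/` directory). As a minimal sketch, assuming the default `http://localhost:9999/graphql` endpoint and using a hypothetical `version` query for illustration, here is how a request body could be assembled in Go:

```go
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
)

// buildGraphQLRequest marshals a GraphQL query string into the JSON
// body shape ({"query": "..."}) that a GraphQL endpoint expects.
func buildGraphQLRequest(query string) (*bytes.Buffer, error) {
	payload := map[string]string{"query": query}
	body, err := json.Marshal(payload)
	if err != nil {
		return nil, err
	}
	return bytes.NewBuffer(body), nil
}

func main() {
	// "{ version { version } }" is an assumed example query; consult the
	// schema in graphql/ for the real fields.
	buf, err := buildGraphQLRequest("{ version { version } }")
	if err != nil {
		panic(err)
	}
	fmt.Println(buf.String())
	// The buffer can then be POSTed to the endpoint with net/http.
}
```

The resulting buffer is ready to pass to `http.Post` with a `Content-Type` of `application/json`.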
# Translation
[![Translate](https://translate.codeberg.org/widget/stash/stash/svg-badge.svg)](https://translate.codeberg.org/engage/stash/)
Stash is available in 32 languages (so far!) and it could be in your language too. We use Weblate to coordinate community translations. If you want to help us translate Stash, you can make an account at [Codeberg's Weblate](https://translate.codeberg.org/projects/stash/stash/) to contribute to new or existing languages. Thanks!
The badge below shows the current translation status of Stash across all supported languages:
[![Translation status](https://translate.codeberg.org/widget/stash/stash/multi-auto.svg)](https://translate.codeberg.org/engage/stash/)
# Support & Resources
We are excited to announce that we have a new home for support, feature requests, and discussions related to Stash and its associated projects. Join our community on the [Discourse forum](https://discourse.stashapp.cc) to connect with other users, share your ideas, and get help from fellow enthusiasts.
Need help or want to get involved? Start with the documentation, then reach out to the community if you need further assistance.
### Documentation
- [Official documentation](https://docs.stashapp.cc) - official guides and troubleshooting.
- [In-app manual](https://docs.stashapp.cc/in-app-manual) - press <kbd>Shift</kbd> + <kbd>?</kbd> in the app, or view the manual online.
- [FAQ](https://discourse.stashapp.cc/c/support/faq/28) - common questions and answers.
- [Community wiki](https://discourse.stashapp.cc/tags/c/community-wiki/22/stash) - guides, how-tos and tips.
### Community & discussion
- [Community forum](https://discourse.stashapp.cc) - community support, feature requests and discussions.
- [Discord](https://discord.gg/2TsNFKt) - real-time chat and community support.
- [GitHub discussions](https://github.com/stashapp/stash/discussions) - community support and feature discussions.
- [Lemmy community](https://discuss.online/c/stashapp) - board-style community space.
# Customization
## Themes and CSS Customization
There is a [directory of community-created themes](https://docs.stashapp.cc/themes/list) on Stash-Docs.
You can also change the Stash interface to fit your desired style with various snippets from [Custom CSS snippets](https://docs.stashapp.cc/themes/custom-css-snippets).
### Community resources
- [Metadata sources](https://docs.stashapp.cc/metadata-sources/)
- [Plugins](https://docs.stashapp.cc/plugins/)
- [Themes](https://docs.stashapp.cc/themes/)
- [Other projects](https://docs.stashapp.cc/other-projects/)
# For Developers

View file

@ -5,20 +5,39 @@ import (
"fmt"
"os"
"os/exec"
"path/filepath"
flag "github.com/spf13/pflag"
"github.com/stashapp/stash/pkg/ffmpeg"
"github.com/stashapp/stash/pkg/hash/imagephash"
"github.com/stashapp/stash/pkg/hash/videophash"
"github.com/stashapp/stash/pkg/models"
)
func customUsage() {
fmt.Fprintf(os.Stderr, "Usage:\n")
fmt.Fprintf(os.Stderr, "%s [OPTIONS] VIDEOFILE...\n\nOptions:\n", os.Args[0])
fmt.Fprintf(os.Stderr, "%s [OPTIONS] FILE...\n\nOptions:\n", os.Args[0])
flag.PrintDefaults()
}
func printPhash(ff *ffmpeg.FFMpeg, ffp *ffmpeg.FFProbe, inputfile string, quiet *bool) error {
// Determine if this is a video or image file based on extension
ext := filepath.Ext(inputfile)
if ext != "" {
ext = ext[1:] // remove the leading dot; Ext returns "" for files without one
}
// Common image extensions
imageExts := map[string]bool{
"jpg": true, "jpeg": true, "png": true, "gif": true, "webp": true, "bmp": true, "avif": true,
}
if imageExts[ext] {
return printImagePhash(ff, inputfile, quiet)
}
return printVideoPhash(ff, ffp, inputfile, quiet)
}
func printVideoPhash(ff *ffmpeg.FFMpeg, ffp *ffmpeg.FFProbe, inputfile string, quiet *bool) error {
ffvideoFile, err := ffp.NewVideoFile(inputfile)
if err != nil {
return err
@ -46,6 +65,24 @@ func printPhash(ff *ffmpeg.FFMpeg, ffp *ffmpeg.FFProbe, inputfile string, quiet
return nil
}
func printImagePhash(ff *ffmpeg.FFMpeg, inputfile string, quiet *bool) error {
imgFile := &models.ImageFile{
BaseFile: &models.BaseFile{Path: inputfile},
}
phash, err := imagephash.Generate(ff, imgFile)
if err != nil {
return err
}
if *quiet {
fmt.Printf("%x\n", *phash)
} else {
fmt.Printf("%x %v\n", *phash, imgFile.Path)
}
return nil
}
func getPaths() (string, string) {
ffmpegPath, _ := exec.LookPath("ffmpeg")
ffprobePath, _ := exec.LookPath("ffprobe")
@ -67,7 +104,7 @@ func main() {
args := flag.Args()
if len(args) < 1 {
fmt.Fprintf(os.Stderr, "Missing VIDEOFILE argument.\n")
fmt.Fprintf(os.Stderr, "Missing FILE argument.\n")
flag.Usage()
os.Exit(2)
}
@ -87,4 +124,5 @@ func main() {
fmt.Fprintln(os.Stderr, err)
}
}
}

View file

@ -76,6 +76,10 @@ func main() {
defer pprof.StopCPUProfile()
}
// initialise desktop.IsDesktop here so that it doesn't get affected by
// ffmpeg hardware checks later on
desktop.InitIsDesktop()
mgr, err := manager.Initialize(cfg, l)
if err != nil {
exitError(fmt.Errorf("manager initialization error: %w", err))
@ -110,7 +114,7 @@ func main() {
// Logs only error level message to stderr.
func initLogTemp() *log.Logger {
l := log.NewLogger()
l.Init("", true, "Error")
l.Init("", true, "Error", 0)
logger.Logger = l
return l
@ -118,7 +122,7 @@ func initLogTemp() *log.Logger {
func initLog(cfg *config.Config) *log.Logger {
l := log.NewLogger()
l.Init(cfg.GetLogFile(), cfg.GetLogOut(), cfg.GetLogLevel())
l.Init(cfg.GetLogFile(), cfg.GetLogOut(), cfg.GetLogLevel(), cfg.GetLogFileMaxSize())
logger.Logger = l
return l

View file

@ -1,14 +1,16 @@
# This dockerfile should be built with `make docker-build` from the stash root.
# Build Frontend
FROM node:20-alpine AS frontend
FROM node:24-alpine AS frontend
RUN apk add --no-cache make git
## cache node_modules separately
COPY ./ui/v2.5/package.json ./ui/v2.5/yarn.lock /stash/ui/v2.5/
COPY ./ui/v2.5/package.json ./ui/v2.5/pnpm-lock.yaml /stash/ui/v2.5/
WORKDIR /stash
COPY Makefile /stash/
COPY ./graphql /stash/graphql/
COPY ./ui /stash/ui/
# pnpm install with npm
RUN npm install -g pnpm
RUN make pre-ui
RUN make generate-ui
ARG GITHASH

View file

@ -5,11 +5,13 @@ ARG CUDA_VERSION=12.8.0
FROM node:20-alpine AS frontend
RUN apk add --no-cache make git
## cache node_modules separately
COPY ./ui/v2.5/package.json ./ui/v2.5/yarn.lock /stash/ui/v2.5/
COPY ./ui/v2.5/package.json ./ui/v2.5/pnpm-lock.yaml /stash/ui/v2.5/
WORKDIR /stash
COPY Makefile /stash/
COPY ./graphql /stash/graphql/
COPY ./ui /stash/ui/
# pnpm install with npm
RUN npm install -g pnpm
RUN make pre-ui
RUN make generate-ui
ARG GITHASH

View file

@ -12,9 +12,8 @@ RUN if [ "$TARGETPLATFORM" = "linux/arm/v6" ]; then BIN=stash-linux-arm32v6; \
FROM --platform=$TARGETPLATFORM alpine:latest AS app
COPY --from=binary /stash /usr/bin/
RUN apk add --no-cache ca-certificates python3 py3-requests py3-requests-toolbelt py3-lxml py3-pip ffmpeg ruby tzdata vips vips-tools \
&& pip install --user --break-system-packages mechanicalsoup cloudscraper stashapp-tools \
&& gem install faraday
RUN apk add --no-cache ca-certificates python3 py3-requests py3-requests-toolbelt py3-lxml py3-pip ffmpeg tzdata vips vips-tools vips-heif \
&& pip install --break-system-packages mechanicalsoup cloudscraper stashapp-tools
ENV STASH_CONFIG_FILE=/root/.stash/config.yml
# Basic build-time metadata as defined at https://github.com/opencontainers/image-spec/blob/main/annotations.md#pre-defined-annotation-keys

View file

@ -1 +0,0 @@
*.sdk.tar.*

View file

@ -1,83 +1,86 @@
FROM golang:1.24.3
### OSXCROSS
FROM debian:bookworm AS osxcross
# add osxcross
WORKDIR /tmp/osxcross
ARG OSXCROSS_REVISION=5e1b71fcceb23952f3229995edca1b6231525b5b
ADD --checksum=sha256:d3f771bbc20612fea577b18a71be3af2eb5ad2dd44624196cf55de866d008647 https://codeload.github.com/tpoechtrager/osxcross/tar.gz/${OSXCROSS_REVISION} /tmp/osxcross.tar.gz
LABEL maintainer="https://discord.gg/2TsNFKt"
ARG OSX_SDK_VERSION=11.3
ARG OSX_SDK_DOWNLOAD_FILE=MacOSX${OSX_SDK_VERSION}.sdk.tar.xz
ARG OSX_SDK_DOWNLOAD_URL=https://github.com/phracker/MacOSX-SDKs/releases/download/${OSX_SDK_VERSION}/${OSX_SDK_DOWNLOAD_FILE}
ADD --checksum=sha256:cd4f08a75577145b8f05245a2975f7c81401d75e9535dcffbb879ee1deefcbf4 ${OSX_SDK_DOWNLOAD_URL} /tmp/osxcross/tarballs/${OSX_SDK_DOWNLOAD_FILE}
RUN apt-get update && apt-get install -y apt-transport-https ca-certificates gnupg
ENV UNATTENDED=yes \
SDK_VERSION=${OSX_SDK_VERSION} \
OSX_VERSION_MIN=10.10
RUN apt update && \
apt install -y --no-install-recommends \
bash ca-certificates clang cmake git patch libssl-dev bzip2 cpio libbz2-dev libxml2-dev make python3 xz-utils zlib1g-dev
# lzma-dev libxml2-dev xz
RUN tar --strip=1 -C /tmp/osxcross -xf /tmp/osxcross.tar.gz
RUN ./build.sh
RUN mkdir -p /etc/apt/keyrings
### FREEBSD cross-compilation stage
# use alpine for a cacheable image since apt is notorious for not caching
FROM alpine:3 AS freebsd
# match golang latest
# https://go.dev/wiki/FreeBSD
ARG FREEBSD_VERSION=12.4
ADD --checksum=sha256:581c7edacfd2fca2bdf5791f667402d22fccd8a5e184635e0cac075564d57aa8 \
http://ftp-archive.freebsd.org/mirror/FreeBSD-Archive/old-releases/amd64/${FREEBSD_VERSION}-RELEASE/base.txz \
/tmp/base.txz
ADD https://deb.nodesource.com/gpgkey/nodesource-repo.gpg.key nodesource.gpg.key
RUN cat nodesource.gpg.key | gpg --dearmor -o /etc/apt/keyrings/nodesource.gpg && rm nodesource.gpg.key
RUN echo "deb [signed-by=/etc/apt/keyrings/nodesource.gpg] https://deb.nodesource.com/node_20.x nodistro main" | tee /etc/apt/sources.list.d/nodesource.list
ADD https://dl.yarnpkg.com/debian/pubkey.gpg yarn.gpg
RUN cat yarn.gpg | gpg --dearmor -o /etc/apt/keyrings/yarn.gpg && rm yarn.gpg
RUN echo "deb [signed-by=/etc/apt/keyrings/yarn.gpg] https://dl.yarnpkg.com/debian/ stable main" | tee /etc/apt/sources.list.d/yarn.list
RUN apt-get update && \
apt-get install -y --no-install-recommends \
git make tar bash nodejs yarn zip \
clang llvm-dev cmake patch libxml2-dev uuid-dev libssl-dev xz-utils \
bzip2 gzip sed cpio libbz2-dev zlib1g-dev \
gcc-mingw-w64 \
gcc-arm-linux-gnueabi libc-dev-armel-cross linux-libc-dev-armel-cross \
gcc-aarch64-linux-gnu libc-dev-arm64-cross && \
rm -rf /var/lib/apt/lists/*;
# FreeBSD cross-compilation setup
# https://github.com/smartmontools/docker-build/blob/6b8c92560d17d325310ba02d9f5a4b250cb0764a/Dockerfile#L66
ENV FREEBSD_VERSION 13.4
ENV FREEBSD_DOWNLOAD_URL http://ftp.plusline.de/FreeBSD/releases/amd64/${FREEBSD_VERSION}-RELEASE/base.txz
ENV FREEBSD_SHA 8e13b0a93daba349b8d28ad246d7beb327659b2ef4fe44d89f447392daec5a7c
RUN cd /tmp && \
curl -o base.txz $FREEBSD_DOWNLOAD_URL && \
echo "$FREEBSD_SHA base.txz" | sha256sum -c - && \
mkdir -p /opt/cross-freebsd && \
cd /opt/cross-freebsd && \
tar -xf /tmp/base.txz ./lib/ ./usr/lib/ ./usr/include/ && \
rm -f /tmp/base.txz && \
cd /opt/cross-freebsd/usr/lib && \
find . -xtype l | xargs ls -l | grep ' /lib/' | awk '{print "ln -sf /opt/cross-freebsd"$11 " " $9}' | /bin/sh && \
WORKDIR /opt/cross-freebsd
RUN apk add --no-cache tar xz
RUN tar -xf /tmp/base.txz --strip-components=1 ./usr/lib ./usr/include ./lib
RUN cd /opt/cross-freebsd/usr/lib && \
find . -type l -exec sh -c ' \
for link; do \
target=$(readlink "$link"); \
case "$target" in \
/lib/*) ln -sf "/opt/cross-freebsd$target" "$link";; \
esac; \
done \
' sh {} + && \
ln -s libc++.a libstdc++.a && \
ln -s libc++.so libstdc++.so
# macOS cross-compilation setup
ENV OSX_SDK_VERSION 11.3
ENV OSX_SDK_DOWNLOAD_FILE MacOSX${OSX_SDK_VERSION}.sdk.tar.xz
ENV OSX_SDK_DOWNLOAD_URL https://github.com/phracker/MacOSX-SDKs/releases/download/${OSX_SDK_VERSION}/${OSX_SDK_DOWNLOAD_FILE}
ENV OSX_SDK_SHA cd4f08a75577145b8f05245a2975f7c81401d75e9535dcffbb879ee1deefcbf4
ENV OSXCROSS_REVISION 5e1b71fcceb23952f3229995edca1b6231525b5b
ENV OSXCROSS_DOWNLOAD_URL https://codeload.github.com/tpoechtrager/osxcross/tar.gz/${OSXCROSS_REVISION}
ENV OSXCROSS_SHA d3f771bbc20612fea577b18a71be3af2eb5ad2dd44624196cf55de866d008647
### BUILDER
FROM golang:1.24.3 AS builder
ENV PATH=/opt/osx-ndk-x86/bin:$PATH
RUN cd /tmp && \
curl -o osxcross.tar.gz $OSXCROSS_DOWNLOAD_URL && \
echo "$OSXCROSS_SHA osxcross.tar.gz" | sha256sum -c - && \
mkdir osxcross && \
tar --strip=1 -C osxcross -xf osxcross.tar.gz && \
rm -f osxcross.tar.gz && \
curl -Lo $OSX_SDK_DOWNLOAD_FILE $OSX_SDK_DOWNLOAD_URL && \
echo "$OSX_SDK_SHA $OSX_SDK_DOWNLOAD_FILE" | sha256sum -c - && \
mv $OSX_SDK_DOWNLOAD_FILE osxcross/tarballs/ && \
UNATTENDED=yes SDK_VERSION=$OSX_SDK_VERSION OSX_VERSION_MIN=10.10 osxcross/build.sh && \
cp osxcross/target/lib/* /usr/lib/ && \
mv osxcross/target /opt/osx-ndk-x86 && \
rm -rf /tmp/osxcross
# copy in nodejs instead of using nodesource :thumbsup:
COPY --from=docker.io/library/node:24-bookworm /usr/local /usr/local
# copy in osxcross
COPY --from=osxcross /tmp/osxcross/target/lib /usr/lib
COPY --from=osxcross /tmp/osxcross/target /opt/osx-ndk-x86
# copy in cross-freebsd
COPY --from=freebsd /opt/cross-freebsd /opt/cross-freebsd
ENV PATH /opt/osx-ndk-x86/bin:$PATH
# pnpm install with npm
RUN npm install -g pnpm
RUN mkdir -p /root/.ssh && \
chmod 0700 /root/.ssh && \
ssh-keyscan github.com > /root/.ssh/known_hosts
# git for getting hash
# make and bash for building
# ignore "dubious ownership" errors
# clang for macos
# zip for stashapp.zip
# gcc-extensions for cross-arch build
# we still target arm soft float?
RUN apt-get update && \
apt-get install -y --no-install-recommends \
git make bash \
clang zip \
gcc-mingw-w64 \
gcc-arm-linux-gnueabi \
libc-dev-armel-cross linux-libc-dev-armel-cross \
gcc-aarch64-linux-gnu libc-dev-arm64-cross && \
rm -rf /var/lib/apt/lists/*;
RUN git config --global safe.directory '*'
# To test locally:
# make generate
# make ui
# cd docker/compiler
# make build
# docker run --rm -v /PATH_TO_STASH:/stash -w /stash -i -t stashapp/compiler:latest make build-cc-all
# # binaries will show up in /dist
# docker build . -t ghcr.io/stashapp/compiler:latest
# docker run --rm -v /PATH_TO_STASH:/stash -w /stash -i -t ghcr.io/stashapp/compiler:latest make build-cc-all
# # binaries will show up in /dist

View file

@ -1,16 +1,22 @@
host=ghcr.io
user=stashapp
repo=compiler
version=11
version=13
VERSION_IMAGE = ${host}/${user}/${repo}:${version}
LATEST_IMAGE = ${host}/${user}/${repo}:latest
latest:
docker build -t ${user}/${repo}:latest .
docker build -t ${LATEST_IMAGE} .
build:
docker build -t ${user}/${repo}:${version} -t ${user}/${repo}:latest .
docker build -t ${VERSION_IMAGE} -t ${LATEST_IMAGE} .
build-no-cache:
docker build --no-cache -t ${user}/${repo}:${version} -t ${user}/${repo}:latest .
docker build --no-cache -t ${VERSION_IMAGE} -t ${LATEST_IMAGE} .
install: build
docker push ${user}/${repo}:${version}
docker push ${user}/${repo}:latest
# requires docker login ghcr.io
# echo $CR_PAT | docker login ghcr.io -u USERNAME --password-stdin
push:
docker push ${VERSION_IMAGE}
docker push ${LATEST_IMAGE}

View file

@ -1,3 +1,3 @@
Modified from https://github.com/bep/dockerfiles/tree/master/ci-goreleaser
When the Dockerfile is changed, the version number should be incremented in the Makefile and the new version tag should be pushed to Docker Hub. The GitHub workflow files also need to be updated to pull the correct image tag.
When the Dockerfile is changed, the version number should be incremented in [.github/workflows/build-compiler.yml](../../.github/workflows/build-compiler.yml) and the workflow [manually run](). `env: COMPILER_IMAGE` in [.github/workflows/build.yml](../../.github/workflows/build.yml) also needs to be updated to pull the correct image tag.

View file

@ -5,7 +5,8 @@
* [Go](https://golang.org/dl/)
* [GolangCI](https://golangci-lint.run/) - A meta-linter which runs several linters in parallel
* To install, follow the [local installation instructions](https://golangci-lint.run/welcome/install/#local-installation)
* [Yarn](https://yarnpkg.com/en/docs/install) - Yarn package manager
* [nodejs](https://nodejs.org/en/download) - nodejs runtime
* corepack/[pnpm](https://pnpm.io/installation) - nodejs package manager (included with nodejs)
## Environment
@ -22,32 +23,22 @@ NOTE: The `make` command in Windows will be `mingw32-make` with MinGW. For examp
### macOS
1. If you don't have it already, install the [Homebrew package manager](https://brew.sh).
2. Install dependencies: `brew install go git yarn gcc make node ffmpeg`
2. Install dependencies: `brew install go git gcc make node ffmpeg`
### Linux
#### Arch Linux
1. Install dependencies: `sudo pacman -S go git yarn gcc make nodejs ffmpeg --needed`
1. Install dependencies: `sudo pacman -S go git gcc make nodejs ffmpeg --needed`
#### Ubuntu
1. Install dependencies: `sudo apt-get install golang git yarnpkg gcc nodejs ffmpeg -y`
1. Install dependencies: `sudo apt-get install golang git gcc nodejs ffmpeg -y`
### OpenBSD
1. Install dependencies `doas pkg_add gmake go git yarn node cmake`
2. Compile a custom ffmpeg from ports. The default ffmpeg in OpenBSD's packages is not compiled with WebP support, which is required by Stash.
- If you've already installed ffmpeg, uninstall it: `doas pkg_delete ffmpeg`
- If you haven't already, [fetch the ports tree and verify](https://www.openbsd.org/faq/ports/ports.html#PortsFetch).
- Find the ffmpeg port in `/usr/ports/graphics/ffmpeg`, and patch the Makefile to include libwebp
- Add `webp` to `WANTLIB`
- Add `graphics/libwebp` to the list in `LIB_DEPENDS`
- Add `-lwebp -lwebpdecoder -lwebpdemux -lwebpmux` to `LIBavcodec_EXTRALIBS`
- Add `--enable-libweb` to the list in `CONFIGURE_ARGS`
- If you've already built ffmpeg from ports before, you may need to also increment `REVISION`
- Run `doas make install`
- Follow the instructions below to build a release, but replace the final step `make build-release` with `gmake flags-release stash`, to [avoid the PIE buildmode](https://github.com/golang/go/issues/59866).
1. Install dependencies `doas pkg_add gmake go git node cmake ffmpeg`
2. Follow the instructions below to build a release, but replace the final step `make build-release` with `gmake flags-release stash`, to [avoid the PIE buildmode](https://github.com/golang/go/issues/59866).
NOTE: The `make` command in OpenBSD will be `gmake`. For example, `make pre-ui` will be `gmake pre-ui`.
@ -127,8 +118,8 @@ This project uses a modification of the [CI-GoReleaser](https://github.com/bep/d
To cross-compile the app yourself:
1. Run `make pre-ui`, `make generate` and `make ui` outside the container, to generate files and build the UI.
2. Pull the latest compiler image from Docker Hub: `docker pull stashapp/compiler`
3. Run `docker run --rm --mount type=bind,source="$(pwd)",target=/stash -w /stash -it stashapp/compiler /bin/bash` to open a shell inside the container.
2. Pull the latest compiler image from GHCR: `docker pull ghcr.io/stashapp/compiler`
3. Run `docker run --rm --mount type=bind,source="$(pwd)",target=/stash -w /stash -it ghcr.io/stashapp/compiler /bin/bash` to open a shell inside the container.
4. From inside the container, run `make build-cc-all` to build for all platforms, or run `make build-cc-{platform}` to build for a specific platform (have a look at the `Makefile` for the list of targets).
5. You will find the compiled binaries in `dist/`.

49
go.mod
View file

@ -7,15 +7,15 @@ require (
github.com/WithoutPants/sortorder v0.0.0-20230616003020-921c9ef69552
github.com/Yamashou/gqlgenc v0.32.1
github.com/anacrolix/dms v1.2.2
github.com/antchfx/htmlquery v1.3.0
github.com/antchfx/htmlquery v1.3.5
github.com/asticode/go-astisub v0.25.1
github.com/chromedp/cdproto v0.0.0-20231007061347-18b01cd81617
github.com/chromedp/chromedp v0.9.2
github.com/chromedp/cdproto v0.0.0-20250803210736-d308e07a266d
github.com/chromedp/chromedp v0.14.2
github.com/corona10/goimagehash v1.1.0
github.com/disintegration/imaging v1.6.2
github.com/dop251/goja v0.0.0-20231027120936-b396bb4c349d
github.com/doug-martin/goqu/v9 v9.18.0
github.com/go-chi/chi/v5 v5.0.12
github.com/go-chi/chi/v5 v5.2.2
github.com/go-chi/cors v1.2.1
github.com/go-chi/httplog v0.3.1
github.com/go-toast/toast v0.0.0-20190211030409-01e6764cf0a4
@ -32,7 +32,11 @@ require (
github.com/json-iterator/go v1.1.12
github.com/kermieisinthehouse/gosx-notifier v0.1.2
github.com/kermieisinthehouse/systray v1.2.4
github.com/knadh/koanf v1.5.0
github.com/knadh/koanf/parsers/yaml v1.1.0
github.com/knadh/koanf/providers/env v1.1.0
github.com/knadh/koanf/providers/file v1.2.0
github.com/knadh/koanf/providers/posflag v1.0.1
github.com/knadh/koanf/v2 v2.2.1
github.com/lucasb-eyer/go-colorful v1.2.0
github.com/mattn/go-sqlite3 v1.14.22
github.com/mitchellh/mapstructure v1.5.0
@ -40,9 +44,10 @@ require (
github.com/pkg/browser v0.0.0-20210911075715-681adbf594b8
github.com/remeh/sizedwaitgroup v1.0.0
github.com/rwcarlsen/goexif v0.0.0-20190401172101-9e8deecbddbd
github.com/sabhiram/go-gitignore v0.0.0-20210923224102-525f6e181f06
github.com/sirupsen/logrus v1.9.3
github.com/spf13/cast v1.6.0
github.com/spf13/pflag v1.0.5
github.com/spf13/pflag v1.0.6
github.com/stretchr/testify v1.10.0
github.com/tidwall/gjson v1.16.0
github.com/vearutop/statigz v1.4.0
@ -51,33 +56,35 @@ require (
github.com/vektra/mockery/v2 v2.10.0
github.com/xWTF/chardet v0.0.0-20230208095535-c780f2ac244e
github.com/zencoder/go-dash/v3 v3.0.2
golang.org/x/crypto v0.38.0
golang.org/x/crypto v0.45.0
golang.org/x/image v0.18.0
golang.org/x/net v0.40.0
golang.org/x/sys v0.33.0
golang.org/x/term v0.32.0
golang.org/x/text v0.25.0
golang.org/x/net v0.47.0
golang.org/x/sys v0.38.0
golang.org/x/term v0.37.0
golang.org/x/text v0.31.0
golang.org/x/time v0.10.0
gopkg.in/guregu/null.v4 v4.0.0
gopkg.in/natefinch/lumberjack.v2 v2.2.1
gopkg.in/yaml.v2 v2.4.0
)
require (
github.com/agnivade/levenshtein v1.2.1 // indirect
github.com/antchfx/xpath v1.2.3 // indirect
github.com/antchfx/xpath v1.3.5 // indirect
github.com/asticode/go-astikit v0.20.0 // indirect
github.com/asticode/go-astits v1.8.0 // indirect
github.com/chromedp/sysutil v1.0.0 // indirect
github.com/chromedp/sysutil v1.1.0 // indirect
github.com/coder/websocket v1.8.12 // indirect
github.com/cpuguy83/go-md2man/v2 v2.0.7 // indirect
github.com/davecgh/go-spew v1.1.1 // indirect
github.com/dlclark/regexp2 v1.7.0 // indirect
github.com/fsnotify/fsnotify v1.6.0 // indirect
github.com/fsnotify/fsnotify v1.9.0 // indirect
github.com/go-json-experiment/json v0.0.0-20250725192818-e39067aee2d2 // indirect
github.com/go-sourcemap/sourcemap v2.1.3+incompatible // indirect
github.com/go-viper/mapstructure/v2 v2.2.1 // indirect
github.com/go-viper/mapstructure/v2 v2.4.0 // indirect
github.com/gobwas/httphead v0.1.0 // indirect
github.com/gobwas/pool v0.2.1 // indirect
github.com/gobwas/ws v1.3.0 // indirect
github.com/gobwas/ws v1.4.0 // indirect
github.com/goccy/go-yaml v1.18.0 // indirect
github.com/golang/groupcache v0.0.0-20210331224755-41bb18bfe9da // indirect
github.com/google/pprof v0.0.0-20230207041349-798e818bf904 // indirect
@ -85,9 +92,8 @@ require (
github.com/hashicorp/go-multierror v1.1.1 // indirect
github.com/hashicorp/hcl v1.0.0 // indirect
github.com/inconshreveable/mousetrap v1.1.0 // indirect
github.com/josharian/intern v1.0.0 // indirect
github.com/knadh/koanf/maps v0.1.2 // indirect
github.com/magiconair/properties v1.8.7 // indirect
github.com/mailru/easyjson v0.7.7 // indirect
github.com/mattn/go-colorable v0.1.14 // indirect
github.com/mattn/go-isatty v0.0.20 // indirect
github.com/mitchellh/copystructure v1.2.0 // indirect
@ -114,9 +120,10 @@ require (
github.com/urfave/cli/v2 v2.27.6 // indirect
github.com/xrash/smetrics v0.0.0-20240521201337-686a1a2994c1 // indirect
go.uber.org/atomic v1.11.0 // indirect
golang.org/x/mod v0.24.0 // indirect
golang.org/x/sync v0.14.0 // indirect
golang.org/x/tools v0.33.0 // indirect
go.yaml.in/yaml/v3 v3.0.3 // indirect
golang.org/x/mod v0.29.0 // indirect
golang.org/x/sync v0.18.0 // indirect
golang.org/x/tools v0.38.0 // indirect
gopkg.in/ini.v1 v1.67.0 // indirect
gopkg.in/yaml.v3 v3.0.1 // indirect
)

229
go.sum
View file

@ -72,7 +72,6 @@ github.com/alecthomas/template v0.0.0-20160405071501-a0175ee3bccc/go.mod h1:LOuy
github.com/alecthomas/template v0.0.0-20190718012654-fb15b899a751/go.mod h1:LOuyumcjzFXgccqObfd/Ljyb9UuFJ6TxHnclSeseNhc=
github.com/alecthomas/units v0.0.0-20151022065526-2efee857e7cf/go.mod h1:ybxpYRFXyAe+OPACYpWeL0wqObRcbAqCMya13uyzqw0=
github.com/alecthomas/units v0.0.0-20190717042225-c3de453c63f4/go.mod h1:ybxpYRFXyAe+OPACYpWeL0wqObRcbAqCMya13uyzqw0=
github.com/alecthomas/units v0.0.0-20190924025748-f65c72e2690d/go.mod h1:rBZYJk541a8SKzHPHnH3zbiI+7dagKZ0cgpgrD7Fyho=
github.com/anacrolix/dms v1.2.2 h1:0mk2/DXNqa5KDDbaLgFPf3oMV6VCGdFNh3d/gt4oafM=
github.com/anacrolix/dms v1.2.2/go.mod h1:msPKAoppoNRfrYplJqx63FZ+VipDZ4Xsj3KzIQxyU7k=
github.com/anacrolix/envpprof v0.0.0-20180404065416-323002cec2fa/go.mod h1:KgHhUaQMc8cC0+cEflSgCFNFbKwi5h54gqtVn8yhP7c=
@ -86,10 +85,10 @@ github.com/andybalholm/brotli v1.0.5 h1:8uQZIdzKmjc/iuPu7O2ioW48L81FgatrcpfFmiq/
github.com/andybalholm/brotli v1.0.5/go.mod h1:fO7iG3H7G2nSZ7m0zPUDn85XEX2GTukHGRSepvi9Eig=
github.com/andybalholm/cascadia v1.3.3 h1:AG2YHrzJIm4BZ19iwJ/DAua6Btl3IwJX+VI4kktS1LM=
github.com/andybalholm/cascadia v1.3.3/go.mod h1:xNd9bqTn98Ln4DwST8/nG+H0yuB8Hmgu1YHNnWw0GeA=
github.com/antchfx/htmlquery v1.3.0 h1:5I5yNFOVI+egyia5F2s/5Do2nFWxJz41Tr3DyfKD25E=
github.com/antchfx/htmlquery v1.3.0/go.mod h1:zKPDVTMhfOmcwxheXUsx4rKJy8KEY/PU6eXr/2SebQ8=
github.com/antchfx/xpath v1.2.3 h1:CCZWOzv5bAqjVv0offZ2LVgVYFbeldKQVuLNbViZdes=
github.com/antchfx/xpath v1.2.3/go.mod h1:i54GszH55fYfBmoZXapTHN8T8tkcHfRgLyVwwqzXNcs=
github.com/antchfx/htmlquery v1.3.5 h1:aYthDDClnG2a2xePf6tys/UyyM/kRcsFRm+ifhFKoU0=
github.com/antchfx/htmlquery v1.3.5/go.mod h1:5oyIPIa3ovYGtLqMPNjBF2Uf25NPCKsMjCnQ8lvjaoA=
github.com/antchfx/xpath v1.3.5 h1:PqbXLC3TkfeZyakF5eeh3NTWEbYl4VHNVeufANzDbKQ=
github.com/antchfx/xpath v1.3.5/go.mod h1:i54GszH55fYfBmoZXapTHN8T8tkcHfRgLyVwwqzXNcs=
github.com/antihax/optional v1.0.0/go.mod h1:uupD/76wgC+ih3iEmQUL+0Ugr19nfwCT1kdvxnR2qWY=
github.com/arbovm/levenshtein v0.0.0-20160628152529-48b4e1c0c4d0 h1:jfIu9sQUG6Ig+0+Ap1h4unLjW6YQJpKZVmUzxsD4E/Q=
github.com/arbovm/levenshtein v0.0.0-20160628152529-48b4e1c0c4d0/go.mod h1:t2tdKJDJF9BV14lnkjHmOQgcvEKgtqs5a1N3LNdJhGE=
@ -104,16 +103,6 @@ github.com/asticode/go-astisub v0.25.1 h1:RZMGfZPp7CXOkI6g+zCU7DRLuciGPGup921uKZ
github.com/asticode/go-astisub v0.25.1/go.mod h1:WTkuSzFB+Bp7wezuSf2Oxulj5A8zu2zLRVFf6bIFQK8=
github.com/asticode/go-astits v1.8.0 h1:rf6aiiGn/QhlFjNON1n5plqF3Fs025XLUwiQ0NB6oZg=
github.com/asticode/go-astits v1.8.0/go.mod h1:DkOWmBNQpnr9mv24KfZjq4JawCFX1FCqjLVGvO0DygQ=
github.com/aws/aws-sdk-go-v2 v1.9.2/go.mod h1:cK/D0BBs0b/oWPIcX/Z/obahJK1TT7IPVjy53i/mX/4=
github.com/aws/aws-sdk-go-v2/config v1.8.3/go.mod h1:4AEiLtAb8kLs7vgw2ZV3p2VZ1+hBavOc84hqxVNpCyw=
github.com/aws/aws-sdk-go-v2/credentials v1.4.3/go.mod h1:FNNC6nQZQUuyhq5aE5c7ata8o9e4ECGmS4lAXC7o1mQ=
github.com/aws/aws-sdk-go-v2/feature/ec2/imds v1.6.0/go.mod h1:gqlclDEZp4aqJOancXK6TN24aKhT0W0Ae9MHk3wzTMM=
github.com/aws/aws-sdk-go-v2/internal/ini v1.2.4/go.mod h1:ZcBrrI3zBKlhGFNYWvju0I3TR93I7YIgAfy82Fh4lcQ=
github.com/aws/aws-sdk-go-v2/service/appconfig v1.4.2/go.mod h1:FZ3HkCe+b10uFZZkFdvf98LHW21k49W8o8J366lqVKY=
github.com/aws/aws-sdk-go-v2/service/internal/presigned-url v1.3.2/go.mod h1:72HRZDLMtmVQiLG2tLfQcaWLCssELvGl+Zf2WVxMmR8=
github.com/aws/aws-sdk-go-v2/service/sso v1.4.2/go.mod h1:NBvT9R1MEF+Ud6ApJKM0G+IkPchKS7p7c2YPKwHmBOk=
github.com/aws/aws-sdk-go-v2/service/sts v1.7.2/go.mod h1:8EzeIqfWt2wWT4rJVu3f21TfrhJ8AEMzVybRNSb/b4g=
github.com/aws/smithy-go v1.8.0/go.mod h1:SObp3lf9smib00L/v3U2eAKG8FyQ7iLrJnQiAmR5n+E=
github.com/beorn7/perks v0.0.0-20180321164747-3a771d992973/go.mod h1:Dwedo/Wpr24TaqPxmxbtue+5NUziq4I4S80YR8gNf3Q=
github.com/beorn7/perks v1.0.0/go.mod h1:KWe93zE9D1o94FZ5RNwFwVgaQK1VOXiVxmqh+CedLV8=
github.com/beorn7/perks v1.0.1/go.mod h1:G2ZrVWU2WbWT9wwq4/hrbKbnv/1ERSJQ0ibhJ6rlkpw=
@ -127,13 +116,12 @@ github.com/census-instrumentation/opencensus-proto v0.3.0/go.mod h1:f6KPmirojxKA
github.com/cespare/xxhash v1.1.0/go.mod h1:XrSqR1VqqWfGrhpAt58auRo0WTKS1nRRg3ghfAqPWnc=
github.com/cespare/xxhash/v2 v2.1.1/go.mod h1:VGX0DQ3Q6kWi7AoAeZDth3/j3BFtOZR5XLFGgcrjCOs=
github.com/cespare/xxhash/v2 v2.1.2/go.mod h1:VGX0DQ3Q6kWi7AoAeZDth3/j3BFtOZR5XLFGgcrjCOs=
github.com/chromedp/cdproto v0.0.0-20230802225258-3cf4e6d46a89/go.mod h1:GKljq0VrfU4D5yc+2qA6OVr8pmO/MBbPEWqWQ/oqGEs=
github.com/chromedp/cdproto v0.0.0-20231007061347-18b01cd81617 h1:/5dwcyi5WOawM1Iz6MjrYqB90TRIdZv3O0fVHEJb86w=
github.com/chromedp/cdproto v0.0.0-20231007061347-18b01cd81617/go.mod h1:GKljq0VrfU4D5yc+2qA6OVr8pmO/MBbPEWqWQ/oqGEs=
github.com/chromedp/chromedp v0.9.2 h1:dKtNz4kApb06KuSXoTQIyUC2TrA0fhGDwNZf3bcgfKw=
github.com/chromedp/chromedp v0.9.2/go.mod h1:LkSXJKONWTCHAfQasKFUZI+mxqS4tZqhmtGzzhLsnLs=
github.com/chromedp/sysutil v1.0.0 h1:+ZxhTpfpZlmchB58ih/LBHX52ky7w2VhQVKQMucy3Ic=
github.com/chromedp/sysutil v1.0.0/go.mod h1:kgWmDdq8fTzXYcKIBqIYvRRTnYb9aNS9moAV0xufSww=
github.com/chromedp/cdproto v0.0.0-20250803210736-d308e07a266d h1:ZtA1sedVbEW7EW80Iz2GR3Ye6PwbJAJXjv7D74xG6HU=
github.com/chromedp/cdproto v0.0.0-20250803210736-d308e07a266d/go.mod h1:NItd7aLkcfOA/dcMXvl8p1u+lQqioRMq/SqDp71Pb/k=
github.com/chromedp/chromedp v0.14.2 h1:r3b/WtwM50RsBZHMUm9fsNhhzRStTHrKdr2zmwbZSzM=
github.com/chromedp/chromedp v0.14.2/go.mod h1:rHzAv60xDE7VNy/MYtTUrYreSc0ujt2O1/C3bzctYBo=
github.com/chromedp/sysutil v1.1.0 h1:PUFNv5EcprjqXZD9nJb9b/c9ibAbxiYo4exNWZyipwM=
github.com/chromedp/sysutil v1.1.0/go.mod h1:WiThHUdltqCNKGc4gaU50XgYjwjYIhKWoHGPTUfWTJ8=
github.com/chzyer/logex v1.1.10/go.mod h1:+Ywpsq7O8HXn0nuIou7OrIPyXbp3wmkHB+jjWRnGsAI=
github.com/chzyer/logex v1.2.0/go.mod h1:9+9sk7u7pGNWYMkh0hdiL++6OeibzJccyQU4p4MedaY=
github.com/chzyer/readline v0.0.0-20180603132655-2972be24d48e/go.mod h1:nSuG5e5PlCu98SY8svDHJxuZscDgtXS6KTTbou5AhLI=
@@ -185,7 +173,6 @@ github.com/dop251/goja_nodejs v0.0.0-20211022123610-8dd9abb0616d/go.mod h1:DngW8
github.com/doug-martin/goqu/v9 v9.18.0 h1:/6bcuEtAe6nsSMVK/M+fOiXUNfyFF3yYtE07DBPFMYY=
github.com/doug-martin/goqu/v9 v9.18.0/go.mod h1:nf0Wc2/hV3gYK9LiyqIrzBEVGlI8qW3GuDCEobC4wBQ=
github.com/dustin/go-humanize v0.0.0-20180421182945-02af3965c54e/go.mod h1:HtrtbFcZ19U5GC7JDqmcUSB87Iq5E25KnS6fMYU6eOk=
github.com/dustin/go-humanize v1.0.0/go.mod h1:HtrtbFcZ19U5GC7JDqmcUSB87Iq5E25KnS6fMYU6eOk=
github.com/envoyproxy/go-control-plane v0.9.0/go.mod h1:YTl/9mNaCwkRvm6d1a2C3ymFceY/DCBVvsKhRF0iEA4=
github.com/envoyproxy/go-control-plane v0.9.1-0.20191026205805-5f8ba28d4473/go.mod h1:YTl/9mNaCwkRvm6d1a2C3ymFceY/DCBVvsKhRF0iEA4=
github.com/envoyproxy/go-control-plane v0.9.4/go.mod h1:6rpuAdCZL397s3pYoYcLgu1mIlRU8Am5FuJP05cCM98=
@@ -200,19 +187,17 @@ github.com/envoyproxy/protoc-gen-validate v0.6.2/go.mod h1:2t7qjJNvHPx8IjnBOzl9E
github.com/fatih/color v1.7.0/go.mod h1:Zm6kSWBoL9eyXnKyktHP6abPY2pDugNf5KwzbycvMj4=
github.com/fatih/color v1.9.0/go.mod h1:eQcE1qtQxscV5RaZvpXrrb8Drkc3/DdQ+uUYCNjL+zU=
github.com/fatih/color v1.13.0/go.mod h1:kLAiJbzzSOZDVNGyDpeOxJ47H46qBXwg5ILebYFFOfk=
github.com/fatih/structs v1.1.0/go.mod h1:9NiDSp5zOcgEDl+j00MP/WkGVPOlPRLejGD8Ga6PJ7M=
github.com/frankban/quicktest v1.14.6 h1:7Xjx+VpznH+oBnejlPUj8oUpdxnVs4f8XU8WnHkI4W8=
github.com/frankban/quicktest v1.14.6/go.mod h1:4ptaffx2x8+WTWXmUCuVU6aPUX1/Mz7zb5vbUoiM6w0=
github.com/fsnotify/fsnotify v1.4.9/go.mod h1:znqG4EE+3YCdAaPaxE2ZRY/06pZUdp0tY4IgpuI1SZQ=
github.com/fsnotify/fsnotify v1.5.1/go.mod h1:T3375wBYaZdLLcVNkcVbzGHY7f1l/uK5T5Ai1i3InKU=
github.com/fsnotify/fsnotify v1.6.0 h1:n+5WquG0fcWoWp6xPWfHdbskMCQaFnG6PfBrh1Ky4HY=
github.com/fsnotify/fsnotify v1.6.0/go.mod h1:sl3t1tCWJFWoRz9R8WJCbQihKKwmorjAbSClcnxKAGw=
github.com/fsnotify/fsnotify v1.9.0 h1:2Ml+OJNzbYCTzsxtv8vKSFD9PbJjmhYF14k/jKC7S9k=
github.com/fsnotify/fsnotify v1.9.0/go.mod h1:8jBTzvmWwFyi3Pb8djgCCO5IBqzKJ/Jwo8TRcHyHii0=
github.com/ghodss/yaml v1.0.0/go.mod h1:4dBDuWmgqj2HViK6kFavaiC9ZROes6MMH2rRYeMEF04=
github.com/glycerine/go-unsnap-stream v0.0.0-20180323001048-9f0cb55181dd/go.mod h1:/20jfyN9Y5QPEAprSgKAUr+glWDY39ZiUEAYOEv5dsE=
github.com/glycerine/goconvey v0.0.0-20180728074245-46e3a41ad493/go.mod h1:Ogl1Tioa0aV7gstGFO7KhffUsb9M4ydbEbbxpcEDc24=
github.com/go-chi/chi/v5 v5.0.7/go.mod h1:DslCQbL2OYiznFReuXYUmQ2hGd1aDpCnlMNITLSKoi8=
github.com/go-chi/chi/v5 v5.0.12 h1:9euLV5sTrTNTRUU9POmDUvfxyj6LAABLUcEWO+JJb4s=
github.com/go-chi/chi/v5 v5.0.12/go.mod h1:DslCQbL2OYiznFReuXYUmQ2hGd1aDpCnlMNITLSKoi8=
github.com/go-chi/chi/v5 v5.2.2 h1:CMwsvRVTbXVytCk1Wd72Zy1LAsAh9GxMmSNWLHCG618=
github.com/go-chi/chi/v5 v5.2.2/go.mod h1:L2yAIGWB3H+phAw1NxKwWM+7eUH/lU8pOMm5hHcoops=
github.com/go-chi/cors v1.2.1 h1:xEC8UT3Rlp2QuWNEr4Fs/c2EAGVKBwy/1vHx3bppil4=
github.com/go-chi/cors v1.2.1/go.mod h1:sSbTewc+6wYHBBCW7ytsFSn836hqM7JxpglAy2Vzc58=
github.com/go-chi/httplog v0.3.1 h1:uC3IUWCZagtbCinb3ypFh36SEcgd6StWw2Bu0XSXRtg=
@@ -220,31 +205,28 @@ github.com/go-chi/httplog v0.3.1/go.mod h1:UoiQQ/MTZH5V6JbNB2FzF0DynTh5okpXxlhsy
github.com/go-gl/glfw v0.0.0-20190409004039-e6da0acd62b1/go.mod h1:vR7hzQXu2zJy9AVAgeJqvqgH9Q5CA+iKCZ2gyEVpxRU=
github.com/go-gl/glfw/v3.3/glfw v0.0.0-20191125211704-12ad95a8df72/go.mod h1:tQ2UAYgL5IevRw8kRxooKSPJfGvJ9fJQFa0TUsXzTg8=
github.com/go-gl/glfw/v3.3/glfw v0.0.0-20200222043503-6f7a984d4dc4/go.mod h1:tQ2UAYgL5IevRw8kRxooKSPJfGvJ9fJQFa0TUsXzTg8=
github.com/go-json-experiment/json v0.0.0-20250725192818-e39067aee2d2 h1:iizUGZ9pEquQS5jTGkh4AqeeHCMbfbjeb0zMt0aEFzs=
github.com/go-json-experiment/json v0.0.0-20250725192818-e39067aee2d2/go.mod h1:TiCD2a1pcmjd7YnhGH0f/zKNcCD06B029pHhzV23c2M=
github.com/go-kit/kit v0.8.0/go.mod h1:xBxKIO96dXMWWy0MnWVtmwkA9/13aqxPnvrjFYMA2as=
github.com/go-kit/kit v0.9.0/go.mod h1:xBxKIO96dXMWWy0MnWVtmwkA9/13aqxPnvrjFYMA2as=
github.com/go-kit/log v0.1.0/go.mod h1:zbhenjAZHb184qTLMA9ZjW7ThYL0H2mk7Q6pNt4vbaY=
github.com/go-ldap/ldap v3.0.2+incompatible/go.mod h1:qfd9rJvER9Q0/D/Sqn1DfHRoBp40uXYvFoEVrNEPqRc=
github.com/go-logfmt/logfmt v0.3.0/go.mod h1:Qt1PoO58o5twSAckw1HlFXLmHsOX5/0LbT9GBnD5lWE=
github.com/go-logfmt/logfmt v0.4.0/go.mod h1:3RMwSq7FuexP4Kalkev3ejPJsZTpXXBr9+V4qmtdjCk=
github.com/go-logfmt/logfmt v0.5.0/go.mod h1:wCYkCAKZfumFQihp8CzCvQ3paCTfi41vtzG1KdI/P7A=
github.com/go-sourcemap/sourcemap v2.1.3+incompatible h1:W1iEw64niKVGogNgBN3ePyLFfuisuzeidWPMPWmECqU=
github.com/go-sourcemap/sourcemap v2.1.3+incompatible/go.mod h1:F8jJfvm2KbVjc5NqelyYJmf/v5J0dwNLS2mL4sNA1Jg=
github.com/go-sql-driver/mysql v1.6.0/go.mod h1:DCzpHaOWr8IXmIStZouvnhqoel9Qv2LBy8hT2VhHyBg=
github.com/go-sql-driver/mysql v1.8.1 h1:LedoTUt/eveggdHS9qUFC1EFSa8bU2+1pZjSRpvNJ1Y=
github.com/go-sql-driver/mysql v1.8.1/go.mod h1:wEBSXgmK//2ZFJyE+qWnIsVGmvmEKlqwuVSjsCm7DZg=
github.com/go-stack/stack v1.8.0/go.mod h1:v0f6uXyyMGvRgIKkXu+yp6POWl0qKG85gN/melR3HDY=
github.com/go-test/deep v1.0.2-0.20181118220953-042da051cf31/go.mod h1:wGDj63lr65AM2AQyKZd/NYHGb0R+1RLqB8NKt3aSFNA=
github.com/go-toast/toast v0.0.0-20190211030409-01e6764cf0a4 h1:qZNfIGkIANxGv/OqtnntR4DfOY2+BgwR60cAcu/i3SE=
github.com/go-toast/toast v0.0.0-20190211030409-01e6764cf0a4/go.mod h1:kW3HQ4UdaAyrUCSSDR4xUzBKW6O2iA4uHhk7AtyYp10=
github.com/go-viper/mapstructure/v2 v2.2.1 h1:ZAaOCxANMuZx5RCeg0mBdEZk7DZasvvZIxtHqx8aGss=
github.com/go-viper/mapstructure/v2 v2.2.1/go.mod h1:oJDH3BJKyqBA2TXFhDsKDGDTlndYOZ6rGS0BRZIxGhM=
github.com/go-viper/mapstructure/v2 v2.4.0 h1:EBsztssimR/CONLSZZ04E8qAkxNYq4Qp9LvH92wZUgs=
github.com/go-viper/mapstructure/v2 v2.4.0/go.mod h1:oJDH3BJKyqBA2TXFhDsKDGDTlndYOZ6rGS0BRZIxGhM=
github.com/gobwas/httphead v0.1.0 h1:exrUm0f4YX0L7EBwZHuCF4GDp8aJfVeBrlLQrs6NqWU=
github.com/gobwas/httphead v0.1.0/go.mod h1:O/RXo79gxV8G+RqlR/otEwx4Q36zl9rqC5u12GKvMCM=
github.com/gobwas/pool v0.2.1 h1:xfeeEhW7pwmX8nuLVlqbzVc7udMDrwetjEv+TZIz1og=
github.com/gobwas/pool v0.2.1/go.mod h1:q8bcK0KcYlCgd9e7WYLm9LpyS+YeLd8JVDW6WezmKEw=
github.com/gobwas/ws v1.2.1/go.mod h1:hRKAFb8wOxFROYNsT1bqfWnhX+b5MFeJM9r2ZSwg/KY=
github.com/gobwas/ws v1.3.0 h1:sbeU3Y4Qzlb+MOzIe6mQGf7QR4Hkv6ZD0qhGkBFL2O0=
github.com/gobwas/ws v1.3.0/go.mod h1:hRKAFb8wOxFROYNsT1bqfWnhX+b5MFeJM9r2ZSwg/KY=
github.com/gobwas/ws v1.4.0 h1:CTaoG1tojrh4ucGPcoJFiAQUAsEWekEWvLy7GsVNqGs=
github.com/gobwas/ws v1.4.0/go.mod h1:G3gNqMNtPppf5XUz7O4shetPpcZ1VJ7zt18dlUeakrc=
github.com/goccy/go-yaml v1.18.0 h1:8W7wMFS12Pcas7KU+VVkaiCng+kG8QiFeFwzFb+rwuw=
github.com/goccy/go-yaml v1.18.0/go.mod h1:XBurs7gK8ATbW4ZPGKgcbrY1Br56PdM69F7LkFRi1kA=
github.com/godbus/dbus/v5 v5.0.4/go.mod h1:xhWf0FNVPg57R7Z0UbKHbJfkEywrmjJnf7w5xrFpKfA=
@@ -288,7 +270,6 @@ github.com/golang/protobuf v1.5.0/go.mod h1:FsONVRAS9T7sI+LIUmWTfcYkHO4aIWwzhcaS
github.com/golang/protobuf v1.5.1/go.mod h1:DopwsBzvsk0Fs44TXzsVbJyPhcCPeIwnvohx4u74HPM=
github.com/golang/protobuf v1.5.2/go.mod h1:XVQd3VNwM+JqD3oG2Ue2ip4fOMUkwXdXDdiuN0vRsmY=
github.com/golang/snappy v0.0.0-20180518054509-2e65f85255db/go.mod h1:/XxbfmMg8lxefKM7IXC3fBNl/7bRcc72aCRzEWrmP2Q=
github.com/golang/snappy v0.0.1/go.mod h1:/XxbfmMg8lxefKM7IXC3fBNl/7bRcc72aCRzEWrmP2Q=
github.com/golang/snappy v0.0.3/go.mod h1:/XxbfmMg8lxefKM7IXC3fBNl/7bRcc72aCRzEWrmP2Q=
github.com/google/btree v0.0.0-20180124185431-e89373fe6b4a/go.mod h1:lNA+9X1NB3Zf8V7Ke586lFgjr2dZNuvo3lPJSGZ5JPQ=
github.com/google/btree v0.0.0-20180813153112-4030bb1f1f0c/go.mod h1:lNA+9X1NB3Zf8V7Ke586lFgjr2dZNuvo3lPJSGZ5JPQ=
@@ -305,7 +286,7 @@ github.com/google/go-cmp v0.5.3/go.mod h1:v8dTdLbMG2kIc/vJvl+f65V22dbkXbowE6jgT/
github.com/google/go-cmp v0.5.4/go.mod h1:v8dTdLbMG2kIc/vJvl+f65V22dbkXbowE6jgT/gNBxE=
github.com/google/go-cmp v0.5.5/go.mod h1:v8dTdLbMG2kIc/vJvl+f65V22dbkXbowE6jgT/gNBxE=
github.com/google/go-cmp v0.5.6/go.mod h1:v8dTdLbMG2kIc/vJvl+f65V22dbkXbowE6jgT/gNBxE=
github.com/google/go-cmp v0.5.7/go.mod h1:n+brtR0CgQNWTVd5ZUFpTBC8YFBDLK/h/bpaJ8/DtOE=
github.com/google/go-cmp v0.6.0/go.mod h1:17dUlkBOakJ0+DkrSSNjCkIjxS6bF9zb3elmeNGIjoY=
github.com/google/go-cmp v0.7.0 h1:wk8382ETsv4JYUZwIsn6YpYiWiBsYLSJiTsyBybVuN8=
github.com/google/go-cmp v0.7.0/go.mod h1:pXiqmnSA92OHEEa9HXL2W4E7lf9JzCmGVUdgjX3N/iU=
github.com/google/gofuzz v1.0.0/go.mod h1:dBl0BpW6vV/+mYPU4Po3pmUjxk6FQPldtuIdl/M65Eg=
@@ -346,11 +327,9 @@ github.com/gorilla/sessions v1.2.1 h1:DHd3rPN5lE3Ts3D8rKkQ8x/0kqfeNmBAaiSi+o7Fsg
github.com/gorilla/sessions v1.2.1/go.mod h1:dk2InVEVJ0sfLlnXv9EAgkf6ecYs/i80K/zI+bUmuGM=
github.com/gorilla/websocket v1.5.0 h1:PPwGk2jz7EePpoHN/+ClbZu8SPxiqlu12wZP/3sWmnc=
github.com/gorilla/websocket v1.5.0/go.mod h1:YR8l580nyteQvAITg2hZ9XVh4b55+EU/adAjf1fMHhE=
github.com/grpc-ecosystem/go-grpc-prometheus v1.2.0/go.mod h1:8NvIoxWQoOIhqOTXgfV/d3M/q6VIi02HzZEHgUlZvzk=
github.com/grpc-ecosystem/grpc-gateway v1.16.0/go.mod h1:BDjrQk3hbvj6Nolgz8mAMFbcEtjT1g+wF4CSlocrBnw=
github.com/hashicorp/consul/api v1.11.0/go.mod h1:XjsvQN+RJGWI2TWy1/kqaE16HrR2J/FWgkYjdZQsX9M=
github.com/hashicorp/consul/api v1.12.0/go.mod h1:6pVBMo0ebnYdt2S3H87XhekM/HHrUoTD2XXb/VrZVy0=
github.com/hashicorp/consul/api v1.13.0/go.mod h1:ZlVrynguJKcYr54zGaDbaL3fOvKC9m72FhPvA8T35KQ=
github.com/hashicorp/consul/sdk v0.8.0/go.mod h1:GBvyrGALthsZObzUGsfgHZQDXjg4lOjagTIwIR1vPms=
github.com/hashicorp/errwrap v1.0.0/go.mod h1:YH+1FKiLXxHSkmPseP+kNlulaMuP3n2brvKWEqk/Jc4=
github.com/hashicorp/errwrap v1.1.0 h1:OxrOeh75EUXMY8TBjag2fzXGZ40LB6IKw45YeGUDY2I=
@@ -358,8 +337,6 @@ github.com/hashicorp/errwrap v1.1.0/go.mod h1:YH+1FKiLXxHSkmPseP+kNlulaMuP3n2brv
github.com/hashicorp/go-cleanhttp v0.5.0/go.mod h1:JpRdi6/HCYpAwUzNwuwqhbovhLtngrth3wmdIIUrZ80=
github.com/hashicorp/go-cleanhttp v0.5.1/go.mod h1:JpRdi6/HCYpAwUzNwuwqhbovhLtngrth3wmdIIUrZ80=
github.com/hashicorp/go-cleanhttp v0.5.2/go.mod h1:kO/YDlP8L1346E6Sodw+PrpBSV4/SoxCXGY6BqNFT48=
github.com/hashicorp/go-hclog v0.0.0-20180709165350-ff2cf002a8dd/go.mod h1:9bjs9uLqI8l75knNv3lV1kA55veR+WUPSiKIWcQHudI=
github.com/hashicorp/go-hclog v0.8.0/go.mod h1:5CU+agLiy3J7N7QjHK5d05KxGsuXiQLrjA0H7acj2lQ=
github.com/hashicorp/go-hclog v0.12.0/go.mod h1:whpDNt7SSdeAju8AWKIWsul05p54N/39EeqMAyrmvFQ=
github.com/hashicorp/go-hclog v1.0.0/go.mod h1:whpDNt7SSdeAju8AWKIWsul05p54N/39EeqMAyrmvFQ=
github.com/hashicorp/go-immutable-radix v1.0.0/go.mod h1:0y9vanUI8NX6FsYoO3zeMjhV/C5i9g4Q3DwcSNZ4P60=
@@ -369,17 +346,12 @@ github.com/hashicorp/go-multierror v1.0.0/go.mod h1:dHtQlpGsu+cZNNAkkCN/P3hoUDHh
github.com/hashicorp/go-multierror v1.1.0/go.mod h1:spPvp8C1qA32ftKqdAHm4hHTbPw+vmowP0z+KUhOZdA=
github.com/hashicorp/go-multierror v1.1.1 h1:H5DkEtf6CXdFp0N0Em5UCwQpXMWke8IA0+lD48awMYo=
github.com/hashicorp/go-multierror v1.1.1/go.mod h1:iw975J/qwKPdAO1clOe2L8331t/9/fmwbPZ6JB6eMoM=
github.com/hashicorp/go-plugin v1.0.1/go.mod h1:++UyYGoz3o5w9ZzAdZxtQKrWWP+iqPBn3cQptSMzBuY=
github.com/hashicorp/go-retryablehttp v0.5.3/go.mod h1:9B5zBasrRhHXnJnui7y6sL7es7NDiJgTc6Er0maI1Xs=
github.com/hashicorp/go-retryablehttp v0.5.4/go.mod h1:9B5zBasrRhHXnJnui7y6sL7es7NDiJgTc6Er0maI1Xs=
github.com/hashicorp/go-rootcerts v1.0.1/go.mod h1:pqUvnprVnM5bf7AOirdbb01K4ccR319Vf4pU3K5EGc8=
github.com/hashicorp/go-rootcerts v1.0.2/go.mod h1:pqUvnprVnM5bf7AOirdbb01K4ccR319Vf4pU3K5EGc8=
github.com/hashicorp/go-sockaddr v1.0.0/go.mod h1:7Xibr9yA9JjQq1JpNB2Vw7kxv8xerXegt+ozgdvDeDU=
github.com/hashicorp/go-sockaddr v1.0.2/go.mod h1:rB4wwRAUzs07qva3c5SdrY/NEtAUjGlgmH/UkBUC97A=
github.com/hashicorp/go-syslog v1.0.0/go.mod h1:qPfqrKkXGihmCqbJM2mZgkZGvKG1dFdvsLplgctolz4=
github.com/hashicorp/go-uuid v1.0.0/go.mod h1:6SBZvOh/SIDV7/2o3Jml5SYk/TvGqwFJ/bN7x4byOro=
github.com/hashicorp/go-uuid v1.0.1/go.mod h1:6SBZvOh/SIDV7/2o3Jml5SYk/TvGqwFJ/bN7x4byOro=
github.com/hashicorp/go-version v1.1.0/go.mod h1:fltr4n8CU8Ke44wwGCBoEymUuxUHl09ZGVZPK5anwXA=
github.com/hashicorp/golang-lru v0.5.0/go.mod h1:/m3WP610KZHVQ1SGc6re/UDhFvYD7pJ4Ao+sR/qLZy8=
github.com/hashicorp/golang-lru v0.5.1/go.mod h1:/m3WP610KZHVQ1SGc6re/UDhFvYD7pJ4Ao+sR/qLZy8=
github.com/hashicorp/golang-lru v0.5.4/go.mod h1:iADmTwqILo4mZ8BN3D2Q6+9jd8WM5uGBxy+E8yxSoD4=
@@ -394,14 +366,8 @@ github.com/hashicorp/memberlist v0.2.2/go.mod h1:MS2lj3INKhZjWNqd3N0m3J+Jxf3DAOn
github.com/hashicorp/memberlist v0.3.0/go.mod h1:MS2lj3INKhZjWNqd3N0m3J+Jxf3DAOnAH9VT3Sh9MUE=
github.com/hashicorp/serf v0.9.5/go.mod h1:UWDWwZeL5cuWDJdl0C6wrvrUwEqtQ4ZKBKKENpqIUyk=
github.com/hashicorp/serf v0.9.6/go.mod h1:TXZNMjZQijwlDvp+r0b63xZ45H7JmCmgg4gpTwn9UV4=
github.com/hashicorp/vault/api v1.0.4/go.mod h1:gDcqh3WGcR1cpF5AJz/B1UFheUEneMoIospckxBxk6Q=
github.com/hashicorp/vault/sdk v0.1.13/go.mod h1:B+hVj7TpuQY1Y/GPbCpffmgd+tSEwvhkWnjtSYCaS2M=
github.com/hashicorp/yamux v0.0.0-20180604194846-3520598351bb/go.mod h1:+NfK9FKeTrX5uv1uIXGdwYDTeHna2qgaIlx54MXqjAM=
github.com/hashicorp/yamux v0.0.0-20181012175058-2f1d1f20f75d/go.mod h1:+NfK9FKeTrX5uv1uIXGdwYDTeHna2qgaIlx54MXqjAM=
github.com/hasura/go-graphql-client v0.13.1 h1:kKbjhxhpwz58usVl+Xvgah/TDha5K2akNTRQdsEHN6U=
github.com/hasura/go-graphql-client v0.13.1/go.mod h1:k7FF7h53C+hSNFRG3++DdVZWIuHdCaTbI7siTJ//zGQ=
github.com/hjson/hjson-go/v4 v4.0.0 h1:wlm6IYYqHjOdXH1gHev4VoXCaW20HdQAGCxdOEEg2cs=
github.com/hjson/hjson-go/v4 v4.0.0/go.mod h1:KaYt3bTw3zhBjYqnXkYywcYctk0A2nxeEFTse3rH13E=
github.com/huandu/xstrings v1.0.0/go.mod h1:4qWG/gcEcfX4z/mBDHJ++3ReCw9ibxbsNJbcucJdbSo=
github.com/iancoleman/strcase v0.2.0/go.mod h1:iwCmte+B7n89clKwxIoIXy/HfoL7AsD47ZCWhYzw7ho=
github.com/ianlancetaylor/demangle v0.0.0-20181102032728-5e5cf60278f6/go.mod h1:aSSvb/t6k1mPoxDqO4vJh6VOCGPwU4O0C2/Eqndh1Sc=
@@ -412,18 +378,10 @@ github.com/inconshreveable/mousetrap v1.1.0 h1:wN+x4NVGpMsO7ErUn/mUI3vEoE6Jt13X2
github.com/inconshreveable/mousetrap v1.1.0/go.mod h1:vpF70FUmC8bwa3OWnCshd2FqLfsEA9PFc4w1p2J65bw=
github.com/jinzhu/copier v0.4.0 h1:w3ciUoD19shMCRargcpm0cm91ytaBhDvuRpz1ODO/U8=
github.com/jinzhu/copier v0.4.0/go.mod h1:DfbEm0FYsaqBcKcFuvmOZb218JkPGtvSHsKg8S8hyyg=
github.com/jmespath/go-jmespath v0.4.0/go.mod h1:T8mJZnbsbmF+m6zOOFylbeCJqk5+pHWvzYPziyZiYoo=
github.com/jmespath/go-jmespath/internal/testify v1.5.1/go.mod h1:L3OGu8Wl2/fWfCI6z80xFu9LTZmf1ZRjMHUOPmWr69U=
github.com/jmoiron/sqlx v1.4.0 h1:1PLqN7S1UYp5t4SrVVnt4nUVNemrDAtxlulVe+Qgm3o=
github.com/jmoiron/sqlx v1.4.0/go.mod h1:ZrZ7UsYB/weZdl2Bxg6jCRO9c3YHl8r3ahlKmRT4JLY=
github.com/joho/godotenv v1.3.0 h1:Zjp+RcGpHhGlrMbJzXTrZZPrWj+1vfm90La1wgB6Bhc=
github.com/joho/godotenv v1.3.0/go.mod h1:7hK45KPybAkOC6peb+G5yklZfMxEjkZhHbwpqxOKXbg=
github.com/josharian/intern v1.0.0 h1:vlS4z54oSdjm0bgjRigI+G1HpF+tI+9rE5LLzOg8HmY=
github.com/josharian/intern v1.0.0/go.mod h1:5DoeVV0s6jJacbCEi61lwdGj/aVlrQvzHFFd8Hwg//Y=
github.com/jpillora/backoff v1.0.0/go.mod h1:J/6gKK9jxlEcS3zixgDgUAsiuZ7yrSoa/FX5e0EB2j4=
github.com/json-iterator/go v1.1.6/go.mod h1:+SdeFBvtyEkXs7REEP0seUULqWtbJapLOCVDaaPEHmU=
github.com/json-iterator/go v1.1.9/go.mod h1:KdQUCv79m/52Kvf8AW2vK1V8akMuk1QjK/uOdHXbAo4=
github.com/json-iterator/go v1.1.10/go.mod h1:KdQUCv79m/52Kvf8AW2vK1V8akMuk1QjK/uOdHXbAo4=
github.com/json-iterator/go v1.1.11/go.mod h1:KdQUCv79m/52Kvf8AW2vK1V8akMuk1QjK/uOdHXbAo4=
github.com/json-iterator/go v1.1.12 h1:PV8peI4a0ysnczrg+LtxykD8LfKY9ML6u2jnxaEnrnM=
github.com/json-iterator/go v1.1.12/go.mod h1:e30LSqwooZae/UwlEbR2852Gd8hjQvJoHmT4TnhNGBo=
@@ -431,17 +389,25 @@ github.com/jstemmer/go-junit-report v0.0.0-20190106144839-af01ea7f8024/go.mod h1
github.com/jstemmer/go-junit-report v0.9.1/go.mod h1:Brl9GWCQeLvo8nXZwPNNblvFj/XSXhF0NWZEnDohbsk=
github.com/jtolds/gls v4.2.1+incompatible/go.mod h1:QJZ7F/aHp+rZTRtaJ1ow/lLfFfVYBRgL+9YlvaHOwJU=
github.com/julienschmidt/httprouter v1.2.0/go.mod h1:SYymIcj16QtmaHHD7aYtjjsJG7VTCxuUUipMqKk8s4w=
github.com/julienschmidt/httprouter v1.3.0/go.mod h1:JR6WtHb+2LUe8TCKY3cZOxFyyO8IZAc4RVcycCCAKdM=
github.com/kermieisinthehouse/gosx-notifier v0.1.2 h1:KV0KBeKK2B24kIHY7iK0jgS64Q05f4oB+hUZmsPodxQ=
github.com/kermieisinthehouse/gosx-notifier v0.1.2/go.mod h1:xyWT07azFtUOcHl96qMVvKhvKzsMcS7rKTHQyv8WTho=
github.com/kermieisinthehouse/systray v1.2.4 h1:pdH5vnl+KKjRrVCRU4g/2W1/0HVzuuJ6WXHlPPHYY6s=
github.com/kermieisinthehouse/systray v1.2.4/go.mod h1:axh6C/jNuSyC0QGtidZJURc9h+h41HNoMySoLVrhVR4=
github.com/kisielk/errcheck v1.5.0/go.mod h1:pFxgyoBC7bSaBwPgfKdkLd5X25qrDl4LWUI2bnpBCr8=
github.com/kisielk/gotool v1.0.0/go.mod h1:XhKaO+MFFWcvkIS/tQcRk01m1F5IRFswLeQ+oQHNcck=
github.com/knadh/koanf v1.5.0 h1:q2TSd/3Pyc/5yP9ldIrSdIz26MCcyNQzW0pEAugLPNs=
github.com/knadh/koanf v1.5.0/go.mod h1:Hgyjp4y8v44hpZtPzs7JZfRAW5AhN7KfZcwv1RYggDs=
github.com/knadh/koanf/maps v0.1.2 h1:RBfmAW5CnZT+PJ1CVc1QSJKf4Xu9kxfQgYVQSu8hpbo=
github.com/knadh/koanf/maps v0.1.2/go.mod h1:npD/QZY3V6ghQDdcQzl1W4ICNVTkohC8E73eI2xW4yI=
github.com/knadh/koanf/parsers/yaml v1.1.0 h1:3ltfm9ljprAHt4jxgeYLlFPmUaunuCgu1yILuTXRdM4=
github.com/knadh/koanf/parsers/yaml v1.1.0/go.mod h1:HHmcHXUrp9cOPcuC+2wrr44GTUB0EC+PyfN3HZD9tFg=
github.com/knadh/koanf/providers/env v1.1.0 h1:U2VXPY0f+CsNDkvdsG8GcsnK4ah85WwWyJgef9oQMSc=
github.com/knadh/koanf/providers/env v1.1.0/go.mod h1:QhHHHZ87h9JxJAn2czdEl6pdkNnDh/JS1Vtsyt65hTY=
github.com/knadh/koanf/providers/file v1.2.0 h1:hrUJ6Y9YOA49aNu/RSYzOTFlqzXSCpmYIDXI7OJU6+U=
github.com/knadh/koanf/providers/file v1.2.0/go.mod h1:bp1PM5f83Q+TOUu10J/0ApLBd9uIzg+n9UgthfY+nRA=
github.com/knadh/koanf/providers/posflag v1.0.1 h1:EnMxHSrPkYCFnKgBUl5KBgrjed8gVFrcXDzaW4l/C6Y=
github.com/knadh/koanf/providers/posflag v1.0.1/go.mod h1:3Wn3+YG3f4ljzRyCUgIwH7G0sZ1pMjCOsNBovrbKmAk=
github.com/knadh/koanf/v2 v2.2.1 h1:jaleChtw85y3UdBnI0wCqcg1sj1gPoz6D3caGNHtrNE=
github.com/knadh/koanf/v2 v2.2.1/go.mod h1:PSFru3ufQgTsI7IF+95rf9s8XA1+aHxKuO/W+dPoHEY=
github.com/konsorten/go-windows-terminal-sequences v1.0.1/go.mod h1:T0+1ngSBFLxvqU3pZ+m/2kptfBszLMUkC4ZK/EgS/cQ=
github.com/konsorten/go-windows-terminal-sequences v1.0.3/go.mod h1:T0+1ngSBFLxvqU3pZ+m/2kptfBszLMUkC4ZK/EgS/cQ=
github.com/kr/fs v0.1.0/go.mod h1:FFnZGqtBN9Gxj7eW1uZ42v5BccTP0vu6NEaFoC2HwRg=
github.com/kr/logfmt v0.0.0-20140226030751-b84e30acd515/go.mod h1:+0opPa2QZZtGFBFZlji/RkVcI2GknAs/DXo4wKdlNEc=
github.com/kr/pretty v0.1.0/go.mod h1:dAy3ld7l9f0ibDNOQOHHMYYIIbhfbHSm3C4ZsoJORNo=
@@ -465,8 +431,6 @@ github.com/lyft/protoc-gen-star v0.5.3/go.mod h1:V0xaHgaf5oCCqmcxYcWiDfTiKsZsRc8
github.com/magiconair/properties v1.8.5/go.mod h1:y3VJvCyxH9uVvJTWEGAELF3aiYNyPKd5NZ3oSwXrF60=
github.com/magiconair/properties v1.8.7 h1:IeQXZAiQcpL9mgcAe1Nu6cX9LLw6ExEHKjN0VQdvPDY=
github.com/magiconair/properties v1.8.7/go.mod h1:Dhd985XPs7jluiymwWYZ0G4Z61jb3vdS329zhj2hYo0=
github.com/mailru/easyjson v0.7.7 h1:UGYAvKxe3sBsEDzO8ZeWOSlIQfWFlxbzLZe7hwFURr0=
github.com/mailru/easyjson v0.7.7/go.mod h1:xzfreul335JAWq5oZzymOObrkdz5UnU4kGfJJLY9Nlc=
github.com/mattn/go-colorable v0.0.9/go.mod h1:9vuHe8Xs5qXnSaW/c/ABM9alt+Vo+STaOChaDxuIBZU=
github.com/mattn/go-colorable v0.1.4/go.mod h1:U0ppj6V5qS13XJ6of8GYAs25YV2eR4EVcfRqFIhoBtE=
github.com/mattn/go-colorable v0.1.6/go.mod h1:u6P/XSegPjTcexA+o6vUJrdnUu04hMope9wVRipJSqc=
@@ -492,22 +456,17 @@ github.com/matttproud/golang_protobuf_extensions v1.0.1/go.mod h1:D8He9yQNgCq6Z5
github.com/miekg/dns v1.0.14/go.mod h1:W1PPwlIAgtquWBMBEV9nkV9Cazfe8ScdGz/Lj7v3Nrg=
github.com/miekg/dns v1.1.26/go.mod h1:bPDLeHnStXmXAq1m/Ch/hvfNHr14JKNPMBo3VZKjuso=
github.com/miekg/dns v1.1.41/go.mod h1:p6aan82bvRIyn+zDIv9xYNUpwa73JcSh9BKwknJysuI=
github.com/mitchellh/cli v1.0.0/go.mod h1:hNIlj7HEI86fIcpObd7a0FcrxTWetlwJDGcceTlRvqc=
github.com/mitchellh/cli v1.1.0/go.mod h1:xcISNoH86gajksDmfB23e/pu+B+GeFRMYmoHXxx3xhI=
github.com/mitchellh/copystructure v1.0.0/go.mod h1:SNtv71yrdKgLRyLFxmLdkAbkKEFWgYaq1OVrnRcwhnw=
github.com/mitchellh/copystructure v1.2.0 h1:vpKXTN4ewci03Vljg/q9QvCGUDttBOGBIa15WveJJGw=
github.com/mitchellh/copystructure v1.2.0/go.mod h1:qLl+cE2AmVv+CoeAwDPye/v+N2HKCj9FbZEVFJRxO9s=
github.com/mitchellh/go-homedir v1.1.0 h1:lukF9ziXFxDFPkA1vsr5zpc1XuPDn/wFntq5mG+4E0Y=
github.com/mitchellh/go-homedir v1.1.0/go.mod h1:SfyaCUpYCn1Vlf4IUYiD9fPX4A5wJrkLzIz1N1q0pr0=
github.com/mitchellh/go-testing-interface v0.0.0-20171004221916-a61a99592b77/go.mod h1:kRemZodwjscx+RGhAo8eIhFbs2+BFgRtFPeD/KE+zxI=
github.com/mitchellh/go-testing-interface v1.0.0/go.mod h1:kRemZodwjscx+RGhAo8eIhFbs2+BFgRtFPeD/KE+zxI=
github.com/mitchellh/go-wordwrap v1.0.0/go.mod h1:ZXFpozHsX6DPmq2I0TCekCxypsnAUbP2oI0UX1GXzOo=
github.com/mitchellh/mapstructure v0.0.0-20160808181253-ca63d7c062ee/go.mod h1:FVVH3fgwuzCH5S8UJGiWEs2h04kUh9fWfEaFds41c1Y=
github.com/mitchellh/mapstructure v1.1.2/go.mod h1:FVVH3fgwuzCH5S8UJGiWEs2h04kUh9fWfEaFds41c1Y=
github.com/mitchellh/mapstructure v1.4.3/go.mod h1:bFUtVrKA4DC2yAKiSyO/QUcy7e+RRV2QTWOzhPopBRo=
github.com/mitchellh/mapstructure v1.5.0 h1:jeMsZIYE/09sWLaz43PL7Gy6RuMjD2eJVyuac5Z2hdY=
github.com/mitchellh/mapstructure v1.5.0/go.mod h1:bFUtVrKA4DC2yAKiSyO/QUcy7e+RRV2QTWOzhPopBRo=
github.com/mitchellh/reflectwalk v1.0.0/go.mod h1:mSTlrgnPZtwu0c4WaC2kGObEpuNDbx0jmZXqmk4esnw=
github.com/mitchellh/reflectwalk v1.0.2 h1:G2LzWKi524PWgd3mLHV8Y5k7s6XUvT0Gef6zxSIeXaQ=
github.com/mitchellh/reflectwalk v1.0.2/go.mod h1:mSTlrgnPZtwu0c4WaC2kGObEpuNDbx0jmZXqmk4esnw=
github.com/modern-go/concurrent v0.0.0-20180228061459-e0a39a4cb421/go.mod h1:6dJC0mAP4ikYIbvyc7fijjWJddQyLn8Ig3JB5CqoB9Q=
@@ -519,26 +478,20 @@ github.com/modern-go/reflect2 v1.0.2 h1:xBagoLtFs94CBntxluKeaWgTMpvLxC4ur3nMaC9G
github.com/modern-go/reflect2 v1.0.2/go.mod h1:yWuevngMOJpCy52FWWMvUC8ws7m/LJsjYzDa0/r8luk=
github.com/mschoch/smat v0.0.0-20160514031455-90eadee771ae/go.mod h1:qAyveg+e4CE+eKJXWVjKXM4ck2QobLqTDytGJbLLhJg=
github.com/mwitkow/go-conntrack v0.0.0-20161129095857-cc309e4a2223/go.mod h1:qRWi+5nqEBWmkhHvq77mSJWrCKwh8bxhgT7d/eI7P4U=
github.com/mwitkow/go-conntrack v0.0.0-20190716064945-2f068394615f/go.mod h1:qRWi+5nqEBWmkhHvq77mSJWrCKwh8bxhgT7d/eI7P4U=
github.com/natefinch/pie v0.0.0-20170715172608-9a0d72014007 h1:Ohgj9L0EYOgXxkDp+bczlMBiulwmqYzQpvQNUdtt3oc=
github.com/natefinch/pie v0.0.0-20170715172608-9a0d72014007/go.mod h1:wKCOWMb6iNlvKiOToY2cNuaovSXvIiv1zDi9QDR7aGQ=
github.com/nfnt/resize v0.0.0-20180221191011-83c6a9932646 h1:zYyBkD/k9seD2A7fsi6Oo2LfFZAehjjQMERAvZLEDnQ=
github.com/nfnt/resize v0.0.0-20180221191011-83c6a9932646/go.mod h1:jpp1/29i3P1S/RLdc7JQKbRpFeM1dOBd8T9ki5s+AY8=
github.com/npillmayer/nestext v0.1.3/go.mod h1:h2lrijH8jpicr25dFY+oAJLyzlya6jhnuG+zWp9L0Uk=
github.com/nu7hatch/gouuid v0.0.0-20131221200532-179d4d0c4d8d h1:VhgPp6v9qf9Agr/56bj7Y/xa04UccTW04VP0Qed4vnQ=
github.com/nu7hatch/gouuid v0.0.0-20131221200532-179d4d0c4d8d/go.mod h1:YUTz3bUH2ZwIWBy3CJBeOBEugqcmXREj14T+iG/4k4U=
github.com/oklog/run v1.0.0/go.mod h1:dlhp/R75TPv97u0XWUtDeV/lRKWPKSdTuV0TZvrmrQA=
github.com/orisano/pixelmatch v0.0.0-20220722002657-fb0b55479cde h1:x0TT0RDC7UhAVbbWWBzr41ElhJx5tXPWkIHA2HWPRuw=
github.com/orisano/pixelmatch v0.0.0-20220722002657-fb0b55479cde/go.mod h1:nZgzbfBr3hhjoZnS66nKrHmduYNpc34ny7RK4z5/HM0=
github.com/pascaldekloe/goe v0.0.0-20180627143212-57f6aae5913c/go.mod h1:lzWF7FIEvWOWxwDKqyGYQf6ZUaNfKdP144TG7ZOy1lc=
github.com/pascaldekloe/goe v0.1.0/go.mod h1:lzWF7FIEvWOWxwDKqyGYQf6ZUaNfKdP144TG7ZOy1lc=
github.com/pelletier/go-toml v1.7.0/go.mod h1:vwGMzjaWMwyfHwgIBhI2YUM4fB6nL6lVAvS1LBMMhTE=
github.com/pelletier/go-toml v1.9.4 h1:tjENF6MfZAg8e4ZmZTeWaWiT2vXtsoO6+iuOjFhECwM=
github.com/pelletier/go-toml v1.9.4/go.mod h1:u1nR/EPcESfeI/szUZKdtJ0xRNbUoANCkoOuaOx1Y+c=
github.com/pelletier/go-toml/v2 v2.1.0 h1:FnwAJ4oYMvbT/34k9zzHuZNrhlz48GB3/s6at6/MHO4=
github.com/pelletier/go-toml/v2 v2.1.0/go.mod h1:tJU2Z3ZkXwnxa4DPO899bsyIoywizdUvyaeZurnPPDc=
github.com/philhofer/fwd v1.0.0/go.mod h1:gk3iGcWd9+svBvR0sR+KPcfE+RNWozjowpeBVG3ZVNU=
github.com/pierrec/lz4 v2.0.5+incompatible/go.mod h1:pdkljMzZIN41W+lC3N2tnIh5sFi+IEE17M5jbnwPHcY=
github.com/pkg/browser v0.0.0-20210911075715-681adbf594b8 h1:KoWmjvw+nsYOo29YJK9vDA65RGE3NrOnUtO7a+RF9HU=
github.com/pkg/browser v0.0.0-20210911075715-681adbf594b8/go.mod h1:HKlIX3XHQyzLZPlr7++PzdhaXEj94dEiJgZDTsxEqUI=
github.com/pkg/errors v0.8.0/go.mod h1:bwawxfHBFNV+L2hUp1rHADufV3IMtnDRdf1r5NINEl0=
@@ -555,24 +508,17 @@ github.com/posener/complete v1.2.3/go.mod h1:WZIdtGGp+qx0sLrYKtIRAruyNpv6hFCicSg
github.com/prometheus/client_golang v0.9.1/go.mod h1:7SWBe2y4D6OKWSNQJUaRYU/AaXPKyh/dDVn+NZz0KFw=
github.com/prometheus/client_golang v1.0.0/go.mod h1:db9x61etRT2tGnBNRi70OPL5FsnadC4Ky3P0J6CfImo=
github.com/prometheus/client_golang v1.4.0/go.mod h1:e9GMxYsXl05ICDXkRhurwBS4Q3OK1iX/F2sw+iXX5zU=
github.com/prometheus/client_golang v1.7.1/go.mod h1:PY5Wy2awLA44sXw4AOSfFBetzPP4j5+D6mVACh+pe2M=
github.com/prometheus/client_golang v1.11.1/go.mod h1:Z6t4BnS23TR94PD6BsDNk8yVqroYurpAkEiz0P2BEV0=
github.com/prometheus/client_model v0.0.0-20180712105110-5c3871d89910/go.mod h1:MbSGuTsp3dbXC40dX6PRTWyKYBIrTGTE9sqQNg2J8bo=
github.com/prometheus/client_model v0.0.0-20190129233127-fd36f4220a90/go.mod h1:xMI15A0UPsDsEKsMN9yxemIoYk6Tm2C1GtYGdfGttqA=
github.com/prometheus/client_model v0.0.0-20190812154241-14fe0d1b01d4/go.mod h1:xMI15A0UPsDsEKsMN9yxemIoYk6Tm2C1GtYGdfGttqA=
github.com/prometheus/client_model v0.2.0/go.mod h1:xMI15A0UPsDsEKsMN9yxemIoYk6Tm2C1GtYGdfGttqA=
github.com/prometheus/common v0.4.1/go.mod h1:TNfzLD0ON7rHzMJeJkieUDPYmFC7Snx/y86RQel1bk4=
github.com/prometheus/common v0.9.1/go.mod h1:yhUN8i9wzaXS3w1O07YhxHEBxD+W35wd8bs7vj7HSQ4=
github.com/prometheus/common v0.10.0/go.mod h1:Tlit/dnDKsSWFlCLTWaA1cyBgKHSMdTB80sz/V91rCo=
github.com/prometheus/common v0.26.0/go.mod h1:M7rCNAaPfAosfx8veZJCuw84e35h3Cfd9VFqTh1DIvc=
github.com/prometheus/procfs v0.0.0-20181005140218-185b4288413d/go.mod h1:c3At6R/oaqEKCNdg8wHV1ftS6bRYblBhIjjI8uT2IGk=
github.com/prometheus/procfs v0.0.2/go.mod h1:TjEm7ze935MbeOT/UhFTIMYKhuLP4wbCsTZCD3I8kEA=
github.com/prometheus/procfs v0.0.8/go.mod h1:7Qr8sr6344vo1JqZ6HhLceV9o3AJ1Ff+GxbHq6oeK9A=
github.com/prometheus/procfs v0.1.3/go.mod h1:lV6e/gmhEcM9IjHGsFOCxxuZ+z1YqCvr4OA4YeYWdaU=
github.com/prometheus/procfs v0.6.0/go.mod h1:cz+aTbrPOrUb4q7XlbU9ygM+/jj0fzG6c1xBZuNvfVA=
github.com/remeh/sizedwaitgroup v1.0.0 h1:VNGGFwNo/R5+MJBf6yrsr110p0m4/OX4S3DCy7Kyl5E=
github.com/remeh/sizedwaitgroup v1.0.0/go.mod h1:3j2R4OIe/SeS6YDhICBy22RWjJC5eNCJ1V+9+NVNYlo=
github.com/rhnvrm/simples3 v0.6.1/go.mod h1:Y+3vYm2V7Y4VijFoJHHTrja6OgPrJ2cBti8dPGkC3sA=
github.com/rogpeppe/fastuuid v1.2.0/go.mod h1:jVj6XXZzXRy/MSR5jhDC/2q6DgLz+nrA6LYCDYWNEvQ=
github.com/rogpeppe/go-internal v1.3.0/go.mod h1:M8bDsm7K2OlrFYOpmOWEs/qY81heoFRclV5y23lUDJ4=
github.com/rogpeppe/go-internal v1.6.1/go.mod h1:xXDCJY+GAPziupqXw64V24skbSoqbTEfhy4qGm1nDQc=
@@ -590,9 +536,9 @@ github.com/russross/blackfriday/v2 v2.1.0/go.mod h1:+Rmxgy9KzJVeS9/2gXHxylqXiyQD
github.com/rwcarlsen/goexif v0.0.0-20190401172101-9e8deecbddbd h1:CmH9+J6ZSsIjUK3dcGsnCnO41eRBOnY12zwkn5qVwgc=
github.com/rwcarlsen/goexif v0.0.0-20190401172101-9e8deecbddbd/go.mod h1:hPqNNc0+uJM6H+SuU8sEs5K5IQeKccPqeSjfgcKGgPk=
github.com/ryanuber/columnize v0.0.0-20160712163229-9b3edd62028f/go.mod h1:sm1tb6uqfes/u+d4ooFouqFdy9/2g9QGwK3SQygK0Ts=
github.com/ryanuber/columnize v2.1.0+incompatible/go.mod h1:sm1tb6uqfes/u+d4ooFouqFdy9/2g9QGwK3SQygK0Ts=
github.com/ryanuber/go-glob v1.0.0/go.mod h1:807d1WSdnB0XRJzKNil9Om6lcp/3a0v4qIHxIXzX/Yc=
github.com/ryszard/goskiplist v0.0.0-20150312221310-2dfbae5fcf46/go.mod h1:uAQ5PCi+MFsC7HjREoAz1BU+Mq60+05gifQSsHSDG/8=
github.com/sabhiram/go-gitignore v0.0.0-20210923224102-525f6e181f06 h1:OkMGxebDjyw0ULyrTYWeN0UNCCkmCWfjPnIA2W6oviI=
github.com/sabhiram/go-gitignore v0.0.0-20210923224102-525f6e181f06/go.mod h1:+ePHsJ1keEjQtpvf9HHw0f4ZeJ0TLRsxhunSI2hYJSs=
github.com/sagikazarmark/crypt v0.3.0/go.mod h1:uD/D+6UF4SrIR1uGEv7bBNkNqLGqUr43MRiaGWX1Nig=
github.com/sagikazarmark/crypt v0.4.0/go.mod h1:ALv2SRj7GxYV4HO9elxH9nS6M9gW+xDNxqmyJ6RfDFM=
github.com/sean-/seed v0.0.0-20170313163322-e2103e2c3529/go.mod h1:DxrIzT+xaE7yg65j358z/aeFdxmN0P9QXhEzd20vsDc=
@@ -600,7 +546,6 @@ github.com/sergi/go-diff v1.3.1 h1:xkr+Oxo4BOQKmkn/B9eMK0g5Kg/983T9DqqPHwYqD+8=
github.com/sergi/go-diff v1.3.1/go.mod h1:aMJSSKb2lpPvRNec0+w3fl7LP9IOFzdc9Pa4NFbPK1I=
github.com/sirupsen/logrus v1.2.0/go.mod h1:LxeOpSwHxABJmUn/MG1IvRgCAasNZTLOkJPxbbu5VWo=
github.com/sirupsen/logrus v1.4.2/go.mod h1:tLMulIdttU9McNUspp0xgXVQah82FyeX6MwdIuYE2rE=
github.com/sirupsen/logrus v1.6.0/go.mod h1:7uNnSEd1DgxDLC74fIahvMZmmYsHGZGEOFrfsX/uA88=
github.com/sirupsen/logrus v1.9.3 h1:dueUQJ1C2q9oE3F7wvmSGAaVtTmUizReu6fjN8uqzbQ=
github.com/sirupsen/logrus v1.9.3/go.mod h1:naHLuLoDiP4jHNo9R0sCBMtWGeIprob74mVsIT4qYEQ=
github.com/smartystreets/assertions v0.0.0-20180927180507-b2de0cb4f26d/go.mod h1:OnSkiWE9lh6wB0YB77sQom3nweQdgAjqCqsofrRNTgc=
@@ -621,8 +566,9 @@ github.com/spf13/cobra v1.7.0 h1:hyqWnYt1ZQShIddO5kBpj3vu05/++x6tJ6dg8EC572I=
github.com/spf13/cobra v1.7.0/go.mod h1:uLxZILRyS/50WlhOIKD7W6V5bgeIt+4sICxh6uRMrb0=
github.com/spf13/jwalterweatherman v1.1.0 h1:ue6voC5bR5F8YxI5S67j9i582FU4Qvo2bmqnqMYADFk=
github.com/spf13/jwalterweatherman v1.1.0/go.mod h1:aNWZUN0dPAAO/Ljvb5BEdw96iTZ0EXowPYD95IqWIGo=
github.com/spf13/pflag v1.0.5 h1:iy+VFUOCP1a+8yFto/drg2CJ5u0yRoB7fZw3DKv/JXA=
github.com/spf13/pflag v1.0.5/go.mod h1:McXfInJRrz4CZXVZOBLb0bTZqETkiAhM9Iw0y3An2Bg=
github.com/spf13/pflag v1.0.6 h1:jFzHGLGAlb3ruxLB8MhbI6A8+AQX/2eW4qeyNZXNp2o=
github.com/spf13/pflag v1.0.6/go.mod h1:McXfInJRrz4CZXVZOBLb0bTZqETkiAhM9Iw0y3An2Bg=
github.com/spf13/viper v1.10.0/go.mod h1:SoyBPwAtKDzypXNDFKN5kzH7ppppbGZtls1UpIy5AsM=
github.com/spf13/viper v1.10.1/go.mod h1:IGlFPqhNAPKRxohIzWpI5QEy4kuI7tcl5WvR+8qy1rU=
github.com/spf13/viper v1.16.0 h1:rGGH0XDZhdUOryiDWjmIvUSWpbNqisK8Wk0Vyefw8hc=
@@ -683,11 +629,8 @@ github.com/yuin/goldmark v1.4.13/go.mod h1:6yULJ656Px+3vBD8DxQVa3kxgyrAnzto9xy5t
github.com/zencoder/go-dash/v3 v3.0.2 h1:oP1+dOh+Gp57PkvdCyMfbHtrHaxfl3w4kR3KBBbuqQE=
github.com/zencoder/go-dash/v3 v3.0.2/go.mod h1:30R5bKy1aUYY45yesjtZ9l8trNc2TwNqbS17WVQmCzk=
go.etcd.io/etcd/api/v3 v3.5.1/go.mod h1:cbVKeC6lCfl7j/8jBhAK6aIYO9XOjdptoxU/nLQcPvs=
go.etcd.io/etcd/api/v3 v3.5.4/go.mod h1:5GB2vv4A4AOn3yk7MftYGHkUfGtDHnEraIjym4dYz5A=
go.etcd.io/etcd/client/pkg/v3 v3.5.1/go.mod h1:IJHfcCEKxYu1Os13ZdwCwIUTUVGYTSAM3YSwc9/Ac1g=
go.etcd.io/etcd/client/pkg/v3 v3.5.4/go.mod h1:IJHfcCEKxYu1Os13ZdwCwIUTUVGYTSAM3YSwc9/Ac1g=
go.etcd.io/etcd/client/v2 v2.305.1/go.mod h1:pMEacxZW7o8pg4CrFE7pquyCJJzZvkvdD2RibOCCCGs=
go.etcd.io/etcd/client/v3 v3.5.4/go.mod h1:ZaRkVgBZC+L+dLCjTcF1hRXpgZXQPOvnA/Ak/gq3kiY=
go.opencensus.io v0.21.0/go.mod h1:mSImk1erAIZhrmZN+AvHh14ztQfjbGwt4TtuofqLduU=
go.opencensus.io v0.22.0/go.mod h1:+kGneAE2xo2IficOXnaByMWTGM9T73dGwxeWcUqIpI8=
go.opencensus.io v0.22.2/go.mod h1:yxeiOL68Rb0Xd1ddK5vPZ/oVn4vY4Ynel7k9FzqtOIw=
@@ -701,6 +644,8 @@ go.uber.org/atomic v1.11.0 h1:ZvwS0R+56ePWxUNi+Atn9dWONBPp/AUETXlHW0DxSjE=
go.uber.org/atomic v1.11.0/go.mod h1:LUxbIzbOniOlMKjJjyPfpl4v+PKK2cNJn91OQbhoJI0=
go.uber.org/multierr v1.6.0/go.mod h1:cdWPpRnG4AhwMwsgIHip0KRBQjJy5kYEpYjJxpXp9iU=
go.uber.org/zap v1.17.0/go.mod h1:MXVU+bhUf/A7Xi2HNOnopQOrmycQ5Ih87HtOu4q5SSo=
go.yaml.in/yaml/v3 v3.0.3 h1:bXOww4E/J3f66rav3pX3m8w6jDE4knZjGOw8b5Y6iNE=
go.yaml.in/yaml/v3 v3.0.3/go.mod h1:tBHosrYAkRZjRAOREWbDnBXUf08JOwYq++0QNwQiWzI=
golang.org/x/crypto v0.0.0-20180904163835-0709b304e793/go.mod h1:6SG95UA2DQfeDnfUPMdvaQW0Q7yPrPDi9nlGo2tz2b4=
golang.org/x/crypto v0.0.0-20181029021203-45a5f77698d3/go.mod h1:6SG95UA2DQfeDnfUPMdvaQW0Q7yPrPDi9nlGo2tz2b4=
golang.org/x/crypto v0.0.0-20190308221718-c2843e01d9a2/go.mod h1:djNgcEr1/C05ACkg1iLfiJU5Ep61QUkGW8qpdssI0+w=
@@ -718,8 +663,12 @@ golang.org/x/crypto v0.0.0-20211108221036-ceb1ce70b4fa/go.mod h1:GvvjBRRGRdwPK5y
golang.org/x/crypto v0.0.0-20211215165025-cf75a172585e/go.mod h1:P+XmwS30IXTQdn5tA2iutPOUgjI07+tq3H3K9MVA1s8=
golang.org/x/crypto v0.0.0-20220112180741-5e0467b6c7ce/go.mod h1:IxCIyHEi3zRg3s0A5j5BB6A9Jmi73HwBIUl50j+osU4=
golang.org/x/crypto v0.0.0-20220722155217-630584e8d5aa/go.mod h1:IxCIyHEi3zRg3s0A5j5BB6A9Jmi73HwBIUl50j+osU4=
golang.org/x/crypto v0.38.0 h1:jt+WWG8IZlBnVbomuhg2Mdq0+BBQaHbtqHEFEigjUV8=
golang.org/x/crypto v0.38.0/go.mod h1:MvrbAqul58NNYPKnOra203SB9vpuZW0e+RRZV+Ggqjw=
golang.org/x/crypto v0.13.0/go.mod h1:y6Z2r+Rw4iayiXXAIxJIDAJ1zMW4yaTpebo8fPOliYc=
golang.org/x/crypto v0.19.0/go.mod h1:Iy9bg/ha4yyC70EfRS8jz+B6ybOBKMaSxLj6P6oBDfU=
golang.org/x/crypto v0.23.0/go.mod h1:CKFgDieR+mRhux2Lsu27y0fO304Db0wZe70UKqHu0v8=
golang.org/x/crypto v0.31.0/go.mod h1:kDsLvtWBEx7MV9tJOj9bnXsPbxwJQ6csT/x4KIN4Ssk=
golang.org/x/crypto v0.45.0 h1:jMBrvKuj23MTlT0bQEOBcAE0mjg8mK9RXFhRH6nyF3Q=
golang.org/x/crypto v0.45.0/go.mod h1:XTGrrkGJve7CYK7J8PEww4aY7gM3qMCElcJQ8n8JdX4=
golang.org/x/exp v0.0.0-20190121172915-509febef88a4/go.mod h1:CJ0aWSM057203Lf6IL+f9T1iT9GByDxfZKAQTCR3kQA=
golang.org/x/exp v0.0.0-20190306152737-a1d7652674e8/go.mod h1:CJ0aWSM057203Lf6IL+f9T1iT9GByDxfZKAQTCR3kQA=
golang.org/x/exp v0.0.0-20190510132918-efd6b22b2522/go.mod h1:ZjyILWgesfNpC6sMxTJOJm9Kp84zZh5NQWvqDGG3Qr8=
@ -761,8 +710,12 @@ golang.org/x/mod v0.4.2/go.mod h1:s0Qsj1ACt9ePp/hMypM3fl4fZqREWJwdYDEqhRiZZUA=
golang.org/x/mod v0.5.0/go.mod h1:5OXOZSfqPIIbmVBIIKWRFfZjPR0E5r58TLhUjH0a2Ro=
golang.org/x/mod v0.5.1/go.mod h1:5OXOZSfqPIIbmVBIIKWRFfZjPR0E5r58TLhUjH0a2Ro=
golang.org/x/mod v0.6.0-dev.0.20220419223038-86c51ed26bb4/go.mod h1:jJ57K6gSWd91VN4djpZkiMVwK6gcyfeH4XE8wZrZaV4=
golang.org/x/mod v0.24.0 h1:ZfthKaKaT4NrhGVZHO1/WDTwGES4De8KtWO0SIbNJMU=
golang.org/x/mod v0.24.0/go.mod h1:IXM97Txy2VM4PJ3gI61r1YEk/gAj6zAHN3AdZt6S9Ww=
golang.org/x/mod v0.8.0/go.mod h1:iBbtSCu2XBx23ZKBPSOrRkjjQPZFPuis4dIYUhu/chs=
golang.org/x/mod v0.12.0/go.mod h1:iBbtSCu2XBx23ZKBPSOrRkjjQPZFPuis4dIYUhu/chs=
golang.org/x/mod v0.15.0/go.mod h1:hTbmBsO62+eylJbnUtE2MGJUyE7QWk4xUqPFrRgJ+7c=
golang.org/x/mod v0.17.0/go.mod h1:hTbmBsO62+eylJbnUtE2MGJUyE7QWk4xUqPFrRgJ+7c=
golang.org/x/mod v0.29.0 h1:HV8lRxZC4l2cr3Zq1LvtOsi/ThTgWnUk/y64QSs8GwA=
golang.org/x/mod v0.29.0/go.mod h1:NyhrlYXJ2H4eJiRy/WDBO6HMqZQ6q9nk4JzS3NuCK+w=
golang.org/x/net v0.0.0-20180724234803-3673e40ba225/go.mod h1:mL1N/T3taQHkDXs73rZJwtUhF3w3ftmwwsq0BUmARs4=
golang.org/x/net v0.0.0-20180826012351-8a410e7b638d/go.mod h1:mL1N/T3taQHkDXs73rZJwtUhF3w3ftmwwsq0BUmARs4=
golang.org/x/net v0.0.0-20181023162649-9b4f9f5ad519/go.mod h1:mL1N/T3taQHkDXs73rZJwtUhF3w3ftmwwsq0BUmARs4=
@ -811,9 +764,14 @@ golang.org/x/net v0.0.0-20210813160813-60bc85c4be6d/go.mod h1:9nx3DQGgdP8bBQD5qx
golang.org/x/net v0.0.0-20211015210444-4f30a5c0130f/go.mod h1:9nx3DQGgdP8bBQD5qxJ1jj9UTztislL4KSBs9R2vV5Y=
golang.org/x/net v0.0.0-20211112202133-69e39bad7dc2/go.mod h1:9nx3DQGgdP8bBQD5qxJ1jj9UTztislL4KSBs9R2vV5Y=
golang.org/x/net v0.0.0-20220722155237-a158d28d115b/go.mod h1:XRhObCWvk6IyKnWLug+ECip1KBveYUHfp+8e9klMJ9c=
golang.org/x/net v0.5.0/go.mod h1:DivGGAXEgPSlEBzxGzZI+ZLohi+xUj054jfeKui00ws=
golang.org/x/net v0.40.0 h1:79Xs7wF06Gbdcg4kdCCIQArK11Z1hr5POQ6+fIYHNuY=
golang.org/x/net v0.40.0/go.mod h1:y0hY0exeL2Pku80/zKK7tpntoX23cqL3Oa6njdgRtds=
golang.org/x/net v0.6.0/go.mod h1:2Tu9+aMcznHK/AK1HMvgo6xiTLG5rD5rZLDS+rp2Bjs=
golang.org/x/net v0.10.0/go.mod h1:0qNGK6F8kojg2nk9dLZ2mShWaEBan6FAoqfSigmmuDg=
golang.org/x/net v0.15.0/go.mod h1:idbUs1IY1+zTqbi8yxTbhexhEEk5ur9LInksu6HrEpk=
golang.org/x/net v0.21.0/go.mod h1:bIjVDfnllIU7BJ2DNgfnXvpSvtn8VRwhlsaeUTyUS44=
golang.org/x/net v0.25.0/go.mod h1:JkAGAh7GEvH74S6FOH42FLoXpXbE/aqXSrIQjXgsiwM=
golang.org/x/net v0.33.0/go.mod h1:HXLR5J+9DxmrqMwG9qjGCxZ+zKXxBru04zlTvWlWuN4=
golang.org/x/net v0.47.0 h1:Mx+4dIFzqraBXUugkia1OOvlD6LemFo1ALMHjrXDOhY=
golang.org/x/net v0.47.0/go.mod h1:/jNxtkgq5yWUGYkaZGqo27cfGZ1c5Nen03aYrrKpVRU=
golang.org/x/oauth2 v0.0.0-20180821212333-d2e6202438be/go.mod h1:N/0e6XlmueqKjAGxoOufVs8QHGRruUQn6yWY3a++T0U=
golang.org/x/oauth2 v0.0.0-20190226205417-e64efc72b421/go.mod h1:gOpvHmFTYa4IltrdGE7lF6nIHvwfUNPOp7c8zoXwtLw=
golang.org/x/oauth2 v0.0.0-20190604053449-0f29369cfe45/go.mod h1:gOpvHmFTYa4IltrdGE7lF6nIHvwfUNPOp7c8zoXwtLw=
@ -843,18 +801,21 @@ golang.org/x/sync v0.0.0-20201020160332-67f06af15bc9/go.mod h1:RxMgew5VJxzue5/jJ
golang.org/x/sync v0.0.0-20201207232520-09787c993a3a/go.mod h1:RxMgew5VJxzue5/jJTE5uejpjVlOe/izrB70Jof72aM=
golang.org/x/sync v0.0.0-20210220032951-036812b2e83c/go.mod h1:RxMgew5VJxzue5/jJTE5uejpjVlOe/izrB70Jof72aM=
golang.org/x/sync v0.0.0-20220722155255-886fb9371eb4/go.mod h1:RxMgew5VJxzue5/jJTE5uejpjVlOe/izrB70Jof72aM=
golang.org/x/sync v0.14.0 h1:woo0S4Yywslg6hp4eUFjTVOyKt0RookbpAHG4c1HmhQ=
golang.org/x/sync v0.14.0/go.mod h1:1dzgHSNfp02xaA81J2MS99Qcpr2w7fw1gpm99rleRqA=
golang.org/x/sync v0.1.0/go.mod h1:RxMgew5VJxzue5/jJTE5uejpjVlOe/izrB70Jof72aM=
golang.org/x/sync v0.3.0/go.mod h1:FU7BRWz2tNW+3quACPkgCx/L+uEAv1htQ0V83Z9Rj+Y=
golang.org/x/sync v0.6.0/go.mod h1:Czt+wKu1gCyEFDUtn0jG5QVvpJ6rzVqr5aXyt9drQfk=
golang.org/x/sync v0.7.0/go.mod h1:Czt+wKu1gCyEFDUtn0jG5QVvpJ6rzVqr5aXyt9drQfk=
golang.org/x/sync v0.10.0/go.mod h1:Czt+wKu1gCyEFDUtn0jG5QVvpJ6rzVqr5aXyt9drQfk=
golang.org/x/sync v0.18.0 h1:kr88TuHDroi+UVf+0hZnirlk8o8T+4MrK6mr60WkH/I=
golang.org/x/sync v0.18.0/go.mod h1:9KTHXmSnoGruLpwFjVSX0lNNA75CykiMECbovNTZqGI=
golang.org/x/sys v0.0.0-20180823144017-11551d06cbcc/go.mod h1:STP8DvDyc/dI5b8T5hshtkjS+E42TnysNCUPdjciGhY=
golang.org/x/sys v0.0.0-20180830151530-49385e6e1522/go.mod h1:STP8DvDyc/dI5b8T5hshtkjS+E42TnysNCUPdjciGhY=
golang.org/x/sys v0.0.0-20180905080454-ebe1bf3edb33/go.mod h1:STP8DvDyc/dI5b8T5hshtkjS+E42TnysNCUPdjciGhY=
golang.org/x/sys v0.0.0-20181026203630-95b1ffbd15a5/go.mod h1:STP8DvDyc/dI5b8T5hshtkjS+E42TnysNCUPdjciGhY=
golang.org/x/sys v0.0.0-20181116152217-5ac8a444bdc5/go.mod h1:STP8DvDyc/dI5b8T5hshtkjS+E42TnysNCUPdjciGhY=
golang.org/x/sys v0.0.0-20190129075346-302c3dd5f1cc/go.mod h1:STP8DvDyc/dI5b8T5hshtkjS+E42TnysNCUPdjciGhY=
golang.org/x/sys v0.0.0-20190215142949-d0b11bdaac8a/go.mod h1:STP8DvDyc/dI5b8T5hshtkjS+E42TnysNCUPdjciGhY=
golang.org/x/sys v0.0.0-20190222072716-a9d3bda3a223/go.mod h1:STP8DvDyc/dI5b8T5hshtkjS+E42TnysNCUPdjciGhY=
golang.org/x/sys v0.0.0-20190312061237-fead79001313/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
golang.org/x/sys v0.0.0-20190403152447-81d4e9dc473e/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
golang.org/x/sys v0.0.0-20190412213103-97732733099d/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
golang.org/x/sys v0.0.0-20190415145633-3fd5a3612ccd/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
golang.org/x/sys v0.0.0-20190422165155-953cdadca894/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
@ -866,12 +827,10 @@ golang.org/x/sys v0.0.0-20190726091711-fc99dfbffb4e/go.mod h1:h1NjWce9XRLGQEsW7w
golang.org/x/sys v0.0.0-20190922100055-0a153f010e69/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
golang.org/x/sys v0.0.0-20190924154521-2837fb4f24fe/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
golang.org/x/sys v0.0.0-20191001151750-bb3f8db39f24/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
golang.org/x/sys v0.0.0-20191005200804-aed5e4c7ecf9/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
golang.org/x/sys v0.0.0-20191008105621-543471e840be/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
golang.org/x/sys v0.0.0-20191026070338-33540a1f6037/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
golang.org/x/sys v0.0.0-20191204072324-ce4227a45e2e/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
golang.org/x/sys v0.0.0-20191228213918-04cbcbbfeed8/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
golang.org/x/sys v0.0.0-20200106162015-b016eb3dc98e/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
golang.org/x/sys v0.0.0-20200113162924-86b910548bc1/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
golang.org/x/sys v0.0.0-20200116001909-b77594299b42/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
golang.org/x/sys v0.0.0-20200122134326-e047566fdf82/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
@ -886,8 +845,6 @@ golang.org/x/sys v0.0.0-20200501052902-10377860bb8e/go.mod h1:h1NjWce9XRLGQEsW7w
golang.org/x/sys v0.0.0-20200511232937-7e40ca221e25/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
golang.org/x/sys v0.0.0-20200515095857-1151b9dac4a9/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
golang.org/x/sys v0.0.0-20200523222454-059865788121/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
golang.org/x/sys v0.0.0-20200615200032-f1bc736245b1/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
golang.org/x/sys v0.0.0-20200625212154-ddb9806d33ae/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
golang.org/x/sys v0.0.0-20200803210538-64077c9b5642/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
golang.org/x/sys v0.0.0-20200905004654-be1d3432aa8f/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
golang.org/x/sys v0.0.0-20200930185726-fdedc70b468f/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
@ -895,7 +852,6 @@ golang.org/x/sys v0.0.0-20201119102817-f84b799fce68/go.mod h1:h1NjWce9XRLGQEsW7w
golang.org/x/sys v0.0.0-20201201145000-ef89a241ccb3/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
golang.org/x/sys v0.0.0-20210104204734-6f8348627aad/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
golang.org/x/sys v0.0.0-20210119212857-b64e53b001e4/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
golang.org/x/sys v0.0.0-20210124154548-22da62e12c0c/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
golang.org/x/sys v0.0.0-20210220050731-9a76102bfb43/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
golang.org/x/sys v0.0.0-20210225134936-a50acf3fe073/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
golang.org/x/sys v0.0.0-20210303074136-134d130e1a04/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
@ -908,7 +864,6 @@ golang.org/x/sys v0.0.0-20210423082822-04245dca01da/go.mod h1:h1NjWce9XRLGQEsW7w
golang.org/x/sys v0.0.0-20210423185535-09eb48e85fd7/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
golang.org/x/sys v0.0.0-20210510120138-977fb7262007/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
golang.org/x/sys v0.0.0-20210514084401-e8d321eab015/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
golang.org/x/sys v0.0.0-20210603081109-ebe580a85c40/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
golang.org/x/sys v0.0.0-20210603125802-9665404d3644/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
golang.org/x/sys v0.0.0-20210615035016-665e8c7367d1/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
golang.org/x/sys v0.0.0-20210616045830-e2b7044e8c71/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
@ -931,21 +886,30 @@ golang.org/x/sys v0.0.0-20220520151302-bc2c85ada10a/go.mod h1:oPkhp1MJrh7nUepCBc
golang.org/x/sys v0.0.0-20220715151400-c0bba94af5f8/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
golang.org/x/sys v0.0.0-20220722155257-8c9f86f7a55f/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
golang.org/x/sys v0.0.0-20220811171246-fbc7d0a398ab/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
golang.org/x/sys v0.0.0-20220908164124-27713097b956/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
golang.org/x/sys v0.4.0/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
golang.org/x/sys v0.5.0/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
golang.org/x/sys v0.6.0/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
golang.org/x/sys v0.8.0/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
golang.org/x/sys v0.10.0/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
golang.org/x/sys v0.33.0 h1:q3i8TbbEz+JRD9ywIRlyRAQbM0qF7hu24q3teo2hbuw=
golang.org/x/sys v0.33.0/go.mod h1:BJP2sWEmIv4KK5OTEluFJCKSidICx8ciO85XgH3Ak8k=
golang.org/x/sys v0.12.0/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
golang.org/x/sys v0.17.0/go.mod h1:/VUhepiaJMQUp4+oa/7Zr1D23ma6VTLIYjOOTFZPUcA=
golang.org/x/sys v0.20.0/go.mod h1:/VUhepiaJMQUp4+oa/7Zr1D23ma6VTLIYjOOTFZPUcA=
golang.org/x/sys v0.28.0/go.mod h1:/VUhepiaJMQUp4+oa/7Zr1D23ma6VTLIYjOOTFZPUcA=
golang.org/x/sys v0.38.0 h1:3yZWxaJjBmCWXqhN1qh02AkOnCQ1poK6oF+a7xWL6Gc=
golang.org/x/sys v0.38.0/go.mod h1:OgkHotnGiDImocRcuBABYBEXf8A9a87e/uXjp9XT3ks=
golang.org/x/telemetry v0.0.0-20240228155512-f48c80bd79b2/go.mod h1:TeRTkGYfJXctD9OcfyVLyj2J3IxLnKwHJR8f4D8a3YE=
golang.org/x/term v0.0.0-20201126162022-7de9c90e9dd1/go.mod h1:bj7SfCRtBDWHUb9snDiAeCFNEtKQo2Wmx5Cou7ajbmo=
golang.org/x/term v0.0.0-20210927222741-03fcf44c2211/go.mod h1:jbD1KX2456YbFQfuXm/mYQcufACuNUgVhRMnK/tPxf8=
golang.org/x/term v0.4.0/go.mod h1:9P2UbLfCdcvo3p/nzKvsmas4TnlujnuoV9hGgYzW1lQ=
golang.org/x/term v0.32.0 h1:DR4lr0TjUs3epypdhTOkMmuF5CDFJ/8pOnbzMZPQ7bg=
golang.org/x/term v0.32.0/go.mod h1:uZG1FhGx848Sqfsq4/DlJr3xGGsYMu/L5GW4abiaEPQ=
golang.org/x/term v0.5.0/go.mod h1:jMB1sMXY+tzblOD4FWmEbocvup2/aLOaQEp7JmGp78k=
golang.org/x/term v0.8.0/go.mod h1:xPskH00ivmX89bAKVGSKKtLOWNx2+17Eiy94tnKShWo=
golang.org/x/term v0.12.0/go.mod h1:owVbMEjm3cBLCHdkQu9b1opXd4ETQWc3BhuQGKgXgvU=
golang.org/x/term v0.17.0/go.mod h1:lLRBjIVuehSbZlaOtGMbcMncT+aqLLLmKrsjNrUguwk=
golang.org/x/term v0.20.0/go.mod h1:8UkIAJTvZgivsXaD6/pH6U9ecQzZ45awqEOzuCvwpFY=
golang.org/x/term v0.27.0/go.mod h1:iMsnZpn0cago0GOrHO2+Y7u7JPn5AylBrcoWkElMTSM=
golang.org/x/term v0.37.0 h1:8EGAD0qCmHYZg6J17DvsMy9/wJ7/D/4pV/wfnld5lTU=
golang.org/x/term v0.37.0/go.mod h1:5pB4lxRNYYVZuTLmy8oR2BH8dflOR+IbTYFD8fi3254=
golang.org/x/text v0.0.0-20170915032832-14c0d48ead0c/go.mod h1:NqM8EUOU14njkJ3fqMW+pc6Ldnwhi/IjpwHt7yyuwOQ=
golang.org/x/text v0.3.0/go.mod h1:NqM8EUOU14njkJ3fqMW+pc6Ldnwhi/IjpwHt7yyuwOQ=
golang.org/x/text v0.3.1-0.20180807135948-17ff2d5776d2/go.mod h1:NqM8EUOU14njkJ3fqMW+pc6Ldnwhi/IjpwHt7yyuwOQ=
golang.org/x/text v0.3.1-0.20181227161524-e6919f6577db/go.mod h1:bEr9sfX3Q8Zfm5fL9x+3itogRgK3+ptLWKqgva+5dAk=
golang.org/x/text v0.3.2/go.mod h1:bEr9sfX3Q8Zfm5fL9x+3itogRgK3+ptLWKqgva+5dAk=
golang.org/x/text v0.3.3/go.mod h1:5Zoc/QRtKVWzQhOtBMvqHzDpF6irO9z98xDceosuGiQ=
golang.org/x/text v0.3.4/go.mod h1:5Zoc/QRtKVWzQhOtBMvqHzDpF6irO9z98xDceosuGiQ=
@ -953,9 +917,14 @@ golang.org/x/text v0.3.5/go.mod h1:5Zoc/QRtKVWzQhOtBMvqHzDpF6irO9z98xDceosuGiQ=
golang.org/x/text v0.3.6/go.mod h1:5Zoc/QRtKVWzQhOtBMvqHzDpF6irO9z98xDceosuGiQ=
golang.org/x/text v0.3.7/go.mod h1:u+2+/6zg+i71rQMx5EYifcz6MCKuco9NR6JIITiCfzQ=
golang.org/x/text v0.3.8/go.mod h1:E6s5w1FMmriuDzIBO73fBruAKo1PCIq6d2Q6DHfQ8WQ=
golang.org/x/text v0.6.0/go.mod h1:mrYo+phRRbMaCq/xk9113O4dZlRixOauAjOtrjsXDZ8=
golang.org/x/text v0.25.0 h1:qVyWApTSYLk/drJRO5mDlNYskwQznZmkpV2c8q9zls4=
golang.org/x/text v0.25.0/go.mod h1:WEdwpYrmk1qmdHvhkSTNPm3app7v4rsT8F2UD6+VHIA=
golang.org/x/text v0.7.0/go.mod h1:mrYo+phRRbMaCq/xk9113O4dZlRixOauAjOtrjsXDZ8=
golang.org/x/text v0.9.0/go.mod h1:e1OnstbJyHTd6l/uOt8jFFHp6TRDWZR/bV3emEE/zU8=
golang.org/x/text v0.13.0/go.mod h1:TvPlkZtksWOMsz7fbANvkp4WM8x/WCo/om8BMLbz+aE=
golang.org/x/text v0.14.0/go.mod h1:18ZOQIKpY8NJVqYksKHtTdi31H5itFRjB5/qKTNYzSU=
golang.org/x/text v0.15.0/go.mod h1:18ZOQIKpY8NJVqYksKHtTdi31H5itFRjB5/qKTNYzSU=
golang.org/x/text v0.21.0/go.mod h1:4IBbMaMmOPCJ8SecivzSH54+73PCFmPWxNTLm+vZkEQ=
golang.org/x/text v0.31.0 h1:aC8ghyu4JhP8VojJ2lEHBnochRno1sgL6nEi9WGFGMM=
golang.org/x/text v0.31.0/go.mod h1:tKRAlv61yKIjGGHX/4tP1LTbc13YSec1pxVEWXzfoeM=
golang.org/x/time v0.0.0-20181108054448-85acf8d2951c/go.mod h1:tRJNPiyCQ0inRvYxbN9jk5I+vvW/OXSQhTDSoE431IQ=
golang.org/x/time v0.0.0-20190308202827-9d24e82272b4/go.mod h1:tRJNPiyCQ0inRvYxbN9jk5I+vvW/OXSQhTDSoE431IQ=
golang.org/x/time v0.0.0-20191024005414-555d28b269f0/go.mod h1:tRJNPiyCQ0inRvYxbN9jk5I+vvW/OXSQhTDSoE431IQ=
@ -1020,8 +989,11 @@ golang.org/x/tools v0.1.5/go.mod h1:o0xws9oXOQQZyjljx8fwUC0k7L1pTE6eaCbjGeHmOkk=
golang.org/x/tools v0.1.7/go.mod h1:LGqMHiF4EqQNHR1JncWGqT5BVaXmza+X+BDGol+dOxo=
golang.org/x/tools v0.1.8/go.mod h1:nABZi5QlRsZVlzPpHl034qft6wpY4eDcsTt5AaioBiU=
golang.org/x/tools v0.1.12/go.mod h1:hNGJHUnrk76NpqgfD5Aqm5Crs+Hm0VOH/i9J2+nxYbc=
golang.org/x/tools v0.33.0 h1:4qz2S3zmRxbGIhDIAgjxvFutSvH5EfnsYrRBj0UI0bc=
golang.org/x/tools v0.33.0/go.mod h1:CIJMaWEY88juyUfo7UbgPqbC8rU2OqfAV1h2Qp0oMYI=
golang.org/x/tools v0.6.0/go.mod h1:Xwgl3UAJ/d3gWutnCtw505GrjyAbvKui8lOU390QaIU=
golang.org/x/tools v0.13.0/go.mod h1:HvlwmtVNQAhOuCjW7xxvovg8wbNq7LwfXh/k7wXUl58=
golang.org/x/tools v0.21.1-0.20240508182429-e35e4ccd0d2d/go.mod h1:aiJjzUbINMkxbQROHiO6hDPo2LHcIPhhQsa9DLh0yGk=
golang.org/x/tools v0.38.0 h1:Hx2Xv8hISq8Lm16jvBZ2VQf+RLmbd7wVUsALibYI/IQ=
golang.org/x/tools v0.38.0/go.mod h1:yEsQ/d/YK8cjh0L6rZlY8tgtlKiBNTL14pGDJPJpYQs=
golang.org/x/xerrors v0.0.0-20190717185122-a985d3407aa7/go.mod h1:I/5z698sn9Ka8TeJc9MKroUUfqBBauWjQqLJ2OPfmY0=
golang.org/x/xerrors v0.0.0-20191011141410-1b5146add898/go.mod h1:I/5z698sn9Ka8TeJc9MKroUUfqBBauWjQqLJ2OPfmY0=
golang.org/x/xerrors v0.0.0-20191204190536-9bdfabe68543/go.mod h1:I/5z698sn9Ka8TeJc9MKroUUfqBBauWjQqLJ2OPfmY0=
@ -1068,7 +1040,6 @@ google.golang.org/appengine v1.6.6/go.mod h1:8WjMMxjGQR8xUklV/ARdw2HLXBOI7O7uCID
google.golang.org/appengine v1.6.7/go.mod h1:8WjMMxjGQR8xUklV/ARdw2HLXBOI7O7uCIDZVag1xfc=
google.golang.org/genproto v0.0.0-20180817151627-c66870c02cf8/go.mod h1:JiN7NxoALGmiZfu7CAH4rXhgtRTLTxftemlI0sWmxmc=
google.golang.org/genproto v0.0.0-20190307195333-5fe7a883aa19/go.mod h1:VzzqZJRnGkLBvHegQrXjBqPurQTc5/KpmUdxsrq26oE=
google.golang.org/genproto v0.0.0-20190404172233-64821d5d2107/go.mod h1:VzzqZJRnGkLBvHegQrXjBqPurQTc5/KpmUdxsrq26oE=
google.golang.org/genproto v0.0.0-20190418145605-e7d98fc518a7/go.mod h1:VzzqZJRnGkLBvHegQrXjBqPurQTc5/KpmUdxsrq26oE=
google.golang.org/genproto v0.0.0-20190425155659-357c62f0e4bb/go.mod h1:VzzqZJRnGkLBvHegQrXjBqPurQTc5/KpmUdxsrq26oE=
google.golang.org/genproto v0.0.0-20190502173448-54afdca5d873/go.mod h1:VzzqZJRnGkLBvHegQrXjBqPurQTc5/KpmUdxsrq26oE=
@ -1132,11 +1103,9 @@ google.golang.org/genproto v0.0.0-20211129164237-f09f9a12af12/go.mod h1:5CzLGKJ6
google.golang.org/genproto v0.0.0-20211203200212-54befc351ae9/go.mod h1:5CzLGKJ67TSI2B9POpiiyGha0AjJvZIUgRMt1dSmuhc=
google.golang.org/genproto v0.0.0-20211206160659-862468c7d6e0/go.mod h1:5CzLGKJ67TSI2B9POpiiyGha0AjJvZIUgRMt1dSmuhc=
google.golang.org/genproto v0.0.0-20211208223120-3a66f561d7aa/go.mod h1:5CzLGKJ67TSI2B9POpiiyGha0AjJvZIUgRMt1dSmuhc=
google.golang.org/grpc v1.14.0/go.mod h1:yo6s7OP7yaDglbqo1J04qKzAhqBH6lvTonzMVmEdcZw=
google.golang.org/grpc v1.19.0/go.mod h1:mqu4LbDTu4XGKhr4mRzUsmM4RtVoemTSY81AxZiDr8c=
google.golang.org/grpc v1.20.1/go.mod h1:10oTOabMzJvdu6/UiuZezV6QK5dSlG84ov/aaiqXj38=
google.golang.org/grpc v1.21.1/go.mod h1:oYelfM1adQP15Ek0mdvEgi9Df8B9CZIaU1084ijfRaM=
google.golang.org/grpc v1.22.0/go.mod h1:Y5yQAOtifL1yxbo5wqy6BxZv8vAUGQwXBOALyacEbxg=
google.golang.org/grpc v1.23.0/go.mod h1:Y5yQAOtifL1yxbo5wqy6BxZv8vAUGQwXBOALyacEbxg=
google.golang.org/grpc v1.25.1/go.mod h1:c3i+UQWmh7LiEpx4sFZnkU36qjEYZ0imhYfXVyQciAY=
google.golang.org/grpc v1.26.0/go.mod h1:qbnxyOmOxrQa7FizSgH+ReBfzJrCY1pSN7KXBS8abTk=
@ -1177,7 +1146,6 @@ google.golang.org/protobuf v1.26.0-rc.1/go.mod h1:jlhhOSvTdKEhbULTjvd4ARK9grFBp0
google.golang.org/protobuf v1.26.0/go.mod h1:9q0QmTI4eRPtz6boOQmLYwt+qCgq0jsYwAQnmE0givc=
google.golang.org/protobuf v1.27.1/go.mod h1:9q0QmTI4eRPtz6boOQmLYwt+qCgq0jsYwAQnmE0givc=
gopkg.in/alecthomas/kingpin.v2 v2.2.6/go.mod h1:FMv+mEhP44yOT+4EoQTLFTRgOQ1FBLkstjWtayDeSgw=
gopkg.in/asn1-ber.v1 v1.0.0-20181015200546-f715ec2f112d/go.mod h1:cuepJuh7vyXfUyUwEgHQXw849cJrilpS5NeIjOWESAw=
gopkg.in/check.v1 v0.0.0-20161208181325-20d25e280405/go.mod h1:Co6ibVJAznAaIkqp8huTwlJQCZ016jof/cbN4VW5Yz0=
gopkg.in/check.v1 v1.0.0-20180628173108-788fd7840127/go.mod h1:Co6ibVJAznAaIkqp8huTwlJQCZ016jof/cbN4VW5Yz0=
gopkg.in/check.v1 v1.0.0-20190902080502-41f04d3bba15/go.mod h1:Co6ibVJAznAaIkqp8huTwlJQCZ016jof/cbN4VW5Yz0=
@ -1190,14 +1158,14 @@ gopkg.in/ini.v1 v1.66.2/go.mod h1:pNLf8WUiyNEtQjuu5G5vTm06TEv9tsIgeAvK8hOrP4k=
gopkg.in/ini.v1 v1.66.3/go.mod h1:pNLf8WUiyNEtQjuu5G5vTm06TEv9tsIgeAvK8hOrP4k=
gopkg.in/ini.v1 v1.67.0 h1:Dgnx+6+nfE+IfzjUEISNeydPJh9AXNNsWbGP9KzCsOA=
gopkg.in/ini.v1 v1.67.0/go.mod h1:pNLf8WUiyNEtQjuu5G5vTm06TEv9tsIgeAvK8hOrP4k=
gopkg.in/square/go-jose.v2 v2.3.1/go.mod h1:M9dMgbHiYLoDGQrXy7OpJDJWiKiU//h+vD76mk0e1AI=
gopkg.in/natefinch/lumberjack.v2 v2.2.1 h1:bBRl1b0OH9s/DuPhuXpNl+VtCaJXFZ5/uEFST95x9zc=
gopkg.in/natefinch/lumberjack.v2 v2.2.1/go.mod h1:YD8tP3GAjkrDg1eZH7EGmyESg/lsYskCTPBJVb9jqSc=
gopkg.in/yaml.v2 v2.2.1/go.mod h1:hI93XBmqTisBFMUTm0b8Fm+jr3Dg1NNxqwp+5A1VGuI=
gopkg.in/yaml.v2 v2.2.2/go.mod h1:hI93XBmqTisBFMUTm0b8Fm+jr3Dg1NNxqwp+5A1VGuI=
gopkg.in/yaml.v2 v2.2.3/go.mod h1:hI93XBmqTisBFMUTm0b8Fm+jr3Dg1NNxqwp+5A1VGuI=
gopkg.in/yaml.v2 v2.2.4/go.mod h1:hI93XBmqTisBFMUTm0b8Fm+jr3Dg1NNxqwp+5A1VGuI=
gopkg.in/yaml.v2 v2.2.5/go.mod h1:hI93XBmqTisBFMUTm0b8Fm+jr3Dg1NNxqwp+5A1VGuI=
gopkg.in/yaml.v2 v2.2.8/go.mod h1:hI93XBmqTisBFMUTm0b8Fm+jr3Dg1NNxqwp+5A1VGuI=
gopkg.in/yaml.v2 v2.3.0/go.mod h1:hI93XBmqTisBFMUTm0b8Fm+jr3Dg1NNxqwp+5A1VGuI=
gopkg.in/yaml.v2 v2.4.0 h1:D8xgwECY7CYvx+Y2n4sBz93Jn9JRvxdiyyo8CTfuKaY=
gopkg.in/yaml.v2 v2.4.0/go.mod h1:RDklbk79AGWmwhnvt/jBztapEOGDOx6ZbXqjP6csGnQ=
gopkg.in/yaml.v3 v3.0.0-20200313102051-9f266ea9e77c/go.mod h1:K4uyk7z7BCEPqu6E+C64Yfv1cQ7kz7rIZviUmN+EgEM=
@ -1214,4 +1182,3 @@ honnef.co/go/tools v0.0.1-2020.1.4/go.mod h1:X/FiERA/W4tHapMX5mGpAtMSVEeEUOyHaw9
rsc.io/binaryregexp v0.2.0/go.mod h1:qTv7/COck+e2FymRvadv62gMdZztPaShugOCi3I+8D8=
rsc.io/quote/v3 v3.1.0/go.mod h1:yEA65RcK8LyAZtP9Kv3t0HmxON59tX3rD+tICJqUlj0=
rsc.io/sampler v1.3.0/go.mod h1:T1hPZKmBbMNahiBKFy5HrXp6adAjACjK9JXDnKaTXpA=
sigs.k8s.io/yaml v1.2.0/go.mod h1:yfXDCHCao9+ENCvLSE62v9VSji2MKu5jeNfTrofGhJc=


@ -35,6 +35,8 @@ models:
model: github.com/stashapp/stash/internal/api.BoolMap
PluginConfigMap:
model: github.com/stashapp/stash/internal/api.PluginConfigMap
File:
model: github.com/stashapp/stash/internal/api.File
VideoFile:
fields:
# override float fields - #1572
@ -138,4 +140,8 @@ models:
fields:
plugins:
resolver: true
Performer:
fields:
career_length:
resolver: true


@ -6,6 +6,26 @@ type Query {
findDefaultFilter(mode: FilterMode!): SavedFilter
@deprecated(reason: "default filter now stored in UI config")
"Find a file by its id or path"
findFile(id: ID, path: String): BaseFile!
"Queries for Files"
findFiles(
file_filter: FileFilterType
filter: FindFilterType
ids: [ID!]
): FindFilesResultType!
"Find a folder by its id or path"
findFolder(id: ID, path: String): Folder!
"Queries for Folders"
findFolders(
folder_filter: FolderFilterType
filter: FindFilterType
ids: [ID!]
): FindFoldersResultType!
"Find a scene by ID or Checksum"
findScene(id: ID, checksum: String): Scene
findSceneByHash(input: SceneHashInput!): Scene
@ -145,6 +165,12 @@ type Query {
input: ScrapeSingleStudioInput!
): [ScrapedStudio!]!
"Scrape for a single tag"
scrapeSingleTag(
source: ScraperSourceInput!
input: ScrapeSingleTagInput!
): [ScrapedTag!]!
"Scrape for a single performer"
scrapeSinglePerformer(
source: ScraperSourceInput!
@ -308,6 +334,7 @@ type Mutation {
sceneMarkerCreate(input: SceneMarkerCreateInput!): SceneMarker
sceneMarkerUpdate(input: SceneMarkerUpdateInput!): SceneMarker
bulkSceneMarkerUpdate(input: BulkSceneMarkerUpdateInput!): [SceneMarker!]
sceneMarkerDestroy(id: ID!): Boolean!
sceneMarkersDestroy(ids: [ID!]!): Boolean!
@ -346,11 +373,13 @@ type Mutation {
performerDestroy(input: PerformerDestroyInput!): Boolean!
performersDestroy(ids: [ID!]!): Boolean!
bulkPerformerUpdate(input: BulkPerformerUpdateInput!): [Performer!]
performerMerge(input: PerformerMergeInput!): Performer!
studioCreate(input: StudioCreateInput!): Studio
studioUpdate(input: StudioUpdateInput!): Studio
studioDestroy(input: StudioDestroyInput!): Boolean!
studiosDestroy(ids: [ID!]!): Boolean!
bulkStudioUpdate(input: BulkStudioUpdateInput!): [Studio!]
movieCreate(input: MovieCreateInput!): Movie
@deprecated(reason: "Use groupCreate instead")
@ -393,8 +422,14 @@ type Mutation {
"""
moveFiles(input: MoveFilesInput!): Boolean!
deleteFiles(ids: [ID!]!): Boolean!
"Deletes file entries from the database without deleting the files from the filesystem"
destroyFiles(ids: [ID!]!): Boolean!
fileSetFingerprints(input: FileSetFingerprintsInput!): Boolean!
"Reveal the file in the system file manager"
revealFileInFileManager(id: ID!): Boolean!
"Reveal the folder in the system file manager"
revealFolderInFileManager(id: ID!): Boolean!
# Saved filters
saveFilter(input: SaveFilterInput!): SavedFilter!
@ -548,6 +583,8 @@ type Mutation {
stashBoxBatchPerformerTag(input: StashBoxBatchTagInput!): String!
"Run batch studio tag task. Returns the job ID."
stashBoxBatchStudioTag(input: StashBoxBatchTagInput!): String!
"Run batch tag tag task. Returns the job ID."
stashBoxBatchTagTag(input: StashBoxBatchTagInput!): String!
"Enables DLNA for an optional duration. Has no effect if DLNA is enabled by default"
enableDLNA(input: EnableDLNAInput!): Boolean!


@ -2,6 +2,8 @@ input SetupInput {
"Empty to indicate $HOME/.stash/config.yml default"
configLocation: String!
stashes: [StashConfigInput!]!
"True if SFW content mode is enabled"
sfwContentMode: Boolean
"Empty to indicate default"
databaseFile: String!
"Empty to indicate default"
@ -67,6 +69,8 @@ input ConfigGeneralInput {
databasePath: String
"Path to backup directory"
backupDirectoryPath: String
"Path to trash directory - if set, deleted files will be moved here instead of being permanently deleted"
deleteTrashPath: String
"Path to generated files"
generatedPath: String
"Path to import/export files"
@ -153,6 +157,8 @@ input ConfigGeneralInput {
logLevel: String
"Whether to log http access"
logAccess: Boolean
"Maximum log size"
logFileMaxSize: Int
"True if galleries should be created from folders with images"
createGalleriesFromFolders: Boolean
"Regex used to identify images as gallery covers"
@ -178,6 +184,18 @@ input ConfigGeneralInput {
scraperPackageSources: [PackageSourceInput!]
"Source of plugin packages"
pluginPackageSources: [PackageSourceInput!]
"Size of the longest dimension for each sprite in pixels"
spriteScreenshotSize: Int
"True if sprite generation should use the sprite interval and min/max sprites settings instead of the default"
useCustomSpriteInterval: Boolean
"Time between two different scrubber sprites in seconds - only used if useCustomSpriteInterval is true"
spriteInterval: Float
"Minimum number of sprites to be generated - only used if useCustomSpriteInterval is true"
minimumSprites: Int
"Maximum number of sprites to be generated - only used if useCustomSpriteInterval is true"
maximumSprites: Int
}
type ConfigGeneralResult {
@ -187,6 +205,8 @@ type ConfigGeneralResult {
databasePath: String!
"Path to backup directory"
backupDirectoryPath: String!
"Path to trash directory - if set, deleted files will be moved here instead of being permanently deleted"
deleteTrashPath: String!
"Path to generated files"
generatedPath: String!
"Path to import/export files"
@ -277,6 +297,18 @@ type ConfigGeneralResult {
logLevel: String!
"Whether to log http access"
logAccess: Boolean!
"Maximum log size"
logFileMaxSize: Int!
"True if sprite generation should use the sprite interval and min/max sprites settings instead of the default"
useCustomSpriteInterval: Boolean!
"Time between two different scrubber sprites in seconds - only used if useCustomSpriteInterval is true"
spriteInterval: Float!
"Minimum number of sprites to be generated - only used if useCustomSpriteInterval is true"
minimumSprites: Int!
"Maximum number of sprites to be generated - only used if useCustomSpriteInterval is true"
maximumSprites: Int!
"Size of the longest dimension for each sprite in pixels"
spriteScreenshotSize: Int!
"Array of video file extensions"
videoExtensions: [String!]!
"Array of image file extensions"
@ -309,6 +341,7 @@ input ConfigDisableDropdownCreateInput {
tag: Boolean
studio: Boolean
movie: Boolean
gallery: Boolean
}
enum ImageLightboxDisplayMode {
@ -329,6 +362,7 @@ input ConfigImageLightboxInput {
resetZoomOnNav: Boolean
scrollMode: ImageLightboxScrollMode
scrollAttemptsBeforeChange: Int
disableAnimation: Boolean
}
type ConfigImageLightboxResult {
@ -338,9 +372,13 @@ type ConfigImageLightboxResult {
resetZoomOnNav: Boolean
scrollMode: ImageLightboxScrollMode
scrollAttemptsBeforeChange: Int!
disableAnimation: Boolean
}
input ConfigInterfaceInput {
"True if SFW content mode is enabled"
sfwContentMode: Boolean
"Ordered list of items that should be shown in the menu"
menuItems: [String!]
@ -379,6 +417,9 @@ input ConfigInterfaceInput {
customLocales: String
customLocalesEnabled: Boolean
"When true, disables all customizations (plugins, CSS, JavaScript, locales) for troubleshooting"
disableCustomizations: Boolean
"Interface language"
language: String
@ -404,9 +445,13 @@ type ConfigDisableDropdownCreate {
tag: Boolean!
studio: Boolean!
movie: Boolean!
gallery: Boolean!
}
type ConfigInterfaceResult {
"True if SFW content mode is enabled"
sfwContentMode: Boolean!
"Ordered list of items that should be shown in the menu"
menuItems: [String!]
@ -449,6 +494,9 @@ type ConfigInterfaceResult {
customLocales: String
customLocalesEnabled: Boolean
"When true, disables all customizations (plugins, CSS, JavaScript, locales) for troubleshooting"
disableCustomizations: Boolean
"Interface language"
language: String

View file

@ -6,9 +6,18 @@ type Fingerprint {
type Folder {
id: ID!
path: String!
basename: String!
parent_folder_id: ID
zip_file_id: ID
parent_folder_id: ID @deprecated(reason: "Use parent_folder instead")
zip_file_id: ID @deprecated(reason: "Use zip_file instead")
parent_folder: Folder
"Returns all parent folders in order from immediate parent to top-level"
parent_folders: [Folder!]!
zip_file: BasicFile
"Returns direct sub-folders"
sub_folders: [Folder!]!
mod_time: Time!
@ -21,8 +30,32 @@ interface BaseFile {
path: String!
basename: String!
parent_folder_id: ID!
zip_file_id: ID
parent_folder_id: ID! @deprecated(reason: "Use parent_folder instead")
zip_file_id: ID @deprecated(reason: "Use zip_file instead")
parent_folder: Folder!
zip_file: BasicFile
mod_time: Time!
size: Int64!
fingerprint(type: String!): String
fingerprints: [Fingerprint!]!
created_at: Time!
updated_at: Time!
}
type BasicFile implements BaseFile {
id: ID!
path: String!
basename: String!
parent_folder_id: ID! @deprecated(reason: "Use parent_folder instead")
zip_file_id: ID @deprecated(reason: "Use zip_file instead")
parent_folder: Folder!
zip_file: BasicFile
mod_time: Time!
size: Int64!
@ -39,8 +72,11 @@ type VideoFile implements BaseFile {
path: String!
basename: String!
parent_folder_id: ID!
zip_file_id: ID
parent_folder_id: ID! @deprecated(reason: "Use parent_folder instead")
zip_file_id: ID @deprecated(reason: "Use zip_file instead")
parent_folder: Folder!
zip_file: BasicFile
mod_time: Time!
size: Int64!
@ -66,8 +102,11 @@ type ImageFile implements BaseFile {
path: String!
basename: String!
parent_folder_id: ID!
zip_file_id: ID
parent_folder_id: ID! @deprecated(reason: "Use parent_folder instead")
zip_file_id: ID @deprecated(reason: "Use zip_file instead")
parent_folder: Folder!
zip_file: BasicFile
mod_time: Time!
size: Int64!
@ -75,6 +114,7 @@ type ImageFile implements BaseFile {
fingerprint(type: String!): String
fingerprints: [Fingerprint!]!
format: String!
width: Int!
height: Int!
@ -89,8 +129,11 @@ type GalleryFile implements BaseFile {
path: String!
basename: String!
parent_folder_id: ID!
zip_file_id: ID
parent_folder_id: ID! @deprecated(reason: "Use parent_folder instead")
zip_file_id: ID @deprecated(reason: "Use zip_file instead")
parent_folder: Folder!
zip_file: BasicFile
mod_time: Time!
size: Int64!
@ -116,7 +159,7 @@ input MoveFilesInput {
input SetFingerprintsInput {
type: String!
"an null value will remove the fingerprint"
"a null value will remove the fingerprint"
value: String
}
@ -125,3 +168,22 @@ input FileSetFingerprintsInput {
"only supplied fingerprint types will be modified"
fingerprints: [SetFingerprintsInput!]!
}
type FindFilesResultType {
count: Int!
"Total megapixels of any image files"
megapixels: Float!
"Total duration in seconds of any video files"
duration: Float!
"Total file size in bytes"
size: Int!
files: [BaseFile!]!
}
type FindFoldersResultType {
count: Int!
folders: [Folder!]!
}


@ -75,22 +75,48 @@ input OrientationCriterionInput {
value: [OrientationEnum!]!
}
input PHashDuplicationCriterionInput {
duplicated: Boolean
"Currently unimplemented"
input DuplicationCriterionInput {
duplicated: Boolean @deprecated(reason: "Use phash field instead")
"Currently unimplemented. Intended for phash distance matching."
distance: Int
"Filter by phash duplication"
phash: Boolean
"Filter by URL duplication"
url: Boolean
"Filter by Stash ID duplication"
stash_id: Boolean
"Filter by title duplication"
title: Boolean
}
input FileDuplicationCriterionInput {
duplicated: Boolean @deprecated(reason: "Use phash field instead")
"Currently unimplemented. Intended for phash distance matching."
distance: Int
"Filter by phash duplication"
phash: Boolean
}
input StashIDCriterionInput {
"""
If present, this value is treated as a predicate.
That is, it will filter based on stash_ids with the matching endpoint
That is, it will filter based on stash_id with the matching endpoint
"""
endpoint: String
stash_id: String
modifier: CriterionModifier!
}
input StashIDsCriterionInput {
"""
If present, this value is treated as a predicate.
That is, it will filter based on stash_ids with the matching endpoint
"""
endpoint: String
stash_ids: [String]
modifier: CriterionModifier!
}
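The plural `StashIDsCriterionInput` supersedes the singular criterion. A sketch of migrating a filter value, with a hypothetical endpoint URL:

```graphql
# Old (deprecated) shape:
#   stash_id_endpoint: { endpoint: "...", stash_id: "abc123", modifier: EQUALS }
# New shape, assuming INCLUDES is the appropriate modifier for list matching:
{
  stash_ids_endpoint: {
    endpoint: "https://stashdb.example/graphql"
    stash_ids: ["abc123"]
    modifier: INCLUDES
  }
}
```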
input CustomFieldCriterionInput {
field: String!
value: [Any!]
@ -126,10 +152,15 @@ input PerformerFilterType {
fake_tits: StringCriterionInput
"Filter by penis length value"
penis_length: FloatCriterionInput
"Filter by ciricumcision"
"Filter by circumcision"
circumcised: CircumcisionCriterionInput
"Filter by career length"
"Deprecated: use career_start and career_end. This filter is non-functional."
career_length: StringCriterionInput
@deprecated(reason: "Use career_start and career_end")
"Filter by career start"
career_start: DateCriterionInput
"Filter by career end"
career_end: DateCriterionInput
"Filter by tattoos"
tattoos: StringCriterionInput
"Filter by piercings"
@ -146,6 +177,8 @@ input PerformerFilterType {
tag_count: IntCriterionInput
"Filter by scene count"
scene_count: IntCriterionInput
"Filter by marker count (via scene)"
marker_count: IntCriterionInput
"Filter by image count"
image_count: IntCriterionInput
"Filter by gallery count"
@ -156,6 +189,9 @@ input PerformerFilterType {
o_counter: IntCriterionInput
"Filter by StashID"
stash_id_endpoint: StashIDCriterionInput
@deprecated(reason: "use stash_ids_endpoint instead")
"Filter by StashIDs"
stash_ids_endpoint: StashIDsCriterionInput
# rating expressed as 1-100
rating100: IntCriterionInput
"Filter by url"
@ -186,6 +222,8 @@ input PerformerFilterType {
galleries_filter: GalleryFilterType
"Filter by related tags that meet this criteria"
tags_filter: TagFilterType
"Filter by related scene markers (via scene) that meet this criteria"
markers_filter: SceneMarkerFilterType
"Filter by creation time"
created_at: TimestampCriterionInput
"Filter by last update time"
@ -211,9 +249,9 @@ input SceneMarkerFilterType {
updated_at: TimestampCriterionInput
"Filter by scene date"
scene_date: DateCriterionInput
"Filter by cscene reation time"
"Filter by scene creation time"
scene_created_at: TimestampCriterionInput
"Filter by lscene ast update time"
"Filter by scene last update time"
scene_updated_at: TimestampCriterionInput
"Filter by related scenes that meet this criteria"
scene_filter: SceneFilterType
@ -248,8 +286,8 @@ input SceneFilterType {
organized: Boolean
"Filter by o-counter"
o_counter: IntCriterionInput
"Filter Scenes that have an exact phash match available"
duplicated: PHashDuplicationCriterionInput
"Filter Scenes by duplication criteria"
duplicated: DuplicationCriterionInput
"Filter by resolution"
resolution: ResolutionCriterionInput
"Filter by orientation"
@ -292,6 +330,11 @@ input SceneFilterType {
performer_count: IntCriterionInput
"Filter by StashID"
stash_id_endpoint: StashIDCriterionInput
@deprecated(reason: "use stash_ids_endpoint instead")
"Filter by StashIDs"
stash_ids_endpoint: StashIDsCriterionInput
"Filter by StashID count"
stash_id_count: IntCriterionInput
"Filter by url"
url: StringCriterionInput
"Filter by interactive"
@ -330,6 +373,10 @@ input SceneFilterType {
groups_filter: GroupFilterType
"Filter by related markers that meet this criteria"
markers_filter: SceneMarkerFilterType
"Filter by related files that meet this criteria"
files_filter: FileFilterType
custom_fields: [CustomFieldCriterionInput!]
}
input MovieFilterType {
@ -401,6 +448,8 @@ input GroupFilterType {
created_at: TimestampCriterionInput
"Filter by last update time"
updated_at: TimestampCriterionInput
"Filter by o-counter"
o_counter: IntCriterionInput
"Filter by containing groups"
containing_groups: HierarchicalMultiCriterionInput
@ -410,11 +459,16 @@ input GroupFilterType {
containing_group_count: IntCriterionInput
"Filter by number of sub-groups the group has"
sub_group_count: IntCriterionInput
"Filter by number of scenes the group has"
scene_count: IntCriterionInput
"Filter by related scenes that meet this criteria"
scenes_filter: SceneFilterType
"Filter by related studios that meet this criteria"
studios_filter: StudioFilterType
"Filter by custom fields"
custom_fields: [CustomFieldCriterionInput!]
}
input StudioFilterType {
@ -428,6 +482,9 @@ input StudioFilterType {
parents: MultiCriterionInput
"Filter by StashID"
stash_id_endpoint: StashIDCriterionInput
@deprecated(reason: "use stash_ids_endpoint instead")
"Filter by StashIDs"
stash_ids_endpoint: StashIDsCriterionInput
"Filter to only include studios with these tags"
tags: HierarchicalMultiCriterionInput
"Filter to only include studios missing this property"
@ -442,6 +499,8 @@ input StudioFilterType {
image_count: IntCriterionInput
"Filter by gallery count"
gallery_count: IntCriterionInput
"Filter by group count"
group_count: IntCriterionInput
"Filter by tag count"
tag_count: IntCriterionInput
"Filter by url"
@ -452,16 +511,22 @@ input StudioFilterType {
child_count: IntCriterionInput
"Filter by autotag ignore value"
ignore_auto_tag: Boolean
"Filter by organized"
organized: Boolean
"Filter by related scenes that meet this criteria"
scenes_filter: SceneFilterType
"Filter by related images that meet this criteria"
images_filter: ImageFilterType
"Filter by related galleries that meet this criteria"
galleries_filter: GalleryFilterType
"Filter by related groups that meet this criteria"
groups_filter: GroupFilterType
"Filter by creation time"
created_at: TimestampCriterionInput
"Filter by last update time"
updated_at: TimestampCriterionInput
custom_fields: [CustomFieldCriterionInput!]
}
input GalleryFilterType {
@ -534,6 +599,14 @@ input GalleryFilterType {
studios_filter: StudioFilterType
"Filter by related tags that meet this criteria"
tags_filter: TagFilterType
"Filter by related files that meet this criteria"
files_filter: FileFilterType
"Filter by related folders that meet this criteria"
folders_filter: FolderFilterType
"Filter by parent folder of the zip or folder the gallery is in"
parent_folder: HierarchicalMultiCriterionInput
custom_fields: [CustomFieldCriterionInput!]
}
input TagFilterType {
@ -592,24 +665,41 @@ input TagFilterType {
"Filter by number of parent tags the tag has"
parent_count: IntCriterionInput
"Filter by number f child tags the tag has"
"Filter by number of child tags the tag has"
child_count: IntCriterionInput
"Filter by autotag ignore value"
ignore_auto_tag: Boolean
"Filter by StashID"
stash_id_endpoint: StashIDCriterionInput
@deprecated(reason: "use stash_ids_endpoint instead")
  "Filter by StashIDs"
stash_ids_endpoint: StashIDsCriterionInput
"Filter by related scenes that meet this criteria"
scenes_filter: SceneFilterType
"Filter by related images that meet this criteria"
images_filter: ImageFilterType
"Filter by related galleries that meet this criteria"
galleries_filter: GalleryFilterType
"Filter by related groups that meet this criteria"
groups_filter: GroupFilterType
"Filter by related performers that meet this criteria"
performers_filter: PerformerFilterType
"Filter by related studios that meet this criteria"
studios_filter: StudioFilterType
"Filter by related scene markers that meet this criteria"
markers_filter: SceneMarkerFilterType
"Filter by creation time"
created_at: TimestampCriterionInput
"Filter by last update time"
updated_at: TimestampCriterionInput
custom_fields: [CustomFieldCriterionInput!]
}
input ImageFilterType {
@ -624,6 +714,8 @@ input ImageFilterType {
id: IntCriterionInput
"Filter by file checksum"
checksum: StringCriterionInput
"Filter by file phash distance"
phash_distance: PhashDistanceCriterionInput
"Filter by path"
path: StringCriterionInput
"Filter by file count"
@ -679,6 +771,109 @@ input ImageFilterType {
studios_filter: StudioFilterType
"Filter by related tags that meet this criteria"
tags_filter: TagFilterType
"Filter by related files that meet this criteria"
files_filter: FileFilterType
"Filter by custom fields"
custom_fields: [CustomFieldCriterionInput!]
}
input FileFilterType {
AND: FileFilterType
OR: FileFilterType
NOT: FileFilterType
path: StringCriterionInput
basename: StringCriterionInput
dir: StringCriterionInput
parent_folder: HierarchicalMultiCriterionInput
zip_file: MultiCriterionInput
"Filter by modification time"
mod_time: TimestampCriterionInput
"Filter files by duplication criteria (only phash applies to files)"
duplicated: FileDuplicationCriterionInput
  "Find files based on hash"
hashes: [FingerprintFilterInput!]
video_file_filter: VideoFileFilterInput
image_file_filter: ImageFileFilterInput
scene_count: IntCriterionInput
image_count: IntCriterionInput
gallery_count: IntCriterionInput
"Filter by related scenes that meet this criteria"
scenes_filter: SceneFilterType
"Filter by related images that meet this criteria"
images_filter: ImageFilterType
"Filter by related galleries that meet this criteria"
galleries_filter: GalleryFilterType
"Filter by creation time"
created_at: TimestampCriterionInput
"Filter by last update time"
updated_at: TimestampCriterionInput
}
input FolderFilterType {
AND: FolderFilterType
OR: FolderFilterType
NOT: FolderFilterType
path: StringCriterionInput
basename: StringCriterionInput
parent_folder: HierarchicalMultiCriterionInput
zip_file: MultiCriterionInput
"Filter by modification time"
mod_time: TimestampCriterionInput
gallery_count: IntCriterionInput
"Filter by files that meet this criteria"
files_filter: FileFilterType
"Filter by related galleries that meet this criteria"
galleries_filter: GalleryFilterType
"Filter by creation time"
created_at: TimestampCriterionInput
"Filter by last update time"
updated_at: TimestampCriterionInput
}
input VideoFileFilterInput {
resolution: ResolutionCriterionInput
orientation: OrientationCriterionInput
framerate: IntCriterionInput
bitrate: IntCriterionInput
format: StringCriterionInput
video_codec: StringCriterionInput
audio_codec: StringCriterionInput
"in seconds"
duration: IntCriterionInput
captions: StringCriterionInput
interactive: Boolean
interactive_speed: IntCriterionInput
}
input ImageFileFilterInput {
format: StringCriterionInput
resolution: ResolutionCriterionInput
orientation: OrientationCriterionInput
}
input FingerprintFilterInput {
type: String!
value: String!
"Hamming distance - defaults to 0"
distance: Int
}
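A sketch of a `FileFilterType` value combining the new fingerprint and video sub-filters. The hash value and thresholds are hypothetical; criterion input shapes follow the `IntCriterionInput` convention used elsewhere in this schema:

```graphql
# Find video files with a matching oshash fingerprint (exact match,
# since distance defaults to 0) that run longer than an hour.
{
  hashes: [{ type: "oshash", value: "deadbeef01234567" }]
  video_file_filter: {
    duration: { value: 3600, modifier: GREATER_THAN }
  }
}
```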
enum CriterionModifier {
@ -738,7 +933,7 @@ input GenderCriterionInput {
}
input CircumcisionCriterionInput {
value: [CircumisedEnum!]
value: [CircumcisedEnum!]
modifier: CriterionModifier!
}
@ -32,6 +32,7 @@ type Gallery {
cover: Image
paths: GalleryPathsType! # Resolver
custom_fields: Map!
image(index: Int!): Image!
}
@ -50,6 +51,8 @@ input GalleryCreateInput {
studio_id: ID
tag_ids: [ID!]
performer_ids: [ID!]
custom_fields: Map
}
input GalleryUpdateInput {
@ -71,6 +74,8 @@ input GalleryUpdateInput {
performer_ids: [ID!]
primary_file_id: ID
custom_fields: CustomFieldsInput
}
input BulkGalleryUpdateInput {
@ -89,6 +94,8 @@ input BulkGalleryUpdateInput {
studio_id: ID
tag_ids: BulkUpdateIds
performer_ids: BulkUpdateIds
custom_fields: CustomFieldsInput
}
input GalleryDestroyInput {
@ -100,6 +107,8 @@ input GalleryDestroyInput {
"""
delete_file: Boolean
delete_generated: Boolean
"If true, delete the file entry from the database if the file is not assigned to any other objects"
destroy_file_entry: Boolean
}
type FindGalleriesResultType {
@ -30,6 +30,8 @@ type Group {
performer_count(depth: Int): Int! # Resolver
sub_group_count(depth: Int): Int! # Resolver
scenes: [Scene!]!
o_counter: Int # Resolver
custom_fields: Map!
}
input GroupDescriptionInput {
@ -58,6 +60,8 @@ input GroupCreateInput {
front_image: String
"This should be a URL or a base64 encoded data URL"
back_image: String
custom_fields: Map
}
input GroupUpdateInput {
@ -81,6 +85,8 @@ input GroupUpdateInput {
front_image: String
"This should be a URL or a base64 encoded data URL"
back_image: String
custom_fields: CustomFieldsInput
}
input BulkUpdateGroupDescriptionsInput {
@ -93,6 +99,8 @@ input BulkGroupUpdateInput {
ids: [ID!]
# rating expressed as 1-100
rating100: Int
date: String
synopsis: String
studio_id: ID
director: String
urls: BulkUpdateStrings
@ -100,6 +108,8 @@ input BulkGroupUpdateInput {
containing_groups: BulkUpdateGroupDescriptionsInput
sub_groups: BulkUpdateGroupDescriptionsInput
custom_fields: CustomFieldsInput
}
input GroupDestroyInput {
@ -21,6 +21,7 @@ type Image {
studio: Studio
tags: [Tag!]!
performers: [Performer!]!
custom_fields: Map!
}
type ImageFileType {
@ -56,6 +57,7 @@ input ImageUpdateInput {
gallery_ids: [ID!]
primary_file_id: ID
custom_fields: CustomFieldsInput
}
input BulkImageUpdateInput {
@ -76,18 +78,23 @@ input BulkImageUpdateInput {
performer_ids: BulkUpdateIds
tag_ids: BulkUpdateIds
gallery_ids: BulkUpdateIds
custom_fields: CustomFieldsInput
}
input ImageDestroyInput {
id: ID!
delete_file: Boolean
delete_generated: Boolean
"If true, delete the file entry from the database if the file is not assigned to any other objects"
destroy_file_entry: Boolean
}
input ImagesDestroyInput {
ids: [ID!]!
delete_file: Boolean
delete_generated: Boolean
"If true, delete the file entry from the database if the file is not assigned to any other objects"
destroy_file_entry: Boolean
}
type FindImagesResultType {
@ -10,8 +10,11 @@ input GenerateMetadataInput {
transcodes: Boolean
"Generate transcodes even if not required"
forceTranscodes: Boolean
"Generate video phashes during scan"
phashes: Boolean
interactiveHeatmapsSpeeds: Boolean
"Generate image phashes during scan"
imagePhashes: Boolean
imageThumbnails: Boolean
clipPreviews: Boolean
@ -19,6 +22,12 @@ input GenerateMetadataInput {
sceneIDs: [ID!]
"marker ids to generate for"
markerIDs: [ID!]
"image ids to generate for"
imageIDs: [ID!]
"gallery ids to generate for"
galleryIDs: [ID!]
"paths to run generate on, in addition to the other ID lists"
paths: [String!]
"overwrite existing media"
overwrite: Boolean
@ -85,8 +94,10 @@ input ScanMetadataInput {
scanGenerateImagePreviews: Boolean
"Generate sprites during scan"
scanGenerateSprites: Boolean
"Generate phashes during scan"
"Generate video phashes during scan"
scanGeneratePhashes: Boolean
"Generate image phashes during scan"
scanGenerateImagePhashes: Boolean
"Generate image thumbnails during scan"
scanGenerateThumbnails: Boolean
"Generate image clip previews during scan"
@ -107,8 +118,10 @@ type ScanMetadataOptions {
scanGenerateImagePreviews: Boolean!
"Generate sprites during scan"
scanGenerateSprites: Boolean!
"Generate phashes during scan"
"Generate video phashes during scan"
scanGeneratePhashes: Boolean!
"Generate image phashes during scan"
  scanGenerateImagePhashes: Boolean!
"Generate image thumbnails during scan"
scanGenerateThumbnails: Boolean!
"Generate image clip previews during scan"
@ -118,6 +131,14 @@ type ScanMetadataOptions {
input CleanMetadataInput {
paths: [String!]
"""
Don't check zip file contents when determining whether to clean a file.
This can significantly speed up the clean process, but will potentially miss removed files within zip files.
  Where users do not modify zip file contents directly, this should be safe to use.
Defaults to false.
"""
ignoreZipFileContents: Boolean
"Do a dry run. Don't delete any files"
dryRun: Boolean!
}
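A sketch of invoking a clean job with the new option. The `metadataClean` mutation name is assumed from stash naming conventions and does not appear in this diff; the path is hypothetical:

```graphql
mutation CleanDryRun {
  metadataClean(
    input: {
      paths: ["/data/archive"]
      # skip zip-content checks for speed; safe when zips aren't edited in place
      ignoreZipFileContents: true
      dryRun: true
    }
  )
}
```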
@ -204,7 +225,9 @@ input IdentifyMetadataOptionsInput {
setCoverImage: Boolean
setOrganized: Boolean
"defaults to true if not provided"
includeMalePerformers: Boolean
includeMalePerformers: Boolean @deprecated(reason: "Use performerGenders")
"Filter to only include performers with these genders. If not provided, all genders are included."
performerGenders: [GenderEnum!]
"defaults to true if not provided"
skipMultipleMatches: Boolean
"tag to tag skipped multiple matches with"
@ -249,7 +272,9 @@ type IdentifyMetadataOptions {
setCoverImage: Boolean
setOrganized: Boolean
"defaults to true if not provided"
includeMalePerformers: Boolean
includeMalePerformers: Boolean @deprecated(reason: "Use performerGenders")
"Filter to only include performers with these genders. If not provided, all genders are included."
performerGenders: [GenderEnum!]
"defaults to true if not provided"
skipMultipleMatches: Boolean
"tag to tag skipped multiple matches with"
@ -310,6 +335,8 @@ input ImportObjectsInput {
input BackupDatabaseInput {
download: Boolean
"If true, blob files will be included in the backup. This can significantly increase the size of the backup and the time it takes to create it, but allows for a complete backup of the system that can be restored without needing access to the original media files."
includeBlobs: Boolean
}
input AnonymiseDatabaseInput {
@ -344,4 +371,6 @@ input CustomFieldsInput {
full: Map
"If populated, only the keys in this map will be updated"
partial: Map
"Remove any keys in this list"
remove: [String!]
}
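The three `CustomFieldsInput` modes compose in a single update. A sketch with hypothetical field names (`full` replaces the whole map, so it would normally be used alone):

```graphql
# Merge two keys and delete a third, leaving all other custom fields intact.
{
  custom_fields: {
    partial: { director_cut: true, rating_notes: "4K remaster" }
    remove: ["old_field"]
  }
}
```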
@ -7,7 +7,7 @@ enum GenderEnum {
NON_BINARY
}
enum CircumisedEnum {
enum CircumcisedEnum {
CUT
UNCUT
}
@ -29,8 +29,10 @@ type Performer {
measurements: String
fake_tits: String
penis_length: Float
circumcised: CircumisedEnum
career_length: String
circumcised: CircumcisedEnum
career_length: String @deprecated(reason: "Use career_start and career_end")
career_start: String
career_end: String
tattoos: String
piercings: String
alias_list: [String!]!
@ -76,10 +78,13 @@ input PerformerCreateInput {
measurements: String
fake_tits: String
penis_length: Float
circumcised: CircumisedEnum
career_length: String
circumcised: CircumcisedEnum
career_length: String @deprecated(reason: "Use career_start and career_end")
career_start: String
career_end: String
tattoos: String
piercings: String
"Duplicate aliases and those equal to name will be ignored (case-insensitive)"
alias_list: [String!]
twitter: String @deprecated(reason: "Use urls")
instagram: String @deprecated(reason: "Use urls")
@ -114,10 +119,13 @@ input PerformerUpdateInput {
measurements: String
fake_tits: String
penis_length: Float
circumcised: CircumisedEnum
career_length: String
circumcised: CircumcisedEnum
career_length: String @deprecated(reason: "Use career_start and career_end")
career_start: String
career_end: String
tattoos: String
piercings: String
"Duplicate aliases and those equal to name will be ignored (case-insensitive)"
alias_list: [String!]
twitter: String @deprecated(reason: "Use urls")
instagram: String @deprecated(reason: "Use urls")
@ -157,10 +165,13 @@ input BulkPerformerUpdateInput {
measurements: String
fake_tits: String
penis_length: Float
circumcised: CircumisedEnum
career_length: String
circumcised: CircumcisedEnum
career_length: String @deprecated(reason: "Use career_start and career_end")
career_start: String
career_end: String
tattoos: String
piercings: String
"Duplicate aliases and those equal to name will result in an error (case-insensitive)"
alias_list: BulkUpdateStrings
twitter: String @deprecated(reason: "Use urls")
instagram: String @deprecated(reason: "Use urls")
@ -185,3 +196,10 @@ type FindPerformersResultType {
count: Int!
performers: [Performer!]!
}
input PerformerMergeInput {
source: [ID!]!
destination: ID!
# values defined here will override values in the destination
values: PerformerUpdateInput
}
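Assuming a `performerMerge` mutation accepts this input and returns the merged performer (only the input type is defined in this diff), usage might look like:

```graphql
# Hypothetical mutation name and return selection; IDs are illustrative.
mutation MergePerformers {
  performerMerge(
    input: {
      source: ["12", "15"]
      destination: "7"
      # values override fields on the destination performer
      values: { id: "7", details: "Merged profile" }
    }
  ) {
    id
  }
}
```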
@ -42,6 +42,13 @@ input SceneMarkerUpdateInput {
tag_ids: [ID!]
}
input BulkSceneMarkerUpdateInput {
ids: [ID!]
title: String
primary_tag_id: ID
tag_ids: BulkUpdateIds
}
type FindSceneMarkersResultType {
count: Int!
scene_markers: [SceneMarker!]!
@ -79,6 +79,8 @@ type Scene {
performers: [Performer!]!
stash_ids: [StashID!]!
custom_fields: Map!
"Return valid stream paths"
sceneStreams: [SceneStreamEndpoint!]!
}
@ -120,6 +122,8 @@ input SceneCreateInput {
Files must not already be primary for another scene.
"""
file_ids: [ID!]
custom_fields: Map
}
input SceneUpdateInput {
@ -158,6 +162,8 @@ input SceneUpdateInput {
)
primary_file_id: ID
custom_fields: CustomFieldsInput
}
enum BulkUpdateIdMode {
@ -190,18 +196,24 @@ input BulkSceneUpdateInput {
tag_ids: BulkUpdateIds
group_ids: BulkUpdateIds
movie_ids: BulkUpdateIds @deprecated(reason: "Use group_ids")
custom_fields: CustomFieldsInput
}
input SceneDestroyInput {
id: ID!
delete_file: Boolean
delete_generated: Boolean
"If true, delete the file entry from the database if the file is not assigned to any other objects"
destroy_file_entry: Boolean
}
input ScenesDestroyInput {
ids: [ID!]!
delete_file: Boolean
delete_generated: Boolean
"If true, delete the file entry from the database if the file is not assigned to any other objects"
destroy_file_entry: Boolean
}
type FindScenesResultType {
@ -18,7 +18,9 @@ type ScrapedPerformer {
fake_tits: String
penis_length: String
circumcised: String
career_length: String
career_length: String @deprecated(reason: "Use career_start and career_end")
career_start: String
career_end: String
tattoos: String
piercings: String
# aliases must be comma-delimited to be parsed correctly
@ -54,7 +56,9 @@ input ScrapedPerformerInput {
fake_tits: String
penis_length: String
circumcised: String
career_length: String
career_length: String @deprecated(reason: "Use career_start and career_end")
career_start: String
career_end: String
tattoos: String
piercings: String
aliases: String
@ -55,9 +55,14 @@ type ScrapedStudio {
"Set if studio matched"
stored_id: ID
name: String!
url: String
url: String @deprecated(reason: "use urls")
urls: [String!]
parent: ScrapedStudio
image: String
details: String
"Aliases must be comma-delimited to be parsed correctly"
aliases: String
tags: [ScrapedTag!]
remote_site_id: String
}
@ -66,6 +71,11 @@ type ScrapedTag {
"Set if tag matched"
stored_id: ID
name: String!
description: String
alias_list: [String!]
parent: ScrapedTag
"Remote site ID, if applicable"
remote_site_id: String
}
type ScrapedScene {
@ -191,6 +201,13 @@ input ScrapeSingleStudioInput {
query: String
}
input ScrapeSingleTagInput {
"""
Query can be either a name or a Stash ID
"""
query: String
}
input ScrapeSinglePerformerInput {
"Instructs to query by string"
query: String
@ -274,7 +291,10 @@ type StashBoxFingerprint {
duration: Int!
}
"If neither ids nor names are set, tag all items"
"""
Accepts either ids, or a combination of names and stash_ids.
If none are set, then all existing items will be tagged.
"""
input StashBoxBatchTagInput {
"Stash endpoint to use for the tagging"
endpoint: Int @deprecated(reason: "use stash_box_endpoint")
@ -286,12 +306,17 @@ input StashBoxBatchTagInput {
refresh: Boolean!
"If batch adding studios, should their parent studios also be created?"
createParent: Boolean!
"If set, only tag these ids"
"""
IDs in stash of the items to update.
If set, names and stash_ids fields will be ignored.
"""
ids: [ID!]
"If set, only tag these names"
"Names of the items in the stash-box instance to search for and create"
names: [String!]
"If set, only tag these performer ids"
"Stash IDs of the items in the stash-box instance to search for and create"
stash_ids: [String!]
"IDs in stash of the performers to update"
performer_ids: [ID!] @deprecated(reason: "use ids")
"If set, only tag these performer names"
"Names of the performers in the stash-box instance to search for and create"
performer_names: [String!] @deprecated(reason: "use names")
}
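A sketch of the non-deprecated `StashBoxBatchTagInput` shape, tagging by stash-box IDs. The `stash_box_endpoint` field name comes from the deprecation reason above; its type and the ID value are assumptions:

```graphql
# Tag existing items matching these stash-box IDs; ids/names are omitted,
# so the stash_ids list drives the batch.
{
  stash_box_endpoint: "https://stashdb.example/graphql"
  refresh: false
  createParent: true
  stash_ids: ["3d1cafe1-0000-0000-0000-000000000000"]
}
```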
@ -1,12 +1,14 @@
type Studio {
id: ID!
name: String!
url: String
url: String @deprecated(reason: "Use urls")
urls: [String!]!
parent_studio: Studio
child_studios: [Studio!]!
aliases: [String!]!
tags: [Tag!]!
ignore_auto_tag: Boolean!
organized: Boolean!
image_path: String # Resolver
scene_count(depth: Int): Int! # Resolver
@ -24,11 +26,15 @@ type Studio {
updated_at: Time!
groups: [Group!]!
movies: [Movie!]! @deprecated(reason: "use groups instead")
o_counter: Int
custom_fields: Map!
}
input StudioCreateInput {
name: String!
url: String
url: String @deprecated(reason: "Use urls")
urls: [String!]
parent_id: ID
"This should be a URL or a base64 encoded data URL"
image: String
@ -37,15 +43,20 @@ input StudioCreateInput {
rating100: Int
favorite: Boolean
details: String
"Duplicate aliases and those equal to name will be ignored (case-insensitive)"
aliases: [String!]
tag_ids: [ID!]
ignore_auto_tag: Boolean
organized: Boolean
custom_fields: Map
}
input StudioUpdateInput {
id: ID!
name: String
url: String
url: String @deprecated(reason: "Use urls")
urls: [String!]
parent_id: ID
"This should be a URL or a base64 encoded data URL"
image: String
@ -54,9 +65,27 @@ input StudioUpdateInput {
rating100: Int
favorite: Boolean
details: String
"Duplicate aliases and those equal to name will be ignored (case-insensitive)"
aliases: [String!]
tag_ids: [ID!]
ignore_auto_tag: Boolean
organized: Boolean
custom_fields: CustomFieldsInput
}
input BulkStudioUpdateInput {
ids: [ID!]!
url: String @deprecated(reason: "Use urls")
urls: BulkUpdateStrings
parent_id: ID
# rating expressed as 1-100
rating100: Int
favorite: Boolean
details: String
tag_ids: BulkUpdateIds
ignore_auto_tag: Boolean
organized: Boolean
}
input StudioDestroyInput {
@ -9,6 +9,7 @@ type Tag {
created_at: Time!
updated_at: Time!
favorite: Boolean!
stash_ids: [StashID!]!
image_path: String # Resolver
scene_count(depth: Int): Int! # Resolver
scene_marker_count(depth: Int): Int! # Resolver
@ -23,6 +24,7 @@ type Tag {
parent_count: Int! # Resolver
child_count: Int! # Resolver
custom_fields: Map!
}
input TagCreateInput {
@ -30,14 +32,18 @@ input TagCreateInput {
"Value that does not appear in the UI but overrides name for sorting"
sort_name: String
description: String
"Duplicate aliases and those equal to name will be ignored (case-insensitive)"
aliases: [String!]
ignore_auto_tag: Boolean
favorite: Boolean
"This should be a URL or a base64 encoded data URL"
image: String
stash_ids: [StashIDInput!]
parent_ids: [ID!]
child_ids: [ID!]
custom_fields: Map
}
input TagUpdateInput {
@ -46,14 +52,18 @@ input TagUpdateInput {
"Value that does not appear in the UI but overrides name for sorting"
sort_name: String
description: String
"Duplicate aliases and those equal to name will be ignored (case-insensitive)"
aliases: [String!]
ignore_auto_tag: Boolean
favorite: Boolean
"This should be a URL or a base64 encoded data URL"
image: String
stash_ids: [StashIDInput!]
parent_ids: [ID!]
child_ids: [ID!]
custom_fields: CustomFieldsInput
}
input TagDestroyInput {
@ -68,11 +78,14 @@ type FindTagsResultType {
input TagsMergeInput {
source: [ID!]!
destination: ID!
# values defined here will override values in the destination
values: TagUpdateInput
}
input BulkTagUpdateInput {
ids: [ID!]
description: String
"Duplicate aliases and those equal to name will result in an error (case-insensitive)"
aliases: BulkUpdateStrings
ignore_auto_tag: Boolean
favorite: Boolean
@ -13,6 +13,7 @@ fragment ImageFragment on Image {
fragment StudioFragment on Studio {
name
id
aliases
urls {
...URLFragment
}
@ -28,6 +29,13 @@ fragment StudioFragment on Studio {
fragment TagFragment on Tag {
name
id
description
aliases
category {
id
name
description
}
}
fragment MeasurementsFragment on Measurements {
@ -119,18 +127,6 @@ fragment SceneFragment on Scene {
}
}
query FindSceneByFingerprint($fingerprint: FingerprintQueryInput!) {
findSceneByFingerprint(fingerprint: $fingerprint) {
...SceneFragment
}
}
query FindScenesByFullFingerprints($fingerprints: [FingerprintQueryInput!]!) {
findScenesByFullFingerprints(fingerprints: $fingerprints) {
...SceneFragment
}
}
query FindScenesBySceneFingerprints(
$fingerprints: [[FingerprintQueryInput!]!]!
) {
@ -169,6 +165,21 @@ query FindStudio($id: ID, $name: String) {
}
}
query FindTag($id: ID, $name: String) {
findTag(id: $id, name: $name) {
...TagFragment
}
}
query QueryTags($input: TagQueryInput!) {
queryTags(input: $input) {
count
tags {
...TagFragment
}
}
}
mutation SubmitFingerprint($input: FingerprintSubmission!) {
submitFingerprint(input: $input)
}
@ -40,6 +40,8 @@ func authenticateHandler() func(http.Handler) http.Handler {
return
}
r = session.SetLocalRequest(r)
userID, err := manager.GetInstance().SessionStore.Authenticate(w, r)
if err != nil {
if !errors.Is(err, session.ErrUnauthorized) {
@ -98,7 +98,7 @@ func (t changesetTranslator) string(value *string) string {
return ""
}
return *value
return strings.TrimSpace(*value)
}
func (t changesetTranslator) optionalString(value *string, field string) models.OptionalString {
@ -106,7 +106,12 @@ func (t changesetTranslator) optionalString(value *string, field string) models.
return models.OptionalString{}
}
return models.NewOptionalStringPtr(value)
if value == nil {
return models.NewOptionalStringPtr(nil)
}
trimmed := strings.TrimSpace(*value)
return models.NewOptionalString(trimmed)
}
func (t changesetTranslator) optionalDate(value *string, field string) (models.OptionalDate, error) {
@ -318,8 +323,14 @@ func (t changesetTranslator) updateStrings(value []string, field string) *models
return nil
}
// Trim whitespace from each string
trimmedValues := make([]string, len(value))
for i, v := range value {
trimmedValues[i] = strings.TrimSpace(v)
}
return &models.UpdateStrings{
Values: value,
Values: trimmedValues,
Mode: models.RelationshipUpdateModeSet,
}
}
@ -329,8 +340,14 @@ func (t changesetTranslator) updateStringsBulk(value *BulkUpdateStrings, field s
return nil
}
// Trim whitespace from each string
trimmedValues := make([]string, len(value.Values))
for i, v := range value.Values {
trimmedValues[i] = strings.TrimSpace(v)
}
return &models.UpdateStrings{
Values: value.Values,
Values: trimmedValues,
Mode: value.Mode,
}
}
@ -448,7 +465,7 @@ func groupsDescriptionsFromGroupInput(input []*GroupDescriptionInput) ([]models.
GroupID: gID,
}
if v.Description != nil {
ret[i].Description = *v.Description
ret[i].Description = strings.TrimSpace(*v.Description)
}
}
@ -7,8 +7,10 @@ import (
"fmt"
"io"
"net/http"
"os"
"regexp"
"runtime"
"strings"
"time"
"golang.org/x/sys/cpu"
@ -36,6 +38,24 @@ var stashReleases = func() map[string]string {
}
}
// isMacOSBundle checks if the application is running from within a macOS .app bundle
func isMacOSBundle() bool {
exec, err := os.Executable()
return err == nil && strings.Contains(exec, "Stash.app/")
}
// getWantedRelease determines which release variant to download based on platform and bundle type
func getWantedRelease(platform string) string {
release := stashReleases()[platform]
// On macOS, check if running from .app bundle
if runtime.GOOS == "darwin" && isMacOSBundle() {
return "Stash.app.zip"
}
return release
}
type githubReleasesResponse struct {
Url string
Assets_url string
@ -168,7 +188,7 @@ func GetLatestRelease(ctx context.Context) (*LatestRelease, error) {
}
platform := fmt.Sprintf("%s/%s", runtime.GOOS, arch)
wantedRelease := stashReleases()[platform]
wantedRelease := getWantedRelease(platform)
url := apiReleases
if build.IsDevelop() {
@ -0,0 +1,12 @@
package api
import "github.com/stashapp/stash/pkg/models"
func handleUpdateCustomFields(input models.CustomFieldsInput) models.CustomFieldsInput {
ret := input
// convert json.Numbers to int/float
ret.Full = convertMapJSONNumbers(ret.Full)
ret.Partial = convertMapJSONNumbers(ret.Partial)
return ret
}
internal/api/fields.go Normal file
@ -0,0 +1,23 @@
package api
import (
"context"
"github.com/99designs/gqlgen/graphql"
)
type queryFields []string
func collectQueryFields(ctx context.Context) queryFields {
fields := graphql.CollectAllFields(ctx)
return queryFields(fields)
}
func (f queryFields) Has(field string) bool {
for _, v := range f {
if v == field {
return true
}
}
return false
}
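`graphql.CollectAllFields` needs a live query context, so outside a resolver the helper can be exercised with a hardcoded field list, as in this sketch:

```go
package main

import "fmt"

// queryFields mirrors the helper in internal/api/fields.go: the list of
// field names requested by a GraphQL query, with a membership check.
type queryFields []string

func (f queryFields) Has(field string) bool {
	for _, v := range f {
		if v == field {
			return true
		}
	}
	return false
}

func main() {
	// in a resolver these would come from collectQueryFields(ctx)
	fields := queryFields{"id", "title", "files"}
	fmt.Println(fields.Has("files"))  // true
	fmt.Println(fields.Has("rating")) // false
}
```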

View file

@ -26,6 +26,7 @@ var imageBoxExts = []string{
".gif",
".svg",
".webp",
".avif",
}
func newImageBox(box fs.FS) (*imageBox, error) {
@ -101,7 +102,7 @@ func initCustomPerformerImages(customPath string) {
}
}
func getDefaultPerformerImage(name string, gender *models.GenderEnum) []byte {
func getDefaultPerformerImage(name string, gender *models.GenderEnum, sfwMode bool) []byte {
// try the custom box first if we have one
if performerBoxCustom != nil {
ret, err := performerBoxCustom.GetRandomImageByName(name)
@ -111,6 +112,10 @@ func getDefaultPerformerImage(name string, gender *models.GenderEnum) []byte {
logger.Warnf("error loading custom default performer image: %v", err)
}
if sfwMode {
return static.ReadAll(static.DefaultSFWPerformerImage)
}
var g models.GenderEnum
if gender != nil {
g = *gender

internal/api/input.go Normal file
View file

@ -0,0 +1,35 @@
package api
import (
"fmt"
"github.com/stashapp/stash/pkg/sliceutil/stringslice"
)
// TODO - apply handleIDs to other resolvers that accept ID lists
// handleIDList validates and converts a list of string IDs to integers
func handleIDList(idList []string, field string) ([]int, error) {
if err := validateIDList(idList); err != nil {
return nil, fmt.Errorf("validating %s: %w", field, err)
}
ids, err := stringslice.StringSliceToIntSlice(idList)
if err != nil {
return nil, fmt.Errorf("converting %s: %w", field, err)
}
return ids, nil
}
// validateIDList returns an error if there are any duplicate ids in the list
func validateIDList(ids []string) error {
seen := make(map[string]struct{})
for _, id := range ids {
if _, exists := seen[id]; exists {
return fmt.Errorf("duplicate id found: %s", id)
}
seen[id] = struct{}{}
}
return nil
}
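The two helpers above compose as follows; this sketch inlines the integer conversion that the real code delegates to `stringslice.StringSliceToIntSlice`:

```go
package main

import (
	"fmt"
	"strconv"
)

// validateIDList returns an error on the first duplicate id, as above.
func validateIDList(ids []string) error {
	seen := make(map[string]struct{})
	for _, id := range ids {
		if _, exists := seen[id]; exists {
			return fmt.Errorf("duplicate id found: %s", id)
		}
		seen[id] = struct{}{}
	}
	return nil
}

// handleIDList validates, then converts each id to an int.
func handleIDList(idList []string, field string) ([]int, error) {
	if err := validateIDList(idList); err != nil {
		return nil, fmt.Errorf("validating %s: %w", field, err)
	}
	ids := make([]int, len(idList))
	for i, s := range idList {
		n, err := strconv.Atoi(s)
		if err != nil {
			return nil, fmt.Errorf("converting %s: %w", field, err)
		}
		ids[i] = n
	}
	return ids, nil
}

func main() {
	ids, err := handleIDList([]string{"1", "2", "3"}, "scene_ids")
	fmt.Println(ids, err) // [1 2 3] <nil>
	_, err = handleIDList([]string{"1", "1"}, "scene_ids")
	fmt.Println(err) // validating scene_ids: duplicate id found: 1
}
```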

View file

@ -10,6 +10,8 @@
//go:generate go run github.com/vektah/dataloaden TagLoader int *github.com/stashapp/stash/pkg/models.Tag
//go:generate go run github.com/vektah/dataloaden GroupLoader int *github.com/stashapp/stash/pkg/models.Group
//go:generate go run github.com/vektah/dataloaden FileLoader github.com/stashapp/stash/pkg/models.FileID github.com/stashapp/stash/pkg/models.File
//go:generate go run github.com/vektah/dataloaden FolderLoader github.com/stashapp/stash/pkg/models.FolderID *github.com/stashapp/stash/pkg/models.Folder
//go:generate go run github.com/vektah/dataloaden FolderRelatedFolderIDsLoader github.com/stashapp/stash/pkg/models.FolderID []github.com/stashapp/stash/pkg/models.FolderID
//go:generate go run github.com/vektah/dataloaden SceneFileIDsLoader int []github.com/stashapp/stash/pkg/models.FileID
//go:generate go run github.com/vektah/dataloaden ImageFileIDsLoader int []github.com/stashapp/stash/pkg/models.FileID
//go:generate go run github.com/vektah/dataloaden GalleryFileIDsLoader int []github.com/stashapp/stash/pkg/models.FileID
@ -41,27 +43,40 @@ const (
)
type Loaders struct {
SceneByID *SceneLoader
SceneFiles *SceneFileIDsLoader
ScenePlayCount *ScenePlayCountLoader
SceneOCount *SceneOCountLoader
ScenePlayHistory *ScenePlayHistoryLoader
SceneOHistory *SceneOHistoryLoader
SceneLastPlayed *SceneLastPlayedLoader
SceneByID *SceneLoader
SceneFiles *SceneFileIDsLoader
ScenePlayCount *ScenePlayCountLoader
SceneOCount *SceneOCountLoader
ScenePlayHistory *ScenePlayHistoryLoader
SceneOHistory *SceneOHistoryLoader
SceneLastPlayed *SceneLastPlayedLoader
SceneCustomFields *CustomFieldsLoader
ImageFiles *ImageFileIDsLoader
GalleryFiles *GalleryFileIDsLoader
GalleryByID *GalleryLoader
ImageByID *ImageLoader
GalleryByID *GalleryLoader
GalleryCustomFields *CustomFieldsLoader
ImageByID *ImageLoader
ImageCustomFields *CustomFieldsLoader
PerformerByID *PerformerLoader
PerformerCustomFields *CustomFieldsLoader
StudioByID *StudioLoader
TagByID *TagLoader
GroupByID *GroupLoader
FileByID *FileLoader
StudioByID *StudioLoader
StudioCustomFields *CustomFieldsLoader
TagByID *TagLoader
TagCustomFields *CustomFieldsLoader
GroupByID *GroupLoader
GroupCustomFields *CustomFieldsLoader
FileByID *FileLoader
FolderByID *FolderLoader
FolderParentFolderIDs *FolderRelatedFolderIDsLoader
FolderSubFolderIDs *FolderRelatedFolderIDsLoader
}
type Middleware struct {
@ -82,11 +97,21 @@ func (m Middleware) Middleware(next http.Handler) http.Handler {
maxBatch: maxBatch,
fetch: m.fetchGalleries(ctx),
},
GalleryCustomFields: &CustomFieldsLoader{
wait: wait,
maxBatch: maxBatch,
fetch: m.fetchGalleryCustomFields(ctx),
},
ImageByID: &ImageLoader{
wait: wait,
maxBatch: maxBatch,
fetch: m.fetchImages(ctx),
},
ImageCustomFields: &CustomFieldsLoader{
wait: wait,
maxBatch: maxBatch,
fetch: m.fetchImageCustomFields(ctx),
},
PerformerByID: &PerformerLoader{
wait: wait,
maxBatch: maxBatch,
@ -97,6 +122,16 @@ func (m Middleware) Middleware(next http.Handler) http.Handler {
maxBatch: maxBatch,
fetch: m.fetchPerformerCustomFields(ctx),
},
StudioCustomFields: &CustomFieldsLoader{
wait: wait,
maxBatch: maxBatch,
fetch: m.fetchStudioCustomFields(ctx),
},
SceneCustomFields: &CustomFieldsLoader{
wait: wait,
maxBatch: maxBatch,
fetch: m.fetchSceneCustomFields(ctx),
},
StudioByID: &StudioLoader{
wait: wait,
maxBatch: maxBatch,
@ -107,16 +142,41 @@ func (m Middleware) Middleware(next http.Handler) http.Handler {
maxBatch: maxBatch,
fetch: m.fetchTags(ctx),
},
TagCustomFields: &CustomFieldsLoader{
wait: wait,
maxBatch: maxBatch,
fetch: m.fetchTagCustomFields(ctx),
},
GroupByID: &GroupLoader{
wait: wait,
maxBatch: maxBatch,
fetch: m.fetchGroups(ctx),
},
GroupCustomFields: &CustomFieldsLoader{
wait: wait,
maxBatch: maxBatch,
fetch: m.fetchGroupCustomFields(ctx),
},
FileByID: &FileLoader{
wait: wait,
maxBatch: maxBatch,
fetch: m.fetchFiles(ctx),
},
FolderByID: &FolderLoader{
wait: wait,
maxBatch: maxBatch,
fetch: m.fetchFolders(ctx),
},
FolderParentFolderIDs: &FolderRelatedFolderIDsLoader{
wait: wait,
maxBatch: maxBatch,
fetch: m.fetchFoldersParentFolderIDs(ctx),
},
FolderSubFolderIDs: &FolderRelatedFolderIDsLoader{
wait: wait,
maxBatch: maxBatch,
fetch: m.fetchFoldersSubFolderIDs(ctx),
},
SceneFiles: &SceneFileIDsLoader{
wait: wait,
maxBatch: maxBatch,
@ -187,6 +247,18 @@ func (m Middleware) fetchScenes(ctx context.Context) func(keys []int) ([]*models
}
}
func (m Middleware) fetchSceneCustomFields(ctx context.Context) func(keys []int) ([]models.CustomFieldMap, []error) {
return func(keys []int) (ret []models.CustomFieldMap, errs []error) {
err := m.Repository.WithDB(ctx, func(ctx context.Context) error {
var err error
ret, err = m.Repository.Scene.GetCustomFieldsBulk(ctx, keys)
return err
})
return ret, toErrorSlice(err)
}
}
func (m Middleware) fetchImages(ctx context.Context) func(keys []int) ([]*models.Image, []error) {
return func(keys []int) (ret []*models.Image, errs []error) {
err := m.Repository.WithDB(ctx, func(ctx context.Context) error {
@ -199,6 +271,18 @@ func (m Middleware) fetchImages(ctx context.Context) func(keys []int) ([]*models
}
}
func (m Middleware) fetchImageCustomFields(ctx context.Context) func(keys []int) ([]models.CustomFieldMap, []error) {
return func(keys []int) (ret []models.CustomFieldMap, errs []error) {
err := m.Repository.WithDB(ctx, func(ctx context.Context) error {
var err error
ret, err = m.Repository.Image.GetCustomFieldsBulk(ctx, keys)
return err
})
return ret, toErrorSlice(err)
}
}
func (m Middleware) fetchGalleries(ctx context.Context) func(keys []int) ([]*models.Gallery, []error) {
return func(keys []int) (ret []*models.Gallery, errs []error) {
err := m.Repository.WithDB(ctx, func(ctx context.Context) error {
@ -246,6 +330,18 @@ func (m Middleware) fetchStudios(ctx context.Context) func(keys []int) ([]*model
}
}
func (m Middleware) fetchStudioCustomFields(ctx context.Context) func(keys []int) ([]models.CustomFieldMap, []error) {
return func(keys []int) (ret []models.CustomFieldMap, errs []error) {
err := m.Repository.WithDB(ctx, func(ctx context.Context) error {
var err error
ret, err = m.Repository.Studio.GetCustomFieldsBulk(ctx, keys)
return err
})
return ret, toErrorSlice(err)
}
}
func (m Middleware) fetchTags(ctx context.Context) func(keys []int) ([]*models.Tag, []error) {
return func(keys []int) (ret []*models.Tag, errs []error) {
err := m.Repository.WithDB(ctx, func(ctx context.Context) error {
@ -257,6 +353,42 @@ func (m Middleware) fetchTags(ctx context.Context) func(keys []int) ([]*models.T
}
}
func (m Middleware) fetchTagCustomFields(ctx context.Context) func(keys []int) ([]models.CustomFieldMap, []error) {
return func(keys []int) (ret []models.CustomFieldMap, errs []error) {
err := m.Repository.WithDB(ctx, func(ctx context.Context) error {
var err error
ret, err = m.Repository.Tag.GetCustomFieldsBulk(ctx, keys)
return err
})
return ret, toErrorSlice(err)
}
}
func (m Middleware) fetchGroupCustomFields(ctx context.Context) func(keys []int) ([]models.CustomFieldMap, []error) {
return func(keys []int) (ret []models.CustomFieldMap, errs []error) {
err := m.Repository.WithDB(ctx, func(ctx context.Context) error {
var err error
ret, err = m.Repository.Group.GetCustomFieldsBulk(ctx, keys)
return err
})
return ret, toErrorSlice(err)
}
}
func (m Middleware) fetchGalleryCustomFields(ctx context.Context) func(keys []int) ([]models.CustomFieldMap, []error) {
return func(keys []int) (ret []models.CustomFieldMap, errs []error) {
err := m.Repository.WithDB(ctx, func(ctx context.Context) error {
var err error
ret, err = m.Repository.Gallery.GetCustomFieldsBulk(ctx, keys)
return err
})
return ret, toErrorSlice(err)
}
}
func (m Middleware) fetchGroups(ctx context.Context) func(keys []int) ([]*models.Group, []error) {
return func(keys []int) (ret []*models.Group, errs []error) {
err := m.Repository.WithDB(ctx, func(ctx context.Context) error {
@ -279,6 +411,39 @@ func (m Middleware) fetchFiles(ctx context.Context) func(keys []models.FileID) (
}
}
func (m Middleware) fetchFolders(ctx context.Context) func(keys []models.FolderID) ([]*models.Folder, []error) {
return func(keys []models.FolderID) (ret []*models.Folder, errs []error) {
err := m.Repository.WithDB(ctx, func(ctx context.Context) error {
var err error
ret, err = m.Repository.Folder.FindMany(ctx, keys)
return err
})
return ret, toErrorSlice(err)
}
}
func (m Middleware) fetchFoldersParentFolderIDs(ctx context.Context) func(keys []models.FolderID) ([][]models.FolderID, []error) {
return func(keys []models.FolderID) (ret [][]models.FolderID, errs []error) {
err := m.Repository.WithDB(ctx, func(ctx context.Context) error {
var err error
ret, err = m.Repository.Folder.GetManyParentFolderIDs(ctx, keys)
return err
})
return ret, toErrorSlice(err)
}
}
func (m Middleware) fetchFoldersSubFolderIDs(ctx context.Context) func(keys []models.FolderID) ([][]models.FolderID, []error) {
return func(keys []models.FolderID) (ret [][]models.FolderID, errs []error) {
err := m.Repository.WithDB(ctx, func(ctx context.Context) error {
var err error
ret, err = m.Repository.Folder.GetManySubFolderIDs(ctx, keys)
return err
})
return ret, toErrorSlice(err)
}
}
func (m Middleware) fetchScenesFileIDs(ctx context.Context) func(keys []int) ([][]models.FileID, []error) {
return func(keys []int) (ret [][]models.FileID, errs []error) {
err := m.Repository.WithDB(ctx, func(ctx context.Context) error {

View file

@ -0,0 +1,224 @@
// Code generated by github.com/vektah/dataloaden, DO NOT EDIT.
package loaders
import (
"sync"
"time"
"github.com/stashapp/stash/pkg/models"
)
// FolderLoaderConfig captures the config to create a new FolderLoader
type FolderLoaderConfig struct {
// Fetch is a method that provides the data for the loader
Fetch func(keys []models.FolderID) ([]*models.Folder, []error)
// Wait is how long to wait before sending a batch
Wait time.Duration
// MaxBatch will limit the maximum number of keys to send in one batch, 0 = no limit
MaxBatch int
}
// NewFolderLoader creates a new FolderLoader given a fetch, wait, and maxBatch
func NewFolderLoader(config FolderLoaderConfig) *FolderLoader {
return &FolderLoader{
fetch: config.Fetch,
wait: config.Wait,
maxBatch: config.MaxBatch,
}
}
// FolderLoader batches and caches requests
type FolderLoader struct {
// this method provides the data for the loader
fetch func(keys []models.FolderID) ([]*models.Folder, []error)
// how long to wait before sending a batch
wait time.Duration
// this will limit the maximum number of keys to send in one batch, 0 = no limit
maxBatch int
// INTERNAL
// lazily created cache
cache map[models.FolderID]*models.Folder
// the current batch. keys will continue to be collected until timeout is hit,
// then everything will be sent to the fetch method and out to the listeners
batch *folderLoaderBatch
// mutex to prevent races
mu sync.Mutex
}
type folderLoaderBatch struct {
keys []models.FolderID
data []*models.Folder
error []error
closing bool
done chan struct{}
}
// Load a Folder by key, batching and caching will be applied automatically
func (l *FolderLoader) Load(key models.FolderID) (*models.Folder, error) {
return l.LoadThunk(key)()
}
// LoadThunk returns a function that when called will block waiting for a Folder.
// This method should be used if you want one goroutine to make requests to many
// different data loaders without blocking until the thunk is called.
func (l *FolderLoader) LoadThunk(key models.FolderID) func() (*models.Folder, error) {
l.mu.Lock()
if it, ok := l.cache[key]; ok {
l.mu.Unlock()
return func() (*models.Folder, error) {
return it, nil
}
}
if l.batch == nil {
l.batch = &folderLoaderBatch{done: make(chan struct{})}
}
batch := l.batch
pos := batch.keyIndex(l, key)
l.mu.Unlock()
return func() (*models.Folder, error) {
<-batch.done
var data *models.Folder
if pos < len(batch.data) {
data = batch.data[pos]
}
var err error
// it's convenient to be able to return a single error for everything
if len(batch.error) == 1 {
err = batch.error[0]
} else if batch.error != nil {
err = batch.error[pos]
}
if err == nil {
l.mu.Lock()
l.unsafeSet(key, data)
l.mu.Unlock()
}
return data, err
}
}
// LoadAll fetches many keys at once. It will be broken into appropriate sized
// sub batches depending on how the loader is configured
func (l *FolderLoader) LoadAll(keys []models.FolderID) ([]*models.Folder, []error) {
results := make([]func() (*models.Folder, error), len(keys))
for i, key := range keys {
results[i] = l.LoadThunk(key)
}
folders := make([]*models.Folder, len(keys))
errors := make([]error, len(keys))
for i, thunk := range results {
folders[i], errors[i] = thunk()
}
return folders, errors
}
// LoadAllThunk returns a function that when called will block waiting for the Folders.
// This method should be used if you want one goroutine to make requests to many
// different data loaders without blocking until the thunk is called.
func (l *FolderLoader) LoadAllThunk(keys []models.FolderID) func() ([]*models.Folder, []error) {
results := make([]func() (*models.Folder, error), len(keys))
for i, key := range keys {
results[i] = l.LoadThunk(key)
}
return func() ([]*models.Folder, []error) {
folders := make([]*models.Folder, len(keys))
errors := make([]error, len(keys))
for i, thunk := range results {
folders[i], errors[i] = thunk()
}
return folders, errors
}
}
// Prime the cache with the provided key and value. If the key already exists, no change is made
// and false is returned.
// (To forcefully prime the cache, clear the key first with loader.clear(key).prime(key, value).)
func (l *FolderLoader) Prime(key models.FolderID, value *models.Folder) bool {
l.mu.Lock()
var found bool
if _, found = l.cache[key]; !found {
// make a copy when writing to the cache, it's easy to pass a pointer in from a loop var
// and end up with the whole cache pointing to the same value.
cpy := *value
l.unsafeSet(key, &cpy)
}
l.mu.Unlock()
return !found
}
// Clear the value at key from the cache, if it exists
func (l *FolderLoader) Clear(key models.FolderID) {
l.mu.Lock()
delete(l.cache, key)
l.mu.Unlock()
}
func (l *FolderLoader) unsafeSet(key models.FolderID, value *models.Folder) {
if l.cache == nil {
l.cache = map[models.FolderID]*models.Folder{}
}
l.cache[key] = value
}
// keyIndex will return the location of the key in the batch, if it's not found
// it will add the key to the batch
func (b *folderLoaderBatch) keyIndex(l *FolderLoader, key models.FolderID) int {
for i, existingKey := range b.keys {
if key == existingKey {
return i
}
}
pos := len(b.keys)
b.keys = append(b.keys, key)
if pos == 0 {
go b.startTimer(l)
}
if l.maxBatch != 0 && pos >= l.maxBatch-1 {
if !b.closing {
b.closing = true
l.batch = nil
go b.end(l)
}
}
return pos
}
func (b *folderLoaderBatch) startTimer(l *FolderLoader) {
time.Sleep(l.wait)
l.mu.Lock()
// we must have hit a batch limit and are already finalizing this batch
if b.closing {
l.mu.Unlock()
return
}
l.batch = nil
l.mu.Unlock()
b.end(l)
}
func (b *folderLoaderBatch) end(l *FolderLoader) {
b.data, b.error = l.fetch(b.keys)
close(b.done)
}

View file

@ -0,0 +1,225 @@
// Code generated by github.com/vektah/dataloaden, DO NOT EDIT.
package loaders
import (
"sync"
"time"
"github.com/stashapp/stash/pkg/models"
)
// FolderParentFolderIDsLoaderConfig captures the config to create a new FolderParentFolderIDsLoader
type FolderParentFolderIDsLoaderConfig struct {
// Fetch is a method that provides the data for the loader
Fetch func(keys []models.FolderID) ([][]models.FolderID, []error)
// Wait is how long to wait before sending a batch
Wait time.Duration
// MaxBatch will limit the maximum number of keys to send in one batch, 0 = no limit
MaxBatch int
}
// NewFolderParentFolderIDsLoader creates a new FolderParentFolderIDsLoader given a fetch, wait, and maxBatch
func NewFolderParentFolderIDsLoader(config FolderParentFolderIDsLoaderConfig) *FolderRelatedFolderIDsLoader {
return &FolderRelatedFolderIDsLoader{
fetch: config.Fetch,
wait: config.Wait,
maxBatch: config.MaxBatch,
}
}
// FolderRelatedFolderIDsLoader batches and caches requests
type FolderRelatedFolderIDsLoader struct {
// this method provides the data for the loader
fetch func(keys []models.FolderID) ([][]models.FolderID, []error)
// how long to wait before sending a batch
wait time.Duration
// this will limit the maximum number of keys to send in one batch, 0 = no limit
maxBatch int
// INTERNAL
// lazily created cache
cache map[models.FolderID][]models.FolderID
// the current batch. keys will continue to be collected until timeout is hit,
// then everything will be sent to the fetch method and out to the listeners
batch *folderParentFolderIDsLoaderBatch
// mutex to prevent races
mu sync.Mutex
}
type folderParentFolderIDsLoaderBatch struct {
keys []models.FolderID
data [][]models.FolderID
error []error
closing bool
done chan struct{}
}
// Load related folder IDs by key, batching and caching will be applied automatically
func (l *FolderRelatedFolderIDsLoader) Load(key models.FolderID) ([]models.FolderID, error) {
return l.LoadThunk(key)()
}
// LoadThunk returns a function that when called will block waiting for the FolderIDs.
// This method should be used if you want one goroutine to make requests to many
// different data loaders without blocking until the thunk is called.
func (l *FolderRelatedFolderIDsLoader) LoadThunk(key models.FolderID) func() ([]models.FolderID, error) {
l.mu.Lock()
if it, ok := l.cache[key]; ok {
l.mu.Unlock()
return func() ([]models.FolderID, error) {
return it, nil
}
}
if l.batch == nil {
l.batch = &folderParentFolderIDsLoaderBatch{done: make(chan struct{})}
}
batch := l.batch
pos := batch.keyIndex(l, key)
l.mu.Unlock()
return func() ([]models.FolderID, error) {
<-batch.done
var data []models.FolderID
if pos < len(batch.data) {
data = batch.data[pos]
}
var err error
// it's convenient to be able to return a single error for everything
if len(batch.error) == 1 {
err = batch.error[0]
} else if batch.error != nil {
err = batch.error[pos]
}
if err == nil {
l.mu.Lock()
l.unsafeSet(key, data)
l.mu.Unlock()
}
return data, err
}
}
// LoadAll fetches many keys at once. It will be broken into appropriate sized
// sub batches depending on how the loader is configured
func (l *FolderRelatedFolderIDsLoader) LoadAll(keys []models.FolderID) ([][]models.FolderID, []error) {
results := make([]func() ([]models.FolderID, error), len(keys))
for i, key := range keys {
results[i] = l.LoadThunk(key)
}
folderIDs := make([][]models.FolderID, len(keys))
errors := make([]error, len(keys))
for i, thunk := range results {
folderIDs[i], errors[i] = thunk()
}
return folderIDs, errors
}
// LoadAllThunk returns a function that when called will block waiting for the FolderIDs.
// This method should be used if you want one goroutine to make requests to many
// different data loaders without blocking until the thunk is called.
func (l *FolderRelatedFolderIDsLoader) LoadAllThunk(keys []models.FolderID) func() ([][]models.FolderID, []error) {
results := make([]func() ([]models.FolderID, error), len(keys))
for i, key := range keys {
results[i] = l.LoadThunk(key)
}
return func() ([][]models.FolderID, []error) {
folderIDs := make([][]models.FolderID, len(keys))
errors := make([]error, len(keys))
for i, thunk := range results {
folderIDs[i], errors[i] = thunk()
}
return folderIDs, errors
}
}
// Prime the cache with the provided key and value. If the key already exists, no change is made
// and false is returned.
// (To forcefully prime the cache, clear the key first with loader.clear(key).prime(key, value).)
func (l *FolderRelatedFolderIDsLoader) Prime(key models.FolderID, value []models.FolderID) bool {
l.mu.Lock()
var found bool
if _, found = l.cache[key]; !found {
// make a copy when writing to the cache, it's easy to pass a pointer in from a loop var
// and end up with the whole cache pointing to the same value.
cpy := make([]models.FolderID, len(value))
copy(cpy, value)
l.unsafeSet(key, cpy)
}
l.mu.Unlock()
return !found
}
// Clear the value at key from the cache, if it exists
func (l *FolderRelatedFolderIDsLoader) Clear(key models.FolderID) {
l.mu.Lock()
delete(l.cache, key)
l.mu.Unlock()
}
func (l *FolderRelatedFolderIDsLoader) unsafeSet(key models.FolderID, value []models.FolderID) {
if l.cache == nil {
l.cache = map[models.FolderID][]models.FolderID{}
}
l.cache[key] = value
}
// keyIndex will return the location of the key in the batch, if it's not found
// it will add the key to the batch
func (b *folderParentFolderIDsLoaderBatch) keyIndex(l *FolderRelatedFolderIDsLoader, key models.FolderID) int {
for i, existingKey := range b.keys {
if key == existingKey {
return i
}
}
pos := len(b.keys)
b.keys = append(b.keys, key)
if pos == 0 {
go b.startTimer(l)
}
if l.maxBatch != 0 && pos >= l.maxBatch-1 {
if !b.closing {
b.closing = true
l.batch = nil
go b.end(l)
}
}
return pos
}
func (b *folderParentFolderIDsLoaderBatch) startTimer(l *FolderRelatedFolderIDsLoader) {
time.Sleep(l.wait)
l.mu.Lock()
// we must have hit a batch limit and are already finalizing this batch
if b.closing {
l.mu.Unlock()
return
}
l.batch = nil
l.mu.Unlock()
b.end(l)
}
func (b *folderParentFolderIDsLoaderBatch) end(l *FolderRelatedFolderIDsLoader) {
b.data, b.error = l.fetch(b.keys)
close(b.done)
}

View file

@ -4,6 +4,7 @@ import (
"fmt"
"github.com/stashapp/stash/pkg/models"
"github.com/stashapp/stash/pkg/sliceutil"
)
type BaseFile interface {
@ -27,6 +28,29 @@ func convertVisualFile(f models.File) (VisualFile, error) {
}
}
func convertBaseFile(f models.File) BaseFile {
if f == nil {
return nil
}
switch f := f.(type) {
case BaseFile:
return f
case *models.VideoFile:
return &VideoFile{VideoFile: f}
case *models.ImageFile:
return &ImageFile{ImageFile: f}
case *models.BaseFile:
return &BasicFile{BaseFile: f}
default:
panic("unknown file type")
}
}
func convertBaseFiles(files []models.File) []BaseFile {
return sliceutil.Map(files, convertBaseFile)
}
type GalleryFile struct {
*models.BaseFile
}
@ -62,3 +86,15 @@ func (ImageFile) IsVisualFile() {}
func (f *ImageFile) Fingerprints() []models.Fingerprint {
return f.ImageFile.Fingerprints
}
type BasicFile struct {
*models.BaseFile
}
func (BasicFile) IsBaseFile() {}
func (BasicFile) IsVisualFile() {}
func (f *BasicFile) Fingerprints() []models.Fingerprint {
return f.BaseFile.Fingerprints
}

View file

@ -7,6 +7,7 @@ import (
"sort"
"strconv"
"github.com/99designs/gqlgen/graphql"
"github.com/stashapp/stash/internal/build"
"github.com/stashapp/stash/internal/manager"
"github.com/stashapp/stash/pkg/logger"
@ -95,6 +96,12 @@ func (r *Resolver) VideoFile() VideoFileResolver {
func (r *Resolver) ImageFile() ImageFileResolver {
return &imageFileResolver{r}
}
func (r *Resolver) BasicFile() BasicFileResolver {
return &basicFileResolver{r}
}
func (r *Resolver) Folder() FolderResolver {
return &folderResolver{r}
}
func (r *Resolver) SavedFilter() SavedFilterResolver {
return &savedFilterResolver{r}
}
@ -125,6 +132,8 @@ type tagResolver struct{ *Resolver }
type galleryFileResolver struct{ *Resolver }
type videoFileResolver struct{ *Resolver }
type imageFileResolver struct{ *Resolver }
type basicFileResolver struct{ *Resolver }
type folderResolver struct{ *Resolver }
type savedFilterResolver struct{ *Resolver }
type pluginResolver struct{ *Resolver }
type configResultResolver struct{ *Resolver }
@ -137,6 +146,13 @@ func (r *Resolver) withReadTxn(ctx context.Context, fn func(ctx context.Context)
return r.repository.WithReadTxn(ctx, fn)
}
// idOnly returns true if the query is only asking for the id field.
// This can be used to optimize certain queries where we don't need to load the full object if we're only getting the id.
func (r *Resolver) idOnly(ctx context.Context) bool {
fields := graphql.CollectAllFields(ctx)
return len(fields) == 1 && fields[0] == "id"
}
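The check itself is trivial once the field list is in hand; a self-contained sketch with the field names hardcoded (in the resolver they come from `graphql.CollectAllFields(ctx)`):

```go
package main

import "fmt"

// idOnly reports whether a query requested only the id field, letting a
// resolver return a stub object instead of loading the full row.
func idOnly(fields []string) bool {
	return len(fields) == 1 && fields[0] == "id"
}

func main() {
	fmt.Println(idOnly([]string{"id"}))         // true
	fmt.Println(idOnly([]string{"id", "path"})) // false
	fmt.Println(idOnly(nil))                    // false
}
```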
func (r *queryResolver) MarkerWall(ctx context.Context, q *string) (ret []*models.SceneMarker, err error) {
if err := r.withReadTxn(ctx, func(ctx context.Context) error {
ret, err = r.repository.SceneMarker.Wall(ctx, q)

View file

@ -1,30 +1,80 @@
package api
import "context"
import (
"context"
func (r *galleryFileResolver) Fingerprint(ctx context.Context, obj *GalleryFile, type_ string) (*string, error) {
fp := obj.BaseFile.Fingerprints.For(type_)
if fp != nil {
v := fp.Value()
return &v, nil
"github.com/stashapp/stash/internal/api/loaders"
"github.com/stashapp/stash/pkg/models"
)
func fingerprintResolver(fp models.Fingerprints, type_ string) (*string, error) {
fingerprint := fp.For(type_)
if fingerprint != nil {
value := fingerprint.Value()
return &value, nil
}
return nil, nil
}
func (r *galleryFileResolver) Fingerprint(ctx context.Context, obj *GalleryFile, type_ string) (*string, error) {
return fingerprintResolver(obj.BaseFile.Fingerprints, type_)
}
func (r *imageFileResolver) Fingerprint(ctx context.Context, obj *ImageFile, type_ string) (*string, error) {
fp := obj.ImageFile.Fingerprints.For(type_)
if fp != nil {
v := fp.Value()
return &v, nil
}
return nil, nil
return fingerprintResolver(obj.ImageFile.Fingerprints, type_)
}
func (r *videoFileResolver) Fingerprint(ctx context.Context, obj *VideoFile, type_ string) (*string, error) {
fp := obj.VideoFile.Fingerprints.For(type_)
if fp != nil {
v := fp.Value()
return &v, nil
}
return nil, nil
return fingerprintResolver(obj.VideoFile.Fingerprints, type_)
}
func (r *basicFileResolver) Fingerprint(ctx context.Context, obj *BasicFile, type_ string) (*string, error) {
return fingerprintResolver(obj.BaseFile.Fingerprints, type_)
}
func (r *galleryFileResolver) ParentFolder(ctx context.Context, obj *GalleryFile) (*models.Folder, error) {
return loaders.From(ctx).FolderByID.Load(obj.ParentFolderID)
}
func (r *imageFileResolver) ParentFolder(ctx context.Context, obj *ImageFile) (*models.Folder, error) {
return loaders.From(ctx).FolderByID.Load(obj.ParentFolderID)
}
func (r *videoFileResolver) ParentFolder(ctx context.Context, obj *VideoFile) (*models.Folder, error) {
return loaders.From(ctx).FolderByID.Load(obj.ParentFolderID)
}
func (r *basicFileResolver) ParentFolder(ctx context.Context, obj *BasicFile) (*models.Folder, error) {
return loaders.From(ctx).FolderByID.Load(obj.ParentFolderID)
}
func zipFileResolver(ctx context.Context, zipFileID *models.FileID) (*BasicFile, error) {
if zipFileID == nil {
return nil, nil
}
f, err := loaders.From(ctx).FileByID.Load(*zipFileID)
if err != nil {
return nil, err
}
return &BasicFile{
BaseFile: f.Base(),
}, nil
}
func (r *galleryFileResolver) ZipFile(ctx context.Context, obj *GalleryFile) (*BasicFile, error) {
return zipFileResolver(ctx, obj.ZipFileID)
}
func (r *imageFileResolver) ZipFile(ctx context.Context, obj *ImageFile) (*BasicFile, error) {
return zipFileResolver(ctx, obj.ZipFileID)
}
func (r *videoFileResolver) ZipFile(ctx context.Context, obj *VideoFile) (*BasicFile, error) {
return zipFileResolver(ctx, obj.ZipFileID)
}
func (r *basicFileResolver) ZipFile(ctx context.Context, obj *BasicFile) (*BasicFile, error) {
return zipFileResolver(ctx, obj.ZipFileID)
}

View file

@ -0,0 +1,78 @@
package api
import (
"context"
"path/filepath"
"github.com/stashapp/stash/internal/api/loaders"
"github.com/stashapp/stash/pkg/models"
)
func (r *folderResolver) Basename(ctx context.Context, obj *models.Folder) (string, error) {
return filepath.Base(obj.Path), nil
}
func (r *folderResolver) ParentFolder(ctx context.Context, obj *models.Folder) (*models.Folder, error) {
if obj.ParentFolderID == nil {
return nil, nil
}
if r.idOnly(ctx) {
return &models.Folder{ID: *obj.ParentFolderID}, nil
}
return loaders.From(ctx).FolderByID.Load(*obj.ParentFolderID)
}
func foldersFromIDs(ids []models.FolderID) []*models.Folder {
ret := make([]*models.Folder, len(ids))
for i, id := range ids {
ret[i] = &models.Folder{ID: id}
}
return ret
}
func (r *folderResolver) ParentFolders(ctx context.Context, obj *models.Folder) ([]*models.Folder, error) {
ids, err := loaders.From(ctx).FolderParentFolderIDs.Load(obj.ID)
if err != nil {
return nil, err
}
if r.idOnly(ctx) {
return foldersFromIDs(ids), nil
}
ret, errs := loaders.From(ctx).FolderByID.LoadAll(ids)
return ret, firstError(errs)
}
func (r *folderResolver) SubFolders(ctx context.Context, obj *models.Folder) ([]*models.Folder, error) {
ids, err := loaders.From(ctx).FolderSubFolderIDs.Load(obj.ID)
if err != nil {
return nil, err
}
if r.idOnly(ctx) {
return foldersFromIDs(ids), nil
}
ret, errs := loaders.From(ctx).FolderByID.LoadAll(ids)
return ret, firstError(errs)
}
func (r *folderResolver) ZipFile(ctx context.Context, obj *models.Folder) (*BasicFile, error) {
// shortcut for id only queries
if r.idOnly(ctx) {
if obj.ZipFileID == nil {
return nil, nil
}
return &BasicFile{
BaseFile: &models.BaseFile{ID: *obj.ZipFileID},
}, nil
}
return zipFileResolver(ctx, obj.ZipFileID)
}

View file

@ -216,3 +216,16 @@ func (r *galleryResolver) Image(ctx context.Context, obj *models.Gallery, index
return
}
func (r *galleryResolver) CustomFields(ctx context.Context, obj *models.Gallery) (map[string]interface{}, error) {
m, err := loaders.From(ctx).GalleryCustomFields.Load(obj.ID)
if err != nil {
return nil, err
}
if m == nil {
return make(map[string]interface{}), nil
}
return m, nil
}

View file

@ -161,3 +161,12 @@ func (r *imageResolver) Urls(ctx context.Context, obj *models.Image) ([]string,
return obj.URLs.List(), nil
}
func (r *imageResolver) CustomFields(ctx context.Context, obj *models.Image) (map[string]interface{}, error) {
customFields, err := loaders.From(ctx).ImageCustomFields.Load(obj.ID)
if err != nil {
return nil, err
}
return customFields, nil
}

View file

@ -204,3 +204,27 @@ func (r *groupResolver) Scenes(ctx context.Context, obj *models.Group) (ret []*m
return ret, nil
}
func (r *groupResolver) OCounter(ctx context.Context, obj *models.Group) (ret *int, err error) {
var count int
if err := r.withReadTxn(ctx, func(ctx context.Context) error {
count, err = r.repository.Scene.OCountByGroupID(ctx, obj.ID)
return err
}); err != nil {
return nil, err
}
return &count, nil
}
func (r *groupResolver) CustomFields(ctx context.Context, obj *models.Group) (map[string]interface{}, error) {
m, err := loaders.From(ctx).GroupCustomFields.Load(obj.ID)
if err != nil {
return nil, err
}
if m == nil {
return make(map[string]interface{}), nil
}
return m, nil
}


@ -109,6 +109,31 @@ func (r *performerResolver) HeightCm(ctx context.Context, obj *models.Performer)
return obj.Height, nil
}
func (r *performerResolver) CareerStart(ctx context.Context, obj *models.Performer) (*string, error) {
if obj.CareerStart != nil {
ret := obj.CareerStart.String()
return &ret, nil
}
return nil, nil
}
func (r *performerResolver) CareerEnd(ctx context.Context, obj *models.Performer) (*string, error) {
if obj.CareerEnd != nil {
ret := obj.CareerEnd.String()
return &ret, nil
}
return nil, nil
}
func (r *performerResolver) CareerLength(ctx context.Context, obj *models.Performer) (*string, error) {
if obj.CareerStart == nil && obj.CareerEnd == nil {
return nil, nil
}
ret := models.FormatYearRange(obj.CareerStart, obj.CareerEnd)
return &ret, nil
}
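`CareerLength` is now derived from `CareerStart`/`CareerEnd` via `models.FormatYearRange`, which is outside this diff. A sketch of the kind of formatting it might perform, assuming year-only values and a "start - end" style (the exact output format is an assumption):

```go
package main

import (
	"fmt"
	"strconv"
)

// formatYearRange renders an open or closed year range, e.g. "2010 - 2015",
// "2010 -", or "- 2015". (Hypothetical sketch of models.FormatYearRange.)
func formatYearRange(start, end *int) string {
	s, e := "", ""
	if start != nil {
		s = strconv.Itoa(*start)
	}
	if end != nil {
		e = strconv.Itoa(*end)
	}
	switch {
	case s != "" && e != "":
		return s + " - " + e
	case s != "":
		return s + " -"
	default:
		return "- " + e
	}
}

func main() {
	start, end := 2010, 2015
	fmt.Println(formatYearRange(&start, &end)) // 2010 - 2015
	fmt.Println(formatYearRange(&start, nil))  // 2010 -
}
```

The resolver's nil guard (`CareerStart == nil && CareerEnd == nil`) matches this shape: formatting is only attempted when at least one endpoint exists.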
func (r *performerResolver) Birthdate(ctx context.Context, obj *models.Performer) (*string, error) {
if obj.Birthdate != nil {
ret := obj.Birthdate.String()


@ -410,3 +410,16 @@ func (r *sceneResolver) OHistory(ctx context.Context, obj *models.Scene) ([]*tim
return ptrRet, nil
}
func (r *sceneResolver) CustomFields(ctx context.Context, obj *models.Scene) (map[string]interface{}, error) {
m, err := loaders.From(ctx).SceneCustomFields.Load(obj.ID)
if err != nil {
return nil, err
}
if m == nil {
return make(map[string]interface{}), nil
}
return m, nil
}


@ -40,6 +40,35 @@ func (r *studioResolver) Aliases(ctx context.Context, obj *models.Studio) ([]str
return obj.Aliases.List(), nil
}
func (r *studioResolver) URL(ctx context.Context, obj *models.Studio) (*string, error) {
if !obj.URLs.Loaded() {
if err := r.withReadTxn(ctx, func(ctx context.Context) error {
return obj.LoadURLs(ctx, r.repository.Studio)
}); err != nil {
return nil, err
}
}
urls := obj.URLs.List()
if len(urls) == 0 {
return nil, nil
}
return &urls[0], nil
}
func (r *studioResolver) Urls(ctx context.Context, obj *models.Studio) ([]string, error) {
if !obj.URLs.Loaded() {
if err := r.withReadTxn(ctx, func(ctx context.Context) error {
return obj.LoadURLs(ctx, r.repository.Studio)
}); err != nil {
return nil, err
}
}
return obj.URLs.List(), nil
}
func (r *studioResolver) Tags(ctx context.Context, obj *models.Studio) (ret []*models.Tag, err error) {
if !obj.TagIDs.Loaded() {
if err := r.withReadTxn(ctx, func(ctx context.Context) error {
@ -114,6 +143,24 @@ func (r *studioResolver) MovieCount(ctx context.Context, obj *models.Studio, dep
return r.GroupCount(ctx, obj, depth)
}
func (r *studioResolver) OCounter(ctx context.Context, obj *models.Studio) (ret *int, err error) {
var sceneCount, imageCount int
if err := r.withReadTxn(ctx, func(ctx context.Context) error {
sceneCount, err = r.repository.Scene.OCountByStudioID(ctx, obj.ID)
if err != nil {
return err
}
imageCount, err = r.repository.Image.OCountByStudioID(ctx, obj.ID)
return err
}); err != nil {
return nil, err
}
total := sceneCount + imageCount
return &total, nil
}
func (r *studioResolver) ParentStudio(ctx context.Context, obj *models.Studio) (ret *models.Studio, err error) {
if obj.ParentID == nil {
return nil, nil
@ -160,6 +207,19 @@ func (r *studioResolver) Groups(ctx context.Context, obj *models.Studio) (ret []
return ret, nil
}
func (r *studioResolver) CustomFields(ctx context.Context, obj *models.Studio) (map[string]interface{}, error) {
m, err := loaders.From(ctx).StudioCustomFields.Load(obj.ID)
if err != nil {
return nil, err
}
if m == nil {
return make(map[string]interface{}), nil
}
return m, nil
}
// deprecated
func (r *studioResolver) Movies(ctx context.Context, obj *models.Studio) (ret []*models.Group, err error) {
return r.Groups(ctx, obj)


@ -54,6 +54,16 @@ func (r *tagResolver) Aliases(ctx context.Context, obj *models.Tag) (ret []strin
return obj.Aliases.List(), nil
}
func (r *tagResolver) StashIds(ctx context.Context, obj *models.Tag) ([]*models.StashID, error) {
if err := r.withReadTxn(ctx, func(ctx context.Context) error {
return obj.LoadStashIDs(ctx, r.repository.Tag)
}); err != nil {
return nil, err
}
return stashIDsSliceToPtrSlice(obj.StashIDs.List()), nil
}
func (r *tagResolver) SceneCount(ctx context.Context, obj *models.Tag, depth *int) (ret int, err error) {
if err := r.withReadTxn(ctx, func(ctx context.Context) error {
ret, err = scene.CountByTagID(ctx, r.repository.Scene, obj.ID, depth)
@ -171,3 +181,16 @@ func (r *tagResolver) ChildCount(ctx context.Context, obj *models.Tag) (ret int,
return ret, nil
}
func (r *tagResolver) CustomFields(ctx context.Context, obj *models.Tag) (map[string]interface{}, error) {
m, err := loaders.From(ctx).TagCustomFields.Load(obj.ID)
if err != nil {
return nil, err
}
if m == nil {
return make(map[string]interface{}), nil
}
return m, nil
}


@ -5,6 +5,7 @@ import (
"encoding/json"
"errors"
"fmt"
"io/fs"
"path/filepath"
"regexp"
"strconv"
@ -85,6 +86,8 @@ func (r *mutationResolver) setConfigFloat(key string, value *float64) {
func (r *mutationResolver) ConfigureGeneral(ctx context.Context, input ConfigGeneralInput) (*ConfigGeneralResult, error) {
c := config.GetInstance()
// #4709 - allow stash paths even if they do not exist, so that users may configure stash
// for disconnected drives or network storage.
existingPaths := c.GetStashPaths()
if input.Stashes != nil {
for _, s := range input.Stashes {
@ -97,8 +100,12 @@ func (r *mutationResolver) ConfigureGeneral(ctx context.Context, input ConfigGen
}
}
if isNew {
s.Path = filepath.Clean(s.Path)
// if it exists, it must be directory
exists, err := fsutil.DirExists(s.Path)
if !exists {
// allow it to not exist but if it does exist it must be a directory
if !exists && !errors.Is(err, fs.ErrNotExist) {
return makeConfigGeneralResult(), err
}
}
@ -150,6 +157,15 @@ func (r *mutationResolver) ConfigureGeneral(ctx context.Context, input ConfigGen
c.SetString(config.BackupDirectoryPath, *input.BackupDirectoryPath)
}
existingDeleteTrashPath := c.GetDeleteTrashPath()
if input.DeleteTrashPath != nil && existingDeleteTrashPath != *input.DeleteTrashPath {
if err := validateDir(config.DeleteTrashPath, *input.DeleteTrashPath, true); err != nil {
return makeConfigGeneralResult(), err
}
c.SetString(config.DeleteTrashPath, *input.DeleteTrashPath)
}
existingGeneratedPath := c.GetGeneratedPath()
if input.GeneratedPath != nil && existingGeneratedPath != *input.GeneratedPath {
if err := validateDir(config.Generated, *input.GeneratedPath, false); err != nil {
@ -278,6 +294,11 @@ func (r *mutationResolver) ConfigureGeneral(ctx context.Context, input ConfigGen
if input.PreviewPreset != nil {
c.SetString(config.PreviewPreset, input.PreviewPreset.String())
}
r.setConfigBool(config.UseCustomSpriteInterval, input.UseCustomSpriteInterval)
r.setConfigFloat(config.SpriteInterval, input.SpriteInterval)
r.setConfigInt(config.MinimumSprites, input.MinimumSprites)
r.setConfigInt(config.MaximumSprites, input.MaximumSprites)
r.setConfigInt(config.SpriteScreenshotSize, input.SpriteScreenshotSize)
r.setConfigBool(config.TranscodeHardwareAcceleration, input.TranscodeHardwareAcceleration)
if input.MaxTranscodeSize != nil {
@ -334,6 +355,10 @@ func (r *mutationResolver) ConfigureGeneral(ctx context.Context, input ConfigGen
logger.SetLogLevel(*input.LogLevel)
}
if input.LogFileMaxSize != nil && *input.LogFileMaxSize != c.GetLogFileMaxSize() {
c.SetInt(config.LogFileMaxSize, *input.LogFileMaxSize)
}
if input.Excludes != nil {
for _, r := range input.Excludes {
_, err := regexp.Compile(r)
@ -445,6 +470,8 @@ func (r *mutationResolver) ConfigureGeneral(ctx context.Context, input ConfigGen
func (r *mutationResolver) ConfigureInterface(ctx context.Context, input ConfigInterfaceInput) (*ConfigInterfaceResult, error) {
c := config.GetInstance()
r.setConfigBool(config.SFWContentMode, input.SfwContentMode)
if input.MenuItems != nil {
c.SetInterface(config.MenuItems, input.MenuItems)
}
@ -478,6 +505,8 @@ func (r *mutationResolver) ConfigureInterface(ctx context.Context, input ConfigI
r.setConfigString(config.ImageLightboxScrollModeKey, (*string)(options.ScrollMode))
r.setConfigInt(config.ImageLightboxScrollAttemptsBeforeChange, options.ScrollAttemptsBeforeChange)
r.setConfigBool(config.ImageLightboxDisableAnimation, options.DisableAnimation)
}
if input.CSS != nil {
@ -498,12 +527,15 @@ func (r *mutationResolver) ConfigureInterface(ctx context.Context, input ConfigI
r.setConfigBool(config.CustomLocalesEnabled, input.CustomLocalesEnabled)
r.setConfigBool(config.DisableCustomizations, input.DisableCustomizations)
if input.DisableDropdownCreate != nil {
ddc := input.DisableDropdownCreate
r.setConfigBool(config.DisableDropdownCreatePerformer, ddc.Performer)
r.setConfigBool(config.DisableDropdownCreateStudio, ddc.Studio)
r.setConfigBool(config.DisableDropdownCreateTag, ddc.Tag)
r.setConfigBool(config.DisableDropdownCreateMovie, ddc.Movie)
r.setConfigBool(config.DisableDropdownCreateGallery, ddc.Gallery)
}
r.setConfigString(config.HandyKey, input.HandyKey)


@ -5,10 +5,14 @@ import (
"fmt"
"strconv"
"github.com/stashapp/stash/internal/desktop"
"github.com/stashapp/stash/internal/manager"
"github.com/stashapp/stash/internal/manager/config"
"github.com/stashapp/stash/pkg/file"
"github.com/stashapp/stash/pkg/fsutil"
"github.com/stashapp/stash/pkg/logger"
"github.com/stashapp/stash/pkg/models"
"github.com/stashapp/stash/pkg/session"
"github.com/stashapp/stash/pkg/sliceutil/stringslice"
)
@ -16,7 +20,7 @@ func (r *mutationResolver) MoveFiles(ctx context.Context, input MoveFilesInput)
if err := r.withTxn(ctx, func(ctx context.Context) error {
fileStore := r.repository.File
folderStore := r.repository.Folder
mover := file.NewMover(fileStore, folderStore)
mover := file.NewMover(fileStore, folderStore, manager.GetInstance().Config.GetStashPaths().Paths())
mover.RegisterHooks(ctx)
var (
@ -54,13 +58,14 @@ func (r *mutationResolver) MoveFiles(ctx context.Context, input MoveFilesInput)
folderPath := *input.DestinationFolder
// ensure folder path is within the library
if err := r.validateFolderPath(folderPath); err != nil {
stashPaths := manager.GetInstance().Config.GetStashPaths()
if err := r.validateFolderPath(stashPaths, folderPath); err != nil {
return err
}
// get or create folder hierarchy
var err error
folder, err = file.GetOrCreateFolderHierarchy(ctx, folderStore, folderPath)
folder, err = file.GetOrCreateFolderHierarchy(ctx, folderStore, folderPath, stashPaths.Paths())
if err != nil {
return fmt.Errorf("getting or creating folder hierarchy: %w", err)
}
@ -109,8 +114,7 @@ func (r *mutationResolver) MoveFiles(ctx context.Context, input MoveFilesInput)
return true, nil
}
func (r *mutationResolver) validateFolderPath(folderPath string) error {
paths := manager.GetInstance().Config.GetStashPaths()
func (r *mutationResolver) validateFolderPath(paths config.StashConfigs, folderPath string) error {
if l := paths.GetStashFromDirPath(folderPath); l == nil {
return fmt.Errorf("folder path %s must be within a stash library path", folderPath)
}
@ -149,7 +153,9 @@ func (r *mutationResolver) DeleteFiles(ctx context.Context, ids []string) (ret b
return false, fmt.Errorf("converting ids: %w", err)
}
fileDeleter := file.NewDeleter()
trashPath := manager.GetInstance().Config.GetDeleteTrashPath()
fileDeleter := file.NewDeleterWithTrash(trashPath)
destroyer := &file.ZipDestroyer{
FileDestroyer: r.repository.File,
FolderDestroyer: r.repository.Folder,
@ -208,6 +214,58 @@ func (r *mutationResolver) DeleteFiles(ctx context.Context, ids []string) (ret b
return true, nil
}
func (r *mutationResolver) DestroyFiles(ctx context.Context, ids []string) (ret bool, err error) {
fileIDs, err := stringslice.StringSliceToIntSlice(ids)
if err != nil {
return false, fmt.Errorf("converting ids: %w", err)
}
destroyer := &file.ZipDestroyer{
FileDestroyer: r.repository.File,
FolderDestroyer: r.repository.Folder,
}
if err := r.withTxn(ctx, func(ctx context.Context) error {
qb := r.repository.File
for _, fileIDInt := range fileIDs {
fileID := models.FileID(fileIDInt)
f, err := qb.Find(ctx, fileID)
if err != nil {
return err
}
if len(f) == 0 {
return fmt.Errorf("file with id %d not found", fileID)
}
path := f[0].Base().Path
// ensure not a primary file
isPrimary, err := qb.IsPrimary(ctx, fileID)
if err != nil {
return fmt.Errorf("checking if file %s is primary: %w", path, err)
}
if isPrimary {
return fmt.Errorf("cannot destroy primary file entry %s", path)
}
// destroy DB entries only (no filesystem deletion)
const deleteFile = false
if err := destroyer.DestroyZip(ctx, f[0], nil, deleteFile); err != nil {
return fmt.Errorf("destroying file entry %s: %w", path, err)
}
}
return nil
}); err != nil {
return false, err
}
return true, nil
}
func (r *mutationResolver) FileSetFingerprints(ctx context.Context, input FileSetFingerprintsInput) (bool, error) {
fileIDInt, err := strconv.Atoi(input.ID)
if err != nil {
@ -272,3 +330,71 @@ func (r *mutationResolver) FileSetFingerprints(ctx context.Context, input FileSe
return true, nil
}
func (r *mutationResolver) RevealFileInFileManager(ctx context.Context, id string) (bool, error) {
// disallow if request did not come from localhost
if !session.IsLocalRequest(ctx) {
logger.Warnf("Attempt to reveal file in file manager from non-local request")
return false, fmt.Errorf("access denied")
}
fileIDInt, err := strconv.Atoi(id)
if err != nil {
return false, fmt.Errorf("converting id: %w", err)
}
var filePath string
if err := r.withReadTxn(ctx, func(ctx context.Context) error {
files, err := r.repository.File.Find(ctx, models.FileID(fileIDInt))
if err != nil {
return fmt.Errorf("finding file: %w", err)
}
if len(files) == 0 {
return fmt.Errorf("file with id %d not found", fileIDInt)
}
filePath = files[0].Base().Path
return nil
}); err != nil {
return false, err
}
if err := desktop.RevealInFileManager(filePath); err != nil {
return false, err
}
return true, nil
}
func (r *mutationResolver) RevealFolderInFileManager(ctx context.Context, id string) (bool, error) {
// disallow if request did not come from localhost
if !session.IsLocalRequest(ctx) {
logger.Warnf("Attempt to reveal folder in file manager from non-local request")
return false, fmt.Errorf("access denied")
}
folderIDInt, err := strconv.Atoi(id)
if err != nil {
return false, fmt.Errorf("converting id: %w", err)
}
var folderPath string
if err := r.withReadTxn(ctx, func(ctx context.Context) error {
folder, err := r.repository.Folder.Find(ctx, models.FolderID(folderIDInt))
if err != nil {
return fmt.Errorf("finding folder: %w", err)
}
if folder == nil {
return fmt.Errorf("folder with id %d not found", folderIDInt)
}
folderPath = folder.Path
return nil
}); err != nil {
return false, err
}
if err := desktop.RevealInFileManager(folderPath); err != nil {
return false, err
}
return true, nil
}
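Both reveal mutations gate on `session.IsLocalRequest`, which is outside this diff. A simplified sketch of the kind of loopback check such a guard typically performs (an assumption; the real implementation may also consider proxy headers and trusted networks):

```go
package main

import (
	"fmt"
	"net"
)

// isLoopbackAddr reports whether a request's RemoteAddr ("host:port")
// is a loopback address. (Hypothetical approximation of the check
// performed by session.IsLocalRequest.)
func isLoopbackAddr(remoteAddr string) bool {
	host, _, err := net.SplitHostPort(remoteAddr)
	if err != nil {
		return false
	}
	ip := net.ParseIP(host)
	return ip != nil && ip.IsLoopback()
}

func main() {
	fmt.Println(isLoopbackAddr("127.0.0.1:51234")) // true
	fmt.Println(isLoopbackAddr("192.168.1.5:443")) // false
}
```

Gating on locality makes sense here because revealing a path opens the server host's file manager, which is only meaningful when the browser and server share a machine.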


@ -6,6 +6,7 @@ import (
"fmt"
"os"
"strconv"
"strings"
"github.com/stashapp/stash/internal/manager"
"github.com/stashapp/stash/pkg/file"
@ -41,13 +42,17 @@ func (r *mutationResolver) GalleryCreate(ctx context.Context, input GalleryCreat
}
// Populate a new gallery from the input
newGallery := models.NewGallery()
newGallery := models.CreateGalleryInput{
Gallery: &models.Gallery{},
}
*newGallery.Gallery = models.NewGallery()
newGallery.Title = input.Title
newGallery.Title = strings.TrimSpace(input.Title)
newGallery.Code = translator.string(input.Code)
newGallery.Details = translator.string(input.Details)
newGallery.Photographer = translator.string(input.Photographer)
newGallery.Rating = input.Rating100
newGallery.Organized = translator.bool(input.Organized)
var err error
@ -74,15 +79,17 @@ func (r *mutationResolver) GalleryCreate(ctx context.Context, input GalleryCreat
}
if input.Urls != nil {
newGallery.URLs = models.NewRelatedStrings(input.Urls)
newGallery.URLs = models.NewRelatedStrings(stringslice.TrimSpace(input.Urls))
} else if input.URL != nil {
newGallery.URLs = models.NewRelatedStrings([]string{*input.URL})
newGallery.URLs = models.NewRelatedStrings([]string{strings.TrimSpace(*input.URL)})
}
newGallery.CustomFields = convertMapJSONNumbers(input.CustomFields)
// Start the transaction and save the gallery
if err := r.withTxn(ctx, func(ctx context.Context) error {
qb := r.repository.Gallery
if err := qb.Create(ctx, &newGallery, nil); err != nil {
if err := qb.Create(ctx, &newGallery); err != nil {
return err
}
@ -239,6 +246,10 @@ func (r *mutationResolver) galleryUpdate(ctx context.Context, input models.Galle
return nil, fmt.Errorf("converting scene ids: %w", err)
}
if input.CustomFields != nil {
updatedGallery.CustomFields = handleUpdateCustomFields(*input.CustomFields)
}
// gallery scene is set from the scene only
gallery, err := qb.UpdatePartial(ctx, galleryID, updatedGallery)
@ -291,6 +302,10 @@ func (r *mutationResolver) BulkGalleryUpdate(ctx context.Context, input BulkGall
return nil, fmt.Errorf("converting scene ids: %w", err)
}
if input.CustomFields != nil {
updatedGallery.CustomFields = handleUpdateCustomFields(*input.CustomFields)
}
ret := []*models.Gallery{}
// Start the transaction and save the galleries
@ -333,15 +348,18 @@ func (r *mutationResolver) GalleryDestroy(ctx context.Context, input models.Gall
return false, fmt.Errorf("converting ids: %w", err)
}
trashPath := manager.GetInstance().Config.GetDeleteTrashPath()
var galleries []*models.Gallery
var imgsDestroyed []*models.Image
fileDeleter := &image.FileDeleter{
Deleter: file.NewDeleter(),
Deleter: file.NewDeleterWithTrash(trashPath),
Paths: manager.GetInstance().Paths,
}
deleteGenerated := utils.IsTrue(input.DeleteGenerated)
deleteFile := utils.IsTrue(input.DeleteFile)
destroyFileEntry := utils.IsTrue(input.DestroyFileEntry)
if err := r.withTxn(ctx, func(ctx context.Context) error {
qb := r.repository.Gallery
@ -362,7 +380,7 @@ func (r *mutationResolver) GalleryDestroy(ctx context.Context, input models.Gall
galleries = append(galleries, gallery)
imgsDestroyed, err = r.galleryService.Destroy(ctx, gallery, fileDeleter, deleteGenerated, deleteFile)
imgsDestroyed, err = r.galleryService.Destroy(ctx, gallery, fileDeleter, deleteGenerated, deleteFile, destroyFileEntry)
if err != nil {
return err
}


@ -4,6 +4,7 @@ import (
"context"
"fmt"
"strconv"
"strings"
"github.com/stashapp/stash/internal/static"
"github.com/stashapp/stash/pkg/group"
@ -13,15 +14,19 @@ import (
"github.com/stashapp/stash/pkg/utils"
)
func groupFromGroupCreateInput(ctx context.Context, input GroupCreateInput) (*models.Group, error) {
func groupFromGroupCreateInput(ctx context.Context, input GroupCreateInput) (*models.CreateGroupInput, error) {
translator := changesetTranslator{
inputMap: getUpdateInputMap(ctx),
}
// Populate a new group from the input
newGroup := models.NewGroup()
newGroupInput := &models.CreateGroupInput{
Group: &models.Group{},
}
*newGroupInput.Group = models.NewGroup()
newGroup := newGroupInput.Group
newGroup.Name = input.Name
newGroup.Name = strings.TrimSpace(input.Name)
newGroup.Aliases = translator.string(input.Aliases)
newGroup.Duration = input.Duration
newGroup.Rating = input.Rating100
@ -55,31 +60,22 @@ func groupFromGroupCreateInput(ctx context.Context, input GroupCreateInput) (*mo
}
if input.Urls != nil {
newGroup.URLs = models.NewRelatedStrings(input.Urls)
newGroup.URLs = models.NewRelatedStrings(stringslice.TrimSpace(input.Urls))
}
return &newGroup, nil
}
func (r *mutationResolver) GroupCreate(ctx context.Context, input GroupCreateInput) (*models.Group, error) {
newGroup, err := groupFromGroupCreateInput(ctx, input)
if err != nil {
return nil, err
}
newGroupInput.CustomFields = convertMapJSONNumbers(input.CustomFields)
// Process the base 64 encoded image string
var frontimageData []byte
if input.FrontImage != nil {
frontimageData, err = utils.ProcessImageInput(ctx, *input.FrontImage)
newGroupInput.FrontImageData, err = utils.ProcessImageInput(ctx, *input.FrontImage)
if err != nil {
return nil, fmt.Errorf("processing front image: %w", err)
}
}
// Process the base 64 encoded image string
var backimageData []byte
if input.BackImage != nil {
backimageData, err = utils.ProcessImageInput(ctx, *input.BackImage)
newGroupInput.BackImageData, err = utils.ProcessImageInput(ctx, *input.BackImage)
if err != nil {
return nil, fmt.Errorf("processing back image: %w", err)
}
@ -87,13 +83,22 @@ func (r *mutationResolver) GroupCreate(ctx context.Context, input GroupCreateInp
// HACK: if back image is being set, set the front image to the default.
// This is because we can't have a null front image with a non-null back image.
if len(frontimageData) == 0 && len(backimageData) != 0 {
frontimageData = static.ReadAll(static.DefaultGroupImage)
if len(newGroupInput.FrontImageData) == 0 && len(newGroupInput.BackImageData) != 0 {
newGroupInput.FrontImageData = static.ReadAll(static.DefaultGroupImage)
}
return newGroupInput, nil
}
func (r *mutationResolver) GroupCreate(ctx context.Context, input GroupCreateInput) (*models.Group, error) {
createGroupInput, err := groupFromGroupCreateInput(ctx, input)
if err != nil {
return nil, err
}
// Start the transaction and save the group
if err := r.withTxn(ctx, func(ctx context.Context) error {
if err = r.groupService.Create(ctx, newGroup, frontimageData, backimageData); err != nil {
if err = r.groupService.Create(ctx, createGroupInput); err != nil {
return err
}
@ -103,9 +108,9 @@ func (r *mutationResolver) GroupCreate(ctx context.Context, input GroupCreateInp
}
// for backwards compatibility - run both movie and group hooks
r.hookExecutor.ExecutePostHooks(ctx, newGroup.ID, hook.GroupCreatePost, input, nil)
r.hookExecutor.ExecutePostHooks(ctx, newGroup.ID, hook.MovieCreatePost, input, nil)
return r.getGroup(ctx, newGroup.ID)
r.hookExecutor.ExecutePostHooks(ctx, createGroupInput.Group.ID, hook.GroupCreatePost, input, nil)
r.hookExecutor.ExecutePostHooks(ctx, createGroupInput.Group.ID, hook.MovieCreatePost, input, nil)
return r.getGroup(ctx, createGroupInput.Group.ID)
}
func groupPartialFromGroupUpdateInput(translator changesetTranslator, input GroupUpdateInput) (ret models.GroupPartial, err error) {
@ -149,6 +154,12 @@ func groupPartialFromGroupUpdateInput(translator changesetTranslator, input Grou
}
updatedGroup.URLs = translator.updateStrings(input.Urls, "urls")
if input.CustomFields != nil {
updatedGroup.CustomFields = *input.CustomFields
// convert json.Numbers to int/float
updatedGroup.CustomFields.Full = convertMapJSONNumbers(updatedGroup.CustomFields.Full)
updatedGroup.CustomFields.Partial = convertMapJSONNumbers(updatedGroup.CustomFields.Partial)
}
return updatedGroup, nil
}
@ -216,6 +227,12 @@ func (r *mutationResolver) GroupUpdate(ctx context.Context, input GroupUpdateInp
func groupPartialFromBulkGroupUpdateInput(translator changesetTranslator, input BulkGroupUpdateInput) (ret models.GroupPartial, err error) {
updatedGroup := models.NewGroupPartial()
updatedGroup.Date, err = translator.optionalDate(input.Date, "date")
if err != nil {
err = fmt.Errorf("converting date: %w", err)
return
}
updatedGroup.Synopsis = translator.optionalString(input.Synopsis, "synopsis")
updatedGroup.Rating = translator.optionalInt(input.Rating100, "rating100")
updatedGroup.Director = translator.optionalString(input.Director, "director")
@ -245,6 +262,13 @@ func groupPartialFromBulkGroupUpdateInput(translator changesetTranslator, input
updatedGroup.URLs = translator.optionalURLsBulk(input.Urls, nil)
if input.CustomFields != nil {
updatedGroup.CustomFields = *input.CustomFields
// convert json.Numbers to int/float
updatedGroup.CustomFields.Full = convertMapJSONNumbers(updatedGroup.CustomFields.Full)
updatedGroup.CustomFields.Partial = convertMapJSONNumbers(updatedGroup.CustomFields.Partial)
}
return updatedGroup, nil
}


@ -177,6 +177,13 @@ func (r *mutationResolver) imageUpdate(ctx context.Context, input models.ImageUp
return nil, fmt.Errorf("converting tag ids: %w", err)
}
if input.CustomFields != nil {
updatedImage.CustomFields = *input.CustomFields
// convert json.Numbers to int/float
updatedImage.CustomFields.Full = convertMapJSONNumbers(updatedImage.CustomFields.Full)
updatedImage.CustomFields.Partial = convertMapJSONNumbers(updatedImage.CustomFields.Partial)
}
qb := r.repository.Image
image, err := qb.UpdatePartial(ctx, imageID, updatedImage)
if err != nil {
@ -237,6 +244,13 @@ func (r *mutationResolver) BulkImageUpdate(ctx context.Context, input BulkImageU
return nil, fmt.Errorf("converting tag ids: %w", err)
}
if input.CustomFields != nil {
updatedImage.CustomFields = *input.CustomFields
// convert json.Numbers to int/float
updatedImage.CustomFields.Full = convertMapJSONNumbers(updatedImage.CustomFields.Full)
updatedImage.CustomFields.Partial = convertMapJSONNumbers(updatedImage.CustomFields.Partial)
}
// Start the transaction and save the images
if err := r.withTxn(ctx, func(ctx context.Context) error {
var updatedGalleryIDs []int
@ -308,9 +322,11 @@ func (r *mutationResolver) ImageDestroy(ctx context.Context, input models.ImageD
return false, fmt.Errorf("converting id: %w", err)
}
trashPath := manager.GetInstance().Config.GetDeleteTrashPath()
var i *models.Image
fileDeleter := &image.FileDeleter{
Deleter: file.NewDeleter(),
Deleter: file.NewDeleterWithTrash(trashPath),
Paths: manager.GetInstance().Paths,
}
if err := r.withTxn(ctx, func(ctx context.Context) error {
@ -323,7 +339,7 @@ func (r *mutationResolver) ImageDestroy(ctx context.Context, input models.ImageD
return fmt.Errorf("image with id %d not found", imageID)
}
return r.imageService.Destroy(ctx, i, fileDeleter, utils.IsTrue(input.DeleteGenerated), utils.IsTrue(input.DeleteFile))
return r.imageService.Destroy(ctx, i, fileDeleter, utils.IsTrue(input.DeleteGenerated), utils.IsTrue(input.DeleteFile), utils.IsTrue(input.DestroyFileEntry))
}); err != nil {
fileDeleter.Rollback()
return false, err
@ -348,9 +364,11 @@ func (r *mutationResolver) ImagesDestroy(ctx context.Context, input models.Image
return false, fmt.Errorf("converting ids: %w", err)
}
trashPath := manager.GetInstance().Config.GetDeleteTrashPath()
var images []*models.Image
fileDeleter := &image.FileDeleter{
Deleter: file.NewDeleter(),
Deleter: file.NewDeleterWithTrash(trashPath),
Paths: manager.GetInstance().Paths,
}
if err := r.withTxn(ctx, func(ctx context.Context) error {
@ -368,7 +386,7 @@ func (r *mutationResolver) ImagesDestroy(ctx context.Context, input models.Image
images = append(images, i)
if err := r.imageService.Destroy(ctx, i, fileDeleter, utils.IsTrue(input.DeleteGenerated), utils.IsTrue(input.DeleteFile)); err != nil {
if err := r.imageService.Destroy(ctx, i, fileDeleter, utils.IsTrue(input.DeleteGenerated), utils.IsTrue(input.DeleteFile), utils.IsTrue(input.DestroyFileEntry)); err != nil {
return err
}
}


@ -122,9 +122,10 @@ func (r *mutationResolver) MigrateHashNaming(ctx context.Context) (string, error
func (r *mutationResolver) BackupDatabase(ctx context.Context, input BackupDatabaseInput) (*string, error) {
// if download is true, then backup to temporary file and return a link
download := input.Download != nil && *input.Download
includeBlobs := input.IncludeBlobs != nil && *input.IncludeBlobs
mgr := manager.GetInstance()
backupPath, backupName, err := mgr.BackupDatabase(download)
backupPath, backupName, err := mgr.BackupDatabase(download, includeBlobs)
if err != nil {
logger.Errorf("Error backing up database: %v", err)
return nil, err


@ -4,6 +4,7 @@ import (
"context"
"fmt"
"strconv"
"strings"
"github.com/stashapp/stash/internal/static"
"github.com/stashapp/stash/pkg/models"
@ -32,7 +33,7 @@ func (r *mutationResolver) MovieCreate(ctx context.Context, input MovieCreateInp
// Populate a new group from the input
newGroup := models.NewGroup()
newGroup.Name = input.Name
newGroup.Name = strings.TrimSpace(input.Name)
newGroup.Aliases = translator.string(input.Aliases)
newGroup.Duration = input.Duration
newGroup.Rating = input.Rating100
@ -56,9 +57,9 @@ func (r *mutationResolver) MovieCreate(ctx context.Context, input MovieCreateInp
}
if input.Urls != nil {
newGroup.URLs = models.NewRelatedStrings(input.Urls)
newGroup.URLs = models.NewRelatedStrings(stringslice.TrimSpace(input.Urls))
} else if input.URL != nil {
newGroup.URLs = models.NewRelatedStrings([]string{*input.URL})
newGroup.URLs = models.NewRelatedStrings([]string{strings.TrimSpace(*input.URL)})
}
// Process the base 64 encoded image string


@ -2,12 +2,16 @@ package api
import (
"context"
"errors"
"fmt"
"slices"
"strconv"
"strings"
"github.com/stashapp/stash/pkg/models"
"github.com/stashapp/stash/pkg/performer"
"github.com/stashapp/stash/pkg/plugin/hook"
"github.com/stashapp/stash/pkg/sliceutil"
"github.com/stashapp/stash/pkg/sliceutil/stringslice"
"github.com/stashapp/stash/pkg/utils"
)
@ -37,9 +41,9 @@ func (r *mutationResolver) PerformerCreate(ctx context.Context, input models.Per
// Populate a new performer from the input
newPerformer := models.NewPerformer()
newPerformer.Name = input.Name
newPerformer.Name = strings.TrimSpace(input.Name)
newPerformer.Disambiguation = translator.string(input.Disambiguation)
newPerformer.Aliases = models.NewRelatedStrings(input.AliasList)
newPerformer.Aliases = models.NewRelatedStrings(stringslice.UniqueExcludeFold(stringslice.TrimSpace(input.AliasList), newPerformer.Name))
newPerformer.Gender = input.Gender
newPerformer.Ethnicity = translator.string(input.Ethnicity)
newPerformer.Country = translator.string(input.Country)
@ -48,7 +52,6 @@ func (r *mutationResolver) PerformerCreate(ctx context.Context, input models.Per
newPerformer.FakeTits = translator.string(input.FakeTits)
newPerformer.PenisLength = input.PenisLength
newPerformer.Circumcised = input.Circumcised
newPerformer.CareerLength = translator.string(input.CareerLength)
newPerformer.Tattoos = translator.string(input.Tattoos)
newPerformer.Piercings = translator.string(input.Piercings)
newPerformer.Favorite = translator.bool(input.Favorite)
@ -62,17 +65,17 @@ func (r *mutationResolver) PerformerCreate(ctx context.Context, input models.Per
newPerformer.URLs = models.NewRelatedStrings([]string{})
if input.URL != nil {
newPerformer.URLs.Add(*input.URL)
newPerformer.URLs.Add(strings.TrimSpace(*input.URL))
}
if input.Twitter != nil {
newPerformer.URLs.Add(utils.URLFromHandle(*input.Twitter, twitterURL))
newPerformer.URLs.Add(utils.URLFromHandle(strings.TrimSpace(*input.Twitter), twitterURL))
}
if input.Instagram != nil {
newPerformer.URLs.Add(utils.URLFromHandle(*input.Instagram, instagramURL))
newPerformer.URLs.Add(utils.URLFromHandle(strings.TrimSpace(*input.Instagram), instagramURL))
}
if input.Urls != nil {
newPerformer.URLs.Add(input.Urls...)
newPerformer.URLs.Add(stringslice.TrimSpace(input.Urls)...)
}
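The Twitter/Instagram handles are converted to profile URLs via `utils.URLFromHandle` before being trimmed into the URL list. A sketch of the likely conversion rule (assumed approximation; the real helper may normalize differently):

```go
package main

import (
	"fmt"
	"strings"
)

// urlFromHandle turns a bare handle into a profile URL, passing through
// values that already look like URLs. (Hypothetical sketch of
// utils.URLFromHandle.)
func urlFromHandle(value, siteURL string) string {
	if strings.HasPrefix(value, "http://") || strings.HasPrefix(value, "https://") {
		return value
	}
	return siteURL + "/" + strings.TrimPrefix(value, "@")
}

func main() {
	fmt.Println(urlFromHandle("@stashapp", "https://twitter.com"))
	// https://twitter.com/stashapp
	fmt.Println(urlFromHandle("https://twitter.com/stashapp", "https://twitter.com"))
	// https://twitter.com/stashapp
}
```

Trimming whitespace before this call, as the new code does, avoids persisting URLs like `https://twitter.com/stashapp ` with a trailing space.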
var err error
@ -86,6 +89,25 @@ func (r *mutationResolver) PerformerCreate(ctx context.Context, input models.Per
return nil, fmt.Errorf("converting death date: %w", err)
}
newPerformer.CareerStart, err = translator.datePtr(input.CareerStart)
if err != nil {
return nil, fmt.Errorf("converting career start: %w", err)
}
newPerformer.CareerEnd, err = translator.datePtr(input.CareerEnd)
if err != nil {
return nil, fmt.Errorf("converting career end: %w", err)
}
// if career_start/career_end not provided, parse deprecated career_length
if newPerformer.CareerStart == nil && newPerformer.CareerEnd == nil && input.CareerLength != nil {
start, end, err := models.ParseYearRangeString(*input.CareerLength)
if err != nil {
return nil, fmt.Errorf("could not parse career_length %q: %w", *input.CareerLength, err)
}
newPerformer.CareerStart = start
newPerformer.CareerEnd = end
}
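When the new `career_start`/`career_end` fields are absent, the deprecated `career_length` string is parsed into a year range. `models.ParseYearRangeString` is outside this diff; a sketch of parsing the likely `"2010-2015"` / `"2010 - 2015"` / `"2010"` shapes (the accepted formats are an assumption):

```go
package main

import (
	"fmt"
	"strconv"
	"strings"
)

// parseYearRange splits inputs like "2010-2015", "2010 - 2015", or "2010"
// into optional start/end years. (Hypothetical approximation of
// models.ParseYearRangeString.)
func parseYearRange(s string) (start, end *int, err error) {
	parse := func(v string) (*int, error) {
		v = strings.TrimSpace(v)
		if v == "" {
			return nil, nil
		}
		n, err := strconv.Atoi(v)
		if err != nil {
			return nil, err
		}
		return &n, nil
	}

	before, after, found := strings.Cut(s, "-")
	if start, err = parse(before); err != nil {
		return nil, nil, err
	}
	if found {
		if end, err = parse(after); err != nil {
			return nil, nil, err
		}
	}
	return start, end, nil
}

func main() {
	start, end, _ := parseYearRange("2010 - 2015")
	fmt.Println(*start, *end) // 2010 2015
}
```

Parsing only as a fallback preserves backwards compatibility: clients still sending the deprecated field get the same stored data as clients using the new structured fields.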
newPerformer.TagIDs, err = translator.relatedIds(input.TagIds)
if err != nil {
return nil, fmt.Errorf("converting tag ids: %w", err)
@ -135,7 +157,7 @@ func (r *mutationResolver) PerformerCreate(ctx context.Context, input models.Per
return r.getPerformer(ctx, newPerformer.ID)
}
func (r *mutationResolver) validateNoLegacyURLs(translator changesetTranslator) error {
func validateNoLegacyURLs(translator changesetTranslator) error {
// ensure url/twitter/instagram are not included in the input
if translator.hasField("url") {
return fmt.Errorf("url field must not be included if urls is included")
@ -150,7 +172,7 @@ func (r *mutationResolver) validateNoLegacyURLs(translator changesetTranslator)
return nil
}
func (r *mutationResolver) handleLegacyURLs(ctx context.Context, performerID int, legacyURL, legacyTwitter, legacyInstagram models.OptionalString, updatedPerformer *models.PerformerPartial) error {
func (r *mutationResolver) handleLegacyURLs(ctx context.Context, performerID int, legacyURLs legacyPerformerURLs, updatedPerformer *models.PerformerPartial) error {
qb := r.repository.Performer
// we need to be careful with URL/Twitter/Instagram
@ -169,23 +191,23 @@ func (r *mutationResolver) handleLegacyURLs(ctx context.Context, performerID int
existingURLs := p.URLs.List()
// performer partial URLs should be empty
if legacyURL.Set {
if legacyURLs.URL.Set {
replaced := false
for i, url := range existingURLs {
if !performer.IsTwitterURL(url) && !performer.IsInstagramURL(url) {
existingURLs[i] = legacyURL.Value
existingURLs[i] = legacyURLs.URL.Value
replaced = true
break
}
}
if !replaced {
existingURLs = append(existingURLs, legacyURL.Value)
existingURLs = append(existingURLs, legacyURLs.URL.Value)
}
}
if legacyTwitter.Set {
value := utils.URLFromHandle(legacyTwitter.Value, twitterURL)
if legacyURLs.Twitter.Set {
value := utils.URLFromHandle(legacyURLs.Twitter.Value, twitterURL)
found := false
// find and replace the first twitter URL
for i, url := range existingURLs {
@ -200,9 +222,9 @@ func (r *mutationResolver) handleLegacyURLs(ctx context.Context, performerID int
existingURLs = append(existingURLs, value)
}
}
if legacyInstagram.Set {
if legacyURLs.Instagram.Set {
found := false
value := utils.URLFromHandle(legacyInstagram.Value, instagramURL)
value := utils.URLFromHandle(legacyURLs.Instagram.Value, instagramURL)
// find and replace the first instagram URL
for i, url := range existingURLs {
if performer.IsInstagramURL(url) {
@ -225,16 +247,25 @@ func (r *mutationResolver) handleLegacyURLs(ctx context.Context, performerID int
return nil
}
func (r *mutationResolver) PerformerUpdate(ctx context.Context, input models.PerformerUpdateInput) (*models.Performer, error) {
performerID, err := strconv.Atoi(input.ID)
if err != nil {
return nil, fmt.Errorf("converting id: %w", err)
}
type legacyPerformerURLs struct {
URL models.OptionalString
Twitter models.OptionalString
Instagram models.OptionalString
}
translator := changesetTranslator{
inputMap: getUpdateInputMap(ctx),
}
func (u *legacyPerformerURLs) AnySet() bool {
return u.URL.Set || u.Twitter.Set || u.Instagram.Set
}
func legacyPerformerURLsFromInput(input models.PerformerUpdateInput, translator changesetTranslator) legacyPerformerURLs {
return legacyPerformerURLs{
URL: translator.optionalString(input.URL, "url"),
Twitter: translator.optionalString(input.Twitter, "twitter"),
Instagram: translator.optionalString(input.Instagram, "instagram"),
}
}
func performerPartialFromInput(input models.PerformerUpdateInput, translator changesetTranslator) (*models.PerformerPartial, error) {
// Populate performer from the input
updatedPerformer := models.NewPerformerPartial()
@ -248,7 +279,29 @@ func (r *mutationResolver) PerformerUpdate(ctx context.Context, input models.Per
updatedPerformer.FakeTits = translator.optionalString(input.FakeTits, "fake_tits")
updatedPerformer.PenisLength = translator.optionalFloat64(input.PenisLength, "penis_length")
updatedPerformer.Circumcised = translator.optionalString((*string)(input.Circumcised), "circumcised")
updatedPerformer.CareerLength = translator.optionalString(input.CareerLength, "career_length")
// prefer career_start/career_end over deprecated career_length
if translator.hasField("career_start") || translator.hasField("career_end") {
var err error
updatedPerformer.CareerStart, err = translator.optionalDate(input.CareerStart, "career_start")
if err != nil {
return nil, fmt.Errorf("converting career start: %w", err)
}
updatedPerformer.CareerEnd, err = translator.optionalDate(input.CareerEnd, "career_end")
if err != nil {
return nil, fmt.Errorf("converting career end: %w", err)
}
} else if translator.hasField("career_length") && input.CareerLength != nil {
start, end, err := models.ParseYearRangeString(*input.CareerLength)
if err != nil {
return nil, fmt.Errorf("could not parse career_length %q: %w", *input.CareerLength, err)
}
if start != nil {
updatedPerformer.CareerStart = models.NewOptionalDate(*start)
}
if end != nil {
updatedPerformer.CareerEnd = models.NewOptionalDate(*end)
}
}
updatedPerformer.Tattoos = translator.optionalString(input.Tattoos, "tattoos")
updatedPerformer.Piercings = translator.optionalString(input.Piercings, "piercings")
updatedPerformer.Favorite = translator.optionalBool(input.Favorite, "favorite")
@ -259,19 +312,17 @@ func (r *mutationResolver) PerformerUpdate(ctx context.Context, input models.Per
updatedPerformer.IgnoreAutoTag = translator.optionalBool(input.IgnoreAutoTag, "ignore_auto_tag")
updatedPerformer.StashIDs = translator.updateStashIDs(input.StashIds, "stash_ids")
var err error
if translator.hasField("urls") {
// ensure url/twitter/instagram are not included in the input
if err := r.validateNoLegacyURLs(translator); err != nil {
if err := validateNoLegacyURLs(translator); err != nil {
return nil, err
}
updatedPerformer.URLs = translator.updateStrings(input.Urls, "urls")
}
legacyURL := translator.optionalString(input.URL, "url")
legacyTwitter := translator.optionalString(input.Twitter, "twitter")
legacyInstagram := translator.optionalString(input.Instagram, "instagram")
updatedPerformer.Birthdate, err = translator.optionalDate(input.Birthdate, "birthdate")
if err != nil {
return nil, fmt.Errorf("converting birthdate: %w", err)
@ -296,10 +347,27 @@ func (r *mutationResolver) PerformerUpdate(ctx context.Context, input models.Per
return nil, fmt.Errorf("converting tag ids: %w", err)
}
updatedPerformer.CustomFields = input.CustomFields
// convert json.Numbers to int/float
updatedPerformer.CustomFields.Full = convertMapJSONNumbers(updatedPerformer.CustomFields.Full)
updatedPerformer.CustomFields.Partial = convertMapJSONNumbers(updatedPerformer.CustomFields.Partial)
updatedPerformer.CustomFields = handleUpdateCustomFields(input.CustomFields)
return &updatedPerformer, nil
}
func (r *mutationResolver) PerformerUpdate(ctx context.Context, input models.PerformerUpdateInput) (*models.Performer, error) {
performerID, err := strconv.Atoi(input.ID)
if err != nil {
return nil, fmt.Errorf("converting id: %w", err)
}
translator := changesetTranslator{
inputMap: getUpdateInputMap(ctx),
}
updatedPerformer, err := performerPartialFromInput(input, translator)
if err != nil {
return nil, err
}
legacyURLs := legacyPerformerURLsFromInput(input, translator)
var imageData []byte
imageIncluded := translator.hasField("image")
@ -314,17 +382,38 @@ func (r *mutationResolver) PerformerUpdate(ctx context.Context, input models.Per
if err := r.withTxn(ctx, func(ctx context.Context) error {
qb := r.repository.Performer
if legacyURL.Set || legacyTwitter.Set || legacyInstagram.Set {
if err := r.handleLegacyURLs(ctx, performerID, legacyURL, legacyTwitter, legacyInstagram, &updatedPerformer); err != nil {
if legacyURLs.AnySet() {
if err := r.handleLegacyURLs(ctx, performerID, legacyURLs, updatedPerformer); err != nil {
return err
}
}
if err := performer.ValidateUpdate(ctx, performerID, updatedPerformer, qb); err != nil {
if updatedPerformer.Aliases != nil {
p, err := qb.Find(ctx, performerID)
if err != nil {
return err
}
if p != nil {
if err := p.LoadAliases(ctx, qb); err != nil {
return err
}
effectiveAliases := updatedPerformer.Aliases.Apply(p.Aliases.List())
name := p.Name
if updatedPerformer.Name.Set {
name = updatedPerformer.Name.Value
}
sanitized := stringslice.UniqueExcludeFold(effectiveAliases, name)
updatedPerformer.Aliases.Values = sanitized
updatedPerformer.Aliases.Mode = models.RelationshipUpdateModeSet
}
}
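The alias-sanitization step above relies on `stringslice.UniqueExcludeFold` to drop case-insensitive duplicates and any alias that matches the performer's name. That helper is project-internal; a plausible sketch of its behaviour (`uniqueExcludeFold` is a hypothetical name):

```go
package main

import (
	"fmt"
	"strings"
)

// uniqueExcludeFold is a hypothetical sketch of stringslice.UniqueExcludeFold:
// it removes case-insensitive duplicates, and drops any entry that
// fold-matches exclude (here, the performer's primary name).
func uniqueExcludeFold(vs []string, exclude string) []string {
	seen := make(map[string]struct{})
	var out []string
	for _, v := range vs {
		if strings.EqualFold(v, exclude) {
			continue // an alias must not duplicate the primary name
		}
		key := strings.ToLower(v)
		if _, ok := seen[key]; ok {
			continue // case-insensitive duplicate
		}
		seen[key] = struct{}{}
		out = append(out, v)
	}
	return out
}

func main() {
	fmt.Println(uniqueExcludeFold([]string{"Jane D", "jane d", "Jane Doe"}, "Jane Doe"))
}
```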
if err := performer.ValidateUpdate(ctx, performerID, *updatedPerformer, qb); err != nil {
return err
}
_, err = qb.UpdatePartial(ctx, performerID, updatedPerformer)
_, err = qb.UpdatePartial(ctx, performerID, *updatedPerformer)
if err != nil {
return err
}
@ -368,7 +457,28 @@ func (r *mutationResolver) BulkPerformerUpdate(ctx context.Context, input BulkPe
updatedPerformer.FakeTits = translator.optionalString(input.FakeTits, "fake_tits")
updatedPerformer.PenisLength = translator.optionalFloat64(input.PenisLength, "penis_length")
updatedPerformer.Circumcised = translator.optionalString((*string)(input.Circumcised), "circumcised")
updatedPerformer.CareerLength = translator.optionalString(input.CareerLength, "career_length")
// prefer career_start/career_end over deprecated career_length
if translator.hasField("career_start") || translator.hasField("career_end") {
updatedPerformer.CareerStart, err = translator.optionalDate(input.CareerStart, "career_start")
if err != nil {
return nil, fmt.Errorf("converting career start: %w", err)
}
updatedPerformer.CareerEnd, err = translator.optionalDate(input.CareerEnd, "career_end")
if err != nil {
return nil, fmt.Errorf("converting career end: %w", err)
}
} else if translator.hasField("career_length") && input.CareerLength != nil {
start, end, err := models.ParseYearRangeString(*input.CareerLength)
if err != nil {
return nil, fmt.Errorf("could not parse career_length %q: %w", *input.CareerLength, err)
}
if start != nil {
updatedPerformer.CareerStart = models.NewOptionalDate(*start)
}
if end != nil {
updatedPerformer.CareerEnd = models.NewOptionalDate(*end)
}
}
updatedPerformer.Tattoos = translator.optionalString(input.Tattoos, "tattoos")
updatedPerformer.Piercings = translator.optionalString(input.Piercings, "piercings")
@ -381,16 +491,18 @@ func (r *mutationResolver) BulkPerformerUpdate(ctx context.Context, input BulkPe
if translator.hasField("urls") {
// ensure url/twitter/instagram are not included in the input
if err := r.validateNoLegacyURLs(translator); err != nil {
if err := validateNoLegacyURLs(translator); err != nil {
return nil, err
}
updatedPerformer.URLs = translator.updateStringsBulk(input.Urls, "urls")
}
legacyURL := translator.optionalString(input.URL, "url")
legacyTwitter := translator.optionalString(input.Twitter, "twitter")
legacyInstagram := translator.optionalString(input.Instagram, "instagram")
legacyURLs := legacyPerformerURLs{
URL: translator.optionalString(input.URL, "url"),
Twitter: translator.optionalString(input.Twitter, "twitter"),
Instagram: translator.optionalString(input.Instagram, "instagram"),
}
updatedPerformer.Birthdate, err = translator.optionalDate(input.Birthdate, "birthdate")
if err != nil {
@ -416,6 +528,10 @@ func (r *mutationResolver) BulkPerformerUpdate(ctx context.Context, input BulkPe
return nil, fmt.Errorf("converting tag ids: %w", err)
}
if input.CustomFields != nil {
updatedPerformer.CustomFields = handleUpdateCustomFields(*input.CustomFields)
}
ret := []*models.Performer{}
// Start the transaction and save the performers
@ -423,8 +539,8 @@ func (r *mutationResolver) BulkPerformerUpdate(ctx context.Context, input BulkPe
qb := r.repository.Performer
for _, performerID := range performerIDs {
if legacyURL.Set || legacyTwitter.Set || legacyInstagram.Set {
if err := r.handleLegacyURLs(ctx, performerID, legacyURL, legacyTwitter, legacyInstagram, &updatedPerformer); err != nil {
if legacyURLs.AnySet() {
if err := r.handleLegacyURLs(ctx, performerID, legacyURLs, &updatedPerformer); err != nil {
return err
}
}
@ -504,3 +620,87 @@ func (r *mutationResolver) PerformersDestroy(ctx context.Context, performerIDs [
return true, nil
}
func (r *mutationResolver) PerformerMerge(ctx context.Context, input PerformerMergeInput) (*models.Performer, error) {
srcIDs, err := stringslice.StringSliceToIntSlice(input.Source)
if err != nil {
return nil, fmt.Errorf("converting source ids: %w", err)
}
// ensure source ids are unique
srcIDs = sliceutil.AppendUniques(nil, srcIDs)
destID, err := strconv.Atoi(input.Destination)
if err != nil {
return nil, fmt.Errorf("converting destination id: %w", err)
}
// ensure destination is not in source list
if slices.Contains(srcIDs, destID) {
return nil, errors.New("destination performer cannot be in source list")
}
var values *models.PerformerPartial
var imageData []byte
if input.Values != nil {
translator := changesetTranslator{
inputMap: getNamedUpdateInputMap(ctx, "input.values"),
}
values, err = performerPartialFromInput(*input.Values, translator)
if err != nil {
return nil, err
}
legacyURLs := legacyPerformerURLsFromInput(*input.Values, translator)
if legacyURLs.AnySet() {
return nil, errors.New("Merging legacy performer URLs is not supported")
}
if input.Values.Image != nil {
var err error
imageData, err = utils.ProcessImageInput(ctx, *input.Values.Image)
if err != nil {
return nil, fmt.Errorf("processing cover image: %w", err)
}
}
} else {
v := models.NewPerformerPartial()
values = &v
}
var dest *models.Performer
if err := r.withTxn(ctx, func(ctx context.Context) error {
qb := r.repository.Performer
dest, err = qb.Find(ctx, destID)
if err != nil {
return fmt.Errorf("finding destination performer ID %d: %w", destID, err)
}
// ensure source performers exist
if _, err := qb.FindMany(ctx, srcIDs); err != nil {
return fmt.Errorf("finding source performers: %w", err)
}
if _, err := qb.UpdatePartial(ctx, destID, *values); err != nil {
return fmt.Errorf("updating performer: %w", err)
}
if err := qb.Merge(ctx, srcIDs, destID); err != nil {
return fmt.Errorf("merging performers: %w", err)
}
if len(imageData) > 0 {
if err := qb.UpdateImage(ctx, destID, imageData); err != nil {
return err
}
}
return nil
}); err != nil {
return nil, err
}
return dest, nil
}
@ -32,7 +32,7 @@ func (r *mutationResolver) SaveFilter(ctx context.Context, input SaveFilterInput
f := models.SavedFilter{
Mode: input.Mode,
Name: input.Name,
Name: strings.TrimSpace(input.Name),
FindFilter: input.FindFilter,
ObjectFilter: input.ObjectFilter,
UIOptions: input.UIOptions,
@ -5,6 +5,7 @@ import (
"errors"
"fmt"
"strconv"
"strings"
"time"
"github.com/stashapp/stash/internal/manager"
@ -62,9 +63,9 @@ func (r *mutationResolver) SceneCreate(ctx context.Context, input models.SceneCr
}
if input.Urls != nil {
newScene.URLs = models.NewRelatedStrings(input.Urls)
newScene.URLs = models.NewRelatedStrings(stringslice.TrimSpace(input.Urls))
} else if input.URL != nil {
newScene.URLs = models.NewRelatedStrings([]string{*input.URL})
newScene.URLs = models.NewRelatedStrings([]string{strings.TrimSpace(*input.URL)})
}
newScene.PerformerIDs, err = translator.relatedIds(input.PerformerIds)
@ -102,8 +103,15 @@ func (r *mutationResolver) SceneCreate(ctx context.Context, input models.SceneCr
}
}
customFields := convertMapJSONNumbers(input.CustomFields)
if err := r.withTxn(ctx, func(ctx context.Context) error {
ret, err = r.Resolver.sceneService.Create(ctx, &newScene, fileIDs, coverImageData)
ret, err = r.Resolver.sceneService.Create(ctx, models.CreateSceneInput{
Scene: &newScene,
FileIDs: fileIDs,
CoverImage: coverImageData,
CustomFields: customFields,
})
return err
}); err != nil {
return nil, err
@ -296,6 +304,7 @@ func (r *mutationResolver) sceneUpdate(ctx context.Context, input models.SceneUp
}
var coverImageData []byte
coverImageIncluded := translator.hasField("cover_image")
if input.CoverImage != nil {
var err error
coverImageData, err = utils.ProcessImageInput(ctx, *input.CoverImage)
@ -304,26 +313,41 @@ func (r *mutationResolver) sceneUpdate(ctx context.Context, input models.SceneUp
}
}
var customFields *models.CustomFieldsInput
if input.CustomFields != nil {
cfCopy := *input.CustomFields
customFields = &cfCopy
// convert json.Numbers to int/float
customFields.Full = convertMapJSONNumbers(customFields.Full)
customFields.Partial = convertMapJSONNumbers(customFields.Partial)
}
scene, err := qb.UpdatePartial(ctx, sceneID, *updatedScene)
if err != nil {
return nil, err
}
if err := r.sceneUpdateCoverImage(ctx, scene, coverImageData); err != nil {
return nil, err
if coverImageIncluded {
if err := r.sceneUpdateCoverImage(ctx, scene, coverImageData); err != nil {
return nil, err
}
}
if customFields != nil {
if err := qb.SetCustomFields(ctx, scene.ID, *customFields); err != nil {
return nil, err
}
}
return scene, nil
}
func (r *mutationResolver) sceneUpdateCoverImage(ctx context.Context, s *models.Scene, coverImageData []byte) error {
if len(coverImageData) > 0 {
qb := r.repository.Scene
qb := r.repository.Scene
// update cover table
if err := qb.UpdateCover(ctx, s.ID, coverImageData); err != nil {
return err
}
// update cover table - empty data will clear the cover
if err := qb.UpdateCover(ctx, s.ID, coverImageData); err != nil {
return err
}
return nil
@ -385,6 +409,12 @@ func (r *mutationResolver) BulkSceneUpdate(ctx context.Context, input BulkSceneU
}
}
var customFields *models.CustomFieldsInput
if input.CustomFields != nil {
cf := handleUpdateCustomFields(*input.CustomFields)
customFields = &cf
}
ret := []*models.Scene{}
// Start the transaction and save the scenes
@ -397,6 +427,12 @@ func (r *mutationResolver) BulkSceneUpdate(ctx context.Context, input BulkSceneU
return err
}
if customFields != nil {
if err := qb.SetCustomFields(ctx, scene.ID, *customFields); err != nil {
return err
}
}
ret = append(ret, scene)
}
@ -428,16 +464,18 @@ func (r *mutationResolver) SceneDestroy(ctx context.Context, input models.SceneD
}
fileNamingAlgo := manager.GetInstance().Config.GetVideoFileNamingAlgorithm()
trashPath := manager.GetInstance().Config.GetDeleteTrashPath()
var s *models.Scene
fileDeleter := &scene.FileDeleter{
Deleter: file.NewDeleter(),
Deleter: file.NewDeleterWithTrash(trashPath),
FileNamingAlgo: fileNamingAlgo,
Paths: manager.GetInstance().Paths,
}
deleteGenerated := utils.IsTrue(input.DeleteGenerated)
deleteFile := utils.IsTrue(input.DeleteFile)
destroyFileEntry := utils.IsTrue(input.DestroyFileEntry)
if err := r.withTxn(ctx, func(ctx context.Context) error {
qb := r.repository.Scene
@ -454,7 +492,7 @@ func (r *mutationResolver) SceneDestroy(ctx context.Context, input models.SceneD
// kill any running encoders
manager.KillRunningStreams(s, fileNamingAlgo)
return r.sceneService.Destroy(ctx, s, fileDeleter, deleteGenerated, deleteFile)
return r.sceneService.Destroy(ctx, s, fileDeleter, deleteGenerated, deleteFile, destroyFileEntry)
}); err != nil {
fileDeleter.Rollback()
return false, err
@ -482,15 +520,17 @@ func (r *mutationResolver) ScenesDestroy(ctx context.Context, input models.Scene
var scenes []*models.Scene
fileNamingAlgo := manager.GetInstance().Config.GetVideoFileNamingAlgorithm()
trashPath := manager.GetInstance().Config.GetDeleteTrashPath()
fileDeleter := &scene.FileDeleter{
Deleter: file.NewDeleter(),
Deleter: file.NewDeleterWithTrash(trashPath),
FileNamingAlgo: fileNamingAlgo,
Paths: manager.GetInstance().Paths,
}
deleteGenerated := utils.IsTrue(input.DeleteGenerated)
deleteFile := utils.IsTrue(input.DeleteFile)
destroyFileEntry := utils.IsTrue(input.DestroyFileEntry)
if err := r.withTxn(ctx, func(ctx context.Context) error {
qb := r.repository.Scene
@ -509,7 +549,7 @@ func (r *mutationResolver) ScenesDestroy(ctx context.Context, input models.Scene
// kill any running encoders
manager.KillRunningStreams(scene, fileNamingAlgo)
if err := r.sceneService.Destroy(ctx, scene, fileDeleter, deleteGenerated, deleteFile); err != nil {
if err := r.sceneService.Destroy(ctx, scene, fileDeleter, deleteGenerated, deleteFile, destroyFileEntry); err != nil {
return err
}
}
@ -569,6 +609,7 @@ func (r *mutationResolver) SceneMerge(ctx context.Context, input SceneMergeInput
var values *models.ScenePartial
var coverImageData []byte
var customFields *models.CustomFieldsInput
if input.Values != nil {
translator := changesetTranslator{
@ -587,14 +628,20 @@ func (r *mutationResolver) SceneMerge(ctx context.Context, input SceneMergeInput
return nil, fmt.Errorf("processing cover image: %w", err)
}
}
if input.Values.CustomFields != nil {
cf := handleUpdateCustomFields(*input.Values.CustomFields)
customFields = &cf
}
} else {
v := models.NewScenePartial()
values = &v
}
mgr := manager.GetInstance()
trashPath := mgr.Config.GetDeleteTrashPath()
fileDeleter := &scene.FileDeleter{
Deleter: file.NewDeleter(),
Deleter: file.NewDeleterWithTrash(trashPath),
FileNamingAlgo: mgr.Config.GetVideoFileNamingAlgorithm(),
Paths: mgr.Paths,
}
@ -617,7 +664,20 @@ func (r *mutationResolver) SceneMerge(ctx context.Context, input SceneMergeInput
return fmt.Errorf("scene with id %d not found", destID)
}
return r.sceneUpdateCoverImage(ctx, ret, coverImageData)
// only update cover image if one was provided
if len(coverImageData) > 0 {
if err := r.sceneUpdateCoverImage(ctx, ret, coverImageData); err != nil {
return err
}
}
if customFields != nil {
if err := r.Resolver.repository.Scene.SetCustomFields(ctx, ret.ID, *customFields); err != nil {
return err
}
}
return nil
}); err != nil {
return nil, err
}
@ -650,7 +710,7 @@ func (r *mutationResolver) SceneMarkerCreate(ctx context.Context, input SceneMar
// Populate a new scene marker from the input
newMarker := models.NewSceneMarker()
newMarker.Title = input.Title
newMarker.Title = strings.TrimSpace(input.Title)
newMarker.Seconds = input.Seconds
newMarker.PrimaryTagID = primaryTagID
newMarker.SceneID = sceneID
@ -736,9 +796,10 @@ func (r *mutationResolver) SceneMarkerUpdate(ctx context.Context, input SceneMar
}
mgr := manager.GetInstance()
trashPath := mgr.Config.GetDeleteTrashPath()
fileDeleter := &scene.FileDeleter{
Deleter: file.NewDeleter(),
Deleter: file.NewDeleterWithTrash(trashPath),
FileNamingAlgo: mgr.Config.GetVideoFileNamingAlgorithm(),
Paths: mgr.Paths,
}
@ -820,6 +881,123 @@ func (r *mutationResolver) SceneMarkerUpdate(ctx context.Context, input SceneMar
return r.getSceneMarker(ctx, markerID)
}
func (r *mutationResolver) BulkSceneMarkerUpdate(ctx context.Context, input BulkSceneMarkerUpdateInput) ([]*models.SceneMarker, error) {
ids, err := stringslice.StringSliceToIntSlice(input.Ids)
if err != nil {
return nil, fmt.Errorf("converting ids: %w", err)
}
translator := changesetTranslator{
inputMap: getUpdateInputMap(ctx),
}
// Populate scene marker from the input
partial := models.NewSceneMarkerPartial()
partial.Title = translator.optionalString(input.Title, "title")
partial.PrimaryTagID, err = translator.optionalIntFromString(input.PrimaryTagID, "primary_tag_id")
if err != nil {
return nil, fmt.Errorf("converting primary tag id: %w", err)
}
partial.TagIDs, err = translator.updateIdsBulk(input.TagIds, "tag_ids")
if err != nil {
return nil, fmt.Errorf("converting tag ids: %w", err)
}
ret := []*models.SceneMarker{}
// Start the transaction and save the scene markers
if err := r.withTxn(ctx, func(ctx context.Context) error {
qb := r.repository.SceneMarker
for _, id := range ids {
l := partial
if err := adjustMarkerPartialForTagExclusion(ctx, r.repository.SceneMarker, id, &l); err != nil {
return err
}
updated, err := qb.UpdatePartial(ctx, id, l)
if err != nil {
return err
}
ret = append(ret, updated)
}
return nil
}); err != nil {
return nil, err
}
// execute post hooks outside of txn
var newRet []*models.SceneMarker
for _, m := range ret {
r.hookExecutor.ExecutePostHooks(ctx, m.ID, hook.SceneMarkerUpdatePost, input, translator.getFields())
m, err = r.getSceneMarker(ctx, m.ID)
if err != nil {
return nil, err
}
newRet = append(newRet, m)
}
return newRet, nil
}
// adjustMarkerPartialForTagExclusion adjusts the SceneMarkerPartial to exclude the primary tag from tag updates.
func adjustMarkerPartialForTagExclusion(ctx context.Context, r models.SceneMarkerReader, id int, partial *models.SceneMarkerPartial) error {
if partial.TagIDs == nil && !partial.PrimaryTagID.Set {
return nil
}
// exclude primary tag from tag updates
var primaryTagID int
if partial.PrimaryTagID.Set {
primaryTagID = partial.PrimaryTagID.Value
} else {
existing, err := r.Find(ctx, id)
if err != nil {
return fmt.Errorf("finding existing primary tag id: %w", err)
}
primaryTagID = existing.PrimaryTagID
}
existingTagIDs, err := r.GetTagIDs(ctx, id)
if err != nil {
return fmt.Errorf("getting existing tag ids: %w", err)
}
tagIDAttr := partial.TagIDs
if tagIDAttr == nil {
tagIDAttr = &models.UpdateIDs{
IDs: existingTagIDs,
Mode: models.RelationshipUpdateModeSet,
}
}
newTagIDs := tagIDAttr.Apply(existingTagIDs)
// Remove primary tag from newTagIDs if present
newTagIDs = sliceutil.Exclude(newTagIDs, []int{primaryTagID})
if len(existingTagIDs) != len(newTagIDs) {
partial.TagIDs = &models.UpdateIDs{
IDs: newTagIDs,
Mode: models.RelationshipUpdateModeSet,
}
} else {
// no change to tags required
partial.TagIDs = nil
}
return nil
}
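The helper above applies the tag update and then strips the primary tag via `sliceutil.Exclude` so it can never appear in the marker's tag list. A minimal sketch of that exclusion step, assuming set-difference semantics (`excludeInts` is a hypothetical stand-in for the generic `sliceutil.Exclude`):

```go
package main

import "fmt"

// excludeInts is a hypothetical stand-in for sliceutil.Exclude: it returns
// vs with every element of toRemove filtered out, preserving order.
func excludeInts(vs []int, toRemove []int) []int {
	drop := make(map[int]struct{}, len(toRemove))
	for _, v := range toRemove {
		drop[v] = struct{}{}
	}
	out := make([]int, 0, len(vs))
	for _, v := range vs {
		if _, ok := drop[v]; !ok {
			out = append(out, v)
		}
	}
	return out
}

func main() {
	// tag 7 plays the role of the marker's primary tag, so it is removed
	fmt.Println(excludeInts([]int{3, 7, 9}, []int{7}))
}
```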
func (r *mutationResolver) SceneMarkerDestroy(ctx context.Context, id string) (bool, error) {
return r.SceneMarkersDestroy(ctx, []string{id})
}
@ -832,9 +1010,10 @@ func (r *mutationResolver) SceneMarkersDestroy(ctx context.Context, markerIDs []
var markers []*models.SceneMarker
fileNamingAlgo := manager.GetInstance().Config.GetVideoFileNamingAlgorithm()
trashPath := manager.GetInstance().Config.GetDeleteTrashPath()
fileDeleter := &scene.FileDeleter{
Deleter: file.NewDeleter(),
Deleter: file.NewDeleterWithTrash(trashPath),
FileNamingAlgo: fileNamingAlgo,
Paths: manager.GetInstance().Paths,
}
@ -39,7 +39,7 @@ func (r *mutationResolver) SubmitStashBoxFingerprints(ctx context.Context, input
}
func (r *mutationResolver) StashBoxBatchPerformerTag(ctx context.Context, input manager.StashBoxBatchTagInput) (string, error) {
b, err := resolveStashBoxBatchTagInput(input.Endpoint, input.StashBoxEndpoint)
b, err := resolveStashBoxBatchTagInput(input.Endpoint, input.StashBoxEndpoint) //nolint:staticcheck
if err != nil {
return "", err
}
@ -49,7 +49,7 @@ func (r *mutationResolver) StashBoxBatchPerformerTag(ctx context.Context, input
}
func (r *mutationResolver) StashBoxBatchStudioTag(ctx context.Context, input manager.StashBoxBatchTagInput) (string, error) {
b, err := resolveStashBoxBatchTagInput(input.Endpoint, input.StashBoxEndpoint)
b, err := resolveStashBoxBatchTagInput(input.Endpoint, input.StashBoxEndpoint) //nolint:staticcheck
if err != nil {
return "", err
}
@ -58,6 +58,16 @@ func (r *mutationResolver) StashBoxBatchStudioTag(ctx context.Context, input man
return strconv.Itoa(jobID), nil
}
func (r *mutationResolver) StashBoxBatchTagTag(ctx context.Context, input manager.StashBoxBatchTagInput) (string, error) {
b, err := resolveStashBoxBatchTagInput(input.Endpoint, input.StashBoxEndpoint) //nolint:staticcheck
if err != nil {
return "", err
}
jobID := manager.GetInstance().StashBoxBatchTagTag(ctx, b, input)
return strconv.Itoa(jobID), nil
}
func (r *mutationResolver) SubmitStashBoxSceneDraft(ctx context.Context, input StashBoxDraftSubmissionInput) (*string, error) {
b, err := resolveStashBox(input.StashBoxIndex, input.StashBoxEndpoint)
if err != nil {
@ -153,6 +163,14 @@ func (r *mutationResolver) makeSceneDraft(ctx context.Context, s *models.Scene,
return nil, err
}
// Load StashIDs for tags
tqb := r.repository.Tag
for _, t := range draft.Tags {
if err := t.LoadStashIDs(ctx, tqb); err != nil {
return nil, err
}
}
draft.Cover = cover
return draft, nil
@ -4,6 +4,7 @@ import (
"context"
"fmt"
"strconv"
"strings"
"github.com/stashapp/stash/pkg/models"
"github.com/stashapp/stash/pkg/plugin/hook"
@ -30,19 +31,28 @@ func (r *mutationResolver) StudioCreate(ctx context.Context, input models.Studio
}
// Populate a new studio from the input
newStudio := models.NewStudio()
newStudio := models.NewCreateStudioInput()
newStudio.Name = input.Name
newStudio.URL = translator.string(input.URL)
newStudio.Name = strings.TrimSpace(input.Name)
newStudio.Rating = input.Rating100
newStudio.Favorite = translator.bool(input.Favorite)
newStudio.Details = translator.string(input.Details)
newStudio.IgnoreAutoTag = translator.bool(input.IgnoreAutoTag)
newStudio.Aliases = models.NewRelatedStrings(input.Aliases)
newStudio.Organized = translator.bool(input.Organized)
newStudio.Aliases = models.NewRelatedStrings(stringslice.UniqueExcludeFold(stringslice.TrimSpace(input.Aliases), newStudio.Name))
newStudio.StashIDs = models.NewRelatedStashIDs(models.StashIDInputs(input.StashIds).ToStashIDs())
var err error
newStudio.URLs = models.NewRelatedStrings([]string{})
if input.URL != nil {
newStudio.URLs.Add(strings.TrimSpace(*input.URL))
}
if input.Urls != nil {
newStudio.URLs.Add(stringslice.TrimSpace(input.Urls)...)
}
newStudio.ParentID, err = translator.intPtrFromString(input.ParentID)
if err != nil {
return nil, fmt.Errorf("converting parent id: %w", err)
@ -52,6 +62,7 @@ func (r *mutationResolver) StudioCreate(ctx context.Context, input models.Studio
if err != nil {
return nil, fmt.Errorf("converting tag ids: %w", err)
}
newStudio.CustomFields = convertMapJSONNumbers(input.CustomFields)
// Process the base 64 encoded image string
var imageData []byte
@ -106,11 +117,11 @@ func (r *mutationResolver) StudioUpdate(ctx context.Context, input models.Studio
updatedStudio.ID = studioID
updatedStudio.Name = translator.optionalString(input.Name, "name")
updatedStudio.URL = translator.optionalString(input.URL, "url")
updatedStudio.Details = translator.optionalString(input.Details, "details")
updatedStudio.Rating = translator.optionalInt(input.Rating100, "rating100")
updatedStudio.Favorite = translator.optionalBool(input.Favorite, "favorite")
updatedStudio.IgnoreAutoTag = translator.optionalBool(input.IgnoreAutoTag, "ignore_auto_tag")
updatedStudio.Organized = translator.optionalBool(input.Organized, "organized")
updatedStudio.Aliases = translator.updateStrings(input.Aliases, "aliases")
updatedStudio.StashIDs = translator.updateStashIDs(input.StashIds, "stash_ids")
@ -124,6 +135,31 @@ func (r *mutationResolver) StudioUpdate(ctx context.Context, input models.Studio
return nil, fmt.Errorf("converting tag ids: %w", err)
}
if translator.hasField("urls") {
// ensure url not included in the input
if err := validateNoLegacyURLs(translator); err != nil {
return nil, err
}
updatedStudio.URLs = translator.updateStrings(input.Urls, "urls")
} else if translator.hasField("url") {
// handle legacy url field
legacyURLs := []string{}
if input.URL != nil {
legacyURLs = append(legacyURLs, *input.URL)
}
updatedStudio.URLs = &models.UpdateStrings{
Mode: models.RelationshipUpdateModeSet,
Values: legacyURLs,
}
}
updatedStudio.CustomFields = input.CustomFields
// convert json.Numbers to int/float
updatedStudio.CustomFields.Full = convertMapJSONNumbers(updatedStudio.CustomFields.Full)
updatedStudio.CustomFields.Partial = convertMapJSONNumbers(updatedStudio.CustomFields.Partial)
// Process the base 64 encoded image string
var imageData []byte
imageIncluded := translator.hasField("image")
@ -139,6 +175,28 @@ func (r *mutationResolver) StudioUpdate(ctx context.Context, input models.Studio
if err := r.withTxn(ctx, func(ctx context.Context) error {
qb := r.repository.Studio
if updatedStudio.Aliases != nil {
s, err := qb.Find(ctx, studioID)
if err != nil {
return err
}
if s != nil {
if err := s.LoadAliases(ctx, qb); err != nil {
return err
}
effectiveAliases := updatedStudio.Aliases.Apply(s.Aliases.List())
name := s.Name
if updatedStudio.Name.Set {
name = updatedStudio.Name.Value
}
sanitized := stringslice.UniqueExcludeFold(effectiveAliases, name)
updatedStudio.Aliases.Values = sanitized
updatedStudio.Aliases.Mode = models.RelationshipUpdateModeSet
}
}
if err := studio.ValidateModify(ctx, updatedStudio, qb); err != nil {
return err
}
@ -163,6 +221,97 @@ func (r *mutationResolver) StudioUpdate(ctx context.Context, input models.Studio
return r.getStudio(ctx, studioID)
}
func (r *mutationResolver) BulkStudioUpdate(ctx context.Context, input BulkStudioUpdateInput) ([]*models.Studio, error) {
ids, err := stringslice.StringSliceToIntSlice(input.Ids)
if err != nil {
return nil, fmt.Errorf("converting ids: %w", err)
}
translator := changesetTranslator{
inputMap: getUpdateInputMap(ctx),
}
// Populate studio from the input
partial := models.NewStudioPartial()
partial.ParentID, err = translator.optionalIntFromString(input.ParentID, "parent_id")
if err != nil {
return nil, fmt.Errorf("converting parent id: %w", err)
}
if translator.hasField("urls") {
// ensure url/twitter/instagram are not included in the input
if err := validateNoLegacyURLs(translator); err != nil {
return nil, err
}
partial.URLs = translator.updateStringsBulk(input.Urls, "urls")
} else if translator.hasField("url") {
// handle legacy url field
legacyURLs := []string{}
if input.URL != nil {
legacyURLs = append(legacyURLs, *input.URL)
}
partial.URLs = &models.UpdateStrings{
Mode: models.RelationshipUpdateModeSet,
Values: legacyURLs,
}
}
partial.Favorite = translator.optionalBool(input.Favorite, "favorite")
partial.Rating = translator.optionalInt(input.Rating100, "rating100")
partial.Details = translator.optionalString(input.Details, "details")
partial.IgnoreAutoTag = translator.optionalBool(input.IgnoreAutoTag, "ignore_auto_tag")
partial.Organized = translator.optionalBool(input.Organized, "organized")
partial.TagIDs, err = translator.updateIdsBulk(input.TagIds, "tag_ids")
if err != nil {
return nil, fmt.Errorf("converting tag ids: %w", err)
}
ret := []*models.Studio{}
// Start the transaction and save the studios
if err := r.withTxn(ctx, func(ctx context.Context) error {
qb := r.repository.Studio
for _, id := range ids {
local := partial
local.ID = id
if err := studio.ValidateModify(ctx, local, qb); err != nil {
return err
}
updated, err := qb.UpdatePartial(ctx, local)
if err != nil {
return err
}
ret = append(ret, updated)
}
return nil
}); err != nil {
return nil, err
}
// execute post hooks outside of txn
var newRet []*models.Studio
for _, studio := range ret {
r.hookExecutor.ExecutePostHooks(ctx, studio.ID, hook.StudioUpdatePost, input, translator.getFields())
studio, err = r.getStudio(ctx, studio.ID)
if err != nil {
return nil, err
}
newRet = append(newRet, studio)
}
return newRet, nil
}
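The bulk-update loop above relies on Go's struct value-copy semantics: `local := partial` gives each iteration its own copy of the shared template before the per-entity ID is set. A minimal standalone sketch of that pattern (with `Partial` as a hypothetical stand-in for `models.StudioPartial`, not the real type):

```go
package main

import "fmt"

// Partial is a hypothetical stand-in for models.StudioPartial.
type Partial struct {
	ID      int
	Details string
}

// applyIDs copies the shared template by value for each ID, so setting
// local.ID never leaks across iterations or back into the template;
// this mirrors the `local := partial` pattern in BulkStudioUpdate.
func applyIDs(partial Partial, ids []int) []Partial {
	var out []Partial
	for _, id := range ids {
		local := partial // struct value copy
		local.ID = id
		out = append(out, local)
	}
	return out
}

func main() {
	template := Partial{Details: "shared details"}
	for _, p := range applyIDs(template, []int{1, 2, 3}) {
		fmt.Println(p.ID, p.Details)
	}
	fmt.Println(template.ID) // template itself is untouched; prints 0
}
```

Had `partial` been a pointer, every element of `ret` would have ended up with the last ID; the value copy is what makes one template safe to reuse.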
func (r *mutationResolver) StudioDestroy(ctx context.Context, input StudioDestroyInput) (bool, error) {
id, err := strconv.Atoi(input.ID)
if err != nil {


@ -4,8 +4,8 @@ import (
"context"
"fmt"
"strconv"
"strings"
"github.com/stashapp/stash/pkg/logger"
"github.com/stashapp/stash/pkg/models"
"github.com/stashapp/stash/pkg/plugin/hook"
"github.com/stashapp/stash/pkg/sliceutil/stringslice"
@ -30,15 +30,26 @@ func (r *mutationResolver) TagCreate(ctx context.Context, input TagCreateInput)
}
// Populate a new tag from the input
newTag := models.NewTag()
newTag := models.CreateTagInput{
Tag: &models.Tag{},
}
*newTag.Tag = models.NewTag()
newTag.Name = input.Name
newTag.Name = strings.TrimSpace(input.Name)
newTag.SortName = translator.string(input.SortName)
newTag.Aliases = models.NewRelatedStrings(input.Aliases)
newTag.Aliases = models.NewRelatedStrings(stringslice.UniqueExcludeFold(stringslice.TrimSpace(input.Aliases), newTag.Name))
newTag.Favorite = translator.bool(input.Favorite)
newTag.Description = translator.string(input.Description)
newTag.IgnoreAutoTag = translator.bool(input.IgnoreAutoTag)
var stashIDInputs models.StashIDInputs
for _, sid := range input.StashIds {
if sid != nil {
stashIDInputs = append(stashIDInputs, *sid)
}
}
newTag.StashIDs = models.NewRelatedStashIDs(stashIDInputs.ToStashIDs())
var err error
newTag.ParentIDs, err = translator.relatedIds(input.ParentIds)
@ -51,6 +62,8 @@ func (r *mutationResolver) TagCreate(ctx context.Context, input TagCreateInput)
return nil, fmt.Errorf("converting child tag ids: %w", err)
}
newTag.CustomFields = convertMapJSONNumbers(input.CustomFields)
// Process the base 64 encoded image string
var imageData []byte
if input.Image != nil {
@ -64,7 +77,7 @@ func (r *mutationResolver) TagCreate(ctx context.Context, input TagCreateInput)
if err := r.withTxn(ctx, func(ctx context.Context) error {
qb := r.repository.Tag
if err := tag.ValidateCreate(ctx, newTag, qb); err != nil {
if err := tag.ValidateCreate(ctx, *newTag.Tag, qb); err != nil {
return err
}
@ -89,6 +102,46 @@ func (r *mutationResolver) TagCreate(ctx context.Context, input TagCreateInput)
return r.getTag(ctx, newTag.ID)
}
func tagPartialFromInput(input TagUpdateInput, translator changesetTranslator) (*models.TagPartial, error) {
updatedTag := models.NewTagPartial()
updatedTag.Name = translator.optionalString(input.Name, "name")
updatedTag.SortName = translator.optionalString(input.SortName, "sort_name")
updatedTag.Favorite = translator.optionalBool(input.Favorite, "favorite")
updatedTag.IgnoreAutoTag = translator.optionalBool(input.IgnoreAutoTag, "ignore_auto_tag")
updatedTag.Description = translator.optionalString(input.Description, "description")
updatedTag.Aliases = translator.updateStrings(input.Aliases, "aliases")
var updateStashIDInputs models.StashIDInputs
for _, sid := range input.StashIds {
if sid != nil {
updateStashIDInputs = append(updateStashIDInputs, *sid)
}
}
updatedTag.StashIDs = translator.updateStashIDs(updateStashIDInputs, "stash_ids")
var err error
updatedTag.ParentIDs, err = translator.updateIds(input.ParentIds, "parent_ids")
if err != nil {
return nil, fmt.Errorf("converting parent tag ids: %w", err)
}
updatedTag.ChildIDs, err = translator.updateIds(input.ChildIds, "child_ids")
if err != nil {
return nil, fmt.Errorf("converting child tag ids: %w", err)
}
if input.CustomFields != nil {
updatedTag.CustomFields = *input.CustomFields
// convert json.Numbers to int/float
updatedTag.CustomFields.Full = convertMapJSONNumbers(updatedTag.CustomFields.Full)
updatedTag.CustomFields.Partial = convertMapJSONNumbers(updatedTag.CustomFields.Partial)
}
return &updatedTag, nil
}
func (r *mutationResolver) TagUpdate(ctx context.Context, input TagUpdateInput) (*models.Tag, error) {
tagID, err := strconv.Atoi(input.ID)
if err != nil {
@ -100,24 +153,9 @@ func (r *mutationResolver) TagUpdate(ctx context.Context, input TagUpdateInput)
}
// Populate tag from the input
updatedTag := models.NewTagPartial()
updatedTag.Name = translator.optionalString(input.Name, "name")
updatedTag.SortName = translator.optionalString(input.SortName, "sort_name")
updatedTag.Favorite = translator.optionalBool(input.Favorite, "favorite")
updatedTag.IgnoreAutoTag = translator.optionalBool(input.IgnoreAutoTag, "ignore_auto_tag")
updatedTag.Description = translator.optionalString(input.Description, "description")
updatedTag.Aliases = translator.updateStrings(input.Aliases, "aliases")
updatedTag.ParentIDs, err = translator.updateIds(input.ParentIds, "parent_ids")
updatedTag, err := tagPartialFromInput(input, translator)
if err != nil {
return nil, fmt.Errorf("converting parent tag ids: %w", err)
}
updatedTag.ChildIDs, err = translator.updateIds(input.ChildIds, "child_ids")
if err != nil {
return nil, fmt.Errorf("converting child tag ids: %w", err)
return nil, err
}
var imageData []byte
@ -134,11 +172,33 @@ func (r *mutationResolver) TagUpdate(ctx context.Context, input TagUpdateInput)
if err := r.withTxn(ctx, func(ctx context.Context) error {
qb := r.repository.Tag
if err := tag.ValidateUpdate(ctx, tagID, updatedTag, qb); err != nil {
if updatedTag.Aliases != nil {
t, err := qb.Find(ctx, tagID)
if err != nil {
return err
}
if t != nil {
if err := t.LoadAliases(ctx, qb); err != nil {
return err
}
newAliases := updatedTag.Aliases.Apply(t.Aliases.List())
name := t.Name
if updatedTag.Name.Set {
name = updatedTag.Name.Value
}
sanitized := stringslice.UniqueExcludeFold(newAliases, name)
updatedTag.Aliases.Values = sanitized
updatedTag.Aliases.Mode = models.RelationshipUpdateModeSet
}
}
if err := tag.ValidateUpdate(ctx, tagID, *updatedTag, qb); err != nil {
return err
}
t, err = qb.UpdatePartial(ctx, tagID, updatedTag)
t, err = qb.UpdatePartial(ctx, tagID, *updatedTag)
if err != nil {
return err
}
@ -286,6 +346,31 @@ func (r *mutationResolver) TagsMerge(ctx context.Context, input TagsMergeInput)
return nil, nil
}
var values *models.TagPartial
var imageData []byte
if input.Values != nil {
translator := changesetTranslator{
inputMap: getNamedUpdateInputMap(ctx, "input.values"),
}
values, err = tagPartialFromInput(*input.Values, translator)
if err != nil {
return nil, err
}
if input.Values.Image != nil {
var err error
imageData, err = utils.ProcessImageInput(ctx, *input.Values.Image)
if err != nil {
return nil, fmt.Errorf("processing cover image: %w", err)
}
}
} else {
v := models.NewTagPartial()
values = &v
}
var t *models.Tag
if err := r.withTxn(ctx, func(ctx context.Context) error {
qb := r.repository.Tag
@ -300,28 +385,22 @@ func (r *mutationResolver) TagsMerge(ctx context.Context, input TagsMergeInput)
return fmt.Errorf("tag with id %d not found", destination)
}
parents, children, err := tag.MergeHierarchy(ctx, destination, source, qb)
if err != nil {
return err
}
if err = qb.Merge(ctx, source, destination); err != nil {
return err
}
err = qb.UpdateParentTags(ctx, destination, parents)
if err != nil {
return err
}
err = qb.UpdateChildTags(ctx, destination, children)
if err != nil {
if err := tag.ValidateUpdate(ctx, destination, *values, qb); err != nil {
return err
}
err = tag.ValidateHierarchyExisting(ctx, t, parents, children, qb)
if err != nil {
logger.Errorf("Error merging tag: %s", err)
return err
if _, err := qb.UpdatePartial(ctx, destination, *values); err != nil {
return fmt.Errorf("updating tag: %w", err)
}
if len(imageData) > 0 {
if err := qb.UpdateImage(ctx, destination, imageData); err != nil {
return err
}
}
return nil


@ -82,6 +82,7 @@ func makeConfigGeneralResult() *ConfigGeneralResult {
Stashes: config.GetStashPaths(),
DatabasePath: config.GetDatabasePath(),
BackupDirectoryPath: config.GetBackupDirectoryPath(),
DeleteTrashPath: config.GetDeleteTrashPath(),
GeneratedPath: config.GetGeneratedPath(),
MetadataPath: config.GetMetadataPath(),
ConfigFilePath: config.GetConfigFile(),
@ -95,6 +96,11 @@ func makeConfigGeneralResult() *ConfigGeneralResult {
CalculateMd5: config.IsCalculateMD5(),
VideoFileNamingAlgorithm: config.GetVideoFileNamingAlgorithm(),
ParallelTasks: config.GetParallelTasks(),
UseCustomSpriteInterval: config.GetUseCustomSpriteInterval(),
SpriteInterval: config.GetSpriteInterval(),
SpriteScreenshotSize: config.GetSpriteScreenshotSize(),
MinimumSprites: config.GetMinimumSprites(),
MaximumSprites: config.GetMaximumSprites(),
PreviewAudio: config.GetPreviewAudio(),
PreviewSegments: config.GetPreviewSegments(),
PreviewSegmentDuration: config.GetPreviewSegmentDuration(),
@ -115,6 +121,7 @@ func makeConfigGeneralResult() *ConfigGeneralResult {
LogOut: config.GetLogOut(),
LogLevel: config.GetLogLevel(),
LogAccess: config.GetLogAccess(),
LogFileMaxSize: config.GetLogFileMaxSize(),
VideoExtensions: config.GetVideoExtensions(),
ImageExtensions: config.GetImageExtensions(),
GalleryExtensions: config.GetGalleryExtensions(),
@ -154,6 +161,7 @@ func makeConfigInterfaceResult() *ConfigInterfaceResult {
javascriptEnabled := config.GetJavascriptEnabled()
customLocales := config.GetCustomLocales()
customLocalesEnabled := config.GetCustomLocalesEnabled()
disableCustomizations := config.GetDisableCustomizations()
language := config.GetLanguage()
handyKey := config.GetHandyKey()
scriptOffset := config.GetFunscriptOffset()
@ -162,6 +170,7 @@ func makeConfigInterfaceResult() *ConfigInterfaceResult {
disableDropdownCreate := config.GetDisableDropdownCreate()
return &ConfigInterfaceResult{
SfwContentMode: config.GetSFWContentMode(),
MenuItems: menuItems,
SoundOnPreview: &soundOnPreview,
WallShowTitle: &wallShowTitle,
@ -180,6 +189,7 @@ func makeConfigInterfaceResult() *ConfigInterfaceResult {
JavascriptEnabled: &javascriptEnabled,
CustomLocales: &customLocales,
CustomLocalesEnabled: &customLocalesEnabled,
DisableCustomizations: &disableCustomizations,
Language: &language,
ImageLightbox: &imageLightboxOptions,


@ -0,0 +1,120 @@
package api
import (
"context"
"errors"
"strconv"
"github.com/stashapp/stash/pkg/models"
"github.com/stashapp/stash/pkg/sliceutil/stringslice"
)
func (r *queryResolver) FindFile(ctx context.Context, id *string, path *string) (BaseFile, error) {
var ret models.File
if err := r.withReadTxn(ctx, func(ctx context.Context) error {
qb := r.repository.File
var err error
switch {
case id != nil:
idInt, err := strconv.Atoi(*id)
if err != nil {
return err
}
var files []models.File
files, err = qb.Find(ctx, models.FileID(idInt))
if err != nil {
return err
}
if len(files) > 0 {
ret = files[0]
}
case path != nil:
ret, err = qb.FindByPath(ctx, *path, true)
if err == nil && ret == nil {
return errors.New("file not found")
}
default:
return errors.New("either id or path must be provided")
}
return err
}); err != nil {
return nil, err
}
return convertBaseFile(ret), nil
}
func (r *queryResolver) FindFiles(
ctx context.Context,
fileFilter *models.FileFilterType,
filter *models.FindFilterType,
ids []string,
) (ret *FindFilesResultType, err error) {
var fileIDs []models.FileID
if len(ids) > 0 {
fileIDsInt, err := stringslice.StringSliceToIntSlice(ids)
if err != nil {
return nil, err
}
fileIDs = models.FileIDsFromInts(fileIDsInt)
}
if err := r.withReadTxn(ctx, func(ctx context.Context) error {
var files []models.File
var err error
fields := collectQueryFields(ctx)
result := &models.FileQueryResult{}
if len(fileIDs) > 0 {
files, err = r.repository.File.Find(ctx, fileIDs...)
if err == nil {
result.Count = len(files)
for _, f := range files {
if asVideo, ok := f.(*models.VideoFile); ok {
result.TotalDuration += asVideo.Duration
}
if asImage, ok := f.(*models.ImageFile); ok {
result.Megapixels += asImage.Megapixels()
}
result.TotalSize += f.Base().Size
}
}
} else {
result, err = r.repository.File.Query(ctx, models.FileQueryOptions{
QueryOptions: models.QueryOptions{
FindFilter: filter,
Count: fields.Has("count"),
},
FileFilter: fileFilter,
TotalDuration: fields.Has("duration"),
Megapixels: fields.Has("megapixels"),
TotalSize: fields.Has("size"),
})
if err == nil {
files, err = result.Resolve(ctx)
}
}
if err != nil {
return err
}
ret = &FindFilesResultType{
Count: result.Count,
Files: convertBaseFiles(files),
Duration: result.TotalDuration,
Megapixels: result.Megapixels,
Size: int(result.TotalSize),
}
return nil
}); err != nil {
return nil, err
}
return ret, nil
}
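When `FindFiles` is given explicit IDs, it aggregates totals by type-switching over the `models.File` interface: duration only from video files, megapixels only from image files, size from every file. A self-contained sketch of that accumulation (the `File`/`VideoFile`/`ImageFile` types here are simplified hypothetical stand-ins, not the real models):

```go
package main

import "fmt"

// File, VideoFile and ImageFile are hypothetical stand-ins for the
// models.File interface and its implementations.
type File interface{ Size() int64 }

type VideoFile struct {
	Bytes    int64
	Duration float64
}

func (v VideoFile) Size() int64 { return v.Bytes }

type ImageFile struct {
	Bytes      int64
	Megapixels float64
}

func (i ImageFile) Size() int64 { return i.Bytes }

// aggregate sums size over all files, duration over videos only, and
// megapixels over images only, matching the per-type loop in FindFiles.
func aggregate(files []File) (size int64, duration, megapixels float64) {
	for _, f := range files {
		switch v := f.(type) {
		case VideoFile:
			duration += v.Duration
		case ImageFile:
			megapixels += v.Megapixels
		}
		size += f.Size()
	}
	return size, duration, megapixels
}

func main() {
	size, dur, mp := aggregate([]File{
		VideoFile{Bytes: 100, Duration: 12.5},
		ImageFile{Bytes: 40, Megapixels: 2.1},
	})
	fmt.Println(size, dur, mp) // 140 12.5 2.1
}
```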


@ -0,0 +1,99 @@
package api
import (
"context"
"errors"
"strconv"
"github.com/stashapp/stash/pkg/models"
)
func (r *queryResolver) FindFolder(ctx context.Context, id *string, path *string) (*models.Folder, error) {
var ret *models.Folder
if err := r.withReadTxn(ctx, func(ctx context.Context) error {
qb := r.repository.Folder
var err error
switch {
case id != nil:
idInt, err := strconv.Atoi(*id)
if err != nil {
return err
}
ret, err = qb.Find(ctx, models.FolderID(idInt))
if err != nil {
return err
}
case path != nil:
ret, err = qb.FindByPath(ctx, *path, true)
if err == nil && ret == nil {
return errors.New("folder not found")
}
default:
return errors.New("either id or path must be provided")
}
return err
}); err != nil {
return nil, err
}
return ret, nil
}
func (r *queryResolver) FindFolders(
ctx context.Context,
folderFilter *models.FolderFilterType,
filter *models.FindFilterType,
ids []string,
) (ret *FindFoldersResultType, err error) {
var folderIDs []models.FolderID
if len(ids) > 0 {
folderIDsInt, err := handleIDList(ids, "ids")
if err != nil {
return nil, err
}
folderIDs = models.FolderIDsFromInts(folderIDsInt)
}
if err := r.withReadTxn(ctx, func(ctx context.Context) error {
var folders []*models.Folder
var err error
fields := collectQueryFields(ctx)
result := &models.FolderQueryResult{}
if len(folderIDs) > 0 {
folders, err = r.repository.Folder.FindMany(ctx, folderIDs)
if err == nil {
result.Count = len(folders)
}
} else {
result, err = r.repository.Folder.Query(ctx, models.FolderQueryOptions{
QueryOptions: models.QueryOptions{
FindFilter: filter,
Count: fields.Has("count"),
},
FolderFilter: folderFilter,
})
if err == nil {
folders, err = result.Resolve(ctx)
}
}
if err != nil {
return err
}
ret = &FindFoldersResultType{
Count: result.Count,
Folders: folders,
}
return nil
}); err != nil {
return nil, err
}
return ret, nil
}

View file

@ -5,7 +5,6 @@ import (
"strconv"
"github.com/stashapp/stash/pkg/models"
"github.com/stashapp/stash/pkg/sliceutil/stringslice"
)
func (r *queryResolver) FindGallery(ctx context.Context, id string) (ret *models.Gallery, err error) {
@ -25,7 +24,7 @@ func (r *queryResolver) FindGallery(ctx context.Context, id string) (ret *models
}
func (r *queryResolver) FindGalleries(ctx context.Context, galleryFilter *models.GalleryFilterType, filter *models.FindFilterType, ids []string) (ret *FindGalleriesResultType, err error) {
idInts, err := stringslice.StringSliceToIntSlice(ids)
idInts, err := handleIDList(ids, "ids")
if err != nil {
return nil, err
}


@ -5,7 +5,6 @@ import (
"strconv"
"github.com/stashapp/stash/pkg/models"
"github.com/stashapp/stash/pkg/sliceutil/stringslice"
)
func (r *queryResolver) FindGroup(ctx context.Context, id string) (ret *models.Group, err error) {
@ -25,7 +24,7 @@ func (r *queryResolver) FindGroup(ctx context.Context, id string) (ret *models.G
}
func (r *queryResolver) FindGroups(ctx context.Context, groupFilter *models.GroupFilterType, filter *models.FindFilterType, ids []string) (ret *FindGroupsResultType, err error) {
idInts, err := stringslice.StringSliceToIntSlice(ids)
idInts, err := handleIDList(ids, "ids")
if err != nil {
return nil, err
}


@ -7,7 +7,6 @@ import (
"github.com/99designs/gqlgen/graphql"
"github.com/stashapp/stash/pkg/models"
"github.com/stashapp/stash/pkg/sliceutil/stringslice"
)
func (r *queryResolver) FindImage(ctx context.Context, id *string, checksum *string) (*models.Image, error) {
@ -55,7 +54,7 @@ func (r *queryResolver) FindImages(
filter *models.FindFilterType,
) (ret *FindImagesResultType, err error) {
if len(ids) > 0 {
imageIds, err = stringslice.StringSliceToIntSlice(ids)
imageIds, err = handleIDList(ids, "ids")
if err != nil {
return nil, err
}


@ -5,7 +5,6 @@ import (
"strconv"
"github.com/stashapp/stash/pkg/models"
"github.com/stashapp/stash/pkg/sliceutil/stringslice"
)
func (r *queryResolver) FindMovie(ctx context.Context, id string) (ret *models.Group, err error) {
@ -25,7 +24,7 @@ func (r *queryResolver) FindMovie(ctx context.Context, id string) (ret *models.G
}
func (r *queryResolver) FindMovies(ctx context.Context, movieFilter *models.GroupFilterType, filter *models.FindFilterType, ids []string) (ret *FindMoviesResultType, err error) {
idInts, err := stringslice.StringSliceToIntSlice(ids)
idInts, err := handleIDList(ids, "ids")
if err != nil {
return nil, err
}


@ -5,7 +5,6 @@ import (
"strconv"
"github.com/stashapp/stash/pkg/models"
"github.com/stashapp/stash/pkg/sliceutil/stringslice"
)
func (r *queryResolver) FindPerformer(ctx context.Context, id string) (ret *models.Performer, err error) {
@ -26,7 +25,7 @@ func (r *queryResolver) FindPerformer(ctx context.Context, id string) (ret *mode
func (r *queryResolver) FindPerformers(ctx context.Context, performerFilter *models.PerformerFilterType, filter *models.FindFilterType, performerIDs []int, ids []string) (ret *FindPerformersResultType, err error) {
if len(ids) > 0 {
performerIDs, err = stringslice.StringSliceToIntSlice(ids)
performerIDs, err = handleIDList(ids, "ids")
if err != nil {
return nil, err
}


@ -9,7 +9,6 @@ import (
"github.com/stashapp/stash/pkg/models"
"github.com/stashapp/stash/pkg/scene"
"github.com/stashapp/stash/pkg/sliceutil/stringslice"
)
func (r *queryResolver) FindScene(ctx context.Context, id *string, checksum *string) (*models.Scene, error) {
@ -83,7 +82,7 @@ func (r *queryResolver) FindScenes(
filter *models.FindFilterType,
) (ret *FindScenesResultType, err error) {
if len(ids) > 0 {
sceneIDs, err = stringslice.StringSliceToIntSlice(ids)
sceneIDs, err = handleIDList(ids, "ids")
if err != nil {
return nil, err
}


@ -4,11 +4,10 @@ import (
"context"
"github.com/stashapp/stash/pkg/models"
"github.com/stashapp/stash/pkg/sliceutil/stringslice"
)
func (r *queryResolver) FindSceneMarkers(ctx context.Context, sceneMarkerFilter *models.SceneMarkerFilterType, filter *models.FindFilterType, ids []string) (ret *FindSceneMarkersResultType, err error) {
idInts, err := stringslice.StringSliceToIntSlice(ids)
idInts, err := handleIDList(ids, "ids")
if err != nil {
return nil, err
}


@ -5,7 +5,6 @@ import (
"strconv"
"github.com/stashapp/stash/pkg/models"
"github.com/stashapp/stash/pkg/sliceutil/stringslice"
)
func (r *queryResolver) FindStudio(ctx context.Context, id string) (ret *models.Studio, err error) {
@ -26,7 +25,7 @@ func (r *queryResolver) FindStudio(ctx context.Context, id string) (ret *models.
}
func (r *queryResolver) FindStudios(ctx context.Context, studioFilter *models.StudioFilterType, filter *models.FindFilterType, ids []string) (ret *FindStudiosResultType, err error) {
idInts, err := stringslice.StringSliceToIntSlice(ids)
idInts, err := handleIDList(ids, "ids")
if err != nil {
return nil, err
}


@ -5,7 +5,6 @@ import (
"strconv"
"github.com/stashapp/stash/pkg/models"
"github.com/stashapp/stash/pkg/sliceutil/stringslice"
)
func (r *queryResolver) FindTag(ctx context.Context, id string) (ret *models.Tag, err error) {
@ -25,7 +24,7 @@ func (r *queryResolver) FindTag(ctx context.Context, id string) (ret *models.Tag
}
func (r *queryResolver) FindTags(ctx context.Context, tagFilter *models.TagFilterType, filter *models.FindFilterType, ids []string) (ret *FindTagsResultType, err error) {
idInts, err := stringslice.StringSliceToIntSlice(ids)
idInts, err := handleIDList(ids, "ids")
if err != nil {
return nil, err
}


@ -6,6 +6,7 @@ import (
"fmt"
"slices"
"strconv"
"strings"
"github.com/stashapp/stash/pkg/match"
"github.com/stashapp/stash/pkg/models"
@ -201,7 +202,7 @@ func (r *queryResolver) ScrapeSingleScene(ctx context.Context, source scraper.So
}
// TODO - this should happen after any scene is scraped
if err := r.matchScenesRelationships(ctx, ret, *source.StashBoxEndpoint); err != nil {
if err := r.matchScenesRelationships(ctx, ret, b.Endpoint); err != nil {
return nil, err
}
default:
@ -245,7 +246,7 @@ func (r *queryResolver) ScrapeMultiScenes(ctx context.Context, source scraper.So
// just flatten the slice and pass it in
flat := sliceutil.Flatten(ret)
if err := r.matchScenesRelationships(ctx, flat, *source.StashBoxEndpoint); err != nil {
if err := r.matchScenesRelationships(ctx, flat, b.Endpoint); err != nil {
return nil, err
}
@ -335,7 +336,7 @@ func (r *queryResolver) ScrapeSingleStudio(ctx context.Context, source scraper.S
if len(ret) > 0 {
if err := r.withReadTxn(ctx, func(ctx context.Context) error {
for _, studio := range ret {
if err := match.ScrapedStudioHierarchy(ctx, r.repository.Studio, studio, *source.StashBoxEndpoint); err != nil {
if err := match.ScrapedStudioHierarchy(ctx, r.repository.Studio, studio, b.Endpoint); err != nil {
return err
}
}
@ -350,7 +351,63 @@ func (r *queryResolver) ScrapeSingleStudio(ctx context.Context, source scraper.S
return nil, nil
}
return nil, errors.New("stash_box_index must be set")
return nil, errors.New("stash_box_endpoint must be set")
}
func (r *queryResolver) ScrapeSingleTag(ctx context.Context, source scraper.Source, input ScrapeSingleTagInput) ([]*models.ScrapedTag, error) {
if source.StashBoxIndex != nil || source.StashBoxEndpoint != nil {
b, err := resolveStashBox(source.StashBoxIndex, source.StashBoxEndpoint)
if err != nil {
return nil, err
}
client := r.newStashBoxClient(*b)
var ret []*models.ScrapedTag
query := *input.Query
out, err := client.QueryTag(ctx, query)
if err != nil {
return nil, err
} else if out != nil {
ret = append(ret, out...)
}
if len(ret) > 0 {
if err := r.withReadTxn(ctx, func(ctx context.Context) error {
for _, tag := range ret {
if err := match.ScrapedTag(ctx, r.repository.Tag, tag, b.Endpoint); err != nil {
return err
}
}
return nil
}); err != nil {
return nil, err
}
// tag name query returns results that may not match the query exactly.
// if there is an exact match, it should be first
if query != "" {
for i, result := range ret {
if strings.EqualFold(result.Name, query) {
// prepend exact match to the front of the slice
if i != 0 {
ret = append([]*models.ScrapedTag{result}, append(ret[:i], ret[i+1:]...)...)
}
break
}
}
}
return ret, nil
}
return nil, nil
}
return nil, errors.New("stash_box_endpoint must be set")
}
func (r *queryResolver) ScrapeSinglePerformer(ctx context.Context, source scraper.Source, input ScrapeSinglePerformerInput) ([]*models.ScrapedPerformer, error) {


@ -18,9 +18,14 @@ type PerformerFinder interface {
GetImage(ctx context.Context, performerID int) ([]byte, error)
}
type sfwConfig interface {
GetSFWContentMode() bool
}
type performerRoutes struct {
routes
performerFinder PerformerFinder
sfwConfig sfwConfig
}
func (rs performerRoutes) Routes() chi.Router {
@ -54,7 +59,7 @@ func (rs performerRoutes) Image(w http.ResponseWriter, r *http.Request) {
}
if len(image) == 0 {
image = getDefaultPerformerImage(performer.Name, performer.Gender)
image = getDefaultPerformerImage(performer.Name, performer.Gender, rs.sfwConfig.GetSFWContentMode())
}
utils.ServeImage(w, r, image)


@ -12,6 +12,7 @@ import (
"github.com/stashapp/stash/internal/manager"
"github.com/stashapp/stash/internal/manager/config"
"github.com/stashapp/stash/internal/static"
"github.com/stashapp/stash/pkg/ffmpeg"
"github.com/stashapp/stash/pkg/file/video"
"github.com/stashapp/stash/pkg/fsutil"
@ -243,6 +244,12 @@ func (rs sceneRoutes) streamSegment(w http.ResponseWriter, r *http.Request, stre
}
func (rs sceneRoutes) Screenshot(w http.ResponseWriter, r *http.Request) {
// if default flag is set, return the default image
if r.URL.Query().Get("default") == "true" {
utils.ServeImage(w, r, static.ReadAll(static.DefaultSceneImage))
return
}
scene := r.Context().Value(sceneKey).(*models.Scene)
ss := manager.SceneServer{


@ -135,6 +135,13 @@ func marshalScrapedGroups(content []scraper.ScrapedContent) ([]*models.ScrapedGr
ret = append(ret, m)
case models.ScrapedGroup:
ret = append(ret, &m)
// it's possible that a scraper returns models.ScrapedMovie
case *models.ScrapedMovie:
g := m.ScrapedGroup()
ret = append(ret, &g)
case models.ScrapedMovie:
g := m.ScrapedGroup()
ret = append(ret, &g)
default:
return nil, fmt.Errorf("%w: cannot turn ScrapedContent into ScrapedGroup", models.ErrConversion)
}


@ -11,6 +11,7 @@ import (
"net/http"
"os"
"path"
"path/filepath"
"runtime/debug"
"strconv"
"strings"
@ -255,6 +256,9 @@ func Initialize() (*Server, error) {
staticUI = statigz.FileServer(ui.UIBox.(fs.ReadDirFS))
}
// handle favicon override
r.HandleFunc("/favicon.ico", handleFavicon(staticUI))
// Serve the web app
r.HandleFunc("/*", func(w http.ResponseWriter, r *http.Request) {
ext := path.Ext(r.URL.Path)
@ -295,6 +299,31 @@ func Initialize() (*Server, error) {
return server, nil
}
func handleFavicon(staticUI *statigz.Server) func(w http.ResponseWriter, r *http.Request) {
mgr := manager.GetInstance()
cfg := mgr.Config
// check if favicon.ico exists in the config directory
// if so, use that
// otherwise, use the embedded one
iconPath := filepath.Join(cfg.GetConfigPath(), "favicon.ico")
exists, _ := fsutil.FileExists(iconPath)
if exists {
logger.Debugf("Using custom favicon at %s", iconPath)
}
return func(w http.ResponseWriter, r *http.Request) {
w.Header().Set("Cache-Control", "no-cache")
if exists {
http.ServeFile(w, r, iconPath)
} else {
staticUI.ServeHTTP(w, r)
}
}
}
// Start starts the server. It listens on the configured address and port.
// It calls ListenAndServeTLS if TLS is configured, otherwise it calls ListenAndServe.
// Calls to Start are blocked until the server is shutdown.
@ -322,6 +351,7 @@ func (s *Server) getPerformerRoutes() chi.Router {
return performerRoutes{
routes: routes{txnManager: repo.TxnManager},
performerFinder: repo.Performer,
sfwConfig: s.manager.Config,
}.Routes()
}
@ -420,7 +450,7 @@ func cssHandler(c *config.Config) func(w http.ResponseWriter, r *http.Request) {
return func(w http.ResponseWriter, r *http.Request) {
var paths []string
if c.GetCSSEnabled() {
if c.GetCSSEnabled() && !c.GetDisableCustomizations() {
// search for custom.css in current directory, then $HOME/.stash
fn := c.GetCSSPath()
exists, _ := fsutil.FileExists(fn)
@ -438,7 +468,7 @@ func javascriptHandler(c *config.Config) func(w http.ResponseWriter, r *http.Req
return func(w http.ResponseWriter, r *http.Request) {
var paths []string
if c.GetJavascriptEnabled() {
if c.GetJavascriptEnabled() && !c.GetDisableCustomizations() {
// search for custom.js in current directory, then $HOME/.stash
fn := c.GetJavascriptPath()
exists, _ := fsutil.FileExists(fn)
@ -456,7 +486,7 @@ func customLocalesHandler(c *config.Config) func(w http.ResponseWriter, r *http.
return func(w http.ResponseWriter, r *http.Request) {
buffer := bytes.Buffer{}
if c.GetCustomLocalesEnabled() {
if c.GetCustomLocalesEnabled() && !c.GetDisableCustomizations() {
// search for custom-locales.json in current directory, then $HOME/.stash
path := c.GetCustomLocalesPath()
exists, _ := fsutil.FileExists(path)


@ -9,12 +9,14 @@ import (
type GalleryURLBuilder struct {
BaseURL string
GalleryID string
UpdatedAt string
}
func NewGalleryURLBuilder(baseURL string, gallery *models.Gallery) GalleryURLBuilder {
return GalleryURLBuilder{
BaseURL: baseURL,
GalleryID: strconv.Itoa(gallery.ID),
UpdatedAt: strconv.FormatInt(gallery.UpdatedAt.Unix(), 10),
}
}
@ -23,5 +25,5 @@ func (b GalleryURLBuilder) GetPreviewURL() string {
}
func (b GalleryURLBuilder) GetCoverURL() string {
return b.BaseURL + "/gallery/" + b.GalleryID + "/cover"
return b.BaseURL + "/gallery/" + b.GalleryID + "/cover?t=" + b.UpdatedAt
}


@ -101,16 +101,15 @@ func createPerformer(ctx context.Context, pqb models.PerformerWriter) error {
func createStudio(ctx context.Context, qb models.StudioWriter, name string) (*models.Studio, error) {
// create the studio
studio := models.Studio{
Name: name,
}
studio := models.NewCreateStudioInput()
studio.Name = name
err := qb.Create(ctx, &studio)
if err != nil {
return nil, err
}
return &studio, nil
return studio.Studio, nil
}
func createTag(ctx context.Context, qb models.TagWriter) error {
@ -119,7 +118,7 @@ func createTag(ctx context.Context, qb models.TagWriter) error {
Name: testName,
}
err := qb.Create(ctx, &tag)
err := qb.Create(ctx, &models.CreateTagInput{Tag: &tag})
if err != nil {
return err
}
@ -225,7 +224,7 @@ func createSceneFile(ctx context.Context, name string, folderStore models.Folder
}
func getOrCreateFolder(ctx context.Context, folderStore models.FolderFinderCreator, folderPath string) (*models.Folder, error) {
f, err := folderStore.FindByPath(ctx, folderPath)
f, err := folderStore.FindByPath(ctx, folderPath, true)
if err != nil {
return nil, fmt.Errorf("getting folder by path: %w", err)
}
@ -366,7 +365,10 @@ func makeImage(expectedResult bool) *models.Image {
}
func createImage(ctx context.Context, w models.ImageWriter, o *models.Image, f *models.ImageFile) error {
err := w.Create(ctx, o, []models.FileID{f.ID})
err := w.Create(ctx, &models.CreateImageInput{
Image: o,
FileIDs: []models.FileID{f.ID},
})
if err != nil {
return fmt.Errorf("Failed to create image with path '%s': %s", f.Path, err.Error())
@ -469,7 +471,10 @@ func makeGallery(expectedResult bool) *models.Gallery {
}
func createGallery(ctx context.Context, w models.GalleryWriter, o *models.Gallery, f *models.BaseFile) error {
err := w.Create(ctx, o, []models.FileID{f.ID})
err := w.Create(ctx, &models.CreateGalleryInput{
Gallery: o,
FileIDs: []models.FileID{f.ID},
})
if err != nil {
return fmt.Errorf("Failed to create gallery with path '%s': %s", f.Path, err.Error())
}


@ -2,6 +2,7 @@
package desktop
import (
"fmt"
"os"
"path"
"path/filepath"
@ -17,6 +18,16 @@ import (
"golang.org/x/term"
)
var isDesktop bool
// InitIsDesktop sets the value of isDesktop.
// IsDesktop is evaluated once at startup because, if it is checked while
// a terminal session is open (such as during the ffmpeg hardware encoding
// checks), it may incorrectly return false.
func InitIsDesktop() {
isDesktop = isDesktopCheck()
}
type FaviconProvider interface {
GetFavicon() []byte
GetFaviconPng() []byte
@@ -59,22 +70,33 @@ func SendNotification(title string, text string) {
}
func IsDesktop() bool {
return isDesktop
}
// isDesktop tries to determine if the application is running in a desktop environment
// where desktop features like system tray and notifications should be enabled.
func isDesktopCheck() bool {
if isDoubleClickLaunched() {
logger.Debug("Detected double-click launch")
return true
}
// Check if running under root
if os.Getuid() == 0 {
logger.Debug("Running as root, disabling desktop features")
return false
}
// Check if stdin is a terminal
if term.IsTerminal(int(os.Stdin.Fd())) {
logger.Debug("Running in terminal, disabling desktop features")
return false
}
if isService() {
logger.Debug("Running as a service, disabling desktop features")
return false
}
if IsServerDockerized() {
logger.Debug("Running in docker, disabling desktop features")
return false
}
@@ -134,15 +156,17 @@ func getIconPath() string {
return path.Join(config.GetInstance().GetConfigPath(), "icon.png")
}
func RevealInFileManager(path string) {
exists, err := fsutil.FileExists(path)
func RevealInFileManager(path string) error {
info, err := os.Stat(path)
if err != nil {
logger.Errorf("Error checking file: %s", err)
return
return fmt.Errorf("error checking path: %w", err)
}
if exists && IsDesktop() {
revealInFileManager(path)
absPath, err := filepath.Abs(path)
if err != nil {
return fmt.Errorf("error getting absolute path: %w", err)
}
return revealInFileManager(absPath, info)
}
func getServerURL(path string) string {


@@ -4,9 +4,11 @@
package desktop
import (
"fmt"
"os"
"os/exec"
"github.com/kermieisinthehouse/gosx-notifier"
gosxnotifier "github.com/kermieisinthehouse/gosx-notifier"
"github.com/stashapp/stash/pkg/logger"
)
@@ -32,8 +34,11 @@ func sendNotification(notificationTitle string, notificationText string) {
}
}
func revealInFileManager(path string) {
exec.Command(`open`, `-R`, path)
func revealInFileManager(path string, _ os.FileInfo) error {
if err := exec.Command(`open`, `-R`, path).Run(); err != nil {
return fmt.Errorf("error revealing path in Finder: %w", err)
}
return nil
}
func isDoubleClickLaunched() bool {


@@ -4,8 +4,10 @@
package desktop
import (
"fmt"
"os"
"os/exec"
"path/filepath"
"strings"
"github.com/stashapp/stash/pkg/logger"
@@ -33,8 +35,15 @@ func sendNotification(notificationTitle string, notificationText string) {
}
}
func revealInFileManager(path string) {
func revealInFileManager(path string, info os.FileInfo) error {
dir := path
if !info.IsDir() {
dir = filepath.Dir(path)
}
if err := exec.Command("xdg-open", dir).Run(); err != nil {
return fmt.Errorf("error opening directory in file manager: %w", err)
}
return nil
}
func isDoubleClickLaunched() bool {


@@ -4,6 +4,7 @@
package desktop
import (
"os"
"os/exec"
"syscall"
"unsafe"
@@ -83,6 +84,10 @@ func sendNotification(notificationTitle string, notificationText string) {
}
}
func revealInFileManager(path string) {
exec.Command(`explorer`, `\select`, path)
func revealInFileManager(path string, _ os.FileInfo) error {
c := exec.Command(`explorer`, `/select,`, path)
logger.Debugf("Running: %s", c.String())
// explorer seems to return an error code even when it works, so ignore the error
_ = c.Run()
return nil
}


@@ -3,6 +3,8 @@
package desktop
import (
"fmt"
"runtime"
"strings"
"github.com/kermieisinthehouse/systray"
@@ -20,7 +22,12 @@ func startSystray(exit chan int, faviconProvider FaviconProvider) {
// system is started from a non-terminal method, e.g. double-clicking an icon.
c := config.GetInstance()
if c.GetShowOneTimeMovedNotification() {
SendNotification("Stash has moved!", "Stash now runs in your tray, instead of a terminal window.")
// Use platform-appropriate terminology
location := "tray"
if runtime.GOOS == "darwin" {
location = "menu bar"
}
SendNotification("Stash has moved!", "Stash now runs in your "+location+", instead of a terminal window.")
c.SetBool(config.ShowOneTimeMovedNotification, false)
if err := c.Write(); err != nil {
logger.Errorf("Error while writing configuration file: %v", err)
@@ -52,12 +59,12 @@ func startSystray(exit chan int, faviconProvider FaviconProvider) {
func systrayInitialize(exit chan<- int, faviconProvider FaviconProvider) {
favicon := faviconProvider.GetFavicon()
systray.SetTemplateIcon(favicon, favicon)
systray.SetTooltip("🟢 Stash is Running.")
c := config.GetInstance()
systray.SetTooltip(fmt.Sprintf("🟢 Stash is Running on port %d.", c.GetPort()))
openStashButton := systray.AddMenuItem("Open Stash", "Open a browser window to Stash")
var menuItems []string
systray.AddSeparator()
c := config.GetInstance()
if !c.IsNewSystem() {
menuItems = c.GetMenuItems()
for _, item := range menuItems {

internal/dlna/activity.go (new file, 333 lines)

@@ -0,0 +1,333 @@
package dlna
import (
"context"
"fmt"
"sync"
"time"
"github.com/stashapp/stash/pkg/logger"
"github.com/stashapp/stash/pkg/txn"
)
const (
// DefaultSessionTimeout is the time after which a session is considered complete
// if no new requests are received.
// This is set high (5 minutes) because DLNA clients buffer aggressively and may not
// send any HTTP requests for extended periods while the user is still watching.
DefaultSessionTimeout = 5 * time.Minute
// monitorInterval is how often we check for expired sessions.
monitorInterval = 10 * time.Second
)
// ActivityConfig provides configuration options for DLNA activity tracking.
type ActivityConfig interface {
// GetDLNAActivityTrackingEnabled returns true if activity tracking should be enabled.
// If not implemented, defaults to true.
GetDLNAActivityTrackingEnabled() bool
// GetMinimumPlayPercent returns the minimum percentage of a video that must be
// watched before incrementing the play count. Uses UI setting if available.
GetMinimumPlayPercent() int
}
// SceneActivityWriter provides methods for saving scene activity.
type SceneActivityWriter interface {
SaveActivity(ctx context.Context, sceneID int, resumeTime *float64, playDuration *float64) (bool, error)
AddViews(ctx context.Context, sceneID int, dates []time.Time) ([]time.Time, error)
}
// streamSession represents an active DLNA streaming session.
type streamSession struct {
SceneID int
ClientIP string
StartTime time.Time
LastActivity time.Time
VideoDuration float64
PlayCountAdded bool
}
// sessionKey generates a unique key for a session based on client IP and scene ID.
func sessionKey(clientIP string, sceneID int) string {
return fmt.Sprintf("%s:%d", clientIP, sceneID)
}
// percentWatched calculates the estimated percentage of video watched.
// Uses a time-based approach since DLNA clients buffer aggressively and byte
// positions don't correlate with actual playback position.
//
// The key insight: you cannot have watched more of the video than time has elapsed.
// If the video is 30 minutes and only 1 minute has passed, maximum watched is ~3.3%.
func (s *streamSession) percentWatched() float64 {
if s.VideoDuration <= 0 {
return 0
}
// Calculate elapsed time from session start to last activity
elapsed := s.LastActivity.Sub(s.StartTime).Seconds()
if elapsed <= 0 {
return 0
}
// Maximum possible percent is based on elapsed time
// You can't watch more of the video than time has passed
timeBasedPercent := (elapsed / s.VideoDuration) * 100
// Cap at 100%
if timeBasedPercent > 100 {
return 100
}
return timeBasedPercent
}
// estimatedResumeTime calculates the estimated resume time based on elapsed time.
// Since DLNA clients buffer aggressively, byte positions don't correlate with playback.
// Instead, we estimate based on how long the session has been active.
// Returns the time in seconds, or 0 if the video is nearly complete (>=98%).
func (s *streamSession) estimatedResumeTime() float64 {
if s.VideoDuration <= 0 {
return 0
}
// Calculate elapsed time from session start
elapsed := s.LastActivity.Sub(s.StartTime).Seconds()
if elapsed <= 0 {
return 0
}
// If elapsed time exceeds 98% of video duration, reset resume time (matches frontend behavior)
if elapsed >= s.VideoDuration*0.98 {
return 0
}
// Resume time is approximately where the user was watching
// Capped by video duration
if elapsed > s.VideoDuration {
elapsed = s.VideoDuration
}
return elapsed
}
// ActivityTracker tracks DLNA streaming activity and saves it to the database.
type ActivityTracker struct {
txnManager txn.Manager
sceneWriter SceneActivityWriter
config ActivityConfig
sessionTimeout time.Duration
sessions map[string]*streamSession
mutex sync.RWMutex
ctx context.Context
cancelFunc context.CancelFunc
wg sync.WaitGroup
}
// NewActivityTracker creates a new ActivityTracker.
func NewActivityTracker(
txnManager txn.Manager,
sceneWriter SceneActivityWriter,
config ActivityConfig,
) *ActivityTracker {
ctx, cancel := context.WithCancel(context.Background())
tracker := &ActivityTracker{
txnManager: txnManager,
sceneWriter: sceneWriter,
config: config,
sessionTimeout: DefaultSessionTimeout,
sessions: make(map[string]*streamSession),
ctx: ctx,
cancelFunc: cancel,
}
// Start the session monitor goroutine
tracker.wg.Add(1)
go tracker.monitorSessions()
return tracker
}
// Stop stops the activity tracker and processes any remaining sessions.
func (t *ActivityTracker) Stop() {
t.cancelFunc()
t.wg.Wait()
// Process any remaining sessions
t.mutex.Lock()
sessions := make([]*streamSession, 0, len(t.sessions))
for _, session := range t.sessions {
sessions = append(sessions, session)
}
t.sessions = make(map[string]*streamSession)
t.mutex.Unlock()
for _, session := range sessions {
t.processCompletedSession(session)
}
}
// RecordRequest records a streaming request for activity tracking.
// Each request updates the session's LastActivity time, which is used for
// time-based tracking of watch progress.
func (t *ActivityTracker) RecordRequest(sceneID int, clientIP string, videoDuration float64) {
if !t.isEnabled() {
return
}
key := sessionKey(clientIP, sceneID)
now := time.Now()
t.mutex.Lock()
defer t.mutex.Unlock()
session, exists := t.sessions[key]
if !exists {
session = &streamSession{
SceneID: sceneID,
ClientIP: clientIP,
StartTime: now,
VideoDuration: videoDuration,
}
t.sessions[key] = session
logger.Debugf("[DLNA Activity] New session started: scene=%d, client=%s", sceneID, clientIP)
}
session.LastActivity = now
}
// monitorSessions periodically checks for expired sessions and processes them.
func (t *ActivityTracker) monitorSessions() {
defer t.wg.Done()
ticker := time.NewTicker(monitorInterval)
defer ticker.Stop()
for {
select {
case <-t.ctx.Done():
return
case <-ticker.C:
t.processExpiredSessions()
}
}
}
// processExpiredSessions finds and processes sessions that have timed out.
func (t *ActivityTracker) processExpiredSessions() {
now := time.Now()
var expiredSessions []*streamSession
t.mutex.Lock()
for key, session := range t.sessions {
timeSinceStart := now.Sub(session.StartTime)
timeSinceActivity := now.Sub(session.LastActivity)
// Must have no HTTP activity for the full timeout period
if timeSinceActivity <= t.sessionTimeout {
continue
}
// DLNA clients buffer aggressively - they fetch most/all of the video quickly,
// then play from cache with NO further HTTP requests.
//
// Two scenarios:
// 1. User watched the whole video: timeSinceStart >= videoDuration
// -> Set LastActivity to when timeout began (they finished watching)
// 2. User stopped early: timeSinceStart < videoDuration
// -> Keep LastActivity as-is (best estimate of when they stopped)
videoDuration := time.Duration(session.VideoDuration) * time.Second
if timeSinceStart >= videoDuration && videoDuration > 0 {
// User likely watched the whole video, then it timed out
// Estimate they watched until the timeout period started
session.LastActivity = now.Add(-t.sessionTimeout)
}
// else: User stopped early - LastActivity is already our best estimate
expiredSessions = append(expiredSessions, session)
delete(t.sessions, key)
}
t.mutex.Unlock()
for _, session := range expiredSessions {
t.processCompletedSession(session)
}
}
// processCompletedSession saves activity data for a completed streaming session.
func (t *ActivityTracker) processCompletedSession(session *streamSession) {
percentWatched := session.percentWatched()
resumeTime := session.estimatedResumeTime()
logger.Debugf("[DLNA Activity] Session completed: scene=%d, client=%s, videoDuration=%.1fs, percent=%.1f%%, resume=%.1fs",
session.SceneID, session.ClientIP, session.VideoDuration, percentWatched, resumeTime)
// Only save if there was meaningful activity (at least 1% watched)
if percentWatched < 1 {
logger.Debugf("[DLNA Activity] Session too short, skipping save")
return
}
// Skip DB operations if txnManager is nil (for testing)
if t.txnManager == nil {
logger.Debugf("[DLNA Activity] No transaction manager, skipping DB save")
return
}
// Determine what needs to be saved
shouldSaveResume := resumeTime > 0
shouldAddView := !session.PlayCountAdded && percentWatched >= float64(t.getMinimumPlayPercent())
// Nothing to save
if !shouldSaveResume && !shouldAddView {
return
}
// Save everything in a single transaction
ctx := context.Background()
if err := txn.WithTxn(ctx, t.txnManager, func(ctx context.Context) error {
// Save resume time only. DLNA clients buffer aggressively and don't report
// playback position, so we can't accurately track play duration - saving
// guesses would corrupt analytics. Resume time is still useful as a
// "continue watching" hint even if imprecise.
if shouldSaveResume {
if _, err := t.sceneWriter.SaveActivity(ctx, session.SceneID, &resumeTime, nil); err != nil {
return fmt.Errorf("save resume time: %w", err)
}
}
// Increment play count (also updates last_played_at via view date)
if shouldAddView {
if _, err := t.sceneWriter.AddViews(ctx, session.SceneID, []time.Time{time.Now()}); err != nil {
return fmt.Errorf("add view: %w", err)
}
session.PlayCountAdded = true
logger.Debugf("[DLNA Activity] Incremented play count for scene %d (%.1f%% watched)",
session.SceneID, percentWatched)
}
return nil
}); err != nil {
logger.Warnf("[DLNA Activity] Failed to save activity for scene %d: %v", session.SceneID, err)
}
}
// isEnabled returns true if activity tracking is enabled.
func (t *ActivityTracker) isEnabled() bool {
if t.config == nil {
return true // Default to enabled
}
return t.config.GetDLNAActivityTrackingEnabled()
}
// getMinimumPlayPercent returns the minimum play percentage for incrementing play count.
func (t *ActivityTracker) getMinimumPlayPercent() int {
if t.config == nil {
return 0 // Default: any play increments count (matches frontend default)
}
return t.config.GetMinimumPlayPercent()
}

Some files were not shown because too many files have changed in this diff.