Compare commits

...

96 Commits

Author SHA1 Message Date
Dmitry Popov
9a8dc4dbe5 Merge branch 'main' into tests-fixes 2025-11-22 12:29:22 +01:00
Dmitry Popov
7eb6d093cf fix(core): invalidate map characters every 1 hour for any missing/revoked permissions 2025-11-22 12:25:24 +01:00
CI
a23e544a9f chore: [skip ci] 2025-11-22 09:42:11 +00:00
CI
845ea7a576 chore: release version v1.85.3 2025-11-22 09:42:11 +00:00
Dmitry Popov
ae8fbf30e4 fix(core): fixed connection time status issues. fixed character alliance update issues 2025-11-22 10:41:35 +01:00
CI
3de385c902 chore: [skip ci] 2025-11-20 10:57:05 +00:00
CI
5f3d4dba37 chore: release version v1.85.2 2025-11-20 10:57:05 +00:00
Dmitry Popov
8acc7ddc25 fix(core): increased API pool limits 2025-11-20 11:56:31 +01:00
CI
ed6d25f3ea chore: [skip ci] 2025-11-20 10:35:09 +00:00
CI
ab07d1321d chore: release version v1.85.1 2025-11-20 10:35:09 +00:00
Dmitry Popov
a81e61bd70 Merge branch 'main' of github.com:wanderer-industries/wanderer 2025-11-20 11:31:39 +01:00
Dmitry Popov
d2d33619c2 fix(core): increased API pool limits 2025-11-20 11:31:36 +01:00
CI
fa464110c6 chore: [skip ci] 2025-11-19 23:13:02 +00:00
CI
a5fa60e699 chore: release version v1.85.0 2025-11-19 23:13:02 +00:00
Dmitry Popov
6db994852f feat(core): added support for new ship types 2025-11-20 00:12:30 +01:00
CI
0a68676957 chore: [skip ci] 2025-11-19 21:06:28 +00:00
CI
9b82dd8f43 chore: release version v1.84.37 2025-11-19 21:06:28 +00:00
Dmitry Popov
aac2c33fd2 fix(auth): fixed character auth issues 2025-11-19 22:05:49 +01:00
CI
1665b65619 chore: [skip ci] 2025-11-19 10:33:10 +00:00
CI
e1a946bb1d chore: release version v1.84.36 2025-11-19 10:33:10 +00:00
Dmitry Popov
543ec7f071 fix: fixed duplicated map slugs 2025-11-19 11:32:35 +01:00
CI
bf40d2cb8d chore: [skip ci] 2025-11-19 09:44:24 +00:00
CI
48ac40ea55 chore: release version v1.84.35 2025-11-19 09:44:24 +00:00
Dmitry Popov
5a3f3c40fe Merge pull request #552 from guarzo/guarzo/structurefix
fix: structure search / paste issues
2025-11-19 13:43:52 +04:00
guarzo
d5bac311ff Merge branch 'main' into guarzo/structurefix 2025-11-18 22:24:30 -05:00
Guarzo
34a7c854ed fix: structure search / paste issues 2025-11-18 22:19:04 -05:00
CI
ebb6090be9 chore: [skip ci] 2025-11-18 11:47:15 +00:00
CI
7a4d31db60 chore: release version v1.84.34 2025-11-18 11:47:15 +00:00
Dmitry Popov
2acf9ed5dc Merge branch 'main' of github.com:wanderer-industries/wanderer 2025-11-18 12:46:45 +01:00
Dmitry Popov
46df025200 fix(core): fixed character tracking issues 2025-11-18 12:46:42 +01:00
CI
43a363b5ab chore: [skip ci] 2025-11-18 11:00:34 +00:00
CI
03688387d8 chore: release version v1.84.33 2025-11-18 11:00:34 +00:00
Dmitry Popov
5060852918 Merge branch 'main' of github.com:wanderer-industries/wanderer 2025-11-18 12:00:04 +01:00
Dmitry Popov
57381b9782 fix(core): fixed character tracking issues 2025-11-18 12:00:01 +01:00
CI
6014c60e13 chore: [skip ci] 2025-11-18 10:08:04 +00:00
CI
1b711d7b4b chore: release version v1.84.32 2025-11-18 10:08:04 +00:00
Dmitry Popov
f761ba9746 fix(core): fixed character tracking issues 2025-11-18 11:04:32 +01:00
CI
20a795c5b5 chore: [skip ci] 2025-11-17 13:41:22 +00:00
CI
0c80894c65 chore: release version v1.84.31 2025-11-17 13:41:22 +00:00
Dmitry Popov
21844f0550 fix(core): fixed connections validation logic 2025-11-17 14:40:46 +01:00
CI
f7716ca45a chore: [skip ci] 2025-11-17 12:38:04 +00:00
CI
de74714c77 chore: release version v1.84.30 2025-11-17 12:38:04 +00:00
Dmitry Popov
4dfa83bd30 chore: fixed character updates issue 2025-11-17 13:37:30 +01:00
CI
cb4dba8dc2 chore: [skip ci] 2025-11-17 12:09:39 +00:00
CI
1d75b8f063 chore: release version v1.84.29 2025-11-17 12:09:39 +00:00
Dmitry Popov
2a42c4e6df Merge branch 'main' of github.com:wanderer-industries/wanderer 2025-11-17 13:09:08 +01:00
Dmitry Popov
0ee6160bcd chore: fixed MapEventRelay logs 2025-11-17 13:09:05 +01:00
CI
5826d2492b chore: [skip ci] 2025-11-17 11:53:30 +00:00
CI
a643e20247 chore: release version v1.84.28 2025-11-17 11:53:30 +00:00
Dmitry Popov
66dc680281 fix(core): fixed ACL updates 2025-11-17 12:52:59 +01:00
Dmitry Popov
5e0965ead4 fix(tests): updated tests 2025-11-17 12:52:11 +01:00
CI
46f46c745e chore: [skip ci] 2025-11-17 09:16:32 +00:00
CI
00bf620e35 chore: release version v1.84.27 2025-11-17 09:16:32 +00:00
Dmitry Popov
46eef60d86 chore: fixed warnings 2025-11-17 10:15:57 +01:00
Dmitry Popov
4c39c6fb39 fix(tests): updated tests 2025-11-17 00:09:10 +01:00
Dmitry Popov
fe836442ab fix(core): supported characters_updates for external events 2025-11-17 00:08:08 +01:00
Dmitry Popov
9514806dbb fix(core): improved character tracking 2025-11-16 23:45:39 +01:00
Dmitry Popov
4e6423ebc8 fix(core): improved character tracking 2025-11-16 18:28:58 +01:00
Dmitry Popov
a97e598299 fix(core): improved character location tracking 2025-11-16 16:39:39 +01:00
CI
9c26b50aac chore: [skip ci] 2025-11-16 01:14:41 +00:00
CI
3f2ddf5cc4 chore: release version v1.84.26 2025-11-16 01:14:41 +00:00
Dmitry Popov
233b2bd7a4 fix(core): disable character tracker pausing 2025-11-16 02:14:05 +01:00
CI
0d35268efc chore: [skip ci] 2025-11-16 01:01:35 +00:00
CI
d169220eb2 chore: release version v1.84.25 2025-11-16 01:01:35 +00:00
Dmitry Popov
182d5ec9fb fix(core): used upsert for adding map systems 2025-11-16 02:00:59 +01:00
CI
32958253b7 chore: [skip ci] 2025-11-15 22:50:08 +00:00
CI
c011d56ce7 chore: release version v1.84.24 2025-11-15 22:50:08 +00:00
Dmitry Popov
73d1921d42 Merge pull request #549 from wanderer-industries/redesign-and-fixes
fix(Map): New design and prepared main pages for new patch
2025-11-16 02:49:36 +04:00
Dmitry Popov
7bb810e1e6 chore: update bg image url 2025-11-15 23:35:59 +01:00
CI
c90ac7b1e3 chore: [skip ci] 2025-11-15 22:17:28 +00:00
CI
005e0c2bc6 chore: release version v1.84.23 2025-11-15 22:17:28 +00:00
Dmitry Popov
808acb540e fix(core): fixed map pings cancel errors 2025-11-15 23:16:58 +01:00
DanSylvest
06626f910b fix(Map): Fixed a problem where the mapper crashed if settings were removed. Fixed settings reset. 2025-11-15 21:30:45 +03:00
CI
812582d955 chore: [skip ci] 2025-11-15 11:38:00 +00:00
CI
f3077c0bf1 chore: release version v1.84.22 2025-11-15 11:38:00 +00:00
Dmitry Popov
32c70cbbad Merge branch 'main' of github.com:wanderer-industries/wanderer 2025-11-15 12:37:31 +01:00
Dmitry Popov
8934935e10 fix(core): fixed map initialization 2025-11-15 12:37:27 +01:00
CI
20c8a53712 chore: [skip ci] 2025-11-15 08:48:30 +00:00
CI
b22970fef3 chore: release version v1.84.21 2025-11-15 08:48:30 +00:00
Dmitry Popov
cf72394ef9 Merge branch 'main' of github.com:wanderer-industries/wanderer 2025-11-15 09:47:53 +01:00
Dmitry Popov
e6dbba7283 fix(core): fixed map characters adding 2025-11-15 09:47:48 +01:00
CI
843b3b86b2 chore: [skip ci] 2025-11-15 07:29:25 +00:00
CI
bd865b9f64 chore: release version v1.84.20 2025-11-15 07:29:25 +00:00
Dmitry Popov
ae91cd2f92 Merge branch 'main' of github.com:wanderer-industries/wanderer 2025-11-15 08:25:59 +01:00
Dmitry Popov
0be7a5f9d0 fix(core): fixed map start issues 2025-11-15 08:25:55 +01:00
CI
e15bfa426a chore: [skip ci] 2025-11-14 19:28:51 +00:00
CI
4198e4b07a chore: release version v1.84.19 2025-11-14 19:28:51 +00:00
Dmitry Popov
03ee08ff67 Merge branch 'main' of github.com:wanderer-industries/wanderer 2025-11-14 20:28:16 +01:00
Dmitry Popov
ac4dd4c28b fix(core): fixed map start issues 2025-11-14 20:28:12 +01:00
CI
308e81a464 chore: [skip ci] 2025-11-14 18:36:20 +00:00
CI
6f4240d931 chore: release version v1.84.18 2025-11-14 18:36:20 +00:00
Dmitry Popov
847b45a431 fix(core): added graceful map poll recovery from saved state. added map slug unique checks 2025-11-14 19:35:45 +01:00
CI
5ec97d74ca chore: [skip ci] 2025-11-14 13:43:40 +00:00
CI
74359a5542 chore: release version v1.84.17 2025-11-14 13:43:40 +00:00
Dmitry Popov
0020f46dd8 fix(core): fixed activity tracking issues 2025-11-14 14:42:44 +01:00
CI
a6751b45c6 chore: [skip ci] 2025-11-13 16:20:24 +00:00
128 changed files with 7415 additions and 2064 deletions

View File

@@ -1,9 +1,9 @@
name: Build Docker Image
name: Build Develop
on:
push:
tags:
- '**'
branches:
- develop
env:
MIX_ENV: prod
@@ -18,12 +18,85 @@ permissions:
contents: write
jobs:
build:
name: 🛠 Build
runs-on: ubuntu-22.04
if: ${{ github.ref == 'refs/heads/develop' && github.event_name == 'push' }}
permissions:
checks: write
contents: write
packages: write
attestations: write
id-token: write
pull-requests: write
repository-projects: write
strategy:
matrix:
otp: ["27"]
elixir: ["1.17"]
node-version: ["18.x"]
outputs:
commit_hash: ${{ steps.set-commit-develop.outputs.commit_hash }}
steps:
- name: Prepare
run: |
platform=${{ matrix.platform }}
echo "PLATFORM_PAIR=${platform//\//-}" >> $GITHUB_ENV
- name: Setup Elixir
uses: erlef/setup-beam@v1
with:
otp-version: ${{matrix.otp}}
elixir-version: ${{matrix.elixir}}
# nix build would also work here because `todos` is the default package
- name: ⬇️ Checkout repo
uses: actions/checkout@v3
with:
ssh-key: "${{ secrets.COMMIT_KEY }}"
fetch-depth: 0
- name: 😅 Cache deps
id: cache-deps
uses: actions/cache@v4
env:
cache-name: cache-elixir-deps
with:
path: |
deps
key: ${{ runner.os }}-mix-${{ matrix.elixir }}-${{ matrix.otp }}-${{ hashFiles('**/mix.lock') }}
restore-keys: |
${{ runner.os }}-mix-${{ matrix.elixir }}-${{ matrix.otp }}-
- name: 😅 Cache compiled build
id: cache-build
uses: actions/cache@v4
env:
cache-name: cache-compiled-build
with:
path: |
_build
key: ${{ runner.os }}-build-${{ hashFiles('**/mix.lock') }}-${{ hashFiles( '**/lib/**/*.{ex,eex}', '**/config/*.exs', '**/mix.exs' ) }}
restore-keys: |
${{ runner.os }}-build-${{ hashFiles('**/mix.lock') }}-
${{ runner.os }}-build-
# Step: Download project dependencies. If unchanged, uses
# the cached version.
- name: 🌐 Install dependencies
run: mix deps.get --only "prod"
# Step: Compile the project treating any warnings as errors.
# Customize this step if a different behavior is desired.
- name: 🛠 Compiles without warnings
if: steps.cache-build.outputs.cache-hit != 'true'
run: mix compile
- name: Set commit hash for develop
id: set-commit-develop
run: |
echo "commit_hash=$(git rev-parse HEAD)" >> $GITHUB_OUTPUT
docker:
name: 🛠 Build Docker Images
needs: build
runs-on: ubuntu-22.04
outputs:
release-tag: ${{ steps.get-latest-tag.outputs.tag }}
release-notes: ${{ steps.get-content.outputs.string }}
permissions:
checks: write
contents: write
@@ -37,6 +110,7 @@ jobs:
matrix:
platform:
- linux/amd64
- linux/arm64
steps:
- name: Prepare
run: |
@@ -46,25 +120,9 @@ jobs:
- name: ⬇️ Checkout repo
uses: actions/checkout@v3
with:
ref: ${{ needs.build.outputs.commit_hash }}
fetch-depth: 0
- name: Get Release Tag
id: get-latest-tag
uses: "WyriHaximus/github-action-get-previous-tag@v1"
with:
fallback: 1.0.0
- name: ⬇️ Checkout repo
uses: actions/checkout@v3
with:
ref: ${{ steps.get-latest-tag.outputs.tag }}
fetch-depth: 0
- name: Prepare Changelog
run: |
yes | cp -rf CHANGELOG.md priv/changelog/CHANGELOG.md
sed -i '1i%{title: "Change Log"}\n\n---\n' priv/changelog/CHANGELOG.md
- name: Extract metadata (tags, labels) for Docker
id: meta
uses: docker/metadata-action@v5
@@ -113,24 +171,6 @@ jobs:
if-no-files-found: error
retention-days: 1
- uses: markpatterson27/markdown-to-output@v1
id: extract-changelog
with:
filepath: CHANGELOG.md
- name: Get content
uses: 2428392/gh-truncate-string-action@v1.3.0
id: get-content
with:
stringToTruncate: |
📣 Wanderer new release available 🎉
**Version**: ${{ steps.get-latest-tag.outputs.tag }}
${{ steps.extract-changelog.outputs.body }}
maxLength: 500
truncationSymbol: "…"
merge:
runs-on: ubuntu-latest
needs:
@@ -161,9 +201,8 @@ jobs:
tags: |
type=ref,event=branch
type=ref,event=pr
type=semver,pattern={{version}}
type=semver,pattern={{major}}.{{minor}}
type=semver,pattern={{version}},value=${{ needs.docker.outputs.release-tag }}
type=raw,value=develop,enable=${{ github.ref == 'refs/heads/develop' }}
type=raw,value=develop-{{sha}},enable=${{ github.ref == 'refs/heads/develop' }}
- name: Create manifest list and push
working-directory: /tmp/digests
@@ -176,12 +215,20 @@ jobs:
docker buildx imagetools inspect ${{ env.REGISTRY_IMAGE }}:${{ steps.meta.outputs.version }}
notify:
name: 🏷 Notify about release
name: 🏷 Notify about develop release
runs-on: ubuntu-22.04
needs: [docker, merge]
steps:
- name: Discord Webhook Action
uses: tsickert/discord-webhook@v5.3.0
with:
webhook-url: ${{ secrets.DISCORD_WEBHOOK_URL }}
content: ${{ needs.docker.outputs.release-notes }}
webhook-url: ${{ secrets.DISCORD_WEBHOOK_URL_DEV }}
content: |
📣 New develop release available 🚀
**Commit**: `${{ github.sha }}`
**Status**: Development/Testing Release
Docker image: `wandererltd/community-edition:develop`
⚠️ This is an unstable development release for testing purposes.

View File

@@ -4,7 +4,6 @@ on:
push:
branches:
- main
- develop
env:
MIX_ENV: prod
@@ -22,7 +21,7 @@ jobs:
build:
name: 🛠 Build
runs-on: ubuntu-22.04
if: ${{ (github.ref == 'refs/heads/main' || github.ref == 'refs/heads/develop') && github.event_name == 'push' }}
if: ${{ github.ref == 'refs/heads/main' && github.event_name == 'push' }}
permissions:
checks: write
contents: write
@@ -37,7 +36,7 @@ jobs:
elixir: ["1.17"]
node-version: ["18.x"]
outputs:
commit_hash: ${{ steps.generate-changelog.outputs.commit_hash || steps.set-commit-develop.outputs.commit_hash }}
commit_hash: ${{ steps.generate-changelog.outputs.commit_hash }}
steps:
- name: Prepare
run: |
@@ -91,7 +90,6 @@ jobs:
- name: Generate Changelog & Update Tag Version
id: generate-changelog
if: github.ref == 'refs/heads/main'
run: |
git config --global user.name 'CI'
git config --global user.email 'ci@users.noreply.github.com'
@@ -102,15 +100,16 @@ jobs:
- name: Set commit hash for develop
id: set-commit-develop
if: github.ref == 'refs/heads/develop'
run: |
echo "commit_hash=$(git rev-parse HEAD)" >> $GITHUB_OUTPUT
docker:
name: 🛠 Build Docker Images
if: github.ref == 'refs/heads/develop'
needs: build
runs-on: ubuntu-22.04
outputs:
release-tag: ${{ steps.get-latest-tag.outputs.tag }}
release-notes: ${{ steps.get-content.outputs.string }}
permissions:
checks: write
contents: write
@@ -137,6 +136,17 @@ jobs:
ref: ${{ needs.build.outputs.commit_hash }}
fetch-depth: 0
- name: Get Release Tag
id: get-latest-tag
uses: "WyriHaximus/github-action-get-previous-tag@v1"
with:
fallback: 1.0.0
- name: Prepare Changelog
run: |
yes | cp -rf CHANGELOG.md priv/changelog/CHANGELOG.md
sed -i '1i%{title: "Change Log"}\n\n---\n' priv/changelog/CHANGELOG.md
- name: Extract metadata (tags, labels) for Docker
id: meta
uses: docker/metadata-action@v5
@@ -185,6 +195,24 @@ jobs:
if-no-files-found: error
retention-days: 1
- uses: markpatterson27/markdown-to-output@v1
id: extract-changelog
with:
filepath: CHANGELOG.md
- name: Get content
uses: 2428392/gh-truncate-string-action@v1.3.0
id: get-content
with:
stringToTruncate: |
📣 Wanderer new release available 🎉
**Version**: ${{ steps.get-latest-tag.outputs.tag }}
${{ steps.extract-changelog.outputs.body }}
maxLength: 500
truncationSymbol: "…"
merge:
runs-on: ubuntu-latest
needs:
@@ -215,8 +243,9 @@ jobs:
tags: |
type=ref,event=branch
type=ref,event=pr
type=raw,value=develop,enable=${{ github.ref == 'refs/heads/develop' }}
type=raw,value=develop-{{sha}},enable=${{ github.ref == 'refs/heads/develop' }}
type=semver,pattern={{version}}
type=semver,pattern={{major}}.{{minor}}
type=semver,pattern={{version}},value=${{ needs.docker.outputs.release-tag }}
- name: Create manifest list and push
working-directory: /tmp/digests
@@ -259,3 +288,14 @@ jobs:
## How to Promote?
In order to promote this to prod, edit the draft and press **"Publish release"**.
draft: true
notify:
name: 🏷 Notify about release
runs-on: ubuntu-22.04
needs: [docker, merge]
steps:
- name: Discord Webhook Action
uses: tsickert/discord-webhook@v5.3.0
with:
webhook-url: ${{ secrets.DISCORD_WEBHOOK_URL }}
content: ${{ needs.docker.outputs.release-notes }}

View File

@@ -1,187 +0,0 @@
name: Build Docker ARM Image
on:
push:
tags:
- '**'
env:
MIX_ENV: prod
GH_TOKEN: ${{ github.token }}
REGISTRY_IMAGE: wandererltd/community-edition-arm
concurrency:
group: ${{ github.workflow }}-${{ github.ref }}
cancel-in-progress: true
permissions:
contents: write
jobs:
docker:
name: 🛠 Build Docker Images
runs-on: ubuntu-22.04
outputs:
release-tag: ${{ steps.get-latest-tag.outputs.tag }}
release-notes: ${{ steps.get-content.outputs.string }}
permissions:
checks: write
contents: write
packages: write
attestations: write
id-token: write
pull-requests: write
repository-projects: write
strategy:
fail-fast: false
matrix:
platform:
- linux/arm64
steps:
- name: Prepare
run: |
platform=${{ matrix.platform }}
echo "PLATFORM_PAIR=${platform//\//-}" >> $GITHUB_ENV
- name: ⬇️ Checkout repo
uses: actions/checkout@v3
with:
fetch-depth: 0
- name: Get Release Tag
id: get-latest-tag
uses: "WyriHaximus/github-action-get-previous-tag@v1"
with:
fallback: 1.0.0
- name: ⬇️ Checkout repo
uses: actions/checkout@v3
with:
ref: ${{ steps.get-latest-tag.outputs.tag }}
fetch-depth: 0
- name: Prepare Changelog
run: |
yes | cp -rf CHANGELOG.md priv/changelog/CHANGELOG.md
sed -i '1i%{title: "Change Log"}\n\n---\n' priv/changelog/CHANGELOG.md
- name: Extract metadata (tags, labels) for Docker
id: meta
uses: docker/metadata-action@v5
with:
images: ${{ env.REGISTRY_IMAGE }}
- name: Set up QEMU
uses: docker/setup-qemu-action@v3
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3
- name: Login to DockerHub
uses: docker/login-action@v3
with:
username: ${{ secrets.WANDERER_DOCKER_USER }}
password: ${{ secrets.WANDERER_DOCKER_PASSWORD }}
- name: Build and push
id: build
uses: docker/build-push-action@v6
with:
push: true
context: .
file: ./Dockerfile
cache-from: type=gha
cache-to: type=gha,mode=max
labels: ${{ steps.meta.outputs.labels }}
platforms: ${{ matrix.platform }}
outputs: type=image,"name=${{ env.REGISTRY_IMAGE }}",push-by-digest=true,name-canonical=true,push=true
build-args: |
MIX_ENV=prod
BUILD_METADATA=${{ steps.meta.outputs.json }}
- name: Export digest
run: |
mkdir -p /tmp/digests
digest="${{ steps.build.outputs.digest }}"
touch "/tmp/digests/${digest#sha256:}"
- name: Upload digest
uses: actions/upload-artifact@v4
with:
name: digests-${{ env.PLATFORM_PAIR }}
path: /tmp/digests/*
if-no-files-found: error
retention-days: 1
- uses: markpatterson27/markdown-to-output@v1
id: extract-changelog
with:
filepath: CHANGELOG.md
- name: Get content
uses: 2428392/gh-truncate-string-action@v1.3.0
id: get-content
with:
stringToTruncate: |
📣 Wanderer **ARM** release available 🎉
**Version**: :${{ steps.get-latest-tag.outputs.tag }}
${{ steps.extract-changelog.outputs.body }}
maxLength: 500
truncationSymbol: "…"
merge:
runs-on: ubuntu-latest
needs:
- docker
steps:
- name: Download digests
uses: actions/download-artifact@v4
with:
path: /tmp/digests
pattern: digests-*
merge-multiple: true
- name: Login to Docker Hub
uses: docker/login-action@v3
with:
username: ${{ secrets.WANDERER_DOCKER_USER }}
password: ${{ secrets.WANDERER_DOCKER_PASSWORD }}
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3
- name: Docker meta
id: meta
uses: docker/metadata-action@v5
with:
images: |
${{ env.REGISTRY_IMAGE }}
tags: |
type=ref,event=branch
type=ref,event=pr
type=semver,pattern={{version}}
type=semver,pattern={{major}}.{{minor}}
type=semver,pattern={{version}},value=${{ needs.docker.outputs.release-tag }}
- name: Create manifest list and push
working-directory: /tmp/digests
run: |
docker buildx imagetools create $(jq -cr '.tags | map("-t " + .) | join(" ")' <<< "$DOCKER_METADATA_OUTPUT_JSON") \
$(printf '${{ env.REGISTRY_IMAGE }}@sha256:%s ' *)
- name: Inspect image
run: |
docker buildx imagetools inspect ${{ env.REGISTRY_IMAGE }}:${{ steps.meta.outputs.version }}
notify:
name: 🏷 Notify about release
runs-on: ubuntu-22.04
needs: [docker, merge]
steps:
- name: Discord Webhook Action
uses: tsickert/discord-webhook@v5.3.0
with:
webhook-url: ${{ secrets.DISCORD_WEBHOOK_URL }}
content: ${{ needs.docker.outputs.release-notes }}

View File

@@ -2,6 +2,229 @@
<!-- changelog -->
## [v1.85.3](https://github.com/wanderer-industries/wanderer/compare/v1.85.2...v1.85.3) (2025-11-22)
### Bug Fixes:
* core: fixed connection time status issues. fixed character alliance update issues
## [v1.85.2](https://github.com/wanderer-industries/wanderer/compare/v1.85.1...v1.85.2) (2025-11-20)
### Bug Fixes:
* core: increased API pool limits
## [v1.85.1](https://github.com/wanderer-industries/wanderer/compare/v1.85.0...v1.85.1) (2025-11-20)
### Bug Fixes:
* core: increased API pool limits
## [v1.85.0](https://github.com/wanderer-industries/wanderer/compare/v1.84.37...v1.85.0) (2025-11-19)
### Features:
* core: added support for new ship types
## [v1.84.37](https://github.com/wanderer-industries/wanderer/compare/v1.84.36...v1.84.37) (2025-11-19)
### Bug Fixes:
* auth: fixed character auth issues
## [v1.84.36](https://github.com/wanderer-industries/wanderer/compare/v1.84.35...v1.84.36) (2025-11-19)
### Bug Fixes:
* fixed duplicated map slugs
## [v1.84.35](https://github.com/wanderer-industries/wanderer/compare/v1.84.34...v1.84.35) (2025-11-19)
### Bug Fixes:
* structure search / paste issues
## [v1.84.34](https://github.com/wanderer-industries/wanderer/compare/v1.84.33...v1.84.34) (2025-11-18)
### Bug Fixes:
* core: fixed character tracking issues
## [v1.84.33](https://github.com/wanderer-industries/wanderer/compare/v1.84.32...v1.84.33) (2025-11-18)
### Bug Fixes:
* core: fixed character tracking issues
## [v1.84.32](https://github.com/wanderer-industries/wanderer/compare/v1.84.31...v1.84.32) (2025-11-18)
### Bug Fixes:
* core: fixed character tracking issues
## [v1.84.31](https://github.com/wanderer-industries/wanderer/compare/v1.84.30...v1.84.31) (2025-11-17)
### Bug Fixes:
* core: fixed connections validation logic
## [v1.84.30](https://github.com/wanderer-industries/wanderer/compare/v1.84.29...v1.84.30) (2025-11-17)
## [v1.84.29](https://github.com/wanderer-industries/wanderer/compare/v1.84.28...v1.84.29) (2025-11-17)
## [v1.84.28](https://github.com/wanderer-industries/wanderer/compare/v1.84.27...v1.84.28) (2025-11-17)
### Bug Fixes:
* core: fixed ACL updates
## [v1.84.27](https://github.com/wanderer-industries/wanderer/compare/v1.84.26...v1.84.27) (2025-11-17)
### Bug Fixes:
* core: supported characters_updates for external events
* core: improved character tracking
* core: improved character tracking
* core: improved character location tracking
## [v1.84.26](https://github.com/wanderer-industries/wanderer/compare/v1.84.25...v1.84.26) (2025-11-16)
### Bug Fixes:
* core: disable character tracker pausing
## [v1.84.25](https://github.com/wanderer-industries/wanderer/compare/v1.84.24...v1.84.25) (2025-11-16)
### Bug Fixes:
* core: used upsert for adding map systems
## [v1.84.24](https://github.com/wanderer-industries/wanderer/compare/v1.84.23...v1.84.24) (2025-11-15)
### Bug Fixes:
* Map: Fixed a problem where the mapper crashed if settings were removed. Fixed settings reset.
## [v1.84.23](https://github.com/wanderer-industries/wanderer/compare/v1.84.22...v1.84.23) (2025-11-15)
### Bug Fixes:
* core: fixed map pings cancel errors
## [v1.84.22](https://github.com/wanderer-industries/wanderer/compare/v1.84.21...v1.84.22) (2025-11-15)
### Bug Fixes:
* core: fixed map initialization
## [v1.84.21](https://github.com/wanderer-industries/wanderer/compare/v1.84.20...v1.84.21) (2025-11-15)
### Bug Fixes:
* core: fixed map characters adding
## [v1.84.20](https://github.com/wanderer-industries/wanderer/compare/v1.84.19...v1.84.20) (2025-11-15)
### Bug Fixes:
* core: fixed map start issues
## [v1.84.19](https://github.com/wanderer-industries/wanderer/compare/v1.84.18...v1.84.19) (2025-11-14)
### Bug Fixes:
* core: fixed map start issues
## [v1.84.18](https://github.com/wanderer-industries/wanderer/compare/v1.84.17...v1.84.18) (2025-11-14)
### Bug Fixes:
* core: added graceful map poll recovery from saved state. added map slug unique checks
## [v1.84.17](https://github.com/wanderer-industries/wanderer/compare/v1.84.16...v1.84.17) (2025-11-14)
### Bug Fixes:
* core: fixed activity tracking issues
## [v1.84.16](https://github.com/wanderer-industries/wanderer/compare/v1.84.15...v1.84.16) (2025-11-13)

View File

@@ -30,7 +30,7 @@ format f:
mix format
test t:
mix test
MIX_ENV=test mix test
coverage cover co:
mix test --cover
@@ -45,4 +45,3 @@ versions v:
@cat .tool-versions
@cat Aptfile
@echo

View File

@@ -73,7 +73,9 @@ body > div:first-of-type {
}
.maps_bg {
background-image: url('../images/maps_bg.webp');
/* OLD image */
/* background-image: url('../images/maps_bg.webp'); */
background-image: url('https://wanderer-industries.github.io/wanderer-assets/images/eve-screen-catalyst-expansion-bg.jpg');
background-size: cover;
background-position: center;
width: 100%;

View File

@@ -51,20 +51,8 @@ export const Characters = ({ data }: CharactersProps) => {
['border-lime-600/70']: character.online,
},
)}
title={character.tracking_paused ? `${character.name} - Tracking Paused (click to resume)` : character.name}
title={character.name}
>
{character.tracking_paused && (
<>
<span
className={clsx(
'absolute flex flex-col p-[2px] top-[0px] left-[0px] w-[35px] h-[35px]',
'text-yellow-500 text-[9px] z-10 bg-gray-800/40',
'pi',
PrimeIcons.PAUSE,
)}
/>
</>
)}
{mainCharacterEveId === character.eve_id && (
<span
className={clsx(

View File

@@ -1,6 +1,6 @@
@use "sass:color";
@use '@/hooks/Mapper/components/map/styles/eve-common-variables';
@import '@/hooks/Mapper/components/map/styles/solar-system-node';
@use '@/hooks/Mapper/components/map/styles/solar-system-node' as v;
@keyframes move-stripes {
from {
@@ -26,8 +26,8 @@
background-color: var(--rf-node-bg-color, #202020) !important;
color: var(--rf-text-color, #ffffff);
box-shadow: 0 0 5px rgba($dark-bg, 0.5);
border: 1px solid color.adjust($pastel-blue, $lightness: -10%);
box-shadow: 0 0 5px rgba(v.$dark-bg, 0.5);
border: 1px solid color.adjust(v.$pastel-blue, $lightness: -10%);
border-radius: 5px;
position: relative;
z-index: 3;
@@ -99,7 +99,7 @@
}
&.selected {
border-color: $pastel-pink;
border-color: v.$pastel-pink;
box-shadow: 0 0 10px #9a1af1c2;
}
@@ -113,11 +113,11 @@
bottom: 0;
z-index: -1;
border-color: $neon-color-1;
border-color: v.$neon-color-1;
background: repeating-linear-gradient(
45deg,
$neon-color-3 0px,
$neon-color-3 8px,
v.$neon-color-3 0px,
v.$neon-color-3 8px,
transparent 8px,
transparent 21px
);
@@ -146,7 +146,7 @@
border: 1px solid var(--eve-solar-system-status-color-lookingFor-dark15);
background-image: linear-gradient(275deg, #45ff8f2f, #457fff2f);
&.selected {
border-color: $pastel-pink;
border-color: v.$pastel-pink;
}
}
@@ -347,13 +347,13 @@
.Handle {
min-width: initial;
min-height: initial;
border: 1px solid $pastel-blue;
border: 1px solid v.$pastel-blue;
width: 5px;
height: 5px;
pointer-events: auto;
&.selected {
border-color: $pastel-pink;
border-color: v.$pastel-pink;
}
&.HandleTop {

View File

@@ -14,8 +14,27 @@ export const useCommandsCharacters = () => {
const ref = useRef({ update });
ref.current = { update };
const charactersUpdated = useCallback((characters: CommandCharactersUpdated) => {
ref.current.update(() => ({ characters: characters.slice() }));
const charactersUpdated = useCallback((updatedCharacters: CommandCharactersUpdated) => {
ref.current.update(state => {
const existing = state.characters ?? [];
// Put updatedCharacters into a map keyed by ID
const updatedMap = new Map(updatedCharacters.map(c => [c.eve_id, c]));
// 1. Update existing characters when possible
const merged = existing.map(character => {
const updated = updatedMap.get(character.eve_id);
if (updated) {
updatedMap.delete(character.eve_id); // Mark as processed
return { ...character, ...updated };
}
return character;
});
// 2. Any remaining items in updatedMap are NEW characters → add them
const newCharacters = Array.from(updatedMap.values());
return { characters: [...merged, ...newCharacters] };
});
}, []);
const characterAdded = useCallback((value: CommandCharacterAdded) => {

View File

@@ -30,10 +30,14 @@ export const SystemStructures: React.FC = () => {
const processClipboard = useCallback(
(text: string) => {
if (!systemId) {
console.warn('Cannot update structures: no system selected');
return;
}
const updated = processSnippetText(text, structures);
handleUpdateStructures(updated);
},
[structures, handleUpdateStructures],
[systemId, structures, handleUpdateStructures],
);
const handlePaste = useCallback(

View File

@@ -56,6 +56,11 @@ export function useSystemStructures({ systemId, outCommand }: UseSystemStructure
const handleUpdateStructures = useCallback(
async (newList: StructureItem[]) => {
if (!systemId) {
console.warn('Cannot update structures: systemId is undefined');
return;
}
const { added, updated, removed } = getActualStructures(structures, newList);
const sanitizedAdded = added.map(sanitizeIds);

View File

@@ -1,6 +1,6 @@
@use "sass:color";
@use '@/hooks/Mapper/components/map/styles/eve-common-variables';
@import '@/hooks/Mapper/components/map/styles/solar-system-node';
@use '@/hooks/Mapper/components/map/styles/solar-system-node' as v;
:root {
--rf-has-user-characters: #ffc75d;
@@ -108,7 +108,7 @@
}
&.selected {
border-color: $pastel-pink;
border-color: v.$pastel-pink;
box-shadow: 0 0 10px #9a1af1c2;
}
@@ -122,11 +122,11 @@
bottom: 0;
z-index: -1;
border-color: $neon-color-1;
border-color: v.$neon-color-1;
background: repeating-linear-gradient(
45deg,
$neon-color-3 0px,
$neon-color-3 8px,
v.$neon-color-3 0px,
v.$neon-color-3 8px,
transparent 8px,
transparent 21px
);
@@ -152,7 +152,7 @@
&.eve-system-status-lookingFor {
background-image: linear-gradient(275deg, #45ff8f2f, #457fff2f);
&.selected {
border-color: $pastel-pink;
border-color: v.$pastel-pink;
}
}

View File

@@ -1,5 +1,4 @@
import { useMapRootState } from '@/hooks/Mapper/mapRootProvider';
import { useCallback, useRef } from 'react';
import {
CommandCharacterAdded,
CommandCharacterRemoved,
@@ -7,6 +6,7 @@ import {
CommandCharacterUpdated,
CommandPresentCharacters,
} from '@/hooks/Mapper/types';
import { useCallback, useRef } from 'react';
export const useCommandsCharacters = () => {
const { update } = useMapRootState();
@@ -14,8 +14,27 @@ export const useCommandsCharacters = () => {
const ref = useRef({ update });
ref.current = { update };
const charactersUpdated = useCallback((characters: CommandCharactersUpdated) => {
ref.current.update(() => ({ characters: characters.slice() }));
const charactersUpdated = useCallback((updatedCharacters: CommandCharactersUpdated) => {
ref.current.update(state => {
const existing = state.characters ?? [];
// Put updatedCharacters into a map keyed by ID
const updatedMap = new Map(updatedCharacters.map(c => [c.eve_id, c]));
// 1. Update existing characters when possible
const merged = existing.map(character => {
const updated = updatedMap.get(character.eve_id);
if (updated) {
updatedMap.delete(character.eve_id); // Mark as processed
return { ...character, ...updated };
}
return character;
});
// 2. Any remaining items in updatedMap are NEW characters → add them
const newCharacters = Array.from(updatedMap.values());
return { characters: [...merged, ...newCharacters] };
});
}, []);
const characterAdded = useCallback((value: CommandCharacterAdded) => {

View File

@@ -33,7 +33,6 @@ export type CharacterTypeRaw = {
corporation_id: number;
corporation_name: string;
corporation_ticker: string;
tracking_paused: boolean;
};
export interface TrackingCharacter {

View File

@@ -12,11 +12,11 @@ const animateBg = function (bgCanvas) {
*/
const randomInRange = (max, min) => Math.floor(Math.random() * (max - min + 1)) + min;
const BASE_SIZE = 1;
const VELOCITY_INC = 1.01;
const VELOCITY_INC = 1.002;
const VELOCITY_INIT_INC = 0.525;
const JUMP_VELOCITY_INC = 0.55;
const JUMP_SIZE_INC = 1.15;
const SIZE_INC = 1.01;
const SIZE_INC = 1.002;
const RAD = Math.PI / 180;
const WARP_COLORS = [
[197, 239, 247],

View File

@@ -27,11 +27,7 @@ config :wanderer_app,
generators: [timestamp_type: :utc_datetime],
ddrt: WandererApp.Map.CacheRTree,
logger: Logger,
pubsub_client: Phoenix.PubSub,
wanderer_kills_base_url:
System.get_env("WANDERER_KILLS_BASE_URL", "ws://host.docker.internal:4004"),
wanderer_kills_service_enabled:
System.get_env("WANDERER_KILLS_SERVICE_ENABLED", "false") == "true"
pubsub_client: Phoenix.PubSub
config :wanderer_app, WandererAppWeb.Endpoint,
adapter: Bandit.PhoenixAdapter,

View File

@@ -4,7 +4,7 @@ import Config
config :wanderer_app, WandererApp.Repo,
username: "postgres",
password: "postgres",
hostname: System.get_env("DB_HOST", "localhost"),
hostname: "localhost",
database: "wanderer_dev",
stacktrace: true,
show_sensitive_data_on_connection_error: true,

View File

@@ -177,7 +177,34 @@ config :wanderer_app,
],
extra_characters_50: map_subscription_extra_characters_50_price,
extra_hubs_10: map_subscription_extra_hubs_10_price
}
},
# Finch pool configuration - separate pools for different services
# ESI Character Tracking pool - high capacity for bulk character operations
# With 30+ TrackerPools × ~100 concurrent tasks, need large pool
finch_esi_character_pool_size:
System.get_env("WANDERER_FINCH_ESI_CHARACTER_POOL_SIZE", "200") |> String.to_integer(),
finch_esi_character_pool_count:
System.get_env("WANDERER_FINCH_ESI_CHARACTER_POOL_COUNT", "4") |> String.to_integer(),
# ESI General pool - standard capacity for general ESI operations
finch_esi_general_pool_size:
System.get_env("WANDERER_FINCH_ESI_GENERAL_POOL_SIZE", "50") |> String.to_integer(),
finch_esi_general_pool_count:
System.get_env("WANDERER_FINCH_ESI_GENERAL_POOL_COUNT", "4") |> String.to_integer(),
# Webhooks pool - isolated from ESI rate limits
finch_webhooks_pool_size:
System.get_env("WANDERER_FINCH_WEBHOOKS_POOL_SIZE", "25") |> String.to_integer(),
finch_webhooks_pool_count:
System.get_env("WANDERER_FINCH_WEBHOOKS_POOL_COUNT", "2") |> String.to_integer(),
# Default pool - everything else (email, license manager, etc.)
finch_default_pool_size:
System.get_env("WANDERER_FINCH_DEFAULT_POOL_SIZE", "25") |> String.to_integer(),
finch_default_pool_count:
System.get_env("WANDERER_FINCH_DEFAULT_POOL_COUNT", "2") |> String.to_integer(),
# Character tracker concurrency settings
# Location updates need high concurrency for <2s response with 3000+ characters
location_concurrency:
System.get_env("WANDERER_LOCATION_CONCURRENCY", "#{System.schedulers_online() * 12}")
|> String.to_integer()
config :ueberauth, Ueberauth,
providers: [

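As a rough capacity check (an assumption based on how Finch sizes its default pools, not something stated in the diff), each named Finch instance opens size × count connections per remote origin, so the defaults above work out roughly as follows:

# Back-of-the-envelope totals per origin with the default settings; actual values
# depend on the WANDERER_FINCH_* environment overrides shown above.
finch_connections_per_origin = %{
  esi_character_tracking: 200 * 4,  # 800
  esi_general: 50 * 4,              # 200
  webhooks: 25 * 2,                 # 50
  default: 25 * 2                   # 50
}

# The location concurrency default scales with the host, e.g. 12 schedulers -> 144.
default_location_concurrency = System.schedulers_online() * 12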
View File

@@ -1,8 +1,25 @@
defmodule WandererApp.Api.Changes.SlugifyName do
@moduledoc """
Ensures map slugs are unique by:
1. Slugifying the provided slug/name
2. Checking for existing slugs (optimization)
3. Finding next available slug with numeric suffix if needed
4. Relying on database unique constraint as final arbiter
Race Condition Mitigation:
- Optimistic check reduces DB roundtrips for most cases
- Database unique index ensures no duplicates slip through
- Proper error messages for constraint violations
- Telemetry events for monitoring conflicts
"""
use Ash.Resource.Change
alias Ash.Changeset
require Ash.Query
require Logger
# Maximum number of attempts to find a unique slug
@max_attempts 100
@impl true
@spec change(Changeset.t(), keyword, Change.context()) :: Changeset.t()
@@ -26,7 +43,7 @@ defmodule WandererApp.Api.Changes.SlugifyName do
# Get the current record ID if this is an update operation
current_id = Changeset.get_attribute(changeset, :id)
# Check if the base slug is available
# Check if the base slug is available (optimization to avoid numeric suffixes when possible)
if slug_available?(base_slug, current_id) do
base_slug
else
@@ -35,16 +52,44 @@ defmodule WandererApp.Api.Changes.SlugifyName do
end
end
defp find_available_slug(base_slug, current_id, n) do
defp find_available_slug(base_slug, current_id, n) when n <= @max_attempts do
candidate_slug = "#{base_slug}-#{n}"
if slug_available?(candidate_slug, current_id) do
# Emit telemetry when we had to use a suffix (indicates potential conflict)
:telemetry.execute(
[:wanderer_app, :map, :slug_suffix_used],
%{suffix_number: n},
%{base_slug: base_slug, final_slug: candidate_slug}
)
candidate_slug
else
find_available_slug(base_slug, current_id, n + 1)
end
end
defp find_available_slug(base_slug, _current_id, n) when n > @max_attempts do
# Fallback: use timestamp suffix if we've tried too many numeric suffixes
# This handles edge cases where many maps have similar names
timestamp = System.system_time(:millisecond)
fallback_slug = "#{base_slug}-#{timestamp}"
Logger.warning(
"Slug generation exceeded #{@max_attempts} attempts for '#{base_slug}', using timestamp fallback",
base_slug: base_slug,
fallback_slug: fallback_slug
)
:telemetry.execute(
[:wanderer_app, :map, :slug_fallback_used],
%{attempts: n},
%{base_slug: base_slug, fallback_slug: fallback_slug}
)
fallback_slug
end
defp slug_available?(slug, current_id) do
query =
WandererApp.Api.Map
@@ -60,9 +105,20 @@ defmodule WandererApp.Api.Changes.SlugifyName do
|> Ash.Query.limit(1)
case Ash.read(query) do
{:ok, []} -> true
{:ok, _} -> false
{:error, _} -> false
{:ok, []} ->
true
{:ok, _existing} ->
false
{:error, error} ->
# Log error but be conservative - assume slug is not available
Logger.warning("Error checking slug availability",
slug: slug,
error: inspect(error)
)
false
end
end
end
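The two telemetry events give operators a hook for spotting slug contention. A minimal sketch of a handler, assuming it would be attached during application start; the module name and log wording are hypothetical, only :telemetry.attach_many/4 and the event names above come from the standard API and this diff:

defmodule WandererApp.Example.SlugTelemetry do
  require Logger

  @events [
    [:wanderer_app, :map, :slug_suffix_used],
    [:wanderer_app, :map, :slug_fallback_used]
  ]

  def attach do
    :telemetry.attach_many("slug-conflict-logger", @events, &__MODULE__.handle_event/4, nil)
  end

  # Logs every time a numeric suffix was needed to make a slug unique.
  def handle_event([:wanderer_app, :map, :slug_suffix_used], measurements, metadata, _config) do
    Logger.info(
      "slug conflict: #{metadata.base_slug} -> #{metadata.final_slug} (suffix #{measurements.suffix_number})"
    )
  end

  # Logs the rare timestamp fallback taken after the numeric attempts run out.
  def handle_event([:wanderer_app, :map, :slug_fallback_used], measurements, metadata, _config) do
    Logger.warning(
      "slug fallback: #{metadata.base_slug} -> #{metadata.fallback_slug} after #{measurements.attempts} attempts"
    )
  end
end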

View File

@@ -31,7 +31,7 @@ defmodule WandererApp.Api.Map do
routes do
base("/maps")
get(:by_slug, route: "/:slug")
index :read
# index :read
post(:new)
patch(:update)
delete(:destroy)

View File

@@ -67,6 +67,7 @@ defmodule WandererApp.Api.MapSystem do
code_interface do
define(:create, action: :create)
define(:upsert, action: :upsert)
define(:destroy, action: :destroy)
define(:by_id,
@@ -129,6 +130,31 @@ defmodule WandererApp.Api.MapSystem do
defaults [:create, :update, :destroy]
create :upsert do
primary? false
upsert? true
upsert_identity :map_solar_system_id
# Update these fields on conflict
upsert_fields [
:position_x,
:position_y,
:visible,
:name
]
accept [
:map_id,
:solar_system_id,
:name,
:position_x,
:position_y,
:visible,
:locked,
:status
]
end
read :read do
primary?(true)

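With the code_interface entry above, callers can route system creation through the upsert so re-adding an existing map_solar_system_id updates its position and visibility instead of hitting a unique-constraint error. An illustrative call only; the attribute values are made up and the exact code-interface arity depends on the Ash version in use:

# Illustrative: `map` stands for a previously loaded map record; Jita's
# solar_system_id (30_000_142) is used purely as sample data.
{:ok, _system} =
  WandererApp.Api.MapSystem.upsert(%{
    map_id: map.id,
    solar_system_id: 30_000_142,
    name: "Jita",
    position_x: 120,
    position_y: 340,
    visible: true
  })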
View File

@@ -16,15 +16,48 @@ defmodule WandererApp.Application do
WandererApp.Vault,
WandererApp.Repo,
{Phoenix.PubSub, name: WandererApp.PubSub, adapter_name: Phoenix.PubSub.PG2},
# Multiple Finch pools for different services to prevent connection pool exhaustion
# ESI Character Tracking pool - high capacity for bulk character operations
{
Finch,
name: WandererApp.Finch.ESI.CharacterTracking,
pools: %{
default: [
size: Application.get_env(:wanderer_app, :finch_esi_character_pool_size, 100),
count: Application.get_env(:wanderer_app, :finch_esi_character_pool_count, 4)
]
}
},
# ESI General pool - standard capacity for general ESI operations
{
Finch,
name: WandererApp.Finch.ESI.General,
pools: %{
default: [
size: Application.get_env(:wanderer_app, :finch_esi_general_pool_size, 50),
count: Application.get_env(:wanderer_app, :finch_esi_general_pool_count, 4)
]
}
},
# Webhooks pool - isolated from ESI rate limits
{
Finch,
name: WandererApp.Finch.Webhooks,
pools: %{
default: [
size: Application.get_env(:wanderer_app, :finch_webhooks_pool_size, 25),
count: Application.get_env(:wanderer_app, :finch_webhooks_pool_count, 2)
]
}
},
# Default pool - everything else (email, license manager, etc.)
{
Finch,
name: WandererApp.Finch,
pools: %{
default: [
# number of connections per pool
size: 50,
# number of pools (so total 50 connections)
count: 4
size: Application.get_env(:wanderer_app, :finch_default_pool_size, 25),
count: Application.get_env(:wanderer_app, :finch_default_pool_count, 2)
]
}
},

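The dedicated pools only reduce contention if callers pass the matching pool name to Finch. A minimal sketch of routing a character-tracking ESI call through the new pool; the helper and URL are hypothetical, only Finch.build/3 and Finch.request/2 are real Finch API:

defmodule WandererApp.Example.EsiRequest do
  @moduledoc false

  # Hypothetical helper: send location lookups through the high-capacity
  # character-tracking pool instead of the shared default pool.
  def character_location(character_eve_id, access_token) do
    :get
    |> Finch.build(
      "https://esi.evetech.net/latest/characters/#{character_eve_id}/location/",
      [{"authorization", "Bearer " <> access_token}]
    )
    |> Finch.request(WandererApp.Finch.ESI.CharacterTracking)
  end
end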
View File

@@ -73,6 +73,54 @@ defmodule WandererApp.Cache do
def filter_by_attr_in(type, attr, includes), do: type |> get() |> filter_in(attr, includes)
@doc """
Batch lookup multiple keys from cache.
Returns a map of key => value pairs, with `default` used for missing keys.
"""
def lookup_all(keys, default \\ nil) when is_list(keys) do
# Get all values from cache
values = get_all(keys)
# Build result map with defaults for missing keys
result =
keys
|> Enum.map(fn key ->
value = Map.get(values, key, default)
{key, value}
end)
|> Map.new()
{:ok, result}
end
@doc """
Batch insert multiple key-value pairs into cache.
Accepts a map of key => value pairs or a list of {key, value} tuples.
Skips nil values (deletes the key instead).
"""
def insert_all(entries, opts \\ [])
def insert_all(entries, opts) when is_map(entries) do
# Filter out nil values and delete those keys
{to_delete, to_insert} =
entries
|> Enum.split_with(fn {_key, value} -> is_nil(value) end)
# Delete keys with nil values
Enum.each(to_delete, fn {key, _} -> delete(key) end)
# Insert non-nil values
unless Enum.empty?(to_insert) do
put_all(to_insert, opts)
end
:ok
end
def insert_all(entries, opts) when is_list(entries) do
insert_all(Map.new(entries), opts)
end
defp find(list, %{} = attrs, match: match) do
list
|> Enum.find(fn item ->

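A short sketch of how the new batch helpers could replace per-key roundtrips; the key names follow the character:&lt;id&gt;:... convention seen elsewhere in this diff but are otherwise examples:

# Illustrative batch read: one cache call instead of one per character.
character_ids = ["abc", "def"]
keys = Enum.map(character_ids, &"character:#{&1}:location")
{:ok, _locations} = WandererApp.Cache.lookup_all(keys, nil)

# Illustrative batch write: nil values are deleted rather than stored.
:ok =
  WandererApp.Cache.insert_all(%{
    "character:abc:location" => %{solar_system_id: 30_000_142},
    "character:def:location" => nil
  })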
View File

@@ -1,6 +1,8 @@
defmodule WandererApp.CachedInfo do
require Logger
alias WandererAppWeb.Helpers.APIUtils
def run(_arg) do
:ok = cache_trig_systems()
end
@@ -29,14 +31,71 @@ defmodule WandererApp.CachedInfo do
)
end)
Cachex.get(:ship_types_cache, type_id)
get_ship_type_from_cache_or_api(type_id)
{:ok, ship_type} ->
{:ok, ship_type}
end
end
defp get_ship_type_from_cache_or_api(type_id) do
case Cachex.get(:ship_types_cache, type_id) do
{:ok, ship_type} when not is_nil(ship_type) ->
{:ok, ship_type}
{:ok, nil} ->
case WandererApp.Esi.get_type_info(type_id) do
{:ok, info} when not is_nil(info) ->
ship_type = parse_type(type_id, info)
{:ok, group_info} = get_group_info(ship_type.group_id)
{:ok, ship_type_info} =
WandererApp.Api.ShipTypeInfo |> Ash.create(ship_type |> Map.merge(group_info))
{:ok,
ship_type_info
|> Map.take([
:type_id,
:group_id,
:group_name,
:name,
:description,
:mass,
:capacity,
:volume
])}
{:error, reason} ->
Logger.error("Failed to get ship_type #{type_id} from ESI: #{inspect(reason)}")
{:ok, nil}
error ->
Logger.error("Failed to get ship_type #{type_id} from ESI: #{inspect(error)}")
{:ok, nil}
end
end
end
def get_group_info(nil), do: {:ok, nil}
def get_group_info(group_id) do
case WandererApp.Esi.get_group_info(group_id) do
{:ok, info} when not is_nil(info) ->
{:ok, parse_group(group_id, info)}
{:error, reason} ->
Logger.error("Failed to get group_info #{group_id} from ESI: #{inspect(reason)}")
{:ok, %{group_name: ""}}
error ->
Logger.error("Failed to get group_info #{group_id} from ESI: #{inspect(error)}")
{:ok, %{group_name: ""}}
end
end
def get_system_static_info(solar_system_id) do
{:ok, solar_system_id} = APIUtils.parse_int(solar_system_id)
case Cachex.get(:system_static_info_cache, solar_system_id) do
{:ok, nil} ->
case WandererApp.Api.MapSolarSystem.read() do
@@ -149,6 +208,25 @@ defmodule WandererApp.CachedInfo do
end
end
defp parse_group(group_id, group) do
%{
group_id: group_id,
group_name: Map.get(group, "name")
}
end
defp parse_type(type_id, type) do
%{
type_id: type_id,
name: Map.get(type, "name"),
description: Map.get(type, "description"),
group_id: Map.get(type, "group_id"),
mass: "#{Map.get(type, "mass")}",
capacity: "#{Map.get(type, "capacity")}",
volume: "#{Map.get(type, "volume")}"
}
end
defp build_jump_index() do
case get_solar_system_jumps() do
{:ok, jumps} ->

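On a cache miss the ship-type lookup now falls through to ESI and persists a ShipTypeInfo record, so the first request for an unseen hull is slower and later ones are served from :ship_types_cache. An illustrative call, assuming the public entry point is get_ship_type/1 (its head is not shown in this hunk) and using the Capsule's type ID purely as sample data:

case WandererApp.CachedInfo.get_ship_type(670) do
  {:ok, %{name: name, group_name: group_name}} ->
    # Served from cache, or freshly fetched from ESI and stored.
    {name, group_name}

  {:ok, nil} ->
    # ESI lookup failed; the error was logged and nil returned.
    :unknown_ship_type
end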
View File

@@ -4,6 +4,8 @@ defmodule WandererApp.Character do
require Logger
alias WandererApp.Cache
@read_character_wallet_scope "esi-wallet.read_character_wallet.v1"
@read_corp_wallet_scope "esi-wallet.read_corporation_wallets.v1"
@@ -16,6 +18,9 @@ defmodule WandererApp.Character do
ship_item_id: nil
}
@present_on_map_ttl :timer.seconds(10)
@not_present_on_map_ttl :timer.minutes(2)
def get_by_eve_id(character_eve_id) when is_binary(character_eve_id) do
WandererApp.Api.Character.by_eve_id(character_eve_id)
end
@@ -41,7 +46,7 @@ defmodule WandererApp.Character do
def get_character!(character_id) do
case get_character(character_id) do
{:ok, character} ->
{:ok, character} when not is_nil(character) ->
character
_ ->
@@ -50,16 +55,10 @@ defmodule WandererApp.Character do
end
end
def get_map_character(map_id, character_id, opts \\ []) do
def get_map_character(map_id, character_id) do
case get_character(character_id) do
{:ok, character} ->
# If we are forcing the character to not be present, we merge the character state with map settings
character_is_present =
if opts |> Keyword.get(:not_present, false) do
false
else
WandererApp.Character.TrackerManager.Impl.character_is_present(map_id, character_id)
end
{:ok, character} when not is_nil(character) ->
character_is_present = character_is_present?(map_id, character_id)
{:ok,
character
@@ -187,6 +186,10 @@ defmodule WandererApp.Character do
{:ok, result} ->
{:ok, result |> prepare_search_results()}
{:error, error} ->
Logger.warning("#{__MODULE__} failed search: #{inspect(error)}")
{:ok, []}
error ->
Logger.warning("#{__MODULE__} failed search: #{inspect(error)}")
{:ok, []}
@@ -263,22 +266,26 @@ defmodule WandererApp.Character do
end
end
defp maybe_merge_map_character_settings(%{id: character_id} = character, _map_id, true) do
{:ok, tracking_paused} =
WandererApp.Cache.lookup("character:#{character_id}:tracking_paused", false)
@decorate cacheable(
cache: Cache,
key: "character-present-#{map_id}-#{character_id}",
opts: [ttl: @present_on_map_ttl]
)
defp character_is_present?(map_id, character_id),
do: WandererApp.Character.TrackerManager.Impl.character_is_present(map_id, character_id)
character
|> Map.merge(%{tracking_paused: tracking_paused})
end
defp maybe_merge_map_character_settings(character, _map_id, true), do: character
@decorate cacheable(
cache: Cache,
key: "not-present-map-character-#{map_id}-#{character_id}",
opts: [ttl: @not_present_on_map_ttl]
)
defp maybe_merge_map_character_settings(
%{id: character_id} = character,
map_id,
_character_is_present
false
) do
{:ok, tracking_paused} =
WandererApp.Cache.lookup("character:#{character_id}:tracking_paused", false)
WandererApp.MapCharacterSettingsRepo.get(map_id, character_id)
|> case do
{:ok, settings} when not is_nil(settings) ->
@@ -296,7 +303,7 @@ defmodule WandererApp.Character do
character
|> Map.merge(@default_character_tracking_data)
end
|> Map.merge(%{online: false, tracking_paused: tracking_paused})
|> Map.merge(%{online: false})
end
defp prepare_search_results(result) do
@@ -324,7 +331,7 @@ defmodule WandererApp.Character do
do:
{:ok,
Enum.map(eve_ids, fn eve_id ->
Task.async(fn -> apply(WandererApp.Esi.ApiClient, method, [eve_id]) end)
Task.async(fn -> apply(WandererApp.Esi, method, [eve_id]) end)
end)
# 145000 == Timeout in milliseconds
|> Enum.map(fn task -> Task.await(task, 145_000) end)

View File

@@ -14,8 +14,8 @@ defmodule WandererApp.Character.Tracker do
active_maps: [],
is_online: false,
track_online: true,
track_location: true,
track_ship: true,
track_location: false,
track_ship: false,
track_wallet: false,
status: "new"
]
@@ -36,14 +36,11 @@ defmodule WandererApp.Character.Tracker do
status: binary()
}
@pause_tracking_timeout :timer.minutes(60 * 10)
@offline_timeout :timer.minutes(5)
@online_error_timeout :timer.minutes(10)
@ship_error_timeout :timer.minutes(10)
@location_error_timeout :timer.minutes(10)
@location_error_timeout :timer.seconds(30)
@location_error_threshold 3
@online_forbidden_ttl :timer.seconds(7)
@offline_check_delay_ttl :timer.seconds(15)
@online_limit_ttl :timer.seconds(7)
@forbidden_ttl :timer.seconds(10)
@limit_ttl :timer.seconds(5)
@location_limit_ttl :timer.seconds(1)
@@ -93,81 +90,16 @@ defmodule WandererApp.Character.Tracker do
end
end
def check_online_errors(character_id),
do: check_tracking_errors(character_id, "online", @online_error_timeout)
def check_ship_errors(character_id),
do: check_tracking_errors(character_id, "ship", @ship_error_timeout)
def check_location_errors(character_id),
do: check_tracking_errors(character_id, "location", @location_error_timeout)
defp check_tracking_errors(character_id, type, timeout) do
WandererApp.Cache.lookup!("character:#{character_id}:#{type}_error_time")
|> case do
nil ->
:skip
error_time ->
duration = DateTime.diff(DateTime.utc_now(), error_time, :millisecond)
if duration >= timeout do
pause_tracking(character_id)
WandererApp.Cache.delete("character:#{character_id}:#{type}_error_time")
:ok
else
:skip
end
end
defp increment_location_error_count(character_id) do
cache_key = "character:#{character_id}:location_error_count"
current_count = WandererApp.Cache.lookup!(cache_key) || 0
new_count = current_count + 1
WandererApp.Cache.put(cache_key, new_count)
new_count
end
defp pause_tracking(character_id) do
if WandererApp.Character.can_pause_tracking?(character_id) &&
not WandererApp.Cache.has_key?("character:#{character_id}:tracking_paused") do
# Log character tracking statistics before pausing
Logger.debug(fn ->
{:ok, character_state} = WandererApp.Character.get_character_state(character_id)
"CHARACTER_TRACKING_PAUSED: Character tracking paused due to sustained errors: #{inspect(character_id: character_id,
active_maps: length(character_state.active_maps),
is_online: character_state.is_online,
tracking_duration_minutes: get_tracking_duration_minutes(character_id))}"
end)
WandererApp.Cache.delete("character:#{character_id}:online_forbidden")
WandererApp.Cache.delete("character:#{character_id}:online_error_time")
WandererApp.Cache.delete("character:#{character_id}:ship_error_time")
WandererApp.Cache.delete("character:#{character_id}:location_error_time")
WandererApp.Character.update_character(character_id, %{online: false})
WandererApp.Character.update_character_state(character_id, %{
is_online: false
})
# Original log kept for backward compatibility
Logger.warning("[CharacterTracker] paused for #{character_id}")
WandererApp.Cache.put(
"character:#{character_id}:tracking_paused",
true,
ttl: @pause_tracking_timeout
)
{:ok, %{solar_system_id: solar_system_id}} =
WandererApp.Character.get_character(character_id)
{:ok, %{active_maps: active_maps}} =
WandererApp.Character.get_character_state(character_id)
active_maps
|> Enum.each(fn map_id ->
WandererApp.Cache.put(
"map:#{map_id}:character:#{character_id}:start_solar_system_id",
solar_system_id
)
end)
end
defp reset_location_error_count(character_id) do
WandererApp.Cache.delete("character:#{character_id}:location_error_count")
end
def update_settings(character_id, track_settings) do
@@ -194,8 +126,7 @@ defmodule WandererApp.Character.Tracker do
case WandererApp.Character.get_character(character_id) do
{:ok, %{eve_id: eve_id, access_token: access_token, tracking_pool: tracking_pool}}
when not is_nil(access_token) ->
(WandererApp.Cache.has_key?("character:#{character_id}:online_forbidden") ||
WandererApp.Cache.has_key?("character:#{character_id}:tracking_paused"))
WandererApp.Cache.has_key?("character:#{character_id}:online_forbidden")
|> case do
true ->
{:error, :skipped}
@@ -224,9 +155,10 @@ defmodule WandererApp.Character.Tracker do
)
end
if online.online == true && online.online != is_online do
if online.online == true && not is_online do
WandererApp.Cache.delete("character:#{character_id}:ship_error_time")
WandererApp.Cache.delete("character:#{character_id}:location_error_time")
WandererApp.Cache.delete("character:#{character_id}:location_error_count")
WandererApp.Cache.delete("character:#{character_id}:info_forbidden")
WandererApp.Cache.delete("character:#{character_id}:ship_forbidden")
WandererApp.Cache.delete("character:#{character_id}:location_forbidden")
@@ -294,12 +226,6 @@ defmodule WandererApp.Character.Tracker do
{:error, :error_limited, headers} ->
reset_timeout = get_reset_timeout(headers)
reset_seconds =
Map.get(headers, "x-esi-error-limit-reset", ["unknown"]) |> List.first()
remaining =
Map.get(headers, "x-esi-error-limit-remain", ["unknown"]) |> List.first()
WandererApp.Cache.put(
"character:#{character_id}:online_forbidden",
true,
@@ -357,8 +283,7 @@ defmodule WandererApp.Character.Tracker do
defp get_reset_timeout(_headers, default_timeout), do: default_timeout
def update_info(character_id) do
(WandererApp.Cache.has_key?("character:#{character_id}:info_forbidden") ||
WandererApp.Cache.has_key?("character:#{character_id}:tracking_paused"))
WandererApp.Cache.has_key?("character:#{character_id}:info_forbidden")
|> case do
true ->
{:error, :skipped}
@@ -442,8 +367,7 @@ defmodule WandererApp.Character.Tracker do
{:ok, %{eve_id: eve_id, access_token: access_token, tracking_pool: tracking_pool}}
when not is_nil(access_token) ->
(WandererApp.Cache.has_key?("character:#{character_id}:online_forbidden") ||
WandererApp.Cache.has_key?("character:#{character_id}:ship_forbidden") ||
WandererApp.Cache.has_key?("character:#{character_id}:tracking_paused"))
WandererApp.Cache.has_key?("character:#{character_id}:ship_forbidden"))
|> case do
true ->
{:error, :skipped}
@@ -552,7 +476,7 @@ defmodule WandererApp.Character.Tracker do
case WandererApp.Character.get_character(character_id) do
{:ok, %{eve_id: eve_id, access_token: access_token, tracking_pool: tracking_pool}}
when not is_nil(access_token) ->
WandererApp.Cache.has_key?("character:#{character_id}:tracking_paused")
WandererApp.Cache.has_key?("character:#{character_id}:location_forbidden")
|> case do
true ->
{:error, :skipped}
@@ -565,19 +489,33 @@ defmodule WandererApp.Character.Tracker do
character_id: character_id
) do
{:ok, location} when is_map(location) and not is_struct(location) ->
reset_location_error_count(character_id)
WandererApp.Cache.delete("character:#{character_id}:location_error_time")
character_state
|> maybe_update_location(location)
:ok
{:error, error} when error in [:forbidden, :not_found, :timeout] ->
error_count = increment_location_error_count(character_id)
Logger.warning("ESI_ERROR: Character location tracking failed",
character_id: character_id,
tracking_pool: tracking_pool,
error_type: error,
error_count: error_count,
endpoint: "character_location"
)
if error_count >= @location_error_threshold do
WandererApp.Cache.put(
"character:#{character_id}:location_forbidden",
true,
ttl: @location_error_timeout
)
end
if is_nil(
WandererApp.Cache.lookup!("character:#{character_id}:location_error_time")
) do
@@ -601,13 +539,24 @@ defmodule WandererApp.Character.Tracker do
{:error, :error_limited}
{:error, error} ->
error_count = increment_location_error_count(character_id)
Logger.error("ESI_ERROR: Character location tracking failed: #{inspect(error)}",
character_id: character_id,
tracking_pool: tracking_pool,
error_type: error,
error_count: error_count,
endpoint: "character_location"
)
if error_count >= @location_error_threshold do
WandererApp.Cache.put(
"character:#{character_id}:location_forbidden",
true,
ttl: @location_error_timeout
)
end
if is_nil(
WandererApp.Cache.lookup!("character:#{character_id}:location_error_time")
) do
@@ -620,13 +569,24 @@ defmodule WandererApp.Character.Tracker do
{:error, :skipped}
_ ->
error_count = increment_location_error_count(character_id)
Logger.error("ESI_ERROR: Character location tracking failed - wrong response",
character_id: character_id,
tracking_pool: tracking_pool,
error_type: "wrong_response",
error_count: error_count,
endpoint: "character_location"
)
if error_count >= @location_error_threshold do
WandererApp.Cache.put(
"character:#{character_id}:location_forbidden",
true,
ttl: @location_error_timeout
)
end
if is_nil(
WandererApp.Cache.lookup!("character:#{character_id}:location_error_time")
) do
@@ -662,8 +622,7 @@ defmodule WandererApp.Character.Tracker do
|> case do
true ->
(WandererApp.Cache.has_key?("character:#{character_id}:online_forbidden") ||
WandererApp.Cache.has_key?("character:#{character_id}:wallet_forbidden") ||
WandererApp.Cache.has_key?("character:#{character_id}:tracking_paused"))
WandererApp.Cache.has_key?("character:#{character_id}:wallet_forbidden"))
|> case do
true ->
{:error, :skipped}
@@ -750,6 +709,7 @@ defmodule WandererApp.Character.Tracker do
end
end
# when old_alliance_id != alliance_id and is_nil(alliance_id)
defp maybe_update_alliance(
%{character_id: character_id, alliance_id: old_alliance_id} = state,
alliance_id
@@ -775,6 +735,7 @@ defmodule WandererApp.Character.Tracker do
)
state
|> Map.merge(%{alliance_id: nil})
end
defp maybe_update_alliance(
@@ -782,8 +743,7 @@ defmodule WandererApp.Character.Tracker do
alliance_id
)
when old_alliance_id != alliance_id do
(WandererApp.Cache.has_key?("character:#{character_id}:online_forbidden") ||
WandererApp.Cache.has_key?("character:#{character_id}:tracking_paused"))
WandererApp.Cache.has_key?("character:#{character_id}:online_forbidden")
|> case do
true ->
state
@@ -813,6 +773,7 @@ defmodule WandererApp.Character.Tracker do
)
state
|> Map.merge(%{alliance_id: alliance_id})
_error ->
Logger.error("Failed to get alliance info for #{alliance_id}")
@@ -829,8 +790,7 @@ defmodule WandererApp.Character.Tracker do
)
when old_corporation_id != corporation_id do
(WandererApp.Cache.has_key?("character:#{character_id}:online_forbidden") ||
WandererApp.Cache.has_key?("character:#{character_id}:corporation_info_forbidden") ||
WandererApp.Cache.has_key?("character:#{character_id}:tracking_paused"))
WandererApp.Cache.has_key?("character:#{character_id}:corporation_info_forbidden"))
|> case do
true ->
state
@@ -1006,9 +966,7 @@ defmodule WandererApp.Character.Tracker do
),
do: %{
state
| track_online: true,
track_location: true,
track_ship: true
| track_online: true
}
defp maybe_start_online_tracking(
@@ -1052,11 +1010,6 @@ defmodule WandererApp.Character.Tracker do
DateTime.utc_now()
)
WandererApp.Cache.put(
"map:#{map_id}:character:#{character_id}:start_solar_system_id",
track_settings |> Map.get(:solar_system_id)
)
WandererApp.Cache.delete("map:#{map_id}:character:#{character_id}:solar_system_id")
WandererApp.Cache.delete("map:#{map_id}:character:#{character_id}:station_id")
WandererApp.Cache.delete("map:#{map_id}:character:#{character_id}:structure_id")
@@ -1107,7 +1060,7 @@ defmodule WandererApp.Character.Tracker do
)
end
state
%{state | track_location: false, track_ship: false}
end
defp maybe_stop_tracking(
@@ -1137,19 +1090,6 @@ defmodule WandererApp.Character.Tracker do
defp get_online(_), do: %{online: false}
defp get_tracking_duration_minutes(character_id) do
case WandererApp.Cache.lookup!("character:#{character_id}:map:*:tracking_start_time") do
nil ->
0
start_time when is_struct(start_time, DateTime) ->
DateTime.diff(DateTime.utc_now(), start_time, :minute)
_ ->
0
end
end
# Telemetry handler for database pool monitoring
def handle_pool_query(_event_name, measurements, metadata, _config) do
queue_time = measurements[:queue_time]
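handle_pool_query/4 is a plain telemetry handler; a hypothetical attachment sketch (the Ecto event name follows the usual [:otp_app, :repo, :query] convention and is an assumption here, not confirmed by this diff):

# Attach the tracker's pool monitor to Ecto query telemetry (assumed event name).
:telemetry.attach(
  "tracker-db-pool-monitor",
  [:wanderer_app, :repo, :query],
  &WandererApp.Character.Tracker.handle_pool_query/4,
  nil
)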

View File

@@ -14,8 +14,8 @@ defmodule WandererApp.Character.TrackerManager do
GenServer.start_link(__MODULE__, args, name: __MODULE__)
end
def start_tracking(character_id, opts \\ []),
do: GenServer.cast(__MODULE__, {&Impl.start_tracking/3, [character_id, opts]})
def start_tracking(character_id),
do: GenServer.cast(__MODULE__, {&Impl.start_tracking/2, [character_id]})
def stop_tracking(character_id),
do: GenServer.cast(__MODULE__, {&Impl.stop_tracking/2, [character_id]})

View File

@@ -40,13 +40,13 @@ defmodule WandererApp.Character.TrackerManager.Impl do
tracked_characters
|> Enum.each(fn character_id ->
start_tracking(state, character_id, %{})
start_tracking(state, character_id)
end)
state
end
def start_tracking(state, character_id, opts) do
def start_tracking(state, character_id) do
if not WandererApp.Cache.has_key?("#{character_id}:track_requested") do
WandererApp.Cache.insert(
"#{character_id}:track_requested",

View File

@@ -8,7 +8,8 @@ defmodule WandererApp.Character.TrackerPool do
:tracked_ids,
:uuid,
:characters,
server_online: false
server_online: false,
last_location_duration: 0
]
@name __MODULE__
@@ -19,13 +20,19 @@ defmodule WandererApp.Character.TrackerPool do
@update_location_interval :timer.seconds(1)
@update_online_interval :timer.seconds(30)
@check_offline_characters_interval :timer.minutes(5)
@check_online_errors_interval :timer.minutes(1)
@check_ship_errors_interval :timer.minutes(1)
@check_location_errors_interval :timer.minutes(1)
@update_ship_interval :timer.seconds(2)
@update_info_interval :timer.minutes(2)
@update_wallet_interval :timer.minutes(10)
# Per-operation concurrency limits
# Location updates are critical and need high concurrency (100 chars in ~200ms)
# Note: This is fetched at runtime since it's configured via runtime.exs
defp location_concurrency do
Application.get_env(:wanderer_app, :location_concurrency, System.schedulers_online() * 12)
end
# Other operations can use lower concurrency
@standard_concurrency System.schedulers_online() * 2
@logger Application.compile_env(:wanderer_app, :logger)
def new(), do: __struct__()
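location_concurrency/0 above falls back to System.schedulers_online() * 12 when nothing is configured; a minimal sketch of the assumed runtime.exs entry (the key comes from the Application.get_env/3 call, the value is illustrative):

import Config

# Tune location-update concurrency per deployment without recompiling.
config :wanderer_app,
  location_concurrency: System.schedulers_online() * 12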
@@ -109,17 +116,23 @@ defmodule WandererApp.Character.TrackerPool do
"server_status"
)
Process.send_after(self(), :update_online, 100)
Process.send_after(self(), :check_online_errors, :timer.seconds(60))
Process.send_after(self(), :check_ship_errors, :timer.seconds(90))
Process.send_after(self(), :check_location_errors, :timer.seconds(120))
# Stagger pool startups to distribute load across multiple pools
# Critical location updates get minimal stagger (0-500ms)
# Other operations get wider stagger (0-10s) to reduce thundering herd
location_stagger = :rand.uniform(500)
online_stagger = :rand.uniform(10_000)
ship_stagger = :rand.uniform(10_000)
info_stagger = :rand.uniform(60_000)
Process.send_after(self(), :update_online, 100 + online_stagger)
Process.send_after(self(), :update_location, 300 + location_stagger)
Process.send_after(self(), :update_ship, 500 + ship_stagger)
Process.send_after(self(), :update_info, 1500 + info_stagger)
Process.send_after(self(), :check_offline_characters, @check_offline_characters_interval)
Process.send_after(self(), :update_location, 300)
Process.send_after(self(), :update_ship, 500)
Process.send_after(self(), :update_info, 1500)
if WandererApp.Env.wallet_tracking_enabled?() do
Process.send_after(self(), :update_wallet, 1000)
wallet_stagger = :rand.uniform(120_000)
Process.send_after(self(), :update_wallet, 1000 + wallet_stagger)
end
{:noreply, state}
@@ -169,7 +182,7 @@ defmodule WandererApp.Character.TrackerPool do
fn character_id ->
WandererApp.Character.Tracker.update_online(character_id)
end,
max_concurrency: System.schedulers_online() * 4,
max_concurrency: @standard_concurrency,
on_timeout: :kill_task,
timeout: :timer.seconds(5)
)
@@ -232,7 +245,7 @@ defmodule WandererApp.Character.TrackerPool do
WandererApp.Character.Tracker.check_offline(character_id)
end,
timeout: :timer.seconds(15),
max_concurrency: System.schedulers_online() * 4,
max_concurrency: @standard_concurrency,
on_timeout: :kill_task
)
|> Enum.each(fn
@@ -250,126 +263,6 @@ defmodule WandererApp.Character.TrackerPool do
{:noreply, state}
end
def handle_info(
:check_online_errors,
%{
characters: characters
} =
state
) do
Process.send_after(self(), :check_online_errors, @check_online_errors_interval)
try do
characters
|> Task.async_stream(
fn character_id ->
WandererApp.TaskWrapper.start_link(
WandererApp.Character.Tracker,
:check_online_errors,
[
character_id
]
)
end,
timeout: :timer.seconds(15),
max_concurrency: System.schedulers_online() * 4,
on_timeout: :kill_task
)
|> Enum.each(fn
{:ok, _result} -> :ok
error -> @logger.error("Error in check_online_errors: #{inspect(error)}")
end)
rescue
e ->
Logger.error("""
[Tracker Pool] check_online_errors => exception: #{Exception.message(e)}
#{Exception.format_stacktrace(__STACKTRACE__)}
""")
end
{:noreply, state}
end
def handle_info(
:check_ship_errors,
%{
characters: characters
} =
state
) do
Process.send_after(self(), :check_ship_errors, @check_ship_errors_interval)
try do
characters
|> Task.async_stream(
fn character_id ->
WandererApp.TaskWrapper.start_link(
WandererApp.Character.Tracker,
:check_ship_errors,
[
character_id
]
)
end,
timeout: :timer.seconds(15),
max_concurrency: System.schedulers_online() * 4,
on_timeout: :kill_task
)
|> Enum.each(fn
{:ok, _result} -> :ok
error -> @logger.error("Error in check_ship_errors: #{inspect(error)}")
end)
rescue
e ->
Logger.error("""
[Tracker Pool] check_ship_errors => exception: #{Exception.message(e)}
#{Exception.format_stacktrace(__STACKTRACE__)}
""")
end
{:noreply, state}
end
def handle_info(
:check_location_errors,
%{
characters: characters
} =
state
) do
Process.send_after(self(), :check_location_errors, @check_location_errors_interval)
try do
characters
|> Task.async_stream(
fn character_id ->
WandererApp.TaskWrapper.start_link(
WandererApp.Character.Tracker,
:check_location_errors,
[
character_id
]
)
end,
timeout: :timer.seconds(15),
max_concurrency: System.schedulers_online() * 4,
on_timeout: :kill_task
)
|> Enum.each(fn
{:ok, _result} -> :ok
error -> @logger.error("Error in check_location_errors: #{inspect(error)}")
end)
rescue
e ->
Logger.error("""
[Tracker Pool] check_location_errors => exception: #{Exception.message(e)}
#{Exception.format_stacktrace(__STACKTRACE__)}
""")
end
{:noreply, state}
end
def handle_info(
:update_location,
%{
@@ -380,26 +273,52 @@ defmodule WandererApp.Character.TrackerPool do
) do
Process.send_after(self(), :update_location, @update_location_interval)
start_time = System.monotonic_time(:millisecond)
try do
characters
|> Task.async_stream(
fn character_id ->
WandererApp.Character.Tracker.update_location(character_id)
end,
max_concurrency: System.schedulers_online() * 4,
max_concurrency: location_concurrency(),
on_timeout: :kill_task,
timeout: :timer.seconds(5)
)
|> Enum.each(fn _result -> :ok end)
# Emit telemetry for location update performance
duration = System.monotonic_time(:millisecond) - start_time
:telemetry.execute(
[:wanderer_app, :tracker_pool, :location_update],
%{duration: duration, character_count: length(characters)},
%{pool_uuid: state.uuid}
)
# Warn if location updates are falling behind (taking > 800ms for 100 chars)
if duration > 800 do
Logger.warning(
"[Tracker Pool] Location updates falling behind: #{duration}ms for #{length(characters)} chars (pool: #{state.uuid})"
)
:telemetry.execute(
[:wanderer_app, :tracker_pool, :location_lag],
%{duration: duration, character_count: length(characters)},
%{pool_uuid: state.uuid}
)
end
{:noreply, %{state | last_location_duration: duration}}
rescue
e ->
Logger.error("""
[Tracker Pool] update_location => exception: #{Exception.message(e)}
#{Exception.format_stacktrace(__STACKTRACE__)}
""")
end
{:noreply, state}
{:noreply, state}
end
end
def handle_info(
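The handler above emits [:wanderer_app, :tracker_pool, :location_update] on every pass and :location_lag when a pass exceeds 800ms; a hypothetical consumer sketch (handler id and output are illustrative):

:telemetry.attach(
  "tracker-pool-location-lag-logger",
  [:wanderer_app, :tracker_pool, :location_lag],
  fn _event, %{duration: duration, character_count: count}, %{pool_uuid: uuid}, _config ->
    # React to lagging pools, e.g. log or push to a metrics sink.
    IO.puts("location lag: #{duration}ms for #{count} characters (pool #{uuid})")
  end,
  nil
)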
@@ -415,32 +334,48 @@ defmodule WandererApp.Character.TrackerPool do
:update_ship,
%{
characters: characters,
server_online: true
server_online: true,
last_location_duration: location_duration
} =
state
) do
Process.send_after(self(), :update_ship, @update_ship_interval)
try do
characters
|> Task.async_stream(
fn character_id ->
WandererApp.Character.Tracker.update_ship(character_id)
end,
max_concurrency: System.schedulers_online() * 4,
on_timeout: :kill_task,
timeout: :timer.seconds(5)
# Backpressure: Skip ship updates if location updates are falling behind
if location_duration > 1000 do
Logger.debug(
"[Tracker Pool] Skipping ship update due to location lag (#{location_duration}ms)"
)
|> Enum.each(fn _result -> :ok end)
rescue
e ->
Logger.error("""
[Tracker Pool] update_ship => exception: #{Exception.message(e)}
#{Exception.format_stacktrace(__STACKTRACE__)}
""")
end
{:noreply, state}
:telemetry.execute(
[:wanderer_app, :tracker_pool, :ship_skipped],
%{count: 1},
%{pool_uuid: state.uuid, reason: :location_lag}
)
{:noreply, state}
else
try do
characters
|> Task.async_stream(
fn character_id ->
WandererApp.Character.Tracker.update_ship(character_id)
end,
max_concurrency: @standard_concurrency,
on_timeout: :kill_task,
timeout: :timer.seconds(5)
)
|> Enum.each(fn _result -> :ok end)
rescue
e ->
Logger.error("""
[Tracker Pool] update_ship => exception: #{Exception.message(e)}
#{Exception.format_stacktrace(__STACKTRACE__)}
""")
end
{:noreply, state}
end
end
def handle_info(
@@ -456,35 +391,51 @@ defmodule WandererApp.Character.TrackerPool do
:update_info,
%{
characters: characters,
server_online: true
server_online: true,
last_location_duration: location_duration
} =
state
) do
Process.send_after(self(), :update_info, @update_info_interval)
try do
characters
|> Task.async_stream(
fn character_id ->
WandererApp.Character.Tracker.update_info(character_id)
end,
timeout: :timer.seconds(15),
max_concurrency: System.schedulers_online() * 4,
on_timeout: :kill_task
# Backpressure: Skip info updates if location updates are severely falling behind
if location_duration > 1500 do
Logger.debug(
"[Tracker Pool] Skipping info update due to location lag (#{location_duration}ms)"
)
|> Enum.each(fn
{:ok, _result} -> :ok
error -> Logger.error("Error in update_info: #{inspect(error)}")
end)
rescue
e ->
Logger.error("""
[Tracker Pool] update_info => exception: #{Exception.message(e)}
#{Exception.format_stacktrace(__STACKTRACE__)}
""")
end
{:noreply, state}
:telemetry.execute(
[:wanderer_app, :tracker_pool, :info_skipped],
%{count: 1},
%{pool_uuid: state.uuid, reason: :location_lag}
)
{:noreply, state}
else
try do
characters
|> Task.async_stream(
fn character_id ->
WandererApp.Character.Tracker.update_info(character_id)
end,
timeout: :timer.seconds(15),
max_concurrency: @standard_concurrency,
on_timeout: :kill_task
)
|> Enum.each(fn
{:ok, _result} -> :ok
error -> Logger.error("Error in update_info: #{inspect(error)}")
end)
rescue
e ->
Logger.error("""
[Tracker Pool] update_info => exception: #{Exception.message(e)}
#{Exception.format_stacktrace(__STACKTRACE__)}
""")
end
{:noreply, state}
end
end
def handle_info(
@@ -513,7 +464,7 @@ defmodule WandererApp.Character.TrackerPool do
WandererApp.Character.Tracker.update_wallet(character_id)
end,
timeout: :timer.minutes(5),
max_concurrency: System.schedulers_online() * 4,
max_concurrency: @standard_concurrency,
on_timeout: :kill_task
)
|> Enum.each(fn

View File

@@ -52,7 +52,7 @@ defmodule WandererApp.Character.TrackerPoolDynamicSupervisor do
defp get_available_pool([]), do: nil
defp get_available_pool([{pid, uuid} | pools]) do
defp get_available_pool([{_pid, uuid} | pools]) do
case Registry.lookup(@unique_registry, Module.concat(WandererApp.Character.TrackerPool, uuid)) do
[] ->
nil
@@ -62,8 +62,8 @@ defmodule WandererApp.Character.TrackerPoolDynamicSupervisor do
nil ->
get_available_pool(pools)
pid ->
pid
pool_pid ->
pool_pid
end
end
end

View File

@@ -2,6 +2,8 @@ defmodule WandererApp.Esi do
@moduledoc group: :esi
defdelegate get_server_status, to: WandererApp.Esi.ApiClient
defdelegate get_group_info(group_id, opts \\ []), to: WandererApp.Esi.ApiClient
defdelegate get_type_info(type_id, opts \\ []), to: WandererApp.Esi.ApiClient
defdelegate get_alliance_info(eve_id, opts \\ []), to: WandererApp.Esi.ApiClient
defdelegate get_corporation_info(eve_id, opts \\ []), to: WandererApp.Esi.ApiClient
defdelegate get_character_info(eve_id, opts \\ []), to: WandererApp.Esi.ApiClient

View File

@@ -17,6 +17,17 @@ defmodule WandererApp.Esi.ApiClient do
@logger Application.compile_env(:wanderer_app, :logger)
# Pool selection for different operation types
# Character tracking operations use dedicated high-capacity pool
@character_tracking_pool WandererApp.Finch.ESI.CharacterTracking
# General ESI operations use standard pool
@general_pool WandererApp.Finch.ESI.General
# Helper function to get Req options with appropriate Finch pool
defp req_options_for_pool(pool) do
[base_url: "https://esi.evetech.net", finch: pool]
end
def get_server_status, do: do_get("/status", [], @cache_opts)
def set_autopilot_waypoint(add_to_beginning, clear_other_waypoints, destination_id, opts \\ []),
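The two pool names above (WandererApp.Finch.ESI.CharacterTracking and WandererApp.Finch.ESI.General) are assumed to be separate Finch instances started in the supervision tree; a hypothetical child-spec sketch (pool sizes are illustrative, not taken from this changeset):

children = [
  # Dedicated high-capacity pool for character tracking calls.
  {Finch,
   name: WandererApp.Finch.ESI.CharacterTracking,
   pools: %{"https://esi.evetech.net" => [size: 50, count: 4]}},
  # Standard pool for all other ESI traffic.
  {Finch,
   name: WandererApp.Finch.ESI.General,
   pools: %{"https://esi.evetech.net" => [size: 25, count: 2]}}
]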
@@ -38,10 +49,13 @@ defmodule WandererApp.Esi.ApiClient do
do:
do_post_esi(
"/characters/affiliation/",
json: character_eve_ids,
params: %{
datasource: "tranquility"
}
[
json: character_eve_ids,
params: %{
datasource: "tranquility"
}
],
@character_tracking_pool
)
def get_routes_custom(hubs, origin, params),
@@ -116,7 +130,33 @@ defmodule WandererApp.Esi.ApiClient do
@decorate cacheable(
cache: Cache,
key: "info-#{eve_id}",
key: "group-info-#{group_id}",
opts: [ttl: @ttl]
)
def get_group_info(group_id, opts),
do:
do_get(
"/universe/groups/#{group_id}/",
opts,
@cache_opts
)
@decorate cacheable(
cache: Cache,
key: "type-info-#{type_id}",
opts: [ttl: @ttl]
)
def get_type_info(type_id, opts),
do:
do_get(
"/universe/types/#{type_id}/",
opts,
@cache_opts
)
@decorate cacheable(
cache: Cache,
key: "alliance-info-#{eve_id}",
opts: [ttl: @ttl]
)
def get_alliance_info(eve_id, opts \\ []) do
@@ -137,7 +177,7 @@ defmodule WandererApp.Esi.ApiClient do
@decorate cacheable(
cache: Cache,
key: "info-#{eve_id}",
key: "corporation-info-#{eve_id}",
opts: [ttl: @ttl]
)
def get_corporation_info(eve_id, opts \\ []) do
@@ -150,7 +190,7 @@ defmodule WandererApp.Esi.ApiClient do
@decorate cacheable(
cache: Cache,
key: "info-#{eve_id}",
key: "character-info-#{eve_id}",
opts: [ttl: @ttl]
)
def get_character_info(eve_id, opts \\ []) do
@@ -203,8 +243,17 @@ defmodule WandererApp.Esi.ApiClient do
do: get_character_auth_data(character_eve_id, "ship", opts ++ @cache_opts)
def search(character_eve_id, opts \\ []) do
search_val = to_string(opts[:params][:search] || "")
categories_val = to_string(opts[:params][:categories] || "character,alliance,corporation")
params = Keyword.get(opts, :params, %{}) |> Map.new()
search_val =
to_string(Map.get(params, :search) || Map.get(params, "search") || "")
categories_val =
to_string(
Map.get(params, :categories) ||
Map.get(params, "categories") ||
"character,alliance,corporation"
)
query_params = [
{"search", search_val},
@@ -220,7 +269,7 @@ defmodule WandererApp.Esi.ApiClient do
@decorate cacheable(
cache: Cache,
key: "search-#{character_eve_id}-#{categories_val}-#{search_val |> Slug.slugify()}",
key: "search-#{character_eve_id}-#{categories_val}-#{Base.encode64(search_val)}",
opts: [ttl: @ttl]
)
defp get_search(character_eve_id, search_val, categories_val, merged_opts) do
@@ -254,14 +303,18 @@ defmodule WandererApp.Esi.ApiClient do
character_id = opts |> Keyword.get(:character_id, nil)
# Use character tracking pool for character operations
pool = @character_tracking_pool
if not is_access_token_expired?(character_id) do
do_get(
path,
auth_opts,
opts |> with_refresh_token()
opts |> with_refresh_token(),
pool
)
else
do_get_retry(path, auth_opts, opts |> with_refresh_token())
do_get_retry(path, auth_opts, opts |> with_refresh_token(), :forbidden, pool)
end
end
@@ -295,19 +348,19 @@ defmodule WandererApp.Esi.ApiClient do
defp with_cache_opts(opts),
do: opts |> Keyword.merge(@cache_opts) |> Keyword.merge(cache_dir: System.tmp_dir!())
defp do_get(path, api_opts \\ [], opts \\ []) do
defp do_get(path, api_opts \\ [], opts \\ [], pool \\ @general_pool) do
case Cachex.get(:api_cache, path) do
{:ok, cached_data} when not is_nil(cached_data) ->
{:ok, cached_data}
_ ->
do_get_request(path, api_opts, opts)
do_get_request(path, api_opts, opts, pool)
end
end
defp do_get_request(path, api_opts \\ [], opts \\ []) do
defp do_get_request(path, api_opts \\ [], opts \\ [], pool \\ @general_pool) do
try do
@req_esi_options
req_options_for_pool(pool)
|> Req.new()
|> Req.get(
api_opts
@@ -398,12 +451,48 @@ defmodule WandererApp.Esi.ApiClient do
{:ok, %{status: status, headers: headers}} ->
{:error, "Unexpected status: #{status}"}
{:error, _reason} ->
{:error, %Mint.TransportError{reason: :timeout}} ->
# Emit telemetry for pool timeout
:telemetry.execute(
[:wanderer_app, :finch, :pool_timeout],
%{count: 1},
%{method: "GET", path: path, pool: pool}
)
{:error, :pool_timeout}
{:error, reason} ->
# Check if this is a Finch pool error
if is_exception(reason) and Exception.message(reason) =~ "unable to provide a connection" do
:telemetry.execute(
[:wanderer_app, :finch, :pool_exhausted],
%{count: 1},
%{method: "GET", path: path, pool: pool}
)
end
{:error, "Request failed"}
end
rescue
e ->
Logger.error(Exception.message(e))
error_msg = Exception.message(e)
# Emit telemetry for pool exhaustion errors
if error_msg =~ "unable to provide a connection" do
:telemetry.execute(
[:wanderer_app, :finch, :pool_exhausted],
%{count: 1},
%{method: "GET", path: path, pool: pool}
)
Logger.error("FINCH_POOL_EXHAUSTED: #{error_msg}",
method: "GET",
path: path,
pool: inspect(pool)
)
else
Logger.error(error_msg)
end
{:error, "Request failed"}
end
@@ -492,13 +581,13 @@ defmodule WandererApp.Esi.ApiClient do
end
end
defp do_post_esi(url, opts) do
defp do_post_esi(url, opts, pool \\ @general_pool) do
try do
req_opts =
(opts |> with_user_agent_opts() |> Keyword.merge(@retry_opts)) ++
[params: opts[:params] || []]
Req.new(@req_esi_options ++ req_opts)
Req.new(req_options_for_pool(pool) ++ req_opts)
|> Req.post(url: url)
|> case do
{:ok, %{status: status, body: body}} when status in [200, 201] ->
@@ -576,18 +665,54 @@ defmodule WandererApp.Esi.ApiClient do
{:ok, %{status: status}} ->
{:error, "Unexpected status: #{status}"}
{:error, %Mint.TransportError{reason: :timeout}} ->
# Emit telemetry for pool timeout
:telemetry.execute(
[:wanderer_app, :finch, :pool_timeout],
%{count: 1},
%{method: "POST_ESI", path: url, pool: pool}
)
{:error, :pool_timeout}
{:error, reason} ->
# Check if this is a Finch pool error
if is_exception(reason) and Exception.message(reason) =~ "unable to provide a connection" do
:telemetry.execute(
[:wanderer_app, :finch, :pool_exhausted],
%{count: 1},
%{method: "POST_ESI", path: url, pool: pool}
)
end
{:error, reason}
end
rescue
e ->
@logger.error(Exception.message(e))
error_msg = Exception.message(e)
# Emit telemetry for pool exhaustion errors
if error_msg =~ "unable to provide a connection" do
:telemetry.execute(
[:wanderer_app, :finch, :pool_exhausted],
%{count: 1},
%{method: "POST_ESI", path: url, pool: pool}
)
@logger.error("FINCH_POOL_EXHAUSTED: #{error_msg}",
method: "POST_ESI",
path: url,
pool: inspect(pool)
)
else
@logger.error(error_msg)
end
{:error, "Request failed"}
end
end
defp do_get_retry(path, api_opts, opts, status \\ :forbidden) do
defp do_get_retry(path, api_opts, opts, status \\ :forbidden, pool \\ @general_pool) do
refresh_token? = opts |> Keyword.get(:refresh_token?, false)
retry_count = opts |> Keyword.get(:retry_count, 0)
character_id = opts |> Keyword.get(:character_id, nil)
@@ -602,7 +727,8 @@ defmodule WandererApp.Esi.ApiClient do
do_get(
path,
api_opts |> Keyword.merge(auth_opts),
opts |> Keyword.merge(retry_count: retry_count + 1)
opts |> Keyword.merge(retry_count: retry_count + 1),
pool
)
{:error, _error} ->

View File

@@ -2,7 +2,7 @@ defmodule WandererApp.ExternalEvents do
@moduledoc """
External event system for SSE and webhook delivery.
This system is completely separate from the internal Phoenix PubSub
This system is completely separate from the internal Phoenix PubSub
event system and does NOT modify any existing event flows.
External events are delivered to:
@@ -72,20 +72,12 @@ defmodule WandererApp.ExternalEvents do
# Check if MapEventRelay is alive before sending
if Process.whereis(MapEventRelay) do
try do
# Use call with timeout instead of cast for better error handling
GenServer.call(MapEventRelay, {:deliver_event, event}, 5000)
:ok
catch
:exit, {:timeout, _} ->
Logger.error("Timeout delivering event to MapEventRelay for map #{map_id}")
{:error, :timeout}
:exit, reason ->
Logger.error("Failed to deliver event to MapEventRelay: #{inspect(reason)}")
{:error, reason}
end
# Use cast for async delivery to avoid blocking the caller
# This is critical for performance in hot paths (character updates)
GenServer.cast(MapEventRelay, {:deliver_event, event})
:ok
else
Logger.debug(fn -> "MapEventRelay not available for event delivery (map: #{map_id})" end)
{:error, :relay_not_available}
end
else

View File

@@ -20,6 +20,7 @@ defmodule WandererApp.ExternalEvents.Event do
| :character_added
| :character_removed
| :character_updated
| :characters_updated
| :map_kill
| :acl_member_added
| :acl_member_removed
@@ -42,50 +43,6 @@ defmodule WandererApp.ExternalEvents.Event do
defstruct [:id, :map_id, :type, :payload, :timestamp]
@doc """
Creates a new external event with ULID for ordering.
Validates that the event_type is supported before creating the event.
"""
@spec new(String.t(), event_type(), map()) :: t() | {:error, :invalid_event_type}
def new(map_id, event_type, payload) when is_binary(map_id) and is_map(payload) do
if valid_event_type?(event_type) do
%__MODULE__{
id: Ecto.ULID.generate(System.system_time(:millisecond)),
map_id: map_id,
type: event_type,
payload: payload,
timestamp: DateTime.utc_now()
}
else
raise ArgumentError,
"Invalid event type: #{inspect(event_type)}. Must be one of: #{supported_event_types() |> Enum.map(&to_string/1) |> Enum.join(", ")}"
end
end
@doc """
Converts an event to JSON format for delivery.
"""
@spec to_json(t()) :: map()
def to_json(%__MODULE__{} = event) do
%{
"id" => event.id,
"type" => to_string(event.type),
"map_id" => event.map_id,
"timestamp" => DateTime.to_iso8601(event.timestamp),
"payload" => serialize_payload(event.payload)
}
end
# Convert Ash structs and other complex types to plain maps
defp serialize_payload(payload) when is_struct(payload) do
serialize_payload(payload, MapSet.new())
end
defp serialize_payload(payload) when is_map(payload) do
serialize_payload(payload, MapSet.new())
end
# Define allowlisted fields for different struct types
@system_fields [
:id,
@@ -133,6 +90,73 @@ defmodule WandererApp.ExternalEvents.Event do
]
@signature_fields [:id, :signature_id, :name, :type, :group]
@supported_event_types [
:add_system,
:deleted_system,
:system_renamed,
:system_metadata_changed,
:signatures_updated,
:signature_added,
:signature_removed,
:connection_added,
:connection_removed,
:connection_updated,
:character_added,
:character_removed,
:character_updated,
:characters_updated,
:map_kill,
:acl_member_added,
:acl_member_removed,
:acl_member_updated,
:rally_point_added,
:rally_point_removed
]
@doc """
Creates a new external event with ULID for ordering.
Validates that the event_type is supported before creating the event.
"""
@spec new(String.t(), event_type(), map()) :: t() | {:error, :invalid_event_type}
def new(map_id, event_type, payload) when is_binary(map_id) and is_map(payload) do
if valid_event_type?(event_type) do
%__MODULE__{
id: Ecto.ULID.generate(System.system_time(:millisecond)),
map_id: map_id,
type: event_type,
payload: payload,
timestamp: DateTime.utc_now()
}
else
raise ArgumentError,
"Invalid event type: #{inspect(event_type)}. Must be one of: #{supported_event_types() |> Enum.map(&to_string/1) |> Enum.join(", ")}"
end
end
@doc """
Converts an event to JSON format for delivery.
"""
@spec to_json(t()) :: map()
def to_json(%__MODULE__{} = event) do
%{
"id" => event.id,
"type" => to_string(event.type),
"map_id" => event.map_id,
"timestamp" => DateTime.to_iso8601(event.timestamp),
"payload" => serialize_payload(event.payload)
}
end
# Convert Ash structs and other complex types to plain maps
defp serialize_payload(payload) when is_struct(payload) do
serialize_payload(payload, MapSet.new())
end
defp serialize_payload(payload) when is_map(payload) do
serialize_payload(payload, MapSet.new())
end
# Overloaded versions with visited tracking
defp serialize_payload(payload, visited) when is_struct(payload) do
# Check for circular reference
@@ -193,29 +217,7 @@ defmodule WandererApp.ExternalEvents.Event do
Returns all supported event types.
"""
@spec supported_event_types() :: [event_type()]
def supported_event_types do
[
:add_system,
:deleted_system,
:system_renamed,
:system_metadata_changed,
:signatures_updated,
:signature_added,
:signature_removed,
:connection_added,
:connection_removed,
:connection_updated,
:character_added,
:character_removed,
:character_updated,
:map_kill,
:acl_member_added,
:acl_member_removed,
:acl_member_updated,
:rally_point_added,
:rally_point_removed
]
end
def supported_event_types, do: @supported_event_types
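Taken together, new/3 and to_json/1 above define the wire format for external delivery; a hypothetical round trip (map id and payload values are illustrative):

# :characters_updated is one of the @supported_event_types listed above.
event = WandererApp.ExternalEvents.Event.new("map-uuid-123", :characters_updated, %{"count" => 3})
WandererApp.ExternalEvents.Event.to_json(event)
# => %{"id" => ..., "type" => "characters_updated", "map_id" => "map-uuid-123", ...}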
@doc """
Validates an event type.

View File

@@ -82,16 +82,9 @@ defmodule WandererApp.ExternalEvents.MapEventRelay do
@impl true
def handle_call({:deliver_event, %Event{} = event}, _from, state) do
# Log ACL events at info level for debugging
if event.type in [:acl_member_added, :acl_member_removed, :acl_member_updated] do
Logger.debug(fn ->
"MapEventRelay received :deliver_event (call) for map #{event.map_id}, type: #{event.type}"
end)
else
Logger.debug(fn ->
"MapEventRelay received :deliver_event (call) for map #{event.map_id}, type: #{event.type}"
end)
end
Logger.debug(fn ->
"MapEventRelay received :deliver_event (call) for map #{event.map_id}, type: #{event.type}"
end)
new_state = deliver_single_event(event, state)
{:reply, :ok, new_state}

View File

@@ -90,7 +90,9 @@ defmodule WandererApp.ExternalEvents.WebhookDispatcher do
@impl true
def handle_cast({:dispatch_events, map_id, events}, state) do
Logger.debug(fn -> "WebhookDispatcher received #{length(events)} events for map #{map_id}" end)
Logger.debug(fn ->
"WebhookDispatcher received #{length(events)} events for map #{map_id}"
end)
# Emit telemetry for batch events
:telemetry.execute(
@@ -290,7 +292,7 @@ defmodule WandererApp.ExternalEvents.WebhookDispatcher do
request = Finch.build(:post, url, headers, payload)
case Finch.request(request, WandererApp.Finch, timeout: 30_000) do
case Finch.request(request, WandererApp.Finch.Webhooks, timeout: 30_000) do
{:ok, %Finch.Response{status: status}} ->
{:ok, status}

View File

@@ -31,7 +31,7 @@ defmodule WandererApp.StartCorpWalletTrackerTask do
if not is_nil(admin_character) do
:ok =
WandererApp.Character.TrackerManager.start_tracking(admin_character.id, keep_alive: true)
WandererApp.Character.TrackerManager.start_tracking(admin_character.id)
{:ok, _pid} =
WandererApp.Character.TrackerManager.start_transaction_tracker(admin_character.id)

View File

@@ -546,7 +546,7 @@ defmodule WandererApp.Kills.Client do
end
end
defp check_health(%{socket_pid: pid, last_message_time: last_msg_time} = state)
defp check_health(%{socket_pid: pid, last_message_time: last_msg_time} = _state)
when not is_nil(pid) and not is_nil(last_msg_time) do
cond do
not socket_alive?(pid) ->

View File

@@ -229,7 +229,7 @@ defmodule WandererApp.Kills.MapEventListener do
{:error, :not_running} ->
{:error, :not_running}
{:ok, status} ->
{:ok, _status} ->
{:error, :not_connected}
error ->

View File

@@ -136,9 +136,6 @@ defmodule WandererApp.License.LicenseManager do
end
end
@doc """
Updates a license's expiration date based on the map's subscription.
"""
def update_license_expiration_from_subscription(map_id) do
with {:ok, license} <- get_license_by_map_id(map_id),
{:ok, subscription} <- SubscriptionManager.get_active_map_subscription(map_id) do
@@ -146,24 +143,15 @@ defmodule WandererApp.License.LicenseManager do
end
end
@doc """
Formats a datetime as YYYY-MM-DD.
"""
defp format_date(datetime) do
Calendar.strftime(datetime, "%Y-%m-%d")
end
@doc """
Generates a link to the map.
"""
defp generate_map_link(map_slug) do
base_url = Application.get_env(:wanderer_app, :web_app_url)
"#{base_url}/#{map_slug}"
end
@doc """
Gets the map owner's data.
"""
defp get_map_owner_email(map) do
{:ok, %{owner: owner}} = map |> Ash.load([:owner])
"#{owner.name}(#{owner.eve_id})"

View File

@@ -135,7 +135,7 @@ defmodule WandererApp.License.LicenseManagerClient do
Application.get_env(:wanderer_app, :license_manager)[:auth_key]
end
defp parse_error_response(status, %{"error" => error_message}) do
defp parse_error_response(_status, %{"error" => error_message}) do
{:error, error_message}
end

View File

@@ -53,8 +53,8 @@ defmodule WandererApp.Map do
{:ok, map} ->
map
_ ->
Logger.error(fn -> "Failed to get map #{map_id}" end)
error ->
Logger.error("Failed to get map #{map_id}: #{inspect(error)}")
%{}
end
end
@@ -134,6 +134,22 @@ defmodule WandererApp.Map do
def get_options(map_id),
do: {:ok, map_id |> get_map!() |> Map.get(:options, Map.new())}
def get_tracked_character_ids(map_id) do
{:ok,
map_id
|> get_map!()
|> Map.get(:characters, [])
|> Enum.filter(fn character_id ->
{:ok, tracking_start_time} =
WandererApp.Cache.lookup(
"character:#{character_id}:map:#{map_id}:tracking_start_time",
nil
)
not is_nil(tracking_start_time)
end)}
end
@doc """
Returns a full list of characters in the map
"""
@@ -183,9 +199,31 @@ defmodule WandererApp.Map do
def add_characters!(map, []), do: map
def add_characters!(%{map_id: map_id} = map, [character | rest]) do
add_character(map_id, character)
add_characters!(map, rest)
def add_characters!(%{map_id: map_id} = map, characters) when is_list(characters) do
# Get current characters list once
current_characters = Map.get(map, :characters, [])
characters_ids =
characters
|> Enum.map(fn %{id: char_id} -> char_id end)
# Filter out characters that already exist
new_character_ids =
characters_ids
|> Enum.reject(fn char_id -> char_id in current_characters end)
# If all characters already exist, return early
if new_character_ids == [] do
map
else
case update_map(map_id, %{characters: new_character_ids ++ current_characters}) do
{:commit, map} ->
map
_ ->
map
end
end
end
def add_character(
@@ -198,64 +236,13 @@ defmodule WandererApp.Map do
case not (characters |> Enum.member?(character_id)) do
true ->
WandererApp.Character.get_map_character(map_id, character_id)
|> case do
{:ok,
%{
alliance_id: alliance_id,
corporation_id: corporation_id,
solar_system_id: solar_system_id,
structure_id: structure_id,
station_id: station_id,
ship: ship_type_id,
ship_name: ship_name
}} ->
map_id
|> update_map(%{characters: [character_id | characters]})
map_id
|> update_map(%{characters: [character_id | characters]})
# WandererApp.Cache.insert(
# "map:#{map_id}:character:#{character_id}:alliance_id",
# alliance_id
# )
# WandererApp.Cache.insert(
# "map:#{map_id}:character:#{character_id}:corporation_id",
# corporation_id
# )
# WandererApp.Cache.insert(
# "map:#{map_id}:character:#{character_id}:solar_system_id",
# solar_system_id
# )
# WandererApp.Cache.insert(
# "map:#{map_id}:character:#{character_id}:structure_id",
# structure_id
# )
# WandererApp.Cache.insert(
# "map:#{map_id}:character:#{character_id}:station_id",
# station_id
# )
# WandererApp.Cache.insert(
# "map:#{map_id}:character:#{character_id}:ship_type_id",
# ship_type_id
# )
# WandererApp.Cache.insert(
# "map:#{map_id}:character:#{character_id}:ship_name",
# ship_name
# )
:ok
error ->
error
end
:ok
_ ->
{:error, :already_exists}
:ok
end
end
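The rewritten add_characters!/2 above takes the whole character list at once and deduplicates against the map's current characters; a hypothetical call shape (field names are inferred from the clause head, values are illustrative):

# The map needs at least :map_id and :characters; each character only needs :id.
map = %{map_id: "map-uuid-123", characters: ["char-a"]}
WandererApp.Map.add_characters!(map, [%{id: "char-a"}, %{id: "char-b"}])
# Only "char-b" is new, so a single update_map/2 call adds it.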

View File

@@ -23,7 +23,8 @@ defmodule WandererApp.Map.CacheRTree do
alias WandererApp.Cache
@grid_size 150 # Grid cell size in pixels
# Grid cell size in pixels
@grid_size 150
# Type definitions matching DDRT behavior
@type id :: number() | String.t()
@@ -59,19 +60,26 @@ defmodule WandererApp.Map.CacheRTree do
# Update leaves storage
current_leaves = get_leaves(name)
new_leaves = Enum.reduce(leaves, current_leaves, fn {id, box}, acc ->
Map.put(acc, id, {id, box})
end)
new_leaves =
Enum.reduce(leaves, current_leaves, fn {id, box}, acc ->
Map.put(acc, id, {id, box})
end)
put_leaves(name, new_leaves)
# Update spatial grid
current_grid = get_grid(name)
new_grid = Enum.reduce(leaves, current_grid, fn leaf, grid ->
add_to_grid(grid, leaf)
end)
new_grid =
Enum.reduce(leaves, current_grid, fn leaf, grid ->
add_to_grid(grid, leaf)
end)
put_grid(name, new_grid)
{:ok, %{}} # Match DRTree return format
# Match DRTree return format
{:ok, %{}}
end
@doc """
@@ -97,17 +105,19 @@ defmodule WandererApp.Map.CacheRTree do
current_grid = get_grid(name)
# Remove from leaves and track bounding boxes for grid cleanup
{new_leaves, removed} = Enum.reduce(ids, {current_leaves, []}, fn id, {leaves, removed} ->
case Map.pop(leaves, id) do
{nil, leaves} -> {leaves, removed}
{{^id, box}, leaves} -> {leaves, [{id, box} | removed]}
end
end)
{new_leaves, removed} =
Enum.reduce(ids, {current_leaves, []}, fn id, {leaves, removed} ->
case Map.pop(leaves, id) do
{nil, leaves} -> {leaves, removed}
{{^id, box}, leaves} -> {leaves, [{id, box} | removed]}
end
end)
# Update grid
new_grid = Enum.reduce(removed, current_grid, fn {id, box}, grid ->
remove_from_grid(grid, id, box)
end)
new_grid =
Enum.reduce(removed, current_grid, fn {id, box}, grid ->
remove_from_grid(grid, id, box)
end)
put_leaves(name, new_leaves)
put_grid(name, new_grid)
@@ -133,17 +143,21 @@ defmodule WandererApp.Map.CacheRTree do
"""
@impl true
def update(id, box_or_tuple, name) do
{old_box, new_box} = case box_or_tuple do
{old, new} ->
{old, new}
box ->
# Need to look up old box
leaves = get_leaves(name)
case Map.get(leaves, id) do
{^id, old} -> {old, box}
nil -> {nil, box} # Will be handled as new insert
end
end
{old_box, new_box} =
case box_or_tuple do
{old, new} ->
{old, new}
box ->
# Need to look up old box
leaves = get_leaves(name)
case Map.get(leaves, id) do
{^id, old} -> {old, box}
# Will be handled as new insert
nil -> {nil, box}
end
end
# Delete old, insert new
if old_box, do: delete([id], name)
@@ -184,6 +198,7 @@ defmodule WandererApp.Map.CacheRTree do
# Precise intersection test
leaves = get_leaves(name)
matching_ids =
Enum.filter(candidate_ids, fn id ->
case Map.get(leaves, id) do
@@ -216,6 +231,7 @@ defmodule WandererApp.Map.CacheRTree do
iex> CacheRTree.init_tree("rtree_map_456", %{width: 150, verbose: false})
:ok
"""
@impl true
def init_tree(name, config \\ %{}) do
Cache.put(cache_key(name, :leaves), %{})
Cache.put(cache_key(name, :grid), %{})
@@ -319,6 +335,7 @@ defmodule WandererApp.Map.CacheRTree do
# Floor division that works correctly with negative numbers
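# (illustrative: div_floor(-5, 3) is expected to return -2, whereas Kernel.div(-5, 3) truncates to -1)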
defp div_floor(a, b) when a >= 0, do: div(a, b)
defp div_floor(a, b) when a < 0 do
case rem(a, b) do
0 -> div(a, b)

View File

@@ -58,10 +58,6 @@ defmodule WandererApp.Map.Manager do
{:ok, pings_cleanup_timer} =
:timer.send_interval(@pings_cleanup_interval, :cleanup_pings)
# safe_async_task(fn ->
# start_last_active_maps()
# end)
{:ok,
%{
check_maps_queue_timer: check_maps_queue_timer,
@@ -134,20 +130,6 @@ defmodule WandererApp.Map.Manager do
end
end
defp start_last_active_maps() do
{:ok, last_map_states} =
WandererApp.Api.MapState.get_last_active(
DateTime.utc_now()
|> DateTime.add(-30, :minute)
)
last_map_states
|> Enum.map(fn %{map_id: map_id} -> map_id end)
|> Enum.each(fn map_id -> start_map(map_id) end)
:ok
end
defp start_maps() do
chunks =
@maps_queue

View File

@@ -4,7 +4,7 @@ defmodule WandererApp.Map.MapPool do
require Logger
alias WandererApp.Map.Server
alias WandererApp.Map.{MapPoolState, Server}
defstruct [
:map_ids,
@@ -15,7 +15,7 @@ defmodule WandererApp.Map.MapPool do
@cache :map_pool_cache
@registry :map_pool_registry
@unique_registry :unique_map_pool_registry
@map_pool_limit 20
@map_pool_limit 10
@garbage_collection_interval :timer.hours(4)
@systems_cleanup_timeout :timer.minutes(30)
@@ -26,7 +26,17 @@ defmodule WandererApp.Map.MapPool do
def new(), do: __struct__()
def new(args), do: __struct__(args)
def start_link(map_ids) do
# Accept both {uuid, map_ids} tuple (from supervisor restart) and just map_ids (legacy)
def start_link({uuid, map_ids}) when is_binary(uuid) and is_list(map_ids) do
GenServer.start_link(
@name,
{uuid, map_ids},
name: Module.concat(__MODULE__, uuid)
)
end
# For backward compatibility - generate UUID if only map_ids provided
def start_link(map_ids) when is_list(map_ids) do
uuid = UUID.uuid1()
GenServer.start_link(
@@ -38,13 +48,42 @@ defmodule WandererApp.Map.MapPool do
@impl true
def init({uuid, map_ids}) do
{:ok, _} = Registry.register(@unique_registry, Module.concat(__MODULE__, uuid), map_ids)
# Check for crash recovery - if we have previous state in ETS, merge it with new map_ids
{final_map_ids, recovery_info} =
case MapPoolState.get_pool_state(uuid) do
{:ok, recovered_map_ids} ->
# Merge and deduplicate map IDs
merged = Enum.uniq(recovered_map_ids ++ map_ids)
recovery_count = length(recovered_map_ids)
Logger.info(
"[Map Pool #{uuid}] Crash recovery detected: recovering #{recovery_count} maps",
pool_uuid: uuid,
recovered_maps: recovered_map_ids,
new_maps: map_ids,
total_maps: length(merged)
)
# Emit telemetry for crash recovery
:telemetry.execute(
[:wanderer_app, :map_pool, :recovery, :start],
%{recovered_map_count: recovery_count, total_map_count: length(merged)},
%{pool_uuid: uuid}
)
{merged, %{recovered: true, count: recovery_count}}
{:error, :not_found} ->
# Normal startup, no previous state to recover
{map_ids, %{recovered: false}}
end
# Register with empty list - maps will be added as they're started in handle_continue
{:ok, _} = Registry.register(@unique_registry, Module.concat(__MODULE__, uuid), [])
{:ok, _} = Registry.register(@registry, __MODULE__, uuid)
map_ids
|> Enum.each(fn id ->
Cachex.put(@cache, id, uuid)
end)
# Don't pre-populate cache - will be populated as maps start in handle_continue
# This prevents duplicates when recovering
state =
%{
@@ -53,32 +92,99 @@ defmodule WandererApp.Map.MapPool do
}
|> new()
{:ok, state, {:continue, {:start, map_ids}}}
{:ok, state, {:continue, {:start, {final_map_ids, recovery_info}}}}
end
@impl true
def terminate(_reason, _state) do
def terminate(reason, %{uuid: uuid} = _state) do
# On graceful shutdown, clean up ETS state
# On crash, keep ETS state for recovery
case reason do
:normal ->
Logger.debug("[Map Pool #{uuid}] Graceful shutdown, cleaning up ETS state")
MapPoolState.delete_pool_state(uuid)
:shutdown ->
Logger.debug("[Map Pool #{uuid}] Graceful shutdown, cleaning up ETS state")
MapPoolState.delete_pool_state(uuid)
{:shutdown, _} ->
Logger.debug("[Map Pool #{uuid}] Graceful shutdown, cleaning up ETS state")
MapPoolState.delete_pool_state(uuid)
_ ->
Logger.warning(
"[Map Pool #{uuid}] Abnormal termination (#{inspect(reason)}), keeping ETS state for recovery"
)
# Keep ETS state for crash recovery
:ok
end
:ok
end
@impl true
def handle_continue({:start, map_ids}, state) do
def handle_continue({:start, {map_ids, recovery_info}}, state) do
Logger.info("#{@name} started")
# Track recovery statistics
start_time = System.monotonic_time(:millisecond)
initial_count = length(map_ids)
# Start maps synchronously and accumulate state changes
new_state =
{new_state, failed_maps} =
map_ids
|> Enum.reduce(state, fn map_id, current_state ->
|> Enum.reduce({state, []}, fn map_id, {current_state, failed} ->
case do_start_map(map_id, current_state) do
{:ok, updated_state} ->
updated_state
{updated_state, failed}
{:error, reason} ->
Logger.error("[Map Pool] Failed to start map #{map_id}: #{reason}")
current_state
# Emit telemetry for individual map recovery failure
if recovery_info.recovered do
:telemetry.execute(
[:wanderer_app, :map_pool, :recovery, :map_failed],
%{map_id: map_id},
%{pool_uuid: state.uuid, reason: reason}
)
end
{current_state, [map_id | failed]}
end
end)
# Calculate final statistics
end_time = System.monotonic_time(:millisecond)
duration_ms = end_time - start_time
successful_count = length(new_state.map_ids)
failed_count = length(failed_maps)
# Log and emit telemetry for recovery completion
if recovery_info.recovered do
Logger.info(
"[Map Pool #{state.uuid}] Crash recovery completed: #{successful_count}/#{initial_count} maps recovered in #{duration_ms}ms",
pool_uuid: state.uuid,
recovered_count: successful_count,
failed_count: failed_count,
total_count: initial_count,
duration_ms: duration_ms,
failed_maps: failed_maps
)
:telemetry.execute(
[:wanderer_app, :map_pool, :recovery, :complete],
%{
recovered_count: successful_count,
failed_count: failed_count,
duration_ms: duration_ms
},
%{pool_uuid: state.uuid}
)
end
# Schedule periodic tasks
Process.send_after(self(), :backup_state, @backup_state_timeout)
Process.send_after(self(), :cleanup_systems, 15_000)
@@ -91,6 +197,55 @@ defmodule WandererApp.Map.MapPool do
{:noreply, new_state}
end
@impl true
def handle_continue({:init_map, map_id}, %{uuid: uuid} = state) do
# Perform the actual map initialization asynchronously
# This runs after the GenServer.call has already returned
start_time = System.monotonic_time(:millisecond)
try do
# Initialize the map state and start the map server using extracted helper
do_initialize_map_server(map_id)
duration = System.monotonic_time(:millisecond) - start_time
Logger.info("[Map Pool #{uuid}] Map #{map_id} initialized successfully in #{duration}ms")
# Emit telemetry for slow initializations
if duration > 5_000 do
Logger.warning("[Map Pool #{uuid}] Slow map initialization: #{map_id} took #{duration}ms")
:telemetry.execute(
[:wanderer_app, :map_pool, :slow_init],
%{duration_ms: duration},
%{map_id: map_id, pool_uuid: uuid}
)
end
{:noreply, state}
rescue
e ->
duration = System.monotonic_time(:millisecond) - start_time
Logger.error("""
[Map Pool #{uuid}] Failed to initialize map #{map_id} after #{duration}ms: #{Exception.message(e)}
#{Exception.format_stacktrace(__STACKTRACE__)}
""")
# Rollback: Remove from state, registry, cache, and ETS using extracted helper
new_state = do_unregister_map(map_id, uuid, state)
# Emit telemetry for failed initialization
:telemetry.execute(
[:wanderer_app, :map_pool, :init_failed],
%{duration_ms: duration},
%{map_id: map_id, pool_uuid: uuid, reason: Exception.message(e)}
)
{:noreply, new_state}
end
end
@impl true
def handle_cast(:stop, state), do: {:stop, :normal, state}
@@ -111,13 +266,38 @@ defmodule WandererApp.Map.MapPool do
{:reply, :ok, state}
else
case do_start_map(map_id, state) do
{:ok, new_state} ->
{:reply, :ok, new_state}
# Check if map is already started or being initialized
if map_id in map_ids do
Logger.debug("[Map Pool #{uuid}] Map #{map_id} already in pool")
{:reply, {:ok, :already_started}, state}
else
# Pre-register the map in registry and cache to claim ownership
# This prevents race conditions where multiple pools try to start the same map
registry_result =
Registry.update_value(@unique_registry, Module.concat(__MODULE__, uuid), fn r_map_ids ->
[map_id | r_map_ids]
end)
{:error, _reason} ->
# Error already logged in do_start_map
{:reply, :ok, state}
case registry_result do
{_new_value, _old_value} ->
# Add to cache
Cachex.put(@cache, map_id, uuid)
# Add to state
new_state = %{state | map_ids: [map_id | map_ids]}
# Persist state to ETS
MapPoolState.save_pool_state(uuid, new_state.map_ids)
Logger.debug("[Map Pool #{uuid}] Map #{map_id} queued for async initialization")
# Return immediately and initialize asynchronously
{:reply, {:ok, :initializing}, new_state, {:continue, {:init_map, map_id}}}
:error ->
Logger.error("[Map Pool #{uuid}] Failed to register map #{map_id} in registry")
{:reply, {:error, :registration_failed}, state}
end
end
end
end
@@ -165,22 +345,25 @@ defmodule WandererApp.Map.MapPool do
# Step 2: Add to cache
case Cachex.put(@cache, map_id, uuid) do
{:ok, _} ->
completed_operations = [:cache | completed_operations]
:ok
{:error, reason} ->
raise "Failed to add to cache: #{inspect(reason)}"
end
# Step 3: Start the map server
map_id
|> WandererApp.Map.get_map_state!()
|> Server.Impl.start_map()
completed_operations = [:cache | completed_operations]
# Step 3: Start the map server using extracted helper
do_initialize_map_server(map_id)
completed_operations = [:map_server | completed_operations]
# Step 4: Update GenServer state (last, as this is in-memory and fast)
new_state = %{state | map_ids: [map_id | map_ids]}
# Step 5: Persist state to ETS for crash recovery
MapPoolState.save_pool_state(uuid, new_state.map_ids)
Logger.debug("[Map Pool] Successfully started map #{map_id} in pool #{uuid}")
{:ok, new_state}
rescue
@@ -263,12 +446,14 @@ defmodule WandererApp.Map.MapPool do
# Step 2: Delete from cache
case Cachex.del(@cache, map_id) do
{:ok, _} ->
completed_operations = [:cache | completed_operations]
:ok
{:error, reason} ->
raise "Failed to delete from cache: #{inspect(reason)}"
end
completed_operations = [:cache | completed_operations]
# Step 3: Stop the map server (clean up all map resources)
map_id
|> Server.Impl.stop_map()
@@ -278,6 +463,9 @@ defmodule WandererApp.Map.MapPool do
# Step 4: Update GenServer state (last, as this is in-memory and fast)
new_state = %{state | map_ids: map_ids |> Enum.reject(fn id -> id == map_id end)}
# Step 5: Persist state to ETS for crash recovery
MapPoolState.save_pool_state(uuid, new_state.map_ids)
Logger.debug("[Map Pool] Successfully stopped map #{map_id} from pool #{uuid}")
{:ok, new_state}
rescue
@@ -294,6 +482,35 @@ defmodule WandererApp.Map.MapPool do
end
end
# Helper function to initialize the map server (no state management)
# This extracts the common map initialization logic used in both
# synchronous (do_start_map) and asynchronous ({:init_map, map_id}) paths
defp do_initialize_map_server(map_id) do
map_id
|> WandererApp.Map.get_map_state!()
|> Server.Impl.start_map()
end
# Helper function to unregister a map from all tracking
# Used for rollback when map initialization fails in the async path
defp do_unregister_map(map_id, uuid, state) do
# Remove from registry
Registry.update_value(@unique_registry, Module.concat(__MODULE__, uuid), fn r_map_ids ->
Enum.reject(r_map_ids, &(&1 == map_id))
end)
# Remove from cache
Cachex.del(@cache, map_id)
# Update state
new_state = %{state | map_ids: Enum.reject(state.map_ids, &(&1 == map_id))}
# Update ETS
MapPoolState.save_pool_state(uuid, new_state.map_ids)
new_state
end
defp rollback_stop_map_operations(map_id, uuid, completed_operations) do
Logger.warning("[Map Pool] Attempting to rollback stop_map operations for #{map_id}")
@@ -335,10 +552,14 @@ defmodule WandererApp.Map.MapPool do
def handle_call(:error, _, state), do: {:stop, :error, :ok, state}
@impl true
def handle_info(:backup_state, %{map_ids: map_ids} = state) do
def handle_info(:backup_state, %{map_ids: map_ids, uuid: uuid} = state) do
Process.send_after(self(), :backup_state, @backup_state_timeout)
try do
# Persist pool state to ETS
MapPoolState.save_pool_state(uuid, map_ids)
# Backup individual map states to database
map_ids
|> Task.async_stream(
fn map_id ->
@@ -502,36 +723,55 @@ defmodule WandererApp.Map.MapPool do
{:noreply, state}
end
def handle_info(
:update_online,
%{
characters: characters,
server_online: true
} =
state
) do
Process.send_after(self(), :update_online, @update_online_interval)
def handle_info(:map_deleted, %{map_ids: map_ids} = state) do
# When a map is deleted, stop any maps in this pool that are now marked as deleted.
# This is a graceful shutdown triggered by user action.
Logger.info("[Map Pool #{state.uuid}] Received map_deleted event, stopping affected maps")
try do
characters
|> Task.async_stream(
fn character_id ->
WandererApp.Character.Tracker.update_online(character_id)
end,
max_concurrency: System.schedulers_online() * 4,
on_timeout: :kill_task,
timeout: :timer.seconds(5)
)
|> Enum.each(fn _result -> :ok end)
rescue
e ->
Logger.error("""
[Tracker Pool] update_online => exception: #{Exception.message(e)}
#{Exception.format_stacktrace(__STACKTRACE__)}
""")
end
# Check which of our maps were deleted and stop them
new_state =
map_ids
|> Enum.reduce(state, fn map_id, current_state ->
# Check if the map still exists in the database
case WandererApp.MapRepo.get(map_id) do
{:ok, %{deleted: true}} ->
Logger.info("[Map Pool #{state.uuid}] Map #{map_id} was deleted, stopping it")
{:noreply, state}
case do_stop_map(map_id, current_state) do
{:ok, updated_state} ->
updated_state
{:error, reason} ->
Logger.error(
"[Map Pool #{state.uuid}] Failed to stop deleted map #{map_id}: #{reason}"
)
current_state
end
{:ok, _map} ->
# Map still exists and is not deleted
current_state
{:error, _} ->
# Map doesn't exist, should stop it
Logger.info("[Map Pool #{state.uuid}] Map #{map_id} not found, stopping it")
case do_stop_map(map_id, current_state) do
{:ok, updated_state} ->
updated_state
{:error, reason} ->
Logger.error(
"[Map Pool #{state.uuid}] Failed to stop missing map #{map_id}: #{reason}"
)
current_state
end
end
end)
{:noreply, new_state}
end
def handle_info(event, state) do

View File

@@ -7,7 +7,8 @@ defmodule WandererApp.Map.MapPoolDynamicSupervisor do
@cache :map_pool_cache
@registry :map_pool_registry
@unique_registry :unique_map_pool_registry
@map_pool_limit 20
@map_pool_limit 10
@genserver_call_timeout :timer.minutes(2)
@name __MODULE__
@@ -30,7 +31,32 @@ defmodule WandererApp.Map.MapPoolDynamicSupervisor do
start_child([map_id], pools |> Enum.count())
pid ->
GenServer.call(pid, {:start_map, map_id})
result = GenServer.call(pid, {:start_map, map_id}, @genserver_call_timeout)
case result do
{:ok, :initializing} ->
Logger.debug(
"[Map Pool Supervisor] Map #{map_id} queued for async initialization"
)
result
{:ok, :already_started} ->
Logger.debug("[Map Pool Supervisor] Map #{map_id} already started")
result
:ok ->
# Legacy synchronous response (from crash recovery path)
Logger.debug("[Map Pool Supervisor] Map #{map_id} started synchronously")
result
other ->
Logger.warning(
"[Map Pool Supervisor] Unexpected response for map #{map_id}: #{inspect(other)}"
)
other
end
end
end
end
@@ -59,7 +85,7 @@ defmodule WandererApp.Map.MapPoolDynamicSupervisor do
find_pool_by_scanning_registry(map_id)
[{pool_pid, _}] ->
GenServer.call(pool_pid, {:stop_map, map_id})
GenServer.call(pool_pid, {:stop_map, map_id}, @genserver_call_timeout)
end
{:error, reason} ->
@@ -102,7 +128,7 @@ defmodule WandererApp.Map.MapPoolDynamicSupervisor do
# Update the cache to fix the inconsistency
Cachex.put(@cache, map_id, pool_uuid)
GenServer.call(pool_pid, {:stop_map, map_id})
GenServer.call(pool_pid, {:stop_map, map_id}, @genserver_call_timeout)
nil ->
Logger.debug("Map #{map_id} not found in any pool registry")
@@ -113,7 +139,7 @@ defmodule WandererApp.Map.MapPoolDynamicSupervisor do
defp get_available_pool([]), do: nil
defp get_available_pool([{pid, uuid} | pools]) do
defp get_available_pool([{_pid, uuid} | pools]) do
case Registry.lookup(@unique_registry, Module.concat(WandererApp.Map.MapPool, uuid)) do
[] ->
nil
@@ -140,9 +166,13 @@ defmodule WandererApp.Map.MapPoolDynamicSupervisor do
end
defp start_child(map_ids, pools_count) do
case DynamicSupervisor.start_child(@name, {WandererApp.Map.MapPool, map_ids}) do
# Generate UUID for the new pool - this will be used for crash recovery
uuid = UUID.uuid1()
# Pass both UUID and map_ids to the pool for crash recovery support
case DynamicSupervisor.start_child(@name, {WandererApp.Map.MapPool, {uuid, map_ids}}) do
{:ok, pid} ->
Logger.info("Starting map pool, total map_pools: #{pools_count + 1}")
Logger.info("Starting map pool #{uuid}, total map_pools: #{pools_count + 1}")
{:ok, pid}
{:error, {:already_started, pid}} ->

View File

@@ -0,0 +1,190 @@
defmodule WandererApp.Map.MapPoolState do
@moduledoc """
Helper module for persisting MapPool state to ETS for crash recovery.
This module provides functions to save and retrieve MapPool state from an ETS table.
The state survives GenServer crashes but is lost on node restart, which ensures
automatic recovery from crashes while avoiding stale state on system restart.
## ETS Table Ownership
The ETS table `:map_pool_state_table` is owned by the MapPoolSupervisor,
ensuring it survives individual MapPool process crashes.
## State Format
State is stored as tuples: `{pool_uuid, map_ids, last_updated_timestamp}`
where:
- `pool_uuid` is the unique identifier for the pool (key)
- `map_ids` is a list of map IDs managed by this pool
- `last_updated_timestamp` is the Unix timestamp of the last update
"""
require Logger
@table_name :map_pool_state_table
@stale_threshold_hours 24
@doc """
Initializes the ETS table for storing MapPool state.
This should be called by the MapPoolSupervisor during initialization.
The table is created as:
- `:set` - Each pool UUID has exactly one entry
- `:public` - Any process can read/write
- `:named_table` - Can be accessed by name
Returns the table reference, or raises if the table already exists.
"""
@spec init_table() :: :ets.table()
def init_table do
:ets.new(@table_name, [:set, :public, :named_table])
end
@doc """
Saves the current state of a MapPool to ETS.
## Parameters
- `uuid` - The unique identifier for the pool
- `map_ids` - List of map IDs currently managed by this pool
## Examples
iex> MapPoolState.save_pool_state("pool-123", [1, 2, 3])
:ok
"""
@spec save_pool_state(String.t(), [integer()]) :: :ok
def save_pool_state(uuid, map_ids) when is_binary(uuid) and is_list(map_ids) do
timestamp = System.system_time(:second)
true = :ets.insert(@table_name, {uuid, map_ids, timestamp})
Logger.debug("Saved MapPool state for #{uuid}: #{length(map_ids)} maps",
pool_uuid: uuid,
map_count: length(map_ids)
)
:ok
end
@doc """
Retrieves the saved state for a MapPool from ETS.
## Parameters
- `uuid` - The unique identifier for the pool
## Returns
- `{:ok, map_ids}` if state exists
- `{:error, :not_found}` if no state exists for this UUID
## Examples
iex> MapPoolState.get_pool_state("pool-123")
{:ok, [1, 2, 3]}
iex> MapPoolState.get_pool_state("non-existent")
{:error, :not_found}
"""
@spec get_pool_state(String.t()) :: {:ok, [integer()]} | {:error, :not_found}
def get_pool_state(uuid) when is_binary(uuid) do
case :ets.lookup(@table_name, uuid) do
[{^uuid, map_ids, _timestamp}] ->
{:ok, map_ids}
[] ->
{:error, :not_found}
end
end
@doc """
Deletes the state for a MapPool from ETS.
This should be called when a pool is gracefully shut down.
## Parameters
- `uuid` - The unique identifier for the pool
## Examples
iex> MapPoolState.delete_pool_state("pool-123")
:ok
"""
@spec delete_pool_state(String.t()) :: :ok
def delete_pool_state(uuid) when is_binary(uuid) do
true = :ets.delete(@table_name, uuid)
Logger.debug("Deleted MapPool state for #{uuid}", pool_uuid: uuid)
:ok
end
@doc """
Removes stale entries from the ETS table.
Entries are considered stale if they haven't been updated in the last
#{@stale_threshold_hours} hours. This helps prevent the table from growing
unbounded due to pool UUIDs that are no longer in use.
Returns the number of entries deleted.
## Examples
iex> MapPoolState.cleanup_stale_entries()
{:ok, 3}
"""
@spec cleanup_stale_entries() :: {:ok, non_neg_integer()}
def cleanup_stale_entries do
stale_threshold = System.system_time(:second) - @stale_threshold_hours * 3600
match_spec = [
{
{:"$1", :"$2", :"$3"},
[{:<, :"$3", stale_threshold}],
[:"$1"]
}
]
stale_uuids = :ets.select(@table_name, match_spec)
Enum.each(stale_uuids, fn uuid ->
:ets.delete(@table_name, uuid)
Logger.info("Cleaned up stale MapPool state for #{uuid}",
pool_uuid: uuid,
reason: :stale
)
end)
{:ok, length(stale_uuids)}
end
@doc """
Returns all pool states currently stored in ETS.
Useful for debugging and monitoring.
## Examples
iex> MapPoolState.list_all_states()
[
{"pool-123", [1, 2, 3], 1699564800},
{"pool-456", [4, 5], 1699564900}
]
"""
@spec list_all_states() :: [{String.t(), [integer()], integer()}]
def list_all_states do
:ets.tab2list(@table_name)
end
@doc """
Returns the count of pool states currently stored in ETS.
## Examples
iex> MapPoolState.count_states()
5
"""
@spec count_states() :: non_neg_integer()
def count_states do
:ets.info(@table_name, :size)
end
end
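# A minimal recovery sketch, assuming a hypothetical caller module; only the
# MapPoolState functions above are taken from this file. It illustrates how a
# pool process could re-adopt its maps after a crash and persist them afterwards.
defmodule MapPoolRecoverySketch do
  alias WandererApp.Map.MapPoolState

  # Restore the map list saved before a crash, or start empty on first boot.
  def restore_map_ids(pool_uuid) do
    case MapPoolState.get_pool_state(pool_uuid) do
      {:ok, map_ids} -> map_ids
      {:error, :not_found} -> []
    end
  end

  # Persist the current assignment so a future crash restarts with the same maps.
  def remember_map_ids(pool_uuid, map_ids) do
    :ok = MapPoolState.save_pool_state(pool_uuid, map_ids)
  end
end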

View File

@@ -2,6 +2,8 @@ defmodule WandererApp.Map.MapPoolSupervisor do
@moduledoc false
use Supervisor
alias WandererApp.Map.MapPoolState
@name __MODULE__
@registry :map_pool_registry
@unique_registry :unique_map_pool_registry
@@ -11,6 +13,10 @@ defmodule WandererApp.Map.MapPoolSupervisor do
end
def init(_args) do
# Initialize ETS table for MapPool state persistence
# This table survives individual MapPool crashes but is lost on node restart
MapPoolState.init_table()
children = [
{Registry, [keys: :unique, name: @unique_registry]},
{Registry, [keys: :duplicate, name: @registry]},

View File

@@ -167,7 +167,9 @@ defmodule WandererApp.Map.Reconciler do
defp cleanup_zombie_maps([]), do: :ok
defp cleanup_zombie_maps(zombie_maps) do
Logger.warning("[Map Reconciler] Found #{length(zombie_maps)} zombie maps: #{inspect(zombie_maps)}")
Logger.warning(
"[Map Reconciler] Found #{length(zombie_maps)} zombie maps: #{inspect(zombie_maps)}"
)
Enum.each(zombie_maps, fn map_id ->
Logger.info("[Map Reconciler] Cleaning up zombie map: #{map_id}")
@@ -201,7 +203,9 @@ defmodule WandererApp.Map.Reconciler do
defp fix_orphan_maps([]), do: :ok
defp fix_orphan_maps(orphan_maps) do
Logger.warning("[Map Reconciler] Found #{length(orphan_maps)} orphan maps: #{inspect(orphan_maps)}")
Logger.warning(
"[Map Reconciler] Found #{length(orphan_maps)} orphan maps: #{inspect(orphan_maps)}"
)
Enum.each(orphan_maps, fn map_id ->
Logger.info("[Map Reconciler] Fixing orphan map: #{map_id}")
@@ -246,7 +250,10 @@ defmodule WandererApp.Map.Reconciler do
)
:error ->
Logger.warning("[Map Reconciler] Could not find pool for map #{map_id}, removing from cache")
Logger.warning(
"[Map Reconciler] Could not find pool for map #{map_id}, removing from cache"
)
Cachex.del(@cache, map_id)
end
end)

View File

@@ -72,12 +72,12 @@ defmodule WandererApp.Map.Routes do
{:ok, %{routes: routes, systems_static_data: systems_static_data}}
error ->
_error ->
{:ok, %{routes: [], systems_static_data: []}}
end
end
def find(map_id, hubs, origin, routes_settings, true) do
def find(_map_id, hubs, origin, routes_settings, true) do
origin = origin |> String.to_integer()
hubs = hubs |> Enum.map(&(&1 |> String.to_integer()))

View File

@@ -12,7 +12,6 @@ defmodule WandererApp.Map.Operations.Connections do
# Connection type constants
@connection_type_wormhole 0
@connection_type_stargate 1
# Ship size constants
@small_ship_size 0

View File

@@ -90,7 +90,8 @@ defmodule WandererApp.Map.Operations.Signatures do
updated_signatures: [],
removed_signatures: [],
solar_system_id: solar_system_id,
character_id: validated_char_uuid, # Pass internal UUID here
# Pass internal UUID here
character_id: validated_char_uuid,
user_id: user_id,
delete_connection_with_sigs: false
}) do
@@ -176,7 +177,8 @@ defmodule WandererApp.Map.Operations.Signatures do
updated_signatures: [attrs],
removed_signatures: [],
solar_system_id: system.solar_system_id,
character_id: validated_char_uuid, # Pass internal UUID here
# Pass internal UUID here
character_id: validated_char_uuid,
user_id: user_id,
delete_connection_with_sigs: false
})

View File

@@ -34,28 +34,14 @@ defmodule WandererApp.Map.Server.CharactersImpl do
track_characters(map_id, rest)
end
def update_tracked_characters(map_id) do
def invalidate_characters(map_id) do
Task.start_link(fn ->
{:ok, all_map_tracked_character_ids} =
character_ids =
map_id
|> WandererApp.MapCharacterSettingsRepo.get_tracked_by_map_all()
|> case do
{:ok, settings} -> {:ok, settings |> Enum.map(&Map.get(&1, :character_id))}
_ -> {:ok, []}
end
|> WandererApp.Map.get_map!()
|> Map.get(:characters, [])
{:ok, actual_map_tracked_characters} =
WandererApp.Cache.lookup("maps:#{map_id}:tracked_characters", [])
characters_to_remove = actual_map_tracked_characters -- all_map_tracked_character_ids
WandererApp.Cache.insert_or_update(
"map_#{map_id}:invalidate_character_ids",
characters_to_remove,
fn ids ->
(ids ++ characters_to_remove) |> Enum.uniq()
end
)
WandererApp.Cache.insert("map_#{map_id}:invalidate_character_ids", character_ids)
:ok
end)
@@ -78,9 +64,7 @@ defmodule WandererApp.Map.Server.CharactersImpl do
})
end
defp untrack_character(_is_character_map_active, _map_id, character_id) do
:ok
end
defp untrack_character(_is_character_map_active, _map_id, _character_id), do: :ok
defp is_character_map_active?(map_id, character_id) do
case WandererApp.Character.get_character_state(character_id) do
@@ -155,6 +139,12 @@ defmodule WandererApp.Map.Server.CharactersImpl do
Task.start_link(fn ->
with :ok <- WandererApp.Map.remove_character(map_id, character_id),
{:ok, character} <- WandererApp.Character.get_map_character(map_id, character_id) do
# Clean up character-specific cache entries
WandererApp.Cache.delete("map:#{map_id}:character:#{character_id}:solar_system_id")
WandererApp.Cache.delete("map:#{map_id}:character:#{character_id}:station_id")
WandererApp.Cache.delete("map:#{map_id}:character:#{character_id}:structure_id")
WandererApp.Cache.delete("map:#{map_id}:character:#{character_id}:location_updated_at")
Impl.broadcast!(map_id, :character_removed, character)
# ADDITIVE: Also broadcast to external event system (webhooks/WebSocket)
@@ -193,98 +183,103 @@ defmodule WandererApp.Map.Server.CharactersImpl do
end
end
# Calculate optimal concurrency based on character count
# Scales from base concurrency (32 on 8-core) up to 128 for 300+ characters
defp calculate_max_concurrency(character_count) do
base_concurrency = System.schedulers_online() * 4
cond do
character_count < 100 -> base_concurrency
character_count < 200 -> base_concurrency * 2
character_count < 300 -> base_concurrency * 3
true -> base_concurrency * 4
end
end
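# Worked example: on an 8-scheduler node, base_concurrency is 8 * 4 = 32, so
# calculate_max_concurrency/1 returns 32 for fewer than 100 characters,
# 64 for 100-199, 96 for 200-299, and 128 for 300 or more.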
def update_characters(map_id) do
start_time = System.monotonic_time(:microsecond)
try do
{:ok, presence_character_ids} =
WandererApp.Cache.lookup("map_#{map_id}:presence_character_ids", [])
{:ok, tracked_character_ids} = WandererApp.Map.get_tracked_character_ids(map_id)
presence_character_ids
|> Task.async_stream(
fn character_id ->
character_updates =
maybe_update_online(map_id, character_id) ++
maybe_update_tracking_status(map_id, character_id) ++
maybe_update_location(map_id, character_id) ++
maybe_update_ship(map_id, character_id) ++
maybe_update_alliance(map_id, character_id) ++
maybe_update_corporation(map_id, character_id)
character_count = length(tracked_character_ids)
character_updates
|> Enum.filter(fn update -> update != :skip end)
|> Enum.map(fn update ->
update
|> case do
{:character_location, location_info, old_location_info} ->
{:ok, map_state} = WandererApp.Map.get_map_state(map_id)
update_location(
map_state,
character_id,
location_info,
old_location_info
)
:broadcast
{:character_ship, _info} ->
:broadcast
{:character_online, _info} ->
:broadcast
{:character_tracking, _info} ->
:broadcast
{:character_alliance, _info} ->
WandererApp.Cache.insert_or_update(
"map_#{map_id}:invalidate_character_ids",
[character_id],
fn ids ->
[character_id | ids] |> Enum.uniq()
end
)
:broadcast
{:character_corporation, _info} ->
WandererApp.Cache.insert_or_update(
"map_#{map_id}:invalidate_character_ids",
[character_id],
fn ids ->
[character_id | ids] |> Enum.uniq()
end
)
:broadcast
_ ->
:skip
end
end)
|> Enum.filter(fn update -> update != :skip end)
|> Enum.uniq()
|> Enum.each(fn update ->
case update do
:broadcast ->
update_character(map_id, character_id)
_ ->
:ok
end
end)
:ok
end,
timeout: :timer.seconds(15),
max_concurrency: System.schedulers_online() * 4,
on_timeout: :kill_task
# Emit telemetry for tracking update cycle start
:telemetry.execute(
[:wanderer_app, :map, :update_characters, :start],
%{character_count: character_count, system_time: System.system_time()},
%{map_id: map_id}
)
|> Enum.each(fn
{:ok, _result} -> :ok
{:error, reason} -> Logger.error("Error in update_characters: #{inspect(reason)}")
end)
# Calculate dynamic concurrency based on character count
max_concurrency = calculate_max_concurrency(character_count)
updated_characters =
tracked_character_ids
|> Task.async_stream(
fn character_id ->
# Use batch cache operations for all character tracking data
process_character_updates_batched(map_id, character_id)
end,
timeout: :timer.seconds(15),
max_concurrency: max_concurrency,
on_timeout: :kill_task
)
|> Enum.reduce([], fn
{:ok, {:updated, character}}, acc ->
[character | acc]
{:ok, _result}, acc ->
acc
{:error, reason}, acc ->
Logger.error("Error in update_characters: #{inspect(reason)}")
acc
end)
unless Enum.empty?(updated_characters) do
# Broadcast to internal channels
Impl.broadcast!(map_id, :characters_updated, %{
characters: updated_characters,
timestamp: DateTime.utc_now()
})
# Broadcast to external event system (webhooks/WebSocket)
WandererApp.ExternalEvents.broadcast(map_id, :characters_updated, %{
characters: updated_characters,
timestamp: DateTime.utc_now()
})
end
# Emit telemetry for successful completion
duration = System.monotonic_time(:microsecond) - start_time
:telemetry.execute(
[:wanderer_app, :map, :update_characters, :complete],
%{
duration: duration,
character_count: character_count,
updated_count: length(updated_characters),
system_time: System.system_time()
},
%{map_id: map_id}
)
:ok
rescue
e ->
# Emit telemetry for error case
duration = System.monotonic_time(:microsecond) - start_time
:telemetry.execute(
[:wanderer_app, :map, :update_characters, :error],
%{
duration: duration,
system_time: System.system_time()
},
%{map_id: map_id, error: Exception.message(e)}
)
Logger.error("""
[Map Server] update_characters => exception: #{Exception.message(e)}
#{Exception.format_stacktrace(__STACKTRACE__)}
@@ -292,14 +287,382 @@ defmodule WandererApp.Map.Server.CharactersImpl do
end
end
defp update_character(map_id, character_id) do
{:ok, character} = WandererApp.Character.get_map_character(map_id, character_id)
Impl.broadcast!(map_id, :character_updated, character)
# ADDITIVE: Also broadcast to external event system (webhooks/WebSocket)
WandererApp.ExternalEvents.broadcast(map_id, :character_updated, character)
defp calculate_character_state_hash(character) do
# Hash all trackable fields for quick comparison
:erlang.phash2(%{
online: character.online,
ship: character.ship,
ship_name: character.ship_name,
ship_item_id: character.ship_item_id,
solar_system_id: character.solar_system_id,
station_id: character.station_id,
structure_id: character.structure_id,
alliance_id: character.alliance_id,
corporation_id: character.corporation_id
})
end
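# :erlang.phash2/1 is deterministic, so two snapshots with identical trackable
# fields always produce the same hash; for example (values illustrative):
#   :erlang.phash2(%{online: true, solar_system_id: 30000142}) ==
#     :erlang.phash2(%{online: true, solar_system_id: 30000142})
#   #=> true
# This is what makes the hash comparison below a safe fast-path skip.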
defp process_character_updates_batched(map_id, character_id) do
# Step 1: Get current character data for hash comparison
case WandererApp.Character.get_character(character_id) do
{:ok, character} ->
new_hash = calculate_character_state_hash(character)
state_hash_key = "map:#{map_id}:character:#{character_id}:state_hash"
{:ok, old_hash} = WandererApp.Cache.lookup(state_hash_key, nil)
if new_hash == old_hash do
# No changes detected - skip expensive processing (70-90% of cases)
:no_change
else
# Changes detected - proceed with full processing
process_character_changes(map_id, character_id, character, state_hash_key, new_hash)
end
{:error, _error} ->
:ok
end
end
# Process character changes when hash indicates updates
defp process_character_changes(map_id, character_id, character, state_hash_key, new_hash) do
# Step 1: Batch read all cached values for this character
cache_keys = [
"map:#{map_id}:character:#{character_id}:online",
"map:#{map_id}:character:#{character_id}:ship_type_id",
"map:#{map_id}:character:#{character_id}:ship_name",
"map:#{map_id}:character:#{character_id}:solar_system_id",
"map:#{map_id}:character:#{character_id}:station_id",
"map:#{map_id}:character:#{character_id}:structure_id",
"map:#{map_id}:character:#{character_id}:location_updated_at",
"map:#{map_id}:character:#{character_id}:alliance_id",
"map:#{map_id}:character:#{character_id}:corporation_id"
]
{:ok, cached_values} = WandererApp.Cache.lookup_all(cache_keys)
# Step 2: Calculate all updates
{character_updates, cache_updates} =
calculate_character_updates(map_id, character_id, character, cached_values)
# Step 3: Update the state hash in cache
cache_updates = Map.put(cache_updates, state_hash_key, new_hash)
# Step 4: Batch write all cache updates
unless Enum.empty?(cache_updates) do
WandererApp.Cache.insert_all(cache_updates)
end
# Step 5: Process update events
has_updates =
character_updates
|> Enum.filter(fn update -> update != :skip end)
|> Enum.map(fn update ->
case update do
{:character_location, location_info, old_location_info} ->
start_time = System.monotonic_time(:microsecond)
:telemetry.execute(
[:wanderer_app, :character, :location_update, :start],
%{system_time: System.system_time()},
%{
character_id: character_id,
map_id: map_id,
from_system: old_location_info.solar_system_id,
to_system: location_info.solar_system_id
}
)
{:ok, map_state} = WandererApp.Map.get_map_state(map_id)
update_location(
map_state,
character_id,
location_info,
old_location_info
)
duration = System.monotonic_time(:microsecond) - start_time
:telemetry.execute(
[:wanderer_app, :character, :location_update, :complete],
%{duration: duration, system_time: System.system_time()},
%{
character_id: character_id,
map_id: map_id,
from_system: old_location_info.solar_system_id,
to_system: location_info.solar_system_id
}
)
:has_update
{:character_ship, _info} ->
:has_update
{:character_online, %{online: online}} ->
if not online do
WandererApp.Cache.delete("map:#{map_id}:character:#{character_id}:solar_system_id")
end
:has_update
{:character_tracking, _info} ->
:has_update
{:character_alliance, _info} ->
WandererApp.Cache.insert_or_update(
"map_#{map_id}:invalidate_character_ids",
[character_id],
fn ids ->
[character_id | ids] |> Enum.uniq()
end
)
:has_update
{:character_corporation, _info} ->
WandererApp.Cache.insert_or_update(
"map_#{map_id}:invalidate_character_ids",
[character_id],
fn ids ->
[character_id | ids] |> Enum.uniq()
end
)
:has_update
_ ->
:skip
end
end)
|> Enum.any?(fn result -> result == :has_update end)
if has_updates do
case WandererApp.Character.get_map_character(map_id, character_id) do
{:ok, character} ->
{:updated, character}
{:error, _} ->
:ok
end
else
:ok
end
end
# Calculate all character updates in a single pass
defp calculate_character_updates(map_id, character_id, character, cached_values) do
updates = []
cache_updates = %{}
# Check each type of update using specialized functions
{updates, cache_updates} =
check_online_update(map_id, character_id, character, cached_values, updates, cache_updates)
{updates, cache_updates} =
check_ship_update(map_id, character_id, character, cached_values, updates, cache_updates)
{updates, cache_updates} =
check_location_update(
map_id,
character_id,
character,
cached_values,
updates,
cache_updates
)
{updates, cache_updates} =
check_alliance_update(
map_id,
character_id,
character,
cached_values,
updates,
cache_updates
)
{updates, cache_updates} =
check_corporation_update(
map_id,
character_id,
character,
cached_values,
updates,
cache_updates
)
{updates, cache_updates}
end
# Check for online status changes
defp check_online_update(map_id, character_id, character, cached_values, updates, cache_updates) do
online_key = "map:#{map_id}:character:#{character_id}:online"
old_online = Map.get(cached_values, online_key)
if character.online != old_online do
{
[{:character_online, %{online: character.online}} | updates],
Map.put(cache_updates, online_key, character.online)
}
else
{updates, cache_updates}
end
end
# Check for ship changes
defp check_ship_update(map_id, character_id, character, cached_values, updates, cache_updates) do
ship_type_key = "map:#{map_id}:character:#{character_id}:ship_type_id"
ship_name_key = "map:#{map_id}:character:#{character_id}:ship_name"
old_ship_type_id = Map.get(cached_values, ship_type_key)
old_ship_name = Map.get(cached_values, ship_name_key)
if character.ship != old_ship_type_id or character.ship_name != old_ship_name do
{
[
{:character_ship,
%{
ship: character.ship,
ship_name: character.ship_name,
ship_item_id: character.ship_item_id
}}
| updates
],
cache_updates
|> Map.put(ship_type_key, character.ship)
|> Map.put(ship_name_key, character.ship_name)
}
else
{updates, cache_updates}
end
end
# Check for location changes with race condition detection
defp check_location_update(
map_id,
character_id,
character,
cached_values,
updates,
cache_updates
) do
solar_system_key = "map:#{map_id}:character:#{character_id}:solar_system_id"
station_key = "map:#{map_id}:character:#{character_id}:station_id"
structure_key = "map:#{map_id}:character:#{character_id}:structure_id"
location_timestamp_key = "map:#{map_id}:character:#{character_id}:location_updated_at"
old_solar_system_id = Map.get(cached_values, solar_system_key)
old_station_id = Map.get(cached_values, station_key)
old_structure_id = Map.get(cached_values, structure_key)
old_timestamp = Map.get(cached_values, location_timestamp_key)
if character.solar_system_id != old_solar_system_id ||
character.structure_id != old_structure_id ||
character.station_id != old_station_id do
# Race condition detection
{:ok, current_cached_timestamp} =
WandererApp.Cache.lookup(location_timestamp_key)
race_detected =
!is_nil(old_timestamp) && !is_nil(current_cached_timestamp) &&
old_timestamp != current_cached_timestamp
if race_detected do
Logger.warning(
"[CharacterTracking] Race condition detected for character #{character_id} on map #{map_id}: " <>
"cache was modified between read (#{inspect(old_timestamp)}) and write (#{inspect(current_cached_timestamp)})"
)
:telemetry.execute(
[:wanderer_app, :character, :location_update, :race_condition],
%{system_time: System.system_time()},
%{
character_id: character_id,
map_id: map_id,
old_system: old_solar_system_id,
new_system: character.solar_system_id,
old_timestamp: old_timestamp,
current_timestamp: current_cached_timestamp
}
)
end
now = DateTime.utc_now()
{
[
{:character_location,
%{
solar_system_id: character.solar_system_id,
structure_id: character.structure_id,
station_id: character.station_id
}, %{solar_system_id: old_solar_system_id}}
| updates
],
cache_updates
|> Map.put(solar_system_key, character.solar_system_id)
|> Map.put(station_key, character.station_id)
|> Map.put(structure_key, character.structure_id)
|> Map.put(location_timestamp_key, now)
}
else
{updates, cache_updates}
end
end
# Check for alliance changes
defp check_alliance_update(
map_id,
character_id,
character,
cached_values,
updates,
cache_updates
) do
alliance_key = "map:#{map_id}:character:#{character_id}:alliance_id"
old_alliance_id = Map.get(cached_values, alliance_key)
if character.alliance_id != old_alliance_id do
{
[{:character_alliance, %{alliance_id: character.alliance_id}} | updates],
Map.put(cache_updates, alliance_key, character.alliance_id)
}
else
{updates, cache_updates}
end
end
# Check for corporation changes
defp check_corporation_update(
map_id,
character_id,
character,
cached_values,
updates,
cache_updates
) do
corporation_key = "map:#{map_id}:character:#{character_id}:corporation_id"
old_corporation_id = Map.get(cached_values, corporation_key)
if character.corporation_id != old_corporation_id do
{
[{:character_corporation, %{corporation_id: character.corporation_id}} | updates],
Map.put(cache_updates, corporation_key, character.corporation_id)
}
else
{updates, cache_updates}
end
end
defp update_location(
_state,
_character_id,
_location,
%{solar_system_id: nil}
),
do: :ok
defp update_location(
%{map: %{scope: scope}, map_id: map_id, map_opts: map_opts} =
_state,
@@ -307,49 +670,59 @@ defmodule WandererApp.Map.Server.CharactersImpl do
location,
old_location
) do
start_solar_system_id =
WandererApp.Cache.take("map:#{map_id}:character:#{character_id}:start_solar_system_id")
case is_nil(old_location.solar_system_id) &&
is_nil(start_solar_system_id) &&
ConnectionsImpl.can_add_location(scope, location.solar_system_id) do
ConnectionsImpl.is_connection_valid(
scope,
old_location.solar_system_id,
location.solar_system_id
)
|> case do
true ->
:ok = SystemsImpl.maybe_add_system(map_id, location, nil, map_opts)
# Add new location system
case SystemsImpl.maybe_add_system(map_id, location, old_location, map_opts) do
:ok ->
:ok
_ ->
if is_nil(start_solar_system_id) || start_solar_system_id == old_location.solar_system_id do
ConnectionsImpl.is_connection_valid(
scope,
old_location.solar_system_id,
location.solar_system_id
)
|> case do
true ->
:ok =
SystemsImpl.maybe_add_system(map_id, location, old_location, map_opts)
{:error, error} ->
Logger.error(
"[CharacterTracking] Failed to add new location system #{location.solar_system_id} for character #{character_id} on map #{map_id}: #{inspect(error)}"
)
end
:ok =
SystemsImpl.maybe_add_system(map_id, old_location, location, map_opts)
# Add old location system (in case it wasn't on the map)
case SystemsImpl.maybe_add_system(map_id, old_location, location, map_opts) do
:ok ->
:ok
if is_character_in_space?(location) do
:ok =
ConnectionsImpl.maybe_add_connection(
map_id,
location,
old_location,
character_id,
false,
nil
)
end
{:error, error} ->
Logger.error(
"[CharacterTracking] Failed to add old location system #{old_location.solar_system_id} for character #{character_id} on map #{map_id}: #{inspect(error)}"
)
end
# Add connection if character is in space
if is_character_in_space?(location) do
case ConnectionsImpl.maybe_add_connection(
map_id,
location,
old_location,
character_id,
false,
nil
) do
:ok ->
:ok
{:error, error} ->
Logger.error(
"[CharacterTracking] Failed to add connection for character #{character_id} on map #{map_id}: #{inspect(error)}"
)
_ ->
:ok
end
else
# skip adding connection or system if character just started tracking on the map
:ok
end
_ ->
:ok
end
end
@@ -390,197 +763,14 @@ defmodule WandererApp.Map.Server.CharactersImpl do
end
defp track_character(map_id, character_id) do
{:ok, %{solar_system_id: solar_system_id} = map_character} =
WandererApp.Character.get_map_character(map_id, character_id, not_present: true)
{:ok, character} =
WandererApp.Character.get_character(character_id)
WandererApp.Cache.delete("character:#{character_id}:tracking_paused")
add_character(map_id, map_character, true)
add_character(map_id, character, true)
WandererApp.Character.TrackerManager.update_track_settings(character_id, %{
map_id: map_id,
track: true,
track_online: true,
track_location: true,
track_ship: true,
solar_system_id: solar_system_id
track: true
})
end
defp maybe_update_online(map_id, character_id) do
with {:ok, old_online} <-
WandererApp.Cache.lookup("map:#{map_id}:character:#{character_id}:online"),
{:ok, %{online: online}} <-
WandererApp.Character.get_character(character_id) do
case old_online != online do
true ->
WandererApp.Cache.insert(
"map:#{map_id}:character:#{character_id}:online",
online
)
[{:character_online, %{online: online}}]
_ ->
[:skip]
end
else
error ->
Logger.error("Failed to update online: #{inspect(error, pretty: true)}")
[:skip]
end
end
defp maybe_update_tracking_status(map_id, character_id) do
with {:ok, old_tracking_paused} <-
WandererApp.Cache.lookup(
"map:#{map_id}:character:#{character_id}:tracking_paused",
false
),
{:ok, tracking_paused} <-
WandererApp.Cache.lookup("character:#{character_id}:tracking_paused", false) do
case old_tracking_paused != tracking_paused do
true ->
WandererApp.Cache.insert(
"map:#{map_id}:character:#{character_id}:tracking_paused",
tracking_paused
)
[{:character_tracking, %{tracking_paused: tracking_paused}}]
_ ->
[:skip]
end
else
error ->
Logger.error("Failed to update character_tracking: #{inspect(error, pretty: true)}")
[:skip]
end
end
defp maybe_update_ship(map_id, character_id) do
with {:ok, old_ship_type_id} <-
WandererApp.Cache.lookup("map:#{map_id}:character:#{character_id}:ship_type_id"),
{:ok, old_ship_name} <-
WandererApp.Cache.lookup("map:#{map_id}:character:#{character_id}:ship_name"),
{:ok, %{ship: ship_type_id, ship_name: ship_name, ship_item_id: ship_item_id}} <-
WandererApp.Character.get_character(character_id) do
case old_ship_type_id != ship_type_id or
old_ship_name != ship_name do
true ->
WandererApp.Cache.insert(
"map:#{map_id}:character:#{character_id}:ship_type_id",
ship_type_id
)
WandererApp.Cache.insert(
"map:#{map_id}:character:#{character_id}:ship_name",
ship_name
)
[
{:character_ship,
%{ship: ship_type_id, ship_name: ship_name, ship_item_id: ship_item_id}}
]
_ ->
[:skip]
end
else
error ->
Logger.error("Failed to update ship: #{inspect(error, pretty: true)}")
[:skip]
end
end
defp maybe_update_location(map_id, character_id) do
{:ok, old_solar_system_id} =
WandererApp.Cache.lookup("map:#{map_id}:character:#{character_id}:solar_system_id")
{:ok, old_station_id} =
WandererApp.Cache.lookup("map:#{map_id}:character:#{character_id}:station_id")
{:ok, old_structure_id} =
WandererApp.Cache.lookup("map:#{map_id}:character:#{character_id}:structure_id")
{:ok, %{solar_system_id: solar_system_id, structure_id: structure_id, station_id: station_id}} =
WandererApp.Character.get_character(character_id)
WandererApp.Cache.insert(
"map:#{map_id}:character:#{character_id}:solar_system_id",
solar_system_id
)
WandererApp.Cache.insert(
"map:#{map_id}:character:#{character_id}:station_id",
station_id
)
WandererApp.Cache.insert(
"map:#{map_id}:character:#{character_id}:structure_id",
structure_id
)
if solar_system_id != old_solar_system_id || structure_id != old_structure_id ||
station_id != old_station_id do
[
{:character_location,
%{
solar_system_id: solar_system_id,
structure_id: structure_id,
station_id: station_id
}, %{solar_system_id: old_solar_system_id}}
]
else
[:skip]
end
end
defp maybe_update_alliance(map_id, character_id) do
with {:ok, old_alliance_id} <-
WandererApp.Cache.lookup("map:#{map_id}:character:#{character_id}:alliance_id"),
{:ok, %{alliance_id: alliance_id}} <-
WandererApp.Character.get_character(character_id) do
case old_alliance_id != alliance_id do
true ->
WandererApp.Cache.insert(
"map:#{map_id}:character:#{character_id}:alliance_id",
alliance_id
)
[{:character_alliance, %{alliance_id: alliance_id}}]
_ ->
[:skip]
end
else
error ->
Logger.error("Failed to update alliance: #{inspect(error, pretty: true)}")
[:skip]
end
end
defp maybe_update_corporation(map_id, character_id) do
with {:ok, old_corporation_id} <-
WandererApp.Cache.lookup("map:#{map_id}:character:#{character_id}:corporation_id"),
{:ok, %{corporation_id: corporation_id}} <-
WandererApp.Character.get_character(character_id) do
case old_corporation_id != corporation_id do
true ->
WandererApp.Cache.insert(
"map:#{map_id}:character:#{character_id}:corporation_id",
corporation_id
)
[{:character_corporation, %{corporation_id: corporation_id}}]
_ ->
[:skip]
end
else
error ->
Logger.error("Failed to update corporation: #{inspect(error, pretty: true)}")
[:skip]
end
end
end

View File

@@ -223,6 +223,7 @@ defmodule WandererApp.Map.Server.ConnectionsImpl do
update_connection(map_id, :update_time_status, [:time_status], connection_update, fn
%{time_status: old_time_status},
%{id: connection_id, time_status: time_status} = updated_connection ->
# Handle EOL marking cache separately
case time_status == @connection_time_status_eol do
true ->
if old_time_status != @connection_time_status_eol do
@@ -230,18 +231,30 @@ defmodule WandererApp.Map.Server.ConnectionsImpl do
"map_#{map_id}:conn_#{connection_id}:mark_eol_time",
DateTime.utc_now()
)
set_start_time(map_id, connection_id, DateTime.utc_now())
end
_ ->
if old_time_status == @connection_time_status_eol do
WandererApp.Cache.delete("map_#{map_id}:conn_#{connection_id}:mark_eol_time")
set_start_time(map_id, connection_id, DateTime.utc_now())
end
end
# Always reset start_time when the status changes (manual override).
# This ensures manual user changes aren't immediately overridden by cleanup.
if time_status != old_time_status do
# Emit telemetry for manual time status change
:telemetry.execute(
[:wanderer_app, :connection, :manual_status_change],
%{system_time: System.system_time()},
%{
map_id: map_id,
connection_id: connection_id,
old_time_status: old_time_status,
new_time_status: time_status
}
)
set_start_time(map_id, connection_id, DateTime.utc_now())
maybe_update_linked_signature_time_status(map_id, updated_connection)
end
end)
@@ -353,6 +366,25 @@ defmodule WandererApp.Map.Server.ConnectionsImpl do
solar_system_source_id,
solar_system_target_id
) do
# Emit telemetry for automatic time status downgrade
elapsed_minutes = DateTime.diff(DateTime.utc_now(), connection_start_time, :minute)
:telemetry.execute(
[:wanderer_app, :connection, :auto_downgrade],
%{
elapsed_minutes: elapsed_minutes,
system_time: System.system_time()
},
%{
map_id: map_id,
connection_id: connection_id,
old_time_status: time_status,
new_time_status: new_time_status,
solar_system_source: solar_system_source_id,
solar_system_target: solar_system_target_id
}
)
set_start_time(map_id, connection_id, DateTime.utc_now())
update_connection_time_status(map_id, %{
@@ -401,7 +433,7 @@ defmodule WandererApp.Map.Server.ConnectionsImpl do
)
else
error ->
Logger.error("Failed to update_linked_signature_time_status: #{inspect(error)}")
Logger.warning("Failed to update_linked_signature_time_status: #{inspect(error)}")
end
end
@@ -537,6 +569,12 @@ defmodule WandererApp.Map.Server.ConnectionsImpl do
Impl.broadcast!(map_id, :add_connection, connection)
Impl.broadcast!(map_id, :maybe_link_signature, %{
character_id: character_id,
solar_system_source: old_location.solar_system_id,
solar_system_target: location.solar_system_id
})
# ADDITIVE: Also broadcast to external event system (webhooks/WebSocket)
WandererApp.ExternalEvents.broadcast(map_id, :connection_added, %{
connection_id: connection.id,
@@ -548,19 +586,12 @@ defmodule WandererApp.Map.Server.ConnectionsImpl do
time_status: connection.time_status
})
{:ok, _} =
WandererApp.User.ActivityTracker.track_map_event(:map_connection_added, %{
character_id: character_id,
user_id: character.user_id,
map_id: map_id,
solar_system_source_id: old_location.solar_system_id,
solar_system_target_id: location.solar_system_id
})
Impl.broadcast!(map_id, :maybe_link_signature, %{
WandererApp.User.ActivityTracker.track_map_event(:map_connection_added, %{
character_id: character_id,
solar_system_source: old_location.solar_system_id,
solar_system_target: location.solar_system_id
user_id: character.user_id,
map_id: map_id,
solar_system_source_id: old_location.solar_system_id,
solar_system_target_id: location.solar_system_id
})
:ok
@@ -657,14 +688,17 @@ defmodule WandererApp.Map.Server.ConnectionsImpl do
)
)
def is_connection_valid(_scope, from_solar_system_id, to_solar_system_id)
when is_nil(from_solar_system_id) or is_nil(to_solar_system_id),
do: false
def is_connection_valid(:all, from_solar_system_id, to_solar_system_id),
do: from_solar_system_id != to_solar_system_id
def is_connection_valid(:none, _from_solar_system_id, _to_solar_system_id), do: false
def is_connection_valid(scope, from_solar_system_id, to_solar_system_id)
when not is_nil(from_solar_system_id) and not is_nil(to_solar_system_id) and
from_solar_system_id != to_solar_system_id do
when from_solar_system_id != to_solar_system_id do
with {:ok, known_jumps} <- find_solar_system_jump(from_solar_system_id, to_solar_system_id),
{:ok, from_system_static_info} <- get_system_static_info(from_solar_system_id),
{:ok, to_system_static_info} <- get_system_static_info(to_solar_system_id) do

View File

@@ -25,12 +25,11 @@ defmodule WandererApp.Map.Server.Impl do
]
@pubsub_client Application.compile_env(:wanderer_app, :pubsub_client)
@connections_cleanup_timeout :timer.minutes(1)
@ddrt Application.compile_env(:wanderer_app, :ddrt)
@update_presence_timeout :timer.seconds(5)
@update_characters_timeout :timer.seconds(1)
@update_tracked_characters_timeout :timer.minutes(1)
@invalidate_characters_timeout :timer.hours(1)
def new(), do: __struct__()
def new(args), do: __struct__(args)
@@ -45,19 +44,77 @@ defmodule WandererApp.Map.Server.Impl do
}
|> new()
with {:ok, map} <-
WandererApp.MapRepo.get(map_id, [
:owner,
:characters,
acls: [
:owner_id,
members: [:role, :eve_character_id, :eve_corporation_id, :eve_alliance_id]
]
]),
{:ok, systems} <- WandererApp.MapSystemRepo.get_visible_by_map(map_id),
{:ok, connections} <- WandererApp.MapConnectionRepo.get_by_map(map_id),
{:ok, subscription_settings} <-
WandererApp.Map.SubscriptionManager.get_active_map_subscription(map_id) do
# Parallelize database queries for faster initialization
start_time = System.monotonic_time(:millisecond)
tasks = [
Task.async(fn ->
{:map,
WandererApp.MapRepo.get(map_id, [
:owner,
:characters,
acls: [
:owner_id,
members: [:role, :eve_character_id, :eve_corporation_id, :eve_alliance_id]
]
])}
end),
Task.async(fn ->
{:systems, WandererApp.MapSystemRepo.get_visible_by_map(map_id)}
end),
Task.async(fn ->
{:connections, WandererApp.MapConnectionRepo.get_by_map(map_id)}
end),
Task.async(fn ->
{:subscription, WandererApp.Map.SubscriptionManager.get_active_map_subscription(map_id)}
end)
]
results = Task.await_many(tasks, :timer.seconds(15))
duration = System.monotonic_time(:millisecond) - start_time
# Emit telemetry for slow initializations
if duration > 5_000 do
Logger.warning("[Map Server] Slow map state initialization: #{map_id} took #{duration}ms")
:telemetry.execute(
[:wanderer_app, :map, :slow_init],
%{duration_ms: duration},
%{map_id: map_id}
)
end
# Extract results
map_result =
Enum.find_value(results, fn
{:map, result} -> result
_ -> nil
end)
systems_result =
Enum.find_value(results, fn
{:systems, result} -> result
_ -> nil
end)
connections_result =
Enum.find_value(results, fn
{:connections, result} -> result
_ -> nil
end)
subscription_result =
Enum.find_value(results, fn
{:subscription, result} -> result
_ -> nil
end)
# Process results
with {:ok, map} <- map_result,
{:ok, systems} <- systems_result,
{:ok, connections} <- connections_result,
{:ok, subscription_settings} <- subscription_result do
initial_state
|> init_map(
map,
@@ -88,13 +145,12 @@ defmodule WandererApp.Map.Server.Impl do
"maps:#{map_id}"
)
WandererApp.Map.CacheRTree.init_tree("rtree_#{map_id}", %{width: 150, verbose: false})
Process.send_after(self(), {:update_characters, map_id}, @update_characters_timeout)
Process.send_after(
self(),
{:update_tracked_characters, map_id},
@update_tracked_characters_timeout
{:invalidate_characters, map_id},
@invalidate_characters_timeout
)
Process.send_after(self(), {:update_presence, map_id}, @update_presence_timeout)
@@ -143,17 +199,11 @@ defmodule WandererApp.Map.Server.Impl do
defdelegate cleanup_systems(map_id), to: SystemsImpl
defdelegate cleanup_connections(map_id), to: ConnectionsImpl
defdelegate cleanup_characters(map_id), to: CharactersImpl
defdelegate untrack_characters(map_id, characters_ids), to: CharactersImpl
defdelegate add_system(map_id, system_info, user_id, character_id, opts \\ []), to: SystemsImpl
defdelegate paste_connections(map_id, connections, user_id, character_id), to: ConnectionsImpl
defdelegate paste_systems(map_id, systems, user_id, character_id, opts), to: SystemsImpl
defdelegate add_system_comment(map_id, comment_info, user_id, character_id), to: SystemsImpl
defdelegate remove_system_comment(map_id, comment_id, user_id, character_id), to: SystemsImpl
defdelegate delete_systems(
@@ -165,49 +215,27 @@ defmodule WandererApp.Map.Server.Impl do
to: SystemsImpl
defdelegate update_system_name(map_id, update), to: SystemsImpl
defdelegate update_system_description(map_id, update), to: SystemsImpl
defdelegate update_system_status(map_id, update), to: SystemsImpl
defdelegate update_system_tag(map_id, update), to: SystemsImpl
defdelegate update_system_temporary_name(map_id, update), to: SystemsImpl
defdelegate update_system_locked(map_id, update), to: SystemsImpl
defdelegate update_system_labels(map_id, update), to: SystemsImpl
defdelegate update_system_linked_sig_eve_id(map_id, update), to: SystemsImpl
defdelegate update_system_position(map_id, update), to: SystemsImpl
defdelegate add_hub(map_id, hub_info), to: SystemsImpl
defdelegate remove_hub(map_id, hub_info), to: SystemsImpl
defdelegate add_ping(map_id, ping_info), to: PingsImpl
defdelegate cancel_ping(map_id, ping_info), to: PingsImpl
defdelegate add_connection(map_id, connection_info), to: ConnectionsImpl
defdelegate delete_connection(map_id, connection_info), to: ConnectionsImpl
defdelegate get_connection_info(map_id, connection_info), to: ConnectionsImpl
defdelegate update_connection_time_status(map_id, connection_update), to: ConnectionsImpl
defdelegate update_connection_type(map_id, connection_update), to: ConnectionsImpl
defdelegate update_connection_mass_status(map_id, connection_update), to: ConnectionsImpl
defdelegate update_connection_ship_size_type(map_id, connection_update), to: ConnectionsImpl
defdelegate update_connection_locked(map_id, connection_update), to: ConnectionsImpl
defdelegate update_connection_custom_info(map_id, connection_update), to: ConnectionsImpl
defdelegate update_signatures(map_id, signatures_update), to: SignaturesImpl
def import_settings(map_id, settings, user_id) do
@@ -274,14 +302,14 @@ defmodule WandererApp.Map.Server.Impl do
CharactersImpl.update_characters(map_id)
end
def handle_event({:update_tracked_characters, map_id} = event) do
def handle_event({:invalidate_characters, map_id} = event) do
Process.send_after(
self(),
event,
@update_tracked_characters_timeout
@invalidate_characters_timeout
)
CharactersImpl.update_tracked_characters(map_id)
CharactersImpl.invalidate_characters(map_id)
end
def handle_event({:update_presence, map_id} = event) do
@@ -358,6 +386,13 @@ defmodule WandererApp.Map.Server.Impl do
update_options(map_id, options)
end
def handle_event(:map_deleted) do
# Map has been deleted - this event is handled by MapPool to stop the server
# and by MapLive to redirect users. Nothing to do here.
Logger.debug("Map deletion event received, will be handled by MapPool")
:ok
end
def handle_event({ref, _result}) when is_reference(ref) do
Process.demonitor(ref, [:flush])
end
@@ -452,6 +487,8 @@ defmodule WandererApp.Map.Server.Impl do
) do
{:ok, options} = WandererApp.MapRepo.options_to_form_data(initial_map)
@ddrt.init_tree("rtree_#{map_id}", %{width: 150, verbose: false})
map =
initial_map
|> WandererApp.Map.new()

View File

@@ -212,9 +212,9 @@ defmodule WandererApp.Map.Server.SignaturesImpl do
defp maybe_update_connection_time_status(
map_id,
%{custom_info: old_custom_info} = old_sig,
%{custom_info: old_custom_info} = _old_sig,
%{custom_info: new_custom_info, system_id: system_id, linked_system_id: linked_system_id} =
updated_sig
_updated_sig
)
when not is_nil(linked_system_id) do
old_time_status = get_time_status(old_custom_info)
@@ -235,9 +235,9 @@ defmodule WandererApp.Map.Server.SignaturesImpl do
defp maybe_update_connection_mass_status(
map_id,
%{type: old_type} = old_sig,
%{type: old_type} = _old_sig,
%{type: new_type, system_id: system_id, linked_system_id: linked_system_id} =
updated_sig
_updated_sig
)
when not is_nil(linked_system_id) do
if old_type != new_type do

View File

@@ -45,7 +45,7 @@ defmodule WandererApp.Map.Server.SystemsImpl do
} = system_info,
user_id,
character_id,
opts
_opts
) do
map_id
|> WandererApp.Map.check_location(%{solar_system_id: solar_system_id})
@@ -100,8 +100,8 @@ defmodule WandererApp.Map.Server.SystemsImpl do
%{
solar_system_id: solar_system_id,
text: text
} = comment_info,
user_id,
} = _comment_info,
_user_id,
character_id
) do
system =
@@ -431,6 +431,16 @@ defmodule WandererApp.Map.Server.SystemsImpl do
def maybe_add_system(map_id, location, old_location, map_opts)
when not is_nil(location) do
:telemetry.execute(
[:wanderer_app, :map, :system_addition, :start],
%{system_time: System.system_time()},
%{
map_id: map_id,
solar_system_id: location.solar_system_id,
from_system: old_location && old_location.solar_system_id
}
)
case WandererApp.Map.check_location(map_id, location) do
{:ok, location} ->
rtree_name = "rtree_#{map_id}"
@@ -481,49 +491,142 @@ defmodule WandererApp.Map.Server.SystemsImpl do
position_y: updated_system.position_y
})
:telemetry.execute(
[:wanderer_app, :map, :system_addition, :complete],
%{system_time: System.system_time()},
%{
map_id: map_id,
solar_system_id: updated_system.solar_system_id,
system_id: updated_system.id,
operation: :update_existing
}
)
:ok
_ ->
{:ok, solar_system_info} =
WandererApp.CachedInfo.get_system_static_info(location.solar_system_id)
WandererApp.MapSystemRepo.create(%{
map_id: map_id,
solar_system_id: location.solar_system_id,
name: solar_system_info.solar_system_name,
position_x: position.x,
position_y: position.y
})
WandererApp.CachedInfo.get_system_static_info(location.solar_system_id)
|> case do
{:ok, new_system} ->
@ddrt.insert(
{new_system.solar_system_id,
WandererApp.Map.PositionCalculator.get_system_bounding_rect(new_system)},
rtree_name
)
WandererApp.Cache.put(
"map_#{map_id}:system_#{new_system.id}:last_activity",
DateTime.utc_now(),
ttl: @system_inactive_timeout
)
WandererApp.Map.add_system(map_id, new_system)
Impl.broadcast!(map_id, :add_system, new_system)
# ADDITIVE: Also broadcast to external event system (webhooks/WebSocket)
WandererApp.ExternalEvents.broadcast(map_id, :add_system, %{
solar_system_id: new_system.solar_system_id,
name: new_system.name,
position_x: new_system.position_x,
position_y: new_system.position_y
{:ok, solar_system_info} ->
# Use upsert instead of create - handles race conditions gracefully
WandererApp.MapSystemRepo.upsert(%{
map_id: map_id,
solar_system_id: location.solar_system_id,
name: solar_system_info.solar_system_name,
position_x: position.x,
position_y: position.y
})
|> case do
{:ok, system} ->
# System was either created or updated - both cases are success
@ddrt.insert(
{system.solar_system_id,
WandererApp.Map.PositionCalculator.get_system_bounding_rect(system)},
rtree_name
)
:ok
WandererApp.Cache.put(
"map_#{map_id}:system_#{system.id}:last_activity",
DateTime.utc_now(),
ttl: @system_inactive_timeout
)
WandererApp.Map.add_system(map_id, system)
Impl.broadcast!(map_id, :add_system, system)
# ADDITIVE: Also broadcast to external event system (webhooks/WebSocket)
WandererApp.ExternalEvents.broadcast(map_id, :add_system, %{
solar_system_id: system.solar_system_id,
name: system.name,
position_x: system.position_x,
position_y: system.position_y
})
:telemetry.execute(
[:wanderer_app, :map, :system_addition, :complete],
%{system_time: System.system_time()},
%{
map_id: map_id,
solar_system_id: system.solar_system_id,
system_id: system.id,
operation: :upsert
}
)
:ok
{:error, error} = result ->
Logger.warning(
"[CharacterTracking] Failed to upsert system #{location.solar_system_id} on map #{map_id}: #{inspect(error, pretty: true)}"
)
:telemetry.execute(
[:wanderer_app, :map, :system_addition, :error],
%{system_time: System.system_time()},
%{
map_id: map_id,
solar_system_id: location.solar_system_id,
error: error,
reason: :db_upsert_failed
}
)
result
error ->
Logger.warning(
"[CharacterTracking] Failed to upsert system #{location.solar_system_id} on map #{map_id}: #{inspect(error, pretty: true)}"
)
:telemetry.execute(
[:wanderer_app, :map, :system_addition, :error],
%{system_time: System.system_time()},
%{
map_id: map_id,
solar_system_id: location.solar_system_id,
error: error,
reason: :db_upsert_failed_unexpected
}
)
{:error, error}
end
{:error, error} = result ->
Logger.warning(
"[CharacterTracking] Failed to add system #{inspect(location.solar_system_id)} on map #{map_id}: #{inspect(error, pretty: true)}"
)
:telemetry.execute(
[:wanderer_app, :map, :system_addition, :error],
%{system_time: System.system_time()},
%{
map_id: map_id,
solar_system_id: location.solar_system_id,
error: error,
reason: :db_upsert_failed
}
)
result
error ->
Logger.warning("Failed to create system: #{inspect(error, pretty: true)}")
:ok
Logger.warning(
"[CharacterTracking] Failed to add system #{inspect(location.solar_system_id)} on map #{map_id}: #{inspect(error, pretty: true)}"
)
:telemetry.execute(
[:wanderer_app, :map, :system_addition, :error],
%{system_time: System.system_time()},
%{
map_id: map_id,
solar_system_id: location.solar_system_id,
error: error,
reason: :db_upsert_failed_unexpected
}
)
{:error, error}
end
end
@@ -642,13 +745,12 @@ defmodule WandererApp.Map.Server.SystemsImpl do
position_y: system.position_y
})
{:ok, _} =
WandererApp.User.ActivityTracker.track_map_event(:system_added, %{
character_id: character_id,
user_id: user_id,
map_id: map_id,
solar_system_id: solar_system_id
})
WandererApp.User.ActivityTracker.track_map_event(:system_added, %{
character_id: character_id,
user_id: user_id,
map_id: map_id,
solar_system_id: solar_system_id
})
end
defp maybe_update_extra_info(system, nil), do: system
@@ -805,6 +907,10 @@ defmodule WandererApp.Map.Server.SystemsImpl do
update_map_system_last_activity(map_id, updated_system)
else
{:error, error} ->
Logger.error("Failed to update system: #{inspect(error, pretty: true)}")
:ok
error ->
Logger.error("Failed to update system: #{inspect(error, pretty: true)}")
:ok

View File

@@ -0,0 +1,429 @@
defmodule WandererApp.Map.SlugRecovery do
@moduledoc """
Handles automatic recovery from duplicate map slug scenarios.
This module provides functions to:
- Detect duplicate slugs in the database (including deleted maps)
- Automatically fix duplicates by renaming newer maps
- Verify and recreate unique indexes (enforced on all maps, including deleted)
- Safely handle race conditions during recovery
## Slug Uniqueness Policy
All map slugs must be unique across the entire maps_v1 table, including
deleted maps. This prevents confusion and ensures that a slug can always
unambiguously identify a specific map in the system's history.
The recovery process is designed to be:
- Idempotent (safe to run multiple times)
- Production-safe (minimal locking, fast execution)
- Observable (telemetry events for monitoring)
"""
require Logger
alias WandererApp.Repo
@doc """
Recovers from a duplicate slug scenario for a specific slug.
This function:
1. Finds all maps with the given slug (including deleted)
2. Keeps the oldest map with the original slug
3. Renames newer duplicates with numeric suffixes
4. Verifies the unique index exists
Returns:
- `{:ok, result}` - Recovery successful
- `{:error, reason}` - Recovery failed
## Examples
iex> recover_duplicate_slug("home-2")
{:ok, %{fixed_count: 1, kept_map_id: "...", renamed_maps: [...]}}
"""
def recover_duplicate_slug(slug) do
start_time = System.monotonic_time(:millisecond)
Logger.warning("Starting slug recovery for '#{slug}'",
slug: slug,
operation: :recover_duplicate_slug
)
:telemetry.execute(
[:wanderer_app, :map, :slug_recovery, :start],
%{system_time: System.system_time()},
%{slug: slug, operation: :recover_duplicate_slug}
)
result =
Repo.transaction(fn ->
# Find all maps with this slug (including deleted), ordered by insertion time
duplicates = find_duplicate_maps(slug)
case duplicates do
[] ->
Logger.info("No maps found with slug '#{slug}' during recovery")
%{fixed_count: 0, kept_map_id: nil, renamed_maps: []}
[_single_map] ->
Logger.info("Only one map found with slug '#{slug}', no recovery needed")
%{fixed_count: 0, kept_map_id: nil, renamed_maps: []}
[kept_map | maps_to_rename] ->
# Convert binary UUID to string for consistency
kept_map_id_str =
if is_binary(kept_map.id), do: Ecto.UUID.load!(kept_map.id), else: kept_map.id
Logger.warning(
"Found #{length(maps_to_rename)} duplicate maps for slug '#{slug}', fixing...",
slug: slug,
kept_map_id: kept_map_id_str,
duplicate_count: length(maps_to_rename)
)
# Rename the duplicate maps
renamed_maps =
maps_to_rename
|> Enum.with_index(2)
|> Enum.map(fn {map, index} ->
new_slug = generate_unique_slug(slug, index)
rename_map(map, new_slug)
end)
%{
fixed_count: length(renamed_maps),
kept_map_id: kept_map_id_str,
renamed_maps: renamed_maps
}
end
end)
case result do
{:ok, recovery_result} ->
duration = System.monotonic_time(:millisecond) - start_time
:telemetry.execute(
[:wanderer_app, :map, :slug_recovery, :complete],
%{
duration_ms: duration,
fixed_count: recovery_result.fixed_count,
system_time: System.system_time()
},
%{slug: slug, result: recovery_result}
)
Logger.info("Slug recovery completed successfully",
slug: slug,
fixed_count: recovery_result.fixed_count,
duration_ms: duration
)
{:ok, recovery_result}
{:error, reason} = error ->
duration = System.monotonic_time(:millisecond) - start_time
:telemetry.execute(
[:wanderer_app, :map, :slug_recovery, :error],
%{duration_ms: duration, system_time: System.system_time()},
%{slug: slug, error: inspect(reason)}
)
Logger.error("Slug recovery failed",
slug: slug,
error: inspect(reason),
duration_ms: duration
)
error
end
end
@doc """
Verifies that the unique index on map slugs exists.
If missing, attempts to create it (after fixing any duplicates).
Returns:
- `{:ok, :exists}` - Index already exists
- `{:ok, :created}` - Index was created
- `{:error, reason}` - Failed to create index
"""
def verify_unique_index do
Logger.debug("Verifying unique index on maps_v1.slug")
# Check if the index exists
index_query = """
SELECT 1
FROM pg_indexes
WHERE tablename = 'maps_v1'
AND indexname = 'maps_v1_unique_slug_index'
LIMIT 1
"""
case Repo.query(index_query, []) do
{:ok, %{rows: [[1]]}} ->
Logger.debug("Unique index exists")
{:ok, :exists}
{:ok, %{rows: []}} ->
Logger.warning("Unique index missing, attempting to create")
create_unique_index()
{:error, reason} ->
Logger.error("Failed to check for unique index", error: inspect(reason))
{:error, reason}
end
end
@doc """
Performs a full recovery scan of all maps, fixing any duplicates found.
Processes both deleted and non-deleted maps.
This function will:
1. Drop the unique index if it exists (to allow fixing duplicates)
2. Find and fix all duplicate slugs
3. Return statistics about the recovery
Note: This function does NOT recreate the index. Call `verify_unique_index/0`
after this function completes to ensure the index is recreated.
This is a more expensive operation and should be run:
- During maintenance windows
- After detecting multiple duplicate slug errors
- As part of deployment verification
Returns:
- `{:ok, stats}` - Recovery completed with statistics
- `{:error, reason}` - Recovery failed
"""
def recover_all_duplicates do
Logger.info("Starting full duplicate slug recovery (including deleted maps)")
start_time = System.monotonic_time(:millisecond)
:telemetry.execute(
[:wanderer_app, :map, :full_recovery, :start],
%{system_time: System.system_time()},
%{}
)
# Drop the unique index if it exists to allow fixing duplicates
drop_unique_index_if_exists()
# Find all slugs that have duplicates (including deleted maps)
duplicate_slugs_query = """
SELECT slug, COUNT(*) as count
FROM maps_v1
GROUP BY slug
HAVING COUNT(*) > 1
"""
case Repo.query(duplicate_slugs_query, []) do
{:ok, %{rows: []}} ->
Logger.info("No duplicate slugs found")
{:ok, %{total_slugs_fixed: 0, total_maps_renamed: 0}}
{:ok, %{rows: duplicate_rows}} ->
Logger.warning("Found #{length(duplicate_rows)} slugs with duplicates",
duplicate_count: length(duplicate_rows)
)
# Fix each duplicate slug
results =
Enum.map(duplicate_rows, fn [slug, _count] ->
case recover_duplicate_slug(slug) do
{:ok, result} -> result
{:error, _} -> %{fixed_count: 0, kept_map_id: nil, renamed_maps: []}
end
end)
stats = %{
total_slugs_fixed: length(results),
total_maps_renamed: Enum.sum(Enum.map(results, & &1.fixed_count))
}
duration = System.monotonic_time(:millisecond) - start_time
:telemetry.execute(
[:wanderer_app, :map, :full_recovery, :complete],
%{
duration_ms: duration,
slugs_fixed: stats.total_slugs_fixed,
maps_renamed: stats.total_maps_renamed,
system_time: System.system_time()
},
%{stats: stats}
)
Logger.info("Full recovery completed",
stats: stats,
duration_ms: duration
)
{:ok, stats}
{:error, reason} = error ->
Logger.error("Failed to query for duplicates", error: inspect(reason))
error
end
end
# Private functions
defp find_duplicate_maps(slug) do
# Find all maps (including deleted) with this slug
query = """
SELECT id, name, slug, deleted, inserted_at
FROM maps_v1
WHERE slug = $1
ORDER BY inserted_at ASC
"""
case Repo.query(query, [slug]) do
{:ok, %{rows: rows}} ->
Enum.map(rows, fn [id, name, slug, deleted, inserted_at] ->
%{id: id, name: name, slug: slug, deleted: deleted, inserted_at: inserted_at}
end)
{:error, reason} ->
Logger.error("Failed to query for duplicate maps",
slug: slug,
error: inspect(reason)
)
[]
end
end
defp rename_map(map, new_slug) do
# Convert binary UUID to string for logging
map_id_str = if is_binary(map.id), do: Ecto.UUID.load!(map.id), else: map.id
Logger.info("Renaming map #{map_id_str} from '#{map.slug}' to '#{new_slug}'",
map_id: map_id_str,
old_slug: map.slug,
new_slug: new_slug,
deleted: map.deleted
)
update_query = """
UPDATE maps_v1
SET slug = $1, updated_at = NOW()
WHERE id = $2
"""
case Repo.query(update_query, [new_slug, map.id]) do
{:ok, _} ->
Logger.info("Successfully renamed map #{map_id_str} to '#{new_slug}'")
%{
map_id: map_id_str,
old_slug: map.slug,
new_slug: new_slug,
map_name: map.name,
deleted: map.deleted
}
{:error, reason} ->
map_id_str = if is_binary(map.id), do: Ecto.UUID.load!(map.id), else: map.id
Logger.error("Failed to rename map #{map_id_str}",
map_id: map_id_str,
old_slug: map.slug,
new_slug: new_slug,
error: inspect(reason)
)
%{
map_id: map_id_str,
old_slug: map.slug,
new_slug: nil,
error: reason
}
end
end
defp generate_unique_slug(base_slug, index) do
candidate = "#{base_slug}-#{index}"
# Verify this slug is actually unique (check all maps, including deleted)
query = "SELECT 1 FROM maps_v1 WHERE slug = $1 LIMIT 1"
case Repo.query(query, [candidate]) do
{:ok, %{rows: []}} ->
candidate
{:ok, %{rows: [[1]]}} ->
# This slug is taken, try the next one
generate_unique_slug(base_slug, index + 1)
{:error, _} ->
# On error, be conservative and try the next number
generate_unique_slug(base_slug, index + 1)
end
end
defp create_unique_index do
Logger.warning("Creating unique index on maps_v1.slug")
# Create index on all maps (including deleted ones)
# This enforces slug uniqueness across all maps regardless of deletion status
create_index_query = """
CREATE UNIQUE INDEX CONCURRENTLY IF NOT EXISTS maps_v1_unique_slug_index
ON maps_v1 (slug)
"""
case Repo.query(create_index_query, []) do
{:ok, _} ->
Logger.info("Successfully created unique index (includes deleted maps)")
:telemetry.execute(
[:wanderer_app, :map, :index_created],
%{system_time: System.system_time()},
%{index_name: "maps_v1_unique_slug_index"}
)
{:ok, :created}
{:error, reason} ->
Logger.error("Failed to create unique index", error: inspect(reason))
{:error, reason}
end
end
defp drop_unique_index_if_exists do
Logger.debug("Checking if unique index exists before recovery")
check_query = """
SELECT 1
FROM pg_indexes
WHERE tablename = 'maps_v1'
AND indexname = 'maps_v1_unique_slug_index'
LIMIT 1
"""
case Repo.query(check_query, []) do
{:ok, %{rows: [[1]]}} ->
Logger.info("Dropping unique index to allow duplicate recovery")
drop_query = "DROP INDEX IF EXISTS maps_v1_unique_slug_index"
case Repo.query(drop_query, []) do
{:ok, _} ->
Logger.info("Successfully dropped unique index")
:ok
{:error, reason} ->
Logger.warning("Failed to drop unique index", error: inspect(reason))
:ok
end
{:ok, %{rows: []}} ->
Logger.debug("Unique index does not exist, no need to drop")
:ok
{:error, reason} ->
Logger.warning("Failed to check for unique index", error: inspect(reason))
:ok
end
end
end
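# A manual-run sketch (the console workflow is illustrative; function names and
# return shapes come from the module above), e.g. from a remote iex session
# during a maintenance window:
#
#   {:ok, stats} = WandererApp.Map.SlugRecovery.recover_all_duplicates()
#   # stats => %{total_slugs_fixed: ..., total_maps_renamed: ...}
#   {:ok, _} = WandererApp.Map.SlugRecovery.verify_unique_index()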

View File

@@ -23,10 +23,12 @@ defmodule WandererApp.Release do
IO.puts("Run migrations..")
prepare()
for repo <- repos() do
for repo <- repos do
{:ok, _, _} = Ecto.Migrator.with_repo(repo, &Ecto.Migrator.run(&1, :up, all: true))
end
run_post_migration_tasks()
:init.stop()
end
@@ -76,6 +78,8 @@ defmodule WandererApp.Release do
Enum.each(streaks, fn {repo, up_to_version} ->
{:ok, _, _} = Ecto.Migrator.with_repo(repo, &Ecto.Migrator.run(&1, :up, to: up_to_version))
end)
run_post_migration_tasks()
end
defp migration_streaks(pending_migrations) do
@@ -215,4 +219,40 @@ defmodule WandererApp.Release do
IO.puts("Starting repos..")
Enum.each(repos(), & &1.start_link(pool_size: 2))
end
defp run_post_migration_tasks do
IO.puts("Running post-migration tasks..")
# Recover any duplicate map slugs
IO.puts("Checking for duplicate map slugs..")
case WandererApp.Map.SlugRecovery.recover_all_duplicates() do
{:ok, %{total_slugs_fixed: 0}} ->
IO.puts("No duplicate slugs found.")
{:ok, %{total_slugs_fixed: count, total_maps_renamed: renamed}} ->
IO.puts("Successfully fixed #{count} duplicate slug(s), renamed #{renamed} map(s).")
{:error, reason} ->
IO.puts("Warning: Failed to recover duplicate slugs: #{inspect(reason)}")
IO.puts("Application will continue, but you may need to manually fix duplicate slugs.")
end
# Ensure the unique index exists after recovery
IO.puts("Verifying unique index on map slugs..")
case WandererApp.Map.SlugRecovery.verify_unique_index() do
{:ok, :exists} ->
IO.puts("Unique index already exists.")
{:ok, :created} ->
IO.puts("Successfully created unique index.")
{:error, reason} ->
IO.puts("Warning: Failed to verify/create unique index: #{inspect(reason)}")
IO.puts("You may need to manually create the index.")
end
IO.puts("Post-migration tasks completed.")
end
end

View File

@@ -3,11 +3,25 @@ defmodule WandererApp.MapPingsRepo do
require Logger
def get_by_id(ping_id),
do: WandererApp.Api.MapPing.by_id!(ping_id) |> Ash.load([:system])
def get_by_id(ping_id) do
case WandererApp.Api.MapPing.by_id(ping_id) do
{:ok, ping} ->
ping |> Ash.load([:system])
def get_by_map(map_id),
do: WandererApp.Api.MapPing.by_map!(%{map_id: map_id}) |> Ash.load([:character, :system])
error ->
error
end
end
def get_by_map(map_id) do
case WandererApp.Api.MapPing.by_map(%{map_id: map_id}) do
{:ok, ping} ->
ping |> Ash.load([:character, :system])
error ->
error
end
end
def get_by_map_and_system!(map_id, system_id),
do: WandererApp.Api.MapPing.by_map_and_system!(%{map_id: map_id, system_id: system_id})
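Since these accessors now return result tuples instead of raising, callers can branch on the outcome; a small sketch under that assumption (variable names are placeholders):
# Illustrative caller of the non-raising variant
case WandererApp.MapPingsRepo.get_by_map(map_id) do
  {:ok, pings} ->
    pings
  # errors now surface as tuples rather than exceptions
  {:error, _reason} ->
    []
end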

View File

@@ -1,6 +1,8 @@
defmodule WandererApp.MapRepo do
use WandererApp, :repository
require Logger
@default_map_options %{
"layout" => "left_to_right",
"store_custom_labels" => "false",
@@ -30,6 +32,116 @@ defmodule WandererApp.MapRepo do
|> WandererApp.Api.Map.get_map_by_slug()
|> load_user_permissions(current_user)
@doc """
Safely retrieves a map by slug, handling the case where multiple maps
with the same slug exist (database integrity issue).
When duplicates are detected, automatically triggers recovery to fix them
and retries the query once.
Returns:
- `{:ok, map}` - Single map found
- `{:error, :multiple_results}` - Multiple maps found (after recovery attempt)
- `{:error, :not_found}` - No map found
- `{:error, reason}` - Other error
"""
def get_map_by_slug_safely(slug, retry_count \\ 0) do
try do
map = WandererApp.Api.Map.get_map_by_slug!(slug)
{:ok, map}
rescue
error in Ash.Error.Invalid.MultipleResults ->
handle_multiple_results(slug, error, retry_count)
error in Ash.Error.Invalid ->
# Check if this Invalid error contains a MultipleResults error
case find_multiple_results_error(error) do
{:ok, multiple_results_error} ->
handle_multiple_results(slug, multiple_results_error, retry_count)
:error ->
# Some other Invalid error
Logger.error("Error retrieving map by slug",
slug: slug,
error: inspect(error)
)
{:error, :unknown_error}
end
_error in Ash.Error.Query.NotFound ->
Logger.debug("Map not found with slug: #{slug}")
{:error, :not_found}
error ->
Logger.error("Error retrieving map by slug",
slug: slug,
error: inspect(error)
)
{:error, :unknown_error}
end
end
# Helper function to handle multiple results errors with automatic recovery
defp handle_multiple_results(slug, error, retry_count) do
count = Map.get(error, :count, 2)
Logger.error("Multiple maps found with slug '#{slug}' - triggering automatic recovery",
slug: slug,
count: count,
retry_count: retry_count,
error: inspect(error)
)
# Emit telemetry for monitoring
:telemetry.execute(
[:wanderer_app, :map, :duplicate_slug_detected],
%{count: count, retry_count: retry_count},
%{slug: slug, operation: :get_by_slug}
)
# Attempt automatic recovery if this is the first try
if retry_count == 0 do
case WandererApp.Map.SlugRecovery.recover_duplicate_slug(slug) do
{:ok, recovery_result} ->
Logger.info("Successfully recovered duplicate slug '#{slug}', retrying query",
slug: slug,
fixed_count: recovery_result.fixed_count
)
# Retry the query once after recovery
get_map_by_slug_safely(slug, retry_count + 1)
{:error, reason} ->
Logger.error("Failed to recover duplicate slug '#{slug}'",
slug: slug,
error: inspect(reason)
)
{:error, :multiple_results}
end
else
# Already retried once, give up
Logger.error(
"Multiple maps still found with slug '#{slug}' after recovery attempt",
slug: slug,
count: count
)
{:error, :multiple_results}
end
end
# Helper function to check if an Ash.Error.Invalid contains a MultipleResults error
defp find_multiple_results_error(%Ash.Error.Invalid{errors: errors}) do
errors
|> Enum.find_value(:error, fn
%Ash.Error.Invalid.MultipleResults{} = mr_error -> {:ok, mr_error}
_ -> false
end)
end
def load_relationships(map, []), do: {:ok, map}
def load_relationships(map, relationships), do: map |> Ash.load(relationships)
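A usage sketch for get_map_by_slug_safely/1 that mirrors the return values documented above; the LiveView handlers later in this changeset branch the same way:
case WandererApp.MapRepo.get_map_by_slug_safely(slug) do
  {:ok, map} ->
    {:ok, map}
  {:error, :not_found} ->
    {:error, "Map not found"}
  {:error, :multiple_results} ->
    # duplicates remained even after the automatic recovery attempt
    {:error, "Multiple maps share this slug"}
  {:error, _other} ->
    {:error, "Failed to load map"}
end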

View File

@@ -5,6 +5,10 @@ defmodule WandererApp.MapSystemRepo do
system |> WandererApp.Api.MapSystem.create()
end
def upsert(system) do
system |> WandererApp.Api.MapSystem.upsert()
end
def get_by_map_and_solar_system_id(map_id, solar_system_id) do
WandererApp.Api.MapSystem.by_map_id_and_solar_system_id(map_id, solar_system_id)
|> case do

View File

@@ -487,7 +487,7 @@ defmodule WandererApp.SecurityAudit do
# Private functions
defp store_audit_entry(audit_entry) do
defp store_audit_entry(_audit_entry) do
# Handle async processing if enabled
# if async_enabled?() do
# WandererApp.SecurityAudit.AsyncProcessor.log_event(audit_entry)

View File

@@ -4,7 +4,9 @@ defmodule WandererApp.Test.DDRT do
This allows mocking of DDRT calls in tests.
"""
@callback insert({integer(), any()} | list({integer(), any()}), String.t()) :: {:ok, map()} | {:error, term()}
@callback init_tree(String.t(), map()) :: :ok | {:error, term()}
@callback insert({integer(), any()} | list({integer(), any()}), String.t()) ::
{:ok, map()} | {:error, term()}
@callback update(integer(), any(), String.t()) :: {:ok, map()} | {:error, term()}
@callback delete(integer() | [integer()], String.t()) :: {:ok, map()} | {:error, term()}
@callback query(any(), String.t()) :: {:ok, [any()]} | {:error, term()}
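If the test suite mocks this behaviour with Mox (an assumption; the diff only shows the callbacks), a setup along these lines would be enough:
# Hypothetical Mox wiring for the behaviour above, e.g. in test_helper.exs
# (the :ddrt config key is illustrative, not taken from this changeset)
Mox.defmock(WandererApp.Test.DDRTMock, for: WandererApp.Test.DDRT)
Application.put_env(:wanderer_app, :ddrt, WandererApp.Test.DDRTMock)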

View File

@@ -49,7 +49,7 @@ defmodule WandererApp.Ueberauth.Strategy.Eve do
WandererApp.Cache.put(
"eve_auth_#{params[:state]}",
[with_wallet: with_wallet, is_admin?: is_admin?],
ttl: :timer.minutes(15)
ttl: :timer.minutes(30)
)
opts = oauth_client_options_from_conn(conn, with_wallet, is_admin?)
@@ -66,17 +66,22 @@ defmodule WandererApp.Ueberauth.Strategy.Eve do
Handles the callback from Eve.
"""
def handle_callback!(%Plug.Conn{params: %{"code" => code, "state" => state}} = conn) do
opts =
WandererApp.Cache.get("eve_auth_#{state}")
case WandererApp.Cache.get("eve_auth_#{state}") do
nil ->
# Cache expired or invalid state - redirect to welcome page
conn
|> redirect!("/welcome")
params = [code: code]
opts ->
params = [code: code]
case WandererApp.Ueberauth.Strategy.Eve.OAuth.get_access_token(params, opts) do
{:ok, token} ->
fetch_user(conn, token)
case WandererApp.Ueberauth.Strategy.Eve.OAuth.get_access_token(params, opts) do
{:ok, token} ->
fetch_user(conn, token)
{:error, {error_code, error_description}} ->
set_errors!(conn, [error(error_code, error_description)])
{:error, {error_code, error_description}} ->
set_errors!(conn, [error(error_code, error_description)])
end
end
end

View File

@@ -1,16 +1,57 @@
defmodule WandererApp.User.ActivityTracker do
@moduledoc false
@moduledoc """
Activity tracking wrapper that ensures audit logging never crashes application logic.
Activity tracking is best-effort and errors are logged but not propagated to callers.
This prevents race conditions (e.g., duplicate activity records) from affecting
critical business operations like character tracking or connection management.
"""
require Logger
def track_map_event(
event_type,
metadata
),
do: WandererApp.Map.Audit.track_map_event(event_type, metadata)
@doc """
Track a map-related event. Always returns `{:ok, result}` even on error.
def track_acl_event(
event_type,
metadata
),
do: WandererApp.Map.Audit.track_acl_event(event_type, metadata)
Errors (such as unique constraint violations from concurrent operations)
are logged but do not propagate to prevent crashing critical application logic.
"""
def track_map_event(event_type, metadata) do
case WandererApp.Map.Audit.track_map_event(event_type, metadata) do
{:ok, result} ->
{:ok, result}
{:error, error} ->
Logger.warning("Failed to track map event (non-critical)",
event_type: event_type,
map_id: metadata[:map_id],
error: inspect(error),
reason: :best_effort_tracking
)
# Return success to prevent crashes - activity tracking is best-effort
{:ok, nil}
end
end
@doc """
Track an ACL-related event. Always returns `{:ok, result}` even on error.
Errors are logged but do not propagate to prevent crashing critical application logic.
"""
def track_acl_event(event_type, metadata) do
case WandererApp.Map.Audit.track_acl_event(event_type, metadata) do
{:ok, result} ->
{:ok, result}
{:error, error} ->
Logger.warning("Failed to track ACL event (non-critical)",
event_type: event_type,
acl_id: metadata[:acl_id],
error: inspect(error),
reason: :best_effort_tracking
)
# Return success to prevent crashes - activity tracking is best-effort
{:ok, nil}
end
end
end
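Because both wrappers now always return `{:ok, _}`, the call sites later in this changeset drop their `{:ok, _} =` matches; a sketch of the resulting fire-and-forget style (metadata keys copied from those call sites):
WandererApp.User.ActivityTracker.track_map_event(:hub_added, %{
  character_id: main_character_id,
  user_id: current_user.id,
  map_id: map_id,
  solar_system_id: solar_system_id
})
# No pattern match needed: failures are logged inside the wrapper and never raise.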

View File

@@ -5,7 +5,7 @@
<navbar class="navbar bg-base-100 !sticky top-0 z-50 bg-opacity-0 ">
<div class="navbar-start">
<div class="dropdown">
<div tabindex="0" role="button" class="btn btn-ghost btn-circle">
<div tabindex="0" role="button" class="btn btn-ghost btn-circle text-white">
<svg
xmlns="http://www.w3.org/2000/svg"
class="h-5 w-5"
@@ -34,7 +34,12 @@
</div>
</div>
<div class="navbar-center">
<a href="/" class="btn btn-ghost text-xl">Wanderer</a>
<a
href="/"
class="!opacity-0 text-[24px] text-white [text-shadow:0_0px_8px_rgba(0,0,0,0.8)]"
>
Wanderer
</a>
</div>
<div class="navbar-end"></div>
</navbar>
@@ -44,10 +49,13 @@
<!--Footer-->
<footer class="!z-10 w-full pt-8 pb-4 text-sm text-center fade-in flex justify-center items-center">
<div class="flex flex-col justify-center items-center">
<a target="_blank" rel="noopener noreferrer" href="https://www.eveonline.com/partners"><img src="/images/eo_pp.png" style="width: 300px;" alt="Eve Online Partnership Program"></a>
<div class="text-gray-500 no-underline hover:no-underline">
All <a href="/license">EVE related materials</a> are property of <a href="https://www.ccpgames.com">CCP Games</a>
&copy; {Date.utc_today().year} Wanderer Industries.
<a target="_blank" rel="noopener noreferrer" href="https://www.eveonline.com/partners">
<img src="/images/eo_pp.png" style="width: 300px;" alt="Eve Online Partnership Program" />
</a>
<div class="text-stone-400 no-underline hover:no-underline [text-shadow:0_0px_4px_rgba(0,0,0,0.8)]">
All <a href="/license">EVE related materials</a>
are property of <a href="https://www.ccpgames.com">CCP Games</a>
&copy; {Date.utc_today().year} Wanderer Industries.
</div>
</div>
</footer>

View File

@@ -34,5 +34,4 @@
<.new_version_banner app_version={@app_version} enabled={@map_subscriptions_enabled?} />
</div>
<.live_component module={WandererAppWeb.Alerts} id="notifications" view_flash={@flash} />

View File

@@ -1,5 +1,5 @@
<article class="prose prose-lg ccp-font w-full max-w-3xl mx-auto">
<div class="w-full px-4 md:px-6 text-xl leading-normal ccp-font">
<div class="w-full px-4 md:px-6 text-xl leading-normal ccp-font [&_*]:text-stone-200 [&_*]:[text-shadow:0_0px_8px_rgba(0,0,0,0.4)] bg-neutral-900/60 py-8">
{raw(@file.body)}
</div>
</article>

View File

@@ -4,7 +4,7 @@
<div class="flex min-h-[calc(100vh-100px)] items-center justify-center px-2 py-10 text-center xl:pe-0 xl:ps-10">
<div>
<h1 class="text-center text-[clamp(2rem,6vw,4rem)] font-black leading-[1.1] [word-break:auto-phrase] xl:w-[115%] xl:text-start [:root[dir=rtl]_&]:leading-[1.35]">
<span class="[&::selection]:text-base-content brightness-150 contrast-150 [&::selection]:bg-blue-700/20">
<span class="[&::selection]:text-base-content brightness-150 contrast-150 [&::selection]:bg-blue-700/20 [text-shadow:0_0px_8px_rgba(0,0,0,0.7)]">
Join or support us!
<!---->
</span>

View File

@@ -6,7 +6,7 @@
<div class="container pt-5 mx-auto flex flex-wrap flex-row justify-center items-center gap-8">
<!--Left Col-->
<div class="flex flex-col justify-center items-center overflow-y-hidden">
<h1 class="ccp-font my-4 text-2xl text-white font-bold leading-tight text-center md:text-left ">
<h1 class="ccp-font my-4 pr-4 text-2xl text-white font-bold leading-tight text-center md:text-left [text-shadow:0_0px_8px_rgba(0,0,0,0.8)]">
THE #1 EVE MAPPER TOOL
</h1>
</div>
@@ -28,21 +28,27 @@
</div>
<div
id="posts-container"
class="bg-neutral rounded-box max-w-[90%] p-4 max-h-[60vh] overflow-y-auto"
class="bg-neutral rounded-box max-w-[90%] p-4 max-h-[60vh] overflow-y-auto relative z-1"
>
<div class="grid grid-cols-1 md:grid-cols-2 lg:grid-cols-3 xl:grid-cols-4 gap-4">
<div class="grid grid-cols-1 md:grid-cols-2 lg:grid-cols-3 xl:grid-cols-3 gap-4">
<%= for post <- @posts do %>
<.link class="group carousel-item relative" navigate={~p"/news/#{post.id}"}>
<div class="artboard-horizontal phone-1 relative hover:text-white">
<.link class="group carousel-item relative my-2 mx-2" navigate={~p"/news/#{post.id}"}>
<div class="artboard-horizontal phone-1 relative hover:text-white">
<img
class="rounded-lg shadow-lg block !w-[300px] !h-[180px] opacity-75"
class="rounded-md shadow-lg block !w-[300px] !h-[180px] opacity-75 !m-0"
src={post.cover_image_uri}
/>
<div class="absolute rounded-lg top-0 left-0 w-full h-full bg-gradient-to-b from-transparent to-black opacity-75 group-hover:opacity-25 transition-opacity duration-300">
<div class="absolute rounded-md top-0 left-0 w-full h-full bg-gradient-to-b from-transparent to-black opacity-75 group-hover:opacity-25 transition-opacity duration-300">
</div>
<div class="absolute w-full bottom-2 p-4">
<% [first_part, second_part] = String.split(post.title, ":", parts: 2) %>
<h3 class="!m-0 !text-s font-bold break-normal ccp-font whitespace-nowrap text-white">
{first_part}
</h3>
<p class="!m-0 !text-s text-white text-ellipsis overflow-hidden whitespace-nowrap ccp-font">
{second_part || ""}
</p>
</div>
<h3 class="absolute bottom-4 left-14 !text-md font-bold break-normal pt-6 pb-2 ccp-font text-white">
{post.title}
</h3>
</div>
</.link>
<% end %>
@@ -50,49 +56,6 @@
</div>
<script>
document.addEventListener('DOMContentLoaded', function() {
const postsContainer = document.getElementById('posts-container');
if (!postsContainer) return;
let scrollSpeed = 0.5; // pixels per frame
let isScrolling = true;
let scrollDirection = 1; // 1 for down, -1 for up
function autoScroll() {
if (!isScrolling) return;
const maxScroll = postsContainer.scrollHeight - postsContainer.clientHeight;
if (maxScroll <= 0) return; // No need to scroll if content fits
postsContainer.scrollTop += scrollSpeed * scrollDirection;
// Reverse direction when reaching top or bottom
if (postsContainer.scrollTop >= maxScroll) {
scrollDirection = -1;
} else if (postsContainer.scrollTop <= 0) {
scrollDirection = 1;
}
requestAnimationFrame(autoScroll);
}
// Pause scrolling on hover
postsContainer.addEventListener('mouseenter', () => {
isScrolling = false;
});
// Resume scrolling when mouse leaves
postsContainer.addEventListener('mouseleave', () => {
isScrolling = true;
requestAnimationFrame(autoScroll);
});
// Start autoscroll after a delay
setTimeout(() => {
requestAnimationFrame(autoScroll);
}, 2000);
});
</script>
<%!-- <div class="carousel carousel-center !bg-neutral rounded-box max-w-4xl space-x-6 p-4">

View File

@@ -1,21 +1,23 @@
<article class="prose prose-lg ccp-font w-full max-w-3xl mx-auto">
<div class="w-full px-4 md:px-6 text-xl leading-normal ccp-font">
<h1 class="font-bold break-normal pt-10 ccp-font text-white">
<h1 class="font-bold break-normal pt-10 ccp-font text-white ml-8">
License
</h1>
<h3 class="txt-color txt-color-grayLight">
<strong class="flex items-center gap-0">
<.icon name="hero-at-symbol" class="h-8 w-8" /> CCP Copyright Notice
</strong>
</h3>
<p>
EVE Online and the EVE logo are the registered trademarks of CCP hf. All rights are reserved worldwide.
All other trademarks are the property of their respective owners.
EVE Online, the EVE logo, EVE and all associated logos and designs are the intellectual property of CCP hf.
All artwork, screenshots, characters, vehicles, storylines, world facts or other recognizable features of the
intellectual property relating to these trademarks are likewise the intellectual property of CCP hf.
CCP is in no way responsible for the content on or functioning of this website, nor can it be liable for
any damage arising from the use of this website.
</p>
<div class="bg-neutral-900/60 text-stone-200 [text-shadow:0_0px_8px_rgba(0,0,0,0.7)] px-8 py-1">
<h3 class="txt-color txt-color-grayLight">
<strong class="flex items-center gap-0">
<.icon name="hero-at-symbol" class="h-8 w-8" /> CCP Copyright Notice
</strong>
</h3>
<p>
EVE Online and the EVE logo are the registered trademarks of CCP hf. All rights are reserved worldwide.
All other trademarks are the property of their respective owners.
EVE Online, the EVE logo, EVE and all associated logos and designs are the intellectual property of CCP hf.
All artwork, screenshots, characters, vehicles, storylines, world facts or other recognizable features of the
intellectual property relating to these trademarks are likewise the intellectual property of CCP hf.
CCP is in no way responsible for the content on or functioning of this website, nor can it be liable for
any damage arising from the use of this website.
</p>
</div>
</div>
</article>

View File

@@ -33,7 +33,7 @@
<%= for post <- @posts do %>
<.link
navigate={~p"/news/#{post.id}"}
class="card sm:card-side hover:bg-base-200 transition-colors sm:max-w-none hover:text-white"
class="card sm:card-side bg-neutral-900/60 hover:bg-neutral-900/80 transition-colors sm:max-w-none hover:text-white text-stone-200 !rounded-[0]"
>
<figure class="mx-auto w-full object-cover p-6 max-sm:pb-0 sm:max-w-[12rem] sm:pe-0">
<img
@@ -45,16 +45,16 @@
</figure>
<div class="card-body hover:text-white">
<h2 class="card-title">{post.title}</h2>
<p class="text-xs opacity-60">
<p class="text-xs text-stone-200">
{post.description}
</p>
<div class="card-actions justify-end">
<ul class="flex flex-wrap items-center p-0 m-0">
<div class="card-actions">
<ul class="flex flex-wrap items-center p-0 m-0 gap-2">
<li
:for={tag <- post.tags}
class="inline-flex rounded-[35px] bg-primary px-1 text-white"
class="inline-flex rounded-[35px] bg-primary text-white"
>
<div class="badge badge-outline text-primary rounded-none border-none text-sm">
<div class="badge-outline text-primary rounded-none border-none text-sm text-lime-400">
#{tag}
</div>
</li>

View File

@@ -2,116 +2,121 @@
<!--Container-->
<div class="w-full px-4 md:px-6 text-xl leading-normal ccp-font">
<!--Title-->
<h1 class="font-bold break-normal pt-10 ccp-font text-white">
<h1 class="font-bold break-normal pt-10 ccp-font text-white [text-shadow:0_0px_8px_rgba(0,0,0,0.7)]">
{@post.title}
</h1>
<div class="text-md md:text-base font-normal mt-0 ccp-font flex items-center gap-4">
<div class="flex justify-start content-center gap-2">
{@post.date} - BY <span class="uppercase">{@post.author}</span>
</div>
<div class="bg-neutral-900/60 text-stone-200 [text-shadow:0_0px_8px_rgba(0,0,0,0.4)] px-8 py-1">
<div class="text-md md:text-base font-normal mt-0 ccp-font flex items-center gap-4">
<div class="flex justify-start content-center gap-2">
{@post.date} - BY <span class="uppercase">{@post.author}</span>
</div>
<div class="min-h-[10px] w-px self-stretch border-t-0 bg-gradient-to-tr from-transparent to-transparent opacity-25 via-neutral-200 block">
</div>
<div class="flex justify-start content-center">
<a
class="no-underline hover:text-pink-500 hover:text-underline h-8 md:h-auto p-2 text-center h-auto transform hover:scale-125 duration-300 ease-in-out"
href={"https://twitter.com/intent/tweet?url=#{current_url(@conn)}"}
target="_blank"
>
<svg class="fill-current h-6" xmlns="http://www.w3.org/2000/svg" viewBox="0 0 32 32">
<path d="M30.063 7.313c-.813 1.125-1.75 2.125-2.875 2.938v.75c0 1.563-.188 3.125-.688 4.625a15.088 15.088 0 0 1-2.063 4.438c-.875 1.438-2 2.688-3.25 3.813a15.015 15.015 0 0 1-4.625 2.563c-1.813.688-3.75 1-5.75 1-3.25 0-6.188-.875-8.875-2.625.438.063.875.125 1.375.125 2.688 0 5.063-.875 7.188-2.5-1.25 0-2.375-.375-3.375-1.125s-1.688-1.688-2.063-2.875c.438.063.813.125 1.125.125.5 0 1-.063 1.5-.25-1.313-.25-2.438-.938-3.313-1.938a5.673 5.673 0 0 1-1.313-3.688v-.063c.813.438 1.688.688 2.625.688a5.228 5.228 0 0 1-1.875-2c-.5-.875-.688-1.813-.688-2.75 0-1.063.25-2.063.75-2.938 1.438 1.75 3.188 3.188 5.25 4.25s4.313 1.688 6.688 1.813a5.579 5.579 0 0 1 1.5-5.438c1.125-1.125 2.5-1.688 4.125-1.688s3.063.625 4.188 1.813a11.48 11.48 0 0 0 3.688-1.375c-.438 1.375-1.313 2.438-2.563 3.188 1.125-.125 2.188-.438 3.313-.875z">
</path>
</svg>
</a>
<a
class="inline-block no-underline hover:text-pink-500 hover:text-underline text-center h-auto p-2 transform hover:scale-125 duration-300 ease-in-out"
href={"https://www.facebook.com/sharer/sharer.php?u=#{current_url(@conn)}"}
target="_blank"
>
<svg class="fill-current h-6" xmlns="http://www.w3.org/2000/svg" viewBox="0 0 32 32">
<path d="M19 6h5V0h-5c-3.86 0-7 3.14-7 7v3H8v6h4v16h6V16h5l1-6h-6V7c0-.542.458-1 1-1z">
</path>
</svg>
</a>
<a
class="inline-block no-underline hover:text-pink-500 hover:text-underline text-center h-auto p-2 transform hover:scale-125 duration-300 ease-in-out"
href={"https://www.reddit.com/submit?url=#{current_url(@conn)}"}
target="_blank"
>
<svg
class="fill-current h-6"
aria-hidden="true"
xmlns="http://www.w3.org/2000/svg"
width="24"
height="24"
fill="none"
viewBox="0 0 24 24"
<div class="min-h-[10px] w-px self-stretch border-t-0 bg-gradient-to-tr from-transparent to-transparent opacity-25 via-neutral-200 block">
</div>
<div class="flex justify-start content-center">
<a
class="no-underline hover:text-pink-500 hover:text-underline h-8 md:h-auto p-2 text-center h-auto transform hover:scale-125 duration-300 ease-in-out"
href={"https://twitter.com/intent/tweet?url=#{current_url(@conn)}"}
target="_blank"
>
<path
fill="currentColor"
d="M12.008 16.521a3.84 3.84 0 0 0 2.47-.77v.04a.281.281 0 0 0 .005-.396.281.281 0 0 0-.395-.005 3.291 3.291 0 0 1-2.09.61 3.266 3.266 0 0 1-2.081-.63.27.27 0 0 0-.38.381 3.84 3.84 0 0 0 2.47.77Z"
/>
<path
fill="currentColor"
fill-rule="evenodd"
d="M22 12c0 5.523-4.477 10-10 10S2 17.523 2 12 6.477 2 12 2s10 4.477 10 10Zm-4.845-1.407A1.463 1.463 0 0 1 18.67 12a1.46 1.46 0 0 1-.808 1.33c.01.146.01.293 0 .44 0 2.242-2.61 4.061-5.829 4.061s-5.83-1.821-5.83-4.061a3.25 3.25 0 0 1 0-.44 1.458 1.458 0 0 1-.457-2.327 1.458 1.458 0 0 1 2.063-.064 7.163 7.163 0 0 1 3.9-1.23l.738-3.47v-.006a.31.31 0 0 1 .37-.236l2.452.49a1 1 0 1 1-.132.611l-2.14-.45-.649 3.12a7.11 7.11 0 0 1 3.85 1.23c.259-.246.6-.393.957-.405Z"
clip-rule="evenodd"
/>
<path
fill="currentColor"
d="M15.305 13a1 1 0 1 1-2 0 1 1 0 0 1 2 0Zm-4.625 0a1 1 0 1 1-2 0 1 1 0 0 1 2 0Z"
/>
</svg>
</a>
</div>
<div class="min-h-[10px] w-px self-stretch border-t-0 bg-gradient-to-tr from-transparent to-transparent opacity-25 via-neutral-200 block">
</div>
<div class="flex justify-start content-center">
<button
id="link-share-button"
class="copy-link flex no-underline hover:text-pink-500 hover:text-underline md:h-auto p-2 text-center h-auto relative transform hover:scale-125 duration-300 ease-in-out"
type="button"
data-url={current_url(@conn)}
>
<svg
xmlns="http://www.w3.org/2000/svg"
viewBox="0 0 16 16"
fill="currentColor"
class="fill-current h-6"
>
<path
fill-rule="evenodd"
d="M8.914 6.025a.75.75 0 0 1 1.06 0 3.5 3.5 0 0 1 0 4.95l-2 2a3.5 3.5 0 0 1-5.396-4.402.75.75 0 0 1 1.251.827 2 2 0 0 0 3.085 2.514l2-2a2 2 0 0 0 0-2.828.75.75 0 0 1 0-1.06Z"
clip-rule="evenodd"
/>
<path
fill-rule="evenodd"
d="M7.086 9.975a.75.75 0 0 1-1.06 0 3.5 3.5 0 0 1 0-4.95l2-2a3.5 3.5 0 0 1 5.396 4.402.75.75 0 0 1-1.251-.827 2 2 0 0 0-3.085-2.514l-2 2a2 2 0 0 0 0 2.828.75.75 0 0 1 0 1.06Z"
clip-rule="evenodd"
/>
</svg>
<div class="absolute w-[100px] left-8 link-copied hidden">Link copied</div>
</button>
</div>
</div>
<div class="w-full justify-end">
<ul class="flex flex-wrap items-center p-0 m-0">
<li :for={tag <- @post.tags} class="inline-flex rounded-[35px] bg-primary px-1 text-white">
<a href="#">
<div class="badge badge-outline text-primary rounded-none border-none text-xl">
#{tag}
</div>
<svg class="fill-current h-6" xmlns="http://www.w3.org/2000/svg" viewBox="0 0 32 32">
<path d="M30.063 7.313c-.813 1.125-1.75 2.125-2.875 2.938v.75c0 1.563-.188 3.125-.688 4.625a15.088 15.088 0 0 1-2.063 4.438c-.875 1.438-2 2.688-3.25 3.813a15.015 15.015 0 0 1-4.625 2.563c-1.813.688-3.75 1-5.75 1-3.25 0-6.188-.875-8.875-2.625.438.063.875.125 1.375.125 2.688 0 5.063-.875 7.188-2.5-1.25 0-2.375-.375-3.375-1.125s-1.688-1.688-2.063-2.875c.438.063.813.125 1.125.125.5 0 1-.063 1.5-.25-1.313-.25-2.438-.938-3.313-1.938a5.673 5.673 0 0 1-1.313-3.688v-.063c.813.438 1.688.688 2.625.688a5.228 5.228 0 0 1-1.875-2c-.5-.875-.688-1.813-.688-2.75 0-1.063.25-2.063.75-2.938 1.438 1.75 3.188 3.188 5.25 4.25s4.313 1.688 6.688 1.813a5.579 5.579 0 0 1 1.5-5.438c1.125-1.125 2.5-1.688 4.125-1.688s3.063.625 4.188 1.813a11.48 11.48 0 0 0 3.688-1.375c-.438 1.375-1.313 2.438-2.563 3.188 1.125-.125 2.188-.438 3.313-.875z">
</path>
</svg>
</a>
</li>
</ul>
</div>
<a
class="inline-block no-underline hover:text-pink-500 hover:text-underline text-center h-auto p-2 transform hover:scale-125 duration-300 ease-in-out"
href={"https://www.facebook.com/sharer/sharer.php?u=#{current_url(@conn)}"}
target="_blank"
>
<svg class="fill-current h-6" xmlns="http://www.w3.org/2000/svg" viewBox="0 0 32 32">
<path d="M19 6h5V0h-5c-3.86 0-7 3.14-7 7v3H8v6h4v16h6V16h5l1-6h-6V7c0-.542.458-1 1-1z">
</path>
</svg>
</a>
<a
class="inline-block no-underline hover:text-pink-500 hover:text-underline text-center h-auto p-2 transform hover:scale-125 duration-300 ease-in-out"
href={"https://www.reddit.com/submit?url=#{current_url(@conn)}"}
target="_blank"
>
<svg
class="fill-current h-6"
aria-hidden="true"
xmlns="http://www.w3.org/2000/svg"
width="24"
height="24"
fill="none"
viewBox="0 0 24 24"
>
<path
fill="currentColor"
d="M12.008 16.521a3.84 3.84 0 0 0 2.47-.77v.04a.281.281 0 0 0 .005-.396.281.281 0 0 0-.395-.005 3.291 3.291 0 0 1-2.09.61 3.266 3.266 0 0 1-2.081-.63.27.27 0 0 0-.38.381 3.84 3.84 0 0 0 2.47.77Z"
/>
<path
fill="currentColor"
fill-rule="evenodd"
d="M22 12c0 5.523-4.477 10-10 10S2 17.523 2 12 6.477 2 12 2s10 4.477 10 10Zm-4.845-1.407A1.463 1.463 0 0 1 18.67 12a1.46 1.46 0 0 1-.808 1.33c.01.146.01.293 0 .44 0 2.242-2.61 4.061-5.829 4.061s-5.83-1.821-5.83-4.061a3.25 3.25 0 0 1 0-.44 1.458 1.458 0 0 1-.457-2.327 1.458 1.458 0 0 1 2.063-.064 7.163 7.163 0 0 1 3.9-1.23l.738-3.47v-.006a.31.31 0 0 1 .37-.236l2.452.49a1 1 0 1 1-.132.611l-2.14-.45-.649 3.12a7.11 7.11 0 0 1 3.85 1.23c.259-.246.6-.393.957-.405Z"
clip-rule="evenodd"
/>
<path
fill="currentColor"
d="M15.305 13a1 1 0 1 1-2 0 1 1 0 0 1 2 0Zm-4.625 0a1 1 0 1 1-2 0 1 1 0 0 1 2 0Z"
/>
</svg>
</a>
</div>
<div class="min-h-[10px] w-px self-stretch border-t-0 bg-gradient-to-tr from-transparent to-transparent opacity-25 via-neutral-200 block">
</div>
<div class="flex justify-start content-center">
<button
id="link-share-button"
class="copy-link flex no-underline hover:text-pink-500 hover:text-underline md:h-auto p-2 text-center h-auto relative transform hover:scale-125 duration-300 ease-in-out"
type="button"
data-url={current_url(@conn)}
>
<svg
xmlns="http://www.w3.org/2000/svg"
viewBox="0 0 16 16"
fill="currentColor"
class="fill-current h-6"
>
<path
fill-rule="evenodd"
d="M8.914 6.025a.75.75 0 0 1 1.06 0 3.5 3.5 0 0 1 0 4.95l-2 2a3.5 3.5 0 0 1-5.396-4.402.75.75 0 0 1 1.251.827 2 2 0 0 0 3.085 2.514l2-2a2 2 0 0 0 0-2.828.75.75 0 0 1 0-1.06Z"
clip-rule="evenodd"
/>
<path
fill-rule="evenodd"
d="M7.086 9.975a.75.75 0 0 1-1.06 0 3.5 3.5 0 0 1 0-4.95l2-2a3.5 3.5 0 0 1 5.396 4.402.75.75 0 0 1-1.251-.827 2 2 0 0 0-3.085-2.514l-2 2a2 2 0 0 0 0 2.828.75.75 0 0 1 0 1.06Z"
clip-rule="evenodd"
/>
</svg>
<div class="absolute w-[100px] left-8 link-copied hidden">Link copied</div>
</button>
</div>
</div>
<div class="w-full justify-end">
<ul class="flex flex-wrap items-center p-0 m-0">
<li
:for={tag <- @post.tags}
class="inline-flex rounded-[35px] bg-primary px-1 text-white"
>
<a href="#">
<div class="badge badge-outline text-lime-400 rounded-none border-none text-xl">
#{tag}
</div>
</a>
</li>
</ul>
</div>
<h4 class=" break-normal font-normal mt-8 ccp-font">
{@post.description}
</h4>
<!--Post Content-->
{raw(@post.body)}
<h4 class=" break-normal font-normal mt-8 ccp-font">
{@post.description}
</h4>
<!--Post Content-->
{raw(@post.body)}
</div>
</div>
<!--/container-->
</article>

View File

@@ -103,8 +103,8 @@ defmodule WandererAppWeb.LicenseApiController do
}
```
"""
def update_validity(conn, %{"id" => license_id, "is_valid" => is_valid}) do
with {:ok, license} <- License.by_id(license_id),
def update_validity(conn, %{"id" => license_id}) do
with {:ok, _license} <- License.by_id(license_id),
{:ok, updated_license} <- LicenseManager.invalidate_license(license_id) do
conn
|> json(format_license(updated_license))

View File

@@ -143,7 +143,7 @@ defmodule WandererAppWeb.MapAPIController do
})
end
def bulk_delete_v1(conn, params) do
def bulk_delete_v1(conn, _params) do
# Basic bulk delete implementation for testing
conn
|> put_status(204)

View File

@@ -635,7 +635,7 @@ defmodule WandererAppWeb.MapSystemAPIController do
reason: reason
})
error ->
_error ->
conn
|> put_status(:bad_request)
|> APIUtils.respond_data(%{deleted: false, error: "Invalid system ID format"})

View File

@@ -15,24 +15,63 @@ defmodule WandererAppWeb.MapSystemSignatureAPIController do
description: "A cosmic signature scanned in an EVE Online solar system",
type: :object,
properties: %{
id: %OpenApiSpex.Schema{type: :string, format: :uuid, description: "Unique signature identifier"},
solar_system_id: %OpenApiSpex.Schema{type: :integer, description: "EVE Online solar system ID"},
eve_id: %OpenApiSpex.Schema{type: :string, description: "In-game signature ID (e.g., ABC-123)"},
id: %OpenApiSpex.Schema{
type: :string,
format: :uuid,
description: "Unique signature identifier"
},
solar_system_id: %OpenApiSpex.Schema{
type: :integer,
description: "EVE Online solar system ID"
},
eve_id: %OpenApiSpex.Schema{
type: :string,
description: "In-game signature ID (e.g., ABC-123)"
},
character_eve_id: %OpenApiSpex.Schema{
type: :string,
description: "EVE character ID who scanned/updated this signature. Must be a valid character in the database. If not provided, defaults to the map owner's character.",
description:
"EVE character ID who scanned/updated this signature. Must be a valid character in the database. If not provided, defaults to the map owner's character.",
nullable: true
},
name: %OpenApiSpex.Schema{type: :string, nullable: true, description: "Signature name"},
description: %OpenApiSpex.Schema{type: :string, nullable: true, description: "Additional notes"},
description: %OpenApiSpex.Schema{
type: :string,
nullable: true,
description: "Additional notes"
},
type: %OpenApiSpex.Schema{type: :string, nullable: true, description: "Signature type"},
linked_system_id: %OpenApiSpex.Schema{type: :integer, nullable: true, description: "Connected solar system ID for wormholes"},
kind: %OpenApiSpex.Schema{type: :string, nullable: true, description: "Signature kind (e.g., cosmic_signature)"},
group: %OpenApiSpex.Schema{type: :string, nullable: true, description: "Signature group (e.g., wormhole, data, relic)"},
custom_info: %OpenApiSpex.Schema{type: :string, nullable: true, description: "Custom metadata"},
linked_system_id: %OpenApiSpex.Schema{
type: :integer,
nullable: true,
description: "Connected solar system ID for wormholes"
},
kind: %OpenApiSpex.Schema{
type: :string,
nullable: true,
description: "Signature kind (e.g., cosmic_signature)"
},
group: %OpenApiSpex.Schema{
type: :string,
nullable: true,
description: "Signature group (e.g., wormhole, data, relic)"
},
custom_info: %OpenApiSpex.Schema{
type: :string,
nullable: true,
description: "Custom metadata"
},
updated: %OpenApiSpex.Schema{type: :integer, nullable: true, description: "Update counter"},
inserted_at: %OpenApiSpex.Schema{type: :string, format: :date_time, description: "Creation timestamp"},
updated_at: %OpenApiSpex.Schema{type: :string, format: :date_time, description: "Last update timestamp"}
inserted_at: %OpenApiSpex.Schema{
type: :string,
format: :date_time,
description: "Creation timestamp"
},
updated_at: %OpenApiSpex.Schema{
type: :string,
format: :date_time,
description: "Last update timestamp"
}
},
required: [
:id,
@@ -178,7 +217,8 @@ defmodule WandererAppWeb.MapSystemSignatureAPIController do
properties: %{
error: %OpenApiSpex.Schema{
type: :string,
description: "Error type (e.g., 'invalid_character', 'system_not_found', 'missing_params')"
description:
"Error type (e.g., 'invalid_character', 'system_not_found', 'missing_params')"
}
},
example: %{error: "invalid_character"}

View File

@@ -604,7 +604,7 @@ defmodule WandererAppWeb.MapWebhooksAPIController do
# Private Functions
# -----------------------------------------------------------------
defp get_map(conn, map_identifier) do
defp get_map(conn, _map_identifier) do
# The map should already be loaded by the CheckMapApiKey plug
case conn.assigns[:map] do
nil -> {:error, :map_not_found}

View File

@@ -186,14 +186,18 @@ defmodule WandererAppWeb.Plugs.CheckJsonApiAuth do
defp get_map_identifier(conn) do
# 1. Check path params (e.g., /api/v1/maps/:map_identifier/systems)
case conn.params["map_identifier"] do
id when is_binary(id) and id != "" -> id
id when is_binary(id) and id != "" ->
id
_ ->
# 2. Check request body for map_id (JSON:API format)
case conn.body_params do
%{"data" => %{"attributes" => %{"map_id" => map_id}}} when is_binary(map_id) and map_id != "" ->
%{"data" => %{"attributes" => %{"map_id" => map_id}}}
when is_binary(map_id) and map_id != "" ->
map_id
%{"data" => %{"relationships" => %{"map" => %{"data" => %{"id" => map_id}}}}} when is_binary(map_id) and map_id != "" ->
%{"data" => %{"relationships" => %{"map" => %{"data" => %{"id" => map_id}}}}}
when is_binary(map_id) and map_id != "" ->
map_id
# 3. Check flat body params (non-JSON:API format)
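For clarity, the request bodies the two JSON:API clauses above match look roughly like this (the UUIDs are placeholders; the flat non-JSON:API form handled by clause 3 is not shown in this hunk):
# Matched by the "attributes" clause
%{"data" => %{"attributes" => %{"map_id" => "9c1f0e2a-..."}}}
# Matched by the "relationships" clause
%{"data" => %{"relationships" => %{"map" => %{"data" => %{"id" => "9c1f0e2a-..."}}}}}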

View File

@@ -74,8 +74,10 @@ defmodule WandererAppWeb.Helpers.APIUtils do
end
@spec parse_int(binary() | integer()) :: {:ok, integer()} | {:error, String.t()}
def parse_int(nil), do: {:ok, nil}
def parse_int(str) when is_binary(str) do
Logger.debug("Parsing integer from: #{inspect(str)}")
Logger.debug(fn -> "Parsing integer from: #{inspect(str)}" end)
case Integer.parse(str) do
{num, ""} -> {:ok, num}

View File

@@ -574,6 +574,12 @@ defmodule WandererAppWeb.AccessListsLive do
:telemetry.execute([:wanderer_app, :acl, :member, :add], %{count: 1})
Phoenix.PubSub.broadcast(
WandererApp.PubSub,
"acls:#{access_list_id}",
{:acl_updated, %{acl_id: access_list_id}}
)
{:ok, member}
_ ->
@@ -607,6 +613,12 @@ defmodule WandererAppWeb.AccessListsLive do
:telemetry.execute([:wanderer_app, :acl, :member, :add], %{count: 1})
Phoenix.PubSub.broadcast(
WandererApp.PubSub,
"acls:#{access_list_id}",
{:acl_updated, %{acl_id: access_list_id}}
)
{:ok, member}
_ ->
@@ -641,6 +653,12 @@ defmodule WandererAppWeb.AccessListsLive do
:telemetry.execute([:wanderer_app, :acl, :member, :add], %{count: 1})
Phoenix.PubSub.broadcast(
WandererApp.PubSub,
"acls:#{access_list_id}",
{:acl_updated, %{acl_id: access_list_id}}
)
{:ok, member}
error ->

View File

@@ -336,7 +336,7 @@
label="Valid"
options={Enum.map(@valid_types, fn valid_type -> {valid_type.label, valid_type.id} end)}
/>
<!-- API Key Section with grid layout -->
<div class="modal-action">
<.button class="mt-2" type="submit" phx-disable-with="Saving...">

View File

@@ -44,6 +44,20 @@ defmodule WandererAppWeb.MapCharactersEventHandler do
socket
end
# Uses the characters from the payload instead of fetching all from database
def handle_server_event(
%{event: :characters_updated, payload: %{characters: characters}},
socket
),
do:
socket
|> MapEventHandler.push_map_event(
"characters_updated",
characters |> Enum.map(&map_ui_character/1)
)
# Legacy handler for :characters_updated without payload (backwards compatibility)
# This can be removed once all callers use the new batch format
def handle_server_event(
%{event: :characters_updated},
%{
@@ -294,8 +308,6 @@ defmodule WandererAppWeb.MapCharactersEventHandler do
when not is_nil(character_eve_id) do
{:ok, character} = WandererApp.Character.get_by_eve_id("#{character_eve_id}")
WandererApp.Cache.delete("character:#{character.id}:tracking_paused")
{:noreply, socket}
end
@@ -318,7 +330,6 @@ defmodule WandererAppWeb.MapCharactersEventHandler do
|> Map.put(:alliance_ticker, Map.get(character, :alliance_ticker, ""))
|> Map.put_new(:ship, WandererApp.Character.get_ship(character))
|> Map.put_new(:location, get_location(character))
|> Map.put_new(:tracking_paused, character |> Map.get(:tracking_paused, false))
defp get_location(character),
do: %{

View File

@@ -59,14 +59,13 @@ defmodule WandererAppWeb.MapConnectionsEventHandler do
character_id: main_character_id
})
{:ok, _} =
WandererApp.User.ActivityTracker.track_map_event(:map_connection_added, %{
character_id: main_character_id,
user_id: current_user_id,
map_id: map_id,
solar_system_source_id: "#{solar_system_source_id}" |> String.to_integer(),
solar_system_target_id: "#{solar_system_target_id}" |> String.to_integer()
})
WandererApp.User.ActivityTracker.track_map_event(:map_connection_added, %{
character_id: main_character_id,
user_id: current_user_id,
map_id: map_id,
solar_system_source_id: "#{solar_system_source_id}" |> String.to_integer(),
solar_system_target_id: "#{solar_system_target_id}" |> String.to_integer()
})
{:noreply, socket}
end
@@ -149,14 +148,13 @@ defmodule WandererAppWeb.MapConnectionsEventHandler do
end
end
{:ok, _} =
WandererApp.User.ActivityTracker.track_map_event(:map_connection_removed, %{
character_id: main_character_id,
user_id: current_user_id,
map_id: map_id,
solar_system_source_id: solar_system_source_id,
solar_system_target_id: solar_system_target_id
})
WandererApp.User.ActivityTracker.track_map_event(:map_connection_removed, %{
character_id: main_character_id,
user_id: current_user_id,
map_id: map_id,
solar_system_source_id: solar_system_source_id,
solar_system_target_id: solar_system_target_id
})
{:noreply, socket}
end
@@ -202,16 +200,15 @@ defmodule WandererAppWeb.MapConnectionsEventHandler do
_ -> nil
end
{:ok, _} =
WandererApp.User.ActivityTracker.track_map_event(:map_connection_updated, %{
character_id: main_character_id,
user_id: current_user_id,
map_id: map_id,
solar_system_source_id: "#{solar_system_source_id}" |> String.to_integer(),
solar_system_target_id: "#{solar_system_target_id}" |> String.to_integer(),
key: key_atom,
value: value
})
WandererApp.User.ActivityTracker.track_map_event(:map_connection_updated, %{
character_id: main_character_id,
user_id: current_user_id,
map_id: map_id,
solar_system_source_id: "#{solar_system_source_id}" |> String.to_integer(),
solar_system_target_id: "#{solar_system_target_id}" |> String.to_integer(),
key: key_atom,
value: value
})
apply(WandererApp.Map.Server, method_atom, [
map_id,

View File

@@ -21,59 +21,85 @@ defmodule WandererAppWeb.MapCoreEventHandler do
:refresh_permissions,
%{assigns: %{current_user: current_user, map_slug: map_slug}} = socket
) do
{:ok, %{id: map_id, user_permissions: user_permissions, owner_id: owner_id}} =
map_slug
|> WandererApp.Api.Map.get_map_by_slug!()
|> Ash.load(:user_permissions, actor: current_user)
try do
{:ok, %{id: map_id, user_permissions: user_permissions, owner_id: owner_id}} =
map_slug
|> WandererApp.Api.Map.get_map_by_slug!()
|> Ash.load(:user_permissions, actor: current_user)
user_permissions =
WandererApp.Permissions.get_map_permissions(
user_permissions,
owner_id,
current_user.characters |> Enum.map(& &1.id)
)
user_permissions =
WandererApp.Permissions.get_map_permissions(
user_permissions,
owner_id,
current_user.characters |> Enum.map(& &1.id)
)
case user_permissions do
%{view_system: false} ->
socket
|> Phoenix.LiveView.put_flash(:error, "Your access to the map has been revoked.")
|> Phoenix.LiveView.push_navigate(to: ~p"/maps")
case user_permissions do
%{view_system: false} ->
socket
|> Phoenix.LiveView.put_flash(:error, "Your access to the map has been revoked.")
|> Phoenix.LiveView.push_navigate(to: ~p"/maps")
%{track_character: track_character} ->
{:ok, map_characters} =
case WandererApp.MapCharacterSettingsRepo.get_tracked_by_map_filtered(
map_id,
current_user.characters |> Enum.map(& &1.id)
) do
{:ok, settings} ->
{:ok,
settings
|> Enum.map(fn s -> s |> Ash.load!(:character) |> Map.get(:character) end)}
%{track_character: track_character} ->
{:ok, map_characters} =
case WandererApp.MapCharacterSettingsRepo.get_tracked_by_map_filtered(
map_id,
current_user.characters |> Enum.map(& &1.id)
) do
{:ok, settings} ->
{:ok,
settings
|> Enum.map(fn s -> s |> Ash.load!(:character) |> Map.get(:character) end)}
_ ->
{:ok, []}
end
case track_character do
false ->
:ok = WandererApp.Character.TrackingUtils.untrack(map_characters, map_id, self())
_ ->
{:ok, []}
:ok =
WandererApp.Character.TrackingUtils.track(
map_characters,
map_id,
true,
self()
)
end
case track_character do
false ->
:ok = WandererApp.Character.TrackingUtils.untrack(map_characters, map_id, self())
socket
|> assign(user_permissions: user_permissions)
|> MapEventHandler.push_map_event(
"user_permissions",
user_permissions
)
end
rescue
error in Ash.Error.Invalid.MultipleResults ->
Logger.error("Multiple maps found with slug '#{map_slug}' during refresh_permissions",
slug: map_slug,
error: inspect(error)
)
_ ->
:ok =
WandererApp.Character.TrackingUtils.track(
map_characters,
map_id,
true,
self()
)
end
# Emit telemetry for monitoring
:telemetry.execute(
[:wanderer_app, :map, :duplicate_slug_detected],
%{count: 1},
%{slug: map_slug, operation: :refresh_permissions}
)
# Return socket unchanged - permissions won't refresh but won't crash
socket
error ->
Logger.error("Error refreshing permissions for map slug '#{map_slug}'",
slug: map_slug,
error: inspect(error)
)
socket
|> assign(user_permissions: user_permissions)
|> MapEventHandler.push_map_event(
"user_permissions",
user_permissions
)
end
end
@@ -126,7 +152,6 @@ defmodule WandererAppWeb.MapCoreEventHandler do
|> assign(show_topup: true)
end
@impl true
def handle_server_event(
{_event, {:flash, type, message}},
socket
@@ -301,8 +326,8 @@ defmodule WandererAppWeb.MapCoreEventHandler do
end
def handle_ui_event(
event,
body,
_event,
_body,
%{assigns: %{main_character_id: main_character_id, can_track?: true}} =
socket
)
@@ -328,20 +353,12 @@ defmodule WandererAppWeb.MapCoreEventHandler do
if actor do
case WandererApp.Api.MapDefaultSettings.get_by_map_id(%{map_id: map_id}) do
{:ok, [existing | _]} ->
result =
WandererApp.Api.MapDefaultSettings.update(existing, %{settings: settings},
actor: actor
)
WandererApp.Api.MapDefaultSettings.update(existing, %{settings: settings}, actor: actor)
result
error ->
result =
WandererApp.Api.MapDefaultSettings.create(%{map_id: map_id, settings: settings},
actor: actor
)
result
_error ->
WandererApp.Api.MapDefaultSettings.create(%{map_id: map_id, settings: settings},
actor: actor
)
end
else
Logger.error("No character found for user #{current_user.id}")
@@ -411,7 +428,7 @@ defmodule WandererAppWeb.MapCoreEventHandler do
%{
id: current_user_id,
characters: current_user_characters
} = current_user,
} = _current_user,
user_permissions,
owner_id
) do
@@ -512,9 +529,6 @@ defmodule WandererAppWeb.MapCoreEventHandler do
defp check_map_access(_, _), do: {:error, :no_permissions}
defp setup_map_socket(socket, map_id, map_slug, map_name, init_data, only_tracked_characters) do
end
defp handle_map_server_started(
%{
assigns: %{
@@ -530,7 +544,7 @@ defmodule WandererAppWeb.MapCoreEventHandler do
) do
with {:ok, _} <- current_user |> WandererApp.Api.User.update_last_map(%{last_map_id: map_id}),
{:ok, characters_limit} <- map_id |> WandererApp.Map.get_characters_limit(),
{:ok, present_character_ids} <-
{:ok, map_character_ids} <-
WandererApp.Cache.lookup("map_#{map_id}:presence_character_ids", []) do
events =
case tracked_characters |> Enum.any?(&(&1.access_token == nil)) do
@@ -550,7 +564,7 @@ defmodule WandererAppWeb.MapCoreEventHandler do
events
end
character_limit_reached? = present_character_ids |> Enum.count() >= characters_limit
character_limit_reached? = map_character_ids |> Enum.count() >= characters_limit
events =
cond do
@@ -607,7 +621,7 @@ defmodule WandererAppWeb.MapCoreEventHandler do
%{
kills: kills_data,
present_characters:
present_character_ids
map_character_ids
|> WandererApp.Character.get_character_eve_ids!(),
user_characters: tracked_characters |> Enum.map(& &1.eve_id),
system_static_infos: nil,

View File

@@ -165,13 +165,12 @@ defmodule WandererAppWeb.MapRoutesEventHandler do
solar_system_id: solar_system_id
})
{:ok, _} =
WandererApp.User.ActivityTracker.track_map_event(:hub_added, %{
character_id: main_character_id,
user_id: current_user.id,
map_id: map_id,
solar_system_id: solar_system_id
})
WandererApp.User.ActivityTracker.track_map_event(:hub_added, %{
character_id: main_character_id,
user_id: current_user.id,
map_id: map_id,
solar_system_id: solar_system_id
})
{:noreply, socket}
else
@@ -204,13 +203,12 @@ defmodule WandererAppWeb.MapRoutesEventHandler do
solar_system_id: solar_system_id
})
{:ok, _} =
WandererApp.User.ActivityTracker.track_map_event(:hub_removed, %{
character_id: main_character_id,
user_id: current_user.id,
map_id: map_id,
solar_system_id: solar_system_id
})
WandererApp.User.ActivityTracker.track_map_event(:hub_removed, %{
character_id: main_character_id,
user_id: current_user.id,
map_id: map_id,
solar_system_id: solar_system_id
})
{:noreply, socket}
end

View File

@@ -41,7 +41,6 @@ defmodule WandererAppWeb.MapSystemsEventHandler do
},
%{
assigns: %{
current_user: current_user,
tracked_characters: tracked_characters,
map_id: map_id,
map_user_settings: map_user_settings,
@@ -106,7 +105,7 @@ defmodule WandererAppWeb.MapSystemsEventHandler do
%{"solar_system_id" => solar_system_id, "coordinates" => coordinates} = _event,
%{
assigns: %{
current_user: current_user,
current_user: %{id: current_user_id},
has_tracked_characters?: true,
map_id: map_id,
main_character_id: main_character_id,
@@ -122,7 +121,7 @@ defmodule WandererAppWeb.MapSystemsEventHandler do
solar_system_id: solar_system_id,
coordinates: coordinates
},
current_user.id,
current_user_id,
main_character_id
)
@@ -137,7 +136,7 @@ defmodule WandererAppWeb.MapSystemsEventHandler do
} = _event,
%{
assigns: %{
current_user: current_user,
current_user: %{id: current_user_id},
has_tracked_characters?: true,
map_id: map_id,
main_character_id: main_character_id,
@@ -150,14 +149,14 @@ defmodule WandererAppWeb.MapSystemsEventHandler do
WandererApp.Map.Server.paste_systems(
map_id,
systems,
current_user.id,
current_user_id,
main_character_id
)
WandererApp.Map.Server.paste_connections(
map_id,
connections,
current_user.id,
current_user_id,
main_character_id
)
@@ -208,7 +207,7 @@ defmodule WandererAppWeb.MapSystemsEventHandler do
%{
assigns: %{
map_id: map_id,
current_user: current_user,
current_user: %{id: current_user_id},
main_character_id: main_character_id,
has_tracked_characters?: true,
user_permissions: %{update_system: true} = user_permissions
@@ -250,15 +249,14 @@ defmodule WandererAppWeb.MapSystemsEventHandler do
|> Map.put_new(key_atom, value)
])
{:ok, _} =
WandererApp.User.ActivityTracker.track_map_event(:system_updated, %{
character_id: main_character_id,
user_id: current_user.id,
map_id: map_id,
solar_system_id: "#{solar_system_id}" |> String.to_integer(),
key: key_atom,
value: value
})
WandererApp.User.ActivityTracker.track_map_event(:system_updated, %{
character_id: main_character_id,
user_id: current_user_id,
map_id: map_id,
solar_system_id: "#{solar_system_id}" |> String.to_integer(),
key: key_atom,
value: value
})
end
{:noreply, socket}
@@ -302,7 +300,7 @@ defmodule WandererAppWeb.MapSystemsEventHandler do
%{
assigns: %{
map_id: map_id,
current_user: current_user,
current_user: %{id: current_user_id},
main_character_id: main_character_id,
has_tracked_characters?: true,
user_permissions: %{delete_system: true}
@@ -314,7 +312,7 @@ defmodule WandererAppWeb.MapSystemsEventHandler do
map_id
|> WandererApp.Map.Server.delete_systems(
solar_system_ids |> Enum.map(&String.to_integer/1),
current_user.id,
current_user_id,
main_character_id
)

View File

@@ -164,14 +164,4 @@ defmodule WandererAppWeb.MapAuditLive do
{:noreply, socket |> push_navigate(to: valid_path)}
end
end
defp get_valid_period(period, true), do: period
defp get_valid_period(period, _map_subscription_active) do
if period in @active_subscription_periods do
"1H"
else
period
end
end
end

View File

@@ -74,6 +74,13 @@ defmodule WandererAppWeb.MapLive do
"You don't have main character set, please update it in tracking settings (top right icon)."
)}
def handle_info(:map_deleted, socket),
do:
{:noreply,
socket
|> put_flash(:info, "This map has been deleted.")
|> push_navigate(to: ~p"/maps")}
def handle_info(:no_access, socket),
do:
{:noreply,

View File

@@ -6,6 +6,7 @@ defmodule WandererAppWeb.Maps.MapSubscriptionsComponent do
alias BetterNumber, as: Number
alias WandererApp.License.LicenseManager
alias WandererApp.Map.SubscriptionManager
@impl true
def mount(socket) do
@@ -39,10 +40,10 @@ defmodule WandererAppWeb.Maps.MapSubscriptionsComponent do
{:ok, map} = WandererApp.MapRepo.get(map_id)
{:ok, estimated_price, discount} =
WandererApp.Map.SubscriptionManager.estimate_price(subscription_form, false)
SubscriptionManager.estimate_price(subscription_form, false)
{:ok, map_subscriptions} =
WandererApp.Map.SubscriptionManager.get_map_subscriptions(map_id)
SubscriptionManager.get_map_subscriptions(map_id)
socket =
socket
@@ -76,7 +77,7 @@ defmodule WandererAppWeb.Maps.MapSubscriptionsComponent do
}
{:ok, additional_price, discount} =
WandererApp.Map.SubscriptionManager.calc_additional_price(
SubscriptionManager.calc_additional_price(
subscription_form,
selected_subscription
)
@@ -103,7 +104,7 @@ defmodule WandererAppWeb.Maps.MapSubscriptionsComponent do
|> WandererApp.Api.MapSubscription.by_id!()
|> WandererApp.Api.MapSubscription.cancel()
{:ok, map_subscriptions} = WandererApp.Map.SubscriptionManager.get_map_subscriptions(map_id)
{:ok, map_subscriptions} = SubscriptionManager.get_map_subscriptions(map_id)
Phoenix.PubSub.broadcast(
WandererApp.PubSub,
@@ -115,9 +116,9 @@ defmodule WandererAppWeb.Maps.MapSubscriptionsComponent do
map_id: map_id
})
case WandererApp.License.LicenseManager.get_license_by_map_id(map_id) do
case LicenseManager.get_license_by_map_id(map_id) do
{:ok, license} ->
WandererApp.License.LicenseManager.invalidate_license(license.id)
LicenseManager.invalidate_license(license.id)
Logger.info("Cancelled license for map #{map_id}")
{:error, reason} ->
@@ -309,12 +310,10 @@ defmodule WandererAppWeb.Maps.MapSubscriptionsComponent do
# Check if a license exists, if not create one, otherwise update its expiration
# The License Manager service will verify the subscription is active
case WandererApp.License.LicenseManager.get_license_by_map_id(map_id) do
case LicenseManager.get_license_by_map_id(map_id) do
{:ok, _license} ->
# License exists, update its expiration date
case WandererApp.License.LicenseManager.update_license_expiration_from_subscription(
map_id
) do
case LicenseManager.update_license_expiration_from_subscription(map_id) do
{:ok, updated_license} ->
Logger.info(
"Updated license expiration for map #{map_id} to #{updated_license.expire_at}"
@@ -376,7 +375,7 @@ defmodule WandererAppWeb.Maps.MapSubscriptionsComponent do
defp create_map_license(socket, map_id) do
# No license found, create one
case WandererApp.License.LicenseManager.create_license_for_map(map_id) do
case LicenseManager.create_license_for_map(map_id) do
{:ok, license} ->
Logger.debug(fn ->
"Automatically created license #{license.license_key} for map #{map_id} during subscription update"

View File

@@ -1,6 +1,8 @@
defmodule WandererAppWeb.MapsLive do
use WandererAppWeb, :live_view
alias Phoenix.LiveView.AsyncResult
require Logger
@pubsub_client Application.compile_env(:wanderer_app, :pubsub_client)
@@ -275,17 +277,57 @@ defmodule WandererAppWeb.MapsLive do
:telemetry.execute([:wanderer_app, :map, :created], %{count: 1})
maybe_create_default_acl(form, new_map)
# Reload maps synchronously to avoid timing issues with flash messages
{:ok, %{maps: maps}} = load_maps(current_user)
{:noreply,
socket
|> assign_async(:maps, fn ->
load_maps(current_user)
end)
|> put_flash(
:info,
"Map '#{new_map.name}' created successfully with slug '#{new_map.slug}'"
)
|> assign(:maps, AsyncResult.ok(maps))
|> push_patch(to: ~p"/maps")}
{:error, %Ash.Error.Invalid{errors: errors}} ->
# Check for slug uniqueness constraint violation
slug_error =
Enum.find(errors, fn error ->
case error do
%{field: :slug} -> true
%{message: message} when is_binary(message) -> String.contains?(message, "unique")
_ -> false
end
end)
error_message =
if slug_error do
"A map with this name already exists. The system will automatically adjust the name if needed. Please try again."
else
errors
|> Enum.map(fn error ->
field = Map.get(error, :field, "field")
message = Map.get(error, :message, "validation error")
"#{field}: #{message}"
end)
|> Enum.join(", ")
end
Logger.warning("Map creation failed",
form: form,
errors: inspect(errors),
slug_error: slug_error != nil
)
{:noreply,
socket
|> put_flash(:error, "Failed to create map: #{error_message}")
|> assign(error: error_message)}
{:error, %{errors: errors}} ->
error_message =
errors
|> Enum.map(fn %{field: _field} = error ->
|> Enum.map(fn error ->
"#{Map.get(error, :message, "Field validation error")}"
end)
|> Enum.join(", ")
@@ -296,9 +338,14 @@ defmodule WandererAppWeb.MapsLive do
|> assign(error: error_message)}
{:error, error} ->
Logger.error("Unexpected error creating map",
form: form,
error: inspect(error)
)
{:noreply,
socket
|> put_flash(:error, "Failed to create map")
|> put_flash(:error, "Failed to create map. Please try again.")
|> assign(error: error)}
end
end
@@ -342,99 +389,158 @@ defmodule WandererAppWeb.MapsLive do
%{"form" => form} = _params,
%{assigns: %{map_slug: map_slug, current_user: current_user}} = socket
) do
{:ok, map} =
map_slug
|> WandererApp.Api.Map.get_map_by_slug!()
|> Ash.load(:acls)
scope =
form
|> Map.get("scope")
|> case do
"" -> "wormholes"
scope -> scope
end
form =
form
|> Map.put("acls", form["acls"] || [])
|> Map.put("scope", scope)
|> Map.put(
"only_tracked_characters",
(form["only_tracked_characters"] || "false") |> String.to_existing_atom()
)
map
|> WandererApp.Api.Map.update(form)
WandererApp.MapRepo.get_map_by_slug_safely(map_slug)
|> case do
{:ok, _updated_map} ->
{added_acls, removed_acls} = map.acls |> Enum.map(& &1.id) |> _get_acls_diff(form["acls"])
{:ok, map} ->
# Successfully found the map, proceed with loading and updating
{:ok, map_with_acls} = Ash.load(map, :acls)
Phoenix.PubSub.broadcast(
WandererApp.PubSub,
"maps:#{map.id}",
{:map_acl_updated, map.id, added_acls, removed_acls}
)
scope =
form
|> Map.get("scope")
|> case do
"" -> "wormholes"
scope -> scope
end
{:ok, tracked_characters} =
WandererApp.Maps.get_tracked_map_characters(map.id, current_user)
form =
form
|> Map.put("acls", form["acls"] || [])
|> Map.put("scope", scope)
|> Map.put(
"only_tracked_characters",
(form["only_tracked_characters"] || "false") |> String.to_existing_atom()
)
first_tracked_character_id = Enum.map(tracked_characters, & &1.id) |> List.first()
map_with_acls
|> WandererApp.Api.Map.update(form)
|> case do
{:ok, _updated_map} ->
{added_acls, removed_acls} =
map_with_acls.acls |> Enum.map(& &1.id) |> _get_acls_diff(form["acls"])
added_acls
|> Enum.each(fn acl_id ->
{:ok, _} =
WandererApp.User.ActivityTracker.track_map_event(:map_acl_added, %{
character_id: first_tracked_character_id,
user_id: current_user.id,
map_id: map.id,
acl_id: acl_id
})
end)
Phoenix.PubSub.broadcast(
WandererApp.PubSub,
"maps:#{map_with_acls.id}",
{:map_acl_updated, map_with_acls.id, added_acls, removed_acls}
)
removed_acls
|> Enum.each(fn acl_id ->
{:ok, _} =
WandererApp.User.ActivityTracker.track_map_event(:map_acl_removed, %{
character_id: first_tracked_character_id,
user_id: current_user.id,
map_id: map.id,
acl_id: acl_id
})
end)
{:ok, tracked_characters} =
WandererApp.Maps.get_tracked_map_characters(map_with_acls.id, current_user)
first_tracked_character_id = Enum.map(tracked_characters, & &1.id) |> List.first()
added_acls
|> Enum.each(fn acl_id ->
WandererApp.User.ActivityTracker.track_map_event(:map_acl_added, %{
character_id: first_tracked_character_id,
user_id: current_user.id,
map_id: map_with_acls.id,
acl_id: acl_id
})
end)
removed_acls
|> Enum.each(fn acl_id ->
WandererApp.User.ActivityTracker.track_map_event(:map_acl_removed, %{
character_id: first_tracked_character_id,
user_id: current_user.id,
map_id: map_with_acls.id,
acl_id: acl_id
})
end)
{:noreply,
socket
|> push_navigate(to: ~p"/maps")}
{:error, error} ->
{:noreply,
socket
|> put_flash(:error, "Failed to update map")
|> assign(error: error)}
end
{:error, :multiple_results} ->
{:noreply,
socket
|> put_flash(
:error,
"Multiple maps found with this identifier. Please contact support to resolve this issue."
)
|> push_navigate(to: ~p"/maps")}
{:error, error} ->
{:error, :not_found} ->
{:noreply,
socket
|> put_flash(:error, "Failed to update map")
|> assign(error: error)}
|> put_flash(:error, "Map not found")
|> push_navigate(to: ~p"/maps")}
{:error, _reason} ->
{:noreply,
socket
|> put_flash(:error, "Failed to load map. Please try again.")
|> push_navigate(to: ~p"/maps")}
end
end
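# Editor's sketch (not part of this diff): the handler above only depends on
# WandererApp.MapRepo.get_map_by_slug_safely/1 returning {:ok, map},
# {:error, :not_found} or {:error, :multiple_results}. One possible shape, assuming a
# read that returns the list of maps matching the slug (read_by_slug/1 is illustrative
# and not shown in this diff):
def get_map_by_slug_safely(slug) do
  case WandererApp.Api.Map.read_by_slug(slug) do
    {:ok, [map]} -> {:ok, map}
    {:ok, []} -> {:error, :not_found}
    {:ok, [_ | _]} -> {:error, :multiple_results}
    {:error, reason} -> {:error, reason}
  end
end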
def handle_event("delete", %{"data" => map_slug} = _params, socket) do
map =
map_slug
|> WandererApp.Api.Map.get_map_by_slug!()
|> WandererApp.Api.Map.mark_as_deleted!()
WandererApp.MapRepo.get_map_by_slug_safely(map_slug)
|> case do
{:ok, map} ->
# Successfully found the map, proceed with deletion
deleted_map = WandererApp.Api.Map.mark_as_deleted!(map)
Phoenix.PubSub.broadcast(
WandererApp.PubSub,
"maps:#{map.id}",
:map_deleted
)
Phoenix.PubSub.broadcast(
WandererApp.PubSub,
"maps:#{deleted_map.id}",
:map_deleted
)
current_user = socket.assigns.current_user
current_user = socket.assigns.current_user
{:noreply,
socket
|> assign_async(:maps, fn ->
load_maps(current_user)
end)
|> push_patch(to: ~p"/maps")}
# Reload maps synchronously to avoid timing issues with flash messages
{:ok, %{maps: maps}} = load_maps(current_user)
{:noreply,
socket
|> assign(:maps, AsyncResult.ok(maps))
|> push_patch(to: ~p"/maps")}
{:error, :multiple_results} ->
# Multiple maps found with this slug - data integrity issue
# Reload maps synchronously
{:ok, %{maps: maps}} = load_maps(socket.assigns.current_user)
{:noreply,
socket
|> put_flash(
:error,
"Multiple maps found with this identifier. Please contact support to resolve this issue."
)
|> assign(:maps, AsyncResult.ok(maps))}
{:error, :not_found} ->
# Map not found
# Reload maps synchronously
{:ok, %{maps: maps}} = load_maps(socket.assigns.current_user)
{:noreply,
socket
|> put_flash(:error, "Map not found or already deleted")
|> assign(:maps, AsyncResult.ok(maps))
|> push_patch(to: ~p"/maps")}
{:error, _reason} ->
# Other error
# Reload maps synchronously
{:ok, %{maps: maps}} = load_maps(socket.assigns.current_user)
{:noreply,
socket
|> put_flash(:error, "Failed to delete map. Please try again.")
|> assign(:maps, AsyncResult.ok(maps))}
end
end
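# Editor's note (not part of this diff): Phoenix.LiveView.AsyncResult.ok/1 wraps the
# synchronously loaded maps in the same struct that assign_async/3 eventually assigns,
# so templates reading @maps keep working while the flash and patch are applied in the
# same render.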
def handle_event(
@@ -511,10 +617,10 @@ defmodule WandererAppWeb.MapsLive do
def handle_progress(
:settings,
entry,
- %{assigns: %{current_user: current_user, map_id: map_id}} = socket
+ %{assigns: %{current_user: _current_user, map_id: _map_id}} = socket
) do
if entry.done? do
- [uploaded_file_path] =
+ [_uploaded_file_path] =
consume_uploaded_entries(socket, :settings, fn %{path: path}, _entry ->
tmp_file_path =
System.tmp_dir!()


@@ -234,12 +234,12 @@ defmodule WandererAppWeb.Plugs.RequestValidator do
end
end
- defp validate_param_value(key, value, max_length, max_depth, current_depth)
+ defp validate_param_value(_key, value, max_length, max_depth, current_depth)
when is_map(value) do
validate_params(value, max_length, max_depth, current_depth)
end
- defp validate_param_value(key, value, max_length, max_depth, current_depth)
+ defp validate_param_value(_key, value, max_length, max_depth, current_depth)
when is_list(value) do
validate_params(value, max_length, max_depth, current_depth)
end
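# Editor's note: prefixing the unused `key` parameter with an underscore keeps each
# clause's arity, guard and pattern intact while silencing the Elixir compiler's
# "variable key is unused" warning.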


@@ -11,7 +11,7 @@ defmodule WandererAppWeb.PresenceGracePeriodManager do
require Logger
# 30 minutes
- @grace_period_ms :timer.minutes(10)
+ @grace_period_ms :timer.minutes(30)
@check_remove_queue_interval :timer.seconds(30)
defstruct pending_removals: %{}, timers: %{}, to_remove: []
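# Editor's note (not part of this diff): with @grace_period_ms raised to 30 minutes, a
# departed presence is presumably only queued for removal once the grace period elapses,
# e.g. via Process.send_after(self(), remove_message, @grace_period_ms), with the timer
# reference kept in the `timers` map above so it can be cancelled if the character
# reconnects in time; the exact message shape is not shown in this diff.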


@@ -40,6 +40,7 @@ defmodule WandererAppWeb.Router do
"https://images.evetech.net",
"https://web.ccpgamescdn.com",
"https://images.ctfassets.net",
"https://wanderer-industries.github.io",
"https://w.appzi.io"
]


@@ -78,7 +78,36 @@ defmodule WandererAppWeb.Telemetry do
summary("vm.memory.total", unit: {:byte, :kilobyte}),
summary("vm.total_run_queue_lengths.total"),
summary("vm.total_run_queue_lengths.cpu"),
summary("vm.total_run_queue_lengths.io")
summary("vm.total_run_queue_lengths.io"),
# Finch Pool Metrics
counter("wanderer_app.finch.pool_exhausted.count",
tags: [:pool, :method],
description: "Count of Finch pool exhaustion errors"
),
counter("wanderer_app.finch.pool_timeout.count",
tags: [:pool, :method],
description: "Count of Finch pool timeout errors"
),
# Character Tracker Pool Metrics
summary("wanderer_app.tracker_pool.location_update.duration",
unit: :millisecond,
tags: [:pool_uuid],
description: "Time taken to update all character locations in a pool"
),
counter("wanderer_app.tracker_pool.location_lag.count",
tags: [:pool_uuid],
description: "Count of location updates falling behind (>800ms)"
),
counter("wanderer_app.tracker_pool.ship_skipped.count",
tags: [:pool_uuid, :reason],
description: "Count of ship updates skipped due to backpressure"
),
counter("wanderer_app.tracker_pool.info_skipped.count",
tags: [:pool_uuid, :reason],
description: "Count of info updates skipped due to backpressure"
)
]
end
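# Editor's sketch (not part of this diff): Telemetry.Metrics counters such as
# "wanderer_app.tracker_pool.ship_skipped.count" above are driven by matching
# :telemetry.execute/3 calls at the instrumented call sites; the last name segment
# ("count") is the measurement and the tagged keys come from the event metadata.
# Illustrative emission (literal values are placeholders):
:telemetry.execute(
  [:wanderer_app, :tracker_pool, :ship_skipped],
  %{count: 1},
  %{pool_uuid: "pool-1", reason: :backpressure}
)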


@@ -15,6 +15,9 @@ case $COMMAND in
deps)
MIX_ENV=dev mix deps.get
;;
+ deploy)
+ MIX_ENV=dev mix assets.deploy
+ ;;
setup)
MIX_ENV=dev mix setup
;;


@@ -3,7 +3,7 @@ defmodule WandererApp.MixProject do
@source_url "https://github.com/wanderer-industries/wanderer"
@version "1.84.16"
@version "1.85.3"
def project do
[


@@ -1,5 +1,5 @@
%{
title: "Greetings, Capsuleers!",
title: "Welcome: Greetings, Capsuleers!",
author: "Wanderer Team",
cover_image_uri: "/assets/hello.webp",
tags: ~w(hello world),


@@ -1,5 +1,5 @@
%{
title: "Introducing Wanderer Community Edition",
title: "Announcement: Wanderer Community Edition",
author: "Wanderer Team",
cover_image_uri: "/images/news/ce_logo_dark.png",
tags: ~w(community-edition open-source),

Some files were not shown because too many files have changed in this diff.