Compare commits

..

7 Commits

Author SHA1 Message Date
dgtlmoon
5971e06091 Test speedup - check janus queues too 2025-09-29 14:57:56 +02:00
dgtlmoon
f4e8d1963f Build - linux/arm64 and linux/arm64/v8 are the same, remove v8
Some checks failed
Build and push containers / metadata (push) Has been cancelled
Build and push containers / build-push-containers (push) Has been cancelled
Publish Python 🐍distribution 📦 to PyPI and TestPyPI / Build distribution 📦 (push) Has been cancelled
ChangeDetection.io Container Build Test / Build linux/amd64 (alpine) (push) Has been cancelled
ChangeDetection.io Container Build Test / Build linux/arm64 (alpine) (push) Has been cancelled
ChangeDetection.io Container Build Test / Build linux/amd64 (main) (push) Has been cancelled
ChangeDetection.io Container Build Test / Build linux/arm/v7 (main) (push) Has been cancelled
ChangeDetection.io Container Build Test / Build linux/arm/v8 (main) (push) Has been cancelled
ChangeDetection.io Container Build Test / Build linux/arm64 (main) (push) Has been cancelled
ChangeDetection.io App Test / lint-code (push) Has been cancelled
Publish Python 🐍distribution 📦 to PyPI and TestPyPI / Test the built 📦 package works basically. (push) Has been cancelled
Publish Python 🐍distribution 📦 to PyPI and TestPyPI / Publish Python 🐍 distribution 📦 to PyPI (push) Has been cancelled
ChangeDetection.io App Test / test-application-3-10 (push) Has been cancelled
ChangeDetection.io App Test / test-application-3-11 (push) Has been cancelled
ChangeDetection.io App Test / test-application-3-12 (push) Has been cancelled
ChangeDetection.io App Test / test-application-3-13 (push) Has been cancelled
2025-09-29 14:12:10 +02:00
dgtlmoon
45d5e961dc Build - Pinning library versions to fix tests 2025-09-29 11:46:37 +02:00
dgtlmoon
45f2863966 Notifications - Upgrade Apprise 1.9.4 (#3443)
Some checks failed
Build and push containers / metadata (push) Has been cancelled
Build and push containers / build-push-containers (push) Has been cancelled
Publish Python 🐍distribution 📦 to PyPI and TestPyPI / Build distribution 📦 (push) Has been cancelled
ChangeDetection.io Container Build Test / Build linux/amd64 (alpine) (push) Has been cancelled
ChangeDetection.io Container Build Test / Build linux/arm64 (alpine) (push) Has been cancelled
ChangeDetection.io Container Build Test / Build linux/amd64 (main) (push) Has been cancelled
ChangeDetection.io Container Build Test / Build linux/arm/v7 (main) (push) Has been cancelled
ChangeDetection.io Container Build Test / Build linux/arm/v8 (main) (push) Has been cancelled
ChangeDetection.io Container Build Test / Build linux/arm64 (main) (push) Has been cancelled
ChangeDetection.io Container Build Test / Build linux/arm64/v8 (main) (push) Has been cancelled
ChangeDetection.io App Test / lint-code (push) Has been cancelled
Publish Python 🐍distribution 📦 to PyPI and TestPyPI / Test the built 📦 package works basically. (push) Has been cancelled
Publish Python 🐍distribution 📦 to PyPI and TestPyPI / Publish Python 🐍 distribution 📦 to PyPI (push) Has been cancelled
ChangeDetection.io App Test / test-application-3-10 (push) Has been cancelled
ChangeDetection.io App Test / test-application-3-11 (push) Has been cancelled
ChangeDetection.io App Test / test-application-3-12 (push) Has been cancelled
ChangeDetection.io App Test / test-application-3-13 (push) Has been cancelled
CodeQL / Analyze (javascript) (push) Has been cancelled
CodeQL / Analyze (python) (push) Has been cancelled
2025-09-23 15:36:40 +02:00
dgtlmoon
01c1ac4c0c Process text/* non-HTML in their original format keeping line breaks, auto-detect attachments/downloads for text or HTML, WARNING - Will trigger false changes for some existing text file watches #3434 (#3435)
Some checks failed
Build and push containers / metadata (push) Has been cancelled
Build and push containers / build-push-containers (push) Has been cancelled
Publish Python 🐍distribution 📦 to PyPI and TestPyPI / Build distribution 📦 (push) Has been cancelled
ChangeDetection.io App Test / lint-code (push) Has been cancelled
Publish Python 🐍distribution 📦 to PyPI and TestPyPI / Test the built 📦 package works basically. (push) Has been cancelled
Publish Python 🐍distribution 📦 to PyPI and TestPyPI / Publish Python 🐍 distribution 📦 to PyPI (push) Has been cancelled
ChangeDetection.io App Test / test-application-3-10 (push) Has been cancelled
ChangeDetection.io App Test / test-application-3-11 (push) Has been cancelled
ChangeDetection.io App Test / test-application-3-12 (push) Has been cancelled
ChangeDetection.io App Test / test-application-3-13 (push) Has been cancelled
2025-09-19 10:42:34 +02:00
dgtlmoon
b2f9aec383 UI - Implementation of unread counter - adding test
Some checks failed
Build and push containers / metadata (push) Has been cancelled
Build and push containers / build-push-containers (push) Has been cancelled
Publish Python 🐍distribution 📦 to PyPI and TestPyPI / Build distribution 📦 (push) Has been cancelled
Publish Python 🐍distribution 📦 to PyPI and TestPyPI / Test the built 📦 package works basically. (push) Has been cancelled
Publish Python 🐍distribution 📦 to PyPI and TestPyPI / Publish Python 🐍 distribution 📦 to PyPI (push) Has been cancelled
ChangeDetection.io App Test / lint-code (push) Has been cancelled
ChangeDetection.io App Test / test-application-3-10 (push) Has been cancelled
ChangeDetection.io App Test / test-application-3-11 (push) Has been cancelled
ChangeDetection.io App Test / test-application-3-12 (push) Has been cancelled
ChangeDetection.io App Test / test-application-3-13 (push) Has been cancelled
2025-09-18 11:53:43 +02:00
dgtlmoon
a95aa67aef UI - Re #3393 #3419 Implementation of unread counter tab along with realtime updates (#3433)
Some checks failed
Build and push containers / metadata (push) Has been cancelled
Build and push containers / build-push-containers (push) Has been cancelled
Publish Python 🐍distribution 📦 to PyPI and TestPyPI / Build distribution 📦 (push) Has been cancelled
Publish Python 🐍distribution 📦 to PyPI and TestPyPI / Test the built 📦 package works basically. (push) Has been cancelled
Publish Python 🐍distribution 📦 to PyPI and TestPyPI / Publish Python 🐍 distribution 📦 to PyPI (push) Has been cancelled
ChangeDetection.io App Test / lint-code (push) Has been cancelled
ChangeDetection.io App Test / test-application-3-10 (push) Has been cancelled
ChangeDetection.io App Test / test-application-3-11 (push) Has been cancelled
ChangeDetection.io App Test / test-application-3-12 (push) Has been cancelled
ChangeDetection.io App Test / test-application-3-13 (push) Has been cancelled
2025-09-18 11:14:26 +02:00
58 changed files with 325 additions and 1930 deletions

View File

@@ -4,11 +4,13 @@ updates:
directory: /
schedule:
interval: "weekly"
"caronc/apprise":
versioning-strategy: "increase"
schedule:
interval: "daily"
groups:
all:
patterns:
- "*"
- package-ecosystem: pip
directory: /
schedule:
interval: "daily"
allow:
- dependency-name: "apprise"

View File

@@ -95,7 +95,7 @@ jobs:
push: true
tags: |
${{ secrets.DOCKER_HUB_USERNAME }}/changedetection.io:dev,ghcr.io/${{ github.repository }}:dev
platforms: linux/amd64,linux/arm64,linux/arm/v7,linux/arm/v8,linux/arm64/v8
platforms: linux/amd64,linux/arm64,linux/arm/v7,linux/arm/v8
cache-from: type=gha
cache-to: type=gha,mode=max
@@ -133,7 +133,7 @@ jobs:
file: ./Dockerfile
push: true
tags: ${{ steps.meta.outputs.tags }}
platforms: linux/amd64,linux/arm64,linux/arm/v7,linux/arm/v8,linux/arm64/v8
platforms: linux/amd64,linux/arm64,linux/arm/v7,linux/arm/v8
cache-from: type=gha
cache-to: type=gha,mode=max
# Looks like this was disabled

View File

@@ -38,8 +38,6 @@ jobs:
dockerfile: ./Dockerfile
- platform: linux/arm/v8
dockerfile: ./Dockerfile
- platform: linux/arm64/v8
dockerfile: ./Dockerfile
# Alpine Dockerfile platforms (musl via alpine check)
- platform: linux/amd64
dockerfile: ./.github/test/Dockerfile-alpine

View File

@@ -71,7 +71,6 @@ jobs:
docker run test-changedetectionio bash -c 'python3 -m unittest changedetectionio.tests.unit.test_watch_model'
docker run test-changedetectionio bash -c 'python3 -m unittest changedetectionio.tests.unit.test_jinja2_security'
docker run test-changedetectionio bash -c 'python3 -m unittest changedetectionio.tests.unit.test_semver'
docker run test-changedetectionio bash -c 'python3 -m unittest changedetectionio.tests.unit.test_browser_notifications'
- name: Test built container with Pytest (generally as requests/plaintext fetching)
run: |

View File

@@ -1 +0,0 @@
# Browser notifications blueprint

View File

@@ -1,76 +0,0 @@
from flask import Blueprint, jsonify, request
from loguru import logger
def construct_blueprint(datastore):
browser_notifications_blueprint = Blueprint('browser_notifications', __name__)
@browser_notifications_blueprint.route("/test", methods=['POST'])
def test_browser_notification():
"""Send a test browser notification using the apprise handler"""
try:
from changedetectionio.notification.apprise_plugin.custom_handlers import apprise_browser_notification_handler
# Check if there are any subscriptions
browser_subscriptions = datastore.data.get('settings', {}).get('application', {}).get('browser_subscriptions', [])
if not browser_subscriptions:
return jsonify({'success': False, 'message': 'No browser subscriptions found'}), 404
# Get notification data from request or use defaults
data = request.get_json() or {}
title = data.get('title', 'Test Notification')
body = data.get('body', 'This is a test notification from changedetection.io')
# Use the apprise handler directly
success = apprise_browser_notification_handler(
body=body,
title=title,
notify_type='info',
meta={'url': 'browser://test'}
)
if success:
subscription_count = len(browser_subscriptions)
return jsonify({
'success': True,
'message': f'Test notification sent successfully to {subscription_count} subscriber(s)'
})
else:
return jsonify({'success': False, 'message': 'Failed to send test notification'}), 500
except ImportError:
logger.error("Browser notification handler not available")
return jsonify({'success': False, 'message': 'Browser notification handler not available'}), 500
except Exception as e:
logger.error(f"Failed to send test browser notification: {e}")
return jsonify({'success': False, 'message': f'Error: {str(e)}'}), 500
@browser_notifications_blueprint.route("/clear", methods=['POST'])
def clear_all_browser_notifications():
"""Clear all browser notification subscriptions from the datastore"""
try:
# Get current subscription count
browser_subscriptions = datastore.data.get('settings', {}).get('application', {}).get('browser_subscriptions', [])
subscription_count = len(browser_subscriptions)
# Clear all subscriptions
if 'settings' not in datastore.data:
datastore.data['settings'] = {}
if 'application' not in datastore.data['settings']:
datastore.data['settings']['application'] = {}
datastore.data['settings']['application']['browser_subscriptions'] = []
datastore.needs_write = True
logger.info(f"Cleared {subscription_count} browser notification subscriptions")
return jsonify({
'success': True,
'message': f'Cleared {subscription_count} browser notification subscription(s)'
})
except Exception as e:
logger.error(f"Failed to clear all browser notifications: {e}")
return jsonify({'success': False, 'message': f'Clear all failed: {str(e)}'}), 500
return browser_notifications_blueprint

View File

@@ -87,7 +87,6 @@ def construct_blueprint(datastore: ChangeDetectionStore, update_q, queuedWatchMe
form=form,
guid=datastore.data['app_guid'],
has_proxies=datastore.proxy_list,
has_unviewed=datastore.has_unviewed,
hosted_sticky=os.getenv("SALTED_PASS", False) == False,
now_time_server=round(time.time()),
pagination=pagination,
@@ -97,6 +96,7 @@ def construct_blueprint(datastore: ChangeDetectionStore, update_q, queuedWatchMe
sort_order=request.args.get('order') if request.args.get('order') else request.cookies.get('order'),
system_default_fetcher=datastore.data['settings']['application'].get('fetch_backend'),
tags=sorted_tags,
unread_changes_count=datastore.unread_changes_count,
watches=sorted_watches
)

View File

@@ -82,8 +82,11 @@ document.addEventListener('DOMContentLoaded', function() {
{%- set cols_required = cols_required + 1 -%}
{%- endif -%}
{%- set ui_settings = datastore.data['settings']['application']['ui'] -%}
<div id="watch-table-wrapper">
{%- set wrapper_classes = [
'has-unread-changes' if unread_changes_count else '',
'has-error' if errored_count else '',
] -%}
<div id="watch-table-wrapper" class="{{ wrapper_classes | reject('equalto', '') | join(' ') }}">
{%- set table_classes = [
'favicon-enabled' if 'favicons_enabled' not in ui_settings or ui_settings['favicons_enabled'] else 'favicon-not-enabled',
] -%}
@@ -241,10 +244,10 @@ document.addEventListener('DOMContentLoaded', function() {
</tbody>
</table>
<ul id="post-list-buttons">
<li id="post-list-with-errors" class="{%- if errored_count -%}has-error{%- endif -%}" style="display: none;" >
<li id="post-list-with-errors" style="display: none;" >
<a href="{{url_for('watchlist.index', with_errors=1, tag=request.args.get('tag')) }}" class="pure-button button-tag button-error">With errors ({{ errored_count }})</a>
</li>
<li id="post-list-mark-views" class="{%- if has_unviewed -%}has-unviewed{%- endif -%}" style="display: none;" >
<li id="post-list-mark-views" style="display: none;" >
<a href="{{url_for('ui.mark_all_viewed',with_errors=request.args.get('with_errors',0)) }}" class="pure-button button-tag " id="mark-all-viewed">Mark all viewed</a>
</li>
{%- if active_tag_uuid -%}
@@ -252,8 +255,8 @@ document.addEventListener('DOMContentLoaded', function() {
<a href="{{url_for('ui.mark_all_viewed', tag=active_tag_uuid) }}" class="pure-button button-tag " id="mark-all-viewed">Mark all viewed in '{{active_tag.title}}'</a>
</li>
{%- endif -%}
<li id="post-list-unread" class="{%- if has_unviewed -%}has-unviewed{%- endif -%}" style="display: none;" >
<a href="{{url_for('watchlist.index', unread=1, tag=request.args.get('tag')) }}" class="pure-button button-tag">Unread</a>
<li id="post-list-unread" style="display: none;" >
<a href="{{url_for('watchlist.index', unread=1, tag=request.args.get('tag')) }}" class="pure-button button-tag">Unread (<span id="unread-tab-counter">{{ unread_changes_count }}</span>)</a>
</li>
<li>
<a href="{{ url_for('ui.form_watch_checknow', tag=active_tag_uuid, with_errors=request.args.get('with_errors',0)) }}" class="pure-button button-tag" id="recheck-all">Recheck

View File

@@ -39,11 +39,6 @@ from loguru import logger
from changedetectionio import __version__
from changedetectionio import queuedWatchMetaData
from changedetectionio.api import Watch, WatchHistory, WatchSingleHistory, CreateWatch, Import, SystemInfo, Tag, Tags, Notifications, WatchFavicon
from changedetectionio.notification.BrowserNotifications import (
BrowserNotificationsVapidPublicKey,
BrowserNotificationsSubscribe,
BrowserNotificationsUnsubscribe
)
from changedetectionio.api.Search import Search
from .time_handler import is_within_schedule
@@ -99,7 +94,6 @@ except locale.Error:
logger.warning(f"Unable to set locale {default_locale}, locale is not installed maybe?")
watch_api = Api(app, decorators=[csrf.exempt])
browser_notification_api = Api(app, decorators=[csrf.exempt])
def init_app_secret(datastore_path):
secret = ""
@@ -342,11 +336,6 @@ def changedetection_app(config=None, datastore_o=None):
watch_api.add_resource(Notifications, '/api/v1/notifications',
resource_class_kwargs={'datastore': datastore})
# Browser notification endpoints
browser_notification_api.add_resource(BrowserNotificationsVapidPublicKey, '/browser-notifications-api/vapid-public-key')
browser_notification_api.add_resource(BrowserNotificationsSubscribe, '/browser-notifications-api/subscribe')
browser_notification_api.add_resource(BrowserNotificationsUnsubscribe, '/browser-notifications-api/unsubscribe')
@login_manager.user_loader
def user_loader(email):
@@ -500,29 +489,10 @@ def changedetection_app(config=None, datastore_o=None):
except FileNotFoundError:
abort(404)
@app.route("/service-worker.js", methods=['GET'])
def service_worker():
from flask import make_response
try:
# Serve from the changedetectionio/static/js directory
static_js_path = os.path.join(os.path.dirname(__file__), 'static', 'js')
response = make_response(send_from_directory(static_js_path, "service-worker.js"))
response.headers['Content-Type'] = 'application/javascript'
response.headers['Service-Worker-Allowed'] = '/'
response.headers['Cache-Control'] = 'no-cache, no-store, must-revalidate'
response.headers['Pragma'] = 'no-cache'
response.headers['Expires'] = '0'
return response
except FileNotFoundError:
abort(404)
import changedetectionio.blueprint.browser_steps as browser_steps
app.register_blueprint(browser_steps.construct_blueprint(datastore), url_prefix='/browser-steps')
import changedetectionio.blueprint.browser_notifications.browser_notifications as browser_notifications
app.register_blueprint(browser_notifications.construct_blueprint(datastore), url_prefix='/browser-notifications')
from changedetectionio.blueprint.imports import construct_blueprint as construct_import_blueprint
app.register_blueprint(construct_import_blueprint(datastore, update_q, queuedWatchMetaData), url_prefix='/imports')

View File

@@ -707,7 +707,6 @@ class commonSettingsForm(Form):
processor = RadioField( label=u"Processor - What do you want to achieve?", choices=processors.available_processors(), default="text_json_diff")
timezone = StringField("Timezone for watch schedule", render_kw={"list": "timezones"}, validators=[validateTimeZoneName()])
webdriver_delay = IntegerField('Wait seconds before extracting text', validators=[validators.Optional(), validators.NumberRange(min=1, message="Should contain one or more seconds")])
class importForm(Form):

View File

@@ -66,11 +66,6 @@ class model(dict):
'socket_io_enabled': True,
'favicons_enabled': True
},
'vapid': {
'private_key': None,
'public_key': None,
'contact_email': None
},
}
}
}

View File

@@ -1,217 +0,0 @@
import json
from flask import request, current_app
from flask_restful import Resource, marshal_with, fields
from loguru import logger
browser_notifications_fields = {
'success': fields.Boolean,
'message': fields.String,
}
vapid_public_key_fields = {
'publicKey': fields.String,
}
test_notification_fields = {
'success': fields.Boolean,
'message': fields.String,
'sent_count': fields.Integer,
}
class BrowserNotificationsVapidPublicKey(Resource):
"""Get VAPID public key for browser push notifications"""
@marshal_with(vapid_public_key_fields)
def get(self):
try:
from changedetectionio.notification.apprise_plugin.browser_notification_helpers import (
get_vapid_config_from_datastore, convert_pem_public_key_for_browser
)
datastore = current_app.config.get('DATASTORE')
if not datastore:
return {'publicKey': None}, 500
private_key, public_key_pem, contact_email = get_vapid_config_from_datastore(datastore)
if not public_key_pem:
return {'publicKey': None}, 404
# Convert PEM format to URL-safe base64 format for browser
public_key_b64 = convert_pem_public_key_for_browser(public_key_pem)
if public_key_b64:
return {'publicKey': public_key_b64}
else:
return {'publicKey': None}, 500
except Exception as e:
logger.error(f"Failed to get VAPID public key: {e}")
return {'publicKey': None}, 500
class BrowserNotificationsSubscribe(Resource):
"""Subscribe to browser notifications"""
@marshal_with(browser_notifications_fields)
def post(self):
try:
data = request.get_json()
if not data:
return {'success': False, 'message': 'No data provided'}, 400
subscription = data.get('subscription')
if not subscription:
return {'success': False, 'message': 'Subscription is required'}, 400
# Validate subscription format
required_fields = ['endpoint', 'keys']
for field in required_fields:
if field not in subscription:
return {'success': False, 'message': f'Missing subscription field: {field}'}, 400
if 'p256dh' not in subscription['keys'] or 'auth' not in subscription['keys']:
return {'success': False, 'message': 'Missing subscription keys'}, 400
# Get datastore
datastore = current_app.config.get('DATASTORE')
if not datastore:
return {'success': False, 'message': 'Datastore not available'}, 500
# Initialize browser_subscriptions if it doesn't exist
if 'browser_subscriptions' not in datastore.data['settings']['application']:
datastore.data['settings']['application']['browser_subscriptions'] = []
# Check if subscription already exists
existing_subscriptions = datastore.data['settings']['application']['browser_subscriptions']
for existing_sub in existing_subscriptions:
if existing_sub.get('endpoint') == subscription.get('endpoint'):
return {'success': True, 'message': 'Already subscribed to browser notifications'}
# Add new subscription
datastore.data['settings']['application']['browser_subscriptions'].append(subscription)
datastore.needs_write = True
logger.info(f"New browser notification subscription: {subscription.get('endpoint')}")
return {'success': True, 'message': 'Successfully subscribed to browser notifications'}
except Exception as e:
logger.error(f"Failed to subscribe to browser notifications: {e}")
return {'success': False, 'message': f'Subscription failed: {str(e)}'}, 500
class BrowserNotificationsUnsubscribe(Resource):
"""Unsubscribe from browser notifications"""
@marshal_with(browser_notifications_fields)
def post(self):
try:
data = request.get_json()
if not data:
return {'success': False, 'message': 'No data provided'}, 400
subscription = data.get('subscription')
if not subscription or not subscription.get('endpoint'):
return {'success': False, 'message': 'Valid subscription is required'}, 400
# Get datastore
datastore = current_app.config.get('DATASTORE')
if not datastore:
return {'success': False, 'message': 'Datastore not available'}, 500
# Check if subscriptions exist
browser_subscriptions = datastore.data.get('settings', {}).get('application', {}).get('browser_subscriptions', [])
if not browser_subscriptions:
return {'success': True, 'message': 'No subscriptions found'}
# Remove subscription with matching endpoint
endpoint = subscription.get('endpoint')
original_count = len(browser_subscriptions)
datastore.data['settings']['application']['browser_subscriptions'] = [
sub for sub in browser_subscriptions
if sub.get('endpoint') != endpoint
]
removed_count = original_count - len(datastore.data['settings']['application']['browser_subscriptions'])
if removed_count > 0:
datastore.needs_write = True
logger.info(f"Removed {removed_count} browser notification subscription(s)")
return {'success': True, 'message': 'Successfully unsubscribed from browser notifications'}
else:
return {'success': True, 'message': 'No matching subscription found'}
except Exception as e:
logger.error(f"Failed to unsubscribe from browser notifications: {e}")
return {'success': False, 'message': f'Unsubscribe failed: {str(e)}'}, 500
class BrowserNotificationsTest(Resource):
"""Send a test browser notification"""
@marshal_with(test_notification_fields)
def post(self):
try:
data = request.get_json()
if not data:
return {'success': False, 'message': 'No data provided', 'sent_count': 0}, 400
title = data.get('title', 'Test Notification')
body = data.get('body', 'This is a test notification from changedetection.io')
# Get datastore to check if subscriptions exist
datastore = current_app.config.get('DATASTORE')
if not datastore:
return {'success': False, 'message': 'Datastore not available', 'sent_count': 0}, 500
# Check if there are subscriptions before attempting to send
browser_subscriptions = datastore.data.get('settings', {}).get('application', {}).get('browser_subscriptions', [])
if not browser_subscriptions:
return {'success': False, 'message': 'No subscriptions found', 'sent_count': 0}, 404
# Use the apprise handler directly
try:
from changedetectionio.notification.apprise_plugin.custom_handlers import apprise_browser_notification_handler
# Call the apprise handler with test data
success = apprise_browser_notification_handler(
body=body,
title=title,
notify_type='info',
meta={'url': 'browser://test'}
)
# Count how many subscriptions we have after sending (some may have been removed if invalid)
final_subscriptions = datastore.data.get('settings', {}).get('application', {}).get('browser_subscriptions', [])
sent_count = len(browser_subscriptions) # Original count
if success:
return {
'success': True,
'message': f'Test notification sent successfully to {sent_count} subscriber(s)',
'sent_count': sent_count
}
else:
return {
'success': False,
'message': 'Failed to send test notification',
'sent_count': 0
}, 500
except ImportError:
return {'success': False, 'message': 'Browser notification handler not available', 'sent_count': 0}, 500
except Exception as e:
logger.error(f"Failed to send test browser notification: {e}")
return {'success': False, 'message': f'Test failed: {str(e)}', 'sent_count': 0}, 500

View File

@@ -1,273 +0,0 @@
"""
Browser notification helpers for Web Push API
Shared utility functions for VAPID key handling and notification sending
"""
import json
import re
import time
from loguru import logger
def convert_pem_private_key_for_pywebpush(private_key):
"""
Convert PEM private key to the format that pywebpush expects
Args:
private_key: PEM private key string or already converted key
Returns:
Vapid instance for pywebpush (avoids PEM parsing compatibility issues)
"""
try:
from py_vapid import Vapid
import tempfile
import os
# If we get a string, assume it's PEM and create a Vapid instance from it
if isinstance(private_key, str) and private_key.startswith('-----BEGIN'):
# Write PEM to temporary file and load with Vapid.from_file
with tempfile.NamedTemporaryFile(mode='w', suffix='.pem', delete=False) as tmp_file:
tmp_file.write(private_key)
tmp_file.flush()
temp_path = tmp_file.name
try:
# Load using Vapid.from_file - this is more compatible with pywebpush
vapid_instance = Vapid.from_file(temp_path)
os.unlink(temp_path) # Clean up
logger.debug("Successfully created Vapid instance from PEM")
return vapid_instance
except Exception as e:
os.unlink(temp_path) # Clean up even on error
logger.error(f"Failed to create Vapid instance from PEM: {e}")
# Fall back to returning the original PEM string
return private_key
else:
# Return as-is if not a PEM string
return private_key
except Exception as e:
logger.error(f"Failed to convert private key: {e}")
return private_key
def convert_pem_public_key_for_browser(public_key_pem):
"""
Convert PEM public key to URL-safe base64 format for browser applicationServerKey
Args:
public_key_pem: PEM public key string
Returns:
URL-safe base64 encoded public key without padding
"""
try:
from cryptography.hazmat.primitives import serialization
import base64
# Parse PEM directly using cryptography library
pem_bytes = public_key_pem.encode() if isinstance(public_key_pem, str) else public_key_pem
# Load the public key from PEM
public_key_crypto = serialization.load_pem_public_key(pem_bytes)
# Get the raw public key bytes in uncompressed format (what browsers expect)
public_key_raw = public_key_crypto.public_bytes(
encoding=serialization.Encoding.X962,
format=serialization.PublicFormat.UncompressedPoint
)
# Convert to URL-safe base64 (remove padding)
public_key_b64 = base64.urlsafe_b64encode(public_key_raw).decode('ascii').rstrip('=')
return public_key_b64
except Exception as e:
logger.error(f"Failed to convert public key format: {e}")
return None
def send_push_notifications(subscriptions, notification_payload, private_key, contact_email, datastore):
"""
Send push notifications to a list of subscriptions
Args:
subscriptions: List of push subscriptions
notification_payload: Dict with notification data (title, body, etc.)
private_key: VAPID private key (will be converted if needed)
contact_email: Contact email for VAPID claims
datastore: Datastore object for updating subscriptions
Returns:
Tuple of (success_count, total_count)
"""
try:
from pywebpush import webpush, WebPushException
except ImportError:
logger.error("pywebpush not available - cannot send browser notifications")
return 0, len(subscriptions)
# Convert private key to format pywebpush expects
private_key_for_push = convert_pem_private_key_for_pywebpush(private_key)
success_count = 0
total_count = len(subscriptions)
# Send to all subscriptions
for subscription in subscriptions[:]: # Copy list to avoid modification issues
try:
webpush(
subscription_info=subscription,
data=json.dumps(notification_payload),
vapid_private_key=private_key_for_push,
vapid_claims={
"sub": f"mailto:{contact_email}",
"aud": f"https://{subscription['endpoint'].split('/')[2]}"
}
)
success_count += 1
except WebPushException as e:
logger.warning(f"Failed to send browser notification to subscription: {e}")
# Remove invalid subscriptions (410 = Gone, 404 = Not Found)
if e.response and e.response.status_code in [404, 410]:
logger.info("Removing invalid browser notification subscription")
try:
subscriptions.remove(subscription)
datastore.needs_write = True
except ValueError:
pass # Already removed
except Exception as e:
logger.error(f"Unexpected error sending browser notification: {e}")
return success_count, total_count
def create_notification_payload(title, body, icon_path=None):
"""
Create a standard notification payload
Args:
title: Notification title
body: Notification body
icon_path: Optional icon path (defaults to favicon)
Returns:
Dict with notification payload
"""
return {
'title': title,
'body': body,
'icon': icon_path or '/static/favicons/favicon-32x32.png',
'badge': '/static/favicons/favicon-32x32.png',
'timestamp': int(time.time() * 1000),
}
def get_vapid_config_from_datastore(datastore):
"""
Get VAPID configuration from datastore with proper error handling
Args:
datastore: Datastore object
Returns:
Tuple of (private_key, public_key, contact_email) or (None, None, None) if error
"""
try:
if not datastore:
return None, None, None
vapid_config = datastore.data.get('settings', {}).get('application', {}).get('vapid', {})
private_key = vapid_config.get('private_key')
public_key = vapid_config.get('public_key')
contact_email = vapid_config.get('contact_email', 'citizen@example.com')
return private_key, public_key, contact_email
except Exception as e:
logger.error(f"Failed to get VAPID config from datastore: {e}")
return None, None, None
def get_browser_subscriptions(datastore):
"""
Get browser subscriptions from datastore
Args:
datastore: Datastore object
Returns:
List of subscriptions
"""
try:
if not datastore:
return []
return datastore.data.get('settings', {}).get('application', {}).get('browser_subscriptions', [])
except Exception as e:
logger.error(f"Failed to get browser subscriptions: {e}")
return []
def save_browser_subscriptions(datastore, subscriptions):
"""
Save browser subscriptions to datastore
Args:
datastore: Datastore object
subscriptions: List of subscriptions to save
"""
try:
if not datastore:
return
# Ensure the settings structure exists
if 'settings' not in datastore.data:
datastore.data['settings'] = {}
if 'application' not in datastore.data['settings']:
datastore.data['settings']['application'] = {}
datastore.data['settings']['application']['browser_subscriptions'] = subscriptions
datastore.needs_write = True
except Exception as e:
logger.error(f"Failed to save browser subscriptions: {e}")
def create_error_response(message, sent_count=0, status_code=500):
"""
Create standardized error response for API endpoints
Args:
message: Error message
sent_count: Number of notifications sent (for test endpoints)
status_code: HTTP status code
Returns:
Tuple of (response_dict, status_code)
"""
return {'success': False, 'message': message, 'sent_count': sent_count}, status_code
def create_success_response(message, sent_count=None):
"""
Create standardized success response for API endpoints
Args:
message: Success message
sent_count: Number of notifications sent (optional)
Returns:
Response dict
"""
response = {'success': True, 'message': message}
if sent_count is not None:
response['sent_count'] = sent_count
return response

View File

@@ -1,6 +1,5 @@
import json
import re
import time
from urllib.parse import unquote_plus
import requests
@@ -111,80 +110,3 @@ def apprise_http_custom_handler(
except Exception as e:
logger.error(f"Unexpected error occurred while sending custom notification to {url}: {e}")
return False
@notify(on="browser")
def apprise_browser_notification_handler(
body: str,
title: str,
notify_type: str,
meta: dict,
*args,
**kwargs,
) -> bool:
"""
Browser push notification handler for browser:// URLs
Ignores anything after browser:// and uses single default channel
"""
try:
from pywebpush import webpush, WebPushException
from flask import current_app
# Get VAPID keys from app settings
try:
datastore = current_app.config.get('DATASTORE')
if not datastore:
logger.error("No datastore available for browser notifications")
return False
vapid_config = datastore.data.get('settings', {}).get('application', {}).get('vapid', {})
private_key = vapid_config.get('private_key')
public_key = vapid_config.get('public_key')
contact_email = vapid_config.get('contact_email', 'admin@changedetection.io')
if not private_key or not public_key:
logger.error("VAPID keys not configured for browser notifications")
return False
except Exception as e:
logger.error(f"Failed to get VAPID configuration: {e}")
return False
# Get subscriptions from datastore
browser_subscriptions = datastore.data.get('settings', {}).get('application', {}).get('browser_subscriptions', [])
if not browser_subscriptions:
logger.info("No browser subscriptions found")
return True # Not an error - just no subscribers
# Import helper functions
try:
from .browser_notification_helpers import create_notification_payload, send_push_notifications
except ImportError:
logger.error("Browser notification helpers not available")
return False
# Prepare notification payload
notification_payload = create_notification_payload(title, body)
# Send notifications using shared helper
success_count, total_count = send_push_notifications(
subscriptions=browser_subscriptions,
notification_payload=notification_payload,
private_key=private_key,
contact_email=contact_email,
datastore=datastore
)
# Update datastore with cleaned subscriptions
datastore.data['settings']['application']['browser_subscriptions'] = browser_subscriptions
logger.info(f"Sent browser notifications: {success_count}/{total_count} successful")
return success_count > 0
except ImportError:
logger.error("pywebpush not available - cannot send browser notifications")
return False
except Exception as e:
logger.error(f"Unexpected error in browser notification handler: {e}")
return False

View File

@@ -8,7 +8,7 @@ def process_notification(n_object, datastore):
from changedetectionio.safe_jinja import render as jinja_render
from . import default_notification_format_for_watch, default_notification_format, valid_notification_formats
# be sure its registered
from .apprise_plugin.custom_handlers import apprise_http_custom_handler, apprise_browser_notification_handler
from .apprise_plugin.custom_handlers import apprise_http_custom_handler
now = time.time()
if n_object.get('notification_timestamp'):

View File

@@ -153,12 +153,26 @@ class perform_site_check(difference_detection_processor):
# CSS Filter, extract the HTML that matches and feed that into the existing inscriptis::get_text
self.fetcher.content = html_tools.workarounds_for_obfuscations(self.fetcher.content)
html_content = self.fetcher.content
content_type = self.fetcher.get_all_headers().get('content-type', '').lower()
is_attachment = 'attachment' in self.fetcher.get_all_headers().get('content-disposition', '').lower()
# If not JSON, and if it's not text/plain..
if 'text/plain' in self.fetcher.get_all_headers().get('content-type', '').lower():
# Try to detect better mime types if its a download or not announced as HTML
if is_attachment or 'octet-stream' in content_type or not 'html' in content_type:
logger.debug(f"Got a reply that may be a download or possibly a text attachment, checking..")
try:
import magic
mime = magic.from_buffer(html_content, mime=True)
logger.debug(f"Guessing mime type, original content_type '{content_type}', mime type detected '{mime}'")
if mime and "/" in mime: # looks valid and is a valid mime type
content_type = mime
except Exception as e:
logger.error(f"Error getting a more precise mime type from 'magic' library ({str(e)}")
if 'text/' in content_type and not 'html' in content_type:
# Don't run get_text or xpath/css filters on plaintext
stripped_text_from_html = html_content
else:
# If not JSON, and if it's not text/plain..
# Does it have some ld+json price data? used for easier monitoring
update_obj['has_ldjson_price_data'] = html_tools.has_ldjson_product_info(self.fetcher.content)
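
A minimal sketch of the content-type fallback this hunk introduces, assuming the python-magic package shown in the diff; guess_content_type is an illustrative name, not a helper from the project:

import magic

def guess_content_type(body, headers: dict) -> str:
    # Prefer the server-announced Content-Type, but sniff the body when the
    # reply looks like a download or is not announced as HTML.
    content_type = headers.get('content-type', '').lower()
    is_attachment = 'attachment' in headers.get('content-disposition', '').lower()
    if is_attachment or 'octet-stream' in content_type or 'html' not in content_type:
        try:
            mime = magic.from_buffer(body, mime=True)  # e.g. 'text/plain'
            if mime and '/' in mime:  # looks like a valid mime type
                content_type = mime
        except Exception:
            pass  # keep the header value if sniffing fails
    return content_type

# text/* (non-HTML) responses are then kept verbatim, preserving line breaks,
# instead of being run through the HTML-to-text pipeline.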

View File

@@ -243,14 +243,15 @@ def handle_watch_update(socketio, **kwargs):
general_stats = {
'count_errors': errored_count,
'has_unviewed': datastore.has_unviewed
'unread_changes_count': datastore.unread_changes_count
}
# Debug what's being emitted
# logger.debug(f"Emitting 'watch_update' event for {watch.get('uuid')}, data: {watch_data}")
# Emit to all clients (no 'broadcast' parameter needed - it's the default behavior)
socketio.emit("watch_update", {'watch': watch_data, 'general_stats': general_stats})
socketio.emit("watch_update", {'watch': watch_data})
socketio.emit("general_stats_update", general_stats)
# Log after successful emit - use watch_data['uuid'] to avoid variable shadowing issues
logger.trace(f"Socket.IO: Emitted update for watch {watch_data['uuid']}, Checking now: {watch_data['checking_now']}")
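
Because the compare view interleaves the removed and added lines of the stats dict above, here is a condensed sketch of what the handler emits after this change (the surrounding handler, errored_count, watch_data and the Flask-SocketIO socketio object are assumed from the file):

general_stats = {
    'count_errors': errored_count,
    'unread_changes_count': datastore.unread_changes_count,
}
socketio.emit("watch_update", {'watch': watch_data})   # per-watch row update only
socketio.emit("general_stats_update", general_stats)   # shared counters for the list footer tabs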

View File

@@ -9,7 +9,7 @@ set -x
# SOCKS5 related - start simple Socks5 proxy server
# SOCKSTEST=xyz should show in the logs of this service to confirm it fetched
docker run --network changedet-network -d --hostname socks5proxy --rm --name socks5proxy -p 1080:1080 -e PROXY_USER=proxy_user123 -e PROXY_PASSWORD=proxy_pass123 serjs/go-socks5-proxy
docker run --network changedet-network -d --hostname socks5proxy-noauth --rm -p 1081:1080 --name socks5proxy-noauth serjs/go-socks5-proxy
docker run --network changedet-network -d --hostname socks5proxy-noauth --rm -p 1081:1080 --name socks5proxy-noauth -e REQUIRE_AUTH=false serjs/go-socks5-proxy
echo "---------------------------------- SOCKS5 -------------------"
# SOCKS5 related - test from proxies.json

View File

@@ -1,6 +1,6 @@
{
"name": "changedetection.io",
"short_name": "changedetection",
"name": "",
"short_name": "",
"icons": [
{
"src": "android-chrome-192x192.png",
@@ -15,8 +15,5 @@
],
"theme_color": "#ffffff",
"background_color": "#ffffff",
"display": "standalone",
"start_url": "/",
"scope": "/",
"gcm_sender_id": "103953800507"
"display": "standalone"
}

View File

@@ -1,450 +0,0 @@
/**
* changedetection.io Browser Push Notifications
* Handles service worker registration, push subscription management, and notification permissions
*/
class BrowserNotifications {
constructor() {
this.serviceWorkerRegistration = null;
this.vapidPublicKey = null;
this.isSubscribed = false;
this.init();
}
async init() {
if (!this.isSupported()) {
console.warn('Push notifications are not supported in this browser');
return;
}
try {
// Get VAPID public key from server
await this.fetchVapidPublicKey();
// Register service worker
await this.registerServiceWorker();
// Check existing subscription state
await this.checkExistingSubscription();
// Initialize UI elements
this.initializeUI();
// Set up notification URL monitoring
this.setupNotificationUrlMonitoring();
} catch (error) {
console.error('Failed to initialize browser notifications:', error);
}
}
isSupported() {
return 'serviceWorker' in navigator &&
'PushManager' in window &&
'Notification' in window;
}
async fetchVapidPublicKey() {
try {
const response = await fetch('/browser-notifications-api/vapid-public-key');
if (!response.ok) {
throw new Error(`HTTP ${response.status}: ${response.statusText}`);
}
const data = await response.json();
this.vapidPublicKey = data.publicKey;
} catch (error) {
console.error('Failed to fetch VAPID public key:', error);
throw error;
}
}
async registerServiceWorker() {
try {
this.serviceWorkerRegistration = await navigator.serviceWorker.register('/service-worker.js', {
scope: '/'
});
console.log('Service Worker registered successfully');
// Wait for service worker to be ready
await navigator.serviceWorker.ready;
} catch (error) {
console.error('Service Worker registration failed:', error);
throw error;
}
}
initializeUI() {
// Bind event handlers to existing elements in the template
this.bindEventHandlers();
// Update UI based on current permission state
this.updatePermissionStatus();
}
bindEventHandlers() {
const enableBtn = document.querySelector('#enable-notifications-btn');
const testBtn = document.querySelector('#test-notification-btn');
if (enableBtn) {
enableBtn.addEventListener('click', () => this.requestNotificationPermission());
}
if (testBtn) {
testBtn.addEventListener('click', () => this.sendTestNotification());
}
}
setupNotificationUrlMonitoring() {
// Monitor the notification URLs textarea for browser:// URLs
const notificationUrlsField = document.querySelector('textarea[name*="notification_urls"]');
if (notificationUrlsField) {
const checkForBrowserUrls = async () => {
const urls = notificationUrlsField.value || '';
const hasBrowserUrls = /browser:\/\//.test(urls);
// If browser URLs are detected and we're not subscribed, auto-subscribe
if (hasBrowserUrls && !this.isSubscribed && Notification.permission === 'default') {
const shouldSubscribe = confirm('Browser notifications detected! Would you like to enable browser notifications now?');
if (shouldSubscribe) {
await this.requestNotificationPermission();
}
} else if (hasBrowserUrls && !this.isSubscribed && Notification.permission === 'granted') {
// Permission already granted but not subscribed - auto-subscribe silently
console.log('Auto-subscribing to browser notifications...');
await this.subscribe();
}
};
// Check immediately
checkForBrowserUrls();
// Check on input changes
notificationUrlsField.addEventListener('input', checkForBrowserUrls);
}
}
async updatePermissionStatus() {
const statusElement = document.querySelector('#permission-status');
const enableBtn = document.querySelector('#enable-notifications-btn');
const testBtn = document.querySelector('#test-notification-btn');
if (!statusElement) return;
const permission = Notification.permission;
statusElement.textContent = permission;
statusElement.className = `permission-${permission}`;
// Show/hide controls based on permission
if (permission === 'default') {
if (enableBtn) enableBtn.style.display = 'inline-block';
if (testBtn) testBtn.style.display = 'none';
} else if (permission === 'granted') {
if (enableBtn) enableBtn.style.display = 'none';
if (testBtn) testBtn.style.display = 'inline-block';
} else { // denied
if (enableBtn) enableBtn.style.display = 'none';
if (testBtn) testBtn.style.display = 'none';
}
}
async requestNotificationPermission() {
try {
const permission = await Notification.requestPermission();
this.updatePermissionStatus();
if (permission === 'granted') {
console.log('Notification permission granted');
// Automatically subscribe to browser notifications
this.subscribe();
} else {
console.log('Notification permission denied');
}
} catch (error) {
console.error('Error requesting notification permission:', error);
}
}
async subscribe() {
if (Notification.permission !== 'granted') {
alert('Please enable notifications first');
return;
}
if (this.isSubscribed) {
console.log('Already subscribed to browser notifications');
return;
}
try {
// First, try to clear any existing subscription with different keys
await this.clearExistingSubscription();
// Create push subscription
const subscription = await this.serviceWorkerRegistration.pushManager.subscribe({
userVisibleOnly: true,
applicationServerKey: this.urlBase64ToUint8Array(this.vapidPublicKey)
});
// Send subscription to server
const response = await fetch('/browser-notifications-api/subscribe', {
method: 'POST',
headers: {
'Content-Type': 'application/json',
'X-CSRFToken': document.querySelector('input[name=csrf_token]')?.value
},
body: JSON.stringify({
subscription: subscription.toJSON()
})
});
if (!response.ok) {
throw new Error(`HTTP ${response.status}: ${response.statusText}`);
}
// Store subscription status
this.isSubscribed = true;
console.log('Successfully subscribed to browser notifications');
} catch (error) {
console.error('Failed to subscribe to browser notifications:', error);
// Show user-friendly error message
if (error.message.includes('different applicationServerKey')) {
this.showSubscriptionConflictDialog(error);
} else {
alert(`Failed to subscribe: ${error.message}`);
}
}
}
async unsubscribe() {
try {
if (!this.isSubscribed) return;
// Get current subscription
const subscription = await this.serviceWorkerRegistration.pushManager.getSubscription();
if (!subscription) {
this.isSubscribed = false;
return;
}
// Unsubscribe from server
const response = await fetch('/browser-notifications-api/unsubscribe', {
method: 'POST',
headers: {
'Content-Type': 'application/json',
'X-CSRFToken': document.querySelector('input[name=csrf_token]')?.value
},
body: JSON.stringify({
subscription: subscription.toJSON()
})
});
if (!response.ok) {
console.warn(`Server unsubscribe failed: ${response.status}`);
}
// Unsubscribe locally
await subscription.unsubscribe();
// Update status
this.isSubscribed = false;
console.log('Unsubscribed from browser notifications');
} catch (error) {
console.error('Failed to unsubscribe from browser notifications:', error);
}
}
async sendTestNotification() {
try {
// First, check if we're subscribed
if (!this.isSubscribed) {
const shouldSubscribe = confirm('You need to subscribe to browser notifications first. Subscribe now?');
if (shouldSubscribe) {
await this.subscribe();
// Give a moment for subscription to complete
await new Promise(resolve => setTimeout(resolve, 1000));
} else {
return;
}
}
const response = await fetch('/browser-notifications/test', {
method: 'POST',
headers: {
'Content-Type': 'application/json',
'X-CSRFToken': document.querySelector('input[name=csrf_token]')?.value
}
});
if (!response.ok) {
if (response.status === 404) {
// No subscriptions found on server - try subscribing
alert('No browser subscriptions found. Subscribing now...');
await this.subscribe();
return;
}
throw new Error(`HTTP ${response.status}: ${response.statusText}`);
}
const result = await response.json();
alert(result.message);
console.log('Test notification result:', result);
} catch (error) {
console.error('Failed to send test notification:', error);
alert(`Failed to send test notification: ${error.message}`);
}
}
urlBase64ToUint8Array(base64String) {
const padding = '='.repeat((4 - base64String.length % 4) % 4);
const base64 = (base64String + padding)
.replace(/-/g, '+')
.replace(/_/g, '/');
const rawData = window.atob(base64);
const outputArray = new Uint8Array(rawData.length);
for (let i = 0; i < rawData.length; ++i) {
outputArray[i] = rawData.charCodeAt(i);
}
return outputArray;
}
async checkExistingSubscription() {
/**
* Check if we already have a valid browser subscription
* Updates this.isSubscribed based on actual browser state
*/
try {
if (!this.serviceWorkerRegistration) {
this.isSubscribed = false;
return;
}
const existingSubscription = await this.serviceWorkerRegistration.pushManager.getSubscription();
if (existingSubscription) {
// We have a subscription - verify it's still valid and matches our VAPID key
const subscriptionJson = existingSubscription.toJSON();
// Check if the endpoint is still active (basic validation)
if (subscriptionJson.endpoint && subscriptionJson.keys) {
console.log('Found existing valid subscription');
this.isSubscribed = true;
} else {
console.log('Found invalid subscription, clearing...');
await existingSubscription.unsubscribe();
this.isSubscribed = false;
}
} else {
console.log('No existing subscription found');
this.isSubscribed = false;
}
} catch (error) {
console.warn('Failed to check existing subscription:', error);
this.isSubscribed = false;
}
}
async clearExistingSubscription() {
/**
* Clear any existing push subscription that might conflict with our VAPID keys
*/
try {
const existingSubscription = await this.serviceWorkerRegistration.pushManager.getSubscription();
if (existingSubscription) {
console.log('Found existing subscription, unsubscribing...');
await existingSubscription.unsubscribe();
console.log('Successfully cleared existing subscription');
}
} catch (error) {
console.warn('Failed to clear existing subscription:', error);
// Don't throw - this is just cleanup
}
}
showSubscriptionConflictDialog(error) {
/**
* Show user-friendly dialog for subscription conflicts
*/
const message = `Browser notifications are already set up for a different changedetection.io instance or with different settings.
To fix this:
1. Clear your existing subscription
2. Try subscribing again
Would you like to automatically clear the old subscription and retry?`;
if (confirm(message)) {
this.clearExistingSubscription().then(() => {
// Retry subscription after clearing
setTimeout(() => {
this.subscribe();
}, 500);
});
} else {
alert('To use browser notifications, please manually clear your browser notifications for this site in browser settings, then try again.');
}
}
async clearAllNotifications() {
/**
* Clear all browser notification subscriptions (admin function)
*/
try {
// Call the server to clear ALL subscriptions from datastore
const response = await fetch('/browser-notifications/clear', {
method: 'POST',
headers: {
'Content-Type': 'application/json',
'X-CSRFToken': document.querySelector('input[name=csrf_token]')?.value
}
});
if (response.ok) {
const result = await response.json();
console.log('Server response:', result.message);
// Also clear the current browser's subscription if it exists
const existingSubscription = await this.serviceWorkerRegistration.pushManager.getSubscription();
if (existingSubscription) {
await existingSubscription.unsubscribe();
console.log('Cleared current browser subscription');
}
// Update status
this.isSubscribed = false;
alert(result.message + '. All browser notifications have been cleared.');
} else {
const error = await response.json();
console.error('Server clear failed:', error.message);
alert('Failed to clear server subscriptions: ' + error.message);
}
} catch (error) {
console.error('Failed to clear all notifications:', error);
alert('Failed to clear notifications: ' + error.message);
}
}
}
// Initialize when DOM is ready
if (document.readyState === 'loading') {
document.addEventListener('DOMContentLoaded', () => {
window.browserNotifications = new BrowserNotifications();
});
} else {
window.browserNotifications = new BrowserNotifications();
}

View File

@@ -117,15 +117,16 @@ $(document).ready(function () {
}
})
socket.on('general_stats_update', function (general_stats) {
// Tabs at bottom of list
$('#watch-table-wrapper').toggleClass("has-unread-changes", general_stats.unread_changes_count !==0)
$('#watch-table-wrapper').toggleClass("has-error", general_stats.count_errors !== 0)
$('#post-list-with-errors a').text(`With errors (${ new Intl.NumberFormat(navigator.language).format(general_stats.count_errors) })`);
$('#unread-tab-counter').text(new Intl.NumberFormat(navigator.language).format(general_stats.unread_changes_count));
});
socket.on('watch_update', function (data) {
const watch = data.watch;
const general_stats = data.general_stats;
// Log the entire watch object for debugging
console.log('!!! WATCH UPDATE EVENT RECEIVED !!!');
console.log(`${watch.event_timestamp} - Watch update ${watch.uuid} - Checking now - ${watch.checking_now} - UUID in URL ${window.location.href.includes(watch.uuid)}`);
console.log('Watch data:', watch);
console.log('General stats:', general_stats);
// Updating watch table rows
const $watchRow = $('tr[data-watch-uuid="' + watch.uuid + '"]');
@@ -150,13 +151,6 @@ $(document).ready(function () {
console.log('Updated UI for watch:', watch.uuid);
}
// Tabs at bottom of list
$('#post-list-mark-views').toggleClass("has-unviewed", general_stats.has_unviewed);
$('#post-list-unread').toggleClass("has-unviewed", general_stats.has_unviewed);
$('#post-list-with-errors').toggleClass("has-error", general_stats.count_errors !== 0)
$('#post-list-with-errors a').text(`With errors (${ general_stats.count_errors })`);
$('body').toggleClass('checking-now', watch.checking_now && window.location.href.includes(watch.uuid));
});

View File

@@ -1,95 +0,0 @@
// changedetection.io Service Worker for Browser Push Notifications
self.addEventListener('install', function(event) {
console.log('Service Worker installing');
self.skipWaiting();
});
self.addEventListener('activate', function(event) {
console.log('Service Worker activating');
event.waitUntil(self.clients.claim());
});
self.addEventListener('push', function(event) {
console.log('Push message received', event);
let notificationData = {
title: 'changedetection.io',
body: 'A watched page has changed',
icon: '/static/favicons/favicon-32x32.png',
badge: '/static/favicons/favicon-32x32.png',
tag: 'changedetection-notification',
requireInteraction: false,
timestamp: Date.now()
};
// Parse push data if available
if (event.data) {
try {
const pushData = event.data.json();
notificationData = {
...notificationData,
...pushData
};
} catch (e) {
console.warn('Failed to parse push data:', e);
notificationData.body = event.data.text() || notificationData.body;
}
}
const promiseChain = self.registration.showNotification(
notificationData.title,
{
body: notificationData.body,
icon: notificationData.icon,
badge: notificationData.badge,
tag: notificationData.tag,
requireInteraction: notificationData.requireInteraction,
timestamp: notificationData.timestamp,
data: {
url: notificationData.url || '/',
timestamp: notificationData.timestamp
}
}
);
event.waitUntil(promiseChain);
});
self.addEventListener('notificationclick', function(event) {
console.log('Notification clicked', event);
event.notification.close();
const targetUrl = event.notification.data?.url || '/';
event.waitUntil(
clients.matchAll().then(function(clientList) {
// Check if there's already a window/tab open with our app
for (let i = 0; i < clientList.length; i++) {
const client = clientList[i];
if (client.url.includes(self.location.origin) && 'focus' in client) {
client.navigate(targetUrl);
return client.focus();
}
}
// If no existing window, open a new one
if (clients.openWindow) {
return clients.openWindow(targetUrl);
}
})
);
});
self.addEventListener('notificationclose', function(event) {
console.log('Notification closed', event);
});
// Handle messages from the main thread
self.addEventListener('message', function(event) {
console.log('Service Worker received message:', event.data);
if (event.data && event.data.type === 'SKIP_WAITING') {
self.skipWaiting();
}
});

View File

@@ -17,18 +17,6 @@ body.checking-now {
position: fixed;
}
#post-list-buttons {
#post-list-with-errors.has-error {
display: inline-block !important;
}
#post-list-mark-views.has-unviewed {
display: inline-block !important;
}
#post-list-unread.has-unviewed {
display: inline-block !important;
}
}

View File

@@ -127,5 +127,44 @@
display: inline-block !important;
}
}
}
#watch-table-wrapper {
/* general styling */
#post-list-buttons {
text-align: right;
padding: 0px;
margin: 0px;
li {
display: inline-block;
}
a {
border-top-left-radius: initial;
border-top-right-radius: initial;
border-bottom-left-radius: 5px;
border-bottom-right-radius: 5px;
}
}
/* post list dynamically on/off stuff */
&.has-error {
#post-list-buttons {
#post-list-with-errors {
display: inline-block !important;
}
}
}
&.has-unread-changes {
#post-list-buttons {
#post-list-unread, #post-list-mark-views, #post-list-unread {
display: inline-block !important;
}
}
}
}

View File

@@ -203,24 +203,6 @@ code {
}
#post-list-buttons {
text-align: right;
padding: 0px;
margin: 0px;
li {
display: inline-block;
}
a {
border-top-left-radius: initial;
border-top-right-radius: initial;
border-bottom-left-radius: 5px;
border-bottom-right-radius: 5px;
}
}
body:after {
content: "";
background: linear-gradient(130deg, var(--color-background-gradient-first), var(--color-background-gradient-second) 41.07%, var(--color-background-gradient-third) 84.05%);

File diff suppressed because one or more lines are too long

View File

@@ -140,28 +140,6 @@ class ChangeDetectionStore:
secret = secrets.token_hex(16)
self.__data['settings']['application']['api_access_token'] = secret
# Generate VAPID keys for browser push notifications
if not self.__data['settings']['application']['vapid'].get('private_key'):
try:
from py_vapid import Vapid
vapid = Vapid()
vapid.generate_keys()
# Convert bytes to strings for JSON serialization
private_pem = vapid.private_pem()
public_pem = vapid.public_pem()
self.__data['settings']['application']['vapid']['private_key'] = private_pem.decode() if isinstance(private_pem, bytes) else private_pem
self.__data['settings']['application']['vapid']['public_key'] = public_pem.decode() if isinstance(public_pem, bytes) else public_pem
# Set default contact email if not present
if not self.__data['settings']['application']['vapid'].get('contact_email'):
self.__data['settings']['application']['vapid']['contact_email'] = 'citizen@example.com'
logger.info("Generated new VAPID keys for browser push notifications")
except ImportError:
logger.warning("py_vapid not available - browser notifications will not work")
except Exception as e:
logger.warning(f"Failed to generate VAPID keys: {e}")
self.needs_write = True
# Finally start the thread that will manage periodic data saves to JSON
@@ -224,14 +202,13 @@ class ChangeDetectionStore:
return seconds
@property
def has_unviewed(self):
if not self.__data.get('watching'):
return None
def unread_changes_count(self):
unread_changes_count = 0
for uuid, watch in self.__data['watching'].items():
if watch.history_n >= 2 and watch.viewed == False:
return True
return False
unread_changes_count += 1
return unread_changes_count
@property
def data(self):
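
Because the removed and added lines are interleaved in the hunk above (old has_unviewed property versus new unread_changes_count property), here is a reconstruction of what the replacement property looks like after this change; indentation and decorator placement are assumed from the surrounding context:

    @property
    def unread_changes_count(self):
        # Count watches that have at least one change the user has not viewed yet
        unread_changes_count = 0
        for uuid, watch in self.__data['watching'].items():
            if watch.history_n >= 2 and watch.viewed == False:
                unread_changes_count += 1
        return unread_changes_count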

View File

@@ -33,34 +33,6 @@
<div id="notification-test-log" style="display: none;"><span class="pure-form-message-inline">Processing..</span></div>
</div>
</div>
<!-- Browser Notifications -->
<div id="browser-notification-section">
<div class="pure-control-group">
<label>Browser Notifications</label>
<div class="pure-form-message-inline">
<p><strong>Browser push notifications!</strong> Use <code>browser://</code> URLs in your notification settings to receive real-time push notifications even when this tab is closed.</p>
<p><small><strong>Troubleshooting:</strong> If you get "different applicationServerKey" errors, click "Clear All Notifications" below and try again. This happens when switching between different changedetection.io instances.</small></p>
<div id="browser-notification-controls" style="margin-top: 1em;">
<div id="notification-permission-status">
<p>Browser notifications: <span id="permission-status">checking...</span></p>
</div>
<div id="browser-notification-actions">
<button type="button" id="enable-notifications-btn" class="pure-button button-secondary button-xsmall" style="display: none;">
Enable Browser Notifications
</button>
<button type="button" id="test-notification-btn" class="pure-button button-secondary button-xsmall" style="display: none;">
Send browser test notification
</button>
<button type="button" id="clear-notifications-btn" class="pure-button button-secondary button-xsmall" onclick="window.browserNotifications?.clearAllNotifications()" style="margin-left: 0.5em;">
Clear All Notifications
</button>
</div>
</div>
</div>
</div>
</div>
<div id="notification-customisation" class="pure-control-group">
<div class="pure-control-group">
{{ render_field(form.notification_title, class="m-d notification-title", placeholder=settings_application['notification_title']) }}

View File

@@ -35,7 +35,6 @@
<script src="{{url_for('static_content', group='js', filename='jquery-3.6.0.min.js')}}"></script>
<script src="{{url_for('static_content', group='js', filename='csrf.js')}}" defer></script>
<script src="{{url_for('static_content', group='js', filename='feather-icons.min.js')}}" defer></script>
<script src="{{url_for('static_content', group='js', filename='browser-notifications.js')}}" defer></script>
{% if socket_io_enabled %}
<script src="{{url_for('static_content', group='js', filename='socket.io.min.js')}}"></script>
<script src="{{url_for('static_content', group='js', filename='realtime.js')}}" defer></script>

View File

@@ -75,7 +75,7 @@ def test_check_removed_line_contains_trigger(client, live_server, measure_memory
wait_for_all_checks(client)
time.sleep(0.5)
res = client.get(url_for("watchlist.index"))
assert b'unviewed' not in res.data
assert b'has-unread-changes' not in res.data
# The trigger line is REMOVED, this should trigger
set_original(excluding='The golden line')
@@ -84,7 +84,7 @@ def test_check_removed_line_contains_trigger(client, live_server, measure_memory
client.get(url_for("ui.form_watch_checknow"), follow_redirects=True)
wait_for_all_checks(client)
res = client.get(url_for("watchlist.index"))
assert b'unviewed' in res.data
assert b'has-unread-changes' in res.data
time.sleep(1)
@@ -98,14 +98,14 @@ def test_check_removed_line_contains_trigger(client, live_server, measure_memory
wait_for_all_checks(client)
time.sleep(1)
res = client.get(url_for("watchlist.index"))
assert b'unviewed' not in res.data
assert b'has-unread-changes' not in res.data
# Remove it again, and we should get a trigger
set_original(excluding='The golden line')
client.get(url_for("ui.form_watch_checknow"), follow_redirects=True)
wait_for_all_checks(client)
res = client.get(url_for("watchlist.index"))
assert b'unviewed' in res.data
assert b'has-unread-changes' in res.data
res = client.get(url_for("ui.form_delete", uuid="all"), follow_redirects=True)
assert b'Deleted' in res.data
@@ -169,7 +169,7 @@ def test_check_add_line_contains_trigger(client, live_server, measure_memory_usa
wait_for_all_checks(client)
res = client.get(url_for("watchlist.index"))
assert b'unviewed' not in res.data
assert b'has-unread-changes' not in res.data
# The trigger line is ADDED, this should trigger
set_original(add_line='<p>Oh yes please</p>')
@@ -177,7 +177,7 @@ def test_check_add_line_contains_trigger(client, live_server, measure_memory_usa
wait_for_all_checks(client)
res = client.get(url_for("watchlist.index"))
assert b'unviewed' in res.data
assert b'has-unread-changes' in res.data
# Takes a moment for apprise to fire
wait_for_notification_endpoint_output()

View File

@@ -38,9 +38,9 @@ def test_check_basic_change_detection_functionality(client, live_server, measure
# Give the thread time to pick it up
wait_for_all_checks(client)
# It should report nothing found (no new 'unviewed' class)
# It should report nothing found (no new 'has-unread-changes' class)
res = client.get(url_for("watchlist.index"))
assert b'unviewed' not in res.data
assert b'has-unread-changes' not in res.data
assert b'test-endpoint' in res.data
# Default no password set, this stuff should be always available.
@@ -74,9 +74,9 @@ def test_check_basic_change_detection_functionality(client, live_server, measure
res = client.get(url_for("ui.ui_edit.watch_get_latest_html", uuid=uuid))
assert b'which has this one new line' in res.data
# Now something should be ready, indicated by having a 'unviewed' class
# Now something should be ready, indicated by having a 'has-unread-changes' class
res = client.get(url_for("watchlist.index"))
assert b'unviewed' in res.data
assert b'has-unread-changes' in res.data
# #75, and it should be in the RSS feed
rss_token = extract_rss_token_from_UI(client)
@@ -90,7 +90,7 @@ def test_check_basic_change_detection_functionality(client, live_server, measure
assert expected_url.encode('utf-8') in res.data
#
# Following the 'diff' link, it should no longer display as 'unviewed' even after we recheck it a few times
# Following the 'diff' link, it should no longer display as 'has-unread-changes' even after we recheck it a few times
res = client.get(url_for("ui.ui_views.diff_history_page", uuid=uuid))
assert b'selected=""' in res.data, "Confirm diff history page loaded"
@@ -111,12 +111,12 @@ def test_check_basic_change_detection_functionality(client, live_server, measure
# Give the thread time to pick it up
wait_for_all_checks(client)
# It should report nothing found (no new 'unviewed' class)
# It should report nothing found (no new 'has-unread-changes' class)
res = client.get(url_for("watchlist.index"))
assert b'unviewed' not in res.data
assert b'class="has-unviewed' not in res.data
assert b'has-unread-changes' not in res.data
assert b'class="has-unread-changes' not in res.data
assert b'head title' in res.data # Should be ON by default
assert b'test-endpoint' in res.data
@@ -140,8 +140,8 @@ def test_check_basic_change_detection_functionality(client, live_server, measure
wait_for_all_checks(client)
res = client.get(url_for("watchlist.index"))
assert b'unviewed' in res.data
assert b'class="has-unviewed' in res.data
assert b'has-unread-changes' in res.data
assert b'class="has-unread-changes' in res.data
assert b'head title' not in res.data # should now be off
@@ -151,8 +151,8 @@ def test_check_basic_change_detection_functionality(client, live_server, measure
# hit the mark all viewed link
res = client.get(url_for("ui.mark_all_viewed"), follow_redirects=True)
assert b'class="has-unviewed' not in res.data
assert b'unviewed' not in res.data
assert b'class="has-unread-changes' not in res.data
assert b'has-unread-changes' not in res.data
# #2458 "clear history" should make the Watch object update its status correctly when the first snapshot lands again
client.get(url_for("ui.clear_watch_history", uuid=uuid))
@@ -165,3 +165,53 @@ def test_check_basic_change_detection_functionality(client, live_server, measure
# Cleanup everything
res = client.get(url_for("ui.form_delete", uuid="all"), follow_redirects=True)
assert b'Deleted' in res.data
def test_non_text_mime_or_downloads(client, live_server, measure_memory_usage):
with open("test-datastore/endpoint-content.txt", "w") as f:
f.write("""some random text that should be split by line
and not parsed with html_to_text
this way we know that it correctly parsed as plain text
\r\n
ok\r\n
got it\r\n
""")
test_url = url_for('test_endpoint', content_type="application/octet-stream", _external=True)
# Add our URL to the import page
res = client.post(
url_for("imports.import_page"),
data={"urls": test_url},
follow_redirects=True
)
assert b"1 Imported" in res.data
wait_for_all_checks(client)
### check the front end
res = client.get(
url_for("ui.ui_views.preview_page", uuid="first"),
follow_redirects=True
)
assert b"some random text that should be split by line\n" in res.data
####
# Check the snapshot by API that it has linefeeds too
watch_uuid = next(iter(live_server.app.config['DATASTORE'].data['watching']))
api_key = live_server.app.config['DATASTORE'].data['settings']['application'].get('api_access_token')
res = client.get(
url_for("watchhistory", uuid=watch_uuid),
headers={'x-api-key': api_key},
)
# Fetch a snapshot by timestamp, check the right one was found
res = client.get(
url_for("watchsinglehistory", uuid=watch_uuid, timestamp=list(res.json.keys())[-1]),
headers={'x-api-key': api_key},
)
assert b"some random text that should be split by line\n" in res.data
res = client.get(url_for("ui.form_delete", uuid="all"), follow_redirects=True)

View File

@@ -58,6 +58,7 @@ def run_socketio_watch_update_test(client, live_server, password_mode=""):
has_watch_update = False
has_unviewed_update = False
got_general_stats_update = False
for i in range(10):
# Get received events
@@ -65,15 +66,11 @@ def run_socketio_watch_update_test(client, live_server, password_mode=""):
if received:
logger.info(f"Received {len(received)} events after {i+1} seconds")
# Check for watch_update events with unviewed=True
for event in received:
if event['name'] == 'watch_update':
has_watch_update = True
if event['args'][0]['watch'].get('unviewed', False):
has_unviewed_update = True
logger.info("Found unviewed update event!")
break
if event['name'] == 'general_stats_update':
got_general_stats_update = True
if has_unviewed_update:
break
@@ -92,7 +89,7 @@ def run_socketio_watch_update_test(client, live_server, password_mode=""):
assert has_watch_update, "No watch_update events received"
# Verify we received an unviewed event
assert has_unviewed_update, "No watch_update event with unviewed=True received"
assert got_general_stats_update, "Got general stats update event"
# Alternatively, check directly if the watch in the datastore is marked as unviewed
from changedetectionio.flask_app import app

View File

@@ -107,9 +107,9 @@ def test_check_block_changedetection_text_NOT_present(client, live_server, measu
# Give the thread time to pick it up
wait_for_all_checks(client)
# It should report nothing found (no new 'unviewed' class)
# It should report nothing found (no new 'has-unread-changes' class)
res = client.get(url_for("watchlist.index"))
assert b'unviewed' not in res.data
assert b'has-unread-changes' not in res.data
assert b'/test-endpoint' in res.data
# The page changed, BUT the text is still there, just the rest of it changes, we should not see a change
@@ -120,9 +120,9 @@ def test_check_block_changedetection_text_NOT_present(client, live_server, measu
# Give the thread time to pick it up
wait_for_all_checks(client)
# It should report nothing found (no new 'unviewed' class)
# It should report nothing found (no new 'has-unread-changes' class)
res = client.get(url_for("watchlist.index"))
assert b'unviewed' not in res.data
assert b'has-unread-changes' not in res.data
assert b'/test-endpoint' in res.data
# 2548
@@ -131,7 +131,7 @@ def test_check_block_changedetection_text_NOT_present(client, live_server, measu
client.get(url_for("ui.form_watch_checknow"), follow_redirects=True)
wait_for_all_checks(client)
res = client.get(url_for("watchlist.index"))
assert b'unviewed' not in res.data
assert b'has-unread-changes' not in res.data
# Now we set a change where the text is gone AND its different content, it should now trigger
@@ -139,7 +139,7 @@ def test_check_block_changedetection_text_NOT_present(client, live_server, measu
client.get(url_for("ui.form_watch_checknow"), follow_redirects=True)
wait_for_all_checks(client)
res = client.get(url_for("watchlist.index"))
assert b'unviewed' in res.data
assert b'has-unread-changes' in res.data

View File

@@ -125,7 +125,7 @@ def test_conditions_with_text_and_number(client, live_server):
time.sleep(2)
# 75 is > 20 and < 100 and contains "5"
res = client.get(url_for("watchlist.index"))
assert b'unviewed' in res.data
assert b'has-unread-changes' in res.data
# Case 2: Change with one condition violated
@@ -141,7 +141,7 @@ def test_conditions_with_text_and_number(client, live_server):
# Should NOT be marked as having changes since not all conditions are met
res = client.get(url_for("watchlist.index"))
assert b'unviewed' not in res.data
assert b'has-unread-changes' not in res.data
res = client.get(url_for("ui.form_delete", uuid="all"), follow_redirects=True)
assert b'Deleted' in res.data
@@ -299,7 +299,7 @@ def test_lev_conditions_plugin(client, live_server, measure_memory_usage):
wait_for_all_checks(client)
res = client.get(url_for("watchlist.index"))
assert b'unviewed' not in res.data
assert b'has-unread-changes' not in res.data
# Check the content saved initially, even tho a condition was set - this is the first snapshot so shouldnt be affected by conditions
res = client.get(
@@ -326,7 +326,7 @@ def test_lev_conditions_plugin(client, live_server, measure_memory_usage):
wait_for_all_checks(client)
res = client.get(url_for("watchlist.index"))
assert b'unviewed' not in res.data #because this will be like 0.90 not 0.8 threshold
assert b'has-unread-changes' not in res.data #because this will be like 0.90 not 0.8 threshold
############### Now change it a MORE THAN 50%
test_return_data = """<html>
@@ -345,7 +345,7 @@ def test_lev_conditions_plugin(client, live_server, measure_memory_usage):
assert b'Queued 1 watch for rechecking.' in res.data
wait_for_all_checks(client)
res = client.get(url_for("watchlist.index"))
assert b'unviewed' in res.data
assert b'has-unread-changes' in res.data
# cleanup for the next
client.get(
url_for("ui.form_delete", uuid="all"),

View File

@@ -116,10 +116,10 @@ def test_check_markup_include_filters_restriction(client, live_server, measure_m
# Give the thread time to pick it up
time.sleep(sleep_time_for_fetch_thread)
# It should have 'unviewed' still
# It should have 'has-unread-changes' still
# Because it should be looking at only that 'sametext' id
res = client.get(url_for("watchlist.index"))
assert b'unviewed' in res.data
assert b'has-unread-changes' in res.data
# Tests the whole stack works with the CSS Filter

View File

@@ -190,7 +190,7 @@ def test_element_removal_full(client, live_server, measure_memory_usage):
wait_for_all_checks(client)
# so that we set the state to 'unviewed' after all the edits
# so that we set the state to 'has-unread-changes' after all the edits
client.get(url_for("ui.ui_views.diff_history_page", uuid="first"))
# Make a change to header/footer/nav

View File

@@ -31,7 +31,7 @@ def _runner_test_http_errors(client, live_server, http_code, expected_text):
res = client.get(url_for("watchlist.index"))
# no change
assert b'unviewed' not in res.data
assert b'has-unread-changes' not in res.data
assert bytes(expected_text.encode('utf-8')) in res.data

View File

@@ -174,10 +174,10 @@ def test_check_filter_and_regex_extract(client, live_server, measure_memory_usag
# Give the thread time to pick it up
wait_for_all_checks(client)
# It should have 'unviewed' still
# It should have 'has-unread-changes' still
# Because it should be looking at only that 'sametext' id
res = client.get(url_for("watchlist.index"))
assert b'unviewed' in res.data
assert b'has-unread-changes' in res.data
# Check HTML conversion detected and worked
res = client.get(

View File

@@ -128,9 +128,9 @@ def test_check_ignore_text_functionality(client, live_server, measure_memory_usa
# Give the thread time to pick it up
wait_for_all_checks(client)
# It should report nothing found (no new 'unviewed' class)
# It should report nothing found (no new 'has-unread-changes' class)
res = client.get(url_for("watchlist.index"))
assert b'unviewed' not in res.data
assert b'has-unread-changes' not in res.data
assert b'/test-endpoint' in res.data
# Make a change
@@ -141,9 +141,9 @@ def test_check_ignore_text_functionality(client, live_server, measure_memory_usa
# Give the thread time to pick it up
wait_for_all_checks(client)
# It should report nothing found (no new 'unviewed' class)
# It should report nothing found (no new 'has-unread-changes' class)
res = client.get(url_for("watchlist.index"))
assert b'unviewed' not in res.data
assert b'has-unread-changes' not in res.data
assert b'/test-endpoint' in res.data
@@ -154,7 +154,7 @@ def test_check_ignore_text_functionality(client, live_server, measure_memory_usa
wait_for_all_checks(client)
res = client.get(url_for("watchlist.index"))
assert b'unviewed' in res.data
assert b'has-unread-changes' in res.data
res = client.get(url_for("ui.ui_views.preview_page", uuid="first"))
@@ -222,9 +222,9 @@ def _run_test_global_ignore(client, as_source=False, extra_ignore=""):
# Trigger a check
client.get(url_for("ui.form_watch_checknow"), follow_redirects=True)
wait_for_all_checks(client)
# It should report nothing found (no new 'unviewed' class), adding random ignore text should not cause a change
# It should report nothing found (no new 'has-unread-changes' class), adding random ignore text should not cause a change
res = client.get(url_for("watchlist.index"))
assert b'unviewed' not in res.data
assert b'has-unread-changes' not in res.data
assert b'/test-endpoint' in res.data
#####
@@ -238,10 +238,10 @@ def _run_test_global_ignore(client, as_source=False, extra_ignore=""):
# Give the thread time to pick it up
wait_for_all_checks(client)
# It should report nothing found (no new 'unviewed' class)
# It should report nothing found (no new 'has-unread-changes' class)
res = client.get(url_for("watchlist.index"))
assert b'unviewed' not in res.data
assert b'has-unread-changes' not in res.data
assert b'/test-endpoint' in res.data
# Just to be sure.. set a regular modified change that will trigger it
@@ -249,7 +249,7 @@ def _run_test_global_ignore(client, as_source=False, extra_ignore=""):
client.get(url_for("ui.form_watch_checknow"), follow_redirects=True)
wait_for_all_checks(client)
res = client.get(url_for("watchlist.index"))
assert b'unviewed' in res.data
assert b'has-unread-changes' in res.data
res = client.get(url_for("ui.form_delete", uuid="all"), follow_redirects=True)
assert b'Deleted' in res.data

View File

@@ -111,7 +111,7 @@ def test_render_anchor_tag_content_true(client, live_server, measure_memory_usag
assert '(/modified_link)' in res.data.decode()
# since the link has changed, and we chose to render anchor tag content,
# we should detect a change (new 'unviewed' class)
# we should detect a change (new 'has-unread-changes' class)
res = client.get(url_for("watchlist.index"))
assert b"unviewed" in res.data
assert b"/test-endpoint" in res.data

View File

@@ -77,9 +77,9 @@ def test_normal_page_check_works_with_ignore_status_code(client, live_server, me
# Give the thread time to pick it up
wait_for_all_checks(client)
# It should report nothing found (no new 'unviewed' class)
# It should report nothing found (no new 'has-unread-changes' class)
res = client.get(url_for("watchlist.index"))
assert b'unviewed' in res.data
assert b'has-unread-changes' in res.data
assert b'/test-endpoint' in res.data
@@ -124,8 +124,8 @@ def test_403_page_check_works_with_ignore_status_code(client, live_server, measu
# Give the thread time to pick it up
wait_for_all_checks(client)
# It should have 'unviewed' still
# It should have 'has-unread-changes' still
# Because it should be looking at only that 'sametext' id
res = client.get(url_for("watchlist.index"))
assert b'unviewed' in res.data
assert b'has-unread-changes' in res.data

View File

@@ -89,7 +89,7 @@ def test_check_ignore_whitespace(client, live_server, measure_memory_usage):
# Give the thread time to pick it up
time.sleep(sleep_time_for_fetch_thread)
# It should report nothing found (no new 'unviewed' class)
# It should report nothing found (no new 'has-unread-changes' class)
res = client.get(url_for("watchlist.index"))
assert b'unviewed' not in res.data
assert b'has-unread-changes' not in res.data
assert b'/test-endpoint' in res.data

View File

@@ -26,7 +26,7 @@ def test_jinja2_in_url_query(client, live_server, measure_memory_usage):
assert b"Watch added" in res.data
wait_for_all_checks(client)
# It should report nothing found (no new 'unviewed' class)
# It should report nothing found (no new 'has-unread-changes' class)
res = client.get(
url_for("ui.ui_views.preview_page", uuid="first"),
follow_redirects=True
@@ -51,7 +51,7 @@ def test_jinja2_security_url_query(client, live_server, measure_memory_usage):
assert b"Watch added" in res.data
wait_for_all_checks(client)
# It should report nothing found (no new 'unviewed' class)
# It should report nothing found (no new 'has-unread-changes' class)
res = client.get(url_for("watchlist.index"))
assert b'is invalid and cannot be used' in res.data
# Some of the spewed output from the subclasses

View File

@@ -280,9 +280,9 @@ def check_json_filter(json_filter, client, live_server):
# Give the thread time to pick it up
wait_for_all_checks(client)
# It should have 'unviewed' still
# It should have 'has-unread-changes' still
res = client.get(url_for("watchlist.index"))
assert b'unviewed' in res.data
assert b'has-unread-changes' in res.data
# Should not see this, because its not in the JSONPath we entered
res = client.get(url_for("ui.ui_views.diff_history_page", uuid="first"))
@@ -418,14 +418,14 @@ def check_json_ext_filter(json_filter, client, live_server):
# Give the thread time to pick it up
wait_for_all_checks(client)
# It should have 'unviewed'
# It should have 'has-unread-changes'
res = client.get(url_for("watchlist.index"))
assert b'unviewed' in res.data
assert b'has-unread-changes' in res.data
res = client.get(url_for("ui.ui_views.preview_page", uuid="first"))
# We should never see 'ForSale' because we are selecting on 'Sold' in the rule,
# But we should know it triggered ('unviewed' assert above)
# But we should know it triggered ('has-unread-changes' assert above)
assert b'ForSale' not in res.data
assert b'Sold' in res.data
@@ -465,7 +465,7 @@ def test_ignore_json_order(client, live_server, measure_memory_usage):
wait_for_all_checks(client)
res = client.get(url_for("watchlist.index"))
assert b'unviewed' not in res.data
assert b'has-unread-changes' not in res.data
# Just to be sure it still works
with open("test-datastore/endpoint-content.txt", "w") as f:
@@ -476,7 +476,7 @@ def test_ignore_json_order(client, live_server, measure_memory_usage):
wait_for_all_checks(client)
res = client.get(url_for("watchlist.index"))
assert b'unviewed' in res.data
assert b'has-unread-changes' in res.data
res = client.get(url_for("ui.form_delete", uuid="all"), follow_redirects=True)
assert b'Deleted' in res.data

View File

@@ -40,9 +40,9 @@ def test_check_basic_change_detection_functionality(client, live_server, measure
wait_for_all_checks(client)
# It should report nothing found (no new 'unviewed' class)
# It should report nothing found (no new 'has-unread-changes' class)
res = client.get(url_for("watchlist.index"))
assert b'unviewed' not in res.data
assert b'has-unread-changes' not in res.data
#####################
@@ -62,9 +62,9 @@ def test_check_basic_change_detection_functionality(client, live_server, measure
# Give the thread time to pick it up
wait_for_all_checks(client)
# It should report nothing found (no new 'unviewed' class)
# It should report nothing found (no new 'has-unread-changes' class)
res = client.get(url_for("watchlist.index"))
assert b'unviewed' not in res.data
assert b'has-unread-changes' not in res.data
uuid = next(iter(live_server.app.config['DATASTORE'].data['watching']))
watch = live_server.app.config['DATASTORE'].data['watching'][uuid]
@@ -92,9 +92,9 @@ def test_check_basic_change_detection_functionality(client, live_server, measure
# Give the thread time to pick it up
wait_for_all_checks(client)
# It should report nothing found (no new 'unviewed' class)
# It should report nothing found (no new 'has-unread-changes' class)
res = client.get(url_for("watchlist.index"))
assert b'unviewed' in res.data
assert b'has-unread-changes' in res.data
client.get(url_for("ui.mark_all_viewed"), follow_redirects=True)
time.sleep(0.2)
@@ -108,7 +108,7 @@ def test_check_basic_change_detection_functionality(client, live_server, measure
wait_for_all_checks(client)
res = client.get(url_for("watchlist.index"))
assert b'unviewed' in res.data # A change should have registered because empty_pages_are_a_change is ON
assert b'has-unread-changes' in res.data # A change should have registered because empty_pages_are_a_change is ON
assert b'fetch-error' not in res.data
#

View File

@@ -49,9 +49,9 @@ def test_fetch_pdf(client, live_server, measure_memory_usage):
wait_for_all_checks(client)
# Now something should be ready, indicated by having a 'unviewed' class
# Now something should be ready, indicated by having a 'has-unread-changes' class
res = client.get(url_for("watchlist.index"))
assert b'unviewed' in res.data
assert b'has-unread-changes' in res.data
# The original checksum should not be here anymore (cdio adds it to the bottom of the text)

View File

@@ -47,9 +47,9 @@ def test_fetch_pdf(client, live_server, measure_memory_usage):
wait_for_all_checks(client)
# Now something should be ready, indicated by having a 'unviewed' class
# Now something should be ready, indicated by having a 'has-unread-changes' class
res = client.get(url_for("watchlist.index"))
assert b'unviewed' in res.data
assert b'has-unread-changes' in res.data
# The original checksum should not be here anymore (cdio adds it to the bottom of the text)

View File

@@ -112,7 +112,7 @@ def test_itemprop_price_change(client, live_server):
wait_for_all_checks(client)
res = client.get(url_for("watchlist.index"))
assert b'180.45' in res.data
assert b'unviewed' in res.data
assert b'has-unread-changes' in res.data
client.get(url_for("ui.mark_all_viewed"), follow_redirects=True)
time.sleep(0.2)
@@ -129,7 +129,7 @@ def test_itemprop_price_change(client, live_server):
wait_for_all_checks(client)
res = client.get(url_for("watchlist.index"))
assert b'120.45' in res.data
assert b'unviewed' not in res.data
assert b'has-unread-changes' not in res.data
res = client.get(url_for("ui.form_delete", uuid="all"), follow_redirects=True)
@@ -178,7 +178,7 @@ def _run_test_minmax_limit(client, extra_watch_edit_form):
assert b'more than one price detected' not in res.data
# BUT the new price should show, even tho its within limits
assert b'1,000.45' or b'1000.45' in res.data #depending on locale
assert b'unviewed' not in res.data
assert b'has-unread-changes' not in res.data
# price changed to something LESS than min (900), SHOULD be a change
set_original_response(props_markup=instock_props[0], price='890.45')
@@ -188,7 +188,7 @@ def _run_test_minmax_limit(client, extra_watch_edit_form):
wait_for_all_checks(client)
res = client.get(url_for("watchlist.index"))
assert b'890.45' in res.data
assert b'unviewed' in res.data
assert b'has-unread-changes' in res.data
client.get(url_for("ui.mark_all_viewed"))
@@ -200,7 +200,7 @@ def _run_test_minmax_limit(client, extra_watch_edit_form):
wait_for_all_checks(client)
res = client.get(url_for("watchlist.index"))
assert b'820.45' in res.data
assert b'unviewed' in res.data
assert b'has-unread-changes' in res.data
client.get(url_for("ui.mark_all_viewed"))
# price changed to something MORE than max (1100.10), SHOULD be a change
@@ -210,7 +210,7 @@ def _run_test_minmax_limit(client, extra_watch_edit_form):
res = client.get(url_for("watchlist.index"))
# Depending on the LOCALE it may be either of these (generally for US/default/etc)
assert b'1,890.45' in res.data or b'1890.45' in res.data
assert b'unviewed' in res.data
assert b'has-unread-changes' in res.data
res = client.get(url_for("ui.form_delete", uuid="all"), follow_redirects=True)
assert b'Deleted' in res.data
@@ -294,7 +294,7 @@ def test_itemprop_percent_threshold(client, live_server):
wait_for_all_checks(client)
res = client.get(url_for("watchlist.index"))
assert b'960.45' in res.data
assert b'unviewed' not in res.data
assert b'has-unread-changes' not in res.data
# Bigger INCREASE change than the threshold should trigger
set_original_response(props_markup=instock_props[0], price='1960.45')
@@ -302,7 +302,7 @@ def test_itemprop_percent_threshold(client, live_server):
wait_for_all_checks(client)
res = client.get(url_for("watchlist.index"))
assert b'1,960.45' or b'1960.45' in res.data #depending on locale
assert b'unviewed' in res.data
assert b'has-unread-changes' in res.data
# Small decrease should NOT trigger
@@ -312,7 +312,7 @@ def test_itemprop_percent_threshold(client, live_server):
wait_for_all_checks(client)
res = client.get(url_for("watchlist.index"))
assert b'1,950.45' or b'1950.45' in res.data #depending on locale
assert b'unviewed' not in res.data
assert b'has-unread-changes' not in res.data

View File

@@ -43,9 +43,9 @@ def test_check_basic_change_detection_functionality_source(client, live_server,
wait_for_all_checks(client)
# Now something should be ready, indicated by having a 'unviewed' class
# Now something should be ready, indicated by having a 'has-unread-changes' class
res = client.get(url_for("watchlist.index"))
assert b'unviewed' in res.data
assert b'has-unread-changes' in res.data
res = client.get(
url_for("ui.ui_views.diff_history_page", uuid="first"),

View File

@@ -96,7 +96,7 @@ def test_trigger_functionality(client, live_server, measure_memory_usage):
# so that we set the state to 'unviewed' after all the edits
# so that we set the state to 'has-unread-changes' after all the edits
client.get(url_for("ui.ui_views.diff_history_page", uuid="first"))
# Trigger a check
@@ -104,9 +104,9 @@ def test_trigger_functionality(client, live_server, measure_memory_usage):
wait_for_all_checks(client)
# It should report nothing found (no new 'unviewed' class)
# It should report nothing found (no new 'has-unread-changes' class)
res = client.get(url_for("watchlist.index"))
assert b'unviewed' not in res.data
assert b'has-unread-changes' not in res.data
assert b'/test-endpoint' in res.data
# Make a change
@@ -116,9 +116,9 @@ def test_trigger_functionality(client, live_server, measure_memory_usage):
client.get(url_for("ui.form_watch_checknow"), follow_redirects=True)
wait_for_all_checks(client)
# It should report nothing found (no new 'unviewed' class)
# It should report nothing found (no new 'has-unread-changes' class)
res = client.get(url_for("watchlist.index"))
assert b'unviewed' not in res.data
assert b'has-unread-changes' not in res.data
# Now set the content which contains the trigger text
set_modified_with_trigger_text_response()
@@ -126,7 +126,7 @@ def test_trigger_functionality(client, live_server, measure_memory_usage):
client.get(url_for("ui.form_watch_checknow"), follow_redirects=True)
wait_for_all_checks(client)
res = client.get(url_for("watchlist.index"))
assert b'unviewed' in res.data
assert b'has-unread-changes' in res.data
# https://github.com/dgtlmoon/changedetection.io/issues/616
# Apparently the actual snapshot that contains the trigger never shows

View File

@@ -42,7 +42,7 @@ def test_trigger_regex_functionality(client, live_server, measure_memory_usage):
# It should report nothing found (just a new one shouldnt have anything)
res = client.get(url_for("watchlist.index"))
assert b'unviewed' not in res.data
assert b'has-unread-changes' not in res.data
### test regex
res = client.post(
@@ -54,7 +54,7 @@ def test_trigger_regex_functionality(client, live_server, measure_memory_usage):
follow_redirects=True
)
wait_for_all_checks(client)
# so that we set the state to 'unviewed' after all the edits
# so that we set the state to 'has-unread-changes' after all the edits
client.get(url_for("ui.ui_views.diff_history_page", uuid="first"))
with open("test-datastore/endpoint-content.txt", "w") as f:
@@ -65,7 +65,7 @@ def test_trigger_regex_functionality(client, live_server, measure_memory_usage):
# It should report nothing found (nothing should match the regex)
res = client.get(url_for("watchlist.index"))
assert b'unviewed' not in res.data
assert b'has-unread-changes' not in res.data
with open("test-datastore/endpoint-content.txt", "w") as f:
f.write("regex test123<br>\nsomething 123")
@@ -73,7 +73,7 @@ def test_trigger_regex_functionality(client, live_server, measure_memory_usage):
client.get(url_for("ui.form_watch_checknow"), follow_redirects=True)
wait_for_all_checks(client)
res = client.get(url_for("watchlist.index"))
assert b'unviewed' in res.data
assert b'has-unread-changes' in res.data
# Cleanup everything
res = client.get(url_for("ui.form_delete", uuid="all"), follow_redirects=True)

View File

@@ -69,7 +69,7 @@ def test_trigger_regex_functionality_with_filter(client, live_server, measure_me
# It should report nothing found (nothing should match the regex and filter)
res = client.get(url_for("watchlist.index"))
assert b'unviewed' not in res.data
assert b'has-unread-changes' not in res.data
# now this should trigger something
with open("test-datastore/endpoint-content.txt", "w") as f:
@@ -78,7 +78,7 @@ def test_trigger_regex_functionality_with_filter(client, live_server, measure_me
client.get(url_for("ui.form_watch_checknow"), follow_redirects=True)
time.sleep(sleep_time_for_fetch_thread)
res = client.get(url_for("watchlist.index"))
assert b'unviewed' in res.data
assert b'has-unread-changes' in res.data
# Cleanup everything
res = client.get(url_for("ui.form_delete", uuid="all"), follow_redirects=True)

View File

@@ -248,3 +248,44 @@ def test_page_title_listing_behaviour(client, live_server):
res = client.get(url_for("watchlist.index"))
assert b"head titlecustom html" in res.data
def test_ui_viewed_unread_flag(client, live_server):
import time
set_original_response(extra_title="custom html")
# Add our URL to the import page
res = client.post(
url_for("imports.import_page"),
data={"urls": url_for('test_endpoint', _external=True)+"\r\n"+url_for('test_endpoint', _external=True)},
follow_redirects=True
)
assert b"2 Imported" in res.data
wait_for_all_checks(client)
set_modified_response()
res = client.get(url_for("ui.form_watch_checknow"), follow_redirects=True)
assert b'Queued 2 watches for rechecking.' in res.data
wait_for_all_checks(client)
res = client.get(url_for("watchlist.index"))
assert b'<span id="unread-tab-counter">2</span>' in res.data
assert res.data.count(b'data-watch-uuid') == 2
# one should now be viewed, but two in total still
client.get(url_for("ui.ui_views.diff_history_page", uuid="first"))
res = client.get(url_for("watchlist.index"))
assert b'<span id="unread-tab-counter">1</span>' in res.data
assert res.data.count(b'data-watch-uuid') == 2
# check ?unread=1 works
res = client.get(url_for("watchlist.index")+"?unread=1")
assert res.data.count(b'data-watch-uuid') == 1
assert b'<span id="unread-tab-counter">1</span>' in res.data
# Mark all viewed test again
client.get(url_for("ui.mark_all_viewed"), follow_redirects=True)
time.sleep(0.2)
res = client.get(url_for("watchlist.index"))
assert b'<span id="unread-tab-counter">0</span>' in res.data

View File

@@ -97,7 +97,7 @@ def test_unique_lines_functionality(client, live_server, measure_memory_usage):
follow_redirects=True
)
assert b"Updated watch." in res.data
assert b'unviewed' not in res.data
assert b'has-unread-changes' not in res.data
# Make a change
set_modified_swapped_lines()
@@ -108,16 +108,16 @@ def test_unique_lines_functionality(client, live_server, measure_memory_usage):
# Give the thread time to pick it up
wait_for_all_checks(client)
# It should report nothing found (no new 'unviewed' class)
# It should report nothing found (no new 'has-unread-changes' class)
res = client.get(url_for("watchlist.index"))
assert b'unviewed' not in res.data
assert b'has-unread-changes' not in res.data
# Now set the content which contains the new text and re-ordered existing text
set_modified_with_trigger_text_response()
client.get(url_for("ui.form_watch_checknow"), follow_redirects=True)
wait_for_all_checks(client)
res = client.get(url_for("watchlist.index"))
assert b'unviewed' in res.data
assert b'has-unread-changes' in res.data
res = client.get(url_for("ui.form_delete", uuid="all"), follow_redirects=True)
assert b'Deleted' in res.data
@@ -157,7 +157,7 @@ def test_sort_lines_functionality(client, live_server, measure_memory_usage):
res = client.get(url_for("watchlist.index"))
# Should be a change registered
assert b'unviewed' in res.data
assert b'has-unread-changes' in res.data
res = client.get(
url_for("ui.ui_views.preview_page", uuid="first"),

View File

@@ -208,7 +208,7 @@ def test_check_markup_xpath_filter_restriction(client, live_server, measure_memo
wait_for_all_checks(client)
res = client.get(url_for("watchlist.index"))
assert b'unviewed' not in res.data
assert b'has-unread-changes' not in res.data
res = client.get(url_for("ui.form_delete", uuid="all"), follow_redirects=True)
assert b'Deleted' in res.data

View File

@@ -1,436 +0,0 @@
"""
Tests for browser notification functionality
Tests VAPID key handling, subscription management, and notification sending
"""
import json
import sys
import tempfile
import os
import unittest
from unittest.mock import patch, Mock, MagicMock
from py_vapid import Vapid
from changedetectionio.notification.apprise_plugin.browser_notification_helpers import (
convert_pem_private_key_for_pywebpush,
convert_pem_public_key_for_browser,
send_push_notifications,
create_notification_payload,
get_vapid_config_from_datastore,
get_browser_subscriptions,
save_browser_subscriptions
)
class TestVAPIDKeyHandling(unittest.TestCase):
"""Test VAPID key generation, conversion, and validation"""
def test_create_notification_payload(self):
"""Test notification payload creation"""
payload = create_notification_payload("Test Title", "Test Body", "/test-icon.png")
self.assertEqual(payload['title'], "Test Title")
self.assertEqual(payload['body'], "Test Body")
self.assertEqual(payload['icon'], "/test-icon.png")
self.assertEqual(payload['badge'], "/static/favicons/favicon-32x32.png")
self.assertIn('timestamp', payload)
self.assertIsInstance(payload['timestamp'], int)
def test_create_notification_payload_defaults(self):
"""Test notification payload with default values"""
payload = create_notification_payload("Title", "Body")
self.assertEqual(payload['icon'], "/static/favicons/favicon-32x32.png")
self.assertEqual(payload['badge'], "/static/favicons/favicon-32x32.png")
def test_convert_pem_private_key_for_pywebpush_with_valid_pem(self):
"""Test conversion of valid PEM private key to Vapid instance"""
# Generate a real VAPID key
vapid = Vapid()
vapid.generate_keys()
private_pem = vapid.private_pem().decode()
# Convert using our function
converted_key = convert_pem_private_key_for_pywebpush(private_pem)
# Should return a Vapid instance
self.assertIsInstance(converted_key, Vapid)
def test_convert_pem_private_key_invalid_input(self):
"""Test conversion with invalid input returns original"""
invalid_key = "not-a-pem-key"
result = convert_pem_private_key_for_pywebpush(invalid_key)
self.assertEqual(result, invalid_key)
none_key = None
result = convert_pem_private_key_for_pywebpush(none_key)
self.assertEqual(result, none_key)
def test_convert_pem_public_key_for_browser(self):
"""Test conversion of PEM public key to browser format"""
# Generate a real VAPID key pair
vapid = Vapid()
vapid.generate_keys()
public_pem = vapid.public_pem().decode()
# Convert to browser format
browser_key = convert_pem_public_key_for_browser(public_pem)
# Should return URL-safe base64 string
self.assertIsInstance(browser_key, str)
self.assertGreater(len(browser_key), 0)
# Should not contain padding
self.assertFalse(browser_key.endswith('='))
def test_convert_pem_public_key_invalid(self):
"""Test public key conversion with invalid input"""
result = convert_pem_public_key_for_browser("invalid-pem")
self.assertIsNone(result)
class TestDatastoreIntegration(unittest.TestCase):
"""Test datastore operations for VAPID and subscriptions"""
def test_get_vapid_config_from_datastore(self):
"""Test retrieving VAPID config from datastore"""
mock_datastore = Mock()
mock_datastore.data = {
'settings': {
'application': {
'vapid': {
'private_key': 'test-private-key',
'public_key': 'test-public-key',
'contact_email': 'test@example.com'
}
}
}
}
private_key, public_key, contact_email = get_vapid_config_from_datastore(mock_datastore)
self.assertEqual(private_key, 'test-private-key')
self.assertEqual(public_key, 'test-public-key')
self.assertEqual(contact_email, 'test@example.com')
def test_get_vapid_config_missing_email(self):
"""Test VAPID config with missing contact email uses default"""
mock_datastore = Mock()
mock_datastore.data = {
'settings': {
'application': {
'vapid': {
'private_key': 'test-private-key',
'public_key': 'test-public-key'
}
}
}
}
private_key, public_key, contact_email = get_vapid_config_from_datastore(mock_datastore)
self.assertEqual(contact_email, 'citizen@example.com')
def test_get_vapid_config_empty_datastore(self):
"""Test VAPID config with empty datastore returns None values"""
mock_datastore = Mock()
mock_datastore.data = {}
private_key, public_key, contact_email = get_vapid_config_from_datastore(mock_datastore)
self.assertIsNone(private_key)
self.assertIsNone(public_key)
self.assertEqual(contact_email, 'citizen@example.com')
def test_get_browser_subscriptions(self):
"""Test retrieving browser subscriptions from datastore"""
mock_datastore = Mock()
test_subscriptions = [
{
'endpoint': 'https://fcm.googleapis.com/fcm/send/test1',
'keys': {'p256dh': 'key1', 'auth': 'auth1'}
},
{
'endpoint': 'https://fcm.googleapis.com/fcm/send/test2',
'keys': {'p256dh': 'key2', 'auth': 'auth2'}
}
]
mock_datastore.data = {
'settings': {
'application': {
'browser_subscriptions': test_subscriptions
}
}
}
subscriptions = get_browser_subscriptions(mock_datastore)
self.assertEqual(len(subscriptions), 2)
self.assertEqual(subscriptions, test_subscriptions)
def test_get_browser_subscriptions_empty(self):
"""Test getting subscriptions from empty datastore returns empty list"""
mock_datastore = Mock()
mock_datastore.data = {}
subscriptions = get_browser_subscriptions(mock_datastore)
self.assertEqual(subscriptions, [])
def test_save_browser_subscriptions(self):
"""Test saving browser subscriptions to datastore"""
mock_datastore = Mock()
mock_datastore.data = {'settings': {'application': {}}}
test_subscriptions = [
{'endpoint': 'test1', 'keys': {'p256dh': 'key1', 'auth': 'auth1'}}
]
save_browser_subscriptions(mock_datastore, test_subscriptions)
self.assertEqual(mock_datastore.data['settings']['application']['browser_subscriptions'], test_subscriptions)
self.assertTrue(mock_datastore.needs_write)
class TestNotificationSending(unittest.TestCase):
"""Test notification sending with mocked pywebpush"""
@patch('pywebpush.webpush')
def test_send_push_notifications_success(self, mock_webpush):
"""Test successful notification sending"""
mock_webpush.return_value = True
mock_datastore = Mock()
mock_datastore.needs_write = False
subscriptions = [
{
'endpoint': 'https://fcm.googleapis.com/fcm/send/test1',
'keys': {'p256dh': 'key1', 'auth': 'auth1'}
}
]
# Generate a real VAPID key for testing
vapid = Vapid()
vapid.generate_keys()
private_key = vapid.private_pem().decode()
notification_payload = {
'title': 'Test Title',
'body': 'Test Body'
}
success_count, total_count = send_push_notifications(
subscriptions=subscriptions,
notification_payload=notification_payload,
private_key=private_key,
contact_email='test@example.com',
datastore=mock_datastore
)
self.assertEqual(success_count, 1)
self.assertEqual(total_count, 1)
self.assertTrue(mock_webpush.called)
# Verify webpush was called with correct parameters
call_args = mock_webpush.call_args
self.assertEqual(call_args[1]['subscription_info'], subscriptions[0])
self.assertEqual(json.loads(call_args[1]['data']), notification_payload)
self.assertIn('vapid_private_key', call_args[1])
self.assertEqual(call_args[1]['vapid_claims']['sub'], 'mailto:test@example.com')
@patch('pywebpush.webpush')
def test_send_push_notifications_webpush_exception(self, mock_webpush):
"""Test handling of WebPushException with invalid subscription removal"""
from pywebpush import WebPushException
# Mock a 410 response (subscription gone)
mock_response = Mock()
mock_response.status_code = 410
mock_webpush.side_effect = WebPushException("Subscription expired", response=mock_response)
mock_datastore = Mock()
mock_datastore.needs_write = False
subscriptions = [
{
'endpoint': 'https://fcm.googleapis.com/fcm/send/test1',
'keys': {'p256dh': 'key1', 'auth': 'auth1'}
}
]
vapid = Vapid()
vapid.generate_keys()
private_key = vapid.private_pem().decode()
success_count, total_count = send_push_notifications(
subscriptions=subscriptions,
notification_payload={'title': 'Test', 'body': 'Test'},
private_key=private_key,
contact_email='test@example.com',
datastore=mock_datastore
)
self.assertEqual(success_count, 0)
self.assertEqual(total_count, 1)
self.assertTrue(mock_datastore.needs_write) # Should mark for subscription cleanup
def test_send_push_notifications_no_pywebpush(self):
"""Test graceful handling when pywebpush is not available"""
with patch.dict('sys.modules', {'pywebpush': None}):
subscriptions = [{'endpoint': 'test', 'keys': {}}]
success_count, total_count = send_push_notifications(
subscriptions=subscriptions,
notification_payload={'title': 'Test', 'body': 'Test'},
private_key='test-key',
contact_email='test@example.com',
datastore=Mock()
)
self.assertEqual(success_count, 0)
self.assertEqual(total_count, 1)
class TestBrowserIntegration(unittest.TestCase):
"""Test browser integration aspects (file existence)"""
def test_javascript_browser_notifications_class_exists(self):
"""Test that browser notifications JavaScript file exists and has expected structure"""
js_file = "/var/www/changedetection.io/changedetectionio/static/js/browser-notifications.js"
self.assertTrue(os.path.exists(js_file))
with open(js_file, 'r') as f:
content = f.read()
# Check for key class and methods
self.assertIn('class BrowserNotifications', content)
self.assertIn('async init()', content)
self.assertIn('async subscribe()', content)
self.assertIn('async sendTestNotification()', content)
self.assertIn('setupNotificationUrlMonitoring()', content)
def test_service_worker_exists(self):
"""Test that service worker file exists"""
sw_file = "/var/www/changedetection.io/changedetectionio/static/js/service-worker.js"
self.assertTrue(os.path.exists(sw_file))
with open(sw_file, 'r') as f:
content = f.read()
# Check for key service worker functionality
self.assertIn('push', content)
self.assertIn('notificationclick', content)
class TestAPIEndpoints(unittest.TestCase):
"""Test browser notification API endpoints"""
def test_browser_notifications_module_exists(self):
"""Test that BrowserNotifications API module exists"""
api_file = "/var/www/changedetection.io/changedetectionio/notification/BrowserNotifications.py"
self.assertTrue(os.path.exists(api_file))
with open(api_file, 'r') as f:
content = f.read()
# Check for key API classes
self.assertIn('BrowserNotificationsVapidPublicKey', content)
self.assertIn('BrowserNotificationsSubscribe', content)
self.assertIn('BrowserNotificationsUnsubscribe', content)
def test_vapid_public_key_conversion(self):
"""Test VAPID public key conversion for browser use"""
# Generate a real key pair
vapid = Vapid()
vapid.generate_keys()
public_pem = vapid.public_pem().decode()
# Convert to browser format
browser_key = convert_pem_public_key_for_browser(public_pem)
# Verify it's a valid URL-safe base64 string
self.assertIsInstance(browser_key, str)
self.assertGreater(len(browser_key), 80) # P-256 uncompressed point should be ~88 chars
# Should not have padding
self.assertFalse(browser_key.endswith('='))
# Should only contain URL-safe base64 characters
import re
self.assertRegex(browser_key, r'^[A-Za-z0-9_-]+$')
class TestIntegrationFlow(unittest.TestCase):
"""Test complete integration flow"""
@patch('pywebpush.webpush')
def test_complete_notification_flow(self, mock_webpush):
"""Test complete flow from subscription to notification"""
mock_webpush.return_value = True
# Create mock datastore with VAPID keys
mock_datastore = Mock()
vapid = Vapid()
vapid.generate_keys()
mock_datastore.data = {
'settings': {
'application': {
'vapid': {
'private_key': vapid.private_pem().decode(),
'public_key': vapid.public_pem().decode(),
'contact_email': 'test@example.com'
},
'browser_subscriptions': [
{
'endpoint': 'https://fcm.googleapis.com/fcm/send/test123',
'keys': {
'p256dh': 'test-p256dh-key',
'auth': 'test-auth-key'
}
}
]
}
}
}
mock_datastore.needs_write = False
# Get configuration
private_key, public_key, contact_email = get_vapid_config_from_datastore(mock_datastore)
subscriptions = get_browser_subscriptions(mock_datastore)
# Create notification
payload = create_notification_payload("Test Title", "Test Message")
# Send notification
success_count, total_count = send_push_notifications(
subscriptions=subscriptions,
notification_payload=payload,
private_key=private_key,
contact_email=contact_email,
datastore=mock_datastore
)
# Verify success
self.assertEqual(success_count, 1)
self.assertEqual(total_count, 1)
self.assertTrue(mock_webpush.called)
# Verify webpush call parameters
call_args = mock_webpush.call_args
self.assertIn('subscription_info', call_args[1])
self.assertIn('vapid_private_key', call_args[1])
self.assertIn('vapid_claims', call_args[1])
# Verify vapid_claims format
vapid_claims = call_args[1]['vapid_claims']
self.assertEqual(vapid_claims['sub'], 'mailto:test@example.com')
self.assertEqual(vapid_claims['aud'], 'https://fcm.googleapis.com')
if __name__ == '__main__':
unittest.main()

View File

@@ -130,40 +130,47 @@ def extract_UUID_from_client(client):
def wait_for_all_checks(client=None):
"""
Waits until the queue is empty and workers are idle.
Much faster than the original with adaptive timing.
Waits until both queues are empty and all workers are idle.
Optimized for Janus queues with minimal delays.
"""
from changedetectionio.flask_app import update_q as global_update_q
from changedetectionio.flask_app import update_q as global_update_q, notification_q
from changedetectionio import worker_handler
logger = logging.getLogger()
empty_since = None
attempt = 0
max_attempts = 150 # Still reasonable upper bound
while attempt < max_attempts:
# Start with fast checks, slow down if needed
if attempt < 10:
time.sleep(0.1) # Very fast initial checks
elif attempt < 30:
time.sleep(0.3) # Medium speed
else:
time.sleep(0.8) # Slower for persistent issues
# Much tighter timing with reliable Janus queues
max_attempts = 100 # Reduced from 150
q_length = global_update_q.qsize()
for attempt in range(max_attempts):
# Check both queues and worker status
update_q_size = global_update_q.qsize()
notification_q_size = notification_q.qsize()
running_uuids = worker_handler.get_running_uuids()
any_workers_busy = len(running_uuids) > 0
if q_length == 0 and not any_workers_busy:
if empty_since is None:
empty_since = time.time()
elif time.time() - empty_since >= 0.15: # Shorter wait
break
# Both queues empty and no workers processing
if update_q_size == 0 and notification_q_size == 0 and not any_workers_busy:
# Small delay to account for items being added to queue during processing
time.sleep(0.05)
# Double-check after brief delay
update_q_size = global_update_q.qsize()
notification_q_size = notification_q.qsize()
running_uuids = worker_handler.get_running_uuids()
any_workers_busy = len(running_uuids) > 0
if update_q_size == 0 and notification_q_size == 0 and not any_workers_busy:
return # All clear!
# Adaptive sleep timing - start fast, get slightly slower
if attempt < 20:
time.sleep(0.05) # Very fast initial checks
elif attempt < 50:
time.sleep(0.1) # Medium speed
else:
empty_since = None
attempt += 1
time.sleep(0.3)
time.sleep(0.2) # Slower for edge cases
logger.warning(f"wait_for_all_checks() timed out after {max_attempts} attempts")
# Replaced by new_live_server_setup and calling per function scope in conftest.py
def live_server_setup(live_server):
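
The old and new implementations of wait_for_all_checks() are interleaved in the hunk above; the following is a best-effort reconstruction of the updated helper from the added lines only (module-level imports of time and logging are assumed from the original file):

    def wait_for_all_checks(client=None):
        """
        Waits until both queues are empty and all workers are idle.
        Optimized for Janus queues with minimal delays.
        """
        from changedetectionio.flask_app import update_q as global_update_q, notification_q
        from changedetectionio import worker_handler

        logger = logging.getLogger()
        max_attempts = 100

        for attempt in range(max_attempts):
            # Check both queues and worker status
            update_q_size = global_update_q.qsize()
            notification_q_size = notification_q.qsize()
            any_workers_busy = len(worker_handler.get_running_uuids()) > 0

            if update_q_size == 0 and notification_q_size == 0 and not any_workers_busy:
                # Small delay, then re-check in case items were queued during processing
                time.sleep(0.05)
                update_q_size = global_update_q.qsize()
                notification_q_size = notification_q.qsize()
                any_workers_busy = len(worker_handler.get_running_uuids()) > 0
                if update_q_size == 0 and notification_q_size == 0 and not any_workers_busy:
                    return  # All clear!

            # Adaptive sleep timing - start fast, get slightly slower
            if attempt < 20:
                time.sleep(0.05)
            elif attempt < 50:
                time.sleep(0.1)
            else:
                time.sleep(0.2)

        logger.warning(f"wait_for_all_checks() timed out after {max_attempts} attempts")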

View File

@@ -39,7 +39,7 @@ jsonpath-ng~=1.5.3
# jq not available on Windows so must be installed manually
# Notification library
apprise==1.9.3
apprise==1.9.4
# - Needed for apprise/spush, and maybe others? hopefully doesnt trigger a rust compile.
# - Requires extra wheel for rPi, adds build time for arm/v8 which is not in piwheels
@@ -51,8 +51,8 @@ cryptography==44.0.1
# use any version other than 2.0.x due to https://github.com/eclipse/paho.mqtt.python/issues/814
paho-mqtt!=2.0.*
# Used for CSS filtering
beautifulsoup4>=4.0.0
# Used for CSS filtering, JSON extraction from HTML
beautifulsoup4>=4.0.0,<=4.13.5
# XPath filtering, lxml is required by bs4 anyway, but put it here to be safe.
# #2328 - 5.2.0 and 5.2.1 had extra CPU flag CFLAGS set which was not compatible on older hardware
@@ -142,6 +142,3 @@ pre_commit >= 4.2.0
# For events between checking and socketio updates
blinker
# For Web Push notifications (browser notifications)
pywebpush