Compare commits

21 Commits

Author SHA1 Message Date
dgtlmoon
6d5970e55a WIP 2025-03-22 22:13:25 +01:00
dgtlmoon
8e833a2d71 Store 'last_modified' time info 2025-03-21 13:16:03 +01:00
dgtlmoon
efacc1cb6b use deepmerge 2025-03-21 13:10:44 +01:00
dgtlmoon
6c39c868f2 New deep merge store method 2025-03-21 13:04:20 +01:00
dgtlmoon
b6195cf5af always set default processor 2025-03-21 12:39:11 +01:00
dgtlmoon
d01032b639 Fix rehydrate 2025-03-21 11:21:06 +01:00
dgtlmoon
63a8802f32 Tidy up model def and clean up API endpoint 2025-03-21 11:00:35 +01:00
dgtlmoon
35d3ebeba5 Adding organisational UI tags 2025-03-21 10:42:25 +01:00
dgtlmoon
9182918139 improve datastore object (better for switching model types) 2025-03-21 10:23:15 +01:00
dgtlmoon
822a985b16 fix imports 2025-03-20 09:37:50 +01:00
dgtlmoon
03725992d0 Use new pyppeteerng 2025-03-20 00:10:58 +01:00
dgtlmoon
b612e5ace0 Revert "fix test"
This reverts commit d6470bc963.
2025-03-20 00:01:44 +01:00
dgtlmoon
d6470bc963 fix test 2025-03-19 23:59:27 +01:00
dgtlmoon
a218b10c5f Remove extra form redef 2025-03-19 23:56:51 +01:00
dgtlmoon
80ed6cbfc5 not needed 2025-03-19 17:56:12 +01:00
dgtlmoon
80c05516f7 remove 'enabled plugins' 2025-03-19 17:55:45 +01:00
dgtlmoon
eff6c1cdd3 Remove enabled plugins 2025-03-19 17:55:14 +01:00
dgtlmoon
b9a068b050 Small type check 2025-03-19 17:49:13 +01:00
dgtlmoon
a262f373cc Remove hard coded examples stuff 2025-03-19 17:43:16 +01:00
dgtlmoon
673ec24fa3 More work on plugins 2025-03-19 15:52:31 +01:00
dgtlmoon
9a073fc9aa WIP - pluggy refactor 2025-03-19 15:24:02 +01:00
107 changed files with 1962 additions and 2234 deletions

View File

@@ -27,7 +27,8 @@ jobs:
needs: lint-code
uses: ./.github/workflows/test-stack-reusable-workflow.yml
with:
python-version: '3.11'
python-version: '3.11'
skip-pypuppeteer: true
test-application-3-12:
needs: lint-code

View File

@@ -2,7 +2,7 @@
# Read more https://github.com/dgtlmoon/changedetection.io/wiki
__version__ = '0.49.9'
__version__ = '0.49.4'
from changedetectionio.strtobool import strtobool
from json.decoder import JSONDecodeError
@@ -33,7 +33,6 @@ def sigshutdown_handler(_signo, _stack_frame):
global datastore
name = signal.Signals(_signo).name
logger.critical(f'Shutdown: Got Signal - {name} ({_signo}), Saving DB to disk and calling shutdown')
datastore.sync_to_json()
logger.success('Sync JSON to disk complete.')
# This will throw a SystemExit exception, because eventlet.wsgi.server doesn't know how to deal with it.
# Solution: move to gevent or other server in the future (#2014)

View File

@@ -1,62 +0,0 @@
import os
from changedetectionio.strtobool import strtobool
from flask_restful import abort, Resource
from flask import request
import validators
from . import auth
class Import(Resource):
def __init__(self, **kwargs):
# datastore is a black box dependency
self.datastore = kwargs['datastore']
@auth.check_token
def post(self):
"""
@api {post} /api/v1/import Import a list of watched URLs
@apiDescription Accepts a line-feed separated list of URLs to import, additionally with ?tag_uuids=(tag id), ?tag=(name), ?proxy={key}, ?dedupe=true (default true) one URL per line.
@apiExample {curl} Example usage:
curl http://localhost:5000/api/v1/import --data-binary @list-of-sites.txt -H"x-api-key:8a111a21bc2f8f1dd9b9353bbd46049a"
@apiName Import
@apiGroup Watch
@apiSuccess (200) {List} OK List of watch UUIDs added
@apiSuccess (500) {String} ERR Some other error
"""
extras = {}
if request.args.get('proxy'):
plist = self.datastore.proxy_list
if not request.args.get('proxy') in plist:
return "Invalid proxy choice, currently supported proxies are '{}'".format(', '.join(plist)), 400
else:
extras['proxy'] = request.args.get('proxy')
dedupe = strtobool(request.args.get('dedupe', 'true'))
tags = request.args.get('tag')
tag_uuids = request.args.get('tag_uuids')
if tag_uuids:
tag_uuids = tag_uuids.split(',')
urls = request.get_data().decode('utf8').splitlines()
added = []
allow_simplehost = not strtobool(os.getenv('BLOCK_SIMPLEHOSTS', 'False'))
for url in urls:
url = url.strip()
if not len(url):
continue
# If hosts that only contain alphanumerics are allowed ("localhost" for example)
if not validators.url(url, simple_host=allow_simplehost):
return f"Invalid or unsupported URL - {url}", 400
if dedupe and self.datastore.url_exists(url):
continue
new_uuid = self.datastore.add_watch(url=url, extras=extras, tag=tags, tag_uuids=tag_uuids)
added.append(new_uuid)
return added
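The import endpoint documented in the docstring above (removed from this file and re-added inside the Watch module further down) can also be driven from Python instead of curl. A minimal sketch, where the instance URL, API key and tag name are placeholder values:

import requests

BASE = "http://localhost:5000/api/v1"                           # placeholder instance URL
HEADERS = {"x-api-key": "8a111a21bc2f8f1dd9b9353bbd46049a"}     # placeholder key from the docstring

# Line-feed separated URLs; ?dedupe=true (the default) skips URLs that already have a watch
payload = "https://example.com/page1\nhttps://example.com/page2"
resp = requests.post(f"{BASE}/import",
                     params={"tag": "Shopping", "dedupe": "true"},
                     data=payload.encode("utf8"),
                     headers=HEADERS)
print(resp.json())  # on success: list of newly created watch UUIDs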

View File

@@ -1,51 +0,0 @@
from flask_restful import Resource, abort
from flask import request
from . import auth
class Search(Resource):
def __init__(self, **kwargs):
# datastore is a black box dependency
self.datastore = kwargs['datastore']
@auth.check_token
def get(self):
"""
@api {get} /api/v1/search Search for watches
@apiDescription Search watches by URL or title text
@apiExample {curl} Example usage:
curl "http://localhost:5000/api/v1/search?q=https://example.com/page1" -H"x-api-key:813031b16330fe25e3780cf0325daa45"
curl "http://localhost:5000/api/v1/search?q=https://example.com/page1?tag=Favourites" -H"x-api-key:813031b16330fe25e3780cf0325daa45"
curl "http://localhost:5000/api/v1/search?q=https://example.com?partial=true" -H"x-api-key:813031b16330fe25e3780cf0325daa45"
@apiName Search
@apiGroup Watch Management
@apiQuery {String} q Search query to match against watch URLs and titles
@apiQuery {String} [tag] Optional name of tag to limit results (name not UUID)
@apiQuery {String} [partial] Allow partial matching of URL query
@apiSuccess (200) {Object} JSON Object containing matched watches
"""
query = request.args.get('q', '').strip()
tag_limit = request.args.get('tag', '').strip()
from changedetectionio.strtobool import strtobool
partial = bool(strtobool(request.args.get('partial', '0'))) if 'partial' in request.args else False
# Require a search query
if not query:
abort(400, message="Search query 'q' parameter is required")
# Use the search function from the datastore
matching_uuids = self.datastore.search_watches_for_url(query=query, tag_limit=tag_limit, partial=partial)
# Build the response with watch details
results = {}
for uuid in matching_uuids:
watch = self.datastore.data['watching'].get(uuid)
results[uuid] = {
'last_changed': watch.last_changed,
'last_checked': watch['last_checked'],
'last_error': watch['last_error'],
'title': watch['title'],
'url': watch['url'],
'viewed': watch.viewed
}
return results, 200

View File

@@ -1,54 +0,0 @@
from flask_restful import Resource
from . import auth
class SystemInfo(Resource):
def __init__(self, **kwargs):
# datastore is a black box dependency
self.datastore = kwargs['datastore']
self.update_q = kwargs['update_q']
@auth.check_token
def get(self):
"""
@api {get} /api/v1/systeminfo Return system info
@apiDescription Return some info about the current system state
@apiExample {curl} Example usage:
curl http://localhost:5000/api/v1/systeminfo -H"x-api-key:813031b16330fe25e3780cf0325daa45"
HTTP/1.0 200
{
'queue_size': 10 ,
'overdue_watches': ["watch-uuid-list"],
'uptime': 38344.55,
'watch_count': 800,
'version': "0.40.1"
}
@apiName Get Info
@apiGroup System Information
"""
import time
overdue_watches = []
# Check all watches and report which have not been checked but should have been
for uuid, watch in self.datastore.data.get('watching', {}).items():
# see if now - last_checked is greater than the time that should have been
# this is not super accurate (maybe they just edited it) but better than nothing
t = watch.threshold_seconds()
if not t:
# Use the system wide default
t = self.datastore.threshold_seconds
time_since_check = time.time() - watch.get('last_checked')
# Allow 5 minutes of grace time before we decide it's overdue
if time_since_check - (5 * 60) > t:
overdue_watches.append(uuid)
from changedetectionio import __version__ as main_version
return {
'queue_size': self.update_q.qsize(),
'overdue_watches': overdue_watches,
'uptime': round(time.time() - self.datastore.start_time, 2),
'watch_count': len(self.datastore.data.get('watching', {})),
'version': main_version
}, 200

View File

@@ -1,156 +0,0 @@
from flask_expects_json import expects_json
from flask_restful import abort, Resource
from flask import request
from . import auth
# Import schemas from __init__.py
from . import schema_tag, schema_create_tag, schema_update_tag
class Tag(Resource):
def __init__(self, **kwargs):
# datastore is a black box dependency
self.datastore = kwargs['datastore']
# Get information about a single tag
# curl http://localhost:5000/api/v1/tag/<string:uuid>
@auth.check_token
def get(self, uuid):
"""
@api {get} /api/v1/tag/:uuid Single tag - get data or toggle notification muting.
@apiDescription Retrieve tag information and set notification_muted status
@apiExample {curl} Example usage:
curl http://localhost:5000/api/v1/tag/cc0cfffa-f449-477b-83ea-0caafd1dc091 -H"x-api-key:813031b16330fe25e3780cf0325daa45"
curl "http://localhost:5000/api/v1/tag/cc0cfffa-f449-477b-83ea-0caafd1dc091?muted=muted" -H"x-api-key:813031b16330fe25e3780cf0325daa45"
@apiName Tag
@apiGroup Tag
@apiParam {uuid} uuid Tag unique ID.
@apiQuery {String} [muted] =`muted` or =`unmuted` , Sets the MUTE NOTIFICATIONS state
@apiSuccess (200) {String} OK When muted operation OR full JSON object of the tag
@apiSuccess (200) {JSON} TagJSON JSON Full JSON object of the tag
"""
from copy import deepcopy
tag = deepcopy(self.datastore.data['settings']['application']['tags'].get(uuid))
if not tag:
abort(404, message=f'No tag exists with the UUID of {uuid}')
if request.args.get('muted', '') == 'muted':
self.datastore.data['settings']['application']['tags'][uuid]['notification_muted'] = True
return "OK", 200
elif request.args.get('muted', '') == 'unmuted':
self.datastore.data['settings']['application']['tags'][uuid]['notification_muted'] = False
return "OK", 200
return tag
@auth.check_token
def delete(self, uuid):
"""
@api {delete} /api/v1/tag/:uuid Delete a tag and remove it from all watches
@apiExample {curl} Example usage:
curl http://localhost:5000/api/v1/tag/cc0cfffa-f449-477b-83ea-0caafd1dc091 -X DELETE -H"x-api-key:813031b16330fe25e3780cf0325daa45"
@apiParam {uuid} uuid Tag unique ID.
@apiName DeleteTag
@apiGroup Tag
@apiSuccess (200) {String} OK Was deleted
"""
if not self.datastore.data['settings']['application']['tags'].get(uuid):
abort(400, message='No tag exists with the UUID of {}'.format(uuid))
# Delete the tag, and any tag reference
del self.datastore.data['settings']['application']['tags'][uuid]
# Remove tag from all watches
for watch_uuid, watch in self.datastore.data['watching'].items():
if watch.get('tags') and uuid in watch['tags']:
watch['tags'].remove(uuid)
return 'OK', 204
@auth.check_token
@expects_json(schema_update_tag)
def put(self, uuid):
"""
@api {put} /api/v1/tag/:uuid Update tag information
@apiExample {curl} Example usage:
Update (PUT)
curl http://localhost:5000/api/v1/tag/cc0cfffa-f449-477b-83ea-0caafd1dc091 -X PUT -H"x-api-key:813031b16330fe25e3780cf0325daa45" -H "Content-Type: application/json" -d '{"title": "New Tag Title"}'
@apiDescription Updates an existing tag using JSON
@apiParam {uuid} uuid Tag unique ID.
@apiName UpdateTag
@apiGroup Tag
@apiSuccess (200) {String} OK Was updated
@apiSuccess (500) {String} ERR Some other error
"""
tag = self.datastore.data['settings']['application']['tags'].get(uuid)
if not tag:
abort(404, message='No tag exists with the UUID of {}'.format(uuid))
tag.update(request.json)
self.datastore.needs_write_urgent = True
return "OK", 200
@auth.check_token
# Only cares for {'title': 'xxxx'}
def post(self):
"""
@api {post} /api/v1/watch Create a single tag
@apiExample {curl} Example usage:
curl http://localhost:5000/api/v1/watch -H"x-api-key:813031b16330fe25e3780cf0325daa45" -H "Content-Type: application/json" -d '{"name": "Work related"}'
@apiName Create
@apiGroup Tag
@apiSuccess (200) {String} OK Was created
@apiSuccess (500) {String} ERR Some other error
"""
json_data = request.get_json()
title = json_data.get("title",'').strip()
new_uuid = self.datastore.add_tag(title=title)
if new_uuid:
return {'uuid': new_uuid}, 201
else:
return "Invalid or unsupported tag", 400
class Tags(Resource):
def __init__(self, **kwargs):
# datastore is a black box dependency
self.datastore = kwargs['datastore']
@auth.check_token
def get(self):
"""
@api {get} /api/v1/tags List tags
@apiDescription Return list of available tags
@apiExample {curl} Example usage:
curl http://localhost:5000/api/v1/tags -H"x-api-key:813031b16330fe25e3780cf0325daa45"
{
"cc0cfffa-f449-477b-83ea-0caafd1dc091": {
"title": "Tech News",
"notification_muted": false,
"date_created": 1677103794
},
"e6f5fd5c-dbfe-468b-b8f3-f9d6ff5ad69b": {
"title": "Shopping",
"notification_muted": true,
"date_created": 1676662819
}
}
@apiName ListTags
@apiGroup Tag Management
@apiSuccess (200) {String} OK JSON dict
"""
result = {}
for uuid, tag in self.datastore.data['settings']['application']['tags'].items():
result[uuid] = {
'date_created': tag.get('date_created', 0),
'notification_muted': tag.get('notification_muted', False),
'title': tag.get('title', ''),
'uuid': tag.get('uuid')
}
return result, 200

View File

@@ -1,26 +0,0 @@
import copy
from . import api_schema
from ..model import watch_base
# Build a JSON Schema atleast partially based on our Watch model
watch_base_config = watch_base()
schema = api_schema.build_watch_json_schema(watch_base_config)
schema_create_watch = copy.deepcopy(schema)
schema_create_watch['required'] = ['url']
schema_update_watch = copy.deepcopy(schema)
schema_update_watch['additionalProperties'] = False
# Tag schema is also based on watch_base since Tag inherits from it
schema_tag = copy.deepcopy(schema)
schema_create_tag = copy.deepcopy(schema_tag)
schema_create_tag['required'] = ['title']
schema_update_tag = copy.deepcopy(schema_tag)
schema_update_tag['additionalProperties'] = False
# Import all API resources
from .Watch import Watch, WatchHistory, WatchSingleHistory, CreateWatch
from .Tags import Tags, Tag
from .Import import Import
from .SystemInfo import SystemInfo
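The schema objects built in this file (and rebuilt inside the Watch module below) are ordinary JSON Schema dictionaries, so they can be exercised directly with the jsonschema package. A small sketch, assuming it runs in the module where schema_create_watch is defined and that 'title' is a valid Watch field:

import jsonschema

candidate = {"url": "https://example.com", "title": "Example watch"}
try:
    # schema_create_watch marks 'url' as required; an empty dict would raise here
    jsonschema.validate(instance=candidate, schema=schema_create_watch)
except jsonschema.ValidationError as e:
    print(f"Rejected: {e.message}")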

View File

@@ -9,9 +9,19 @@ import validators
from . import auth
import copy
# Import schemas from __init__.py
from . import schema, schema_create_watch, schema_update_watch
# See docs/README.md for rebuilding the docs/apidoc information
from . import api_schema
from ..model import schema as watch_schema
# Build a JSON Schema atleast partially based on our Watch model
schema = api_schema.build_watch_json_schema(watch_schema)
schema_create_watch = copy.deepcopy(schema)
schema_create_watch['required'] = ['url']
schema_update_watch = copy.deepcopy(schema)
schema_update_watch['additionalProperties'] = False
class Watch(Resource):
def __init__(self, **kwargs):
@@ -42,9 +52,9 @@ class Watch(Resource):
@apiSuccess (200) {JSON} WatchJSON JSON Full JSON object of the watch
"""
from copy import deepcopy
watch = deepcopy(self.datastore.data['watching'].get(uuid))
watch = self.datastore.data['watching'].get(uuid)
if not watch:
abort(404, message='No watch exists with the UUID of {}'.format(uuid))
abort(404, message=f'No watch exists with the UUID of {uuid}')
if request.args.get('recheck'):
self.update_q.put(queuedWatchMetaData.PrioritizedItem(priority=1, item={'uuid': uuid}))
@@ -62,13 +72,16 @@ class Watch(Resource):
self.datastore.data['watching'].get(uuid).unmute()
return "OK", 200
# Return without history, get that via another API call
# Properties are not returned as a JSON, so add the required props manually
watch['history_n'] = watch.history_n
# attr .last_changed will check for the last written text snapshot on change
watch['last_changed'] = watch.last_changed
watch['viewed'] = watch.viewed
return watch
response = dict(watch.get_data())
# Add properties that aren't included in the standard dictionary items (they are properties/attr)
response['history_n'] = watch.history_n
response['last_changed'] = watch.last_changed
response['viewed'] = watch.viewed
response['title'] = watch.get('title')
return response
@auth.check_token
def delete(self, uuid):
@@ -103,16 +116,17 @@ class Watch(Resource):
@apiSuccess (200) {String} OK Was updated
@apiSuccess (500) {String} ERR Some other error
"""
watch = self.datastore.data['watching'].get(uuid)
if not watch:
abort(404, message='No watch exists with the UUID of {}'.format(uuid))
if not self.datastore.data['watching'].get(uuid):
abort(404, message=f'No watch exists with the UUID of {uuid}')
if request.json.get('proxy'):
plist = self.datastore.proxy_list
if not request.json.get('proxy') in plist:
return "Invalid proxy choice, currently supported proxies are '{}'".format(', '.join(plist)), 400
return f"Invalid proxy choice, currently supported proxies are '{', '.join(plist)}'", 400
watch.update(request.json)
self.datastore.data['watching'][uuid].update(request.json)
self.datastore.data['watching'][uuid].save_data()
return "OK", 200
@@ -274,6 +288,8 @@ class CreateWatch(Resource):
list = {}
tag_limit = request.args.get('tag', '').lower()
for uuid, watch in self.datastore.data['watching'].items():
# Watch tags by name (replace the other calls?)
tags = self.datastore.get_all_tags_for_watch(uuid=uuid)
@@ -294,4 +310,110 @@ class CreateWatch(Resource):
self.update_q.put(queuedWatchMetaData.PrioritizedItem(priority=1, item={'uuid': uuid}))
return {'status': "OK"}, 200
return list, 200
return list, 200
class Import(Resource):
def __init__(self, **kwargs):
# datastore is a black box dependency
self.datastore = kwargs['datastore']
@auth.check_token
def post(self):
"""
@api {post} /api/v1/import Import a list of watched URLs
@apiDescription Accepts a line-feed separated list of URLs to import, additionally with ?tag_uuids=(tag id), ?tag=(name), ?proxy={key}, ?dedupe=true (default true) one URL per line.
@apiExample {curl} Example usage:
curl http://localhost:5000/api/v1/import --data-binary @list-of-sites.txt -H"x-api-key:8a111a21bc2f8f1dd9b9353bbd46049a"
@apiName Import
@apiGroup Watch
@apiSuccess (200) {List} OK List of watch UUIDs added
@apiSuccess (500) {String} ERR Some other error
"""
extras = {}
if request.args.get('proxy'):
plist = self.datastore.proxy_list
if not request.args.get('proxy') in plist:
return "Invalid proxy choice, currently supported proxies are '{}'".format(', '.join(plist)), 400
else:
extras['proxy'] = request.args.get('proxy')
dedupe = strtobool(request.args.get('dedupe', 'true'))
tags = request.args.get('tag')
tag_uuids = request.args.get('tag_uuids')
if tag_uuids:
tag_uuids = tag_uuids.split(',')
urls = request.get_data().decode('utf8').splitlines()
added = []
allow_simplehost = not strtobool(os.getenv('BLOCK_SIMPLEHOSTS', 'False'))
for url in urls:
url = url.strip()
if not len(url):
continue
# If hosts that only contain alphanumerics are allowed ("localhost" for example)
if not validators.url(url, simple_host=allow_simplehost):
return f"Invalid or unsupported URL - {url}", 400
if dedupe and self.datastore.url_exists(url):
continue
new_uuid = self.datastore.add_watch(url=url, extras=extras, tag=tags, tag_uuids=tag_uuids)
added.append(new_uuid)
return added
class SystemInfo(Resource):
def __init__(self, **kwargs):
# datastore is a black box dependency
self.datastore = kwargs['datastore']
self.update_q = kwargs['update_q']
@auth.check_token
def get(self):
"""
@api {get} /api/v1/systeminfo Return system info
@apiDescription Return some info about the current system state
@apiExample {curl} Example usage:
curl http://localhost:5000/api/v1/systeminfo -H"x-api-key:813031b16330fe25e3780cf0325daa45"
HTTP/1.0 200
{
'queue_size': 10 ,
'overdue_watches': ["watch-uuid-list"],
'uptime': 38344.55,
'watch_count': 800,
'version': "0.40.1"
}
@apiName Get Info
@apiGroup System Information
"""
import time
overdue_watches = []
# Check all watches and report which have not been checked but should have been
for uuid, watch in self.datastore.data.get('watching', {}).items():
# see if now - last_checked is greater than the time that should have been
# this is not super accurate (maybe they just edited it) but better than nothing
t = watch.threshold_seconds()
if not t:
# Use the system wide default
t = self.datastore.threshold_seconds
time_since_check = time.time() - watch.get('last_checked')
# Allow 5 minutes of grace time before we decide it's overdue
if time_since_check - (5 * 60) > t:
overdue_watches.append(uuid)
from changedetectionio import __version__ as main_version
return {
'queue_size': self.update_q.qsize(),
'overdue_watches': overdue_watches,
'uptime': round(time.time() - self.datastore.start_time, 2),
'watch_count': len(self.datastore.data.get('watching', {})),
'version': main_version
}, 200

View File

@@ -11,14 +11,22 @@ def check_token(f):
datastore = args[0].datastore
config_api_token_enabled = datastore.data['settings']['application'].get('api_access_token_enabled')
if not config_api_token_enabled:
return
try:
api_key_header = request.headers['x-api-key']
except KeyError:
return make_response(
jsonify("No authorization x-api-key header."), 403
)
config_api_token = datastore.data['settings']['application'].get('api_access_token')
# config_api_token_enabled - a UI option in settings if access should obey the key or not
if config_api_token_enabled:
if request.headers.get('x-api-key') != config_api_token:
return make_response(
jsonify("Invalid access - API key invalid."), 403
)
if api_key_header != config_api_token:
return make_response(
jsonify("Invalid access - API key invalid."), 403
)
return f(*args, **kwargs)
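From a client's point of view the rewritten decorator behaves as follows, assuming API token access is enabled in settings (URL and key are placeholders):

import requests

# Missing header -> 403 "No authorization x-api-key header."
r = requests.get("http://localhost:5000/api/v1/systeminfo")
print(r.status_code, r.json())

# Wrong key -> 403 "Invalid access - API key invalid."
r = requests.get("http://localhost:5000/api/v1/systeminfo",
                 headers={"x-api-key": "not-the-real-key"})
print(r.status_code, r.json())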

View File

@@ -0,0 +1,12 @@
from changedetectionio import apprise_plugin
import apprise
# Create our AppriseAsset and populate it with some of our new values:
# https://github.com/caronc/apprise/wiki/Development_API#the-apprise-asset-object
asset = apprise.AppriseAsset(
image_url_logo='https://raw.githubusercontent.com/dgtlmoon/changedetection.io/master/changedetectionio/static/images/avatar-256x256.png'
)
asset.app_id = "changedetection.io"
asset.app_desc = "ChangeDetection.io best and simplest website monitoring and change detection"
asset.app_url = "https://changedetection.io"

View File

@@ -0,0 +1,98 @@
# include the decorator
from apprise.decorators import notify
from loguru import logger
from requests.structures import CaseInsensitiveDict
@notify(on="delete")
@notify(on="deletes")
@notify(on="get")
@notify(on="gets")
@notify(on="post")
@notify(on="posts")
@notify(on="put")
@notify(on="puts")
def apprise_custom_api_call_wrapper(body, title, notify_type, *args, **kwargs):
import requests
import json
import re
from urllib.parse import unquote_plus
from apprise.utils.parse import parse_url as apprise_parse_url
url = kwargs['meta'].get('url')
schema = kwargs['meta'].get('schema').lower().strip()
# Choose POST, GET etc from requests
method = re.sub(rf's$', '', schema)
requests_method = getattr(requests, method)
params = CaseInsensitiveDict({}) # Added to requests
auth = None
has_error = False
# Convert /foobar?+some-header=hello to proper header dictionary
results = apprise_parse_url(url)
# Add our headers that the user can potentially over-ride if they wish
# to to our returned result set and tidy entries by unquoting them
headers = CaseInsensitiveDict({unquote_plus(x): unquote_plus(y)
for x, y in results['qsd+'].items()})
# https://github.com/caronc/apprise/wiki/Notify_Custom_JSON#get-parameter-manipulation
# In Apprise, it relies on prefixing each request arg with "-", because it uses say &method=update as a flag for apprise
# but here we are making straight requests, so we need todo convert this against apprise's logic
for k, v in results['qsd'].items():
if not k.strip('+-') in results['qsd+'].keys():
params[unquote_plus(k)] = unquote_plus(v)
# Determine Authentication
auth = ''
if results.get('user') and results.get('password'):
auth = (unquote_plus(results.get('user')), unquote_plus(results.get('password')))
elif results.get('user'):
auth = (unquote_plus(results.get('user')))
# If it smells like it could be JSON and no content-type was already set, offer a default content type.
if body and '{' in body[:100] and not headers.get('Content-Type'):
json_header = 'application/json; charset=utf-8'
try:
# Try if it's JSON
json.loads(body)
headers['Content-Type'] = json_header
except ValueError as e:
logger.warning(f"Could not automatically add '{json_header}' header to the notification because the document failed to parse as JSON: {e}")
pass
# POSTS -> HTTPS etc
if schema.lower().endswith('s'):
url = re.sub(rf'^{schema}', 'https', results.get('url'))
else:
url = re.sub(rf'^{schema}', 'http', results.get('url'))
status_str = ''
try:
r = requests_method(url,
auth=auth,
data=body.encode('utf-8') if type(body) is str else body,
headers=headers,
params=params
)
if not (200 <= r.status_code < 300):
status_str = f"Error sending '{method.upper()}' request to {url} - Status: {r.status_code}: '{r.reason}'"
logger.error(status_str)
has_error = True
else:
logger.info(f"Sent '{method.upper()}' request to {url}")
has_error = False
except requests.RequestException as e:
status_str = f"Error sending '{method.upper()}' request to {url} - {str(e)}"
logger.error(status_str)
has_error = True
if has_error:
raise TypeError(status_str)
return True
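The @notify decorators above register this wrapper for the get/gets/post/posts/put/puts/delete/deletes schemas, so any notification URL using one of those schemes is routed through it. A minimal usage sketch, where the webhook URL and header value are placeholders:

import apprise
from changedetectionio import apprise_plugin  # importing registers the custom schemes (path as used in the asset file above)

apobj = apprise.Apprise()
# 'posts://' is sent as an HTTPS POST; '+'-prefixed query args become request headers
apobj.add("posts://example.com/webhook?+x-token=secret")
apobj.notify(title="Change detected", body='{"watch": "example"}')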

View File

@@ -1,16 +0,0 @@
from apprise import AppriseAsset
# Refer to:
# https://github.com/caronc/apprise/wiki/Development_API#the-apprise-asset-object
APPRISE_APP_ID = "changedetection.io"
APPRISE_APP_DESC = "ChangeDetection.io best and simplest website monitoring and change detection"
APPRISE_APP_URL = "https://changedetection.io"
APPRISE_AVATAR_URL = "https://raw.githubusercontent.com/dgtlmoon/changedetection.io/master/changedetectionio/static/images/avatar-256x256.png"
apprise_asset = AppriseAsset(
app_id=APPRISE_APP_ID,
app_desc=APPRISE_APP_DESC,
app_url=APPRISE_APP_URL,
image_url_logo=APPRISE_AVATAR_URL,
)

View File

@@ -1,112 +0,0 @@
import json
import re
from urllib.parse import unquote_plus
import requests
from apprise.decorators import notify
from apprise.utils.parse import parse_url as apprise_parse_url
from loguru import logger
from requests.structures import CaseInsensitiveDict
SUPPORTED_HTTP_METHODS = {"get", "post", "put", "delete", "patch", "head"}
def notify_supported_methods(func):
for method in SUPPORTED_HTTP_METHODS:
func = notify(on=method)(func)
# Add support for https, for each supported http method
func = notify(on=f"{method}s")(func)
return func
def _get_auth(parsed_url: dict) -> str | tuple[str, str]:
user: str | None = parsed_url.get("user")
password: str | None = parsed_url.get("password")
if user is not None and password is not None:
return (unquote_plus(user), unquote_plus(password))
if user is not None:
return unquote_plus(user)
return ""
def _get_headers(parsed_url: dict, body: str) -> CaseInsensitiveDict:
headers = CaseInsensitiveDict(
{unquote_plus(k).title(): unquote_plus(v) for k, v in parsed_url["qsd+"].items()}
)
# If Content-Type is not specified, guess if the body is a valid JSON
if headers.get("Content-Type") is None:
try:
json.loads(body)
headers["Content-Type"] = "application/json; charset=utf-8"
except Exception:
pass
return headers
def _get_params(parsed_url: dict) -> CaseInsensitiveDict:
# https://github.com/caronc/apprise/wiki/Notify_Custom_JSON#get-parameter-manipulation
# In Apprise, it relies on prefixing each request arg with "-", because it uses say &method=update as a flag for apprise
# but here we are making straight requests, so we need todo convert this against apprise's logic
params = CaseInsensitiveDict(
{
unquote_plus(k): unquote_plus(v)
for k, v in parsed_url["qsd"].items()
if k.strip("-") not in parsed_url["qsd-"]
and k.strip("+") not in parsed_url["qsd+"]
}
)
return params
@notify_supported_methods
def apprise_http_custom_handler(
body: str,
title: str,
notify_type: str,
meta: dict,
*args,
**kwargs,
) -> bool:
url: str = meta.get("url")
schema: str = meta.get("schema")
method: str = re.sub(r"s$", "", schema).upper()
# Convert /foobar?+some-header=hello to proper header dictionary
parsed_url: dict[str, str | dict | None] | None = apprise_parse_url(url)
if parsed_url is None:
return False
auth = _get_auth(parsed_url=parsed_url)
headers = _get_headers(parsed_url=parsed_url, body=body)
params = _get_params(parsed_url=parsed_url)
url = re.sub(rf"^{schema}", "https" if schema.endswith("s") else "http", parsed_url.get("url"))
try:
response = requests.request(
method=method,
url=url,
auth=auth,
headers=headers,
params=params,
data=body.encode("utf-8") if isinstance(body, str) else body,
)
response.raise_for_status()
logger.info(f"Successfully sent custom notification to {url}")
return True
except requests.RequestException as e:
logger.error(f"Remote host error while sending custom notification to {url}: {e}")
return False
except Exception as e:
logger.error(f"Unexpected error occurred while sending custom notification to {url}: {e}")
return False

View File

@@ -89,8 +89,6 @@ def construct_blueprint(datastore: ChangeDetectionStore):
flash("Maximum number of backups reached, please remove some", "error")
return redirect(url_for('backups.index'))
# Be sure we're written fresh
datastore.sync_to_json()
zip_thread = threading.Thread(target=create_backup, args=(datastore.datastore_path, datastore.data.get("watching")))
zip_thread.start()
backup_threads.append(zip_thread)
@@ -138,7 +136,7 @@ def construct_blueprint(datastore: ChangeDetectionStore):
return send_from_directory(os.path.abspath(datastore.datastore_path), filename, as_attachment=True)
@login_optionally_required
@backups_blueprint.route("", methods=['GET'])
@backups_blueprint.route("/", methods=['GET'])
def index():
backups = find_backups()
output = render_template("overview.html",

View File

@@ -27,7 +27,7 @@ def construct_blueprint(datastore: ChangeDetectionStore, update_q, queuedWatchMe
update_q.put(queuedWatchMetaData.PrioritizedItem(priority=1, item={'uuid': uuid}))
if len(importer_handler.remaining_data) == 0:
return redirect(url_for('watchlist.index'))
return redirect(url_for('index'))
else:
remaining_urls = importer_handler.remaining_data
@@ -63,7 +63,7 @@ def construct_blueprint(datastore: ChangeDetectionStore, update_q, queuedWatchMe
update_q.put(queuedWatchMetaData.PrioritizedItem(priority=1, item={'uuid': uuid}))
# Could be some remaining, or we could be on GET
form = forms.importForm(formdata=request.form if request.method == 'POST' else None)
form = forms.importForm(formdata=request.form if request.method == 'POST' else None, datastore=datastore)
output = render_template("import.html",
form=form,
import_url_list_remaining="\n".join(remaining_urls),

View File

@@ -3,7 +3,6 @@ import time
from wtforms import ValidationError
from loguru import logger
from changedetectionio.forms import validate_url
class Importer():
@@ -151,6 +150,7 @@ class import_xlsx_wachete(Importer):
self.new_uuids = []
from openpyxl import load_workbook
from changedetectionio.forms import validate_url
try:
wb = load_workbook(data)

View File

@@ -16,24 +16,26 @@
<form class="pure-form" action="{{url_for('imports.import_page')}}" method="POST" enctype="multipart/form-data">
<input type="hidden" name="csrf_token" value="{{ csrf_token() }}">
<div class="tab-pane-inner" id="url-list">
<div class="pure-control-group">
<legend>
Enter one URL per line, and optionally add tags for each URL after a space, delineated by comma
(,):
<br>
<p><strong>Example: </strong><code>https://example.com tag1, tag2, last tag</code></p>
<code>https://example.com tag1, tag2, last tag</code>
<br>
URLs which do not pass validation will stay in the textarea.
</div>
</legend>
{{ render_field(form.processor, class="processor") }}
<div class="pure-control-group">
<textarea name="urls" class="pure-input-1-2" placeholder="https://"
style="width: 100%;
font-family:monospace;
white-space: pre;
overflow-wrap: normal;
overflow-x: scroll;" rows="25">{{ import_url_list_remaining }}</textarea>
</div>
<div id="quick-watch-processor-type"></div>
<div id="quick-watch-processor-type">
</div>
</div>
@@ -41,7 +43,7 @@
<div class="pure-control-group">
<legend>
Copy and Paste your Distill.io watch 'export' file, this should be a JSON file.<br>
This is <i>experimental</i>, supported fields are <code>name</code>, <code>uri</code>, <code>tags</code>, <code>config:selections</code>, the rest (including <code>schedule</code>) are ignored.
<br>
@@ -49,7 +51,7 @@
How to export? <a href="https://distill.io/docs/web-monitor/how-export-and-import-monitors/">https://distill.io/docs/web-monitor/how-export-and-import-monitors/</a><br>
Be sure to set your default fetcher to Chrome if required.<br>
</p>
</div>
</legend>
<textarea name="distill-io" class="pure-input-1-2" style="width: 100%;

View File

@@ -1,32 +1,35 @@
from changedetectionio.strtobool import strtobool
from flask import Blueprint, flash, redirect, url_for
from flask_login import login_required
from changedetectionio.store import ChangeDetectionStore
from changedetectionio import queuedWatchMetaData
from queue import PriorityQueue
from changedetectionio import queuedWatchMetaData
from changedetectionio.processors.constants import PRICE_DATA_TRACK_ACCEPT, PRICE_DATA_TRACK_REJECT
PRICE_DATA_TRACK_ACCEPT = 'accepted'
PRICE_DATA_TRACK_REJECT = 'rejected'
def construct_blueprint(datastore: ChangeDetectionStore, update_q: PriorityQueue):
def construct_blueprint(datastore, update_q: PriorityQueue):
price_data_follower_blueprint = Blueprint('price_data_follower', __name__)
@login_required
@price_data_follower_blueprint.route("/<string:uuid>/accept", methods=['GET'])
def accept(uuid):
old_data = datastore.data['watching'][uuid].get_data()
datastore.data['watching'][uuid] = datastore.rehydrate_entity(default_dict=old_data, processor_override='restock_diff')
datastore.data['watching'][uuid]['track_ldjson_price_data'] = PRICE_DATA_TRACK_ACCEPT
datastore.data['watching'][uuid]['processor'] = 'restock_diff'
datastore.data['watching'][uuid].clear_watch()
# Queue the watch for updating
update_q.put(queuedWatchMetaData.PrioritizedItem(priority=1, item={'uuid': uuid}))
return redirect(url_for("watchlist.index"))
return redirect(url_for("index"))
@login_required
@price_data_follower_blueprint.route("/<string:uuid>/reject", methods=['GET'])
def reject(uuid):
datastore.data['watching'][uuid]['track_ldjson_price_data'] = PRICE_DATA_TRACK_REJECT
return redirect(url_for("watchlist.index"))
return redirect(url_for("index"))
return price_data_follower_blueprint

View File

@@ -0,0 +1,3 @@
PRICE_DATA_TRACK_ACCEPT = 'accepted'
PRICE_DATA_TRACK_REJECT = 'rejected'

View File

@@ -1 +1,103 @@
RSS_FORMAT_TYPES = [('plaintext', 'Plain text'), ('html', 'HTML Color')]
import time
import datetime
import pytz
from flask import Blueprint, make_response, request, url_for
from loguru import logger
from feedgen.feed import FeedGenerator
from changedetectionio.store import ChangeDetectionStore
from changedetectionio.safe_jinja import render as jinja_render
def construct_blueprint(datastore: ChangeDetectionStore):
rss_blueprint = Blueprint('rss', __name__)
# Import the login decorator if needed
# from changedetectionio.auth_decorator import login_optionally_required
@rss_blueprint.route("/", methods=['GET'])
def feed():
now = time.time()
# Always requires token set
app_rss_token = datastore.data['settings']['application'].get('rss_access_token')
rss_url_token = request.args.get('token')
if rss_url_token != app_rss_token:
return "Access denied, bad token", 403
from changedetectionio import diff
limit_tag = request.args.get('tag', '').lower().strip()
# Be sure limit_tag is a uuid
for uuid, tag in datastore.data['settings']['application'].get('tags', {}).items():
if limit_tag == tag.get('title', '').lower().strip():
limit_tag = uuid
# Sort by last_changed and add the uuid which is usually the key..
sorted_watches = []
# @todo needs a .itemsWithTag() or something - then we can use that in Jinaj2 and throw this away
for uuid, watch in datastore.data['watching'].items():
# @todo tag notification_muted skip also (improve Watch model)
if datastore.data['settings']['application'].get('rss_hide_muted_watches') and watch.get('notification_muted'):
continue
if limit_tag and not limit_tag in watch['tags']:
continue
watch['uuid'] = uuid
sorted_watches.append(watch)
sorted_watches.sort(key=lambda x: x.last_changed, reverse=False)
fg = FeedGenerator()
fg.title('changedetection.io')
fg.description('Feed description')
fg.link(href='https://changedetection.io')
for watch in sorted_watches:
dates = list(watch.history.keys())
# Re #521 - Don't bother processing this one if theres less than 2 snapshots, means we never had a change detected.
if len(dates) < 2:
continue
if not watch.viewed:
# Re #239 - GUID needs to be individual for each event
# @todo In the future make this a configurable link back (see work on BASE_URL https://github.com/dgtlmoon/changedetection.io/pull/228)
guid = "{}/{}".format(watch['uuid'], watch.last_changed)
fe = fg.add_entry()
# Include a link to the diff page, they will have to login here to see if password protection is enabled.
# Description is the page you watch, link takes you to the diff JS UI page
# Dict val base_url will get overriden with the env var if it is set.
ext_base_url = datastore.data['settings']['application'].get('active_base_url')
# Because we are called via whatever web server, flask should figure out the right path (
diff_link = {'href': url_for('ui.ui_views.diff_history_page', uuid=watch['uuid'], _external=True)}
fe.link(link=diff_link)
# @todo watch should be a getter - watch.get('title') (internally if URL else..)
watch_title = watch.get('title') if watch.get('title') else watch.get('url')
fe.title(title=watch_title)
html_diff = diff.render_diff(previous_version_file_contents=watch.get_history_snapshot(dates[-2]),
newest_version_file_contents=watch.get_history_snapshot(dates[-1]),
include_equal=False,
line_feed_sep="<br>")
# @todo Make this configurable and also consider html-colored markup
# @todo User could decide if <link> goes to the diff page, or to the watch link
rss_template = "<html><body>\n<h4><a href=\"{{watch_url}}\">{{watch_title}}</a></h4>\n<p>{{html_diff}}</p>\n</body></html>\n"
content = jinja_render(template_str=rss_template, watch_title=watch_title, html_diff=html_diff, watch_url=watch.link)
fe.content(content=content, type='CDATA')
fe.guid(guid, permalink=False)
dt = datetime.datetime.fromtimestamp(int(watch.newest_history_key))
dt = dt.replace(tzinfo=pytz.UTC)
fe.pubDate(dt)
response = make_response(fg.rss_str())
response.headers.set('Content-Type', 'application/rss+xml;charset=utf-8')
logger.trace(f"RSS generated in {time.time() - now:.3f}s")
return response
return rss_blueprint

View File

@@ -1,147 +0,0 @@
from changedetectionio.safe_jinja import render as jinja_render
from changedetectionio.store import ChangeDetectionStore
from feedgen.feed import FeedGenerator
from flask import Blueprint, make_response, request, url_for, redirect
from loguru import logger
import datetime
import pytz
import re
import time
BAD_CHARS_REGEX=r'[\x00-\x08\x0B\x0C\x0E-\x1F]'
# Anything that is not text/UTF-8 should be stripped before it breaks feedgen (such as binary data etc)
def scan_invalid_chars_in_rss(content):
for match in re.finditer(BAD_CHARS_REGEX, content):
i = match.start()
bad_char = content[i]
hex_value = f"0x{ord(bad_char):02x}"
# Grab context
start = max(0, i - 20)
end = min(len(content), i + 21)
context = content[start:end].replace('\n', '\\n').replace('\r', '\\r')
logger.warning(f"Invalid char {hex_value} at pos {i}: ...{context}...")
# First match is enough
return True
return False
def clean_entry_content(content):
cleaned = re.sub(BAD_CHARS_REGEX, '', content)
return cleaned
def construct_blueprint(datastore: ChangeDetectionStore):
rss_blueprint = Blueprint('rss', __name__)
# Some RSS reader situations ended up with rss/ (forward slash after RSS) due
# to some earlier blueprint rerouting work, it should goto feed.
@rss_blueprint.route("/", methods=['GET'])
def extraslash():
return redirect(url_for('rss.feed'))
# Import the login decorator if needed
# from changedetectionio.auth_decorator import login_optionally_required
@rss_blueprint.route("", methods=['GET'])
def feed():
now = time.time()
# Always requires token set
app_rss_token = datastore.data['settings']['application'].get('rss_access_token')
rss_url_token = request.args.get('token')
if rss_url_token != app_rss_token:
return "Access denied, bad token", 403
from changedetectionio import diff
limit_tag = request.args.get('tag', '').lower().strip()
# Be sure limit_tag is a uuid
for uuid, tag in datastore.data['settings']['application'].get('tags', {}).items():
if limit_tag == tag.get('title', '').lower().strip():
limit_tag = uuid
# Sort by last_changed and add the uuid which is usually the key..
sorted_watches = []
# @todo needs a .itemsWithTag() or something - then we can use that in Jinaj2 and throw this away
for uuid, watch in datastore.data['watching'].items():
# @todo tag notification_muted skip also (improve Watch model)
if datastore.data['settings']['application'].get('rss_hide_muted_watches') and watch.get('notification_muted'):
continue
if limit_tag and not limit_tag in watch['tags']:
continue
watch['uuid'] = uuid
sorted_watches.append(watch)
sorted_watches.sort(key=lambda x: x.last_changed, reverse=False)
fg = FeedGenerator()
fg.title('changedetection.io')
fg.description('Feed description')
fg.link(href='https://changedetection.io')
html_colour_enable = False
if datastore.data['settings']['application'].get('rss_content_format') == 'html':
html_colour_enable = True
for watch in sorted_watches:
dates = list(watch.history.keys())
# Re #521 - Don't bother processing this one if theres less than 2 snapshots, means we never had a change detected.
if len(dates) < 2:
continue
if not watch.viewed:
# Re #239 - GUID needs to be individual for each event
# @todo In the future make this a configurable link back (see work on BASE_URL https://github.com/dgtlmoon/changedetection.io/pull/228)
guid = "{}/{}".format(watch['uuid'], watch.last_changed)
fe = fg.add_entry()
# Include a link to the diff page, they will have to login here to see if password protection is enabled.
# Description is the page you watch, link takes you to the diff JS UI page
# Dict val base_url will get overriden with the env var if it is set.
ext_base_url = datastore.data['settings']['application'].get('active_base_url')
# @todo fix
# Because we are called via whatever web server, flask should figure out the right path (
diff_link = {'href': url_for('ui.ui_views.diff_history_page', uuid=watch['uuid'], _external=True)}
fe.link(link=diff_link)
# @todo watch should be a getter - watch.get('title') (internally if URL else..)
watch_title = watch.get('title') if watch.get('title') else watch.get('url')
fe.title(title=watch_title)
try:
html_diff = diff.render_diff(previous_version_file_contents=watch.get_history_snapshot(dates[-2]),
newest_version_file_contents=watch.get_history_snapshot(dates[-1]),
include_equal=False,
line_feed_sep="<br>",
html_colour=html_colour_enable
)
except FileNotFoundError as e:
html_diff = f"History snapshot file for watch {watch.get('uuid')}@{watch.last_changed} - '{watch.get('title')} not found."
# @todo Make this configurable and also consider html-colored markup
# @todo User could decide if <link> goes to the diff page, or to the watch link
rss_template = "<html><body>\n<h4><a href=\"{{watch_url}}\">{{watch_title}}</a></h4>\n<p>{{html_diff}}</p>\n</body></html>\n"
content = jinja_render(template_str=rss_template, watch_title=watch_title, html_diff=html_diff, watch_url=watch.link)
# Out of range chars could also break feedgen
if scan_invalid_chars_in_rss(content):
content = clean_entry_content(content)
fe.content(content=content, type='CDATA')
fe.guid(guid, permalink=False)
dt = datetime.datetime.fromtimestamp(int(watch.newest_history_key))
dt = dt.replace(tzinfo=pytz.UTC)
fe.pubDate(dt)
response = make_response(fg.rss_str())
response.headers.set('Content-Type', 'application/rss+xml;charset=utf-8')
logger.trace(f"RSS generated in {time.time() - now:.3f}s")
return response
return rss_blueprint

View File

@@ -13,7 +13,7 @@ from changedetectionio.auth_decorator import login_optionally_required
def construct_blueprint(datastore: ChangeDetectionStore):
settings_blueprint = Blueprint('settings', __name__, template_folder="templates")
@settings_blueprint.route("", methods=['GET', "POST"])
@settings_blueprint.route("/", methods=['GET', "POST"])
@login_optionally_required
def settings_page():
from changedetectionio import forms
@@ -71,12 +71,12 @@ def construct_blueprint(datastore: ChangeDetectionStore):
if not os.getenv("SALTED_PASS", False) and len(form.application.form.password.encrypted_password):
datastore.data['settings']['application']['password'] = form.application.form.password.encrypted_password
datastore.needs_write_urgent = True
datastore.save_settings()
flash("Password protection enabled.", 'notice')
flask_login.logout_user()
return redirect(url_for('watchlist.index'))
return redirect(url_for('index'))
datastore.needs_write_urgent = True
datastore.save_settings()
flash("Settings updated.")
else:
@@ -84,6 +84,24 @@ def construct_blueprint(datastore: ChangeDetectionStore):
# Convert to ISO 8601 format, all date/time relative events stored as UTC time
utc_time = datetime.now(ZoneInfo("UTC")).isoformat()
# Get processor plugins info
from changedetectionio.processors import get_all_plugins_info
plugins_info = get_all_plugins_info()
# Process settings including plugin toggles
if request.method == 'POST' and form.validate():
# Process the main form data
app_update = dict(deepcopy(form.data['application']))
# Don't update password with '' or False (Added by wtforms when not in submission)
if 'password' in app_update and not app_update['password']:
del (app_update['password'])
datastore.data['settings']['application'].update(app_update)
datastore.data['settings']['requests'].update(form.data['requests'])
datastore.save_settings()
flash("Settings updated.")
output = render_template("settings.html",
api_key=datastore.data['settings']['application'].get('api_access_token'),
@@ -93,6 +111,7 @@ def construct_blueprint(datastore: ChangeDetectionStore):
form=form,
hide_remove_pass=os.getenv("SALTED_PASS", False),
min_system_recheck_seconds=int(os.getenv('MINIMUM_SECONDS_RECHECK_TIME', 3)),
plugins_info=plugins_info,
settings_application=datastore.data['settings']['application'],
timezone_default_config=datastore.data['settings']['application'].get('timezone'),
utc_time=utc_time,
@@ -105,7 +124,6 @@ def construct_blueprint(datastore: ChangeDetectionStore):
def settings_reset_api_key():
secret = secrets.token_hex(16)
datastore.data['settings']['application']['api_access_token'] = secret
datastore.needs_write_urgent = True
flash("API Key was regenerated.")
return redirect(url_for('settings.settings_page')+'#api')

View File

@@ -9,6 +9,7 @@
const email_notification_prefix=JSON.parse('{{emailprefix|tojson}}');
{% endif %}
</script>
<script src="{{url_for('static_content', group='js', filename='tabs.js')}}" defer></script>
<script src="{{url_for('static_content', group='js', filename='plugins.js')}}" defer></script>
<script src="{{url_for('static_content', group='js', filename='notifications.js')}}" defer></script>
@@ -25,6 +26,7 @@
<li class="tab"><a href="#api">API</a></li>
<li class="tab"><a href="#timedate">Time &amp Date</a></li>
<li class="tab"><a href="#proxies">CAPTCHA &amp; Proxies</a></li>
<li class="tab"><a href="#plugins">Plugins</a></li>
</ul>
</div>
<div class="box-wrap inner">
@@ -78,10 +80,7 @@
{{ render_field(form.application.form.pager_size) }}
<span class="pure-form-message-inline">Number of items per page in the watch overview list, 0 to disable.</span>
</div>
<div class="pure-control-group">
{{ render_field(form.application.form.rss_content_format) }}
<span class="pure-form-message-inline">Love RSS? Does your reader support HTML? Set it here</span>
</div>
<div class="pure-control-group">
{{ render_checkbox_field(form.application.form.extract_title_as_title) }}
<span class="pure-form-message-inline">Note: This will automatically apply to all existing watches.</span>
@@ -299,10 +298,36 @@ nav
{{ render_field(form.requests.form.extra_browsers) }}
</div>
</div>
<div class="tab-pane-inner" id="plugins">
<div class="pure-control-group">
<h4>Registered Plugins</h4>
<p>The following plugins are currently registered in the system - <a href="https://changedetection.io/plugins">Get more plugins here</a></p>
<table class="pure-table pure-table-striped">
<thead>
<tr>
<th>Name</th>
<th>Description</th>
<th>Version</th>
</tr>
</thead>
<tbody>
{% for plugin in plugins_info %}
<tr>
<td>{{ plugin.name }}</td>
<td>{{ plugin.description }}</td>
<td>{{ plugin.version }}</td>
</tr>
{% endfor %}
</tbody>
</table>
</div>
</div>
<div id="actions">
<div class="pure-control-group">
{{ render_button(form.save_button) }}
<a href="{{url_for('watchlist.index')}}" class="pure-button button-small button-cancel">Back</a>
<a href="{{url_for('index')}}" class="pure-button button-small button-cancel">Back</a>
<a href="{{url_for('ui.clear_all_history')}}" class="pure-button button-small button-error">Clear Snapshot History</a>
</div>
</div>

View File

@@ -56,6 +56,7 @@ def construct_blueprint(datastore: ChangeDetectionStore):
def mute(uuid):
if datastore.data['settings']['application']['tags'].get(uuid):
datastore.data['settings']['application']['tags'][uuid]['notification_muted'] = not datastore.data['settings']['application']['tags'][uuid]['notification_muted']
datastore.data['settings']['application']['tags'][uuid].save_data()
return redirect(url_for('tags.tags_overview_page'))
@tags_blueprint.route("/delete/<string:uuid>", methods=['GET'])
@@ -176,7 +177,8 @@ def construct_blueprint(datastore: ChangeDetectionStore):
datastore.data['settings']['application']['tags'][uuid].update(form.data)
datastore.data['settings']['application']['tags'][uuid]['processor'] = 'restock_diff'
datastore.needs_write_urgent = True
datastore.data['settings']['application']['tags'][uuid].save_data()
flash("Updated")
return redirect(url_for('tags.tags_overview_page'))

View File

@@ -47,7 +47,7 @@
<a class="link-mute state-{{'on' if tag.notification_muted else 'off'}}" href="{{url_for('tags.mute', uuid=tag.uuid)}}"><img src="{{url_for('static_content', group='images', filename='bell-off.svg')}}" alt="Mute notifications" title="Mute notifications" class="icon icon-mute" ></a>
</td>
<td>{{ "{:,}".format(tag_count[uuid]) if uuid in tag_count else 0 }}</td>
<td class="title-col inline"> <a href="{{url_for('watchlist.index', tag=uuid) }}">{{ tag.title }}</a></td>
<td class="title-col inline"> <a href="{{url_for('index', tag=uuid) }}">{{ tag.title }}</a></td>
<td>
<a class="pure-button pure-button-primary" href="{{ url_for('tags.form_tag_edit', uuid=uuid) }}">Edit</a>&nbsp;
<a class="pure-button pure-button-primary" href="{{ url_for('tags.delete', uuid=uuid) }}" title="Deletes and removes tag">Delete</a>

View File

@@ -36,7 +36,7 @@ def construct_blueprint(datastore: ChangeDetectionStore, update_q, running_updat
else:
flash("Cleared snapshot history for watch {}".format(uuid))
return redirect(url_for('watchlist.index'))
return redirect(url_for('index'))
@ui_blueprint.route("/clear_history", methods=['GET', 'POST'])
@login_optionally_required
@@ -52,7 +52,7 @@ def construct_blueprint(datastore: ChangeDetectionStore, update_q, running_updat
else:
flash('Incorrect confirmation text.', 'error')
return redirect(url_for('watchlist.index'))
return redirect(url_for('index'))
output = render_template("clear_all_history.html")
return output
@@ -68,7 +68,7 @@ def construct_blueprint(datastore: ChangeDetectionStore, update_q, running_updat
continue
datastore.set_last_viewed(watch_uuid, int(time.time()))
return redirect(url_for('watchlist.index'))
return redirect(url_for('index'))
@ui_blueprint.route("/delete", methods=['GET'])
@login_optionally_required
@@ -77,7 +77,7 @@ def construct_blueprint(datastore: ChangeDetectionStore, update_q, running_updat
if uuid != 'all' and not uuid in datastore.data['watching'].keys():
flash('The watch by UUID {} does not exist.'.format(uuid), 'error')
return redirect(url_for('watchlist.index'))
return redirect(url_for('index'))
# More for testing, possible to return the first/only
if uuid == 'first':
@@ -85,7 +85,7 @@ def construct_blueprint(datastore: ChangeDetectionStore, update_q, running_updat
datastore.delete(uuid)
flash('Deleted.')
return redirect(url_for('watchlist.index'))
return redirect(url_for('index'))
@ui_blueprint.route("/clone", methods=['GET'])
@login_optionally_required
@@ -96,13 +96,12 @@ def construct_blueprint(datastore: ChangeDetectionStore, update_q, running_updat
uuid = list(datastore.data['watching'].keys()).pop()
new_uuid = datastore.clone(uuid)
if new_uuid:
if not datastore.data['watching'].get(uuid).get('paused'):
update_q.put(queuedWatchMetaData.PrioritizedItem(priority=5, item={'uuid': new_uuid}))
flash('Cloned.')
if not datastore.data['watching'].get(uuid).get('paused'):
update_q.put(queuedWatchMetaData.PrioritizedItem(priority=5, item={'uuid': new_uuid}))
flash('Cloned, you are editing the new watch.')
return redirect(url_for("ui.ui_edit.edit_page", uuid=new_uuid))
return redirect(url_for('index'))
@ui_blueprint.route("/checknow", methods=['GET'])
@login_optionally_required
@@ -144,7 +143,7 @@ def construct_blueprint(datastore: ChangeDetectionStore, update_q, running_updat
if i == 0:
flash("No watches available to recheck.")
return redirect(url_for('watchlist.index'))
return redirect(url_for('index'))
@ui_blueprint.route("/form/checkbox-operations", methods=['POST'])
@login_optionally_required
@@ -164,6 +163,7 @@ def construct_blueprint(datastore: ChangeDetectionStore, update_q, running_updat
uuid = uuid.strip()
if datastore.data['watching'].get(uuid):
datastore.data['watching'][uuid.strip()]['paused'] = True
datastore.data['watching'][uuid.strip()].save_data()
flash("{} watches paused".format(len(uuids)))
elif (op == 'unpause'):
@@ -171,6 +171,7 @@ def construct_blueprint(datastore: ChangeDetectionStore, update_q, running_updat
uuid = uuid.strip()
if datastore.data['watching'].get(uuid):
datastore.data['watching'][uuid.strip()]['paused'] = False
datastore.data['watching'][uuid.strip()].save_data()
flash("{} watches unpaused".format(len(uuids)))
elif (op == 'mark-viewed'):
@@ -185,6 +186,7 @@ def construct_blueprint(datastore: ChangeDetectionStore, update_q, running_updat
uuid = uuid.strip()
if datastore.data['watching'].get(uuid):
datastore.data['watching'][uuid.strip()]['notification_muted'] = True
datastore.data['watching'][uuid.strip()].save_data()
flash("{} watches muted".format(len(uuids)))
elif (op == 'unmute'):
@@ -192,6 +194,7 @@ def construct_blueprint(datastore: ChangeDetectionStore, update_q, running_updat
uuid = uuid.strip()
if datastore.data['watching'].get(uuid):
datastore.data['watching'][uuid.strip()]['notification_muted'] = False
datastore.data['watching'][uuid.strip()].save_data()
flash("{} watches un-muted".format(len(uuids)))
elif (op == 'recheck'):
@@ -207,6 +210,7 @@ def construct_blueprint(datastore: ChangeDetectionStore, update_q, running_updat
uuid = uuid.strip()
if datastore.data['watching'].get(uuid):
datastore.data['watching'][uuid]["last_error"] = False
datastore.data['watching'][uuid].save_data()
flash(f"{len(uuids)} watches errors cleared")
elif (op == 'clear-history'):
@@ -232,7 +236,7 @@ def construct_blueprint(datastore: ChangeDetectionStore, update_q, running_updat
elif (op == 'assign-tag'):
op_extradata = request.form.get('op_extradata', '').strip()
if op_extradata:
tag_uuid = datastore.add_tag(title=op_extradata)
tag_uuid = datastore.add_tag(name=op_extradata)
if op_extradata and tag_uuid:
for uuid in uuids:
uuid = uuid.strip()
@@ -245,7 +249,10 @@ def construct_blueprint(datastore: ChangeDetectionStore, update_q, running_updat
flash(f"{len(uuids)} watches were tagged")
return redirect(url_for('watchlist.index'))
for uuid in uuids:
datastore.data['watching'][uuid.strip()].save_data()
return redirect(url_for('index'))
@ui_blueprint.route("/share-url/<string:uuid>", methods=['GET'])
@@ -297,6 +304,6 @@ def construct_blueprint(datastore: ChangeDetectionStore, update_q, running_updat
logger.error(f"Error sharing -{str(e)}")
flash(f"Could not share, something went wrong while communicating with the share server - {str(e)}", 'error')
return redirect(url_for('watchlist.index'))
return redirect(url_for('index'))
return ui_blueprint
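
Each branch of the checkbox-operations handler above now follows the same "set a flag, then persist that single watch via save_data()" pattern. A minimal sketch of how that repetition could be factored out (helper name and placement are illustrative, not part of the diff):

def set_flag_and_save(datastore, uuids, key, value):
    # Apply one field change to each selected watch and persist it immediately,
    # instead of waiting for the periodic datastore flush.
    for uuid in uuids:
        uuid = uuid.strip()
        watch = datastore.data['watching'].get(uuid)
        if watch:
            watch[key] = value
            watch.save_data()

# e.g. set_flag_and_save(datastore, uuids, 'paused', True) for the 'pause' operation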

View File

@@ -24,7 +24,6 @@ def construct_blueprint(datastore: ChangeDetectionStore, update_q, queuedWatchMe
# https://stackoverflow.com/questions/42984453/wtforms-populate-form-with-data-if-data-exists
# https://wtforms.readthedocs.io/en/3.0.x/forms/#wtforms.form.Form.populate_obj ?
def edit_page(uuid):
from changedetectionio import forms
from changedetectionio.blueprint.browser_steps.browser_steps import browser_step_ui_config
from changedetectionio import processors
import importlib
@@ -32,26 +31,26 @@ def construct_blueprint(datastore: ChangeDetectionStore, update_q, queuedWatchMe
# More for testing, possible to return the first/only
if not datastore.data['watching'].keys():
flash("No watches to edit", "error")
return redirect(url_for('watchlist.index'))
return redirect(url_for('index'))
if uuid == 'first':
uuid = list(datastore.data['watching'].keys()).pop()
if not uuid in datastore.data['watching']:
flash("No watch with the UUID %s found." % (uuid), "error")
return redirect(url_for('watchlist.index'))
return redirect(url_for('index'))
switch_processor = request.args.get('switch_processor')
if switch_processor:
for p in processors.available_processors():
for p in processors.available_processors(datastore):
if p[0] == switch_processor:
datastore.data['watching'][uuid]['processor'] = switch_processor
flash(f"Switched to mode - {p[1]}.")
datastore.clear_watch_history(uuid)
redirect(url_for('ui_edit.edit_page', uuid=uuid))
# be sure we update with a copy instead of accidentally editing the live object by reference
default = deepcopy(datastore.data['watching'][uuid])
default = datastore.data['watching'][uuid]
# Defaults for proxy choice
if datastore.proxy_list is not None: # When enabled
@@ -61,31 +60,19 @@ def construct_blueprint(datastore: ChangeDetectionStore, update_q, queuedWatchMe
default['proxy'] = ''
# proxy_override set to the json/text list of the items
# Does it use some custom form? does one exist?
processor_name = datastore.data['watching'][uuid].get('processor', '')
processor_classes = next((tpl for tpl in processors.find_processors() if tpl[1] == processor_name), None)
if not processor_classes:
flash(f"Cannot load the edit form for processor/plugin '{processor_classes[1]}', plugin missing?", 'error')
return redirect(url_for('watchlist.index'))
parent_module = processors.get_parent_module(processor_classes[0])
try:
# Get the parent of the "processor.py" go up one, get the form (kinda spaghetti but its reusing existing code)
forms_module = importlib.import_module(f"{parent_module.__name__}.forms")
# Access the 'processor_settings_form' class from the 'forms' module
form_class = getattr(forms_module, 'processor_settings_form')
except ModuleNotFoundError as e:
# .forms didnt exist
form_class = forms.processor_text_json_diff_form
except AttributeError as e:
# .forms exists but no useful form
form_class = forms.processor_text_json_diff_form
# Get the appropriate form class for this processor using the pluggy system
processor_name = datastore.data['watching'][uuid].get('processor', 'text_json_diff')
form_class = processors.get_form_class_for_processor(processor_name)
if not form_class:
flash(f"Cannot load the edit form for processor/plugin '{processor_name}', plugin missing?", 'error')
return redirect(url_for('index'))
form = form_class(formdata=request.form if request.method == 'POST' else None,
data=default,
extra_notification_tokens=default.extra_notification_token_values(),
default_system_settings=datastore.data['settings']
default_system_settings=datastore.data['settings'],
datastore=datastore
)
# For the form widget tag UUID back to "string name" for the field
@@ -127,10 +114,7 @@ def construct_blueprint(datastore: ChangeDetectionStore, update_q, queuedWatchMe
extra_update_obj['paused'] = False
extra_update_obj['time_between_check'] = form.time_between_check.data
# Ignore text
form_ignore_text = form.ignore_text.data
datastore.data['watching'][uuid]['ignore_text'] = form_ignore_text
extra_update_obj['ignore_text'] = form.ignore_text.data
# Be sure proxy value is None
if datastore.proxy_list is not None and form.data['proxy'] == '':
@@ -153,25 +137,26 @@ def construct_blueprint(datastore: ChangeDetectionStore, update_q, queuedWatchMe
extra_update_obj['tags'] = form.data.get('tags')
else:
for t in form.data.get('tags').split(','):
tag_uuids.append(datastore.add_tag(title=t))
tag_uuids.append(datastore.add_tag(name=t))
extra_update_obj['tags'] = tag_uuids
datastore.data['watching'][uuid].update(form.data)
datastore.data['watching'][uuid].update(extra_update_obj)
if not datastore.data['watching'][uuid].get('tags'):
# Force it to be a list, because form.data['tags'] will be a string if nothing is found
# And del(form.data['tags']) won't work either for some reason
datastore.data['watching'][uuid]['tags'] = []
datastore.update_watch(uuid=uuid, update_obj=form.data | extra_update_obj)
# Recast it if need be to right data Watch handler
watch_class = processors.get_custom_watch_obj_for_processor(form.data.get('processor'))
processor_name = datastore.data['watching'][uuid].get('processor')
watch_class = processors.get_watch_model_for_processor(processor_name)
datastore.data['watching'][uuid] = watch_class(datastore_path=datastore.datastore_path, default=datastore.data['watching'][uuid])
datastore.data['watching'][uuid].save_data()
flash("Updated watch - unpaused!" if request.args.get('unpause_on_save') else "Updated watch.")
# Re #286 - We wait for syncing new data to disk in another thread every 60 seconds
# But in the case something is added we should save straight away
datastore.needs_write_urgent = True
# Do not queue on edit if its not within the time range
@@ -198,6 +183,7 @@ def construct_blueprint(datastore: ChangeDetectionStore, update_q, queuedWatchMe
f"{uuid} - Recheck scheduler, error handling timezone, check skipped - TZ name '{tz_name}' - {str(e)}")
return False
#############################
if not datastore.data['watching'][uuid].get('paused') and is_in_schedule:
# Queue the watch for immediate recheck, with a higher priority
@@ -207,7 +193,7 @@ def construct_blueprint(datastore: ChangeDetectionStore, update_q, queuedWatchMe
if request.args.get("next") and request.args.get("next") == 'diff':
return redirect(url_for('ui.ui_views.diff_history_page', uuid=uuid))
return redirect(url_for('watchlist.index', tag=request.args.get("tag",'')))
return redirect(url_for('index', tag=request.args.get("tag",'')))
else:
if request.method == 'POST' and not form.validate():
@@ -236,7 +222,7 @@ def construct_blueprint(datastore: ChangeDetectionStore, update_q, queuedWatchMe
# Only works reliably with Playwright
template_args = {
'available_processors': processors.available_processors(),
'available_processors': processors.available_processors(datastore),
'available_timezones': sorted(available_timezones()),
'browser_steps_config': browser_step_ui_config,
'emailprefix': os.getenv('NOTIFICATION_MAIL_BUTTON_PREFIX', False),
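
A hedged usage sketch of the new form-resolution path above: the per-processor form class comes from the pluggy-backed lookup and is constructed with the extra datastore keyword (argument names follow the diff; treat this as an illustration, not the exact implementation):

from changedetectionio import processors

def build_edit_form(datastore, uuid, formdata=None):
    watch = datastore.data['watching'][uuid]
    processor_name = watch.get('processor', 'text_json_diff')
    # Falls back to the built-in text/JSON diff form when no plugin matches
    form_class = processors.get_form_class_for_processor(processor_name)
    return form_class(formdata=formdata,
                      data=watch,
                      extra_notification_tokens=watch.extra_notification_token_values(),
                      default_system_settings=datastore.data['settings'],
                      datastore=datastore)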

View File

@@ -4,7 +4,6 @@ from loguru import logger
from changedetectionio.store import ChangeDetectionStore
from changedetectionio.auth_decorator import login_optionally_required
from changedetectionio.notification import process_notification
def construct_blueprint(datastore: ChangeDetectionStore):
notification_blueprint = Blueprint('ui_notification', __name__, template_folder="../ui/templates")
@@ -18,10 +17,11 @@ def construct_blueprint(datastore: ChangeDetectionStore):
# Watch_uuid could be unset in the case it's used in the tag editor or global settings
import apprise
from ...apprise_plugin.assets import apprise_asset
from ...apprise_plugin.custom_handlers import apprise_http_custom_handler # noqa: F401
apobj = apprise.Apprise(asset=apprise_asset)
from changedetectionio.apprise_asset import asset
apobj = apprise.Apprise(asset=asset)
# so that the custom endpoints are registered
from changedetectionio.apprise_plugin import apprise_custom_api_call_wrapper
is_global_settings_form = request.args.get('mode', '') == 'global-settings'
is_group_settings_form = request.args.get('mode', '') == 'group-settings'
@@ -90,6 +90,7 @@ def construct_blueprint(datastore: ChangeDetectionStore):
n_object['as_async'] = False
n_object.update(watch.extra_notification_token_values())
from changedetectionio.notification import process_notification
sent_obj = process_notification(n_object, datastore)
except Exception as e:

View File

@@ -37,7 +37,7 @@
</div>
<br />
<div class="pure-control-group">
<a href="{{url_for('watchlist.index')}}" class="pure-button button-cancel"
<a href="{{url_for('index')}}" class="pure-button button-cancel"
>Cancel</a
>
</div>

View File

@@ -26,7 +26,7 @@ def construct_blueprint(datastore: ChangeDetectionStore, update_q, queuedWatchMe
watch = datastore.data['watching'][uuid]
except KeyError:
flash("No history found for the specified link, bad link?", "error")
return redirect(url_for('watchlist.index'))
return redirect(url_for('index'))
system_uses_webdriver = datastore.data['settings']['application']['fetch_backend'] == 'html_webdriver'
extra_stylesheets = [url_for('static_content', group='styles', filename='diff.css')]
@@ -91,7 +91,7 @@ def construct_blueprint(datastore: ChangeDetectionStore, update_q, queuedWatchMe
watch = datastore.data['watching'][uuid]
except KeyError:
flash("No history found for the specified link, bad link?", "error")
return redirect(url_for('watchlist.index'))
return redirect(url_for('index'))
# For submission of requesting an extract
extract_form = forms.extractDataForm(request.form)
@@ -119,7 +119,7 @@ def construct_blueprint(datastore: ChangeDetectionStore, update_q, queuedWatchMe
if len(dates) < 2:
flash("Not enough saved change detection snapshots to produce a report.", "error")
return redirect(url_for('watchlist.index'))
return redirect(url_for('index'))
# Save the current newest history as the most recently viewed
datastore.set_last_viewed(uuid, time.time())
@@ -191,12 +191,12 @@ def construct_blueprint(datastore: ChangeDetectionStore, update_q, queuedWatchMe
@login_optionally_required
def form_quick_watch_add():
from changedetectionio import forms
form = forms.quickWatchForm(request.form)
form = forms.quickWatchForm(request.form, datastore=datastore)
if not form.validate():
for widget, l in form.errors.items():
flash(','.join(l), 'error')
return redirect(url_for('watchlist.index'))
return redirect(url_for('index'))
url = request.form.get('url').strip()
if datastore.url_exists(url):
@@ -215,6 +215,6 @@ def construct_blueprint(datastore: ChangeDetectionStore, update_q, queuedWatchMe
update_q.put(queuedWatchMetaData.PrioritizedItem(priority=1, item={'uuid': new_uuid}))
flash("Watch added.")
return redirect(url_for('watchlist.index', tag=request.args.get('tag','')))
return redirect(url_for('index', tag=request.args.get('tag','')))
return views_blueprint

View File

@@ -1,111 +0,0 @@
import os
import time
from flask import Blueprint, request, make_response, render_template, redirect, url_for, flash, session
from flask_login import current_user
from flask_paginate import Pagination, get_page_parameter
from changedetectionio import forms
from changedetectionio.store import ChangeDetectionStore
from changedetectionio.auth_decorator import login_optionally_required
def construct_blueprint(datastore: ChangeDetectionStore, update_q, queuedWatchMetaData):
watchlist_blueprint = Blueprint('watchlist', __name__, template_folder="templates")
@watchlist_blueprint.route("/", methods=['GET'])
@login_optionally_required
def index():
active_tag_req = request.args.get('tag', '').lower().strip()
active_tag_uuid = active_tag = None
# Be sure limit_tag is a uuid
if active_tag_req:
for uuid, tag in datastore.data['settings']['application'].get('tags', {}).items():
if active_tag_req == tag.get('title', '').lower().strip() or active_tag_req == uuid:
active_tag = tag
active_tag_uuid = uuid
break
# Redirect for the old rss path which used the /?rss=true
if request.args.get('rss'):
return redirect(url_for('rss.feed', tag=active_tag_uuid))
op = request.args.get('op')
if op:
uuid = request.args.get('uuid')
if op == 'pause':
datastore.data['watching'][uuid].toggle_pause()
elif op == 'mute':
datastore.data['watching'][uuid].toggle_mute()
datastore.needs_write = True
return redirect(url_for('watchlist.index', tag = active_tag_uuid))
# Sort by last_changed and add the uuid which is usually the key..
sorted_watches = []
with_errors = request.args.get('with_errors') == "1"
errored_count = 0
search_q = request.args.get('q').strip().lower() if request.args.get('q') else False
for uuid, watch in datastore.data['watching'].items():
if with_errors and not watch.get('last_error'):
continue
if active_tag_uuid and not active_tag_uuid in watch['tags']:
continue
if watch.get('last_error'):
errored_count += 1
if search_q:
if (watch.get('title') and search_q in watch.get('title').lower()) or search_q in watch.get('url', '').lower():
sorted_watches.append(watch)
elif watch.get('last_error') and search_q in watch.get('last_error').lower():
sorted_watches.append(watch)
else:
sorted_watches.append(watch)
form = forms.quickWatchForm(request.form)
page = request.args.get(get_page_parameter(), type=int, default=1)
total_count = len(sorted_watches)
pagination = Pagination(page=page,
total=total_count,
per_page=datastore.data['settings']['application'].get('pager_size', 50), css_framework="semantic")
sorted_tags = sorted(datastore.data['settings']['application'].get('tags').items(), key=lambda x: x[1]['title'])
output = render_template(
"watch-overview.html",
active_tag=active_tag,
active_tag_uuid=active_tag_uuid,
app_rss_token=datastore.data['settings']['application'].get('rss_access_token'),
datastore=datastore,
errored_count=errored_count,
form=form,
guid=datastore.data['app_guid'],
has_proxies=datastore.proxy_list,
has_unviewed=datastore.has_unviewed,
hosted_sticky=os.getenv("SALTED_PASS", False) == False,
now_time_server=time.time(),
pagination=pagination,
queued_uuids=[q_uuid.item['uuid'] for q_uuid in update_q.queue],
search_q=request.args.get('q', '').strip(),
sort_attribute=request.args.get('sort') if request.args.get('sort') else request.cookies.get('sort'),
sort_order=request.args.get('order') if request.args.get('order') else request.cookies.get('order'),
system_default_fetcher=datastore.data['settings']['application'].get('fetch_backend'),
tags=sorted_tags,
watches=sorted_watches
)
if session.get('share-link'):
del(session['share-link'])
resp = make_response(output)
# The template can run on cookie or url query info
if request.args.get('sort'):
resp.set_cookie('sort', request.args.get('sort'))
if request.args.get('order'):
resp.set_cookie('order', request.args.get('order'))
return resp
return watchlist_blueprint

View File

@@ -116,7 +116,8 @@ def execute_ruleset_against_all_plugins(current_watch_uuid: str, application_dat
if not jsonLogic(logic=ruleset, data=EXECUTE_DATA, operations=CUSTOM_OPERATIONS):
result = False
return {'executed_data': EXECUTE_DATA, 'result': result}
return result
# Load plugins dynamically
for plugin in plugin_manager.get_plugins():
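
Assuming execute_ruleset_against_all_plugins() now returns the dict shown above (keys 'result' and 'executed_data') rather than a bare boolean, a caller would consume it roughly like this (argument list abbreviated to the two parameters visible in the hunk header):

# Hedged sketch only; uses the loguru logger already imported in this module
outcome = execute_ruleset_against_all_plugins(watch_uuid, datastore.data)
if outcome.get('result'):
    logger.debug("All conditions passed")
else:
    # 'executed_data' carries the values each plugin contributed to the evaluation
    logger.debug(f"Conditions failed, evaluated data: {outcome.get('executed_data')}")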

View File

@@ -67,8 +67,7 @@ def construct_blueprint(datastore):
return jsonify({
'status': 'success',
'result': result.get('result'),
'data': result.get('executed_data'),
'result': result,
'message': 'Condition passes' if result else 'Condition does not pass'
})

View File

@@ -1,8 +1,10 @@
#!/usr/bin/env python3
import flask_login
import locale
import os
import pytz
import queue
import threading
import time
@@ -33,8 +35,7 @@ from loguru import logger
from changedetectionio import __version__
from changedetectionio import queuedWatchMetaData
from changedetectionio.api import Watch, WatchHistory, WatchSingleHistory, CreateWatch, Import, SystemInfo, Tag, Tags
from changedetectionio.api.Search import Search
from changedetectionio.api import api_v1
from .time_handler import is_within_schedule
datastore = None
@@ -74,6 +75,7 @@ if os.getenv('FLASK_SERVER_NAME'):
# Disables caching of the templates
app.config['TEMPLATES_AUTO_RELOAD'] = True
app.jinja_env.add_extension('jinja2.ext.loopcontrols')
app.jinja_env.globals.update(hasattr=hasattr)
csrf = CSRFProtect()
csrf.init_app(app)
notification_debug_log=[]
@@ -123,18 +125,14 @@ def _jinja2_filter_format_number_locale(value: float) -> str:
return formatted_value
@app.template_global('is_checking_now')
def _watch_is_checking_now(watch_obj, format="%Y-%m-%d %H:%M:%S"):
# Worker thread tells us which UUID it is currently processing.
for t in running_update_threads:
if t.current_uuid == watch_obj['uuid']:
return True
# We use the whole watch object from the store/JSON so we can see if there's some related status in terms of a thread
# running or something similar.
@app.template_filter('format_last_checked_time')
def _jinja2_filter_datetime(watch_obj, format="%Y-%m-%d %H:%M:%S"):
# Worker thread tells us which UUID it is currently processing.
for t in running_update_threads:
if t.current_uuid == watch_obj['uuid']:
return '<span class="spinner"></span><span> Checking now</span>'
if watch_obj['last_checked'] == 0:
return 'Not yet'
@@ -233,7 +231,7 @@ def changedetection_app(config=None, datastore_o=None):
if has_password_enabled and not flask_login.current_user.is_authenticated:
# Permitted
if request.endpoint and request.endpoint == 'static_content' and request.view_args and request.view_args.get('group') in ['styles', 'js', 'images', 'favicons']:
if request.endpoint and 'static_content' in request.endpoint and request.view_args and request.view_args.get('group') == 'styles':
return None
# Permitted
elif request.endpoint and 'login' in request.endpoint:
@@ -247,42 +245,34 @@ def changedetection_app(config=None, datastore_o=None):
# RSS access with token is allowed
elif request.endpoint and 'rss.feed' in request.endpoint:
return None
# API routes - use their own auth mechanism (@auth.check_token)
elif request.path.startswith('/api/'):
return None
else:
return login_manager.unauthorized()
watch_api.add_resource(WatchSingleHistory,
watch_api.add_resource(api_v1.WatchSingleHistory,
'/api/v1/watch/<string:uuid>/history/<string:timestamp>',
resource_class_kwargs={'datastore': datastore, 'update_q': update_q})
watch_api.add_resource(WatchHistory,
watch_api.add_resource(api_v1.WatchHistory,
'/api/v1/watch/<string:uuid>/history',
resource_class_kwargs={'datastore': datastore})
watch_api.add_resource(CreateWatch, '/api/v1/watch',
watch_api.add_resource(api_v1.CreateWatch, '/api/v1/watch',
resource_class_kwargs={'datastore': datastore, 'update_q': update_q})
watch_api.add_resource(Watch, '/api/v1/watch/<string:uuid>',
watch_api.add_resource(api_v1.Watch, '/api/v1/watch/<string:uuid>',
resource_class_kwargs={'datastore': datastore, 'update_q': update_q})
watch_api.add_resource(SystemInfo, '/api/v1/systeminfo',
watch_api.add_resource(api_v1.SystemInfo, '/api/v1/systeminfo',
resource_class_kwargs={'datastore': datastore, 'update_q': update_q})
watch_api.add_resource(Import,
watch_api.add_resource(api_v1.Import,
'/api/v1/import',
resource_class_kwargs={'datastore': datastore})
watch_api.add_resource(Tags, '/api/v1/tags',
resource_class_kwargs={'datastore': datastore})
watch_api.add_resource(Tag, '/api/v1/tag', '/api/v1/tag/<string:uuid>',
resource_class_kwargs={'datastore': datastore})
watch_api.add_resource(Search, '/api/v1/search',
resource_class_kwargs={'datastore': datastore})
# Setup cors headers to allow all domains
# https://flask-cors.readthedocs.io/en/latest/
# CORS(app)
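
The registrations above all use Flask-RESTful's resource_class_kwargs to hand shared objects (datastore, update queue) to every resource instance. A generic, self-contained sketch of that mechanism (class and values here are placeholders, not changedetection.io code):

from flask import Flask
from flask_restful import Api, Resource

class QueueInfo(Resource):
    def __init__(self, datastore=None, update_q=None):
        # Flask-RESTful passes resource_class_kwargs into the constructor
        self.datastore = datastore
        self.update_q = update_q

    def get(self):
        return {'queue_size': self.update_q.qsize() if self.update_q else 0}

app = Flask(__name__)
api = Api(app)
api.add_resource(QueueInfo, '/api/v1/queueinfo',
                 resource_class_kwargs={'datastore': {}, 'update_q': None})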
@@ -295,12 +285,12 @@ def changedetection_app(config=None, datastore_o=None):
@login_manager.unauthorized_handler
def unauthorized_handler():
flash("You must be logged in, please log in.", 'error')
return redirect(url_for('login', next=url_for('watchlist.index')))
return redirect(url_for('login', next=url_for('index')))
@app.route('/logout')
def logout():
flask_login.logout_user()
return redirect(url_for('watchlist.index'))
return redirect(url_for('index'))
# https://github.com/pallets/flask/blob/93dd1709d05a1cf0e886df6223377bdab3b077fb/examples/tutorial/flaskr/__init__.py#L39
# You can divide up the stuff like this
@@ -310,7 +300,7 @@ def changedetection_app(config=None, datastore_o=None):
if request.method == 'GET':
if flask_login.current_user.is_authenticated:
flash("Already logged in")
return redirect(url_for("watchlist.index"))
return redirect(url_for("index"))
output = render_template("login.html")
return output
@@ -327,13 +317,13 @@ def changedetection_app(config=None, datastore_o=None):
# It's more reliable and safe to ignore the 'next' redirect
# When we used...
# next = request.args.get('next')
# return redirect(next or url_for('watchlist.index'))
# return redirect(next or url_for('index'))
# We would sometimes get login loop errors on sites hosted in sub-paths
# note for the future:
# if not is_safe_url(next):
# return flask.abort(400)
return redirect(url_for('watchlist.index'))
return redirect(url_for('index'))
else:
flash('Incorrect password', 'error')
@@ -346,8 +336,109 @@ def changedetection_app(config=None, datastore_o=None):
if os.getenv('USE_X_SETTINGS') and 'X-Forwarded-Prefix' in request.headers:
app.config['REMEMBER_COOKIE_PATH'] = request.headers['X-Forwarded-Prefix']
app.config['SESSION_COOKIE_PATH'] = request.headers['X-Forwarded-Prefix']
return None
@app.route("/", methods=['GET'])
@login_optionally_required
def index():
global datastore
from changedetectionio.forms import quickWatchForm
active_tag_req = request.args.get('tag', '').lower().strip()
active_tag_uuid = active_tag = None
# Be sure limit_tag is a uuid
if active_tag_req:
for uuid, tag in datastore.data['settings']['application'].get('tags', {}).items():
if active_tag_req == tag.get('title', '').lower().strip() or active_tag_req == uuid:
active_tag = tag
active_tag_uuid = uuid
break
# Redirect for the old rss path which used the /?rss=true
if request.args.get('rss'):
return redirect(url_for('rss.feed', tag=active_tag_uuid))
op = request.args.get('op')
if op:
uuid = request.args.get('uuid')
if op == 'pause':
datastore.data['watching'][uuid].toggle_pause()
elif op == 'mute':
datastore.data['watching'][uuid].toggle_mute()
return redirect(url_for('index', tag = active_tag_uuid))
# Sort by last_changed and add the uuid which is usually the key..
sorted_watches = []
with_errors = request.args.get('with_errors') == "1"
errored_count = 0
search_q = request.args.get('q').strip().lower() if request.args.get('q') else False
for uuid, watch in datastore.data['watching'].items():
if with_errors and not watch.get('last_error'):
continue
if active_tag_uuid and not active_tag_uuid in watch['tags']:
continue
if watch.get('last_error'):
errored_count += 1
if search_q:
if (watch.get('title') and search_q in watch.get('title').lower()) or search_q in watch.get('url', '').lower():
sorted_watches.append(watch)
elif watch.get('last_error') and search_q in watch.get('last_error').lower():
sorted_watches.append(watch)
else:
sorted_watches.append(watch)
form = quickWatchForm(request.form, datastore=datastore)
page = request.args.get(get_page_parameter(), type=int, default=1)
total_count = len(sorted_watches)
pagination = Pagination(page=page,
total=total_count,
per_page=datastore.data['settings']['application'].get('pager_size', 50), css_framework="semantic")
sorted_tags = sorted(datastore.data['settings']['application'].get('tags').items(), key=lambda x: x[1]['title'])
output = render_template(
"watch-overview.html",
# Don't link to hosting when we're on the hosting environment
active_tag=active_tag,
active_tag_uuid=active_tag_uuid,
app_rss_token=datastore.data['settings']['application'].get('rss_access_token'),
datastore=datastore,
errored_count=errored_count,
form=form,
guid=datastore.data['app_guid'],
has_proxies=datastore.proxy_list,
has_unviewed=datastore.has_unviewed,
hosted_sticky=os.getenv("SALTED_PASS", False) == False,
pagination=pagination,
queued_uuids=[q_uuid.item['uuid'] for q_uuid in update_q.queue],
search_q=request.args.get('q','').strip(),
sort_attribute=request.args.get('sort') if request.args.get('sort') else request.cookies.get('sort'),
sort_order=request.args.get('order') if request.args.get('order') else request.cookies.get('order'),
system_default_fetcher=datastore.data['settings']['application'].get('fetch_backend'),
tags=sorted_tags,
watches=sorted_watches
)
if session.get('share-link'):
del(session['share-link'])
resp = make_response(output)
# The template can run on cookie or url query info
if request.args.get('sort'):
resp.set_cookie('sort', request.args.get('sort'))
if request.args.get('order'):
resp.set_cookie('order', request.args.get('order'))
return resp
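
The pagination wiring in the relocated index() route boils down to standard flask-paginate usage; a self-contained sketch (route, item list and template are placeholders):

from flask import Flask, request, render_template_string
from flask_paginate import Pagination, get_page_parameter

app = Flask(__name__)

@app.route("/items")
def items():
    all_items = [f"item-{i}" for i in range(123)]
    page = request.args.get(get_page_parameter(), type=int, default=1)
    per_page = 50
    pagination = Pagination(page=page, total=len(all_items),
                            per_page=per_page, css_framework="semantic")
    visible = all_items[(page - 1) * per_page: page * per_page]
    return render_template_string("{{ items }} {{ pagination.links }}",
                                  items=visible, pagination=pagination)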
@app.route("/static/<string:group>/<string:filename>", methods=['GET'])
def static_content(group, filename):
from flask import make_response
@@ -433,15 +524,12 @@ def changedetection_app(config=None, datastore_o=None):
import changedetectionio.conditions.blueprint as conditions
app.register_blueprint(conditions.construct_blueprint(datastore), url_prefix='/conditions')
import changedetectionio.blueprint.rss.blueprint as rss
import changedetectionio.blueprint.rss as rss
app.register_blueprint(rss.construct_blueprint(datastore), url_prefix='/rss')
# watchlist UI buttons etc
import changedetectionio.blueprint.ui as ui
app.register_blueprint(ui.construct_blueprint(datastore, update_q, running_update_threads, queuedWatchMetaData))
import changedetectionio.blueprint.watchlist as watchlist
app.register_blueprint(watchlist.construct_blueprint(datastore=datastore, update_q=update_q, queuedWatchMetaData=queuedWatchMetaData), url_prefix='')
# @todo handle ctrl break
ticker_thread = threading.Thread(target=ticker_thread_check_time_launch_checks).start()

View File

@@ -3,7 +3,6 @@ import re
from loguru import logger
from wtforms.widgets.core import TimeInput
from changedetectionio.blueprint.rss import RSS_FORMAT_TYPES
from changedetectionio.conditions.form import ConditionFormRow
from changedetectionio.strtobool import strtobool
@@ -24,7 +23,7 @@ from wtforms import (
from flask_wtf.file import FileField, FileAllowed
from wtforms.fields import FieldList
from wtforms.validators import ValidationError
from wtforms.validators import ValidationError, Optional
from validators.url import url as url_validator
@@ -306,10 +305,10 @@ class ValidateAppRiseServers(object):
def __call__(self, form, field):
import apprise
from .apprise_plugin.assets import apprise_asset
from .apprise_plugin.custom_handlers import apprise_http_custom_handler # noqa: F401
apobj = apprise.Apprise()
apobj = apprise.Apprise(asset=apprise_asset)
# so that the custom endpoints are registered
from .apprise_asset import asset
for server_url in field.data:
url = server_url.strip()
@@ -509,8 +508,14 @@ class quickWatchForm(Form):
url = fields.URLField('URL', validators=[validateURL()])
tags = StringTagUUID('Group tag', [validators.Optional()])
watch_submit_button = SubmitField('Watch', render_kw={"class": "pure-button pure-button-primary"})
processor = RadioField(u'Processor', choices=processors.available_processors(), default="text_json_diff")
processor = RadioField(u'Processor', default="text_json_diff")
edit_and_watch_submit_button = SubmitField('Edit > Watch', render_kw={"class": "pure-button pure-button-primary"})
def __init__(self, formdata=None, obj=None, prefix="", data=None, meta=None, **kwargs):
super().__init__(formdata, obj, prefix, data, meta, **kwargs)
# Set processor choices based on datastore if available
datastore = kwargs.get('datastore')
self.processor.choices = self.processors.available_processors(datastore)
@@ -523,6 +528,13 @@ class commonSettingsForm(Form):
self.notification_body.extra_notification_tokens = kwargs.get('extra_notification_tokens', {})
self.notification_title.extra_notification_tokens = kwargs.get('extra_notification_tokens', {})
self.notification_urls.extra_notification_tokens = kwargs.get('extra_notification_tokens', {})
# Set processor choices based on datastore if available
datastore = kwargs.get('datastore')
if datastore:
self.processor.choices = self.processors.available_processors(datastore)
else:
self.processor.choices = self.processors.available_processors()
extract_title_as_title = BooleanField('Extract <title> from document and use as watch title', default=False)
fetch_backend = RadioField(u'Fetch Method', choices=content_fetchers.available_fetchers(), validators=[ValidateContentFetcherIsReady()])
@@ -530,17 +542,26 @@ class commonSettingsForm(Form):
notification_format = SelectField('Notification format', choices=valid_notification_formats.keys())
notification_title = StringField('Notification Title', default='ChangeDetection.io Notification - {{ watch_url }}', validators=[validators.Optional(), ValidateJinja2Template()])
notification_urls = StringListField('Notification URL List', validators=[validators.Optional(), ValidateAppRiseServers(), ValidateJinja2Template()])
processor = RadioField( label=u"Processor - What do you want to achieve?", choices=processors.available_processors(), default="text_json_diff")
processor = RadioField( label=u"Processor - What do you want to achieve?", default="text_json_diff")
timezone = StringField("Timezone for watch schedule", render_kw={"list": "timezones"}, validators=[validateTimeZoneName()])
webdriver_delay = IntegerField('Wait seconds before extracting text', validators=[validators.Optional(), validators.NumberRange(min=1, message="Should contain one or more seconds")])
class importForm(Form):
from . import processors
processor = RadioField(u'Processor', choices=processors.available_processors(), default="text_json_diff")
processor = RadioField(u'Processor', default="text_json_diff")
urls = TextAreaField('URLs')
xlsx_file = FileField('Upload .xlsx file', validators=[FileAllowed(['xlsx'], 'Must be .xlsx file!')])
file_mapping = SelectField('File mapping', [validators.DataRequired()], choices={('wachete', 'Wachete mapping'), ('custom','Custom mapping')})
def __init__(self, formdata=None, obj=None, prefix="", data=None, meta=None, **kwargs):
super().__init__(formdata, obj, prefix, data, meta, **kwargs)
# Set processor choices based on datastore if available
datastore = kwargs.get('datastore')
if datastore:
self.processor.choices = self.processors.available_processors(datastore)
else:
self.processor.choices = self.processors.available_processors()
class SingleBrowserStep(Form):
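
The constructors above all follow the same idea: leave the RadioField's choices empty at class-definition time and fill them in __init__, so the list can depend on the datastore passed at construction. A minimal standalone sketch of that pattern (the available_processors stub stands in for changedetectionio.processors.available_processors):

from wtforms import Form, RadioField

def available_processors(datastore=None):
    # Stand-in for the real lookup; the datastore argument could filter enabled plugins
    return [('text_json_diff', 'Webpage Text/HTML, JSON and PDF changes'),
            ('restock_diff', 'Re-stock & Price detection for single product pages')]

class ProcessorChoiceForm(Form):
    processor = RadioField('Processor', default='text_json_diff')

    def __init__(self, formdata=None, **kwargs):
        datastore = kwargs.pop('datastore', None)
        super().__init__(formdata, **kwargs)
        # Choices are resolved at construction time, not import time
        self.processor.choices = available_processors(datastore)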
@@ -715,11 +736,12 @@ class globalSettingsRequestForm(Form):
default_ua = FormField(DefaultUAInputForm, label="Default User-Agent overrides")
def validate_extra_proxies(self, extra_validators=None):
for e in self.data['extra_proxies']:
if e.get('proxy_name') or e.get('proxy_url'):
if not e.get('proxy_name','').strip() or not e.get('proxy_url','').strip():
self.extra_proxies.errors.append('Both a name, and a Proxy URL is required.')
return False
if self.data.get('extra_proxies'):
for e in self.data['extra_proxies']:
if e.get('proxy_name') or e.get('proxy_url'):
if not e.get('proxy_name','').strip() or not e.get('proxy_url','').strip():
self.extra_proxies.errors.append('Both a name, and a Proxy URL is required.')
return False
# datastore.data['settings']['application']..
@@ -740,9 +762,6 @@ class globalSettingsApplicationForm(commonSettingsForm):
render_kw={"style": "width: 5em;"},
validators=[validators.NumberRange(min=0,
message="Should be atleast zero (disabled)")])
rss_content_format = SelectField('RSS Content format', choices=RSS_FORMAT_TYPES)
removepassword_button = SubmitField('Remove password', render_kw={"class": "pure-button pure-button-primary"})
render_anchor_tag_content = BooleanField('Render anchor tag content', default=False)
shared_diff_access = BooleanField('Allow access to view diff page when password is enabled', default=False, validators=[validators.Optional()])
@@ -753,7 +772,6 @@ class globalSettingsApplicationForm(commonSettingsForm):
validators=[validators.NumberRange(min=0,
message="Should contain zero or more attempts")])
class globalSettingsForm(Form):
# Define these as FormFields/"sub forms", this way it matches the JSON storage
# datastore.data['settings']['application']..

View File

@@ -1,7 +1,4 @@
from os import getenv
from changedetectionio.blueprint.rss import RSS_FORMAT_TYPES
from changedetectionio.notification import (
default_notification_body,
default_notification_format,
@@ -12,8 +9,6 @@ from changedetectionio.notification import (
_FILTER_FAILURE_THRESHOLD_ATTEMPTS_DEFAULT = 6
DEFAULT_SETTINGS_HEADERS_USERAGENT='Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/87.0.4280.66 Safari/537.36'
class model(dict):
base_config = {
'note': "Hello! If you change this file manually, please be sure to restart your changedetection.io instance!",
@@ -53,13 +48,12 @@ class model(dict):
'password': False,
'render_anchor_tag_content': False,
'rss_access_token': None,
'rss_content_format': RSS_FORMAT_TYPES[0][0],
'rss_hide_muted_watches': True,
'schema_version' : 0,
'shared_diff_access': False,
'webdriver_delay': None , # Extra delay in seconds before extracting text
'tags': {}, #@todo use Tag.model initialisers
'timezone': None, # Default IANA timezone name
'timezone': None # Default IANA timezone name
}
}
}

View File

@@ -1,14 +1,57 @@
from changedetectionio.model import watch_base
import os
import json
import uuid as uuid_builder
import time
from copy import deepcopy
from loguru import logger
from changedetectionio.model import watch_base, schema
class model(watch_base):
"""Tag model that writes to tags/{uuid}/tag.json instead of the main watch directory"""
__datastore_path = None
def __init__(self, *arg, **kw):
super(model, self).__init__(*arg, **kw)
self.__datastore_path = kw.get("datastore_path")
self['overrides_watch'] = kw.get('default', {}).get('overrides_watch')
if kw.get('default'):
self.update(kw['default'])
del kw['default']
@property
def watch_data_dir(self):
# Override to use tags directory instead of the normal watch data directory
datastore_path = getattr(self, '_model__datastore_path', None)
if datastore_path:
tags_path = os.path.join(datastore_path, 'tags')
# Make sure the tags directory exists
if not os.path.exists(tags_path):
os.makedirs(tags_path)
return os.path.join(tags_path, self['uuid'])
return None
def save_data(self):
"""Override to save tag to tags/{uuid}/tag.json"""
logger.debug(f"Saving tag {self['uuid']}")
if not self.get('uuid'):
# Might have been called when creating the tag
return
tags_path = os.path.join(self.__datastore_path, 'tags')
if not os.path.isdir(tags_path):
os.mkdir(tags_path)
path = os.path.join(tags_path, self.get('uuid')+".json")
try:
with open(path + ".tmp", 'w') as json_file:
json.dump(self.get_data(), json_file, indent=4)
os.replace(path + ".tmp", path)
except Exception as e:
logger.error(f"Error writing JSON for tag {self.get('uuid')}!! (JSON file save was skipped) : {str(e)}")

View File

@@ -38,17 +38,13 @@ class model(watch_base):
jitter_seconds = 0
def __init__(self, *arg, **kw):
self.__datastore_path = kw.get('datastore_path')
if kw.get('datastore_path'):
del kw['datastore_path']
super(model, self).__init__(*arg, **kw)
if kw.get('default'):
self.update(kw['default'])
del kw['default']
if self.get('default'):
del self['default']
# Be sure the cached timestamp is ready
bump = self.history
@@ -296,11 +292,12 @@ class model(watch_base):
with open(filepath, 'r', encoding='utf-8', errors='ignore') as f:
return f.read()
# Save some text file to the appropriate path and bump the history
# result_obj from fetch_site_status.run()
def save_history_text(self, contents, timestamp, snapshot_id):
import brotli
import tempfile
logger.trace(f"{self.get('uuid')} - Updating history.txt with timestamp {timestamp}")
self.ensure_data_dir_exists()
@@ -308,37 +305,26 @@ class model(watch_base):
threshold = int(os.getenv('SNAPSHOT_BROTLI_COMPRESSION_THRESHOLD', 1024))
skip_brotli = strtobool(os.getenv('DISABLE_BROTLI_TEXT_SNAPSHOT', 'False'))
# Decide on snapshot filename and destination path
if not skip_brotli and len(contents) > threshold:
snapshot_fname = f"{snapshot_id}.txt.br"
encoded_data = brotli.compress(contents.encode('utf-8'), mode=brotli.MODE_TEXT)
dest = os.path.join(self.watch_data_dir, snapshot_fname)
if not os.path.exists(dest):
with open(dest, 'wb') as f:
f.write(brotli.compress(contents.encode('utf-8'), mode=brotli.MODE_TEXT))
else:
snapshot_fname = f"{snapshot_id}.txt"
encoded_data = contents.encode('utf-8')
dest = os.path.join(self.watch_data_dir, snapshot_fname)
if not os.path.exists(dest):
with open(dest, 'wb') as f:
f.write(contents.encode('utf-8'))
dest = os.path.join(self.watch_data_dir, snapshot_fname)
# Write snapshot file atomically if it doesn't exist
if not os.path.exists(dest):
with tempfile.NamedTemporaryFile('wb', delete=False, dir=self.watch_data_dir) as tmp:
tmp.write(encoded_data)
tmp.flush()
os.fsync(tmp.fileno())
tmp_path = tmp.name
os.rename(tmp_path, dest)
# Append to history.txt atomically
# Append to index
# @todo check last char was \n
index_fname = os.path.join(self.watch_data_dir, "history.txt")
index_line = f"{timestamp},{snapshot_fname}\n"
with open(index_fname, 'a') as f:
f.write("{},{}\n".format(timestamp, snapshot_fname))
f.close()
# Lets try force flush here since it's usually a very small file
# If this still fails in the future then try reading all to memory first, re-writing etc
with open(index_fname, 'a', encoding='utf-8') as f:
f.write(index_line)
f.flush()
os.fsync(f.fileno())
# Update internal state
self.__newest_history_key = timestamp
self.__history_n += 1
@@ -427,11 +413,6 @@ class model(watch_base):
def snapshot_error_screenshot_ctime(self):
return self.__get_file_ctime('last-error-screenshot.png')
@property
def watch_data_dir(self):
# The base dir of the watch data
return os.path.join(self.__datastore_path, self['uuid']) if self.__datastore_path else None
def get_error_text(self):
"""Return the text saved from a previous request that resulted in a non-200 error"""
fname = os.path.join(self.watch_data_dir, "last-error.txt")
@@ -575,7 +556,7 @@ class model(watch_base):
import brotli
filepath = os.path.join(self.watch_data_dir, 'last-fetched.br')
if not os.path.isfile(filepath) or os.path.getsize(filepath) == 0:
if not os.path.isfile(filepath):
# If a previous attempt doesn't yet exist, just snarf the previous snapshot instead
dates = list(self.history.keys())
if len(dates):
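
For reference, the history.txt index that save_history_text() appends to is one 'timestamp,snapshot_filename' line per snapshot; a hedged helper for reading it back might look like this (not part of the diff):

import os

def read_history_index(watch_data_dir):
    index = {}
    fname = os.path.join(watch_data_dir, 'history.txt')
    if not os.path.isfile(fname):
        return index
    with open(fname, 'r', encoding='utf-8') as f:
        for line in f:
            line = line.strip()
            if ',' not in line:
                continue
            timestamp, snapshot_fname = line.split(',', 1)
            index[timestamp] = snapshot_fname
    return index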

View File

@@ -1,135 +1,246 @@
import os
import uuid
from copy import deepcopy
from loguru import logger
import time
import json
from changedetectionio import strtobool
from changedetectionio.notification import default_notification_format_for_watch
schema = {
# Custom notification content
# Re #110, so then if this is set to None, we know to use the default value instead
# Requires setting to None on submit if it's the same as the default
# Should be all None by default, so we use the system default in this case.
'body': None,
'browser_steps': [],
'browser_steps_last_error_step': None,
'check_count': 0,
'check_unique_lines': False, # On change-detected, compare against all history if its something new
'consecutive_filter_failures': 0, # Every time the CSS/xPath filter cannot be located, reset when all is fine.
'content-type': None,
'date_created': None,
'extract_text': [], # Extract text by regex after filters
'extract_title_as_title': False,
'fetch_backend': 'system', # plaintext, playwright etc
'fetch_time': 0.0,
'filter_failure_notification_send': strtobool(os.getenv('FILTER_FAILURE_NOTIFICATION_SEND_DEFAULT', 'True')),
'filter_text_added': True,
'filter_text_removed': True,
'filter_text_replaced': True,
'follow_price_changes': True,
'has_ldjson_price_data': None,
'headers': {}, # Extra headers to send
'ignore_text': [], # List of text to ignore when calculating the comparison checksum
'in_stock_only': True, # Only trigger change on going to instock from out-of-stock
'include_filters': [],
'last_checked': 0,
'last_error': False,
'last_modified': None,
'last_viewed': 0, # history key value of the last viewed via the [diff] link
'method': 'GET',
'notification_alert_count': 0,
'notification_body': None,
'notification_format': default_notification_format_for_watch,
'notification_muted': False,
'notification_screenshot': False, # Include the latest screenshot if available and supported by the apprise URL
'notification_title': None,
'notification_urls': [], # List of URLs to add to the notification Queue (Usually AppRise)
'paused': False,
'previous_md5': False,
'previous_md5_before_filters': False, # Used for skipping changedetection entirely
'processor': 'text_json_diff', # could be restock_diff or others from .processors
'processor_state': {}, # Extra configs for custom processors/plugins, keyed by processor name
'price_change_threshold_percent': None,
'proxy': None, # Preferred proxy connection
'remote_server_reply': None, # From 'server' reply header
'sort_text_alphabetically': False,
'subtractive_selectors': [],
'tag': '', # Old system of text name for a tag, to be removed
'tags': [], # list of UUIDs to App.Tags
'text_should_not_be_present': [], # Text that should not be present
'time_between_check': {'weeks': None, 'days': None, 'hours': None, 'minutes': None, 'seconds': None},
'time_between_check_use_default': True,
"time_schedule_limit": {
"enabled": False,
"monday": {
"enabled": True,
"start_time": "00:00",
"duration": {
"hours": "24",
"minutes": "00"
}
},
"tuesday": {
"enabled": True,
"start_time": "00:00",
"duration": {
"hours": "24",
"minutes": "00"
}
},
"wednesday": {
"enabled": True,
"start_time": "00:00",
"duration": {
"hours": "24",
"minutes": "00"
}
},
"thursday": {
"enabled": True,
"start_time": "00:00",
"duration": {
"hours": "24",
"minutes": "00"
}
},
"friday": {
"enabled": True,
"start_time": "00:00",
"duration": {
"hours": "24",
"minutes": "00"
}
},
"saturday": {
"enabled": True,
"start_time": "00:00",
"duration": {
"hours": "24",
"minutes": "00"
}
},
"sunday": {
"enabled": True,
"start_time": "00:00",
"duration": {
"hours": "24",
"minutes": "00"
}
},
},
'title': None,
'track_ldjson_price_data': None,
'trim_text_whitespace': False,
'remove_duplicate_lines': False,
'trigger_text': [], # List of text or regex to wait for until a change is detected
'url': '',
'uuid': None,
'webdriver_delay': None,
'webdriver_js_execute_code': None, # Run before change-detection
}
class watch_base(dict):
__data = {}
__datastore_path = None
__save_enabled = True
def __init__(self, *arg, **kw):
self.update({
# Custom notification content
# Re #110, so then if this is set to None, we know to use the default value instead
# Requires setting to None on submit if it's the same as the default
# Should be all None by default, so we use the system default in this case.
'body': None,
'browser_steps': [],
'browser_steps_last_error_step': None,
'check_count': 0,
'check_unique_lines': False, # On change-detected, compare against all history if its something new
'consecutive_filter_failures': 0, # Every time the CSS/xPath filter cannot be located, reset when all is fine.
'content-type': None,
'date_created': None,
'extract_text': [], # Extract text by regex after filters
'extract_title_as_title': False,
'fetch_backend': 'system', # plaintext, playwright etc
'fetch_time': 0.0,
'filter_failure_notification_send': strtobool(os.getenv('FILTER_FAILURE_NOTIFICATION_SEND_DEFAULT', 'True')),
'filter_text_added': True,
'filter_text_removed': True,
'filter_text_replaced': True,
'follow_price_changes': True,
'has_ldjson_price_data': None,
'headers': {}, # Extra headers to send
'ignore_text': [], # List of text to ignore when calculating the comparison checksum
'in_stock_only': True, # Only trigger change on going to instock from out-of-stock
'include_filters': [],
'last_checked': 0,
'last_error': False,
'last_viewed': 0, # history key value of the last viewed via the [diff] link
'method': 'GET',
'notification_alert_count': 0,
'notification_body': None,
'notification_format': default_notification_format_for_watch,
'notification_muted': False,
'notification_screenshot': False, # Include the latest screenshot if available and supported by the apprise URL
'notification_title': None,
'notification_urls': [], # List of URLs to add to the notification Queue (Usually AppRise)
'paused': False,
'previous_md5': False,
'previous_md5_before_filters': False, # Used for skipping changedetection entirely
'processor': 'text_json_diff', # could be restock_diff or others from .processors
'price_change_threshold_percent': None,
'proxy': None, # Preferred proxy connection
'remote_server_reply': None, # From 'server' reply header
'sort_text_alphabetically': False,
'subtractive_selectors': [],
'tag': '', # Old system of text name for a tag, to be removed
'tags': [], # list of UUIDs to App.Tags
'text_should_not_be_present': [], # Text that should not be present
'time_between_check': {'weeks': None, 'days': None, 'hours': None, 'minutes': None, 'seconds': None},
'time_between_check_use_default': True,
"time_schedule_limit": {
"enabled": False,
"monday": {
"enabled": True,
"start_time": "00:00",
"duration": {
"hours": "24",
"minutes": "00"
}
},
"tuesday": {
"enabled": True,
"start_time": "00:00",
"duration": {
"hours": "24",
"minutes": "00"
}
},
"wednesday": {
"enabled": True,
"start_time": "00:00",
"duration": {
"hours": "24",
"minutes": "00"
}
},
"thursday": {
"enabled": True,
"start_time": "00:00",
"duration": {
"hours": "24",
"minutes": "00"
}
},
"friday": {
"enabled": True,
"start_time": "00:00",
"duration": {
"hours": "24",
"minutes": "00"
}
},
"saturday": {
"enabled": True,
"start_time": "00:00",
"duration": {
"hours": "24",
"minutes": "00"
}
},
"sunday": {
"enabled": True,
"start_time": "00:00",
"duration": {
"hours": "24",
"minutes": "00"
}
},
},
'title': None,
'track_ldjson_price_data': None,
'trim_text_whitespace': False,
'remove_duplicate_lines': False,
'trigger_text': [], # List of text or regex to wait for until a change is detected
'url': '',
'uuid': str(uuid.uuid4()),
'webdriver_delay': None,
'webdriver_js_execute_code': None, # Run before change-detection
})
# Initialize internal data storage
super(watch_base, self).__init__(*arg, **kw)
self.__data = deepcopy(schema)
self.__datastore_path = kw.pop('datastore_path', None)
# Initialize as empty dict but maintain dict interface
super(watch_base, self).__init__()
# Update with provided data
if arg or kw:
self.update(*arg, **kw)
if self.get('default'):
del self['default']
# Generate UUID if needed
if not self.__data.get('uuid'):
self.__data['uuid'] = str(uuid.uuid4())
if self.__data.get('default'):
del(self.__data['default'])
@property
def watch_data_dir(self):
# The base dir of the watch data
return os.path.join(self.__datastore_path, self['uuid']) if self.__datastore_path else None
def enable_saving(self):
self.__save_enabled = True
# Dictionary interface methods to use self.__data
def __getitem__(self, key):
return self.__data[key]
def __setitem__(self, key, value):
self.__data[key] = value
self.__data['last_modified'] = time.time()
def __delitem__(self, key):
del self.__data[key]
def __contains__(self, key):
return key in self.__data
def __iter__(self):
return iter(self.__data)
def __len__(self):
return len(self.__data)
def get(self, key, default=None):
return self.__data.get(key, default)
def update(self, *args, **kwargs):
if args:
if len(args) > 1:
raise TypeError("update expected at most 1 arguments, got %d" % len(args))
other = dict(args[0])
for key in other:
self.__data[key] = other[key]
for key in kwargs:
self.__data[key] = kwargs[key]
self.__data['last_modified'] = time.time()
def items(self):
return self.__data.items()
def keys(self):
return self.__data.keys()
def values(self):
return self.__data.values()
def pop(self, key, default=None):
return self.__data.pop(key, default)
def popitem(self):
return self.__data.popitem()
def clear(self):
self.__data.clear()
self.__data['last_modified'] = time.time()
def get_data(self):
"""Returns the internal data dictionary"""
return self.__data
def save_data(self):
if self.__save_enabled:
if not self.__data.get('uuid'):
# Might have been called when creating the watch
return
logger.debug(f"Saving watch {self['uuid']}")
path = os.path.join(self.__datastore_path, self.get('uuid'))
filepath = os.path.join(str(path), "watch.json")
if not os.path.exists(path):
os.mkdir(path)
try:
import tempfile
# Write the temp file into the watch's own directory so os.replace() below stays atomic
with tempfile.NamedTemporaryFile(mode='wb+', delete=False, dir=path) as tmp:
tmp.write(json.dumps(self.get_data(), indent=2).encode('utf-8'))
tmp.flush()
os.replace(tmp.name, filepath)
except Exception as e:
logger.error(f"Error writing JSON for {self.get('uuid')}!! (JSON file save was skipped) : {str(e)}")

View File

@@ -4,9 +4,6 @@ from apprise import NotifyFormat
import apprise
from loguru import logger
from .apprise_plugin.assets import APPRISE_AVATAR_URL
from .apprise_plugin.custom_handlers import apprise_http_custom_handler # noqa: F401
from .safe_jinja import render as jinja_render
valid_tokens = {
'base_url': '',
@@ -42,6 +39,10 @@ valid_notification_formats = {
def process_notification(n_object, datastore):
# so that the custom endpoints are registered
from changedetectionio.apprise_plugin import apprise_custom_api_call_wrapper
from .safe_jinja import render as jinja_render
now = time.time()
if n_object.get('notification_timestamp'):
logger.trace(f"Time since queued {now-n_object['notification_timestamp']:.3f}s")
@@ -65,12 +66,12 @@ def process_notification(n_object, datastore):
# raise it as an exception
sent_objs = []
from .apprise_plugin.assets import apprise_asset
from .apprise_asset import asset
if 'as_async' in n_object:
apprise_asset.async_mode = n_object.get('as_async')
asset.async_mode = n_object.get('as_async')
apobj = apprise.Apprise(debug=True, asset=apprise_asset)
apobj = apprise.Apprise(debug=True, asset=asset)
if not n_object.get('notification_urls'):
return None
@@ -111,7 +112,7 @@ def process_notification(n_object, datastore):
and not url.startswith('get') \
and not url.startswith('delete') \
and not url.startswith('put'):
url += k + f"avatar_url={APPRISE_AVATAR_URL}"
url += k + 'avatar_url=https://raw.githubusercontent.com/dgtlmoon/changedetection.io/master/changedetectionio/static/images/avatar-256x256.png'
if url.startswith('tgram://'):
# Telegram only supports a limited subset of HTML, remove the '<br>' we place in.
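
The Apprise usage in this hunk reduces to a small, well-known pattern; a minimal sketch with a placeholder endpoint (the asset construction mirrors the asset/async handling above):

import apprise

asset = apprise.AppriseAsset()
asset.async_mode = False                     # mirrors the 'as_async' handling above
apobj = apprise.Apprise(asset=asset)
apobj.add('json://localhost:8080/notify')    # placeholder notification endpoint
apobj.notify(title='ChangeDetection.io Notification', body='Something changed')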

View File

@@ -4,12 +4,12 @@ from changedetectionio.strtobool import strtobool
from copy import deepcopy
from loguru import logger
import hashlib
import importlib
import inspect
import os
import pkgutil
import re
from .pluggy_interface import plugin_manager, hookimpl
class difference_detection_processor():
browser_steps = None
@@ -172,83 +172,208 @@ class difference_detection_processor():
return changed_detected, update_obj, ''.encode('utf-8')
def find_sub_packages(package_name):
def get_all_plugins_info():
"""
Find all sub-packages within the given package.
:param package_name: The name of the base package to scan for sub-packages.
:return: A list of sub-package names.
Get information about all registered processor plugins
:return: A list of dictionaries with plugin info
"""
package = importlib.import_module(package_name)
return [name for _, name, is_pkg in pkgutil.iter_modules(package.__path__) if is_pkg]
plugins_info = []
# Collect from all registered plugins
for plugin in plugin_manager.get_plugins():
if hasattr(plugin, "get_processor_name") and hasattr(plugin, "get_processor_description"):
processor_name = plugin.get_processor_name()
description = plugin.get_processor_description()
# Get version if available
version = "N/A"
if hasattr(plugin, "get_processor_version"):
plugin_version = plugin.get_processor_version()
if plugin_version:
version = plugin_version
if processor_name and description:
plugins_info.append({
"name": processor_name,
"description": description,
"version": version
})
# Fallback if no plugins registered
if not plugins_info:
plugins_info = [
{"name": "text_json_diff", "description": "Webpage Text/HTML, JSON and PDF changes", "version": "1.0.0"},
{"name": "restock_diff", "description": "Re-stock & Price detection for single product pages", "version": "1.0.0"}
]
return plugins_info
def find_processors():
"""
Find all subclasses of DifferenceDetectionProcessor in the specified package.
:param package_name: The name of the package to scan for processor modules.
:return: A list of (module, class) tuples.
"""
package_name = "changedetectionio.processors" # Name of the current package/module
processors = []
sub_packages = find_sub_packages(package_name)
for sub_package in sub_packages:
module_name = f"{package_name}.{sub_package}.processor"
try:
module = importlib.import_module(module_name)
# Iterate through all classes in the module
for name, obj in inspect.getmembers(module, inspect.isclass):
if issubclass(obj, difference_detection_processor) and obj is not difference_detection_processor:
processors.append((module, sub_package))
except (ModuleNotFoundError, ImportError) as e:
logger.warning(f"Failed to import module {module_name}: {e} (find_processors())")
return processors
def get_parent_module(module):
module_name = module.__name__
if '.' not in module_name:
return None # Top-level module has no parent
parent_module_name = module_name.rsplit('.', 1)[0]
try:
return importlib.import_module(parent_module_name)
except Exception as e:
pass
return False
def get_custom_watch_obj_for_processor(processor_name):
from changedetectionio.model import Watch
watch_class = Watch.model
processor_classes = find_processors()
custom_watch_obj = next((tpl for tpl in processor_classes if tpl[1] == processor_name), None)
if custom_watch_obj:
# Parent of .processor.py COULD have its own Watch implementation
parent_module = get_parent_module(custom_watch_obj[0])
if hasattr(parent_module, 'Watch'):
watch_class = parent_module.Watch
return watch_class
def available_processors():
def available_processors(datastore=None):
"""
Get a list of processors by name and description for the UI elements
:return: A list :)
Filtered by enabled_plugins setting if datastore is provided
:return: A list of tuples (processor_name, description)
"""
plugins_info = get_all_plugins_info()
processor_list = []
for plugin in plugins_info:
processor_list.append((plugin["name"], plugin["description"]))
return processor_list
def get_processor_handler(processor_name, datastore, watch_uuid):
"""
Get the processor handler for the specified processor name
:return: The processor handler instance
"""
# Try each plugin in turn
for plugin in plugin_manager.get_plugins():
if hasattr(plugin, "perform_site_check"):
handler = plugin.perform_site_check(datastore=datastore, watch_uuid=watch_uuid)
if handler:
return handler
# If no plugins handled it, use the appropriate built-in processor
watch = datastore.data['watching'].get(watch_uuid)
if watch and watch.get('processor') == 'restock_diff':
from .restock_diff.processor import perform_site_check
return perform_site_check(datastore=datastore, watch_uuid=watch_uuid)
else:
# Default to text_json_diff
from .text_json_diff.processor import perform_site_check
return perform_site_check(datastore=datastore, watch_uuid=watch_uuid)
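Whatever get_processor_handler() returns (plugin-provided or built-in) is expected to expose the two methods described by the perform_site_check hookspec further down in this diff; a sketch of how a worker might drive it, with the surrounding calling code being an assumption:

from changedetectionio.processors import get_processor_handler

def check_watch(datastore, watch_uuid):
    watch = datastore.data['watching'].get(watch_uuid)
    handler = get_processor_handler(processor_name=watch.get('processor'),
                                    datastore=datastore,
                                    watch_uuid=watch_uuid)
    handler.call_browser(preferred_proxy_id=None)          # fetch the content
    # Returns (changed_detected, update_obj, contents) per the hookspec contract
    return handler.run_changedetection(watch)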
def get_form_class_for_processor(processor_name):
"""
Get the form class for the specified processor name
:return: The form class
"""
# Try each plugin in turn
for plugin in plugin_manager.get_plugins():
if hasattr(plugin, "get_form_class"):
form_class = plugin.get_form_class(processor_name=processor_name)
if form_class:
return form_class
# If no plugins provided a form class, use the appropriate built-in form
if processor_name == 'restock_diff':
try:
from .restock_diff.forms import processor_settings_form
return processor_settings_form
except ImportError:
pass
# Default to text_json_diff form
from changedetectionio import forms
return forms.processor_text_json_diff_form
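A sketch of resolving the per-processor settings form in an edit view; the WTForms constructor arguments shown here are illustrative, not the project's actual wiring:

from changedetectionio.processors import get_form_class_for_processor

def build_edit_form(watch, formdata=None):
    # Falls back to the text_json_diff form when the processor has no custom one
    form_class = get_form_class_for_processor(watch.get('processor', 'text_json_diff'))
    return form_class(formdata=formdata, data=dict(watch))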
def get_watch_model_for_processor(processor_name):
"""
Get the Watch model class for the specified processor name
:return: The Watch model class
"""
processor_classes = find_processors()
# Try each plugin in turn
for plugin in plugin_manager.get_plugins():
if hasattr(plugin, "get_watch_model_class"):
model_class = plugin.get_watch_model_class(processor_name=processor_name)
if model_class:
return model_class
available = []
for package, processor_class in processor_classes:
available.append((processor_class, package.name))
# Default to standard Watch model
from changedetectionio.model import Watch
return Watch.model
return available
# Define plugin implementations for the built-in processors
class TextJsonDiffPlugin:
@hookimpl
def get_processor_name(self):
return "text_json_diff"
@hookimpl
def get_processor_description(self):
from .text_json_diff.processor import name
return name
@hookimpl
def get_processor_version(self):
from changedetectionio import __version__
return __version__
@hookimpl
def get_processor_ui_tag(self):
from .text_json_diff.processor import UI_tag
return UI_tag
@hookimpl
def perform_site_check(self, datastore, watch_uuid):
watch = datastore.data['watching'].get(watch_uuid)
if watch and watch.get('processor', 'text_json_diff') == 'text_json_diff':
from .text_json_diff.processor import perform_site_check
return perform_site_check(datastore=datastore, watch_uuid=watch_uuid)
return None
@hookimpl
def get_form_class(self, processor_name):
if processor_name == 'text_json_diff':
from changedetectionio import forms
return forms.processor_text_json_diff_form
return None
@hookimpl
def get_watch_model_class(self, processor_name):
if processor_name == 'text_json_diff':
from changedetectionio.model import Watch
return Watch.model
return None
class RestockDiffPlugin:
@hookimpl
def get_processor_name(self):
return "restock_diff"
@hookimpl
def get_processor_description(self):
from .restock_diff.processor import name
return name
@hookimpl
def get_processor_version(self):
from changedetectionio import __version__
return __version__
@hookimpl
def get_processor_ui_tag(self):
from .restock_diff.processor import UI_tag
return UI_tag
@hookimpl
def perform_site_check(self, datastore, watch_uuid):
watch = datastore.data['watching'].get(watch_uuid)
if watch and watch.get('processor') == 'restock_diff':
from .restock_diff.processor import perform_site_check
return perform_site_check(datastore=datastore, watch_uuid=watch_uuid)
return None
@hookimpl
def get_form_class(self, processor_name):
if processor_name == 'restock_diff':
try:
from .restock_diff.forms import processor_settings_form
return processor_settings_form
except ImportError:
pass
return None
@hookimpl
def get_watch_model_class(self, processor_name):
if processor_name == 'restock_diff':
from . import restock_diff
return restock_diff.Watch
return None
# Register the built-in processor plugins
plugin_manager.register(TextJsonDiffPlugin())
plugin_manager.register(RestockDiffPlugin())

View File

@@ -0,0 +1,5 @@
# Common constants used across processors
# Price data tracking constants
PRICE_DATA_TRACK_ACCEPT = 'accepted'
PRICE_DATA_TRACK_REJECT = 'rejected'

View File

@@ -0,0 +1,85 @@
import pluggy
from loguru import logger
# Ensure that the namespace in HookspecMarker matches PluginManager
PLUGIN_NAMESPACE = "changedetectionio_processors"
hookspec = pluggy.HookspecMarker(PLUGIN_NAMESPACE)
hookimpl = pluggy.HookimplMarker(PLUGIN_NAMESPACE)
UI_tags = {}
class ProcessorSpec:
"""Hook specifications for difference detection processors."""
@hookspec
def get_processor_name():
"""Return the processor name for selection in the UI."""
pass
@hookspec
def get_processor_description():
"""Return a human-readable description of the processor."""
pass
@hookspec
def get_processor_version():
"""Return the processor plugin version."""
pass
@hookspec
def get_processor_ui_tag():
"""Return the UI tag for the processor (used for categorization in UI)."""
pass
@hookspec
def perform_site_check(datastore, watch_uuid):
"""Return the processor handler class or None if not applicable.
Each plugin should check if it's the right processor for this watch
and return None if it's not.
Should return an instance of a class that implements:
- call_browser(preferred_proxy_id=None): Fetch the content
- run_changedetection(watch): Analyze for changes and return tuple of (changed_detected, update_obj, contents)
"""
pass
@hookspec
def get_form_class(processor_name):
"""Return the WTForms form class for the processor settings or None if not applicable.
Each plugin should check if it's the right processor and return None if not.
"""
pass
@hookspec
def get_watch_model_class(processor_name):
"""Return a custom Watch model class if needed or None if not applicable.
Each plugin should check if it's the right processor and return None if not.
"""
pass
# Set up Pluggy Plugin Manager
plugin_manager = pluggy.PluginManager(PLUGIN_NAMESPACE)
# Register hookspecs
plugin_manager.add_hookspecs(ProcessorSpec)
# Initialize by loading plugins and building UI_tags dictionary
try:
# Discover installed plugins from external packages (if any)
plugin_manager.load_setuptools_entrypoints(PLUGIN_NAMESPACE)
logger.info(f"Loaded plugins: {plugin_manager.get_plugins()}")
# Build UI_tags dictionary from all plugins
for plugin in plugin_manager.get_plugins():
if hasattr(plugin, "get_processor_name") and hasattr(plugin, "get_processor_ui_tag"):
plugin_name = plugin.get_processor_name()
ui_tag = plugin.get_processor_ui_tag()
if plugin_name and ui_tag:
UI_tags[plugin_name] = ui_tag
logger.info(f"Found UI tag for plugin {plugin_name}: {ui_tag}")
except Exception as e:
logger.critical(f"Error loading plugins: {str(e)}")

View File

@@ -1,5 +1,4 @@
from babel.numbers import parse_decimal
from changedetectionio.model.Watch import model as BaseWatch
from typing import Union
import re
@@ -7,6 +6,7 @@ import re
class Restock(dict):
def parse_currency(self, raw_value: str) -> Union[float, None]:
from babel.numbers import parse_decimal
# Clean and standardize the value (i.e. 1,400.00 should be 1400.00); even better would be to store the whole thing as an integer.
standardized_value = raw_value
@@ -56,14 +56,19 @@ class Restock(dict):
super().__setitem__(key, value)
class Watch(BaseWatch):
def load_extra_vars(self):
# something from disk?
def __init__(self, *arg, **kw):
super().__init__(*arg, **kw)
# Restock Obj helps with the state of the situation
self['restock'] = Restock(kw['default']['restock']) if kw.get('default') and kw['default'].get('restock') else Restock()
self['restock_settings'] = kw['default']['restock_settings'] if kw.get('default',{}).get('restock_settings') else {
'follow_price_changes': True,
'in_stock_processing' : 'in_stock_only'
} #@todo update
}
def clear_watch(self):
super().clear_watch()
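parse_currency() above leans on babel's parse_decimal to turn locale-formatted price strings into numbers before they land on the Restock dict; a standalone illustration of that call (the symbol/whitespace cleanup from the real method is omitted):

from babel.numbers import parse_decimal

# '1,400.00' -> 1400.0 once the grouping separator is understood for the locale
print(float(parse_decimal('1,400.00', locale='en_US')))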

View File

@@ -9,6 +9,7 @@ import time
urllib3.disable_warnings(urllib3.exceptions.InsecureRequestWarning)
name = 'Re-stock & Price detection for single product pages'
description = 'Detects if the product goes back to in-stock'
UI_tag = "Restock"
class UnableToExtractRestockData(Exception):
def __init__(self, status_code):
@@ -152,7 +153,8 @@ class perform_site_check(difference_detection_processor):
# Unset any existing notification error
update_obj = {'last_notification_error': False, 'last_error': False, 'restock': Restock()}
if not 'restock_settings' in watch.keys():
raise Exception("Restock settings not found in watch.")
self.screenshot = self.fetcher.screenshot
self.xpath_data = self.fetcher.xpath_data

View File

@@ -10,13 +10,14 @@ from changedetectionio.conditions import execute_ruleset_against_all_plugins
from changedetectionio.processors import difference_detection_processor
from changedetectionio.html_tools import PERL_STYLE_REGEX, cdata_in_document_to_text, TRANSLATE_WHITESPACE_TABLE
from changedetectionio import html_tools, content_fetchers
from changedetectionio.blueprint.price_data_follower import PRICE_DATA_TRACK_ACCEPT, PRICE_DATA_TRACK_REJECT
from changedetectionio.processors.constants import PRICE_DATA_TRACK_ACCEPT, PRICE_DATA_TRACK_REJECT
from loguru import logger
urllib3.disable_warnings(urllib3.exceptions.InsecureRequestWarning)
name = 'Webpage Text/HTML, JSON and PDF changes'
description = 'Detects all text changes where possible'
UI_tag = "Text Diff"
json_filter_prefixes = ['json:', 'jq:', 'jqraw:']
@@ -334,14 +335,12 @@ class perform_site_check(difference_detection_processor):
# And check if 'conditions' will let this pass through
if watch.get('conditions') and watch.get('conditions_match_logic'):
conditions_result = execute_ruleset_against_all_plugins(current_watch_uuid=watch.get('uuid'),
application_datastruct=self.datastore.data,
ephemeral_data={
'text': stripped_text_from_html
}
)
if not conditions_result.get('result'):
if not execute_ruleset_against_all_plugins(current_watch_uuid=watch.get('uuid'),
application_datastruct=self.datastore.data,
ephemeral_data={
'text': stripped_text_from_html
}
):
# Conditions say "Condition not met" so we block it.
blocked = True

View File

@@ -52,7 +52,7 @@ $(document).ready(function () {
// Create a rule object
let rule = {
const rule = {
field: field,
operator: operator,
value: value
@@ -96,10 +96,6 @@ $(document).ready(function () {
contentType: false, // Let the browser set the correct content type
success: function (response) {
if (response.status === "success") {
if(rule['field'] !== "page_filtered_text") {
// A little debug helper for the user
$('#verify-state-text').text(`${rule['field']} was value "${response.data[rule['field']]}"`)
}
if (response.result) {
alert("✅ Condition PASSES verification against current snapshot!");
} else {

View File

@@ -1,4 +1,22 @@
(function ($) {
// Initialize plugin management UI when the DOM is ready
$(document).ready(function() {
// Add event handlers for plugin checkboxes
$("#plugins-table input[type='checkbox']").on('change', function() {
const isEnabled = $(this).is(':checked');
// For visual feedback, fade the row when disabled
if (isEnabled) {
$(this).closest('tr').removeClass('disabled-plugin');
} else {
$(this).closest('tr').addClass('disabled-plugin');
}
const pluginName = $(this).closest('tr').find('td:nth-child(2)').text().trim();
console.log(`Plugin ${pluginName} ${isEnabled ? 'enabled' : 'disabled'}`);
});
});
/**
* debounce
* @param {integer} milliseconds This param indicates the number of milliseconds

View File

@@ -48,8 +48,6 @@ $(function () {
$('input[type=checkbox]').not(this).prop('checked', this.checked);
});
const time_check_step_size_seconds=1;
// checkboxes - show/hide buttons
$("input[type=checkbox]").click(function (e) {
if ($('input[type=checkbox]:checked').length) {
@@ -59,30 +57,5 @@ $(function () {
}
});
setInterval(function () {
// Background ETA completion for 'checking now'
$(".watch-table .checking-now .last-checked").each(function () {
const eta_complete = parseFloat($(this).data('eta_complete'));
const fetch_duration = parseInt($(this).data('fetchduration'));
if (eta_complete + 2 > nowtimeserver && fetch_duration > 3) {
const remaining_seconds = Math.abs(eta_complete) - nowtimeserver - 1;
let r = (1.0 - (remaining_seconds / fetch_duration)) * 100;
if (r < 10) {
r = 10;
}
if (r >= 90) {
r = 100;
}
$(this).css('background-size', `${r}% 100%`);
//$(this).text(`${r}% remain ${remaining_seconds}`);
} else {
$(this).css('background-size', `100% 100%`);
}
});
nowtimeserver = nowtimeserver + time_check_step_size_seconds;
}, time_check_step_size_seconds * 1000);
});

View File

@@ -6,7 +6,7 @@ from flask import (
from .html_tools import TRANSLATE_WHITESPACE_TABLE
from . model import App, Watch
from copy import deepcopy, copy
from copy import deepcopy
from os import path, unlink
from threading import Lock
import json
@@ -17,9 +17,9 @@ import threading
import time
import uuid as uuid_builder
from loguru import logger
from deepmerge import always_merger
from .processors import get_custom_watch_obj_for_processor
from .processors.restock_diff import Restock
from .processors import get_watch_model_for_processor
# Because the server will run as a daemon and won't know the URL for notification links when firing off a notification
BASE_URL_NOT_SET_TEXT = '("Base URL" not set - see settings - notifications)'
@@ -31,11 +31,6 @@ dictfilt = lambda x, y: dict([ (i,x[i]) for i in x if i in set(y) ])
# https://stackoverflow.com/questions/6190468/how-to-trigger-function-on-value-change
class ChangeDetectionStore:
lock = Lock()
# For general updates/writes that can wait a few seconds
needs_write = False
# For when we edit, we should write to disk
needs_write_urgent = False
__version_check = True
@@ -46,12 +41,9 @@ class ChangeDetectionStore:
self.datastore_path = datastore_path
self.json_store_path = "{}/url-watches.json".format(self.datastore_path)
logger.info(f"Datastore path is '{self.json_store_path}'")
self.needs_write = False
self.start_time = time.time()
self.stop_thread = False
# Base definition for all watchers
# deepcopy part of #569 - not sure why it's needed exactly
self.generic_definition = deepcopy(Watch.model(datastore_path = datastore_path, default={}))
if path.isfile('changedetectionio/source.txt'):
with open('changedetectionio/source.txt') as f:
@@ -59,38 +51,30 @@ class ChangeDetectionStore:
# So when someone gives us a backup file to examine, we know exactly what code they were running.
self.__data['build_sha'] = f.read()
self.generic_definition = deepcopy(Watch.model(datastore_path = datastore_path, default={}))
try:
# @todo retest with ", encoding='utf-8'"
with open(self.json_store_path) as json_file:
from_disk = json.load(json_file)
import os
# First load global settings from the main JSON file if it exists
if os.path.isfile(self.json_store_path):
with open(self.json_store_path) as json_file:
from_disk = json.load(json_file)
# Load app_guid and settings from the main JSON file
if 'app_guid' in from_disk:
self.__data['app_guid'] = from_disk['app_guid']
if 'settings' in from_disk:
if 'headers' in from_disk['settings']:
self.__data['settings']['headers'].update(from_disk['settings']['headers'])
if 'requests' in from_disk['settings']:
self.__data['settings']['requests'].update(from_disk['settings']['requests'])
if 'application' in from_disk['settings']:
self.__data['settings']['application'].update(from_disk['settings']['application'])
# @todo isn't there a way to do this dict.update recursively?
# Problem here is that if the one on disk is missing a sub-struct, it won't be present anymore.
if 'watching' in from_disk:
self.__data['watching'].update(from_disk['watching'])
if 'app_guid' in from_disk:
self.__data['app_guid'] = from_disk['app_guid']
if 'settings' in from_disk:
if 'headers' in from_disk['settings']:
self.__data['settings']['headers'].update(from_disk['settings']['headers'])
if 'requests' in from_disk['settings']:
self.__data['settings']['requests'].update(from_disk['settings']['requests'])
if 'application' in from_disk['settings']:
self.__data['settings']['application'].update(from_disk['settings']['application'])
# Convert each existing watch back to the Watch.model object
for uuid, watch in self.__data['watching'].items():
self.__data['watching'][uuid] = self.rehydrate_entity(uuid, watch)
logger.info(f"Watching: {uuid} {watch['url']}")
# And for Tags also, should be Restock type because it has extra settings
for uuid, tag in self.__data['settings']['application']['tags'].items():
self.__data['settings']['application']['tags'][uuid] = self.rehydrate_entity(uuid, tag, processor_override='restock_diff')
logger.info(f"Tag: {uuid} {tag['title']}")
# First time ran, Create the datastore.
except (FileNotFoundError):
@@ -109,6 +93,8 @@ class ChangeDetectionStore:
else:
# Bump the update version by running updates
self.scan_load_watches()
self.scan_load_tags()
self.run_updates()
self.__data['version_tag'] = version_tag
@@ -140,53 +126,93 @@ class ChangeDetectionStore:
secret = secrets.token_hex(16)
self.__data['settings']['application']['api_access_token'] = secret
self.needs_write = True
def scan_load_watches(self):
# Finally start the thread that will manage periodic data saves to JSON
save_data_thread = threading.Thread(target=self.save_datastore).start()
# Now scan for individual watch.json files in the datastore directory
import pathlib
watch_jsons = list(pathlib.Path(self.datastore_path).rglob("*/watch.json"))
def rehydrate_entity(self, uuid, entity, processor_override=None):
"""Set the dict back to the dict Watch object"""
entity['uuid'] = uuid
for watch_file in watch_jsons:
# Extract UUID from the directory name (parent directory of watch.json)
uuid = watch_file.parent.name
if processor_override:
watch_class = get_custom_watch_obj_for_processor(processor_override)
entity['processor']=processor_override
else:
watch_class = get_custom_watch_obj_for_processor(entity.get('processor'))
try:
with open(watch_file, 'r') as f:
watch_data = json.load(f)
# Create a Watch object and add it to the datastore
self.__data['watching'][uuid] = self.rehydrate_entity(default_dict=watch_data)
logger.info(f"Watching: {uuid} {watch_data.get('url')}")
if entity.get('uuid') != 'text_json_diff':
logger.trace(f"Loading Watch object '{watch_class.__module__}.{watch_class.__name__}' for UUID {uuid}")
except Exception as e:
logger.error(f"Error loading watch from {watch_file}: {str(e)}")
continue
logger.debug(f"{len(self.__data['watching'])} watches loaded.")
entity = watch_class(datastore_path=self.datastore_path, default=entity)
def scan_load_tags(self):
import pathlib
# Now scan for individual tag.json files in the tags directory
tags_path = os.path.join(self.datastore_path, 'tags')
if os.path.exists(tags_path):
tag_jsons = list(pathlib.Path(tags_path).rglob("*.json"))
for tag_file in tag_jsons:
# Extract UUID from the directory name (parent directory of tag.json)
try:
with open(tag_file, 'r') as f:
tag_data = json.load(f)
uuid = str(tag_file).replace('.json', '')
tag_data['uuid'] = uuid
# Create a Tag object and add it to the datastore
self.__data['settings']['application']['tags'][uuid] = self.rehydrate_entity(
default_dict=tag_data,
processor_override='restock_diff'
)
logger.info(f"Tag: {uuid} {tag_data.get('title', 'No title found')}")
except Exception as e:
logger.error(f"Error loading tag from {tag_file}: {str(e)}")
continue
logger.debug(f"{len(self.__data['settings']['application']['tags'])} tags loaded.")
def rehydrate_entity(self, default_dict: dict, processor_override=None):
if not processor_override and default_dict.get('processor'):
processor_override = default_dict.get('processor')
if not processor_override:
processor_override = 'text_json_diff'
watch_class = get_watch_model_for_processor(processor_override)
default_dict['processor'] = processor_override
entity = watch_class(datastore_path=self.datastore_path, default=default_dict)
entity.enable_saving()
return entity
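Read together, scan_load_watches(), scan_load_tags() and the reworked rehydrate_entity() replace the single monolithic url-watches.json with per-entity files. A sketch of the on-disk layout the rglob() patterns above expect (the UUID is made up for illustration):

# Layout implied by the scanners above:
#   <datastore_path>/url-watches.json               - application settings only (see save_settings below)
#   <datastore_path>/<watch-uuid>/watch.json        - one file per watch, found via rglob("*/watch.json")
#   <datastore_path>/tags/<tag-uuid>.json           - one file per tag
import pathlib

def list_watch_files(datastore_path):
    # Mirrors scan_load_watches(): the parent directory name is the watch UUID
    return {p.parent.name: p for p in pathlib.Path(datastore_path).rglob("*/watch.json")}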
def set_last_viewed(self, uuid, timestamp):
logger.debug(f"Setting watch UUID: {uuid} last viewed to {int(timestamp)}")
self.data['watching'][uuid].update({'last_viewed': int(timestamp)})
self.needs_write = True
self.data['watching'][uuid].save_data()
def remove_password(self):
self.__data['settings']['application']['password'] = False
self.needs_write = True
self.save_settings()
def update_watch(self, uuid, update_obj):
"""
Update a watch with new values using the deepmerge library.
"""
# It's possible that the watch could be deleted before update
if not self.__data['watching'].get(uuid):
if not uuid in self.data['watching'].keys() or update_obj is None:
return
with self.lock:
# In python 3.9 we have the |= dict operator, but that still will lose data on nested structures...
for dict_key, d in self.generic_definition.items():
if isinstance(d, dict):
if update_obj is not None and dict_key in update_obj:
self.__data['watching'][uuid][dict_key].update(update_obj[dict_key])
del (update_obj[dict_key])
# In python 3.9 we have the |= dict operator, but that still will lose data on nested structures...
for dict_key, d in self.generic_definition.items():
if isinstance(d, dict):
if update_obj is not None and dict_key in update_obj:
self.__data['watching'][uuid][dict_key].update(update_obj[dict_key])
del (update_obj[dict_key])
self.__data['watching'][uuid].update(update_obj)
self.__data['watching'][uuid].save_data()
self.__data['watching'][uuid].update(update_obj)
self.needs_write = True
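update_watch() is documented as using the deepmerge library (imported above as always_merger). The point over a plain dict.update() is that nested sub-dicts are merged rather than replaced wholesale; a standalone illustration:

from deepmerge import always_merger

base = {'time_between_check': {'hours': 3, 'minutes': 0}, 'title': 'Example'}
update = {'time_between_check': {'minutes': 30}}

always_merger.merge(base, update)   # merges in place and also returns base
assert base['time_between_check'] == {'hours': 3, 'minutes': 30}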
@property
def threshold_seconds(self):
@@ -246,19 +272,11 @@ class ChangeDetectionStore:
shutil.rmtree(path)
del self.data['watching'][uuid]
self.needs_write_urgent = True
# Clone a watch by UUID
def clone(self, uuid):
url = self.data['watching'][uuid].get('url')
extras = deepcopy(self.data['watching'][uuid])
extras = self.data['watching'][uuid]
new_uuid = self.add_watch(url=url, extras=extras)
watch = self.data['watching'][new_uuid]
if self.data['settings']['application'].get('extract_title_as_title') or watch['extract_title_as_title']:
# Because it will be recalculated on the next fetch
self.data['watching'][new_uuid]['title'] = None
return new_uuid
def url_exists(self, url):
@@ -273,7 +291,6 @@ class ChangeDetectionStore:
# Remove a watch's data but keep the entry (URL etc.)
def clear_watch_history(self, uuid):
self.__data['watching'][uuid].clear_watch()
self.needs_write_urgent = True
def add_watch(self, url, tag='', extras=None, tag_uuids=None, write_to_disk_now=True):
import requests
@@ -351,7 +368,7 @@ class ChangeDetectionStore:
apply_extras['tags'] = list(set(apply_extras.get('tags')))
# If the processor also has its own Watch implementation
watch_class = get_custom_watch_obj_for_processor(apply_extras.get('processor'))
watch_class = get_watch_model_for_processor(apply_extras.get('processor'))
new_watch = watch_class(datastore_path=self.datastore_path, url=url)
new_uuid = new_watch.get('uuid')
@@ -364,14 +381,11 @@ class ChangeDetectionStore:
if not apply_extras.get('date_created'):
apply_extras['date_created'] = int(time.time())
new_watch.update(apply_extras)
new_watch.ensure_data_dir_exists()
new_watch.update(apply_extras)
self.__data['watching'][new_uuid] = new_watch
if write_to_disk_now:
self.sync_to_json()
self.__data['watching'][new_uuid].save_data()
logger.debug(f"Added '{url}'")
return new_uuid
@@ -385,58 +399,22 @@ class ChangeDetectionStore:
return False
def sync_to_json(self):
logger.info("Saving JSON..")
def save_settings(self):
logger.info("Saving application settings...")
try:
data = deepcopy(self.__data)
except RuntimeError as e:
# Try again in 15 seconds
time.sleep(15)
logger.error(f"! Data changed when writing to JSON, trying again.. {str(e)}")
self.sync_to_json()
return
else:
try:
# Re #286 - First write to a temp file, then confirm it looks OK and rename it
# This is a fairly basic strategy to deal with the case that the file is corrupted,
# system was out of memory, out of RAM etc
with open(self.json_store_path+".tmp", 'w') as json_file:
json.dump(data, json_file, indent=4)
os.replace(self.json_store_path+".tmp", self.json_store_path)
except Exception as e:
logger.error(f"Error writing JSON!! (Main JSON file save was skipped) : {str(e)}")
self.needs_write = False
self.needs_write_urgent = False
# Thread runner, this helps with thread/write issues when there are many operations that want to update the JSON
# by just running periodically in one thread, according to python, dict updates are threadsafe.
def save_datastore(self):
while True:
if self.stop_thread:
# Suppressing "Logging error in Loguru Handler #0" during CICD.
# Not a meaningful difference for a real use-case, just for CICD.
# The side effect is a "Shutting down datastore thread" message
# at the end of each test.
# But it still looks better.
import sys
logger.remove()
logger.add(sys.stderr)
logger.critical("Shutting down datastore thread")
return
if self.needs_write or self.needs_write_urgent:
self.sync_to_json()
# Once per minute is enough, more and it can cause high CPU usage
# Better here would be to use something like self.app.config.exit.wait(1), but we can't get to 'app' from here
for i in range(120):
time.sleep(0.5)
if self.stop_thread or self.needs_write_urgent:
break
# Only save app settings, not the watches or tags (they're saved individually)
data = {'settings': self.__data.get('settings')}
#data = deepcopy(self.__data)
# Remove the watches from the main JSON file
if 'watching' in data:
del data['watching']
# Remove the tags from the main JSON file since they're saved individually now
# if 'settings' in data and 'application' in data['settings'] and 'tags' in data['settings']['application']:
# del data['settings']['application']['tags']
except Exception as e:
logger.error(f"Error writing settings JSON: {str(e)}")
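save_settings() now persists only the 'settings' branch, since watches and tags are written by their own save_data() calls. A hedged sketch of the settings-only write this is heading towards, reusing the temp-file-then-os.replace pattern from the former sync_to_json() shown above (the error handling here is illustrative):

import json
import os

def save_settings_sketch(json_store_path, data):
    payload = {'settings': data.get('settings')}     # watches/tags deliberately excluded
    tmp_path = json_store_path + ".tmp"
    with open(tmp_path, 'w') as f:
        json.dump(payload, f, indent=4)
    os.replace(tmp_path, json_store_path)            # atomic on the same filesystem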
# Go through the datastore path and remove any snapshots that are not mentioned in the index
# This usually is not used, but can be handy.
@@ -576,30 +554,31 @@ class ChangeDetectionStore:
return ret
def add_tag(self, title):
def add_tag(self, name):
# If name exists, return that
n = title.strip().lower()
n = name.strip().lower()
logger.debug(f">>> Adding new tag - '{n}'")
if not n:
return False
for uuid, tag in self.__data['settings']['application'].get('tags', {}).items():
if n == tag.get('title', '').lower().strip():
logger.warning(f"Tag '{title}' already exists, skipping creation.")
logger.warning(f"Tag '{name}' already exists, skipping creation.")
return uuid
# Eventually almost everything to do with a watch will apply as a Tag
# So we use the same model as a Watch
with self.lock:
from .model import Tag
new_tag = Tag.model(datastore_path=self.datastore_path, default={
'title': title.strip(),
'date_created': int(time.time())
})
from .model import Tag
new_tag = Tag.model(datastore_path=self.datastore_path, default={
'title': name.strip(),
'date_created': int(time.time())
})
new_uuid = new_tag.get('uuid')
new_uuid = new_tag.get('uuid')
self.__data['settings']['application']['tags'][new_uuid] = new_tag
self.__data['settings']['application']['tags'][new_uuid].save_data()
self.__data['settings']['application']['tags'][new_uuid] = new_tag
return new_uuid
@@ -636,41 +615,6 @@ class ChangeDetectionStore:
if watch.get('processor') == processor_name:
return True
return False
def search_watches_for_url(self, query, tag_limit=None, partial=False):
"""Search watches by URL, title, or error messages
Args:
query (str): Search term to match against watch URLs, titles, and error messages
tag_limit (str, optional): Optional tag name to limit search results
partial: (bool, optional): sub-string matching
Returns:
list: List of UUIDs of watches that match the search criteria
"""
matching_uuids = []
query = query.lower().strip()
tag = self.tag_exists_by_name(tag_limit) if tag_limit else False
for uuid, watch in self.data['watching'].items():
# Filter by tag if requested
if tag_limit:
if not tag.get('uuid') in watch.get('tags', []):
continue
# Search in URL, title, or error messages
if partial:
if ((watch.get('title') and query in watch.get('title').lower()) or
query in watch.get('url', '').lower() or
(watch.get('last_error') and query in watch.get('last_error').lower())):
matching_uuids.append(uuid)
else:
if ((watch.get('title') and query == watch.get('title').lower()) or
query == watch.get('url', '').lower() or
(watch.get('last_error') and query == watch.get('last_error').lower())):
matching_uuids.append(uuid)
return matching_uuids
def get_unique_notification_tokens_available(self):
# Ask each type of watch if they have any extra notification token to add to the validation
@@ -887,7 +831,7 @@ class ChangeDetectionStore:
if tag:
tag_uuids = []
for t in tag.split(','):
tag_uuids.append(self.add_tag(title=t))
tag_uuids.append(self.add_tag(name=t))
self.data['watching'][uuid]['tags'] = tag_uuids
@@ -930,6 +874,7 @@ class ChangeDetectionStore:
# Migrate old 'in_stock' values to the new Restock
def update_17(self):
from .processors.restock_diff import Restock
for uuid, watch in self.data['watching'].items():
if 'in_stock' in watch:
watch['restock'] = Restock({'in_stock': watch.get('in_stock')})

View File

@@ -1,3 +1,7 @@
{% macro hasattr(obj, name) -%}
{{ obj is defined and name in obj.__dict__ }}
{%- endmacro %}
{% macro render_field(field) %}
<div {% if field.errors %} class="error" {% endif %}>{{ field.label }}</div>
<div {% if field.errors %} class="error" {% endif %}>{{ field(**kwargs)|safe }}

View File

@@ -42,7 +42,7 @@
<a class="pure-menu-heading" href="https://changedetection.io" rel="noopener">
<strong>Change</strong>Detection.io</a>
{% else %}
<a class="pure-menu-heading" href="{{url_for('watchlist.index')}}">
<a class="pure-menu-heading" href="{{url_for('index')}}">
<strong>Change</strong>Detection.io</a>
{% endif %}
{% if current_diff_url %}
@@ -157,13 +157,15 @@
<h4>Try our Chrome extension</h4>
<p>
<a id="chrome-extension-link"
title="Chrome Extension - Web Page Change Detection with changedetection.io!"
title="Try our new Chrome Extension!"
href="https://chromewebstore.google.com/detail/changedetectionio-website/kefcfmgmlhmankjmnbijimhofdjekbop">
<img alt="Chrome store icon" src="{{url_for('static_content', group='images', filename='Google-Chrome-icon.png')}}">
Chrome Webstore
</a>
</p>
Easily add the current web page from your browser directly into your changedetection.io tool; more great features coming soon!
<h4>Changedetection.io needs your support!</h4>
<p>
You can help us by supporting changedetection.io on these platforms;
@@ -171,20 +173,17 @@
<p>
<ul>
<li>
<a href="https://alternativeto.net/software/changedetection-io/about/" title="Web page change detection at alternativeto.net">Rate us at
<a href="https://alternativeto.net/software/changedetection-io/about/">Rate us at
AlternativeTo.net</a>
</li>
<li>
<a href="https://github.com/dgtlmoon/changedetection.io" title="Web page change detection on GitHub">Star us on GitHub</a>
<a href="https://github.com/dgtlmoon/changedetection.io">Star us on GitHub</a>
</li>
<li>
<a rel="nofollow" href="https://twitter.com/change_det_io" title="Web page change detection on Twitter">Follow us at Twitter/X</a>
<a href="https://twitter.com/change_det_io">Follow us at Twitter/X</a>
</li>
<li>
<a rel="nofollow" href="https://www.g2.com/products/changedetection-io/reviews" title="Web page change detection reviews at G2">G2 Software reviews</a>
</li>
<li>
<a rel="nofollow" href="https://www.linkedin.com/company/changedetection-io" title="Visit web page change detection at LinkedIn">Check us out on LinkedIn</a>
<a href="https://www.linkedin.com/company/changedetection-io">Check us out on LinkedIn</a>
</li>
<li>
And tell your friends and colleagues :)

View File

@@ -0,0 +1,49 @@
{% extends 'base.html' %} {% block content %}
<div class="edit-form">
<div class="box-wrap inner">
<form
class="pure-form pure-form-stacked"
action="{{url_for('ui.clear_all_history')}}"
method="POST"
>
<input type="hidden" name="csrf_token" value="{{ csrf_token() }}" >
<fieldset>
<div class="pure-control-group">
This will remove version history (snapshots) for ALL watches, but keep
your list of URLs! <br />
You may like to use the <strong>BACKUP</strong> link first.<br />
</div>
<br />
<div class="pure-control-group">
<label for="confirmtext">Confirmation text</label>
<input
type="text"
id="confirmtext"
required=""
name="confirmtext"
value=""
size="10"
/>
<span class="pure-form-message-inline"
>Type in the word <strong>clear</strong> to confirm that you
understand.</span
>
</div>
<br />
<div class="pure-control-group">
<button type="submit" class="pure-button pure-button-primary">
Clear History!
</button>
</div>
<br />
<div class="pure-control-group">
<a href="{{url_for('index')}}" class="pure-button button-cancel"
>Cancel</a
>
</div>
</fieldset>
</form>
</div>
</div>
{% endblock %}

View File

@@ -305,8 +305,8 @@ Math: {{ 1 + 1 }}") }}
{{ render_field(form.conditions_match_logic) }}
{{ render_fieldlist_of_formfields_as_table(form.conditions) }}
<div class="pure-form-message-inline">
<p id="verify-state-text">Use the verify (✓) button to test if a condition passes against the current snapshot.</p>
<br>
Use the verify (✓) button to test if a condition passes against the current snapshot.<br><br>
Did you know that <strong>conditions</strong> can be extended with your own custom plugin? Tutorials coming soon!<br>
</div>
</div>
@@ -588,10 +588,10 @@ keyword") }}
{{ render_button(form.save_button) }}
<a href="{{url_for('ui.form_delete', uuid=uuid)}}"
class="pure-button button-small button-error ">Delete</a>
{% if watch.history_n %}<a href="{{url_for('ui.clear_watch_history', uuid=uuid)}}"
class="pure-button button-small button-error ">Clear History</a>{% endif %}
<a href="{{url_for('ui.clear_watch_history', uuid=uuid)}}"
class="pure-button button-small button-error ">Clear History</a>
<a href="{{url_for('ui.form_clone', uuid=uuid)}}"
class="pure-button button-small ">Clone &amp; Edit</a>
class="pure-button button-small ">Create Copy</a>
</div>
</div>
</form>

View File

@@ -0,0 +1,125 @@
{% extends 'base.html' %}
{% block content %}
{% from '_helpers.html' import render_field %}
<script src="{{url_for('static_content', group='js', filename='tabs.js')}}" defer></script>
<div class="edit-form monospaced-textarea">
<div class="tabs collapsable">
<ul>
<li class="tab" id=""><a href="#url-list">URL List</a></li>
<li class="tab"><a href="#distill-io">Distill.io</a></li>
<li class="tab"><a href="#xlsx">.XLSX &amp; Wachete</a></li>
</ul>
</div>
<div class="box-wrap inner">
<form class="pure-form" action="{{url_for('imports.import_page')}}" method="POST" enctype="multipart/form-data">
<input type="hidden" name="csrf_token" value="{{ csrf_token() }}">
<div class="tab-pane-inner" id="url-list">
<legend>
Enter one URL per line, and optionally add tags for each URL after a space, separated by comma
(,):
<br>
<code>https://example.com tag1, tag2, last tag</code>
<br>
URLs which do not pass validation will stay in the textarea.
</legend>
{{ render_field(form.processor, class="processor") }}
<textarea name="urls" class="pure-input-1-2" placeholder="https://"
style="width: 100%;
font-family:monospace;
white-space: pre;
overflow-wrap: normal;
overflow-x: scroll;" rows="25">{{ import_url_list_remaining }}</textarea>
<div id="quick-watch-processor-type">
</div>
</div>
<div class="tab-pane-inner" id="distill-io">
<legend>
Copy and paste your Distill.io watch 'export' file; this should be a JSON file.<br>
This is <i>experimental</i>; supported fields are <code>name</code>, <code>uri</code>, <code>tags</code>, <code>config:selections</code>; the rest (including <code>schedule</code>) is ignored.
<br>
<p>
How to export? <a href="https://distill.io/docs/web-monitor/how-export-and-import-monitors/">https://distill.io/docs/web-monitor/how-export-and-import-monitors/</a><br>
Be sure to set your default fetcher to Chrome if required.<br>
</p>
</legend>
<textarea name="distill-io" class="pure-input-1-2" style="width: 100%;
font-family:monospace;
white-space: pre;
overflow-wrap: normal;
overflow-x: scroll;" placeholder="Example Distill.io JSON export file
{
&quot;client&quot;: {
&quot;local&quot;: 1
},
&quot;data&quot;: [
{
&quot;name&quot;: &quot;Unraid | News&quot;,
&quot;uri&quot;: &quot;https://unraid.net/blog&quot;,
&quot;config&quot;: &quot;{\&quot;selections\&quot;:[{\&quot;frames\&quot;:[{\&quot;index\&quot;:0,\&quot;excludes\&quot;:[],\&quot;includes\&quot;:[{\&quot;type\&quot;:\&quot;xpath\&quot;,\&quot;expr\&quot;:\&quot;(//div[@id='App']/div[contains(@class,'flex')]/main[contains(@class,'relative')]/section[contains(@class,'relative')]/div[@class='container']/div[contains(@class,'flex')]/div[contains(@class,'w-full')])[1]\&quot;}]}],\&quot;dynamic\&quot;:true,\&quot;delay\&quot;:2}],\&quot;ignoreEmptyText\&quot;:true,\&quot;includeStyle\&quot;:false,\&quot;dataAttr\&quot;:\&quot;text\&quot;}&quot;,
&quot;tags&quot;: [],
&quot;content_type&quot;: 2,
&quot;state&quot;: 40,
&quot;schedule&quot;: &quot;{\&quot;type\&quot;:\&quot;INTERVAL\&quot;,\&quot;params\&quot;:{\&quot;interval\&quot;:4447}}&quot;,
&quot;ts&quot;: &quot;2022-03-27T15:51:15.667Z&quot;
}
]
}
" rows="25">{{ original_distill_json }}</textarea>
</div>
<div class="tab-pane-inner" id="xlsx">
<fieldset>
<div class="pure-control-group">
{{ render_field(form.xlsx_file, class="processor") }}
</div>
<div class="pure-control-group">
{{ render_field(form.file_mapping, class="processor") }}
</div>
</fieldset>
<div class="pure-control-group">
<span class="pure-form-message-inline">
Table of custom column and data type mappings for the <strong>Custom mapping</strong> file mapping type.
</span>
<table style="border: 1px solid #aaa; padding: 0.5rem; border-radius: 4px;">
<tr>
<td><strong>Column #</strong></td>
{% for n in range(4) %}
<td><input type="number" name="custom_xlsx[col_{{n}}]" style="width: 4rem;" min="1"></td>
{% endfor %}
</tr>
<tr>
<td><strong>Type</strong></td>
{% for n in range(4) %}
<td><select name="custom_xlsx[col_type_{{n}}]">
<option value="" style="color: #aaa"> -- none --</option>
<option value="url">URL</option>
<option value="title">Title</option>
<option value="include_filters">CSS/xPath filter</option>
<option value="tag">Group / Tag name(s)</option>
<option value="interval_minutes">Recheck time (minutes)</option>
</select></td>
{% endfor %}
</tr>
</table>
</div>
</div>
<button type="submit" class="pure-button pure-input-1-2 pure-button-primary">Import</button>
</form>
</div>
</div>
{% endblock %}

View File

@@ -0,0 +1,19 @@
{% extends 'base.html' %}
{% block content %}
<div class="edit-form">
<div class="inner">
<h4 style="margin-top: 0px;">Notification debug log</h4>
<div id="notification-error-log">
<ul style="font-size: 80%; margin:0px; padding: 0 0 0 7px">
{% for log in logs|reverse %}
<li>{{log}}</li>
{% endfor %}
</ul>
</div>
</div>
</div>
{% endblock %}

View File

@@ -3,16 +3,7 @@
{% from '_helpers.html' import render_simple_field, render_field, render_nolabel_field, sort_by_title %}
<script src="{{url_for('static_content', group='js', filename='jquery-3.6.0.min.js')}}"></script>
<script src="{{url_for('static_content', group='js', filename='watch-overview.js')}}" defer></script>
<script>let nowtimeserver={{ now_time_server }};</script>
<style>
.checking-now .last-checked {
background-image: linear-gradient(to bottom, transparent 0%, rgba(0,0,0,0.05) 40%, rgba(0,0,0,0.1) 100%);
background-size: 0 100%;
background-repeat: no-repeat;
transition: background-size 0.9s ease
}
</style>
<div class="box">
<form class="pure-form" action="{{ url_for('ui.ui_views.form_quick_watch_add', tag=active_tag_uuid) }}" method="POST" id="new-watch-form">
@@ -55,12 +46,12 @@
{% endif %}
{% if search_q %}<div id="search-result-info">Searching "<strong><i>{{search_q}}</i></strong>"</div>{% endif %}
<div>
<a href="{{url_for('watchlist.index')}}" class="pure-button button-tag {{'active' if not active_tag_uuid }}">All</a>
<a href="{{url_for('index')}}" class="pure-button button-tag {{'active' if not active_tag_uuid }}">All</a>
<!-- tag list -->
{% for uuid, tag in tags %}
{% if tag != "" %}
<a href="{{url_for('watchlist.index', tag=uuid) }}" class="pure-button button-tag {{'active' if active_tag_uuid == uuid }}">{{ tag.title }}</a>
<a href="{{url_for('index', tag=uuid) }}" class="pure-button button-tag {{'active' if active_tag_uuid == uuid }}">{{ tag.title }}</a>
{% endif %}
{% endfor %}
</div>
@@ -81,14 +72,14 @@
<tr>
{% set link_order = "desc" if sort_order == 'asc' else "asc" %}
{% set arrow_span = "" %}
<th><input style="vertical-align: middle" type="checkbox" id="check-all" > <a class="{{ 'active '+link_order if sort_attribute == 'date_created' else 'inactive' }}" href="{{url_for('watchlist.index', sort='date_created', order=link_order, tag=active_tag_uuid)}}"># <span class='arrow {{link_order}}'></span></a></th>
<th><input style="vertical-align: middle" type="checkbox" id="check-all" > <a class="{{ 'active '+link_order if sort_attribute == 'date_created' else 'inactive' }}" href="{{url_for('index', sort='date_created', order=link_order, tag=active_tag_uuid)}}"># <span class='arrow {{link_order}}'></span></a></th>
<th class="empty-cell"></th>
<th><a class="{{ 'active '+link_order if sort_attribute == 'label' else 'inactive' }}" href="{{url_for('watchlist.index', sort='label', order=link_order, tag=active_tag_uuid)}}">Website <span class='arrow {{link_order}}'></span></a></th>
<th><a class="{{ 'active '+link_order if sort_attribute == 'label' else 'inactive' }}" href="{{url_for('index', sort='label', order=link_order, tag=active_tag_uuid)}}">Website <span class='arrow {{link_order}}'></span></a></th>
{% if any_has_restock_price_processor %}
<th>Restock &amp; Price</th>
{% endif %}
<th><a class="{{ 'active '+link_order if sort_attribute == 'last_checked' else 'inactive' }}" href="{{url_for('watchlist.index', sort='last_checked', order=link_order, tag=active_tag_uuid)}}"><span class="hide-on-mobile">Last</span> Checked <span class='arrow {{link_order}}'></span></a></th>
<th><a class="{{ 'active '+link_order if sort_attribute == 'last_changed' else 'inactive' }}" href="{{url_for('watchlist.index', sort='last_changed', order=link_order, tag=active_tag_uuid)}}"><span class="hide-on-mobile">Last</span> Changed <span class='arrow {{link_order}}'></span></a></th>
<th><a class="{{ 'active '+link_order if sort_attribute == 'last_checked' else 'inactive' }}" href="{{url_for('index', sort='last_checked', order=link_order, tag=active_tag_uuid)}}"><span class="hide-on-mobile">Last</span> Checked <span class='arrow {{link_order}}'></span></a></th>
<th><a class="{{ 'active '+link_order if sort_attribute == 'last_changed' else 'inactive' }}" href="{{url_for('index', sort='last_changed', order=link_order, tag=active_tag_uuid)}}"><span class="hide-on-mobile">Last</span> Changed <span class='arrow {{link_order}}'></span></a></th>
<th class="empty-cell"></th>
</tr>
</thead>
@@ -100,8 +91,8 @@
{% endif %}
{% for watch in (watches|sort(attribute=sort_attribute, reverse=sort_order == 'asc'))|pagination_slice(skip=pagination.skip) %}
{% set is_unviewed = watch.newest_history_key| int > watch.last_viewed and watch.history_n>=2 %}
{% set checking_now = is_checking_now(watch) %}
{% set is_unviewed = watch.newest_history_key| int > watch.last_viewed and watch.history_n>=2 %}
<tr id="{{ watch.uuid }}"
class="{{ loop.cycle('pure-table-odd', 'pure-table-even') }} processor-{{ watch['processor'] }}
{% if watch.last_error is defined and watch.last_error != False %}error{% endif %}
@@ -109,18 +100,16 @@
{% if watch.paused is defined and watch.paused != False %}paused{% endif %}
{% if is_unviewed %}unviewed{% endif %}
{% if watch.has_restock_info %} has-restock-info {% if watch['restock']['in_stock'] %}in-stock{% else %}not-in-stock{% endif %} {% else %}no-restock-info{% endif %}
{% if watch.uuid in queued_uuids %}queued{% endif %}
{% if checking_now %}checking-now{% endif %}
">
{% if watch.uuid in queued_uuids %}queued{% endif %}">
<td class="inline checkbox-uuid" ><input name="uuids" type="checkbox" value="{{ watch.uuid}} " > <span>{{ loop.index+pagination.skip }}</span></td>
<td class="inline watch-controls">
{% if not watch.paused %}
<a class="state-off" href="{{url_for('watchlist.index', op='pause', uuid=watch.uuid, tag=active_tag_uuid)}}"><img src="{{url_for('static_content', group='images', filename='pause.svg')}}" alt="Pause checks" title="Pause checks" class="icon icon-pause" ></a>
<a class="state-off" href="{{url_for('index', op='pause', uuid=watch.uuid, tag=active_tag_uuid)}}"><img src="{{url_for('static_content', group='images', filename='pause.svg')}}" alt="Pause checks" title="Pause checks" class="icon icon-pause" ></a>
{% else %}
<a class="state-on" href="{{url_for('watchlist.index', op='pause', uuid=watch.uuid, tag=active_tag_uuid)}}"><img src="{{url_for('static_content', group='images', filename='play.svg')}}" alt="UnPause checks" title="UnPause checks" class="icon icon-unpause" ></a>
<a class="state-on" href="{{url_for('index', op='pause', uuid=watch.uuid, tag=active_tag_uuid)}}"><img src="{{url_for('static_content', group='images', filename='play.svg')}}" alt="UnPause checks" title="UnPause checks" class="icon icon-unpause" ></a>
{% endif %}
{% set mute_label = 'UnMute notification' if watch.notification_muted else 'Mute notification' %}
<a class="link-mute state-{{'on' if watch.notification_muted else 'off'}}" href="{{url_for('watchlist.index', op='mute', uuid=watch.uuid, tag=active_tag_uuid)}}"><img src="{{url_for('static_content', group='images', filename='bell-off.svg')}}" alt="{{ mute_label }}" title="{{ mute_label }}" class="icon icon-mute" ></a>
<a class="link-mute state-{{'on' if watch.notification_muted else 'off'}}" href="{{url_for('index', op='mute', uuid=watch.uuid, tag=active_tag_uuid)}}"><img src="{{url_for('static_content', group='images', filename='bell-off.svg')}}" alt="{{ mute_label }}" title="{{ mute_label }}" class="icon icon-mute" ></a>
</td>
<td class="title-col inline">{{watch.title if watch.title is not none and watch.title|length > 0 else watch.url}}
<a class="external" target="_blank" rel="noopener" href="{{ watch.link.replace('source:','') }}"></a>
@@ -189,14 +178,7 @@
{% endif %}
</td>
{% endif %}
{#last_checked becomes fetch-start-time#}
<td class="last-checked" data-timestamp="{{ watch.last_checked }}" {% if checking_now %} data-fetchduration={{ watch.fetch_time }} data-eta_complete="{{ watch.last_checked+watch.fetch_time }}" {% endif %} >
{% if checking_now %}
<span class="spinner"></span><span> Checking now</span>
{% else %}
{{watch|format_last_checked_time|safe}}</td>
{% endif %}
<td class="last-checked" data-timestamp="{{ watch.last_checked }}">{{watch|format_last_checked_time|safe}}</td>
<td class="last-changed" data-timestamp="{{ watch.last_changed }}">{% if watch.history_n >=2 and watch.last_changed >0 %}
{{watch.last_changed|format_timestamp_timeago}}
{% else %}
@@ -228,7 +210,7 @@
<ul id="post-list-buttons">
{% if errored_count %}
<li>
<a href="{{url_for('watchlist.index', with_errors=1, tag=request.args.get('tag')) }}" class="pure-button button-tag button-error ">With errors ({{ errored_count }})</a>
<a href="{{url_for('index', with_errors=1, tag=request.args.get('tag')) }}" class="pure-button button-tag button-error ">With errors ({{ errored_count }})</a>
</li>
{% endif %}
{% if has_unviewed %}

View File

@@ -1,24 +0,0 @@
import pytest
from apprise import AppriseAsset
from changedetectionio.apprise_asset import (
APPRISE_APP_DESC,
APPRISE_APP_ID,
APPRISE_APP_URL,
APPRISE_AVATAR_URL,
)
@pytest.fixture(scope="function")
def apprise_asset() -> AppriseAsset:
from changedetectionio.apprise_asset import apprise_asset
return apprise_asset
def test_apprise_asset_init(apprise_asset: AppriseAsset):
assert isinstance(apprise_asset, AppriseAsset)
assert apprise_asset.app_id == APPRISE_APP_ID
assert apprise_asset.app_desc == APPRISE_APP_DESC
assert apprise_asset.app_url == APPRISE_APP_URL
assert apprise_asset.image_url_logo == APPRISE_AVATAR_URL

View File

@@ -1,211 +0,0 @@
import json
from unittest.mock import patch
import pytest
import requests
from apprise.utils.parse import parse_url as apprise_parse_url
from ...apprise_plugin.custom_handlers import (
_get_auth,
_get_headers,
_get_params,
apprise_http_custom_handler,
SUPPORTED_HTTP_METHODS,
)
@pytest.mark.parametrize(
"url,expected_auth",
[
("get://user:pass@localhost:9999", ("user", "pass")),
("get://user@localhost:9999", "user"),
("get://localhost:9999", ""),
("get://user%20name:pass%20word@localhost:9999", ("user name", "pass word")),
],
)
def test_get_auth(url, expected_auth):
"""Test authentication extraction with various URL formats."""
parsed_url = apprise_parse_url(url)
assert _get_auth(parsed_url) == expected_auth
@pytest.mark.parametrize(
"url,body,expected_content_type",
[
(
"get://localhost:9999?+content-type=application/xml",
"test",
"application/xml",
),
("get://localhost:9999", '{"key": "value"}', "application/json; charset=utf-8"),
("get://localhost:9999", "plain text", None),
("get://localhost:9999?+content-type=text/plain", "test", "text/plain"),
],
)
def test_get_headers(url, body, expected_content_type):
"""Test header extraction and content type detection."""
parsed_url = apprise_parse_url(url)
headers = _get_headers(parsed_url, body)
if expected_content_type:
assert headers.get("Content-Type") == expected_content_type
@pytest.mark.parametrize(
"url,expected_params",
[
("get://localhost:9999?param1=value1", {"param1": "value1"}),
("get://localhost:9999?param1=value1&-param2=ignored", {"param1": "value1"}),
("get://localhost:9999?param1=value1&+header=test", {"param1": "value1"}),
(
"get://localhost:9999?encoded%20param=encoded%20value",
{"encoded param": "encoded value"},
),
],
)
def test_get_params(url, expected_params):
"""Test parameter extraction with URL encoding and exclusion logic."""
parsed_url = apprise_parse_url(url)
params = _get_params(parsed_url)
assert dict(params) == expected_params
@pytest.mark.parametrize(
"url,schema,method",
[
("get://localhost:9999", "get", "GET"),
("post://localhost:9999", "post", "POST"),
("delete://localhost:9999", "delete", "DELETE"),
],
)
@patch("requests.request")
def test_apprise_custom_api_call_success(mock_request, url, schema, method):
"""Test successful API calls with different HTTP methods and schemas."""
mock_request.return_value.raise_for_status.return_value = None
meta = {"url": url, "schema": schema}
result = apprise_http_custom_handler(
body="test body", title="Test Title", notify_type="info", meta=meta
)
assert result is True
mock_request.assert_called_once()
call_args = mock_request.call_args
assert call_args[1]["method"] == method.upper()
assert call_args[1]["url"].startswith("http")
@patch("requests.request")
def test_apprise_custom_api_call_with_auth(mock_request):
"""Test API call with authentication."""
mock_request.return_value.raise_for_status.return_value = None
url = "get://user:pass@localhost:9999/secure"
meta = {"url": url, "schema": "get"}
result = apprise_http_custom_handler(
body=json.dumps({"key": "value"}),
title="Secure Test",
notify_type="info",
meta=meta,
)
assert result is True
mock_request.assert_called_once()
call_args = mock_request.call_args
assert call_args[1]["auth"] == ("user", "pass")
@pytest.mark.parametrize(
"exception_type,expected_result",
[
(requests.RequestException, False),
(requests.HTTPError, False),
(Exception, False),
],
)
@patch("requests.request")
def test_apprise_custom_api_call_failure(mock_request, exception_type, expected_result):
"""Test various failure scenarios."""
url = "get://localhost:9999/error"
meta = {"url": url, "schema": "get"}
# Simulate different types of exceptions
mock_request.side_effect = exception_type("Error occurred")
result = apprise_http_custom_handler(
body="error body", title="Error Test", notify_type="error", meta=meta
)
assert result == expected_result
def test_invalid_url_parsing():
"""Test handling of invalid URL parsing."""
meta = {"url": "invalid://url", "schema": "invalid"}
result = apprise_http_custom_handler(
body="test", title="Invalid URL", notify_type="info", meta=meta
)
assert result is False
@pytest.mark.parametrize(
"schema,expected_method",
[
(http_method, http_method.upper())
for http_method in SUPPORTED_HTTP_METHODS
],
)
@patch("requests.request")
def test_http_methods(mock_request, schema, expected_method):
"""Test all supported HTTP methods."""
mock_request.return_value.raise_for_status.return_value = None
url = f"{schema}://localhost:9999"
result = apprise_http_custom_handler(
body="test body",
title="Test Title",
notify_type="info",
meta={"url": url, "schema": schema},
)
assert result is True
mock_request.assert_called_once()
call_args = mock_request.call_args
assert call_args[1]["method"] == expected_method
@pytest.mark.parametrize(
"input_schema,expected_method",
[
(f"{http_method}s", http_method.upper())
for http_method in SUPPORTED_HTTP_METHODS
],
)
@patch("requests.request")
def test_https_method_conversion(
mock_request, input_schema, expected_method
):
"""Validate that methods ending with 's' use HTTPS and correct HTTP method."""
mock_request.return_value.raise_for_status.return_value = None
url = f"{input_schema}://localhost:9999"
result = apprise_http_custom_handler(
body="test body",
title="Test Title",
notify_type="info",
meta={"url": url, "schema": input_schema},
)
assert result is True
mock_request.assert_called_once()
call_args = mock_request.call_args
assert call_args[1]["method"] == expected_method
assert call_args[1]["url"].startswith("https")

View File

@@ -1,5 +1,5 @@
#!/usr/bin/env python3
import psutil
import resource
import time
from threading import Thread
@@ -28,10 +28,9 @@ def reportlog(pytestconfig):
def track_memory(memory_usage, ):
process = psutil.Process(os.getpid())
while not memory_usage["stop"]:
current_rss = process.memory_info().rss
memory_usage["peak"] = max(memory_usage["peak"], current_rss)
max_rss = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss
memory_usage["peak"] = max(memory_usage["peak"], max_rss)
time.sleep(0.01) # Adjust the sleep time as needed
@pytest.fixture(scope='function')
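The memory tracker swaps psutil's RSS sampling for the standard-library resource module, which avoids the extra dependency but reports peak RSS rather than current RSS, and in different units per platform (kilobytes on Linux, bytes on macOS). A standalone illustration of the call:

import resource
import sys

peak = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss
# ru_maxrss is kilobytes on Linux, bytes on macOS
peak_mb = peak / (1024 * 1024) if sys.platform == 'darwin' else peak / 1024
print(f"Peak RSS so far: {peak_mb:.1f} MB")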

View File

@@ -36,7 +36,7 @@ def test_select_custom(client, live_server, measure_memory_usage):
assert b"1 Imported" in res.data
wait_for_all_checks(client)
res = client.get(url_for("watchlist.index"))
res = client.get(url_for("index"))
assert b'Proxy Authentication Required' not in res.data
res = client.get(

View File

@@ -83,14 +83,14 @@ def test_restock_detection(client, live_server, measure_memory_usage):
# Is it correctly show as NOT in stock?
wait_for_all_checks(client)
res = client.get(url_for("watchlist.index"))
res = client.get(url_for("index"))
assert b'not-in-stock' in res.data
# Is it correctly shown as in stock
set_back_in_stock_response()
client.get(url_for("ui.form_watch_checknow"), follow_redirects=True)
wait_for_all_checks(client)
res = client.get(url_for("watchlist.index"))
res = client.get(url_for("index"))
assert b'not-in-stock' not in res.data
# We should have a notification
@@ -107,6 +107,6 @@ def test_restock_detection(client, live_server, measure_memory_usage):
assert not os.path.isfile("test-datastore/notification.txt"), "No notification should have fired when it went OUT OF STOCK by default"
# BUT we should see that it correctly shows "not in stock"
res = client.get(url_for("watchlist.index"))
res = client.get(url_for("index"))
assert b'not-in-stock' in res.data, "Correctly showing NOT IN STOCK in the list after it changed from IN STOCK"

View File

@@ -1,4 +1,4 @@
from .util import live_server_setup, wait_for_all_checks
from .util import live_server_setup
from flask import url_for
import time
@@ -44,7 +44,7 @@ def test_check_access_control(app, client, live_server):
assert b"Password protection enabled." in res.data
# Check we hit the login
res = c.get(url_for("watchlist.index"), follow_redirects=True)
res = c.get(url_for("index"), follow_redirects=True)
# Should be logged out
assert b"Login" in res.data
@@ -52,14 +52,6 @@ def test_check_access_control(app, client, live_server):
res = c.get(url_for("ui.ui_views.diff_history_page", uuid="first"))
assert b'Random content' in res.data
# access to assets should work (check_authentication)
res = c.get(url_for('static_content', group='js', filename='jquery-3.6.0.min.js'))
assert res.status_code == 200
res = c.get(url_for('static_content', group='styles', filename='styles.css'))
assert res.status_code == 200
res = c.get(url_for('static_content', group='styles', filename='404-testetest.css'))
assert res.status_code == 404
# Check wrong password does not let us in
res = c.post(
url_for("login"),
@@ -172,7 +164,7 @@ def test_check_access_control(app, client, live_server):
assert b"Password protection enabled." in res.data
# Check we hit the login
res = c.get(url_for("watchlist.index"), follow_redirects=True)
res = c.get(url_for("index"), follow_redirects=True)
# Should be logged out
assert b"Login" in res.data

View File

@@ -72,7 +72,7 @@ def test_check_removed_line_contains_trigger(client, live_server, measure_memory
res = client.get(url_for("ui.form_watch_checknow"), follow_redirects=True)
assert b'Queued 1 watch for rechecking.' in res.data
wait_for_all_checks(client)
res = client.get(url_for("watchlist.index"))
res = client.get(url_for("index"))
assert b'unviewed' not in res.data
# The trigger line is REMOVED, this should trigger
@@ -81,7 +81,7 @@ def test_check_removed_line_contains_trigger(client, live_server, measure_memory
# Check in the processor here what's going on; it's triggering empty-reply and no change.
client.get(url_for("ui.form_watch_checknow"), follow_redirects=True)
wait_for_all_checks(client)
res = client.get(url_for("watchlist.index"))
res = client.get(url_for("index"))
assert b'unviewed' in res.data
@@ -90,14 +90,14 @@ def test_check_removed_line_contains_trigger(client, live_server, measure_memory
set_original(excluding=None)
client.get(url_for("ui.form_watch_checknow"), follow_redirects=True)
wait_for_all_checks(client)
res = client.get(url_for("watchlist.index"))
res = client.get(url_for("index"))
assert b'unviewed' not in res.data
# Remove it again, and we should get a trigger
set_original(excluding='The golden line')
client.get(url_for("ui.form_watch_checknow"), follow_redirects=True)
wait_for_all_checks(client)
res = client.get(url_for("watchlist.index"))
res = client.get(url_for("index"))
assert b'unviewed' in res.data
res = client.get(url_for("ui.form_delete", uuid="all"), follow_redirects=True)
@@ -157,14 +157,14 @@ def test_check_add_line_contains_trigger(client, live_server, measure_memory_usa
assert b'Queued 1 watch for rechecking.' in res.data
wait_for_all_checks(client)
res = client.get(url_for("watchlist.index"))
res = client.get(url_for("index"))
assert b'unviewed' not in res.data
# The trigger line is ADDED, this should trigger
set_original(add_line='<p>Oh yes please</p>')
client.get(url_for("ui.form_watch_checknow"), follow_redirects=True)
wait_for_all_checks(client)
res = client.get(url_for("watchlist.index"))
res = client.get(url_for("index"))
assert b'unviewed' in res.data

View File

@@ -2,7 +2,7 @@
import time
from flask import url_for
from .util import live_server_setup, wait_for_all_checks
from .util import live_server_setup, extract_api_key_from_UI, wait_for_all_checks
import json
import uuid
@@ -58,14 +58,14 @@ def test_setup(client, live_server, measure_memory_usage):
def test_api_simple(client, live_server, measure_memory_usage):
#live_server_setup(live_server)
api_key = live_server.app.config['DATASTORE'].data['settings']['application'].get('api_access_token')
api_key = extract_api_key_from_UI(client)
# Create a watch
set_original_response()
# Validate bad URL
test_url = url_for('test_endpoint', _external=True )
test_url = url_for('test_endpoint', _external=True,
headers={'x-api-key': api_key}, )
res = client.post(
url_for("createwatch"),
data=json.dumps({"url": "h://xxxxxxxxxom"}),
@@ -290,13 +290,13 @@ def test_access_denied(client, live_server, measure_memory_usage):
assert b"Settings updated." in res.data
def test_api_watch_PUT_update(client, live_server, measure_memory_usage):
#live_server_setup(live_server)
api_key = live_server.app.config['DATASTORE'].data['settings']['application'].get('api_access_token')
api_key = extract_api_key_from_UI(client)
# Create a watch
set_original_response()
test_url = url_for('test_endpoint', _external=True)
test_url = url_for('test_endpoint', _external=True,
headers={'x-api-key': api_key}, )
# Create new
res = client.post(
@@ -371,8 +371,7 @@ def test_api_watch_PUT_update(client, live_server, measure_memory_usage):
def test_api_import(client, live_server, measure_memory_usage):
#live_server_setup(live_server)
api_key = live_server.app.config['DATASTORE'].data['settings']['application'].get('api_access_token')
api_key = extract_api_key_from_UI(client)
res = client.post(
url_for("import") + "?tag=import-test",
@@ -383,55 +382,11 @@ def test_api_import(client, live_server, measure_memory_usage):
assert res.status_code == 200
assert len(res.json) == 2
res = client.get(url_for("watchlist.index"))
res = client.get(url_for("index"))
assert b"https://website1.com" in res.data
assert b"https://website2.com" in res.data
# Should see the new tag in the tag/groups list
res = client.get(url_for('tags.tags_overview_page'))
assert b'import-test' in res.data
def test_api_conflict_UI_password(client, live_server, measure_memory_usage):
#live_server_setup(live_server)
api_key = live_server.app.config['DATASTORE'].data['settings']['application'].get('api_access_token')
# Enable password check and diff page access bypass
res = client.post(
url_for("settings.settings_page"),
data={"application-password": "foobar", # password is now set! API should still work!
"application-api_access_token_enabled": "y",
"requests-time_between_check-minutes": 180,
'application-fetch_backend': "html_requests"},
follow_redirects=True
)
assert b"Password protection enabled." in res.data
# Create a watch
set_original_response()
test_url = url_for('test_endpoint', _external=True)
# Create new
res = client.post(
url_for("createwatch"),
data=json.dumps({"url": test_url, "title": "My test URL" }),
headers={'content-type': 'application/json', 'x-api-key': api_key},
follow_redirects=True
)
assert res.status_code == 201
wait_for_all_checks(client)
url = url_for("createwatch")
# Get a listing, it will be the first one
res = client.get(
url,
headers={'x-api-key': api_key}
)
assert res.status_code == 200
assert len(res.json)

View File

@@ -1,101 +0,0 @@
from copy import copy
from flask import url_for
import json
import time
from .util import live_server_setup, wait_for_all_checks
def test_api_search(client, live_server):
live_server_setup(live_server)
api_key = live_server.app.config['DATASTORE'].data['settings']['application'].get('api_access_token')
watch_data = {}
# Add some test watches
urls = [
'https://example.com/page1',
'https://example.org/testing',
'https://test-site.com/example'
]
# Import the test URLs
res = client.post(
url_for("imports.import_page"),
data={"urls": "\r\n".join(urls)},
follow_redirects=True
)
assert b"3 Imported" in res.data
wait_for_all_checks(client)
# Get a listing, it will be the first one
watches_response = client.get(
url_for("createwatch"),
headers={'x-api-key': api_key}
)
# Add a title to one watch for title search testing
for uuid, watch in watches_response.json.items():
watch_data = client.get(url_for("watch", uuid=uuid),
follow_redirects=True,
headers={'x-api-key': api_key}
)
if urls[0] == watch_data.json['url']:
# HTTP PUT ( UPDATE an existing watch )
client.put(
url_for("watch", uuid=uuid),
headers={'x-api-key': api_key, 'content-type': 'application/json'},
data=json.dumps({'title': 'Example Title Test'}),
)
# Test search by URL
res = client.get(url_for("search")+"?q=https://example.com/page1", headers={'x-api-key': api_key, 'content-type': 'application/json'})
assert len(res.json) == 1
assert list(res.json.values())[0]['url'] == urls[0]
# Test search by URL - partial should NOT match without ?partial=true flag
res = client.get(url_for("search")+"?q=https://example", headers={'x-api-key': api_key, 'content-type': 'application/json'})
assert len(res.json) == 0
# Test search by title
res = client.get(url_for("search")+"?q=Example Title Test", headers={'x-api-key': api_key, 'content-type': 'application/json'})
assert len(res.json) == 1
assert list(res.json.values())[0]['url'] == urls[0]
assert list(res.json.values())[0]['title'] == 'Example Title Test'
# Test search that should return multiple results (partial = true)
res = client.get(url_for("search")+"?q=https://example&partial=true", headers={'x-api-key': api_key, 'content-type': 'application/json'})
assert len(res.json) == 2
# Test empty search
res = client.get(url_for("search")+"?q=", headers={'x-api-key': api_key, 'content-type': 'application/json'})
assert res.status_code == 400
# Add a tag to test search with tag filter
tag_name = 'test-tag'
res = client.post(
url_for("tag"),
data=json.dumps({"title": tag_name}),
headers={'content-type': 'application/json', 'x-api-key': api_key}
)
assert res.status_code == 201
tag_uuid = res.json['uuid']
# Add the tag to one watch
for uuid, watch in watches_response.json.items():
if urls[2] == watch['url']:
client.put(
url_for("watch", uuid=uuid),
headers={'x-api-key': api_key, 'content-type': 'application/json'},
data=json.dumps({'tags': [tag_uuid]}),
)
# Test search with tag filter and q
res = client.get(url_for("search") + f"?q={urls[2]}&tag={tag_name}", headers={'x-api-key': api_key, 'content-type': 'application/json'})
assert len(res.json) == 1
assert list(res.json.values())[0]['url'] == urls[2]

View File

@@ -1,143 +0,0 @@
#!/usr/bin/env python3
from flask import url_for
from .util import live_server_setup, wait_for_all_checks
import json
def test_api_tags_listing(client, live_server, measure_memory_usage):
live_server_setup(live_server)
api_key = live_server.app.config['DATASTORE'].data['settings']['application'].get('api_access_token')
tag_title = 'Test Tag'
# Get a listing
res = client.get(
url_for("tags"),
headers={'x-api-key': api_key}
)
assert res.text.strip() == "{}", "Should be empty list"
assert res.status_code == 200
res = client.post(
url_for("tag"),
data=json.dumps({"title": tag_title}),
headers={'content-type': 'application/json', 'x-api-key': api_key}
)
assert res.status_code == 201
new_tag_uuid = res.json.get('uuid')
# List tags - should include our new tag
res = client.get(
url_for("tags"),
headers={'x-api-key': api_key}
)
assert res.status_code == 200
assert new_tag_uuid in res.text
assert res.json[new_tag_uuid]['title'] == tag_title
assert res.json[new_tag_uuid]['notification_muted'] == False
# Get single tag
res = client.get(
url_for("tag", uuid=new_tag_uuid),
headers={'x-api-key': api_key}
)
assert res.status_code == 200
assert res.json['title'] == tag_title
# Update tag
res = client.put(
url_for("tag", uuid=new_tag_uuid),
data=json.dumps({"title": "Updated Tag"}),
headers={'content-type': 'application/json', 'x-api-key': api_key}
)
assert res.status_code == 200
assert b'OK' in res.data
# Verify update worked
res = client.get(
url_for("tag", uuid=new_tag_uuid),
headers={'x-api-key': api_key}
)
assert res.status_code == 200
assert res.json['title'] == 'Updated Tag'
# Mute tag notifications
res = client.get(
url_for("tag", uuid=new_tag_uuid) + "?muted=muted",
headers={'x-api-key': api_key}
)
assert res.status_code == 200
assert b'OK' in res.data
# Verify muted status
res = client.get(
url_for("tag", uuid=new_tag_uuid),
headers={'x-api-key': api_key}
)
assert res.status_code == 200
assert res.json['notification_muted'] == True
# Unmute tag
res = client.get(
url_for("tag", uuid=new_tag_uuid) + "?muted=unmuted",
headers={'x-api-key': api_key}
)
assert res.status_code == 200
assert b'OK' in res.data
# Verify unmuted status
res = client.get(
url_for("tag", uuid=new_tag_uuid),
headers={'x-api-key': api_key}
)
assert res.status_code == 200
assert res.json['notification_muted'] == False
# Create a watch with the tag and check it matches UUID
test_url = url_for('test_endpoint', _external=True)
res = client.post(
url_for("createwatch"),
data=json.dumps({"url": test_url, "tag": "Updated Tag", "title": "Watch with tag"}),
headers={'content-type': 'application/json', 'x-api-key': api_key},
follow_redirects=True
)
assert res.status_code == 201
watch_uuid = res.json.get('uuid')
# Verify tag is associated with watch by name if need be
res = client.get(
url_for("watch", uuid=watch_uuid),
headers={'x-api-key': api_key}
)
assert res.status_code == 200
assert new_tag_uuid in res.json.get('tags', [])
# Delete tag
res = client.delete(
url_for("tag", uuid=new_tag_uuid),
headers={'x-api-key': api_key}
)
assert res.status_code == 204
# Verify tag is gone
res = client.get(
url_for("tags"),
headers={'x-api-key': api_key}
)
assert res.status_code == 200
assert new_tag_uuid not in res.text
# Verify tag was removed from watch
res = client.get(
url_for("watch", uuid=watch_uuid),
headers={'x-api-key': api_key}
)
assert res.status_code == 200
assert new_tag_uuid not in res.json.get('tags', [])
# Delete the watch
res = client.delete(
url_for("watch", uuid=watch_uuid),
headers={'x-api-key': api_key},
)
assert res.status_code == 204

View File

@@ -2,7 +2,7 @@
import time
from flask import url_for
from .util import live_server_setup, extract_UUID_from_client, wait_for_all_checks
from .util import live_server_setup, extract_UUID_from_client, extract_api_key_from_UI, wait_for_all_checks
def set_response_with_ldjson():
@@ -95,22 +95,24 @@ def test_check_ldjson_price_autodetect(client, live_server, measure_memory_usage
wait_for_all_checks(client)
# Should get a notice that it's available
res = client.get(url_for("watchlist.index"))
res = client.get(url_for("index"))
assert b'ldjson-price-track-offer' in res.data
# Accept it
uuid = next(iter(live_server.app.config['DATASTORE'].data['watching']))
#time.sleep(1)
client.get(url_for('price_data_follower.accept', uuid=uuid, follow_redirects=True))
res = client.get(url_for('price_data_follower.accept', uuid=uuid, follow_redirects=True))
# should now be switched to restock_mode
wait_for_all_checks(client)
client.get(url_for("ui.form_watch_checknow"), follow_redirects=True)
wait_for_all_checks(client)
# Offer should be gone
res = client.get(url_for("watchlist.index"))
res = client.get(url_for("index"))
assert b'Embedded price data' not in res.data
assert b'tracking-ldjson-price-data' in res.data
# and last snapshot (via API) should be just the price
api_key = live_server.app.config['DATASTORE'].data['settings']['application'].get('api_access_token')
api_key = extract_api_key_from_UI(client)
res = client.get(
url_for("watchsinglehistory", uuid=uuid, timestamp='latest'),
headers={'x-api-key': api_key},
@@ -136,7 +138,7 @@ def test_check_ldjson_price_autodetect(client, live_server, measure_memory_usage
)
assert b"1 Imported" in res.data
wait_for_all_checks(client)
res = client.get(url_for("watchlist.index"))
res = client.get(url_for("index"))
assert b'ldjson-price-track-offer' not in res.data
##########################################################################################
@@ -154,6 +156,7 @@ def _test_runner_check_bad_format_ignored(live_server, client, has_ldjson_price_
assert b"1 Imported" in res.data
wait_for_all_checks(client)
assert len(client.application.config.get('DATASTORE').data['watching'])
for k,v in client.application.config.get('DATASTORE').data['watching'].items():
assert v.get('last_error') == False
assert v.get('has_ldjson_price_data') == has_ldjson_price_data, f"Detected LDJSON data? should be {has_ldjson_price_data}"
@@ -163,7 +166,7 @@ def _test_runner_check_bad_format_ignored(live_server, client, has_ldjson_price_
client.get(url_for("ui.form_delete", uuid="all"), follow_redirects=True)
def test_bad_ldjson_is_correctly_ignored(client, live_server, measure_memory_usage):
def test_bad_ldjson_is_correctly_ignored(client, live_server):
#live_server_setup(live_server)
test_return_data = """
<html>

View File

@@ -39,7 +39,7 @@ def test_check_basic_change_detection_functionality(client, live_server, measure
wait_for_all_checks(client)
# It should report nothing found (no new 'unviewed' class)
res = client.get(url_for("watchlist.index"))
res = client.get(url_for("index"))
assert b'unviewed' not in res.data
assert b'test-endpoint' in res.data
@@ -75,7 +75,7 @@ def test_check_basic_change_detection_functionality(client, live_server, measure
assert b'which has this one new line' in res.data
# Now something should be ready, indicated by having a 'unviewed' class
res = client.get(url_for("watchlist.index"))
res = client.get(url_for("index"))
assert b'unviewed' in res.data
# #75, and it should be in the RSS feed
@@ -112,7 +112,7 @@ def test_check_basic_change_detection_functionality(client, live_server, measure
wait_for_all_checks(client)
# It should report nothing found (no new 'unviewed' class)
res = client.get(url_for("watchlist.index"))
res = client.get(url_for("index"))
assert b'unviewed' not in res.data
assert b'Mark all viewed' not in res.data
assert b'head title' not in res.data # Should not be present because this is off by default
@@ -131,7 +131,7 @@ def test_check_basic_change_detection_functionality(client, live_server, measure
client.get(url_for("ui.form_watch_checknow"), follow_redirects=True)
wait_for_all_checks(client)
res = client.get(url_for("watchlist.index"))
res = client.get(url_for("index"))
assert b'unviewed' in res.data
assert b'Mark all viewed' in res.data
@@ -151,7 +151,7 @@ def test_check_basic_change_detection_functionality(client, live_server, measure
client.get(url_for("ui.clear_watch_history", uuid=uuid))
client.get(url_for("ui.form_watch_checknow"), follow_redirects=True)
wait_for_all_checks(client)
res = client.get(url_for("watchlist.index"))
res = client.get(url_for("index"))
assert b'preview/' in res.data
#

View File

@@ -107,7 +107,7 @@ def test_check_block_changedetection_text_NOT_present(client, live_server, measu
wait_for_all_checks(client)
# It should report nothing found (no new 'unviewed' class)
res = client.get(url_for("watchlist.index"))
res = client.get(url_for("index"))
assert b'unviewed' not in res.data
assert b'/test-endpoint' in res.data
@@ -120,7 +120,7 @@ def test_check_block_changedetection_text_NOT_present(client, live_server, measu
wait_for_all_checks(client)
# It should report nothing found (no new 'unviewed' class)
res = client.get(url_for("watchlist.index"))
res = client.get(url_for("index"))
assert b'unviewed' not in res.data
assert b'/test-endpoint' in res.data
@@ -129,7 +129,7 @@ def test_check_block_changedetection_text_NOT_present(client, live_server, measu
set_original_ignore_response()
client.get(url_for("ui.form_watch_checknow"), follow_redirects=True)
wait_for_all_checks(client)
res = client.get(url_for("watchlist.index"))
res = client.get(url_for("index"))
assert b'unviewed' not in res.data
@@ -137,7 +137,7 @@ def test_check_block_changedetection_text_NOT_present(client, live_server, measu
set_modified_response_minus_block_text()
client.get(url_for("ui.form_watch_checknow"), follow_redirects=True)
wait_for_all_checks(client)
res = client.get(url_for("watchlist.index"))
res = client.get(url_for("index"))
assert b'unviewed' in res.data

View File

@@ -2,39 +2,29 @@
import time
from flask import url_for
from .util import live_server_setup, wait_for_all_checks
from . util import live_server_setup
def test_clone_functionality(client, live_server, measure_memory_usage):
def test_trigger_functionality(client, live_server, measure_memory_usage):
live_server_setup(live_server)
with open("test-datastore/endpoint-content.txt", "w") as f:
f.write("<html><body>Some content</body></html>")
test_url = url_for('test_endpoint', _external=True)
# Give the endpoint time to spin up
time.sleep(1)
# Add our URL to the import page
res = client.post(
url_for("imports.import_page"),
data={"urls": test_url},
data={"urls": "https://changedetection.io"},
follow_redirects=True
)
assert b"1 Imported" in res.data
wait_for_all_checks(client)
# So that we can be sure the same history doesn't carry over
time.sleep(1)
res = client.get(
url_for("ui.form_clone", uuid="first"),
follow_redirects=True
)
existing_uuids = set()
for uuid, watch in live_server.app.config['DATASTORE'].data['watching'].items():
new_uuids = set(watch.history.keys())
duplicates = existing_uuids.intersection(new_uuids)
assert len(duplicates) == 0
existing_uuids.update(new_uuids)
assert b"Cloned" in res.data
assert b"Cloned." in res.data

View File

@@ -1,6 +1,5 @@
#!/usr/bin/env python3
import json
import urllib
from flask import url_for
from .util import live_server_setup, wait_for_all_checks
@@ -44,12 +43,14 @@ def set_number_out_of_range_response(number="150"):
with open("test-datastore/endpoint-content.txt", "w") as f:
f.write(test_return_data)
def test_setup(live_server):
live_server_setup(live_server)
def test_conditions_with_text_and_number(client, live_server):
"""Test that both text and number conditions work together with AND logic."""
set_original_response("50")
live_server_setup(live_server)
#live_server_setup(live_server)
test_url = url_for('test_endpoint', _external=True)
@@ -114,7 +115,7 @@ def test_conditions_with_text_and_number(client, live_server):
wait_for_all_checks(client)
# 75 is > 20 and < 100 and contains "5"
res = client.get(url_for("watchlist.index"))
res = client.get(url_for("index"))
assert b'unviewed' in res.data
@@ -128,7 +129,7 @@ def test_conditions_with_text_and_number(client, live_server):
wait_for_all_checks(client)
# Should NOT be marked as having changes since not all conditions are met
res = client.get(url_for("watchlist.index"))
res = client.get(url_for("index"))
assert b'unviewed' not in res.data
res = client.get(url_for("ui.form_delete", uuid="all"), follow_redirects=True)
@@ -138,6 +139,7 @@ def test_conditions_with_text_and_number(client, live_server):
def test_condition_validate_rule_row(client, live_server):
set_original_response("50")
#live_server_setup(live_server)
test_url = url_for('test_endpoint', _external=True)
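
The docstring earlier in this file describes conditions combined with AND logic: a change only counts when the text rule and the numeric-range rule both hold. A toy model of that evaluation, purely illustrative and not the project's rule engine:

def passes_all_conditions(text: str, number: float) -> bool:
    """Hypothetical AND-combination of one text condition and one numeric-range condition."""
    conditions = [
        lambda: "5" in text,         # text must contain "5"
        lambda: 20 < number < 100,   # number must be inside the range
    ]
    return all(check() for check in conditions)

# passes_all_conditions("75", 75) -> True; passes_all_conditions("150", 150) -> False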

View File

@@ -119,7 +119,7 @@ def test_check_markup_include_filters_restriction(client, live_server, measure_m
# It should have 'unviewed' still
# Because it should be looking at only that 'sametext' id
res = client.get(url_for("watchlist.index"))
res = client.get(url_for("index"))
assert b'unviewed' in res.data
@@ -218,7 +218,7 @@ def test_filter_is_empty_help_suggestion(client, live_server, measure_memory_usa
res = client.get(
url_for("watchlist.index"),
url_for("index"),
follow_redirects=True
)
@@ -240,7 +240,7 @@ def test_filter_is_empty_help_suggestion(client, live_server, measure_memory_usa
wait_for_all_checks(client)
res = client.get(
url_for("watchlist.index"),
url_for("index"),
follow_redirects=True
)

View File

@@ -204,7 +204,7 @@ def test_element_removal_full(client, live_server, measure_memory_usage):
wait_for_all_checks(client)
# There should not be an unviewed change, as changes should be removed
res = client.get(url_for("watchlist.index"))
res = client.get(url_for("index"))
assert b"unviewed" not in res.data
# Re #2752

View File

@@ -32,7 +32,7 @@ def _runner_test_http_errors(client, live_server, http_code, expected_text):
# Give the thread time to pick it up
wait_for_all_checks(client)
res = client.get(url_for("watchlist.index"))
res = client.get(url_for("index"))
# no change
assert b'unviewed' not in res.data
assert bytes(expected_text.encode('utf-8')) in res.data
@@ -78,7 +78,7 @@ def test_DNS_errors(client, live_server, measure_memory_usage):
# Give the thread time to pick it up
wait_for_all_checks(client)
res = client.get(url_for("watchlist.index"))
res = client.get(url_for("index"))
found_name_resolution_error = b"Temporary failure in name resolution" in res.data or b"Name or service not known" in res.data
assert found_name_resolution_error
# Should always record that we tried
@@ -107,7 +107,7 @@ def test_low_level_errors_clear_correctly(client, live_server, measure_memory_us
wait_for_all_checks(client)
# We should see the DNS error
res = client.get(url_for("watchlist.index"))
res = client.get(url_for("index"))
found_name_resolution_error = b"Temporary failure in name resolution" in res.data or b"Name or service not known" in res.data
assert found_name_resolution_error
@@ -122,7 +122,7 @@ def test_low_level_errors_clear_correctly(client, live_server, measure_memory_us
# Now the error should be gone
wait_for_all_checks(client)
res = client.get(url_for("watchlist.index"))
res = client.get(url_for("index"))
found_name_resolution_error = b"Temporary failure in name resolution" in res.data or b"Name or service not known" in res.data
assert not found_name_resolution_error

View File

@@ -103,7 +103,7 @@ def test_check_filter_multiline(client, live_server, measure_memory_usage):
assert b"Updated watch." in res.data
wait_for_all_checks(client)
res = client.get(url_for("watchlist.index"))
res = client.get(url_for("index"))
# Issue 1828
assert b'not at the start of the expression' not in res.data
@@ -160,7 +160,7 @@ def test_check_filter_and_regex_extract(client, live_server, measure_memory_usag
# Give the thread time to pick it up
wait_for_all_checks(client)
res = client.get(url_for("watchlist.index"))
res = client.get(url_for("index"))
#issue 1828
assert b'not at the start of the expression' not in res.data
@@ -174,7 +174,7 @@ def test_check_filter_and_regex_extract(client, live_server, measure_memory_usag
# It should have 'unviewed' still
# Because it should be looking at only that 'sametext' id
res = client.get(url_for("watchlist.index"))
res = client.get(url_for("index"))
assert b'unviewed' in res.data
# Check HTML conversion detected and worked

View File

@@ -113,7 +113,7 @@ def run_filter_test(client, live_server, content_filter):
checked += 1
client.get(url_for("ui.form_watch_checknow"), follow_redirects=True)
wait_for_all_checks(client)
res = client.get(url_for("watchlist.index"))
res = client.get(url_for("index"))
assert b'Warning, no filters were found' in res.data
assert not os.path.isfile("test-datastore/notification.txt")
time.sleep(1)

View File

@@ -77,7 +77,7 @@ def test_setup_group_tag(client, live_server, measure_memory_usage):
)
assert b"1 Imported" in res.data
res = client.get(url_for("watchlist.index"))
res = client.get(url_for("index"))
assert b'import-tag' in res.data
assert b'extra-import-tag' in res.data
@@ -90,7 +90,7 @@ def test_setup_group_tag(client, live_server, measure_memory_usage):
wait_for_all_checks(client)
res = client.get(url_for("watchlist.index"))
res = client.get(url_for("index"))
assert b'Warning, no filters were found' not in res.data
res = client.get(
@@ -255,7 +255,7 @@ def test_limit_tag_ui(client, live_server, measure_memory_usage):
assert b"40 Imported" in res.data
res = client.get(url_for("watchlist.index"))
res = client.get(url_for("index"))
assert b'test-tag' in res.data
# All should be here
@@ -263,7 +263,7 @@ def test_limit_tag_ui(client, live_server, measure_memory_usage):
tag_uuid = get_UUID_for_tag_name(client, name="test-tag")
res = client.get(url_for("watchlist.index", tag=tag_uuid))
res = client.get(url_for("index", tag=tag_uuid))
# Just a subset should be here
assert b'test-tag' in res.data
@@ -273,7 +273,6 @@ def test_limit_tag_ui(client, live_server, measure_memory_usage):
assert b'Deleted' in res.data
res = client.get(url_for("tags.delete_all"), follow_redirects=True)
assert b'All tags deleted' in res.data
def test_clone_tag_on_import(client, live_server, measure_memory_usage):
#live_server_setup(live_server)
test_url = url_for('test_endpoint', _external=True)
@@ -285,7 +284,7 @@ def test_clone_tag_on_import(client, live_server, measure_memory_usage):
assert b"1 Imported" in res.data
res = client.get(url_for("watchlist.index"))
res = client.get(url_for("index"))
assert b'test-tag' in res.data
assert b'another-tag' in res.data
@@ -293,7 +292,6 @@ def test_clone_tag_on_import(client, live_server, measure_memory_usage):
res = client.get(url_for("ui.form_clone", uuid=watch_uuid), follow_redirects=True)
assert b'Cloned' in res.data
res = client.get(url_for("watchlist.index"))
# 2 times plus the top link to tag
assert res.data.count(b'test-tag') == 3
assert res.data.count(b'another-tag') == 3
@@ -313,15 +311,14 @@ def test_clone_tag_on_quickwatchform_add(client, live_server, measure_memory_usa
assert b"Watch added" in res.data
res = client.get(url_for("watchlist.index"))
res = client.get(url_for("index"))
assert b'test-tag' in res.data
assert b'another-tag' in res.data
watch_uuid = next(iter(live_server.app.config['DATASTORE'].data['watching']))
res = client.get(url_for("ui.form_clone", uuid=watch_uuid), follow_redirects=True)
assert b'Cloned' in res.data
res = client.get(url_for("watchlist.index"))
assert b'Cloned' in res.data
# 2 times plus the top link to tag
assert res.data.count(b'test-tag') == 3
assert res.data.count(b'another-tag') == 3

View File

@@ -127,7 +127,7 @@ def test_check_ignore_text_functionality(client, live_server, measure_memory_usa
wait_for_all_checks(client)
# It should report nothing found (no new 'unviewed' class)
res = client.get(url_for("watchlist.index"))
res = client.get(url_for("index"))
assert b'unviewed' not in res.data
assert b'/test-endpoint' in res.data
@@ -140,7 +140,7 @@ def test_check_ignore_text_functionality(client, live_server, measure_memory_usa
wait_for_all_checks(client)
# It should report nothing found (no new 'unviewed' class)
res = client.get(url_for("watchlist.index"))
res = client.get(url_for("index"))
assert b'unviewed' not in res.data
assert b'/test-endpoint' in res.data
@@ -151,7 +151,7 @@ def test_check_ignore_text_functionality(client, live_server, measure_memory_usa
client.get(url_for("ui.form_watch_checknow"), follow_redirects=True)
wait_for_all_checks(client)
res = client.get(url_for("watchlist.index"))
res = client.get(url_for("index"))
assert b'unviewed' in res.data
res = client.get(url_for("ui.ui_views.preview_page", uuid="first"))
@@ -214,7 +214,7 @@ def test_check_global_ignore_text_functionality(client, live_server, measure_mem
client.get(url_for("ui.form_watch_checknow"), follow_redirects=True)
wait_for_all_checks(client)
# It should report nothing found (no new 'unviewed' class); adding random ignore text should not cause a change
res = client.get(url_for("watchlist.index"))
res = client.get(url_for("index"))
assert b'unviewed' not in res.data
assert b'/test-endpoint' in res.data
#####
@@ -229,7 +229,7 @@ def test_check_global_ignore_text_functionality(client, live_server, measure_mem
wait_for_all_checks(client)
# It should report nothing found (no new 'unviewed' class)
res = client.get(url_for("watchlist.index"))
res = client.get(url_for("index"))
assert b'unviewed' not in res.data
assert b'/test-endpoint' in res.data
@@ -238,7 +238,7 @@ def test_check_global_ignore_text_functionality(client, live_server, measure_mem
set_modified_original_ignore_response()
client.get(url_for("ui.form_watch_checknow"), follow_redirects=True)
wait_for_all_checks(client)
res = client.get(url_for("watchlist.index"))
res = client.get(url_for("index"))
assert b'unviewed' in res.data
res = client.get(url_for("ui.form_delete", uuid="all"), follow_redirects=True)

View File

@@ -114,7 +114,7 @@ def test_render_anchor_tag_content_true(client, live_server, measure_memory_usag
# since the link has changed, and we chose to render anchor tag content,
# we should detect a change (new 'unviewed' class)
res = client.get(url_for("watchlist.index"))
res = client.get(url_for("index"))
assert b"unviewed" in res.data
assert b"/test-endpoint" in res.data

View File

@@ -79,7 +79,7 @@ def test_normal_page_check_works_with_ignore_status_code(client, live_server, me
wait_for_all_checks(client)
# It should report nothing found (no new 'unviewed' class)
res = client.get(url_for("watchlist.index"))
res = client.get(url_for("index"))
assert b'unviewed' in res.data
assert b'/test-endpoint' in res.data
@@ -127,6 +127,6 @@ def test_403_page_check_works_with_ignore_status_code(client, live_server, measu
# It should have 'unviewed' still
# Because it should be looking at only that 'sametext' id
res = client.get(url_for("watchlist.index"))
res = client.get(url_for("index"))
assert b'unviewed' in res.data

View File

@@ -91,6 +91,6 @@ def test_check_ignore_whitespace(client, live_server, measure_memory_usage):
time.sleep(sleep_time_for_fetch_thread)
# It should report nothing found (no new 'unviewed' class)
res = client.get(url_for("watchlist.index"))
res = client.get(url_for("index"))
assert b'unviewed' not in res.data
assert b'/test-endpoint' in res.data

View File

@@ -31,8 +31,8 @@ https://example.com tag1, other tag"""
res = client.get(url_for("ui.form_delete", uuid="all"), follow_redirects=True)
# Clear flask alerts
res = client.get( url_for("watchlist.index"))
res = client.get( url_for("watchlist.index"))
res = client.get( url_for("index"))
res = client.get( url_for("index"))
def xtest_import_skip_url(client, live_server, measure_memory_usage):
@@ -55,7 +55,7 @@ def xtest_import_skip_url(client, live_server, measure_memory_usage):
assert b"1 Skipped" in res.data
res = client.get(url_for("ui.form_delete", uuid="all"), follow_redirects=True)
# Clear flask alerts
res = client.get( url_for("watchlist.index"))
res = client.get( url_for("index"))
def test_import_distillio(client, live_server, measure_memory_usage):
@@ -113,7 +113,7 @@ def test_import_distillio(client, live_server, measure_memory_usage):
assert b"xpath:(//div[@id=&#39;App&#39;]/div[contains(@class,&#39;flex&#39;)]/main[contains(@class,&#39;relative&#39;)]/section[contains(@class,&#39;relative&#39;)]/div[@class=&#39;container&#39;]/div[contains(@class,&#39;flex&#39;)]/div[contains(@class,&#39;w-full&#39;)])[1]" in res.data
# did the tags work?
res = client.get( url_for("watchlist.index"))
res = client.get( url_for("index"))
# check tags
assert b"nice stuff" in res.data
@@ -121,7 +121,7 @@ def test_import_distillio(client, live_server, measure_memory_usage):
res = client.get(url_for("ui.form_delete", uuid="all"), follow_redirects=True)
# Clear flask alerts
res = client.get(url_for("watchlist.index"))
res = client.get(url_for("index"))
def test_import_custom_xlsx(client, live_server, measure_memory_usage):
"""Test can upload a excel spreadsheet and the watches are created correctly"""
@@ -156,7 +156,7 @@ def test_import_custom_xlsx(client, live_server, measure_memory_usage):
assert b'Error processing row number 1' in res.data
res = client.get(
url_for("watchlist.index")
url_for("index")
)
assert b'Somesite results ABC' in res.data
@@ -194,7 +194,7 @@ def test_import_watchete_xlsx(client, live_server, measure_memory_usage):
assert b'4 imported from Wachete .xlsx' in res.data
res = client.get(
url_for("watchlist.index")
url_for("index")
)
assert b'Somesite results ABC' in res.data

View File

@@ -52,7 +52,7 @@ def test_jinja2_security_url_query(client, live_server, measure_memory_usage):
wait_for_all_checks(client)
# It should report nothing found (no new 'unviewed' class)
res = client.get(url_for("watchlist.index"))
res = client.get(url_for("index"))
assert b'is invalid and cannot be used' in res.data
# Some of the spewed output from the subclasses
assert b'dict_values' not in res.data

View File

@@ -281,7 +281,7 @@ def check_json_filter(json_filter, client, live_server):
wait_for_all_checks(client)
# It should have 'unviewed' still
res = client.get(url_for("watchlist.index"))
res = client.get(url_for("index"))
assert b'unviewed' in res.data
# Should not see this, because its not in the JSONPath we entered
@@ -417,7 +417,7 @@ def check_json_ext_filter(json_filter, client, live_server):
wait_for_all_checks(client)
# It should have 'unviewed'
res = client.get(url_for("watchlist.index"))
res = client.get(url_for("index"))
assert b'unviewed' in res.data
res = client.get(url_for("ui.ui_views.diff_history_page", uuid="first"))
@@ -455,7 +455,7 @@ def test_ignore_json_order(client, live_server, measure_memory_usage):
client.get(url_for("ui.form_watch_checknow"), follow_redirects=True)
wait_for_all_checks(client)
res = client.get(url_for("watchlist.index"))
res = client.get(url_for("index"))
assert b'unviewed' not in res.data
# Just to be sure it still works
@@ -466,7 +466,7 @@ def test_ignore_json_order(client, live_server, measure_memory_usage):
client.get(url_for("ui.form_watch_checknow"), follow_redirects=True)
wait_for_all_checks(client)
res = client.get(url_for("watchlist.index"))
res = client.get(url_for("index"))
assert b'unviewed' in res.data
res = client.get(url_for("ui.form_delete", uuid="all"), follow_redirects=True)
@@ -488,7 +488,7 @@ def test_correct_header_detect(client, live_server, measure_memory_usage):
)
assert b"1 Imported" in res.data
wait_for_all_checks(client)
res = client.get(url_for("watchlist.index"))
res = client.get(url_for("index"))
# Fixed in #1593
assert b'No parsable JSON found in this document' not in res.data

View File

@@ -41,7 +41,7 @@ def test_check_basic_change_detection_functionality(client, live_server, measure
wait_for_all_checks(client)
# It should report nothing found (no new 'unviewed' class)
res = client.get(url_for("watchlist.index"))
res = client.get(url_for("index"))
assert b'unviewed' not in res.data
@@ -63,7 +63,7 @@ def test_check_basic_change_detection_functionality(client, live_server, measure
wait_for_all_checks(client)
# It should report nothing found (no new 'unviewed' class)
res = client.get(url_for("watchlist.index"))
res = client.get(url_for("index"))
assert b'unviewed' not in res.data
uuid = next(iter(live_server.app.config['DATASTORE'].data['watching']))
@@ -93,7 +93,7 @@ def test_check_basic_change_detection_functionality(client, live_server, measure
wait_for_all_checks(client)
# It should report nothing found (no new 'unviewed' class)
res = client.get(url_for("watchlist.index"))
res = client.get(url_for("index"))
assert b'unviewed' in res.data
client.get(url_for("ui.mark_all_viewed"), follow_redirects=True)
@@ -105,7 +105,7 @@ def test_check_basic_change_detection_functionality(client, live_server, measure
assert watch.last_changed == watch['last_checked']
wait_for_all_checks(client)
res = client.get(url_for("watchlist.index"))
res = client.get(url_for("index"))
assert b'unviewed' in res.data # A change should have registered because empty_pages_are_a_change is ON
assert b'fetch-error' not in res.data

View File

@@ -139,7 +139,7 @@ def test_check_notification(client, live_server, measure_memory_usage):
time.sleep(3)
# Check no errors were recorded
res = client.get(url_for("watchlist.index"))
res = client.get(url_for("index"))
assert b'notification-error' not in res.data

View File

@@ -46,7 +46,7 @@ def test_check_notification_error_handling(client, live_server, measure_memory_u
logging.debug("Fetching watch overview....")
res = client.get(
url_for("watchlist.index"))
url_for("index"))
if bytes("Notification error detected".encode('utf-8')) in res.data:
found=True

View File

@@ -50,7 +50,7 @@ def test_fetch_pdf(client, live_server, measure_memory_usage):
wait_for_all_checks(client)
# Now something should be ready, indicated by having a 'unviewed' class
res = client.get(url_for("watchlist.index"))
res = client.get(url_for("index"))
assert b'unviewed' in res.data
# The original checksum should not be here anymore (cdio adds it to the bottom of the text)

View File

@@ -0,0 +1,15 @@
from flask import url_for
from changedetectionio.tests.util import live_server_setup
def test_checkplugins_registered(live_server, client):
live_server_setup(live_server)
res = client.get(
url_for("settings.settings_page")
)
assert res.status_code == 200
# Should be registered in the info table
assert b'<td>Webpage Text/HTML, JSON and PDF changes' in res.data
assert b'<td>text_json_diff' in res.data

View File

@@ -48,7 +48,7 @@ def test_fetch_pdf(client, live_server, measure_memory_usage):
wait_for_all_checks(client)
# Now something should be ready, indicated by having a 'unviewed' class
res = client.get(url_for("watchlist.index"))
res = client.get(url_for("index"))
assert b'unviewed' in res.data
# The original checksum should not be here anymore (cdio adds it to the bottom of the text)

View File

@@ -64,7 +64,7 @@ def test_restock_itemprop_basic(client, live_server):
follow_redirects=True
)
wait_for_all_checks(client)
res = client.get(url_for("watchlist.index"))
res = client.get(url_for("index"))
assert b'more than one price detected' not in res.data
assert b'has-restock-info' in res.data
assert b' in-stock' in res.data
@@ -81,7 +81,7 @@ def test_restock_itemprop_basic(client, live_server):
follow_redirects=True
)
wait_for_all_checks(client)
res = client.get(url_for("watchlist.index"))
res = client.get(url_for("index"))
assert b'has-restock-info not-in-stock' in res.data
@@ -95,22 +95,25 @@ def test_itemprop_price_change(client, live_server):
test_url = url_for('test_endpoint', _external=True)
set_original_response(props_markup=instock_props[0], price="190.95")
client.post(
res = client.post(
url_for("ui.ui_views.form_quick_watch_add"),
data={"url": test_url, "tags": 'restock tests', 'processor': 'restock_diff'},
follow_redirects=True
)
assert res.status_code == 200
# A change in price, should trigger a change by default
wait_for_all_checks(client)
res = client.get(url_for("watchlist.index"))
res = client.get(url_for("index"))
assert b'190.95' in res.data
# basic price change, look for notification
set_original_response(props_markup=instock_props[0], price='180.45')
client.get(url_for("ui.form_watch_checknow"), follow_redirects=True)
wait_for_all_checks(client)
res = client.get(url_for("watchlist.index"))
res = client.get(url_for("index"))
assert b'180.45' in res.data
assert b'unviewed' in res.data
client.get(url_for("ui.mark_all_viewed"), follow_redirects=True)
@@ -125,7 +128,7 @@ def test_itemprop_price_change(client, live_server):
assert b"Updated watch." in res.data
client.get(url_for("ui.form_watch_checknow"), follow_redirects=True)
wait_for_all_checks(client)
res = client.get(url_for("watchlist.index"))
res = client.get(url_for("index"))
assert b'120.45' in res.data
assert b'unviewed' not in res.data
@@ -170,7 +173,7 @@ def _run_test_minmax_limit(client, extra_watch_edit_form):
set_original_response(props_markup=instock_props[0], price='1000.45')
client.get(url_for("ui.form_watch_checknow"))
wait_for_all_checks(client)
res = client.get(url_for("watchlist.index"))
res = client.get(url_for("index"))
assert b'more than one price detected' not in res.data
# BUT the new price should show, even though it's within limits
@@ -183,7 +186,7 @@ def _run_test_minmax_limit(client, extra_watch_edit_form):
res = client.get(url_for("ui.form_watch_checknow"), follow_redirects=True)
assert b'Queued 1 watch for rechecking.' in res.data
wait_for_all_checks(client)
res = client.get(url_for("watchlist.index"))
res = client.get(url_for("index"))
assert b'890.45' in res.data
assert b'unviewed' in res.data
@@ -195,7 +198,7 @@ def _run_test_minmax_limit(client, extra_watch_edit_form):
res = client.get(url_for("ui.form_watch_checknow"), follow_redirects=True)
assert b'Queued 1 watch for rechecking.' in res.data
wait_for_all_checks(client)
res = client.get(url_for("watchlist.index"))
res = client.get(url_for("index"))
assert b'820.45' in res.data
assert b'unviewed' in res.data
client.get(url_for("ui.mark_all_viewed"))
@@ -204,7 +207,7 @@ def _run_test_minmax_limit(client, extra_watch_edit_form):
set_original_response(props_markup=instock_props[0], price='1890.45')
client.get(url_for("ui.form_watch_checknow"), follow_redirects=True)
wait_for_all_checks(client)
res = client.get(url_for("watchlist.index"))
res = client.get(url_for("index"))
# Depending on the LOCALE it may be either of these (generally for US/default/etc)
assert b'1,890.45' in res.data or b'1890.45' in res.data
assert b'unviewed' in res.data
@@ -288,7 +291,7 @@ def test_itemprop_percent_threshold(client, live_server):
set_original_response(props_markup=instock_props[0], price='960.45')
client.get(url_for("ui.form_watch_checknow"))
wait_for_all_checks(client)
res = client.get(url_for("watchlist.index"))
res = client.get(url_for("index"))
assert b'960.45' in res.data
assert b'unviewed' not in res.data
@@ -296,7 +299,7 @@ def test_itemprop_percent_threshold(client, live_server):
set_original_response(props_markup=instock_props[0], price='1960.45')
client.get(url_for("ui.form_watch_checknow"))
wait_for_all_checks(client)
res = client.get(url_for("watchlist.index"))
res = client.get(url_for("index"))
assert b'1,960.45' in res.data or b'1960.45' in res.data  # depending on locale
assert b'unviewed' in res.data
@@ -306,7 +309,7 @@ def test_itemprop_percent_threshold(client, live_server):
set_original_response(props_markup=instock_props[0], price='1950.45')
client.get(url_for("ui.form_watch_checknow"))
wait_for_all_checks(client)
res = client.get(url_for("watchlist.index"))
res = client.get(url_for("index"))
assert b'1,950.45' in res.data or b'1950.45' in res.data  # depending on locale
assert b'unviewed' not in res.data
@@ -395,7 +398,7 @@ def test_data_sanity(client, live_server):
test_url = url_for('test_endpoint', _external=True)
test_url2 = url_for('test_endpoint2', _external=True)
set_original_response(props_markup=instock_props[0], price="950.95")
client.post(
res = client.post(
url_for("ui.ui_views.form_quick_watch_add"),
data={"url": test_url, "tags": 'restock tests', 'processor': 'restock_diff'},
follow_redirects=True
@@ -403,7 +406,7 @@ def test_data_sanity(client, live_server):
wait_for_all_checks(client)
res = client.get(url_for("watchlist.index"))
res = client.get(url_for("index"))
assert b'950.95' in res.data
# Check the restock model object doesn't store the value by mistake and reuse it in a new one
@@ -413,7 +416,7 @@ def test_data_sanity(client, live_server):
follow_redirects=True
)
wait_for_all_checks(client)
res = client.get(url_for("watchlist.index"))
res = client.get(url_for("index"))
assert str(res.data.decode()).count("950.95") == 1, "Price should only show once (for the watch added, no other watches yet)"
## different test, check the edit page works on an empty request result
@@ -455,6 +458,6 @@ def test_special_prop_examples(client, live_server):
follow_redirects=True
)
wait_for_all_checks(client)
res = client.get(url_for("watchlist.index"))
res = client.get(url_for("index"))
assert b'ception' not in res.data
assert b'155.55' in res.data

View File

@@ -0,0 +1,88 @@
#!/usr/bin/env python3
import os
import time
from flask import url_for
from .util import live_server_setup, wait_for_all_checks, extract_UUID_from_client
def test_restock_settings_persistence(client, live_server):
"""Test that restock processor and settings are correctly saved and loaded after app restart"""
live_server_setup(live_server)
# Create a test page with pricing information
test_return_data = """<html>
<body>
Some initial text<br>
<p>Which is across multiple lines</p>
<br>
So let's see what happens. <br>
<div>price: $10.99</div>
<div id="sametext">Out of stock</div>
</body>
</html>
"""
with open("test-datastore/endpoint-content.txt", "w") as f:
f.write(test_return_data)
# Add our URL to the import page (pointing to our test endpoint)
test_url = url_for('test_endpoint', _external=True)
# Add a new watch with the restock_diff processor
res = client.post(
url_for("ui.ui_views.form_quick_watch_add"),
data={"url": test_url, "tags": '', 'processor': 'restock_diff'},
follow_redirects=True
)
# Wait for initial check to complete
wait_for_all_checks(client)
# Get the UUID of the watch
uuid = extract_UUID_from_client(client)
# Set custom restock settings
res = client.post(
url_for("ui.ui_edit.edit_page", uuid=uuid),
data={
"url": test_url,
"tags": "",
"headers": "",
"restock_settings-price_change_min": 10,
"restock_settings-price_change_threshold_percent": 5,
'fetch_backend': "html_requests",
"processor" : 'restock_diff'
},
follow_redirects=True
)
assert b"Updated watch." in res.data
# Verify the settings were saved in the current datastore
app_config = client.application.config.get('DATASTORE').data
watch = app_config['watching'][uuid]
assert watch.get('processor') == 'restock_diff'
assert watch['restock_settings'].get('price_change_min') == 10
assert watch['restock_settings'].get('price_change_threshold_percent') == 5
# Restart the application by calling teardown and recreating the datastore
# This simulates shutting down and restarting the app
datastore = client.application.config.get('DATASTORE')
datastore.stop_thread = True
# Create a new datastore instance that will read from the saved JSON
from changedetectionio import store
new_datastore = store.ChangeDetectionStore(datastore_path="./test-datastore", include_default_watches=False)
client.application.config['DATASTORE'] = new_datastore
# Verify the watch settings were correctly loaded after restart
app_config = client.application.config.get('DATASTORE').data
watch = app_config['watching'][uuid]
# Check that processor mode is correctly preserved
assert watch.get('processor') == 'restock_diff', "Watch processor mode should be preserved as 'restock_diff'"
# Check that the restock settings were correctly preserved
assert watch['restock_settings'].get('price_change_min') == 10, "price_change_min setting should be preserved"
assert watch['restock_settings'].get('price_change_threshold_percent') == 5, "price_change_threshold_percent setting should be preserved"
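
The new test above only checks that restock_settings survive a datastore reload; it does not define how price_change_min and price_change_threshold_percent are applied. As a purely hypothetical illustration of how two such thresholds could be combined when deciding whether a price move is significant:

def price_change_is_significant(old_price: float, new_price: float,
                                price_change_min: float = 10.0,
                                price_change_threshold_percent: float = 5.0) -> bool:
    """Hypothetical: require both an absolute and a percentage move before flagging a change."""
    delta = abs(new_price - old_price)
    percent = (delta / old_price) * 100 if old_price else float("inf")
    return delta >= price_change_min and percent >= price_change_threshold_percent

# price_change_is_significant(190.95, 180.45) -> True (a 10.50 drop, roughly 5.5%)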

View File

@@ -49,22 +49,6 @@ def set_original_cdata_xml():
f.write(test_return_data)
def set_html_content(content):
test_return_data = f"""<html>
<body>
Some initial text<br>
<p>{content}</p>
<br>
So let's see what happens. <br>
</body>
</html>
"""
# Write as UTF-8 encoded bytes
with open("test-datastore/endpoint-content.txt", "wb") as f:
f.write(test_return_data.encode('utf-8'))
def test_setup(client, live_server, measure_memory_usage):
live_server_setup(live_server)
@@ -180,58 +164,3 @@ def test_rss_xpath_filtering(client, live_server, measure_memory_usage):
assert b'Some other description' not in res.data # Should NOT be selected by the xpath
res = client.get(url_for("ui.form_delete", uuid="all"), follow_redirects=True)
def test_rss_bad_chars_breaking(client, live_server):
"""This should absolutely trigger the RSS builder to go into worst state mode
- source: prefix means no html conversion (which kinda filters out the bad stuff)
- Binary data
- Very long so that the saving is performed by Brotli (and decoded back to bytes)
Otherwise feedgen should support regular unicode
"""
#live_server_setup(live_server)
with open("test-datastore/endpoint-content.txt", "w") as f:
ten_kb_string = "A" * 10_000
f.write(ten_kb_string)
test_url = url_for('test_endpoint', _external=True)
res = client.post(
url_for("imports.import_page"),
data={"urls": "source:"+test_url},
follow_redirects=True
)
assert b"1 Imported" in res.data
wait_for_all_checks(client)
# Set the bad content
with open("test-datastore/endpoint-content.txt", "w") as f:
jpeg_bytes = "\xff\xd8\xff\xe0\x00\x10XXXXXXXX\x00\x01\x02\x00\x00\x01\x00\x01\x00\x00" # JPEG header
jpeg_bytes += "A" * 10_000
f.write(jpeg_bytes)
res = client.get(url_for("ui.form_watch_checknow"), follow_redirects=True)
assert b'Queued 1 watch for rechecking.' in res.data
wait_for_all_checks(client)
rss_token = extract_rss_token_from_UI(client)
uuid = next(iter(live_server.app.config['DATASTORE'].data['watching']))
assert live_server.app.config['DATASTORE'].data['watching'][uuid].history_n == 2
# Check RSS feed is still working
res = client.get(
url_for("rss.feed", uuid=uuid, token=rss_token),
follow_redirects=False # Important! leave this off! it should not redirect
)
assert res.status_code == 200
#assert live_server.app.config['DATASTORE'].data['watching'][uuid].history_n == 2
#assert live_server.app.config['DATASTORE'].data['watching'][uuid].history_n == 2
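
The docstring shown in this hunk spells out the failure surface being exercised: a large snapshot stored via Brotli compression plus binary, non-UTF-8 bytes that the RSS builder must still serve. A small round-trip sketch of the compression side, assuming the brotli package and not the project's own storage code:

import brotli  # assumed dependency for this sketch

snapshot = ("\xff\xd8\xff\xe0" + "A" * 10_000).encode("latin-1")  # binary-looking header plus a long filler
compressed = brotli.compress(snapshot)
assert brotli.decompress(compressed) == snapshot  # bytes survive the round trip exactly
print(f"{len(snapshot)} bytes -> {len(compressed)} bytes")  # highly repetitive payloads shrink dramatically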

View File

@@ -20,7 +20,7 @@ def test_basic_search(client, live_server, measure_memory_usage):
assert b"2 Imported" in res.data
# By URL
res = client.get(url_for("watchlist.index") + "?q=first-res")
res = client.get(url_for("index") + "?q=first-res")
assert urls[0].encode('utf-8') in res.data
assert urls[1].encode('utf-8') not in res.data
@@ -33,7 +33,7 @@ def test_basic_search(client, live_server, measure_memory_usage):
)
assert b"Updated watch." in res.data
res = client.get(url_for("watchlist.index") + "?q=xxx-title")
res = client.get(url_for("index") + "?q=xxx-title")
assert urls[0].encode('utf-8') in res.data
assert urls[1].encode('utf-8') not in res.data
@@ -54,7 +54,7 @@ def test_search_in_tag_limit(client, live_server, measure_memory_usage):
# By URL
res = client.get(url_for("watchlist.index") + "?q=first-res")
res = client.get(url_for("index") + "?q=first-res")
# Split because of the import tag separation
assert urls[0].split(' ')[0].encode('utf-8') in res.data, urls[0].encode('utf-8')
assert urls[1].split(' ')[0].encode('utf-8') not in res.data, urls[0].encode('utf-8')
@@ -68,7 +68,7 @@ def test_search_in_tag_limit(client, live_server, measure_memory_usage):
)
assert b"Updated watch." in res.data
res = client.get(url_for("watchlist.index") + "?q=xxx-title")
res = client.get(url_for("index") + "?q=xxx-title")
assert urls[0].split(' ')[0].encode('utf-8') in res.data, urls[0].encode('utf-8')
assert urls[1].split(' ')[0].encode('utf-8') not in res.data, urls[0].encode('utf-8')

View File

@@ -67,7 +67,7 @@ def _runner_test_various_file_slash(client, file_uri):
follow_redirects=True
)
wait_for_all_checks(client)
res = client.get(url_for("watchlist.index"))
res = client.get(url_for("index"))
substrings = [b"URLs with hostname components are not permitted", b"No connection adapters were found for"]

View File

@@ -76,5 +76,5 @@ def test_share_watch(client, live_server, measure_memory_usage):
assert bytes(include_filters.encode('utf-8')) in res.data
# Check it saved the URL
res = client.get(url_for("watchlist.index"))
res = client.get(url_for("index"))
assert bytes(test_url.encode('utf-8')) in res.data

View File

@@ -45,7 +45,7 @@ def test_check_basic_change_detection_functionality_source(client, live_server,
wait_for_all_checks(client)
# Now something should be ready, indicated by having a 'unviewed' class
res = client.get(url_for("watchlist.index"))
res = client.get(url_for("index"))
assert b'unviewed' in res.data
res = client.get(

View File

@@ -104,7 +104,7 @@ def test_trigger_functionality(client, live_server, measure_memory_usage):
wait_for_all_checks(client)
# It should report nothing found (no new 'unviewed' class)
res = client.get(url_for("watchlist.index"))
res = client.get(url_for("index"))
assert b'unviewed' not in res.data
assert b'/test-endpoint' in res.data
@@ -116,7 +116,7 @@ def test_trigger_functionality(client, live_server, measure_memory_usage):
wait_for_all_checks(client)
# It should report nothing found (no new 'unviewed' class)
res = client.get(url_for("watchlist.index"))
res = client.get(url_for("index"))
assert b'unviewed' not in res.data
# Now set the content which contains the trigger text
@@ -124,7 +124,7 @@ def test_trigger_functionality(client, live_server, measure_memory_usage):
client.get(url_for("ui.form_watch_checknow"), follow_redirects=True)
wait_for_all_checks(client)
res = client.get(url_for("watchlist.index"))
res = client.get(url_for("index"))
assert b'unviewed' in res.data
# https://github.com/dgtlmoon/changedetection.io/issues/616

View File

@@ -41,7 +41,7 @@ def test_trigger_regex_functionality(client, live_server, measure_memory_usage):
wait_for_all_checks(client)
# It should report nothing found (just a new one shouldn't have anything)
res = client.get(url_for("watchlist.index"))
res = client.get(url_for("index"))
assert b'unviewed' not in res.data
### test regex
@@ -63,7 +63,7 @@ def test_trigger_regex_functionality(client, live_server, measure_memory_usage):
wait_for_all_checks(client)
# It should report nothing found (nothing should match the regex)
res = client.get(url_for("watchlist.index"))
res = client.get(url_for("index"))
assert b'unviewed' not in res.data
with open("test-datastore/endpoint-content.txt", "w") as f:
@@ -71,7 +71,7 @@ def test_trigger_regex_functionality(client, live_server, measure_memory_usage):
client.get(url_for("ui.form_watch_checknow"), follow_redirects=True)
wait_for_all_checks(client)
res = client.get(url_for("watchlist.index"))
res = client.get(url_for("index"))
assert b'unviewed' in res.data
# Cleanup everything

Some files were not shown because too many files have changed in this diff.