Compare commits


23 Commits

Author SHA1 Message Date
dgtlmoon
cab11ced5f Re #2945 - Handle ByteOrderMark in JSON strings 2025-02-07 21:28:40 +01:00
RoboMagus
dbd4adf23a Add major and minor tags for Docker release workflow (#2938) 2025-02-01 10:52:04 +01:00
dgtlmoon
b1e700b3ff Adding jinja2/browsersteps test (#2915) 2025-01-28 18:14:49 +01:00
Iftekhar Alam Fuad
1c61b5a623 Header handling - Fix header parsing to split on the first colon only (headers where the value contained :// type may have been broken) (#2929) 2025-01-26 00:08:09 +01:00
dgtlmoon
e799a1cdcb 0.49.00 2025-01-21 13:40:01 +01:00
dgtlmoon
938065db6f Update README.md 2025-01-20 16:10:54 +01:00
dgtlmoon
4f2d38ff49 Build/Libraries - Pin referencing library which breaks due to out-dated flask_expects_json, remove pip upgrade in test(#2912) 2025-01-18 23:20:58 +01:00
dgtlmoon
8960f401b7 Notifications - Custom POST:// GET:// etc endpoints - returning 204 and other 20x responses are OK (don't show an error was detected)(#2897) 2025-01-13 13:13:18 +01:00
dgtlmoon
1c1f1c6f6b 0.48.06 2025-01-09 23:02:29 +01:00
dgtlmoon
a2a98811a5 Restock - Add test for new lower/higher price notification Re #2715 (#2892) 2025-01-09 22:59:55 +01:00
dgtlmoon
5a0ef8fc01 Update integration test for "linuxserver" test build (#2891) 2025-01-09 21:36:39 +01:00
dgtlmoon
d90de0851d Notifications - Update Apprise to 1.9.2 - Fixes custom posts:// gets:// etc URL's being double-encoded, fixes chantify:// notifications (#2868) (#2875) (#2870) 2025-01-09 21:16:32 +01:00
dgtlmoon
360b4f0d8b Custom posts:// get:// notifications etc - Be sure our custom extensions are imported (#2890) 2025-01-09 21:10:09 +01:00
dgtlmoon
6fc04d7f1c "Send test notification" button - Easier to understand test send results, Improved error handling, code refactor (#2888) 2025-01-08 14:35:41 +01:00
dgtlmoon
66fb05527b Improve last_checked vs last_changed time information precision (#2883) 2025-01-06 20:38:50 +01:00
William Brawner
202e47d728 Update Apprise to 1.9.1 (#2876) 2025-01-02 20:06:25 +01:00
Florian Kretschmer
d67d396b88 Builder/Docker - Remove PUID and PGID ( they were not used ) (#2852) 2024-12-27 13:03:36 +01:00
MoshiMoshi0
05f54f0ce6 UI - Fix diff not starting from last viewed snapshot (#2744) (#2856) 2024-12-27 13:03:10 +01:00
dgtlmoon
6adf10597e 0.48.05 2024-12-27 11:24:56 +01:00
dgtlmoon
4419bc0e61 Fixing test for CVE-2024-56509 (#2864) 2024-12-27 11:09:52 +01:00
dgtlmoon
f7e9846c9b CVE-2024-56509 - Stricter file protocol checking pre-check ( Improper Input Validation Leading to LFR/Path Traversal when fetching file:.. ) 2024-12-27 09:26:28 +01:00
dgtlmoon
5dea5e1def 0.48.04 2024-12-16 21:50:53 +01:00
dgtlmoon
0fade0a473 Windows was sometimes missing timezone data (#2845 #2826) 2024-12-16 21:50:28 +01:00
47 changed files with 527 additions and 265 deletions

View File

@@ -2,32 +2,33 @@
 # Test that we can still build on Alpine (musl modified libc https://musl.libc.org/)
 # Some packages wont install via pypi because they dont have a wheel available under this architecture.
-FROM ghcr.io/linuxserver/baseimage-alpine:3.18
+FROM ghcr.io/linuxserver/baseimage-alpine:3.21
 ENV PYTHONUNBUFFERED=1
 COPY requirements.txt /requirements.txt
 RUN \
   apk add --update --no-cache --virtual=build-dependencies \
+    build-base \
     cargo \
-    g++ \
-    gcc \
+    git \
     jpeg-dev \
     libc-dev \
     libffi-dev \
+    libjpeg \
     libxslt-dev \
-    make \
     openssl-dev \
-    py3-wheel \
     python3-dev \
-    zip \
     zlib-dev && \
   apk add --update --no-cache \
-    libjpeg \
     libxslt \
-    python3 \
-    py3-pip && \
+    nodejs \
+    poppler-utils \
+    python3 && \
   echo "**** pip3 install test of changedetection.io ****" && \
-  pip3 install -U pip wheel setuptools && \
-  pip3 install -U --no-cache-dir --find-links https://wheel-index.linuxserver.io/alpine-3.18/ -r /requirements.txt && \
+  python3 -m venv /lsiopy && \
+  pip install -U pip wheel setuptools && \
+  pip install -U --no-cache-dir --find-links https://wheel-index.linuxserver.io/alpine-3.21/ -r /requirements.txt && \
   apk del --purge \
     build-dependencies

View File

@@ -103,6 +103,19 @@ jobs:
 #          provenance: false
 
       # A new tagged release is required, which builds :tag and :latest
+      - name: Docker meta :tag
+        if: github.event_name == 'release' && startsWith(github.event.release.tag_name, '0.')
+        uses: docker/metadata-action@v5
+        id: meta
+        with:
+          images: |
+            ${{ secrets.DOCKER_HUB_USERNAME }}/changedetection.io
+            ghcr.io/dgtlmoon/changedetection.io
+          tags: |
+            type=semver,pattern={{version}}
+            type=semver,pattern={{major}}.{{minor}}
+            type=semver,pattern={{major}}
       - name: Build and push :tag
         id: docker_build_tag_release
         if: github.event_name == 'release' && startsWith(github.event.release.tag_name, '0.')
@@ -111,11 +124,7 @@ jobs:
           context: ./
           file: ./Dockerfile
           push: true
-          tags: |
-            ${{ secrets.DOCKER_HUB_USERNAME }}/changedetection.io:${{ github.event.release.tag_name }}
-            ghcr.io/dgtlmoon/changedetection.io:${{ github.event.release.tag_name }}
-            ${{ secrets.DOCKER_HUB_USERNAME }}/changedetection.io:latest
-            ghcr.io/dgtlmoon/changedetection.io:latest
+          tags: ${{ steps.meta.outputs.tags }}
          platforms: linux/amd64,linux/arm64,linux/arm/v7,linux/arm/v8,linux/arm64/v8
          cache-from: type=gha
          cache-to: type=gha,mode=max
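
For reference, the new docker/metadata-action step means one tagged release now yields rolling major and major.minor Docker tags alongside the full version. A rough Python sketch of the expansion the semver patterns perform (the version string is only an example, not taken from this release):

    # Hypothetical illustration of how the semver patterns above expand one release
    # tag into several Docker tags; this mirrors the workflow's intent, not its code.
    def expand_semver_tags(version):
        major, minor, patch = version.split('.')
        return [
            version,             # type=semver,pattern={{version}}         -> 0.49.0
            f"{major}.{minor}",  # type=semver,pattern={{major}}.{{minor}} -> 0.49
            major,               # type=semver,pattern={{major}}           -> 0
        ]

    print(expand_semver_tags("0.49.0"))  # ['0.49.0', '0.49', '0']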

View File

@@ -45,7 +45,6 @@ jobs:
       - name: Test that the basic pip built package runs without error
         run: |
           set -ex
-          sudo pip3 install --upgrade pip
           pip3 install dist/changedetection.io*.whl
           changedetection.io -d /tmp -p 10000 &
           sleep 3

View File

@@ -64,14 +64,16 @@ jobs:
echo "Running processes in docker..." echo "Running processes in docker..."
docker ps docker ps
- name: Test built container with Pytest (generally as requests/plaintext fetching) - name: Run Unit Tests
run: | run: |
# Unit tests # Unit tests
echo "run test with unittest"
docker run test-changedetectionio bash -c 'python3 -m unittest changedetectionio.tests.unit.test_notification_diff' docker run test-changedetectionio bash -c 'python3 -m unittest changedetectionio.tests.unit.test_notification_diff'
docker run test-changedetectionio bash -c 'python3 -m unittest changedetectionio.tests.unit.test_watch_model' docker run test-changedetectionio bash -c 'python3 -m unittest changedetectionio.tests.unit.test_watch_model'
docker run test-changedetectionio bash -c 'python3 -m unittest changedetectionio.tests.unit.test_jinja2_security' docker run test-changedetectionio bash -c 'python3 -m unittest changedetectionio.tests.unit.test_jinja2_security'
docker run test-changedetectionio bash -c 'python3 -m unittest changedetectionio.tests.unit.test_semver'
- name: Test built container with Pytest (generally as requests/plaintext fetching)
run: |
# All tests # All tests
echo "run test with pytest" echo "run test with pytest"
# The default pytest logger_level is TRACE # The default pytest logger_level is TRACE

View File

@@ -120,7 +120,7 @@ Easily add the current web page to your changedetection.io tool, simply install
[<img src="./docs/chrome-extension-screenshot.png" style="max-width:80%;" alt="Chrome Extension to easily add the current web-page to detect a change." title="Chrome Extension to easily add the current web-page to detect a change." />](https://chromewebstore.google.com/detail/changedetectionio-website/kefcfmgmlhmankjmnbijimhofdjekbop) [<img src="./docs/chrome-extension-screenshot.png" style="max-width:80%;" alt="Chrome Extension to easily add the current web-page to detect a change." title="Chrome Extension to easily add the current web-page to detect a change." />](https://chromewebstore.google.com/detail/changedetectionio-website/kefcfmgmlhmankjmnbijimhofdjekbop)
[Goto the Chrome Webstore to download the extension.](https://chromewebstore.google.com/detail/changedetectionio-website/kefcfmgmlhmankjmnbijimhofdjekbop) [Goto the Chrome Webstore to download the extension.](https://chromewebstore.google.com/detail/changedetectionio-website/kefcfmgmlhmankjmnbijimhofdjekbop) ( Or check out the [GitHub repo](https://github.com/dgtlmoon/changedetection.io-browser-extension) )
## Installation ## Installation

View File

@@ -2,7 +2,7 @@
 # Read more https://github.com/dgtlmoon/changedetection.io/wiki
 
-__version__ = '0.48.03'
+__version__ = '0.49.0'
 
 from changedetectionio.strtobool import strtobool
 from json.decoder import JSONDecodeError
@@ -24,6 +24,9 @@ from loguru import logger
 app = None
 datastore = None
 
+def get_version():
+    return __version__
+
 # Parent wrapper or OS sends us a SIGTERM/SIGINT, do everything required for a clean shutdown
 def sigshutdown_handler(_signo, _stack_frame):
     global app

View File

@@ -76,6 +76,7 @@ class Watch(Resource):
         # Return without history, get that via another API call
         # Properties are not returned as a JSON, so add the required props manually
         watch['history_n'] = watch.history_n
+        # attr .last_changed will check for the last written text snapshot on change
         watch['last_changed'] = watch.last_changed
         watch['viewed'] = watch.viewed
         return watch

View File

@@ -1,3 +1,4 @@
+from changedetectionio import apprise_plugin
 import apprise
 
 # Create our AppriseAsset and populate it with some of our new values:

View File

@@ -1,6 +1,8 @@
 # include the decorator
 from apprise.decorators import notify
 from loguru import logger
+from requests.structures import CaseInsensitiveDict
 
 @notify(on="delete")
 @notify(on="deletes")
@@ -13,70 +15,84 @@ from loguru import logger
 def apprise_custom_api_call_wrapper(body, title, notify_type, *args, **kwargs):
     import requests
     import json
+    import re
     from urllib.parse import unquote_plus
-    from apprise.utils import parse_url as apprise_parse_url
-    from apprise import URLBase
+    from apprise.utils.parse import parse_url as apprise_parse_url
 
     url = kwargs['meta'].get('url')
+    schema = kwargs['meta'].get('schema').lower().strip()
 
-    if url.startswith('post'):
-        r = requests.post
-    elif url.startswith('get'):
-        r = requests.get
-    elif url.startswith('put'):
-        r = requests.put
-    elif url.startswith('delete'):
-        r = requests.delete
+    # Choose POST, GET etc from requests
+    method = re.sub(rf's$', '', schema)
+    requests_method = getattr(requests, method)
 
-    url = url.replace('post://', 'http://')
-    url = url.replace('posts://', 'https://')
-    url = url.replace('put://', 'http://')
-    url = url.replace('puts://', 'https://')
-    url = url.replace('get://', 'http://')
-    url = url.replace('gets://', 'https://')
-    url = url.replace('put://', 'http://')
-    url = url.replace('puts://', 'https://')
-    url = url.replace('delete://', 'http://')
-    url = url.replace('deletes://', 'https://')
-
-    headers = {}
-    params = {}
+    params = CaseInsensitiveDict({})  # Added to requests
     auth = None
+    has_error = False
 
     # Convert /foobar?+some-header=hello to proper header dictionary
     results = apprise_parse_url(url)
-    if results:
-        # Add our headers that the user can potentially over-ride if they wish
-        # to to our returned result set and tidy entries by unquoting them
-        headers = {unquote_plus(x): unquote_plus(y)
-                   for x, y in results['qsd+'].items()}
 
-        # https://github.com/caronc/apprise/wiki/Notify_Custom_JSON#get-parameter-manipulation
-        # In Apprise, it relies on prefixing each request arg with "-", because it uses say &method=update as a flag for apprise
-        # but here we are making straight requests, so we need todo convert this against apprise's logic
-        for k, v in results['qsd'].items():
-            if not k.strip('+-') in results['qsd+'].keys():
-                params[unquote_plus(k)] = unquote_plus(v)
+    # Add our headers that the user can potentially over-ride if they wish
+    # to to our returned result set and tidy entries by unquoting them
+    headers = CaseInsensitiveDict({unquote_plus(x): unquote_plus(y)
+                                   for x, y in results['qsd+'].items()})
 
-        # Determine Authentication
-        auth = ''
-        if results.get('user') and results.get('password'):
-            auth = (unquote_plus(results.get('user')), unquote_plus(results.get('user')))
-        elif results.get('user'):
-            auth = (unquote_plus(results.get('user')))
+    # https://github.com/caronc/apprise/wiki/Notify_Custom_JSON#get-parameter-manipulation
+    # In Apprise, it relies on prefixing each request arg with "-", because it uses say &method=update as a flag for apprise
+    # but here we are making straight requests, so we need todo convert this against apprise's logic
+    for k, v in results['qsd'].items():
+        if not k.strip('+-') in results['qsd+'].keys():
+            params[unquote_plus(k)] = unquote_plus(v)
 
-    # Try to auto-guess if it's JSON
-    h = 'application/json; charset=utf-8'
-    try:
-        json.loads(body)
-        headers['Content-Type'] = h
-    except ValueError as e:
-        logger.warning(f"Could not automatically add '{h}' header to the {kwargs['meta'].get('schema')}:// notification because the document failed to parse as JSON: {e}")
-        pass
+    # Determine Authentication
+    auth = ''
+    if results.get('user') and results.get('password'):
+        auth = (unquote_plus(results.get('user')), unquote_plus(results.get('user')))
+    elif results.get('user'):
+        auth = (unquote_plus(results.get('user')))
 
-    r(results.get('url'),
-      auth=auth,
-      data=body.encode('utf-8') if type(body) is str else body,
-      headers=headers,
-      params=params
-      )
+    # If it smells like it could be JSON and no content-type was already set, offer a default content type.
+    if body and '{' in body[:100] and not headers.get('Content-Type'):
+        json_header = 'application/json; charset=utf-8'
+        try:
+            # Try if it's JSON
+            json.loads(body)
+            headers['Content-Type'] = json_header
+        except ValueError as e:
+            logger.warning(f"Could not automatically add '{json_header}' header to the notification because the document failed to parse as JSON: {e}")
+            pass
+
+    # POSTS -> HTTPS etc
+    if schema.lower().endswith('s'):
+        url = re.sub(rf'^{schema}', 'https', results.get('url'))
+    else:
+        url = re.sub(rf'^{schema}', 'http', results.get('url'))
+
+    status_str = ''
+    try:
+        r = requests_method(url,
+                            auth=auth,
+                            data=body.encode('utf-8') if type(body) is str else body,
+                            headers=headers,
+                            params=params
+                            )
+
+        if not (200 <= r.status_code < 300):
+            status_str = f"Error sending '{method.upper()}' request to {url} - Status: {r.status_code}: '{r.reason}'"
+            logger.error(status_str)
+            has_error = True
+        else:
+            logger.info(f"Sent '{method.upper()}' request to {url}")
+            has_error = False
+
+    except requests.RequestException as e:
+        status_str = f"Error sending '{method.upper()}' request to {url} - {str(e)}"
+        logger.error(status_str)
+        has_error = True
+
+    if has_error:
+        raise TypeError(status_str)
+
+    return True
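
As a rough standalone sketch of the refactor above (not the plugin itself): the HTTP verb now comes from the Apprise schema rather than a chain of startswith() checks, and a trailing 's' on the schema selects https when the URL is rebuilt:

    import re
    import requests

    def pick_method_and_url(schema, parsed_url):
        # 'posts' -> 'post', 'deletes' -> 'delete'; the trailing 's' only marks TLS
        method = re.sub(r's$', '', schema.lower().strip())
        requests_method = getattr(requests, method)   # requests.post, requests.get, ...
        # posts:// becomes https://, post:// becomes http://
        target = re.sub(rf'^{schema}', 'https' if schema.endswith('s') else 'http', parsed_url)
        return requests_method, target

    method, url = pick_method_and_url('posts', 'posts://example.com/hook')
    # method is requests.post, url is 'https://example.com/hook'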

View File

@@ -598,17 +598,31 @@ def changedetection_app(config=None, datastore_o=None):
             if 'notification_title' in request.form and request.form['notification_title'].strip():
                 n_object['notification_title'] = request.form.get('notification_title', '').strip()
+            elif datastore.data['settings']['application'].get('notification_title'):
+                n_object['notification_title'] = datastore.data['settings']['application'].get('notification_title')
+            else:
+                n_object['notification_title'] = "Test title"
 
             if 'notification_body' in request.form and request.form['notification_body'].strip():
                 n_object['notification_body'] = request.form.get('notification_body', '').strip()
+            elif datastore.data['settings']['application'].get('notification_body'):
+                n_object['notification_body'] = datastore.data['settings']['application'].get('notification_body')
+            else:
+                n_object['notification_body'] = "Test body"
+
+            n_object['as_async'] = False
             n_object.update(watch.extra_notification_token_values())
 
-            from .notification import process_notification
-            sent_obj = process_notification(n_object, datastore)
+            from . import update_worker
+            new_worker = update_worker.update_worker(update_q, notification_q, app, datastore)
+            new_worker.queue_notification_for_watch(notification_q=notification_q, n_object=n_object, watch=watch)
 
         except Exception as e:
-            return make_response(f"Error: str(e)", 400)
+            e_str = str(e)
+            # Remove this text which is not important and floods the container
+            e_str = e_str.replace(
+                "DEBUG - <class 'apprise.decorators.base.CustomNotifyPlugin.instantiate_plugin.<locals>.CustomNotifyPluginWrapper'>",
+                '')
+
+            return make_response(e_str, 400)
 
         return 'OK - Sent test notifications'

View File

@@ -1,5 +1,6 @@
-from typing import List
+from loguru import logger
 from lxml import etree
+from typing import List
 
 import json
 import re
@@ -298,8 +299,10 @@ def extract_json_as_string(content, json_filter, ensure_is_ldjson_info_type=None):
     # https://github.com/dgtlmoon/changedetection.io/pull/2041#issuecomment-1848397161w
     # Try to parse/filter out the JSON, if we get some parser error, then maybe it's embedded within HTML tags
     try:
-        stripped_text_from_html = _parse_json(json.loads(content), json_filter)
-    except json.JSONDecodeError:
+        # .lstrip("\ufeff") strings ByteOrderMark from UTF8 and still lets the UTF work
+        stripped_text_from_html = _parse_json(json.loads(content.lstrip("\ufeff") ), json_filter)
+    except json.JSONDecodeError as e:
+        logger.warning(str(e))
 
         # Foreach <script json></script> blob.. just return the first that matches json_filter
         # As a last resort, try to parse the whole <body>

View File

@@ -69,7 +69,7 @@ def parse_headers_from_text_file(filepath):
         for l in f.readlines():
             l = l.strip()
             if not l.startswith('#') and ':' in l:
-                (k, v) = l.split(':')
+                (k, v) = l.split(':', 1)  # Split only on the first colon
                 headers[k.strip()] = v.strip()
 
     return headers
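
The one-character change above matters when a header value itself contains a colon (for example a URL). A quick illustration of the difference:

    line = "Referer: https://example.com/page"

    # Old behaviour: splitting on every colon yields three parts and breaks (k, v) unpacking
    parts = line.split(':')   # ['Referer', ' https', '//example.com/page']

    # New behaviour: split only on the first colon, the value keeps its '://'
    k, v = line.split(':', 1)
    assert (k.strip(), v.strip()) == ('Referer', 'https://example.com/page')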

View File

@@ -247,37 +247,32 @@ class model(watch_base):
         bump = self.history
         return self.__newest_history_key
 
-    # Given an arbitrary timestamp, find the closest next key
-    # For example, last_viewed = 1000 so it should return the next 1001 timestamp
-    #
-    # used for the [diff] button so it can preset a smarter from_version
+    # Given an arbitrary timestamp, find the best history key for the [diff] button so it can preset a smarter from_version
     @property
-    def get_next_snapshot_key_to_last_viewed(self):
+    def get_from_version_based_on_last_viewed(self):
         """Unfortunately for now timestamp is stored as string key"""
         keys = list(self.history.keys())
         if not keys:
             return None
+        if len(keys) == 1:
+            return keys[0]
 
         last_viewed = int(self.get('last_viewed'))
-        prev_k = keys[0]
         sorted_keys = sorted(keys, key=lambda x: int(x))
         sorted_keys.reverse()
 
-        # When the 'last viewed' timestamp is greater than the newest snapshot, return second last
-        if last_viewed > int(sorted_keys[0]):
+        # When the 'last viewed' timestamp is greater than or equal the newest snapshot, return second newest
+        if last_viewed >= int(sorted_keys[0]):
             return sorted_keys[1]
 
-        for k in sorted_keys:
-            if int(k) < last_viewed:
-                if prev_k == sorted_keys[0]:
-                    # Return the second last one so we dont recommend the same version compares itself
-                    return sorted_keys[1]
-                return prev_k
-            prev_k = k
-
-        return keys[0]
+        # When the 'last viewed' timestamp is between snapshots, return the older snapshot
+        for newer, older in list(zip(sorted_keys[0:], sorted_keys[1:])):
+            if last_viewed < int(newer) and last_viewed >= int(older):
+                return older
+
+        # When the 'last viewed' timestamp is less than the oldest snapshot, return oldest
+        return sorted_keys[-1]
 
     def get_history_snapshot(self, timestamp):
         import brotli
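
In plain terms, the rewritten property picks the snapshot to diff from: the second newest once everything has been viewed, the snapshot at or just below last_viewed when it falls between snapshots, and the oldest otherwise. A small standalone sketch of that rule (timestamps are made up):

    def pick_from_version(history_keys, last_viewed):
        # Mimics get_from_version_based_on_last_viewed, for illustration only
        keys = sorted(history_keys, key=int, reverse=True)   # newest first
        if not keys:
            return None
        if len(keys) == 1:
            return keys[0]
        if last_viewed >= int(keys[0]):
            return keys[1]                                   # newest already seen -> second newest
        for newer, older in zip(keys, keys[1:]):
            if int(older) <= last_viewed < int(newer):
                return older                                 # the snapshot the user last saw
        return keys[-1]                                      # viewed before the oldest snapshot

    assert pick_from_version(['1000', '2000', '3000'], last_viewed=2500) == '2000'
    assert pick_from_version(['1000', '2000', '3000'], last_viewed=3000) == '2000'
    assert pick_from_version(['1000', '2000', '3000'], last_viewed=10) == '1000'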

View File

@@ -67,6 +67,10 @@ def process_notification(n_object, datastore):
     sent_objs = []
     from .apprise_asset import asset
+
+    if 'as_async' in n_object:
+        asset.async_mode = n_object.get('as_async')
+
     apobj = apprise.Apprise(debug=True, asset=asset)
 
     if not n_object.get('notification_urls'):
@@ -157,8 +161,6 @@ def process_notification(n_object, datastore):
                 attach=n_object.get('screenshot', None)
             )
 
-            # Give apprise time to register an error
-            time.sleep(3)
-
             # Returns empty string if nothing found, multi-line string otherwise
             log_value = logs.getvalue()

View File

@@ -33,8 +33,8 @@ class difference_detection_processor():
         url = self.watch.link
 
-        # Protect against file://, file:/ access, check the real "link" without any meta "source:" etc prepended.
-        if re.search(r'^file:/', url.strip(), re.IGNORECASE):
+        # Protect against file:, file:/, file:// access, check the real "link" without any meta "source:" etc prepended.
+        if re.search(r'^file:', url.strip(), re.IGNORECASE):
             if not strtobool(os.getenv('ALLOW_FILE_URI', 'false')):
                 raise Exception(
                     "file:// type access is denied for security reasons."

View File

@@ -1,42 +1,52 @@
-$(document).ready(function() {
+$(document).ready(function () {
     $('#add-email-helper').click(function (e) {
         e.preventDefault();
         email = prompt("Destination email");
-        if(email) {
+        if (email) {
             var n = $(".notification-urls");
-            var p=email_notification_prefix;
-            $(n).val( $.trim( $(n).val() )+"\n"+email_notification_prefix+email );
+            var p = email_notification_prefix;
+            $(n).val($.trim($(n).val()) + "\n" + email_notification_prefix + email);
         }
     });
 
     $('#send-test-notification').click(function (e) {
         e.preventDefault();
+
         data = {
             notification_body: $('#notification_body').val(),
             notification_format: $('#notification_format').val(),
             notification_title: $('#notification_title').val(),
             notification_urls: $('.notification-urls').val(),
             tags: $('#tags').val(),
             window_url: window.location.href,
         }
+        $('.notifications-wrapper .spinner').fadeIn();
+        $('#notification-test-log').show();
         $.ajax({
             type: "POST",
             url: notification_base_url,
-            data : data,
+            data: data,
             statusCode: {
-                400: function(data) {
-                    // More than likely the CSRF token was lost when the server restarted
-                    alert(data.responseText);
-                }
+                400: function (data) {
+                    $("#notification-test-log>span").text(data.responseText);
+                },
             }
-        }).done(function(data){
-            console.log(data);
-            alert(data);
-        })
+        }).done(function (data) {
+            $("#notification-test-log>span").text(data);
+        }).fail(function (jqXHR, textStatus, errorThrown) {
+            // Handle connection refused or other errors
+            if (textStatus === "error" && errorThrown === "") {
+                console.error("Connection refused or server unreachable");
+                $("#notification-test-log>span").text("Error: Connection refused or server is unreachable.");
+            } else {
+                console.error("Error:", textStatus, errorThrown);
+                $("#notification-test-log>span").text("An error occurred: " + textStatus);
+            }
+        }).always(function () {
+            $('.notifications-wrapper .spinner').hide();
+        })
     });
 });

View File

@@ -380,7 +380,15 @@ a.pure-button-selected {
 }
 
 .notifications-wrapper {
-  padding: 0.5rem 0 1rem 0;
+  padding-top: 0.5rem;
+  #notification-test-log {
+    padding-top: 1rem;
+    white-space: pre-wrap;
+    word-break: break-word;
+    overflow-wrap: break-word;
+    max-width: 100%;
+    box-sizing: border-box;
+  }
 }
 
 label {

View File

@@ -780,7 +780,14 @@ a.pure-button-selected {
   cursor: pointer; }
 
 .notifications-wrapper {
-  padding: 0.5rem 0 1rem 0; }
+  padding-top: 0.5rem; }
+  .notifications-wrapper #notification-test-log {
+    padding-top: 1rem;
+    white-space: pre-wrap;
+    word-break: break-word;
+    overflow-wrap: break-word;
+    max-width: 100%;
+    box-sizing: border-box; }
 
 label:hover {
   cursor: pointer; }

View File

@@ -24,11 +24,13 @@
                 </ul>
             </div>
             <div class="notifications-wrapper">
-                <a id="send-test-notification" class="pure-button button-secondary button-xsmall" >Send test notification</a>
+                <a id="send-test-notification" class="pure-button button-secondary button-xsmall" >Send test notification</a> <div class="spinner" style="display: none;"></div>
                 {% if emailprefix %}
                 <a id="add-email-helper" class="pure-button button-secondary button-xsmall" >Add email <img style="height: 1em; display: inline-block" src="{{url_for('static_content', group='images', filename='email.svg')}}" alt="Add an email address"> </a>
                 {% endif %}
                 <a href="{{url_for('notification_logs')}}" class="pure-button button-secondary button-xsmall" >Notification debug logs</a>
+                <br>
+                <div id="notification-test-log" style="display: none;"><span class="pure-form-message-inline">Processing..</span></div>
             </div>
         </div>
         <div id="notification-customisation" class="pure-control-group">

View File

@@ -191,7 +191,7 @@
                     {% if watch.history_n >= 2 %}
 
                         {% if is_unviewed %}
-                           <a href="{{ url_for('diff_history_page', uuid=watch.uuid, from_version=watch.get_next_snapshot_key_to_last_viewed) }}" target="{{watch.uuid}}" class="pure-button pure-button-primary diff-link">History</a>
+                           <a href="{{ url_for('diff_history_page', uuid=watch.uuid, from_version=watch.get_from_version_based_on_last_viewed) }}" target="{{watch.uuid}}" class="pure-button pure-button-primary diff-link">History</a>
                         {% else %}
                            <a href="{{ url_for('diff_history_page', uuid=watch.uuid)}}" target="{{watch.uuid}}" class="pure-button pure-button-primary diff-link">History</a>
                         {% endif %}

View File

@@ -34,7 +34,7 @@ def test_execute_custom_js(client, live_server, measure_memory_usage):
assert b"unpaused" in res.data assert b"unpaused" in res.data
wait_for_all_checks(client) wait_for_all_checks(client)
uuid = extract_UUID_from_client(client) uuid = next(iter(live_server.app.config['DATASTORE'].data['watching']))
assert live_server.app.config['DATASTORE'].data['watching'][uuid].history_n >= 1, "Watch history had atleast 1 (everything fetched OK)" assert live_server.app.config['DATASTORE'].data['watching'][uuid].history_n >= 1, "Watch history had atleast 1 (everything fetched OK)"
assert b"This text should be removed" not in res.data assert b"This text should be removed" not in res.data

View File

@@ -48,7 +48,7 @@ def test_noproxy_option(client, live_server, measure_memory_usage):
         follow_redirects=True
     )
     assert b"Watch added in Paused state, saving will unpause" in res.data
-    uuid = extract_UUID_from_client(client)
+    uuid = next(iter(live_server.app.config['DATASTORE'].data['watching']))
     res = client.get(
         url_for("edit_page", uuid=uuid, unpause_on_save=1))
     assert b'No proxy' in res.data

View File

@@ -81,7 +81,7 @@ def test_socks5(client, live_server, measure_memory_usage):
assert "Awesome, you made it".encode('utf-8') in res.data assert "Awesome, you made it".encode('utf-8') in res.data
# PROXY CHECKER WIDGET CHECK - this needs more checking # PROXY CHECKER WIDGET CHECK - this needs more checking
uuid = extract_UUID_from_client(client) uuid = next(iter(live_server.app.config['DATASTORE'].data['watching']))
res = client.get( res = client.get(
url_for("check_proxies.start_check", uuid=uuid), url_for("check_proxies.start_check", uuid=uuid),

View File

@@ -1,9 +1,9 @@
 #!/usr/bin/env python3
 
 import os.path
+import time
 
 from flask import url_for
 from .util import live_server_setup, wait_for_all_checks, wait_for_notification_endpoint_output
-from changedetectionio import html_tools
 
 def set_original(excluding=None, add_line=None):

View File

@@ -44,7 +44,6 @@ def set_modified_response():
     return None
 
-
 def is_valid_uuid(val):
     try:
         uuid.UUID(str(val))
@@ -56,8 +55,9 @@ def is_valid_uuid(val):
 
 def test_setup(client, live_server, measure_memory_usage):
     live_server_setup(live_server)
 
+
 def test_api_simple(client, live_server, measure_memory_usage):
-    #live_server_setup(live_server)
+    # live_server_setup(live_server)
 
     api_key = extract_api_key_from_UI(client)
@@ -129,6 +129,9 @@ def test_api_simple(client, live_server, measure_memory_usage):
     assert after_recheck_info['last_checked'] != before_recheck_info['last_checked']
     assert after_recheck_info['last_changed'] != 0
 
+    # #2877 When run in a slow fetcher like playwright etc
+    assert after_recheck_info['last_changed'] == after_recheck_info['last_checked']
+
     # Check history index list
     res = client.get(
         url_for("watchhistory", uuid=watch_uuid),

View File

@@ -99,7 +99,7 @@ def test_check_ldjson_price_autodetect(client, live_server, measure_memory_usage
     assert b'ldjson-price-track-offer' in res.data
 
     # Accept it
-    uuid = extract_UUID_from_client(client)
+    uuid = next(iter(live_server.app.config['DATASTORE'].data['watching']))
     #time.sleep(1)
     client.get(url_for('price_data_follower.accept', uuid=uuid, follow_redirects=True))
     client.get(url_for("form_watch_checknow"), follow_redirects=True)

View File

@@ -2,7 +2,6 @@
 import time
 from flask import url_for
 
-from urllib.request import urlopen
 from .util import set_original_response, set_modified_response, live_server_setup, wait_for_all_checks, extract_rss_token_from_UI, \
     extract_UUID_from_client
 
@@ -69,7 +68,7 @@ def test_check_basic_change_detection_functionality(client, live_server, measure_memory_usage):
 
     wait_for_all_checks(client)
 
-    uuid = extract_UUID_from_client(client)
+    uuid = next(iter(live_server.app.config['DATASTORE'].data['watching']))
 
     # Check the 'get latest snapshot works'
     res = client.get(url_for("watch_get_latest_html", uuid=uuid))

View File

@@ -40,7 +40,7 @@ def test_check_encoding_detection(client, live_server, measure_memory_usage):
     # Content type recording worked
-    uuid = extract_UUID_from_client(client)
+    uuid = next(iter(live_server.app.config['DATASTORE'].data['watching']))
     assert live_server.app.config['DATASTORE'].data['watching'][uuid]['content-type'] == "text/html"
 
     res = client.get(

View File

@@ -51,7 +51,7 @@ def run_filter_test(client, live_server, content_filter):
assert b"1 Imported" in res.data assert b"1 Imported" in res.data
wait_for_all_checks(client) wait_for_all_checks(client)
uuid = extract_UUID_from_client(client) uuid = next(iter(live_server.app.config['DATASTORE'].data['watching']))
assert live_server.app.config['DATASTORE'].data['watching'][uuid]['consecutive_filter_failures'] == 0, "No filter = No filter failure" assert live_server.app.config['DATASTORE'].data['watching'][uuid]['consecutive_filter_failures'] == 0, "No filter = No filter failure"

View File

@@ -288,7 +288,7 @@ def test_clone_tag_on_import(client, live_server, measure_memory_usage):
     assert b'test-tag' in res.data
     assert b'another-tag' in res.data
 
-    watch_uuid = extract_UUID_from_client(client)
+    watch_uuid = next(iter(live_server.app.config['DATASTORE'].data['watching']))
 
     res = client.get(url_for("form_clone", uuid=watch_uuid), follow_redirects=True)
     assert b'Cloned' in res.data
@@ -315,7 +315,7 @@ def test_clone_tag_on_quickwatchform_add(client, live_server, measure_memory_usage):
     assert b'test-tag' in res.data
     assert b'another-tag' in res.data
 
-    watch_uuid = extract_UUID_from_client(client)
+    watch_uuid = next(iter(live_server.app.config['DATASTORE'].data['watching']))
 
     res = client.get(url_for("form_clone", uuid=watch_uuid), follow_redirects=True)
     assert b'Cloned' in res.data

View File

@@ -36,7 +36,7 @@ def test_ignore(client, live_server, measure_memory_usage):
     # Give the thread time to pick it up
     wait_for_all_checks(client)
 
-    uuid = extract_UUID_from_client(client)
+    uuid = next(iter(live_server.app.config['DATASTORE'].data['watching']))
 
     # use the highlighter endpoint
     res = client.post(
         url_for("highlight_submit_ignore_url", uuid=uuid),

View File

@@ -514,3 +514,15 @@ def test_check_jq_ext_filter(client, live_server, measure_memory_usage):
 def test_check_jqraw_ext_filter(client, live_server, measure_memory_usage):
     if jq_support:
         check_json_ext_filter('jq:.[] | select(.status | contains("Sold"))', client, live_server)
+
+def test_jsonpath_BOM_utf8(client, live_server, measure_memory_usage):
+    from .. import html_tools
+
+    # JSON string with BOM and correct double-quoted keys
+    json_str = '\ufeff{"name": "José", "emoji": "😊", "language": "中文", "greeting": "Привет"}'
+
+    # See that we can find the second <script> one, which is not broken, and matches our filter
+    text = html_tools.extract_json_as_string(json_str, "json:$.name")
+    assert text == '"José"'
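
The added test exercises the BOM fix from commit cab11ced5f: json.loads() rejects a string that still carries a UTF-8 byte order mark, so the filter strips it first. A small standalone sketch:

    import json

    raw = '\ufeff{"name": "José"}'   # BOM-prefixed JSON, as served by some endpoints

    try:
        json.loads(raw)              # raises JSONDecodeError: Unexpected UTF-8 BOM
    except json.JSONDecodeError as e:
        print(f"without stripping: {e}")

    data = json.loads(raw.lstrip('\ufeff'))   # the approach used in extract_json_as_string()
    assert data['name'] == 'José'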

View File

@@ -29,7 +29,7 @@ def test_content_filter_live_preview(client, live_server, measure_memory_usage):
data={"url": test_url, "tags": ''}, data={"url": test_url, "tags": ''},
follow_redirects=True follow_redirects=True
) )
uuid = extract_UUID_from_client(client) uuid = next(iter(live_server.app.config['DATASTORE'].data['watching']))
res = client.post( res = client.post(
url_for("edit_page", uuid=uuid), url_for("edit_page", uuid=uuid),
data={ data={

View File

@@ -48,7 +48,7 @@ def test_check_basic_change_detection_functionality(client, live_server, measure
     #####################
     client.post(
         url_for("settings_page"),
-        data={"application-empty_pages_are_a_change": "",
+        data={"application-empty_pages_are_a_change": "",  # default, OFF, they are NOT a change
               "requests-time_between_check-minutes": 180,
               'application-fetch_backend': "html_requests"},
         follow_redirects=True
@@ -66,6 +66,14 @@ def test_check_basic_change_detection_functionality(client, live_server, measure_memory_usage):
     res = client.get(url_for("index"))
     assert b'unviewed' not in res.data
 
+    uuid = next(iter(live_server.app.config['DATASTORE'].data['watching']))
+    watch = live_server.app.config['DATASTORE'].data['watching'][uuid]
+
+    assert watch.last_changed == 0
+    assert watch['last_checked'] != 0
+
     # ok now do the opposite
@@ -92,6 +100,10 @@ def test_check_basic_change_detection_functionality(client, live_server, measure_memory_usage):
     # A totally zero byte (#2528) response should also not trigger an error
     set_zero_byte_response()
     client.get(url_for("form_watch_checknow"), follow_redirects=True)
+
+    # 2877
+    assert watch.last_changed == watch['last_checked']
+
     wait_for_all_checks(client)
     res = client.get(url_for("index"))
     assert b'unviewed' in res.data  # A change should have registered because empty_pages_are_a_change is ON

View File

@@ -6,7 +6,7 @@ from flask import url_for
 from loguru import logger
 
 from .util import set_original_response, set_modified_response, set_more_modified_response, live_server_setup, wait_for_all_checks, \
-    set_longer_modified_response
+    set_longer_modified_response, get_index
 from . util import extract_UUID_from_client
 import logging
 import base64
@@ -29,7 +29,7 @@ def test_check_notification(client, live_server, measure_memory_usage):
     # Re 360 - new install should have defaults set
     res = client.get(url_for("settings_page"))
-    notification_url = url_for('test_notification_endpoint', _external=True).replace('http', 'json')
+    notification_url = url_for('test_notification_endpoint', _external=True).replace('http', 'json')+"?status_code=204"
 
     assert default_notification_body.encode() in res.data
     assert default_notification_title.encode() in res.data
@@ -76,7 +76,7 @@ def test_check_notification(client, live_server, measure_memory_usage):
     testimage_png = 'iVBORw0KGgoAAAANSUhEUgAAAAEAAAABCAQAAAC1HAwCAAAAC0lEQVR42mNkYAAAAAYAAjCB0C8AAAAASUVORK5CYII='
 
-    uuid = extract_UUID_from_client(client)
+    uuid = next(iter(live_server.app.config['DATASTORE'].data['watching']))
     datastore = 'test-datastore'
     with open(os.path.join(datastore, str(uuid), 'last-screenshot.png'), 'wb') as f:
         f.write(base64.b64decode(testimage_png))
@@ -135,7 +135,14 @@ def test_check_notification(client, live_server, measure_memory_usage):
     # Trigger a check
     client.get(url_for("form_watch_checknow"), follow_redirects=True)
+    wait_for_all_checks(client)
     time.sleep(3)
 
+    # Check no errors were recorded
+    res = client.get(url_for("index"))
+    assert b'notification-error' not in res.data
+
     # Verify what was sent as a notification, this file should exist
     with open("test-datastore/notification.txt", "r") as f:
         notification_submission = f.read()
@@ -284,7 +291,7 @@ def test_notification_custom_endpoint_and_jinja2(client, live_server, measure_memory_usage):
     # CUSTOM JSON BODY CHECK for POST://
     set_original_response()
     # https://github.com/caronc/apprise/wiki/Notify_Custom_JSON#header-manipulation
-    test_notification_url = url_for('test_notification_endpoint', _external=True).replace('http://', 'post://')+"?xxx={{ watch_url }}&+custom-header=123&+second=hello+world%20%22space%22"
+    test_notification_url = url_for('test_notification_endpoint', _external=True).replace('http://', 'post://')+"?status_code=204&xxx={{ watch_url }}&+custom-header=123&+second=hello+world%20%22space%22"
 
     res = client.post(
         url_for("settings_page"),
@@ -319,6 +326,11 @@ def test_notification_custom_endpoint_and_jinja2(client, live_server, measure_memory_usage):
     time.sleep(2)  # plus extra delay for notifications to fire
 
+    # Check no errors were recorded, because we asked for 204 which is slightly uncommon but is still OK
+    res = get_index(client)
+    assert b'notification-error' not in res.data
+
     with open("test-datastore/notification.txt", 'r') as f:
         x = f.read()
         j = json.loads(x)
@@ -360,7 +372,10 @@ def test_global_send_test_notification(client, live_server, measure_memory_usage):
     #live_server_setup(live_server)
     set_original_response()
     if os.path.isfile("test-datastore/notification.txt"):
         os.unlink("test-datastore/notification.txt")
 
+    # 1995 UTF-8 content should be encoded
+    test_body = 'change detection is cool 网站监测 内容更新了'
+
     # otherwise other settings would have already existed from previous tests in this file
     res = client.post(
@@ -368,8 +383,7 @@ def test_global_send_test_notification(client, live_server, measure_memory_usage):
         data={
             "application-fetch_backend": "html_requests",
             "application-minutes_between_check": 180,
-            #1995 UTF-8 content should be encoded
-            "application-notification_body": 'change detection is cool 网站监测 内容更新了',
+            "application-notification_body": test_body,
             "application-notification_format": default_notification_format,
             "application-notification_urls": "",
             "application-notification_title": "New ChangeDetection.io Notification - {{ watch_url }}",
@@ -399,12 +413,10 @@ def test_global_send_test_notification(client, live_server, measure_memory_usage):
     assert res.status_code != 400
     assert res.status_code != 500
 
-    # Give apprise time to fire
-    time.sleep(4)
-
     with open("test-datastore/notification.txt", 'r') as f:
         x = f.read()
-        assert 'change detection is cool 网站监测 内容更新了' in x
+        assert test_body in x
 
     os.unlink("test-datastore/notification.txt")

View File

@@ -373,13 +373,14 @@ def test_headers_textfile_in_request(client, live_server, measure_memory_usage):
     wait_for_all_checks(client)
 
     with open('test-datastore/headers-testtag.txt', 'w') as f:
-        f.write("tag-header: test")
+        f.write("tag-header: test\r\nurl-header: http://example.com")
 
     with open('test-datastore/headers.txt', 'w') as f:
-        f.write("global-header: nice\r\nnext-global-header: nice")
+        f.write("global-header: nice\r\nnext-global-header: nice\r\nurl-header-global: http://example.com/global")
 
-    with open('test-datastore/' + extract_UUID_from_client(client) + '/headers.txt', 'w') as f:
-        f.write("watch-header: nice")
+    uuid = next(iter(live_server.app.config['DATASTORE'].data['watching']))
+    with open(f'test-datastore/{uuid}/headers.txt', 'w') as f:
+        f.write("watch-header: nice\r\nurl-header-watch: http://example.com/watch")
 
     wait_for_all_checks(client)
     client.get(url_for("form_watch_checknow"), follow_redirects=True)
@@ -410,6 +411,9 @@ def test_headers_textfile_in_request(client, live_server, measure_memory_usage):
     assert b"Xxx:ooo" in res.data
     assert b"Watch-Header:nice" in res.data
     assert b"Tag-Header:test" in res.data
+    assert b"Url-Header:http://example.com" in res.data
+    assert b"Url-Header-Global:http://example.com/global" in res.data
+    assert b"Url-Header-Watch:http://example.com/watch" in res.data
 
     # Check the custom UA from system settings page made it through
     if os.getenv('PLAYWRIGHT_DRIVER_URL'):

View File

@@ -189,6 +189,17 @@ def _run_test_minmax_limit(client, extra_watch_edit_form):
client.get(url_for("mark_all_viewed")) client.get(url_for("mark_all_viewed"))
# 2715 - Price detection (once it crosses the "lower" threshold) again with a lower price - should trigger again!
set_original_response(props_markup=instock_props[0], price='820.45')
res = client.get(url_for("form_watch_checknow"), follow_redirects=True)
assert b'1 watches queued for rechecking.' in res.data
wait_for_all_checks(client)
res = client.get(url_for("index"))
assert b'820.45' in res.data
assert b'unviewed' in res.data
client.get(url_for("mark_all_viewed"))
# price changed to something MORE than max (1100.10), SHOULD be a change # price changed to something MORE than max (1100.10), SHOULD be a change
set_original_response(props_markup=instock_props[0], price='1890.45') set_original_response(props_markup=instock_props[0], price='1890.45')
client.get(url_for("form_watch_checknow"), follow_redirects=True) client.get(url_for("form_watch_checknow"), follow_redirects=True)
@@ -203,7 +214,7 @@ def _run_test_minmax_limit(client, extra_watch_edit_form):
 def test_restock_itemprop_minmax(client, live_server):
-    # live_server_setup(live_server)
+    #live_server_setup(live_server)
     extras = {
         "restock_settings-follow_price_changes": "y",
         "restock_settings-price_change_min": 900.0,
@@ -369,7 +380,7 @@ def test_change_with_notification_values(client, live_server):
## Now test the "SEND TEST NOTIFICATION" is working ## Now test the "SEND TEST NOTIFICATION" is working
os.unlink("test-datastore/notification.txt") os.unlink("test-datastore/notification.txt")
uuid = extract_UUID_from_client(client) uuid = next(iter(live_server.app.config['DATASTORE'].data['watching']))
res = client.post(url_for("ajax_callback_send_notification_test", watch_uuid=uuid), data={}, follow_redirects=True) res = client.post(url_for("ajax_callback_send_notification_test", watch_uuid=uuid), data={}, follow_redirects=True)
time.sleep(5) time.sleep(5)
assert os.path.isfile("test-datastore/notification.txt"), "Notification received" assert os.path.isfile("test-datastore/notification.txt"), "Notification received"


@@ -132,7 +132,7 @@ def test_rss_xpath_filtering(client, live_server, measure_memory_usage):
     )
     assert b"Watch added in Paused state, saving will unpause" in res.data
-    uuid = extract_UUID_from_client(client)
+    uuid = next(iter(live_server.app.config['DATASTORE'].data['watching']))
     res = client.post(
         url_for("edit_page", uuid=uuid, unpause_on_save=1),
         data={


@@ -39,7 +39,7 @@ def test_check_basic_scheduler_functionality(client, live_server, measure_memory
assert b"1 Imported" in res.data assert b"1 Imported" in res.data
wait_for_all_checks(client) wait_for_all_checks(client)
uuid = extract_UUID_from_client(client) uuid = next(iter(live_server.app.config['DATASTORE'].data['watching']))
# Setup all the days of the weeks using XXX as the placeholder for monday/tuesday/etc # Setup all the days of the weeks using XXX as the placeholder for monday/tuesday/etc
@@ -104,7 +104,7 @@ def test_check_basic_global_scheduler_functionality(client, live_server, measure
assert b"1 Imported" in res.data assert b"1 Imported" in res.data
wait_for_all_checks(client) wait_for_all_checks(client)
uuid = extract_UUID_from_client(client) uuid = next(iter(live_server.app.config['DATASTORE'].data['watching']))
# Setup all the days of the weeks using XXX as the placeholder for monday/tuesday/etc # Setup all the days of the weeks using XXX as the placeholder for monday/tuesday/etc


@@ -1,9 +1,7 @@
 import os
 from flask import url_for
-from .util import set_original_response, set_modified_response, live_server_setup, wait_for_all_checks
-import time
+from .util import live_server_setup, wait_for_all_checks
 from .. import strtobool
@@ -61,54 +59,44 @@ def test_bad_access(client, live_server, measure_memory_usage):
     assert b'Watch protocol is not permitted by SAFE_PROTOCOL_REGEX' in res.data

-def test_file_slashslash_access(client, live_server, measure_memory_usage):
-    #live_server_setup(live_server)
-    test_file_path = os.path.abspath(__file__)
-    # file:// is permitted by default, but it will be caught by ALLOW_FILE_URI
+def _runner_test_various_file_slash(client, file_uri):
     client.post(
         url_for("form_quick_watch_add"),
-        data={"url": f"file://{test_file_path}", "tags": ''},
+        data={"url": file_uri, "tags": ''},
         follow_redirects=True
     )
     wait_for_all_checks(client)
     res = client.get(url_for("index"))
+
+    substrings = [b"URLs with hostname components are not permitted", b"No connection adapters were found for"]

     # If it is enabled at test time
     if strtobool(os.getenv('ALLOW_FILE_URI', 'false')):
-        res = client.get(
-            url_for("preview_page", uuid="first"),
-            follow_redirects=True
-        )
-        assert b"test_file_slashslash_access" in res.data
-    else:
-        # Default should be here
-        assert b'file:// type access is denied for security reasons.' in res.data
+        if file_uri.startswith('file:///'):
+            # This one should be the full qualified path to the file and should get the contents of this file
+            res = client.get(
+                url_for("preview_page", uuid="first"),
+                follow_redirects=True
+            )
+            assert b'_runner_test_various_file_slash' in res.data
+        else:
+            # This will give some error from requests or if it went to chrome, will give some other error :-)
+            assert any(s in res.data for s in substrings)
+
+    res = client.get(url_for("form_delete", uuid="all"), follow_redirects=True)
+    assert b'Deleted' in res.data

 def test_file_slash_access(client, live_server, measure_memory_usage):
     #live_server_setup(live_server)
+    # file: is NOT permitted by default, so it will be caught by ALLOW_FILE_URI check
     test_file_path = os.path.abspath(__file__)
-    # file:// is permitted by default, but it will be caught by ALLOW_FILE_URI
-    client.post(
-        url_for("form_quick_watch_add"),
-        data={"url": f"file:/{test_file_path}", "tags": ''},
-        follow_redirects=True
-    )
-    wait_for_all_checks(client)
-    res = client.get(url_for("index"))
-
-    # If it is enabled at test time
-    if strtobool(os.getenv('ALLOW_FILE_URI', 'false')):
-        # So it should permit it, but it should fall back to the 'requests' library giving an error
-        # (but means it gets passed to playwright etc)
-        assert b"URLs with hostname components are not permitted" in res.data
-    else:
-        # Default should be here
-        assert b'file:// type access is denied for security reasons.' in res.data
+    _runner_test_various_file_slash(client, file_uri=f"file://{test_file_path}")
+    _runner_test_various_file_slash(client, file_uri=f"file:/{test_file_path}")
+    _runner_test_various_file_slash(client, file_uri=f"file:{test_file_path}")  # CVE-2024-56509

 def test_xss(client, live_server, measure_memory_usage):
     #live_server_setup(live_server)
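The refactor above funnels the file://, file:/ and file: spellings (the last one being the CVE-2024-56509 case) through a single runner, because all three can resolve to a local file. A sketch of the kind of scheme guard the test is exercising; the function below is illustrative only, not the project's actual check:

from urllib.parse import urlparse

# Illustrative guard only: refuse any URL whose scheme resolves to the local
# filesystem unless file access has been explicitly allowed.
def is_permitted_url(url, allow_file_uri=False):
    scheme = urlparse(url).scheme.lower()
    if scheme == 'file':   # catches the file://, file:/ and file: forms alike
        return allow_file_uri
    return scheme in ('http', 'https')

assert not is_permitted_url("file:///etc/passwd")
assert not is_permitted_url("file:/etc/passwd")
assert is_permitted_url("https://example.com")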


@@ -0,0 +1,64 @@
+#!/usr/bin/env python3
+
+# run from dir above changedetectionio/ dir
+# python3 -m unittest changedetectionio.tests.unit.test_semver
+
+import re
+import unittest
+
+# The SEMVER regex
+SEMVER_REGEX = r"^(?P<major>0|[1-9]\d*)\.(?P<minor>0|[1-9]\d*)\.(?P<patch>0|[1-9]\d*)(?:-(?P<prerelease>(?:0|[1-9]\d*|\d*[a-zA-Z-][0-9a-zA-Z-]*)(?:\.(?:0|[1-9]\d*|\d*[a-zA-Z-][0-9a-zA-Z-]*))*))?(?:\+(?P<buildmetadata>[0-9a-zA-Z-]+(?:\.[0-9a-zA-Z-]+)*))?$"
+
+# Compile the regex
+semver_pattern = re.compile(SEMVER_REGEX)
+
+class TestSemver(unittest.TestCase):
+
+    def test_valid_versions(self):
+        """Test valid semantic version strings"""
+        valid_versions = [
+            "1.0.0",
+            "0.1.0",
+            "0.0.1",
+            "1.0.0-alpha",
+            "1.0.0-alpha.1",
+            "1.0.0-0.3.7",
+            "1.0.0-x.7.z.92",
+            "1.0.0-alpha+001",
+            "1.0.0+20130313144700",
+            "1.0.0-beta+exp.sha.5114f85"
+        ]
+        for version in valid_versions:
+            with self.subTest(version=version):
+                self.assertIsNotNone(semver_pattern.match(version), f"Version {version} should be valid")
+
+    def test_invalid_versions(self):
+        """Test invalid semantic version strings"""
+        invalid_versions = [
+            "0.48.06",
+            "1.0",
+            "1.0.0-",
+            # Seems to pass the semver.org regex?
+            # "1.0.0-alpha-",
+            "1.0.0+",
+            "1.0.0-alpha+",
+            "1.0.0-",
+            "01.0.0",
+            "1.01.0",
+            "1.0.01",
+            ".1.0.0",
+            "1..0.0"
+        ]
+        for version in invalid_versions:
+            with self.subTest(version=version):
+                res = semver_pattern.match(version)
+                self.assertIsNone(res, f"Version '{version}' should be invalid")
+
+    def test_our_version(self):
+        from changedetectionio import get_version
+        our_version = get_version()
+        self.assertIsNotNone(semver_pattern.match(our_version), f"Our version '{our_version}' should be a valid SEMVER string")
+
+if __name__ == '__main__':
+    unittest.main()
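Because the pattern uses named groups, the same regex can also split a release string into its components once it has validated it. A small usage sketch built directly on the SEMVER_REGEX defined above:

import re

SEMVER_REGEX = r"^(?P<major>0|[1-9]\d*)\.(?P<minor>0|[1-9]\d*)\.(?P<patch>0|[1-9]\d*)(?:-(?P<prerelease>(?:0|[1-9]\d*|\d*[a-zA-Z-][0-9a-zA-Z-]*)(?:\.(?:0|[1-9]\d*|\d*[a-zA-Z-][0-9a-zA-Z-]*))*))?(?:\+(?P<buildmetadata>[0-9a-zA-Z-]+(?:\.[0-9a-zA-Z-]+)*))?$"

m = re.match(SEMVER_REGEX, "0.49.0")
assert m is not None
# Named groups give direct access to each part of the version string
assert (m.group('major'), m.group('minor'), m.group('patch')) == ('0', '49', '0')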


@@ -16,7 +16,6 @@ class TestDiffBuilder(unittest.TestCase):
         watch = Watch.model(datastore_path='/tmp', default={})
         watch.ensure_data_dir_exists()
-        watch['last_viewed'] = 110

         # Contents from the browser are always returned from the browser/requests/etc as str, str is basically UTF-16 in python
         watch.save_history_text(contents="hello world", timestamp=100, snapshot_id=str(uuid_builder.uuid4()))
@@ -25,31 +24,42 @@ class TestDiffBuilder(unittest.TestCase):
watch.save_history_text(contents="hello world", timestamp=112, snapshot_id=str(uuid_builder.uuid4())) watch.save_history_text(contents="hello world", timestamp=112, snapshot_id=str(uuid_builder.uuid4()))
watch.save_history_text(contents="hello world", timestamp=115, snapshot_id=str(uuid_builder.uuid4())) watch.save_history_text(contents="hello world", timestamp=115, snapshot_id=str(uuid_builder.uuid4()))
watch.save_history_text(contents="hello world", timestamp=117, snapshot_id=str(uuid_builder.uuid4())) watch.save_history_text(contents="hello world", timestamp=117, snapshot_id=str(uuid_builder.uuid4()))
p = watch.get_from_version_based_on_last_viewed
assert p == "100", "Correct 'last viewed' timestamp was detected"
p = watch.get_next_snapshot_key_to_last_viewed watch['last_viewed'] = 110
assert p == "112", "Correct last-viewed timestamp was detected" p = watch.get_from_version_based_on_last_viewed
assert p == "109", "Correct 'last viewed' timestamp was detected"
# When there is only one step of difference from the end of the list, it should return second-last change
watch['last_viewed'] = 116 watch['last_viewed'] = 116
p = watch.get_next_snapshot_key_to_last_viewed p = watch.get_from_version_based_on_last_viewed
assert p == "115", "Correct 'second last' last-viewed timestamp was detected when using the last timestamp" assert p == "115", "Correct 'last viewed' timestamp was detected"
watch['last_viewed'] = 99 watch['last_viewed'] = 99
p = watch.get_next_snapshot_key_to_last_viewed p = watch.get_from_version_based_on_last_viewed
assert p == "100" assert p == "100", "When the 'last viewed' timestamp is less than the oldest snapshot, return oldest"
watch['last_viewed'] = 200 watch['last_viewed'] = 200
p = watch.get_next_snapshot_key_to_last_viewed p = watch.get_from_version_based_on_last_viewed
assert p == "115", "When the 'last viewed' timestamp is greater than the newest snapshot, return second last " assert p == "115", "When the 'last viewed' timestamp is greater than the newest snapshot, return second newest"
watch['last_viewed'] = 109 watch['last_viewed'] = 109
p = watch.get_next_snapshot_key_to_last_viewed p = watch.get_from_version_based_on_last_viewed
assert p == "109", "Correct when its the same time" assert p == "109", "Correct when its the same time"
# new empty one # new empty one
watch = Watch.model(datastore_path='/tmp', default={}) watch = Watch.model(datastore_path='/tmp', default={})
p = watch.get_next_snapshot_key_to_last_viewed p = watch.get_from_version_based_on_last_viewed
assert p == None, "None when no history available" assert p == None, "None when no history available"
watch.save_history_text(contents="hello world", timestamp=100, snapshot_id=str(uuid_builder.uuid4()))
p = watch.get_from_version_based_on_last_viewed
assert p == "100", "Correct with only one history snapshot"
watch['last_viewed'] = 200
p = watch.get_from_version_based_on_last_viewed
assert p == "100", "Correct with only one history snapshot"
if __name__ == '__main__': if __name__ == '__main__':
unittest.main() unittest.main()
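The assertions above pin down the selection rule: return the newest snapshot key that is at or before last_viewed, fall back to the oldest key when last_viewed is earlier than all history, and to the second-newest when last_viewed is at or past the newest snapshot. A standalone sketch of that rule over a plain list of keys (this is an illustration of the behaviour the test describes, not the Watch model's actual property):

# Illustrative re-implementation of the selection logic the assertions above describe.
def from_version_based_on_last_viewed(history_keys, last_viewed):
    """history_keys: sorted list of str epoch keys, e.g. ["100", "109", "115", "117"]"""
    if not history_keys:
        return None
    if len(history_keys) == 1:
        return history_keys[0]
    # Everything already viewed? Suggest diffing from the second-newest snapshot.
    if last_viewed >= int(history_keys[-1]):
        return history_keys[-2]
    # Otherwise: the newest snapshot at or before the last viewed time, else the oldest.
    at_or_before = [k for k in history_keys if int(k) <= last_viewed]
    return at_or_before[-1] if at_or_before else history_keys[0]

keys = ["100", "105", "109", "112", "115", "117"]
assert from_version_based_on_last_viewed(keys, 110) == "109"
assert from_version_based_on_last_viewed(keys, 99) == "100"
assert from_version_based_on_last_viewed(keys, 200) == "115"
assert from_version_based_on_last_viewed(["100"], 200) == "100"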


@@ -76,6 +76,14 @@ def set_more_modified_response():
     return None

+def set_empty_text_response():
+    test_return_data = """<html><body></body></html>"""
+
+    with open("test-datastore/endpoint-content.txt", "w") as f:
+        f.write(test_return_data)
+
+    return None

 def wait_for_notification_endpoint_output():
     '''Apprise can take a few seconds to fire'''
     #@todo - could check the apprise object directly instead of looking for this file
@@ -215,9 +223,10 @@ def live_server_setup(live_server):
     def test_method():
         return request.method

-    # Where we POST to as a notification
-    @live_server.app.route('/test_notification_endpoint', methods=['POST', 'GET'])
+    # Where we POST to as a notification, also use a space here to test URL escaping is OK across all tests that use this. ( #2868 )
+    @live_server.app.route('/test_notification endpoint', methods=['POST', 'GET'])
     def test_notification_endpoint():
         with open("test-datastore/notification.txt", "wb") as f:
             # Debug method, dump all POST to file also, used to prove #65
             data = request.stream.read()
@@ -235,8 +244,11 @@ def live_server_setup(live_server):
             f.write(request.content_type)
         print("\n>> Test notification endpoint was hit.\n", data)
-        return "Text was set"
+
+        content = "Text was set"
+        status_code = request.args.get('status_code',200)
+        resp = make_response(content, status_code)
+        return resp

     # Just return the verb in the request
     @live_server.app.route('/test-basicauth', methods=['GET'])
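Because the endpoint now builds its reply with make_response and reads an optional status_code query argument, a test can ask the notification target to answer with an error and then assert on how that failure is surfaced. A small hedged sketch of that usage (the assertions are illustrative; client is the usual Flask test client fixture used throughout these tests):

# Illustrative only: ask the test notification endpoint to reply with a chosen status code.
res = client.get(url_for('test_notification_endpoint', status_code=204))
assert res.status_code == 204

res = client.post(url_for('test_notification_endpoint', status_code=500), data=b'some notification body')
assert res.status_code == 500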
@@ -273,15 +285,43 @@ def live_server_setup(live_server):
<p id="remove">This text should be removed</p> <p id="remove">This text should be removed</p>
<form onsubmit="event.preventDefault();"> <form onsubmit="event.preventDefault();">
<!-- obfuscated text so that we dont accidentally get a false positive due to conversion of the source :) ---> <!-- obfuscated text so that we dont accidentally get a false positive due to conversion of the source :) --->
<button name="test-button" onclick="getElementById('remove').remove();getElementById('some-content').innerHTML = atob('SSBzbWVsbCBKYXZhU2NyaXB0IGJlY2F1c2UgdGhlIGJ1dHRvbiB3YXMgcHJlc3NlZCE=')">Click here</button> <button name="test-button" onclick="
<div id=some-content></div> getElementById('remove').remove();
getElementById('some-content').innerHTML = atob('SSBzbWVsbCBKYXZhU2NyaXB0IGJlY2F1c2UgdGhlIGJ1dHRvbiB3YXMgcHJlc3NlZCE=');
getElementById('reflect-text').innerHTML = getElementById('test-input-text').value;
">Click here</button>
<div id="some-content"></div>
<pre> <pre>
{header_text.lower()} {header_text.lower()}
</pre> </pre>
</body>
<br>
<!-- used for testing that the jinja2 compiled here --->
<input type="text" value="" id="test-input-text" /><br>
<div id="reflect-text">Waiting to reflect text from #test-input-text here</div>
</form>
</body>
</html>""", 200) </html>""", 200)
resp.headers['Content-Type'] = 'text/html' resp.headers['Content-Type'] = 'text/html'
return resp return resp
live_server.start() live_server.start()
def get_index(client):
import inspect
# Get the caller's frame (parent function)
frame = inspect.currentframe()
caller_frame = frame.f_back # Go back to the caller's frame
caller_name = caller_frame.f_code.co_name
caller_line = caller_frame.f_lineno
print(f"Called by: {caller_name}, Line: {caller_line}")
res = client.get(url_for("index"))
with open(f"test-datastore/index-{caller_name}-{caller_line}.html", 'wb') as f:
f.write(res.data)
return res
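The new get_index() helper behaves like client.get(url_for("index")) but also writes the HTML it received to test-datastore/index-<caller>-<line>.html, so a failing assertion can be inspected after the run. A hypothetical test using it (the test name and assertion here are illustrative):

from .util import get_index

def test_index_shows_fetch_error(client, live_server):   # hypothetical test
    res = get_index(client)
    # Same response object client.get(url_for("index")) would return,
    # plus an HTML dump written under test-datastore/ for debugging.
    assert b'Error - 404' in res.data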


@@ -2,14 +2,16 @@
 import os
 from flask import url_for
-from ..util import live_server_setup, wait_for_all_checks, extract_UUID_from_client
+from ..util import live_server_setup, wait_for_all_checks, get_index

-def test_setup(client, live_server, measure_memory_usage):
+def test_setup(client, live_server):
     live_server_setup(live_server)

 # Add a site in paused mode, add an invalid filter, we should still have visual selector data ready
 def test_visual_selector_content_ready(client, live_server, measure_memory_usage):
+    live_server.stop()
+    live_server.start()

     import os
     import json
@@ -27,7 +29,7 @@ def test_visual_selector_content_ready(client, live_server, measure_memory_usage
         follow_redirects=True
     )
     assert b"Watch added in Paused state, saving will unpause" in res.data
-    uuid = extract_UUID_from_client(client)
+    uuid = next(iter(live_server.app.config['DATASTORE'].data['watching']))
     res = client.post(
         url_for("edit_page", uuid=uuid, unpause_on_save=1),
         data={
@@ -87,7 +89,9 @@ def test_visual_selector_content_ready(client, live_server, measure_memory_usage
 def test_basic_browserstep(client, live_server, measure_memory_usage):
-    #live_server_setup(live_server)
+    live_server.stop()
+    live_server.start()

     assert os.getenv('PLAYWRIGHT_DRIVER_URL'), "Needs PLAYWRIGHT_DRIVER_URL set for this test"
     test_url = url_for('test_interactive_html_endpoint', _external=True)
@@ -108,9 +112,13 @@ def test_basic_browserstep(client, live_server, measure_memory_usage):
"url": test_url, "url": test_url,
"tags": "", "tags": "",
'fetch_backend': "html_webdriver", 'fetch_backend': "html_webdriver",
'browser_steps-0-operation': 'Click element', 'browser_steps-0-operation': 'Enter text in field',
'browser_steps-0-selector': 'button[name=test-button]', 'browser_steps-0-selector': '#test-input-text',
'browser_steps-0-optional_value': '', # Should get set to the actual text (jinja2 rendered)
'browser_steps-0-optional_value': "Hello-Jinja2-{% now 'Europe/Berlin', '%Y-%m-%d' %}",
'browser_steps-1-operation': 'Click element',
'browser_steps-1-selector': 'button[name=test-button]',
'browser_steps-1-optional_value': '',
# For now, cookies doesnt work in headers because it must be a full cookiejar object # For now, cookies doesnt work in headers because it must be a full cookiejar object
'headers': "testheader: yes\buser-agent: MyCustomAgent", 'headers': "testheader: yes\buser-agent: MyCustomAgent",
}, },
@@ -119,7 +127,7 @@ def test_basic_browserstep(client, live_server, measure_memory_usage):
assert b"unpaused" in res.data assert b"unpaused" in res.data
wait_for_all_checks(client) wait_for_all_checks(client)
uuid = extract_UUID_from_client(client) uuid = next(iter(live_server.app.config['DATASTORE'].data['watching']))
assert live_server.app.config['DATASTORE'].data['watching'][uuid].history_n >= 1, "Watch history had atleast 1 (everything fetched OK)" assert live_server.app.config['DATASTORE'].data['watching'][uuid].history_n >= 1, "Watch history had atleast 1 (everything fetched OK)"
assert b"This text should be removed" not in res.data assert b"This text should be removed" not in res.data
@@ -132,13 +140,32 @@ def test_basic_browserstep(client, live_server, measure_memory_usage):
assert b"This text should be removed" not in res.data assert b"This text should be removed" not in res.data
assert b"I smell JavaScript because the button was pressed" in res.data assert b"I smell JavaScript because the button was pressed" in res.data
assert b'Hello-Jinja2-20' in res.data
assert b"testheader: yes" in res.data assert b"testheader: yes" in res.data
assert b"user-agent: mycustomagent" in res.data assert b"user-agent: mycustomagent" in res.data
live_server.stop()
def test_non_200_errors_report_browsersteps(client, live_server):
live_server.stop()
live_server.start()
four_o_four_url = url_for('test_endpoint', status_code=404, _external=True) four_o_four_url = url_for('test_endpoint', status_code=404, _external=True)
four_o_four_url = four_o_four_url.replace('localhost.localdomain', 'cdio') four_o_four_url = four_o_four_url.replace('localhost.localdomain', 'cdio')
four_o_four_url = four_o_four_url.replace('localhost', 'cdio') four_o_four_url = four_o_four_url.replace('localhost', 'cdio')
res = client.post(
url_for("form_quick_watch_add"),
data={"url": four_o_four_url, "tags": '', 'edit_and_watch_submit_button': 'Edit > Watch'},
follow_redirects=True
)
assert b"Watch added in Paused state, saving will unpause" in res.data
assert os.getenv('PLAYWRIGHT_DRIVER_URL'), "Needs PLAYWRIGHT_DRIVER_URL set for this test"
uuid = next(iter(live_server.app.config['DATASTORE'].data['watching']))
# now test for 404 errors # now test for 404 errors
res = client.post( res = client.post(
url_for("edit_page", uuid=uuid, unpause_on_save=1), url_for("edit_page", uuid=uuid, unpause_on_save=1),
@@ -153,12 +180,14 @@ def test_basic_browserstep(client, live_server, measure_memory_usage):
         follow_redirects=True
     )
     assert b"unpaused" in res.data
     wait_for_all_checks(client)

-    res = client.get(url_for("index"))
+    res = get_index(client)
     assert b'Error - 404' in res.data

     client.get(
         url_for("form_delete", uuid="all"),
         follow_redirects=True
     )
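The reworked first browser step above types a Jinja2 expression into #test-input-text, and the later assert b'Hello-Jinja2-20' only holds if that optional value is rendered before being typed, so the date tag becomes literal text in the page. A rough illustration of that render-then-use order; the {% now %} tag used by the test comes from a Jinja2 datetime extension, so a plain variable is substituted here to keep the sketch dependency-free:

from datetime import datetime
from jinja2 import Environment

# Rough illustration only: the step value is treated as a Jinja2 template and
# rendered first; the rendered string is what gets typed into #test-input-text,
# and the button click copies it into #reflect-text where the watch then sees it.
env = Environment()
template = env.from_string("Hello-Jinja2-{{ today }}")
rendered = template.render(today=datetime.now().strftime('%Y-%m-%d'))

assert rendered.startswith('Hello-Jinja2-20')   # same prefix the test asserts on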


@@ -243,7 +243,6 @@ class update_worker(threading.Thread):
             os.unlink(full_path)

     def run(self):
-        now = time.time()

         while not self.app.config.exit.is_set():
             update_handler = None
@@ -254,6 +253,7 @@ class update_worker(threading.Thread):
                 pass

             else:
+                fetch_start_time = time.time()
                 uuid = queued_item_data.item.get('uuid')
                 self.current_uuid = uuid
                 if uuid in list(self.datastore.data['watching'].keys()) and self.datastore.data['watching'][uuid].get('url'):
@@ -268,7 +268,6 @@ class update_worker(threading.Thread):
                     watch = self.datastore.data['watching'].get(uuid)
                     logger.info(f"Processing watch UUID {uuid} Priority {queued_item_data.priority} URL {watch['url']}")
-                    now = time.time()

                     try:
                         # Processor is what we are using for detecting the "Change"
@@ -288,6 +287,10 @@ class update_worker(threading.Thread):
                         update_handler.call_browser()

+                        # In reality, the actual time of when the change was detected could be a few seconds after this
+                        # For example it should include when the page stopped rendering if using a playwright/chrome type fetch
+                        fetch_start_time = time.time()
+
                         changed_detected, update_obj, contents = update_handler.run_changedetection(watch=watch)

                         # Re #342
@@ -512,7 +515,7 @@ class update_worker(threading.Thread):
                     if not self.datastore.data['watching'].get(uuid):
                         continue

-                    #
                     # Different exceptions mean that we may or may not want to bump the snapshot, trigger notifications etc
                     if process_changedetection_results:
@@ -525,8 +528,6 @@ class update_worker(threading.Thread):
                         except Exception as e:
                             logger.warning(f"UUID: {uuid} Extract <title> as watch title was enabled, but couldn't find a <title>.")

-                        # Now update after running everything
-                        timestamp = round(time.time())
                         try:
                             self.datastore.update_watch(uuid=uuid, update_obj=update_obj)
@@ -542,24 +543,28 @@ class update_worker(threading.Thread):
                             # Small hack so that we sleep just enough to allow 1 second between history snapshots
                             # this is because history.txt indexes/keys snapshots by epoch seconds and we dont want dupe keys
-                            if watch.newest_history_key and int(timestamp) == int(watch.newest_history_key):
+                            # @also - the keys are one per second at the most (for now)
+                            if watch.newest_history_key and int(fetch_start_time) == int(watch.newest_history_key):
                                 logger.warning(
-                                    f"Timestamp {timestamp} already exists, waiting 1 seconds so we have a unique key in history.txt")
-                                timestamp = str(int(timestamp) + 1)
+                                    f"Timestamp {fetch_start_time} already exists, waiting 1 seconds so we have a unique key in history.txt")
+                                fetch_start_time += 1
                                 time.sleep(1)

                             watch.save_history_text(contents=contents,
-                                                    timestamp=timestamp,
+                                                    timestamp=int(fetch_start_time),
                                                     snapshot_id=update_obj.get('previous_md5', 'none'))

-                            if update_handler.fetcher.content:
-                                watch.save_last_fetched_html(contents=update_handler.fetcher.content, timestamp=timestamp)
+                            empty_pages_are_a_change = self.datastore.data['settings']['application'].get('empty_pages_are_a_change', False)
+                            if update_handler.fetcher.content or (not update_handler.fetcher.content and empty_pages_are_a_change):
+                                # attribute .last_changed is then based on this data
+                                watch.save_last_fetched_html(contents=update_handler.fetcher.content, timestamp=int(fetch_start_time))

                             # Notifications should only trigger on the second time (first time, we gather the initial snapshot)
                             if watch.history_n >= 2:
                                 logger.info(f"Change detected in UUID {uuid} - {watch['url']}")
                                 if not watch.get('notification_muted'):
+                                    # @todo only run this if notifications exist
                                     self.send_content_changed_notification(watch_uuid=uuid)

                         except Exception as e:
@@ -581,15 +586,15 @@ class update_worker(threading.Thread):
                     except Exception as e:
                         pass

-                    self.datastore.update_watch(uuid=uuid, update_obj={'fetch_time': round(time.time() - now, 3),
-                                                                       'last_checked': round(time.time()),
+                    self.datastore.update_watch(uuid=uuid, update_obj={'fetch_time': round(time.time() - fetch_start_time, 3),
+                                                                       'last_checked': int(fetch_start_time),
                                                                        'check_count': count
                                                                        })

                     self.current_uuid = None  # Done
                     self.q.task_done()

-                    logger.debug(f"Watch {uuid} done in {time.time()-now:.2f}s")
+                    logger.debug(f"Watch {uuid} done in {time.time()-fetch_start_time:.2f}s")

                     # Give the CPU time to interrupt
                     time.sleep(0.1)
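The timestamp rework above makes a single clock reading (fetch_start_time, refreshed just after the fetch completes) the one source for the history key, last_checked, fetch_time and the saved HTML, instead of mixing the older `now` and a separately rounded `timestamp`. Since history.txt is keyed by whole epoch seconds, a second check landing in the same second would collide with the newest key, hence the bump-and-sleep. A compact standalone sketch of that keying rule (not the worker's actual code):

import time

# Standalone sketch: history entries are keyed by whole epoch seconds, so if a new
# snapshot would reuse the newest existing key, bump it by one second (and wait) to
# keep the key unique while staying in order.
def next_history_key(newest_history_key, fetch_start_time):
    key = int(fetch_start_time)
    if newest_history_key is not None and key == int(newest_history_key):
        key += 1
        time.sleep(1)   # keep wall-clock time ahead of the key we just bumped
    return key

assert next_history_key(None, 1700000000.4) == 1700000000
assert next_history_key("1700000000", 1700000000.9) == 1700000001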


@@ -12,9 +12,6 @@ services:
 #      environment:
 #        Default listening port, can also be changed with the -p option
 #      - PORT=5000
-#      - PUID=1000
-#      - PGID=1000
 #
 #        Log levels are in descending order. (TRACE is the most detailed one)
 #        Log output levels: TRACE, DEBUG(default), INFO, SUCCESS, WARNING, ERROR, CRITICAL


@@ -35,7 +35,7 @@ dnspython==2.6.1 # related to eventlet fixes
 # jq not available on Windows so must be installed manually

 # Notification library
-apprise==1.9.0
+apprise==1.9.2

 # apprise mqtt https://github.com/dgtlmoon/changedetection.io/issues/315
 # use any version other than 2.0.x due to https://github.com/eclipse/paho.mqtt.python/issues/814
@@ -95,5 +95,8 @@ babel
 # Needed for > 3.10, https://github.com/microsoft/playwright-python/issues/2096
 greenlet >= 3.0.3

+# Pinned or it causes problems with flask_expects_json which seems unmaintained
+referencing==0.35.1
+
 # Scheduler - Windows seemed to miss a lot of default timezone info (even "UTC" !)
 tzdata