Compare commits

...

64 Commits

Author SHA1 Message Date
dgtlmoon
ac1b5ee6f7 Update wiki link for 'More info' about sharing a watch and its configuration 2022-05-17 22:43:56 +02:00
dgtlmoon
aca6c753bf skip JSON 2022-05-17 22:12:37 +02:00
dgtlmoon
efae86c134 small cleanups 2022-05-17 22:06:20 +02:00
dgtlmoon
0f72471343 Option to control whether pages with no renderable content count as a change (e.g. JS web apps that sometimes don't render any text) 2022-05-17 22:03:18 +02:00
dgtlmoon
16809b48f8 Playwright - raise EmptyReply on empty reply, no need to process further 2022-05-17 18:40:15 +02:00
dgtlmoon
67c833d2bc Re #214 - configurable extra wait (seconds) for webdriver requests before extracting text (#606) 2022-05-17 18:35:33 +02:00
weeix
31fea55ee4 Fix PLAYWRIGHT_DRIVER_URL default value (cf. #587) (#599) 2022-05-14 22:34:44 +02:00
dgtlmoon
b6c50d3b1a Update PIP readme.md 2022-05-10 22:46:59 +02:00
dgtlmoon
034507f14f Fixing Pip install problem - Update MANIFEST to include model/ subdir, improving imports (#593) 2022-05-10 22:45:08 +02:00
dgtlmoon
0e385b1c22 0.39.13 2022-05-10 17:24:38 +02:00
dgtlmoon
f28c260576 Distill.io JSON export file importer (#592) 2022-05-10 17:15:41 +02:00
dgtlmoon
18f0b63b7d Ability to specify a list of proxies to choose from, always using the first one by default - see wiki (#591) 2022-05-08 20:35:36 +02:00
Thilo-Alexander Ginkel
97045e7a7b Improving Playwright docs (#588) 2022-05-07 22:23:17 +02:00
dgtlmoon
9807cf0cda Playwright - code fix 2022-05-07 17:29:59 +02:00
dgtlmoon
d4b5237103 Playwright fetcher - more reliable by just waiting arbitrary seconds after the last network IO 2022-05-07 17:14:40 +02:00
dgtlmoon
dc6f76ba64 Make proxy configuration more consistent - see https://github.com/dgtlmoon/changedetection.io/wiki/Proxy-configuration (#585) 2022-05-07 16:37:56 +02:00
dgtlmoon
1f2f93184e Playwright fetcher - use the correct default User-Agent 2022-05-06 23:59:38 +02:00
dgtlmoon
0f08c8dda3 Toggle visibility of extra requests options/settings when not in use (#584) 2022-05-06 23:40:32 +02:00
dgtlmoon
68db20168e Add new fetch method: Playwright Chromium (Selenium/WebDriver alternative) (#489) 2022-05-02 21:40:40 +02:00
dgtlmoon
1d4474f5a3 Simplify scrub operation (simply cleans all) (#575) 2022-05-02 21:10:23 +02:00
dgtlmoon
613308881c Bugfix - don't update the record when it was deleted during a check 2022-05-01 21:41:29 +02:00
dgtlmoon
f69585b276 Improving support info in README.md 2022-04-29 20:26:02 +02:00
dgtlmoon
0179940df1 Handle deletions better (#570) 2022-04-29 19:12:33 +02:00
dgtlmoon
c0d0424e7e Data storage bug fix #569 2022-04-29 18:26:15 +02:00
dgtlmoon
014dc61222 Upgrade notifications library - fixing markup in email subject 2022-04-29 09:39:40 +02:00
dgtlmoon
06517bfd22 Ability to 'Share' a watch by a generated link, this will include all filters and triggers - see Wiki (#563) 2022-04-26 10:52:08 +02:00
dgtlmoon
b3a115dd4a Upgrade notifications library Re #555 - fixing telegram HTML markup in notification title 2022-04-25 23:12:32 +02:00
dgtlmoon
ffc4215411 Unify the MINIMUM_SECONDS_RECHECK_TIME env variable to 60 seconds 2022-04-24 20:37:30 +02:00
dgtlmoon
9e708810d1 Seconds/minutes/hours/days between checks form field upgrade from 'minutes' only (#512) 2022-04-24 16:56:32 +02:00
dgtlmoon
1e8aa6158b Form styling improvements 2022-04-24 14:40:53 +02:00
dgtlmoon
015353eccc Form field handling improvements - fixing field list handler for empty lines 2022-04-24 13:53:13 +02:00
dgtlmoon
501183e66b Fix "Add email" button in main global notification settings 2022-04-22 10:51:52 +02:00
dgtlmoon
def74f27e6 Test notification button fixed in main settings (#556) 2022-04-21 21:36:10 +02:00
dgtlmoon
37775a46c6 tgram:// - ensure the total notification size always stays under Telegram's 4096-character limit 2022-04-21 16:28:15 +02:00
dgtlmoon
e4eaa0c817 Shows which items are already in the queue, disables adding to the queue if already in the recheck queue (#552) 2022-04-21 12:52:45 +02:00
dgtlmoon
206ded4201 Notifications - Signal API support, Ntfy support, hotmail, matrix, Gotify API fixes 2022-04-20 23:13:55 +02:00
dgtlmoon
9e71f2aa35 Discord:// notification size limit - also includes the notification title 2022-04-20 17:00:21 +02:00
dgtlmoon
f9594aeffb Fix spelling errors 2022-04-20 09:51:53 +02:00
dgtlmoon
b4e1353376 Update README.md 2022-04-19 23:56:11 +02:00
dgtlmoon
5b670c38d3 Update README.md 2022-04-19 23:48:21 +02:00
dgtlmoon
2a9fb12451 Import speed improvements, and adding an import URL batch size of 5,000 to stop accidental CPU overload (#549) 2022-04-19 23:15:32 +02:00
dgtlmoon
6c3c5dc28a Ability to set the default fetch mode via the DEFAULT_FETCH_BACKEND variable 2022-04-19 23:15:00 +02:00
dgtlmoon
8f062bfec9 Refactor form handling (#548) 2022-04-19 21:43:07 +02:00
dgtlmoon
380c512cc2 Adding support for change detection of HTML source-code via "source:https://website.com" prefix (#540) 2022-04-12 17:36:29 +02:00
dgtlmoon
d7ed7c44ed Re-label the quick-add widget placeholder 'tag' to 'watch group' 2022-04-12 10:55:43 +02:00
dgtlmoon
34a87c0f41 HTTP Fetcher code improvements 2022-04-12 08:36:08 +02:00
dgtlmoon
4074fe53f1 Adding RSS metadata auto-discovery 2022-04-12 07:35:47 +02:00
Tristan Hill
44d599d0d1 Upgrade WTforms form handler to v3 (#523) 2022-04-09 19:50:56 +02:00
dgtlmoon
615fe9290a 0.39.12 2022-04-09 14:16:30 +02:00
dgtlmoon
2cc6955bc3 Miscellaneous settings form visual improvements (#535) 2022-04-09 12:15:34 +02:00
dgtlmoon
9809af142d Option to render links as [Some Text ](/link), adds the ability to change-detect on hyperlink changes 2022-04-09 10:35:14 +02:00
dgtlmoon
1890881977 Specify our Discord avatar_url as default avatar_url 2022-04-08 18:35:59 +02:00
dgtlmoon
9fc2fe85d5 Minor git updates 2022-04-08 17:21:42 +02:00
dgtlmoon
bb3c546838 Fix screenshot tab name 2022-04-08 17:08:06 +02:00
dgtlmoon
165f794595 Discord:// notifications should be cut to 2000 chars or Discord will not process them. (#531 + #323) 2022-04-08 16:32:04 +02:00
dgtlmoon
a440eece9e Make long reports in the notification error log easier to read 2022-04-08 14:12:42 +02:00
dgtlmoon
34c83f0e7c [Add email] button in notification settings with a prefix set from NOTIFICATION_MAIL_BUTTON_PREFIX env variable when defined. (#528) 2022-04-07 18:18:23 +02:00
dgtlmoon
f6e518497a Update README.md 2022-04-07 14:59:31 +02:00
dgtlmoon
63e91a3d66 Skip processing a watch into the RSS feed if there's not enough data to examine (fixes Internal Server Error when accessing the RSS feed) (#521) 2022-04-05 20:31:31 +02:00
dgtlmoon
3034d047c2 Introduce an AJAX button for sending test notifications instead of the checkbox (#519) 2022-04-05 18:04:26 +02:00
dgtlmoon
2620818ba7 Make text tab always available at default 2022-04-02 14:55:40 +02:00
dgtlmoon
9fe4f95990 When fetching a snapshot via Chrome, make the most recent screenshot available on the Diff and Preview pages (#516) 2022-04-02 14:49:32 +02:00
dgtlmoon
ffd2a89d60 Remove 'unviewed' status in watch table when Diff link clicked (#514) 2022-03-31 11:01:07 +02:00
dgtlmoon
8f40f19328 RSS feed CDATA should contain difference output 2022-03-30 10:51:10 +02:00
64 changed files with 2718 additions and 939 deletions

View File

@@ -20,6 +20,11 @@ COPY requirements.txt /requirements.txt
RUN pip install --target=/dependencies -r /requirements.txt
# Playwright is an alternative to Selenium
# Excluded this package from requirements.txt to prevent arm/v6 and arm/v7 builds from failing
RUN pip install --target=/dependencies playwright~=1.20 \
|| echo "WARN: Failed to install Playwright. The application can still run, but the Playwright option will be disabled."
# Final image stage
FROM python:3.8-slim
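The Dockerfile change above makes Playwright a best-effort install (the `|| echo` fallback keeps arm/v6 and arm/v7 builds from failing). How the application detects the missing package is not part of this diff; a minimal, hypothetical sketch of such a runtime probe:

```python
# Hypothetical helper (not from this diff): probe for the optional Playwright
# package so the Playwright fetcher option can be disabled when it is absent.
def playwright_is_available() -> bool:
    try:
        import playwright.sync_api  # only present when the optional pip install succeeded
    except ImportError:
        return False
    return True

if not playwright_is_available():
    print("WARN: Playwright not installed - the Playwright fetcher option will be disabled.")
```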

View File

@@ -1,5 +1,6 @@
recursive-include changedetectionio/templates *
recursive-include changedetectionio/static *
recursive-include changedetectionio/model *
include changedetection.py
global-exclude *.pyc
global-exclude node_modules

View File

@@ -16,6 +16,13 @@ Live your data-life *pro-actively* instead of *re-actively*, do not rely on mani
<img src="https://raw.githubusercontent.com/dgtlmoon/changedetection.io/master/screenshot.png" style="max-width:100%;" alt="Self-hosted web page change monitoring" title="Self-hosted web page change monitoring" />
**Get your own private instance now! Let us host it for you!**
[**Try our $6.99/month subscription - unlimited checks, watches and notifications!**](https://lemonade.changedetection.io/start), choose from different geographical locations, let us handle everything for you.
#### Example use cases
Know when ...
@@ -58,14 +65,3 @@ Then visit http://127.0.0.1:5000, you should now be able to access the UI.
See https://github.com/dgtlmoon/changedetection.io for more information.
### Support us
Do you use changedetection.io to make money? Does it save you time or money? Does it make your life easier or less stressful? Remember, we write this software when we should be doing actual paid work - we have to buy food and pay rent just like you.
Please support us, even small amounts help a LOT.
BTC `1PLFN327GyUarpJd7nVe7Reqg9qHx5frNn`
<img src="https://raw.githubusercontent.com/dgtlmoon/changedetection.io/master/btc-support.png" style="max-width:50%;" alt="Support us!" />

View File

@@ -9,7 +9,7 @@ _Know when web pages change! Stay on top of new information!_
Live your data-life *pro-actively* instead of *re-actively*.
Free, Open-source web page monitoring, notification and change detection. Don't have time? [Try our $6.99/month plan - unlimited checks and watches!](https://lemonade.changedetection.io/start)
Free, Open-source web page monitoring, notification and change detection. Don't have time? [**Try our $6.99/month subscription - unlimited checks and watches!**](https://lemonade.changedetection.io/start)
[<img src="https://raw.githubusercontent.com/dgtlmoon/changedetection.io/master/screenshot.png" style="max-width:100%;" alt="Self-hosted web page change monitoring" title="Self-hosted web page change monitoring" />](https://lemonade.changedetection.io/start)
@@ -17,10 +17,7 @@ Free, Open-source web page monitoring, notification and change detection. Don't
**Get your own private instance now! Let us host it for you!**
[![Deploy to Lemonade](https://lemonade.changedetection.io/static/images/lemonade.svg)](https://lemonade.changedetection.io/start)
[_Let us host your own private instance - We accept PayPal and Bitcoin, Support the further development of changedetection.io!_](https://lemonade.changedetection.io/start)
[**Try our $6.99/month subscription - unlimited checks and watches!**](https://lemonade.changedetection.io/start), _half the price of other website change monitoring services and comes with unlimited watches & checks!_
@@ -39,13 +36,14 @@ Free, Open-source web page monitoring, notification and change detection. Don't
- COVID related news from government websites
- University/organisation news from their website
- Detect and monitor changes in JSON API responses
- API monitoring and alerting
- JSON API monitoring and alerting
- Changes in legal and other documents
- Trigger API calls via notifications when text appears on a website
- Glue together APIs using the JSON filter and JSON notifications
- Create RSS feeds based on changes in web content
- Monitor HTML source code for unexpected changes, strengthen your PCI compliance
- You have a very sensitive list of URLs to watch and you do _not_ want to use the paid alternatives. (Remember, _you_ are the product)
_Need an actual Chrome runner with Javascript support? We support fetching via WebDriver!_
## Screenshots
@@ -70,6 +68,10 @@ Docker standalone
$ docker run -d --restart always -p "127.0.0.1:5000:5000" -v datastore-volume:/datastore --name changedetection.io dgtlmoon/changedetection.io
```
### Windows
See the install instructions at the wiki https://github.com/dgtlmoon/changedetection.io/wiki/Microsoft-Windows
### Python Pip
Check out our pypi page https://pypi.org/project/changedetection.io/
@@ -163,17 +165,17 @@ See the wiki https://github.com/dgtlmoon/changedetection.io/wiki/Proxy-configura
Raspberry Pi and linux/arm/v6 linux/arm/v7 arm64 devices are supported! See the wiki for [details](https://github.com/dgtlmoon/changedetection.io/wiki/Fetching-pages-with-WebDriver)
## Windows support?
YES! See the wiki https://github.com/dgtlmoon/changedetection.io/wiki/Microsoft-Windows
## Support us
Do you use changedetection.io to make money? Does it save you time or money? Does it make your life easier or less stressful? Remember, we write this software when we should be doing actual paid work - we have to buy food and pay rent just like you.
Please support us, even small amounts help a LOT.
BTC `1PLFN327GyUarpJd7nVe7Reqg9qHx5frNn`
Firstly, consider taking out a [change detection monthly subscription - unlimited checks and watches](https://lemonade.changedetection.io/start). Even if you don't use it, you still get the warm fuzzy feeling of helping out the project. (And who knows, you might just use it!)
Or directly donate an amount PayPal [![Donate](https://img.shields.io/badge/Donate-PayPal-green.svg)](https://www.paypal.com/donate/?hosted_button_id=7CP6HR9ZCNDYJ)
Or BTC `1PLFN327GyUarpJd7nVe7Reqg9qHx5frNn`
<img src="https://raw.githubusercontent.com/dgtlmoon/changedetection.io/master/btc-support.png" style="max-width:50%;" alt="Support us!" />

View File

@@ -32,6 +32,7 @@ from flask import (
render_template,
request,
send_from_directory,
session,
url_for,
)
from flask_login import login_required
@@ -39,7 +40,7 @@ from flask_wtf import CSRFProtect
from changedetectionio import html_tools
__version__ = '0.39.11'
__version__ = '0.39.13.1'
datastore = None
@@ -94,16 +95,6 @@ def init_app_secret(datastore_path):
return secret
# Remember python is by reference
# populate_form in wtfors didnt work for me. (try using a setattr() obj type on datastore.watch?)
def populate_form_from_watch(form, watch):
for i in form.__dict__.keys():
if i[0] != '_':
p = getattr(form, i)
if hasattr(p, 'data') and i in watch:
setattr(p, "data", watch[i])
# We use the whole watch object from the store/JSON so we can see if there's some related status in terms of a thread
# running or something similar.
@app.template_filter('format_last_checked_time')
@@ -272,7 +263,7 @@ def changedetection_app(config=None, datastore_o=None):
@app.route("/rss", methods=['GET'])
@login_required
def rss():
from . import diff
limit_tag = request.args.get('tag')
# Sort by last_changed and add the uuid which is usually the key..
@@ -301,6 +292,19 @@ def changedetection_app(config=None, datastore_o=None):
fg.link(href='https://changedetection.io')
for watch in sorted_watches:
dates = list(watch['history'].keys())
# Re #521 - Don't bother processing this one if there are fewer than 2 snapshots - it means we never had a change detected.
if len(dates) < 2:
continue
# Convert to int, sort and back to str again
# @todo replace datastore getter that does this automatically
dates = [int(i) for i in dates]
dates.sort(reverse=True)
dates = [str(i) for i in dates]
prev_fname = watch['history'][dates[1]]
if not watch['viewed']:
# Re #239 - GUID needs to be individual for each event
# @todo In the future make this a configurable link back (see work on BASE_URL https://github.com/dgtlmoon/changedetection.io/pull/228)
@@ -316,12 +320,16 @@ def changedetection_app(config=None, datastore_o=None):
diff_link = {'href': "{}{}".format(base_url, url_for('diff_history_page', uuid=watch['uuid']))}
# @todo use title if it exists
fe.link(link=diff_link)
fe.title(title=watch['url'])
# @todo in the future <description><![CDATA[<html><body>Any code html is valid.</body></html>]]></description>
fe.description(description=watch['url'])
# @todo watch should be a getter - watch.get('title') (internally if URL else..)
watch_title = watch.get('title') if watch.get('title') else watch.get('url')
fe.title(title=watch_title)
latest_fname = watch['history'][dates[0]]
html_diff = diff.render_diff(prev_fname, latest_fname, include_equal=False, line_feed_sep="</br>")
fe.description(description="<![CDATA[<html><body><h4>{}</h4>{}</body></html>".format(watch_title, html_diff))
fe.guid(guid, permalink=False)
dt = datetime.datetime.fromtimestamp(int(watch['newest_history_key']))
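The watch history above is a dict keyed by Unix-timestamp strings, so the keys are sorted numerically to pick the newest and previous snapshot files before rendering the diff into the RSS description CDATA. A standalone sketch of that selection (sample paths are illustrative):

```python
# Illustrative history dict: timestamp string -> snapshot file path
history = {
    "1648632000": "/datastore/some-uuid/1648632000.txt",
    "1650196800": "/datastore/some-uuid/1650196800.txt",
}

dates = sorted((int(k) for k in history), reverse=True)  # newest first
if len(dates) >= 2:                                      # need two snapshots before a diff makes sense
    latest_fname = history[str(dates[0])]
    prev_fname = history[str(dates[1])]
```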
@@ -335,6 +343,8 @@ def changedetection_app(config=None, datastore_o=None):
@app.route("/", methods=['GET'])
@login_required
def index():
from changedetectionio import forms
limit_tag = request.args.get('tag')
pause_uuid = request.args.get('pause')
@@ -371,7 +381,6 @@ def changedetection_app(config=None, datastore_o=None):
existing_tags = datastore.get_all_tags()
from changedetectionio import forms
form = forms.quickWatchForm(request.form)
output = render_template("watch-overview.html",
@@ -383,56 +392,63 @@ def changedetection_app(config=None, datastore_o=None):
has_unviewed=datastore.data['has_unviewed'],
# Don't link to hosting when we're on the hosting environment
hosted_sticky=os.getenv("SALTED_PASS", False) == False,
guid=datastore.data['app_guid'])
guid=datastore.data['app_guid'],
queued_uuids=update_q.queue)
if session.get('share-link'):
del(session['share-link'])
return output
# AJAX endpoint for sending a test
@app.route("/notification/send-test", methods=['POST'])
@login_required
def ajax_callback_send_notification_test():
import apprise
apobj = apprise.Apprise()
# validate URLS
if not len(request.form['notification_urls'].strip()):
return make_response({'error': 'No Notification URLs set'}, 400)
for server_url in request.form['notification_urls'].splitlines():
if len(server_url.strip()):
if not apobj.add(server_url):
message = '{} is not a valid AppRise URL.'.format(server_url)
return make_response({'error': message}, 400)
try:
n_object = {'watch_url': request.form['window_url'],
'notification_urls': request.form['notification_urls'].splitlines(),
'notification_title': request.form['notification_title'].strip(),
'notification_body': request.form['notification_body'].strip(),
'notification_format': request.form['notification_format'].strip()
}
notification_q.put(n_object)
except Exception as e:
return make_response({'error': str(e)}, 400)
return 'OK'
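For illustration, the new AJAX endpoint can be exercised with a plain POST using the field names from the handler above; the host/port is an assumption, and a real call also needs the login session and CSRF token since the app enables CSRFProtect.

```python
import requests

# Hedged sketch - host/port assumed, session cookie and CSRF token omitted for brevity.
resp = requests.post(
    "http://127.0.0.1:5000/notification/send-test",
    data={
        "window_url": "https://example.com",
        "notification_urls": "mailto://user:password@example.com",
        "notification_title": "Test title",
        "notification_body": "Test body",
        "notification_format": "Text",
    },
)
print(resp.status_code, resp.text)  # 'OK' on success, JSON {'error': ...} with HTTP 400 otherwise
```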
@app.route("/scrub", methods=['GET', 'POST'])
@login_required
def scrub_page():
import re
if request.method == 'POST':
confirmtext = request.form.get('confirmtext')
limit_date = request.form.get('limit_date')
limit_timestamp = 0
# Re #149 - allow empty/0 timestamp limit
if len(limit_date):
try:
limit_date = limit_date.replace('T', ' ')
# I noticed chrome will show '/' but actually submit '-'
limit_date = limit_date.replace('-', '/')
# In the case that :ss seconds are supplied
limit_date = re.sub(r'(\d\d:\d\d)(:\d\d)', '\\1', limit_date)
str_to_dt = datetime.datetime.strptime(limit_date, '%Y/%m/%d %H:%M')
limit_timestamp = int(str_to_dt.timestamp())
if limit_timestamp > time.time():
flash("Timestamp is in the future, cannot continue.", 'error')
return redirect(url_for('scrub_page'))
except ValueError:
flash('Incorrect date format, cannot continue.', 'error')
return redirect(url_for('scrub_page'))
if confirmtext == 'scrub':
changes_removed = 0
for uuid, watch in datastore.data['watching'].items():
if limit_timestamp:
changes_removed += datastore.scrub_watch(uuid, limit_timestamp=limit_timestamp)
else:
changes_removed += datastore.scrub_watch(uuid)
for uuid in datastore.data['watching'].keys():
datastore.scrub_watch(uuid)
flash("Cleared snapshot history ({} snapshots removed)".format(changes_removed))
flash("Cleared all snapshot history")
else:
flash('Incorrect confirmation text.', 'error')
return redirect(url_for('index'))
output = render_template("scrub.html")
output = render_template("scrub.html")
return output
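The scrub form takes a browser datetime-local value, which the handler normalises ('T' separator, '-' vs '/' date separators, optional ':ss') before parsing. The same normalisation as a standalone sketch:

```python
import datetime
import re

def parse_limit_date(limit_date: str) -> int:
    """Unix timestamp for a browser datetime-local string such as '2022-04-01T13:37' (seconds optional)."""
    limit_date = limit_date.replace('T', ' ')
    limit_date = limit_date.replace('-', '/')                      # Chrome may show '/' but submit '-'
    limit_date = re.sub(r'(\d\d:\d\d)(:\d\d)', r'\1', limit_date)  # drop ':ss' if supplied
    return int(datetime.datetime.strptime(limit_date, '%Y/%m/%d %H:%M').timestamp())

assert parse_limit_date('2022-04-01T13:37:00') == parse_limit_date('2022-04-01T13:37')
```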
@@ -472,49 +488,79 @@ def changedetection_app(config=None, datastore_o=None):
@app.route("/edit/<string:uuid>", methods=['GET', 'POST'])
@login_required
# https://stackoverflow.com/questions/42984453/wtforms-populate-form-with-data-if-data-exists
# https://wtforms.readthedocs.io/en/3.0.x/forms/#wtforms.form.Form.populate_obj ?
def edit_page(uuid):
from changedetectionio import forms
form = forms.watchForm(request.form)
using_default_check_time = True
# More for testing, possible to return the first/only
if not datastore.data['watching'].keys():
flash("No watches to edit", "error")
return redirect(url_for('index'))
if uuid == 'first':
uuid = list(datastore.data['watching'].keys()).pop()
if not uuid in datastore.data['watching']:
flash("No watch with the UUID %s found." % (uuid), "error")
return redirect(url_for('index'))
if request.method == 'GET':
if not uuid in datastore.data['watching']:
flash("No watch with the UUID %s found." % (uuid), "error")
return redirect(url_for('index'))
# be sure we update with a copy instead of accidently editing the live object by reference
default = deepcopy(datastore.data['watching'][uuid])
populate_form_from_watch(form, datastore.data['watching'][uuid])
# Show system wide default if nothing configured
if datastore.data['watching'][uuid]['fetch_backend'] is None:
default['fetch_backend'] = datastore.data['settings']['application']['fetch_backend']
if datastore.data['watching'][uuid]['fetch_backend'] is None:
form.fetch_backend.data = datastore.data['settings']['application']['fetch_backend']
# Show system wide default if nothing configured
if all(value == 0 or value == None for value in datastore.data['watching'][uuid]['time_between_check'].values()):
default['time_between_check'] = deepcopy(datastore.data['settings']['requests']['time_between_check'])
# Defaults for proxy choice
if datastore.proxy_list is not None: # When enabled
system_proxy = datastore.data['settings']['requests']['proxy']
if default['proxy'] is None:
default['proxy'] = system_proxy
else:
# Does the chosen one exist?
if not any(default['proxy'] in tup for tup in datastore.proxy_list):
default['proxy'] = datastore.proxy_list[0][0]
# Used by the form handler to keep or remove the proxy settings
default['proxy_list'] = datastore.proxy_list
# proxy_override set to the json/text list of the items
form = forms.watchForm(formdata=request.form if request.method == 'POST' else None,
data=default,
)
if datastore.proxy_list is None:
# @todo - Couldn't get setattr() etc dynamic addition working, so remove it instead
del form.proxy
else:
form.proxy.choices = datastore.proxy_list
if default['proxy'] is None:
form.proxy.default='http://hello'
if request.method == 'POST' and form.validate():
extra_update_obj = {}
# Re #110, if they submit the same as the default value, set it to None, so we continue to follow the default
if form.minutes_between_check.data == datastore.data['settings']['requests']['minutes_between_check']:
form.minutes_between_check.data = None
# Assume we use the default value, unless something relevant is different, then use the form value
# values could be None, 0 etc.
# Set to None unless the next for: says that something is different
extra_update_obj['time_between_check'] = dict.fromkeys(form.time_between_check.data)
for k, v in form.time_between_check.data.items():
if v and v != datastore.data['settings']['requests']['time_between_check'][k]:
extra_update_obj['time_between_check'] = form.time_between_check.data
using_default_check_time = False
break
# Use the default if its the same as system wide
if form.fetch_backend.data == datastore.data['settings']['application']['fetch_backend']:
form.fetch_backend.data = None
update_obj = {'url': form.url.data.strip(),
'minutes_between_check': form.minutes_between_check.data,
'tag': form.tag.data.strip(),
'title': form.title.data.strip(),
'headers': form.headers.data,
'body': form.body.data,
'method': form.method.data,
'ignore_status_codes': form.ignore_status_codes.data,
'fetch_backend': form.fetch_backend.data,
'trigger_text': form.trigger_text.data,
'notification_title': form.notification_title.data,
'notification_body': form.notification_body.data,
'notification_format': form.notification_format.data,
'extract_title_as_title': form.extract_title_as_title.data,
}
extra_update_obj['fetch_backend'] = None
# Notification URLs
datastore.data['watching'][uuid]['notification_urls'] = form.notification_urls.data
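The old `populate_form_from_watch()` helper is replaced by the standard WTForms pattern of seeding defaults through `data=` and binding `formdata=` only on POST. A self-contained sketch of that pattern (the form class and field names here are illustrative, not the real `watchForm`):

```python
from wtforms import Form, StringField

class DemoWatchForm(Form):
    url = StringField('URL')
    tag = StringField('Watch group')

# e.g. a deepcopy of the stored watch with system-wide defaults merged in
default = {'url': 'https://example.com', 'tag': 'news'}

# GET: formdata is None, so the form renders the defaults.
# POST: passing formdata=request.form lets the submitted values take precedence.
form = DemoWatchForm(formdata=None, data=default)
assert form.url.data == 'https://example.com'
```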
@@ -526,42 +572,25 @@ def changedetection_app(config=None, datastore_o=None):
# Reset the previous_md5 so we process a new snapshot including stripping ignore text.
if form_ignore_text:
if len(datastore.data['watching'][uuid]['history']):
update_obj['previous_md5'] = get_current_checksum_include_ignore_text(uuid=uuid)
datastore.data['watching'][uuid]['css_filter'] = form.css_filter.data.strip()
datastore.data['watching'][uuid]['subtractive_selectors'] = form.subtractive_selectors.data
extra_update_obj['previous_md5'] = get_current_checksum_include_ignore_text(uuid=uuid)
# Reset the previous_md5 so we process a new snapshot including stripping ignore text.
if form.css_filter.data.strip() != datastore.data['watching'][uuid]['css_filter']:
if len(datastore.data['watching'][uuid]['history']):
update_obj['previous_md5'] = get_current_checksum_include_ignore_text(uuid=uuid)
extra_update_obj['previous_md5'] = get_current_checksum_include_ignore_text(uuid=uuid)
datastore.data['watching'][uuid].update(update_obj)
datastore.data['watching'][uuid].update(form.data)
datastore.data['watching'][uuid].update(extra_update_obj)
flash("Updated watch.")
# Re #286 - We wait for syncing new data to disk in another thread every 60 seconds
# But in the case something is added we should save straight away
datastore.sync_to_json()
datastore.needs_write_urgent = True
# Queue the watch for immediate recheck
update_q.put(uuid)
if form.trigger_check.data:
if len(form.notification_urls.data):
n_object = {'watch_url': form.url.data.strip(),
'notification_urls': form.notification_urls.data,
'notification_title': form.notification_title.data,
'notification_body': form.notification_body.data,
'notification_format': form.notification_format.data,
'uuid': uuid
}
notification_q.put(n_object)
flash('Test notification queued.')
else:
flash('No notification URLs set, cannot send test.', 'error')
# Diff page [edit] link should go back to diff page
if request.args.get("next") and request.args.get("next") == 'diff' and not form.save_and_preview_button.data:
return redirect(url_for('diff_history_page', uuid=uuid))
@@ -576,18 +605,14 @@ def changedetection_app(config=None, datastore_o=None):
if request.method == 'POST' and not form.validate():
flash("An error occurred, please see below.", "error")
# Re #110 offer the default minutes
using_default_minutes = False
if form.minutes_between_check.data == None:
form.minutes_between_check.data = datastore.data['settings']['requests']['minutes_between_check']
using_default_minutes = True
output = render_template("edit.html",
uuid=uuid,
watch=datastore.data['watching'][uuid],
form=form,
using_default_minutes=using_default_minutes,
current_base_url = datastore.data['settings']['application']['base_url']
has_empty_checktime=using_default_check_time,
using_global_webdriver_wait=default['webdriver_delay'] is None,
current_base_url=datastore.data['settings']['application']['base_url'],
emailprefix=os.getenv('NOTIFICATION_MAIL_BUTTON_PREFIX', False)
)
return output
@@ -595,110 +620,100 @@ def changedetection_app(config=None, datastore_o=None):
@app.route("/settings", methods=['GET', "POST"])
@login_required
def settings_page():
from changedetectionio import content_fetcher, forms
form = forms.globalSettingsForm(request.form)
default = deepcopy(datastore.data['settings'])
if datastore.proxy_list is not None:
# When enabled
system_proxy = datastore.data['settings']['requests']['proxy']
# In the case it doesnt exist anymore
if not any([system_proxy in tup for tup in datastore.proxy_list]):
system_proxy = None
if request.method == 'GET':
form.minutes_between_check.data = int(datastore.data['settings']['requests']['minutes_between_check'])
form.notification_urls.data = datastore.data['settings']['application']['notification_urls']
form.global_subtractive_selectors.data = datastore.data['settings']['application']['global_subtractive_selectors']
form.global_ignore_text.data = datastore.data['settings']['application']['global_ignore_text']
form.ignore_whitespace.data = datastore.data['settings']['application']['ignore_whitespace']
form.extract_title_as_title.data = datastore.data['settings']['application']['extract_title_as_title']
form.fetch_backend.data = datastore.data['settings']['application']['fetch_backend']
form.notification_title.data = datastore.data['settings']['application']['notification_title']
form.notification_body.data = datastore.data['settings']['application']['notification_body']
form.notification_format.data = datastore.data['settings']['application']['notification_format']
form.base_url.data = datastore.data['settings']['application']['base_url']
default['requests']['proxy'] = system_proxy if system_proxy is not None else datastore.proxy_list[0][0]
# Used by the form handler to keep or remove the proxy settings
default['proxy_list'] = datastore.proxy_list
if request.method == 'POST' and form.data.get('removepassword_button') == True:
# Don't use form.data on POST so that it doesn't override the checkbox status from the POST status
form = forms.globalSettingsForm(formdata=request.form if request.method == 'POST' else None,
data=default
)
if datastore.proxy_list is None:
# @todo - Couldn't get setattr() etc dynamic addition working, so remove it instead
del form.requests.form.proxy
else:
form.requests.form.proxy.choices = datastore.proxy_list
if request.method == 'POST':
# Password unset is a GET, but we can lock the session to a salted env password to always need the password
if not os.getenv("SALTED_PASS", False):
datastore.data['settings']['application']['password'] = False
flash("Password protection removed.", 'notice')
flask_login.logout_user()
return redirect(url_for('settings_page'))
if form.application.form.data.get('removepassword_button', False):
# SALTED_PASS means the password is "locked" to what we set in the Env var
if not os.getenv("SALTED_PASS", False):
datastore.remove_password()
flash("Password protection removed.", 'notice')
flask_login.logout_user()
return redirect(url_for('settings_page'))
if request.method == 'POST' and form.validate():
datastore.data['settings']['application']['notification_urls'] = form.notification_urls.data
datastore.data['settings']['requests']['minutes_between_check'] = form.minutes_between_check.data
datastore.data['settings']['application']['extract_title_as_title'] = form.extract_title_as_title.data
datastore.data['settings']['application']['fetch_backend'] = form.fetch_backend.data
datastore.data['settings']['application']['notification_title'] = form.notification_title.data
datastore.data['settings']['application']['notification_body'] = form.notification_body.data
datastore.data['settings']['application']['notification_format'] = form.notification_format.data
datastore.data['settings']['application']['notification_urls'] = form.notification_urls.data
datastore.data['settings']['application']['base_url'] = form.base_url.data
datastore.data['settings']['application']['global_subtractive_selectors'] = form.global_subtractive_selectors.data
datastore.data['settings']['application']['global_ignore_text'] = form.global_ignore_text.data
datastore.data['settings']['application']['ignore_whitespace'] = form.ignore_whitespace.data
if form.validate():
datastore.data['settings']['application'].update(form.data['application'])
datastore.data['settings']['requests'].update(form.data['requests'])
if form.trigger_check.data:
if len(form.notification_urls.data):
n_object = {'watch_url': "Test from changedetection.io!",
'notification_urls': form.notification_urls.data,
'notification_title': form.notification_title.data,
'notification_body': form.notification_body.data,
'notification_format': form.notification_format.data,
}
notification_q.put(n_object)
flash('Test notification queued.')
else:
flash('No notification URLs set, cannot send test.', 'error')
if not os.getenv("SALTED_PASS", False) and len(form.application.form.password.encrypted_password):
datastore.data['settings']['application']['password'] = form.application.form.password.encrypted_password
datastore.needs_write_urgent = True
flash("Password protection enabled.", 'notice')
flask_login.logout_user()
return redirect(url_for('index'))
if not os.getenv("SALTED_PASS", False) and form.password.encrypted_password:
datastore.data['settings']['application']['password'] = form.password.encrypted_password
flash("Password protection enabled.", 'notice')
flask_login.logout_user()
return redirect(url_for('index'))
datastore.needs_write_urgent = True
flash("Settings updated.")
datastore.needs_write = True
flash("Settings updated.")
if request.method == 'POST' and not form.validate():
flash("An error occurred, please see below.", "error")
else:
flash("An error occurred, please see below.", "error")
output = render_template("settings.html",
form=form,
current_base_url = datastore.data['settings']['application']['base_url'],
hide_remove_pass=os.getenv("SALTED_PASS", False))
hide_remove_pass=os.getenv("SALTED_PASS", False),
emailprefix=os.getenv('NOTIFICATION_MAIL_BUTTON_PREFIX', False))
return output
@app.route("/import", methods=['GET', "POST"])
@login_required
def import_page():
import validators
remaining_urls = []
good = 0
if request.method == 'POST':
urls = request.values.get('urls').split("\n")
for url in urls:
url = url.strip()
url, *tags = url.split(" ")
# Flask wtform validators wont work with basic auth, use validators package
if len(url) and validators.url(url):
new_uuid = datastore.add_watch(url=url.strip(), tag=" ".join(tags))
# Straight into the queue.
update_q.put(new_uuid)
good += 1
from .importer import import_url_list, import_distill_io_json
# URL List import
if request.values.get('urls') and len(request.values.get('urls').strip()):
# Import and push into the queue for immediate update check
importer = import_url_list()
importer.run(data=request.values.get('urls'), flash=flash, datastore=datastore)
for uuid in importer.new_uuids:
update_q.put(uuid)
if len(importer.remaining_data) == 0:
return redirect(url_for('index'))
else:
if len(url):
remaining_urls.append(url)
remaining_urls = importer.remaining_data
# Distill.io import
if request.values.get('distill-io') and len(request.values.get('distill-io').strip()):
# Import and push into the queue for immediate update check
d_importer = import_distill_io_json()
d_importer.run(data=request.values.get('distill-io'), flash=flash, datastore=datastore)
for uuid in d_importer.new_uuids:
update_q.put(uuid)
flash("{} Imported, {} Skipped.".format(good, len(remaining_urls)))
if len(remaining_urls) == 0:
# Looking good, redirect to index.
return redirect(url_for('index'))
# Could be some remaining, or we could be on GET
output = render_template("import.html",
remaining="\n".join(remaining_urls)
import_url_list_remaining="\n".join(remaining_urls),
original_distill_json=''
)
return output
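`import_url_list` and `import_distill_io_json` live in the new `changedetectionio/importer` module, which is not included in this diff. A rough, hypothetical sketch of the interface implied by the calls above (real signatures and validation may differ):

```python
# Hypothetical sketch only - the real importer module is not shown in this diff.
class Importer:
    def __init__(self):
        self.new_uuids = []        # UUIDs of watches created during the import
        self.remaining_data = []   # input lines/records that could not be imported

    def run(self, data=None, flash=None, datastore=None):
        raise NotImplementedError

class import_url_list(Importer):
    def run(self, data=None, flash=None, datastore=None):
        for line in (data or "").splitlines():
            url, *tags = line.strip().split(" ")
            if url:
                new_uuid = datastore.add_watch(url=url, tag=" ".join(tags))
                self.new_uuids.append(new_uuid)
```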
@@ -763,6 +778,9 @@ def changedetection_app(config=None, datastore_o=None):
except Exception as e:
previous_version_file_contents = "Unable to read {}.\n".format(previous_file)
screenshot_url = datastore.get_screenshot(uuid)
output = render_template("diff.html", watch_a=watch,
newest=newest_version_file_contents,
previous=previous_version_file_contents,
@@ -773,7 +791,8 @@ def changedetection_app(config=None, datastore_o=None):
current_previous_version=str(previous_version),
current_diff_url=watch['url'],
extra_title=" - Diff - {}".format(watch['title'] if watch['title'] else watch['url']),
left_sticky=True)
left_sticky=True,
screenshot=screenshot_url)
return output
@@ -833,15 +852,17 @@ def changedetection_app(config=None, datastore_o=None):
else:
content.append({'line': "No history found", 'classes': ''})
screenshot_url = datastore.get_screenshot(uuid)
output = render_template("preview.html",
content=content,
extra_stylesheets=extra_stylesheets,
ignored_line_numbers=ignored_line_numbers,
triggered_line_numbers=trigger_line_numbers,
current_diff_url=watch['url'],
screenshot=screenshot_url,
watch=watch,
uuid=uuid)
return output
@app.route("/settings/notification-logs", methods=['GET'])
@@ -954,6 +975,28 @@ def changedetection_app(config=None, datastore_o=None):
@app.route("/static/<string:group>/<string:filename>", methods=['GET'])
def static_content(group, filename):
if group == 'screenshot':
from flask import make_response
# Could be sensitive, follow password requirements
if datastore.data['settings']['application']['password'] and not flask_login.current_user.is_authenticated:
abort(403)
# These files should be in our subdirectory
try:
# set nocache, set content-type
watch_dir = datastore_o.datastore_path + "/" + filename
response = make_response(send_from_directory(filename="last-screenshot.png", directory=watch_dir, path=watch_dir + "/last-screenshot.png"))
response.headers['Content-type'] = 'image/png'
response.headers['Cache-Control'] = 'no-cache, no-store, must-revalidate'
response.headers['Pragma'] = 'no-cache'
response.headers['Expires'] = 0
return response
except FileNotFoundError:
abort(404)
# These files should be in our subdirectory
try:
return send_from_directory("static/{}".format(group), path=filename)
@@ -966,28 +1009,38 @@ def changedetection_app(config=None, datastore_o=None):
from changedetectionio import forms
form = forms.quickWatchForm(request.form)
if form.validate():
url = request.form.get('url').strip()
if datastore.url_exists(url):
flash('The URL {} already exists'.format(url), "error")
return redirect(url_for('index'))
# @todo add_watch should throw a custom Exception for validation etc
new_uuid = datastore.add_watch(url=url, tag=request.form.get('tag').strip())
# Straight into the queue.
update_q.put(new_uuid)
flash("Watch added.")
return redirect(url_for('index'))
else:
if not form.validate():
flash("Error")
return redirect(url_for('index'))
url = request.form.get('url').strip()
if datastore.url_exists(url):
flash('The URL {} already exists'.format(url), "error")
return redirect(url_for('index'))
# @todo add_watch should throw a custom Exception for validation etc
new_uuid = datastore.add_watch(url=url, tag=request.form.get('tag').strip())
if new_uuid:
# Straight into the queue.
update_q.put(new_uuid)
flash("Watch added.")
return redirect(url_for('index'))
@app.route("/api/delete", methods=['GET'])
@login_required
def api_delete():
uuid = request.args.get('uuid')
if uuid != 'all' and not uuid in datastore.data['watching'].keys():
flash('The watch by UUID {} does not exist.'.format(uuid), 'error')
return redirect(url_for('index'))
# More for testing, possible to return the first/only
if uuid == 'first':
uuid = list(datastore.data['watching'].keys()).pop()
datastore.delete(uuid)
flash('Deleted.')
@@ -1044,6 +1097,59 @@ def changedetection_app(config=None, datastore_o=None):
flash("{} watches are queued for rechecking.".format(i))
return redirect(url_for('index', tag=tag))
@app.route("/api/share-url", methods=['GET'])
@login_required
def api_share_put_watch():
"""Given a watch UUID, upload the info and return a share-link
the share-link can be imported/added"""
import requests
import json
tag = request.args.get('tag')
uuid = request.args.get('uuid')
# more for testing
if uuid == 'first':
uuid = list(datastore.data['watching'].keys()).pop()
# copy it to memory and trim off what we don't need (history)
watch = deepcopy(datastore.data['watching'][uuid])
if (watch.get('history')):
del (watch['history'])
# for safety/privacy
for k in list(watch.keys()):
if k.startswith('notification_'):
del watch[k]
for r in['uuid', 'last_checked', 'last_changed']:
if watch.get(r):
del (watch[r])
# Add the global stuff which may have an impact
watch['ignore_text'] += datastore.data['settings']['application']['global_ignore_text']
watch['subtractive_selectors'] += datastore.data['settings']['application']['global_subtractive_selectors']
watch_json = json.dumps(watch)
try:
r = requests.request(method="POST",
data={'watch': watch_json},
url="https://changedetection.io/share/share",
headers={'App-Guid': datastore.data['app_guid']})
res = r.json()
session['share-link'] = "https://changedetection.io/share/{}".format(res['share_key'])
except Exception as e:
flash("Could not share, something went wrong while communicating with the share server.", 'error')
# https://changedetection.io/share/VrMv05wpXyQa
# in the browser - should give you a nice info page - wtf
# paste in etc
return redirect(url_for('index'))
# @todo handle ctrl break
ticker_thread = threading.Thread(target=ticker_thread_check_time_launch_checks).start()
@@ -1113,8 +1219,6 @@ def notification_runner():
notification_debug_log = notification_debug_log[-100:]
# Thread runner to check every minute, look for new watches to feed into the Queue.
def ticker_thread_check_time_launch_checks():
from changedetectionio import update_worker
@@ -1151,7 +1255,9 @@ def ticker_thread_check_time_launch_checks():
# Check for watches outside of the time threshold to put in the thread queue.
now = time.time()
max_system_wide = int(copied_datastore.data['settings']['requests']['minutes_between_check']) * 60
recheck_time_minimum_seconds = int(os.getenv('MINIMUM_SECONDS_RECHECK_TIME', 60))
recheck_time_system_seconds = datastore.threshold_seconds
for uuid, watch in copied_datastore.data['watching'].items():
@@ -1160,18 +1266,15 @@ def ticker_thread_check_time_launch_checks():
continue
# If they supplied an individual entry minutes to threshold.
watch_minutes_between_check = watch.get('minutes_between_check', None)
if watch_minutes_between_check is not None:
# Cast to int just incase
max_time = int(watch_minutes_between_check) * 60
threshold = now
watch_threshold_seconds = watch.threshold_seconds()
if watch_threshold_seconds:
threshold -= watch_threshold_seconds
else:
# Default system wide.
max_time = max_system_wide
threshold = now - max_time
threshold -= recheck_time_system_seconds
# Yeah, put it in the queue, it's more than time
if watch['last_checked'] <= threshold:
if watch['last_checked'] <= max(threshold, recheck_time_minimum_seconds):
if not uuid in running_uuids and uuid not in update_q.queue:
update_q.put(uuid)
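In essence, the loop computes a threshold of "now" minus the effective interval (the per-watch `threshold_seconds()` when set, otherwise the system-wide `datastore.threshold_seconds`) and queues any watch whose `last_checked` is at or before it; `MINIMUM_SECONDS_RECHECK_TIME` (default 60) acts as the floor on how often that can happen. A simplified sketch with illustrative numbers:

```python
import time

recheck_time_system_seconds = 3 * 60 * 60    # illustrative system-wide interval (3 hours)
watch_threshold_seconds = 0                  # illustrative: 0/None means "use the system default"
last_checked = time.time() - 4 * 60 * 60     # illustrative: this watch was last checked 4 hours ago

threshold = time.time() - (watch_threshold_seconds or recheck_time_system_seconds)
should_recheck = last_checked <= threshold   # True here: 4 hours ago is older than "now - 3 hours"
```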
@@ -1179,4 +1282,4 @@ def ticker_thread_check_time_launch_checks():
time.sleep(3)
# Should be low so we can break this out in testing
app.config.exit.wait(1)
app.config.exit.wait(1)

View File

@@ -8,7 +8,7 @@ import sys
import eventlet
import eventlet.wsgi
from . import store, changedetection_app
from . import store, changedetection_app, content_fetcher
from . import __version__
def main():

View File

@@ -1,14 +1,9 @@
from abc import ABC, abstractmethod
import chardet
import os
from selenium import webdriver
from selenium.webdriver.common.desired_capabilities import DesiredCapabilities
from selenium.webdriver.common.proxy import Proxy as SeleniumProxy
from selenium.common.exceptions import WebDriverException
import requests
import time
import urllib3.exceptions
import sys
class EmptyReply(Exception):
def __init__(self, status_code, url):
@@ -16,16 +11,30 @@ class EmptyReply(Exception):
self.status_code = status_code
self.url = url
return
pass
class ReplyWithContentButNoText(Exception):
def __init__(self, status_code, url):
# Set this so we can use it in other parts of the app
self.status_code = status_code
self.url = url
return
pass
class Fetcher():
error = None
status_code = None
content = None
headers = None
# Will be needed in the future by the VisualSelector, always get this where possible.
screenshot = False
fetcher_description = "No description"
system_http_proxy = os.getenv('HTTP_PROXY')
system_https_proxy = os.getenv('HTTPS_PROXY')
fetcher_description ="No description"
# Time ONTOP of the system defined env minimum time
render_extract_delay=0
@abstractmethod
def get_error(self):
@@ -42,6 +51,10 @@ class Fetcher():
# Should set self.error, self.status_code and self.content
pass
@abstractmethod
def quit(self):
return
@abstractmethod
def get_last_status_code(self):
return self.status_code
@@ -51,29 +64,121 @@ class Fetcher():
def is_ready(self):
return True
# Maybe for the future, each fetcher provides its own diff output, could be used for text, image
# the current one would return javascript output (as we use JS to generate the diff)
#
# Returns tuple(mime_type, stream)
# @abstractmethod
# def return_diff(self, stream_a, stream_b):
# return
def available_fetchers():
import inspect
from changedetectionio import content_fetcher
p=[]
for name, obj in inspect.getmembers(content_fetcher):
if inspect.isclass(obj):
# @todo html_ is maybe better as fetcher_ or something
# In this case, make sure to edit the default one in store.py and fetch_site_status.py
if "html_" in name:
t=tuple([name,obj.fetcher_description])
p.append(t)
# See the if statement at the bottom of this file for how we switch between playwright and webdriver
import inspect
p = []
for name, obj in inspect.getmembers(sys.modules[__name__], inspect.isclass):
if inspect.isclass(obj):
# @todo html_ is maybe better as fetcher_ or something
# In this case, make sure to edit the default one in store.py and fetch_site_status.py
if name.startswith('html_'):
t = tuple([name, obj.fetcher_description])
p.append(t)
return p
return p
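`available_fetchers()` now introspects the current module instead of re-importing it, returning `(class_name, description)` tuples for every class whose name starts with `html_` - the shape a WTForms select field expects for its `choices`. For illustration (the exact descriptions depend on the environment variables discussed below):

```python
from changedetectionio import content_fetcher

# Prints something like:
#   html_requests -> Basic fast Plaintext/HTTP Client
#   html_webdriver -> Playwright Chromium/Javascript via 'ws://...'  (or the WebDriver variant)
for name, description in content_fetcher.available_fetchers():
    print(name, '->', description)
```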
class html_webdriver(Fetcher):
class base_html_playwright(Fetcher):
fetcher_description = "Playwright {}/Javascript".format(
os.getenv("PLAYWRIGHT_BROWSER_TYPE", 'chromium').capitalize()
)
if os.getenv("PLAYWRIGHT_DRIVER_URL"):
fetcher_description += " via '{}'".format(os.getenv("PLAYWRIGHT_DRIVER_URL"))
browser_type = ''
command_executor = ''
# Configs for Proxy setup
# In the ENV vars, is prefixed with "playwright_proxy_", so it is for example "playwright_proxy_server"
playwright_proxy_settings_mappings = ['bypass', 'server', 'username', 'password']
proxy = None
def __init__(self, proxy_override=None):
# .strip('"') is going to save someone a lot of time when they accidently wrap the env value
self.browser_type = os.getenv("PLAYWRIGHT_BROWSER_TYPE", 'chromium').strip('"')
self.command_executor = os.getenv(
"PLAYWRIGHT_DRIVER_URL",
'ws://playwright-chrome:3000'
).strip('"')
# If any proxy settings are enabled, then we should setup the proxy object
proxy_args = {}
for k in self.playwright_proxy_settings_mappings:
v = os.getenv('playwright_proxy_' + k, False)
if v:
proxy_args[k] = v.strip('"')
if proxy_args:
self.proxy = proxy_args
# allow per-watch proxy selection override
if proxy_override:
self.proxy = {'server': proxy_override}
def run(self,
url,
timeout,
request_headers,
request_body,
request_method,
ignore_status_codes=False):
from playwright.sync_api import sync_playwright
import playwright._impl._api_types
from playwright._impl._api_types import Error, TimeoutError
with sync_playwright() as p:
browser_type = getattr(p, self.browser_type)
# Seemed to cause a connection Exception even tho I can see it connect
# self.browser = browser_type.connect(self.command_executor, timeout=timeout*1000)
browser = browser_type.connect_over_cdp(self.command_executor, timeout=timeout * 1000)
# Set user agent to prevent Cloudflare from blocking the browser
# Use the default one configured in the App.py model that's passed from fetch_site_status.py
context = browser.new_context(
user_agent=request_headers['User-Agent'] if request_headers.get('User-Agent') else 'Mozilla/5.0',
proxy=self.proxy
)
page = context.new_page()
page.set_viewport_size({"width": 1280, "height": 1024})
try:
response = page.goto(url, timeout=timeout * 1000, wait_until='commit')
# Wait_until = commit
# - `'commit'` - consider operation to be finished when network response is received and the document started loading.
# Better to not use any smarts from Playwright and just wait an arbitrary number of seconds
# This seemed to solve nearly all 'TimeoutErrors'
extra_wait = int(os.getenv("WEBDRIVER_DELAY_BEFORE_CONTENT_READY", 5)) + self.render_extract_delay
page.wait_for_timeout(extra_wait * 1000)
except playwright._impl._api_types.TimeoutError as e:
raise EmptyReply(url=url, status_code=None)
if response is None:
raise EmptyReply(url=url, status_code=None)
if len(page.content().strip()) == 0:
raise EmptyReply(url=url, status_code=None)
self.status_code = response.status
self.content = page.content()
self.headers = response.all_headers()
# Some bug where it gives the wrong screenshot size, but making a request with the clip set first seems to solve it
# JPEG is better here because the screenshots can be very very large
page.screenshot(type='jpeg', clip={'x': 1.0, 'y': 1.0, 'width': 1280, 'height': 1024})
self.screenshot = page.screenshot(type='jpeg', full_page=True, quality=90)
context.close()
browser.close()
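The Playwright fetcher is configured entirely through environment variables: `PLAYWRIGHT_DRIVER_URL`, `PLAYWRIGHT_BROWSER_TYPE`, `WEBDRIVER_DELAY_BEFORE_CONTENT_READY` and the lowercase `playwright_proxy_*` settings read in `__init__` above. A hedged usage sketch, assuming a Playwright/browserless container reachable at the default endpoint:

```python
import os

# Assumed environment - point these at your own Playwright endpoint before importing,
# since the module reads them at import/__init__ time.
os.environ["PLAYWRIGHT_DRIVER_URL"] = "ws://playwright-chrome:3000"
os.environ["playwright_proxy_server"] = "http://proxy.example.com:3128"  # optional

from changedetectionio import content_fetcher

fetcher = content_fetcher.base_html_playwright()
fetcher.render_extract_delay = 5  # extra seconds on top of WEBDRIVER_DELAY_BEFORE_CONTENT_READY
fetcher.run(url="https://example.com", timeout=30, request_headers={},
            request_body=None, request_method="GET", ignore_status_codes=False)
print(fetcher.status_code, len(fetcher.content), fetcher.screenshot is not False)
```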
class base_html_webdriver(Fetcher):
if os.getenv("WEBDRIVER_URL"):
fetcher_description = "WebDriver Chrome/Javascript via '{}'".format(os.getenv("WEBDRIVER_URL"))
else:
@@ -86,12 +191,11 @@ class html_webdriver(Fetcher):
selenium_proxy_settings_mappings = ['proxyType', 'ftpProxy', 'httpProxy', 'noProxy',
'proxyAutoconfigUrl', 'sslProxy', 'autodetect',
'socksProxy', 'socksVersion', 'socksUsername', 'socksPassword']
proxy = None
def __init__(self, proxy_override=None):
from selenium.webdriver.common.proxy import Proxy as SeleniumProxy
proxy=None
def __init__(self):
# .strip('"') is going to save someone a lot of time when they accidently wrap the env value
self.command_executor = os.getenv("WEBDRIVER_URL", 'http://browser-chrome:4444/wd/hub').strip('"')
@@ -102,6 +206,16 @@ class html_webdriver(Fetcher):
if v:
proxy_args[k] = v.strip('"')
# Map back standard HTTP_ and HTTPS_PROXY to webDriver httpProxy/sslProxy
if not proxy_args.get('webdriver_httpProxy') and self.system_http_proxy:
proxy_args['httpProxy'] = self.system_http_proxy
if not proxy_args.get('webdriver_sslProxy') and self.system_https_proxy:
proxy_args['httpsProxy'] = self.system_https_proxy
# Allows override the proxy on a per-request basis
if proxy_override is not None:
proxy_args['httpProxy'] = proxy_override
if proxy_args:
self.proxy = SeleniumProxy(raw=proxy_args)
@@ -113,19 +227,22 @@ class html_webdriver(Fetcher):
request_method,
ignore_status_codes=False):
from selenium import webdriver
from selenium.webdriver.common.desired_capabilities import DesiredCapabilities
from selenium.common.exceptions import WebDriverException
# request_body, request_method unused for now, until some magic in the future happens.
# check env for WEBDRIVER_URL
driver = webdriver.Remote(
self.driver = webdriver.Remote(
command_executor=self.command_executor,
desired_capabilities=DesiredCapabilities.CHROME,
proxy=self.proxy)
try:
driver.get(url)
self.driver.get(url)
except WebDriverException as e:
# Be sure we close the session window
driver.quit()
self.quit()
raise
# @todo - how to check this? is it possible?
@@ -134,31 +251,41 @@ class html_webdriver(Fetcher):
# raise EmptyReply(url=url, status_code=r.status_code)
# @todo - dom wait loaded?
time.sleep(int(os.getenv("WEBDRIVER_DELAY_BEFORE_CONTENT_READY", 5)))
self.content = driver.page_source
time.sleep(int(os.getenv("WEBDRIVER_DELAY_BEFORE_CONTENT_READY", 5)) + self.render_extract_delay)
self.content = self.driver.page_source
self.headers = {}
self.screenshot = self.driver.get_screenshot_as_png()
self.quit()
driver.quit()
# Does the connection to the webdriver work? run a test connection.
def is_ready(self):
from selenium import webdriver
from selenium.webdriver.common.desired_capabilities import DesiredCapabilities
from selenium.common.exceptions import WebDriverException
driver = webdriver.Remote(
self.driver = webdriver.Remote(
command_executor=self.command_executor,
desired_capabilities=DesiredCapabilities.CHROME)
# driver.quit() seems to cause better exceptions
driver.quit()
self.quit()
return True
def quit(self):
if self.driver:
try:
self.driver.quit()
except Exception as e:
print("Exception in chrome shutdown/quit" + str(e))
# "html_requests" is listed as the default fetcher in store.py!
class html_requests(Fetcher):
fetcher_description = "Basic fast Plaintext/HTTP Client"
def __init__(self, proxy_override=None):
self.proxy_override = proxy_override
def run(self,
url,
timeout,
@@ -167,12 +294,24 @@ class html_requests(Fetcher):
request_method,
ignore_status_codes=False):
proxies={}
# Allows override the proxy on a per-request basis
if self.proxy_override:
proxies = {'http': self.proxy_override, 'https': self.proxy_override, 'ftp': self.proxy_override}
else:
if self.system_http_proxy:
proxies['http'] = self.system_http_proxy
if self.system_https_proxy:
proxies['https'] = self.system_https_proxy
r = requests.request(method=request_method,
data=request_body,
url=url,
headers=request_headers,
timeout=timeout,
verify=False)
data=request_body,
url=url,
headers=request_headers,
timeout=timeout,
proxies=proxies,
verify=False)
# If the response did not tell us what encoding format to expect, Then use chardet to override what `requests` thinks.
# For example - some sites don't tell us it's utf-8, but return utf-8 content
@@ -192,3 +331,11 @@ class html_requests(Fetcher):
self.content = r.text
self.headers = r.headers
# Decide which is the 'real' HTML webdriver, this is more a system wide config
# rather than site-specific.
use_playwright_as_chrome_fetcher = os.getenv('PLAYWRIGHT_DRIVER_URL', False)
if use_playwright_as_chrome_fetcher:
html_webdriver = base_html_playwright
else:
html_webdriver = base_html_webdriver
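Because of this alias, the rest of the application keeps referring to `html_webdriver` by name; which class that resolves to is decided once, at import time, by `PLAYWRIGHT_DRIVER_URL`. A small illustration (the endpoint value is an assumption):

```python
import os

# Set (or unset) this before the first import - the alias is chosen at import time.
os.environ["PLAYWRIGHT_DRIVER_URL"] = "ws://playwright-chrome:3000"

from changedetectionio import content_fetcher

# With PLAYWRIGHT_DRIVER_URL set, html_webdriver is the Playwright implementation;
# without it, it falls back to the Selenium/WebDriver implementation.
assert content_fetcher.html_webdriver is content_fetcher.base_html_playwright
```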

View File

@@ -4,7 +4,6 @@ import re
import time
import urllib3
from inscriptis import get_text
from changedetectionio import content_fetcher, html_tools
urllib3.disable_warnings(urllib3.exceptions.InsecureRequestWarning)
@@ -17,10 +16,39 @@ class perform_site_check():
super().__init__(*args, **kwargs)
self.datastore = datastore
# If there was a proxy list enabled, figure out what proxy_args/which proxy to use
# if watch.proxy use that
# fetcher.proxy_override = watch.proxy or main config proxy
# Allows override the proxy on a per-request basis
# ALWAYS use the first one is nothing selected
def set_proxy_from_list(self, watch):
proxy_args = None
if self.datastore.proxy_list is None:
return None
# If its a valid one
if any([watch['proxy'] in p for p in self.datastore.proxy_list]):
proxy_args = watch['proxy']
# not valid (including None), try the system one
else:
system_proxy = self.datastore.data['settings']['requests']['proxy']
# Is not None and exists
if any([system_proxy in p for p in self.datastore.proxy_list]):
proxy_args = system_proxy
# Fallback - Did not resolve anything, use the first available
if proxy_args is None:
proxy_args = self.datastore.proxy_list[0][0]
return proxy_args
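`set_proxy_from_list()` resolves, in order: the watch's own proxy if it appears in the configured list, then the system-wide proxy from settings, and finally the first list entry. A standalone sketch with an illustrative proxy list (the `(proxy_url, label)` tuple shape is an assumption consistent with how the list is used elsewhere in this diff):

```python
# Illustrative (proxy_url, label) tuples - the first element is what ends up as proxy_override.
proxy_list = [
    ('http://proxy-one.example.com:3128', 'proxy-one'),
    ('http://proxy-two.example.com:3128', 'proxy-two'),
]

def resolve_proxy(watch_proxy, system_proxy, proxy_list):
    """Simplified set_proxy_from_list(): watch choice, then system choice, then the first entry."""
    if any(watch_proxy in p for p in proxy_list):
        return watch_proxy
    if any(system_proxy in p for p in proxy_list):
        return system_proxy
    return proxy_list[0][0]

assert resolve_proxy(None, None, proxy_list) == 'http://proxy-one.example.com:3128'
```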
def run(self, uuid):
timestamp = int(time.time()) # used for storage etc too
changed_detected = False
screenshot = False # as bytes
stripped_text_from_html = ""
watch = self.datastore.data['watching'][uuid]
@@ -46,130 +74,166 @@ class perform_site_check():
if 'Accept-Encoding' in request_headers and "br" in request_headers['Accept-Encoding']:
request_headers['Accept-Encoding'] = request_headers['Accept-Encoding'].replace(', br', '')
# @todo check the failures are really handled how we expect
timeout = self.datastore.data['settings']['requests']['timeout']
url = self.datastore.get_val(uuid, 'url')
request_body = self.datastore.get_val(uuid, 'body')
request_method = self.datastore.get_val(uuid, 'method')
ignore_status_code = self.datastore.get_val(uuid, 'ignore_status_codes')
# source: support
is_source = False
if url.startswith('source:'):
url = url.replace('source:', '')
is_source = True
# Pluggable content fetcher
prefer_backend = watch['fetch_backend']
if hasattr(content_fetcher, prefer_backend):
klass = getattr(content_fetcher, prefer_backend)
else:
timeout = self.datastore.data['settings']['requests']['timeout']
url = self.datastore.get_val(uuid, 'url')
request_body = self.datastore.get_val(uuid, 'body')
request_method = self.datastore.get_val(uuid, 'method')
ignore_status_code = self.datastore.get_val(uuid, 'ignore_status_codes')
# If the klass doesnt exist, just use a default
klass = getattr(content_fetcher, "html_requests")
# Pluggable content fetcher
prefer_backend = watch['fetch_backend']
if hasattr(content_fetcher, prefer_backend):
klass = getattr(content_fetcher, prefer_backend)
proxy_args = self.set_proxy_from_list(watch)
fetcher = klass(proxy_override=proxy_args)
# Configurable per-watch or global extra delay before extracting text (for webDriver types)
system_webdriver_delay = self.datastore.data['settings']['application'].get('webdriver_delay', None)
if watch['webdriver_delay'] is not None:
fetcher.render_extract_delay = watch['webdriver_delay']
elif system_webdriver_delay is not None:
fetcher.render_extract_delay = system_webdriver_delay
fetcher.run(url, timeout, request_headers, request_body, request_method, ignore_status_code)
# Fetching complete, now filters
# @todo move to class / maybe inside of fetcher abstract base?
# @note: I feel like the following should be in a more obvious chain system
# - Check filter text
# - Is the checksum different?
# - Do we convert to JSON?
# https://stackoverflow.com/questions/41817578/basic-method-chaining ?
# return content().textfilter().jsonextract().checksumcompare() ?
is_json = 'application/json' in fetcher.headers.get('Content-Type', '')
is_html = not is_json
# source: support, basically treat it as plaintext
if is_source:
is_html = False
is_json = False
css_filter_rule = watch['css_filter']
subtractive_selectors = watch.get(
"subtractive_selectors", []
) + self.datastore.data["settings"]["application"].get(
"global_subtractive_selectors", []
)
has_filter_rule = css_filter_rule and len(css_filter_rule.strip())
has_subtractive_selectors = subtractive_selectors and len(subtractive_selectors[0].strip())
if is_json and not has_filter_rule:
css_filter_rule = "json:$"
has_filter_rule = True
if has_filter_rule:
if 'json:' in css_filter_rule:
stripped_text_from_html = html_tools.extract_json_as_string(content=fetcher.content, jsonpath_filter=css_filter_rule)
is_html = False
if is_html or is_source:
# CSS Filter, extract the HTML that matches and feed that into the existing inscriptis::get_text
html_content = fetcher.content
# If not JSON, and if it's not text/plain..
if 'text/plain' in fetcher.headers.get('Content-Type', '').lower():
# Don't run get_text or xpath/css filters on plaintext
stripped_text_from_html = html_content
else:
# If the klass doesn't exist, just use a default
klass = getattr(content_fetcher, "html_requests")
# Then we assume HTML
if has_filter_rule:
# For HTML/XML we offer xpath as an option, just start a regular xPath "/.."
if css_filter_rule[0] == '/' or css_filter_rule.startswith('xpath:'):
html_content = html_tools.xpath_filter(xpath_filter=css_filter_rule.replace('xpath:', ''),
html_content=fetcher.content)
else:
# CSS Filter, extract the HTML that matches and feed that into the existing inscriptis::get_text
html_content = html_tools.css_filter(css_filter=css_filter_rule, html_content=fetcher.content)
if has_subtractive_selectors:
html_content = html_tools.element_removal(subtractive_selectors, html_content)
fetcher = klass()
fetcher.run(url, timeout, request_headers, request_body, request_method, ignore_status_code)
# Fetching complete, now filters
# @todo move to class / maybe inside of fetcher abstract base?
if not is_source:
# extract text
stripped_text_from_html = \
html_tools.html_to_text(
html_content,
render_anchor_tag_content=self.datastore.data["settings"][
"application"].get(
"render_anchor_tag_content", False)
)
# @note: I feel like the following should be in a more obvious chain system
# - Check filter text
# - Is the checksum different?
# - Do we convert to JSON?
# https://stackoverflow.com/questions/41817578/basic-method-chaining ?
# return content().textfilter().jsonextract().checksumcompare() ?
is_json = 'application/json' in fetcher.headers.get('Content-Type', '')
is_html = not is_json
css_filter_rule = watch['css_filter']
subtractive_selectors = watch.get(
"subtractive_selectors", []
) + self.datastore.data["settings"]["application"].get(
"global_subtractive_selectors", []
)
has_filter_rule = css_filter_rule and len(css_filter_rule.strip())
has_subtractive_selectors = subtractive_selectors and len(subtractive_selectors[0].strip())
if is_json and not has_filter_rule:
css_filter_rule = "json:$"
has_filter_rule = True
if has_filter_rule:
if 'json:' in css_filter_rule:
stripped_text_from_html = html_tools.extract_json_as_string(content=fetcher.content, jsonpath_filter=css_filter_rule)
is_html = False
if is_html:
# CSS Filter, extract the HTML that matches and feed that into the existing inscriptis::get_text
html_content = fetcher.content
# If not JSON, and if it's not text/plain..
if 'text/plain' in fetcher.headers.get('Content-Type', '').lower():
# Don't run get_text or xpath/css filters on plaintext
elif is_source:
stripped_text_from_html = html_content
else:
# Then we assume HTML
if has_filter_rule:
# For HTML/XML we offer xpath as an option, just start a regular xPath "/.."
if css_filter_rule[0] == '/':
html_content = html_tools.xpath_filter(xpath_filter=css_filter_rule, html_content=fetcher.content)
else:
# CSS Filter, extract the HTML that matches and feed that into the existing inscriptis::get_text
html_content = html_tools.css_filter(css_filter=css_filter_rule, html_content=fetcher.content)
if has_subtractive_selectors:
html_content = html_tools.element_removal(subtractive_selectors, html_content)
# get_text() via inscriptis
stripped_text_from_html = get_text(html_content)
# Re #340 - return the content before the 'ignore text' was applied
text_content_before_ignored_filter = stripped_text_from_html.encode('utf-8')
# We rely on the actual text in the html output.. many sites have random script vars etc,
# in the future we'll implement other mechanisms.
# Re #340 - return the content before the 'ignore text' was applied
text_content_before_ignored_filter = stripped_text_from_html.encode('utf-8')
update_obj["last_check_status"] = fetcher.get_last_status_code()
# Treat pages with no renderable text content as a change? No by default
empty_pages_are_a_change = self.datastore.data['settings']['application'].get('empty_pages_are_a_change', False)
if not is_json and not empty_pages_are_a_change and len(stripped_text_from_html.strip()) == 0:
raise content_fetcher.ReplyWithContentButNoText(url=url, status_code=200)
# If there's text to skip
# @todo we could abstract out the get_text() to handle this cleaner
text_to_ignore = watch.get('ignore_text', []) + self.datastore.data['settings']['application'].get('global_ignore_text', [])
if len(text_to_ignore):
stripped_text_from_html = html_tools.strip_ignore_text(stripped_text_from_html, text_to_ignore)
else:
stripped_text_from_html = stripped_text_from_html.encode('utf8')
# We rely on the actual text in the html output.. many sites have random script vars etc,
# in the future we'll implement other mechanisms.
# Re #133 - if we should strip whitespaces from triggering the change detected comparison
if self.datastore.data['settings']['application'].get('ignore_whitespace', False):
fetched_md5 = hashlib.md5(stripped_text_from_html.translate(None, b'\r\n\t ')).hexdigest()
else:
fetched_md5 = hashlib.md5(stripped_text_from_html).hexdigest()
update_obj["last_check_status"] = fetcher.get_last_status_code()
# On the first run of a site, watch['previous_md5'] will be an empty string, set it to the current one.
if not len(watch['previous_md5']):
watch['previous_md5'] = fetched_md5
update_obj["previous_md5"] = fetched_md5
# If there's text to skip
# @todo we could abstract out the get_text() to handle this cleaner
text_to_ignore = watch.get('ignore_text', []) + self.datastore.data['settings']['application'].get('global_ignore_text', [])
if len(text_to_ignore):
stripped_text_from_html = html_tools.strip_ignore_text(stripped_text_from_html, text_to_ignore)
else:
stripped_text_from_html = stripped_text_from_html.encode('utf8')
blocked_by_not_found_trigger_text = False
# Re #133 - if we should strip whitespaces from triggering the change detected comparison
if self.datastore.data['settings']['application'].get('ignore_whitespace', False):
fetched_md5 = hashlib.md5(stripped_text_from_html.translate(None, b'\r\n\t ')).hexdigest()
else:
fetched_md5 = hashlib.md5(stripped_text_from_html).hexdigest()
if len(watch['trigger_text']):
# Yeah, let's block first until something matches
blocked_by_not_found_trigger_text = True
# Filter and trigger works the same, so reuse it
result = html_tools.strip_ignore_text(content=str(stripped_text_from_html),
wordlist=watch['trigger_text'],
mode="line numbers")
if result:
blocked_by_not_found_trigger_text = False
# On the first run of a site, watch['previous_md5'] will be None, set it to the current one.
if not watch.get('previous_md5'):
watch['previous_md5'] = fetched_md5
update_obj["previous_md5"] = fetched_md5
blocked_by_not_found_trigger_text = False
if not blocked_by_not_found_trigger_text and watch['previous_md5'] != fetched_md5:
changed_detected = True
update_obj["previous_md5"] = fetched_md5
update_obj["last_changed"] = timestamp
if len(watch['trigger_text']):
# Yeah, let's block first until something matches
blocked_by_not_found_trigger_text = True
# Filter and trigger works the same, so reuse it
result = html_tools.strip_ignore_text(content=str(stripped_text_from_html),
wordlist=watch['trigger_text'],
mode="line numbers")
if result:
blocked_by_not_found_trigger_text = False
if not blocked_by_not_found_trigger_text and watch['previous_md5'] != fetched_md5:
changed_detected = True
update_obj["previous_md5"] = fetched_md5
update_obj["last_changed"] = timestamp
# Extract title as title
if is_html:
if self.datastore.data['settings']['application']['extract_title_as_title'] or watch['extract_title_as_title']:
if not watch['title'] or not len(watch['title']):
update_obj['title'] = html_tools.extract_element(find='title', html_content=fetcher.content)
# Extract title as title
if is_html:
if self.datastore.data['settings']['application']['extract_title_as_title'] or watch['extract_title_as_title']:
if not watch['title'] or not len(watch['title']):
update_obj['title'] = html_tools.extract_element(find='title', html_content=fetcher.content)
return changed_detected, update_obj, text_content_before_ignored_filter
return changed_detected, update_obj, text_content_before_ignored_filter, fetcher.screenshot

View File

@@ -15,7 +15,6 @@ from wtforms import (
validators,
widgets,
)
from wtforms.fields import html5
from wtforms.validators import ValidationError
from changedetectionio import content_fetcher
@@ -26,6 +25,8 @@ from changedetectionio.notification import (
valid_notification_formats,
)
from wtforms.fields import FormField
valid_method = {
'GET',
'POST',
@@ -36,27 +37,31 @@ valid_method = {
default_method = 'GET'
class StringListField(StringField):
widget = widgets.TextArea()
def _value(self):
if self.data:
return "\r\n".join(self.data)
# ignore empty lines in the storage
data = list(filter(lambda x: len(x.strip()), self.data))
# Apply strip to each line
data = list(map(lambda x: x.strip(), data))
return "\r\n".join(data)
else:
return u''
# incoming
def process_formdata(self, valuelist):
if valuelist:
# Remove empty strings
cleaned = list(filter(None, valuelist[0].split("\n")))
self.data = [x.strip() for x in cleaned]
p = 1
if valuelist and len(valuelist[0].strip()):
# Remove empty strings, stripping and splitting \r\n, only \n etc.
self.data = valuelist[0].splitlines()
# Remove empty lines from the final data
self.data = list(filter(lambda x: len(x.strip()), self.data))
else:
self.data = []
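For clarity, a minimal sketch of what the incoming-data handling above does (the textarea value is illustrative): splitlines() copes with both \r\n and \n line endings, and lines that are empty after stripping are dropped.

raw = "https://example.com/a\r\n\r\n   \nhttps://example.com/b\n"
data = raw.splitlines()
# Remove empty lines, exactly like process_formdata() above
data = list(filter(lambda x: len(x.strip()), data))
print(data)  # ['https://example.com/a', 'https://example.com/b']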
class SaltyPasswordField(StringField):
widget = widgets.PasswordInput()
encrypted_password = ""
@@ -84,6 +89,13 @@ class SaltyPasswordField(StringField):
else:
self.data = False
class TimeBetweenCheckForm(Form):
weeks = IntegerField('Weeks', validators=[validators.Optional(), validators.NumberRange(min=0, message="Should contain zero or more seconds")])
days = IntegerField('Days', validators=[validators.Optional(), validators.NumberRange(min=0, message="Should contain zero or more seconds")])
hours = IntegerField('Hours', validators=[validators.Optional(), validators.NumberRange(min=0, message="Should contain zero or more seconds")])
minutes = IntegerField('Minutes', validators=[validators.Optional(), validators.NumberRange(min=0, message="Should contain zero or more seconds")])
seconds = IntegerField('Seconds', validators=[validators.Optional(), validators.NumberRange(min=0, message="Should contain zero or more seconds")])
# @todo add total seconds minimum validator = minimum_seconds_recheck_time
# Separated by key:value
class StringDictKeyValue(StringField):
@@ -122,7 +134,6 @@ class ValidateContentFetcherIsReady(object):
def __call__(self, form, field):
import urllib3.exceptions
from changedetectionio import content_fetcher
# Better would be a radiohandler that keeps a reference to each class
@@ -231,7 +242,7 @@ class ValidateListRegex(object):
except re.error:
message = field.gettext('RegEx \'%s\' is not a valid regular expression.')
raise ValidationError(message % (line))
class ValidateCSSJSONXPATHInput(object):
"""
Filter validation
@@ -293,43 +304,43 @@ class ValidateCSSJSONXPATHInput(object):
# Re #265 - maybe in the future fetch the page and offer a
# warning/notice that it's possible the rule doesn't yet match anything?
class quickWatchForm(Form):
# https://wtforms.readthedocs.io/en/2.3.x/fields/#module-wtforms.fields.html5
# `require_tld` = False is needed even for the test harness "http://localhost:5005.." to run
url = html5.URLField('URL', validators=[validateURL()])
url = fields.URLField('URL', validators=[validateURL()])
tag = StringField('Group tag', [validators.Optional(), validators.Length(max=35)])
# Common to a single watch and the global settings
class commonSettingsForm(Form):
notification_urls = StringListField('Notification URL List', validators=[validators.Optional(), ValidateNotificationBodyAndTitleWhenURLisSet(), ValidateAppRiseServers()])
notification_title = StringField('Notification Title', default=default_notification_title, validators=[validators.Optional(), ValidateTokensList()])
notification_body = TextAreaField('Notification Body', default=default_notification_body, validators=[validators.Optional(), ValidateTokensList()])
notification_format = SelectField('Notification Format', choices=valid_notification_formats.keys(), default=default_notification_format)
trigger_check = BooleanField('Send test notification on save')
fetch_backend = RadioField(u'Fetch Method', choices=content_fetcher.available_fetchers(), validators=[ValidateContentFetcherIsReady()])
notification_urls = StringListField('Notification URL list', validators=[validators.Optional(), ValidateNotificationBodyAndTitleWhenURLisSet(), ValidateAppRiseServers()])
notification_title = StringField('Notification title', default=default_notification_title, validators=[validators.Optional(), ValidateTokensList()])
notification_body = TextAreaField('Notification body', default=default_notification_body, validators=[validators.Optional(), ValidateTokensList()])
notification_format = SelectField('Notification format', choices=valid_notification_formats.keys(), default=default_notification_format)
fetch_backend = RadioField(u'Fetch method', choices=content_fetcher.available_fetchers(), validators=[ValidateContentFetcherIsReady()])
extract_title_as_title = BooleanField('Extract <title> from document and use as watch title', default=False)
webdriver_delay = IntegerField('Wait seconds before extracting text', validators=[validators.Optional(), validators.NumberRange(min=1, message="Should contain one or more seconds")] )
class watchForm(commonSettingsForm):
url = html5.URLField('URL', validators=[validateURL()])
tag = StringField('Group tag', [validators.Optional(), validators.Length(max=35)])
url = fields.URLField('URL', validators=[validateURL()])
tag = StringField('Group tag', [validators.Optional(), validators.Length(max=35)], default='')
time_between_check = FormField(TimeBetweenCheckForm)
css_filter = StringField('CSS/JSON/XPATH Filter', [ValidateCSSJSONXPATHInput()], default='')
minutes_between_check = html5.IntegerField('Maximum time in minutes until recheck',
[validators.Optional(), validators.NumberRange(min=1)])
css_filter = StringField('CSS/JSON/XPATH Filter', [ValidateCSSJSONXPATHInput()])
subtractive_selectors = StringListField('Remove elements', [ValidateCSSJSONXPATHInput(allow_xpath=False, allow_json=False)])
title = StringField('Title')
title = StringField('Title', default='')
ignore_text = StringListField('Ignore Text', [ValidateListRegex()])
headers = StringDictKeyValue('Request Headers')
body = TextAreaField('Request Body', [validators.Optional()])
method = SelectField('Request Method', choices=valid_method, default=default_method)
ignore_status_codes = BooleanField('Ignore Status Codes (process non-2xx status codes as normal)', default=False)
ignore_text = StringListField('Ignore text', [ValidateListRegex()])
headers = StringDictKeyValue('Request headers')
body = TextAreaField('Request body', [validators.Optional()])
method = SelectField('Request method', choices=valid_method, default=default_method)
ignore_status_codes = BooleanField('Ignore status codes (process non-2xx status codes as normal)', default=False)
trigger_text = StringListField('Trigger/wait for text', [validators.Optional(), ValidateListRegex()])
save_button = SubmitField('Save', render_kw={"class": "pure-button pure-button-primary"})
save_and_preview_button = SubmitField('Save & Preview', render_kw={"class": "pure-button pure-button-primary"})
proxy = RadioField('Proxy')
def validate(self, **kwargs):
if not super().validate():
@@ -344,15 +355,33 @@ class watchForm(commonSettingsForm):
return result
class globalSettingsForm(commonSettingsForm):
password = SaltyPasswordField()
minutes_between_check = html5.IntegerField('Maximum time in minutes until recheck',
[validators.NumberRange(min=1)])
extract_title_as_title = BooleanField('Extract <title> from document and use as watch title')
# datastore.data['settings']['requests']..
class globalSettingsRequestForm(Form):
time_between_check = FormField(TimeBetweenCheckForm)
proxy = RadioField('Proxy')
# datastore.data['settings']['application']..
class globalSettingsApplicationForm(commonSettingsForm):
base_url = StringField('Base URL', validators=[validators.Optional()])
global_subtractive_selectors = StringListField('Remove elements', [ValidateCSSJSONXPATHInput(allow_xpath=False, allow_json=False)])
global_ignore_text = StringListField('Ignore Text', [ValidateListRegex()])
ignore_whitespace = BooleanField('Ignore whitespace')
real_browser_save_screenshot = BooleanField('Save last screenshot when using Chrome?')
removepassword_button = SubmitField('Remove password', render_kw={"class": "pure-button pure-button-primary"})
empty_pages_are_a_change = BooleanField('Treat empty pages as a change?', default=False)
render_anchor_tag_content = BooleanField('Render anchor tag content', default=False)
fetch_backend = RadioField('Fetch Method', default="html_requests", choices=content_fetcher.available_fetchers(), validators=[ValidateContentFetcherIsReady()])
password = SaltyPasswordField()
class globalSettingsForm(Form):
# Define these as FormFields/"sub forms", this way it matches the JSON storage
# datastore.data['settings']['application']..
# datastore.data['settings']['requests']..
requests = FormField(globalSettingsRequestForm)
application = FormField(globalSettingsApplicationForm)
save_button = SubmitField('Save', render_kw={"class": "pure-button pure-button-primary"})
removepassword_button = SubmitField('Remove password', render_kw={"class": "pure-button pure-button-primary"})

View File

@@ -4,6 +4,9 @@ from typing import List
from bs4 import BeautifulSoup
from jsonpath_ng.ext import parse
import re
from inscriptis import get_text
from inscriptis.model.config import ParserConfig
class JSONNotFound(ValueError):
@@ -25,12 +28,12 @@ def subtractive_css_selector(css_selector, html_content):
item.decompose()
return str(soup)
def element_removal(selectors: List[str], html_content):
"""Joins individual filters into one css filter."""
selector = ",".join(selectors)
return subtractive_css_selector(selector, html_content)
# Return str Utf-8 of matched rules
def xpath_filter(xpath_filter, html_content):
@@ -167,3 +170,35 @@ def strip_ignore_text(content, wordlist, mode="content"):
return ignored_line_numbers
return "\n".encode('utf8').join(output)
def html_to_text(html_content: str, render_anchor_tag_content=False) -> str:
"""Converts html string to a string with just the text. If ignoring
rendering anchor tag content is enable, anchor tag content are also
included in the text
:param html_content: string with html content
:param render_anchor_tag_content: boolean flag indicating whether to extract
hyperlinks (the anchor tag content) together with text. This refers to the
'href' inside 'a' tags.
Anchor tag content is rendered in the following manner:
'[ text ](anchor tag content)'
:return: extracted text from the HTML
"""
# if anchor tag content flag is set to True define a config for
# extracting this content
if render_anchor_tag_content:
parser_config = ParserConfig(
annotation_rules={"a": ["hyperlink"]}, display_links=True
)
# otherwise set config to None
else:
parser_config = None
# get text and annotations via inscriptis
text_content = get_text(html_content, config=parser_config)
return text_content
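A short usage sketch of this helper (the HTML snippet is illustrative, and the expected output is only approximate, following the '[ text ](anchor tag content)' format described in the docstring):

from inscriptis import get_text
from inscriptis.model.config import ParserConfig

html = '<p>Price: <a href="https://example.com/item">see item</a></p>'
# Default behaviour: only the visible text is returned
print(get_text(html))
# With display_links=True the link target is rendered alongside the text,
# roughly: Price: [see item](https://example.com/item)
config = ParserConfig(annotation_rules={"a": ["hyperlink"]}, display_links=True)
print(get_text(html, config=config))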

View File

@@ -0,0 +1,133 @@
from abc import ABC, abstractmethod
import time
import validators
class Importer():
remaining_data = []
new_uuids = []
good = 0
def __init__(self):
self.new_uuids = []
self.good = 0
self.remaining_data = []
@abstractmethod
def run(self,
data,
flash,
datastore):
pass
class import_url_list(Importer):
"""
Imports a list, can be in <code>https://example.com tag1, tag2, last tag</code> format
"""
def run(self,
data,
flash,
datastore,
):
urls = data.split("\n")
good = 0
now = time.time()
if (len(urls) > 5000):
flash("Importing 5,000 of the first URLs from your list, the rest can be imported again.")
for url in urls:
url = url.strip()
if not len(url):
continue
tags = ""
# 'tags' should be a csv list after the URL
if ' ' in url:
url, tags = url.split(" ", 1)
# Flask wtform validators won't work with basic auth, use validators package
# Up to 5000 per batch so we don't flood the server
if len(url) and validators.url(url.replace('source:', '')) and good < 5000:
new_uuid = datastore.add_watch(url=url.strip(), tag=tags, write_to_disk_now=False)
if new_uuid:
# Straight into the queue.
self.new_uuids.append(new_uuid)
good += 1
continue
# Worked past the 'continue' above, append it to the bad list
if self.remaining_data is None:
self.remaining_data = []
self.remaining_data.append(url)
flash("{} Imported from list in {:.2f}s, {} Skipped.".format(good, time.time() - now, len(self.remaining_data)))
class import_distill_io_json(Importer):
def run(self,
data,
flash,
datastore,
):
import json
good = 0
now = time.time()
self.new_uuids=[]
try:
data = json.loads(data.strip())
except json.decoder.JSONDecodeError:
flash("Unable to read JSON file, was it broken?", 'error')
return
if not data.get('data'):
flash("JSON structure looks invalid, was it broken?", 'error')
return
for d in data.get('data'):
d_config = json.loads(d['config'])
extras = {'title': d['name']}
if len(d['uri']) and good < 5000:
try:
# @todo we only support CSS ones at the moment
if d_config['selections'][0]['frames'][0]['excludes'][0]['type'] == 'css':
extras['subtractive_selectors'] = d_config['selections'][0]['frames'][0]['excludes'][0]['expr']
except KeyError:
pass
except IndexError:
pass
try:
extras['css_filter'] = d_config['selections'][0]['frames'][0]['includes'][0]['expr']
if d_config['selections'][0]['frames'][0]['includes'][0]['type'] == 'xpath':
extras['css_filter'] = 'xpath:' + extras['css_filter']
except KeyError:
pass
except IndexError:
pass
try:
extras['tag'] = ", ".join(d['tags'])
except KeyError:
pass
except IndexError:
pass
new_uuid = datastore.add_watch(url=d['uri'].strip(),
extras=extras,
write_to_disk_now=False)
if new_uuid:
# Straight into the queue.
self.new_uuids.append(new_uuid)
good += 1
flash("{} Imported from Distill.io in {:.2f}s, {} Skipped.".format(len(self.new_uuids), time.time() - now, len(self.remaining_data)))

View File

@@ -0,0 +1,53 @@
import collections
import os
import uuid as uuid_builder
from changedetectionio.notification import (
default_notification_body,
default_notification_format,
default_notification_title,
)
class model(dict):
base_config = {
'note': "Hello! If you change this file manually, please be sure to restart your changedetection.io instance!",
'watching': {},
'settings': {
'headers': {
'User-Agent': 'Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/87.0.4280.66 Safari/537.36',
'Accept': 'text/html,application/xhtml+xml,application/xml;q=0.9,image/avif,image/webp,image/apng,*/*;q=0.8,application/signed-exchange;v=b3;q=0.9',
'Accept-Encoding': 'gzip, deflate', # No support for brotli in python requests yet.
'Accept-Language': 'en-GB,en-US;q=0.9,en;'
},
'requests': {
'timeout': 15, # Default 15 seconds
'time_between_check': {'weeks': None, 'days': None, 'hours': 3, 'minutes': None, 'seconds': None},
'workers': 10, # Number of threads, lower is better for slow connections
'proxy': None # Preferred proxy connection
},
'application': {
'password': False,
'base_url' : None,
'extract_title_as_title': False,
'empty_pages_are_a_change': False,
'fetch_backend': os.getenv("DEFAULT_FETCH_BACKEND", "html_requests"),
'global_ignore_text': [], # List of text to ignore when calculating the comparison checksum
'global_subtractive_selectors': [],
'ignore_whitespace': False,
'render_anchor_tag_content': False,
'notification_urls': [], # Apprise URL list
# Custom notification content
'notification_title': default_notification_title,
'notification_body': default_notification_body,
'notification_format': default_notification_format,
'real_browser_save_screenshot': True,
'schema_version' : 0,
'webdriver_delay': None # Extra delay in seconds before extracting text
}
}
}
def __init__(self, *arg, **kw):
super(model, self).__init__(*arg, **kw)
self.update(self.base_config)

View File

@@ -0,0 +1,70 @@
import os
import uuid as uuid_builder
minimum_seconds_recheck_time = int(os.getenv('MINIMUM_SECONDS_RECHECK_TIME', 60))
from changedetectionio.notification import (
default_notification_body,
default_notification_format,
default_notification_title,
)
class model(dict):
base_config = {
'url': None,
'tag': None,
'last_checked': 0,
'last_changed': 0,
'paused': False,
'last_viewed': 0, # history key value of the last viewed via the [diff] link
'newest_history_key': 0,
'title': None,
'previous_md5': False,
# UUID not needed, should be generated only as a key
# 'uuid':
'headers': {}, # Extra headers to send
'body': None,
'method': 'GET',
'history': {}, # Dict of timestamp and output stripped filename
'ignore_text': [], # List of text to ignore when calculating the comparison checksum
# Custom notification content
'notification_urls': [], # List of URLs to add to the notification Queue (Usually AppRise)
'notification_title': default_notification_title,
'notification_body': default_notification_body,
'notification_format': default_notification_format,
'css_filter': "",
'subtractive_selectors': [],
'trigger_text': [], # List of text or regex to wait for until a change is detected
'fetch_backend': None,
'extract_title_as_title': False,
'proxy': None, # Preferred proxy connection
# Re #110, so then if this is set to None, we know to use the default value instead
# Requires setting to None on submit if it's the same as the default
# Should be all None by default, so we use the system default in this case.
'time_between_check': {'weeks': None, 'days': None, 'hours': None, 'minutes': None, 'seconds': None},
'webdriver_delay': None
}
def __init__(self, *arg, **kw):
self.update(self.base_config)
# goes at the end so we update the default object with the initialiser
super(model, self).__init__(*arg, **kw)
@property
def has_empty_checktime(self):
# Check if all time_between_check values are None, False or 0
res = all(x == None or x == False or x==0 for x in self.get('time_between_check', {}).values())
return res
def threshold_seconds(self):
seconds = 0
mtable = {'seconds': 1, 'minutes': 60, 'hours': 3600, 'days': 86400, 'weeks': 86400 * 7}
for m, n in mtable.items():
x = self.get('time_between_check', {}).get(m, None)
if x:
seconds += x * n
return seconds
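A quick worked example of the conversion above (the interval values are illustrative): a watch set to 3 hours and 5 minutes between checks resolves to 11,100 seconds.

mtable = {'seconds': 1, 'minutes': 60, 'hours': 3600, 'days': 86400, 'weeks': 86400 * 7}
time_between_check = {'weeks': None, 'days': None, 'hours': 3, 'minutes': 5, 'seconds': None}
# 3 * 3600 + 5 * 60 = 11100
print(sum(mtable[m] * (time_between_check.get(m) or 0) for m in mtable))  # 11100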

View File

View File

@@ -26,13 +26,6 @@ default_notification_title = 'ChangeDetection.io Notification - {watch_url}'
def process_notification(n_object, datastore):
apobj = apprise.Apprise(debug=True)
for url in n_object['notification_urls']:
url = url.strip()
print (">> Process Notification: AppRise notifying {}".format(url))
apobj.add(url)
# Get the notification body from datastore
n_body = n_object.get('notification_body', default_notification_body)
n_title = n_object.get('notification_title', default_notification_title)
@@ -54,19 +47,55 @@ def process_notification(n_object, datastore):
# https://github.com/caronc/apprise/wiki/Development_LogCapture
# Anything higher than or equal to WARNING (which covers things like Connection errors)
# raise it as an exception
apobjs=[]
for url in n_object['notification_urls']:
with apprise.LogCapture(level=apprise.logging.DEBUG) as logs:
apobj.notify(
body=n_body,
title=n_title,
body_format=n_format)
apobj = apprise.Apprise(debug=True)
url = url.strip()
if len(url):
print(">> Process Notification: AppRise notifying {}".format(url))
with apprise.LogCapture(level=apprise.logging.DEBUG) as logs:
# Re 323 - Limit discord length to their 2000 char limit total or it won't send.
# Because different notifications may require different pre-processing, run each sequentially :(
# 2000 bytes minus -
# 200 bytes for the overhead of the _entire_ json payload, 200 bytes for {tts, wait, content} etc headers
# Length of URL - In case they specify a longer custom avatar_url
# Returns empty string if nothing found, multi-line string otherwise
log_value = logs.getvalue()
if log_value and 'WARNING' in log_value or 'ERROR' in log_value:
raise Exception(log_value)
# So if no avatar_url is specified, add one so it can be correctly calculated into the total payload
k = '?' if not '?' in url else '&'
if not 'avatar_url' in url:
url += k + 'avatar_url=https://raw.githubusercontent.com/dgtlmoon/changedetection.io/master/changedetectionio/static/images/avatar-256x256.png'
if url.startswith('tgram://'):
# real limit is 4096, but minus some for extra metadata
payload_max_size = 3600
body_limit = max(0, payload_max_size - len(n_title))
n_title = n_title[0:payload_max_size]
n_body = n_body[0:body_limit]
elif url.startswith('discord://'):
# real limit is 2000, but minus some for extra metadata
payload_max_size = 1700
body_limit = max(0, payload_max_size - len(n_title))
n_title = n_title[0:payload_max_size]
n_body = n_body[0:body_limit]
apobj.add(url)
apobj.notify(
title=n_title,
body=n_body,
body_format=n_format)
apobj.clear()
# In case it needs to exist in memory for a while afterwards to process(?)
apobjs.append(apobj)
# Returns empty string if nothing found, multi-line string otherwise
log_value = logs.getvalue()
if log_value and 'WARNING' in log_value or 'ERROR' in log_value:
raise Exception(log_value)
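To make the truncation arithmetic above concrete, a small sketch using the discord:// budget from the comments in this diff (about 1700 usable characters once the JSON overhead is subtracted; the title and body values are illustrative):

payload_max_size = 1700
n_title = "ChangeDetection.io Notification - https://example.com"
n_body = "x" * 5000
# The body gets whatever budget is left after the title, same as the code above
body_limit = max(0, payload_max_size - len(n_title))
n_title = n_title[0:payload_max_size]
n_body = n_body[0:body_limit]
print(len(n_title) + len(n_body))  # 1700 - title plus body stay within the limit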
# Notification title + body content parameters get created here.
def create_notification_parameters(n_object, datastore):

Binary file not shown (new image, 38 KiB)

View File

@@ -0,0 +1,40 @@
<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<svg
version="1.1"
id="Layer_1"
x="0px"
y="0px"
viewBox="0 0 115.77 122.88"
style="enable-background:new 0 0 115.77 122.88"
xml:space="preserve"
sodipodi:docname="copy.svg"
inkscape:version="1.1.1 (1:1.1+202109281949+c3084ef5ed)"
xmlns:inkscape="http://www.inkscape.org/namespaces/inkscape"
xmlns:sodipodi="http://sodipodi.sourceforge.net/DTD/sodipodi-0.dtd"
xmlns="http://www.w3.org/2000/svg"
xmlns:svg="http://www.w3.org/2000/svg"><defs
id="defs11" /><sodipodi:namedview
id="namedview9"
pagecolor="#ffffff"
bordercolor="#666666"
borderopacity="1.0"
inkscape:pageshadow="2"
inkscape:pageopacity="0.0"
inkscape:pagecheckerboard="0"
showgrid="false"
inkscape:zoom="5.5501303"
inkscape:cx="57.83648"
inkscape:cy="61.439999"
inkscape:window-width="1920"
inkscape:window-height="1056"
inkscape:window-x="1920"
inkscape:window-y="0"
inkscape:window-maximized="1"
inkscape:current-layer="g6" /><style
type="text/css"
id="style2">.st0{fill-rule:evenodd;clip-rule:evenodd;}</style><g
id="g6"><path
class="st0"
d="M89.62,13.96v7.73h12.19h0.01v0.02c3.85,0.01,7.34,1.57,9.86,4.1c2.5,2.51,4.06,5.98,4.07,9.82h0.02v0.02 v73.27v0.01h-0.02c-0.01,3.84-1.57,7.33-4.1,9.86c-2.51,2.5-5.98,4.06-9.82,4.07v0.02h-0.02h-61.7H40.1v-0.02 c-3.84-0.01-7.34-1.57-9.86-4.1c-2.5-2.51-4.06-5.98-4.07-9.82h-0.02v-0.02V92.51H13.96h-0.01v-0.02c-3.84-0.01-7.34-1.57-9.86-4.1 c-2.5-2.51-4.06-5.98-4.07-9.82H0v-0.02V13.96v-0.01h0.02c0.01-3.85,1.58-7.34,4.1-9.86c2.51-2.5,5.98-4.06,9.82-4.07V0h0.02h61.7 h0.01v0.02c3.85,0.01,7.34,1.57,9.86,4.1c2.5,2.51,4.06,5.98,4.07,9.82h0.02V13.96L89.62,13.96z M79.04,21.69v-7.73v-0.02h0.02 c0-0.91-0.39-1.75-1.01-2.37c-0.61-0.61-1.46-1-2.37-1v0.02h-0.01h-61.7h-0.02v-0.02c-0.91,0-1.75,0.39-2.37,1.01 c-0.61,0.61-1,1.46-1,2.37h0.02v0.01v64.59v0.02h-0.02c0,0.91,0.39,1.75,1.01,2.37c0.61,0.61,1.46,1,2.37,1v-0.02h0.01h12.19V35.65 v-0.01h0.02c0.01-3.85,1.58-7.34,4.1-9.86c2.51-2.5,5.98-4.06,9.82-4.07v-0.02h0.02H79.04L79.04,21.69z M105.18,108.92V35.65v-0.02 h0.02c0-0.91-0.39-1.75-1.01-2.37c-0.61-0.61-1.46-1-2.37-1v0.02h-0.01h-61.7h-0.02v-0.02c-0.91,0-1.75,0.39-2.37,1.01 c-0.61,0.61-1,1.46-1,2.37h0.02v0.01v73.27v0.02h-0.02c0,0.91,0.39,1.75,1.01,2.37c0.61,0.61,1.46,1,2.37,1v-0.02h0.01h61.7h0.02 v0.02c0.91,0,1.75-0.39,2.37-1.01c0.61-0.61,1-1.46,1-2.37h-0.02V108.92L105.18,108.92z"
id="path4"
style="fill:#ffffff;fill-opacity:1" /></g></svg>


View File

@@ -0,0 +1,46 @@
<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<svg
width="18"
height="19.92"
viewBox="0 0 18 19.92"
version="1.1"
id="svg6"
sodipodi:docname="spread.svg"
inkscape:version="1.1.1 (1:1.1+202109281949+c3084ef5ed)"
xmlns:inkscape="http://www.inkscape.org/namespaces/inkscape"
xmlns:sodipodi="http://sodipodi.sourceforge.net/DTD/sodipodi-0.dtd"
xmlns="http://www.w3.org/2000/svg"
xmlns:svg="http://www.w3.org/2000/svg">
<defs
id="defs10" />
<sodipodi:namedview
id="namedview8"
pagecolor="#ffffff"
bordercolor="#666666"
borderopacity="1.0"
inkscape:pageshadow="2"
inkscape:pageopacity="0.0"
inkscape:pagecheckerboard="0"
showgrid="false"
fit-margin-top="0"
fit-margin-left="0"
fit-margin-right="0"
fit-margin-bottom="0"
inkscape:zoom="28.416667"
inkscape:cx="9.0087975"
inkscape:cy="9.9941348"
inkscape:window-width="1920"
inkscape:window-height="1056"
inkscape:window-x="1920"
inkscape:window-y="0"
inkscape:window-maximized="1"
inkscape:current-layer="svg6" />
<path
d="M -3,-2 H 21 V 22 H -3 Z"
fill="none"
id="path2" />
<path
d="m 15,14.08 c -0.76,0 -1.44,0.3 -1.96,0.77 L 5.91,10.7 C 5.96,10.47 6,10.24 6,10 6,9.76 5.96,9.53 5.91,9.3 L 12.96,5.19 C 13.5,5.69 14.21,6 15,6 16.66,6 18,4.66 18,3 18,1.34 16.66,0 15,0 c -1.66,0 -3,1.34 -3,3 0,0.24 0.04,0.47 0.09,0.7 L 5.04,7.81 C 4.5,7.31 3.79,7 3,7 1.34,7 0,8.34 0,10 c 0,1.66 1.34,3 3,3 0.79,0 1.5,-0.31 2.04,-0.81 l 7.12,4.16 c -0.05,0.21 -0.08,0.43 -0.08,0.65 0,1.61 1.31,2.92 2.92,2.92 1.61,0 2.92,-1.31 2.92,-2.92 0,-1.61 -1.31,-2.92 -2.92,-2.92 z"
id="path4"
style="fill:#0078e7;fill-opacity:1" />
</svg>


View File

@@ -0,0 +1,16 @@
$(document).ready(function() {
function toggle() {
if ($('input[name="application-fetch_backend"]:checked').val() != 'html_requests') {
$('#requests-override-options').hide();
$('#webdriver-override-options').show();
} else {
$('#requests-override-options').show();
$('#webdriver-override-options').hide();
}
}
$('input[name="application-fetch_backend"]').click(function (e) {
toggle();
});
toggle();
});

File diff suppressed because one or more lines are too long

View File

@@ -0,0 +1,53 @@
$(document).ready(function() {
$('#add-email-helper').click(function (e) {
e.preventDefault();
email = prompt("Destination email");
if(email) {
var n = $(".notification-urls");
var p=email_notification_prefix;
$(n).val( $.trim( $(n).val() )+"\n"+email_notification_prefix+email );
}
});
$('#send-test-notification').click(function (e) {
e.preventDefault();
// this can be global
var csrftoken = $('input[name=csrf_token]').val();
$.ajaxSetup({
beforeSend: function(xhr, settings) {
if (!/^(GET|HEAD|OPTIONS|TRACE)$/i.test(settings.type) && !this.crossDomain) {
xhr.setRequestHeader("X-CSRFToken", csrftoken)
}
}
})
data = {
window_url : window.location.href,
notification_urls : $('.notification-urls').val(),
notification_title : $('.notification-title').val(),
notification_body : $('.notification-body').val(),
notification_format : $('.notification-format').val(),
}
for (key in data) {
if (!data[key].length) {
alert(key+" is empty, cannot send test.")
return;
}
}
$.ajax({
type: "POST",
url: notification_base_url,
data : data
}).done(function(data){
console.log(data);
alert('Sent');
}).fail(function(data){
console.log(data);
alert('Error: '+data.responseJSON.error);
})
});
});

View File

@@ -1,13 +0,0 @@
window.addEventListener("load", (event) => {
// just an example for now
function toggleVisible(elem) {
// theres better ways todo this
var x = document.getElementById(elem);
if (x.style.display === "block") {
x.style.display = "none";
} else {
x.style.display = "block";
}
}
});

View File

@@ -1,6 +1,11 @@
// Rewrite this as a plugin.. is all this JS really 'worth it'?
if(!window.location.hash) {
var tab=document.querySelectorAll("#default-tab a");
tab[0].click();
}
window.addEventListener('hashchange', function() {
var tabs = document.getElementsByClassName('active');
while (tabs[0]) {
@@ -21,7 +26,6 @@ if (!has_errors.length) {
focus_error_tab();
}
function set_active_tab() {
var tab=document.querySelectorAll("a[href='"+location.hash+"']");
if (tab.length) {

View File

@@ -0,0 +1,24 @@
$(function () {
// Remove unviewed status when normally clicked
$('.diff-link').click(function () {
$(this).closest('.unviewed').removeClass('unviewed');
});
$('.with-share-link > *').click(function () {
$("#copied-clipboard").remove();
var range = document.createRange();
var n=$("#share-link")[0];
range.selectNode(n);
window.getSelection().removeAllRanges();
window.getSelection().addRange(range);
document.execCommand("copy");
window.getSelection().removeAllRanges();
$('.with-share-link').append('<span style="font-size: 80%; color: #fff;" id="copied-clipboard">Copied to clipboard</span>');
$("#copied-clipboard").fadeOut(2500, function() {
$(this).remove();
});
});
});

View File

@@ -0,0 +1,16 @@
$(document).ready(function() {
function toggle() {
if ($('input[name="fetch_backend"]:checked').val() != 'html_requests') {
$('#requests-override-options').hide();
$('#webdriver-override-options').show();
} else {
$('#requests-override-options').show();
$('#webdriver-override-options').hide();
}
}
$('input[name="fetch_backend"]').click(function (e) {
toggle();
});
toggle();
});

View File

@@ -1,7 +1,8 @@
#diff-ui {
background: #fff;
padding: 2em;
margin: 1em;
margin-left: 1em;
margin-right: 1em;
border-radius: 5px;
font-size: 11px; }
#diff-ui table {
@@ -70,3 +71,8 @@ td#diff-col div {
/* ignored and triggered? make it obvious error */
.ignored.triggered {
background-color: #ff0000; }
.tab-pane-inner#screenshot {
text-align: center; }
.tab-pane-inner#screenshot img {
max-width: 99%; }

View File

@@ -2,7 +2,8 @@
background: #fff;
padding: 2em;
margin: 1em;
margin-left: 1em;
margin-right: 1em;
border-radius: 5px;
font-size: 11px;
@@ -85,4 +86,11 @@ td#diff-col div {
/* ignored and triggered? make it obvious error */
.ignored.triggered {
background-color: #ff0000;
}
.tab-pane-inner#screenshot {
text-align: center;
img {
max-width: 99%;
}
}

View File

@@ -31,7 +31,7 @@ a.github-link {
section.content {
padding-top: 5em;
padding-bottom: 5em;
padding-bottom: 1em;
flex-direction: column;
display: flex;
align-items: center;
@@ -180,11 +180,20 @@ body:after, body:before {
.messages li.notice {
background: rgba(255, 255, 255, 0.5); }
.messages.with-share-link > *:hover {
cursor: pointer; }
#notification-customisation {
border: 1px solid #ccc;
padding: 1rem;
padding: 0.5rem;
border-radius: 5px; }
#notification-error-log {
border: 1px solid #ccc;
padding: 1rem;
border-radius: 5px;
overflow-wrap: break-word; }
#token-table.pure-table td, #token-table.pure-table th {
font-size: 80%; }
@@ -279,6 +288,11 @@ footer {
padding-bottom: 1em; }
.pure-form .pure-control-group div, .pure-form .pure-group div, .pure-form .pure-controls div {
margin: 0px; }
.pure-form .pure-control-group .checkbox > *, .pure-form .pure-group .checkbox > *, .pure-form .pure-controls .checkbox > * {
display: inline;
vertical-align: middle; }
.pure-form .pure-control-group .checkbox > label, .pure-form .pure-group .checkbox > label, .pure-form .pure-controls .checkbox > label {
padding-left: 5px; }
.pure-form .error input {
background-color: #ffebeb; }
.pure-form ul.errors {
@@ -295,10 +309,10 @@ footer {
font-weight: bold; }
.pure-form textarea {
width: 100%; }
.pure-form ul#fetch_backend {
.pure-form .inline-radio ul {
margin: 0px;
list-style: none; }
.pure-form ul#fetch_backend > li > * {
.pure-form .inline-radio ul li > * {
display: inline-block; }
@media only screen and (max-width: 760px), (min-device-width: 768px) and (max-device-width: 1024px) {
@@ -317,7 +331,7 @@ footer {
right: auto; }
section.content {
padding-top: 110px; }
div.tabs ul li {
div.tabs.collapsable ul li {
display: block;
border-radius: 0px; }
input[type='text'] {
@@ -403,14 +417,17 @@ and also iPads specifically.
padding: 20px;
border-radius: 5px; }
.tab-pane-inner {
padding: 0px; }
.tab-pane-inner:not(:target) {
display: none; }
.tab-pane-inner:target {
display: block; }
.edit-form {
min-width: 70%; }
.edit-form .tab-pane-inner {
padding: 0px; }
.edit-form .tab-pane-inner:not(:target) {
display: none; }
.edit-form .tab-pane-inner:target {
display: block; }
min-width: 70%;
/* so it cant overflow */
max-width: 95%; }
.edit-form .box-wrap {
position: relative; }
.edit-form .inner {
@@ -426,3 +443,8 @@ ul {
padding-left: 1em;
padding-top: 0px;
margin-top: 4px; }
.time-check-widget tr {
display: inline; }
.time-check-widget tr input[type="number"] {
width: 4em; }

View File

@@ -35,7 +35,7 @@ a.github-link {
section.content {
padding-top: 5em;
padding-bottom: 5em;
padding-bottom: 1em;
flex-direction: column;
display: flex;
align-items: center;
@@ -237,14 +237,25 @@ body:after, body:before {
background: rgba(255, 255, 255, .5);
}
}
&.with-share-link {
> *:hover {
cursor:pointer;
}
}
}
#notification-customisation {
border: 1px solid #ccc;
padding: 1rem;
padding: 0.5rem;
border-radius: 5px;
}
#notification-error-log {
border: 1px solid #ccc;
padding: 1rem;
border-radius: 5px;
overflow-wrap: break-word;
}
#token-table {
&.pure-table td, &.pure-table th {
@@ -369,6 +380,15 @@ footer {
div {
margin: 0px;
}
.checkbox {
> * {
display: inline;
vertical-align: middle;
}
> label {
padding-left: 5px;
}
}
}
/* The input fields with errors */
.error {
@@ -398,14 +418,16 @@ footer {
textarea {
width: 100%;
}
ul#fetch_backend {
margin: 0px;
list-style: none;
> li {
> * {
display: inline-block;
.inline-radio {
ul {
margin: 0px;
list-style: none;
li {
> * {
display: inline-block;
}
}
}
}
}
}
@@ -437,7 +459,7 @@ footer {
}
// Make the tabs easier to hit, they will be all nice and horizontal
div.tabs ul li {
div.tabs.collapsable ul li {
display: block;
border-radius: 0px;
}
@@ -573,10 +595,7 @@ $form-edge-padding: 20px;
}
}
.edit-form {
min-width: 70%;
.tab-pane-inner {
.tab-pane-inner {
&:not(:target) {
display: none;
}
@@ -585,7 +604,12 @@ $form-edge-padding: 20px;
}
// doesn't need padding because there's another row of buttons/activity
padding: 0px;
}
}
.edit-form {
min-width: 70%;
/* so it cant overflow */
max-width: 95%;
.box-wrap {
position: relative;
}
@@ -607,4 +631,13 @@ ul {
padding-left: 1em;
padding-top: 0px;
margin-top: 4px;
}
.time-check-widget {
tr {
display: inline;
input[type="number"] {
width: 4em;
}
}
}

View File

@@ -1,3 +1,6 @@
from flask import (
flash
)
import json
import logging
import os
@@ -7,19 +10,21 @@ import uuid as uuid_builder
from copy import deepcopy
from os import mkdir, path, unlink
from threading import Lock
import re
import requests
from changedetectionio.notification import (
default_notification_body,
default_notification_format,
default_notification_title,
)
from . model import App, Watch
# Is there an existing library to ensure some data store (JSON etc) is in sync with CRUD methods?
# Open a github issue if you know something :)
# https://stackoverflow.com/questions/6190468/how-to-trigger-function-on-value-change
class ChangeDetectionStore:
lock = Lock()
# For general updates/writes that can wait a few seconds
needs_write = False
# For when we edit, we should write to disk
needs_write_urgent = False
def __init__(self, datastore_path="/datastore", include_default_watches=True, version_tag="0.0.0"):
# Should only be active for docker
@@ -27,71 +32,14 @@ class ChangeDetectionStore:
self.needs_write = False
self.datastore_path = datastore_path
self.json_store_path = "{}/url-watches.json".format(self.datastore_path)
self.proxy_list = None
self.stop_thread = False
self.__data = {
'note': "Hello! If you change this file manually, please be sure to restart your changedetection.io instance!",
'watching': {},
'settings': {
'headers': {
'User-Agent': 'Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/87.0.4280.66 Safari/537.36',
'Accept': 'text/html,application/xhtml+xml,application/xml;q=0.9,image/avif,image/webp,image/apng,*/*;q=0.8,application/signed-exchange;v=b3;q=0.9',
'Accept-Encoding': 'gzip, deflate', # No support for brotli in python requests yet.
'Accept-Language': 'en-GB,en-US;q=0.9,en;'
},
'requests': {
'timeout': 15, # Default 15 seconds
'minutes_between_check': 3 * 60, # Default 3 hours
'workers': 10 # Number of threads, lower is better for slow connections
},
'application': {
'password': False,
'base_url' : None,
'extract_title_as_title': False,
'fetch_backend': 'html_requests',
'global_ignore_text': [], # List of text to ignore when calculating the comparison checksum
'global_subtractive_selectors': [],
'ignore_whitespace': False,
'notification_urls': [], # Apprise URL list
# Custom notification content
'notification_title': default_notification_title,
'notification_body': default_notification_body,
'notification_format': default_notification_format,
}
}
}
self.__data = App.model()
# Base definition for all watchers
self.generic_definition = {
'url': None,
'tag': None,
'last_checked': 0,
'last_changed': 0,
'paused': False,
'last_viewed': 0, # history key value of the last viewed via the [diff] link
'newest_history_key': "",
'title': None,
# Re #110, so then if this is set to None, we know to use the default value instead
# Requires setting to None on submit if it's the same as the default
'minutes_between_check': None,
'previous_md5': "",
'uuid': str(uuid_builder.uuid4()),
'headers': {}, # Extra headers to send
'body': None,
'method': 'GET',
'history': {}, # Dict of timestamp and output stripped filename
'ignore_text': [], # List of text to ignore when calculating the comparison checksum
# Custom notification content
'notification_urls': [], # List of URLs to add to the notification Queue (Usually AppRise)
'notification_title': default_notification_title,
'notification_body': default_notification_body,
'notification_format': default_notification_format,
'css_filter': "",
'subtractive_selectors': [],
'trigger_text': [], # List of text or regex to wait for until a change is detected
'fetch_backend': None,
'extract_title_as_title': False
}
# deepcopy part of #569 - not sure why it's needed exactly
self.generic_definition = deepcopy(Watch.model())
if path.isfile('changedetectionio/source.txt'):
with open('changedetectionio/source.txt') as f:
@@ -163,6 +111,17 @@ class ChangeDetectionStore:
secret = secrets.token_hex(16)
self.__data['settings']['application']['rss_access_token'] = secret
# Proxy list support - available as a selection in settings when text file is imported
# CSV list
# "name, address", or just "name"
proxy_list_file = "{}/proxies.txt".format(self.datastore_path)
if path.isfile(proxy_list_file):
self.import_proxy_list(proxy_list_file)
# Bump the update version by running updates
self.run_updates()
self.needs_write = True
# Finally start the thread that will manage periodic data saves to JSON
@@ -188,8 +147,16 @@ class ChangeDetectionStore:
self.data['watching'][uuid].update({'last_viewed': int(timestamp)})
self.needs_write = True
def remove_password(self):
self.__data['settings']['application']['password'] = False
self.needs_write = True
def update_watch(self, uuid, update_obj):
# It's possible that the watch could be deleted before update
if not self.__data['watching'].get(uuid):
return
with self.lock:
# In python 3.9 we have the |= dict operator, but that still will lose data on nested structures...
@@ -204,6 +171,17 @@ class ChangeDetectionStore:
self.needs_write = True
@property
def threshold_seconds(self):
seconds = 0
mtable = {'seconds': 1, 'minutes': 60, 'hours': 3600, 'days': 86400, 'weeks': 86400 * 7}
minimum_seconds_recheck_time = int(os.getenv('MINIMUM_SECONDS_RECHECK_TIME', 60))
for m, n in mtable.items():
x = self.__data['settings']['requests']['time_between_check'].get(m)
if x:
seconds += x * n
return max(seconds, minimum_seconds_recheck_time)
@property
def data(self):
has_unviewed = False
@@ -266,7 +244,7 @@ class ChangeDetectionStore:
del self.data['watching'][uuid]
self.needs_write = True
self.needs_write_urgent = True
# Clone a watch by UUID
def clone(self, uuid):
@@ -290,69 +268,64 @@ class ChangeDetectionStore:
return self.data['watching'][uuid].get(val)
# Remove a watchs data but keep the entry (URL etc)
def scrub_watch(self, uuid, limit_timestamp = False):
def scrub_watch(self, uuid):
import pathlib
import hashlib
del_timestamps = []
self.__data['watching'][uuid].update({'history': {}, 'last_checked': 0, 'last_changed': 0, 'newest_history_key': 0, 'previous_md5': False})
self.needs_write_urgent = True
changes_removed = 0
for item in pathlib.Path(self.datastore_path).rglob(uuid+"/*.txt"):
unlink(item)
for timestamp, path in self.data['watching'][uuid]['history'].items():
if not limit_timestamp or (limit_timestamp is not False and int(timestamp) > limit_timestamp):
self.unlink_history_file(path)
del_timestamps.append(timestamp)
changes_removed += 1
if not limit_timestamp:
self.data['watching'][uuid]['last_checked'] = 0
self.data['watching'][uuid]['last_changed'] = 0
self.data['watching'][uuid]['previous_md5'] = ""
for timestamp in del_timestamps:
del self.data['watching'][uuid]['history'][str(timestamp)]
# If there was a limitstamp, we need to reset some meta data about the entry
# This has to happen after we remove the others from the list
if limit_timestamp:
newest_key = self.get_newest_history_key(uuid)
if newest_key:
self.data['watching'][uuid]['last_checked'] = int(newest_key)
# @todo should be the original value if it was less than newest key
self.data['watching'][uuid]['last_changed'] = int(newest_key)
try:
with open(self.data['watching'][uuid]['history'][str(newest_key)], "rb") as fp:
content = fp.read()
self.data['watching'][uuid]['previous_md5'] = hashlib.md5(content).hexdigest()
except (FileNotFoundError, IOError):
self.data['watching'][uuid]['previous_md5'] = ""
pass
self.needs_write = True
return changes_removed
def add_watch(self, url, tag="", extras=None):
def add_watch(self, url, tag="", extras=None, write_to_disk_now=True):
if extras is None:
extras = {}
# In case these are copied across, assume it's a reference and deepcopy()
apply_extras = deepcopy(extras)
# Was it a share link? try to fetch the data
if (url.startswith("https://changedetection.io/share/")):
try:
r = requests.request(method="GET",
url=url,
# So we know to return the JSON instead of the human-friendly "help" page
headers={'App-Guid': self.__data['app_guid']})
res = r.json()
# List of permissible stuff we accept from the wild internet
for k in ['url', 'tag',
'paused', 'title',
'previous_md5', 'headers',
'body', 'method',
'ignore_text', 'css_filter',
'subtractive_selectors', 'trigger_text',
'extract_title_as_title']:
if res.get(k):
apply_extras[k] = res[k]
except Exception as e:
logging.error("Error fetching metadata for shared watch link", url, str(e))
flash("Error fetching metadata for {}".format(url), 'error')
return False
with self.lock:
# @todo use a common generic version of this
new_uuid = str(uuid_builder.uuid4())
_blank = deepcopy(self.generic_definition)
_blank.update({
# #Re 569
# Not sure why deepcopy was needed here, sometimes new watches would appear to already have 'history' set
# I assumed this would instantiate a new object but somehow an existing dict was getting used
new_watch = deepcopy(Watch.model({
'url': url,
'tag': tag
})
}))
# In case these are copied across, assume it's a reference and deepcopy()
apply_extras = deepcopy(extras)
for k in ['uuid', 'history', 'last_checked', 'last_changed', 'newest_history_key', 'previous_md5', 'viewed']:
if k in apply_extras:
del apply_extras[k]
_blank.update(apply_extras)
self.data['watching'][new_uuid] = _blank
new_watch.update(apply_extras)
self.__data['watching'][new_uuid]=new_watch
# Get the directory ready
output_path = "{}/{}".format(self.datastore_path, new_uuid)
@@ -361,7 +334,8 @@ class ChangeDetectionStore:
except FileExistsError:
print(output_path, "already exists.")
self.sync_to_json()
if write_to_disk_now:
self.sync_to_json()
return new_uuid
# Save some text file to the appropriate path and bump the history
@@ -381,9 +355,25 @@ class ChangeDetectionStore:
return fname
def get_screenshot(self, watch_uuid):
output_path = "{}/{}".format(self.datastore_path, watch_uuid)
fname = "{}/last-screenshot.png".format(output_path)
if path.isfile(fname):
return fname
return False
# Save as PNG, PNG is larger but better for doing visual diff in the future
def save_screenshot(self, watch_uuid, screenshot: bytes):
output_path = "{}/{}".format(self.datastore_path, watch_uuid)
fname = "{}/last-screenshot.png".format(output_path)
with open(fname, 'wb') as f:
f.write(screenshot)
f.close()
def sync_to_json(self):
logging.info("Saving JSON..")
print("Saving JSON..")
try:
data = deepcopy(self.__data)
except RuntimeError as e:
@@ -405,6 +395,7 @@ class ChangeDetectionStore:
logging.error("Error writing JSON!! (Main JSON file save was skipped) : %s", str(e))
self.needs_write = False
self.needs_write_urgent = False
# Thread runner, this helps with thread/write issues when there are many operations that want to update the JSON
# by just running periodically in one thread, according to python, dict updates are threadsafe.
@@ -415,14 +406,14 @@ class ChangeDetectionStore:
print("Shutting down datastore thread")
return
if self.needs_write:
if self.needs_write or self.needs_write_urgent:
self.sync_to_json()
# Once per minute is enough, more and it can cause high CPU usage
# better here is to use something like self.app.config.exit.wait(1), but we can't get to 'app' from here
for i in range(30):
time.sleep(2)
if self.stop_thread:
for i in range(120):
time.sleep(0.5)
if self.stop_thread or self.needs_write_urgent:
break
# Go through the datastore path and remove any snapshots that are not mentioned in the index
@@ -438,7 +429,69 @@ class ChangeDetectionStore:
import pathlib
# Only in the sub-directories
for item in pathlib.Path(self.datastore_path).rglob("*/*txt"):
if not str(item) in index:
print ("Removing",item)
unlink(item)
for uuid in self.data['watching']:
for item in pathlib.Path(self.datastore_path).rglob(uuid+"/*.txt"):
if not str(item) in index:
print ("Removing",item)
unlink(item)
def import_proxy_list(self, filename):
import csv
with open(filename, newline='') as f:
reader = csv.reader(f, skipinitialspace=True)
# @todo This loop could be improved
l = []
for row in reader:
if len(row):
if len(row)>=2:
l.append(tuple(row[:2]))
else:
l.append(tuple([row[0], row[0]]))
self.proxy_list = l if len(l) else None
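For illustration, a hypothetical proxies.txt (names and addresses are made up) and how the parsing above turns it into (name, address) tuples; single-column rows reuse the value for both fields.

import csv
import io

sample = "squid-us, http://10.0.0.1:3128\nsocks-de, socks5://10.0.0.2:1080\nhttp://10.0.0.3:3128\n"
proxy_list = []
for row in csv.reader(io.StringIO(sample), skipinitialspace=True):
    if row:
        proxy_list.append(tuple(row[:2]) if len(row) >= 2 else (row[0], row[0]))
print(proxy_list)
# [('squid-us', 'http://10.0.0.1:3128'), ('socks-de', 'socks5://10.0.0.2:1080'),
#  ('http://10.0.0.3:3128', 'http://10.0.0.3:3128')]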
# Run all updates
# IMPORTANT - Each update could be run even when they have a new install and the schema is correct
# So therefore - each `update_n` should be very careful about checking if it needs to actually run
# Probably we should bump the current update schema version with each tag release version?
def run_updates(self):
import inspect
import shutil
updates_available = []
for i, o in inspect.getmembers(self, predicate=inspect.ismethod):
m = re.search(r'update_(\d+)$', i)
if m:
updates_available.append(int(m.group(1)))
updates_available.sort()
for update_n in updates_available:
if update_n > self.__data['settings']['application']['schema_version']:
print ("Applying update_{}".format((update_n)))
# Won't exist on fresh installs
if os.path.exists(self.json_store_path):
shutil.copyfile(self.json_store_path, self.datastore_path+"/url-watches-before-{}.json".format(update_n))
try:
update_method = getattr(self, "update_{}".format(update_n))()
except Exception as e:
print("Error while trying update_{}".format((update_n)))
print(e)
# Don't run any more updates
return
else:
# Bump the version, important
self.__data['settings']['application']['schema_version'] = update_n
# Convert minutes to seconds on settings and each watch
def update_1(self):
if self.data['settings']['requests'].get('minutes_between_check'):
self.data['settings']['requests']['time_between_check']['minutes'] = self.data['settings']['requests']['minutes_between_check']
# Remove the default 'hours' that is set from the model
self.data['settings']['requests']['time_between_check']['hours'] = None
for uuid, watch in self.data['watching'].items():
if 'minutes_between_check' in watch:
# Only upgrade individual watch time if it was set
if watch.get('minutes_between_check', False):
self.data['watching'][uuid]['time_between_check']['minutes'] = watch['minutes_between_check']
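The update mechanism above boils down to: discover every method named update_<n>, sort the numbers, and apply anything newer than the stored schema version, bumping the version after each successful step. A condensed stand-alone sketch (a hypothetical Store class, not the real datastore):

import inspect, re

class Store:
    schema_version = 0

    def update_1(self):
        print("convert minutes_between_check to time_between_check")

    def update_2(self):
        print("a later migration")

    def run_updates(self):
        # Collect every update_<n> method and sort by its number
        available = sorted(
            int(m.group(1))
            for name, _ in inspect.getmembers(self, predicate=inspect.ismethod)
            if (m := re.search(r'update_(\d+)$', name))
        )
        for n in available:
            if n > self.schema_version:
                getattr(self, "update_{}".format(n))()
                self.schema_version = n   # bump only after the update ran without error

Store().run_updates()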

View File

@@ -1,35 +1,39 @@
{% from '_helpers.jinja' import render_field %}
{% macro render_common_settings_form(form, current_base_url) %}
{% macro render_common_settings_form(form, current_base_url, emailprefix) %}
<div class="pure-control-group">
{{ render_field(form.notification_urls, rows=5, placeholder="Examples:
Gitter - gitter://token/room
Office365 - o365://TenantID:AccountEmail/ClientID/ClientSecret/TargetEmail
AWS SNS - sns://AccessKeyID/AccessSecretKey/RegionName/+PhoneNo
SMTPS - mailtos://user:pass@mail.domain.com?to=receivingAddress@example.com")
SMTPS - mailtos://user:pass@mail.domain.com?to=receivingAddress@example.com", class="notification-urls")
}}
<div class="pure-form-message-inline">
<ul>
<li>Use <a target=_new href="https://github.com/caronc/apprise">Apprise URLs</a> for notifications to just about any service! <i><a target=_new href="https://github.com/dgtlmoon/changedetection.io/wiki/Notification-configuration-notes">Please read the notification services wiki here for important configuration notes</a></i>.</li>
<li><code>discord://</code> will silently fail if the total message length is more than 2000 chars.</li>
<li><code>discord://</code> only supports a maximum <strong>2,000 characters</strong> of notification text, including the title.</li>
<li><code>tgram://</code> bots can't send messages to other bots, so you should specify the chat ID of a non-bot user.</li>
<li>Go here for <a href="{{url_for('notification_logs')}}">Notification debug logs</a></li>
<li>Go here for <a href="{{url_for('notification_logs')}}">notification debug logs</a></li>
</ul>
</div>
<br/>
<a id="send-test-notification" class="pure-button button-secondary button-xsmall" style="font-size: 70%">Send test notification</a>
{% if emailprefix %}
<a id="add-email-helper" class="pure-button button-secondary button-xsmall" style="font-size: 70%">Add email</a>
{% endif %}
</div>
<div id="notification-customisation">
<div id="notification-customisation" class="pure-control-group">
<div class="pure-control-group">
{{ render_field(form.notification_title, class="m-d") }}
{{ render_field(form.notification_title, class="m-d notification-title") }}
<span class="pure-form-message-inline">Title for all notifications</span>
</div>
<div class="pure-control-group">
{{ render_field(form.notification_body , rows=5) }}
{{ render_field(form.notification_body , rows=5, class="notification-body") }}
<span class="pure-form-message-inline">Body for all notifications</span>
</div>
<div class="pure-control-group">
{{ render_field(form.notification_format , rows=5) }}
{{ render_field(form.notification_format , rows=5, class="notification-format") }}
<span class="pure-form-message-inline">Format for all notifications</span>
</div>
<div class="pure-controls">
@@ -93,7 +97,4 @@
</span>
</div>
</div>
<div class="pure-control-group">
{{ render_field(form.trigger_check) }}
</div>
{% endmacro %}
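The notification URLs the macro above accepts are Apprise-style URLs. Very roughly, delivery looks like the sketch below (assuming the apprise Python package that those URLs come from; the endpoint is made up):

import apprise

a = apprise.Apprise()
a.add("json://localhost:5005/")   # hypothetical endpoint; any Apprise URL (tgram://, mailtos://, ...) fits here
a.notify(title="Change detected", body="The watched page changed")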

View File

@@ -1,3 +1,30 @@
{% macro render_field(field) %}
<div {% if field.errors %} class="error" {% endif %}>{{ field(**kwargs)|safe }}
<div {% if field.errors %} class="error" {% endif %}>{{ field.label }}</div>
{% if field.errors %}
<ul class=errors>
{% for error in field.errors %}
<li>{{ error }}</li>
{% endfor %}
</ul>
{% endif %}
</div>
{% endmacro %}
{% macro render_checkbox_field(field) %}
<div class="checkbox {% if field.errors %} error {% endif %}">
{{ field(**kwargs)|safe }} {{ field.label }}
{% if field.errors %}
<ul class=errors>
{% for error in field.errors %}
<li>{{ error }}</li>
{% endfor %}
</ul>
{% endif %}
</div>
{% endmacro %}
{% macro render_field(field) %}
<div {% if field.errors %} class="error" {% endif %}>{{ field.label }}</div>
<div {% if field.errors %} class="error" {% endif %}>{{ field(**kwargs)|safe }}

View File

@@ -5,6 +5,7 @@
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<meta name="description" content="Self hosted website change detection.">
<title>Change Detection{{extra_title}}</title>
<link rel="alternate" type="application/rss+xml" title="Changedetection.io » Feed{% if active_tag %}- {{active_tag}}{% endif %}" href="{{ url_for('rss', tag=active_tag , token=app_rss_token)}}" />
<link rel="stylesheet" href="{{url_for('static_content', group='styles', filename='pure-min.css')}}">
<link rel="stylesheet" href="{{url_for('static_content', group='styles', filename='styles.css')}}">
{% if extra_stylesheets %}
@@ -17,6 +18,8 @@
background-image: url({{url_for('static_content', group='images', filename='gradient-border.png')}});
}
</style>
<script type="text/javascript" src="{{url_for('static_content', group='js', filename='jquery-3.6.0.min.js')}}"></script>
</head>
<body>
@@ -91,6 +94,13 @@
</ul>
{% endif %}
{% endwith %}
{% if session['share-link'] %}
<ul class="messages with-share-link">
<li class="message">Share this link: <span id="share-link">{{ session['share-link'] }}</span> <img style="height: 1em;display:inline-block;" src="{{url_for('static_content', group='images', filename='copy.svg')}}" /></li>
</ul>
{% endif %}
{% block content %}
{% endblock %}

View File

@@ -1,7 +1,6 @@
{% extends 'base.html' %}
{% block content %}
<div id="settings">
<h1>Differences</h1>
<form class="pure-form " action="" method="GET">
@@ -35,21 +34,45 @@
<div id="diff-jump">
<a onclick="next_diff();">Jump</a>
</div>
<script type="text/javascript" src="{{url_for('static_content', group='js', filename='tabs.js')}}" defer></script>
<div class="tabs">
<ul>
<li class="tab" id="default-tab"><a href="#text">Text</a></li>
{% if screenshot %}
<li class="tab"><a href="#screenshot">Current screenshot</a></li>
{% endif %}
</ul>
</div>
<div id="diff-ui">
<div class="tip">Pro-tip: Use <strong>show current snapshot</strong> tab to visualise what will be ignored.</div>
<table>
<tbody>
<tr>
<!-- just proof of concept copied straight from github.com/kpdecker/jsdiff -->
<td id="a" style="display: none;">{{previous}}</td>
<td id="b" style="display: none;">{{newest}}</td>
<td id="diff-col">
<span id="result"></span>
</td>
</tr>
</tbody>
</table>
Diff algorithm from the amazing <a href="https://github.com/kpdecker/jsdiff">github.com/kpdecker/jsdiff</a>
<div class="tab-pane-inner" id="text">
<div class="tip">Pro-tip: Use <strong>show current snapshot</strong> tab to visualise what will be ignored.
</div>
<table>
<tbody>
<tr>
<!-- just proof of concept copied straight from github.com/kpdecker/jsdiff -->
<td id="a" style="display: none;">{{previous}}</td>
<td id="b" style="display: none;">{{newest}}</td>
<td id="diff-col">
<span id="result"></span>
</td>
</tr>
</tbody>
</table>
Diff algorithm from the amazing <a href="https://github.com/kpdecker/jsdiff">github.com/kpdecker/jsdiff</a>
</div>
{% if screenshot %}
<div class="tab-pane-inner" id="screenshot">
<p>
<i>For now, only the most recent screenshot is saved and displayed.</i>
</p>
<img src="{{url_for('static_content', group='screenshot', filename=uuid)}}">
</div>
{% endif %}
</div>

View File

@@ -1,13 +1,20 @@
{% extends 'base.html' %}
{% block content %}
{% from '_helpers.jinja' import render_field %}
{% from '_helpers.jinja' import render_button %}
{% from '_helpers.jinja' import render_field, render_checkbox_field, render_button %}
{% from '_common_fields.jinja' import render_common_settings_form %}
<script type="text/javascript" src="{{url_for('static_content', group='js', filename='tabs.js')}}" defer></script>
<script>
const notification_base_url="{{url_for('ajax_callback_send_notification_test')}}";
{% if emailprefix %}
const email_notification_prefix=JSON.parse('{{ emailprefix|tojson }}');
{% endif %}
</script>
<script type="text/javascript" src="{{url_for('static_content', group='js', filename='watch-settings.js')}}" defer></script>
<script type="text/javascript" src="{{url_for('static_content', group='js', filename='notifications.js')}}" defer></script>
<div class="edit-form monospaced-textarea">
<div class="tabs">
<div class="tabs collapsable">
<ul>
<li class="tab" id="default-tab"><a href="#general">General</a></li>
<li class="tab"><a href="#request">Request</a></li>
@@ -35,8 +42,8 @@
<span class="pure-form-message-inline">Organisational tag/group name used in the main listing page</span>
</div>
<div class="pure-control-group">
{{ render_field(form.minutes_between_check) }}
{% if using_default_minutes %}
{{ render_field(form.time_between_check, class="time-check-widget") }}
{% if has_empty_checktime %}
<span class="pure-form-message-inline">Currently using the <a
href="{{ url_for('settings_page', uuid=uuid) }}">default global settings</a>, change to another value if you want to be specific.</span>
{% else %}
@@ -45,26 +52,46 @@
{% endif %}
</div>
<div class="pure-control-group">
{{ render_field(form.extract_title_as_title) }}
{{ render_checkbox_field(form.extract_title_as_title) }}
</div>
</fieldset>
</div>
<div class="tab-pane-inner" id="request">
<div class="pure-control-group">
{{ render_field(form.fetch_backend) }}
<div class="pure-control-group inline-radio">
{{ render_field(form.fetch_backend, class="fetch-backend") }}
<span class="pure-form-message-inline">
<p>Use the <strong>Basic</strong> method (default) where your watched site doesn't need Javascript to render.</p>
<p>The <strong>Chrome/Javascript</strong> method requires a network connection to a running WebDriver+Chrome server, set by the ENV var 'WEBDRIVER_URL'. </p>
</span>
</div>
<hr/>
<fieldset class="pure-group">
<span class="pure-form-message-inline">
{% if form.proxy %}
<div class="pure-control-group inline-radio">
{{ render_field(form.proxy, class="fetch-backend-proxy") }}
<span class="pure-form-message-inline">
Choose a proxy for this watch
</span>
</div>
{% endif %}
<fieldset class="pure-group" id="webdriver-override-options">
<div class="pure-form-message-inline">
<strong>If you're having trouble waiting for the page to be fully rendered (text missing etc), try increasing the 'wait' time here.</strong>
<br/>
This will wait <i>n</i> seconds before extracting the text.
</div>
<div class="pure-control-group">
{{ render_field(form.webdriver_delay) }}
</div>
{% if using_global_webdriver_wait %}
<div class="pure-form-message-inline">
<strong>Using the current global default settings</strong>
</div>
{% endif %}
</fieldset>
<fieldset class="pure-group" id="requests-override-options">
<div class="pure-form-message-inline">
<strong>Request override is currently only used by the <i>Basic fast Plaintext/HTTP Client</i> method.</strong>
</span>
</div>
<div class="pure-control-group">
{{ render_field(form.method) }}
</div>
@@ -82,7 +109,7 @@ User-Agent: wonderbra 1.0") }}
}") }}
</div>
<div>
{{ render_field(form.ignore_status_codes) }}
{{ render_checkbox_field(form.ignore_status_codes) }}
</div>
</fieldset>
<br/>
@@ -92,7 +119,7 @@ User-Agent: wonderbra 1.0") }}
<strong>Note: <i>These settings override the global settings for this watch.</i></strong>
<fieldset>
<div class="field-group">
{{ render_common_settings_form(form, current_base_url) }}
{{ render_common_settings_form(form, current_base_url, emailprefix) }}
</div>
</fieldset>
</div>
@@ -119,7 +146,7 @@ User-Agent: wonderbra 1.0") }}
<li>CSS - Limit text to this CSS rule, only text matching this CSS rule is included.</li>
<li>JSON - Limit text to this JSON rule, using <a href="https://pypi.org/project/jsonpath-ng/">JSONPath</a>, prefix with <code>"json:"</code>, use <code>json:$</code> to force re-formatting if required, <a
href="https://jsonpath.com/" target="new">test your JSONPath here</a></li>
<li>XPath - Limit text to this XPath rule, simply start with a forward-slash, example <code>//*[contains(@class, 'sametext')]</code>, <a
<li>XPath - Limit text to this XPath rule, simply start with a forward-slash, example <code>//*[contains(@class, 'sametext')]</code> or <code>xpath://*[contains(@class, 'sametext')]</code>, <a
href="http://xpather.com/" target="new">test your XPath here</a></li>
</ul>
Please be sure that you thoroughly understand how to write CSS or JSONPath, XPath selector rules before filing an issue on GitHub! <a
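To make the filter syntax above concrete, here is a small stand-alone sketch (assuming the jsonpath-ng and lxml packages, with made-up documents) of what a json: and an xpath: rule each select:

import json
from jsonpath_ng import parse
from lxml import html

# json: rule - select values with a JSONPath expression
doc = json.loads('{"items": [{"price": 10}, {"price": 12}]}')
print([m.value for m in parse("items[*].price").find(doc)])        # [10, 12]

# xpath: rule - select elements matching an XPath expression
tree = html.fromstring("<div><p class='sametext'>hello</p></div>")
print(tree.xpath("//*[contains(@class, 'sametext')]/text()"))      # ['hello']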

View File

@@ -1,30 +1,86 @@
{% extends 'base.html' %}
{% block content %}
<div class="edit-form">
<div class="inner">
<script type="text/javascript" src="{{url_for('static_content', group='js', filename='tabs.js')}}" defer></script>
<div class="edit-form monospaced-textarea">
<div class="tabs collapsable">
<ul>
<li class="tab" id="default-tab"><a href="#url-list">URL List</a></li>
<li class="tab"><a href="#distill-io">Distill.io</a></li>
</ul>
</div>
<div class="box-wrap inner">
<form class="pure-form pure-form-aligned" action="{{url_for('import_page')}}" method="POST">
<input type="hidden" name="csrf_token" value="{{ csrf_token() }}"/>
<fieldset class="pure-group">
<legend>
Enter one URL per line, and optionally add tags for each URL after a space, delineated by comma (,):
<br>
<code>https://example.com tag1, tag2, last tag</code>
<br>
URLs which do not pass validation will stay in the textarea.
</legend>
<div class="tab-pane-inner" id="url-list">
<fieldset class="pure-group">
<legend>
Enter one URL per line, and optionally add tags for each URL after a space, delineated by comma
(,):
<br>
<code>https://example.com tag1, tag2, last tag</code>
<br>
URLs which do not pass validation will stay in the textarea.
</legend>
<textarea name="urls" class="pure-input-1-2" placeholder="https://"
style="width: 100%;
<textarea name="urls" class="pure-input-1-2" placeholder="https://"
style="width: 100%;
font-family:monospace;
white-space: pre;
overflow-wrap: normal;
overflow-x: scroll;" rows="25">{{ remaining }}</textarea>
</fieldset>
overflow-x: scroll;" rows="25">{{ import_url_list_remaining }}</textarea>
</fieldset>
</div>
<div class="tab-pane-inner" id="distill-io">
<fieldset class="pure-group">
<legend>
Copy and paste your Distill.io watch 'export' file; this should be a JSON file.<br/>
This is <i>experimental</i>; supported fields are <code>name</code>, <code>uri</code>, <code>tags</code>, <code>config:selections</code>, the rest (including <code>schedule</code>) are ignored.
<br/>
<p>
How to export? <a href="https://distill.io/docs/web-monitor/how-export-and-import-monitors/">https://distill.io/docs/web-monitor/how-export-and-import-monitors/</a><br/>
Be sure to set your default fetcher to Chrome if required.<br/>
</p>
</legend>
<textarea name="distill-io" class="pure-input-1-2" style="width: 100%;
font-family:monospace;
white-space: pre;
overflow-wrap: normal;
overflow-x: scroll;" placeholder="Example Distill.io JSON export file
{
&quot;client&quot;: {
&quot;local&quot;: 1
},
&quot;data&quot;: [
{
&quot;name&quot;: &quot;Unraid | News&quot;,
&quot;uri&quot;: &quot;https://unraid.net/blog&quot;,
&quot;config&quot;: &quot;{\&quot;selections\&quot;:[{\&quot;frames\&quot;:[{\&quot;index\&quot;:0,\&quot;excludes\&quot;:[],\&quot;includes\&quot;:[{\&quot;type\&quot;:\&quot;xpath\&quot;,\&quot;expr\&quot;:\&quot;(//div[@id='App']/div[contains(@class,'flex')]/main[contains(@class,'relative')]/section[contains(@class,'relative')]/div[@class='container']/div[contains(@class,'flex')]/div[contains(@class,'w-full')])[1]\&quot;}]}],\&quot;dynamic\&quot;:true,\&quot;delay\&quot;:2}],\&quot;ignoreEmptyText\&quot;:true,\&quot;includeStyle\&quot;:false,\&quot;dataAttr\&quot;:\&quot;text\&quot;}&quot;,
&quot;tags&quot;: [],
&quot;content_type&quot;: 2,
&quot;state&quot;: 40,
&quot;schedule&quot;: &quot;{\&quot;type\&quot;:\&quot;INTERVAL\&quot;,\&quot;params\&quot;:{\&quot;interval\&quot;:4447}}&quot;,
&quot;ts&quot;: &quot;2022-03-27T15:51:15.667Z&quot;
}
]
}
" rows="25">{{ original_distill_json }}</textarea>
</fieldset>
</div>
<button type="submit" class="pure-button pure-input-1-2 pure-button-primary">Import</button>
</form>
</div>
</div>
</div>
{% endblock %}
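As a rough illustration of the supported fields described above (this is not the actual importer, and the filename is hypothetical), reading name, uri, tags and config:selections out of such an export looks like:

import json

with open("distill-export.json") as f:        # hypothetical filename
    export = json.load(f)

for entry in export.get("data", []):
    config = json.loads(entry.get("config", "{}"))   # 'config' is itself a JSON-encoded string
    selections = config.get("selections", [])
    print(entry.get("name"), entry.get("uri"), entry.get("tags"), len(selections))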

View File

@@ -5,7 +5,7 @@
<div class="inner">
<h4 style="margin-top: 0px;">The following issues were detected when sending notifications</h4>
<div id="notification-customisation">
<div id="notification-error-log">
<ul style="font-size: 80%; margin:0px; padding: 0 0 0 7px">
{% for log in logs|reverse %}
<li>{{log}}</li>

View File

@@ -6,18 +6,40 @@
<h1>Current - {{watch.last_checked|format_timestamp_timeago}}</h1>
</div>
<script type="text/javascript" src="{{url_for('static_content', group='js', filename='tabs.js')}}" defer></script>
<div class="tabs">
<ul>
<li class="tab" id="default-tab"><a href="#text">Text</a></li>
{% if screenshot %}
<li class="tab"><a href="#screenshot">Current screenshot</a></li>
{% endif %}
</ul>
</div>
<div id="diff-ui">
<span class="ignored">Grey lines are ignored</span> <span class="triggered">Blue lines are triggers</span>
<table>
<tbody>
<tr>
<td id="diff-col">
<div class="tab-pane-inner" id="text">
<span class="ignored">Grey lines are ignored</span> <span class="triggered">Blue lines are triggers</span>
<table>
<tbody>
<tr>
<td id="diff-col">
{% for row in content %}
<div class="{{row.classes}}">{{row.line}}</div>
{% endfor %}
</td>
</tr>
</tbody>
</table>
</td>
</tr>
</tbody>
</table>
</div>
{% if screenshot %}
<div class="tab-pane-inner" id="screenshot">
<p>
<i>For now, only the most recent screenshot is saved and displayed.</i>
</p>
<img src="{{url_for('static_content', group='screenshot', filename=uuid)}}">
</div>
{% endif %}
</div>
{% endblock %}

View File

@@ -7,7 +7,7 @@
<input type="hidden" name="csrf_token" value="{{ csrf_token() }}"/>
<fieldset>
<div class="pure-control-group">
This will remove all version snapshots/data, but keep your list of URLs. <br/>
This will remove ALL version snapshots/data, but keep your list of URLs. <br/>
You may like to use the <strong>BACKUP</strong> link first.<br/>
</div>
<br/>
@@ -17,12 +17,6 @@
<span class="pure-form-message-inline">Type in the word <strong>scrub</strong> to confirm that you understand!</span>
</div>
<br/>
<div class="pure-control-group">
<label for="confirmtext">Optional: Limit deletion of snapshots to snapshots <i>newer</i> than date/time</label>
<input type="datetime-local" id="limit_date" name="limit_date" />
<span class="pure-form-message-inline">dd/mm/yyyy hh:mm (24 hour format)</span>
</div>
<br/>
<div class="pure-control-group">
<button type="submit" class="pure-button pure-button-primary">Scrub!</button>
</div>

View File

@@ -1,14 +1,20 @@
{% extends 'base.html' %}
{% block content %}
{% from '_helpers.jinja' import render_field, render_button %}
{% from '_helpers.jinja' import render_field, render_checkbox_field, render_button %}
{% from '_common_fields.jinja' import render_common_settings_form %}
<script type="text/javascript" src="{{url_for('static_content', group='js', filename='settings.js')}}" defer></script>
<script>
const notification_base_url="{{url_for('ajax_callback_send_notification_test')}}";
{% if emailprefix %}
const email_notification_prefix=JSON.parse('{{emailprefix|tojson}}');
{% endif %}
</script>
<script type="text/javascript" src="{{url_for('static_content', group='js', filename='tabs.js')}}" defer></script>
<script type="text/javascript" src="{{url_for('static_content', group='js', filename='notifications.js')}}" defer></script>
<script type="text/javascript" src="{{url_for('static_content', group='js', filename='global-settings.js')}}" defer></script>
<div class="edit-form">
<div class="tabs">
<div class="tabs collapsable">
<ul>
<li class="tab" id="default-tab"><a href="#general">General</a></li>
<li class="tab"><a href="#notifications">Notifications</a></li>
@@ -22,15 +28,15 @@
<div class="tab-pane-inner" id="general">
<fieldset>
<div class="pure-control-group">
{{ render_field(form.minutes_between_check) }}
{{ render_field(form.requests.form.time_between_check, class="time-check-widget") }}
<span class="pure-form-message-inline">Default time for all watches, when the watch does not have a specific time setting.</span>
</div>
<div class="pure-control-group">
{% if not hide_remove_pass %}
{% if current_user.is_authenticated %}
{{ render_button(form.removepassword_button) }}
{{ render_button(form.application.form.removepassword_button) }}
{% else %}
{{ render_field(form.password) }}
{{ render_field(form.application.form.password) }}
<span class="pure-form-message-inline">Password protection for your changedetection.io application.</span>
{% endif %}
{% else %}
@@ -38,7 +44,7 @@
{% endif %}
</div>
<div class="pure-control-group">
{{ render_field(form.base_url, placeholder="http://yoursite.com:5000/",
{{ render_field(form.application.form.base_url, placeholder="http://yoursite.com:5000/",
class="m-d") }}
<span class="pure-form-message-inline">
Base URL used for the {base_url} token in notifications and RSS links.<br/>Default value is the ENV var 'BASE_URL' (Currently "{{current_base_url}}"),
@@ -47,44 +53,76 @@
</div>
<div class="pure-control-group">
{{ render_field(form.extract_title_as_title) }}
{{ render_checkbox_field(form.application.form.extract_title_as_title) }}
<span class="pure-form-message-inline">Note: This will automatically apply to all existing watches.</span>
</div>
<div class="pure-control-group">
{{ render_checkbox_field(form.application.form.real_browser_save_screenshot) }}
<span class="pure-form-message-inline">When using a Chrome browser, a screenshot from the last check will be available on the Diff page</span>
</div>
<div class="pure-control-group">
{{ render_checkbox_field(form.application.form.empty_pages_are_a_change) }}
<span class="pure-form-message-inline">When a page contains HTML, but no renderable text appears (empty page), is this considered a change?</span>
</div>
{% if form.requests.proxy %}
<div class="pure-control-group inline-radio">
{{ render_field(form.requests.form.proxy, class="fetch-backend-proxy") }}
<span class="pure-form-message-inline">
Choose a default proxy for all watches
</span>
</div>
{% endif %}
</fieldset>
</div>
<div class="tab-pane-inner" id="notifications">
<fieldset>
<div class="field-group">
{{ render_common_settings_form(form, current_base_url) }}
{{ render_common_settings_form(form.application.form, current_base_url, emailprefix) }}
</div>
</fieldset>
<a href="{{url_for('notification_logs')}}">Notification debug logs</a>
</div>
<div class="tab-pane-inner" id="fetching">
<div class="pure-control-group">
{{ render_field(form.fetch_backend) }}
<div class="pure-control-group inline-radio">
{{ render_field(form.application.form.fetch_backend, class="fetch-backend") }}
<span class="pure-form-message-inline">
<p>Use the <strong>Basic</strong> method (default) where your watched sites don't need Javascript to render.</p>
<p>The <strong>Chrome/Javascript</strong> method requires a network connection to a running WebDriver+Chrome server, set by the ENV var 'WEBDRIVER_URL'. </p>
</span>
</div>
<fieldset class="pure-group" id="webdriver-override-options">
<div class="pure-form-message-inline">
<strong>If you're having trouble waiting for the page to be fully rendered (text missing etc), try increasing the 'wait' time here.</strong>
<br/>
This will wait <i>n</i> seconds before extracting the text.
</div>
<div class="pure-control-group">
{{ render_field(form.application.form.webdriver_delay) }}
</div>
</fieldset>
</div>
<div class="tab-pane-inner" id="filters">
<fieldset class="pure-group">
{{ render_field(form.ignore_whitespace) }}
{{ render_checkbox_field(form.application.form.ignore_whitespace) }}
<span class="pure-form-message-inline">Ignore whitespace, tabs and new-lines/line-feeds when considering if a change was detected.<br/>
<i>Note:</i> Changing this will change the status of your existing watches, possibily trigger alerts etc.
<i>Note:</i> Changing this will change the status of your existing watches, possibly trigger alerts etc.
</span>
</fieldset>
<fieldset class="pure-group">
{{ render_checkbox_field(form.application.form.render_anchor_tag_content) }}
<span class="pure-form-message-inline">Render anchor tag content, default disabled, when enabled renders links as <code>(link text)[https://somesite.com]</code>
<br/>
<i>Note:</i> Changing this could affect the content of your existing watches, possibly trigger alerts etc.
</span>
</fieldset>
<fieldset class="pure-group">
{{ render_field(form.global_subtractive_selectors, rows=5, placeholder="header
{{ render_field(form.application.form.global_subtractive_selectors, rows=5, placeholder="header
footer
nav
.stockticker") }}
@@ -96,7 +134,7 @@ nav
</span>
</fieldset>
<fieldset class="pure-group">
{{ render_field(form.global_ignore_text, rows=5, placeholder="Some text to ignore in a line
{{ render_field(form.application.form.global_ignore_text, rows=5, placeholder="Some text to ignore in a line
/some.regex\d{2}/ for case-INsensitive regex
") }}
<span class="pure-form-message-inline">Note: This is applied globally in addition to the per-watch rules.</span><br/>

View File

@@ -1,7 +1,8 @@
{% extends 'base.html' %}
{% block content %}
{% from '_helpers.jinja' import render_simple_field %}
<script type="text/javascript" src="{{url_for('static_content', group='js', filename='jquery-3.6.0.min.js')}}"></script>
<script type="text/javascript" src="{{url_for('static_content', group='js', filename='watch-overview.js')}}" defer></script>
<div class="box">
<form class="pure-form" action="{{ url_for('api_watch_add') }}" method="POST" id="new-watch-form">
@@ -9,11 +10,10 @@
<fieldset>
<legend>Add a new change detection watch</legend>
{{ render_simple_field(form.url, placeholder="https://...", required=true) }}
{{ render_simple_field(form.tag, value=active_tag if active_tag else '', placeholder="tag") }}
{{ render_simple_field(form.tag, value=active_tag if active_tag else '', placeholder="watch group") }}
<button type="submit" class="pure-button pure-button-primary">Watch</button>
</fieldset>
<!-- add extra stuff, like do a http POST and send headers -->
<!-- user/pass r = requests.get('https://api.github.com/user', auth=('user', 'pass')) -->
<span style="color:#eee; font-size: 80%;"><img style="height: 1em;display:inline-block;" src="{{url_for('static_content', group='images', filename='spread.svg')}}" /> Tip: You can also add 'shared' watches. <a href="https://github.com/dgtlmoon/changedetection.io/wiki/Sharing-a-Watch">More info</a></a></span>
</form>
<div>
<a href="{{url_for('index')}}" class="pure-button button-tag {{'active' if not active_tag }}">All</a>
@@ -45,12 +45,15 @@
{% if watch.last_error is defined and watch.last_error != False %}error{% endif %}
{% if watch.last_notification_error is defined and watch.last_notification_error != False %}error{% endif %}
{% if watch.paused is defined and watch.paused != False %}paused{% endif %}
{% if watch.newest_history_key| int > watch.last_viewed| int %}unviewed{% endif %}">
{% if watch.newest_history_key| int > watch.last_viewed| int %}unviewed{% endif %}
{% if watch.uuid in queued_uuids %}queued{% endif %}">
<td class="inline">{{ loop.index }}</td>
<td class="inline paused-state state-{{watch.paused}}"><a href="{{url_for('index', pause=watch.uuid, tag=active_tag)}}"><img src="{{url_for('static_content', group='images', filename='pause.svg')}}" alt="Pause" title="Pause"/></a></td>
<td class="title-col inline">{{watch.title if watch.title is not none and watch.title|length > 0 else watch.url}}
<a class="external" target="_blank" rel="noopener" href="{{ watch.url }}"></a>
<a class="external" target="_blank" rel="noopener" href="{{ watch.url.replace('source:','') }}"></a>
<a href="{{url_for('api_share_put_watch', uuid=watch.uuid)}}"><img style="height: 1em;display:inline-block;" src="{{url_for('static_content', group='images', filename='spread.svg')}}" /></a>
{%if watch.fetch_backend == "html_webdriver" %}<img style="height: 1em; display:inline-block;" src="{{url_for('static_content', group='images', filename='Google-Chrome-icon.png')}}" />{% endif %}
{% if watch.last_error is defined and watch.last_error != False %}
@@ -71,11 +74,11 @@
{% endif %}
</td>
<td>
<a href="{{ url_for('api_watch_checknow', uuid=watch.uuid, tag=request.args.get('tag')) }}"
class="pure-button button-small pure-button-primary">Recheck</a>
<a {% if watch.uuid in queued_uuids %}disabled="true"{% endif %} href="{{ url_for('api_watch_checknow', uuid=watch.uuid, tag=request.args.get('tag')) }}"
class="recheck pure-button button-small pure-button-primary">{% if watch.uuid in queued_uuids %}Queued{% else %}Recheck{% endif %}</a>
<a href="{{ url_for('edit_page', uuid=watch.uuid)}}" class="pure-button button-small pure-button-primary">Edit</a>
{% if watch.history|length >= 2 %}
<a href="{{ url_for('diff_history_page', uuid=watch.uuid) }}" target="{{watch.uuid}}" class="pure-button button-small pure-button-primary">Diff</a>
<a href="{{ url_for('diff_history_page', uuid=watch.uuid) }}" target="{{watch.uuid}}" class="pure-button button-small pure-button-primary diff-link">Diff</a>
{% else %}
{% if watch.history|length == 1 %}
<a href="{{ url_for('preview_page', uuid=watch.uuid)}}" target="{{watch.uuid}}" class="pure-button button-small pure-button-primary">Preview</a>

View File

@@ -16,6 +16,7 @@ def cleanup(datastore_path):
# Unlink test output files
files = ['output.txt',
'url-watches.json',
'secret.txt',
'notification.txt',
'count.txt',
'endpoint-content.txt'

View File

@@ -1,5 +1,5 @@
from flask import url_for
from . util import live_server_setup
def test_check_access_control(app, client):
# Still doesnt work, but this is closer.
@@ -12,9 +12,9 @@ def test_check_access_control(app, client):
# Enable password check.
res = c.post(
url_for("settings_page"),
data={"password": "foobar",
"minutes_between_check": 180,
'fetch_backend': "html_requests"},
data={"application-password": "foobar",
"requests-time_between_check-minutes": 180,
'application-fetch_backend': "html_requests"},
follow_redirects=True
)
@@ -46,77 +46,34 @@ def test_check_access_control(app, client):
assert b"BACKUP" in res.data
assert b"IMPORT" in res.data
assert b"LOG OUT" in res.data
assert b"minutes_between_check" in res.data
assert b"time_between_check-minutes" in res.data
assert b"fetch_backend" in res.data
##################################################
# Remove password button, and check that it worked
##################################################
res = c.post(
url_for("settings_page"),
data={
"minutes_between_check": 180,
"tag": "",
"headers": "",
"fetch_backend": "html_webdriver",
"removepassword_button": "Remove password"
"requests-time_between_check-minutes": 180,
"application-fetch_backend": "html_webdriver",
"application-removepassword_button": "Remove password"
},
follow_redirects=True,
)
assert b"Password protection removed." in res.data
assert b"LOG OUT" not in res.data
# There was a bug where saving the settings form would submit a blank password
def test_check_access_control_no_blank_password(app, client):
# Still doesnt work, but this is closer.
with app.test_client() as c:
# Check we dont have any password protection enabled yet.
res = c.get(url_for("settings_page"))
assert b"Remove password" not in res.data
# Enable password check.
############################################################
# Be sure a blank password doesnt setup password protection
############################################################
res = c.post(
url_for("settings_page"),
data={"password": "",
"minutes_between_check": 180,
'fetch_backend': "html_requests"},
data={"application-password": "",
"requests-time_between_check-minutes": 180,
'application-fetch_backend': "html_requests"},
follow_redirects=True
)
assert b"Password protection enabled." not in res.data
assert b"Login" not in res.data
assert b"Password protection enabled" not in res.data
# There was a bug where saving the settings form would submit a blank password
def test_check_access_no_remote_access_to_remove_password(app, client):
# Still doesnt work, but this is closer.
with app.test_client() as c:
# Check we dont have any password protection enabled yet.
res = c.get(url_for("settings_page"))
assert b"Remove password" not in res.data
# Enable password check.
res = c.post(
url_for("settings_page"),
data={"password": "password",
"minutes_between_check": 180,
'fetch_backend': "html_requests"},
follow_redirects=True
)
assert b"Password protection enabled." in res.data
assert b"Login" in res.data
res = c.post(
url_for("settings_page"),
data={
"minutes_between_check": 180,
"tag": "",
"headers": "",
"fetch_backend": "html_webdriver",
"removepassword_button": "Remove password"
},
follow_redirects=True,
)
assert b"Password protection removed." not in res.data
res = c.get(url_for("index"),
follow_redirects=True)
assert b"watch-table-wrapper" not in res.data

View File

@@ -26,7 +26,8 @@ def test_snapshot_api_detects_change(client, live_server):
time.sleep(1)
# Add our URL to the import page
test_url = url_for('test_endpoint', content_type="text/plain", _external=True)
test_url = url_for('test_endpoint', content_type="text/plain",
_external=True)
res = client.post(
url_for("import_page"),
data={"urls": test_url},

View File

@@ -50,6 +50,14 @@ def test_check_basic_change_detection_functionality(client, live_server):
#####################
# Check HTML conversion was detected and worked
res = client.get(
url_for("preview_page", uuid="first"),
follow_redirects=True
)
# Check this class does not appear (that we didnt see the actual source)
assert b'foobar-detection' not in res.data
# Make a change
set_modified_response()
@@ -70,6 +78,11 @@ def test_check_basic_change_detection_functionality(client, live_server):
res = client.get(url_for("rss"))
expected_url = url_for('test_endpoint', _external=True)
assert b'<rss' in res.data
# re #16 should have the diff in here too
assert b'(into ) which has this one new line' in res.data
assert b'CDATA' in res.data
assert expected_url.encode('utf-8') in res.data
# Following the 'diff' link, it should no longer display as 'unviewed' even after we recheck it a few times
@@ -96,7 +109,7 @@ def test_check_basic_change_detection_functionality(client, live_server):
# Enable auto pickup of <title> in settings
res = client.post(
url_for("settings_page"),
data={"extract_title_as_title": "1", "minutes_between_check": 180, 'fetch_backend': "html_requests"},
data={"application-extract_title_as_title": "1", "requests-time_between_check-minutes": 180, 'application-fetch_backend': "html_requests"},
follow_redirects=True
)

View File

@@ -0,0 +1,38 @@
#!/usr/bin/python3
"""Test suite for the method to extract text from an html string"""
from ..html_tools import html_to_text
def test_html_to_text_func():
test_html = """<html>
<body>
Some initial text</br>
<p>Which is across multiple lines</p>
<a href="/first_link"> More Text </a>
</br>
So let's see what happens. </br>
<a href="second_link.com"> Even More Text </a>
</body>
</html>
"""
# extract text, with 'render_anchor_tag_content' set to False
text_content = html_to_text(test_html, render_anchor_tag_content=False)
no_links_text = \
"Some initial text\n\nWhich is across multiple " \
"lines\n\nMore Text So let's see what happens. Even More Text"
# check that no links are in the extracted text
assert text_content == no_links_text
# extract text, with 'render_anchor_tag_content' set to True
text_content = html_to_text(test_html, render_anchor_tag_content=True)
links_text = \
"Some initial text\n\nWhich is across multiple lines\n\n[ More Text " \
"](/first_link) So let's see what happens. [ Even More Text ]" \
"(second_link.com)"
# check that links are present in the extracted text
assert text_content == links_text

View File

@@ -171,11 +171,24 @@ def test_check_ignore_text_functionality(client, live_server):
def test_check_global_ignore_text_functionality(client, live_server):
sleep_time_for_fetch_thread = 3
# Give the endpoint time to spin up
time.sleep(1)
ignore_text = "XXXXX\r\nYYYYY\r\nZZZZZ"
set_original_ignore_response()
# Give the endpoint time to spin up
time.sleep(1)
# Goto the settings page, add our ignore text
res = client.post(
url_for("settings_page"),
data={
"requests-time_between_check-minutes": 180,
"application-global_ignore_text": ignore_text,
'application-fetch_backend': "html_requests"
},
follow_redirects=True
)
assert b"Settings updated." in res.data
# Add our URL to the import page
test_url = url_for('test_endpoint', _external=True)
@@ -192,17 +205,6 @@ def test_check_global_ignore_text_functionality(client, live_server):
# Give the thread time to pick it up
time.sleep(sleep_time_for_fetch_thread)
# Goto the settings page, add our ignore text
res = client.post(
url_for("settings_page"),
data={
"minutes_between_check": 180,
"global_ignore_text": ignore_text,
'fetch_backend': "html_requests"
},
follow_redirects=True
)
assert b"Settings updated." in res.data
# Goto the edit page of the item, add our ignore text
# Add our URL to the import page
@@ -225,12 +227,16 @@ def test_check_global_ignore_text_functionality(client, live_server):
# Give the thread time to pick it up
time.sleep(sleep_time_for_fetch_thread)
# so that we are sure everything is viewed and in a known 'nothing changed' state
res = client.get(url_for("diff_history_page", uuid="first"))
# It should report nothing found (no new 'unviewed' class)
res = client.get(url_for("index"))
assert b'unviewed' not in res.data
assert b'/test-endpoint' in res.data
# Make a change
# Make a change which includes the ignore text
set_modified_ignore_response()
# Trigger a check
@@ -243,7 +249,7 @@ def test_check_global_ignore_text_functionality(client, live_server):
assert b'unviewed' not in res.data
assert b'/test-endpoint' in res.data
# Just to be sure.. set a regular modified change..
# Just to be sure.. set a regular modified change that will trigger it
set_modified_original_ignore_response()
client.get(url_for("api_watch_checknow"), follow_redirects=True)
time.sleep(sleep_time_for_fetch_thread)

View File

@@ -0,0 +1,125 @@
#!/usr/bin/python3
"""Test suite for the render/not render anchor tag content functionality"""
import time
from flask import url_for
from .util import live_server_setup
def test_setup(live_server):
live_server_setup(live_server)
def set_original_ignore_response():
test_return_data = """<html>
<body>
Some initial text</br>
<a href="/original_link"> Some More Text </a>
</br>
So let's see what happens. </br>
</body>
</html>
"""
with open("test-datastore/endpoint-content.txt", "w") as f:
f.write(test_return_data)
# Should be the same as set_original_ignore_response() but with a different
# link
def set_modified_ignore_response():
test_return_data = """<html>
<body>
Some initial text</br>
<a href="/modified_link"> Some More Text </a>
</br>
So let's see what happens. </br>
</body>
</html>
"""
with open("test-datastore/endpoint-content.txt", "w") as f:
f.write(test_return_data)
def test_render_anchor_tag_content_true(client, live_server):
"""Testing that the link changes are detected when
render_anchor_tag_content setting is set to true"""
sleep_time_for_fetch_thread = 3
# Give the endpoint time to spin up
time.sleep(1)
# set original html text
set_original_ignore_response()
# Goto the settings page, choose to ignore links (dont select/send "application-render_anchor_tag_content")
res = client.post(
url_for("settings_page"),
data={
"requests-time_between_check-minutes": 180,
"application-fetch_backend": "html_requests",
},
follow_redirects=True,
)
assert b"Settings updated." in res.data
# Add our URL to the import page
test_url = url_for("test_endpoint", _external=True)
res = client.post(
url_for("import_page"), data={"urls": test_url},
follow_redirects=True
)
assert b"1 Imported" in res.data
time.sleep(sleep_time_for_fetch_thread)
# Trigger a check
client.get(url_for("api_watch_checknow"), follow_redirects=True)
# set a new html text with a modified link
set_modified_ignore_response()
time.sleep(sleep_time_for_fetch_thread)
# Trigger a check
client.get(url_for("api_watch_checknow"), follow_redirects=True)
# Give the thread time to pick it up
time.sleep(sleep_time_for_fetch_thread)
# We should not see the rendered anchor tag
res = client.get(url_for("preview_page", uuid="first"))
assert '(/modified_link)' not in res.data.decode()
# Goto the settings page, ENABLE render anchor tag
res = client.post(
url_for("settings_page"),
data={
"requests-time_between_check-minutes": 180,
"application-render_anchor_tag_content": "true",
"application-fetch_backend": "html_requests",
},
follow_redirects=True,
)
assert b"Settings updated." in res.data
# Trigger a check
client.get(url_for("api_watch_checknow"), follow_redirects=True)
# Give the thread time to pick it up
time.sleep(sleep_time_for_fetch_thread)
# check that the anchor tag content is rendered
res = client.get(url_for("preview_page", uuid="first"))
assert '(/modified_link)' in res.data.decode()
# since the link has changed, and we chose to render anchor tag content,
# we should detect a change (new 'unviewed' class)
res = client.get(url_for("index"))
assert b"unviewed" in res.data
assert b"/test-endpoint" in res.data
# Cleanup everything
res = client.get(url_for("api_delete", uuid="all"),
follow_redirects=True)
assert b'Deleted' in res.data

View File

@@ -51,9 +51,9 @@ def test_normal_page_check_works_with_ignore_status_code(client, live_server):
res = client.post(
url_for("settings_page"),
data={
"minutes_between_check": 180,
"ignore_status_codes": "y",
'fetch_backend': "html_requests"
"requests-time_between_check-minutes": 180,
"application-ignore_status_codes": "y",
'application-fetch_backend': "html_requests"
},
follow_redirects=True
)

View File

@@ -61,9 +61,9 @@ def test_check_ignore_whitespace(client, live_server):
res = client.post(
url_for("settings_page"),
data={
"minutes_between_check": 180,
"ignore_whitespace": "y",
'fetch_backend': "html_requests"
"requests-time_between_check-minutes": 180,
"application-ignore_whitespace": "y",
"application-fetch_backend": "html_requests"
},
follow_redirects=True
)

View File

@@ -5,18 +5,17 @@ import time
from flask import url_for
from .util import live_server_setup
def test_import(client, live_server):
def test_setup(client, live_server):
live_server_setup(live_server)
def test_import(client, live_server):
# Give the endpoint time to spin up
time.sleep(1)
res = client.post(
url_for("import_page"),
data={
"distill-io": "",
"urls": """https://example.com
https://example.com tag1
https://example.com tag1, other tag"""
@@ -26,3 +25,96 @@ https://example.com tag1, other tag"""
assert b"3 Imported" in res.data
assert b"tag1" in res.data
assert b"other tag" in res.data
res = client.get(url_for("api_delete", uuid="all"), follow_redirects=True)
# Clear flask alerts
res = client.get( url_for("index"))
res = client.get( url_for("index"))
def xtest_import_skip_url(client, live_server):
# Give the endpoint time to spin up
time.sleep(1)
res = client.post(
url_for("import_page"),
data={
"distill-io": "",
"urls": """https://example.com
:ht000000broken
"""
},
follow_redirects=True,
)
assert b"1 Imported" in res.data
assert b"ht000000broken" in res.data
assert b"1 Skipped" in res.data
res = client.get(url_for("api_delete", uuid="all"), follow_redirects=True)
# Clear flask alerts
res = client.get( url_for("index"))
def test_import_distillio(client, live_server):
distill_data='''
{
"client": {
"local": 1
},
"data": [
{
"name": "Unraid | News",
"uri": "https://unraid.net/blog",
"config": "{\\"selections\\":[{\\"frames\\":[{\\"index\\":0,\\"excludes\\":[],\\"includes\\":[{\\"type\\":\\"xpath\\",\\"expr\\":\\"(//div[@id='App']/div[contains(@class,'flex')]/main[contains(@class,'relative')]/section[contains(@class,'relative')]/div[@class='container']/div[contains(@class,'flex')]/div[contains(@class,'w-full')])[1]\\"}]}],\\"dynamic\\":true,\\"delay\\":2}],\\"ignoreEmptyText\\":true,\\"includeStyle\\":false,\\"dataAttr\\":\\"text\\"}",
"tags": ["nice stuff", "nerd-news"],
"content_type": 2,
"state": 40,
"schedule": "{\\"type\\":\\"INTERVAL\\",\\"params\\":{\\"interval\\":4447}}",
"ts": "2022-03-27T15:51:15.667Z"
}
]
}
'''
# Give the endpoint time to spin up
time.sleep(1)
client.get(url_for("api_delete", uuid="all"), follow_redirects=True)
res = client.post(
url_for("import_page"),
data={
"distill-io": distill_data,
"urls" : ''
},
follow_redirects=True,
)
assert b"Unable to read JSON file, was it broken?" not in res.data
assert b"1 Imported from Distill.io" in res.data
res = client.get( url_for("edit_page", uuid="first"))
assert b"https://unraid.net/blog" in res.data
assert b"Unraid | News" in res.data
# flask/wtforms should recode this, check we see it
# wtforms encodes it like id=&#39 ,but html.escape makes it like id=&#x27
# - so just check it manually :(
#import json
#import html
#d = json.loads(distill_data)
# embedded_d=json.loads(d['data'][0]['config'])
# x=html.escape(embedded_d['selections'][0]['frames'][0]['includes'][0]['expr']).encode('utf-8')
assert b"xpath:(//div[@id=&#39;App&#39;]/div[contains(@class,&#39;flex&#39;)]/main[contains(@class,&#39;relative&#39;)]/section[contains(@class,&#39;relative&#39;)]/div[@class=&#39;container&#39;]/div[contains(@class,&#39;flex&#39;)]/div[contains(@class,&#39;w-full&#39;)])[1]" in res.data
# did the tags work?
res = client.get( url_for("index"))
assert b"nice stuff" in res.data
assert b"nerd-news" in res.data
res = client.get(url_for("api_delete", uuid="all"), follow_redirects=True)
# Clear flask alerts
res = client.get(url_for("index"))

View File

@@ -270,6 +270,7 @@ def test_check_json_filter_bool_val(client, live_server):
)
assert b"1 Imported" in res.data
time.sleep(3)
# Goto the edit page, add our ignore text
# Add our URL to the import page
res = client.post(
@@ -284,6 +285,7 @@ def test_check_json_filter_bool_val(client, live_server):
)
assert b"Updated watch." in res.data
time.sleep(3)
# Trigger a check
client.get(url_for("api_watch_checknow"), follow_redirects=True)

View File

@@ -0,0 +1,102 @@
#!/usr/bin/python3
import time
from flask import url_for
from urllib.request import urlopen
from .util import set_original_response, set_modified_response, live_server_setup
sleep_time_for_fetch_thread = 3
def set_nonrenderable_response():
test_return_data = """<html>
<head><title>modified head title</title></head>
<!-- like when some angular app was broken and doesnt render or whatever -->
<body>
</body>
</html>
"""
with open("test-datastore/endpoint-content.txt", "w") as f:
f.write(test_return_data)
return None
def test_check_basic_change_detection_functionality(client, live_server):
set_original_response()
live_server_setup(live_server)
# Add our URL to the import page
res = client.post(
url_for("import_page"),
data={"urls": url_for('test_endpoint', _external=True)},
follow_redirects=True
)
assert b"1 Imported" in res.data
time.sleep(sleep_time_for_fetch_thread)
# Do this a few times.. ensures we don't accidentally set the status
for n in range(3):
client.get(url_for("api_watch_checknow"), follow_redirects=True)
# Give the thread time to pick it up
time.sleep(sleep_time_for_fetch_thread)
# It should report nothing found (no new 'unviewed' class)
res = client.get(url_for("index"))
assert b'unviewed' not in res.data
#####################
client.post(
url_for("settings_page"),
data={"application-empty_pages_are_a_change": "",
"requests-time_between_check-minutes": 180,
'application-fetch_backend': "html_requests"},
follow_redirects=True
)
# this should not trigger a change, because no good text could be converted from the HTML
set_nonrenderable_response()
client.get(url_for("api_watch_checknow"), follow_redirects=True)
# Give the thread time to pick it up
time.sleep(sleep_time_for_fetch_thread)
# It should report nothing found (no new 'unviewed' class)
res = client.get(url_for("index"))
assert b'unviewed' not in res.data
# ok now do the opposite
client.post(
url_for("settings_page"),
data={"application-empty_pages_are_a_change": "y",
"requests-time_between_check-minutes": 180,
'application-fetch_backend': "html_requests"},
follow_redirects=True
)
set_modified_response()
client.get(url_for("api_watch_checknow"), follow_redirects=True)
# Give the thread time to pick it up
time.sleep(sleep_time_for_fetch_thread)
# It should report nothing found (no new 'unviewed' class)
res = client.get(url_for("index"))
assert b'unviewed' in res.data
#
# Cleanup everything
res = client.get(url_for("api_delete", uuid="all"), follow_redirects=True)
assert b'Deleted' in res.data

View File

@@ -2,15 +2,17 @@ import os
import time
import re
from flask import url_for
from . util import set_original_response, set_modified_response, live_server_setup
from . util import set_original_response, set_modified_response, set_more_modified_response, live_server_setup
import logging
from changedetectionio.notification import default_notification_body, default_notification_title
def test_setup(live_server):
live_server_setup(live_server)
# Hard to just add more live server URLs when one test is already running (I think)
# So we add our test here (was in a different file)
def test_check_notification(client, live_server):
live_server_setup(live_server)
set_original_response()
# Give the endpoint time to spin up
@@ -49,84 +51,76 @@ def test_check_notification(client, live_server):
notification_url = url.replace('http', 'json')
print (">>>> Notification URL: "+notification_url)
notification_form_data = {"notification_urls": notification_url,
"notification_title": "New ChangeDetection.io Notification - {watch_url}",
"notification_body": "BASE URL: {base_url}\n"
"Watch URL: {watch_url}\n"
"Watch UUID: {watch_uuid}\n"
"Watch title: {watch_title}\n"
"Watch tag: {watch_tag}\n"
"Preview: {preview_url}\n"
"Diff URL: {diff_url}\n"
"Snapshot: {current_snapshot}\n"
"Diff: {diff}\n"
"Diff Full: {diff_full}\n"
":-)",
"notification_format": "Text"}
notification_form_data.update({
"url": test_url,
"tag": "my tag",
"title": "my title",
"headers": "",
"fetch_backend": "html_requests"})
res = client.post(
url_for("edit_page", uuid="first"),
data={"notification_urls": notification_url,
"notification_title": "New ChangeDetection.io Notification - {watch_url}",
"notification_body": "BASE URL: {base_url}\n"
"Watch URL: {watch_url}\n"
"Watch UUID: {watch_uuid}\n"
"Watch title: {watch_title}\n"
"Watch tag: {watch_tag}\n"
"Preview: {preview_url}\n"
"Diff URL: {diff_url}\n"
"Snapshot: {current_snapshot}\n"
"Diff: {diff}\n"
"Diff Full: {diff_full}\n"
":-)",
"notification_format": "Text",
"url": test_url,
"tag": "my tag",
"title": "my title",
"headers": "",
"fetch_backend": "html_requests",
"trigger_check": "y"},
data=notification_form_data,
follow_redirects=True
)
assert b"Updated watch." in res.data
assert b"Test notification queued" in res.data
# Hit the edit page, be sure that we saved it
# Re #242 - wasnt saving?
res = client.get(
url_for("edit_page", uuid="first"))
assert bytes(notification_url.encode('utf-8')) in res.data
# Re #242 - wasnt saving?
assert bytes("New ChangeDetection.io Notification".encode('utf-8')) in res.data
# Because we hit 'send test notification on save'
## Now recheck, and it should have sent the notification
time.sleep(3)
set_modified_response()
notification_submission = None
# Trigger a check
client.get(url_for("api_watch_checknow"), follow_redirects=True)
time.sleep(3)
# Verify what was sent as a notification, this file should exist
with open("test-datastore/notification.txt", "r") as f:
notification_submission = f.read()
# Did we see the URL that had a change, in the notification?
assert test_url in notification_submission
os.unlink("test-datastore/notification.txt")
set_modified_response()
# Trigger a check
client.get(url_for("api_watch_checknow"), follow_redirects=True)
# Give the thread time to pick it up
time.sleep(3)
# Did the front end see it?
res = client.get(
url_for("index"))
assert bytes("just now".encode('utf-8')) in res.data
notification_submission=None
# Verify what was sent as a notification
with open("test-datastore/notification.txt", "r") as f:
notification_submission = f.read()
# Did we see the URL that had a change, in the notification?
assert test_url in notification_submission
# Did we see the URL that had a change, in the notification?
# Diff was correctly executed
assert test_url in notification_submission
assert ':-)' in notification_submission
assert "Diff Full: Some initial text" in notification_submission
assert "Diff: (changed) Which is across multiple lines" in notification_submission
assert "(into ) which has this one new line" in notification_submission
# Re #342 - check for accidental python byte encoding of non-utf8/string
assert "b'" not in notification_submission
assert re.search('Watch UUID: [0-9a-f]{8}(-[0-9a-f]{4}){3}-[0-9a-f]{12}', notification_submission, re.IGNORECASE)
assert "Watch title: my title" in notification_submission
assert "Watch tag: my tag" in notification_submission
assert "diff/" in notification_submission
assert "preview/" in notification_submission
assert ":-)" in notification_submission
assert "New ChangeDetection.io Notification - {}".format(test_url) in notification_submission
if env_base_url:
# Re #65 - did we see our BASE_URl ?
@@ -135,50 +129,17 @@ def test_check_notification(client, live_server):
else:
logging.debug(">>> Skipping BASE_URL check")
## Now configure something clever, we go into custom config (non-default) mode, this is returned by the endpoint
with open("test-datastore/endpoint-content.txt", "w") as f:
f.write(";jasdhflkjadshf kjhsdfkjl ahslkjf haslkjd hfaklsj hf\njl;asdhfkasj stuff we will detect\n")
res = client.post(
url_for("settings_page"),
data={"notification_title": "New ChangeDetection.io Notification - {watch_url}",
"notification_urls": "json://foobar.com", #Re #143 should not see that it sent without [test checkbox]
"minutes_between_check": 180,
"fetch_backend": "html_requests",
},
follow_redirects=True
)
assert b"Settings updated." in res.data
# Re #143 - should not see this if we didnt hit the test box
assert b"Test notification queued" not in res.data
# Trigger a check
# This should insert the {current_snapshot}
set_more_modified_response()
client.get(url_for("api_watch_checknow"), follow_redirects=True)
# Give the thread time to pick it up
time.sleep(3)
# Did the front end see it?
res = client.get(
url_for("index"))
assert bytes("just now".encode('utf-8')) in res.data
# Verify what was sent as a notification, this file should exist
with open("test-datastore/notification.txt", "r") as f:
notification_submission = f.read()
print ("Notification submission was:", notification_submission)
# Re #342 - check for accidental python byte encoding of non-utf8/string
assert "b'" not in notification_submission
assert "Ohh yeah awesome" in notification_submission
assert re.search('Watch UUID: [0-9a-f]{8}(-[0-9a-f]{4}){3}-[0-9a-f]{12}', notification_submission, re.IGNORECASE)
assert "Watch title: my title" in notification_submission
assert "Watch tag: my tag" in notification_submission
assert "diff/" in notification_submission
assert "preview/" in notification_submission
assert ":-)" in notification_submission
assert "New ChangeDetection.io Notification - {}".format(test_url) in notification_submission
# This should insert the {current_snapshot}
assert "stuff we will detect" in notification_submission
# Prove that "content constantly being marked as Changed with no Updating causes notification" is not a thing
# https://github.com/dgtlmoon/changedetection.io/discussions/192
@@ -186,33 +147,38 @@ def test_check_notification(client, live_server):
# Trigger a check
client.get(url_for("api_watch_checknow"), follow_redirects=True)
time.sleep(3)
time.sleep(1)
client.get(url_for("api_watch_checknow"), follow_redirects=True)
time.sleep(3)
time.sleep(1)
client.get(url_for("api_watch_checknow"), follow_redirects=True)
time.sleep(3)
time.sleep(1)
assert os.path.exists("test-datastore/notification.txt") == False
# Now adding a wrong token should give us an error
res = client.post(
url_for("settings_page"),
data={"notification_title": "New ChangeDetection.io Notification - {watch_url}",
"notification_body": "Rubbish: {rubbish}\n",
"notification_format": "Text",
"notification_urls": "json://foobar.com",
"minutes_between_check": 180,
"fetch_backend": "html_requests"
},
# cleanup for the next
client.get(
url_for("api_delete", uuid="all"),
follow_redirects=True
)
assert bytes("is not a valid token".encode('utf-8')) in res.data
def test_notification_validation(client, live_server):
#live_server_setup(live_server)
time.sleep(3)
# re #242 - when you edited an existing new entry, it would not correctly show the notification settings
# Add our URL to the import page
test_url = url_for('test_endpoint', _external=True)
res = client.post(
url_for("api_watch_add"),
data={"url": test_url, "tag": 'nice one'},
follow_redirects=True
)
assert b"Watch added" in res.data
# Re #360 some validation
res = client.post(
url_for("edit_page", uuid="first"),
data={"notification_urls": notification_url,
data={"notification_urls": 'json://localhost/foobar',
"notification_title": "",
"notification_body": "",
"notification_format": "Text",
@@ -220,8 +186,28 @@ def test_check_notification(client, live_server):
"tag": "my tag",
"title": "my title",
"headers": "",
"fetch_backend": "html_requests",
"trigger_check": "y"},
"fetch_backend": "html_requests"},
follow_redirects=True
)
assert b"Notification Body and Title is required when a Notification URL is used" in res.data
# Now adding a wrong token should give us an error
res = client.post(
url_for("settings_page"),
data={"application-notification_title": "New ChangeDetection.io Notification - {watch_url}",
"application-notification_body": "Rubbish: {rubbish}\n",
"application-notification_format": "Text",
"application-notification_urls": "json://localhost/foobar",
"requests-time_between_check-minutes": 180,
"fetch_backend": "html_requests"
},
follow_redirects=True
)
assert bytes("is not a valid token".encode('utf-8')) in res.data
# cleanup for the next
client.get(
url_for("api_delete", uuid="all"),
follow_redirects=True
)
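A minimal sketch of the "{rubbish} is not a valid token" rejection exercised above: collect the {token} names from the notification text and flag any that are not in the supported set. The token list here simply mirrors the assertions in these tests, not the real source.

import re

VALID_TOKENS = {"base_url", "watch_url", "watch_uuid", "watch_title", "watch_tag",
                "diff_url", "preview_url", "current_snapshot"}

def invalid_tokens(text):
    # Anything wrapped in {braces} that is not a known token name
    return [name for name in re.findall(r"{(\w+)}", text) if name not in VALID_TOKENS]

print(invalid_tokens("Rubbish: {rubbish}\n"))  # ['rubbish'] -> form reports "is not a valid token"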

View File

@@ -35,7 +35,7 @@ def test_check_notification_error_handling(client, live_server):
"tag": "",
"title": "",
"headers": "",
"minutes_between_check": "180",
"time_between_check-minutes": "180",
"fetch_backend": "html_requests",
"trigger_check": "y"},
follow_redirects=True

View File

@@ -27,6 +27,7 @@ def test_headers_in_request(client, live_server):
)
assert b"1 Imported" in res.data
time.sleep(3)
cookie_header = '_ga=GA1.2.1022228332; cookie-preferences=analytics:accepted;'
@@ -84,9 +85,25 @@ def test_body_in_request(client, live_server):
)
assert b"1 Imported" in res.data
body_value = 'Test Body Value'
time.sleep(3)
# Add a properly formatted body with a proper method
# add the first 'version'
res = client.post(
url_for("edit_page", uuid="first"),
data={
"url": test_url,
"tag": "",
"method": "POST",
"fetch_backend": "html_requests",
"body": "something something"},
follow_redirects=True
)
assert b"Updated watch." in res.data
time.sleep(3)
# Now make the change that should trigger a change detection
body_value = 'Test Body Value'
res = client.post(
url_for("edit_page", uuid="first"),
data={

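A minimal sketch of what the plain "html_requests" fetcher is assumed to send when a watch has a method and body configured; the URL stands in for the live test server's /test-body route, which echoes the request body back.

import requests

# Forward the watch's configured method and body to the target URL
res = requests.request(method="POST",
                       url="http://localhost:5005/test-body",
                       data="Test Body Value",
                       timeout=10)
print(res.status_code, res.text)  # expect the body to be echoed back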
View File

@@ -0,0 +1,76 @@
#!/usr/bin/python3
import time
from flask import url_for
from urllib.request import urlopen
from .util import set_original_response, set_modified_response, live_server_setup
import re
sleep_time_for_fetch_thread = 3
def test_share_watch(client, live_server):
set_original_response()
live_server_setup(live_server)
test_url = url_for('test_endpoint', _external=True)
css_filter = ".nice-filter"
# Add our URL to the import page
res = client.post(
url_for("import_page"),
data={"urls": test_url},
follow_redirects=True
)
assert b"1 Imported" in res.data
# Go to the edit page and set our CSS filter
res = client.post(
url_for("edit_page", uuid="first"),
data={"css_filter": css_filter, "url": test_url, "tag": "", "headers": "", 'fetch_backend': "html_requests"},
follow_redirects=True
)
assert b"Updated watch." in res.data
# Check it saved
res = client.get(
url_for("edit_page", uuid="first"),
)
assert bytes(css_filter.encode('utf-8')) in res.data
# click share the link
res = client.get(
url_for("api_share_put_watch", uuid="first"),
follow_redirects=True
)
assert b"Share this link:" in res.data
assert b"https://changedetection.io/share/" in res.data
html = res.data.decode()
share_link_search = re.search('<span id="share-link">(.*)</span>', html, re.IGNORECASE)
assert share_link_search
# Now delete what we have, we will try to re-import it
# Cleanup everything
res = client.get(url_for("api_delete", uuid="all"), follow_redirects=True)
assert b'Deleted' in res.data
# Add our URL to the import page
res = client.post(
url_for("import_page"),
data={"urls": share_link_search.group(1)},
follow_redirects=True
)
assert b"1 Imported" in res.data
# Now hit edit, we should see what we expect
# that the import fetched the meta-data
# Check it saved
res = client.get(
url_for("edit_page", uuid="first"),
)
assert bytes(css_filter.encode('utf-8')) in res.data

View File

@@ -0,0 +1,95 @@
#!/usr/bin/python3
import time
from flask import url_for
from urllib.request import urlopen
from .util import set_original_response, set_modified_response, live_server_setup
sleep_time_for_fetch_thread = 3
def test_setup(live_server):
live_server_setup(live_server)
def test_check_basic_change_detection_functionality_source(client, live_server):
set_original_response()
test_url = 'source:'+url_for('test_endpoint', _external=True)
# Add our URL to the import page
res = client.post(
url_for("import_page"),
data={"urls": test_url},
follow_redirects=True
)
assert b"1 Imported" in res.data
time.sleep(sleep_time_for_fetch_thread)
#####################
# Check HTML conversion was detected and worked
res = client.get(
url_for("preview_page", uuid="first"),
follow_redirects=True
)
# Check this class DOES appear (proving we are seeing the raw page source)
assert b'foobar-detection' in res.data
# Make a change
set_modified_response()
# Force recheck
res = client.get(url_for("api_watch_checknow"), follow_redirects=True)
assert b'1 watches are queued for rechecking.' in res.data
time.sleep(5)
# Now something should be ready, indicated by having a 'unviewed' class
res = client.get(url_for("index"))
assert b'unviewed' in res.data
res = client.get(
url_for("diff_history_page", uuid="first"),
follow_redirects=True
)
assert b'&lt;title&gt;modified head title' in res.data
def test_check_ignore_elements(client, live_server):
set_original_response()
time.sleep(2)
test_url = 'source:'+url_for('test_endpoint', _external=True)
# Add our URL to the import page
res = client.post(
url_for("import_page"),
data={"urls": test_url},
follow_redirects=True
)
assert b"1 Imported" in res.data
time.sleep(sleep_time_for_fetch_thread)
#####################
# We want <span> and <p> ONLY, but ignore span with .foobar-detection
res = client.post(
url_for("edit_page", uuid="first"),
data={"css_filter": 'span,p', "url": test_url, "tag": "", "subtractive_selectors": ".foobar-detection", 'fetch_backend': "html_requests"},
follow_redirects=True
)
time.sleep(sleep_time_for_fetch_thread)
res = client.get(
url_for("preview_page", uuid="first"),
follow_redirects=True
)
assert b'foobar-detection' not in res.data
assert b'&lt;br' not in res.data
assert b'&lt;p' in res.data
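A minimal sketch of the "source:" URL convention these tests rely on: when a watch URL carries the prefix, the raw page source is kept (and HTML-escaped for preview) rather than being reduced to rendered text. The helper below is illustrative, not the application's real code.

# Illustrative helper: detect and strip the prefix
def split_source_prefix(watch_url):
    if watch_url.startswith("source:"):
        return True, watch_url[len("source:"):]
    return False, watch_url

is_source, real_url = split_source_prefix("source:http://localhost:5005/test-endpoint")
print(is_source, real_url)  # True http://localhost:5005/test-endpoint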

View File

@@ -20,8 +20,8 @@ def test_check_watch_field_storage(client, live_server):
res = client.post(
url_for("edit_page", uuid="first"),
data={ "notification_urls": "json://myapi.com",
"minutes_between_check": 126,
data={ "notification_urls": "json://127.0.0.1:30000\r\njson://128.0.0.1\r\n",
"time_between_check-minutes": 126,
"css_filter" : ".fooclass",
"title" : "My title",
"ignore_text" : "ignore this",
@@ -38,8 +38,14 @@ def test_check_watch_field_storage(client, live_server):
url_for("edit_page", uuid="first"),
follow_redirects=True
)
# Check that we don't get an error when the field value contains blank lines
assert not b"json://127.0.0.1\n\njson" in res.data
assert not b"json://127.0.0.1\r\n\njson" in res.data
assert not b"json://127.0.0.1\r\n\rjson" in res.data
assert b"json://127.0.0.1" in res.data
assert b"json://128.0.0.1" in res.data
assert b"json://myapi.com" in res.data
assert b"126" in res.data
assert b".fooclass" in res.data
assert b"My title" in res.data
@@ -56,8 +62,8 @@ def test_check_recheck_global_setting(client, live_server):
res = client.post(
url_for("settings_page"),
data={
"minutes_between_check": 1566,
'fetch_backend': "html_requests"
"requests-time_between_check-minutes": 1566,
'application-fetch_backend': "html_requests"
},
follow_redirects=True
)
@@ -88,8 +94,8 @@ def test_check_recheck_global_setting(client, live_server):
res = client.post(
url_for("settings_page"),
data={
"minutes_between_check": 222,
'fetch_backend': "html_requests"
"requests-time_between_check-minutes": 222,
'application-fetch_backend': "html_requests"
},
follow_redirects=True
)
@@ -108,7 +114,7 @@ def test_check_recheck_global_setting(client, live_server):
res = client.post(
url_for("edit_page", uuid="first"),
data={"url": test_url,
"minutes_between_check": 55,
"time_between_check-minutes": 55,
'fetch_backend': "html_requests"
},
follow_redirects=True
@@ -124,8 +130,8 @@ def test_check_recheck_global_setting(client, live_server):
res = client.post(
url_for("settings_page"),
data={
"minutes_between_check": 666,
'fetch_backend': "html_requests"
"requests-time_between_check-minutes": 666,
"application-fetch_backend": "html_requests"
},
follow_redirects=True
)
@@ -134,7 +140,7 @@ def test_check_recheck_global_setting(client, live_server):
res = client.post(
url_for("edit_page", uuid="first"),
data={"url": test_url,
"minutes_between_check": "",
"time_between_check-minutes": "",
'fetch_backend': "html_requests"
},
follow_redirects=True
@@ -147,4 +153,3 @@ def test_check_recheck_global_setting(client, live_server):
follow_redirects=True
)
assert b"666" in res.data
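The renamed fields above (requests-time_between_check-minutes, application-fetch_backend) follow WTForms' nested-form naming. A minimal sketch with an illustrative class layout rather than the real forms:

from wtforms import Form, FormField, IntegerField, SelectField

class TimeBetweenCheckForm(Form):
    minutes = IntegerField("Minutes")

class RequestsForm(Form):
    time_between_check = FormField(TimeBetweenCheckForm)

class ApplicationForm(Form):
    fetch_backend = SelectField("Fetcher", choices=[("html_requests", "Plain requests")])

class SettingsForm(Form):
    requests = FormField(RequestsForm)
    application = FormField(ApplicationForm)

form = SettingsForm()
# FormField's default separator is "-", so the rendered input names match the
# values posted in the tests above
print(form.requests.time_between_check.minutes.name)  # requests-time_between_check-minutes
print(form.application.fetch_backend.name)            # application-fetch_backend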

View File

@@ -116,4 +116,46 @@ def test_xpath_validation(client, live_server):
data={"css_filter": "/something horrible", "url": test_url, "tag": "", "headers": "", 'fetch_backend': "html_requests"},
follow_redirects=True
)
assert b"is not a valid XPath expression" in res.data
assert b"is not a valid XPath expression" in res.data
# Actually only really used by the distill.io importer, but could be handy too
def test_check_with_prefix_css_filter(client, live_server):
res = client.get(url_for("api_delete", uuid="all"), follow_redirects=True)
assert b'Deleted' in res.data
# Give the endpoint time to spin up
time.sleep(1)
set_original_response()
# Add our URL to the import page
test_url = url_for('test_endpoint', _external=True)
res = client.post(
url_for("import_page"),
data={"urls": test_url},
follow_redirects=True
)
assert b"1 Imported" in res.data
time.sleep(3)
res = client.post(
url_for("edit_page", uuid="first"),
data={"css_filter": "xpath://*[contains(@class, 'sametext')]", "url": test_url, "tag": "", "headers": "", 'fetch_backend': "html_requests"},
follow_redirects=True
)
assert b"Updated watch." in res.data
time.sleep(3)
res = client.get(
url_for("preview_page", uuid="first"),
follow_redirects=True
)
with open('/tmp/preview-debug.html', 'wb') as f:  # debug dump of the preview page
f.write(res.data)
assert b"Some text thats the same" in res.data #in selector
assert b"Some text that will change" not in res.data #not in selector
client.get(url_for("api_delete", uuid="all"), follow_redirects=True)
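A minimal sketch of the "xpath:" prefix used above: a filter string starting with "xpath:" is treated as an XPath expression rather than a CSS selector. The routing and names here are illustrative, not the application's real filter code (requires lxml, plus cssselect for the CSS branch).

from lxml import etree, html as lxml_html

def filter_elements(rule, page_html):
    root = lxml_html.fromstring(page_html)
    if rule.strip().startswith("xpath:"):
        return root.xpath(rule.strip()[len("xpath:"):])
    return root.cssselect(rule)  # plain rules keep going through the CSS path

doc = "<html><body><p class='sametext'>Some text thats the same</p></body></html>"
for element in filter_elements("xpath://*[contains(@class, 'sametext')]", doc):
    print(etree.tostring(element, encoding="unicode"))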

View File

@@ -10,6 +10,7 @@ def set_original_response():
<p>Which is across multiple lines</p>
</br>
So let's see what happens. </br>
<span class="foobar-detection" style='display:none'></span>
</body>
</html>
"""
@@ -35,6 +36,24 @@ def set_modified_response():
return None
def set_more_modified_response():
test_return_data = """<html>
<head><title>modified head title</title></head>
<body>
Some initial text</br>
<p>which has this one new line</p>
</br>
So let's see what happens. </br>
Ohh yeah awesome<br/>
</body>
</html>
"""
with open("test-datastore/endpoint-content.txt", "w") as f:
f.write(test_return_data)
return None
def live_server_setup(live_server):
@@ -66,6 +85,7 @@ def live_server_setup(live_server):
# Just return the body in the request
@live_server.app.route('/test-body', methods=['POST', 'GET'])
def test_body():
print ("TEST-BODY GOT", request.data, "returning")
return request.data
# Just return the verb in the request
@@ -82,7 +102,7 @@ def live_server_setup(live_server):
if data != None:
f.write(data)
print("\n>> Test notification endpoint was hit.\n")
print("\n>> Test notification endpoint was hit.\n", data)
return "Text was set"

View File

@@ -2,6 +2,7 @@ import threading
import queue
import time
from changedetectionio import content_fetcher
# A single update worker
#
# Requests for checking on a single site(watch) from a queue of watches
@@ -32,17 +33,17 @@ class update_worker(threading.Thread):
else:
self.current_uuid = uuid
from changedetectionio import content_fetcher
if uuid in list(self.datastore.data['watching'].keys()):
changed_detected = False
contents = ""
screenshot = False
update_obj= {}
now = time.time()
try:
changed_detected, update_obj, contents = update_handler.run(uuid)
changed_detected, update_obj, contents, screenshot = update_handler.run(uuid)
# Re #342
# In Python 3, all strings are sequences of Unicode characters. There is a bytes type that holds raw bytes.
@@ -51,6 +52,10 @@ class update_worker(threading.Thread):
raise Exception("Error - returned data from the fetch handler SHOULD be bytes")
except PermissionError as e:
self.app.logger.error("File permission error updating", uuid, str(e))
except content_fetcher.ReplyWithContentButNoText as e:
# Totally fine, it's by choice - just continue on, nothing more to care about
# Page had elements/content but no renderable text
pass
except content_fetcher.EmptyReply as e:
# Some kind of custom to-str handler in the exception handler that does this?
err_text = "EmptyReply: Status Code {}".format(e.status_code)
@@ -140,6 +145,9 @@ class update_worker(threading.Thread):
# Always record that we at least tried
self.datastore.update_watch(uuid=uuid, update_obj={'fetch_time': round(time.time() - now, 3),
'last_checked': round(time.time())})
# Always save the screenshot if it's available
if screenshot:
self.datastore.save_screenshot(watch_uuid=uuid, screenshot=screenshot)
self.current_uuid = None # Done
self.q.task_done()
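A minimal, self-contained sketch of the worker flow shown in this hunk: the fetch handler now returns the screenshot as a fourth item, and "content but no renderable text" is treated as a deliberate non-error. All classes below are stand-ins, not the real fetcher or datastore.

class ReplyWithContentButNoText(Exception):
    pass

class EmptyReply(Exception):
    def __init__(self, status_code):
        self.status_code = status_code

class FakeHandler:
    def run(self, uuid):
        # changed_detected, update_obj, contents, screenshot
        return True, {"previous_md5": "abc123"}, b"<html>hello</html>", b"\x89PNG-bytes"

def process(uuid, handler, save_screenshot, update_watch):
    try:
        changed, update_obj, contents, screenshot = handler.run(uuid)
    except ReplyWithContentButNoText:
        return  # page had elements but no renderable text - by choice, not an error
    except EmptyReply as e:
        update_watch(uuid, {"last_error": "EmptyReply: Status Code {}".format(e.status_code)})
        return
    if screenshot:
        save_screenshot(uuid, screenshot)

process("some-uuid", FakeHandler(),
        save_screenshot=lambda u, s: print("saved", len(s), "byte screenshot for", u),
        update_watch=lambda u, obj: print("updated", u, obj))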

View File

@@ -17,12 +17,19 @@ services:
# Alternative WebDriver/selenium URL, do not use "'s or 's!
# - WEBDRIVER_URL=http://browser-chrome:4444/wd/hub
#
# WebDriver proxy settings webdriver_proxyType, webdriver_ftpProxy, webdriver_httpProxy, webdriver_noProxy,
# webdriver_proxyAutoconfigUrl, webdriver_sslProxy, webdriver_autodetect,
# WebDriver proxy settings webdriver_proxyType, webdriver_ftpProxy, webdriver_noProxy,
# webdriver_proxyAutoconfigUrl, webdriver_autodetect,
# webdriver_socksProxy, webdriver_socksUsername, webdriver_socksVersion, webdriver_socksPassword
#
# https://selenium-python.readthedocs.io/api.html#module-selenium.webdriver.common.proxy
#
# Alternative Playwright URL, do not use "'s or 's!
# - PLAYWRIGHT_DRIVER_URL=ws://playwright-chrome:3000/
#
# Playwright proxy settings playwright_proxy_server, playwright_proxy_bypass, playwright_proxy_username, playwright_proxy_password
#
# https://playwright.dev/python/docs/api/class-browsertype#browser-type-launch-option-proxy
#
# Plain requests - proxy support example.
# - HTTP_PROXY=socks5h://10.10.1.10:1080
# - HTTPS_PROXY=socks5h://10.10.1.10:1080
@@ -58,6 +65,13 @@ services:
# # Workaround to avoid the browser crashing inside a docker container
# # See https://github.com/SeleniumHQ/docker-selenium#quick-start
# - /dev/shm:/dev/shm
# restart: unless-stopped
# Used for fetching pages via Playwright+Chrome where you need Javascript support.
# playwright-chrome:
# hostname: playwright-chrome
# image: browserless/chrome
# restart: unless-stopped
volumes:
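A minimal sketch of how a fetcher might use these settings: connect to the remote browser named in PLAYWRIGHT_DRIVER_URL (for example the commented-out playwright-chrome service above) and pass any playwright_proxy_* values through as a Playwright proxy dict. The exact wiring in the application may differ.

import os
from playwright.sync_api import sync_playwright

driver_url = os.getenv("PLAYWRIGHT_DRIVER_URL", "ws://playwright-chrome:3000/")

proxy = None
if os.getenv("playwright_proxy_server"):
    proxy = {"server": os.getenv("playwright_proxy_server"),
             "bypass": os.getenv("playwright_proxy_bypass", ""),
             "username": os.getenv("playwright_proxy_username", ""),
             "password": os.getenv("playwright_proxy_password", "")}

with sync_playwright() as p:
    browser = p.chromium.connect_over_cdp(driver_url, timeout=30000)
    context = browser.new_context(proxy=proxy)
    page = context.new_page()
    page.goto("https://example.com", wait_until="networkidle")
    print(len(page.content()), "characters of rendered HTML")
    context.close()
    browser.close()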

View File

@@ -13,11 +13,11 @@ requests[socks] ~= 2.26
urllib3 > 1.26
chardet > 2.3.0
wtforms ~= 2.3.3
wtforms ~= 3.0
jsonpath-ng ~= 1.5.3
# Notification library
apprise ~= 0.9.7
apprise ~= 0.9.8.3
# apprise mqtt https://github.com/dgtlmoon/changedetection.io/issues/315
paho-mqtt
@@ -40,3 +40,4 @@ selenium ~= 4.1.0
# need to revisit flask login versions
werkzeug ~= 2.0.0
# playwright is installed at Dockerfile build time because it's not available on all platforms