This patch implements a simple JSONEncoder, just to fix #2502. In the long
term, SearXNG needs a data schema for the result items and a JSON generator
for the result list.
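A minimal sketch of such an encoder; the handled types (datetime, set) are
assumptions for illustration, not necessarily the exact set the patch covers::

    import json
    from datetime import datetime

    class JSONEncoder(json.JSONEncoder):
        def default(self, o):
            if isinstance(o, datetime):
                return o.isoformat()   # serialize datetimes as ISO strings
            if isinstance(o, set):
                return list(o)         # sets are not JSON serializable
            return super().default(o)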
Closes: https://github.com/searxng/searxng/issues/2505
Signed-off-by: Markus Heiser <markus.heiser@darmarit.de>
Over the years the webapp module became more and more of a mess. To improve
the modularization a little, this patch moves some implementations from the
webapp module to the webutils module.
HINT: this patch brings no functional change
Signed-off-by: Markus Heiser <markus.heiser@darmarit.de>
A blocklist and a passlist can be configured in /etc/searxng/limiter.toml::

    [botdetection.ip_lists]

    pass_ip = [
        '51.15.252.168',  # IPv4 of check.searx.space
    ]
    block_ip = [
        '93.184.216.34',  # IPv4 of example.org
    ]
Closes: https://github.com/searxng/searxng/issues/2127
Closes: https://github.com/searxng/searxng/pull/2129
Signed-off-by: Markus Heiser <markus.heiser@darmarit.de>
The monolithic implementation of the limiter was divided into methods and
implemented in the Python package searx.botdetection. Detailed documentation on
the methods has been added.
The methods are divided into two groups:
1. Probe HTTP headers
- Method http_accept
- Method http_accept_encoding
- Method http_accept_language
- Method http_connection
- Method http_user_agent
2. Rate limit:
- Method ip_limit
- Method link_token (new)
The (reduced) implementation of the limiter is now in the module
searx.botdetection.limiter. The first group was transferred unchanged to this
module. The ip_limit contains the sliding windows implemented by the limiter so
far.
This merge also fixes some long outstanding issues:
- the limiter does not evaluate the Accept-Language header correctly [1]
- the limiter needs an IPv6 prefix to block networks instead of single IPs [2]
Without additional configuration the limiter works as before (apart from the
bugfixes). To bring the additional method (link_token) into service, a
configuration must be made in an additional configuration file. Without this
configuration, the limiter runs as before (zero configuration).
The ip_limit method implements the sliding windows of the vanilla limiter;
additionally, the link_token method can be used in it. The link_token method
can be used to investigate whether a request is suspicious. To activate the
link_token method in the ip_limit method, add the following to your
/etc/searxng/limiter.toml::

    [botdetection.ip_limit]
    link_token = true
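A rough sketch of the link_token idea, assuming a Redis client; all names are
illustrative. The client fetches a CSS resource that carries a token; loading
it registers a "ping" for the client's IP, and requests without a ping are
treated as suspicious::

    import secrets

    def get_token(redis_client) -> str:
        """Token embedded in a CSS link of the search page (short TTL)."""
        token = redis_client.get('limiter:token')
        if token is None:
            token = secrets.token_urlsafe(16)
            redis_client.set('limiter:token', token, ex=600)
        return token

    def register_ping(redis_client, client_ip: str):
        """Called when the token CSS is requested by the client."""
        redis_client.set(f'limiter:ping:{client_ip}', 1, ex=600)

    def is_suspicious(redis_client, client_ip: str) -> bool:
        """A request without a previously registered ping is suspicious."""
        return redis_client.get(f'limiter:ping:{client_ip}') is None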
[1] https://github.com/searxng/searxng/issues/2455
[2] https://github.com/searxng/searxng/issues/2477
HINT: this patch has no functional change / it is the preparation for following
changes and bugfixes
Over the years, the preferences template became an unmanageable beast. To
make the source code more readable, the monolith is split into elements. The
splitting into elements also has the advantage that a new template can make
use of them.
The reversed checkbox is a quirk that is only used in the preferences and
must be eliminated in the long term. For this, the macro
'checkbox_onoff_reversed' was added to the preferences.html template. The
'checkbox' macro is also a quirk of the preferences.html that we don't want
to use in other templates (it is an input-checkbox in an HTML form that was
misused for status display).
Signed-off-by: Markus Heiser <markus.heiser@darmarit.de>
In my tests I see bots rotating IPs (with endless IP lists). If such a bot has
100 IPs and has three attempts (SUSPICIOUS_IP_MAX = 3) then it can successfully
send up to 300 requests in one day while rotating the IP. To block the bots for
a longer period of time the SUSPICIOUS_IP_WINDOW, as the time period in which an
IP is observed, must be increased.
For normal WEB-browsers this is no problem, because the SUSPICIOUS_IP_WINDOW is
deleted as soon as the CSS with the token is loaded.
SUSPICIOUS_IP_WINDOW = 3600 * 24 * 30
"""Time (sec) before the sliding window for one suspicious IP expires."""

SUSPICIOUS_IP_MAX = 3
"""Maximum requests from one suspicious IP in the :py:obj:`SUSPICIOUS_IP_WINDOW`."""
Signed-off-by: Markus Heiser <markus.heiser@darmarit.de>
For correct determination of the IP of the request, the function
botdetection.get_real_ip() is implemented. This function is used in the
ip_limit and link_token methods of the botdetection, and it is used in the
self_info plugin.
Documentation about the X-Forwarded-For header has been added.
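A minimal sketch of the idea, assuming Flask/werkzeug request semantics and a
trusted reverse proxy that sets X-Forwarded-For::

    import flask

    def get_real_ip(request: flask.Request) -> str:
        """Client IP of the request, honoring X-Forwarded-For."""
        forwarded_for = request.headers.get('X-Forwarded-For')
        if forwarded_for:
            # the first entry is the originating client (set by the proxy)
            return forwarded_for.split(',')[0].strip()
        return request.remote_addr or '0.0.0.0'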
[1] https://github.com/searxng/searxng/pull/2357#issuecomment-1566211059
Signed-off-by: Markus Heiser <markus.heiser@darmarit.de>
- counting requests in LONG_WINDOW and BURST_WINDOW is not needed when the
  request is validated by the link_token method [1]
- renew a ping-key on validation [2]; this is needed for infinite scrolling,
  where no new token (CSS) is loaded / this does not fix the BURST_MAX issue
  in the vanilla limiter
- normalize the counter names of the ip_limit method to 'ip_limit.*'
- integrate the ip_limit method straightforwardly in the limiter plugin:
  ip_limit now returns None or a werkzeug.Response object that can be passed
  by the plugin to the flask application, instead of intermediate code that
  returns a tuple (see the sketch after the references below)
[1] https://github.com/searxng/searxng/pull/2357#issuecomment-1566113277
[2] https://github.com/searxng/searxng/pull/2357#discussion_r1208542206
[3] https://github.com/searxng/searxng/pull/2357#issuecomment-1566125979
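A sketch of the non-intermediate integration described above; the function
names are illustrative, not the real API::

    from typing import Optional
    import flask
    import werkzeug

    def pre_request() -> Optional[werkzeug.Response]:
        """Limiter plugin hook registered at the flask app."""
        # ip_limit is a hypothetical stand-in for the botdetection method;
        # it returns None or a ready-made response (e.g. a 429)
        response = ip_limit(flask.request)
        return response  # None --> request passes, Response --> sent to client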
Signed-off-by: Markus Heiser <markus.heiser@darmarit.de>
To intercept bots that get their IPs from a range of IPs, there is a
``SUSPICIOUS_IP_WINDOW``. In this window the suspicious IPs are stored for a
longer time. IPs stored in this sliding window have a maximum of
``SUSPICIOUS_IP_MAX`` accesses before they are blocked. As soon as the IP
makes a request that is not suspicious, the sliding window for this IP is
dropped.
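A minimal sliding-window counter sketch backed by Redis (a sorted set of
timestamps); names are illustrative::

    import time

    def incr_sliding_window(redis_client, name: str, duration: int) -> int:
        """Count events of the last `duration` seconds."""
        now = time.time()
        key = f'counter:{name}'
        pipe = redis_client.pipeline()
        pipe.zremrangebyscore(key, 0, now - duration)  # drop expired events
        pipe.zadd(key, {str(now): now})                # record this event
        pipe.zcard(key)                                # events in the window
        pipe.expire(key, duration)
        return pipe.execute()[2]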
Signed-off-by: Markus Heiser <markus.heiser@darmarit.de>
To activate the ``link_token`` method in the ``ip_limit`` method add the
following to your ``/etc/searxng/limiter.toml``::
    [botdetection.ip_limit]
    link_token = true
Related: https://github.com/searxng/searxng/pull/2357#issuecomment-1554116941
Signed-off-by: Markus Heiser <markus.heiser@darmarit.de>
In order to be able to meet the outstanding requirements, the implementation
is modularized and supplemented with documentation.
This patch does not contain functional changes, except that it fixes issue
#2455
----
Activate the limiter in settings.yml and simulate a bot request by::

    curl -H 'Accept-Language: de-DE,en-US;q=0.7,en;q=0.3' \
         -H 'Accept: text/html' \
         -H 'User-Agent: xyz' \
         -H 'Accept-Encoding: gzip' \
         'http://127.0.0.1:8888/search?q=foo'
In the LOG::

    DEBUG searx.botdetection.link_token : missing ping for this request: .....
Since ``BURST_MAX_SUSPICIOUS = 2`` you can repeat the query above twice
before you get a "Too Many Requests" response.
Closes: https://github.com/searxng/searxng/issues/2455
Signed-off-by: Markus Heiser <markus.heiser@darmarit.de>
If there were no results but errors in the engines, the error dialogs of the
engines were displayed in the result list.
With the new design, errors of the engines are only displayed in the sidebar,
and at the same time duplications of the (template) code are avoided.
Signed-off-by: Markus Heiser <markus.heiser@darmarit.de>
* set border top and bottom on sidebar collapsibles
* increase padding on summary so it's easier to click on mobile
* remove margins and add flex wrapper to normalize elements in sidebar
Make elements in the sidebar collapsible. Except for infoboxes, all elements
in the sidebar are collapsed by default.
By folding out the sidebar elements, the UI looks less cluttered. Especially on
small devices like smartphones, where the sidebar is above the results list, the
UX should be improved [1].
[1] https://github.com/searxng/searxng/issues/2140
Signed-off-by: Markus Heiser <markus.heiser@darmarit.de>
Block requests from PetalBot. Normally robots.txt is enough to stop PetalBot
from making requests [1]. However, if SearXNG is offered below a path
(example.org/search), then the robots.txt is not available in the root path
of the domain / subdomain.
[1] https://webmaster.petalsearch.com/site/petalbot
Signed-off-by: Markus Heiser <markus.heiser@darmarit.de>
Wikipedia's zh-classical is not zh_Hant (see doc-string of engines.wikipedia).
Fixed the example in the doc-string of locales.get_engine_locale() to 'zh_TW'.
Signed-off-by: Markus Heiser <markus.heiser@darmarit.de>
To set the language from language recognition and keep the value selected by
the client, the previous implementation creates a copy of the SearchQuery
object and manipulates it by calling the function replace_auto_language().
This patch implements a similar functionality in a more central place: in the
function get_search_query_from_webapp(), where the SearchQuery object is
built up.
Additionally, this patch uses the language preferred by the client if
language recognition does not have a match; the existing implementation does
not care about client preferences and uses 'all' in case of no match.
Signed-off-by: Markus Heiser <markus.heiser@darmarit.de>
Follow up of #2269
The script to update the descriptions of the engines no longer works since PR
#2269 has been merged.
searx/engines/wikipedia.py
==========================
1. There was a misuse of zh-classical.wikipedia.org:
   - `zh-classical` is dedicated to Classical Chinese [1], which is not
     traditional Chinese [2].
   - zh.wikipedia.org has LanguageConverter enabled [3] and dynamically shows
     simplified or traditional Chinese according to the HTTP Accept-Language
     header.
2. The update_engine_descriptions.py needs a list of all wikipedias. The
implementation from #2269 included only a reduced list:
- https://meta.wikimedia.org/wiki/Wikipedia_article_depth
- https://meta.wikimedia.org/wiki/List_of_Wikipedias
searxng_extra/update/update_engine_descriptions.py
==================================================
Before PR #2269 there was a match_language() function that did an approximation
using various methods. With PR #2269 there are only the types in the data model
of the languages, which can be recognized by babel. The approximation methods,
which are needed (only here) in the determination of the descriptions, must be
replaced by other methods.
[1] https://en.wikipedia.org/wiki/Classical_Chinese
[2] https://en.wikipedia.org/wiki/Traditional_Chinese_characters
[3] https://www.mediawiki.org/wiki/Writing_systems#LanguageConverter
Closes: https://github.com/searxng/searxng/issues/2330
Signed-off-by: Markus Heiser <markus.heiser@darmarit.de>
Since [bb3a01f8] has been merged into the Farside project, Farside instances
no longer need to send requests to SearXNG instances [1].
There are some old unmaintained Farside instances on the web that continue to
query SearXNG instances --> we can safely block their requests.
[1] https://github.com/benbusby/farside/issues/95
[bb3a01f8] https://github.com/benbusby/farside/commit/bb3a01f8
Signed-off-by: Markus Heiser <markus.heiser@darmarit.de>
When the user presses [TAB], the input form should be filled with the
highlighted item from the autocomplete list, but no search should be
triggered. In other words: what we now have by pressing once on [ENTER]
should be mapped to the [TAB] key, and pressing [ENTER] once should trigger a
search query. [1]
[1] https://github.com/searxng/searxng/issues/778#issuecomment-1016593816
Signed-off-by: Markus Heiser <markus.heiser@darmarit.de>
- Update input when selecting autocomplete prediction with keyboard
- Search immediately by pressing enter key
- Search immediately by clicking on an autocomplete suggestion
Related:
- https://github.com/searxng/searxng/issues/778
On some result items from Bing-WEB the `<span class='algoSlug_icon'>` tag is
the only tag that contains a description. The issue can be reproduced by [1]::

    !bi vmware
[1] https://github.com/searxng/searxng/issues/1764#issuecomment-1417990531
Reported-by: @AlyoshaVasilieva
Signed-off-by: Markus Heiser <markus.heiser@darmarit.de>
This PR makes no functional change; it is just an attempt to make it clearer
in the code what a default category is and what a subcategory is. The
previous name 'others' leads to confusion with the **category 'other'**.
If an engine is not assigned to a category, the default is assigned::

    DEFAULT_CATEGORY = 'other'

If an engine has only one category and this category is shown as a tab in the
user interface, this engine has no further subgrouping::

    NO_SUBGROUPING = 'without further subgrouping'
Related:
- https://github.com/searxng/searxng/issues/1604
- https://github.com/searxng/searxng/pull/1545
Signed-off-by: Markus Heiser <markus.heiser@darmarit.de>
When using ``use_default_settings: true``, removing default categories from
settings.yml will not remove them from the UI.
The value ``categories_as_tabs`` is a dictionary type (a4c2cfb) and dictionary
types are merged additive by ``settings_loader.update_settings()``.
This patch replaces the default ``categories_as_tabs`` by the one from the
``user_settings``.
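The pitfall in a nutshell (an additive, update()-style merge cannot delete
keys)::

    default = {'general': {}, 'images': {}, 'news': {}}
    user    = {'general': {}}          # admin removed 'images' and 'news'
    merged  = {**default, **user}
    # merged still contains 'images' and 'news' --> the tabs stay in the UI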
Related: https://github.com/searxng/searxng/issues/1019#issuecomment-1193145654
Signed-off-by: Markus Heiser <markus.heiser@darmarit.de>
- requests without HTTP header 'Connection' or missing 'User-Agent' will be
blocked by the limiter
- re_bot is related to 'User-Agent' and has been renamed to block_user_agent
Signed-off-by: Markus Heiser <markus.heiser@darmarit.de>
Google News returns internal links where the origin URL is encoded in a
URL-safe base64 string (RFC 4648).
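Decoding such a link in Python (the padding handling is the relevant detail,
since the trailing '=' characters are commonly stripped)::

    import base64

    def decode_internal_link(encoded: str) -> bytes:
        padding = '=' * (-len(encoded) % 4)      # restore stripped padding
        return base64.urlsafe_b64decode(encoded + padding)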
Closes: https://github.com/searxng/searxng/issues/1959
Signed-off-by: Markus Heiser <markus.heiser@darmarit.de>
In debug mode more detailed logging is needed to evaluate if an access should
have been blocked by the limiter.
BTW: remove duplicate code checking bot signature ``re_bot.match(user_agent)``
Signed-off-by: Markus Heiser <markus.heiser@darmarit.de>
Since March 28, Google has changed its response; this patch fixes the Google
engine to scrape the results & images from the newly designed response.
Closes: https://github.com/searxng/searxng/issues/2287
Signed-off-by: Markus Heiser <markus.heiser@darmarit.de>
This patch replaces the *full of magic* ``utils.match_language`` function by a
``locales.match_locale``. The ``locales.match_locale`` function is based on the
``locales.build_engine_locales`` introduced in 9ae409a0 [1].
In the past SearXNG only supported a search by language but not by region.
This was changed a long time ago: regions were added to the SearXNG core, but
not to the engines. The ``utils.match_language`` was the function to handle
the different aspects of languages/regions in the SearXNG core and the
supported *languages* in the engines. The ``utils.match_language`` did it
with some magic and works well for most use cases, but fails in some edge
cases.
To replace the coexistence of languages and regions in the SearXNG core, the
``locales.build_engine_locales`` was introduced in 9ae409a0 [1]. With the
last patches all engines have been migrated to ``fetch_traits`` and a
language/region concept that is based on ``locales.build_engine_locales``.
To summarize: there is no longer a need for ``utils.match_language``.
[1] https://github.com/searxng/searxng/pull/1652
Signed-off-by: Markus Heiser <markus.heiser@darmarit.de>
All engines have been migrated from ``supported_languages`` to the
``fetch_traits`` concept. There is no longer a need for the obsolete code
that implements the ``supported_languages`` concept.
Signed-off-by: Markus Heiser <markus.heiser@darmarit.de>
re-implementation of the Archlinux Wiki:
- fetch_traits(): fetch languages, wiki URLs and title arguments
- add content field to the result list
- add documentation
The wikis from wiki.archlinux.fr, wiki.archlinux.ro and archtr.org/wiki no
longer exist (they have been merged into the main wiki).
Signed-off-by: Markus Heiser <markus.heiser@darmarit.de>
- fetch_traits() SepiaSearch and Peertube are using identical languages.
Replace module's dictionary `supported_languages` by `engine.traits.languages`
(data_type: `traits_v1`).
- fixed code to pass pylint
- request(): add argument boostLanguages
- response(): is replaced by peertube's video_response() function, which adds
metadata from channel name, host & tags
Signed-off-by: Markus Heiser <markus.heiser@darmarit.de>
- fetch_traits(): fetch locales (and languages) from dailymotion API
- removed obsolete data-type "supported_languages"
- add documentation
- improved argument list of the HTTP request:
- add argument: family_filter_map
- add conditional argument: localization
Don't add the localization and country arguments if the user selects a
language (:de, :en, ..)
- improve code quality (mainly improve readability)
Signed-off-by: Markus Heiser <markus.heiser@darmarit.de>
Partial reverse engineering of the Google engines including an improved
language and region handling based on the engine.traits_v1 data.
Whenever possible, the implementations of the Google engines try to make use
of the async REST APIs. The get_lang_info() has been generalized to a
get_google_info() function; especially the region handling has been improved
by adding the cr parameter.
searx/data/engine_traits.json
Add data type "traits_v1" generated by the fetch_traits() functions from:
- Google (WEB),
- Google images,
- Google news,
- Google scholar and
- Google videos
and remove data from obsolete data type "supported_languages".
A traits.custom type that maps region codes to *supported_domains* is fetched
from https://www.google.com/supported_domains
searx/autocomplete.py:
Reverse engineered autocomplete from Google WEB. Supports Google's languages
and subdomains. The old API suggestqueries.google.com/complete has been
replaced by the async REST API: https://{subdomain}/complete/search?{args}
searx/engines/google.py
Reverse engineering and extensive testing ..
- fetch_traits(): Fetch languages & regions from Google properties.
- always use the async REST API (formerly known as 'use_mobile_ui')
- use *supported_domains* from traits
- improved the result list by fetching './/div[@data-content-feature]'
and parsing the type of the various *content features* --> thumbnails are
added
searx/engines/google_images.py
Reverse engineering and extensive testing ..
- fetch_traits(): Fetch languages & regions from Google properties.
- use *supported_domains* from traits
- if exists, freshness_date is added to the result
- issue 1864: result list has been improved a lot (due to the new cr parameter)
searx/engines/google_news.py
Reverse engineering and extensive testing ..
- fetch_traits(): Fetch languages & regions from Google properties.
*supported_domains* is not needed but a ceid list has been added.
- different region handling compared to Google WEB
- fixed for various languages & regions (due to the new ceid parameter) /
avoid CONSENT page
- Google News no longer supports time ranges
- result list has been fixed: XPath of pub_date and pub_origin
searx/engines/google_videos.py
- fetch_traits(): Fetch languages & regions from Google properties.
- use *supported_domains* from traits
- add paging support
- implement an async request ('asearch': 'arc' & 'async':
  'use_ac:true,_fmt:html')
- simplified code (thanks to '_fmt:html' request)
- issue 1359: fixed xpath of video length data
searx/engines/google_scholar.py
- fetch_traits(): Fetch languages & regions from Google properties.
- use *supported_domains* from traits
- request(): include patents & citations
- response(): fixed CAPTCHA detection (Scholar has its own CAPTCHA manager)
- hardening XPath to iterate over results
- fixed XPath of pub_type (has been changed from gs_ct1 to gs_cgt2 class)
- issue 1769 fixed: the new request implementation is no longer incompatible
Signed-off-by: Markus Heiser <markus.heiser@darmarit.de>
Partial reverse engineering of the DuckDuckGo (DDG) engines including an
improved language and region handling based on the engine.traits_v1 data.
- DDG Lite
- DDG Instant Answer API
- DDG Images
- DDG Weather
docs/src/searx.engine.duckduckgo.rst:
Online documentation of the DDG engines (make docs.live)
searx/data/engine_traits.json
Add data type "traits_v1" generated by the fetch_traits() functions from:
- "duckduckgo" (WEB),
- "duckduckgo images" and
- "duckduckgo weather"
and remove data from obsolete data type "supported_languages".
searx/autocomplete.py:
Reverse engineered autocomplete from DDG. Supports DDG's languages.
searx/engines/duckduckgo.py:
- fetch_traits(): Fetch languages & regions from DDG.
- get_ddg_lang(): Get DDG's language identifier from SearXNG's locale. DDG
defines its languages by region codes. DDG-Lite does not offer a language
selection to the user, only a region can be selected by the user.
- Cache ``vqd`` value: The vqd value depends on the query string and is
  needed for the follow-up pages or the images loaded by an XMLHttpRequest
  (DDG images). The ``vqd`` value of a search term is stored for 10 min in
  the Redis DB (see the caching sketch after this list).
- DDG Lite engine: reverse engineered request method with improved language
  and region support and better ``vqd`` handling.
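The caching sketch mentioned above, assuming a Redis client; the key name and
helper names are illustrative::

    def cache_vqd(redis_client, query: str, value: str):
        """Cache a vqd value for 10 min, keyed by the query string."""
        redis_client.set('SearXNG_ddg_vqd:' + query, value, ex=600)

    def get_vqd(redis_client, query: str):
        """Return the cached vqd value or None."""
        value = redis_client.get('SearXNG_ddg_vqd:' + query)
        return value.decode() if value is not None else None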
searx/engines/duckduckgo_definitions.py: DDG Instant Answer API
The *instant answers* API does not support languages, or at least we could not
find out how language support should work. It seems that most of the features
are based on English terms.
searx/engines/duckduckgo_images.py: DDG Images
Reverse engineered request method. Improved language and region handling
based on cookies and the engine.traits_v1 data. Response: add the image
format to the result list.
searx/engines/duckduckgo_weather.py: DDG Weather
Improved language and region handling based on cookies and the
engine.traits_v1 data.
Signed-off-by: Markus Heiser <markus.heiser@darmarit.de>
One reason for the often seen CAPTCHAs on Startpage requests is the
incomplete requests SearXNG sends to startpage.com: this patch is a
completely new implementation of the ``request()`` function, reverse
engineered from Startpage's search form. The new implementation:
- use traits of data_type: traits_v1 and drop deprecated data_type: supported_languages
- adds time-range support
- adds save-search support
- fix searxng/searxng/issues 1884
- fix searxng/searxng/issues 1081 --> improvements to avoid CAPTCHA
In preparation for more categories (News, Images, Videos ..) from Startpage, the
variable ``startpage_categ`` was set up. The default value is ``web`` and other
categories from Startpage are not yet implemented.
Signed-off-by: Markus Heiser <markus.heiser@darmarit.de>
BTW this fixes an issue in wikipedia: SearXNG's locales zh-TW and zh-HK are now
using language `zh-classical` from wikipedia (and not `zh`).
Signed-off-by: Markus Heiser <markus.heiser@darmarit.de>
With the language and region tags from the EngineTraitsMap the handling of
SearXNG's tags of languages and regions has been normalized and is no longer
a *mystery*. The "languages" became "locales" that are supported by babel and
by this, the update_engine_traits.py can be simplified a lot.
Other code places can be simplified as well, but these simplifications
should (respectively can) only be done when none of the engines work with the
deprecated EngineTraits.supported_languages interface anymore.
This commit replaces searx.languages by searx.sxng_locales and fixes the
naming from "language" to "locale" in some places (e.g. language_codes -->
sxng_locales).
Signed-off-by: Markus Heiser <markus.heiser@darmarit.de>
Implements a fetch_traits function for the Wikipedia engines.
.. note::
Does not include migration of the request method from 'supported_languages'
to 'traits' (EngineTraits) object!
Signed-off-by: Markus Heiser <markus.heiser@darmarit.de>
Implements a fetch_traits function for the Google engines.
.. note::
Does not include migration of the request method from 'supported_languages'
to 'traits' (EngineTraits) object!
Signed-off-by: Markus Heiser <markus.heiser@darmarit.de>
Implements a fetch_traits function for the DuckDuckGo engines.
.. note::
Does not include migration of the request method from 'supported_languages'
to 'traits' (EngineTraits) object!
Signed-off-by: Markus Heiser <markus.heiser@darmarit.de>
Implements a fetch_traits function for the Yahoo engine.
.. note::
Includes migration of the request method from 'supported_languages' to
'traits' (EngineTraits) object!
Signed-off-by: Markus Heiser <markus.heiser@darmarit.de>
Implements a fetch_traits function for the Qwant engines.
.. note::
Includes migration of the request method from 'supported_languages' to
'traits' (EngineTraits) object!
Signed-off-by: Markus Heiser <markus.heiser@darmarit.de>
Implements a fetch_traits function for the Dailymotion engine.
.. note::
Does not include migration of the request method from 'supported_languages'
to 'traits' (EngineTraits) object!
Signed-off-by: Markus Heiser <markus.heiser@darmarit.de>
Implements a fetch_traits function for the Startpage engine.
.. note::
Does not include migration of the request method from 'supported_languages'
to 'traits' (EngineTraits) object!
Signed-off-by: Markus Heiser <markus.heiser@darmarit.de>
Implements a fetch_traits function for the Bing engines.
.. note::
Does not include migration of the request method from 'supported_languages'
to 'traits' (EngineTraits) object!
Signed-off-by: Markus Heiser <markus.heiser@darmarit.de>
- fetch_traits(): Fetch languages from peertube's search-index source code.
  [mod] Include migration of the request method from 'supported_languages'
  to the 'traits' (EngineTraits) object.
  [fix] the old supported_languages_url is no longer valid since the sources
  have been moved to a different path.
- fixed code to pass pylint
- request(): complete re-implementation based on the API docs [1]
- response(): complete re-implementation, adds several fields missed before
- add source code documentation
[1] https://docs.joinpeertube.org/api-rest-reference.html#tag/Search/operation/searchVideos
Signed-off-by: Markus Heiser <markus.heiser@darmarit.de>
Implementations of the *traits* of the engines.
Engine's traits are fetched from the origin engine and stored in a JSON file in
the *data folder*. Most often traits are languages and region codes and their
mapping from SearXNG's representation to the representation in the origin search
engine.
To load traits from the persistence::
searx.enginelib.traits.EngineTraitsMap.from_data()
For new traits new properties can be added to the class::
searx.enginelib.traits.EngineTraits
.. hint::
The implementation is backward compatible to the deprecated
*supported_languages method* from the vintage implementation.
The vintage code is tagged as *deprecated* and can be removed when all
engines have been ported to the *traits method*.
Signed-off-by: Markus Heiser <markus.heiser@darmarit.de>
From the analysis of @9Ninety [1] we know that DDG (and maybe other engines;
I have Startpage in mind) does some kind of TLS fingerprinting to block bots.
This patch shuffles the default ciphers from httpx to avoid a cipher profile
that is known to httpx (and blocked by DDG).
[1] https://github.com/searxng/searxng/issues/2246#issuecomment-1467895556
----
From `What Is TLS Fingerprint and How to Bypass It`_
> When implementing TLS fingerprinting, servers can't operate based on a
> locked-in whitelist database of fingerprints. New fingerprints appear
> when web clients or TLS libraries release new versions. So, they have to
> live off a blocklist database instead.
> ...
> It's safe to leave the first three as is but shuffle the remaining ciphers
> and you can bypass the TLS fingerprint check.
.. _What Is TLS Fingerprint and How to Bypass It:
https://www.zenrows.com/blog/what-is-tls-fingerprint#how-to-bypass-tls-fingerprinting
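A minimal sketch of the shuffle, assuming a colon-separated OpenSSL cipher
string (the httpx wiring is left out; exclusion rules like '!aNULL' would
have to be kept in place and not shuffled)::

    import random
    import ssl

    def shuffled_ciphers(cipher_string: str) -> str:
        ciphers = cipher_string.split(':')
        head, tail = ciphers[:3], ciphers[3:]   # keep the first three as-is
        random.shuffle(tail)
        return ':'.join(head + tail)

    ctx = ssl.create_default_context()
    ctx.set_ciphers(shuffled_ciphers(
        'ECDHE+AESGCM:ECDHE+CHACHA20:DHE+AESGCM:DHE+CHACHA20:ECDH+AESGCM:RSA+AESGCM'
    ))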
Signed-off-by: Markus Heiser <markus.heiser@darmarit.de>
Closes: https://github.com/searxng/searxng/issues/2246
Partial merge of [PR-1736]
[PR-1736] https://github.com/searxng/searxng/pull/1736
Suggested-by: @FunctionalHacker in [1]
Signed-off-by: Markus Heiser <markus.heiser@darmarit.de>
When the user chooses "Auto-detected", the choice remains for the following
queries. The detected language is displayed.
For example "Auto-detected (en)":
* the language of the next query is going to be auto-detected
* for the current query, the detected language is English.
This replaces the autodetect_search_language plugin.
Tineye becomes active as soon as an https:// signature is found in the search
term, but most of the time a reverse image search is not requested when a URL
is specified; often the URL is just from a copy & paste.
The frequent requests to tineye lead, in the end, to the SearXNG instance
being blocked by tineye and the user seeing unexpected error messages.
BTW: many maintainers have disabled this engine in their local SearXNG settings.
Signed-off-by: Markus Heiser <markus.heiser@darmarit.de>
* fix typo in settings.yml: replace suspend_times by suspended_times
* always use the delay defined in settings.yml:
  * HTTP status 402 and 403: read the value from settings.yml instead of
    using the hardcoded value of 1 day.
  * startpage engine: a CAPTCHA suspends the engine for one day instead of
    one week
* use html input elements instead of buttons for the pagination forms at the bottom of the result page
* move the less section that hides the pagination number widget on mobile to the mobile section
* clean up the less code for styling the numbers of the pagination widget
* fix: add the margin for box results (like in news category) to the bottom of the result to have a margin between pagination widget and article result
Adds to the navigation widget, preserving forward/backward nav, and
inserting a list of clickable page numbers between them.
Phone sized devices continue without this widget as deterministic
display under small screen sizes has not been solved.
The widget is agnostic to the actual number of pages one can navigate to and
as such shows all plausible, albeit not necessarily valid, possibilities.
This widget does not interfere with infinite scroll in any fashion.
continuation of #2117
related to #2111
This commit:
* fixes the Docker tag using an additional variable DOCKER_TAG, see searx/version.py
* fixes the Docker labels org.label-schema.vcs-ref and org.opencontainers.image.revision
* adds searx/version_frozen to .gitignore
Make suspended_time changeable in settings.yml
Allow different values to be set for different exceptions.
Co-authored-by: Alexandre Flament <alex@al-f.net>
This patch hardens the parsing of the Bing response:
1. To fix [2087], check if the selected result item contains a link;
   otherwise skip the result item and continue in the result loop. Increment
   the result pointer when a result has been added / the enumerate that
   counts skipped items is no longer valid when result items are skipped.
   To test the bugfix use: ``!bi :all cerbot``
2. Limit the XPath selection of result items to direct children nodes (list
   items ``li``) of the ordered list (``ol``).
   To test the selector use: ``!bi :en pontiac aztek wiki``
   .. in the result list you should find the wikipedia entry on top,
   compare [2068]
[2087] https://github.com/searxng/searxng/issues/2087
[2068] https://github.com/searxng/searxng/issues/2068
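Illustrative tightening of the selector with lxml (the element id is an
assumption about Bing's markup)::

    from lxml import html

    def parse_results(response_text: str):
        dom = html.fromstring(response_text)
        # before: //ol[@id='b_results']//li  -- also matches nested list items
        # after: only direct children of the ordered list
        return dom.xpath("//ol[@id='b_results']/li")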
This changes the Suggestions to be a single column, not a wrapping row,
changing the input to be incapable of overflowing into visually adjacent
elements.
Modify the XPath selector to get the wikipedia result, plus small fixes.
About the result content: especially with the Wikipedia result we'd get
several paragraph elements; only the first paragraph would be taken and
displayed in the search result.
- Add documentation to the plugin
- Harmonize FastText language model with SearXNG's language model
Resources::

    import fasttext  # --> +10 MB
    fasttext.load_model(str(data_dir / 'lid.176.ftz'))  # --> +4MB
Suggested-by: @dalf
- To speed up and simplify the deployment, use fasttext-wheel instead of
  fasttext
- Building numpy on the Alpine Linux of docker-images takes ages --> install
  py3-numpy from Alpine's package manager (apk)
- Alpine Linux on docker-images (musl libc) does not support fasttext-wheel
  (gnu libc) --> patch the Dockerfile and build from fasttext::

    sed -i s/fasttext-wheel/fasttext/ requirements.txt
Signed-off-by: Markus Heiser <markus.heiser@darmarit.de>
- the environment variable SEARXNG_REDIS_URL overrides the setting value redis.url
- ./manage sets SEARXNG_REDIS_URL to unix:///usr/local/searxng-redis/run/redis.sock if:
- the socket exists
- SEARXNG_REDIS_URL is not already defined
Update of PR #1856
Co-authored-by: Markus Heiser <markus.heiser@darmarit.de>
The default behavior of urllib.parse_qs is to discard blank values; this
causes a preference of "none" to be deserialized as undefined, so the
instance default is used rather than the selected preference.
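The stdlib behavior in question::

    >>> from urllib.parse import parse_qs
    >>> parse_qs('a=1&b=')            # blank 'b' is silently dropped
    {'a': ['1']}
    >>> parse_qs('a=1&b=', keep_blank_values=True)
    {'a': ['1'], 'b': ['']}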
redis.Redis.from_url(url) doesn't check if the URL is valid.
Before this commit, actual errors were detected later, when the client was
actually used. With this commit, client() makes sure to return a valid Redis
client or None. Also, the code makes sure not to log the password of the
Redis URL.
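A sketch of the described client() behavior, assuming redis-py; names are
illustrative::

    import logging
    import redis

    def client(url):
        """Return a working Redis client or None."""
        if not url:
            return None
        try:
            c = redis.Redis.from_url(url)
            c.ping()   # from_url() does not connect; force an early check
            return c
        except redis.exceptions.RedisError:
            # log without the URL so the password never reaches the logs
            logging.exception("can't connect to the Redis DB")
            return None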
settings.yml:
* The default URL was unix:///usr/local/searxng-redis/run/redis.sock?db=0
* The default URL is now "false"
The default URL makes the log difficult to deal with:
if the admin didn't install a Redis instance, the logs record a false error.
It worked before because SearXNG initialized the Redis connection when the limiter started.
In this commit, SearXNG initializes Redis in searx/webapp.py
so various components can use Redis without taking care of the initialization step.
- fix the issue reported in #1809
- filter out `None` values from the issn and isbn lists
- add comments (from publicationName)
- add publisher
Closes: https://github.com/searxng/searxng/issues/1809
Signed-off-by: Markus Heiser <markus.heiser@darmarit.de>
Some result items from core.ac.uk do not have a URL::

    Traceback (most recent call last):
      File "searx/search/processors/online.py", line 154, in search
        search_results = self._search_basic(query, params)
      File "searx/search/processors/online.py", line 142, in _search_basic
        return self.engine.response(response)
      File "SearXNG/searx/engines/core.py", line 73, in response
        'url': source['urls'][0].replace('http://', 'https://', 1),
Signed-off-by: Markus Heiser <markus.heiser@darmarit.de>
Currently, when oa_doi_rewrite finds a DOI in the result URL, it replaces the
URL. In this commit, the plugin adds the key "doi" to the result, so that
paper.html can show it.
* Move the datetime-to-str code from searx.webapp.search to
  searx.webutils.searxng_format_date
* When the month, day, hour, minute and second are zero, the function returns
  only the year.
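A sketch of the described date formatting (the fallback branch is an
assumption)::

    from datetime import datetime

    def searxng_format_date(dt: datetime) -> str:
        # a date like 2022-01-01 00:00:00 carries only the year as information
        if (dt.month, dt.day, dt.hour, dt.minute, dt.second) == (1, 1, 0, 0, 0):
            return str(dt.year)
        return dt.isoformat()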
The word "hackable" may arouse interest in programmers to participate in the
development, but it scares the ordinary user.
Signed-off-by: Markus Heiser <markus.heiser@darmarit.de>
Google News is in a rework; the content area of a news item has been
removed.
Closes: https://github.com/searxng/searxng/issues/1790
Signed-off-by: Markus Heiser <markus.heiser@darmarit.de>
Fix::

    searx/locales.py:docstring of searx.locales.get_engine_locale:17: \
    WARNING: Definition list ends without a blank line; unexpected unindent.

Improvement: don't show default values in the generated documentation when it
is more of a mess than useful information (`:meta hide-value:`).
Signed-off-by: Markus Heiser <markus.heiser@darmarit.de>
no_result_for_http_status contains a list of HTTP statuses. These HTTP
statuses are treated as an empty result list. In other cases an exception is
thrown as usual.
Previously, raise_for_httperror was ignoring all HTTP errors, which made
defective engines invisible in the stats.
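A sketch of the idea (the real control flow in the processors differs; names
are illustrative)::

    def is_empty_result(resp, no_result_for_http_status=()) -> bool:
        """True when the HTTP status is configured as 'no results'."""
        if resp.status_code in no_result_for_http_status:
            return True
        resp.raise_for_status()  # other HTTP errors surface in the stats
        return False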
With the POST method, autocomplete.js does not URL encode the values.
For example "1+1" is sent as "1+1", which is read as "1 1" since spaces are
URL encoded with a plus.
There is no clean way to fix the bug since autocomplete.js seems abandoned.
The commit monkey patches the ajax function of autocomplete.js.
Related to #1695
Only raise "suspicious Accept-Encoding" when both "gzip" and "deflate" are
missing from Accept-Encoding.
This prevents browsers which only implement one compression method from
being blocked by the limiter plugin. An example browser which is currently
blocked: the Lynx browser (https://lynx.invisible-island.net)
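The relaxed check in a nutshell::

    def suspicious_accept_encoding(accept_encoding: str) -> bool:
        encodings = [e.split(';')[0].strip() for e in accept_encoding.split(',')]
        # only suspicious when BOTH compression methods are missing
        return 'gzip' not in encodings and 'deflate' not in encodings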
When a user selects an unknown or invalid locale by using the search syntax::

    !qw siemens :de-TW

Before this patch an UnknownLocaleError exception was raised:
```
Traceback (most recent call last):
File "SearXNG/searx/search/processors/online.py", line 154, in search
search_results = self._search_basic(query, params)
File "SearXNG/searx/search/processors/online.py", line 128, in _search_basic
self.engine.request(query, params)
File "SearXNG/searx/engines/qwant.py", line 98, in request
q_locale = get_engine_locale(params['language'], supported_languages, default='en_US')
File "SearXNG/searx/locales.py", line 216, in get_engine_locale
locale = babel.Locale.parse(searxng_locale, sep='-')
File "SearXNG/local/py3/lib/python3.8/site-packages/babel/core.py", line 330, in parse
raise UnknownLocaleError(input_id)
```
This patch implements a simple exception handling: since e.g. `de-TW` does
not exist, `de` will be used to get the engine's locale. On invalid terms
like `xy-XY` the default will be returned.
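A sketch of the described exception handling, assuming babel::

    import babel

    def parse_locale(searxng_locale: str, default: str = 'en_US') -> babel.Locale:
        try:
            return babel.Locale.parse(searxng_locale, sep='-')
        except (babel.UnknownLocaleError, ValueError):
            pass
        try:
            # e.g. 'de-TW' does not exist --> retry with the language only
            return babel.Locale.parse(searxng_locale.split('-')[0])
        except (babel.UnknownLocaleError, ValueError):
            # invalid terms like 'xy-XY' --> fall back to the default
            return babel.Locale.parse(default, sep='_')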
Signed-off-by: Markus Heiser <markus.heiser@darmarit.de>
The request function should not request a language (aka locale) that is not
supported by Qwant. Selecting a locale like zh-TW ends in Qwant's API error::

    ERROR searx.engines.qwant news: exception : \
      API error::locale must be one of the following values: \
      en_gb, en_ie, en_us, en_ca, en_my, en_au, en_nz, de_de, de_ch, de_at, fr_fr, \
      fr_be, fr_ch, fr_ca, fr_ad, fc_ca, co_fr, es_es, es_ar, es_cl, es_co, es_mx, \
      es_pe, es_ad, ca_es, ca_ad, ca_fr, eu_es, eu_fr, it_it, it_ch, pt_pt, pt_ad, \
      nl_be, nl_nl
The existing searx.utils.match_language function is unsuitable for this
purpose; it is replaced by the function searx.locales.get_engine_locale,
which is based on the methods from the babel package.
Qwant's _fetch_supported_languages function has been revised to filter out
languages (aka locales) not supported by Qwant.
Signed-off-by: Markus Heiser <markus.heiser@darmarit.de>
The match_language function sometimes returns incorrect results, which is why
a new function get_engine_locale is required.
A bugfix of match_language is not easily possible, because there is almost no
documentation for it and even the call parameters are undefined. E.g. the
function processes values like the ones from yahoo::
"yahoo": [
"ar",
...
"zh_chs",
"zh_cht"
]
The get_engine_locale has been documented in detail, there is a clear
description of the assumptions as well as the requirements and approximation
rules (read doc-string for more details)::
Argument ``engine_locales`` is a python dict that maps *SearXNG locales* to
corresponding *engine locales*:
<engine>: {
# SearXNG string : engine-string
'ca-ES' : 'ca_ES',
'fr-BE' : 'fr_BE',
'fr-CA' : 'fr_CA',
'fr-CH' : 'fr_CH',
'fr' : 'fr_FR',
...
'pl-PL' : 'pl_PL',
'pt-PT' : 'pt_PT'
}
.. hint::
The *SearXNG locale* string has to be known by babel!
In the following you will find a comparison:
>>> import babel.languages
>>> from searx.utils import match_language
>>> from searx.locales import get_engine_locale
Assume we have an engine that supports the following locales:
>>> lang_list = {
... "zh-CN": "zh_CN",
... "zh-HK": "zh_HK",
... "nl-BE": "nl_BE",
... "fr-CA": "fr_CA",
... }
Assumption:
A. When a user selects a language the results should be optimized according to
the selected language.
B. When user selects a language and a territory the results should be
optimized with first priority on territory and second on language.
----
Example: (Assumption A.)
A user selects region 'zh-TW' which should end in zh_HK
hint:
CN is 'Hans' and HK ('Hant') fits better to TW ('Hant')
>>> get_engine_locale('zh-TW', lang_list)
'zh_HK'
>>> lang_list[match_language('zh-TW', lang_list)]
'zh_CN'
----
Example: (Assumption A.)
A user selects only the language 'zh' which should end in CN
>>> get_engine_locale('zh', lang_list)
'zh_CN'
>>> lang_list[match_language('zh', lang_list)]
'zh_CN'
----
Example: (Assumption B.)
A user selects region 'fr-BE' which should end in nl-BE
hint:
priority should be on the territory the user selected. If the user
prefers 'fr' he will select 'fr' without a region tag.
>>> get_engine_locale('fr-BE', lang_list, default='unknown')
'nl_BE'
>>> match_language('fr-BE', lang_list, fallback='unknown')
'fr-CA'
----
Example: (Assumption A.)
A user selects only the language 'fr' which should end in fr_CA
>>> get_engine_locale('fr', lang_list)
'fr_CA'
>>> lang_list[match_language('fr', lang_list)]
'fr_CA'
----
The difference in priority on the territory is best shown with an engine that
supports the following locales:
>>> lang_list = {
... "fr-FR": "fr_FR",
... "fr-CA": "fr_CA",
... "en-GB": "en_GB",
... "nl-BE": "nl_BE",
... }
----
Example: (Assumption A.)
A user selects only a language
>>> get_engine_locale('en', lang_list)
'en_GB'
>>> match_language('en', lang_list)
'en-GB'
hint: the engine supports fr_FR and fr_CA since no territory is given, fr_FR
takes priority ..
>>> get_engine_locale('fr', lang_list)
'fr_FR'
>>> lang_list[match_language('fr', lang_list)]
'fr_FR'
----
Example: (Assumption B.)
A user selects region 'fr-BE' which should end in nl-BE
>>> get_engine_locale('fr-BE', lang_list)
'nl_BE'
>>> lang_list[match_language('fr-BE', lang_list)]
'fr_FR'
----
If the user selects a language and there are two locales like the following:
>>> lang_list = {
... "fr-BE": "fr_BE",
... "fr-CH": "fr_CH",
... }
>>>
>>> get_engine_locale('fr', lang_list)
'fr_BE'
>>> lang_list[match_language('fr', lang_list)]
'fr_BE'
Looks like both functions return the same value, but match_language depends on the
order of the dictionary (which is not predictable):
>>> lang_list = {
... "fr-CH": "fr_CH",
... "fr-BE": "fr_BE",
... }
>>> get_engine_locale('fr', lang_list)
'fr_BE'
>>> lang_list[match_language('fr', lang_list)]
'fr_CH'
>>>
The get_engine_locale selects the locale by looking at the "population
percent", and this percentage is higher in BE (68%) compared to CH (21%).
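The underlying lookup with babel (the exact percentages come from babel's
CLDR data)::

    import babel.languages

    be = babel.languages.get_territory_language_info('BE')['fr']['population_percent']
    ch = babel.languages.get_territory_language_info('CH')['fr']['population_percent']
    assert be > ch   # 'fr' has the larger speaker share in BE --> fr_BE wins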
Signed-off-by: Markus Heiser <markus.heiser@darmarit.de>
By using the new property `qwant_categ:`, the category of qwant is no longer
bound to the category of SearXNG.
Signed-off-by: Markus Heiser <markus.heiser@darmarit.de>
Neeva is "the world's first ad-free, private search engine" and uses data from Apple, Bing, Yelp and "others".
They claim to crawl "hundreds of millions" of URLs a day (https://twitter.com/Neeva/status/1536447373903335426).
Some HTTP clients have issues with the ``opensearch.xml`` from SearXNG
(related [1][2]), while other OpenSearch descriptions [3] (e.g. from qwant)
work flawlessly.
Inspired by the OpenSearch description from qwant and with information from
the specification [4], the ``opensearch.xml`` has been *improved*.
- convert `<Url>` methods from lower case to upper case (`POST`|`GET`)
- add `<moz:SearchForm>` and `xmlns:moz="http://www.mozilla.org/2006/browser/search/"`
- add `<Query role="example" searchTerms="SearXNG" />` [4]
OpenSearch description documents should include at least one Query element of
`role="example"` that is expected to return search results. Search clients may
use this example query to validate that the search engine is working properly.
- modified `<LongName>` to SearXNG
- modified `<Description>` the word 'hackable' scares uninitiated users and was removed
- add the `type="image/png"` to `<Image>`
Tests can be done by::

    make run

Visit http://127.0.0.1:8888/ and add the search engine to your web browser /
test with different web browsers from desktop and smartphones (are there any
iOS users here? please test on Safari and Chrome).
[1] https://app.element.io/#/room/#searxng:matrix.org/$xN_abdKhNqUlgXRBrb_9F3pqOxnSzGQ1TG0s0G9hQVw
[2] https://github.com/searxng/searxng/issues/431
[3] https://developer.mozilla.org/en-US/docs/Web/OpenSearch
[4] https://github.com/dewitt/opensearch/blob/master/opensearch-1-1-draft-6.md#the-query-element
Signed-off-by: Markus Heiser <markus.heiser@darmarit.de>
This implements the DeepL translation engine. It works much like lingva, but
talks directly to the DeepL API. This API only needs a to-lang; from-lang is
faked for now.
There is a free option to use [1].
[1] https://www.deepl.com/pro-api?cta=header-pro-api for registering a free account.
yep.com is still in beta; the api.yep.com does not have paging support. There
is only a 'limit' argument with a maximum of 100 results.
yep.com seems fast; there is no need for a timeout of 12 sec.
The API returns JSON regardless of the HTTP Accept header; the "show more"
button on yep.com's web site does not set a special HTTP Accept header.
FYI: The index does not support languages, the WEB UI does not offer a
language selection of the results, and the entire index seems to be in
English.
Closes: https://github.com/searxng/searxng/issues/1619
Signed-off-by: Markus Heiser <markus.heiser@darmarit.de>
Most engines that support languages (and regions) use the Accept-Language from
the WEB browser to build a response that fits to the language (and region).
- add new engine option: send_accept_language_header
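A sketch of how such a header can be built from the engine's locale (the
exact format used by the engines is an assumption)::

    def accept_language_header(sxng_locale: str) -> str:
        """e.g. 'de-DE' --> 'de-DE,de;q=0.9,*;q=0.5'"""
        lang = sxng_locale.split('-')[0]
        if lang != sxng_locale:
            return f'{sxng_locale},{lang};q=0.9,*;q=0.5'
        return f'{sxng_locale};q=0.9,*;q=0.5'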
Signed-off-by: Markus Heiser <markus.heiser@darmarit.de>
The errors make pyright usage useless since a new error won't be seen [1].
[1] https://github.com/searxng/searxng/pull/1569
```
searx/compat.py:11:27 - error: Expression of type "Type[cached_property[_T@cached_property]]" cannot be assigned to declared type "Type[cached_property]"
"Type[cached_property[_T@cached_property]]" is incompatible with "Type[cached_property]"
Type "Type[cached_property[_T@cached_property]]" cannot be assigned to type "Type[cached_property]" (reportGeneralTypeIssues)
searx/utils.py:69:36 - error: Expression of type "None" cannot be assigned to parameter of type "str"
Type "None" cannot be assigned to type "str" (reportGeneralTypeIssues)
searx/utils.py:573:85 - error: Expression of type "None" cannot be assigned to parameter of type "int"
Type "None" cannot be assigned to type "int" (reportGeneralTypeIssues)
searx/webapp.py:1306:22 - error: Argument of type "str" cannot be assigned to parameter "__a" of type "BytesPath" in function "join"
Type "str" cannot be assigned to type "BytesPath"
"str" is incompatible with "bytes"
"str" is incompatible with protocol "PathLike[bytes]"
"__fspath__" is not present (reportGeneralTypeIssues)
searx/webapp.py:1306:68 - error: Argument of type "Literal['themes']" cannot be assigned to parameter "paths" of type "BytesPath" in function "join"
Type "Literal['themes']" cannot be assigned to type "BytesPath"
"Literal['themes']" is incompatible with "bytes"
"Literal['themes']" is incompatible with protocol "PathLike[bytes]"
"__fspath__" is not present (reportGeneralTypeIssues)
searx/webapp.py:1306:78 - error: Argument of type "str | Any | None" cannot be assigned to parameter "paths" of type "BytesPath" in function "join"
Type "str | Any | None" cannot be assigned to type "BytesPath"
Type "str" cannot be assigned to type "BytesPath"
"str" is incompatible with "bytes"
"str" is incompatible with protocol "PathLike[bytes]"
"__fspath__" is not present (reportGeneralTypeIssues)
searx/webapp.py:1306:85 - error: Argument of type "Literal['img']" cannot be assigned to parameter "paths" of type "BytesPath" in function "join"
Type "Literal['img']" cannot be assigned to type "BytesPath"
"Literal['img']" is incompatible with "bytes"
"Literal['img']" is incompatible with protocol "PathLike[bytes]"
"__fspath__" is not present (reportGeneralTypeIssues)
searx/engines/mongodb.py:8:6 - warning: Import "pymongo" could not be resolved (reportMissingImports)
searx/engines/mysql_server.py:9:8 - warning: Import "mysql.connector" could not be resolved (reportMissingImports)
searx/engines/postgresql.py:9:8 - warning: Import "psycopg2" could not be resolved from source (reportMissingModuleSource)
searx/engines/xpath.py:187:28 - warning: "categories" is not defined (reportUndefinedVariable)
searx/search/__init__.py:184:82 - warning: "flask" is not defined (reportUndefinedVariable)
searx/search/checker/background.py:19:26 - error: Type of "schedule" is partially unknown
Type of "schedule" is "(delay: Any, func: Any, *args: Any) -> Literal[True]" (reportUnknownVariableType)
searx/shared/__init__.py:8:12 - warning: Import "uwsgi" could not be resolved (reportMissingImports)
searx/shared/shared_uwsgi.py:5:8 - warning: Import "uwsgi" could not be resolved (reportMissingImports)
```
The engine name is not only a *name*, it's also an identifier that is used in
logs, HTTP headers and more. Unicode characters in the name of an engine
could cause various issues.
Closes: https://github.com/searxng/searxng/issues/1544
Signed-off-by: Markus Heiser <markus.heiser@darmarit.de>
Emojipedia is an emoji reference website which documents the meaning and
common usage of emoji characters in the Unicode Standard. It has been owned
by Zedge since 2021. Emojipedia is a voting member of The Unicode
Consortium. [1]
Cherry picked from @james-still [2][3] and slightly modified to fit SearXNG's
quality gates.
[1] https://en.wikipedia.org/wiki/Emojipedia
[2] 2fc01eb20f
[3] https://github.com/searx/searx/pull/3278