Mirror of https://github.com/discourse/discourse.git, synced 2025-09-07 12:02:53 +08:00
FIX: blacklisted crawlers could get through by omitting the accept header
This commit is contained in:
parent 059f1d8df4
commit b87fa6d749
2 changed files with 2 additions and 3 deletions
@@ -289,7 +289,6 @@ class Middleware::RequestTracker
   def block_crawler(request)
     request.get? &&
     !request.xhr? &&
-    request.env['HTTP_ACCEPT'] =~ /text\/html/ &&
     !request.path.ends_with?('robots.txt') &&
     CrawlerDetection.is_blocked_crawler?(request.env['HTTP_USER_AGENT'])
   end
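The removed condition is what created the hole: a crawler that omits the Accept header leaves request.env['HTTP_ACCEPT'] as nil, the =~ match then evaluates to nil, and the whole && chain in block_crawler becomes falsy, so the blacklisted user agent was never blocked. The following is a minimal, self-contained sketch of that behaviour; FakeRequest and the hard-coded 'BadBot' comparison are illustrative stand-ins for Rack::Request and CrawlerDetection.is_blocked_crawler?, not Discourse's code or tests.

# Illustrative sketch only, not Discourse code: shows why requiring an
# Accept match let a blacklisted crawler through when the header was omitted.
FakeRequest = Struct.new(:env) do
  def get?
    true  # assume a plain GET request
  end

  def xhr?
    env['HTTP_X_REQUESTED_WITH'] == 'XMLHttpRequest'
  end

  def path
    env['PATH_INFO'] || '/'
  end
end

# The pre-fix predicate from the diff above, condensed.
# (Core end_with? is used here instead of ActiveSupport's ends_with?.)
def old_block_crawler(request)
  request.get? &&
    !request.xhr? &&
    request.env['HTTP_ACCEPT'] =~ /text\/html/ &&  # nil when Accept is missing
    !request.path.end_with?('robots.txt') &&
    request.env['HTTP_USER_AGENT'] == 'BadBot'     # assumed blacklisted agent
end

with_accept    = FakeRequest.new({ 'HTTP_USER_AGENT' => 'BadBot', 'HTTP_ACCEPT' => 'text/html' })
without_accept = FakeRequest.new({ 'HTTP_USER_AGENT' => 'BadBot' })

puts !!old_block_crawler(with_accept)     # => true  (crawler is blocked)
puts !!old_block_crawler(without_accept)  # => false (crawler slips through)

With the Accept-header condition gone, the decision rests only on the request method, the XHR flag, the path, and the user agent, so leaving a header out no longer changes the outcome.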