Commit 3a2cf9e

update

Germey committed May 4, 2020
1 parent d3fd457 commit 3a2cf9e
Showing 5 changed files with 11 additions and 7 deletions.
7 changes: 5 additions & 2 deletions deployment-comment.yml
@@ -6,7 +6,7 @@ metadata:
---
apiVersion: v1
items:
-  - apiVersion: extensions/v1beta1
+  - apiVersion: apps/v1
kind: Deployment
metadata:
annotations:
@@ -19,6 +19,9 @@ items:
namespace: crawler
spec:
replicas: 10
+    selector:
+      matchLabels:
+        io.kompose.service: crawler-weibo-comment
revisionHistoryLimit: 1
strategy: {}
template:
@@ -53,7 +56,7 @@ items:
key: universal
- name: PROXYPOOL_ENABLED
value: 'true'
-          image: germey/crawler-weibo-comment:${TAG}
+          image: germey/crawler-weibo-comment
name: crawler-weibo-comment
resources:
limits:
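The first hunk migrates the Deployment off the deprecated `extensions/v1beta1` API group, and the added `selector` block is not optional: under `apps/v1`, `spec.selector` is required and its `matchLabels` must match the pod template's labels, otherwise the apply is rejected. A minimal sketch of the required shape (label values taken from the diff; surrounding fields abbreviated):

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: crawler-weibo-comment
  namespace: crawler
spec:
  replicas: 10
  selector:
    matchLabels:
      io.kompose.service: crawler-weibo-comment   # must match template labels below
  template:
    metadata:
      labels:
        io.kompose.service: crawler-weibo-comment
    spec:
      containers:
        - name: crawler-weibo-comment
          image: germey/crawler-weibo-comment
```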
4 changes: 2 additions & 2 deletions deployment-single.yml
@@ -4,7 +4,7 @@ metadata:
name: crawler-weibo-single
namespace: crawler
spec:
-  schedule: "* */1 * * *"
+  schedule: "*/20 * * * *"
jobTemplate:
spec:
template:
@@ -16,7 +16,7 @@ spec:
- single
env:
- name: WEIBO_WEIBO_ID
-              value: '4496797961825543'
+              value: '4467107636950632'
- name: WEIBO_COOKIES
valueFrom:
secretKeyRef:
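The schedule change is more than a tuning tweak. In cron syntax, `* */1 * * *` has `*` in the minute field, so it matches every minute of every hour; `*/20 * * * *` fires only at minutes 0, 20, and 40. A tiny stdlib sketch of just the minute-field semantics (a hypothetical helper, not project code):

```python
def minute_matches(field: str, minute: int) -> bool:
    """Very small cron minute-field matcher: supports only '*' and '*/N'."""
    if field == "*":
        return True  # '*' matches every minute, which is why the old schedule ran constantly
    if field.startswith("*/"):
        step = int(field[2:])
        return minute % step == 0
    raise ValueError(f"unsupported field: {field}")

# Old schedule "* */1 * * *": the minute field matches all 60 minutes
assert all(minute_matches("*", m) for m in range(60))
# New schedule "*/20 * * * *": three runs per hour
assert [m for m in range(60) if minute_matches("*/20", m)] == [0, 20, 40]
```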
2 changes: 1 addition & 1 deletion weibo/middlewares.py
@@ -78,7 +78,7 @@ def get_random_proxy(self):
:return:
"""
try:
-        if self.auth:
+        if getattr(self, 'auth') and self.auth:
response = requests.get(self.proxypool_url, timeout=5, auth=self.auth)
else:
response = requests.get(self.proxypool_url, timeout=5)
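The intent of this change is to only send HTTP auth to the proxy pool when credentials are configured. Note that `getattr(self, 'auth')` without a default still raises `AttributeError` when the attribute is absent; a fully defensive variant passes a default. A self-contained sketch of the pattern (the class, the injected `fetch` callable, and attribute names are illustrative stand-ins for the middleware and `requests.get`):

```python
class ProxyMiddlewareSketch:
    """Sketch of the guarded-auth fetch from the middleware change.

    fetch stands in for requests.get; the 'auth' attribute is only
    set when credentials are configured, mirroring the real middleware.
    """

    def __init__(self, proxypool_url, fetch, auth=None):
        self.proxypool_url = proxypool_url
        self.fetch = fetch
        if auth is not None:
            self.auth = auth  # attribute may never exist on an instance

    def get_random_proxy(self):
        # getattr with a default avoids AttributeError when 'auth' was never set
        auth = getattr(self, 'auth', None)
        if auth:
            return self.fetch(self.proxypool_url, timeout=5, auth=auth)
        return self.fetch(self.proxypool_url, timeout=5)

# Usage with a stub fetcher that just echoes the keyword arguments:
m = ProxyMiddlewareSketch('http://pool/random', lambda url, **kw: kw)
assert m.get_random_proxy() == {'timeout': 5}  # no auth sent when unconfigured
```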
2 changes: 1 addition & 1 deletion weibo/settings.py
@@ -136,7 +136,7 @@

# definition of proxy
PROXYPOOL_URL = env.str('PROXYPOOL_URL')
-PROXYTUNNEL_URL = env.str('PROXYTUNNEL_URL')
+# PROXYTUNNEL_URL = env.str('PROXYTUNNEL_URL')

# definition of elasticsearch
ELASTICSEARCH_CONNECTION_STRING = env.str('ELASTICSEARCH_CONNECTION_STRING')
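Commenting out the `PROXYTUNNEL_URL` lookup matters because `env.str(name)` from the environs package raises when the variable is unset and no default is given, which would make settings fail at import time on clusters that never define it. The required-vs-unset behavior, sketched with the standard library only (`require_env` is a hypothetical stand-in for `env.str`):

```python
import os

def require_env(name: str) -> str:
    """Stand-in for env.str(name): raise if the variable is not set."""
    value = os.environ.get(name)
    if value is None:
        raise KeyError(f"environment variable {name} is not set")
    return value

os.environ["PROXYPOOL_URL"] = "http://pool:5555/random"
os.environ.pop("PROXYTUNNEL_URL", None)  # simulate a cluster without a proxy tunnel

assert require_env("PROXYPOOL_URL") == "http://pool:5555/random"
try:
    require_env("PROXYTUNNEL_URL")  # unset: raises, hence the line was commented out
except KeyError:
    pass
else:
    raise AssertionError("expected KeyError for an unset variable")
```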
3 changes: 2 additions & 1 deletion weibo/spiders/single.py
@@ -12,6 +12,7 @@
START_COMMENT_ID = env.str('START_COMMENT_ID', None)
print('START_COMMENT_ID', START_COMMENT_ID)

+
class SingleSpider(Spider):
"""
comment spider of single weibo
@@ -45,7 +46,7 @@ class SingleSpider(Spider):
"X-Requested-With": "XMLHttpRequest",
}

-    cookies = env.str('WEIBO_COOKIES')
+    cookies = env.str('WEIBO_COOKIES', '')
page = 1

def start_requests(self):
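Giving `WEIBO_COOKIES` a default of `''` means the spider no longer crashes at import when the variable is unset, but the attribute is then always a string, so any downstream parsing must tolerate the empty case. A hedged sketch of header-style cookie parsing that handles both (the parsing logic is an assumption; this diff does not show how the spider consumes the string):

```python
def parse_cookies(raw: str) -> dict:
    """Parse a 'k1=v1; k2=v2' cookie header string into a dict.

    An empty string yields an empty dict, matching the new '' default.
    """
    cookies = {}
    for part in raw.split(';'):
        part = part.strip()
        if not part or '=' not in part:
            continue  # skip blanks and malformed fragments
        key, _, value = part.partition('=')
        cookies[key.strip()] = value.strip()
    return cookies

assert parse_cookies('') == {}
assert parse_cookies('SUB=abc; SSOLoginState=123') == {'SUB': 'abc', 'SSOLoginState': '123'}
```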
