I am using scrapy-playwright (latest versions) with the WebKit browser on Ubuntu 22.04.
I can start and debug the spider once or twice. Stopping it with the debugger's "stop" button (Ctrl+Break) only half stops it: the process lingers open indefinitely unless I press Ctrl+Break again.
After the spider has run fine once or twice, it stops returning the 'page' object in the meta field of the response.
There are no errors in the logs.
Sometimes I manage to restart the process; at other times I am forced to restart the machine...
I tried killing the WPBrowser* processes, but no luck...
Any idea how to deal with this?
Do I need to add some handler to stop the debug session more gracefully? Is that related at all?