python - How to send JavaScript and Cookies Enabled in Scrapy? -


I am scraping a website using Scrapy that requires cookies and JavaScript to be enabled. I don't think I actually have to process the JavaScript; I just need to pretend that JavaScript is enabled.

Here is what I have tried: 1) Enabling cookies through the following settings:

COOKIES_ENABLED = True
COOKIES_DEBUG = True

2) Using the downloader middlewares for cookies:

DOWNLOADER_MIDDLEWARES = {
    'scrapy.contrib.downloadermiddleware.useragent.UserAgentMiddleware': 400,
    'scrapy.contrib.downloadermiddleware.cookies.CookiesMiddleware': 700,
}

3) Sending the header 'X-JavaScript-Enabled': 'True':

DEFAULT_REQUEST_HEADERS = {
    'Accept': 'text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8',
    'Accept-Language': 'en',
    'X-JavaScript-Enabled': 'True',
}
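Putting the three attempts together, a settings.py sketch would look like the following. One detail worth checking: Scrapy only picks up upper-case names from the settings module, so lower-case variants like cookies_enabled are silently ignored. The contrib middleware paths are kept exactly as in the question.

```python
# settings.py -- sketch combining the three attempts above.
# Scrapy reads only UPPER_CASE attributes from this module;
# lower-case setting names are silently ignored.

COOKIES_ENABLED = True   # cookie handling (this is on by default)
COOKIES_DEBUG = True     # log every cookie sent and received

DOWNLOADER_MIDDLEWARES = {
    'scrapy.contrib.downloadermiddleware.useragent.UserAgentMiddleware': 400,
    'scrapy.contrib.downloadermiddleware.cookies.CookiesMiddleware': 700,
}

DEFAULT_REQUEST_HEADERS = {
    'Accept': 'text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8',
    'Accept-Language': 'en',
    'X-JavaScript-Enabled': 'True',  # non-standard header; most servers ignore it
}
```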

But none of these is working for me. Can anyone please suggest an idea or point me in the right direction?

Thanks in advance for any replies.

AFAIK, there is no universal solution. You have to debug the site and see how it determines that JavaScript is not supported/enabled on the client.

I don't think the server looks at an X-JavaScript-Enabled header. Maybe there is a cookie that gets set by JavaScript when the page loads in a real JavaScript-enabled browser? Or maybe the server looks at the User-Agent header?
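Following that suggestion, one sketch is to send a realistic browser User-Agent and replay whatever flag cookie the site's own JavaScript would set on first load. The cookie name 'js_test' below is purely hypothetical; inspect the real site with your browser's dev tools to find what its scripts actually set.

```python
def browser_request_kwargs(url):
    """Build keyword arguments for scrapy.Request that mimic a
    JavaScript-enabled browser.

    The cookie name 'js_test' is a placeholder -- check the site in
    your browser's dev tools to see which cookie its scripts set.
    """
    return {
        "url": url,
        "headers": {
            # A realistic desktop browser User-Agent string
            "User-Agent": ("Mozilla/5.0 (Windows NT 10.0; Win64; x64) "
                           "AppleWebKit/537.36 (KHTML, like Gecko) "
                           "Chrome/120.0 Safari/537.36"),
            "Accept": "text/html,application/xhtml+xml,application/xml;"
                      "q=0.9,*/*;q=0.8",
            "Accept-Language": "en",
        },
        # Cookie(s) that the site's own JavaScript would normally set
        "cookies": {"js_test": "1"},
    }

# In a spider, you would then do something like:
#   yield scrapy.Request(**browser_request_kwargs("https://example.com/"))
```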

Also, see this response.


Comments