Hi,
I have been working on a macro for scraping webpages.
The earlier version of the tool used "msxml2.xmlhttp" to load pages:
Set html_Page = CreateObject("htmlfile")
With CreateObject("MSXML2.XMLHTTP")
    .Open "GET", page_url, False
    .send
    html_Page.body.innerHTML = .responseText
End With
I then use this html_Page object to parse the tags on the page.
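For example, the parsing looks roughly like this (the tag names here are placeholders for the actual elements I target):

```vba
Dim tbl As Object, row As Object
' Walk every table in the parsed document and print each row's text
For Each tbl In html_Page.getElementsByTagName("table")
    For Each row In tbl.getElementsByTagName("tr")
        Debug.Print row.innerText
    Next row
Next tbl
```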
When I dumped all the text from html_Page, I found that only static content was available.
The target page may have been fully static earlier, which is why the macro worked fine for a while.
Since the website was updated, some of the data now seems to be fetched after page load through JS/Ajax. I confirmed this by loading the page in Firefox with JavaScript disabled: the same data was missing.
So, is there a way to run the JavaScript on this page so that all elements are loaded, using "msxml2.xmlhttp" or something similar?
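For reference, the only workaround I know of is driving a real browser via InternetExplorer.Application, sketched below (the fixed two-second wait for the Ajax calls is a guess and would need tuning, e.g. polling for a specific element instead). I'd prefer something lighter-weight and request-based if it exists:

```vba
Sub LoadWithBrowser()
    ' Sketch: automate IE so the page's JavaScript actually runs
    Dim ie As Object, doc As Object
    Set ie = CreateObject("InternetExplorer.Application")
    ie.Visible = False
    ie.navigate page_url
    ' Wait for the initial page load to complete
    Do While ie.Busy Or ie.readyState <> 4   ' 4 = READYSTATE_COMPLETE
        DoEvents
    Loop
    ' Crude extra wait to let the Ajax requests finish (assumption)
    Application.Wait Now + TimeValue("0:00:02")
    Set doc = ie.document   ' fully rendered DOM, JS included
    ' ... parse doc here, then clean up
    ie.Quit
End Sub
```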