Comments by cbf28 (Top 1 by date)

cbf28 5-Dec-12 14:57pm
perfectly - if a human can browse it, so can a bot.

So my platform of choice would be Linux, and the database would be Oracle. The crawlers will gather the data and store it in the database, because I'll be using that data to generate a set of graphs.
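
To make the pipeline concrete, here is a minimal sketch of the "store" step in Python, assuming the python-oracledb driver and a hypothetical page_stats(url, fetched_at, value) table; the table and column names are placeholders, not part of any existing schema.

import datetime
import oracledb

def store_record(conn: oracledb.Connection, url: str, value: float) -> None:
    # Insert one scraped record; positional bind variables avoid SQL injection.
    with conn.cursor() as cur:
        cur.execute(
            "INSERT INTO page_stats (url, fetched_at, value) VALUES (:1, :2, :3)",
            [url, datetime.datetime.now(), value],
        )
    conn.commit()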

The reason I chose Linux is that once my data is stored in Oracle, I can use Linux scripts to query and filter the database, leveraging my familiarity with Linux scripting.
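
A minimal sketch of that query-and-filter step, again assuming python-oracledb and the same hypothetical page_stats table as above; the DSN, credentials, and aggregation are illustrative only and would be called from a cron job or shell script.

import oracledb

def daily_totals(dsn: str, user: str, password: str):
    # Aggregate crawled values per day, ready to feed into a graph.
    with oracledb.connect(user=user, password=password, dsn=dsn) as conn:
        with conn.cursor() as cur:
            cur.execute(
                """
                SELECT TRUNC(fetched_at) AS day, SUM(value) AS total
                FROM page_stats
                GROUP BY TRUNC(fetched_at)
                ORDER BY day
                """
            )
            return cur.fetchall()

if __name__ == "__main__":
    for day, total in daily_totals("localhost/XEPDB1", "scraper", "secret"):
        print(day, total)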

The next step would be to update these graphs in real time, but the proof of concept can be script-based for the moment.
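
For the script-based proof of concept, one possible way to turn the query results above into a static chart is matplotlib; the output filename and axis labels below are placeholders.

import matplotlib
matplotlib.use("Agg")  # headless rendering, suitable for a cron-driven script
import matplotlib.pyplot as plt

def plot_totals(rows, out_path="daily_totals.png"):
    # rows: list of (day, total) tuples as returned by daily_totals().
    days = [r[0] for r in rows]
    totals = [r[1] for r in rows]
    plt.figure(figsize=(8, 4))
    plt.plot(days, totals, marker="o")
    plt.xlabel("Day")
    plt.ylabel("Total")
    plt.tight_layout()
    plt.savefig(out_path)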

So I know how to store and manipulate the data; however, I am still exploring tools to gather it from the web.
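
One possible starting point for the gathering side, among many, is a plain fetch-and-parse script with requests and BeautifulSoup; the URL and CSS selector below are made up for illustration.

import requests
from bs4 import BeautifulSoup

def scrape(url: str):
    # Fetch the page politely (identify the bot, set a timeout) and
    # pull out the cells to be stored; "td.metric" is an example selector.
    resp = requests.get(url, headers={"User-Agent": "my-crawler/0.1"}, timeout=30)
    resp.raise_for_status()
    soup = BeautifulSoup(resp.text, "html.parser")
    return [cell.get_text(strip=True) for cell in soup.select("td.metric")]

if __name__ == "__main__":
    for value in scrape("https://example.com/data"):
        print(value)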

Ideas?