I receive an Excel/CSV file of 80MB or more as a response from a server, and I need to run a list of regexes to manipulate some of the content in the response before presenting it to the end user. I've tried reading the response chunk by chunk and processing the regexes over each chunk before moving on to the next one. With this approach, there is a chance that matching data spans two chunks and therefore cannot be manipulated. If I instead run all the regexes over the complete response, processing takes a very long time.

Can anyone suggest an efficient way to handle this and bring down the processing time?
Comments
Richard MacCutchan, 19-Jan-15 11:15am
Using regexes against such a large amount of data will always take a long time. You probably need to think of an alternative solution to your problem.

This content, along with any associated source code and files, is licensed under The Code Project Open License (CPOL)
