Is it a good idea to use SSIS to make 200,000 calls to a WCF service?
My requirement is to get data from SQL Server, pass it row by row to a WCF service to fetch related data from another database server, update the original database with the returned data, and then call one more WCF service (again row by row) to send a status to another server.

What would be the best way to implement this? Performance is the key concern here.
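
Roughly what I have in mind is the loop below. It is only a sketch: the service contracts, endpoint addresses, connection string and table names are placeholders, not the real ones.

using System;
using System.Data;
using System.Data.SqlClient;
using System.ServiceModel;

// Placeholder contracts standing in for the two real service references.
[ServiceContract]
public interface ILookupService
{
    [OperationContract]
    string GetRelatedData(int customerKey);
}

[ServiceContract]
public interface IStatusService
{
    [OperationContract]
    void SendStatus(int id, string status);
}

class RowByRowSketch
{
    static void Main()
    {
        // Placeholder endpoints for the two services on the other servers.
        var lookup = new ChannelFactory<ILookupService>(new BasicHttpBinding(),
            "http://serverA/LookupService.svc").CreateChannel();
        var status = new ChannelFactory<IStatusService>(new BasicHttpBinding(),
            "http://serverB/StatusService.svc").CreateChannel();

        using (var conn = new SqlConnection(
            "Data Source=.;Initial Catalog=SourceDb;Integrated Security=true"))
        {
            conn.Open();

            // Pull the ~200,000 source rows that need processing.
            var rows = new DataTable();
            using (var select = new SqlCommand("SELECT Id, CustomerKey FROM dbo.PendingRows", conn))
                rows.Load(select.ExecuteReader());

            foreach (DataRow row in rows.Rows)
            {
                // 1. Call the first service for this row's related data.
                string related = lookup.GetRelatedData((int)row["CustomerKey"]);

                // 2. Write the result back to the original database.
                using (var update = new SqlCommand(
                    "UPDATE dbo.PendingRows SET RelatedValue = @v WHERE Id = @id", conn))
                {
                    update.Parameters.AddWithValue("@v", related);
                    update.Parameters.AddWithValue("@id", row["Id"]);
                    update.ExecuteNonQuery();
                }

                // 3. Call the second service to report the row's status.
                status.SendStatus((int)row["Id"], "Updated");
            }
        }
    }
}

With 200,000 rows that is 400,000 service calls plus 200,000 single-row updates, which is why performance worries me.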

Thanks
Comments
Keith Barrow 7-Jan-13 8:35am    
Why do you think this is a good idea? You must have reasons. Personally I'd have thought updating one database from the other directly through SQL would have been quickest, next quickest would be an app that connects to two databases, pretty much the slowest way to do this would be via a web service such as WCF, even on named pipes. I'm also unsure why you need to do this row by row, this will be slower than updating many records at once.

It might be better to explain what you are trying to do, and what constraints you have (e.g. the DBs can't communicate directly).
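
For example, if the two database servers can reach each other, a single set-based statement over a linked server replaces all of those round trips. This is only a sketch; the linked server, database, table and column names are placeholders:

using System;
using System.Data.SqlClient;

class SetBasedUpdateSketch
{
    static void Main()
    {
        // Assumes a linked server called [RemoteServer] has already been configured
        // on the source instance; every name below stands in for the real schema.
        const string sql = @"
UPDATE p
SET    p.RelatedValue = r.RelatedValue
FROM   dbo.PendingRows AS p
JOIN   [RemoteServer].OtherDb.dbo.RelatedData AS r
       ON r.CustomerKey = p.CustomerKey;";

        using (var conn = new SqlConnection(
            "Data Source=.;Initial Catalog=SourceDb;Integrated Security=true"))
        using (var cmd = new SqlCommand(sql, conn))
        {
            conn.Open();
            // One statement, one round trip, every row updated at once.
            int updated = cmd.ExecuteNonQuery();
            Console.WriteLine("{0} rows updated", updated);
        }
    }
}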
ZurdoDev 7-Jan-13 8:41am    
The "best" way is always relative. You have to look at what servers you have, including the hardware, and at what it will take to install and maintain. SSIS is fine and likely a good idea. If it needs a lot more fine-tuning, you could use BizTalk.
[no name] 7-Jan-13 9:37am    
What makes it necessary to use WCF instead of a direct DB-to-DB approach? And why update it row by row instead of as a batch?
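
By "as a batch" I mean something along the lines of the sketch below: fetch the remote results in bulk, stage them, and apply one UPDATE. The staging table and column names here are hypothetical:

using System.Data;
using System.Data.SqlClient;

class BatchApplySketch
{
    static void Main()
    {
        // Hypothetical: the related values have already been fetched from the
        // remote side in bulk and are sitting in memory.
        var results = new DataTable();
        results.Columns.Add("Id", typeof(int));
        results.Columns.Add("RelatedValue", typeof(string));
        results.Rows.Add(1, "example value");

        using (var conn = new SqlConnection(
            "Data Source=.;Initial Catalog=SourceDb;Integrated Security=true"))
        {
            conn.Open();

            // Bulk-load the whole result set into a staging table in one shot...
            using (var bulk = new SqlBulkCopy(conn) { DestinationTableName = "dbo.StagingResults" })
                bulk.WriteToServer(results);

            // ...then apply it with a single set-based UPDATE instead of 200,000 statements.
            const string sql = @"
UPDATE p
SET    p.RelatedValue = s.RelatedValue
FROM   dbo.PendingRows AS p
JOIN   dbo.StagingResults AS s ON s.Id = p.Id;";
            using (var cmd = new SqlCommand(sql, conn))
                cmd.ExecuteNonQuery();
        }
    }
}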
