I have a site that we are moving from flat HTML into a database, using in-house VB.Net applications. The site is roughly 10,000 pages, but the number of pages is not the problem. The HTML content of every page is now held in the database, and I am looping through the pages, checking each one with regular expressions for href and img src attributes so that I can collect the URLs of all images and attachments. Once I have these URLs, I currently pass each one to HttpWebRequest.DownloadData, which makes a request to the website for that URL; the attachment is downloaded and eventually written out with WriteAllBytes().

This works for one iteration and then times out. What I need to know is: why does it time out, and is there a better method of doing this? The files being moved can be rather large; the site is about 5.7 GB in total, and most of that is down to the different file formats.
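Roughly, the per-page step looks like the sketch below. It is simplified, and the regex pattern, module, and variable names here are illustrative rather than the exact production code:

    ' A simplified sketch of the per-page step, not the exact production code.
    ' Assumes the extracted URLs are absolute; relative links would need to be
    ' resolved against the page's base URL first.
    Imports System.IO
    Imports System.Net
    Imports System.Text.RegularExpressions

    Module AttachmentDownloadSketch

        Sub DownloadPageAttachments(pageHtml As String, outputFolder As String)
            ' Grab anything referenced by an href or src attribute.
            Dim pattern As String = "(?:href|src)\s*=\s*[""']([^""']+)[""']"

            For Each m As Match In Regex.Matches(pageHtml, pattern, RegexOptions.IgnoreCase)
                Dim url As String = m.Groups(1).Value

                ' Request the file from the live site and buffer it in memory.
                Dim request As HttpWebRequest = CType(WebRequest.Create(url), HttpWebRequest)

                Using response As WebResponse = request.GetResponse()
                    Using responseStream As Stream = response.GetResponseStream()
                        Using buffer As New MemoryStream()
                            responseStream.CopyTo(buffer)

                            ' Write the downloaded bytes out to disk.
                            Dim fileName As String = Path.Combine(outputFolder, Path.GetFileName(New Uri(url).LocalPath))
                            File.WriteAllBytes(fileName, buffer.ToArray())
                        End Using
                    End Using
                End Using
            Next
        Sub End

        End Sub

    End Module

The first download completes fine; it is the second request in the loop that hangs until it times out.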