Thank you, everyone, for your help. I always appreciate it, and every answer here was a good one. I gave everyone a 5.
I'm using an FTP API that downloads a 5 GB file every night. On failure, the API requires making a new file and then merging them, which is why I had that problem. I agree it's not the best way, but I have to use it per company policy.
I used OriginalGriff's answer: a for loop that breaks the file into 20 pieces, which I then put back together into the original file.
Here is the code I used. I tried it on a short text file, a zip file, and an exe, and it worked on all three. I tried to format the code here; I hope it looks OK. Thanks again.
Brian Cummings
long readPosition = 0;

for (int x = 0; x < 20; x++)
{
    try
    {
        using (FileStream fsSource = new FileStream(pathSource,
            FileMode.Open, FileAccess.Read))
        {
            long fLength = fsSource.Length / 20 + 1;
            byte[] bytes = new byte[fLength];
            int numBytesToRead = (int)fLength;
            int numBytesRead = 0;

            // Seek to this piece's start once, then keep reading until the
            // buffer is full or the file runs out -- Read may return fewer
            // bytes than requested, so loop on the count it actually gives back.
            fsSource.Position = readPosition;
            while (numBytesToRead > 0)
            {
                int n = fsSource.Read(bytes, numBytesRead, numBytesToRead);
                if (n == 0) break;
                numBytesRead += n;
                numBytesToRead -= n;
            }
            readPosition += fLength;

            // Three-digit index ("000".."019") so the piece files sort in
            // order. Write only the bytes actually read, so the last piece
            // isn't padded out with zeros. The using block disposes the
            // stream, so no explicit Close/Dispose is needed.
            using (FileStream fsNew = new FileStream(pathNew + x.ToString("000") + ".zip",
                FileMode.Create, FileAccess.Write))
            {
                fsNew.Write(bytes, 0, numBytesRead);
            }
        }
    }
    catch (Exception ex)
    {
        MessageBox.Show(ex.Message);
        Console.WriteLine(ex.Message);
    }
}
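For the "put it back together" step, here is a minimal sketch of how the pieces could be merged. It assumes the pieces are named with a three-digit index (e.g. `...000.zip` through `...019.zip`); adjust `piecePath` to match however your split loop names them. `MergePieces`, `pathNew`, `pathMerged`, and `pieceCount` are names I made up for the example, and `Stream.CopyTo` needs .NET 4.0 or later.

```csharp
using System;
using System.IO;

public class MergeSketch
{
    // Reassemble the split pieces, in index order, into one output file.
    public static void MergePieces(string pathNew, string pathMerged, int pieceCount)
    {
        using (FileStream fsOut = new FileStream(pathMerged,
            FileMode.Create, FileAccess.Write))
        {
            for (int x = 0; x < pieceCount; x++)
            {
                // Must match the naming scheme used when splitting.
                string piecePath = pathNew + x.ToString("000") + ".zip";
                using (FileStream fsPiece = new FileStream(piecePath,
                    FileMode.Open, FileAccess.Read))
                {
                    // Stream the whole piece straight into the output file,
                    // so no piece-sized buffer is needed.
                    fsPiece.CopyTo(fsOut);
                }
            }
        }
    }
}
```

Reading each piece in order and appending it to one output stream is all the merge needs, since the split wrote the pieces contiguously.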