I develop and maintain a Media Automation system for a satellite broadcaster, and I often need to transfer files of about 5GB and larger from one location to another. I used File.Copy or File.Move, making the typical assumption that Microsoft .NET had already written highly efficient code for me. I was wrong! I typically achieved transfer speeds of about 60GB/hour over a Gigabit network connection, depending on other traffic and disk activity. I wasn't actually waiting for these files while they were going through various stages of processing, so I didn't really care.

Then I had to do a mass conversion of a 50TB data store of 15,000 files. I thought: 1000 hours! That would take 40 days! Well, it was a lot of data, so I had better get started. It was a two-step process in which I sent each file to a server where it was processed; an outside vendor provided a program which handled the processing and sent the files back to the data store. I was only achieving 25GB/hour, so I was beginning to get worried. Then I discovered the vendor was only achieving 5GB/hour writing back. After a lot of wrangling, they changed some settings and all of a sudden they were achieving 160GB/hour! How were they achieving more than I was?

I wiped the dust off of 45 years of software development and began a deeper consideration of system architecture and software processing. I did a quick search on CodeProject and MSDN without finding anything. The first thing that came to mind was using larger buffer sizes for file reads and writes to reduce overhead. Wow! My first test was written in a few minutes and achieved 250GB/hour.

I need to explain the production environment so you have a better understanding of the throughput numbers; my numbers may not duplicate on your system. My network source for the files was a Harmonic MediaGrid, a high-density parallel network storage system designed to serve multiple servers with high-speed access. The destination was a Linux-based media playout server. This system is not in production yet, so there was little contention for resources. I was working on a dual quad-core XEON Windows 2008 Server with 16GB RAM.

With the routine shown below, the program used an average of 25% CPU across all eight cores, and the gigabit network card averaged 555Mb/sec for simultaneous send and receive. After I made some enhancements to my program, I retested the File.Move throughput at a shockingly slow 25GB/hour. Any buffer size smaller than the 512KB used below causes a rapid throughput drop-off.

I hope you enjoyed my little story and that it brought a smile to your face. Here are the timing wrapper and the copy routine:

```csharp
using System;
using System.IO;

static class FastFileMove
{
    /// <summary>Moves a file and reports the throughput achieved in GB/hour.</summary>
    /// <param name="source">Source file path</param>
    /// <param name="destination">Destination file path</param>
    public static void MoveTime (string source, string destination)
    {
        DateTime start_time = DateTime.Now;
        FMove (source, destination);
        long size = new FileInfo (destination).Length;
        // The +1 guards against division by zero on very fast moves.
        int milliseconds = 1 + (int) ((DateTime.Now - start_time).TotalMilliseconds);
        // Scale bytes per millisecond up to bytes per hour, then to GB (2^30 bytes).
        long tsize = size * 3600000 / milliseconds;
        tsize = tsize / (int) Math.Pow (2, 30);
        Console.WriteLine ("{0} GB/hour", tsize); // report the result (output destination assumed)
    }

    /// <summary>Copies the file through a large buffer, then deletes the source.</summary>
    /// <param name="source">Source file path</param>
    /// <param name="destination">Destination file path</param>
    static void FMove (string source, string destination)
    {
        // 2^19 = 512KB buffer; smaller buffers drop throughput rapidly.
        int array_length = (int) Math.Pow (2, 19);
        byte[] dataArray = new byte[array_length];
        using (FileStream fsread = new FileStream
            (source, FileMode.Open, FileAccess.Read, FileShare.None, array_length))
        using (BinaryReader bwread = new BinaryReader (fsread))
        using (FileStream fswrite = new FileStream
            (destination, FileMode.Create, FileAccess.Write, FileShare.None, array_length))
        using (BinaryWriter bwwrite = new BinaryWriter (fswrite))
        {
            // Pump the file through in buffer-sized chunks.
            int read;
            while ((read = bwread.Read (dataArray, 0, array_length)) > 0)
            {
                bwwrite.Write (dataArray, 0, read);
            }
        }
        // Removing the source completes the "move" semantics.
        File.Delete (source);
    }
}
```
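The buffer-size claim is easy to verify on your own hardware. Below is a minimal benchmarking sketch, not from the original article: the CopyWithBuffer helper and the test paths are hypothetical, and it sweeps power-of-two buffer sizes to find where throughput falls off.

```csharp
using System;
using System.Diagnostics;
using System.IO;

class BufferSweep
{
    // Same structure as FMove above, but with the buffer size as a
    // parameter and without deleting the source, so runs can repeat.
    static void CopyWithBuffer (string source, string destination, int bufferSize)
    {
        byte[] buffer = new byte[bufferSize];
        using (FileStream fsread = new FileStream
            (source, FileMode.Open, FileAccess.Read, FileShare.None, bufferSize))
        using (FileStream fswrite = new FileStream
            (destination, FileMode.Create, FileAccess.Write, FileShare.None, bufferSize))
        {
            int read;
            while ((read = fsread.Read (buffer, 0, bufferSize)) > 0)
                fswrite.Write (buffer, 0, read);
        }
    }

    static void Main ()
    {
        // Hypothetical paths; point these at a large file on your own network.
        string source = @"\\server\share\sample.bin";
        string destination = @"D:\test\sample_copy.bin";

        // Sweep buffers from 4KB (2^12) to 4MB (2^22).
        for (int power = 12; power <= 22; power++)
        {
            int bufferSize = 1 << power;
            Stopwatch timer = Stopwatch.StartNew ();
            CopyWithBuffer (source, destination, bufferSize);
            timer.Stop ();
            Console.WriteLine ("{0,9:N0}-byte buffer: {1:N0} ms",
                bufferSize, timer.ElapsedMilliseconds);
        }
    }
}
```

On the hardware described above, the drop-off began below the 2^19 (512KB) mark; the knee will land elsewhere on different storage and network stacks.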
From the article's discussion thread, one reply explains why the garbage collector makes explicit cleanup optional, and what the "using" statement adds:

> I am afraid you completely misunderstood my comment; sorry for not making it clear from the start.
>
> First: _all_ objects are sooner or later disposed of by a special mechanism, the garbage collector (GC). The GC sits as if in ambush and waits until your application is not too busy, then it jumps in, deletes objects that are no longer referenced, and goes back into hiding until the next opportunity. So you can (almost always) play it safe by simply not caring about any cleanup: if you forgot to free a buffer or a string, or forgot to close a file, don't worry, the GC will do the maintenance.
>
> Second: the whole point of the "using" statement refers to that "sooner or later" part: when exactly does the actual deletion/destruction take place? Leave it to the GC, and you never know when it is called; that is non-deterministic cleanup. "Using" helps when you want a guarantee that full (or partial) cleanup happens when you want it to happen, that's all. Either cleanup happens some time in the future or right now; that's the difference.
>
> Last thing: I simplified things; the full picture can be found, for example, at (v=vs.110).aspx.
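The comment's distinction is easy to see in code. Here is a minimal sketch, assuming only the standard FileStream API; the file names and the CleanupDemo class are illustrative, not from the article or its discussion:

```csharp
using System;
using System.IO;

class CleanupDemo
{
    static void Main ()
    {
        // Deterministic cleanup: Dispose runs the instant the block exits,
        // so the OS file handle is guaranteed released right here.
        using (FileStream fs = new FileStream ("demo1.tmp", FileMode.Create))
        {
            fs.WriteByte (42);
        } // fs.Dispose() has already executed at this point.

        // Non-deterministic cleanup: nothing closes this stream explicitly,
        // so its handle stays open until the GC finalizes the orphaned
        // object at some unpredictable future time.
        FileStream leaked = new FileStream ("demo2.tmp", FileMode.Create);
        leaked.WriteByte (42);
        leaked = null; // unreferenced, but not yet closed

        GC.Collect ();                  // a hint only; timing is still not contractual
        GC.WaitForPendingFinalizers (); // forces the pending finalizer for this demo
    }
}
```

In the copy routine above, the using blocks guarantee each stream's handle is released as soon as the copy finishes, rather than whenever the GC next runs.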