Behavior change request.

  • If I extract a file from PowerArchiver to a folder, PowerArchiver extracts the file to a temp directory, then copies the file to the target location, then deletes the temp copy. When you are working with large files this presents a huge unnecessary slowdown and a lot of wasted disk space.

    WinRAR handles this better by extracting to a temp directory, then MOVING the file to the correct location instead of copying it. In Windows XP this is just a change in the File Allocation Table or NTFS equivalent, saying that the file is now located in X directory, which is very fast — there is no need to create 2 copies of the file and then delete one of them.

    If you are extracting a 4 GB file you should only need 4 GB of free space, not the 8 GB that PowerArchiver currently requires for the same operation.

    Make sense?
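The difference being described can be sketched in a few lines. This is a minimal illustration, not PowerArchiver's or WinRAR's actual code: `extract_then_move` writes the temp file on the same volume as the destination and renames it into place (a metadata-only operation), while `extract_then_copy` mimics the complained-about pattern of writing the data twice. The function names and the use of a bytes payload as a stand-in for decompressed output are assumptions for illustration.

```python
import os
import shutil
import tempfile

def extract_then_move(data: bytes, dest_path: str) -> None:
    # Fast pattern (WinRAR-style): put the temp file on the SAME volume
    # as the destination, then rename it into place. On one filesystem,
    # os.replace only updates directory metadata; the data is written once.
    dest_dir = os.path.dirname(os.path.abspath(dest_path))
    fd, tmp_path = tempfile.mkstemp(dir=dest_dir)
    try:
        with os.fdopen(fd, "wb") as tmp:
            tmp.write(data)              # stand-in for real decompression output
        os.replace(tmp_path, dest_path)  # rename, not copy + delete
    except BaseException:
        os.remove(tmp_path)
        raise

def extract_then_copy(data: bytes, dest_path: str) -> None:
    # Slow pattern (the one the thread complains about): write to a temp
    # directory, copy the whole file to the destination, then delete the
    # temp copy — the data is written to disk twice.
    tmp_dir = tempfile.mkdtemp()
    tmp_path = os.path.join(tmp_dir, "payload")
    try:
        with open(tmp_path, "wb") as tmp:
            tmp.write(data)
        shutil.copy(tmp_path, dest_path)  # second full write of the data
    finally:
        shutil.rmtree(tmp_dir)
```

Note the caveat baked into the fast version: a rename is only metadata-cheap when the temp file and the destination are on the same volume; across volumes, `os.replace` fails and a copy is unavoidable.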

  • guess it does.

    but let’s hear from the devs…

  • conexware

    I believe that if you use the “write directly into zip” option, PowerArchiver writes directly into the zip file without using a temp folder, which speeds everything up.

    If you want to make that a permanent setting, you can set it up at Options>Config>Misc, and it should work with all formats!

    Does that work as intended?


  • SPWolf,

    I couldn’t find the setting you mentioned, where is it and what is it called?


  • conexware

    It should be under Speed options > Use current folder as temp. That works for compression only, of course.


  • Well, that helps some, but since it doesn’t affect decompression, it isn’t really what I was asking for. My intention in starting this thread was to make the developers aware of a very small change that would improve performance in proportion to the size of the files.


  • Great tip! I would like this feature as soon as possible.