No spanning while using queue!!!
-
Hello
Although I managed to change the default settings for .zip archives (custom spanning works when using the default shell compress to xxxxx.zip), it turns out no spanning is applied when using the queue option!
Also, is there a way to change the default naming policy for the .zip files? (e.g. filename+text)
P.S. Using latest version 11.50.66
-
So the queue does not apply spanning in any case?
As to the naming policy, please start a new thread in the wishlist forums and explain a bit more.
-
That’s right, in any case, meaning when the queue option is used the final result is a single .zip file = no spanning.
It doesn’t matter whether I queue files with the ‘+spanning’ option or use the default .zip profile (which includes spanning).
-
thank you!
-
Hi,
please check the following release:
http://dl.powerarchiver.com/2010/powarc1160rc1.exe and let us know how it works now.
thanks!
-
It installed correctly and all looked good, but when I started using the queue option the starter/queue program crashed, and now it crashes on every attempt to enable the queue.
-
Hi Axeldor,
you need to re-download PA from our pages, it will work correctly… that older version had an issue; it is not the same as the official preview release published 2 days ago…
-
Oh! Seems like there’s no rest for me :/
It works OK now (I can add items to the queue and it spans files correctly), but the queue resets itself when the first file is compressed. It goes blank = no queue / working but not queuing! :)
-
How do you mean the queue resets itself when the first file is compressed? The queue monitors pa.exe; when it exits, it deletes the job, not before?
-
how do you mean queue resets itself when first file is compressed?
Yes, I open the queue window for preview and I can see that after the 1st file is compressed, the rest of the jobs get deleted!
-
Yes, I open the queue window for preview and I can see that after the 1st file is compressed, the rest of the jobs get deleted!
I see, it is something specific, got it…
-
Version 11.60.10 is up; this should be working properly now… please check it out!