Dear @Alpha-Testers and all of our users,
the time has come to start testing PowerArchiver and PACL for macOS.
Please let us know here if you have a Mac and can test the latest builds.
PowerArchiver 2020 - tabbing, opening, extracting, adding, testing, favorite folders, support for multiple languages, opening via Finder, explorer mode, installer.
PACL 10 - support for most of the formats and features of the Windows version.
Upcoming: Tools such as archive converter, batch zip, multi-extract.
To start testing, please sign up here in this thread and we will send you the latest build.
I have a “USB2.0” DVD-RW drive with a blank 700MB CD-R inserted.
Windows 10 sees it just fine in Explorer; it's usable and works.
However, it is not showing up in any drop-down in PowerArchiver.
Thanks for any help…
PowerArchiver 20.10.02 (05/2021) 64-bit
Windows 10 Pro
20H2 build: 19042.1348
Windows Feature Experience Pack 120.2212.3920.0
Hi, I've got a strange behavior with the progress window in the PowerArchiver 2021 shell extension:
The window size is suddenly "large". Resizing the window to its normal size works for the current action, but it isn't remembered; the next time I use the shell extension, the window is large again, so it seems the size change isn't saved.
The progress window of PowerArchiver itself has its normal size.
Is there a possibility to reset the shell extension window size?
I'm running W10 Pro on an HP i7 machine, now with 32 GB of RAM. The system drive is an SSD.
In the course of sorting out some inconsistencies in a couple of my .pbs jobs, I’ve been paying close attention to the system CPU resource and power consumption of PA Starter, as shown by the various resource management tools. I’ve also been trying to identify why my system seems rather sluggish despite some extensive tuning for speed etc.
When I have Starter enabled as a start-up programme, to use the PA Queue facility, I see that it consistently uses between 10% and 13% of the CPU, with High power consumption indicated. It runs, consuming this resource, irrespective of what may be happening with PA and any pbs script jobs.
I can understand a need to use a good chunk of CPU when Starter is actually handling PA jobs, but it shouldn’t sit and consume this amount when it really isn’t doing anything but wait for a job to start and for which it is needed.
Perhaps I’ve missed a way of lessening Starter’s impact, but in the meantime I’ve adjusted my set-up to running PA without it.
I've found a bug when trying to test an encrypted, 28-volume 7-zip archive of an 18.3 GB folder containing 41,751 files. The test fails, showing errors in all files. Here's the folder I compressed, and the 28 resulting files:
Here's the multi-volume compressed file opened in PA, showing the total file count and the full archive size:
And here's the test results screen after I click Actions -> Test…, showing an incorrect file count and archive size (this seems to be the file size of the first volume alone):
One important detail about the test screen above: PA opens a UAC prompt asking for privileged execution before starting the test. Is this the expected behavior? It seems PA shouldn't need higher execution privileges, since everything it touches is owned by my own user.
In any case, granting or denying the UAC prompt results in the exact same screen. Now, if instead of testing the archive I tell PA to extract the contents, the operation works fine (and with no UAC prompts):
The Properties panels for both folders, the original and the extracted one, show the exact same file count and total size:
I hope this helps. 🙂
Currently on the 2021 20.00.55, updated, and licensed.
I have been facing this issue since the 2019 version. Queues don't actually do anything,
whether I run PowerArchiver Starter as admin or not.
I queue some files, and most of them take forever to go from 1% to 2%;
and if, say, the 5th of 10 files is stuck, the other 4 should at least have completed, but no: the output directories stay just as they were.
Using the Queue feature makes PowerArchiver process and compress the files, but the PA archives never appear, and the disk's total used space doesn't change.
However, if I manually compress files without queuing, whether I run multiple compressions or one at a time, it always succeeds.
Sometimes compression gets stuck at a certain percentage for a 50 or 200 MB file, forever.
I want to speed up multiple compressions. I find the batch compression feature useless, as I cannot configure any custom parameters or advanced options under the PA archive format.
So, let me save the parameters (Advanced Options) of profiles:
i) Either I should be able to customize the PA Default profile, so I don't have to select files and then manually select a profile; as it stands, not even a single parameter can be saved.
ii) Or I should be able to create a new profile and set it as the default.
It's the end of 2020 and I have been trying to reach the PowerArchiver team to add this, but no one wants to listen or do anything.
Refer to case PA-FCW-TLNFM-096.
I still cannot save parameters under Advanced Options, like setting the DLL/EXE filter to Delta, the PNG/PDF filter, setting zlib to 9, the dictionary size, or the experimental options.
So many revisions and versions have been released, but I still cannot save a profile with whatever settings I want, be it an existing or a new profile.
Don't you guys at PowerArchiver labs get frustrated every time you have to configure these parameters manually?
When I stop an ongoing compression process that is stuck (or even one that isn't stuck) by clicking Cancel and then No, it still continues compressing for no reason. If I start a compression and pause it, the progress bar shortly jumps to completion, but no compression is done, and then the archive is stuck forever.
Just cancel it
For all my claims above, I have sent a ticket today with screenshots from my registered email. If you can:
Batch archive is a good idea, but if I have multiple directories inside directories, then I want each first-level child directory compressed as a whole, with all of its subdirectories inside that same archive.
I have a folder at D:\pics.
That folder contains folders such as cats, dogs, and fishes.
Each of those folders has its own subfolders (e.g. black, white, colorful, fluffy).
I want those subfolders (black, white, colorful, fluffy) not to be compressed individually; instead, the cats, dogs, and fishes folders should each be compressed individually, with their subfolders inside.
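In the meantime, that layout can be scripted outside PA; a minimal sketch using Python's standard library (the D:\pics layout is your example; zip is used here only because shutil has no PA-format support, and `zip_each_child` is a hypothetical helper, not a PA feature):

```python
import os
import shutil
import tempfile
import zipfile

def zip_each_child(parent):
    """Create one archive per first-level child folder of `parent`.

    Subfolders end up *inside* their parent's archive (cats.zip, dogs.zip,
    fishes.zip) instead of each being compressed individually.
    """
    archives = []
    for entry in sorted(os.listdir(parent)):
        child = os.path.join(parent, entry)
        if os.path.isdir(child):
            # base_name is the archive path without extension; root_dir is the
            # tree that gets walked, so the whole subtree lands in one archive.
            archives.append(shutil.make_archive(child, "zip", root_dir=child))
    return archives

# Tiny self-contained demo standing in for D:\pics with cats\black inside:
demo = tempfile.mkdtemp()
os.makedirs(os.path.join(demo, "cats", "black"))
with open(os.path.join(demo, "cats", "black", "pic.txt"), "w") as f:
    f.write("x")
made = zip_each_child(demo)
with zipfile.ZipFile(made[0]) as z:
    names = z.namelist()
print(made, names)
```

Calling this on D:\pics would produce cats.zip, dogs.zip, and fishes.zip, each containing its own black/white/colorful/fluffy subfolders.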
Support hasn't answered my email about a license question for weeks; despite a lifetime license, PowerArchiver refuses some functions.
Dear PowerArchiver Team!
As I unfortunately found out, there is a big bug in PowerArchiver which causes the compression ratio to drop really significantly! I wonder if no one has noticed this yet? Are you guys still actively developing?
The following error occurs (and to my mind it is a blatant one!):
ERROR: File formats are not recognized automatically from their content. If a file named, e.g., Mütze.jpg is simply renamed to Mütze.txt,
Mütze.png, Mütze.mp3, Mütze.pdf, etc., a DIFFERENT compression algorithm is applied each time to the SAME FILE! Remember, only the file extension was changed, e.g. from .jpg to .txt. Just take a .jpg, .txt, or .pdf file and change the extension, and you will see this error! The file headers are not read automatically. Is this really intended? If so, the optimal compression algorithm can never be found and applied by PowerArchiver. You should definitely investigate this dramatic error!
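For what it's worth, detecting the real format from a file's header rather than its extension is straightforward; a sketch of the idea (the signature table is a small hand-picked subset for illustration, not PA's actual detection logic):

```python
# Map of leading "magic" bytes to formats; already-compressed formats would be
# stored rather than run through a text-oriented filter.
SIGNATURES = [
    (b"\xFF\xD8\xFF", "jpeg"),
    (b"\x89PNG\r\n\x1a\n", "png"),
    (b"%PDF-", "pdf"),
    (b"ID3", "mp3"),          # MP3 with an ID3 tag; raw frames start differently
    (b"PK\x03\x04", "zip"),
]

def sniff_bytes(head):
    """Guess a format from the leading bytes, ignoring any file name."""
    for magic, name in SIGNATURES:
        if head.startswith(magic):
            return name
    return "unknown"

def sniff_file(path):
    """Same, reading the first 16 bytes of a file on disk."""
    with open(path, "rb") as f:
        return sniff_bytes(f.read(16))
```

Renaming Mütze.jpg to Mütze.txt does not change the header, so `sniff_file` would still report jpeg for it.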
Greetings from your trusty Konglomat!
Please stay healthy and make PowerArchiver more powerful than ever! I am looking forward to future updates! You are still awesome! Please keep up the good work! I love PowerArchiver, really more than just about anything! And I will probably buy and use it for eternity!
There would be many of us with Intel processors,
and Intel has its own optimized zlib implementation, which can be more efficient when combined with their hardware.
Can we get the same functionality under the Hardware Acceleration feature?
zlib is not the only thing Intel has optimized for their processors; there are many such libraries which, combined, could result in more efficient compression and better ratios.
I thought I'd give it a try and upgrade my PA 2018 Toolbox on my production machine to PA 2021 (20.10.03).
Sending registration codes via your order recovery doesn't seem to work at the moment (I also checked my junk folder), but online activation works.
Using the registry to disable modules (for example HKLM\SOFTWARE\PowerArchiverInt\General\DisableBurning) doesn't work as expected with the modern UI: the module is still visible, and I get the trial nag if I try to open it. It's almost fine in classic and ribbon mode.
"DisableClouds" and "DisableExplorer" seem not to be respected at all, while the classic UI seems to respect the settings for disabling any other unused module.
In Settings, I can't access the Smart AI settings or the settings for the internal editor if the FTP module is disabled.
Powerarchiver 2021 20.00.73
Windows 10 Education 10.0.19042 Build 19042
When extracting gcc-arm-10.2-2020.11-mingw-w64-i686-arm-none-linux-gnueabihf.tar.xz , Powerarchiver wrongly thinks some .exe files have a length of zero:
Once extracted:

D:\Temp\Powerarchiver\gcc-arm-10.2-2020.11-mingw-w64-i686-arm-none-linux-gnueabihf\bin>dir *.exe
 Volume in drive D is DATA
 Volume Serial Number is 0E12-BCA2

 Directory of D:\Temp\Powerarchiver\gcc-arm-10.2-2020.11-mingw-w64-i686-arm-none-linux-gnueabihf\bin

2020-11-20  07:10 PM         1,391,599 arm-none-linux-gnueabihf-addr2line.exe
2020-11-20  07:10 PM                 0 arm-none-linux-gnueabihf-ar.exe
2020-11-20  07:10 PM                 0 arm-none-linux-gnueabihf-as.exe
2020-11-20  07:41 PM         3,030,119 arm-none-linux-gnueabihf-c++.exe
2020-11-20  07:10 PM         1,389,293 arm-none-linux-gnueabihf-c++filt.exe
2020-11-20  07:41 PM         3,027,513 arm-none-linux-gnueabihf-cpp.exe
2020-11-20  07:10 PM         4,040,503 arm-none-linux-gnueabihf-dwp.exe
2020-11-20  07:10 PM           391,769 arm-none-linux-gnueabihf-elfedit.exe
2020-11-20  07:41 PM                 0 arm-none-linux-gnueabihf-g++.exe
2020-11-20  07:41 PM         3,026,926 arm-none-linux-gnueabihf-gcc-10.2.1.exe
2020-11-20  07:41 PM           609,607 arm-none-linux-gnueabihf-gcc-ar.exe
2020-11-20  07:41 PM           609,607 arm-none-linux-gnueabihf-gcc-nm.exe
2020-11-20  07:41 PM           609,607 arm-none-linux-gnueabihf-gcc-ranlib.exe
2020-11-20  07:41 PM                 0 arm-none-linux-gnueabihf-gcc.exe
2020-11-20  07:41 PM         2,165,533 arm-none-linux-gnueabihf-gcov-dump.exe
2020-11-20  07:41 PM         2,343,605 arm-none-linux-gnueabihf-gcov-tool.exe
2020-11-20  07:41 PM         2,450,233 arm-none-linux-gnueabihf-gcov.exe
2020-11-20  07:54 PM         9,605,899 arm-none-linux-gnueabihf-gdb.exe
2020-11-20  07:41 PM         3,028,997 arm-none-linux-gnueabihf-gfortran.exe
2020-11-20  07:10 PM         1,412,943 arm-none-linux-gnueabihf-gprof.exe
2020-11-20  07:10 PM                 0 arm-none-linux-gnueabihf-ld.bfd.exe
2020-11-20  07:10 PM                 0 arm-none-linux-gnueabihf-ld.exe
2020-11-20  07:10 PM                 0 arm-none-linux-gnueabihf-ld.gold.exe
2020-11-20  07:41 PM        25,546,567 arm-none-linux-gnueabihf-lto-dump.exe
2020-11-20  07:10 PM                 0 arm-none-linux-gnueabihf-nm.exe
2020-11-20  07:10 PM                 0 arm-none-linux-gnueabihf-objcopy.exe
2020-11-20  07:10 PM                 0 arm-none-linux-gnueabihf-objdump.exe
2020-11-20  07:10 PM                 0 arm-none-linux-gnueabihf-ranlib.exe
2020-11-20  07:10 PM                 0 arm-none-linux-gnueabihf-readelf.exe
2020-11-20  07:10 PM         1,393,083 arm-none-linux-gnueabihf-size.exe
2020-11-20  07:10 PM         1,392,464 arm-none-linux-gnueabihf-strings.exe
2020-11-20  07:10 PM                 0 arm-none-linux-gnueabihf-strip.exe
              32 File(s)     67,465,867 bytes
               0 Dir(s)  1,002,422,431,744 bytes free
When the same archive is extracted from a git bash session (after installing git 2.30.1 for Windows, 64-bit, from git-scm.com), the .exe files are extracted as expected:

xz -k -d gcc-arm-10.2-2020.11-mingw-w64-i686-arm-none-linux-gnueabihf.tar.xz
tar xf gcc-arm-10.2-2020.11-mingw-w64-i686-arm-none-linux-gnueabihf.tar
cd gcc-arm-10.2-2020.11-mingw-w64-i686-arm-none-linux-gnueabihf/bin
ll *.exe
-rwxr-xr-x 1 User 197121  1391599 Nov 20 19:10 arm-none-linux-gnueabihf-addr2line.exe*
-rwxr-xr-x 2 User 197121  1421598 Nov 20 19:10 arm-none-linux-gnueabihf-ar.exe*
-rwxr-xr-x 2 User 197121  2028927 Nov 20 19:10 arm-none-linux-gnueabihf-as.exe*
-rwxr-xr-x 2 User 197121  3030119 Nov 20 19:41 arm-none-linux-gnueabihf-c++.exe*
-rwxr-xr-x 1 User 197121  1389293 Nov 20 19:10 arm-none-linux-gnueabihf-c++filt.exe*
-rwxr-xr-x 1 User 197121  3027513 Nov 20 19:41 arm-none-linux-gnueabihf-cpp.exe*
-rwxr-xr-x 1 User 197121  4040503 Nov 20 19:10 arm-none-linux-gnueabihf-dwp.exe*
-rwxr-xr-x 1 User 197121   391769 Nov 20 19:10 arm-none-linux-gnueabihf-elfedit.exe*
-rwxr-xr-x 2 User 197121  3030119 Nov 20 19:41 arm-none-linux-gnueabihf-g++.exe*
-rwxr-xr-x 2 User 197121  3026926 Nov 20 19:41 arm-none-linux-gnueabihf-gcc-10.2.1.exe*
-rwxr-xr-x 1 User 197121   609607 Nov 20 19:41 arm-none-linux-gnueabihf-gcc-ar.exe*
-rwxr-xr-x 1 User 197121   609607 Nov 20 19:41 arm-none-linux-gnueabihf-gcc-nm.exe*
-rwxr-xr-x 1 User 197121   609607 Nov 20 19:41 arm-none-linux-gnueabihf-gcc-ranlib.exe*
-rwxr-xr-x 2 User 197121  3026926 Nov 20 19:41 arm-none-linux-gnueabihf-gcc.exe*
-rwxr-xr-x 1 User 197121  2165533 Nov 20 19:41 arm-none-linux-gnueabihf-gcov-dump.exe*
-rwxr-xr-x 1 User 197121  2343605 Nov 20 19:41 arm-none-linux-gnueabihf-gcov-tool.exe*
-rwxr-xr-x 1 User 197121  2450233 Nov 20 19:41 arm-none-linux-gnueabihf-gcov.exe*
-rwxr-xr-x 1 User 197121  9605899 Nov 20 19:54 arm-none-linux-gnueabihf-gdb.exe*
-rwxr-xr-x 1 User 197121  3028997 Nov 20 19:41 arm-none-linux-gnueabihf-gfortran.exe*
-rwxr-xr-x 1 User 197121  1412943 Nov 20 19:10 arm-none-linux-gnueabihf-gprof.exe*
-rwxr-xr-x 4 User 197121  2572182 Nov 20 19:10 arm-none-linux-gnueabihf-ld.bfd.exe*
-rwxr-xr-x 4 User 197121  2572182 Nov 20 19:10 arm-none-linux-gnueabihf-ld.exe*
-rwxr-xr-x 2 User 197121  4550029 Nov 20 19:10 arm-none-linux-gnueabihf-ld.gold.exe*
-rwxr-xr-x 1 User 197121 25546567 Nov 20 19:41 arm-none-linux-gnueabihf-lto-dump.exe*
-rwxr-xr-x 2 User 197121  1404945 Nov 20 19:10 arm-none-linux-gnueabihf-nm.exe*
-rwxr-xr-x 2 User 197121  1531656 Nov 20 19:10 arm-none-linux-gnueabihf-objcopy.exe*
-rwxr-xr-x 2 User 197121  1991350 Nov 20 19:10 arm-none-linux-gnueabihf-objdump.exe*
-rwxr-xr-x 2 User 197121  1421598 Nov 20 19:10 arm-none-linux-gnueabihf-ranlib.exe*
-rwxr-xr-x 2 User 197121  1163376 Nov 20 19:10 arm-none-linux-gnueabihf-readelf.exe*
-rwxr-xr-x 1 User 197121  1393083 Nov 20 19:10 arm-none-linux-gnueabihf-size.exe*
-rwxr-xr-x 1 User 197121  1392464 Nov 20 19:10 arm-none-linux-gnueabihf-strings.exe*
-rwxr-xr-x 2 User 197121  1531656 Nov 20 19:10 arm-none-linux-gnueabihf-strip.exe*
The same archive file does extract properly under a native Linux Ubuntu 20.04 system, or under Windows 10 using the WSL2 Linux subsystem with xz and tar.
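One clue worth noting: in the git bash listing, every file PA extracts as zero bytes has a link count of 2 or 4, i.e. it is stored in the tar as a hard link to another member rather than as file data. The miniature below reproduces this with Python's tarfile (the file names are stand-ins): an extractor that zero-fills or skips LNKTYPE (hard link) entries would produce exactly the 0-byte files shown above.

```python
import os
import tarfile
import tempfile

# Two hard-linked files, archived: tar stores the second name as a LNKTYPE
# member with size 0 that points at the first, just like ld.exe -> ld.bfd.exe
# in the GNU toolchain archive.
members = []
with tempfile.TemporaryDirectory() as d:
    src = os.path.join(d, "ld.bfd.exe")
    with open(src, "wb") as f:
        f.write(b"\x00" * 1024)                  # stand-in for the real binary
    os.link(src, os.path.join(d, "ld.exe"))      # hard link, as in the toolchain

    archive = os.path.join(d, "demo.tar")
    with tarfile.open(archive, "w") as tar:
        tar.add(src, arcname="bin/ld.bfd.exe")
        tar.add(os.path.join(d, "ld.exe"), arcname="bin/ld.exe")

    with tarfile.open(archive) as tar:
        for m in tar:
            members.append((m.name, m.islnk(), m.linkname, m.size))

for name, is_link, target, size in members:
    print(name, f"hard link -> {target}" if is_link else f"{size} bytes")
```

A correct extractor resolves the link (or copies the target's data), which is what xz+tar do; extracting the link member's stored size (0) literally would match PA's behavior here.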
Hi! I came across an issue when I tried to uninstall PowerArch 2013. I got an error “The specified account already exists” which broke the process. I got the same error when I tried to install the current version, without uninstalling the previous one.
So I can’t uninstall and I can’t install PowerArchiver. Any suggestions? OS Windows 8.1 x64.
I’m using Convert Archives on a local folder of .zip files, converting them to .lzh files, and I’m finding that perhaps 1 in 75 files is actually converting and outputting a file. Even then, the .lzh file is incomplete, missing most of the source files.
The progress bar completes OK, say 75/75 files, but only one .lzh file exists in the output folder.
It seems to be the same with different source and output folders on both local and USB disk - I can’t see a pattern.
The files extract and compress (to .lzh) without issue on their own.
Is there a log file I can check please, to see why PA does not like these files as part of the batch conversion?
Thanks very much, Rich.
Hello. My problem with Powerarchiver is as follows:
First off, let me say that I am running Windows 7 Starter edition on my Toshiba NB505 laptop. You can see its specs if you google it. Needless to say, it has very limited resources: low RAM and roughly a 1.66 GHz Atom processor.
I am currently using the latest version of PA, but mostly I use it to create and uncompress zip files, which comes in handy for compressing older files I don't need. This is where my issue with the program comes in: version after version, it becomes harder for my computer to load the program and to create zip and 7-zip files. Needless to say, the same goes for zipx.
I have been using PA for a decade and it's a good program, but the current version has things I don't need or seldom use, such as the file preview panel, the explorer tabs, skin customization, and the nagging ConeXware update tool. So I was just asking: is there a minimalist version of PA, or perhaps some ultra-light edition of it? Something that can run on older computers.
Please don't tell me to install an older version or go back to 2003; that is not what I had in mind. I know I could use the command-line version, but for me using a GUI is more convenient. Regarding compression speeds, I mostly use Deflate for legacy zip files on the fast setting.
So far, what I am doing is setting PA to use the classic Windows skin, and while that does alleviate some of the lagging symptoms, it doesn't really seem to make the program run or load faster.
Any help or advice on my problem? By the way, I use the PRO version, as that's my current license. Also, does PA have any PDF user's manual or reference book? I see numerous changes and functions have been added, but I have no clue how to use them. I hope somebody can help me with that as well.
PA not using current folder as temp as it should
There are 2 settings in PA configuration options:
Miscellaneous > Speed > Use current folder as temp
Folders > Temp Location
These two settings seem to contradict each other: if you select "use current folder as temp", the Temp Location field won't let you delete what's in there, and that setting overrides the speed setting. Extracting from the H: drive to a temp folder on C:, and then copying back to H:, takes twice as long as it would to just extract to the current directory on H: and then rename the file(s) if needed…
The Speed setting is only used for compression, I believe.
Best is to set the temp folder to the drive you use the most for zipping/unzipping.
There are a lot of problems with overwriting files directly (not using temp), so that will not be done, to save you from possible loss of data in specific situations. Better safe than sorry.
Which specific situations, if I may ask? Wouldn't a confirm-overwrite prompt work? That should be the default behavior. For those of us who store data on many different drives, having a forced Temp dir setting is really inconvenient.
If the data is corrupted or the extraction is cancelled, you will lose your current files if they were being overwritten without a temp file.
Agreed. But this shouldn’t be an issue. Here is the flow chart:
1. User right-clicks archive file and selects “powerarchiver > extract here”
2. PA notices files or directories it’s about to extract already exist in current dir (I guess it scans archive and scans current dir and makes a comparison)
3. PA pops-up a prompt dialog confirming overwrite: Overwrite? Yes|No
4. If user selects ‘Yes’ then user is fine with current data being overwritten, if user selects ‘No’ then process aborts.
5. User notices that all extraction operations take half the time they did when using a forced temp dir on a different hard drive and, feeling extremely pleased, recommends PA to all his friends, stating that the program has great support and works really fast. :D
Seriously though, this is really quite simple and is the default behavior for so many programs, any time a user is about to overwrite something there is always a confirmation popup dialog first. It’s Windows, it was designed to be idiot-proof. Let’s not add 100% time overhead to the process in the interest of idiot-proofing it even more.
By the way, what is the current behavior when a file about to be extracted already exists in the current dir? First it copies the file to the other drive where the Temp folder is, then it copies the file back to the current dir (all the while the hard drives get quite a workout…), notices that the file already exists, and pops up an overwrite confirmation dialog at that point? Then couldn't we just eliminate all the time-consuming copying? Thank you for your time.
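There is a middle ground for this debate: write to a temp file in the destination's own directory and atomically swap it in only once the write (and any integrity check) succeeds. That keeps the crash-safety the developers want while avoiding the cross-drive round trip, because a same-volume rename is nearly free. A sketch of the idea, not PA's actual implementation (`safe_write` is a hypothetical helper):

```python
import os
import tempfile

def safe_write(dest_path, data):
    """Write data to dest_path without ever leaving a half-written file.

    The temp file lives in the destination's own directory, so the final
    os.replace() is a same-volume rename (cheap, and atomic on POSIX/NTFS).
    A failed or cancelled write leaves the original file untouched, yet no
    copy to a temp folder on another drive is ever needed.
    """
    dest_dir = os.path.dirname(dest_path) or "."
    fd, tmp = tempfile.mkstemp(dir=dest_dir, prefix=".pa-tmp-")
    try:
        with os.fdopen(fd, "wb") as f:
            f.write(data)           # a real extractor would stream + CRC-check here
        os.replace(tmp, dest_path)  # atomic swap: old data survives any failure above
    except BaseException:
        os.unlink(tmp)
        raise
```

This is why extracting H: -> temp-on-H: -> rename is so much faster than the H: -> C: -> H: round trip described above, while still protecting the existing file from a corrupt or cancelled extract.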
The problem is that when you don't have a temp folder, you might extract a bad file and it will overwrite a good one. There is no way around that.
The only problem comes when your temp drive and destination drive differ. If you have everything on the same drive, the file is just moved over…