What's the difference between LZMA2 and 7zip?
LZMA2 is supposed to be an "improved version of LZMA". Somewhere in the 7zip/LZMA documentation I read that it should compress things like audio/video/pictures (or any other poorly compressible data) better. In my tests LZMA beats LZMA2 in 80-90% of cases, and when it does not, the advantage of LZMA2 is minimal.
wishmasterf said:
Does this mean an implementation into USB loaders could be possible with usable speed? Maybe an extension to the WBFS filesystem to store them would be great.
No, WIA is not suitable for USB loaders because of its RAM and CPU time usage.
At the moment I am closing development of WIA. Perhaps I will make some tests with larger chunk sizes. I think compressing a complete image as one chunk will never be part of WIT, because (a) the file layer of my tools needs random access (e.g. for patching while copying), and (b) this is a job for the packing tools. But it is no problem to write a little batch file that (de)compresses an image. This is the old Unix philosophy.
LZMA2 is supposed to be an "improved version of LZMA". Somewhere in the 7zip/LZMA documentation I read that it should compress things like audio/video/pictures (or any other poorly compressible data) better. In my tests LZMA beats LZMA2 in 80-90% of cases, and when it does not, the advantage of LZMA2 is minimal.
OK, so it seems LZMA is more useful for Wii data. But what is the difference between compressing with LZMA and compressing with 7zip? Do they combine it with any other compression methods?
Wiimm said:
No, WIA is not suitable for USB loaders because of its RAM and CPU time usage.
OK. I only want to use it to archive my games, but if it were possible, it would be great to use that format. Is WIA/PURGE not usable for loaders at all, or only because of the compression?
At the moment I am closing development of WIA. Perhaps I will make some tests with larger chunk sizes. I think compressing a complete image as one chunk will never be part of WIT, because (a) the file layer of my tools needs random access (e.g. for patching while copying), and (b) this is a job for the packing tools. But it is no problem to write a little batch file that (de)compresses an image. This is the old Unix philosophy.
If you want the best compression of all, then store your file as WIA/NONE and compress it with an external packer (like 7zip). The NONE format is designed for this use and minimizes the management info. If you want WIA checksums, then use WIA/PURGE instead; the compressed files are only minimally larger. Before reading, such an image must be completely decompressed.
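The "little batch file" idea can be sketched as a small shell script. This is only a sketch: `xz` stands in for the external 7zip packer so the script is runnable, and the `wit` invocation shown in the comments is an assumption based on this thread, not verified syntax.

```shell
# Workflow sketch: store the image as WIA/NONE, then compress externally.
# Step 1 (assumed wit syntax, shown for illustration only):
#   wit copy game.iso game.wia     # with WIA/NONE as the compression mode
# Step 2: compress the uncompressed WIA with an external packer.
# A dummy file stands in for game.wia here so the script can actually run:
set -e
printf 'dummy image data %.0s' $(seq 1 1000) > game.wia
xz -9 --keep game.wia          # external LZMA packer, as 7zip would be
ls -l game.wia game.wia.xz     # compare original and packed sizes
```

Decompression is the same idea in reverse (`xz -d game.wia.xz`, or `7z x game.7z` if 7zip was used), which is why the author calls it "the old Unix philosophy": each tool does one job.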
How does using 7-zip (LZMA1) on a WIA with no compression compare to using 7-zip (LZMA1) on an FST dump? Are they essentially the same procedure (compressing the decrypted files inside the Wii game, especially /DATA/)? I think they are:
3-4 months ago I made tests with decrypted images. If you encrypt a decrypted image, the result is the original ISO image. The advantage of decrypted images is that the data doesn't look like random data, and that makes decrypted images compressible.
wwt extract uses the old wbfslib extraction function (sequential writing), but WIA needs to look inside the disc before it starts writing. wwt extract also does not support any patching or scrubbing. Rewriting it is already on my to-do list.
In other words: "wwt extract --wia" does not work. (I have to deny this option.)
The last version seems to be great! I will do my tests and report my results. I will see if I have enough time, but maybe I will post my test results next weekend.
WIA now supports variable chunk sizes, always a multiple of 2 MiB. The 2 MiB base value was chosen because it is the size of a Wii sector group. The 64 sectors of a Wii sector group share H2 hash values, so loading all 64 sectors is necessary to calculate the hash values.
Since yesterday the test script ./scripts/test-image-size.sh has been running. Here is a sample result for Animal Crossing:
'@N' is the chunk-size factor; '@50' means 50 * base_chunk_size = 100 MiB. For this @50 calculation wit needs about 300 MiB of RAM. All compressions were done with level 9. You can see that the internal WIA/LZMA@50 beats the best external 7zip compression.
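The chunk-size arithmetic described above is simple; a quick check of the @50 example:

```shell
# '@N' multiplies the 2 MiB base chunk (the size of one Wii sector group).
base_mib=2
factor=50
chunk_mib=$((base_mib * factor))
echo "@${factor} chunk size: ${chunk_mib} MiB"   # @50 -> 100 MiB chunks
```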
Just call ./scripts/test-image-size.sh to print the built-in help.
Remark II:
The next beta release is coming today or tomorrow. The WIA format has not been changed since the previous beta, so already generated WIAs can still be used.
Remark III:
The new (and beta) tool 'wdf' can show some parameters of a WIA. Try 'wdf +dump image'. If you forget the '+dump', it will pack the image to a WDF. You can also rename it to 'wdf-dump' to set dumping as the default command. 'wdf --help' prints a help message.
do you know why the internal compression is taking so much longer than external?
based on those 4 discs it looks like none@2 or none@5 plus external 7zip is a really good mix for speed and high compression...
have you tested conversion back to a scrubbed playable format? the speed benefit of external might be lost with the extra step
do you know why the internal compression is taking so much longer than external?
based on those 4 discs it looks like none@2 or none@5 plus external 7zip is a really good mix for speed and high compression...

Perhaps it is the compression level: 5 is the default and I use 9 (all levels >= 7 are equal). But I don't know the internals of 7zip.
vexing said (Sep 20 2010, 10:47 PM):
have you tested conversion back to a scrubbed playable format? the speed benefit of external might be lost with the extra step
After your test results I think LZMA with chunk-size factor @50 should be the default setting. In most cases it seems to be one of the best settings. What do you think, Wiimm?
have you tested conversion back to a scrubbed playable format? the speed benefit of external might be lost with the extra step

My test script makes a diff for each created WIA file.
by tested I meant timed. If it takes 10 minutes to go .7z -> .wia -> .iso but only 5 minutes to go .wia/lzma -> .iso, that is important to know, because images will generally only be compressed once but potentially decompressed multiple times.
After your test results I think LZMA with chunk-size factor @50 should be the default setting. In most cases it seems to be one of the best settings. What do you think, Wiimm?
@50 was most compressed generally but the speed differences were significant at smaller chunk sizes... comparing @50 and @20 in that test set, at worst @20 was 6% larger final size and 17% faster, at best it was 2x as fast with a very slightly smaller final size.
@50 was most compressed generally but the speed differences were significant at smaller chunk sizes... comparing @50 and @20 in that test set, at worst @20 was 6% larger final size and 17% faster, at best it was 2x as fast with a very slightly smaller final size.
Oh, I see. So maybe the best setting could be @20. We should do more tests to find the best default setting, because the current setting @5 does not seem to be the best choice.
@Wiimm: what do you think about changing the default chunk size?
In the latest beta @20 is the default. At the moment a script is running with @10,20,30,40,50 for about 10 different games. It will run until tomorrow. After that we can continue the discussion.
EDIT:
The first 3 results:
Code:
Summary of RUUP01, Animal Crossing: Let's Go to the City:
     59919934  16.98%    3:04.094 m:s  WIA/LZMA@40
     60852341  17.25%    2:59.068 m:s  WIA/LZMA@30
     61577490  17.45%    3:13.285 m:s  WIA/LZMA@50
     64762661  18.36%    2:49.431 m:s  WIA/LZMA@20
     67473743  19.13%    2:42.040 m:s  WIA/LZMA@10
    352695380 100.00%       3.746 sec  WDF

Summary of RMCP01, Mario Kart Wii:
   2258986334  81.47%   28:37.735 m:s  WIA/LZMA@50
   2269315147  81.84%   27:01.235 m:s  WIA/LZMA@40
   2275611253  82.07%   21:50.050 m:s  WIA/LZMA@20
   2278355000  82.17%   24:42.624 m:s  WIA/LZMA@30
   2311609224  83.37%   18:21.345 m:s  WIA/LZMA@10
   2772579124 100.00%    1:19.534 m:s  WDF

Summary of R3OP01, Metroid: Other M:
   5655712840  72.59% 1:32:47.480 hms  WIA/LZMA@50
   5663710912  72.70% 1:28:19.128 hms  WIA/LZMA@40
   5675384932  72.85% 1:22:05.267 hms  WIA/LZMA@30
   5694298457  73.09% 1:13:04.454 hms  WIA/LZMA@20
   5730003666  73.55% 1:02:51.743 hms  WIA/LZMA@10
   7790477636 100.00%    4:10.745 m:s  WDF
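The percentage column above is simply wia_size / wdf_size. Re-deriving the first Animal Crossing row as a sanity check (the table shows 16.98%, i.e. it truncates rather than rounds):

```shell
# Ratio of the WIA/LZMA@40 size to the uncompressed WDF size (RUUP01).
wia=59919934
wdf=352695380
awk -v a="$wia" -v b="$wdf" 'BEGIN { printf "%.2f%%\n", 100 * a / b }'
# prints 16.99% (exact value is ~16.989%)
```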
EDIT II:
Command is: ./test-image-size.sh --compr lz --level @10,@20,@30,@40,@50 --fast --data *.wdf
Try it with your own games.
OK, for the last 3 games we see that on average @10 could also be the best setting. Maybe the best choice lies between @10 and @20: perhaps @16, @14 or something else.
And now I have bad news: the LZMA functions use a user-specified allocation function, so it was easy to count the dynamic memory usage of LZMA/LZMA2. Here are the (bad) results:
Code:
  method level         memory usage
-----------------------------------------
   LZMA  .1        1831198 =   1.75 MiB
   LZMA  .2        3174686 =   3.03 MiB
   LZMA  .3        9072926 =   8.65 MiB
   LZMA  .4       32665886 =  31.15 MiB
   LZMA  .5      194146594 = 185.15 MiB  default for 7zip
   LZMA  .6      387084578 = 369.15 MiB
   LZMA  .7-9    705851730 = 673.15 MiB

   LZMA2 .1        4938158 =   4.71 MiB
   LZMA2 .2        5986734 =   5.71 MiB
   LZMA2 .3       10705326 =  10.21 MiB
   LZMA2 .4       32731566 =  31.22 MiB
   LZMA2 .5      194212274 = 185.22 MiB  default for 7zip
   LZMA2 .6      387150258 = 369.22 MiB
   LZMA2 .7-9    705917410 = 673.22 MiB
Levels 1-4 use a different algorithm than levels 5-9, and levels 7-9 are identical. Additionally, WIA itself uses 2*chunk_size == 4 MiB * factor of dynamic memory. BZIP2 level 9 should take 47 MiB.
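Putting the table and the chunk buffers together gives a rough peak-memory estimate for the internal compressor. A sketch using the numbers above:

```shell
# Peak memory ~= LZMA table value + 2 * chunk_size (= 4 MiB * factor).
lzma_mib=673            # LZMA level 7-9 from the table above (~673.15 MiB)
factor=20               # chunk-size factor, e.g. the @20 default
buffer_mib=$((4 * factor))           # 2 * chunk_size = 4 MiB * factor
total_mib=$((lzma_mib + buffer_mib))
echo "estimated peak: ${total_mib} MiB"   # 673 + 80 = 753 MiB
```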
Code:
LZMA Encoder will use default values for any parameter, if it is
  -1  for any from: level, lc, lp, pb, fb, numThreads
   0  for dictSize

level - compression level: 0 <= level <= 9;
  algo = 0 means fast method
  algo = 1 means normal method
dictSize - The dictionary size in bytes. The maximum value is
        128 MB = (1 << 27) bytes for 32-bit version
          1 GB = (1 << 30) bytes for 64-bit version
     The default value is 16 MB = (1 << 24) bytes.
     It's recommended to use the dictionary that is larger than 4 KB and
     that can be calculated as (1 << N) or (3 << N) sizes.
lc - The number of literal context bits (high bits of previous literal).
     It can be in the range from 0 to 8. The default value is 3.
     Sometimes lc=4 gives the gain for big files.
lp - The number of literal pos bits (low bits of current position for literals).
     It can be in the range from 0 to 4. The default value is 0.
     The lp switch is intended for periodical data when the period is equal to 2^lp.
     For example, for 32-bit (4 bytes) periodical data you can use lp=2. Often it's
     better to set lc=0, if you change lp switch.
pb - The number of pos bits (low bits of current position).
     It can be in the range from 0 to 4. The default value is 2.
     The pb switch is intended for periodical data when the period is equal to 2^pb.
fb - Word size (the number of fast bytes).
     It can be in the range from 5 to 273. The default value is 32.
     Usually, a big number gives a little bit better compression ratio and
     slower compression process.
numThreads - The number of threads. 1 or 2. The default value is 2.
     Fast mode (algo = 0) can use only 1 thread.
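These encoder parameters map onto the options that `xz` exposes for its LZMA2 filter, so they can be experimented with from the command line. A sketch with illustrative (not recommended) values, following the lp/lc advice above for periodic data:

```shell
set -e
# Periodic 4-byte-ish data, the case the lp switch is meant for:
printf 'ABCDABCDABCD%.0s' $(seq 1 500) > sample.bin
# Tune lc/lp/pb and the dictionary size by hand (note: lc + lp must be <= 4):
xz --keep --lzma2=preset=9,lc=0,lp=2,pb=2,dict=1MiB sample.bin
xz --test sample.bin.xz && echo "roundtrip ok"
```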
Here are some statistics for different LZMA compression levels:
I have removed the level 9 results because they always have the same size as the level 7 results.
Code:
Command: ./test-image-size --fast --data --compr lzma \
         --level '.1@10,.1@30,.1@50,.3@10,.3@30,.3@50,
                  .4@10,.4@30,.4@50,.5@10,.5@30,.5@50,
                  .7@10,.7@30,.7@50,.9@10,.9@30,.9@50'
Discussion: What is the best LZMA default?
I already offer some keywords like BEST, FAST and DEFAULT. Perhaps I need more, like LOWMEM, HIMEM, ...