Commodore 64 validation

nitro322
Posts: 37
Joined: 05 Aug 2018 00:38

Commodore 64 validation

Post by nitro322 »

What's the current status of the C64, specifically floppy disks?

I know that no-intro cataloged the preservation project collection, but the DAT file for that hasn't been updated since 2013. It also uses NIB images, whereas the preservation project currently uses NBZ images (which, as far as I can tell, are just a compressed form of NIB). It's possible to convert NBZ to NIB using nibtools, but the converted images don't verify against the no-intro DAT file.
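
In case it's useful, the conversion itself is a one-liner (a sketch assuming a stock nibtools build that picks the formats from the file extensions; the filename is just an example):

Code:

$ nibconv Action\ Biker.nbz Action\ Biker.nib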

At this point there doesn't seem to be a good way to verify C64 disk images, unless I'm missing something (and I hope I am).

Is there any effort to catalog the newer preservation project NBZ files? If not, is there any particular reason not to? Lack of time and/or desire are two obvious candidates, but I'm not sure whether there was a decision to intentionally not use those images. I'm happy to contribute to any outstanding work here if I can, I just don't know if any work is being done at all. 2013 is a looong time without any updates.

Would appreciate if anyone could clue me in. Thanks.
xuom2
High Council
Posts: 926
Joined: 22 May 2008 18:45

Re: Commodore 64 validation

Post by xuom2 »

C64 is unmaintained at the moment. It can be wiped and re-added if you have a better romset (provided, of course, that the new romset maintainer agrees and is given proper credit).
nitro322
Posts: 37
Joined: 05 Aug 2018 00:38

Re: Commodore 64 validation

Post by nitro322 »

Understood. Thanks for clarifying.
nitro322
Posts: 37
Joined: 05 Aug 2018 00:38

Re: Commodore 64 validation

Post by nitro322 »

I've been doing a little research on this topic. It doesn't appear to be easily solvable, at least not with the usual system of checksumming the output file, which rests on the assumption that a good dump always produces identical output.

From what I've been able to gather, reading a C64 floppy disk is more akin to ripping an analog record than reading an optical disc: you're going to get imperfections (especially around copy protection) that produce different output from different disks, or even different output from the same disk read multiple times.
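
To make that concrete: checksum two reads of the same disk at the file level and they'll typically disagree (hypothetical filenames):

Code:

$ md5sum read1.nib read2.nib   # two reads of the same physical disk
# the two file-level hashes will usually not match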

Looking specifically at the NIB format used by the C64 Preservation Project, it's expected that the NIB/NBZ files themselves will differ. Instead, they provide a utility that computes checksums of the disk's contents. E.g.:

Code:

$ nibscan ~/data/games/Commodore/Commodore\ 64/Backup/dump/Action\ Biker.nbz | grep MD5
BAM/DIR MD5:    0xdd6a5557932cf9d58bde653549867429
[682/683 sectors] Full MD5:     0xd443f9a63499c60ce94fbf652dd390be
Those checksums from my personal dump match the existing C64 PP copy that's out there, but the checksum of the NBZ file itself does not.

So, a DAT file consisting of checksums of NBZ files wouldn't be useful for verifying your own dumps. The only thing it would be good for is verifying that a dump obtained elsewhere precisely matches the one dump that has been deemed authoritative. To be clear, that still has some value for ensuring your ROM collection is good, but being unable to verify your own or any new dumps is a big weakness.

An alternative approach could be to store something like the computed checksum of the data inside the NBZ file. Then, by comparing the computed values of your own dumps against an authoritative set, you'd be able to verify whether you have a good dump. I think this is overall a more useful approach, but, of course, it requires a separate format-specific utility to compute the internal checksums, and it breaks pretty much all existing ROM verification tools that just look at the file itself.
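
As a rough sketch of what that verification could look like (hypothetical known-good.md5 list with one hash per line; assumes nibscan's output format matches the example above):

Code:

#!/bin/sh
# Sketch: verify .nbz dumps by their internal "Full MD5" rather than
# by the checksum of the container file itself.
for f in *.nbz; do
    md5=$(nibscan "$f" | grep 'Full MD5' | awk '{print $NF}')
    if grep -q "$md5" known-good.md5; then
        echo "MATCH    $f"
    else
        echo "UNKNOWN  $f"
    fi
done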

Complicating matters further, there are other tools that image disks differently than described above. Rather than producing a single output file, they produce multiple "stream" files, with each track of the disk stored in its own file. This "flux" format is considered by some to be more accurate for preservation than NIB/NBZ, but I don't have the right hardware to produce dumps in that format. Obviously it would add further complications for checksumming and verifying that data.
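
For what it's worth, even a multi-file dump could be reduced to a single checksum. One naive possibility (just a sketch, not an established convention; assumes one directory of per-track stream files per disk):

Code:

$ cd Action\ Biker/          # hypothetical directory of stream files
$ md5sum * | sort | md5sum   # hash each track file, sort the list, hash the list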

So...

That's a lot of text, but I wanted to share my findings and hopefully generate a little discussion on this topic. Have similar challenges come up with other formats before? Is there any precedent for using internal checksums as described above rather than checksumming the file itself? Are there other options for tackling this that I'm not seeing?

Ultimately, I'm trying to determine if there's any good path forward for this.