Kerr-Mudd, John
2024-05-22 15:46:29 UTC
Note: xposted to afc
On Wed, 22 May 2024 09:23:02 -0400, Paul wrote:
Windows 7 supports GPT partitioning, which removes the 32-bit MBR limitation
when defining storage. If you were using MSDOS partitioning on Windows 7, that
has a 2.2TB limit. When you buy a big drive, you use GPT so that all of
the disk can be used without a problem.
Is it possible to have both types of hard drive partitioning on your system,
GPT drives and MBR drives, and your modern BIOS would accommodate both and
Windows Explorer would just 'see' both drives without complaint?
Yes, absolutely.
It's nice when stuff works :-) That's for sure.
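(If you ever want a quick look at which drives in a mixed setup are MBR and which are GPT, here's a small sketch of mine, not anything from Paul's post. It just shells out to PowerShell's Get-Disk, which reports the partition style per physical disk.)

import subprocess

# List each physical disk's number, partition style (MBR/GPT) and size
# by calling PowerShell's Get-Disk cmdlet (Windows 8 / Server 2012 or later).
result = subprocess.run(
    ["powershell", "-NoProfile", "-Command",
     "Get-Disk | Select-Object Number, PartitionStyle, Size | Format-Table -AutoSize"],
    capture_output=True, text=True)
print(result.stdout)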
*******
The next area of interest, coming up, is that NTFS only has four billion (2^32) clusters per volume.
2^64 clusters − 1 cluster (format) [2^64 is the theoretical on-disk format limit; the Windows implementation stops at 2^32 clusters]
256 TB size − 64 KB, using 64 KB clusters ( early Windows 10 or less )
8 PB size − 2 MB, using 2 MB clusters ( late Windows 10 had more cluster sizes added, NotBackwardCompatible )
( if you show this partition to Win7, it "offers to format it", out of spite )
From this, we might conclude (see the quick arithmetic sketch below):
- 16 TB max size for C:, since a Windows 10 install wants only 4 KB clusters (which enables compression)
- It's possible Win7 might still accept 64 KB clusters for a Windows install.
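Here is the arithmetic sketch mentioned above: my own back-of-the-envelope check (not from the Microsoft docs), just multiplying the 2^32 cluster limit by each cluster size.

# Maximum NTFS volume size = (2**32 - 1 clusters) * cluster size,
# assuming the 2**32 implementation limit discussed above.
MAX_CLUSTERS = 2**32 - 1

for label, cluster_bytes in [("4 KB", 4 * 1024),
                             ("64 KB", 64 * 1024),
                             ("2 MB", 2 * 1024 * 1024)]:
    max_volume = MAX_CLUSTERS * cluster_bytes
    print(f"{label:>5} clusters -> about {max_volume / 2**40:,.0f} TiB max volume")

# Prints roughly 16 TiB, 256 TiB and 8,192 TiB (8 PiB),
# matching the 16 TB / 256 TB / 8 PB figures above.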
As a test of VirtualBox, I tried to create a large disk using a .vdi container.
This is the best I could do. In the Properties pie chart area for F:,
File Explorer was unable to work out how much free space there was,
whereas good old Command Prompt worked it out (a programmatic cross-check
follows the listing).
F:\>dir
Volume in drive F is BIG
Volume Serial Number is EE25-2764
Directory of F:\
05/22/2024 07:21 AM 82,110 test.txt
1 File(s) 82,110 bytes
0 Dir(s) 562,949,287,706,624 bytes free
F:\>
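Here's the cross-check mentioned above: a small Python sketch of mine (not something from the thread) that asks the OS for the volume's total and free space via shutil.disk_usage. The 562,949,287,706,624 bytes free that dir reports works out to just a hair under 512 TiB.

import shutil

# Ask Windows for the volume's total/used/free bytes; the free figure
# should line up with what "dir" prints at the end of its listing.
usage = shutil.disk_usage("F:\\")          # drive letter from the listing above
print(f"total: {usage.total:,} bytes")
print(f"free : {usage.free:,} bytes  (~{usage.free / 2**40:,.1f} TiB)")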
We may be approaching a point where some things "act up again",
just after the "trauma of 2.2TB" had passed :-)
Anyway, this little investigation makes me wonder what the OS
did with my backup drive. I had a few problems with partitioning it,
and may have stepped into this issue, without recognizing what
was going on. The OS is pretty crafty, and if you offer a large
volume (in modern Win10), it just restricts your cluster size
choices and does not offer any info you might benefit from,
for planning purposes. At a guess, the above 512TB volume is using 128KB clusters;
the file properties below are how that guess worked out.
Size:         82,110 bytes
Size on disk: 131,072 bytes   <== one 128KB cluster, which Win7 can't work with
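The "Size on disk" figure is just the file size rounded up to whole clusters. One caveat on the 128KB guess: for this particular file, 64KB clusters would give exactly the same 131,072-byte figure, so one small file can't pin the cluster size down on its own. A quick sketch of the arithmetic (mine, not Paul's):

import math

# "Size on disk" = file size rounded up to a whole number of clusters.
size_bytes = 82_110                      # from the listing above

for cluster in (64 * 1024, 128 * 1024):  # two cluster sizes that both fit
    on_disk = math.ceil(size_bytes / cluster) * cluster
    print(f"{cluster // 1024:>3} KB clusters -> size on disk {on_disk:,} bytes")

# Both print 131,072 bytes, so this one file can't distinguish 64 KB from
# 128 KB clusters; fsutil fsinfo ntfsinfo F: would show the actual value.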
Paul
WIWAL (when I was a lad), as a "HelpDesk Consultant", some dept wanted to store a huge number
of, I dunno, some things. Trouble was that they ran out of space much faster
than they imagined, as every file on the Big Disk was say c. 1kB of
text, but the cluster size was (again making it up) c. 16kB.
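That's the classic slack-space trap: when the clusters are much bigger than the files, the rounding dominates. A toy sketch using John's made-up numbers (c. 1kB files, c. 16kB clusters) and an invented file count, just to show the scale of the waste:

# Slack space: each small file still occupies at least one whole cluster.
file_size    = 1 * 1024            # c. 1 kB of text per file (John's guess)
cluster_size = 16 * 1024           # c. 16 kB clusters (also made up)
num_files    = 1_000_000           # hypothetical file count

data    = num_files * file_size
on_disk = num_files * cluster_size
print(f"actual data : {data / 2**30:.2f} GiB")
print(f"disk used   : {on_disk / 2**30:.2f} GiB  ({on_disk // data}x the data)")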
--
Bah, and indeed Humbug.