VHD truncation at 127 GB

Nov 12, 2015 at 4:32 PM
Edited Nov 12, 2015 at 4:35 PM
Hi Ken and Team,

I've been using DiscUtils for the past few months to create VHD files, copying various files from the boot drive into the VHD. It has worked great.

But my application really needs to create VHD files as large as 1 TB, and mine fail at 127 GB with an "Out of disk space" error.

I saw that Geometry.FromCapacity() truncates the size at 127 GB, so I pass a Geometry instance into Disk.InitializeDynamic() to avoid calling Geometry.FromCapacity(). Unfortunately, my files still cap out at 127 GB. This is causing me real difficulty.
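For what it's worth, my reading of Microsoft's VHD footer spec is that the on-disk geometry field allows at most 65535 cylinders, 16 heads, and 255 sectors per track, which is presumably where the 127 GB ceiling comes from. A quick back-of-the-envelope check (my arithmetic, not DiscUtils code):

```csharp
using System;

// Maximum CHS values the VHD footer's geometry field can encode
// (per my reading of the VHD spec: cylinders is a 2-byte field, and the
// spec's geometry algorithm caps heads at 16 and sectors/track at 255).
const long maxCylinders = 65535;
const long maxHeads = 16;
const long maxSectorsPerTrack = 255;
const long bytesPerSector = 512;

long maxChsCapacity = maxCylinders * maxHeads * maxSectorsPerTrack * bytesPerSector;
Console.WriteLine(maxChsCapacity);                         // 136899993600 bytes
Console.WriteLine(maxChsCapacity / (1024L * 1024 * 1024)); // 127 (GB, rounded down)
```

So any geometry-derived capacity tops out just under 128 GB, no matter how large a capacity is requested.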

I tried using GuidPartitionTable instead of BiosPartitionTable, but then NtfsFileSystem.Format() throws an ArgumentOutOfRangeException, "Attempt to write beyond end of substream". The exception actually comes from SubStream.Write(), called via NtfsFormatter.CreateBiosParameterBlock().

For example, to create a 144 GB VHD I construct new Geometry(154537033728, 255, 63, 512). The code below creates the file fine, and FileChecker.Check() reports it as valid, but with "WARNING: Footer: Disk Geometry does not match documented Microsoft geometry for this capacity".

How do I use DiscUtils to write VHDs larger than 127 GB?

Here is the code that creates the VHD and writes files from the boot drive into it:
const int BUFFERLENGTH = 0x10000; // 64 KB
private NtfsFileSystem _ntfs;
private FileStream _vhd;

private void CreateVHDFile(int capacityInMB)
{
    try
    {
        long capacity = capacityInMB * 1024L * 1024L;
        Geometry geometry = new Geometry(capacity, 255, 63, 512); // Passed in to avoid the Geometry.FromCapacity() truncation at 127 GB
        _vhd = File.Create(base.FileName);
        Disk disk = Disk.InitializeDynamic(_vhd, Ownership.None, capacity, geometry);
        BiosPartitionTable.Initialize(disk, WellKnownPartitionType.WindowsNtfs); // Tried GuidPartitionTable too but it throws exception on NtfsFileSystem.Format()
        VolumeManager vm = new VolumeManager(disk);
        LogicalVolumeInfo[] volumes = vm.GetLogicalVolumes();
        _ntfs = NtfsFileSystem.Format(volumes[0], "VHD_Disk");
    }
    catch (Exception e)
    {
        _log.ErrorFormat("Unable to create VHD ({0})", e.ToString());
    }
}

private bool WriteFile(string filePathOnDisk, string filePathInVHD)
{
    bool wrote = true;
    try
    {
        using (FileStream fileOnDisk = new FileStream(filePathOnDisk, FileMode.Open, FileAccess.Read, FileShare.ReadWrite))
        {
            string vhdDir = Path.GetDirectoryName(filePathInVHD);
            if (!_ntfs.DirectoryExists(vhdDir))
            {
                _ntfs.CreateDirectory(vhdDir);
            }

            using (Stream fileInVhd = _ntfs.OpenFile(filePathInVHD, FileMode.Create))
            {
                byte[] buffer = new byte[BUFFERLENGTH]; // 64 KB
                int bytesRead = 0;
                while ((bytesRead = fileOnDisk.Read(buffer, 0, BUFFERLENGTH)) != 0)
                {
                    fileInVhd.Write(buffer, 0, bytesRead);
                    base.ProgressCounter.Increment(bytesRead);
                }
            }
        }
    }
    catch (Exception e)
    {
        _log.ErrorFormat("Unable to write '{0}' to '{1}' ({2})", filePathOnDisk, filePathInVHD, e.Message);
        wrote = false;
    }
    return wrote;
}
I'm getting "Out of disk space" in my log; the exception originates in DiscUtils.Ntfs.ClusterBitmap.AllocateClusters(), line 117. Where am I going wrong?

Thank you very much for any pointers!
Rob
Nov 12, 2015 at 11:49 PM
Here is more info, in case it helps you see what's happening.

When the VHD gets to ~125 GB, writing files to it slows to a crawl: for each new file, DiscUtils spends about 35 seconds in a loop inside ClusterBitmap.FindClusters(), around line 90, under the "Try to find a contiguous range" comment.

Here is a capture of the call stack:
DiscUtils.Ntfs.ClusterBitmap.FindClusters(long count = 15, System.Collections.Generic.List<DiscUtils.Tuple<long,long>> result = Count = 0, long start = 4716080, long end = 37728644, bool isMft = false, bool contiguous = true, long headroom = 0) Line 219
DiscUtils.Ntfs.ClusterBitmap.AllocateClusters(long count = 15, long proposedStart = -1, bool isMft = false, long total = 0) Line 90 + 0x62 bytes
DiscUtils.Ntfs.RawClusterStream.AllocateClusters(long startVcn = 0, int count = 15) Line 217 + 0x91 bytes
DiscUtils.Ntfs.RawClusterStream.ExpandToClusters(long numVirtualClusters = 15, DiscUtils.Ntfs.NonResidentAttributeRecord extent = {DiscUtils.Ntfs.NonResidentAttributeRecord}, bool allocate = true) Line 153 + 0x1c bytes
DiscUtils.Ntfs.NonResidentAttributeBuffer.SetCapacity(long value = 60675) Line 102 + 0x7e bytes
DiscUtils.Ntfs.NonResidentAttributeBuffer.Write(long pos = 0, byte[] buffer = {byte[65536]}, int offset = 0, int count = 60675) Line 131 + 0x1f bytes
DiscUtils.Ntfs.NtfsAttributeBuffer.Write(long pos = 0, byte[] buffer = {byte[65536]}, int offset = 0, int count = 60675) Line 164 + 0x28 bytes
DiscUtils.BufferStream.Write(byte[] buffer = {byte[65536]}, int offset = 0, int count = 60675) Line 175 + 0x24 bytes
DiscUtils.Ntfs.File.FileStream.Write(byte[] buffer = {byte[65536]}, int offset = 0, int count = 60675) Line 1234 + 0x1d bytes
DiscUtils.Ntfs.NtfsFileStream.Write(byte[] buffer = {byte[65536]}, int offset = 0, int count = 60675) Line 141 + 0x1d bytes
MyTest.VHDPackage.WriteFile(string filePathOnDisk = "C:\\Users\\rob\\AppData\\Roaming\\Apple Computer\\MobileSync\\Backup\\d5f68e0e602b59d00066282a24b16725b706faa8\\b80573e32f21cc71b1757a0ba0a0973e0fe72b57", string filePathInVHD = "Items\\C\\Users\\rob\\AppData\\Roaming\\Apple Computer\\MobileSync\\Backup\\d5f68e0e602b59d00066282a24b16725b706faa8\\b80573e32f21cc71b1757a0ba0a0973e0fe72b57") Line 269 + 0x13 bytes
To re-ask my original question, is there anything I can do to get past this 127 GB limit?

Thank you,
Rob
Nov 25, 2015 at 6:50 PM
Here's the solution I went with.

In my CreateVHDFile() method, I use a capacity that is 2x the total size of the files I actually need to write into it.

For example, if I need to write 150 GB of files into the VHD, I create the Geometry instance with a capacity of 300 GB.

That got me past the 127 GB limit.
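Concretely, the only change from the CreateVHDFile() method above is the capacity passed to Geometry and InitializeDynamic (a sketch of my workaround; the 2x factor is empirical, not derived from the VHD format):

```csharp
// Sketch of the workaround: size the disk at 2x the data I actually plan to
// write, so NTFS allocation never gets near the region where it starts failing.
// The 2x factor is empirical, not derived from the VHD format.
long bytesToWrite = 150L * 1024 * 1024 * 1024;   // e.g. 150 GB of files to copy in
long capacity = bytesToWrite * 2;                // create the VHD as 300 GB

Geometry geometry = new Geometry(capacity, 255, 63, 512);
_vhd = File.Create(base.FileName);
Disk disk = Disk.InitializeDynamic(_vhd, Ownership.None, capacity, geometry);
BiosPartitionTable.Initialize(disk, WellKnownPartitionType.WindowsNtfs);
_ntfs = NtfsFileSystem.Format(new VolumeManager(disk).GetLogicalVolumes()[0], "VHD_Disk");
```

Since the VHD is dynamic, the doubled capacity doesn't cost disk space up front; the .vhd file only grows as clusters are actually written.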

DiscUtils is a great tool. Thanks!

Rob
Jan 7, 2016 at 6:53 AM
Hi Rob,

I don't pass the disk geometry when creating virtual disks; I prefer to let the library infer it from the capacity, and then I pass that inferred geometry when initializing the partition table:
var disk = Disk.InitializeDynamic(stream, Ownership.Dispose, capacity, blockSize);
var partitionTable = BiosPartitionTable.Initialize(disk.Content, disk.GetGeometry()); // same for GPT
If this does not work out of the box for VHDs, or you need VHDXs, then use my fork, which contains various fixes and adds support for very large capacities via a new method, Geometry.Lba48Geometry(long capacity, int sectorSize). I use it to create VHD/VHDX virtual disks with capacities up to 2040 GB and 64 TB respectively, and I can initialize them as MBR or GPT.
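Putting that together with the formatting calls from earlier in the thread, the whole sequence looks roughly like this (a sketch, untested; vhdPath, capacity, and blockSize are placeholders):

```csharp
// Let DiscUtils infer the geometry from the capacity, then reuse that same
// geometry when initializing the partition table.
var stream = File.Create(vhdPath);
var disk = Disk.InitializeDynamic(stream, Ownership.Dispose, capacity, blockSize);
var partitionTable = BiosPartitionTable.Initialize(disk.Content, disk.GetGeometry()); // same for GPT

var volumes = new VolumeManager(disk).GetLogicalVolumes();
var ntfs = NtfsFileSystem.Format(volumes[0], "VHD_Disk");
// ... copy files into ntfs, as in the WriteFile() method above ...
disk.Dispose(); // Ownership.Dispose also closes the underlying stream
```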
Jan 11, 2016 at 1:26 PM
Thanks for this info -- I'll keep it in mind.