Solving Very Slow Visual Studio Build Times in VMWare

Back in 2009, we chose the 15″ MacBook Pro (BTO) over Dell, HP, and Lenovo offerings. Apple offered us the best hardware and equivalent lease terms, but with much simpler servicing done by EFT rather than (often incorrect) paper statements and checks. 99% of the time we ran these MacBooks with Windows 7 under Boot Camp. When I started doing some iOS development, I ran Snow Leopard and Xcode under VirtualBox, and when I got fed up with the flakiness, I switched to VMware Workstation. OS X didn’t run very well under virtualization, mostly because accelerated Quartz Extreme drivers don’t exist for VMware Workstation. Still, it actually worked well enough and was much more convenient than dual booting, which is a huge time suck. When the lease period ended, we renewed with Apple and decided to just use OS X as the host environment for a variety of reasons.

The transition went very smoothly. I was using a VM with Windows Server 2008 and Visual Studio 2010 for primary .NET web development. Configuring IIS Express to serve outside of localhost bound to a host-only adapter is great for cross-browser testing, but it can be even more useful to enable remote access to Fiddler and point external browsers to the Fiddler proxy running in the VM to get both client debugging and HTTP sniffing at the same time. All of this was working out great.
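For reference, here is a minimal sketch of what that IIS Express binding change looks like. The site name, port, and host-only adapter IP below are placeholders, not from the post; by default the file lives at %USERPROFILE%\Documents\IISExpress\config\applicationhost.config:

```xml
<!-- applicationhost.config (sketch): bind the site to the host-only
     adapter IP in addition to localhost so browsers on the host OS can
     reach it. "MyWebApp", port 8080, and 192.168.56.2 are placeholders. -->
<site name="MyWebApp" id="1">
  <bindings>
    <binding protocol="http" bindingInformation="*:8080:localhost" />
    <binding protocol="http" bindingInformation="*:8080:192.168.56.2" />
  </bindings>
</site>
```

On Windows you typically also need to reserve the URL for a non-admin user (`netsh http add urlacl url=http://192.168.56.2:8080/ user=Everyone`) and open the port in the firewall. In Fiddler, remote access is enabled under Tools &gt; Fiddler Options &gt; Connections (“Allow remote computers to connect”).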
 
At the start of October I downloaded Windows Server 2012 and Visual Studio 2012 and created a new development VM. In order to minimize disk storage, I moved my source tree to a folder in the host OS X environment and exposed it via the “Shared Folders” feature of VMWare Fusion 5.x to Windows. At the same time I started working on a new project.

 

The Slowening

Once the project grew to 20, 30, and then 50k lines of C# code, the build times started to become horrifically slow. Combined with running unit tests, a build became a “go for a bathroom break or a cup of coffee” event, like building a C project 15 years ago. Builds would show csc.exe running at ~50% CPU (i.e., one core) and some other processes totaling ~70% CPU. The VM was not memory bound and network IO was minimal. This was my first substantial project using Code First EF, so I thought maybe the complex object graph was just hard for the C# compiler to deal with.
 
After a few weeks of increasingly painful build times, I was looking at breaking my solution up so that I could build against pre-compiled DLLs — anything to make it go faster. I ran across a post on SuperUser:

… (Full disclosure: I work on VMware Fusion.)

I have heard that storing the code on a “network” drive (either an HGFS share or an NFS/CIFS share on the host, accessed via a virtual ethernet device) is a bad idea. Apparently the build performance is pretty bad in this configuration.

Oh really? Hmmm. Maybe it isn’t that my class libraries are so complex, but that something else is going on. Here are some empirical measurements of rebuild time of an actual solution:

VMware shared folder:  50 sec
OS X SMB share:        18 sec
Within virtual disk:    9 sec

Wow. Problem solved. Incremental builds are basically instantaneous, and a full rebuild takes 9 seconds when the code is hosted inside the VM image. Not only does hosting the source code within the VM virtual disk make the build go 5.5x faster, but the CPU time of csc.exe goes way down. I don’t know how the VMware shared folder is implemented; it appears to Windows as a mapped drive to a UNC name, but it is very slow. The moral of the story: just don’t host your source code on the host machine with VMware shared folders. The performance penalty is not worth it. If you need to share the source tree inside the VM with the host OS, create a file share from the VM to the host over a host-only adapter.
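As a sketch of that last suggestion (the share name, path, user, and host-only IP below are placeholders, not from the post): share the source tree from inside the Windows VM, then mount it from OS X over the host-only network.

```shell
:: --- Inside the Windows VM (elevated cmd): share C:\src as "src" ---
net share src=C:\src /GRANT:Everyone,FULL

:: Make sure file sharing is allowed through the Windows firewall
netsh advfirewall firewall set rule group="File and Printer Sharing" new enable=Yes

:: --- On the OS X host: mount the share over the host-only adapter ---
:: (192.168.56.2 stands in for the VM's host-only IP, devuser for a VM account)
::   mkdir -p /Volumes/src
::   mount_smbfs //devuser@192.168.56.2/src /Volumes/src
```

This keeps the files on the fast virtual disk where the compiler reads them, while the host only pays the SMB cost when you actually touch the tree from OS X.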

 

Update

I can confirm this is still a problem in VMware Fusion 6. I’m hoping the new SMB implementation in Mavericks will greatly improve the performance of sharing source code from the host OS to the VM.

Update

I just updated my virtual machine to Windows Server 2012 R2 (aka Windows 8.1 Server), running on VMware Fusion 6. The build time of this large project is now 15 seconds over VMware Shared Folders. A significant remaining issue is that Visual Studio uses a whole core of CPU to do nothing: just having a large solution open, not editing anything.


13 Responses to Solving Very Slow Visual Studio Build Times in VMWare

  1. coldgrnd says:

    thanks for sharing. I’m just trying to evaluate the option to have a configuration similar to yours running windows in a vm on a Macbook. Would be good to see how a build on a native windows machine performs compared to the one you do in the vm. got any figures?

    • Brian Reiter says:

      On our previous-generation ~2009 dual-core MacBook hardware, I ran Windows for Visual Studio in a native Boot Camp partition and used VMs primarily for browser testing and light iOS development. (Xcode was pretty miserable running on OS X in a VM).

      On the new machine, I haven’t bothered with Boot Camp. Windows in the VM with just 2 virtual cores is faster than native was before. The convenience of having different VM configurations for projects/clients trumps even considering dual boot for me. I don’t have benchmarks.

      FWIW, I do have a 4-core Sandy Bridge i7, an SSD, and 16GB of RAM. I sometimes give a large Windows Server VM 8GB of RAM; usually 4GB is adequate. I also tend to split my workflow between Windows and OS X apps. I have a dual-monitor configuration and put the Windows VM full screen on one. Visual Studio does not work well with Unity, as there are a lot of weird rendering glitches and the redraw performance is terrible. When just on the laptop, I put the VM in a full-screen Space.

      • coldgrnd says:

        sounds like the way to go! the argument of having different vm configurations for different clients/projects is actually one of the most important points for me besides the performance.
        thanks again Brian,
        cheers
        Oliver

      • Brian Reiter says:

        There are scenarios where use of a VM breaks down.

        For example, it may be tough or unsupported to configure Hyper-V within a VMware VM, and Hyper-V is required by the Windows Phone 8 emulator. I haven’t tried it at all.

  2. Daniel Prows says:

    I work on a particularly large application, and when I say large, I mean huge. The compile time for people on Windows dev boxes is ~20 minutes with a mechanical disk, ~8-13 with an SSD.

    With my MacBook Pro with Retina display (VMware Fusion 5 running Windows 7 and Visual Studio 2010), compiling with the code on the Mac partition shared with Windows, my compile time was ~48 minutes. Moving the code onto the virtual machine cut the compile time of the full application to 5 minutes.

  3. Warren Pena says:

    Thanks for sharing! I’ve run into the exact same thing with gcc. I do all my development in OS X using MacVim, but I need to build for Windows. If I try to compile from the working copy in OS X using VMware’s Shared Folders feature, it takes AGES. If I copy the same code to the Windows virtual disk and compile from there, it’s many times faster. Glad to know I’m not the only one.

  4. ac says:

    It would seem a little odd if builds were faster under the VM from the virtual disk than under the host running the VM. My first guess would be that people are more likely to have Microsoft Security Essentials or other things that install filesystem filter drivers on the host and possibly not have the same setup in the VM. With compiles under VS the disk access tends to be very “chatty,” so any additional driver or hardware delays are a big deal, as shown by the responses above. I tried the Intel C++ compiler for a while but found that it was similar if not worse. If you can have all the compiler-related disk access target a RAM disk, you could probably see a 2-5x improvement over an SSD in some build scenarios where CPU is not the issue.
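    As a sketch of the RAM-disk idea (the solution name and the R: drive letter are hypothetical), you can redirect MSBuild’s build output to a RAM disk from the command line without touching the project files:

    ```shell
    :: Build with outputs going to a RAM disk mounted as R:
    :: /m enables parallel project builds; OutDir is a standard MSBuild
    :: property override (the trailing backslash is required).
    msbuild BigSolution.sln /m /p:Configuration=Release /p:OutDir=R:\build\
    ```

    Note that for C# projects the obj\ intermediates are controlled by BaseIntermediateOutputPath rather than OutDir, so fully moving a build off the system disk may take a per-project property as well.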

    I know Windows installs can be faster under the VM, but I haven’t measured the disk performance of VMware virtual disks when running as a guest under the same setup as the host.

  5. [Short version: try Parallels Desktop to see if it is better; it is for me.]

    ac: no, the problem is very much real. And that virtual disks are as fast as native disks (and they are, both in Fusion and Parallels) is just good engineering.

    I reported it to VMware several *years* ago, did a few rounds of debugging and data gathering with them, they reproduced the issue and promised to pay attention to testing performance with Visual Studio. Nothing improved in v5. Nothing in v6. I was in contact with the VMware folks all that time, but no improvements were coming. They as much as said that it’s not a priority — they have approximately one person working on all of HGFS, it seems.

    As for Mavericks’ SMB2: don’t get your hopes too high. 10.8’s implementation is significantly slower than Samba 3, despite the latter using SMB1 (and I expect Samba 4 to be even faster). I even went so far as to install Samba3 manually, but the performance is still significantly worse than native/vdisk. And it has its own set of problems: e.g. VS is asking to reload projects because they were “modified on disk” (they weren’t).

    My benchmark with Fusion 6 for a C++ project (with heavy use of PCH) was: 58s on virtual disk, 281s(!) on shared folder. The former had 100% utilization of 4 cores, the latter about 50%. That’s one thing that matters too, BTW: HGFS is not good with multi-core parallel access. (VMware’s HGFS is problematic on Linux too: make -j4 regularly hangs my entire VM, presumably also because of parallel I/O.) This was with Visual Studio 2013 and Windows 8.1 previews on x64 VM on Mountain Lion on SSD, so everything involved was modern.

    In the end, I solved this differently: I installed Parallels Desktop 9. I resisted it with v5, because Fusion feels more polished (and it used to be a _lot_ more stable a few years back) and has better UI and license (multiple computers usage allowed). And I was a VMware customer, using Workstation or Fusion and updating yearly since 2001.

    Results with the same project and VM content: compiling on virtual disk: 56s, compiling from shared folder: 109s. Still significantly slower than virtual disk (although I did get comparable speeds for other projects – I assume it depends on the amount of files being accessed, so if anything, it should be better for C#). But almost three times faster than Fusion’s shared folders. For me, that’s good enough and worth it for the convenience of having the source tree on the Mac.

    • Brian Reiter says:

      I tried Samba3 from MacPorts. It turned out to be too flaky and unreliable to use for hosting source.

      • vslavik says:

        Been using the same thing daily, several hours per day for a year now (since upgrading to Mountain Lion with its much slower SMB2 implementation) and can’t really complain about any of that. Guess I was luckier than you 😉

  6. Luke says:

    THANK YOU THANK YOU THANK YOU, solved my issue 🙂

  7. Brian S. says:

    Just a note: still an issue after all this time. Your post saved me a lot of headache trying to figure this out. I was hoping to be able to shut down the VM when I didn’t actually need Visual Studio, but I guess that’s not to be.

    • Brian Reiter says:

      The latest cross-platform work from Microsoft and Xamarin may mitigate it somewhat, and more so in the future. The currently-beta CoreCLR is officially supported on OS X and Linux with the whole ASP.NET stack. There is Visual Studio Code, an editor based on the Atom shell. There is http://omnisharp.net. I’ve had a reasonable amount of success using OmniSharp and xbuild in Sublime to edit and compile large projects. The biggest issues I have run into, other than general beta stuff, are that Visual Studio native unit tests don’t compile because the library is not open source (use NUnit instead) and that SQL Server requires Windows.
