Dr. Dobb's is part of the Informa Tech Division of Informa PLC


Virtual Machines, Put To The Developer Test


Mark Cloutier is a software engineer at Lockheed Martin.

Often, a developer's goal is to create software that runs on many different operating systems, while giving users the same experience no matter which operating system they happen to be using. Virtual machine technology lets developers run multiple operating systems, each installed in its own VM, which should save them time and money -- if VM performance doesn't suffer.

To determine whether it's viable to use a VM as a testing/development environment for software development, I ran a battery of performance tests, all using Java running on various configurations of Ubuntu Linux and Windows. This allowed comparisons between the host control PCs and the virtual setups. The tests compare three factors across the configurations: processing speed, speed of the network interface cards (NICs), and input/output speeds of hard drives. The source for all of the test programs is available for download here.

The processor speeds and RAM allotments of the host PC and the guest PC were:

  • The host PC sees a 1.73-GHz processor and 2.00 GB of RAM.
  • The guest PC sees a 1.73-GHz processor and 512 MB of RAM.

The difference in RAM is because the guest is allocated 512 MB (or whatever amount is specified) of the host's 2.00 GB.

There were six total configurations on which I conducted tests. I ran two controls using only the host:

  • One test on Ubuntu Linux.
  • One test on Microsoft's Windows XP.

The other tests were run on combinations of hosts and guests:

  • Ubuntu Linux (guest) on Ubuntu Linux.
  • Windows (guest) on Ubuntu Linux.
  • Windows (guest) on Windows.
  • Ubuntu Linux (guest) on Windows.

Each test was run 100 times and the results were averaged to get more accurate measurements. All of the tests were run on clean installs of the operating systems with only the default settings to get the most unaffected and comparable results.
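The averaging procedure can be sketched as a small harness. The article doesn't show its timing code, so the `Runnable`-based structure and millisecond reporting here are assumptions; only the 100-run averaging is from the text.

```java
// Minimal sketch of the test harness: run a task repeatedly and
// average the elapsed time, as the article does over 100 runs.
public class Harness {
    // Returns the mean elapsed time in milliseconds over `runs` executions.
    public static double averageMillis(Runnable task, int runs) {
        long totalNanos = 0;
        for (int i = 0; i < runs; i++) {
            long start = System.nanoTime();
            task.run();
            totalNanos += System.nanoTime() - start;
        }
        return (totalNanos / (double) runs) / 1_000_000.0;
    }

    public static void main(String[] args) {
        // Example: average the cost of building a 10,000-entry string.
        double ms = averageMillis(() -> {
            StringBuilder sb = new StringBuilder();
            for (int i = 0; i < 10_000; i++) sb.append(i);
        }, 100);
        System.out.printf("mean: %.3f ms%n", ms);
    }
}
```

Averaging over many runs smooths out one-off costs such as JIT compilation and disk caching, which would otherwise dominate a single measurement.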

The first test was to determine the comparative processing speed of the configurations. To test the processing speed, I wrote a test program that tabulates the time it takes to create 500,000 Java objects (Figure 1).

Figure 1: Speed of creating 500,000 Java objects on the various configurations.
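The object-creation test can be sketched as follows. The article doesn't say what kind of object was created, so the `Widget` class here is a stand-in; only the 500,000 count is from the text.

```java
// Sketch of the processing-speed test: time the creation of
// 500,000 Java objects. Widget is a placeholder class; the
// article doesn't specify the object type it allocated.
public class ObjectTest {
    static class Widget {
        final int id;
        Widget(int id) { this.id = id; }
    }

    // Returns elapsed nanoseconds for creating `count` objects.
    // References are kept in an array so the allocations can't be
    // optimized away; the array itself is allocated before timing starts.
    public static long timeCreation(int count) {
        Widget[] widgets = new Widget[count];
        long start = System.nanoTime();
        for (int i = 0; i < count; i++) {
            widgets[i] = new Widget(i);
        }
        return System.nanoTime() - start;
    }

    public static void main(String[] args) {
        long nanos = timeCreation(500_000);
        System.out.printf("created 500,000 objects in %.2f ms%n", nanos / 1e6);
    }
}
```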

The results show a minor decrease in performance compared to the controls when the host operating system is the same as the guest operating system. The most surprising result is the difference between running Ubuntu-on-Windows and running Windows-on-Ubuntu. Windows-on-Ubuntu took more than twice the time in creating the objects compared to the control, while Ubuntu-on-Windows was actually faster than Windows-on-Windows. Overall, the results show that there are no major differences between the guest and the host, Windows-on-Ubuntu notwithstanding. I suspect that the large differences here are a result of the way the abstraction layer executes the Windows code when Ubuntu is the host.

The next test was to determine the comparative speed of the NICs on the various configurations. To do this, I wrote a second test program that executes a SQL query through the Java Database Connectivity (JDBC) API (Figure 2).

Figure 2: Speed of SQL queries on the various configurations.
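A sketch of the NIC test is below. The JDBC URL, credentials, and table name are placeholders, since the article doesn't name the database it queried; the structure (execute a query, walk the full result set, time it) follows the description above.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

// Sketch of the NIC test: time a SQL query over JDBC and walk the
// entire result set so every row crosses the network. The URL,
// credentials, and table name below are placeholders.
public class QueryTest {
    // Rows fetched per second, given a row count and elapsed nanoseconds.
    public static double rowsPerSecond(long rows, long nanos) {
        return rows / (nanos / 1e9);
    }

    public static void main(String[] args) throws Exception {
        if (args.length < 1) {
            System.out.println("usage: QueryTest <jdbc-url>");
            return;
        }
        long start = System.nanoTime();
        long rows = 0;
        try (Connection conn = DriverManager.getConnection(args[0], "user", "pass");
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("SELECT * FROM test_table")) {
            while (rs.next()) rows++;   // pull every row over the wire
        }
        long elapsed = System.nanoTime() - start;
        System.out.printf("%d rows, %.0f rows/s%n", rows, rowsPerSecond(rows, elapsed));
    }
}
```

Walking the whole result set matters: JDBC drivers typically fetch rows lazily, so timing only `executeQuery` would measure far less network traffic than the 99,900-row transfer the test intends.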

The SQL query result set included 99,900 rows. The results here show a distinct speed difference when using Ubuntu as the host system, compared to Windows as the host. I suspect that the reason is that drivers for devices such as NICs are specifically written by manufacturers for Windows, while drivers in Linux distributions (such as Ubuntu) are written by the open-source community, which may lead to slight differences in implementation. There is almost no speed difference between the Windows control and the guests on Windows. On the Ubuntu side, there is a little more variation, with the surprise that Ubuntu-on-Ubuntu is the slowest (although not exceedingly far behind the control). Again, I attribute this to the drivers for Ubuntu being written by the open-source community instead of by the NIC manufacturer.

The third test program I wrote determines the comparative input/output speeds of hard drives on the various configurations. To do this, the program writes a large text file, then reads it back in. The speed of each operation is calculated separately (Figure 3).

Figure 3: Speed of hard-drive I/O on various configurations.
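The disk test can be sketched like this. The article only says "a large text file," so the line count and line format here are assumptions; the write and read phases are timed separately, as in the test described above.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.ArrayList;
import java.util.List;

// Sketch of the disk test: write a large text file, read it back,
// and time each side separately. The line count is an assumption.
public class DiskTest {
    // Writes `lines` numbered lines to `path`; returns elapsed nanoseconds.
    // The data is built before the clock starts so only the write is timed.
    public static long timeWrite(Path path, int lines) throws IOException {
        List<String> data = new ArrayList<>(lines);
        for (int i = 0; i < lines; i++) data.add("line " + i);
        long start = System.nanoTime();
        Files.write(path, data);
        return System.nanoTime() - start;
    }

    // Reads the whole file back; returns elapsed nanoseconds.
    public static long timeRead(Path path) throws IOException {
        long start = System.nanoTime();
        Files.readAllLines(path);
        return System.nanoTime() - start;
    }

    public static void main(String[] args) throws IOException {
        Path path = Files.createTempFile("disktest", ".txt");
        try {
            long w = timeWrite(path, 100_000);
            long r = timeRead(path);
            System.out.printf("write: %.2f ms, read: %.2f ms%n", w / 1e6, r / 1e6);
        } finally {
            Files.deleteIfExists(path);
        }
    }
}
```

Note that on a VM this timing includes an extra layer: the guest's filesystem calls land in a file on the host's disk, which is exactly the indirection the next paragraph describes.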

The key thing to remember is that the VM's so-called "hard drive" is actually just a file on the host's hard drive.

These results show little variation when Windows is the host, but much more variation when Ubuntu is the host. The Windows and Ubuntu controls produce similar results, as do the two guest configurations on Windows. The main surprise in this test is that Windows-on-Ubuntu took almost twice as long on both the input and the output side.

A final test program combined all of the previous tests for an overall perspective on performance: it queries a SQL database and outputs the result set to a file, then reads the file back in and creates objects populated with the file's contents (Figure 4).

Figure 4: Speed of running full test, including NIC, hard disk I/O, and processor speed on various configurations.
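The combined test chains the pieces above. The JDBC step is elided here (it would mirror the query code earlier, with the same placeholder connection details); this sketch shows the write-file, read-back, and build-objects half. The `Record` class and line format are assumptions.

```java
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.ArrayList;
import java.util.List;

// Sketch of the combined test: the article's final program queries a
// database, writes the result set to a file, reads it back, and builds
// objects from each line. The DB query is elided; a fixed list stands
// in for the result set. Record is a placeholder class.
public class FullTest {
    static class Record {
        final String value;
        Record(String value) { this.value = value; }
    }

    // Builds one Record per line read back from the file.
    public static List<Record> buildObjects(List<String> lines) {
        List<Record> records = new ArrayList<>(lines.size());
        for (String line : lines) records.add(new Record(line));
        return records;
    }

    public static void main(String[] args) throws Exception {
        Path path = Files.createTempFile("fulltest", ".txt");
        try {
            // Stand-in for the query result set written to disk.
            List<String> resultSet = List.of("row1", "row2", "row3");
            long start = System.nanoTime();
            Files.write(path, resultSet);
            List<Record> records = buildObjects(Files.readAllLines(path));
            long elapsed = System.nanoTime() - start;
            System.out.printf("%d objects in %.2f ms%n", records.size(), elapsed / 1e6);
        } finally {
            Files.deleteIfExists(path);
        }
    }
}
```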

The tests show similar results across the board, with the exception of Windows-on-Ubuntu, which underperformed the others consistently. The tests show fairly conclusively that, except for the Windows-on-Ubuntu case, a VM is just as usable as a physical PC for software development. While there's a slight degradation in performance in many cases, the decrease is minor overall and likely worth the efficiency gains from having any operating system readily available.
