I don’t like Virtual Machines

I don’t like virtual machines. Given that my current position and employer involve building, maintaining, supporting, optimizing, and selling virtualization solutions and services, this statement is a bit ironic. I don’t hate the benefits the technology has given us; it’s amazing what it has provided. It’s also amazing how you can scale up a service without having to bring in lots of new hardware and maintain that as well. It’s more efficient and cost-effective than the old way of doing things, and it has pushed forward the development of operating systems and drivers.

The problem I have with virtual machines involves more the context of what they are. Put simply, a VM is an emulation of a physical machine, run on a host by software known as a hypervisor. Oftentimes, lots of virtual machines are run on the same box using systems like VMware ESX Server, Xen, or KVM. Very frequently the differences between a physical machine and a virtual machine are small. The differences show up with 3D/low-latency applications and VMs that depend on hardware input that cannot be emulated (e.g., random number generation on VMs). After reading that previous statement, and considering the downside, there should be something that sticks out to you. It’s something that should make you feel uncomfortable.

What makes me very uncomfortable is the fact that we have many VMs running on the same box. Much of the processing, storage, and memory is consumed by redundant operating system processes and their associated files. It seems really inefficient to have 30 instances of Windows Server 2012 all running IIS at the same time. The alternative to this madness is the use of containers. I like the idea, and I would love to get a chance to learn more about OpenVZ and LXC when I have more time. I like containers because they are sandboxed, managed environments that push the processing capacity toward the actual job being performed. It feels more efficient, and more in line with solving the problem rather than creating more infrastructure.

Prior to the virtualization era, we were encouraged to build grid services. This was great: you could throw a lot of machines at a problem and have them work in harmony. However, that didn’t work as well as we hoped due to the immature tools and frameworks offered at the time. In place of grid computing, the next approach was to split the problem up into individual processing units and simply throw a lot of machines at it. After all, VMs are “nearly free.” This doesn’t really fix the problem; it just seems like we’re timesharing on a powerful server once again.

Are all technical books dry and boring?

After reading many technical books, I believe a trend has emerged: most technical books tend to be dry to varying degrees. This is starting to become incredibly irritating, and it also hurts in keeping the reader’s attention. At the moment I don’t have a lot of evidence for what makes a technical book non-dry; however, the most memorable books have included silly examples, real-world examples, and a non-referential tone.


So far, the only technical books I’ve read that didn’t have a dry tone are:

1. Test-Driven Development by Kent Beck

2. Effective Java by Joshua Bloch

3. Head First Design Patterns

4. Apache Wicket in Action by Martijn Dashorst

Trends with Low-Powered Clients [Tablets and Smartphones]

Recently there has been a push to replace desktops and laptops with low-powered machines. These machines have taken shape as mobile devices [phones], tablets, netbooks, and, with some ambiguity, ultrabooks. These machines claim to push the responsibility for major processing onto server environments [the cloud], leaving the visual tasks [GUI, localization, collecting user requests] up to the client. If you believe the marketing hype….

The advantage to the user is the biggest draw of the trend. The user is able to purchase a cheaper piece of equipment that tends to have a lower cost of ownership, although the environment may lock down what software they may install. Low-powered machines also tend not to take up as much physical space, and they offer longer battery life.

Before I go on a huge rant, I must concede that the whole tablet/smartphone trend has made it cheaper and easier to create portable generic devices. Instead of purchasing a small novelty device, you can get an application that does the same thing. Soon, hipsters will be bragging about how they still use a desktop alarm clock, because it’s retro.


Despite the trendiness of tablets and smartphones, this is pretty much the whole “thin client” fad all over again. Most of the work is pushed to the server, and the thin client must have access to the server to work. Despite the marketing, network issues have not been resolved with tablets and netbooks. Cellular modems have been added to the devices [3G, HSPA, and LTE], but they tend to have issues with coverage, network saturation, providers’/governments’ willingness to jam signals, and latency. Even with Wi-Fi tablets, a network isn’t always guaranteed.

Another annoyance that comes with low-powered devices is that the user lacks the choice of operating system. Non-Intel-compatible processors and locked-down BIOSes make this difficult: the user is locked into the platform that comes installed. There is a benefit in that the device can update itself; however, applications can be rejected from the marketplace/application store.

Sub-rant: A tablet makes it incredibly easy to navigate and read web pages. However, it becomes very difficult to perform any task of substance. The on-screen keyboards and autocorrect tend to impede or prevent the task from being performed; one has to have a Bluetooth keyboard to type anything longer than a text message on a tablet. Additionally, the ability to select something with precision is lost on a touch-enabled device, although one can pair [Bluetooth] or plug in [USB] a mouse and use the mouse pointer on Android.

What does this have to do with development?

Sadly, this is being pushed to developers as the “hot new thing.” If you believe the job listings, it’s being treated as something that has never existed before. It seems like everyone wants an iOS/Android application, whether it’s relevant or not. Apparently, the worlds of Windows CE, QNX, PalmOS, and embedded versions of Linux are pretty much ignored.

It appears that the push for an application marketplace is promoting the application-as-a-service idea. It’s great for the user; outdated versions practically become a thing of the past. But this is unrealistic for anything but novelty and open source applications. Applications will be expected to be supported, maintained, and cheap [to compete]. I may not be a business major, but this seems unfeasible as a business. You can’t sell a jug of milk and provide free refills for life.

Another concern for developers is the cloud/traditional services that will be provided to mobile devices. Given the specifications of the devices and the current models for services [XML-based], providing an efficient delivery model for mobile devices is difficult. I would change my opinion on this issue if binary services were used for mobile devices. It would also be nice if the services were switchable depending on the client device [XML/YAML/text-based for desktops, binary for mobile devices]. This would be slightly more tolerant of the network needs of mobile devices.
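As a rough illustration of the payload-size argument, here is a sketch comparing a text-based encoding against a fixed-layout binary encoding, using only the Python standard library. The record’s field names and layout are invented for this example, not taken from any real service:

```python
import json
import struct

# Hypothetical record a mobile client might fetch; fields are illustrative.
reading = {"id": 1234, "lat": 40.7128, "lon": -74.0060, "battery": 87}

# Text-based delivery: JSON (an equivalent XML document would be larger still).
text_payload = json.dumps(reading).encode("utf-8")

# Binary delivery: fixed little-endian layout
# (unsigned 32-bit int, two doubles, unsigned byte).
binary_payload = struct.pack(
    "<IddB",
    reading["id"], reading["lat"], reading["lon"], reading["battery"],
)

# The binary form is a fraction of the size of the text form.
print(len(text_payload), len(binary_payload))
```

A real service would need versioning and a schema for the binary layout (which is what formats like Protocol Buffers provide), but even this toy example shows why a switchable text/binary delivery model would be kinder to cellular links.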

What I believe isn’t possible with “The Desktop Killers” (Tablets and Smartphones)

  • Professional-grade Photoshop-like applications
  • Real-time, low-latency, high-end graphics rendering [rendering on a server farm will kill the latency]
  • P2P software
  • Multi-user space on the device [currently the mindset is tied down to 1 device, 1 user at a time]
  • True background services/notifications (it’s more of a hack right now, impeded by power saving and modem [Wi-Fi, cellular] usage) [Examples: monitoring applications, IM applications (where the user may not actively be using their device)]

“The World Has Changed”

Many people claim that “The World Has Changed” when they see something they don’t recognize from before. This common phrase is a cliché, used and abused without actual regard for gradual change or reflection on major influences. The most irritating result of this phrase is that it provides little meaning while preventing critical rebuttals; it is quite difficult to argue against the context in which it is used. The phrase, however vague and meaningless, is not false, but it is not exactly true either. Human civilization evolves, not necessarily in the biological sense, by taking existing ideas and improving on them. Many who claim they have created something that did not exist before are simply unaware of similar existing ideas, or of previous attempts.

“The world has changed” attempts to bridge the content of one idea to another, weakly connected idea. For example, one could say: “People used to dig in the ground with stones, but the world has changed. Now people use backhoes [UK/AUS: diggers].” This makes a statement that cannot be refuted. The phrase takes a magical leap to another, loosely connected statement, similar to “hand waving,” that ignores the connections between the two. The gap between the two statements is not bounded; the context connecting them could include wars, world peace, and space exploration. The typical Westerner with agricultural experience would associate the path from digging with rocks to diggers as rocks -> sharpened rock -> shovel -> machinery/hydraulics -> jackhammer -> digger. However, the original intent [or persuasion] may have been to get the listener to accept the idea of the gap being a magical entity creating the advancement [good for hyping up products and investment in organizations], or a massive leap in the human condition.

Clichés are phrases similar to slang; they are the disease of everyday language. Easy to catch from another person, easily spotted, and difficult to get rid of. These clichés are popular across cultures and communicate generic feelings; however, they also tend to hide realities. In summary, the phrase attempts to convince the listening party that humanity has drastically changed. Somehow, humanity can no longer satisfy its basic needs without advancements. Somehow, without the TV, humanity can’t feed itself, interact with others, or even reproduce without being directed. Inversely, it also implies that cultures which do not share the same opinion of the change are backwards and somewhat incompatible, a notion that is inherently wrong.

Lastly, I leave you with a visualization of behavior change over the last 10 years. Has the human population changed that much? Yes, there have been minor changes in the consumption of resources, but has the mobile phone really introduced communication as a new concept to human beings? Have we lost the desire to interact with each other? No; we’ve been doing it for many years and will continue to do so.

Visualization 2000 vs. 2010 Behavior and Cultural Statistics